diff --git a/.env b/.env
new file mode 100644
index 000000000..54000569a
--- /dev/null
+++ b/.env
@@ -0,0 +1 @@
+DB_CONNECT = mongodb+srv://Magic:MagicPassword@finalcluster.gcs76.mongodb.net/myFirstDatabase?retryWrites=true&w=majority
\ No newline at end of file
diff --git a/README.md b/README.md
index 88765ffde..b0e5b84e2
--- a/README.md
+++ b/README.md
@@ -1,51 +1,84 @@
-# Final Project
-*Due before the start of class, October 11th (final day of the term)*
+We created an application that allows users to log in and register on a website. The user is then redirected to a welcome page where they can create, delete, and edit tasks and subtasks to keep track of items in their life.
-For your final project, you'll implement a web application that exhibits understanding of the course materials.
-This project should provide an opportunity to both be creative and to pursue individual research and learning goals.
+You can register for an account and the website will automatically log you in. Usernames must be at least 3 characters long and passwords must be at least 6 characters long.
-## General description
-Your project should consist of a complete Web application, exhibiting facets of the three main sections of the course material:
+**Technologies**
+- Cookies: Keep the user logged in after they refresh the page
+- MongoDB: Database to store information
+- @hapi/joi: Login validation (see the sketch after this list)
+- Mongoose: Provides schemas for working with the database
+- JavaScript: Code to run commands when users perform actions
+- EJS: Helped format HTML and reduce the overall amount of HTML that had to be written
+- Water: CSS framework providing styling for our site
+- HTML
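As a quick illustration of the @hapi/joi item above, here is a minimal sketch of how the username and password rules stated in this README (at least 3 and 6 characters respectively, with the upper limits taken from `model/User.js`) can be expressed as a Joi schema. The exact schema used by the app may be organized differently.

```js
// Minimal sketch of @hapi/joi login/registration validation.
// The length limits mirror the rules in this README and the User schema;
// the schema actually used by the app may differ.
const Joi = require('@hapi/joi');

const registerSchema = Joi.object({
    username: Joi.string().min(3).max(200).required(),
    password: Joi.string().min(6).max(1024).required()
});

// validate() returns { value, error }; `error` is undefined when the input is valid.
const { error } = registerSchema.validate({ username: 'abc', password: 'secret1' });
if (error) {
    console.log(error.details[0].message);
}
```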
-- Static Web page content and design. You should have a project that is accessible, easily navigable, and features significant content.
-- Dynamic behavior implemented with JavaScript (TypeScript is also allowed if your group wants to explore it).
-- Server-side programming *using Node.js*. Typically this will take the form of some sort of persistent data (database), authentication, and possibly server-side computation.
-- A video (less than five minutes) where each group member explains some aspect of the project. An easy way to produce this video is for you all the groups members to join a Zoom call that is recorded; each member can share their screen when they discuss the project or one member can "drive" the interface while other members narrate (this second option will probably work better.) The video should be posted on YouTube or some other accessible video hosting service. Make sure your video is less than five minutes, but long enough to successfully explain your project and show it in action. There is no minimum video length.
+**What challenges we faced in completing the project**
+- Various minor rendering bugs popped up; we were able to fix most of them, but some were hard to find and difficult to debug, and we may not have caught all of them.
+- Making sure subtasks were linked with their parent. Some minor mistakes made early on when initially setting up the project caused serious complications for implementing subtasks, as these seemingly innocent decisions made tracking which item we were working on exceedingly difficult (a sketch of the parent link we settled on follows below).
+- Visualization of subtasks. Displaying the subtasks as visually distinct from the main tasks was harder than expected, and we had to settle for just using different colors.
+- It was difficult to allow for collapsible subtasks without causing issues where some of a task's subtasks would be collapsed but not all.
+- Issues with updating the database when a user clicked the checkbox to mark a task as complete or no longer complete.
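Because the parent/child link was the trickiest part, here is a rough sketch of how a subtask can reference its parent through the `parent` field on the `Task` model added in this PR. The Express router, route paths, and request fields shown here are assumptions for illustration only; the app's real endpoints may be organized differently.

```js
// Rough sketch of parent/subtask linking via the `parent` field in model/Task.js.
// The route paths and request body are hypothetical; only the Task schema fields
// (username, task, due, check, parent) come from this PR.
const express = require('express');
const Task = require('./model/Task');

const router = express.Router();

// Create a subtask: store the parent task's _id (as a string) in `parent`.
router.post('/subtask', async (req, res) => {
    const subtask = new Task({
        username: req.body.username,
        task: req.body.task,
        due: req.body.due,
        check: false,
        parent: req.body.parentId // "" (the schema default) marks a top-level task
    });
    const saved = await subtask.save();
    res.send(saved);
});

// Fetch the subtasks of one task so they can be rendered underneath it.
router.get('/subtasks/:parentId', async (req, res) => {
    const subtasks = await Task.find({ parent: req.params.parentId });
    res.send(subtasks);
});

module.exports = router;
```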
-## Project ideation
-Excellent projects typically serve someone/some group; for this assignment you need to define your users and stakeholders. I encourage you to identify projects that will have impact, either artistically, politically, or in terms of productivity.
+**What each group member was responsible for designing / developing**
-## Logistics
-### Team size
-Students are will work in teams of 3-5 students for the project; teams of two can be approved with the permission of the instructor. Working in teams should help enable you to build a good project in a limited amount of time. Use the `#project-logistics` channel in Discord to pitch ideas for final projects and/or find fellow team members as needed.
+Orest Ropi
+- Login/Registration design and implementation
+- Main task creation, deletion, and editing (design and implementation)
+- CSS styling
+- Fixed bugs/testing
+- Documentation
-Teams must be in place by end of day on Saturday, September 25th. If you have not identified a team at this point, you will be assigned a team. You will be given some class time on Monday to work on your proposal, but please plan on reserving additional time as needed.
+Keval Ashara
+- Initial project setup
+- Set up key technologies used in the project
+- Helped implement subtasks
-### Deliverables
+Daniel Stusalitus
+- Implemented subtasks
+- Updated existing code to work with subtasks
+- Fixed bugs/testing
+- Created the database
-__Proposal:__
-Provide an outline of your project direction and the names of associated team members.
-The outline should have enough detail so that staff can determine if it meets the minimum expectations, or if it goes too far to be reasonable by the deadline. Please include a general description of a project, and list of key technologies/libraries you plan on using (e.g. React, Three.js, Svelte, TypeScript etc.). Name the file proposal.md and submit a pull request.
-Submit a PR to turn it in by Monday, September 27th at11:59 PM. Only one pull request is required per team.
+**Link to Project Video**
+https://youtu.be/gQYvom4qQxo
+**Link to Glitch Page**
+https://group20-final.glitch.me/
-There are no other scheduled checkpoints for your project.
-#### Turning in Your Outline / Project
-Submit a second PR on the final project repo to turn in your app and code. Again, only one pull request per team.
+
+**Design Achievements**
+Design Achievement 1:
+(5 points) How our site uses the CRAP principles from the Non-Designer's Design Book readings
+Which element received the most emphasis (contrast) on each page?
-Deploy your app, in the form of a webpage, to Glitch/Heroku/Digital Ocean or some other service; it is critical that the application functions correctly wherever you post it.
+The elements that received the most contrast were my buttons and title. They were what I wanted the user's attention to be drawn to. Information about what they can do and how they can do it is the user's main priority. I used dark blue for the title and black for the buttons because they contrasted well with the white background. On my login page I also made the register and login buttons have a colored background. This is so the user can immediately locate and use them. I used grey boxes on the login page to show which sections were for logging in and which were for registering, so the user could more easily differentiate between the two. Lastly, I left the background white (on my login page and the page with data) so the user would focus on the application they are using and not get distracted by unnecessary images or colors.
-The README for your second pull request doesn’t need to be a formal report, but it should contain:
+How did you use proximity to organize the visual information on your page?
-1. A brief description of what you created, and a link to the project itself (two paragraphs of text)
-2. Any additional instructions that might be needed to fully use your project (login information etc.)
-3. An outline of the technologies you used and how you used them.
-4. What challenges you faced in completing the project.
-5. What each group member was responsible for designing / developing.
-6. A link to your project video.
+I used proximity to organize the visual information on my page by grouping together items that were related to one another. On my login page I put all of the login information together inside a box, which let the user quickly find everything they needed to log in. I also put all of the registration information together inside a box, which let the user quickly find everything they needed to register. Grouping these items on my login page also reduced clutter. On the page with the user data I grouped together all the information of a single entry. On the user data page, I also grouped the title with the table so the user knows exactly what they can put into the table.
-Think of 1,3, and 4 in particular in a similar vein to the design / tech achievements for A1—A4… make a case for why what you did was challenging and why your implementation deserves a grade of 100%.
+What design elements (colors, fonts, layouts, etc.) did you use repeatedly throughout your site?
-## FAQs
+I used the design elements of colors, fonts, and layout repeatedly throughout my site. I used color to show the similarities between logging in and registering: both require a username and a password to be entered, so I made their box, text, and background colors the same. It also helps distinguish between the two options; even though both have a similar color scheme and layout, it is obvious that they are not interconnected. On my data page I repeatedly use a blue color and a large font to show important information. The user eventually gets used to the blue color and large font signifying important information that needs to be read. I also use bold black text for all the labels of the tables. From this the user can conclude that those are labels and not another data entry.
+
+How did you use alignment to organize information and/or increase contrast for particular elements?
+
+I used alignment to organize information and increase contrast for particular elements. For example, on the login page the login button and the header for the login button are both slightly to the left of the center of the screen. This allows the user to group them together (i.e. they would associate 'Log in here!' with the log in button). Likewise, the register button and the header for the register button are both slightly to the left of the center of the screen. This allows the user to group them together (i.e. they would associate 'Register in here!' with the register button). I also aligned the buttons and headers towards the center, so they would still be close to the boxes where the user enters the data. On my user data page all the elements in the table cells are center-aligned so they are easier to see. This also makes the cells more distinguishable from one another, giving increased contrast between them.
+
+Design Achievement 2:
+(5 points per person, with a max of 10 points) Test your user interface with other students in the class. Define a specific task for them to complete (ideally something short that takes <10 minutes), and then use the think-aloud protocol to obtain feedback on your design (talk-aloud is also fine). Important considerations when designing your study:
+
+Task: Register for an account, create some tasks, create some subtasks, edit some tasks, edit some subtasks, delete some tasks, and delete some subtasks.
+
+(5 points) Provide the last name of each person you conducted the evaluation with: Ropi (this was my brother; I'm not sure if this is allowed, but I don't know anyone else in the class).
+
+What problems did the user have with your design? He said he did not like how the background of the webpage was completely white. He called the login page "boring". Creating a subtask was also confusing for him at first, because the create-a-task button and the create-a-subtask button both used the same input text field.
+
+What comments did they make that surprised you? He said it was really easy to accomplish his task because he could read the content on the screen very clearly. The font was large and the colors were complementary.
+
+What would you change about the interface based on their feedback? I would add something to make the user feel excited about the website on the login page. For example, maybe an animation or a small game that could serve as a "wow factor". I would also make it so that creating a subtask is done through a separate input text field or a pop-up.
+
+
+**Technological Achievements**
+Technological Achievement 1: (10 points)
+We implemented one of our large stretch goals of creating login/register functionality. We achieved this using cookies, @hapi/joi, and MongoDB/Mongoose (a sketch of this flow follows below).
-- **Can I use XYZ framework?** You can use any web-based frameworks or tools available, but for your server programming you need to use Node.js. Your client-side scripting language should be either JavaScript or TypeScript.
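Below is a rough sketch of the register flow described in Technological Achievement 1, combining the pieces this PR adds (the `User` model and @hapi/joi validation) with a cookie that keeps the user logged in across refreshes. The Express wiring, route path, cookie name, and redirect target are assumptions for illustration; the app's real routes may differ, and password hashing is omitted here.

```js
// Rough sketch of the register flow from Technological Achievement 1.
// Assumes an Express app; the route path, cookie name, and redirect target are
// hypothetical. Only the User model and the @hapi/joi rules come from this PR.
const express = require('express');
const Joi = require('@hapi/joi');
const User = require('./model/User');

const app = express();
app.use(express.json());

const registerSchema = Joi.object({
    username: Joi.string().min(3).max(200).required(),
    password: Joi.string().min(6).max(1024).required()
});

app.post('/register', async (req, res) => {
    // 1. Validate the input with @hapi/joi.
    const { error } = registerSchema.validate(req.body);
    if (error) return res.status(400).send(error.details[0].message);

    // 2. Reject duplicate usernames, then store the new user with Mongoose.
    const exists = await User.findOne({ username: req.body.username });
    if (exists) return res.status(400).send('Username already taken');

    const user = new User({ username: req.body.username, password: req.body.password });
    await user.save();

    // 3. Set a cookie so the user stays logged in after a refresh, then send them on.
    res.cookie('username', user.username);
    res.redirect('/welcome');
});
```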
diff --git a/model/Task.js b/model/Task.js new file mode 100644 index 000000000..b80a36dd1 --- /dev/null +++ b/model/Task.js @@ -0,0 +1,43 @@ +const mongoose = require('mongoose'); + +var Schema = mongoose.Schema, + ObjectId = Schema.ObjectId; + +const taskSchema = new mongoose.Schema({ + username: { + type: String, + required: true, + max: 200, + min: 3 + }, + task:{ + type: String, + required: true, + max: 1024, + min: 6 + }, + due:{ + type: String, + required: true, + max: 1024, + }, + check:{ + type: Boolean, + required: true, + }, + date:{ + type: Date, + default: Date.now + }, + parent: { + type: String, + default: "" + } +}); + +taskSchema.virtual('categoryId').get(function() { + return this._id +}) + +module.exports = mongoose.model('Task', taskSchema); + diff --git a/model/User.js b/model/User.js new file mode 100644 index 000000000..feffc5e05 --- /dev/null +++ b/model/User.js @@ -0,0 +1,22 @@ +const mongoose = require('mongoose'); + +const userlogInSchema = new mongoose.Schema({ + username: { + type: String, + required: true, + max: 200, + min: 3 + }, + password:{ + type: String, + required: true, + max: 1024, + min: 6 + }, + date:{ + type: Date, + default: Date.now + } +}); + +module.exports = mongoose.model('User', userlogInSchema); \ No newline at end of file diff --git a/node_modules/.bin/ejs b/node_modules/.bin/ejs new file mode 100644 index 000000000..0b9c6b751 --- /dev/null +++ b/node_modules/.bin/ejs @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../ejs/bin/cli.js" "$@" + ret=$? +else + node "$basedir/../ejs/bin/cli.js" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/.bin/ejs.cmd b/node_modules/.bin/ejs.cmd new file mode 100644 index 000000000..3bb42adda --- /dev/null +++ b/node_modules/.bin/ejs.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\ejs\bin\cli.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/ejs.ps1 b/node_modules/.bin/ejs.ps1 new file mode 100644 index 000000000..78e73c8af --- /dev/null +++ b/node_modules/.bin/ejs.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../ejs/bin/cli.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../ejs/bin/cli.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/is-ci b/node_modules/.bin/is-ci new file mode 100644 index 000000000..e79342ff2 --- /dev/null +++ b/node_modules/.bin/is-ci @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../is-ci/bin.js" "$@" + ret=$? +else + node "$basedir/../is-ci/bin.js" "$@" + ret=$? 
+fi +exit $ret diff --git a/node_modules/.bin/is-ci.cmd b/node_modules/.bin/is-ci.cmd new file mode 100644 index 000000000..e3276c084 --- /dev/null +++ b/node_modules/.bin/is-ci.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\is-ci\bin.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/is-ci.ps1 b/node_modules/.bin/is-ci.ps1 new file mode 100644 index 000000000..3fe23403d --- /dev/null +++ b/node_modules/.bin/is-ci.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../is-ci/bin.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../is-ci/bin.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/jake b/node_modules/.bin/jake new file mode 100644 index 000000000..df203c157 --- /dev/null +++ b/node_modules/.bin/jake @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../jake/bin/cli.js" "$@" + ret=$? +else + node "$basedir/../jake/bin/cli.js" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/.bin/jake.cmd b/node_modules/.bin/jake.cmd new file mode 100644 index 000000000..5b09dcb89 --- /dev/null +++ b/node_modules/.bin/jake.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\jake\bin\cli.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/jake.ps1 b/node_modules/.bin/jake.ps1 new file mode 100644 index 000000000..bbe80ead3 --- /dev/null +++ b/node_modules/.bin/jake.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../jake/bin/cli.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../jake/bin/cli.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/mime b/node_modules/.bin/mime new file mode 100644 index 000000000..91e5e16a6 --- /dev/null +++ b/node_modules/.bin/mime @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../mime/cli.js" "$@" + ret=$? +else + node "$basedir/../mime/cli.js" "$@" + ret=$? 
+fi +exit $ret diff --git a/node_modules/.bin/mime.cmd b/node_modules/.bin/mime.cmd new file mode 100644 index 000000000..746a27981 --- /dev/null +++ b/node_modules/.bin/mime.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\mime\cli.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/mime.ps1 b/node_modules/.bin/mime.ps1 new file mode 100644 index 000000000..a6f6f4700 --- /dev/null +++ b/node_modules/.bin/mime.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../mime/cli.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../mime/cli.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/nodemon b/node_modules/.bin/nodemon new file mode 100644 index 000000000..439386d99 --- /dev/null +++ b/node_modules/.bin/nodemon @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../nodemon/bin/nodemon.js" "$@" + ret=$? +else + node "$basedir/../nodemon/bin/nodemon.js" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/.bin/nodemon.cmd b/node_modules/.bin/nodemon.cmd new file mode 100644 index 000000000..2ef288808 --- /dev/null +++ b/node_modules/.bin/nodemon.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\nodemon\bin\nodemon.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/nodemon.ps1 b/node_modules/.bin/nodemon.ps1 new file mode 100644 index 000000000..413e5cb80 --- /dev/null +++ b/node_modules/.bin/nodemon.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../nodemon/bin/nodemon.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../nodemon/bin/nodemon.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/nodetouch b/node_modules/.bin/nodetouch new file mode 100644 index 000000000..1f7f0015b --- /dev/null +++ b/node_modules/.bin/nodetouch @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../touch/bin/nodetouch.js" "$@" + ret=$? +else + node "$basedir/../touch/bin/nodetouch.js" "$@" + ret=$? 
+fi +exit $ret diff --git a/node_modules/.bin/nodetouch.cmd b/node_modules/.bin/nodetouch.cmd new file mode 100644 index 000000000..1f78c68a9 --- /dev/null +++ b/node_modules/.bin/nodetouch.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\touch\bin\nodetouch.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/nodetouch.ps1 b/node_modules/.bin/nodetouch.ps1 new file mode 100644 index 000000000..236659cef --- /dev/null +++ b/node_modules/.bin/nodetouch.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../touch/bin/nodetouch.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../touch/bin/nodetouch.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/nopt b/node_modules/.bin/nopt new file mode 100644 index 000000000..e658aac45 --- /dev/null +++ b/node_modules/.bin/nopt @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../nopt/bin/nopt.js" "$@" + ret=$? +else + node "$basedir/../nopt/bin/nopt.js" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/.bin/nopt.cmd b/node_modules/.bin/nopt.cmd new file mode 100644 index 000000000..c92ec036d --- /dev/null +++ b/node_modules/.bin/nopt.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\nopt\bin\nopt.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/nopt.ps1 b/node_modules/.bin/nopt.ps1 new file mode 100644 index 000000000..68c40bf91 --- /dev/null +++ b/node_modules/.bin/nopt.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../nopt/bin/nopt.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../nopt/bin/nopt.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/rc b/node_modules/.bin/rc new file mode 100644 index 000000000..9e0162675 --- /dev/null +++ b/node_modules/.bin/rc @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../rc/cli.js" "$@" + ret=$? +else + node "$basedir/../rc/cli.js" "$@" + ret=$? 
+fi +exit $ret diff --git a/node_modules/.bin/rc.cmd b/node_modules/.bin/rc.cmd new file mode 100644 index 000000000..aedece9d5 --- /dev/null +++ b/node_modules/.bin/rc.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\rc\cli.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/rc.ps1 b/node_modules/.bin/rc.ps1 new file mode 100644 index 000000000..ac2cd2a5a --- /dev/null +++ b/node_modules/.bin/rc.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../rc/cli.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../rc/cli.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/.bin/semver b/node_modules/.bin/semver new file mode 100644 index 000000000..10497aa88 --- /dev/null +++ b/node_modules/.bin/semver @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../semver/bin/semver" "$@" + ret=$? +else + node "$basedir/../semver/bin/semver" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/.bin/semver.cmd b/node_modules/.bin/semver.cmd new file mode 100644 index 000000000..eb3aaa1e7 --- /dev/null +++ b/node_modules/.bin/semver.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\semver\bin\semver" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/.bin/semver.ps1 b/node_modules/.bin/semver.ps1 new file mode 100644 index 000000000..a3315ffc6 --- /dev/null +++ b/node_modules/.bin/semver.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../semver/bin/semver" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../semver/bin/semver" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/@hapi/address/LICENSE.md b/node_modules/@hapi/address/LICENSE.md new file mode 100644 index 000000000..d3775686c --- /dev/null +++ b/node_modules/@hapi/address/LICENSE.md @@ -0,0 +1,9 @@ +Copyright (c) 2019-2020, Project contributors +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
+ * The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@hapi/address/README.md b/node_modules/@hapi/address/README.md new file mode 100644 index 000000000..c12d05430 --- /dev/null +++ b/node_modules/@hapi/address/README.md @@ -0,0 +1,17 @@ + + +# @hapi/address + +#### Validate email address and domain. + +**address** is part of the **hapi** ecosystem and was designed to work seamlessly with the [hapi web framework](https://hapi.dev) and its other components (but works great on its own or with other frameworks). If you are using a different web framework and find this module useful, check out [hapi](https://hapi.dev) they work even better together. + +### Visit the [hapi.dev](https://hapi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://hapi.dev/family/address/) +- [Versions status](https://hapi.dev/resources/status/#address) +- [Changelog](https://hapi.dev/family/address/changelog/) +- [Project policies](https://hapi.dev/policies/) +- [Free and commercial support options](https://hapi.dev/support/) diff --git a/node_modules/@hapi/address/lib/decode.js b/node_modules/@hapi/address/lib/decode.js new file mode 100644 index 000000000..06a123694 --- /dev/null +++ b/node_modules/@hapi/address/lib/decode.js @@ -0,0 +1,120 @@ +'use strict'; + +// Adapted from: +// Copyright (c) 2017-2019 Justin Ridgewell, MIT Licensed, https://github.com/jridgewell/safe-decode-string-component +// Copyright (c) 2008-2009 Bjoern Hoehrmann , MIT Licensed, http://bjoern.hoehrmann.de/utf-8/decoder/dfa/ + + +const internals = {}; + + +exports.decode = function (string) { + + let percentPos = string.indexOf('%'); + if (percentPos === -1) { + return string; + } + + let decoded = ''; + let last = 0; + let codepoint = 0; + let startOfOctets = percentPos; + let state = internals.utf8.accept; + + while (percentPos > -1 && + percentPos < string.length) { + + const high = internals.resolveHex(string[percentPos + 1], 4); + const low = internals.resolveHex(string[percentPos + 2], 0); + const byte = high | low; + const type = internals.utf8.data[byte]; + state = internals.utf8.data[256 + state + type]; + codepoint = (codepoint << 6) | (byte & internals.utf8.data[364 + type]); + + if (state === internals.utf8.accept) { + decoded += string.slice(last, startOfOctets); + decoded += codepoint <= 0xFFFF + ? 
String.fromCharCode(codepoint) + : String.fromCharCode(0xD7C0 + (codepoint >> 10), 0xDC00 + (codepoint & 0x3FF)); + + codepoint = 0; + last = percentPos + 3; + percentPos = string.indexOf('%', last); + startOfOctets = percentPos; + continue; + } + + if (state === internals.utf8.reject) { + return null; + } + + percentPos += 3; + + if (percentPos >= string.length || + string[percentPos] !== '%') { + + return null; + } + } + + return decoded + string.slice(last); +}; + + +internals.resolveHex = function (char, shift) { + + const i = internals.hex[char]; + return i === undefined ? 255 : i << shift; +}; + + +internals.hex = { + '0': 0, '1': 1, '2': 2, '3': 3, '4': 4, + '5': 5, '6': 6, '7': 7, '8': 8, '9': 9, + 'a': 10, 'A': 10, 'b': 11, 'B': 11, 'c': 12, + 'C': 12, 'd': 13, 'D': 13, 'e': 14, 'E': 14, + 'f': 15, 'F': 15 +}; + + +internals.utf8 = { + accept: 12, + reject: 0, + data: [ + + // Maps bytes to character to a transition + + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, + 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, + 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, + 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, + 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, + 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 7, 7, + 10, 9, 9, 9, 11, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, + + // Maps a state to a new state when adding a transition + + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 12, 0, 0, 0, 0, 24, 36, 48, 60, 72, 84, 96, + 0, 12, 12, 12, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 24, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 24, 24, 24, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 24, 24, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 48, 48, 48, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 48, 48, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 48, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + + // Maps the current transition to a mask that needs to apply to the byte + + 0x7F, 0x3F, 0x3F, 0x3F, 0x00, 0x1F, 0x0F, 0x0F, 0x0F, 0x07, 0x07, 0x07 + ] +}; diff --git a/node_modules/@hapi/address/lib/domain.js b/node_modules/@hapi/address/lib/domain.js new file mode 100644 index 000000000..e96bf5ef6 --- /dev/null +++ b/node_modules/@hapi/address/lib/domain.js @@ -0,0 +1,113 @@ +'use strict'; + +const Url = require('url'); + +const Errors = require('./errors'); + + +const internals = { + minDomainSegments: 2, + nonAsciiRx: /[^\x00-\x7f]/, + domainControlRx: /[\x00-\x20@\:\/]/, // Control + space + separators + tldSegmentRx: /^[a-zA-Z](?:[a-zA-Z0-9\-]*[a-zA-Z0-9])?$/, + domainSegmentRx: /^[a-zA-Z0-9](?:[a-zA-Z0-9\-]*[a-zA-Z0-9])?$/, + URL: Url.URL || URL // $lab:coverage:ignore$ +}; + + +exports.analyze = function (domain, options = {}) { + + if (typeof domain !== 'string') { + throw new Error('Invalid input: domain must be a string'); + } + + if (!domain) { + return Errors.code('DOMAIN_NON_EMPTY_STRING'); + } + + if (domain.length > 256) { + return Errors.code('DOMAIN_TOO_LONG'); + } + + const ascii = !internals.nonAsciiRx.test(domain); + if (!ascii) { + if (options.allowUnicode === false) { // Defaults to true + return Errors.code('DOMAIN_INVALID_UNICODE_CHARS'); + } + + domain = domain.normalize('NFC'); + } + + if (internals.domainControlRx.test(domain)) { 
+ return Errors.code('DOMAIN_INVALID_CHARS'); + } + + domain = internals.punycode(domain); + + // https://tools.ietf.org/html/rfc1035 section 2.3.1 + + const minDomainSegments = options.minDomainSegments || internals.minDomainSegments; + + const segments = domain.split('.'); + if (segments.length < minDomainSegments) { + return Errors.code('DOMAIN_SEGMENTS_COUNT'); + } + + if (options.maxDomainSegments) { + if (segments.length > options.maxDomainSegments) { + return Errors.code('DOMAIN_SEGMENTS_COUNT_MAX'); + } + } + + const tlds = options.tlds; + if (tlds) { + const tld = segments[segments.length - 1].toLowerCase(); + if (tlds.deny && tlds.deny.has(tld) || + tlds.allow && !tlds.allow.has(tld)) { + + return Errors.code('DOMAIN_FORBIDDEN_TLDS'); + } + } + + for (let i = 0; i < segments.length; ++i) { + const segment = segments[i]; + + if (!segment.length) { + return Errors.code('DOMAIN_EMPTY_SEGMENT'); + } + + if (segment.length > 63) { + return Errors.code('DOMAIN_LONG_SEGMENT'); + } + + if (i < segments.length - 1) { + if (!internals.domainSegmentRx.test(segment)) { + return Errors.code('DOMAIN_INVALID_CHARS'); + } + } + else { + if (!internals.tldSegmentRx.test(segment)) { + return Errors.code('DOMAIN_INVALID_TLDS_CHARS'); + } + } + } + + return null; +}; + + +exports.isValid = function (domain, options) { + + return !exports.analyze(domain, options); +}; + + +internals.punycode = function (domain) { + + try { + return new internals.URL(`http://${domain}`).host; + } + catch (err) { + return domain; + } +}; diff --git a/node_modules/@hapi/address/lib/email.js b/node_modules/@hapi/address/lib/email.js new file mode 100644 index 000000000..8343ab7e1 --- /dev/null +++ b/node_modules/@hapi/address/lib/email.js @@ -0,0 +1,170 @@ +'use strict'; + +const Util = require('util'); + +const Domain = require('./domain'); +const Errors = require('./errors'); + + +const internals = { + nonAsciiRx: /[^\x00-\x7f]/, + encoder: new (Util.TextEncoder || TextEncoder)() // $lab:coverage:ignore$ +}; + + +exports.analyze = function (email, options) { + + return internals.email(email, options); +}; + + +exports.isValid = function (email, options) { + + return !internals.email(email, options); +}; + + +internals.email = function (email, options = {}) { + + if (typeof email !== 'string') { + throw new Error('Invalid input: email must be a string'); + } + + if (!email) { + return Errors.code('EMPTY_STRING'); + } + + // Unicode + + const ascii = !internals.nonAsciiRx.test(email); + if (!ascii) { + if (options.allowUnicode === false) { // Defaults to true + return Errors.code('FORBIDDEN_UNICODE'); + } + + email = email.normalize('NFC'); + } + + // Basic structure + + const parts = email.split('@'); + if (parts.length !== 2) { + return parts.length > 2 ? 
Errors.code('MULTIPLE_AT_CHAR') : Errors.code('MISSING_AT_CHAR'); + } + + const [local, domain] = parts; + + if (!local) { + return Errors.code('EMPTY_LOCAL'); + } + + if (!options.ignoreLength) { + if (email.length > 254) { // http://tools.ietf.org/html/rfc5321#section-4.5.3.1.3 + return Errors.code('ADDRESS_TOO_LONG'); + } + + if (internals.encoder.encode(local).length > 64) { // http://tools.ietf.org/html/rfc5321#section-4.5.3.1.1 + return Errors.code('LOCAL_TOO_LONG'); + } + } + + // Validate parts + + return internals.local(local, ascii) || Domain.analyze(domain, options); +}; + + +internals.local = function (local, ascii) { + + const segments = local.split('.'); + for (const segment of segments) { + if (!segment.length) { + return Errors.code('EMPTY_LOCAL_SEGMENT'); + } + + if (ascii) { + if (!internals.atextRx.test(segment)) { + return Errors.code('INVALID_LOCAL_CHARS'); + } + + continue; + } + + for (const char of segment) { + if (internals.atextRx.test(char)) { + continue; + } + + const binary = internals.binary(char); + if (!internals.atomRx.test(binary)) { + return Errors.code('INVALID_LOCAL_CHARS'); + } + } + } +}; + + +internals.binary = function (char) { + + return Array.from(internals.encoder.encode(char)).map((v) => String.fromCharCode(v)).join(''); +}; + + +/* + From RFC 5321: + + Mailbox = Local-part "@" ( Domain / address-literal ) + + Local-part = Dot-string / Quoted-string + Dot-string = Atom *("." Atom) + Atom = 1*atext + atext = ALPHA / DIGIT / "!" / "#" / "$" / "%" / "&" / "'" / "*" / "+" / "-" / "/" / "=" / "?" / "^" / "_" / "`" / "{" / "|" / "}" / "~" + + Domain = sub-domain *("." sub-domain) + sub-domain = Let-dig [Ldh-str] + Let-dig = ALPHA / DIGIT + Ldh-str = *( ALPHA / DIGIT / "-" ) Let-dig + + ALPHA = %x41-5A / %x61-7A ; a-z, A-Z + DIGIT = %x30-39 ; 0-9 + + From RFC 6531: + + sub-domain =/ U-label + atext =/ UTF8-non-ascii + + UTF8-non-ascii = UTF8-2 / UTF8-3 / UTF8-4 + + UTF8-2 = %xC2-DF UTF8-tail + UTF8-3 = %xE0 %xA0-BF UTF8-tail / + %xE1-EC 2( UTF8-tail ) / + %xED %x80-9F UTF8-tail / + %xEE-EF 2( UTF8-tail ) + UTF8-4 = %xF0 %x90-BF 2( UTF8-tail ) / + %xF1-F3 3( UTF8-tail ) / + %xF4 %x80-8F 2( UTF8-tail ) + + UTF8-tail = %x80-BF + + Note: The following are not supported: + + RFC 5321: address-literal, Quoted-string + RFC 5322: obs-*, CFWS +*/ + + +internals.atextRx = /^[\w!#\$%&'\*\+\-/=\?\^`\{\|\}~]+$/; // _ included in \w + + +internals.atomRx = new RegExp([ + + // %xC2-DF UTF8-tail + '(?:[\\xc2-\\xdf][\\x80-\\xbf])', + + // %xE0 %xA0-BF UTF8-tail %xE1-EC 2( UTF8-tail ) %xED %x80-9F UTF8-tail %xEE-EF 2( UTF8-tail ) + '(?:\\xe0[\\xa0-\\xbf][\\x80-\\xbf])|(?:[\\xe1-\\xec][\\x80-\\xbf]{2})|(?:\\xed[\\x80-\\x9f][\\x80-\\xbf])|(?:[\\xee-\\xef][\\x80-\\xbf]{2})', + + // %xF0 %x90-BF 2( UTF8-tail ) %xF1-F3 3( UTF8-tail ) %xF4 %x80-8F 2( UTF8-tail ) + '(?:\\xf0[\\x90-\\xbf][\\x80-\\xbf]{2})|(?:[\\xf1-\\xf3][\\x80-\\xbf]{3})|(?:\\xf4[\\x80-\\x8f][\\x80-\\xbf]{2})' + +].join('|')); diff --git a/node_modules/@hapi/address/lib/errors.js b/node_modules/@hapi/address/lib/errors.js new file mode 100644 index 000000000..001dd1068 --- /dev/null +++ b/node_modules/@hapi/address/lib/errors.js @@ -0,0 +1,29 @@ +'use strict'; + +exports.codes = { + EMPTY_STRING: 'Address must be a non-empty string', + FORBIDDEN_UNICODE: 'Address contains forbidden Unicode characters', + MULTIPLE_AT_CHAR: 'Address cannot contain more than one @ character', + MISSING_AT_CHAR: 'Address must contain one @ character', + EMPTY_LOCAL: 'Address local part cannot be empty', + ADDRESS_TOO_LONG: 
'Address too long', + LOCAL_TOO_LONG: 'Address local part too long', + EMPTY_LOCAL_SEGMENT: 'Address local part contains empty dot-separated segment', + INVALID_LOCAL_CHARS: 'Address local part contains invalid character', + DOMAIN_NON_EMPTY_STRING: 'Domain must be a non-empty string', + DOMAIN_TOO_LONG: 'Domain too long', + DOMAIN_INVALID_UNICODE_CHARS: 'Domain contains forbidden Unicode characters', + DOMAIN_INVALID_CHARS: 'Domain contains invalid character', + DOMAIN_INVALID_TLDS_CHARS: 'Domain contains invalid tld character', + DOMAIN_SEGMENTS_COUNT: 'Domain lacks the minimum required number of segments', + DOMAIN_SEGMENTS_COUNT_MAX: 'Domain contains too many segments', + DOMAIN_FORBIDDEN_TLDS: 'Domain uses forbidden TLD', + DOMAIN_EMPTY_SEGMENT: 'Domain contains empty dot-separated segment', + DOMAIN_LONG_SEGMENT: 'Domain contains dot-separated segment that is too long' +}; + + +exports.code = function (code) { + + return { code, error: exports.codes[code] }; +}; diff --git a/node_modules/@hapi/address/lib/index.d.ts b/node_modules/@hapi/address/lib/index.d.ts new file mode 100644 index 000000000..a533d7371 --- /dev/null +++ b/node_modules/@hapi/address/lib/index.d.ts @@ -0,0 +1,255 @@ +/// + +import * as Hoek from '@hapi/hoek'; + + +export namespace domain { + + /** + * Analyzes a string to verify it is a valid domain name. + * + * @param domain - the domain name to validate. + * @param options - optional settings. + * + * @return - undefined when valid, otherwise an object with single error key with a string message value. + */ + function analyze(domain: string, options?: Options): Analysis | null; + + /** + * Analyzes a string to verify it is a valid domain name. + * + * @param domain - the domain name to validate. + * @param options - optional settings. + * + * @return - true when valid, otherwise false. + */ + function isValid(domain: string, options?: Options): boolean; + + interface Options { + + /** + * Determines whether Unicode characters are allowed. + * + * @default true + */ + readonly allowUnicode?: boolean; + + /** + * The minimum number of domain segments (e.g. `x.y.z` has 3 segments) required. + * + * @default 2 + */ + readonly minDomainSegments?: number; + + /** + * Top-level-domain options + * + * @default true + */ + readonly tlds?: Tlds.Allow | Tlds.Deny | boolean; + } + + namespace Tlds { + + interface Allow { + + readonly allow: Set | true; + } + + interface Deny { + + readonly deny: Set; + } + } +} + + +export namespace email { + + /** + * Analyzes a string to verify it is a valid email address. + * + * @param email - the email address to validate. + * @param options - optional settings. + * + * @return - undefined when valid, otherwise an object with single error key with a string message value. + */ + function analyze(email: string, options?: Options): Analysis | null; + + /** + * Analyzes a string to verify it is a valid email address. + * + * @param email - the email address to validate. + * @param options - optional settings. + * + * @return - true when valid, otherwise false. + */ + function isValid(email: string, options?: Options): boolean; + + interface Options extends domain.Options { + + /** + * Determines whether to ignore the standards maximum email length limit. + * + * @default false + */ + readonly ignoreLength?: boolean; + } +} + + +export interface Analysis { + + /** + * The reason validation failed. + */ + error: string; + + /** + * The error code. 
+ */ + code: string; +} + + +export const errors: Record; + + +export namespace ip { + + /** + * Generates a regular expression used to validate IP addresses. + * + * @param options - optional settings. + * + * @returns an object with the regular expression and meta data. + */ + function regex(options?: Options): Expression; + + interface Options { + + /** + * The required CIDR mode. + * + * @default 'optional' + */ + readonly cidr?: Cidr; + + /** + * The allowed versions. + * + * @default ['ipv4', 'ipv6', 'ipvfuture'] + */ + readonly version?: Version | Version[]; + } + + type Cidr = 'optional' | 'required' | 'forbidden'; + type Version = 'ipv4' | 'ipv6' | 'ipvfuture'; + + interface Expression { + + /** + * The CIDR mode. + */ + cidr: Cidr; + + /** + * The raw regular expression string. + */ + raw: string; + + /** + * The regular expression. + */ + regex: RegExp; + + /** + * The array of versions allowed. + */ + versions: Version[]; + } +} + + +export namespace uri { + + /** + * Faster version of decodeURIComponent() that does not throw. + * + * @param string - the URL string to decode. + * + * @returns the decoded string or null if invalid. + */ + function decode(string: string): string | null; + + /** + * Generates a regular expression used to validate URI addresses. + * + * @param options - optional settings. + * + * @returns an object with the regular expression and meta data. + */ + function regex(options?: Options): Expression; + + type Options = Hoek.ts.XOR; + + namespace Options { + + interface Query { + + /** + * Allow the use of [] in query parameters. + * + * @default false + */ + readonly allowQuerySquareBrackets?: boolean; + } + + interface Relative extends Query { + + /** + * Requires the URI to be relative. + * + * @default false + */ + readonly relativeOnly?: boolean; + } + + interface Options extends Query { + + /** + * Allow relative URIs. + * + * @default false + */ + readonly allowRelative?: boolean; + + /** + * Capture domain segment ($1). + * + * @default false + */ + readonly domain?: boolean; + + /** + * The allowed URI schemes. + */ + readonly scheme?: Scheme | Scheme[]; + } + + type Scheme = string | RegExp; + } + + interface Expression { + + /** + * The raw regular expression string. + */ + raw: string; + + /** + * The regular expression. 
+ */ + regex: RegExp; + } +} diff --git a/node_modules/@hapi/address/lib/index.js b/node_modules/@hapi/address/lib/index.js new file mode 100644 index 000000000..b93a9c5c2 --- /dev/null +++ b/node_modules/@hapi/address/lib/index.js @@ -0,0 +1,97 @@ +'use strict'; + +const Decode = require('./decode'); +const Domain = require('./domain'); +const Email = require('./email'); +const Errors = require('./errors'); +const Ip = require('./ip'); +const Tlds = require('./tlds'); +const Uri = require('./uri'); + + +const internals = { + defaultTlds: { allow: Tlds, deny: null } +}; + + +module.exports = { + errors: Errors.codes, + + domain: { + analyze(domain, options) { + + options = internals.options(options); + return Domain.analyze(domain, options); + }, + + isValid(domain, options) { + + options = internals.options(options); + return Domain.isValid(domain, options); + } + }, + email: { + analyze(email, options) { + + options = internals.options(options); + return Email.analyze(email, options); + }, + + isValid(email, options) { + + options = internals.options(options); + return Email.isValid(email, options); + } + }, + ip: { + regex: Ip.regex + }, + uri: { + decode: Decode.decode, + regex: Uri.regex + } +}; + + +internals.options = function (options) { + + if (!options) { + return { tlds: internals.defaultTlds }; + } + + if (options.tlds === false) { // Defaults to true + return options; + } + + if (!options.tlds || + options.tlds === true) { + + return Object.assign({}, options, { tlds: internals.defaultTlds }); + } + + if (typeof options.tlds !== 'object') { + throw new Error('Invalid options: tlds must be a boolean or an object'); + } + + if (options.tlds.deny) { + if (options.tlds.deny instanceof Set === false) { + throw new Error('Invalid options: tlds.deny must be a Set object'); + } + + if (options.tlds.allow) { + throw new Error('Invalid options: cannot specify both tlds.allow and tlds.deny lists'); + } + + return options; + } + + if (options.tlds.allow === true) { + return Object.assign({}, options, { tlds: internals.defaultTlds }); + } + + if (options.tlds.allow instanceof Set === false) { + throw new Error('Invalid options: tlds.allow must be a Set object or true'); + } + + return options; +}; diff --git a/node_modules/@hapi/address/lib/ip.js b/node_modules/@hapi/address/lib/ip.js new file mode 100644 index 000000000..541b72cc7 --- /dev/null +++ b/node_modules/@hapi/address/lib/ip.js @@ -0,0 +1,63 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Uri = require('./uri'); + + +const internals = {}; + + +exports.regex = function (options = {}) { + + // CIDR + + Assert(options.cidr === undefined || typeof options.cidr === 'string', 'options.cidr must be a string'); + const cidr = options.cidr ? 
options.cidr.toLowerCase() : 'optional'; + Assert(['required', 'optional', 'forbidden'].includes(cidr), 'options.cidr must be one of required, optional, forbidden'); + + // Versions + + Assert(options.version === undefined || typeof options.version === 'string' || Array.isArray(options.version), 'options.version must be a string or an array of string'); + let versions = options.version || ['ipv4', 'ipv6', 'ipvfuture']; + if (!Array.isArray(versions)) { + versions = [versions]; + } + + Assert(versions.length >= 1, 'options.version must have at least 1 version specified'); + + for (let i = 0; i < versions.length; ++i) { + Assert(typeof versions[i] === 'string', 'options.version must only contain strings'); + versions[i] = versions[i].toLowerCase(); + Assert(['ipv4', 'ipv6', 'ipvfuture'].includes(versions[i]), 'options.version contains unknown version ' + versions[i] + ' - must be one of ipv4, ipv6, ipvfuture'); + } + + versions = Array.from(new Set(versions)); + + // Regex + + const parts = versions.map((version) => { + + // Forbidden + + if (cidr === 'forbidden') { + return Uri.ip[version]; + } + + // Required + + const cidrpart = `\\/${version === 'ipv4' ? Uri.ip.v4Cidr : Uri.ip.v6Cidr}`; + + if (cidr === 'required') { + return `${Uri.ip[version]}${cidrpart}`; + } + + // Optional + + return `${Uri.ip[version]}(?:${cidrpart})?`; + }); + + const raw = `(?:${parts.join('|')})`; + const regex = new RegExp(`^${raw}$`); + return { cidr, versions, regex, raw }; +}; diff --git a/node_modules/@hapi/address/lib/tlds.js b/node_modules/@hapi/address/lib/tlds.js new file mode 100644 index 000000000..77a00e3dd --- /dev/null +++ b/node_modules/@hapi/address/lib/tlds.js @@ -0,0 +1,1542 @@ +'use strict'; + +const internals = {}; + + +// http://data.iana.org/TLD/tlds-alpha-by-domain.txt +// # Version 2019091902, Last Updated Fri Sep 20 07: 07: 02 2019 UTC + + +internals.tlds = [ + 'AAA', + 'AARP', + 'ABARTH', + 'ABB', + 'ABBOTT', + 'ABBVIE', + 'ABC', + 'ABLE', + 'ABOGADO', + 'ABUDHABI', + 'AC', + 'ACADEMY', + 'ACCENTURE', + 'ACCOUNTANT', + 'ACCOUNTANTS', + 'ACO', + 'ACTOR', + 'AD', + 'ADAC', + 'ADS', + 'ADULT', + 'AE', + 'AEG', + 'AERO', + 'AETNA', + 'AF', + 'AFAMILYCOMPANY', + 'AFL', + 'AFRICA', + 'AG', + 'AGAKHAN', + 'AGENCY', + 'AI', + 'AIG', + 'AIGO', + 'AIRBUS', + 'AIRFORCE', + 'AIRTEL', + 'AKDN', + 'AL', + 'ALFAROMEO', + 'ALIBABA', + 'ALIPAY', + 'ALLFINANZ', + 'ALLSTATE', + 'ALLY', + 'ALSACE', + 'ALSTOM', + 'AM', + 'AMERICANEXPRESS', + 'AMERICANFAMILY', + 'AMEX', + 'AMFAM', + 'AMICA', + 'AMSTERDAM', + 'ANALYTICS', + 'ANDROID', + 'ANQUAN', + 'ANZ', + 'AO', + 'AOL', + 'APARTMENTS', + 'APP', + 'APPLE', + 'AQ', + 'AQUARELLE', + 'AR', + 'ARAB', + 'ARAMCO', + 'ARCHI', + 'ARMY', + 'ARPA', + 'ART', + 'ARTE', + 'AS', + 'ASDA', + 'ASIA', + 'ASSOCIATES', + 'AT', + 'ATHLETA', + 'ATTORNEY', + 'AU', + 'AUCTION', + 'AUDI', + 'AUDIBLE', + 'AUDIO', + 'AUSPOST', + 'AUTHOR', + 'AUTO', + 'AUTOS', + 'AVIANCA', + 'AW', + 'AWS', + 'AX', + 'AXA', + 'AZ', + 'AZURE', + 'BA', + 'BABY', + 'BAIDU', + 'BANAMEX', + 'BANANAREPUBLIC', + 'BAND', + 'BANK', + 'BAR', + 'BARCELONA', + 'BARCLAYCARD', + 'BARCLAYS', + 'BAREFOOT', + 'BARGAINS', + 'BASEBALL', + 'BASKETBALL', + 'BAUHAUS', + 'BAYERN', + 'BB', + 'BBC', + 'BBT', + 'BBVA', + 'BCG', + 'BCN', + 'BD', + 'BE', + 'BEATS', + 'BEAUTY', + 'BEER', + 'BENTLEY', + 'BERLIN', + 'BEST', + 'BESTBUY', + 'BET', + 'BF', + 'BG', + 'BH', + 'BHARTI', + 'BI', + 'BIBLE', + 'BID', + 'BIKE', + 'BING', + 'BINGO', + 'BIO', + 'BIZ', + 'BJ', + 'BLACK', + 'BLACKFRIDAY', + 'BLOCKBUSTER', + 'BLOG', + 'BLOOMBERG', 
+ 'BLUE', + 'BM', + 'BMS', + 'BMW', + 'BN', + 'BNPPARIBAS', + 'BO', + 'BOATS', + 'BOEHRINGER', + 'BOFA', + 'BOM', + 'BOND', + 'BOO', + 'BOOK', + 'BOOKING', + 'BOSCH', + 'BOSTIK', + 'BOSTON', + 'BOT', + 'BOUTIQUE', + 'BOX', + 'BR', + 'BRADESCO', + 'BRIDGESTONE', + 'BROADWAY', + 'BROKER', + 'BROTHER', + 'BRUSSELS', + 'BS', + 'BT', + 'BUDAPEST', + 'BUGATTI', + 'BUILD', + 'BUILDERS', + 'BUSINESS', + 'BUY', + 'BUZZ', + 'BV', + 'BW', + 'BY', + 'BZ', + 'BZH', + 'CA', + 'CAB', + 'CAFE', + 'CAL', + 'CALL', + 'CALVINKLEIN', + 'CAM', + 'CAMERA', + 'CAMP', + 'CANCERRESEARCH', + 'CANON', + 'CAPETOWN', + 'CAPITAL', + 'CAPITALONE', + 'CAR', + 'CARAVAN', + 'CARDS', + 'CARE', + 'CAREER', + 'CAREERS', + 'CARS', + 'CARTIER', + 'CASA', + 'CASE', + 'CASEIH', + 'CASH', + 'CASINO', + 'CAT', + 'CATERING', + 'CATHOLIC', + 'CBA', + 'CBN', + 'CBRE', + 'CBS', + 'CC', + 'CD', + 'CEB', + 'CENTER', + 'CEO', + 'CERN', + 'CF', + 'CFA', + 'CFD', + 'CG', + 'CH', + 'CHANEL', + 'CHANNEL', + 'CHARITY', + 'CHASE', + 'CHAT', + 'CHEAP', + 'CHINTAI', + 'CHRISTMAS', + 'CHROME', + 'CHRYSLER', + 'CHURCH', + 'CI', + 'CIPRIANI', + 'CIRCLE', + 'CISCO', + 'CITADEL', + 'CITI', + 'CITIC', + 'CITY', + 'CITYEATS', + 'CK', + 'CL', + 'CLAIMS', + 'CLEANING', + 'CLICK', + 'CLINIC', + 'CLINIQUE', + 'CLOTHING', + 'CLOUD', + 'CLUB', + 'CLUBMED', + 'CM', + 'CN', + 'CO', + 'COACH', + 'CODES', + 'COFFEE', + 'COLLEGE', + 'COLOGNE', + 'COM', + 'COMCAST', + 'COMMBANK', + 'COMMUNITY', + 'COMPANY', + 'COMPARE', + 'COMPUTER', + 'COMSEC', + 'CONDOS', + 'CONSTRUCTION', + 'CONSULTING', + 'CONTACT', + 'CONTRACTORS', + 'COOKING', + 'COOKINGCHANNEL', + 'COOL', + 'COOP', + 'CORSICA', + 'COUNTRY', + 'COUPON', + 'COUPONS', + 'COURSES', + 'CR', + 'CREDIT', + 'CREDITCARD', + 'CREDITUNION', + 'CRICKET', + 'CROWN', + 'CRS', + 'CRUISE', + 'CRUISES', + 'CSC', + 'CU', + 'CUISINELLA', + 'CV', + 'CW', + 'CX', + 'CY', + 'CYMRU', + 'CYOU', + 'CZ', + 'DABUR', + 'DAD', + 'DANCE', + 'DATA', + 'DATE', + 'DATING', + 'DATSUN', + 'DAY', + 'DCLK', + 'DDS', + 'DE', + 'DEAL', + 'DEALER', + 'DEALS', + 'DEGREE', + 'DELIVERY', + 'DELL', + 'DELOITTE', + 'DELTA', + 'DEMOCRAT', + 'DENTAL', + 'DENTIST', + 'DESI', + 'DESIGN', + 'DEV', + 'DHL', + 'DIAMONDS', + 'DIET', + 'DIGITAL', + 'DIRECT', + 'DIRECTORY', + 'DISCOUNT', + 'DISCOVER', + 'DISH', + 'DIY', + 'DJ', + 'DK', + 'DM', + 'DNP', + 'DO', + 'DOCS', + 'DOCTOR', + 'DODGE', + 'DOG', + 'DOMAINS', + 'DOT', + 'DOWNLOAD', + 'DRIVE', + 'DTV', + 'DUBAI', + 'DUCK', + 'DUNLOP', + 'DUPONT', + 'DURBAN', + 'DVAG', + 'DVR', + 'DZ', + 'EARTH', + 'EAT', + 'EC', + 'ECO', + 'EDEKA', + 'EDU', + 'EDUCATION', + 'EE', + 'EG', + 'EMAIL', + 'EMERCK', + 'ENERGY', + 'ENGINEER', + 'ENGINEERING', + 'ENTERPRISES', + 'EPSON', + 'EQUIPMENT', + 'ER', + 'ERICSSON', + 'ERNI', + 'ES', + 'ESQ', + 'ESTATE', + 'ESURANCE', + 'ET', + 'ETISALAT', + 'EU', + 'EUROVISION', + 'EUS', + 'EVENTS', + 'EVERBANK', + 'EXCHANGE', + 'EXPERT', + 'EXPOSED', + 'EXPRESS', + 'EXTRASPACE', + 'FAGE', + 'FAIL', + 'FAIRWINDS', + 'FAITH', + 'FAMILY', + 'FAN', + 'FANS', + 'FARM', + 'FARMERS', + 'FASHION', + 'FAST', + 'FEDEX', + 'FEEDBACK', + 'FERRARI', + 'FERRERO', + 'FI', + 'FIAT', + 'FIDELITY', + 'FIDO', + 'FILM', + 'FINAL', + 'FINANCE', + 'FINANCIAL', + 'FIRE', + 'FIRESTONE', + 'FIRMDALE', + 'FISH', + 'FISHING', + 'FIT', + 'FITNESS', + 'FJ', + 'FK', + 'FLICKR', + 'FLIGHTS', + 'FLIR', + 'FLORIST', + 'FLOWERS', + 'FLY', + 'FM', + 'FO', + 'FOO', + 'FOOD', + 'FOODNETWORK', + 'FOOTBALL', + 'FORD', + 'FOREX', + 'FORSALE', + 'FORUM', + 'FOUNDATION', + 'FOX', + 'FR', + 'FREE', + 'FRESENIUS', + 'FRL', + 
'FROGANS', + 'FRONTDOOR', + 'FRONTIER', + 'FTR', + 'FUJITSU', + 'FUJIXEROX', + 'FUN', + 'FUND', + 'FURNITURE', + 'FUTBOL', + 'FYI', + 'GA', + 'GAL', + 'GALLERY', + 'GALLO', + 'GALLUP', + 'GAME', + 'GAMES', + 'GAP', + 'GARDEN', + 'GAY', + 'GB', + 'GBIZ', + 'GD', + 'GDN', + 'GE', + 'GEA', + 'GENT', + 'GENTING', + 'GEORGE', + 'GF', + 'GG', + 'GGEE', + 'GH', + 'GI', + 'GIFT', + 'GIFTS', + 'GIVES', + 'GIVING', + 'GL', + 'GLADE', + 'GLASS', + 'GLE', + 'GLOBAL', + 'GLOBO', + 'GM', + 'GMAIL', + 'GMBH', + 'GMO', + 'GMX', + 'GN', + 'GODADDY', + 'GOLD', + 'GOLDPOINT', + 'GOLF', + 'GOO', + 'GOODYEAR', + 'GOOG', + 'GOOGLE', + 'GOP', + 'GOT', + 'GOV', + 'GP', + 'GQ', + 'GR', + 'GRAINGER', + 'GRAPHICS', + 'GRATIS', + 'GREEN', + 'GRIPE', + 'GROCERY', + 'GROUP', + 'GS', + 'GT', + 'GU', + 'GUARDIAN', + 'GUCCI', + 'GUGE', + 'GUIDE', + 'GUITARS', + 'GURU', + 'GW', + 'GY', + 'HAIR', + 'HAMBURG', + 'HANGOUT', + 'HAUS', + 'HBO', + 'HDFC', + 'HDFCBANK', + 'HEALTH', + 'HEALTHCARE', + 'HELP', + 'HELSINKI', + 'HERE', + 'HERMES', + 'HGTV', + 'HIPHOP', + 'HISAMITSU', + 'HITACHI', + 'HIV', + 'HK', + 'HKT', + 'HM', + 'HN', + 'HOCKEY', + 'HOLDINGS', + 'HOLIDAY', + 'HOMEDEPOT', + 'HOMEGOODS', + 'HOMES', + 'HOMESENSE', + 'HONDA', + 'HORSE', + 'HOSPITAL', + 'HOST', + 'HOSTING', + 'HOT', + 'HOTELES', + 'HOTELS', + 'HOTMAIL', + 'HOUSE', + 'HOW', + 'HR', + 'HSBC', + 'HT', + 'HU', + 'HUGHES', + 'HYATT', + 'HYUNDAI', + 'IBM', + 'ICBC', + 'ICE', + 'ICU', + 'ID', + 'IE', + 'IEEE', + 'IFM', + 'IKANO', + 'IL', + 'IM', + 'IMAMAT', + 'IMDB', + 'IMMO', + 'IMMOBILIEN', + 'IN', + 'INC', + 'INDUSTRIES', + 'INFINITI', + 'INFO', + 'ING', + 'INK', + 'INSTITUTE', + 'INSURANCE', + 'INSURE', + 'INT', + 'INTEL', + 'INTERNATIONAL', + 'INTUIT', + 'INVESTMENTS', + 'IO', + 'IPIRANGA', + 'IQ', + 'IR', + 'IRISH', + 'IS', + 'ISMAILI', + 'IST', + 'ISTANBUL', + 'IT', + 'ITAU', + 'ITV', + 'IVECO', + 'JAGUAR', + 'JAVA', + 'JCB', + 'JCP', + 'JE', + 'JEEP', + 'JETZT', + 'JEWELRY', + 'JIO', + 'JLL', + 'JM', + 'JMP', + 'JNJ', + 'JO', + 'JOBS', + 'JOBURG', + 'JOT', + 'JOY', + 'JP', + 'JPMORGAN', + 'JPRS', + 'JUEGOS', + 'JUNIPER', + 'KAUFEN', + 'KDDI', + 'KE', + 'KERRYHOTELS', + 'KERRYLOGISTICS', + 'KERRYPROPERTIES', + 'KFH', + 'KG', + 'KH', + 'KI', + 'KIA', + 'KIM', + 'KINDER', + 'KINDLE', + 'KITCHEN', + 'KIWI', + 'KM', + 'KN', + 'KOELN', + 'KOMATSU', + 'KOSHER', + 'KP', + 'KPMG', + 'KPN', + 'KR', + 'KRD', + 'KRED', + 'KUOKGROUP', + 'KW', + 'KY', + 'KYOTO', + 'KZ', + 'LA', + 'LACAIXA', + 'LADBROKES', + 'LAMBORGHINI', + 'LAMER', + 'LANCASTER', + 'LANCIA', + 'LANCOME', + 'LAND', + 'LANDROVER', + 'LANXESS', + 'LASALLE', + 'LAT', + 'LATINO', + 'LATROBE', + 'LAW', + 'LAWYER', + 'LB', + 'LC', + 'LDS', + 'LEASE', + 'LECLERC', + 'LEFRAK', + 'LEGAL', + 'LEGO', + 'LEXUS', + 'LGBT', + 'LI', + 'LIAISON', + 'LIDL', + 'LIFE', + 'LIFEINSURANCE', + 'LIFESTYLE', + 'LIGHTING', + 'LIKE', + 'LILLY', + 'LIMITED', + 'LIMO', + 'LINCOLN', + 'LINDE', + 'LINK', + 'LIPSY', + 'LIVE', + 'LIVING', + 'LIXIL', + 'LK', + 'LLC', + 'LOAN', + 'LOANS', + 'LOCKER', + 'LOCUS', + 'LOFT', + 'LOL', + 'LONDON', + 'LOTTE', + 'LOTTO', + 'LOVE', + 'LPL', + 'LPLFINANCIAL', + 'LR', + 'LS', + 'LT', + 'LTD', + 'LTDA', + 'LU', + 'LUNDBECK', + 'LUPIN', + 'LUXE', + 'LUXURY', + 'LV', + 'LY', + 'MA', + 'MACYS', + 'MADRID', + 'MAIF', + 'MAISON', + 'MAKEUP', + 'MAN', + 'MANAGEMENT', + 'MANGO', + 'MAP', + 'MARKET', + 'MARKETING', + 'MARKETS', + 'MARRIOTT', + 'MARSHALLS', + 'MASERATI', + 'MATTEL', + 'MBA', + 'MC', + 'MCKINSEY', + 'MD', + 'ME', + 'MED', + 'MEDIA', + 'MEET', + 'MELBOURNE', + 'MEME', + 'MEMORIAL', + 
'MEN', + 'MENU', + 'MERCKMSD', + 'METLIFE', + 'MG', + 'MH', + 'MIAMI', + 'MICROSOFT', + 'MIL', + 'MINI', + 'MINT', + 'MIT', + 'MITSUBISHI', + 'MK', + 'ML', + 'MLB', + 'MLS', + 'MM', + 'MMA', + 'MN', + 'MO', + 'MOBI', + 'MOBILE', + 'MODA', + 'MOE', + 'MOI', + 'MOM', + 'MONASH', + 'MONEY', + 'MONSTER', + 'MOPAR', + 'MORMON', + 'MORTGAGE', + 'MOSCOW', + 'MOTO', + 'MOTORCYCLES', + 'MOV', + 'MOVIE', + 'MOVISTAR', + 'MP', + 'MQ', + 'MR', + 'MS', + 'MSD', + 'MT', + 'MTN', + 'MTR', + 'MU', + 'MUSEUM', + 'MUTUAL', + 'MV', + 'MW', + 'MX', + 'MY', + 'MZ', + 'NA', + 'NAB', + 'NADEX', + 'NAGOYA', + 'NAME', + 'NATIONWIDE', + 'NATURA', + 'NAVY', + 'NBA', + 'NC', + 'NE', + 'NEC', + 'NET', + 'NETBANK', + 'NETFLIX', + 'NETWORK', + 'NEUSTAR', + 'NEW', + 'NEWHOLLAND', + 'NEWS', + 'NEXT', + 'NEXTDIRECT', + 'NEXUS', + 'NF', + 'NFL', + 'NG', + 'NGO', + 'NHK', + 'NI', + 'NICO', + 'NIKE', + 'NIKON', + 'NINJA', + 'NISSAN', + 'NISSAY', + 'NL', + 'NO', + 'NOKIA', + 'NORTHWESTERNMUTUAL', + 'NORTON', + 'NOW', + 'NOWRUZ', + 'NOWTV', + 'NP', + 'NR', + 'NRA', + 'NRW', + 'NTT', + 'NU', + 'NYC', + 'NZ', + 'OBI', + 'OBSERVER', + 'OFF', + 'OFFICE', + 'OKINAWA', + 'OLAYAN', + 'OLAYANGROUP', + 'OLDNAVY', + 'OLLO', + 'OM', + 'OMEGA', + 'ONE', + 'ONG', + 'ONL', + 'ONLINE', + 'ONYOURSIDE', + 'OOO', + 'OPEN', + 'ORACLE', + 'ORANGE', + 'ORG', + 'ORGANIC', + 'ORIGINS', + 'OSAKA', + 'OTSUKA', + 'OTT', + 'OVH', + 'PA', + 'PAGE', + 'PANASONIC', + 'PARIS', + 'PARS', + 'PARTNERS', + 'PARTS', + 'PARTY', + 'PASSAGENS', + 'PAY', + 'PCCW', + 'PE', + 'PET', + 'PF', + 'PFIZER', + 'PG', + 'PH', + 'PHARMACY', + 'PHD', + 'PHILIPS', + 'PHONE', + 'PHOTO', + 'PHOTOGRAPHY', + 'PHOTOS', + 'PHYSIO', + 'PIAGET', + 'PICS', + 'PICTET', + 'PICTURES', + 'PID', + 'PIN', + 'PING', + 'PINK', + 'PIONEER', + 'PIZZA', + 'PK', + 'PL', + 'PLACE', + 'PLAY', + 'PLAYSTATION', + 'PLUMBING', + 'PLUS', + 'PM', + 'PN', + 'PNC', + 'POHL', + 'POKER', + 'POLITIE', + 'PORN', + 'POST', + 'PR', + 'PRAMERICA', + 'PRAXI', + 'PRESS', + 'PRIME', + 'PRO', + 'PROD', + 'PRODUCTIONS', + 'PROF', + 'PROGRESSIVE', + 'PROMO', + 'PROPERTIES', + 'PROPERTY', + 'PROTECTION', + 'PRU', + 'PRUDENTIAL', + 'PS', + 'PT', + 'PUB', + 'PW', + 'PWC', + 'PY', + 'QA', + 'QPON', + 'QUEBEC', + 'QUEST', + 'QVC', + 'RACING', + 'RADIO', + 'RAID', + 'RE', + 'READ', + 'REALESTATE', + 'REALTOR', + 'REALTY', + 'RECIPES', + 'RED', + 'REDSTONE', + 'REDUMBRELLA', + 'REHAB', + 'REISE', + 'REISEN', + 'REIT', + 'RELIANCE', + 'REN', + 'RENT', + 'RENTALS', + 'REPAIR', + 'REPORT', + 'REPUBLICAN', + 'REST', + 'RESTAURANT', + 'REVIEW', + 'REVIEWS', + 'REXROTH', + 'RICH', + 'RICHARDLI', + 'RICOH', + 'RIGHTATHOME', + 'RIL', + 'RIO', + 'RIP', + 'RMIT', + 'RO', + 'ROCHER', + 'ROCKS', + 'RODEO', + 'ROGERS', + 'ROOM', + 'RS', + 'RSVP', + 'RU', + 'RUGBY', + 'RUHR', + 'RUN', + 'RW', + 'RWE', + 'RYUKYU', + 'SA', + 'SAARLAND', + 'SAFE', + 'SAFETY', + 'SAKURA', + 'SALE', + 'SALON', + 'SAMSCLUB', + 'SAMSUNG', + 'SANDVIK', + 'SANDVIKCOROMANT', + 'SANOFI', + 'SAP', + 'SARL', + 'SAS', + 'SAVE', + 'SAXO', + 'SB', + 'SBI', + 'SBS', + 'SC', + 'SCA', + 'SCB', + 'SCHAEFFLER', + 'SCHMIDT', + 'SCHOLARSHIPS', + 'SCHOOL', + 'SCHULE', + 'SCHWARZ', + 'SCIENCE', + 'SCJOHNSON', + 'SCOR', + 'SCOT', + 'SD', + 'SE', + 'SEARCH', + 'SEAT', + 'SECURE', + 'SECURITY', + 'SEEK', + 'SELECT', + 'SENER', + 'SERVICES', + 'SES', + 'SEVEN', + 'SEW', + 'SEX', + 'SEXY', + 'SFR', + 'SG', + 'SH', + 'SHANGRILA', + 'SHARP', + 'SHAW', + 'SHELL', + 'SHIA', + 'SHIKSHA', + 'SHOES', + 'SHOP', + 'SHOPPING', + 'SHOUJI', + 'SHOW', + 'SHOWTIME', + 'SHRIRAM', + 'SI', + 'SILK', + 
'SINA', + 'SINGLES', + 'SITE', + 'SJ', + 'SK', + 'SKI', + 'SKIN', + 'SKY', + 'SKYPE', + 'SL', + 'SLING', + 'SM', + 'SMART', + 'SMILE', + 'SN', + 'SNCF', + 'SO', + 'SOCCER', + 'SOCIAL', + 'SOFTBANK', + 'SOFTWARE', + 'SOHU', + 'SOLAR', + 'SOLUTIONS', + 'SONG', + 'SONY', + 'SOY', + 'SPACE', + 'SPORT', + 'SPOT', + 'SPREADBETTING', + 'SR', + 'SRL', + 'SRT', + 'SS', + 'ST', + 'STADA', + 'STAPLES', + 'STAR', + 'STATEBANK', + 'STATEFARM', + 'STC', + 'STCGROUP', + 'STOCKHOLM', + 'STORAGE', + 'STORE', + 'STREAM', + 'STUDIO', + 'STUDY', + 'STYLE', + 'SU', + 'SUCKS', + 'SUPPLIES', + 'SUPPLY', + 'SUPPORT', + 'SURF', + 'SURGERY', + 'SUZUKI', + 'SV', + 'SWATCH', + 'SWIFTCOVER', + 'SWISS', + 'SX', + 'SY', + 'SYDNEY', + 'SYMANTEC', + 'SYSTEMS', + 'SZ', + 'TAB', + 'TAIPEI', + 'TALK', + 'TAOBAO', + 'TARGET', + 'TATAMOTORS', + 'TATAR', + 'TATTOO', + 'TAX', + 'TAXI', + 'TC', + 'TCI', + 'TD', + 'TDK', + 'TEAM', + 'TECH', + 'TECHNOLOGY', + 'TEL', + 'TELEFONICA', + 'TEMASEK', + 'TENNIS', + 'TEVA', + 'TF', + 'TG', + 'TH', + 'THD', + 'THEATER', + 'THEATRE', + 'TIAA', + 'TICKETS', + 'TIENDA', + 'TIFFANY', + 'TIPS', + 'TIRES', + 'TIROL', + 'TJ', + 'TJMAXX', + 'TJX', + 'TK', + 'TKMAXX', + 'TL', + 'TM', + 'TMALL', + 'TN', + 'TO', + 'TODAY', + 'TOKYO', + 'TOOLS', + 'TOP', + 'TORAY', + 'TOSHIBA', + 'TOTAL', + 'TOURS', + 'TOWN', + 'TOYOTA', + 'TOYS', + 'TR', + 'TRADE', + 'TRADING', + 'TRAINING', + 'TRAVEL', + 'TRAVELCHANNEL', + 'TRAVELERS', + 'TRAVELERSINSURANCE', + 'TRUST', + 'TRV', + 'TT', + 'TUBE', + 'TUI', + 'TUNES', + 'TUSHU', + 'TV', + 'TVS', + 'TW', + 'TZ', + 'UA', + 'UBANK', + 'UBS', + 'UCONNECT', + 'UG', + 'UK', + 'UNICOM', + 'UNIVERSITY', + 'UNO', + 'UOL', + 'UPS', + 'US', + 'UY', + 'UZ', + 'VA', + 'VACATIONS', + 'VANA', + 'VANGUARD', + 'VC', + 'VE', + 'VEGAS', + 'VENTURES', + 'VERISIGN', + 'VERSICHERUNG', + 'VET', + 'VG', + 'VI', + 'VIAJES', + 'VIDEO', + 'VIG', + 'VIKING', + 'VILLAS', + 'VIN', + 'VIP', + 'VIRGIN', + 'VISA', + 'VISION', + 'VISTAPRINT', + 'VIVA', + 'VIVO', + 'VLAANDEREN', + 'VN', + 'VODKA', + 'VOLKSWAGEN', + 'VOLVO', + 'VOTE', + 'VOTING', + 'VOTO', + 'VOYAGE', + 'VU', + 'VUELOS', + 'WALES', + 'WALMART', + 'WALTER', + 'WANG', + 'WANGGOU', + 'WARMAN', + 'WATCH', + 'WATCHES', + 'WEATHER', + 'WEATHERCHANNEL', + 'WEBCAM', + 'WEBER', + 'WEBSITE', + 'WED', + 'WEDDING', + 'WEIBO', + 'WEIR', + 'WF', + 'WHOSWHO', + 'WIEN', + 'WIKI', + 'WILLIAMHILL', + 'WIN', + 'WINDOWS', + 'WINE', + 'WINNERS', + 'WME', + 'WOLTERSKLUWER', + 'WOODSIDE', + 'WORK', + 'WORKS', + 'WORLD', + 'WOW', + 'WS', + 'WTC', + 'WTF', + 'XBOX', + 'XEROX', + 'XFINITY', + 'XIHUAN', + 'XIN', + 'XN--11B4C3D', + 'XN--1CK2E1B', + 'XN--1QQW23A', + 'XN--2SCRJ9C', + 'XN--30RR7Y', + 'XN--3BST00M', + 'XN--3DS443G', + 'XN--3E0B707E', + 'XN--3HCRJ9C', + 'XN--3OQ18VL8PN36A', + 'XN--3PXU8K', + 'XN--42C2D9A', + 'XN--45BR5CYL', + 'XN--45BRJ9C', + 'XN--45Q11C', + 'XN--4GBRIM', + 'XN--54B7FTA0CC', + 'XN--55QW42G', + 'XN--55QX5D', + 'XN--5SU34J936BGSG', + 'XN--5TZM5G', + 'XN--6FRZ82G', + 'XN--6QQ986B3XL', + 'XN--80ADXHKS', + 'XN--80AO21A', + 'XN--80AQECDR1A', + 'XN--80ASEHDB', + 'XN--80ASWG', + 'XN--8Y0A063A', + 'XN--90A3AC', + 'XN--90AE', + 'XN--90AIS', + 'XN--9DBQ2A', + 'XN--9ET52U', + 'XN--9KRT00A', + 'XN--B4W605FERD', + 'XN--BCK1B9A5DRE4C', + 'XN--C1AVG', + 'XN--C2BR7G', + 'XN--CCK2B3B', + 'XN--CG4BKI', + 'XN--CLCHC0EA0B2G2A9GCD', + 'XN--CZR694B', + 'XN--CZRS0T', + 'XN--CZRU2D', + 'XN--D1ACJ3B', + 'XN--D1ALF', + 'XN--E1A4C', + 'XN--ECKVDTC9D', + 'XN--EFVY88H', + 'XN--ESTV75G', + 'XN--FCT429K', + 'XN--FHBEI', + 'XN--FIQ228C5HS', + 'XN--FIQ64B', + 
'XN--FIQS8S', + 'XN--FIQZ9S', + 'XN--FJQ720A', + 'XN--FLW351E', + 'XN--FPCRJ9C3D', + 'XN--FZC2C9E2C', + 'XN--FZYS8D69UVGM', + 'XN--G2XX48C', + 'XN--GCKR3F0F', + 'XN--GECRJ9C', + 'XN--GK3AT1E', + 'XN--H2BREG3EVE', + 'XN--H2BRJ9C', + 'XN--H2BRJ9C8C', + 'XN--HXT814E', + 'XN--I1B6B1A6A2E', + 'XN--IMR513N', + 'XN--IO0A7I', + 'XN--J1AEF', + 'XN--J1AMH', + 'XN--J6W193G', + 'XN--JLQ61U9W7B', + 'XN--JVR189M', + 'XN--KCRX77D1X4A', + 'XN--KPRW13D', + 'XN--KPRY57D', + 'XN--KPU716F', + 'XN--KPUT3I', + 'XN--L1ACC', + 'XN--LGBBAT1AD8J', + 'XN--MGB9AWBF', + 'XN--MGBA3A3EJT', + 'XN--MGBA3A4F16A', + 'XN--MGBA7C0BBN0A', + 'XN--MGBAAKC7DVF', + 'XN--MGBAAM7A8H', + 'XN--MGBAB2BD', + 'XN--MGBAH1A3HJKRD', + 'XN--MGBAI9AZGQP6J', + 'XN--MGBAYH7GPA', + 'XN--MGBBH1A', + 'XN--MGBBH1A71E', + 'XN--MGBC0A9AZCG', + 'XN--MGBCA7DZDO', + 'XN--MGBERP4A5D4AR', + 'XN--MGBGU82A', + 'XN--MGBI4ECEXP', + 'XN--MGBPL2FH', + 'XN--MGBT3DHD', + 'XN--MGBTX2B', + 'XN--MGBX4CD0AB', + 'XN--MIX891F', + 'XN--MK1BU44C', + 'XN--MXTQ1M', + 'XN--NGBC5AZD', + 'XN--NGBE9E0A', + 'XN--NGBRX', + 'XN--NODE', + 'XN--NQV7F', + 'XN--NQV7FS00EMA', + 'XN--NYQY26A', + 'XN--O3CW4H', + 'XN--OGBPF8FL', + 'XN--OTU796D', + 'XN--P1ACF', + 'XN--P1AI', + 'XN--PBT977C', + 'XN--PGBS0DH', + 'XN--PSSY2U', + 'XN--Q9JYB4C', + 'XN--QCKA1PMC', + 'XN--QXA6A', + 'XN--QXAM', + 'XN--RHQV96G', + 'XN--ROVU88B', + 'XN--RVC1E0AM3E', + 'XN--S9BRJ9C', + 'XN--SES554G', + 'XN--T60B56A', + 'XN--TCKWE', + 'XN--TIQ49XQYJ', + 'XN--UNUP4Y', + 'XN--VERMGENSBERATER-CTB', + 'XN--VERMGENSBERATUNG-PWB', + 'XN--VHQUV', + 'XN--VUQ861B', + 'XN--W4R85EL8FHU5DNRA', + 'XN--W4RS40L', + 'XN--WGBH1C', + 'XN--WGBL6A', + 'XN--XHQ521B', + 'XN--XKC2AL3HYE2A', + 'XN--XKC2DL3A5EE0H', + 'XN--Y9A3AQ', + 'XN--YFRO4I67O', + 'XN--YGBI2AMMX', + 'XN--ZFR164B', + 'XXX', + 'XYZ', + 'YACHTS', + 'YAHOO', + 'YAMAXUN', + 'YANDEX', + 'YE', + 'YODOBASHI', + 'YOGA', + 'YOKOHAMA', + 'YOU', + 'YOUTUBE', + 'YT', + 'YUN', + 'ZA', + 'ZAPPOS', + 'ZARA', + 'ZERO', + 'ZIP', + 'ZM', + 'ZONE', + 'ZUERICH', + 'ZW' +]; + + +// Keep as upper-case to make updating from source easier + +module.exports = new Set(internals.tlds.map((tld) => tld.toLowerCase())); diff --git a/node_modules/@hapi/address/lib/uri.js b/node_modules/@hapi/address/lib/uri.js new file mode 100644 index 000000000..f9791840a --- /dev/null +++ b/node_modules/@hapi/address/lib/uri.js @@ -0,0 +1,207 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const EscapeRegex = require('@hapi/hoek/lib/escapeRegex'); + + +const internals = {}; + + +internals.generate = function () { + + const rfc3986 = {}; + + const hexDigit = '\\dA-Fa-f'; // HEXDIG = DIGIT / "A" / "B" / "C" / "D" / "E" / "F" + const hexDigitOnly = '[' + hexDigit + ']'; + + const unreserved = '\\w-\\.~'; // unreserved = ALPHA / DIGIT / "-" / "." / "_" / "~" + const subDelims = '!\\$&\'\\(\\)\\*\\+,;='; // sub-delims = "!" / "$" / "&" / "'" / "(" / ")" / "*" / "+" / "," / ";" / "=" + const pctEncoded = '%' + hexDigit; // pct-encoded = "%" HEXDIG HEXDIG + const pchar = unreserved + pctEncoded + subDelims + ':@'; // pchar = unreserved / pct-encoded / sub-delims / ":" / "@" + const pcharOnly = '[' + pchar + ']'; + const decOctect = '(?:0{0,2}\\d|0?[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5])'; // dec-octet = DIGIT / %x31-39 DIGIT / "1" 2DIGIT / "2" %x30-34 DIGIT / "25" %x30-35 ; 0-9 / 10-99 / 100-199 / 200-249 / 250-255 + + rfc3986.ipv4address = '(?:' + decOctect + '\\.){3}' + decOctect; // IPv4address = dec-octet "." dec-octet "." dec-octet "." 
dec-octet + + /* + h16 = 1*4HEXDIG ; 16 bits of address represented in hexadecimal + ls32 = ( h16 ":" h16 ) / IPv4address ; least-significant 32 bits of address + IPv6address = 6( h16 ":" ) ls32 + / "::" 5( h16 ":" ) ls32 + / [ h16 ] "::" 4( h16 ":" ) ls32 + / [ *1( h16 ":" ) h16 ] "::" 3( h16 ":" ) ls32 + / [ *2( h16 ":" ) h16 ] "::" 2( h16 ":" ) ls32 + / [ *3( h16 ":" ) h16 ] "::" h16 ":" ls32 + / [ *4( h16 ":" ) h16 ] "::" ls32 + / [ *5( h16 ":" ) h16 ] "::" h16 + / [ *6( h16 ":" ) h16 ] "::" + */ + + const h16 = hexDigitOnly + '{1,4}'; + const ls32 = '(?:' + h16 + ':' + h16 + '|' + rfc3986.ipv4address + ')'; + const IPv6SixHex = '(?:' + h16 + ':){6}' + ls32; + const IPv6FiveHex = '::(?:' + h16 + ':){5}' + ls32; + const IPv6FourHex = '(?:' + h16 + ')?::(?:' + h16 + ':){4}' + ls32; + const IPv6ThreeHex = '(?:(?:' + h16 + ':){0,1}' + h16 + ')?::(?:' + h16 + ':){3}' + ls32; + const IPv6TwoHex = '(?:(?:' + h16 + ':){0,2}' + h16 + ')?::(?:' + h16 + ':){2}' + ls32; + const IPv6OneHex = '(?:(?:' + h16 + ':){0,3}' + h16 + ')?::' + h16 + ':' + ls32; + const IPv6NoneHex = '(?:(?:' + h16 + ':){0,4}' + h16 + ')?::' + ls32; + const IPv6NoneHex2 = '(?:(?:' + h16 + ':){0,5}' + h16 + ')?::' + h16; + const IPv6NoneHex3 = '(?:(?:' + h16 + ':){0,6}' + h16 + ')?::'; + + rfc3986.ipv4Cidr = '(?:\\d|[1-2]\\d|3[0-2])'; // IPv4 cidr = DIGIT / %x31-32 DIGIT / "3" %x30-32 ; 0-9 / 10-29 / 30-32 + rfc3986.ipv6Cidr = '(?:0{0,2}\\d|0?[1-9]\\d|1[01]\\d|12[0-8])'; // IPv6 cidr = DIGIT / %x31-39 DIGIT / "1" %x0-1 DIGIT / "12" %x0-8; 0-9 / 10-99 / 100-119 / 120-128 + rfc3986.ipv6address = '(?:' + IPv6SixHex + '|' + IPv6FiveHex + '|' + IPv6FourHex + '|' + IPv6ThreeHex + '|' + IPv6TwoHex + '|' + IPv6OneHex + '|' + IPv6NoneHex + '|' + IPv6NoneHex2 + '|' + IPv6NoneHex3 + ')'; + rfc3986.ipvFuture = 'v' + hexDigitOnly + '+\\.[' + unreserved + subDelims + ':]+'; // IPvFuture = "v" 1*HEXDIG "." 1*( unreserved / sub-delims / ":" ) + + rfc3986.scheme = '[a-zA-Z][a-zA-Z\\d+-\\.]*'; // scheme = ALPHA *( ALPHA / DIGIT / "+" / "-" / "." ) + rfc3986.schemeRegex = new RegExp(rfc3986.scheme); + + const userinfo = '[' + unreserved + pctEncoded + subDelims + ':]*'; // userinfo = *( unreserved / pct-encoded / sub-delims / ":" ) + const IPLiteral = '\\[(?:' + rfc3986.ipv6address + '|' + rfc3986.ipvFuture + ')\\]'; // IP-literal = "[" ( IPv6address / IPvFuture ) "]" + const regName = '[' + unreserved + pctEncoded + subDelims + ']{1,255}'; // reg-name = *( unreserved / pct-encoded / sub-delims ) + const host = '(?:' + IPLiteral + '|' + rfc3986.ipv4address + '|' + regName + ')'; // host = IP-literal / IPv4address / reg-name + const port = '\\d*'; // port = *DIGIT + const authority = '(?:' + userinfo + '@)?' 
+ host + '(?::' + port + ')?'; // authority = [ userinfo "@" ] host [ ":" port ] + const authorityCapture = '(?:' + userinfo + '@)?(' + host + ')(?::' + port + ')?'; + + /* + segment = *pchar + segment-nz = 1*pchar + path = path-abempty ; begins with "/" '|' is empty + / path-absolute ; begins with "/" but not "//" + / path-noscheme ; begins with a non-colon segment + / path-rootless ; begins with a segment + / path-empty ; zero characters + path-abempty = *( "/" segment ) + path-absolute = "/" [ segment-nz *( "/" segment ) ] + path-rootless = segment-nz *( "/" segment ) + */ + + const segment = pcharOnly + '*'; + const segmentNz = pcharOnly + '+'; + const segmentNzNc = '[' + unreserved + pctEncoded + subDelims + '@' + ']+'; + const pathEmpty = ''; + const pathAbEmpty = '(?:\\/' + segment + ')*'; + const pathAbsolute = '\\/(?:' + segmentNz + pathAbEmpty + ')?'; + const pathRootless = segmentNz + pathAbEmpty; + const pathNoScheme = segmentNzNc + pathAbEmpty; + const pathAbNoAuthority = '(?:\\/\\/\\/' + segment + pathAbEmpty + ')'; // Used by file:/// + + // hier-part = "//" authority path + + rfc3986.hierPart = '(?:' + '(?:\\/\\/' + authority + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathRootless + '|' + pathAbNoAuthority + ')'; + rfc3986.hierPartCapture = '(?:' + '(?:\\/\\/' + authorityCapture + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathRootless + ')'; + + // relative-part = "//" authority path-abempty / path-absolute / path-noscheme / path-empty + + rfc3986.relativeRef = '(?:' + '(?:\\/\\/' + authority + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathNoScheme + '|' + pathEmpty + ')'; + rfc3986.relativeRefCapture = '(?:' + '(?:\\/\\/' + authorityCapture + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathNoScheme + '|' + pathEmpty + ')'; + + // query = *( pchar / "/" / "?" ) + // query = *( pchar / "[" / "]" / "/" / "?" ) + + rfc3986.query = '[' + pchar + '\\/\\?]*(?=#|$)'; //Finish matching either at the fragment part '|' end of the line. + rfc3986.queryWithSquareBrackets = '[' + pchar + '\\[\\]\\/\\?]*(?=#|$)'; + + // fragment = *( pchar / "/" / "?" ) + + rfc3986.fragment = '[' + pchar + '\\/\\?]*'; + + return rfc3986; +}; + +internals.rfc3986 = internals.generate(); + + +exports.ip = { + v4Cidr: internals.rfc3986.ipv4Cidr, + v6Cidr: internals.rfc3986.ipv6Cidr, + ipv4: internals.rfc3986.ipv4address, + ipv6: internals.rfc3986.ipv6address, + ipvfuture: internals.rfc3986.ipvFuture +}; + + +internals.createRegex = function (options) { + + const rfc = internals.rfc3986; + + // Construct expression + + const query = options.allowQuerySquareBrackets ? rfc.queryWithSquareBrackets : rfc.query; + const suffix = '(?:\\?' + query + ')?' + '(?:#' + rfc.fragment + ')?'; + + // relative-ref = relative-part [ "?" query ] [ "#" fragment ] + + const relative = options.domain ? 
rfc.relativeRefCapture : rfc.relativeRef; + + if (options.relativeOnly) { + return internals.wrap(relative + suffix); + } + + // Custom schemes + + let customScheme = ''; + if (options.scheme) { + Assert(options.scheme instanceof RegExp || typeof options.scheme === 'string' || Array.isArray(options.scheme), 'scheme must be a RegExp, String, or Array'); + + const schemes = [].concat(options.scheme); + Assert(schemes.length >= 1, 'scheme must have at least 1 scheme specified'); + + // Flatten the array into a string to be used to match the schemes + + const selections = []; + for (let i = 0; i < schemes.length; ++i) { + const scheme = schemes[i]; + Assert(scheme instanceof RegExp || typeof scheme === 'string', 'scheme at position ' + i + ' must be a RegExp or String'); + + if (scheme instanceof RegExp) { + selections.push(scheme.source.toString()); + } + else { + Assert(rfc.schemeRegex.test(scheme), 'scheme at position ' + i + ' must be a valid scheme'); + selections.push(EscapeRegex(scheme)); + } + } + + customScheme = selections.join('|'); + } + + // URI = scheme ":" hier-part [ "?" query ] [ "#" fragment ] + + const scheme = customScheme ? '(?:' + customScheme + ')' : rfc.scheme; + const absolute = '(?:' + scheme + ':' + (options.domain ? rfc.hierPartCapture : rfc.hierPart) + ')'; + const prefix = options.allowRelative ? '(?:' + absolute + '|' + relative + ')' : absolute; + return internals.wrap(prefix + suffix, customScheme); +}; + + +internals.wrap = function (raw, scheme) { + + raw = `(?=.)(?!https?\:/$)${raw}`; // Require at least one character and explicitly forbid 'http:/' + + return { + raw, + regex: new RegExp(`^${raw}$`), + scheme + }; +}; + + +internals.uriRegex = internals.createRegex({}); + + +exports.regex = function (options = {}) { + + if (options.scheme || + options.allowRelative || + options.relativeOnly || + options.allowQuerySquareBrackets || + options.domain) { + + return internals.createRegex(options); + } + + return internals.uriRegex; +}; diff --git a/node_modules/@hapi/address/package.json b/node_modules/@hapi/address/package.json new file mode 100644 index 000000000..e479012aa --- /dev/null +++ b/node_modules/@hapi/address/package.json @@ -0,0 +1,62 @@ +{ + "_from": "@hapi/address@^4.0.1", + "_id": "@hapi/address@4.1.0", + "_inBundle": false, + "_integrity": "sha512-SkszZf13HVgGmChdHo/PxchnSaCJ6cetVqLzyciudzZRT0jcOouIF/Q93mgjw8cce+D+4F4C1Z/WrfFN+O3VHQ==", + "_location": "/@hapi/address", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@hapi/address@^4.0.1", + "name": "@hapi/address", + "escapedName": "@hapi%2faddress", + "scope": "@hapi", + "rawSpec": "^4.0.1", + "saveSpec": null, + "fetchSpec": "^4.0.1" + }, + "_requiredBy": [ + "/@hapi/joi" + ], + "_resolved": "https://registry.npmjs.org/@hapi/address/-/address-4.1.0.tgz", + "_shasum": "d60c5c0d930e77456fdcde2598e77302e2955e1d", + "_spec": "@hapi/address@^4.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\@hapi\\joi", + "bugs": { + "url": "https://github.com/hapijs/address/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@hapi/hoek": "^9.0.0" + }, + "deprecated": "Moved to 'npm install @sideway/address'", + "description": "Email address and domain validation", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/lab": "22.x.x" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/hapijs/address#readme", + "keywords": [ + "email", + "domain", + "address", + "validation" + ], + 
"license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@hapi/address", + "repository": { + "type": "git", + "url": "git://github.com/hapijs/address.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "4.1.0" +} diff --git a/node_modules/@hapi/formula/CHANGELOG.md b/node_modules/@hapi/formula/CHANGELOG.md new file mode 100644 index 000000000..6ad29cebe --- /dev/null +++ b/node_modules/@hapi/formula/CHANGELOG.md @@ -0,0 +1,3 @@ +Breaking changes are documented using GitHub issues, see [issues labeled "release notes"](https://github.com/hapijs/formula/issues?q=is%3Aissue+label%3A%22release+notes%22). + +If you want changes of a specific minor or patch release, you can browse the [GitHub milestones](https://github.com/hapijs/formula/milestones?state=closed&direction=asc&sort=due_date). diff --git a/node_modules/@hapi/formula/LICENSE.md b/node_modules/@hapi/formula/LICENSE.md new file mode 100644 index 000000000..d8a7c565d --- /dev/null +++ b/node_modules/@hapi/formula/LICENSE.md @@ -0,0 +1,9 @@ +Copyright (c) 2019, Project contributors +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + * The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@hapi/formula/README.md b/node_modules/@hapi/formula/README.md new file mode 100644 index 000000000..d18d3c922 --- /dev/null +++ b/node_modules/@hapi/formula/README.md @@ -0,0 +1,20 @@ + + +# @hapi/formula + +#### Math and string formula parser. + +**formula** is part of the **hapi** ecosystem and was designed to work seamlessly with the [hapi web framework](https://hapi.dev) and its other components (but works great on its own or with other frameworks). If you are using a different web framework and find this module useful, check out [hapi](https://hapi.dev) they work even better together. 
+ +### Visit the [hapi.dev](https://hapi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://hapi.dev/family/formula/) +- [Version status](https://hapi.dev/resources/status/#formula) (builds, dependencies, node versions, licenses, eol) +- [Project policies](https://hapi.dev/policies/) +- [Free and commercial support options](https://hapi.dev/support/) + +## Acknowledgements + +Inspired by [**fparse**](https://github.com/bylexus/fparse), copyright 2012-2018 Alexander Schenkel diff --git a/node_modules/@hapi/formula/lib/index.d.ts b/node_modules/@hapi/formula/lib/index.d.ts new file mode 100644 index 000000000..d78cc1d47 --- /dev/null +++ b/node_modules/@hapi/formula/lib/index.d.ts @@ -0,0 +1,52 @@ +/** + * Formula parser + */ +export class Parser { + + /** + * Create a new formula parser. + * + * @param formula - the formula string to parse. + * @param options - optional settings. + */ + constructor(formula: string, options?: Options); + + /** + * Evaluate the formula. + * + * @param context - optional object with runtime formula context used to resolve variables. + * + * @returns the string or number outcome of the resolved formula. + */ + evaluate(context?: any): T; +} + + +export interface Options { + + /** + * A hash of key - value pairs used to convert constants to values. + */ + readonly constants?: Record; + + /** + * A regular expression used to validate token variables. + */ + readonly tokenRx?: RegExp; + + /** + * A variable resolver factory function. + */ + readonly reference?: Options.Reference; + + /** + * A hash of key-value pairs used to resolve formula functions. + */ + readonly functions?: Record; +} + + +export namespace Options { + + type Reference = (name: string) => (context: any) => any; +} diff --git a/node_modules/@hapi/formula/lib/index.js b/node_modules/@hapi/formula/lib/index.js new file mode 100644 index 000000000..871ab1bbb --- /dev/null +++ b/node_modules/@hapi/formula/lib/index.js @@ -0,0 +1,456 @@ +'use strict'; + +const internals = { + operators: ['!', '^', '*', '/', '%', '+', '-', '<', '<=', '>', '>=', '==', '!=', '&&', '||', '??'], + operatorCharacters: ['!', '^', '*', '/', '%', '+', '-', '<', '=', '>', '&', '|', '?'], + operatorsOrder: [['^'], ['*', '/', '%'], ['+', '-'], ['<', '<=', '>', '>='], ['==', '!='], ['&&'], ['||', '??']], + operatorsPrefix: ['!', 'n'], + + literals: { + '"': '"', + '`': '`', + '\'': '\'', + '[': ']' + }, + + numberRx: /^(?:[0-9]*\.?[0-9]*){1}$/, + tokenRx: /^[\w\$\#\.\@\:\{\}]+$/, + + symbol: Symbol('formula'), + settings: Symbol('settings') +}; + + +exports.Parser = class { + + constructor(string, options = {}) { + + if (!options[internals.settings] && + options.constants) { + + for (const constant in options.constants) { + const value = options.constants[constant]; + if (value !== null && + !['boolean', 'number', 'string'].includes(typeof value)) { + + throw new Error(`Formula constant ${constant} contains invalid ${typeof value} value type`); + } + } + } + + this.settings = options[internals.settings] ? options : Object.assign({ [internals.settings]: true, constants: {}, functions: {} }, options); + this.single = null; + + this._parts = null; + this._parse(string); + } + + _parse(string) { + + let parts = []; + let current = ''; + let parenthesis = 0; + let literal = false; + + const flush = (inner) => { + + if (parenthesis) { + throw new Error('Formula missing closing parenthesis'); + } + + const last = parts.length ? 
parts[parts.length - 1] : null; + + if (!literal && + !current && + !inner) { + + return; + } + + if (last && + last.type === 'reference' && + inner === ')') { // Function + + last.type = 'function'; + last.value = this._subFormula(current, last.value); + current = ''; + return; + } + + if (inner === ')') { // Segment + const sub = new exports.Parser(current, this.settings); + parts.push({ type: 'segment', value: sub }); + } + else if (literal) { + if (literal === ']') { // Reference + parts.push({ type: 'reference', value: current }); + current = ''; + return; + } + + parts.push({ type: 'literal', value: current }); // Literal + } + else if (internals.operatorCharacters.includes(current)) { // Operator + if (last && + last.type === 'operator' && + internals.operators.includes(last.value + current)) { // 2 characters operator + + last.value += current; + } + else { + parts.push({ type: 'operator', value: current }); + } + } + else if (current.match(internals.numberRx)) { // Number + parts.push({ type: 'constant', value: parseFloat(current) }); + } + else if (this.settings.constants[current] !== undefined) { // Constant + parts.push({ type: 'constant', value: this.settings.constants[current] }); + } + else { // Reference + if (!current.match(internals.tokenRx)) { + throw new Error(`Formula contains invalid token: ${current}`); + } + + parts.push({ type: 'reference', value: current }); + } + + current = ''; + }; + + for (const c of string) { + if (literal) { + if (c === literal) { + flush(); + literal = false; + } + else { + current += c; + } + } + else if (parenthesis) { + if (c === '(') { + current += c; + ++parenthesis; + } + else if (c === ')') { + --parenthesis; + if (!parenthesis) { + flush(c); + } + else { + current += c; + } + } + else { + current += c; + } + } + else if (c in internals.literals) { + literal = internals.literals[c]; + } + else if (c === '(') { + flush(); + ++parenthesis; + } + else if (internals.operatorCharacters.includes(c)) { + flush(); + current = c; + flush(); + } + else if (c !== ' ') { + current += c; + } + else { + flush(); + } + } + + flush(); + + // Replace prefix - to internal negative operator + + parts = parts.map((part, i) => { + + if (part.type !== 'operator' || + part.value !== '-' || + i && parts[i - 1].type !== 'operator') { + + return part; + } + + return { type: 'operator', value: 'n' }; + }); + + // Validate tokens order + + let operator = false; + for (const part of parts) { + if (part.type === 'operator') { + if (internals.operatorsPrefix.includes(part.value)) { + continue; + } + + if (!operator) { + throw new Error('Formula contains an operator in invalid position'); + } + + if (!internals.operators.includes(part.value)) { + throw new Error(`Formula contains an unknown operator ${part.value}`); + } + } + else if (operator) { + throw new Error('Formula missing expected operator'); + } + + operator = !operator; + } + + if (!operator) { + throw new Error('Formula contains invalid trailing operator'); + } + + // Identify single part + + if (parts.length === 1 && + ['reference', 'literal', 'constant'].includes(parts[0].type)) { + + this.single = { type: parts[0].type === 'reference' ? 'reference' : 'value', value: parts[0].value }; + } + + // Process parts + + this._parts = parts.map((part) => { + + // Operators + + if (part.type === 'operator') { + return internals.operatorsPrefix.includes(part.value) ? 
part : part.value; + } + + // Literals, constants, segments + + if (part.type !== 'reference') { + return part.value; + } + + // References + + if (this.settings.tokenRx && + !this.settings.tokenRx.test(part.value)) { + + throw new Error(`Formula contains invalid reference ${part.value}`); + } + + if (this.settings.reference) { + return this.settings.reference(part.value); + } + + return internals.reference(part.value); + }); + } + + _subFormula(string, name) { + + const method = this.settings.functions[name]; + if (typeof method !== 'function') { + throw new Error(`Formula contains unknown function ${name}`); + } + + let args = []; + if (string) { + let current = ''; + let parenthesis = 0; + let literal = false; + + const flush = () => { + + if (!current) { + throw new Error(`Formula contains function ${name} with invalid arguments ${string}`); + } + + args.push(current); + current = ''; + }; + + for (let i = 0; i < string.length; ++i) { + const c = string[i]; + if (literal) { + current += c; + if (c === literal) { + literal = false; + } + } + else if (c in internals.literals && + !parenthesis) { + + current += c; + literal = internals.literals[c]; + } + else if (c === ',' && + !parenthesis) { + + flush(); + } + else { + current += c; + if (c === '(') { + ++parenthesis; + } + else if (c === ')') { + --parenthesis; + } + } + } + + flush(); + } + + args = args.map((arg) => new exports.Parser(arg, this.settings)); + + return function (context) { + + const innerValues = []; + for (const arg of args) { + innerValues.push(arg.evaluate(context)); + } + + return method.call(context, ...innerValues); + }; + } + + evaluate(context) { + + const parts = this._parts.slice(); + + // Prefix operators + + for (let i = parts.length - 2; i >= 0; --i) { + const part = parts[i]; + if (part && + part.type === 'operator') { + + const current = parts[i + 1]; + parts.splice(i + 1, 1); + const value = internals.evaluate(current, context); + parts[i] = internals.single(part.value, value); + } + } + + // Left-right operators + + internals.operatorsOrder.forEach((set) => { + + for (let i = 1; i < parts.length - 1;) { + if (set.includes(parts[i])) { + const operator = parts[i]; + const left = internals.evaluate(parts[i - 1], context); + const right = internals.evaluate(parts[i + 1], context); + + parts.splice(i, 2); + const result = internals.calculate(operator, left, right); + parts[i - 1] = result === 0 ? 0 : result; // Convert -0 + } + else { + i += 2; + } + } + }); + + return internals.evaluate(parts[0], context); + } +}; + + +exports.Parser.prototype[internals.symbol] = true; + + +internals.reference = function (name) { + + return function (context) { + + return context && context[name] !== undefined ? context[name] : null; + }; +}; + + +internals.evaluate = function (part, context) { + + if (part === null) { + return null; + } + + if (typeof part === 'function') { + return part(context); + } + + if (part[internals.symbol]) { + return part.evaluate(context); + } + + return part; +}; + + +internals.single = function (operator, value) { + + if (operator === '!') { + return value ? false : true; + } + + // operator === 'n' + + const negative = -value; + if (negative === 0) { // Override -0 + return 0; + } + + return negative; +}; + + +internals.calculate = function (operator, left, right) { + + if (operator === '??') { + return internals.exists(left) ? left : right; + } + + if (typeof left === 'string' || + typeof right === 'string') { + + if (operator === '+') { + left = internals.exists(left) ? 
left : ''; + right = internals.exists(right) ? right : ''; + return left + right; + } + } + else { + switch (operator) { + case '^': return Math.pow(left, right); + case '*': return left * right; + case '/': return left / right; + case '%': return left % right; + case '+': return left + right; + case '-': return left - right; + } + } + + switch (operator) { + case '<': return left < right; + case '<=': return left <= right; + case '>': return left > right; + case '>=': return left >= right; + case '==': return left === right; + case '!=': return left !== right; + case '&&': return left && right; + case '||': return left || right; + } + + return null; +}; + + +internals.exists = function (value) { + + return value !== null && value !== undefined; +}; diff --git a/node_modules/@hapi/formula/package.json b/node_modules/@hapi/formula/package.json new file mode 100644 index 000000000..d8de84063 --- /dev/null +++ b/node_modules/@hapi/formula/package.json @@ -0,0 +1,60 @@ +{ + "_from": "@hapi/formula@^2.0.0", + "_id": "@hapi/formula@2.0.0", + "_inBundle": false, + "_integrity": "sha512-V87P8fv7PI0LH7LiVi8Lkf3x+KCO7pQozXRssAHNXXL9L1K+uyu4XypLXwxqVDKgyQai6qj3/KteNlrqDx4W5A==", + "_location": "/@hapi/formula", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@hapi/formula@^2.0.0", + "name": "@hapi/formula", + "escapedName": "@hapi%2fformula", + "scope": "@hapi", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/@hapi/joi" + ], + "_resolved": "https://registry.npmjs.org/@hapi/formula/-/formula-2.0.0.tgz", + "_shasum": "edade0619ed58c8e4f164f233cda70211e787128", + "_spec": "@hapi/formula@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\@hapi\\joi", + "bugs": { + "url": "https://github.com/hapijs/formula/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": "Moved to 'npm install @sideway/formula'", + "description": "Math and string formula parser.", + "devDependencies": { + "@hapi/code": "6.x.x", + "@hapi/lab": "20.x.x" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/hapijs/formula#readme", + "keywords": [ + "formula", + "parser", + "math", + "string" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@hapi/formula", + "repository": { + "type": "git", + "url": "git://github.com/hapijs/formula.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "2.0.0" +} diff --git a/node_modules/@hapi/hoek/LICENSE.md b/node_modules/@hapi/hoek/LICENSE.md new file mode 100644 index 000000000..0c95a4c7b --- /dev/null +++ b/node_modules/@hapi/hoek/LICENSE.md @@ -0,0 +1,12 @@ +Copyright (c) 2011-2020, Sideway Inc, and project contributors +Copyright (c) 2011-2014, Walmart +Copyright (c) 2011, Yahoo Inc. + +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: +* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
+* The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS OFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@hapi/hoek/README.md b/node_modules/@hapi/hoek/README.md new file mode 100644 index 000000000..8771dbfdd --- /dev/null +++ b/node_modules/@hapi/hoek/README.md @@ -0,0 +1,19 @@ + + +# @hapi/hoek + +#### Utility methods for the hapi ecosystem. + +**hoek** is part of the **hapi** ecosystem and was designed to work seamlessly with the [hapi web framework](https://hapi.dev) and its other components (but works great on its own or with other frameworks). If you are using a different web framework and find this module useful, check out [hapi](https://hapi.dev) – they work even better together. + +This module is not intended to solve every problem for everyone, but rather as a central place to store hapi-specific methods. If you're looking for a general purpose utility module, check out [lodash](https://github.com/lodash/lodash). + +### Visit the [hapi.dev](https://hapi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://hapi.dev/family/hoek/) +- [Version status](https://hapi.dev/resources/status/#hoek) (builds, dependencies, node versions, licenses, eol) +- [Changelog](https://hapi.dev/family/hoek/changelog/) +- [Project policies](https://hapi.dev/policies/) +- [Free and commercial support options](https://hapi.dev/support/) diff --git a/node_modules/@hapi/hoek/lib/applyToDefaults.js b/node_modules/@hapi/hoek/lib/applyToDefaults.js new file mode 100644 index 000000000..9881247b9 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/applyToDefaults.js @@ -0,0 +1,102 @@ +'use strict'; + +const Assert = require('./assert'); +const Clone = require('./clone'); +const Merge = require('./merge'); +const Reach = require('./reach'); + + +const internals = {}; + + +module.exports = function (defaults, source, options = {}) { + + Assert(defaults && typeof defaults === 'object', 'Invalid defaults value: must be an object'); + Assert(!source || source === true || typeof source === 'object', 'Invalid source value: must be true, falsy or an object'); + Assert(typeof options === 'object', 'Invalid options: must be an object'); + + if (!source) { // If no source, return null + return null; + } + + if (options.shallow) { + return internals.applyToDefaultsWithShallow(defaults, source, options); + } + + const copy = Clone(defaults); + + if (source === true) { // If source is set to true, use defaults + return copy; + } + + const nullOverride = options.nullOverride !== undefined ? 
options.nullOverride : false; + return Merge(copy, source, { nullOverride, mergeArrays: false }); +}; + + +internals.applyToDefaultsWithShallow = function (defaults, source, options) { + + const keys = options.shallow; + Assert(Array.isArray(keys), 'Invalid keys'); + + const seen = new Map(); + const merge = source === true ? null : new Set(); + + for (let key of keys) { + key = Array.isArray(key) ? key : key.split('.'); // Pre-split optimization + + const ref = Reach(defaults, key); + if (ref && + typeof ref === 'object') { + + seen.set(ref, merge && Reach(source, key) || ref); + } + else if (merge) { + merge.add(key); + } + } + + const copy = Clone(defaults, {}, seen); + + if (!merge) { + return copy; + } + + for (const key of merge) { + internals.reachCopy(copy, source, key); + } + + const nullOverride = options.nullOverride !== undefined ? options.nullOverride : false; + return Merge(copy, source, { nullOverride, mergeArrays: false }); +}; + + +internals.reachCopy = function (dst, src, path) { + + for (const segment of path) { + if (!(segment in src)) { + return; + } + + const val = src[segment]; + + if (typeof val !== 'object' || val === null) { + return; + } + + src = val; + } + + const value = src; + let ref = dst; + for (let i = 0; i < path.length - 1; ++i) { + const segment = path[i]; + if (typeof ref[segment] !== 'object') { + ref[segment] = {}; + } + + ref = ref[segment]; + } + + ref[path[path.length - 1]] = value; +}; diff --git a/node_modules/@hapi/hoek/lib/assert.js b/node_modules/@hapi/hoek/lib/assert.js new file mode 100644 index 000000000..6a11e9336 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/assert.js @@ -0,0 +1,21 @@ +'use strict'; + +const AssertError = require('./error'); + +const internals = {}; + + +module.exports = function (condition, ...args) { + + if (condition) { + return; + } + + if (args.length === 1 && + args[0] instanceof Error) { + + throw args[0]; + } + + throw new AssertError(args); +}; diff --git a/node_modules/@hapi/hoek/lib/bench.js b/node_modules/@hapi/hoek/lib/bench.js new file mode 100644 index 000000000..26ee19624 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/bench.js @@ -0,0 +1,29 @@ +'use strict'; + +const internals = {}; + + +module.exports = internals.Bench = class { + + constructor() { + + this.ts = 0; + this.reset(); + } + + reset() { + + this.ts = internals.Bench.now(); + } + + elapsed() { + + return internals.Bench.now() - this.ts; + } + + static now() { + + const ts = process.hrtime(); + return (ts[0] * 1e3) + (ts[1] / 1e6); + } +}; diff --git a/node_modules/@hapi/hoek/lib/block.js b/node_modules/@hapi/hoek/lib/block.js new file mode 100644 index 000000000..73fb9a537 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/block.js @@ -0,0 +1,12 @@ +'use strict'; + +const Ignore = require('./ignore'); + + +const internals = {}; + + +module.exports = function () { + + return new Promise(Ignore); +}; diff --git a/node_modules/@hapi/hoek/lib/clone.js b/node_modules/@hapi/hoek/lib/clone.js new file mode 100644 index 000000000..e64defb86 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/clone.js @@ -0,0 +1,176 @@ +'use strict'; + +const Reach = require('./reach'); +const Types = require('./types'); +const Utils = require('./utils'); + + +const internals = { + needsProtoHack: new Set([Types.set, Types.map, Types.weakSet, Types.weakMap]) +}; + + +module.exports = internals.clone = function (obj, options = {}, _seen = null) { + + if (typeof obj !== 'object' || + obj === null) { + + return obj; + } + + let clone = internals.clone; + let seen = _seen; + 
+ if (options.shallow) { + if (options.shallow !== true) { + return internals.cloneWithShallow(obj, options); + } + + clone = (value) => value; + } + else if (seen) { + const lookup = seen.get(obj); + if (lookup) { + return lookup; + } + } + else { + seen = new Map(); + } + + // Built-in object types + + const baseProto = Types.getInternalProto(obj); + if (baseProto === Types.buffer) { + return Buffer && Buffer.from(obj); // $lab:coverage:ignore$ + } + + if (baseProto === Types.date) { + return new Date(obj.getTime()); + } + + if (baseProto === Types.regex) { + return new RegExp(obj); + } + + // Generic objects + + const newObj = internals.base(obj, baseProto, options); + if (newObj === obj) { + return obj; + } + + if (seen) { + seen.set(obj, newObj); // Set seen, since obj could recurse + } + + if (baseProto === Types.set) { + for (const value of obj) { + newObj.add(clone(value, options, seen)); + } + } + else if (baseProto === Types.map) { + for (const [key, value] of obj) { + newObj.set(key, clone(value, options, seen)); + } + } + + const keys = Utils.keys(obj, options); + for (const key of keys) { + if (key === '__proto__') { + continue; + } + + if (baseProto === Types.array && + key === 'length') { + + newObj.length = obj.length; + continue; + } + + const descriptor = Object.getOwnPropertyDescriptor(obj, key); + if (descriptor) { + if (descriptor.get || + descriptor.set) { + + Object.defineProperty(newObj, key, descriptor); + } + else if (descriptor.enumerable) { + newObj[key] = clone(obj[key], options, seen); + } + else { + Object.defineProperty(newObj, key, { enumerable: false, writable: true, configurable: true, value: clone(obj[key], options, seen) }); + } + } + else { + Object.defineProperty(newObj, key, { + enumerable: true, + writable: true, + configurable: true, + value: clone(obj[key], options, seen) + }); + } + } + + return newObj; +}; + + +internals.cloneWithShallow = function (source, options) { + + const keys = options.shallow; + options = Object.assign({}, options); + options.shallow = false; + + const seen = new Map(); + + for (const key of keys) { + const ref = Reach(source, key); + if (typeof ref === 'object' || + typeof ref === 'function') { + + seen.set(ref, ref); + } + } + + return internals.clone(source, options, seen); +}; + + +internals.base = function (obj, baseProto, options) { + + if (options.prototype === false) { // Defaults to true + if (internals.needsProtoHack.has(baseProto)) { + return new baseProto.constructor(); + } + + return baseProto === Types.array ? 
[] : {}; + } + + const proto = Object.getPrototypeOf(obj); + if (proto && + proto.isImmutable) { + + return obj; + } + + if (baseProto === Types.array) { + const newObj = []; + if (proto !== baseProto) { + Object.setPrototypeOf(newObj, proto); + } + + return newObj; + } + + if (internals.needsProtoHack.has(baseProto)) { + const newObj = new proto.constructor(); + if (proto !== baseProto) { + Object.setPrototypeOf(newObj, proto); + } + + return newObj; + } + + return Object.create(proto); +}; diff --git a/node_modules/@hapi/hoek/lib/contain.js b/node_modules/@hapi/hoek/lib/contain.js new file mode 100644 index 000000000..162ea3e83 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/contain.js @@ -0,0 +1,307 @@ +'use strict'; + +const Assert = require('./assert'); +const DeepEqual = require('./deepEqual'); +const EscapeRegex = require('./escapeRegex'); +const Utils = require('./utils'); + + +const internals = {}; + + +module.exports = function (ref, values, options = {}) { // options: { deep, once, only, part, symbols } + + /* + string -> string(s) + array -> item(s) + object -> key(s) + object -> object (key:value) + */ + + if (typeof values !== 'object') { + values = [values]; + } + + Assert(!Array.isArray(values) || values.length, 'Values array cannot be empty'); + + // String + + if (typeof ref === 'string') { + return internals.string(ref, values, options); + } + + // Array + + if (Array.isArray(ref)) { + return internals.array(ref, values, options); + } + + // Object + + Assert(typeof ref === 'object', 'Reference must be string or an object'); + return internals.object(ref, values, options); +}; + + +internals.array = function (ref, values, options) { + + if (!Array.isArray(values)) { + values = [values]; + } + + if (!ref.length) { + return false; + } + + if (options.only && + options.once && + ref.length !== values.length) { + + return false; + } + + let compare; + + // Map values + + const map = new Map(); + for (const value of values) { + if (!options.deep || + !value || + typeof value !== 'object') { + + const existing = map.get(value); + if (existing) { + ++existing.allowed; + } + else { + map.set(value, { allowed: 1, hits: 0 }); + } + } + else { + compare = compare || internals.compare(options); + + let found = false; + for (const [key, existing] of map.entries()) { + if (compare(key, value)) { + ++existing.allowed; + found = true; + break; + } + } + + if (!found) { + map.set(value, { allowed: 1, hits: 0 }); + } + } + } + + // Lookup values + + let hits = 0; + for (const item of ref) { + let match; + if (!options.deep || + !item || + typeof item !== 'object') { + + match = map.get(item); + } + else { + compare = compare || internals.compare(options); + + for (const [key, existing] of map.entries()) { + if (compare(key, item)) { + match = existing; + break; + } + } + } + + if (match) { + ++match.hits; + ++hits; + + if (options.once && + match.hits > match.allowed) { + + return false; + } + } + } + + // Validate results + + if (options.only && + hits !== ref.length) { + + return false; + } + + for (const match of map.values()) { + if (match.hits === match.allowed) { + continue; + } + + if (match.hits < match.allowed && + !options.part) { + + return false; + } + } + + return !!hits; +}; + + +internals.object = function (ref, values, options) { + + Assert(options.once === undefined, 'Cannot use option once with object'); + + const keys = Utils.keys(ref, options); + if (!keys.length) { + return false; + } + + // Keys list + + if (Array.isArray(values)) { + return internals.array(keys, 
values, options); + } + + // Key value pairs + + const symbols = Object.getOwnPropertySymbols(values).filter((sym) => values.propertyIsEnumerable(sym)); + const targets = [...Object.keys(values), ...symbols]; + + const compare = internals.compare(options); + const set = new Set(targets); + + for (const key of keys) { + if (!set.has(key)) { + if (options.only) { + return false; + } + + continue; + } + + if (!compare(values[key], ref[key])) { + return false; + } + + set.delete(key); + } + + if (set.size) { + return options.part ? set.size < targets.length : false; + } + + return true; +}; + + +internals.string = function (ref, values, options) { + + // Empty string + + if (ref === '') { + return values.length === 1 && values[0] === '' || // '' contains '' + !options.once && !values.some((v) => v !== ''); // '' contains multiple '' if !once + } + + // Map values + + const map = new Map(); + const patterns = []; + + for (const value of values) { + Assert(typeof value === 'string', 'Cannot compare string reference to non-string value'); + + if (value) { + const existing = map.get(value); + if (existing) { + ++existing.allowed; + } + else { + map.set(value, { allowed: 1, hits: 0 }); + patterns.push(EscapeRegex(value)); + } + } + else if (options.once || + options.only) { + + return false; + } + } + + if (!patterns.length) { // Non-empty string contains unlimited empty string + return true; + } + + // Match patterns + + const regex = new RegExp(`(${patterns.join('|')})`, 'g'); + const leftovers = ref.replace(regex, ($0, $1) => { + + ++map.get($1).hits; + return ''; // Remove from string + }); + + // Validate results + + if (options.only && + leftovers) { + + return false; + } + + let any = false; + for (const match of map.values()) { + if (match.hits) { + any = true; + } + + if (match.hits === match.allowed) { + continue; + } + + if (match.hits < match.allowed && + !options.part) { + + return false; + } + + // match.hits > match.allowed + + if (options.once) { + return false; + } + } + + return !!any; +}; + + +internals.compare = function (options) { + + if (!options.deep) { + return internals.shallow; + } + + const hasOnly = options.only !== undefined; + const hasPart = options.part !== undefined; + + const flags = { + prototype: hasOnly ? options.only : hasPart ? !options.part : false, + part: hasOnly ? !options.only : hasPart ? 
options.part : false + }; + + return (a, b) => DeepEqual(a, b, flags); +}; + + +internals.shallow = function (a, b) { + + return a === b; +}; diff --git a/node_modules/@hapi/hoek/lib/deepEqual.js b/node_modules/@hapi/hoek/lib/deepEqual.js new file mode 100644 index 000000000..a82647bea --- /dev/null +++ b/node_modules/@hapi/hoek/lib/deepEqual.js @@ -0,0 +1,317 @@ +'use strict'; + +const Types = require('./types'); + + +const internals = { + mismatched: null +}; + + +module.exports = function (obj, ref, options) { + + options = Object.assign({ prototype: true }, options); + + return !!internals.isDeepEqual(obj, ref, options, []); +}; + + +internals.isDeepEqual = function (obj, ref, options, seen) { + + if (obj === ref) { // Copied from Deep-eql, copyright(c) 2013 Jake Luer, jake@alogicalparadox.com, MIT Licensed, https://github.com/chaijs/deep-eql + return obj !== 0 || 1 / obj === 1 / ref; + } + + const type = typeof obj; + + if (type !== typeof ref) { + return false; + } + + if (obj === null || + ref === null) { + + return false; + } + + if (type === 'function') { + if (!options.deepFunction || + obj.toString() !== ref.toString()) { + + return false; + } + + // Continue as object + } + else if (type !== 'object') { + return obj !== obj && ref !== ref; // NaN + } + + const instanceType = internals.getSharedType(obj, ref, !!options.prototype); + switch (instanceType) { + case Types.buffer: + return Buffer && Buffer.prototype.equals.call(obj, ref); // $lab:coverage:ignore$ + case Types.promise: + return obj === ref; + case Types.regex: + return obj.toString() === ref.toString(); + case internals.mismatched: + return false; + } + + for (let i = seen.length - 1; i >= 0; --i) { + if (seen[i].isSame(obj, ref)) { + return true; // If previous comparison failed, it would have stopped execution + } + } + + seen.push(new internals.SeenEntry(obj, ref)); + + try { + return !!internals.isDeepEqualObj(instanceType, obj, ref, options, seen); + } + finally { + seen.pop(); + } +}; + + +internals.getSharedType = function (obj, ref, checkPrototype) { + + if (checkPrototype) { + if (Object.getPrototypeOf(obj) !== Object.getPrototypeOf(ref)) { + return internals.mismatched; + } + + return Types.getInternalProto(obj); + } + + const type = Types.getInternalProto(obj); + if (type !== Types.getInternalProto(ref)) { + return internals.mismatched; + } + + return type; +}; + + +internals.valueOf = function (obj) { + + const objValueOf = obj.valueOf; + if (objValueOf === undefined) { + return obj; + } + + try { + return objValueOf.call(obj); + } + catch (err) { + return err; + } +}; + + +internals.hasOwnEnumerableProperty = function (obj, key) { + + return Object.prototype.propertyIsEnumerable.call(obj, key); +}; + + +internals.isSetSimpleEqual = function (obj, ref) { + + for (const entry of Set.prototype.values.call(obj)) { + if (!Set.prototype.has.call(ref, entry)) { + return false; + } + } + + return true; +}; + + +internals.isDeepEqualObj = function (instanceType, obj, ref, options, seen) { + + const { isDeepEqual, valueOf, hasOwnEnumerableProperty } = internals; + const { keys, getOwnPropertySymbols } = Object; + + if (instanceType === Types.array) { + if (options.part) { + + // Check if any index match any other index + + for (const objValue of obj) { + for (const refValue of ref) { + if (isDeepEqual(objValue, refValue, options, seen)) { + return true; + } + } + } + } + else { + if (obj.length !== ref.length) { + return false; + } + + for (let i = 0; i < obj.length; ++i) { + if (!isDeepEqual(obj[i], ref[i], 
options, seen)) { + return false; + } + } + + return true; + } + } + else if (instanceType === Types.set) { + if (obj.size !== ref.size) { + return false; + } + + if (!internals.isSetSimpleEqual(obj, ref)) { + + // Check for deep equality + + const ref2 = new Set(Set.prototype.values.call(ref)); + for (const objEntry of Set.prototype.values.call(obj)) { + if (ref2.delete(objEntry)) { + continue; + } + + let found = false; + for (const refEntry of ref2) { + if (isDeepEqual(objEntry, refEntry, options, seen)) { + ref2.delete(refEntry); + found = true; + break; + } + } + + if (!found) { + return false; + } + } + } + } + else if (instanceType === Types.map) { + if (obj.size !== ref.size) { + return false; + } + + for (const [key, value] of Map.prototype.entries.call(obj)) { + if (value === undefined && !Map.prototype.has.call(ref, key)) { + return false; + } + + if (!isDeepEqual(value, Map.prototype.get.call(ref, key), options, seen)) { + return false; + } + } + } + else if (instanceType === Types.error) { + + // Always check name and message + + if (obj.name !== ref.name || + obj.message !== ref.message) { + + return false; + } + } + + // Check .valueOf() + + const valueOfObj = valueOf(obj); + const valueOfRef = valueOf(ref); + if ((obj !== valueOfObj || ref !== valueOfRef) && + !isDeepEqual(valueOfObj, valueOfRef, options, seen)) { + + return false; + } + + // Check properties + + const objKeys = keys(obj); + if (!options.part && + objKeys.length !== keys(ref).length && + !options.skip) { + + return false; + } + + let skipped = 0; + for (const key of objKeys) { + if (options.skip && + options.skip.includes(key)) { + + if (ref[key] === undefined) { + ++skipped; + } + + continue; + } + + if (!hasOwnEnumerableProperty(ref, key)) { + return false; + } + + if (!isDeepEqual(obj[key], ref[key], options, seen)) { + return false; + } + } + + if (!options.part && + objKeys.length - skipped !== keys(ref).length) { + + return false; + } + + // Check symbols + + if (options.symbols !== false) { // Defaults to true + const objSymbols = getOwnPropertySymbols(obj); + const refSymbols = new Set(getOwnPropertySymbols(ref)); + + for (const key of objSymbols) { + if (!options.skip || + !options.skip.includes(key)) { + + if (hasOwnEnumerableProperty(obj, key)) { + if (!hasOwnEnumerableProperty(ref, key)) { + return false; + } + + if (!isDeepEqual(obj[key], ref[key], options, seen)) { + return false; + } + } + else if (hasOwnEnumerableProperty(ref, key)) { + return false; + } + } + + refSymbols.delete(key); + } + + for (const key of refSymbols) { + if (hasOwnEnumerableProperty(ref, key)) { + return false; + } + } + } + + return true; +}; + + +internals.SeenEntry = class { + + constructor(obj, ref) { + + this.obj = obj; + this.ref = ref; + } + + isSame(obj, ref) { + + return this.obj === obj && this.ref === ref; + } +}; diff --git a/node_modules/@hapi/hoek/lib/error.js b/node_modules/@hapi/hoek/lib/error.js new file mode 100644 index 000000000..9fc4f5df4 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/error.js @@ -0,0 +1,26 @@ +'use strict'; + +const Stringify = require('./stringify'); + + +const internals = {}; + + +module.exports = class extends Error { + + constructor(args) { + + const msgs = args + .filter((arg) => arg !== '') + .map((arg) => { + + return typeof arg === 'string' ? arg : arg instanceof Error ? 
arg.message : Stringify(arg); + }); + + super(msgs.join(' ') || 'Unknown error'); + + if (typeof Error.captureStackTrace === 'function') { // $lab:coverage:ignore$ + Error.captureStackTrace(this, exports.assert); + } + } +}; diff --git a/node_modules/@hapi/hoek/lib/escapeHeaderAttribute.js b/node_modules/@hapi/hoek/lib/escapeHeaderAttribute.js new file mode 100644 index 000000000..a0a4deea4 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/escapeHeaderAttribute.js @@ -0,0 +1,16 @@ +'use strict'; + +const Assert = require('./assert'); + + +const internals = {}; + + +module.exports = function (attribute) { + + // Allowed value characters: !#$%&'()*+,-./:;<=>?@[]^_`{|}~ and space, a-z, A-Z, 0-9, \, " + + Assert(/^[ \w\!#\$%&'\(\)\*\+,\-\.\/\:;<\=>\?@\[\]\^`\{\|\}~\"\\]*$/.test(attribute), 'Bad attribute value (' + attribute + ')'); + + return attribute.replace(/\\/g, '\\\\').replace(/\"/g, '\\"'); // Escape quotes and slash +}; diff --git a/node_modules/@hapi/hoek/lib/escapeHtml.js b/node_modules/@hapi/hoek/lib/escapeHtml.js new file mode 100644 index 000000000..ddef2b61f --- /dev/null +++ b/node_modules/@hapi/hoek/lib/escapeHtml.js @@ -0,0 +1,87 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (input) { + + if (!input) { + return ''; + } + + let escaped = ''; + + for (let i = 0; i < input.length; ++i) { + + const charCode = input.charCodeAt(i); + + if (internals.isSafe(charCode)) { + escaped += input[i]; + } + else { + escaped += internals.escapeHtmlChar(charCode); + } + } + + return escaped; +}; + + +internals.escapeHtmlChar = function (charCode) { + + const namedEscape = internals.namedHtml[charCode]; + if (typeof namedEscape !== 'undefined') { + return namedEscape; + } + + if (charCode >= 256) { + return '&#' + charCode + ';'; + } + + const hexValue = charCode.toString(16).padStart(2, '0'); + return `&#x${hexValue};`; +}; + + +internals.isSafe = function (charCode) { + + return (typeof internals.safeCharCodes[charCode] !== 'undefined'); +}; + + +internals.namedHtml = { + '38': '&', + '60': '<', + '62': '>', + '34': '"', + '160': ' ', + '162': '¢', + '163': '£', + '164': '¤', + '169': '©', + '174': '®' +}; + + +internals.safeCharCodes = (function () { + + const safe = {}; + + for (let i = 32; i < 123; ++i) { + + if ((i >= 97) || // a-z + (i >= 65 && i <= 90) || // A-Z + (i >= 48 && i <= 57) || // 0-9 + i === 32 || // space + i === 46 || // . 
+ i === 44 || // , + i === 45 || // - + i === 58 || // : + i === 95) { // _ + + safe[i] = null; + } + } + + return safe; +}()); diff --git a/node_modules/@hapi/hoek/lib/escapeJson.js b/node_modules/@hapi/hoek/lib/escapeJson.js new file mode 100644 index 000000000..e6e94b38d --- /dev/null +++ b/node_modules/@hapi/hoek/lib/escapeJson.js @@ -0,0 +1,41 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (input) { + + if (!input) { + return ''; + } + + const lessThan = 0x3C; + const greaterThan = 0x3E; + const andSymbol = 0x26; + const lineSeperator = 0x2028; + + // replace method + let charCode; + return input.replace(/[<>&\u2028\u2029]/g, (match) => { + + charCode = match.charCodeAt(0); + + if (charCode === lessThan) { + return '\\u003c'; + } + + if (charCode === greaterThan) { + return '\\u003e'; + } + + if (charCode === andSymbol) { + return '\\u0026'; + } + + if (charCode === lineSeperator) { + return '\\u2028'; + } + + return '\\u2029'; + }); +}; diff --git a/node_modules/@hapi/hoek/lib/escapeRegex.js b/node_modules/@hapi/hoek/lib/escapeRegex.js new file mode 100644 index 000000000..3272497ef --- /dev/null +++ b/node_modules/@hapi/hoek/lib/escapeRegex.js @@ -0,0 +1,11 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (string) { + + // Escape ^$.*+-?=!:|\/()[]{}, + + return string.replace(/[\^\$\.\*\+\-\?\=\!\:\|\\\/\(\)\[\]\{\}\,]/g, '\\$&'); +}; diff --git a/node_modules/@hapi/hoek/lib/flatten.js b/node_modules/@hapi/hoek/lib/flatten.js new file mode 100644 index 000000000..726e23140 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/flatten.js @@ -0,0 +1,20 @@ +'use strict'; + +const internals = {}; + + +module.exports = internals.flatten = function (array, target) { + + const result = target || []; + + for (let i = 0; i < array.length; ++i) { + if (Array.isArray(array[i])) { + internals.flatten(array[i], result); + } + else { + result.push(array[i]); + } + } + + return result; +}; diff --git a/node_modules/@hapi/hoek/lib/ignore.js b/node_modules/@hapi/hoek/lib/ignore.js new file mode 100644 index 000000000..21ad14439 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/ignore.js @@ -0,0 +1,6 @@ +'use strict'; + +const internals = {}; + + +module.exports = function () { }; diff --git a/node_modules/@hapi/hoek/lib/index.d.ts b/node_modules/@hapi/hoek/lib/index.d.ts new file mode 100644 index 000000000..e9bcdc286 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/index.d.ts @@ -0,0 +1,471 @@ +/// + + +/** + * Performs a deep comparison of the two values including support for circular dependencies, prototype, and enumerable properties. + * + * @param obj - The value being compared. + * @param ref - The reference value used for comparison. + * + * @return true when the two values are equal, otherwise false. + */ +export function deepEqual(obj: any, ref: any, options?: deepEqual.Options): boolean; + +export namespace deepEqual { + + interface Options { + + /** + * Compare functions with difference references by comparing their internal code and properties. + * + * @default false + */ + readonly deepFunction?: boolean; + + /** + * Allow partial match. + * + * @default false + */ + readonly part?: boolean; + + /** + * Compare the objects' prototypes. + * + * @default true + */ + readonly prototype?: boolean; + + /** + * List of object keys to ignore different values of. + * + * @default null + */ + readonly skip?: (string | symbol)[]; + + /** + * Compare symbol properties. 
+ * + * @default true + */ + readonly symbols?: boolean; + } +} + + +/** + * Clone any value, object, or array. + * + * @param obj - The value being cloned. + * @param options - Optional settings. + * + * @returns A deep clone of `obj`. + */ +export function clone(obj: T, options?: clone.Options): T; + +export namespace clone { + + interface Options { + + /** + * Clone the object's prototype. + * + * @default true + */ + readonly prototype?: boolean; + + /** + * Include symbol properties. + * + * @default true + */ + readonly symbols?: boolean; + + /** + * Shallow clone the specified keys. + * + * @default undefined + */ + readonly shallow?: string[] | string[][] | boolean; + } +} + + +/** + * Merge all the properties of source into target. + * + * @param target - The object being modified. + * @param source - The object used to copy properties from. + * @param options - Optional settings. + * + * @returns The `target` object. + */ +export function merge(target: T1, source: T2, options?: merge.Options): T1 & T2; + +export namespace merge { + + interface Options { + + /** + * When true, null value from `source` overrides existing value in `target`. + * + * @default true + */ + readonly nullOverride?: boolean; + + /** + * When true, array value from `source` is merged with the existing value in `target`. + * + * @default false + */ + readonly mergeArrays?: boolean; + + /** + * Compare symbol properties. + * + * @default true + */ + readonly symbols?: boolean; + } +} + + +/** + * Apply source to a copy of the defaults. + * + * @param defaults - An object with the default values to use of `options` does not contain the same keys. + * @param source - The source used to override the `defaults`. + * @param options - Optional settings. + * + * @returns A copy of `defaults` with `source` keys overriding any conflicts. + */ +export function applyToDefaults(defaults: Partial, source: Partial | boolean | null, options?: applyToDefaults.Options): Partial; + +export namespace applyToDefaults { + + interface Options { + + /** + * When true, null value from `source` overrides existing value in `target`. + * + * @default true + */ + readonly nullOverride?: boolean; + + /** + * Shallow clone the specified keys. + * + * @default undefined + */ + readonly shallow?: string[] | string[][]; + } +} + + +/** + * Find the common unique items in two arrays. + * + * @param array1 - The first array to compare. + * @param array2 - The second array to compare. + * @param options - Optional settings. + * + * @return - An array of the common items. If `justFirst` is true, returns the first common item. + */ +export function intersect(array1: intersect.Array, array2: intersect.Array, options?: intersect.Options): Array; +export function intersect(array1: intersect.Array, array2: intersect.Array, options?: intersect.Options): T1 | T2; + +export namespace intersect { + + type Array = ArrayLike | Set | null; + + interface Options { + + /** + * When true, return the first overlapping value. + * + * @default false + */ + readonly first?: boolean; + } +} + + +/** + * Checks if the reference value contains the provided values. + * + * @param ref - The reference string, array, or object. + * @param values - A single or array of values to find within `ref`. If `ref` is an object, `values` can be a key name, an array of key names, or an object with key-value pairs to compare. + * + * @return true if the value contains the provided values, otherwise false. 
+ */ +export function contain(ref: string, values: string | string[], options?: contain.Options): boolean; +export function contain(ref: any[], values: any, options?: contain.Options): boolean; +export function contain(ref: object, values: string | string[] | object, options?: Omit): boolean; + +export namespace contain { + + interface Options { + + /** + * Perform a deep comparison. + * + * @default false + */ + readonly deep?: boolean; + + /** + * Allow only one occurrence of each value. + * + * @default false + */ + readonly once?: boolean; + + /** + * Allow only values explicitly listed. + * + * @default false + */ + readonly only?: boolean; + + /** + * Allow partial match. + * + * @default false + */ + readonly part?: boolean; + + /** + * Include symbol properties. + * + * @default true + */ + readonly symbols?: boolean; + } +} + + +/** + * Flatten an array with sub arrays + * + * @param array - an array of items or other arrays to flatten. + * @param target - if provided, an array to shallow copy the flattened `array` items to + * + * @return a flat array of the provided values (appended to `target` is provided). + */ +export function flatten(array: ArrayLike>, target?: ArrayLike>): T[]; + + +/** + * Convert an object key chain string to reference. + * + * @param obj - the object from which to look up the value. + * @param chain - the string path of the requested value. The chain string is split into key names using `options.separator`, or an array containing each individual key name. A chain including negative numbers will work like a negative index on an array. + * + * @return The value referenced by the chain if found, otherwise undefined. If chain is null, undefined, or false, the object itself will be returned. + */ +export function reach(obj: object | null, chain: string | (string | number)[] | false | null | undefined, options?: reach.Options): any; + +export namespace reach { + + interface Options { + + /** + * String to split chain path on. Defaults to '.'. + * + * @default false + */ + readonly separator?: string; + + /** + * Value to return if the path or value is not present. No default value. + * + * @default false + */ + readonly default?: any; + + /** + * If true, will throw an error on missing member in the chain. Default to false. + * + * @default false + */ + readonly strict?: boolean; + + /** + * If true, allows traversing functions for properties. false will throw an error if a function is part of the chain. + * + * @default true + */ + readonly functions?: boolean; + + /** + * If true, allows traversing Set and Map objects for properties. false will return undefined regardless of the Set or Map passed. + * + * @default false + */ + readonly iterables?: boolean; + } +} + + +/** + * Replace string parameters (using format "{path.to.key}") with their corresponding object key values using `Hoek.reach()`. + * + * @param obj - the object from which to look up the value. + * @param template - the string containing {} enclosed key paths to be replaced. + * + * @return The template string with the {} enclosed keys replaced with looked-up values. + */ +export function reachTemplate(obj: object | null, template: string, options?: reach.Options): string; + + +/** + * Throw an error if condition is falsy. + * + * @param condition - If `condition` is not truthy, an exception is thrown. + * @param error - The error thrown if the condition fails. + * + * @return Does not return a value but throws if the `condition` is falsy. 
+ */ +export function assert(condition: any, error: Error): void; + + +/** + * Throw an error if condition is falsy. + * + * @param condition - If `condition` is not truthy, an exception is thrown. + * @param args - Any number of values, concatenated together (space separated) to create the error message. + * + * @return Does not return a value but throws if the `condition` is falsy. + */ +export function assert(condition: any, ...args: any): void; + + +/** + * A benchmarking timer, using the internal node clock for maximum accuracy. + */ +export class Bench { + + constructor(); + + /** The starting timestamp expressed in the number of milliseconds since the epoch. */ + ts: number; + + /** The time in milliseconds since the object was created. */ + elapsed(): number; + + /** Reset the `ts` value to now. */ + reset(): void; + + /** The current time in milliseconds since the epoch. */ + static now(): number; +} + + +/** + * Escape string for Regex construction by prefixing all reserved characters with a backslash. + * + * @param string - The string to be escaped. + * + * @return The escaped string. + */ +export function escapeRegex(string: string): string; + + +/** + * Escape string for usage as an attribute value in HTTP headers. + * + * @param attribute - The string to be escaped. + * + * @return The escaped string. Will throw on invalid characters that are not supported to be escaped. + */ +export function escapeHeaderAttribute(attribute: string): string; + + +/** + * Escape string for usage in HTML. + * + * @param string - The string to be escaped. + * + * @return The escaped string. + */ +export function escapeHtml(string: string): string; + + +/** + * Escape string for usage in JSON. + * + * @param string - The string to be escaped. + * + * @return The escaped string. + */ +export function escapeJson(string: string): string; + + +/** + * Wraps a function to ensure it can only execute once. + * + * @param method - The function to be wrapped. + * + * @return The wrapped function. + */ +export function once(method: T): T; + + +/** + * A reusable no-op function. + */ +export function ignore(...ignore: any): void; + + +/** + * Converts a JavaScript value to a JavaScript Object Notation (JSON) string with protection against thrown errors. + * + * @param value A JavaScript value, usually an object or array, to be converted. + * @param replacer The JSON.stringify() `replacer` argument. + * @param space Adds indentation, white space, and line break characters to the return-value JSON text to make it easier to read. + * + * @return The JSON string. If the operation fails, an error string value is returned (no exception thrown). + */ +export function stringify(value: any, replacer?: any, space?: string | number): string; + + +/** + * Returns a Promise that resolves after the requested timeout. + * + * @param timeout - The number of milliseconds to wait before resolving the Promise. + * @param returnValue - The value that the Promise will resolve to. + * + * @return A Promise that resolves with `returnValue`. + */ +export function wait(timeout?: number, returnValue?: T): Promise; + + +/** + * Returns a Promise that never resolves. + */ +export function block(): Promise; + + +/** + * Determines if an object is a promise. + * + * @param promise - the object tested. + * + * @returns true if the object is a promise, otherwise false. + */ +export function isPromise(promise: any): boolean; + + +export namespace ts { + + /** + * Defines a type that can must be one of T or U but not both. 
+ */ + type XOR = (T | U) extends object ? (internals.Without & U) | (internals.Without & T) : T | U; +} + + +declare namespace internals { + + type Without = { [P in Exclude]?: never }; +} diff --git a/node_modules/@hapi/hoek/lib/index.js b/node_modules/@hapi/hoek/lib/index.js new file mode 100644 index 000000000..2062f1803 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/index.js @@ -0,0 +1,45 @@ +'use strict'; + +exports.applyToDefaults = require('./applyToDefaults'); + +exports.assert = require('./assert'); + +exports.Bench = require('./bench'); + +exports.block = require('./block'); + +exports.clone = require('./clone'); + +exports.contain = require('./contain'); + +exports.deepEqual = require('./deepEqual'); + +exports.Error = require('./error'); + +exports.escapeHeaderAttribute = require('./escapeHeaderAttribute'); + +exports.escapeHtml = require('./escapeHtml'); + +exports.escapeJson = require('./escapeJson'); + +exports.escapeRegex = require('./escapeRegex'); + +exports.flatten = require('./flatten'); + +exports.ignore = require('./ignore'); + +exports.intersect = require('./intersect'); + +exports.isPromise = require('./isPromise'); + +exports.merge = require('./merge'); + +exports.once = require('./once'); + +exports.reach = require('./reach'); + +exports.reachTemplate = require('./reachTemplate'); + +exports.stringify = require('./stringify'); + +exports.wait = require('./wait'); diff --git a/node_modules/@hapi/hoek/lib/intersect.js b/node_modules/@hapi/hoek/lib/intersect.js new file mode 100644 index 000000000..59e6aaf16 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/intersect.js @@ -0,0 +1,41 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (array1, array2, options = {}) { + + if (!array1 || + !array2) { + + return (options.first ? null : []); + } + + const common = []; + const hash = (Array.isArray(array1) ? new Set(array1) : array1); + const found = new Set(); + for (const value of array2) { + if (internals.has(hash, value) && + !found.has(value)) { + + if (options.first) { + return value; + } + + common.push(value); + found.add(value); + } + } + + return (options.first ? 
null : common); +}; + + +internals.has = function (ref, key) { + + if (typeof ref.has === 'function') { + return ref.has(key); + } + + return ref[key] !== undefined; +}; diff --git a/node_modules/@hapi/hoek/lib/isPromise.js b/node_modules/@hapi/hoek/lib/isPromise.js new file mode 100644 index 000000000..402980404 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/isPromise.js @@ -0,0 +1,9 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (promise) { + + return !!promise && typeof promise.then === 'function'; +}; diff --git a/node_modules/@hapi/hoek/lib/merge.js b/node_modules/@hapi/hoek/lib/merge.js new file mode 100644 index 000000000..47a1e1e9d --- /dev/null +++ b/node_modules/@hapi/hoek/lib/merge.js @@ -0,0 +1,78 @@ +'use strict'; + +const Assert = require('./assert'); +const Clone = require('./clone'); +const Utils = require('./utils'); + + +const internals = {}; + + +module.exports = internals.merge = function (target, source, options) { + + Assert(target && typeof target === 'object', 'Invalid target value: must be an object'); + Assert(source === null || source === undefined || typeof source === 'object', 'Invalid source value: must be null, undefined, or an object'); + + if (!source) { + return target; + } + + options = Object.assign({ nullOverride: true, mergeArrays: true }, options); + + if (Array.isArray(source)) { + Assert(Array.isArray(target), 'Cannot merge array onto an object'); + if (!options.mergeArrays) { + target.length = 0; // Must not change target assignment + } + + for (let i = 0; i < source.length; ++i) { + target.push(Clone(source[i], { symbols: options.symbols })); + } + + return target; + } + + const keys = Utils.keys(source, options); + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + if (key === '__proto__' || + !Object.prototype.propertyIsEnumerable.call(source, key)) { + + continue; + } + + const value = source[key]; + if (value && + typeof value === 'object') { + + if (target[key] === value) { + continue; // Can occur for shallow merges + } + + if (!target[key] || + typeof target[key] !== 'object' || + (Array.isArray(target[key]) !== Array.isArray(value)) || + value instanceof Date || + (Buffer && Buffer.isBuffer(value)) || // $lab:coverage:ignore$ + value instanceof RegExp) { + + target[key] = Clone(value, { symbols: options.symbols }); + } + else { + internals.merge(target[key], value, options); + } + } + else { + if (value !== null && + value !== undefined) { // Explicit to preserve empty strings + + target[key] = value; + } + else if (options.nullOverride) { + target[key] = value; + } + } + } + + return target; +}; diff --git a/node_modules/@hapi/hoek/lib/once.js b/node_modules/@hapi/hoek/lib/once.js new file mode 100644 index 000000000..de94ea0ab --- /dev/null +++ b/node_modules/@hapi/hoek/lib/once.js @@ -0,0 +1,23 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (method) { + + if (method._hoekOnce) { + return method; + } + + let once = false; + const wrapped = function (...args) { + + if (!once) { + once = true; + method(...args); + } + }; + + wrapped._hoekOnce = true; + return wrapped; +}; diff --git a/node_modules/@hapi/hoek/lib/reach.js b/node_modules/@hapi/hoek/lib/reach.js new file mode 100644 index 000000000..3791b37b2 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/reach.js @@ -0,0 +1,76 @@ +'use strict'; + +const Assert = require('./assert'); + + +const internals = {}; + + +module.exports = function (obj, chain, options) { + + if (chain === false || + chain === null || + chain === 
undefined) { + + return obj; + } + + options = options || {}; + if (typeof options === 'string') { + options = { separator: options }; + } + + const isChainArray = Array.isArray(chain); + + Assert(!isChainArray || !options.separator, 'Separator option no valid for array-based chain'); + + const path = isChainArray ? chain : chain.split(options.separator || '.'); + let ref = obj; + for (let i = 0; i < path.length; ++i) { + let key = path[i]; + const type = options.iterables && internals.iterables(ref); + + if (Array.isArray(ref) || + type === 'set') { + + const number = Number(key); + if (Number.isInteger(number)) { + key = number < 0 ? ref.length + number : number; + } + } + + if (!ref || + typeof ref === 'function' && options.functions === false || // Defaults to true + !type && ref[key] === undefined) { + + Assert(!options.strict || i + 1 === path.length, 'Missing segment', key, 'in reach path ', chain); + Assert(typeof ref === 'object' || options.functions === true || typeof ref !== 'function', 'Invalid segment', key, 'in reach path ', chain); + ref = options.default; + break; + } + + if (!type) { + ref = ref[key]; + } + else if (type === 'set') { + ref = [...ref][key]; + } + else { // type === 'map' + ref = ref.get(key); + } + } + + return ref; +}; + + +internals.iterables = function (ref) { + + if (ref instanceof Set) { + return 'set'; + } + + if (ref instanceof Map) { + return 'map'; + } +}; diff --git a/node_modules/@hapi/hoek/lib/reachTemplate.js b/node_modules/@hapi/hoek/lib/reachTemplate.js new file mode 100644 index 000000000..e382d50c1 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/reachTemplate.js @@ -0,0 +1,16 @@ +'use strict'; + +const Reach = require('./reach'); + + +const internals = {}; + + +module.exports = function (obj, template, options) { + + return template.replace(/{([^{}]+)}/g, ($0, chain) => { + + const value = Reach(obj, chain, options); + return (value === undefined || value === null ? 
'' : value); + }); +}; diff --git a/node_modules/@hapi/hoek/lib/stringify.js b/node_modules/@hapi/hoek/lib/stringify.js new file mode 100644 index 000000000..88d0fc4ed --- /dev/null +++ b/node_modules/@hapi/hoek/lib/stringify.js @@ -0,0 +1,14 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (...args) { + + try { + return JSON.stringify.apply(null, args); + } + catch (err) { + return '[Cannot display object: ' + err.message + ']'; + } +}; diff --git a/node_modules/@hapi/hoek/lib/types.js b/node_modules/@hapi/hoek/lib/types.js new file mode 100644 index 000000000..c291b6578 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/types.js @@ -0,0 +1,55 @@ +'use strict'; + +const internals = {}; + + +exports = module.exports = { + array: Array.prototype, + buffer: Buffer && Buffer.prototype, // $lab:coverage:ignore$ + date: Date.prototype, + error: Error.prototype, + generic: Object.prototype, + map: Map.prototype, + promise: Promise.prototype, + regex: RegExp.prototype, + set: Set.prototype, + weakMap: WeakMap.prototype, + weakSet: WeakSet.prototype +}; + + +internals.typeMap = new Map([ + ['[object Error]', exports.error], + ['[object Map]', exports.map], + ['[object Promise]', exports.promise], + ['[object Set]', exports.set], + ['[object WeakMap]', exports.weakMap], + ['[object WeakSet]', exports.weakSet] +]); + + +exports.getInternalProto = function (obj) { + + if (Array.isArray(obj)) { + return exports.array; + } + + if (Buffer && obj instanceof Buffer) { // $lab:coverage:ignore$ + return exports.buffer; + } + + if (obj instanceof Date) { + return exports.date; + } + + if (obj instanceof RegExp) { + return exports.regex; + } + + if (obj instanceof Error) { + return exports.error; + } + + const objName = Object.prototype.toString.call(obj); + return internals.typeMap.get(objName) || exports.generic; +}; diff --git a/node_modules/@hapi/hoek/lib/utils.js b/node_modules/@hapi/hoek/lib/utils.js new file mode 100644 index 000000000..bab1e8c41 --- /dev/null +++ b/node_modules/@hapi/hoek/lib/utils.js @@ -0,0 +1,9 @@ +'use strict'; + +const internals = {}; + + +exports.keys = function (obj, options = {}) { + + return options.symbols !== false ? 
Reflect.ownKeys(obj) : Object.getOwnPropertyNames(obj); // Defaults to true +}; diff --git a/node_modules/@hapi/hoek/lib/wait.js b/node_modules/@hapi/hoek/lib/wait.js new file mode 100644 index 000000000..f34585efe --- /dev/null +++ b/node_modules/@hapi/hoek/lib/wait.js @@ -0,0 +1,13 @@ +'use strict'; + +const internals = {}; + + +module.exports = function (timeout, returnValue) { + + if (typeof timeout !== 'number' && timeout !== undefined) { + throw new TypeError('Timeout must be a number'); + } + + return new Promise((resolve) => setTimeout(resolve, timeout, returnValue)); +}; diff --git a/node_modules/@hapi/hoek/package.json b/node_modules/@hapi/hoek/package.json new file mode 100644 index 000000000..f28910a81 --- /dev/null +++ b/node_modules/@hapi/hoek/package.json @@ -0,0 +1,66 @@ +{ + "_from": "@hapi/hoek@^9.0.0", + "_id": "@hapi/hoek@9.2.1", + "_inBundle": false, + "_integrity": "sha512-gfta+H8aziZsm8pZa0vj04KO6biEiisppNgA1kbJvFrrWu9Vm7eaUEy76DIxsuTaWvti5fkJVhllWc6ZTE+Mdw==", + "_location": "/@hapi/hoek", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@hapi/hoek@^9.0.0", + "name": "@hapi/hoek", + "escapedName": "@hapi%2fhoek", + "scope": "@hapi", + "rawSpec": "^9.0.0", + "saveSpec": null, + "fetchSpec": "^9.0.0" + }, + "_requiredBy": [ + "/@hapi/topo", + "/@sideway/address", + "/joi" + ], + "_resolved": "https://registry.npmjs.org/@hapi/hoek/-/hoek-9.2.1.tgz", + "_shasum": "9551142a1980503752536b5050fd99f4a7f13b17", + "_spec": "@hapi/hoek@^9.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\joi", + "bugs": { + "url": "https://github.com/hapijs/hoek/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "General purpose node utilities", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/eslint-plugin": "*", + "@hapi/lab": "^24.0.0", + "typescript": "~4.0.2" + }, + "eslintConfig": { + "extends": [ + "plugin:@hapi/module" + ] + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/hapijs/hoek#readme", + "keywords": [ + "utilities" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@hapi/hoek", + "repository": { + "type": "git", + "url": "git://github.com/hapijs/hoek.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "9.2.1" +} diff --git a/node_modules/@hapi/joi/LICENSE.md b/node_modules/@hapi/joi/LICENSE.md new file mode 100644 index 000000000..92cd16f46 --- /dev/null +++ b/node_modules/@hapi/joi/LICENSE.md @@ -0,0 +1,10 @@ +Copyright (c) 2012-2020, Sideway Inc, and project contributors +Copyright (c) 2012-2014, Walmart. +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: +* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. +* The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. 
+ +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@hapi/joi/README.md b/node_modules/@hapi/joi/README.md new file mode 100644 index 000000000..0498e22b9 --- /dev/null +++ b/node_modules/@hapi/joi/README.md @@ -0,0 +1,17 @@ + + +# @hapi/joi + +#### The most powerful schema description language and data validator for JavaScript. + +**joi** is part of the **hapi** ecosystem and was designed to work seamlessly with the [hapi web framework](https://hapi.dev) and its other components (but works great on its own or with other frameworks). If you are using a different web framework and find this module useful, check out [hapi](https://hapi.dev) – they work even better together. + +### Visit the [hapi.dev](https://hapi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://hapi.dev/family/joi/) +- [Versions status](https://hapi.dev/resources/status/#joi) +- [Changelog](https://hapi.dev/family/joi/changelog/) +- [Project policies](https://hapi.dev/policies/) +- [Free and commercial support options](https://hapi.dev/support/) diff --git a/node_modules/@hapi/joi/dist/joi-browser.min.js b/node_modules/@hapi/joi/dist/joi-browser.min.js new file mode 100644 index 000000000..fc22562b5 --- /dev/null +++ b/node_modules/@hapi/joi/dist/joi-browser.min.js @@ -0,0 +1 @@ +!function(e,t){"object"==typeof exports&&"object"==typeof module?module.exports=t():"function"==typeof define&&define.amd?define([],t):"object"==typeof exports?exports.joi=t():e.joi=t()}(window,(function(){return function(e){var t={};function r(s){if(t[s])return t[s].exports;var n=t[s]={i:s,l:!1,exports:{}};return e[s].call(n.exports,n,n.exports,r),n.l=!0,n.exports}return r.m=e,r.c=t,r.d=function(e,t,s){r.o(e,t)||Object.defineProperty(e,t,{enumerable:!0,get:s})},r.r=function(e){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(e,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(e,"__esModule",{value:!0})},r.t=function(e,t){if(1&t&&(e=r(e)),8&t)return e;if(4&t&&"object"==typeof e&&e&&e.__esModule)return e;var s=Object.create(null);if(r.r(s),Object.defineProperty(s,"default",{enumerable:!0,value:e}),2&t&&"string"!=typeof e)for(var n in e)r.d(s,n,function(t){return e[t]}.bind(null,n));return s},r.n=function(e){var t=e&&e.__esModule?function(){return e.default}:function(){return e};return r.d(t,"a",t),t},r.o=function(e,t){return Object.prototype.hasOwnProperty.call(e,t)},r.p="",r(r.s=11)}([function(e,t,r){"use strict";const s=r(12);e.exports=function(e,...t){if(!e){if(1===t.length&&t[0]instanceof Error)throw t[0];throw new s(t)}}},function(e,t,r){"use strict";const s=r(0),n=r(12),o=r(29);let a,i;const 
l={isoDate:/^(?:[-+]\d{2})?(?:\d{4}(?!\d{2}\b))(?:(-?)(?:(?:0[1-9]|1[0-2])(?:\1(?:[12]\d|0[1-9]|3[01]))?|W(?:[0-4]\d|5[0-2])(?:-?[1-7])?|(?:00[1-9]|0[1-9]\d|[12]\d{2}|3(?:[0-5]\d|6[1-6])))(?![T]$|[T][\d]+Z$)(?:[T\s](?:(?:(?:[01]\d|2[0-3])(?:(:?)[0-5]\d)?|24\:?00)(?:[.,]\d+(?!:))?)(?:\2[0-5]\d(?:[.,]\d+)?)?(?:[Z]|(?:[+-])(?:[01]\d|2[0-3])(?::?[0-5]\d)?)?)?)?$/};t.version=o.version,t.defaults={abortEarly:!0,allowUnknown:!1,cache:!0,context:null,convert:!0,dateFormat:"iso",errors:{escapeHtml:!1,label:"path",language:null,render:!0,stack:!1,wrap:{label:'"',array:"[]"}},externals:!0,messages:{},nonEnumerables:!1,noDefaults:!1,presence:"optional",skipFunctions:!1,stripUnknown:!1,warnings:!1},t.symbols={any:Symbol.for("@hapi/joi/schema"),arraySingle:Symbol("arraySingle"),deepDefault:Symbol("deepDefault"),literal:Symbol("literal"),override:Symbol("override"),prefs:Symbol("prefs"),ref:Symbol("ref"),values:Symbol("values"),template:Symbol("template")},t.assertOptions=function(e,t,r="Options"){s(e&&"object"==typeof e&&!Array.isArray(e),"Options must be of type object");const n=Object.keys(e).filter(e=>!t.includes(e));s(0===n.length,"".concat(r," contain unknown keys: ").concat(n))},t.checkPreferences=function(e){i=i||r(16);const t=i.preferences.validate(e);if(t.error)throw new n([t.error.details[0].message])},t.compare=function(e,t,r){switch(r){case"=":return e===t;case">":return e>t;case"<":return e=":return e>=t;case"<=":return e<=t}},t.default=function(e,t){return void 0===e?t:e},t.isIsoDate=function(e){return l.isoDate.test(e)},t.isNumber=function(e){return"number"==typeof e&&!isNaN(e)},t.isResolvable=function(e){return!!e&&(e[t.symbols.ref]||e[t.symbols.template])},t.isSchema=function(e,r={}){const n=e&&e[t.symbols.any];return!!n&&(s(r.legacy||n.version===t.version,"Cannot mix different versions of joi schemas"),!0)},t.isValues=function(e){return e[t.symbols.values]},t.limit=function(e){return Number.isSafeInteger(e)&&e>=0},t.preferences=function(e,s){a=a||r(9),e=e||{},s=s||{};const n=Object.assign({},e,s);return s.errors&&e.errors&&(n.errors=Object.assign({},e.errors,s.errors),n.errors.wrap=Object.assign({},e.errors.wrap,s.errors.wrap)),s.messages&&(n.messages=a.compile(s.messages,e.messages)),delete n[t.symbols.prefs],n},t.tryWithPath=function(e,t,r={}){try{return e()}catch(e){throw void 0!==e.path?e.path=t+"."+e.path:e.path=t,r.append&&(e.message="".concat(e.message," (").concat(e.path,")")),e}},t.validateArg=function(e,r,{assert:s,message:n}){if(t.isSchema(s)){const t=s.validate(e);if(!t.error)return;return t.error.message}if(!s(e))return r?"".concat(r," ").concat(n):n},t.verifyFlat=function(e,t){for(const r of e)s(!Array.isArray(r),"Method no longer accepts array arguments:",t)}},function(e,t,r){"use strict";const s=r(6),n=r(13),o=r(14),a={needsProtoHack:new Set([n.set,n.map,n.weakSet,n.weakMap])};e.exports=a.clone=function(e,t={},r=null){if("object"!=typeof e||null===e)return e;let s=a.clone,i=r;if(t.shallow){if(!0!==t.shallow)return a.cloneWithShallow(e,t);s=e=>e}else if(i){const t=i.get(e);if(t)return t}else i=new Map;const l=n.getInternalProto(e);if(l===n.buffer)return!1;if(l===n.date)return new Date(e.getTime());if(l===n.regex)return new RegExp(e);const c=a.base(e,l,t);if(c===e)return e;if(i&&i.set(e,c),l===n.set)for(const r of e)c.add(s(r,t,i));else if(l===n.map)for(const[r,n]of e)c.set(r,s(n,t,i));const u=o.keys(e,t);for(const r of u){if("__proto__"===r)continue;if(l===n.array&&"length"===r){c.length=e.length;continue}const 
o=Object.getOwnPropertyDescriptor(e,r);o?o.get||o.set?Object.defineProperty(c,r,o):o.enumerable?c[r]=s(e[r],t,i):Object.defineProperty(c,r,{enumerable:!1,writable:!0,configurable:!0,value:s(e[r],t,i)}):Object.defineProperty(c,r,{enumerable:!0,writable:!0,configurable:!0,value:s(e[r],t,i)})}return c},a.cloneWithShallow=function(e,t){const r=t.shallow;(t=Object.assign({},t)).shallow=!1;const n=new Map;for(const t of r){const r=s(e,t);"object"!=typeof r&&"function"!=typeof r||n.set(r,r)}return a.clone(e,t,n)},a.base=function(e,t,r){if(!1===r.prototype)return a.needsProtoHack.has(t)?new t.constructor:t===n.array?[]:{};const s=Object.getPrototypeOf(e);if(s&&s.isImmutable)return e;if(t===n.array){const e=[];return s!==t&&Object.setPrototypeOf(e,s),e}if(a.needsProtoHack.has(t)){const e=new s.constructor;return s!==t&&Object.setPrototypeOf(e,s),e}return Object.create(s)}},function(e,t,r){"use strict";const s=r(0),n=r(34),o=r(1),a=r(9);e.exports=n.extend({type:"any",flags:{only:{default:!1}},terms:{alterations:{init:null},examples:{init:null},externals:{init:null},metas:{init:[]},notes:{init:[]},shared:{init:null},tags:{init:[]},whens:{init:null}},rules:{custom:{method(e,t){return s("function"==typeof e,"Method must be a function"),s(void 0===t||t&&"string"==typeof t,"Description must be a non-empty string"),this.$_addRule({name:"custom",args:{method:e,description:t}})},validate(e,t,{method:r}){try{return r(e,t)}catch(e){return t.error("any.custom",{error:e})}},args:["method","description"],multi:!0},messages:{method(e){return this.prefs({messages:e})}},shared:{method(e){s(o.isSchema(e)&&e._flags.id,"Schema must be a schema with an id");const t=this.clone();return t.$_terms.shared=t.$_terms.shared||[],t.$_terms.shared.push(e),t.$_mutateRegister(e),t}},warning:{method(e,t){return s(e&&"string"==typeof e,"Invalid warning code"),this.$_addRule({name:"warning",args:{code:e,local:t},warn:!0})},validate:(e,t,{code:r,local:s})=>t.error(r,s),args:["code","local"],multi:!0}},modifiers:{keep(e,t=!0){e.keep=t},message(e,t){e.message=a.compile(t)},warn(e,t=!0){e.warn=t}},manifest:{build(e,t){for(const r in t){const s=t[r];if(["examples","externals","metas","notes","tags"].includes(r))for(const t of s)e=e[r.slice(0,-1)](t);else if("alterations"!==r)if("whens"!==r){if("shared"===r)for(const t of s)e=e.shared(t)}else for(const t of s){const{ref:r,is:s,not:n,then:o,otherwise:a,concat:i}=t;e=i?e.concat(i):r?e.when(r,{is:s,not:n,then:o,otherwise:a,switch:t.switch,break:t.break}):e.when(s,{then:o,otherwise:a,break:t.break})}else{const t={};for(const{target:e,adjuster:r}of s)t[e]=r;e=e.alter(t)}}return e}},messages:{"any.custom":"{{#label}} failed custom validation because {{#error.message}}","any.default":"{{#label}} threw an error when running default method","any.failover":"{{#label}} threw an error when running failover method","any.invalid":"{{#label}} contains an invalid value","any.only":'{{#label}} must be {if(#valids.length == 1, "", "one of ")}{{#valids}}',"any.ref":'{{#label}} {{#arg}} references "{{#ref}}" which {{#reason}}',"any.required":"{{#label}} is required","any.unknown":"{{#label}} is not allowed"}})},function(e,t,r){"use strict";const s=r(32),n=r(1),o=r(7);t.Report=class{constructor(e,r,s,n,o,a,i){if(this.code=e,this.flags=n,this.messages=o,this.path=a.path,this.prefs=i,this.state=a,this.value=r,this.message=null,this.template=null,this.local=s||{},this.local.label=t.label(this.flags,this.state,this.prefs,this.messages),void 
0===this.value||this.local.hasOwnProperty("value")||(this.local.value=this.value),this.path.length){const e=this.path[this.path.length-1];"object"!=typeof e&&(this.local.key=e)}}_setTemplate(e){if(this.template=e,!this.flags.label&&0===this.path.length){const e=this._template(this.template,"root");e&&(this.local.label=e)}}toString(){if(this.message)return this.message;const e=this.code;if(!this.prefs.errors.render)return this.code;const t=this._template(this.template)||this._template(this.prefs.messages)||this._template(this.messages);return void 0===t?'Error code "'.concat(e,'" is not defined, your custom type is missing the correct messages definition'):(this.message=t.render(this.value,this.state,this.prefs,this.local,{errors:this.prefs.errors,messages:[this.prefs.messages,this.messages]}),this.prefs.errors.label||(this.message=this.message.replace(/^"" /,"").trim()),this.message)}_template(e,r){return t.template(this.value,e,r||this.code,this.state,this.prefs)}},t.path=function(e){let t="";for(const r of e)"object"!=typeof r&&("string"==typeof r?(t&&(t+="."),t+=r):t+="[".concat(r,"]"));return t},t.template=function(e,t,r,s,a){if(!t)return;if(o.isTemplate(t))return"root"!==r?t:null;let i=a.errors.language;return n.isResolvable(i)&&(i=i.resolve(e,s,a)),i&&t[i]&&void 0!==t[i][r]?t[i][r]:t[r]},t.label=function(e,r,s,n){if(e.label)return e.label;if(!s.errors.label)return"";let o=r.path;"key"===s.errors.label&&r.path.length>1&&(o=r.path.slice(-1));const a=t.path(o);return a||(t.template(null,s.messages,"root",r,s)||n&&t.template(null,n,"root",r,s)||"value")},t.process=function(e,r,s){if(!e)return null;const{override:n,message:o,details:a}=t.details(e);if(n)return n;if(s.errors.stack)return new t.ValidationError(o,a,r);const i=Error.stackTraceLimit;Error.stackTraceLimit=0;const l=new t.ValidationError(o,a,r);return Error.stackTraceLimit=i,l},t.details=function(e,t={}){let r=[];const s=[];for(const n of e){if(n instanceof Error){if(!1!==t.override)return{override:n};const e=n.toString();r.push(e),s.push({message:e,type:"override",context:{error:n}});continue}const e=n.toString();r.push(e),s.push({message:e,path:n.path.filter(e=>"object"!=typeof e),type:n.code,context:n.local})}return r.length>1&&(r=[...new Set(r)]),{message:r.join(". 
"),details:s}},t.ValidationError=class extends Error{constructor(e,t,r){super(e),this._original=r,this.details=t}static isError(e){return e instanceof t.ValidationError}},t.ValidationError.prototype.isJoi=!0,t.ValidationError.prototype.name="ValidationError",t.ValidationError.prototype.annotate=s.error},function(e,t,r){"use strict";const s=r(0),n=r(2),o=r(6),a=r(1);let i;const l={symbol:Symbol("ref"),defaults:{adjust:null,in:!1,iterables:null,map:null,separator:".",type:"value"}};t.create=function(e,t={}){s("string"==typeof e,"Invalid reference key:",e),a.assertOptions(t,["adjust","ancestor","in","iterables","map","prefix","separator"]),s(!t.prefix||"object"==typeof t.prefix,"options.prefix must be of type object");const r=Object.assign({},l.defaults,t);delete r.prefix;const n=r.separator,o=l.context(e,n,t.prefix);if(r.type=o.type,e=o.key,"value"===r.type)if(o.root&&(s(!n||e[0]!==n,"Cannot specify relative path with root prefix"),r.ancestor="root",e||(e=null)),n&&n===e)e=null,r.ancestor=0;else if(void 0!==r.ancestor)s(!n||!e||e[0]!==n,"Cannot combine prefix with ancestor option");else{const[t,s]=l.ancestor(e,n);s&&""===(e=e.slice(s))&&(e=null),r.ancestor=t}return r.path=n?null===e?[]:e.split(n):[e],new l.Ref(r)},t.in=function(e,r={}){return t.create(e,Object.assign({},r,{in:!0}))},t.isRef=function(e){return!!e&&!!e[a.symbols.ref]},l.Ref=class{constructor(e){s("object"==typeof e,"Invalid reference construction"),a.assertOptions(e,["adjust","ancestor","in","iterables","map","path","separator","type","depth","key","root","display"]),s([!1,void 0].includes(e.separator)||"string"==typeof e.separator&&1===e.separator.length,"Invalid separator"),s(!e.adjust||"function"==typeof e.adjust,"options.adjust must be a function"),s(!e.map||Array.isArray(e.map),"options.map must be an array"),s(!e.map||!e.adjust,"Cannot set both map and adjust options"),Object.assign(this,l.defaults,e),s("value"===this.type||void 0===this.ancestor,"Non-value references cannot reference ancestors"),Array.isArray(this.map)&&(this.map=new Map(this.map)),this.depth=this.path.length,this.key=this.path.length?this.path.join(this.separator):null,this.root=this.path[0],this.updateDisplay()}resolve(e,t,r,n,o={}){return s(!this.in||o.in,"Invalid in() reference usage"),"global"===this.type?this._resolve(r.context,t,o):"local"===this.type?this._resolve(n,t,o):this.ancestor?"root"===this.ancestor?this._resolve(t.ancestors[t.ancestors.length-1],t,o):(s(this.ancestor<=t.ancestors.length,"Invalid reference exceeds the schema root:",this.display),this._resolve(t.ancestors[this.ancestor-1],t,o)):this._resolve(e,t,o)}_resolve(e,t,r){let s;if("value"===this.type&&t.mainstay.shadow&&!1!==r.shadow&&(s=t.mainstay.shadow.get(this.absolute(t))),void 0===s&&(s=o(e,this.path,{iterables:this.iterables,functions:!0})),this.adjust&&(s=this.adjust(s)),this.map){const e=this.map.get(s);void 0!==e&&(s=e)}return t.mainstay&&t.mainstay.tracer.resolve(t,this,s),s}toString(){return this.display}absolute(e){return[...e.path.slice(0,-this.ancestor),...this.path]}clone(){return new l.Ref(this)}describe(){const e={path:this.path};"value"!==this.type&&(e.type=this.type),"."!==this.separator&&(e.separator=this.separator),"value"===this.type&&1!==this.ancestor&&(e.ancestor=this.ancestor),this.map&&(e.map=[...this.map]);for(const t of["adjust","iterables"])null!==this[t]&&(e[t]=this[t]);return!1!==this.in&&(e.in=!0),{ref:e}}updateDisplay(){const e=null!==this.key?this.key:"";if("value"!==this.type)return 
void(this.display="ref:".concat(this.type,":").concat(e));if(!this.separator)return void(this.display="ref:".concat(e));if(!this.ancestor)return void(this.display="ref:".concat(this.separator).concat(e));if("root"===this.ancestor)return void(this.display="ref:root:".concat(e));if(1===this.ancestor)return void(this.display="ref:".concat(e||".."));const t=new Array(this.ancestor+1).fill(this.separator).join("");this.display="ref:".concat(t).concat(e||"")}},l.Ref.prototype[a.symbols.ref]=!0,t.build=function(e){return"value"===(e=Object.assign({},l.defaults,e)).type&&void 0===e.ancestor&&(e.ancestor=1),new l.Ref(e)},l.context=function(e,t,r={}){if(e=e.trim(),r){const s=void 0===r.global?"$":r.global;if(s!==t&&e.startsWith(s))return{key:e.slice(s.length),type:"global"};const n=void 0===r.local?"#":r.local;if(n!==t&&e.startsWith(n))return{key:e.slice(n.length),type:"local"};const o=void 0===r.root?"/":r.root;if(o!==t&&e.startsWith(o))return{key:e.slice(o.length),type:"value",root:!0}}return{key:e,type:"value"}},l.ancestor=function(e,t){if(!t)return[1,0];if(e[0]!==t)return[1,0];if(e[1]!==t)return[0,1];let r=2;for(;e[r]===t;)++r;return[r-1,r]},t.toSibling=0,t.toParent=1,t.Manager=class{constructor(){this.refs=[]}register(e,s){if(e)if(s=void 0===s?t.toParent:s,Array.isArray(e))for(const t of e)this.register(t,s);else if(a.isSchema(e))for(const t of e._refs.refs)t.ancestor-s>=0&&this.refs.push({ancestor:t.ancestor-s,root:t.root});else t.isRef(e)&&"value"===e.type&&e.ancestor-s>=0&&this.refs.push({ancestor:e.ancestor-s,root:e.root}),i=i||r(7),i.isTemplate(e)&&this.register(e.refs(),s)}get length(){return this.refs.length}clone(){const e=new t.Manager;return e.refs=n(this.refs),e}reset(){this.refs=[]}roots(){return this.refs.filter(e=>!e.ancestor).map(e=>e.root)}}},function(e,t,r){"use strict";const s=r(0),n={};e.exports=function(e,t,r){if(!1===t||null==t)return e;"string"==typeof(r=r||{})&&(r={separator:r});const o=Array.isArray(t);s(!o||!r.separator,"Separator option no valid for array-based chain");const a=o?t:t.split(r.separator||".");let i=e;for(let e=0;e{const t=c.create(e,this._settings);return r.push(t),e=>t.resolve(...e)};try{var n=new a.Parser(e,{reference:s,functions:u.functions,constants:u.constants})}catch(t){throw t.message='Invalid template variable "'.concat(e,'" fails due to: ').concat(t.message),t}return n.single?"reference"===n.single.type?{ref:r[0],raw:t,refs:r}:u.stringify(n.single.value):{formula:n,raw:t,refs:r}}toString(){return this.source}},u.Template.prototype[i.symbols.template]=!0,u.Template.prototype.isImmutable=!0,u.encode=function(e){return e.replace(/\\(\{+)/g,(e,t)=>u.opens.slice(0,t.length)).replace(/\\(\}+)/g,(e,t)=>u.closes.slice(0,t.length))},u.decode=function(e){return e.replace(/\u0000/g,"{").replace(/\u0001/g,"}")},u.split=function(e){const t=[];let r="";for(let s=0;s ").concat(s.toString()));e=t}if(!Array.isArray(e))return e.toString();let n="";for(const s of e)n=n+(n.length?", ":"")+u.stringify(s,t,r);return u.wrap(n,t.errors.wrap.array)},u.constants={true:!0,false:!1,null:null,second:1e3,minute:6e4,hour:36e5,day:864e5},u.functions={if:(e,t,r)=>e?t:r,msg(e){const[t,r,s,n,o]=this,a=o.messages;if(!a)return"";const i=l.template(t,a[0],e,r,s)||l.template(t,a[1],e,r,s);return i?i.render(t,r,s,n,o):""},number:e=>"number"==typeof e?e:"string"==typeof e?parseFloat(e):"boolean"==typeof e?e?1:0:e instanceof Date?e.getTime():null}},function(e,t,r){"use strict";const 
[Vendored dependency: minified browser bundle of the joi validation library (`@hapi/joi`, with its bundled `@hapi/hoek` utilities). The bundle contains joi's internals: schema compilation (`compile`, `ref`, `when`), deep-equal/merge/clone and cache helpers, the extension and manifest/description system, the core schema base class, validator, and validation state tracking, and the built-in `alternatives`, `array`, `boolean`, `date`, `function`, `link`, `number`, `object`, and `string` types with their rules and error messages.]
")),this.$_addRule({name:"normalize",args:{form:e}})},validate:(e,{error:t},{form:r})=>e===e.normalize(r)?e:t("string.normalize",{value:e,form:r}),convert:!0},pattern:{alias:"regex",method(e,t={}){a(e instanceof RegExp,"regex must be a RegExp"),a(!e.flags.includes("g")&&!e.flags.includes("y"),"regex should not use global or sticky mode"),"string"==typeof t&&(t={name:t}),d.assertOptions(t,["invert","name"]);const r=["string.pattern",t.invert?".invert":"",t.name?".name":".base"].join("");return this.$_addRule({name:"pattern",args:{regex:e,options:t},errorCode:r})},validate:(e,t,{regex:r,options:s},{errorCode:n})=>r.test(e)^s.invert?e:t.error(n,{name:s.name,regex:r,value:e}),args:["regex","options"],multi:!0},replace:{method(e,t){"string"==typeof e&&(e=new RegExp(u(e),"g")),a(e instanceof RegExp,"pattern must be a RegExp"),a("string"==typeof t,"replacement must be a String");const r=this.clone();return r.$_terms.replacements||(r.$_terms.replacements=[]),r.$_terms.replacements.push({pattern:e,replacement:t}),r}},token:{method(){return this.$_addRule("token")},validate:(e,t)=>/^\w+$/.test(e)?e:t.error("string.token")},trim:{method(e=!0){return a("boolean"==typeof e,"enabled must be a boolean"),this.$_addRule({name:"trim",args:{enabled:e}})},validate:(e,t,{enabled:r})=>r&&e!==e.trim()?t.error("string.trim"):e,convert:!0},truncate:{method(e=!0){return a("boolean"==typeof e,"enabled must be a boolean"),this.$_setFlag("truncate",e)}},uppercase:{method(){return this.case("upper")}},uri:{method(e={}){d.assertOptions(e,["allowRelative","allowQuerySquareBrackets","domain","relativeOnly","scheme"]),e.domain&&d.assertOptions(e.domain,["allowUnicode","minDomainSegments","tlds"]);const{regex:t,scheme:r}=m.regex(e),s=e.domain?p.addressOptions(e.domain):null;return this.$_addRule({name:"uri",args:{options:e},regex:t,domain:s,scheme:r})},validate(e,t,{options:r},{regex:s,domain:n,scheme:o}){if(["http:/","https:/"].includes(e))return t.error("string.uri");const a=s.exec(e);if(a){if(n){const e=a[1]||a[2];if(!i.isValid(e,n))return t.error("string.domain",{value:e})}return e}return r.relativeOnly?t.error("string.uriRelativeOnly"):r.scheme?t.error("string.uriCustomScheme",{scheme:o,value:e}):t.error("string.uri")}}},manifest:{build(e,t){if(t.replacements)for(const{pattern:r,replacement:s}of t.replacements)e=e.replace(r,s);return e}},messages:{"string.alphanum":"{{#label}} must only contain alpha-numeric characters","string.base":"{{#label}} must be a string","string.base64":"{{#label}} must be a valid base64 string","string.creditCard":"{{#label}} must be a credit card","string.dataUri":"{{#label}} must be a valid dataUri string","string.domain":"{{#label}} must contain a valid domain name","string.email":"{{#label}} must be a valid email","string.empty":"{{#label}} is not allowed to be empty","string.guid":"{{#label}} must be a valid GUID","string.hex":"{{#label}} must only contain hexadecimal characters","string.hexAlign":"{{#label}} hex decoded representation must be byte aligned","string.hostname":"{{#label}} must be a valid hostname","string.ip":"{{#label}} must be a valid ip address with a {{#cidr}} CIDR","string.ipVersion":"{{#label}} must be a valid ip address of one of the following versions {{#version}} with a {{#cidr}} CIDR","string.isoDate":"{{#label}} must be in iso format","string.isoDuration":"{{#label}} must be a valid ISO 8601 duration","string.length":"{{#label}} length must be {{#limit}} characters long","string.lowercase":"{{#label}} must only contain lowercase 
characters","string.max":"{{#label}} length must be less than or equal to {{#limit}} characters long","string.min":"{{#label}} length must be at least {{#limit}} characters long","string.normalize":"{{#label}} must be unicode normalized in the {{#form}} form","string.token":"{{#label}} must only contain alpha-numeric and underscore characters","string.pattern.base":'{{#label}} with value "{[.]}" fails to match the required pattern: {{#regex}}',"string.pattern.name":'{{#label}} with value "{[.]}" fails to match the {{#name}} pattern',"string.pattern.invert.base":'{{#label}} with value "{[.]}" matches the inverted pattern: {{#regex}}',"string.pattern.invert.name":'{{#label}} with value "{[.]}" matches the inverted {{#name}} pattern',"string.trim":"{{#label}} must not have leading or trailing whitespace","string.uri":"{{#label}} must be a valid uri","string.uriCustomScheme":"{{#label}} must be a valid uri with a scheme matching the {{#scheme}} pattern","string.uriRelativeOnly":"{{#label}} must be a valid relative uri","string.uppercase":"{{#label}} must only contain uppercase characters"}}),p.addressOptions=function(e){if(!e)return e;if(a(void 0===e.minDomainSegments||Number.isSafeInteger(e.minDomainSegments)&&e.minDomainSegments>0,"minDomainSegments must be a positive integer"),!1===e.tlds)return e;if(!0===e.tlds||void 0===e.tlds)return a(p.tlds,"Built-in TLD list disabled"),Object.assign({},e,p.tlds);a("object"==typeof e.tlds,"tlds must be true, false, or an object");const t=e.tlds.deny;if(t)return Array.isArray(t)&&(e=Object.assign({},e,{tlds:{deny:new Set(t)}})),a(e.tlds.deny instanceof Set,"tlds.deny must be an array, Set, or boolean"),a(!e.tlds.allow,"Cannot specify both tlds.allow and tlds.deny lists"),e;const r=e.tlds.allow;return r?!0===r?(a(p.tlds,"Built-in TLD list disabled"),Object.assign({},e,p.tlds)):(Array.isArray(r)&&(e=Object.assign({},e,{tlds:{allow:new Set(r)}})),a(e.tlds.allow instanceof Set,"tlds.allow must be an array, Set, or boolean"),e):e},p.isoDate=function(e){if(!d.isIsoDate(e))return null;const t=new Date(e);return isNaN(t.getTime())?null:t.toISOString()},p.length=function(e,t,r,s,n){return a(!n||!1,"Invalid encoding:",n),e.$_addRule({name:t,method:"length",args:{limit:r,encoding:n},operator:s})}},function(e,t,r){"use strict";const s=r(24),n=r(23),o=r(25),a={nonAsciiRx:/[^\x00-\x7f]/,encoder:new(s.TextEncoder||TextEncoder)};t.analyze=function(e,t){return a.email(e,t)},t.isValid=function(e,t){return!a.email(e,t)},a.email=function(e,t={}){if("string"!=typeof e)throw new Error("Invalid input: email must be a string");if(!e)return o.code("EMPTY_STRING");const r=!a.nonAsciiRx.test(e);if(!r){if(!1===t.allowUnicode)return o.code("FORBIDDEN_UNICODE");e=e.normalize("NFC")}const s=e.split("@");if(2!==s.length)return s.length>2?o.code("MULTIPLE_AT_CHAR"):o.code("MISSING_AT_CHAR");const[i,l]=s;if(!i)return o.code("EMPTY_LOCAL");if(!t.ignoreLength){if(e.length>254)return o.code("ADDRESS_TOO_LONG");if(a.encoder.encode(i).length>64)return o.code("LOCAL_TOO_LONG")}return a.local(i,r)||n.analyze(l,t)},a.local=function(e,t){const r=e.split(".");for(const e of r){if(!e.length)return o.code("EMPTY_LOCAL_SEGMENT");if(t){if(!a.atextRx.test(e))return o.code("INVALID_LOCAL_CHARS")}else for(const t of e){if(a.atextRx.test(t))continue;const e=a.binary(t);if(!a.atomRx.test(e))return o.code("INVALID_LOCAL_CHARS")}}},a.binary=function(e){return Array.from(a.encoder.encode(e)).map(e=>String.fromCharCode(e)).join("")},a.atextRx=/^[\w!#\$%&'\*\+\-/=\?\^`\{\|\}~]+$/,a.atomRx=new 
RegExp(["(?:[\\xc2-\\xdf][\\x80-\\xbf])","(?:\\xe0[\\xa0-\\xbf][\\x80-\\xbf])|(?:[\\xe1-\\xec][\\x80-\\xbf]{2})|(?:\\xed[\\x80-\\x9f][\\x80-\\xbf])|(?:[\\xee-\\xef][\\x80-\\xbf]{2})","(?:\\xf0[\\x90-\\xbf][\\x80-\\xbf]{2})|(?:[\\xf1-\\xf3][\\x80-\\xbf]{3})|(?:\\xf4[\\x80-\\x8f][\\x80-\\xbf]{2})"].join("|"))},function(e,t,r){"use strict";const s=r(0),n=r(26);t.regex=function(e={}){s(void 0===e.cidr||"string"==typeof e.cidr,"options.cidr must be a string");const t=e.cidr?e.cidr.toLowerCase():"optional";s(["required","optional","forbidden"].includes(t),"options.cidr must be one of required, optional, forbidden"),s(void 0===e.version||"string"==typeof e.version||Array.isArray(e.version),"options.version must be a string or an array of string");let r=e.version||["ipv4","ipv6","ipvfuture"];Array.isArray(r)||(r=[r]),s(r.length>=1,"options.version must have at least 1 version specified");for(let e=0;e{if("forbidden"===t)return n.ip[e];const r="\\/".concat("ipv4"===e?n.ip.v4Cidr:n.ip.v6Cidr);return"required"===t?"".concat(n.ip[e]).concat(r):"".concat(n.ip[e],"(?:").concat(r,")?")}),a="(?:".concat(o.join("|"),")"),i=new RegExp("^".concat(a,"$"));return{cidr:t,versions:r,regex:i,raw:a}}},function(e,t){},function(e,t,r){"use strict";const s=r(0),n=r(3),o={};o.Map=class extends Map{slice(){return new o.Map(this)}},e.exports=n.extend({type:"symbol",terms:{map:{init:new o.Map}},coerce:{method(e,{schema:t,error:r}){const s=t.$_terms.map.get(e);return s&&(e=s),t._flags.only&&"symbol"!=typeof e?{value:e,errors:r("symbol.map",{map:t.$_terms.map})}:{value:e}}},validate(e,{error:t}){if("symbol"!=typeof e)return{value:e,errors:t("symbol.base")}},rules:{map:{method(e){e&&!e[Symbol.iterator]&&"object"==typeof e&&(e=Object.entries(e)),s(e&&e[Symbol.iterator],"Iterable must be an iterable or object");const t=this.clone(),r=[];for(const n of e){s(n&&n[Symbol.iterator],"Entry must be an iterable");const[e,o]=n;s("object"!=typeof e&&"function"!=typeof e&&"symbol"!=typeof e,"Key must not be of type object, function, or Symbol"),s("symbol"==typeof o,"Value must be a Symbol"),t.$_terms.map.set(e,o),r.push(o)}return t.valid(...r)}}},manifest:{build:(e,t)=>(t.map&&(e=e.map(t.map)),e)},messages:{"symbol.base":"{{#label}} must be a symbol","symbol.map":"{{#label}} must be one of {{#map}}"}})}])})); \ No newline at end of file diff --git a/node_modules/@hapi/joi/lib/annotate.js b/node_modules/@hapi/joi/lib/annotate.js new file mode 100644 index 000000000..42798fc3e --- /dev/null +++ b/node_modules/@hapi/joi/lib/annotate.js @@ -0,0 +1,175 @@ +'use strict'; + +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); + + +const internals = { + annotations: Symbol('annotations') +}; + + +exports.error = function (stripColorCodes) { + + if (!this._original || + typeof this._original !== 'object') { + + return this.details[0].message; + } + + const redFgEscape = stripColorCodes ? '' : '\u001b[31m'; + const redBgEscape = stripColorCodes ? '' : '\u001b[41m'; + const endColor = stripColorCodes ? 
'' : '\u001b[0m'; + + const obj = Clone(this._original); + + for (let i = this.details.length - 1; i >= 0; --i) { // Reverse order to process deepest child first + const pos = i + 1; + const error = this.details[i]; + const path = error.path; + let node = obj; + for (let j = 0; ; ++j) { + const seg = path[j]; + + if (Common.isSchema(node)) { + node = node.clone(); // joi schemas are not cloned by hoek, we have to take this extra step + } + + if (j + 1 < path.length && + typeof node[seg] !== 'string') { + + node = node[seg]; + } + else { + const refAnnotations = node[internals.annotations] || { errors: {}, missing: {} }; + node[internals.annotations] = refAnnotations; + + const cacheKey = seg || error.context.key; + + if (node[seg] !== undefined) { + refAnnotations.errors[cacheKey] = refAnnotations.errors[cacheKey] || []; + refAnnotations.errors[cacheKey].push(pos); + } + else { + refAnnotations.missing[cacheKey] = pos; + } + + break; + } + } + } + + const replacers = { + key: /_\$key\$_([, \d]+)_\$end\$_"/g, + missing: /"_\$miss\$_([^|]+)\|(\d+)_\$end\$_": "__missing__"/g, + arrayIndex: /\s*"_\$idx\$_([, \d]+)_\$end\$_",?\n(.*)/g, + specials: /"\[(NaN|Symbol.*|-?Infinity|function.*|\(.*)]"/g + }; + + let message = internals.safeStringify(obj, 2) + .replace(replacers.key, ($0, $1) => `" ${redFgEscape}[${$1}]${endColor}`) + .replace(replacers.missing, ($0, $1, $2) => `${redBgEscape}"${$1}"${endColor}${redFgEscape} [${$2}]: -- missing --${endColor}`) + .replace(replacers.arrayIndex, ($0, $1, $2) => `\n${$2} ${redFgEscape}[${$1}]${endColor}`) + .replace(replacers.specials, ($0, $1) => $1); + + message = `${message}\n${redFgEscape}`; + + for (let i = 0; i < this.details.length; ++i) { + const pos = i + 1; + message = `${message}\n[${pos}] ${this.details[i].message}`; + } + + message = message + endColor; + + return message; +}; + + +// Inspired by json-stringify-safe + +internals.safeStringify = function (obj, spaces) { + + return JSON.stringify(obj, internals.serializer(), spaces); +}; + + +internals.serializer = function () { + + const keys = []; + const stack = []; + + const cycleReplacer = (key, value) => { + + if (stack[0] === value) { + return '[Circular ~]'; + } + + return '[Circular ~.' 
+ keys.slice(0, stack.indexOf(value)).join('.') + ']'; + }; + + return function (key, value) { + + if (stack.length > 0) { + const thisPos = stack.indexOf(this); + if (~thisPos) { + stack.length = thisPos + 1; + keys.length = thisPos + 1; + keys[thisPos] = key; + } + else { + stack.push(this); + keys.push(key); + } + + if (~stack.indexOf(value)) { + value = cycleReplacer.call(this, key, value); + } + } + else { + stack.push(value); + } + + if (value) { + const annotations = value[internals.annotations]; + if (annotations) { + if (Array.isArray(value)) { + const annotated = []; + + for (let i = 0; i < value.length; ++i) { + if (annotations.errors[i]) { + annotated.push(`_$idx$_${annotations.errors[i].sort().join(', ')}_$end$_`); + } + + annotated.push(value[i]); + } + + value = annotated; + } + else { + for (const errorKey in annotations.errors) { + value[`${errorKey}_$key$_${annotations.errors[errorKey].sort().join(', ')}_$end$_`] = value[errorKey]; + value[errorKey] = undefined; + } + + for (const missingKey in annotations.missing) { + value[`_$miss$_${missingKey}|${annotations.missing[missingKey]}_$end$_`] = '__missing__'; + } + } + + return value; + } + } + + if (value === Infinity || + value === -Infinity || + Number.isNaN(value) || + typeof value === 'function' || + typeof value === 'symbol') { + + return '[' + value.toString() + ']'; + } + + return value; + }; +}; diff --git a/node_modules/@hapi/joi/lib/base.js b/node_modules/@hapi/joi/lib/base.js new file mode 100644 index 000000000..4a75b15aa --- /dev/null +++ b/node_modules/@hapi/joi/lib/base.js @@ -0,0 +1,1033 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); +const Merge = require('@hapi/hoek/lib/merge'); + +const Cache = require('./cache'); +const Common = require('./common'); +const Compile = require('./compile'); +const Errors = require('./errors'); +const Extend = require('./extend'); +const Manifest = require('./manifest'); +const Messages = require('./messages'); +const Modify = require('./modify'); +const Ref = require('./ref'); +const Trace = require('./trace'); +const Validator = require('./validator'); +const Values = require('./values'); + + +const internals = {}; + + +internals.Base = class { + + constructor(type) { + + // Naming: public, _private, $_extension, $_mutate{action} + + this.type = type; + + this.$_root = null; + this._definition = {}; + this._ids = new Modify.Ids(); + this._preferences = null; + this._refs = new Ref.Manager(); + this._cache = null; + + this._valids = null; + this._invalids = null; + + this._flags = {}; + this._rules = []; + this._singleRules = new Map(); // The rule options passed for non-multi rules + + this.$_terms = {}; // Hash of arrays of immutable objects (extended by other types) + + this.$_temp = { // Runtime state (not cloned) + ruleset: null, // null: use last, false: error, number: start position + whens: {} // Runtime cache of generated whens + }; + } + + // Manifest + + describe() { + + Assert(typeof Manifest.describe === 'function', 'Manifest functionality disabled'); + return Manifest.describe(this); + } + + // Rules + + allow(...values) { + + Common.verifyFlat(values, 'allow'); + return this._values(values, '_valids'); + } + + alter(targets) { + + Assert(targets && typeof targets === 'object' && !Array.isArray(targets), 'Invalid targets argument'); + Assert(!this._inRuleset(), 'Cannot set alterations inside a ruleset'); + + const obj = this.clone(); 
+ obj.$_terms.alterations = obj.$_terms.alterations || []; + for (const target in targets) { + const adjuster = targets[target]; + Assert(typeof adjuster === 'function', 'Alteration adjuster for', target, 'must be a function'); + obj.$_terms.alterations.push({ target, adjuster }); + } + + obj.$_temp.ruleset = false; + return obj; + } + + cast(to) { + + Assert(to === false || typeof to === 'string', 'Invalid to value'); + Assert(to === false || this._definition.cast[to], 'Type', this.type, 'does not support casting to', to); + + return this.$_setFlag('cast', to === false ? undefined : to); + } + + default(value, options) { + + return this._default('default', value, options); + } + + description(desc) { + + Assert(desc && typeof desc === 'string', 'Description must be a non-empty string'); + + return this.$_setFlag('description', desc); + } + + empty(schema) { + + const obj = this.clone(); + + if (schema !== undefined) { + schema = obj.$_compile(schema, { override: false }); + } + + return obj.$_setFlag('empty', schema, { clone: false }); + } + + error(err) { + + Assert(err, 'Missing error'); + Assert(err instanceof Error || typeof err === 'function', 'Must provide a valid Error object or a function'); + + return this.$_setFlag('error', err); + } + + example(example, options = {}) { + + Assert(example !== undefined, 'Missing example'); + Common.assertOptions(options, ['override']); + + return this._inner('examples', example, { single: true, override: options.override }); + } + + external(method, description) { + + if (typeof method === 'object') { + Assert(!description, 'Cannot combine options with description'); + description = method.description; + method = method.method; + } + + Assert(typeof method === 'function', 'Method must be a function'); + Assert(description === undefined || description && typeof description === 'string', 'Description must be a non-empty string'); + + return this._inner('externals', { method, description }, { single: true }); + } + + failover(value, options) { + + return this._default('failover', value, options); + } + + forbidden() { + + return this.presence('forbidden'); + } + + id(id) { + + if (!id) { + return this.$_setFlag('id', undefined); + } + + Assert(typeof id === 'string', 'id must be a non-empty string'); + Assert(/^[^\.]+$/.test(id), 'id cannot contain period character'); + + return this.$_setFlag('id', id); + } + + invalid(...values) { + + return this._values(values, '_invalids'); + } + + label(name) { + + Assert(name && typeof name === 'string', 'Label name must be a non-empty string'); + + return this.$_setFlag('label', name); + } + + meta(meta) { + + Assert(meta !== undefined, 'Meta cannot be undefined'); + + return this._inner('metas', meta, { single: true }); + } + + note(...notes) { + + Assert(notes.length, 'Missing notes'); + for (const note of notes) { + Assert(note && typeof note === 'string', 'Notes must be non-empty strings'); + } + + return this._inner('notes', notes); + } + + only(mode = true) { + + Assert(typeof mode === 'boolean', 'Invalid mode:', mode); + + return this.$_setFlag('only', mode); + } + + optional() { + + return this.presence('optional'); + } + + prefs(prefs) { + + Assert(prefs, 'Missing preferences'); + Assert(prefs.context === undefined, 'Cannot override context'); + Assert(prefs.externals === undefined, 'Cannot override externals'); + Assert(prefs.warnings === undefined, 'Cannot override warnings'); + Assert(prefs.debug === undefined, 'Cannot override debug'); + + Common.checkPreferences(prefs); + + const obj = 
this.clone(); + obj._preferences = Common.preferences(obj._preferences, prefs); + return obj; + } + + presence(mode) { + + Assert(['optional', 'required', 'forbidden'].includes(mode), 'Unknown presence mode', mode); + + return this.$_setFlag('presence', mode); + } + + raw(enabled = true) { + + return this.$_setFlag('result', enabled ? 'raw' : undefined); + } + + result(mode) { + + Assert(['raw', 'strip'].includes(mode), 'Unknown result mode', mode); + + return this.$_setFlag('result', mode); + } + + required() { + + return this.presence('required'); + } + + strict(enabled) { + + const obj = this.clone(); + + const convert = enabled === undefined ? false : !enabled; + obj._preferences = Common.preferences(obj._preferences, { convert }); + return obj; + } + + strip(enabled = true) { + + return this.$_setFlag('result', enabled ? 'strip' : undefined); + } + + tag(...tags) { + + Assert(tags.length, 'Missing tags'); + for (const tag of tags) { + Assert(tag && typeof tag === 'string', 'Tags must be non-empty strings'); + } + + return this._inner('tags', tags); + } + + unit(name) { + + Assert(name && typeof name === 'string', 'Unit name must be a non-empty string'); + + return this.$_setFlag('unit', name); + } + + valid(...values) { + + Common.verifyFlat(values, 'valid'); + + const obj = this.allow(...values); + obj.$_setFlag('only', !!obj._valids, { clone: false }); + return obj; + } + + when(condition, options) { + + const obj = this.clone(); + + if (!obj.$_terms.whens) { + obj.$_terms.whens = []; + } + + const when = Compile.when(obj, condition, options); + if (!['any', 'link'].includes(obj.type)) { + const conditions = when.is ? [when] : when.switch; + for (const item of conditions) { + Assert(!item.then || item.then.type === 'any' || item.then.type === obj.type, 'Cannot combine', obj.type, 'with', item.then && item.then.type); + Assert(!item.otherwise || item.otherwise.type === 'any' || item.otherwise.type === obj.type, 'Cannot combine', obj.type, 'with', item.otherwise && item.otherwise.type); + + } + } + + obj.$_terms.whens.push(when); + return obj.$_mutateRebuild(); + } + + // Helpers + + cache(cache) { + + Assert(!this._inRuleset(), 'Cannot set caching inside a ruleset'); + Assert(!this._cache, 'Cannot override schema cache'); + + const obj = this.clone(); + obj._cache = cache || Cache.provider.provision(); + obj.$_temp.ruleset = false; + return obj; + } + + clone() { + + const obj = Object.create(Object.getPrototypeOf(this)); + return this._assign(obj); + } + + concat(source) { + + Assert(Common.isSchema(source), 'Invalid schema object'); + Assert(this.type === 'any' || source.type === 'any' || source.type === this.type, 'Cannot merge type', this.type, 'with another type:', source.type); + Assert(!this._inRuleset(), 'Cannot concatenate onto a schema with open ruleset'); + Assert(!source._inRuleset(), 'Cannot concatenate a schema with open ruleset'); + + let obj = this.clone(); + + if (this.type === 'any' && + source.type !== 'any') { + + // Change obj to match source type + + const tmpObj = source.clone(); + for (const key of Object.keys(obj)) { + if (key !== 'type') { + tmpObj[key] = obj[key]; + } + } + + obj = tmpObj; + } + + obj._ids.concat(source._ids); + obj._refs.register(source, Ref.toSibling); + + obj._preferences = obj._preferences ? 
Common.preferences(obj._preferences, source._preferences) : source._preferences; + obj._valids = Values.merge(obj._valids, source._valids, source._invalids); + obj._invalids = Values.merge(obj._invalids, source._invalids, source._valids); + + // Remove unique rules present in source + + for (const name of source._singleRules.keys()) { + if (obj._singleRules.has(name)) { + obj._rules = obj._rules.filter((target) => target.keep || target.name !== name); + obj._singleRules.delete(name); + } + } + + // Rules + + for (const test of source._rules) { + if (!source._definition.rules[test.method].multi) { + obj._singleRules.set(test.name, test); + } + + obj._rules.push(test); + } + + // Flags + + if (obj._flags.empty && + source._flags.empty) { + + obj._flags.empty = obj._flags.empty.concat(source._flags.empty); + const flags = Object.assign({}, source._flags); + delete flags.empty; + Merge(obj._flags, flags); + } + else if (source._flags.empty) { + obj._flags.empty = source._flags.empty; + const flags = Object.assign({}, source._flags); + delete flags.empty; + Merge(obj._flags, flags); + } + else { + Merge(obj._flags, source._flags); + } + + // Terms + + for (const key in source.$_terms) { + const terms = source.$_terms[key]; + if (!terms) { + if (!obj.$_terms[key]) { + obj.$_terms[key] = terms; + } + + continue; + } + + if (!obj.$_terms[key]) { + obj.$_terms[key] = terms.slice(); + continue; + } + + obj.$_terms[key] = obj.$_terms[key].concat(terms); + } + + // Tracing + + if (this.$_root._tracer) { + this.$_root._tracer._combine(obj, [this, source]); + } + + // Rebuild + + return obj.$_mutateRebuild(); + } + + extend(options) { + + Assert(!options.base, 'Cannot extend type with another base'); + + return Extend.type(this, options); + } + + extract(path) { + + path = Array.isArray(path) ? path : path.split('.'); + return this._ids.reach(path); + } + + fork(paths, adjuster) { + + Assert(!this._inRuleset(), 'Cannot fork inside a ruleset'); + + let obj = this; // eslint-disable-line consistent-this + for (let path of [].concat(paths)) { + path = Array.isArray(path) ? path : path.split('.'); + obj = obj._ids.fork(path, adjuster, obj); + } + + obj.$_temp.ruleset = false; + return obj; + } + + rule(options) { + + const def = this._definition; + Common.assertOptions(options, Object.keys(def.modifiers)); + + Assert(this.$_temp.ruleset !== false, 'Cannot apply rules to empty ruleset or the last rule added does not support rule properties'); + const start = this.$_temp.ruleset === null ? 
this._rules.length - 1 : this.$_temp.ruleset; + Assert(start >= 0 && start < this._rules.length, 'Cannot apply rules to empty ruleset'); + + const obj = this.clone(); + + for (let i = start; i < obj._rules.length; ++i) { + const original = obj._rules[i]; + const rule = Clone(original); + + for (const name in options) { + def.modifiers[name](rule, options[name]); + Assert(rule.name === original.name, 'Cannot change rule name'); + } + + obj._rules[i] = rule; + + if (obj._singleRules.get(rule.name) === original) { + obj._singleRules.set(rule.name, rule); + } + } + + obj.$_temp.ruleset = false; + return obj.$_mutateRebuild(); + } + + get ruleset() { + + Assert(!this._inRuleset(), 'Cannot start a new ruleset without closing the previous one'); + + const obj = this.clone(); + obj.$_temp.ruleset = obj._rules.length; + return obj; + } + + get $() { + + return this.ruleset; + } + + tailor(targets) { + + targets = [].concat(targets); + + Assert(!this._inRuleset(), 'Cannot tailor inside a ruleset'); + + let obj = this; // eslint-disable-line consistent-this + + if (this.$_terms.alterations) { + for (const { target, adjuster } of this.$_terms.alterations) { + if (targets.includes(target)) { + obj = adjuster(obj); + Assert(Common.isSchema(obj), 'Alteration adjuster for', target, 'failed to return a schema object'); + } + } + } + + obj = obj.$_modify({ each: (item) => item.tailor(targets), ref: false }); + obj.$_temp.ruleset = false; + return obj.$_mutateRebuild(); + } + + tracer() { + + return Trace.location ? Trace.location(this) : this; // $lab:coverage:ignore$ + } + + validate(value, options) { + + return Validator.entry(value, this, options); + } + + validateAsync(value, options) { + + return Validator.entryAsync(value, this, options); + } + + // Extensions + + $_addRule(options) { + + // Normalize rule + + if (typeof options === 'string') { + options = { name: options }; + } + + Assert(options && typeof options === 'object', 'Invalid options'); + Assert(options.name && typeof options.name === 'string', 'Invalid rule name'); + + for (const key in options) { + Assert(key[0] !== '_', 'Cannot set private rule properties'); + } + + const rule = Object.assign({}, options); // Shallow cloned + rule._resolve = []; + rule.method = rule.method || rule.name; + + const definition = this._definition.rules[rule.method]; + const args = rule.args; + + Assert(definition, 'Unknown rule', rule.method); + + // Args + + const obj = this.clone(); + + if (args) { + Assert(Object.keys(args).length === 1 || Object.keys(args).length === this._definition.rules[rule.name].args.length, 'Invalid rule definition for', this.type, rule.name); + + for (const key in args) { + let arg = args[key]; + if (arg === undefined) { + delete args[key]; + continue; + } + + if (definition.argsByName) { + const resolver = definition.argsByName.get(key); + + if (resolver.ref && + Common.isResolvable(arg)) { + + rule._resolve.push(key); + obj.$_mutateRegister(arg); + } + else { + if (resolver.normalize) { + arg = resolver.normalize(arg); + args[key] = arg; + } + + if (resolver.assert) { + const error = Common.validateArg(arg, key, resolver); + Assert(!error, error, 'or reference'); + } + } + } + + args[key] = arg; + } + } + + // Unique rules + + if (!definition.multi) { + obj._ruleRemove(rule.name, { clone: false }); + obj._singleRules.set(rule.name, rule); + } + + if (obj.$_temp.ruleset === false) { + obj.$_temp.ruleset = null; + } + + if (definition.priority) { + obj._rules.unshift(rule); + } + else { + obj._rules.push(rule); + } + + return 
obj; + } + + $_compile(schema, options) { + + return Compile.schema(this.$_root, schema, options); + } + + $_createError(code, value, local, state, prefs, options = {}) { + + const flags = options.flags !== false ? this._flags : {}; + const messages = options.messages ? Messages.merge(this._definition.messages, options.messages) : this._definition.messages; + return new Errors.Report(code, value, local, flags, messages, state, prefs); + } + + $_getFlag(name) { + + return this._flags[name]; + } + + $_getRule(name) { + + return this._singleRules.get(name); + } + + $_mapLabels(path) { + + path = Array.isArray(path) ? path : path.split('.'); + return this._ids.labels(path); + } + + $_match(value, state, prefs, overrides) { + + prefs = Object.assign({}, prefs); // Shallow cloned + prefs.abortEarly = true; + prefs._externals = false; + + state.snapshot(); + const result = !Validator.validate(value, this, state, prefs, overrides).errors; + state.restore(); + + return result; + } + + $_modify(options) { + + Common.assertOptions(options, ['each', 'once', 'ref', 'schema']); + return Modify.schema(this, options) || this; + } + + $_mutateRebuild() { + + Assert(!this._inRuleset(), 'Cannot add this rule inside a ruleset'); + + this._refs.reset(); + this._ids.reset(); + + const each = (item, { source, name, path, key }) => { + + const family = this._definition[source][name] && this._definition[source][name].register; + if (family !== false) { + this.$_mutateRegister(item, { family, key }); + } + }; + + this.$_modify({ each }); + + if (this._definition.rebuild) { + this._definition.rebuild(this); + } + + this.$_temp.ruleset = false; + return this; + } + + $_mutateRegister(schema, { family, key } = {}) { + + this._refs.register(schema, family); + this._ids.register(schema, { key }); + } + + $_property(name) { + + return this._definition.properties[name]; + } + + $_reach(path) { + + return this._ids.reach(path); + } + + $_rootReferences() { + + return this._refs.roots(); + } + + $_setFlag(name, value, options = {}) { + + Assert(name[0] === '_' || !this._inRuleset(), 'Cannot set flag inside a ruleset'); + + const flag = this._definition.flags[name] || {}; + if (DeepEqual(value, flag.default)) { + value = undefined; + } + + if (DeepEqual(value, this._flags[name])) { + return this; + } + + const obj = options.clone !== false ? this.clone() : this; + + if (value !== undefined) { + obj._flags[name] = value; + obj.$_mutateRegister(value); + } + else { + delete obj._flags[name]; + } + + if (name[0] !== '_') { + obj.$_temp.ruleset = false; + } + + return obj; + } + + $_validate(value, state, prefs) { + + return Validator.validate(value, this, state, prefs); + } + + // Internals + + _assign(target) { + + target.type = this.type; + + target.$_root = this.$_root; + + target.$_temp = Object.assign({}, this.$_temp); + target.$_temp.whens = {}; + + target._ids = this._ids.clone(); + target._preferences = this._preferences; + target._valids = this._valids && this._valids.clone(); + target._invalids = this._invalids && this._invalids.clone(); + target._rules = this._rules.slice(); + target._singleRules = Clone(this._singleRules, { shallow: true }); + target._refs = this._refs.clone(); + target._flags = Object.assign({}, this._flags); + target._cache = null; + + target.$_terms = {}; + for (const key in this.$_terms) { + target.$_terms[key] = this.$_terms[key] ? 
this.$_terms[key].slice() : null; + } + + target.$_super = {}; + for (const override in this.$_super) { + target.$_super[override] = this._super[override].bind(target); + } + + return target; + } + + _default(flag, value, options = {}) { + + Common.assertOptions(options, 'literal'); + + Assert(value !== undefined, 'Missing', flag, 'value'); + Assert(typeof value === 'function' || !options.literal, 'Only function value supports literal option'); + + if (typeof value === 'function' && + options.literal) { + + value = { + [Common.symbols.literal]: true, + literal: value + }; + } + + const obj = this.$_setFlag(flag, value); + return obj; + } + + _generate(value, state, prefs) { + + if (!this.$_terms.whens) { + return { schema: this }; + } + + // Collect matching whens + + const whens = []; + const ids = []; + for (let i = 0; i < this.$_terms.whens.length; ++i) { + const when = this.$_terms.whens[i]; + + if (when.concat) { + whens.push(when.concat); + ids.push(`${i}.concat`); + continue; + } + + const input = when.ref ? when.ref.resolve(value, state, prefs) : value; + const tests = when.is ? [when] : when.switch; + const before = ids.length; + + for (let j = 0; j < tests.length; ++j) { + const { is, then, otherwise } = tests[j]; + + const baseId = `${i}${when.switch ? '.' + j : ''}`; + if (is.$_match(input, state.nest(is, `${baseId}.is`), prefs)) { + if (then) { + const localState = state.localize([...state.path, `${baseId}.then`], state.ancestors, state.schemas); + const { schema: generated, id } = then._generate(value, localState, prefs); + whens.push(generated); + ids.push(`${baseId}.then${id ? `(${id})` : ''}`); + break; + } + } + else if (otherwise) { + const localState = state.localize([...state.path, `${baseId}.otherwise`], state.ancestors, state.schemas); + const { schema: generated, id } = otherwise._generate(value, localState, prefs); + whens.push(generated); + ids.push(`${baseId}.otherwise${id ? `(${id})` : ''}`); + break; + } + } + + if (when.break && + ids.length > before) { // Something matched + + break; + } + } + + // Check cache + + const id = ids.join(', '); + state.mainstay.tracer.debug(state, 'rule', 'when', id); + + if (!id) { + return { schema: this }; + } + + if (!state.mainstay.tracer.active && + this.$_temp.whens[id]) { + + return { schema: this.$_temp.whens[id], id }; + } + + // Generate dynamic schema + + let obj = this; // eslint-disable-line consistent-this + if (this._definition.generate) { + obj = this._definition.generate(this, value, state, prefs); + } + + // Apply whens + + for (const when of whens) { + obj = obj.concat(when); + } + + // Tracing + + if (this.$_root._tracer) { + this.$_root._tracer._combine(obj, [this, ...whens]); + } + + // Cache result + + this.$_temp.whens[id] = obj; + return { schema: obj, id }; + } + + _inner(type, values, options = {}) { + + Assert(!this._inRuleset(), `Cannot set ${type} inside a ruleset`); + + const obj = this.clone(); + if (!obj.$_terms[type] || + options.override) { + + obj.$_terms[type] = []; + } + + if (options.single) { + obj.$_terms[type].push(values); + } + else { + obj.$_terms[type].push(...values); + } + + obj.$_temp.ruleset = false; + return obj; + } + + _inRuleset() { + + return this.$_temp.ruleset !== null && this.$_temp.ruleset !== false; + } + + _ruleRemove(name, options = {}) { + + if (!this._singleRules.has(name)) { + return this; + } + + const obj = options.clone !== false ? 
this.clone() : this; + + obj._singleRules.delete(name); + + const filtered = []; + for (let i = 0; i < obj._rules.length; ++i) { + const test = obj._rules[i]; + if (test.name === name && + !test.keep) { + + if (obj._inRuleset() && + i < obj.$_temp.ruleset) { + + --obj.$_temp.ruleset; + } + + continue; + } + + filtered.push(test); + } + + obj._rules = filtered; + return obj; + } + + _values(values, key) { + + Common.verifyFlat(values, key.slice(1, -1)); + + const obj = this.clone(); + + const override = values[0] === Common.symbols.override; + if (override) { + values = values.slice(1); + } + + if (!obj[key] && + values.length) { + + obj[key] = new Values(); + } + else if (override) { + obj[key] = values.length ? new Values() : null; + obj.$_mutateRebuild(); + } + + if (!obj[key]) { + return obj; + } + + if (override) { + obj[key].override(); + } + + for (const value of values) { + Assert(value !== undefined, 'Cannot call allow/valid/invalid with undefined'); + Assert(value !== Common.symbols.override, 'Override must be the first value'); + + const other = key === '_invalids' ? '_valids' : '_invalids'; + if (obj[other]) { + obj[other].remove(value); + if (!obj[other].length) { + Assert(key === '_valids' || !obj._flags.only, 'Setting invalid value', value, 'leaves schema rejecting all values due to previous valid rule'); + obj[other] = null; + } + } + + obj[key].add(value, obj._refs); + } + + return obj; + } +}; + + +internals.Base.prototype[Common.symbols.any] = { + version: Common.version, + compile: Compile.compile, + root: '$_root' +}; + + +internals.Base.prototype.isImmutable = true; // Prevents Hoek from deep cloning schema objects (must be on prototype) + + +// Aliases + +internals.Base.prototype.deny = internals.Base.prototype.invalid; +internals.Base.prototype.disallow = internals.Base.prototype.invalid; +internals.Base.prototype.equal = internals.Base.prototype.valid; +internals.Base.prototype.exist = internals.Base.prototype.required; +internals.Base.prototype.not = internals.Base.prototype.invalid; +internals.Base.prototype.options = internals.Base.prototype.prefs; +internals.Base.prototype.preferences = internals.Base.prototype.prefs; + + +module.exports = new internals.Base(); diff --git a/node_modules/@hapi/joi/lib/cache.js b/node_modules/@hapi/joi/lib/cache.js new file mode 100644 index 000000000..32c61e0e4 --- /dev/null +++ b/node_modules/@hapi/joi/lib/cache.js @@ -0,0 +1,143 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); + + +const internals = { + max: 1000, + supported: new Set(['undefined', 'boolean', 'number', 'string']) +}; + + +exports.provider = { + + provision(options) { + + return new internals.Cache(options); + } +}; + + +// Least Recently Used (LRU) Cache + +internals.Cache = class { + + constructor(options = {}) { + + Common.assertOptions(options, ['max']); + Assert(options.max === undefined || options.max && options.max > 0 && isFinite(options.max), 'Invalid max cache size'); + + this._max = options.max || internals.max; + + this._map = new Map(); // Map of nodes by key + this._list = new internals.List(); // List of nodes (most recently used in head) + } + + get length() { + + return this._map.size; + } + + set(key, value) { + + if (key !== null && + !internals.supported.has(typeof key)) { + + return; + } + + let node = this._map.get(key); + if (node) { + node.value = value; + this._list.first(node); + return; + } + + node = this._list.unshift({ key, 
value }); + this._map.set(key, node); + this._compact(); + } + + get(key) { + + const node = this._map.get(key); + if (node) { + this._list.first(node); + return Clone(node.value); + } + } + + _compact() { + + if (this._map.size > this._max) { + const node = this._list.pop(); + this._map.delete(node.key); + } + } +}; + + +internals.List = class { + + constructor() { + + this.tail = null; + this.head = null; + } + + unshift(node) { + + node.next = null; + node.prev = this.head; + + if (this.head) { + this.head.next = node; + } + + this.head = node; + + if (!this.tail) { + this.tail = node; + } + + return node; + } + + first(node) { + + if (node === this.head) { + return; + } + + this._remove(node); + this.unshift(node); + } + + pop() { + + return this._remove(this.tail); + } + + _remove(node) { + + const { next, prev } = node; + + next.prev = prev; + + if (prev) { + prev.next = next; + } + + if (node === this.tail) { + this.tail = next; + } + + node.prev = null; + node.next = null; + + return node; + } +}; diff --git a/node_modules/@hapi/joi/lib/common.js b/node_modules/@hapi/joi/lib/common.js new file mode 100644 index 000000000..8891d2db0 --- /dev/null +++ b/node_modules/@hapi/joi/lib/common.js @@ -0,0 +1,213 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const AssertError = require('@hapi/hoek/lib/error'); + +const Pkg = require('../package.json'); + +let Messages; +let Schemas; + + +const internals = { + isoDate: /^(?:[-+]\d{2})?(?:\d{4}(?!\d{2}\b))(?:(-?)(?:(?:0[1-9]|1[0-2])(?:\1(?:[12]\d|0[1-9]|3[01]))?|W(?:[0-4]\d|5[0-2])(?:-?[1-7])?|(?:00[1-9]|0[1-9]\d|[12]\d{2}|3(?:[0-5]\d|6[1-6])))(?![T]$|[T][\d]+Z$)(?:[T\s](?:(?:(?:[01]\d|2[0-3])(?:(:?)[0-5]\d)?|24\:?00)(?:[.,]\d+(?!:))?)(?:\2[0-5]\d(?:[.,]\d+)?)?(?:[Z]|(?:[+-])(?:[01]\d|2[0-3])(?::?[0-5]\d)?)?)?)?$/ +}; + + +exports.version = Pkg.version; + + +exports.defaults = { + abortEarly: true, + allowUnknown: false, + cache: true, + context: null, + convert: true, + dateFormat: 'iso', + errors: { + escapeHtml: false, + label: 'path', + language: null, + render: true, + stack: false, + wrap: { + label: '"', + array: '[]' + } + }, + externals: true, + messages: {}, + nonEnumerables: false, + noDefaults: false, + presence: 'optional', + skipFunctions: false, + stripUnknown: false, + warnings: false +}; + + +exports.symbols = { + any: Symbol.for('@hapi/joi/schema'), // Used to internally identify any-based types (shared with other joi versions) + arraySingle: Symbol('arraySingle'), + deepDefault: Symbol('deepDefault'), + literal: Symbol('literal'), + override: Symbol('override'), + prefs: Symbol('prefs'), + ref: Symbol('ref'), + values: Symbol('values'), + template: Symbol('template') +}; + + +exports.assertOptions = function (options, keys, name = 'Options') { + + Assert(options && typeof options === 'object' && !Array.isArray(options), 'Options must be of type object'); + const unknownKeys = Object.keys(options).filter((k) => !keys.includes(k)); + Assert(unknownKeys.length === 0, `${name} contain unknown keys: ${unknownKeys}`); +}; + + +exports.checkPreferences = function (prefs) { + + Schemas = Schemas || require('./schemas'); + + const result = Schemas.preferences.validate(prefs); + + if (result.error) { + throw new AssertError([result.error.details[0].message]); + } +}; + + +exports.compare = function (a, b, operator) { + + switch (operator) { + case '=': return a === b; + case '>': return a > b; + case '<': return a < b; + case '>=': return a >= b; + case '<=': return a <= b; + } +}; + + +exports.default = 
function (value, defaultValue) { + + return value === undefined ? defaultValue : value; +}; + + +exports.isIsoDate = function (date) { + + return internals.isoDate.test(date); +}; + + +exports.isNumber = function (value) { + + return typeof value === 'number' && !isNaN(value); +}; + + +exports.isResolvable = function (obj) { + + if (!obj) { + return false; + } + + return obj[exports.symbols.ref] || obj[exports.symbols.template]; +}; + + +exports.isSchema = function (schema, options = {}) { + + const any = schema && schema[exports.symbols.any]; + if (!any) { + return false; + } + + Assert(options.legacy || any.version === exports.version, 'Cannot mix different versions of joi schemas'); + return true; +}; + + +exports.isValues = function (obj) { + + return obj[exports.symbols.values]; +}; + + +exports.limit = function (value) { + + return Number.isSafeInteger(value) && value >= 0; +}; + + +exports.preferences = function (target, source) { + + Messages = Messages || require('./messages'); + + target = target || {}; + source = source || {}; + + const merged = Object.assign({}, target, source); + if (source.errors && + target.errors) { + + merged.errors = Object.assign({}, target.errors, source.errors); + merged.errors.wrap = Object.assign({}, target.errors.wrap, source.errors.wrap); + } + + if (source.messages) { + merged.messages = Messages.compile(source.messages, target.messages); + } + + delete merged[exports.symbols.prefs]; + return merged; +}; + + +exports.tryWithPath = function (fn, key, options = {}) { + + try { + return fn(); + } + catch (err) { + if (err.path !== undefined) { + err.path = key + '.' + err.path; + } + else { + err.path = key; + } + + if (options.append) { + err.message = `${err.message} (${err.path})`; + } + + throw err; + } +}; + + +exports.validateArg = function (value, label, { assert, message }) { + + if (exports.isSchema(assert)) { + const result = assert.validate(value); + if (!result.error) { + return; + } + + return result.error.message; + } + else if (!assert(value)) { + return label ? 
`${label} ${message}` : message; + } +}; + + +exports.verifyFlat = function (args, method) { + + for (const arg of args) { + Assert(!Array.isArray(arg), 'Method no longer accepts array arguments:', method); + } +}; diff --git a/node_modules/@hapi/joi/lib/compile.js b/node_modules/@hapi/joi/lib/compile.js new file mode 100644 index 000000000..5593b7c90 --- /dev/null +++ b/node_modules/@hapi/joi/lib/compile.js @@ -0,0 +1,283 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Common = require('./common'); +const Ref = require('./ref'); + + +const internals = {}; + + +exports.schema = function (Joi, config, options = {}) { + + Common.assertOptions(options, ['appendPath', 'override']); + + try { + return internals.schema(Joi, config, options); + } + catch (err) { + if (options.appendPath && + err.path !== undefined) { + + err.message = `${err.message} (${err.path})`; + } + + throw err; + } +}; + + +internals.schema = function (Joi, config, options) { + + Assert(config !== undefined, 'Invalid undefined schema'); + + if (Array.isArray(config)) { + Assert(config.length, 'Invalid empty array schema'); + + if (config.length === 1) { + config = config[0]; + } + } + + const valid = (base, ...values) => { + + if (options.override !== false) { + return base.valid(Joi.override, ...values); + } + + return base.valid(...values); + }; + + if (internals.simple(config)) { + return valid(Joi, config); + } + + if (typeof config === 'function') { + return Joi.custom(config); + } + + Assert(typeof config === 'object', 'Invalid schema content:', typeof config); + + if (Common.isResolvable(config)) { + return valid(Joi, config); + } + + if (Common.isSchema(config)) { + return config; + } + + if (Array.isArray(config)) { + for (const item of config) { + if (!internals.simple(item)) { + return Joi.alternatives().try(...config); + } + } + + return valid(Joi, ...config); + } + + if (config instanceof RegExp) { + return Joi.string().regex(config); + } + + if (config instanceof Date) { + return valid(Joi.date(), config); + } + + Assert(Object.getPrototypeOf(config) === Object.getPrototypeOf({}), 'Schema can only contain plain objects'); + + return Joi.object().keys(config); +}; + + +exports.ref = function (id, options) { + + return Ref.isRef(id) ? 
id : Ref.create(id, options); +}; + + +exports.compile = function (root, schema, options = {}) { + + Common.assertOptions(options, ['legacy']); + + // Compiled by any supported version + + const any = schema && schema[Common.symbols.any]; + if (any) { + Assert(options.legacy || any.version === Common.version, 'Cannot mix different versions of joi schemas:', any.version, Common.version); + return schema; + } + + // Uncompiled root + + if (typeof schema !== 'object' || + !options.legacy) { + + return exports.schema(root, schema, { appendPath: true }); // Will error if schema contains other versions + } + + // Scan schema for compiled parts + + const compiler = internals.walk(schema); + if (!compiler) { + return exports.schema(root, schema, { appendPath: true }); + } + + return compiler.compile(compiler.root, schema); +}; + + +internals.walk = function (schema) { + + if (typeof schema !== 'object') { + return null; + } + + if (Array.isArray(schema)) { + for (const item of schema) { + const compiler = internals.walk(item); + if (compiler) { + return compiler; + } + } + + return null; + } + + const any = schema[Common.symbols.any]; + if (any) { + return { root: schema[any.root], compile: any.compile }; + } + + Assert(Object.getPrototypeOf(schema) === Object.getPrototypeOf({}), 'Schema can only contain plain objects'); + + for (const key in schema) { + const compiler = internals.walk(schema[key]); + if (compiler) { + return compiler; + } + } + + return null; +}; + + +internals.simple = function (value) { + + return value === null || ['boolean', 'string', 'number'].includes(typeof value); +}; + + +exports.when = function (schema, condition, options) { + + if (options === undefined) { + Assert(condition && typeof condition === 'object', 'Missing options'); + + options = condition; + condition = Ref.create('.'); + } + + if (Array.isArray(options)) { + options = { switch: options }; + } + + Common.assertOptions(options, ['is', 'not', 'then', 'otherwise', 'switch', 'break']); + + // Schema condition + + if (Common.isSchema(condition)) { + Assert(options.is === undefined, '"is" can not be used with a schema condition'); + Assert(options.not === undefined, '"not" can not be used with a schema condition'); + Assert(options.switch === undefined, '"switch" can not be used with a schema condition'); + + return internals.condition(schema, { is: condition, then: options.then, otherwise: options.otherwise, break: options.break }); + } + + // Single condition + + Assert(Ref.isRef(condition) || typeof condition === 'string', 'Invalid condition:', condition); + Assert(options.not === undefined || options.is === undefined, 'Cannot combine "is" with "not"'); + + if (options.switch === undefined) { + let rule = options; + if (options.not !== undefined) { + rule = { is: options.not, then: options.otherwise, otherwise: options.then, break: options.break }; + } + + let is = rule.is !== undefined ? 
schema.$_compile(rule.is) : schema.$_root.invalid(null, false, 0, '').required(); + Assert(rule.then !== undefined || rule.otherwise !== undefined, 'options must have at least one of "then", "otherwise", or "switch"'); + Assert(rule.break === undefined || rule.then === undefined || rule.otherwise === undefined, 'Cannot specify then, otherwise, and break all together'); + + if (options.is !== undefined && + !Ref.isRef(options.is) && + !Common.isSchema(options.is)) { + + is = is.required(); // Only apply required if this wasn't already a schema or a ref + } + + return internals.condition(schema, { ref: exports.ref(condition), is, then: rule.then, otherwise: rule.otherwise, break: rule.break }); + } + + // Switch statement + + Assert(Array.isArray(options.switch), '"switch" must be an array'); + Assert(options.is === undefined, 'Cannot combine "switch" with "is"'); + Assert(options.not === undefined, 'Cannot combine "switch" with "not"'); + Assert(options.then === undefined, 'Cannot combine "switch" with "then"'); + + const rule = { + ref: exports.ref(condition), + switch: [], + break: options.break + }; + + for (let i = 0; i < options.switch.length; ++i) { + const test = options.switch[i]; + const last = i === options.switch.length - 1; + + Common.assertOptions(test, last ? ['is', 'then', 'otherwise'] : ['is', 'then']); + + Assert(test.is !== undefined, 'Switch statement missing "is"'); + Assert(test.then !== undefined, 'Switch statement missing "then"'); + + const item = { + is: schema.$_compile(test.is), + then: schema.$_compile(test.then) + }; + + if (!Ref.isRef(test.is) && + !Common.isSchema(test.is)) { + + item.is = item.is.required(); // Only apply required if this wasn't already a schema or a ref + } + + if (last) { + Assert(options.otherwise === undefined || test.otherwise === undefined, 'Cannot specify "otherwise" inside and outside a "switch"'); + const otherwise = options.otherwise !== undefined ? 
options.otherwise : test.otherwise; + if (otherwise !== undefined) { + Assert(rule.break === undefined, 'Cannot specify both otherwise and break'); + item.otherwise = schema.$_compile(otherwise); + } + } + + rule.switch.push(item); + } + + return rule; +}; + + +internals.condition = function (schema, condition) { + + for (const key of ['then', 'otherwise']) { + if (condition[key] === undefined) { + delete condition[key]; + } + else { + condition[key] = schema.$_compile(condition[key]); + } + } + + return condition; +}; diff --git a/node_modules/@hapi/joi/lib/errors.js b/node_modules/@hapi/joi/lib/errors.js new file mode 100644 index 000000000..0823f924f --- /dev/null +++ b/node_modules/@hapi/joi/lib/errors.js @@ -0,0 +1,262 @@ +'use strict'; + +const Annotate = require('./annotate'); +const Common = require('./common'); +const Template = require('./template'); + + +const internals = {}; + + +exports.Report = class { + + constructor(code, value, local, flags, messages, state, prefs) { + + this.code = code; + this.flags = flags; + this.messages = messages; + this.path = state.path; + this.prefs = prefs; + this.state = state; + this.value = value; + + this.message = null; + this.template = null; + + this.local = local || {}; + this.local.label = exports.label(this.flags, this.state, this.prefs, this.messages); + + if (this.value !== undefined && + !this.local.hasOwnProperty('value')) { + + this.local.value = this.value; + } + + if (this.path.length) { + const key = this.path[this.path.length - 1]; + if (typeof key !== 'object') { + this.local.key = key; + } + } + } + + _setTemplate(template) { + + this.template = template; + + if (!this.flags.label && + this.path.length === 0) { + + const localized = this._template(this.template, 'root'); + if (localized) { + this.local.label = localized; + } + } + } + + toString() { + + if (this.message) { + return this.message; + } + + const code = this.code; + + if (!this.prefs.errors.render) { + return this.code; + } + + const template = this._template(this.template) || + this._template(this.prefs.messages) || + this._template(this.messages); + + if (template === undefined) { + return `Error code "${code}" is not defined, your custom type is missing the correct messages definition`; + } + + // Render and cache result + + this.message = template.render(this.value, this.state, this.prefs, this.local, { errors: this.prefs.errors, messages: [this.prefs.messages, this.messages] }); + if (!this.prefs.errors.label) { + this.message = this.message.replace(/^"" /, '').trim(); + } + + return this.message; + } + + _template(messages, code) { + + return exports.template(this.value, messages, code || this.code, this.state, this.prefs); + } +}; + + +exports.path = function (path) { + + let label = ''; + for (const segment of path) { + if (typeof segment === 'object') { // Exclude array single path segment + continue; + } + + if (typeof segment === 'string') { + if (label) { + label += '.'; + } + + label += segment; + } + else { + label += `[${segment}]`; + } + } + + return label; +}; + + +exports.template = function (value, messages, code, state, prefs) { + + if (!messages) { + return; + } + + if (Template.isTemplate(messages)) { + return code !== 'root' ? 
messages : null; + } + + let lang = prefs.errors.language; + if (Common.isResolvable(lang)) { + lang = lang.resolve(value, state, prefs); + } + + if (lang && + messages[lang] && + messages[lang][code] !== undefined) { + + return messages[lang][code]; + } + + return messages[code]; +}; + + +exports.label = function (flags, state, prefs, messages) { + + if (flags.label) { + return flags.label; + } + + if (!prefs.errors.label) { + return ''; + } + + let path = state.path; + if (prefs.errors.label === 'key' && + state.path.length > 1) { + + path = state.path.slice(-1); + } + + const normalized = exports.path(path); + if (normalized) { + return normalized; + } + + return exports.template(null, prefs.messages, 'root', state, prefs) || + messages && exports.template(null, messages, 'root', state, prefs) || + 'value'; +}; + + +exports.process = function (errors, original, prefs) { + + if (!errors) { + return null; + } + + const { override, message, details } = exports.details(errors); + if (override) { + return override; + } + + if (prefs.errors.stack) { + return new exports.ValidationError(message, details, original); + } + + const limit = Error.stackTraceLimit; + Error.stackTraceLimit = 0; + const validationError = new exports.ValidationError(message, details, original); + Error.stackTraceLimit = limit; + return validationError; +}; + + +exports.details = function (errors, options = {}) { + + let messages = []; + const details = []; + + for (const item of errors) { + + // Override + + if (item instanceof Error) { + if (options.override !== false) { + return { override: item }; + } + + const message = item.toString(); + messages.push(message); + + details.push({ + message, + type: 'override', + context: { error: item } + }); + + continue; + } + + // Report + + const message = item.toString(); + messages.push(message); + + details.push({ + message, + path: item.path.filter((v) => typeof v !== 'object'), + type: item.code, + context: item.local + }); + } + + if (messages.length > 1) { + messages = [...new Set(messages)]; + } + + return { message: messages.join('. 
'), details }; +}; + + +exports.ValidationError = class extends Error { + + constructor(message, details, original) { + + super(message); + this._original = original; + this.details = details; + } + + static isError(err) { + + return err instanceof exports.ValidationError; + } +}; + + +exports.ValidationError.prototype.isJoi = true; + +exports.ValidationError.prototype.name = 'ValidationError'; + +exports.ValidationError.prototype.annotate = Annotate.error; diff --git a/node_modules/@hapi/joi/lib/extend.js b/node_modules/@hapi/joi/lib/extend.js new file mode 100644 index 000000000..443d0ed2d --- /dev/null +++ b/node_modules/@hapi/joi/lib/extend.js @@ -0,0 +1,311 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); +const Messages = require('./messages'); + + +const internals = {}; + + +exports.type = function (from, options) { + + const base = Object.getPrototypeOf(from); + const prototype = Clone(base); + const schema = from._assign(Object.create(prototype)); + const def = Object.assign({}, options); // Shallow cloned + delete def.base; + + prototype._definition = def; + + const parent = base._definition || {}; + def.messages = Messages.merge(parent.messages, def.messages); + def.properties = Object.assign({}, parent.properties, def.properties); + + // Type + + schema.type = def.type; + + // Flags + + def.flags = Object.assign({}, parent.flags, def.flags); + + // Terms + + const terms = Object.assign({}, parent.terms); + if (def.terms) { + for (const name in def.terms) { // Only apply own terms + const term = def.terms[name]; + Assert(schema.$_terms[name] === undefined, 'Invalid term override for', def.type, name); + schema.$_terms[name] = term.init; + terms[name] = term; + } + } + + def.terms = terms; + + // Constructor arguments + + if (!def.args) { + def.args = parent.args; + } + + // Prepare + + def.prepare = internals.prepare(def.prepare, parent.prepare); + + // Coerce + + if (def.coerce) { + if (typeof def.coerce === 'function') { + def.coerce = { method: def.coerce }; + } + + if (def.coerce.from && + !Array.isArray(def.coerce.from)) { + + def.coerce = { method: def.coerce.method, from: [].concat(def.coerce.from) }; + } + } + + def.coerce = internals.coerce(def.coerce, parent.coerce); + + // Validate + + def.validate = internals.validate(def.validate, parent.validate); + + // Rules + + const rules = Object.assign({}, parent.rules); + if (def.rules) { + for (const name in def.rules) { + const rule = def.rules[name]; + Assert(typeof rule === 'object', 'Invalid rule definition for', def.type, name); + + let method = rule.method; + if (method === undefined) { + method = function () { + + return this.$_addRule(name); + }; + } + + if (method) { + Assert(!prototype[name], 'Rule conflict in', def.type, name); + prototype[name] = method; + } + + Assert(!rules[name], 'Rule conflict in', def.type, name); + rules[name] = rule; + + if (rule.alias) { + const aliases = [].concat(rule.alias); + for (const alias of aliases) { + prototype[alias] = rule.method; + } + } + + if (rule.args) { + rule.argsByName = new Map(); + rule.args = rule.args.map((arg) => { + + if (typeof arg === 'string') { + arg = { name: arg }; + } + + Assert(!rule.argsByName.has(arg.name), 'Duplicated argument name', arg.name); + + if (Common.isSchema(arg.assert)) { + arg.assert = arg.assert.strict().label(arg.name); + } + + rule.argsByName.set(arg.name, arg); + return arg; + }); + } + } + } + + def.rules = rules; + + // 
Modifiers + + const modifiers = Object.assign({}, parent.modifiers); + if (def.modifiers) { + for (const name in def.modifiers) { + Assert(!prototype[name], 'Rule conflict in', def.type, name); + + const modifier = def.modifiers[name]; + Assert(typeof modifier === 'function', 'Invalid modifier definition for', def.type, name); + + const method = function (arg) { + + return this.rule({ [name]: arg }); + }; + + prototype[name] = method; + modifiers[name] = modifier; + } + } + + def.modifiers = modifiers; + + // Overrides + + if (def.overrides) { + prototype._super = base; + schema.$_super = {}; + for (const override in def.overrides) { + Assert(base[override], 'Cannot override missing', override); + schema.$_super[override] = base[override].bind(schema); + } + + Object.assign(prototype, def.overrides); + } + + // Casts + + def.cast = Object.assign({}, parent.cast, def.cast); + + // Manifest + + const manifest = Object.assign({}, parent.manifest, def.manifest); + manifest.build = internals.build(def.manifest && def.manifest.build, parent.manifest && parent.manifest.build); + def.manifest = manifest; + + // Rebuild + + def.rebuild = internals.rebuild(def.rebuild, parent.rebuild); + + return schema; +}; + + +// Helpers + +internals.build = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (obj, desc) { + + return parent(child(obj, desc), desc); + }; +}; + + +internals.coerce = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return { + from: child.from && parent.from ? [...new Set([...child.from, ...parent.from])] : null, + method(value, helpers) { + + let coerced; + if (!parent.from || + parent.from.includes(typeof value)) { + + coerced = parent.method(value, helpers); + if (coerced) { + if (coerced.errors || + coerced.value === undefined) { + + return coerced; + } + + value = coerced.value; + } + } + + if (!child.from || + child.from.includes(typeof value)) { + + const own = child.method(value, helpers); + if (own) { + return own; + } + } + + return coerced; + } + }; +}; + + +internals.prepare = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (value, helpers) { + + const prepared = child(value, helpers); + if (prepared) { + if (prepared.errors || + prepared.value === undefined) { + + return prepared; + } + + value = prepared.value; + } + + return parent(value, helpers) || prepared; + }; +}; + + +internals.rebuild = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (schema) { + + parent(schema); + child(schema); + }; +}; + + +internals.validate = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (value, helpers) { + + const result = parent(value, helpers); + if (result) { + if (result.errors && + (!Array.isArray(result.errors) || result.errors.length)) { + + return result; + } + + value = result.value; + } + + return child(value, helpers) || result; + }; +}; diff --git a/node_modules/@hapi/joi/lib/index.js b/node_modules/@hapi/joi/lib/index.js new file mode 100644 index 000000000..8813f516d --- /dev/null +++ b/node_modules/@hapi/joi/lib/index.js @@ -0,0 +1,283 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Cache = require('./cache'); +const Common = require('./common'); +const Compile = require('./compile'); +const Errors = 
require('./errors'); +const Extend = require('./extend'); +const Manifest = require('./manifest'); +const Ref = require('./ref'); +const Template = require('./template'); +const Trace = require('./trace'); + +let Schemas; + + +const internals = { + types: { + alternatives: require('./types/alternatives'), + any: require('./types/any'), + array: require('./types/array'), + boolean: require('./types/boolean'), + date: require('./types/date'), + function: require('./types/function'), + link: require('./types/link'), + number: require('./types/number'), + object: require('./types/object'), + string: require('./types/string'), + symbol: require('./types/symbol') + }, + aliases: { + alt: 'alternatives', + bool: 'boolean', + func: 'function' + } +}; + + +if (Buffer) { // $lab:coverage:ignore$ + internals.types.binary = require('./types/binary'); +} + + +internals.root = function () { + + const root = { + _types: new Set(Object.keys(internals.types)) + }; + + // Types + + for (const type of root._types) { + root[type] = function (...args) { + + Assert(!args.length || ['alternatives', 'link', 'object'].includes(type), 'The', type, 'type does not allow arguments'); + return internals.generate(this, internals.types[type], args); + }; + } + + // Shortcuts + + for (const method of ['allow', 'custom', 'disallow', 'equal', 'exist', 'forbidden', 'invalid', 'not', 'only', 'optional', 'options', 'prefs', 'preferences', 'required', 'strip', 'valid', 'when']) { + root[method] = function (...args) { + + return this.any()[method](...args); + }; + } + + // Methods + + Object.assign(root, internals.methods); + + // Aliases + + for (const alias in internals.aliases) { + const target = internals.aliases[alias]; + root[alias] = root[target]; + } + + root.x = root.expression; + + // Trace + + if (Trace.setup) { // $lab:coverage:ignore$ + Trace.setup(root); + } + + return root; +}; + + +internals.methods = { + + ValidationError: Errors.ValidationError, + version: Common.version, + cache: Cache.provider, + + assert(value, schema, ...args /* [message], [options] */) { + + internals.assert(value, schema, true, args); + }, + + attempt(value, schema, ...args /* [message], [options] */) { + + return internals.assert(value, schema, false, args); + }, + + build(desc) { + + Assert(typeof Manifest.build === 'function', 'Manifest functionality disabled'); + return Manifest.build(this, desc); + }, + + checkPreferences(prefs) { + + Common.checkPreferences(prefs); + }, + + compile(schema, options) { + + return Compile.compile(this, schema, options); + }, + + defaults(modifier) { + + Assert(typeof modifier === 'function', 'modifier must be a function'); + + const joi = Object.assign({}, this); + for (const type of joi._types) { + const schema = modifier(joi[type]()); + Assert(Common.isSchema(schema), 'modifier must return a valid schema object'); + + joi[type] = function (...args) { + + return internals.generate(this, schema, args); + }; + } + + return joi; + }, + + expression(...args) { + + return new Template(...args); + }, + + extend(...extensions) { + + Common.verifyFlat(extensions, 'extend'); + + Schemas = Schemas || require('./schemas'); + + Assert(extensions.length, 'You need to provide at least one extension'); + this.assert(extensions, Schemas.extensions); + + const joi = Object.assign({}, this); + joi._types = new Set(joi._types); + + for (let extension of extensions) { + if (typeof extension === 'function') { + extension = extension(joi); + } + + this.assert(extension, Schemas.extension); + + const expanded = 
internals.expandExtension(extension, joi); + for (const item of expanded) { + Assert(joi[item.type] === undefined || joi._types.has(item.type), 'Cannot override name', item.type); + + const base = item.base || this.any(); + const schema = Extend.type(base, item); + + joi._types.add(item.type); + joi[item.type] = function (...args) { + + return internals.generate(this, schema, args); + }; + } + } + + return joi; + }, + + isError: Errors.ValidationError.isError, + isExpression: Template.isTemplate, + isRef: Ref.isRef, + isSchema: Common.isSchema, + + in(...args) { + + return Ref.in(...args); + }, + + override: Common.symbols.override, + + ref(...args) { + + return Ref.create(...args); + }, + + types() { + + const types = {}; + for (const type of this._types) { + types[type] = this[type](); + } + + for (const target in internals.aliases) { + types[target] = this[target](); + } + + return types; + } +}; + + +// Helpers + +internals.assert = function (value, schema, annotate, args /* [message], [options] */) { + + const message = args[0] instanceof Error || typeof args[0] === 'string' ? args[0] : null; + const options = message ? args[1] : args[0]; + const result = schema.validate(value, Common.preferences({ errors: { stack: true } }, options || {})); + + let error = result.error; + if (!error) { + return result.value; + } + + if (message instanceof Error) { + throw message; + } + + const display = annotate && typeof error.annotate === 'function' ? error.annotate() : error.message; + + if (error instanceof Errors.ValidationError === false) { + error = Clone(error); + } + + error.message = message ? `${message} ${display}` : display; + throw error; +}; + + +internals.generate = function (root, schema, args) { + + Assert(root, 'Must be invoked on a Joi instance.'); + + schema.$_root = root; + + if (!schema._definition.args || + !args.length) { + + return schema; + } + + return schema._definition.args(schema, ...args); +}; + + +internals.expandExtension = function (extension, joi) { + + if (typeof extension.type === 'string') { + return [extension]; + } + + const extended = []; + for (const type of joi._types) { + if (extension.type.test(type)) { + const item = Object.assign({}, extension); + item.type = type; + item.base = joi[type](); + extended.push(item); + } + } + + return extended; +}; + + +module.exports = internals.root(); diff --git a/node_modules/@hapi/joi/lib/manifest.js b/node_modules/@hapi/joi/lib/manifest.js new file mode 100644 index 000000000..fd0ae953a --- /dev/null +++ b/node_modules/@hapi/joi/lib/manifest.js @@ -0,0 +1,476 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); +const Messages = require('./messages'); +const Ref = require('./ref'); +const Template = require('./template'); + +let Schemas; + + +const internals = {}; + + +exports.describe = function (schema) { + + const def = schema._definition; + + // Type + + const desc = { + type: schema.type, + flags: {}, + rules: [] + }; + + // Flags + + for (const flag in schema._flags) { + if (flag[0] !== '_') { + desc.flags[flag] = internals.describe(schema._flags[flag]); + } + } + + if (!Object.keys(desc.flags).length) { + delete desc.flags; + } + + // Preferences + + if (schema._preferences) { + desc.preferences = Clone(schema._preferences, { shallow: ['messages'] }); + delete desc.preferences[Common.symbols.prefs]; + if (desc.preferences.messages) { + desc.preferences.messages = 
Messages.decompile(desc.preferences.messages); + } + } + + // Allow / Invalid + + if (schema._valids) { + desc.allow = schema._valids.describe(); + } + + if (schema._invalids) { + desc.invalid = schema._invalids.describe(); + } + + // Rules + + for (const rule of schema._rules) { + const ruleDef = def.rules[rule.name]; + if (ruleDef.manifest === false) { // Defaults to true + continue; + } + + const item = { name: rule.name }; + + for (const custom in def.modifiers) { + if (rule[custom] !== undefined) { + item[custom] = internals.describe(rule[custom]); + } + } + + if (rule.args) { + item.args = {}; + for (const key in rule.args) { + const arg = rule.args[key]; + if (key === 'options' && + !Object.keys(arg).length) { + + continue; + } + + item.args[key] = internals.describe(arg, { assign: key }); + } + + if (!Object.keys(item.args).length) { + delete item.args; + } + } + + desc.rules.push(item); + } + + if (!desc.rules.length) { + delete desc.rules; + } + + // Terms (must be last to verify no name conflicts) + + for (const term in schema.$_terms) { + if (term[0] === '_') { + continue; + } + + Assert(!desc[term], 'Cannot describe schema due to internal name conflict with', term); + + const items = schema.$_terms[term]; + if (!items) { + continue; + } + + if (items instanceof Map) { + if (items.size) { + desc[term] = [...items.entries()]; + } + + continue; + } + + if (Common.isValues(items)) { + desc[term] = items.describe(); + continue; + } + + Assert(def.terms[term], 'Term', term, 'missing configuration'); + const manifest = def.terms[term].manifest; + const mapped = typeof manifest === 'object'; + if (!items.length && + !mapped) { + + continue; + } + + const normalized = []; + for (const item of items) { + normalized.push(internals.describe(item)); + } + + // Mapped + + if (mapped) { + const { from, to } = manifest.mapped; + desc[term] = {}; + for (const item of normalized) { + desc[term][item[to]] = item[from]; + } + + continue; + } + + // Single + + if (manifest === 'single') { + Assert(normalized.length === 1, 'Term', term, 'contains more than one item'); + desc[term] = normalized[0]; + continue; + } + + // Array + + desc[term] = normalized; + } + + internals.validate(schema.$_root, desc); + return desc; +}; + + +internals.describe = function (item, options = {}) { + + if (Array.isArray(item)) { + return item.map(internals.describe); + } + + if (item === Common.symbols.deepDefault) { + return { special: 'deep' }; + } + + if (typeof item !== 'object' || + item === null) { + + return item; + } + + if (options.assign === 'options') { + return Clone(item); + } + + if (Buffer && Buffer.isBuffer(item)) { // $lab:coverage:ignore$ + return { buffer: item.toString('binary') }; + } + + if (item instanceof Date) { + return item.toISOString(); + } + + if (item instanceof Error) { + return item; + } + + if (item instanceof RegExp) { + if (options.assign === 'regex') { + return item.toString(); + } + + return { regex: item.toString() }; + } + + if (item[Common.symbols.literal]) { + return { function: item.literal }; + } + + if (typeof item.describe === 'function') { + if (options.assign === 'ref') { + return item.describe().ref; + } + + return item.describe(); + } + + const normalized = {}; + for (const key in item) { + const value = item[key]; + if (value === undefined) { + continue; + } + + normalized[key] = internals.describe(value, { assign: key }); + } + + return normalized; +}; + + +exports.build = function (joi, desc) { + + const builder = new internals.Builder(joi); + return 
builder.parse(desc); +}; + + +internals.Builder = class { + + constructor(joi) { + + this.joi = joi; + } + + parse(desc) { + + internals.validate(this.joi, desc); + + // Type + + let schema = this.joi[desc.type](); + const def = schema._definition; + + // Flags + + if (desc.flags) { + for (const flag in desc.flags) { + const setter = def.flags[flag] && def.flags[flag].setter || flag; + Assert(typeof schema[setter] === 'function', 'Invalid flag', flag, 'for type', desc.type); + schema = schema[setter](this.build(desc.flags[flag])); + } + } + + // Preferences + + if (desc.preferences) { + schema = schema.preferences(this.build(desc.preferences)); + } + + // Allow / Invalid + + if (desc.allow) { + schema = schema.allow(...this.build(desc.allow)); + } + + if (desc.invalid) { + schema = schema.invalid(...this.build(desc.invalid)); + } + + // Rules + + if (desc.rules) { + for (const rule of desc.rules) { + Assert(typeof schema[rule.name] === 'function', 'Invalid rule', rule.name, 'for type', desc.type); + + const args = []; + if (rule.args) { + const built = {}; + for (const key in rule.args) { + built[key] = this.build(rule.args[key], { assign: key }); + } + + const keys = Object.keys(built); + const definition = def.rules[rule.name].args; + if (definition) { + Assert(keys.length <= definition.length, 'Invalid number of arguments for', desc.type, rule.name, '(expected up to', definition.length, ', found', keys.length, ')'); + for (const { name } of definition) { + args.push(built[name]); + } + } + else { + Assert(keys.length === 1, 'Invalid number of arguments for', desc.type, rule.name, '(expected up to 1, found', keys.length, ')'); + args.push(built[keys[0]]); + } + } + + // Apply + + schema = schema[rule.name](...args); + + // Ruleset + + const options = {}; + for (const custom in def.modifiers) { + if (rule[custom] !== undefined) { + options[custom] = this.build(rule[custom]); + } + } + + if (Object.keys(options).length) { + schema = schema.rule(options); + } + } + } + + // Terms + + const terms = {}; + for (const key in desc) { + if (['allow', 'flags', 'invalid', 'whens', 'preferences', 'rules', 'type'].includes(key)) { + continue; + } + + Assert(def.terms[key], 'Term', key, 'missing configuration'); + const manifest = def.terms[key].manifest; + + if (manifest === 'schema') { + terms[key] = desc[key].map((item) => this.parse(item)); + continue; + } + + if (manifest === 'values') { + terms[key] = desc[key].map((item) => this.build(item)); + continue; + } + + if (manifest === 'single') { + terms[key] = this.build(desc[key]); + continue; + } + + if (typeof manifest === 'object') { + terms[key] = {}; + for (const name in desc[key]) { + const value = desc[key][name]; + terms[key][name] = this.parse(value); + } + + continue; + } + + terms[key] = this.build(desc[key]); + } + + if (desc.whens) { + terms.whens = desc.whens.map((when) => this.build(when)); + } + + schema = def.manifest.build(schema, terms); + schema.$_temp.ruleset = false; + return schema; + } + + build(desc, options = {}) { + + if (desc === null) { + return null; + } + + if (Array.isArray(desc)) { + return desc.map((item) => this.build(item)); + } + + if (desc instanceof Error) { + return desc; + } + + if (options.assign === 'options') { + return Clone(desc); + } + + if (options.assign === 'regex') { + return internals.regex(desc); + } + + if (options.assign === 'ref') { + return Ref.build(desc); + } + + if (typeof desc !== 'object') { + return desc; + } + + if (Object.keys(desc).length === 1) { + if (desc.buffer) { + 
Assert(Buffer, 'Buffers are not supported'); + return Buffer && Buffer.from(desc.buffer, 'binary'); // $lab:coverage:ignore$ + } + + if (desc.function) { + return { [Common.symbols.literal]: true, literal: desc.function }; + } + + if (desc.override) { + return Common.symbols.override; + } + + if (desc.ref) { + return Ref.build(desc.ref); + } + + if (desc.regex) { + return internals.regex(desc.regex); + } + + if (desc.special) { + Assert(['deep'].includes(desc.special), 'Unknown special value', desc.special); + return Common.symbols.deepDefault; + } + + if (desc.value) { + return Clone(desc.value); + } + } + + if (desc.type) { + return this.parse(desc); + } + + if (desc.template) { + return Template.build(desc); + } + + const normalized = {}; + for (const key in desc) { + normalized[key] = this.build(desc[key], { assign: key }); + } + + return normalized; + } +}; + + +internals.regex = function (string) { + + const end = string.lastIndexOf('/'); + const exp = string.slice(1, end); + const flags = string.slice(end + 1); + return new RegExp(exp, flags); +}; + + +internals.validate = function (joi, desc) { + + Schemas = Schemas || require('./schemas'); + + joi.assert(desc, Schemas.description); +}; diff --git a/node_modules/@hapi/joi/lib/messages.js b/node_modules/@hapi/joi/lib/messages.js new file mode 100644 index 000000000..f719779b2 --- /dev/null +++ b/node_modules/@hapi/joi/lib/messages.js @@ -0,0 +1,178 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Template = require('./template'); + + +const internals = {}; + + +exports.compile = function (messages, target) { + + // Single value string ('plain error message', 'template {error} message') + + if (typeof messages === 'string') { + Assert(!target, 'Cannot set single message string'); + return new Template(messages); + } + + // Single value template + + if (Template.isTemplate(messages)) { + Assert(!target, 'Cannot set single message template'); + return messages; + } + + // By error code { 'number.min': } + + Assert(typeof messages === 'object' && !Array.isArray(messages), 'Invalid message options'); + + target = target ? 
Clone(target) : {}; + + for (let code in messages) { + const message = messages[code]; + + if (code === 'root' || + Template.isTemplate(message)) { + + target[code] = message; + continue; + } + + if (typeof message === 'string') { + target[code] = new Template(message); + continue; + } + + // By language { english: { 'number.min': } } + + Assert(typeof message === 'object' && !Array.isArray(message), 'Invalid message for', code); + + const language = code; + target[language] = target[language] || {}; + + for (code in message) { + const localized = message[code]; + + if (code === 'root' || + Template.isTemplate(localized)) { + + target[language][code] = localized; + continue; + } + + Assert(typeof localized === 'string', 'Invalid message for', code, 'in', language); + target[language][code] = new Template(localized); + } + } + + return target; +}; + + +exports.decompile = function (messages) { + + // By error code { 'number.min': } + + const target = {}; + for (let code in messages) { + const message = messages[code]; + + if (code === 'root') { + target[code] = message; + continue; + } + + if (Template.isTemplate(message)) { + target[code] = message.describe({ compact: true }); + continue; + } + + // By language { english: { 'number.min': } } + + const language = code; + target[language] = {}; + + for (code in message) { + const localized = message[code]; + + if (code === 'root') { + target[language][code] = localized; + continue; + } + + target[language][code] = localized.describe({ compact: true }); + } + } + + return target; +}; + + +exports.merge = function (base, extended) { + + if (!base) { + return exports.compile(extended); + } + + if (!extended) { + return base; + } + + // Single value string + + if (typeof extended === 'string') { + return new Template(extended); + } + + // Single value template + + if (Template.isTemplate(extended)) { + return extended; + } + + // By error code { 'number.min': } + + const target = Clone(base); + + for (let code in extended) { + const message = extended[code]; + + if (code === 'root' || + Template.isTemplate(message)) { + + target[code] = message; + continue; + } + + if (typeof message === 'string') { + target[code] = new Template(message); + continue; + } + + // By language { english: { 'number.min': } } + + Assert(typeof message === 'object' && !Array.isArray(message), 'Invalid message for', code); + + const language = code; + target[language] = target[language] || {}; + + for (code in message) { + const localized = message[code]; + + if (code === 'root' || + Template.isTemplate(localized)) { + + target[language][code] = localized; + continue; + } + + Assert(typeof localized === 'string', 'Invalid message for', code, 'in', language); + target[language][code] = new Template(localized); + } + } + + return target; +}; diff --git a/node_modules/@hapi/joi/lib/modify.js b/node_modules/@hapi/joi/lib/modify.js new file mode 100644 index 000000000..6f1484847 --- /dev/null +++ b/node_modules/@hapi/joi/lib/modify.js @@ -0,0 +1,267 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Common = require('./common'); +const Ref = require('./ref'); + + +const internals = {}; + + + +exports.Ids = internals.Ids = class { + + constructor() { + + this._byId = new Map(); + this._byKey = new Map(); + this._schemaChain = false; + } + + clone() { + + const clone = new internals.Ids(); + clone._byId = new Map(this._byId); + clone._byKey = new Map(this._byKey); + clone._schemaChain = this._schemaChain; + return clone; + } + + concat(source) { + + 
if (source._schemaChain) { + this._schemaChain = true; + } + + for (const [id, value] of source._byId.entries()) { + Assert(!this._byKey.has(id), 'Schema id conflicts with existing key:', id); + this._byId.set(id, value); + } + + for (const [key, value] of source._byKey.entries()) { + Assert(!this._byId.has(key), 'Schema key conflicts with existing id:', key); + this._byKey.set(key, value); + } + } + + fork(path, adjuster, root) { + + const chain = this._collect(path); + chain.push({ schema: root }); + const tail = chain.shift(); + let adjusted = { id: tail.id, schema: adjuster(tail.schema) }; + + Assert(Common.isSchema(adjusted.schema), 'adjuster function failed to return a joi schema type'); + + for (const node of chain) { + adjusted = { id: node.id, schema: internals.fork(node.schema, adjusted.id, adjusted.schema) }; + } + + return adjusted.schema; + } + + labels(path, behind = []) { + + const current = path[0]; + const node = this._get(current); + if (!node) { + return [...behind, ...path].join('.'); + } + + const forward = path.slice(1); + behind = [...behind, node.schema._flags.label || current]; + if (!forward.length) { + return behind.join('.'); + } + + return node.schema._ids.labels(forward, behind); + } + + reach(path, behind = []) { + + const current = path[0]; + const node = this._get(current); + Assert(node, 'Schema does not contain path', [...behind, ...path].join('.')); + + const forward = path.slice(1); + if (!forward.length) { + return node.schema; + } + + return node.schema._ids.reach(forward, [...behind, current]); + } + + register(schema, { key } = {}) { + + if (!schema || + !Common.isSchema(schema)) { + + return; + } + + if (schema.$_property('schemaChain') || + schema._ids._schemaChain) { + + this._schemaChain = true; + } + + const id = schema._flags.id; + if (id) { + const existing = this._byId.get(id); + Assert(!existing || existing.schema === schema, 'Cannot add different schemas with the same id:', id); + Assert(!this._byKey.has(id), 'Schema id conflicts with existing key:', id); + + this._byId.set(id, { schema, id }); + } + + if (key) { + Assert(!this._byKey.has(key), 'Schema already contains key:', key); + Assert(!this._byId.has(key), 'Schema key conflicts with existing id:', key); + + this._byKey.set(key, { schema, id: key }); + } + } + + reset() { + + this._byId = new Map(); + this._byKey = new Map(); + this._schemaChain = false; + } + + _collect(path, behind = [], nodes = []) { + + const current = path[0]; + const node = this._get(current); + Assert(node, 'Schema does not contain path', [...behind, ...path].join('.')); + + nodes = [node, ...nodes]; + + const forward = path.slice(1); + if (!forward.length) { + return nodes; + } + + return node.schema._ids._collect(forward, [...behind, current], nodes); + } + + _get(id) { + + return this._byId.get(id) || this._byKey.get(id); + } +}; + + +internals.fork = function (schema, id, replacement) { + + const each = (item, { key }) => { + + if (id === (item._flags.id || key)) { + return replacement; + } + }; + + const obj = exports.schema(schema, { each, ref: false }); + return obj ? 
obj.$_mutateRebuild() : schema; +}; + + +exports.schema = function (schema, options) { + + let obj; + + for (const name in schema._flags) { + if (name[0] === '_') { + continue; + } + + const result = internals.scan(schema._flags[name], { source: 'flags', name }, options); + if (result !== undefined) { + obj = obj || schema.clone(); + obj._flags[name] = result; + } + } + + for (let i = 0; i < schema._rules.length; ++i) { + const rule = schema._rules[i]; + const result = internals.scan(rule.args, { source: 'rules', name: rule.name }, options); + if (result !== undefined) { + obj = obj || schema.clone(); + const clone = Object.assign({}, rule); + clone.args = result; + obj._rules[i] = clone; + + const existingUnique = obj._singleRules.get(rule.name); + if (existingUnique === rule) { + obj._singleRules.set(rule.name, clone); + } + } + } + + for (const name in schema.$_terms) { + if (name[0] === '_') { + continue; + } + + const result = internals.scan(schema.$_terms[name], { source: 'terms', name }, options); + if (result !== undefined) { + obj = obj || schema.clone(); + obj.$_terms[name] = result; + } + } + + return obj; +}; + + +internals.scan = function (item, source, options, _path, _key) { + + const path = _path || []; + + if (item === null || + typeof item !== 'object') { + + return; + } + + let clone; + + if (Array.isArray(item)) { + for (let i = 0; i < item.length; ++i) { + const key = source.source === 'terms' && source.name === 'keys' && item[i].key; + const result = internals.scan(item[i], source, options, [i, ...path], key); + if (result !== undefined) { + clone = clone || item.slice(); + clone[i] = result; + } + } + + return clone; + } + + if (options.schema !== false && Common.isSchema(item) || + options.ref !== false && Ref.isRef(item)) { + + const result = options.each(item, { ...source, path, key: _key }); + if (result === item) { + return; + } + + return result; + } + + for (const key in item) { + if (key[0] === '_') { + continue; + } + + const result = internals.scan(item[key], source, options, [key, ...path], _key); + if (result !== undefined) { + clone = clone || Object.assign({}, item); + clone[key] = result; + } + } + + return clone; +}; diff --git a/node_modules/@hapi/joi/lib/ref.js b/node_modules/@hapi/joi/lib/ref.js new file mode 100644 index 000000000..417732164 --- /dev/null +++ b/node_modules/@hapi/joi/lib/ref.js @@ -0,0 +1,412 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Common = require('./common'); + +let Template; + + +const internals = { + symbol: Symbol('ref'), // Used to internally identify references (shared with other joi versions) + defaults: { + adjust: null, + in: false, + iterables: null, + map: null, + separator: '.', + type: 'value' + } +}; + + +exports.create = function (key, options = {}) { + + Assert(typeof key === 'string', 'Invalid reference key:', key); + Common.assertOptions(options, ['adjust', 'ancestor', 'in', 'iterables', 'map', 'prefix', 'separator']); + Assert(!options.prefix || typeof options.prefix === 'object', 'options.prefix must be of type object'); + + const ref = Object.assign({}, internals.defaults, options); + delete ref.prefix; + + const separator = ref.separator; + const context = internals.context(key, separator, options.prefix); + ref.type = context.type; + key = context.key; + + if (ref.type === 'value') { + if (context.root) { + Assert(!separator || key[0] !== separator, 'Cannot specify relative 
path with root prefix'); + ref.ancestor = 'root'; + if (!key) { + key = null; + } + } + + if (separator && + separator === key) { + + key = null; + ref.ancestor = 0; + } + else { + if (ref.ancestor !== undefined) { + Assert(!separator || !key || key[0] !== separator, 'Cannot combine prefix with ancestor option'); + } + else { + const [ancestor, slice] = internals.ancestor(key, separator); + if (slice) { + key = key.slice(slice); + if (key === '') { + key = null; + } + } + + ref.ancestor = ancestor; + } + } + } + + ref.path = separator ? (key === null ? [] : key.split(separator)) : [key]; + + return new internals.Ref(ref); +}; + + +exports.in = function (key, options = {}) { + + return exports.create(key, Object.assign({}, options, { in: true })); +}; + + +exports.isRef = function (ref) { + + return ref ? !!ref[Common.symbols.ref] : false; +}; + + +internals.Ref = class { + + constructor(options) { + + Assert(typeof options === 'object', 'Invalid reference construction'); + Common.assertOptions(options, [ + 'adjust', 'ancestor', 'in', 'iterables', 'map', 'path', 'separator', 'type', // Copied + 'depth', 'key', 'root', 'display' // Overridden + ]); + + Assert([false, undefined].includes(options.separator) || typeof options.separator === 'string' && options.separator.length === 1, 'Invalid separator'); + Assert(!options.adjust || typeof options.adjust === 'function', 'options.adjust must be a function'); + Assert(!options.map || Array.isArray(options.map), 'options.map must be an array'); + Assert(!options.map || !options.adjust, 'Cannot set both map and adjust options'); + + Object.assign(this, internals.defaults, options); + + Assert(this.type === 'value' || this.ancestor === undefined, 'Non-value references cannot reference ancestors'); + + if (Array.isArray(this.map)) { + this.map = new Map(this.map); + } + + this.depth = this.path.length; + this.key = this.path.length ? 
this.path.join(this.separator) : null; + this.root = this.path[0]; + + this.updateDisplay(); + } + + resolve(value, state, prefs, local, options = {}) { + + Assert(!this.in || options.in, 'Invalid in() reference usage'); + + if (this.type === 'global') { + return this._resolve(prefs.context, state, options); + } + + if (this.type === 'local') { + return this._resolve(local, state, options); + } + + if (!this.ancestor) { + return this._resolve(value, state, options); + } + + if (this.ancestor === 'root') { + return this._resolve(state.ancestors[state.ancestors.length - 1], state, options); + } + + Assert(this.ancestor <= state.ancestors.length, 'Invalid reference exceeds the schema root:', this.display); + return this._resolve(state.ancestors[this.ancestor - 1], state, options); + } + + _resolve(target, state, options) { + + let resolved; + + if (this.type === 'value' && + state.mainstay.shadow && + options.shadow !== false) { + + resolved = state.mainstay.shadow.get(this.absolute(state)); + } + + if (resolved === undefined) { + resolved = Reach(target, this.path, { iterables: this.iterables, functions: true }); + } + + if (this.adjust) { + resolved = this.adjust(resolved); + } + + if (this.map) { + const mapped = this.map.get(resolved); + if (mapped !== undefined) { + resolved = mapped; + } + } + + if (state.mainstay) { + state.mainstay.tracer.resolve(state, this, resolved); + } + + return resolved; + } + + toString() { + + return this.display; + } + + absolute(state) { + + return [...state.path.slice(0, -this.ancestor), ...this.path]; + } + + clone() { + + return new internals.Ref(this); + } + + describe() { + + const ref = { path: this.path }; + + if (this.type !== 'value') { + ref.type = this.type; + } + + if (this.separator !== '.') { + ref.separator = this.separator; + } + + if (this.type === 'value' && + this.ancestor !== 1) { + + ref.ancestor = this.ancestor; + } + + if (this.map) { + ref.map = [...this.map]; + } + + for (const key of ['adjust', 'iterables']) { + if (this[key] !== null) { + ref[key] = this[key]; + } + } + + if (this.in !== false) { + ref.in = true; + } + + return { ref }; + } + + updateDisplay() { + + const key = this.key !== null ? this.key : ''; + if (this.type !== 'value') { + this.display = `ref:${this.type}:${key}`; + return; + } + + if (!this.separator) { + this.display = `ref:${key}`; + return; + } + + if (!this.ancestor) { + this.display = `ref:${this.separator}${key}`; + return; + } + + if (this.ancestor === 'root') { + this.display = `ref:root:${key}`; + return; + } + + if (this.ancestor === 1) { + this.display = `ref:${key || '..'}`; + return; + } + + const lead = new Array(this.ancestor + 1).fill(this.separator).join(''); + this.display = `ref:${lead}${key || ''}`; + } +}; + + +internals.Ref.prototype[Common.symbols.ref] = true; + + +exports.build = function (desc) { + + desc = Object.assign({}, internals.defaults, desc); + if (desc.type === 'value' && + desc.ancestor === undefined) { + + desc.ancestor = 1; + } + + return new internals.Ref(desc); +}; + + +internals.context = function (key, separator, prefix = {}) { + + key = key.trim(); + + if (prefix) { + const globalp = prefix.global === undefined ? '$' : prefix.global; + if (globalp !== separator && + key.startsWith(globalp)) { + + return { key: key.slice(globalp.length), type: 'global' }; + } + + const local = prefix.local === undefined ? 
'#' : prefix.local; + if (local !== separator && + key.startsWith(local)) { + + return { key: key.slice(local.length), type: 'local' }; + } + + const root = prefix.root === undefined ? '/' : prefix.root; + if (root !== separator && + key.startsWith(root)) { + + return { key: key.slice(root.length), type: 'value', root: true }; + } + } + + return { key, type: 'value' }; +}; + + +internals.ancestor = function (key, separator) { + + if (!separator) { + return [1, 0]; // 'a_b' -> 1 (parent) + } + + if (key[0] !== separator) { // 'a.b' -> 1 (parent) + return [1, 0]; + } + + if (key[1] !== separator) { // '.a.b' -> 0 (self) + return [0, 1]; + } + + let i = 2; + while (key[i] === separator) { + ++i; + } + + return [i - 1, i]; // '...a.b.' -> 2 (grandparent) +}; + + +exports.toSibling = 0; + +exports.toParent = 1; + + +exports.Manager = class { + + constructor() { + + this.refs = []; // 0: [self refs], 1: [parent refs], 2: [grandparent refs], ... + } + + register(source, target) { + + if (!source) { + return; + } + + target = target === undefined ? exports.toParent : target; + + // Array + + if (Array.isArray(source)) { + for (const ref of source) { + this.register(ref, target); + } + + return; + } + + // Schema + + if (Common.isSchema(source)) { + for (const item of source._refs.refs) { + if (item.ancestor - target >= 0) { + this.refs.push({ ancestor: item.ancestor - target, root: item.root }); + } + } + + return; + } + + // Reference + + if (exports.isRef(source) && + source.type === 'value' && + source.ancestor - target >= 0) { + + this.refs.push({ ancestor: source.ancestor - target, root: source.root }); + } + + // Template + + Template = Template || require('./template'); + + if (Template.isTemplate(source)) { + this.register(source.refs(), target); + } + } + + get length() { + + return this.refs.length; + } + + clone() { + + const copy = new exports.Manager(); + copy.refs = Clone(this.refs); + return copy; + } + + reset() { + + this.refs = []; + } + + roots() { + + return this.refs.filter((ref) => !ref.ancestor).map((ref) => ref.root); + } +}; diff --git a/node_modules/@hapi/joi/lib/schemas.js b/node_modules/@hapi/joi/lib/schemas.js new file mode 100644 index 000000000..5b0221b38 --- /dev/null +++ b/node_modules/@hapi/joi/lib/schemas.js @@ -0,0 +1,294 @@ +'use strict'; + +const Joi = require('./index'); + + +const internals = {}; + + +// Preferences + +internals.wrap = Joi.string() + .min(1) + .max(2) + .allow(false); + + +exports.preferences = Joi.object({ + allowUnknown: Joi.boolean(), + abortEarly: Joi.boolean(), + cache: Joi.boolean(), + context: Joi.object(), + convert: Joi.boolean(), + dateFormat: Joi.valid('date', 'iso', 'string', 'time', 'utc'), + debug: Joi.boolean(), + errors: { + escapeHtml: Joi.boolean(), + label: Joi.valid('path', 'key', false), + language: [ + Joi.string(), + Joi.object().ref() + ], + render: Joi.boolean(), + stack: Joi.boolean(), + wrap: { + label: internals.wrap, + array: internals.wrap + } + }, + externals: Joi.boolean(), + messages: Joi.object(), + noDefaults: Joi.boolean(), + nonEnumerables: Joi.boolean(), + presence: Joi.valid('required', 'optional', 'forbidden'), + skipFunctions: Joi.boolean(), + stripUnknown: Joi.object({ + arrays: Joi.boolean(), + objects: Joi.boolean() + }) + .or('arrays', 'objects') + .allow(true, false), + warnings: Joi.boolean() +}) + .strict(); + + +// Extensions + +internals.nameRx = /^[a-zA-Z0-9]\w*$/; + + +internals.rule = Joi.object({ + alias: Joi.array().items(Joi.string().pattern(internals.nameRx)).single(), + args: 
Joi.array().items( + Joi.string(), + Joi.object({ + name: Joi.string().pattern(internals.nameRx).required(), + ref: Joi.boolean(), + assert: Joi.alternatives([ + Joi.function(), + Joi.object().schema() + ]) + .conditional('ref', { is: true, then: Joi.required() }), + normalize: Joi.function(), + message: Joi.string().when('assert', { is: Joi.function(), then: Joi.required() }) + }) + ), + convert: Joi.boolean(), + manifest: Joi.boolean(), + method: Joi.function().allow(false), + multi: Joi.boolean(), + validate: Joi.function() +}); + + +exports.extension = Joi.object({ + type: Joi.alternatives([ + Joi.string(), + Joi.object().regex() + ]) + .required(), + args: Joi.function(), + base: Joi.object().schema() + .when('type', { is: Joi.object().regex(), then: Joi.forbidden() }), + coerce: [ + Joi.function().maxArity(3), + Joi.object({ method: Joi.function().maxArity(3).required(), from: Joi.array().items(Joi.string()).single() }) + ], + flags: Joi.object().pattern(internals.nameRx, Joi.object({ + setter: Joi.string(), + default: Joi.any() + })), + manifest: { + build: Joi.function().arity(2) + }, + messages: [Joi.object(), Joi.string()], + modifiers: Joi.object().pattern(internals.nameRx, Joi.function().minArity(1).maxArity(2)), + overrides: Joi.object().pattern(internals.nameRx, Joi.function()), + prepare: Joi.function().maxArity(3), + rebuild: Joi.function().arity(1), + rules: Joi.object().pattern(internals.nameRx, internals.rule), + terms: Joi.object().pattern(internals.nameRx, Joi.object({ + init: Joi.array().allow(null).required(), + manifest: Joi.object().pattern(/.+/, [ + Joi.valid('schema', 'single'), + Joi.object({ + mapped: Joi.object({ + from: Joi.string().required(), + to: Joi.string().required() + }) + .required() + }) + ]) + })), + validate: Joi.function().maxArity(3) +}) + .strict(); + + +exports.extensions = Joi.array().items(Joi.object(), Joi.function().arity(1)).strict(); + + +// Manifest + +internals.desc = { + + buffer: Joi.object({ + buffer: Joi.string() + }), + + func: Joi.object({ + function: Joi.function().required(), + options: { + literal: true + } + }), + + override: Joi.object({ + override: true + }), + + ref: Joi.object({ + ref: Joi.object({ + type: Joi.valid('value', 'global', 'local'), + path: Joi.array().required(), + separator: Joi.string().length(1).allow(false), + ancestor: Joi.number().min(0).integer().allow('root'), + map: Joi.array().items(Joi.array().length(2)).min(1), + adjust: Joi.function(), + iterables: Joi.boolean(), + in: Joi.boolean() + }) + .required() + }), + + regex: Joi.object({ + regex: Joi.string().min(3) + }), + + special: Joi.object({ + special: Joi.valid('deep').required() + }), + + template: Joi.object({ + template: Joi.string().required(), + options: Joi.object() + }), + + value: Joi.object({ + value: Joi.alternatives([Joi.object(), Joi.array()]).required() + }) +}; + + +internals.desc.entity = Joi.alternatives([ + Joi.array().items(Joi.link('...')), + Joi.boolean(), + Joi.function(), + Joi.number(), + Joi.string(), + internals.desc.buffer, + internals.desc.func, + internals.desc.ref, + internals.desc.regex, + internals.desc.special, + internals.desc.template, + internals.desc.value, + Joi.link('/') +]); + + +internals.desc.values = Joi.array() + .items( + null, + Joi.boolean(), + Joi.function(), + Joi.number().allow(Infinity, -Infinity), + Joi.string().allow(''), + Joi.symbol(), + internals.desc.buffer, + internals.desc.func, + internals.desc.override, + internals.desc.ref, + internals.desc.regex, + internals.desc.template, + 
internals.desc.value + ); + + +internals.desc.messages = Joi.object() + .pattern(/.+/, [ + Joi.string(), + internals.desc.template, + Joi.object().pattern(/.+/, [Joi.string(), internals.desc.template]) + ]); + + +exports.description = Joi.object({ + type: Joi.string().required(), + flags: Joi.object({ + cast: Joi.string(), + default: Joi.any(), + description: Joi.string(), + empty: Joi.link('/'), + failover: internals.desc.entity, + id: Joi.string(), + label: Joi.string(), + only: true, + presence: ['optional', 'required', 'forbidden'], + result: ['raw', 'strip'], + strip: Joi.boolean(), + unit: Joi.string() + }) + .unknown(), + preferences: { + allowUnknown: Joi.boolean(), + abortEarly: Joi.boolean(), + cache: Joi.boolean(), + convert: Joi.boolean(), + dateFormat: ['date', 'iso', 'string', 'time', 'utc'], + errors: { + escapeHtml: Joi.boolean(), + label: ['path', 'key'], + language: [ + Joi.string(), + internals.desc.ref + ], + wrap: { + label: internals.wrap, + array: internals.wrap + } + }, + externals: Joi.boolean(), + messages: internals.desc.messages, + noDefaults: Joi.boolean(), + nonEnumerables: Joi.boolean(), + presence: ['required', 'optional', 'forbidden'], + skipFunctions: Joi.boolean(), + stripUnknown: Joi.object({ + arrays: Joi.boolean(), + objects: Joi.boolean() + }) + .or('arrays', 'objects') + .allow(true, false), + warnings: Joi.boolean() + }, + allow: internals.desc.values, + invalid: internals.desc.values, + rules: Joi.array().min(1).items({ + name: Joi.string().required(), + args: Joi.object().min(1), + keep: Joi.boolean(), + message: [ + Joi.string(), + internals.desc.messages + ], + warn: Joi.boolean() + }), + + // Terms + + keys: Joi.object().pattern(/.*/, Joi.link('/')), + link: internals.desc.ref +}) + .pattern(/^[a-z]\w*$/, Joi.any()); diff --git a/node_modules/@hapi/joi/lib/state.js b/node_modules/@hapi/joi/lib/state.js new file mode 100644 index 000000000..8db251b76 --- /dev/null +++ b/node_modules/@hapi/joi/lib/state.js @@ -0,0 +1,152 @@ +'use strict'; + +const Clone = require('@hapi/hoek/lib/clone'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Common = require('./common'); + + +const internals = { + value: Symbol('value') +}; + + +module.exports = internals.State = class { + + constructor(path, ancestors, state) { + + this.path = path; + this.ancestors = ancestors; // [parent, ..., root] + + this.mainstay = state.mainstay; + this.schemas = state.schemas; // [current, ..., root] + this.debug = null; + } + + localize(path, ancestors = null, schema = null) { + + const state = new internals.State(path, ancestors, this); + + if (schema && + state.schemas) { + + state.schemas = [internals.schemas(schema), ...state.schemas]; + } + + return state; + } + + nest(schema, debug) { + + const state = new internals.State(this.path, this.ancestors, this); + state.schemas = state.schemas && [internals.schemas(schema), ...state.schemas]; + state.debug = debug; + return state; + } + + shadow(value, reason) { + + this.mainstay.shadow = this.mainstay.shadow || new internals.Shadow(); + this.mainstay.shadow.set(this.path, value, reason); + } + + snapshot() { + + if (this.mainstay.shadow) { + this._snapshot = Clone(this.mainstay.shadow.node(this.path)); + } + } + + restore() { + + if (this.mainstay.shadow) { + this.mainstay.shadow.override(this.path, this._snapshot); + this._snapshot = undefined; + } + } +}; + + +internals.schemas = function (schema) { + + if (Common.isSchema(schema)) { + return { schema }; + } + + return schema; +}; + + +internals.Shadow = class { + + 
constructor() { + + this._values = null; + } + + set(path, value, reason) { + + if (!path.length) { // No need to store root value + return; + } + + if (reason === 'strip' && + typeof path[path.length - 1] === 'number') { // Cannot store stripped array values (due to shift) + + return; + } + + this._values = this._values || new Map(); + + let node = this._values; + for (let i = 0; i < path.length; ++i) { + const segment = path[i]; + let next = node.get(segment); + if (!next) { + next = new Map(); + node.set(segment, next); + } + + node = next; + } + + node[internals.value] = value; + } + + get(path) { + + const node = this.node(path); + if (node) { + return node[internals.value]; + } + } + + node(path) { + + if (!this._values) { + return; + } + + return Reach(this._values, path, { iterables: true }); + } + + override(path, node) { + + if (!this._values) { + return; + } + + const parents = path.slice(0, -1); + const own = path[path.length - 1]; + const parent = Reach(this._values, parents, { iterables: true }); + + if (node) { + parent.set(own, node); + return; + } + + if (parent) { + parent.delete(own); + } + } +}; diff --git a/node_modules/@hapi/joi/lib/template.js b/node_modules/@hapi/joi/lib/template.js new file mode 100644 index 000000000..1d5c0dbf1 --- /dev/null +++ b/node_modules/@hapi/joi/lib/template.js @@ -0,0 +1,410 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const EscapeHtml = require('@hapi/hoek/lib/escapeHtml'); +const Formula = require('@hapi/formula'); + +const Common = require('./common'); +const Errors = require('./errors'); +const Ref = require('./ref'); + + +const internals = { + symbol: Symbol('template'), + + opens: new Array(1000).join('\u0000'), + closes: new Array(1000).join('\u0001'), + + dateFormat: { + date: Date.prototype.toDateString, + iso: Date.prototype.toISOString, + string: Date.prototype.toString, + time: Date.prototype.toTimeString, + utc: Date.prototype.toUTCString + } +}; + + +module.exports = exports = internals.Template = class { + + constructor(source, options) { + + Assert(typeof source === 'string', 'Template source must be a string'); + Assert(!source.includes('\u0000') && !source.includes('\u0001'), 'Template source cannot contain reserved control characters'); + + this.source = source; + this.rendered = source; + + this._template = null; + this._settings = Clone(options); + + this._parse(); + } + + _parse() { + + // 'text {raw} {{ref}} \\{{ignore}} {{ignore\\}} {{ignore {{ignore}' + + if (!this.source.includes('{')) { + return; + } + + // Encode escaped \\{{{{{ + + const encoded = internals.encode(this.source); + + // Split on first { in each set + + const parts = internals.split(encoded); + + // Process parts + + let refs = false; + const processed = []; + const head = parts.shift(); + if (head) { + processed.push(head); + } + + for (const part of parts) { + const raw = part[0] !== '{'; + const ender = raw ? '}' : '}}'; + const end = part.indexOf(ender); + if (end === -1 || // Ignore non-matching closing + part[1] === '{') { // Ignore more than two { + + processed.push(`{${internals.decode(part)}`); + continue; + } + + const variable = part.slice(raw ? 
0 : 1, end); + const dynamic = this._ref(internals.decode(variable), raw); + processed.push(dynamic); + if (typeof dynamic !== 'string') { + refs = true; + } + + const rest = part.slice(end + ender.length); + if (rest) { + processed.push(internals.decode(rest)); + } + } + + if (!refs) { + this.rendered = processed.join(''); + return; + } + + this._template = processed; + } + + static date(date, prefs) { + + return internals.dateFormat[prefs.dateFormat].call(date); + } + + describe(options = {}) { + + if (!this._settings && + options.compact) { + + return this.source; + } + + const desc = { template: this.source }; + if (this._settings) { + desc.options = this._settings; + } + + return desc; + } + + static build(desc) { + + return new internals.Template(desc.template, desc.options); + } + + isDynamic() { + + return !!this._template; + } + + static isTemplate(template) { + + return template ? !!template[Common.symbols.template] : false; + } + + refs() { + + if (!this._template) { + return; + } + + const refs = []; + for (const part of this._template) { + if (typeof part !== 'string') { + refs.push(...part.refs); + } + } + + return refs; + } + + resolve(value, state, prefs, local) { + + if (this._template && + this._template.length === 1) { + + return this._part(this._template[0], /* context -> [*/ value, state, prefs, local, {} /*] */); + } + + return this.render(value, state, prefs, local); + } + + _part(part, ...args) { + + if (part.ref) { + return part.ref.resolve(...args); + } + + return part.formula.evaluate(args); + } + + render(value, state, prefs, local, options = {}) { + + if (!this.isDynamic()) { + return this.rendered; + } + + const parts = []; + for (const part of this._template) { + if (typeof part === 'string') { + parts.push(part); + } + else { + const rendered = this._part(part, /* context -> [*/ value, state, prefs, local, options /*] */); + const string = internals.stringify(rendered, prefs, options.errors); + if (string !== undefined) { + const result = part.raw || (options.errors && options.errors.escapeHtml) === false ? 
string : EscapeHtml(string); + const ends = part.ref && part.ref.type === 'local' && part.ref.key === 'label' && prefs.errors.wrap.label; + parts.push(internals.wrap(result, ends)); + } + } + } + + return parts.join(''); + } + + _ref(content, raw) { + + const refs = []; + const reference = (variable) => { + + const ref = Ref.create(variable, this._settings); + refs.push(ref); + return (context) => ref.resolve(...context); + }; + + try { + var formula = new Formula.Parser(content, { reference, functions: internals.functions, constants: internals.constants }); + } + catch (err) { + err.message = `Invalid template variable "${content}" fails due to: ${err.message}`; + throw err; + } + + if (formula.single) { + if (formula.single.type === 'reference') { + return { ref: refs[0], raw, refs }; + } + + return internals.stringify(formula.single.value); + } + + return { formula, raw, refs }; + } + + toString() { + + return this.source; + } +}; + + +internals.Template.prototype[Common.symbols.template] = true; +internals.Template.prototype.isImmutable = true; // Prevents Hoek from deep cloning schema objects + + +internals.encode = function (string) { + + return string + .replace(/\\(\{+)/g, ($0, $1) => { + + return internals.opens.slice(0, $1.length); + }) + .replace(/\\(\}+)/g, ($0, $1) => { + + return internals.closes.slice(0, $1.length); + }); +}; + + +internals.decode = function (string) { + + return string + .replace(/\u0000/g, '{') + .replace(/\u0001/g, '}'); +}; + + +internals.split = function (string) { + + const parts = []; + let current = ''; + + for (let i = 0; i < string.length; ++i) { + const char = string[i]; + + if (char === '{') { + let next = ''; + while (i + 1 < string.length && + string[i + 1] === '{') { + + next += '{'; + ++i; + } + + parts.push(current); + current = next; + } + else { + current += char; + } + } + + parts.push(current); + return parts; +}; + + +internals.wrap = function (value, ends) { + + if (!ends) { + return value; + } + + if (ends.length === 1) { + return `${ends}${value}${ends}`; + } + + return `${ends[0]}${value}${ends[1]}`; +}; + + +internals.stringify = function (value, prefs, options) { + + const type = typeof value; + + if (value === null) { + return 'null'; + } + + if (type === 'string') { + return value; + } + + if (type === 'number' || + type === 'function' || + type === 'symbol') { + + return value.toString(); + } + + if (type !== 'object') { + return JSON.stringify(value); + } + + if (value instanceof Date) { + return internals.Template.date(value, prefs); + } + + if (value instanceof Map) { + const pairs = []; + for (const [key, sym] of value.entries()) { + pairs.push(`${key.toString()} -> ${sym.toString()}`); + } + + value = pairs; + } + + if (!Array.isArray(value)) { + return value.toString(); + } + + let partial = ''; + for (const item of value) { + partial = partial + (partial.length ? ', ' : '') + internals.stringify(item, prefs, options); + } + + return internals.wrap(partial, prefs.errors.wrap.array); +}; + + +internals.constants = { + + true: true, + false: false, + null: null, + + second: 1000, + minute: 60 * 1000, + hour: 60 * 60 * 1000, + day: 24 * 60 * 60 * 1000 +}; + + +internals.functions = { + + if(condition, then, otherwise) { + + return condition ? 
then : otherwise; + }, + + msg(code) { + + const [value, state, prefs, local, options] = this; + const messages = options.messages; + if (!messages) { + return ''; + } + + const template = Errors.template(value, messages[0], code, state, prefs) || Errors.template(value, messages[1], code, state, prefs); + if (!template) { + return ''; + } + + return template.render(value, state, prefs, local, options); + }, + + number(value) { + + if (typeof value === 'number') { + return value; + } + + if (typeof value === 'string') { + return parseFloat(value); + } + + if (typeof value === 'boolean') { + return value ? 1 : 0; + } + + if (value instanceof Date) { + return value.getTime(); + } + + return null; + } +}; diff --git a/node_modules/@hapi/joi/lib/trace.js b/node_modules/@hapi/joi/lib/trace.js new file mode 100644 index 000000000..8c6991707 --- /dev/null +++ b/node_modules/@hapi/joi/lib/trace.js @@ -0,0 +1,346 @@ +'use strict'; + +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); +const Pinpoint = require('@hapi/pinpoint'); + +const Errors = require('./errors'); + + +const internals = { + codes: { + error: 1, + pass: 2, + full: 3 + }, + labels: { + 0: 'never used', + 1: 'always error', + 2: 'always pass' + } +}; + + +exports.setup = function (root) { + + const trace = function () { + + root._tracer = root._tracer || new internals.Tracer(); + return root._tracer; + }; + + root.trace = trace; + root[Symbol.for('@hapi/lab/coverage/initialize')] = trace; + + root.untrace = () => { + + root._tracer = null; + }; +}; + + +exports.location = function (schema) { + + return schema.$_setFlag('_tracerLocation', Pinpoint.location(2)); // base.tracer(), caller +}; + + +internals.Tracer = class { + + constructor() { + + this.name = 'Joi'; + this._schemas = new Map(); + } + + _register(schema) { + + const existing = this._schemas.get(schema); + if (existing) { + return existing.store; + } + + const store = new internals.Store(schema); + const { filename, line } = schema._flags._tracerLocation || Pinpoint.location(5); // internals.tracer(), internals.entry(), exports.entry(), validate(), caller + this._schemas.set(schema, { filename, line, store }); + return store; + } + + _combine(merged, sources) { + + for (const { store } of this._schemas.values()) { + store._combine(merged, sources); + } + } + + report(file) { + + const coverage = []; + + // Process each registered schema + + for (const { filename, line, store } of this._schemas.values()) { + if (file && + file !== filename) { + + continue; + } + + // Process sub schemas of the registered root + + const missing = []; + const skipped = []; + + for (const [schema, log] of store._sources.entries()) { + + // Check if sub schema parent skipped + + if (internals.sub(log.paths, skipped)) { + continue; + } + + // Check if sub schema reached + + if (!log.entry) { + missing.push({ + status: 'never reached', + paths: [...log.paths] + }); + + skipped.push(...log.paths); + continue; + } + + // Check values + + for (const type of ['valid', 'invalid']) { + const set = schema[`_${type}s`]; + if (!set) { + continue; + } + + const values = new Set(set._values); + const refs = new Set(set._refs); + for (const { value, ref } of log[type]) { + values.delete(value); + refs.delete(ref); + } + + if (values.size || + refs.size) { + + missing.push({ + status: [...values, ...[...refs].map((ref) => ref.display)], + rule: `${type}s` + }); + } + } + + // Check rules status + + const rules = schema._rules.map((rule) => rule.name); + for (const type of ['default', 'failover']) { + if 
(schema._flags[type] !== undefined) { + rules.push(type); + } + } + + for (const name of rules) { + const status = internals.labels[log.rule[name] || 0]; + if (status) { + const report = { rule: name, status }; + if (log.paths.size) { + report.paths = [...log.paths]; + } + + missing.push(report); + } + } + } + + if (missing.length) { + coverage.push({ + filename, + line, + missing, + severity: 'error', + message: `Schema missing tests for ${missing.map(internals.message).join(', ')}` + }); + } + } + + return coverage.length ? coverage : null; + } +}; + + +internals.Store = class { + + constructor(schema) { + + this.active = true; + this._sources = new Map(); // schema -> { paths, entry, rule, valid, invalid } + this._combos = new Map(); // merged -> [sources] + this._scan(schema); + } + + debug(state, source, name, result) { + + state.mainstay.debug && state.mainstay.debug.push({ type: source, name, result, path: state.path }); + } + + entry(schema, state) { + + internals.debug(state, { type: 'entry' }); + + this._record(schema, (log) => { + + log.entry = true; + }); + } + + filter(schema, state, source, value) { + + internals.debug(state, { type: source, ...value }); + + this._record(schema, (log) => { + + log[source].add(value); + }); + } + + log(schema, state, source, name, result) { + + internals.debug(state, { type: source, name, result: result === 'full' ? 'pass' : result }); + + this._record(schema, (log) => { + + log[source][name] = log[source][name] || 0; + log[source][name] |= internals.codes[result]; + }); + } + + resolve(state, ref, to) { + + if (!state.mainstay.debug) { + return; + } + + const log = { type: 'resolve', ref: ref.display, to, path: state.path }; + state.mainstay.debug.push(log); + } + + value(state, by, from, to, name) { + + if (!state.mainstay.debug || + DeepEqual(from, to)) { + + return; + } + + const log = { type: 'value', by, from, to, path: state.path }; + if (name) { + log.name = name; + } + + state.mainstay.debug.push(log); + } + + _record(schema, each) { + + const log = this._sources.get(schema); + if (log) { + each(log); + return; + } + + const sources = this._combos.get(schema); + for (const source of sources) { + this._record(source, each); + } + } + + _scan(schema, _path) { + + const path = _path || []; + + let log = this._sources.get(schema); + if (!log) { + log = { + paths: new Set(), + entry: false, + rule: {}, + valid: new Set(), + invalid: new Set() + }; + + this._sources.set(schema, log); + } + + if (path.length) { + log.paths.add(path); + } + + const each = (sub, source) => { + + const subId = internals.id(sub, source); + this._scan(sub, path.concat(subId)); + }; + + schema.$_modify({ each, ref: false }); + } + + _combine(merged, sources) { + + this._combos.set(merged, sources); + } +}; + + +internals.message = function (item) { + + const path = item.paths ? Errors.path(item.paths[0]) + (item.rule ? 
':' : '') : ''; + return `${path}${item.rule || ''} (${item.status})`; +}; + + +internals.id = function (schema, { source, name, path, key }) { + + if (schema._flags.id) { + return schema._flags.id; + } + + if (key) { + return key; + } + + name = `@${name}`; + + if (source === 'terms') { + return [name, path[Math.min(path.length - 1, 1)]]; + } + + return name; +}; + + +internals.sub = function (paths, skipped) { + + for (const path of paths) { + for (const skip of skipped) { + if (DeepEqual(path.slice(0, skip.length), skip)) { + return true; + } + } + } + + return false; +}; + + +internals.debug = function (state, event) { + + if (state.mainstay.debug) { + event.path = state.debug ? [...state.path, state.debug] : state.path; + state.mainstay.debug.push(event); + } +}; diff --git a/node_modules/@hapi/joi/lib/types/alternatives.js b/node_modules/@hapi/joi/lib/types/alternatives.js new file mode 100644 index 000000000..07b5b13dd --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/alternatives.js @@ -0,0 +1,329 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); +const Compile = require('../compile'); +const Errors = require('../errors'); +const Ref = require('../ref'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'alternatives', + + flags: { + + match: { default: 'any' } // 'any', 'one', 'all' + }, + + terms: { + + matches: { init: [], register: Ref.toSibling } + }, + + args(schema, ...schemas) { + + if (schemas.length === 1) { + if (Array.isArray(schemas[0])) { + return schema.try(...schemas[0]); + } + } + + return schema.try(...schemas); + }, + + validate(value, helpers) { + + const { schema, error, state, prefs } = helpers; + + // Match all or one + + if (schema._flags.match) { + let hits = 0; + let matched; + + for (let i = 0; i < schema.$_terms.matches.length; ++i) { + const item = schema.$_terms.matches[i]; + const localState = state.nest(item.schema, `match.${i}`); + localState.snapshot(); + + const result = item.schema.$_validate(value, localState, prefs); + if (!result.errors) { + ++hits; + matched = result.value; + } + else { + localState.restore(); + } + } + + if (!hits) { + return { errors: error('alternatives.any') }; + } + + if (schema._flags.match === 'one') { + return hits === 1 ? { value: matched } : { errors: error('alternatives.one') }; + } + + return hits === schema.$_terms.matches.length ? { value } : { errors: error('alternatives.all') }; + } + + // Match any + + const errors = []; + for (let i = 0; i < schema.$_terms.matches.length; ++i) { + const item = schema.$_terms.matches[i]; + + // Try + + if (item.schema) { + const localState = state.nest(item.schema, `match.${i}`); + localState.snapshot(); + + const result = item.schema.$_validate(value, localState, prefs); + if (!result.errors) { + return result; + } + + localState.restore(); + errors.push({ schema: item.schema, reports: result.errors }); + continue; + } + + // Conditional + + const input = item.ref ? item.ref.resolve(value, state, prefs) : value; + const tests = item.is ? [item] : item.switch; + + for (let j = 0; j < tests.length; ++j) { + const test = tests[j]; + const { is, then, otherwise } = test; + + const id = `match.${i}${item.switch ? '.' 
+ j : ''}`; + if (!is.$_match(input, state.nest(is, `${id}.is`), prefs)) { + if (otherwise) { + return otherwise.$_validate(value, state.nest(otherwise, `${id}.otherwise`), prefs); + } + } + else if (then) { + return then.$_validate(value, state.nest(then, `${id}.then`), prefs); + } + } + } + + return internals.errors(errors, helpers); + }, + + rules: { + + conditional: { + method(condition, options) { + + Assert(!this._flags._endedSwitch, 'Unreachable condition'); + Assert(!this._flags.match, 'Cannot combine match mode', this._flags.match, 'with conditional rule'); + Assert(options.break === undefined, 'Cannot use break option with alternatives conditional'); + + const obj = this.clone(); + + const match = Compile.when(obj, condition, options); + const conditions = match.is ? [match] : match.switch; + for (const item of conditions) { + if (item.then && + item.otherwise) { + + obj.$_setFlag('_endedSwitch', true, { clone: false }); + break; + } + } + + obj.$_terms.matches.push(match); + return obj.$_mutateRebuild(); + } + }, + + match: { + method(mode) { + + Assert(['any', 'one', 'all'].includes(mode), 'Invalid alternatives match mode', mode); + + if (mode !== 'any') { + for (const match of this.$_terms.matches) { + Assert(match.schema, 'Cannot combine match mode', mode, 'with conditional rules'); + } + } + + return this.$_setFlag('match', mode); + } + }, + + try: { + method(...schemas) { + + Assert(schemas.length, 'Missing alternative schemas'); + Common.verifyFlat(schemas, 'try'); + + Assert(!this._flags._endedSwitch, 'Unreachable condition'); + + const obj = this.clone(); + for (const schema of schemas) { + obj.$_terms.matches.push({ schema: obj.$_compile(schema) }); + } + + return obj.$_mutateRebuild(); + } + } + }, + + overrides: { + + label(name) { + + const obj = this.$_super.label(name); + const each = (item, source) => (source.path[0] !== 'is' ? 
item.label(name) : undefined); + return obj.$_modify({ each, ref: false }); + } + }, + + rebuild(schema) { + + // Flag when an alternative type is an array + + const each = (item) => { + + if (Common.isSchema(item) && + item.type === 'array') { + + schema.$_setFlag('_arrayItems', true, { clone: false }); + } + }; + + schema.$_modify({ each }); + }, + + manifest: { + + build(obj, desc) { + + if (desc.matches) { + for (const match of desc.matches) { + const { schema, ref, is, not, then, otherwise } = match; + if (schema) { + obj = obj.try(schema); + } + else if (ref) { + obj = obj.conditional(ref, { is, then, not, otherwise, switch: match.switch }); + } + else { + obj = obj.conditional(is, { then, otherwise }); + } + } + } + + return obj; + } + }, + + messages: { + 'alternatives.all': '{{#label}} does not match all of the required types', + 'alternatives.any': '{{#label}} does not match any of the allowed types', + 'alternatives.match': '{{#label}} does not match any of the allowed types', + 'alternatives.one': '{{#label}} matches more than one allowed type', + 'alternatives.types': '{{#label}} must be one of {{#types}}' + } +}); + + +// Helpers + +internals.errors = function (failures, { error, state }) { + + // Nothing matched due to type criteria rules + + if (!failures.length) { + return { errors: error('alternatives.any') }; + } + + // Single error + + if (failures.length === 1) { + return { errors: failures[0].reports }; + } + + // Analyze reasons + + const valids = new Set(); + const complex = []; + + for (const { reports, schema } of failures) { + + // Multiple errors (!abortEarly) + + if (reports.length > 1) { + return internals.unmatched(failures, error); + } + + // Custom error + + const report = reports[0]; + if (report instanceof Errors.Report === false) { + return internals.unmatched(failures, error); + } + + // Internal object or array error + + if (report.state.path.length !== state.path.length) { + complex.push({ type: schema.type, report }); + continue; + } + + // Valids + + if (report.code === 'any.only') { + for (const valid of report.local.valids) { + valids.add(valid); + } + + continue; + } + + // Base type + + const [type, code] = report.code.split('.'); + if (code !== 'base') { + complex.push({ type: schema.type, report }); + continue; + } + + valids.add(type); + } + + // All errors are base types or valids + + if (!complex.length) { + return { errors: error('alternatives.types', { types: [...valids] }) }; + } + + // Single complex error + + if (complex.length === 1) { + return { errors: complex[0].report }; + } + + return internals.unmatched(failures, error); +}; + + +internals.unmatched = function (failures, error) { + + const errors = []; + for (const failure of failures) { + errors.push(...failure.reports); + } + + return { errors: error('alternatives.match', Errors.details(errors, { override: false })) }; +}; diff --git a/node_modules/@hapi/joi/lib/types/any.js b/node_modules/@hapi/joi/lib/types/any.js new file mode 100644 index 000000000..42aebf9ec --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/any.js @@ -0,0 +1,174 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Base = require('../base'); +const Common = require('../common'); +const Messages = require('../messages'); + + +const internals = {}; + + +module.exports = Base.extend({ + + type: 'any', + + flags: { + + only: { default: false } + }, + + terms: { + + alterations: { init: null }, + examples: { init: null }, + externals: { init: null }, + metas: { init: [] }, + notes: 
{ init: [] }, + shared: { init: null }, + tags: { init: [] }, + whens: { init: null } + }, + + rules: { + + custom: { + method(method, description) { + + Assert(typeof method === 'function', 'Method must be a function'); + Assert(description === undefined || description && typeof description === 'string', 'Description must be a non-empty string'); + + return this.$_addRule({ name: 'custom', args: { method, description } }); + }, + validate(value, helpers, { method }) { + + try { + return method(value, helpers); + } + catch (err) { + return helpers.error('any.custom', { error: err }); + } + }, + args: ['method', 'description'], + multi: true + }, + + messages: { + method(messages) { + + return this.prefs({ messages }); + } + }, + + shared: { + method(schema) { + + Assert(Common.isSchema(schema) && schema._flags.id, 'Schema must be a schema with an id'); + + const obj = this.clone(); + obj.$_terms.shared = obj.$_terms.shared || []; + obj.$_terms.shared.push(schema); + obj.$_mutateRegister(schema); + return obj; + } + }, + + warning: { + method(code, local) { + + Assert(code && typeof code === 'string', 'Invalid warning code'); + + return this.$_addRule({ name: 'warning', args: { code, local }, warn: true }); + }, + validate(value, helpers, { code, local }) { + + return helpers.error(code, local); + }, + args: ['code', 'local'], + multi: true + } + }, + + modifiers: { + + keep(rule, enabled = true) { + + rule.keep = enabled; + }, + + message(rule, message) { + + rule.message = Messages.compile(message); + }, + + warn(rule, enabled = true) { + + rule.warn = enabled; + } + }, + + manifest: { + + build(obj, desc) { + + for (const key in desc) { + const values = desc[key]; + + if (['examples', 'externals', 'metas', 'notes', 'tags'].includes(key)) { + for (const value of values) { + obj = obj[key.slice(0, -1)](value); + } + + continue; + } + + if (key === 'alterations') { + const alter = {}; + for (const { target, adjuster } of values) { + alter[target] = adjuster; + } + + obj = obj.alter(alter); + continue; + } + + if (key === 'whens') { + for (const value of values) { + const { ref, is, not, then, otherwise, concat } = value; + if (concat) { + obj = obj.concat(concat); + } + else if (ref) { + obj = obj.when(ref, { is, not, then, otherwise, switch: value.switch, break: value.break }); + } + else { + obj = obj.when(is, { then, otherwise, break: value.break }); + } + } + + continue; + } + + if (key === 'shared') { + for (const value of values) { + obj = obj.shared(value); + } + } + } + + return obj; + } + }, + + messages: { + 'any.custom': '{{#label}} failed custom validation because {{#error.message}}', + 'any.default': '{{#label}} threw an error when running default method', + 'any.failover': '{{#label}} threw an error when running failover method', + 'any.invalid': '{{#label}} contains an invalid value', + 'any.only': '{{#label}} must be {if(#valids.length == 1, "", "one of ")}{{#valids}}', + 'any.ref': '{{#label}} {{#arg}} references "{{#ref}}" which {{#reason}}', + 'any.required': '{{#label}} is required', + 'any.unknown': '{{#label}} is not allowed' + } +}); diff --git a/node_modules/@hapi/joi/lib/types/array.js b/node_modules/@hapi/joi/lib/types/array.js new file mode 100644 index 000000000..b5a4bd1f7 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/array.js @@ -0,0 +1,774 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Any = require('./any'); +const 
Common = require('../common'); +const Compile = require('../compile'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'array', + + flags: { + + single: { default: false }, + sparse: { default: false } + }, + + terms: { + + items: { init: [], manifest: 'schema' }, + ordered: { init: [], manifest: 'schema' }, + + _exclusions: { init: [] }, + _inclusions: { init: [] }, + _requireds: { init: [] } + }, + + coerce: { + from: 'object', + method(value, { schema, state, prefs }) { + + if (!Array.isArray(value)) { + return; + } + + const sort = schema.$_getRule('sort'); + if (!sort) { + return; + } + + return internals.sort(schema, value, sort.args.options, state, prefs); + } + }, + + validate(value, { schema, error }) { + + if (!Array.isArray(value)) { + if (schema._flags.single) { + const single = [value]; + single[Common.symbols.arraySingle] = true; + return { value: single }; + } + + return { errors: error('array.base') }; + } + + if (!schema.$_getRule('items') && + !schema.$_terms.externals) { + + return; + } + + return { value: value.slice() }; // Clone the array so that we don't modify the original + }, + + rules: { + + has: { + method(schema) { + + schema = this.$_compile(schema, { appendPath: true }); + const obj = this.$_addRule({ name: 'has', args: { schema } }); + obj.$_mutateRegister(schema); + return obj; + }, + validate(value, { state, prefs, error }, { schema: has }) { + + const ancestors = [value, ...state.ancestors]; + for (let i = 0; i < value.length; ++i) { + const localState = state.localize([...state.path, i], ancestors, has); + if (has.$_match(value[i], localState, prefs)) { + return value; + } + } + + const patternLabel = has._flags.label; + if (patternLabel) { + return error('array.hasKnown', { patternLabel }); + } + + return error('array.hasUnknown', null); + }, + multi: true + }, + + items: { + method(...schemas) { + + Common.verifyFlat(schemas, 'items'); + + const obj = this.$_addRule('items'); + + for (let i = 0; i < schemas.length; ++i) { + const type = Common.tryWithPath(() => this.$_compile(schemas[i]), i, { append: true }); + obj.$_terms.items.push(type); + } + + return obj.$_mutateRebuild(); + }, + validate(value, { schema, error, state, prefs }) { + + const requireds = schema.$_terms._requireds.slice(); + const ordereds = schema.$_terms.ordered.slice(); + const inclusions = [...schema.$_terms._inclusions, ...requireds]; + + const wasArray = !value[Common.symbols.arraySingle]; + delete value[Common.symbols.arraySingle]; + + const errors = []; + let il = value.length; + for (let i = 0; i < il; ++i) { + const item = value[i]; + + let errored = false; + let isValid = false; + + const key = wasArray ? 
i : new Number(i); // eslint-disable-line no-new-wrappers + const path = [...state.path, key]; + + // Sparse + + if (!schema._flags.sparse && + item === undefined) { + + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + + ordereds.shift(); + continue; + } + + // Exclusions + + const ancestors = [value, ...state.ancestors]; + + for (const exclusion of schema.$_terms._exclusions) { + if (!exclusion.$_match(item, state.localize(path, ancestors, exclusion), prefs, { presence: 'ignore' })) { + continue; + } + + errors.push(error('array.excludes', { pos: i, value: item }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + + errored = true; + ordereds.shift(); + break; + } + + if (errored) { + continue; + } + + // Ordered + + if (schema.$_terms.ordered.length) { + if (ordereds.length) { + const ordered = ordereds.shift(); + const res = ordered.$_validate(item, state.localize(path, ancestors, ordered), prefs); + if (!res.errors) { + if (ordered._flags.result === 'strip') { + internals.fastSplice(value, i); + --i; + --il; + } + else if (!schema._flags.sparse && res.value === undefined) { + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + + continue; + } + else { + value[i] = res.value; + } + } + else { + errors.push(...res.errors); + if (prefs.abortEarly) { + return errors; + } + } + + continue; + } + else if (!schema.$_terms.items.length) { + errors.push(error('array.orderedLength', { pos: i, limit: schema.$_terms.ordered.length })); + if (prefs.abortEarly) { + return errors; + } + + break; // No reason to continue since there are no other rules to validate other than array.orderedLength + } + } + + // Requireds + + const requiredChecks = []; + let jl = requireds.length; + for (let j = 0; j < jl; ++j) { + const localState = state.localize(path, ancestors, requireds[j]); + localState.snapshot(); + + const res = requireds[j].$_validate(item, localState, prefs); + requiredChecks[j] = res; + + if (!res.errors) { + value[i] = res.value; + isValid = true; + internals.fastSplice(requireds, j); + --j; + --jl; + + if (!schema._flags.sparse && + res.value === undefined) { + + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + } + + break; + } + + localState.restore(); + } + + if (isValid) { + continue; + } + + // Inclusions + + const stripUnknown = prefs.stripUnknown && !!prefs.stripUnknown.arrays || false; + + jl = inclusions.length; + for (const inclusion of inclusions) { + + // Avoid re-running requireds that already didn't match in the previous loop + + let res; + const previousCheck = requireds.indexOf(inclusion); + if (previousCheck !== -1) { + res = requiredChecks[previousCheck]; + } + else { + const localState = state.localize(path, ancestors, inclusion); + localState.snapshot(); + + res = inclusion.$_validate(item, localState, prefs); + if (!res.errors) { + if (inclusion._flags.result === 'strip') { + internals.fastSplice(value, i); + --i; + --il; + } + else if (!schema._flags.sparse && + res.value === undefined) { + + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + errored = true; + } + else { + value[i] = res.value; + } + + isValid = true; + break; + } + + localState.restore(); + } + + // Return the actual error if only one inclusion 
defined + + if (jl === 1) { + if (stripUnknown) { + internals.fastSplice(value, i); + --i; + --il; + isValid = true; + break; + } + + errors.push(...res.errors); + if (prefs.abortEarly) { + return errors; + } + + errored = true; + break; + } + } + + if (errored) { + continue; + } + + if (schema.$_terms._inclusions.length && + !isValid) { + + if (stripUnknown) { + internals.fastSplice(value, i); + --i; + --il; + continue; + } + + errors.push(error('array.includes', { pos: i, value: item }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + } + } + + if (requireds.length) { + internals.fillMissedErrors(schema, errors, requireds, value, state, prefs); + } + + if (ordereds.length) { + internals.fillOrderedErrors(schema, errors, ordereds, value, state, prefs); + } + + return errors.length ? errors : value; + }, + + priority: true, + manifest: false + }, + + length: { + method(limit) { + + return this.$_addRule({ name: 'length', args: { limit }, operator: '=' }); + }, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(value.length, limit, operator)) { + return value; + } + + return helpers.error('array.' + name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + } + ] + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'length', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'length', args: { limit }, operator: '>=' }); + } + }, + + ordered: { + method(...schemas) { + + Common.verifyFlat(schemas, 'ordered'); + + const obj = this.$_addRule('items'); + + for (let i = 0; i < schemas.length; ++i) { + const type = Common.tryWithPath(() => this.$_compile(schemas[i]), i, { append: true }); + internals.validateSingle(type, obj); + + obj.$_mutateRegister(type); + obj.$_terms.ordered.push(type); + } + + return obj.$_mutateRebuild(); + } + }, + + single: { + method(enabled) { + + const value = enabled === undefined ? true : !!enabled; + Assert(!value || !this._flags._arrayItems, 'Cannot specify single rule when array has array items'); + + return this.$_setFlag('single', value); + } + }, + + sort: { + method(options = {}) { + + Common.assertOptions(options, ['by', 'order']); + + const settings = { + order: options.order || 'ascending' + }; + + if (options.by) { + settings.by = Compile.ref(options.by, { ancestor: 0 }); + Assert(!settings.by.ancestor, 'Cannot sort by ancestor'); + } + + return this.$_addRule({ name: 'sort', args: { options: settings } }); + }, + validate(value, { error, state, prefs, schema }, { options }) { + + const { value: sorted, errors } = internals.sort(schema, value, options, state, prefs); + if (errors) { + return errors; + } + + for (let i = 0; i < value.length; ++i) { + if (value[i] !== sorted[i]) { + return error('array.sort', { order: options.order, by: options.by ? options.by.key : 'value' }); + } + } + + return value; + }, + convert: true + }, + + sparse: { + method(enabled) { + + const value = enabled === undefined ? true : !!enabled; + + if (this._flags.sparse === value) { + return this; + } + + const obj = value ? 
this.clone() : this.$_addRule('items'); + return obj.$_setFlag('sparse', value, { clone: false }); + } + }, + + unique: { + method(comparator, options = {}) { + + Assert(!comparator || typeof comparator === 'function' || typeof comparator === 'string', 'comparator must be a function or a string'); + Common.assertOptions(options, ['ignoreUndefined', 'separator']); + + const rule = { name: 'unique', args: { options, comparator } }; + + if (comparator) { + if (typeof comparator === 'string') { + const separator = Common.default(options.separator, '.'); + rule.path = separator ? comparator.split(separator) : [comparator]; + } + else { + rule.comparator = comparator; + } + } + + return this.$_addRule(rule); + }, + validate(value, { state, error, schema }, { comparator: raw, options }, { comparator, path }) { + + const found = { + string: Object.create(null), + number: Object.create(null), + undefined: Object.create(null), + boolean: Object.create(null), + object: new Map(), + function: new Map(), + custom: new Map() + }; + + const compare = comparator || DeepEqual; + const ignoreUndefined = options.ignoreUndefined; + + for (let i = 0; i < value.length; ++i) { + const item = path ? Reach(value[i], path) : value[i]; + const records = comparator ? found.custom : found[typeof item]; + Assert(records, 'Failed to find unique map container for type', typeof item); + + if (records instanceof Map) { + const entries = records.entries(); + let current; + while (!(current = entries.next()).done) { + if (compare(current.value[0], item)) { + const localState = state.localize([...state.path, i], [value, ...state.ancestors]); + const context = { + pos: i, + value: value[i], + dupePos: current.value[1], + dupeValue: value[current.value[1]] + }; + + if (path) { + context.path = raw; + } + + return error('array.unique', context, localState); + } + } + + records.set(item, i); + } + else { + if ((!ignoreUndefined || item !== undefined) && + records[item] !== undefined) { + + const context = { + pos: i, + value: value[i], + dupePos: records[item], + dupeValue: value[records[item]] + }; + + if (path) { + context.path = raw; + } + + const localState = state.localize([...state.path, i], [value, ...state.ancestors]); + return error('array.unique', context, localState); + } + + records[item] = i; + } + } + + return value; + }, + args: ['comparator', 'options'], + multi: true + } + }, + + cast: { + set: { + from: Array.isArray, + to(value, helpers) { + + return new Set(value); + } + } + }, + + rebuild(schema) { + + schema.$_terms._inclusions = []; + schema.$_terms._exclusions = []; + schema.$_terms._requireds = []; + + for (const type of schema.$_terms.items) { + internals.validateSingle(type, schema); + + if (type._flags.presence === 'required') { + schema.$_terms._requireds.push(type); + } + else if (type._flags.presence === 'forbidden') { + schema.$_terms._exclusions.push(type); + } + else { + schema.$_terms._inclusions.push(type); + } + } + + for (const type of schema.$_terms.ordered) { + internals.validateSingle(type, schema); + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.items) { + obj = obj.items(...desc.items); + } + + if (desc.ordered) { + obj = obj.ordered(...desc.ordered); + } + + return obj; + } + }, + + messages: { + 'array.base': '{{#label}} must be an array', + 'array.excludes': '{{#label}} contains an excluded value', + 'array.hasKnown': '{{#label}} does not contain at least one required match for type "{#patternLabel}"', + 'array.hasUnknown': '{{#label}} does not contain at least one 
required match', + 'array.includes': '{{#label}} does not match any of the allowed types', + 'array.includesRequiredBoth': '{{#label}} does not contain {{#knownMisses}} and {{#unknownMisses}} other required value(s)', + 'array.includesRequiredKnowns': '{{#label}} does not contain {{#knownMisses}}', + 'array.includesRequiredUnknowns': '{{#label}} does not contain {{#unknownMisses}} required value(s)', + 'array.length': '{{#label}} must contain {{#limit}} items', + 'array.max': '{{#label}} must contain less than or equal to {{#limit}} items', + 'array.min': '{{#label}} must contain at least {{#limit}} items', + 'array.orderedLength': '{{#label}} must contain at most {{#limit}} items', + 'array.sort': '{{#label}} must be sorted in {#order} order by {{#by}}', + 'array.sort.mismatching': '{{#label}} cannot be sorted due to mismatching types', + 'array.sort.unsupported': '{{#label}} cannot be sorted due to unsupported type {#type}', + 'array.sparse': '{{#label}} must not be a sparse array item', + 'array.unique': '{{#label}} contains a duplicate value' + } +}); + + +// Helpers + +internals.fillMissedErrors = function (schema, errors, requireds, value, state, prefs) { + + const knownMisses = []; + let unknownMisses = 0; + for (const required of requireds) { + const label = required._flags.label; + if (label) { + knownMisses.push(label); + } + else { + ++unknownMisses; + } + } + + if (knownMisses.length) { + if (unknownMisses) { + errors.push(schema.$_createError('array.includesRequiredBoth', value, { knownMisses, unknownMisses }, state, prefs)); + } + else { + errors.push(schema.$_createError('array.includesRequiredKnowns', value, { knownMisses }, state, prefs)); + } + } + else { + errors.push(schema.$_createError('array.includesRequiredUnknowns', value, { unknownMisses }, state, prefs)); + } +}; + + +internals.fillOrderedErrors = function (schema, errors, ordereds, value, state, prefs) { + + const requiredOrdereds = []; + + for (const ordered of ordereds) { + if (ordered._flags.presence === 'required') { + requiredOrdereds.push(ordered); + } + } + + if (requiredOrdereds.length) { + internals.fillMissedErrors(schema, errors, requiredOrdereds, value, state, prefs); + } +}; + + +internals.fastSplice = function (arr, i) { + + let pos = i; + while (pos < arr.length) { + arr[pos++] = arr[pos]; + } + + --arr.length; +}; + + +internals.validateSingle = function (type, obj) { + + if (type.type === 'array' || + type._flags._arrayItems) { + + Assert(!obj._flags.single, 'Cannot specify array item with single rule enabled'); + obj.$_setFlag('_arrayItems', true, { clone: false }); + } +}; + + +internals.sort = function (schema, value, settings, state, prefs) { + + const order = settings.order === 'ascending' ? 1 : -1; + const aFirst = -1 * order; + const bFirst = order; + + const sort = (a, b) => { + + let compare = internals.compare(a, b, aFirst, bFirst); + if (compare !== null) { + return compare; + } + + if (settings.by) { + a = settings.by.resolve(a, state, prefs); + b = settings.by.resolve(b, state, prefs); + } + + compare = internals.compare(a, b, aFirst, bFirst); + if (compare !== null) { + return compare; + } + + const type = typeof a; + if (type !== typeof b) { + throw schema.$_createError('array.sort.mismatching', value, null, state, prefs); + } + + if (type !== 'number' && + type !== 'string') { + + throw schema.$_createError('array.sort.unsupported', value, { type }, state, prefs); + } + + if (type === 'number') { + return (a - b) * order; + } + + return a < b ? 
aFirst : bFirst; + }; + + try { + return { value: value.slice().sort(sort) }; + } + catch (err) { + return { errors: err }; + } +}; + + +internals.compare = function (a, b, aFirst, bFirst) { + + if (a === b) { + return 0; + } + + if (a === undefined) { + return 1; // Always last regardless of sort order + } + + if (b === undefined) { + return -1; // Always last regardless of sort order + } + + if (a === null) { + return bFirst; + } + + if (b === null) { + return aFirst; + } + + return null; +}; diff --git a/node_modules/@hapi/joi/lib/types/binary.js b/node_modules/@hapi/joi/lib/types/binary.js new file mode 100644 index 000000000..9147166ab --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/binary.js @@ -0,0 +1,98 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'binary', + + coerce: { + from: 'string', + method(value, { schema }) { + + try { + return { value: Buffer.from(value, schema._flags.encoding) }; + } + catch (ignoreErr) { } + } + }, + + validate(value, { error }) { + + if (!Buffer.isBuffer(value)) { + return { value, errors: error('binary.base') }; + } + }, + + rules: { + encoding: { + method(encoding) { + + Assert(Buffer.isEncoding(encoding), 'Invalid encoding:', encoding); + + return this.$_setFlag('encoding', encoding); + } + }, + + length: { + method(limit) { + + return this.$_addRule({ name: 'length', method: 'length', args: { limit }, operator: '=' }); + }, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(value.length, limit, operator)) { + return value; + } + + return helpers.error('binary.' + name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + } + ] + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'length', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'length', args: { limit }, operator: '>=' }); + } + } + }, + + cast: { + string: { + from: (value) => Buffer.isBuffer(value), + to(value, helpers) { + + return value.toString(); + } + } + }, + + messages: { + 'binary.base': '{{#label}} must be a buffer or a string', + 'binary.length': '{{#label}} must be {{#limit}} bytes', + 'binary.max': '{{#label}} must be less than or equal to {{#limit}} bytes', + 'binary.min': '{{#label}} must be at least {{#limit}} bytes' + } +}); diff --git a/node_modules/@hapi/joi/lib/types/boolean.js b/node_modules/@hapi/joi/lib/types/boolean.js new file mode 100644 index 000000000..686586648 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/boolean.js @@ -0,0 +1,150 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); +const Values = require('../values'); + + +const internals = {}; + + +internals.isBool = function (value) { + + return typeof value === 'boolean'; +}; + + +module.exports = Any.extend({ + + type: 'boolean', + + flags: { + + sensitive: { default: false } + }, + + terms: { + + falsy: { + init: null, + manifest: 'values' + }, + + truthy: { + init: null, + manifest: 'values' + } + }, + + coerce(value, { schema }) { + + if (typeof value === 'boolean') { + return; + } + + if (typeof value === 'string') { + const normalized = schema._flags.sensitive ? 
value : value.toLowerCase(); + value = normalized === 'true' ? true : (normalized === 'false' ? false : value); + } + + if (typeof value !== 'boolean') { + value = schema.$_terms.truthy && schema.$_terms.truthy.has(value, null, null, !schema._flags.sensitive) || + (schema.$_terms.falsy && schema.$_terms.falsy.has(value, null, null, !schema._flags.sensitive) ? false : value); + } + + return { value }; + }, + + validate(value, { error }) { + + if (typeof value !== 'boolean') { + return { value, errors: error('boolean.base') }; + } + }, + + rules: { + truthy: { + method(...values) { + + Common.verifyFlat(values, 'truthy'); + + const obj = this.clone(); + obj.$_terms.truthy = obj.$_terms.truthy || new Values(); + + for (let i = 0; i < values.length; ++i) { + const value = values[i]; + + Assert(value !== undefined, 'Cannot call truthy with undefined'); + obj.$_terms.truthy.add(value); + } + + return obj; + } + }, + + falsy: { + method(...values) { + + Common.verifyFlat(values, 'falsy'); + + const obj = this.clone(); + obj.$_terms.falsy = obj.$_terms.falsy || new Values(); + + for (let i = 0; i < values.length; ++i) { + const value = values[i]; + + Assert(value !== undefined, 'Cannot call falsy with undefined'); + obj.$_terms.falsy.add(value); + } + + return obj; + } + }, + + sensitive: { + method(enabled = true) { + + return this.$_setFlag('sensitive', enabled); + } + } + }, + + cast: { + number: { + from: internals.isBool, + to(value, helpers) { + + return value ? 1 : 0; + } + }, + string: { + from: internals.isBool, + to(value, helpers) { + + return value ? 'true' : 'false'; + } + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.truthy) { + obj = obj.truthy(...desc.truthy); + } + + if (desc.falsy) { + obj = obj.falsy(...desc.falsy); + } + + return obj; + } + }, + + messages: { + 'boolean.base': '{{#label}} must be a boolean' + } +}); diff --git a/node_modules/@hapi/joi/lib/types/date.js b/node_modules/@hapi/joi/lib/types/date.js new file mode 100644 index 000000000..08d3f3638 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/date.js @@ -0,0 +1,233 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); +const Template = require('../template'); + + +const internals = {}; + + +internals.isDate = function (value) { + + return value instanceof Date; +}; + + +module.exports = Any.extend({ + + type: 'date', + + coerce: { + from: ['number', 'string'], + method(value, { schema }) { + + return { value: internals.parse(value, schema._flags.format) || value }; + } + }, + + validate(value, { schema, error, prefs }) { + + if (value instanceof Date && + !isNaN(value.getTime())) { + + return; + } + + const format = schema._flags.format; + + if (!prefs.convert || + !format || + typeof value !== 'string') { + + return { value, errors: error('date.base') }; + } + + return { value, errors: error('date.format', { format }) }; + }, + + rules: { + + compare: { + method: false, + validate(value, helpers, { date }, { name, operator, args }) { + + const to = date === 'now' ? Date.now() : date.getTime(); + if (Common.compare(value.getTime(), to, operator)) { + return value; + } + + return helpers.error('date.' + name, { limit: args.date, value }); + }, + args: [ + { + name: 'date', + ref: true, + normalize: (date) => { + + return date === 'now' ? 
date : internals.parse(date); + }, + assert: (date) => date !== null, + message: 'must have a valid date format' + } + ] + }, + + format: { + method(format) { + + Assert(['iso', 'javascript', 'unix'].includes(format), 'Unknown date format', format); + + return this.$_setFlag('format', format); + } + }, + + greater: { + method(date) { + + return this.$_addRule({ name: 'greater', method: 'compare', args: { date }, operator: '>' }); + } + }, + + iso: { + method() { + + return this.format('iso'); + } + }, + + less: { + method(date) { + + return this.$_addRule({ name: 'less', method: 'compare', args: { date }, operator: '<' }); + } + }, + + max: { + method(date) { + + return this.$_addRule({ name: 'max', method: 'compare', args: { date }, operator: '<=' }); + } + }, + + min: { + method(date) { + + return this.$_addRule({ name: 'min', method: 'compare', args: { date }, operator: '>=' }); + } + }, + + timestamp: { + method(type = 'javascript') { + + Assert(['javascript', 'unix'].includes(type), '"type" must be one of "javascript, unix"'); + + return this.format(type); + } + } + }, + + cast: { + number: { + from: internals.isDate, + to(value, helpers) { + + return value.getTime(); + } + }, + string: { + from: internals.isDate, + to(value, { prefs }) { + + return Template.date(value, prefs); + } + } + }, + + messages: { + 'date.base': '{{#label}} must be a valid date', + 'date.format': '{{#label}} must be in {msg("date.format." + #format) || #format} format', + 'date.greater': '{{#label}} must be greater than "{{#limit}}"', + 'date.less': '{{#label}} must be less than "{{#limit}}"', + 'date.max': '{{#label}} must be less than or equal to "{{#limit}}"', + 'date.min': '{{#label}} must be larger than or equal to "{{#limit}}"', + + // Messages used in date.format + + 'date.format.iso': 'ISO 8601 date', + 'date.format.javascript': 'timestamp or number of milliseconds', + 'date.format.unix': 'timestamp or number of seconds' + } +}); + + +// Helpers + +internals.parse = function (value, format) { + + if (value instanceof Date) { + return value; + } + + if (typeof value !== 'string' && + (isNaN(value) || !isFinite(value))) { + + return null; + } + + if (/^\s*$/.test(value)) { + return null; + } + + // ISO + + if (format === 'iso') { + if (!Common.isIsoDate(value)) { + return null; + } + + return internals.date(value.toString()); + } + + // Normalize number string + + const original = value; + if (typeof value === 'string' && + /^[+-]?\d+(\.\d+)?$/.test(value)) { + + value = parseFloat(value); + } + + // Timestamp + + if (format) { + if (format === 'javascript') { + return internals.date(1 * value); // Casting to number + } + + if (format === 'unix') { + return internals.date(1000 * value); + } + + if (typeof original === 'string') { + return null; + } + } + + // Plain + + return internals.date(value); +}; + + +internals.date = function (value) { + + const date = new Date(value); + if (!isNaN(date.getTime())) { + return date; + } + + return null; +}; diff --git a/node_modules/@hapi/joi/lib/types/function.js b/node_modules/@hapi/joi/lib/types/function.js new file mode 100644 index 000000000..eb8792e43 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/function.js @@ -0,0 +1,93 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Keys = require('./keys'); + + +const internals = {}; + + +module.exports = Keys.extend({ + + type: 'function', + + properties: { + typeof: 'function' + }, + + rules: { + arity: { + method(n) { + + Assert(Number.isSafeInteger(n) && n >= 0, 'n must be a 
positive integer'); + + return this.$_addRule({ name: 'arity', args: { n } }); + }, + validate(value, helpers, { n }) { + + if (value.length === n) { + return value; + } + + return helpers.error('function.arity', { n }); + } + }, + + class: { + method() { + + return this.$_addRule('class'); + }, + validate(value, helpers) { + + if ((/^\s*class\s/).test(value.toString())) { + return value; + } + + return helpers.error('function.class', { value }); + } + }, + + minArity: { + method(n) { + + Assert(Number.isSafeInteger(n) && n > 0, 'n must be a strict positive integer'); + + return this.$_addRule({ name: 'minArity', args: { n } }); + }, + validate(value, helpers, { n }) { + + if (value.length >= n) { + return value; + } + + return helpers.error('function.minArity', { n }); + } + }, + + maxArity: { + method(n) { + + Assert(Number.isSafeInteger(n) && n >= 0, 'n must be a positive integer'); + + return this.$_addRule({ name: 'maxArity', args: { n } }); + }, + validate(value, helpers, { n }) { + + if (value.length <= n) { + return value; + } + + return helpers.error('function.maxArity', { n }); + } + } + }, + + messages: { + 'function.arity': '{{#label}} must have an arity of {{#n}}', + 'function.class': '{{#label}} must be a class', + 'function.maxArity': '{{#label}} must have an arity lesser or equal to {{#n}}', + 'function.minArity': '{{#label}} must have an arity greater or equal to {{#n}}' + } +}); diff --git a/node_modules/@hapi/joi/lib/types/keys.js b/node_modules/@hapi/joi/lib/types/keys.js new file mode 100644 index 000000000..0f56bb5c1 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/keys.js @@ -0,0 +1,1042 @@ +'use strict'; + +const ApplyToDefaults = require('@hapi/hoek/lib/applyToDefaults'); +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const Topo = require('@hapi/topo'); + +const Any = require('./any'); +const Common = require('../common'); +const Compile = require('../compile'); +const Errors = require('../errors'); +const Ref = require('../ref'); +const Template = require('../template'); + + +const internals = { + renameDefaults: { + alias: false, // Keep old value in place + multiple: false, // Allow renaming multiple keys into the same target + override: false // Overrides an existing key + } +}; + + +module.exports = Any.extend({ + + type: '_keys', + + properties: { + typeof: 'object' + }, + + flags: { + + unknown: { default: false } + }, + + terms: { + + dependencies: { init: null }, + keys: { init: null, manifest: { mapped: { from: 'schema', to: 'key' } } }, + patterns: { init: null }, + renames: { init: null } + }, + + args(schema, keys) { + + return schema.keys(keys); + }, + + validate(value, { schema, error, state, prefs }) { + + if (!value || + typeof value !== schema.$_property('typeof') || + Array.isArray(value)) { + + return { value, errors: error('object.base', { type: schema.$_property('typeof') }) }; + } + + // Skip if there are no other rules to test + + if (!schema.$_terms.renames && + !schema.$_terms.dependencies && + !schema.$_terms.keys && // null allows any keys + !schema.$_terms.patterns && + !schema.$_terms.externals) { + + return; + } + + // Shallow clone value + + value = internals.clone(value, prefs); + const errors = []; + + // Rename keys + + if (schema.$_terms.renames && + !internals.rename(schema, value, state, prefs, errors)) { + + return { value, errors }; + } + + // Anything allowed + + if (!schema.$_terms.keys && // null allows any keys + !schema.$_terms.patterns && + 
!schema.$_terms.dependencies) { + + return { value, errors }; + } + + // Defined keys + + const unprocessed = new Set(Object.keys(value)); + + if (schema.$_terms.keys) { + const ancestors = [value, ...state.ancestors]; + + for (const child of schema.$_terms.keys) { + const key = child.key; + const item = value[key]; + + unprocessed.delete(key); + + const localState = state.localize([...state.path, key], ancestors, child); + const result = child.schema.$_validate(item, localState, prefs); + + if (result.errors) { + if (prefs.abortEarly) { + return { value, errors: result.errors }; + } + + errors.push(...result.errors); + } + else if (child.schema._flags.result === 'strip' || + result.value === undefined && item !== undefined) { + + delete value[key]; + } + else if (result.value !== undefined) { + value[key] = result.value; + } + } + } + + // Unknown keys + + if (unprocessed.size || + schema._flags._hasPatternMatch) { + + const early = internals.unknown(schema, value, unprocessed, errors, state, prefs); + if (early) { + return early; + } + } + + // Validate dependencies + + if (schema.$_terms.dependencies) { + for (const dep of schema.$_terms.dependencies) { + if (dep.key && + dep.key.resolve(value, state, prefs, null, { shadow: false }) === undefined) { + + continue; + } + + const failed = internals.dependencies[dep.rel](schema, dep, value, state, prefs); + if (failed) { + const report = schema.$_createError(failed.code, value, failed.context, state, prefs); + if (prefs.abortEarly) { + return { value, errors: report }; + } + + errors.push(report); + } + } + } + + return { value, errors }; + }, + + rules: { + + and: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'and'); + + return internals.dependency(this, 'and', null, peers); + } + }, + + append: { + method(schema) { + + if (schema === null || + schema === undefined || + Object.keys(schema).length === 0) { + + return this; + } + + return this.keys(schema); + } + }, + + assert: { + method(subject, schema, message) { + + if (!Template.isTemplate(subject)) { + subject = Compile.ref(subject); + } + + Assert(message === undefined || typeof message === 'string', 'Message must be a string'); + + schema = this.$_compile(schema, { appendPath: true }); + + const obj = this.$_addRule({ name: 'assert', args: { subject, schema, message } }); + obj.$_mutateRegister(subject); + obj.$_mutateRegister(schema); + return obj; + }, + validate(value, { error, prefs, state }, { subject, schema, message }) { + + const about = subject.resolve(value, state, prefs); + const path = Ref.isRef(subject) ? 
subject.absolute(state) : []; + if (schema.$_match(about, state.localize(path, [value, ...state.ancestors], schema), prefs)) { + return value; + } + + return error('object.assert', { subject, message }); + }, + args: ['subject', 'schema', 'message'], + multi: true + }, + + instance: { + method(constructor, name) { + + Assert(typeof constructor === 'function', 'constructor must be a function'); + + name = name || constructor.name; + + return this.$_addRule({ name: 'instance', args: { constructor, name } }); + }, + validate(value, helpers, { constructor, name }) { + + if (value instanceof constructor) { + return value; + } + + return helpers.error('object.instance', { type: name, value }); + }, + args: ['constructor', 'name'] + }, + + keys: { + method(schema) { + + Assert(schema === undefined || typeof schema === 'object', 'Object schema must be a valid object'); + Assert(!Common.isSchema(schema), 'Object schema cannot be a joi schema'); + + const obj = this.clone(); + + if (!schema) { // Allow all + obj.$_terms.keys = null; + } + else if (!Object.keys(schema).length) { // Allow none + obj.$_terms.keys = new internals.Keys(); + } + else { + obj.$_terms.keys = obj.$_terms.keys ? obj.$_terms.keys.filter((child) => !schema.hasOwnProperty(child.key)) : new internals.Keys(); + for (const key in schema) { + Common.tryWithPath(() => obj.$_terms.keys.push({ key, schema: this.$_compile(schema[key]) }), key); + } + } + + return obj.$_mutateRebuild(); + } + }, + + length: { + method(limit) { + + return this.$_addRule({ name: 'length', args: { limit }, operator: '=' }); + }, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(Object.keys(value).length, limit, operator)) { + return value; + } + + return helpers.error('object.' + name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + } + ] + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'length', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'length', args: { limit }, operator: '>=' }); + } + }, + + nand: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'nand'); + + return internals.dependency(this, 'nand', null, peers); + } + }, + + or: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'or'); + + return internals.dependency(this, 'or', null, peers); + } + }, + + oxor: { + method(...peers /*, [options] */) { + + return internals.dependency(this, 'oxor', null, peers); + } + }, + + pattern: { + method(pattern, schema, options = {}) { + + const isRegExp = pattern instanceof RegExp; + if (!isRegExp) { + pattern = this.$_compile(pattern, { appendPath: true }); + } + + Assert(schema !== undefined, 'Invalid rule'); + Common.assertOptions(options, ['fallthrough', 'matches']); + + if (isRegExp) { + Assert(!pattern.flags.includes('g') && !pattern.flags.includes('y'), 'pattern should not use global or sticky mode'); + } + + schema = this.$_compile(schema, { appendPath: true }); + + const obj = this.clone(); + obj.$_terms.patterns = obj.$_terms.patterns || []; + const config = { [isRegExp ? 
'regex' : 'schema']: pattern, rule: schema }; + if (options.matches) { + config.matches = this.$_compile(options.matches); + if (config.matches.type !== 'array') { + config.matches = config.matches.$_root.array().items(config.matches); + } + + obj.$_mutateRegister(config.matches); + obj.$_setFlag('_hasPatternMatch', true, { clone: false }); + } + + if (options.fallthrough) { + config.fallthrough = true; + } + + obj.$_terms.patterns.push(config); + obj.$_mutateRegister(schema); + return obj; + } + }, + + ref: { + method() { + + return this.$_addRule('ref'); + }, + validate(value, helpers) { + + if (Ref.isRef(value)) { + return value; + } + + return helpers.error('object.refType', { value }); + } + }, + + regex: { + method() { + + return this.$_addRule('regex'); + }, + validate(value, helpers) { + + if (value instanceof RegExp) { + return value; + } + + return helpers.error('object.regex', { value }); + } + }, + + rename: { + method(from, to, options = {}) { + + Assert(typeof from === 'string' || from instanceof RegExp, 'Rename missing the from argument'); + Assert(typeof to === 'string' || to instanceof Template, 'Invalid rename to argument'); + Assert(to !== from, 'Cannot rename key to same name:', from); + + Common.assertOptions(options, ['alias', 'ignoreUndefined', 'override', 'multiple']); + + const obj = this.clone(); + + obj.$_terms.renames = obj.$_terms.renames || []; + for (const rename of obj.$_terms.renames) { + Assert(rename.from !== from, 'Cannot rename the same key multiple times'); + } + + if (to instanceof Template) { + obj.$_mutateRegister(to); + } + + obj.$_terms.renames.push({ + from, + to, + options: ApplyToDefaults(internals.renameDefaults, options) + }); + + return obj; + } + }, + + schema: { + method(type = 'any') { + + return this.$_addRule({ name: 'schema', args: { type } }); + }, + validate(value, helpers, { type }) { + + if (Common.isSchema(value) && + (type === 'any' || value.type === type)) { + + return value; + } + + return helpers.error('object.schema', { type }); + } + }, + + unknown: { + method(allow) { + + return this.$_setFlag('unknown', allow !== false); + } + }, + + with: { + method(key, peers, options = {}) { + + return internals.dependency(this, 'with', key, peers, options); + } + }, + + without: { + method(key, peers, options = {}) { + + return internals.dependency(this, 'without', key, peers, options); + } + }, + + xor: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'xor'); + + return internals.dependency(this, 'xor', null, peers); + } + } + }, + + overrides: { + + default(value, options) { + + if (value === undefined) { + value = Common.symbols.deepDefault; + } + + return this.$_super.default(value, options); + } + }, + + rebuild(schema) { + + if (schema.$_terms.keys) { + const topo = new Topo.Sorter(); + for (const child of schema.$_terms.keys) { + Common.tryWithPath(() => topo.add(child, { after: child.schema.$_rootReferences(), group: child.key }), child.key); + } + + schema.$_terms.keys = new internals.Keys(...topo.nodes); + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.keys) { + obj = obj.keys(desc.keys); + } + + if (desc.dependencies) { + for (const { rel, key = null, peers, options } of desc.dependencies) { + obj = internals.dependency(obj, rel, key, peers, options); + } + } + + if (desc.patterns) { + for (const { regex, schema, rule, fallthrough, matches } of desc.patterns) { + obj = obj.pattern(regex || schema, rule, { fallthrough, matches }); + } + } + + if (desc.renames) { + for (const { from, to, options 
} of desc.renames) { + obj = obj.rename(from, to, options); + } + } + + return obj; + } + }, + + messages: { + 'object.and': '{{#label}} contains {{#presentWithLabels}} without its required peers {{#missingWithLabels}}', + 'object.assert': '{{#label}} is invalid because {if(#subject.key, `"` + #subject.key + `" failed to ` + (#message || "pass the assertion test"), #message || "the assertion failed")}', + 'object.base': '{{#label}} must be of type {{#type}}', + 'object.instance': '{{#label}} must be an instance of "{{#type}}"', + 'object.length': '{{#label}} must have {{#limit}} key{if(#limit == 1, "", "s")}', + 'object.max': '{{#label}} must have less than or equal to {{#limit}} key{if(#limit == 1, "", "s")}', + 'object.min': '{{#label}} must have at least {{#limit}} key{if(#limit == 1, "", "s")}', + 'object.missing': '{{#label}} must contain at least one of {{#peersWithLabels}}', + 'object.nand': '"{{#mainWithLabel}}" must not exist simultaneously with {{#peersWithLabels}}', + 'object.oxor': '{{#label}} contains a conflict between optional exclusive peers {{#peersWithLabels}}', + 'object.pattern.match': '{{#label}} keys failed to match pattern requirements', + 'object.refType': '{{#label}} must be a Joi reference', + 'object.regex': '{{#label}} must be a RegExp object', + 'object.rename.multiple': '{{#label}} cannot rename "{{#from}}" because multiple renames are disabled and another key was already renamed to "{{#to}}"', + 'object.rename.override': '{{#label}} cannot rename "{{#from}}" because override is disabled and target "{{#to}}" exists', + 'object.schema': '{{#label}} must be a Joi schema of {{#type}} type', + 'object.unknown': '{{#label}} is not allowed', + 'object.with': '"{{#mainWithLabel}}" missing required peer "{{#peerWithLabel}}"', + 'object.without': '"{{#mainWithLabel}}" conflict with forbidden peer "{{#peerWithLabel}}"', + 'object.xor': '{{#label}} contains a conflict between exclusive peers {{#peersWithLabels}}' + } +}); + + +// Helpers + +internals.clone = function (value, prefs) { + + // Object + + if (typeof value === 'object') { + if (prefs.nonEnumerables) { + return Clone(value, { shallow: true }); + } + + const clone = Object.create(Object.getPrototypeOf(value)); + Object.assign(clone, value); + return clone; + } + + // Function + + const clone = function (...args) { + + return value.apply(this, args); + }; + + clone.prototype = Clone(value.prototype); + Object.defineProperty(clone, 'name', { value: value.name, writable: false }); + Object.defineProperty(clone, 'length', { value: value.length, writable: false }); + Object.assign(clone, value); + return clone; +}; + + +internals.dependency = function (schema, rel, key, peers, options) { + + Assert(key === null || typeof key === 'string', rel, 'key must be a strings'); + + // Extract options from peers array + + if (!options) { + options = peers.length > 1 && typeof peers[peers.length - 1] === 'object' ? 
peers.pop() : {}; + } + + Common.assertOptions(options, ['separator']); + + peers = [].concat(peers); + + // Cast peer paths + + const separator = Common.default(options.separator, '.'); + const paths = []; + for (const peer of peers) { + Assert(typeof peer === 'string', rel, 'peers must be a string or a reference'); + paths.push(Compile.ref(peer, { separator, ancestor: 0, prefix: false })); + } + + // Cast key + + if (key !== null) { + key = Compile.ref(key, { separator, ancestor: 0, prefix: false }); + } + + // Add rule + + const obj = schema.clone(); + obj.$_terms.dependencies = obj.$_terms.dependencies || []; + obj.$_terms.dependencies.push(new internals.Dependency(rel, key, paths, peers)); + return obj; +}; + + +internals.dependencies = { + + and(schema, dep, value, state, prefs) { + + const missing = []; + const present = []; + const count = dep.peers.length; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) === undefined) { + missing.push(peer.key); + } + else { + present.push(peer.key); + } + } + + if (missing.length !== count && + present.length !== count) { + + return { + code: 'object.and', + context: { + present, + presentWithLabels: internals.keysToLabels(schema, present), + missing, + missingWithLabels: internals.keysToLabels(schema, missing) + } + }; + } + }, + + nand(schema, dep, value, state, prefs) { + + const present = []; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + present.push(peer.key); + } + } + + if (present.length !== dep.peers.length) { + return; + } + + const main = dep.paths[0]; + const values = dep.paths.slice(1); + return { + code: 'object.nand', + context: { + main, + mainWithLabel: internals.keysToLabels(schema, main), + peers: values, + peersWithLabels: internals.keysToLabels(schema, values) + } + }; + }, + + or(schema, dep, value, state, prefs) { + + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + return; + } + } + + return { + code: 'object.missing', + context: { + peers: dep.paths, + peersWithLabels: internals.keysToLabels(schema, dep.paths) + } + }; + }, + + oxor(schema, dep, value, state, prefs) { + + const present = []; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + present.push(peer.key); + } + } + + if (!present.length || + present.length === 1) { + + return; + } + + const context = { peers: dep.paths, peersWithLabels: internals.keysToLabels(schema, dep.paths) }; + context.present = present; + context.presentWithLabels = internals.keysToLabels(schema, present); + return { code: 'object.oxor', context }; + }, + + with(schema, dep, value, state, prefs) { + + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) === undefined) { + return { + code: 'object.with', + context: { + main: dep.key.key, + mainWithLabel: internals.keysToLabels(schema, dep.key.key), + peer: peer.key, + peerWithLabel: internals.keysToLabels(schema, peer.key) + } + }; + } + } + }, + + without(schema, dep, value, state, prefs) { + + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + return { + code: 'object.without', + context: { + main: dep.key.key, + mainWithLabel: internals.keysToLabels(schema, dep.key.key), + peer: peer.key, + peerWithLabel: internals.keysToLabels(schema, peer.key) + } + }; + } + } + }, + + 
xor(schema, dep, value, state, prefs) { + + const present = []; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + present.push(peer.key); + } + } + + if (present.length === 1) { + return; + } + + const context = { peers: dep.paths, peersWithLabels: internals.keysToLabels(schema, dep.paths) }; + if (present.length === 0) { + return { code: 'object.missing', context }; + } + + context.present = present; + context.presentWithLabels = internals.keysToLabels(schema, present); + return { code: 'object.xor', context }; + } +}; + + +internals.keysToLabels = function (schema, keys) { + + if (Array.isArray(keys)) { + return keys.map((key) => schema.$_mapLabels(key)); + } + + return schema.$_mapLabels(keys); +}; + + +internals.rename = function (schema, value, state, prefs, errors) { + + const renamed = {}; + for (const rename of schema.$_terms.renames) { + const matches = []; + const pattern = typeof rename.from !== 'string'; + + if (!pattern) { + if (Object.prototype.hasOwnProperty.call(value, rename.from) && + (value[rename.from] !== undefined || !rename.options.ignoreUndefined)) { + + matches.push(rename); + } + } + else { + for (const from in value) { + if (value[from] === undefined && + rename.options.ignoreUndefined) { + + continue; + } + + if (from === rename.to) { + continue; + } + + const match = rename.from.exec(from); + if (!match) { + continue; + } + + matches.push({ from, to: rename.to, match }); + } + } + + for (const match of matches) { + const from = match.from; + let to = match.to; + if (to instanceof Template) { + to = to.render(value, state, prefs, match.match); + } + + if (from === to) { + continue; + } + + if (!rename.options.multiple && + renamed[to]) { + + errors.push(schema.$_createError('object.rename.multiple', value, { from, to, pattern }, state, prefs)); + if (prefs.abortEarly) { + return false; + } + } + + if (Object.prototype.hasOwnProperty.call(value, to) && + !rename.options.override && + !renamed[to]) { + + errors.push(schema.$_createError('object.rename.override', value, { from, to, pattern }, state, prefs)); + if (prefs.abortEarly) { + return false; + } + } + + if (value[from] === undefined) { + delete value[to]; + } + else { + value[to] = value[from]; + } + + renamed[to] = true; + + if (!rename.options.alias) { + delete value[from]; + } + } + } + + return true; +}; + + +internals.unknown = function (schema, value, unprocessed, errors, state, prefs) { + + if (schema.$_terms.patterns) { + let hasMatches = false; + const matches = schema.$_terms.patterns.map((pattern) => { + + if (pattern.matches) { + hasMatches = true; + return []; + } + }); + + const ancestors = [value, ...state.ancestors]; + + for (const key of unprocessed) { + const item = value[key]; + const path = [...state.path, key]; + + for (let i = 0; i < schema.$_terms.patterns.length; ++i) { + const pattern = schema.$_terms.patterns[i]; + if (pattern.regex) { + const match = pattern.regex.test(key); + state.mainstay.tracer.debug(state, 'rule', `pattern.${i}`, match ? 
'pass' : 'error'); + if (!match) { + continue; + } + } + else { + if (!pattern.schema.$_match(key, state.nest(pattern.schema, `pattern.${i}`), prefs)) { + continue; + } + } + + unprocessed.delete(key); + + const localState = state.localize(path, ancestors, { schema: pattern.rule, key }); + const result = pattern.rule.$_validate(item, localState, prefs); + if (result.errors) { + if (prefs.abortEarly) { + return { value, errors: result.errors }; + } + + errors.push(...result.errors); + } + + if (pattern.matches) { + matches[i].push(key); + } + + value[key] = result.value; + if (!pattern.fallthrough) { + break; + } + } + } + + // Validate pattern matches rules + + if (hasMatches) { + for (let i = 0; i < matches.length; ++i) { + const match = matches[i]; + if (!match) { + continue; + } + + const stpm = schema.$_terms.patterns[i].matches; + const localState = state.localize(state.path, ancestors, stpm); + const result = stpm.$_validate(match, localState, prefs); + if (result.errors) { + const details = Errors.details(result.errors, { override: false }); + details.matches = match; + const report = schema.$_createError('object.pattern.match', value, details, state, prefs); + if (prefs.abortEarly) { + return { value, errors: report }; + } + + errors.push(report); + } + } + } + } + + if (!unprocessed.size || + !schema.$_terms.keys && !schema.$_terms.patterns) { // If no keys or patterns specified, unknown keys allowed + + return; + } + + if (prefs.stripUnknown && !schema._flags.unknown || + prefs.skipFunctions) { + + const stripUnknown = prefs.stripUnknown ? (prefs.stripUnknown === true ? true : !!prefs.stripUnknown.objects) : false; + + for (const key of unprocessed) { + if (stripUnknown) { + delete value[key]; + unprocessed.delete(key); + } + else if (typeof value[key] === 'function') { + unprocessed.delete(key); + } + } + } + + const forbidUnknown = !Common.default(schema._flags.unknown, prefs.allowUnknown); + if (forbidUnknown) { + for (const unprocessedKey of unprocessed) { + const localState = state.localize([...state.path, unprocessedKey], []); + const report = schema.$_createError('object.unknown', value[unprocessedKey], { child: unprocessedKey }, localState, prefs, { flags: false }); + if (prefs.abortEarly) { + return { value, errors: report }; + } + + errors.push(report); + } + } +}; + + +internals.Dependency = class { + + constructor(rel, key, peers, paths) { + + this.rel = rel; + this.key = key; + this.peers = peers; + this.paths = paths; + } + + describe() { + + const desc = { + rel: this.rel, + peers: this.paths + }; + + if (this.key !== null) { + desc.key = this.key.key; + } + + if (this.peers[0].separator !== '.') { + desc.options = { separator: this.peers[0].separator }; + } + + return desc; + } +}; + + +internals.Keys = class extends Array { + + concat(source) { + + const result = this.slice(); + + const keys = new Map(); + for (let i = 0; i < result.length; ++i) { + keys.set(result[i].key, i); + } + + for (const item of source) { + const key = item.key; + const pos = keys.get(key); + if (pos !== undefined) { + result[pos] = { key, schema: result[pos].schema.concat(item.schema) }; + } + else { + result.push(item); + } + } + + return result; + } +}; diff --git a/node_modules/@hapi/joi/lib/types/link.js b/node_modules/@hapi/joi/lib/types/link.js new file mode 100644 index 000000000..d99d0025c --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/link.js @@ -0,0 +1,168 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common 
= require('../common'); +const Compile = require('../compile'); +const Errors = require('../errors'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'link', + + properties: { + schemaChain: true + }, + + terms: { + + link: { init: null, manifest: 'single', register: false } + }, + + args(schema, ref) { + + return schema.ref(ref); + }, + + validate(value, { schema, state, prefs }) { + + Assert(schema.$_terms.link, 'Uninitialized link schema'); + + const linked = internals.generate(schema, value, state, prefs); + const ref = schema.$_terms.link[0].ref; + return linked.$_validate(value, state.nest(linked, `link:${ref.display}:${linked.type}`), prefs); + }, + + generate(schema, value, state, prefs) { + + return internals.generate(schema, value, state, prefs); + }, + + rules: { + + ref: { + method(ref) { + + Assert(!this.$_terms.link, 'Cannot reinitialize schema'); + + ref = Compile.ref(ref); + + Assert(ref.type === 'value' || ref.type === 'local', 'Invalid reference type:', ref.type); + Assert(ref.type === 'local' || ref.ancestor === 'root' || ref.ancestor > 0, 'Link cannot reference itself'); + + const obj = this.clone(); + obj.$_terms.link = [{ ref }]; + return obj; + } + }, + + relative: { + method(enabled = true) { + + return this.$_setFlag('relative', enabled); + } + } + }, + + overrides: { + + concat(source) { + + Assert(this.$_terms.link, 'Uninitialized link schema'); + Assert(Common.isSchema(source), 'Invalid schema object'); + Assert(source.type !== 'link', 'Cannot merge type link with another link'); + + const obj = this.clone(); + + if (!obj.$_terms.whens) { + obj.$_terms.whens = []; + } + + obj.$_terms.whens.push({ concat: source }); + return obj.$_mutateRebuild(); + } + }, + + manifest: { + + build(obj, desc) { + + Assert(desc.link, 'Invalid link description missing link'); + return obj.ref(desc.link); + } + } +}); + + +// Helpers + +internals.generate = function (schema, value, state, prefs) { + + let linked = state.mainstay.links.get(schema); + if (linked) { + return linked._generate(value, state, prefs).schema; + } + + const ref = schema.$_terms.link[0].ref; + const { perspective, path } = internals.perspective(ref, state); + internals.assert(perspective, 'which is outside of schema boundaries', ref, schema, state, prefs); + + try { + linked = path.length ? 
perspective.$_reach(path) : perspective; + } + catch (ignoreErr) { + internals.assert(false, 'to non-existing schema', ref, schema, state, prefs); + } + + internals.assert(linked.type !== 'link', 'which is another link', ref, schema, state, prefs); + + if (!schema._flags.relative) { + state.mainstay.links.set(schema, linked); + } + + return linked._generate(value, state, prefs).schema; +}; + + +internals.perspective = function (ref, state) { + + if (ref.type === 'local') { + for (const { schema, key } of state.schemas) { // From parent to root + const id = schema._flags.id || key; + if (id === ref.path[0]) { + return { perspective: schema, path: ref.path.slice(1) }; + } + + if (schema.$_terms.shared) { + for (const shared of schema.$_terms.shared) { + if (shared._flags.id === ref.path[0]) { + return { perspective: shared, path: ref.path.slice(1) }; + } + } + } + } + + return { perspective: null, path: null }; + } + + if (ref.ancestor === 'root') { + return { perspective: state.schemas[state.schemas.length - 1].schema, path: ref.path }; + } + + return { perspective: state.schemas[ref.ancestor] && state.schemas[ref.ancestor].schema, path: ref.path }; +}; + + +internals.assert = function (condition, message, ref, schema, state, prefs) { + + if (condition) { // Manual check to avoid generating error message on success + return; + } + + Assert(false, `"${Errors.label(schema._flags, state, prefs)}" contains link reference "${ref.display}" ${message}`); +}; diff --git a/node_modules/@hapi/joi/lib/types/number.js b/node_modules/@hapi/joi/lib/types/number.js new file mode 100644 index 000000000..4158dfa9f --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/number.js @@ -0,0 +1,331 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); + + +const internals = { + numberRx: /^\s*[+-]?(?:(?:\d+(?:\.\d*)?)|(?:\.\d+))(?:e([+-]?\d+))?\s*$/i, + precisionRx: /(?:\.(\d+))?(?:[eE]([+-]?\d+))?$/ +}; + + +module.exports = Any.extend({ + + type: 'number', + + flags: { + + unsafe: { default: false } + }, + + coerce: { + from: 'string', + method(value, { schema, error }) { + + const matches = value.match(internals.numberRx); + if (!matches) { + return; + } + + value = value.trim(); + const result = { value: parseFloat(value) }; + + if (result.value === 0) { + result.value = 0; // -0 + } + + if (!schema._flags.unsafe) { + if (value.match(/e/i)) { + const constructed = internals.normalizeExponent(`${result.value / Math.pow(10, matches[1])}e${matches[1]}`); + if (constructed !== internals.normalizeExponent(value)) { + result.errors = error('number.unsafe'); + return result; + } + } + else { + const string = result.value.toString(); + if (string.match(/e/i)) { + return result; + } + + if (string !== internals.normalizeDecimal(value)) { + result.errors = error('number.unsafe'); + return result; + } + } + } + + return result; + } + }, + + validate(value, { schema, error, prefs }) { + + if (value === Infinity || + value === -Infinity) { + + return { value, errors: error('number.infinity') }; + } + + if (!Common.isNumber(value)) { + return { value, errors: error('number.base') }; + } + + const result = { value }; + + if (prefs.convert) { + const rule = schema.$_getRule('precision'); + if (rule) { + const precision = Math.pow(10, rule.args.limit); // This is conceptually equivalent to using toFixed but it should be much faster + result.value = Math.round(result.value * precision) / precision; + } + } + + if (result.value === 0) { + 
result.value = 0; // -0 + } + + if (!schema._flags.unsafe && + (value > Number.MAX_SAFE_INTEGER || value < Number.MIN_SAFE_INTEGER)) { + + result.errors = error('number.unsafe'); + } + + return result; + }, + + rules: { + + compare: { + method: false, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(value, limit, operator)) { + return value; + } + + return helpers.error('number.' + name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.isNumber, + message: 'must be a number' + } + ] + }, + + greater: { + method(limit) { + + return this.$_addRule({ name: 'greater', method: 'compare', args: { limit }, operator: '>' }); + } + }, + + integer: { + method() { + + return this.$_addRule('integer'); + }, + validate(value, helpers) { + + if (Math.trunc(value) - value === 0) { + return value; + } + + return helpers.error('number.integer'); + } + }, + + less: { + method(limit) { + + return this.$_addRule({ name: 'less', method: 'compare', args: { limit }, operator: '<' }); + } + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'compare', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'compare', args: { limit }, operator: '>=' }); + } + }, + + multiple: { + method(base) { + + return this.$_addRule({ name: 'multiple', args: { base } }); + }, + validate(value, helpers, { base }, options) { + + if (value % base === 0) { + return value; + } + + return helpers.error('number.multiple', { multiple: options.args.base, value }); + }, + args: [ + { + name: 'base', + ref: true, + assert: (value) => typeof value === 'number' && isFinite(value) && value > 0, + message: 'must be a positive number' + } + ], + multi: true + }, + + negative: { + method() { + + return this.sign('negative'); + } + }, + + port: { + method() { + + return this.$_addRule('port'); + }, + validate(value, helpers) { + + if (Number.isSafeInteger(value) && + value >= 0 && + value <= 65535) { + + return value; + } + + return helpers.error('number.port'); + } + }, + + positive: { + method() { + + return this.sign('positive'); + } + }, + + precision: { + method(limit) { + + Assert(Number.isSafeInteger(limit), 'limit must be an integer'); + + return this.$_addRule({ name: 'precision', args: { limit } }); + }, + validate(value, helpers, { limit }) { + + const places = value.toString().match(internals.precisionRx); + const decimals = Math.max((places[1] ? places[1].length : 0) - (places[2] ? 
parseInt(places[2], 10) : 0), 0); + if (decimals <= limit) { + return value; + } + + return helpers.error('number.precision', { limit, value }); + }, + convert: true + }, + + sign: { + method(sign) { + + Assert(['negative', 'positive'].includes(sign), 'Invalid sign', sign); + + return this.$_addRule({ name: 'sign', args: { sign } }); + }, + validate(value, helpers, { sign }) { + + if (sign === 'negative' && value < 0 || + sign === 'positive' && value > 0) { + + return value; + } + + return helpers.error(`number.${sign}`); + } + }, + + unsafe: { + method(enabled = true) { + + Assert(typeof enabled === 'boolean', 'enabled must be a boolean'); + + return this.$_setFlag('unsafe', enabled); + } + } + }, + + cast: { + string: { + from: (value) => typeof value === 'number', + to(value, helpers) { + + return value.toString(); + } + } + }, + + messages: { + 'number.base': '{{#label}} must be a number', + 'number.greater': '{{#label}} must be greater than {{#limit}}', + 'number.infinity': '{{#label}} cannot be infinity', + 'number.integer': '{{#label}} must be an integer', + 'number.less': '{{#label}} must be less than {{#limit}}', + 'number.max': '{{#label}} must be less than or equal to {{#limit}}', + 'number.min': '{{#label}} must be larger than or equal to {{#limit}}', + 'number.multiple': '{{#label}} must be a multiple of {{#multiple}}', + 'number.negative': '{{#label}} must be a negative number', + 'number.port': '{{#label}} must be a valid port', + 'number.positive': '{{#label}} must be a positive number', + 'number.precision': '{{#label}} must have no more than {{#limit}} decimal places', + 'number.unsafe': '{{#label}} must be a safe number' + } +}); + + +// Helpers + +internals.normalizeExponent = function (str) { + + return str + .replace(/E/, 'e') + .replace(/\.(\d*[1-9])?0+e/, '.$1e') + .replace(/\.e/, 'e') + .replace(/e\+/, 'e') + .replace(/^\+/, '') + .replace(/^(-?)0+([1-9])/, '$1$2'); +}; + + +internals.normalizeDecimal = function (str) { + + str = str + .replace(/^\+/, '') + .replace(/\.0+$/, '') + .replace(/^(-?)\.([^\.]*)$/, '$10.$2') + .replace(/^(-?)0+([1-9])/, '$1$2'); + + if (str.includes('.') && + str.endsWith('0')) { + + str = str.replace(/0+$/, ''); + } + + if (str === '-0') { + return '0'; + } + + return str; +}; diff --git a/node_modules/@hapi/joi/lib/types/object.js b/node_modules/@hapi/joi/lib/types/object.js new file mode 100644 index 000000000..57046fdd4 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/object.js @@ -0,0 +1,22 @@ +'use strict'; + +const Keys = require('./keys'); + + +const internals = {}; + + +module.exports = Keys.extend({ + + type: 'object', + + cast: { + map: { + from: (value) => value && typeof value === 'object', + to(value, helpers) { + + return new Map(Object.entries(value)); + } + } + } +}); diff --git a/node_modules/@hapi/joi/lib/types/string.js b/node_modules/@hapi/joi/lib/types/string.js new file mode 100644 index 000000000..4f3dc7f1a --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/string.js @@ -0,0 +1,796 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Domain = require('@hapi/address/lib/domain'); +const Email = require('@hapi/address/lib/email'); +const Ip = require('@hapi/address/lib/ip'); +const EscapeRegex = require('@hapi/hoek/lib/escapeRegex'); +const Tlds = require('@hapi/address/lib/tlds'); +const Uri = require('@hapi/address/lib/uri'); + +const Any = require('./any'); +const Common = require('../common'); + + +const internals = { + tlds: Tlds instanceof Set ? 
{ tlds: { allow: Tlds, deny: null } } : false, // $lab:coverage:ignore$ + base64Regex: { + // paddingRequired + true: { + // urlSafe + true: /^(?:[\w\-]{2}[\w\-]{2})*(?:[\w\-]{2}==|[\w\-]{3}=)?$/, + false: /^(?:[A-Za-z0-9+\/]{2}[A-Za-z0-9+\/]{2})*(?:[A-Za-z0-9+\/]{2}==|[A-Za-z0-9+\/]{3}=)?$/ + }, + false: { + true: /^(?:[\w\-]{2}[\w\-]{2})*(?:[\w\-]{2}(==)?|[\w\-]{3}=?)?$/, + false: /^(?:[A-Za-z0-9+\/]{2}[A-Za-z0-9+\/]{2})*(?:[A-Za-z0-9+\/]{2}(==)?|[A-Za-z0-9+\/]{3}=?)?$/ + } + }, + dataUriRegex: /^data:[\w+.-]+\/[\w+.-]+;((charset=[\w-]+|base64),)?(.*)$/, + hexRegex: /^[a-f0-9]+$/i, + ipRegex: Ip.regex().regex, + isoDurationRegex: /^P(?!$)(\d+Y)?(\d+M)?(\d+W)?(\d+D)?(T(?=\d)(\d+H)?(\d+M)?(\d+S)?)?$/, + + guidBrackets: { + '{': '}', '[': ']', '(': ')', '': '' + }, + guidVersions: { + uuidv1: '1', + uuidv2: '2', + uuidv3: '3', + uuidv4: '4', + uuidv5: '5' + }, + + cidrPresences: ['required', 'optional', 'forbidden'], + normalizationForms: ['NFC', 'NFD', 'NFKC', 'NFKD'] +}; + + +module.exports = Any.extend({ + + type: 'string', + + flags: { + + insensitive: { default: false }, + truncate: { default: false } + }, + + terms: { + + replacements: { init: null } + }, + + coerce: { + from: 'string', + method(value, { schema, state, prefs }) { + + const normalize = schema.$_getRule('normalize'); + if (normalize) { + value = value.normalize(normalize.args.form); + } + + const casing = schema.$_getRule('case'); + if (casing) { + value = casing.args.direction === 'upper' ? value.toLocaleUpperCase() : value.toLocaleLowerCase(); + } + + const trim = schema.$_getRule('trim'); + if (trim && + trim.args.enabled) { + + value = value.trim(); + } + + if (schema.$_terms.replacements) { + for (const replacement of schema.$_terms.replacements) { + value = value.replace(replacement.pattern, replacement.replacement); + } + } + + const hex = schema.$_getRule('hex'); + if (hex && + hex.args.options.byteAligned && + value.length % 2 !== 0) { + + value = `0${value}`; + } + + if (schema.$_getRule('isoDate')) { + const iso = internals.isoDate(value); + if (iso) { + value = iso; + } + } + + if (schema._flags.truncate) { + const rule = schema.$_getRule('max'); + if (rule) { + let limit = rule.args.limit; + if (Common.isResolvable(limit)) { + limit = limit.resolve(value, state, prefs); + if (!Common.limit(limit)) { + return { value, errors: schema.$_createError('any.ref', limit, { ref: rule.args.limit, arg: 'limit', reason: 'must be a positive integer' }, state, prefs) }; + } + } + + value = value.slice(0, limit); + } + } + + return { value }; + } + }, + + validate(value, { error }) { + + if (typeof value !== 'string') { + return { value, errors: error('string.base') }; + } + + if (value === '') { + return { value, errors: error('string.empty') }; + } + }, + + rules: { + + alphanum: { + method() { + + return this.$_addRule('alphanum'); + }, + validate(value, helpers) { + + if (/^[a-zA-Z0-9]+$/.test(value)) { + return value; + } + + return helpers.error('string.alphanum'); + } + }, + + base64: { + method(options = {}) { + + Common.assertOptions(options, ['paddingRequired', 'urlSafe']); + + options = { urlSafe: false, paddingRequired: true, ...options }; + Assert(typeof options.paddingRequired === 'boolean', 'paddingRequired must be boolean'); + Assert(typeof options.urlSafe === 'boolean', 'urlSafe must be boolean'); + + return this.$_addRule({ name: 'base64', args: { options } }); + }, + validate(value, helpers, { options }) { + + const regex = internals.base64Regex[options.paddingRequired][options.urlSafe]; + if 
(regex.test(value)) { + return value; + } + + return helpers.error('string.base64'); + } + }, + + case: { + method(direction) { + + Assert(['lower', 'upper'].includes(direction), 'Invalid case:', direction); + + return this.$_addRule({ name: 'case', args: { direction } }); + }, + validate(value, helpers, { direction }) { + + if (direction === 'lower' && value === value.toLocaleLowerCase() || + direction === 'upper' && value === value.toLocaleUpperCase()) { + + return value; + } + + return helpers.error(`string.${direction}case`); + }, + convert: true + }, + + creditCard: { + method() { + + return this.$_addRule('creditCard'); + }, + validate(value, helpers) { + + let i = value.length; + let sum = 0; + let mul = 1; + + while (i--) { + const char = value.charAt(i) * mul; + sum = sum + (char - (char > 9) * 9); + mul = mul ^ 3; + } + + if (sum > 0 && + sum % 10 === 0) { + + return value; + } + + return helpers.error('string.creditCard'); + } + }, + + dataUri: { + method(options = {}) { + + Common.assertOptions(options, ['paddingRequired']); + + options = { paddingRequired: true, ...options }; + Assert(typeof options.paddingRequired === 'boolean', 'paddingRequired must be boolean'); + + return this.$_addRule({ name: 'dataUri', args: { options } }); + }, + validate(value, helpers, { options }) { + + const matches = value.match(internals.dataUriRegex); + + if (matches) { + if (!matches[2]) { + return value; + } + + if (matches[2] !== 'base64') { + return value; + } + + const base64regex = internals.base64Regex[options.paddingRequired].false; + if (base64regex.test(matches[3])) { + return value; + } + } + + return helpers.error('string.dataUri'); + } + }, + + domain: { + method(options) { + + if (options) { + Common.assertOptions(options, ['allowUnicode', 'minDomainSegments', 'tlds']); + } + + const address = internals.addressOptions(options); + return this.$_addRule({ name: 'domain', args: { options }, address }); + }, + validate(value, helpers, args, { address }) { + + if (Domain.isValid(value, address)) { + return value; + } + + return helpers.error('string.domain'); + } + }, + + email: { + method(options = {}) { + + Common.assertOptions(options, ['allowUnicode', 'ignoreLength', 'minDomainSegments', 'multiple', 'separator', 'tlds']); + Assert(options.multiple === undefined || typeof options.multiple === 'boolean', 'multiple option must be an boolean'); + + const address = internals.addressOptions(options); + const regex = new RegExp(`\\s*[${options.separator ? EscapeRegex(options.separator) : ','}]\\s*`); + + return this.$_addRule({ name: 'email', args: { options }, regex, address }); + }, + validate(value, helpers, { options }, { regex, address }) { + + const emails = options.multiple ? 
value.split(regex) : [value]; + const invalids = []; + for (const email of emails) { + if (!Email.isValid(email, address)) { + invalids.push(email); + } + } + + if (!invalids.length) { + return value; + } + + return helpers.error('string.email', { value, invalids }); + } + }, + + guid: { + alias: 'uuid', + method(options = {}) { + + Common.assertOptions(options, ['version']); + + let versionNumbers = ''; + + if (options.version) { + const versions = [].concat(options.version); + + Assert(versions.length >= 1, 'version must have at least 1 valid version specified'); + const set = new Set(); + + for (let i = 0; i < versions.length; ++i) { + const version = versions[i]; + Assert(typeof version === 'string', 'version at position ' + i + ' must be a string'); + const versionNumber = internals.guidVersions[version.toLowerCase()]; + Assert(versionNumber, 'version at position ' + i + ' must be one of ' + Object.keys(internals.guidVersions).join(', ')); + Assert(!set.has(versionNumber), 'version at position ' + i + ' must not be a duplicate'); + + versionNumbers += versionNumber; + set.add(versionNumber); + } + } + + const regex = new RegExp(`^([\\[{\\(]?)[0-9A-F]{8}([:-]?)[0-9A-F]{4}\\2?[${versionNumbers || '0-9A-F'}][0-9A-F]{3}\\2?[${versionNumbers ? '89AB' : '0-9A-F'}][0-9A-F]{3}\\2?[0-9A-F]{12}([\\]}\\)]?)$`, 'i'); + + return this.$_addRule({ name: 'guid', args: { options }, regex }); + }, + validate(value, helpers, args, { regex }) { + + const results = regex.exec(value); + + if (!results) { + return helpers.error('string.guid'); + } + + // Matching braces + + if (internals.guidBrackets[results[1]] !== results[results.length - 1]) { + return helpers.error('string.guid'); + } + + return value; + } + }, + + hex: { + method(options = {}) { + + Common.assertOptions(options, ['byteAligned']); + + options = { byteAligned: false, ...options }; + Assert(typeof options.byteAligned === 'boolean', 'byteAligned must be boolean'); + + return this.$_addRule({ name: 'hex', args: { options } }); + }, + validate(value, helpers, { options }) { + + if (!internals.hexRegex.test(value)) { + return helpers.error('string.hex'); + } + + if (options.byteAligned && + value.length % 2 !== 0) { + + return helpers.error('string.hexAlign'); + } + + return value; + } + }, + + hostname: { + method() { + + return this.$_addRule('hostname'); + }, + validate(value, helpers) { + + if (Domain.isValid(value, { minDomainSegments: 1 }) || + internals.ipRegex.test(value)) { + + return value; + } + + return helpers.error('string.hostname'); + } + }, + + insensitive: { + method() { + + return this.$_setFlag('insensitive', true); + } + }, + + ip: { + method(options = {}) { + + Common.assertOptions(options, ['cidr', 'version']); + + const { cidr, versions, regex } = Ip.regex(options); + const version = options.version ? 
versions : undefined; + return this.$_addRule({ name: 'ip', args: { options: { cidr, version } }, regex }); + }, + validate(value, helpers, { options }, { regex }) { + + if (regex.test(value)) { + return value; + } + + if (options.version) { + return helpers.error('string.ipVersion', { value, cidr: options.cidr, version: options.version }); + } + + return helpers.error('string.ip', { value, cidr: options.cidr }); + } + }, + + isoDate: { + method() { + + return this.$_addRule('isoDate'); + }, + validate(value, { error }) { + + if (internals.isoDate(value)) { + return value; + } + + return error('string.isoDate'); + } + }, + + isoDuration: { + method() { + + return this.$_addRule('isoDuration'); + }, + validate(value, helpers) { + + if (internals.isoDurationRegex.test(value)) { + return value; + } + + return helpers.error('string.isoDuration'); + } + }, + + length: { + method(limit, encoding) { + + return internals.length(this, 'length', limit, '=', encoding); + }, + validate(value, helpers, { limit, encoding }, { name, operator, args }) { + + const length = encoding ? Buffer && Buffer.byteLength(value, encoding) : value.length; // $lab:coverage:ignore$ + if (Common.compare(length, limit, operator)) { + return value; + } + + return helpers.error('string.' + name, { limit: args.limit, value, encoding }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + }, + 'encoding' + ] + }, + + lowercase: { + method() { + + return this.case('lower'); + } + }, + + max: { + method(limit, encoding) { + + return internals.length(this, 'max', limit, '<=', encoding); + }, + args: ['limit', 'encoding'] + }, + + min: { + method(limit, encoding) { + + return internals.length(this, 'min', limit, '>=', encoding); + }, + args: ['limit', 'encoding'] + }, + + normalize: { + method(form = 'NFC') { + + Assert(internals.normalizationForms.includes(form), 'normalization form must be one of ' + internals.normalizationForms.join(', ')); + + return this.$_addRule({ name: 'normalize', args: { form } }); + }, + validate(value, { error }, { form }) { + + if (value === value.normalize(form)) { + return value; + } + + return error('string.normalize', { value, form }); + }, + convert: true + }, + + pattern: { + alias: 'regex', + method(regex, options = {}) { + + Assert(regex instanceof RegExp, 'regex must be a RegExp'); + Assert(!regex.flags.includes('g') && !regex.flags.includes('y'), 'regex should not use global or sticky mode'); + + if (typeof options === 'string') { + options = { name: options }; + } + + Common.assertOptions(options, ['invert', 'name']); + + const errorCode = ['string.pattern', options.invert ? '.invert' : '', options.name ? 
'.name' : '.base'].join(''); + return this.$_addRule({ name: 'pattern', args: { regex, options }, errorCode }); + }, + validate(value, helpers, { regex, options }, { errorCode }) { + + const patternMatch = regex.test(value); + + if (patternMatch ^ options.invert) { + return value; + } + + return helpers.error(errorCode, { name: options.name, regex, value }); + }, + args: ['regex', 'options'], + multi: true + }, + + replace: { + method(pattern, replacement) { + + if (typeof pattern === 'string') { + pattern = new RegExp(EscapeRegex(pattern), 'g'); + } + + Assert(pattern instanceof RegExp, 'pattern must be a RegExp'); + Assert(typeof replacement === 'string', 'replacement must be a String'); + + const obj = this.clone(); + + if (!obj.$_terms.replacements) { + obj.$_terms.replacements = []; + } + + obj.$_terms.replacements.push({ pattern, replacement }); + return obj; + } + }, + + token: { + method() { + + return this.$_addRule('token'); + }, + validate(value, helpers) { + + if (/^\w+$/.test(value)) { + return value; + } + + return helpers.error('string.token'); + } + }, + + trim: { + method(enabled = true) { + + Assert(typeof enabled === 'boolean', 'enabled must be a boolean'); + + return this.$_addRule({ name: 'trim', args: { enabled } }); + }, + validate(value, helpers, { enabled }) { + + if (!enabled || + value === value.trim()) { + + return value; + } + + return helpers.error('string.trim'); + }, + convert: true + }, + + truncate: { + method(enabled = true) { + + Assert(typeof enabled === 'boolean', 'enabled must be a boolean'); + + return this.$_setFlag('truncate', enabled); + } + }, + + uppercase: { + method() { + + return this.case('upper'); + } + }, + + uri: { + method(options = {}) { + + Common.assertOptions(options, ['allowRelative', 'allowQuerySquareBrackets', 'domain', 'relativeOnly', 'scheme']); + + if (options.domain) { + Common.assertOptions(options.domain, ['allowUnicode', 'minDomainSegments', 'tlds']); + } + + const { regex, scheme } = Uri.regex(options); + const domain = options.domain ? 
internals.addressOptions(options.domain) : null; + return this.$_addRule({ name: 'uri', args: { options }, regex, domain, scheme }); + }, + validate(value, helpers, { options }, { regex, domain, scheme }) { + + if (['http:/', 'https:/'].includes(value)) { // scheme:/ is technically valid but makes no sense + return helpers.error('string.uri'); + } + + const match = regex.exec(value); + if (match) { + if (domain) { + const matched = match[1] || match[2]; + if (!Domain.isValid(matched, domain)) { + return helpers.error('string.domain', { value: matched }); + } + } + + return value; + } + + if (options.relativeOnly) { + return helpers.error('string.uriRelativeOnly'); + } + + if (options.scheme) { + return helpers.error('string.uriCustomScheme', { scheme, value }); + } + + return helpers.error('string.uri'); + } + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.replacements) { + for (const { pattern, replacement } of desc.replacements) { + obj = obj.replace(pattern, replacement); + } + } + + return obj; + } + }, + + messages: { + 'string.alphanum': '{{#label}} must only contain alpha-numeric characters', + 'string.base': '{{#label}} must be a string', + 'string.base64': '{{#label}} must be a valid base64 string', + 'string.creditCard': '{{#label}} must be a credit card', + 'string.dataUri': '{{#label}} must be a valid dataUri string', + 'string.domain': '{{#label}} must contain a valid domain name', + 'string.email': '{{#label}} must be a valid email', + 'string.empty': '{{#label}} is not allowed to be empty', + 'string.guid': '{{#label}} must be a valid GUID', + 'string.hex': '{{#label}} must only contain hexadecimal characters', + 'string.hexAlign': '{{#label}} hex decoded representation must be byte aligned', + 'string.hostname': '{{#label}} must be a valid hostname', + 'string.ip': '{{#label}} must be a valid ip address with a {{#cidr}} CIDR', + 'string.ipVersion': '{{#label}} must be a valid ip address of one of the following versions {{#version}} with a {{#cidr}} CIDR', + 'string.isoDate': '{{#label}} must be in iso format', + 'string.isoDuration': '{{#label}} must be a valid ISO 8601 duration', + 'string.length': '{{#label}} length must be {{#limit}} characters long', + 'string.lowercase': '{{#label}} must only contain lowercase characters', + 'string.max': '{{#label}} length must be less than or equal to {{#limit}} characters long', + 'string.min': '{{#label}} length must be at least {{#limit}} characters long', + 'string.normalize': '{{#label}} must be unicode normalized in the {{#form}} form', + 'string.token': '{{#label}} must only contain alpha-numeric and underscore characters', + 'string.pattern.base': '{{#label}} with value "{[.]}" fails to match the required pattern: {{#regex}}', + 'string.pattern.name': '{{#label}} with value "{[.]}" fails to match the {{#name}} pattern', + 'string.pattern.invert.base': '{{#label}} with value "{[.]}" matches the inverted pattern: {{#regex}}', + 'string.pattern.invert.name': '{{#label}} with value "{[.]}" matches the inverted {{#name}} pattern', + 'string.trim': '{{#label}} must not have leading or trailing whitespace', + 'string.uri': '{{#label}} must be a valid uri', + 'string.uriCustomScheme': '{{#label}} must be a valid uri with a scheme matching the {{#scheme}} pattern', + 'string.uriRelativeOnly': '{{#label}} must be a valid relative uri', + 'string.uppercase': '{{#label}} must only contain uppercase characters' + } +}); + + +// Helpers + +internals.addressOptions = function (options) { + + if (!options) { + return options; + } 
+ + // minDomainSegments + + Assert(options.minDomainSegments === undefined || + Number.isSafeInteger(options.minDomainSegments) && options.minDomainSegments > 0, 'minDomainSegments must be a positive integer'); + + // tlds + + if (options.tlds === false) { + return options; + } + + if (options.tlds === true || + options.tlds === undefined) { + + Assert(internals.tlds, 'Built-in TLD list disabled'); + return Object.assign({}, options, internals.tlds); + } + + Assert(typeof options.tlds === 'object', 'tlds must be true, false, or an object'); + + const deny = options.tlds.deny; + if (deny) { + if (Array.isArray(deny)) { + options = Object.assign({}, options, { tlds: { deny: new Set(deny) } }); + } + + Assert(options.tlds.deny instanceof Set, 'tlds.deny must be an array, Set, or boolean'); + Assert(!options.tlds.allow, 'Cannot specify both tlds.allow and tlds.deny lists'); + return options; + } + + const allow = options.tlds.allow; + if (!allow) { + return options; + } + + if (allow === true) { + Assert(internals.tlds, 'Built-in TLD list disabled'); + return Object.assign({}, options, internals.tlds); + } + + if (Array.isArray(allow)) { + options = Object.assign({}, options, { tlds: { allow: new Set(allow) } }); + } + + Assert(options.tlds.allow instanceof Set, 'tlds.allow must be an array, Set, or boolean'); + return options; +}; + + +internals.isoDate = function (value) { + + if (!Common.isIsoDate(value)) { + return null; + } + + const date = new Date(value); + if (isNaN(date.getTime())) { + return null; + } + + return date.toISOString(); +}; + + +internals.length = function (schema, name, limit, operator, encoding) { + + Assert(!encoding || Buffer && Buffer.isEncoding(encoding), 'Invalid encoding:', encoding); // $lab:coverage:ignore$ + + return schema.$_addRule({ name, method: 'length', args: { limit, encoding }, operator }); +}; diff --git a/node_modules/@hapi/joi/lib/types/symbol.js b/node_modules/@hapi/joi/lib/types/symbol.js new file mode 100644 index 000000000..eafe9ae53 --- /dev/null +++ b/node_modules/@hapi/joi/lib/types/symbol.js @@ -0,0 +1,102 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); + + +const internals = {}; + + +internals.Map = class extends Map { + + slice() { + + return new internals.Map(this); + } +}; + + +module.exports = Any.extend({ + + type: 'symbol', + + terms: { + + map: { init: new internals.Map() } + }, + + coerce: { + method(value, { schema, error }) { + + const lookup = schema.$_terms.map.get(value); + if (lookup) { + value = lookup; + } + + if (!schema._flags.only || + typeof value === 'symbol') { + + return { value }; + } + + return { value, errors: error('symbol.map', { map: schema.$_terms.map }) }; + } + }, + + validate(value, { error }) { + + if (typeof value !== 'symbol') { + return { value, errors: error('symbol.base') }; + } + }, + + rules: { + map: { + method(iterable) { + + if (iterable && + !iterable[Symbol.iterator] && + typeof iterable === 'object') { + + iterable = Object.entries(iterable); + } + + Assert(iterable && iterable[Symbol.iterator], 'Iterable must be an iterable or object'); + + const obj = this.clone(); + + const symbols = []; + for (const entry of iterable) { + Assert(entry && entry[Symbol.iterator], 'Entry must be an iterable'); + const [key, value] = entry; + + Assert(typeof key !== 'object' && typeof key !== 'function' && typeof key !== 'symbol', 'Key must not be of type object, function, or Symbol'); + Assert(typeof value === 'symbol', 'Value must be a Symbol'); + + 
obj.$_terms.map.set(key, value); + symbols.push(value); + } + + return obj.valid(...symbols); + } + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.map) { + obj = obj.map(desc.map); + } + + return obj; + } + }, + + messages: { + 'symbol.base': '{{#label}} must be a symbol', + 'symbol.map': '{{#label}} must be one of {{#map}}' + } +}); diff --git a/node_modules/@hapi/joi/lib/validator.js b/node_modules/@hapi/joi/lib/validator.js new file mode 100644 index 000000000..0f635cb8b --- /dev/null +++ b/node_modules/@hapi/joi/lib/validator.js @@ -0,0 +1,617 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const Ignore = require('@hapi/hoek/lib/ignore'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Common = require('./common'); +const Errors = require('./errors'); +const State = require('./state'); + + +const internals = { + result: Symbol('result') +}; + + +exports.entry = function (value, schema, prefs) { + + let settings = Common.defaults; + if (prefs) { + Assert(prefs.warnings === undefined, 'Cannot override warnings preference in synchronous validation'); + settings = Common.preferences(Common.defaults, prefs); + } + + const result = internals.entry(value, schema, settings); + Assert(!result.mainstay.externals.length, 'Schema with external rules must use validateAsync()'); + const outcome = { value: result.value }; + + if (result.error) { + outcome.error = result.error; + } + + if (result.mainstay.warnings.length) { + outcome.warning = Errors.details(result.mainstay.warnings); + } + + if (result.mainstay.debug) { + outcome.debug = result.mainstay.debug; + } + + return outcome; +}; + + +exports.entryAsync = async function (value, schema, prefs) { + + let settings = Common.defaults; + if (prefs) { + settings = Common.preferences(Common.defaults, prefs); + } + + const result = internals.entry(value, schema, settings); + const mainstay = result.mainstay; + if (result.error) { + if (mainstay.debug) { + result.error.debug = mainstay.debug; + } + + throw result.error; + } + + if (mainstay.externals.length) { + let root = result.value; + for (const { method, path, label } of mainstay.externals) { + let node = root; + let key; + let parent; + + if (path.length) { + key = path[path.length - 1]; + parent = Reach(root, path.slice(0, -1)); + node = parent[key]; + } + + try { + const output = await method(node); + if (output === undefined || + output === node) { + + continue; + } + + if (parent) { + parent[key] = output; + } + else { + root = output; + } + } + catch (err) { + err.message += ` (${label})`; // Change message to include path + throw err; + } + } + + result.value = root; + } + + if (!settings.warnings && + !settings.debug) { + + return result.value; + } + + const outcome = { value: result.value }; + if (mainstay.warnings.length) { + outcome.warning = Errors.details(mainstay.warnings); + } + + if (mainstay.debug) { + outcome.debug = mainstay.debug; + } + + return outcome; +}; + + +internals.entry = function (value, schema, prefs) { + + // Prepare state + + const { tracer, cleanup } = internals.tracer(schema, prefs); + const debug = prefs.debug ? [] : null; + const links = schema._ids._schemaChain ? new Map() : null; + const mainstay = { externals: [], warnings: [], tracer, debug, links }; + const schemas = schema._ids._schemaChain ? 
[{ schema }] : null; + const state = new State([], [], { mainstay, schemas }); + + // Validate value + + const result = exports.validate(value, schema, state, prefs); + + // Process value and errors + + if (cleanup) { + schema.$_root.untrace(); + } + + const error = Errors.process(result.errors, value, prefs); + return { value: result.value, error, mainstay }; +}; + + +internals.tracer = function (schema, prefs) { + + if (schema.$_root._tracer) { + return { tracer: schema.$_root._tracer._register(schema) }; + } + + if (prefs.debug) { + Assert(schema.$_root.trace, 'Debug mode not supported'); + return { tracer: schema.$_root.trace()._register(schema), cleanup: true }; + } + + return { tracer: internals.ignore }; +}; + + +exports.validate = function (value, schema, state, prefs, overrides = {}) { + + if (schema.$_terms.whens) { + schema = schema._generate(value, state, prefs).schema; + } + + // Setup state and settings + + if (schema._preferences) { + prefs = internals.prefs(schema, prefs); + } + + // Cache + + if (schema._cache && + prefs.cache) { + + const result = schema._cache.get(value); + state.mainstay.tracer.debug(state, 'validate', 'cached', !!result); + if (result) { + return result; + } + } + + // Helpers + + const createError = (code, local, localState) => schema.$_createError(code, value, local, localState || state, prefs); + const helpers = { + original: value, + prefs, + schema, + state, + error: createError, + warn: (code, local, localState) => state.mainstay.warnings.push(createError(code, local, localState)), + message: (messages, local) => schema.$_createError('custom', value, local, state, prefs, { messages }) + }; + + // Prepare + + state.mainstay.tracer.entry(schema, state); + + const def = schema._definition; + if (def.prepare && + value !== undefined && + prefs.convert) { + + const prepared = def.prepare(value, helpers); + if (prepared) { + state.mainstay.tracer.value(state, 'prepare', value, prepared.value); + if (prepared.errors) { + return internals.finalize(prepared.value, [].concat(prepared.errors), helpers); // Prepare error always aborts early + } + + value = prepared.value; + } + } + + // Type coercion + + if (def.coerce && + value !== undefined && + prefs.convert && + (!def.coerce.from || def.coerce.from.includes(typeof value))) { + + const coerced = def.coerce.method(value, helpers); + if (coerced) { + state.mainstay.tracer.value(state, 'coerced', value, coerced.value); + if (coerced.errors) { + return internals.finalize(coerced.value, [].concat(coerced.errors), helpers); // Coerce error always aborts early + } + + value = coerced.value; + } + } + + // Empty value + + const empty = schema._flags.empty; + if (empty && + empty.$_match(internals.trim(value, schema), state.nest(empty), Common.defaults)) { + + state.mainstay.tracer.value(state, 'empty', value, undefined); + value = undefined; + } + + // Presence requirements (required, optional, forbidden) + + const presence = overrides.presence || schema._flags.presence || (schema._flags._endedSwitch ? 
'ignore' : prefs.presence); + if (value === undefined) { + if (presence === 'forbidden') { + return internals.finalize(value, null, helpers); + } + + if (presence === 'required') { + return internals.finalize(value, [schema.$_createError('any.required', value, null, state, prefs)], helpers); + } + + if (presence === 'optional') { + if (schema._flags.default !== Common.symbols.deepDefault) { + return internals.finalize(value, null, helpers); + } + + state.mainstay.tracer.value(state, 'default', value, {}); + value = {}; + } + } + else if (presence === 'forbidden') { + return internals.finalize(value, [schema.$_createError('any.unknown', value, null, state, prefs)], helpers); + } + + // Allowed values + + const errors = []; + + if (schema._valids) { + const match = schema._valids.get(value, state, prefs, schema._flags.insensitive); + if (match) { + if (prefs.convert) { + state.mainstay.tracer.value(state, 'valids', value, match.value); + value = match.value; + } + + state.mainstay.tracer.filter(schema, state, 'valid', match); + return internals.finalize(value, null, helpers); + } + + if (schema._flags.only) { + const report = schema.$_createError('any.only', value, { valids: schema._valids.values({ display: true }) }, state, prefs); + if (prefs.abortEarly) { + return internals.finalize(value, [report], helpers); + } + + errors.push(report); + } + } + + // Denied values + + if (schema._invalids) { + const match = schema._invalids.get(value, state, prefs, schema._flags.insensitive); + if (match) { + state.mainstay.tracer.filter(schema, state, 'invalid', match); + const report = schema.$_createError('any.invalid', value, { invalids: schema._invalids.values({ display: true }) }, state, prefs); + if (prefs.abortEarly) { + return internals.finalize(value, [report], helpers); + } + + errors.push(report); + } + } + + // Base type + + if (def.validate) { + const base = def.validate(value, helpers); + if (base) { + state.mainstay.tracer.value(state, 'base', value, base.value); + value = base.value; + + if (base.errors) { + if (!Array.isArray(base.errors)) { + errors.push(base.errors); + return internals.finalize(value, errors, helpers); // Base error always aborts early + } + + if (base.errors.length) { + errors.push(...base.errors); + return internals.finalize(value, errors, helpers); // Base error always aborts early + } + } + } + } + + // Validate tests + + if (!schema._rules.length) { + return internals.finalize(value, errors, helpers); + } + + return internals.rules(value, errors, helpers); +}; + + +internals.rules = function (value, errors, helpers) { + + const { schema, state, prefs } = helpers; + + for (const rule of schema._rules) { + const definition = schema._definition.rules[rule.method]; + + // Skip rules that are also applied in coerce step + + if (definition.convert && + prefs.convert) { + + state.mainstay.tracer.log(schema, state, 'rule', rule.name, 'full'); + continue; + } + + // Resolve references + + let ret; + let args = rule.args; + if (rule._resolve.length) { + args = Object.assign({}, args); // Shallow copy + for (const key of rule._resolve) { + const resolver = definition.argsByName.get(key); + + const resolved = args[key].resolve(value, state, prefs); + const normalized = resolver.normalize ? 
resolver.normalize(resolved) : resolved; + + const invalid = Common.validateArg(normalized, null, resolver); + if (invalid) { + ret = schema.$_createError('any.ref', resolved, { arg: key, ref: args[key], reason: invalid }, state, prefs); + break; + } + + args[key] = normalized; + } + } + + // Test rule + + ret = ret || definition.validate(value, helpers, args, rule); // Use ret if already set to reference error + + const result = internals.rule(ret, rule); + if (result.errors) { + state.mainstay.tracer.log(schema, state, 'rule', rule.name, 'error'); + + if (rule.warn) { + state.mainstay.warnings.push(...result.errors); + continue; + } + + if (prefs.abortEarly) { + return internals.finalize(value, result.errors, helpers); + } + + errors.push(...result.errors); + } + else { + state.mainstay.tracer.log(schema, state, 'rule', rule.name, 'pass'); + state.mainstay.tracer.value(state, 'rule', value, result.value, rule.name); + value = result.value; + } + } + + return internals.finalize(value, errors, helpers); +}; + + +internals.rule = function (ret, rule) { + + if (ret instanceof Errors.Report) { + internals.error(ret, rule); + return { errors: [ret], value: null }; + } + + if (Array.isArray(ret) && + (ret[0] instanceof Errors.Report || ret[0] instanceof Error)) { + + ret.forEach((report) => internals.error(report, rule)); + return { errors: ret, value: null }; + } + + return { errors: null, value: ret }; +}; + + +internals.error = function (report, rule) { + + if (rule.message) { + report._setTemplate(rule.message); + } + + return report; +}; + + +internals.finalize = function (value, errors, helpers) { + + errors = errors || []; + const { schema, state, prefs } = helpers; + + // Failover value + + if (errors.length) { + const failover = internals.default('failover', undefined, errors, helpers); + if (failover !== undefined) { + state.mainstay.tracer.value(state, 'failover', value, failover); + value = failover; + errors = []; + } + } + + // Error override + + if (errors.length && + schema._flags.error) { + + if (typeof schema._flags.error === 'function') { + errors = schema._flags.error(errors); + if (!Array.isArray(errors)) { + errors = [errors]; + } + + for (const error of errors) { + Assert(error instanceof Error || error instanceof Errors.Report, 'error() must return an Error object'); + } + } + else { + errors = [schema._flags.error]; + } + } + + // Default + + if (value === undefined) { + const defaulted = internals.default('default', value, errors, helpers); + state.mainstay.tracer.value(state, 'default', value, defaulted); + value = defaulted; + } + + // Cast + + if (schema._flags.cast && + value !== undefined) { + + const caster = schema._definition.cast[schema._flags.cast]; + if (caster.from(value)) { + const casted = caster.to(value, helpers); + state.mainstay.tracer.value(state, 'cast', value, casted, schema._flags.cast); + value = casted; + } + } + + // Externals + + if (schema.$_terms.externals && + prefs.externals && + prefs._externals !== false) { // Disabled for matching + + for (const { method } of schema.$_terms.externals) { + state.mainstay.externals.push({ method, path: state.path, label: Errors.label(schema._flags, state, prefs) }); + } + } + + // Result + + const result = { value, errors: errors.length ? errors : null }; + + if (schema._flags.result) { + result.value = schema._flags.result === 'strip' ? 
undefined : /* raw */ helpers.original; + state.mainstay.tracer.value(state, schema._flags.result, value, result.value); + state.shadow(value, schema._flags.result); + } + + // Cache + + if (schema._cache && + prefs.cache !== false && + !schema._refs.length) { + + schema._cache.set(helpers.original, result); + } + + return result; +}; + + +internals.prefs = function (schema, prefs) { + + const isDefaultOptions = prefs === Common.defaults; + if (isDefaultOptions && + schema._preferences[Common.symbols.prefs]) { + + return schema._preferences[Common.symbols.prefs]; + } + + prefs = Common.preferences(prefs, schema._preferences); + if (isDefaultOptions) { + schema._preferences[Common.symbols.prefs] = prefs; + } + + return prefs; +}; + + +internals.default = function (flag, value, errors, helpers) { + + const { schema, state, prefs } = helpers; + const source = schema._flags[flag]; + if (prefs.noDefaults || + source === undefined) { + + return value; + } + + state.mainstay.tracer.log(schema, state, 'rule', flag, 'full'); + + if (!source) { + return source; + } + + if (typeof source === 'function') { + const args = source.length ? [Clone(state.ancestors[0]), helpers] : []; + + try { + return source(...args); + } + catch (err) { + errors.push(schema.$_createError(`any.${flag}`, null, { error: err }, state, prefs)); + return; + } + } + + if (typeof source !== 'object') { + return source; + } + + if (source[Common.symbols.literal]) { + return source.literal; + } + + if (Common.isResolvable(source)) { + return source.resolve(value, state, prefs); + } + + return Clone(source); +}; + + +internals.trim = function (value, schema) { + + if (typeof value !== 'string') { + return value; + } + + const trim = schema.$_getRule('trim'); + if (!trim || + !trim.args.enabled) { + + return value; + } + + return value.trim(); +}; + + +internals.ignore = { + active: false, + debug: Ignore, + entry: Ignore, + filter: Ignore, + log: Ignore, + resolve: Ignore, + value: Ignore +}; diff --git a/node_modules/@hapi/joi/lib/values.js b/node_modules/@hapi/joi/lib/values.js new file mode 100644 index 000000000..bfdb90b41 --- /dev/null +++ b/node_modules/@hapi/joi/lib/values.js @@ -0,0 +1,263 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); + +const Common = require('./common'); + + +const internals = {}; + + +module.exports = internals.Values = class { + + constructor(values, refs) { + + this._values = new Set(values); + this._refs = new Set(refs); + this._lowercase = internals.lowercases(values); + + this._override = false; + } + + get length() { + + return this._values.size + this._refs.size; + } + + add(value, refs) { + + // Reference + + if (Common.isResolvable(value)) { + if (!this._refs.has(value)) { + this._refs.add(value); + + if (refs) { // Skipped in a merge + refs.register(value); + } + } + + return; + } + + // Value + + if (!this.has(value, null, null, false)) { + this._values.add(value); + + if (typeof value === 'string') { + this._lowercase.set(value.toLowerCase(), value); + } + } + } + + static merge(target, source, remove) { + + target = target || new internals.Values(); + + if (source) { + if (source._override) { + return source.clone(); + } + + for (const item of [...source._values, ...source._refs]) { + target.add(item); + } + } + + if (remove) { + for (const item of [...remove._values, ...remove._refs]) { + target.remove(item); + } + } + + return target.length ? 
target : null; + } + + remove(value) { + + // Reference + + if (Common.isResolvable(value)) { + this._refs.delete(value); + return; + } + + // Value + + this._values.delete(value); + + if (typeof value === 'string') { + this._lowercase.delete(value.toLowerCase()); + } + } + + has(value, state, prefs, insensitive) { + + return !!this.get(value, state, prefs, insensitive); + } + + get(value, state, prefs, insensitive) { + + if (!this.length) { + return false; + } + + // Simple match + + if (this._values.has(value)) { + return { value }; + } + + // Case insensitive string match + + if (typeof value === 'string' && + value && + insensitive) { + + const found = this._lowercase.get(value.toLowerCase()); + if (found) { + return { value: found }; + } + } + + if (!this._refs.size && + typeof value !== 'object') { + + return false; + } + + // Objects + + if (typeof value === 'object') { + for (const item of this._values) { + if (DeepEqual(item, value)) { + return { value: item }; + } + } + } + + // References + + if (state) { + for (const ref of this._refs) { + const resolved = ref.resolve(value, state, prefs, null, { in: true }); + if (resolved === undefined) { + continue; + } + + const items = !ref.in || typeof resolved !== 'object' + ? [resolved] + : Array.isArray(resolved) ? resolved : Object.keys(resolved); + + for (const item of items) { + if (typeof item !== typeof value) { + continue; + } + + if (insensitive && + value && + typeof value === 'string') { + + if (item.toLowerCase() === value.toLowerCase()) { + return { value: item, ref }; + } + } + else { + if (DeepEqual(item, value)) { + return { value: item, ref }; + } + } + } + } + } + + return false; + } + + override() { + + this._override = true; + } + + values(options) { + + if (options && + options.display) { + + const values = []; + + for (const item of [...this._values, ...this._refs]) { + if (item !== undefined) { + values.push(item); + } + } + + return values; + } + + return Array.from([...this._values, ...this._refs]); + } + + clone() { + + const set = new internals.Values(this._values, this._refs); + set._override = this._override; + return set; + } + + concat(source) { + + Assert(!source._override, 'Cannot concat override set of values'); + + const set = new internals.Values([...this._values, ...source._values], [...this._refs, ...source._refs]); + set._override = this._override; + return set; + } + + describe() { + + const normalized = []; + + if (this._override) { + normalized.push({ override: true }); + } + + for (const value of this._values.values()) { + normalized.push(value && typeof value === 'object' ? 
{ value } : value); + } + + for (const value of this._refs.values()) { + normalized.push(value.describe()); + } + + return normalized; + } +}; + + +internals.Values.prototype[Common.symbols.values] = true; + + +// Aliases + +internals.Values.prototype.slice = internals.Values.prototype.clone; + + +// Helpers + +internals.lowercases = function (from) { + + const map = new Map(); + + if (from) { + for (const value of from) { + if (typeof value === 'string') { + map.set(value.toLowerCase(), value); + } + } + } + + return map; +}; diff --git a/node_modules/@hapi/joi/package.json b/node_modules/@hapi/joi/package.json new file mode 100644 index 000000000..275111cf9 --- /dev/null +++ b/node_modules/@hapi/joi/package.json @@ -0,0 +1,69 @@ +{ + "_from": "@hapi/joi", + "_id": "@hapi/joi@17.1.1", + "_inBundle": false, + "_integrity": "sha512-p4DKeZAoeZW4g3u7ZeRo+vCDuSDgSvtsB/NpfjXEHTUjSeINAi/RrVOWiVQ1isaoLzMvFEhe8n5065mQq1AdQg==", + "_location": "/@hapi/joi", + "_phantomChildren": {}, + "_requested": { + "type": "tag", + "registry": true, + "raw": "@hapi/joi", + "name": "@hapi/joi", + "escapedName": "@hapi%2fjoi", + "scope": "@hapi", + "rawSpec": "", + "saveSpec": null, + "fetchSpec": "latest" + }, + "_requiredBy": [ + "#USER", + "/" + ], + "_resolved": "https://registry.npmjs.org/@hapi/joi/-/joi-17.1.1.tgz", + "_shasum": "9cc8d7e2c2213d1e46708c6260184b447c661350", + "_spec": "@hapi/joi", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project", + "browser": "dist/joi-browser.min.js", + "bugs": { + "url": "https://github.com/hapijs/joi/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@hapi/address": "^4.0.1", + "@hapi/formula": "^2.0.0", + "@hapi/hoek": "^9.0.0", + "@hapi/pinpoint": "^2.0.0", + "@hapi/topo": "^5.0.0" + }, + "deprecated": "Switch to 'npm install joi'", + "description": "Object schema validation", + "devDependencies": { + "@hapi/bourne": "2.x.x", + "@hapi/code": "8.x.x", + "@hapi/joi-legacy-test": "npm:@hapi/joi@15.x.x", + "@hapi/lab": "22.x.x" + }, + "files": [ + "lib/**/*", + "dist/*" + ], + "homepage": "https://github.com/hapijs/joi#readme", + "keywords": [ + "schema", + "validation" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@hapi/joi", + "repository": { + "type": "git", + "url": "git://github.com/hapijs/joi.git" + }, + "scripts": { + "prepublishOnly": "cd browser && npm install && npm run build", + "test": "lab -t 100 -a @hapi/code -L", + "test-cov-html": "lab -r html -o coverage.html -a @hapi/code" + }, + "version": "17.1.1" +} diff --git a/node_modules/@hapi/pinpoint/LICENSE.md b/node_modules/@hapi/pinpoint/LICENSE.md new file mode 100644 index 000000000..a7028c4af --- /dev/null +++ b/node_modules/@hapi/pinpoint/LICENSE.md @@ -0,0 +1,10 @@ +Copyright (c) 2019-2020, Sideway Inc, and project contributors + +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: +* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. +* The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. 
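
The hunks above vendor @hapi/joi 17.1.1, whose lib/validator.js provides the two validation entry points (exports.entry and exports.entryAsync). A minimal sketch of how those paths are reached from the module's public API; the schema keys, limits, and external rule here are illustrative assumptions, not taken from the project:

'use strict';

const Joi = require('@hapi/joi');

// Illustrative schema; the keys and limits are arbitrary.
const schema = Joi.object({
    name: Joi.string().min(2).required(),
    count: Joi.number().integer().min(0)
});

// Synchronous path (Validator.entry above): returns { value, error } rather than throwing.
const result = schema.validate({ name: 'x', count: 3 });
if (result.error) {
    console.log(result.error.details.map((detail) => detail.message));
}
else {
    console.log(result.value);
}

// Asynchronous path (Validator.entryAsync above): required once external rules are attached,
// since entry() asserts 'Schema with external rules must use validateAsync()'.
const trimmed = Joi.string().external(async (input) => input.trim());
trimmed.validateAsync('  padded  ').then((clean) => console.log(clean));
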
+ +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS OFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@hapi/pinpoint/README.md b/node_modules/@hapi/pinpoint/README.md new file mode 100644 index 000000000..d485bd2f8 --- /dev/null +++ b/node_modules/@hapi/pinpoint/README.md @@ -0,0 +1,17 @@ + + +# @hapi/pinpoint + +#### Return the filename and line number of the calling function. + +**pinpoint** is part of the **hapi** ecosystem and was designed to work seamlessly with the [hapi web framework](https://hapi.dev) and its other components (but works great on its own or with other frameworks). If you are using a different web framework and find this module useful, check out [hapi](https://hapi.dev) – they work even better together. + +### Visit the [hapi.dev](https://hapi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://hapi.dev/family/pinpoint/) +- [Version status](https://hapi.dev/resources/status/#pinpoint) (builds, dependencies, node versions, licenses, eol) +- [Changelog](https://hapi.dev/family/pinpoint/changelog/) +- [Project policies](https://hapi.dev/policies/) +- [Free and commercial support options](https://hapi.dev/support/) diff --git a/node_modules/@hapi/pinpoint/lib/index.d.ts b/node_modules/@hapi/pinpoint/lib/index.d.ts new file mode 100644 index 000000000..38fadaa3f --- /dev/null +++ b/node_modules/@hapi/pinpoint/lib/index.d.ts @@ -0,0 +1,24 @@ +/** +Returns the filename and line number of the caller in the call stack + +@param depth - The distance from the location function in the call stack. Defaults to 1 (caller). + +@return an object with the filename and line number. +*/ +export function location(depth?: number): location.Location; + +declare namespace location { + + interface Location { + + /** + The fully qualified filename. + */ + readonly filename: string; + + /** + The file line number. 
+ */ + readonly line: number; + } +} diff --git a/node_modules/@hapi/pinpoint/lib/index.js b/node_modules/@hapi/pinpoint/lib/index.js new file mode 100644 index 000000000..482055097 --- /dev/null +++ b/node_modules/@hapi/pinpoint/lib/index.js @@ -0,0 +1,21 @@ +'use strict'; + +const internals = {}; + + +exports.location = function (depth = 0) { + + const orig = Error.prepareStackTrace; + Error.prepareStackTrace = (ignore, stack) => stack; + + const capture = {}; + Error.captureStackTrace(capture, this); + const line = capture.stack[depth + 1]; + + Error.prepareStackTrace = orig; + + return { + filename: line.getFileName(), + line: line.getLineNumber() + }; +}; diff --git a/node_modules/@hapi/pinpoint/package.json b/node_modules/@hapi/pinpoint/package.json new file mode 100644 index 000000000..3560c2161 --- /dev/null +++ b/node_modules/@hapi/pinpoint/package.json @@ -0,0 +1,57 @@ +{ + "_from": "@hapi/pinpoint@^2.0.0", + "_id": "@hapi/pinpoint@2.0.0", + "_inBundle": false, + "_integrity": "sha512-vzXR5MY7n4XeIvLpfl3HtE3coZYO4raKXW766R6DZw/6aLqR26iuZ109K7a0NtF2Db0jxqh7xz2AxkUwpUFybw==", + "_location": "/@hapi/pinpoint", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@hapi/pinpoint@^2.0.0", + "name": "@hapi/pinpoint", + "escapedName": "@hapi%2fpinpoint", + "scope": "@hapi", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/@hapi/joi" + ], + "_resolved": "https://registry.npmjs.org/@hapi/pinpoint/-/pinpoint-2.0.0.tgz", + "_shasum": "805b40d4dbec04fc116a73089494e00f073de8df", + "_spec": "@hapi/pinpoint@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\@hapi\\joi", + "bugs": { + "url": "https://github.com/hapijs/pinpoint/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": "Moved to 'npm install @sideway/pinpoint'", + "description": "Return the filename and line number of the calling function", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/lab": "22.x.x" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/hapijs/pinpoint#readme", + "keywords": [ + "utilities" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@hapi/pinpoint", + "repository": { + "type": "git", + "url": "git://github.com/hapijs/pinpoint.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "2.0.0" +} diff --git a/node_modules/@hapi/topo/LICENSE.md b/node_modules/@hapi/topo/LICENSE.md new file mode 100644 index 000000000..0d96bf8fd --- /dev/null +++ b/node_modules/@hapi/topo/LICENSE.md @@ -0,0 +1,10 @@ +Copyright (c) 2012-2020, Sideway Inc, and project contributors +Copyright (c) 2012-2014, Walmart. +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: +* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. +* The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. 
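
For the @hapi/pinpoint files added above, lib/index.js exposes a single location(depth) helper that returns { filename, line }. A short sketch of calling it; the wrapper function is a made-up illustration:

'use strict';

const Pinpoint = require('@hapi/pinpoint');

// depth defaults to 0 in lib/index.js above, i.e. the immediate call site.
const here = Pinpoint.location();
console.log(`${here.filename}:${here.line}`);

// Hypothetical wrapper: depth 1 skips the wrapper's own frame and reports its caller instead.
const whoCalledMe = () => Pinpoint.location(1);
console.log(whoCalledMe());
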
+ +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS OFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@hapi/topo/README.md b/node_modules/@hapi/topo/README.md new file mode 100644 index 000000000..118bacba6 --- /dev/null +++ b/node_modules/@hapi/topo/README.md @@ -0,0 +1,17 @@ + + +# @hapi/topo + +#### Topological sorting with grouping support. + +**topo** is part of the **hapi** ecosystem and was designed to work seamlessly with the [hapi web framework](https://hapi.dev) and its other components (but works great on its own or with other frameworks). If you are using a different web framework and find this module useful, check out [hapi](https://hapi.dev) – they work even better together. + +### Visit the [hapi.dev](https://hapi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://hapi.dev/family/topo/) +- [Version status](https://hapi.dev/resources/status/#topo) (builds, dependencies, node versions, licenses, eol) +- [Changelog](https://hapi.dev/family/topo/changelog/) +- [Project policies](https://hapi.dev/policies/) +- [Free and commercial support options](https://hapi.dev/support/) diff --git a/node_modules/@hapi/topo/lib/index.d.ts b/node_modules/@hapi/topo/lib/index.d.ts new file mode 100644 index 000000000..e3367befe --- /dev/null +++ b/node_modules/@hapi/topo/lib/index.d.ts @@ -0,0 +1,60 @@ +export class Sorter { + + /** + * An array of the topologically sorted nodes. This list is renewed upon each call to topo.add(). + */ + nodes: T[]; + + /** + * Adds a node or list of nodes to be added and topologically sorted + * + * @param nodes - A mixed value or array of mixed values to be added as nodes to the topologically sorted list. + * @param options - Optional sorting information about the nodes. + * + * @returns Returns an array of the topologically sorted nodes. + */ + add(nodes: T | T[], options?: Options): T[]; + + /** + * Merges another Sorter object into the current object. + * + * @param others - The other object or array of objects to be merged into the current one. + * + * @returns Returns an array of the topologically sorted nodes. 
+ */ + merge(others: Sorter | Sorter[]): T[]; + + /** + * Sorts the nodes array (only required if the manual option is used when adding items) + */ + sort(): T[]; +} + + +export interface Options { + + /** + * The sorting group the added items belong to + */ + readonly group?: string; + + /** + * The group or groups the added items must come before + */ + readonly before?: string | string[]; + + /** + * The group or groups the added items must come after + */ + readonly after?: string | string[]; + + /** + * A number used to sort items with equal before/after requirements + */ + readonly sort?: number; + + /** + * If true, the array is not sorted automatically until sort() is called + */ + readonly manual?: boolean; +} diff --git a/node_modules/@hapi/topo/lib/index.js b/node_modules/@hapi/topo/lib/index.js new file mode 100644 index 000000000..48c19dd21 --- /dev/null +++ b/node_modules/@hapi/topo/lib/index.js @@ -0,0 +1,225 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + + +const internals = {}; + + +exports.Sorter = class { + + constructor() { + + this._items = []; + this.nodes = []; + } + + add(nodes, options) { + + options = options || {}; + + // Validate rules + + const before = [].concat(options.before || []); + const after = [].concat(options.after || []); + const group = options.group || '?'; + const sort = options.sort || 0; // Used for merging only + + Assert(!before.includes(group), `Item cannot come before itself: ${group}`); + Assert(!before.includes('?'), 'Item cannot come before unassociated items'); + Assert(!after.includes(group), `Item cannot come after itself: ${group}`); + Assert(!after.includes('?'), 'Item cannot come after unassociated items'); + + if (!Array.isArray(nodes)) { + nodes = [nodes]; + } + + for (const node of nodes) { + const item = { + seq: this._items.length, + sort, + before, + after, + group, + node + }; + + this._items.push(item); + } + + // Insert event + + if (!options.manual) { + const valid = this._sort(); + Assert(valid, 'item', group !== '?' ? 
`added into group ${group}` : '', 'created a dependencies error'); + } + + return this.nodes; + } + + merge(others) { + + if (!Array.isArray(others)) { + others = [others]; + } + + for (const other of others) { + if (other) { + for (const item of other._items) { + this._items.push(Object.assign({}, item)); // Shallow cloned + } + } + } + + // Sort items + + this._items.sort(internals.mergeSort); + for (let i = 0; i < this._items.length; ++i) { + this._items[i].seq = i; + } + + const valid = this._sort(); + Assert(valid, 'merge created a dependencies error'); + + return this.nodes; + } + + sort() { + + const valid = this._sort(); + Assert(valid, 'sort created a dependencies error'); + + return this.nodes; + } + + _sort() { + + // Construct graph + + const graph = {}; + const graphAfters = Object.create(null); // A prototype can bungle lookups w/ false positives + const groups = Object.create(null); + + for (const item of this._items) { + const seq = item.seq; // Unique across all items + const group = item.group; + + // Determine Groups + + groups[group] = groups[group] || []; + groups[group].push(seq); + + // Build intermediary graph using 'before' + + graph[seq] = item.before; + + // Build second intermediary graph with 'after' + + for (const after of item.after) { + graphAfters[after] = graphAfters[after] || []; + graphAfters[after].push(seq); + } + } + + // Expand intermediary graph + + for (const node in graph) { + const expandedGroups = []; + + for (const graphNodeItem in graph[node]) { + const group = graph[node][graphNodeItem]; + groups[group] = groups[group] || []; + expandedGroups.push(...groups[group]); + } + + graph[node] = expandedGroups; + } + + // Merge intermediary graph using graphAfters into final graph + + for (const group in graphAfters) { + if (groups[group]) { + for (const node of groups[group]) { + graph[node].push(...graphAfters[group]); + } + } + } + + // Compile ancestors + + const ancestors = {}; + for (const node in graph) { + const children = graph[node]; + for (const child of children) { + ancestors[child] = ancestors[child] || []; + ancestors[child].push(node); + } + } + + // Topo sort + + const visited = {}; + const sorted = []; + + for (let i = 0; i < this._items.length; ++i) { // Looping through item.seq values out of order + let next = i; + + if (ancestors[i]) { + next = null; + for (let j = 0; j < this._items.length; ++j) { // As above, these are item.seq values + if (visited[j] === true) { + continue; + } + + if (!ancestors[j]) { + ancestors[j] = []; + } + + const shouldSeeCount = ancestors[j].length; + let seenCount = 0; + for (let k = 0; k < shouldSeeCount; ++k) { + if (visited[ancestors[j][k]]) { + ++seenCount; + } + } + + if (seenCount === shouldSeeCount) { + next = j; + break; + } + } + } + + if (next !== null) { + visited[next] = true; + sorted.push(next); + } + } + + if (sorted.length !== this._items.length) { + return false; + } + + const seqIndex = {}; + for (const item of this._items) { + seqIndex[item.seq] = item; + } + + this._items = []; + this.nodes = []; + + for (const value of sorted) { + const sortedItem = seqIndex[value]; + this.nodes.push(sortedItem.node); + this._items.push(sortedItem); + } + + return true; + } +}; + + +internals.mergeSort = (a, b) => { + + return a.sort === b.sort ? 0 : (a.sort < b.sort ? 
-1 : 1); +}; diff --git a/node_modules/@hapi/topo/package.json b/node_modules/@hapi/topo/package.json new file mode 100644 index 000000000..d5071fccc --- /dev/null +++ b/node_modules/@hapi/topo/package.json @@ -0,0 +1,63 @@ +{ + "_from": "@hapi/topo@^5.0.0", + "_id": "@hapi/topo@5.1.0", + "_inBundle": false, + "_integrity": "sha512-foQZKJig7Ob0BMAYBfcJk8d77QtOe7Wo4ox7ff1lQYoNNAb6jwcY1ncdoy2e9wQZzvNy7ODZCYJkK8kzmcAnAg==", + "_location": "/@hapi/topo", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@hapi/topo@^5.0.0", + "name": "@hapi/topo", + "escapedName": "@hapi%2ftopo", + "scope": "@hapi", + "rawSpec": "^5.0.0", + "saveSpec": null, + "fetchSpec": "^5.0.0" + }, + "_requiredBy": [ + "/joi" + ], + "_resolved": "https://registry.npmjs.org/@hapi/topo/-/topo-5.1.0.tgz", + "_shasum": "dc448e332c6c6e37a4dc02fd84ba8d44b9afb012", + "_spec": "@hapi/topo@^5.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\joi", + "bugs": { + "url": "https://github.com/hapijs/topo/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@hapi/hoek": "^9.0.0" + }, + "deprecated": false, + "description": "Topological sorting with grouping support", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/lab": "24.x.x", + "typescript": "~4.0.2" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/hapijs/topo#readme", + "keywords": [ + "topological", + "sort", + "toposort", + "topsort" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@hapi/topo", + "repository": { + "type": "git", + "url": "git://github.com/hapijs/topo.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "5.1.0" +} diff --git a/node_modules/@sideway/address/LICENSE.md b/node_modules/@sideway/address/LICENSE.md new file mode 100644 index 000000000..0c6e658d5 --- /dev/null +++ b/node_modules/@sideway/address/LICENSE.md @@ -0,0 +1,9 @@ +Copyright (c) 2019-2020, Sideway, Inc. and Project contributors +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + * The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. 
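
The @hapi/topo files above define the Sorter used for topological ordering with groups. A brief sketch of its add()/nodes API, traced against the _sort() implementation shown; the item and group names are illustrative:

'use strict';

const Topo = require('@hapi/topo');

const morning = new Topo.Sorter();

morning.add('Nap', { after: ['breakfast', 'prep'] });
morning.add(['Make toast', 'Pour juice'], { before: 'serve', group: 'prep' });
morning.add('Eat breakfast', { group: 'breakfast' });

// Each add() re-sorts immediately unless { manual: true } is passed (then call morning.sort()).
console.log(morning.nodes); // ['Make toast', 'Pour juice', 'Eat breakfast', 'Nap']
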
IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@sideway/address/README.md b/node_modules/@sideway/address/README.md new file mode 100644 index 000000000..c26895fea --- /dev/null +++ b/node_modules/@sideway/address/README.md @@ -0,0 +1,14 @@ +# @sideway/address + +#### Validate email address and domain. + +**address** is part of the **joi** ecosystem. + +### Visit the [joi.dev](https://joi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://joi.dev/module/address/) +- [Versions status](https://joi.dev/resources/status/#address) +- [Changelog](https://joi.dev/module/address/changelog/) +- [Project policies](https://joi.dev/policies/) diff --git a/node_modules/@sideway/address/lib/decode.js b/node_modules/@sideway/address/lib/decode.js new file mode 100644 index 000000000..06a123694 --- /dev/null +++ b/node_modules/@sideway/address/lib/decode.js @@ -0,0 +1,120 @@ +'use strict'; + +// Adapted from: +// Copyright (c) 2017-2019 Justin Ridgewell, MIT Licensed, https://github.com/jridgewell/safe-decode-string-component +// Copyright (c) 2008-2009 Bjoern Hoehrmann , MIT Licensed, http://bjoern.hoehrmann.de/utf-8/decoder/dfa/ + + +const internals = {}; + + +exports.decode = function (string) { + + let percentPos = string.indexOf('%'); + if (percentPos === -1) { + return string; + } + + let decoded = ''; + let last = 0; + let codepoint = 0; + let startOfOctets = percentPos; + let state = internals.utf8.accept; + + while (percentPos > -1 && + percentPos < string.length) { + + const high = internals.resolveHex(string[percentPos + 1], 4); + const low = internals.resolveHex(string[percentPos + 2], 0); + const byte = high | low; + const type = internals.utf8.data[byte]; + state = internals.utf8.data[256 + state + type]; + codepoint = (codepoint << 6) | (byte & internals.utf8.data[364 + type]); + + if (state === internals.utf8.accept) { + decoded += string.slice(last, startOfOctets); + decoded += codepoint <= 0xFFFF + ? String.fromCharCode(codepoint) + : String.fromCharCode(0xD7C0 + (codepoint >> 10), 0xDC00 + (codepoint & 0x3FF)); + + codepoint = 0; + last = percentPos + 3; + percentPos = string.indexOf('%', last); + startOfOctets = percentPos; + continue; + } + + if (state === internals.utf8.reject) { + return null; + } + + percentPos += 3; + + if (percentPos >= string.length || + string[percentPos] !== '%') { + + return null; + } + } + + return decoded + string.slice(last); +}; + + +internals.resolveHex = function (char, shift) { + + const i = internals.hex[char]; + return i === undefined ? 
255 : i << shift; +}; + + +internals.hex = { + '0': 0, '1': 1, '2': 2, '3': 3, '4': 4, + '5': 5, '6': 6, '7': 7, '8': 8, '9': 9, + 'a': 10, 'A': 10, 'b': 11, 'B': 11, 'c': 12, + 'C': 12, 'd': 13, 'D': 13, 'e': 14, 'E': 14, + 'f': 15, 'F': 15 +}; + + +internals.utf8 = { + accept: 12, + reject: 0, + data: [ + + // Maps bytes to character to a transition + + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, + 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, 2, + 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, + 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, 3, + 4, 4, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, + 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, 5, + 6, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 7, 8, 7, 7, + 10, 9, 9, 9, 11, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, 4, + + // Maps a state to a new state when adding a transition + + 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 12, 0, 0, 0, 0, 24, 36, 48, 60, 72, 84, 96, + 0, 12, 12, 12, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 0, 24, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 24, 24, 24, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 24, 24, 0, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 48, 48, 48, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 0, 48, 48, 0, 0, 0, 0, 0, 0, 0, 0, + 0, 48, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, + + // Maps the current transition to a mask that needs to apply to the byte + + 0x7F, 0x3F, 0x3F, 0x3F, 0x00, 0x1F, 0x0F, 0x0F, 0x0F, 0x07, 0x07, 0x07 + ] +}; diff --git a/node_modules/@sideway/address/lib/domain.js b/node_modules/@sideway/address/lib/domain.js new file mode 100644 index 000000000..79f6253a1 --- /dev/null +++ b/node_modules/@sideway/address/lib/domain.js @@ -0,0 +1,113 @@ +'use strict'; + +const Url = require('url'); + +const Errors = require('./errors'); + + +const internals = { + minDomainSegments: 2, + nonAsciiRx: /[^\x00-\x7f]/, + domainControlRx: /[\x00-\x20@\:\/\\#!\$&\'\(\)\*\+,;=\?]/, // Control + space + separators + tldSegmentRx: /^[a-zA-Z](?:[a-zA-Z0-9\-]*[a-zA-Z0-9])?$/, + domainSegmentRx: /^[a-zA-Z0-9](?:[a-zA-Z0-9\-]*[a-zA-Z0-9])?$/, + URL: Url.URL || URL // $lab:coverage:ignore$ +}; + + +exports.analyze = function (domain, options = {}) { + + if (typeof domain !== 'string') { + throw new Error('Invalid input: domain must be a string'); + } + + if (!domain) { + return Errors.code('DOMAIN_NON_EMPTY_STRING'); + } + + if (domain.length > 256) { + return Errors.code('DOMAIN_TOO_LONG'); + } + + const ascii = !internals.nonAsciiRx.test(domain); + if (!ascii) { + if (options.allowUnicode === false) { // Defaults to true + return Errors.code('DOMAIN_INVALID_UNICODE_CHARS'); + } + + domain = domain.normalize('NFC'); + } + + if (internals.domainControlRx.test(domain)) { + return Errors.code('DOMAIN_INVALID_CHARS'); + } + + domain = internals.punycode(domain); + + // https://tools.ietf.org/html/rfc1035 section 2.3.1 + + const minDomainSegments = options.minDomainSegments || internals.minDomainSegments; + + const segments = domain.split('.'); + if (segments.length < minDomainSegments) { + return Errors.code('DOMAIN_SEGMENTS_COUNT'); + } + + if (options.maxDomainSegments) { + if (segments.length > options.maxDomainSegments) { + return Errors.code('DOMAIN_SEGMENTS_COUNT_MAX'); + } + } + + const tlds = options.tlds; + 
if (tlds) { + const tld = segments[segments.length - 1].toLowerCase(); + if (tlds.deny && tlds.deny.has(tld) || + tlds.allow && !tlds.allow.has(tld)) { + + return Errors.code('DOMAIN_FORBIDDEN_TLDS'); + } + } + + for (let i = 0; i < segments.length; ++i) { + const segment = segments[i]; + + if (!segment.length) { + return Errors.code('DOMAIN_EMPTY_SEGMENT'); + } + + if (segment.length > 63) { + return Errors.code('DOMAIN_LONG_SEGMENT'); + } + + if (i < segments.length - 1) { + if (!internals.domainSegmentRx.test(segment)) { + return Errors.code('DOMAIN_INVALID_CHARS'); + } + } + else { + if (!internals.tldSegmentRx.test(segment)) { + return Errors.code('DOMAIN_INVALID_TLDS_CHARS'); + } + } + } + + return null; +}; + + +exports.isValid = function (domain, options) { + + return !exports.analyze(domain, options); +}; + + +internals.punycode = function (domain) { + + try { + return new internals.URL(`http://${domain}`).host; + } + catch (err) { + return domain; + } +}; diff --git a/node_modules/@sideway/address/lib/email.js b/node_modules/@sideway/address/lib/email.js new file mode 100644 index 000000000..8343ab7e1 --- /dev/null +++ b/node_modules/@sideway/address/lib/email.js @@ -0,0 +1,170 @@ +'use strict'; + +const Util = require('util'); + +const Domain = require('./domain'); +const Errors = require('./errors'); + + +const internals = { + nonAsciiRx: /[^\x00-\x7f]/, + encoder: new (Util.TextEncoder || TextEncoder)() // $lab:coverage:ignore$ +}; + + +exports.analyze = function (email, options) { + + return internals.email(email, options); +}; + + +exports.isValid = function (email, options) { + + return !internals.email(email, options); +}; + + +internals.email = function (email, options = {}) { + + if (typeof email !== 'string') { + throw new Error('Invalid input: email must be a string'); + } + + if (!email) { + return Errors.code('EMPTY_STRING'); + } + + // Unicode + + const ascii = !internals.nonAsciiRx.test(email); + if (!ascii) { + if (options.allowUnicode === false) { // Defaults to true + return Errors.code('FORBIDDEN_UNICODE'); + } + + email = email.normalize('NFC'); + } + + // Basic structure + + const parts = email.split('@'); + if (parts.length !== 2) { + return parts.length > 2 ? 
Errors.code('MULTIPLE_AT_CHAR') : Errors.code('MISSING_AT_CHAR'); + } + + const [local, domain] = parts; + + if (!local) { + return Errors.code('EMPTY_LOCAL'); + } + + if (!options.ignoreLength) { + if (email.length > 254) { // http://tools.ietf.org/html/rfc5321#section-4.5.3.1.3 + return Errors.code('ADDRESS_TOO_LONG'); + } + + if (internals.encoder.encode(local).length > 64) { // http://tools.ietf.org/html/rfc5321#section-4.5.3.1.1 + return Errors.code('LOCAL_TOO_LONG'); + } + } + + // Validate parts + + return internals.local(local, ascii) || Domain.analyze(domain, options); +}; + + +internals.local = function (local, ascii) { + + const segments = local.split('.'); + for (const segment of segments) { + if (!segment.length) { + return Errors.code('EMPTY_LOCAL_SEGMENT'); + } + + if (ascii) { + if (!internals.atextRx.test(segment)) { + return Errors.code('INVALID_LOCAL_CHARS'); + } + + continue; + } + + for (const char of segment) { + if (internals.atextRx.test(char)) { + continue; + } + + const binary = internals.binary(char); + if (!internals.atomRx.test(binary)) { + return Errors.code('INVALID_LOCAL_CHARS'); + } + } + } +}; + + +internals.binary = function (char) { + + return Array.from(internals.encoder.encode(char)).map((v) => String.fromCharCode(v)).join(''); +}; + + +/* + From RFC 5321: + + Mailbox = Local-part "@" ( Domain / address-literal ) + + Local-part = Dot-string / Quoted-string + Dot-string = Atom *("." Atom) + Atom = 1*atext + atext = ALPHA / DIGIT / "!" / "#" / "$" / "%" / "&" / "'" / "*" / "+" / "-" / "/" / "=" / "?" / "^" / "_" / "`" / "{" / "|" / "}" / "~" + + Domain = sub-domain *("." sub-domain) + sub-domain = Let-dig [Ldh-str] + Let-dig = ALPHA / DIGIT + Ldh-str = *( ALPHA / DIGIT / "-" ) Let-dig + + ALPHA = %x41-5A / %x61-7A ; a-z, A-Z + DIGIT = %x30-39 ; 0-9 + + From RFC 6531: + + sub-domain =/ U-label + atext =/ UTF8-non-ascii + + UTF8-non-ascii = UTF8-2 / UTF8-3 / UTF8-4 + + UTF8-2 = %xC2-DF UTF8-tail + UTF8-3 = %xE0 %xA0-BF UTF8-tail / + %xE1-EC 2( UTF8-tail ) / + %xED %x80-9F UTF8-tail / + %xEE-EF 2( UTF8-tail ) + UTF8-4 = %xF0 %x90-BF 2( UTF8-tail ) / + %xF1-F3 3( UTF8-tail ) / + %xF4 %x80-8F 2( UTF8-tail ) + + UTF8-tail = %x80-BF + + Note: The following are not supported: + + RFC 5321: address-literal, Quoted-string + RFC 5322: obs-*, CFWS +*/ + + +internals.atextRx = /^[\w!#\$%&'\*\+\-/=\?\^`\{\|\}~]+$/; // _ included in \w + + +internals.atomRx = new RegExp([ + + // %xC2-DF UTF8-tail + '(?:[\\xc2-\\xdf][\\x80-\\xbf])', + + // %xE0 %xA0-BF UTF8-tail %xE1-EC 2( UTF8-tail ) %xED %x80-9F UTF8-tail %xEE-EF 2( UTF8-tail ) + '(?:\\xe0[\\xa0-\\xbf][\\x80-\\xbf])|(?:[\\xe1-\\xec][\\x80-\\xbf]{2})|(?:\\xed[\\x80-\\x9f][\\x80-\\xbf])|(?:[\\xee-\\xef][\\x80-\\xbf]{2})', + + // %xF0 %x90-BF 2( UTF8-tail ) %xF1-F3 3( UTF8-tail ) %xF4 %x80-8F 2( UTF8-tail ) + '(?:\\xf0[\\x90-\\xbf][\\x80-\\xbf]{2})|(?:[\\xf1-\\xf3][\\x80-\\xbf]{3})|(?:\\xf4[\\x80-\\x8f][\\x80-\\xbf]{2})' + +].join('|')); diff --git a/node_modules/@sideway/address/lib/errors.js b/node_modules/@sideway/address/lib/errors.js new file mode 100644 index 000000000..001dd1068 --- /dev/null +++ b/node_modules/@sideway/address/lib/errors.js @@ -0,0 +1,29 @@ +'use strict'; + +exports.codes = { + EMPTY_STRING: 'Address must be a non-empty string', + FORBIDDEN_UNICODE: 'Address contains forbidden Unicode characters', + MULTIPLE_AT_CHAR: 'Address cannot contain more than one @ character', + MISSING_AT_CHAR: 'Address must contain one @ character', + EMPTY_LOCAL: 'Address local part cannot be empty', + 
ADDRESS_TOO_LONG: 'Address too long', + LOCAL_TOO_LONG: 'Address local part too long', + EMPTY_LOCAL_SEGMENT: 'Address local part contains empty dot-separated segment', + INVALID_LOCAL_CHARS: 'Address local part contains invalid character', + DOMAIN_NON_EMPTY_STRING: 'Domain must be a non-empty string', + DOMAIN_TOO_LONG: 'Domain too long', + DOMAIN_INVALID_UNICODE_CHARS: 'Domain contains forbidden Unicode characters', + DOMAIN_INVALID_CHARS: 'Domain contains invalid character', + DOMAIN_INVALID_TLDS_CHARS: 'Domain contains invalid tld character', + DOMAIN_SEGMENTS_COUNT: 'Domain lacks the minimum required number of segments', + DOMAIN_SEGMENTS_COUNT_MAX: 'Domain contains too many segments', + DOMAIN_FORBIDDEN_TLDS: 'Domain uses forbidden TLD', + DOMAIN_EMPTY_SEGMENT: 'Domain contains empty dot-separated segment', + DOMAIN_LONG_SEGMENT: 'Domain contains dot-separated segment that is too long' +}; + + +exports.code = function (code) { + + return { code, error: exports.codes[code] }; +}; diff --git a/node_modules/@sideway/address/lib/index.d.ts b/node_modules/@sideway/address/lib/index.d.ts new file mode 100644 index 000000000..a533d7371 --- /dev/null +++ b/node_modules/@sideway/address/lib/index.d.ts @@ -0,0 +1,255 @@ +/// + +import * as Hoek from '@hapi/hoek'; + + +export namespace domain { + + /** + * Analyzes a string to verify it is a valid domain name. + * + * @param domain - the domain name to validate. + * @param options - optional settings. + * + * @return - undefined when valid, otherwise an object with single error key with a string message value. + */ + function analyze(domain: string, options?: Options): Analysis | null; + + /** + * Analyzes a string to verify it is a valid domain name. + * + * @param domain - the domain name to validate. + * @param options - optional settings. + * + * @return - true when valid, otherwise false. + */ + function isValid(domain: string, options?: Options): boolean; + + interface Options { + + /** + * Determines whether Unicode characters are allowed. + * + * @default true + */ + readonly allowUnicode?: boolean; + + /** + * The minimum number of domain segments (e.g. `x.y.z` has 3 segments) required. + * + * @default 2 + */ + readonly minDomainSegments?: number; + + /** + * Top-level-domain options + * + * @default true + */ + readonly tlds?: Tlds.Allow | Tlds.Deny | boolean; + } + + namespace Tlds { + + interface Allow { + + readonly allow: Set | true; + } + + interface Deny { + + readonly deny: Set; + } + } +} + + +export namespace email { + + /** + * Analyzes a string to verify it is a valid email address. + * + * @param email - the email address to validate. + * @param options - optional settings. + * + * @return - undefined when valid, otherwise an object with single error key with a string message value. + */ + function analyze(email: string, options?: Options): Analysis | null; + + /** + * Analyzes a string to verify it is a valid email address. + * + * @param email - the email address to validate. + * @param options - optional settings. + * + * @return - true when valid, otherwise false. + */ + function isValid(email: string, options?: Options): boolean; + + interface Options extends domain.Options { + + /** + * Determines whether to ignore the standards maximum email length limit. + * + * @default false + */ + readonly ignoreLength?: boolean; + } +} + + +export interface Analysis { + + /** + * The reason validation failed. + */ + error: string; + + /** + * The error code. 
+ */ + code: string; +} + + +export const errors: Record; + + +export namespace ip { + + /** + * Generates a regular expression used to validate IP addresses. + * + * @param options - optional settings. + * + * @returns an object with the regular expression and meta data. + */ + function regex(options?: Options): Expression; + + interface Options { + + /** + * The required CIDR mode. + * + * @default 'optional' + */ + readonly cidr?: Cidr; + + /** + * The allowed versions. + * + * @default ['ipv4', 'ipv6', 'ipvfuture'] + */ + readonly version?: Version | Version[]; + } + + type Cidr = 'optional' | 'required' | 'forbidden'; + type Version = 'ipv4' | 'ipv6' | 'ipvfuture'; + + interface Expression { + + /** + * The CIDR mode. + */ + cidr: Cidr; + + /** + * The raw regular expression string. + */ + raw: string; + + /** + * The regular expression. + */ + regex: RegExp; + + /** + * The array of versions allowed. + */ + versions: Version[]; + } +} + + +export namespace uri { + + /** + * Faster version of decodeURIComponent() that does not throw. + * + * @param string - the URL string to decode. + * + * @returns the decoded string or null if invalid. + */ + function decode(string: string): string | null; + + /** + * Generates a regular expression used to validate URI addresses. + * + * @param options - optional settings. + * + * @returns an object with the regular expression and meta data. + */ + function regex(options?: Options): Expression; + + type Options = Hoek.ts.XOR; + + namespace Options { + + interface Query { + + /** + * Allow the use of [] in query parameters. + * + * @default false + */ + readonly allowQuerySquareBrackets?: boolean; + } + + interface Relative extends Query { + + /** + * Requires the URI to be relative. + * + * @default false + */ + readonly relativeOnly?: boolean; + } + + interface Options extends Query { + + /** + * Allow relative URIs. + * + * @default false + */ + readonly allowRelative?: boolean; + + /** + * Capture domain segment ($1). + * + * @default false + */ + readonly domain?: boolean; + + /** + * The allowed URI schemes. + */ + readonly scheme?: Scheme | Scheme[]; + } + + type Scheme = string | RegExp; + } + + interface Expression { + + /** + * The raw regular expression string. + */ + raw: string; + + /** + * The regular expression. 
+ */ + regex: RegExp; + } +} diff --git a/node_modules/@sideway/address/lib/index.js b/node_modules/@sideway/address/lib/index.js new file mode 100644 index 000000000..b93a9c5c2 --- /dev/null +++ b/node_modules/@sideway/address/lib/index.js @@ -0,0 +1,97 @@ +'use strict'; + +const Decode = require('./decode'); +const Domain = require('./domain'); +const Email = require('./email'); +const Errors = require('./errors'); +const Ip = require('./ip'); +const Tlds = require('./tlds'); +const Uri = require('./uri'); + + +const internals = { + defaultTlds: { allow: Tlds, deny: null } +}; + + +module.exports = { + errors: Errors.codes, + + domain: { + analyze(domain, options) { + + options = internals.options(options); + return Domain.analyze(domain, options); + }, + + isValid(domain, options) { + + options = internals.options(options); + return Domain.isValid(domain, options); + } + }, + email: { + analyze(email, options) { + + options = internals.options(options); + return Email.analyze(email, options); + }, + + isValid(email, options) { + + options = internals.options(options); + return Email.isValid(email, options); + } + }, + ip: { + regex: Ip.regex + }, + uri: { + decode: Decode.decode, + regex: Uri.regex + } +}; + + +internals.options = function (options) { + + if (!options) { + return { tlds: internals.defaultTlds }; + } + + if (options.tlds === false) { // Defaults to true + return options; + } + + if (!options.tlds || + options.tlds === true) { + + return Object.assign({}, options, { tlds: internals.defaultTlds }); + } + + if (typeof options.tlds !== 'object') { + throw new Error('Invalid options: tlds must be a boolean or an object'); + } + + if (options.tlds.deny) { + if (options.tlds.deny instanceof Set === false) { + throw new Error('Invalid options: tlds.deny must be a Set object'); + } + + if (options.tlds.allow) { + throw new Error('Invalid options: cannot specify both tlds.allow and tlds.deny lists'); + } + + return options; + } + + if (options.tlds.allow === true) { + return Object.assign({}, options, { tlds: internals.defaultTlds }); + } + + if (options.tlds.allow instanceof Set === false) { + throw new Error('Invalid options: tlds.allow must be a Set object or true'); + } + + return options; +}; diff --git a/node_modules/@sideway/address/lib/ip.js b/node_modules/@sideway/address/lib/ip.js new file mode 100644 index 000000000..541b72cc7 --- /dev/null +++ b/node_modules/@sideway/address/lib/ip.js @@ -0,0 +1,63 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Uri = require('./uri'); + + +const internals = {}; + + +exports.regex = function (options = {}) { + + // CIDR + + Assert(options.cidr === undefined || typeof options.cidr === 'string', 'options.cidr must be a string'); + const cidr = options.cidr ? 
options.cidr.toLowerCase() : 'optional'; + Assert(['required', 'optional', 'forbidden'].includes(cidr), 'options.cidr must be one of required, optional, forbidden'); + + // Versions + + Assert(options.version === undefined || typeof options.version === 'string' || Array.isArray(options.version), 'options.version must be a string or an array of string'); + let versions = options.version || ['ipv4', 'ipv6', 'ipvfuture']; + if (!Array.isArray(versions)) { + versions = [versions]; + } + + Assert(versions.length >= 1, 'options.version must have at least 1 version specified'); + + for (let i = 0; i < versions.length; ++i) { + Assert(typeof versions[i] === 'string', 'options.version must only contain strings'); + versions[i] = versions[i].toLowerCase(); + Assert(['ipv4', 'ipv6', 'ipvfuture'].includes(versions[i]), 'options.version contains unknown version ' + versions[i] + ' - must be one of ipv4, ipv6, ipvfuture'); + } + + versions = Array.from(new Set(versions)); + + // Regex + + const parts = versions.map((version) => { + + // Forbidden + + if (cidr === 'forbidden') { + return Uri.ip[version]; + } + + // Required + + const cidrpart = `\\/${version === 'ipv4' ? Uri.ip.v4Cidr : Uri.ip.v6Cidr}`; + + if (cidr === 'required') { + return `${Uri.ip[version]}${cidrpart}`; + } + + // Optional + + return `${Uri.ip[version]}(?:${cidrpart})?`; + }); + + const raw = `(?:${parts.join('|')})`; + const regex = new RegExp(`^${raw}$`); + return { cidr, versions, regex, raw }; +}; diff --git a/node_modules/@sideway/address/lib/tlds.js b/node_modules/@sideway/address/lib/tlds.js new file mode 100644 index 000000000..9599eddcc --- /dev/null +++ b/node_modules/@sideway/address/lib/tlds.js @@ -0,0 +1,1520 @@ +'use strict'; + +const internals = {}; + + +// http://data.iana.org/TLD/tlds-alpha-by-domain.txt +// # Version 2021020700, Last Updated Sun Feb 7 07: 07: 01 2021 UTC + + +internals.tlds = [ + 'AAA', + 'AARP', + 'ABARTH', + 'ABB', + 'ABBOTT', + 'ABBVIE', + 'ABC', + 'ABLE', + 'ABOGADO', + 'ABUDHABI', + 'AC', + 'ACADEMY', + 'ACCENTURE', + 'ACCOUNTANT', + 'ACCOUNTANTS', + 'ACO', + 'ACTOR', + 'AD', + 'ADAC', + 'ADS', + 'ADULT', + 'AE', + 'AEG', + 'AERO', + 'AETNA', + 'AF', + 'AFAMILYCOMPANY', + 'AFL', + 'AFRICA', + 'AG', + 'AGAKHAN', + 'AGENCY', + 'AI', + 'AIG', + 'AIRBUS', + 'AIRFORCE', + 'AIRTEL', + 'AKDN', + 'AL', + 'ALFAROMEO', + 'ALIBABA', + 'ALIPAY', + 'ALLFINANZ', + 'ALLSTATE', + 'ALLY', + 'ALSACE', + 'ALSTOM', + 'AM', + 'AMAZON', + 'AMERICANEXPRESS', + 'AMERICANFAMILY', + 'AMEX', + 'AMFAM', + 'AMICA', + 'AMSTERDAM', + 'ANALYTICS', + 'ANDROID', + 'ANQUAN', + 'ANZ', + 'AO', + 'AOL', + 'APARTMENTS', + 'APP', + 'APPLE', + 'AQ', + 'AQUARELLE', + 'AR', + 'ARAB', + 'ARAMCO', + 'ARCHI', + 'ARMY', + 'ARPA', + 'ART', + 'ARTE', + 'AS', + 'ASDA', + 'ASIA', + 'ASSOCIATES', + 'AT', + 'ATHLETA', + 'ATTORNEY', + 'AU', + 'AUCTION', + 'AUDI', + 'AUDIBLE', + 'AUDIO', + 'AUSPOST', + 'AUTHOR', + 'AUTO', + 'AUTOS', + 'AVIANCA', + 'AW', + 'AWS', + 'AX', + 'AXA', + 'AZ', + 'AZURE', + 'BA', + 'BABY', + 'BAIDU', + 'BANAMEX', + 'BANANAREPUBLIC', + 'BAND', + 'BANK', + 'BAR', + 'BARCELONA', + 'BARCLAYCARD', + 'BARCLAYS', + 'BAREFOOT', + 'BARGAINS', + 'BASEBALL', + 'BASKETBALL', + 'BAUHAUS', + 'BAYERN', + 'BB', + 'BBC', + 'BBT', + 'BBVA', + 'BCG', + 'BCN', + 'BD', + 'BE', + 'BEATS', + 'BEAUTY', + 'BEER', + 'BENTLEY', + 'BERLIN', + 'BEST', + 'BESTBUY', + 'BET', + 'BF', + 'BG', + 'BH', + 'BHARTI', + 'BI', + 'BIBLE', + 'BID', + 'BIKE', + 'BING', + 'BINGO', + 'BIO', + 'BIZ', + 'BJ', + 'BLACK', + 'BLACKFRIDAY', + 'BLOCKBUSTER', + 'BLOG', + 
'BLOOMBERG', + 'BLUE', + 'BM', + 'BMS', + 'BMW', + 'BN', + 'BNPPARIBAS', + 'BO', + 'BOATS', + 'BOEHRINGER', + 'BOFA', + 'BOM', + 'BOND', + 'BOO', + 'BOOK', + 'BOOKING', + 'BOSCH', + 'BOSTIK', + 'BOSTON', + 'BOT', + 'BOUTIQUE', + 'BOX', + 'BR', + 'BRADESCO', + 'BRIDGESTONE', + 'BROADWAY', + 'BROKER', + 'BROTHER', + 'BRUSSELS', + 'BS', + 'BT', + 'BUDAPEST', + 'BUGATTI', + 'BUILD', + 'BUILDERS', + 'BUSINESS', + 'BUY', + 'BUZZ', + 'BV', + 'BW', + 'BY', + 'BZ', + 'BZH', + 'CA', + 'CAB', + 'CAFE', + 'CAL', + 'CALL', + 'CALVINKLEIN', + 'CAM', + 'CAMERA', + 'CAMP', + 'CANCERRESEARCH', + 'CANON', + 'CAPETOWN', + 'CAPITAL', + 'CAPITALONE', + 'CAR', + 'CARAVAN', + 'CARDS', + 'CARE', + 'CAREER', + 'CAREERS', + 'CARS', + 'CASA', + 'CASE', + 'CASEIH', + 'CASH', + 'CASINO', + 'CAT', + 'CATERING', + 'CATHOLIC', + 'CBA', + 'CBN', + 'CBRE', + 'CBS', + 'CC', + 'CD', + 'CENTER', + 'CEO', + 'CERN', + 'CF', + 'CFA', + 'CFD', + 'CG', + 'CH', + 'CHANEL', + 'CHANNEL', + 'CHARITY', + 'CHASE', + 'CHAT', + 'CHEAP', + 'CHINTAI', + 'CHRISTMAS', + 'CHROME', + 'CHURCH', + 'CI', + 'CIPRIANI', + 'CIRCLE', + 'CISCO', + 'CITADEL', + 'CITI', + 'CITIC', + 'CITY', + 'CITYEATS', + 'CK', + 'CL', + 'CLAIMS', + 'CLEANING', + 'CLICK', + 'CLINIC', + 'CLINIQUE', + 'CLOTHING', + 'CLOUD', + 'CLUB', + 'CLUBMED', + 'CM', + 'CN', + 'CO', + 'COACH', + 'CODES', + 'COFFEE', + 'COLLEGE', + 'COLOGNE', + 'COM', + 'COMCAST', + 'COMMBANK', + 'COMMUNITY', + 'COMPANY', + 'COMPARE', + 'COMPUTER', + 'COMSEC', + 'CONDOS', + 'CONSTRUCTION', + 'CONSULTING', + 'CONTACT', + 'CONTRACTORS', + 'COOKING', + 'COOKINGCHANNEL', + 'COOL', + 'COOP', + 'CORSICA', + 'COUNTRY', + 'COUPON', + 'COUPONS', + 'COURSES', + 'CPA', + 'CR', + 'CREDIT', + 'CREDITCARD', + 'CREDITUNION', + 'CRICKET', + 'CROWN', + 'CRS', + 'CRUISE', + 'CRUISES', + 'CSC', + 'CU', + 'CUISINELLA', + 'CV', + 'CW', + 'CX', + 'CY', + 'CYMRU', + 'CYOU', + 'CZ', + 'DABUR', + 'DAD', + 'DANCE', + 'DATA', + 'DATE', + 'DATING', + 'DATSUN', + 'DAY', + 'DCLK', + 'DDS', + 'DE', + 'DEAL', + 'DEALER', + 'DEALS', + 'DEGREE', + 'DELIVERY', + 'DELL', + 'DELOITTE', + 'DELTA', + 'DEMOCRAT', + 'DENTAL', + 'DENTIST', + 'DESI', + 'DESIGN', + 'DEV', + 'DHL', + 'DIAMONDS', + 'DIET', + 'DIGITAL', + 'DIRECT', + 'DIRECTORY', + 'DISCOUNT', + 'DISCOVER', + 'DISH', + 'DIY', + 'DJ', + 'DK', + 'DM', + 'DNP', + 'DO', + 'DOCS', + 'DOCTOR', + 'DOG', + 'DOMAINS', + 'DOT', + 'DOWNLOAD', + 'DRIVE', + 'DTV', + 'DUBAI', + 'DUCK', + 'DUNLOP', + 'DUPONT', + 'DURBAN', + 'DVAG', + 'DVR', + 'DZ', + 'EARTH', + 'EAT', + 'EC', + 'ECO', + 'EDEKA', + 'EDU', + 'EDUCATION', + 'EE', + 'EG', + 'EMAIL', + 'EMERCK', + 'ENERGY', + 'ENGINEER', + 'ENGINEERING', + 'ENTERPRISES', + 'EPSON', + 'EQUIPMENT', + 'ER', + 'ERICSSON', + 'ERNI', + 'ES', + 'ESQ', + 'ESTATE', + 'ET', + 'ETISALAT', + 'EU', + 'EUROVISION', + 'EUS', + 'EVENTS', + 'EXCHANGE', + 'EXPERT', + 'EXPOSED', + 'EXPRESS', + 'EXTRASPACE', + 'FAGE', + 'FAIL', + 'FAIRWINDS', + 'FAITH', + 'FAMILY', + 'FAN', + 'FANS', + 'FARM', + 'FARMERS', + 'FASHION', + 'FAST', + 'FEDEX', + 'FEEDBACK', + 'FERRARI', + 'FERRERO', + 'FI', + 'FIAT', + 'FIDELITY', + 'FIDO', + 'FILM', + 'FINAL', + 'FINANCE', + 'FINANCIAL', + 'FIRE', + 'FIRESTONE', + 'FIRMDALE', + 'FISH', + 'FISHING', + 'FIT', + 'FITNESS', + 'FJ', + 'FK', + 'FLICKR', + 'FLIGHTS', + 'FLIR', + 'FLORIST', + 'FLOWERS', + 'FLY', + 'FM', + 'FO', + 'FOO', + 'FOOD', + 'FOODNETWORK', + 'FOOTBALL', + 'FORD', + 'FOREX', + 'FORSALE', + 'FORUM', + 'FOUNDATION', + 'FOX', + 'FR', + 'FREE', + 'FRESENIUS', + 'FRL', + 'FROGANS', + 'FRONTDOOR', + 'FRONTIER', + 'FTR', + 
'FUJITSU', + 'FUJIXEROX', + 'FUN', + 'FUND', + 'FURNITURE', + 'FUTBOL', + 'FYI', + 'GA', + 'GAL', + 'GALLERY', + 'GALLO', + 'GALLUP', + 'GAME', + 'GAMES', + 'GAP', + 'GARDEN', + 'GAY', + 'GB', + 'GBIZ', + 'GD', + 'GDN', + 'GE', + 'GEA', + 'GENT', + 'GENTING', + 'GEORGE', + 'GF', + 'GG', + 'GGEE', + 'GH', + 'GI', + 'GIFT', + 'GIFTS', + 'GIVES', + 'GIVING', + 'GL', + 'GLADE', + 'GLASS', + 'GLE', + 'GLOBAL', + 'GLOBO', + 'GM', + 'GMAIL', + 'GMBH', + 'GMO', + 'GMX', + 'GN', + 'GODADDY', + 'GOLD', + 'GOLDPOINT', + 'GOLF', + 'GOO', + 'GOODYEAR', + 'GOOG', + 'GOOGLE', + 'GOP', + 'GOT', + 'GOV', + 'GP', + 'GQ', + 'GR', + 'GRAINGER', + 'GRAPHICS', + 'GRATIS', + 'GREEN', + 'GRIPE', + 'GROCERY', + 'GROUP', + 'GS', + 'GT', + 'GU', + 'GUARDIAN', + 'GUCCI', + 'GUGE', + 'GUIDE', + 'GUITARS', + 'GURU', + 'GW', + 'GY', + 'HAIR', + 'HAMBURG', + 'HANGOUT', + 'HAUS', + 'HBO', + 'HDFC', + 'HDFCBANK', + 'HEALTH', + 'HEALTHCARE', + 'HELP', + 'HELSINKI', + 'HERE', + 'HERMES', + 'HGTV', + 'HIPHOP', + 'HISAMITSU', + 'HITACHI', + 'HIV', + 'HK', + 'HKT', + 'HM', + 'HN', + 'HOCKEY', + 'HOLDINGS', + 'HOLIDAY', + 'HOMEDEPOT', + 'HOMEGOODS', + 'HOMES', + 'HOMESENSE', + 'HONDA', + 'HORSE', + 'HOSPITAL', + 'HOST', + 'HOSTING', + 'HOT', + 'HOTELES', + 'HOTELS', + 'HOTMAIL', + 'HOUSE', + 'HOW', + 'HR', + 'HSBC', + 'HT', + 'HU', + 'HUGHES', + 'HYATT', + 'HYUNDAI', + 'IBM', + 'ICBC', + 'ICE', + 'ICU', + 'ID', + 'IE', + 'IEEE', + 'IFM', + 'IKANO', + 'IL', + 'IM', + 'IMAMAT', + 'IMDB', + 'IMMO', + 'IMMOBILIEN', + 'IN', + 'INC', + 'INDUSTRIES', + 'INFINITI', + 'INFO', + 'ING', + 'INK', + 'INSTITUTE', + 'INSURANCE', + 'INSURE', + 'INT', + 'INTERNATIONAL', + 'INTUIT', + 'INVESTMENTS', + 'IO', + 'IPIRANGA', + 'IQ', + 'IR', + 'IRISH', + 'IS', + 'ISMAILI', + 'IST', + 'ISTANBUL', + 'IT', + 'ITAU', + 'ITV', + 'IVECO', + 'JAGUAR', + 'JAVA', + 'JCB', + 'JE', + 'JEEP', + 'JETZT', + 'JEWELRY', + 'JIO', + 'JLL', + 'JM', + 'JMP', + 'JNJ', + 'JO', + 'JOBS', + 'JOBURG', + 'JOT', + 'JOY', + 'JP', + 'JPMORGAN', + 'JPRS', + 'JUEGOS', + 'JUNIPER', + 'KAUFEN', + 'KDDI', + 'KE', + 'KERRYHOTELS', + 'KERRYLOGISTICS', + 'KERRYPROPERTIES', + 'KFH', + 'KG', + 'KH', + 'KI', + 'KIA', + 'KIM', + 'KINDER', + 'KINDLE', + 'KITCHEN', + 'KIWI', + 'KM', + 'KN', + 'KOELN', + 'KOMATSU', + 'KOSHER', + 'KP', + 'KPMG', + 'KPN', + 'KR', + 'KRD', + 'KRED', + 'KUOKGROUP', + 'KW', + 'KY', + 'KYOTO', + 'KZ', + 'LA', + 'LACAIXA', + 'LAMBORGHINI', + 'LAMER', + 'LANCASTER', + 'LANCIA', + 'LAND', + 'LANDROVER', + 'LANXESS', + 'LASALLE', + 'LAT', + 'LATINO', + 'LATROBE', + 'LAW', + 'LAWYER', + 'LB', + 'LC', + 'LDS', + 'LEASE', + 'LECLERC', + 'LEFRAK', + 'LEGAL', + 'LEGO', + 'LEXUS', + 'LGBT', + 'LI', + 'LIDL', + 'LIFE', + 'LIFEINSURANCE', + 'LIFESTYLE', + 'LIGHTING', + 'LIKE', + 'LILLY', + 'LIMITED', + 'LIMO', + 'LINCOLN', + 'LINDE', + 'LINK', + 'LIPSY', + 'LIVE', + 'LIVING', + 'LIXIL', + 'LK', + 'LLC', + 'LLP', + 'LOAN', + 'LOANS', + 'LOCKER', + 'LOCUS', + 'LOFT', + 'LOL', + 'LONDON', + 'LOTTE', + 'LOTTO', + 'LOVE', + 'LPL', + 'LPLFINANCIAL', + 'LR', + 'LS', + 'LT', + 'LTD', + 'LTDA', + 'LU', + 'LUNDBECK', + 'LUXE', + 'LUXURY', + 'LV', + 'LY', + 'MA', + 'MACYS', + 'MADRID', + 'MAIF', + 'MAISON', + 'MAKEUP', + 'MAN', + 'MANAGEMENT', + 'MANGO', + 'MAP', + 'MARKET', + 'MARKETING', + 'MARKETS', + 'MARRIOTT', + 'MARSHALLS', + 'MASERATI', + 'MATTEL', + 'MBA', + 'MC', + 'MCKINSEY', + 'MD', + 'ME', + 'MED', + 'MEDIA', + 'MEET', + 'MELBOURNE', + 'MEME', + 'MEMORIAL', + 'MEN', + 'MENU', + 'MERCKMSD', + 'MG', + 'MH', + 'MIAMI', + 'MICROSOFT', + 'MIL', + 'MINI', + 'MINT', + 'MIT', + 
'MITSUBISHI', + 'MK', + 'ML', + 'MLB', + 'MLS', + 'MM', + 'MMA', + 'MN', + 'MO', + 'MOBI', + 'MOBILE', + 'MODA', + 'MOE', + 'MOI', + 'MOM', + 'MONASH', + 'MONEY', + 'MONSTER', + 'MORMON', + 'MORTGAGE', + 'MOSCOW', + 'MOTO', + 'MOTORCYCLES', + 'MOV', + 'MOVIE', + 'MP', + 'MQ', + 'MR', + 'MS', + 'MSD', + 'MT', + 'MTN', + 'MTR', + 'MU', + 'MUSEUM', + 'MUTUAL', + 'MV', + 'MW', + 'MX', + 'MY', + 'MZ', + 'NA', + 'NAB', + 'NAGOYA', + 'NAME', + 'NATIONWIDE', + 'NATURA', + 'NAVY', + 'NBA', + 'NC', + 'NE', + 'NEC', + 'NET', + 'NETBANK', + 'NETFLIX', + 'NETWORK', + 'NEUSTAR', + 'NEW', + 'NEWHOLLAND', + 'NEWS', + 'NEXT', + 'NEXTDIRECT', + 'NEXUS', + 'NF', + 'NFL', + 'NG', + 'NGO', + 'NHK', + 'NI', + 'NICO', + 'NIKE', + 'NIKON', + 'NINJA', + 'NISSAN', + 'NISSAY', + 'NL', + 'NO', + 'NOKIA', + 'NORTHWESTERNMUTUAL', + 'NORTON', + 'NOW', + 'NOWRUZ', + 'NOWTV', + 'NP', + 'NR', + 'NRA', + 'NRW', + 'NTT', + 'NU', + 'NYC', + 'NZ', + 'OBI', + 'OBSERVER', + 'OFF', + 'OFFICE', + 'OKINAWA', + 'OLAYAN', + 'OLAYANGROUP', + 'OLDNAVY', + 'OLLO', + 'OM', + 'OMEGA', + 'ONE', + 'ONG', + 'ONL', + 'ONLINE', + 'ONYOURSIDE', + 'OOO', + 'OPEN', + 'ORACLE', + 'ORANGE', + 'ORG', + 'ORGANIC', + 'ORIGINS', + 'OSAKA', + 'OTSUKA', + 'OTT', + 'OVH', + 'PA', + 'PAGE', + 'PANASONIC', + 'PARIS', + 'PARS', + 'PARTNERS', + 'PARTS', + 'PARTY', + 'PASSAGENS', + 'PAY', + 'PCCW', + 'PE', + 'PET', + 'PF', + 'PFIZER', + 'PG', + 'PH', + 'PHARMACY', + 'PHD', + 'PHILIPS', + 'PHONE', + 'PHOTO', + 'PHOTOGRAPHY', + 'PHOTOS', + 'PHYSIO', + 'PICS', + 'PICTET', + 'PICTURES', + 'PID', + 'PIN', + 'PING', + 'PINK', + 'PIONEER', + 'PIZZA', + 'PK', + 'PL', + 'PLACE', + 'PLAY', + 'PLAYSTATION', + 'PLUMBING', + 'PLUS', + 'PM', + 'PN', + 'PNC', + 'POHL', + 'POKER', + 'POLITIE', + 'PORN', + 'POST', + 'PR', + 'PRAMERICA', + 'PRAXI', + 'PRESS', + 'PRIME', + 'PRO', + 'PROD', + 'PRODUCTIONS', + 'PROF', + 'PROGRESSIVE', + 'PROMO', + 'PROPERTIES', + 'PROPERTY', + 'PROTECTION', + 'PRU', + 'PRUDENTIAL', + 'PS', + 'PT', + 'PUB', + 'PW', + 'PWC', + 'PY', + 'QA', + 'QPON', + 'QUEBEC', + 'QUEST', + 'QVC', + 'RACING', + 'RADIO', + 'RAID', + 'RE', + 'READ', + 'REALESTATE', + 'REALTOR', + 'REALTY', + 'RECIPES', + 'RED', + 'REDSTONE', + 'REDUMBRELLA', + 'REHAB', + 'REISE', + 'REISEN', + 'REIT', + 'RELIANCE', + 'REN', + 'RENT', + 'RENTALS', + 'REPAIR', + 'REPORT', + 'REPUBLICAN', + 'REST', + 'RESTAURANT', + 'REVIEW', + 'REVIEWS', + 'REXROTH', + 'RICH', + 'RICHARDLI', + 'RICOH', + 'RIL', + 'RIO', + 'RIP', + 'RMIT', + 'RO', + 'ROCHER', + 'ROCKS', + 'RODEO', + 'ROGERS', + 'ROOM', + 'RS', + 'RSVP', + 'RU', + 'RUGBY', + 'RUHR', + 'RUN', + 'RW', + 'RWE', + 'RYUKYU', + 'SA', + 'SAARLAND', + 'SAFE', + 'SAFETY', + 'SAKURA', + 'SALE', + 'SALON', + 'SAMSCLUB', + 'SAMSUNG', + 'SANDVIK', + 'SANDVIKCOROMANT', + 'SANOFI', + 'SAP', + 'SARL', + 'SAS', + 'SAVE', + 'SAXO', + 'SB', + 'SBI', + 'SBS', + 'SC', + 'SCA', + 'SCB', + 'SCHAEFFLER', + 'SCHMIDT', + 'SCHOLARSHIPS', + 'SCHOOL', + 'SCHULE', + 'SCHWARZ', + 'SCIENCE', + 'SCJOHNSON', + 'SCOT', + 'SD', + 'SE', + 'SEARCH', + 'SEAT', + 'SECURE', + 'SECURITY', + 'SEEK', + 'SELECT', + 'SENER', + 'SERVICES', + 'SES', + 'SEVEN', + 'SEW', + 'SEX', + 'SEXY', + 'SFR', + 'SG', + 'SH', + 'SHANGRILA', + 'SHARP', + 'SHAW', + 'SHELL', + 'SHIA', + 'SHIKSHA', + 'SHOES', + 'SHOP', + 'SHOPPING', + 'SHOUJI', + 'SHOW', + 'SHOWTIME', + 'SI', + 'SILK', + 'SINA', + 'SINGLES', + 'SITE', + 'SJ', + 'SK', + 'SKI', + 'SKIN', + 'SKY', + 'SKYPE', + 'SL', + 'SLING', + 'SM', + 'SMART', + 'SMILE', + 'SN', + 'SNCF', + 'SO', + 'SOCCER', + 'SOCIAL', + 'SOFTBANK', + 'SOFTWARE', + 
'SOHU', + 'SOLAR', + 'SOLUTIONS', + 'SONG', + 'SONY', + 'SOY', + 'SPA', + 'SPACE', + 'SPORT', + 'SPOT', + 'SPREADBETTING', + 'SR', + 'SRL', + 'SS', + 'ST', + 'STADA', + 'STAPLES', + 'STAR', + 'STATEBANK', + 'STATEFARM', + 'STC', + 'STCGROUP', + 'STOCKHOLM', + 'STORAGE', + 'STORE', + 'STREAM', + 'STUDIO', + 'STUDY', + 'STYLE', + 'SU', + 'SUCKS', + 'SUPPLIES', + 'SUPPLY', + 'SUPPORT', + 'SURF', + 'SURGERY', + 'SUZUKI', + 'SV', + 'SWATCH', + 'SWIFTCOVER', + 'SWISS', + 'SX', + 'SY', + 'SYDNEY', + 'SYSTEMS', + 'SZ', + 'TAB', + 'TAIPEI', + 'TALK', + 'TAOBAO', + 'TARGET', + 'TATAMOTORS', + 'TATAR', + 'TATTOO', + 'TAX', + 'TAXI', + 'TC', + 'TCI', + 'TD', + 'TDK', + 'TEAM', + 'TECH', + 'TECHNOLOGY', + 'TEL', + 'TEMASEK', + 'TENNIS', + 'TEVA', + 'TF', + 'TG', + 'TH', + 'THD', + 'THEATER', + 'THEATRE', + 'TIAA', + 'TICKETS', + 'TIENDA', + 'TIFFANY', + 'TIPS', + 'TIRES', + 'TIROL', + 'TJ', + 'TJMAXX', + 'TJX', + 'TK', + 'TKMAXX', + 'TL', + 'TM', + 'TMALL', + 'TN', + 'TO', + 'TODAY', + 'TOKYO', + 'TOOLS', + 'TOP', + 'TORAY', + 'TOSHIBA', + 'TOTAL', + 'TOURS', + 'TOWN', + 'TOYOTA', + 'TOYS', + 'TR', + 'TRADE', + 'TRADING', + 'TRAINING', + 'TRAVEL', + 'TRAVELCHANNEL', + 'TRAVELERS', + 'TRAVELERSINSURANCE', + 'TRUST', + 'TRV', + 'TT', + 'TUBE', + 'TUI', + 'TUNES', + 'TUSHU', + 'TV', + 'TVS', + 'TW', + 'TZ', + 'UA', + 'UBANK', + 'UBS', + 'UG', + 'UK', + 'UNICOM', + 'UNIVERSITY', + 'UNO', + 'UOL', + 'UPS', + 'US', + 'UY', + 'UZ', + 'VA', + 'VACATIONS', + 'VANA', + 'VANGUARD', + 'VC', + 'VE', + 'VEGAS', + 'VENTURES', + 'VERISIGN', + 'VERSICHERUNG', + 'VET', + 'VG', + 'VI', + 'VIAJES', + 'VIDEO', + 'VIG', + 'VIKING', + 'VILLAS', + 'VIN', + 'VIP', + 'VIRGIN', + 'VISA', + 'VISION', + 'VIVA', + 'VIVO', + 'VLAANDEREN', + 'VN', + 'VODKA', + 'VOLKSWAGEN', + 'VOLVO', + 'VOTE', + 'VOTING', + 'VOTO', + 'VOYAGE', + 'VU', + 'VUELOS', + 'WALES', + 'WALMART', + 'WALTER', + 'WANG', + 'WANGGOU', + 'WATCH', + 'WATCHES', + 'WEATHER', + 'WEATHERCHANNEL', + 'WEBCAM', + 'WEBER', + 'WEBSITE', + 'WED', + 'WEDDING', + 'WEIBO', + 'WEIR', + 'WF', + 'WHOSWHO', + 'WIEN', + 'WIKI', + 'WILLIAMHILL', + 'WIN', + 'WINDOWS', + 'WINE', + 'WINNERS', + 'WME', + 'WOLTERSKLUWER', + 'WOODSIDE', + 'WORK', + 'WORKS', + 'WORLD', + 'WOW', + 'WS', + 'WTC', + 'WTF', + 'XBOX', + 'XEROX', + 'XFINITY', + 'XIHUAN', + 'XIN', + 'XN--11B4C3D', + 'XN--1CK2E1B', + 'XN--1QQW23A', + 'XN--2SCRJ9C', + 'XN--30RR7Y', + 'XN--3BST00M', + 'XN--3DS443G', + 'XN--3E0B707E', + 'XN--3HCRJ9C', + 'XN--3OQ18VL8PN36A', + 'XN--3PXU8K', + 'XN--42C2D9A', + 'XN--45BR5CYL', + 'XN--45BRJ9C', + 'XN--45Q11C', + 'XN--4GBRIM', + 'XN--54B7FTA0CC', + 'XN--55QW42G', + 'XN--55QX5D', + 'XN--5SU34J936BGSG', + 'XN--5TZM5G', + 'XN--6FRZ82G', + 'XN--6QQ986B3XL', + 'XN--80ADXHKS', + 'XN--80AO21A', + 'XN--80AQECDR1A', + 'XN--80ASEHDB', + 'XN--80ASWG', + 'XN--8Y0A063A', + 'XN--90A3AC', + 'XN--90AE', + 'XN--90AIS', + 'XN--9DBQ2A', + 'XN--9ET52U', + 'XN--9KRT00A', + 'XN--B4W605FERD', + 'XN--BCK1B9A5DRE4C', + 'XN--C1AVG', + 'XN--C2BR7G', + 'XN--CCK2B3B', + 'XN--CCKWCXETD', + 'XN--CG4BKI', + 'XN--CLCHC0EA0B2G2A9GCD', + 'XN--CZR694B', + 'XN--CZRS0T', + 'XN--CZRU2D', + 'XN--D1ACJ3B', + 'XN--D1ALF', + 'XN--E1A4C', + 'XN--ECKVDTC9D', + 'XN--EFVY88H', + 'XN--FCT429K', + 'XN--FHBEI', + 'XN--FIQ228C5HS', + 'XN--FIQ64B', + 'XN--FIQS8S', + 'XN--FIQZ9S', + 'XN--FJQ720A', + 'XN--FLW351E', + 'XN--FPCRJ9C3D', + 'XN--FZC2C9E2C', + 'XN--FZYS8D69UVGM', + 'XN--G2XX48C', + 'XN--GCKR3F0F', + 'XN--GECRJ9C', + 'XN--GK3AT1E', + 'XN--H2BREG3EVE', + 'XN--H2BRJ9C', + 'XN--H2BRJ9C8C', + 'XN--HXT814E', + 'XN--I1B6B1A6A2E', + 
'XN--IMR513N', + 'XN--IO0A7I', + 'XN--J1AEF', + 'XN--J1AMH', + 'XN--J6W193G', + 'XN--JLQ480N2RG', + 'XN--JLQ61U9W7B', + 'XN--JVR189M', + 'XN--KCRX77D1X4A', + 'XN--KPRW13D', + 'XN--KPRY57D', + 'XN--KPUT3I', + 'XN--L1ACC', + 'XN--LGBBAT1AD8J', + 'XN--MGB9AWBF', + 'XN--MGBA3A3EJT', + 'XN--MGBA3A4F16A', + 'XN--MGBA7C0BBN0A', + 'XN--MGBAAKC7DVF', + 'XN--MGBAAM7A8H', + 'XN--MGBAB2BD', + 'XN--MGBAH1A3HJKRD', + 'XN--MGBAI9AZGQP6J', + 'XN--MGBAYH7GPA', + 'XN--MGBBH1A', + 'XN--MGBBH1A71E', + 'XN--MGBC0A9AZCG', + 'XN--MGBCA7DZDO', + 'XN--MGBCPQ6GPA1A', + 'XN--MGBERP4A5D4AR', + 'XN--MGBGU82A', + 'XN--MGBI4ECEXP', + 'XN--MGBPL2FH', + 'XN--MGBT3DHD', + 'XN--MGBTX2B', + 'XN--MGBX4CD0AB', + 'XN--MIX891F', + 'XN--MK1BU44C', + 'XN--MXTQ1M', + 'XN--NGBC5AZD', + 'XN--NGBE9E0A', + 'XN--NGBRX', + 'XN--NODE', + 'XN--NQV7F', + 'XN--NQV7FS00EMA', + 'XN--NYQY26A', + 'XN--O3CW4H', + 'XN--OGBPF8FL', + 'XN--OTU796D', + 'XN--P1ACF', + 'XN--P1AI', + 'XN--PGBS0DH', + 'XN--PSSY2U', + 'XN--Q7CE6A', + 'XN--Q9JYB4C', + 'XN--QCKA1PMC', + 'XN--QXA6A', + 'XN--QXAM', + 'XN--RHQV96G', + 'XN--ROVU88B', + 'XN--RVC1E0AM3E', + 'XN--S9BRJ9C', + 'XN--SES554G', + 'XN--T60B56A', + 'XN--TCKWE', + 'XN--TIQ49XQYJ', + 'XN--UNUP4Y', + 'XN--VERMGENSBERATER-CTB', + 'XN--VERMGENSBERATUNG-PWB', + 'XN--VHQUV', + 'XN--VUQ861B', + 'XN--W4R85EL8FHU5DNRA', + 'XN--W4RS40L', + 'XN--WGBH1C', + 'XN--WGBL6A', + 'XN--XHQ521B', + 'XN--XKC2AL3HYE2A', + 'XN--XKC2DL3A5EE0H', + 'XN--Y9A3AQ', + 'XN--YFRO4I67O', + 'XN--YGBI2AMMX', + 'XN--ZFR164B', + 'XXX', + 'XYZ', + 'YACHTS', + 'YAHOO', + 'YAMAXUN', + 'YANDEX', + 'YE', + 'YODOBASHI', + 'YOGA', + 'YOKOHAMA', + 'YOU', + 'YOUTUBE', + 'YT', + 'YUN', + 'ZA', + 'ZAPPOS', + 'ZARA', + 'ZERO', + 'ZIP', + 'ZM', + 'ZONE', + 'ZUERICH', + 'ZW' +]; + + +// Keep as upper-case to make updating from source easier + +module.exports = new Set(internals.tlds.map((tld) => tld.toLowerCase())); diff --git a/node_modules/@sideway/address/lib/uri.js b/node_modules/@sideway/address/lib/uri.js new file mode 100644 index 000000000..f9791840a --- /dev/null +++ b/node_modules/@sideway/address/lib/uri.js @@ -0,0 +1,207 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const EscapeRegex = require('@hapi/hoek/lib/escapeRegex'); + + +const internals = {}; + + +internals.generate = function () { + + const rfc3986 = {}; + + const hexDigit = '\\dA-Fa-f'; // HEXDIG = DIGIT / "A" / "B" / "C" / "D" / "E" / "F" + const hexDigitOnly = '[' + hexDigit + ']'; + + const unreserved = '\\w-\\.~'; // unreserved = ALPHA / DIGIT / "-" / "." / "_" / "~" + const subDelims = '!\\$&\'\\(\\)\\*\\+,;='; // sub-delims = "!" / "$" / "&" / "'" / "(" / ")" / "*" / "+" / "," / ";" / "=" + const pctEncoded = '%' + hexDigit; // pct-encoded = "%" HEXDIG HEXDIG + const pchar = unreserved + pctEncoded + subDelims + ':@'; // pchar = unreserved / pct-encoded / sub-delims / ":" / "@" + const pcharOnly = '[' + pchar + ']'; + const decOctect = '(?:0{0,2}\\d|0?[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5])'; // dec-octet = DIGIT / %x31-39 DIGIT / "1" 2DIGIT / "2" %x30-34 DIGIT / "25" %x30-35 ; 0-9 / 10-99 / 100-199 / 200-249 / 250-255 + + rfc3986.ipv4address = '(?:' + decOctect + '\\.){3}' + decOctect; // IPv4address = dec-octet "." dec-octet "." dec-octet "." 
dec-octet + + /* + h16 = 1*4HEXDIG ; 16 bits of address represented in hexadecimal + ls32 = ( h16 ":" h16 ) / IPv4address ; least-significant 32 bits of address + IPv6address = 6( h16 ":" ) ls32 + / "::" 5( h16 ":" ) ls32 + / [ h16 ] "::" 4( h16 ":" ) ls32 + / [ *1( h16 ":" ) h16 ] "::" 3( h16 ":" ) ls32 + / [ *2( h16 ":" ) h16 ] "::" 2( h16 ":" ) ls32 + / [ *3( h16 ":" ) h16 ] "::" h16 ":" ls32 + / [ *4( h16 ":" ) h16 ] "::" ls32 + / [ *5( h16 ":" ) h16 ] "::" h16 + / [ *6( h16 ":" ) h16 ] "::" + */ + + const h16 = hexDigitOnly + '{1,4}'; + const ls32 = '(?:' + h16 + ':' + h16 + '|' + rfc3986.ipv4address + ')'; + const IPv6SixHex = '(?:' + h16 + ':){6}' + ls32; + const IPv6FiveHex = '::(?:' + h16 + ':){5}' + ls32; + const IPv6FourHex = '(?:' + h16 + ')?::(?:' + h16 + ':){4}' + ls32; + const IPv6ThreeHex = '(?:(?:' + h16 + ':){0,1}' + h16 + ')?::(?:' + h16 + ':){3}' + ls32; + const IPv6TwoHex = '(?:(?:' + h16 + ':){0,2}' + h16 + ')?::(?:' + h16 + ':){2}' + ls32; + const IPv6OneHex = '(?:(?:' + h16 + ':){0,3}' + h16 + ')?::' + h16 + ':' + ls32; + const IPv6NoneHex = '(?:(?:' + h16 + ':){0,4}' + h16 + ')?::' + ls32; + const IPv6NoneHex2 = '(?:(?:' + h16 + ':){0,5}' + h16 + ')?::' + h16; + const IPv6NoneHex3 = '(?:(?:' + h16 + ':){0,6}' + h16 + ')?::'; + + rfc3986.ipv4Cidr = '(?:\\d|[1-2]\\d|3[0-2])'; // IPv4 cidr = DIGIT / %x31-32 DIGIT / "3" %x30-32 ; 0-9 / 10-29 / 30-32 + rfc3986.ipv6Cidr = '(?:0{0,2}\\d|0?[1-9]\\d|1[01]\\d|12[0-8])'; // IPv6 cidr = DIGIT / %x31-39 DIGIT / "1" %x0-1 DIGIT / "12" %x0-8; 0-9 / 10-99 / 100-119 / 120-128 + rfc3986.ipv6address = '(?:' + IPv6SixHex + '|' + IPv6FiveHex + '|' + IPv6FourHex + '|' + IPv6ThreeHex + '|' + IPv6TwoHex + '|' + IPv6OneHex + '|' + IPv6NoneHex + '|' + IPv6NoneHex2 + '|' + IPv6NoneHex3 + ')'; + rfc3986.ipvFuture = 'v' + hexDigitOnly + '+\\.[' + unreserved + subDelims + ':]+'; // IPvFuture = "v" 1*HEXDIG "." 1*( unreserved / sub-delims / ":" ) + + rfc3986.scheme = '[a-zA-Z][a-zA-Z\\d+-\\.]*'; // scheme = ALPHA *( ALPHA / DIGIT / "+" / "-" / "." ) + rfc3986.schemeRegex = new RegExp(rfc3986.scheme); + + const userinfo = '[' + unreserved + pctEncoded + subDelims + ':]*'; // userinfo = *( unreserved / pct-encoded / sub-delims / ":" ) + const IPLiteral = '\\[(?:' + rfc3986.ipv6address + '|' + rfc3986.ipvFuture + ')\\]'; // IP-literal = "[" ( IPv6address / IPvFuture ) "]" + const regName = '[' + unreserved + pctEncoded + subDelims + ']{1,255}'; // reg-name = *( unreserved / pct-encoded / sub-delims ) + const host = '(?:' + IPLiteral + '|' + rfc3986.ipv4address + '|' + regName + ')'; // host = IP-literal / IPv4address / reg-name + const port = '\\d*'; // port = *DIGIT + const authority = '(?:' + userinfo + '@)?' 
+ host + '(?::' + port + ')?'; // authority = [ userinfo "@" ] host [ ":" port ] + const authorityCapture = '(?:' + userinfo + '@)?(' + host + ')(?::' + port + ')?'; + + /* + segment = *pchar + segment-nz = 1*pchar + path = path-abempty ; begins with "/" '|' is empty + / path-absolute ; begins with "/" but not "//" + / path-noscheme ; begins with a non-colon segment + / path-rootless ; begins with a segment + / path-empty ; zero characters + path-abempty = *( "/" segment ) + path-absolute = "/" [ segment-nz *( "/" segment ) ] + path-rootless = segment-nz *( "/" segment ) + */ + + const segment = pcharOnly + '*'; + const segmentNz = pcharOnly + '+'; + const segmentNzNc = '[' + unreserved + pctEncoded + subDelims + '@' + ']+'; + const pathEmpty = ''; + const pathAbEmpty = '(?:\\/' + segment + ')*'; + const pathAbsolute = '\\/(?:' + segmentNz + pathAbEmpty + ')?'; + const pathRootless = segmentNz + pathAbEmpty; + const pathNoScheme = segmentNzNc + pathAbEmpty; + const pathAbNoAuthority = '(?:\\/\\/\\/' + segment + pathAbEmpty + ')'; // Used by file:/// + + // hier-part = "//" authority path + + rfc3986.hierPart = '(?:' + '(?:\\/\\/' + authority + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathRootless + '|' + pathAbNoAuthority + ')'; + rfc3986.hierPartCapture = '(?:' + '(?:\\/\\/' + authorityCapture + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathRootless + ')'; + + // relative-part = "//" authority path-abempty / path-absolute / path-noscheme / path-empty + + rfc3986.relativeRef = '(?:' + '(?:\\/\\/' + authority + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathNoScheme + '|' + pathEmpty + ')'; + rfc3986.relativeRefCapture = '(?:' + '(?:\\/\\/' + authorityCapture + pathAbEmpty + ')' + '|' + pathAbsolute + '|' + pathNoScheme + '|' + pathEmpty + ')'; + + // query = *( pchar / "/" / "?" ) + // query = *( pchar / "[" / "]" / "/" / "?" ) + + rfc3986.query = '[' + pchar + '\\/\\?]*(?=#|$)'; //Finish matching either at the fragment part '|' end of the line. + rfc3986.queryWithSquareBrackets = '[' + pchar + '\\[\\]\\/\\?]*(?=#|$)'; + + // fragment = *( pchar / "/" / "?" ) + + rfc3986.fragment = '[' + pchar + '\\/\\?]*'; + + return rfc3986; +}; + +internals.rfc3986 = internals.generate(); + + +exports.ip = { + v4Cidr: internals.rfc3986.ipv4Cidr, + v6Cidr: internals.rfc3986.ipv6Cidr, + ipv4: internals.rfc3986.ipv4address, + ipv6: internals.rfc3986.ipv6address, + ipvfuture: internals.rfc3986.ipvFuture +}; + + +internals.createRegex = function (options) { + + const rfc = internals.rfc3986; + + // Construct expression + + const query = options.allowQuerySquareBrackets ? rfc.queryWithSquareBrackets : rfc.query; + const suffix = '(?:\\?' + query + ')?' + '(?:#' + rfc.fragment + ')?'; + + // relative-ref = relative-part [ "?" query ] [ "#" fragment ] + + const relative = options.domain ? 
rfc.relativeRefCapture : rfc.relativeRef; + + if (options.relativeOnly) { + return internals.wrap(relative + suffix); + } + + // Custom schemes + + let customScheme = ''; + if (options.scheme) { + Assert(options.scheme instanceof RegExp || typeof options.scheme === 'string' || Array.isArray(options.scheme), 'scheme must be a RegExp, String, or Array'); + + const schemes = [].concat(options.scheme); + Assert(schemes.length >= 1, 'scheme must have at least 1 scheme specified'); + + // Flatten the array into a string to be used to match the schemes + + const selections = []; + for (let i = 0; i < schemes.length; ++i) { + const scheme = schemes[i]; + Assert(scheme instanceof RegExp || typeof scheme === 'string', 'scheme at position ' + i + ' must be a RegExp or String'); + + if (scheme instanceof RegExp) { + selections.push(scheme.source.toString()); + } + else { + Assert(rfc.schemeRegex.test(scheme), 'scheme at position ' + i + ' must be a valid scheme'); + selections.push(EscapeRegex(scheme)); + } + } + + customScheme = selections.join('|'); + } + + // URI = scheme ":" hier-part [ "?" query ] [ "#" fragment ] + + const scheme = customScheme ? '(?:' + customScheme + ')' : rfc.scheme; + const absolute = '(?:' + scheme + ':' + (options.domain ? rfc.hierPartCapture : rfc.hierPart) + ')'; + const prefix = options.allowRelative ? '(?:' + absolute + '|' + relative + ')' : absolute; + return internals.wrap(prefix + suffix, customScheme); +}; + + +internals.wrap = function (raw, scheme) { + + raw = `(?=.)(?!https?\:/$)${raw}`; // Require at least one character and explicitly forbid 'http:/' + + return { + raw, + regex: new RegExp(`^${raw}$`), + scheme + }; +}; + + +internals.uriRegex = internals.createRegex({}); + + +exports.regex = function (options = {}) { + + if (options.scheme || + options.allowRelative || + options.relativeOnly || + options.allowQuerySquareBrackets || + options.domain) { + + return internals.createRegex(options); + } + + return internals.uriRegex; +}; diff --git a/node_modules/@sideway/address/package.json b/node_modules/@sideway/address/package.json new file mode 100644 index 000000000..5728bad9b --- /dev/null +++ b/node_modules/@sideway/address/package.json @@ -0,0 +1,63 @@ +{ + "_from": "@sideway/address@^4.1.0", + "_id": "@sideway/address@4.1.2", + "_inBundle": false, + "_integrity": "sha512-idTz8ibqWFrPU8kMirL0CoPH/A29XOzzAzpyN3zQ4kAWnzmNfFmRaoMNN6VI8ske5M73HZyhIaW4OuSFIdM4oA==", + "_location": "/@sideway/address", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@sideway/address@^4.1.0", + "name": "@sideway/address", + "escapedName": "@sideway%2faddress", + "scope": "@sideway", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/joi" + ], + "_resolved": "https://registry.npmjs.org/@sideway/address/-/address-4.1.2.tgz", + "_shasum": "811b84333a335739d3969cfc434736268170cad1", + "_spec": "@sideway/address@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\joi", + "bugs": { + "url": "https://github.com/sideway/address/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@hapi/hoek": "^9.0.0" + }, + "deprecated": false, + "description": "Email address and domain validation", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/lab": "24.x.x", + "typescript": "4.0.x" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/sideway/address#readme", + "keywords": [ + "email", + "domain", + "address", + 
"validation" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@sideway/address", + "repository": { + "type": "git", + "url": "git://github.com/sideway/address.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "4.1.2" +} diff --git a/node_modules/@sideway/formula/LICENSE.md b/node_modules/@sideway/formula/LICENSE.md new file mode 100644 index 000000000..995d34079 --- /dev/null +++ b/node_modules/@sideway/formula/LICENSE.md @@ -0,0 +1,9 @@ +Copyright (c) 2019-2020, Sideway. Inc, and project contributors +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + * Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + * Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + * The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@sideway/formula/README.md b/node_modules/@sideway/formula/README.md new file mode 100644 index 000000000..13da3afae --- /dev/null +++ b/node_modules/@sideway/formula/README.md @@ -0,0 +1,18 @@ +# @sideway/formula + +#### Math and string formula parser. + +**formula** is part of the **joi** ecosystem. + +### Visit the [joi.dev](https://joi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://joi.dev/module/formula/) +- [Version status](https://joi.dev/resources/status/#formula) (builds, dependencies, node versions, licenses, eol) +- [Changelog](https://joi.dev/module/formula/changelog/) +- [Project policies](https://joi.dev/policies/) + +## Acknowledgements + +Inspired by [**fparse**](https://github.com/bylexus/fparse), copyright 2012-2018 Alexander Schenkel diff --git a/node_modules/@sideway/formula/lib/index.d.ts b/node_modules/@sideway/formula/lib/index.d.ts new file mode 100644 index 000000000..d78cc1d47 --- /dev/null +++ b/node_modules/@sideway/formula/lib/index.d.ts @@ -0,0 +1,52 @@ +/** + * Formula parser + */ +export class Parser { + + /** + * Create a new formula parser. + * + * @param formula - the formula string to parse. + * @param options - optional settings. + */ + constructor(formula: string, options?: Options); + + /** + * Evaluate the formula. 
+ * + * @param context - optional object with runtime formula context used to resolve variables. + * + * @returns the string or number outcome of the resolved formula. + */ + evaluate(context?: any): T; +} + + +export interface Options { + + /** + * A hash of key - value pairs used to convert constants to values. + */ + readonly constants?: Record; + + /** + * A regular expression used to validate token variables. + */ + readonly tokenRx?: RegExp; + + /** + * A variable resolver factory function. + */ + readonly reference?: Options.Reference; + + /** + * A hash of key-value pairs used to resolve formula functions. + */ + readonly functions?: Record; +} + + +export namespace Options { + + type Reference = (name: string) => (context: any) => any; +} diff --git a/node_modules/@sideway/formula/lib/index.js b/node_modules/@sideway/formula/lib/index.js new file mode 100644 index 000000000..871ab1bbb --- /dev/null +++ b/node_modules/@sideway/formula/lib/index.js @@ -0,0 +1,456 @@ +'use strict'; + +const internals = { + operators: ['!', '^', '*', '/', '%', '+', '-', '<', '<=', '>', '>=', '==', '!=', '&&', '||', '??'], + operatorCharacters: ['!', '^', '*', '/', '%', '+', '-', '<', '=', '>', '&', '|', '?'], + operatorsOrder: [['^'], ['*', '/', '%'], ['+', '-'], ['<', '<=', '>', '>='], ['==', '!='], ['&&'], ['||', '??']], + operatorsPrefix: ['!', 'n'], + + literals: { + '"': '"', + '`': '`', + '\'': '\'', + '[': ']' + }, + + numberRx: /^(?:[0-9]*\.?[0-9]*){1}$/, + tokenRx: /^[\w\$\#\.\@\:\{\}]+$/, + + symbol: Symbol('formula'), + settings: Symbol('settings') +}; + + +exports.Parser = class { + + constructor(string, options = {}) { + + if (!options[internals.settings] && + options.constants) { + + for (const constant in options.constants) { + const value = options.constants[constant]; + if (value !== null && + !['boolean', 'number', 'string'].includes(typeof value)) { + + throw new Error(`Formula constant ${constant} contains invalid ${typeof value} value type`); + } + } + } + + this.settings = options[internals.settings] ? options : Object.assign({ [internals.settings]: true, constants: {}, functions: {} }, options); + this.single = null; + + this._parts = null; + this._parse(string); + } + + _parse(string) { + + let parts = []; + let current = ''; + let parenthesis = 0; + let literal = false; + + const flush = (inner) => { + + if (parenthesis) { + throw new Error('Formula missing closing parenthesis'); + } + + const last = parts.length ? 
parts[parts.length - 1] : null; + + if (!literal && + !current && + !inner) { + + return; + } + + if (last && + last.type === 'reference' && + inner === ')') { // Function + + last.type = 'function'; + last.value = this._subFormula(current, last.value); + current = ''; + return; + } + + if (inner === ')') { // Segment + const sub = new exports.Parser(current, this.settings); + parts.push({ type: 'segment', value: sub }); + } + else if (literal) { + if (literal === ']') { // Reference + parts.push({ type: 'reference', value: current }); + current = ''; + return; + } + + parts.push({ type: 'literal', value: current }); // Literal + } + else if (internals.operatorCharacters.includes(current)) { // Operator + if (last && + last.type === 'operator' && + internals.operators.includes(last.value + current)) { // 2 characters operator + + last.value += current; + } + else { + parts.push({ type: 'operator', value: current }); + } + } + else if (current.match(internals.numberRx)) { // Number + parts.push({ type: 'constant', value: parseFloat(current) }); + } + else if (this.settings.constants[current] !== undefined) { // Constant + parts.push({ type: 'constant', value: this.settings.constants[current] }); + } + else { // Reference + if (!current.match(internals.tokenRx)) { + throw new Error(`Formula contains invalid token: ${current}`); + } + + parts.push({ type: 'reference', value: current }); + } + + current = ''; + }; + + for (const c of string) { + if (literal) { + if (c === literal) { + flush(); + literal = false; + } + else { + current += c; + } + } + else if (parenthesis) { + if (c === '(') { + current += c; + ++parenthesis; + } + else if (c === ')') { + --parenthesis; + if (!parenthesis) { + flush(c); + } + else { + current += c; + } + } + else { + current += c; + } + } + else if (c in internals.literals) { + literal = internals.literals[c]; + } + else if (c === '(') { + flush(); + ++parenthesis; + } + else if (internals.operatorCharacters.includes(c)) { + flush(); + current = c; + flush(); + } + else if (c !== ' ') { + current += c; + } + else { + flush(); + } + } + + flush(); + + // Replace prefix - to internal negative operator + + parts = parts.map((part, i) => { + + if (part.type !== 'operator' || + part.value !== '-' || + i && parts[i - 1].type !== 'operator') { + + return part; + } + + return { type: 'operator', value: 'n' }; + }); + + // Validate tokens order + + let operator = false; + for (const part of parts) { + if (part.type === 'operator') { + if (internals.operatorsPrefix.includes(part.value)) { + continue; + } + + if (!operator) { + throw new Error('Formula contains an operator in invalid position'); + } + + if (!internals.operators.includes(part.value)) { + throw new Error(`Formula contains an unknown operator ${part.value}`); + } + } + else if (operator) { + throw new Error('Formula missing expected operator'); + } + + operator = !operator; + } + + if (!operator) { + throw new Error('Formula contains invalid trailing operator'); + } + + // Identify single part + + if (parts.length === 1 && + ['reference', 'literal', 'constant'].includes(parts[0].type)) { + + this.single = { type: parts[0].type === 'reference' ? 'reference' : 'value', value: parts[0].value }; + } + + // Process parts + + this._parts = parts.map((part) => { + + // Operators + + if (part.type === 'operator') { + return internals.operatorsPrefix.includes(part.value) ? 
part : part.value; + } + + // Literals, constants, segments + + if (part.type !== 'reference') { + return part.value; + } + + // References + + if (this.settings.tokenRx && + !this.settings.tokenRx.test(part.value)) { + + throw new Error(`Formula contains invalid reference ${part.value}`); + } + + if (this.settings.reference) { + return this.settings.reference(part.value); + } + + return internals.reference(part.value); + }); + } + + _subFormula(string, name) { + + const method = this.settings.functions[name]; + if (typeof method !== 'function') { + throw new Error(`Formula contains unknown function ${name}`); + } + + let args = []; + if (string) { + let current = ''; + let parenthesis = 0; + let literal = false; + + const flush = () => { + + if (!current) { + throw new Error(`Formula contains function ${name} with invalid arguments ${string}`); + } + + args.push(current); + current = ''; + }; + + for (let i = 0; i < string.length; ++i) { + const c = string[i]; + if (literal) { + current += c; + if (c === literal) { + literal = false; + } + } + else if (c in internals.literals && + !parenthesis) { + + current += c; + literal = internals.literals[c]; + } + else if (c === ',' && + !parenthesis) { + + flush(); + } + else { + current += c; + if (c === '(') { + ++parenthesis; + } + else if (c === ')') { + --parenthesis; + } + } + } + + flush(); + } + + args = args.map((arg) => new exports.Parser(arg, this.settings)); + + return function (context) { + + const innerValues = []; + for (const arg of args) { + innerValues.push(arg.evaluate(context)); + } + + return method.call(context, ...innerValues); + }; + } + + evaluate(context) { + + const parts = this._parts.slice(); + + // Prefix operators + + for (let i = parts.length - 2; i >= 0; --i) { + const part = parts[i]; + if (part && + part.type === 'operator') { + + const current = parts[i + 1]; + parts.splice(i + 1, 1); + const value = internals.evaluate(current, context); + parts[i] = internals.single(part.value, value); + } + } + + // Left-right operators + + internals.operatorsOrder.forEach((set) => { + + for (let i = 1; i < parts.length - 1;) { + if (set.includes(parts[i])) { + const operator = parts[i]; + const left = internals.evaluate(parts[i - 1], context); + const right = internals.evaluate(parts[i + 1], context); + + parts.splice(i, 2); + const result = internals.calculate(operator, left, right); + parts[i - 1] = result === 0 ? 0 : result; // Convert -0 + } + else { + i += 2; + } + } + }); + + return internals.evaluate(parts[0], context); + } +}; + + +exports.Parser.prototype[internals.symbol] = true; + + +internals.reference = function (name) { + + return function (context) { + + return context && context[name] !== undefined ? context[name] : null; + }; +}; + + +internals.evaluate = function (part, context) { + + if (part === null) { + return null; + } + + if (typeof part === 'function') { + return part(context); + } + + if (part[internals.symbol]) { + return part.evaluate(context); + } + + return part; +}; + + +internals.single = function (operator, value) { + + if (operator === '!') { + return value ? false : true; + } + + // operator === 'n' + + const negative = -value; + if (negative === 0) { // Override -0 + return 0; + } + + return negative; +}; + + +internals.calculate = function (operator, left, right) { + + if (operator === '??') { + return internals.exists(left) ? left : right; + } + + if (typeof left === 'string' || + typeof right === 'string') { + + if (operator === '+') { + left = internals.exists(left) ? 
left : ''; + right = internals.exists(right) ? right : ''; + return left + right; + } + } + else { + switch (operator) { + case '^': return Math.pow(left, right); + case '*': return left * right; + case '/': return left / right; + case '%': return left % right; + case '+': return left + right; + case '-': return left - right; + } + } + + switch (operator) { + case '<': return left < right; + case '<=': return left <= right; + case '>': return left > right; + case '>=': return left >= right; + case '==': return left === right; + case '!=': return left !== right; + case '&&': return left && right; + case '||': return left || right; + } + + return null; +}; + + +internals.exists = function (value) { + + return value !== null && value !== undefined; +}; diff --git a/node_modules/@sideway/formula/package.json b/node_modules/@sideway/formula/package.json new file mode 100644 index 000000000..4a1efebeb --- /dev/null +++ b/node_modules/@sideway/formula/package.json @@ -0,0 +1,61 @@ +{ + "_from": "@sideway/formula@^3.0.0", + "_id": "@sideway/formula@3.0.0", + "_inBundle": false, + "_integrity": "sha512-vHe7wZ4NOXVfkoRb8T5otiENVlT7a3IAiw7H5M2+GO+9CDgcVUUsX1zalAztCmwyOr2RUTGJdgB+ZvSVqmdHmg==", + "_location": "/@sideway/formula", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@sideway/formula@^3.0.0", + "name": "@sideway/formula", + "escapedName": "@sideway%2fformula", + "scope": "@sideway", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/joi" + ], + "_resolved": "https://registry.npmjs.org/@sideway/formula/-/formula-3.0.0.tgz", + "_shasum": "fe158aee32e6bd5de85044be615bc08478a0a13c", + "_spec": "@sideway/formula@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\joi", + "bugs": { + "url": "https://github.com/sideway/formula/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Math and string formula parser.", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/lab": "24.x.x", + "typescript": "4.0.x" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/sideway/formula#readme", + "keywords": [ + "formula", + "parser", + "math", + "string" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@sideway/formula", + "repository": { + "type": "git", + "url": "git://github.com/sideway/formula.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "3.0.0" +} diff --git a/node_modules/@sideway/pinpoint/LICENSE.md b/node_modules/@sideway/pinpoint/LICENSE.md new file mode 100644 index 000000000..f0dfb1717 --- /dev/null +++ b/node_modules/@sideway/pinpoint/LICENSE.md @@ -0,0 +1,10 @@ +Copyright (c) 2019-2020, Sideway. Inc, and project contributors + +All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: +* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. 
+* The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS OFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/@sideway/pinpoint/README.md b/node_modules/@sideway/pinpoint/README.md new file mode 100644 index 000000000..2996c8211 --- /dev/null +++ b/node_modules/@sideway/pinpoint/README.md @@ -0,0 +1,14 @@ +# @sideway/pinpoint + +#### Return the filename and line number of the calling function. + +**pinpoint** is part of the **joi** ecosystem. + +### Visit the [joi.dev](https://joi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://joi.dev/module/pinpoint/) +- [Version status](https://joi.dev/resources/status/#pinpoint) (builds, dependencies, node versions, licenses, eol) +- [Changelog](https://joi.dev/module/pinpoint/changelog/) +- [Project policies](https://joi.dev/policies/) diff --git a/node_modules/@sideway/pinpoint/lib/index.d.ts b/node_modules/@sideway/pinpoint/lib/index.d.ts new file mode 100644 index 000000000..38fadaa3f --- /dev/null +++ b/node_modules/@sideway/pinpoint/lib/index.d.ts @@ -0,0 +1,24 @@ +/** +Returns the filename and line number of the caller in the call stack + +@param depth - The distance from the location function in the call stack. Defaults to 1 (caller). + +@return an object with the filename and line number. +*/ +export function location(depth?: number): location.Location; + +declare namespace location { + + interface Location { + + /** + The fully qualified filename. + */ + readonly filename: string; + + /** + The file line number. 
+ */ + readonly line: number; + } +} diff --git a/node_modules/@sideway/pinpoint/lib/index.js b/node_modules/@sideway/pinpoint/lib/index.js new file mode 100644 index 000000000..482055097 --- /dev/null +++ b/node_modules/@sideway/pinpoint/lib/index.js @@ -0,0 +1,21 @@ +'use strict'; + +const internals = {}; + + +exports.location = function (depth = 0) { + + const orig = Error.prepareStackTrace; + Error.prepareStackTrace = (ignore, stack) => stack; + + const capture = {}; + Error.captureStackTrace(capture, this); + const line = capture.stack[depth + 1]; + + Error.prepareStackTrace = orig; + + return { + filename: line.getFileName(), + line: line.getLineNumber() + }; +}; diff --git a/node_modules/@sideway/pinpoint/package.json b/node_modules/@sideway/pinpoint/package.json new file mode 100644 index 000000000..04679c652 --- /dev/null +++ b/node_modules/@sideway/pinpoint/package.json @@ -0,0 +1,58 @@ +{ + "_from": "@sideway/pinpoint@^2.0.0", + "_id": "@sideway/pinpoint@2.0.0", + "_inBundle": false, + "_integrity": "sha512-RNiOoTPkptFtSVzQevY/yWtZwf/RxyVnPy/OcA9HBM3MlGDnBEYL5B41H0MTn0Uec8Hi+2qUtTfG2WWZBmMejQ==", + "_location": "/@sideway/pinpoint", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@sideway/pinpoint@^2.0.0", + "name": "@sideway/pinpoint", + "escapedName": "@sideway%2fpinpoint", + "scope": "@sideway", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/joi" + ], + "_resolved": "https://registry.npmjs.org/@sideway/pinpoint/-/pinpoint-2.0.0.tgz", + "_shasum": "cff8ffadc372ad29fd3f78277aeb29e632cc70df", + "_spec": "@sideway/pinpoint@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\joi", + "bugs": { + "url": "https://github.com/sideway/pinpoint/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Return the filename and line number of the calling function", + "devDependencies": { + "@hapi/code": "8.x.x", + "@hapi/lab": "24.x.x", + "typescript": "4.0.x" + }, + "files": [ + "lib" + ], + "homepage": "https://github.com/sideway/pinpoint#readme", + "keywords": [ + "utilities" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "@sideway/pinpoint", + "repository": { + "type": "git", + "url": "git://github.com/sideway/pinpoint.git" + }, + "scripts": { + "test": "lab -a @hapi/code -t 100 -L -Y", + "test-cov-html": "lab -a @hapi/code -t 100 -L -r html -o coverage.html" + }, + "types": "lib/index.d.ts", + "version": "2.0.0" +} diff --git a/node_modules/@sindresorhus/is/dist/index.d.ts b/node_modules/@sindresorhus/is/dist/index.d.ts new file mode 100644 index 000000000..e94d30b76 --- /dev/null +++ b/node_modules/@sindresorhus/is/dist/index.d.ts @@ -0,0 +1,132 @@ +/// +/// +/// +/// +/// +declare type TypedArray = Int8Array | Uint8Array | Uint8ClampedArray | Int16Array | Uint16Array | Int32Array | Uint32Array | Float32Array | Float64Array; +declare type Primitive = null | undefined | string | number | boolean | Symbol; +export interface ArrayLike { + length: number; +} +export interface Class { + new (...args: any[]): T; +} +declare type DomElement = object & { + nodeType: 1; + nodeName: string; +}; +declare type NodeStream = object & { + pipe: Function; +}; +export declare const enum TypeName { + null = "null", + boolean = "boolean", + undefined = "undefined", + string = "string", + number = "number", + symbol = "symbol", + Function = "Function", + GeneratorFunction = 
"GeneratorFunction", + AsyncFunction = "AsyncFunction", + Observable = "Observable", + Array = "Array", + Buffer = "Buffer", + Object = "Object", + RegExp = "RegExp", + Date = "Date", + Error = "Error", + Map = "Map", + Set = "Set", + WeakMap = "WeakMap", + WeakSet = "WeakSet", + Int8Array = "Int8Array", + Uint8Array = "Uint8Array", + Uint8ClampedArray = "Uint8ClampedArray", + Int16Array = "Int16Array", + Uint16Array = "Uint16Array", + Int32Array = "Int32Array", + Uint32Array = "Uint32Array", + Float32Array = "Float32Array", + Float64Array = "Float64Array", + ArrayBuffer = "ArrayBuffer", + SharedArrayBuffer = "SharedArrayBuffer", + DataView = "DataView", + Promise = "Promise", + URL = "URL" +} +declare function is(value: unknown): TypeName; +declare namespace is { + const undefined: (value: unknown) => value is undefined; + const string: (value: unknown) => value is string; + const number: (value: unknown) => value is number; + const function_: (value: unknown) => value is Function; + const null_: (value: unknown) => value is null; + const class_: (value: unknown) => value is Class; + const boolean: (value: unknown) => value is boolean; + const symbol: (value: unknown) => value is Symbol; + const numericString: (value: unknown) => boolean; + const array: (arg: any) => arg is any[]; + const buffer: (input: unknown) => input is Buffer; + const nullOrUndefined: (value: unknown) => value is null | undefined; + const object: (value: unknown) => value is object; + const iterable: (value: unknown) => value is IterableIterator; + const asyncIterable: (value: unknown) => value is AsyncIterableIterator; + const generator: (value: unknown) => value is Generator; + const nativePromise: (value: unknown) => value is Promise; + const promise: (value: unknown) => value is Promise; + const generatorFunction: (value: unknown) => value is GeneratorFunction; + const asyncFunction: (value: unknown) => value is Function; + const boundFunction: (value: unknown) => value is Function; + const regExp: (value: unknown) => value is RegExp; + const date: (value: unknown) => value is Date; + const error: (value: unknown) => value is Error; + const map: (value: unknown) => value is Map; + const set: (value: unknown) => value is Set; + const weakMap: (value: unknown) => value is WeakMap; + const weakSet: (value: unknown) => value is WeakSet; + const int8Array: (value: unknown) => value is Int8Array; + const uint8Array: (value: unknown) => value is Uint8Array; + const uint8ClampedArray: (value: unknown) => value is Uint8ClampedArray; + const int16Array: (value: unknown) => value is Int16Array; + const uint16Array: (value: unknown) => value is Uint16Array; + const int32Array: (value: unknown) => value is Int32Array; + const uint32Array: (value: unknown) => value is Uint32Array; + const float32Array: (value: unknown) => value is Float32Array; + const float64Array: (value: unknown) => value is Float64Array; + const arrayBuffer: (value: unknown) => value is ArrayBuffer; + const sharedArrayBuffer: (value: unknown) => value is SharedArrayBuffer; + const dataView: (value: unknown) => value is DataView; + const directInstanceOf: (instance: unknown, klass: Class) => instance is T; + const urlInstance: (value: unknown) => value is URL; + const urlString: (value: unknown) => boolean; + const truthy: (value: unknown) => boolean; + const falsy: (value: unknown) => boolean; + const nan: (value: unknown) => boolean; + const primitive: (value: unknown) => value is Primitive; + const integer: (value: unknown) => value is number; + const 
safeInteger: (value: unknown) => value is number; + const plainObject: (value: unknown) => boolean; + const typedArray: (value: unknown) => value is TypedArray; + const arrayLike: (value: unknown) => value is ArrayLike; + const inRange: (value: number, range: number | number[]) => boolean; + const domElement: (value: unknown) => value is DomElement; + const observable: (value: unknown) => boolean; + const nodeStream: (value: unknown) => value is NodeStream; + const infinite: (value: unknown) => boolean; + const even: (value: number) => boolean; + const odd: (value: number) => boolean; + const emptyArray: (value: unknown) => boolean; + const nonEmptyArray: (value: unknown) => boolean; + const emptyString: (value: unknown) => boolean; + const nonEmptyString: (value: unknown) => boolean; + const emptyStringOrWhitespace: (value: unknown) => boolean; + const emptyObject: (value: unknown) => boolean; + const nonEmptyObject: (value: unknown) => boolean; + const emptySet: (value: unknown) => boolean; + const nonEmptySet: (value: unknown) => boolean; + const emptyMap: (value: unknown) => boolean; + const nonEmptyMap: (value: unknown) => boolean; + const any: (predicate: unknown, ...values: unknown[]) => boolean; + const all: (predicate: unknown, ...values: unknown[]) => boolean; +} +export default is; diff --git a/node_modules/@sindresorhus/is/dist/index.js b/node_modules/@sindresorhus/is/dist/index.js new file mode 100644 index 000000000..3cbafae18 --- /dev/null +++ b/node_modules/@sindresorhus/is/dist/index.js @@ -0,0 +1,245 @@ +"use strict"; +/// +/// +/// +/// +Object.defineProperty(exports, "__esModule", { value: true }); +// TODO: Use the `URL` global when targeting Node.js 10 +// tslint:disable-next-line +const URLGlobal = typeof URL === 'undefined' ? 
require('url').URL : URL; +const toString = Object.prototype.toString; +const isOfType = (type) => (value) => typeof value === type; +const isBuffer = (input) => !is.nullOrUndefined(input) && !is.nullOrUndefined(input.constructor) && is.function_(input.constructor.isBuffer) && input.constructor.isBuffer(input); +const getObjectType = (value) => { + const objectName = toString.call(value).slice(8, -1); + if (objectName) { + return objectName; + } + return null; +}; +const isObjectOfType = (type) => (value) => getObjectType(value) === type; +function is(value) { + switch (value) { + case null: + return "null" /* null */; + case true: + case false: + return "boolean" /* boolean */; + default: + } + switch (typeof value) { + case 'undefined': + return "undefined" /* undefined */; + case 'string': + return "string" /* string */; + case 'number': + return "number" /* number */; + case 'symbol': + return "symbol" /* symbol */; + default: + } + if (is.function_(value)) { + return "Function" /* Function */; + } + if (is.observable(value)) { + return "Observable" /* Observable */; + } + if (Array.isArray(value)) { + return "Array" /* Array */; + } + if (isBuffer(value)) { + return "Buffer" /* Buffer */; + } + const tagType = getObjectType(value); + if (tagType) { + return tagType; + } + if (value instanceof String || value instanceof Boolean || value instanceof Number) { + throw new TypeError('Please don\'t use object wrappers for primitive types'); + } + return "Object" /* Object */; +} +(function (is) { + // tslint:disable-next-line:strict-type-predicates + const isObject = (value) => typeof value === 'object'; + // tslint:disable:variable-name + is.undefined = isOfType('undefined'); + is.string = isOfType('string'); + is.number = isOfType('number'); + is.function_ = isOfType('function'); + // tslint:disable-next-line:strict-type-predicates + is.null_ = (value) => value === null; + is.class_ = (value) => is.function_(value) && value.toString().startsWith('class '); + is.boolean = (value) => value === true || value === false; + is.symbol = isOfType('symbol'); + // tslint:enable:variable-name + is.numericString = (value) => is.string(value) && value.length > 0 && !Number.isNaN(Number(value)); + is.array = Array.isArray; + is.buffer = isBuffer; + is.nullOrUndefined = (value) => is.null_(value) || is.undefined(value); + is.object = (value) => !is.nullOrUndefined(value) && (is.function_(value) || isObject(value)); + is.iterable = (value) => !is.nullOrUndefined(value) && is.function_(value[Symbol.iterator]); + is.asyncIterable = (value) => !is.nullOrUndefined(value) && is.function_(value[Symbol.asyncIterator]); + is.generator = (value) => is.iterable(value) && is.function_(value.next) && is.function_(value.throw); + is.nativePromise = (value) => isObjectOfType("Promise" /* Promise */)(value); + const hasPromiseAPI = (value) => !is.null_(value) && + isObject(value) && + is.function_(value.then) && + is.function_(value.catch); + is.promise = (value) => is.nativePromise(value) || hasPromiseAPI(value); + is.generatorFunction = isObjectOfType("GeneratorFunction" /* GeneratorFunction */); + is.asyncFunction = isObjectOfType("AsyncFunction" /* AsyncFunction */); + is.boundFunction = (value) => is.function_(value) && !value.hasOwnProperty('prototype'); + is.regExp = isObjectOfType("RegExp" /* RegExp */); + is.date = isObjectOfType("Date" /* Date */); + is.error = isObjectOfType("Error" /* Error */); + is.map = (value) => isObjectOfType("Map" /* Map */)(value); + is.set = (value) => isObjectOfType("Set" /* Set 
*/)(value); + is.weakMap = (value) => isObjectOfType("WeakMap" /* WeakMap */)(value); + is.weakSet = (value) => isObjectOfType("WeakSet" /* WeakSet */)(value); + is.int8Array = isObjectOfType("Int8Array" /* Int8Array */); + is.uint8Array = isObjectOfType("Uint8Array" /* Uint8Array */); + is.uint8ClampedArray = isObjectOfType("Uint8ClampedArray" /* Uint8ClampedArray */); + is.int16Array = isObjectOfType("Int16Array" /* Int16Array */); + is.uint16Array = isObjectOfType("Uint16Array" /* Uint16Array */); + is.int32Array = isObjectOfType("Int32Array" /* Int32Array */); + is.uint32Array = isObjectOfType("Uint32Array" /* Uint32Array */); + is.float32Array = isObjectOfType("Float32Array" /* Float32Array */); + is.float64Array = isObjectOfType("Float64Array" /* Float64Array */); + is.arrayBuffer = isObjectOfType("ArrayBuffer" /* ArrayBuffer */); + is.sharedArrayBuffer = isObjectOfType("SharedArrayBuffer" /* SharedArrayBuffer */); + is.dataView = isObjectOfType("DataView" /* DataView */); + is.directInstanceOf = (instance, klass) => Object.getPrototypeOf(instance) === klass.prototype; + is.urlInstance = (value) => isObjectOfType("URL" /* URL */)(value); + is.urlString = (value) => { + if (!is.string(value)) { + return false; + } + try { + new URLGlobal(value); // tslint:disable-line no-unused-expression + return true; + } + catch (_a) { + return false; + } + }; + is.truthy = (value) => Boolean(value); + is.falsy = (value) => !value; + is.nan = (value) => Number.isNaN(value); + const primitiveTypes = new Set([ + 'undefined', + 'string', + 'number', + 'boolean', + 'symbol' + ]); + is.primitive = (value) => is.null_(value) || primitiveTypes.has(typeof value); + is.integer = (value) => Number.isInteger(value); + is.safeInteger = (value) => Number.isSafeInteger(value); + is.plainObject = (value) => { + // From: https://github.com/sindresorhus/is-plain-obj/blob/master/index.js + let prototype; + return getObjectType(value) === "Object" /* Object */ && + (prototype = Object.getPrototypeOf(value), prototype === null || // tslint:disable-line:ban-comma-operator + prototype === Object.getPrototypeOf({})); + }; + const typedArrayTypes = new Set([ + "Int8Array" /* Int8Array */, + "Uint8Array" /* Uint8Array */, + "Uint8ClampedArray" /* Uint8ClampedArray */, + "Int16Array" /* Int16Array */, + "Uint16Array" /* Uint16Array */, + "Int32Array" /* Int32Array */, + "Uint32Array" /* Uint32Array */, + "Float32Array" /* Float32Array */, + "Float64Array" /* Float64Array */ + ]); + is.typedArray = (value) => { + const objectType = getObjectType(value); + if (objectType === null) { + return false; + } + return typedArrayTypes.has(objectType); + }; + const isValidLength = (value) => is.safeInteger(value) && value > -1; + is.arrayLike = (value) => !is.nullOrUndefined(value) && !is.function_(value) && isValidLength(value.length); + is.inRange = (value, range) => { + if (is.number(range)) { + return value >= Math.min(0, range) && value <= Math.max(range, 0); + } + if (is.array(range) && range.length === 2) { + return value >= Math.min(...range) && value <= Math.max(...range); + } + throw new TypeError(`Invalid range: ${JSON.stringify(range)}`); + }; + const NODE_TYPE_ELEMENT = 1; + const DOM_PROPERTIES_TO_CHECK = [ + 'innerHTML', + 'ownerDocument', + 'style', + 'attributes', + 'nodeValue' + ]; + is.domElement = (value) => is.object(value) && value.nodeType === NODE_TYPE_ELEMENT && is.string(value.nodeName) && + !is.plainObject(value) && DOM_PROPERTIES_TO_CHECK.every(property => property in value); + is.observable = (value) => { 
+ if (!value) { + return false; + } + if (value[Symbol.observable] && value === value[Symbol.observable]()) { + return true; + } + if (value['@@observable'] && value === value['@@observable']()) { + return true; + } + return false; + }; + is.nodeStream = (value) => !is.nullOrUndefined(value) && isObject(value) && is.function_(value.pipe) && !is.observable(value); + is.infinite = (value) => value === Infinity || value === -Infinity; + const isAbsoluteMod2 = (rem) => (value) => is.integer(value) && Math.abs(value % 2) === rem; + is.even = isAbsoluteMod2(0); + is.odd = isAbsoluteMod2(1); + const isWhiteSpaceString = (value) => is.string(value) && /\S/.test(value) === false; + is.emptyArray = (value) => is.array(value) && value.length === 0; + is.nonEmptyArray = (value) => is.array(value) && value.length > 0; + is.emptyString = (value) => is.string(value) && value.length === 0; + is.nonEmptyString = (value) => is.string(value) && value.length > 0; + is.emptyStringOrWhitespace = (value) => is.emptyString(value) || isWhiteSpaceString(value); + is.emptyObject = (value) => is.object(value) && !is.map(value) && !is.set(value) && Object.keys(value).length === 0; + is.nonEmptyObject = (value) => is.object(value) && !is.map(value) && !is.set(value) && Object.keys(value).length > 0; + is.emptySet = (value) => is.set(value) && value.size === 0; + is.nonEmptySet = (value) => is.set(value) && value.size > 0; + is.emptyMap = (value) => is.map(value) && value.size === 0; + is.nonEmptyMap = (value) => is.map(value) && value.size > 0; + const predicateOnArray = (method, predicate, values) => { + if (is.function_(predicate) === false) { + throw new TypeError(`Invalid predicate: ${JSON.stringify(predicate)}`); + } + if (values.length === 0) { + throw new TypeError('Invalid number of values'); + } + return method.call(values, predicate); + }; + // tslint:disable variable-name + is.any = (predicate, ...values) => predicateOnArray(Array.prototype.some, predicate, values); + is.all = (predicate, ...values) => predicateOnArray(Array.prototype.every, predicate, values); + // tslint:enable variable-name +})(is || (is = {})); +// Some few keywords are reserved, but we'll populate them for Node.js users +// See https://github.com/Microsoft/TypeScript/issues/2536 +Object.defineProperties(is, { + class: { + value: is.class_ + }, + function: { + value: is.function_ + }, + null: { + value: is.null_ + } +}); +exports.default = is; +// For CommonJS default export support +module.exports = is; +module.exports.default = is; +//# sourceMappingURL=index.js.map \ No newline at end of file diff --git a/node_modules/@sindresorhus/is/dist/index.js.map b/node_modules/@sindresorhus/is/dist/index.js.map new file mode 100644 index 000000000..cd827fc29 --- /dev/null +++ b/node_modules/@sindresorhus/is/dist/index.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"index.js","sourceRoot":"","sources":["../source/index.ts"],"names":[],"mappings":";AAAA,6BAA6B;AAC7B,0CAA0C;AAC1C,2CAA2C;AAC3C,0BAA0B;;AAE1B,uDAAuD;AACvD,2BAA2B;AAC3B,MAAM,SAAS,GAAG,OAAO,GAAG,KAAK,WAAW,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC;AAqDxE,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC;AAC3C,MAAM,QAAQ,GAAG,CAAI,IAAY,EAAE,EAAE,CAAC,CAAC,KAAc,EAAc,EAAE,CAAC,OAAO,KAAK,KAAK,IAAI,CAAC;AAC5F,MAAM,QAAQ,GAAG,CAAC,KAAc,EAAmB,EAAE,CAAC,CAAC,EAAE,CAAC,eAAe,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE,CAAC,eAAe,CAAE,KAAgB,CAAC,WAAW,CAAC,IAAI,EAAE,CAAC,SAAS,CAAE,KAAgB,CAAC,WAAW,CAAC,QAAQ,CAAC,IAAK,KAAgB,CAAC,WAAW,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC;AAEhP,MAAM,aAAa,GAAG,CAAC,KAAc,EAAmB,EAAE;IACzD,MAAM,UAAU,GAAG,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,KAAK,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC;IAErD,IAAI,UAAU,EAAE;QACf,OAAO,UAAsB,CAAC;KAC9B;IAED,OAAO,IAAI,CAAC;AACb,CAAC,CAAC;AAEF,MAAM,cAAc,GAAG,CAAI,IAAc,EAAE,EAAE,CAAC,CAAC,KAAc,EAAc,EAAE,CAAC,aAAa,CAAC,KAAK,CAAC,KAAK,IAAI,CAAC;AAE5G,SAAS,EAAE,CAAC,KAAc;IACzB,QAAQ,KAAK,EAAE;QACd,KAAK,IAAI;YACR,yBAAqB;QACtB,KAAK,IAAI,CAAC;QACV,KAAK,KAAK;YACT,+BAAwB;QACzB,QAAQ;KACR;IAED,QAAQ,OAAO,KAAK,EAAE;QACrB,KAAK,WAAW;YACf,mCAA0B;QAC3B,KAAK,QAAQ;YACZ,6BAAuB;QACxB,KAAK,QAAQ;YACZ,6BAAuB;QACxB,KAAK,QAAQ;YACZ,6BAAuB;QACxB,QAAQ;KACR;IAED,IAAI,EAAE,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE;QACxB,iCAAyB;KACzB;IAED,IAAI,EAAE,CAAC,UAAU,CAAC,KAAK,CAAC,EAAE;QACzB,qCAA2B;KAC3B;IAED,IAAI,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE;QACzB,2BAAsB;KACtB;IAED,IAAI,QAAQ,CAAC,KAAK,CAAC,EAAE;QACpB,6BAAuB;KACvB;IAED,MAAM,OAAO,GAAG,aAAa,CAAC,KAAK,CAAC,CAAC;IACrC,IAAI,OAAO,EAAE;QACZ,OAAO,OAAO,CAAC;KACf;IAED,IAAI,KAAK,YAAY,MAAM,IAAI,KAAK,YAAY,OAAO,IAAI,KAAK,YAAY,MAAM,EAAE;QACnF,MAAM,IAAI,SAAS,CAAC,uDAAuD,CAAC,CAAC;KAC7E;IAED,6BAAuB;AACxB,CAAC;AAED,WAAU,EAAE;IACX,kDAAkD;IAClD,MAAM,QAAQ,GAAG,CAAC,KAAc,EAAmB,EAAE,CAAC,OAAO,KAAK,KAAK,QAAQ,CAAC;IAEhF,+BAA+B;IAClB,YAAS,GAAG,QAAQ,CAAY,WAAW,CAAC,CAAC;IAC7C,SAAM,GAAG,QAAQ,CAAS,QAAQ,CAAC,CAAC;IACpC,SAAM,GAAG,QAAQ,CAAS,QAAQ,CAAC,CAAC;IACpC,YAAS,GAAG,QAAQ,CAAW,UAAU,CAAC,CAAC;IACxD,kDAAkD;IACrC,QAAK,GAAG,CAAC,KAAc,EAAiB,EAAE,CAAC,KAAK,KAAK,IAAI,CAAC;IAC1D,SAAM,GAAG,CAAC,KAAc,EAAkB,EAAE,CAAC,GAAA,SAAS,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,QAAQ,EAAE,CAAC,UAAU,CAAC,QAAQ,CAAC,CAAC;IACvG,UAAO,GAAG,CAAC,KAAc,EAAoB,EAAE,CAAC,KAAK,KAAK,IAAI,IAAI,KAAK,KAAK,KAAK,CAAC;IAClF,SAAM,GAAG,QAAQ,CAAS,QAAQ,CAAC,CAAC;IACjD,8BAA8B;IAEjB,gBAAa,GAAG,CAAC,KAAc,EAAW,EAAE,CACxD,GAAA,MAAM,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,MAAM,GAAG,CAAC,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC;IAEtD,QAAK,GAAG,KAAK,CAAC,OAAO,CAAC;IACtB,SAAM,GAAG,QAAQ,CAAC;IAElB,kBAAe,GAAG,CAAC,KAAc,EAA6B,EAAE,CAAC,GAAA,KAAK,CAAC,KAAK,CAAC,IAAI,GAAA,SAAS,CAAC,KAAK,CAAC,CAAC;IAClG,SAAM,GAAG,CAAC,KAAc,EAAmB,EAAE,CAAC,CAAC,GAAA,eAAe,CAAC,KAAK,CAAC,IAAI,CAAC,GAAA,SAAS,CAAC,KAAK,CAAC,IAAI,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC;IAC/G,WAAQ,GAAG,CAAC,KAAc,EAAsC,EAAE,CAAC,CAAC,GAAA,eAAe,CAAC,KAAK,CAAC,IAAI,GAAA,SAAS,CAAE,KAAmC,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,CAAC;IAC/J,gBAAa,GAAG,CAAC,KAAc,EAA2C,EAAE,CAAC,CAAC,GAAA,eAAe,CAAC,KAAK,CAAC,IAAI,GAAA,SAAS,CAAE,KAAwC,CAAC,MAAM,CAAC,aAAa,CAAC,CAAC,CAAC;IACnL,YAAS,GAAG,CAAC,KAAc,EAAsB,EAAE,CAAC,GAAA,QAAQ,CAAC,KAAK,CAAC,IAAI,GAAA,SAAS,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,GAAA,SAAS,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;IAEvH,gBAAa,GAAG,CAAC,KAAc,EAA6B,EAAE,CAC1E,cAAc,yBAAoC,CAAC,KAAK,CAAC,CAAC;IAE3D,MAAM,aAAa,GAAG,CAAC,KAAc,EAA6B,EAAE,CACnE,CAAC,GAAA,KAAK,CAAC,KAAK,CAAC;QACb,QAAQ,CAAC,KAAK,CAAY;QAC1B,GAAA,SAAS,CAAE,KAA0B,CAAC,IAAI,CAAC;QAC3C,GAAA,SAAS,CAAE,KAA0B,CAAC,KAAK,CAAC,CAAC;IAEjC,UAAO,GAAG,
CAAC,KAAc,EAA6B,EAAE,CAAC,GAAA,aAAa,CAAC,KAAK,CAAC,IAAI,aAAa,CAAC,KAAK,CAAC,CAAC;IAEtG,oBAAiB,GAAG,cAAc,6CAA+C,CAAC;IAClF,gBAAa,GAAG,cAAc,qCAAkC,CAAC;IACjE,gBAAa,GAAG,CAAC,KAAc,EAAqB,EAAE,CAAC,GAAA,SAAS,CAAC,KAAK,CAAC,IAAI,CAAC,KAAK,CAAC,cAAc,CAAC,WAAW,CAAC,CAAC;IAE9G,SAAM,GAAG,cAAc,uBAAyB,CAAC;IACjD,OAAI,GAAG,cAAc,mBAAqB,CAAC;IAC3C,QAAK,GAAG,cAAc,qBAAuB,CAAC;IAC9C,MAAG,GAAG,CAAC,KAAc,EAAkC,EAAE,CAAC,cAAc,iBAAqC,CAAC,KAAK,CAAC,CAAC;IACrH,MAAG,GAAG,CAAC,KAAc,EAAyB,EAAE,CAAC,cAAc,iBAA4B,CAAC,KAAK,CAAC,CAAC;IACnG,UAAO,GAAG,CAAC,KAAc,EAAqC,EAAE,CAAC,cAAc,yBAA4C,CAAC,KAAK,CAAC,CAAC;IACnI,UAAO,GAAG,CAAC,KAAc,EAA4B,EAAE,CAAC,cAAc,yBAAmC,CAAC,KAAK,CAAC,CAAC;IAEjH,YAAS,GAAG,cAAc,6BAA+B,CAAC;IAC1D,aAAU,GAAG,cAAc,+BAAiC,CAAC;IAC7D,oBAAiB,GAAG,cAAc,6CAA+C,CAAC;IAClF,aAAU,GAAG,cAAc,+BAAiC,CAAC;IAC7D,cAAW,GAAG,cAAc,iCAAmC,CAAC;IAChE,aAAU,GAAG,cAAc,+BAAiC,CAAC;IAC7D,cAAW,GAAG,cAAc,iCAAmC,CAAC;IAChE,eAAY,GAAG,cAAc,mCAAqC,CAAC;IACnE,eAAY,GAAG,cAAc,mCAAqC,CAAC;IAEnE,cAAW,GAAG,cAAc,iCAAmC,CAAC;IAChE,oBAAiB,GAAG,cAAc,6CAA+C,CAAC;IAClF,WAAQ,GAAG,cAAc,2BAA6B,CAAC;IAEvD,mBAAgB,GAAG,CAAI,QAAiB,EAAE,KAAe,EAAiB,EAAE,CAAC,MAAM,CAAC,cAAc,CAAC,QAAQ,CAAC,KAAK,KAAK,CAAC,SAAS,CAAC;IACjI,cAAW,GAAG,CAAC,KAAc,EAAgB,EAAE,CAAC,cAAc,iBAAmB,CAAC,KAAK,CAAC,CAAC;IAEzF,YAAS,GAAG,CAAC,KAAc,EAAE,EAAE;QAC3C,IAAI,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,EAAE;YACnB,OAAO,KAAK,CAAC;SACb;QAED,IAAI;YACH,IAAI,SAAS,CAAC,KAAK,CAAC,CAAC,CAAC,2CAA2C;YACjE,OAAO,IAAI,CAAC;SACZ;QAAC,WAAM;YACP,OAAO,KAAK,CAAC;SACb;IACF,CAAC,CAAC;IAEW,SAAM,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;IAC5C,QAAK,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,CAAC,KAAK,CAAC;IAEnC,MAAG,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,MAAM,CAAC,KAAK,CAAC,KAAe,CAAC,CAAC;IAErE,MAAM,cAAc,GAAG,IAAI,GAAG,CAAC;QAC9B,WAAW;QACX,QAAQ;QACR,QAAQ;QACR,SAAS;QACT,QAAQ;KACR,CAAC,CAAC;IAEU,YAAS,GAAG,CAAC,KAAc,EAAsB,EAAE,CAAC,GAAA,KAAK,CAAC,KAAK,CAAC,IAAI,cAAc,CAAC,GAAG,CAAC,OAAO,KAAK,CAAC,CAAC;IAErG,UAAO,GAAG,CAAC,KAAc,EAAmB,EAAE,CAAC,MAAM,CAAC,SAAS,CAAC,KAAe,CAAC,CAAC;IACjF,cAAW,GAAG,CAAC,KAAc,EAAmB,EAAE,CAAC,MAAM,CAAC,aAAa,CAAC,KAAe,CAAC,CAAC;IAEzF,cAAW,GAAG,CAAC,KAAc,EAAE,EAAE;QAC7C,0EAA0E;QAC1E,IAAI,SAAS,CAAC;QAEd,OAAO,aAAa,CAAC,KAAK,CAAC,0BAAoB;YAC9C,CAAC,SAAS,GAAG,MAAM,CAAC,cAAc,CAAC,KAAK,CAAC,EAAE,SAAS,KAAK,IAAI,IAAI,yCAAyC;gBACzG,SAAS,KAAK,MAAM,CAAC,cAAc,CAAC,EAAE,CAAC,CAAC,CAAC;IAC5C,CAAC,CAAC;IAEF,MAAM,eAAe,GAAG,IAAI,GAAG,CAAC;;;;;;;;;;KAU/B,CAAC,CAAC;IACU,aAAU,GAAG,CAAC,KAAc,EAAuB,EAAE;QACjE,MAAM,UAAU,GAAG,aAAa,CAAC,KAAK,CAAC,CAAC;QAExC,IAAI,UAAU,KAAK,IAAI,EAAE;YACxB,OAAO,KAAK,CAAC;SACb;QAED,OAAO,eAAe,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;IACxC,CAAC,CAAC;IAEF,MAAM,aAAa,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,WAAW,CAAC,KAAK,CAAC,IAAI,KAAK,GAAG,CAAC,CAAC,CAAC;IAC9D,YAAS,GAAG,CAAC,KAAc,EAAsB,EAAE,CAAC,CAAC,GAAA,eAAe,CAAC,KAAK,CAAC,IAAI,CAAC,GAAA,SAAS,CAAC,KAAK,CAAC,IAAI,aAAa,CAAE,KAAmB,CAAC,MAAM,CAAC,CAAC;IAE/I,UAAO,GAAG,CAAC,KAAa,EAAE,KAAwB,EAAE,EAAE;QAClE,IAAI,GAAA,MAAM,CAAC,KAAK,CAAC,EAAE;YAClB,OAAO,KAAK,IAAI,IAAI,CAAC,GAAG,CAAC,CAAC,EAAE,KAAK,CAAC,IAAI,KAAK,IAAI,IAAI,CAAC,GAAG,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC;SAClE;QAED,IAAI,GAAA,KAAK,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,EAAE;YACvC,OAAO,KAAK,IAAI,IAAI,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,IAAI,KAAK,IAAI,IAAI,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC,CAAC;SAClE;QAED,MAAM,IAAI,SAAS,CAAC,kBAAkB,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;IAChE,CAAC,CAAC;IAEF,MAAM,iBAAiB,GAAG,CAAC,CAAC;IAC5B,MAAM,uBAAuB,GAAG;QAC/B,WAAW;QACX,eAAe;QACf,OAAO;QACP,YAAY;QACZ,WAAW;KACX,CAAC;IAEW,aAAU,GAAG,CAAC,KAAc,EAAuB,EAAE,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,IAAK,KAAoB,CAAC,QAAQ,KAAK,iBAAiB,IAAI,GAAA,MAAM,CAAE,KAAoB,C
AAC,QAAQ,CAAC;QACjL,CAAC,GAAA,WAAW,CAAC,KAAK,CAAC,IAAI,uBAAuB,CAAC,KAAK,CAAC,QAAQ,CAAC,EAAE,CAAC,QAAQ,IAAK,KAAoB,CAAC,CAAC;IAExF,aAAU,GAAG,CAAC,KAAc,EAAE,EAAE;QAC5C,IAAI,CAAC,KAAK,EAAE;YACX,OAAO,KAAK,CAAC;SACb;QAED,IAAK,KAAa,CAAC,MAAM,CAAC,UAAU,CAAC,IAAI,KAAK,KAAM,KAAa,CAAC,MAAM,CAAC,UAAU,CAAC,EAAE,EAAE;YACvF,OAAO,IAAI,CAAC;SACZ;QAED,IAAK,KAAa,CAAC,cAAc,CAAC,IAAI,KAAK,KAAM,KAAa,CAAC,cAAc,CAAC,EAAE,EAAE;YACjF,OAAO,IAAI,CAAC;SACZ;QAED,OAAO,KAAK,CAAC;IACd,CAAC,CAAC;IAEW,aAAU,GAAG,CAAC,KAAc,EAAuB,EAAE,CAAC,CAAC,GAAA,eAAe,CAAC,KAAK,CAAC,IAAI,QAAQ,CAAC,KAAK,CAAY,IAAI,GAAA,SAAS,CAAE,KAAoB,CAAC,IAAI,CAAC,IAAI,CAAC,GAAA,UAAU,CAAC,KAAK,CAAC,CAAC;IAE3K,WAAQ,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,KAAK,KAAK,QAAQ,IAAI,KAAK,KAAK,CAAC,QAAQ,CAAC;IAEtF,MAAM,cAAc,GAAG,CAAC,GAAW,EAAE,EAAE,CAAC,CAAC,KAAa,EAAE,EAAE,CAAC,GAAA,OAAO,CAAC,KAAK,CAAC,IAAI,IAAI,CAAC,GAAG,CAAC,KAAK,GAAG,CAAC,CAAC,KAAK,GAAG,CAAC;IAC5F,OAAI,GAAG,cAAc,CAAC,CAAC,CAAC,CAAC;IACzB,MAAG,GAAG,cAAc,CAAC,CAAC,CAAC,CAAC;IAErC,MAAM,kBAAkB,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,IAAI,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,KAAK,KAAK,CAAC;IAE9E,aAAU,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,KAAK,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,CAAC;IACpE,gBAAa,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,KAAK,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,MAAM,GAAG,CAAC,CAAC;IAErE,cAAW,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,CAAC;IACtE,iBAAc,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,MAAM,GAAG,CAAC,CAAC;IACvE,0BAAuB,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,WAAW,CAAC,KAAK,CAAC,IAAI,kBAAkB,CAAC,KAAK,CAAC,CAAC;IAE9F,cAAW,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,MAAM,KAAK,CAAC,CAAC;IACjH,iBAAc,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,MAAM,GAAG,CAAC,CAAC;IAElH,WAAQ,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,IAAI,KAAK,CAAC,CAAC;IAC9D,cAAW,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,IAAI,GAAG,CAAC,CAAC;IAE/D,WAAQ,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,IAAI,KAAK,CAAC,CAAC;IAC9D,cAAW,GAAG,CAAC,KAAc,EAAE,EAAE,CAAC,GAAA,GAAG,CAAC,KAAK,CAAC,IAAI,KAAK,CAAC,IAAI,GAAG,CAAC,CAAC;IAG5E,MAAM,gBAAgB,GAAG,CAAC,MAAmB,EAAE,SAAkB,EAAE,MAAiB,EAAE,EAAE;QACvF,IAAI,GAAA,SAAS,CAAC,SAAS,CAAC,KAAK,KAAK,EAAE;YACnC,MAAM,IAAI,SAAS,CAAC,sBAAsB,IAAI,CAAC,SAAS,CAAC,SAAS,CAAC,EAAE,CAAC,CAAC;SACvE;QAED,IAAI,MAAM,CAAC,MAAM,KAAK,CAAC,EAAE;YACxB,MAAM,IAAI,SAAS,CAAC,0BAA0B,CAAC,CAAC;SAChD;QAED,OAAO,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,SAAgB,CAAC,CAAC;IAC9C,CAAC,CAAC;IAEF,+BAA+B;IAClB,MAAG,GAAG,CAAC,SAAkB,EAAE,GAAG,MAAiB,EAAE,EAAE,CAAC,gBAAgB,CAAC,KAAK,CAAC,SAAS,CAAC,IAAI,EAAE,SAAS,EAAE,MAAM,CAAC,CAAC;IAC9G,MAAG,GAAG,CAAC,SAAkB,EAAE,GAAG,MAAiB,EAAE,EAAE,CAAC,gBAAgB,CAAC,KAAK,CAAC,SAAS,CAAC,KAAK,EAAE,SAAS,EAAE,MAAM,CAAC,CAAC;IAC5H,8BAA8B;AAC/B,CAAC,EAvNS,EAAE,KAAF,EAAE,QAuNX;AAED,4EAA4E;AAC5E,0DAA0D;AAC1D,MAAM,CAAC,gBAAgB,CAAC,EAAE,EAAE;IAC3B,KAAK,EAAE;QACN,KAAK,EAAE,EAAE,CAAC,MAAM;KAChB;IACD,QAAQ,EAAE;QACT,KAAK,EAAE,EAAE,CAAC,SAAS;KACnB;IACD,IAAI,EAAE;QACL,KAAK,EAAE,EAAE,CAAC,KAAK;KACf;CACD,CAAC,CAAC;AAEH,kBAAe,EAAE,CAAC;AAElB,sCAAsC;AACtC,MAAM,CAAC,OAAO,GAAG,EAAE,CAAC;AACpB,MAAM,CAAC,OAAO,CAAC,OAAO,GAAG,EAAE,CAAC"} \ No newline at end of file diff --git a/node_modules/@sindresorhus/is/license b/node_modules/@sindresorhus/is/license new file mode 
100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/@sindresorhus/is/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/@sindresorhus/is/package.json b/node_modules/@sindresorhus/is/package.json new file mode 100644 index 000000000..2ce190092 --- /dev/null +++ b/node_modules/@sindresorhus/is/package.json @@ -0,0 +1,96 @@ +{ + "_from": "@sindresorhus/is@^0.14.0", + "_id": "@sindresorhus/is@0.14.0", + "_inBundle": false, + "_integrity": "sha512-9NET910DNaIPngYnLLPeg+Ogzqsi9uM4mSboU5y6p8S5DzMTVEsJZrawi+BoDNUVBa2DhJqQYUFvMDfgU062LQ==", + "_location": "/@sindresorhus/is", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@sindresorhus/is@^0.14.0", + "name": "@sindresorhus/is", + "escapedName": "@sindresorhus%2fis", + "scope": "@sindresorhus", + "rawSpec": "^0.14.0", + "saveSpec": null, + "fetchSpec": "^0.14.0" + }, + "_requiredBy": [ + "/got" + ], + "_resolved": "https://registry.npmjs.org/@sindresorhus/is/-/is-0.14.0.tgz", + "_shasum": "9fb3a3cf3132328151f353de4632e01e52102bea", + "_spec": "@sindresorhus/is@^0.14.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Type check values: `is.string('🦄') //=> true`", + "devDependencies": { + "@sindresorhus/tsconfig": "^0.1.0", + "@types/jsdom": "^11.12.0", + "@types/node": "^10.12.10", + "@types/tempy": "^0.2.0", + "@types/zen-observable": "^0.8.0", + "ava": "^0.25.0", + "del-cli": "^1.1.0", + "jsdom": "^11.6.2", + "rxjs": "^6.3.3", + "tempy": "^0.2.1", + "tslint": "^5.9.1", + "tslint-xo": "^0.10.0", + "typescript": "^3.2.1", + "zen-observable": "^0.8.8" + }, + "engines": { + "node": ">=6" + }, + "files": [ + "dist" + ], + "homepage": "https://github.com/sindresorhus/is#readme", + "keywords": [ + "type", + "types", + "is", + "check", + "checking", + "validate", + "validation", + "utility", + "util", + "typeof", + "instanceof", + "object", + "assert", + "assertion", + "test", + "kind", + "primitive", + "verify", + "compare" + ], + "license": "MIT", + "main": "dist/index.js", + "name": "@sindresorhus/is", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is.git" + }, + "scripts": { + 
"build": "del dist && tsc", + "lint": "tslint --format stylish --project .", + "prepublish": "npm run build && del dist/tests", + "test": "npm run lint && npm run build && ava dist/tests" + }, + "types": "dist/index.d.ts", + "version": "0.14.0" +} diff --git a/node_modules/@sindresorhus/is/readme.md b/node_modules/@sindresorhus/is/readme.md new file mode 100644 index 000000000..97c023b57 --- /dev/null +++ b/node_modules/@sindresorhus/is/readme.md @@ -0,0 +1,451 @@ +# is [![Build Status](https://travis-ci.org/sindresorhus/is.svg?branch=master)](https://travis-ci.org/sindresorhus/is) + +> Type check values: `is.string('🦄') //=> true` + + + + +## Install + +``` +$ npm install @sindresorhus/is +``` + + +## Usage + +```js +const is = require('@sindresorhus/is'); + +is('🦄'); +//=> 'string' + +is(new Map()); +//=> 'Map' + +is.number(6); +//=> true +``` + +When using `is` together with TypeScript, [type guards](http://www.typescriptlang.org/docs/handbook/advanced-types.html#type-guards-and-differentiating-types) are being used to infer the correct type inside if-else statements. + +```ts +import is from '@sindresorhus/is'; + +const padLeft = (value: string, padding: string | number) => { + if (is.number(padding)) { + // `padding` is typed as `number` + return Array(padding + 1).join(' ') + value; + } + + if (is.string(padding)) { + // `padding` is typed as `string` + return padding + value; + } + + throw new TypeError(`Expected 'padding' to be of type 'string' or 'number', got '${is(padding)}'.`); +} + +padLeft('🦄', 3); +//=> ' 🦄' + +padLeft('🦄', '🌈'); +//=> '🌈🦄' +``` + + +## API + +### is(value) + +Returns the type of `value`. + +Primitives are lowercase and object types are camelcase. + +Example: + +- `'undefined'` +- `'null'` +- `'string'` +- `'symbol'` +- `'Array'` +- `'Function'` +- `'Object'` + +Note: It will throw an error if you try to feed it object-wrapped primitives, as that's a bad practice. For example `new String('foo')`. + +### is.{method} + +All the below methods accept a value and returns a boolean for whether the value is of the desired type. + +#### Primitives + +##### .undefined(value) +##### .null(value) +##### .string(value) +##### .number(value) +##### .boolean(value) +##### .symbol(value) + +#### Built-in types + +##### .array(value) +##### .function(value) +##### .buffer(value) +##### .object(value) + +Keep in mind that [functions are objects too](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions). + +##### .numericString(value) + +Returns `true` for a string that represents a number. For example, `'42'` and `'-8'`. + +Note: `'NaN'` returns `false`, but `'Infinity'` and `'-Infinity'` return `true`. + +##### .regExp(value) +##### .date(value) +##### .error(value) +##### .nativePromise(value) +##### .promise(value) + +Returns `true` for any object with a `.then()` and `.catch()` method. Prefer this one over `.nativePromise()` as you usually want to allow userland promise implementations too. + +##### .generator(value) + +Returns `true` for any object that implements its own `.next()` and `.throw()` methods and has a function definition for `Symbol.iterator`. + +##### .generatorFunction(value) + +##### .asyncFunction(value) + +Returns `true` for any `async` function that can be called with the `await` operator. + +```js +is.asyncFunction(async () => {}); +// => true + +is.asyncFunction(() => {}); +// => false +``` + +##### .boundFunction(value) + +Returns `true` for any `bound` function. 
+ +```js +is.boundFunction(() => {}); +// => true + +is.boundFunction(function () {}.bind(null)); +// => true + +is.boundFunction(function () {}); +// => false +``` + +##### .map(value) +##### .set(value) +##### .weakMap(value) +##### .weakSet(value) + +#### Typed arrays + +##### .int8Array(value) +##### .uint8Array(value) +##### .uint8ClampedArray(value) +##### .int16Array(value) +##### .uint16Array(value) +##### .int32Array(value) +##### .uint32Array(value) +##### .float32Array(value) +##### .float64Array(value) + +#### Structured data + +##### .arrayBuffer(value) +##### .sharedArrayBuffer(value) +##### .dataView(value) + +#### Emptiness + +##### .emptyString(value) + +Returns `true` if the value is a `string` and the `.length` is 0. + +##### .nonEmptyString(value) + +Returns `true` if the value is a `string` and the `.length` is more than 0. + +##### .emptyStringOrWhitespace(value) + +Returns `true` if `is.emptyString(value)` or if it's a `string` that is all whitespace. + +##### .emptyArray(value) + +Returns `true` if the value is an `Array` and the `.length` is 0. + +##### .nonEmptyArray(value) + +Returns `true` if the value is an `Array` and the `.length` is more than 0. + +##### .emptyObject(value) + +Returns `true` if the value is an `Object` and `Object.keys(value).length` is 0. + +Please note that `Object.keys` returns only own enumerable properties. Hence something like this can happen: + +```js +const object1 = {}; + +Object.defineProperty(object1, 'property1', { + value: 42, + writable: true, + enumerable: false, + configurable: true +}); + +is.emptyObject(object1); +// => true +``` + +##### .nonEmptyObject(value) + +Returns `true` if the value is an `Object` and `Object.keys(value).length` is more than 0. + +##### .emptySet(value) + +Returns `true` if the value is a `Set` and the `.size` is 0. + +##### .nonEmptySet(Value) + +Returns `true` if the value is a `Set` and the `.size` is more than 0. + +##### .emptyMap(value) + +Returns `true` if the value is a `Map` and the `.size` is 0. + +##### .nonEmptyMap(value) + +Returns `true` if the value is a `Map` and the `.size` is more than 0. + +#### Miscellaneous + +##### .directInstanceOf(value, class) + +Returns `true` if `value` is a direct instance of `class`. + +```js +is.directInstanceOf(new Error(), Error); +//=> true + +class UnicornError extends Error {} + +is.directInstanceOf(new UnicornError(), Error); +//=> false +``` + +##### .urlInstance(value) + +Returns `true` if `value` is an instance of the [`URL` class](https://developer.mozilla.org/en-US/docs/Web/API/URL). + +```js +const url = new URL('https://example.com'); + +is.urlInstance(url); +//=> true +``` + +### .url(value) + +Returns `true` if `value` is a URL string. + +Note: this only does basic checking using the [`URL` class](https://developer.mozilla.org/en-US/docs/Web/API/URL) constructor. + +```js +const url = 'https://example.com'; + +is.url(url); +//=> true + +is.url(new URL(url)); +//=> false +``` + +##### .truthy(value) + +Returns `true` for all values that evaluate to true in a boolean context: + +```js +is.truthy('🦄'); +//=> true + +is.truthy(undefined); +//=> false +``` + +##### .falsy(value) + +Returns `true` if `value` is one of: `false`, `0`, `''`, `null`, `undefined`, `NaN`. + +##### .nan(value) +##### .nullOrUndefined(value) +##### .primitive(value) + +JavaScript primitives are as follows: `null`, `undefined`, `string`, `number`, `boolean`, `symbol`. 
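The primitive-related checks above ship without an inline example in this readme; the following is a minimal illustrative sketch (not part of the upstream documentation), based only on the definitions listed:

```js
const is = require('@sindresorhus/is');

// `null` and `symbol` are treated as primitives, per the list above
is.primitive(null);
//=> true

is.primitive(Symbol('🦄'));
//=> true

// Object wrappers are objects, not primitives
is.primitive(new Number(6));
//=> false

is.nullOrUndefined(undefined);
//=> true

// `.nan()` uses Number.isNaN(), so only an actual NaN value matches
is.nan(Number('unicorn'));
//=> true
```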
+ +##### .integer(value) + +##### .safeInteger(value) + +Returns `true` if `value` is a [safe integer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/isSafeInteger). + +##### .plainObject(value) + +An object is plain if it's created by either `{}`, `new Object()`, or `Object.create(null)`. + +##### .iterable(value) +##### .asyncIterable(value) +##### .class(value) + +Returns `true` for instances created by a class. + +##### .typedArray(value) + +##### .arrayLike(value) + +A `value` is array-like if it is not a function and has a `value.length` that is a safe integer greater than or equal to 0. + +```js +is.arrayLike(document.forms); +//=> true + +function foo() { + is.arrayLike(arguments); + //=> true +} +foo(); +``` + +##### .inRange(value, range) + +Check if `value` (number) is in the given `range`. The range is an array of two values, lower bound and upper bound, in no specific order. + +```js +is.inRange(3, [0, 5]); +is.inRange(3, [5, 0]); +is.inRange(0, [-2, 2]); +``` + +##### .inRange(value, upperBound) + +Check if `value` (number) is in the range of `0` to `upperBound`. + +```js +is.inRange(3, 10); +``` + +##### .domElement(value) + +Returns `true` if `value` is a DOM Element. + +##### .nodeStream(value) + +Returns `true` if `value` is a Node.js [stream](https://nodejs.org/api/stream.html). + +```js +const fs = require('fs'); + +is.nodeStream(fs.createReadStream('unicorn.png')); +//=> true +``` + +##### .observable(value) + +Returns `true` if `value` is an `Observable`. + +```js +const {Observable} = require('rxjs'); + +is.observable(new Observable()); +//=> true +``` + +##### .infinite(value) + +Check if `value` is `Infinity` or `-Infinity`. + +##### .even(value) + +Returns `true` if `value` is an even integer. + +##### .odd(value) + +Returns `true` if `value` is an odd integer. + +##### .any(predicate, ...values) + +Returns `true` if **any** of the input `values` returns true in the `predicate`: + +```js +is.any(is.string, {}, true, '🦄'); +//=> true + +is.any(is.boolean, 'unicorns', [], new Map()); +//=> false +``` + +##### .all(predicate, ...values) + +Returns `true` if **all** of the input `values` returns true in the `predicate`: + +```js +is.all(is.object, {}, new Map(), new Set()); +//=> true + +is.all(is.string, '🦄', [], 'unicorns'); +//=> false +``` + + +## FAQ + +### Why yet another type checking module? + +There are hundreds of type checking modules on npm, unfortunately, I couldn't find any that fit my needs: + +- Includes both type methods and ability to get the type +- Types of primitives returned as lowercase and object types as camelcase +- Covers all built-ins +- Unsurprising behavior +- Well-maintained +- Comprehensive test suite + +For the ones I found, pick 3 of these. + +The most common mistakes I noticed in these modules was using `instanceof` for type checking, forgetting that functions are objects, and omitting `symbol` as a primitive. 
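As a small, hedged illustration of the pitfalls mentioned above (not taken from the upstream readme), the sketch below shows the behaviors those common mistakes get wrong; `UnicornError` is just a throwaway example class:

```js
const is = require('@sindresorhus/is');

// Functions are objects too
is.object(() => {});
//=> true

// `symbol` is a primitive and should not be omitted
is.primitive(Symbol('unicorn'));
//=> true

// `instanceof` also matches superclasses, which is often not what a
// type check intends; `is.directInstanceOf()` is explicit about it
class UnicornError extends Error {}

new UnicornError() instanceof Error;
//=> true

is.directInstanceOf(new UnicornError(), Error);
//=> false
```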
+ + +## Related + +- [ow](https://github.com/sindresorhus/ow) - Function argument validation for humans +- [is-stream](https://github.com/sindresorhus/is-stream) - Check if something is a Node.js stream +- [is-observable](https://github.com/sindresorhus/is-observable) - Check if a value is an Observable +- [file-type](https://github.com/sindresorhus/file-type) - Detect the file type of a Buffer/Uint8Array +- [is-ip](https://github.com/sindresorhus/is-ip) - Check if a string is an IP address +- [is-array-sorted](https://github.com/sindresorhus/is-array-sorted) - Check if an Array is sorted +- [is-error-constructor](https://github.com/sindresorhus/is-error-constructor) - Check if a value is an error constructor +- [is-empty-iterable](https://github.com/sindresorhus/is-empty-iterable) - Check if an Iterable is empty +- [is-blob](https://github.com/sindresorhus/is-blob) - Check if a value is a Blob - File-like object of immutable, raw data +- [has-emoji](https://github.com/sindresorhus/has-emoji) - Check whether a string has any emoji + + +## Created by + +- [Sindre Sorhus](https://github.com/sindresorhus) +- [Giora Guttsait](https://github.com/gioragutt) +- [Brandon Smith](https://github.com/brandon93s) + + +## License + +MIT diff --git a/node_modules/@szmarczak/http-timer/LICENSE b/node_modules/@szmarczak/http-timer/LICENSE new file mode 100644 index 000000000..15ad2e8d5 --- /dev/null +++ b/node_modules/@szmarczak/http-timer/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2018 Szymon Marczak + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/@szmarczak/http-timer/README.md b/node_modules/@szmarczak/http-timer/README.md new file mode 100644 index 000000000..13279ed8c --- /dev/null +++ b/node_modules/@szmarczak/http-timer/README.md @@ -0,0 +1,70 @@ +# http-timer +> Timings for HTTP requests + +[![Build Status](https://travis-ci.org/szmarczak/http-timer.svg?branch=master)](https://travis-ci.org/szmarczak/http-timer) +[![Coverage Status](https://coveralls.io/repos/github/szmarczak/http-timer/badge.svg?branch=master)](https://coveralls.io/github/szmarczak/http-timer?branch=master) +[![install size](https://packagephobia.now.sh/badge?p=@szmarczak/http-timer)](https://packagephobia.now.sh/result?p=@szmarczak/http-timer) + +Inspired by the [`request` package](https://github.com/request/request). 
+ +## Usage +```js +'use strict'; +const https = require('https'); +const timer = require('@szmarczak/http-timer'); + +const request = https.get('https://httpbin.org/anything'); +const timings = timer(request); + +request.on('response', response => { + response.on('data', () => {}); // Consume the data somehow + response.on('end', () => { + console.log(timings); + }); +}); + +// { start: 1535708511443, +// socket: 1535708511444, +// lookup: 1535708511444, +// connect: 1535708511582, +// upload: 1535708511887, +// response: 1535708512037, +// end: 1535708512040, +// phases: +// { wait: 1, +// dns: 0, +// tcp: 138, +// request: 305, +// firstByte: 150, +// download: 3, +// total: 597 } } +``` + +## API + +### timer(request) + +Returns: `Object` + +- `start` - Time when the request started. +- `socket` - Time when a socket was assigned to the request. +- `lookup` - Time when the DNS lookup finished. +- `connect` - Time when the socket successfully connected. +- `upload` - Time when the request finished uploading. +- `response` - Time when the request fired the `response` event. +- `end` - Time when the response fired the `end` event. +- `error` - Time when the request fired the `error` event. +- `phases` + - `wait` - `timings.socket - timings.start` + - `dns` - `timings.lookup - timings.socket` + - `tcp` - `timings.connect - timings.lookup` + - `request` - `timings.upload - timings.connect` + - `firstByte` - `timings.response - timings.upload` + - `download` - `timings.end - timings.response` + - `total` - `timings.end - timings.start` or `timings.error - timings.start` + +**Note**: The time is a `number` representing the milliseconds elapsed since the UNIX epoch. + +## License + +MIT diff --git a/node_modules/@szmarczak/http-timer/package.json b/node_modules/@szmarczak/http-timer/package.json new file mode 100644 index 000000000..8b7bbeaef --- /dev/null +++ b/node_modules/@szmarczak/http-timer/package.json @@ -0,0 +1,75 @@ +{ + "_from": "@szmarczak/http-timer@^1.1.2", + "_id": "@szmarczak/http-timer@1.1.2", + "_inBundle": false, + "_integrity": "sha512-XIB2XbzHTN6ieIjfIMV9hlVcfPU26s2vafYWQcZHWXHOxiaRZYEDKEwdl129Zyg50+foYV2jCgtrqSA6qNuNSA==", + "_location": "/@szmarczak/http-timer", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@szmarczak/http-timer@^1.1.2", + "name": "@szmarczak/http-timer", + "escapedName": "@szmarczak%2fhttp-timer", + "scope": "@szmarczak", + "rawSpec": "^1.1.2", + "saveSpec": null, + "fetchSpec": "^1.1.2" + }, + "_requiredBy": [ + "/got" + ], + "_resolved": "https://registry.npmjs.org/@szmarczak/http-timer/-/http-timer-1.1.2.tgz", + "_shasum": "b1665e2c461a2cd92f4c1bbf50d5454de0d4b421", + "_spec": "@szmarczak/http-timer@^1.1.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Szymon Marczak" + }, + "bugs": { + "url": "https://github.com/szmarczak/http-timer/issues" + }, + "bundleDependencies": false, + "dependencies": { + "defer-to-connect": "^1.0.1" + }, + "deprecated": false, + "description": "Timings for HTTP requests", + "devDependencies": { + "ava": "^0.25.0", + "coveralls": "^3.0.2", + "nyc": "^12.0.2", + "p-event": "^2.1.0", + "xo": "^0.22.0" + }, + "engines": { + "node": ">=6" + }, + "files": [ + "source" + ], + "homepage": "https://github.com/szmarczak/http-timer#readme", + "keywords": [ + "http", + "https", + "timer", + "timings" + ], + "license": "MIT", + "main": "source", + "name": "@szmarczak/http-timer", + "repository": { + "type": 
"git", + "url": "git+https://github.com/szmarczak/http-timer.git" + }, + "scripts": { + "coveralls": "nyc report --reporter=text-lcov | coveralls", + "test": "xo && nyc ava" + }, + "version": "1.1.2", + "xo": { + "rules": { + "unicorn/filename-case": "camelCase" + } + } +} diff --git a/node_modules/@szmarczak/http-timer/source/index.js b/node_modules/@szmarczak/http-timer/source/index.js new file mode 100644 index 000000000..e29458040 --- /dev/null +++ b/node_modules/@szmarczak/http-timer/source/index.js @@ -0,0 +1,99 @@ +'use strict'; +const deferToConnect = require('defer-to-connect'); + +module.exports = request => { + const timings = { + start: Date.now(), + socket: null, + lookup: null, + connect: null, + upload: null, + response: null, + end: null, + error: null, + phases: { + wait: null, + dns: null, + tcp: null, + request: null, + firstByte: null, + download: null, + total: null + } + }; + + const handleError = origin => { + const emit = origin.emit.bind(origin); + origin.emit = (event, ...args) => { + // Catches the `error` event + if (event === 'error') { + timings.error = Date.now(); + timings.phases.total = timings.error - timings.start; + + origin.emit = emit; + } + + // Saves the original behavior + return emit(event, ...args); + }; + }; + + let uploadFinished = false; + const onUpload = () => { + timings.upload = Date.now(); + timings.phases.request = timings.upload - timings.connect; + }; + + handleError(request); + + request.once('socket', socket => { + timings.socket = Date.now(); + timings.phases.wait = timings.socket - timings.start; + + const lookupListener = () => { + timings.lookup = Date.now(); + timings.phases.dns = timings.lookup - timings.socket; + }; + + socket.once('lookup', lookupListener); + + deferToConnect(socket, () => { + timings.connect = Date.now(); + + if (timings.lookup === null) { + socket.removeListener('lookup', lookupListener); + timings.lookup = timings.connect; + timings.phases.dns = timings.lookup - timings.socket; + } + + timings.phases.tcp = timings.connect - timings.lookup; + + if (uploadFinished && !timings.upload) { + onUpload(); + } + }); + }); + + request.once('finish', () => { + uploadFinished = true; + + if (timings.connect) { + onUpload(); + } + }); + + request.once('response', response => { + timings.response = Date.now(); + timings.phases.firstByte = timings.response - timings.upload; + + handleError(response); + + response.once('end', () => { + timings.end = Date.now(); + timings.phases.download = timings.end - timings.response; + timings.phases.total = timings.end - timings.start; + }); + }); + + return timings; +}; diff --git a/node_modules/@types/node/LICENSE b/node_modules/@types/node/LICENSE new file mode 100644 index 000000000..9e841e7a2 --- /dev/null +++ b/node_modules/@types/node/LICENSE @@ -0,0 +1,21 @@ + MIT License + + Copyright (c) Microsoft Corporation. + + Permission is hereby granted, free of charge, to any person obtaining a copy + of this software and associated documentation files (the "Software"), to deal + in the Software without restriction, including without limitation the rights + to use, copy, modify, merge, publish, distribute, sublicense, and/or sell + copies of the Software, and to permit persons to whom the Software is + furnished to do so, subject to the following conditions: + + The above copyright notice and this permission notice shall be included in all + copies or substantial portions of the Software. 
+ + THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, + OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE + SOFTWARE diff --git a/node_modules/@types/node/README.md b/node_modules/@types/node/README.md new file mode 100644 index 000000000..724c87163 --- /dev/null +++ b/node_modules/@types/node/README.md @@ -0,0 +1,16 @@ +# Installation +> `npm install --save @types/node` + +# Summary +This package contains type definitions for Node.js (https://nodejs.org/). + +# Details +Files were exported from https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/node. + +### Additional Details + * Last updated: Tue, 05 Oct 2021 20:31:28 GMT + * Dependencies: none + * Global values: `AbortController`, `AbortSignal`, `__dirname`, `__filename`, `console`, `exports`, `gc`, `global`, `module`, `process`, `require` + +# Credits +These definitions were written by [Microsoft TypeScript](https://github.com/Microsoft), [DefinitelyTyped](https://github.com/DefinitelyTyped), [Alberto Schiabel](https://github.com/jkomyno), [Alvis HT Tang](https://github.com/alvis), [Andrew Makarov](https://github.com/r3nya), [Benjamin Toueg](https://github.com/btoueg), [Chigozirim C.](https://github.com/smac89), [David Junger](https://github.com/touffy), [Deividas Bakanas](https://github.com/DeividasBakanas), [Eugene Y. Q. Shen](https://github.com/eyqs), [Hannes Magnusson](https://github.com/Hannes-Magnusson-CK), [Huw](https://github.com/hoo29), [Kelvin Jin](https://github.com/kjin), [Klaus Meinhardt](https://github.com/ajafff), [Lishude](https://github.com/islishude), [Mariusz Wiktorczyk](https://github.com/mwiktorczyk), [Mohsen Azimi](https://github.com/mohsen1), [Nicolas Even](https://github.com/n-e), [Nikita Galkin](https://github.com/galkin), [Parambir Singh](https://github.com/parambirs), [Sebastian Silbermann](https://github.com/eps1lon), [Simon Schick](https://github.com/SimonSchick), [Thomas den Hollander](https://github.com/ThomasdenH), [Wilco Bakker](https://github.com/WilcoBakker), [wwwy3y3](https://github.com/wwwy3y3), [Samuel Ainsworth](https://github.com/samuela), [Kyle Uehlein](https://github.com/kuehlein), [Thanik Bhongbhibhat](https://github.com/bhongy), [Marcin Kopacz](https://github.com/chyzwar), [Trivikram Kamat](https://github.com/trivikr), [Minh Son Nguyen](https://github.com/nguymin4), [Junxiao Shi](https://github.com/yoursunny), [Ilia Baryshnikov](https://github.com/qwelias), [ExE Boss](https://github.com/ExE-Boss), [Surasak Chaisurin](https://github.com/Ryan-Willpower), [Piotr Błażejewicz](https://github.com/peterblazejewicz), [Anna Henningsen](https://github.com/addaleax), [Victor Perin](https://github.com/victorperin), [Yongsheng Zhang](https://github.com/ZYSzys), [NodeJS Contributors](https://github.com/NodeJS), [Linus Unnebäck](https://github.com/LinusU), and [wafuwafu13](https://github.com/wafuwafu13). diff --git a/node_modules/@types/node/assert.d.ts b/node_modules/@types/node/assert.d.ts new file mode 100644 index 000000000..9f916c16b --- /dev/null +++ b/node_modules/@types/node/assert.d.ts @@ -0,0 +1,912 @@ +/** + * The `assert` module provides a set of assertion functions for verifying + * invariants. 
+ * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/assert.js) + */ +declare module 'assert' { + /** + * An alias of {@link ok}. + * @since v0.5.9 + * @param value The input that is checked for being truthy. + */ + function assert(value: unknown, message?: string | Error): asserts value; + namespace assert { + /** + * Indicates the failure of an assertion. All errors thrown by the `assert` module + * will be instances of the `AssertionError` class. + */ + class AssertionError extends Error { + actual: unknown; + expected: unknown; + operator: string; + generatedMessage: boolean; + code: 'ERR_ASSERTION'; + constructor(options?: { + /** If provided, the error message is set to this value. */ + message?: string | undefined; + /** The `actual` property on the error instance. */ + actual?: unknown | undefined; + /** The `expected` property on the error instance. */ + expected?: unknown | undefined; + /** The `operator` property on the error instance. */ + operator?: string | undefined; + /** If provided, the generated stack trace omits frames before this function. */ + // tslint:disable-next-line:ban-types + stackStartFn?: Function | undefined; + }); + } + /** + * This feature is currently experimental and behavior might still change. + * @since v14.2.0, v12.19.0 + * @experimental + */ + class CallTracker { + /** + * The wrapper function is expected to be called exactly `exact` times. If the + * function has not been called exactly `exact` times when `tracker.verify()` is called, then `tracker.verify()` will throw an + * error. + * + * ```js + * import assert from 'assert'; + * + * // Creates call tracker. + * const tracker = new assert.CallTracker(); + * + * function func() {} + * + * // Returns a function that wraps func() that must be called exact times + * // before tracker.verify(). + * const callsfunc = tracker.calls(func); + * ``` + * @since v14.2.0, v12.19.0 + * @param [fn='A no-op function'] + * @param [exact=1] + * @return that wraps `fn`. + */ + calls(exact?: number): () => void; + calls any>(fn?: Func, exact?: number): Func; + /** + * The arrays contains information about the expected and actual number of calls of + * the functions that have not been called the expected number of times. + * + * ```js + * import assert from 'assert'; + * + * // Creates call tracker. + * const tracker = new assert.CallTracker(); + * + * function func() {} + * + * function foo() {} + * + * // Returns a function that wraps func() that must be called exact times + * // before tracker.verify(). + * const callsfunc = tracker.calls(func, 2); + * + * // Returns an array containing information on callsfunc() + * tracker.report(); + * // [ + * // { + * // message: 'Expected the func function to be executed 2 time(s) but was + * // executed 0 time(s).', + * // actual: 0, + * // expected: 2, + * // operator: 'func', + * // stack: stack trace + * // } + * // ] + * ``` + * @since v14.2.0, v12.19.0 + * @return of objects containing information about the wrapper functions returned by `calls`. + */ + report(): CallTrackerReportInformation[]; + /** + * Iterates through the list of functions passed to `tracker.calls()` and will throw an error for functions that + * have not been called the expected number of times. + * + * ```js + * import assert from 'assert'; + * + * // Creates call tracker. + * const tracker = new assert.CallTracker(); + * + * function func() {} + * + * // Returns a function that wraps func() that must be called exact times + * // before tracker.verify(). 
+ * const callsfunc = tracker.calls(func, 2); + * + * callsfunc(); + * + * // Will throw an error since callsfunc() was only called once. + * tracker.verify(); + * ``` + * @since v14.2.0, v12.19.0 + */ + verify(): void; + } + interface CallTrackerReportInformation { + message: string; + /** The actual number of times the function was called. */ + actual: number; + /** The number of times the function was expected to be called. */ + expected: number; + /** The name of the function that is wrapped. */ + operator: string; + /** A stack trace of the function. */ + stack: object; + } + type AssertPredicate = RegExp | (new () => object) | ((thrown: unknown) => boolean) | object | Error; + /** + * Throws an `AssertionError` with the provided error message or a default + * error message. If the `message` parameter is an instance of an `Error` then + * it will be thrown instead of the `AssertionError`. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.fail(); + * // AssertionError [ERR_ASSERTION]: Failed + * + * assert.fail('boom'); + * // AssertionError [ERR_ASSERTION]: boom + * + * assert.fail(new TypeError('need array')); + * // TypeError: need array + * ``` + * + * Using `assert.fail()` with more than two arguments is possible but deprecated. + * See below for further details. + * @since v0.1.21 + * @param [message='Failed'] + */ + function fail(message?: string | Error): never; + /** @deprecated since v10.0.0 - use fail([message]) or other assert functions instead. */ + function fail( + actual: unknown, + expected: unknown, + message?: string | Error, + operator?: string, + // tslint:disable-next-line:ban-types + stackStartFn?: Function + ): never; + /** + * Tests if `value` is truthy. It is equivalent to`assert.equal(!!value, true, message)`. + * + * If `value` is not truthy, an `AssertionError` is thrown with a `message`property set equal to the value of the `message` parameter. If the `message`parameter is `undefined`, a default + * error message is assigned. If the `message`parameter is an instance of an `Error` then it will be thrown instead of the`AssertionError`. + * If no arguments are passed in at all `message` will be set to the string:`` 'No value argument passed to `assert.ok()`' ``. + * + * Be aware that in the `repl` the error message will be different to the one + * thrown in a file! See below for further details. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.ok(true); + * // OK + * assert.ok(1); + * // OK + * + * assert.ok(); + * // AssertionError: No value argument passed to `assert.ok()` + * + * assert.ok(false, 'it\'s false'); + * // AssertionError: it's false + * + * // In the repl: + * assert.ok(typeof 123 === 'string'); + * // AssertionError: false == true + * + * // In a file (e.g. 
test.js): + * assert.ok(typeof 123 === 'string'); + * // AssertionError: The expression evaluated to a falsy value: + * // + * // assert.ok(typeof 123 === 'string') + * + * assert.ok(false); + * // AssertionError: The expression evaluated to a falsy value: + * // + * // assert.ok(false) + * + * assert.ok(0); + * // AssertionError: The expression evaluated to a falsy value: + * // + * // assert.ok(0) + * ``` + * + * ```js + * import assert from 'assert/strict'; + * + * // Using `assert()` works the same: + * assert(0); + * // AssertionError: The expression evaluated to a falsy value: + * // + * // assert(0) + * ``` + * @since v0.1.21 + */ + function ok(value: unknown, message?: string | Error): asserts value; + /** + * **Strict assertion mode** + * + * An alias of {@link strictEqual}. + * + * **Legacy assertion mode** + * + * > Stability: 3 - Legacy: Use {@link strictEqual} instead. + * + * Tests shallow, coercive equality between the `actual` and `expected` parameters + * using the [Abstract Equality Comparison](https://tc39.github.io/ecma262/#sec-abstract-equality-comparison) ( `==` ). `NaN` is special handled + * and treated as being identical in case both sides are `NaN`. + * + * ```js + * import assert from 'assert'; + * + * assert.equal(1, 1); + * // OK, 1 == 1 + * assert.equal(1, '1'); + * // OK, 1 == '1' + * assert.equal(NaN, NaN); + * // OK + * + * assert.equal(1, 2); + * // AssertionError: 1 == 2 + * assert.equal({ a: { b: 1 } }, { a: { b: 1 } }); + * // AssertionError: { a: { b: 1 } } == { a: { b: 1 } } + * ``` + * + * If the values are not equal, an `AssertionError` is thrown with a `message`property set equal to the value of the `message` parameter. If the `message`parameter is undefined, a default + * error message is assigned. If the `message`parameter is an instance of an `Error` then it will be thrown instead of the`AssertionError`. + * @since v0.1.21 + */ + function equal(actual: unknown, expected: unknown, message?: string | Error): void; + /** + * **Strict assertion mode** + * + * An alias of {@link notStrictEqual}. + * + * **Legacy assertion mode** + * + * > Stability: 3 - Legacy: Use {@link notStrictEqual} instead. + * + * Tests shallow, coercive inequality with the [Abstract Equality Comparison](https://tc39.github.io/ecma262/#sec-abstract-equality-comparison)(`!=` ). `NaN` is special handled and treated as + * being identical in case both + * sides are `NaN`. + * + * ```js + * import assert from 'assert'; + * + * assert.notEqual(1, 2); + * // OK + * + * assert.notEqual(1, 1); + * // AssertionError: 1 != 1 + * + * assert.notEqual(1, '1'); + * // AssertionError: 1 != '1' + * ``` + * + * If the values are equal, an `AssertionError` is thrown with a `message`property set equal to the value of the `message` parameter. If the `message`parameter is undefined, a default error + * message is assigned. If the `message`parameter is an instance of an `Error` then it will be thrown instead of the`AssertionError`. + * @since v0.1.21 + */ + function notEqual(actual: unknown, expected: unknown, message?: string | Error): void; + /** + * **Strict assertion mode** + * + * An alias of {@link deepStrictEqual}. + * + * **Legacy assertion mode** + * + * > Stability: 3 - Legacy: Use {@link deepStrictEqual} instead. + * + * Tests for deep equality between the `actual` and `expected` parameters. Consider + * using {@link deepStrictEqual} instead. {@link deepEqual} can have + * surprising results. 
+ * + * _Deep equality_ means that the enumerable "own" properties of child objects + * are also recursively evaluated by the following rules. + * @since v0.1.21 + */ + function deepEqual(actual: unknown, expected: unknown, message?: string | Error): void; + /** + * **Strict assertion mode** + * + * An alias of {@link notDeepStrictEqual}. + * + * **Legacy assertion mode** + * + * > Stability: 3 - Legacy: Use {@link notDeepStrictEqual} instead. + * + * Tests for any deep inequality. Opposite of {@link deepEqual}. + * + * ```js + * import assert from 'assert'; + * + * const obj1 = { + * a: { + * b: 1 + * } + * }; + * const obj2 = { + * a: { + * b: 2 + * } + * }; + * const obj3 = { + * a: { + * b: 1 + * } + * }; + * const obj4 = Object.create(obj1); + * + * assert.notDeepEqual(obj1, obj1); + * // AssertionError: { a: { b: 1 } } notDeepEqual { a: { b: 1 } } + * + * assert.notDeepEqual(obj1, obj2); + * // OK + * + * assert.notDeepEqual(obj1, obj3); + * // AssertionError: { a: { b: 1 } } notDeepEqual { a: { b: 1 } } + * + * assert.notDeepEqual(obj1, obj4); + * // OK + * ``` + * + * If the values are deeply equal, an `AssertionError` is thrown with a`message` property set equal to the value of the `message` parameter. If the`message` parameter is undefined, a default + * error message is assigned. If the`message` parameter is an instance of an `Error` then it will be thrown + * instead of the `AssertionError`. + * @since v0.1.21 + */ + function notDeepEqual(actual: unknown, expected: unknown, message?: string | Error): void; + /** + * Tests strict equality between the `actual` and `expected` parameters as + * determined by the [SameValue Comparison](https://tc39.github.io/ecma262/#sec-samevalue). + * + * ```js + * import assert from 'assert/strict'; + * + * assert.strictEqual(1, 2); + * // AssertionError [ERR_ASSERTION]: Expected inputs to be strictly equal: + * // + * // 1 !== 2 + * + * assert.strictEqual(1, 1); + * // OK + * + * assert.strictEqual('Hello foobar', 'Hello World!'); + * // AssertionError [ERR_ASSERTION]: Expected inputs to be strictly equal: + * // + actual - expected + * // + * // + 'Hello foobar' + * // - 'Hello World!' + * // ^ + * + * const apples = 1; + * const oranges = 2; + * assert.strictEqual(apples, oranges, `apples ${apples} !== oranges ${oranges}`); + * // AssertionError [ERR_ASSERTION]: apples 1 !== oranges 2 + * + * assert.strictEqual(1, '1', new TypeError('Inputs are not identical')); + * // TypeError: Inputs are not identical + * ``` + * + * If the values are not strictly equal, an `AssertionError` is thrown with a`message` property set equal to the value of the `message` parameter. If the`message` parameter is undefined, a + * default error message is assigned. If the`message` parameter is an instance of an `Error` then it will be thrown + * instead of the `AssertionError`. + * @since v0.1.21 + */ + function strictEqual(actual: unknown, expected: T, message?: string | Error): asserts actual is T; + /** + * Tests strict inequality between the `actual` and `expected` parameters as + * determined by the [SameValue Comparison](https://tc39.github.io/ecma262/#sec-samevalue). 
+ * + * ```js + * import assert from 'assert/strict'; + * + * assert.notStrictEqual(1, 2); + * // OK + * + * assert.notStrictEqual(1, 1); + * // AssertionError [ERR_ASSERTION]: Expected "actual" to be strictly unequal to: + * // + * // 1 + * + * assert.notStrictEqual(1, '1'); + * // OK + * ``` + * + * If the values are strictly equal, an `AssertionError` is thrown with a`message` property set equal to the value of the `message` parameter. If the`message` parameter is undefined, a + * default error message is assigned. If the`message` parameter is an instance of an `Error` then it will be thrown + * instead of the `AssertionError`. + * @since v0.1.21 + */ + function notStrictEqual(actual: unknown, expected: unknown, message?: string | Error): void; + /** + * Tests for deep equality between the `actual` and `expected` parameters. + * "Deep" equality means that the enumerable "own" properties of child objects + * are recursively evaluated also by the following rules. + * @since v1.2.0 + */ + function deepStrictEqual(actual: unknown, expected: T, message?: string | Error): asserts actual is T; + /** + * Tests for deep strict inequality. Opposite of {@link deepStrictEqual}. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.notDeepStrictEqual({ a: 1 }, { a: '1' }); + * // OK + * ``` + * + * If the values are deeply and strictly equal, an `AssertionError` is thrown + * with a `message` property set equal to the value of the `message` parameter. If + * the `message` parameter is undefined, a default error message is assigned. If + * the `message` parameter is an instance of an `Error` then it will be thrown + * instead of the `AssertionError`. + * @since v1.2.0 + */ + function notDeepStrictEqual(actual: unknown, expected: unknown, message?: string | Error): void; + /** + * Expects the function `fn` to throw an error. + * + * If specified, `error` can be a [`Class`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes), + * [`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions), a validation function, + * a validation object where each property will be tested for strict deep equality, + * or an instance of error where each property will be tested for strict deep + * equality including the non-enumerable `message` and `name` properties. When + * using an object, it is also possible to use a regular expression, when + * validating against a string property. See below for examples. + * + * If specified, `message` will be appended to the message provided by the`AssertionError` if the `fn` call fails to throw or in case the error validation + * fails. + * + * Custom validation object/error instance: + * + * ```js + * import assert from 'assert/strict'; + * + * const err = new TypeError('Wrong value'); + * err.code = 404; + * err.foo = 'bar'; + * err.info = { + * nested: true, + * baz: 'text' + * }; + * err.reg = /abc/i; + * + * assert.throws( + * () => { + * throw err; + * }, + * { + * name: 'TypeError', + * message: 'Wrong value', + * info: { + * nested: true, + * baz: 'text' + * } + * // Only properties on the validation object will be tested for. + * // Using nested objects requires all properties to be present. Otherwise + * // the validation is going to fail. + * } + * ); + * + * // Using regular expressions to validate error properties: + * throws( + * () => { + * throw err; + * }, + * { + * // The `name` and `message` properties are strings and using regular + * // expressions on those will match against the string. 
If they fail, an + * // error is thrown. + * name: /^TypeError$/, + * message: /Wrong/, + * foo: 'bar', + * info: { + * nested: true, + * // It is not possible to use regular expressions for nested properties! + * baz: 'text' + * }, + * // The `reg` property contains a regular expression and only if the + * // validation object contains an identical regular expression, it is going + * // to pass. + * reg: /abc/i + * } + * ); + * + * // Fails due to the different `message` and `name` properties: + * throws( + * () => { + * const otherErr = new Error('Not found'); + * // Copy all enumerable properties from `err` to `otherErr`. + * for (const [key, value] of Object.entries(err)) { + * otherErr[key] = value; + * } + * throw otherErr; + * }, + * // The error's `message` and `name` properties will also be checked when using + * // an error as validation object. + * err + * ); + * ``` + * + * Validate instanceof using constructor: + * + * ```js + * import assert from 'assert/strict'; + * + * assert.throws( + * () => { + * throw new Error('Wrong value'); + * }, + * Error + * ); + * ``` + * + * Validate error message using [`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions): + * + * Using a regular expression runs `.toString` on the error object, and will + * therefore also include the error name. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.throws( + * () => { + * throw new Error('Wrong value'); + * }, + * /^Error: Wrong value$/ + * ); + * ``` + * + * Custom error validation: + * + * The function must return `true` to indicate all internal validations passed. + * It will otherwise fail with an `AssertionError`. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.throws( + * () => { + * throw new Error('Wrong value'); + * }, + * (err) => { + * assert(err instanceof Error); + * assert(/value/.test(err)); + * // Avoid returning anything from validation functions besides `true`. + * // Otherwise, it's not clear what part of the validation failed. Instead, + * // throw an error about the specific validation that failed (as done in this + * // example) and add as much helpful debugging information to that error as + * // possible. + * return true; + * }, + * 'unexpected error' + * ); + * ``` + * + * `error` cannot be a string. If a string is provided as the second + * argument, then `error` is assumed to be omitted and the string will be used for`message` instead. This can lead to easy-to-miss mistakes. Using the same + * message as the thrown error message is going to result in an`ERR_AMBIGUOUS_ARGUMENT` error. Please read the example below carefully if using + * a string as the second argument gets considered: + * + * ```js + * import assert from 'assert/strict'; + * + * function throwingFirst() { + * throw new Error('First'); + * } + * + * function throwingSecond() { + * throw new Error('Second'); + * } + * + * function notThrowing() {} + * + * // The second argument is a string and the input function threw an Error. + * // The first case will not throw as it does not match for the error message + * // thrown by the input function! + * assert.throws(throwingFirst, 'Second'); + * // In the next example the message has no benefit over the message from the + * // error and since it is not clear if the user intended to actually match + * // against the error message, Node.js throws an `ERR_AMBIGUOUS_ARGUMENT` error. 
+ * assert.throws(throwingSecond, 'Second'); + * // TypeError [ERR_AMBIGUOUS_ARGUMENT] + * + * // The string is only used (as message) in case the function does not throw: + * assert.throws(notThrowing, 'Second'); + * // AssertionError [ERR_ASSERTION]: Missing expected exception: Second + * + * // If it was intended to match for the error message do this instead: + * // It does not throw because the error messages match. + * assert.throws(throwingSecond, /Second$/); + * + * // If the error message does not match, an AssertionError is thrown. + * assert.throws(throwingFirst, /Second$/); + * // AssertionError [ERR_ASSERTION] + * ``` + * + * Due to the confusing error-prone notation, avoid a string as the second + * argument. + * @since v0.1.21 + */ + function throws(block: () => unknown, message?: string | Error): void; + function throws(block: () => unknown, error: AssertPredicate, message?: string | Error): void; + /** + * Asserts that the function `fn` does not throw an error. + * + * Using `assert.doesNotThrow()` is actually not useful because there + * is no benefit in catching an error and then rethrowing it. Instead, consider + * adding a comment next to the specific code path that should not throw and keep + * error messages as expressive as possible. + * + * When `assert.doesNotThrow()` is called, it will immediately call the `fn`function. + * + * If an error is thrown and it is the same type as that specified by the `error`parameter, then an `AssertionError` is thrown. If the error is of a + * different type, or if the `error` parameter is undefined, the error is + * propagated back to the caller. + * + * If specified, `error` can be a [`Class`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes), + * [`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions) or a validation + * function. See {@link throws} for more details. + * + * The following, for instance, will throw the `TypeError` because there is no + * matching error type in the assertion: + * + * ```js + * import assert from 'assert/strict'; + * + * assert.doesNotThrow( + * () => { + * throw new TypeError('Wrong value'); + * }, + * SyntaxError + * ); + * ``` + * + * However, the following will result in an `AssertionError` with the message + * 'Got unwanted exception...': + * + * ```js + * import assert from 'assert/strict'; + * + * assert.doesNotThrow( + * () => { + * throw new TypeError('Wrong value'); + * }, + * TypeError + * ); + * ``` + * + * If an `AssertionError` is thrown and a value is provided for the `message`parameter, the value of `message` will be appended to the `AssertionError` message: + * + * ```js + * import assert from 'assert/strict'; + * + * assert.doesNotThrow( + * () => { + * throw new TypeError('Wrong value'); + * }, + * /Wrong value/, + * 'Whoops' + * ); + * // Throws: AssertionError: Got unwanted exception: Whoops + * ``` + * @since v0.1.21 + */ + function doesNotThrow(block: () => unknown, message?: string | Error): void; + function doesNotThrow(block: () => unknown, error: AssertPredicate, message?: string | Error): void; + /** + * Throws `value` if `value` is not `undefined` or `null`. This is useful when + * testing the `error` argument in callbacks. The stack trace contains all frames + * from the error passed to `ifError()` including the potential new frames for`ifError()` itself. 
+ * + * ```js + * import assert from 'assert/strict'; + * + * assert.ifError(null); + * // OK + * assert.ifError(0); + * // AssertionError [ERR_ASSERTION]: ifError got unwanted exception: 0 + * assert.ifError('error'); + * // AssertionError [ERR_ASSERTION]: ifError got unwanted exception: 'error' + * assert.ifError(new Error()); + * // AssertionError [ERR_ASSERTION]: ifError got unwanted exception: Error + * + * // Create some random error frames. + * let err; + * (function errorFrame() { + * err = new Error('test error'); + * })(); + * + * (function ifErrorFrame() { + * assert.ifError(err); + * })(); + * // AssertionError [ERR_ASSERTION]: ifError got unwanted exception: test error + * // at ifErrorFrame + * // at errorFrame + * ``` + * @since v0.1.97 + */ + function ifError(value: unknown): asserts value is null | undefined; + /** + * Awaits the `asyncFn` promise or, if `asyncFn` is a function, immediately + * calls the function and awaits the returned promise to complete. It will then + * check that the promise is rejected. + * + * If `asyncFn` is a function and it throws an error synchronously,`assert.rejects()` will return a rejected `Promise` with that error. If the + * function does not return a promise, `assert.rejects()` will return a rejected`Promise` with an `ERR_INVALID_RETURN_VALUE` error. In both cases the error + * handler is skipped. + * + * Besides the async nature to await the completion behaves identically to {@link throws}. + * + * If specified, `error` can be a [`Class`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes), + * [`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions), a validation function, + * an object where each property will be tested for, or an instance of error where + * each property will be tested for including the non-enumerable `message` and`name` properties. + * + * If specified, `message` will be the message provided by the `AssertionError` if the `asyncFn` fails to reject. + * + * ```js + * import assert from 'assert/strict'; + * + * await assert.rejects( + * async () => { + * throw new TypeError('Wrong value'); + * }, + * { + * name: 'TypeError', + * message: 'Wrong value' + * } + * ); + * ``` + * + * ```js + * import assert from 'assert/strict'; + * + * await assert.rejects( + * async () => { + * throw new TypeError('Wrong value'); + * }, + * (err) => { + * assert.strictEqual(err.name, 'TypeError'); + * assert.strictEqual(err.message, 'Wrong value'); + * return true; + * } + * ); + * ``` + * + * ```js + * import assert from 'assert/strict'; + * + * assert.rejects( + * Promise.reject(new Error('Wrong value')), + * Error + * ).then(() => { + * // ... + * }); + * ``` + * + * `error` cannot be a string. If a string is provided as the second + * argument, then `error` is assumed to be omitted and the string will be used for`message` instead. This can lead to easy-to-miss mistakes. Please read the + * example in {@link throws} carefully if using a string as the second + * argument gets considered. + * @since v10.0.0 + */ + function rejects(block: (() => Promise) | Promise, message?: string | Error): Promise; + function rejects(block: (() => Promise) | Promise, error: AssertPredicate, message?: string | Error): Promise; + /** + * Awaits the `asyncFn` promise or, if `asyncFn` is a function, immediately + * calls the function and awaits the returned promise to complete. It will then + * check that the promise is not rejected. 
+ * + * If `asyncFn` is a function and it throws an error synchronously,`assert.doesNotReject()` will return a rejected `Promise` with that error. If + * the function does not return a promise, `assert.doesNotReject()` will return a + * rejected `Promise` with an `ERR_INVALID_RETURN_VALUE` error. In both cases + * the error handler is skipped. + * + * Using `assert.doesNotReject()` is actually not useful because there is little + * benefit in catching a rejection and then rejecting it again. Instead, consider + * adding a comment next to the specific code path that should not reject and keep + * error messages as expressive as possible. + * + * If specified, `error` can be a [`Class`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes), + * [`RegExp`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Regular_Expressions) or a validation + * function. See {@link throws} for more details. + * + * Besides the async nature to await the completion behaves identically to {@link doesNotThrow}. + * + * ```js + * import assert from 'assert/strict'; + * + * await assert.doesNotReject( + * async () => { + * throw new TypeError('Wrong value'); + * }, + * SyntaxError + * ); + * ``` + * + * ```js + * import assert from 'assert/strict'; + * + * assert.doesNotReject(Promise.reject(new TypeError('Wrong value'))) + * .then(() => { + * // ... + * }); + * ``` + * @since v10.0.0 + */ + function doesNotReject(block: (() => Promise) | Promise, message?: string | Error): Promise; + function doesNotReject(block: (() => Promise) | Promise, error: AssertPredicate, message?: string | Error): Promise; + /** + * Expects the `string` input to match the regular expression. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.match('I will fail', /pass/); + * // AssertionError [ERR_ASSERTION]: The input did not match the regular ... + * + * assert.match(123, /pass/); + * // AssertionError [ERR_ASSERTION]: The "string" argument must be of type string. + * + * assert.match('I will pass', /pass/); + * // OK + * ``` + * + * If the values do not match, or if the `string` argument is of another type than`string`, an `AssertionError` is thrown with a `message` property set equal + * to the value of the `message` parameter. If the `message` parameter is + * undefined, a default error message is assigned. If the `message` parameter is an + * instance of an `Error` then it will be thrown instead of the `AssertionError`. + * @since v13.6.0, v12.16.0 + */ + function match(value: string, regExp: RegExp, message?: string | Error): void; + /** + * Expects the `string` input not to match the regular expression. + * + * ```js + * import assert from 'assert/strict'; + * + * assert.doesNotMatch('I will fail', /fail/); + * // AssertionError [ERR_ASSERTION]: The input was expected to not match the ... + * + * assert.doesNotMatch(123, /pass/); + * // AssertionError [ERR_ASSERTION]: The "string" argument must be of type string. + * + * assert.doesNotMatch('I will pass', /different/); + * // OK + * ``` + * + * If the values do match, or if the `string` argument is of another type than`string`, an `AssertionError` is thrown with a `message` property set equal + * to the value of the `message` parameter. If the `message` parameter is + * undefined, a default error message is assigned. If the `message` parameter is an + * instance of an `Error` then it will be thrown instead of the `AssertionError`. 
+ * @since v13.6.0, v12.16.0 + */ + function doesNotMatch(value: string, regExp: RegExp, message?: string | Error): void; + const strict: Omit & { + (value: unknown, message?: string | Error): asserts value; + equal: typeof strictEqual; + notEqual: typeof notStrictEqual; + deepEqual: typeof deepStrictEqual; + notDeepEqual: typeof notDeepStrictEqual; + // Mapped types and assertion functions are incompatible? + // TS2775: Assertions require every name in the call target + // to be declared with an explicit type annotation. + ok: typeof ok; + strictEqual: typeof strictEqual; + deepStrictEqual: typeof deepStrictEqual; + ifError: typeof ifError; + strict: typeof strict; + }; + } + export = assert; +} +declare module 'node:assert' { + import assert = require('assert'); + export = assert; +} diff --git a/node_modules/@types/node/assert/strict.d.ts b/node_modules/@types/node/assert/strict.d.ts new file mode 100644 index 000000000..b4319b974 --- /dev/null +++ b/node_modules/@types/node/assert/strict.d.ts @@ -0,0 +1,8 @@ +declare module 'assert/strict' { + import { strict } from 'node:assert'; + export = strict; +} +declare module 'node:assert/strict' { + import { strict } from 'node:assert'; + export = strict; +} diff --git a/node_modules/@types/node/async_hooks.d.ts b/node_modules/@types/node/async_hooks.d.ts new file mode 100644 index 000000000..35b9be2c1 --- /dev/null +++ b/node_modules/@types/node/async_hooks.d.ts @@ -0,0 +1,497 @@ +/** + * The `async_hooks` module provides an API to track asynchronous resources. It + * can be accessed using: + * + * ```js + * import async_hooks from 'async_hooks'; + * ``` + * @experimental + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/async_hooks.js) + */ +declare module 'async_hooks' { + /** + * ```js + * import { executionAsyncId } from 'async_hooks'; + * + * console.log(executionAsyncId()); // 1 - bootstrap + * fs.open(path, 'r', (err, fd) => { + * console.log(executionAsyncId()); // 6 - open() + * }); + * ``` + * + * The ID returned from `executionAsyncId()` is related to execution timing, not + * causality (which is covered by `triggerAsyncId()`): + * + * ```js + * const server = net.createServer((conn) => { + * // Returns the ID of the server, not of the new connection, because the + * // callback runs in the execution scope of the server's MakeCallback(). + * async_hooks.executionAsyncId(); + * + * }).listen(port, () => { + * // Returns the ID of a TickObject (process.nextTick()) because all + * // callbacks passed to .listen() are wrapped in a nextTick(). + * async_hooks.executionAsyncId(); + * }); + * ``` + * + * Promise contexts may not get precise `executionAsyncIds` by default. + * See the section on `promise execution tracking`. + * @since v8.1.0 + * @return The `asyncId` of the current execution context. Useful to track when something calls. + */ + function executionAsyncId(): number; + /** + * Resource objects returned by `executionAsyncResource()` are most often internal + * Node.js handle objects with undocumented APIs. Using any functions or properties + * on the object is likely to crash your application and should be avoided. + * + * Using `executionAsyncResource()` in the top-level execution context will + * return an empty object as there is no handle or request object to use, + * but having an object representing the top-level can be helpful. 
+ * + * ```js + * import { open } from 'fs'; + * import { executionAsyncId, executionAsyncResource } from 'async_hooks'; + * + * console.log(executionAsyncId(), executionAsyncResource()); // 1 {} + * open(new URL(import.meta.url), 'r', (err, fd) => { + * console.log(executionAsyncId(), executionAsyncResource()); // 7 FSReqWrap + * }); + * ``` + * + * This can be used to implement continuation local storage without the + * use of a tracking `Map` to store the metadata: + * + * ```js + * import { createServer } from 'http'; + * import { + * executionAsyncId, + * executionAsyncResource, + * createHook + * } from 'async_hooks'; + * const sym = Symbol('state'); // Private symbol to avoid pollution + * + * createHook({ + * init(asyncId, type, triggerAsyncId, resource) { + * const cr = executionAsyncResource(); + * if (cr) { + * resource[sym] = cr[sym]; + * } + * } + * }).enable(); + * + * const server = createServer((req, res) => { + * executionAsyncResource()[sym] = { state: req.url }; + * setTimeout(function() { + * res.end(JSON.stringify(executionAsyncResource()[sym])); + * }, 100); + * }).listen(3000); + * ``` + * @since v13.9.0, v12.17.0 + * @return The resource representing the current execution. Useful to store data within the resource. + */ + function executionAsyncResource(): object; + /** + * ```js + * const server = net.createServer((conn) => { + * // The resource that caused (or triggered) this callback to be called + * // was that of the new connection. Thus the return value of triggerAsyncId() + * // is the asyncId of "conn". + * async_hooks.triggerAsyncId(); + * + * }).listen(port, () => { + * // Even though all callbacks passed to .listen() are wrapped in a nextTick() + * // the callback itself exists because the call to the server's .listen() + * // was made. So the return value would be the ID of the server. + * async_hooks.triggerAsyncId(); + * }); + * ``` + * + * Promise contexts may not get valid `triggerAsyncId`s by default. See + * the section on `promise execution tracking`. + * @return The ID of the resource responsible for calling the callback that is currently being executed. + */ + function triggerAsyncId(): number; + interface HookCallbacks { + /** + * Called when a class is constructed that has the possibility to emit an asynchronous event. + * @param asyncId a unique ID for the async resource + * @param type the type of the async resource + * @param triggerAsyncId the unique ID of the async resource in whose execution context this async resource was created + * @param resource reference to the resource representing the async operation, needs to be released during destroy + */ + init?(asyncId: number, type: string, triggerAsyncId: number, resource: object): void; + /** + * When an asynchronous operation is initiated or completes a callback is called to notify the user. + * The before callback is called just before said callback is executed. + * @param asyncId the unique identifier assigned to the resource about to execute the callback. + */ + before?(asyncId: number): void; + /** + * Called immediately after the callback specified in before is completed. + * @param asyncId the unique identifier assigned to the resource which has executed the callback. + */ + after?(asyncId: number): void; + /** + * Called when a promise has resolve() called. This may not be in the same execution id + * as the promise itself. + * @param asyncId the unique id for the promise that was resolve()d. 
+ */ + promiseResolve?(asyncId: number): void; + /** + * Called after the resource corresponding to asyncId is destroyed + * @param asyncId a unique ID for the async resource + */ + destroy?(asyncId: number): void; + } + interface AsyncHook { + /** + * Enable the callbacks for a given AsyncHook instance. If no callbacks are provided enabling is a noop. + */ + enable(): this; + /** + * Disable the callbacks for a given AsyncHook instance from the global pool of AsyncHook callbacks to be executed. Once a hook has been disabled it will not be called again until enabled. + */ + disable(): this; + } + /** + * Registers functions to be called for different lifetime events of each async + * operation. + * + * The callbacks `init()`/`before()`/`after()`/`destroy()` are called for the + * respective asynchronous event during a resource's lifetime. + * + * All callbacks are optional. For example, if only resource cleanup needs to + * be tracked, then only the `destroy` callback needs to be passed. The + * specifics of all functions that can be passed to `callbacks` is in the `Hook Callbacks` section. + * + * ```js + * import { createHook } from 'async_hooks'; + * + * const asyncHook = createHook({ + * init(asyncId, type, triggerAsyncId, resource) { }, + * destroy(asyncId) { } + * }); + * ``` + * + * The callbacks will be inherited via the prototype chain: + * + * ```js + * class MyAsyncCallbacks { + * init(asyncId, type, triggerAsyncId, resource) { } + * destroy(asyncId) {} + * } + * + * class MyAddedCallbacks extends MyAsyncCallbacks { + * before(asyncId) { } + * after(asyncId) { } + * } + * + * const asyncHook = async_hooks.createHook(new MyAddedCallbacks()); + * ``` + * + * Because promises are asynchronous resources whose lifecycle is tracked + * via the async hooks mechanism, the `init()`, `before()`, `after()`, and`destroy()` callbacks _must not_ be async functions that return promises. + * @since v8.1.0 + * @param callbacks The `Hook Callbacks` to register + * @return Instance used for disabling and enabling hooks + */ + function createHook(callbacks: HookCallbacks): AsyncHook; + interface AsyncResourceOptions { + /** + * The ID of the execution context that created this async event. + * @default executionAsyncId() + */ + triggerAsyncId?: number | undefined; + /** + * Disables automatic `emitDestroy` when the object is garbage collected. + * This usually does not need to be set (even if `emitDestroy` is called + * manually), unless the resource's `asyncId` is retrieved and the + * sensitive API's `emitDestroy` is called with it. + * @default false + */ + requireManualDestroy?: boolean | undefined; + } + /** + * The class `AsyncResource` is designed to be extended by the embedder's async + * resources. Using this, users can easily trigger the lifetime events of their + * own resources. + * + * The `init` hook will trigger when an `AsyncResource` is instantiated. + * + * The following is an overview of the `AsyncResource` API. + * + * ```js + * import { AsyncResource, executionAsyncId } from 'async_hooks'; + * + * // AsyncResource() is meant to be extended. Instantiating a + * // new AsyncResource() also triggers init. If triggerAsyncId is omitted then + * // async_hook.executionAsyncId() is used. + * const asyncResource = new AsyncResource( + * type, { triggerAsyncId: executionAsyncId(), requireManualDestroy: false } + * ); + * + * // Run a function in the execution context of the resource. 
This will + * // * establish the context of the resource + * // * trigger the AsyncHooks before callbacks + * // * call the provided function `fn` with the supplied arguments + * // * trigger the AsyncHooks after callbacks + * // * restore the original execution context + * asyncResource.runInAsyncScope(fn, thisArg, ...args); + * + * // Call AsyncHooks destroy callbacks. + * asyncResource.emitDestroy(); + * + * // Return the unique ID assigned to the AsyncResource instance. + * asyncResource.asyncId(); + * + * // Return the trigger ID for the AsyncResource instance. + * asyncResource.triggerAsyncId(); + * ``` + */ + class AsyncResource { + /** + * AsyncResource() is meant to be extended. Instantiating a + * new AsyncResource() also triggers init. If triggerAsyncId is omitted then + * async_hook.executionAsyncId() is used. + * @param type The type of async event. + * @param triggerAsyncId The ID of the execution context that created + * this async event (default: `executionAsyncId()`), or an + * AsyncResourceOptions object (since 9.3) + */ + constructor(type: string, triggerAsyncId?: number | AsyncResourceOptions); + /** + * Binds the given function to the current execution context. + * + * The returned function will have an `asyncResource` property referencing + * the `AsyncResource` to which the function is bound. + * @since v14.8.0, v12.19.0 + * @param fn The function to bind to the current execution context. + * @param type An optional name to associate with the underlying `AsyncResource`. + */ + static bind any, ThisArg>( + fn: Func, + type?: string, + thisArg?: ThisArg + ): Func & { + asyncResource: AsyncResource; + }; + /** + * Binds the given function to execute to this `AsyncResource`'s scope. + * + * The returned function will have an `asyncResource` property referencing + * the `AsyncResource` to which the function is bound. + * @since v14.8.0, v12.19.0 + * @param fn The function to bind to the current `AsyncResource`. + */ + bind any>( + fn: Func + ): Func & { + asyncResource: AsyncResource; + }; + /** + * Call the provided function with the provided arguments in the execution context + * of the async resource. This will establish the context, trigger the AsyncHooks + * before callbacks, call the function, trigger the AsyncHooks after callbacks, and + * then restore the original execution context. + * @since v9.6.0 + * @param fn The function to call in the execution context of this async resource. + * @param thisArg The receiver to be used for the function call. + * @param args Optional arguments to pass to the function. + */ + runInAsyncScope(fn: (this: This, ...args: any[]) => Result, thisArg?: This, ...args: any[]): Result; + /** + * Call all `destroy` hooks. This should only ever be called once. An error will + * be thrown if it is called more than once. This **must** be manually called. If + * the resource is left to be collected by the GC then the `destroy` hooks will + * never be called. + * @return A reference to `asyncResource`. + */ + emitDestroy(): this; + /** + * @return The unique `asyncId` assigned to the resource. + */ + asyncId(): number; + /** + * + * @return The same `triggerAsyncId` that is passed to the `AsyncResource` constructor. + */ + triggerAsyncId(): number; + } + /** + * This class creates stores that stay coherent through asynchronous operations. 
+ * + * While you can create your own implementation on top of the `async_hooks` module,`AsyncLocalStorage` should be preferred as it is a performant and memory safe + * implementation that involves significant optimizations that are non-obvious to + * implement. + * + * The following example uses `AsyncLocalStorage` to build a simple logger + * that assigns IDs to incoming HTTP requests and includes them in messages + * logged within each request. + * + * ```js + * import http from 'http'; + * import { AsyncLocalStorage } from 'async_hooks'; + * + * const asyncLocalStorage = new AsyncLocalStorage(); + * + * function logWithId(msg) { + * const id = asyncLocalStorage.getStore(); + * console.log(`${id !== undefined ? id : '-'}:`, msg); + * } + * + * let idSeq = 0; + * http.createServer((req, res) => { + * asyncLocalStorage.run(idSeq++, () => { + * logWithId('start'); + * // Imagine any chain of async operations here + * setImmediate(() => { + * logWithId('finish'); + * res.end(); + * }); + * }); + * }).listen(8080); + * + * http.get('http://localhost:8080'); + * http.get('http://localhost:8080'); + * // Prints: + * // 0: start + * // 1: start + * // 0: finish + * // 1: finish + * ``` + * + * Each instance of `AsyncLocalStorage` maintains an independent storage context. + * Multiple instances can safely exist simultaneously without risk of interfering + * with each other data. + * @since v13.10.0, v12.17.0 + */ + class AsyncLocalStorage { + /** + * Disables the instance of `AsyncLocalStorage`. All subsequent calls + * to `asyncLocalStorage.getStore()` will return `undefined` until`asyncLocalStorage.run()` or `asyncLocalStorage.enterWith()` is called again. + * + * When calling `asyncLocalStorage.disable()`, all current contexts linked to the + * instance will be exited. + * + * Calling `asyncLocalStorage.disable()` is required before the`asyncLocalStorage` can be garbage collected. This does not apply to stores + * provided by the `asyncLocalStorage`, as those objects are garbage collected + * along with the corresponding async resources. + * + * Use this method when the `asyncLocalStorage` is not in use anymore + * in the current process. + * @since v13.10.0, v12.17.0 + * @experimental + */ + disable(): void; + /** + * Returns the current store. + * If called outside of an asynchronous context initialized by + * calling `asyncLocalStorage.run()` or `asyncLocalStorage.enterWith()`, it + * returns `undefined`. + * @since v13.10.0, v12.17.0 + */ + getStore(): T | undefined; + /** + * Runs a function synchronously within a context and returns its + * return value. The store is not accessible outside of the callback function or + * the asynchronous operations created within the callback. + * + * The optional `args` are passed to the callback function. + * + * If the callback function throws an error, the error is thrown by `run()` too. + * The stacktrace is not impacted by this call and the context is exited. + * + * Example: + * + * ```js + * const store = { id: 2 }; + * try { + * asyncLocalStorage.run(store, () => { + * asyncLocalStorage.getStore(); // Returns the store object + * throw new Error(); + * }); + * } catch (e) { + * asyncLocalStorage.getStore(); // Returns undefined + * // The error will be caught here + * } + * ``` + * @since v13.10.0, v12.17.0 + */ + run(store: T, callback: (...args: TArgs) => R, ...args: TArgs): R; + /** + * Runs a function synchronously outside of a context and returns its + * return value. 
The store is not accessible within the callback function or + * the asynchronous operations created within the callback. Any `getStore()`call done within the callback function will always return `undefined`. + * + * The optional `args` are passed to the callback function. + * + * If the callback function throws an error, the error is thrown by `exit()` too. + * The stacktrace is not impacted by this call and the context is re-entered. + * + * Example: + * + * ```js + * // Within a call to run + * try { + * asyncLocalStorage.getStore(); // Returns the store object or value + * asyncLocalStorage.exit(() => { + * asyncLocalStorage.getStore(); // Returns undefined + * throw new Error(); + * }); + * } catch (e) { + * asyncLocalStorage.getStore(); // Returns the same object or value + * // The error will be caught here + * } + * ``` + * @since v13.10.0, v12.17.0 + * @experimental + */ + exit(callback: (...args: TArgs) => R, ...args: TArgs): R; + /** + * Transitions into the context for the remainder of the current + * synchronous execution and then persists the store through any following + * asynchronous calls. + * + * Example: + * + * ```js + * const store = { id: 1 }; + * // Replaces previous store with the given store object + * asyncLocalStorage.enterWith(store); + * asyncLocalStorage.getStore(); // Returns the store object + * someAsyncOperation(() => { + * asyncLocalStorage.getStore(); // Returns the same object + * }); + * ``` + * + * This transition will continue for the _entire_ synchronous execution. + * This means that if, for example, the context is entered within an event + * handler subsequent event handlers will also run within that context unless + * specifically bound to another context with an `AsyncResource`. That is why`run()` should be preferred over `enterWith()` unless there are strong reasons + * to use the latter method. + * + * ```js + * const store = { id: 1 }; + * + * emitter.on('my-event', () => { + * asyncLocalStorage.enterWith(store); + * }); + * emitter.on('my-event', () => { + * asyncLocalStorage.getStore(); // Returns the same object + * }); + * + * asyncLocalStorage.getStore(); // Returns undefined + * emitter.emit('my-event'); + * asyncLocalStorage.getStore(); // Returns the same object + * ``` + * @since v13.11.0, v12.17.0 + * @experimental + */ + enterWith(store: T): void; + } +} +declare module 'node:async_hooks' { + export * from 'async_hooks'; +} diff --git a/node_modules/@types/node/buffer.d.ts b/node_modules/@types/node/buffer.d.ts new file mode 100644 index 000000000..cfcd33e0a --- /dev/null +++ b/node_modules/@types/node/buffer.d.ts @@ -0,0 +1,2142 @@ +/** + * `Buffer` objects are used to represent a fixed-length sequence of bytes. Many + * Node.js APIs support `Buffer`s. + * + * The `Buffer` class is a subclass of JavaScript's [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) class and + * extends it with methods that cover additional use cases. Node.js APIs accept + * plain [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) s wherever `Buffer`s are supported as well. + * + * While the `Buffer` class is available within the global scope, it is still + * recommended to explicitly reference it via an import or require statement. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Creates a zero-filled Buffer of length 10. 
+ * const buf1 = Buffer.alloc(10); + * + * // Creates a Buffer of length 10, + * // filled with bytes which all have the value `1`. + * const buf2 = Buffer.alloc(10, 1); + * + * // Creates an uninitialized buffer of length 10. + * // This is faster than calling Buffer.alloc() but the returned + * // Buffer instance might contain old data that needs to be + * // overwritten using fill(), write(), or other functions that fill the Buffer's + * // contents. + * const buf3 = Buffer.allocUnsafe(10); + * + * // Creates a Buffer containing the bytes [1, 2, 3]. + * const buf4 = Buffer.from([1, 2, 3]); + * + * // Creates a Buffer containing the bytes [1, 1, 1, 1] – the entries + * // are all truncated using `(value & 255)` to fit into the range 0–255. + * const buf5 = Buffer.from([257, 257.5, -255, '1']); + * + * // Creates a Buffer containing the UTF-8-encoded bytes for the string 'tést': + * // [0x74, 0xc3, 0xa9, 0x73, 0x74] (in hexadecimal notation) + * // [116, 195, 169, 115, 116] (in decimal notation) + * const buf6 = Buffer.from('tést'); + * + * // Creates a Buffer containing the Latin-1 bytes [0x74, 0xe9, 0x73, 0x74]. + * const buf7 = Buffer.from('tést', 'latin1'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/buffer.js) + */ +declare module 'buffer' { + import { BinaryLike } from 'node:crypto'; + export const INSPECT_MAX_BYTES: number; + export const kMaxLength: number; + export const kStringMaxLength: number; + export const constants: { + MAX_LENGTH: number; + MAX_STRING_LENGTH: number; + }; + export type TranscodeEncoding = 'ascii' | 'utf8' | 'utf16le' | 'ucs2' | 'latin1' | 'binary'; + /** + * Re-encodes the given `Buffer` or `Uint8Array` instance from one character + * encoding to another. Returns a new `Buffer` instance. + * + * Throws if the `fromEnc` or `toEnc` specify invalid character encodings or if + * conversion from `fromEnc` to `toEnc` is not permitted. + * + * Encodings supported by `buffer.transcode()` are: `'ascii'`, `'utf8'`,`'utf16le'`, `'ucs2'`, `'latin1'`, and `'binary'`. + * + * The transcoding process will use substitution characters if a given byte + * sequence cannot be adequately represented in the target encoding. For instance: + * + * ```js + * import { Buffer, transcode } from 'buffer'; + * + * const newBuf = transcode(Buffer.from('€'), 'utf8', 'ascii'); + * console.log(newBuf.toString('ascii')); + * // Prints: '?' + * ``` + * + * Because the Euro (`€`) sign is not representable in US-ASCII, it is replaced + * with `?` in the transcoded `Buffer`. + * @since v7.1.0 + * @param source A `Buffer` or `Uint8Array` instance. + * @param fromEnc The current encoding. + * @param toEnc To target encoding. + */ + export function transcode(source: Uint8Array, fromEnc: TranscodeEncoding, toEnc: TranscodeEncoding): Buffer; + export const SlowBuffer: { + /** @deprecated since v6.0.0, use `Buffer.allocUnsafeSlow()` */ + new (size: number): Buffer; + prototype: Buffer; + }; + /** + * Resolves a `'blob:nodedata:...'` an associated `Blob` object registered using + * a prior call to `URL.createObjectURL()`. + * @since v16.7.0 + * @experimental + * @param id A `'blob:nodedata:...` URL string returned by a prior call to `URL.createObjectURL()`. + */ + export function resolveObjectURL(id: string): Blob | undefined; + export { Buffer }; + /** + * @experimental + */ + export interface BlobOptions { + /** + * @default 'utf8' + */ + encoding?: BufferEncoding | undefined; + /** + * The Blob content-type. 
The intent is for `type` to convey + * the MIME media type of the data, however no validation of the type format + * is performed. + */ + type?: string | undefined; + } + /** + * A [`Blob`](https://developer.mozilla.org/en-US/docs/Web/API/Blob) encapsulates immutable, raw data that can be safely shared across + * multiple worker threads. + * @since v15.7.0 + * @experimental + */ + export class Blob { + /** + * The total size of the `Blob` in bytes. + * @since v15.7.0 + */ + readonly size: number; + /** + * The content-type of the `Blob`. + * @since v15.7.0 + */ + readonly type: string; + /** + * Creates a new `Blob` object containing a concatenation of the given sources. + * + * {ArrayBuffer}, {TypedArray}, {DataView}, and {Buffer} sources are copied into + * the 'Blob' and can therefore be safely modified after the 'Blob' is created. + * + * String sources are also copied into the `Blob`. + */ + constructor(sources: Array, options?: BlobOptions); + /** + * Returns a promise that fulfills with an [ArrayBuffer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer) containing a copy of + * the `Blob` data. + * @since v15.7.0 + */ + arrayBuffer(): Promise; + /** + * Creates and returns a new `Blob` containing a subset of this `Blob` objects + * data. The original `Blob` is not altered. + * @since v15.7.0 + * @param start The starting index. + * @param end The ending index. + * @param type The content-type for the new `Blob` + */ + slice(start?: number, end?: number, type?: string): Blob; + /** + * Returns a promise that fulfills with the contents of the `Blob` decoded as a + * UTF-8 string. + * @since v15.7.0 + */ + text(): Promise; + /** + * Returns a new `ReadableStream` that allows the content of the `Blob` to be read. + * @since v16.7.0 + */ + stream(): unknown; // pending web streams types + } + export import atob = globalThis.atob; + export import btoa = globalThis.btoa; + global { + // Buffer class + type BufferEncoding = 'ascii' | 'utf8' | 'utf-8' | 'utf16le' | 'ucs2' | 'ucs-2' | 'base64' | 'base64url' | 'latin1' | 'binary' | 'hex'; + type WithImplicitCoercion = + | T + | { + valueOf(): T; + }; + /** + * Raw data is stored in instances of the Buffer class. + * A Buffer is similar to an array of integers but corresponds to a raw memory allocation outside the V8 heap. A Buffer cannot be resized. + * Valid string encodings: 'ascii'|'utf8'|'utf16le'|'ucs2'(alias of 'utf16le')|'base64'|'base64url'|'binary'(deprecated)|'hex' + */ + interface BufferConstructor { + /** + * Allocates a new buffer containing the given {str}. + * + * @param str String to store in buffer. + * @param encoding encoding to use, optional. Default is 'utf8' + * @deprecated since v10.0.0 - Use `Buffer.from(string[, encoding])` instead. + */ + new (str: string, encoding?: BufferEncoding): Buffer; + /** + * Allocates a new buffer of {size} octets. + * + * @param size count of octets to allocate. + * @deprecated since v10.0.0 - Use `Buffer.alloc()` instead (also see `Buffer.allocUnsafe()`). + */ + new (size: number): Buffer; + /** + * Allocates a new buffer containing the given {array} of octets. + * + * @param array The octets to store. + * @deprecated since v10.0.0 - Use `Buffer.from(array)` instead. + */ + new (array: Uint8Array): Buffer; + /** + * Produces a Buffer backed by the same allocated memory as + * the given {ArrayBuffer}/{SharedArrayBuffer}. + * + * + * @param arrayBuffer The ArrayBuffer with which to share memory. 
+ * @deprecated since v10.0.0 - Use `Buffer.from(arrayBuffer[, byteOffset[, length]])` instead. + */ + new (arrayBuffer: ArrayBuffer | SharedArrayBuffer): Buffer; + /** + * Allocates a new buffer containing the given {array} of octets. + * + * @param array The octets to store. + * @deprecated since v10.0.0 - Use `Buffer.from(array)` instead. + */ + new (array: ReadonlyArray): Buffer; + /** + * Copies the passed {buffer} data onto a new {Buffer} instance. + * + * @param buffer The buffer to copy. + * @deprecated since v10.0.0 - Use `Buffer.from(buffer)` instead. + */ + new (buffer: Buffer): Buffer; + /** + * Allocates a new `Buffer` using an `array` of bytes in the range `0` – `255`. + * Array entries outside that range will be truncated to fit into it. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Creates a new Buffer containing the UTF-8 bytes of the string 'buffer'. + * const buf = Buffer.from([0x62, 0x75, 0x66, 0x66, 0x65, 0x72]); + * ``` + * + * A `TypeError` will be thrown if `array` is not an `Array` or another type + * appropriate for `Buffer.from()` variants. + * + * `Buffer.from(array)` and `Buffer.from(string)` may also use the internal`Buffer` pool like `Buffer.allocUnsafe()` does. + * @since v5.10.0 + */ + from(arrayBuffer: WithImplicitCoercion, byteOffset?: number, length?: number): Buffer; + /** + * Creates a new Buffer using the passed {data} + * @param data data to create a new Buffer + */ + from(data: Uint8Array | ReadonlyArray): Buffer; + from(data: WithImplicitCoercion | string>): Buffer; + /** + * Creates a new Buffer containing the given JavaScript string {str}. + * If provided, the {encoding} parameter identifies the character encoding. + * If not provided, {encoding} defaults to 'utf8'. + */ + from( + str: + | WithImplicitCoercion + | { + [Symbol.toPrimitive](hint: 'string'): string; + }, + encoding?: BufferEncoding + ): Buffer; + /** + * Creates a new Buffer using the passed {data} + * @param values to create a new Buffer + */ + of(...items: number[]): Buffer; + /** + * Returns `true` if `obj` is a `Buffer`, `false` otherwise. + * + * ```js + * import { Buffer } from 'buffer'; + * + * Buffer.isBuffer(Buffer.alloc(10)); // true + * Buffer.isBuffer(Buffer.from('foo')); // true + * Buffer.isBuffer('a string'); // false + * Buffer.isBuffer([]); // false + * Buffer.isBuffer(new Uint8Array(1024)); // false + * ``` + * @since v0.1.101 + */ + isBuffer(obj: any): obj is Buffer; + /** + * Returns `true` if `encoding` is the name of a supported character encoding, + * or `false` otherwise. + * + * ```js + * import { Buffer } from 'buffer'; + * + * console.log(Buffer.isEncoding('utf8')); + * // Prints: true + * + * console.log(Buffer.isEncoding('hex')); + * // Prints: true + * + * console.log(Buffer.isEncoding('utf/8')); + * // Prints: false + * + * console.log(Buffer.isEncoding('')); + * // Prints: false + * ``` + * @since v0.9.1 + * @param encoding A character encoding name to check. + */ + isEncoding(encoding: string): encoding is BufferEncoding; + /** + * Returns the byte length of a string when encoded using `encoding`. + * This is not the same as [`String.prototype.length`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/length), which does not account + * for the encoding that is used to convert the string into bytes. + * + * For `'base64'`, `'base64url'`, and `'hex'`, this function assumes valid input. + * For strings that contain non-base64/hex-encoded data (e.g. 
whitespace), the + * return value might be greater than the length of a `Buffer` created from the + * string. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const str = '\u00bd + \u00bc = \u00be'; + * + * console.log(`${str}: ${str.length} characters, ` + + * `${Buffer.byteLength(str, 'utf8')} bytes`); + * // Prints: ½ + ¼ = ¾: 9 characters, 12 bytes + * ``` + * + * When `string` is a + * `Buffer`/[`DataView`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView)/[`TypedArray`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/- + * Reference/Global_Objects/TypedArray)/[`ArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer)/[`SharedArrayBuffer`](https://develop- + * er.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/SharedArrayBuffer), the byte length as reported by `.byteLength`is returned. + * @since v0.1.90 + * @param string A value to calculate the length of. + * @param [encoding='utf8'] If `string` is a string, this is its encoding. + * @return The number of bytes contained within `string`. + */ + byteLength(string: string | NodeJS.ArrayBufferView | ArrayBuffer | SharedArrayBuffer, encoding?: BufferEncoding): number; + /** + * Returns a new `Buffer` which is the result of concatenating all the `Buffer`instances in the `list` together. + * + * If the list has no items, or if the `totalLength` is 0, then a new zero-length`Buffer` is returned. + * + * If `totalLength` is not provided, it is calculated from the `Buffer` instances + * in `list` by adding their lengths. + * + * If `totalLength` is provided, it is coerced to an unsigned integer. If the + * combined length of the `Buffer`s in `list` exceeds `totalLength`, the result is + * truncated to `totalLength`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Create a single `Buffer` from a list of three `Buffer` instances. + * + * const buf1 = Buffer.alloc(10); + * const buf2 = Buffer.alloc(14); + * const buf3 = Buffer.alloc(18); + * const totalLength = buf1.length + buf2.length + buf3.length; + * + * console.log(totalLength); + * // Prints: 42 + * + * const bufA = Buffer.concat([buf1, buf2, buf3], totalLength); + * + * console.log(bufA); + * // Prints: + * console.log(bufA.length); + * // Prints: 42 + * ``` + * + * `Buffer.concat()` may also use the internal `Buffer` pool like `Buffer.allocUnsafe()` does. + * @since v0.7.11 + * @param list List of `Buffer` or {@link Uint8Array} instances to concatenate. + * @param totalLength Total length of the `Buffer` instances in `list` when concatenated. + */ + concat(list: ReadonlyArray, totalLength?: number): Buffer; + /** + * Compares `buf1` to `buf2`, typically for the purpose of sorting arrays of`Buffer` instances. This is equivalent to calling `buf1.compare(buf2)`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from('1234'); + * const buf2 = Buffer.from('0123'); + * const arr = [buf1, buf2]; + * + * console.log(arr.sort(Buffer.compare)); + * // Prints: [ , ] + * // (This result is equal to: [buf2, buf1].) + * ``` + * @since v0.11.13 + * @return Either `-1`, `0`, or `1`, depending on the result of the comparison. See `compare` for details. + */ + compare(buf1: Uint8Array, buf2: Uint8Array): number; + /** + * Allocates a new `Buffer` of `size` bytes. If `fill` is `undefined`, the`Buffer` will be zero-filled. 
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.alloc(5); + * + * console.log(buf); + * // Prints: + * ``` + * + * If `size` is larger than {@link constants.MAX_LENGTH} or smaller than 0, `ERR_INVALID_ARG_VALUE` is thrown. + * + * If `fill` is specified, the allocated `Buffer` will be initialized by calling `buf.fill(fill)`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.alloc(5, 'a'); + * + * console.log(buf); + * // Prints: + * ``` + * + * If both `fill` and `encoding` are specified, the allocated `Buffer` will be + * initialized by calling `buf.fill(fill, encoding)`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.alloc(11, 'aGVsbG8gd29ybGQ=', 'base64'); + * + * console.log(buf); + * // Prints: + * ``` + * + * Calling `Buffer.alloc()` can be measurably slower than the alternative `Buffer.allocUnsafe()` but ensures that the newly created `Buffer` instance + * contents will never contain sensitive data from previous allocations, including + * data that might not have been allocated for `Buffer`s. + * + * A `TypeError` will be thrown if `size` is not a number. + * @since v5.10.0 + * @param size The desired length of the new `Buffer`. + * @param [fill=0] A value to pre-fill the new `Buffer` with. + * @param [encoding='utf8'] If `fill` is a string, this is its encoding. + */ + alloc(size: number, fill?: string | Buffer | number, encoding?: BufferEncoding): Buffer; + /** + * Allocates a new `Buffer` of `size` bytes. If `size` is larger than {@link constants.MAX_LENGTH} or smaller than 0, `ERR_INVALID_ARG_VALUE` is thrown. + * + * The underlying memory for `Buffer` instances created in this way is _not_ + * _initialized_. The contents of the newly created `Buffer` are unknown and_may contain sensitive data_. Use `Buffer.alloc()` instead to initialize`Buffer` instances with zeroes. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(10); + * + * console.log(buf); + * // Prints (contents may vary): + * + * buf.fill(0); + * + * console.log(buf); + * // Prints: + * ``` + * + * A `TypeError` will be thrown if `size` is not a number. + * + * The `Buffer` module pre-allocates an internal `Buffer` instance of + * size `Buffer.poolSize` that is used as a pool for the fast allocation of new`Buffer` instances created using `Buffer.allocUnsafe()`,`Buffer.from(array)`, `Buffer.concat()`, and the + * deprecated`new Buffer(size)` constructor only when `size` is less than or equal + * to `Buffer.poolSize >> 1` (floor of `Buffer.poolSize` divided by two). + * + * Use of this pre-allocated internal memory pool is a key difference between + * calling `Buffer.alloc(size, fill)` vs. `Buffer.allocUnsafe(size).fill(fill)`. + * Specifically, `Buffer.alloc(size, fill)` will _never_ use the internal `Buffer`pool, while `Buffer.allocUnsafe(size).fill(fill)`_will_ use the internal`Buffer` pool if `size` is less + * than or equal to half `Buffer.poolSize`. The + * difference is subtle but can be important when an application requires the + * additional performance that `Buffer.allocUnsafe()` provides. + * @since v5.10.0 + * @param size The desired length of the new `Buffer`. + */ + allocUnsafe(size: number): Buffer; + /** + * Allocates a new `Buffer` of `size` bytes. If `size` is larger than {@link constants.MAX_LENGTH} or smaller than 0, `ERR_INVALID_ARG_VALUE` is thrown. A zero-length `Buffer` is created + * if `size` is 0. 
+ * + * The underlying memory for `Buffer` instances created in this way is _not_ + * _initialized_. The contents of the newly created `Buffer` are unknown and_may contain sensitive data_. Use `buf.fill(0)` to initialize + * such `Buffer` instances with zeroes. + * + * When using `Buffer.allocUnsafe()` to allocate new `Buffer` instances, + * allocations under 4 KB are sliced from a single pre-allocated `Buffer`. This + * allows applications to avoid the garbage collection overhead of creating many + * individually allocated `Buffer` instances. This approach improves both + * performance and memory usage by eliminating the need to track and clean up as + * many individual `ArrayBuffer` objects. + * + * However, in the case where a developer may need to retain a small chunk of + * memory from a pool for an indeterminate amount of time, it may be appropriate + * to create an un-pooled `Buffer` instance using `Buffer.allocUnsafeSlow()` and + * then copying out the relevant bits. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Need to keep around a few small chunks of memory. + * const store = []; + * + * socket.on('readable', () => { + * let data; + * while (null !== (data = readable.read())) { + * // Allocate for retained data. + * const sb = Buffer.allocUnsafeSlow(10); + * + * // Copy the data into the new allocation. + * data.copy(sb, 0, 0, 10); + * + * store.push(sb); + * } + * }); + * ``` + * + * A `TypeError` will be thrown if `size` is not a number. + * @since v5.12.0 + * @param size The desired length of the new `Buffer`. + */ + allocUnsafeSlow(size: number): Buffer; + /** + * This is the size (in bytes) of pre-allocated internal `Buffer` instances used + * for pooling. This value may be modified. + * @since v0.11.3 + */ + poolSize: number; + } + interface Buffer extends Uint8Array { + /** + * Writes `string` to `buf` at `offset` according to the character encoding in`encoding`. The `length` parameter is the number of bytes to write. If `buf` did + * not contain enough space to fit the entire string, only part of `string` will be + * written. However, partially encoded characters will not be written. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.alloc(256); + * + * const len = buf.write('\u00bd + \u00bc = \u00be', 0); + * + * console.log(`${len} bytes: ${buf.toString('utf8', 0, len)}`); + * // Prints: 12 bytes: ½ + ¼ = ¾ + * + * const buffer = Buffer.alloc(10); + * + * const length = buffer.write('abcd', 8); + * + * console.log(`${length} bytes: ${buffer.toString('utf8', 8, 10)}`); + * // Prints: 2 bytes : ab + * ``` + * @since v0.1.90 + * @param string String to write to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write `string`. + * @param [length=buf.length - offset] Maximum number of bytes to write (written bytes will not exceed `buf.length - offset`). + * @param [encoding='utf8'] The character encoding of `string`. + * @return Number of bytes written. + */ + write(string: string, encoding?: BufferEncoding): number; + write(string: string, offset: number, encoding?: BufferEncoding): number; + write(string: string, offset: number, length: number, encoding?: BufferEncoding): number; + /** + * Decodes `buf` to a string according to the specified character encoding in`encoding`. `start` and `end` may be passed to decode only a subset of `buf`. + * + * If `encoding` is `'utf8'` and a byte sequence in the input is not valid UTF-8, + * then each invalid byte is replaced with the replacement character `U+FFFD`. 
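+ *
+ * For example (an illustrative sketch, not from the upstream comment), a byte
+ * that can never appear in well-formed UTF-8 decodes to `U+FFFD`:
+ *
+ * ```js
+ * import assert from 'assert';
+ * import { Buffer } from 'buffer';
+ *
+ * // 0xff is not valid anywhere in a UTF-8 byte sequence.
+ * assert.strictEqual(Buffer.from([0xff]).toString('utf8'), '\ufffd');
+ * ```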
+ * + * The maximum length of a string instance (in UTF-16 code units) is available + * as {@link constants.MAX_STRING_LENGTH}. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.allocUnsafe(26); + * + * for (let i = 0; i < 26; i++) { + * // 97 is the decimal ASCII value for 'a'. + * buf1[i] = i + 97; + * } + * + * console.log(buf1.toString('utf8')); + * // Prints: abcdefghijklmnopqrstuvwxyz + * console.log(buf1.toString('utf8', 0, 5)); + * // Prints: abcde + * + * const buf2 = Buffer.from('tést'); + * + * console.log(buf2.toString('hex')); + * // Prints: 74c3a97374 + * console.log(buf2.toString('utf8', 0, 3)); + * // Prints: té + * console.log(buf2.toString(undefined, 0, 3)); + * // Prints: té + * ``` + * @since v0.1.90 + * @param [encoding='utf8'] The character encoding to use. + * @param [start=0] The byte offset to start decoding at. + * @param [end=buf.length] The byte offset to stop decoding at (not inclusive). + */ + toString(encoding?: BufferEncoding, start?: number, end?: number): string; + /** + * Returns a JSON representation of `buf`. [`JSON.stringify()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify) implicitly calls + * this function when stringifying a `Buffer` instance. + * + * `Buffer.from()` accepts objects in the format returned from this method. + * In particular, `Buffer.from(buf.toJSON())` works like `Buffer.from(buf)`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x1, 0x2, 0x3, 0x4, 0x5]); + * const json = JSON.stringify(buf); + * + * console.log(json); + * // Prints: {"type":"Buffer","data":[1,2,3,4,5]} + * + * const copy = JSON.parse(json, (key, value) => { + * return value && value.type === 'Buffer' ? + * Buffer.from(value) : + * value; + * }); + * + * console.log(copy); + * // Prints: + * ``` + * @since v0.9.2 + */ + toJSON(): { + type: 'Buffer'; + data: number[]; + }; + /** + * Returns `true` if both `buf` and `otherBuffer` have exactly the same bytes,`false` otherwise. Equivalent to `buf.compare(otherBuffer) === 0`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from('ABC'); + * const buf2 = Buffer.from('414243', 'hex'); + * const buf3 = Buffer.from('ABCD'); + * + * console.log(buf1.equals(buf2)); + * // Prints: true + * console.log(buf1.equals(buf3)); + * // Prints: false + * ``` + * @since v0.11.13 + * @param otherBuffer A `Buffer` or {@link Uint8Array} with which to compare `buf`. + */ + equals(otherBuffer: Uint8Array): boolean; + /** + * Compares `buf` with `target` and returns a number indicating whether `buf`comes before, after, or is the same as `target` in sort order. + * Comparison is based on the actual sequence of bytes in each `Buffer`. + * + * * `0` is returned if `target` is the same as `buf` + * * `1` is returned if `target` should come _before_`buf` when sorted. + * * `-1` is returned if `target` should come _after_`buf` when sorted. 
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from('ABC'); + * const buf2 = Buffer.from('BCD'); + * const buf3 = Buffer.from('ABCD'); + * + * console.log(buf1.compare(buf1)); + * // Prints: 0 + * console.log(buf1.compare(buf2)); + * // Prints: -1 + * console.log(buf1.compare(buf3)); + * // Prints: -1 + * console.log(buf2.compare(buf1)); + * // Prints: 1 + * console.log(buf2.compare(buf3)); + * // Prints: 1 + * console.log([buf1, buf2, buf3].sort(Buffer.compare)); + * // Prints: [ , , ] + * // (This result is equal to: [buf1, buf3, buf2].) + * ``` + * + * The optional `targetStart`, `targetEnd`, `sourceStart`, and `sourceEnd`arguments can be used to limit the comparison to specific ranges within `target`and `buf` respectively. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from([1, 2, 3, 4, 5, 6, 7, 8, 9]); + * const buf2 = Buffer.from([5, 6, 7, 8, 9, 1, 2, 3, 4]); + * + * console.log(buf1.compare(buf2, 5, 9, 0, 4)); + * // Prints: 0 + * console.log(buf1.compare(buf2, 0, 6, 4)); + * // Prints: -1 + * console.log(buf1.compare(buf2, 5, 6, 5)); + * // Prints: 1 + * ``` + * + * `ERR_OUT_OF_RANGE` is thrown if `targetStart < 0`, `sourceStart < 0`,`targetEnd > target.byteLength`, or `sourceEnd > source.byteLength`. + * @since v0.11.13 + * @param target A `Buffer` or {@link Uint8Array} with which to compare `buf`. + * @param [targetStart=0] The offset within `target` at which to begin comparison. + * @param [targetEnd=target.length] The offset within `target` at which to end comparison (not inclusive). + * @param [sourceStart=0] The offset within `buf` at which to begin comparison. + * @param [sourceEnd=buf.length] The offset within `buf` at which to end comparison (not inclusive). + */ + compare(target: Uint8Array, targetStart?: number, targetEnd?: number, sourceStart?: number, sourceEnd?: number): number; + /** + * Copies data from a region of `buf` to a region in `target`, even if the `target`memory region overlaps with `buf`. + * + * [`TypedArray.prototype.set()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray/set) performs the same operation, and is available + * for all TypedArrays, including Node.js `Buffer`s, although it takes + * different function arguments. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Create two `Buffer` instances. + * const buf1 = Buffer.allocUnsafe(26); + * const buf2 = Buffer.allocUnsafe(26).fill('!'); + * + * for (let i = 0; i < 26; i++) { + * // 97 is the decimal ASCII value for 'a'. + * buf1[i] = i + 97; + * } + * + * // Copy `buf1` bytes 16 through 19 into `buf2` starting at byte 8 of `buf2`. + * buf1.copy(buf2, 8, 16, 20); + * // This is equivalent to: + * // buf2.set(buf1.subarray(16, 20), 8); + * + * console.log(buf2.toString('ascii', 0, 25)); + * // Prints: !!!!!!!!qrst!!!!!!!!!!!!! + * ``` + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Create a `Buffer` and copy data from one region to an overlapping region + * // within the same `Buffer`. + * + * const buf = Buffer.allocUnsafe(26); + * + * for (let i = 0; i < 26; i++) { + * // 97 is the decimal ASCII value for 'a'. + * buf[i] = i + 97; + * } + * + * buf.copy(buf, 0, 4, 10); + * + * console.log(buf.toString()); + * // Prints: efghijghijklmnopqrstuvwxyz + * ``` + * @since v0.1.90 + * @param target A `Buffer` or {@link Uint8Array} to copy into. + * @param [targetStart=0] The offset within `target` at which to begin writing. 
+ * @param [sourceStart=0] The offset within `buf` from which to begin copying. + * @param [sourceEnd=buf.length] The offset within `buf` at which to stop copying (not inclusive). + * @return The number of bytes copied. + */ + copy(target: Uint8Array, targetStart?: number, sourceStart?: number, sourceEnd?: number): number; + /** + * Returns a new `Buffer` that references the same memory as the original, but + * offset and cropped by the `start` and `end` indices. + * + * This is the same behavior as `buf.subarray()`. + * + * This method is not compatible with the `Uint8Array.prototype.slice()`, + * which is a superclass of `Buffer`. To copy the slice, use`Uint8Array.prototype.slice()`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('buffer'); + * + * const copiedBuf = Uint8Array.prototype.slice.call(buf); + * copiedBuf[0]++; + * console.log(copiedBuf.toString()); + * // Prints: cuffer + * + * console.log(buf.toString()); + * // Prints: buffer + * ``` + * @since v0.3.0 + * @param [start=0] Where the new `Buffer` will start. + * @param [end=buf.length] Where the new `Buffer` will end (not inclusive). + */ + slice(start?: number, end?: number): Buffer; + /** + * Returns a new `Buffer` that references the same memory as the original, but + * offset and cropped by the `start` and `end` indices. + * + * Specifying `end` greater than `buf.length` will return the same result as + * that of `end` equal to `buf.length`. + * + * This method is inherited from [`TypedArray.prototype.subarray()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray/subarray). + * + * Modifying the new `Buffer` slice will modify the memory in the original `Buffer`because the allocated memory of the two objects overlap. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Create a `Buffer` with the ASCII alphabet, take a slice, and modify one byte + * // from the original `Buffer`. + * + * const buf1 = Buffer.allocUnsafe(26); + * + * for (let i = 0; i < 26; i++) { + * // 97 is the decimal ASCII value for 'a'. + * buf1[i] = i + 97; + * } + * + * const buf2 = buf1.subarray(0, 3); + * + * console.log(buf2.toString('ascii', 0, buf2.length)); + * // Prints: abc + * + * buf1[0] = 33; + * + * console.log(buf2.toString('ascii', 0, buf2.length)); + * // Prints: !bc + * ``` + * + * Specifying negative indexes causes the slice to be generated relative to the + * end of `buf` rather than the beginning. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('buffer'); + * + * console.log(buf.subarray(-6, -1).toString()); + * // Prints: buffe + * // (Equivalent to buf.subarray(0, 5).) + * + * console.log(buf.subarray(-6, -2).toString()); + * // Prints: buff + * // (Equivalent to buf.subarray(0, 4).) + * + * console.log(buf.subarray(-5, -2).toString()); + * // Prints: uff + * // (Equivalent to buf.subarray(1, 4).) + * ``` + * @since v3.0.0 + * @param [start=0] Where the new `Buffer` will start. + * @param [end=buf.length] Where the new `Buffer` will end (not inclusive). + */ + subarray(start?: number, end?: number): Buffer; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. + * + * `value` is interpreted and written as a two's complement signed integer. 
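A short sketch of the two's-complement note above: a negative `bigint` written with `writeBigInt64BE()` is stored as all `0xff` bytes and reads back unchanged.

```js
import { Buffer } from 'buffer';

const buf = Buffer.alloc(8);

// -1n in 64-bit two's complement is all 0xff bytes.
buf.writeBigInt64BE(-1n, 0);

console.log(buf);                   // <Buffer ff ff ff ff ff ff ff ff>
console.log(buf.readBigInt64BE(0)); // Prints: -1n
```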
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(8); + * + * buf.writeBigInt64BE(0x0102030405060708n, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v12.0.0, v10.20.0 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy: `0 <= offset <= buf.length - 8`. + * @return `offset` plus the number of bytes written. + */ + writeBigInt64BE(value: bigint, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. + * + * `value` is interpreted and written as a two's complement signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(8); + * + * buf.writeBigInt64LE(0x0102030405060708n, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v12.0.0, v10.20.0 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy: `0 <= offset <= buf.length - 8`. + * @return `offset` plus the number of bytes written. + */ + writeBigInt64LE(value: bigint, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. + * + * This function is also available under the `writeBigUint64BE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(8); + * + * buf.writeBigUInt64BE(0xdecafafecacefaden, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v12.0.0, v10.20.0 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy: `0 <= offset <= buf.length - 8`. + * @return `offset` plus the number of bytes written. + */ + writeBigUInt64BE(value: bigint, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(8); + * + * buf.writeBigUInt64LE(0xdecafafecacefaden, 0); + * + * console.log(buf); + * // Prints: + * ``` + * + * This function is also available under the `writeBigUint64LE` alias. + * @since v12.0.0, v10.20.0 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy: `0 <= offset <= buf.length - 8`. + * @return `offset` plus the number of bytes written. + */ + writeBigUInt64LE(value: bigint, offset?: number): number; + /** + * Writes `byteLength` bytes of `value` to `buf` at the specified `offset`as little-endian. Supports up to 48 bits of accuracy. Behavior is undefined + * when `value` is anything other than an unsigned integer. + * + * This function is also available under the `writeUintLE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(6); + * + * buf.writeUIntLE(0x1234567890ab, 0, 6); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param offset Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to write. Must satisfy `0 < byteLength <= 6`. + * @return `offset` plus the number of bytes written. + */ + writeUIntLE(value: number, offset: number, byteLength: number): number; + /** + * Writes `byteLength` bytes of `value` to `buf` at the specified `offset`as big-endian. Supports up to 48 bits of accuracy. 
Behavior is undefined + * when `value` is anything other than an unsigned integer. + * + * This function is also available under the `writeUintBE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(6); + * + * buf.writeUIntBE(0x1234567890ab, 0, 6); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param offset Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to write. Must satisfy `0 < byteLength <= 6`. + * @return `offset` plus the number of bytes written. + */ + writeUIntBE(value: number, offset: number, byteLength: number): number; + /** + * Writes `byteLength` bytes of `value` to `buf` at the specified `offset`as little-endian. Supports up to 48 bits of accuracy. Behavior is undefined + * when `value` is anything other than a signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(6); + * + * buf.writeIntLE(0x1234567890ab, 0, 6); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.11.15 + * @param value Number to be written to `buf`. + * @param offset Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to write. Must satisfy `0 < byteLength <= 6`. + * @return `offset` plus the number of bytes written. + */ + writeIntLE(value: number, offset: number, byteLength: number): number; + /** + * Writes `byteLength` bytes of `value` to `buf` at the specified `offset`as big-endian. Supports up to 48 bits of accuracy. Behavior is undefined when`value` is anything other than a + * signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(6); + * + * buf.writeIntBE(0x1234567890ab, 0, 6); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.11.15 + * @param value Number to be written to `buf`. + * @param offset Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to write. Must satisfy `0 < byteLength <= 6`. + * @return `offset` plus the number of bytes written. + */ + writeIntBE(value: number, offset: number, byteLength: number): number; + /** + * Reads an unsigned, big-endian 64-bit integer from `buf` at the specified`offset`. + * + * This function is also available under the `readBigUint64BE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff]); + * + * console.log(buf.readBigUInt64BE(0)); + * // Prints: 4294967295n + * ``` + * @since v12.0.0, v10.20.0 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy: `0 <= offset <= buf.length - 8`. + */ + readBigUInt64BE(offset?: number): bigint; + /** + * Reads an unsigned, little-endian 64-bit integer from `buf` at the specified`offset`. + * + * This function is also available under the `readBigUint64LE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x00, 0x00, 0x00, 0x00, 0xff, 0xff, 0xff, 0xff]); + * + * console.log(buf.readBigUInt64LE(0)); + * // Prints: 18446744069414584320n + * ``` + * @since v12.0.0, v10.20.0 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy: `0 <= offset <= buf.length - 8`. 
+ */ + readBigUInt64LE(offset?: number): bigint; + /** + * Reads a signed, big-endian 64-bit integer from `buf` at the specified `offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed + * values. + * @since v12.0.0, v10.20.0 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy: `0 <= offset <= buf.length - 8`. + */ + readBigInt64BE(offset?: number): bigint; + /** + * Reads a signed, little-endian 64-bit integer from `buf` at the specified`offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed + * values. + * @since v12.0.0, v10.20.0 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy: `0 <= offset <= buf.length - 8`. + */ + readBigInt64LE(offset?: number): bigint; + /** + * Reads `byteLength` number of bytes from `buf` at the specified `offset`and interprets the result as an unsigned, little-endian integer supporting + * up to 48 bits of accuracy. + * + * This function is also available under the `readUintLE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56, 0x78, 0x90, 0xab]); + * + * console.log(buf.readUIntLE(0, 6).toString(16)); + * // Prints: ab9078563412 + * ``` + * @since v0.11.15 + * @param offset Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to read. Must satisfy `0 < byteLength <= 6`. + */ + readUIntLE(offset: number, byteLength: number): number; + /** + * Reads `byteLength` number of bytes from `buf` at the specified `offset`and interprets the result as an unsigned big-endian integer supporting + * up to 48 bits of accuracy. + * + * This function is also available under the `readUintBE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56, 0x78, 0x90, 0xab]); + * + * console.log(buf.readUIntBE(0, 6).toString(16)); + * // Prints: 1234567890ab + * console.log(buf.readUIntBE(1, 6).toString(16)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.11.15 + * @param offset Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to read. Must satisfy `0 < byteLength <= 6`. + */ + readUIntBE(offset: number, byteLength: number): number; + /** + * Reads `byteLength` number of bytes from `buf` at the specified `offset`and interprets the result as a little-endian, two's complement signed value + * supporting up to 48 bits of accuracy. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56, 0x78, 0x90, 0xab]); + * + * console.log(buf.readIntLE(0, 6).toString(16)); + * // Prints: -546f87a9cbee + * ``` + * @since v0.11.15 + * @param offset Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to read. Must satisfy `0 < byteLength <= 6`. + */ + readIntLE(offset: number, byteLength: number): number; + /** + * Reads `byteLength` number of bytes from `buf` at the specified `offset`and interprets the result as a big-endian, two's complement signed value + * supporting up to 48 bits of accuracy. 
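The signed/unsigned distinction between `readIntBE()` and `readUIntBE()` can be seen by reading the same bytes both ways; a brief example:

```js
import { Buffer } from 'buffer';

const buf = Buffer.from([0xff, 0xfe]);

console.log(buf.readUIntBE(0, 2)); // Prints: 65534 (unsigned interpretation)
console.log(buf.readIntBE(0, 2));  // Prints: -2 (two's complement interpretation)
```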
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56, 0x78, 0x90, 0xab]); + * + * console.log(buf.readIntBE(0, 6).toString(16)); + * // Prints: 1234567890ab + * console.log(buf.readIntBE(1, 6).toString(16)); + * // Throws ERR_OUT_OF_RANGE. + * console.log(buf.readIntBE(1, 0).toString(16)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.11.15 + * @param offset Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - byteLength`. + * @param byteLength Number of bytes to read. Must satisfy `0 < byteLength <= 6`. + */ + readIntBE(offset: number, byteLength: number): number; + /** + * Reads an unsigned 8-bit integer from `buf` at the specified `offset`. + * + * This function is also available under the `readUint8` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([1, -2]); + * + * console.log(buf.readUInt8(0)); + * // Prints: 1 + * console.log(buf.readUInt8(1)); + * // Prints: 254 + * console.log(buf.readUInt8(2)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.5.0 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 1`. + */ + readUInt8(offset?: number): number; + /** + * Reads an unsigned, little-endian 16-bit integer from `buf` at the specified`offset`. + * + * This function is also available under the `readUint16LE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56]); + * + * console.log(buf.readUInt16LE(0).toString(16)); + * // Prints: 3412 + * console.log(buf.readUInt16LE(1).toString(16)); + * // Prints: 5634 + * console.log(buf.readUInt16LE(2).toString(16)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 2`. + */ + readUInt16LE(offset?: number): number; + /** + * Reads an unsigned, big-endian 16-bit integer from `buf` at the specified`offset`. + * + * This function is also available under the `readUint16BE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56]); + * + * console.log(buf.readUInt16BE(0).toString(16)); + * // Prints: 1234 + * console.log(buf.readUInt16BE(1).toString(16)); + * // Prints: 3456 + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 2`. + */ + readUInt16BE(offset?: number): number; + /** + * Reads an unsigned, little-endian 32-bit integer from `buf` at the specified`offset`. + * + * This function is also available under the `readUint32LE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56, 0x78]); + * + * console.log(buf.readUInt32LE(0).toString(16)); + * // Prints: 78563412 + * console.log(buf.readUInt32LE(1).toString(16)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 4`. + */ + readUInt32LE(offset?: number): number; + /** + * Reads an unsigned, big-endian 32-bit integer from `buf` at the specified`offset`. + * + * This function is also available under the `readUint32BE` alias. 
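Reading the same four bytes as big-endian and little-endian shows how the byte order affects the result; a small sketch (the `readUint32BE` spelling is the alias mentioned above):

```js
import { Buffer } from 'buffer';

const buf = Buffer.from([0x12, 0x34, 0x56, 0x78]);

console.log(buf.readUInt32BE(0).toString(16)); // Prints: 12345678
console.log(buf.readUInt32LE(0).toString(16)); // Prints: 78563412
console.log(buf.readUint32BE(0) === buf.readUInt32BE(0)); // Prints: true
```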
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0x12, 0x34, 0x56, 0x78]); + * + * console.log(buf.readUInt32BE(0).toString(16)); + * // Prints: 12345678 + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 4`. + */ + readUInt32BE(offset?: number): number; + /** + * Reads a signed 8-bit integer from `buf` at the specified `offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed values. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([-1, 5]); + * + * console.log(buf.readInt8(0)); + * // Prints: -1 + * console.log(buf.readInt8(1)); + * // Prints: 5 + * console.log(buf.readInt8(2)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.5.0 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 1`. + */ + readInt8(offset?: number): number; + /** + * Reads a signed, little-endian 16-bit integer from `buf` at the specified`offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed values. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0, 5]); + * + * console.log(buf.readInt16LE(0)); + * // Prints: 1280 + * console.log(buf.readInt16LE(1)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 2`. + */ + readInt16LE(offset?: number): number; + /** + * Reads a signed, big-endian 16-bit integer from `buf` at the specified `offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed values. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0, 5]); + * + * console.log(buf.readInt16BE(0)); + * // Prints: 5 + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 2`. + */ + readInt16BE(offset?: number): number; + /** + * Reads a signed, little-endian 32-bit integer from `buf` at the specified`offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed values. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0, 0, 0, 5]); + * + * console.log(buf.readInt32LE(0)); + * // Prints: 83886080 + * console.log(buf.readInt32LE(1)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 4`. + */ + readInt32LE(offset?: number): number; + /** + * Reads a signed, big-endian 32-bit integer from `buf` at the specified `offset`. + * + * Integers read from a `Buffer` are interpreted as two's complement signed values. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([0, 0, 0, 5]); + * + * console.log(buf.readInt32BE(0)); + * // Prints: 5 + * ``` + * @since v0.5.5 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 4`. + */ + readInt32BE(offset?: number): number; + /** + * Reads a 32-bit, little-endian float from `buf` at the specified `offset`. 
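A quick round trip through the 32-bit float accessors; 1.5 is chosen because it is exactly representable in single precision:

```js
import { Buffer } from 'buffer';

const buf = Buffer.alloc(4);

buf.writeFloatBE(1.5, 0);

console.log(buf);                // <Buffer 3f c0 00 00>
console.log(buf.readFloatBE(0)); // Prints: 1.5
```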
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([1, 2, 3, 4]); + * + * console.log(buf.readFloatLE(0)); + * // Prints: 1.539989614439558e-36 + * console.log(buf.readFloatLE(1)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.11.15 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 4`. + */ + readFloatLE(offset?: number): number; + /** + * Reads a 32-bit, big-endian float from `buf` at the specified `offset`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([1, 2, 3, 4]); + * + * console.log(buf.readFloatBE(0)); + * // Prints: 2.387939260590663e-38 + * ``` + * @since v0.11.15 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 4`. + */ + readFloatBE(offset?: number): number; + /** + * Reads a 64-bit, little-endian double from `buf` at the specified `offset`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([1, 2, 3, 4, 5, 6, 7, 8]); + * + * console.log(buf.readDoubleLE(0)); + * // Prints: 5.447603722011605e-270 + * console.log(buf.readDoubleLE(1)); + * // Throws ERR_OUT_OF_RANGE. + * ``` + * @since v0.11.15 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 8`. + */ + readDoubleLE(offset?: number): number; + /** + * Reads a 64-bit, big-endian double from `buf` at the specified `offset`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from([1, 2, 3, 4, 5, 6, 7, 8]); + * + * console.log(buf.readDoubleBE(0)); + * // Prints: 8.20788039913184e-304 + * ``` + * @since v0.11.15 + * @param [offset=0] Number of bytes to skip before starting to read. Must satisfy `0 <= offset <= buf.length - 8`. + */ + readDoubleBE(offset?: number): number; + reverse(): this; + /** + * Interprets `buf` as an array of unsigned 16-bit integers and swaps the + * byte order _in-place_. Throws `ERR_INVALID_BUFFER_SIZE` if `buf.length` is not a multiple of 2. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from([0x1, 0x2, 0x3, 0x4, 0x5, 0x6, 0x7, 0x8]); + * + * console.log(buf1); + * // Prints: + * + * buf1.swap16(); + * + * console.log(buf1); + * // Prints: + * + * const buf2 = Buffer.from([0x1, 0x2, 0x3]); + * + * buf2.swap16(); + * // Throws ERR_INVALID_BUFFER_SIZE. + * ``` + * + * One convenient use of `buf.swap16()` is to perform a fast in-place conversion + * between UTF-16 little-endian and UTF-16 big-endian: + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('This is little-endian UTF-16', 'utf16le'); + * buf.swap16(); // Convert to big-endian UTF-16 text. + * ``` + * @since v5.10.0 + * @return A reference to `buf`. + */ + swap16(): Buffer; + /** + * Interprets `buf` as an array of unsigned 32-bit integers and swaps the + * byte order _in-place_. Throws `ERR_INVALID_BUFFER_SIZE` if `buf.length` is not a multiple of 4. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from([0x1, 0x2, 0x3, 0x4, 0x5, 0x6, 0x7, 0x8]); + * + * console.log(buf1); + * // Prints: + * + * buf1.swap32(); + * + * console.log(buf1); + * // Prints: + * + * const buf2 = Buffer.from([0x1, 0x2, 0x3]); + * + * buf2.swap32(); + * // Throws ERR_INVALID_BUFFER_SIZE. + * ``` + * @since v5.10.0 + * @return A reference to `buf`. + */ + swap32(): Buffer; + /** + * Interprets `buf` as an array of 64-bit numbers and swaps byte order _in-place_. 
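The in-place swaps can be illustrated with the resulting byte order spelled out; a compact sketch:

```js
import { Buffer } from 'buffer';

const buf = Buffer.from([0x01, 0x02, 0x03, 0x04]);

buf.swap16();
console.log(buf); // <Buffer 02 01 04 03>

buf.swap32();
console.log(buf); // <Buffer 03 04 01 02>
```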
+ * Throws `ERR_INVALID_BUFFER_SIZE` if `buf.length` is not a multiple of 8. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from([0x1, 0x2, 0x3, 0x4, 0x5, 0x6, 0x7, 0x8]); + * + * console.log(buf1); + * // Prints: + * + * buf1.swap64(); + * + * console.log(buf1); + * // Prints: + * + * const buf2 = Buffer.from([0x1, 0x2, 0x3]); + * + * buf2.swap64(); + * // Throws ERR_INVALID_BUFFER_SIZE. + * ``` + * @since v6.3.0 + * @return A reference to `buf`. + */ + swap64(): Buffer; + /** + * Writes `value` to `buf` at the specified `offset`. `value` must be a + * valid unsigned 8-bit integer. Behavior is undefined when `value` is anything + * other than an unsigned 8-bit integer. + * + * This function is also available under the `writeUint8` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeUInt8(0x3, 0); + * buf.writeUInt8(0x4, 1); + * buf.writeUInt8(0x23, 2); + * buf.writeUInt8(0x42, 3); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.0 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 1`. + * @return `offset` plus the number of bytes written. + */ + writeUInt8(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. The `value`must be a valid unsigned 16-bit integer. Behavior is undefined when `value` is + * anything other than an unsigned 16-bit integer. + * + * This function is also available under the `writeUint16LE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeUInt16LE(0xdead, 0); + * buf.writeUInt16LE(0xbeef, 2); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 2`. + * @return `offset` plus the number of bytes written. + */ + writeUInt16LE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. The `value`must be a valid unsigned 16-bit integer. Behavior is undefined when `value`is anything other than an + * unsigned 16-bit integer. + * + * This function is also available under the `writeUint16BE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeUInt16BE(0xdead, 0); + * buf.writeUInt16BE(0xbeef, 2); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 2`. + * @return `offset` plus the number of bytes written. + */ + writeUInt16BE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. The `value`must be a valid unsigned 32-bit integer. Behavior is undefined when `value` is + * anything other than an unsigned 32-bit integer. + * + * This function is also available under the `writeUint32LE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeUInt32LE(0xfeedface, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. 
+ * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 4`. + * @return `offset` plus the number of bytes written. + */ + writeUInt32LE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. The `value`must be a valid unsigned 32-bit integer. Behavior is undefined when `value`is anything other than an + * unsigned 32-bit integer. + * + * This function is also available under the `writeUint32BE` alias. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeUInt32BE(0xfeedface, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 4`. + * @return `offset` plus the number of bytes written. + */ + writeUInt32BE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset`. `value` must be a valid + * signed 8-bit integer. Behavior is undefined when `value` is anything other than + * a signed 8-bit integer. + * + * `value` is interpreted and written as a two's complement signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(2); + * + * buf.writeInt8(2, 0); + * buf.writeInt8(-2, 1); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.0 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 1`. + * @return `offset` plus the number of bytes written. + */ + writeInt8(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. The `value`must be a valid signed 16-bit integer. Behavior is undefined when `value` is + * anything other than a signed 16-bit integer. + * + * The `value` is interpreted and written as a two's complement signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(2); + * + * buf.writeInt16LE(0x0304, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 2`. + * @return `offset` plus the number of bytes written. + */ + writeInt16LE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. The `value`must be a valid signed 16-bit integer. Behavior is undefined when `value` is + * anything other than a signed 16-bit integer. + * + * The `value` is interpreted and written as a two's complement signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(2); + * + * buf.writeInt16BE(0x0102, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 2`. + * @return `offset` plus the number of bytes written. + */ + writeInt16BE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. The `value`must be a valid signed 32-bit integer. Behavior is undefined when `value` is + * anything other than a signed 32-bit integer. 
+ * + * The `value` is interpreted and written as a two's complement signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeInt32LE(0x05060708, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 4`. + * @return `offset` plus the number of bytes written. + */ + writeInt32LE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. The `value`must be a valid signed 32-bit integer. Behavior is undefined when `value` is + * anything other than a signed 32-bit integer. + * + * The `value` is interpreted and written as a two's complement signed integer. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeInt32BE(0x01020304, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.5.5 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 4`. + * @return `offset` plus the number of bytes written. + */ + writeInt32BE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. Behavior is + * undefined when `value` is anything other than a JavaScript number. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeFloatLE(0xcafebabe, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.11.15 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 4`. + * @return `offset` plus the number of bytes written. + */ + writeFloatLE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. Behavior is + * undefined when `value` is anything other than a JavaScript number. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(4); + * + * buf.writeFloatBE(0xcafebabe, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.11.15 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 4`. + * @return `offset` plus the number of bytes written. + */ + writeFloatBE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as little-endian. The `value`must be a JavaScript number. Behavior is undefined when `value` is anything + * other than a JavaScript number. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(8); + * + * buf.writeDoubleLE(123.456, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.11.15 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 8`. + * @return `offset` plus the number of bytes written. + */ + writeDoubleLE(value: number, offset?: number): number; + /** + * Writes `value` to `buf` at the specified `offset` as big-endian. The `value`must be a JavaScript number. Behavior is undefined when `value` is anything + * other than a JavaScript number. 
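A brief round trip through the 64-bit double accessors, which store JavaScript numbers without loss:

```js
import { Buffer } from 'buffer';

const buf = Buffer.alloc(8);

buf.writeDoubleBE(123.456, 0);
console.log(buf.readDoubleBE(0)); // Prints: 123.456

buf.writeDoubleLE(123.456, 0);
console.log(buf.readDoubleLE(0)); // Prints: 123.456
```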
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(8); + * + * buf.writeDoubleBE(123.456, 0); + * + * console.log(buf); + * // Prints: + * ``` + * @since v0.11.15 + * @param value Number to be written to `buf`. + * @param [offset=0] Number of bytes to skip before starting to write. Must satisfy `0 <= offset <= buf.length - 8`. + * @return `offset` plus the number of bytes written. + */ + writeDoubleBE(value: number, offset?: number): number; + /** + * Fills `buf` with the specified `value`. If the `offset` and `end` are not given, + * the entire `buf` will be filled: + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Fill a `Buffer` with the ASCII character 'h'. + * + * const b = Buffer.allocUnsafe(50).fill('h'); + * + * console.log(b.toString()); + * // Prints: hhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh + * ``` + * + * `value` is coerced to a `uint32` value if it is not a string, `Buffer`, or + * integer. If the resulting integer is greater than `255` (decimal), `buf` will be + * filled with `value & 255`. + * + * If the final write of a `fill()` operation falls on a multi-byte character, + * then only the bytes of that character that fit into `buf` are written: + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Fill a `Buffer` with character that takes up two bytes in UTF-8. + * + * console.log(Buffer.allocUnsafe(5).fill('\u0222')); + * // Prints: + * ``` + * + * If `value` contains invalid characters, it is truncated; if no valid + * fill data remains, an exception is thrown: + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.allocUnsafe(5); + * + * console.log(buf.fill('a')); + * // Prints: + * console.log(buf.fill('aazz', 'hex')); + * // Prints: + * console.log(buf.fill('zz', 'hex')); + * // Throws an exception. + * ``` + * @since v0.5.0 + * @param value The value with which to fill `buf`. + * @param [offset=0] Number of bytes to skip before starting to fill `buf`. + * @param [end=buf.length] Where to stop filling `buf` (not inclusive). + * @param [encoding='utf8'] The encoding for `value` if `value` is a string. + * @return A reference to `buf`. + */ + fill(value: string | Uint8Array | number, offset?: number, end?: number, encoding?: BufferEncoding): this; + /** + * If `value` is: + * + * * a string, `value` is interpreted according to the character encoding in`encoding`. + * * a `Buffer` or [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array), `value` will be used in its entirety. + * To compare a partial `Buffer`, use `buf.slice()`. + * * a number, `value` will be interpreted as an unsigned 8-bit integer + * value between `0` and `255`. 
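The `offset` and `end` arguments of `fill()` described above are not covered by its examples, so here is a small sketch of a partial fill:

```js
import { Buffer } from 'buffer';

const buf = Buffer.alloc(8, '-');

// Fill only bytes 2 through 5 (`end` is exclusive).
buf.fill('ab', 2, 6);
console.log(buf.toString()); // Prints: --abab--

// Fill the tail with a single byte value (0x2e is '.').
buf.fill(0x2e, 6);
console.log(buf.toString()); // Prints: --abab..
```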
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('this is a buffer'); + * + * console.log(buf.indexOf('this')); + * // Prints: 0 + * console.log(buf.indexOf('is')); + * // Prints: 2 + * console.log(buf.indexOf(Buffer.from('a buffer'))); + * // Prints: 8 + * console.log(buf.indexOf(97)); + * // Prints: 8 (97 is the decimal ASCII value for 'a') + * console.log(buf.indexOf(Buffer.from('a buffer example'))); + * // Prints: -1 + * console.log(buf.indexOf(Buffer.from('a buffer example').slice(0, 8))); + * // Prints: 8 + * + * const utf16Buffer = Buffer.from('\u039a\u0391\u03a3\u03a3\u0395', 'utf16le'); + * + * console.log(utf16Buffer.indexOf('\u03a3', 0, 'utf16le')); + * // Prints: 4 + * console.log(utf16Buffer.indexOf('\u03a3', -4, 'utf16le')); + * // Prints: 6 + * ``` + * + * If `value` is not a string, number, or `Buffer`, this method will throw a`TypeError`. If `value` is a number, it will be coerced to a valid byte value, + * an integer between 0 and 255. + * + * If `byteOffset` is not a number, it will be coerced to a number. If the result + * of coercion is `NaN` or `0`, then the entire buffer will be searched. This + * behavior matches [`String.prototype.indexOf()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/indexOf). + * + * ```js + * import { Buffer } from 'buffer'; + * + * const b = Buffer.from('abcdef'); + * + * // Passing a value that's a number, but not a valid byte. + * // Prints: 2, equivalent to searching for 99 or 'c'. + * console.log(b.indexOf(99.9)); + * console.log(b.indexOf(256 + 99)); + * + * // Passing a byteOffset that coerces to NaN or 0. + * // Prints: 1, searching the whole buffer. + * console.log(b.indexOf('b', undefined)); + * console.log(b.indexOf('b', {})); + * console.log(b.indexOf('b', null)); + * console.log(b.indexOf('b', [])); + * ``` + * + * If `value` is an empty string or empty `Buffer` and `byteOffset` is less + * than `buf.length`, `byteOffset` will be returned. If `value` is empty and`byteOffset` is at least `buf.length`, `buf.length` will be returned. + * @since v1.5.0 + * @param value What to search for. + * @param [byteOffset=0] Where to begin searching in `buf`. If negative, then offset is calculated from the end of `buf`. + * @param [encoding='utf8'] If `value` is a string, this is the encoding used to determine the binary representation of the string that will be searched for in `buf`. + * @return The index of the first occurrence of `value` in `buf`, or `-1` if `buf` does not contain `value`. + */ + indexOf(value: string | number | Uint8Array, byteOffset?: number, encoding?: BufferEncoding): number; + /** + * Identical to `buf.indexOf()`, except the last occurrence of `value` is found + * rather than the first occurrence. 
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('this buffer is a buffer'); + * + * console.log(buf.lastIndexOf('this')); + * // Prints: 0 + * console.log(buf.lastIndexOf('buffer')); + * // Prints: 17 + * console.log(buf.lastIndexOf(Buffer.from('buffer'))); + * // Prints: 17 + * console.log(buf.lastIndexOf(97)); + * // Prints: 15 (97 is the decimal ASCII value for 'a') + * console.log(buf.lastIndexOf(Buffer.from('yolo'))); + * // Prints: -1 + * console.log(buf.lastIndexOf('buffer', 5)); + * // Prints: 5 + * console.log(buf.lastIndexOf('buffer', 4)); + * // Prints: -1 + * + * const utf16Buffer = Buffer.from('\u039a\u0391\u03a3\u03a3\u0395', 'utf16le'); + * + * console.log(utf16Buffer.lastIndexOf('\u03a3', undefined, 'utf16le')); + * // Prints: 6 + * console.log(utf16Buffer.lastIndexOf('\u03a3', -5, 'utf16le')); + * // Prints: 4 + * ``` + * + * If `value` is not a string, number, or `Buffer`, this method will throw a`TypeError`. If `value` is a number, it will be coerced to a valid byte value, + * an integer between 0 and 255. + * + * If `byteOffset` is not a number, it will be coerced to a number. Any arguments + * that coerce to `NaN`, like `{}` or `undefined`, will search the whole buffer. + * This behavior matches [`String.prototype.lastIndexOf()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/String/lastIndexOf). + * + * ```js + * import { Buffer } from 'buffer'; + * + * const b = Buffer.from('abcdef'); + * + * // Passing a value that's a number, but not a valid byte. + * // Prints: 2, equivalent to searching for 99 or 'c'. + * console.log(b.lastIndexOf(99.9)); + * console.log(b.lastIndexOf(256 + 99)); + * + * // Passing a byteOffset that coerces to NaN. + * // Prints: 1, searching the whole buffer. + * console.log(b.lastIndexOf('b', undefined)); + * console.log(b.lastIndexOf('b', {})); + * + * // Passing a byteOffset that coerces to 0. + * // Prints: -1, equivalent to passing 0. + * console.log(b.lastIndexOf('b', null)); + * console.log(b.lastIndexOf('b', [])); + * ``` + * + * If `value` is an empty string or empty `Buffer`, `byteOffset` will be returned. + * @since v6.0.0 + * @param value What to search for. + * @param [byteOffset=buf.length - 1] Where to begin searching in `buf`. If negative, then offset is calculated from the end of `buf`. + * @param [encoding='utf8'] If `value` is a string, this is the encoding used to determine the binary representation of the string that will be searched for in `buf`. + * @return The index of the last occurrence of `value` in `buf`, or `-1` if `buf` does not contain `value`. + */ + lastIndexOf(value: string | number | Uint8Array, byteOffset?: number, encoding?: BufferEncoding): number; + /** + * Creates and returns an [iterator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols) of `[index, byte]` pairs from the contents + * of `buf`. + * + * ```js + * import { Buffer } from 'buffer'; + * + * // Log the entire contents of a `Buffer`. + * + * const buf = Buffer.from('buffer'); + * + * for (const pair of buf.entries()) { + * console.log(pair); + * } + * // Prints: + * // [0, 98] + * // [1, 117] + * // [2, 102] + * // [3, 102] + * // [4, 101] + * // [5, 114] + * ``` + * @since v1.1.0 + */ + entries(): IterableIterator<[number, number]>; + /** + * Equivalent to `buf.indexOf() !== -1`. 
+ * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('this is a buffer'); + * + * console.log(buf.includes('this')); + * // Prints: true + * console.log(buf.includes('is')); + * // Prints: true + * console.log(buf.includes(Buffer.from('a buffer'))); + * // Prints: true + * console.log(buf.includes(97)); + * // Prints: true (97 is the decimal ASCII value for 'a') + * console.log(buf.includes(Buffer.from('a buffer example'))); + * // Prints: false + * console.log(buf.includes(Buffer.from('a buffer example').slice(0, 8))); + * // Prints: true + * console.log(buf.includes('this', 4)); + * // Prints: false + * ``` + * @since v5.3.0 + * @param value What to search for. + * @param [byteOffset=0] Where to begin searching in `buf`. If negative, then offset is calculated from the end of `buf`. + * @param [encoding='utf8'] If `value` is a string, this is its encoding. + * @return `true` if `value` was found in `buf`, `false` otherwise. + */ + includes(value: string | number | Buffer, byteOffset?: number, encoding?: BufferEncoding): boolean; + /** + * Creates and returns an [iterator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols) of `buf` keys (indices). + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('buffer'); + * + * for (const key of buf.keys()) { + * console.log(key); + * } + * // Prints: + * // 0 + * // 1 + * // 2 + * // 3 + * // 4 + * // 5 + * ``` + * @since v1.1.0 + */ + keys(): IterableIterator; + /** + * Creates and returns an [iterator](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols) for `buf` values (bytes). This function is + * called automatically when a `Buffer` is used in a `for..of` statement. + * + * ```js + * import { Buffer } from 'buffer'; + * + * const buf = Buffer.from('buffer'); + * + * for (const value of buf.values()) { + * console.log(value); + * } + * // Prints: + * // 98 + * // 117 + * // 102 + * // 102 + * // 101 + * // 114 + * + * for (const value of buf) { + * console.log(value); + * } + * // Prints: + * // 98 + * // 117 + * // 102 + * // 102 + * // 101 + * // 114 + * ``` + * @since v1.1.0 + */ + values(): IterableIterator; + } + var Buffer: BufferConstructor; + /** + * Decodes a string of Base64-encoded data into bytes, and encodes those bytes + * into a string using Latin-1 (ISO-8859-1). + * + * The `data` may be any JavaScript-value that can be coerced into a string. + * + * **This function is only provided for compatibility with legacy web platform APIs** + * **and should never be used in new code, because they use strings to represent** + * **binary data and predate the introduction of typed arrays in JavaScript.** + * **For code running using Node.js APIs, converting between base64-encoded strings** + * **and binary data should be performed using `Buffer.from(str, 'base64')` and`buf.toString('base64')`.** + * @since v15.13.0 + * @deprecated Use `Buffer.from(data, 'base64')` instead. + * @param data The Base64-encoded input string. + */ + function atob(data: string): string; + /** + * Decodes a string into bytes using Latin-1 (ISO-8859), and encodes those bytes + * into a string using Base64. + * + * The `data` may be any JavaScript-value that can be coerced into a string. 
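The `Buffer`-based alternative recommended above for `atob()`/`btoa()` looks like this; a minimal sketch:

```js
import { Buffer } from 'buffer';

const encoded = Buffer.from('hello world', 'utf8').toString('base64');
console.log(encoded); // Prints: aGVsbG8gd29ybGQ=

const decoded = Buffer.from(encoded, 'base64').toString('utf8');
console.log(decoded); // Prints: hello world
```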
+ * + * **This function is only provided for compatibility with legacy web platform APIs** + * **and should never be used in new code, because they use strings to represent** + * **binary data and predate the introduction of typed arrays in JavaScript.** + * **For code running using Node.js APIs, converting between base64-encoded strings** + * **and binary data should be performed using `Buffer.from(str, 'base64')` and`buf.toString('base64')`.** + * @since v15.13.0 + * @deprecated Use `buf.toString('base64')` instead. + * @param data An ASCII (Latin1) string. + */ + function btoa(data: string): string; + } +} +declare module 'node:buffer' { + export * from 'buffer'; +} diff --git a/node_modules/@types/node/child_process.d.ts b/node_modules/@types/node/child_process.d.ts new file mode 100644 index 000000000..0851f51c5 --- /dev/null +++ b/node_modules/@types/node/child_process.d.ts @@ -0,0 +1,1365 @@ +/** + * The `child_process` module provides the ability to spawn subprocesses in + * a manner that is similar, but not identical, to [`popen(3)`](http://man7.org/linux/man-pages/man3/popen.3.html). This capability + * is primarily provided by the {@link spawn} function: + * + * ```js + * const { spawn } = require('child_process'); + * const ls = spawn('ls', ['-lh', '/usr']); + * + * ls.stdout.on('data', (data) => { + * console.log(`stdout: ${data}`); + * }); + * + * ls.stderr.on('data', (data) => { + * console.error(`stderr: ${data}`); + * }); + * + * ls.on('close', (code) => { + * console.log(`child process exited with code ${code}`); + * }); + * ``` + * + * By default, pipes for `stdin`, `stdout`, and `stderr` are established between + * the parent Node.js process and the spawned subprocess. These pipes have + * limited (and platform-specific) capacity. If the subprocess writes to + * stdout in excess of that limit without the output being captured, the + * subprocess blocks waiting for the pipe buffer to accept more data. This is + * identical to the behavior of pipes in the shell. Use the `{ stdio: 'ignore' }`option if the output will not be consumed. + * + * The command lookup is performed using the `options.env.PATH` environment + * variable if it is in the `options` object. Otherwise, `process.env.PATH` is + * used. + * + * On Windows, environment variables are case-insensitive. Node.js + * lexicographically sorts the `env` keys and uses the first one that + * case-insensitively matches. Only first (in lexicographic order) entry will be + * passed to the subprocess. This might lead to issues on Windows when passing + * objects to the `env` option that have multiple variants of the same key, such as`PATH` and `Path`. + * + * The {@link spawn} method spawns the child process asynchronously, + * without blocking the Node.js event loop. The {@link spawnSync} function provides equivalent functionality in a synchronous manner that blocks + * the event loop until the spawned process either exits or is terminated. + * + * For convenience, the `child_process` module provides a handful of synchronous + * and asynchronous alternatives to {@link spawn} and {@link spawnSync}. Each of these alternatives are implemented on + * top of {@link spawn} or {@link spawnSync}. + * + * * {@link exec}: spawns a shell and runs a command within that + * shell, passing the `stdout` and `stderr` to a callback function when + * complete. + * * {@link execFile}: similar to {@link exec} except + * that it spawns the command directly without first spawning a shell by + * default. 
+ * * {@link fork}: spawns a new Node.js process and invokes a + * specified module with an IPC communication channel established that allows + * sending messages between parent and child. + * * {@link execSync}: a synchronous version of {@link exec} that will block the Node.js event loop. + * * {@link execFileSync}: a synchronous version of {@link execFile} that will block the Node.js event loop. + * + * For certain use cases, such as automating shell scripts, the `synchronous counterparts` may be more convenient. In many cases, however, + * the synchronous methods can have significant impact on performance due to + * stalling the event loop while spawned processes complete. + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/child_process.js) + */ +declare module 'child_process' { + import { ObjectEncodingOptions } from 'node:fs'; + import { EventEmitter, Abortable } from 'node:events'; + import * as net from 'node:net'; + import { Writable, Readable, Stream, Pipe } from 'node:stream'; + import { URL } from 'node:url'; + type Serializable = string | object | number | boolean | bigint; + type SendHandle = net.Socket | net.Server; + /** + * Instances of the `ChildProcess` represent spawned child processes. + * + * Instances of `ChildProcess` are not intended to be created directly. Rather, + * use the {@link spawn}, {@link exec},{@link execFile}, or {@link fork} methods to create + * instances of `ChildProcess`. + * @since v2.2.0 + */ + class ChildProcess extends EventEmitter { + /** + * A `Writable Stream` that represents the child process's `stdin`. + * + * If a child process waits to read all of its input, the child will not continue + * until this stream has been closed via `end()`. + * + * If the child was spawned with `stdio[0]` set to anything other than `'pipe'`, + * then this will be `null`. + * + * `subprocess.stdin` is an alias for `subprocess.stdio[0]`. Both properties will + * refer to the same value. + * + * The `subprocess.stdin` property can be `undefined` if the child process could + * not be successfully spawned. + * @since v0.1.90 + */ + stdin: Writable | null; + /** + * A `Readable Stream` that represents the child process's `stdout`. + * + * If the child was spawned with `stdio[1]` set to anything other than `'pipe'`, + * then this will be `null`. + * + * `subprocess.stdout` is an alias for `subprocess.stdio[1]`. Both properties will + * refer to the same value. + * + * ```js + * const { spawn } = require('child_process'); + * + * const subprocess = spawn('ls'); + * + * subprocess.stdout.on('data', (data) => { + * console.log(`Received chunk ${data}`); + * }); + * ``` + * + * The `subprocess.stdout` property can be `null` if the child process could + * not be successfully spawned. + * @since v0.1.90 + */ + stdout: Readable | null; + /** + * A `Readable Stream` that represents the child process's `stderr`. + * + * If the child was spawned with `stdio[2]` set to anything other than `'pipe'`, + * then this will be `null`. + * + * `subprocess.stderr` is an alias for `subprocess.stdio[2]`. Both properties will + * refer to the same value. + * + * The `subprocess.stderr` property can be `null` if the child process could + * not be successfully spawned. + * @since v0.1.90 + */ + stderr: Readable | null; + /** + * The `subprocess.channel` property is a reference to the child's IPC channel. If + * no IPC channel currently exists, this property is `undefined`. 
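The note above that a child reading all of its input will not continue until `subprocess.stdin` is closed can be sketched as follows; this assumes a POSIX `wc` binary is available on the system:

```js
const { spawn } = require('child_process');

// 'wc -c' reads stdin until EOF, so stdin must be ended for the child to exit.
const wc = spawn('wc', ['-c']);

wc.stdout.on('data', (data) => {
  console.log(`bytes counted: ${data.toString().trim()}`);
});

wc.stdin.write('hello');
wc.stdin.end(); // Without this, the child would keep waiting for input.
```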
+ * @since v7.1.0 + */ + readonly channel?: Pipe | null | undefined; + /** + * A sparse array of pipes to the child process, corresponding with positions in + * the `stdio` option passed to {@link spawn} that have been set + * to the value `'pipe'`. `subprocess.stdio[0]`, `subprocess.stdio[1]`, and`subprocess.stdio[2]` are also available as `subprocess.stdin`,`subprocess.stdout`, and `subprocess.stderr`, + * respectively. + * + * In the following example, only the child's fd `1` (stdout) is configured as a + * pipe, so only the parent's `subprocess.stdio[1]` is a stream, all other values + * in the array are `null`. + * + * ```js + * const assert = require('assert'); + * const fs = require('fs'); + * const child_process = require('child_process'); + * + * const subprocess = child_process.spawn('ls', { + * stdio: [ + * 0, // Use parent's stdin for child. + * 'pipe', // Pipe child's stdout to parent. + * fs.openSync('err.out', 'w'), // Direct child's stderr to a file. + * ] + * }); + * + * assert.strictEqual(subprocess.stdio[0], null); + * assert.strictEqual(subprocess.stdio[0], subprocess.stdin); + * + * assert(subprocess.stdout); + * assert.strictEqual(subprocess.stdio[1], subprocess.stdout); + * + * assert.strictEqual(subprocess.stdio[2], null); + * assert.strictEqual(subprocess.stdio[2], subprocess.stderr); + * ``` + * + * The `subprocess.stdio` property can be `undefined` if the child process could + * not be successfully spawned. + * @since v0.7.10 + */ + readonly stdio: [ + Writable | null, + // stdin + Readable | null, + // stdout + Readable | null, + // stderr + Readable | Writable | null | undefined, + // extra + Readable | Writable | null | undefined // extra + ]; + /** + * The `subprocess.killed` property indicates whether the child process + * successfully received a signal from `subprocess.kill()`. The `killed` property + * does not indicate that the child process has been terminated. + * @since v0.5.10 + */ + readonly killed: boolean; + /** + * Returns the process identifier (PID) of the child process. If the child process + * fails to spawn due to errors, then the value is `undefined` and `error` is + * emitted. + * + * ```js + * const { spawn } = require('child_process'); + * const grep = spawn('grep', ['ssh']); + * + * console.log(`Spawned child pid: ${grep.pid}`); + * grep.stdin.end(); + * ``` + * @since v0.1.90 + */ + readonly pid?: number | undefined; + /** + * The `subprocess.connected` property indicates whether it is still possible to + * send and receive messages from a child process. When `subprocess.connected` is`false`, it is no longer possible to send or receive messages. + * @since v0.7.2 + */ + readonly connected: boolean; + /** + * The `subprocess.exitCode` property indicates the exit code of the child process. + * If the child process is still running, the field will be `null`. + */ + readonly exitCode: number | null; + /** + * The `subprocess.signalCode` property indicates the signal received by + * the child process if any, else `null`. + */ + readonly signalCode: NodeJS.Signals | null; + /** + * The `subprocess.spawnargs` property represents the full list of command-line + * arguments the child process was launched with. + */ + readonly spawnargs: string[]; + /** + * The `subprocess.spawnfile` property indicates the executable file name of + * the child process that is launched. + * + * For {@link fork}, its value will be equal to `process.execPath`. + * For {@link spawn}, its value will be the name of + * the executable file. 
+ * For {@link exec}, its value will be the name of the shell + * in which the child process is launched. + */ + readonly spawnfile: string; + /** + * The `subprocess.kill()` method sends a signal to the child process. If no + * argument is given, the process will be sent the `'SIGTERM'` signal. See [`signal(7)`](http://man7.org/linux/man-pages/man7/signal.7.html) for a list of available signals. This function + * returns `true` if [`kill(2)`](http://man7.org/linux/man-pages/man2/kill.2.html) succeeds, and `false` otherwise. + * + * ```js + * const { spawn } = require('child_process'); + * const grep = spawn('grep', ['ssh']); + * + * grep.on('close', (code, signal) => { + * console.log( + * `child process terminated due to receipt of signal ${signal}`); + * }); + * + * // Send SIGHUP to process. + * grep.kill('SIGHUP'); + * ``` + * + * The `ChildProcess` object may emit an `'error'` event if the signal + * cannot be delivered. Sending a signal to a child process that has already exited + * is not an error but may have unforeseen consequences. Specifically, if the + * process identifier (PID) has been reassigned to another process, the signal will + * be delivered to that process instead which can have unexpected results. + * + * While the function is called `kill`, the signal delivered to the child process + * may not actually terminate the process. + * + * See [`kill(2)`](http://man7.org/linux/man-pages/man2/kill.2.html) for reference. + * + * On Windows, where POSIX signals do not exist, the `signal` argument will be + * ignored, and the process will be killed forcefully and abruptly (similar to`'SIGKILL'`). + * See `Signal Events` for more details. + * + * On Linux, child processes of child processes will not be terminated + * when attempting to kill their parent. This is likely to happen when running a + * new process in a shell or with the use of the `shell` option of `ChildProcess`: + * + * ```js + * 'use strict'; + * const { spawn } = require('child_process'); + * + * const subprocess = spawn( + * 'sh', + * [ + * '-c', + * `node -e "setInterval(() => { + * console.log(process.pid, 'is alive') + * }, 500);"`, + * ], { + * stdio: ['inherit', 'inherit', 'inherit'] + * } + * ); + * + * setTimeout(() => { + * subprocess.kill(); // Does not terminate the Node.js process in the shell. + * }, 2000); + * ``` + * @since v0.1.90 + */ + kill(signal?: NodeJS.Signals | number): boolean; + /** + * When an IPC channel has been established between the parent and child ( + * i.e. when using {@link fork}), the `subprocess.send()` method can + * be used to send messages to the child process. When the child process is a + * Node.js instance, these messages can be received via the `'message'` event. + * + * The message goes through serialization and parsing. The resulting + * message might not be the same as what is originally sent. 
+ * + * For example, in the parent script: + * + * ```js + * const cp = require('child_process'); + * const n = cp.fork(`${__dirname}/sub.js`); + * + * n.on('message', (m) => { + * console.log('PARENT got message:', m); + * }); + * + * // Causes the child to print: CHILD got message: { hello: 'world' } + * n.send({ hello: 'world' }); + * ``` + * + * And then the child script, `'sub.js'` might look like this: + * + * ```js + * process.on('message', (m) => { + * console.log('CHILD got message:', m); + * }); + * + * // Causes the parent to print: PARENT got message: { foo: 'bar', baz: null } + * process.send({ foo: 'bar', baz: NaN }); + * ``` + * + * Child Node.js processes will have a `process.send()` method of their own + * that allows the child to send messages back to the parent. + * + * There is a special case when sending a `{cmd: 'NODE_foo'}` message. Messages + * containing a `NODE_` prefix in the `cmd` property are reserved for use within + * Node.js core and will not be emitted in the child's `'message'` event. Rather, such messages are emitted using the`'internalMessage'` event and are consumed internally by Node.js. + * Applications should avoid using such messages or listening for`'internalMessage'` events as it is subject to change without notice. + * + * The optional `sendHandle` argument that may be passed to `subprocess.send()` is + * for passing a TCP server or socket object to the child process. The child will + * receive the object as the second argument passed to the callback function + * registered on the `'message'` event. Any data that is received + * and buffered in the socket will not be sent to the child. + * + * The optional `callback` is a function that is invoked after the message is + * sent but before the child may have received it. The function is called with a + * single argument: `null` on success, or an `Error` object on failure. + * + * If no `callback` function is provided and the message cannot be sent, an`'error'` event will be emitted by the `ChildProcess` object. This can + * happen, for instance, when the child process has already exited. + * + * `subprocess.send()` will return `false` if the channel has closed or when the + * backlog of unsent messages exceeds a threshold that makes it unwise to send + * more. Otherwise, the method returns `true`. The `callback` function can be + * used to implement flow control. + * + * #### Example: sending a server object + * + * The `sendHandle` argument can be used, for instance, to pass the handle of + * a TCP server object to the child process as illustrated in the example below: + * + * ```js + * const subprocess = require('child_process').fork('subprocess.js'); + * + * // Open up the server object and send the handle. + * const server = require('net').createServer(); + * server.on('connection', (socket) => { + * socket.end('handled by parent'); + * }); + * server.listen(1337, () => { + * subprocess.send('server', server); + * }); + * ``` + * + * The child would then receive the server object as: + * + * ```js + * process.on('message', (m, server) => { + * if (m === 'server') { + * server.on('connection', (socket) => { + * socket.end('handled by child'); + * }); + * } + * }); + * ``` + * + * Once the server is now shared between the parent and child, some connections + * can be handled by the parent and some by the child. 
+ * + * While the example above uses a server created using the `net` module, `dgram`module servers use exactly the same workflow with the exceptions of listening on + * a `'message'` event instead of `'connection'` and using `server.bind()` instead + * of `server.listen()`. This is, however, currently only supported on Unix + * platforms. + * + * #### Example: sending a socket object + * + * Similarly, the `sendHandler` argument can be used to pass the handle of a + * socket to the child process. The example below spawns two children that each + * handle connections with "normal" or "special" priority: + * + * ```js + * const { fork } = require('child_process'); + * const normal = fork('subprocess.js', ['normal']); + * const special = fork('subprocess.js', ['special']); + * + * // Open up the server and send sockets to child. Use pauseOnConnect to prevent + * // the sockets from being read before they are sent to the child process. + * const server = require('net').createServer({ pauseOnConnect: true }); + * server.on('connection', (socket) => { + * + * // If this is special priority... + * if (socket.remoteAddress === '74.125.127.100') { + * special.send('socket', socket); + * return; + * } + * // This is normal priority. + * normal.send('socket', socket); + * }); + * server.listen(1337); + * ``` + * + * The `subprocess.js` would receive the socket handle as the second argument + * passed to the event callback function: + * + * ```js + * process.on('message', (m, socket) => { + * if (m === 'socket') { + * if (socket) { + * // Check that the client socket exists. + * // It is possible for the socket to be closed between the time it is + * // sent and the time it is received in the child process. + * socket.end(`Request handled with ${process.argv[2]} priority`); + * } + * } + * }); + * ``` + * + * Do not use `.maxConnections` on a socket that has been passed to a subprocess. + * The parent cannot track when the socket is destroyed. + * + * Any `'message'` handlers in the subprocess should verify that `socket` exists, + * as the connection may have been closed during the time it takes to send the + * connection to the child. + * @since v0.5.9 + * @param options The `options` argument, if present, is an object used to parameterize the sending of certain types of handles. `options` supports the following properties: + */ + send(message: Serializable, callback?: (error: Error | null) => void): boolean; + send(message: Serializable, sendHandle?: SendHandle, callback?: (error: Error | null) => void): boolean; + send(message: Serializable, sendHandle?: SendHandle, options?: MessageOptions, callback?: (error: Error | null) => void): boolean; + /** + * Closes the IPC channel between parent and child, allowing the child to exit + * gracefully once there are no other connections keeping it alive. After calling + * this method the `subprocess.connected` and `process.connected` properties in + * both the parent and child (respectively) will be set to `false`, and it will be + * no longer possible to pass messages between the processes. + * + * The `'disconnect'` event will be emitted when there are no messages in the + * process of being received. This will most often be triggered immediately after + * calling `subprocess.disconnect()`. + * + * When the child process is a Node.js instance (e.g. spawned using {@link fork}), the `process.disconnect()` method can be invoked + * within the child process to close the IPC channel as well. 
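+ *
+ * A minimal sketch of the call sequence (the module name 'worker.js' and the
+ * 'shutdown' message are illustrative assumptions, not part of the API):
+ *
+ * ```js
+ * const { fork } = require('child_process');
+ * const child = fork('worker.js');
+ *
+ * child.on('disconnect', () => {
+ *   console.log('IPC channel closed; no further messages can be exchanged');
+ * });
+ *
+ * child.send('shutdown');   // last application-level message
+ * child.disconnect();       // channel closes once pending messages are delivered
+ * ```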
+ * @since v0.7.2 + */ + disconnect(): void; + /** + * By default, the parent will wait for the detached child to exit. To prevent the + * parent from waiting for a given `subprocess` to exit, use the`subprocess.unref()` method. Doing so will cause the parent's event loop to not + * include the child in its reference count, allowing the parent to exit + * independently of the child, unless there is an established IPC channel between + * the child and the parent. + * + * ```js + * const { spawn } = require('child_process'); + * + * const subprocess = spawn(process.argv[0], ['child_program.js'], { + * detached: true, + * stdio: 'ignore' + * }); + * + * subprocess.unref(); + * ``` + * @since v0.7.10 + */ + unref(): void; + /** + * Calling `subprocess.ref()` after making a call to `subprocess.unref()` will + * restore the removed reference count for the child process, forcing the parent + * to wait for the child to exit before exiting itself. + * + * ```js + * const { spawn } = require('child_process'); + * + * const subprocess = spawn(process.argv[0], ['child_program.js'], { + * detached: true, + * stdio: 'ignore' + * }); + * + * subprocess.unref(); + * subprocess.ref(); + * ``` + * @since v0.7.10 + */ + ref(): void; + /** + * events.EventEmitter + * 1. close + * 2. disconnect + * 3. error + * 4. exit + * 5. message + * 6. spawn + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this; + addListener(event: 'disconnect', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'exit', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this; + addListener(event: 'message', listener: (message: Serializable, sendHandle: SendHandle) => void): this; + addListener(event: 'spawn', listener: () => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'close', code: number | null, signal: NodeJS.Signals | null): boolean; + emit(event: 'disconnect'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'exit', code: number | null, signal: NodeJS.Signals | null): boolean; + emit(event: 'message', message: Serializable, sendHandle: SendHandle): boolean; + emit(event: 'spawn', listener: () => void): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this; + on(event: 'disconnect', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'exit', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this; + on(event: 'message', listener: (message: Serializable, sendHandle: SendHandle) => void): this; + on(event: 'spawn', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this; + once(event: 'disconnect', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'exit', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this; + once(event: 'message', listener: (message: Serializable, sendHandle: SendHandle) => void): this; + once(event: 'spawn', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 
'close', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this;
+ prependListener(event: 'disconnect', listener: () => void): this;
+ prependListener(event: 'error', listener: (err: Error) => void): this;
+ prependListener(event: 'exit', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this;
+ prependListener(event: 'message', listener: (message: Serializable, sendHandle: SendHandle) => void): this;
+ prependListener(event: 'spawn', listener: () => void): this;
+ prependOnceListener(event: string, listener: (...args: any[]) => void): this;
+ prependOnceListener(event: 'close', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this;
+ prependOnceListener(event: 'disconnect', listener: () => void): this;
+ prependOnceListener(event: 'error', listener: (err: Error) => void): this;
+ prependOnceListener(event: 'exit', listener: (code: number | null, signal: NodeJS.Signals | null) => void): this;
+ prependOnceListener(event: 'message', listener: (message: Serializable, sendHandle: SendHandle) => void): this;
+ prependOnceListener(event: 'spawn', listener: () => void): this;
+ }
+ // return this object when stdio option is undefined or not specified
+ interface ChildProcessWithoutNullStreams extends ChildProcess {
+ stdin: Writable;
+ stdout: Readable;
+ stderr: Readable;
+ readonly stdio: [
+ Writable,
+ Readable,
+ Readable,
+ // stderr
+ Readable | Writable | null | undefined,
+ // extra, no modification
+ Readable | Writable | null | undefined // extra, no modification
+ ];
+ }
+ // return this object when stdio option is a tuple of 3
+ interface ChildProcessByStdio<I extends Writable, O extends Readable, E extends Readable> extends ChildProcess {
+ stdin: I;
+ stdout: O;
+ stderr: E;
+ readonly stdio: [
+ I,
+ O,
+ E,
+ Readable | Writable | null | undefined,
+ // extra, no modification
+ Readable | Writable | null | undefined // extra, no modification
+ ];
+ }
+ interface MessageOptions {
+ keepOpen?: boolean | undefined;
+ }
+ type IOType = 'overlapped' | 'pipe' | 'ignore' | 'inherit';
+ type StdioOptions = IOType | Array<IOType | 'ipc' | Stream | number | null | undefined>;
+ type SerializationType = 'json' | 'advanced';
+ interface MessagingOptions extends Abortable {
+ /**
+ * Specify the kind of serialization used for sending messages between processes.
+ * @default 'json'
+ */
+ serialization?: SerializationType | undefined;
+ /**
+ * The signal value to be used when the spawned process will be killed by the abort signal.
+ * @default 'SIGTERM'
+ */
+ killSignal?: NodeJS.Signals | number | undefined;
+ /**
+ * In milliseconds the maximum amount of time the process is allowed to run.
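+ *
+ * A hedged sketch of how this is typically passed (the module name and duration
+ * are arbitrary): the child is sent `killSignal` (default `'SIGTERM'`) if it is
+ * still running after the timeout elapses.
+ *
+ * ```js
+ * const { fork } = require('child_process');
+ * const child = fork('long-task.js', [], { timeout: 5000 });
+ * child.on('exit', (code, signal) => {
+ *   console.log(`exited with code ${code}, signal ${signal}`);
+ * });
+ * ```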
+ */ + timeout?: number | undefined; + } + interface ProcessEnvOptions { + uid?: number | undefined; + gid?: number | undefined; + cwd?: string | URL | undefined; + env?: NodeJS.ProcessEnv | undefined; + } + interface CommonOptions extends ProcessEnvOptions { + /** + * @default true + */ + windowsHide?: boolean | undefined; + /** + * @default 0 + */ + timeout?: number | undefined; + } + interface CommonSpawnOptions extends CommonOptions, MessagingOptions, Abortable { + argv0?: string | undefined; + stdio?: StdioOptions | undefined; + shell?: boolean | string | undefined; + windowsVerbatimArguments?: boolean | undefined; + } + interface SpawnOptions extends CommonSpawnOptions { + detached?: boolean | undefined; + } + interface SpawnOptionsWithoutStdio extends SpawnOptions { + stdio?: StdioPipeNamed | StdioPipe[] | undefined; + } + type StdioNull = 'inherit' | 'ignore' | Stream; + type StdioPipeNamed = 'pipe' | 'overlapped'; + type StdioPipe = undefined | null | StdioPipeNamed; + interface SpawnOptionsWithStdioTuple extends SpawnOptions { + stdio: [Stdin, Stdout, Stderr]; + } + /** + * The `child_process.spawn()` method spawns a new process using the given`command`, with command-line arguments in `args`. If omitted, `args` defaults + * to an empty array. + * + * **If the `shell` option is enabled, do not pass unsanitized user input to this** + * **function. Any input containing shell metacharacters may be used to trigger** + * **arbitrary command execution.** + * + * A third argument may be used to specify additional options, with these defaults: + * + * ```js + * const defaults = { + * cwd: undefined, + * env: process.env + * }; + * ``` + * + * Use `cwd` to specify the working directory from which the process is spawned. + * If not given, the default is to inherit the current working directory. If given, + * but the path does not exist, the child process emits an `ENOENT` error + * and exits immediately. `ENOENT` is also emitted when the command + * does not exist. + * + * Use `env` to specify environment variables that will be visible to the new + * process, the default is `process.env`. + * + * `undefined` values in `env` will be ignored. 
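+ *
+ * A short illustrative sketch (the directory and variable are made up) combining
+ * `cwd` with an `env` that extends, rather than replaces, the parent environment:
+ *
+ * ```js
+ * const { spawn } = require('child_process');
+ * const child = spawn('printenv', ['GREETING'], {
+ *   cwd: '/tmp',
+ *   // Spread process.env so PATH and friends are still available to the child.
+ *   env: { ...process.env, GREETING: 'hello' },
+ * });
+ * child.stdout.on('data', (data) => console.log(`stdout: ${data}`));
+ * ```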
+ * + * Example of running `ls -lh /usr`, capturing `stdout`, `stderr`, and the + * exit code: + * + * ```js + * const { spawn } = require('child_process'); + * const ls = spawn('ls', ['-lh', '/usr']); + * + * ls.stdout.on('data', (data) => { + * console.log(`stdout: ${data}`); + * }); + * + * ls.stderr.on('data', (data) => { + * console.error(`stderr: ${data}`); + * }); + * + * ls.on('close', (code) => { + * console.log(`child process exited with code ${code}`); + * }); + * ``` + * + * Example: A very elaborate way to run `ps ax | grep ssh` + * + * ```js + * const { spawn } = require('child_process'); + * const ps = spawn('ps', ['ax']); + * const grep = spawn('grep', ['ssh']); + * + * ps.stdout.on('data', (data) => { + * grep.stdin.write(data); + * }); + * + * ps.stderr.on('data', (data) => { + * console.error(`ps stderr: ${data}`); + * }); + * + * ps.on('close', (code) => { + * if (code !== 0) { + * console.log(`ps process exited with code ${code}`); + * } + * grep.stdin.end(); + * }); + * + * grep.stdout.on('data', (data) => { + * console.log(data.toString()); + * }); + * + * grep.stderr.on('data', (data) => { + * console.error(`grep stderr: ${data}`); + * }); + * + * grep.on('close', (code) => { + * if (code !== 0) { + * console.log(`grep process exited with code ${code}`); + * } + * }); + * ``` + * + * Example of checking for failed `spawn`: + * + * ```js + * const { spawn } = require('child_process'); + * const subprocess = spawn('bad_command'); + * + * subprocess.on('error', (err) => { + * console.error('Failed to start subprocess.'); + * }); + * ``` + * + * Certain platforms (macOS, Linux) will use the value of `argv[0]` for the process + * title while others (Windows, SunOS) will use `command`. + * + * Node.js currently overwrites `argv[0]` with `process.execPath` on startup, so`process.argv[0]` in a Node.js child process will not match the `argv0`parameter passed to `spawn` from the parent, + * retrieve it with the`process.argv0` property instead. + * + * If the `signal` option is enabled, calling `.abort()` on the corresponding`AbortController` is similar to calling `.kill()` on the child process except + * the error passed to the callback will be an `AbortError`: + * + * ```js + * const { spawn } = require('child_process'); + * const controller = new AbortController(); + * const { signal } = controller; + * const grep = spawn('grep', ['ssh'], { signal }); + * grep.on('error', (err) => { + * // This will be called with err being an AbortError if the controller aborts + * }); + * controller.abort(); // Stops the child process + * ``` + * @since v0.1.90 + * @param command The command to run. + * @param args List of string arguments. 
+ */ + function spawn(command: string, options?: SpawnOptionsWithoutStdio): ChildProcessWithoutNullStreams; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, options: SpawnOptions): ChildProcess; + // overloads of spawn with 'args' + function spawn(command: string, args?: ReadonlyArray, options?: SpawnOptionsWithoutStdio): ChildProcessWithoutNullStreams; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptionsWithStdioTuple): ChildProcessByStdio; + function spawn(command: string, args: ReadonlyArray, options: SpawnOptions): ChildProcess; + interface ExecOptions extends CommonOptions { + shell?: string | undefined; + maxBuffer?: number | undefined; + killSignal?: NodeJS.Signals | number | undefined; + } + interface ExecOptionsWithStringEncoding extends ExecOptions { + encoding: BufferEncoding; + } + interface ExecOptionsWithBufferEncoding extends ExecOptions { + encoding: BufferEncoding | null; // specify `null`. + } + interface ExecException extends Error { + cmd?: string | undefined; + killed?: boolean | undefined; + code?: number | undefined; + signal?: NodeJS.Signals | undefined; + } + /** + * Spawns a shell then executes the `command` within that shell, buffering any + * generated output. The `command` string passed to the exec function is processed + * directly by the shell and special characters (vary based on [shell](https://en.wikipedia.org/wiki/List_of_command-line_interpreters)) + * need to be dealt with accordingly: + * + * ```js + * const { exec } = require('child_process'); + * + * exec('"/path/to/test file/test.sh" arg1 arg2'); + * // Double quotes are used so that the space in the path is not interpreted as + * // a delimiter of multiple arguments. + * + * exec('echo "The \\$HOME variable is $HOME"'); + * // The $HOME variable is escaped in the first instance, but not in the second. + * ``` + * + * **Never pass unsanitized user input to this function. 
Any input containing shell** + * **metacharacters may be used to trigger arbitrary command execution.** + * + * If a `callback` function is provided, it is called with the arguments`(error, stdout, stderr)`. On success, `error` will be `null`. On error,`error` will be an instance of `Error`. The + * `error.code` property will be + * the exit code of the process. By convention, any exit code other than `0`indicates an error. `error.signal` will be the signal that terminated the + * process. + * + * The `stdout` and `stderr` arguments passed to the callback will contain the + * stdout and stderr output of the child process. By default, Node.js will decode + * the output as UTF-8 and pass strings to the callback. The `encoding` option + * can be used to specify the character encoding used to decode the stdout and + * stderr output. If `encoding` is `'buffer'`, or an unrecognized character + * encoding, `Buffer` objects will be passed to the callback instead. + * + * ```js + * const { exec } = require('child_process'); + * exec('cat *.js missing_file | wc -l', (error, stdout, stderr) => { + * if (error) { + * console.error(`exec error: ${error}`); + * return; + * } + * console.log(`stdout: ${stdout}`); + * console.error(`stderr: ${stderr}`); + * }); + * ``` + * + * If `timeout` is greater than `0`, the parent will send the signal + * identified by the `killSignal` property (the default is `'SIGTERM'`) if the + * child runs longer than `timeout` milliseconds. + * + * Unlike the [`exec(3)`](http://man7.org/linux/man-pages/man3/exec.3.html) POSIX system call, `child_process.exec()` does not replace + * the existing process and uses a shell to execute the command. + * + * If this method is invoked as its `util.promisify()` ed version, it returns + * a `Promise` for an `Object` with `stdout` and `stderr` properties. The returned`ChildProcess` instance is attached to the `Promise` as a `child` property. In + * case of an error (including any error resulting in an exit code other than 0), a + * rejected promise is returned, with the same `error` object given in the + * callback, but with two additional properties `stdout` and `stderr`. + * + * ```js + * const util = require('util'); + * const exec = util.promisify(require('child_process').exec); + * + * async function lsExample() { + * const { stdout, stderr } = await exec('ls'); + * console.log('stdout:', stdout); + * console.error('stderr:', stderr); + * } + * lsExample(); + * ``` + * + * If the `signal` option is enabled, calling `.abort()` on the corresponding`AbortController` is similar to calling `.kill()` on the child process except + * the error passed to the callback will be an `AbortError`: + * + * ```js + * const { exec } = require('child_process'); + * const controller = new AbortController(); + * const { signal } = controller; + * const child = exec('grep ssh', { signal }, (error) => { + * console.log(error); // an AbortError + * }); + * controller.abort(); + * ``` + * @since v0.1.90 + * @param command The command to run, with space-separated arguments. + * @param callback called with the output when process terminates. + */ + function exec(command: string, callback?: (error: ExecException | null, stdout: string, stderr: string) => void): ChildProcess; + // `options` with `"buffer"` or `null` for `encoding` means stdout/stderr are definitely `Buffer`. 
+ function exec( + command: string, + options: { + encoding: 'buffer' | null; + } & ExecOptions, + callback?: (error: ExecException | null, stdout: Buffer, stderr: Buffer) => void + ): ChildProcess; + // `options` with well known `encoding` means stdout/stderr are definitely `string`. + function exec( + command: string, + options: { + encoding: BufferEncoding; + } & ExecOptions, + callback?: (error: ExecException | null, stdout: string, stderr: string) => void + ): ChildProcess; + // `options` with an `encoding` whose type is `string` means stdout/stderr could either be `Buffer` or `string`. + // There is no guarantee the `encoding` is unknown as `string` is a superset of `BufferEncoding`. + function exec( + command: string, + options: { + encoding: BufferEncoding; + } & ExecOptions, + callback?: (error: ExecException | null, stdout: string | Buffer, stderr: string | Buffer) => void + ): ChildProcess; + // `options` without an `encoding` means stdout/stderr are definitely `string`. + function exec(command: string, options: ExecOptions, callback?: (error: ExecException | null, stdout: string, stderr: string) => void): ChildProcess; + // fallback if nothing else matches. Worst case is always `string | Buffer`. + function exec( + command: string, + options: (ObjectEncodingOptions & ExecOptions) | undefined | null, + callback?: (error: ExecException | null, stdout: string | Buffer, stderr: string | Buffer) => void + ): ChildProcess; + interface PromiseWithChild extends Promise { + child: ChildProcess; + } + namespace exec { + function __promisify__(command: string): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + command: string, + options: { + encoding: 'buffer' | null; + } & ExecOptions + ): PromiseWithChild<{ + stdout: Buffer; + stderr: Buffer; + }>; + function __promisify__( + command: string, + options: { + encoding: BufferEncoding; + } & ExecOptions + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + command: string, + options: ExecOptions + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + command: string, + options?: (ObjectEncodingOptions & ExecOptions) | null + ): PromiseWithChild<{ + stdout: string | Buffer; + stderr: string | Buffer; + }>; + } + interface ExecFileOptions extends CommonOptions, Abortable { + maxBuffer?: number | undefined; + killSignal?: NodeJS.Signals | number | undefined; + windowsVerbatimArguments?: boolean | undefined; + shell?: boolean | string | undefined; + signal?: AbortSignal | undefined; + } + interface ExecFileOptionsWithStringEncoding extends ExecFileOptions { + encoding: BufferEncoding; + } + interface ExecFileOptionsWithBufferEncoding extends ExecFileOptions { + encoding: 'buffer' | null; + } + interface ExecFileOptionsWithOtherEncoding extends ExecFileOptions { + encoding: BufferEncoding; + } + type ExecFileException = ExecException & NodeJS.ErrnoException; + /** + * The `child_process.execFile()` function is similar to {@link exec} except that it does not spawn a shell by default. Rather, the specified + * executable `file` is spawned directly as a new process making it slightly more + * efficient than {@link exec}. + * + * The same options as {@link exec} are supported. Since a shell is + * not spawned, behaviors such as I/O redirection and file globbing are not + * supported. 
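+ *
+ * As an illustrative contrast (a sketch, not official guidance): without a shell a
+ * glob pattern is passed through literally, while opting into `shell` restores
+ * expansion at the cost of re-introducing injection concerns.
+ *
+ * ```js
+ * const { execFile } = require('child_process');
+ *
+ * // '*.js' reaches ls verbatim; no shell is present to expand it.
+ * execFile('ls', ['*.js'], (error, stdout) => {
+ *   if (error) console.error('no file literally named "*.js"');
+ * });
+ *
+ * // With a shell, the pattern is expanded before ls runs.
+ * execFile('ls', ['*.js'], { shell: true }, (error, stdout) => console.log(stdout));
+ * ```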
+ * + * ```js + * const { execFile } = require('child_process'); + * const child = execFile('node', ['--version'], (error, stdout, stderr) => { + * if (error) { + * throw error; + * } + * console.log(stdout); + * }); + * ``` + * + * The `stdout` and `stderr` arguments passed to the callback will contain the + * stdout and stderr output of the child process. By default, Node.js will decode + * the output as UTF-8 and pass strings to the callback. The `encoding` option + * can be used to specify the character encoding used to decode the stdout and + * stderr output. If `encoding` is `'buffer'`, or an unrecognized character + * encoding, `Buffer` objects will be passed to the callback instead. + * + * If this method is invoked as its `util.promisify()` ed version, it returns + * a `Promise` for an `Object` with `stdout` and `stderr` properties. The returned`ChildProcess` instance is attached to the `Promise` as a `child` property. In + * case of an error (including any error resulting in an exit code other than 0), a + * rejected promise is returned, with the same `error` object given in the + * callback, but with two additional properties `stdout` and `stderr`. + * + * ```js + * const util = require('util'); + * const execFile = util.promisify(require('child_process').execFile); + * async function getVersion() { + * const { stdout } = await execFile('node', ['--version']); + * console.log(stdout); + * } + * getVersion(); + * ``` + * + * **If the `shell` option is enabled, do not pass unsanitized user input to this** + * **function. Any input containing shell metacharacters may be used to trigger** + * **arbitrary command execution.** + * + * If the `signal` option is enabled, calling `.abort()` on the corresponding`AbortController` is similar to calling `.kill()` on the child process except + * the error passed to the callback will be an `AbortError`: + * + * ```js + * const { execFile } = require('child_process'); + * const controller = new AbortController(); + * const { signal } = controller; + * const child = execFile('node', ['--version'], { signal }, (error) => { + * console.log(error); // an AbortError + * }); + * controller.abort(); + * ``` + * @since v0.1.91 + * @param file The name or path of the executable file to run. + * @param args List of string arguments. + * @param callback Called with the output when process terminates. + */ + function execFile(file: string): ChildProcess; + function execFile(file: string, options: (ObjectEncodingOptions & ExecFileOptions) | undefined | null): ChildProcess; + function execFile(file: string, args?: ReadonlyArray | null): ChildProcess; + function execFile(file: string, args: ReadonlyArray | undefined | null, options: (ObjectEncodingOptions & ExecFileOptions) | undefined | null): ChildProcess; + // no `options` definitely means stdout/stderr are `string`. + function execFile(file: string, callback: (error: ExecFileException | null, stdout: string, stderr: string) => void): ChildProcess; + function execFile(file: string, args: ReadonlyArray | undefined | null, callback: (error: ExecFileException | null, stdout: string, stderr: string) => void): ChildProcess; + // `options` with `"buffer"` or `null` for `encoding` means stdout/stderr are definitely `Buffer`. 
+ function execFile(file: string, options: ExecFileOptionsWithBufferEncoding, callback: (error: ExecFileException | null, stdout: Buffer, stderr: Buffer) => void): ChildProcess; + function execFile( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptionsWithBufferEncoding, + callback: (error: ExecFileException | null, stdout: Buffer, stderr: Buffer) => void + ): ChildProcess; + // `options` with well known `encoding` means stdout/stderr are definitely `string`. + function execFile(file: string, options: ExecFileOptionsWithStringEncoding, callback: (error: ExecFileException | null, stdout: string, stderr: string) => void): ChildProcess; + function execFile( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptionsWithStringEncoding, + callback: (error: ExecFileException | null, stdout: string, stderr: string) => void + ): ChildProcess; + // `options` with an `encoding` whose type is `string` means stdout/stderr could either be `Buffer` or `string`. + // There is no guarantee the `encoding` is unknown as `string` is a superset of `BufferEncoding`. + function execFile(file: string, options: ExecFileOptionsWithOtherEncoding, callback: (error: ExecFileException | null, stdout: string | Buffer, stderr: string | Buffer) => void): ChildProcess; + function execFile( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptionsWithOtherEncoding, + callback: (error: ExecFileException | null, stdout: string | Buffer, stderr: string | Buffer) => void + ): ChildProcess; + // `options` without an `encoding` means stdout/stderr are definitely `string`. + function execFile(file: string, options: ExecFileOptions, callback: (error: ExecFileException | null, stdout: string, stderr: string) => void): ChildProcess; + function execFile( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptions, + callback: (error: ExecFileException | null, stdout: string, stderr: string) => void + ): ChildProcess; + // fallback if nothing else matches. Worst case is always `string | Buffer`. 
+ function execFile( + file: string, + options: (ObjectEncodingOptions & ExecFileOptions) | undefined | null, + callback: ((error: ExecFileException | null, stdout: string | Buffer, stderr: string | Buffer) => void) | undefined | null + ): ChildProcess; + function execFile( + file: string, + args: ReadonlyArray | undefined | null, + options: (ObjectEncodingOptions & ExecFileOptions) | undefined | null, + callback: ((error: ExecFileException | null, stdout: string | Buffer, stderr: string | Buffer) => void) | undefined | null + ): ChildProcess; + namespace execFile { + function __promisify__(file: string): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + file: string, + args: ReadonlyArray | undefined | null + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + file: string, + options: ExecFileOptionsWithBufferEncoding + ): PromiseWithChild<{ + stdout: Buffer; + stderr: Buffer; + }>; + function __promisify__( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptionsWithBufferEncoding + ): PromiseWithChild<{ + stdout: Buffer; + stderr: Buffer; + }>; + function __promisify__( + file: string, + options: ExecFileOptionsWithStringEncoding + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptionsWithStringEncoding + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + file: string, + options: ExecFileOptionsWithOtherEncoding + ): PromiseWithChild<{ + stdout: string | Buffer; + stderr: string | Buffer; + }>; + function __promisify__( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptionsWithOtherEncoding + ): PromiseWithChild<{ + stdout: string | Buffer; + stderr: string | Buffer; + }>; + function __promisify__( + file: string, + options: ExecFileOptions + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + file: string, + args: ReadonlyArray | undefined | null, + options: ExecFileOptions + ): PromiseWithChild<{ + stdout: string; + stderr: string; + }>; + function __promisify__( + file: string, + options: (ObjectEncodingOptions & ExecFileOptions) | undefined | null + ): PromiseWithChild<{ + stdout: string | Buffer; + stderr: string | Buffer; + }>; + function __promisify__( + file: string, + args: ReadonlyArray | undefined | null, + options: (ObjectEncodingOptions & ExecFileOptions) | undefined | null + ): PromiseWithChild<{ + stdout: string | Buffer; + stderr: string | Buffer; + }>; + } + interface ForkOptions extends ProcessEnvOptions, MessagingOptions, Abortable { + execPath?: string | undefined; + execArgv?: string[] | undefined; + silent?: boolean | undefined; + stdio?: StdioOptions | undefined; + detached?: boolean | undefined; + windowsVerbatimArguments?: boolean | undefined; + } + /** + * The `child_process.fork()` method is a special case of {@link spawn} used specifically to spawn new Node.js processes. + * Like {@link spawn}, a `ChildProcess` object is returned. The + * returned `ChildProcess` will have an additional communication channel + * built-in that allows messages to be passed back and forth between the parent and + * child. See `subprocess.send()` for details. + * + * Keep in mind that spawned Node.js child processes are + * independent of the parent with exception of the IPC communication channel + * that is established between the two. 
Each process has its own memory, with + * their own V8 instances. Because of the additional resource allocations + * required, spawning a large number of child Node.js processes is not + * recommended. + * + * By default, `child_process.fork()` will spawn new Node.js instances using the `process.execPath` of the parent process. The `execPath` property in the`options` object allows for an alternative + * execution path to be used. + * + * Node.js processes launched with a custom `execPath` will communicate with the + * parent process using the file descriptor (fd) identified using the + * environment variable `NODE_CHANNEL_FD` on the child process. + * + * Unlike the [`fork(2)`](http://man7.org/linux/man-pages/man2/fork.2.html) POSIX system call, `child_process.fork()` does not clone the + * current process. + * + * The `shell` option available in {@link spawn} is not supported by`child_process.fork()` and will be ignored if set. + * + * If the `signal` option is enabled, calling `.abort()` on the corresponding`AbortController` is similar to calling `.kill()` on the child process except + * the error passed to the callback will be an `AbortError`: + * + * ```js + * if (process.argv[2] === 'child') { + * setTimeout(() => { + * console.log(`Hello from ${process.argv[2]}!`); + * }, 1_000); + * } else { + * const { fork } = require('child_process'); + * const controller = new AbortController(); + * const { signal } = controller; + * const child = fork(__filename, ['child'], { signal }); + * child.on('error', (err) => { + * // This will be called with err being an AbortError if the controller aborts + * }); + * controller.abort(); // Stops the child process + * } + * ``` + * @since v0.5.0 + * @param modulePath The module to run in the child. + * @param args List of string arguments. + */ + function fork(modulePath: string, options?: ForkOptions): ChildProcess; + function fork(modulePath: string, args?: ReadonlyArray, options?: ForkOptions): ChildProcess; + interface SpawnSyncOptions extends CommonSpawnOptions { + input?: string | NodeJS.ArrayBufferView | undefined; + maxBuffer?: number | undefined; + encoding?: BufferEncoding | 'buffer' | null | undefined; + } + interface SpawnSyncOptionsWithStringEncoding extends SpawnSyncOptions { + encoding: BufferEncoding; + } + interface SpawnSyncOptionsWithBufferEncoding extends SpawnSyncOptions { + encoding?: 'buffer' | null | undefined; + } + interface SpawnSyncReturns { + pid: number; + output: Array; + stdout: T; + stderr: T; + status: number | null; + signal: NodeJS.Signals | null; + error?: Error | undefined; + } + /** + * The `child_process.spawnSync()` method is generally identical to {@link spawn} with the exception that the function will not return + * until the child process has fully closed. When a timeout has been encountered + * and `killSignal` is sent, the method won't return until the process has + * completely exited. If the process intercepts and handles the `SIGTERM` signal + * and doesn't exit, the parent process will wait until the child process has + * exited. + * + * **If the `shell` option is enabled, do not pass unsanitized user input to this** + * **function. Any input containing shell metacharacters may be used to trigger** + * **arbitrary command execution.** + * @since v0.11.12 + * @param command The command to run. + * @param args List of string arguments. 
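+ *
+ * A small usage sketch (not from the official docs; the command is arbitrary):
+ *
+ * ```js
+ * const { spawnSync } = require('child_process');
+ * const result = spawnSync('node', ['--version'], { encoding: 'utf8' });
+ * if (result.error) {
+ *   console.error('failed to spawn:', result.error);
+ * } else {
+ *   console.log('status:', result.status);        // 0 on success
+ *   console.log('stdout:', result.stdout.trim()); // e.g. 'v16.9.0'
+ * }
+ * ```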
+ */ + function spawnSync(command: string): SpawnSyncReturns; + function spawnSync(command: string, options: SpawnSyncOptionsWithStringEncoding): SpawnSyncReturns; + function spawnSync(command: string, options: SpawnSyncOptionsWithBufferEncoding): SpawnSyncReturns; + function spawnSync(command: string, options?: SpawnSyncOptions): SpawnSyncReturns; + function spawnSync(command: string, args: ReadonlyArray): SpawnSyncReturns; + function spawnSync(command: string, args: ReadonlyArray, options: SpawnSyncOptionsWithStringEncoding): SpawnSyncReturns; + function spawnSync(command: string, args: ReadonlyArray, options: SpawnSyncOptionsWithBufferEncoding): SpawnSyncReturns; + function spawnSync(command: string, args?: ReadonlyArray, options?: SpawnSyncOptions): SpawnSyncReturns; + interface CommonExecOptions extends CommonOptions { + input?: string | NodeJS.ArrayBufferView | undefined; + stdio?: StdioOptions | undefined; + killSignal?: NodeJS.Signals | number | undefined; + maxBuffer?: number | undefined; + encoding?: BufferEncoding | 'buffer' | null | undefined; + } + interface ExecSyncOptions extends CommonExecOptions { + shell?: string | undefined; + } + interface ExecSyncOptionsWithStringEncoding extends ExecSyncOptions { + encoding: BufferEncoding; + } + interface ExecSyncOptionsWithBufferEncoding extends ExecSyncOptions { + encoding?: 'buffer' | null | undefined; + } + /** + * The `child_process.execSync()` method is generally identical to {@link exec} with the exception that the method will not return + * until the child process has fully closed. When a timeout has been encountered + * and `killSignal` is sent, the method won't return until the process has + * completely exited. If the child process intercepts and handles the `SIGTERM`signal and doesn't exit, the parent process will wait until the child process + * has exited. + * + * If the process times out or has a non-zero exit code, this method will throw. + * The `Error` object will contain the entire result from {@link spawnSync}. + * + * **Never pass unsanitized user input to this function. Any input containing shell** + * **metacharacters may be used to trigger arbitrary command execution.** + * @since v0.11.12 + * @param command The command to run. + * @return The stdout from the command. + */ + function execSync(command: string): Buffer; + function execSync(command: string, options: ExecSyncOptionsWithStringEncoding): string; + function execSync(command: string, options: ExecSyncOptionsWithBufferEncoding): Buffer; + function execSync(command: string, options?: ExecSyncOptions): string | Buffer; + interface ExecFileSyncOptions extends CommonExecOptions { + shell?: boolean | string | undefined; + } + interface ExecFileSyncOptionsWithStringEncoding extends ExecFileSyncOptions { + encoding: BufferEncoding; + } + interface ExecFileSyncOptionsWithBufferEncoding extends ExecFileSyncOptions { + encoding?: 'buffer' | null; // specify `null`. + } + /** + * The `child_process.execFileSync()` method is generally identical to {@link execFile} with the exception that the method will not + * return until the child process has fully closed. When a timeout has been + * encountered and `killSignal` is sent, the method won't return until the process + * has completely exited. + * + * If the child process intercepts and handles the `SIGTERM` signal and + * does not exit, the parent process will still wait until the child process has + * exited. 
+ * + * If the process times out or has a non-zero exit code, this method will throw an `Error` that will include the full result of the underlying {@link spawnSync}. + * + * **If the `shell` option is enabled, do not pass unsanitized user input to this** + * **function. Any input containing shell metacharacters may be used to trigger** + * **arbitrary command execution.** + * @since v0.11.12 + * @param file The name or path of the executable file to run. + * @param args List of string arguments. + * @return The stdout from the command. + */ + function execFileSync(file: string): Buffer; + function execFileSync(file: string, options: ExecFileSyncOptionsWithStringEncoding): string; + function execFileSync(file: string, options: ExecFileSyncOptionsWithBufferEncoding): Buffer; + function execFileSync(file: string, options?: ExecFileSyncOptions): string | Buffer; + function execFileSync(file: string, args: ReadonlyArray): Buffer; + function execFileSync(file: string, args: ReadonlyArray, options: ExecFileSyncOptionsWithStringEncoding): string; + function execFileSync(file: string, args: ReadonlyArray, options: ExecFileSyncOptionsWithBufferEncoding): Buffer; + function execFileSync(file: string, args?: ReadonlyArray, options?: ExecFileSyncOptions): string | Buffer; +} +declare module 'node:child_process' { + export * from 'child_process'; +} diff --git a/node_modules/@types/node/cluster.d.ts b/node_modules/@types/node/cluster.d.ts new file mode 100644 index 000000000..7f1614105 --- /dev/null +++ b/node_modules/@types/node/cluster.d.ts @@ -0,0 +1,414 @@ +/** + * A single instance of Node.js runs in a single thread. To take advantage of + * multi-core systems, the user will sometimes want to launch a cluster of Node.js + * processes to handle the load. + * + * The cluster module allows easy creation of child processes that all share + * server ports. + * + * ```js + * import cluster from 'cluster'; + * import http from 'http'; + * import { cpus } from 'os'; + * import process from 'process'; + * + * const numCPUs = cpus().length; + * + * if (cluster.isPrimary) { + * console.log(`Primary ${process.pid} is running`); + * + * // Fork workers. + * for (let i = 0; i < numCPUs; i++) { + * cluster.fork(); + * } + * + * cluster.on('exit', (worker, code, signal) => { + * console.log(`worker ${worker.process.pid} died`); + * }); + * } else { + * // Workers can share any TCP connection + * // In this case it is an HTTP server + * http.createServer((req, res) => { + * res.writeHead(200); + * res.end('hello world\n'); + * }).listen(8000); + * + * console.log(`Worker ${process.pid} started`); + * } + * ``` + * + * Running Node.js will now share port 8000 between the workers: + * + * ```console + * $ node server.js + * Primary 3596 is running + * Worker 4324 started + * Worker 4520 started + * Worker 6056 started + * Worker 5644 started + * ``` + * + * On Windows, it is not yet possible to set up a named pipe server in a worker. 
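+ *
+ * A common follow-on pattern, shown here only as a hedged sketch, is to replace
+ * workers that die unexpectedly (i.e. not via `.kill()` or `.disconnect()`):
+ *
+ * ```js
+ * import cluster from 'cluster';
+ *
+ * cluster.on('exit', (worker, code, signal) => {
+ *   if (!worker.exitedAfterDisconnect) {
+ *     console.log(`worker ${worker.process.pid} died (${signal || code}); forking a replacement`);
+ *     cluster.fork();
+ *   }
+ * });
+ * ```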
+ * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/cluster.js) + */ +declare module 'cluster' { + import * as child from 'node:child_process'; + import EventEmitter = require('node:events'); + import * as net from 'node:net'; + export interface ClusterSettings { + execArgv?: string[] | undefined; // default: process.execArgv + exec?: string | undefined; + args?: string[] | undefined; + silent?: boolean | undefined; + stdio?: any[] | undefined; + uid?: number | undefined; + gid?: number | undefined; + inspectPort?: number | (() => number) | undefined; + } + export interface Address { + address: string; + port: number; + addressType: number | 'udp4' | 'udp6'; // 4, 6, -1, "udp4", "udp6" + } + /** + * A `Worker` object contains all public information and method about a worker. + * In the primary it can be obtained using `cluster.workers`. In a worker + * it can be obtained using `cluster.worker`. + * @since v0.7.0 + */ + export class Worker extends EventEmitter { + /** + * Each new worker is given its own unique id, this id is stored in the`id`. + * + * While a worker is alive, this is the key that indexes it in`cluster.workers`. + * @since v0.8.0 + */ + id: number; + /** + * All workers are created using `child_process.fork()`, the returned object + * from this function is stored as `.process`. In a worker, the global `process`is stored. + * + * See: `Child Process module`. + * + * Workers will call `process.exit(0)` if the `'disconnect'` event occurs + * on `process` and `.exitedAfterDisconnect` is not `true`. This protects against + * accidental disconnection. + * @since v0.7.0 + */ + process: child.ChildProcess; + /** + * Send a message to a worker or primary, optionally with a handle. + * + * In the primary this sends a message to a specific worker. It is identical to `ChildProcess.send()`. + * + * In a worker this sends a message to the primary. It is identical to`process.send()`. + * + * This example will echo back all messages from the primary: + * + * ```js + * if (cluster.isPrimary) { + * const worker = cluster.fork(); + * worker.send('hi there'); + * + * } else if (cluster.isWorker) { + * process.on('message', (msg) => { + * process.send(msg); + * }); + * } + * ``` + * @since v0.7.0 + * @param options The `options` argument, if present, is an object used to parameterize the sending of certain types of handles. `options` supports the following properties: + */ + send(message: child.Serializable, callback?: (error: Error | null) => void): boolean; + send(message: child.Serializable, sendHandle: child.SendHandle, callback?: (error: Error | null) => void): boolean; + send(message: child.Serializable, sendHandle: child.SendHandle, options?: child.MessageOptions, callback?: (error: Error | null) => void): boolean; + /** + * This function will kill the worker. In the primary, it does this + * by disconnecting the `worker.process`, and once disconnected, killing + * with `signal`. In the worker, it does it by disconnecting the channel, + * and then exiting with code `0`. + * + * Because `kill()` attempts to gracefully disconnect the worker process, it is + * susceptible to waiting indefinitely for the disconnect to complete. For example, + * if the worker enters an infinite loop, a graceful disconnect will never occur. + * If the graceful disconnect behavior is not needed, use `worker.process.kill()`. + * + * Causes `.exitedAfterDisconnect` to be set. + * + * This method is aliased as `worker.destroy()` for backward compatibility. 
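+ *
+ * A hedged sketch (the 10 second grace period is an arbitrary choice):
+ *
+ * ```js
+ * import cluster from 'cluster';
+ *
+ * if (cluster.isPrimary) {
+ *   const worker = cluster.fork();
+ *   // Give the worker 10 seconds, then stop it if it is still alive.
+ *   setTimeout(() => {
+ *     if (!worker.isDead()) worker.kill('SIGTERM');
+ *   }, 10000);
+ * }
+ * ```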
+ * + * In a worker, `process.kill()` exists, but it is not this function; + * it is `kill()`. + * @since v0.9.12 + * @param [signal='SIGTERM'] Name of the kill signal to send to the worker process. + */ + kill(signal?: string): void; + destroy(signal?: string): void; + /** + * In a worker, this function will close all servers, wait for the `'close'` event + * on those servers, and then disconnect the IPC channel. + * + * In the primary, an internal message is sent to the worker causing it to call`.disconnect()` on itself. + * + * Causes `.exitedAfterDisconnect` to be set. + * + * After a server is closed, it will no longer accept new connections, + * but connections may be accepted by any other listening worker. Existing + * connections will be allowed to close as usual. When no more connections exist, + * see `server.close()`, the IPC channel to the worker will close allowing it + * to die gracefully. + * + * The above applies _only_ to server connections, client connections are not + * automatically closed by workers, and disconnect does not wait for them to close + * before exiting. + * + * In a worker, `process.disconnect` exists, but it is not this function; + * it is `disconnect()`. + * + * Because long living server connections may block workers from disconnecting, it + * may be useful to send a message, so application specific actions may be taken to + * close them. It also may be useful to implement a timeout, killing a worker if + * the `'disconnect'` event has not been emitted after some time. + * + * ```js + * if (cluster.isPrimary) { + * const worker = cluster.fork(); + * let timeout; + * + * worker.on('listening', (address) => { + * worker.send('shutdown'); + * worker.disconnect(); + * timeout = setTimeout(() => { + * worker.kill(); + * }, 2000); + * }); + * + * worker.on('disconnect', () => { + * clearTimeout(timeout); + * }); + * + * } else if (cluster.isWorker) { + * const net = require('net'); + * const server = net.createServer((socket) => { + * // Connections never end + * }); + * + * server.listen(8000); + * + * process.on('message', (msg) => { + * if (msg === 'shutdown') { + * // Initiate graceful close of any connections to server + * } + * }); + * } + * ``` + * @since v0.7.7 + * @return A reference to `worker`. + */ + disconnect(): void; + /** + * This function returns `true` if the worker is connected to its primary via its + * IPC channel, `false` otherwise. A worker is connected to its primary after it + * has been created. It is disconnected after the `'disconnect'` event is emitted. + * @since v0.11.14 + */ + isConnected(): boolean; + /** + * This function returns `true` if the worker's process has terminated (either + * because of exiting or being signaled). Otherwise, it returns `false`. + * + * ```js + * import cluster from 'cluster'; + * import http from 'http'; + * import { cpus } from 'os'; + * import process from 'process'; + * + * const numCPUs = cpus().length; + * + * if (cluster.isPrimary) { + * console.log(`Primary ${process.pid} is running`); + * + * // Fork workers. + * for (let i = 0; i < numCPUs; i++) { + * cluster.fork(); + * } + * + * cluster.on('fork', (worker) => { + * console.log('worker is dead:', worker.isDead()); + * }); + * + * cluster.on('exit', (worker, code, signal) => { + * console.log('worker is dead:', worker.isDead()); + * }); + * } else { + * // Workers can share any TCP connection. In this case, it is an HTTP server. 
+ * http.createServer((req, res) => { + * res.writeHead(200); + * res.end(`Current process\n ${process.pid}`); + * process.kill(process.pid); + * }).listen(8000); + * } + * ``` + * @since v0.11.14 + */ + isDead(): boolean; + /** + * This property is `true` if the worker exited due to `.kill()` or`.disconnect()`. If the worker exited any other way, it is `false`. If the + * worker has not exited, it is `undefined`. + * + * The boolean `worker.exitedAfterDisconnect` allows distinguishing between + * voluntary and accidental exit, the primary may choose not to respawn a worker + * based on this value. + * + * ```js + * cluster.on('exit', (worker, code, signal) => { + * if (worker.exitedAfterDisconnect === true) { + * console.log('Oh, it was just voluntary – no need to worry'); + * } + * }); + * + * // kill worker + * worker.kill(); + * ``` + * @since v6.0.0 + */ + exitedAfterDisconnect: boolean; + /** + * events.EventEmitter + * 1. disconnect + * 2. error + * 3. exit + * 4. listening + * 5. message + * 6. online + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'disconnect', listener: () => void): this; + addListener(event: 'error', listener: (error: Error) => void): this; + addListener(event: 'exit', listener: (code: number, signal: string) => void): this; + addListener(event: 'listening', listener: (address: Address) => void): this; + addListener(event: 'message', listener: (message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. + addListener(event: 'online', listener: () => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'disconnect'): boolean; + emit(event: 'error', error: Error): boolean; + emit(event: 'exit', code: number, signal: string): boolean; + emit(event: 'listening', address: Address): boolean; + emit(event: 'message', message: any, handle: net.Socket | net.Server): boolean; + emit(event: 'online'): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'disconnect', listener: () => void): this; + on(event: 'error', listener: (error: Error) => void): this; + on(event: 'exit', listener: (code: number, signal: string) => void): this; + on(event: 'listening', listener: (address: Address) => void): this; + on(event: 'message', listener: (message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. + on(event: 'online', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'disconnect', listener: () => void): this; + once(event: 'error', listener: (error: Error) => void): this; + once(event: 'exit', listener: (code: number, signal: string) => void): this; + once(event: 'listening', listener: (address: Address) => void): this; + once(event: 'message', listener: (message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. 
+ once(event: 'online', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'disconnect', listener: () => void): this; + prependListener(event: 'error', listener: (error: Error) => void): this; + prependListener(event: 'exit', listener: (code: number, signal: string) => void): this; + prependListener(event: 'listening', listener: (address: Address) => void): this; + prependListener(event: 'message', listener: (message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. + prependListener(event: 'online', listener: () => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'disconnect', listener: () => void): this; + prependOnceListener(event: 'error', listener: (error: Error) => void): this; + prependOnceListener(event: 'exit', listener: (code: number, signal: string) => void): this; + prependOnceListener(event: 'listening', listener: (address: Address) => void): this; + prependOnceListener(event: 'message', listener: (message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. + prependOnceListener(event: 'online', listener: () => void): this; + } + export interface Cluster extends EventEmitter { + disconnect(callback?: () => void): void; + fork(env?: any): Worker; + /** @deprecated since v16.0.0 - use isPrimary. */ + readonly isMaster: boolean; + readonly isPrimary: boolean; + readonly isWorker: boolean; + schedulingPolicy: number; + readonly settings: ClusterSettings; + /** @deprecated since v16.0.0 - use setupPrimary. */ + setupMaster(settings?: ClusterSettings): void; + /** + * `setupPrimary` is used to change the default 'fork' behavior. Once called, the settings will be present in cluster.settings. + */ + setupPrimary(settings?: ClusterSettings): void; + readonly worker?: Worker | undefined; + readonly workers?: NodeJS.Dict | undefined; + readonly SCHED_NONE: number; + readonly SCHED_RR: number; + /** + * events.EventEmitter + * 1. disconnect + * 2. exit + * 3. fork + * 4. listening + * 5. message + * 6. online + * 7. setup + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'disconnect', listener: (worker: Worker) => void): this; + addListener(event: 'exit', listener: (worker: Worker, code: number, signal: string) => void): this; + addListener(event: 'fork', listener: (worker: Worker) => void): this; + addListener(event: 'listening', listener: (worker: Worker, address: Address) => void): this; + addListener(event: 'message', listener: (worker: Worker, message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. 
+ addListener(event: 'online', listener: (worker: Worker) => void): this; + addListener(event: 'setup', listener: (settings: ClusterSettings) => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'disconnect', worker: Worker): boolean; + emit(event: 'exit', worker: Worker, code: number, signal: string): boolean; + emit(event: 'fork', worker: Worker): boolean; + emit(event: 'listening', worker: Worker, address: Address): boolean; + emit(event: 'message', worker: Worker, message: any, handle: net.Socket | net.Server): boolean; + emit(event: 'online', worker: Worker): boolean; + emit(event: 'setup', settings: ClusterSettings): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'disconnect', listener: (worker: Worker) => void): this; + on(event: 'exit', listener: (worker: Worker, code: number, signal: string) => void): this; + on(event: 'fork', listener: (worker: Worker) => void): this; + on(event: 'listening', listener: (worker: Worker, address: Address) => void): this; + on(event: 'message', listener: (worker: Worker, message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. + on(event: 'online', listener: (worker: Worker) => void): this; + on(event: 'setup', listener: (settings: ClusterSettings) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'disconnect', listener: (worker: Worker) => void): this; + once(event: 'exit', listener: (worker: Worker, code: number, signal: string) => void): this; + once(event: 'fork', listener: (worker: Worker) => void): this; + once(event: 'listening', listener: (worker: Worker, address: Address) => void): this; + once(event: 'message', listener: (worker: Worker, message: any, handle: net.Socket | net.Server) => void): this; // the handle is a net.Socket or net.Server object, or undefined. + once(event: 'online', listener: (worker: Worker) => void): this; + once(event: 'setup', listener: (settings: ClusterSettings) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'disconnect', listener: (worker: Worker) => void): this; + prependListener(event: 'exit', listener: (worker: Worker, code: number, signal: string) => void): this; + prependListener(event: 'fork', listener: (worker: Worker) => void): this; + prependListener(event: 'listening', listener: (worker: Worker, address: Address) => void): this; + // the handle is a net.Socket or net.Server object, or undefined. + prependListener(event: 'message', listener: (worker: Worker, message: any, handle?: net.Socket | net.Server) => void): this; + prependListener(event: 'online', listener: (worker: Worker) => void): this; + prependListener(event: 'setup', listener: (settings: ClusterSettings) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'disconnect', listener: (worker: Worker) => void): this; + prependOnceListener(event: 'exit', listener: (worker: Worker, code: number, signal: string) => void): this; + prependOnceListener(event: 'fork', listener: (worker: Worker) => void): this; + prependOnceListener(event: 'listening', listener: (worker: Worker, address: Address) => void): this; + // the handle is a net.Socket or net.Server object, or undefined. 
+ prependOnceListener(event: 'message', listener: (worker: Worker, message: any, handle: net.Socket | net.Server) => void): this; + prependOnceListener(event: 'online', listener: (worker: Worker) => void): this; + prependOnceListener(event: 'setup', listener: (settings: ClusterSettings) => void): this; + } + const cluster: Cluster; + export default cluster; +} +declare module 'node:cluster' { + export * from 'cluster'; + export { default as default } from 'cluster'; +} diff --git a/node_modules/@types/node/console.d.ts b/node_modules/@types/node/console.d.ts new file mode 100644 index 000000000..ede7a53f2 --- /dev/null +++ b/node_modules/@types/node/console.d.ts @@ -0,0 +1,412 @@ +/** + * The `console` module provides a simple debugging console that is similar to the + * JavaScript console mechanism provided by web browsers. + * + * The module exports two specific components: + * + * * A `Console` class with methods such as `console.log()`, `console.error()` and`console.warn()` that can be used to write to any Node.js stream. + * * A global `console` instance configured to write to `process.stdout` and `process.stderr`. The global `console` can be used without calling`require('console')`. + * + * _**Warning**_: The global console object's methods are neither consistently + * synchronous like the browser APIs they resemble, nor are they consistently + * asynchronous like all other Node.js streams. See the `note on process I/O` for + * more information. + * + * Example using the global `console`: + * + * ```js + * console.log('hello world'); + * // Prints: hello world, to stdout + * console.log('hello %s', 'world'); + * // Prints: hello world, to stdout + * console.error(new Error('Whoops, something bad happened')); + * // Prints error message and stack trace to stderr: + * // Error: Whoops, something bad happened + * // at [eval]:5:15 + * // at Script.runInThisContext (node:vm:132:18) + * // at Object.runInThisContext (node:vm:309:38) + * // at node:internal/process/execution:77:19 + * // at [eval]-wrapper:6:22 + * // at evalScript (node:internal/process/execution:76:60) + * // at node:internal/main/eval_string:23:3 + * + * const name = 'Will Robinson'; + * console.warn(`Danger ${name}! Danger!`); + * // Prints: Danger Will Robinson! Danger!, to stderr + * ``` + * + * Example using the `Console` class: + * + * ```js + * const out = getStreamSomehow(); + * const err = getStreamSomehow(); + * const myConsole = new console.Console(out, err); + * + * myConsole.log('hello world'); + * // Prints: hello world, to out + * myConsole.log('hello %s', 'world'); + * // Prints: hello world, to out + * myConsole.error(new Error('Whoops, something bad happened')); + * // Prints: [Error: Whoops, something bad happened], to err + * + * const name = 'Will Robinson'; + * myConsole.warn(`Danger ${name}! Danger!`); + * // Prints: Danger Will Robinson! Danger!, to err + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/console.js) + */ +declare module 'console' { + import console = require('node:console'); + export = console; +} +declare module 'node:console' { + import { InspectOptions } from 'node:util'; + global { + // This needs to be global to avoid TS2403 in case lib.dom.d.ts is present in the same build + interface Console { + Console: console.ConsoleConstructor; + /** + * `console.assert()` writes a message if `value` is [falsy](https://developer.mozilla.org/en-US/docs/Glossary/Falsy) or omitted. It only + * writes a message and does not otherwise affect execution. 
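Stepping back to the `cluster` module whose declaration ends above: the following is a minimal sketch (not taken from the project code or the embedded docs verbatim) of how the `Cluster` and `Worker` declarations fit together, forking one worker per CPU and respawning only workers that die unexpectedly, as distinguished by the `exitedAfterDisconnect` flag documented earlier.

```ts
import cluster from 'cluster';
import http from 'http';
import { cpus } from 'os';

if (cluster.isPrimary) {
  // Fork one worker per CPU core.
  for (let i = 0; i < cpus().length; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker, code, signal) => {
    // Respawn only workers that did not exit via kill()/disconnect().
    if (!worker.exitedAfterDisconnect) {
      console.log(`worker ${worker.id} died (code ${code}, signal ${signal}); forking a replacement`);
      cluster.fork();
    }
  });
} else {
  // Workers share the listening port; the primary distributes incoming connections.
  http.createServer((_req, res) => {
    res.end(`handled by pid ${process.pid}\n`);
  }).listen(8000);
}
```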
The output always + * starts with `"Assertion failed"`. If provided, `message` is formatted using `util.format()`. + * + * If `value` is [truthy](https://developer.mozilla.org/en-US/docs/Glossary/Truthy), nothing happens. + * + * ```js + * console.assert(true, 'does nothing'); + * + * console.assert(false, 'Whoops %s work', 'didn\'t'); + * // Assertion failed: Whoops didn't work + * + * console.assert(); + * // Assertion failed + * ``` + * @since v0.1.101 + * @param value The value tested for being truthy. + * @param message All arguments besides `value` are used as error message. + */ + assert(value: any, message?: string, ...optionalParams: any[]): void; + /** + * When `stdout` is a TTY, calling `console.clear()` will attempt to clear the + * TTY. When `stdout` is not a TTY, this method does nothing. + * + * The specific operation of `console.clear()` can vary across operating systems + * and terminal types. For most Linux operating systems, `console.clear()`operates similarly to the `clear` shell command. On Windows, `console.clear()`will clear only the output in the + * current terminal viewport for the Node.js + * binary. + * @since v8.3.0 + */ + clear(): void; + /** + * Maintains an internal counter specific to `label` and outputs to `stdout` the + * number of times `console.count()` has been called with the given `label`. + * + * ```js + * > console.count() + * default: 1 + * undefined + * > console.count('default') + * default: 2 + * undefined + * > console.count('abc') + * abc: 1 + * undefined + * > console.count('xyz') + * xyz: 1 + * undefined + * > console.count('abc') + * abc: 2 + * undefined + * > console.count() + * default: 3 + * undefined + * > + * ``` + * @since v8.3.0 + * @param label The display label for the counter. + */ + count(label?: string): void; + /** + * Resets the internal counter specific to `label`. + * + * ```js + * > console.count('abc'); + * abc: 1 + * undefined + * > console.countReset('abc'); + * undefined + * > console.count('abc'); + * abc: 1 + * undefined + * > + * ``` + * @since v8.3.0 + * @param label The display label for the counter. + */ + countReset(label?: string): void; + /** + * The `console.debug()` function is an alias for {@link log}. + * @since v8.0.0 + */ + debug(message?: any, ...optionalParams: any[]): void; + /** + * Uses `util.inspect()` on `obj` and prints the resulting string to `stdout`. + * This function bypasses any custom `inspect()` function defined on `obj`. + * @since v0.1.101 + */ + dir(obj: any, options?: InspectOptions): void; + /** + * This method calls `console.log()` passing it the arguments received. + * This method does not produce any XML formatting. + * @since v8.0.0 + */ + dirxml(...data: any[]): void; + /** + * Prints to `stderr` with newline. Multiple arguments can be passed, with the + * first used as the primary message and all additional used as substitution + * values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to `util.format()`). + * + * ```js + * const code = 5; + * console.error('error #%d', code); + * // Prints: error #5, to stderr + * console.error('error', code); + * // Prints: error 5, to stderr + * ``` + * + * If formatting elements (e.g. `%d`) are not found in the first string then `util.inspect()` is called on each argument and the resulting string + * values are concatenated. See `util.format()` for more information. 
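The module-level comment above constructs a `Console` over arbitrary writable streams (`new console.Console(out, err)`). As a small sketch with made-up log file paths, the same constructor can route a logger's output to files instead of the terminal:

```ts
import { createWriteStream } from 'fs';

// Hypothetical log file paths, used only for this sketch.
const out = createWriteStream('./stdout.log');
const err = createWriteStream('./stderr.log');

// A Console instance bound to the two streams instead of process.stdout/stderr.
const logger = new console.Console(out, err);

logger.log('count: %d', 5);                      // -> ./stdout.log
logger.error(new Error('something went wrong')); // -> ./stderr.log
logger.warn('warn() is an alias for error()');   // -> ./stderr.log
```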
+ * @since v0.1.100 + */ + error(message?: any, ...optionalParams: any[]): void; + /** + * Increases indentation of subsequent lines by spaces for `groupIndentation`length. + * + * If one or more `label`s are provided, those are printed first without the + * additional indentation. + * @since v8.5.0 + */ + group(...label: any[]): void; + /** + * An alias for {@link group}. + * @since v8.5.0 + */ + groupCollapsed(...label: any[]): void; + /** + * Decreases indentation of subsequent lines by spaces for `groupIndentation`length. + * @since v8.5.0 + */ + groupEnd(): void; + /** + * The `console.info()` function is an alias for {@link log}. + * @since v0.1.100 + */ + info(message?: any, ...optionalParams: any[]): void; + /** + * Prints to `stdout` with newline. Multiple arguments can be passed, with the + * first used as the primary message and all additional used as substitution + * values similar to [`printf(3)`](http://man7.org/linux/man-pages/man3/printf.3.html) (the arguments are all passed to `util.format()`). + * + * ```js + * const count = 5; + * console.log('count: %d', count); + * // Prints: count: 5, to stdout + * console.log('count:', count); + * // Prints: count: 5, to stdout + * ``` + * + * See `util.format()` for more information. + * @since v0.1.100 + */ + log(message?: any, ...optionalParams: any[]): void; + /** + * Try to construct a table with the columns of the properties of `tabularData`(or use `properties`) and rows of `tabularData` and log it. Falls back to just + * logging the argument if it can’t be parsed as tabular. + * + * ```js + * // These can't be parsed as tabular data + * console.table(Symbol()); + * // Symbol() + * + * console.table(undefined); + * // undefined + * + * console.table([{ a: 1, b: 'Y' }, { a: 'Z', b: 2 }]); + * // ┌─────────┬─────┬─────┐ + * // │ (index) │ a │ b │ + * // ├─────────┼─────┼─────┤ + * // │ 0 │ 1 │ 'Y' │ + * // │ 1 │ 'Z' │ 2 │ + * // └─────────┴─────┴─────┘ + * + * console.table([{ a: 1, b: 'Y' }, { a: 'Z', b: 2 }], ['a']); + * // ┌─────────┬─────┐ + * // │ (index) │ a │ + * // ├─────────┼─────┤ + * // │ 0 │ 1 │ + * // │ 1 │ 'Z' │ + * // └─────────┴─────┘ + * ``` + * @since v10.0.0 + * @param properties Alternate properties for constructing the table. + */ + table(tabularData: any, properties?: ReadonlyArray): void; + /** + * Starts a timer that can be used to compute the duration of an operation. Timers + * are identified by a unique `label`. Use the same `label` when calling {@link timeEnd} to stop the timer and output the elapsed time in + * suitable time units to `stdout`. For example, if the elapsed + * time is 3869ms, `console.timeEnd()` displays "3.869s". + * @since v0.1.104 + */ + time(label?: string): void; + /** + * Stops a timer that was previously started by calling {@link time} and + * prints the result to `stdout`: + * + * ```js + * console.time('100-elements'); + * for (let i = 0; i < 100; i++) {} + * console.timeEnd('100-elements'); + * // prints 100-elements: 225.438ms + * ``` + * @since v0.1.104 + */ + timeEnd(label?: string): void; + /** + * For a timer that was previously started by calling {@link time}, prints + * the elapsed time and other `data` arguments to `stdout`: + * + * ```js + * console.time('process'); + * const value = expensiveProcess1(); // Returns 42 + * console.timeLog('process', value); + * // Prints "process: 365.227ms 42". 
+ * doExpensiveProcess2(value); + * console.timeEnd('process'); + * ``` + * @since v10.7.0 + */ + timeLog(label?: string, ...data: any[]): void; + /** + * Prints to `stderr` the string `'Trace: '`, followed by the `util.format()` formatted message and stack trace to the current position in the code. + * + * ```js + * console.trace('Show me'); + * // Prints: (stack trace will vary based on where trace is called) + * // Trace: Show me + * // at repl:2:9 + * // at REPLServer.defaultEval (repl.js:248:27) + * // at bound (domain.js:287:14) + * // at REPLServer.runBound [as eval] (domain.js:300:12) + * // at REPLServer. (repl.js:412:12) + * // at emitOne (events.js:82:20) + * // at REPLServer.emit (events.js:169:7) + * // at REPLServer.Interface._onLine (readline.js:210:10) + * // at REPLServer.Interface._line (readline.js:549:8) + * // at REPLServer.Interface._ttyWrite (readline.js:826:14) + * ``` + * @since v0.1.104 + */ + trace(message?: any, ...optionalParams: any[]): void; + /** + * The `console.warn()` function is an alias for {@link error}. + * @since v0.1.100 + */ + warn(message?: any, ...optionalParams: any[]): void; + // --- Inspector mode only --- + /** + * This method does not display anything unless used in the inspector. + * Starts a JavaScript CPU profile with an optional label. + */ + profile(label?: string): void; + /** + * This method does not display anything unless used in the inspector. + * Stops the current JavaScript CPU profiling session if one has been started and prints the report to the Profiles panel of the inspector. + */ + profileEnd(label?: string): void; + /** + * This method does not display anything unless used in the inspector. + * Adds an event with the label `label` to the Timeline panel of the inspector. + */ + timeStamp(label?: string): void; + } + /** + * The `console` module provides a simple debugging console that is similar to the + * JavaScript console mechanism provided by web browsers. + * + * The module exports two specific components: + * + * * A `Console` class with methods such as `console.log()`, `console.error()` and`console.warn()` that can be used to write to any Node.js stream. + * * A global `console` instance configured to write to `process.stdout` and `process.stderr`. The global `console` can be used without calling`require('console')`. + * + * _**Warning**_: The global console object's methods are neither consistently + * synchronous like the browser APIs they resemble, nor are they consistently + * asynchronous like all other Node.js streams. See the `note on process I/O` for + * more information. + * + * Example using the global `console`: + * + * ```js + * console.log('hello world'); + * // Prints: hello world, to stdout + * console.log('hello %s', 'world'); + * // Prints: hello world, to stdout + * console.error(new Error('Whoops, something bad happened')); + * // Prints error message and stack trace to stderr: + * // Error: Whoops, something bad happened + * // at [eval]:5:15 + * // at Script.runInThisContext (node:vm:132:18) + * // at Object.runInThisContext (node:vm:309:38) + * // at node:internal/process/execution:77:19 + * // at [eval]-wrapper:6:22 + * // at evalScript (node:internal/process/execution:76:60) + * // at node:internal/main/eval_string:23:3 + * + * const name = 'Will Robinson'; + * console.warn(`Danger ${name}! Danger!`); + * // Prints: Danger Will Robinson! 
Danger!, to stderr + * ``` + * + * Example using the `Console` class: + * + * ```js + * const out = getStreamSomehow(); + * const err = getStreamSomehow(); + * const myConsole = new console.Console(out, err); + * + * myConsole.log('hello world'); + * // Prints: hello world, to out + * myConsole.log('hello %s', 'world'); + * // Prints: hello world, to out + * myConsole.error(new Error('Whoops, something bad happened')); + * // Prints: [Error: Whoops, something bad happened], to err + * + * const name = 'Will Robinson'; + * myConsole.warn(`Danger ${name}! Danger!`); + * // Prints: Danger Will Robinson! Danger!, to err + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.4.2/lib/console.js) + */ + namespace console { + interface ConsoleConstructorOptions { + stdout: NodeJS.WritableStream; + stderr?: NodeJS.WritableStream | undefined; + ignoreErrors?: boolean | undefined; + colorMode?: boolean | 'auto' | undefined; + inspectOptions?: InspectOptions | undefined; + /** + * Set group indentation + * @default 2 + */ + groupIndentation?: number | undefined; + } + interface ConsoleConstructor { + prototype: Console; + new (stdout: NodeJS.WritableStream, stderr?: NodeJS.WritableStream, ignoreErrors?: boolean): Console; + new (options: ConsoleConstructorOptions): Console; + } + } + var console: Console; + } + export = globalThis.console; +} diff --git a/node_modules/@types/node/constants.d.ts b/node_modules/@types/node/constants.d.ts new file mode 100644 index 000000000..208020dcb --- /dev/null +++ b/node_modules/@types/node/constants.d.ts @@ -0,0 +1,18 @@ +/** @deprecated since v6.3.0 - use constants property exposed by the relevant module instead. */ +declare module 'constants' { + import { constants as osConstants, SignalConstants } from 'node:os'; + import { constants as cryptoConstants } from 'node:crypto'; + import { constants as fsConstants } from 'node:fs'; + + const exp: typeof osConstants.errno & + typeof osConstants.priority & + SignalConstants & + typeof cryptoConstants & + typeof fsConstants; + export = exp; +} + +declare module 'node:constants' { + import constants = require('constants'); + export = constants; +} diff --git a/node_modules/@types/node/crypto.d.ts b/node_modules/@types/node/crypto.d.ts new file mode 100644 index 000000000..f91f1c4c8 --- /dev/null +++ b/node_modules/@types/node/crypto.d.ts @@ -0,0 +1,3307 @@ +/** + * The `crypto` module provides cryptographic functionality that includes a set of + * wrappers for OpenSSL's hash, HMAC, cipher, decipher, sign, and verify functions. + * + * ```js + * const { createHmac } = await import('crypto'); + * + * const secret = 'abcdefg'; + * const hash = createHmac('sha256', secret) + * .update('I love cupcakes') + * .digest('hex'); + * console.log(hash); + * // Prints: + * // c0fa1bc00531bd78ef38c628449c5102aeabd49b5dc3a2a516ea6ea959d6658e + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/crypto.js) + */ +declare module 'crypto' { + import * as stream from 'node:stream'; + import { PeerCertificate } from 'node:tls'; + interface Certificate { + /** + * @deprecated + * @param spkac + * @returns The challenge component of the `spkac` data structure, + * which includes a public key and a challenge. + */ + exportChallenge(spkac: BinaryLike): Buffer; + /** + * @deprecated + * @param spkac + * @param encoding The encoding of the spkac string. + * @returns The public key component of the `spkac` data structure, + * which includes a public key and a challenge. 
+ */ + exportPublicKey(spkac: BinaryLike, encoding?: string): Buffer; + /** + * @deprecated + * @param spkac + * @returns `true` if the given `spkac` data structure is valid, + * `false` otherwise. + */ + verifySpkac(spkac: NodeJS.ArrayBufferView): boolean; + } + const Certificate: Certificate & { + /** @deprecated since v14.9.0 - Use static methods of `crypto.Certificate` instead. */ + new (): Certificate; + /** @deprecated since v14.9.0 - Use static methods of `crypto.Certificate` instead. */ + (): Certificate; + /** + * @param spkac + * @returns The challenge component of the `spkac` data structure, + * which includes a public key and a challenge. + */ + exportChallenge(spkac: BinaryLike): Buffer; + /** + * @param spkac + * @param encoding The encoding of the spkac string. + * @returns The public key component of the `spkac` data structure, + * which includes a public key and a challenge. + */ + exportPublicKey(spkac: BinaryLike, encoding?: string): Buffer; + /** + * @param spkac + * @returns `true` if the given `spkac` data structure is valid, + * `false` otherwise. + */ + verifySpkac(spkac: NodeJS.ArrayBufferView): boolean; + }; + namespace constants { + // https://nodejs.org/dist/latest-v10.x/docs/api/crypto.html#crypto_crypto_constants + const OPENSSL_VERSION_NUMBER: number; + /** Applies multiple bug workarounds within OpenSSL. See https://www.openssl.org/docs/man1.0.2/ssl/SSL_CTX_set_options.html for detail. */ + const SSL_OP_ALL: number; + /** Allows legacy insecure renegotiation between OpenSSL and unpatched clients or servers. See https://www.openssl.org/docs/man1.0.2/ssl/SSL_CTX_set_options.html. */ + const SSL_OP_ALLOW_UNSAFE_LEGACY_RENEGOTIATION: number; + /** Attempts to use the server's preferences instead of the client's when selecting a cipher. See https://www.openssl.org/docs/man1.0.2/ssl/SSL_CTX_set_options.html. */ + const SSL_OP_CIPHER_SERVER_PREFERENCE: number; + /** Instructs OpenSSL to use Cisco's "speshul" version of DTLS_BAD_VER. */ + const SSL_OP_CISCO_ANYCONNECT: number; + /** Instructs OpenSSL to turn on cookie exchange. */ + const SSL_OP_COOKIE_EXCHANGE: number; + /** Instructs OpenSSL to add server-hello extension from an early version of the cryptopro draft. */ + const SSL_OP_CRYPTOPRO_TLSEXT_BUG: number; + /** Instructs OpenSSL to disable a SSL 3.0/TLS 1.0 vulnerability workaround added in OpenSSL 0.9.6d. */ + const SSL_OP_DONT_INSERT_EMPTY_FRAGMENTS: number; + /** Instructs OpenSSL to always use the tmp_rsa key when performing RSA operations. */ + const SSL_OP_EPHEMERAL_RSA: number; + /** Allows initial connection to servers that do not support RI. */ + const SSL_OP_LEGACY_SERVER_CONNECT: number; + const SSL_OP_MICROSOFT_BIG_SSLV3_BUFFER: number; + const SSL_OP_MICROSOFT_SESS_ID_BUG: number; + /** Instructs OpenSSL to disable the workaround for a man-in-the-middle protocol-version vulnerability in the SSL 2.0 server implementation. */ + const SSL_OP_MSIE_SSLV2_RSA_PADDING: number; + const SSL_OP_NETSCAPE_CA_DN_BUG: number; + const SSL_OP_NETSCAPE_CHALLENGE_BUG: number; + const SSL_OP_NETSCAPE_DEMO_CIPHER_CHANGE_BUG: number; + const SSL_OP_NETSCAPE_REUSE_CIPHER_CHANGE_BUG: number; + /** Instructs OpenSSL to disable support for SSL/TLS compression. */ + const SSL_OP_NO_COMPRESSION: number; + const SSL_OP_NO_QUERY_MTU: number; + /** Instructs OpenSSL to always start a new session when performing renegotiation. 
*/ + const SSL_OP_NO_SESSION_RESUMPTION_ON_RENEGOTIATION: number; + const SSL_OP_NO_SSLv2: number; + const SSL_OP_NO_SSLv3: number; + const SSL_OP_NO_TICKET: number; + const SSL_OP_NO_TLSv1: number; + const SSL_OP_NO_TLSv1_1: number; + const SSL_OP_NO_TLSv1_2: number; + const SSL_OP_PKCS1_CHECK_1: number; + const SSL_OP_PKCS1_CHECK_2: number; + /** Instructs OpenSSL to always create a new key when using temporary/ephemeral DH parameters. */ + const SSL_OP_SINGLE_DH_USE: number; + /** Instructs OpenSSL to always create a new key when using temporary/ephemeral ECDH parameters. */ + const SSL_OP_SINGLE_ECDH_USE: number; + const SSL_OP_SSLEAY_080_CLIENT_DH_BUG: number; + const SSL_OP_SSLREF2_REUSE_CERT_TYPE_BUG: number; + const SSL_OP_TLS_BLOCK_PADDING_BUG: number; + const SSL_OP_TLS_D5_BUG: number; + /** Instructs OpenSSL to disable version rollback attack detection. */ + const SSL_OP_TLS_ROLLBACK_BUG: number; + const ENGINE_METHOD_RSA: number; + const ENGINE_METHOD_DSA: number; + const ENGINE_METHOD_DH: number; + const ENGINE_METHOD_RAND: number; + const ENGINE_METHOD_EC: number; + const ENGINE_METHOD_CIPHERS: number; + const ENGINE_METHOD_DIGESTS: number; + const ENGINE_METHOD_PKEY_METHS: number; + const ENGINE_METHOD_PKEY_ASN1_METHS: number; + const ENGINE_METHOD_ALL: number; + const ENGINE_METHOD_NONE: number; + const DH_CHECK_P_NOT_SAFE_PRIME: number; + const DH_CHECK_P_NOT_PRIME: number; + const DH_UNABLE_TO_CHECK_GENERATOR: number; + const DH_NOT_SUITABLE_GENERATOR: number; + const ALPN_ENABLED: number; + const RSA_PKCS1_PADDING: number; + const RSA_SSLV23_PADDING: number; + const RSA_NO_PADDING: number; + const RSA_PKCS1_OAEP_PADDING: number; + const RSA_X931_PADDING: number; + const RSA_PKCS1_PSS_PADDING: number; + /** Sets the salt length for RSA_PKCS1_PSS_PADDING to the digest size when signing or verifying. */ + const RSA_PSS_SALTLEN_DIGEST: number; + /** Sets the salt length for RSA_PKCS1_PSS_PADDING to the maximum permissible value when signing data. */ + const RSA_PSS_SALTLEN_MAX_SIGN: number; + /** Causes the salt length for RSA_PKCS1_PSS_PADDING to be determined automatically when verifying a signature. */ + const RSA_PSS_SALTLEN_AUTO: number; + const POINT_CONVERSION_COMPRESSED: number; + const POINT_CONVERSION_UNCOMPRESSED: number; + const POINT_CONVERSION_HYBRID: number; + /** Specifies the built-in default cipher list used by Node.js (colon-separated values). */ + const defaultCoreCipherList: string; + /** Specifies the active default cipher list used by the current Node.js process (colon-separated values). */ + const defaultCipherList: string; + } + interface HashOptions extends stream.TransformOptions { + /** + * For XOF hash functions such as `shake256`, the + * outputLength option can be used to specify the desired output length in bytes. + */ + outputLength?: number | undefined; + } + /** @deprecated since v10.0.0 */ + const fips: boolean; + /** + * Creates and returns a `Hash` object that can be used to generate hash digests + * using the given `algorithm`. Optional `options` argument controls stream + * behavior. For XOF hash functions such as `'shake256'`, the `outputLength` option + * can be used to specify the desired output length in bytes. + * + * The `algorithm` is dependent on the available algorithms supported by the + * version of OpenSSL on the platform. Examples are `'sha256'`, `'sha512'`, etc. 
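For data that is already in memory, `createHash()` does not need the streaming setup shown in the surrounding examples; `update()` and `digest()` can simply be chained. A minimal sketch:

```ts
import { createHash } from 'crypto';

// One-shot hashing of an in-memory string.
const digest = createHash('sha256')
  .update('some data to hash')
  .digest('hex');

console.log(digest);
// 6a2da20943931e9834fc12cfe5bb47bbd9ae43489a30726962b576f4e3993e50
```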
+ * On recent releases of OpenSSL, `openssl list -digest-algorithms`(`openssl list-message-digest-algorithms` for older versions of OpenSSL) will + * display the available digest algorithms. + * + * Example: generating the sha256 sum of a file + * + * ```js + * import { + * createReadStream + * } from 'fs'; + * import { argv } from 'process'; + * const { + * createHash + * } = await import('crypto'); + * + * const filename = argv[2]; + * + * const hash = createHash('sha256'); + * + * const input = createReadStream(filename); + * input.on('readable', () => { + * // Only one element is going to be produced by the + * // hash stream. + * const data = input.read(); + * if (data) + * hash.update(data); + * else { + * console.log(`${hash.digest('hex')} ${filename}`); + * } + * }); + * ``` + * @since v0.1.92 + * @param options `stream.transform` options + */ + function createHash(algorithm: string, options?: HashOptions): Hash; + /** + * Creates and returns an `Hmac` object that uses the given `algorithm` and `key`. + * Optional `options` argument controls stream behavior. + * + * The `algorithm` is dependent on the available algorithms supported by the + * version of OpenSSL on the platform. Examples are `'sha256'`, `'sha512'`, etc. + * On recent releases of OpenSSL, `openssl list -digest-algorithms`(`openssl list-message-digest-algorithms` for older versions of OpenSSL) will + * display the available digest algorithms. + * + * The `key` is the HMAC key used to generate the cryptographic HMAC hash. If it is + * a `KeyObject`, its type must be `secret`. + * + * Example: generating the sha256 HMAC of a file + * + * ```js + * import { + * createReadStream + * } from 'fs'; + * import { argv } from 'process'; + * const { + * createHmac + * } = await import('crypto'); + * + * const filename = argv[2]; + * + * const hmac = createHmac('sha256', 'a secret'); + * + * const input = createReadStream(filename); + * input.on('readable', () => { + * // Only one element is going to be produced by the + * // hash stream. + * const data = input.read(); + * if (data) + * hmac.update(data); + * else { + * console.log(`${hmac.digest('hex')} ${filename}`); + * } + * }); + * ``` + * @since v0.1.94 + * @param options `stream.transform` options + */ + function createHmac(algorithm: string, key: BinaryLike | KeyObject, options?: stream.TransformOptions): Hmac; + // https://nodejs.org/api/buffer.html#buffer_buffers_and_character_encodings + type BinaryToTextEncoding = 'base64' | 'base64url' | 'hex'; + type CharacterEncoding = 'utf8' | 'utf-8' | 'utf16le' | 'latin1'; + type LegacyCharacterEncoding = 'ascii' | 'binary' | 'ucs2' | 'ucs-2'; + type Encoding = BinaryToTextEncoding | CharacterEncoding | LegacyCharacterEncoding; + type ECDHKeyFormat = 'compressed' | 'uncompressed' | 'hybrid'; + /** + * The `Hash` class is a utility for creating hash digests of data. It can be + * used in one of two ways: + * + * * As a `stream` that is both readable and writable, where data is written + * to produce a computed hash digest on the readable side, or + * * Using the `hash.update()` and `hash.digest()` methods to produce the + * computed hash. + * + * The {@link createHash} method is used to create `Hash` instances. `Hash`objects are not to be created directly using the `new` keyword. 
+ * + * Example: Using `Hash` objects as streams: + * + * ```js + * const { + * createHash + * } = await import('crypto'); + * + * const hash = createHash('sha256'); + * + * hash.on('readable', () => { + * // Only one element is going to be produced by the + * // hash stream. + * const data = hash.read(); + * if (data) { + * console.log(data.toString('hex')); + * // Prints: + * // 6a2da20943931e9834fc12cfe5bb47bbd9ae43489a30726962b576f4e3993e50 + * } + * }); + * + * hash.write('some data to hash'); + * hash.end(); + * ``` + * + * Example: Using `Hash` and piped streams: + * + * ```js + * import { createReadStream } from 'fs'; + * import { stdout } from 'process'; + * const { createHash } = await import('crypto'); + * + * const hash = createHash('sha256'); + * + * const input = createReadStream('test.js'); + * input.pipe(hash).setEncoding('hex').pipe(stdout); + * ``` + * + * Example: Using the `hash.update()` and `hash.digest()` methods: + * + * ```js + * const { + * createHash + * } = await import('crypto'); + * + * const hash = createHash('sha256'); + * + * hash.update('some data to hash'); + * console.log(hash.digest('hex')); + * // Prints: + * // 6a2da20943931e9834fc12cfe5bb47bbd9ae43489a30726962b576f4e3993e50 + * ``` + * @since v0.1.92 + */ + class Hash extends stream.Transform { + private constructor(); + /** + * Creates a new `Hash` object that contains a deep copy of the internal state + * of the current `Hash` object. + * + * The optional `options` argument controls stream behavior. For XOF hash + * functions such as `'shake256'`, the `outputLength` option can be used to + * specify the desired output length in bytes. + * + * An error is thrown when an attempt is made to copy the `Hash` object after + * its `hash.digest()` method has been called. + * + * ```js + * // Calculate a rolling hash. + * const { + * createHash + * } = await import('crypto'); + * + * const hash = createHash('sha256'); + * + * hash.update('one'); + * console.log(hash.copy().digest('hex')); + * + * hash.update('two'); + * console.log(hash.copy().digest('hex')); + * + * hash.update('three'); + * console.log(hash.copy().digest('hex')); + * + * // Etc. + * ``` + * @since v13.1.0 + * @param options `stream.transform` options + */ + copy(options?: stream.TransformOptions): Hash; + /** + * Updates the hash content with the given `data`, the encoding of which + * is given in `inputEncoding`. + * If `encoding` is not provided, and the `data` is a string, an + * encoding of `'utf8'` is enforced. If `data` is a `Buffer`, `TypedArray`, or`DataView`, then `inputEncoding` is ignored. + * + * This can be called many times with new data as it is streamed. + * @since v0.1.92 + * @param inputEncoding The `encoding` of the `data` string. + */ + update(data: BinaryLike): Hash; + update(data: string, inputEncoding: Encoding): Hash; + /** + * Calculates the digest of all of the data passed to be hashed (using the `hash.update()` method). + * If `encoding` is provided a string will be returned; otherwise + * a `Buffer` is returned. + * + * The `Hash` object can not be used again after `hash.digest()` method has been + * called. Multiple calls will cause an error to be thrown. + * @since v0.1.92 + * @param encoding The `encoding` of the return value. + */ + digest(): Buffer; + digest(encoding: BinaryToTextEncoding): string; + } + /** + * The `Hmac` class is a utility for creating cryptographic HMAC digests. 
It can + * be used in one of two ways: + * + * * As a `stream` that is both readable and writable, where data is written + * to produce a computed HMAC digest on the readable side, or + * * Using the `hmac.update()` and `hmac.digest()` methods to produce the + * computed HMAC digest. + * + * The {@link createHmac} method is used to create `Hmac` instances. `Hmac`objects are not to be created directly using the `new` keyword. + * + * Example: Using `Hmac` objects as streams: + * + * ```js + * const { + * createHmac + * } = await import('crypto'); + * + * const hmac = createHmac('sha256', 'a secret'); + * + * hmac.on('readable', () => { + * // Only one element is going to be produced by the + * // hash stream. + * const data = hmac.read(); + * if (data) { + * console.log(data.toString('hex')); + * // Prints: + * // 7fd04df92f636fd450bc841c9418e5825c17f33ad9c87c518115a45971f7f77e + * } + * }); + * + * hmac.write('some data to hash'); + * hmac.end(); + * ``` + * + * Example: Using `Hmac` and piped streams: + * + * ```js + * import { createReadStream } from 'fs'; + * import { stdout } from 'process'; + * const { + * createHmac + * } = await import('crypto'); + * + * const hmac = createHmac('sha256', 'a secret'); + * + * const input = createReadStream('test.js'); + * input.pipe(hmac).pipe(stdout); + * ``` + * + * Example: Using the `hmac.update()` and `hmac.digest()` methods: + * + * ```js + * const { + * createHmac + * } = await import('crypto'); + * + * const hmac = createHmac('sha256', 'a secret'); + * + * hmac.update('some data to hash'); + * console.log(hmac.digest('hex')); + * // Prints: + * // 7fd04df92f636fd450bc841c9418e5825c17f33ad9c87c518115a45971f7f77e + * ``` + * @since v0.1.94 + */ + class Hmac extends stream.Transform { + private constructor(); + /** + * Updates the `Hmac` content with the given `data`, the encoding of which + * is given in `inputEncoding`. + * If `encoding` is not provided, and the `data` is a string, an + * encoding of `'utf8'` is enforced. If `data` is a `Buffer`, `TypedArray`, or`DataView`, then `inputEncoding` is ignored. + * + * This can be called many times with new data as it is streamed. + * @since v0.1.94 + * @param inputEncoding The `encoding` of the `data` string. + */ + update(data: BinaryLike): Hmac; + update(data: string, inputEncoding: Encoding): Hmac; + /** + * Calculates the HMAC digest of all of the data passed using `hmac.update()`. + * If `encoding` is + * provided a string is returned; otherwise a `Buffer` is returned; + * + * The `Hmac` object can not be used again after `hmac.digest()` has been + * called. Multiple calls to `hmac.digest()` will result in an error being thrown. + * @since v0.1.94 + * @param encoding The `encoding` of the return value. 
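One point the `Hmac` examples above leave implicit: when an HMAC tag is used to authenticate a message, the received tag should be compared in constant time. A hedged sketch, assuming `timingSafeEqual()` from the same `crypto` module (not shown in this excerpt) and with the `sign`/`verify` helper names invented for illustration:

```ts
import { createHmac, timingSafeEqual } from 'crypto';

const secret = 'abcdefg';

// Hypothetical helper: compute an HMAC-SHA256 tag for a message.
function sign(message: string): Buffer {
  return createHmac('sha256', secret).update(message).digest(); // digest() without an encoding returns a Buffer
}

// Hypothetical helper: check a received tag in constant time.
function verify(message: string, receivedTag: Buffer): boolean {
  const expected = sign(message);
  // timingSafeEqual throws on length mismatch, so check lengths first.
  return expected.length === receivedTag.length && timingSafeEqual(expected, receivedTag);
}

const tag = sign('I love cupcakes');
console.log(verify('I love cupcakes', tag)); // true
console.log(verify('I love cupcake', tag));  // false
```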
+ */ + digest(): Buffer; + digest(encoding: BinaryToTextEncoding): string; + } + type KeyObjectType = 'secret' | 'public' | 'private'; + interface KeyExportOptions { + type: 'pkcs1' | 'spki' | 'pkcs8' | 'sec1'; + format: T; + cipher?: string | undefined; + passphrase?: string | Buffer | undefined; + } + interface JwkKeyExportOptions { + format: 'jwk'; + } + interface JsonWebKey { + crv?: string | undefined; + d?: string | undefined; + dp?: string | undefined; + dq?: string | undefined; + e?: string | undefined; + k?: string | undefined; + kty?: string | undefined; + n?: string | undefined; + p?: string | undefined; + q?: string | undefined; + qi?: string | undefined; + x?: string | undefined; + y?: string | undefined; + [key: string]: unknown; + } + interface AsymmetricKeyDetails { + /** + * Key size in bits (RSA, DSA). + */ + modulusLength?: number | undefined; + /** + * Public exponent (RSA). + */ + publicExponent?: bigint | undefined; + /** + * Name of the message digest (RSA-PSS). + */ + hashAlgorithm?: string | undefined; + /** + * Name of the message digest used by MGF1 (RSA-PSS). + */ + mgf1HashAlgorithm?: string | undefined; + /** + * Minimal salt length in bytes (RSA-PSS). + */ + saltLength?: number | undefined; + /** + * Size of q in bits (DSA). + */ + divisorLength?: number | undefined; + /** + * Name of the curve (EC). + */ + namedCurve?: string | undefined; + } + interface JwkKeyExportOptions { + format: 'jwk'; + } + /** + * Node.js uses a `KeyObject` class to represent a symmetric or asymmetric key, + * and each kind of key exposes different functions. The {@link createSecretKey}, {@link createPublicKey} and {@link createPrivateKey} methods are used to create `KeyObject`instances. `KeyObject` + * objects are not to be created directly using the `new`keyword. + * + * Most applications should consider using the new `KeyObject` API instead of + * passing keys as strings or `Buffer`s due to improved security features. + * + * `KeyObject` instances can be passed to other threads via `postMessage()`. + * The receiver obtains a cloned `KeyObject`, and the `KeyObject` does not need to + * be listed in the `transferList` argument. + * @since v11.6.0 + */ + class KeyObject { + private constructor(); + /** + * Example: Converting a `CryptoKey` instance to a `KeyObject`: + * + * ```js + * const { webcrypto, KeyObject } = await import('crypto'); + * const { subtle } = webcrypto; + * + * const key = await subtle.generateKey({ + * name: 'HMAC', + * hash: 'SHA-256', + * length: 256 + * }, true, ['sign', 'verify']); + * + * const keyObject = KeyObject.from(key); + * console.log(keyObject.symmetricKeySize); + * // Prints: 32 (symmetric key size in bytes) + * ``` + * @since v15.0.0 + */ + static from(key: webcrypto.CryptoKey): KeyObject; + /** + * For asymmetric keys, this property represents the type of the key. Supported key + * types are: + * + * * `'rsa'` (OID 1.2.840.113549.1.1.1) + * * `'rsa-pss'` (OID 1.2.840.113549.1.1.10) + * * `'dsa'` (OID 1.2.840.10040.4.1) + * * `'ec'` (OID 1.2.840.10045.2.1) + * * `'x25519'` (OID 1.3.101.110) + * * `'x448'` (OID 1.3.101.111) + * * `'ed25519'` (OID 1.3.101.112) + * * `'ed448'` (OID 1.3.101.113) + * * `'dh'` (OID 1.2.840.113549.1.3.1) + * + * This property is `undefined` for unrecognized `KeyObject` types and symmetric + * keys. + * @since v11.6.0 + */ + asymmetricKeyType?: KeyType | undefined; + /** + * For asymmetric keys, this property represents the size of the embedded key in + * bytes. This property is `undefined` for symmetric keys. 
+ */ + asymmetricKeySize?: number | undefined; + /** + * This property exists only on asymmetric keys. Depending on the type of the key, + * this object contains information about the key. None of the information obtained + * through this property can be used to uniquely identify a key or to compromise + * the security of the key. + * + * For RSA-PSS keys, if the key material contains a `RSASSA-PSS-params` sequence, + * the `hashAlgorithm`, `mgf1HashAlgorithm`, and `saltLength` properties will be + * set. + * + * Other key details might be exposed via this API using additional attributes. + * @since v15.7.0 + */ + asymmetricKeyDetails?: AsymmetricKeyDetails | undefined; + /** + * For symmetric keys, the following encoding options can be used: + * + * For public keys, the following encoding options can be used: + * + * For private keys, the following encoding options can be used: + * + * The result type depends on the selected encoding format, when PEM the + * result is a string, when DER it will be a buffer containing the data + * encoded as DER, when [JWK](https://tools.ietf.org/html/rfc7517) it will be an object. + * + * When [JWK](https://tools.ietf.org/html/rfc7517) encoding format was selected, all other encoding options are + * ignored. + * + * PKCS#1, SEC1, and PKCS#8 type keys can be encrypted by using a combination of + * the `cipher` and `format` options. The PKCS#8 `type` can be used with any`format` to encrypt any key algorithm (RSA, EC, or DH) by specifying a`cipher`. PKCS#1 and SEC1 can only be + * encrypted by specifying a `cipher`when the PEM `format` is used. For maximum compatibility, use PKCS#8 for + * encrypted private keys. Since PKCS#8 defines its own + * encryption mechanism, PEM-level encryption is not supported when encrypting + * a PKCS#8 key. See [RFC 5208](https://www.rfc-editor.org/rfc/rfc5208.txt) for PKCS#8 encryption and [RFC 1421](https://www.rfc-editor.org/rfc/rfc1421.txt) for + * PKCS#1 and SEC1 encryption. + * @since v11.6.0 + */ + export(options: KeyExportOptions<'pem'>): string | Buffer; + export(options?: KeyExportOptions<'der'>): Buffer; + export(options?: JwkKeyExportOptions): JsonWebKey; + /** + * For secret keys, this property represents the size of the key in bytes. This + * property is `undefined` for asymmetric keys. + * @since v11.6.0 + */ + symmetricKeySize?: number | undefined; + /** + * Depending on the type of this `KeyObject`, this property is either`'secret'` for secret (symmetric) keys, `'public'` for public (asymmetric) keys + * or `'private'` for private (asymmetric) keys. + * @since v11.6.0 + */ + type: KeyObjectType; + } + type CipherCCMTypes = 'aes-128-ccm' | 'aes-192-ccm' | 'aes-256-ccm' | 'chacha20-poly1305'; + type CipherGCMTypes = 'aes-128-gcm' | 'aes-192-gcm' | 'aes-256-gcm'; + type BinaryLike = string | NodeJS.ArrayBufferView; + type CipherKey = BinaryLike | KeyObject; + interface CipherCCMOptions extends stream.TransformOptions { + authTagLength: number; + } + interface CipherGCMOptions extends stream.TransformOptions { + authTagLength?: number | undefined; + } + /** + * Creates and returns a `Cipher` object that uses the given `algorithm` and`password`. + * + * The `options` argument controls stream behavior and is optional except when a + * cipher in CCM or OCB mode is used (e.g. `'aes-128-ccm'`). In that case, the`authTagLength` option is required and specifies the length of the + * authentication tag in bytes, see `CCM mode`. 
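As a brief sketch of the `KeyObject` surface declared above, using `createSecretKey()` (referenced in the class comment) and `randomBytes()` (part of the same module, though not shown in this excerpt):

```ts
import { createSecretKey, randomBytes } from 'crypto';

// Wrap 32 random bytes in a KeyObject instead of passing raw key material around.
const key = createSecretKey(randomBytes(32));

console.log(key.type);              // 'secret'
console.log(key.symmetricKeySize);  // 32
console.log(key.asymmetricKeyType); // undefined (only set for asymmetric keys)

// For secret keys, export() with no options returns the raw key bytes as a Buffer.
const raw = key.export();
console.log(raw.length); // 32
```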
In GCM mode, the `authTagLength`option is not required but can be used to set the length of the authentication + * tag that will be returned by `getAuthTag()` and defaults to 16 bytes. + * + * The `algorithm` is dependent on OpenSSL, examples are `'aes192'`, etc. On + * recent OpenSSL releases, `openssl list -cipher-algorithms`(`openssl list-cipher-algorithms` for older versions of OpenSSL) will + * display the available cipher algorithms. + * + * The `password` is used to derive the cipher key and initialization vector (IV). + * The value must be either a `'latin1'` encoded string, a `Buffer`, a`TypedArray`, or a `DataView`. + * + * The implementation of `crypto.createCipher()` derives keys using the OpenSSL + * function [`EVP_BytesToKey`](https://www.openssl.org/docs/man1.1.0/crypto/EVP_BytesToKey.html) with the digest algorithm set to MD5, one + * iteration, and no salt. The lack of salt allows dictionary attacks as the same + * password always creates the same key. The low iteration count and + * non-cryptographically secure hash algorithm allow passwords to be tested very + * rapidly. + * + * In line with OpenSSL's recommendation to use a more modern algorithm instead of [`EVP_BytesToKey`](https://www.openssl.org/docs/man1.1.0/crypto/EVP_BytesToKey.html) it is recommended that + * developers derive a key and IV on + * their own using {@link scrypt} and to use {@link createCipheriv} to create the `Cipher` object. Users should not use ciphers with counter mode + * (e.g. CTR, GCM, or CCM) in `crypto.createCipher()`. A warning is emitted when + * they are used in order to avoid the risk of IV reuse that causes + * vulnerabilities. For the case when IV is reused in GCM, see [Nonce-Disrespecting Adversaries](https://github.com/nonce-disrespect/nonce-disrespect) for details. + * @since v0.1.94 + * @deprecated Since v10.0.0 - Use {@link createCipheriv} instead. + * @param options `stream.transform` options + */ + function createCipher(algorithm: CipherCCMTypes, password: BinaryLike, options: CipherCCMOptions): CipherCCM; + /** @deprecated since v10.0.0 use `createCipheriv()` */ + function createCipher(algorithm: CipherGCMTypes, password: BinaryLike, options?: CipherGCMOptions): CipherGCM; + /** @deprecated since v10.0.0 use `createCipheriv()` */ + function createCipher(algorithm: string, password: BinaryLike, options?: stream.TransformOptions): Cipher; + /** + * Creates and returns a `Cipher` object, with the given `algorithm`, `key` and + * initialization vector (`iv`). + * + * The `options` argument controls stream behavior and is optional except when a + * cipher in CCM or OCB mode is used (e.g. `'aes-128-ccm'`). In that case, the`authTagLength` option is required and specifies the length of the + * authentication tag in bytes, see `CCM mode`. In GCM mode, the `authTagLength`option is not required but can be used to set the length of the authentication + * tag that will be returned by `getAuthTag()` and defaults to 16 bytes. + * + * The `algorithm` is dependent on OpenSSL, examples are `'aes192'`, etc. On + * recent OpenSSL releases, `openssl list -cipher-algorithms`(`openssl list-cipher-algorithms` for older versions of OpenSSL) will + * display the available cipher algorithms. + * + * The `key` is the raw key used by the `algorithm` and `iv` is an [initialization vector](https://en.wikipedia.org/wiki/Initialization_vector). Both arguments must be `'utf8'` encoded + * strings,`Buffers`, `TypedArray`, or `DataView`s. The `key` may optionally be + * a `KeyObject` of type `secret`. 
If the cipher does not need + * an initialization vector, `iv` may be `null`. + * + * When passing strings for `key` or `iv`, please consider `caveats when using strings as inputs to cryptographic APIs`. + * + * Initialization vectors should be unpredictable and unique; ideally, they will be + * cryptographically random. They do not have to be secret: IVs are typically just + * added to ciphertext messages unencrypted. It may sound contradictory that + * something has to be unpredictable and unique, but does not have to be secret; + * remember that an attacker must not be able to predict ahead of time what a + * given IV will be. + * @since v0.1.94 + * @param options `stream.transform` options + */ + function createCipheriv(algorithm: CipherCCMTypes, key: CipherKey, iv: BinaryLike | null, options: CipherCCMOptions): CipherCCM; + function createCipheriv(algorithm: CipherGCMTypes, key: CipherKey, iv: BinaryLike | null, options?: CipherGCMOptions): CipherGCM; + function createCipheriv(algorithm: string, key: CipherKey, iv: BinaryLike | null, options?: stream.TransformOptions): Cipher; + /** + * Instances of the `Cipher` class are used to encrypt data. The class can be + * used in one of two ways: + * + * * As a `stream` that is both readable and writable, where plain unencrypted + * data is written to produce encrypted data on the readable side, or + * * Using the `cipher.update()` and `cipher.final()` methods to produce + * the encrypted data. + * + * The {@link createCipher} or {@link createCipheriv} methods are + * used to create `Cipher` instances. `Cipher` objects are not to be created + * directly using the `new` keyword. + * + * Example: Using `Cipher` objects as streams: + * + * ```js + * const { + * scrypt, + * randomFill, + * createCipheriv + * } = await import('crypto'); + * + * const algorithm = 'aes-192-cbc'; + * const password = 'Password used to generate key'; + * + * // First, we'll generate the key. The key length is dependent on the algorithm. + * // In this case for aes192, it is 24 bytes (192 bits). + * scrypt(password, 'salt', 24, (err, key) => { + * if (err) throw err; + * // Then, we'll generate a random initialization vector + * randomFill(new Uint8Array(16), (err, iv) => { + * if (err) throw err; + * + * // Once we have the key and iv, we can create and use the cipher... + * const cipher = createCipheriv(algorithm, key, iv); + * + * let encrypted = ''; + * cipher.setEncoding('hex'); + * + * cipher.on('data', (chunk) => encrypted += chunk); + * cipher.on('end', () => console.log(encrypted)); + * + * cipher.write('some clear text data'); + * cipher.end(); + * }); + * }); + * ``` + * + * Example: Using `Cipher` and piped streams: + * + * ```js + * import { + * createReadStream, + * createWriteStream, + * } from 'fs'; + * + * import { + * pipeline + * } from 'stream'; + * + * const { + * scrypt, + * randomFill, + * createCipheriv + * } = await import('crypto'); + * + * const algorithm = 'aes-192-cbc'; + * const password = 'Password used to generate key'; + * + * // First, we'll generate the key. The key length is dependent on the algorithm. + * // In this case for aes192, it is 24 bytes (192 bits). 
+ * scrypt(password, 'salt', 24, (err, key) => { + * if (err) throw err; + * // Then, we'll generate a random initialization vector + * randomFill(new Uint8Array(16), (err, iv) => { + * if (err) throw err; + * + * const cipher = createCipheriv(algorithm, key, iv); + * + * const input = createReadStream('test.js'); + * const output = createWriteStream('test.enc'); + * + * pipeline(input, cipher, output, (err) => { + * if (err) throw err; + * }); + * }); + * }); + * ``` + * + * Example: Using the `cipher.update()` and `cipher.final()` methods: + * + * ```js + * const { + * scrypt, + * randomFill, + * createCipheriv + * } = await import('crypto'); + * + * const algorithm = 'aes-192-cbc'; + * const password = 'Password used to generate key'; + * + * // First, we'll generate the key. The key length is dependent on the algorithm. + * // In this case for aes192, it is 24 bytes (192 bits). + * scrypt(password, 'salt', 24, (err, key) => { + * if (err) throw err; + * // Then, we'll generate a random initialization vector + * randomFill(new Uint8Array(16), (err, iv) => { + * if (err) throw err; + * + * const cipher = createCipheriv(algorithm, key, iv); + * + * let encrypted = cipher.update('some clear text data', 'utf8', 'hex'); + * encrypted += cipher.final('hex'); + * console.log(encrypted); + * }); + * }); + * ``` + * @since v0.1.94 + */ + class Cipher extends stream.Transform { + private constructor(); + /** + * Updates the cipher with `data`. If the `inputEncoding` argument is given, + * the `data`argument is a string using the specified encoding. If the `inputEncoding`argument is not given, `data` must be a `Buffer`, `TypedArray`, or`DataView`. If `data` is a `Buffer`, + * `TypedArray`, or `DataView`, then`inputEncoding` is ignored. + * + * The `outputEncoding` specifies the output format of the enciphered + * data. If the `outputEncoding`is specified, a string using the specified encoding is returned. If no`outputEncoding` is provided, a `Buffer` is returned. + * + * The `cipher.update()` method can be called multiple times with new data until `cipher.final()` is called. Calling `cipher.update()` after `cipher.final()` will result in an error being + * thrown. + * @since v0.1.94 + * @param inputEncoding The `encoding` of the data. + * @param outputEncoding The `encoding` of the return value. + */ + update(data: BinaryLike): Buffer; + update(data: string, inputEncoding: Encoding): Buffer; + update(data: NodeJS.ArrayBufferView, inputEncoding: undefined, outputEncoding: Encoding): string; + update(data: string, inputEncoding: Encoding | undefined, outputEncoding: Encoding): string; + /** + * Once the `cipher.final()` method has been called, the `Cipher` object can no + * longer be used to encrypt data. Attempts to call `cipher.final()` more than + * once will result in an error being thrown. + * @since v0.1.94 + * @param outputEncoding The `encoding` of the return value. + * @return Any remaining enciphered contents. If `outputEncoding` is specified, a string is returned. If an `outputEncoding` is not provided, a {@link Buffer} is returned. + */ + final(): Buffer; + final(outputEncoding: BufferEncoding): string; + /** + * When using block encryption algorithms, the `Cipher` class will automatically + * add padding to the input data to the appropriate block size. To disable the + * default padding call `cipher.setAutoPadding(false)`. + * + * When `autoPadding` is `false`, the length of the entire input data must be a + * multiple of the cipher's block size or `cipher.final()` will throw an error. 
+ * Disabling automatic padding is useful for non-standard padding, for instance + * using `0x0` instead of PKCS padding. + * + * The `cipher.setAutoPadding()` method must be called before `cipher.final()`. + * @since v0.7.1 + * @param [autoPadding=true] + * @return for method chaining. + */ + setAutoPadding(autoPadding?: boolean): this; + } + interface CipherCCM extends Cipher { + setAAD( + buffer: NodeJS.ArrayBufferView, + options: { + plaintextLength: number; + } + ): this; + getAuthTag(): Buffer; + } + interface CipherGCM extends Cipher { + setAAD( + buffer: NodeJS.ArrayBufferView, + options?: { + plaintextLength: number; + } + ): this; + getAuthTag(): Buffer; + } + /** + * Creates and returns a `Decipher` object that uses the given `algorithm` and`password` (key). + * + * The `options` argument controls stream behavior and is optional except when a + * cipher in CCM or OCB mode is used (e.g. `'aes-128-ccm'`). In that case, the`authTagLength` option is required and specifies the length of the + * authentication tag in bytes, see `CCM mode`. + * + * The implementation of `crypto.createDecipher()` derives keys using the OpenSSL + * function [`EVP_BytesToKey`](https://www.openssl.org/docs/man1.1.0/crypto/EVP_BytesToKey.html) with the digest algorithm set to MD5, one + * iteration, and no salt. The lack of salt allows dictionary attacks as the same + * password always creates the same key. The low iteration count and + * non-cryptographically secure hash algorithm allow passwords to be tested very + * rapidly. + * + * In line with OpenSSL's recommendation to use a more modern algorithm instead of [`EVP_BytesToKey`](https://www.openssl.org/docs/man1.1.0/crypto/EVP_BytesToKey.html) it is recommended that + * developers derive a key and IV on + * their own using {@link scrypt} and to use {@link createDecipheriv} to create the `Decipher` object. + * @since v0.1.94 + * @deprecated Since v10.0.0 - Use {@link createDecipheriv} instead. + * @param options `stream.transform` options + */ + function createDecipher(algorithm: CipherCCMTypes, password: BinaryLike, options: CipherCCMOptions): DecipherCCM; + /** @deprecated since v10.0.0 use `createDecipheriv()` */ + function createDecipher(algorithm: CipherGCMTypes, password: BinaryLike, options?: CipherGCMOptions): DecipherGCM; + /** @deprecated since v10.0.0 use `createDecipheriv()` */ + function createDecipher(algorithm: string, password: BinaryLike, options?: stream.TransformOptions): Decipher; + /** + * Creates and returns a `Decipher` object that uses the given `algorithm`, `key`and initialization vector (`iv`). + * + * The `options` argument controls stream behavior and is optional except when a + * cipher in CCM or OCB mode is used (e.g. `'aes-128-ccm'`). In that case, the`authTagLength` option is required and specifies the length of the + * authentication tag in bytes, see `CCM mode`. In GCM mode, the `authTagLength`option is not required but can be used to restrict accepted authentication tags + * to those with the specified length. + * + * The `algorithm` is dependent on OpenSSL, examples are `'aes192'`, etc. On + * recent OpenSSL releases, `openssl list -cipher-algorithms`(`openssl list-cipher-algorithms` for older versions of OpenSSL) will + * display the available cipher algorithms. + * + * The `key` is the raw key used by the `algorithm` and `iv` is an [initialization vector](https://en.wikipedia.org/wiki/Initialization_vector). Both arguments must be `'utf8'` encoded + * strings,`Buffers`, `TypedArray`, or `DataView`s. 
The `key` may optionally be + * a `KeyObject` of type `secret`. If the cipher does not need + * an initialization vector, `iv` may be `null`. + * + * When passing strings for `key` or `iv`, please consider `caveats when using strings as inputs to cryptographic APIs`. + * + * Initialization vectors should be unpredictable and unique; ideally, they will be + * cryptographically random. They do not have to be secret: IVs are typically just + * added to ciphertext messages unencrypted. It may sound contradictory that + * something has to be unpredictable and unique, but does not have to be secret; + * remember that an attacker must not be able to predict ahead of time what a given + * IV will be. + * @since v0.1.94 + * @param options `stream.transform` options + */ + function createDecipheriv(algorithm: CipherCCMTypes, key: CipherKey, iv: BinaryLike | null, options: CipherCCMOptions): DecipherCCM; + function createDecipheriv(algorithm: CipherGCMTypes, key: CipherKey, iv: BinaryLike | null, options?: CipherGCMOptions): DecipherGCM; + function createDecipheriv(algorithm: string, key: CipherKey, iv: BinaryLike | null, options?: stream.TransformOptions): Decipher; + /** + * Instances of the `Decipher` class are used to decrypt data. The class can be + * used in one of two ways: + * + * * As a `stream` that is both readable and writable, where plain encrypted + * data is written to produce unencrypted data on the readable side, or + * * Using the `decipher.update()` and `decipher.final()` methods to + * produce the unencrypted data. + * + * The {@link createDecipher} or {@link createDecipheriv} methods are + * used to create `Decipher` instances. `Decipher` objects are not to be created + * directly using the `new` keyword. + * + * Example: Using `Decipher` objects as streams: + * + * ```js + * import { Buffer } from 'buffer'; + * const { + * scryptSync, + * createDecipheriv + * } = await import('crypto'); + * + * const algorithm = 'aes-192-cbc'; + * const password = 'Password used to generate key'; + * // Key length is dependent on the algorithm. In this case for aes192, it is + * // 24 bytes (192 bits). + * // Use the async `crypto.scrypt()` instead. + * const key = scryptSync(password, 'salt', 24); + * // The IV is usually passed along with the ciphertext. + * const iv = Buffer.alloc(16, 0); // Initialization vector. + * + * const decipher = createDecipheriv(algorithm, key, iv); + * + * let decrypted = ''; + * decipher.on('readable', () => { + * while (null !== (chunk = decipher.read())) { + * decrypted += chunk.toString('utf8'); + * } + * }); + * decipher.on('end', () => { + * console.log(decrypted); + * // Prints: some clear text data + * }); + * + * // Encrypted with same algorithm, key and iv. + * const encrypted = + * 'e5f79c5915c02171eec6b212d5520d44480993d7d622a7c4c2da32f6efda0ffa'; + * decipher.write(encrypted, 'hex'); + * decipher.end(); + * ``` + * + * Example: Using `Decipher` and piped streams: + * + * ```js + * import { + * createReadStream, + * createWriteStream, + * } from 'fs'; + * import { Buffer } from 'buffer'; + * const { + * scryptSync, + * createDecipheriv + * } = await import('crypto'); + * + * const algorithm = 'aes-192-cbc'; + * const password = 'Password used to generate key'; + * // Use the async `crypto.scrypt()` instead. + * const key = scryptSync(password, 'salt', 24); + * // The IV is usually passed along with the ciphertext. + * const iv = Buffer.alloc(16, 0); // Initialization vector. 
+ * + * const decipher = createDecipheriv(algorithm, key, iv); + * + * const input = createReadStream('test.enc'); + * const output = createWriteStream('test.js'); + * + * input.pipe(decipher).pipe(output); + * ``` + * + * Example: Using the `decipher.update()` and `decipher.final()` methods: + * + * ```js + * import { Buffer } from 'buffer'; + * const { + * scryptSync, + * createDecipheriv + * } = await import('crypto'); + * + * const algorithm = 'aes-192-cbc'; + * const password = 'Password used to generate key'; + * // Use the async `crypto.scrypt()` instead. + * const key = scryptSync(password, 'salt', 24); + * // The IV is usually passed along with the ciphertext. + * const iv = Buffer.alloc(16, 0); // Initialization vector. + * + * const decipher = createDecipheriv(algorithm, key, iv); + * + * // Encrypted using same algorithm, key and iv. + * const encrypted = + * 'e5f79c5915c02171eec6b212d5520d44480993d7d622a7c4c2da32f6efda0ffa'; + * let decrypted = decipher.update(encrypted, 'hex', 'utf8'); + * decrypted += decipher.final('utf8'); + * console.log(decrypted); + * // Prints: some clear text data + * ``` + * @since v0.1.94 + */ + class Decipher extends stream.Transform { + private constructor(); + /** + * Updates the decipher with `data`. If the `inputEncoding` argument is given, + * the `data`argument is a string using the specified encoding. If the `inputEncoding`argument is not given, `data` must be a `Buffer`. If `data` is a `Buffer` then `inputEncoding` is + * ignored. + * + * The `outputEncoding` specifies the output format of the enciphered + * data. If the `outputEncoding`is specified, a string using the specified encoding is returned. If no`outputEncoding` is provided, a `Buffer` is returned. + * + * The `decipher.update()` method can be called multiple times with new data until `decipher.final()` is called. Calling `decipher.update()` after `decipher.final()` will result in an error + * being thrown. + * @since v0.1.94 + * @param inputEncoding The `encoding` of the `data` string. + * @param outputEncoding The `encoding` of the return value. + */ + update(data: NodeJS.ArrayBufferView): Buffer; + update(data: string, inputEncoding: Encoding): Buffer; + update(data: NodeJS.ArrayBufferView, inputEncoding: undefined, outputEncoding: Encoding): string; + update(data: string, inputEncoding: Encoding | undefined, outputEncoding: Encoding): string; + /** + * Once the `decipher.final()` method has been called, the `Decipher` object can + * no longer be used to decrypt data. Attempts to call `decipher.final()` more + * than once will result in an error being thrown. + * @since v0.1.94 + * @param outputEncoding The `encoding` of the return value. + * @return Any remaining deciphered contents. If `outputEncoding` is specified, a string is returned. If an `outputEncoding` is not provided, a {@link Buffer} is returned. + */ + final(): Buffer; + final(outputEncoding: BufferEncoding): string; + /** + * When data has been encrypted without standard block padding, calling`decipher.setAutoPadding(false)` will disable automatic padding to prevent `decipher.final()` from checking for and + * removing padding. + * + * Turning auto padding off will only work if the input data's length is a + * multiple of the ciphers block size. + * + * The `decipher.setAutoPadding()` method must be called before `decipher.final()`. + * @since v0.7.1 + * @param [autoPadding=true] + * @return for method chaining. 
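+ *
+ * The following round-trip sketch is an editorial addition (it is not part of the
+ * upstream Node.js documentation); it assumes AES-192-CBC and input that is already
+ * aligned to the 16-byte block size, since no padding is added or removed.
+ *
+ * ```js
+ * import { Buffer } from 'buffer';
+ * const {
+ *   scryptSync,
+ *   randomBytes,
+ *   createCipheriv,
+ *   createDecipheriv
+ * } = await import('crypto');
+ *
+ * const algorithm = 'aes-192-cbc';
+ * const key = scryptSync('Password used to generate key', 'salt', 24);
+ * const iv = randomBytes(16);
+ *
+ * // Exactly one 16-byte block, so no padding is required.
+ * const block = Buffer.alloc(16, 'a');
+ *
+ * const cipher = createCipheriv(algorithm, key, iv);
+ * cipher.setAutoPadding(false);
+ * const encrypted = Buffer.concat([cipher.update(block), cipher.final()]);
+ *
+ * const decipher = createDecipheriv(algorithm, key, iv);
+ * decipher.setAutoPadding(false);
+ * const decrypted = Buffer.concat([decipher.update(encrypted), decipher.final()]);
+ * console.log(decrypted.equals(block)); // Prints: true
+ * ```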
+ */ + setAutoPadding(auto_padding?: boolean): this; + } + interface DecipherCCM extends Decipher { + setAuthTag(buffer: NodeJS.ArrayBufferView): this; + setAAD( + buffer: NodeJS.ArrayBufferView, + options: { + plaintextLength: number; + } + ): this; + } + interface DecipherGCM extends Decipher { + setAuthTag(buffer: NodeJS.ArrayBufferView): this; + setAAD( + buffer: NodeJS.ArrayBufferView, + options?: { + plaintextLength: number; + } + ): this; + } + interface PrivateKeyInput { + key: string | Buffer; + format?: KeyFormat | undefined; + type?: 'pkcs1' | 'pkcs8' | 'sec1' | undefined; + passphrase?: string | Buffer | undefined; + } + interface PublicKeyInput { + key: string | Buffer; + format?: KeyFormat | undefined; + type?: 'pkcs1' | 'spki' | undefined; + } + /** + * Asynchronously generates a new random secret key of the given `length`. The`type` will determine which validations will be performed on the `length`. + * + * ```js + * const { + * generateKey + * } = await import('crypto'); + * + * generateKey('hmac', { length: 64 }, (err, key) => { + * if (err) throw err; + * console.log(key.export().toString('hex')); // 46e..........620 + * }); + * ``` + * @since v15.0.0 + * @param type The intended use of the generated secret key. Currently accepted values are `'hmac'` and `'aes'`. + */ + function generateKey( + type: 'hmac' | 'aes', + options: { + length: number; + }, + callback: (err: Error | null, key: KeyObject) => void + ): void; + /** + * Synchronously generates a new random secret key of the given `length`. The`type` will determine which validations will be performed on the `length`. + * + * ```js + * const { + * generateKeySync + * } = await import('crypto'); + * + * const key = generateKeySync('hmac', 64); + * console.log(key.export().toString('hex')); // e89..........41e + * ``` + * @since v15.0.0 + * @param type The intended use of the generated secret key. Currently accepted values are `'hmac'` and `'aes'`. + */ + function generateKeySync( + type: 'hmac' | 'aes', + options: { + length: number; + } + ): KeyObject; + interface JsonWebKeyInput { + key: JsonWebKey; + format: 'jwk'; + } + /** + * Creates and returns a new key object containing a private key. If `key` is a + * string or `Buffer`, `format` is assumed to be `'pem'`; otherwise, `key`must be an object with the properties described above. + * + * If the private key is encrypted, a `passphrase` must be specified. The length + * of the passphrase is limited to 1024 bytes. + * @since v11.6.0 + */ + function createPrivateKey(key: PrivateKeyInput | string | Buffer | JsonWebKeyInput): KeyObject; + /** + * Creates and returns a new key object containing a public key. If `key` is a + * string or `Buffer`, `format` is assumed to be `'pem'`; if `key` is a `KeyObject`with type `'private'`, the public key is derived from the given private key; + * otherwise, `key` must be an object with the properties described above. + * + * If the format is `'pem'`, the `'key'` may also be an X.509 certificate. + * + * Because public keys can be derived from private keys, a private key may be + * passed instead of a public key. In that case, this function behaves as if {@link createPrivateKey} had been called, except that the type of the + * returned `KeyObject` will be `'public'` and that the private key cannot be + * extracted from the returned `KeyObject`. 
Similarly, if a `KeyObject` with type`'private'` is given, a new `KeyObject` with type `'public'` will be returned + * and it will be impossible to extract the private key from the returned object. + * @since v11.6.0 + */ + function createPublicKey(key: PublicKeyInput | string | Buffer | KeyObject | JsonWebKeyInput): KeyObject; + /** + * Creates and returns a new key object containing a secret key for symmetric + * encryption or `Hmac`. + * @since v11.6.0 + * @param encoding The string encoding when `key` is a string. + */ + function createSecretKey(key: NodeJS.ArrayBufferView): KeyObject; + function createSecretKey(key: string, encoding: BufferEncoding): KeyObject; + /** + * Creates and returns a `Sign` object that uses the given `algorithm`. Use {@link getHashes} to obtain the names of the available digest algorithms. + * Optional `options` argument controls the `stream.Writable` behavior. + * + * In some cases, a `Sign` instance can be created using the name of a signature + * algorithm, such as `'RSA-SHA256'`, instead of a digest algorithm. This will use + * the corresponding digest algorithm. This does not work for all signature + * algorithms, such as `'ecdsa-with-SHA256'`, so it is best to always use digest + * algorithm names. + * @since v0.1.92 + * @param options `stream.Writable` options + */ + function createSign(algorithm: string, options?: stream.WritableOptions): Sign; + type DSAEncoding = 'der' | 'ieee-p1363'; + interface SigningOptions { + /** + * @See crypto.constants.RSA_PKCS1_PADDING + */ + padding?: number | undefined; + saltLength?: number | undefined; + dsaEncoding?: DSAEncoding | undefined; + } + interface SignPrivateKeyInput extends PrivateKeyInput, SigningOptions {} + interface SignKeyObjectInput extends SigningOptions { + key: KeyObject; + } + interface VerifyPublicKeyInput extends PublicKeyInput, SigningOptions {} + interface VerifyKeyObjectInput extends SigningOptions { + key: KeyObject; + } + type KeyLike = string | Buffer | KeyObject; + /** + * The `Sign` class is a utility for generating signatures. It can be used in one + * of two ways: + * + * * As a writable `stream`, where data to be signed is written and the `sign.sign()` method is used to generate and return the signature, or + * * Using the `sign.update()` and `sign.sign()` methods to produce the + * signature. + * + * The {@link createSign} method is used to create `Sign` instances. The + * argument is the string name of the hash function to use. `Sign` objects are not + * to be created directly using the `new` keyword. 
+ * + * Example: Using `Sign` and `Verify` objects as streams: + * + * ```js + * const { + * generateKeyPairSync, + * createSign, + * createVerify + * } = await import('crypto'); + * + * const { privateKey, publicKey } = generateKeyPairSync('ec', { + * namedCurve: 'sect239k1' + * }); + * + * const sign = createSign('SHA256'); + * sign.write('some data to sign'); + * sign.end(); + * const signature = sign.sign(privateKey, 'hex'); + * + * const verify = createVerify('SHA256'); + * verify.write('some data to sign'); + * verify.end(); + * console.log(verify.verify(publicKey, signature, 'hex')); + * // Prints: true + * ``` + * + * Example: Using the `sign.update()` and `verify.update()` methods: + * + * ```js + * const { + * generateKeyPairSync, + * createSign, + * createVerify + * } = await import('crypto'); + * + * const { privateKey, publicKey } = generateKeyPairSync('rsa', { + * modulusLength: 2048, + * }); + * + * const sign = createSign('SHA256'); + * sign.update('some data to sign'); + * sign.end(); + * const signature = sign.sign(privateKey); + * + * const verify = createVerify('SHA256'); + * verify.update('some data to sign'); + * verify.end(); + * console.log(verify.verify(publicKey, signature)); + * // Prints: true + * ``` + * @since v0.1.92 + */ + class Sign extends stream.Writable { + private constructor(); + /** + * Updates the `Sign` content with the given `data`, the encoding of which + * is given in `inputEncoding`. + * If `encoding` is not provided, and the `data` is a string, an + * encoding of `'utf8'` is enforced. If `data` is a `Buffer`, `TypedArray`, or`DataView`, then `inputEncoding` is ignored. + * + * This can be called many times with new data as it is streamed. + * @since v0.1.92 + * @param inputEncoding The `encoding` of the `data` string. + */ + update(data: BinaryLike): this; + update(data: string, inputEncoding: Encoding): this; + /** + * Calculates the signature on all the data passed through using either `sign.update()` or `sign.write()`. + * + * If `privateKey` is not a `KeyObject`, this function behaves as if`privateKey` had been passed to {@link createPrivateKey}. If it is an + * object, the following additional properties can be passed: + * + * If `outputEncoding` is provided a string is returned; otherwise a `Buffer` is returned. + * + * The `Sign` object can not be again used after `sign.sign()` method has been + * called. Multiple calls to `sign.sign()` will result in an error being thrown. + * @since v0.1.92 + */ + sign(privateKey: KeyLike | SignKeyObjectInput | SignPrivateKeyInput): Buffer; + sign(privateKey: KeyLike | SignKeyObjectInput | SignPrivateKeyInput, outputFormat: BinaryToTextEncoding): string; + } + /** + * Creates and returns a `Verify` object that uses the given algorithm. + * Use {@link getHashes} to obtain an array of names of the available + * signing algorithms. Optional `options` argument controls the`stream.Writable` behavior. + * + * In some cases, a `Verify` instance can be created using the name of a signature + * algorithm, such as `'RSA-SHA256'`, instead of a digest algorithm. This will use + * the corresponding digest algorithm. This does not work for all signature + * algorithms, such as `'ecdsa-with-SHA256'`, so it is best to always use digest + * algorithm names. + * @since v0.1.92 + * @param options `stream.Writable` options + */ + function createVerify(algorithm: string, options?: stream.WritableOptions): Verify; + /** + * The `Verify` class is a utility for verifying signatures. 
It can be used in one + * of two ways: + * + * * As a writable `stream` where written data is used to validate against the + * supplied signature, or + * * Using the `verify.update()` and `verify.verify()` methods to verify + * the signature. + * + * The {@link createVerify} method is used to create `Verify` instances.`Verify` objects are not to be created directly using the `new` keyword. + * + * See `Sign` for examples. + * @since v0.1.92 + */ + class Verify extends stream.Writable { + private constructor(); + /** + * Updates the `Verify` content with the given `data`, the encoding of which + * is given in `inputEncoding`. + * If `inputEncoding` is not provided, and the `data` is a string, an + * encoding of `'utf8'` is enforced. If `data` is a `Buffer`, `TypedArray`, or`DataView`, then `inputEncoding` is ignored. + * + * This can be called many times with new data as it is streamed. + * @since v0.1.92 + * @param inputEncoding The `encoding` of the `data` string. + */ + update(data: BinaryLike): Verify; + update(data: string, inputEncoding: Encoding): Verify; + /** + * Verifies the provided data using the given `object` and `signature`. + * + * If `object` is not a `KeyObject`, this function behaves as if`object` had been passed to {@link createPublicKey}. If it is an + * object, the following additional properties can be passed: + * + * The `signature` argument is the previously calculated signature for the data, in + * the `signatureEncoding`. + * If a `signatureEncoding` is specified, the `signature` is expected to be a + * string; otherwise `signature` is expected to be a `Buffer`,`TypedArray`, or `DataView`. + * + * The `verify` object can not be used again after `verify.verify()` has been + * called. Multiple calls to `verify.verify()` will result in an error being + * thrown. + * + * Because public keys can be derived from private keys, a private key may + * be passed instead of a public key. + * @since v0.1.92 + */ + verify(object: KeyLike | VerifyKeyObjectInput | VerifyPublicKeyInput, signature: NodeJS.ArrayBufferView): boolean; + verify(object: KeyLike | VerifyKeyObjectInput | VerifyPublicKeyInput, signature: string, signature_format?: BinaryToTextEncoding): boolean; + } + /** + * Creates a `DiffieHellman` key exchange object using the supplied `prime` and an + * optional specific `generator`. + * + * The `generator` argument can be a number, string, or `Buffer`. If`generator` is not specified, the value `2` is used. + * + * If `primeEncoding` is specified, `prime` is expected to be a string; otherwise + * a `Buffer`, `TypedArray`, or `DataView` is expected. + * + * If `generatorEncoding` is specified, `generator` is expected to be a string; + * otherwise a number, `Buffer`, `TypedArray`, or `DataView` is expected. + * @since v0.11.12 + * @param primeEncoding The `encoding` of the `prime` string. + * @param [generator=2] + * @param generatorEncoding The `encoding` of the `generator` string. 
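+ *
+ * The sketch below is an editorial addition, not part of the upstream Node.js
+ * documentation. It shows the string/encoding form of the arguments described above,
+ * re-using a prime obtained from another `DiffieHellman` instance; the 512-bit size
+ * is for illustration only and is far too small for real use.
+ *
+ * ```js
+ * const { createDiffieHellman } = await import('crypto');
+ *
+ * const server = createDiffieHellman(512);
+ * const primeHex = server.getPrime('hex');
+ *
+ * // Create a second object over the same group from the hex-encoded prime.
+ * const client = createDiffieHellman(primeHex, 'hex', 2);
+ * client.generateKeys();
+ * ```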
+ */ + function createDiffieHellman(primeLength: number, generator?: number | NodeJS.ArrayBufferView): DiffieHellman; + function createDiffieHellman(prime: NodeJS.ArrayBufferView): DiffieHellman; + function createDiffieHellman(prime: string, primeEncoding: BinaryToTextEncoding): DiffieHellman; + function createDiffieHellman(prime: string, primeEncoding: BinaryToTextEncoding, generator: number | NodeJS.ArrayBufferView): DiffieHellman; + function createDiffieHellman(prime: string, primeEncoding: BinaryToTextEncoding, generator: string, generatorEncoding: BinaryToTextEncoding): DiffieHellman; + /** + * The `DiffieHellman` class is a utility for creating Diffie-Hellman key + * exchanges. + * + * Instances of the `DiffieHellman` class can be created using the {@link createDiffieHellman} function. + * + * ```js + * import assert from 'assert'; + * + * const { + * createDiffieHellman + * } = await import('crypto'); + * + * // Generate Alice's keys... + * const alice = createDiffieHellman(2048); + * const aliceKey = alice.generateKeys(); + * + * // Generate Bob's keys... + * const bob = createDiffieHellman(alice.getPrime(), alice.getGenerator()); + * const bobKey = bob.generateKeys(); + * + * // Exchange and generate the secret... + * const aliceSecret = alice.computeSecret(bobKey); + * const bobSecret = bob.computeSecret(aliceKey); + * + * // OK + * assert.strictEqual(aliceSecret.toString('hex'), bobSecret.toString('hex')); + * ``` + * @since v0.5.0 + */ + class DiffieHellman { + private constructor(); + /** + * Generates private and public Diffie-Hellman key values, and returns + * the public key in the specified `encoding`. This key should be + * transferred to the other party. + * If `encoding` is provided a string is returned; otherwise a `Buffer` is returned. + * @since v0.5.0 + * @param encoding The `encoding` of the return value. + */ + generateKeys(): Buffer; + generateKeys(encoding: BinaryToTextEncoding): string; + /** + * Computes the shared secret using `otherPublicKey` as the other + * party's public key and returns the computed shared secret. The supplied + * key is interpreted using the specified `inputEncoding`, and secret is + * encoded using specified `outputEncoding`. + * If the `inputEncoding` is not + * provided, `otherPublicKey` is expected to be a `Buffer`,`TypedArray`, or `DataView`. + * + * If `outputEncoding` is given a string is returned; otherwise, a `Buffer` is returned. + * @since v0.5.0 + * @param inputEncoding The `encoding` of an `otherPublicKey` string. + * @param outputEncoding The `encoding` of the return value. + */ + computeSecret(otherPublicKey: NodeJS.ArrayBufferView): Buffer; + computeSecret(otherPublicKey: string, inputEncoding: BinaryToTextEncoding): Buffer; + computeSecret(otherPublicKey: NodeJS.ArrayBufferView, outputEncoding: BinaryToTextEncoding): string; + computeSecret(otherPublicKey: string, inputEncoding: BinaryToTextEncoding, outputEncoding: BinaryToTextEncoding): string; + /** + * Returns the Diffie-Hellman prime in the specified `encoding`. + * If `encoding` is provided a string is + * returned; otherwise a `Buffer` is returned. + * @since v0.5.0 + * @param encoding The `encoding` of the return value. + */ + getPrime(): Buffer; + getPrime(encoding: BinaryToTextEncoding): string; + /** + * Returns the Diffie-Hellman generator in the specified `encoding`. + * If `encoding` is provided a string is + * returned; otherwise a `Buffer` is returned. + * @since v0.5.0 + * @param encoding The `encoding` of the return value. 
+ */ + getGenerator(): Buffer; + getGenerator(encoding: BinaryToTextEncoding): string; + /** + * Returns the Diffie-Hellman public key in the specified `encoding`. + * If `encoding` is provided a + * string is returned; otherwise a `Buffer` is returned. + * @since v0.5.0 + * @param encoding The `encoding` of the return value. + */ + getPublicKey(): Buffer; + getPublicKey(encoding: BinaryToTextEncoding): string; + /** + * Returns the Diffie-Hellman private key in the specified `encoding`. + * If `encoding` is provided a + * string is returned; otherwise a `Buffer` is returned. + * @since v0.5.0 + * @param encoding The `encoding` of the return value. + */ + getPrivateKey(): Buffer; + getPrivateKey(encoding: BinaryToTextEncoding): string; + /** + * Sets the Diffie-Hellman public key. If the `encoding` argument is provided,`publicKey` is expected + * to be a string. If no `encoding` is provided, `publicKey` is expected + * to be a `Buffer`, `TypedArray`, or `DataView`. + * @since v0.5.0 + * @param encoding The `encoding` of the `publicKey` string. + */ + setPublicKey(publicKey: NodeJS.ArrayBufferView): void; + setPublicKey(publicKey: string, encoding: BufferEncoding): void; + /** + * Sets the Diffie-Hellman private key. If the `encoding` argument is provided,`privateKey` is expected + * to be a string. If no `encoding` is provided, `privateKey` is expected + * to be a `Buffer`, `TypedArray`, or `DataView`. + * @since v0.5.0 + * @param encoding The `encoding` of the `privateKey` string. + */ + setPrivateKey(privateKey: NodeJS.ArrayBufferView): void; + setPrivateKey(privateKey: string, encoding: BufferEncoding): void; + /** + * A bit field containing any warnings and/or errors resulting from a check + * performed during initialization of the `DiffieHellman` object. + * + * The following values are valid for this property (as defined in `constants`module): + * + * * `DH_CHECK_P_NOT_SAFE_PRIME` + * * `DH_CHECK_P_NOT_PRIME` + * * `DH_UNABLE_TO_CHECK_GENERATOR` + * * `DH_NOT_SUITABLE_GENERATOR` + * @since v0.11.12 + */ + verifyError: number; + } + /** + * Creates a predefined `DiffieHellmanGroup` key exchange object. The + * supported groups are: `'modp1'`, `'modp2'`, `'modp5'` (defined in [RFC 2412](https://www.rfc-editor.org/rfc/rfc2412.txt), but see `Caveats`) and `'modp14'`, `'modp15'`,`'modp16'`, `'modp17'`, + * `'modp18'` (defined in [RFC 3526](https://www.rfc-editor.org/rfc/rfc3526.txt)). The + * returned object mimics the interface of objects created by {@link createDiffieHellman}, but will not allow changing + * the keys (with `diffieHellman.setPublicKey()`, for example). The + * advantage of using this method is that the parties do not have to + * generate nor exchange a group modulus beforehand, saving both processor + * and communication time. + * + * Example (obtaining a shared secret): + * + * ```js + * const { + * getDiffieHellman + * } = await import('crypto'); + * const alice = getDiffieHellman('modp14'); + * const bob = getDiffieHellman('modp14'); + * + * alice.generateKeys(); + * bob.generateKeys(); + * + * const aliceSecret = alice.computeSecret(bob.getPublicKey(), null, 'hex'); + * const bobSecret = bob.computeSecret(alice.getPublicKey(), null, 'hex'); + * + * // aliceSecret and bobSecret should be the same + * console.log(aliceSecret === bobSecret); + * ``` + * @since v0.7.5 + */ + function getDiffieHellman(groupName: string): DiffieHellman; + /** + * Provides an asynchronous Password-Based Key Derivation Function 2 (PBKDF2) + * implementation. 
A selected HMAC digest algorithm specified by `digest` is + * applied to derive a key of the requested byte length (`keylen`) from the`password`, `salt` and `iterations`. + * + * The supplied `callback` function is called with two arguments: `err` and`derivedKey`. If an error occurs while deriving the key, `err` will be set; + * otherwise `err` will be `null`. By default, the successfully generated`derivedKey` will be passed to the callback as a `Buffer`. An error will be + * thrown if any of the input arguments specify invalid values or types. + * + * If `digest` is `null`, `'sha1'` will be used. This behavior is deprecated, + * please specify a `digest` explicitly. + * + * The `iterations` argument must be a number set as high as possible. The + * higher the number of iterations, the more secure the derived key will be, + * but will take a longer amount of time to complete. + * + * The `salt` should be as unique as possible. It is recommended that a salt is + * random and at least 16 bytes long. See [NIST SP 800-132](https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-132.pdf) for details. + * + * When passing strings for `password` or `salt`, please consider `caveats when using strings as inputs to cryptographic APIs`. + * + * ```js + * const { + * pbkdf2 + * } = await import('crypto'); + * + * pbkdf2('secret', 'salt', 100000, 64, 'sha512', (err, derivedKey) => { + * if (err) throw err; + * console.log(derivedKey.toString('hex')); // '3745e48...08d59ae' + * }); + * ``` + * + * The `crypto.DEFAULT_ENCODING` property can be used to change the way the`derivedKey` is passed to the callback. This property, however, has been + * deprecated and use should be avoided. + * + * ```js + * import crypto from 'crypto'; + * crypto.DEFAULT_ENCODING = 'hex'; + * crypto.pbkdf2('secret', 'salt', 100000, 512, 'sha512', (err, derivedKey) => { + * if (err) throw err; + * console.log(derivedKey); // '3745e48...aa39b34' + * }); + * ``` + * + * An array of supported digest functions can be retrieved using {@link getHashes}. + * + * This API uses libuv's threadpool, which can have surprising and + * negative performance implications for some applications; see the `UV_THREADPOOL_SIZE` documentation for more information. + * @since v0.5.5 + */ + function pbkdf2(password: BinaryLike, salt: BinaryLike, iterations: number, keylen: number, digest: string, callback: (err: Error | null, derivedKey: Buffer) => void): void; + /** + * Provides a synchronous Password-Based Key Derivation Function 2 (PBKDF2) + * implementation. A selected HMAC digest algorithm specified by `digest` is + * applied to derive a key of the requested byte length (`keylen`) from the`password`, `salt` and `iterations`. + * + * If an error occurs an `Error` will be thrown, otherwise the derived key will be + * returned as a `Buffer`. + * + * If `digest` is `null`, `'sha1'` will be used. This behavior is deprecated, + * please specify a `digest` explicitly. + * + * The `iterations` argument must be a number set as high as possible. The + * higher the number of iterations, the more secure the derived key will be, + * but will take a longer amount of time to complete. + * + * The `salt` should be as unique as possible. It is recommended that a salt is + * random and at least 16 bytes long. See [NIST SP 800-132](https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-132.pdf) for details. + * + * When passing strings for `password` or `salt`, please consider `caveats when using strings as inputs to cryptographic APIs`. 
+ *
+ * ```js
+ * const {
+ *   pbkdf2Sync
+ * } = await import('crypto');
+ *
+ * const key = pbkdf2Sync('secret', 'salt', 100000, 64, 'sha512');
+ * console.log(key.toString('hex')); // '3745e48...08d59ae'
+ * ```
+ *
+ * The `crypto.DEFAULT_ENCODING` property may be used to change the way the `derivedKey` is returned. This property, however, is deprecated and use
+ * should be avoided.
+ *
+ * ```js
+ * import crypto from 'crypto';
+ * crypto.DEFAULT_ENCODING = 'hex';
+ * const key = crypto.pbkdf2Sync('secret', 'salt', 100000, 512, 'sha512');
+ * console.log(key); // '3745e48...aa39b34'
+ * ```
+ *
+ * An array of supported digest functions can be retrieved using {@link getHashes}.
+ * @since v0.9.3
+ */
+ function pbkdf2Sync(password: BinaryLike, salt: BinaryLike, iterations: number, keylen: number, digest: string): Buffer;
+ /**
+ * Generates cryptographically strong pseudorandom data. The `size` argument
+ * is a number indicating the number of bytes to generate.
+ *
+ * If a `callback` function is provided, the bytes are generated asynchronously
+ * and the `callback` function is invoked with two arguments: `err` and `buf`.
+ * If an error occurs, `err` will be an `Error` object; otherwise it is `null`. The `buf` argument is a `Buffer` containing the generated bytes.
+ *
+ * ```js
+ * // Asynchronous
+ * const {
+ *   randomBytes
+ * } = await import('crypto');
+ *
+ * randomBytes(256, (err, buf) => {
+ *   if (err) throw err;
+ *   console.log(`${buf.length} bytes of random data: ${buf.toString('hex')}`);
+ * });
+ * ```
+ *
+ * If the `callback` function is not provided, the random bytes are generated
+ * synchronously and returned as a `Buffer`. An error will be thrown if
+ * there is a problem generating the bytes.
+ *
+ * ```js
+ * // Synchronous
+ * const {
+ *   randomBytes
+ * } = await import('crypto');
+ *
+ * const buf = randomBytes(256);
+ * console.log(
+ *   `${buf.length} bytes of random data: ${buf.toString('hex')}`);
+ * ```
+ *
+ * The `crypto.randomBytes()` method will not complete until there is
+ * sufficient entropy available.
+ * This should normally never take longer than a few milliseconds. The only time
+ * when generating the random bytes may conceivably block for a longer period of
+ * time is right after boot, when the whole system is still low on entropy.
+ *
+ * This API uses libuv's threadpool, which can have surprising and
+ * negative performance implications for some applications; see the `UV_THREADPOOL_SIZE` documentation for more information.
+ *
+ * The asynchronous version of `crypto.randomBytes()` is carried out in a single
+ * threadpool request. To minimize threadpool task length variation, partition
+ * large `randomBytes` requests when doing so as part of fulfilling a client
+ * request.
+ * @since v0.5.8
+ * @param size The number of bytes to generate. The `size` must not be larger than `2**31 - 1`.
+ * @return if the `callback` function is not provided.
+ */
+ function randomBytes(size: number): Buffer;
+ function randomBytes(size: number, callback: (err: Error | null, buf: Buffer) => void): void;
+ function pseudoRandomBytes(size: number): Buffer;
+ function pseudoRandomBytes(size: number, callback: (err: Error | null, buf: Buffer) => void): void;
+ /**
+ * Return a random integer `n` such that `min <= n < max`. This
+ * implementation avoids [modulo bias](https://en.wikipedia.org/wiki/Fisher%E2%80%93Yates_shuffle#Modulo_bias).
+ *
+ * The range (`max - min`) must be less than 2**48.
`min` and `max` must + * be [safe integers](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/isSafeInteger). + * + * If the `callback` function is not provided, the random integer is + * generated synchronously. + * + * ```js + * // Asynchronous + * const { + * randomInt + * } = await import('crypto'); + * + * randomInt(3, (err, n) => { + * if (err) throw err; + * console.log(`Random number chosen from (0, 1, 2): ${n}`); + * }); + * ``` + * + * ```js + * // Synchronous + * const { + * randomInt + * } = await import('crypto'); + * + * const n = randomInt(3); + * console.log(`Random number chosen from (0, 1, 2): ${n}`); + * ``` + * + * ```js + * // With `min` argument + * const { + * randomInt + * } = await import('crypto'); + * + * const n = randomInt(1, 7); + * console.log(`The dice rolled: ${n}`); + * ``` + * @since v14.10.0, v12.19.0 + * @param [min=0] Start of random range (inclusive). + * @param max End of random range (exclusive). + * @param callback `function(err, n) {}`. + */ + function randomInt(max: number): number; + function randomInt(min: number, max: number): number; + function randomInt(max: number, callback: (err: Error | null, value: number) => void): void; + function randomInt(min: number, max: number, callback: (err: Error | null, value: number) => void): void; + /** + * Synchronous version of {@link randomFill}. + * + * ```js + * import { Buffer } from 'buffer'; + * const { randomFillSync } = await import('crypto'); + * + * const buf = Buffer.alloc(10); + * console.log(randomFillSync(buf).toString('hex')); + * + * randomFillSync(buf, 5); + * console.log(buf.toString('hex')); + * + * // The above is equivalent to the following: + * randomFillSync(buf, 5, 5); + * console.log(buf.toString('hex')); + * ``` + * + * Any `ArrayBuffer`, `TypedArray` or `DataView` instance may be passed as`buffer`. + * + * ```js + * import { Buffer } from 'buffer'; + * const { randomFillSync } = await import('crypto'); + * + * const a = new Uint32Array(10); + * console.log(Buffer.from(randomFillSync(a).buffer, + * a.byteOffset, a.byteLength).toString('hex')); + * + * const b = new DataView(new ArrayBuffer(10)); + * console.log(Buffer.from(randomFillSync(b).buffer, + * b.byteOffset, b.byteLength).toString('hex')); + * + * const c = new ArrayBuffer(10); + * console.log(Buffer.from(randomFillSync(c)).toString('hex')); + * ``` + * @since v7.10.0, v6.13.0 + * @param buffer Must be supplied. The size of the provided `buffer` must not be larger than `2**31 - 1`. + * @param [offset=0] + * @param [size=buffer.length - offset] + * @return The object passed as `buffer` argument. + */ + function randomFillSync(buffer: T, offset?: number, size?: number): T; + /** + * This function is similar to {@link randomBytes} but requires the first + * argument to be a `Buffer` that will be filled. It also + * requires that a callback is passed in. + * + * If the `callback` function is not provided, an error will be thrown. 
+ * + * ```js + * import { Buffer } from 'buffer'; + * const { randomFill } = await import('crypto'); + * + * const buf = Buffer.alloc(10); + * randomFill(buf, (err, buf) => { + * if (err) throw err; + * console.log(buf.toString('hex')); + * }); + * + * randomFill(buf, 5, (err, buf) => { + * if (err) throw err; + * console.log(buf.toString('hex')); + * }); + * + * // The above is equivalent to the following: + * randomFill(buf, 5, 5, (err, buf) => { + * if (err) throw err; + * console.log(buf.toString('hex')); + * }); + * ``` + * + * Any `ArrayBuffer`, `TypedArray`, or `DataView` instance may be passed as`buffer`. + * + * While this includes instances of `Float32Array` and `Float64Array`, this + * function should not be used to generate random floating-point numbers. The + * result may contain `+Infinity`, `-Infinity`, and `NaN`, and even if the array + * contains finite numbers only, they are not drawn from a uniform random + * distribution and have no meaningful lower or upper bounds. + * + * ```js + * import { Buffer } from 'buffer'; + * const { randomFill } = await import('crypto'); + * + * const a = new Uint32Array(10); + * randomFill(a, (err, buf) => { + * if (err) throw err; + * console.log(Buffer.from(buf.buffer, buf.byteOffset, buf.byteLength) + * .toString('hex')); + * }); + * + * const b = new DataView(new ArrayBuffer(10)); + * randomFill(b, (err, buf) => { + * if (err) throw err; + * console.log(Buffer.from(buf.buffer, buf.byteOffset, buf.byteLength) + * .toString('hex')); + * }); + * + * const c = new ArrayBuffer(10); + * randomFill(c, (err, buf) => { + * if (err) throw err; + * console.log(Buffer.from(buf).toString('hex')); + * }); + * ``` + * + * This API uses libuv's threadpool, which can have surprising and + * negative performance implications for some applications; see the `UV_THREADPOOL_SIZE` documentation for more information. + * + * The asynchronous version of `crypto.randomFill()` is carried out in a single + * threadpool request. To minimize threadpool task length variation, partition + * large `randomFill` requests when doing so as part of fulfilling a client + * request. + * @since v7.10.0, v6.13.0 + * @param buffer Must be supplied. The size of the provided `buffer` must not be larger than `2**31 - 1`. + * @param [offset=0] + * @param [size=buffer.length - offset] + * @param callback `function(err, buf) {}`. + */ + function randomFill(buffer: T, callback: (err: Error | null, buf: T) => void): void; + function randomFill(buffer: T, offset: number, callback: (err: Error | null, buf: T) => void): void; + function randomFill(buffer: T, offset: number, size: number, callback: (err: Error | null, buf: T) => void): void; + interface ScryptOptions { + cost?: number | undefined; + blockSize?: number | undefined; + parallelization?: number | undefined; + N?: number | undefined; + r?: number | undefined; + p?: number | undefined; + maxmem?: number | undefined; + } + /** + * Provides an asynchronous [scrypt](https://en.wikipedia.org/wiki/Scrypt) implementation. Scrypt is a password-based + * key derivation function that is designed to be expensive computationally and + * memory-wise in order to make brute-force attacks unrewarding. + * + * The `salt` should be as unique as possible. It is recommended that a salt is + * random and at least 16 bytes long. See [NIST SP 800-132](https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-132.pdf) for details. 
+ * + * When passing strings for `password` or `salt`, please consider `caveats when using strings as inputs to cryptographic APIs`. + * + * The `callback` function is called with two arguments: `err` and `derivedKey`.`err` is an exception object when key derivation fails, otherwise `err` is`null`. `derivedKey` is passed to the + * callback as a `Buffer`. + * + * An exception is thrown when any of the input arguments specify invalid values + * or types. + * + * ```js + * const { + * scrypt + * } = await import('crypto'); + * + * // Using the factory defaults. + * scrypt('password', 'salt', 64, (err, derivedKey) => { + * if (err) throw err; + * console.log(derivedKey.toString('hex')); // '3745e48...08d59ae' + * }); + * // Using a custom N parameter. Must be a power of two. + * scrypt('password', 'salt', 64, { N: 1024 }, (err, derivedKey) => { + * if (err) throw err; + * console.log(derivedKey.toString('hex')); // '3745e48...aa39b34' + * }); + * ``` + * @since v10.5.0 + */ + function scrypt(password: BinaryLike, salt: BinaryLike, keylen: number, callback: (err: Error | null, derivedKey: Buffer) => void): void; + function scrypt(password: BinaryLike, salt: BinaryLike, keylen: number, options: ScryptOptions, callback: (err: Error | null, derivedKey: Buffer) => void): void; + /** + * Provides a synchronous [scrypt](https://en.wikipedia.org/wiki/Scrypt) implementation. Scrypt is a password-based + * key derivation function that is designed to be expensive computationally and + * memory-wise in order to make brute-force attacks unrewarding. + * + * The `salt` should be as unique as possible. It is recommended that a salt is + * random and at least 16 bytes long. See [NIST SP 800-132](https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-132.pdf) for details. + * + * When passing strings for `password` or `salt`, please consider `caveats when using strings as inputs to cryptographic APIs`. + * + * An exception is thrown when key derivation fails, otherwise the derived key is + * returned as a `Buffer`. + * + * An exception is thrown when any of the input arguments specify invalid values + * or types. + * + * ```js + * const { + * scryptSync + * } = await import('crypto'); + * // Using the factory defaults. + * + * const key1 = scryptSync('password', 'salt', 64); + * console.log(key1.toString('hex')); // '3745e48...08d59ae' + * // Using a custom N parameter. Must be a power of two. + * const key2 = scryptSync('password', 'salt', 64, { N: 1024 }); + * console.log(key2.toString('hex')); // '3745e48...aa39b34' + * ``` + * @since v10.5.0 + */ + function scryptSync(password: BinaryLike, salt: BinaryLike, keylen: number, options?: ScryptOptions): Buffer; + interface RsaPublicKey { + key: KeyLike; + padding?: number | undefined; + } + interface RsaPrivateKey { + key: KeyLike; + passphrase?: string | undefined; + /** + * @default 'sha1' + */ + oaepHash?: string | undefined; + oaepLabel?: NodeJS.TypedArray | undefined; + padding?: number | undefined; + } + /** + * Encrypts the content of `buffer` with `key` and returns a new `Buffer` with encrypted content. The returned data can be decrypted using + * the corresponding private key, for example using {@link privateDecrypt}. + * + * If `key` is not a `KeyObject`, this function behaves as if`key` had been passed to {@link createPublicKey}. If it is an + * object, the `padding` property can be passed. Otherwise, this function uses`RSA_PKCS1_OAEP_PADDING`. 
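+ *
+ * The following round-trip sketch is an editorial addition rather than part of the
+ * upstream Node.js documentation; it generates a throwaway RSA key pair and relies on
+ * the default `RSA_PKCS1_OAEP_PADDING`.
+ *
+ * ```js
+ * import { Buffer } from 'buffer';
+ * const {
+ *   generateKeyPairSync,
+ *   publicEncrypt,
+ *   privateDecrypt
+ * } = await import('crypto');
+ *
+ * const { publicKey, privateKey } = generateKeyPairSync('rsa', { modulusLength: 2048 });
+ *
+ * const encrypted = publicEncrypt(publicKey, Buffer.from('some secret data'));
+ * const decrypted = privateDecrypt(privateKey, encrypted);
+ * console.log(decrypted.toString('utf8')); // Prints: some secret data
+ * ```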
+ * + * Because RSA public keys can be derived from private keys, a private key may + * be passed instead of a public key. + * @since v0.11.14 + */ + function publicEncrypt(key: RsaPublicKey | RsaPrivateKey | KeyLike, buffer: NodeJS.ArrayBufferView): Buffer; + /** + * Decrypts `buffer` with `key`.`buffer` was previously encrypted using + * the corresponding private key, for example using {@link privateEncrypt}. + * + * If `key` is not a `KeyObject`, this function behaves as if`key` had been passed to {@link createPublicKey}. If it is an + * object, the `padding` property can be passed. Otherwise, this function uses`RSA_PKCS1_PADDING`. + * + * Because RSA public keys can be derived from private keys, a private key may + * be passed instead of a public key. + * @since v1.1.0 + */ + function publicDecrypt(key: RsaPublicKey | RsaPrivateKey | KeyLike, buffer: NodeJS.ArrayBufferView): Buffer; + /** + * Decrypts `buffer` with `privateKey`. `buffer` was previously encrypted using + * the corresponding public key, for example using {@link publicEncrypt}. + * + * If `privateKey` is not a `KeyObject`, this function behaves as if`privateKey` had been passed to {@link createPrivateKey}. If it is an + * object, the `padding` property can be passed. Otherwise, this function uses`RSA_PKCS1_OAEP_PADDING`. + * @since v0.11.14 + */ + function privateDecrypt(privateKey: RsaPrivateKey | KeyLike, buffer: NodeJS.ArrayBufferView): Buffer; + /** + * Encrypts `buffer` with `privateKey`. The returned data can be decrypted using + * the corresponding public key, for example using {@link publicDecrypt}. + * + * If `privateKey` is not a `KeyObject`, this function behaves as if`privateKey` had been passed to {@link createPrivateKey}. If it is an + * object, the `padding` property can be passed. Otherwise, this function uses`RSA_PKCS1_PADDING`. + * @since v1.1.0 + */ + function privateEncrypt(privateKey: RsaPrivateKey | KeyLike, buffer: NodeJS.ArrayBufferView): Buffer; + /** + * ```js + * const { + * getCiphers + * } = await import('crypto'); + * + * console.log(getCiphers()); // ['aes-128-cbc', 'aes-128-ccm', ...] + * ``` + * @since v0.9.3 + * @return An array with the names of the supported cipher algorithms. + */ + function getCiphers(): string[]; + /** + * ```js + * const { + * getCurves + * } = await import('crypto'); + * + * console.log(getCurves()); // ['Oakley-EC2N-3', 'Oakley-EC2N-4', ...] + * ``` + * @since v2.3.0 + * @return An array with the names of the supported elliptic curves. + */ + function getCurves(): string[]; + /** + * @since v10.0.0 + * @return `1` if and only if a FIPS compliant crypto provider is currently in use, `0` otherwise. A future semver-major release may change the return type of this API to a {boolean}. + */ + function getFips(): 1 | 0; + /** + * ```js + * const { + * getHashes + * } = await import('crypto'); + * + * console.log(getHashes()); // ['DSA', 'DSA-SHA', 'DSA-SHA1', ...] + * ``` + * @since v0.9.3 + * @return An array of the names of the supported hash algorithms, such as `'RSA-SHA256'`. Hash algorithms are also called "digest" algorithms. + */ + function getHashes(): string[]; + /** + * The `ECDH` class is a utility for creating Elliptic Curve Diffie-Hellman (ECDH) + * key exchanges. + * + * Instances of the `ECDH` class can be created using the {@link createECDH} function. + * + * ```js + * import assert from 'assert'; + * + * const { + * createECDH + * } = await import('crypto'); + * + * // Generate Alice's keys... 
+ * const alice = createECDH('secp521r1'); + * const aliceKey = alice.generateKeys(); + * + * // Generate Bob's keys... + * const bob = createECDH('secp521r1'); + * const bobKey = bob.generateKeys(); + * + * // Exchange and generate the secret... + * const aliceSecret = alice.computeSecret(bobKey); + * const bobSecret = bob.computeSecret(aliceKey); + * + * assert.strictEqual(aliceSecret.toString('hex'), bobSecret.toString('hex')); + * // OK + * ``` + * @since v0.11.14 + */ + class ECDH { + private constructor(); + /** + * Converts the EC Diffie-Hellman public key specified by `key` and `curve` to the + * format specified by `format`. The `format` argument specifies point encoding + * and can be `'compressed'`, `'uncompressed'` or `'hybrid'`. The supplied key is + * interpreted using the specified `inputEncoding`, and the returned key is encoded + * using the specified `outputEncoding`. + * + * Use {@link getCurves} to obtain a list of available curve names. + * On recent OpenSSL releases, `openssl ecparam -list_curves` will also display + * the name and description of each available elliptic curve. + * + * If `format` is not specified the point will be returned in `'uncompressed'`format. + * + * If the `inputEncoding` is not provided, `key` is expected to be a `Buffer`,`TypedArray`, or `DataView`. + * + * Example (uncompressing a key): + * + * ```js + * const { + * createECDH, + * ECDH + * } = await import('crypto'); + * + * const ecdh = createECDH('secp256k1'); + * ecdh.generateKeys(); + * + * const compressedKey = ecdh.getPublicKey('hex', 'compressed'); + * + * const uncompressedKey = ECDH.convertKey(compressedKey, + * 'secp256k1', + * 'hex', + * 'hex', + * 'uncompressed'); + * + * // The converted key and the uncompressed public key should be the same + * console.log(uncompressedKey === ecdh.getPublicKey('hex')); + * ``` + * @since v10.0.0 + * @param inputEncoding The `encoding` of the `key` string. + * @param outputEncoding The `encoding` of the return value. + * @param [format='uncompressed'] + */ + static convertKey( + key: BinaryLike, + curve: string, + inputEncoding?: BinaryToTextEncoding, + outputEncoding?: 'latin1' | 'hex' | 'base64' | 'base64url', + format?: 'uncompressed' | 'compressed' | 'hybrid' + ): Buffer | string; + /** + * Generates private and public EC Diffie-Hellman key values, and returns + * the public key in the specified `format` and `encoding`. This key should be + * transferred to the other party. + * + * The `format` argument specifies point encoding and can be `'compressed'` or`'uncompressed'`. If `format` is not specified, the point will be returned in`'uncompressed'` format. + * + * If `encoding` is provided a string is returned; otherwise a `Buffer` is returned. + * @since v0.11.14 + * @param encoding The `encoding` of the return value. + * @param [format='uncompressed'] + */ + generateKeys(): Buffer; + generateKeys(encoding: BinaryToTextEncoding, format?: ECDHKeyFormat): string; + /** + * Computes the shared secret using `otherPublicKey` as the other + * party's public key and returns the computed shared secret. The supplied + * key is interpreted using specified `inputEncoding`, and the returned secret + * is encoded using the specified `outputEncoding`. + * If the `inputEncoding` is not + * provided, `otherPublicKey` is expected to be a `Buffer`, `TypedArray`, or`DataView`. + * + * If `outputEncoding` is given a string will be returned; otherwise a `Buffer` is returned. 
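+ *
+ * The sketch below is an editorial addition (not from the upstream Node.js
+ * documentation) illustrating the invalid-key error described in the next paragraph;
+ * the `otherPublicKey` value is a deliberately bogus placeholder.
+ *
+ * ```js
+ * const { createECDH } = await import('crypto');
+ *
+ * const ecdh = createECDH('secp256k1');
+ * ecdh.generateKeys();
+ *
+ * // In practice this arrives from a remote peer and cannot be trusted.
+ * const otherPublicKey = 'deadbeef';
+ *
+ * try {
+ *   const secret = ecdh.computeSecret(otherPublicKey, 'hex', 'hex');
+ *   console.log(secret);
+ * } catch (err) {
+ *   console.error('Could not compute shared secret:', err.message);
+ * }
+ * ```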
+ * + * `ecdh.computeSecret` will throw an`ERR_CRYPTO_ECDH_INVALID_PUBLIC_KEY` error when `otherPublicKey`lies outside of the elliptic curve. Since `otherPublicKey` is + * usually supplied from a remote user over an insecure network, + * be sure to handle this exception accordingly. + * @since v0.11.14 + * @param inputEncoding The `encoding` of the `otherPublicKey` string. + * @param outputEncoding The `encoding` of the return value. + */ + computeSecret(otherPublicKey: NodeJS.ArrayBufferView): Buffer; + computeSecret(otherPublicKey: string, inputEncoding: BinaryToTextEncoding): Buffer; + computeSecret(otherPublicKey: NodeJS.ArrayBufferView, outputEncoding: BinaryToTextEncoding): string; + computeSecret(otherPublicKey: string, inputEncoding: BinaryToTextEncoding, outputEncoding: BinaryToTextEncoding): string; + /** + * If `encoding` is specified, a string is returned; otherwise a `Buffer` is + * returned. + * @since v0.11.14 + * @param encoding The `encoding` of the return value. + * @return The EC Diffie-Hellman in the specified `encoding`. + */ + getPrivateKey(): Buffer; + getPrivateKey(encoding: BinaryToTextEncoding): string; + /** + * The `format` argument specifies point encoding and can be `'compressed'` or`'uncompressed'`. If `format` is not specified the point will be returned in`'uncompressed'` format. + * + * If `encoding` is specified, a string is returned; otherwise a `Buffer` is + * returned. + * @since v0.11.14 + * @param encoding The `encoding` of the return value. + * @param [format='uncompressed'] + * @return The EC Diffie-Hellman public key in the specified `encoding` and `format`. + */ + getPublicKey(): Buffer; + getPublicKey(encoding: BinaryToTextEncoding, format?: ECDHKeyFormat): string; + /** + * Sets the EC Diffie-Hellman private key. + * If `encoding` is provided, `privateKey` is expected + * to be a string; otherwise `privateKey` is expected to be a `Buffer`,`TypedArray`, or `DataView`. + * + * If `privateKey` is not valid for the curve specified when the `ECDH` object was + * created, an error is thrown. Upon setting the private key, the associated + * public point (key) is also generated and set in the `ECDH` object. + * @since v0.11.14 + * @param encoding The `encoding` of the `privateKey` string. + */ + setPrivateKey(privateKey: NodeJS.ArrayBufferView): void; + setPrivateKey(privateKey: string, encoding: BinaryToTextEncoding): void; + } + /** + * Creates an Elliptic Curve Diffie-Hellman (`ECDH`) key exchange object using a + * predefined curve specified by the `curveName` string. Use {@link getCurves} to obtain a list of available curve names. On recent + * OpenSSL releases, `openssl ecparam -list_curves` will also display the name + * and description of each available elliptic curve. + * @since v0.11.14 + */ + function createECDH(curveName: string): ECDH; + /** + * This function is based on a constant-time algorithm. + * Returns true if `a` is equal to `b`, without leaking timing information that + * would allow an attacker to guess one of the values. This is suitable for + * comparing HMAC digests or secret values like authentication cookies or [capability urls](https://www.w3.org/TR/capability-urls/). + * + * `a` and `b` must both be `Buffer`s, `TypedArray`s, or `DataView`s, and they + * must have the same byte length. + * + * If at least one of `a` and `b` is a `TypedArray` with more than one byte per + * entry, such as `Uint16Array`, the result will be computed using the platform + * byte order. 
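+ *
+ * The following sketch is an editorial addition, not part of the upstream Node.js
+ * documentation; it compares two HMAC digests of equal length, which is the typical
+ * use case for this function.
+ *
+ * ```js
+ * import { Buffer } from 'buffer';
+ * const { createHmac, timingSafeEqual } = await import('crypto');
+ *
+ * const secret = 'a shared secret';
+ * const expected = createHmac('sha256', secret).update('payload').digest();
+ * // In practice `received` would be parsed from an incoming request.
+ * const received = Buffer.from(expected);
+ *
+ * if (expected.length === received.length && timingSafeEqual(expected, received)) {
+ *   console.log('MAC verified');
+ * }
+ * ```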
+ * + * Use of `crypto.timingSafeEqual` does not guarantee that the _surrounding_ code + * is timing-safe. Care should be taken to ensure that the surrounding code does + * not introduce timing vulnerabilities. + * @since v6.6.0 + */ + function timingSafeEqual(a: NodeJS.ArrayBufferView, b: NodeJS.ArrayBufferView): boolean; + /** @deprecated since v10.0.0 */ + const DEFAULT_ENCODING: BufferEncoding; + type KeyType = 'rsa' | 'rsa-pss' | 'dsa' | 'ec' | 'ed25519' | 'ed448' | 'x25519' | 'x448'; + type KeyFormat = 'pem' | 'der'; + interface BasePrivateKeyEncodingOptions { + format: T; + cipher?: string | undefined; + passphrase?: string | undefined; + } + interface KeyPairKeyObjectResult { + publicKey: KeyObject; + privateKey: KeyObject; + } + interface ED25519KeyPairKeyObjectOptions {} + interface ED448KeyPairKeyObjectOptions {} + interface X25519KeyPairKeyObjectOptions {} + interface X448KeyPairKeyObjectOptions {} + interface ECKeyPairKeyObjectOptions { + /** + * Name of the curve to use + */ + namedCurve: string; + } + interface RSAKeyPairKeyObjectOptions { + /** + * Key size in bits + */ + modulusLength: number; + /** + * Public exponent + * @default 0x10001 + */ + publicExponent?: number | undefined; + } + interface RSAPSSKeyPairKeyObjectOptions { + /** + * Key size in bits + */ + modulusLength: number; + /** + * Public exponent + * @default 0x10001 + */ + publicExponent?: number | undefined; + /** + * Name of the message digest + */ + hashAlgorithm?: string; + /** + * Name of the message digest used by MGF1 + */ + mgf1HashAlgorithm?: string; + /** + * Minimal salt length in bytes + */ + saltLength?: string; + } + interface DSAKeyPairKeyObjectOptions { + /** + * Key size in bits + */ + modulusLength: number; + /** + * Size of q in bits + */ + divisorLength: number; + } + interface RSAKeyPairOptions { + /** + * Key size in bits + */ + modulusLength: number; + /** + * Public exponent + * @default 0x10001 + */ + publicExponent?: number | undefined; + publicKeyEncoding: { + type: 'pkcs1' | 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs1' | 'pkcs8'; + }; + } + interface RSAPSSKeyPairOptions { + /** + * Key size in bits + */ + modulusLength: number; + /** + * Public exponent + * @default 0x10001 + */ + publicExponent?: number | undefined; + /** + * Name of the message digest + */ + hashAlgorithm?: string; + /** + * Name of the message digest used by MGF1 + */ + mgf1HashAlgorithm?: string; + /** + * Minimal salt length in bytes + */ + saltLength?: string; + publicKeyEncoding: { + type: 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs8'; + }; + } + interface DSAKeyPairOptions { + /** + * Key size in bits + */ + modulusLength: number; + /** + * Size of q in bits + */ + divisorLength: number; + publicKeyEncoding: { + type: 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs8'; + }; + } + interface ECKeyPairOptions { + /** + * Name of the curve to use. 
+ */ + namedCurve: string; + publicKeyEncoding: { + type: 'pkcs1' | 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'sec1' | 'pkcs8'; + }; + } + interface ED25519KeyPairOptions { + publicKeyEncoding: { + type: 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs8'; + }; + } + interface ED448KeyPairOptions { + publicKeyEncoding: { + type: 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs8'; + }; + } + interface X25519KeyPairOptions { + publicKeyEncoding: { + type: 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs8'; + }; + } + interface X448KeyPairOptions { + publicKeyEncoding: { + type: 'spki'; + format: PubF; + }; + privateKeyEncoding: BasePrivateKeyEncodingOptions & { + type: 'pkcs8'; + }; + } + interface KeyPairSyncResult { + publicKey: T1; + privateKey: T2; + } + /** + * Generates a new asymmetric key pair of the given `type`. RSA, RSA-PSS, DSA, EC, + * Ed25519, Ed448, X25519, X448, and DH are currently supported. + * + * If a `publicKeyEncoding` or `privateKeyEncoding` was specified, this function + * behaves as if `keyObject.export()` had been called on its result. Otherwise, + * the respective part of the key is returned as a `KeyObject`. + * + * When encoding public keys, it is recommended to use `'spki'`. When encoding + * private keys, it is recommended to use `'pkcs8'` with a strong passphrase, + * and to keep the passphrase confidential. + * + * ```js + * const { + * generateKeyPairSync + * } = await import('crypto'); + * + * const { + * publicKey, + * privateKey, + * } = generateKeyPairSync('rsa', { + * modulusLength: 4096, + * publicKeyEncoding: { + * type: 'spki', + * format: 'pem' + * }, + * privateKeyEncoding: { + * type: 'pkcs8', + * format: 'pem', + * cipher: 'aes-256-cbc', + * passphrase: 'top secret' + * } + * }); + * ``` + * + * The return value `{ publicKey, privateKey }` represents the generated key pair. + * When PEM encoding was selected, the respective key will be a string, otherwise + * it will be a buffer containing the data encoded as DER. + * @since v10.12.0 + * @param type Must be `'rsa'`, `'rsa-pss'`, `'dsa'`, `'ec'`, `'ed25519'`, `'ed448'`, `'x25519'`, `'x448'`, or `'dh'`. 
+ */ + function generateKeyPairSync(type: 'rsa', options: RSAKeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa', options: RSAKeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa', options: RSAKeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa', options: RSAKeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa', options: RSAKeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'rsa-pss', options: RSAPSSKeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'dsa', options: DSAKeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'dsa', options: DSAKeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'dsa', options: DSAKeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'dsa', options: DSAKeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'dsa', options: DSAKeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'ec', options: ECKeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ec', options: ECKeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ec', options: ECKeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ec', options: ECKeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ec', options: ECKeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'ed25519', options: ED25519KeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed25519', options: ED25519KeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed25519', options: ED25519KeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed25519', options: ED25519KeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed25519', options?: ED25519KeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'ed448', options: ED448KeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed448', options: ED448KeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed448', options: ED448KeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed448', options: ED448KeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'ed448', options?: ED448KeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'x25519', options: X25519KeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x25519', options: X25519KeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x25519', options: 
X25519KeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x25519', options: X25519KeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x25519', options?: X25519KeyPairKeyObjectOptions): KeyPairKeyObjectResult; + function generateKeyPairSync(type: 'x448', options: X448KeyPairOptions<'pem', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x448', options: X448KeyPairOptions<'pem', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x448', options: X448KeyPairOptions<'der', 'pem'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x448', options: X448KeyPairOptions<'der', 'der'>): KeyPairSyncResult; + function generateKeyPairSync(type: 'x448', options?: X448KeyPairKeyObjectOptions): KeyPairKeyObjectResult; + /** + * Generates a new asymmetric key pair of the given `type`. RSA, RSA-PSS, DSA, EC, + * Ed25519, Ed448, X25519, X448, and DH are currently supported. + * + * If a `publicKeyEncoding` or `privateKeyEncoding` was specified, this function + * behaves as if `keyObject.export()` had been called on its result. Otherwise, + * the respective part of the key is returned as a `KeyObject`. + * + * It is recommended to encode public keys as `'spki'` and private keys as`'pkcs8'` with encryption for long-term storage: + * + * ```js + * const { + * generateKeyPair + * } = await import('crypto'); + * + * generateKeyPair('rsa', { + * modulusLength: 4096, + * publicKeyEncoding: { + * type: 'spki', + * format: 'pem' + * }, + * privateKeyEncoding: { + * type: 'pkcs8', + * format: 'pem', + * cipher: 'aes-256-cbc', + * passphrase: 'top secret' + * } + * }, (err, publicKey, privateKey) => { + * // Handle errors and use the generated key pair. + * }); + * ``` + * + * On completion, `callback` will be called with `err` set to `undefined` and`publicKey` / `privateKey` representing the generated key pair. + * + * If this method is invoked as its `util.promisify()` ed version, it returns + * a `Promise` for an `Object` with `publicKey` and `privateKey` properties. + * @since v10.12.0 + * @param type Must be `'rsa'`, `'rsa-pss'`, `'dsa'`, `'ec'`, `'ed25519'`, `'ed448'`, `'x25519'`, `'x448'`, or `'dh'`. 
+ */ + function generateKeyPair(type: 'rsa', options: RSAKeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'rsa', options: RSAKeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'rsa', options: RSAKeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'rsa', options: RSAKeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'rsa', options: RSAKeyPairKeyObjectOptions, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'rsa-pss', options: RSAPSSKeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'rsa-pss', options: RSAPSSKeyPairKeyObjectOptions, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'dsa', options: DSAKeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'dsa', options: DSAKeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'dsa', options: DSAKeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'dsa', options: DSAKeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'dsa', options: DSAKeyPairKeyObjectOptions, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'ec', options: ECKeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'ec', options: ECKeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'ec', options: ECKeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'ec', options: ECKeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'ec', options: ECKeyPairKeyObjectOptions, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'ed25519', options: ED25519KeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + 
function generateKeyPair(type: 'ed25519', options: ED25519KeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'ed25519', options: ED25519KeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'ed25519', options: ED25519KeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'ed25519', options: ED25519KeyPairKeyObjectOptions | undefined, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'ed448', options: ED448KeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'ed448', options: ED448KeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'ed448', options: ED448KeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'ed448', options: ED448KeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'ed448', options: ED448KeyPairKeyObjectOptions | undefined, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'x25519', options: X25519KeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'x25519', options: X25519KeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'x25519', options: X25519KeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'x25519', options: X25519KeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'x25519', options: X25519KeyPairKeyObjectOptions | undefined, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + function generateKeyPair(type: 'x448', options: X448KeyPairOptions<'pem', 'pem'>, callback: (err: Error | null, publicKey: string, privateKey: string) => void): void; + function generateKeyPair(type: 'x448', options: X448KeyPairOptions<'pem', 'der'>, callback: (err: Error | null, publicKey: string, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'x448', options: X448KeyPairOptions<'der', 'pem'>, callback: (err: Error | null, publicKey: Buffer, privateKey: string) => void): void; + function generateKeyPair(type: 'x448', options: X448KeyPairOptions<'der', 'der'>, callback: (err: Error | null, publicKey: Buffer, privateKey: Buffer) => void): void; + function generateKeyPair(type: 'x448', options: X448KeyPairKeyObjectOptions | undefined, callback: (err: Error | null, publicKey: KeyObject, privateKey: KeyObject) => void): void; + namespace generateKeyPair { + function __promisify__( + type: 'rsa', + options: RSAKeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'rsa', + options: 
RSAKeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'rsa', + options: RSAKeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'rsa', + options: RSAKeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'rsa', options: RSAKeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'rsa-pss', + options: RSAPSSKeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'rsa-pss', + options: RSAPSSKeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'rsa-pss', + options: RSAPSSKeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'rsa-pss', + options: RSAPSSKeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'rsa-pss', options: RSAPSSKeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'dsa', + options: DSAKeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'dsa', + options: DSAKeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'dsa', + options: DSAKeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'dsa', + options: DSAKeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'dsa', options: DSAKeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'ec', + options: ECKeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'ec', + options: ECKeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'ec', + options: ECKeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'ec', + options: ECKeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'ec', options: ECKeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'ed25519', + options: ED25519KeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'ed25519', + options: ED25519KeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'ed25519', + options: ED25519KeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'ed25519', + options: ED25519KeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'ed25519', options?: ED25519KeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'ed448', + options: ED448KeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'ed448', + options: ED448KeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; 
+ function __promisify__( + type: 'ed448', + options: ED448KeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'ed448', + options: ED448KeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'ed448', options?: ED448KeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'x25519', + options: X25519KeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'x25519', + options: X25519KeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'x25519', + options: X25519KeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'x25519', + options: X25519KeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'x25519', options?: X25519KeyPairKeyObjectOptions): Promise; + function __promisify__( + type: 'x448', + options: X448KeyPairOptions<'pem', 'pem'> + ): Promise<{ + publicKey: string; + privateKey: string; + }>; + function __promisify__( + type: 'x448', + options: X448KeyPairOptions<'pem', 'der'> + ): Promise<{ + publicKey: string; + privateKey: Buffer; + }>; + function __promisify__( + type: 'x448', + options: X448KeyPairOptions<'der', 'pem'> + ): Promise<{ + publicKey: Buffer; + privateKey: string; + }>; + function __promisify__( + type: 'x448', + options: X448KeyPairOptions<'der', 'der'> + ): Promise<{ + publicKey: Buffer; + privateKey: Buffer; + }>; + function __promisify__(type: 'x448', options?: X448KeyPairKeyObjectOptions): Promise; + } + /** + * Calculates and returns the signature for `data` using the given private key and + * algorithm. If `algorithm` is `null` or `undefined`, then the algorithm is + * dependent upon the key type (especially Ed25519 and Ed448). + * + * If `key` is not a `KeyObject`, this function behaves as if `key` had been + * passed to {@link createPrivateKey}. If it is an object, the following + * additional properties can be passed: + * + * If the `callback` function is provided this function uses libuv's threadpool. + * @since v12.0.0 + */ + function sign(algorithm: string | null | undefined, data: NodeJS.ArrayBufferView, key: KeyLike | SignKeyObjectInput | SignPrivateKeyInput): Buffer; + function sign( + algorithm: string | null | undefined, + data: NodeJS.ArrayBufferView, + key: KeyLike | SignKeyObjectInput | SignPrivateKeyInput, + callback: (error: Error | null, data: Buffer) => void + ): void; + /** + * Verifies the given signature for `data` using the given key and algorithm. If`algorithm` is `null` or `undefined`, then the algorithm is dependent upon the + * key type (especially Ed25519 and Ed448). + * + * If `key` is not a `KeyObject`, this function behaves as if `key` had been + * passed to {@link createPublicKey}. If it is an object, the following + * additional properties can be passed: + * + * The `signature` argument is the previously calculated signature for the `data`. + * + * Because public keys can be derived from private keys, a private key or a public + * key may be passed for `key`. + * + * If the `callback` function is provided this function uses libuv's threadpool. 
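+ *
+ * A minimal illustrative sketch (not part of the upstream docs), verifying a
+ * signature produced with {@link sign} using an Ed25519 key pair:
+ *
+ * ```js
+ * import { Buffer } from 'buffer';
+ * const {
+ *   generateKeyPairSync,
+ *   sign,
+ *   verify
+ * } = await import('crypto');
+ *
+ * const { publicKey, privateKey } = generateKeyPairSync('ed25519');
+ * const data = Buffer.from('some data to sign');
+ *
+ * // Ed25519 determines its own digest, so `algorithm` is passed as null.
+ * const signature = sign(null, data, privateKey);
+ * console.log(verify(null, data, publicKey, signature)); // true
+ * ```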
+ * @since v12.0.0 + */ + function verify(algorithm: string | null | undefined, data: NodeJS.ArrayBufferView, key: KeyLike | VerifyKeyObjectInput | VerifyPublicKeyInput, signature: NodeJS.ArrayBufferView): boolean; + function verify( + algorithm: string | null | undefined, + data: NodeJS.ArrayBufferView, + key: KeyLike | VerifyKeyObjectInput | VerifyPublicKeyInput, + signature: NodeJS.ArrayBufferView, + callback: (error: Error | null, result: boolean) => void + ): void; + /** + * Computes the Diffie-Hellman secret based on a `privateKey` and a `publicKey`. + * Both keys must have the same `asymmetricKeyType`, which must be one of `'dh'`(for Diffie-Hellman), `'ec'` (for ECDH), `'x448'`, or `'x25519'` (for ECDH-ES). + * @since v13.9.0, v12.17.0 + */ + function diffieHellman(options: { privateKey: KeyObject; publicKey: KeyObject }): Buffer; + type CipherMode = 'cbc' | 'ccm' | 'cfb' | 'ctr' | 'ecb' | 'gcm' | 'ocb' | 'ofb' | 'stream' | 'wrap' | 'xts'; + interface CipherInfoOptions { + /** + * A test key length. + */ + keyLength?: number | undefined; + /** + * A test IV length. + */ + ivLength?: number | undefined; + } + interface CipherInfo { + /** + * The name of the cipher. + */ + name: string; + /** + * The nid of the cipher. + */ + nid: number; + /** + * The block size of the cipher in bytes. + * This property is omitted when mode is 'stream'. + */ + blockSize?: number | undefined; + /** + * The expected or default initialization vector length in bytes. + * This property is omitted if the cipher does not use an initialization vector. + */ + ivLength?: number | undefined; + /** + * The expected or default key length in bytes. + */ + keyLength: number; + /** + * The cipher mode. + */ + mode: CipherMode; + } + /** + * Returns information about a given cipher. + * + * Some ciphers accept variable length keys and initialization vectors. By default, + * the `crypto.getCipherInfo()` method will return the default values for these + * ciphers. To test if a given key length or iv length is acceptable for given + * cipher, use the `keyLength` and `ivLength` options. If the given values are + * unacceptable, `undefined` will be returned. + * @since v15.0.0 + * @param nameOrNid The name or nid of the cipher to query. + */ + function getCipherInfo(nameOrNid: string | number, options?: CipherInfoOptions): CipherInfo | undefined; + /** + * HKDF is a simple key derivation function defined in RFC 5869\. The given `ikm`,`salt` and `info` are used with the `digest` to derive a key of `keylen` bytes. + * + * The supplied `callback` function is called with two arguments: `err` and`derivedKey`. If an errors occurs while deriving the key, `err` will be set; + * otherwise `err` will be `null`. The successfully generated `derivedKey` will + * be passed to the callback as an [ArrayBuffer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer). An error will be thrown if any + * of the input arguments specify invalid values or types. + * + * ```js + * import { Buffer } from 'buffer'; + * const { + * hkdf + * } = await import('crypto'); + * + * hkdf('sha512', 'key', 'salt', 'info', 64, (err, derivedKey) => { + * if (err) throw err; + * console.log(Buffer.from(derivedKey).toString('hex')); // '24156e2...5391653' + * }); + * ``` + * @since v15.0.0 + * @param digest The digest algorithm to use. + * @param ikm The input keying material. It must be at least one byte in length. + * @param salt The salt value. Must be provided but can be zero-length. 
+ * @param info Additional info value. Must be provided but can be zero-length, and cannot be more than 1024 bytes.
+ * @param keylen The length of the key to generate. Must be greater than 0. The maximum allowable value is `255` times the number of bytes produced by the selected digest function (e.g. `sha512`
+ * generates 64-byte hashes, making the maximum HKDF output 16320 bytes).
+ */
+ function hkdf(digest: string, ikm: BinaryLike | KeyObject, salt: BinaryLike, info: BinaryLike, keylen: number, callback: (err: Error | null, derivedKey: ArrayBuffer) => void): void;
+ /**
+ * Provides a synchronous HKDF key derivation function as defined in RFC 5869\. The
+ * given `ikm`, `salt` and `info` are used with the `digest` to derive a key of `keylen` bytes.
+ *
+ * The successfully generated `derivedKey` will be returned as an [ArrayBuffer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer).
+ *
+ * An error will be thrown if any of the input arguments specify invalid values or
+ * types, or if the derived key cannot be generated.
+ *
+ * ```js
+ * import { Buffer } from 'buffer';
+ * const {
+ *   hkdfSync
+ * } = await import('crypto');
+ *
+ * const derivedKey = hkdfSync('sha512', 'key', 'salt', 'info', 64);
+ * console.log(Buffer.from(derivedKey).toString('hex'));  // '24156e2...5391653'
+ * ```
+ * @since v15.0.0
+ * @param digest The digest algorithm to use.
+ * @param ikm The input keying material. It must be at least one byte in length.
+ * @param salt The salt value. Must be provided but can be zero-length.
+ * @param info Additional info value. Must be provided but can be zero-length, and cannot be more than 1024 bytes.
+ * @param keylen The length of the key to generate. Must be greater than 0. The maximum allowable value is `255` times the number of bytes produced by the selected digest function (e.g. `sha512`
+ * generates 64-byte hashes, making the maximum HKDF output 16320 bytes).
+ */
+ function hkdfSync(digest: string, ikm: BinaryLike | KeyObject, salt: BinaryLike, info: BinaryLike, keylen: number): ArrayBuffer;
+ interface SecureHeapUsage {
+ /**
+ * The total allocated secure heap size as specified using the `--secure-heap=n` command-line flag.
+ */
+ total: number;
+ /**
+ * The minimum allocation from the secure heap as specified using the `--secure-heap-min` command-line flag.
+ */
+ min: number;
+ /**
+ * The total number of bytes currently allocated from the secure heap.
+ */
+ used: number;
+ /**
+ * The calculated ratio of `used` to `total` allocated bytes.
+ */
+ utilization: number;
+ }
+ /**
+ * @since v15.6.0
+ */
+ function secureHeapUsed(): SecureHeapUsage;
+ interface RandomUUIDOptions {
+ /**
+ * By default, to improve performance,
+ * Node.js will pre-emptively generate and persistently cache enough
+ * random data to generate up to 128 random UUIDs. To generate a UUID
+ * without using the cache, set `disableEntropyCache` to `true`.
+ *
+ * @default `false`
+ */
+ disableEntropyCache?: boolean | undefined;
+ }
+ /**
+ * Generates a random [RFC 4122](https://www.rfc-editor.org/rfc/rfc4122.txt) version 4 UUID. The UUID is generated using a
+ * cryptographic pseudorandom number generator.
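+ *
+ * A minimal illustrative sketch (not part of the upstream docs):
+ *
+ * ```js
+ * const { randomUUID } = await import('crypto');
+ *
+ * console.log(randomUUID());
+ * // e.g. '36b8f84d-df4e-4d49-b662-bcde71a8764f'
+ * ```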
+ * @since v15.6.0 + */ + function randomUUID(options?: RandomUUIDOptions): string; + interface X509CheckOptions { + /** + * @default 'always' + */ + subject: 'always' | 'never'; + /** + * @default true + */ + wildcards: boolean; + /** + * @default true + */ + partialWildcards: boolean; + /** + * @default false + */ + multiLabelWildcards: boolean; + /** + * @default false + */ + singleLabelSubdomains: boolean; + } + /** + * Encapsulates an X509 certificate and provides read-only access to + * its information. + * + * ```js + * const { X509Certificate } = await import('crypto'); + * + * const x509 = new X509Certificate('{... pem encoded cert ...}'); + * + * console.log(x509.subject); + * ``` + * @since v15.6.0 + */ + class X509Certificate { + /** + * Will be \`true\` if this is a Certificate Authority (ca) certificate. + * @since v15.6.0 + */ + readonly ca: boolean; + /** + * The SHA-1 fingerprint of this certificate. + * @since v15.6.0 + */ + readonly fingerprint: string; + /** + * The SHA-256 fingerprint of this certificate. + * @since v15.6.0 + */ + readonly fingerprint256: string; + /** + * The complete subject of this certificate. + * @since v15.6.0 + */ + readonly subject: string; + /** + * The subject alternative name specified for this certificate. + * @since v15.6.0 + */ + readonly subjectAltName: string; + /** + * The information access content of this certificate. + * @since v15.6.0 + */ + readonly infoAccess: string; + /** + * An array detailing the key usages for this certificate. + * @since v15.6.0 + */ + readonly keyUsage: string[]; + /** + * The issuer identification included in this certificate. + * @since v15.6.0 + */ + readonly issuer: string; + /** + * The issuer certificate or `undefined` if the issuer certificate is not + * available. + * @since v15.9.0 + */ + readonly issuerCertificate?: X509Certificate | undefined; + /** + * The public key `KeyObject` for this certificate. + * @since v15.6.0 + */ + readonly publicKey: KeyObject; + /** + * A `Buffer` containing the DER encoding of this certificate. + * @since v15.6.0 + */ + readonly raw: Buffer; + /** + * The serial number of this certificate. + * @since v15.6.0 + */ + readonly serialNumber: string; + /** + * The date/time from which this certificate is considered valid. + * @since v15.6.0 + */ + readonly validFrom: string; + /** + * The date/time until which this certificate is considered valid. + * @since v15.6.0 + */ + readonly validTo: string; + constructor(buffer: BinaryLike); + /** + * Checks whether the certificate matches the given email address. + * @since v15.6.0 + * @return Returns `email` if the certificate matches, `undefined` if it does not. + */ + checkEmail(email: string, options?: X509CheckOptions): string | undefined; + /** + * Checks whether the certificate matches the given host name. + * @since v15.6.0 + * @return Returns `name` if the certificate matches, `undefined` if it does not. + */ + checkHost(name: string, options?: X509CheckOptions): string | undefined; + /** + * Checks whether the certificate matches the given IP address (IPv4 or IPv6). + * @since v15.6.0 + * @return Returns `ip` if the certificate matches, `undefined` if it does not. + */ + checkIP(ip: string, options?: X509CheckOptions): string | undefined; + /** + * Checks whether this certificate was issued by the given `otherCert`. + * @since v15.6.0 + */ + checkIssued(otherCert: X509Certificate): boolean; + /** + * Checks whether the public key for this certificate is consistent with + * the given private key. 
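+ *
+ * For illustration only (a sketch, not from the upstream docs), assuming `certPem`
+ * and `keyPem` hold a PEM-encoded certificate and a candidate private key:
+ *
+ * ```js
+ * const { X509Certificate, createPrivateKey } = await import('crypto');
+ *
+ * // `certPem` and `keyPem` are assumed inputs.
+ * const cert = new X509Certificate(certPem);
+ * const candidateKey = createPrivateKey(keyPem);
+ * console.log(cert.checkPrivateKey(candidateKey)); // true if the pair matches
+ * ```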
+ * @since v15.6.0 + * @param privateKey A private key. + */ + checkPrivateKey(privateKey: KeyObject): boolean; + /** + * There is no standard JSON encoding for X509 certificates. The`toJSON()` method returns a string containing the PEM encoded + * certificate. + * @since v15.6.0 + */ + toJSON(): string; + /** + * Returns information about this certificate using the legacy `certificate object` encoding. + * @since v15.6.0 + */ + toLegacyObject(): PeerCertificate; + /** + * Returns the PEM-encoded certificate. + * @since v15.6.0 + */ + toString(): string; + /** + * Verifies that this certificate was signed by the given public key. + * Does not perform any other validation checks on the certificate. + * @since v15.6.0 + * @param publicKey A public key. + */ + verify(publicKey: KeyObject): boolean; + } + type LargeNumberLike = NodeJS.ArrayBufferView | SharedArrayBuffer | ArrayBuffer | bigint; + interface GeneratePrimeOptions { + add?: LargeNumberLike | undefined; + rem?: LargeNumberLike | undefined; + /** + * @default false + */ + safe?: boolean | undefined; + bigint?: boolean | undefined; + } + interface GeneratePrimeOptionsBigInt extends GeneratePrimeOptions { + bigint: true; + } + interface GeneratePrimeOptionsArrayBuffer extends GeneratePrimeOptions { + bigint?: false | undefined; + } + /** + * Generates a pseudorandom prime of `size` bits. + * + * If `options.safe` is `true`, the prime will be a safe prime -- that is,`(prime - 1) / 2` will also be a prime. + * + * The `options.add` and `options.rem` parameters can be used to enforce additional + * requirements, e.g., for Diffie-Hellman: + * + * * If `options.add` and `options.rem` are both set, the prime will satisfy the + * condition that `prime % add = rem`. + * * If only `options.add` is set and `options.safe` is not `true`, the prime will + * satisfy the condition that `prime % add = 1`. + * * If only `options.add` is set and `options.safe` is set to `true`, the prime + * will instead satisfy the condition that `prime % add = 3`. This is necessary + * because `prime % add = 1` for `options.add > 2` would contradict the condition + * enforced by `options.safe`. + * * `options.rem` is ignored if `options.add` is not given. + * + * Both `options.add` and `options.rem` must be encoded as big-endian sequences + * if given as an `ArrayBuffer`, `SharedArrayBuffer`, `TypedArray`, `Buffer`, or`DataView`. + * + * By default, the prime is encoded as a big-endian sequence of octets + * in an [ArrayBuffer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer). If the `bigint` option is `true`, then a + * [bigint](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt) is provided. + * @since v15.8.0 + * @param size The size (in bits) of the prime to generate. + */ + function generatePrime(size: number, callback: (err: Error | null, prime: ArrayBuffer) => void): void; + function generatePrime(size: number, options: GeneratePrimeOptionsBigInt, callback: (err: Error | null, prime: bigint) => void): void; + function generatePrime(size: number, options: GeneratePrimeOptionsArrayBuffer, callback: (err: Error | null, prime: ArrayBuffer) => void): void; + function generatePrime(size: number, options: GeneratePrimeOptions, callback: (err: Error | null, prime: ArrayBuffer | bigint) => void): void; + /** + * Generates a pseudorandom prime of `size` bits. + * + * If `options.safe` is `true`, the prime will be a safe prime -- that is,`(prime - 1) / 2` will also be a prime. 
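+ *
+ * For illustration only (a sketch, not from the upstream docs), generating a
+ * 128-bit prime as a `bigint`:
+ *
+ * ```js
+ * const { generatePrimeSync } = await import('crypto');
+ *
+ * const prime = generatePrimeSync(128, { bigint: true });
+ * console.log(typeof prime); // 'bigint'
+ * ```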
+ * + * The `options.add` and `options.rem` parameters can be used to enforce additional + * requirements, e.g., for Diffie-Hellman: + * + * * If `options.add` and `options.rem` are both set, the prime will satisfy the + * condition that `prime % add = rem`. + * * If only `options.add` is set and `options.safe` is not `true`, the prime will + * satisfy the condition that `prime % add = 1`. + * * If only `options.add` is set and `options.safe` is set to `true`, the prime + * will instead satisfy the condition that `prime % add = 3`. This is necessary + * because `prime % add = 1` for `options.add > 2` would contradict the condition + * enforced by `options.safe`. + * * `options.rem` is ignored if `options.add` is not given. + * + * Both `options.add` and `options.rem` must be encoded as big-endian sequences + * if given as an `ArrayBuffer`, `SharedArrayBuffer`, `TypedArray`, `Buffer`, or`DataView`. + * + * By default, the prime is encoded as a big-endian sequence of octets + * in an [ArrayBuffer](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer). If the `bigint` option is `true`, then a + * [bigint](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt) is provided. + * @since v15.8.0 + * @param size The size (in bits) of the prime to generate. + */ + function generatePrimeSync(size: number): ArrayBuffer; + function generatePrimeSync(size: number, options: GeneratePrimeOptionsBigInt): bigint; + function generatePrimeSync(size: number, options: GeneratePrimeOptionsArrayBuffer): ArrayBuffer; + function generatePrimeSync(size: number, options: GeneratePrimeOptions): ArrayBuffer | bigint; + interface CheckPrimeOptions { + /** + * The number of Miller-Rabin probabilistic primality iterations to perform. + * When the value is 0 (zero), a number of checks is used that yields a false positive rate of at most 2-64 for random input. + * Care must be used when selecting a number of checks. + * Refer to the OpenSSL documentation for the BN_is_prime_ex function nchecks options for more details. + * + * @default 0 + */ + checks?: number | undefined; + } + /** + * Checks the primality of the `candidate`. + * @since v15.8.0 + * @param candidate A possible prime encoded as a sequence of big endian octets of arbitrary length. + */ + function checkPrime(value: LargeNumberLike, callback: (err: Error | null, result: boolean) => void): void; + function checkPrime(value: LargeNumberLike, options: CheckPrimeOptions, callback: (err: Error | null, result: boolean) => void): void; + /** + * Checks the primality of the `candidate`. + * @since v15.8.0 + * @param candidate A possible prime encoded as a sequence of big endian octets of arbitrary length. + * @return `true` if the candidate is a prime with an error probability less than `0.25 ** options.checks`. + */ + function checkPrimeSync(candidate: LargeNumberLike, options?: CheckPrimeOptions): boolean; + namespace webcrypto { + class CryptoKey {} // placeholder + } +} +declare module 'node:crypto' { + export * from 'crypto'; +} diff --git a/node_modules/@types/node/dgram.d.ts b/node_modules/@types/node/dgram.d.ts new file mode 100644 index 000000000..8a2d2c804 --- /dev/null +++ b/node_modules/@types/node/dgram.d.ts @@ -0,0 +1,545 @@ +/** + * The `dgram` module provides an implementation of UDP datagram sockets. 
+ * + * ```js + * import dgram from 'dgram'; + * + * const server = dgram.createSocket('udp4'); + * + * server.on('error', (err) => { + * console.log(`server error:\n${err.stack}`); + * server.close(); + * }); + * + * server.on('message', (msg, rinfo) => { + * console.log(`server got: ${msg} from ${rinfo.address}:${rinfo.port}`); + * }); + * + * server.on('listening', () => { + * const address = server.address(); + * console.log(`server listening ${address.address}:${address.port}`); + * }); + * + * server.bind(41234); + * // Prints: server listening 0.0.0.0:41234 + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/dgram.js) + */ +declare module 'dgram' { + import { AddressInfo } from 'node:net'; + import * as dns from 'node:dns'; + import { EventEmitter, Abortable } from 'node:events'; + interface RemoteInfo { + address: string; + family: 'IPv4' | 'IPv6'; + port: number; + size: number; + } + interface BindOptions { + port?: number | undefined; + address?: string | undefined; + exclusive?: boolean | undefined; + fd?: number | undefined; + } + type SocketType = 'udp4' | 'udp6'; + interface SocketOptions extends Abortable { + type: SocketType; + reuseAddr?: boolean | undefined; + /** + * @default false + */ + ipv6Only?: boolean | undefined; + recvBufferSize?: number | undefined; + sendBufferSize?: number | undefined; + lookup?: ((hostname: string, options: dns.LookupOneOptions, callback: (err: NodeJS.ErrnoException | null, address: string, family: number) => void) => void) | undefined; + } + /** + * Creates a `dgram.Socket` object. Once the socket is created, calling `socket.bind()` will instruct the socket to begin listening for datagram + * messages. When `address` and `port` are not passed to `socket.bind()` the + * method will bind the socket to the "all interfaces" address on a random port + * (it does the right thing for both `udp4` and `udp6` sockets). The bound address + * and port can be retrieved using `socket.address().address` and `socket.address().port`. + * + * If the `signal` option is enabled, calling `.abort()` on the corresponding`AbortController` is similar to calling `.close()` on the socket: + * + * ```js + * const controller = new AbortController(); + * const { signal } = controller; + * const server = dgram.createSocket({ type: 'udp4', signal }); + * server.on('message', (msg, rinfo) => { + * console.log(`server got: ${msg} from ${rinfo.address}:${rinfo.port}`); + * }); + * // Later, when you want to close the server. + * controller.abort(); + * ``` + * @since v0.11.13 + * @param options Available options are: + * @param callback Attached as a listener for `'message'` events. Optional. + */ + function createSocket(type: SocketType, callback?: (msg: Buffer, rinfo: RemoteInfo) => void): Socket; + function createSocket(options: SocketOptions, callback?: (msg: Buffer, rinfo: RemoteInfo) => void): Socket; + /** + * Encapsulates the datagram functionality. + * + * New instances of `dgram.Socket` are created using {@link createSocket}. + * The `new` keyword is not to be used to create `dgram.Socket` instances. + * @since v0.1.99 + */ + class Socket extends EventEmitter { + /** + * Tells the kernel to join a multicast group at the given `multicastAddress` and`multicastInterface` using the `IP_ADD_MEMBERSHIP` socket option. If the`multicastInterface` argument is not + * specified, the operating system will choose + * one interface and will add membership to it. 
To add membership to every + * available interface, call `addMembership` multiple times, once per interface. + * + * When called on an unbound socket, this method will implicitly bind to a random + * port, listening on all interfaces. + * + * When sharing a UDP socket across multiple `cluster` workers, the`socket.addMembership()` function must be called only once or an`EADDRINUSE` error will occur: + * + * ```js + * import cluster from 'cluster'; + * import dgram from 'dgram'; + * + * if (cluster.isPrimary) { + * cluster.fork(); // Works ok. + * cluster.fork(); // Fails with EADDRINUSE. + * } else { + * const s = dgram.createSocket('udp4'); + * s.bind(1234, () => { + * s.addMembership('224.0.0.114'); + * }); + * } + * ``` + * @since v0.6.9 + */ + addMembership(multicastAddress: string, multicastInterface?: string): void; + /** + * Returns an object containing the address information for a socket. + * For UDP sockets, this object will contain `address`, `family` and `port`properties. + * + * This method throws `EBADF` if called on an unbound socket. + * @since v0.1.99 + */ + address(): AddressInfo; + /** + * For UDP sockets, causes the `dgram.Socket` to listen for datagram + * messages on a named `port` and optional `address`. If `port` is not + * specified or is `0`, the operating system will attempt to bind to a + * random port. If `address` is not specified, the operating system will + * attempt to listen on all addresses. Once binding is complete, a`'listening'` event is emitted and the optional `callback` function is + * called. + * + * Specifying both a `'listening'` event listener and passing a`callback` to the `socket.bind()` method is not harmful but not very + * useful. + * + * A bound datagram socket keeps the Node.js process running to receive + * datagram messages. + * + * If binding fails, an `'error'` event is generated. In rare case (e.g. + * attempting to bind with a closed socket), an `Error` may be thrown. + * + * Example of a UDP server listening on port 41234: + * + * ```js + * import dgram from 'dgram'; + * + * const server = dgram.createSocket('udp4'); + * + * server.on('error', (err) => { + * console.log(`server error:\n${err.stack}`); + * server.close(); + * }); + * + * server.on('message', (msg, rinfo) => { + * console.log(`server got: ${msg} from ${rinfo.address}:${rinfo.port}`); + * }); + * + * server.on('listening', () => { + * const address = server.address(); + * console.log(`server listening ${address.address}:${address.port}`); + * }); + * + * server.bind(41234); + * // Prints: server listening 0.0.0.0:41234 + * ``` + * @since v0.1.99 + * @param callback with no parameters. Called when binding is complete. + */ + bind(port?: number, address?: string, callback?: () => void): void; + bind(port?: number, callback?: () => void): void; + bind(callback?: () => void): void; + bind(options: BindOptions, callback?: () => void): void; + /** + * Close the underlying socket and stop listening for data on it. If a callback is + * provided, it is added as a listener for the `'close'` event. + * @since v0.1.99 + * @param callback Called when the socket has been closed. + */ + close(callback?: () => void): void; + /** + * Associates the `dgram.Socket` to a remote address and port. Every + * message sent by this handle is automatically sent to that destination. Also, + * the socket will only receive messages from that remote peer. + * Trying to call `connect()` on an already connected socket will result + * in an `ERR_SOCKET_DGRAM_IS_CONNECTED` exception. 
If `address` is not + * provided, `'127.0.0.1'` (for `udp4` sockets) or `'::1'` (for `udp6` sockets) + * will be used by default. Once the connection is complete, a `'connect'` event + * is emitted and the optional `callback` function is called. In case of failure, + * the `callback` is called or, failing this, an `'error'` event is emitted. + * @since v12.0.0 + * @param callback Called when the connection is completed or on error. + */ + connect(port: number, address?: string, callback?: () => void): void; + connect(port: number, callback: () => void): void; + /** + * A synchronous function that disassociates a connected `dgram.Socket` from + * its remote address. Trying to call `disconnect()` on an unbound or already + * disconnected socket will result in an `ERR_SOCKET_DGRAM_NOT_CONNECTED` exception. + * @since v12.0.0 + */ + disconnect(): void; + /** + * Instructs the kernel to leave a multicast group at `multicastAddress` using the`IP_DROP_MEMBERSHIP` socket option. This method is automatically called by the + * kernel when the socket is closed or the process terminates, so most apps will + * never have reason to call this. + * + * If `multicastInterface` is not specified, the operating system will attempt to + * drop membership on all valid interfaces. + * @since v0.6.9 + */ + dropMembership(multicastAddress: string, multicastInterface?: string): void; + /** + * This method throws `ERR_SOCKET_BUFFER_SIZE` if called on an unbound socket. + * @since v8.7.0 + * @return the `SO_RCVBUF` socket receive buffer size in bytes. + */ + getRecvBufferSize(): number; + /** + * This method throws `ERR_SOCKET_BUFFER_SIZE` if called on an unbound socket. + * @since v8.7.0 + * @return the `SO_SNDBUF` socket send buffer size in bytes. + */ + getSendBufferSize(): number; + /** + * By default, binding a socket will cause it to block the Node.js process from + * exiting as long as the socket is open. The `socket.unref()` method can be used + * to exclude the socket from the reference counting that keeps the Node.js + * process active. The `socket.ref()` method adds the socket back to the reference + * counting and restores the default behavior. + * + * Calling `socket.ref()` multiples times will have no additional effect. + * + * The `socket.ref()` method returns a reference to the socket so calls can be + * chained. + * @since v0.9.1 + */ + ref(): this; + /** + * Returns an object containing the `address`, `family`, and `port` of the remote + * endpoint. This method throws an `ERR_SOCKET_DGRAM_NOT_CONNECTED` exception + * if the socket is not connected. + * @since v12.0.0 + */ + remoteAddress(): AddressInfo; + /** + * Broadcasts a datagram on the socket. + * For connectionless sockets, the destination `port` and `address` must be + * specified. Connected sockets, on the other hand, will use their associated + * remote endpoint, so the `port` and `address` arguments must not be set. + * + * The `msg` argument contains the message to be sent. + * Depending on its type, different behavior can apply. If `msg` is a `Buffer`, + * any `TypedArray` or a `DataView`, + * the `offset` and `length` specify the offset within the `Buffer` where the + * message begins and the number of bytes in the message, respectively. + * If `msg` is a `String`, then it is automatically converted to a `Buffer`with `'utf8'` encoding. With messages that + * contain multi-byte characters, `offset` and `length` will be calculated with + * respect to `byte length` and not the character position. 
+ * If `msg` is an array, `offset` and `length` must not be specified. + * + * The `address` argument is a string. If the value of `address` is a host name, + * DNS will be used to resolve the address of the host. If `address` is not + * provided or otherwise falsy, `'127.0.0.1'` (for `udp4` sockets) or `'::1'`(for `udp6` sockets) will be used by default. + * + * If the socket has not been previously bound with a call to `bind`, the socket + * is assigned a random port number and is bound to the "all interfaces" address + * (`'0.0.0.0'` for `udp4` sockets, `'::0'` for `udp6` sockets.) + * + * An optional `callback` function may be specified to as a way of reporting + * DNS errors or for determining when it is safe to reuse the `buf` object. + * DNS lookups delay the time to send for at least one tick of the + * Node.js event loop. + * + * The only way to know for sure that the datagram has been sent is by using a`callback`. If an error occurs and a `callback` is given, the error will be + * passed as the first argument to the `callback`. If a `callback` is not given, + * the error is emitted as an `'error'` event on the `socket` object. + * + * Offset and length are optional but both _must_ be set if either are used. + * They are supported only when the first argument is a `Buffer`, a `TypedArray`, + * or a `DataView`. + * + * This method throws `ERR_SOCKET_BAD_PORT` if called on an unbound socket. + * + * Example of sending a UDP packet to a port on `localhost`; + * + * ```js + * import dgram from 'dgram'; + * import { Buffer } from 'buffer'; + * + * const message = Buffer.from('Some bytes'); + * const client = dgram.createSocket('udp4'); + * client.send(message, 41234, 'localhost', (err) => { + * client.close(); + * }); + * ``` + * + * Example of sending a UDP packet composed of multiple buffers to a port on`127.0.0.1`; + * + * ```js + * import dgram from 'dgram'; + * import { Buffer } from 'buffer'; + * + * const buf1 = Buffer.from('Some '); + * const buf2 = Buffer.from('bytes'); + * const client = dgram.createSocket('udp4'); + * client.send([buf1, buf2], 41234, (err) => { + * client.close(); + * }); + * ``` + * + * Sending multiple buffers might be faster or slower depending on the + * application and operating system. Run benchmarks to + * determine the optimal strategy on a case-by-case basis. Generally speaking, + * however, sending multiple buffers is faster. + * + * Example of sending a UDP packet using a socket connected to a port on`localhost`: + * + * ```js + * import dgram from 'dgram'; + * import { Buffer } from 'buffer'; + * + * const message = Buffer.from('Some bytes'); + * const client = dgram.createSocket('udp4'); + * client.connect(41234, 'localhost', (err) => { + * client.send(message, (err) => { + * client.close(); + * }); + * }); + * ``` + * @since v0.1.99 + * @param msg Message to be sent. + * @param offset Offset in the buffer where the message starts. + * @param length Number of bytes in the message. + * @param port Destination port. + * @param address Destination host name or IP address. + * @param callback Called when the message has been sent. 
+ */ + send(msg: string | Uint8Array | ReadonlyArray, port?: number, address?: string, callback?: (error: Error | null, bytes: number) => void): void; + send(msg: string | Uint8Array | ReadonlyArray, port?: number, callback?: (error: Error | null, bytes: number) => void): void; + send(msg: string | Uint8Array | ReadonlyArray, callback?: (error: Error | null, bytes: number) => void): void; + send(msg: string | Uint8Array, offset: number, length: number, port?: number, address?: string, callback?: (error: Error | null, bytes: number) => void): void; + send(msg: string | Uint8Array, offset: number, length: number, port?: number, callback?: (error: Error | null, bytes: number) => void): void; + send(msg: string | Uint8Array, offset: number, length: number, callback?: (error: Error | null, bytes: number) => void): void; + /** + * Sets or clears the `SO_BROADCAST` socket option. When set to `true`, UDP + * packets may be sent to a local interface's broadcast address. + * + * This method throws `EBADF` if called on an unbound socket. + * @since v0.6.9 + */ + setBroadcast(flag: boolean): void; + /** + * _All references to scope in this section are referring to [IPv6 Zone Indices](https://en.wikipedia.org/wiki/IPv6_address#Scoped_literal_IPv6_addresses), which are defined by [RFC + * 4007](https://tools.ietf.org/html/rfc4007). In string form, an IP_ + * _with a scope index is written as `'IP%scope'` where scope is an interface name_ + * _or interface number._ + * + * Sets the default outgoing multicast interface of the socket to a chosen + * interface or back to system interface selection. The `multicastInterface` must + * be a valid string representation of an IP from the socket's family. + * + * For IPv4 sockets, this should be the IP configured for the desired physical + * interface. All packets sent to multicast on the socket will be sent on the + * interface determined by the most recent successful use of this call. + * + * For IPv6 sockets, `multicastInterface` should include a scope to indicate the + * interface as in the examples that follow. In IPv6, individual `send` calls can + * also use explicit scope in addresses, so only packets sent to a multicast + * address without specifying an explicit scope are affected by the most recent + * successful use of this call. + * + * This method throws `EBADF` if called on an unbound socket. + * + * #### Example: IPv6 outgoing multicast interface + * + * On most systems, where scope format uses the interface name: + * + * ```js + * const socket = dgram.createSocket('udp6'); + * + * socket.bind(1234, () => { + * socket.setMulticastInterface('::%eth1'); + * }); + * ``` + * + * On Windows, where scope format uses an interface number: + * + * ```js + * const socket = dgram.createSocket('udp6'); + * + * socket.bind(1234, () => { + * socket.setMulticastInterface('::%2'); + * }); + * ``` + * + * #### Example: IPv4 outgoing multicast interface + * + * All systems use an IP of the host on the desired physical interface: + * + * ```js + * const socket = dgram.createSocket('udp4'); + * + * socket.bind(1234, () => { + * socket.setMulticastInterface('10.0.0.2'); + * }); + * ``` + * @since v8.6.0 + */ + setMulticastInterface(multicastInterface: string): void; + /** + * Sets or clears the `IP_MULTICAST_LOOP` socket option. When set to `true`, + * multicast packets will also be received on the local interface. + * + * This method throws `EBADF` if called on an unbound socket. 
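+ *
+ * For illustration only (a sketch, not from the upstream docs):
+ *
+ * ```js
+ * const socket = dgram.createSocket('udp4');
+ *
+ * socket.bind(1234, () => {
+ *   // Deliver this host's own multicast packets back to this socket as well.
+ *   socket.setMulticastLoopback(true);
+ * });
+ * ```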
+ * @since v0.3.8 + */ + setMulticastLoopback(flag: boolean): void; + /** + * Sets the `IP_MULTICAST_TTL` socket option. While TTL generally stands for + * "Time to Live", in this context it specifies the number of IP hops that a + * packet is allowed to travel through, specifically for multicast traffic. Each + * router or gateway that forwards a packet decrements the TTL. If the TTL is + * decremented to 0 by a router, it will not be forwarded. + * + * The `ttl` argument may be between 0 and 255\. The default on most systems is `1`. + * + * This method throws `EBADF` if called on an unbound socket. + * @since v0.3.8 + */ + setMulticastTTL(ttl: number): void; + /** + * Sets the `SO_RCVBUF` socket option. Sets the maximum socket receive buffer + * in bytes. + * + * This method throws `ERR_SOCKET_BUFFER_SIZE` if called on an unbound socket. + * @since v8.7.0 + */ + setRecvBufferSize(size: number): void; + /** + * Sets the `SO_SNDBUF` socket option. Sets the maximum socket send buffer + * in bytes. + * + * This method throws `ERR_SOCKET_BUFFER_SIZE` if called on an unbound socket. + * @since v8.7.0 + */ + setSendBufferSize(size: number): void; + /** + * Sets the `IP_TTL` socket option. While TTL generally stands for "Time to Live", + * in this context it specifies the number of IP hops that a packet is allowed to + * travel through. Each router or gateway that forwards a packet decrements the + * TTL. If the TTL is decremented to 0 by a router, it will not be forwarded. + * Changing TTL values is typically done for network probes or when multicasting. + * + * The `ttl` argument may be between between 1 and 255\. The default on most systems + * is 64. + * + * This method throws `EBADF` if called on an unbound socket. + * @since v0.1.101 + */ + setTTL(ttl: number): void; + /** + * By default, binding a socket will cause it to block the Node.js process from + * exiting as long as the socket is open. The `socket.unref()` method can be used + * to exclude the socket from the reference counting that keeps the Node.js + * process active, allowing the process to exit even if the socket is still + * listening. + * + * Calling `socket.unref()` multiple times will have no addition effect. + * + * The `socket.unref()` method returns a reference to the socket so calls can be + * chained. + * @since v0.9.1 + */ + unref(): this; + /** + * Tells the kernel to join a source-specific multicast channel at the given`sourceAddress` and `groupAddress`, using the `multicastInterface` with the`IP_ADD_SOURCE_MEMBERSHIP` socket + * option. If the `multicastInterface` argument + * is not specified, the operating system will choose one interface and will add + * membership to it. To add membership to every available interface, call`socket.addSourceSpecificMembership()` multiple times, once per interface. + * + * When called on an unbound socket, this method will implicitly bind to a random + * port, listening on all interfaces. + * @since v13.1.0, v12.16.0 + */ + addSourceSpecificMembership(sourceAddress: string, groupAddress: string, multicastInterface?: string): void; + /** + * Instructs the kernel to leave a source-specific multicast channel at the given`sourceAddress` and `groupAddress` using the `IP_DROP_SOURCE_MEMBERSHIP`socket option. This method is + * automatically called by the kernel when the + * socket is closed or the process terminates, so most apps will never have + * reason to call this. 
+ * + * If `multicastInterface` is not specified, the operating system will attempt to + * drop membership on all valid interfaces. + * @since v13.1.0, v12.16.0 + */ + dropSourceSpecificMembership(sourceAddress: string, groupAddress: string, multicastInterface?: string): void; + /** + * events.EventEmitter + * 1. close + * 2. connect + * 3. error + * 4. listening + * 5. message + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'connect', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'listening', listener: () => void): this; + addListener(event: 'message', listener: (msg: Buffer, rinfo: RemoteInfo) => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'close'): boolean; + emit(event: 'connect'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'listening'): boolean; + emit(event: 'message', msg: Buffer, rinfo: RemoteInfo): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'connect', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'listening', listener: () => void): this; + on(event: 'message', listener: (msg: Buffer, rinfo: RemoteInfo) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'connect', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'listening', listener: () => void): this; + once(event: 'message', listener: (msg: Buffer, rinfo: RemoteInfo) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'connect', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'listening', listener: () => void): this; + prependListener(event: 'message', listener: (msg: Buffer, rinfo: RemoteInfo) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'connect', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'listening', listener: () => void): this; + prependOnceListener(event: 'message', listener: (msg: Buffer, rinfo: RemoteInfo) => void): this; + } +} +declare module 'node:dgram' { + export * from 'dgram'; +} diff --git a/node_modules/@types/node/diagnostics_channel.d.ts b/node_modules/@types/node/diagnostics_channel.d.ts new file mode 100644 index 000000000..a59478e95 --- /dev/null +++ b/node_modules/@types/node/diagnostics_channel.d.ts @@ -0,0 +1,128 @@ +/** + * The `diagnostics_channel` module provides an API to create named channels + * to report arbitrary message data for diagnostics purposes. + * + * It can be accessed using: + * + * ```js + * import diagnostics_channel from 'diagnostics_channel'; + * ``` + * + * It is intended that a module writer wanting to report diagnostics messages + * will create one or many top-level channels to report messages through. 
+ * Channels may also be acquired at runtime but it is not encouraged + * due to the additional overhead of doing so. Channels may be exported for + * convenience, but as long as the name is known it can be acquired anywhere. + * + * If you intend for your module to produce diagnostics data for others to + * consume it is recommended that you include documentation of what named + * channels are used along with the shape of the message data. Channel names + * should generally include the module name to avoid collisions with data from + * other modules. + * @experimental + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/diagnostics_channel.js) + */ +declare module 'diagnostics_channel' { + /** + * Check if there are active subscribers to the named channel. This is helpful if + * the message you want to send might be expensive to prepare. + * + * This API is optional but helpful when trying to publish messages from very + * performance-sensitive code. + * + * ```js + * import diagnostics_channel from 'diagnostics_channel'; + * + * if (diagnostics_channel.hasSubscribers('my-channel')) { + * // There are subscribers, prepare and publish message + * } + * ``` + * @param name The channel name + * @return If there are active subscribers + */ + function hasSubscribers(name: string): boolean; + /** + * This is the primary entry-point for anyone wanting to interact with a named + * channel. It produces a channel object which is optimized to reduce overhead at + * publish time as much as possible. + * + * ```js + * import diagnostics_channel from 'diagnostics_channel'; + * + * const channel = diagnostics_channel.channel('my-channel'); + * ``` + * @param name The channel name + * @return The named channel object + */ + function channel(name: string): Channel; + type ChannelListener = (name: string, message: unknown) => void; + /** + * The class `Channel` represents an individual named channel within the data + * pipeline. It is use to track subscribers and to publish messages when there + * are subscribers present. It exists as a separate object to avoid channel + * lookups at publish time, enabling very fast publish speeds and allowing + * for heavy use while incurring very minimal cost. Channels are created with {@link channel}, constructing a channel directly + * with `new Channel(name)` is not supported. + */ + class Channel { + readonly name: string; + /** + * Check if there are active subscribers to this channel. This is helpful if + * the message you want to send might be expensive to prepare. + * + * This API is optional but helpful when trying to publish messages from very + * performance-sensitive code. + * + * ```js + * import diagnostics_channel from 'diagnostics_channel'; + * + * const channel = diagnostics_channel.channel('my-channel'); + * + * if (channel.hasSubscribers) { + * // There are subscribers, prepare and publish message + * } + * ``` + */ + readonly hasSubscribers: boolean; + private constructor(name: string); + /** + * Register a message handler to subscribe to this channel. This message handler + * will be run synchronously whenever a message is published to the channel. Any + * errors thrown in the message handler will trigger an `'uncaughtException'`. 
+ * + * ```js + * import diagnostics_channel from 'diagnostics_channel'; + * + * const channel = diagnostics_channel.channel('my-channel'); + * + * channel.subscribe((message, name) => { + * // Received data + * }); + * ``` + * @param onMessage The handler to receive channel messages + */ + subscribe(onMessage: ChannelListener): void; + /** + * Remove a message handler previously registered to this channel with `channel.subscribe(onMessage)`. + * + * ```js + * import diagnostics_channel from 'diagnostics_channel'; + * + * const channel = diagnostics_channel.channel('my-channel'); + * + * function onMessage(message, name) { + * // Received data + * } + * + * channel.subscribe(onMessage); + * + * channel.unsubscribe(onMessage); + * ``` + * @param onMessage The previous subscribed handler to remove + */ + unsubscribe(onMessage: ChannelListener): void; + } +} +declare module 'node:diagnostics_channel' { + export * from 'diagnostics_channel'; +} diff --git a/node_modules/@types/node/dns.d.ts b/node_modules/@types/node/dns.d.ts new file mode 100644 index 000000000..11af81778 --- /dev/null +++ b/node_modules/@types/node/dns.d.ts @@ -0,0 +1,643 @@ +/** + * The `dns` module enables name resolution. For example, use it to look up IP + * addresses of host names. + * + * Although named for the [Domain Name System (DNS)](https://en.wikipedia.org/wiki/Domain_Name_System), it does not always use the + * DNS protocol for lookups. {@link lookup} uses the operating system + * facilities to perform name resolution. It may not need to perform any network + * communication. To perform name resolution the way other applications on the same + * system do, use {@link lookup}. + * + * ```js + * const dns = require('dns'); + * + * dns.lookup('example.org', (err, address, family) => { + * console.log('address: %j family: IPv%s', address, family); + * }); + * // address: "93.184.216.34" family: IPv4 + * ``` + * + * All other functions in the `dns` module connect to an actual DNS server to + * perform name resolution. They will always use the network to perform DNS + * queries. These functions do not use the same set of configuration files used by {@link lookup} (e.g. `/etc/hosts`). Use these functions to always perform + * DNS queries, bypassing other name-resolution facilities. + * + * ```js + * const dns = require('dns'); + * + * dns.resolve4('archive.org', (err, addresses) => { + * if (err) throw err; + * + * console.log(`addresses: ${JSON.stringify(addresses)}`); + * + * addresses.forEach((a) => { + * dns.reverse(a, (err, hostnames) => { + * if (err) { + * throw err; + * } + * console.log(`reverse for ${a}: ${JSON.stringify(hostnames)}`); + * }); + * }); + * }); + * ``` + * + * See the `Implementation considerations section` for more information. + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/dns.js) + */ +declare module 'dns' { + import * as dnsPromises from 'node:dns/promises'; + // Supported getaddrinfo flags. + export const ADDRCONFIG: number; + export const V4MAPPED: number; + /** + * If `dns.V4MAPPED` is specified, return resolved IPv6 addresses as + * well as IPv4 mapped IPv6 addresses. 
+ */ + export const ALL: number; + export interface LookupOptions { + family?: number | undefined; + hints?: number | undefined; + all?: boolean | undefined; + verbatim?: boolean | undefined; + } + export interface LookupOneOptions extends LookupOptions { + all?: false | undefined; + } + export interface LookupAllOptions extends LookupOptions { + all: true; + } + export interface LookupAddress { + address: string; + family: number; + } + /** + * Resolves a host name (e.g. `'nodejs.org'`) into the first found A (IPv4) or + * AAAA (IPv6) record. All `option` properties are optional. If `options` is an + * integer, then it must be `4` or `6` – if `options` is not provided, then IPv4 + * and IPv6 addresses are both returned if found. + * + * With the `all` option set to `true`, the arguments for `callback` change to`(err, addresses)`, with `addresses` being an array of objects with the + * properties `address` and `family`. + * + * On error, `err` is an `Error` object, where `err.code` is the error code. + * Keep in mind that `err.code` will be set to `'ENOTFOUND'` not only when + * the host name does not exist but also when the lookup fails in other ways + * such as no available file descriptors. + * + * `dns.lookup()` does not necessarily have anything to do with the DNS protocol. + * The implementation uses an operating system facility that can associate names + * with addresses, and vice versa. This implementation can have subtle but + * important consequences on the behavior of any Node.js program. Please take some + * time to consult the `Implementation considerations section` before using`dns.lookup()`. + * + * Example usage: + * + * ```js + * const dns = require('dns'); + * const options = { + * family: 6, + * hints: dns.ADDRCONFIG | dns.V4MAPPED, + * }; + * dns.lookup('example.com', options, (err, address, family) => + * console.log('address: %j family: IPv%s', address, family)); + * // address: "2606:2800:220:1:248:1893:25c8:1946" family: IPv6 + * + * // When options.all is true, the result will be an Array. + * options.all = true; + * dns.lookup('example.com', options, (err, addresses) => + * console.log('addresses: %j', addresses)); + * // addresses: [{"address":"2606:2800:220:1:248:1893:25c8:1946","family":6}] + * ``` + * + * If this method is invoked as its `util.promisify()` ed version, and `all`is not set to `true`, it returns a `Promise` for an `Object` with `address` and`family` properties. 
+ * @since v0.1.90 + */ + export function lookup(hostname: string, family: number, callback: (err: NodeJS.ErrnoException | null, address: string, family: number) => void): void; + export function lookup(hostname: string, options: LookupOneOptions, callback: (err: NodeJS.ErrnoException | null, address: string, family: number) => void): void; + export function lookup(hostname: string, options: LookupAllOptions, callback: (err: NodeJS.ErrnoException | null, addresses: LookupAddress[]) => void): void; + export function lookup(hostname: string, options: LookupOptions, callback: (err: NodeJS.ErrnoException | null, address: string | LookupAddress[], family: number) => void): void; + export function lookup(hostname: string, callback: (err: NodeJS.ErrnoException | null, address: string, family: number) => void): void; + export namespace lookup { + function __promisify__(hostname: string, options: LookupAllOptions): Promise; + function __promisify__(hostname: string, options?: LookupOneOptions | number): Promise; + function __promisify__(hostname: string, options: LookupOptions): Promise; + } + /** + * Resolves the given `address` and `port` into a host name and service using + * the operating system's underlying `getnameinfo` implementation. + * + * If `address` is not a valid IP address, a `TypeError` will be thrown. + * The `port` will be coerced to a number. If it is not a legal port, a `TypeError`will be thrown. + * + * On an error, `err` is an `Error` object, where `err.code` is the error code. + * + * ```js + * const dns = require('dns'); + * dns.lookupService('127.0.0.1', 22, (err, hostname, service) => { + * console.log(hostname, service); + * // Prints: localhost ssh + * }); + * ``` + * + * If this method is invoked as its `util.promisify()` ed version, it returns a`Promise` for an `Object` with `hostname` and `service` properties. + * @since v0.11.14 + */ + export function lookupService(address: string, port: number, callback: (err: NodeJS.ErrnoException | null, hostname: string, service: string) => void): void; + export namespace lookupService { + function __promisify__( + address: string, + port: number + ): Promise<{ + hostname: string; + service: string; + }>; + } + export interface ResolveOptions { + ttl: boolean; + } + export interface ResolveWithTtlOptions extends ResolveOptions { + ttl: true; + } + export interface RecordWithTtl { + address: string; + ttl: number; + } + /** @deprecated Use `AnyARecord` or `AnyAaaaRecord` instead. 
*/ + export type AnyRecordWithTtl = AnyARecord | AnyAaaaRecord; + export interface AnyARecord extends RecordWithTtl { + type: 'A'; + } + export interface AnyAaaaRecord extends RecordWithTtl { + type: 'AAAA'; + } + export interface CaaRecord { + critial: number; + issue?: string | undefined; + issuewild?: string | undefined; + iodef?: string | undefined; + contactemail?: string | undefined; + contactphone?: string | undefined; + } + export interface MxRecord { + priority: number; + exchange: string; + } + export interface AnyMxRecord extends MxRecord { + type: 'MX'; + } + export interface NaptrRecord { + flags: string; + service: string; + regexp: string; + replacement: string; + order: number; + preference: number; + } + export interface AnyNaptrRecord extends NaptrRecord { + type: 'NAPTR'; + } + export interface SoaRecord { + nsname: string; + hostmaster: string; + serial: number; + refresh: number; + retry: number; + expire: number; + minttl: number; + } + export interface AnySoaRecord extends SoaRecord { + type: 'SOA'; + } + export interface SrvRecord { + priority: number; + weight: number; + port: number; + name: string; + } + export interface AnySrvRecord extends SrvRecord { + type: 'SRV'; + } + export interface AnyTxtRecord { + type: 'TXT'; + entries: string[]; + } + export interface AnyNsRecord { + type: 'NS'; + value: string; + } + export interface AnyPtrRecord { + type: 'PTR'; + value: string; + } + export interface AnyCnameRecord { + type: 'CNAME'; + value: string; + } + export type AnyRecord = AnyARecord | AnyAaaaRecord | AnyCnameRecord | AnyMxRecord | AnyNaptrRecord | AnyNsRecord | AnyPtrRecord | AnySoaRecord | AnySrvRecord | AnyTxtRecord; + /** + * Uses the DNS protocol to resolve a host name (e.g. `'nodejs.org'`) into an array + * of the resource records. The `callback` function has arguments`(err, records)`. When successful, `records` will be an array of resource + * records. The type and structure of individual results varies based on `rrtype`: + * + * + * + * On error, `err` is an `Error` object, where `err.code` is one of the `DNS error codes`. + * @since v0.1.27 + * @param hostname Host name to resolve. + * @param [rrtype='A'] Resource record type. 
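+ *
+ * A brief usage sketch (the host name and record type below are arbitrary
+ * examples):
+ *
+ * ```js
+ * const dns = require('dns');
+ *
+ * dns.resolve('example.org', 'MX', (err, records) => {
+ *   if (err) throw err;
+ *   // records is an array of MX records, e.g. [{ priority: 10, exchange: 'mx.example.org' }]
+ *   console.log(records);
+ * });
+ * ```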
+ */
+ export function resolve(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'A', callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'AAAA', callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'ANY', callback: (err: NodeJS.ErrnoException | null, addresses: AnyRecord[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'CNAME', callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'MX', callback: (err: NodeJS.ErrnoException | null, addresses: MxRecord[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'NAPTR', callback: (err: NodeJS.ErrnoException | null, addresses: NaptrRecord[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'NS', callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'PTR', callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'SOA', callback: (err: NodeJS.ErrnoException | null, addresses: SoaRecord) => void): void;
+ export function resolve(hostname: string, rrtype: 'SRV', callback: (err: NodeJS.ErrnoException | null, addresses: SrvRecord[]) => void): void;
+ export function resolve(hostname: string, rrtype: 'TXT', callback: (err: NodeJS.ErrnoException | null, addresses: string[][]) => void): void;
+ export function resolve(
+     hostname: string,
+     rrtype: string,
+     callback: (err: NodeJS.ErrnoException | null, addresses: string[] | MxRecord[] | NaptrRecord[] | SoaRecord | SrvRecord[] | string[][] | AnyRecord[]) => void
+ ): void;
+ export namespace resolve {
+     function __promisify__(hostname: string, rrtype?: 'A' | 'AAAA' | 'CNAME' | 'NS' | 'PTR'): Promise<string[]>;
+     function __promisify__(hostname: string, rrtype: 'ANY'): Promise<AnyRecord[]>;
+     function __promisify__(hostname: string, rrtype: 'MX'): Promise<MxRecord[]>;
+     function __promisify__(hostname: string, rrtype: 'NAPTR'): Promise<NaptrRecord[]>;
+     function __promisify__(hostname: string, rrtype: 'SOA'): Promise<SoaRecord>;
+     function __promisify__(hostname: string, rrtype: 'SRV'): Promise<SrvRecord[]>;
+     function __promisify__(hostname: string, rrtype: 'TXT'): Promise<string[][]>;
+     function __promisify__(hostname: string, rrtype: string): Promise<string[] | MxRecord[] | NaptrRecord[] | SoaRecord | SrvRecord[] | string[][] | AnyRecord[]>;
+ }
+ /**
+ * Uses the DNS protocol to resolve IPv4 addresses (`A` records) for the `hostname`. The `addresses` argument passed to the `callback` function
+ * will contain an array of IPv4 addresses (e.g. `['74.125.79.104', '74.125.79.105', '74.125.79.106']`).
+ * @since v0.1.16
+ * @param hostname Host name to resolve.
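+ *
+ * A short sketch showing the optional `ttl` flag (the host name is an
+ * arbitrary example):
+ *
+ * ```js
+ * const dns = require('dns');
+ *
+ * dns.resolve4('example.org', { ttl: true }, (err, addresses) => {
+ *   if (err) throw err;
+ *   // With { ttl: true } each entry is an object, e.g. { address: '93.184.216.34', ttl: 299 }
+ *   console.log(addresses);
+ * });
+ * ```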
+ */ + export function resolve4(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void; + export function resolve4(hostname: string, options: ResolveWithTtlOptions, callback: (err: NodeJS.ErrnoException | null, addresses: RecordWithTtl[]) => void): void; + export function resolve4(hostname: string, options: ResolveOptions, callback: (err: NodeJS.ErrnoException | null, addresses: string[] | RecordWithTtl[]) => void): void; + export namespace resolve4 { + function __promisify__(hostname: string): Promise; + function __promisify__(hostname: string, options: ResolveWithTtlOptions): Promise; + function __promisify__(hostname: string, options?: ResolveOptions): Promise; + } + /** + * Uses the DNS protocol to resolve a IPv6 addresses (`AAAA` records) for the`hostname`. The `addresses` argument passed to the `callback` function + * will contain an array of IPv6 addresses. + * @since v0.1.16 + * @param hostname Host name to resolve. + */ + export function resolve6(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void; + export function resolve6(hostname: string, options: ResolveWithTtlOptions, callback: (err: NodeJS.ErrnoException | null, addresses: RecordWithTtl[]) => void): void; + export function resolve6(hostname: string, options: ResolveOptions, callback: (err: NodeJS.ErrnoException | null, addresses: string[] | RecordWithTtl[]) => void): void; + export namespace resolve6 { + function __promisify__(hostname: string): Promise; + function __promisify__(hostname: string, options: ResolveWithTtlOptions): Promise; + function __promisify__(hostname: string, options?: ResolveOptions): Promise; + } + /** + * Uses the DNS protocol to resolve `CNAME` records for the `hostname`. The`addresses` argument passed to the `callback` function + * will contain an array of canonical name records available for the `hostname`(e.g. `['bar.example.com']`). + * @since v0.3.2 + */ + export function resolveCname(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void; + export namespace resolveCname { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve `CAA` records for the `hostname`. The`addresses` argument passed to the `callback` function + * will contain an array of certification authority authorization records + * available for the `hostname` (e.g. `[{critical: 0, iodef: 'mailto:pki@example.com'}, {critical: 128, issue: 'pki.example.com'}]`). + * @since v15.0.0 + */ + export function resolveCaa(hostname: string, callback: (err: NodeJS.ErrnoException | null, records: CaaRecord[]) => void): void; + export namespace resolveCaa { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve mail exchange records (`MX` records) for the`hostname`. The `addresses` argument passed to the `callback` function will + * contain an array of objects containing both a `priority` and `exchange`property (e.g. `[{priority: 10, exchange: 'mx.example.com'}, ...]`). + * @since v0.1.27 + */ + export function resolveMx(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: MxRecord[]) => void): void; + export namespace resolveMx { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve regular expression based records (`NAPTR`records) for the `hostname`. 
The `addresses` argument passed to the `callback`function will contain an array of + * objects with the following properties: + * + * * `flags` + * * `service` + * * `regexp` + * * `replacement` + * * `order` + * * `preference` + * + * ```js + * { + * flags: 's', + * service: 'SIP+D2U', + * regexp: '', + * replacement: '_sip._udp.example.com', + * order: 30, + * preference: 100 + * } + * ``` + * @since v0.9.12 + */ + export function resolveNaptr(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: NaptrRecord[]) => void): void; + export namespace resolveNaptr { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve name server records (`NS` records) for the`hostname`. The `addresses` argument passed to the `callback` function will + * contain an array of name server records available for `hostname`(e.g. `['ns1.example.com', 'ns2.example.com']`). + * @since v0.1.90 + */ + export function resolveNs(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void; + export namespace resolveNs { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve pointer records (`PTR` records) for the`hostname`. The `addresses` argument passed to the `callback` function will + * be an array of strings containing the reply records. + * @since v6.0.0 + */ + export function resolvePtr(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[]) => void): void; + export namespace resolvePtr { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve a start of authority record (`SOA` record) for + * the `hostname`. The `address` argument passed to the `callback` function will + * be an object with the following properties: + * + * * `nsname` + * * `hostmaster` + * * `serial` + * * `refresh` + * * `retry` + * * `expire` + * * `minttl` + * + * ```js + * { + * nsname: 'ns.example.com', + * hostmaster: 'root.example.com', + * serial: 2013101809, + * refresh: 10000, + * retry: 2400, + * expire: 604800, + * minttl: 3600 + * } + * ``` + * @since v0.11.10 + */ + export function resolveSoa(hostname: string, callback: (err: NodeJS.ErrnoException | null, address: SoaRecord) => void): void; + export namespace resolveSoa { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve service records (`SRV` records) for the`hostname`. The `addresses` argument passed to the `callback` function will + * be an array of objects with the following properties: + * + * * `priority` + * * `weight` + * * `port` + * * `name` + * + * ```js + * { + * priority: 10, + * weight: 5, + * port: 21223, + * name: 'service.example.com' + * } + * ``` + * @since v0.1.27 + */ + export function resolveSrv(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: SrvRecord[]) => void): void; + export namespace resolveSrv { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve text queries (`TXT` records) for the`hostname`. The `records` argument passed to the `callback` function is a + * two-dimensional array of the text records available for `hostname` (e.g.`[ ['v=spf1 ip4:0.0.0.0 ', '~all' ] ]`). Each sub-array contains TXT chunks of + * one record. Depending on the use case, these could be either joined together or + * treated separately. 
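+ *
+ * For example, the chunks of each record can be rejoined (the host name is an
+ * arbitrary example):
+ *
+ * ```js
+ * const dns = require('dns');
+ *
+ * dns.resolveTxt('example.org', (err, records) => {
+ *   if (err) throw err;
+ *   // Join the chunks of every TXT record back into single strings.
+ *   const joined = records.map((chunks) => chunks.join(''));
+ *   console.log(joined);
+ * });
+ * ```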
+ * @since v0.1.27 + */ + export function resolveTxt(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: string[][]) => void): void; + export namespace resolveTxt { + function __promisify__(hostname: string): Promise; + } + /** + * Uses the DNS protocol to resolve all records (also known as `ANY` or `*` query). + * The `ret` argument passed to the `callback` function will be an array containing + * various types of records. Each object has a property `type` that indicates the + * type of the current record. And depending on the `type`, additional properties + * will be present on the object: + * + * + * + * Here is an example of the `ret` object passed to the callback: + * + * ```js + * [ { type: 'A', address: '127.0.0.1', ttl: 299 }, + * { type: 'CNAME', value: 'example.com' }, + * { type: 'MX', exchange: 'alt4.aspmx.l.example.com', priority: 50 }, + * { type: 'NS', value: 'ns1.example.com' }, + * { type: 'TXT', entries: [ 'v=spf1 include:_spf.example.com ~all' ] }, + * { type: 'SOA', + * nsname: 'ns1.example.com', + * hostmaster: 'admin.example.com', + * serial: 156696742, + * refresh: 900, + * retry: 900, + * expire: 1800, + * minttl: 60 } ] + * ``` + * + * DNS server operators may choose not to respond to `ANY`queries. It may be better to call individual methods like {@link resolve4},{@link resolveMx}, and so on. For more details, see [RFC + * 8482](https://tools.ietf.org/html/rfc8482). + */ + export function resolveAny(hostname: string, callback: (err: NodeJS.ErrnoException | null, addresses: AnyRecord[]) => void): void; + export namespace resolveAny { + function __promisify__(hostname: string): Promise; + } + /** + * Performs a reverse DNS query that resolves an IPv4 or IPv6 address to an + * array of host names. + * + * On error, `err` is an `Error` object, where `err.code` is + * one of the `DNS error codes`. + * @since v0.1.16 + */ + export function reverse(ip: string, callback: (err: NodeJS.ErrnoException | null, hostnames: string[]) => void): void; + /** + * Sets the IP address and port of servers to be used when performing DNS + * resolution. The `servers` argument is an array of [RFC 5952](https://tools.ietf.org/html/rfc5952#section-6) formatted + * addresses. If the port is the IANA default DNS port (53) it can be omitted. + * + * ```js + * dns.setServers([ + * '4.4.4.4', + * '[2001:4860:4860::8888]', + * '4.4.4.4:1053', + * '[2001:4860:4860::8888]:1053', + * ]); + * ``` + * + * An error will be thrown if an invalid address is provided. + * + * The `dns.setServers()` method must not be called while a DNS query is in + * progress. + * + * The {@link setServers} method affects only {@link resolve},`dns.resolve*()` and {@link reverse} (and specifically _not_ {@link lookup}). + * + * This method works much like [resolve.conf](https://man7.org/linux/man-pages/man5/resolv.conf.5.html). + * That is, if attempting to resolve with the first server provided results in a`NOTFOUND` error, the `resolve()` method will _not_ attempt to resolve with + * subsequent servers provided. Fallback DNS servers will only be used if the + * earlier ones time out or result in some other error. + * @since v0.11.3 + * @param servers array of `RFC 5952` formatted addresses + */ + export function setServers(servers: ReadonlyArray): void; + /** + * Returns an array of IP address strings, formatted according to [RFC 5952](https://tools.ietf.org/html/rfc5952#section-6), + * that are currently configured for DNS resolution. 
A string will include a port + * section if a custom port is used. + * + * ```js + * [ + * '4.4.4.4', + * '2001:4860:4860::8888', + * '4.4.4.4:1053', + * '[2001:4860:4860::8888]:1053', + * ] + * ``` + * @since v0.11.3 + */ + export function getServers(): string[]; + // Error codes + export const NODATA: string; + export const FORMERR: string; + export const SERVFAIL: string; + export const NOTFOUND: string; + export const NOTIMP: string; + export const REFUSED: string; + export const BADQUERY: string; + export const BADNAME: string; + export const BADFAMILY: string; + export const BADRESP: string; + export const CONNREFUSED: string; + export const TIMEOUT: string; + export const EOF: string; + export const FILE: string; + export const NOMEM: string; + export const DESTRUCTION: string; + export const BADSTR: string; + export const BADFLAGS: string; + export const NONAME: string; + export const BADHINTS: string; + export const NOTINITIALIZED: string; + export const LOADIPHLPAPI: string; + export const ADDRGETNETWORKPARAMS: string; + export const CANCELLED: string; + export interface ResolverOptions { + timeout?: number | undefined; + /** + * @default 4 + */ + tries?: number; + } + /** + * An independent resolver for DNS requests. + * + * Creating a new resolver uses the default server settings. Setting + * the servers used for a resolver using `resolver.setServers()` does not affect + * other resolvers: + * + * ```js + * const { Resolver } = require('dns'); + * const resolver = new Resolver(); + * resolver.setServers(['4.4.4.4']); + * + * // This request will use the server at 4.4.4.4, independent of global settings. + * resolver.resolve4('example.org', (err, addresses) => { + * // ... + * }); + * ``` + * + * The following methods from the `dns` module are available: + * + * * `resolver.getServers()` + * * `resolver.resolve()` + * * `resolver.resolve4()` + * * `resolver.resolve6()` + * * `resolver.resolveAny()` + * * `resolver.resolveCaa()` + * * `resolver.resolveCname()` + * * `resolver.resolveMx()` + * * `resolver.resolveNaptr()` + * * `resolver.resolveNs()` + * * `resolver.resolvePtr()` + * * `resolver.resolveSoa()` + * * `resolver.resolveSrv()` + * * `resolver.resolveTxt()` + * * `resolver.reverse()` + * * `resolver.setServers()` + * @since v8.3.0 + */ + export class Resolver { + constructor(options?: ResolverOptions); + /** + * Cancel all outstanding DNS queries made by this resolver. The corresponding + * callbacks will be called with an error with code `ECANCELLED`. + * @since v8.3.0 + */ + cancel(): void; + getServers: typeof getServers; + resolve: typeof resolve; + resolve4: typeof resolve4; + resolve6: typeof resolve6; + resolveAny: typeof resolveAny; + resolveCname: typeof resolveCname; + resolveMx: typeof resolveMx; + resolveNaptr: typeof resolveNaptr; + resolveNs: typeof resolveNs; + resolvePtr: typeof resolvePtr; + resolveSoa: typeof resolveSoa; + resolveSrv: typeof resolveSrv; + resolveTxt: typeof resolveTxt; + reverse: typeof reverse; + /** + * The resolver instance will send its requests from the specified IP address. + * This allows programs to specify outbound interfaces when used on multi-homed + * systems. + * + * If a v4 or v6 address is not specified, it is set to the default, and the + * operating system will choose a local address automatically. + * + * The resolver will use the v4 local address when making requests to IPv4 DNS + * servers, and the v6 local address when making requests to IPv6 DNS servers. 
+ * The `rrtype` of resolution requests has no impact on the local address used. + * @since v15.1.0 + * @param [ipv4='0.0.0.0'] A string representation of an IPv4 address. + * @param [ipv6='::0'] A string representation of an IPv6 address. + */ + setLocalAddress(ipv4?: string, ipv6?: string): void; + setServers: typeof setServers; + } + export { dnsPromises as promises }; +} +declare module 'node:dns' { + export * from 'dns'; +} diff --git a/node_modules/@types/node/dns/promises.d.ts b/node_modules/@types/node/dns/promises.d.ts new file mode 100644 index 000000000..fbfe1dbb7 --- /dev/null +++ b/node_modules/@types/node/dns/promises.d.ts @@ -0,0 +1,357 @@ +/** + * The `dns.promises` API provides an alternative set of asynchronous DNS methods + * that return `Promise` objects rather than using callbacks. The API is accessible + * via `require('dns').promises` or `require('dns/promises')`. + * @since v10.6.0 + */ +declare module 'dns/promises' { + import { + LookupAddress, + LookupOneOptions, + LookupAllOptions, + LookupOptions, + AnyRecord, + CaaRecord, + MxRecord, + NaptrRecord, + SoaRecord, + SrvRecord, + ResolveWithTtlOptions, + RecordWithTtl, + ResolveOptions, + ResolverOptions, + } from 'node:dns'; + /** + * Returns an array of IP address strings, formatted according to [RFC 5952](https://tools.ietf.org/html/rfc5952#section-6), + * that are currently configured for DNS resolution. A string will include a port + * section if a custom port is used. + * + * ```js + * [ + * '4.4.4.4', + * '2001:4860:4860::8888', + * '4.4.4.4:1053', + * '[2001:4860:4860::8888]:1053', + * ] + * ``` + * @since v10.6.0 + */ + function getServers(): string[]; + /** + * Resolves a host name (e.g. `'nodejs.org'`) into the first found A (IPv4) or + * AAAA (IPv6) record. All `option` properties are optional. If `options` is an + * integer, then it must be `4` or `6` – if `options` is not provided, then IPv4 + * and IPv6 addresses are both returned if found. + * + * With the `all` option set to `true`, the `Promise` is resolved with `addresses`being an array of objects with the properties `address` and `family`. + * + * On error, the `Promise` is rejected with an `Error` object, where `err.code`is the error code. + * Keep in mind that `err.code` will be set to `'ENOTFOUND'` not only when + * the host name does not exist but also when the lookup fails in other ways + * such as no available file descriptors. + * + * `dnsPromises.lookup()` does not necessarily have anything to do with the DNS + * protocol. The implementation uses an operating system facility that can + * associate names with addresses, and vice versa. This implementation can have + * subtle but important consequences on the behavior of any Node.js program. Please + * take some time to consult the `Implementation considerations section` before + * using `dnsPromises.lookup()`. + * + * Example usage: + * + * ```js + * const dns = require('dns'); + * const dnsPromises = dns.promises; + * const options = { + * family: 6, + * hints: dns.ADDRCONFIG | dns.V4MAPPED, + * }; + * + * dnsPromises.lookup('example.com', options).then((result) => { + * console.log('address: %j family: IPv%s', result.address, result.family); + * // address: "2606:2800:220:1:248:1893:25c8:1946" family: IPv6 + * }); + * + * // When options.all is true, the result will be an Array. 
+ * options.all = true;
+ * dnsPromises.lookup('example.com', options).then((result) => {
+ *   console.log('addresses: %j', result);
+ *   // addresses: [{"address":"2606:2800:220:1:248:1893:25c8:1946","family":6}]
+ * });
+ * ```
+ * @since v10.6.0
+ */
+ function lookup(hostname: string, family: number): Promise<LookupAddress>;
+ function lookup(hostname: string, options: LookupOneOptions): Promise<LookupAddress>;
+ function lookup(hostname: string, options: LookupAllOptions): Promise<LookupAddress[]>;
+ function lookup(hostname: string, options: LookupOptions): Promise<LookupAddress | LookupAddress[]>;
+ function lookup(hostname: string): Promise<LookupAddress>;
+ /**
+ * Resolves the given `address` and `port` into a host name and service using
+ * the operating system's underlying `getnameinfo` implementation.
+ *
+ * If `address` is not a valid IP address, a `TypeError` will be thrown.
+ * The `port` will be coerced to a number. If it is not a legal port, a `TypeError` will be thrown.
+ *
+ * On error, the `Promise` is rejected with an `Error` object, where `err.code` is the error code.
+ *
+ * ```js
+ * const dnsPromises = require('dns').promises;
+ * dnsPromises.lookupService('127.0.0.1', 22).then((result) => {
+ *   console.log(result.hostname, result.service);
+ *   // Prints: localhost ssh
+ * });
+ * ```
+ * @since v10.6.0
+ */
+ function lookupService(
+     address: string,
+     port: number
+ ): Promise<{
+     hostname: string;
+     service: string;
+ }>;
+ /**
+ * Uses the DNS protocol to resolve a host name (e.g. `'nodejs.org'`) into an array
+ * of the resource records. When successful, the `Promise` is resolved with an
+ * array of resource records. The type and structure of individual results vary
+ * based on `rrtype`:
+ *
+ * On error, the `Promise` is rejected with an `Error` object, where `err.code` is one of the `DNS error codes`.
+ * @since v10.6.0
+ * @param hostname Host name to resolve.
+ * @param [rrtype='A'] Resource record type.
+ */
+ function resolve(hostname: string): Promise<string[]>;
+ function resolve(hostname: string, rrtype: 'A'): Promise<string[]>;
+ function resolve(hostname: string, rrtype: 'AAAA'): Promise<string[]>;
+ function resolve(hostname: string, rrtype: 'ANY'): Promise<AnyRecord[]>;
+ function resolve(hostname: string, rrtype: 'CAA'): Promise<CaaRecord[]>;
+ function resolve(hostname: string, rrtype: 'CNAME'): Promise<string[]>;
+ function resolve(hostname: string, rrtype: 'MX'): Promise<MxRecord[]>;
+ function resolve(hostname: string, rrtype: 'NAPTR'): Promise<NaptrRecord[]>;
+ function resolve(hostname: string, rrtype: 'NS'): Promise<string[]>;
+ function resolve(hostname: string, rrtype: 'PTR'): Promise<string[]>;
+ function resolve(hostname: string, rrtype: 'SOA'): Promise<SoaRecord>;
+ function resolve(hostname: string, rrtype: 'SRV'): Promise<SrvRecord[]>;
+ function resolve(hostname: string, rrtype: 'TXT'): Promise<string[][]>;
+ function resolve(hostname: string, rrtype: string): Promise<string[] | MxRecord[] | NaptrRecord[] | SoaRecord | SrvRecord[] | string[][] | AnyRecord[]>;
+ /**
+ * Uses the DNS protocol to resolve IPv4 addresses (`A` records) for the `hostname`. On success, the `Promise` is resolved with an array of IPv4
+ * addresses (e.g. `['74.125.79.104', '74.125.79.105', '74.125.79.106']`).
+ * @since v10.6.0
+ * @param hostname Host name to resolve.
+ */
+ function resolve4(hostname: string): Promise<string[]>;
+ function resolve4(hostname: string, options: ResolveWithTtlOptions): Promise<RecordWithTtl[]>;
+ function resolve4(hostname: string, options: ResolveOptions): Promise<string[] | RecordWithTtl[]>;
+ /**
+ * Uses the DNS protocol to resolve IPv6 addresses (`AAAA` records) for the `hostname`. On success, the `Promise` is resolved with an array of IPv6
+ * addresses.
+ * @since v10.6.0
+ * @param hostname Host name to resolve.
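+ *
+ * A minimal promise-based sketch (the host name is an arbitrary example):
+ *
+ * ```js
+ * const dnsPromises = require('dns').promises;
+ *
+ * dnsPromises.resolve6('example.org').then((addresses) => {
+ *   // addresses is an array of IPv6 address strings.
+ *   console.log(addresses);
+ * });
+ * ```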
+ */ + function resolve6(hostname: string): Promise; + function resolve6(hostname: string, options: ResolveWithTtlOptions): Promise; + function resolve6(hostname: string, options: ResolveOptions): Promise; + /** + * Uses the DNS protocol to resolve all records (also known as `ANY` or `*` query). + * On success, the `Promise` is resolved with an array containing various types of + * records. Each object has a property `type` that indicates the type of the + * current record. And depending on the `type`, additional properties will be + * present on the object: + * + * + * + * Here is an example of the result object: + * + * ```js + * [ { type: 'A', address: '127.0.0.1', ttl: 299 }, + * { type: 'CNAME', value: 'example.com' }, + * { type: 'MX', exchange: 'alt4.aspmx.l.example.com', priority: 50 }, + * { type: 'NS', value: 'ns1.example.com' }, + * { type: 'TXT', entries: [ 'v=spf1 include:_spf.example.com ~all' ] }, + * { type: 'SOA', + * nsname: 'ns1.example.com', + * hostmaster: 'admin.example.com', + * serial: 156696742, + * refresh: 900, + * retry: 900, + * expire: 1800, + * minttl: 60 } ] + * ``` + * @since v10.6.0 + */ + function resolveAny(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve `CAA` records for the `hostname`. On success, + * the `Promise` is resolved with an array of objects containing available + * certification authority authorization records available for the `hostname`(e.g. `[{critical: 0, iodef: 'mailto:pki@example.com'},{critical: 128, issue: 'pki.example.com'}]`). + * @since v15.0.0 + */ + function resolveCaa(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve `CNAME` records for the `hostname`. On success, + * the `Promise` is resolved with an array of canonical name records available for + * the `hostname` (e.g. `['bar.example.com']`). + * @since v10.6.0 + */ + function resolveCname(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve mail exchange records (`MX` records) for the`hostname`. On success, the `Promise` is resolved with an array of objects + * containing both a `priority` and `exchange` property (e.g.`[{priority: 10, exchange: 'mx.example.com'}, ...]`). + * @since v10.6.0 + */ + function resolveMx(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve regular expression based records (`NAPTR`records) for the `hostname`. On success, the `Promise` is resolved with an array + * of objects with the following properties: + * + * * `flags` + * * `service` + * * `regexp` + * * `replacement` + * * `order` + * * `preference` + * + * ```js + * { + * flags: 's', + * service: 'SIP+D2U', + * regexp: '', + * replacement: '_sip._udp.example.com', + * order: 30, + * preference: 100 + * } + * ``` + * @since v10.6.0 + */ + function resolveNaptr(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve name server records (`NS` records) for the`hostname`. On success, the `Promise` is resolved with an array of name server + * records available for `hostname` (e.g.`['ns1.example.com', 'ns2.example.com']`). + * @since v10.6.0 + */ + function resolveNs(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve pointer records (`PTR` records) for the`hostname`. On success, the `Promise` is resolved with an array of strings + * containing the reply records. + * @since v10.6.0 + */ + function resolvePtr(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve a start of authority record (`SOA` record) for + * the `hostname`. 
On success, the `Promise` is resolved with an object with the + * following properties: + * + * * `nsname` + * * `hostmaster` + * * `serial` + * * `refresh` + * * `retry` + * * `expire` + * * `minttl` + * + * ```js + * { + * nsname: 'ns.example.com', + * hostmaster: 'root.example.com', + * serial: 2013101809, + * refresh: 10000, + * retry: 2400, + * expire: 604800, + * minttl: 3600 + * } + * ``` + * @since v10.6.0 + */ + function resolveSoa(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve service records (`SRV` records) for the`hostname`. On success, the `Promise` is resolved with an array of objects with + * the following properties: + * + * * `priority` + * * `weight` + * * `port` + * * `name` + * + * ```js + * { + * priority: 10, + * weight: 5, + * port: 21223, + * name: 'service.example.com' + * } + * ``` + * @since v10.6.0 + */ + function resolveSrv(hostname: string): Promise; + /** + * Uses the DNS protocol to resolve text queries (`TXT` records) for the`hostname`. On success, the `Promise` is resolved with a two-dimensional array + * of the text records available for `hostname` (e.g.`[ ['v=spf1 ip4:0.0.0.0 ', '~all' ] ]`). Each sub-array contains TXT chunks of + * one record. Depending on the use case, these could be either joined together or + * treated separately. + * @since v10.6.0 + */ + function resolveTxt(hostname: string): Promise; + /** + * Performs a reverse DNS query that resolves an IPv4 or IPv6 address to an + * array of host names. + * + * On error, the `Promise` is rejected with an `Error` object, where `err.code`is one of the `DNS error codes`. + * @since v10.6.0 + */ + function reverse(ip: string): Promise; + /** + * Sets the IP address and port of servers to be used when performing DNS + * resolution. The `servers` argument is an array of [RFC 5952](https://tools.ietf.org/html/rfc5952#section-6) formatted + * addresses. If the port is the IANA default DNS port (53) it can be omitted. + * + * ```js + * dnsPromises.setServers([ + * '4.4.4.4', + * '[2001:4860:4860::8888]', + * '4.4.4.4:1053', + * '[2001:4860:4860::8888]:1053', + * ]); + * ``` + * + * An error will be thrown if an invalid address is provided. + * + * The `dnsPromises.setServers()` method must not be called while a DNS query is in + * progress. + * + * This method works much like [resolve.conf](https://man7.org/linux/man-pages/man5/resolv.conf.5.html). + * That is, if attempting to resolve with the first server provided results in a`NOTFOUND` error, the `resolve()` method will _not_ attempt to resolve with + * subsequent servers provided. Fallback DNS servers will only be used if the + * earlier ones time out or result in some other error. 
+ * @since v10.6.0 + * @param servers array of `RFC 5952` formatted addresses + */ + function setServers(servers: ReadonlyArray): void; + class Resolver { + constructor(options?: ResolverOptions); + cancel(): void; + getServers: typeof getServers; + resolve: typeof resolve; + resolve4: typeof resolve4; + resolve6: typeof resolve6; + resolveAny: typeof resolveAny; + resolveCname: typeof resolveCname; + resolveMx: typeof resolveMx; + resolveNaptr: typeof resolveNaptr; + resolveNs: typeof resolveNs; + resolvePtr: typeof resolvePtr; + resolveSoa: typeof resolveSoa; + resolveSrv: typeof resolveSrv; + resolveTxt: typeof resolveTxt; + reverse: typeof reverse; + setLocalAddress(ipv4?: string, ipv6?: string): void; + setServers: typeof setServers; + } +} +declare module 'node:dns/promises' { + export * from 'dns/promises'; +} diff --git a/node_modules/@types/node/domain.d.ts b/node_modules/@types/node/domain.d.ts new file mode 100644 index 000000000..383e38557 --- /dev/null +++ b/node_modules/@types/node/domain.d.ts @@ -0,0 +1,169 @@ +/** + * **This module is pending deprecation.** Once a replacement API has been + * finalized, this module will be fully deprecated. Most developers should**not** have cause to use this module. Users who absolutely must have + * the functionality that domains provide may rely on it for the time being + * but should expect to have to migrate to a different solution + * in the future. + * + * Domains provide a way to handle multiple different IO operations as a + * single group. If any of the event emitters or callbacks registered to a + * domain emit an `'error'` event, or throw an error, then the domain object + * will be notified, rather than losing the context of the error in the`process.on('uncaughtException')` handler, or causing the program to + * exit immediately with an error code. + * @deprecated Since v1.4.2 - Deprecated + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/domain.js) + */ +declare module 'domain' { + import EventEmitter = require('node:events'); + /** + * The `Domain` class encapsulates the functionality of routing errors and + * uncaught exceptions to the active `Domain` object. + * + * To handle the errors that it catches, listen to its `'error'` event. + */ + class Domain extends EventEmitter { + /** + * An array of timers and event emitters that have been explicitly added + * to the domain. + */ + members: Array; + /** + * The `enter()` method is plumbing used by the `run()`, `bind()`, and`intercept()` methods to set the active domain. It sets `domain.active` and`process.domain` to the domain, and implicitly + * pushes the domain onto the domain + * stack managed by the domain module (see {@link exit} for details on the + * domain stack). The call to `enter()` delimits the beginning of a chain of + * asynchronous calls and I/O operations bound to a domain. + * + * Calling `enter()` changes only the active domain, and does not alter the domain + * itself. `enter()` and `exit()` can be called an arbitrary number of times on a + * single domain. + */ + enter(): void; + /** + * The `exit()` method exits the current domain, popping it off the domain stack. + * Any time execution is going to switch to the context of a different chain of + * asynchronous calls, it's important to ensure that the current domain is exited. + * The call to `exit()` delimits either the end of or an interruption to the chain + * of asynchronous calls and I/O operations bound to a domain. 
+ * + * If there are multiple, nested domains bound to the current execution context,`exit()` will exit any domains nested within this domain. + * + * Calling `exit()` changes only the active domain, and does not alter the domain + * itself. `enter()` and `exit()` can be called an arbitrary number of times on a + * single domain. + */ + exit(): void; + /** + * Run the supplied function in the context of the domain, implicitly + * binding all event emitters, timers, and lowlevel requests that are + * created in that context. Optionally, arguments can be passed to + * the function. + * + * This is the most basic way to use a domain. + * + * ```js + * const domain = require('domain'); + * const fs = require('fs'); + * const d = domain.create(); + * d.on('error', (er) => { + * console.error('Caught error!', er); + * }); + * d.run(() => { + * process.nextTick(() => { + * setTimeout(() => { // Simulating some various async stuff + * fs.open('non-existent file', 'r', (er, fd) => { + * if (er) throw er; + * // proceed... + * }); + * }, 100); + * }); + * }); + * ``` + * + * In this example, the `d.on('error')` handler will be triggered, rather + * than crashing the program. + */ + run(fn: (...args: any[]) => T, ...args: any[]): T; + /** + * Explicitly adds an emitter to the domain. If any event handlers called by + * the emitter throw an error, or if the emitter emits an `'error'` event, it + * will be routed to the domain's `'error'` event, just like with implicit + * binding. + * + * This also works with timers that are returned from `setInterval()` and `setTimeout()`. If their callback function throws, it will be caught by + * the domain `'error'` handler. + * + * If the Timer or `EventEmitter` was already bound to a domain, it is removed + * from that one, and bound to this one instead. + * @param emitter emitter or timer to be added to the domain + */ + add(emitter: EventEmitter | NodeJS.Timer): void; + /** + * The opposite of {@link add}. Removes domain handling from the + * specified emitter. + * @param emitter emitter or timer to be removed from the domain + */ + remove(emitter: EventEmitter | NodeJS.Timer): void; + /** + * The returned function will be a wrapper around the supplied callback + * function. When the returned function is called, any errors that are + * thrown will be routed to the domain's `'error'` event. + * + * ```js + * const d = domain.create(); + * + * function readSomeFile(filename, cb) { + * fs.readFile(filename, 'utf8', d.bind((er, data) => { + * // If this throws, it will also be passed to the domain. + * return cb(er, data ? JSON.parse(data) : null); + * })); + * } + * + * d.on('error', (er) => { + * // An error occurred somewhere. If we throw it now, it will crash the program + * // with the normal line number and stack message. + * }); + * ``` + * @param callback The callback function + * @return The bound function + */ + bind(callback: T): T; + /** + * This method is almost identical to {@link bind}. However, in + * addition to catching thrown errors, it will also intercept `Error` objects sent as the first argument to the function. + * + * In this way, the common `if (err) return callback(err);` pattern can be replaced + * with a single error handler in a single place. 
+ * + * ```js + * const d = domain.create(); + * + * function readSomeFile(filename, cb) { + * fs.readFile(filename, 'utf8', d.intercept((data) => { + * // Note, the first argument is never passed to the + * // callback since it is assumed to be the 'Error' argument + * // and thus intercepted by the domain. + * + * // If this throws, it will also be passed to the domain + * // so the error-handling logic can be moved to the 'error' + * // event on the domain instead of being repeated throughout + * // the program. + * return cb(null, JSON.parse(data)); + * })); + * } + * + * d.on('error', (er) => { + * // An error occurred somewhere. If we throw it now, it will crash the program + * // with the normal line number and stack message. + * }); + * ``` + * @param callback The callback function + * @return The intercepted function + */ + intercept(callback: T): T; + } + function create(): Domain; +} +declare module 'node:domain' { + export * from 'domain'; +} diff --git a/node_modules/@types/node/events.d.ts b/node_modules/@types/node/events.d.ts new file mode 100644 index 000000000..a1ed5ab73 --- /dev/null +++ b/node_modules/@types/node/events.d.ts @@ -0,0 +1,623 @@ +/** + * Much of the Node.js core API is built around an idiomatic asynchronous + * event-driven architecture in which certain kinds of objects (called "emitters") + * emit named events that cause `Function` objects ("listeners") to be called. + * + * For instance: a `net.Server` object emits an event each time a peer + * connects to it; a `fs.ReadStream` emits an event when the file is opened; + * a `stream` emits an event whenever data is available to be read. + * + * All objects that emit events are instances of the `EventEmitter` class. These + * objects expose an `eventEmitter.on()` function that allows one or more + * functions to be attached to named events emitted by the object. Typically, + * event names are camel-cased strings but any valid JavaScript property key + * can be used. + * + * When the `EventEmitter` object emits an event, all of the functions attached + * to that specific event are called _synchronously_. Any values returned by the + * called listeners are _ignored_ and discarded. + * + * The following example shows a simple `EventEmitter` instance with a single + * listener. The `eventEmitter.on()` method is used to register listeners, while + * the `eventEmitter.emit()` method is used to trigger the event. + * + * ```js + * const EventEmitter = require('events'); + * + * class MyEmitter extends EventEmitter {} + * + * const myEmitter = new MyEmitter(); + * myEmitter.on('event', () => { + * console.log('an event occurred!'); + * }); + * myEmitter.emit('event'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/events.js) + */ +declare module 'events' { + interface EventEmitterOptions { + /** + * Enables automatic capturing of promise rejection. 
+ */ + captureRejections?: boolean | undefined; + } + interface NodeEventTarget { + once(eventName: string | symbol, listener: (...args: any[]) => void): this; + } + interface DOMEventTarget { + addEventListener( + eventName: string, + listener: (...args: any[]) => void, + opts?: { + once: boolean; + } + ): any; + } + interface StaticEventEmitterOptions { + signal?: AbortSignal | undefined; + } + interface EventEmitter extends NodeJS.EventEmitter {} + /** + * The `EventEmitter` class is defined and exposed by the `events` module: + * + * ```js + * const EventEmitter = require('events'); + * ``` + * + * All `EventEmitter`s emit the event `'newListener'` when new listeners are + * added and `'removeListener'` when existing listeners are removed. + * + * It supports the following option: + * @since v0.1.26 + */ + class EventEmitter { + constructor(options?: EventEmitterOptions); + /** + * Creates a `Promise` that is fulfilled when the `EventEmitter` emits the given + * event or that is rejected if the `EventEmitter` emits `'error'` while waiting. + * The `Promise` will resolve with an array of all the arguments emitted to the + * given event. + * + * This method is intentionally generic and works with the web platform [EventTarget](https://dom.spec.whatwg.org/#interface-eventtarget) interface, which has no special`'error'` event + * semantics and does not listen to the `'error'` event. + * + * ```js + * const { once, EventEmitter } = require('events'); + * + * async function run() { + * const ee = new EventEmitter(); + * + * process.nextTick(() => { + * ee.emit('myevent', 42); + * }); + * + * const [value] = await once(ee, 'myevent'); + * console.log(value); + * + * const err = new Error('kaboom'); + * process.nextTick(() => { + * ee.emit('error', err); + * }); + * + * try { + * await once(ee, 'myevent'); + * } catch (err) { + * console.log('error happened', err); + * } + * } + * + * run(); + * ``` + * + * The special handling of the `'error'` event is only used when `events.once()`is used to wait for another event. If `events.once()` is used to wait for the + * '`error'` event itself, then it is treated as any other kind of event without + * special handling: + * + * ```js + * const { EventEmitter, once } = require('events'); + * + * const ee = new EventEmitter(); + * + * once(ee, 'error') + * .then(([err]) => console.log('ok', err.message)) + * .catch((err) => console.log('error', err.message)); + * + * ee.emit('error', new Error('boom')); + * + * // Prints: ok boom + * ``` + * + * An `AbortSignal` can be used to cancel waiting for the event: + * + * ```js + * const { EventEmitter, once } = require('events'); + * + * const ee = new EventEmitter(); + * const ac = new AbortController(); + * + * async function foo(emitter, event, signal) { + * try { + * await once(emitter, event, { signal }); + * console.log('event emitted!'); + * } catch (error) { + * if (error.name === 'AbortError') { + * console.error('Waiting for the event was canceled!'); + * } else { + * console.error('There was an error', error.message); + * } + * } + * } + * + * foo(ee, 'foo', ac.signal); + * ac.abort(); // Abort waiting for the event + * ee.emit('foo'); // Prints: Waiting for the event was canceled! 
+ * ``` + * @since v11.13.0, v10.16.0 + */ + static once(emitter: NodeEventTarget, eventName: string | symbol, options?: StaticEventEmitterOptions): Promise; + static once(emitter: DOMEventTarget, eventName: string, options?: StaticEventEmitterOptions): Promise; + /** + * ```js + * const { on, EventEmitter } = require('events'); + * + * (async () => { + * const ee = new EventEmitter(); + * + * // Emit later on + * process.nextTick(() => { + * ee.emit('foo', 'bar'); + * ee.emit('foo', 42); + * }); + * + * for await (const event of on(ee, 'foo')) { + * // The execution of this inner block is synchronous and it + * // processes one event at a time (even with await). Do not use + * // if concurrent execution is required. + * console.log(event); // prints ['bar'] [42] + * } + * // Unreachable here + * })(); + * ``` + * + * Returns an `AsyncIterator` that iterates `eventName` events. It will throw + * if the `EventEmitter` emits `'error'`. It removes all listeners when + * exiting the loop. The `value` returned by each iteration is an array + * composed of the emitted event arguments. + * + * An `AbortSignal` can be used to cancel waiting on events: + * + * ```js + * const { on, EventEmitter } = require('events'); + * const ac = new AbortController(); + * + * (async () => { + * const ee = new EventEmitter(); + * + * // Emit later on + * process.nextTick(() => { + * ee.emit('foo', 'bar'); + * ee.emit('foo', 42); + * }); + * + * for await (const event of on(ee, 'foo', { signal: ac.signal })) { + * // The execution of this inner block is synchronous and it + * // processes one event at a time (even with await). Do not use + * // if concurrent execution is required. + * console.log(event); // prints ['bar'] [42] + * } + * // Unreachable here + * })(); + * + * process.nextTick(() => ac.abort()); + * ``` + * @since v13.6.0, v12.16.0 + * @param eventName The name of the event being listened for + * @return that iterates `eventName` events emitted by the `emitter` + */ + static on(emitter: NodeJS.EventEmitter, eventName: string, options?: StaticEventEmitterOptions): AsyncIterableIterator; + /** + * A class method that returns the number of listeners for the given `eventName`registered on the given `emitter`. + * + * ```js + * const { EventEmitter, listenerCount } = require('events'); + * const myEmitter = new EventEmitter(); + * myEmitter.on('event', () => {}); + * myEmitter.on('event', () => {}); + * console.log(listenerCount(myEmitter, 'event')); + * // Prints: 2 + * ``` + * @since v0.9.12 + * @deprecated Since v3.2.0 - Use `listenerCount` instead. + * @param emitter The emitter to query + * @param eventName The event name + */ + static listenerCount(emitter: NodeJS.EventEmitter, eventName: string | symbol): number; + /** + * Returns a copy of the array of listeners for the event named `eventName`. + * + * For `EventEmitter`s this behaves exactly the same as calling `.listeners` on + * the emitter. + * + * For `EventTarget`s this is the only way to get the event listeners for the + * event target. This is useful for debugging and diagnostic purposes. 
+ * + * ```js + * const { getEventListeners, EventEmitter } = require('events'); + * + * { + * const ee = new EventEmitter(); + * const listener = () => console.log('Events are fun'); + * ee.on('foo', listener); + * getEventListeners(ee, 'foo'); // [listener] + * } + * { + * const et = new EventTarget(); + * const listener = () => console.log('Events are fun'); + * et.addEventListener('foo', listener); + * getEventListeners(et, 'foo'); // [listener] + * } + * ``` + * @since v15.2.0 + */ + static getEventListeners(emitter: DOMEventTarget | NodeJS.EventEmitter, name: string | symbol): Function[]; + /** + * This symbol shall be used to install a listener for only monitoring `'error'` + * events. Listeners installed using this symbol are called before the regular + * `'error'` listeners are called. + * + * Installing a listener using this symbol does not change the behavior once an + * `'error'` event is emitted, therefore the process will still crash if no + * regular `'error'` listener is installed. + */ + static readonly errorMonitor: unique symbol; + static readonly captureRejectionSymbol: unique symbol; + /** + * Sets or gets the default captureRejection value for all emitters. + */ + // TODO: These should be described using static getter/setter pairs: + static captureRejections: boolean; + static defaultMaxListeners: number; + } + import internal = require('node:events'); + namespace EventEmitter { + // Should just be `export { EventEmitter }`, but that doesn't work in TypeScript 3.4 + export { internal as EventEmitter }; + export interface Abortable { + /** + * When provided the corresponding `AbortController` can be used to cancel an asynchronous action. + */ + signal?: AbortSignal | undefined; + } + } + global { + namespace NodeJS { + interface EventEmitter { + /** + * Alias for `emitter.on(eventName, listener)`. + * @since v0.1.26 + */ + addListener(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Adds the `listener` function to the end of the listeners array for the + * event named `eventName`. No checks are made to see if the `listener` has + * already been added. Multiple calls passing the same combination of `eventName`and `listener` will result in the `listener` being added, and called, multiple + * times. + * + * ```js + * server.on('connection', (stream) => { + * console.log('someone connected!'); + * }); + * ``` + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * + * By default, event listeners are invoked in the order they are added. The`emitter.prependListener()` method can be used as an alternative to add the + * event listener to the beginning of the listeners array. + * + * ```js + * const myEE = new EventEmitter(); + * myEE.on('foo', () => console.log('a')); + * myEE.prependListener('foo', () => console.log('b')); + * myEE.emit('foo'); + * // Prints: + * // b + * // a + * ``` + * @since v0.1.101 + * @param eventName The name of the event. + * @param listener The callback function + */ + on(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Adds a **one-time**`listener` function for the event named `eventName`. The + * next time `eventName` is triggered, this listener is removed and then invoked. + * + * ```js + * server.once('connection', (stream) => { + * console.log('Ah, we have our first user!'); + * }); + * ``` + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * + * By default, event listeners are invoked in the order they are added. 
The`emitter.prependOnceListener()` method can be used as an alternative to add the + * event listener to the beginning of the listeners array. + * + * ```js + * const myEE = new EventEmitter(); + * myEE.once('foo', () => console.log('a')); + * myEE.prependOnceListener('foo', () => console.log('b')); + * myEE.emit('foo'); + * // Prints: + * // b + * // a + * ``` + * @since v0.3.0 + * @param eventName The name of the event. + * @param listener The callback function + */ + once(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Removes the specified `listener` from the listener array for the event named`eventName`. + * + * ```js + * const callback = (stream) => { + * console.log('someone connected!'); + * }; + * server.on('connection', callback); + * // ... + * server.removeListener('connection', callback); + * ``` + * + * `removeListener()` will remove, at most, one instance of a listener from the + * listener array. If any single listener has been added multiple times to the + * listener array for the specified `eventName`, then `removeListener()` must be + * called multiple times to remove each instance. + * + * Once an event is emitted, all listeners attached to it at the + * time of emitting are called in order. This implies that any`removeListener()` or `removeAllListeners()` calls _after_ emitting and_before_ the last listener finishes execution will + * not remove them from`emit()` in progress. Subsequent events behave as expected. + * + * ```js + * const myEmitter = new MyEmitter(); + * + * const callbackA = () => { + * console.log('A'); + * myEmitter.removeListener('event', callbackB); + * }; + * + * const callbackB = () => { + * console.log('B'); + * }; + * + * myEmitter.on('event', callbackA); + * + * myEmitter.on('event', callbackB); + * + * // callbackA removes listener callbackB but it will still be called. + * // Internal listener array at time of emit [callbackA, callbackB] + * myEmitter.emit('event'); + * // Prints: + * // A + * // B + * + * // callbackB is now removed. + * // Internal listener array [callbackA] + * myEmitter.emit('event'); + * // Prints: + * // A + * ``` + * + * Because listeners are managed using an internal array, calling this will + * change the position indices of any listener registered _after_ the listener + * being removed. This will not impact the order in which listeners are called, + * but it means that any copies of the listener array as returned by + * the `emitter.listeners()` method will need to be recreated. + * + * When a single function has been added as a handler multiple times for a single + * event (as in the example below), `removeListener()` will remove the most + * recently added instance. In the example the `once('ping')`listener is removed: + * + * ```js + * const ee = new EventEmitter(); + * + * function pong() { + * console.log('pong'); + * } + * + * ee.on('ping', pong); + * ee.once('ping', pong); + * ee.removeListener('ping', pong); + * + * ee.emit('ping'); + * ee.emit('ping'); + * ``` + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * @since v0.1.26 + */ + removeListener(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Alias for `emitter.removeListener()`. + * @since v10.0.0 + */ + off(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Removes all listeners, or those of the specified `eventName`. 
+ * + * It is bad practice to remove listeners added elsewhere in the code, + * particularly when the `EventEmitter` instance was created by some other + * component or module (e.g. sockets or file streams). + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * @since v0.1.26 + */ + removeAllListeners(event?: string | symbol): this; + /** + * By default `EventEmitter`s will print a warning if more than `10` listeners are + * added for a particular event. This is a useful default that helps finding + * memory leaks. The `emitter.setMaxListeners()` method allows the limit to be + * modified for this specific `EventEmitter` instance. The value can be set to`Infinity` (or `0`) to indicate an unlimited number of listeners. + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * @since v0.3.5 + */ + setMaxListeners(n: number): this; + /** + * Returns the current max listener value for the `EventEmitter` which is either + * set by `emitter.setMaxListeners(n)` or defaults to {@link defaultMaxListeners}. + * @since v1.0.0 + */ + getMaxListeners(): number; + /** + * Returns a copy of the array of listeners for the event named `eventName`. + * + * ```js + * server.on('connection', (stream) => { + * console.log('someone connected!'); + * }); + * console.log(util.inspect(server.listeners('connection'))); + * // Prints: [ [Function] ] + * ``` + * @since v0.1.26 + */ + listeners(eventName: string | symbol): Function[]; + /** + * Returns a copy of the array of listeners for the event named `eventName`, + * including any wrappers (such as those created by `.once()`). + * + * ```js + * const emitter = new EventEmitter(); + * emitter.once('log', () => console.log('log once')); + * + * // Returns a new Array with a function `onceWrapper` which has a property + * // `listener` which contains the original listener bound above + * const listeners = emitter.rawListeners('log'); + * const logFnWrapper = listeners[0]; + * + * // Logs "log once" to the console and does not unbind the `once` event + * logFnWrapper.listener(); + * + * // Logs "log once" to the console and removes the listener + * logFnWrapper(); + * + * emitter.on('log', () => console.log('log persistently')); + * // Will return a new Array with a single function bound by `.on()` above + * const newListeners = emitter.rawListeners('log'); + * + * // Logs "log persistently" twice + * newListeners[0](); + * emitter.emit('log'); + * ``` + * @since v9.4.0 + */ + rawListeners(eventName: string | symbol): Function[]; + /** + * Synchronously calls each of the listeners registered for the event named`eventName`, in the order they were registered, passing the supplied arguments + * to each. + * + * Returns `true` if the event had listeners, `false` otherwise. + * + * ```js + * const EventEmitter = require('events'); + * const myEmitter = new EventEmitter(); + * + * // First listener + * myEmitter.on('event', function firstListener() { + * console.log('Helloooo! 
first listener'); + * }); + * // Second listener + * myEmitter.on('event', function secondListener(arg1, arg2) { + * console.log(`event with parameters ${arg1}, ${arg2} in second listener`); + * }); + * // Third listener + * myEmitter.on('event', function thirdListener(...args) { + * const parameters = args.join(', '); + * console.log(`event with parameters ${parameters} in third listener`); + * }); + * + * console.log(myEmitter.listeners('event')); + * + * myEmitter.emit('event', 1, 2, 3, 4, 5); + * + * // Prints: + * // [ + * // [Function: firstListener], + * // [Function: secondListener], + * // [Function: thirdListener] + * // ] + * // Helloooo! first listener + * // event with parameters 1, 2 in second listener + * // event with parameters 1, 2, 3, 4, 5 in third listener + * ``` + * @since v0.1.26 + */ + emit(eventName: string | symbol, ...args: any[]): boolean; + /** + * Returns the number of listeners listening to the event named `eventName`. + * @since v3.2.0 + * @param eventName The name of the event being listened for + */ + listenerCount(eventName: string | symbol): number; + /** + * Adds the `listener` function to the _beginning_ of the listeners array for the + * event named `eventName`. No checks are made to see if the `listener` has + * already been added. Multiple calls passing the same combination of `eventName`and `listener` will result in the `listener` being added, and called, multiple + * times. + * + * ```js + * server.prependListener('connection', (stream) => { + * console.log('someone connected!'); + * }); + * ``` + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * @since v6.0.0 + * @param eventName The name of the event. + * @param listener The callback function + */ + prependListener(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Adds a **one-time**`listener` function for the event named `eventName` to the_beginning_ of the listeners array. The next time `eventName` is triggered, this + * listener is removed, and then invoked. + * + * ```js + * server.prependOnceListener('connection', (stream) => { + * console.log('Ah, we have our first user!'); + * }); + * ``` + * + * Returns a reference to the `EventEmitter`, so that calls can be chained. + * @since v6.0.0 + * @param eventName The name of the event. + * @param listener The callback function + */ + prependOnceListener(eventName: string | symbol, listener: (...args: any[]) => void): this; + /** + * Returns an array listing the events for which the emitter has registered + * listeners. The values in the array are strings or `Symbol`s. + * + * ```js + * const EventEmitter = require('events'); + * const myEE = new EventEmitter(); + * myEE.on('foo', () => {}); + * myEE.on('bar', () => {}); + * + * const sym = Symbol('symbol'); + * myEE.on(sym, () => {}); + * + * console.log(myEE.eventNames()); + * // Prints: [ 'foo', 'bar', Symbol(symbol) ] + * ``` + * @since v6.0.0 + */ + eventNames(): Array; + } + } + } + export = EventEmitter; +} +declare module 'node:events' { + import events = require('events'); + export = events; +} diff --git a/node_modules/@types/node/fs.d.ts b/node_modules/@types/node/fs.d.ts new file mode 100644 index 000000000..4339a852a --- /dev/null +++ b/node_modules/@types/node/fs.d.ts @@ -0,0 +1,3748 @@ +/** + * The `fs` module enables interacting with the file system in a + * way modeled on standard POSIX functions. 
+ * + * To use the promise-based APIs: + * + * ```js + * import * as fs from 'fs/promises'; + * ``` + * + * To use the callback and sync APIs: + * + * ```js + * import * as fs from 'fs'; + * ``` + * + * All file system operations have synchronous, callback, and promise-based + * forms, and are accessible using both CommonJS syntax and ES6 Modules (ESM). + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/fs.js) + */ +declare module 'fs' { + import * as stream from 'node:stream'; + import { Abortable, EventEmitter } from 'node:events'; + import { URL } from 'node:url'; + import * as promises from 'node:fs/promises'; + export { promises }; + /** + * Valid types for path values in "fs". + */ + export type PathLike = string | Buffer | URL; + export type PathOrFileDescriptor = PathLike | number; + export type TimeLike = string | number | Date; + export type NoParamCallback = (err: NodeJS.ErrnoException | null) => void; + export type BufferEncodingOption = + | 'buffer' + | { + encoding: 'buffer'; + }; + export interface ObjectEncodingOptions { + encoding?: BufferEncoding | null | undefined; + } + export type EncodingOption = ObjectEncodingOptions | BufferEncoding | undefined | null; + export type OpenMode = number | string; + export type Mode = number | string; + export interface StatsBase { + isFile(): boolean; + isDirectory(): boolean; + isBlockDevice(): boolean; + isCharacterDevice(): boolean; + isSymbolicLink(): boolean; + isFIFO(): boolean; + isSocket(): boolean; + dev: T; + ino: T; + mode: T; + nlink: T; + uid: T; + gid: T; + rdev: T; + size: T; + blksize: T; + blocks: T; + atimeMs: T; + mtimeMs: T; + ctimeMs: T; + birthtimeMs: T; + atime: Date; + mtime: Date; + ctime: Date; + birthtime: Date; + } + export interface Stats extends StatsBase {} + /** + * A `fs.Stats` object provides information about a file. + * + * Objects returned from {@link stat}, {@link lstat} and {@link fstat} and + * their synchronous counterparts are of this type. + * If `bigint` in the `options` passed to those methods is true, the numeric values + * will be `bigint` instead of `number`, and the object will contain additional + * nanosecond-precision properties suffixed with `Ns`. 
+ * + * ```console + * Stats { + * dev: 2114, + * ino: 48064969, + * mode: 33188, + * nlink: 1, + * uid: 85, + * gid: 100, + * rdev: 0, + * size: 527, + * blksize: 4096, + * blocks: 8, + * atimeMs: 1318289051000.1, + * mtimeMs: 1318289051000.1, + * ctimeMs: 1318289051000.1, + * birthtimeMs: 1318289051000.1, + * atime: Mon, 10 Oct 2011 23:24:11 GMT, + * mtime: Mon, 10 Oct 2011 23:24:11 GMT, + * ctime: Mon, 10 Oct 2011 23:24:11 GMT, + * birthtime: Mon, 10 Oct 2011 23:24:11 GMT } + * ``` + * + * `bigint` version: + * + * ```console + * BigIntStats { + * dev: 2114n, + * ino: 48064969n, + * mode: 33188n, + * nlink: 1n, + * uid: 85n, + * gid: 100n, + * rdev: 0n, + * size: 527n, + * blksize: 4096n, + * blocks: 8n, + * atimeMs: 1318289051000n, + * mtimeMs: 1318289051000n, + * ctimeMs: 1318289051000n, + * birthtimeMs: 1318289051000n, + * atimeNs: 1318289051000000000n, + * mtimeNs: 1318289051000000000n, + * ctimeNs: 1318289051000000000n, + * birthtimeNs: 1318289051000000000n, + * atime: Mon, 10 Oct 2011 23:24:11 GMT, + * mtime: Mon, 10 Oct 2011 23:24:11 GMT, + * ctime: Mon, 10 Oct 2011 23:24:11 GMT, + * birthtime: Mon, 10 Oct 2011 23:24:11 GMT } + * ``` + * @since v0.1.21 + */ + export class Stats {} + /** + * A representation of a directory entry, which can be a file or a subdirectory + * within the directory, as returned by reading from an `fs.Dir`. The + * directory entry is a combination of the file name and file type pairs. + * + * Additionally, when {@link readdir} or {@link readdirSync} is called with + * the `withFileTypes` option set to `true`, the resulting array is filled with `fs.Dirent` objects, rather than strings or `Buffer` s. + * @since v10.10.0 + */ + export class Dirent { + /** + * Returns `true` if the `fs.Dirent` object describes a regular file. + * @since v10.10.0 + */ + isFile(): boolean; + /** + * Returns `true` if the `fs.Dirent` object describes a file system + * directory. + * @since v10.10.0 + */ + isDirectory(): boolean; + /** + * Returns `true` if the `fs.Dirent` object describes a block device. + * @since v10.10.0 + */ + isBlockDevice(): boolean; + /** + * Returns `true` if the `fs.Dirent` object describes a character device. + * @since v10.10.0 + */ + isCharacterDevice(): boolean; + /** + * Returns `true` if the `fs.Dirent` object describes a symbolic link. + * @since v10.10.0 + */ + isSymbolicLink(): boolean; + /** + * Returns `true` if the `fs.Dirent` object describes a first-in-first-out + * (FIFO) pipe. + * @since v10.10.0 + */ + isFIFO(): boolean; + /** + * Returns `true` if the `fs.Dirent` object describes a socket. + * @since v10.10.0 + */ + isSocket(): boolean; + /** + * The file name that this `fs.Dirent` object refers to. The type of this + * value is determined by the `options.encoding` passed to {@link readdir} or {@link readdirSync}. + * @since v10.10.0 + */ + name: string; + } + /** + * A class representing a directory stream. + * + * Created by {@link opendir}, {@link opendirSync}, or `fsPromises.opendir()`. + * + * ```js + * import { opendir } from 'fs/promises'; + * + * try { + * const dir = await opendir('./'); + * for await (const dirent of dir) + * console.log(dirent.name); + * } catch (err) { + * console.error(err); + * } + * ``` + * + * When using the async iterator, the `fs.Dir` object will be automatically + * closed after the iterator exits. + * @since v12.12.0 + */ + export class Dir implements AsyncIterable { + /** + * The read-only path of this directory as was provided to {@link opendir},{@link opendirSync}, or `fsPromises.opendir()`. 
+ * @since v12.12.0
+ */
+ readonly path: string;
+ /**
+ * Asynchronously iterates over the directory via `readdir(3)` until all entries have been read.
+ */
+ [Symbol.asyncIterator](): AsyncIterableIterator<Dirent>;
+ /**
+ * Asynchronously close the directory's underlying resource handle.
+ * Subsequent reads will result in errors.
+ *
+ * A promise is returned that will be resolved after the resource has been
+ * closed.
+ * @since v12.12.0
+ */
+ close(): Promise<void>;
+ close(cb: NoParamCallback): void;
+ /**
+ * Synchronously close the directory's underlying resource handle.
+ * Subsequent reads will result in errors.
+ * @since v12.12.0
+ */
+ closeSync(): void;
+ /**
+ * Asynchronously read the next directory entry via [`readdir(3)`](http://man7.org/linux/man-pages/man3/readdir.3.html) as an `fs.Dirent`.
+ *
+ * A promise is returned that will be resolved with an `fs.Dirent`, or `null` if there are no more directory entries to read.
+ *
+ * Directory entries returned by this function are in no particular order as
+ * provided by the operating system's underlying directory mechanisms.
+ * Entries added or removed while iterating over the directory might not be
+ * included in the iteration results.
+ * @since v12.12.0
+ * @return containing {fs.Dirent|null}
+ */
+ read(): Promise<Dirent | null>;
+ read(cb: (err: NodeJS.ErrnoException | null, dirEnt: Dirent | null) => void): void;
+ /**
+ * Synchronously read the next directory entry as an `fs.Dirent`. See the
+ * POSIX [`readdir(3)`](http://man7.org/linux/man-pages/man3/readdir.3.html) documentation for more detail.
+ *
+ * If there are no more directory entries to read, `null` will be returned.
+ *
+ * Directory entries returned by this function are in no particular order as
+ * provided by the operating system's underlying directory mechanisms.
+ * Entries added or removed while iterating over the directory might not be
+ * included in the iteration results.
+ * @since v12.12.0
+ */
+ readSync(): Dirent | null;
+ }
+ export interface FSWatcher extends EventEmitter {
+ /**
+ * Stop watching for changes on the given `fs.FSWatcher`. Once stopped, the `fs.FSWatcher` object is no longer usable.
+ * @since v0.5.8
+ */
+ close(): void;
+ /**
+ * events.EventEmitter
+ * 1. change
+ * 2.
error + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'change', listener: (eventType: string, filename: string | Buffer) => void): this; + addListener(event: 'error', listener: (error: Error) => void): this; + addListener(event: 'close', listener: () => void): this; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'change', listener: (eventType: string, filename: string | Buffer) => void): this; + on(event: 'error', listener: (error: Error) => void): this; + on(event: 'close', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'change', listener: (eventType: string, filename: string | Buffer) => void): this; + once(event: 'error', listener: (error: Error) => void): this; + once(event: 'close', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'change', listener: (eventType: string, filename: string | Buffer) => void): this; + prependListener(event: 'error', listener: (error: Error) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'change', listener: (eventType: string, filename: string | Buffer) => void): this; + prependOnceListener(event: 'error', listener: (error: Error) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + } + /** + * Instances of `fs.ReadStream` are created and returned using the {@link createReadStream} function. + * @since v0.1.93 + */ + export class ReadStream extends stream.Readable { + close(): void; + /** + * The number of bytes that have been read so far. + * @since v6.4.0 + */ + bytesRead: number; + /** + * The path to the file the stream is reading from as specified in the first + * argument to `fs.createReadStream()`. If `path` is passed as a string, then`readStream.path` will be a string. If `path` is passed as a `Buffer`, then`readStream.path` will be a + * `Buffer`. + * @since v0.1.93 + */ + path: string | Buffer; + /** + * This property is `true` if the underlying file has not been opened yet, + * i.e. before the `'ready'` event is emitted. + * @since v11.2.0, v10.16.0 + */ + pending: boolean; + /** + * events.EventEmitter + * 1. open + * 2. close + * 3. 
ready + */ + addListener(event: 'close', listener: () => void): this; + addListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + addListener(event: 'end', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'open', listener: (fd: number) => void): this; + addListener(event: 'pause', listener: () => void): this; + addListener(event: 'readable', listener: () => void): this; + addListener(event: 'ready', listener: () => void): this; + addListener(event: 'resume', listener: () => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'data', listener: (chunk: Buffer | string) => void): this; + on(event: 'end', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'open', listener: (fd: number) => void): this; + on(event: 'pause', listener: () => void): this; + on(event: 'readable', listener: () => void): this; + on(event: 'ready', listener: () => void): this; + on(event: 'resume', listener: () => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'data', listener: (chunk: Buffer | string) => void): this; + once(event: 'end', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'open', listener: (fd: number) => void): this; + once(event: 'pause', listener: () => void): this; + once(event: 'readable', listener: () => void): this; + once(event: 'ready', listener: () => void): this; + once(event: 'resume', listener: () => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + prependListener(event: 'end', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'open', listener: (fd: number) => void): this; + prependListener(event: 'pause', listener: () => void): this; + prependListener(event: 'readable', listener: () => void): this; + prependListener(event: 'ready', listener: () => void): this; + prependListener(event: 'resume', listener: () => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + prependOnceListener(event: 'end', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'open', listener: (fd: number) => void): this; + prependOnceListener(event: 'pause', listener: () => void): this; + prependOnceListener(event: 'readable', listener: () => void): this; + prependOnceListener(event: 'ready', listener: () => void): this; + prependOnceListener(event: 'resume', listener: () => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + /** + * * Extends `stream.Writable` + * + * Instances of `fs.WriteStream` are created and returned using the {@link createWriteStream} function. + * @since v0.1.93 + */ + export class WriteStream extends stream.Writable { + /** + * Closes `writeStream`. 
Optionally accepts a + * callback that will be executed once the `writeStream`is closed. + * @since v0.9.4 + */ + close(): void; + /** + * The number of bytes written so far. Does not include data that is still queued + * for writing. + * @since v0.4.7 + */ + bytesWritten: number; + /** + * The path to the file the stream is writing to as specified in the first + * argument to {@link createWriteStream}. If `path` is passed as a string, then`writeStream.path` will be a string. If `path` is passed as a `Buffer`, then`writeStream.path` will be a + * `Buffer`. + * @since v0.1.93 + */ + path: string | Buffer; + /** + * This property is `true` if the underlying file has not been opened yet, + * i.e. before the `'ready'` event is emitted. + * @since v11.2.0 + */ + pending: boolean; + /** + * events.EventEmitter + * 1. open + * 2. close + * 3. ready + */ + addListener(event: 'close', listener: () => void): this; + addListener(event: 'drain', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'finish', listener: () => void): this; + addListener(event: 'open', listener: (fd: number) => void): this; + addListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + addListener(event: 'ready', listener: () => void): this; + addListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'drain', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'finish', listener: () => void): this; + on(event: 'open', listener: (fd: number) => void): this; + on(event: 'pipe', listener: (src: stream.Readable) => void): this; + on(event: 'ready', listener: () => void): this; + on(event: 'unpipe', listener: (src: stream.Readable) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'drain', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'finish', listener: () => void): this; + once(event: 'open', listener: (fd: number) => void): this; + once(event: 'pipe', listener: (src: stream.Readable) => void): this; + once(event: 'ready', listener: () => void): this; + once(event: 'unpipe', listener: (src: stream.Readable) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'drain', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'finish', listener: () => void): this; + prependListener(event: 'open', listener: (fd: number) => void): this; + prependListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependListener(event: 'ready', listener: () => void): this; + prependListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'drain', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'finish', listener: () => void): this; + prependOnceListener(event: 'open', listener: (fd: number) => void): 
this; + prependOnceListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: 'ready', listener: () => void): this; + prependOnceListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + /** + * Asynchronously rename file at `oldPath` to the pathname provided + * as `newPath`. In the case that `newPath` already exists, it will + * be overwritten. If there is a directory at `newPath`, an error will + * be raised instead. No arguments other than a possible exception are + * given to the completion callback. + * + * See also: [`rename(2)`](http://man7.org/linux/man-pages/man2/rename.2.html). + * + * ```js + * import { rename } from 'fs'; + * + * rename('oldFile.txt', 'newFile.txt', (err) => { + * if (err) throw err; + * console.log('Rename complete!'); + * }); + * ``` + * @since v0.0.2 + */ + export function rename(oldPath: PathLike, newPath: PathLike, callback: NoParamCallback): void; + export namespace rename { + /** + * Asynchronous rename(2) - Change the name or location of a file or directory. + * @param oldPath A path to a file. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + * @param newPath A path to a file. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + */ + function __promisify__(oldPath: PathLike, newPath: PathLike): Promise; + } + /** + * Renames the file from `oldPath` to `newPath`. Returns `undefined`. + * + * See the POSIX [`rename(2)`](http://man7.org/linux/man-pages/man2/rename.2.html) documentation for more details. + * @since v0.1.21 + */ + export function renameSync(oldPath: PathLike, newPath: PathLike): void; + /** + * Truncates the file. No arguments other than a possible exception are + * given to the completion callback. A file descriptor can also be passed as the + * first argument. In this case, `fs.ftruncate()` is called. + * + * ```js + * import { truncate } from 'fs'; + * // Assuming that 'path/file.txt' is a regular file. + * truncate('path/file.txt', (err) => { + * if (err) throw err; + * console.log('path/file.txt was truncated'); + * }); + * ``` + * + * Passing a file descriptor is deprecated and may result in an error being thrown + * in the future. + * + * See the POSIX [`truncate(2)`](http://man7.org/linux/man-pages/man2/truncate.2.html) documentation for more details. + * @since v0.8.6 + * @param [len=0] + */ + export function truncate(path: PathLike, len: number | undefined | null, callback: NoParamCallback): void; + /** + * Asynchronous truncate(2) - Truncate a file to a specified length. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export function truncate(path: PathLike, callback: NoParamCallback): void; + export namespace truncate { + /** + * Asynchronous truncate(2) - Truncate a file to a specified length. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param len If not specified, defaults to `0`. + */ + function __promisify__(path: PathLike, len?: number | null): Promise; + } + /** + * Truncates the file. Returns `undefined`. A file descriptor can also be + * passed as the first argument. In this case, `fs.ftruncateSync()` is called. + * + * Passing a file descriptor is deprecated and may result in an error being thrown + * in the future. 
+ * @since v0.8.6 + * @param [len=0] + */ + export function truncateSync(path: PathLike, len?: number | null): void; + /** + * Truncates the file descriptor. No arguments other than a possible exception are + * given to the completion callback. + * + * See the POSIX [`ftruncate(2)`](http://man7.org/linux/man-pages/man2/ftruncate.2.html) documentation for more detail. + * + * If the file referred to by the file descriptor was larger than `len` bytes, only + * the first `len` bytes will be retained in the file. + * + * For example, the following program retains only the first four bytes of the + * file: + * + * ```js + * import { open, close, ftruncate } from 'fs'; + * + * function closeFd(fd) { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * + * open('temp.txt', 'r+', (err, fd) => { + * if (err) throw err; + * + * try { + * ftruncate(fd, 4, (err) => { + * closeFd(fd); + * if (err) throw err; + * }); + * } catch (err) { + * closeFd(fd); + * if (err) throw err; + * } + * }); + * ``` + * + * If the file previously was shorter than `len` bytes, it is extended, and the + * extended part is filled with null bytes (`'\0'`): + * + * If `len` is negative then `0` will be used. + * @since v0.8.6 + * @param [len=0] + */ + export function ftruncate(fd: number, len: number | undefined | null, callback: NoParamCallback): void; + /** + * Asynchronous ftruncate(2) - Truncate a file to a specified length. + * @param fd A file descriptor. + */ + export function ftruncate(fd: number, callback: NoParamCallback): void; + export namespace ftruncate { + /** + * Asynchronous ftruncate(2) - Truncate a file to a specified length. + * @param fd A file descriptor. + * @param len If not specified, defaults to `0`. + */ + function __promisify__(fd: number, len?: number | null): Promise; + } + /** + * Truncates the file descriptor. Returns `undefined`. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link ftruncate}. + * @since v0.8.6 + * @param [len=0] + */ + export function ftruncateSync(fd: number, len?: number | null): void; + /** + * Asynchronously changes owner and group of a file. No arguments other than a + * possible exception are given to the completion callback. + * + * See the POSIX [`chown(2)`](http://man7.org/linux/man-pages/man2/chown.2.html) documentation for more detail. + * @since v0.1.97 + */ + export function chown(path: PathLike, uid: number, gid: number, callback: NoParamCallback): void; + export namespace chown { + /** + * Asynchronous chown(2) - Change ownership of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + function __promisify__(path: PathLike, uid: number, gid: number): Promise; + } + /** + * Synchronously changes owner and group of a file. Returns `undefined`. + * This is the synchronous version of {@link chown}. + * + * See the POSIX [`chown(2)`](http://man7.org/linux/man-pages/man2/chown.2.html) documentation for more detail. + * @since v0.1.97 + */ + export function chownSync(path: PathLike, uid: number, gid: number): void; + /** + * Sets the owner of the file. No arguments other than a possible exception are + * given to the completion callback. + * + * See the POSIX [`fchown(2)`](http://man7.org/linux/man-pages/man2/fchown.2.html) documentation for more detail. 
+ * @since v0.4.7 + */ + export function fchown(fd: number, uid: number, gid: number, callback: NoParamCallback): void; + export namespace fchown { + /** + * Asynchronous fchown(2) - Change ownership of a file. + * @param fd A file descriptor. + */ + function __promisify__(fd: number, uid: number, gid: number): Promise; + } + /** + * Sets the owner of the file. Returns `undefined`. + * + * See the POSIX [`fchown(2)`](http://man7.org/linux/man-pages/man2/fchown.2.html) documentation for more detail. + * @since v0.4.7 + * @param uid The file's new owner's user id. + * @param gid The file's new group's group id. + */ + export function fchownSync(fd: number, uid: number, gid: number): void; + /** + * Set the owner of the symbolic link. No arguments other than a possible + * exception are given to the completion callback. + * + * See the POSIX [`lchown(2)`](http://man7.org/linux/man-pages/man2/lchown.2.html) documentation for more detail. + */ + export function lchown(path: PathLike, uid: number, gid: number, callback: NoParamCallback): void; + export namespace lchown { + /** + * Asynchronous lchown(2) - Change ownership of a file. Does not dereference symbolic links. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + function __promisify__(path: PathLike, uid: number, gid: number): Promise; + } + /** + * Set the owner for the path. Returns `undefined`. + * + * See the POSIX [`lchown(2)`](http://man7.org/linux/man-pages/man2/lchown.2.html) documentation for more details. + * @param uid The file's new owner's user id. + * @param gid The file's new group's group id. + */ + export function lchownSync(path: PathLike, uid: number, gid: number): void; + /** + * Changes the access and modification times of a file in the same way as {@link utimes}, with the difference that if the path refers to a symbolic + * link, then the link is not dereferenced: instead, the timestamps of the + * symbolic link itself are changed. + * + * No arguments other than a possible exception are given to the completion + * callback. + * @since v14.5.0, v12.19.0 + */ + export function lutimes(path: PathLike, atime: TimeLike, mtime: TimeLike, callback: NoParamCallback): void; + export namespace lutimes { + /** + * Changes the access and modification times of a file in the same way as `fsPromises.utimes()`, + * with the difference that if the path refers to a symbolic link, then the link is not + * dereferenced: instead, the timestamps of the symbolic link itself are changed. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param atime The last access time. If a string is provided, it will be coerced to number. + * @param mtime The last modified time. If a string is provided, it will be coerced to number. + */ + function __promisify__(path: PathLike, atime: TimeLike, mtime: TimeLike): Promise; + } + /** + * Change the file system timestamps of the symbolic link referenced by `path`. + * Returns `undefined`, or throws an exception when parameters are incorrect or + * the operation fails. This is the synchronous version of {@link lutimes}. + * @since v14.5.0, v12.19.0 + */ + export function lutimesSync(path: PathLike, atime: TimeLike, mtime: TimeLike): void; + /** + * Asynchronously changes the permissions of a file. No arguments other than a + * possible exception are given to the completion callback. + * + * See the POSIX [`chmod(2)`](http://man7.org/linux/man-pages/man2/chmod.2.html) documentation for more detail. 
+ * + * ```js + * import { chmod } from 'fs'; + * + * chmod('my_file.txt', 0o775, (err) => { + * if (err) throw err; + * console.log('The permissions for file "my_file.txt" have been changed!'); + * }); + * ``` + * @since v0.1.30 + */ + export function chmod(path: PathLike, mode: Mode, callback: NoParamCallback): void; + export namespace chmod { + /** + * Asynchronous chmod(2) - Change permissions of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param mode A file mode. If a string is passed, it is parsed as an octal integer. + */ + function __promisify__(path: PathLike, mode: Mode): Promise; + } + /** + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link chmod}. + * + * See the POSIX [`chmod(2)`](http://man7.org/linux/man-pages/man2/chmod.2.html) documentation for more detail. + * @since v0.6.7 + */ + export function chmodSync(path: PathLike, mode: Mode): void; + /** + * Sets the permissions on the file. No arguments other than a possible exception + * are given to the completion callback. + * + * See the POSIX [`fchmod(2)`](http://man7.org/linux/man-pages/man2/fchmod.2.html) documentation for more detail. + * @since v0.4.7 + */ + export function fchmod(fd: number, mode: Mode, callback: NoParamCallback): void; + export namespace fchmod { + /** + * Asynchronous fchmod(2) - Change permissions of a file. + * @param fd A file descriptor. + * @param mode A file mode. If a string is passed, it is parsed as an octal integer. + */ + function __promisify__(fd: number, mode: Mode): Promise; + } + /** + * Sets the permissions on the file. Returns `undefined`. + * + * See the POSIX [`fchmod(2)`](http://man7.org/linux/man-pages/man2/fchmod.2.html) documentation for more detail. + * @since v0.4.7 + */ + export function fchmodSync(fd: number, mode: Mode): void; + /** + * Changes the permissions on a symbolic link. No arguments other than a possible + * exception are given to the completion callback. + * + * This method is only implemented on macOS. + * + * See the POSIX [`lchmod(2)`](https://www.freebsd.org/cgi/man.cgi?query=lchmod&sektion=2) documentation for more detail. + * @deprecated Since v0.4.7 + */ + export function lchmod(path: PathLike, mode: Mode, callback: NoParamCallback): void; + export namespace lchmod { + /** + * Asynchronous lchmod(2) - Change permissions of a file. Does not dereference symbolic links. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param mode A file mode. If a string is passed, it is parsed as an octal integer. + */ + function __promisify__(path: PathLike, mode: Mode): Promise; + } + /** + * Changes the permissions on a symbolic link. Returns `undefined`. + * + * This method is only implemented on macOS. + * + * See the POSIX [`lchmod(2)`](https://www.freebsd.org/cgi/man.cgi?query=lchmod&sektion=2) documentation for more detail. + * @deprecated Since v0.4.7 + */ + export function lchmodSync(path: PathLike, mode: Mode): void; + /** + * Asynchronous [`stat(2)`](http://man7.org/linux/man-pages/man2/stat.2.html). The callback gets two arguments `(err, stats)` where`stats` is an `fs.Stats` object. + * + * In case of an error, the `err.code` will be one of `Common System Errors`. + * + * Using `fs.stat()` to check for the existence of a file before calling`fs.open()`, `fs.readFile()` or `fs.writeFile()` is not recommended. 
+ * Instead, user code should open/read/write the file directly and handle the + * error raised if the file is not available. + * + * To check if a file exists without manipulating it afterwards, {@link access} is recommended. + * + * For example, given the following directory structure: + * + * ```text + * - txtDir + * -- file.txt + * - app.js + * ``` + * + * The next program will check for the stats of the given paths: + * + * ```js + * import { stat } from 'fs'; + * + * const pathsToCheck = ['./txtDir', './txtDir/file.txt']; + * + * for (let i = 0; i < pathsToCheck.length; i++) { + * stat(pathsToCheck[i], (err, stats) => { + * console.log(stats.isDirectory()); + * console.log(stats); + * }); + * } + * ``` + * + * The resulting output will resemble: + * + * ```console + * true + * Stats { + * dev: 16777220, + * mode: 16877, + * nlink: 3, + * uid: 501, + * gid: 20, + * rdev: 0, + * blksize: 4096, + * ino: 14214262, + * size: 96, + * blocks: 0, + * atimeMs: 1561174653071.963, + * mtimeMs: 1561174614583.3518, + * ctimeMs: 1561174626623.5366, + * birthtimeMs: 1561174126937.2893, + * atime: 2019-06-22T03:37:33.072Z, + * mtime: 2019-06-22T03:36:54.583Z, + * ctime: 2019-06-22T03:37:06.624Z, + * birthtime: 2019-06-22T03:28:46.937Z + * } + * false + * Stats { + * dev: 16777220, + * mode: 33188, + * nlink: 1, + * uid: 501, + * gid: 20, + * rdev: 0, + * blksize: 4096, + * ino: 14214074, + * size: 8, + * blocks: 8, + * atimeMs: 1561174616618.8555, + * mtimeMs: 1561174614584, + * ctimeMs: 1561174614583.8145, + * birthtimeMs: 1561174007710.7478, + * atime: 2019-06-22T03:36:56.619Z, + * mtime: 2019-06-22T03:36:54.584Z, + * ctime: 2019-06-22T03:36:54.584Z, + * birthtime: 2019-06-22T03:26:47.711Z + * } + * ``` + * @since v0.0.2 + */ + export function stat(path: PathLike, callback: (err: NodeJS.ErrnoException | null, stats: Stats) => void): void; + export function stat( + path: PathLike, + options: + | (StatOptions & { + bigint?: false | undefined; + }) + | undefined, + callback: (err: NodeJS.ErrnoException | null, stats: Stats) => void + ): void; + export function stat( + path: PathLike, + options: StatOptions & { + bigint: true; + }, + callback: (err: NodeJS.ErrnoException | null, stats: BigIntStats) => void + ): void; + export function stat(path: PathLike, options: StatOptions | undefined, callback: (err: NodeJS.ErrnoException | null, stats: Stats | BigIntStats) => void): void; + export namespace stat { + /** + * Asynchronous stat(2) - Get file status. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. 
+ */ + function __promisify__( + path: PathLike, + options?: StatOptions & { + bigint?: false | undefined; + } + ): Promise; + function __promisify__( + path: PathLike, + options: StatOptions & { + bigint: true; + } + ): Promise; + function __promisify__(path: PathLike, options?: StatOptions): Promise; + } + export interface StatSyncFn extends Function { + (path: PathLike, options?: undefined): Stats; + ( + path: PathLike, + options?: StatSyncOptions & { + bigint?: false | undefined; + throwIfNoEntry: false; + } + ): Stats | undefined; + ( + path: PathLike, + options: StatSyncOptions & { + bigint: true; + throwIfNoEntry: false; + } + ): BigIntStats | undefined; + ( + path: PathLike, + options?: StatSyncOptions & { + bigint?: false | undefined; + } + ): Stats; + ( + path: PathLike, + options: StatSyncOptions & { + bigint: true; + } + ): BigIntStats; + ( + path: PathLike, + options: StatSyncOptions & { + bigint: boolean; + throwIfNoEntry?: false | undefined; + } + ): Stats | BigIntStats; + (path: PathLike, options?: StatSyncOptions): Stats | BigIntStats | undefined; + } + /** + * Synchronous stat(2) - Get file status. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export const statSync: StatSyncFn; + /** + * Invokes the callback with the `fs.Stats` for the file descriptor. + * + * See the POSIX [`fstat(2)`](http://man7.org/linux/man-pages/man2/fstat.2.html) documentation for more detail. + * @since v0.1.95 + */ + export function fstat(fd: number, callback: (err: NodeJS.ErrnoException | null, stats: Stats) => void): void; + export function fstat( + fd: number, + options: + | (StatOptions & { + bigint?: false | undefined; + }) + | undefined, + callback: (err: NodeJS.ErrnoException | null, stats: Stats) => void + ): void; + export function fstat( + fd: number, + options: StatOptions & { + bigint: true; + }, + callback: (err: NodeJS.ErrnoException | null, stats: BigIntStats) => void + ): void; + export function fstat(fd: number, options: StatOptions | undefined, callback: (err: NodeJS.ErrnoException | null, stats: Stats | BigIntStats) => void): void; + export namespace fstat { + /** + * Asynchronous fstat(2) - Get file status. + * @param fd A file descriptor. + */ + function __promisify__( + fd: number, + options?: StatOptions & { + bigint?: false | undefined; + } + ): Promise; + function __promisify__( + fd: number, + options: StatOptions & { + bigint: true; + } + ): Promise; + function __promisify__(fd: number, options?: StatOptions): Promise; + } + /** + * Synchronous fstat(2) - Get file status. + * @param fd A file descriptor. + */ + export function fstatSync( + fd: number, + options?: StatOptions & { + bigint?: false | undefined; + } + ): Stats; + export function fstatSync( + fd: number, + options: StatOptions & { + bigint: true; + } + ): BigIntStats; + export function fstatSync(fd: number, options?: StatOptions): Stats | BigIntStats; + + /** + * Retrieves the `fs.Stats` for the symbolic link referred to by the path. + * The callback gets two arguments `(err, stats)` where `stats` is a `fs.Stats` object. `lstat()` is identical to `stat()`, except that if `path` is a symbolic + * link, then the link itself is stat-ed, not the file that it refers to. + * + * See the POSIX [`lstat(2)`](http://man7.org/linux/man-pages/man2/lstat.2.html) documentation for more details. 
+ * @since v0.1.30 + */ + export function lstat(path: PathLike, callback: (err: NodeJS.ErrnoException | null, stats: Stats) => void): void; + export function lstat( + path: PathLike, + options: + | (StatOptions & { + bigint?: false | undefined; + }) + | undefined, + callback: (err: NodeJS.ErrnoException | null, stats: Stats) => void + ): void; + export function lstat( + path: PathLike, + options: StatOptions & { + bigint: true; + }, + callback: (err: NodeJS.ErrnoException | null, stats: BigIntStats) => void + ): void; + export function lstat(path: PathLike, options: StatOptions | undefined, callback: (err: NodeJS.ErrnoException | null, stats: Stats | BigIntStats) => void): void; + export namespace lstat { + /** + * Asynchronous lstat(2) - Get file status. Does not dereference symbolic links. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + function __promisify__( + path: PathLike, + options?: StatOptions & { + bigint?: false | undefined; + } + ): Promise; + function __promisify__( + path: PathLike, + options: StatOptions & { + bigint: true; + } + ): Promise; + function __promisify__(path: PathLike, options?: StatOptions): Promise; + } + /** + * Synchronous lstat(2) - Get file status. Does not dereference symbolic links. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export const lstatSync: StatSyncFn; + /** + * Creates a new link from the `existingPath` to the `newPath`. See the POSIX [`link(2)`](http://man7.org/linux/man-pages/man2/link.2.html) documentation for more detail. No arguments other than + * a possible + * exception are given to the completion callback. + * @since v0.1.31 + */ + export function link(existingPath: PathLike, newPath: PathLike, callback: NoParamCallback): void; + export namespace link { + /** + * Asynchronous link(2) - Create a new link (also known as a hard link) to an existing file. + * @param existingPath A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param newPath A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + function __promisify__(existingPath: PathLike, newPath: PathLike): Promise; + } + /** + * Creates a new link from the `existingPath` to the `newPath`. See the POSIX [`link(2)`](http://man7.org/linux/man-pages/man2/link.2.html) documentation for more detail. Returns `undefined`. + * @since v0.1.31 + */ + export function linkSync(existingPath: PathLike, newPath: PathLike): void; + /** + * Creates the link called `path` pointing to `target`. No arguments other than a + * possible exception are given to the completion callback. + * + * See the POSIX [`symlink(2)`](http://man7.org/linux/man-pages/man2/symlink.2.html) documentation for more details. + * + * The `type` argument is only available on Windows and ignored on other platforms. + * It can be set to `'dir'`, `'file'`, or `'junction'`. If the `type` argument is + * not set, Node.js will autodetect `target` type and use `'file'` or `'dir'`. If + * the `target` does not exist, `'file'` will be used. Windows junction points + * require the destination path to be absolute. When using `'junction'`, the`target` argument will automatically be normalized to absolute path. + * + * Relative targets are relative to the link’s parent directory. 
+ * + * ```js + * import { symlink } from 'fs'; + * + * symlink('./mew', './example/mewtwo', callback); + * ``` + * + * The above example creates a symbolic link `mewtwo` in the `example` which points + * to `mew` in the same directory: + * + * ```bash + * $ tree example/ + * example/ + * ├── mew + * └── mewtwo -> ./mew + * ``` + * @since v0.1.31 + */ + export function symlink(target: PathLike, path: PathLike, type: symlink.Type | undefined | null, callback: NoParamCallback): void; + /** + * Asynchronous symlink(2) - Create a new symbolic link to an existing file. + * @param target A path to an existing file. If a URL is provided, it must use the `file:` protocol. + * @param path A path to the new symlink. If a URL is provided, it must use the `file:` protocol. + */ + export function symlink(target: PathLike, path: PathLike, callback: NoParamCallback): void; + export namespace symlink { + /** + * Asynchronous symlink(2) - Create a new symbolic link to an existing file. + * @param target A path to an existing file. If a URL is provided, it must use the `file:` protocol. + * @param path A path to the new symlink. If a URL is provided, it must use the `file:` protocol. + * @param type May be set to `'dir'`, `'file'`, or `'junction'` (default is `'file'`) and is only available on Windows (ignored on other platforms). + * When using `'junction'`, the `target` argument will automatically be normalized to an absolute path. + */ + function __promisify__(target: PathLike, path: PathLike, type?: string | null): Promise; + type Type = 'dir' | 'file' | 'junction'; + } + /** + * Returns `undefined`. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link symlink}. + * @since v0.1.31 + */ + export function symlinkSync(target: PathLike, path: PathLike, type?: symlink.Type | null): void; + /** + * Reads the contents of the symbolic link referred to by `path`. The callback gets + * two arguments `(err, linkString)`. + * + * See the POSIX [`readlink(2)`](http://man7.org/linux/man-pages/man2/readlink.2.html) documentation for more details. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the link path passed to the callback. If the `encoding` is set to `'buffer'`, + * the link path returned will be passed as a `Buffer` object. + * @since v0.1.31 + */ + export function readlink(path: PathLike, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, linkString: string) => void): void; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readlink(path: PathLike, options: BufferEncodingOption, callback: (err: NodeJS.ErrnoException | null, linkString: Buffer) => void): void; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. 
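+ *
+ * Continuing the `mewtwo` symlink example above, a short readlink sketch (added for
+ * illustration; paths are hypothetical and not part of the upstream typings):
+ *
+ * ```js
+ * import { readlink } from 'fs';
+ *
+ * // Reads the target string stored in the symbolic link itself.
+ * readlink('./example/mewtwo', (err, linkString) => {
+ * if (err) throw err;
+ * console.log(linkString); // './mew'
+ * });
+ * ```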
+ */ + export function readlink(path: PathLike, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, linkString: string | Buffer) => void): void; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export function readlink(path: PathLike, callback: (err: NodeJS.ErrnoException | null, linkString: string) => void): void; + export namespace readlink { + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__(path: PathLike, options?: EncodingOption): Promise; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__(path: PathLike, options: BufferEncodingOption): Promise; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__(path: PathLike, options?: EncodingOption): Promise; + } + /** + * Returns the symbolic link's string value. + * + * See the POSIX [`readlink(2)`](http://man7.org/linux/man-pages/man2/readlink.2.html) documentation for more details. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the link path returned. If the `encoding` is set to `'buffer'`, + * the link path returned will be passed as a `Buffer` object. + * @since v0.1.31 + */ + export function readlinkSync(path: PathLike, options?: EncodingOption): string; + /** + * Synchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readlinkSync(path: PathLike, options: BufferEncodingOption): Buffer; + /** + * Synchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readlinkSync(path: PathLike, options?: EncodingOption): string | Buffer; + /** + * Asynchronously computes the canonical pathname by resolving `.`, `..` and + * symbolic links. + * + * A canonical pathname is not necessarily unique. Hard links and bind mounts can + * expose a file system entity through many pathnames. + * + * This function behaves like [`realpath(3)`](http://man7.org/linux/man-pages/man3/realpath.3.html), with some exceptions: + * + * 1. No case conversion is performed on case-insensitive file systems. + * 2. 
The maximum number of symbolic links is platform-independent and generally + * (much) higher than what the native [`realpath(3)`](http://man7.org/linux/man-pages/man3/realpath.3.html) implementation supports. + * + * The `callback` gets two arguments `(err, resolvedPath)`. May use `process.cwd`to resolve relative paths. + * + * Only paths that can be converted to UTF8 strings are supported. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the path passed to the callback. If the `encoding` is set to `'buffer'`, + * the path returned will be passed as a `Buffer` object. + * + * If `path` resolves to a socket or a pipe, the function will return a system + * dependent name for that object. + * @since v0.1.31 + */ + export function realpath(path: PathLike, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, resolvedPath: string) => void): void; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function realpath(path: PathLike, options: BufferEncodingOption, callback: (err: NodeJS.ErrnoException | null, resolvedPath: Buffer) => void): void; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function realpath(path: PathLike, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, resolvedPath: string | Buffer) => void): void; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export function realpath(path: PathLike, callback: (err: NodeJS.ErrnoException | null, resolvedPath: string) => void): void; + export namespace realpath { + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__(path: PathLike, options?: EncodingOption): Promise; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__(path: PathLike, options: BufferEncodingOption): Promise; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. 
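+ *
+ * A short callback-style sketch for {@link realpath} (added for illustration; the
+ * path shown is hypothetical and not part of the upstream typings):
+ *
+ * ```js
+ * import { realpath } from 'fs';
+ *
+ * // Resolves '.', '..' and symbolic links into a canonical absolute path.
+ * realpath('./example/mewtwo', (err, resolvedPath) => {
+ * if (err) throw err;
+ * console.log(resolvedPath); // e.g. '/home/user/example/mew'
+ * });
+ * ```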
+ */ + function __promisify__(path: PathLike, options?: EncodingOption): Promise; + /** + * Asynchronous [`realpath(3)`](http://man7.org/linux/man-pages/man3/realpath.3.html). + * + * The `callback` gets two arguments `(err, resolvedPath)`. + * + * Only paths that can be converted to UTF8 strings are supported. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the path passed to the callback. If the `encoding` is set to `'buffer'`, + * the path returned will be passed as a `Buffer` object. + * + * On Linux, when Node.js is linked against musl libc, the procfs file system must + * be mounted on `/proc` in order for this function to work. Glibc does not have + * this restriction. + * @since v9.2.0 + */ + function native(path: PathLike, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, resolvedPath: string) => void): void; + function native(path: PathLike, options: BufferEncodingOption, callback: (err: NodeJS.ErrnoException | null, resolvedPath: Buffer) => void): void; + function native(path: PathLike, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, resolvedPath: string | Buffer) => void): void; + function native(path: PathLike, callback: (err: NodeJS.ErrnoException | null, resolvedPath: string) => void): void; + } + /** + * Returns the resolved pathname. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link realpath}. + * @since v0.1.31 + */ + export function realpathSync(path: PathLike, options?: EncodingOption): string; + /** + * Synchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function realpathSync(path: PathLike, options: BufferEncodingOption): Buffer; + /** + * Synchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function realpathSync(path: PathLike, options?: EncodingOption): string | Buffer; + export namespace realpathSync { + function native(path: PathLike, options?: EncodingOption): string; + function native(path: PathLike, options: BufferEncodingOption): Buffer; + function native(path: PathLike, options?: EncodingOption): string | Buffer; + } + /** + * Asynchronously removes a file or symbolic link. No arguments other than a + * possible exception are given to the completion callback. + * + * ```js + * import { unlink } from 'fs'; + * // Assuming that 'path/file.txt' is a regular file. + * unlink('path/file.txt', (err) => { + * if (err) throw err; + * console.log('path/file.txt was deleted'); + * }); + * ``` + * + * `fs.unlink()` will not work on a directory, empty or otherwise. To remove a + * directory, use {@link rmdir}. + * + * See the POSIX [`unlink(2)`](http://man7.org/linux/man-pages/man2/unlink.2.html) documentation for more details. 
+ * @since v0.0.2
+ */
+ export function unlink(path: PathLike, callback: NoParamCallback): void;
+ export namespace unlink {
+ /**
+ * Asynchronous unlink(2) - delete a name and possibly the file it refers to.
+ * @param path A path to a file. If a URL is provided, it must use the `file:` protocol.
+ */
+ function __promisify__(path: PathLike): Promise<void>;
+ }
+ /**
+ * Synchronous [`unlink(2)`](http://man7.org/linux/man-pages/man2/unlink.2.html). Returns `undefined`.
+ * @since v0.1.21
+ */
+ export function unlinkSync(path: PathLike): void;
+ export interface RmDirOptions {
+ /**
+ * If an `EBUSY`, `EMFILE`, `ENFILE`, `ENOTEMPTY`, or
+ * `EPERM` error is encountered, Node.js will retry the operation with a linear
+ * backoff wait of `retryDelay` ms longer on each try. This option represents the
+ * number of retries. This option is ignored if the `recursive` option is not
+ * `true`.
+ * @default 0
+ */
+ maxRetries?: number | undefined;
+ /**
+ * @deprecated since v14.14.0 In future versions of Node.js, recursive removal will trigger a warning, and
+ * `fs.rmdir(path, { recursive: true })` will throw if `path` does not exist or is a file.
+ * Use `fs.rm(path, { recursive: true, force: true })` instead.
+ *
+ * If `true`, perform a recursive directory removal. In
+ * recursive mode, operations are retried on failure.
+ * @default false
+ */
+ recursive?: boolean | undefined;
+ /**
+ * The amount of time in milliseconds to wait between retries.
+ * This option is ignored if the `recursive` option is not `true`.
+ * @default 100
+ */
+ retryDelay?: number | undefined;
+ }
+ /**
+ * Asynchronous [`rmdir(2)`](http://man7.org/linux/man-pages/man2/rmdir.2.html). No arguments other than a possible exception are given
+ * to the completion callback.
+ *
+ * Using `fs.rmdir()` on a file (not a directory) results in an `ENOENT` error on
+ * Windows and an `ENOTDIR` error on POSIX.
+ *
+ * To get a behavior similar to the `rm -rf` Unix command, use {@link rm} with options `{ recursive: true, force: true }`.
+ * @since v0.0.2
+ */
+ export function rmdir(path: PathLike, callback: NoParamCallback): void;
+ export function rmdir(path: PathLike, options: RmDirOptions, callback: NoParamCallback): void;
+ export namespace rmdir {
+ /**
+ * Asynchronous rmdir(2) - delete a directory.
+ * @param path A path to a file. If a URL is provided, it must use the `file:` protocol.
+ */
+ function __promisify__(path: PathLike, options?: RmDirOptions): Promise<void>;
+ }
+ /**
+ * Synchronous [`rmdir(2)`](http://man7.org/linux/man-pages/man2/rmdir.2.html). Returns `undefined`.
+ *
+ * Using `fs.rmdirSync()` on a file (not a directory) results in an `ENOENT` error
+ * on Windows and an `ENOTDIR` error on POSIX.
+ *
+ * To get a behavior similar to the `rm -rf` Unix command, use {@link rmSync} with options `{ recursive: true, force: true }`.
+ * @since v0.1.21
+ */
+ export function rmdirSync(path: PathLike, options?: RmDirOptions): void;
+ export interface RmOptions {
+ /**
+ * When `true`, exceptions will be ignored if `path` does not exist.
+ * @default false
+ */
+ force?: boolean | undefined;
+ /**
+ * If an `EBUSY`, `EMFILE`, `ENFILE`, `ENOTEMPTY`, or
+ * `EPERM` error is encountered, Node.js will retry the operation with a linear
+ * backoff wait of `retryDelay` ms longer on each try. This option represents the
+ * number of retries. This option is ignored if the `recursive` option is not
+ * `true`.
+ * @default 0
+ */
+ maxRetries?: number | undefined;
+ /**
+ * If `true`, perform a recursive directory removal.
In + * recursive mode, operations are retried on failure. + * @default false + */ + recursive?: boolean | undefined; + /** + * The amount of time in milliseconds to wait between retries. + * This option is ignored if the `recursive` option is not `true`. + * @default 100 + */ + retryDelay?: number | undefined; + } + /** + * Asynchronously removes files and directories (modeled on the standard POSIX `rm`utility). No arguments other than a possible exception are given to the + * completion callback. + * @since v14.14.0 + */ + export function rm(path: PathLike, callback: NoParamCallback): void; + export function rm(path: PathLike, options: RmOptions, callback: NoParamCallback): void; + export namespace rm { + /** + * Asynchronously removes files and directories (modeled on the standard POSIX `rm` utility). + */ + function __promisify__(path: PathLike, options?: RmOptions): Promise; + } + /** + * Synchronously removes files and directories (modeled on the standard POSIX `rm`utility). Returns `undefined`. + * @since v14.14.0 + */ + export function rmSync(path: PathLike, options?: RmOptions): void; + export interface MakeDirectoryOptions { + /** + * Indicates whether parent folders should be created. + * If a folder was created, the path to the first created folder will be returned. + * @default false + */ + recursive?: boolean | undefined; + /** + * A file mode. If a string is passed, it is parsed as an octal integer. If not specified + * @default 0o777 + */ + mode?: Mode | undefined; + } + /** + * Asynchronously creates a directory. + * + * The callback is given a possible exception and, if `recursive` is `true`, the + * first directory path created, `(err[, path])`.`path` can still be `undefined` when `recursive` is `true`, if no directory was + * created. + * + * The optional `options` argument can be an integer specifying `mode` (permission + * and sticky bits), or an object with a `mode` property and a `recursive`property indicating whether parent directories should be created. Calling`fs.mkdir()` when `path` is a directory that + * exists results in an error only + * when `recursive` is false. + * + * ```js + * import { mkdir } from 'fs'; + * + * // Creates /tmp/a/apple, regardless of whether `/tmp` and /tmp/a exist. + * mkdir('/tmp/a/apple', { recursive: true }, (err) => { + * if (err) throw err; + * }); + * ``` + * + * On Windows, using `fs.mkdir()` on the root directory even with recursion will + * result in an error: + * + * ```js + * import { mkdir } from 'fs'; + * + * mkdir('/', { recursive: true }, (err) => { + * // => [Error: EPERM: operation not permitted, mkdir 'C:\'] + * }); + * ``` + * + * See the POSIX [`mkdir(2)`](http://man7.org/linux/man-pages/man2/mkdir.2.html) documentation for more details. + * @since v0.1.8 + */ + export function mkdir( + path: PathLike, + options: MakeDirectoryOptions & { + recursive: true; + }, + callback: (err: NodeJS.ErrnoException | null, path?: string) => void + ): void; + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. 
+ */ + export function mkdir( + path: PathLike, + options: + | Mode + | (MakeDirectoryOptions & { + recursive?: false | undefined; + }) + | null + | undefined, + callback: NoParamCallback + ): void; + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + export function mkdir(path: PathLike, options: Mode | MakeDirectoryOptions | null | undefined, callback: (err: NodeJS.ErrnoException | null, path?: string) => void): void; + /** + * Asynchronous mkdir(2) - create a directory with a mode of `0o777`. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export function mkdir(path: PathLike, callback: NoParamCallback): void; + export namespace mkdir { + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + function __promisify__( + path: PathLike, + options: MakeDirectoryOptions & { + recursive: true; + } + ): Promise; + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + function __promisify__( + path: PathLike, + options?: + | Mode + | (MakeDirectoryOptions & { + recursive?: false | undefined; + }) + | null + ): Promise; + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + function __promisify__(path: PathLike, options?: Mode | MakeDirectoryOptions | null): Promise; + } + /** + * Synchronously creates a directory. Returns `undefined`, or if `recursive` is`true`, the first directory path created. + * This is the synchronous version of {@link mkdir}. + * + * See the POSIX [`mkdir(2)`](http://man7.org/linux/man-pages/man2/mkdir.2.html) documentation for more details. + * @since v0.1.21 + */ + export function mkdirSync( + path: PathLike, + options: MakeDirectoryOptions & { + recursive: true; + } + ): string | undefined; + /** + * Synchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. 
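+ *
+ * A sketch of the `recursive` variant declared above (added for illustration; the
+ * directory names are hypothetical and not part of the upstream typings):
+ *
+ * ```js
+ * import { mkdirSync } from 'fs';
+ *
+ * // Creates every missing directory in the chain.
+ * const firstCreated = mkdirSync('./data/cache/images', { recursive: true });
+ * // `firstCreated` is the path of the first directory that was actually
+ * // created, or `undefined` if every directory already existed.
+ * console.log(firstCreated);
+ * ```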
+ */ + export function mkdirSync( + path: PathLike, + options?: + | Mode + | (MakeDirectoryOptions & { + recursive?: false | undefined; + }) + | null + ): void; + /** + * Synchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + export function mkdirSync(path: PathLike, options?: Mode | MakeDirectoryOptions | null): string | undefined; + /** + * Creates a unique temporary directory. + * + * Generates six random characters to be appended behind a required`prefix` to create a unique temporary directory. Due to platform + * inconsistencies, avoid trailing `X` characters in `prefix`. Some platforms, + * notably the BSDs, can return more than six random characters, and replace + * trailing `X` characters in `prefix` with random characters. + * + * The created directory path is passed as a string to the callback's second + * parameter. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use. + * + * ```js + * import { mkdtemp } from 'fs'; + * + * mkdtemp(path.join(os.tmpdir(), 'foo-'), (err, directory) => { + * if (err) throw err; + * console.log(directory); + * // Prints: /tmp/foo-itXde2 or C:\Users\...\AppData\Local\Temp\foo-itXde2 + * }); + * ``` + * + * The `fs.mkdtemp()` method will append the six randomly selected characters + * directly to the `prefix` string. For instance, given a directory `/tmp`, if the + * intention is to create a temporary directory _within_`/tmp`, the `prefix`must end with a trailing platform-specific path separator + * (`require('path').sep`). + * + * ```js + * import { tmpdir } from 'os'; + * import { mkdtemp } from 'fs'; + * + * // The parent directory for the new temporary directory + * const tmpDir = tmpdir(); + * + * // This method is *INCORRECT*: + * mkdtemp(tmpDir, (err, directory) => { + * if (err) throw err; + * console.log(directory); + * // Will print something similar to `/tmpabc123`. + * // A new temporary directory is created at the file system root + * // rather than *within* the /tmp directory. + * }); + * + * // This method is *CORRECT*: + * import { sep } from 'path'; + * mkdtemp(`${tmpDir}${sep}`, (err, directory) => { + * if (err) throw err; + * console.log(directory); + * // Will print something similar to `/tmp/abc123`. + * // A new temporary directory is created within + * // the /tmp directory. + * }); + * ``` + * @since v5.10.0 + */ + export function mkdtemp(prefix: string, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, folder: string) => void): void; + /** + * Asynchronously creates a unique temporary directory. + * Generates six random characters to be appended behind a required prefix to create a unique temporary directory. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function mkdtemp( + prefix: string, + options: + | 'buffer' + | { + encoding: 'buffer'; + }, + callback: (err: NodeJS.ErrnoException | null, folder: Buffer) => void + ): void; + /** + * Asynchronously creates a unique temporary directory. 
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used.
+ */
+ export function mkdtemp(prefix: string, options: EncodingOption, callback: (err: NodeJS.ErrnoException | null, folder: string | Buffer) => void): void;
+ /**
+ * Asynchronously creates a unique temporary directory.
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ */
+ export function mkdtemp(prefix: string, callback: (err: NodeJS.ErrnoException | null, folder: string) => void): void;
+ export namespace mkdtemp {
+ /**
+ * Asynchronously creates a unique temporary directory.
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used.
+ */
+ function __promisify__(prefix: string, options?: EncodingOption): Promise<string>;
+ /**
+ * Asynchronously creates a unique temporary directory.
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used.
+ */
+ function __promisify__(prefix: string, options: BufferEncodingOption): Promise<Buffer>;
+ /**
+ * Asynchronously creates a unique temporary directory.
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used.
+ */
+ function __promisify__(prefix: string, options?: EncodingOption): Promise<string | Buffer>;
+ }
+ /**
+ * Returns the created directory path.
+ *
+ * For detailed information, see the documentation of the asynchronous version of
+ * this API: {@link mkdtemp}.
+ *
+ * The optional `options` argument can be a string specifying an encoding, or an
+ * object with an `encoding` property specifying the character encoding to use.
+ * @since v5.10.0
+ */
+ export function mkdtempSync(prefix: string, options?: EncodingOption): string;
+ /**
+ * Synchronously creates a unique temporary directory.
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used.
+ */
+ export function mkdtempSync(prefix: string, options: BufferEncodingOption): Buffer;
+ /**
+ * Synchronously creates a unique temporary directory.
+ * Generates six random characters to be appended behind a required prefix to create a unique temporary directory.
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used.
+ */
+ export function mkdtempSync(prefix: string, options?: EncodingOption): string | Buffer;
+ /**
+ * Reads the contents of a directory. The callback gets two arguments `(err, files)` where `files` is an array of the names of the files in the directory excluding `'.'` and `'..'`.
+ *
+ * See the POSIX [`readdir(3)`](http://man7.org/linux/man-pages/man3/readdir.3.html) documentation for more details.
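+ *
+ * A minimal callback sketch, reusing the `txtDir` layout from the `stat` example
+ * earlier in this file (added for illustration; not part of the upstream typings):
+ *
+ * ```js
+ * import { readdir } from 'fs';
+ *
+ * readdir('./txtDir', (err, files) => {
+ * if (err) throw err;
+ * console.log(files); // e.g. [ 'file.txt' ]
+ * });
+ * ```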
+ * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the filenames passed to the callback. If the `encoding` is set to `'buffer'`, + * the filenames returned will be passed as `Buffer` objects. + * + * If `options.withFileTypes` is set to `true`, the `files` array will contain `fs.Dirent` objects. + * @since v0.1.8 + */ + export function readdir( + path: PathLike, + options: + | { + encoding: BufferEncoding | null; + withFileTypes?: false | undefined; + } + | BufferEncoding + | undefined + | null, + callback: (err: NodeJS.ErrnoException | null, files: string[]) => void + ): void; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readdir( + path: PathLike, + options: + | { + encoding: 'buffer'; + withFileTypes?: false | undefined; + } + | 'buffer', + callback: (err: NodeJS.ErrnoException | null, files: Buffer[]) => void + ): void; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readdir( + path: PathLike, + options: + | (ObjectEncodingOptions & { + withFileTypes?: false | undefined; + }) + | BufferEncoding + | undefined + | null, + callback: (err: NodeJS.ErrnoException | null, files: string[] | Buffer[]) => void + ): void; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export function readdir(path: PathLike, callback: (err: NodeJS.ErrnoException | null, files: string[]) => void): void; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options If called with `withFileTypes: true` the result data will be an array of Dirent. + */ + export function readdir( + path: PathLike, + options: ObjectEncodingOptions & { + withFileTypes: true; + }, + callback: (err: NodeJS.ErrnoException | null, files: Dirent[]) => void + ): void; + export namespace readdir { + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__( + path: PathLike, + options?: + | { + encoding: BufferEncoding | null; + withFileTypes?: false | undefined; + } + | BufferEncoding + | null + ): Promise; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__( + path: PathLike, + options: + | 'buffer' + | { + encoding: 'buffer'; + withFileTypes?: false | undefined; + } + ): Promise; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. 
If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function __promisify__( + path: PathLike, + options?: + | (ObjectEncodingOptions & { + withFileTypes?: false | undefined; + }) + | BufferEncoding + | null + ): Promise; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options If called with `withFileTypes: true` the result data will be an array of Dirent + */ + function __promisify__( + path: PathLike, + options: ObjectEncodingOptions & { + withFileTypes: true; + } + ): Promise; + } + /** + * Reads the contents of the directory. + * + * See the POSIX [`readdir(3)`](http://man7.org/linux/man-pages/man3/readdir.3.html) documentation for more details. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the filenames returned. If the `encoding` is set to `'buffer'`, + * the filenames returned will be passed as `Buffer` objects. + * + * If `options.withFileTypes` is set to `true`, the result will contain `fs.Dirent` objects. + * @since v0.1.21 + */ + export function readdirSync( + path: PathLike, + options?: + | { + encoding: BufferEncoding | null; + withFileTypes?: false | undefined; + } + | BufferEncoding + | null + ): string[]; + /** + * Synchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readdirSync( + path: PathLike, + options: + | { + encoding: 'buffer'; + withFileTypes?: false | undefined; + } + | 'buffer' + ): Buffer[]; + /** + * Synchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + export function readdirSync( + path: PathLike, + options?: + | (ObjectEncodingOptions & { + withFileTypes?: false | undefined; + }) + | BufferEncoding + | null + ): string[] | Buffer[]; + /** + * Synchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options If called with `withFileTypes: true` the result data will be an array of Dirent. + */ + export function readdirSync( + path: PathLike, + options: ObjectEncodingOptions & { + withFileTypes: true; + } + ): Dirent[]; + /** + * Closes the file descriptor. No arguments other than a possible exception are + * given to the completion callback. + * + * Calling `fs.close()` on any file descriptor (`fd`) that is currently in use + * through any other `fs` operation may lead to undefined behavior. + * + * See the POSIX [`close(2)`](http://man7.org/linux/man-pages/man2/close.2.html) documentation for more detail. + * @since v0.0.2 + */ + export function close(fd: number, callback?: NoParamCallback): void; + export namespace close { + /** + * Asynchronous close(2) - close a file descriptor. + * @param fd A file descriptor. + */ + function __promisify__(fd: number): Promise; + } + /** + * Closes the file descriptor. Returns `undefined`. 
+ * + * Calling `fs.closeSync()` on any file descriptor (`fd`) that is currently in use + * through any other `fs` operation may lead to undefined behavior. + * + * See the POSIX [`close(2)`](http://man7.org/linux/man-pages/man2/close.2.html) documentation for more detail. + * @since v0.1.21 + */ + export function closeSync(fd: number): void; + /** + * Asynchronous file open. See the POSIX [`open(2)`](http://man7.org/linux/man-pages/man2/open.2.html) documentation for more details. + * + * `mode` sets the file mode (permission and sticky bits), but only if the file was + * created. On Windows, only the write permission can be manipulated; see {@link chmod}. + * + * The callback gets two arguments `(err, fd)`. + * + * Some characters (`< > : " / \ | ? *`) are reserved under Windows as documented + * by [Naming Files, Paths, and Namespaces](https://docs.microsoft.com/en-us/windows/desktop/FileIO/naming-a-file). Under NTFS, if the filename contains + * a colon, Node.js will open a file system stream, as described by [this MSDN page](https://docs.microsoft.com/en-us/windows/desktop/FileIO/using-streams). + * + * Functions based on `fs.open()` exhibit this behavior as well:`fs.writeFile()`, `fs.readFile()`, etc. + * @since v0.0.2 + * @param [flags='r'] See `support of file system `flags``. + * @param [mode=0o666] + */ + export function open(path: PathLike, flags: OpenMode, mode: Mode | undefined | null, callback: (err: NodeJS.ErrnoException | null, fd: number) => void): void; + /** + * Asynchronous open(2) - open and possibly create a file. If the file is created, its mode will be `0o666`. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + */ + export function open(path: PathLike, flags: OpenMode, callback: (err: NodeJS.ErrnoException | null, fd: number) => void): void; + export namespace open { + /** + * Asynchronous open(2) - open and possibly create a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param mode A file mode. If a string is passed, it is parsed as an octal integer. If not supplied, defaults to `0o666`. + */ + function __promisify__(path: PathLike, flags: OpenMode, mode?: Mode | null): Promise; + } + /** + * Returns an integer representing the file descriptor. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link open}. + * @since v0.1.21 + * @param [flags='r'] + * @param [mode=0o666] + */ + export function openSync(path: PathLike, flags: OpenMode, mode?: Mode | null): number; + /** + * Change the file system timestamps of the object referenced by `path`. + * + * The `atime` and `mtime` arguments follow these rules: + * + * * Values can be either numbers representing Unix epoch time in seconds,`Date`s, or a numeric string like `'123456789.0'`. + * * If the value can not be converted to a number, or is `NaN`, `Infinity` or`-Infinity`, an `Error` will be thrown. + * @since v0.4.2 + */ + export function utimes(path: PathLike, atime: TimeLike, mtime: TimeLike, callback: NoParamCallback): void; + export namespace utimes { + /** + * Asynchronously change file timestamps of the file referenced by the supplied path. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param atime The last access time. If a string is provided, it will be coerced to number. + * @param mtime The last modified time. If a string is provided, it will be coerced to number. 
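+ *
+ * A small callback sketch for {@link utimes} (added for illustration; the file
+ * name is hypothetical and not part of the upstream typings):
+ *
+ * ```js
+ * import { utimes } from 'fs';
+ *
+ * const now = new Date();
+ * // Sets both the access time and the modification time to "now".
+ * utimes('./message.txt', now, now, (err) => {
+ * if (err) throw err;
+ * });
+ * ```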
+ */ + function __promisify__(path: PathLike, atime: TimeLike, mtime: TimeLike): Promise; + } + /** + * Returns `undefined`. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link utimes}. + * @since v0.4.2 + */ + export function utimesSync(path: PathLike, atime: TimeLike, mtime: TimeLike): void; + /** + * Change the file system timestamps of the object referenced by the supplied file + * descriptor. See {@link utimes}. + * @since v0.4.2 + */ + export function futimes(fd: number, atime: TimeLike, mtime: TimeLike, callback: NoParamCallback): void; + export namespace futimes { + /** + * Asynchronously change file timestamps of the file referenced by the supplied file descriptor. + * @param fd A file descriptor. + * @param atime The last access time. If a string is provided, it will be coerced to number. + * @param mtime The last modified time. If a string is provided, it will be coerced to number. + */ + function __promisify__(fd: number, atime: TimeLike, mtime: TimeLike): Promise; + } + /** + * Synchronous version of {@link futimes}. Returns `undefined`. + * @since v0.4.2 + */ + export function futimesSync(fd: number, atime: TimeLike, mtime: TimeLike): void; + /** + * Request that all data for the open file descriptor is flushed to the storage + * device. The specific implementation is operating system and device specific. + * Refer to the POSIX [`fsync(2)`](http://man7.org/linux/man-pages/man2/fsync.2.html) documentation for more detail. No arguments other + * than a possible exception are given to the completion callback. + * @since v0.1.96 + */ + export function fsync(fd: number, callback: NoParamCallback): void; + export namespace fsync { + /** + * Asynchronous fsync(2) - synchronize a file's in-core state with the underlying storage device. + * @param fd A file descriptor. + */ + function __promisify__(fd: number): Promise; + } + /** + * Request that all data for the open file descriptor is flushed to the storage + * device. The specific implementation is operating system and device specific. + * Refer to the POSIX [`fsync(2)`](http://man7.org/linux/man-pages/man2/fsync.2.html) documentation for more detail. Returns `undefined`. + * @since v0.1.96 + */ + export function fsyncSync(fd: number): void; + /** + * Write `buffer` to the file specified by `fd`. If `buffer` is a normal object, it + * must have an own `toString` function property. + * + * `offset` determines the part of the buffer to be written, and `length` is + * an integer specifying the number of bytes to write. + * + * `position` refers to the offset from the beginning of the file where this data + * should be written. If `typeof position !== 'number'`, the data will be written + * at the current position. See [`pwrite(2)`](http://man7.org/linux/man-pages/man2/pwrite.2.html). + * + * The callback will be given three arguments `(err, bytesWritten, buffer)` where`bytesWritten` specifies how many _bytes_ were written from `buffer`. + * + * If this method is invoked as its `util.promisify()` ed version, it returns + * a promise for an `Object` with `bytesWritten` and `buffer` properties. + * + * It is unsafe to use `fs.write()` multiple times on the same file without waiting + * for the callback. For this scenario, {@link createWriteStream} is + * recommended. + * + * On Linux, positional writes don't work when the file is opened in append mode. + * The kernel ignores the position argument and always appends the data to + * the end of the file. 
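+ *
+ * A minimal sketch of writing a buffer through a file descriptor (added for
+ * illustration; the file name is hypothetical and not part of the upstream typings):
+ *
+ * ```js
+ * import { open, write, close } from 'fs';
+ * import { Buffer } from 'buffer';
+ *
+ * open('./message.txt', 'w', (err, fd) => {
+ * if (err) throw err;
+ * const buf = Buffer.from('Hello Node.js');
+ * // Writes the whole buffer at the current file position.
+ * write(fd, buf, (writeErr, bytesWritten, buffer) => {
+ * if (writeErr) throw writeErr;
+ * console.log(bytesWritten); // 13
+ * close(fd, (closeErr) => {
+ * if (closeErr) throw closeErr;
+ * });
+ * });
+ * });
+ * ```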
+ * @since v0.0.2
+ */
+ export function write<TBuffer extends NodeJS.ArrayBufferView>(
+ fd: number,
+ buffer: TBuffer,
+ offset: number | undefined | null,
+ length: number | undefined | null,
+ position: number | undefined | null,
+ callback: (err: NodeJS.ErrnoException | null, written: number, buffer: TBuffer) => void
+ ): void;
+ /**
+ * Asynchronously writes `buffer` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param offset The part of the buffer to be written. If not supplied, defaults to `0`.
+ * @param length The number of bytes to write. If not supplied, defaults to `buffer.length - offset`.
+ */
+ export function write<TBuffer extends NodeJS.ArrayBufferView>(
+ fd: number,
+ buffer: TBuffer,
+ offset: number | undefined | null,
+ length: number | undefined | null,
+ callback: (err: NodeJS.ErrnoException | null, written: number, buffer: TBuffer) => void
+ ): void;
+ /**
+ * Asynchronously writes `buffer` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param offset The part of the buffer to be written. If not supplied, defaults to `0`.
+ */
+ export function write<TBuffer extends NodeJS.ArrayBufferView>(
+ fd: number,
+ buffer: TBuffer,
+ offset: number | undefined | null,
+ callback: (err: NodeJS.ErrnoException | null, written: number, buffer: TBuffer) => void
+ ): void;
+ /**
+ * Asynchronously writes `buffer` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ */
+ export function write<TBuffer extends NodeJS.ArrayBufferView>(fd: number, buffer: TBuffer, callback: (err: NodeJS.ErrnoException | null, written: number, buffer: TBuffer) => void): void;
+ /**
+ * Asynchronously writes `string` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param string A string to write.
+ * @param position The offset from the beginning of the file where this data should be written. If not supplied, defaults to the current position.
+ * @param encoding The expected string encoding.
+ */
+ export function write(
+ fd: number,
+ string: string,
+ position: number | undefined | null,
+ encoding: BufferEncoding | undefined | null,
+ callback: (err: NodeJS.ErrnoException | null, written: number, str: string) => void
+ ): void;
+ /**
+ * Asynchronously writes `string` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param string A string to write.
+ * @param position The offset from the beginning of the file where this data should be written. If not supplied, defaults to the current position.
+ */
+ export function write(fd: number, string: string, position: number | undefined | null, callback: (err: NodeJS.ErrnoException | null, written: number, str: string) => void): void;
+ /**
+ * Asynchronously writes `string` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param string A string to write.
+ */
+ export function write(fd: number, string: string, callback: (err: NodeJS.ErrnoException | null, written: number, str: string) => void): void;
+ export namespace write {
+ /**
+ * Asynchronously writes `buffer` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param offset The part of the buffer to be written. If not supplied, defaults to `0`.
+ * @param length The number of bytes to write. If not supplied, defaults to `buffer.length - offset`.
+ * @param position The offset from the beginning of the file where this data should be written. If not supplied, defaults to the current position.
+ */
+ function __promisify__<TBuffer extends NodeJS.ArrayBufferView>(
+ fd: number,
+ buffer?: TBuffer,
+ offset?: number,
+ length?: number,
+ position?: number | null
+ ): Promise<{
+ bytesWritten: number;
+ buffer: TBuffer;
+ }>;
+ /**
+ * Asynchronously writes `string` to the file referenced by the supplied file descriptor.
+ * @param fd A file descriptor.
+ * @param string A string to write.
+ * @param position The offset from the beginning of the file where this data should be written. If not supplied, defaults to the current position.
+ * @param encoding The expected string encoding.
+ */
+ function __promisify__(
+ fd: number,
+ string: string,
+ position?: number | null,
+ encoding?: BufferEncoding | null
+ ): Promise<{
+ bytesWritten: number;
+ buffer: string;
+ }>;
+ }
+ /**
+ * If `buffer` is a plain object, it must have an own (not inherited) `toString` function property.
+ *
+ * For detailed information, see the documentation of the asynchronous version of
+ * this API: {@link write}.
+ * @since v0.1.21
+ * @return The number of bytes written.
+ */
+ export function writeSync(fd: number, buffer: NodeJS.ArrayBufferView, offset?: number | null, length?: number | null, position?: number | null): number;
+ /**
+ * Synchronously writes `string` to the file referenced by the supplied file descriptor, returning the number of bytes written.
+ * @param fd A file descriptor.
+ * @param string A string to write.
+ * @param position The offset from the beginning of the file where this data should be written. If not supplied, defaults to the current position.
+ * @param encoding The expected string encoding.
+ */
+ export function writeSync(fd: number, string: string, position?: number | null, encoding?: BufferEncoding | null): number;
+ export type ReadPosition = number | bigint;
+ /**
+ * Read data from the file specified by `fd`.
+ *
+ * The callback is given the three arguments, `(err, bytesRead, buffer)`.
+ *
+ * If the file is not modified concurrently, the end-of-file is reached when the
+ * number of bytes read is zero.
+ *
+ * If this method is invoked as its `util.promisify()` ed version, it returns
+ * a promise for an `Object` with `bytesRead` and `buffer` properties.
+ * @since v0.0.2
+ * @param buffer The buffer that the data will be written to.
+ * @param offset The position in `buffer` to write the data to.
+ * @param length The number of bytes to read.
+ * @param position Specifies where to begin reading from in the file. If `position` is `null` or `-1`, data will be read from the current file position, and the file position will be updated. If
+ * `position` is an integer, the file position will be unchanged.
+ */
+ export function read<TBuffer extends NodeJS.ArrayBufferView>(
+ fd: number,
+ buffer: TBuffer,
+ offset: number,
+ length: number,
+ position: ReadPosition | null,
+ callback: (err: NodeJS.ErrnoException | null, bytesRead: number, buffer: TBuffer) => void
+ ): void;
+ export namespace read {
+ /**
+ * @param fd A file descriptor.
+ * @param buffer The buffer that the data will be written to.
+ * @param offset The offset in the buffer at which to start writing.
+ * @param length The number of bytes to read.
+ * @param position The offset from the beginning of the file from which data should be read. If `null`, data will be read from the current position.
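+ *
+ * A minimal sketch of reading into a pre-allocated buffer (added for illustration;
+ * the file name is hypothetical and not part of the upstream typings):
+ *
+ * ```js
+ * import { open, read, close } from 'fs';
+ * import { Buffer } from 'buffer';
+ *
+ * open('./message.txt', 'r', (err, fd) => {
+ * if (err) throw err;
+ * const buf = Buffer.alloc(16);
+ * // Reads up to 16 bytes from the start of the file into `buf`.
+ * read(fd, buf, 0, buf.length, 0, (readErr, bytesRead, buffer) => {
+ * if (readErr) throw readErr;
+ * console.log(buffer.subarray(0, bytesRead).toString('utf8'));
+ * close(fd, (closeErr) => {
+ * if (closeErr) throw closeErr;
+ * });
+ * });
+ * });
+ * ```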
+ */
+ function __promisify__<TBuffer extends NodeJS.ArrayBufferView>(
+ fd: number,
+ buffer: TBuffer,
+ offset: number,
+ length: number,
+ position: number | null
+ ): Promise<{
+ bytesRead: number;
+ buffer: TBuffer;
+ }>;
+ }
+ export interface ReadSyncOptions {
+ /**
+ * @default 0
+ */
+ offset?: number | undefined;
+ /**
+ * @default `length of buffer`
+ */
+ length?: number | undefined;
+ /**
+ * @default null
+ */
+ position?: ReadPosition | null | undefined;
+ }
+ /**
+ * Returns the number of `bytesRead`.
+ *
+ * For detailed information, see the documentation of the asynchronous version of
+ * this API: {@link read}.
+ * @since v0.1.21
+ */
+ export function readSync(fd: number, buffer: NodeJS.ArrayBufferView, offset: number, length: number, position: ReadPosition | null): number;
+ /**
+ * Similar to the above `fs.readSync` function, this version takes an optional `options` object.
+ * If no `options` object is specified, it will default with the above values.
+ */
+ export function readSync(fd: number, buffer: NodeJS.ArrayBufferView, opts?: ReadSyncOptions): number;
+ /**
+ * Asynchronously reads the entire contents of a file.
+ *
+ * ```js
+ * import { readFile } from 'fs';
+ *
+ * readFile('/etc/passwd', (err, data) => {
+ * if (err) throw err;
+ * console.log(data);
+ * });
+ * ```
+ *
+ * The callback is passed two arguments `(err, data)`, where `data` is the
+ * contents of the file.
+ *
+ * If no encoding is specified, then the raw buffer is returned.
+ *
+ * If `options` is a string, then it specifies the encoding:
+ *
+ * ```js
+ * import { readFile } from 'fs';
+ *
+ * readFile('/etc/passwd', 'utf8', callback);
+ * ```
+ *
+ * When the path is a directory, the behavior of `fs.readFile()` and {@link readFileSync} is platform-specific. On macOS, Linux, and Windows, an
+ * error will be returned. On FreeBSD, a representation of the directory's contents
+ * will be returned.
+ *
+ * ```js
+ * import { readFile } from 'fs';
+ *
+ * // macOS, Linux, and Windows
+ * readFile('<directory>', (err, data) => {
+ * // => [Error: EISDIR: illegal operation on a directory, read <directory>]
+ * });
+ *
+ * // FreeBSD
+ * readFile('<directory>', (err, data) => {
+ * // => null, <data>
+ * });
+ * ```
+ *
+ * It is possible to abort an ongoing request using an `AbortSignal`. If a
+ * request is aborted the callback is called with an `AbortError`:
+ *
+ * ```js
+ * import { readFile } from 'fs';
+ *
+ * const controller = new AbortController();
+ * const signal = controller.signal;
+ * readFile(fileInfo[0].name, { signal }, (err, buf) => {
+ * // ...
+ * });
+ * // When you want to abort the request
+ * controller.abort();
+ * ```
+ *
+ * The `fs.readFile()` function buffers the entire file. To minimize memory costs,
+ * when possible prefer streaming via `fs.createReadStream()`.
+ *
+ * Aborting an ongoing request does not abort individual operating
+ * system requests but rather the internal buffering `fs.readFile` performs.
+ * @since v0.1.29
+ * @param path filename or file descriptor
+ */
+ export function readFile(
+ path: PathOrFileDescriptor,
+ options:
+ | ({
+ encoding?: null | undefined;
+ flag?: string | undefined;
+ } & Abortable)
+ | undefined
+ | null,
+ callback: (err: NodeJS.ErrnoException | null, data: Buffer) => void
+ ): void;
+ /**
+ * Asynchronously reads the entire contents of a file.
+ * @param path A path to a file. If a URL is provided, it must use the `file:` protocol.
+ * If a file descriptor is provided, the underlying file will _not_ be closed automatically.
+ * @param options Either the encoding for the result, or an object that contains the encoding and an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + export function readFile( + path: PathOrFileDescriptor, + options: + | ({ + encoding: BufferEncoding; + flag?: string | undefined; + } & Abortable) + | BufferEncoding, + callback: (err: NodeJS.ErrnoException | null, data: string) => void + ): void; + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param options Either the encoding for the result, or an object that contains the encoding and an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + export function readFile( + path: PathOrFileDescriptor, + options: + | (ObjectEncodingOptions & { + flag?: string | undefined; + } & Abortable) + | BufferEncoding + | undefined + | null, + callback: (err: NodeJS.ErrnoException | null, data: string | Buffer) => void + ): void; + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + */ + export function readFile(path: PathOrFileDescriptor, callback: (err: NodeJS.ErrnoException | null, data: Buffer) => void): void; + export namespace readFile { + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param options An object that may contain an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + function __promisify__( + path: PathOrFileDescriptor, + options?: { + encoding?: null | undefined; + flag?: string | undefined; + } | null + ): Promise; + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param options Either the encoding for the result, or an object that contains the encoding and an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + function __promisify__( + path: PathOrFileDescriptor, + options: + | { + encoding: BufferEncoding; + flag?: string | undefined; + } + | BufferEncoding + ): Promise; + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param options Either the encoding for the result, or an object that contains the encoding and an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + function __promisify__( + path: PathOrFileDescriptor, + options?: + | (ObjectEncodingOptions & { + flag?: string | undefined; + }) + | BufferEncoding + | null + ): Promise; + } + /** + * Returns the contents of the `path`. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link readFile}. 
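+ *
+ * As a quick sketch of both call forms (the `./notes.txt` path is illustrative):
+ *
+ * ```js
+ * import { readFileSync } from 'fs';
+ *
+ * const raw = readFileSync('./notes.txt');          // Buffer
+ * const text = readFileSync('./notes.txt', 'utf8'); // string
+ * ```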
+ * + * If the `encoding` option is specified then this function returns a + * string. Otherwise it returns a buffer. + * + * Similar to {@link readFile}, when the path is a directory, the behavior of`fs.readFileSync()` is platform-specific. + * + * ```js + * import { readFileSync } from 'fs'; + * + * // macOS, Linux, and Windows + * readFileSync(''); + * // => [Error: EISDIR: illegal operation on a directory, read ] + * + * // FreeBSD + * readFileSync(''); // => + * ``` + * @since v0.1.8 + * @param path filename or file descriptor + */ + export function readFileSync( + path: PathOrFileDescriptor, + options?: { + encoding?: null | undefined; + flag?: string | undefined; + } | null + ): Buffer; + /** + * Synchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param options Either the encoding for the result, or an object that contains the encoding and an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + export function readFileSync( + path: PathOrFileDescriptor, + options: + | { + encoding: BufferEncoding; + flag?: string | undefined; + } + | BufferEncoding + ): string; + /** + * Synchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param options Either the encoding for the result, or an object that contains the encoding and an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + export function readFileSync( + path: PathOrFileDescriptor, + options?: + | (ObjectEncodingOptions & { + flag?: string | undefined; + }) + | BufferEncoding + | null + ): string | Buffer; + export type WriteFileOptions = + | (ObjectEncodingOptions & + Abortable & { + mode?: Mode | undefined; + flag?: string | undefined; + }) + | BufferEncoding + | null; + /** + * When `file` is a filename, asynchronously writes data to the file, replacing the + * file if it already exists. `data` can be a string or a buffer. + * + * When `file` is a file descriptor, the behavior is similar to calling`fs.write()` directly (which is recommended). See the notes below on using + * a file descriptor. + * + * The `encoding` option is ignored if `data` is a buffer. + * + * The `mode` option only affects the newly created file. See {@link open} for more details. + * + * If `data` is a plain object, it must have an own (not inherited) `toString`function property. + * + * ```js + * import { writeFile } from 'fs'; + * import { Buffer } from 'buffer'; + * + * const data = new Uint8Array(Buffer.from('Hello Node.js')); + * writeFile('message.txt', data, (err) => { + * if (err) throw err; + * console.log('The file has been saved!'); + * }); + * ``` + * + * If `options` is a string, then it specifies the encoding: + * + * ```js + * import { writeFile } from 'fs'; + * + * writeFile('message.txt', 'Hello Node.js', 'utf8', callback); + * ``` + * + * It is unsafe to use `fs.writeFile()` multiple times on the same file without + * waiting for the callback. For this scenario, {@link createWriteStream} is + * recommended. + * + * Similarly to `fs.readFile` \- `fs.writeFile` is a convenience method that + * performs multiple `write` calls internally to write the buffer passed to it. 
+ * For performance sensitive code consider using {@link createWriteStream}. + * + * It is possible to use an `AbortSignal` to cancel an `fs.writeFile()`. + * Cancelation is "best effort", and some amount of data is likely still + * to be written. + * + * ```js + * import { writeFile } from 'fs'; + * import { Buffer } from 'buffer'; + * + * const controller = new AbortController(); + * const { signal } = controller; + * const data = new Uint8Array(Buffer.from('Hello Node.js')); + * writeFile('message.txt', data, { signal }, (err) => { + * // When a request is aborted - the callback is called with an AbortError + * }); + * // When the request should be aborted + * controller.abort(); + * ``` + * + * Aborting an ongoing request does not abort individual operating + * system requests but rather the internal buffering `fs.writeFile` performs. + * @since v0.1.29 + * @param file filename or file descriptor + */ + export function writeFile(file: PathOrFileDescriptor, data: string | NodeJS.ArrayBufferView, options: WriteFileOptions, callback: NoParamCallback): void; + /** + * Asynchronously writes data to a file, replacing the file if it already exists. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param data The data to write. If something other than a Buffer or Uint8Array is provided, the value is coerced to a string. + */ + export function writeFile(path: PathOrFileDescriptor, data: string | NodeJS.ArrayBufferView, callback: NoParamCallback): void; + export namespace writeFile { + /** + * Asynchronously writes data to a file, replacing the file if it already exists. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param data The data to write. If something other than a Buffer or Uint8Array is provided, the value is coerced to a string. + * @param options Either the encoding for the file, or an object optionally specifying the encoding, file mode, and flag. + * If `encoding` is not supplied, the default of `'utf8'` is used. + * If `mode` is not supplied, the default of `0o666` is used. + * If `mode` is a string, it is parsed as an octal integer. + * If `flag` is not supplied, the default of `'w'` is used. + */ + function __promisify__(path: PathOrFileDescriptor, data: string | NodeJS.ArrayBufferView, options?: WriteFileOptions): Promise; + } + /** + * Returns `undefined`. + * + * If `data` is a plain object, it must have an own (not inherited) `toString`function property. + * + * The `mode` option only affects the newly created file. See {@link open} for more details. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link writeFile}. + * @since v0.1.29 + * @param file filename or file descriptor + */ + export function writeFileSync(file: PathOrFileDescriptor, data: string | NodeJS.ArrayBufferView, options?: WriteFileOptions): void; + /** + * Asynchronously append data to a file, creating the file if it does not yet + * exist. `data` can be a string or a `Buffer`. + * + * The `mode` option only affects the newly created file. See {@link open} for more details. 
+ * + * ```js + * import { appendFile } from 'fs'; + * + * appendFile('message.txt', 'data to append', (err) => { + * if (err) throw err; + * console.log('The "data to append" was appended to file!'); + * }); + * ``` + * + * If `options` is a string, then it specifies the encoding: + * + * ```js + * import { appendFile } from 'fs'; + * + * appendFile('message.txt', 'data to append', 'utf8', callback); + * ``` + * + * The `path` may be specified as a numeric file descriptor that has been opened + * for appending (using `fs.open()` or `fs.openSync()`). The file descriptor will + * not be closed automatically. + * + * ```js + * import { open, close, appendFile } from 'fs'; + * + * function closeFd(fd) { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * + * open('message.txt', 'a', (err, fd) => { + * if (err) throw err; + * + * try { + * appendFile(fd, 'data to append', 'utf8', (err) => { + * closeFd(fd); + * if (err) throw err; + * }); + * } catch (err) { + * closeFd(fd); + * throw err; + * } + * }); + * ``` + * @since v0.6.7 + * @param path filename or file descriptor + */ + export function appendFile(path: PathOrFileDescriptor, data: string | Uint8Array, options: WriteFileOptions, callback: NoParamCallback): void; + /** + * Asynchronously append data to a file, creating the file if it does not exist. + * @param file A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param data The data to write. If something other than a Buffer or Uint8Array is provided, the value is coerced to a string. + */ + export function appendFile(file: PathOrFileDescriptor, data: string | Uint8Array, callback: NoParamCallback): void; + export namespace appendFile { + /** + * Asynchronously append data to a file, creating the file if it does not exist. + * @param file A path to a file. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + * If a file descriptor is provided, the underlying file will _not_ be closed automatically. + * @param data The data to write. If something other than a Buffer or Uint8Array is provided, the value is coerced to a string. + * @param options Either the encoding for the file, or an object optionally specifying the encoding, file mode, and flag. + * If `encoding` is not supplied, the default of `'utf8'` is used. + * If `mode` is not supplied, the default of `0o666` is used. + * If `mode` is a string, it is parsed as an octal integer. + * If `flag` is not supplied, the default of `'a'` is used. + */ + function __promisify__(file: PathOrFileDescriptor, data: string | Uint8Array, options?: WriteFileOptions): Promise; + } + /** + * Synchronously append data to a file, creating the file if it does not yet + * exist. `data` can be a string or a `Buffer`. + * + * The `mode` option only affects the newly created file. See {@link open} for more details. 
+ * + * ```js + * import { appendFileSync } from 'fs'; + * + * try { + * appendFileSync('message.txt', 'data to append'); + * console.log('The "data to append" was appended to file!'); + * } catch (err) { + * // Handle the error + * } + * ``` + * + * If `options` is a string, then it specifies the encoding: + * + * ```js + * import { appendFileSync } from 'fs'; + * + * appendFileSync('message.txt', 'data to append', 'utf8'); + * ``` + * + * The `path` may be specified as a numeric file descriptor that has been opened + * for appending (using `fs.open()` or `fs.openSync()`). The file descriptor will + * not be closed automatically. + * + * ```js + * import { openSync, closeSync, appendFileSync } from 'fs'; + * + * let fd; + * + * try { + * fd = openSync('message.txt', 'a'); + * appendFileSync(fd, 'data to append', 'utf8'); + * } catch (err) { + * // Handle the error + * } finally { + * if (fd !== undefined) + * closeSync(fd); + * } + * ``` + * @since v0.6.7 + * @param path filename or file descriptor + */ + export function appendFileSync(path: PathOrFileDescriptor, data: string | Uint8Array, options?: WriteFileOptions): void; + /** + * Watch for changes on `filename`. The callback `listener` will be called each + * time the file is accessed. + * + * The `options` argument may be omitted. If provided, it should be an object. The`options` object may contain a boolean named `persistent` that indicates + * whether the process should continue to run as long as files are being watched. + * The `options` object may specify an `interval` property indicating how often the + * target should be polled in milliseconds. + * + * The `listener` gets two arguments the current stat object and the previous + * stat object: + * + * ```js + * import { watchFile } from 'fs'; + * + * watchFile('message.text', (curr, prev) => { + * console.log(`the current mtime is: ${curr.mtime}`); + * console.log(`the previous mtime was: ${prev.mtime}`); + * }); + * ``` + * + * These stat objects are instances of `fs.Stat`. If the `bigint` option is `true`, + * the numeric values in these objects are specified as `BigInt`s. + * + * To be notified when the file was modified, not just accessed, it is necessary + * to compare `curr.mtime` and `prev.mtime`. + * + * When an `fs.watchFile` operation results in an `ENOENT` error, it + * will invoke the listener once, with all the fields zeroed (or, for dates, the + * Unix Epoch). If the file is created later on, the listener will be called + * again, with the latest stat objects. This is a change in functionality since + * v0.10. + * + * Using {@link watch} is more efficient than `fs.watchFile` and`fs.unwatchFile`. `fs.watch` should be used instead of `fs.watchFile` and`fs.unwatchFile` when possible. + * + * When a file being watched by `fs.watchFile()` disappears and reappears, + * then the contents of `previous` in the second callback event (the file's + * reappearance) will be the same as the contents of `previous` in the first + * callback event (its disappearance). + * + * This happens when: + * + * * the file is deleted, followed by a restore + * * the file is renamed and then renamed a second time back to its original name + * @since v0.1.31 + */ + export function watchFile( + filename: PathLike, + options: + | { + persistent?: boolean | undefined; + interval?: number | undefined; + } + | undefined, + listener: (curr: Stats, prev: Stats) => void + ): void; + /** + * Watch for changes on `filename`. The callback `listener` will be called each time the file is accessed. 
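+ *
+ * For instance, a minimal sketch (the `app.log` path is illustrative) that compares `mtime` values to react only to modifications, as suggested above:
+ *
+ * ```js
+ * import { watchFile } from 'fs';
+ *
+ * watchFile('app.log', (curr, prev) => {
+ *   if (curr.mtimeMs !== prev.mtimeMs) {
+ *     console.log('app.log was modified, not merely accessed');
+ *   }
+ * });
+ * ```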
+ * @param filename A path to a file or directory. If a URL is provided, it must use the `file:` protocol.
+ */
+ export function watchFile(filename: PathLike, listener: (curr: Stats, prev: Stats) => void): void;
+ /**
+ * Stop watching for changes on `filename`. If `listener` is specified, only that
+ * particular listener is removed. Otherwise, _all_ listeners are removed,
+ * effectively stopping watching of `filename`.
+ *
+ * Calling `fs.unwatchFile()` with a filename that is not being watched is a
+ * no-op, not an error.
+ *
+ * Using {@link watch} is more efficient than `fs.watchFile()` and `fs.unwatchFile()`. `fs.watch()` should be used instead of `fs.watchFile()` and `fs.unwatchFile()` when possible.
+ * @since v0.1.31
+ * @param listener Optional, a listener previously attached using `fs.watchFile()`
+ */
+ export function unwatchFile(filename: PathLike, listener?: (curr: Stats, prev: Stats) => void): void;
+ export interface WatchOptions extends Abortable {
+ encoding?: BufferEncoding | 'buffer' | undefined;
+ persistent?: boolean | undefined;
+ recursive?: boolean | undefined;
+ }
+ export type WatchEventType = 'rename' | 'change';
+ export type WatchListener<T> = (event: WatchEventType, filename: T) => void;
+ /**
+ * Watch for changes on `filename`, where `filename` is either a file or a
+ * directory.
+ *
+ * The second argument is optional. If `options` is provided as a string, it
+ * specifies the `encoding`. Otherwise `options` should be passed as an object.
+ *
+ * The listener callback gets two arguments `(eventType, filename)`. `eventType` is either `'rename'` or `'change'`, and `filename` is the name of the file
+ * which triggered the event.
+ *
+ * On most platforms, `'rename'` is emitted whenever a filename appears or
+ * disappears in the directory.
+ *
+ * The listener callback is attached to the `'change'` event fired by `fs.FSWatcher`, but it is not the same thing as the `'change'` value of `eventType`.
+ *
+ * If a `signal` is passed, aborting the corresponding AbortController will close
+ * the returned `fs.FSWatcher`.
+ * @since v0.5.10
+ * @param listener
+ */
+ export function watch(
+ filename: PathLike,
+ options:
+ | (WatchOptions & {
+ encoding: 'buffer';
+ })
+ | 'buffer',
+ listener?: WatchListener<Buffer>
+ ): FSWatcher;
+ /**
+ * Watch for changes on `filename`, where `filename` is either a file or a directory, returning an `FSWatcher`.
+ * @param filename A path to a file or directory. If a URL is provided, it must use the `file:` protocol.
+ * @param options Either the encoding for the filename provided to the listener, or an object optionally specifying encoding, persistent, and recursive options.
+ * If `encoding` is not supplied, the default of `'utf8'` is used.
+ * If `persistent` is not supplied, the default of `true` is used.
+ * If `recursive` is not supplied, the default of `false` is used.
+ */
+ export function watch(filename: PathLike, options?: WatchOptions | BufferEncoding | null, listener?: WatchListener<string>): FSWatcher;
+ /**
+ * Watch for changes on `filename`, where `filename` is either a file or a directory, returning an `FSWatcher`.
+ * @param filename A path to a file or directory. If a URL is provided, it must use the `file:` protocol.
+ * @param options Either the encoding for the filename provided to the listener, or an object optionally specifying encoding, persistent, and recursive options.
+ * If `encoding` is not supplied, the default of `'utf8'` is used.
+ * If `persistent` is not supplied, the default of `true` is used.
+ * If `recursive` is not supplied, the default of `false` is used. + */ + export function watch(filename: PathLike, options: WatchOptions | string, listener?: WatchListener): FSWatcher; + /** + * Watch for changes on `filename`, where `filename` is either a file or a directory, returning an `FSWatcher`. + * @param filename A path to a file or directory. If a URL is provided, it must use the `file:` protocol. + */ + export function watch(filename: PathLike, listener?: WatchListener): FSWatcher; + /** + * Test whether or not the given path exists by checking with the file system. + * Then call the `callback` argument with either true or false: + * + * ```js + * import { exists } from 'fs'; + * + * exists('/etc/passwd', (e) => { + * console.log(e ? 'it exists' : 'no passwd!'); + * }); + * ``` + * + * **The parameters for this callback are not consistent with other Node.js** + * **callbacks.** Normally, the first parameter to a Node.js callback is an `err`parameter, optionally followed by other parameters. The `fs.exists()` callback + * has only one boolean parameter. This is one reason `fs.access()` is recommended + * instead of `fs.exists()`. + * + * Using `fs.exists()` to check for the existence of a file before calling`fs.open()`, `fs.readFile()` or `fs.writeFile()` is not recommended. Doing + * so introduces a race condition, since other processes may change the file's + * state between the two calls. Instead, user code should open/read/write the + * file directly and handle the error raised if the file does not exist. + * + * **write (NOT RECOMMENDED)** + * + * ```js + * import { exists, open, close } from 'fs'; + * + * exists('myfile', (e) => { + * if (e) { + * console.error('myfile already exists'); + * } else { + * open('myfile', 'wx', (err, fd) => { + * if (err) throw err; + * + * try { + * writeMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * } + * }); + * ``` + * + * **write (RECOMMENDED)** + * + * ```js + * import { open, close } from 'fs'; + * open('myfile', 'wx', (err, fd) => { + * if (err) { + * if (err.code === 'EEXIST') { + * console.error('myfile already exists'); + * return; + * } + * + * throw err; + * } + * + * try { + * writeMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * ``` + * + * **read (NOT RECOMMENDED)** + * + * ```js + * import { open, close, exists } from 'fs'; + * + * exists('myfile', (e) => { + * if (e) { + * open('myfile', 'r', (err, fd) => { + * if (err) throw err; + * + * try { + * readMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * } else { + * console.error('myfile does not exist'); + * } + * }); + * ``` + * + * **read (RECOMMENDED)** + * + * ```js + * import { open, close } from 'fs'; + * + * open('myfile', 'r', (err, fd) => { + * if (err) { + * if (err.code === 'ENOENT') { + * console.error('myfile does not exist'); + * return; + * } + * + * throw err; + * } + * + * try { + * readMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * ``` + * + * The "not recommended" examples above check for existence and then use the + * file; the "recommended" examples are better because they use the file directly + * and handle the error, if any. + * + * In general, check for the existence of a file only if the file won’t be + * used directly, for example when its existence is a signal from another + * process. 
+ * @since v0.0.2 + * @deprecated Since v1.0.0 - Use {@link stat} or {@link access} instead. + */ + export function exists(path: PathLike, callback: (exists: boolean) => void): void; + export namespace exists { + /** + * @param path A path to a file or directory. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + */ + function __promisify__(path: PathLike): Promise; + } + /** + * Returns `true` if the path exists, `false` otherwise. + * + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link exists}. + * + * `fs.exists()` is deprecated, but `fs.existsSync()` is not. The `callback`parameter to `fs.exists()` accepts parameters that are inconsistent with other + * Node.js callbacks. `fs.existsSync()` does not use a callback. + * + * ```js + * import { existsSync } from 'fs'; + * + * if (existsSync('/etc/passwd')) + * console.log('The path exists.'); + * ``` + * @since v0.1.21 + */ + export function existsSync(path: PathLike): boolean; + export namespace constants { + // File Access Constants + /** Constant for fs.access(). File is visible to the calling process. */ + const F_OK: number; + /** Constant for fs.access(). File can be read by the calling process. */ + const R_OK: number; + /** Constant for fs.access(). File can be written by the calling process. */ + const W_OK: number; + /** Constant for fs.access(). File can be executed by the calling process. */ + const X_OK: number; + // File Copy Constants + /** Constant for fs.copyFile. Flag indicating the destination file should not be overwritten if it already exists. */ + const COPYFILE_EXCL: number; + /** + * Constant for fs.copyFile. copy operation will attempt to create a copy-on-write reflink. + * If the underlying platform does not support copy-on-write, then a fallback copy mechanism is used. + */ + const COPYFILE_FICLONE: number; + /** + * Constant for fs.copyFile. Copy operation will attempt to create a copy-on-write reflink. + * If the underlying platform does not support copy-on-write, then the operation will fail with an error. + */ + const COPYFILE_FICLONE_FORCE: number; + // File Open Constants + /** Constant for fs.open(). Flag indicating to open a file for read-only access. */ + const O_RDONLY: number; + /** Constant for fs.open(). Flag indicating to open a file for write-only access. */ + const O_WRONLY: number; + /** Constant for fs.open(). Flag indicating to open a file for read-write access. */ + const O_RDWR: number; + /** Constant for fs.open(). Flag indicating to create the file if it does not already exist. */ + const O_CREAT: number; + /** Constant for fs.open(). Flag indicating that opening a file should fail if the O_CREAT flag is set and the file already exists. */ + const O_EXCL: number; + /** + * Constant for fs.open(). Flag indicating that if path identifies a terminal device, + * opening the path shall not cause that terminal to become the controlling terminal for the process + * (if the process does not already have one). + */ + const O_NOCTTY: number; + /** Constant for fs.open(). Flag indicating that if the file exists and is a regular file, and the file is opened successfully for write access, its length shall be truncated to zero. */ + const O_TRUNC: number; + /** Constant for fs.open(). Flag indicating that data will be appended to the end of the file. */ + const O_APPEND: number; + /** Constant for fs.open(). Flag indicating that the open should fail if the path is not a directory. 
*/ + const O_DIRECTORY: number; + /** + * constant for fs.open(). + * Flag indicating reading accesses to the file system will no longer result in + * an update to the atime information associated with the file. + * This flag is available on Linux operating systems only. + */ + const O_NOATIME: number; + /** Constant for fs.open(). Flag indicating that the open should fail if the path is a symbolic link. */ + const O_NOFOLLOW: number; + /** Constant for fs.open(). Flag indicating that the file is opened for synchronous I/O. */ + const O_SYNC: number; + /** Constant for fs.open(). Flag indicating that the file is opened for synchronous I/O with write operations waiting for data integrity. */ + const O_DSYNC: number; + /** Constant for fs.open(). Flag indicating to open the symbolic link itself rather than the resource it is pointing to. */ + const O_SYMLINK: number; + /** Constant for fs.open(). When set, an attempt will be made to minimize caching effects of file I/O. */ + const O_DIRECT: number; + /** Constant for fs.open(). Flag indicating to open the file in nonblocking mode when possible. */ + const O_NONBLOCK: number; + // File Type Constants + /** Constant for fs.Stats mode property for determining a file's type. Bit mask used to extract the file type code. */ + const S_IFMT: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a regular file. */ + const S_IFREG: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a directory. */ + const S_IFDIR: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a character-oriented device file. */ + const S_IFCHR: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a block-oriented device file. */ + const S_IFBLK: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a FIFO/pipe. */ + const S_IFIFO: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a symbolic link. */ + const S_IFLNK: number; + /** Constant for fs.Stats mode property for determining a file's type. File type constant for a socket. */ + const S_IFSOCK: number; + // File Mode Constants + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating readable, writable and executable by owner. */ + const S_IRWXU: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating readable by owner. */ + const S_IRUSR: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating writable by owner. */ + const S_IWUSR: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating executable by owner. */ + const S_IXUSR: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating readable, writable and executable by group. */ + const S_IRWXG: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating readable by group. */ + const S_IRGRP: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating writable by group. 
*/ + const S_IWGRP: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating executable by group. */ + const S_IXGRP: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating readable, writable and executable by others. */ + const S_IRWXO: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating readable by others. */ + const S_IROTH: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating writable by others. */ + const S_IWOTH: number; + /** Constant for fs.Stats mode property for determining access permissions for a file. File mode indicating executable by others. */ + const S_IXOTH: number; + /** + * When set, a memory file mapping is used to access the file. This flag + * is available on Windows operating systems only. On other operating systems, + * this flag is ignored. + */ + const UV_FS_O_FILEMAP: number; + } + /** + * Tests a user's permissions for the file or directory specified by `path`. + * The `mode` argument is an optional integer that specifies the accessibility + * checks to be performed. Check `File access constants` for possible values + * of `mode`. It is possible to create a mask consisting of the bitwise OR of + * two or more values (e.g. `fs.constants.W_OK | fs.constants.R_OK`). + * + * The final argument, `callback`, is a callback function that is invoked with + * a possible error argument. If any of the accessibility checks fail, the error + * argument will be an `Error` object. The following examples check if`package.json` exists, and if it is readable or writable. + * + * ```js + * import { access, constants } from 'fs'; + * + * const file = 'package.json'; + * + * // Check if the file exists in the current directory. + * access(file, constants.F_OK, (err) => { + * console.log(`${file} ${err ? 'does not exist' : 'exists'}`); + * }); + * + * // Check if the file is readable. + * access(file, constants.R_OK, (err) => { + * console.log(`${file} ${err ? 'is not readable' : 'is readable'}`); + * }); + * + * // Check if the file is writable. + * access(file, constants.W_OK, (err) => { + * console.log(`${file} ${err ? 'is not writable' : 'is writable'}`); + * }); + * + * // Check if the file exists in the current directory, and if it is writable. + * access(file, constants.F_OK | constants.W_OK, (err) => { + * if (err) { + * console.error( + * `${file} ${err.code === 'ENOENT' ? 'does not exist' : 'is read-only'}`); + * } else { + * console.log(`${file} exists, and it is writable`); + * } + * }); + * ``` + * + * Do not use `fs.access()` to check for the accessibility of a file before calling`fs.open()`, `fs.readFile()` or `fs.writeFile()`. Doing + * so introduces a race condition, since other processes may change the file's + * state between the two calls. Instead, user code should open/read/write the + * file directly and handle the error raised if the file is not accessible. 
+ * + * **write (NOT RECOMMENDED)** + * + * ```js + * import { access, open, close } from 'fs'; + * + * access('myfile', (err) => { + * if (!err) { + * console.error('myfile already exists'); + * return; + * } + * + * open('myfile', 'wx', (err, fd) => { + * if (err) throw err; + * + * try { + * writeMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * }); + * ``` + * + * **write (RECOMMENDED)** + * + * ```js + * import { open, close } from 'fs'; + * + * open('myfile', 'wx', (err, fd) => { + * if (err) { + * if (err.code === 'EEXIST') { + * console.error('myfile already exists'); + * return; + * } + * + * throw err; + * } + * + * try { + * writeMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * ``` + * + * **read (NOT RECOMMENDED)** + * + * ```js + * import { access, open, close } from 'fs'; + * access('myfile', (err) => { + * if (err) { + * if (err.code === 'ENOENT') { + * console.error('myfile does not exist'); + * return; + * } + * + * throw err; + * } + * + * open('myfile', 'r', (err, fd) => { + * if (err) throw err; + * + * try { + * readMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * }); + * ``` + * + * **read (RECOMMENDED)** + * + * ```js + * import { open, close } from 'fs'; + * + * open('myfile', 'r', (err, fd) => { + * if (err) { + * if (err.code === 'ENOENT') { + * console.error('myfile does not exist'); + * return; + * } + * + * throw err; + * } + * + * try { + * readMyData(fd); + * } finally { + * close(fd, (err) => { + * if (err) throw err; + * }); + * } + * }); + * ``` + * + * The "not recommended" examples above check for accessibility and then use the + * file; the "recommended" examples are better because they use the file directly + * and handle the error, if any. + * + * In general, check for the accessibility of a file only if the file will not be + * used directly, for example when its accessibility is a signal from another + * process. + * + * On Windows, access-control policies (ACLs) on a directory may limit access to + * a file or directory. The `fs.access()` function, however, does not check the + * ACL and therefore may report that a path is accessible even if the ACL restricts + * the user from reading or writing to it. + * @since v0.11.15 + * @param [mode=fs.constants.F_OK] + */ + export function access(path: PathLike, mode: number | undefined, callback: NoParamCallback): void; + /** + * Asynchronously tests a user's permissions for the file specified by path. + * @param path A path to a file or directory. If a URL is provided, it must use the `file:` protocol. + */ + export function access(path: PathLike, callback: NoParamCallback): void; + export namespace access { + /** + * Asynchronously tests a user's permissions for the file specified by path. + * @param path A path to a file or directory. If a URL is provided, it must use the `file:` protocol. + * URL support is _experimental_. + */ + function __promisify__(path: PathLike, mode?: number): Promise; + } + /** + * Synchronously tests a user's permissions for the file or directory specified + * by `path`. The `mode` argument is an optional integer that specifies the + * accessibility checks to be performed. Check `File access constants` for + * possible values of `mode`. It is possible to create a mask consisting of + * the bitwise OR of two or more values + * (e.g. `fs.constants.W_OK | fs.constants.R_OK`). 
+ * + * If any of the accessibility checks fail, an `Error` will be thrown. Otherwise, + * the method will return `undefined`. + * + * ```js + * import { accessSync, constants } from 'fs'; + * + * try { + * accessSync('etc/passwd', constants.R_OK | constants.W_OK); + * console.log('can read/write'); + * } catch (err) { + * console.error('no access!'); + * } + * ``` + * @since v0.11.15 + * @param [mode=fs.constants.F_OK] + */ + export function accessSync(path: PathLike, mode?: number): void; + interface StreamOptions { + flags?: string | undefined; + encoding?: BufferEncoding | undefined; + fd?: number | promises.FileHandle | undefined; + mode?: number | undefined; + autoClose?: boolean | undefined; + /** + * @default false + */ + emitClose?: boolean | undefined; + start?: number | undefined; + highWaterMark?: number | undefined; + } + interface ReadStreamOptions extends StreamOptions { + end?: number | undefined; + } + /** + * Unlike the 16 kb default `highWaterMark` for a readable stream, the stream + * returned by this method has a default `highWaterMark` of 64 kb. + * + * `options` can include `start` and `end` values to read a range of bytes from + * the file instead of the entire file. Both `start` and `end` are inclusive and + * start counting at 0, allowed values are in the + * \[0, [`Number.MAX_SAFE_INTEGER`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/MAX_SAFE_INTEGER)\] range. If `fd` is specified and `start` is + * omitted or `undefined`, `fs.createReadStream()` reads sequentially from the + * current file position. The `encoding` can be any one of those accepted by `Buffer`. + * + * If `fd` is specified, `ReadStream` will ignore the `path` argument and will use + * the specified file descriptor. This means that no `'open'` event will be + * emitted. `fd` should be blocking; non-blocking `fd`s should be passed to `net.Socket`. + * + * If `fd` points to a character device that only supports blocking reads + * (such as keyboard or sound card), read operations do not finish until data is + * available. This can prevent the process from exiting and the stream from + * closing naturally. + * + * By default, the stream will emit a `'close'` event after it has been + * destroyed, like most `Readable` streams. Set the `emitClose` option to`false` to change this behavior. + * + * By providing the `fs` option, it is possible to override the corresponding `fs`implementations for `open`, `read`, and `close`. When providing the `fs` option, + * an override for `read` is required. If no `fd` is provided, an override for`open` is also required. If `autoClose` is `true`, an override for `close` is + * also required. + * + * ```js + * import { createReadStream } from 'fs'; + * + * // Create a stream from some character device. + * const stream = createReadStream('/dev/input/event0'); + * setTimeout(() => { + * stream.close(); // This may not close the stream. + * // Artificially marking end-of-stream, as if the underlying resource had + * // indicated end-of-file by itself, allows the stream to close. + * // This does not cancel pending read operations, and if there is such an + * // operation, the process may still not be able to exit successfully + * // until it finishes. + * stream.push(null); + * stream.read(0); + * }, 100); + * ``` + * + * If `autoClose` is false, then the file descriptor won't be closed, even if + * there's an error. It is the application's responsibility to close it and make + * sure there's no file descriptor leak. 
If `autoClose` is set to true (default + * behavior), on `'error'` or `'end'` the file descriptor will be closed + * automatically. + * + * `mode` sets the file mode (permission and sticky bits), but only if the + * file was created. + * + * An example to read the last 10 bytes of a file which is 100 bytes long: + * + * ```js + * import { createReadStream } from 'fs'; + * + * createReadStream('sample.txt', { start: 90, end: 99 }); + * ``` + * + * If `options` is a string, then it specifies the encoding. + * @since v0.1.31 + * @return See `Readable Stream`. + */ + export function createReadStream(path: PathLike, options?: BufferEncoding | ReadStreamOptions): ReadStream; + /** + * `options` may also include a `start` option to allow writing data at some + * position past the beginning of the file, allowed values are in the + * \[0, [`Number.MAX_SAFE_INTEGER`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Number/MAX_SAFE_INTEGER)\] range. Modifying a file rather than replacing + * it may require the `flags` option to be set to `r+` rather than the default `w`. + * The `encoding` can be any one of those accepted by `Buffer`. + * + * If `autoClose` is set to true (default behavior) on `'error'` or `'finish'`the file descriptor will be closed automatically. If `autoClose` is false, + * then the file descriptor won't be closed, even if there's an error. + * It is the application's responsibility to close it and make sure there's no + * file descriptor leak. + * + * By default, the stream will emit a `'close'` event after it has been + * destroyed, like most `Writable` streams. Set the `emitClose` option to`false` to change this behavior. + * + * By providing the `fs` option it is possible to override the corresponding `fs`implementations for `open`, `write`, `writev` and `close`. Overriding `write()`without `writev()` can reduce + * performance as some optimizations (`_writev()`) + * will be disabled. When providing the `fs` option, overrides for at least one of`write` and `writev` are required. If no `fd` option is supplied, an override + * for `open` is also required. If `autoClose` is `true`, an override for `close`is also required. + * + * Like `fs.ReadStream`, if `fd` is specified, `fs.WriteStream` will ignore the`path` argument and will use the specified file descriptor. This means that no`'open'` event will be + * emitted. `fd` should be blocking; non-blocking `fd`s + * should be passed to `net.Socket`. + * + * If `options` is a string, then it specifies the encoding. + * @since v0.1.31 + * @return See `Writable Stream`. + */ + export function createWriteStream(path: PathLike, options?: BufferEncoding | StreamOptions): WriteStream; + /** + * Forces all currently queued I/O operations associated with the file to the + * operating system's synchronized I/O completion state. Refer to the POSIX [`fdatasync(2)`](http://man7.org/linux/man-pages/man2/fdatasync.2.html) documentation for details. No arguments other + * than a possible + * exception are given to the completion callback. + * @since v0.1.96 + */ + export function fdatasync(fd: number, callback: NoParamCallback): void; + export namespace fdatasync { + /** + * Asynchronous fdatasync(2) - synchronize a file's in-core state with storage device. + * @param fd A file descriptor. + */ + function __promisify__(fd: number): Promise; + } + /** + * Forces all currently queued I/O operations associated with the file to the + * operating system's synchronized I/O completion state. 
Refer to the POSIX [`fdatasync(2)`](http://man7.org/linux/man-pages/man2/fdatasync.2.html) documentation for details. Returns `undefined`. + * @since v0.1.96 + */ + export function fdatasyncSync(fd: number): void; + /** + * Asynchronously copies `src` to `dest`. By default, `dest` is overwritten if it + * already exists. No arguments other than a possible exception are given to the + * callback function. Node.js makes no guarantees about the atomicity of the copy + * operation. If an error occurs after the destination file has been opened for + * writing, Node.js will attempt to remove the destination. + * + * `mode` is an optional integer that specifies the behavior + * of the copy operation. It is possible to create a mask consisting of the bitwise + * OR of two or more values (e.g.`fs.constants.COPYFILE_EXCL | fs.constants.COPYFILE_FICLONE`). + * + * * `fs.constants.COPYFILE_EXCL`: The copy operation will fail if `dest` already + * exists. + * * `fs.constants.COPYFILE_FICLONE`: The copy operation will attempt to create a + * copy-on-write reflink. If the platform does not support copy-on-write, then a + * fallback copy mechanism is used. + * * `fs.constants.COPYFILE_FICLONE_FORCE`: The copy operation will attempt to + * create a copy-on-write reflink. If the platform does not support + * copy-on-write, then the operation will fail. + * + * ```js + * import { copyFile, constants } from 'fs'; + * + * function callback(err) { + * if (err) throw err; + * console.log('source.txt was copied to destination.txt'); + * } + * + * // destination.txt will be created or overwritten by default. + * copyFile('source.txt', 'destination.txt', callback); + * + * // By using COPYFILE_EXCL, the operation will fail if destination.txt exists. + * copyFile('source.txt', 'destination.txt', constants.COPYFILE_EXCL, callback); + * ``` + * @since v8.5.0 + * @param src source filename to copy + * @param dest destination filename of the copy operation + * @param [mode=0] modifiers for copy operation. + */ + export function copyFile(src: PathLike, dest: PathLike, callback: NoParamCallback): void; + export function copyFile(src: PathLike, dest: PathLike, mode: number, callback: NoParamCallback): void; + export namespace copyFile { + function __promisify__(src: PathLike, dst: PathLike, mode?: number): Promise; + } + /** + * Synchronously copies `src` to `dest`. By default, `dest` is overwritten if it + * already exists. Returns `undefined`. Node.js makes no guarantees about the + * atomicity of the copy operation. If an error occurs after the destination file + * has been opened for writing, Node.js will attempt to remove the destination. + * + * `mode` is an optional integer that specifies the behavior + * of the copy operation. It is possible to create a mask consisting of the bitwise + * OR of two or more values (e.g.`fs.constants.COPYFILE_EXCL | fs.constants.COPYFILE_FICLONE`). + * + * * `fs.constants.COPYFILE_EXCL`: The copy operation will fail if `dest` already + * exists. + * * `fs.constants.COPYFILE_FICLONE`: The copy operation will attempt to create a + * copy-on-write reflink. If the platform does not support copy-on-write, then a + * fallback copy mechanism is used. + * * `fs.constants.COPYFILE_FICLONE_FORCE`: The copy operation will attempt to + * create a copy-on-write reflink. If the platform does not support + * copy-on-write, then the operation will fail. + * + * ```js + * import { copyFileSync, constants } from 'fs'; + * + * // destination.txt will be created or overwritten by default. 
+ * copyFileSync('source.txt', 'destination.txt');
+ * console.log('source.txt was copied to destination.txt');
+ *
+ * // By using COPYFILE_EXCL, the operation will fail if destination.txt exists.
+ * copyFileSync('source.txt', 'destination.txt', constants.COPYFILE_EXCL);
+ * ```
+ * @since v8.5.0
+ * @param src source filename to copy
+ * @param dest destination filename of the copy operation
+ * @param [mode=0] modifiers for copy operation.
+ */
+ export function copyFileSync(src: PathLike, dest: PathLike, mode?: number): void;
+ /**
+ * Write an array of `ArrayBufferView`s to the file specified by `fd` using `writev()`.
+ *
+ * `position` is the offset from the beginning of the file where this data
+ * should be written. If `typeof position !== 'number'`, the data will be written
+ * at the current position.
+ *
+ * The callback will be given three arguments: `err`, `bytesWritten`, and `buffers`. `bytesWritten` is how many bytes were written from `buffers`.
+ *
+ * If this method is `util.promisify()` ed, it returns a promise for an `Object` with `bytesWritten` and `buffers` properties.
+ *
+ * It is unsafe to use `fs.writev()` multiple times on the same file without
+ * waiting for the callback. For this scenario, use {@link createWriteStream}.
+ *
+ * On Linux, positional writes don't work when the file is opened in append mode.
+ * The kernel ignores the position argument and always appends the data to
+ * the end of the file.
+ * @since v12.9.0
+ */
+ export function writev(fd: number, buffers: ReadonlyArray<NodeJS.ArrayBufferView>, cb: (err: NodeJS.ErrnoException | null, bytesWritten: number, buffers: NodeJS.ArrayBufferView[]) => void): void;
+ export function writev(
+ fd: number,
+ buffers: ReadonlyArray<NodeJS.ArrayBufferView>,
+ position: number,
+ cb: (err: NodeJS.ErrnoException | null, bytesWritten: number, buffers: NodeJS.ArrayBufferView[]) => void
+ ): void;
+ export interface WriteVResult {
+ bytesWritten: number;
+ buffers: NodeJS.ArrayBufferView[];
+ }
+ export namespace writev {
+ function __promisify__(fd: number, buffers: ReadonlyArray<NodeJS.ArrayBufferView>, position?: number): Promise<WriteVResult>;
+ }
+ /**
+ * For detailed information, see the documentation of the asynchronous version of
+ * this API: {@link writev}.
+ * @since v12.9.0
+ * @return The number of bytes written.
+ */
+ export function writevSync(fd: number, buffers: ReadonlyArray<NodeJS.ArrayBufferView>, position?: number): number;
+ /**
+ * Read from a file specified by `fd` and write to an array of `ArrayBufferView`s
+ * using `readv()`.
+ *
+ * `position` is the offset from the beginning of the file from where data
+ * should be read. If `typeof position !== 'number'`, the data will be read
+ * from the current position.
+ *
+ * The callback will be given three arguments: `err`, `bytesRead`, and `buffers`. `bytesRead` is how many bytes were read from the file.
+ *
+ * If this method is invoked as its `util.promisify()` ed version, it returns
+ * a promise for an `Object` with `bytesRead` and `buffers` properties.
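+ *
+ * A minimal sketch reading a fixed-size header and body into two separate buffers (the `data.bin` path and buffer sizes are illustrative):
+ *
+ * ```js
+ * import { open, readv, close } from 'fs';
+ * import { Buffer } from 'buffer';
+ *
+ * const header = Buffer.alloc(8);
+ * const body = Buffer.alloc(64);
+ *
+ * open('data.bin', 'r', (err, fd) => {
+ *   if (err) throw err;
+ *   readv(fd, [header, body], 0, (err, bytesRead, buffers) => {
+ *     close(fd, () => {});
+ *     if (err) throw err;
+ *     console.log(`read ${bytesRead} bytes into ${buffers.length} buffers`);
+ *   });
+ * });
+ * ```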
+ * @since v13.13.0, v12.17.0 + */ + export function readv(fd: number, buffers: ReadonlyArray, cb: (err: NodeJS.ErrnoException | null, bytesRead: number, buffers: NodeJS.ArrayBufferView[]) => void): void; + export function readv( + fd: number, + buffers: ReadonlyArray, + position: number, + cb: (err: NodeJS.ErrnoException | null, bytesRead: number, buffers: NodeJS.ArrayBufferView[]) => void + ): void; + export interface ReadVResult { + bytesRead: number; + buffers: NodeJS.ArrayBufferView[]; + } + export namespace readv { + function __promisify__(fd: number, buffers: ReadonlyArray, position?: number): Promise; + } + /** + * For detailed information, see the documentation of the asynchronous version of + * this API: {@link readv}. + * @since v13.13.0, v12.17.0 + * @return The number of bytes read. + */ + export function readvSync(fd: number, buffers: ReadonlyArray, position?: number): number; + export interface OpenDirOptions { + encoding?: BufferEncoding | undefined; + /** + * Number of directory entries that are buffered + * internally when reading from the directory. Higher values lead to better + * performance but higher memory usage. + * @default 32 + */ + bufferSize?: number | undefined; + } + /** + * Synchronously open a directory. See [`opendir(3)`](http://man7.org/linux/man-pages/man3/opendir.3.html). + * + * Creates an `fs.Dir`, which contains all further functions for reading from + * and cleaning up the directory. + * + * The `encoding` option sets the encoding for the `path` while opening the + * directory and subsequent read operations. + * @since v12.12.0 + */ + export function opendirSync(path: string, options?: OpenDirOptions): Dir; + /** + * Asynchronously open a directory. See the POSIX [`opendir(3)`](http://man7.org/linux/man-pages/man3/opendir.3.html) documentation for + * more details. + * + * Creates an `fs.Dir`, which contains all further functions for reading from + * and cleaning up the directory. + * + * The `encoding` option sets the encoding for the `path` while opening the + * directory and subsequent read operations. + * @since v12.12.0 + */ + export function opendir(path: string, cb: (err: NodeJS.ErrnoException | null, dir: Dir) => void): void; + export function opendir(path: string, options: OpenDirOptions, cb: (err: NodeJS.ErrnoException | null, dir: Dir) => void): void; + export namespace opendir { + function __promisify__(path: string, options?: OpenDirOptions): Promise; + } + export interface BigIntStats extends StatsBase { + atimeNs: bigint; + mtimeNs: bigint; + ctimeNs: bigint; + birthtimeNs: bigint; + } + export interface BigIntOptions { + bigint: true; + } + export interface StatOptions { + bigint?: boolean | undefined; + } + export interface StatSyncOptions extends StatOptions { + throwIfNoEntry?: boolean | undefined; + } + export interface CopyOptions { + /** + * Dereference symlinks + * @default false + */ + dereference?: boolean; + /** + * When `force` is `false`, and the destination + * exists, throw an error. + * @default false + */ + errorOnExist?: boolean; + /** + * Function to filter copied files/directories. Return + * `true` to copy the item, `false` to ignore it. + */ + filter?(source: string, destination: string): boolean; + /** + * Overwrite existing file or directory. _The copy + * operation will ignore errors if you set this to false and the destination + * exists. Use the `errorOnExist` option to change this behavior. + * @default true + */ + force?: boolean; + /** + * When `true` timestamps from `src` will + * be preserved. 
+ * @default false + */ + preserveTimestamps?: boolean; + /** + * Copy directories recursively. + * @default false + */ + recursive?: boolean; + } + /** + * Asynchronously copies the entire directory structure from `src` to `dest`, + * including subdirectories and files. + * + * When copying a directory to another directory, globs are not supported and + * behavior is similar to `cp dir1/ dir2/`. + * @since v16.7.0 + * @experimental + * @param src source path to copy. + * @param dest destination path to copy to. + */ + export function cp(source: string, destination: string, callback: (err: NodeJS.ErrnoException | null) => void): void; + export function cp(source: string, destination: string, opts: CopyOptions, callback: (err: NodeJS.ErrnoException | null) => void): void; + /** + * Synchronously copies the entire directory structure from `src` to `dest`, + * including subdirectories and files. + * + * When copying a directory to another directory, globs are not supported and + * behavior is similar to `cp dir1/ dir2/`. + * @since v16.7.0 + * @experimental + * @param src source path to copy. + * @param dest destination path to copy to. + */ + export function cpSync(source: string, destination: string, opts?: CopyOptions): void; +} +declare module 'node:fs' { + export * from 'fs'; +} diff --git a/node_modules/@types/node/fs/promises.d.ts b/node_modules/@types/node/fs/promises.d.ts new file mode 100644 index 000000000..e4e04a840 --- /dev/null +++ b/node_modules/@types/node/fs/promises.d.ts @@ -0,0 +1,1004 @@ +/** + * The `fs/promises` API provides asynchronous file system methods that return + * promises. + * + * The promise APIs use the underlying Node.js threadpool to perform file + * system operations off the event loop thread. These operations are not + * synchronized or threadsafe. Care must be taken when performing multiple + * concurrent modifications on the same file or data corruption may occur. + * @since v10.0.0 + */ +declare module 'fs/promises' { + import { Abortable } from 'node:events'; + import { Stream } from 'node:stream'; + import { + Stats, + BigIntStats, + StatOptions, + WriteVResult, + ReadVResult, + PathLike, + RmDirOptions, + RmOptions, + MakeDirectoryOptions, + Dirent, + OpenDirOptions, + Dir, + ObjectEncodingOptions, + BufferEncodingOption, + OpenMode, + Mode, + WatchOptions, + WatchEventType, + CopyOptions, + } from 'node:fs'; + interface FileChangeInfo { + eventType: WatchEventType; + filename: T; + } + interface FlagAndOpenMode { + mode?: Mode | undefined; + flag?: OpenMode | undefined; + } + interface FileReadResult { + bytesRead: number; + buffer: T; + } + interface FileReadOptions { + /** + * @default `Buffer.alloc(0xffff)` + */ + buffer?: T; + /** + * @default 0 + */ + offset?: number | null; + /** + * @default `buffer.byteLength` + */ + length?: number | null; + position?: number | null; + } + // TODO: Add `EventEmitter` close + interface FileHandle { + /** + * The numeric file descriptor managed by the {FileHandle} object. + * @since v10.0.0 + */ + readonly fd: number; + /** + * Alias of `filehandle.writeFile()`. + * + * When operating on file handles, the mode cannot be changed from what it was set + * to with `fsPromises.open()`. Therefore, this is equivalent to `filehandle.writeFile()`. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + appendFile(data: string | Uint8Array, options?: (ObjectEncodingOptions & FlagAndOpenMode) | BufferEncoding | null): Promise; + /** + * Changes the ownership of the file. 
A wrapper for [`chown(2)`](http://man7.org/linux/man-pages/man2/chown.2.html). + * @since v10.0.0 + * @param uid The file's new owner's user id. + * @param gid The file's new group's group id. + * @return Fulfills with `undefined` upon success. + */ + chown(uid: number, gid: number): Promise; + /** + * Modifies the permissions on the file. See [`chmod(2)`](http://man7.org/linux/man-pages/man2/chmod.2.html). + * @since v10.0.0 + * @param mode the file mode bit mask. + * @return Fulfills with `undefined` upon success. + */ + chmod(mode: Mode): Promise; + /** + * Forces all currently queued I/O operations associated with the file to the + * operating system's synchronized I/O completion state. Refer to the POSIX [`fdatasync(2)`](http://man7.org/linux/man-pages/man2/fdatasync.2.html) documentation for details. + * + * Unlike `filehandle.sync` this method does not flush modified metadata. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + datasync(): Promise; + /** + * Request that all data for the open file descriptor is flushed to the storage + * device. The specific implementation is operating system and device specific. + * Refer to the POSIX [`fsync(2)`](http://man7.org/linux/man-pages/man2/fsync.2.html) documentation for more detail. + * @since v10.0.0 + * @return Fufills with `undefined` upon success. + */ + sync(): Promise; + /** + * Reads data from the file and stores that in the given buffer. + * + * If the file is not modified concurrently, the end-of-file is reached when the + * number of bytes read is zero. + * @since v10.0.0 + * @param buffer A buffer that will be filled with the file data read. + * @param offset The location in the buffer at which to start filling. + * @param length The number of bytes to read. + * @param position The location where to begin reading data from the file. If `null`, data will be read from the current file position, and the position will be updated. If `position` is an + * integer, the current file position will remain unchanged. + * @return Fulfills upon success with an object with two properties: + */ + read(buffer: T, offset?: number | null, length?: number | null, position?: number | null): Promise>; + read(options?: FileReadOptions): Promise>; + /** + * Asynchronously reads the entire contents of a file. + * + * If `options` is a string, then it specifies the `encoding`. + * + * The `FileHandle` has to support reading. + * + * If one or more `filehandle.read()` calls are made on a file handle and then a`filehandle.readFile()` call is made, the data will be read from the current + * position till the end of the file. It doesn't always read from the beginning + * of the file. + * @since v10.0.0 + * @return Fulfills upon a successful read with the contents of the file. If no encoding is specified (using `options.encoding`), the data is returned as a {Buffer} object. Otherwise, the + * data will be a string. + */ + readFile( + options?: { + encoding?: null | undefined; + flag?: OpenMode | undefined; + } | null + ): Promise; + /** + * Asynchronously reads the entire contents of a file. The underlying file will _not_ be closed automatically. + * The `FileHandle` must have been opened for reading. + * @param options An object that may contain an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + readFile( + options: + | { + encoding: BufferEncoding; + flag?: OpenMode | undefined; + } + | BufferEncoding + ): Promise; + /** + * Asynchronously reads the entire contents of a file. 
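The `FileHandle.read()` method documented above fills a caller-supplied buffer. A minimal sketch, assuming a hypothetical helper name and parameters:

```ts
import { open } from 'fs/promises';

// Read the first `count` bytes of a file via a FileHandle.
async function readFirstBytes(path: string, count: number): Promise<Buffer> {
  const handle = await open(path, 'r');
  try {
    const buffer = Buffer.alloc(count);
    const { bytesRead } = await handle.read(buffer, 0, count, 0);
    return buffer.subarray(0, bytesRead); // trim to what was actually read
  } finally {
    await handle.close(); // always release the file descriptor
  }
}
```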
The underlying file will _not_ be closed automatically. + * The `FileHandle` must have been opened for reading. + * @param options An object that may contain an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + readFile( + options?: + | (ObjectEncodingOptions & { + flag?: OpenMode | undefined; + }) + | BufferEncoding + | null + ): Promise; + /** + * @since v10.0.0 + * @return Fulfills with an {fs.Stats} for the file. + */ + stat( + opts?: StatOptions & { + bigint?: false | undefined; + } + ): Promise; + stat( + opts: StatOptions & { + bigint: true; + } + ): Promise; + stat(opts?: StatOptions): Promise; + /** + * Truncates the file. + * + * If the file was larger than `len` bytes, only the first `len` bytes will be + * retained in the file. + * + * The following example retains only the first four bytes of the file: + * + * ```js + * import { open } from 'fs/promises'; + * + * let filehandle = null; + * try { + * filehandle = await open('temp.txt', 'r+'); + * await filehandle.truncate(4); + * } finally { + * await filehandle?.close(); + * } + * ``` + * + * If the file previously was shorter than `len` bytes, it is extended, and the + * extended part is filled with null bytes (`'\0'`): + * + * If `len` is negative then `0` will be used. + * @since v10.0.0 + * @param [len=0] + * @return Fulfills with `undefined` upon success. + */ + truncate(len?: number): Promise; + /** + * Change the file system timestamps of the object referenced by the `FileHandle` then resolves the promise with no arguments upon success. + * @since v10.0.0 + */ + utimes(atime: string | number | Date, mtime: string | number | Date): Promise; + /** + * Asynchronously writes data to a file, replacing the file if it already exists.`data` can be a string, a buffer, an + * [AsyncIterable](https://tc39.github.io/ecma262/#sec-asynciterable-interface) or + * [Iterable](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Iteration_protocols#The_iterable_protocol) object, or an + * object with an own `toString` function + * property. The promise is resolved with no arguments upon success. + * + * If `options` is a string, then it specifies the `encoding`. + * + * The `FileHandle` has to support writing. + * + * It is unsafe to use `filehandle.writeFile()` multiple times on the same file + * without waiting for the promise to be resolved (or rejected). + * + * If one or more `filehandle.write()` calls are made on a file handle and then a`filehandle.writeFile()` call is made, the data will be written from the + * current position till the end of the file. It doesn't always write from the + * beginning of the file. + * @since v10.0.0 + */ + writeFile(data: string | Uint8Array, options?: (ObjectEncodingOptions & FlagAndOpenMode & Abortable) | BufferEncoding | null): Promise; + /** + * Write `buffer` to the file. + * + * If `buffer` is a plain object, it must have an own (not inherited) `toString`function property. + * + * The promise is resolved with an object containing two properties: + * + * It is unsafe to use `filehandle.write()` multiple times on the same file + * without waiting for the promise to be resolved (or rejected). For this + * scenario, use `fs.createWriteStream()`. + * + * On Linux, positional writes do not work when the file is opened in append mode. + * The kernel ignores the position argument and always appends the data to + * the end of the file. + * @since v10.0.0 + * @param [offset=0] The start position from within `buffer` where the data to write begins. 
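The `FileHandle.stat()` overloads above switch between `Stats` and `BigIntStats` based on the `bigint` option. A small sketch (the helper name is illustrative):

```ts
import { open } from 'fs/promises';

// Inspect a file via FileHandle.stat() with bigint-precision timestamps.
async function describeFile(path: string): Promise<void> {
  const handle = await open(path, 'r');
  try {
    const stats = await handle.stat({ bigint: true }); // resolves with BigIntStats
    console.log(`size=${stats.size} bytes, mtimeNs=${stats.mtimeNs}`);
  } finally {
    await handle.close();
  }
}
```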
+ * @param [length=buffer.byteLength] The number of bytes from `buffer` to write. + * @param position The offset from the beginning of the file where the data from `buffer` should be written. If `position` is not a `number`, the data will be written at the current position. + * See the POSIX pwrite(2) documentation for more detail. + */ + write( + buffer: TBuffer, + offset?: number | null, + length?: number | null, + position?: number | null + ): Promise<{ + bytesWritten: number; + buffer: TBuffer; + }>; + write( + data: string, + position?: number | null, + encoding?: BufferEncoding | null + ): Promise<{ + bytesWritten: number; + buffer: string; + }>; + /** + * Write an array of [ArrayBufferView](https://developer.mozilla.org/en-US/docs/Web/API/ArrayBufferView) s to the file. + * + * The promise is resolved with an object containing a two properties: + * + * It is unsafe to call `writev()` multiple times on the same file without waiting + * for the promise to be resolved (or rejected). + * + * On Linux, positional writes don't work when the file is opened in append mode. + * The kernel ignores the position argument and always appends the data to + * the end of the file. + * @since v12.9.0 + * @param position The offset from the beginning of the file where the data from `buffers` should be written. If `position` is not a `number`, the data will be written at the current + * position. + */ + writev(buffers: ReadonlyArray, position?: number): Promise; + /** + * Read from a file and write to an array of [ArrayBufferView](https://developer.mozilla.org/en-US/docs/Web/API/ArrayBufferView) s + * @since v13.13.0, v12.17.0 + * @param position The offset from the beginning of the file where the data should be read from. If `position` is not a `number`, the data will be read from the current position. + * @return Fulfills upon success an object containing two properties: + */ + readv(buffers: ReadonlyArray, position?: number): Promise; + /** + * Closes the file handle after waiting for any pending operation on the handle to + * complete. + * + * ```js + * import { open } from 'fs/promises'; + * + * let filehandle; + * try { + * filehandle = await open('thefile.txt', 'r'); + * } finally { + * await filehandle?.close(); + * } + * ``` + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + close(): Promise; + } + /** + * Tests a user's permissions for the file or directory specified by `path`. + * The `mode` argument is an optional integer that specifies the accessibility + * checks to be performed. Check `File access constants` for possible values + * of `mode`. It is possible to create a mask consisting of the bitwise OR of + * two or more values (e.g. `fs.constants.W_OK | fs.constants.R_OK`). + * + * If the accessibility check is successful, the promise is resolved with no + * value. If any of the accessibility checks fail, the promise is rejected + * with an [Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error) object. The following example checks if the file`/etc/passwd` can be read and + * written by the current process. + * + * ```js + * import { access } from 'fs/promises'; + * import { constants } from 'fs'; + * + * try { + * await access('/etc/passwd', constants.R_OK | constants.W_OK); + * console.log('can access'); + * } catch { + * console.error('cannot access'); + * } + * ``` + * + * Using `fsPromises.access()` to check for the accessibility of a file before + * calling `fsPromises.open()` is not recommended. 
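`FileHandle.writev()` above accepts an array of buffer views and resolves with a `WriteVResult`. A minimal sketch, with a hypothetical helper that serializes strings first:

```ts
import { open } from 'fs/promises';

// Write several chunks with a single writev() call and report the byte count.
async function writeChunks(path: string, chunks: string[]): Promise<number> {
  const handle = await open(path, 'w');
  try {
    const buffers = chunks.map((c) => Buffer.from(c, 'utf8'));
    const { bytesWritten } = await handle.writev(buffers); // resolves with a WriteVResult
    return bytesWritten;
  } finally {
    await handle.close();
  }
}
```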
Doing so introduces a race + * condition, since other processes may change the file's state between the two + * calls. Instead, user code should open/read/write the file directly and handle + * the error raised if the file is not accessible. + * @since v10.0.0 + * @param [mode=fs.constants.F_OK] + * @return Fulfills with `undefined` upon success. + */ + function access(path: PathLike, mode?: number): Promise; + /** + * Asynchronously copies `src` to `dest`. By default, `dest` is overwritten if it + * already exists. + * + * No guarantees are made about the atomicity of the copy operation. If an + * error occurs after the destination file has been opened for writing, an attempt + * will be made to remove the destination. + * + * ```js + * import { constants } from 'fs'; + * import { copyFile } from 'fs/promises'; + * + * try { + * await copyFile('source.txt', 'destination.txt'); + * console.log('source.txt was copied to destination.txt'); + * } catch { + * console.log('The file could not be copied'); + * } + * + * // By using COPYFILE_EXCL, the operation will fail if destination.txt exists. + * try { + * await copyFile('source.txt', 'destination.txt', constants.COPYFILE_EXCL); + * console.log('source.txt was copied to destination.txt'); + * } catch { + * console.log('The file could not be copied'); + * } + * ``` + * @since v10.0.0 + * @param src source filename to copy + * @param dest destination filename of the copy operation + * @param [mode=0] Optional modifiers that specify the behavior of the copy operation. It is possible to create a mask consisting of the bitwise OR of two or more values (e.g. + * `fs.constants.COPYFILE_EXCL | fs.constants.COPYFILE_FICLONE`) + * @return Fulfills with `undefined` upon success. + */ + function copyFile(src: PathLike, dest: PathLike, mode?: number): Promise; + /** + * Opens a `FileHandle`. + * + * Refer to the POSIX [`open(2)`](http://man7.org/linux/man-pages/man2/open.2.html) documentation for more detail. + * + * Some characters (`< > : " / \ | ? *`) are reserved under Windows as documented + * by [Naming Files, Paths, and Namespaces](https://docs.microsoft.com/en-us/windows/desktop/FileIO/naming-a-file). Under NTFS, if the filename contains + * a colon, Node.js will open a file system stream, as described by [this MSDN page](https://docs.microsoft.com/en-us/windows/desktop/FileIO/using-streams). + * @since v10.0.0 + * @param [flags='r'] See `support of file system `flags``. + * @param [mode=0o666] Sets the file mode (permission and sticky bits) if the file is created. + * @return Fulfills with a {FileHandle} object. + */ + function open(path: PathLike, flags: string | number, mode?: Mode): Promise; + /** + * Renames `oldPath` to `newPath`. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function rename(oldPath: PathLike, newPath: PathLike): Promise; + /** + * Truncates (shortens or extends the length) of the content at `path` to `len`bytes. + * @since v10.0.0 + * @param [len=0] + * @return Fulfills with `undefined` upon success. + */ + function truncate(path: PathLike, len?: number): Promise; + /** + * Removes the directory identified by `path`. + * + * Using `fsPromises.rmdir()` on a file (not a directory) results in the + * promise being rejected with an `ENOENT` error on Windows and an `ENOTDIR`error on POSIX. + * + * To get a behavior similar to the `rm -rf` Unix command, use `fsPromises.rm()` with options `{ recursive: true, force: true }`. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. 
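`fsPromises.open()` above takes optional `flags` and `mode` arguments. A minimal sketch of opening for append-plus-read; the `output.log` path, flag string, and `0o644` mode are illustrative:

```ts
import { open } from 'fs/promises';

// Open a file for appending (creating it if missing) and add one line.
async function appendLine(line: string): Promise<void> {
  const handle = await open('output.log', 'a+', 0o644);
  try {
    await handle.appendFile(line + '\n');
  } finally {
    await handle.close();
  }
}
```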
+ */ + function rmdir(path: PathLike, options?: RmDirOptions): Promise; + /** + * Removes files and directories (modeled on the standard POSIX `rm` utility). + * @since v14.14.0 + * @return Fulfills with `undefined` upon success. + */ + function rm(path: PathLike, options?: RmOptions): Promise; + /** + * Asynchronously creates a directory. + * + * The optional `options` argument can be an integer specifying `mode` (permission + * and sticky bits), or an object with a `mode` property and a `recursive`property indicating whether parent directories should be created. Calling`fsPromises.mkdir()` when `path` is a directory + * that exists results in a + * rejection only when `recursive` is false. + * @since v10.0.0 + * @return Upon success, fulfills with `undefined` if `recursive` is `false`, or the first directory path created if `recursive` is `true`. + */ + function mkdir( + path: PathLike, + options: MakeDirectoryOptions & { + recursive: true; + } + ): Promise; + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + function mkdir( + path: PathLike, + options?: + | Mode + | (MakeDirectoryOptions & { + recursive?: false | undefined; + }) + | null + ): Promise; + /** + * Asynchronous mkdir(2) - create a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options Either the file mode, or an object optionally specifying the file mode and whether parent folders + * should be created. If a string is passed, it is parsed as an octal integer. If not specified, defaults to `0o777`. + */ + function mkdir(path: PathLike, options?: Mode | MakeDirectoryOptions | null): Promise; + /** + * Reads the contents of a directory. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the filenames. If the `encoding` is set to `'buffer'`, the filenames returned + * will be passed as `Buffer` objects. + * + * If `options.withFileTypes` is set to `true`, the resolved array will contain `fs.Dirent` objects. + * + * ```js + * import { readdir } from 'fs/promises'; + * + * try { + * const files = await readdir(path); + * for (const file of files) + * console.log(file); + * } catch (err) { + * console.error(err); + * } + * ``` + * @since v10.0.0 + * @return Fulfills with an array of the names of the files in the directory excluding `'.'` and `'..'`. + */ + function readdir( + path: PathLike, + options?: + | (ObjectEncodingOptions & { + withFileTypes?: false | undefined; + }) + | BufferEncoding + | null + ): Promise; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function readdir( + path: PathLike, + options: + | { + encoding: 'buffer'; + withFileTypes?: false | undefined; + } + | 'buffer' + ): Promise; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. 
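The chunk above covers `rm`, `mkdir` with `recursive`, and the `readdir` overloads. A small sketch combining them; the directory layout is hypothetical:

```ts
import { mkdir, readdir, rm } from 'fs/promises';

// Recreate a directory from scratch and list its entries as fs.Dirent objects.
async function resetAndList(root: string): Promise<void> {
  await rm(root, { recursive: true, force: true }); // behaves like `rm -rf`
  await mkdir(root, { recursive: true });           // parent directories created as needed
  const entries = await readdir(root, { withFileTypes: true });
  for (const entry of entries) {
    console.log(entry.isDirectory() ? `[dir] ${entry.name}` : entry.name);
  }
}
```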
+ * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function readdir( + path: PathLike, + options?: + | (ObjectEncodingOptions & { + withFileTypes?: false | undefined; + }) + | BufferEncoding + | null + ): Promise; + /** + * Asynchronous readdir(3) - read a directory. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options If called with `withFileTypes: true` the result data will be an array of Dirent. + */ + function readdir( + path: PathLike, + options: ObjectEncodingOptions & { + withFileTypes: true; + } + ): Promise; + /** + * Reads the contents of the symbolic link referred to by `path`. See the POSIX [`readlink(2)`](http://man7.org/linux/man-pages/man2/readlink.2.html) documentation for more detail. The promise is + * resolved with the`linkString` upon success. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the link path returned. If the `encoding` is set to `'buffer'`, the link path + * returned will be passed as a `Buffer` object. + * @since v10.0.0 + * @return Fulfills with the `linkString` upon success. + */ + function readlink(path: PathLike, options?: ObjectEncodingOptions | BufferEncoding | null): Promise; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function readlink(path: PathLike, options: BufferEncodingOption): Promise; + /** + * Asynchronous readlink(2) - read value of a symbolic link. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function readlink(path: PathLike, options?: ObjectEncodingOptions | string | null): Promise; + /** + * Creates a symbolic link. + * + * The `type` argument is only used on Windows platforms and can be one of `'dir'`,`'file'`, or `'junction'`. Windows junction points require the destination path + * to be absolute. When using `'junction'`, the `target` argument will + * automatically be normalized to absolute path. + * @since v10.0.0 + * @param [type='file'] + * @return Fulfills with `undefined` upon success. + */ + function symlink(target: PathLike, path: PathLike, type?: string | null): Promise; + /** + * Equivalent to `fsPromises.stat()` unless `path` refers to a symbolic link, + * in which case the link itself is stat-ed, not the file that it refers to. + * Refer to the POSIX [`lstat(2)`](http://man7.org/linux/man-pages/man2/lstat.2.html) document for more detail. + * @since v10.0.0 + * @return Fulfills with the {fs.Stats} object for the given symbolic link `path`. + */ + function lstat( + path: PathLike, + opts?: StatOptions & { + bigint?: false | undefined; + } + ): Promise; + function lstat( + path: PathLike, + opts: StatOptions & { + bigint: true; + } + ): Promise; + function lstat(path: PathLike, opts?: StatOptions): Promise; + /** + * @since v10.0.0 + * @return Fulfills with the {fs.Stats} object for the given `path`. 
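`symlink()` and the `lstat()`/`stat()` pair are declared above; `lstat` examines the link itself while `stat` follows it. A minimal sketch with placeholder paths (creating symlinks may require extra privileges on Windows):

```ts
import { symlink, lstat, stat } from 'fs/promises';

// Create a symlink and contrast lstat (the link) with stat (its target).
async function inspectLink(): Promise<void> {
  await symlink('target.txt', 'link.txt');
  const linkInfo = await lstat('link.txt');
  const targetInfo = await stat('link.txt');
  console.log(linkInfo.isSymbolicLink()); // true: the link itself was stat-ed
  console.log(targetInfo.isFile());       // true: the target was stat-ed
}
```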
+ */ + function stat( + path: PathLike, + opts?: StatOptions & { + bigint?: false | undefined; + } + ): Promise; + function stat( + path: PathLike, + opts: StatOptions & { + bigint: true; + } + ): Promise; + function stat(path: PathLike, opts?: StatOptions): Promise; + /** + * Creates a new link from the `existingPath` to the `newPath`. See the POSIX [`link(2)`](http://man7.org/linux/man-pages/man2/link.2.html) documentation for more detail. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function link(existingPath: PathLike, newPath: PathLike): Promise; + /** + * If `path` refers to a symbolic link, then the link is removed without affecting + * the file or directory to which that link refers. If the `path` refers to a file + * path that is not a symbolic link, the file is deleted. See the POSIX [`unlink(2)`](http://man7.org/linux/man-pages/man2/unlink.2.html) documentation for more detail. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function unlink(path: PathLike): Promise; + /** + * Changes the permissions of a file. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function chmod(path: PathLike, mode: Mode): Promise; + /** + * Changes the permissions on a symbolic link. + * + * This method is only implemented on macOS. + * @deprecated Since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function lchmod(path: PathLike, mode: Mode): Promise; + /** + * Changes the ownership on a symbolic link. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function lchown(path: PathLike, uid: number, gid: number): Promise; + /** + * Changes the access and modification times of a file in the same way as `fsPromises.utimes()`, with the difference that if the path refers to a + * symbolic link, then the link is not dereferenced: instead, the timestamps of + * the symbolic link itself are changed. + * @since v14.5.0, v12.19.0 + * @return Fulfills with `undefined` upon success. + */ + function lutimes(path: PathLike, atime: string | number | Date, mtime: string | number | Date): Promise; + /** + * Changes the ownership of a file. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function chown(path: PathLike, uid: number, gid: number): Promise; + /** + * Change the file system timestamps of the object referenced by `path`. + * + * The `atime` and `mtime` arguments follow these rules: + * + * * Values can be either numbers representing Unix epoch time, `Date`s, or a + * numeric string like `'123456789.0'`. + * * If the value can not be converted to a number, or is `NaN`, `Infinity` or`-Infinity`, an `Error` will be thrown. + * @since v10.0.0 + * @return Fulfills with `undefined` upon success. + */ + function utimes(path: PathLike, atime: string | number | Date, mtime: string | number | Date): Promise; + /** + * Determines the actual location of `path` using the same semantics as the`fs.realpath.native()` function. + * + * Only paths that can be converted to UTF8 strings are supported. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use for + * the path. If the `encoding` is set to `'buffer'`, the path returned will be + * passed as a `Buffer` object. + * + * On Linux, when Node.js is linked against musl libc, the procfs file system must + * be mounted on `/proc` in order for this function to work. 
Glibc does not have + * this restriction. + * @since v10.0.0 + * @return Fulfills with the resolved path upon success. + */ + function realpath(path: PathLike, options?: ObjectEncodingOptions | BufferEncoding | null): Promise; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function realpath(path: PathLike, options: BufferEncodingOption): Promise; + /** + * Asynchronous realpath(3) - return the canonicalized absolute pathname. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function realpath(path: PathLike, options?: ObjectEncodingOptions | BufferEncoding | null): Promise; + /** + * Creates a unique temporary directory. A unique directory name is generated by + * appending six random characters to the end of the provided `prefix`. Due to + * platform inconsistencies, avoid trailing `X` characters in `prefix`. Some + * platforms, notably the BSDs, can return more than six random characters, and + * replace trailing `X` characters in `prefix` with random characters. + * + * The optional `options` argument can be a string specifying an encoding, or an + * object with an `encoding` property specifying the character encoding to use. + * + * ```js + * import { mkdtemp } from 'fs/promises'; + * + * try { + * await mkdtemp(path.join(os.tmpdir(), 'foo-')); + * } catch (err) { + * console.error(err); + * } + * ``` + * + * The `fsPromises.mkdtemp()` method will append the six randomly selected + * characters directly to the `prefix` string. For instance, given a directory`/tmp`, if the intention is to create a temporary directory _within_`/tmp`, the`prefix` must end with a trailing + * platform-specific path separator + * (`require('path').sep`). + * @since v10.0.0 + * @return Fulfills with a string containing the filesystem path of the newly created temporary directory. + */ + function mkdtemp(prefix: string, options?: ObjectEncodingOptions | BufferEncoding | null): Promise; + /** + * Asynchronously creates a unique temporary directory. + * Generates six random characters to be appended behind a required `prefix` to create a unique temporary directory. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function mkdtemp(prefix: string, options: BufferEncodingOption): Promise; + /** + * Asynchronously creates a unique temporary directory. + * Generates six random characters to be appended behind a required `prefix` to create a unique temporary directory. + * @param options The encoding (or an object specifying the encoding), used as the encoding of the result. If not provided, `'utf8'` is used. + */ + function mkdtemp(prefix: string, options?: ObjectEncodingOptions | BufferEncoding | null): Promise; + /** + * Asynchronously writes data to a file, replacing the file if it already exists.`data` can be a string, a `Buffer`, or, an object with an own (not inherited)`toString` function property. + * + * The `encoding` option is ignored if `data` is a buffer. + * + * If `options` is a string, then it specifies the encoding. 
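The `mkdtemp()` documentation above recommends building the prefix from the temp directory plus a name. A minimal sketch of a cleanup wrapper; the `app-` prefix and helper name are illustrative:

```ts
import { mkdtemp, rm } from 'fs/promises';
import { tmpdir } from 'os';
import { join } from 'path';

// Create a unique temp directory, run some work in it, then remove it.
async function withTempDir(work: (dir: string) => Promise<void>): Promise<void> {
  const dir = await mkdtemp(join(tmpdir(), 'app-')); // e.g. /tmp/app-abc123
  try {
    await work(dir);
  } finally {
    await rm(dir, { recursive: true, force: true });
  }
}
```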
+ * + * The `mode` option only affects the newly created file. See `fs.open()` for more details. + * + * Any specified `FileHandle` has to support writing. + * + * It is unsafe to use `fsPromises.writeFile()` multiple times on the same file + * without waiting for the promise to be settled. + * + * Similarly to `fsPromises.readFile` \- `fsPromises.writeFile` is a convenience + * method that performs multiple `write` calls internally to write the buffer + * passed to it. For performance sensitive code consider using `fs.createWriteStream()`. + * + * It is possible to use an `AbortSignal` to cancel an `fsPromises.writeFile()`. + * Cancelation is "best effort", and some amount of data is likely still + * to be written. + * + * ```js + * import { writeFile } from 'fs/promises'; + * import { Buffer } from 'buffer'; + * + * try { + * const controller = new AbortController(); + * const { signal } = controller; + * const data = new Uint8Array(Buffer.from('Hello Node.js')); + * const promise = writeFile('message.txt', data, { signal }); + * + * // Abort the request before the promise settles. + * controller.abort(); + * + * await promise; + * } catch (err) { + * // When a request is aborted - err is an AbortError + * console.error(err); + * } + * ``` + * + * Aborting an ongoing request does not abort individual operating + * system requests but rather the internal buffering `fs.writeFile` performs. + * @since v10.0.0 + * @param file filename or `FileHandle` + * @return Fulfills with `undefined` upon success. + */ + function writeFile( + file: PathLike | FileHandle, + data: string | NodeJS.ArrayBufferView | Iterable | AsyncIterable | Stream, + options?: + | (ObjectEncodingOptions & { + mode?: Mode | undefined; + flag?: OpenMode | undefined; + } & Abortable) + | BufferEncoding + | null + ): Promise; + /** + * Asynchronously append data to a file, creating the file if it does not yet + * exist. `data` can be a string or a `Buffer`. + * + * If `options` is a string, then it specifies the `encoding`. + * + * The `mode` option only affects the newly created file. See `fs.open()` for more details. + * + * The `path` may be specified as a `FileHandle` that has been opened + * for appending (using `fsPromises.open()`). + * @since v10.0.0 + * @param path filename or {FileHandle} + * @return Fulfills with `undefined` upon success. + */ + function appendFile(path: PathLike | FileHandle, data: string | Uint8Array, options?: (ObjectEncodingOptions & FlagAndOpenMode) | BufferEncoding | null): Promise; + /** + * Asynchronously reads the entire contents of a file. + * + * If no encoding is specified (using `options.encoding`), the data is returned + * as a `Buffer` object. Otherwise, the data will be a string. + * + * If `options` is a string, then it specifies the encoding. + * + * When the `path` is a directory, the behavior of `fsPromises.readFile()` is + * platform-specific. On macOS, Linux, and Windows, the promise will be rejected + * with an error. On FreeBSD, a representation of the directory's contents will be + * returned. + * + * It is possible to abort an ongoing `readFile` using an `AbortSignal`. If a + * request is aborted the promise returned is rejected with an `AbortError`: + * + * ```js + * import { readFile } from 'fs/promises'; + * + * try { + * const controller = new AbortController(); + * const { signal } = controller; + * const promise = readFile(fileName, { signal }); + * + * // Abort the request before the promise settles. 
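`fsPromises.appendFile()` above accepts the same encoding/flag options as `writeFile()`. A minimal sketch of a logging helper; the `app.log` filename is a placeholder:

```ts
import { appendFile } from 'fs/promises';

// Append a timestamped line, creating the file if it does not exist.
async function log(line: string): Promise<void> {
  await appendFile('app.log', `${new Date().toISOString()} ${line}\n`, {
    encoding: 'utf8',
    flag: 'a',
  });
}
```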
+ * controller.abort(); + * + * await promise; + * } catch (err) { + * // When a request is aborted - err is an AbortError + * console.error(err); + * } + * ``` + * + * Aborting an ongoing request does not abort individual operating + * system requests but rather the internal buffering `fs.readFile` performs. + * + * Any specified `FileHandle` has to support reading. + * @since v10.0.0 + * @param path filename or `FileHandle` + * @return Fulfills with the contents of the file. + */ + function readFile( + path: PathLike | FileHandle, + options?: + | ({ + encoding?: null | undefined; + flag?: OpenMode | undefined; + } & Abortable) + | null + ): Promise; + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a `FileHandle` is provided, the underlying file will _not_ be closed automatically. + * @param options An object that may contain an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + function readFile( + path: PathLike | FileHandle, + options: + | ({ + encoding: BufferEncoding; + flag?: OpenMode | undefined; + } & Abortable) + | BufferEncoding + ): Promise; + /** + * Asynchronously reads the entire contents of a file. + * @param path A path to a file. If a URL is provided, it must use the `file:` protocol. + * If a `FileHandle` is provided, the underlying file will _not_ be closed automatically. + * @param options An object that may contain an optional flag. + * If a flag is not provided, it defaults to `'r'`. + */ + function readFile( + path: PathLike | FileHandle, + options?: + | (ObjectEncodingOptions & + Abortable & { + flag?: OpenMode | undefined; + }) + | BufferEncoding + | null + ): Promise; + /** + * Asynchronously open a directory for iterative scanning. See the POSIX [`opendir(3)`](http://man7.org/linux/man-pages/man3/opendir.3.html) documentation for more detail. + * + * Creates an `fs.Dir`, which contains all further functions for reading from + * and cleaning up the directory. + * + * The `encoding` option sets the encoding for the `path` while opening the + * directory and subsequent read operations. + * + * Example using async iteration: + * + * ```js + * import { opendir } from 'fs/promises'; + * + * try { + * const dir = await opendir('./'); + * for await (const dirent of dir) + * console.log(dirent.name); + * } catch (err) { + * console.error(err); + * } + * ``` + * + * When using the async iterator, the `fs.Dir` object will be automatically + * closed after the iterator exits. + * @since v12.12.0 + * @return Fulfills with an {fs.Dir}. + */ + function opendir(path: string, options?: OpenDirOptions): Promise; + /** + * Returns an async iterator that watches for changes on `filename`, where `filename`is either a file or a directory. + * + * ```js + * const { watch } = require('fs/promises'); + * + * const ac = new AbortController(); + * const { signal } = ac; + * setTimeout(() => ac.abort(), 10000); + * + * (async () => { + * try { + * const watcher = watch(__filename, { signal }); + * for await (const event of watcher) + * console.log(event); + * } catch (err) { + * if (err.name === 'AbortError') + * return; + * throw err; + * } + * })(); + * ``` + * + * On most platforms, `'rename'` is emitted whenever a filename appears or + * disappears in the directory. + * + * All the `caveats` for `fs.watch()` also apply to `fsPromises.watch()`. 
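The `readFile()` overloads above determine the result type from the encoding option. A small sketch showing both forms; the helper name is illustrative:

```ts
import { readFile } from 'fs/promises';

// With an encoding the promise resolves to a string; without one, to a Buffer.
async function measure(path: string) {
  const text: string = await readFile(path, 'utf8');
  const bytes: Buffer = await readFile(path);
  return { characters: text.length, byteLength: bytes.byteLength };
}
```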
+ * @since v15.9.0 + * @return of objects with the properties: + */ + function watch( + filename: PathLike, + options: + | (WatchOptions & { + encoding: 'buffer'; + }) + | 'buffer' + ): AsyncIterable>; + /** + * Watch for changes on `filename`, where `filename` is either a file or a directory, returning an `FSWatcher`. + * @param filename A path to a file or directory. If a URL is provided, it must use the `file:` protocol. + * @param options Either the encoding for the filename provided to the listener, or an object optionally specifying encoding, persistent, and recursive options. + * If `encoding` is not supplied, the default of `'utf8'` is used. + * If `persistent` is not supplied, the default of `true` is used. + * If `recursive` is not supplied, the default of `false` is used. + */ + function watch(filename: PathLike, options?: WatchOptions | BufferEncoding): AsyncIterable>; + /** + * Watch for changes on `filename`, where `filename` is either a file or a directory, returning an `FSWatcher`. + * @param filename A path to a file or directory. If a URL is provided, it must use the `file:` protocol. + * @param options Either the encoding for the filename provided to the listener, or an object optionally specifying encoding, persistent, and recursive options. + * If `encoding` is not supplied, the default of `'utf8'` is used. + * If `persistent` is not supplied, the default of `true` is used. + * If `recursive` is not supplied, the default of `false` is used. + */ + function watch(filename: PathLike, options: WatchOptions | string): AsyncIterable> | AsyncIterable>; + /** + * Asynchronously copies the entire directory structure from `src` to `dest`, + * including subdirectories and files. + * + * When copying a directory to another directory, globs are not supported and + * behavior is similar to `cp dir1/ dir2/`. + * @since v16.7.0 + * @experimental + * @param src source path to copy. + * @param dest destination path to copy to. + * @return Fulfills with `undefined` upon success. + */ + function cp(source: string, destination: string, opts?: CopyOptions): Promise; +} +declare module 'node:fs/promises' { + export * from 'fs/promises'; +} diff --git a/node_modules/@types/node/globals.d.ts b/node_modules/@types/node/globals.d.ts new file mode 100644 index 000000000..ae974db7e --- /dev/null +++ b/node_modules/@types/node/globals.d.ts @@ -0,0 +1,284 @@ +// Declare "static" methods in Error +interface ErrorConstructor { + /** Create .stack property on a target object */ + captureStackTrace(targetObject: object, constructorOpt?: Function): void; + + /** + * Optional override for formatting stack traces + * + * @see https://v8.dev/docs/stack-trace-api#customizing-stack-traces + */ + prepareStackTrace?: ((err: Error, stackTraces: NodeJS.CallSite[]) => any) | undefined; + + stackTraceLimit: number; +} + +/*-----------------------------------------------* + * * + * GLOBAL * + * * + ------------------------------------------------*/ + +// For backwards compability +interface NodeRequire extends NodeJS.Require { } +interface RequireResolve extends NodeJS.RequireResolve { } +interface NodeModule extends NodeJS.Module { } + +declare var process: NodeJS.Process; +declare var console: Console; + +declare var __filename: string; +declare var __dirname: string; + +declare var require: NodeRequire; +declare var module: NodeModule; + +// Same as module.exports +declare var exports: any; + +/** + * Only available if `--expose-gc` is passed to the process. 
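The globals file beginning above extends `ErrorConstructor` with `captureStackTrace`. A minimal sketch of how that is typically used in a custom error class (the `HttpError` class is hypothetical):

```ts
// Capture a stack trace that omits the constructor frame itself.
class HttpError extends Error {
  constructor(readonly statusCode: number, message: string) {
    super(message);
    Error.captureStackTrace(this, HttpError);
  }
}

const err = new HttpError(404, 'task not found');
console.log(err.stack); // the trace starts at the caller, not inside HttpError's constructor
```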
+ */ +declare var gc: undefined | (() => void); + +//#region borrowed +// from https://github.com/microsoft/TypeScript/blob/38da7c600c83e7b31193a62495239a0fe478cb67/lib/lib.webworker.d.ts#L633 until moved to separate lib +/** A controller object that allows you to abort one or more DOM requests as and when desired. */ +interface AbortController { + /** + * Returns the AbortSignal object associated with this object. + */ + + readonly signal: AbortSignal; + /** + * Invoking this method will set this object's AbortSignal's aborted flag and signal to any observers that the associated activity is to be aborted. + */ + abort(): void; +} + +/** A signal object that allows you to communicate with a DOM request (such as a Fetch) and abort it if required via an AbortController object. */ +interface AbortSignal { + /** + * Returns true if this AbortSignal's AbortController has signaled to abort, and false otherwise. + */ + readonly aborted: boolean; +} + +declare var AbortController: { + prototype: AbortController; + new(): AbortController; +}; + +declare var AbortSignal: { + prototype: AbortSignal; + new(): AbortSignal; + // TODO: Add abort() static +}; +//#endregion borrowed + +//#region ArrayLike.at() +interface RelativeIndexable { + /** + * Takes an integer value and returns the item at that index, + * allowing for positive and negative integers. + * Negative integers count back from the last item in the array. + */ + at(index: number): T | undefined; +} +interface String extends RelativeIndexable {} +interface Array extends RelativeIndexable {} +interface Int8Array extends RelativeIndexable {} +interface Uint8Array extends RelativeIndexable {} +interface Uint8ClampedArray extends RelativeIndexable {} +interface Int16Array extends RelativeIndexable {} +interface Uint16Array extends RelativeIndexable {} +interface Int32Array extends RelativeIndexable {} +interface Uint32Array extends RelativeIndexable {} +interface Float32Array extends RelativeIndexable {} +interface Float64Array extends RelativeIndexable {} +interface BigInt64Array extends RelativeIndexable {} +interface BigUint64Array extends RelativeIndexable {} +//#endregion ArrayLike.at() end + +/*----------------------------------------------* +* * +* GLOBAL INTERFACES * +* * +*-----------------------------------------------*/ +declare namespace NodeJS { + interface CallSite { + /** + * Value of "this" + */ + getThis(): unknown; + + /** + * Type of "this" as a string. + * This is the name of the function stored in the constructor field of + * "this", if available. Otherwise the object's [[Class]] internal + * property. + */ + getTypeName(): string | null; + + /** + * Current function + */ + getFunction(): Function | undefined; + + /** + * Name of the current function, typically its name property. + * If a name property is not available an attempt will be made to try + * to infer a name from the function's context. 
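Two of the globals declared above are the `AbortController`/`AbortSignal` pair and the `RelativeIndexable` `.at()` augmentation. A tiny sketch of both (the values are arbitrary):

```ts
// .at() accepts negative indices that count back from the end.
const segments = ['users', '42', 'tasks'];
console.log(segments.at(-1)); // 'tasks'

// AbortController produces an AbortSignal to hand to cancellable APIs.
const controller = new AbortController();
const { signal } = controller;
setTimeout(() => controller.abort(), 5_000).unref();
console.log(signal.aborted); // false until abort() has been called
```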
+ */ + getFunctionName(): string | null; + + /** + * Name of the property [of "this" or one of its prototypes] that holds + * the current function + */ + getMethodName(): string | null; + + /** + * Name of the script [if this function was defined in a script] + */ + getFileName(): string | null; + + /** + * Current line number [if this function was defined in a script] + */ + getLineNumber(): number | null; + + /** + * Current column number [if this function was defined in a script] + */ + getColumnNumber(): number | null; + + /** + * A call site object representing the location where eval was called + * [if this function was created using a call to eval] + */ + getEvalOrigin(): string | undefined; + + /** + * Is this a toplevel invocation, that is, is "this" the global object? + */ + isToplevel(): boolean; + + /** + * Does this call take place in code defined by a call to eval? + */ + isEval(): boolean; + + /** + * Is this call in native V8 code? + */ + isNative(): boolean; + + /** + * Is this a constructor call? + */ + isConstructor(): boolean; + } + + interface ErrnoException extends Error { + errno?: number | undefined; + code?: string | undefined; + path?: string | undefined; + syscall?: string | undefined; + } + + interface ReadableStream extends EventEmitter { + readable: boolean; + read(size?: number): string | Buffer; + setEncoding(encoding: BufferEncoding): this; + pause(): this; + resume(): this; + isPaused(): boolean; + pipe(destination: T, options?: { end?: boolean | undefined; }): T; + unpipe(destination?: WritableStream): this; + unshift(chunk: string | Uint8Array, encoding?: BufferEncoding): void; + wrap(oldStream: ReadableStream): this; + [Symbol.asyncIterator](): AsyncIterableIterator; + } + + interface WritableStream extends EventEmitter { + writable: boolean; + write(buffer: Uint8Array | string, cb?: (err?: Error | null) => void): boolean; + write(str: string, encoding?: BufferEncoding, cb?: (err?: Error | null) => void): boolean; + end(cb?: () => void): void; + end(data: string | Uint8Array, cb?: () => void): void; + end(str: string, encoding?: BufferEncoding, cb?: () => void): void; + } + + interface ReadWriteStream extends ReadableStream, WritableStream { } + + interface RefCounted { + ref(): this; + unref(): this; + } + + type TypedArray = + | Uint8Array + | Uint8ClampedArray + | Uint16Array + | Uint32Array + | Int8Array + | Int16Array + | Int32Array + | BigUint64Array + | BigInt64Array + | Float32Array + | Float64Array; + type ArrayBufferView = TypedArray | DataView; + + interface Require { + (id: string): any; + resolve: RequireResolve; + cache: Dict; + /** + * @deprecated + */ + extensions: RequireExtensions; + main: Module | undefined; + } + + interface RequireResolve { + (id: string, options?: { paths?: string[] | undefined; }): string; + paths(request: string): string[] | null; + } + + interface RequireExtensions extends Dict<(m: Module, filename: string) => any> { + '.js': (m: Module, filename: string) => any; + '.json': (m: Module, filename: string) => any; + '.node': (m: Module, filename: string) => any; + } + interface Module { + /** + * `true` if the module is running during the Node.js preload + */ + isPreloading: boolean; + exports: any; + require: Require; + id: string; + filename: string; + loaded: boolean; + /** @deprecated since 14.6.0 Please use `require.main` and `module.children` instead. */ + parent: Module | null | undefined; + children: Module[]; + /** + * @since 11.14.0 + * + * The directory name of the module. 
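`NodeJS.ErrnoException` above adds `errno`, `code`, `path`, and `syscall` to `Error`. A minimal sketch of narrowing a caught error to branch on its code; the helper name and path handling are illustrative:

```ts
import { readFile } from 'fs/promises';

// Treat a missing file as "no content" instead of an error.
async function readIfPresent(path: string): Promise<string | null> {
  try {
    return await readFile(path, 'utf8');
  } catch (err) {
    const e = err as NodeJS.ErrnoException;
    if (e.code === 'ENOENT') return null; // file simply does not exist
    throw err;
  }
}
```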
This is usually the same as the path.dirname() of the module.id. + */ + path: string; + paths: string[]; + } + + interface Dict { + [key: string]: T | undefined; + } + + interface ReadOnlyDict { + readonly [key: string]: T | undefined; + } +} diff --git a/node_modules/@types/node/globals.global.d.ts b/node_modules/@types/node/globals.global.d.ts new file mode 100644 index 000000000..ef1198c05 --- /dev/null +++ b/node_modules/@types/node/globals.global.d.ts @@ -0,0 +1 @@ +declare var global: typeof globalThis; diff --git a/node_modules/@types/node/http.d.ts b/node_modules/@types/node/http.d.ts new file mode 100644 index 000000000..68a63ba13 --- /dev/null +++ b/node_modules/@types/node/http.d.ts @@ -0,0 +1,1368 @@ +/** + * To use the HTTP server and client one must `require('http')`. + * + * The HTTP interfaces in Node.js are designed to support many features + * of the protocol which have been traditionally difficult to use. + * In particular, large, possibly chunk-encoded, messages. The interface is + * careful to never buffer entire requests or responses, so the + * user is able to stream data. + * + * HTTP message headers are represented by an object like this: + * + * ```js + * { 'content-length': '123', + * 'content-type': 'text/plain', + * 'connection': 'keep-alive', + * 'host': 'mysite.com', + * 'accept': '*' } + * ``` + * + * Keys are lowercased. Values are not modified. + * + * In order to support the full spectrum of possible HTTP applications, the Node.js + * HTTP API is very low-level. It deals with stream handling and message + * parsing only. It parses a message into headers and body but it does not + * parse the actual headers or the body. + * + * See `message.headers` for details on how duplicate headers are handled. + * + * The raw headers as they were received are retained in the `rawHeaders`property, which is an array of `[key, value, key2, value2, ...]`. 
For + * example, the previous message header object might have a `rawHeaders`list like the following: + * + * ```js + * [ 'ConTent-Length', '123456', + * 'content-LENGTH', '123', + * 'content-type', 'text/plain', + * 'CONNECTION', 'keep-alive', + * 'Host', 'mysite.com', + * 'accepT', '*' ] + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/http.js) + */ +declare module 'http' { + import * as stream from 'node:stream'; + import { URL } from 'node:url'; + import { Socket, Server as NetServer } from 'node:net'; + // incoming headers will never contain number + interface IncomingHttpHeaders extends NodeJS.Dict { + accept?: string | undefined; + 'accept-language'?: string | undefined; + 'accept-patch'?: string | undefined; + 'accept-ranges'?: string | undefined; + 'access-control-allow-credentials'?: string | undefined; + 'access-control-allow-headers'?: string | undefined; + 'access-control-allow-methods'?: string | undefined; + 'access-control-allow-origin'?: string | undefined; + 'access-control-expose-headers'?: string | undefined; + 'access-control-max-age'?: string | undefined; + 'access-control-request-headers'?: string | undefined; + 'access-control-request-method'?: string | undefined; + age?: string | undefined; + allow?: string | undefined; + 'alt-svc'?: string | undefined; + authorization?: string | undefined; + 'cache-control'?: string | undefined; + connection?: string | undefined; + 'content-disposition'?: string | undefined; + 'content-encoding'?: string | undefined; + 'content-language'?: string | undefined; + 'content-length'?: string | undefined; + 'content-location'?: string | undefined; + 'content-range'?: string | undefined; + 'content-type'?: string | undefined; + cookie?: string | undefined; + date?: string | undefined; + etag?: string | undefined; + expect?: string | undefined; + expires?: string | undefined; + forwarded?: string | undefined; + from?: string | undefined; + host?: string | undefined; + 'if-match'?: string | undefined; + 'if-modified-since'?: string | undefined; + 'if-none-match'?: string | undefined; + 'if-unmodified-since'?: string | undefined; + 'last-modified'?: string | undefined; + location?: string | undefined; + origin?: string | undefined; + pragma?: string | undefined; + 'proxy-authenticate'?: string | undefined; + 'proxy-authorization'?: string | undefined; + 'public-key-pins'?: string | undefined; + range?: string | undefined; + referer?: string | undefined; + 'retry-after'?: string | undefined; + 'sec-websocket-accept'?: string | undefined; + 'sec-websocket-extensions'?: string | undefined; + 'sec-websocket-key'?: string | undefined; + 'sec-websocket-protocol'?: string | undefined; + 'sec-websocket-version'?: string | undefined; + 'set-cookie'?: string[] | undefined; + 'strict-transport-security'?: string | undefined; + tk?: string | undefined; + trailer?: string | undefined; + 'transfer-encoding'?: string | undefined; + upgrade?: string | undefined; + 'user-agent'?: string | undefined; + vary?: string | undefined; + via?: string | undefined; + warning?: string | undefined; + 'www-authenticate'?: string | undefined; + } + // outgoing headers allows numbers (as they are converted internally to strings) + type OutgoingHttpHeader = number | string | string[]; + interface OutgoingHttpHeaders extends NodeJS.Dict {} + interface ClientRequestArgs { + abort?: AbortSignal | undefined; + protocol?: string | null | undefined; + host?: string | null | undefined; + hostname?: string | null | undefined; + family?: number | undefined; 
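`IncomingHttpHeaders` above models the lowercased keys found on `req.headers`. A minimal sketch of reading them in a request handler; port 3000 and the response body are illustrative:

```ts
import { createServer } from 'http';

// Header names on req.headers are lowercased; values arrive as received.
const server = createServer((req, res) => {
  const userAgent = req.headers['user-agent'] ?? 'unknown';
  const cookies = req.headers.cookie; // a single string, if present
  res.setHeader('Content-Type', 'text/plain');
  res.end(`agent=${userAgent} cookies=${cookies ?? 'none'}\n`);
});
server.listen(3000);
```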
+ port?: number | string | null | undefined; + defaultPort?: number | string | undefined; + localAddress?: string | undefined; + socketPath?: string | undefined; + /** + * @default 8192 + */ + maxHeaderSize?: number | undefined; + method?: string | undefined; + path?: string | null | undefined; + headers?: OutgoingHttpHeaders | undefined; + auth?: string | null | undefined; + agent?: Agent | boolean | undefined; + _defaultAgent?: Agent | undefined; + timeout?: number | undefined; + setHost?: boolean | undefined; + // https://github.com/nodejs/node/blob/master/lib/_http_client.js#L278 + createConnection?: ((options: ClientRequestArgs, oncreate: (err: Error, socket: Socket) => void) => Socket) | undefined; + } + interface ServerOptions { + IncomingMessage?: typeof IncomingMessage | undefined; + ServerResponse?: typeof ServerResponse | undefined; + /** + * Optionally overrides the value of + * `--max-http-header-size` for requests received by this server, i.e. + * the maximum length of request headers in bytes. + * @default 8192 + */ + maxHeaderSize?: number | undefined; + /** + * Use an insecure HTTP parser that accepts invalid HTTP headers when true. + * Using the insecure parser should be avoided. + * See --insecure-http-parser for more information. + * @default false + */ + insecureHTTPParser?: boolean | undefined; + } + type RequestListener = (req: IncomingMessage, res: ServerResponse) => void; + /** + * @since v0.1.17 + */ + class Server extends NetServer { + constructor(requestListener?: RequestListener); + constructor(options: ServerOptions, requestListener?: RequestListener); + /** + * Sets the timeout value for sockets, and emits a `'timeout'` event on + * the Server object, passing the socket as an argument, if a timeout + * occurs. + * + * If there is a `'timeout'` event listener on the Server object, then it + * will be called with the timed-out socket as an argument. + * + * By default, the Server does not timeout sockets. However, if a callback + * is assigned to the Server's `'timeout'` event, timeouts must be handled + * explicitly. + * @since v0.9.12 + * @param [msecs=0 (no timeout)] + */ + setTimeout(msecs?: number, callback?: () => void): this; + setTimeout(callback: () => void): this; + /** + * Limits maximum incoming headers count. If set to 0, no limit will be applied. + * @since v0.7.0 + */ + maxHeadersCount: number | null; + /** + * The maximum number of requests socket can handle + * before closing keep alive connection. + * + * A value of `null` will disable the limit. + * + * When limit is reach it will set `Connection` header value to `closed`, + * but will not actually close the connection, subsequent requests sent + * after the limit is reached will get `503 Service Unavailable` as a response. + * @since v16.10.0 + */ + maxRequestsPerSocket: number | null; + /** + * The number of milliseconds of inactivity before a socket is presumed + * to have timed out. + * + * A value of `0` will disable the timeout behavior on incoming connections. + * + * The socket timeout logic is set up on connection, so changing this + * value only affects new connections to the server, not any existing connections. + * @since v0.9.12 + */ + timeout: number; + /** + * Limit the amount of time the parser will wait to receive the complete HTTP + * headers. + * + * In case of inactivity, the rules defined in `server.timeout` apply. 
However, + * that inactivity based timeout would still allow the connection to be kept open + * if the headers are being sent very slowly (by default, up to a byte per 2 + * minutes). In order to prevent this, whenever header data arrives an additional + * check is made that more than `server.headersTimeout` milliseconds has not + * passed since the connection was established. If the check fails, a `'timeout'`event is emitted on the server object, and (by default) the socket is destroyed. + * See `server.timeout` for more information on how timeout behavior can be + * customized. + * @since v11.3.0, v10.14.0 + */ + headersTimeout: number; + /** + * The number of milliseconds of inactivity a server needs to wait for additional + * incoming data, after it has finished writing the last response, before a socket + * will be destroyed. If the server receives new data before the keep-alive + * timeout has fired, it will reset the regular inactivity timeout, i.e.,`server.timeout`. + * + * A value of `0` will disable the keep-alive timeout behavior on incoming + * connections. + * A value of `0` makes the http server behave similarly to Node.js versions prior + * to 8.0.0, which did not have a keep-alive timeout. + * + * The socket timeout logic is set up on connection, so changing this value only + * affects new connections to the server, not any existing connections. + * @since v8.0.0 + */ + keepAliveTimeout: number; + /** + * Sets the timeout value in milliseconds for receiving the entire request from + * the client. + * + * If the timeout expires, the server responds with status 408 without + * forwarding the request to the request listener and then closes the connection. + * + * It must be set to a non-zero value (e.g. 120 seconds) to protect against + * potential Denial-of-Service attacks in case the server is deployed without a + * reverse proxy in front. 
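The `headersTimeout`, `keepAliveTimeout`, and `requestTimeout` properties documented above are plain assignable numbers on an `http.Server`. A sketch with illustrative values only; `headersTimeout` is commonly kept a little above `keepAliveTimeout`:

```ts
import { createServer } from 'http';

const server = createServer((req, res) => res.end('ok\n'));

server.keepAliveTimeout = 5_000;  // idle time allowed on kept-alive sockets
server.headersTimeout = 6_000;    // time allowed to receive the full header block
server.requestTimeout = 120_000;  // deadline for receiving the entire request
server.setTimeout(30_000);        // per-socket inactivity timeout ('timeout' event)

server.listen(3000);
```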
+ * @since v14.11.0 + */ + requestTimeout: number; + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'connection', listener: (socket: Socket) => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'listening', listener: () => void): this; + addListener(event: 'checkContinue', listener: RequestListener): this; + addListener(event: 'checkExpectation', listener: RequestListener): this; + addListener(event: 'clientError', listener: (err: Error, socket: stream.Duplex) => void): this; + addListener(event: 'connect', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + addListener(event: 'request', listener: RequestListener): this; + addListener(event: 'upgrade', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + emit(event: string, ...args: any[]): boolean; + emit(event: 'close'): boolean; + emit(event: 'connection', socket: Socket): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'listening'): boolean; + emit(event: 'checkContinue', req: IncomingMessage, res: ServerResponse): boolean; + emit(event: 'checkExpectation', req: IncomingMessage, res: ServerResponse): boolean; + emit(event: 'clientError', err: Error, socket: stream.Duplex): boolean; + emit(event: 'connect', req: IncomingMessage, socket: stream.Duplex, head: Buffer): boolean; + emit(event: 'request', req: IncomingMessage, res: ServerResponse): boolean; + emit(event: 'upgrade', req: IncomingMessage, socket: stream.Duplex, head: Buffer): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'connection', listener: (socket: Socket) => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'listening', listener: () => void): this; + on(event: 'checkContinue', listener: RequestListener): this; + on(event: 'checkExpectation', listener: RequestListener): this; + on(event: 'clientError', listener: (err: Error, socket: stream.Duplex) => void): this; + on(event: 'connect', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + on(event: 'request', listener: RequestListener): this; + on(event: 'upgrade', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'connection', listener: (socket: Socket) => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'listening', listener: () => void): this; + once(event: 'checkContinue', listener: RequestListener): this; + once(event: 'checkExpectation', listener: RequestListener): this; + once(event: 'clientError', listener: (err: Error, socket: stream.Duplex) => void): this; + once(event: 'connect', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + once(event: 'request', listener: RequestListener): this; + once(event: 'upgrade', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'connection', listener: (socket: Socket) => void): this; + prependListener(event: 'error', listener: (err: Error) => void): 
this; + prependListener(event: 'listening', listener: () => void): this; + prependListener(event: 'checkContinue', listener: RequestListener): this; + prependListener(event: 'checkExpectation', listener: RequestListener): this; + prependListener(event: 'clientError', listener: (err: Error, socket: stream.Duplex) => void): this; + prependListener(event: 'connect', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + prependListener(event: 'request', listener: RequestListener): this; + prependListener(event: 'upgrade', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'connection', listener: (socket: Socket) => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'listening', listener: () => void): this; + prependOnceListener(event: 'checkContinue', listener: RequestListener): this; + prependOnceListener(event: 'checkExpectation', listener: RequestListener): this; + prependOnceListener(event: 'clientError', listener: (err: Error, socket: stream.Duplex) => void): this; + prependOnceListener(event: 'connect', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + prependOnceListener(event: 'request', listener: RequestListener): this; + prependOnceListener(event: 'upgrade', listener: (req: IncomingMessage, socket: stream.Duplex, head: Buffer) => void): this; + } + /** + * This class serves as the parent class of {@link ClientRequest} and {@link ServerResponse}. It is an abstract of outgoing message from + * the perspective of the participants of HTTP transaction. + * @since v0.1.17 + */ + class OutgoingMessage extends stream.Writable { + readonly req: IncomingMessage; + chunkedEncoding: boolean; + shouldKeepAlive: boolean; + useChunkedEncodingByDefault: boolean; + sendDate: boolean; + /** + * @deprecated Use `writableEnded` instead. + */ + finished: boolean; + /** + * Read-only. `true` if the headers were sent, otherwise `false`. + * @since v0.9.3 + */ + readonly headersSent: boolean; + /** + * Aliases of `outgoingMessage.socket` + * @since v0.3.0 + * @deprecated Since v15.12.0 - Use `socket` instead. + */ + readonly connection: Socket | null; + /** + * Reference to the underlying socket. Usually, users will not want to access + * this property. + * + * After calling `outgoingMessage.end()`, this property will be nulled. + * @since v0.3.0 + */ + readonly socket: Socket | null; + constructor(); + /** + * Once a socket is associated with the message and is connected,`socket.setTimeout()` will be called with `msecs` as the first parameter. + * @since v0.9.12 + * @param callback Optional function to be called when a timeout occurs. Same as binding to the `timeout` event. + */ + setTimeout(msecs: number, callback?: () => void): this; + /** + * Sets a single header value for the header object. + * @since v0.4.0 + * @param name Header name + * @param value Header value + */ + setHeader(name: string, value: number | string | ReadonlyArray): this; + /** + * Gets the value of HTTP header with the given name. If such a name doesn't + * exist in message, it will be `undefined`. + * @since v0.4.0 + * @param name Name of header + */ + getHeader(name: string): number | string | string[] | undefined; + /** + * Returns a shallow copy of the current outgoing headers. 
Since a shallow + * copy is used, array values may be mutated without additional calls to + * various header-related HTTP module methods. The keys of the returned + * object are the header names and the values are the respective header + * values. All header names are lowercase. + * + * The object returned by the `outgoingMessage.getHeaders()` method does + * not prototypically inherit from the JavaScript Object. This means that + * typical Object methods such as `obj.toString()`, `obj.hasOwnProperty()`, + * and others are not defined and will not work. + * + * ```js + * outgoingMessage.setHeader('Foo', 'bar'); + * outgoingMessage.setHeader('Set-Cookie', ['foo=bar', 'bar=baz']); + * + * const headers = outgoingMessage.getHeaders(); + * // headers === { foo: 'bar', 'set-cookie': ['foo=bar', 'bar=baz'] } + * ``` + * @since v8.0.0 + */ + getHeaders(): OutgoingHttpHeaders; + /** + * Returns an array of names of headers of the outgoing outgoingMessage. All + * names are lowercase. + * @since v8.0.0 + */ + getHeaderNames(): string[]; + /** + * Returns `true` if the header identified by `name` is currently set in the + * outgoing headers. The header name is case-insensitive. + * + * ```js + * const hasContentType = outgoingMessage.hasHeader('content-type'); + * ``` + * @since v8.0.0 + */ + hasHeader(name: string): boolean; + /** + * Removes a header that is queued for implicit sending. + * + * ```js + * outgoingMessage.removeHeader('Content-Encoding'); + * ``` + * @since v0.4.0 + */ + removeHeader(name: string): void; + /** + * Adds HTTP trailers (headers but at the end of the message) to the message. + * + * Trailers are **only** be emitted if the message is chunked encoded. If not, + * the trailer will be silently discarded. + * + * HTTP requires the `Trailer` header to be sent to emit trailers, + * with a list of header fields in its value, e.g. + * + * ```js + * message.writeHead(200, { 'Content-Type': 'text/plain', + * 'Trailer': 'Content-MD5' }); + * message.write(fileData); + * message.addTrailers({ 'Content-MD5': '7895bf4b8828b55ceaf47747b4bca667' }); + * message.end(); + * ``` + * + * Attempting to set a header field name or value that contains invalid characters + * will result in a `TypeError` being thrown. + * @since v0.3.0 + */ + addTrailers(headers: OutgoingHttpHeaders | ReadonlyArray<[string, string]>): void; + /** + * Compulsorily flushes the message headers + * + * For efficiency reason, Node.js normally buffers the message headers + * until `outgoingMessage.end()` is called or the first chunk of message data + * is written. It then tries to pack the headers and data into a single TCP + * packet. + * + * It is usually desired (it saves a TCP round-trip), but not when the first + * data is not sent until possibly much later. `outgoingMessage.flushHeaders()`bypasses the optimization and kickstarts the request. + * @since v1.6.0 + */ + flushHeaders(): void; + } + /** + * This object is created internally by an HTTP server, not by the user. It is + * passed as the second parameter to the `'request'` event. + * @since v0.1.17 + */ + class ServerResponse extends OutgoingMessage { + /** + * When using implicit headers (not calling `response.writeHead()` explicitly), + * this property controls the status code that will be sent to the client when + * the headers get flushed. + * + * ```js + * response.statusCode = 404; + * ``` + * + * After response header was sent to the client, this property indicates the + * status code which was sent out. 
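For reference, a minimal sketch of the header accessors documented above (`setHeader`, `hasHeader`, `removeHeader`, `getHeaderNames`) as used on a `ServerResponse`, which inherits them from `OutgoingMessage`; the `X-Request-Id` header and the port are illustrative only:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  res.setHeader('Content-Type', 'text/plain');
  res.setHeader('X-Request-Id', 'abc123');   // illustrative header name/value
  if (res.hasHeader('x-request-id')) {       // header lookups are case-insensitive
    res.removeHeader('X-Request-Id');
  }
  console.log(res.getHeaderNames());         // [ 'content-type' ]
  res.statusCode = 200;                      // implicit header, flushed with the body
  res.end('ok');
});

server.listen(8080);
```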
+ * @since v0.4.0 + */ + statusCode: number; + /** + * When using implicit headers (not calling `response.writeHead()` explicitly), + * this property controls the status message that will be sent to the client when + * the headers get flushed. If this is left as `undefined` then the standard + * message for the status code will be used. + * + * ```js + * response.statusMessage = 'Not found'; + * ``` + * + * After response header was sent to the client, this property indicates the + * status message which was sent out. + * @since v0.11.8 + */ + statusMessage: string; + constructor(req: IncomingMessage); + assignSocket(socket: Socket): void; + detachSocket(socket: Socket): void; + /** + * Sends a HTTP/1.1 100 Continue message to the client, indicating that + * the request body should be sent. See the `'checkContinue'` event on`Server`. + * @since v0.3.0 + */ + writeContinue(callback?: () => void): void; + /** + * Sends a response header to the request. The status code is a 3-digit HTTP + * status code, like `404`. The last argument, `headers`, are the response headers. + * Optionally one can give a human-readable `statusMessage` as the second + * argument. + * + * `headers` may be an `Array` where the keys and values are in the same list. + * It is _not_ a list of tuples. So, the even-numbered offsets are key values, + * and the odd-numbered offsets are the associated values. The array is in the same + * format as `request.rawHeaders`. + * + * Returns a reference to the `ServerResponse`, so that calls can be chained. + * + * ```js + * const body = 'hello world'; + * response + * .writeHead(200, { + * 'Content-Length': Buffer.byteLength(body), + * 'Content-Type': 'text/plain' + * }) + * .end(body); + * ``` + * + * This method must only be called once on a message and it must + * be called before `response.end()` is called. + * + * If `response.write()` or `response.end()` are called before calling + * this, the implicit/mutable headers will be calculated and call this function. + * + * When headers have been set with `response.setHeader()`, they will be merged + * with any headers passed to `response.writeHead()`, with the headers passed + * to `response.writeHead()` given precedence. + * + * If this method is called and `response.setHeader()` has not been called, + * it will directly write the supplied header values onto the network channel + * without caching internally, and the `response.getHeader()` on the header + * will not yield the expected result. If progressive population of headers is + * desired with potential future retrieval and modification, use `response.setHeader()` instead. + * + * ```js + * // Returns content-type = text/plain + * const server = http.createServer((req, res) => { + * res.setHeader('Content-Type', 'text/html'); + * res.setHeader('X-Foo', 'bar'); + * res.writeHead(200, { 'Content-Type': 'text/plain' }); + * res.end('ok'); + * }); + * ``` + * + * `Content-Length` is given in bytes, not characters. Use `Buffer.byteLength()` to determine the length of the body in bytes. Node.js + * does not check whether `Content-Length` and the length of the body which has + * been transmitted are equal or not. + * + * Attempting to set a header field name or value that contains invalid characters + * will result in a `TypeError` being thrown. 
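A small sketch pairing `response.writeContinue()` with the server's `'checkContinue'` event described earlier; the port is illustrative, and without such a listener Node.js answers `Expect: 100-continue` requests automatically:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  res.writeHead(200, { 'Content-Type': 'text/plain' });
  res.end('ok');
});

// Fires for requests sent with `Expect: 100-continue`; when a listener is
// attached, it is responsible for handling the request itself.
server.on('checkContinue', (req, res) => {
  res.writeContinue();                       // ask the client to send the body
  req.on('data', () => {});                  // consume the body
  req.on('end', () => res.end('received'));
});

server.listen(8080);
```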
+ * @since v0.1.30 + */ + writeHead(statusCode: number, statusMessage?: string, headers?: OutgoingHttpHeaders | OutgoingHttpHeader[]): this; + writeHead(statusCode: number, headers?: OutgoingHttpHeaders | OutgoingHttpHeader[]): this; + /** + * Sends a HTTP/1.1 102 Processing message to the client, indicating that + * the request body should be sent. + * @since v10.0.0 + */ + writeProcessing(): void; + } + interface InformationEvent { + statusCode: number; + statusMessage: string; + httpVersion: string; + httpVersionMajor: number; + httpVersionMinor: number; + headers: IncomingHttpHeaders; + rawHeaders: string[]; + } + /** + * This object is created internally and returned from {@link request}. It + * represents an _in-progress_ request whose header has already been queued. The + * header is still mutable using the `setHeader(name, value)`,`getHeader(name)`, `removeHeader(name)` API. The actual header will + * be sent along with the first data chunk or when calling `request.end()`. + * + * To get the response, add a listener for `'response'` to the request object.`'response'` will be emitted from the request object when the response + * headers have been received. The `'response'` event is executed with one + * argument which is an instance of {@link IncomingMessage}. + * + * During the `'response'` event, one can add listeners to the + * response object; particularly to listen for the `'data'` event. + * + * If no `'response'` handler is added, then the response will be + * entirely discarded. However, if a `'response'` event handler is added, + * then the data from the response object **must** be consumed, either by + * calling `response.read()` whenever there is a `'readable'` event, or + * by adding a `'data'` handler, or by calling the `.resume()` method. + * Until the data is consumed, the `'end'` event will not fire. Also, until + * the data is read it will consume memory that can eventually lead to a + * 'process out of memory' error. + * + * For backward compatibility, `res` will only emit `'error'` if there is an`'error'` listener registered. + * + * Node.js does not check whether Content-Length and the length of the + * body which has been transmitted are equal or not. + * @since v0.1.17 + */ + class ClientRequest extends OutgoingMessage { + /** + * The `request.aborted` property will be `true` if the request has + * been aborted. + * @since v0.11.14 + */ + aborted: boolean; + /** + * The request host. + * @since v14.5.0, v12.19.0 + */ + host: string; + /** + * The request protocol. + * @since v14.5.0, v12.19.0 + */ + protocol: string; + constructor(url: string | URL | ClientRequestArgs, cb?: (res: IncomingMessage) => void); + /** + * The request method. + * @since v0.1.97 + */ + method: string; + /** + * The request path. + * @since v0.4.0 + */ + path: string; + /** + * Marks the request as aborting. Calling this will cause remaining data + * in the response to be dropped and the socket to be destroyed. + * @since v0.3.8 + * @deprecated Since v14.1.0,v13.14.0 - Use `destroy` instead. + */ + abort(): void; + onSocket(socket: Socket): void; + /** + * Once a socket is assigned to this request and is connected `socket.setTimeout()` will be called. + * @since v0.5.9 + * @param timeout Milliseconds before a request times out. + * @param callback Optional function to be called when a timeout occurs. Same as binding to the `'timeout'` event. 
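A brief sketch of the socket-tuning helpers declared here (`setTimeout`, `setNoDelay`, `setSocketKeepAlive`) on a `ClientRequest`; the host and timing values are illustrative:

```js
const http = require('http');

const req = http.request({ hostname: 'example.com', path: '/' }, (res) => {
  res.resume();                              // drain the response so 'end' can fire
});

req.setTimeout(5000, () => {
  // The timeout only emits an event; the request must be ended explicitly.
  req.destroy(new Error('request timed out'));
});
req.setNoDelay(true);                        // disable Nagle's algorithm on the socket
req.setSocketKeepAlive(true, 1000);          // enable TCP keep-alive probes
req.on('error', (err) => console.error(err.message));
req.end();
```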
+ */ + setTimeout(timeout: number, callback?: () => void): this; + /** + * Once a socket is assigned to this request and is connected `socket.setNoDelay()` will be called. + * @since v0.5.9 + */ + setNoDelay(noDelay?: boolean): void; + /** + * Once a socket is assigned to this request and is connected `socket.setKeepAlive()` will be called. + * @since v0.5.9 + */ + setSocketKeepAlive(enable?: boolean, initialDelay?: number): void; + /** + * Returns an array containing the unique names of the current outgoing raw + * headers. Header names are returned with their exact casing being set. + * + * ```js + * request.setHeader('Foo', 'bar'); + * request.setHeader('Set-Cookie', ['foo=bar', 'bar=baz']); + * + * const headerNames = request.getRawHeaderNames(); + * // headerNames === ['Foo', 'Set-Cookie'] + * ``` + * @since v15.13.0 + */ + getRawHeaderNames(): string[]; + addListener(event: 'abort', listener: () => void): this; + addListener(event: 'connect', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + addListener(event: 'continue', listener: () => void): this; + addListener(event: 'information', listener: (info: InformationEvent) => void): this; + addListener(event: 'response', listener: (response: IncomingMessage) => void): this; + addListener(event: 'socket', listener: (socket: Socket) => void): this; + addListener(event: 'timeout', listener: () => void): this; + addListener(event: 'upgrade', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'drain', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'finish', listener: () => void): this; + addListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + addListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + on(event: 'abort', listener: () => void): this; + on(event: 'connect', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + on(event: 'continue', listener: () => void): this; + on(event: 'information', listener: (info: InformationEvent) => void): this; + on(event: 'response', listener: (response: IncomingMessage) => void): this; + on(event: 'socket', listener: (socket: Socket) => void): this; + on(event: 'timeout', listener: () => void): this; + on(event: 'upgrade', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'drain', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'finish', listener: () => void): this; + on(event: 'pipe', listener: (src: stream.Readable) => void): this; + on(event: 'unpipe', listener: (src: stream.Readable) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'abort', listener: () => void): this; + once(event: 'connect', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + once(event: 'continue', listener: () => void): this; + once(event: 'information', listener: (info: InformationEvent) => void): this; + once(event: 'response', listener: (response: IncomingMessage) => void): this; + once(event: 'socket', listener: (socket: Socket) => void): this; + once(event: 'timeout', listener: () => void): this; + once(event: 'upgrade', 
listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'drain', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'finish', listener: () => void): this; + once(event: 'pipe', listener: (src: stream.Readable) => void): this; + once(event: 'unpipe', listener: (src: stream.Readable) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'abort', listener: () => void): this; + prependListener(event: 'connect', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + prependListener(event: 'continue', listener: () => void): this; + prependListener(event: 'information', listener: (info: InformationEvent) => void): this; + prependListener(event: 'response', listener: (response: IncomingMessage) => void): this; + prependListener(event: 'socket', listener: (socket: Socket) => void): this; + prependListener(event: 'timeout', listener: () => void): this; + prependListener(event: 'upgrade', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'drain', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'finish', listener: () => void): this; + prependListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'abort', listener: () => void): this; + prependOnceListener(event: 'connect', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + prependOnceListener(event: 'continue', listener: () => void): this; + prependOnceListener(event: 'information', listener: (info: InformationEvent) => void): this; + prependOnceListener(event: 'response', listener: (response: IncomingMessage) => void): this; + prependOnceListener(event: 'socket', listener: (socket: Socket) => void): this; + prependOnceListener(event: 'timeout', listener: () => void): this; + prependOnceListener(event: 'upgrade', listener: (response: IncomingMessage, socket: Socket, head: Buffer) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'drain', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'finish', listener: () => void): this; + prependOnceListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + /** + * An `IncomingMessage` object is created by {@link Server} or {@link ClientRequest} and passed as the first argument to the `'request'` and `'response'` event respectively. It may be used to + * access response + * status, headers and data. + * + * Different from its `socket` value which is a subclass of `stream.Duplex`, the`IncomingMessage` itself extends `stream.Readable` and is created separately to + * parse and emit the incoming HTTP headers and payload, as the underlying socket + * may be reused multiple times in case of keep-alive. 
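A minimal sketch of consuming an `IncomingMessage` as a server request, reading `method`, `url`, `headers`, and the streamed body; the port is illustrative:

```js
const http = require('http');

const server = http.createServer((req, res) => {
  let body = '';
  req.setEncoding('utf8');
  req.on('data', (chunk) => { body += chunk; });
  req.on('end', () => {
    console.log(req.method, req.url, req.headers['content-type']);
    res.end(`received ${Buffer.byteLength(body)} bytes`);
  });
});

server.listen(8080);
```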
+ * @since v0.1.17 + */ + class IncomingMessage extends stream.Readable { + constructor(socket: Socket); + /** + * The `message.aborted` property will be `true` if the request has + * been aborted. + * @since v10.1.0 + */ + aborted: boolean; + /** + * In case of server request, the HTTP version sent by the client. In the case of + * client response, the HTTP version of the connected-to server. + * Probably either `'1.1'` or `'1.0'`. + * + * Also `message.httpVersionMajor` is the first integer and`message.httpVersionMinor` is the second. + * @since v0.1.1 + */ + httpVersion: string; + httpVersionMajor: number; + httpVersionMinor: number; + /** + * The `message.complete` property will be `true` if a complete HTTP message has + * been received and successfully parsed. + * + * This property is particularly useful as a means of determining if a client or + * server fully transmitted a message before a connection was terminated: + * + * ```js + * const req = http.request({ + * host: '127.0.0.1', + * port: 8080, + * method: 'POST' + * }, (res) => { + * res.resume(); + * res.on('end', () => { + * if (!res.complete) + * console.error( + * 'The connection was terminated while the message was still being sent'); + * }); + * }); + * ``` + * @since v0.3.0 + */ + complete: boolean; + /** + * Alias for `message.socket`. + * @since v0.1.90 + * @deprecated Since v16.0.0 - Use `socket`. + */ + connection: Socket; + /** + * The `net.Socket` object associated with the connection. + * + * With HTTPS support, use `request.socket.getPeerCertificate()` to obtain the + * client's authentication details. + * + * This property is guaranteed to be an instance of the `net.Socket` class, + * a subclass of `stream.Duplex`, unless the user specified a socket + * type other than `net.Socket`. + * @since v0.3.0 + */ + socket: Socket; + /** + * The request/response headers object. + * + * Key-value pairs of header names and values. Header names are lower-cased. + * + * ```js + * // Prints something like: + * // + * // { 'user-agent': 'curl/7.22.0', + * // host: '127.0.0.1:8000', + * // accept: '*' } + * console.log(request.headers); + * ``` + * + * Duplicates in raw headers are handled in the following ways, depending on the + * header name: + * + * * Duplicates of `age`, `authorization`, `content-length`, `content-type`,`etag`, `expires`, `from`, `host`, `if-modified-since`, `if-unmodified-since`,`last-modified`, `location`, + * `max-forwards`, `proxy-authorization`, `referer`,`retry-after`, `server`, or `user-agent` are discarded. + * * `set-cookie` is always an array. Duplicates are added to the array. + * * For duplicate `cookie` headers, the values are joined together with '; '. + * * For all other headers, the values are joined together with ', '. + * @since v0.1.5 + */ + headers: IncomingHttpHeaders; + /** + * The raw request/response headers list exactly as they were received. + * + * The keys and values are in the same list. It is _not_ a + * list of tuples. So, the even-numbered offsets are key values, and the + * odd-numbered offsets are the associated values. + * + * Header names are not lowercased, and duplicates are not merged. + * + * ```js + * // Prints something like: + * // + * // [ 'user-agent', + * // 'this is invalid because there can be only one', + * // 'User-Agent', + * // 'curl/7.22.0', + * // 'Host', + * // '127.0.0.1:8000', + * // 'ACCEPT', + * // '*' ] + * console.log(request.rawHeaders); + * ``` + * @since v0.11.6 + */ + rawHeaders: string[]; + /** + * The request/response trailers object. 
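A short sketch showing that `message.trailers` is only populated once the `'end'` event has fired; the host is illustrative, and the output assumes the server actually sends trailers:

```js
const http = require('http');

http.get({ hostname: 'example.com', path: '/' }, (res) => {
  res.resume();                              // the body must be consumed for 'end' to fire
  res.on('end', () => {
    // Empty object unless the response carried trailing headers.
    console.log(res.trailers);
  });
});
```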
Only populated at the `'end'` event. + * @since v0.3.0 + */ + trailers: NodeJS.Dict; + /** + * The raw request/response trailer keys and values exactly as they were + * received. Only populated at the `'end'` event. + * @since v0.11.6 + */ + rawTrailers: string[]; + /** + * Calls `message.socket.setTimeout(msecs, callback)`. + * @since v0.5.9 + */ + setTimeout(msecs: number, callback?: () => void): this; + /** + * **Only valid for request obtained from {@link Server}.** + * + * The request method as a string. Read only. Examples: `'GET'`, `'DELETE'`. + * @since v0.1.1 + */ + method?: string | undefined; + /** + * **Only valid for request obtained from {@link Server}.** + * + * Request URL string. This contains only the URL that is present in the actual + * HTTP request. Take the following request: + * + * ```http + * GET /status?name=ryan HTTP/1.1 + * Accept: text/plain + * ``` + * + * To parse the URL into its parts: + * + * ```js + * new URL(request.url, `http://${request.headers.host}`); + * ``` + * + * When `request.url` is `'/status?name=ryan'` and`request.headers.host` is `'localhost:3000'`: + * + * ```console + * $ node + * > new URL(request.url, `http://${request.headers.host}`) + * URL { + * href: 'http://localhost:3000/status?name=ryan', + * origin: 'http://localhost:3000', + * protocol: 'http:', + * username: '', + * password: '', + * host: 'localhost:3000', + * hostname: 'localhost', + * port: '3000', + * pathname: '/status', + * search: '?name=ryan', + * searchParams: URLSearchParams { 'name' => 'ryan' }, + * hash: '' + * } + * ``` + * @since v0.1.90 + */ + url?: string | undefined; + /** + * **Only valid for response obtained from {@link ClientRequest}.** + * + * The 3-digit HTTP response status code. E.G. `404`. + * @since v0.1.1 + */ + statusCode?: number | undefined; + /** + * **Only valid for response obtained from {@link ClientRequest}.** + * + * The HTTP response status message (reason phrase). E.G. `OK` or `Internal Server Error`. + * @since v0.11.10 + */ + statusMessage?: string | undefined; + /** + * Calls `destroy()` on the socket that received the `IncomingMessage`. If `error`is provided, an `'error'` event is emitted on the socket and `error` is passed + * as an argument to any listeners on the event. + * @since v0.3.0 + */ + destroy(error?: Error): void; + } + interface AgentOptions { + /** + * Keep sockets around in a pool to be used by other requests in the future. Default = false + */ + keepAlive?: boolean | undefined; + /** + * When using HTTP KeepAlive, how often to send TCP KeepAlive packets over sockets being kept alive. Default = 1000. + * Only relevant if keepAlive is set to true. + */ + keepAliveMsecs?: number | undefined; + /** + * Maximum number of sockets to allow per host. Default for Node 0.10 is 5, default for Node 0.12 is Infinity + */ + maxSockets?: number | undefined; + /** + * Maximum number of sockets allowed for all hosts in total. Each request will use a new socket until the maximum is reached. Default: Infinity. + */ + maxTotalSockets?: number | undefined; + /** + * Maximum number of sockets to leave open in a free state. Only relevant if keepAlive is set to true. Default = 256. + */ + maxFreeSockets?: number | undefined; + /** + * Socket timeout in milliseconds. This will set the timeout after the socket is connected. + */ + timeout?: number | undefined; + /** + * Scheduling strategy to apply when picking the next free socket to use. 
+ * @default `lifo` + */ + scheduling?: 'fifo' | 'lifo' | undefined; + } + /** + * An `Agent` is responsible for managing connection persistence + * and reuse for HTTP clients. It maintains a queue of pending requests + * for a given host and port, reusing a single socket connection for each + * until the queue is empty, at which time the socket is either destroyed + * or put into a pool where it is kept to be used again for requests to the + * same host and port. Whether it is destroyed or pooled depends on the`keepAlive` `option`. + * + * Pooled connections have TCP Keep-Alive enabled for them, but servers may + * still close idle connections, in which case they will be removed from the + * pool and a new connection will be made when a new HTTP request is made for + * that host and port. Servers may also refuse to allow multiple requests + * over the same connection, in which case the connection will have to be + * remade for every request and cannot be pooled. The `Agent` will still make + * the requests to that server, but each one will occur over a new connection. + * + * When a connection is closed by the client or the server, it is removed + * from the pool. Any unused sockets in the pool will be unrefed so as not + * to keep the Node.js process running when there are no outstanding requests. + * (see `socket.unref()`). + * + * It is good practice, to `destroy()` an `Agent` instance when it is no + * longer in use, because unused sockets consume OS resources. + * + * Sockets are removed from an agent when the socket emits either + * a `'close'` event or an `'agentRemove'` event. When intending to keep one + * HTTP request open for a long time without keeping it in the agent, something + * like the following may be done: + * + * ```js + * http.get(options, (res) => { + * // Do stuff + * }).on('socket', (socket) => { + * socket.emit('agentRemove'); + * }); + * ``` + * + * An agent may also be used for an individual request. By providing`{agent: false}` as an option to the `http.get()` or `http.request()`functions, a one-time use `Agent` with default options + * will be used + * for the client connection. + * + * `agent:false`: + * + * ```js + * http.get({ + * hostname: 'localhost', + * port: 80, + * path: '/', + * agent: false // Create a new agent just for this one request + * }, (res) => { + * // Do stuff with response + * }); + * ``` + * @since v0.3.4 + */ + class Agent { + /** + * By default set to 256\. For agents with `keepAlive` enabled, this + * sets the maximum number of sockets that will be left open in the free + * state. + * @since v0.11.7 + */ + maxFreeSockets: number; + /** + * By default set to `Infinity`. Determines how many concurrent sockets the agent + * can have open per origin. Origin is the returned value of `agent.getName()`. + * @since v0.3.6 + */ + maxSockets: number; + /** + * By default set to `Infinity`. Determines how many concurrent sockets the agent + * can have open. Unlike `maxSockets`, this parameter applies across all origins. + * @since v14.5.0, v12.19.0 + */ + maxTotalSockets: number; + /** + * An object which contains arrays of sockets currently awaiting use by + * the agent when `keepAlive` is enabled. Do not modify. + * + * Sockets in the `freeSockets` list will be automatically destroyed and + * removed from the array on `'timeout'`. + * @since v0.11.4 + */ + readonly freeSockets: NodeJS.ReadOnlyDict; + /** + * An object which contains arrays of sockets currently in use by the + * agent. Do not modify. 
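A minimal sketch of constructing an `Agent` with the options listed above and passing it to a request; the option values and host are illustrative:

```js
const http = require('http');

const agent = new http.Agent({
  keepAlive: true,         // pool sockets instead of destroying them after each response
  keepAliveMsecs: 1000,
  maxSockets: 10,
  maxFreeSockets: 5,
  scheduling: 'lifo',
});

http.get({ hostname: 'example.com', path: '/', agent }, (res) => {
  res.resume();
  res.on('end', () => agent.destroy());      // release pooled sockets when done
});
```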
+ * @since v0.3.6 + */ + readonly sockets: NodeJS.ReadOnlyDict; + /** + * An object which contains queues of requests that have not yet been assigned to + * sockets. Do not modify. + * @since v0.5.9 + */ + readonly requests: NodeJS.ReadOnlyDict; + constructor(opts?: AgentOptions); + /** + * Destroy any sockets that are currently in use by the agent. + * + * It is usually not necessary to do this. However, if using an + * agent with `keepAlive` enabled, then it is best to explicitly shut down + * the agent when it is no longer needed. Otherwise, + * sockets might stay open for quite a long time before the server + * terminates them. + * @since v0.11.4 + */ + destroy(): void; + } + const METHODS: string[]; + const STATUS_CODES: { + [errorCode: number]: string | undefined; + [errorCode: string]: string | undefined; + }; + /** + * Returns a new instance of {@link Server}. + * + * The `requestListener` is a function which is automatically + * added to the `'request'` event. + * @since v0.1.13 + */ + function createServer(requestListener?: RequestListener): Server; + function createServer(options: ServerOptions, requestListener?: RequestListener): Server; + // although RequestOptions are passed as ClientRequestArgs to ClientRequest directly, + // create interface RequestOptions would make the naming more clear to developers + interface RequestOptions extends ClientRequestArgs {} + /** + * Node.js maintains several connections per server to make HTTP requests. + * This function allows one to transparently issue requests. + * + * `url` can be a string or a `URL` object. If `url` is a + * string, it is automatically parsed with `new URL()`. If it is a `URL` object, it will be automatically converted to an ordinary `options` object. + * + * If both `url` and `options` are specified, the objects are merged, with the`options` properties taking precedence. + * + * The optional `callback` parameter will be added as a one-time listener for + * the `'response'` event. + * + * `http.request()` returns an instance of the {@link ClientRequest} class. The `ClientRequest` instance is a writable stream. If one needs to + * upload a file with a POST request, then write to the `ClientRequest` object. + * + * ```js + * const http = require('http'); + * + * const postData = JSON.stringify({ + * 'msg': 'Hello World!' + * }); + * + * const options = { + * hostname: 'www.google.com', + * port: 80, + * path: '/upload', + * method: 'POST', + * headers: { + * 'Content-Type': 'application/json', + * 'Content-Length': Buffer.byteLength(postData) + * } + * }; + * + * const req = http.request(options, (res) => { + * console.log(`STATUS: ${res.statusCode}`); + * console.log(`HEADERS: ${JSON.stringify(res.headers)}`); + * res.setEncoding('utf8'); + * res.on('data', (chunk) => { + * console.log(`BODY: ${chunk}`); + * }); + * res.on('end', () => { + * console.log('No more data in response.'); + * }); + * }); + * + * req.on('error', (e) => { + * console.error(`problem with request: ${e.message}`); + * }); + * + * // Write data to request body + * req.write(postData); + * req.end(); + * ``` + * + * In the example `req.end()` was called. With `http.request()` one + * must always call `req.end()` to signify the end of the request - + * even if there is no data being written to the request body. + * + * If any error is encountered during the request (be that with DNS resolution, + * TCP level errors, or actual HTTP parse errors) an `'error'` event is emitted + * on the returned request object. 
As with all `'error'` events, if no listeners + * are registered the error will be thrown. + * + * There are a few special headers that should be noted. + * + * * Sending a 'Connection: keep-alive' will notify Node.js that the connection to + * the server should be persisted until the next request. + * * Sending a 'Content-Length' header will disable the default chunked encoding. + * * Sending an 'Expect' header will immediately send the request headers. + * Usually, when sending 'Expect: 100-continue', both a timeout and a listener + * for the `'continue'` event should be set. See RFC 2616 Section 8.2.3 for more + * information. + * * Sending an Authorization header will override using the `auth` option + * to compute basic authentication. + * + * Example using a `URL` as `options`: + * + * ```js + * const options = new URL('http://abc:xyz@example.com'); + * + * const req = http.request(options, (res) => { + * // ... + * }); + * ``` + * + * In a successful request, the following events will be emitted in the following + * order: + * + * * `'socket'` + * * `'response'` + * * `'data'` any number of times, on the `res` object + * (`'data'` will not be emitted at all if the response body is empty, for + * instance, in most redirects) + * * `'end'` on the `res` object + * * `'close'` + * + * In the case of a connection error, the following events will be emitted: + * + * * `'socket'` + * * `'error'` + * * `'close'` + * + * In the case of a premature connection close before the response is received, + * the following events will be emitted in the following order: + * + * * `'socket'` + * * `'error'` with an error with message `'Error: socket hang up'` and code`'ECONNRESET'` + * * `'close'` + * + * In the case of a premature connection close after the response is received, + * the following events will be emitted in the following order: + * + * * `'socket'` + * * `'response'` + * * `'data'` any number of times, on the `res` object + * * (connection closed here) + * * `'aborted'` on the `res` object + * * `'error'` on the `res` object with an error with message`'Error: aborted'` and code `'ECONNRESET'`. + * * `'close'` + * * `'close'` on the `res` object + * + * If `req.destroy()` is called before a socket is assigned, the following + * events will be emitted in the following order: + * + * * (`req.destroy()` called here) + * * `'error'` with an error with message `'Error: socket hang up'` and code`'ECONNRESET'` + * * `'close'` + * + * If `req.destroy()` is called before the connection succeeds, the following + * events will be emitted in the following order: + * + * * `'socket'` + * * (`req.destroy()` called here) + * * `'error'` with an error with message `'Error: socket hang up'` and code`'ECONNRESET'` + * * `'close'` + * + * If `req.destroy()` is called after the response is received, the following + * events will be emitted in the following order: + * + * * `'socket'` + * * `'response'` + * * `'data'` any number of times, on the `res` object + * * (`req.destroy()` called here) + * * `'aborted'` on the `res` object + * * `'error'` on the `res` object with an error with message`'Error: aborted'` and code `'ECONNRESET'`. 
+ * * `'close'` + * * `'close'` on the `res` object + * + * If `req.abort()` is called before a socket is assigned, the following + * events will be emitted in the following order: + * + * * (`req.abort()` called here) + * * `'abort'` + * * `'close'` + * + * If `req.abort()` is called before the connection succeeds, the following + * events will be emitted in the following order: + * + * * `'socket'` + * * (`req.abort()` called here) + * * `'abort'` + * * `'error'` with an error with message `'Error: socket hang up'` and code`'ECONNRESET'` + * * `'close'` + * + * If `req.abort()` is called after the response is received, the following + * events will be emitted in the following order: + * + * * `'socket'` + * * `'response'` + * * `'data'` any number of times, on the `res` object + * * (`req.abort()` called here) + * * `'abort'` + * * `'aborted'` on the `res` object + * * `'error'` on the `res` object with an error with message`'Error: aborted'` and code `'ECONNRESET'`. + * * `'close'` + * * `'close'` on the `res` object + * + * Setting the `timeout` option or using the `setTimeout()` function will + * not abort the request or do anything besides add a `'timeout'` event. + * + * Passing an `AbortSignal` and then calling `abort` on the corresponding`AbortController` will behave the same way as calling `.destroy()` on the + * request itself. + * @since v0.3.6 + */ + function request(options: RequestOptions | string | URL, callback?: (res: IncomingMessage) => void): ClientRequest; + function request(url: string | URL, options: RequestOptions, callback?: (res: IncomingMessage) => void): ClientRequest; + /** + * Since most requests are GET requests without bodies, Node.js provides this + * convenience method. The only difference between this method and {@link request} is that it sets the method to GET and calls `req.end()`automatically. The callback must take care to consume the + * response + * data for reasons stated in {@link ClientRequest} section. + * + * The `callback` is invoked with a single argument that is an instance of {@link IncomingMessage}. + * + * JSON fetching example: + * + * ```js + * http.get('http://localhost:8000/', (res) => { + * const { statusCode } = res; + * const contentType = res.headers['content-type']; + * + * let error; + * // Any 2xx status code signals a successful response but + * // here we're only checking for 200. + * if (statusCode !== 200) { + * error = new Error('Request Failed.\n' + + * `Status Code: ${statusCode}`); + * } else if (!/^application\/json/.test(contentType)) { + * error = new Error('Invalid content-type.\n' + + * `Expected application/json but received ${contentType}`); + * } + * if (error) { + * console.error(error.message); + * // Consume response data to free up memory + * res.resume(); + * return; + * } + * + * res.setEncoding('utf8'); + * let rawData = ''; + * res.on('data', (chunk) => { rawData += chunk; }); + * res.on('end', () => { + * try { + * const parsedData = JSON.parse(rawData); + * console.log(parsedData); + * } catch (e) { + * console.error(e.message); + * } + * }); + * }).on('error', (e) => { + * console.error(`Got error: ${e.message}`); + * }); + * + * // Create a local server to receive data from + * const server = http.createServer((req, res) => { + * res.writeHead(200, { 'Content-Type': 'application/json' }); + * res.end(JSON.stringify({ + * data: 'Hello World!' 
+ * })); + * }); + * + * server.listen(8000); + * ``` + * @since v0.3.6 + * @param options Accepts the same `options` as {@link request}, with the `method` always set to `GET`. Properties that are inherited from the prototype are ignored. + */ + function get(options: RequestOptions | string | URL, callback?: (res: IncomingMessage) => void): ClientRequest; + function get(url: string | URL, options: RequestOptions, callback?: (res: IncomingMessage) => void): ClientRequest; + let globalAgent: Agent; + /** + * Read-only property specifying the maximum allowed size of HTTP headers in bytes. + * Defaults to 16KB. Configurable using the `--max-http-header-size` CLI option. + */ + const maxHeaderSize: number; +} +declare module 'node:http' { + export * from 'http'; +} diff --git a/node_modules/@types/node/http2.d.ts b/node_modules/@types/node/http2.d.ts new file mode 100644 index 000000000..3aa9497e0 --- /dev/null +++ b/node_modules/@types/node/http2.d.ts @@ -0,0 +1,2100 @@ +/** + * The `http2` module provides an implementation of the [HTTP/2](https://tools.ietf.org/html/rfc7540) protocol. It + * can be accessed using: + * + * ```js + * const http2 = require('http2'); + * ``` + * @since v8.4.0 + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/http2.js) + */ +declare module 'http2' { + import EventEmitter = require('node:events'); + import * as fs from 'node:fs'; + import * as net from 'node:net'; + import * as stream from 'node:stream'; + import * as tls from 'node:tls'; + import * as url from 'node:url'; + import { IncomingHttpHeaders as Http1IncomingHttpHeaders, OutgoingHttpHeaders, IncomingMessage, ServerResponse } from 'node:http'; + export { OutgoingHttpHeaders } from 'node:http'; + export interface IncomingHttpStatusHeader { + ':status'?: number | undefined; + } + export interface IncomingHttpHeaders extends Http1IncomingHttpHeaders { + ':path'?: string | undefined; + ':method'?: string | undefined; + ':authority'?: string | undefined; + ':scheme'?: string | undefined; + } + // Http2Stream + export interface StreamPriorityOptions { + exclusive?: boolean | undefined; + parent?: number | undefined; + weight?: number | undefined; + silent?: boolean | undefined; + } + export interface StreamState { + localWindowSize?: number | undefined; + state?: number | undefined; + localClose?: number | undefined; + remoteClose?: number | undefined; + sumDependencyWeight?: number | undefined; + weight?: number | undefined; + } + export interface ServerStreamResponseOptions { + endStream?: boolean | undefined; + waitForTrailers?: boolean | undefined; + } + export interface StatOptions { + offset: number; + length: number; + } + export interface ServerStreamFileResponseOptions { + statCheck?(stats: fs.Stats, headers: OutgoingHttpHeaders, statOptions: StatOptions): void | boolean; + waitForTrailers?: boolean | undefined; + offset?: number | undefined; + length?: number | undefined; + } + export interface ServerStreamFileResponseOptionsWithError extends ServerStreamFileResponseOptions { + onError?(err: NodeJS.ErrnoException): void; + } + export interface Http2Stream extends stream.Duplex { + /** + * Set to `true` if the `Http2Stream` instance was aborted abnormally. When set, + * the `'aborted'` event will have been emitted. + * @since v8.4.0 + */ + readonly aborted: boolean; + /** + * This property shows the number of characters currently buffered to be written. + * See `net.Socket.bufferSize` for details. 
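A minimal HTTP/2 client sketch using the `http2` module introduced here; the server address is illustrative and assumed to be a plaintext HTTP/2 server:

```js
const http2 = require('http2');

const client = http2.connect('http://localhost:8080'); // illustrative address
const req = client.request({ ':method': 'GET', ':path': '/' });

req.setEncoding('utf8');
req.on('response', (headers) => console.log('status:', headers[':status']));

let data = '';
req.on('data', (chunk) => { data += chunk; });
req.on('end', () => {
  console.log(data);
  client.close();
});
req.end();
```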
+ * @since v11.2.0, v10.16.0 + */ + readonly bufferSize: number; + /** + * Set to `true` if the `Http2Stream` instance has been closed. + * @since v9.4.0 + */ + readonly closed: boolean; + /** + * Set to `true` if the `Http2Stream` instance has been destroyed and is no longer + * usable. + * @since v8.4.0 + */ + readonly destroyed: boolean; + /** + * Set the `true` if the `END_STREAM` flag was set in the request or response + * HEADERS frame received, indicating that no additional data should be received + * and the readable side of the `Http2Stream` will be closed. + * @since v10.11.0 + */ + readonly endAfterHeaders: boolean; + /** + * The numeric stream identifier of this `Http2Stream` instance. Set to `undefined`if the stream identifier has not yet been assigned. + * @since v8.4.0 + */ + readonly id?: number | undefined; + /** + * Set to `true` if the `Http2Stream` instance has not yet been assigned a + * numeric stream identifier. + * @since v9.4.0 + */ + readonly pending: boolean; + /** + * Set to the `RST_STREAM` `error code` reported when the `Http2Stream` is + * destroyed after either receiving an `RST_STREAM` frame from the connected peer, + * calling `http2stream.close()`, or `http2stream.destroy()`. Will be`undefined` if the `Http2Stream` has not been closed. + * @since v8.4.0 + */ + readonly rstCode: number; + /** + * An object containing the outbound headers sent for this `Http2Stream`. + * @since v9.5.0 + */ + readonly sentHeaders: OutgoingHttpHeaders; + /** + * An array of objects containing the outbound informational (additional) headers + * sent for this `Http2Stream`. + * @since v9.5.0 + */ + readonly sentInfoHeaders?: OutgoingHttpHeaders[] | undefined; + /** + * An object containing the outbound trailers sent for this `HttpStream`. + * @since v9.5.0 + */ + readonly sentTrailers?: OutgoingHttpHeaders | undefined; + /** + * A reference to the `Http2Session` instance that owns this `Http2Stream`. The + * value will be `undefined` after the `Http2Stream` instance is destroyed. + * @since v8.4.0 + */ + readonly session: Http2Session; + /** + * Provides miscellaneous information about the current state of the`Http2Stream`. + * + * A current state of this `Http2Stream`. + * @since v8.4.0 + */ + readonly state: StreamState; + /** + * Closes the `Http2Stream` instance by sending an `RST_STREAM` frame to the + * connected HTTP/2 peer. + * @since v8.4.0 + * @param [code=http2.constants.NGHTTP2_NO_ERROR] Unsigned 32-bit integer identifying the error code. + * @param callback An optional function registered to listen for the `'close'` event. + */ + close(code?: number, callback?: () => void): void; + /** + * Updates the priority for this `Http2Stream` instance. + * @since v8.4.0 + */ + priority(options: StreamPriorityOptions): void; + /** + * ```js + * const http2 = require('http2'); + * const client = http2.connect('http://example.org:8000'); + * const { NGHTTP2_CANCEL } = http2.constants; + * const req = client.request({ ':path': '/' }); + * + * // Cancel the stream if there's no activity after 5 seconds + * req.setTimeout(5000, () => req.close(NGHTTP2_CANCEL)); + * ``` + * @since v8.4.0 + */ + setTimeout(msecs: number, callback?: () => void): void; + /** + * Sends a trailing `HEADERS` frame to the connected HTTP/2 peer. This method + * will cause the `Http2Stream` to be immediately closed and must only be + * called after the `'wantTrailers'` event has been emitted. 
When sending a + * request or sending a response, the `options.waitForTrailers` option must be set + * in order to keep the `Http2Stream` open after the final `DATA` frame so that + * trailers can be sent. + * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * stream.respond(undefined, { waitForTrailers: true }); + * stream.on('wantTrailers', () => { + * stream.sendTrailers({ xyz: 'abc' }); + * }); + * stream.end('Hello World'); + * }); + * ``` + * + * The HTTP/1 specification forbids trailers from containing HTTP/2 pseudo-header + * fields (e.g. `':method'`, `':path'`, etc). + * @since v10.0.0 + */ + sendTrailers(headers: OutgoingHttpHeaders): void; + addListener(event: 'aborted', listener: () => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + addListener(event: 'drain', listener: () => void): this; + addListener(event: 'end', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'finish', listener: () => void): this; + addListener(event: 'frameError', listener: (frameType: number, errorCode: number) => void): this; + addListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + addListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + addListener(event: 'streamClosed', listener: (code: number) => void): this; + addListener(event: 'timeout', listener: () => void): this; + addListener(event: 'trailers', listener: (trailers: IncomingHttpHeaders, flags: number) => void): this; + addListener(event: 'wantTrailers', listener: () => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'aborted'): boolean; + emit(event: 'close'): boolean; + emit(event: 'data', chunk: Buffer | string): boolean; + emit(event: 'drain'): boolean; + emit(event: 'end'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'finish'): boolean; + emit(event: 'frameError', frameType: number, errorCode: number): boolean; + emit(event: 'pipe', src: stream.Readable): boolean; + emit(event: 'unpipe', src: stream.Readable): boolean; + emit(event: 'streamClosed', code: number): boolean; + emit(event: 'timeout'): boolean; + emit(event: 'trailers', trailers: IncomingHttpHeaders, flags: number): boolean; + emit(event: 'wantTrailers'): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'aborted', listener: () => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'data', listener: (chunk: Buffer | string) => void): this; + on(event: 'drain', listener: () => void): this; + on(event: 'end', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'finish', listener: () => void): this; + on(event: 'frameError', listener: (frameType: number, errorCode: number) => void): this; + on(event: 'pipe', listener: (src: stream.Readable) => void): this; + on(event: 'unpipe', listener: (src: stream.Readable) => void): this; + on(event: 'streamClosed', listener: (code: number) => void): this; + on(event: 'timeout', listener: () => void): this; + on(event: 'trailers', listener: (trailers: IncomingHttpHeaders, flags: number) => void): this; + on(event: 'wantTrailers', listener: () => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'aborted', listener: () => 
void): this; + once(event: 'close', listener: () => void): this; + once(event: 'data', listener: (chunk: Buffer | string) => void): this; + once(event: 'drain', listener: () => void): this; + once(event: 'end', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'finish', listener: () => void): this; + once(event: 'frameError', listener: (frameType: number, errorCode: number) => void): this; + once(event: 'pipe', listener: (src: stream.Readable) => void): this; + once(event: 'unpipe', listener: (src: stream.Readable) => void): this; + once(event: 'streamClosed', listener: (code: number) => void): this; + once(event: 'timeout', listener: () => void): this; + once(event: 'trailers', listener: (trailers: IncomingHttpHeaders, flags: number) => void): this; + once(event: 'wantTrailers', listener: () => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'aborted', listener: () => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + prependListener(event: 'drain', listener: () => void): this; + prependListener(event: 'end', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'finish', listener: () => void): this; + prependListener(event: 'frameError', listener: (frameType: number, errorCode: number) => void): this; + prependListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependListener(event: 'streamClosed', listener: (code: number) => void): this; + prependListener(event: 'timeout', listener: () => void): this; + prependListener(event: 'trailers', listener: (trailers: IncomingHttpHeaders, flags: number) => void): this; + prependListener(event: 'wantTrailers', listener: () => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'aborted', listener: () => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + prependOnceListener(event: 'drain', listener: () => void): this; + prependOnceListener(event: 'end', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'finish', listener: () => void): this; + prependOnceListener(event: 'frameError', listener: (frameType: number, errorCode: number) => void): this; + prependOnceListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: 'streamClosed', listener: (code: number) => void): this; + prependOnceListener(event: 'timeout', listener: () => void): this; + prependOnceListener(event: 'trailers', listener: (trailers: IncomingHttpHeaders, flags: number) => void): this; + prependOnceListener(event: 'wantTrailers', listener: () => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + export interface ClientHttp2Stream extends Http2Stream { + addListener(event: 'continue', listener: () => {}): this; + addListener(event: 'headers', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: 
number) => void): this; + addListener(event: 'push', listener: (headers: IncomingHttpHeaders, flags: number) => void): this; + addListener(event: 'response', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'continue'): boolean; + emit(event: 'headers', headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number): boolean; + emit(event: 'push', headers: IncomingHttpHeaders, flags: number): boolean; + emit(event: 'response', headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'continue', listener: () => {}): this; + on(event: 'headers', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + on(event: 'push', listener: (headers: IncomingHttpHeaders, flags: number) => void): this; + on(event: 'response', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'continue', listener: () => {}): this; + once(event: 'headers', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + once(event: 'push', listener: (headers: IncomingHttpHeaders, flags: number) => void): this; + once(event: 'response', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'continue', listener: () => {}): this; + prependListener(event: 'headers', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + prependListener(event: 'push', listener: (headers: IncomingHttpHeaders, flags: number) => void): this; + prependListener(event: 'response', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'continue', listener: () => {}): this; + prependOnceListener(event: 'headers', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + prependOnceListener(event: 'push', listener: (headers: IncomingHttpHeaders, flags: number) => void): this; + prependOnceListener(event: 'response', listener: (headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + export interface ServerHttp2Stream extends Http2Stream { + /** + * True if headers were sent, false otherwise (read-only). + * @since v8.4.0 + */ + readonly headersSent: boolean; + /** + * Read-only property mapped to the `SETTINGS_ENABLE_PUSH` flag of the remote + * client's most recent `SETTINGS` frame. Will be `true` if the remote peer + * accepts push streams, `false` otherwise. Settings are the same for every`Http2Stream` in the same `Http2Session`. + * @since v8.4.0 + */ + readonly pushAllowed: boolean; + /** + * Sends an additional informational `HEADERS` frame to the connected HTTP/2 peer. + * @since v8.4.0 + */ + additionalHeaders(headers: OutgoingHttpHeaders): void; + /** + * Initiates a push stream. 
The callback is invoked with the new `Http2Stream`instance created for the push stream passed as the second argument, or an`Error` passed as the first argument. + * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * stream.respond({ ':status': 200 }); + * stream.pushStream({ ':path': '/' }, (err, pushStream, headers) => { + * if (err) throw err; + * pushStream.respond({ ':status': 200 }); + * pushStream.end('some pushed data'); + * }); + * stream.end('some data'); + * }); + * ``` + * + * Setting the weight of a push stream is not allowed in the `HEADERS` frame. Pass + * a `weight` value to `http2stream.priority` with the `silent` option set to`true` to enable server-side bandwidth balancing between concurrent streams. + * + * Calling `http2stream.pushStream()` from within a pushed stream is not permitted + * and will throw an error. + * @since v8.4.0 + * @param callback Callback that is called once the push stream has been initiated. + */ + pushStream(headers: OutgoingHttpHeaders, callback?: (err: Error | null, pushStream: ServerHttp2Stream, headers: OutgoingHttpHeaders) => void): void; + pushStream(headers: OutgoingHttpHeaders, options?: StreamPriorityOptions, callback?: (err: Error | null, pushStream: ServerHttp2Stream, headers: OutgoingHttpHeaders) => void): void; + /** + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * stream.respond({ ':status': 200 }); + * stream.end('some data'); + * }); + * ``` + * + * When the `options.waitForTrailers` option is set, the `'wantTrailers'` event + * will be emitted immediately after queuing the last chunk of payload data to be + * sent. The `http2stream.sendTrailers()` method can then be used to sent trailing + * header fields to the peer. + * + * When `options.waitForTrailers` is set, the `Http2Stream` will not automatically + * close when the final `DATA` frame is transmitted. User code must call either`http2stream.sendTrailers()` or `http2stream.close()` to close the`Http2Stream`. + * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * stream.respond({ ':status': 200 }, { waitForTrailers: true }); + * stream.on('wantTrailers', () => { + * stream.sendTrailers({ ABC: 'some value to send' }); + * }); + * stream.end('some data'); + * }); + * ``` + * @since v8.4.0 + */ + respond(headers?: OutgoingHttpHeaders, options?: ServerStreamResponseOptions): void; + /** + * Initiates a response whose data is read from the given file descriptor. No + * validation is performed on the given file descriptor. If an error occurs while + * attempting to read data using the file descriptor, the `Http2Stream` will be + * closed using an `RST_STREAM` frame using the standard `INTERNAL_ERROR` code. + * + * When used, the `Http2Stream` object's `Duplex` interface will be closed + * automatically. 
+ * + * ```js + * const http2 = require('http2'); + * const fs = require('fs'); + * + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * const fd = fs.openSync('/some/file', 'r'); + * + * const stat = fs.fstatSync(fd); + * const headers = { + * 'content-length': stat.size, + * 'last-modified': stat.mtime.toUTCString(), + * 'content-type': 'text/plain; charset=utf-8' + * }; + * stream.respondWithFD(fd, headers); + * stream.on('close', () => fs.closeSync(fd)); + * }); + * ``` + * + * The optional `options.statCheck` function may be specified to give user code + * an opportunity to set additional content headers based on the `fs.Stat` details + * of the given fd. If the `statCheck` function is provided, the`http2stream.respondWithFD()` method will perform an `fs.fstat()` call to + * collect details on the provided file descriptor. + * + * The `offset` and `length` options may be used to limit the response to a + * specific range subset. This can be used, for instance, to support HTTP Range + * requests. + * + * The file descriptor or `FileHandle` is not closed when the stream is closed, + * so it will need to be closed manually once it is no longer needed. + * Using the same file descriptor concurrently for multiple streams + * is not supported and may result in data loss. Re-using a file descriptor + * after a stream has finished is supported. + * + * When the `options.waitForTrailers` option is set, the `'wantTrailers'` event + * will be emitted immediately after queuing the last chunk of payload data to be + * sent. The `http2stream.sendTrailers()` method can then be used to sent trailing + * header fields to the peer. + * + * When `options.waitForTrailers` is set, the `Http2Stream` will not automatically + * close when the final `DATA` frame is transmitted. User code _must_ call either`http2stream.sendTrailers()` or `http2stream.close()` to close the`Http2Stream`. + * + * ```js + * const http2 = require('http2'); + * const fs = require('fs'); + * + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * const fd = fs.openSync('/some/file', 'r'); + * + * const stat = fs.fstatSync(fd); + * const headers = { + * 'content-length': stat.size, + * 'last-modified': stat.mtime.toUTCString(), + * 'content-type': 'text/plain; charset=utf-8' + * }; + * stream.respondWithFD(fd, headers, { waitForTrailers: true }); + * stream.on('wantTrailers', () => { + * stream.sendTrailers({ ABC: 'some value to send' }); + * }); + * + * stream.on('close', () => fs.closeSync(fd)); + * }); + * ``` + * @since v8.4.0 + * @param fd A readable file descriptor. + */ + respondWithFD(fd: number | fs.promises.FileHandle, headers?: OutgoingHttpHeaders, options?: ServerStreamFileResponseOptions): void; + /** + * Sends a regular file as the response. The `path` must specify a regular file + * or an `'error'` event will be emitted on the `Http2Stream` object. + * + * When used, the `Http2Stream` object's `Duplex` interface will be closed + * automatically. + * + * The optional `options.statCheck` function may be specified to give user code + * an opportunity to set additional content headers based on the `fs.Stat` details + * of the given file: + * + * If an error occurs while attempting to read the file data, the `Http2Stream`will be closed using an `RST_STREAM` frame using the standard `INTERNAL_ERROR`code. If the `onError` callback is + * defined, then it will be called. Otherwise + * the stream will be destroyed. 
+ * + * Example using a file path: + * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * function statCheck(stat, headers) { + * headers['last-modified'] = stat.mtime.toUTCString(); + * } + * + * function onError(err) { + * // stream.respond() can throw if the stream has been destroyed by + * // the other side. + * try { + * if (err.code === 'ENOENT') { + * stream.respond({ ':status': 404 }); + * } else { + * stream.respond({ ':status': 500 }); + * } + * } catch (err) { + * // Perform actual error handling. + * console.log(err); + * } + * stream.end(); + * } + * + * stream.respondWithFile('/some/file', + * { 'content-type': 'text/plain; charset=utf-8' }, + * { statCheck, onError }); + * }); + * ``` + * + * The `options.statCheck` function may also be used to cancel the send operation + * by returning `false`. For instance, a conditional request may check the stat + * results to determine if the file has been modified to return an appropriate`304` response: + * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * function statCheck(stat, headers) { + * // Check the stat here... + * stream.respond({ ':status': 304 }); + * return false; // Cancel the send operation + * } + * stream.respondWithFile('/some/file', + * { 'content-type': 'text/plain; charset=utf-8' }, + * { statCheck }); + * }); + * ``` + * + * The `content-length` header field will be automatically set. + * + * The `offset` and `length` options may be used to limit the response to a + * specific range subset. This can be used, for instance, to support HTTP Range + * requests. + * + * The `options.onError` function may also be used to handle all the errors + * that could happen before the delivery of the file is initiated. The + * default behavior is to destroy the stream. + * + * When the `options.waitForTrailers` option is set, the `'wantTrailers'` event + * will be emitted immediately after queuing the last chunk of payload data to be + * sent. The `http2stream.sendTrailers()` method can then be used to sent trailing + * header fields to the peer. + * + * When `options.waitForTrailers` is set, the `Http2Stream` will not automatically + * close when the final `DATA` frame is transmitted. User code must call either`http2stream.sendTrailers()` or `http2stream.close()` to close the`Http2Stream`. 
+ * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer(); + * server.on('stream', (stream) => { + * stream.respondWithFile('/some/file', + * { 'content-type': 'text/plain; charset=utf-8' }, + * { waitForTrailers: true }); + * stream.on('wantTrailers', () => { + * stream.sendTrailers({ ABC: 'some value to send' }); + * }); + * }); + * ``` + * @since v8.4.0 + */ + respondWithFile(path: string, headers?: OutgoingHttpHeaders, options?: ServerStreamFileResponseOptionsWithError): void; + } + // Http2Session + export interface Settings { + headerTableSize?: number | undefined; + enablePush?: boolean | undefined; + initialWindowSize?: number | undefined; + maxFrameSize?: number | undefined; + maxConcurrentStreams?: number | undefined; + maxHeaderListSize?: number | undefined; + enableConnectProtocol?: boolean | undefined; + } + export interface ClientSessionRequestOptions { + endStream?: boolean | undefined; + exclusive?: boolean | undefined; + parent?: number | undefined; + weight?: number | undefined; + waitForTrailers?: boolean | undefined; + } + export interface SessionState { + effectiveLocalWindowSize?: number | undefined; + effectiveRecvDataLength?: number | undefined; + nextStreamID?: number | undefined; + localWindowSize?: number | undefined; + lastProcStreamID?: number | undefined; + remoteWindowSize?: number | undefined; + outboundQueueSize?: number | undefined; + deflateDynamicTableSize?: number | undefined; + inflateDynamicTableSize?: number | undefined; + } + export interface Http2Session extends EventEmitter { + /** + * Value will be `undefined` if the `Http2Session` is not yet connected to a + * socket, `h2c` if the `Http2Session` is not connected to a `TLSSocket`, or + * will return the value of the connected `TLSSocket`'s own `alpnProtocol`property. + * @since v9.4.0 + */ + readonly alpnProtocol?: string | undefined; + /** + * Will be `true` if this `Http2Session` instance has been closed, otherwise`false`. + * @since v9.4.0 + */ + readonly closed: boolean; + /** + * Will be `true` if this `Http2Session` instance is still connecting, will be set + * to `false` before emitting `connect` event and/or calling the `http2.connect`callback. + * @since v10.0.0 + */ + readonly connecting: boolean; + /** + * Will be `true` if this `Http2Session` instance has been destroyed and must no + * longer be used, otherwise `false`. + * @since v8.4.0 + */ + readonly destroyed: boolean; + /** + * Value is `undefined` if the `Http2Session` session socket has not yet been + * connected, `true` if the `Http2Session` is connected with a `TLSSocket`, + * and `false` if the `Http2Session` is connected to any other kind of socket + * or stream. + * @since v9.4.0 + */ + readonly encrypted?: boolean | undefined; + /** + * A prototype-less object describing the current local settings of this`Http2Session`. The local settings are local to _this_`Http2Session` instance. + * @since v8.4.0 + */ + readonly localSettings: Settings; + /** + * If the `Http2Session` is connected to a `TLSSocket`, the `originSet` property + * will return an `Array` of origins for which the `Http2Session` may be + * considered authoritative. + * + * The `originSet` property is only available when using a secure TLS connection. + * @since v9.4.0 + */ + readonly originSet?: string[] | undefined; + /** + * Indicates whether the `Http2Session` is currently waiting for acknowledgment of + * a sent `SETTINGS` frame. Will be `true` after calling the`http2session.settings()` method. 
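+ *
+ * A minimal sketch of the acknowledgment cycle (assuming `session` is an
+ * already-established `Http2Session`):
+ *
+ * ```js
+ * session.settings({ enablePush: false });
+ * console.log(session.pendingSettingsAck); // true until the peer acknowledges
+ * session.on('localSettings', () => {
+ *   console.log(session.pendingSettingsAck); // false once the ACK arrives
+ * });
+ * ```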
Will be `false` once all sent `SETTINGS`frames have been acknowledged. + * @since v8.4.0 + */ + readonly pendingSettingsAck: boolean; + /** + * A prototype-less object describing the current remote settings of this`Http2Session`. The remote settings are set by the _connected_ HTTP/2 peer. + * @since v8.4.0 + */ + readonly remoteSettings: Settings; + /** + * Returns a `Proxy` object that acts as a `net.Socket` (or `tls.TLSSocket`) but + * limits available methods to ones safe to use with HTTP/2. + * + * `destroy`, `emit`, `end`, `pause`, `read`, `resume`, and `write` will throw + * an error with code `ERR_HTTP2_NO_SOCKET_MANIPULATION`. See `Http2Session and Sockets` for more information. + * + * `setTimeout` method will be called on this `Http2Session`. + * + * All other interactions will be routed directly to the socket. + * @since v8.4.0 + */ + readonly socket: net.Socket | tls.TLSSocket; + /** + * Provides miscellaneous information about the current state of the`Http2Session`. + * + * An object describing the current status of this `Http2Session`. + * @since v8.4.0 + */ + readonly state: SessionState; + /** + * The `http2session.type` will be equal to`http2.constants.NGHTTP2_SESSION_SERVER` if this `Http2Session` instance is a + * server, and `http2.constants.NGHTTP2_SESSION_CLIENT` if the instance is a + * client. + * @since v8.4.0 + */ + readonly type: number; + /** + * Gracefully closes the `Http2Session`, allowing any existing streams to + * complete on their own and preventing new `Http2Stream` instances from being + * created. Once closed, `http2session.destroy()`_might_ be called if there + * are no open `Http2Stream` instances. + * + * If specified, the `callback` function is registered as a handler for the`'close'` event. + * @since v9.4.0 + */ + close(callback?: () => void): void; + /** + * Immediately terminates the `Http2Session` and the associated `net.Socket` or`tls.TLSSocket`. + * + * Once destroyed, the `Http2Session` will emit the `'close'` event. If `error`is not undefined, an `'error'` event will be emitted immediately before the`'close'` event. + * + * If there are any remaining open `Http2Streams` associated with the`Http2Session`, those will also be destroyed. + * @since v8.4.0 + * @param error An `Error` object if the `Http2Session` is being destroyed due to an error. + * @param code The HTTP/2 error code to send in the final `GOAWAY` frame. If unspecified, and `error` is not undefined, the default is `INTERNAL_ERROR`, otherwise defaults to `NO_ERROR`. + */ + destroy(error?: Error, code?: number): void; + /** + * Transmits a `GOAWAY` frame to the connected peer _without_ shutting down the`Http2Session`. + * @since v9.4.0 + * @param code An HTTP/2 error code + * @param lastStreamID The numeric ID of the last processed `Http2Stream` + * @param opaqueData A `TypedArray` or `DataView` instance containing additional data to be carried within the `GOAWAY` frame. + */ + goaway(code?: number, lastStreamID?: number, opaqueData?: NodeJS.ArrayBufferView): void; + /** + * Sends a `PING` frame to the connected HTTP/2 peer. A `callback` function must + * be provided. The method will return `true` if the `PING` was sent, `false`otherwise. + * + * The maximum number of outstanding (unacknowledged) pings is determined by the`maxOutstandingPings` configuration option. The default maximum is 10. 
+ * + * If provided, the `payload` must be a `Buffer`, `TypedArray`, or `DataView`containing 8 bytes of data that will be transmitted with the `PING` and + * returned with the ping acknowledgment. + * + * The callback will be invoked with three arguments: an error argument that will + * be `null` if the `PING` was successfully acknowledged, a `duration` argument + * that reports the number of milliseconds elapsed since the ping was sent and the + * acknowledgment was received, and a `Buffer` containing the 8-byte `PING`payload. + * + * ```js + * session.ping(Buffer.from('abcdefgh'), (err, duration, payload) => { + * if (!err) { + * console.log(`Ping acknowledged in ${duration} milliseconds`); + * console.log(`With payload '${payload.toString()}'`); + * } + * }); + * ``` + * + * If the `payload` argument is not specified, the default payload will be the + * 64-bit timestamp (little endian) marking the start of the `PING` duration. + * @since v8.9.3 + * @param payload Optional ping payload. + */ + ping(callback: (err: Error | null, duration: number, payload: Buffer) => void): boolean; + ping(payload: NodeJS.ArrayBufferView, callback: (err: Error | null, duration: number, payload: Buffer) => void): boolean; + /** + * Calls `ref()` on this `Http2Session`instance's underlying `net.Socket`. + * @since v9.4.0 + */ + ref(): void; + /** + * Sets the local endpoint's window size. + * The `windowSize` is the total window size to set, not + * the delta. + * + * ```js + * const http2 = require('http2'); + * + * const server = http2.createServer(); + * const expectedWindowSize = 2 ** 20; + * server.on('connect', (session) => { + * + * // Set local window size to be 2 ** 20 + * session.setLocalWindowSize(expectedWindowSize); + * }); + * ``` + * @since v15.3.0 + */ + setLocalWindowSize(windowSize: number): void; + /** + * Used to set a callback function that is called when there is no activity on + * the `Http2Session` after `msecs` milliseconds. The given `callback` is + * registered as a listener on the `'timeout'` event. + * @since v8.4.0 + */ + setTimeout(msecs: number, callback?: () => void): void; + /** + * Updates the current local settings for this `Http2Session` and sends a new`SETTINGS` frame to the connected HTTP/2 peer. + * + * Once called, the `http2session.pendingSettingsAck` property will be `true`while the session is waiting for the remote peer to acknowledge the new + * settings. + * + * The new settings will not become effective until the `SETTINGS` acknowledgment + * is received and the `'localSettings'` event is emitted. It is possible to send + * multiple `SETTINGS` frames while acknowledgment is still pending. + * @since v8.4.0 + * @param callback Callback that is called once the session is connected or right away if the session is already connected. + */ + settings(settings: Settings): void; + /** + * Calls `unref()` on this `Http2Session`instance's underlying `net.Socket`. 
+ * @since v9.4.0 + */ + unref(): void; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'frameError', listener: (frameType: number, errorCode: number, streamID: number) => void): this; + addListener(event: 'goaway', listener: (errorCode: number, lastStreamID: number, opaqueData: Buffer) => void): this; + addListener(event: 'localSettings', listener: (settings: Settings) => void): this; + addListener(event: 'ping', listener: () => void): this; + addListener(event: 'remoteSettings', listener: (settings: Settings) => void): this; + addListener(event: 'timeout', listener: () => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'close'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'frameError', frameType: number, errorCode: number, streamID: number): boolean; + emit(event: 'goaway', errorCode: number, lastStreamID: number, opaqueData: Buffer): boolean; + emit(event: 'localSettings', settings: Settings): boolean; + emit(event: 'ping'): boolean; + emit(event: 'remoteSettings', settings: Settings): boolean; + emit(event: 'timeout'): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'close', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'frameError', listener: (frameType: number, errorCode: number, streamID: number) => void): this; + on(event: 'goaway', listener: (errorCode: number, lastStreamID: number, opaqueData: Buffer) => void): this; + on(event: 'localSettings', listener: (settings: Settings) => void): this; + on(event: 'ping', listener: () => void): this; + on(event: 'remoteSettings', listener: (settings: Settings) => void): this; + on(event: 'timeout', listener: () => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'frameError', listener: (frameType: number, errorCode: number, streamID: number) => void): this; + once(event: 'goaway', listener: (errorCode: number, lastStreamID: number, opaqueData: Buffer) => void): this; + once(event: 'localSettings', listener: (settings: Settings) => void): this; + once(event: 'ping', listener: () => void): this; + once(event: 'remoteSettings', listener: (settings: Settings) => void): this; + once(event: 'timeout', listener: () => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'frameError', listener: (frameType: number, errorCode: number, streamID: number) => void): this; + prependListener(event: 'goaway', listener: (errorCode: number, lastStreamID: number, opaqueData: Buffer) => void): this; + prependListener(event: 'localSettings', listener: (settings: Settings) => void): this; + prependListener(event: 'ping', listener: () => void): this; + prependListener(event: 'remoteSettings', listener: (settings: Settings) => void): this; + prependListener(event: 'timeout', listener: () => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + 
prependOnceListener(event: 'frameError', listener: (frameType: number, errorCode: number, streamID: number) => void): this; + prependOnceListener(event: 'goaway', listener: (errorCode: number, lastStreamID: number, opaqueData: Buffer) => void): this; + prependOnceListener(event: 'localSettings', listener: (settings: Settings) => void): this; + prependOnceListener(event: 'ping', listener: () => void): this; + prependOnceListener(event: 'remoteSettings', listener: (settings: Settings) => void): this; + prependOnceListener(event: 'timeout', listener: () => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + export interface ClientHttp2Session extends Http2Session { + /** + * For HTTP/2 Client `Http2Session` instances only, the `http2session.request()`creates and returns an `Http2Stream` instance that can be used to send an + * HTTP/2 request to the connected server. + * + * This method is only available if `http2session.type` is equal to`http2.constants.NGHTTP2_SESSION_CLIENT`. + * + * ```js + * const http2 = require('http2'); + * const clientSession = http2.connect('https://localhost:1234'); + * const { + * HTTP2_HEADER_PATH, + * HTTP2_HEADER_STATUS + * } = http2.constants; + * + * const req = clientSession.request({ [HTTP2_HEADER_PATH]: '/' }); + * req.on('response', (headers) => { + * console.log(headers[HTTP2_HEADER_STATUS]); + * req.on('data', (chunk) => { // .. }); + * req.on('end', () => { // .. }); + * }); + * ``` + * + * When the `options.waitForTrailers` option is set, the `'wantTrailers'` event + * is emitted immediately after queuing the last chunk of payload data to be sent. + * The `http2stream.sendTrailers()` method can then be called to send trailing + * headers to the peer. + * + * When `options.waitForTrailers` is set, the `Http2Stream` will not automatically + * close when the final `DATA` frame is transmitted. User code must call either`http2stream.sendTrailers()` or `http2stream.close()` to close the`Http2Stream`. + * + * When `options.signal` is set with an `AbortSignal` and then `abort` on the + * corresponding `AbortController` is called, the request will emit an `'error'`event with an `AbortError` error. 
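+ *
+ * A minimal sketch of aborting a request (assuming `clientSession` from the
+ * example above and a runtime with a global `AbortController`):
+ *
+ * ```js
+ * const controller = new AbortController();
+ * const req = clientSession.request({ ':path': '/' }, { signal: controller.signal });
+ * req.on('error', (err) => console.error(err.name)); // 'AbortError'
+ * controller.abort();
+ * ```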
+ * + * The `:method` and `:path` pseudo-headers are not specified within `headers`, + * they respectively default to: + * + * * `:method` \= `'GET'` + * * `:path` \= `/` + * @since v8.4.0 + */ + request(headers?: OutgoingHttpHeaders, options?: ClientSessionRequestOptions): ClientHttp2Stream; + addListener(event: 'altsvc', listener: (alt: string, origin: string, stream: number) => void): this; + addListener(event: 'origin', listener: (origins: string[]) => void): this; + addListener(event: 'connect', listener: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + addListener(event: 'stream', listener: (stream: ClientHttp2Stream, headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'altsvc', alt: string, origin: string, stream: number): boolean; + emit(event: 'origin', origins: ReadonlyArray): boolean; + emit(event: 'connect', session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket): boolean; + emit(event: 'stream', stream: ClientHttp2Stream, headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'altsvc', listener: (alt: string, origin: string, stream: number) => void): this; + on(event: 'origin', listener: (origins: string[]) => void): this; + on(event: 'connect', listener: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + on(event: 'stream', listener: (stream: ClientHttp2Stream, headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'altsvc', listener: (alt: string, origin: string, stream: number) => void): this; + once(event: 'origin', listener: (origins: string[]) => void): this; + once(event: 'connect', listener: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + once(event: 'stream', listener: (stream: ClientHttp2Stream, headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'altsvc', listener: (alt: string, origin: string, stream: number) => void): this; + prependListener(event: 'origin', listener: (origins: string[]) => void): this; + prependListener(event: 'connect', listener: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + prependListener(event: 'stream', listener: (stream: ClientHttp2Stream, headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'altsvc', listener: (alt: string, origin: string, stream: number) => void): this; + prependOnceListener(event: 'origin', listener: (origins: string[]) => void): this; + prependOnceListener(event: 'connect', listener: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + prependOnceListener(event: 'stream', listener: (stream: ClientHttp2Stream, headers: IncomingHttpHeaders & IncomingHttpStatusHeader, flags: number) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + export interface AlternativeServiceOptions { + origin: number | string | url.URL; + } + export interface ServerHttp2Session extends 
Http2Session { + readonly server: Http2Server | Http2SecureServer; + /** + * Submits an `ALTSVC` frame (as defined by [RFC 7838](https://tools.ietf.org/html/rfc7838)) to the connected client. + * + * ```js + * const http2 = require('http2'); + * + * const server = http2.createServer(); + * server.on('session', (session) => { + * // Set altsvc for origin https://example.org:80 + * session.altsvc('h2=":8000"', 'https://example.org:80'); + * }); + * + * server.on('stream', (stream) => { + * // Set altsvc for a specific stream + * stream.session.altsvc('h2=":8000"', stream.id); + * }); + * ``` + * + * Sending an `ALTSVC` frame with a specific stream ID indicates that the alternate + * service is associated with the origin of the given `Http2Stream`. + * + * The `alt` and origin string _must_ contain only ASCII bytes and are + * strictly interpreted as a sequence of ASCII bytes. The special value `'clear'`may be passed to clear any previously set alternative service for a given + * domain. + * + * When a string is passed for the `originOrStream` argument, it will be parsed as + * a URL and the origin will be derived. For instance, the origin for the + * HTTP URL `'https://example.org/foo/bar'` is the ASCII string`'https://example.org'`. An error will be thrown if either the given string + * cannot be parsed as a URL or if a valid origin cannot be derived. + * + * A `URL` object, or any object with an `origin` property, may be passed as`originOrStream`, in which case the value of the `origin` property will be + * used. The value of the `origin` property _must_ be a properly serialized + * ASCII origin. + * @since v9.4.0 + * @param alt A description of the alternative service configuration as defined by `RFC 7838`. + * @param originOrStream Either a URL string specifying the origin (or an `Object` with an `origin` property) or the numeric identifier of an active `Http2Stream` as given by the + * `http2stream.id` property. + */ + altsvc(alt: string, originOrStream: number | string | url.URL | AlternativeServiceOptions): void; + /** + * Submits an `ORIGIN` frame (as defined by [RFC 8336](https://tools.ietf.org/html/rfc8336)) to the connected client + * to advertise the set of origins for which the server is capable of providing + * authoritative responses. + * + * ```js + * const http2 = require('http2'); + * const options = getSecureOptionsSomehow(); + * const server = http2.createSecureServer(options); + * server.on('stream', (stream) => { + * stream.respond(); + * stream.end('ok'); + * }); + * server.on('session', (session) => { + * session.origin('https://example.com', 'https://example.org'); + * }); + * ``` + * + * When a string is passed as an `origin`, it will be parsed as a URL and the + * origin will be derived. For instance, the origin for the HTTP URL`'https://example.org/foo/bar'` is the ASCII string`'https://example.org'`. An error will be thrown if either the given + * string + * cannot be parsed as a URL or if a valid origin cannot be derived. + * + * A `URL` object, or any object with an `origin` property, may be passed as + * an `origin`, in which case the value of the `origin` property will be + * used. The value of the `origin` property _must_ be a properly serialized + * ASCII origin. 
+ * + * Alternatively, the `origins` option may be used when creating a new HTTP/2 + * server using the `http2.createSecureServer()` method: + * + * ```js + * const http2 = require('http2'); + * const options = getSecureOptionsSomehow(); + * options.origins = ['https://example.com', 'https://example.org']; + * const server = http2.createSecureServer(options); + * server.on('stream', (stream) => { + * stream.respond(); + * stream.end('ok'); + * }); + * ``` + * @since v10.12.0 + * @param origins One or more URL Strings passed as separate arguments. + */ + origin( + ...origins: Array< + | string + | url.URL + | { + origin: string; + } + > + ): void; + addListener(event: 'connect', listener: (session: ServerHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + addListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'connect', session: ServerHttp2Session, socket: net.Socket | tls.TLSSocket): boolean; + emit(event: 'stream', stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'connect', listener: (session: ServerHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + on(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'connect', listener: (session: ServerHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + once(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'connect', listener: (session: ServerHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + prependListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'connect', listener: (session: ServerHttp2Session, socket: net.Socket | tls.TLSSocket) => void): this; + prependOnceListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + // Http2Server + export interface SessionOptions { + maxDeflateDynamicTableSize?: number | undefined; + maxSessionMemory?: number | undefined; + maxHeaderListPairs?: number | undefined; + maxOutstandingPings?: number | undefined; + maxSendHeaderBlockLength?: number | undefined; + paddingStrategy?: number | undefined; + peerMaxConcurrentStreams?: number | undefined; + settings?: Settings | undefined; + /** + * Specifies a timeout in milliseconds that + * a server should wait when an [`'unknownProtocol'`][] is emitted. If the + * socket has not been destroyed by that time the server will destroy it. 
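+ *
+ * A minimal sketch (assuming `options` already holds valid TLS `key`/`cert`
+ * material for `createSecureServer()`):
+ *
+ * ```js
+ * const http2 = require('http2');
+ * // If an 'unknownProtocol' handler never destroys the socket, the server
+ * // destroys it itself after 1000 ms.
+ * const server = http2.createSecureServer({ ...options, unknownProtocolTimeout: 1000 });
+ * ```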
+ * @default 100000 + */ + unknownProtocolTimeout?: number | undefined; + selectPadding?(frameLen: number, maxFrameLen: number): number; + createConnection?(authority: url.URL, option: SessionOptions): stream.Duplex; + } + export interface ClientSessionOptions extends SessionOptions { + maxReservedRemoteStreams?: number | undefined; + createConnection?: ((authority: url.URL, option: SessionOptions) => stream.Duplex) | undefined; + protocol?: 'http:' | 'https:' | undefined; + } + export interface ServerSessionOptions extends SessionOptions { + Http1IncomingMessage?: typeof IncomingMessage | undefined; + Http1ServerResponse?: typeof ServerResponse | undefined; + Http2ServerRequest?: typeof Http2ServerRequest | undefined; + Http2ServerResponse?: typeof Http2ServerResponse | undefined; + } + export interface SecureClientSessionOptions extends ClientSessionOptions, tls.ConnectionOptions {} + export interface SecureServerSessionOptions extends ServerSessionOptions, tls.TlsOptions {} + export interface ServerOptions extends ServerSessionOptions {} + export interface SecureServerOptions extends SecureServerSessionOptions { + allowHTTP1?: boolean | undefined; + origins?: string[] | undefined; + } + interface HTTP2ServerCommon { + setTimeout(msec?: number, callback?: () => void): this; + /** + * Throws ERR_HTTP2_INVALID_SETTING_VALUE for invalid settings values. + * Throws ERR_INVALID_ARG_TYPE for invalid settings argument. + */ + updateSettings(settings: Settings): void; + } + export interface Http2Server extends net.Server, HTTP2ServerCommon { + addListener(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + addListener(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + addListener(event: 'session', listener: (session: ServerHttp2Session) => void): this; + addListener(event: 'sessionError', listener: (err: Error) => void): this; + addListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + addListener(event: 'timeout', listener: () => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'checkContinue', request: Http2ServerRequest, response: Http2ServerResponse): boolean; + emit(event: 'request', request: Http2ServerRequest, response: Http2ServerResponse): boolean; + emit(event: 'session', session: ServerHttp2Session): boolean; + emit(event: 'sessionError', err: Error): boolean; + emit(event: 'stream', stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number): boolean; + emit(event: 'timeout'): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + on(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + on(event: 'session', listener: (session: ServerHttp2Session) => void): this; + on(event: 'sessionError', listener: (err: Error) => void): this; + on(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + on(event: 'timeout', listener: () => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + once(event: 'request', listener: (request: Http2ServerRequest, 
response: Http2ServerResponse) => void): this; + once(event: 'session', listener: (session: ServerHttp2Session) => void): this; + once(event: 'sessionError', listener: (err: Error) => void): this; + once(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + once(event: 'timeout', listener: () => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependListener(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependListener(event: 'session', listener: (session: ServerHttp2Session) => void): this; + prependListener(event: 'sessionError', listener: (err: Error) => void): this; + prependListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + prependListener(event: 'timeout', listener: () => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependOnceListener(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependOnceListener(event: 'session', listener: (session: ServerHttp2Session) => void): this; + prependOnceListener(event: 'sessionError', listener: (err: Error) => void): this; + prependOnceListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + prependOnceListener(event: 'timeout', listener: () => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + export interface Http2SecureServer extends tls.Server, HTTP2ServerCommon { + addListener(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + addListener(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + addListener(event: 'session', listener: (session: ServerHttp2Session) => void): this; + addListener(event: 'sessionError', listener: (err: Error) => void): this; + addListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + addListener(event: 'timeout', listener: () => void): this; + addListener(event: 'unknownProtocol', listener: (socket: tls.TLSSocket) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'checkContinue', request: Http2ServerRequest, response: Http2ServerResponse): boolean; + emit(event: 'request', request: Http2ServerRequest, response: Http2ServerResponse): boolean; + emit(event: 'session', session: ServerHttp2Session): boolean; + emit(event: 'sessionError', err: Error): boolean; + emit(event: 'stream', stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number): boolean; + emit(event: 'timeout'): boolean; + emit(event: 'unknownProtocol', socket: tls.TLSSocket): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + on(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + 
on(event: 'session', listener: (session: ServerHttp2Session) => void): this; + on(event: 'sessionError', listener: (err: Error) => void): this; + on(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + on(event: 'timeout', listener: () => void): this; + on(event: 'unknownProtocol', listener: (socket: tls.TLSSocket) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + once(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + once(event: 'session', listener: (session: ServerHttp2Session) => void): this; + once(event: 'sessionError', listener: (err: Error) => void): this; + once(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + once(event: 'timeout', listener: () => void): this; + once(event: 'unknownProtocol', listener: (socket: tls.TLSSocket) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependListener(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependListener(event: 'session', listener: (session: ServerHttp2Session) => void): this; + prependListener(event: 'sessionError', listener: (err: Error) => void): this; + prependListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + prependListener(event: 'timeout', listener: () => void): this; + prependListener(event: 'unknownProtocol', listener: (socket: tls.TLSSocket) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'checkContinue', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependOnceListener(event: 'request', listener: (request: Http2ServerRequest, response: Http2ServerResponse) => void): this; + prependOnceListener(event: 'session', listener: (session: ServerHttp2Session) => void): this; + prependOnceListener(event: 'sessionError', listener: (err: Error) => void): this; + prependOnceListener(event: 'stream', listener: (stream: ServerHttp2Stream, headers: IncomingHttpHeaders, flags: number) => void): this; + prependOnceListener(event: 'timeout', listener: () => void): this; + prependOnceListener(event: 'unknownProtocol', listener: (socket: tls.TLSSocket) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + /** + * A `Http2ServerRequest` object is created by {@link Server} or {@link SecureServer} and passed as the first argument to the `'request'` event. It may be used to access a request status, + * headers, and + * data. + * @since v8.4.0 + */ + export class Http2ServerRequest extends stream.Readable { + constructor(stream: ServerHttp2Stream, headers: IncomingHttpHeaders, options: stream.ReadableOptions, rawHeaders: ReadonlyArray); + /** + * The `request.aborted` property will be `true` if the request has + * been aborted. + * @since v10.1.0 + */ + readonly aborted: boolean; + /** + * The request authority pseudo header field. 
Because HTTP/2 allows requests + * to set either `:authority` or `host`, this value is derived from`req.headers[':authority']` if present. Otherwise, it is derived from`req.headers['host']`. + * @since v8.4.0 + */ + readonly authority: string; + /** + * See `request.socket`. + * @since v8.4.0 + * @deprecated Since v13.0.0 - Use `socket`. + */ + readonly connection: net.Socket | tls.TLSSocket; + /** + * The `request.complete` property will be `true` if the request has + * been completed, aborted, or destroyed. + * @since v12.10.0 + */ + readonly complete: boolean; + /** + * The request/response headers object. + * + * Key-value pairs of header names and values. Header names are lower-cased. + * + * ```js + * // Prints something like: + * // + * // { 'user-agent': 'curl/7.22.0', + * // host: '127.0.0.1:8000', + * // accept: '*' } + * console.log(request.headers); + * ``` + * + * See `HTTP/2 Headers Object`. + * + * In HTTP/2, the request path, host name, protocol, and method are represented as + * special headers prefixed with the `:` character (e.g. `':path'`). These special + * headers will be included in the `request.headers` object. Care must be taken not + * to inadvertently modify these special headers or errors may occur. For instance, + * removing all headers from the request will cause errors to occur: + * + * ```js + * removeAllHeaders(request.headers); + * assert(request.url); // Fails because the :path header has been removed + * ``` + * @since v8.4.0 + */ + readonly headers: IncomingHttpHeaders; + /** + * In case of server request, the HTTP version sent by the client. In the case of + * client response, the HTTP version of the connected-to server. Returns`'2.0'`. + * + * Also `message.httpVersionMajor` is the first integer and`message.httpVersionMinor` is the second. + * @since v8.4.0 + */ + readonly httpVersion: string; + readonly httpVersionMinor: number; + readonly httpVersionMajor: number; + /** + * The request method as a string. Read-only. Examples: `'GET'`, `'DELETE'`. + * @since v8.4.0 + */ + readonly method: string; + /** + * The raw request/response headers list exactly as they were received. + * + * The keys and values are in the same list. It is _not_ a + * list of tuples. So, the even-numbered offsets are key values, and the + * odd-numbered offsets are the associated values. + * + * Header names are not lowercased, and duplicates are not merged. + * + * ```js + * // Prints something like: + * // + * // [ 'user-agent', + * // 'this is invalid because there can be only one', + * // 'User-Agent', + * // 'curl/7.22.0', + * // 'Host', + * // '127.0.0.1:8000', + * // 'ACCEPT', + * // '*' ] + * console.log(request.rawHeaders); + * ``` + * @since v8.4.0 + */ + readonly rawHeaders: string[]; + /** + * The raw request/response trailer keys and values exactly as they were + * received. Only populated at the `'end'` event. + * @since v8.4.0 + */ + readonly rawTrailers: string[]; + /** + * The request scheme pseudo header field indicating the scheme + * portion of the target URL. + * @since v8.4.0 + */ + readonly scheme: string; + /** + * Returns a `Proxy` object that acts as a `net.Socket` (or `tls.TLSSocket`) but + * applies getters, setters, and methods based on HTTP/2 logic. + * + * `destroyed`, `readable`, and `writable` properties will be retrieved from and + * set on `request.stream`. + * + * `destroy`, `emit`, `end`, `on` and `once` methods will be called on`request.stream`. + * + * `setTimeout` method will be called on `request.stream.session`. 
+ * + * `pause`, `read`, `resume`, and `write` will throw an error with code`ERR_HTTP2_NO_SOCKET_MANIPULATION`. See `Http2Session and Sockets` for + * more information. + * + * All other interactions will be routed directly to the socket. With TLS support, + * use `request.socket.getPeerCertificate()` to obtain the client's + * authentication details. + * @since v8.4.0 + */ + readonly socket: net.Socket | tls.TLSSocket; + /** + * The `Http2Stream` object backing the request. + * @since v8.4.0 + */ + readonly stream: ServerHttp2Stream; + /** + * The request/response trailers object. Only populated at the `'end'` event. + * @since v8.4.0 + */ + readonly trailers: IncomingHttpHeaders; + /** + * Request URL string. This contains only the URL that is present in the actual + * HTTP request. If the request is: + * + * ```http + * GET /status?name=ryan HTTP/1.1 + * Accept: text/plain + * ``` + * + * Then `request.url` will be: + * + * ```js + * '/status?name=ryan' + * ``` + * + * To parse the url into its parts, `new URL()` can be used: + * + * ```console + * $ node + * > new URL('/status?name=ryan', 'http://example.com') + * URL { + * href: 'http://example.com/status?name=ryan', + * origin: 'http://example.com', + * protocol: 'http:', + * username: '', + * password: '', + * host: 'example.com', + * hostname: 'example.com', + * port: '', + * pathname: '/status', + * search: '?name=ryan', + * searchParams: URLSearchParams { 'name' => 'ryan' }, + * hash: '' + * } + * ``` + * @since v8.4.0 + */ + readonly url: string; + /** + * Sets the `Http2Stream`'s timeout value to `msecs`. If a callback is + * provided, then it is added as a listener on the `'timeout'` event on + * the response object. + * + * If no `'timeout'` listener is added to the request, the response, or + * the server, then `Http2Stream` s are destroyed when they time out. If a + * handler is assigned to the request, the response, or the server's `'timeout'`events, timed out sockets must be handled explicitly. 
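+ *
+ * A minimal sketch of explicit timeout handling (assuming `server` was created
+ * with `http2.createServer()`):
+ *
+ * ```js
+ * server.on('request', (request, response) => {
+ *   request.setTimeout(5000, () => {
+ *     // Respond with 408 Request Timeout and close the stream explicitly.
+ *     response.writeHead(408);
+ *     response.end();
+ *   });
+ * });
+ * ```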
+ * @since v8.4.0 + */ + setTimeout(msecs: number, callback?: () => void): void; + read(size?: number): Buffer | string | null; + addListener(event: 'aborted', listener: (hadError: boolean, code: number) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + addListener(event: 'end', listener: () => void): this; + addListener(event: 'readable', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'aborted', hadError: boolean, code: number): boolean; + emit(event: 'close'): boolean; + emit(event: 'data', chunk: Buffer | string): boolean; + emit(event: 'end'): boolean; + emit(event: 'readable'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'aborted', listener: (hadError: boolean, code: number) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'data', listener: (chunk: Buffer | string) => void): this; + on(event: 'end', listener: () => void): this; + on(event: 'readable', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'aborted', listener: (hadError: boolean, code: number) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'data', listener: (chunk: Buffer | string) => void): this; + once(event: 'end', listener: () => void): this; + once(event: 'readable', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'aborted', listener: (hadError: boolean, code: number) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + prependListener(event: 'end', listener: () => void): this; + prependListener(event: 'readable', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'aborted', listener: (hadError: boolean, code: number) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'data', listener: (chunk: Buffer | string) => void): this; + prependOnceListener(event: 'end', listener: () => void): this; + prependOnceListener(event: 'readable', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + /** + * This object is created internally by an HTTP server, not by the user. It is + * passed as the second parameter to the `'request'` event. + * @since v8.4.0 + */ + export class Http2ServerResponse extends stream.Writable { + constructor(stream: ServerHttp2Stream); + /** + * See `response.socket`. + * @since v8.4.0 + * @deprecated Since v13.0.0 - Use `socket`. + */ + readonly connection: net.Socket | tls.TLSSocket; + /** + * Boolean value that indicates whether the response has completed. Starts + * as `false`. After `response.end()` executes, the value will be `true`. 
+ * @since v8.4.0 + * @deprecated Since v13.4.0,v12.16.0 - Use `writableEnded`. + */ + readonly finished: boolean; + /** + * True if headers were sent, false otherwise (read-only). + * @since v8.4.0 + */ + readonly headersSent: boolean; + /** + * A reference to the original HTTP2 request object. + * @since v15.7.0 + */ + readonly req: Http2ServerRequest; + /** + * Returns a `Proxy` object that acts as a `net.Socket` (or `tls.TLSSocket`) but + * applies getters, setters, and methods based on HTTP/2 logic. + * + * `destroyed`, `readable`, and `writable` properties will be retrieved from and + * set on `response.stream`. + * + * `destroy`, `emit`, `end`, `on` and `once` methods will be called on`response.stream`. + * + * `setTimeout` method will be called on `response.stream.session`. + * + * `pause`, `read`, `resume`, and `write` will throw an error with code`ERR_HTTP2_NO_SOCKET_MANIPULATION`. See `Http2Session and Sockets` for + * more information. + * + * All other interactions will be routed directly to the socket. + * + * ```js + * const http2 = require('http2'); + * const server = http2.createServer((req, res) => { + * const ip = req.socket.remoteAddress; + * const port = req.socket.remotePort; + * res.end(`Your IP address is ${ip} and your source port is ${port}.`); + * }).listen(3000); + * ``` + * @since v8.4.0 + */ + readonly socket: net.Socket | tls.TLSSocket; + /** + * The `Http2Stream` object backing the response. + * @since v8.4.0 + */ + readonly stream: ServerHttp2Stream; + /** + * When true, the Date header will be automatically generated and sent in + * the response if it is not already present in the headers. Defaults to true. + * + * This should only be disabled for testing; HTTP requires the Date header + * in responses. + * @since v8.4.0 + */ + sendDate: boolean; + /** + * When using implicit headers (not calling `response.writeHead()` explicitly), + * this property controls the status code that will be sent to the client when + * the headers get flushed. + * + * ```js + * response.statusCode = 404; + * ``` + * + * After response header was sent to the client, this property indicates the + * status code which was sent out. + * @since v8.4.0 + */ + statusCode: number; + /** + * Status message is not supported by HTTP/2 (RFC 7540 8.1.2.4). It returns + * an empty string. + * @since v8.4.0 + */ + statusMessage: ''; + /** + * This method adds HTTP trailing headers (a header but at the end of the + * message) to the response. + * + * Attempting to set a header field name or value that contains invalid characters + * will result in a `TypeError` being thrown. + * @since v8.4.0 + */ + addTrailers(trailers: OutgoingHttpHeaders): void; + /** + * This method signals to the server that all of the response headers and body + * have been sent; that server should consider this message complete. + * The method, `response.end()`, MUST be called on each response. + * + * If `data` is specified, it is equivalent to calling `response.write(data, encoding)` followed by `response.end(callback)`. + * + * If `callback` is specified, it will be called when the response stream + * is finished. + * @since v8.4.0 + */ + end(callback?: () => void): void; + end(data: string | Uint8Array, callback?: () => void): void; + end(data: string | Uint8Array, encoding: BufferEncoding, callback?: () => void): void; + /** + * Reads out a header that has already been queued but not sent to the client. + * The name is case-insensitive. 
+ * + * ```js + * const contentType = response.getHeader('content-type'); + * ``` + * @since v8.4.0 + */ + getHeader(name: string): string; + /** + * Returns an array containing the unique names of the current outgoing headers. + * All header names are lowercase. + * + * ```js + * response.setHeader('Foo', 'bar'); + * response.setHeader('Set-Cookie', ['foo=bar', 'bar=baz']); + * + * const headerNames = response.getHeaderNames(); + * // headerNames === ['foo', 'set-cookie'] + * ``` + * @since v8.4.0 + */ + getHeaderNames(): string[]; + /** + * Returns a shallow copy of the current outgoing headers. Since a shallow copy + * is used, array values may be mutated without additional calls to various + * header-related http module methods. The keys of the returned object are the + * header names and the values are the respective header values. All header names + * are lowercase. + * + * The object returned by the `response.getHeaders()` method _does not_prototypically inherit from the JavaScript `Object`. This means that typical`Object` methods such as `obj.toString()`, + * `obj.hasOwnProperty()`, and others + * are not defined and _will not work_. + * + * ```js + * response.setHeader('Foo', 'bar'); + * response.setHeader('Set-Cookie', ['foo=bar', 'bar=baz']); + * + * const headers = response.getHeaders(); + * // headers === { foo: 'bar', 'set-cookie': ['foo=bar', 'bar=baz'] } + * ``` + * @since v8.4.0 + */ + getHeaders(): OutgoingHttpHeaders; + /** + * Returns `true` if the header identified by `name` is currently set in the + * outgoing headers. The header name matching is case-insensitive. + * + * ```js + * const hasContentType = response.hasHeader('content-type'); + * ``` + * @since v8.4.0 + */ + hasHeader(name: string): boolean; + /** + * Removes a header that has been queued for implicit sending. + * + * ```js + * response.removeHeader('Content-Encoding'); + * ``` + * @since v8.4.0 + */ + removeHeader(name: string): void; + /** + * Sets a single header value for implicit headers. If this header already exists + * in the to-be-sent headers, its value will be replaced. Use an array of strings + * here to send multiple headers with the same name. + * + * ```js + * response.setHeader('Content-Type', 'text/html; charset=utf-8'); + * ``` + * + * or + * + * ```js + * response.setHeader('Set-Cookie', ['type=ninja', 'language=javascript']); + * ``` + * + * Attempting to set a header field name or value that contains invalid characters + * will result in a `TypeError` being thrown. + * + * When headers have been set with `response.setHeader()`, they will be merged + * with any headers passed to `response.writeHead()`, with the headers passed + * to `response.writeHead()` given precedence. + * + * ```js + * // Returns content-type = text/plain + * const server = http2.createServer((req, res) => { + * res.setHeader('Content-Type', 'text/html; charset=utf-8'); + * res.setHeader('X-Foo', 'bar'); + * res.writeHead(200, { 'Content-Type': 'text/plain; charset=utf-8' }); + * res.end('ok'); + * }); + * ``` + * @since v8.4.0 + */ + setHeader(name: string, value: number | string | ReadonlyArray): void; + /** + * Sets the `Http2Stream`'s timeout value to `msecs`. If a callback is + * provided, then it is added as a listener on the `'timeout'` event on + * the response object. + * + * If no `'timeout'` listener is added to the request, the response, or + * the server, then `Http2Stream` s are destroyed when they time out. 
If a + * handler is assigned to the request, the response, or the server's `'timeout'`events, timed out sockets must be handled explicitly. + * @since v8.4.0 + */ + setTimeout(msecs: number, callback?: () => void): void; + /** + * If this method is called and `response.writeHead()` has not been called, + * it will switch to implicit header mode and flush the implicit headers. + * + * This sends a chunk of the response body. This method may + * be called multiple times to provide successive parts of the body. + * + * In the `http` module, the response body is omitted when the + * request is a HEAD request. Similarly, the `204` and `304` responses_must not_ include a message body. + * + * `chunk` can be a string or a buffer. If `chunk` is a string, + * the second parameter specifies how to encode it into a byte stream. + * By default the `encoding` is `'utf8'`. `callback` will be called when this chunk + * of data is flushed. + * + * This is the raw HTTP body and has nothing to do with higher-level multi-part + * body encodings that may be used. + * + * The first time `response.write()` is called, it will send the buffered + * header information and the first chunk of the body to the client. The second + * time `response.write()` is called, Node.js assumes data will be streamed, + * and sends the new data separately. That is, the response is buffered up to the + * first chunk of the body. + * + * Returns `true` if the entire data was flushed successfully to the kernel + * buffer. Returns `false` if all or part of the data was queued in user memory.`'drain'` will be emitted when the buffer is free again. + * @since v8.4.0 + */ + write(chunk: string | Uint8Array, callback?: (err: Error) => void): boolean; + write(chunk: string | Uint8Array, encoding: BufferEncoding, callback?: (err: Error) => void): boolean; + /** + * Sends a status `100 Continue` to the client, indicating that the request body + * should be sent. See the `'checkContinue'` event on `Http2Server` and`Http2SecureServer`. + * @since v8.4.0 + */ + writeContinue(): void; + /** + * Sends a response header to the request. The status code is a 3-digit HTTP + * status code, like `404`. The last argument, `headers`, are the response headers. + * + * Returns a reference to the `Http2ServerResponse`, so that calls can be chained. + * + * For compatibility with `HTTP/1`, a human-readable `statusMessage` may be + * passed as the second argument. However, because the `statusMessage` has no + * meaning within HTTP/2, the argument will have no effect and a process warning + * will be emitted. + * + * ```js + * const body = 'hello world'; + * response.writeHead(200, { + * 'Content-Length': Buffer.byteLength(body), + * 'Content-Type': 'text/plain; charset=utf-8', + * }); + * ``` + * + * `Content-Length` is given in bytes not characters. The`Buffer.byteLength()` API may be used to determine the number of bytes in a + * given encoding. On outbound messages, Node.js does not check if Content-Length + * and the length of the body being transmitted are equal or not. However, when + * receiving messages, Node.js will automatically reject messages when the`Content-Length` does not match the actual payload size. + * + * This method may be called at most one time on a message before `response.end()` is called. + * + * If `response.write()` or `response.end()` are called before calling + * this, the implicit/mutable headers will be calculated and call this function. 
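+ *
+ * Illustrative sketch only (not part of the upstream docs) of that implicit-header
+ * case: the first `write()` flushes the mutable headers (here, the 201 status)
+ * without `writeHead()` ever being called.
+ *
+ * ```js
+ * const http2 = require('http2');
+ * const server = http2.createServer((req, res) => {
+ *   res.statusCode = 201;
+ *   res.write('created'); // implicit headers are calculated and sent here
+ *   res.end();
+ * });
+ * ```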
+ * + * When headers have been set with `response.setHeader()`, they will be merged + * with any headers passed to `response.writeHead()`, with the headers passed + * to `response.writeHead()` given precedence. + * + * ```js + * // Returns content-type = text/plain + * const server = http2.createServer((req, res) => { + * res.setHeader('Content-Type', 'text/html; charset=utf-8'); + * res.setHeader('X-Foo', 'bar'); + * res.writeHead(200, { 'Content-Type': 'text/plain; charset=utf-8' }); + * res.end('ok'); + * }); + * ``` + * + * Attempting to set a header field name or value that contains invalid characters + * will result in a `TypeError` being thrown. + * @since v8.4.0 + */ + writeHead(statusCode: number, headers?: OutgoingHttpHeaders): this; + writeHead(statusCode: number, statusMessage: string, headers?: OutgoingHttpHeaders): this; + /** + * Call `http2stream.pushStream()` with the given headers, and wrap the + * given `Http2Stream` on a newly created `Http2ServerResponse` as the callback + * parameter if successful. When `Http2ServerRequest` is closed, the callback is + * called with an error `ERR_HTTP2_INVALID_STREAM`. + * @since v8.4.0 + * @param headers An object describing the headers + * @param callback Called once `http2stream.pushStream()` is finished, or either when the attempt to create the pushed `Http2Stream` has failed or has been rejected, or the state of + * `Http2ServerRequest` is closed prior to calling the `http2stream.pushStream()` method + */ + createPushResponse(headers: OutgoingHttpHeaders, callback: (err: Error | null, res: Http2ServerResponse) => void): void; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'drain', listener: () => void): this; + addListener(event: 'error', listener: (error: Error) => void): this; + addListener(event: 'finish', listener: () => void): this; + addListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + addListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'close'): boolean; + emit(event: 'drain'): boolean; + emit(event: 'error', error: Error): boolean; + emit(event: 'finish'): boolean; + emit(event: 'pipe', src: stream.Readable): boolean; + emit(event: 'unpipe', src: stream.Readable): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'close', listener: () => void): this; + on(event: 'drain', listener: () => void): this; + on(event: 'error', listener: (error: Error) => void): this; + on(event: 'finish', listener: () => void): this; + on(event: 'pipe', listener: (src: stream.Readable) => void): this; + on(event: 'unpipe', listener: (src: stream.Readable) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'drain', listener: () => void): this; + once(event: 'error', listener: (error: Error) => void): this; + once(event: 'finish', listener: () => void): this; + once(event: 'pipe', listener: (src: stream.Readable) => void): this; + once(event: 'unpipe', listener: (src: stream.Readable) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'drain', listener: () => void): this; + prependListener(event: 'error', listener: (error: Error) => void): this; + prependListener(event: 'finish', listener: () => void): this; + 
prependListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'drain', listener: () => void): this; + prependOnceListener(event: 'error', listener: (error: Error) => void): this; + prependOnceListener(event: 'finish', listener: () => void): this; + prependOnceListener(event: 'pipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: 'unpipe', listener: (src: stream.Readable) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + export namespace constants { + const NGHTTP2_SESSION_SERVER: number; + const NGHTTP2_SESSION_CLIENT: number; + const NGHTTP2_STREAM_STATE_IDLE: number; + const NGHTTP2_STREAM_STATE_OPEN: number; + const NGHTTP2_STREAM_STATE_RESERVED_LOCAL: number; + const NGHTTP2_STREAM_STATE_RESERVED_REMOTE: number; + const NGHTTP2_STREAM_STATE_HALF_CLOSED_LOCAL: number; + const NGHTTP2_STREAM_STATE_HALF_CLOSED_REMOTE: number; + const NGHTTP2_STREAM_STATE_CLOSED: number; + const NGHTTP2_NO_ERROR: number; + const NGHTTP2_PROTOCOL_ERROR: number; + const NGHTTP2_INTERNAL_ERROR: number; + const NGHTTP2_FLOW_CONTROL_ERROR: number; + const NGHTTP2_SETTINGS_TIMEOUT: number; + const NGHTTP2_STREAM_CLOSED: number; + const NGHTTP2_FRAME_SIZE_ERROR: number; + const NGHTTP2_REFUSED_STREAM: number; + const NGHTTP2_CANCEL: number; + const NGHTTP2_COMPRESSION_ERROR: number; + const NGHTTP2_CONNECT_ERROR: number; + const NGHTTP2_ENHANCE_YOUR_CALM: number; + const NGHTTP2_INADEQUATE_SECURITY: number; + const NGHTTP2_HTTP_1_1_REQUIRED: number; + const NGHTTP2_ERR_FRAME_SIZE_ERROR: number; + const NGHTTP2_FLAG_NONE: number; + const NGHTTP2_FLAG_END_STREAM: number; + const NGHTTP2_FLAG_END_HEADERS: number; + const NGHTTP2_FLAG_ACK: number; + const NGHTTP2_FLAG_PADDED: number; + const NGHTTP2_FLAG_PRIORITY: number; + const DEFAULT_SETTINGS_HEADER_TABLE_SIZE: number; + const DEFAULT_SETTINGS_ENABLE_PUSH: number; + const DEFAULT_SETTINGS_INITIAL_WINDOW_SIZE: number; + const DEFAULT_SETTINGS_MAX_FRAME_SIZE: number; + const MAX_MAX_FRAME_SIZE: number; + const MIN_MAX_FRAME_SIZE: number; + const MAX_INITIAL_WINDOW_SIZE: number; + const NGHTTP2_DEFAULT_WEIGHT: number; + const NGHTTP2_SETTINGS_HEADER_TABLE_SIZE: number; + const NGHTTP2_SETTINGS_ENABLE_PUSH: number; + const NGHTTP2_SETTINGS_MAX_CONCURRENT_STREAMS: number; + const NGHTTP2_SETTINGS_INITIAL_WINDOW_SIZE: number; + const NGHTTP2_SETTINGS_MAX_FRAME_SIZE: number; + const NGHTTP2_SETTINGS_MAX_HEADER_LIST_SIZE: number; + const PADDING_STRATEGY_NONE: number; + const PADDING_STRATEGY_MAX: number; + const PADDING_STRATEGY_CALLBACK: number; + const HTTP2_HEADER_STATUS: string; + const HTTP2_HEADER_METHOD: string; + const HTTP2_HEADER_AUTHORITY: string; + const HTTP2_HEADER_SCHEME: string; + const HTTP2_HEADER_PATH: string; + const HTTP2_HEADER_ACCEPT_CHARSET: string; + const HTTP2_HEADER_ACCEPT_ENCODING: string; + const HTTP2_HEADER_ACCEPT_LANGUAGE: string; + const HTTP2_HEADER_ACCEPT_RANGES: string; + const HTTP2_HEADER_ACCEPT: string; + const HTTP2_HEADER_ACCESS_CONTROL_ALLOW_ORIGIN: string; + const HTTP2_HEADER_AGE: string; + const HTTP2_HEADER_ALLOW: string; + const HTTP2_HEADER_AUTHORIZATION: string; + const HTTP2_HEADER_CACHE_CONTROL: string; + const HTTP2_HEADER_CONNECTION: string; + const 
HTTP2_HEADER_CONTENT_DISPOSITION: string; + const HTTP2_HEADER_CONTENT_ENCODING: string; + const HTTP2_HEADER_CONTENT_LANGUAGE: string; + const HTTP2_HEADER_CONTENT_LENGTH: string; + const HTTP2_HEADER_CONTENT_LOCATION: string; + const HTTP2_HEADER_CONTENT_MD5: string; + const HTTP2_HEADER_CONTENT_RANGE: string; + const HTTP2_HEADER_CONTENT_TYPE: string; + const HTTP2_HEADER_COOKIE: string; + const HTTP2_HEADER_DATE: string; + const HTTP2_HEADER_ETAG: string; + const HTTP2_HEADER_EXPECT: string; + const HTTP2_HEADER_EXPIRES: string; + const HTTP2_HEADER_FROM: string; + const HTTP2_HEADER_HOST: string; + const HTTP2_HEADER_IF_MATCH: string; + const HTTP2_HEADER_IF_MODIFIED_SINCE: string; + const HTTP2_HEADER_IF_NONE_MATCH: string; + const HTTP2_HEADER_IF_RANGE: string; + const HTTP2_HEADER_IF_UNMODIFIED_SINCE: string; + const HTTP2_HEADER_LAST_MODIFIED: string; + const HTTP2_HEADER_LINK: string; + const HTTP2_HEADER_LOCATION: string; + const HTTP2_HEADER_MAX_FORWARDS: string; + const HTTP2_HEADER_PREFER: string; + const HTTP2_HEADER_PROXY_AUTHENTICATE: string; + const HTTP2_HEADER_PROXY_AUTHORIZATION: string; + const HTTP2_HEADER_RANGE: string; + const HTTP2_HEADER_REFERER: string; + const HTTP2_HEADER_REFRESH: string; + const HTTP2_HEADER_RETRY_AFTER: string; + const HTTP2_HEADER_SERVER: string; + const HTTP2_HEADER_SET_COOKIE: string; + const HTTP2_HEADER_STRICT_TRANSPORT_SECURITY: string; + const HTTP2_HEADER_TRANSFER_ENCODING: string; + const HTTP2_HEADER_TE: string; + const HTTP2_HEADER_UPGRADE: string; + const HTTP2_HEADER_USER_AGENT: string; + const HTTP2_HEADER_VARY: string; + const HTTP2_HEADER_VIA: string; + const HTTP2_HEADER_WWW_AUTHENTICATE: string; + const HTTP2_HEADER_HTTP2_SETTINGS: string; + const HTTP2_HEADER_KEEP_ALIVE: string; + const HTTP2_HEADER_PROXY_CONNECTION: string; + const HTTP2_METHOD_ACL: string; + const HTTP2_METHOD_BASELINE_CONTROL: string; + const HTTP2_METHOD_BIND: string; + const HTTP2_METHOD_CHECKIN: string; + const HTTP2_METHOD_CHECKOUT: string; + const HTTP2_METHOD_CONNECT: string; + const HTTP2_METHOD_COPY: string; + const HTTP2_METHOD_DELETE: string; + const HTTP2_METHOD_GET: string; + const HTTP2_METHOD_HEAD: string; + const HTTP2_METHOD_LABEL: string; + const HTTP2_METHOD_LINK: string; + const HTTP2_METHOD_LOCK: string; + const HTTP2_METHOD_MERGE: string; + const HTTP2_METHOD_MKACTIVITY: string; + const HTTP2_METHOD_MKCALENDAR: string; + const HTTP2_METHOD_MKCOL: string; + const HTTP2_METHOD_MKREDIRECTREF: string; + const HTTP2_METHOD_MKWORKSPACE: string; + const HTTP2_METHOD_MOVE: string; + const HTTP2_METHOD_OPTIONS: string; + const HTTP2_METHOD_ORDERPATCH: string; + const HTTP2_METHOD_PATCH: string; + const HTTP2_METHOD_POST: string; + const HTTP2_METHOD_PRI: string; + const HTTP2_METHOD_PROPFIND: string; + const HTTP2_METHOD_PROPPATCH: string; + const HTTP2_METHOD_PUT: string; + const HTTP2_METHOD_REBIND: string; + const HTTP2_METHOD_REPORT: string; + const HTTP2_METHOD_SEARCH: string; + const HTTP2_METHOD_TRACE: string; + const HTTP2_METHOD_UNBIND: string; + const HTTP2_METHOD_UNCHECKOUT: string; + const HTTP2_METHOD_UNLINK: string; + const HTTP2_METHOD_UNLOCK: string; + const HTTP2_METHOD_UPDATE: string; + const HTTP2_METHOD_UPDATEREDIRECTREF: string; + const HTTP2_METHOD_VERSION_CONTROL: string; + const HTTP_STATUS_CONTINUE: number; + const HTTP_STATUS_SWITCHING_PROTOCOLS: number; + const HTTP_STATUS_PROCESSING: number; + const HTTP_STATUS_OK: number; + const HTTP_STATUS_CREATED: number; + const HTTP_STATUS_ACCEPTED: number; + const 
HTTP_STATUS_NON_AUTHORITATIVE_INFORMATION: number; + const HTTP_STATUS_NO_CONTENT: number; + const HTTP_STATUS_RESET_CONTENT: number; + const HTTP_STATUS_PARTIAL_CONTENT: number; + const HTTP_STATUS_MULTI_STATUS: number; + const HTTP_STATUS_ALREADY_REPORTED: number; + const HTTP_STATUS_IM_USED: number; + const HTTP_STATUS_MULTIPLE_CHOICES: number; + const HTTP_STATUS_MOVED_PERMANENTLY: number; + const HTTP_STATUS_FOUND: number; + const HTTP_STATUS_SEE_OTHER: number; + const HTTP_STATUS_NOT_MODIFIED: number; + const HTTP_STATUS_USE_PROXY: number; + const HTTP_STATUS_TEMPORARY_REDIRECT: number; + const HTTP_STATUS_PERMANENT_REDIRECT: number; + const HTTP_STATUS_BAD_REQUEST: number; + const HTTP_STATUS_UNAUTHORIZED: number; + const HTTP_STATUS_PAYMENT_REQUIRED: number; + const HTTP_STATUS_FORBIDDEN: number; + const HTTP_STATUS_NOT_FOUND: number; + const HTTP_STATUS_METHOD_NOT_ALLOWED: number; + const HTTP_STATUS_NOT_ACCEPTABLE: number; + const HTTP_STATUS_PROXY_AUTHENTICATION_REQUIRED: number; + const HTTP_STATUS_REQUEST_TIMEOUT: number; + const HTTP_STATUS_CONFLICT: number; + const HTTP_STATUS_GONE: number; + const HTTP_STATUS_LENGTH_REQUIRED: number; + const HTTP_STATUS_PRECONDITION_FAILED: number; + const HTTP_STATUS_PAYLOAD_TOO_LARGE: number; + const HTTP_STATUS_URI_TOO_LONG: number; + const HTTP_STATUS_UNSUPPORTED_MEDIA_TYPE: number; + const HTTP_STATUS_RANGE_NOT_SATISFIABLE: number; + const HTTP_STATUS_EXPECTATION_FAILED: number; + const HTTP_STATUS_TEAPOT: number; + const HTTP_STATUS_MISDIRECTED_REQUEST: number; + const HTTP_STATUS_UNPROCESSABLE_ENTITY: number; + const HTTP_STATUS_LOCKED: number; + const HTTP_STATUS_FAILED_DEPENDENCY: number; + const HTTP_STATUS_UNORDERED_COLLECTION: number; + const HTTP_STATUS_UPGRADE_REQUIRED: number; + const HTTP_STATUS_PRECONDITION_REQUIRED: number; + const HTTP_STATUS_TOO_MANY_REQUESTS: number; + const HTTP_STATUS_REQUEST_HEADER_FIELDS_TOO_LARGE: number; + const HTTP_STATUS_UNAVAILABLE_FOR_LEGAL_REASONS: number; + const HTTP_STATUS_INTERNAL_SERVER_ERROR: number; + const HTTP_STATUS_NOT_IMPLEMENTED: number; + const HTTP_STATUS_BAD_GATEWAY: number; + const HTTP_STATUS_SERVICE_UNAVAILABLE: number; + const HTTP_STATUS_GATEWAY_TIMEOUT: number; + const HTTP_STATUS_HTTP_VERSION_NOT_SUPPORTED: number; + const HTTP_STATUS_VARIANT_ALSO_NEGOTIATES: number; + const HTTP_STATUS_INSUFFICIENT_STORAGE: number; + const HTTP_STATUS_LOOP_DETECTED: number; + const HTTP_STATUS_BANDWIDTH_LIMIT_EXCEEDED: number; + const HTTP_STATUS_NOT_EXTENDED: number; + const HTTP_STATUS_NETWORK_AUTHENTICATION_REQUIRED: number; + } + /** + * This symbol can be set as a property on the HTTP/2 headers object with + * an array value in order to provide a list of headers considered sensitive. + */ + export const sensitiveHeaders: symbol; + /** + * Returns an object containing the default settings for an `Http2Session`instance. This method returns a new object instance every time it is called + * so instances returned may be safely modified for use. + * @since v8.4.0 + */ + export function getDefaultSettings(): Settings; + /** + * Returns a `Buffer` instance containing serialized representation of the given + * HTTP/2 settings as specified in the [HTTP/2](https://tools.ietf.org/html/rfc7540) specification. This is intended + * for use with the `HTTP2-Settings` header field. 
+ * + * ```js + * const http2 = require('http2'); + * + * const packed = http2.getPackedSettings({ enablePush: false }); + * + * console.log(packed.toString('base64')); + * // Prints: AAIAAAAA + * ``` + * @since v8.4.0 + */ + export function getPackedSettings(settings: Settings): Buffer; + /** + * Returns a `HTTP/2 Settings Object` containing the deserialized settings from + * the given `Buffer` as generated by `http2.getPackedSettings()`. + * @since v8.4.0 + * @param buf The packed settings. + */ + export function getUnpackedSettings(buf: Uint8Array): Settings; + /** + * Returns a `net.Server` instance that creates and manages `Http2Session`instances. + * + * Since there are no browsers known that support [unencrypted HTTP/2](https://http2.github.io/faq/#does-http2-require-encryption), the use of {@link createSecureServer} is necessary when + * communicating + * with browser clients. + * + * ```js + * const http2 = require('http2'); + * + * // Create an unencrypted HTTP/2 server. + * // Since there are no browsers known that support + * // unencrypted HTTP/2, the use of `http2.createSecureServer()` + * // is necessary when communicating with browser clients. + * const server = http2.createServer(); + * + * server.on('stream', (stream, headers) => { + * stream.respond({ + * 'content-type': 'text/html; charset=utf-8', + * ':status': 200 + * }); + * stream.end('

<h1>Hello World</h1>

'); + * }); + * + * server.listen(80); + * ``` + * @since v8.4.0 + * @param onRequestHandler See `Compatibility API` + */ + export function createServer(onRequestHandler?: (request: Http2ServerRequest, response: Http2ServerResponse) => void): Http2Server; + export function createServer(options: ServerOptions, onRequestHandler?: (request: Http2ServerRequest, response: Http2ServerResponse) => void): Http2Server; + /** + * Returns a `tls.Server` instance that creates and manages `Http2Session`instances. + * + * ```js + * const http2 = require('http2'); + * const fs = require('fs'); + * + * const options = { + * key: fs.readFileSync('server-key.pem'), + * cert: fs.readFileSync('server-cert.pem') + * }; + * + * // Create a secure HTTP/2 server + * const server = http2.createSecureServer(options); + * + * server.on('stream', (stream, headers) => { + * stream.respond({ + * 'content-type': 'text/html; charset=utf-8', + * ':status': 200 + * }); + * stream.end('

<h1>Hello World</h1>

'); + * }); + * + * server.listen(80); + * ``` + * @since v8.4.0 + * @param onRequestHandler See `Compatibility API` + */ + export function createSecureServer(onRequestHandler?: (request: Http2ServerRequest, response: Http2ServerResponse) => void): Http2SecureServer; + export function createSecureServer(options: SecureServerOptions, onRequestHandler?: (request: Http2ServerRequest, response: Http2ServerResponse) => void): Http2SecureServer; + /** + * Returns a `ClientHttp2Session` instance. + * + * ```js + * const http2 = require('http2'); + * const client = http2.connect('https://localhost:1234'); + * + * // Use the client + * + * client.close(); + * ``` + * @since v8.4.0 + * @param authority The remote HTTP/2 server to connect to. This must be in the form of a minimal, valid URL with the `http://` or `https://` prefix, host name, and IP port (if a non-default port + * is used). Userinfo (user ID and password), path, querystring, and fragment details in the URL will be ignored. + * @param listener Will be registered as a one-time listener of the {@link 'connect'} event. + */ + export function connect(authority: string | url.URL, listener: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void): ClientHttp2Session; + export function connect( + authority: string | url.URL, + options?: ClientSessionOptions | SecureClientSessionOptions, + listener?: (session: ClientHttp2Session, socket: net.Socket | tls.TLSSocket) => void + ): ClientHttp2Session; +} +declare module 'node:http2' { + export * from 'http2'; +} diff --git a/node_modules/@types/node/https.d.ts b/node_modules/@types/node/https.d.ts new file mode 100644 index 000000000..194a3ad1c --- /dev/null +++ b/node_modules/@types/node/https.d.ts @@ -0,0 +1,391 @@ +/** + * HTTPS is the HTTP protocol over TLS/SSL. In Node.js this is implemented as a + * separate module. + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/https.js) + */ +declare module 'https' { + import { Duplex } from 'node:stream'; + import * as tls from 'node:tls'; + import * as http from 'node:http'; + import { URL } from 'node:url'; + type ServerOptions = tls.SecureContextOptions & tls.TlsOptions & http.ServerOptions; + type RequestOptions = http.RequestOptions & + tls.SecureContextOptions & { + rejectUnauthorized?: boolean | undefined; // Defaults to true + servername?: string | undefined; // SNI TLS Extension + }; + interface AgentOptions extends http.AgentOptions, tls.ConnectionOptions { + rejectUnauthorized?: boolean | undefined; + maxCachedSessions?: number | undefined; + } + /** + * An `Agent` object for HTTPS similar to `http.Agent`. See {@link request} for more information. + * @since v0.4.5 + */ + class Agent extends http.Agent { + constructor(options?: AgentOptions); + options: AgentOptions; + } + interface Server extends http.Server {} + /** + * See `http.Server` for more information. 
+ * @since v0.3.4 + */ + class Server extends tls.Server { + constructor(requestListener?: http.RequestListener); + constructor(options: ServerOptions, requestListener?: http.RequestListener); + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'keylog', listener: (line: Buffer, tlsSocket: tls.TLSSocket) => void): this; + addListener(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + addListener(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + addListener(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + addListener(event: 'secureConnection', listener: (tlsSocket: tls.TLSSocket) => void): this; + addListener(event: 'tlsClientError', listener: (err: Error, tlsSocket: tls.TLSSocket) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'connection', listener: (socket: Duplex) => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'listening', listener: () => void): this; + addListener(event: 'checkContinue', listener: http.RequestListener): this; + addListener(event: 'checkExpectation', listener: http.RequestListener): this; + addListener(event: 'clientError', listener: (err: Error, socket: Duplex) => void): this; + addListener(event: 'connect', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + addListener(event: 'request', listener: http.RequestListener): this; + addListener(event: 'upgrade', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + emit(event: string, ...args: any[]): boolean; + emit(event: 'keylog', line: Buffer, tlsSocket: tls.TLSSocket): boolean; + emit(event: 'newSession', sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void): boolean; + emit(event: 'OCSPRequest', certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void): boolean; + emit(event: 'resumeSession', sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void): boolean; + emit(event: 'secureConnection', tlsSocket: tls.TLSSocket): boolean; + emit(event: 'tlsClientError', err: Error, tlsSocket: tls.TLSSocket): boolean; + emit(event: 'close'): boolean; + emit(event: 'connection', socket: Duplex): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'listening'): boolean; + emit(event: 'checkContinue', req: http.IncomingMessage, res: http.ServerResponse): boolean; + emit(event: 'checkExpectation', req: http.IncomingMessage, res: http.ServerResponse): boolean; + emit(event: 'clientError', err: Error, socket: Duplex): boolean; + emit(event: 'connect', req: http.IncomingMessage, socket: Duplex, head: Buffer): boolean; + emit(event: 'request', req: http.IncomingMessage, res: http.ServerResponse): boolean; + emit(event: 'upgrade', req: http.IncomingMessage, socket: Duplex, head: Buffer): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'keylog', listener: (line: Buffer, tlsSocket: tls.TLSSocket) => void): this; + on(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + on(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, 
resp: Buffer) => void) => void): this; + on(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + on(event: 'secureConnection', listener: (tlsSocket: tls.TLSSocket) => void): this; + on(event: 'tlsClientError', listener: (err: Error, tlsSocket: tls.TLSSocket) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'connection', listener: (socket: Duplex) => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'listening', listener: () => void): this; + on(event: 'checkContinue', listener: http.RequestListener): this; + on(event: 'checkExpectation', listener: http.RequestListener): this; + on(event: 'clientError', listener: (err: Error, socket: Duplex) => void): this; + on(event: 'connect', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + on(event: 'request', listener: http.RequestListener): this; + on(event: 'upgrade', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'keylog', listener: (line: Buffer, tlsSocket: tls.TLSSocket) => void): this; + once(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + once(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + once(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + once(event: 'secureConnection', listener: (tlsSocket: tls.TLSSocket) => void): this; + once(event: 'tlsClientError', listener: (err: Error, tlsSocket: tls.TLSSocket) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'connection', listener: (socket: Duplex) => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'listening', listener: () => void): this; + once(event: 'checkContinue', listener: http.RequestListener): this; + once(event: 'checkExpectation', listener: http.RequestListener): this; + once(event: 'clientError', listener: (err: Error, socket: Duplex) => void): this; + once(event: 'connect', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + once(event: 'request', listener: http.RequestListener): this; + once(event: 'upgrade', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'keylog', listener: (line: Buffer, tlsSocket: tls.TLSSocket) => void): this; + prependListener(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + prependListener(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + prependListener(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + prependListener(event: 'secureConnection', listener: (tlsSocket: tls.TLSSocket) => void): this; + prependListener(event: 'tlsClientError', listener: (err: Error, tlsSocket: tls.TLSSocket) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'connection', listener: (socket: Duplex) => 
void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'listening', listener: () => void): this; + prependListener(event: 'checkContinue', listener: http.RequestListener): this; + prependListener(event: 'checkExpectation', listener: http.RequestListener): this; + prependListener(event: 'clientError', listener: (err: Error, socket: Duplex) => void): this; + prependListener(event: 'connect', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + prependListener(event: 'request', listener: http.RequestListener): this; + prependListener(event: 'upgrade', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'keylog', listener: (line: Buffer, tlsSocket: tls.TLSSocket) => void): this; + prependOnceListener(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + prependOnceListener(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + prependOnceListener(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + prependOnceListener(event: 'secureConnection', listener: (tlsSocket: tls.TLSSocket) => void): this; + prependOnceListener(event: 'tlsClientError', listener: (err: Error, tlsSocket: tls.TLSSocket) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'connection', listener: (socket: Duplex) => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'listening', listener: () => void): this; + prependOnceListener(event: 'checkContinue', listener: http.RequestListener): this; + prependOnceListener(event: 'checkExpectation', listener: http.RequestListener): this; + prependOnceListener(event: 'clientError', listener: (err: Error, socket: Duplex) => void): this; + prependOnceListener(event: 'connect', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + prependOnceListener(event: 'request', listener: http.RequestListener): this; + prependOnceListener(event: 'upgrade', listener: (req: http.IncomingMessage, socket: Duplex, head: Buffer) => void): this; + } + /** + * ```js + * // curl -k https://localhost:8000/ + * const https = require('https'); + * const fs = require('fs'); + * + * const options = { + * key: fs.readFileSync('test/fixtures/keys/agent2-key.pem'), + * cert: fs.readFileSync('test/fixtures/keys/agent2-cert.pem') + * }; + * + * https.createServer(options, (req, res) => { + * res.writeHead(200); + * res.end('hello world\n'); + * }).listen(8000); + * ``` + * + * Or + * + * ```js + * const https = require('https'); + * const fs = require('fs'); + * + * const options = { + * pfx: fs.readFileSync('test/fixtures/test_cert.pfx'), + * passphrase: 'sample' + * }; + * + * https.createServer(options, (req, res) => { + * res.writeHead(200); + * res.end('hello world\n'); + * }).listen(8000); + * ``` + * @since v0.3.4 + * @param options Accepts `options` from `createServer`, `createSecureContext` and `createServer`. + * @param requestListener A listener to be added to the `'request'` event. 
+ */ + function createServer(requestListener?: http.RequestListener): Server; + function createServer(options: ServerOptions, requestListener?: http.RequestListener): Server; + /** + * Makes a request to a secure web server. + * + * The following additional `options` from `tls.connect()` are also accepted:`ca`, `cert`, `ciphers`, `clientCertEngine`, `crl`, `dhparam`, `ecdhCurve`,`honorCipherOrder`, `key`, `passphrase`, + * `pfx`, `rejectUnauthorized`,`secureOptions`, `secureProtocol`, `servername`, `sessionIdContext`,`highWaterMark`. + * + * `options` can be an object, a string, or a `URL` object. If `options` is a + * string, it is automatically parsed with `new URL()`. If it is a `URL` object, it will be automatically converted to an ordinary `options` object. + * + * `https.request()` returns an instance of the `http.ClientRequest` class. The `ClientRequest` instance is a writable stream. If one needs to + * upload a file with a POST request, then write to the `ClientRequest` object. + * + * ```js + * const https = require('https'); + * + * const options = { + * hostname: 'encrypted.google.com', + * port: 443, + * path: '/', + * method: 'GET' + * }; + * + * const req = https.request(options, (res) => { + * console.log('statusCode:', res.statusCode); + * console.log('headers:', res.headers); + * + * res.on('data', (d) => { + * process.stdout.write(d); + * }); + * }); + * + * req.on('error', (e) => { + * console.error(e); + * }); + * req.end(); + * ``` + * + * Example using options from `tls.connect()`: + * + * ```js + * const options = { + * hostname: 'encrypted.google.com', + * port: 443, + * path: '/', + * method: 'GET', + * key: fs.readFileSync('test/fixtures/keys/agent2-key.pem'), + * cert: fs.readFileSync('test/fixtures/keys/agent2-cert.pem') + * }; + * options.agent = new https.Agent(options); + * + * const req = https.request(options, (res) => { + * // ... + * }); + * ``` + * + * Alternatively, opt out of connection pooling by not using an `Agent`. + * + * ```js + * const options = { + * hostname: 'encrypted.google.com', + * port: 443, + * path: '/', + * method: 'GET', + * key: fs.readFileSync('test/fixtures/keys/agent2-key.pem'), + * cert: fs.readFileSync('test/fixtures/keys/agent2-cert.pem'), + * agent: false + * }; + * + * const req = https.request(options, (res) => { + * // ... + * }); + * ``` + * + * Example using a `URL` as `options`: + * + * ```js + * const options = new URL('https://abc:xyz@example.com'); + * + * const req = https.request(options, (res) => { + * // ... 
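+ *   // Illustrative addition (not in the upstream example): without
+ *   // res.setEncoding(), 'data' chunks arrive as Buffers.
+ *   res.on('data', (chunk) => process.stdout.write(chunk));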
+ * }); + * ``` + * + * Example pinning on certificate fingerprint, or the public key (similar to`pin-sha256`): + * + * ```js + * const tls = require('tls'); + * const https = require('https'); + * const crypto = require('crypto'); + * + * function sha256(s) { + * return crypto.createHash('sha256').update(s).digest('base64'); + * } + * const options = { + * hostname: 'github.com', + * port: 443, + * path: '/', + * method: 'GET', + * checkServerIdentity: function(host, cert) { + * // Make sure the certificate is issued to the host we are connected to + * const err = tls.checkServerIdentity(host, cert); + * if (err) { + * return err; + * } + * + * // Pin the public key, similar to HPKP pin-sha25 pinning + * const pubkey256 = 'pL1+qb9HTMRZJmuC/bB/ZI9d302BYrrqiVuRyW+DGrU='; + * if (sha256(cert.pubkey) !== pubkey256) { + * const msg = 'Certificate verification error: ' + + * `The public key of '${cert.subject.CN}' ` + + * 'does not match our pinned fingerprint'; + * return new Error(msg); + * } + * + * // Pin the exact certificate, rather than the pub key + * const cert256 = '25:FE:39:32:D9:63:8C:8A:FC:A1:9A:29:87:' + + * 'D8:3E:4C:1D:98:DB:71:E4:1A:48:03:98:EA:22:6A:BD:8B:93:16'; + * if (cert.fingerprint256 !== cert256) { + * const msg = 'Certificate verification error: ' + + * `The certificate of '${cert.subject.CN}' ` + + * 'does not match our pinned fingerprint'; + * return new Error(msg); + * } + * + * // This loop is informational only. + * // Print the certificate and public key fingerprints of all certs in the + * // chain. Its common to pin the public key of the issuer on the public + * // internet, while pinning the public key of the service in sensitive + * // environments. + * do { + * console.log('Subject Common Name:', cert.subject.CN); + * console.log(' Certificate SHA256 fingerprint:', cert.fingerprint256); + * + * hash = crypto.createHash('sha256'); + * console.log(' Public key ping-sha256:', sha256(cert.pubkey)); + * + * lastprint256 = cert.fingerprint256; + * cert = cert.issuerCertificate; + * } while (cert.fingerprint256 !== lastprint256); + * + * }, + * }; + * + * options.agent = new https.Agent(options); + * const req = https.request(options, (res) => { + * console.log('All OK. Server matched our pinned cert or public key'); + * console.log('statusCode:', res.statusCode); + * // Print the HPKP values + * console.log('headers:', res.headers['public-key-pins']); + * + * res.on('data', (d) => {}); + * }); + * + * req.on('error', (e) => { + * console.error(e.message); + * }); + * req.end(); + * ``` + * + * Outputs for example: + * + * ```text + * Subject Common Name: github.com + * Certificate SHA256 fingerprint: 25:FE:39:32:D9:63:8C:8A:FC:A1:9A:29:87:D8:3E:4C:1D:98:DB:71:E4:1A:48:03:98:EA:22:6A:BD:8B:93:16 + * Public key ping-sha256: pL1+qb9HTMRZJmuC/bB/ZI9d302BYrrqiVuRyW+DGrU= + * Subject Common Name: DigiCert SHA2 Extended Validation Server CA + * Certificate SHA256 fingerprint: 40:3E:06:2A:26:53:05:91:13:28:5B:AF:80:A0:D4:AE:42:2C:84:8C:9F:78:FA:D0:1F:C9:4B:C5:B8:7F:EF:1A + * Public key ping-sha256: RRM1dGqnDFsCJXBTHky16vi1obOlCgFFn/yOhI/y+ho= + * Subject Common Name: DigiCert High Assurance EV Root CA + * Certificate SHA256 fingerprint: 74:31:E5:F4:C3:C1:CE:46:90:77:4F:0B:61:E0:54:40:88:3B:A9:A0:1E:D0:0B:A6:AB:D7:80:6E:D3:B1:18:CF + * Public key ping-sha256: WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18= + * All OK. 
Server matched our pinned cert or public key + * statusCode: 200 + * headers: max-age=0; pin-sha256="WoiWRyIOVNa9ihaBciRSC7XHjliYS9VwUGOIud4PB18="; pin-sha256="RRM1dGqnDFsCJXBTHky16vi1obOlCgFFn/yOhI/y+ho="; + * pin-sha256="k2v657xBsOVe1PQRwOsHsw3bsGT2VzIqz5K+59sNQws="; pin-sha256="K87oWBWM9UZfyddvDfoxL+8lpNyoUB2ptGtn0fv6G2Q="; pin-sha256="IQBnNBEiFuhj+8x6X8XLgh01V9Ic5/V3IRQLNFFc7v4="; + * pin-sha256="iie1VXtL7HzAMF+/PVPR9xzT80kQxdZeJ+zduCB3uj0="; pin-sha256="LvRiGEjRqfzurezaWuj8Wie2gyHMrW5Q06LspMnox7A="; includeSubDomains + * ``` + * @since v0.3.6 + * @param options Accepts all `options` from `request`, with some differences in default values: + */ + function request(options: RequestOptions | string | URL, callback?: (res: http.IncomingMessage) => void): http.ClientRequest; + function request(url: string | URL, options: RequestOptions, callback?: (res: http.IncomingMessage) => void): http.ClientRequest; + /** + * Like `http.get()` but for HTTPS. + * + * `options` can be an object, a string, or a `URL` object. If `options` is a + * string, it is automatically parsed with `new URL()`. If it is a `URL` object, it will be automatically converted to an ordinary `options` object. + * + * ```js + * const https = require('https'); + * + * https.get('https://encrypted.google.com/', (res) => { + * console.log('statusCode:', res.statusCode); + * console.log('headers:', res.headers); + * + * res.on('data', (d) => { + * process.stdout.write(d); + * }); + * + * }).on('error', (e) => { + * console.error(e); + * }); + * ``` + * @since v0.3.6 + * @param options Accepts the same `options` as {@link request}, with the `method` always set to `GET`. + */ + function get(options: RequestOptions | string | URL, callback?: (res: http.IncomingMessage) => void): http.ClientRequest; + function get(url: string | URL, options: RequestOptions, callback?: (res: http.IncomingMessage) => void): http.ClientRequest; + let globalAgent: Agent; +} +declare module 'node:https' { + export * from 'https'; +} diff --git a/node_modules/@types/node/index.d.ts b/node_modules/@types/node/index.d.ts new file mode 100644 index 000000000..8d7e97e6b --- /dev/null +++ b/node_modules/@types/node/index.d.ts @@ -0,0 +1,131 @@ +// Type definitions for non-npm package Node.js 16.10 +// Project: https://nodejs.org/ +// Definitions by: Microsoft TypeScript +// DefinitelyTyped +// Alberto Schiabel +// Alvis HT Tang +// Andrew Makarov +// Benjamin Toueg +// Chigozirim C. +// David Junger +// Deividas Bakanas +// Eugene Y. Q. Shen +// Hannes Magnusson +// Huw +// Kelvin Jin +// Klaus Meinhardt +// Lishude +// Mariusz Wiktorczyk +// Mohsen Azimi +// Nicolas Even +// Nikita Galkin +// Parambir Singh +// Sebastian Silbermann +// Simon Schick +// Thomas den Hollander +// Wilco Bakker +// wwwy3y3 +// Samuel Ainsworth +// Kyle Uehlein +// Thanik Bhongbhibhat +// Marcin Kopacz +// Trivikram Kamat +// Minh Son Nguyen +// Junxiao Shi +// Ilia Baryshnikov +// ExE Boss +// Surasak Chaisurin +// Piotr Błażejewicz +// Anna Henningsen +// Victor Perin +// Yongsheng Zhang +// NodeJS Contributors +// Linus Unnebäck +// wafuwafu13 +// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped + +/** + * License for programmatically and manually incorporated + * documentation aka. `JSDoc` from https://github.com/nodejs/node/tree/master/doc + * + * Copyright Node.js contributors. All rights reserved. 
+ * Permission is hereby granted, free of charge, to any person obtaining a copy + * of this software and associated documentation files (the "Software"), to + * deal in the Software without restriction, including without limitation the + * rights to use, copy, modify, merge, publish, distribute, sublicense, and/or + * sell copies of the Software, and to permit persons to whom the Software is + * furnished to do so, subject to the following conditions: + * + * The above copyright notice and this permission notice shall be included in + * all copies or substantial portions of the Software. + * + * THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR + * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, + * FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE + * AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER + * LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING + * FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS + * IN THE SOFTWARE. + */ + +// NOTE: These definitions support NodeJS and TypeScript 3.7+. + +// Reference required types from the default lib: +/// +/// +/// +/// + +// Base definitions for all NodeJS modules that are not specific to any version of TypeScript: +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// +/// + +/// diff --git a/node_modules/@types/node/inspector.d.ts b/node_modules/@types/node/inspector.d.ts new file mode 100644 index 000000000..c5f13665c --- /dev/null +++ b/node_modules/@types/node/inspector.d.ts @@ -0,0 +1,2738 @@ +// tslint:disable-next-line:dt-header +// Type definitions for inspector + +// These definitions are auto-generated. +// Please see https://github.com/DefinitelyTyped/DefinitelyTyped/pull/19330 +// for more information. + +// tslint:disable:max-line-length + +/** + * The `inspector` module provides an API for interacting with the V8 inspector. + * + * It can be accessed using: + * + * ```js + * const inspector = require('inspector'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/inspector.js) + */ +declare module 'inspector' { + import EventEmitter = require('node:events'); + interface InspectorNotification { + method: string; + params: T; + } + namespace Schema { + /** + * Description of the protocol domain. + */ + interface Domain { + /** + * Domain name. + */ + name: string; + /** + * Domain version. + */ + version: string; + } + interface GetDomainsReturnType { + /** + * List of supported domains. + */ + domains: Domain[]; + } + } + namespace Runtime { + /** + * Unique script identifier. + */ + type ScriptId = string; + /** + * Unique object identifier. + */ + type RemoteObjectId = string; + /** + * Primitive value which cannot be JSON-stringified. + */ + type UnserializableValue = string; + /** + * Mirror object referencing original JavaScript object. + */ + interface RemoteObject { + /** + * Object type. + */ + type: string; + /** + * Object subtype hint. Specified for object type values only. + */ + subtype?: string | undefined; + /** + * Object class (constructor) name. Specified for object type values only. + */ + className?: string | undefined; + /** + * Remote object value in case of primitive values or JSON values (if it was requested). 
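+ *
+ * Illustrative sketch only (not part of the upstream docs) of how such a value
+ * comes back from the local inspector; the expression and the printed shape are
+ * examples, not guaranteed output:
+ *
+ * ```js
+ * const inspector = require('inspector');
+ * const session = new inspector.Session();
+ * session.connect();
+ * session.post('Runtime.evaluate', { expression: '2 + 2' }, (err, res) => {
+ *   if (err) throw err;
+ *   // res.result is a Runtime.RemoteObject,
+ *   // e.g. { type: 'number', value: 4, description: '4' }
+ *   console.log(res.result);
+ * });
+ * ```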
+ */ + value?: any; + /** + * Primitive value which can not be JSON-stringified does not have value, but gets this property. + */ + unserializableValue?: UnserializableValue | undefined; + /** + * String representation of the object. + */ + description?: string | undefined; + /** + * Unique object identifier (for non-primitive values). + */ + objectId?: RemoteObjectId | undefined; + /** + * Preview containing abbreviated property values. Specified for object type values only. + * @experimental + */ + preview?: ObjectPreview | undefined; + /** + * @experimental + */ + customPreview?: CustomPreview | undefined; + } + /** + * @experimental + */ + interface CustomPreview { + header: string; + hasBody: boolean; + formatterObjectId: RemoteObjectId; + bindRemoteObjectFunctionId: RemoteObjectId; + configObjectId?: RemoteObjectId | undefined; + } + /** + * Object containing abbreviated remote object value. + * @experimental + */ + interface ObjectPreview { + /** + * Object type. + */ + type: string; + /** + * Object subtype hint. Specified for object type values only. + */ + subtype?: string | undefined; + /** + * String representation of the object. + */ + description?: string | undefined; + /** + * True iff some of the properties or entries of the original object did not fit. + */ + overflow: boolean; + /** + * List of the properties. + */ + properties: PropertyPreview[]; + /** + * List of the entries. Specified for map and set subtype values only. + */ + entries?: EntryPreview[] | undefined; + } + /** + * @experimental + */ + interface PropertyPreview { + /** + * Property name. + */ + name: string; + /** + * Object type. Accessor means that the property itself is an accessor property. + */ + type: string; + /** + * User-friendly property value string. + */ + value?: string | undefined; + /** + * Nested value preview. + */ + valuePreview?: ObjectPreview | undefined; + /** + * Object subtype hint. Specified for object type values only. + */ + subtype?: string | undefined; + } + /** + * @experimental + */ + interface EntryPreview { + /** + * Preview of the key. Specified for map-like collection entries. + */ + key?: ObjectPreview | undefined; + /** + * Preview of the value. + */ + value: ObjectPreview; + } + /** + * Object property descriptor. + */ + interface PropertyDescriptor { + /** + * Property name or symbol description. + */ + name: string; + /** + * The value associated with the property. + */ + value?: RemoteObject | undefined; + /** + * True if the value associated with the property may be changed (data descriptors only). + */ + writable?: boolean | undefined; + /** + * A function which serves as a getter for the property, or undefined if there is no getter (accessor descriptors only). + */ + get?: RemoteObject | undefined; + /** + * A function which serves as a setter for the property, or undefined if there is no setter (accessor descriptors only). + */ + set?: RemoteObject | undefined; + /** + * True if the type of this property descriptor may be changed and if the property may be deleted from the corresponding object. + */ + configurable: boolean; + /** + * True if this property shows up during enumeration of the properties on the corresponding object. + */ + enumerable: boolean; + /** + * True if the result was thrown during the evaluation. + */ + wasThrown?: boolean | undefined; + /** + * True if the property is owned for the object. + */ + isOwn?: boolean | undefined; + /** + * Property symbol object, if the property is of the symbol type. 
+ */ + symbol?: RemoteObject | undefined; + } + /** + * Object internal property descriptor. This property isn't normally visible in JavaScript code. + */ + interface InternalPropertyDescriptor { + /** + * Conventional property name. + */ + name: string; + /** + * The value associated with the property. + */ + value?: RemoteObject | undefined; + } + /** + * Represents function call argument. Either remote object id objectId, primitive value, unserializable primitive value or neither of (for undefined) them should be specified. + */ + interface CallArgument { + /** + * Primitive value or serializable javascript object. + */ + value?: any; + /** + * Primitive value which can not be JSON-stringified. + */ + unserializableValue?: UnserializableValue | undefined; + /** + * Remote object handle. + */ + objectId?: RemoteObjectId | undefined; + } + /** + * Id of an execution context. + */ + type ExecutionContextId = number; + /** + * Description of an isolated world. + */ + interface ExecutionContextDescription { + /** + * Unique id of the execution context. It can be used to specify in which execution context script evaluation should be performed. + */ + id: ExecutionContextId; + /** + * Execution context origin. + */ + origin: string; + /** + * Human readable name describing given context. + */ + name: string; + /** + * Embedder-specific auxiliary data. + */ + auxData?: {} | undefined; + } + /** + * Detailed information about exception (or error) that was thrown during script compilation or execution. + */ + interface ExceptionDetails { + /** + * Exception id. + */ + exceptionId: number; + /** + * Exception text, which should be used together with exception object when available. + */ + text: string; + /** + * Line number of the exception location (0-based). + */ + lineNumber: number; + /** + * Column number of the exception location (0-based). + */ + columnNumber: number; + /** + * Script ID of the exception location. + */ + scriptId?: ScriptId | undefined; + /** + * URL of the exception location, to be used when the script was not reported. + */ + url?: string | undefined; + /** + * JavaScript stack trace if available. + */ + stackTrace?: StackTrace | undefined; + /** + * Exception object if available. + */ + exception?: RemoteObject | undefined; + /** + * Identifier of the context where exception happened. + */ + executionContextId?: ExecutionContextId | undefined; + } + /** + * Number of milliseconds since epoch. + */ + type Timestamp = number; + /** + * Stack entry for runtime errors and assertions. + */ + interface CallFrame { + /** + * JavaScript function name. + */ + functionName: string; + /** + * JavaScript script id. + */ + scriptId: ScriptId; + /** + * JavaScript script name or url. + */ + url: string; + /** + * JavaScript script line number (0-based). + */ + lineNumber: number; + /** + * JavaScript script column number (0-based). + */ + columnNumber: number; + } + /** + * Call frames for assertions or error messages. + */ + interface StackTrace { + /** + * String label of this stack trace. For async traces this may be a name of the function that initiated the async call. + */ + description?: string | undefined; + /** + * JavaScript function name. + */ + callFrames: CallFrame[]; + /** + * Asynchronous JavaScript stack trace that preceded this stack, if available. + */ + parent?: StackTrace | undefined; + /** + * Asynchronous JavaScript stack trace that preceded this stack, if available. 
+ * @experimental + */ + parentId?: StackTraceId | undefined; + } + /** + * Unique identifier of current debugger. + * @experimental + */ + type UniqueDebuggerId = string; + /** + * If debuggerId is set stack trace comes from another debugger and can be resolved there. This allows to track cross-debugger calls. See Runtime.StackTrace and Debugger.paused for usages. + * @experimental + */ + interface StackTraceId { + id: string; + debuggerId?: UniqueDebuggerId | undefined; + } + interface EvaluateParameterType { + /** + * Expression to evaluate. + */ + expression: string; + /** + * Symbolic group name that can be used to release multiple objects. + */ + objectGroup?: string | undefined; + /** + * Determines whether Command Line API should be available during the evaluation. + */ + includeCommandLineAPI?: boolean | undefined; + /** + * In silent mode exceptions thrown during evaluation are not reported and do not pause execution. Overrides setPauseOnException state. + */ + silent?: boolean | undefined; + /** + * Specifies in which execution context to perform evaluation. If the parameter is omitted the evaluation will be performed in the context of the inspected page. + */ + contextId?: ExecutionContextId | undefined; + /** + * Whether the result is expected to be a JSON object that should be sent by value. + */ + returnByValue?: boolean | undefined; + /** + * Whether preview should be generated for the result. + * @experimental + */ + generatePreview?: boolean | undefined; + /** + * Whether execution should be treated as initiated by user in the UI. + */ + userGesture?: boolean | undefined; + /** + * Whether execution should await for resulting value and return once awaited promise is resolved. + */ + awaitPromise?: boolean | undefined; + } + interface AwaitPromiseParameterType { + /** + * Identifier of the promise. + */ + promiseObjectId: RemoteObjectId; + /** + * Whether the result is expected to be a JSON object that should be sent by value. + */ + returnByValue?: boolean | undefined; + /** + * Whether preview should be generated for the result. + */ + generatePreview?: boolean | undefined; + } + interface CallFunctionOnParameterType { + /** + * Declaration of the function to call. + */ + functionDeclaration: string; + /** + * Identifier of the object to call function on. Either objectId or executionContextId should be specified. + */ + objectId?: RemoteObjectId | undefined; + /** + * Call arguments. All call arguments must belong to the same JavaScript world as the target object. + */ + arguments?: CallArgument[] | undefined; + /** + * In silent mode exceptions thrown during evaluation are not reported and do not pause execution. Overrides setPauseOnException state. + */ + silent?: boolean | undefined; + /** + * Whether the result is expected to be a JSON object which should be sent by value. + */ + returnByValue?: boolean | undefined; + /** + * Whether preview should be generated for the result. + * @experimental + */ + generatePreview?: boolean | undefined; + /** + * Whether execution should be treated as initiated by user in the UI. + */ + userGesture?: boolean | undefined; + /** + * Whether execution should await for resulting value and return once awaited promise is resolved. + */ + awaitPromise?: boolean | undefined; + /** + * Specifies execution context which global object will be used to call function on. Either executionContextId or objectId should be specified. 
+ */ + executionContextId?: ExecutionContextId | undefined; + /** + * Symbolic group name that can be used to release multiple objects. If objectGroup is not specified and objectId is, objectGroup will be inherited from object. + */ + objectGroup?: string | undefined; + } + interface GetPropertiesParameterType { + /** + * Identifier of the object to return properties for. + */ + objectId: RemoteObjectId; + /** + * If true, returns properties belonging only to the element itself, not to its prototype chain. + */ + ownProperties?: boolean | undefined; + /** + * If true, returns accessor properties (with getter/setter) only; internal properties are not returned either. + * @experimental + */ + accessorPropertiesOnly?: boolean | undefined; + /** + * Whether preview should be generated for the results. + * @experimental + */ + generatePreview?: boolean | undefined; + } + interface ReleaseObjectParameterType { + /** + * Identifier of the object to release. + */ + objectId: RemoteObjectId; + } + interface ReleaseObjectGroupParameterType { + /** + * Symbolic object group name. + */ + objectGroup: string; + } + interface SetCustomObjectFormatterEnabledParameterType { + enabled: boolean; + } + interface CompileScriptParameterType { + /** + * Expression to compile. + */ + expression: string; + /** + * Source url to be set for the script. + */ + sourceURL: string; + /** + * Specifies whether the compiled script should be persisted. + */ + persistScript: boolean; + /** + * Specifies in which execution context to perform script run. If the parameter is omitted the evaluation will be performed in the context of the inspected page. + */ + executionContextId?: ExecutionContextId | undefined; + } + interface RunScriptParameterType { + /** + * Id of the script to run. + */ + scriptId: ScriptId; + /** + * Specifies in which execution context to perform script run. If the parameter is omitted the evaluation will be performed in the context of the inspected page. + */ + executionContextId?: ExecutionContextId | undefined; + /** + * Symbolic group name that can be used to release multiple objects. + */ + objectGroup?: string | undefined; + /** + * In silent mode exceptions thrown during evaluation are not reported and do not pause execution. Overrides setPauseOnException state. + */ + silent?: boolean | undefined; + /** + * Determines whether Command Line API should be available during the evaluation. + */ + includeCommandLineAPI?: boolean | undefined; + /** + * Whether the result is expected to be a JSON object which should be sent by value. + */ + returnByValue?: boolean | undefined; + /** + * Whether preview should be generated for the result. + */ + generatePreview?: boolean | undefined; + /** + * Whether execution should await for resulting value and return once awaited promise is resolved. + */ + awaitPromise?: boolean | undefined; + } + interface QueryObjectsParameterType { + /** + * Identifier of the prototype to return objects for. + */ + prototypeObjectId: RemoteObjectId; + } + interface GlobalLexicalScopeNamesParameterType { + /** + * Specifies in which execution context to lookup global scope variables. + */ + executionContextId?: ExecutionContextId | undefined; + } + interface EvaluateReturnType { + /** + * Evaluation result. + */ + result: RemoteObject; + /** + * Exception details. + */ + exceptionDetails?: ExceptionDetails | undefined; + } + interface AwaitPromiseReturnType { + /** + * Promise result. Will contain rejected value if promise was rejected. 
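+ * As a rough sketch of how this result is usually obtained (the resolved value 42 is an
+ * arbitrary placeholder), assuming a Session from this module:
+ *
+ * ```js
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * // Evaluate an expression that yields a promise and keep it as a remote object.
+ * session.post('Runtime.evaluate', { expression: 'Promise.resolve(42)' }, (err, { result }) => {
+ *   session.post('Runtime.awaitPromise', {
+ *     promiseObjectId: result.objectId,
+ *     returnByValue: true,
+ *   }, (err2, { result: settled }) => {
+ *     // settled.value === 42 once the promise resolves (a rejected value arrives the same way)
+ *   });
+ * });
+ * ```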
+ */ + result: RemoteObject; + /** + * Exception details if stack strace is available. + */ + exceptionDetails?: ExceptionDetails | undefined; + } + interface CallFunctionOnReturnType { + /** + * Call result. + */ + result: RemoteObject; + /** + * Exception details. + */ + exceptionDetails?: ExceptionDetails | undefined; + } + interface GetPropertiesReturnType { + /** + * Object properties. + */ + result: PropertyDescriptor[]; + /** + * Internal object properties (only of the element itself). + */ + internalProperties?: InternalPropertyDescriptor[] | undefined; + /** + * Exception details. + */ + exceptionDetails?: ExceptionDetails | undefined; + } + interface CompileScriptReturnType { + /** + * Id of the script. + */ + scriptId?: ScriptId | undefined; + /** + * Exception details. + */ + exceptionDetails?: ExceptionDetails | undefined; + } + interface RunScriptReturnType { + /** + * Run result. + */ + result: RemoteObject; + /** + * Exception details. + */ + exceptionDetails?: ExceptionDetails | undefined; + } + interface QueryObjectsReturnType { + /** + * Array with objects. + */ + objects: RemoteObject; + } + interface GlobalLexicalScopeNamesReturnType { + names: string[]; + } + interface ExecutionContextCreatedEventDataType { + /** + * A newly created execution context. + */ + context: ExecutionContextDescription; + } + interface ExecutionContextDestroyedEventDataType { + /** + * Id of the destroyed context + */ + executionContextId: ExecutionContextId; + } + interface ExceptionThrownEventDataType { + /** + * Timestamp of the exception. + */ + timestamp: Timestamp; + exceptionDetails: ExceptionDetails; + } + interface ExceptionRevokedEventDataType { + /** + * Reason describing why exception was revoked. + */ + reason: string; + /** + * The id of revoked exception, as reported in exceptionThrown. + */ + exceptionId: number; + } + interface ConsoleAPICalledEventDataType { + /** + * Type of the call. + */ + type: string; + /** + * Call arguments. + */ + args: RemoteObject[]; + /** + * Identifier of the context where the call was made. + */ + executionContextId: ExecutionContextId; + /** + * Call timestamp. + */ + timestamp: Timestamp; + /** + * Stack trace captured when the call was made. + */ + stackTrace?: StackTrace | undefined; + /** + * Console context descriptor for calls on non-default console context (not console.*): 'anonymous#unique-logger-id' for call on unnamed context, 'name#unique-logger-id' for call on named context. + * @experimental + */ + context?: string | undefined; + } + interface InspectRequestedEventDataType { + object: RemoteObject; + hints: {}; + } + } + namespace Debugger { + /** + * Breakpoint identifier. + */ + type BreakpointId = string; + /** + * Call frame identifier. + */ + type CallFrameId = string; + /** + * Location in the source code. + */ + interface Location { + /** + * Script identifier as reported in the Debugger.scriptParsed. + */ + scriptId: Runtime.ScriptId; + /** + * Line number in the script (0-based). + */ + lineNumber: number; + /** + * Column number in the script (0-based). + */ + columnNumber?: number | undefined; + } + /** + * Location in the source code. + * @experimental + */ + interface ScriptPosition { + lineNumber: number; + columnNumber: number; + } + /** + * JavaScript call frame. Array of call frames form the call stack. + */ + interface CallFrame { + /** + * Call frame identifier. This identifier is only valid while the virtual machine is paused. 
+ */ + callFrameId: CallFrameId; + /** + * Name of the JavaScript function called on this call frame. + */ + functionName: string; + /** + * Location in the source code. + */ + functionLocation?: Location | undefined; + /** + * Location in the source code. + */ + location: Location; + /** + * JavaScript script name or url. + */ + url: string; + /** + * Scope chain for this call frame. + */ + scopeChain: Scope[]; + /** + * this object for this call frame. + */ + this: Runtime.RemoteObject; + /** + * The value being returned, if the function is at return point. + */ + returnValue?: Runtime.RemoteObject | undefined; + } + /** + * Scope description. + */ + interface Scope { + /** + * Scope type. + */ + type: string; + /** + * Object representing the scope. For global and with scopes it represents the actual object; for the rest of the scopes, it is artificial transient object enumerating scope variables as its properties. + */ + object: Runtime.RemoteObject; + name?: string | undefined; + /** + * Location in the source code where scope starts + */ + startLocation?: Location | undefined; + /** + * Location in the source code where scope ends + */ + endLocation?: Location | undefined; + } + /** + * Search match for resource. + */ + interface SearchMatch { + /** + * Line number in resource content. + */ + lineNumber: number; + /** + * Line with match content. + */ + lineContent: string; + } + interface BreakLocation { + /** + * Script identifier as reported in the Debugger.scriptParsed. + */ + scriptId: Runtime.ScriptId; + /** + * Line number in the script (0-based). + */ + lineNumber: number; + /** + * Column number in the script (0-based). + */ + columnNumber?: number | undefined; + type?: string | undefined; + } + interface SetBreakpointsActiveParameterType { + /** + * New value for breakpoints active state. + */ + active: boolean; + } + interface SetSkipAllPausesParameterType { + /** + * New value for skip pauses state. + */ + skip: boolean; + } + interface SetBreakpointByUrlParameterType { + /** + * Line number to set breakpoint at. + */ + lineNumber: number; + /** + * URL of the resources to set breakpoint on. + */ + url?: string | undefined; + /** + * Regex pattern for the URLs of the resources to set breakpoints on. Either url or urlRegex must be specified. + */ + urlRegex?: string | undefined; + /** + * Script hash of the resources to set breakpoint on. + */ + scriptHash?: string | undefined; + /** + * Offset in the line to set breakpoint at. + */ + columnNumber?: number | undefined; + /** + * Expression to use as a breakpoint condition. When specified, debugger will only stop on the breakpoint if this expression evaluates to true. + */ + condition?: string | undefined; + } + interface SetBreakpointParameterType { + /** + * Location to set breakpoint in. + */ + location: Location; + /** + * Expression to use as a breakpoint condition. When specified, debugger will only stop on the breakpoint if this expression evaluates to true. + */ + condition?: string | undefined; + } + interface RemoveBreakpointParameterType { + breakpointId: BreakpointId; + } + interface GetPossibleBreakpointsParameterType { + /** + * Start of range to search possible breakpoint locations in. + */ + start: Location; + /** + * End of range to search possible breakpoint locations in (excluding). When not specified, end of scripts is used as end of range. + */ + end?: Location | undefined; + /** + * Only consider locations which are in the same (non-nested) function as start. 
+ */ + restrictToFunction?: boolean | undefined; + } + interface ContinueToLocationParameterType { + /** + * Location to continue to. + */ + location: Location; + targetCallFrames?: string | undefined; + } + interface PauseOnAsyncCallParameterType { + /** + * Debugger will pause when async call with given stack trace is started. + */ + parentStackTraceId: Runtime.StackTraceId; + } + interface StepIntoParameterType { + /** + * Debugger will issue additional Debugger.paused notification if any async task is scheduled before next pause. + * @experimental + */ + breakOnAsyncCall?: boolean | undefined; + } + interface GetStackTraceParameterType { + stackTraceId: Runtime.StackTraceId; + } + interface SearchInContentParameterType { + /** + * Id of the script to search in. + */ + scriptId: Runtime.ScriptId; + /** + * String to search for. + */ + query: string; + /** + * If true, search is case sensitive. + */ + caseSensitive?: boolean | undefined; + /** + * If true, treats string parameter as regex. + */ + isRegex?: boolean | undefined; + } + interface SetScriptSourceParameterType { + /** + * Id of the script to edit. + */ + scriptId: Runtime.ScriptId; + /** + * New content of the script. + */ + scriptSource: string; + /** + * If true the change will not actually be applied. Dry run may be used to get result description without actually modifying the code. + */ + dryRun?: boolean | undefined; + } + interface RestartFrameParameterType { + /** + * Call frame identifier to evaluate on. + */ + callFrameId: CallFrameId; + } + interface GetScriptSourceParameterType { + /** + * Id of the script to get source for. + */ + scriptId: Runtime.ScriptId; + } + interface SetPauseOnExceptionsParameterType { + /** + * Pause on exceptions mode. + */ + state: string; + } + interface EvaluateOnCallFrameParameterType { + /** + * Call frame identifier to evaluate on. + */ + callFrameId: CallFrameId; + /** + * Expression to evaluate. + */ + expression: string; + /** + * String object group name to put result into (allows rapid releasing resulting object handles using releaseObjectGroup). + */ + objectGroup?: string | undefined; + /** + * Specifies whether command line API should be available to the evaluated expression, defaults to false. + */ + includeCommandLineAPI?: boolean | undefined; + /** + * In silent mode exceptions thrown during evaluation are not reported and do not pause execution. Overrides setPauseOnException state. + */ + silent?: boolean | undefined; + /** + * Whether the result is expected to be a JSON object that should be sent by value. + */ + returnByValue?: boolean | undefined; + /** + * Whether preview should be generated for the result. + * @experimental + */ + generatePreview?: boolean | undefined; + /** + * Whether to throw an exception if side effect cannot be ruled out during evaluation. + */ + throwOnSideEffect?: boolean | undefined; + } + interface SetVariableValueParameterType { + /** + * 0-based number of scope as was listed in scope chain. Only 'local', 'closure' and 'catch' scope types are allowed. Other scopes could be manipulated manually. + */ + scopeNumber: number; + /** + * Variable name. + */ + variableName: string; + /** + * New variable value. + */ + newValue: Runtime.CallArgument; + /** + * Id of callframe that holds variable. + */ + callFrameId: CallFrameId; + } + interface SetReturnValueParameterType { + /** + * New return value. + */ + newValue: Runtime.CallArgument; + } + interface SetAsyncCallStackDepthParameterType { + /** + * Maximum depth of async call stacks. 
Setting to 0 will effectively disable collecting async call stacks (default). + */ + maxDepth: number; + } + interface SetBlackboxPatternsParameterType { + /** + * Array of regexps that will be used to check script url for blackbox state. + */ + patterns: string[]; + } + interface SetBlackboxedRangesParameterType { + /** + * Id of the script. + */ + scriptId: Runtime.ScriptId; + positions: ScriptPosition[]; + } + interface EnableReturnType { + /** + * Unique identifier of the debugger. + * @experimental + */ + debuggerId: Runtime.UniqueDebuggerId; + } + interface SetBreakpointByUrlReturnType { + /** + * Id of the created breakpoint for further reference. + */ + breakpointId: BreakpointId; + /** + * List of the locations this breakpoint resolved into upon addition. + */ + locations: Location[]; + } + interface SetBreakpointReturnType { + /** + * Id of the created breakpoint for further reference. + */ + breakpointId: BreakpointId; + /** + * Location this breakpoint resolved into. + */ + actualLocation: Location; + } + interface GetPossibleBreakpointsReturnType { + /** + * List of the possible breakpoint locations. + */ + locations: BreakLocation[]; + } + interface GetStackTraceReturnType { + stackTrace: Runtime.StackTrace; + } + interface SearchInContentReturnType { + /** + * List of search matches. + */ + result: SearchMatch[]; + } + interface SetScriptSourceReturnType { + /** + * New stack trace in case editing has happened while VM was stopped. + */ + callFrames?: CallFrame[] | undefined; + /** + * Whether current call stack was modified after applying the changes. + */ + stackChanged?: boolean | undefined; + /** + * Async stack trace, if any. + */ + asyncStackTrace?: Runtime.StackTrace | undefined; + /** + * Async stack trace, if any. + * @experimental + */ + asyncStackTraceId?: Runtime.StackTraceId | undefined; + /** + * Exception details if any. + */ + exceptionDetails?: Runtime.ExceptionDetails | undefined; + } + interface RestartFrameReturnType { + /** + * New stack trace. + */ + callFrames: CallFrame[]; + /** + * Async stack trace, if any. + */ + asyncStackTrace?: Runtime.StackTrace | undefined; + /** + * Async stack trace, if any. + * @experimental + */ + asyncStackTraceId?: Runtime.StackTraceId | undefined; + } + interface GetScriptSourceReturnType { + /** + * Script source. + */ + scriptSource: string; + } + interface EvaluateOnCallFrameReturnType { + /** + * Object wrapper for the evaluation result. + */ + result: Runtime.RemoteObject; + /** + * Exception details. + */ + exceptionDetails?: Runtime.ExceptionDetails | undefined; + } + interface ScriptParsedEventDataType { + /** + * Identifier of the script parsed. + */ + scriptId: Runtime.ScriptId; + /** + * URL or name of the script parsed (if any). + */ + url: string; + /** + * Line offset of the script within the resource with given URL (for script tags). + */ + startLine: number; + /** + * Column offset of the script within the resource with given URL. + */ + startColumn: number; + /** + * Last line of the script. + */ + endLine: number; + /** + * Length of the last line of the script. + */ + endColumn: number; + /** + * Specifies script creation context. + */ + executionContextId: Runtime.ExecutionContextId; + /** + * Content hash of the script. + */ + hash: string; + /** + * Embedder-specific auxiliary data. + */ + executionContextAuxData?: {} | undefined; + /** + * True, if this script is generated as a result of the live edit operation. 
+ * @experimental + */ + isLiveEdit?: boolean | undefined; + /** + * URL of source map associated with script (if any). + */ + sourceMapURL?: string | undefined; + /** + * True, if this script has sourceURL. + */ + hasSourceURL?: boolean | undefined; + /** + * True, if this script is ES6 module. + */ + isModule?: boolean | undefined; + /** + * This script length. + */ + length?: number | undefined; + /** + * JavaScript top stack frame of where the script parsed event was triggered if available. + * @experimental + */ + stackTrace?: Runtime.StackTrace | undefined; + } + interface ScriptFailedToParseEventDataType { + /** + * Identifier of the script parsed. + */ + scriptId: Runtime.ScriptId; + /** + * URL or name of the script parsed (if any). + */ + url: string; + /** + * Line offset of the script within the resource with given URL (for script tags). + */ + startLine: number; + /** + * Column offset of the script within the resource with given URL. + */ + startColumn: number; + /** + * Last line of the script. + */ + endLine: number; + /** + * Length of the last line of the script. + */ + endColumn: number; + /** + * Specifies script creation context. + */ + executionContextId: Runtime.ExecutionContextId; + /** + * Content hash of the script. + */ + hash: string; + /** + * Embedder-specific auxiliary data. + */ + executionContextAuxData?: {} | undefined; + /** + * URL of source map associated with script (if any). + */ + sourceMapURL?: string | undefined; + /** + * True, if this script has sourceURL. + */ + hasSourceURL?: boolean | undefined; + /** + * True, if this script is ES6 module. + */ + isModule?: boolean | undefined; + /** + * This script length. + */ + length?: number | undefined; + /** + * JavaScript top stack frame of where the script parsed event was triggered if available. + * @experimental + */ + stackTrace?: Runtime.StackTrace | undefined; + } + interface BreakpointResolvedEventDataType { + /** + * Breakpoint unique identifier. + */ + breakpointId: BreakpointId; + /** + * Actual breakpoint location. + */ + location: Location; + } + interface PausedEventDataType { + /** + * Call stack the virtual machine stopped on. + */ + callFrames: CallFrame[]; + /** + * Pause reason. + */ + reason: string; + /** + * Object containing break-specific auxiliary properties. + */ + data?: {} | undefined; + /** + * Hit breakpoints IDs + */ + hitBreakpoints?: string[] | undefined; + /** + * Async stack trace, if any. + */ + asyncStackTrace?: Runtime.StackTrace | undefined; + /** + * Async stack trace, if any. + * @experimental + */ + asyncStackTraceId?: Runtime.StackTraceId | undefined; + /** + * Just scheduled async call will have this stack trace as parent stack during async execution. This field is available only after Debugger.stepInto call with breakOnAsynCall flag. + * @experimental + */ + asyncCallStackTraceId?: Runtime.StackTraceId | undefined; + } + } + namespace Console { + /** + * Console message. + */ + interface ConsoleMessage { + /** + * Message source. + */ + source: string; + /** + * Message severity. + */ + level: string; + /** + * Message text. + */ + text: string; + /** + * URL of the message origin. + */ + url?: string | undefined; + /** + * Line number in the resource that generated this message (1-based). + */ + line?: number | undefined; + /** + * Column number in the resource that generated this message (1-based). + */ + column?: number | undefined; + } + interface MessageAddedEventDataType { + /** + * Console message that has been added. 
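+ * A hedged sketch of consuming this event from the same process (messages are buffered in
+ * an array rather than logged, so the handler does not itself produce new console output):
+ *
+ * ```js
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * const collected = [];
+ * session.on('Console.messageAdded', (notification) => {
+ *   // notification.params.message is the ConsoleMessage described above
+ *   collected.push(notification.params.message);
+ * });
+ * // Enabling the domain replays messages collected so far and reports new ones.
+ * session.post('Console.enable');
+ * ```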
+ */ + message: ConsoleMessage; + } + } + namespace Profiler { + /** + * Profile node. Holds callsite information, execution statistics and child nodes. + */ + interface ProfileNode { + /** + * Unique id of the node. + */ + id: number; + /** + * Function location. + */ + callFrame: Runtime.CallFrame; + /** + * Number of samples where this node was on top of the call stack. + */ + hitCount?: number | undefined; + /** + * Child node ids. + */ + children?: number[] | undefined; + /** + * The reason of being not optimized. The function may be deoptimized or marked as don't optimize. + */ + deoptReason?: string | undefined; + /** + * An array of source position ticks. + */ + positionTicks?: PositionTickInfo[] | undefined; + } + /** + * Profile. + */ + interface Profile { + /** + * The list of profile nodes. First item is the root node. + */ + nodes: ProfileNode[]; + /** + * Profiling start timestamp in microseconds. + */ + startTime: number; + /** + * Profiling end timestamp in microseconds. + */ + endTime: number; + /** + * Ids of samples top nodes. + */ + samples?: number[] | undefined; + /** + * Time intervals between adjacent samples in microseconds. The first delta is relative to the profile startTime. + */ + timeDeltas?: number[] | undefined; + } + /** + * Specifies a number of samples attributed to a certain source position. + */ + interface PositionTickInfo { + /** + * Source line number (1-based). + */ + line: number; + /** + * Number of samples attributed to the source line. + */ + ticks: number; + } + /** + * Coverage data for a source range. + */ + interface CoverageRange { + /** + * JavaScript script source offset for the range start. + */ + startOffset: number; + /** + * JavaScript script source offset for the range end. + */ + endOffset: number; + /** + * Collected execution count of the source range. + */ + count: number; + } + /** + * Coverage data for a JavaScript function. + */ + interface FunctionCoverage { + /** + * JavaScript function name. + */ + functionName: string; + /** + * Source ranges inside the function with coverage data. + */ + ranges: CoverageRange[]; + /** + * Whether coverage data for this function has block granularity. + */ + isBlockCoverage: boolean; + } + /** + * Coverage data for a JavaScript script. + */ + interface ScriptCoverage { + /** + * JavaScript script id. + */ + scriptId: Runtime.ScriptId; + /** + * JavaScript script name or url. + */ + url: string; + /** + * Functions contained in the script that has coverage data. + */ + functions: FunctionCoverage[]; + } + /** + * Describes a type collected during runtime. + * @experimental + */ + interface TypeObject { + /** + * Name of a type collected with type profiling. + */ + name: string; + } + /** + * Source offset and types for a parameter or return value. + * @experimental + */ + interface TypeProfileEntry { + /** + * Source offset of the parameter or end of function for return values. + */ + offset: number; + /** + * The types for this parameter or return value. + */ + types: TypeObject[]; + } + /** + * Type profile data collected during runtime for a JavaScript script. + * @experimental + */ + interface ScriptTypeProfile { + /** + * JavaScript script id. + */ + scriptId: Runtime.ScriptId; + /** + * JavaScript script name or url. + */ + url: string; + /** + * Type profile entries for parameters and return values of the functions in the script. + */ + entries: TypeProfileEntry[]; + } + interface SetSamplingIntervalParameterType { + /** + * New sampling interval in microseconds. 
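+ * A usage sketch: the interval has to be set before recording starts (the 100-microsecond
+ * value and the output file name are arbitrary choices here):
+ *
+ * ```js
+ * const fs = require('fs');
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * session.post('Profiler.enable', () => {
+ *   // Must be issued before Profiler.start.
+ *   session.post('Profiler.setSamplingInterval', { interval: 100 }, () => {
+ *     session.post('Profiler.start', () => {
+ *       // ... run the code to be profiled ...
+ *       session.post('Profiler.stop', (err, { profile }) => {
+ *         if (!err) fs.writeFileSync('./app.cpuprofile', JSON.stringify(profile));
+ *       });
+ *     });
+ *   });
+ * });
+ * ```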
+ */ + interval: number; + } + interface StartPreciseCoverageParameterType { + /** + * Collect accurate call counts beyond simple 'covered' or 'not covered'. + */ + callCount?: boolean | undefined; + /** + * Collect block-based coverage. + */ + detailed?: boolean | undefined; + } + interface StopReturnType { + /** + * Recorded profile. + */ + profile: Profile; + } + interface TakePreciseCoverageReturnType { + /** + * Coverage data for the current isolate. + */ + result: ScriptCoverage[]; + } + interface GetBestEffortCoverageReturnType { + /** + * Coverage data for the current isolate. + */ + result: ScriptCoverage[]; + } + interface TakeTypeProfileReturnType { + /** + * Type profile for all scripts since startTypeProfile() was turned on. + */ + result: ScriptTypeProfile[]; + } + interface ConsoleProfileStartedEventDataType { + id: string; + /** + * Location of console.profile(). + */ + location: Debugger.Location; + /** + * Profile title passed as an argument to console.profile(). + */ + title?: string | undefined; + } + interface ConsoleProfileFinishedEventDataType { + id: string; + /** + * Location of console.profileEnd(). + */ + location: Debugger.Location; + profile: Profile; + /** + * Profile title passed as an argument to console.profile(). + */ + title?: string | undefined; + } + } + namespace HeapProfiler { + /** + * Heap snapshot object id. + */ + type HeapSnapshotObjectId = string; + /** + * Sampling Heap Profile node. Holds callsite information, allocation statistics and child nodes. + */ + interface SamplingHeapProfileNode { + /** + * Function location. + */ + callFrame: Runtime.CallFrame; + /** + * Allocations size in bytes for the node excluding children. + */ + selfSize: number; + /** + * Child nodes. + */ + children: SamplingHeapProfileNode[]; + } + /** + * Profile. + */ + interface SamplingHeapProfile { + head: SamplingHeapProfileNode; + } + interface StartTrackingHeapObjectsParameterType { + trackAllocations?: boolean | undefined; + } + interface StopTrackingHeapObjectsParameterType { + /** + * If true 'reportHeapSnapshotProgress' events will be generated while snapshot is being taken when the tracking is stopped. + */ + reportProgress?: boolean | undefined; + } + interface TakeHeapSnapshotParameterType { + /** + * If true 'reportHeapSnapshotProgress' events will be generated while snapshot is being taken. + */ + reportProgress?: boolean | undefined; + } + interface GetObjectByHeapObjectIdParameterType { + objectId: HeapSnapshotObjectId; + /** + * Symbolic group name that can be used to release multiple objects. + */ + objectGroup?: string | undefined; + } + interface AddInspectedHeapObjectParameterType { + /** + * Heap snapshot object id to be accessible by means of $x command line API. + */ + heapObjectId: HeapSnapshotObjectId; + } + interface GetHeapObjectIdParameterType { + /** + * Identifier of the object to get heap object id for. + */ + objectId: Runtime.RemoteObjectId; + } + interface StartSamplingParameterType { + /** + * Average sample interval in bytes. Poisson distribution is used for the intervals. The default value is 32768 bytes. + */ + samplingInterval?: number | undefined; + } + interface GetObjectByHeapObjectIdReturnType { + /** + * Evaluation result. + */ + result: Runtime.RemoteObject; + } + interface GetHeapObjectIdReturnType { + /** + * Id of the heap snapshot object corresponding to the passed remote object id. + */ + heapSnapshotObjectId: HeapSnapshotObjectId; + } + interface StopSamplingReturnType { + /** + * Recorded sampling heap profile. 
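+ * A sketch of how such a profile is typically captured (32768 bytes is the documented
+ * default interval and is shown only for clarity):
+ *
+ * ```js
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * session.post('HeapProfiler.enable', () => {
+ *   session.post('HeapProfiler.startSampling', { samplingInterval: 32768 }, () => {
+ *     // ... allocate the objects to be sampled ...
+ *     session.post('HeapProfiler.stopSampling', (err, { profile }) => {
+ *       // profile.head is the root SamplingHeapProfileNode of the recorded call tree
+ *     });
+ *   });
+ * });
+ * ```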
+ */ + profile: SamplingHeapProfile; + } + interface GetSamplingProfileReturnType { + /** + * Return the sampling profile being collected. + */ + profile: SamplingHeapProfile; + } + interface AddHeapSnapshotChunkEventDataType { + chunk: string; + } + interface ReportHeapSnapshotProgressEventDataType { + done: number; + total: number; + finished?: boolean | undefined; + } + interface LastSeenObjectIdEventDataType { + lastSeenObjectId: number; + timestamp: number; + } + interface HeapStatsUpdateEventDataType { + /** + * An array of triplets. Each triplet describes a fragment. The first integer is the fragment index, the second integer is a total count of objects for the fragment, the third integer is a total size of the objects for the fragment. + */ + statsUpdate: number[]; + } + } + namespace NodeTracing { + interface TraceConfig { + /** + * Controls how the trace buffer stores data. + */ + recordMode?: string; + /** + * Included category filters. + */ + includedCategories: string[]; + } + interface StartParameterType { + traceConfig: TraceConfig; + } + interface GetCategoriesReturnType { + /** + * A list of supported tracing categories. + */ + categories: string[]; + } + interface DataCollectedEventDataType { + value: Array<{}>; + } + } + namespace NodeWorker { + type WorkerID = string; + /** + * Unique identifier of attached debugging session. + */ + type SessionID = string; + interface WorkerInfo { + workerId: WorkerID; + type: string; + title: string; + url: string; + } + interface SendMessageToWorkerParameterType { + message: string; + /** + * Identifier of the session. + */ + sessionId: SessionID; + } + interface EnableParameterType { + /** + * Whether to new workers should be paused until the frontend sends `Runtime.runIfWaitingForDebugger` + * message to run them. + */ + waitForDebuggerOnStart: boolean; + } + interface DetachParameterType { + sessionId: SessionID; + } + interface AttachedToWorkerEventDataType { + /** + * Identifier assigned to the session used to send/receive messages. + */ + sessionId: SessionID; + workerInfo: WorkerInfo; + waitingForDebugger: boolean; + } + interface DetachedFromWorkerEventDataType { + /** + * Detached session identifier. + */ + sessionId: SessionID; + } + interface ReceivedMessageFromWorkerEventDataType { + /** + * Identifier of a session which sends a message. + */ + sessionId: SessionID; + message: string; + } + } + namespace NodeRuntime { + interface NotifyWhenWaitingForDisconnectParameterType { + enabled: boolean; + } + } + /** + * The `inspector.Session` is used for dispatching messages to the V8 inspector + * back-end and receiving message responses and notifications. + */ + class Session extends EventEmitter { + /** + * Create a new instance of the inspector.Session class. + * The inspector session needs to be connected through session.connect() before the messages can be dispatched to the inspector backend. + */ + constructor(); + /** + * Connects a session to the inspector back-end. + * @since v8.0.0 + */ + connect(): void; + /** + * Immediately close the session. All pending message callbacks will be called + * with an error. `session.connect()` will need to be called to be able to send + * messages again. Reconnected session will lose all inspector state, such as + * enabled agents or configured breakpoints. + * @since v8.0.0 + */ + disconnect(): void; + /** + * Posts a message to the inspector back-end. `callback` will be notified when + * a response is received. 
`callback` is a function that accepts two optional + * arguments: error and message-specific result. + * + * ```js + * session.post('Runtime.evaluate', { expression: '2 + 2' }, + * (error, { result }) => console.log(result)); + * // Output: { type: 'number', value: 4, description: '4' } + * ``` + * + * The latest version of the V8 inspector protocol is published on the [Chrome DevTools Protocol Viewer](https://chromedevtools.github.io/devtools-protocol/v8/). + * + * Node.js inspector supports all the Chrome DevTools Protocol domains declared + * by V8\. Chrome DevTools Protocol domain provides an interface for interacting + * with one of the runtime agents used to inspect the application state and listen + * to the run-time events. + * + * ## Example usage + * + * Apart from the debugger, various V8 Profilers are available through the DevTools + * protocol. + * @since v8.0.0 + */ + post(method: string, params?: {}, callback?: (err: Error | null, params?: {}) => void): void; + post(method: string, callback?: (err: Error | null, params?: {}) => void): void; + /** + * Returns supported domains. + */ + post(method: 'Schema.getDomains', callback?: (err: Error | null, params: Schema.GetDomainsReturnType) => void): void; + /** + * Evaluates expression on global object. + */ + post(method: 'Runtime.evaluate', params?: Runtime.EvaluateParameterType, callback?: (err: Error | null, params: Runtime.EvaluateReturnType) => void): void; + post(method: 'Runtime.evaluate', callback?: (err: Error | null, params: Runtime.EvaluateReturnType) => void): void; + /** + * Add handler to promise with given promise object id. + */ + post(method: 'Runtime.awaitPromise', params?: Runtime.AwaitPromiseParameterType, callback?: (err: Error | null, params: Runtime.AwaitPromiseReturnType) => void): void; + post(method: 'Runtime.awaitPromise', callback?: (err: Error | null, params: Runtime.AwaitPromiseReturnType) => void): void; + /** + * Calls function with given declaration on the given object. Object group of the result is inherited from the target object. + */ + post(method: 'Runtime.callFunctionOn', params?: Runtime.CallFunctionOnParameterType, callback?: (err: Error | null, params: Runtime.CallFunctionOnReturnType) => void): void; + post(method: 'Runtime.callFunctionOn', callback?: (err: Error | null, params: Runtime.CallFunctionOnReturnType) => void): void; + /** + * Returns properties of a given object. Object group of the result is inherited from the target object. + */ + post(method: 'Runtime.getProperties', params?: Runtime.GetPropertiesParameterType, callback?: (err: Error | null, params: Runtime.GetPropertiesReturnType) => void): void; + post(method: 'Runtime.getProperties', callback?: (err: Error | null, params: Runtime.GetPropertiesReturnType) => void): void; + /** + * Releases remote object with given id. + */ + post(method: 'Runtime.releaseObject', params?: Runtime.ReleaseObjectParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Runtime.releaseObject', callback?: (err: Error | null) => void): void; + /** + * Releases all remote objects that belong to a given group. + */ + post(method: 'Runtime.releaseObjectGroup', params?: Runtime.ReleaseObjectGroupParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Runtime.releaseObjectGroup', callback?: (err: Error | null) => void): void; + /** + * Tells inspected instance to run if it was waiting for debugger to attach. 
+ */ + post(method: 'Runtime.runIfWaitingForDebugger', callback?: (err: Error | null) => void): void; + /** + * Enables reporting of execution contexts creation by means of executionContextCreated event. When the reporting gets enabled the event will be sent immediately for each existing execution context. + */ + post(method: 'Runtime.enable', callback?: (err: Error | null) => void): void; + /** + * Disables reporting of execution contexts creation. + */ + post(method: 'Runtime.disable', callback?: (err: Error | null) => void): void; + /** + * Discards collected exceptions and console API calls. + */ + post(method: 'Runtime.discardConsoleEntries', callback?: (err: Error | null) => void): void; + /** + * @experimental + */ + post(method: 'Runtime.setCustomObjectFormatterEnabled', params?: Runtime.SetCustomObjectFormatterEnabledParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Runtime.setCustomObjectFormatterEnabled', callback?: (err: Error | null) => void): void; + /** + * Compiles expression. + */ + post(method: 'Runtime.compileScript', params?: Runtime.CompileScriptParameterType, callback?: (err: Error | null, params: Runtime.CompileScriptReturnType) => void): void; + post(method: 'Runtime.compileScript', callback?: (err: Error | null, params: Runtime.CompileScriptReturnType) => void): void; + /** + * Runs script with given id in a given context. + */ + post(method: 'Runtime.runScript', params?: Runtime.RunScriptParameterType, callback?: (err: Error | null, params: Runtime.RunScriptReturnType) => void): void; + post(method: 'Runtime.runScript', callback?: (err: Error | null, params: Runtime.RunScriptReturnType) => void): void; + post(method: 'Runtime.queryObjects', params?: Runtime.QueryObjectsParameterType, callback?: (err: Error | null, params: Runtime.QueryObjectsReturnType) => void): void; + post(method: 'Runtime.queryObjects', callback?: (err: Error | null, params: Runtime.QueryObjectsReturnType) => void): void; + /** + * Returns all let, const and class variables from global scope. + */ + post( + method: 'Runtime.globalLexicalScopeNames', + params?: Runtime.GlobalLexicalScopeNamesParameterType, + callback?: (err: Error | null, params: Runtime.GlobalLexicalScopeNamesReturnType) => void + ): void; + post(method: 'Runtime.globalLexicalScopeNames', callback?: (err: Error | null, params: Runtime.GlobalLexicalScopeNamesReturnType) => void): void; + /** + * Enables debugger for the given page. Clients should not assume that the debugging has been enabled until the result for this command is received. + */ + post(method: 'Debugger.enable', callback?: (err: Error | null, params: Debugger.EnableReturnType) => void): void; + /** + * Disables debugger for given page. + */ + post(method: 'Debugger.disable', callback?: (err: Error | null) => void): void; + /** + * Activates / deactivates all breakpoints on the page. + */ + post(method: 'Debugger.setBreakpointsActive', params?: Debugger.SetBreakpointsActiveParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setBreakpointsActive', callback?: (err: Error | null) => void): void; + /** + * Makes page not interrupt on any pauses (breakpoint, exception, dom exception etc). 
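+ * More broadly, a hedged sketch of how the Debugger methods above fit together (the script
+ * URL and line number below are placeholders, not values from this module):
+ *
+ * ```js
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * session.on('Debugger.paused', ({ params }) => {
+ *   // params.callFrames[0] describes the top frame; resume so the process is not left paused.
+ *   session.post('Debugger.resume');
+ * });
+ * session.post('Debugger.enable', () => {
+ *   session.post('Debugger.setBreakpointByUrl', {
+ *     lineNumber: 9,                  // 0-based, i.e. line 10 of the file
+ *     url: 'file:///path/to/app.js',  // placeholder URL
+ *   }, (err, res) => {
+ *     // res.breakpointId and res.locations identify the registered breakpoint
+ *   });
+ * });
+ * ```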
+ */ + post(method: 'Debugger.setSkipAllPauses', params?: Debugger.SetSkipAllPausesParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setSkipAllPauses', callback?: (err: Error | null) => void): void; + /** + * Sets JavaScript breakpoint at given location specified either by URL or URL regex. Once this command is issued, all existing parsed scripts will have breakpoints resolved and returned in locations property. Further matching script parsing will result in subsequent breakpointResolved events issued. This logical breakpoint will survive page reloads. + */ + post(method: 'Debugger.setBreakpointByUrl', params?: Debugger.SetBreakpointByUrlParameterType, callback?: (err: Error | null, params: Debugger.SetBreakpointByUrlReturnType) => void): void; + post(method: 'Debugger.setBreakpointByUrl', callback?: (err: Error | null, params: Debugger.SetBreakpointByUrlReturnType) => void): void; + /** + * Sets JavaScript breakpoint at a given location. + */ + post(method: 'Debugger.setBreakpoint', params?: Debugger.SetBreakpointParameterType, callback?: (err: Error | null, params: Debugger.SetBreakpointReturnType) => void): void; + post(method: 'Debugger.setBreakpoint', callback?: (err: Error | null, params: Debugger.SetBreakpointReturnType) => void): void; + /** + * Removes JavaScript breakpoint. + */ + post(method: 'Debugger.removeBreakpoint', params?: Debugger.RemoveBreakpointParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.removeBreakpoint', callback?: (err: Error | null) => void): void; + /** + * Returns possible locations for breakpoint. scriptId in start and end range locations should be the same. + */ + post( + method: 'Debugger.getPossibleBreakpoints', + params?: Debugger.GetPossibleBreakpointsParameterType, + callback?: (err: Error | null, params: Debugger.GetPossibleBreakpointsReturnType) => void + ): void; + post(method: 'Debugger.getPossibleBreakpoints', callback?: (err: Error | null, params: Debugger.GetPossibleBreakpointsReturnType) => void): void; + /** + * Continues execution until specific location is reached. + */ + post(method: 'Debugger.continueToLocation', params?: Debugger.ContinueToLocationParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.continueToLocation', callback?: (err: Error | null) => void): void; + /** + * @experimental + */ + post(method: 'Debugger.pauseOnAsyncCall', params?: Debugger.PauseOnAsyncCallParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.pauseOnAsyncCall', callback?: (err: Error | null) => void): void; + /** + * Steps over the statement. + */ + post(method: 'Debugger.stepOver', callback?: (err: Error | null) => void): void; + /** + * Steps into the function call. + */ + post(method: 'Debugger.stepInto', params?: Debugger.StepIntoParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.stepInto', callback?: (err: Error | null) => void): void; + /** + * Steps out of the function call. + */ + post(method: 'Debugger.stepOut', callback?: (err: Error | null) => void): void; + /** + * Stops on the next JavaScript statement. + */ + post(method: 'Debugger.pause', callback?: (err: Error | null) => void): void; + /** + * This method is deprecated - use Debugger.stepInto with breakOnAsyncCall and Debugger.pauseOnAsyncTask instead. Steps into next scheduled async task if any is scheduled before next pause. 
Returns success when async task is actually scheduled, returns error if no task were scheduled or another scheduleStepIntoAsync was called. + * @experimental + */ + post(method: 'Debugger.scheduleStepIntoAsync', callback?: (err: Error | null) => void): void; + /** + * Resumes JavaScript execution. + */ + post(method: 'Debugger.resume', callback?: (err: Error | null) => void): void; + /** + * Returns stack trace with given stackTraceId. + * @experimental + */ + post(method: 'Debugger.getStackTrace', params?: Debugger.GetStackTraceParameterType, callback?: (err: Error | null, params: Debugger.GetStackTraceReturnType) => void): void; + post(method: 'Debugger.getStackTrace', callback?: (err: Error | null, params: Debugger.GetStackTraceReturnType) => void): void; + /** + * Searches for given string in script content. + */ + post(method: 'Debugger.searchInContent', params?: Debugger.SearchInContentParameterType, callback?: (err: Error | null, params: Debugger.SearchInContentReturnType) => void): void; + post(method: 'Debugger.searchInContent', callback?: (err: Error | null, params: Debugger.SearchInContentReturnType) => void): void; + /** + * Edits JavaScript source live. + */ + post(method: 'Debugger.setScriptSource', params?: Debugger.SetScriptSourceParameterType, callback?: (err: Error | null, params: Debugger.SetScriptSourceReturnType) => void): void; + post(method: 'Debugger.setScriptSource', callback?: (err: Error | null, params: Debugger.SetScriptSourceReturnType) => void): void; + /** + * Restarts particular call frame from the beginning. + */ + post(method: 'Debugger.restartFrame', params?: Debugger.RestartFrameParameterType, callback?: (err: Error | null, params: Debugger.RestartFrameReturnType) => void): void; + post(method: 'Debugger.restartFrame', callback?: (err: Error | null, params: Debugger.RestartFrameReturnType) => void): void; + /** + * Returns source for the script with given id. + */ + post(method: 'Debugger.getScriptSource', params?: Debugger.GetScriptSourceParameterType, callback?: (err: Error | null, params: Debugger.GetScriptSourceReturnType) => void): void; + post(method: 'Debugger.getScriptSource', callback?: (err: Error | null, params: Debugger.GetScriptSourceReturnType) => void): void; + /** + * Defines pause on exceptions state. Can be set to stop on all exceptions, uncaught exceptions or no exceptions. Initial pause on exceptions state is none. + */ + post(method: 'Debugger.setPauseOnExceptions', params?: Debugger.SetPauseOnExceptionsParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setPauseOnExceptions', callback?: (err: Error | null) => void): void; + /** + * Evaluates expression on a given call frame. + */ + post(method: 'Debugger.evaluateOnCallFrame', params?: Debugger.EvaluateOnCallFrameParameterType, callback?: (err: Error | null, params: Debugger.EvaluateOnCallFrameReturnType) => void): void; + post(method: 'Debugger.evaluateOnCallFrame', callback?: (err: Error | null, params: Debugger.EvaluateOnCallFrameReturnType) => void): void; + /** + * Changes value of variable in a callframe. Object-based scopes are not supported and must be mutated manually. + */ + post(method: 'Debugger.setVariableValue', params?: Debugger.SetVariableValueParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setVariableValue', callback?: (err: Error | null) => void): void; + /** + * Changes return value in top frame. Available only at return break position. 
+ * @experimental + */ + post(method: 'Debugger.setReturnValue', params?: Debugger.SetReturnValueParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setReturnValue', callback?: (err: Error | null) => void): void; + /** + * Enables or disables async call stacks tracking. + */ + post(method: 'Debugger.setAsyncCallStackDepth', params?: Debugger.SetAsyncCallStackDepthParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setAsyncCallStackDepth', callback?: (err: Error | null) => void): void; + /** + * Replace previous blackbox patterns with passed ones. Forces backend to skip stepping/pausing in scripts with url matching one of the patterns. VM will try to leave blackboxed script by performing 'step in' several times, finally resorting to 'step out' if unsuccessful. + * @experimental + */ + post(method: 'Debugger.setBlackboxPatterns', params?: Debugger.SetBlackboxPatternsParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setBlackboxPatterns', callback?: (err: Error | null) => void): void; + /** + * Makes backend skip steps in the script in blackboxed ranges. VM will try leave blacklisted scripts by performing 'step in' several times, finally resorting to 'step out' if unsuccessful. Positions array contains positions where blackbox state is changed. First interval isn't blackboxed. Array should be sorted. + * @experimental + */ + post(method: 'Debugger.setBlackboxedRanges', params?: Debugger.SetBlackboxedRangesParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Debugger.setBlackboxedRanges', callback?: (err: Error | null) => void): void; + /** + * Enables console domain, sends the messages collected so far to the client by means of the messageAdded notification. + */ + post(method: 'Console.enable', callback?: (err: Error | null) => void): void; + /** + * Disables console domain, prevents further console messages from being reported to the client. + */ + post(method: 'Console.disable', callback?: (err: Error | null) => void): void; + /** + * Does nothing. + */ + post(method: 'Console.clearMessages', callback?: (err: Error | null) => void): void; + post(method: 'Profiler.enable', callback?: (err: Error | null) => void): void; + post(method: 'Profiler.disable', callback?: (err: Error | null) => void): void; + /** + * Changes CPU profiler sampling interval. Must be called before CPU profiles recording started. + */ + post(method: 'Profiler.setSamplingInterval', params?: Profiler.SetSamplingIntervalParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Profiler.setSamplingInterval', callback?: (err: Error | null) => void): void; + post(method: 'Profiler.start', callback?: (err: Error | null) => void): void; + post(method: 'Profiler.stop', callback?: (err: Error | null, params: Profiler.StopReturnType) => void): void; + /** + * Enable precise code coverage. Coverage data for JavaScript executed before enabling precise code coverage may be incomplete. Enabling prevents running optimized code and resets execution counters. + */ + post(method: 'Profiler.startPreciseCoverage', params?: Profiler.StartPreciseCoverageParameterType, callback?: (err: Error | null) => void): void; + post(method: 'Profiler.startPreciseCoverage', callback?: (err: Error | null) => void): void; + /** + * Disable precise code coverage. Disabling releases unnecessary execution count records and allows executing optimized code. 
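+ * For orientation, a sketch of the enable/collect/disable cycle these coverage methods form:
+ *
+ * ```js
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * session.post('Profiler.enable', () => {
+ *   session.post('Profiler.startPreciseCoverage', { callCount: true, detailed: true }, () => {
+ *     // ... exercise the code whose coverage should be measured ...
+ *     session.post('Profiler.takePreciseCoverage', (err, { result }) => {
+ *       // result is ScriptCoverage[]: per-script, per-function ranges with execution counts
+ *       session.post('Profiler.stopPreciseCoverage', () => session.disconnect());
+ *     });
+ *   });
+ * });
+ * ```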
+ */ + post(method: 'Profiler.stopPreciseCoverage', callback?: (err: Error | null) => void): void; + /** + * Collect coverage data for the current isolate, and resets execution counters. Precise code coverage needs to have started. + */ + post(method: 'Profiler.takePreciseCoverage', callback?: (err: Error | null, params: Profiler.TakePreciseCoverageReturnType) => void): void; + /** + * Collect coverage data for the current isolate. The coverage data may be incomplete due to garbage collection. + */ + post(method: 'Profiler.getBestEffortCoverage', callback?: (err: Error | null, params: Profiler.GetBestEffortCoverageReturnType) => void): void; + /** + * Enable type profile. + * @experimental + */ + post(method: 'Profiler.startTypeProfile', callback?: (err: Error | null) => void): void; + /** + * Disable type profile. Disabling releases type profile data collected so far. + * @experimental + */ + post(method: 'Profiler.stopTypeProfile', callback?: (err: Error | null) => void): void; + /** + * Collect type profile. + * @experimental + */ + post(method: 'Profiler.takeTypeProfile', callback?: (err: Error | null, params: Profiler.TakeTypeProfileReturnType) => void): void; + post(method: 'HeapProfiler.enable', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.disable', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.startTrackingHeapObjects', params?: HeapProfiler.StartTrackingHeapObjectsParameterType, callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.startTrackingHeapObjects', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.stopTrackingHeapObjects', params?: HeapProfiler.StopTrackingHeapObjectsParameterType, callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.stopTrackingHeapObjects', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.takeHeapSnapshot', params?: HeapProfiler.TakeHeapSnapshotParameterType, callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.takeHeapSnapshot', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.collectGarbage', callback?: (err: Error | null) => void): void; + post( + method: 'HeapProfiler.getObjectByHeapObjectId', + params?: HeapProfiler.GetObjectByHeapObjectIdParameterType, + callback?: (err: Error | null, params: HeapProfiler.GetObjectByHeapObjectIdReturnType) => void + ): void; + post(method: 'HeapProfiler.getObjectByHeapObjectId', callback?: (err: Error | null, params: HeapProfiler.GetObjectByHeapObjectIdReturnType) => void): void; + /** + * Enables console to refer to the node with given id via $x (see Command Line API for more details $x functions). 
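+ * As a broader sketch of the heap-profiler methods declared above, a snapshot can be
+ * streamed to disk roughly like this (the file name is an arbitrary choice):
+ *
+ * ```js
+ * const fs = require('fs');
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * const fd = fs.openSync('app.heapsnapshot', 'w');
+ * // Snapshot data arrives as addHeapSnapshotChunk notifications while the snapshot is taken.
+ * session.on('HeapProfiler.addHeapSnapshotChunk', (m) => {
+ *   fs.writeSync(fd, m.params.chunk);
+ * });
+ * session.post('HeapProfiler.takeHeapSnapshot', null, (err) => {
+ *   session.disconnect();
+ *   fs.closeSync(fd);
+ * });
+ * ```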
+ */ + post(method: 'HeapProfiler.addInspectedHeapObject', params?: HeapProfiler.AddInspectedHeapObjectParameterType, callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.addInspectedHeapObject', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.getHeapObjectId', params?: HeapProfiler.GetHeapObjectIdParameterType, callback?: (err: Error | null, params: HeapProfiler.GetHeapObjectIdReturnType) => void): void; + post(method: 'HeapProfiler.getHeapObjectId', callback?: (err: Error | null, params: HeapProfiler.GetHeapObjectIdReturnType) => void): void; + post(method: 'HeapProfiler.startSampling', params?: HeapProfiler.StartSamplingParameterType, callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.startSampling', callback?: (err: Error | null) => void): void; + post(method: 'HeapProfiler.stopSampling', callback?: (err: Error | null, params: HeapProfiler.StopSamplingReturnType) => void): void; + post(method: 'HeapProfiler.getSamplingProfile', callback?: (err: Error | null, params: HeapProfiler.GetSamplingProfileReturnType) => void): void; + /** + * Gets supported tracing categories. + */ + post(method: 'NodeTracing.getCategories', callback?: (err: Error | null, params: NodeTracing.GetCategoriesReturnType) => void): void; + /** + * Start trace events collection. + */ + post(method: 'NodeTracing.start', params?: NodeTracing.StartParameterType, callback?: (err: Error | null) => void): void; + post(method: 'NodeTracing.start', callback?: (err: Error | null) => void): void; + /** + * Stop trace events collection. Remaining collected events will be sent as a sequence of + * dataCollected events followed by tracingComplete event. + */ + post(method: 'NodeTracing.stop', callback?: (err: Error | null) => void): void; + /** + * Sends protocol message over session with given id. + */ + post(method: 'NodeWorker.sendMessageToWorker', params?: NodeWorker.SendMessageToWorkerParameterType, callback?: (err: Error | null) => void): void; + post(method: 'NodeWorker.sendMessageToWorker', callback?: (err: Error | null) => void): void; + /** + * Instructs the inspector to attach to running workers. Will also attach to new workers + * as they start + */ + post(method: 'NodeWorker.enable', params?: NodeWorker.EnableParameterType, callback?: (err: Error | null) => void): void; + post(method: 'NodeWorker.enable', callback?: (err: Error | null) => void): void; + /** + * Detaches from all running workers and disables attaching to new workers as they are started. + */ + post(method: 'NodeWorker.disable', callback?: (err: Error | null) => void): void; + /** + * Detached from the worker with given sessionId. + */ + post(method: 'NodeWorker.detach', params?: NodeWorker.DetachParameterType, callback?: (err: Error | null) => void): void; + post(method: 'NodeWorker.detach', callback?: (err: Error | null) => void): void; + /** + * Enable the `NodeRuntime.waitingForDisconnect`. + */ + post(method: 'NodeRuntime.notifyWhenWaitingForDisconnect', params?: NodeRuntime.NotifyWhenWaitingForDisconnectParameterType, callback?: (err: Error | null) => void): void; + post(method: 'NodeRuntime.notifyWhenWaitingForDisconnect', callback?: (err: Error | null) => void): void; + // Events + addListener(event: string, listener: (...args: any[]) => void): this; + /** + * Emitted when any notification from the V8 Inspector is received. 
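+ * A small sketch of such a catch-all listener, as opposed to the per-event listeners below
+ * (method names are simply recorded; logging from inside the handler is avoided because,
+ * with Runtime enabled, console calls would themselves emit further notifications):
+ *
+ * ```js
+ * const { Session } = require('inspector');
+ * const session = new Session();
+ * session.connect();
+ * const seen = [];
+ * session.on('inspectorNotification', (message) => {
+ *   // message.method names the protocol event; message.params carries its payload.
+ *   seen.push(message.method);
+ * });
+ * session.post('Runtime.enable'); // triggers e.g. Runtime.executionContextCreated
+ * ```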
+ */ + addListener(event: 'inspectorNotification', listener: (message: InspectorNotification<{}>) => void): this; + /** + * Issued when new execution context is created. + */ + addListener(event: 'Runtime.executionContextCreated', listener: (message: InspectorNotification) => void): this; + /** + * Issued when execution context is destroyed. + */ + addListener(event: 'Runtime.executionContextDestroyed', listener: (message: InspectorNotification) => void): this; + /** + * Issued when all executionContexts were cleared in browser + */ + addListener(event: 'Runtime.executionContextsCleared', listener: () => void): this; + /** + * Issued when exception was thrown and unhandled. + */ + addListener(event: 'Runtime.exceptionThrown', listener: (message: InspectorNotification) => void): this; + /** + * Issued when unhandled exception was revoked. + */ + addListener(event: 'Runtime.exceptionRevoked', listener: (message: InspectorNotification) => void): this; + /** + * Issued when console API was called. + */ + addListener(event: 'Runtime.consoleAPICalled', listener: (message: InspectorNotification) => void): this; + /** + * Issued when object should be inspected (for example, as a result of inspect() command line API call). + */ + addListener(event: 'Runtime.inspectRequested', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine parses script. This event is also fired for all known and uncollected scripts upon enabling debugger. + */ + addListener(event: 'Debugger.scriptParsed', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine fails to parse the script. + */ + addListener(event: 'Debugger.scriptFailedToParse', listener: (message: InspectorNotification) => void): this; + /** + * Fired when breakpoint is resolved to an actual script and location. + */ + addListener(event: 'Debugger.breakpointResolved', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine stopped on breakpoint or exception or any other stop criteria. + */ + addListener(event: 'Debugger.paused', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine resumed execution. + */ + addListener(event: 'Debugger.resumed', listener: () => void): this; + /** + * Issued when new console message is added. + */ + addListener(event: 'Console.messageAdded', listener: (message: InspectorNotification) => void): this; + /** + * Sent when new profile recording is started using console.profile() call. + */ + addListener(event: 'Profiler.consoleProfileStarted', listener: (message: InspectorNotification) => void): this; + addListener(event: 'Profiler.consoleProfileFinished', listener: (message: InspectorNotification) => void): this; + addListener(event: 'HeapProfiler.addHeapSnapshotChunk', listener: (message: InspectorNotification) => void): this; + addListener(event: 'HeapProfiler.resetProfiles', listener: () => void): this; + addListener(event: 'HeapProfiler.reportHeapSnapshotProgress', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend regularly sends a current value for last seen object id and corresponding timestamp. If the were changes in the heap since last event then one or more heapStatsUpdate events will be sent before a new lastSeenObjectId event. 
+ */ + addListener(event: 'HeapProfiler.lastSeenObjectId', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend may send update for one or more fragments + */ + addListener(event: 'HeapProfiler.heapStatsUpdate', listener: (message: InspectorNotification) => void): this; + /** + * Contains an bucket of collected trace events. + */ + addListener(event: 'NodeTracing.dataCollected', listener: (message: InspectorNotification) => void): this; + /** + * Signals that tracing is stopped and there is no trace buffers pending flush, all data were + * delivered via dataCollected events. + */ + addListener(event: 'NodeTracing.tracingComplete', listener: () => void): this; + /** + * Issued when attached to a worker. + */ + addListener(event: 'NodeWorker.attachedToWorker', listener: (message: InspectorNotification) => void): this; + /** + * Issued when detached from the worker. + */ + addListener(event: 'NodeWorker.detachedFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * Notifies about a new protocol message received from the session + * (session ID is provided in attachedToWorker notification). + */ + addListener(event: 'NodeWorker.receivedMessageFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * This event is fired instead of `Runtime.executionContextDestroyed` when + * enabled. + * It is fired when the Node process finished all code execution and is + * waiting for all frontends to disconnect. + */ + addListener(event: 'NodeRuntime.waitingForDisconnect', listener: () => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'inspectorNotification', message: InspectorNotification<{}>): boolean; + emit(event: 'Runtime.executionContextCreated', message: InspectorNotification): boolean; + emit(event: 'Runtime.executionContextDestroyed', message: InspectorNotification): boolean; + emit(event: 'Runtime.executionContextsCleared'): boolean; + emit(event: 'Runtime.exceptionThrown', message: InspectorNotification): boolean; + emit(event: 'Runtime.exceptionRevoked', message: InspectorNotification): boolean; + emit(event: 'Runtime.consoleAPICalled', message: InspectorNotification): boolean; + emit(event: 'Runtime.inspectRequested', message: InspectorNotification): boolean; + emit(event: 'Debugger.scriptParsed', message: InspectorNotification): boolean; + emit(event: 'Debugger.scriptFailedToParse', message: InspectorNotification): boolean; + emit(event: 'Debugger.breakpointResolved', message: InspectorNotification): boolean; + emit(event: 'Debugger.paused', message: InspectorNotification): boolean; + emit(event: 'Debugger.resumed'): boolean; + emit(event: 'Console.messageAdded', message: InspectorNotification): boolean; + emit(event: 'Profiler.consoleProfileStarted', message: InspectorNotification): boolean; + emit(event: 'Profiler.consoleProfileFinished', message: InspectorNotification): boolean; + emit(event: 'HeapProfiler.addHeapSnapshotChunk', message: InspectorNotification): boolean; + emit(event: 'HeapProfiler.resetProfiles'): boolean; + emit(event: 'HeapProfiler.reportHeapSnapshotProgress', message: InspectorNotification): boolean; + emit(event: 'HeapProfiler.lastSeenObjectId', message: InspectorNotification): boolean; + emit(event: 'HeapProfiler.heapStatsUpdate', message: InspectorNotification): boolean; + emit(event: 'NodeTracing.dataCollected', message: InspectorNotification): boolean; + emit(event: 'NodeTracing.tracingComplete'): boolean; + 
emit(event: 'NodeWorker.attachedToWorker', message: InspectorNotification): boolean; + emit(event: 'NodeWorker.detachedFromWorker', message: InspectorNotification): boolean; + emit(event: 'NodeWorker.receivedMessageFromWorker', message: InspectorNotification): boolean; + emit(event: 'NodeRuntime.waitingForDisconnect'): boolean; + on(event: string, listener: (...args: any[]) => void): this; + /** + * Emitted when any notification from the V8 Inspector is received. + */ + on(event: 'inspectorNotification', listener: (message: InspectorNotification<{}>) => void): this; + /** + * Issued when new execution context is created. + */ + on(event: 'Runtime.executionContextCreated', listener: (message: InspectorNotification) => void): this; + /** + * Issued when execution context is destroyed. + */ + on(event: 'Runtime.executionContextDestroyed', listener: (message: InspectorNotification) => void): this; + /** + * Issued when all executionContexts were cleared in browser + */ + on(event: 'Runtime.executionContextsCleared', listener: () => void): this; + /** + * Issued when exception was thrown and unhandled. + */ + on(event: 'Runtime.exceptionThrown', listener: (message: InspectorNotification) => void): this; + /** + * Issued when unhandled exception was revoked. + */ + on(event: 'Runtime.exceptionRevoked', listener: (message: InspectorNotification) => void): this; + /** + * Issued when console API was called. + */ + on(event: 'Runtime.consoleAPICalled', listener: (message: InspectorNotification) => void): this; + /** + * Issued when object should be inspected (for example, as a result of inspect() command line API call). + */ + on(event: 'Runtime.inspectRequested', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine parses script. This event is also fired for all known and uncollected scripts upon enabling debugger. + */ + on(event: 'Debugger.scriptParsed', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine fails to parse the script. + */ + on(event: 'Debugger.scriptFailedToParse', listener: (message: InspectorNotification) => void): this; + /** + * Fired when breakpoint is resolved to an actual script and location. + */ + on(event: 'Debugger.breakpointResolved', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine stopped on breakpoint or exception or any other stop criteria. + */ + on(event: 'Debugger.paused', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine resumed execution. + */ + on(event: 'Debugger.resumed', listener: () => void): this; + /** + * Issued when new console message is added. + */ + on(event: 'Console.messageAdded', listener: (message: InspectorNotification) => void): this; + /** + * Sent when new profile recording is started using console.profile() call. 
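+ *
+ * A small sketch of collecting a profile started from user code (assumes a
+ * connected session; `doSomeWork()` is a stand-in for the code being profiled):
+ *
+ * ```js
+ * session.post('Profiler.enable', () => {
+ *   session.once('Profiler.consoleProfileFinished', (m) => {
+ *     // m.params.profile holds the finished CPU profile.
+ *     console.log('profile', m.params.title, 'finished');
+ *   });
+ *   console.profile('my-profile');
+ *   doSomeWork();
+ *   console.profileEnd('my-profile');
+ * });
+ * ```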
+ */ + on(event: 'Profiler.consoleProfileStarted', listener: (message: InspectorNotification) => void): this; + on(event: 'Profiler.consoleProfileFinished', listener: (message: InspectorNotification) => void): this; + on(event: 'HeapProfiler.addHeapSnapshotChunk', listener: (message: InspectorNotification) => void): this; + on(event: 'HeapProfiler.resetProfiles', listener: () => void): this; + on(event: 'HeapProfiler.reportHeapSnapshotProgress', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend regularly sends a current value for last seen object id and corresponding timestamp. If the were changes in the heap since last event then one or more heapStatsUpdate events will be sent before a new lastSeenObjectId event. + */ + on(event: 'HeapProfiler.lastSeenObjectId', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend may send update for one or more fragments + */ + on(event: 'HeapProfiler.heapStatsUpdate', listener: (message: InspectorNotification) => void): this; + /** + * Contains an bucket of collected trace events. + */ + on(event: 'NodeTracing.dataCollected', listener: (message: InspectorNotification) => void): this; + /** + * Signals that tracing is stopped and there is no trace buffers pending flush, all data were + * delivered via dataCollected events. + */ + on(event: 'NodeTracing.tracingComplete', listener: () => void): this; + /** + * Issued when attached to a worker. + */ + on(event: 'NodeWorker.attachedToWorker', listener: (message: InspectorNotification) => void): this; + /** + * Issued when detached from the worker. + */ + on(event: 'NodeWorker.detachedFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * Notifies about a new protocol message received from the session + * (session ID is provided in attachedToWorker notification). + */ + on(event: 'NodeWorker.receivedMessageFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * This event is fired instead of `Runtime.executionContextDestroyed` when + * enabled. + * It is fired when the Node process finished all code execution and is + * waiting for all frontends to disconnect. + */ + on(event: 'NodeRuntime.waitingForDisconnect', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + /** + * Emitted when any notification from the V8 Inspector is received. + */ + once(event: 'inspectorNotification', listener: (message: InspectorNotification<{}>) => void): this; + /** + * Issued when new execution context is created. + */ + once(event: 'Runtime.executionContextCreated', listener: (message: InspectorNotification) => void): this; + /** + * Issued when execution context is destroyed. + */ + once(event: 'Runtime.executionContextDestroyed', listener: (message: InspectorNotification) => void): this; + /** + * Issued when all executionContexts were cleared in browser + */ + once(event: 'Runtime.executionContextsCleared', listener: () => void): this; + /** + * Issued when exception was thrown and unhandled. + */ + once(event: 'Runtime.exceptionThrown', listener: (message: InspectorNotification) => void): this; + /** + * Issued when unhandled exception was revoked. + */ + once(event: 'Runtime.exceptionRevoked', listener: (message: InspectorNotification) => void): this; + /** + * Issued when console API was called. 
+ */ + once(event: 'Runtime.consoleAPICalled', listener: (message: InspectorNotification) => void): this; + /** + * Issued when object should be inspected (for example, as a result of inspect() command line API call). + */ + once(event: 'Runtime.inspectRequested', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine parses script. This event is also fired for all known and uncollected scripts upon enabling debugger. + */ + once(event: 'Debugger.scriptParsed', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine fails to parse the script. + */ + once(event: 'Debugger.scriptFailedToParse', listener: (message: InspectorNotification) => void): this; + /** + * Fired when breakpoint is resolved to an actual script and location. + */ + once(event: 'Debugger.breakpointResolved', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine stopped on breakpoint or exception or any other stop criteria. + */ + once(event: 'Debugger.paused', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine resumed execution. + */ + once(event: 'Debugger.resumed', listener: () => void): this; + /** + * Issued when new console message is added. + */ + once(event: 'Console.messageAdded', listener: (message: InspectorNotification) => void): this; + /** + * Sent when new profile recording is started using console.profile() call. + */ + once(event: 'Profiler.consoleProfileStarted', listener: (message: InspectorNotification) => void): this; + once(event: 'Profiler.consoleProfileFinished', listener: (message: InspectorNotification) => void): this; + once(event: 'HeapProfiler.addHeapSnapshotChunk', listener: (message: InspectorNotification) => void): this; + once(event: 'HeapProfiler.resetProfiles', listener: () => void): this; + once(event: 'HeapProfiler.reportHeapSnapshotProgress', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend regularly sends a current value for last seen object id and corresponding timestamp. If the were changes in the heap since last event then one or more heapStatsUpdate events will be sent before a new lastSeenObjectId event. + */ + once(event: 'HeapProfiler.lastSeenObjectId', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend may send update for one or more fragments + */ + once(event: 'HeapProfiler.heapStatsUpdate', listener: (message: InspectorNotification) => void): this; + /** + * Contains an bucket of collected trace events. + */ + once(event: 'NodeTracing.dataCollected', listener: (message: InspectorNotification) => void): this; + /** + * Signals that tracing is stopped and there is no trace buffers pending flush, all data were + * delivered via dataCollected events. + */ + once(event: 'NodeTracing.tracingComplete', listener: () => void): this; + /** + * Issued when attached to a worker. + */ + once(event: 'NodeWorker.attachedToWorker', listener: (message: InspectorNotification) => void): this; + /** + * Issued when detached from the worker. + */ + once(event: 'NodeWorker.detachedFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * Notifies about a new protocol message received from the session + * (session ID is provided in attachedToWorker notification). 
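+ *
+ * A sketch of forwarding a command to an attached worker (parameter shapes
+ * follow the NodeWorker namespace; treat the exact payloads as illustrative):
+ *
+ * ```js
+ * session.post('NodeWorker.enable', { waitForDebuggerOnStart: false });
+ * session.on('NodeWorker.attachedToWorker', (m) => {
+ *   session.post('NodeWorker.sendMessageToWorker', {
+ *     sessionId: m.params.sessionId,
+ *     message: JSON.stringify({ id: 1, method: 'Runtime.enable' }),
+ *   });
+ * });
+ * session.on('NodeWorker.receivedMessageFromWorker', (m) => {
+ *   console.log('worker replied:', m.params.sessionId, m.params.message);
+ * });
+ * ```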
+ */ + once(event: 'NodeWorker.receivedMessageFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * This event is fired instead of `Runtime.executionContextDestroyed` when + * enabled. + * It is fired when the Node process finished all code execution and is + * waiting for all frontends to disconnect. + */ + once(event: 'NodeRuntime.waitingForDisconnect', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + /** + * Emitted when any notification from the V8 Inspector is received. + */ + prependListener(event: 'inspectorNotification', listener: (message: InspectorNotification<{}>) => void): this; + /** + * Issued when new execution context is created. + */ + prependListener(event: 'Runtime.executionContextCreated', listener: (message: InspectorNotification) => void): this; + /** + * Issued when execution context is destroyed. + */ + prependListener(event: 'Runtime.executionContextDestroyed', listener: (message: InspectorNotification) => void): this; + /** + * Issued when all executionContexts were cleared in browser + */ + prependListener(event: 'Runtime.executionContextsCleared', listener: () => void): this; + /** + * Issued when exception was thrown and unhandled. + */ + prependListener(event: 'Runtime.exceptionThrown', listener: (message: InspectorNotification) => void): this; + /** + * Issued when unhandled exception was revoked. + */ + prependListener(event: 'Runtime.exceptionRevoked', listener: (message: InspectorNotification) => void): this; + /** + * Issued when console API was called. + */ + prependListener(event: 'Runtime.consoleAPICalled', listener: (message: InspectorNotification) => void): this; + /** + * Issued when object should be inspected (for example, as a result of inspect() command line API call). + */ + prependListener(event: 'Runtime.inspectRequested', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine parses script. This event is also fired for all known and uncollected scripts upon enabling debugger. + */ + prependListener(event: 'Debugger.scriptParsed', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine fails to parse the script. + */ + prependListener(event: 'Debugger.scriptFailedToParse', listener: (message: InspectorNotification) => void): this; + /** + * Fired when breakpoint is resolved to an actual script and location. + */ + prependListener(event: 'Debugger.breakpointResolved', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine stopped on breakpoint or exception or any other stop criteria. + */ + prependListener(event: 'Debugger.paused', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine resumed execution. + */ + prependListener(event: 'Debugger.resumed', listener: () => void): this; + /** + * Issued when new console message is added. + */ + prependListener(event: 'Console.messageAdded', listener: (message: InspectorNotification) => void): this; + /** + * Sent when new profile recording is started using console.profile() call. 
+ */ + prependListener(event: 'Profiler.consoleProfileStarted', listener: (message: InspectorNotification) => void): this; + prependListener(event: 'Profiler.consoleProfileFinished', listener: (message: InspectorNotification) => void): this; + prependListener(event: 'HeapProfiler.addHeapSnapshotChunk', listener: (message: InspectorNotification) => void): this; + prependListener(event: 'HeapProfiler.resetProfiles', listener: () => void): this; + prependListener(event: 'HeapProfiler.reportHeapSnapshotProgress', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend regularly sends a current value for last seen object id and corresponding timestamp. If the were changes in the heap since last event then one or more heapStatsUpdate events will be sent before a new lastSeenObjectId event. + */ + prependListener(event: 'HeapProfiler.lastSeenObjectId', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend may send update for one or more fragments + */ + prependListener(event: 'HeapProfiler.heapStatsUpdate', listener: (message: InspectorNotification) => void): this; + /** + * Contains an bucket of collected trace events. + */ + prependListener(event: 'NodeTracing.dataCollected', listener: (message: InspectorNotification) => void): this; + /** + * Signals that tracing is stopped and there is no trace buffers pending flush, all data were + * delivered via dataCollected events. + */ + prependListener(event: 'NodeTracing.tracingComplete', listener: () => void): this; + /** + * Issued when attached to a worker. + */ + prependListener(event: 'NodeWorker.attachedToWorker', listener: (message: InspectorNotification) => void): this; + /** + * Issued when detached from the worker. + */ + prependListener(event: 'NodeWorker.detachedFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * Notifies about a new protocol message received from the session + * (session ID is provided in attachedToWorker notification). + */ + prependListener(event: 'NodeWorker.receivedMessageFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * This event is fired instead of `Runtime.executionContextDestroyed` when + * enabled. + * It is fired when the Node process finished all code execution and is + * waiting for all frontends to disconnect. + */ + prependListener(event: 'NodeRuntime.waitingForDisconnect', listener: () => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + /** + * Emitted when any notification from the V8 Inspector is received. + */ + prependOnceListener(event: 'inspectorNotification', listener: (message: InspectorNotification<{}>) => void): this; + /** + * Issued when new execution context is created. + */ + prependOnceListener(event: 'Runtime.executionContextCreated', listener: (message: InspectorNotification) => void): this; + /** + * Issued when execution context is destroyed. + */ + prependOnceListener(event: 'Runtime.executionContextDestroyed', listener: (message: InspectorNotification) => void): this; + /** + * Issued when all executionContexts were cleared in browser + */ + prependOnceListener(event: 'Runtime.executionContextsCleared', listener: () => void): this; + /** + * Issued when exception was thrown and unhandled. 
+ */ + prependOnceListener(event: 'Runtime.exceptionThrown', listener: (message: InspectorNotification) => void): this; + /** + * Issued when unhandled exception was revoked. + */ + prependOnceListener(event: 'Runtime.exceptionRevoked', listener: (message: InspectorNotification) => void): this; + /** + * Issued when console API was called. + */ + prependOnceListener(event: 'Runtime.consoleAPICalled', listener: (message: InspectorNotification) => void): this; + /** + * Issued when object should be inspected (for example, as a result of inspect() command line API call). + */ + prependOnceListener(event: 'Runtime.inspectRequested', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine parses script. This event is also fired for all known and uncollected scripts upon enabling debugger. + */ + prependOnceListener(event: 'Debugger.scriptParsed', listener: (message: InspectorNotification) => void): this; + /** + * Fired when virtual machine fails to parse the script. + */ + prependOnceListener(event: 'Debugger.scriptFailedToParse', listener: (message: InspectorNotification) => void): this; + /** + * Fired when breakpoint is resolved to an actual script and location. + */ + prependOnceListener(event: 'Debugger.breakpointResolved', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine stopped on breakpoint or exception or any other stop criteria. + */ + prependOnceListener(event: 'Debugger.paused', listener: (message: InspectorNotification) => void): this; + /** + * Fired when the virtual machine resumed execution. + */ + prependOnceListener(event: 'Debugger.resumed', listener: () => void): this; + /** + * Issued when new console message is added. + */ + prependOnceListener(event: 'Console.messageAdded', listener: (message: InspectorNotification) => void): this; + /** + * Sent when new profile recording is started using console.profile() call. + */ + prependOnceListener(event: 'Profiler.consoleProfileStarted', listener: (message: InspectorNotification) => void): this; + prependOnceListener(event: 'Profiler.consoleProfileFinished', listener: (message: InspectorNotification) => void): this; + prependOnceListener(event: 'HeapProfiler.addHeapSnapshotChunk', listener: (message: InspectorNotification) => void): this; + prependOnceListener(event: 'HeapProfiler.resetProfiles', listener: () => void): this; + prependOnceListener(event: 'HeapProfiler.reportHeapSnapshotProgress', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend regularly sends a current value for last seen object id and corresponding timestamp. If the were changes in the heap since last event then one or more heapStatsUpdate events will be sent before a new lastSeenObjectId event. + */ + prependOnceListener(event: 'HeapProfiler.lastSeenObjectId', listener: (message: InspectorNotification) => void): this; + /** + * If heap objects tracking has been started then backend may send update for one or more fragments + */ + prependOnceListener(event: 'HeapProfiler.heapStatsUpdate', listener: (message: InspectorNotification) => void): this; + /** + * Contains an bucket of collected trace events. + */ + prependOnceListener(event: 'NodeTracing.dataCollected', listener: (message: InspectorNotification) => void): this; + /** + * Signals that tracing is stopped and there is no trace buffers pending flush, all data were + * delivered via dataCollected events. 
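+ *
+ * A rough end-to-end sketch (the trace categories and timing are only an
+ * illustration; see the NodeTracing namespace for the exact payload types):
+ *
+ * ```js
+ * const buckets = [];
+ * session.on('NodeTracing.dataCollected', (m) => buckets.push(m.params.value));
+ * session.once('NodeTracing.tracingComplete', () => {
+ *   console.log('collected', buckets.length, 'buckets of trace events');
+ * });
+ * session.post('NodeTracing.start', {
+ *   traceConfig: { includedCategories: ['node', 'v8'] },
+ * }, (err) => {
+ *   if (err) throw err;
+ *   setTimeout(() => session.post('NodeTracing.stop'), 1000);
+ * });
+ * ```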
+ */ + prependOnceListener(event: 'NodeTracing.tracingComplete', listener: () => void): this; + /** + * Issued when attached to a worker. + */ + prependOnceListener(event: 'NodeWorker.attachedToWorker', listener: (message: InspectorNotification) => void): this; + /** + * Issued when detached from the worker. + */ + prependOnceListener(event: 'NodeWorker.detachedFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * Notifies about a new protocol message received from the session + * (session ID is provided in attachedToWorker notification). + */ + prependOnceListener(event: 'NodeWorker.receivedMessageFromWorker', listener: (message: InspectorNotification) => void): this; + /** + * This event is fired instead of `Runtime.executionContextDestroyed` when + * enabled. + * It is fired when the Node process finished all code execution and is + * waiting for all frontends to disconnect. + */ + prependOnceListener(event: 'NodeRuntime.waitingForDisconnect', listener: () => void): this; + } + /** + * Activate inspector on host and port. Equivalent to `node --inspect=[[host:]port]`, but can be done programmatically after node has + * started. + * + * If wait is `true`, will block until a client has connected to the inspect port + * and flow control has been passed to the debugger client. + * + * See the `security warning` regarding the `host`parameter usage. + * @param [port='what was specified on the CLI'] Port to listen on for inspector connections. Optional. + * @param [host='what was specified on the CLI'] Host to listen on for inspector connections. Optional. + * @param [wait=false] Block until a client has connected. Optional. + */ + function open(port?: number, host?: string, wait?: boolean): void; + /** + * Deactivate the inspector. Blocks until there are no active connections. + */ + function close(): void; + /** + * Return the URL of the active inspector, or `undefined` if there is none. + * + * ```console + * $ node --inspect -p 'inspector.url()' + * Debugger listening on ws://127.0.0.1:9229/166e272e-7a30-4d09-97ce-f1c012b43c34 + * For help see https://nodejs.org/en/docs/inspector + * ws://127.0.0.1:9229/166e272e-7a30-4d09-97ce-f1c012b43c34 + * + * $ node --inspect=localhost:3000 -p 'inspector.url()' + * Debugger listening on ws://localhost:3000/51cf8d0e-3c36-4c59-8efd-54519839e56a + * For help see https://nodejs.org/en/docs/inspector + * ws://localhost:3000/51cf8d0e-3c36-4c59-8efd-54519839e56a + * + * $ node -p 'inspector.url()' + * undefined + * ``` + */ + function url(): string | undefined; + /** + * Blocks until a client (existing or connected later) has sent`Runtime.runIfWaitingForDebugger` command. + * + * An exception will be thrown if there is no active inspector. + * @since v12.7.0 + */ + function waitForDebugger(): void; +} +declare module 'node:inspector' { + import EventEmitter = require('inspector'); + export = EventEmitter; +} diff --git a/node_modules/@types/node/module.d.ts b/node_modules/@types/node/module.d.ts new file mode 100644 index 000000000..d83aec94a --- /dev/null +++ b/node_modules/@types/node/module.d.ts @@ -0,0 +1,114 @@ +/** + * @since v0.3.7 + */ +declare module 'module' { + import { URL } from 'node:url'; + namespace Module { + /** + * The `module.syncBuiltinESMExports()` method updates all the live bindings for + * builtin `ES Modules` to match the properties of the `CommonJS` exports. It + * does not add or remove exported names from the `ES Modules`. 
+ * + * ```js + * const fs = require('fs'); + * const assert = require('assert'); + * const { syncBuiltinESMExports } = require('module'); + * + * fs.readFile = newAPI; + * + * delete fs.readFileSync; + * + * function newAPI() { + * // ... + * } + * + * fs.newAPI = newAPI; + * + * syncBuiltinESMExports(); + * + * import('fs').then((esmFS) => { + * // It syncs the existing readFile property with the new value + * assert.strictEqual(esmFS.readFile, newAPI); + * // readFileSync has been deleted from the required fs + * assert.strictEqual('readFileSync' in fs, false); + * // syncBuiltinESMExports() does not remove readFileSync from esmFS + * assert.strictEqual('readFileSync' in esmFS, true); + * // syncBuiltinESMExports() does not add names + * assert.strictEqual(esmFS.newAPI, undefined); + * }); + * ``` + * @since v12.12.0 + */ + function syncBuiltinESMExports(): void; + /** + * `path` is the resolved path for the file for which a corresponding source map + * should be fetched. + * @since v13.7.0, v12.17.0 + */ + function findSourceMap(path: string, error?: Error): SourceMap; + interface SourceMapPayload { + file: string; + version: number; + sources: string[]; + sourcesContent: string[]; + names: string[]; + mappings: string; + sourceRoot: string; + } + interface SourceMapping { + generatedLine: number; + generatedColumn: number; + originalSource: string; + originalLine: number; + originalColumn: number; + } + /** + * @since v13.7.0, v12.17.0 + */ + class SourceMap { + /** + * Getter for the payload used to construct the `SourceMap` instance. + */ + readonly payload: SourceMapPayload; + constructor(payload: SourceMapPayload); + /** + * Given a line number and column number in the generated source file, returns + * an object representing the position in the original file. The object returned + * consists of the following keys: + */ + findEntry(line: number, column: number): SourceMapping; + } + } + interface Module extends NodeModule {} + class Module { + static runMain(): void; + static wrap(code: string): string; + static createRequire(path: string | URL): NodeRequire; + static builtinModules: string[]; + static Module: typeof Module; + constructor(id: string, parent?: Module); + } + global { + interface ImportMeta { + url: string; + /** + * @experimental + * This feature is only available with the `--experimental-import-meta-resolve` + * command flag enabled. + * + * Provides a module-relative resolution function scoped to each module, returning + * the URL string. + * + * @param specified The module specifier to resolve relative to `parent`. + * @param parent The absolute parent module URL to resolve from. If none + * is specified, the value of `import.meta.url` is used as the default. + */ + resolve?(specified: string, parent?: string | URL): Promise; + } + } + export = Module; +} +declare module 'node:module' { + import module = require('module'); + export = module; +} diff --git a/node_modules/@types/node/net.d.ts b/node_modules/@types/node/net.d.ts new file mode 100644 index 000000000..331cd559e --- /dev/null +++ b/node_modules/@types/node/net.d.ts @@ -0,0 +1,783 @@ +/** + * > Stability: 2 - Stable + * + * The `net` module provides an asynchronous network API for creating stream-based + * TCP or `IPC` servers ({@link createServer}) and clients + * ({@link createConnection}). 
+ * + * It can be accessed using: + * + * ```js + * const net = require('net'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/net.js) + */ +declare module 'net' { + import * as stream from 'node:stream'; + import { Abortable, EventEmitter } from 'node:events'; + import * as dns from 'node:dns'; + type LookupFunction = (hostname: string, options: dns.LookupOneOptions, callback: (err: NodeJS.ErrnoException | null, address: string, family: number) => void) => void; + interface AddressInfo { + address: string; + family: string; + port: number; + } + interface SocketConstructorOpts { + fd?: number | undefined; + allowHalfOpen?: boolean | undefined; + readable?: boolean | undefined; + writable?: boolean | undefined; + } + interface OnReadOpts { + buffer: Uint8Array | (() => Uint8Array); + /** + * This function is called for every chunk of incoming data. + * Two arguments are passed to it: the number of bytes written to buffer and a reference to buffer. + * Return false from this function to implicitly pause() the socket. + */ + callback(bytesWritten: number, buf: Uint8Array): boolean; + } + interface ConnectOpts { + /** + * If specified, incoming data is stored in a single buffer and passed to the supplied callback when data arrives on the socket. + * Note: this will cause the streaming functionality to not provide any data, however events like 'error', 'end', and 'close' will + * still be emitted as normal and methods like pause() and resume() will also behave as expected. + */ + onread?: OnReadOpts | undefined; + } + interface TcpSocketConnectOpts extends ConnectOpts { + port: number; + host?: string | undefined; + localAddress?: string | undefined; + localPort?: number | undefined; + hints?: number | undefined; + family?: number | undefined; + lookup?: LookupFunction | undefined; + } + interface IpcSocketConnectOpts extends ConnectOpts { + path: string; + } + type SocketConnectOpts = TcpSocketConnectOpts | IpcSocketConnectOpts; + /** + * This class is an abstraction of a TCP socket or a streaming `IPC` endpoint + * (uses named pipes on Windows, and Unix domain sockets otherwise). It is also + * an `EventEmitter`. + * + * A `net.Socket` can be created by the user and used directly to interact with + * a server. For example, it is returned by {@link createConnection}, + * so the user can use it to talk to the server. + * + * It can also be created by Node.js and passed to the user when a connection + * is received. For example, it is passed to the listeners of a `'connection'` event emitted on a {@link Server}, so the user can use + * it to interact with the client. + * @since v0.3.4 + */ + class Socket extends stream.Duplex { + constructor(options?: SocketConstructorOpts); + /** + * Sends data on the socket. The second parameter specifies the encoding in the + * case of a string. It defaults to UTF8 encoding. + * + * Returns `true` if the entire data was flushed successfully to the kernel + * buffer. Returns `false` if all or part of the data was queued in user memory.`'drain'` will be emitted when the buffer is again free. + * + * The optional `callback` parameter will be executed when the data is finally + * written out, which may not be immediately. + * + * See `Writable` stream `write()` method for more + * information. + * @since v0.1.90 + * @param [encoding='utf8'] Only used when data is `string`. 
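+ *
+ * For example (a sketch, assuming `socket` is an established `net.Socket`):
+ *
+ * ```js
+ * const flushed = socket.write('hello\r\n', 'utf8', (err) => {
+ *   if (err) console.error('write failed:', err);
+ * });
+ * if (!flushed) {
+ *   // The kernel buffer is full; wait for 'drain' before writing more.
+ *   socket.once('drain', () => {
+ *     socket.write('more data\r\n');
+ *   });
+ * }
+ * ```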
+ */ + write(buffer: Uint8Array | string, cb?: (err?: Error) => void): boolean; + write(str: Uint8Array | string, encoding?: BufferEncoding, cb?: (err?: Error) => void): boolean; + /** + * Initiate a connection on a given socket. + * + * Possible signatures: + * + * * `socket.connect(options[, connectListener])` + * * `socket.connect(path[, connectListener])` for `IPC` connections. + * * `socket.connect(port[, host][, connectListener])` for TCP connections. + * * Returns: `net.Socket` The socket itself. + * + * This function is asynchronous. When the connection is established, the `'connect'` event will be emitted. If there is a problem connecting, + * instead of a `'connect'` event, an `'error'` event will be emitted with + * the error passed to the `'error'` listener. + * The last parameter `connectListener`, if supplied, will be added as a listener + * for the `'connect'` event **once**. + * + * This function should only be used for reconnecting a socket after`'close'` has been emitted or otherwise it may lead to undefined + * behavior. + */ + connect(options: SocketConnectOpts, connectionListener?: () => void): this; + connect(port: number, host: string, connectionListener?: () => void): this; + connect(port: number, connectionListener?: () => void): this; + connect(path: string, connectionListener?: () => void): this; + /** + * Set the encoding for the socket as a `Readable Stream`. See `readable.setEncoding()` for more information. + * @since v0.1.90 + * @return The socket itself. + */ + setEncoding(encoding?: BufferEncoding): this; + /** + * Pauses the reading of data. That is, `'data'` events will not be emitted. + * Useful to throttle back an upload. + * @return The socket itself. + */ + pause(): this; + /** + * Resumes reading after a call to `socket.pause()`. + * @return The socket itself. + */ + resume(): this; + /** + * Sets the socket to timeout after `timeout` milliseconds of inactivity on + * the socket. By default `net.Socket` do not have a timeout. + * + * When an idle timeout is triggered the socket will receive a `'timeout'` event but the connection will not be severed. The user must manually call `socket.end()` or `socket.destroy()` to + * end the connection. + * + * ```js + * socket.setTimeout(3000); + * socket.on('timeout', () => { + * console.log('socket timeout'); + * socket.end(); + * }); + * ``` + * + * If `timeout` is 0, then the existing idle timeout is disabled. + * + * The optional `callback` parameter will be added as a one-time listener for the `'timeout'` event. + * @since v0.1.90 + * @return The socket itself. + */ + setTimeout(timeout: number, callback?: () => void): this; + /** + * Enable/disable the use of Nagle's algorithm. + * + * When a TCP connection is created, it will have Nagle's algorithm enabled. + * + * Nagle's algorithm delays data before it is sent via the network. It attempts + * to optimize throughput at the expense of latency. + * + * Passing `true` for `noDelay` or not passing an argument will disable Nagle's + * algorithm for the socket. Passing `false` for `noDelay` will enable Nagle's + * algorithm. + * @since v0.1.90 + * @param [noDelay=true] + * @return The socket itself. + */ + setNoDelay(noDelay?: boolean): this; + /** + * Enable/disable keep-alive functionality, and optionally set the initial + * delay before the first keepalive probe is sent on an idle socket. + * + * Set `initialDelay` (in milliseconds) to set the delay between the last + * data packet received and the first keepalive probe. 
Setting `0` for`initialDelay` will leave the value unchanged from the default + * (or previous) setting. + * + * Enabling the keep-alive functionality will set the following socket options: + * + * * `SO_KEEPALIVE=1` + * * `TCP_KEEPIDLE=initialDelay` + * * `TCP_KEEPCNT=10` + * * `TCP_KEEPINTVL=1` + * @since v0.1.92 + * @param [enable=false] + * @param [initialDelay=0] + * @return The socket itself. + */ + setKeepAlive(enable?: boolean, initialDelay?: number): this; + /** + * Returns the bound `address`, the address `family` name and `port` of the + * socket as reported by the operating system:`{ port: 12346, family: 'IPv4', address: '127.0.0.1' }` + * @since v0.1.90 + */ + address(): AddressInfo | {}; + /** + * Calling `unref()` on a socket will allow the program to exit if this is the only + * active socket in the event system. If the socket is already `unref`ed calling`unref()` again will have no effect. + * @since v0.9.1 + * @return The socket itself. + */ + unref(): this; + /** + * Opposite of `unref()`, calling `ref()` on a previously `unref`ed socket will_not_ let the program exit if it's the only socket left (the default behavior). + * If the socket is `ref`ed calling `ref` again will have no effect. + * @since v0.9.1 + * @return The socket itself. + */ + ref(): this; + /** + * This property shows the number of characters buffered for writing. The buffer + * may contain strings whose length after encoding is not yet known. So this number + * is only an approximation of the number of bytes in the buffer. + * + * `net.Socket` has the property that `socket.write()` always works. This is to + * help users get up and running quickly. The computer cannot always keep up + * with the amount of data that is written to a socket. The network connection + * simply might be too slow. Node.js will internally queue up the data written to a + * socket and send it out over the wire when it is possible. + * + * The consequence of this internal buffering is that memory may grow. + * Users who experience large or growing `bufferSize` should attempt to + * "throttle" the data flows in their program with `socket.pause()` and `socket.resume()`. + * @since v0.3.8 + * @deprecated Since v14.6.0 - Use `writableLength` instead. + */ + readonly bufferSize: number; + /** + * The amount of received bytes. + * @since v0.5.3 + */ + readonly bytesRead: number; + /** + * The amount of bytes sent. + * @since v0.5.3 + */ + readonly bytesWritten: number; + /** + * If `true`,`socket.connect(options[, connectListener])` was + * called and has not yet finished. It will stay `true` until the socket becomes + * connected, then it is set to `false` and the `'connect'` event is emitted. Note + * that the `socket.connect(options[, connectListener])` callback is a listener for the `'connect'` event. + * @since v6.1.0 + */ + readonly connecting: boolean; + /** + * See `writable.destroyed` for further details. + */ + readonly destroyed: boolean; + /** + * The string representation of the local IP address the remote client is + * connecting on. For example, in a server listening on `'0.0.0.0'`, if a client + * connects on `'192.168.1.1'`, the value of `socket.localAddress` would be`'192.168.1.1'`. + * @since v0.9.6 + */ + readonly localAddress: string; + /** + * The numeric representation of the local port. For example, `80` or `21`. + * @since v0.9.6 + */ + readonly localPort: number; + /** + * The string representation of the remote IP address. For example,`'74.125.127.100'` or `'2001:4860:a005::68'`. 
Value may be `undefined` if + * the socket is destroyed (for example, if the client disconnected). + * @since v0.5.10 + */ + readonly remoteAddress?: string | undefined; + /** + * The string representation of the remote IP family. `'IPv4'` or `'IPv6'`. + * @since v0.11.14 + */ + readonly remoteFamily?: string | undefined; + /** + * The numeric representation of the remote port. For example, `80` or `21`. + * @since v0.5.10 + */ + readonly remotePort?: number | undefined; + /** + * Half-closes the socket. i.e., it sends a FIN packet. It is possible the + * server will still send some data. + * + * See `writable.end()` for further details. + * @since v0.1.90 + * @param [encoding='utf8'] Only used when data is `string`. + * @param callback Optional callback for when the socket is finished. + * @return The socket itself. + */ + end(callback?: () => void): void; + end(buffer: Uint8Array | string, callback?: () => void): void; + end(str: Uint8Array | string, encoding?: BufferEncoding, callback?: () => void): void; + /** + * events.EventEmitter + * 1. close + * 2. connect + * 3. data + * 4. drain + * 5. end + * 6. error + * 7. lookup + * 8. timeout + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: (hadError: boolean) => void): this; + addListener(event: 'connect', listener: () => void): this; + addListener(event: 'data', listener: (data: Buffer) => void): this; + addListener(event: 'drain', listener: () => void): this; + addListener(event: 'end', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'lookup', listener: (err: Error, address: string, family: string | number, host: string) => void): this; + addListener(event: 'ready', listener: () => void): this; + addListener(event: 'timeout', listener: () => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'close', hadError: boolean): boolean; + emit(event: 'connect'): boolean; + emit(event: 'data', data: Buffer): boolean; + emit(event: 'drain'): boolean; + emit(event: 'end'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'lookup', err: Error, address: string, family: string | number, host: string): boolean; + emit(event: 'ready'): boolean; + emit(event: 'timeout'): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: (hadError: boolean) => void): this; + on(event: 'connect', listener: () => void): this; + on(event: 'data', listener: (data: Buffer) => void): this; + on(event: 'drain', listener: () => void): this; + on(event: 'end', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'lookup', listener: (err: Error, address: string, family: string | number, host: string) => void): this; + on(event: 'ready', listener: () => void): this; + on(event: 'timeout', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: (hadError: boolean) => void): this; + once(event: 'connect', listener: () => void): this; + once(event: 'data', listener: (data: Buffer) => void): this; + once(event: 'drain', listener: () => void): this; + once(event: 'end', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'lookup', listener: (err: Error, address: string, family: string | number, host: string) => void): this; + once(event: 'ready', listener: () => void): this; + 
once(event: 'timeout', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: (hadError: boolean) => void): this; + prependListener(event: 'connect', listener: () => void): this; + prependListener(event: 'data', listener: (data: Buffer) => void): this; + prependListener(event: 'drain', listener: () => void): this; + prependListener(event: 'end', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'lookup', listener: (err: Error, address: string, family: string | number, host: string) => void): this; + prependListener(event: 'ready', listener: () => void): this; + prependListener(event: 'timeout', listener: () => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: (hadError: boolean) => void): this; + prependOnceListener(event: 'connect', listener: () => void): this; + prependOnceListener(event: 'data', listener: (data: Buffer) => void): this; + prependOnceListener(event: 'drain', listener: () => void): this; + prependOnceListener(event: 'end', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'lookup', listener: (err: Error, address: string, family: string | number, host: string) => void): this; + prependOnceListener(event: 'ready', listener: () => void): this; + prependOnceListener(event: 'timeout', listener: () => void): this; + } + interface ListenOptions extends Abortable { + port?: number | undefined; + host?: string | undefined; + backlog?: number | undefined; + path?: string | undefined; + exclusive?: boolean | undefined; + readableAll?: boolean | undefined; + writableAll?: boolean | undefined; + /** + * @default false + */ + ipv6Only?: boolean | undefined; + } + interface ServerOpts { + /** + * Indicates whether half-opened TCP connections are allowed. + * @default false + */ + allowHalfOpen?: boolean | undefined; + /** + * Indicates whether the socket should be paused on incoming connections. + * @default false + */ + pauseOnConnect?: boolean | undefined; + } + /** + * This class is used to create a TCP or `IPC` server. + * @since v0.1.90 + */ + class Server extends EventEmitter { + constructor(connectionListener?: (socket: Socket) => void); + constructor(options?: ServerOpts, connectionListener?: (socket: Socket) => void); + /** + * Start a server listening for connections. A `net.Server` can be a TCP or + * an `IPC` server depending on what it listens to. + * + * Possible signatures: + * + * * `server.listen(handle[, backlog][, callback])` + * * `server.listen(options[, callback])` + * * `server.listen(path[, backlog][, callback])` for `IPC` servers + * * `server.listen([port[, host[, backlog]]][, callback])` for TCP servers + * + * This function is asynchronous. When the server starts listening, the `'listening'` event will be emitted. The last parameter `callback`will be added as a listener for the `'listening'` + * event. + * + * All `listen()` methods can take a `backlog` parameter to specify the maximum + * length of the queue of pending connections. The actual length will be determined + * by the OS through sysctl settings such as `tcp_max_syn_backlog` and `somaxconn`on Linux. The default value of this parameter is 511 (not 512). 
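+ *
+ * For instance, the options form can be used to ask the OS for any free port
+ * (a sketch; the assigned port is read back from `server.address()`):
+ *
+ * ```js
+ * server.listen({ port: 0, host: '127.0.0.1', backlog: 128 }, () => {
+ *   console.log('listening on port', server.address().port);
+ * });
+ * ```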
+ * + * All {@link Socket} are set to `SO_REUSEADDR` (see [`socket(7)`](https://man7.org/linux/man-pages/man7/socket.7.html) for + * details). + * + * The `server.listen()` method can be called again if and only if there was an + * error during the first `server.listen()` call or `server.close()` has been + * called. Otherwise, an `ERR_SERVER_ALREADY_LISTEN` error will be thrown. + * + * One of the most common errors raised when listening is `EADDRINUSE`. + * This happens when another server is already listening on the requested`port`/`path`/`handle`. One way to handle this would be to retry + * after a certain amount of time: + * + * ```js + * server.on('error', (e) => { + * if (e.code === 'EADDRINUSE') { + * console.log('Address in use, retrying...'); + * setTimeout(() => { + * server.close(); + * server.listen(PORT, HOST); + * }, 1000); + * } + * }); + * ``` + */ + listen(port?: number, hostname?: string, backlog?: number, listeningListener?: () => void): this; + listen(port?: number, hostname?: string, listeningListener?: () => void): this; + listen(port?: number, backlog?: number, listeningListener?: () => void): this; + listen(port?: number, listeningListener?: () => void): this; + listen(path: string, backlog?: number, listeningListener?: () => void): this; + listen(path: string, listeningListener?: () => void): this; + listen(options: ListenOptions, listeningListener?: () => void): this; + listen(handle: any, backlog?: number, listeningListener?: () => void): this; + listen(handle: any, listeningListener?: () => void): this; + /** + * Stops the server from accepting new connections and keeps existing + * connections. This function is asynchronous, the server is finally closed + * when all connections are ended and the server emits a `'close'` event. + * The optional `callback` will be called once the `'close'` event occurs. Unlike + * that event, it will be called with an `Error` as its only argument if the server + * was not open when it was closed. + * @since v0.1.90 + * @param callback Called when the server is closed. + */ + close(callback?: (err?: Error) => void): this; + /** + * Returns the bound `address`, the address `family` name, and `port` of the server + * as reported by the operating system if listening on an IP socket + * (useful to find which port was assigned when getting an OS-assigned address):`{ port: 12346, family: 'IPv4', address: '127.0.0.1' }`. + * + * For a server listening on a pipe or Unix domain socket, the name is returned + * as a string. + * + * ```js + * const server = net.createServer((socket) => { + * socket.end('goodbye\n'); + * }).on('error', (err) => { + * // Handle errors here. + * throw err; + * }); + * + * // Grab an arbitrary unused port. + * server.listen(() => { + * console.log('opened server on', server.address()); + * }); + * ``` + * + * `server.address()` returns `null` before the `'listening'` event has been + * emitted or after calling `server.close()`. + * @since v0.1.90 + */ + address(): AddressInfo | string | null; + /** + * Asynchronously get the number of concurrent connections on the server. Works + * when sockets were sent to forks. + * + * Callback should take two arguments `err` and `count`. + * @since v0.9.7 + */ + getConnections(cb: (error: Error | null, count: number) => void): void; + /** + * Opposite of `unref()`, calling `ref()` on a previously `unref`ed server will_not_ let the program exit if it's the only server left (the default behavior). + * If the server is `ref`ed calling `ref()` again will have no effect. 
+ * @since v0.9.1 + */ + ref(): this; + /** + * Calling `unref()` on a server will allow the program to exit if this is the only + * active server in the event system. If the server is already `unref`ed calling`unref()` again will have no effect. + * @since v0.9.1 + */ + unref(): this; + /** + * Set this property to reject connections when the server's connection count gets + * high. + * + * It is not recommended to use this option once a socket has been sent to a child + * with `child_process.fork()`. + * @since v0.2.0 + */ + maxConnections: number; + connections: number; + /** + * Indicates whether or not the server is listening for connections. + * @since v5.7.0 + */ + listening: boolean; + /** + * events.EventEmitter + * 1. close + * 2. connection + * 3. error + * 4. listening + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'connection', listener: (socket: Socket) => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'listening', listener: () => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'close'): boolean; + emit(event: 'connection', socket: Socket): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'listening'): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'connection', listener: (socket: Socket) => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'listening', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'connection', listener: (socket: Socket) => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'listening', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'connection', listener: (socket: Socket) => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'listening', listener: () => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'connection', listener: (socket: Socket) => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'listening', listener: () => void): this; + } + type IPVersion = 'ipv4' | 'ipv6'; + /** + * The `BlockList` object can be used with some network APIs to specify rules for + * disabling inbound or outbound access to specific IP addresses, IP ranges, or + * IP subnets. + * @since v15.0.0 + */ + class BlockList { + /** + * Adds a rule to block the given IP address. + * @since v15.0.0 + * @param address An IPv4 or IPv6 address. + * @param [type='ipv4'] Either `'ipv4'` or `'ipv6'`. + */ + addAddress(address: string, type?: IPVersion): void; + addAddress(address: SocketAddress): void; + /** + * Adds a rule to block a range of IP addresses from `start` (inclusive) to`end` (inclusive). + * @since v15.0.0 + * @param start The starting IPv4 or IPv6 address in the range. + * @param end The ending IPv4 or IPv6 address in the range. + * @param [type='ipv4'] Either `'ipv4'` or `'ipv6'`. 
+ */ + addRange(start: string, end: string, type?: IPVersion): void; + addRange(start: SocketAddress, end: SocketAddress): void; + /** + * Adds a rule to block a range of IP addresses specified as a subnet mask. + * @since v15.0.0 + * @param net The network IPv4 or IPv6 address. + * @param prefix The number of CIDR prefix bits. For IPv4, this must be a value between `0` and `32`. For IPv6, this must be between `0` and `128`. + * @param [type='ipv4'] Either `'ipv4'` or `'ipv6'`. + */ + addSubnet(net: SocketAddress, prefix: number): void; + addSubnet(net: string, prefix: number, type?: IPVersion): void; + /** + * Returns `true` if the given IP address matches any of the rules added to the`BlockList`. + * + * ```js + * const blockList = new net.BlockList(); + * blockList.addAddress('123.123.123.123'); + * blockList.addRange('10.0.0.1', '10.0.0.10'); + * blockList.addSubnet('8592:757c:efae:4e45::', 64, 'ipv6'); + * + * console.log(blockList.check('123.123.123.123')); // Prints: true + * console.log(blockList.check('10.0.0.3')); // Prints: true + * console.log(blockList.check('222.111.111.222')); // Prints: false + * + * // IPv6 notation for IPv4 addresses works: + * console.log(blockList.check('::ffff:7b7b:7b7b', 'ipv6')); // Prints: true + * console.log(blockList.check('::ffff:123.123.123.123', 'ipv6')); // Prints: true + * ``` + * @since v15.0.0 + * @param address The IP address to check + * @param [type='ipv4'] Either `'ipv4'` or `'ipv6'`. + */ + check(address: SocketAddress): boolean; + check(address: string, type?: IPVersion): boolean; + } + interface TcpNetConnectOpts extends TcpSocketConnectOpts, SocketConstructorOpts { + timeout?: number | undefined; + } + interface IpcNetConnectOpts extends IpcSocketConnectOpts, SocketConstructorOpts { + timeout?: number | undefined; + } + type NetConnectOpts = TcpNetConnectOpts | IpcNetConnectOpts; + /** + * Creates a new TCP or `IPC` server. + * + * If `allowHalfOpen` is set to `true`, when the other end of the socket + * signals the end of transmission, the server will only send back the end of + * transmission when `socket.end()` is explicitly called. For example, in the + * context of TCP, when a FIN packed is received, a FIN packed is sent + * back only when `socket.end()` is explicitly called. Until then the + * connection is half-closed (non-readable but still writable). See `'end'` event and [RFC 1122](https://tools.ietf.org/html/rfc1122) (section 4.2.2.13) for more information. + * + * If `pauseOnConnect` is set to `true`, then the socket associated with each + * incoming connection will be paused, and no data will be read from its handle. + * This allows connections to be passed between processes without any data being + * read by the original process. To begin reading data from a paused socket, call `socket.resume()`. + * + * The server can be a TCP server or an `IPC` server, depending on what it `listen()` to. + * + * Here is an example of an TCP echo server which listens for connections + * on port 8124: + * + * ```js + * const net = require('net'); + * const server = net.createServer((c) => { + * // 'connection' listener. 
+ * console.log('client connected'); + * c.on('end', () => { + * console.log('client disconnected'); + * }); + * c.write('hello\r\n'); + * c.pipe(c); + * }); + * server.on('error', (err) => { + * throw err; + * }); + * server.listen(8124, () => { + * console.log('server bound'); + * }); + * ``` + * + * Test this by using `telnet`: + * + * ```console + * $ telnet localhost 8124 + * ``` + * + * To listen on the socket `/tmp/echo.sock`: + * + * ```js + * server.listen('/tmp/echo.sock', () => { + * console.log('server bound'); + * }); + * ``` + * + * Use `nc` to connect to a Unix domain socket server: + * + * ```console + * $ nc -U /tmp/echo.sock + * ``` + * @since v0.5.0 + * @param connectionListener Automatically set as a listener for the {@link 'connection'} event. + */ + function createServer(connectionListener?: (socket: Socket) => void): Server; + function createServer(options?: ServerOpts, connectionListener?: (socket: Socket) => void): Server; + /** + * Aliases to {@link createConnection}. + * + * Possible signatures: + * + * * {@link connect} + * * {@link connect} for `IPC` connections. + * * {@link connect} for TCP connections. + */ + function connect(options: NetConnectOpts, connectionListener?: () => void): Socket; + function connect(port: number, host?: string, connectionListener?: () => void): Socket; + function connect(path: string, connectionListener?: () => void): Socket; + /** + * A factory function, which creates a new {@link Socket}, + * immediately initiates connection with `socket.connect()`, + * then returns the `net.Socket` that starts the connection. + * + * When the connection is established, a `'connect'` event will be emitted + * on the returned socket. The last parameter `connectListener`, if supplied, + * will be added as a listener for the `'connect'` event **once**. + * + * Possible signatures: + * + * * {@link createConnection} + * * {@link createConnection} for `IPC` connections. + * * {@link createConnection} for TCP connections. + * + * The {@link connect} function is an alias to this function. + */ + function createConnection(options: NetConnectOpts, connectionListener?: () => void): Socket; + function createConnection(port: number, host?: string, connectionListener?: () => void): Socket; + function createConnection(path: string, connectionListener?: () => void): Socket; + /** + * Tests if input is an IP address. Returns `0` for invalid strings, + * returns `4` for IP version 4 addresses, and returns `6` for IP version 6 + * addresses. + * @since v0.3.0 + */ + function isIP(input: string): number; + /** + * Returns `true` if input is a version 4 IP address, otherwise returns `false`. + * @since v0.3.0 + */ + function isIPv4(input: string): boolean; + /** + * Returns `true` if input is a version 6 IP address, otherwise returns `false`. + * @since v0.3.0 + */ + function isIPv6(input: string): boolean; + interface SocketAddressInitOptions { + /** + * The network address as either an IPv4 or IPv6 string. + * @default 127.0.0.1 + */ + address?: string | undefined; + /** + * @default `'ipv4'` + */ + family?: IPVersion | undefined; + /** + * An IPv6 flow-label used only if `family` is `'ipv6'`. + * @default 0 + */ + flowlabel?: number | undefined; + /** + * An IP port. + * @default 0 + */ + port?: number | undefined; + } + /** + * @since v15.14.0 + */ + class SocketAddress { + constructor(options: SocketAddressInitOptions); + /** + * Either \`'ipv4'\` or \`'ipv6'\`. + * @since v15.14.0 + */ + readonly address: string; + /** + * Either \`'ipv4'\` or \`'ipv6'\`. 
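The `connect`/`createConnection` factories and the `isIP` helpers declared above ship without inline examples in these typings; a small client sketch, assuming something (for instance the echo server shown earlier) is listening on port 8124:

```ts
import * as net from 'node:net';

const host = '127.0.0.1';

// isIP() returns 0, 4 or 6; isIPv4()/isIPv6() return booleans.
console.log(net.isIP(host));   // 4
console.log(net.isIPv4(host)); // true
console.log(net.isIPv6(host)); // false

// createConnection() initiates the connection immediately and returns the socket;
// the optional callback is registered once for the 'connect' event.
const client = net.createConnection({ port: 8124, host }, () => {
  client.write('ping\r\n');
});

client.on('data', (chunk) => {
  console.log('received:', chunk.toString());
  client.end(); // finish the writable side once the reply arrives
});

client.on('end', () => {
  console.log('disconnected');
});
```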
+ * @since v15.14.0 + */ + readonly family: IPVersion; + /** + * @since v15.14.0 + */ + readonly port: number; + /** + * @since v15.14.0 + */ + readonly flowlabel: number; + } +} +declare module 'node:net' { + export * from 'net'; +} diff --git a/node_modules/@types/node/os.d.ts b/node_modules/@types/node/os.d.ts new file mode 100644 index 000000000..f9cd24249 --- /dev/null +++ b/node_modules/@types/node/os.d.ts @@ -0,0 +1,455 @@ +/** + * The `os` module provides operating system-related utility methods and + * properties. It can be accessed using: + * + * ```js + * const os = require('os'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/os.js) + */ +declare module 'os' { + interface CpuInfo { + model: string; + speed: number; + times: { + user: number; + nice: number; + sys: number; + idle: number; + irq: number; + }; + } + interface NetworkInterfaceBase { + address: string; + netmask: string; + mac: string; + internal: boolean; + cidr: string | null; + } + interface NetworkInterfaceInfoIPv4 extends NetworkInterfaceBase { + family: 'IPv4'; + } + interface NetworkInterfaceInfoIPv6 extends NetworkInterfaceBase { + family: 'IPv6'; + scopeid: number; + } + interface UserInfo { + username: T; + uid: number; + gid: number; + shell: T; + homedir: T; + } + type NetworkInterfaceInfo = NetworkInterfaceInfoIPv4 | NetworkInterfaceInfoIPv6; + /** + * Returns the host name of the operating system as a string. + * @since v0.3.3 + */ + function hostname(): string; + /** + * Returns an array containing the 1, 5, and 15 minute load averages. + * + * The load average is a measure of system activity calculated by the operating + * system and expressed as a fractional number. + * + * The load average is a Unix-specific concept. On Windows, the return value is + * always `[0, 0, 0]`. + * @since v0.3.3 + */ + function loadavg(): number[]; + /** + * Returns the system uptime in number of seconds. + * @since v0.3.3 + */ + function uptime(): number; + /** + * Returns the amount of free system memory in bytes as an integer. + * @since v0.3.3 + */ + function freemem(): number; + /** + * Returns the total amount of system memory in bytes as an integer. + * @since v0.3.3 + */ + function totalmem(): number; + /** + * Returns an array of objects containing information about each logical CPU core. + * + * The properties included on each object include: + * + * ```js + * [ + * { + * model: 'Intel(R) Core(TM) i7 CPU 860 @ 2.80GHz', + * speed: 2926, + * times: { + * user: 252020, + * nice: 0, + * sys: 30340, + * idle: 1070356870, + * irq: 0 + * } + * }, + * { + * model: 'Intel(R) Core(TM) i7 CPU 860 @ 2.80GHz', + * speed: 2926, + * times: { + * user: 306960, + * nice: 0, + * sys: 26980, + * idle: 1071569080, + * irq: 0 + * } + * }, + * { + * model: 'Intel(R) Core(TM) i7 CPU 860 @ 2.80GHz', + * speed: 2926, + * times: { + * user: 248450, + * nice: 0, + * sys: 21750, + * idle: 1070919370, + * irq: 0 + * } + * }, + * { + * model: 'Intel(R) Core(TM) i7 CPU 860 @ 2.80GHz', + * speed: 2926, + * times: { + * user: 256880, + * nice: 0, + * sys: 19430, + * idle: 1070905480, + * irq: 20 + * } + * }, + * ] + * ``` + * + * `nice` values are POSIX-only. On Windows, the `nice` values of all processors + * are always 0. + * @since v0.3.3 + */ + function cpus(): CpuInfo[]; + /** + * Returns the operating system name as returned by [`uname(3)`](https://linux.die.net/man/3/uname). For example, it + * returns `'Linux'` on Linux, `'Darwin'` on macOS, and `'Windows_NT'` on Windows. 
+ * + * See [https://en.wikipedia.org/wiki/Uname#Examples](https://en.wikipedia.org/wiki/Uname#Examples) for additional information + * about the output of running [`uname(3)`](https://linux.die.net/man/3/uname) on various operating systems. + * @since v0.3.3 + */ + function type(): string; + /** + * Returns the operating system as a string. + * + * On POSIX systems, the operating system release is determined by calling [`uname(3)`](https://linux.die.net/man/3/uname). On Windows, `GetVersionExW()` is used. See + * [https://en.wikipedia.org/wiki/Uname#Examples](https://en.wikipedia.org/wiki/Uname#Examples) for more information. + * @since v0.3.3 + */ + function release(): string; + /** + * Returns an object containing network interfaces that have been assigned a + * network address. + * + * Each key on the returned object identifies a network interface. The associated + * value is an array of objects that each describe an assigned network address. + * + * The properties available on the assigned network address object include: + * + * ```js + * { + * lo: [ + * { + * address: '127.0.0.1', + * netmask: '255.0.0.0', + * family: 'IPv4', + * mac: '00:00:00:00:00:00', + * internal: true, + * cidr: '127.0.0.1/8' + * }, + * { + * address: '::1', + * netmask: 'ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff', + * family: 'IPv6', + * mac: '00:00:00:00:00:00', + * scopeid: 0, + * internal: true, + * cidr: '::1/128' + * } + * ], + * eth0: [ + * { + * address: '192.168.1.108', + * netmask: '255.255.255.0', + * family: 'IPv4', + * mac: '01:02:03:0a:0b:0c', + * internal: false, + * cidr: '192.168.1.108/24' + * }, + * { + * address: 'fe80::a00:27ff:fe4e:66a1', + * netmask: 'ffff:ffff:ffff:ffff::', + * family: 'IPv6', + * mac: '01:02:03:0a:0b:0c', + * scopeid: 1, + * internal: false, + * cidr: 'fe80::a00:27ff:fe4e:66a1/64' + * } + * ] + * } + * ``` + * @since v0.6.0 + */ + function networkInterfaces(): NodeJS.Dict; + /** + * Returns the string path of the current user's home directory. + * + * On POSIX, it uses the `$HOME` environment variable if defined. Otherwise it + * uses the [effective UID](https://en.wikipedia.org/wiki/User_identifier#Effective_user_ID) to look up the user's home directory. + * + * On Windows, it uses the `USERPROFILE` environment variable if defined. + * Otherwise it uses the path to the profile directory of the current user. + * @since v2.3.0 + */ + function homedir(): string; + /** + * Returns information about the currently effective user. On POSIX platforms, + * this is typically a subset of the password file. The returned object includes + * the `username`, `uid`, `gid`, `shell`, and `homedir`. On Windows, the `uid` and`gid` fields are `-1`, and `shell` is `null`. + * + * The value of `homedir` returned by `os.userInfo()` is provided by the operating + * system. This differs from the result of `os.homedir()`, which queries + * environment variables for the home directory before falling back to the + * operating system response. + * + * Throws a `SystemError` if a user has no `username` or `homedir`. 
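Most of the `os` accessors documented above (`hostname`, `loadavg`, `uptime`, `freemem`, `totalmem`, `type`, `release`, `homedir`, `userInfo`) have no inline examples in these typings; a minimal sketch that simply prints them:

```ts
import * as os from 'node:os';

// Basic host identification.
console.log(`${os.hostname()} (${os.type()} ${os.release()})`);

// Memory, converted to mebibytes for readability.
const mib = (bytes: number) => Math.round(bytes / 1024 / 1024);
console.log(`memory: ${mib(os.freemem())} MiB free of ${mib(os.totalmem())} MiB`);

// Uptime in seconds and the 1/5/15 minute load averages
// (always [0, 0, 0] on Windows, as noted above).
console.log(`up ${os.uptime()}s, load ${os.loadavg().join(', ')}`);

// Current user; userInfo() throws if the user has no username or homedir.
const user = os.userInfo();
console.log(`${user.username} in ${os.homedir()}`);
```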
+ * @since v6.0.0 + */ + function userInfo(options: { encoding: 'buffer' }): UserInfo; + function userInfo(options?: { encoding: BufferEncoding }): UserInfo; + type SignalConstants = { + [key in NodeJS.Signals]: number; + }; + namespace constants { + const UV_UDP_REUSEADDR: number; + namespace signals {} + const signals: SignalConstants; + namespace errno { + const E2BIG: number; + const EACCES: number; + const EADDRINUSE: number; + const EADDRNOTAVAIL: number; + const EAFNOSUPPORT: number; + const EAGAIN: number; + const EALREADY: number; + const EBADF: number; + const EBADMSG: number; + const EBUSY: number; + const ECANCELED: number; + const ECHILD: number; + const ECONNABORTED: number; + const ECONNREFUSED: number; + const ECONNRESET: number; + const EDEADLK: number; + const EDESTADDRREQ: number; + const EDOM: number; + const EDQUOT: number; + const EEXIST: number; + const EFAULT: number; + const EFBIG: number; + const EHOSTUNREACH: number; + const EIDRM: number; + const EILSEQ: number; + const EINPROGRESS: number; + const EINTR: number; + const EINVAL: number; + const EIO: number; + const EISCONN: number; + const EISDIR: number; + const ELOOP: number; + const EMFILE: number; + const EMLINK: number; + const EMSGSIZE: number; + const EMULTIHOP: number; + const ENAMETOOLONG: number; + const ENETDOWN: number; + const ENETRESET: number; + const ENETUNREACH: number; + const ENFILE: number; + const ENOBUFS: number; + const ENODATA: number; + const ENODEV: number; + const ENOENT: number; + const ENOEXEC: number; + const ENOLCK: number; + const ENOLINK: number; + const ENOMEM: number; + const ENOMSG: number; + const ENOPROTOOPT: number; + const ENOSPC: number; + const ENOSR: number; + const ENOSTR: number; + const ENOSYS: number; + const ENOTCONN: number; + const ENOTDIR: number; + const ENOTEMPTY: number; + const ENOTSOCK: number; + const ENOTSUP: number; + const ENOTTY: number; + const ENXIO: number; + const EOPNOTSUPP: number; + const EOVERFLOW: number; + const EPERM: number; + const EPIPE: number; + const EPROTO: number; + const EPROTONOSUPPORT: number; + const EPROTOTYPE: number; + const ERANGE: number; + const EROFS: number; + const ESPIPE: number; + const ESRCH: number; + const ESTALE: number; + const ETIME: number; + const ETIMEDOUT: number; + const ETXTBSY: number; + const EWOULDBLOCK: number; + const EXDEV: number; + const WSAEINTR: number; + const WSAEBADF: number; + const WSAEACCES: number; + const WSAEFAULT: number; + const WSAEINVAL: number; + const WSAEMFILE: number; + const WSAEWOULDBLOCK: number; + const WSAEINPROGRESS: number; + const WSAEALREADY: number; + const WSAENOTSOCK: number; + const WSAEDESTADDRREQ: number; + const WSAEMSGSIZE: number; + const WSAEPROTOTYPE: number; + const WSAENOPROTOOPT: number; + const WSAEPROTONOSUPPORT: number; + const WSAESOCKTNOSUPPORT: number; + const WSAEOPNOTSUPP: number; + const WSAEPFNOSUPPORT: number; + const WSAEAFNOSUPPORT: number; + const WSAEADDRINUSE: number; + const WSAEADDRNOTAVAIL: number; + const WSAENETDOWN: number; + const WSAENETUNREACH: number; + const WSAENETRESET: number; + const WSAECONNABORTED: number; + const WSAECONNRESET: number; + const WSAENOBUFS: number; + const WSAEISCONN: number; + const WSAENOTCONN: number; + const WSAESHUTDOWN: number; + const WSAETOOMANYREFS: number; + const WSAETIMEDOUT: number; + const WSAECONNREFUSED: number; + const WSAELOOP: number; + const WSAENAMETOOLONG: number; + const WSAEHOSTDOWN: number; + const WSAEHOSTUNREACH: number; + const WSAENOTEMPTY: number; + const WSAEPROCLIM: number; + 
const WSAEUSERS: number; + const WSAEDQUOT: number; + const WSAESTALE: number; + const WSAEREMOTE: number; + const WSASYSNOTREADY: number; + const WSAVERNOTSUPPORTED: number; + const WSANOTINITIALISED: number; + const WSAEDISCON: number; + const WSAENOMORE: number; + const WSAECANCELLED: number; + const WSAEINVALIDPROCTABLE: number; + const WSAEINVALIDPROVIDER: number; + const WSAEPROVIDERFAILEDINIT: number; + const WSASYSCALLFAILURE: number; + const WSASERVICE_NOT_FOUND: number; + const WSATYPE_NOT_FOUND: number; + const WSA_E_NO_MORE: number; + const WSA_E_CANCELLED: number; + const WSAEREFUSED: number; + } + namespace priority { + const PRIORITY_LOW: number; + const PRIORITY_BELOW_NORMAL: number; + const PRIORITY_NORMAL: number; + const PRIORITY_ABOVE_NORMAL: number; + const PRIORITY_HIGH: number; + const PRIORITY_HIGHEST: number; + } + } + const devNull: string; + const EOL: string; + /** + * Returns the operating system CPU architecture for which the Node.js binary was + * compiled. Possible values are `'arm'`, `'arm64'`, `'ia32'`, `'mips'`,`'mipsel'`, `'ppc'`, `'ppc64'`, `'s390'`, `'s390x'`, `'x32'`, and `'x64'`. + * + * The return value is equivalent to `process.arch`. + * @since v0.5.0 + */ + function arch(): string; + /** + * Returns a string identifying the kernel version. + * + * On POSIX systems, the operating system release is determined by calling [`uname(3)`](https://linux.die.net/man/3/uname). On Windows, `RtlGetVersion()` is used, and if it is not + * available, `GetVersionExW()` will be used. See [https://en.wikipedia.org/wiki/Uname#Examples](https://en.wikipedia.org/wiki/Uname#Examples) for more information. + * @since v13.11.0, v12.17.0 + */ + function version(): string; + /** + * Returns a string identifying the operating system platform. The value is set + * at compile time. Possible values are `'aix'`, `'darwin'`, `'freebsd'`,`'linux'`, `'openbsd'`, `'sunos'`, and `'win32'`. + * + * The return value is equivalent to `process.platform`. + * + * The value `'android'` may also be returned if Node.js is built on the Android + * operating system. [Android support is experimental](https://github.com/nodejs/node/blob/HEAD/BUILDING.md#androidandroid-based-devices-eg-firefox-os). + * @since v0.5.0 + */ + function platform(): NodeJS.Platform; + /** + * Returns the operating system's default directory for temporary files as a + * string. + * @since v0.9.9 + */ + function tmpdir(): string; + /** + * Returns a string identifying the endianness of the CPU for which the Node.js + * binary was compiled. + * + * Possible values are `'BE'` for big endian and `'LE'` for little endian. + * @since v0.9.4 + */ + function endianness(): 'BE' | 'LE'; + /** + * Returns the scheduling priority for the process specified by `pid`. If `pid` is + * not provided or is `0`, the priority of the current process is returned. + * @since v10.10.0 + * @param [pid=0] The process ID to retrieve scheduling priority for. + */ + function getPriority(pid?: number): number; + /** + * Attempts to set the scheduling priority for the process specified by `pid`. If`pid` is not provided or is `0`, the process ID of the current process is used. + * + * The `priority` input must be an integer between `-20` (high priority) and `19`(low priority). Due to differences between Unix priority levels and Windows + * priority classes, `priority` is mapped to one of six priority constants in`os.constants.priority`. 
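Similarly, `arch`, `platform`, `endianness`, `tmpdir`, `EOL` and the scheduling-priority helpers above are declared without examples; a short sketch (lowering the current process's own priority is used purely for illustration):

```ts
import * as os from 'node:os';

console.log(os.platform(), os.arch(), os.endianness()); // e.g. 'linux' 'x64' 'LE'
console.log('temp dir:', os.tmpdir());
console.log('EOL is CRLF:', os.EOL === '\r\n');

// Scheduling priority of the current process (omitting pid means "this process").
console.log('current priority:', os.getPriority());

// Prefer the named constants over raw integers when setting a priority.
os.setPriority(os.constants.priority.PRIORITY_BELOW_NORMAL);
console.log('new priority:', os.getPriority());
```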
When retrieving a process priority level, this range + * mapping may cause the return value to be slightly different on Windows. To avoid + * confusion, set `priority` to one of the priority constants. + * + * On Windows, setting priority to `PRIORITY_HIGHEST` requires elevated user + * privileges. Otherwise the set priority will be silently reduced to`PRIORITY_HIGH`. + * @since v10.10.0 + * @param [pid=0] The process ID to set scheduling priority for. + * @param priority The scheduling priority to assign to the process. + */ + function setPriority(priority: number): void; + function setPriority(pid: number, priority: number): void; +} +declare module 'node:os' { + export * from 'os'; +} diff --git a/node_modules/@types/node/package.json b/node_modules/@types/node/package.json new file mode 100644 index 000000000..f72d6030e --- /dev/null +++ b/node_modules/@types/node/package.json @@ -0,0 +1,217 @@ +{ + "_from": "@types/node@*", + "_id": "@types/node@16.10.3", + "_inBundle": false, + "_integrity": "sha512-ho3Ruq+fFnBrZhUYI46n/bV2GjwzSkwuT4dTf0GkuNFmnb8nq4ny2z9JEVemFi6bdEJanHLlYfy9c6FN9B9McQ==", + "_location": "/@types/node", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "@types/node@*", + "name": "@types/node", + "escapedName": "@types%2fnode", + "scope": "@types", + "rawSpec": "*", + "saveSpec": null, + "fetchSpec": "*" + }, + "_requiredBy": [ + "/@types/whatwg-url" + ], + "_resolved": "https://registry.npmjs.org/@types/node/-/node-16.10.3.tgz", + "_shasum": "7a8f2838603ea314d1d22bb3171d899e15c57bd5", + "_spec": "@types/node@*", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\@types\\whatwg-url", + "bugs": { + "url": "https://github.com/DefinitelyTyped/DefinitelyTyped/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Microsoft TypeScript", + "url": "https://github.com/Microsoft" + }, + { + "name": "DefinitelyTyped", + "url": "https://github.com/DefinitelyTyped" + }, + { + "name": "Alberto Schiabel", + "url": "https://github.com/jkomyno" + }, + { + "name": "Alvis HT Tang", + "url": "https://github.com/alvis" + }, + { + "name": "Andrew Makarov", + "url": "https://github.com/r3nya" + }, + { + "name": "Benjamin Toueg", + "url": "https://github.com/btoueg" + }, + { + "name": "Chigozirim C.", + "url": "https://github.com/smac89" + }, + { + "name": "David Junger", + "url": "https://github.com/touffy" + }, + { + "name": "Deividas Bakanas", + "url": "https://github.com/DeividasBakanas" + }, + { + "name": "Eugene Y. Q. 
Shen", + "url": "https://github.com/eyqs" + }, + { + "name": "Hannes Magnusson", + "url": "https://github.com/Hannes-Magnusson-CK" + }, + { + "name": "Huw", + "url": "https://github.com/hoo29" + }, + { + "name": "Kelvin Jin", + "url": "https://github.com/kjin" + }, + { + "name": "Klaus Meinhardt", + "url": "https://github.com/ajafff" + }, + { + "name": "Lishude", + "url": "https://github.com/islishude" + }, + { + "name": "Mariusz Wiktorczyk", + "url": "https://github.com/mwiktorczyk" + }, + { + "name": "Mohsen Azimi", + "url": "https://github.com/mohsen1" + }, + { + "name": "Nicolas Even", + "url": "https://github.com/n-e" + }, + { + "name": "Nikita Galkin", + "url": "https://github.com/galkin" + }, + { + "name": "Parambir Singh", + "url": "https://github.com/parambirs" + }, + { + "name": "Sebastian Silbermann", + "url": "https://github.com/eps1lon" + }, + { + "name": "Simon Schick", + "url": "https://github.com/SimonSchick" + }, + { + "name": "Thomas den Hollander", + "url": "https://github.com/ThomasdenH" + }, + { + "name": "Wilco Bakker", + "url": "https://github.com/WilcoBakker" + }, + { + "name": "wwwy3y3", + "url": "https://github.com/wwwy3y3" + }, + { + "name": "Samuel Ainsworth", + "url": "https://github.com/samuela" + }, + { + "name": "Kyle Uehlein", + "url": "https://github.com/kuehlein" + }, + { + "name": "Thanik Bhongbhibhat", + "url": "https://github.com/bhongy" + }, + { + "name": "Marcin Kopacz", + "url": "https://github.com/chyzwar" + }, + { + "name": "Trivikram Kamat", + "url": "https://github.com/trivikr" + }, + { + "name": "Minh Son Nguyen", + "url": "https://github.com/nguymin4" + }, + { + "name": "Junxiao Shi", + "url": "https://github.com/yoursunny" + }, + { + "name": "Ilia Baryshnikov", + "url": "https://github.com/qwelias" + }, + { + "name": "ExE Boss", + "url": "https://github.com/ExE-Boss" + }, + { + "name": "Surasak Chaisurin", + "url": "https://github.com/Ryan-Willpower" + }, + { + "name": "Piotr Błażejewicz", + "url": "https://github.com/peterblazejewicz" + }, + { + "name": "Anna Henningsen", + "url": "https://github.com/addaleax" + }, + { + "name": "Victor Perin", + "url": "https://github.com/victorperin" + }, + { + "name": "Yongsheng Zhang", + "url": "https://github.com/ZYSzys" + }, + { + "name": "NodeJS Contributors", + "url": "https://github.com/NodeJS" + }, + { + "name": "Linus Unnebäck", + "url": "https://github.com/LinusU" + }, + { + "name": "wafuwafu13", + "url": "https://github.com/wafuwafu13" + } + ], + "dependencies": {}, + "deprecated": false, + "description": "TypeScript definitions for Node.js", + "homepage": "https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/node", + "license": "MIT", + "main": "", + "name": "@types/node", + "repository": { + "type": "git", + "url": "git+https://github.com/DefinitelyTyped/DefinitelyTyped.git", + "directory": "types/node" + }, + "scripts": {}, + "typeScriptVersion": "3.7", + "types": "index.d.ts", + "typesPublisherContentHash": "6b890401c9e8ed8ad1296e2cfeef65f2ba29feb793d490917f614c65e7f33f07", + "version": "16.10.3" +} diff --git a/node_modules/@types/node/path.d.ts b/node_modules/@types/node/path.d.ts new file mode 100644 index 000000000..a58c0aab9 --- /dev/null +++ b/node_modules/@types/node/path.d.ts @@ -0,0 +1,172 @@ +declare module 'path/posix' { + import path = require('path'); + export = path; +} +declare module 'path/win32' { + import path = require('path'); + export = path; +} +/** + * The `path` module provides utilities for working with file and directory paths. 
+ * It can be accessed using: + * + * ```js + * const path = require('path'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/path.js) + */ +declare module 'path' { + namespace path { + /** + * A parsed path object generated by path.parse() or consumed by path.format(). + */ + interface ParsedPath { + /** + * The root of the path such as '/' or 'c:\' + */ + root: string; + /** + * The full directory path such as '/home/user/dir' or 'c:\path\dir' + */ + dir: string; + /** + * The file name including extension (if any) such as 'index.html' + */ + base: string; + /** + * The file extension (if any) such as '.html' + */ + ext: string; + /** + * The file name without extension (if any) such as 'index' + */ + name: string; + } + interface FormatInputPathObject { + /** + * The root of the path such as '/' or 'c:\' + */ + root?: string | undefined; + /** + * The full directory path such as '/home/user/dir' or 'c:\path\dir' + */ + dir?: string | undefined; + /** + * The file name including extension (if any) such as 'index.html' + */ + base?: string | undefined; + /** + * The file extension (if any) such as '.html' + */ + ext?: string | undefined; + /** + * The file name without extension (if any) such as 'index' + */ + name?: string | undefined; + } + interface PlatformPath { + /** + * Normalize a string path, reducing '..' and '.' parts. + * When multiple slashes are found, they're replaced by a single one; when the path contains a trailing slash, it is preserved. On Windows backslashes are used. + * + * @param p string path to normalize. + */ + normalize(p: string): string; + /** + * Join all arguments together and normalize the resulting path. + * Arguments must be strings. In v0.8, non-string arguments were silently ignored. In v0.10 and up, an exception is thrown. + * + * @param paths paths to join. + */ + join(...paths: string[]): string; + /** + * The right-most parameter is considered {to}. Other parameters are considered an array of {from}. + * + * Starting from leftmost {from} parameter, resolves {to} to an absolute path. + * + * If {to} isn't already absolute, {from} arguments are prepended in right to left order, + * until an absolute path is found. If after using all {from} paths still no absolute path is found, + * the current working directory is used as well. The resulting path is normalized, + * and trailing slashes are removed unless the path gets resolved to the root directory. + * + * @param pathSegments string paths to join. Non-string arguments are ignored. + */ + resolve(...pathSegments: string[]): string; + /** + * Determines whether {path} is an absolute path. An absolute path will always resolve to the same location, regardless of the working directory. + * + * @param path path to test. + */ + isAbsolute(p: string): boolean; + /** + * Solve the relative path from {from} to {to}. + * At times we have two absolute paths, and we need to derive the relative path from one to the other. This is actually the reverse transform of path.resolve. + */ + relative(from: string, to: string): string; + /** + * Return the directory name of a path. Similar to the Unix dirname command. + * + * @param p the path to evaluate. + */ + dirname(p: string): string; + /** + * Return the last portion of a path. Similar to the Unix basename command. + * Often used to extract the file name from a fully qualified path. + * + * @param p the path to evaluate. + * @param ext optionally, an extension to remove from the result. 
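The `path` helpers declared above (`normalize`, `join`, `resolve`, `isAbsolute`, `relative`, `dirname`, `basename`) carry no usage examples in these typings; a minimal POSIX-flavoured sketch with made-up file names:

```ts
import * as path from 'node:path';

// join() normalizes as it concatenates; resolve() produces an absolute path.
const p = path.join('/home', 'user', 'docs', '..', 'notes.txt'); // '/home/user/notes.txt'
console.log(path.resolve('notes.txt'));      // absolute, based on process.cwd()
console.log(path.isAbsolute(p));             // true
console.log(path.relative('/home/user', p)); // 'notes.txt'
console.log(path.dirname(p));                // '/home/user'
console.log(path.basename(p));               // 'notes.txt'
console.log(path.basename(p, '.txt'));       // 'notes'

// Force POSIX behaviour regardless of the host platform.
console.log(path.posix.join('a', 'b'));      // 'a/b'
```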
+ */ + basename(p: string, ext?: string): string; + /** + * Return the extension of the path, from the last '.' to end of string in the last portion of the path. + * If there is no '.' in the last portion of the path or the first character of it is '.', then it returns an empty string + * + * @param p the path to evaluate. + */ + extname(p: string): string; + /** + * The platform-specific file separator. '\\' or '/'. + */ + readonly sep: string; + /** + * The platform-specific file delimiter. ';' or ':'. + */ + readonly delimiter: string; + /** + * Returns an object from a path string - the opposite of format(). + * + * @param pathString path to evaluate. + */ + parse(p: string): ParsedPath; + /** + * Returns a path string from an object - the opposite of parse(). + * + * @param pathString path to evaluate. + */ + format(pP: FormatInputPathObject): string; + /** + * On Windows systems only, returns an equivalent namespace-prefixed path for the given path. + * If path is not a string, path will be returned without modifications. + * This method is meaningful only on Windows system. + * On POSIX systems, the method is non-operational and always returns path without modifications. + */ + toNamespacedPath(path: string): string; + /** + * Posix specific pathing. + * Same as parent object on posix. + */ + readonly posix: PlatformPath; + /** + * Windows specific pathing. + * Same as parent object on windows + */ + readonly win32: PlatformPath; + } + } + const path: path.PlatformPath; + export = path; +} +declare module 'node:path' { + import path = require('path'); + export = path; +} diff --git a/node_modules/@types/node/perf_hooks.d.ts b/node_modules/@types/node/perf_hooks.d.ts new file mode 100644 index 000000000..0e4b48f93 --- /dev/null +++ b/node_modules/@types/node/perf_hooks.d.ts @@ -0,0 +1,555 @@ +/** + * This module provides an implementation of a subset of the W3C [Web Performance APIs](https://w3c.github.io/perf-timing-primer/) as well as additional APIs for + * Node.js-specific performance measurements. + * + * Node.js supports the following [Web Performance APIs](https://w3c.github.io/perf-timing-primer/): + * + * * [High Resolution Time](https://www.w3.org/TR/hr-time-2) + * * [Performance Timeline](https://w3c.github.io/performance-timeline/) + * * [User Timing](https://www.w3.org/TR/user-timing/) + * + * ```js + * const { PerformanceObserver, performance } = require('perf_hooks'); + * + * const obs = new PerformanceObserver((items) => { + * console.log(items.getEntries()[0].duration); + * performance.clearMarks(); + * }); + * obs.observe({ type: 'measure' }); + * performance.measure('Start to Now'); + * + * performance.mark('A'); + * doSomeLongRunningProcess(() => { + * performance.measure('A to Now', 'A'); + * + * performance.mark('B'); + * performance.measure('A to B', 'A', 'B'); + * }); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/perf_hooks.js) + */ +declare module 'perf_hooks' { + import { AsyncResource } from 'node:async_hooks'; + type EntryType = 'node' | 'mark' | 'measure' | 'gc' | 'function' | 'http2' | 'http'; + interface NodeGCPerformanceDetail { + /** + * When `performanceEntry.entryType` is equal to 'gc', `the performance.kind` property identifies + * the type of garbage collection operation that occurred. + * See perf_hooks.constants for valid values. 
+ */ + readonly kind?: number | undefined; + /** + * When `performanceEntry.entryType` is equal to 'gc', the `performance.flags` + * property contains additional information about garbage collection operation. + * See perf_hooks.constants for valid values. + */ + readonly flags?: number | undefined; + } + /** + * @since v8.5.0 + */ + class PerformanceEntry { + protected constructor(); + /** + * The total number of milliseconds elapsed for this entry. This value will not + * be meaningful for all Performance Entry types. + * @since v8.5.0 + */ + readonly duration: number; + /** + * The name of the performance entry. + * @since v8.5.0 + */ + readonly name: string; + /** + * The high resolution millisecond timestamp marking the starting time of the + * Performance Entry. + * @since v8.5.0 + */ + readonly startTime: number; + /** + * The type of the performance entry. It may be one of: + * + * * `'node'` (Node.js only) + * * `'mark'` (available on the Web) + * * `'measure'` (available on the Web) + * * `'gc'` (Node.js only) + * * `'function'` (Node.js only) + * * `'http2'` (Node.js only) + * * `'http'` (Node.js only) + * @since v8.5.0 + */ + readonly entryType: EntryType; + /** + * Additional detail specific to the `entryType`. + * @since v16.0.0 + */ + readonly detail?: NodeGCPerformanceDetail | unknown | undefined; // TODO: Narrow this based on entry type. + } + /** + * _This property is an extension by Node.js. It is not available in Web browsers._ + * + * Provides timing details for Node.js itself. The constructor of this class + * is not exposed to users. + * @since v8.5.0 + */ + class PerformanceNodeTiming extends PerformanceEntry { + /** + * The high resolution millisecond timestamp at which the Node.js process + * completed bootstrapping. If bootstrapping has not yet finished, the property + * has the value of -1. + * @since v8.5.0 + */ + readonly bootstrapComplete: number; + /** + * The high resolution millisecond timestamp at which the Node.js environment was + * initialized. + * @since v8.5.0 + */ + readonly environment: number; + /** + * The high resolution millisecond timestamp of the amount of time the event loop + * has been idle within the event loop's event provider (e.g. `epoll_wait`). This + * does not take CPU usage into consideration. If the event loop has not yet + * started (e.g., in the first tick of the main script), the property has the + * value of 0. + * @since v14.10.0, v12.19.0 + */ + readonly idleTime: number; + /** + * The high resolution millisecond timestamp at which the Node.js event loop + * exited. If the event loop has not yet exited, the property has the value of -1\. + * It can only have a value of not -1 in a handler of the `'exit'` event. + * @since v8.5.0 + */ + readonly loopExit: number; + /** + * The high resolution millisecond timestamp at which the Node.js event loop + * started. If the event loop has not yet started (e.g., in the first tick of the + * main script), the property has the value of -1. + * @since v8.5.0 + */ + readonly loopStart: number; + /** + * The high resolution millisecond timestamp at which the V8 platform was + * initialized. 
+ * @since v8.5.0 + */ + readonly v8Start: number; + } + interface EventLoopUtilization { + idle: number; + active: number; + utilization: number; + } + /** + * @param util1 The result of a previous call to eventLoopUtilization() + * @param util2 The result of a previous call to eventLoopUtilization() prior to util1 + */ + type EventLoopUtilityFunction = (util1?: EventLoopUtilization, util2?: EventLoopUtilization) => EventLoopUtilization; + interface MarkOptions { + /** + * Additional optional detail to include with the mark. + */ + detail?: unknown | undefined; + /** + * An optional timestamp to be used as the mark time. + * @default `performance.now()`. + */ + startTime?: number | undefined; + } + interface MeasureOptions { + /** + * Additional optional detail to include with the mark. + */ + detail?: unknown | undefined; + /** + * Duration between start and end times. + */ + duration?: number | undefined; + /** + * Timestamp to be used as the end time, or a string identifying a previously recorded mark. + */ + end?: number | string | undefined; + /** + * Timestamp to be used as the start time, or a string identifying a previously recorded mark. + */ + start?: number | string | undefined; + } + interface TimerifyOptions { + /** + * A histogram object created using + * `perf_hooks.createHistogram()` that will record runtime durations in + * nanoseconds. + */ + histogram?: RecordableHistogram | undefined; + } + interface Performance { + /** + * If name is not provided, removes all PerformanceMark objects from the Performance Timeline. + * If name is provided, removes only the named mark. + * @param name + */ + clearMarks(name?: string): void; + /** + * Creates a new PerformanceMark entry in the Performance Timeline. + * A PerformanceMark is a subclass of PerformanceEntry whose performanceEntry.entryType is always 'mark', + * and whose performanceEntry.duration is always 0. + * Performance marks are used to mark specific significant moments in the Performance Timeline. + * @param name + */ + mark(name?: string, options?: MarkOptions): void; + /** + * Creates a new PerformanceMeasure entry in the Performance Timeline. + * A PerformanceMeasure is a subclass of PerformanceEntry whose performanceEntry.entryType is always 'measure', + * and whose performanceEntry.duration measures the number of milliseconds elapsed since startMark and endMark. + * + * The startMark argument may identify any existing PerformanceMark in the the Performance Timeline, or may identify + * any of the timestamp properties provided by the PerformanceNodeTiming class. If the named startMark does not exist, + * then startMark is set to timeOrigin by default. + * + * The endMark argument must identify any existing PerformanceMark in the the Performance Timeline or any of the timestamp + * properties provided by the PerformanceNodeTiming class. If the named endMark does not exist, an error will be thrown. + * @param name + * @param startMark + * @param endMark + */ + measure(name: string, startMark?: string, endMark?: string): void; + measure(name: string, options: MeasureOptions): void; + /** + * An instance of the PerformanceNodeTiming class that provides performance metrics for specific Node.js operational milestones. + */ + readonly nodeTiming: PerformanceNodeTiming; + /** + * @return the current high resolution millisecond timestamp + */ + now(): number; + /** + * The timeOrigin specifies the high resolution millisecond timestamp from which all performance metric durations are measured. 
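The `MarkOptions`/`MeasureOptions` object forms and `performance.nodeTiming` above are not covered by the earlier inline examples; a small sketch in which the mark names and `detail` payloads are arbitrary:

```ts
import { performance, PerformanceObserver } from 'node:perf_hooks';

const obs = new PerformanceObserver((list, observer) => {
  for (const entry of list.getEntries()) {
    console.log(`${entry.name}: ${entry.duration.toFixed(2)}ms`, entry.detail);
  }
  observer.disconnect();
});
obs.observe({ type: 'measure' });

// mark() accepts an options object carrying arbitrary detail.
performance.mark('db:start', { detail: { query: 'SELECT 1' } });
// ... do some work ...
performance.mark('db:end');

// measure() can take { start, end, detail } instead of positional mark names.
performance.measure('db', { start: 'db:start', end: 'db:end', detail: 'round trip' });

// Node's own milestones, relative to performance.timeOrigin.
console.log('bootstrap completed at', performance.nodeTiming.bootstrapComplete);
```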
+ */ + readonly timeOrigin: number; + /** + * Wraps a function within a new function that measures the running time of the wrapped function. + * A PerformanceObserver must be subscribed to the 'function' event type in order for the timing details to be accessed. + * @param fn + */ + timerify any>(fn: T, options?: TimerifyOptions): T; + /** + * eventLoopUtilization is similar to CPU utilization except that it is calculated using high precision wall-clock time. + * It represents the percentage of time the event loop has spent outside the event loop's event provider (e.g. epoll_wait). + * No other CPU idle time is taken into consideration. + */ + eventLoopUtilization: EventLoopUtilityFunction; + } + interface PerformanceObserverEntryList { + /** + * Returns a list of `PerformanceEntry` objects in chronological order + * with respect to `performanceEntry.startTime`. + * + * ```js + * const { + * performance, + * PerformanceObserver + * } = require('perf_hooks'); + * + * const obs = new PerformanceObserver((perfObserverList, observer) => { + * console.log(perfObserverList.getEntries()); + * + * * [ + * * PerformanceEntry { + * * name: 'test', + * * entryType: 'mark', + * * startTime: 81.465639, + * * duration: 0 + * * }, + * * PerformanceEntry { + * * name: 'meow', + * * entryType: 'mark', + * * startTime: 81.860064, + * * duration: 0 + * * } + * * ] + * + * observer.disconnect(); + * }); + * obs.observe({ type: 'mark' }); + * + * performance.mark('test'); + * performance.mark('meow'); + * ``` + * @since v8.5.0 + */ + getEntries(): PerformanceEntry[]; + /** + * Returns a list of `PerformanceEntry` objects in chronological order + * with respect to `performanceEntry.startTime` whose `performanceEntry.name` is + * equal to `name`, and optionally, whose `performanceEntry.entryType` is equal to`type`. + * + * ```js + * const { + * performance, + * PerformanceObserver + * } = require('perf_hooks'); + * + * const obs = new PerformanceObserver((perfObserverList, observer) => { + * console.log(perfObserverList.getEntriesByName('meow')); + * + * * [ + * * PerformanceEntry { + * * name: 'meow', + * * entryType: 'mark', + * * startTime: 98.545991, + * * duration: 0 + * * } + * * ] + * + * console.log(perfObserverList.getEntriesByName('nope')); // [] + * + * console.log(perfObserverList.getEntriesByName('test', 'mark')); + * + * * [ + * * PerformanceEntry { + * * name: 'test', + * * entryType: 'mark', + * * startTime: 63.518931, + * * duration: 0 + * * } + * * ] + * + * console.log(perfObserverList.getEntriesByName('test', 'measure')); // [] + * observer.disconnect(); + * }); + * obs.observe({ entryTypes: ['mark', 'measure'] }); + * + * performance.mark('test'); + * performance.mark('meow'); + * ``` + * @since v8.5.0 + */ + getEntriesByName(name: string, type?: EntryType): PerformanceEntry[]; + /** + * Returns a list of `PerformanceEntry` objects in chronological order + * with respect to `performanceEntry.startTime` whose `performanceEntry.entryType`is equal to `type`. 
+ * + * ```js + * const { + * performance, + * PerformanceObserver + * } = require('perf_hooks'); + * + * const obs = new PerformanceObserver((perfObserverList, observer) => { + * console.log(perfObserverList.getEntriesByType('mark')); + * + * * [ + * * PerformanceEntry { + * * name: 'test', + * * entryType: 'mark', + * * startTime: 55.897834, + * * duration: 0 + * * }, + * * PerformanceEntry { + * * name: 'meow', + * * entryType: 'mark', + * * startTime: 56.350146, + * * duration: 0 + * * } + * * ] + * + * observer.disconnect(); + * }); + * obs.observe({ type: 'mark' }); + * + * performance.mark('test'); + * performance.mark('meow'); + * ``` + * @since v8.5.0 + */ + getEntriesByType(type: EntryType): PerformanceEntry[]; + } + type PerformanceObserverCallback = (list: PerformanceObserverEntryList, observer: PerformanceObserver) => void; + class PerformanceObserver extends AsyncResource { + constructor(callback: PerformanceObserverCallback); + /** + * Disconnects the `PerformanceObserver` instance from all notifications. + * @since v8.5.0 + */ + disconnect(): void; + /** + * Subscribes the `PerformanceObserver` instance to notifications of new `PerformanceEntry` instances identified either by `options.entryTypes`or `options.type`: + * + * ```js + * const { + * performance, + * PerformanceObserver + * } = require('perf_hooks'); + * + * const obs = new PerformanceObserver((list, observer) => { + * // Called three times synchronously. `list` contains one item. + * }); + * obs.observe({ type: 'mark' }); + * + * for (let n = 0; n < 3; n++) + * performance.mark(`test${n}`); + * ``` + * @since v8.5.0 + */ + observe( + options: + | { + entryTypes: ReadonlyArray; + } + | { + type: EntryType; + } + ): void; + } + namespace constants { + const NODE_PERFORMANCE_GC_MAJOR: number; + const NODE_PERFORMANCE_GC_MINOR: number; + const NODE_PERFORMANCE_GC_INCREMENTAL: number; + const NODE_PERFORMANCE_GC_WEAKCB: number; + const NODE_PERFORMANCE_GC_FLAGS_NO: number; + const NODE_PERFORMANCE_GC_FLAGS_CONSTRUCT_RETAINED: number; + const NODE_PERFORMANCE_GC_FLAGS_FORCED: number; + const NODE_PERFORMANCE_GC_FLAGS_SYNCHRONOUS_PHANTOM_PROCESSING: number; + const NODE_PERFORMANCE_GC_FLAGS_ALL_AVAILABLE_GARBAGE: number; + const NODE_PERFORMANCE_GC_FLAGS_ALL_EXTERNAL_MEMORY: number; + const NODE_PERFORMANCE_GC_FLAGS_SCHEDULE_IDLE: number; + } + const performance: Performance; + interface EventLoopMonitorOptions { + /** + * The sampling rate in milliseconds. + * Must be greater than zero. + * @default 10 + */ + resolution?: number | undefined; + } + interface Histogram { + /** + * Returns a `Map` object detailing the accumulated percentile distribution. + * @since v11.10.0 + */ + readonly percentiles: Map; + /** + * The number of times the event loop delay exceeded the maximum 1 hour event + * loop delay threshold. + * @since v11.10.0 + */ + readonly exceeds: number; + /** + * The minimum recorded event loop delay. + * @since v11.10.0 + */ + readonly min: number; + /** + * The maximum recorded event loop delay. + * @since v11.10.0 + */ + readonly max: number; + /** + * The mean of the recorded event loop delays. + * @since v11.10.0 + */ + readonly mean: number; + /** + * The standard deviation of the recorded event loop delays. + * @since v11.10.0 + */ + readonly stddev: number; + /** + * Resets the collected histogram data. + * @since v11.10.0 + */ + reset(): void; + /** + * Returns the value at the given percentile. + * @since v11.10.0 + * @param percentile A percentile value in the range (0, 100]. 
+ */ + percentile(percentile: number): number; + } + interface IntervalHistogram extends Histogram { + /** + * Enables the update interval timer. Returns `true` if the timer was + * started, `false` if it was already started. + * @since v11.10.0 + */ + enable(): boolean; + /** + * Disables the update interval timer. Returns `true` if the timer was + * stopped, `false` if it was already stopped. + * @since v11.10.0 + */ + disable(): boolean; + } + interface RecordableHistogram extends Histogram { + /** + * @since v15.9.0 + * @param val The amount to record in the histogram. + */ + record(val: number | bigint): void; + /** + * Calculates the amount of time (in nanoseconds) that has passed since the + * previous call to `recordDelta()` and records that amount in the histogram. + * + * ## Examples + * @since v15.9.0 + */ + recordDelta(): void; + } + /** + * _This property is an extension by Node.js. It is not available in Web browsers._ + * + * Creates an `IntervalHistogram` object that samples and reports the event loop + * delay over time. The delays will be reported in nanoseconds. + * + * Using a timer to detect approximate event loop delay works because the + * execution of timers is tied specifically to the lifecycle of the libuv + * event loop. That is, a delay in the loop will cause a delay in the execution + * of the timer, and those delays are specifically what this API is intended to + * detect. + * + * ```js + * const { monitorEventLoopDelay } = require('perf_hooks'); + * const h = monitorEventLoopDelay({ resolution: 20 }); + * h.enable(); + * // Do something. + * h.disable(); + * console.log(h.min); + * console.log(h.max); + * console.log(h.mean); + * console.log(h.stddev); + * console.log(h.percentiles); + * console.log(h.percentile(50)); + * console.log(h.percentile(99)); + * ``` + * @since v11.10.0 + */ + function monitorEventLoopDelay(options?: EventLoopMonitorOptions): IntervalHistogram; + interface CreateHistogramOptions { + /** + * The minimum recordable value. Must be an integer value greater than 0. + * @default 1 + */ + min?: number | bigint | undefined; + /** + * The maximum recordable value. Must be an integer value greater than min. + * @default Number.MAX_SAFE_INTEGER + */ + max?: number | bigint | undefined; + /** + * The number of accuracy digits. Must be a number between 1 and 5. + * @default 3 + */ + figures?: number | undefined; + } + /** + * Returns a `RecordableHistogram`. + * @since v15.9.0 + */ + function createHistogram(options?: CreateHistogramOptions): RecordableHistogram; +} +declare module 'node:perf_hooks' { + export * from 'perf_hooks'; +} diff --git a/node_modules/@types/node/process.d.ts b/node_modules/@types/node/process.d.ts new file mode 100644 index 000000000..46458f4a0 --- /dev/null +++ b/node_modules/@types/node/process.d.ts @@ -0,0 +1,1477 @@ +declare module 'process' { + import * as tty from 'node:tty'; + import { Worker } from 'node:worker_threads'; + global { + var process: NodeJS.Process; + namespace NodeJS { + // this namespace merge is here because these are specifically used + // as the type for process.stdin, process.stdout, and process.stderr. + // they can't live in tty.d.ts because we need to disambiguate the imported name. + interface ReadStream extends tty.ReadStream {} + interface WriteStream extends tty.WriteStream {} + interface MemoryUsageFn { + /** + * The `process.memoryUsage()` method iterate over each page to gather informations about memory + * usage which can be slow depending on the program memory allocations. 
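A minimal sketch of the `createHistogram()`/`RecordableHistogram` API documented above, recording per-iteration durations in nanoseconds (the workload is a stand-in):

```ts
import { createHistogram } from 'node:perf_hooks';

// Integer bounds in nanoseconds; figures controls the precision (1-5).
const hist = createHistogram({ min: 1, max: 1_000_000_000, figures: 3 });

for (let i = 0; i < 1000; i++) {
  const start = process.hrtime.bigint();
  JSON.parse(JSON.stringify({ i, payload: 'x'.repeat(100) })); // some work
  hist.record(process.hrtime.bigint() - start);                // bigint is accepted
}

console.log('min/mean/max (ns):', hist.min, Math.round(hist.mean), hist.max);
console.log('p50 / p99 (ns):', hist.percentile(50), hist.percentile(99));
```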
+ */ + (): MemoryUsage; + /** + * method returns an integer representing the Resident Set Size (RSS) in bytes. + */ + rss(): number; + } + interface MemoryUsage { + rss: number; + heapTotal: number; + heapUsed: number; + external: number; + arrayBuffers: number; + } + interface CpuUsage { + user: number; + system: number; + } + interface ProcessRelease { + name: string; + sourceUrl?: string | undefined; + headersUrl?: string | undefined; + libUrl?: string | undefined; + lts?: string | undefined; + } + interface ProcessVersions extends Dict { + http_parser: string; + node: string; + v8: string; + ares: string; + uv: string; + zlib: string; + modules: string; + openssl: string; + } + type Platform = 'aix' | 'android' | 'darwin' | 'freebsd' | 'haiku' | 'linux' | 'openbsd' | 'sunos' | 'win32' | 'cygwin' | 'netbsd'; + type Signals = + | 'SIGABRT' + | 'SIGALRM' + | 'SIGBUS' + | 'SIGCHLD' + | 'SIGCONT' + | 'SIGFPE' + | 'SIGHUP' + | 'SIGILL' + | 'SIGINT' + | 'SIGIO' + | 'SIGIOT' + | 'SIGKILL' + | 'SIGPIPE' + | 'SIGPOLL' + | 'SIGPROF' + | 'SIGPWR' + | 'SIGQUIT' + | 'SIGSEGV' + | 'SIGSTKFLT' + | 'SIGSTOP' + | 'SIGSYS' + | 'SIGTERM' + | 'SIGTRAP' + | 'SIGTSTP' + | 'SIGTTIN' + | 'SIGTTOU' + | 'SIGUNUSED' + | 'SIGURG' + | 'SIGUSR1' + | 'SIGUSR2' + | 'SIGVTALRM' + | 'SIGWINCH' + | 'SIGXCPU' + | 'SIGXFSZ' + | 'SIGBREAK' + | 'SIGLOST' + | 'SIGINFO'; + type UncaughtExceptionOrigin = 'uncaughtException' | 'unhandledRejection'; + type MultipleResolveType = 'resolve' | 'reject'; + type BeforeExitListener = (code: number) => void; + type DisconnectListener = () => void; + type ExitListener = (code: number) => void; + type RejectionHandledListener = (promise: Promise) => void; + type UncaughtExceptionListener = (error: Error, origin: UncaughtExceptionOrigin) => void; + type UnhandledRejectionListener = (reason: {} | null | undefined, promise: Promise) => void; + type WarningListener = (warning: Error) => void; + type MessageListener = (message: unknown, sendHandle: unknown) => void; + type SignalsListener = (signal: Signals) => void; + type MultipleResolveListener = (type: MultipleResolveType, promise: Promise, value: unknown) => void; + type WorkerListener = (worker: Worker) => void; + interface Socket extends ReadWriteStream { + isTTY?: true | undefined; + } + // Alias for compatibility + interface ProcessEnv extends Dict { + /** + * Can be used to change the default timezone at runtime + */ + TZ?: string; + } + interface HRTime { + (time?: [number, number]): [number, number]; + bigint(): bigint; + } + interface ProcessReport { + /** + * Directory where the report is written. + * working directory of the Node.js process. + * @default '' indicating that reports are written to the current + */ + directory: string; + /** + * Filename where the report is written. + * The default value is the empty string. + * @default '' the output filename will be comprised of a timestamp, + * PID, and sequence number. + */ + filename: string; + /** + * Returns a JSON-formatted diagnostic report for the running process. + * The report's JavaScript stack trace is taken from err, if present. + */ + getReport(err?: Error): string; + /** + * If true, a diagnostic report is generated on fatal errors, + * such as out of memory errors or failed C++ assertions. + * @default false + */ + reportOnFatalError: boolean; + /** + * If true, a diagnostic report is generated when the process + * receives the signal specified by process.report.signal. 
+ * @defaul false + */ + reportOnSignal: boolean; + /** + * If true, a diagnostic report is generated on uncaught exception. + * @default false + */ + reportOnUncaughtException: boolean; + /** + * The signal used to trigger the creation of a diagnostic report. + * @default 'SIGUSR2' + */ + signal: Signals; + /** + * Writes a diagnostic report to a file. If filename is not provided, the default filename + * includes the date, time, PID, and a sequence number. + * The report's JavaScript stack trace is taken from err, if present. + * + * @param fileName Name of the file where the report is written. + * This should be a relative path, that will be appended to the directory specified in + * `process.report.directory`, or the current working directory of the Node.js process, + * if unspecified. + * @param error A custom error used for reporting the JavaScript stack. + * @return Filename of the generated report. + */ + writeReport(fileName?: string): string; + writeReport(error?: Error): string; + writeReport(fileName?: string, err?: Error): string; + } + interface ResourceUsage { + fsRead: number; + fsWrite: number; + involuntaryContextSwitches: number; + ipcReceived: number; + ipcSent: number; + majorPageFault: number; + maxRSS: number; + minorPageFault: number; + sharedMemorySize: number; + signalsCount: number; + swappedOut: number; + systemCPUTime: number; + unsharedDataSize: number; + unsharedStackSize: number; + userCPUTime: number; + voluntaryContextSwitches: number; + } + interface EmitWarningOptions { + /** + * When `warning` is a `string`, `type` is the name to use for the _type_ of warning being emitted. + * + * @default 'Warning' + */ + type?: string | undefined; + /** + * A unique identifier for the warning instance being emitted. + */ + code?: string | undefined; + /** + * When `warning` is a `string`, `ctor` is an optional function used to limit the generated stack trace. + * + * @default process.emitWarning + */ + ctor?: Function | undefined; + /** + * Additional text to include with the error. + */ + detail?: string | undefined; + } + interface ProcessConfig { + readonly target_defaults: { + readonly cflags: any[]; + readonly default_configuration: string; + readonly defines: string[]; + readonly include_dirs: string[]; + readonly libraries: string[]; + }; + readonly variables: { + readonly clang: number; + readonly host_arch: string; + readonly node_install_npm: boolean; + readonly node_install_waf: boolean; + readonly node_prefix: string; + readonly node_shared_openssl: boolean; + readonly node_shared_v8: boolean; + readonly node_shared_zlib: boolean; + readonly node_use_dtrace: boolean; + readonly node_use_etw: boolean; + readonly node_use_openssl: boolean; + readonly target_arch: string; + readonly v8_no_strict_aliasing: number; + readonly v8_use_snapshot: boolean; + readonly visibility: string; + }; + } + interface Process extends EventEmitter { + /** + * The `process.stdout` property returns a stream connected to`stdout` (fd `1`). It is a `net.Socket` (which is a `Duplex` stream) unless fd `1` refers to a file, in which case it is + * a `Writable` stream. + * + * For example, to copy `process.stdin` to `process.stdout`: + * + * ```js + * import { stdin, stdout } from 'process'; + * + * stdin.pipe(stdout); + * ``` + * + * `process.stdout` differs from other Node.js streams in important ways. See `note on process I/O` for more information. + */ + stdout: WriteStream & { + fd: 1; + }; + /** + * The `process.stderr` property returns a stream connected to`stderr` (fd `2`). 
It is a `net.Socket` (which is a `Duplex` stream) unless fd `2` refers to a file, in which case it is + * a `Writable` stream. + * + * `process.stderr` differs from other Node.js streams in important ways. See `note on process I/O` for more information. + */ + stderr: WriteStream & { + fd: 2; + }; + /** + * The `process.stdin` property returns a stream connected to`stdin` (fd `0`). It is a `net.Socket` (which is a `Duplex` stream) unless fd `0` refers to a file, in which case it is + * a `Readable` stream. + * + * For details of how to read from `stdin` see `readable.read()`. + * + * As a `Duplex` stream, `process.stdin` can also be used in "old" mode that + * is compatible with scripts written for Node.js prior to v0.10\. + * For more information see `Stream compatibility`. + * + * In "old" streams mode the `stdin` stream is paused by default, so one + * must call `process.stdin.resume()` to read from it. Note also that calling`process.stdin.resume()` itself would switch stream to "old" mode. + */ + stdin: ReadStream & { + fd: 0; + }; + openStdin(): Socket; + /** + * The `process.argv` property returns an array containing the command-line + * arguments passed when the Node.js process was launched. The first element will + * be {@link execPath}. See `process.argv0` if access to the original value + * of `argv[0]` is needed. The second element will be the path to the JavaScript + * file being executed. The remaining elements will be any additional command-line + * arguments. + * + * For example, assuming the following script for `process-args.js`: + * + * ```js + * import { argv } from 'process'; + * + * // print process.argv + * argv.forEach((val, index) => { + * console.log(`${index}: ${val}`); + * }); + * ``` + * + * Launching the Node.js process as: + * + * ```console + * $ node process-args.js one two=three four + * ``` + * + * Would generate the output: + * + * ```text + * 0: /usr/local/bin/node + * 1: /Users/mjr/work/node/process-args.js + * 2: one + * 3: two=three + * 4: four + * ``` + * @since v0.1.27 + */ + argv: string[]; + /** + * The `process.argv0` property stores a read-only copy of the original value of`argv[0]` passed when Node.js starts. + * + * ```console + * $ bash -c 'exec -a customArgv0 ./node' + * > process.argv[0] + * '/Volumes/code/external/node/out/Release/node' + * > process.argv0 + * 'customArgv0' + * ``` + * @since v6.4.0 + */ + argv0: string; + /** + * The `process.execArgv` property returns the set of Node.js-specific command-line + * options passed when the Node.js process was launched. These options do not + * appear in the array returned by the {@link argv} property, and do not + * include the Node.js executable, the name of the script, or any options following + * the script name. These options are useful in order to spawn child processes with + * the same execution environment as the parent. + * + * ```console + * $ node --harmony script.js --version + * ``` + * + * Results in `process.execArgv`: + * + * ```js + * ['--harmony'] + * ``` + * + * And `process.argv`: + * + * ```js + * ['/usr/local/bin/node', 'script.js', '--version'] + * ``` + * + * Refer to `Worker constructor` for the detailed behavior of worker + * threads with this property. + * @since v0.7.7 + */ + execArgv: string[]; + /** + * The `process.execPath` property returns the absolute pathname of the executable + * that started the Node.js process. Symbolic links, if any, are resolved. 
+ * + * ```js + * '/usr/local/bin/node' + * ``` + * @since v0.1.100 + */ + execPath: string; + /** + * The `process.abort()` method causes the Node.js process to exit immediately and + * generate a core file. + * + * This feature is not available in `Worker` threads. + * @since v0.7.0 + */ + abort(): never; + /** + * The `process.chdir()` method changes the current working directory of the + * Node.js process or throws an exception if doing so fails (for instance, if + * the specified `directory` does not exist). + * + * ```js + * import { chdir, cwd } from 'process'; + * + * console.log(`Starting directory: ${cwd()}`); + * try { + * chdir('/tmp'); + * console.log(`New directory: ${cwd()}`); + * } catch (err) { + * console.error(`chdir: ${err}`); + * } + * ``` + * + * This feature is not available in `Worker` threads. + * @since v0.1.17 + */ + chdir(directory: string): void; + /** + * The `process.cwd()` method returns the current working directory of the Node.js + * process. + * + * ```js + * import { cwd } from 'process'; + * + * console.log(`Current directory: ${cwd()}`); + * ``` + * @since v0.1.8 + */ + cwd(): string; + /** + * The port used by the Node.js debugger when enabled. + * + * ```js + * import process from 'process'; + * + * process.debugPort = 5858; + * ``` + * @since v0.7.2 + */ + debugPort: number; + /** + * The `process.emitWarning()` method can be used to emit custom or application + * specific process warnings. These can be listened for by adding a handler to the `'warning'` event. + * + * ```js + * import { emitWarning } from 'process'; + * + * // Emit a warning with a code and additional detail. + * emitWarning('Something happened!', { + * code: 'MY_WARNING', + * detail: 'This is some additional information' + * }); + * // Emits: + * // (node:56338) [MY_WARNING] Warning: Something happened! + * // This is some additional information + * ``` + * + * In this example, an `Error` object is generated internally by`process.emitWarning()` and passed through to the `'warning'` handler. + * + * ```js + * import process from 'process'; + * + * process.on('warning', (warning) => { + * console.warn(warning.name); // 'Warning' + * console.warn(warning.message); // 'Something happened!' + * console.warn(warning.code); // 'MY_WARNING' + * console.warn(warning.stack); // Stack trace + * console.warn(warning.detail); // 'This is some additional information' + * }); + * ``` + * + * If `warning` is passed as an `Error` object, the `options` argument is ignored. + * @since v8.0.0 + * @param warning The warning to emit. + */ + emitWarning(warning: string | Error, ctor?: Function): void; + emitWarning(warning: string | Error, type?: string, ctor?: Function): void; + emitWarning(warning: string | Error, type?: string, code?: string, ctor?: Function): void; + emitWarning(warning: string | Error, options?: EmitWarningOptions): void; + /** + * The `process.env` property returns an object containing the user environment. + * See [`environ(7)`](http://man7.org/linux/man-pages/man7/environ.7.html). 
+ * + * An example of this object looks like: + * + * ```js + * { + * TERM: 'xterm-256color', + * SHELL: '/usr/local/bin/bash', + * USER: 'maciej', + * PATH: '~/.bin/:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin', + * PWD: '/Users/maciej', + * EDITOR: 'vim', + * SHLVL: '1', + * HOME: '/Users/maciej', + * LOGNAME: 'maciej', + * _: '/usr/local/bin/node' + * } + * ``` + * + * It is possible to modify this object, but such modifications will not be + * reflected outside the Node.js process, or (unless explicitly requested) + * to other `Worker` threads. + * In other words, the following example would not work: + * + * ```console + * $ node -e 'process.env.foo = "bar"' && echo $foo + * ``` + * + * While the following will: + * + * ```js + * import { env } from 'process'; + * + * env.foo = 'bar'; + * console.log(env.foo); + * ``` + * + * Assigning a property on `process.env` will implicitly convert the value + * to a string. **This behavior is deprecated.** Future versions of Node.js may + * throw an error when the value is not a string, number, or boolean. + * + * ```js + * import { env } from 'process'; + * + * env.test = null; + * console.log(env.test); + * // => 'null' + * env.test = undefined; + * console.log(env.test); + * // => 'undefined' + * ``` + * + * Use `delete` to delete a property from `process.env`. + * + * ```js + * import { env } from 'process'; + * + * env.TEST = 1; + * delete env.TEST; + * console.log(env.TEST); + * // => undefined + * ``` + * + * On Windows operating systems, environment variables are case-insensitive. + * + * ```js + * import { env } from 'process'; + * + * env.TEST = 1; + * console.log(env.test); + * // => 1 + * ``` + * + * Unless explicitly specified when creating a `Worker` instance, + * each `Worker` thread has its own copy of `process.env`, based on its + * parent thread’s `process.env`, or whatever was specified as the `env` option + * to the `Worker` constructor. Changes to `process.env` will not be visible + * across `Worker` threads, and only the main thread can make changes that + * are visible to the operating system or to native add-ons. + * @since v0.1.27 + */ + env: ProcessEnv; + /** + * The `process.exit()` method instructs Node.js to terminate the process + * synchronously with an exit status of `code`. If `code` is omitted, exit uses + * either the 'success' code `0` or the value of `process.exitCode` if it has been + * set. Node.js will not terminate until all the `'exit'` event listeners are + * called. + * + * To exit with a 'failure' code: + * + * ```js + * import { exit } from 'process'; + * + * exit(1); + * ``` + * + * The shell that executed Node.js should see the exit code as `1`. + * + * Calling `process.exit()` will force the process to exit as quickly as possible + * even if there are still asynchronous operations pending that have not yet + * completed fully, including I/O operations to `process.stdout` and`process.stderr`. + * + * In most situations, it is not actually necessary to call `process.exit()`explicitly. The Node.js process will exit on its own _if there is no additional_ + * _work pending_ in the event loop. The `process.exitCode` property can be set to + * tell the process which exit code to use when the process exits gracefully. 
+ * + * For instance, the following example illustrates a _misuse_ of the`process.exit()` method that could lead to data printed to stdout being + * truncated and lost: + * + * ```js + * import { exit } from 'process'; + * + * // This is an example of what *not* to do: + * if (someConditionNotMet()) { + * printUsageToStdout(); + * exit(1); + * } + * ``` + * + * The reason this is problematic is because writes to `process.stdout` in Node.js + * are sometimes _asynchronous_ and may occur over multiple ticks of the Node.js + * event loop. Calling `process.exit()`, however, forces the process to exit_before_ those additional writes to `stdout` can be performed. + * + * Rather than calling `process.exit()` directly, the code _should_ set the`process.exitCode` and allow the process to exit naturally by avoiding + * scheduling any additional work for the event loop: + * + * ```js + * import process from 'process'; + * + * // How to properly set the exit code while letting + * // the process exit gracefully. + * if (someConditionNotMet()) { + * printUsageToStdout(); + * process.exitCode = 1; + * } + * ``` + * + * If it is necessary to terminate the Node.js process due to an error condition, + * throwing an _uncaught_ error and allowing the process to terminate accordingly + * is safer than calling `process.exit()`. + * + * In `Worker` threads, this function stops the current thread rather + * than the current process. + * @since v0.1.13 + * @param [code=0] The exit code. + */ + exit(code?: number): never; + /** + * A number which will be the process exit code, when the process either + * exits gracefully, or is exited via {@link exit} without specifying + * a code. + * + * Specifying a code to {@link exit} will override any + * previous setting of `process.exitCode`. + * @since v0.11.8 + */ + exitCode?: number | undefined; + /** + * The `process.getgid()` method returns the numerical group identity of the + * process. (See [`getgid(2)`](http://man7.org/linux/man-pages/man2/getgid.2.html).) + * + * ```js + * import process from 'process'; + * + * if (process.getgid) { + * console.log(`Current gid: ${process.getgid()}`); + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * @since v0.1.31 + */ + getgid(): number; + /** + * The `process.setgid()` method sets the group identity of the process. (See [`setgid(2)`](http://man7.org/linux/man-pages/man2/setgid.2.html).) The `id` can be passed as either a + * numeric ID or a group name + * string. If a group name is specified, this method blocks while resolving the + * associated numeric ID. + * + * ```js + * import process from 'process'; + * + * if (process.getgid && process.setgid) { + * console.log(`Current gid: ${process.getgid()}`); + * try { + * process.setgid(501); + * console.log(`New gid: ${process.getgid()}`); + * } catch (err) { + * console.log(`Failed to set gid: ${err}`); + * } + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * This feature is not available in `Worker` threads. + * @since v0.1.31 + * @param id The group name or ID + */ + setgid(id: number | string): void; + /** + * The `process.getuid()` method returns the numeric user identity of the process. + * (See [`getuid(2)`](http://man7.org/linux/man-pages/man2/getuid.2.html).) 
+ * + * ```js + * import process from 'process'; + * + * if (process.getuid) { + * console.log(`Current uid: ${process.getuid()}`); + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * @since v0.1.28 + */ + getuid(): number; + /** + * The `process.setuid(id)` method sets the user identity of the process. (See [`setuid(2)`](http://man7.org/linux/man-pages/man2/setuid.2.html).) The `id` can be passed as either a + * numeric ID or a username string. + * If a username is specified, the method blocks while resolving the associated + * numeric ID. + * + * ```js + * import process from 'process'; + * + * if (process.getuid && process.setuid) { + * console.log(`Current uid: ${process.getuid()}`); + * try { + * process.setuid(501); + * console.log(`New uid: ${process.getuid()}`); + * } catch (err) { + * console.log(`Failed to set uid: ${err}`); + * } + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * This feature is not available in `Worker` threads. + * @since v0.1.28 + */ + setuid(id: number | string): void; + /** + * The `process.geteuid()` method returns the numerical effective user identity of + * the process. (See [`geteuid(2)`](http://man7.org/linux/man-pages/man2/geteuid.2.html).) + * + * ```js + * import process from 'process'; + * + * if (process.geteuid) { + * console.log(`Current uid: ${process.geteuid()}`); + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * @since v2.0.0 + */ + geteuid(): number; + /** + * The `process.seteuid()` method sets the effective user identity of the process. + * (See [`seteuid(2)`](http://man7.org/linux/man-pages/man2/seteuid.2.html).) The `id` can be passed as either a numeric ID or a username + * string. If a username is specified, the method blocks while resolving the + * associated numeric ID. + * + * ```js + * import process from 'process'; + * + * if (process.geteuid && process.seteuid) { + * console.log(`Current uid: ${process.geteuid()}`); + * try { + * process.seteuid(501); + * console.log(`New uid: ${process.geteuid()}`); + * } catch (err) { + * console.log(`Failed to set uid: ${err}`); + * } + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * This feature is not available in `Worker` threads. + * @since v2.0.0 + * @param id A user name or ID + */ + seteuid(id: number | string): void; + /** + * The `process.getegid()` method returns the numerical effective group identity + * of the Node.js process. (See [`getegid(2)`](http://man7.org/linux/man-pages/man2/getegid.2.html).) + * + * ```js + * import process from 'process'; + * + * if (process.getegid) { + * console.log(`Current gid: ${process.getegid()}`); + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * @since v2.0.0 + */ + getegid(): number; + /** + * The `process.setegid()` method sets the effective group identity of the process. + * (See [`setegid(2)`](http://man7.org/linux/man-pages/man2/setegid.2.html).) The `id` can be passed as either a numeric ID or a group + * name string. If a group name is specified, this method blocks while resolving + * the associated a numeric ID. 
+ * + * ```js + * import process from 'process'; + * + * if (process.getegid && process.setegid) { + * console.log(`Current gid: ${process.getegid()}`); + * try { + * process.setegid(501); + * console.log(`New gid: ${process.getegid()}`); + * } catch (err) { + * console.log(`Failed to set gid: ${err}`); + * } + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * This feature is not available in `Worker` threads. + * @since v2.0.0 + * @param id A group name or ID + */ + setegid(id: number | string): void; + /** + * The `process.getgroups()` method returns an array with the supplementary group + * IDs. POSIX leaves it unspecified if the effective group ID is included but + * Node.js ensures it always is. + * + * ```js + * import process from 'process'; + * + * if (process.getgroups) { + * console.log(process.getgroups()); // [ 16, 21, 297 ] + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * @since v0.9.4 + */ + getgroups(): number[]; + /** + * The `process.setgroups()` method sets the supplementary group IDs for the + * Node.js process. This is a privileged operation that requires the Node.js + * process to have `root` or the `CAP_SETGID` capability. + * + * The `groups` array can contain numeric group IDs, group names, or both. + * + * ```js + * import process from 'process'; + * + * if (process.getgroups && process.setgroups) { + * try { + * process.setgroups([501]); + * console.log(process.getgroups()); // new groups + * } catch (err) { + * console.log(`Failed to set groups: ${err}`); + * } + * } + * ``` + * + * This function is only available on POSIX platforms (i.e. not Windows or + * Android). + * This feature is not available in `Worker` threads. + * @since v0.9.4 + */ + setgroups(groups: ReadonlyArray): void; + /** + * The `process.setUncaughtExceptionCaptureCallback()` function sets a function + * that will be invoked when an uncaught exception occurs, which will receive the + * exception value itself as its first argument. + * + * If such a function is set, the `'uncaughtException'` event will + * not be emitted. If `--abort-on-uncaught-exception` was passed from the + * command line or set through `v8.setFlagsFromString()`, the process will + * not abort. Actions configured to take place on exceptions such as report + * generations will be affected too + * + * To unset the capture function,`process.setUncaughtExceptionCaptureCallback(null)` may be used. Calling this + * method with a non-`null` argument while another capture function is set will + * throw an error. + * + * Using this function is mutually exclusive with using the deprecated `domain` built-in module. + * @since v9.3.0 + */ + setUncaughtExceptionCaptureCallback(cb: ((err: Error) => void) | null): void; + /** + * Indicates whether a callback has been set using {@link setUncaughtExceptionCaptureCallback}. + * @since v9.3.0 + */ + hasUncaughtExceptionCaptureCallback(): boolean; + /** + * The `process.version` property contains the Node.js version string. + * + * ```js + * import { version } from 'process'; + * + * console.log(`Version: ${version}`); + * // Version: v14.8.0 + * ``` + * + * To get the version string without the prepended _v_, use`process.versions.node`. + * @since v0.1.3 + */ + readonly version: string; + /** + * The `process.versions` property returns an object listing the version strings of + * Node.js and its dependencies. 
`process.versions.modules` indicates the current + * ABI version, which is increased whenever a C++ API changes. Node.js will refuse + * to load modules that were compiled against a different module ABI version. + * + * ```js + * import { versions } from 'process'; + * + * console.log(versions); + * ``` + * + * Will generate an object similar to: + * + * ```console + * { node: '11.13.0', + * v8: '7.0.276.38-node.18', + * uv: '1.27.0', + * zlib: '1.2.11', + * brotli: '1.0.7', + * ares: '1.15.0', + * modules: '67', + * nghttp2: '1.34.0', + * napi: '4', + * llhttp: '1.1.1', + * openssl: '1.1.1b', + * cldr: '34.0', + * icu: '63.1', + * tz: '2018e', + * unicode: '11.0' } + * ``` + * @since v0.2.0 + */ + readonly versions: ProcessVersions; + /** + * The `process.config` property returns an `Object` containing the JavaScript + * representation of the configure options used to compile the current Node.js + * executable. This is the same as the `config.gypi` file that was produced when + * running the `./configure` script. + * + * An example of the possible output looks like: + * + * ```js + * { + * target_defaults: + * { cflags: [], + * default_configuration: 'Release', + * defines: [], + * include_dirs: [], + * libraries: [] }, + * variables: + * { + * host_arch: 'x64', + * napi_build_version: 5, + * node_install_npm: 'true', + * node_prefix: '', + * node_shared_cares: 'false', + * node_shared_http_parser: 'false', + * node_shared_libuv: 'false', + * node_shared_zlib: 'false', + * node_use_dtrace: 'false', + * node_use_openssl: 'true', + * node_shared_openssl: 'false', + * strict_aliasing: 'true', + * target_arch: 'x64', + * v8_use_snapshot: 1 + * } + * } + * ``` + * + * The `process.config` property is **not** read-only and there are existing + * modules in the ecosystem that are known to extend, modify, or entirely replace + * the value of `process.config`. + * + * Modifying the `process.config` property, or any child-property of the`process.config` object has been deprecated. The `process.config` will be made + * read-only in a future release. + * @since v0.7.7 + */ + readonly config: ProcessConfig; + /** + * The `process.kill()` method sends the `signal` to the process identified by`pid`. + * + * Signal names are strings such as `'SIGINT'` or `'SIGHUP'`. See `Signal Events` and [`kill(2)`](http://man7.org/linux/man-pages/man2/kill.2.html) for more information. + * + * This method will throw an error if the target `pid` does not exist. As a special + * case, a signal of `0` can be used to test for the existence of a process. + * Windows platforms will throw an error if the `pid` is used to kill a process + * group. + * + * Even though the name of this function is `process.kill()`, it is really just a + * signal sender, like the `kill` system call. The signal sent may do something + * other than kill the target process. + * + * ```js + * import process, { kill } from 'process'; + * + * process.on('SIGHUP', () => { + * console.log('Got SIGHUP signal.'); + * }); + * + * setTimeout(() => { + * console.log('Exiting.'); + * process.exit(0); + * }, 100); + * + * kill(process.pid, 'SIGHUP'); + * ``` + * + * When `SIGUSR1` is received by a Node.js process, Node.js will start the + * debugger. See `Signal Events`. + * @since v0.0.6 + * @param pid A process ID + * @param [signal='SIGTERM'] The signal to send, either as a string or number. + */ + kill(pid: number, signal?: string | number): true; + /** + * The `process.pid` property returns the PID of the process. 
+ * + * ```js + * import { pid } from 'process'; + * + * console.log(`This process is pid ${pid}`); + * ``` + * @since v0.1.15 + */ + readonly pid: number; + /** + * The `process.ppid` property returns the PID of the parent of the + * current process. + * + * ```js + * import { ppid } from 'process'; + * + * console.log(`The parent process is pid ${ppid}`); + * ``` + * @since v9.2.0, v8.10.0, v6.13.0 + */ + readonly ppid: number; + /** + * The `process.title` property returns the current process title (i.e. returns + * the current value of `ps`). Assigning a new value to `process.title` modifies + * the current value of `ps`. + * + * When a new value is assigned, different platforms will impose different maximum + * length restrictions on the title. Usually such restrictions are quite limited. + * For instance, on Linux and macOS, `process.title` is limited to the size of the + * binary name plus the length of the command-line arguments because setting the`process.title` overwrites the `argv` memory of the process. Node.js v0.8 + * allowed for longer process title strings by also overwriting the `environ`memory but that was potentially insecure and confusing in some (rather obscure) + * cases. + * + * Assigning a value to `process.title` might not result in an accurate label + * within process manager applications such as macOS Activity Monitor or Windows + * Services Manager. + * @since v0.1.104 + */ + title: string; + /** + * The operating system CPU architecture for which the Node.js binary was compiled. + * Possible values are: `'arm'`, `'arm64'`, `'ia32'`, `'mips'`,`'mipsel'`, `'ppc'`,`'ppc64'`, `'s390'`, `'s390x'`, `'x32'`, and `'x64'`. + * + * ```js + * import { arch } from 'process'; + * + * console.log(`This processor architecture is ${arch}`); + * ``` + * @since v0.5.0 + */ + readonly arch: string; + /** + * The `process.platform` property returns a string identifying the operating + * system platform on which the Node.js process is running. + * + * Currently possible values are: + * + * * `'aix'` + * * `'darwin'` + * * `'freebsd'` + * * `'linux'` + * * `'openbsd'` + * * `'sunos'` + * * `'win32'` + * + * ```js + * import { platform } from 'process'; + * + * console.log(`This platform is ${platform}`); + * ``` + * + * The value `'android'` may also be returned if the Node.js is built on the + * Android operating system. However, Android support in Node.js [is experimental](https://github.com/nodejs/node/blob/HEAD/BUILDING.md#androidandroid-based-devices-eg-firefox-os). + * @since v0.1.16 + */ + readonly platform: Platform; + /** + * The `process.mainModule` property provides an alternative way of retrieving `require.main`. The difference is that if the main module changes at + * runtime, `require.main` may still refer to the original main module in + * modules that were required before the change occurred. Generally, it's + * safe to assume that the two refer to the same module. + * + * As with `require.main`, `process.mainModule` will be `undefined` if there + * is no entry script. + * @since v0.1.17 + * @deprecated Since v14.0.0 - Use `main` instead. + */ + mainModule?: Module | undefined; + memoryUsage: MemoryUsageFn; + /** + * The `process.cpuUsage()` method returns the user and system CPU time usage of + * the current process, in an object with properties `user` and `system`, whose + * values are microsecond values (millionth of a second). 
These values measure time + * spent in user and system code respectively, and may end up being greater than + * actual elapsed time if multiple CPU cores are performing work for this process. + * + * The result of a previous call to `process.cpuUsage()` can be passed as the + * argument to the function, to get a diff reading. + * + * ```js + * import { cpuUsage } from 'process'; + * + * const startUsage = cpuUsage(); + * // { user: 38579, system: 6986 } + * + * // spin the CPU for 500 milliseconds + * const now = Date.now(); + * while (Date.now() - now < 500); + * + * console.log(cpuUsage(startUsage)); + * // { user: 514883, system: 11226 } + * ``` + * @since v6.1.0 + * @param previousValue A previous return value from calling `process.cpuUsage()` + */ + cpuUsage(previousValue?: CpuUsage): CpuUsage; + /** + * `process.nextTick()` adds `callback` to the "next tick queue". This queue is + * fully drained after the current operation on the JavaScript stack runs to + * completion and before the event loop is allowed to continue. It's possible to + * create an infinite loop if one were to recursively call `process.nextTick()`. + * See the [Event Loop](https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/#process-nexttick) guide for more background. + * + * ```js + * import { nextTick } from 'process'; + * + * console.log('start'); + * nextTick(() => { + * console.log('nextTick callback'); + * }); + * console.log('scheduled'); + * // Output: + * // start + * // scheduled + * // nextTick callback + * ``` + * + * This is important when developing APIs in order to give users the opportunity + * to assign event handlers _after_ an object has been constructed but before any + * I/O has occurred: + * + * ```js + * import { nextTick } from 'process'; + * + * function MyThing(options) { + * this.setupOptions(options); + * + * nextTick(() => { + * this.startDoingStuff(); + * }); + * } + * + * const thing = new MyThing(); + * thing.getReadyForStuff(); + * + * // thing.startDoingStuff() gets called now, not before. + * ``` + * + * It is very important for APIs to be either 100% synchronous or 100% + * asynchronous. Consider this example: + * + * ```js + * // WARNING! DO NOT USE! BAD UNSAFE HAZARD! + * function maybeSync(arg, cb) { + * if (arg) { + * cb(); + * return; + * } + * + * fs.stat('file', cb); + * } + * ``` + * + * This API is hazardous because in the following case: + * + * ```js + * const maybeTrue = Math.random() > 0.5; + * + * maybeSync(maybeTrue, () => { + * foo(); + * }); + * + * bar(); + * ``` + * + * It is not clear whether `foo()` or `bar()` will be called first. + * + * The following approach is much better: + * + * ```js + * import { nextTick } from 'process'; + * + * function definitelyAsync(arg, cb) { + * if (arg) { + * nextTick(cb); + * return; + * } + * + * fs.stat('file', cb); + * } + * ``` + * @since v0.1.26 + * @param args Additional arguments to pass when invoking the `callback` + */ + nextTick(callback: Function, ...args: any[]): void; + /** + * The `process.release` property returns an `Object` containing metadata related + * to the current release, including URLs for the source tarball and headers-only + * tarball. 
+ * + * `process.release` contains the following properties: + * + * ```js + * { + * name: 'node', + * lts: 'Erbium', + * sourceUrl: 'https://nodejs.org/download/release/v12.18.1/node-v12.18.1.tar.gz', + * headersUrl: 'https://nodejs.org/download/release/v12.18.1/node-v12.18.1-headers.tar.gz', + * libUrl: 'https://nodejs.org/download/release/v12.18.1/win-x64/node.lib' + * } + * ``` + * + * In custom builds from non-release versions of the source tree, only the`name` property may be present. The additional properties should not be + * relied upon to exist. + * @since v3.0.0 + */ + readonly release: ProcessRelease; + features: { + inspector: boolean; + debug: boolean; + uv: boolean; + ipv6: boolean; + tls_alpn: boolean; + tls_sni: boolean; + tls_ocsp: boolean; + tls: boolean; + }; + /** + * `process.umask()` returns the Node.js process's file mode creation mask. Child + * processes inherit the mask from the parent process. + * @since v0.1.19 + * @deprecated Calling `process.umask()` with no argument causes the process-wide umask to be written twice. This introduces a race condition between threads, and is a potential * + * security vulnerability. There is no safe, cross-platform alternative API. + */ + umask(): number; + /** + * Can only be set if not in worker thread. + */ + umask(mask: string | number): number; + /** + * The `process.uptime()` method returns the number of seconds the current Node.js + * process has been running. + * + * The return value includes fractions of a second. Use `Math.floor()` to get whole + * seconds. + * @since v0.5.0 + */ + uptime(): number; + hrtime: HRTime; + /** + * If Node.js is spawned with an IPC channel, the `process.send()` method can be + * used to send messages to the parent process. Messages will be received as a `'message'` event on the parent's `ChildProcess` object. + * + * If Node.js was not spawned with an IPC channel, `process.send` will be`undefined`. + * + * The message goes through serialization and parsing. The resulting message might + * not be the same as what is originally sent. + * @since v0.5.9 + * @param options used to parameterize the sending of certain types of handles.`options` supports the following properties: + */ + send?( + message: any, + sendHandle?: any, + options?: { + swallowErrors?: boolean | undefined; + }, + callback?: (error: Error | null) => void + ): boolean; + /** + * If the Node.js process is spawned with an IPC channel (see the `Child Process` and `Cluster` documentation), the `process.disconnect()` method will close the + * IPC channel to the parent process, allowing the child process to exit gracefully + * once there are no other connections keeping it alive. + * + * The effect of calling `process.disconnect()` is the same as calling `ChildProcess.disconnect()` from the parent process. + * + * If the Node.js process was not spawned with an IPC channel,`process.disconnect()` will be `undefined`. + * @since v0.7.2 + */ + disconnect(): void; + /** + * If the Node.js process is spawned with an IPC channel (see the `Child Process` and `Cluster` documentation), the `process.connected` property will return`true` so long as the IPC + * channel is connected and will return `false` after`process.disconnect()` is called. + * + * Once `process.connected` is `false`, it is no longer possible to send messages + * over the IPC channel using `process.send()`. 
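+ *
+ * A minimal sketch of guarding IPC use in a child process (this assumes the process was
+ * spawned with an IPC channel, for example via `child_process.fork()`):
+ *
+ * ```js
+ * import process from 'process';
+ *
+ * // `process.send` is only defined when an IPC channel exists.
+ * if (process.connected && process.send) {
+ *   process.send({ status: 'ready' });
+ *   // Close the channel so this child can exit once no other work keeps it alive.
+ *   process.disconnect();
+ * }
+ * ```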
+ * @since v0.7.2 + */ + connected: boolean; + /** + * The `process.allowedNodeEnvironmentFlags` property is a special, + * read-only `Set` of flags allowable within the `NODE_OPTIONS` environment variable. + * + * `process.allowedNodeEnvironmentFlags` extends `Set`, but overrides`Set.prototype.has` to recognize several different possible flag + * representations. `process.allowedNodeEnvironmentFlags.has()` will + * return `true` in the following cases: + * + * * Flags may omit leading single (`-`) or double (`--`) dashes; e.g.,`inspect-brk` for `--inspect-brk`, or `r` for `-r`. + * * Flags passed through to V8 (as listed in `--v8-options`) may replace + * one or more _non-leading_ dashes for an underscore, or vice-versa; + * e.g., `--perf_basic_prof`, `--perf-basic-prof`, `--perf_basic-prof`, + * etc. + * * Flags may contain one or more equals (`=`) characters; all + * characters after and including the first equals will be ignored; + * e.g., `--stack-trace-limit=100`. + * * Flags _must_ be allowable within `NODE_OPTIONS`. + * + * When iterating over `process.allowedNodeEnvironmentFlags`, flags will + * appear only _once_; each will begin with one or more dashes. Flags + * passed through to V8 will contain underscores instead of non-leading + * dashes: + * + * ```js + * import { allowedNodeEnvironmentFlags } from 'process'; + * + * allowedNodeEnvironmentFlags.forEach((flag) => { + * // -r + * // --inspect-brk + * // --abort_on_uncaught_exception + * // ... + * }); + * ``` + * + * The methods `add()`, `clear()`, and `delete()` of`process.allowedNodeEnvironmentFlags` do nothing, and will fail + * silently. + * + * If Node.js was compiled _without_ `NODE_OPTIONS` support (shown in {@link config}), `process.allowedNodeEnvironmentFlags` will + * contain what _would have_ been allowable. + * @since v10.10.0 + */ + allowedNodeEnvironmentFlags: ReadonlySet; + /** + * `process.report` is an object whose methods are used to generate diagnostic + * reports for the current process. Additional documentation is available in the `report documentation`. + * @since v11.8.0 + */ + report?: ProcessReport | undefined; + /** + * ```js + * import { resourceUsage } from 'process'; + * + * console.log(resourceUsage()); + * /* + * Will output: + * { + * userCPUTime: 82872, + * systemCPUTime: 4143, + * maxRSS: 33164, + * sharedMemorySize: 0, + * unsharedDataSize: 0, + * unsharedStackSize: 0, + * minorPageFault: 2469, + * majorPageFault: 0, + * swappedOut: 0, + * fsRead: 0, + * fsWrite: 8, + * ipcSent: 0, + * ipcReceived: 0, + * signalsCount: 0, + * voluntaryContextSwitches: 79, + * involuntaryContextSwitches: 1 + * } + * + * ``` + * @since v12.6.0 + * @return the resource usage for the current process. All of these values come from the `uv_getrusage` call which returns a [`uv_rusage_t` struct][uv_rusage_t]. + */ + resourceUsage(): ResourceUsage; + /** + * The `process.traceDeprecation` property indicates whether the`--trace-deprecation` flag is set on the current Node.js process. See the + * documentation for the `'warning' event` and the `emitWarning() method` for more information about this + * flag's behavior. 
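+ *
+ * A small sketch of reading the flag at runtime (assuming Node.js was started with
+ * `--trace-deprecation`):
+ *
+ * ```js
+ * import process from 'process';
+ *
+ * // Started as: node --trace-deprecation app.js
+ * if (process.traceDeprecation) {
+ *   console.log('Deprecation warnings will be printed with a stack trace.');
+ * }
+ * ```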
+ * @since v0.8.0 + */ + traceDeprecation: boolean; + /* EventEmitter */ + addListener(event: 'beforeExit', listener: BeforeExitListener): this; + addListener(event: 'disconnect', listener: DisconnectListener): this; + addListener(event: 'exit', listener: ExitListener): this; + addListener(event: 'rejectionHandled', listener: RejectionHandledListener): this; + addListener(event: 'uncaughtException', listener: UncaughtExceptionListener): this; + addListener(event: 'uncaughtExceptionMonitor', listener: UncaughtExceptionListener): this; + addListener(event: 'unhandledRejection', listener: UnhandledRejectionListener): this; + addListener(event: 'warning', listener: WarningListener): this; + addListener(event: 'message', listener: MessageListener): this; + addListener(event: Signals, listener: SignalsListener): this; + addListener(event: 'multipleResolves', listener: MultipleResolveListener): this; + addListener(event: 'worker', listener: WorkerListener): this; + emit(event: 'beforeExit', code: number): boolean; + emit(event: 'disconnect'): boolean; + emit(event: 'exit', code: number): boolean; + emit(event: 'rejectionHandled', promise: Promise<unknown>): boolean; + emit(event: 'uncaughtException', error: Error): boolean; + emit(event: 'uncaughtExceptionMonitor', error: Error): boolean; + emit(event: 'unhandledRejection', reason: unknown, promise: Promise<unknown>): boolean; + emit(event: 'warning', warning: Error): boolean; + emit(event: 'message', message: unknown, sendHandle: unknown): this; + emit(event: Signals, signal: Signals): boolean; + emit(event: 'multipleResolves', type: MultipleResolveType, promise: Promise<unknown>, value: unknown): this; + emit(event: 'worker', listener: WorkerListener): this; + on(event: 'beforeExit', listener: BeforeExitListener): this; + on(event: 'disconnect', listener: DisconnectListener): this; + on(event: 'exit', listener: ExitListener): this; + on(event: 'rejectionHandled', listener: RejectionHandledListener): this; + on(event: 'uncaughtException', listener: UncaughtExceptionListener): this; + on(event: 'uncaughtExceptionMonitor', listener: UncaughtExceptionListener): this; + on(event: 'unhandledRejection', listener: UnhandledRejectionListener): this; + on(event: 'warning', listener: WarningListener): this; + on(event: 'message', listener: MessageListener): this; + on(event: Signals, listener: SignalsListener): this; + on(event: 'multipleResolves', listener: MultipleResolveListener): this; + on(event: 'worker', listener: WorkerListener): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'beforeExit', listener: BeforeExitListener): this; + once(event: 'disconnect', listener: DisconnectListener): this; + once(event: 'exit', listener: ExitListener): this; + once(event: 'rejectionHandled', listener: RejectionHandledListener): this; + once(event: 'uncaughtException', listener: UncaughtExceptionListener): this; + once(event: 'uncaughtExceptionMonitor', listener: UncaughtExceptionListener): this; + once(event: 'unhandledRejection', listener: UnhandledRejectionListener): this; + once(event: 'warning', listener: WarningListener): this; + once(event: 'message', listener: MessageListener): this; + once(event: Signals, listener: SignalsListener): this; + once(event: 'multipleResolves', listener: MultipleResolveListener): this; + once(event: 'worker', listener: WorkerListener): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'beforeExit', listener: BeforeExitListener): this; +
prependListener(event: 'disconnect', listener: DisconnectListener): this; + prependListener(event: 'exit', listener: ExitListener): this; + prependListener(event: 'rejectionHandled', listener: RejectionHandledListener): this; + prependListener(event: 'uncaughtException', listener: UncaughtExceptionListener): this; + prependListener(event: 'uncaughtExceptionMonitor', listener: UncaughtExceptionListener): this; + prependListener(event: 'unhandledRejection', listener: UnhandledRejectionListener): this; + prependListener(event: 'warning', listener: WarningListener): this; + prependListener(event: 'message', listener: MessageListener): this; + prependListener(event: Signals, listener: SignalsListener): this; + prependListener(event: 'multipleResolves', listener: MultipleResolveListener): this; + prependListener(event: 'worker', listener: WorkerListener): this; + prependOnceListener(event: 'beforeExit', listener: BeforeExitListener): this; + prependOnceListener(event: 'disconnect', listener: DisconnectListener): this; + prependOnceListener(event: 'exit', listener: ExitListener): this; + prependOnceListener(event: 'rejectionHandled', listener: RejectionHandledListener): this; + prependOnceListener(event: 'uncaughtException', listener: UncaughtExceptionListener): this; + prependOnceListener(event: 'uncaughtExceptionMonitor', listener: UncaughtExceptionListener): this; + prependOnceListener(event: 'unhandledRejection', listener: UnhandledRejectionListener): this; + prependOnceListener(event: 'warning', listener: WarningListener): this; + prependOnceListener(event: 'message', listener: MessageListener): this; + prependOnceListener(event: Signals, listener: SignalsListener): this; + prependOnceListener(event: 'multipleResolves', listener: MultipleResolveListener): this; + prependOnceListener(event: 'worker', listener: WorkerListener): this; + listeners(event: 'beforeExit'): BeforeExitListener[]; + listeners(event: 'disconnect'): DisconnectListener[]; + listeners(event: 'exit'): ExitListener[]; + listeners(event: 'rejectionHandled'): RejectionHandledListener[]; + listeners(event: 'uncaughtException'): UncaughtExceptionListener[]; + listeners(event: 'uncaughtExceptionMonitor'): UncaughtExceptionListener[]; + listeners(event: 'unhandledRejection'): UnhandledRejectionListener[]; + listeners(event: 'warning'): WarningListener[]; + listeners(event: 'message'): MessageListener[]; + listeners(event: Signals): SignalsListener[]; + listeners(event: 'multipleResolves'): MultipleResolveListener[]; + listeners(event: 'worker'): WorkerListener[]; + } + } + } + export = process; +} +declare module 'node:process' { + import process = require('process'); + export = process; +} diff --git a/node_modules/@types/node/punycode.d.ts b/node_modules/@types/node/punycode.d.ts new file mode 100644 index 000000000..345af5313 --- /dev/null +++ b/node_modules/@types/node/punycode.d.ts @@ -0,0 +1,117 @@ +/** + * **The version of the punycode module bundled in Node.js is being deprecated.**In a future major version of Node.js this module will be removed. Users + * currently depending on the `punycode` module should switch to using the + * userland-provided [Punycode.js](https://github.com/bestiejs/punycode.js) module instead. For punycode-based URL + * encoding, see `url.domainToASCII` or, more generally, the `WHATWG URL API`. + * + * The `punycode` module is a bundled version of the [Punycode.js](https://github.com/bestiejs/punycode.js) module. 
It + * can be accessed using: + * + * ```js + * const punycode = require('punycode'); + * ``` + * + * [Punycode](https://tools.ietf.org/html/rfc3492) is a character encoding scheme defined by RFC 3492 that is + * primarily intended for use in Internationalized Domain Names. Because host + * names in URLs are limited to ASCII characters only, Domain Names that contain + * non-ASCII characters must be converted into ASCII using the Punycode scheme. + * For instance, the Japanese character that translates into the English word,`'example'` is `'例'`. The Internationalized Domain Name, `'例.com'` (equivalent + * to `'example.com'`) is represented by Punycode as the ASCII string`'xn--fsq.com'`. + * + * The `punycode` module provides a simple implementation of the Punycode standard. + * + * The `punycode` module is a third-party dependency used by Node.js and + * made available to developers as a convenience. Fixes or other modifications to + * the module must be directed to the [Punycode.js](https://github.com/bestiejs/punycode.js) project. + * @deprecated Since v7.0.0 - Deprecated + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/punycode.js) + */ +declare module 'punycode' { + /** + * The `punycode.decode()` method converts a [Punycode](https://tools.ietf.org/html/rfc3492) string of ASCII-only + * characters to the equivalent string of Unicode codepoints. + * + * ```js + * punycode.decode('maana-pta'); // 'mañana' + * punycode.decode('--dqo34k'); // '☃-⌘' + * ``` + * @since v0.5.1 + */ + function decode(string: string): string; + /** + * The `punycode.encode()` method converts a string of Unicode codepoints to a [Punycode](https://tools.ietf.org/html/rfc3492) string of ASCII-only characters. + * + * ```js + * punycode.encode('mañana'); // 'maana-pta' + * punycode.encode('☃-⌘'); // '--dqo34k' + * ``` + * @since v0.5.1 + */ + function encode(string: string): string; + /** + * The `punycode.toUnicode()` method converts a string representing a domain name + * containing [Punycode](https://tools.ietf.org/html/rfc3492) encoded characters into Unicode. Only the [Punycode](https://tools.ietf.org/html/rfc3492) encoded parts of the domain name are be + * converted. + * + * ```js + * // decode domain names + * punycode.toUnicode('xn--maana-pta.com'); // 'mañana.com' + * punycode.toUnicode('xn----dqo34k.com'); // '☃-⌘.com' + * punycode.toUnicode('example.com'); // 'example.com' + * ``` + * @since v0.6.1 + */ + function toUnicode(domain: string): string; + /** + * The `punycode.toASCII()` method converts a Unicode string representing an + * Internationalized Domain Name to [Punycode](https://tools.ietf.org/html/rfc3492). Only the non-ASCII parts of the + * domain name will be converted. Calling `punycode.toASCII()` on a string that + * already only contains ASCII characters will have no effect. + * + * ```js + * // encode domain names + * punycode.toASCII('mañana.com'); // 'xn--maana-pta.com' + * punycode.toASCII('☃-⌘.com'); // 'xn----dqo34k.com' + * punycode.toASCII('example.com'); // 'example.com' + * ``` + * @since v0.6.1 + */ + function toASCII(domain: string): string; + /** + * @deprecated since v7.0.0 + * The version of the punycode module bundled in Node.js is being deprecated. + * In a future major version of Node.js this module will be removed. + * Users currently depending on the punycode module should switch to using + * the userland-provided Punycode.js module instead. 
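+ *
+ * A brief sketch of the round trip between a string and its code points:
+ *
+ * ```js
+ * punycode.ucs2.decode('abc'); // [0x61, 0x62, 0x63]
+ * punycode.ucs2.encode([0x61, 0x62, 0x63]); // 'abc'
+ * ```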
+ */ + const ucs2: ucs2; + interface ucs2 { + /** + * @deprecated since v7.0.0 + * The version of the punycode module bundled in Node.js is being deprecated. + * In a future major version of Node.js this module will be removed. + * Users currently depending on the punycode module should switch to using + * the userland-provided Punycode.js module instead. + */ + decode(string: string): number[]; + /** + * @deprecated since v7.0.0 + * The version of the punycode module bundled in Node.js is being deprecated. + * In a future major version of Node.js this module will be removed. + * Users currently depending on the punycode module should switch to using + * the userland-provided Punycode.js module instead. + */ + encode(codePoints: ReadonlyArray<number>): string; + } + /** + * @deprecated since v7.0.0 + * The version of the punycode module bundled in Node.js is being deprecated. + * In a future major version of Node.js this module will be removed. + * Users currently depending on the punycode module should switch to using + * the userland-provided Punycode.js module instead. + */ + const version: string; +} +declare module 'node:punycode' { + export * from 'punycode'; +} diff --git a/node_modules/@types/node/querystring.d.ts b/node_modules/@types/node/querystring.d.ts new file mode 100644 index 000000000..892440b78 --- /dev/null +++ b/node_modules/@types/node/querystring.d.ts @@ -0,0 +1,131 @@ +/** + * The `querystring` module provides utilities for parsing and formatting URL + * query strings. It can be accessed using: + * + * ```js + * const querystring = require('querystring'); + * ``` + * + * The `querystring` API is considered Legacy. While it is still maintained, + * new code should use the `URLSearchParams` API instead. + * @deprecated Legacy + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/querystring.js) + */ +declare module 'querystring' { + interface StringifyOptions { + encodeURIComponent?: ((str: string) => string) | undefined; + } + interface ParseOptions { + maxKeys?: number | undefined; + decodeURIComponent?: ((str: string) => string) | undefined; + } + interface ParsedUrlQuery extends NodeJS.Dict<string | string[]> {} + interface ParsedUrlQueryInput extends NodeJS.Dict<string | number | boolean | ReadonlyArray<string> | ReadonlyArray<number> | ReadonlyArray<boolean> | null> {} + /** + * The `querystring.stringify()` method produces a URL query string from a + * given `obj` by iterating through the object's "own properties". + * + * It serializes the following types of values passed in `obj`:[string](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#String_type) | + * [number](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#Number_type) | + * [bigint](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt) | + * [boolean](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#Boolean_type) | + * [string\[\]](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#String_type) | + * [number\[\]](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#Number_type) | + * [bigint\[\]](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/BigInt) | + * [boolean\[\]](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#Boolean_type) The numeric values must be finite. Any other input values will be coerced to + * empty strings.
+ * + * ```js + * querystring.stringify({ foo: 'bar', baz: ['qux', 'quux'], corge: '' }); + * // Returns 'foo=bar&baz=qux&baz=quux&corge=' + * + * querystring.stringify({ foo: 'bar', baz: 'qux' }, ';', ':'); + * // Returns 'foo:bar;baz:qux' + * ``` + * + * By default, characters requiring percent-encoding within the query string will + * be encoded as UTF-8\. If an alternative encoding is required, then an alternative`encodeURIComponent` option will need to be specified: + * + * ```js + * // Assuming gbkEncodeURIComponent function already exists, + * + * querystring.stringify({ w: '中文', foo: 'bar' }, null, null, + * { encodeURIComponent: gbkEncodeURIComponent }); + * ``` + * @since v0.1.25 + * @param obj The object to serialize into a URL query string + * @param [sep='&'] The substring used to delimit key and value pairs in the query string. + * @param [eq='='] . The substring used to delimit keys and values in the query string. + */ + function stringify(obj?: ParsedUrlQueryInput, sep?: string, eq?: string, options?: StringifyOptions): string; + /** + * The `querystring.parse()` method parses a URL query string (`str`) into a + * collection of key and value pairs. + * + * For example, the query string `'foo=bar&abc=xyz&abc=123'` is parsed into: + * + * ```js + * { + * foo: 'bar', + * abc: ['xyz', '123'] + * } + * ``` + * + * The object returned by the `querystring.parse()` method _does not_prototypically inherit from the JavaScript `Object`. This means that typical`Object` methods such as `obj.toString()`, + * `obj.hasOwnProperty()`, and others + * are not defined and _will not work_. + * + * By default, percent-encoded characters within the query string will be assumed + * to use UTF-8 encoding. If an alternative character encoding is used, then an + * alternative `decodeURIComponent` option will need to be specified: + * + * ```js + * // Assuming gbkDecodeURIComponent function already exists... + * + * querystring.parse('w=%D6%D0%CE%C4&foo=bar', null, null, + * { decodeURIComponent: gbkDecodeURIComponent }); + * ``` + * @since v0.1.25 + * @param str The URL query string to parse + * @param [sep='&'] The substring used to delimit key and value pairs in the query string. + * @param [eq='='] . The substring used to delimit keys and values in the query string. + */ + function parse(str: string, sep?: string, eq?: string, options?: ParseOptions): ParsedUrlQuery; + /** + * The querystring.encode() function is an alias for querystring.stringify(). + */ + const encode: typeof stringify; + /** + * The querystring.decode() function is an alias for querystring.parse(). + */ + const decode: typeof parse; + /** + * The `querystring.escape()` method performs URL percent-encoding on the given`str` in a manner that is optimized for the specific requirements of URL + * query strings. + * + * The `querystring.escape()` method is used by `querystring.stringify()` and is + * generally not expected to be used directly. It is exported primarily to allow + * application code to provide a replacement percent-encoding implementation if + * necessary by assigning `querystring.escape` to an alternative function. + * @since v0.1.25 + */ + function escape(str: string): string; + /** + * The `querystring.unescape()` method performs decoding of URL percent-encoded + * characters on the given `str`. + * + * The `querystring.unescape()` method is used by `querystring.parse()` and is + * generally not expected to be used directly. 
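+ *
+ * A short sketch of the default escape/unescape round trip (illustrative values):
+ *
+ * ```js
+ * querystring.escape('a b&c');       // 'a%20b%26c'
+ * querystring.unescape('a%20b%26c'); // 'a b&c'
+ * ```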
It is exported primarily to allow + * application code to provide a replacement decoding implementation if + * necessary by assigning `querystring.unescape` to an alternative function. + * + * By default, the `querystring.unescape()` method will attempt to use the + * JavaScript built-in `decodeURIComponent()` method to decode. If that fails, + * a safer equivalent that does not throw on malformed URLs will be used. + * @since v0.1.25 + */ + function unescape(str: string): string; +} +declare module 'node:querystring' { + export * from 'querystring'; +} diff --git a/node_modules/@types/node/readline.d.ts b/node_modules/@types/node/readline.d.ts new file mode 100644 index 000000000..9c9258b22 --- /dev/null +++ b/node_modules/@types/node/readline.d.ts @@ -0,0 +1,542 @@ +/** + * The `readline` module provides an interface for reading data from a `Readable` stream (such as `process.stdin`) one line at a time. It can be accessed + * using: + * + * ```js + * const readline = require('readline'); + * ``` + * + * The following simple example illustrates the basic use of the `readline` module. + * + * ```js + * const readline = require('readline'); + * + * const rl = readline.createInterface({ + * input: process.stdin, + * output: process.stdout + * }); + * + * rl.question('What do you think of Node.js? ', (answer) => { + * // TODO: Log the answer in a database + * console.log(`Thank you for your valuable feedback: ${answer}`); + * + * rl.close(); + * }); + * ``` + * + * Once this code is invoked, the Node.js application will not terminate until the`readline.Interface` is closed because the interface waits for data to be + * received on the `input` stream. + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/readline.js) + */ +declare module 'readline' { + import { Abortable, EventEmitter } from 'node:events'; + interface Key { + sequence?: string | undefined; + name?: string | undefined; + ctrl?: boolean | undefined; + meta?: boolean | undefined; + shift?: boolean | undefined; + } + /** + * Instances of the `readline.Interface` class are constructed using the`readline.createInterface()` method. Every instance is associated with a + * single `input` `Readable` stream and a single `output` `Writable` stream. + * The `output` stream is used to print prompts for user input that arrives on, + * and is read from, the `input` stream. + * @since v0.1.104 + */ + class Interface extends EventEmitter { + readonly terminal: boolean; + /** + * The current input data being processed by node. + * + * This can be used when collecting input from a TTY stream to retrieve the + * current value that has been processed thus far, prior to the `line` event + * being emitted. Once the `line` event has been emitted, this property will + * be an empty string. + * + * Be aware that modifying the value during the instance runtime may have + * unintended consequences if `rl.cursor` is not also controlled. + * + * **If not using a TTY stream for input, use the `'line'` event.** + * + * One possible use case would be as follows: + * + * ```js + * const values = ['lorem ipsum', 'dolor sit amet']; + * const rl = readline.createInterface(process.stdin); + * const showResults = debounce(() => { + * console.log( + * '\n', + * values.filter((val) => val.startsWith(rl.line)).join(' ') + * ); + * }, 300); + * process.stdin.on('keypress', (c, k) => { + * showResults(); + * }); + * ``` + * @since v0.1.98 + */ + readonly line: string; + /** + * The cursor position relative to `rl.line`. 
+ * + * This will track where the current cursor lands in the input string, when + * reading input from a TTY stream. The position of cursor determines the + * portion of the input string that will be modified as input is processed, + * as well as the column where the terminal caret will be rendered. + * @since v0.1.98 + */ + readonly cursor: number; + /** + * NOTE: According to the documentation: + * + * > Instances of the `readline.Interface` class are constructed using the + * > `readline.createInterface()` method. + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/readline.html#readline_class_interface + */ + protected constructor(input: NodeJS.ReadableStream, output?: NodeJS.WritableStream, completer?: Completer | AsyncCompleter, terminal?: boolean); + /** + * NOTE: According to the documentation: + * + * > Instances of the `readline.Interface` class are constructed using the + * > `readline.createInterface()` method. + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/readline.html#readline_class_interface + */ + protected constructor(options: ReadLineOptions); + /** + * The `rl.getPrompt()` method returns the current prompt used by `rl.prompt()`. + * @since v15.3.0 + * @return the current prompt string + */ + getPrompt(): string; + /** + * The `rl.setPrompt()` method sets the prompt that will be written to `output`whenever `rl.prompt()` is called. + * @since v0.1.98 + */ + setPrompt(prompt: string): void; + /** + * The `rl.prompt()` method writes the `readline.Interface` instances configured`prompt` to a new line in `output` in order to provide a user with a new + * location at which to provide input. + * + * When called, `rl.prompt()` will resume the `input` stream if it has been + * paused. + * + * If the `readline.Interface` was created with `output` set to `null` or`undefined` the prompt is not written. + * @since v0.1.98 + * @param preserveCursor If `true`, prevents the cursor placement from being reset to `0`. + */ + prompt(preserveCursor?: boolean): void; + /** + * The `rl.question()` method displays the `query` by writing it to the `output`, + * waits for user input to be provided on `input`, then invokes the `callback`function passing the provided input as the first argument. + * + * When called, `rl.question()` will resume the `input` stream if it has been + * paused. + * + * If the `readline.Interface` was created with `output` set to `null` or`undefined` the `query` is not written. + * + * The `callback` function passed to `rl.question()` does not follow the typical + * pattern of accepting an `Error` object or `null` as the first argument. + * The `callback` is called with the provided answer as the only argument. + * + * Example usage: + * + * ```js + * rl.question('What is your favorite food? ', (answer) => { + * console.log(`Oh, so your favorite food is ${answer}`); + * }); + * ``` + * + * Using an `AbortController` to cancel a question. + * + * ```js + * const ac = new AbortController(); + * const signal = ac.signal; + * + * rl.question('What is your favorite food? ', { signal }, (answer) => { + * console.log(`Oh, so your favorite food is ${answer}`); + * }); + * + * signal.addEventListener('abort', () => { + * console.log('The food question timed out'); + * }, { once: true }); + * + * setTimeout(() => ac.abort(), 10000); + * ``` + * + * If this method is invoked as it's util.promisify()ed version, it returns a + * Promise that fulfills with the answer. If the question is canceled using + * an `AbortController` it will reject with an `AbortError`. 
+ * + * ```js + * const util = require('util'); + * const question = util.promisify(rl.question).bind(rl); + * + * async function questionExample() { + * try { + * const answer = await question('What is you favorite food? '); + * console.log(`Oh, so your favorite food is ${answer}`); + * } catch (err) { + * console.error('Question rejected', err); + * } + * } + * questionExample(); + * ``` + * @since v0.3.3 + * @param query A statement or query to write to `output`, prepended to the prompt. + * @param callback A callback function that is invoked with the user's input in response to the `query`. + */ + question(query: string, callback: (answer: string) => void): void; + question(query: string, options: Abortable, callback: (answer: string) => void): void; + /** + * The `rl.pause()` method pauses the `input` stream, allowing it to be resumed + * later if necessary. + * + * Calling `rl.pause()` does not immediately pause other events (including`'line'`) from being emitted by the `readline.Interface` instance. + * @since v0.3.4 + */ + pause(): this; + /** + * The `rl.resume()` method resumes the `input` stream if it has been paused. + * @since v0.3.4 + */ + resume(): this; + /** + * The `rl.close()` method closes the `readline.Interface` instance and + * relinquishes control over the `input` and `output` streams. When called, + * the `'close'` event will be emitted. + * + * Calling `rl.close()` does not immediately stop other events (including `'line'`) + * from being emitted by the `readline.Interface` instance. + * @since v0.1.98 + */ + close(): void; + /** + * The `rl.write()` method will write either `data` or a key sequence identified + * by `key` to the `output`. The `key` argument is supported only if `output` is + * a `TTY` text terminal. See `TTY keybindings` for a list of key + * combinations. + * + * If `key` is specified, `data` is ignored. + * + * When called, `rl.write()` will resume the `input` stream if it has been + * paused. + * + * If the `readline.Interface` was created with `output` set to `null` or`undefined` the `data` and `key` are not written. + * + * ```js + * rl.write('Delete this!'); + * // Simulate Ctrl+U to delete the line written previously + * rl.write(null, { ctrl: true, name: 'u' }); + * ``` + * + * The `rl.write()` method will write the data to the `readline` `Interface`'s`input`_as if it were provided by the user_. + * @since v0.1.98 + */ + write(data: string | Buffer, key?: Key): void; + /** + * Returns the real position of the cursor in relation to the input + * prompt + string. Long input (wrapping) strings, as well as multiple + * line prompts are included in the calculations. + * @since v13.5.0, v12.16.0 + */ + getCursorPos(): CursorPos; + /** + * events.EventEmitter + * 1. close + * 2. line + * 3. pause + * 4. resume + * 5. SIGCONT + * 6. SIGINT + * 7. SIGTSTP + * 8. 
history + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'line', listener: (input: string) => void): this; + addListener(event: 'pause', listener: () => void): this; + addListener(event: 'resume', listener: () => void): this; + addListener(event: 'SIGCONT', listener: () => void): this; + addListener(event: 'SIGINT', listener: () => void): this; + addListener(event: 'SIGTSTP', listener: () => void): this; + addListener(event: 'history', listener: (history: string[]) => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'close'): boolean; + emit(event: 'line', input: string): boolean; + emit(event: 'pause'): boolean; + emit(event: 'resume'): boolean; + emit(event: 'SIGCONT'): boolean; + emit(event: 'SIGINT'): boolean; + emit(event: 'SIGTSTP'): boolean; + emit(event: 'history', history: string[]): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'line', listener: (input: string) => void): this; + on(event: 'pause', listener: () => void): this; + on(event: 'resume', listener: () => void): this; + on(event: 'SIGCONT', listener: () => void): this; + on(event: 'SIGINT', listener: () => void): this; + on(event: 'SIGTSTP', listener: () => void): this; + on(event: 'history', listener: (history: string[]) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'line', listener: (input: string) => void): this; + once(event: 'pause', listener: () => void): this; + once(event: 'resume', listener: () => void): this; + once(event: 'SIGCONT', listener: () => void): this; + once(event: 'SIGINT', listener: () => void): this; + once(event: 'SIGTSTP', listener: () => void): this; + once(event: 'history', listener: (history: string[]) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'line', listener: (input: string) => void): this; + prependListener(event: 'pause', listener: () => void): this; + prependListener(event: 'resume', listener: () => void): this; + prependListener(event: 'SIGCONT', listener: () => void): this; + prependListener(event: 'SIGINT', listener: () => void): this; + prependListener(event: 'SIGTSTP', listener: () => void): this; + prependListener(event: 'history', listener: (history: string[]) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'line', listener: (input: string) => void): this; + prependOnceListener(event: 'pause', listener: () => void): this; + prependOnceListener(event: 'resume', listener: () => void): this; + prependOnceListener(event: 'SIGCONT', listener: () => void): this; + prependOnceListener(event: 'SIGINT', listener: () => void): this; + prependOnceListener(event: 'SIGTSTP', listener: () => void): this; + prependOnceListener(event: 'history', listener: (history: string[]) => void): this; + [Symbol.asyncIterator](): AsyncIterableIterator; + } + type ReadLine = Interface; // type forwarded for backwards compatibility + type Completer = (line: string) => CompleterResult; + type AsyncCompleter = (line: string, callback: (err?: null | Error, result?: CompleterResult) => void) => void; + type 
CompleterResult = [string[], string]; + interface ReadLineOptions { + input: NodeJS.ReadableStream; + output?: NodeJS.WritableStream | undefined; + completer?: Completer | AsyncCompleter | undefined; + terminal?: boolean | undefined; + /** + * Initial list of history lines. This option makes sense + * only if `terminal` is set to `true` by the user or by an internal `output` + * check, otherwise the history caching mechanism is not initialized at all. + * @default [] + */ + history?: string[] | undefined; + historySize?: number | undefined; + prompt?: string | undefined; + crlfDelay?: number | undefined; + /** + * If `true`, when a new input line added + * to the history list duplicates an older one, this removes the older line + * from the list. + * @default false + */ + removeHistoryDuplicates?: boolean | undefined; + escapeCodeTimeout?: number | undefined; + tabSize?: number | undefined; + } + /** + * The `readline.createInterface()` method creates a new `readline.Interface`instance. + * + * ```js + * const readline = require('readline'); + * const rl = readline.createInterface({ + * input: process.stdin, + * output: process.stdout + * }); + * ``` + * + * Once the `readline.Interface` instance is created, the most common case is to + * listen for the `'line'` event: + * + * ```js + * rl.on('line', (line) => { + * console.log(`Received: ${line}`); + * }); + * ``` + * + * If `terminal` is `true` for this instance then the `output` stream will get + * the best compatibility if it defines an `output.columns` property and emits + * a `'resize'` event on the `output` if or when the columns ever change + * (`process.stdout` does this automatically when it is a TTY). + * + * When creating a `readline.Interface` using `stdin` as input, the program + * will not terminate until it receives `EOF` (Ctrl+D on + * Linux/macOS, Ctrl+Z followed by Return on + * Windows). + * If you want your application to exit without waiting for user input, you can `unref()` the standard input stream: + * + * ```js + * process.stdin.unref(); + * ``` + * @since v0.1.98 + */ + function createInterface(input: NodeJS.ReadableStream, output?: NodeJS.WritableStream, completer?: Completer | AsyncCompleter, terminal?: boolean): Interface; + function createInterface(options: ReadLineOptions): Interface; + /** + * The `readline.emitKeypressEvents()` method causes the given `Readable` stream to begin emitting `'keypress'` events corresponding to received input. + * + * Optionally, `interface` specifies a `readline.Interface` instance for which + * autocompletion is disabled when copy-pasted input is detected. + * + * If the `stream` is a `TTY`, then it must be in raw mode. + * + * This is automatically called by any readline instance on its `input` if the`input` is a terminal. Closing the `readline` instance does not stop + * the `input` from emitting `'keypress'` events. + * + * ```js + * readline.emitKeypressEvents(process.stdin); + * if (process.stdin.isTTY) + * process.stdin.setRawMode(true); + * ``` + * @since v0.7.7 + */ + function emitKeypressEvents(stream: NodeJS.ReadableStream, readlineInterface?: Interface): void; + type Direction = -1 | 0 | 1; + interface CursorPos { + rows: number; + cols: number; + } + /** + * The `readline.clearLine()` method clears current line of given `TTY` stream + * in a specified direction identified by `dir`. + * @since v0.7.7 + * @param callback Invoked once the operation completes. 
+ * @return `false` if `stream` wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + function clearLine(stream: NodeJS.WritableStream, dir: Direction, callback?: () => void): boolean; + /** + * The `readline.clearScreenDown()` method clears the given `TTY` stream from + * the current position of the cursor down. + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if `stream` wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + function clearScreenDown(stream: NodeJS.WritableStream, callback?: () => void): boolean; + /** + * The `readline.cursorTo()` method moves cursor to the specified position in a + * given `TTY` `stream`. + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if `stream` wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + function cursorTo(stream: NodeJS.WritableStream, x: number, y?: number, callback?: () => void): boolean; + /** + * The `readline.moveCursor()` method moves the cursor _relative_ to its current + * position in a given `TTY` `stream`. + * + * ## Example: Tiny CLI + * + * The following example illustrates the use of `readline.Interface` class to + * implement a small command-line interface: + * + * ```js + * const readline = require('readline'); + * const rl = readline.createInterface({ + * input: process.stdin, + * output: process.stdout, + * prompt: 'OHAI> ' + * }); + * + * rl.prompt(); + * + * rl.on('line', (line) => { + * switch (line.trim()) { + * case 'hello': + * console.log('world!'); + * break; + * default: + * console.log(`Say what? I might have heard '${line.trim()}'`); + * break; + * } + * rl.prompt(); + * }).on('close', () => { + * console.log('Have a great day!'); + * process.exit(0); + * }); + * ``` + * + * ## Example: Read file stream line-by-Line + * + * A common use case for `readline` is to consume an input file one line at a + * time. The easiest way to do so is leveraging the `fs.ReadStream` API as + * well as a `for await...of` loop: + * + * ```js + * const fs = require('fs'); + * const readline = require('readline'); + * + * async function processLineByLine() { + * const fileStream = fs.createReadStream('input.txt'); + * + * const rl = readline.createInterface({ + * input: fileStream, + * crlfDelay: Infinity + * }); + * // Note: we use the crlfDelay option to recognize all instances of CR LF + * // ('\r\n') in input.txt as a single line break. + * + * for await (const line of rl) { + * // Each line in input.txt will be successively available here as `line`. + * console.log(`Line from file: ${line}`); + * } + * } + * + * processLineByLine(); + * ``` + * + * Alternatively, one could use the `'line'` event: + * + * ```js + * const fs = require('fs'); + * const readline = require('readline'); + * + * const rl = readline.createInterface({ + * input: fs.createReadStream('sample.txt'), + * crlfDelay: Infinity + * }); + * + * rl.on('line', (line) => { + * console.log(`Line from file: ${line}`); + * }); + * ``` + * + * Currently, `for await...of` loop can be a bit slower. 
If `async` / `await`flow and speed are both essential, a mixed approach can be applied: + * + * ```js + * const { once } = require('events'); + * const { createReadStream } = require('fs'); + * const { createInterface } = require('readline'); + * + * (async function processLineByLine() { + * try { + * const rl = createInterface({ + * input: createReadStream('big-file.txt'), + * crlfDelay: Infinity + * }); + * + * rl.on('line', (line) => { + * // Process the line. + * }); + * + * await once(rl, 'close'); + * + * console.log('File processed.'); + * } catch (err) { + * console.error(err); + * } + * })(); + * ``` + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if `stream` wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + function moveCursor(stream: NodeJS.WritableStream, dx: number, dy: number, callback?: () => void): boolean; +} +declare module 'node:readline' { + export * from 'readline'; +} diff --git a/node_modules/@types/node/repl.d.ts b/node_modules/@types/node/repl.d.ts new file mode 100644 index 000000000..04f64e183 --- /dev/null +++ b/node_modules/@types/node/repl.d.ts @@ -0,0 +1,424 @@ +/** + * The `repl` module provides a Read-Eval-Print-Loop (REPL) implementation that + * is available both as a standalone program or includible in other applications. + * It can be accessed using: + * + * ```js + * const repl = require('repl'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/repl.js) + */ +declare module 'repl' { + import { Interface, Completer, AsyncCompleter } from 'node:readline'; + import { Context } from 'node:vm'; + import { InspectOptions } from 'node:util'; + interface ReplOptions { + /** + * The input prompt to display. + * @default "> " + */ + prompt?: string | undefined; + /** + * The `Readable` stream from which REPL input will be read. + * @default process.stdin + */ + input?: NodeJS.ReadableStream | undefined; + /** + * The `Writable` stream to which REPL output will be written. + * @default process.stdout + */ + output?: NodeJS.WritableStream | undefined; + /** + * If `true`, specifies that the output should be treated as a TTY terminal, and have + * ANSI/VT100 escape codes written to it. + * Default: checking the value of the `isTTY` property on the output stream upon + * instantiation. + */ + terminal?: boolean | undefined; + /** + * The function to be used when evaluating each given line of input. + * Default: an async wrapper for the JavaScript `eval()` function. An `eval` function can + * error with `repl.Recoverable` to indicate the input was incomplete and prompt for + * additional lines. + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_default_evaluation + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_custom_evaluation_functions + */ + eval?: REPLEval | undefined; + /** + * Defines if the repl prints output previews or not. + * @default `true` Always `false` in case `terminal` is falsy. + */ + preview?: boolean | undefined; + /** + * If `true`, specifies that the default `writer` function should include ANSI color + * styling to REPL output. If a custom `writer` function is provided then this has no + * effect. + * Default: the REPL instance's `terminal` value. 
+ */ + useColors?: boolean | undefined; + /** + * If `true`, specifies that the default evaluation function will use the JavaScript + * `global` as the context as opposed to creating a new separate context for the REPL + * instance. The node CLI REPL sets this value to `true`. + * Default: `false`. + */ + useGlobal?: boolean | undefined; + /** + * If `true`, specifies that the default writer will not output the return value of a + * command if it evaluates to `undefined`. + * Default: `false`. + */ + ignoreUndefined?: boolean | undefined; + /** + * The function to invoke to format the output of each command before writing to `output`. + * Default: a wrapper for `util.inspect`. + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_customizing_repl_output + */ + writer?: REPLWriter | undefined; + /** + * An optional function used for custom Tab auto completion. + * + * @see https://nodejs.org/dist/latest-v11.x/docs/api/readline.html#readline_use_of_the_completer_function + */ + completer?: Completer | AsyncCompleter | undefined; + /** + * A flag that specifies whether the default evaluator executes all JavaScript commands in + * strict mode or default (sloppy) mode. + * Accepted values are: + * - `repl.REPL_MODE_SLOPPY` - evaluates expressions in sloppy mode. + * - `repl.REPL_MODE_STRICT` - evaluates expressions in strict mode. This is equivalent to + * prefacing every repl statement with `'use strict'`. + */ + replMode?: typeof REPL_MODE_SLOPPY | typeof REPL_MODE_STRICT | undefined; + /** + * Stop evaluating the current piece of code when `SIGINT` is received, i.e. `Ctrl+C` is + * pressed. This cannot be used together with a custom `eval` function. + * Default: `false`. + */ + breakEvalOnSigint?: boolean | undefined; + } + type REPLEval = (this: REPLServer, evalCmd: string, context: Context, file: string, cb: (err: Error | null, result: any) => void) => void; + type REPLWriter = (this: REPLServer, obj: any) => string; + /** + * This is the default "writer" value, if none is passed in the REPL options, + * and it can be overridden by custom print functions. + */ + const writer: REPLWriter & { + options: InspectOptions; + }; + type REPLCommandAction = (this: REPLServer, text: string) => void; + interface REPLCommand { + /** + * Help text to be displayed when `.help` is entered. + */ + help?: string | undefined; + /** + * The function to execute, optionally accepting a single string argument. + */ + action: REPLCommandAction; + } + /** + * Instances of `repl.REPLServer` are created using the {@link start} method + * or directly using the JavaScript `new` keyword. + * + * ```js + * const repl = require('repl'); + * + * const options = { useColors: true }; + * + * const firstInstance = repl.start(options); + * const secondInstance = new repl.REPLServer(options); + * ``` + * @since v0.1.91 + */ + class REPLServer extends Interface { + /** + * The `vm.Context` provided to the `eval` function to be used for JavaScript + * evaluation. + */ + readonly context: Context; + /** + * @deprecated since v14.3.0 - Use `input` instead. + */ + readonly inputStream: NodeJS.ReadableStream; + /** + * @deprecated since v14.3.0 - Use `output` instead. + */ + readonly outputStream: NodeJS.WritableStream; + /** + * The `Readable` stream from which REPL input will be read. + */ + readonly input: NodeJS.ReadableStream; + /** + * The `Writable` stream to which REPL output will be written. 
+ */ + readonly output: NodeJS.WritableStream; + /** + * The commands registered via `replServer.defineCommand()`. + */ + readonly commands: NodeJS.ReadOnlyDict; + /** + * A value indicating whether the REPL is currently in "editor mode". + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_commands_and_special_keys + */ + readonly editorMode: boolean; + /** + * A value indicating whether the `_` variable has been assigned. + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_assignment_of_the_underscore_variable + */ + readonly underscoreAssigned: boolean; + /** + * The last evaluation result from the REPL (assigned to the `_` variable inside of the REPL). + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_assignment_of_the_underscore_variable + */ + readonly last: any; + /** + * A value indicating whether the `_error` variable has been assigned. + * + * @since v9.8.0 + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_assignment_of_the_underscore_variable + */ + readonly underscoreErrAssigned: boolean; + /** + * The last error raised inside the REPL (assigned to the `_error` variable inside of the REPL). + * + * @since v9.8.0 + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_assignment_of_the_underscore_variable + */ + readonly lastError: any; + /** + * Specified in the REPL options, this is the function to be used when evaluating each + * given line of input. If not specified in the REPL options, this is an async wrapper + * for the JavaScript `eval()` function. + */ + readonly eval: REPLEval; + /** + * Specified in the REPL options, this is a value indicating whether the default + * `writer` function should include ANSI color styling to REPL output. + */ + readonly useColors: boolean; + /** + * Specified in the REPL options, this is a value indicating whether the default `eval` + * function will use the JavaScript `global` as the context as opposed to creating a new + * separate context for the REPL instance. + */ + readonly useGlobal: boolean; + /** + * Specified in the REPL options, this is a value indicating whether the default `writer` + * function should output the result of a command if it evaluates to `undefined`. + */ + readonly ignoreUndefined: boolean; + /** + * Specified in the REPL options, this is the function to invoke to format the output of + * each command before writing to `outputStream`. If not specified in the REPL options, + * this will be a wrapper for `util.inspect`. + */ + readonly writer: REPLWriter; + /** + * Specified in the REPL options, this is the function to use for custom Tab auto-completion. + */ + readonly completer: Completer | AsyncCompleter; + /** + * Specified in the REPL options, this is a flag that specifies whether the default `eval` + * function should execute all JavaScript commands in strict mode or default (sloppy) mode. + * Possible values are: + * - `repl.REPL_MODE_SLOPPY` - evaluates expressions in sloppy mode. + * - `repl.REPL_MODE_STRICT` - evaluates expressions in strict mode. This is equivalent to + * prefacing every repl statement with `'use strict'`. + */ + readonly replMode: typeof REPL_MODE_SLOPPY | typeof REPL_MODE_STRICT; + /** + * NOTE: According to the documentation: + * + * > Instances of `repl.REPLServer` are created using the `repl.start()` method and + * > _should not_ be created directly using the JavaScript `new` keyword. + * + * `REPLServer` cannot be subclassed due to implementation specifics in NodeJS. 
+ * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_class_replserver + */ + private constructor(); + /** + * The `replServer.defineCommand()` method is used to add new `.`\-prefixed commands + * to the REPL instance. Such commands are invoked by typing a `.` followed by the`keyword`. The `cmd` is either a `Function` or an `Object` with the following + * properties: + * + * The following example shows two new commands added to the REPL instance: + * + * ```js + * const repl = require('repl'); + * + * const replServer = repl.start({ prompt: '> ' }); + * replServer.defineCommand('sayhello', { + * help: 'Say hello', + * action(name) { + * this.clearBufferedCommand(); + * console.log(`Hello, ${name}!`); + * this.displayPrompt(); + * } + * }); + * replServer.defineCommand('saybye', function saybye() { + * console.log('Goodbye!'); + * this.close(); + * }); + * ``` + * + * The new commands can then be used from within the REPL instance: + * + * ```console + * > .sayhello Node.js User + * Hello, Node.js User! + * > .saybye + * Goodbye! + * ``` + * @since v0.3.0 + * @param keyword The command keyword (*without* a leading `.` character). + * @param cmd The function to invoke when the command is processed. + */ + defineCommand(keyword: string, cmd: REPLCommandAction | REPLCommand): void; + /** + * The `replServer.displayPrompt()` method readies the REPL instance for input + * from the user, printing the configured `prompt` to a new line in the `output`and resuming the `input` to accept new input. + * + * When multi-line input is being entered, an ellipsis is printed rather than the + * 'prompt'. + * + * When `preserveCursor` is `true`, the cursor placement will not be reset to `0`. + * + * The `replServer.displayPrompt` method is primarily intended to be called from + * within the action function for commands registered using the`replServer.defineCommand()` method. + * @since v0.1.91 + */ + displayPrompt(preserveCursor?: boolean): void; + /** + * The `replServer.clearBufferedCommand()` method clears any command that has been + * buffered but not yet executed. This method is primarily intended to be + * called from within the action function for commands registered using the`replServer.defineCommand()` method. + * @since v9.0.0 + */ + clearBufferedCommand(): void; + /** + * Initializes a history log file for the REPL instance. When executing the + * Node.js binary and using the command-line REPL, a history file is initialized + * by default. However, this is not the case when creating a REPL + * programmatically. Use this method to initialize a history log file when working + * with REPL instances programmatically. + * @since v11.10.0 + * @param historyPath the path to the history file + * @param callback called when history writes are ready or upon error + */ + setupHistory(path: string, callback: (err: Error | null, repl: this) => void): void; + /** + * events.EventEmitter + * 1. close - inherited from `readline.Interface` + * 2. line - inherited from `readline.Interface` + * 3. pause - inherited from `readline.Interface` + * 4. resume - inherited from `readline.Interface` + * 5. SIGCONT - inherited from `readline.Interface` + * 6. SIGINT - inherited from `readline.Interface` + * 7. SIGTSTP - inherited from `readline.Interface` + * 8. exit + * 9. 
reset + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'close', listener: () => void): this; + addListener(event: 'line', listener: (input: string) => void): this; + addListener(event: 'pause', listener: () => void): this; + addListener(event: 'resume', listener: () => void): this; + addListener(event: 'SIGCONT', listener: () => void): this; + addListener(event: 'SIGINT', listener: () => void): this; + addListener(event: 'SIGTSTP', listener: () => void): this; + addListener(event: 'exit', listener: () => void): this; + addListener(event: 'reset', listener: (context: Context) => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'close'): boolean; + emit(event: 'line', input: string): boolean; + emit(event: 'pause'): boolean; + emit(event: 'resume'): boolean; + emit(event: 'SIGCONT'): boolean; + emit(event: 'SIGINT'): boolean; + emit(event: 'SIGTSTP'): boolean; + emit(event: 'exit'): boolean; + emit(event: 'reset', context: Context): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'close', listener: () => void): this; + on(event: 'line', listener: (input: string) => void): this; + on(event: 'pause', listener: () => void): this; + on(event: 'resume', listener: () => void): this; + on(event: 'SIGCONT', listener: () => void): this; + on(event: 'SIGINT', listener: () => void): this; + on(event: 'SIGTSTP', listener: () => void): this; + on(event: 'exit', listener: () => void): this; + on(event: 'reset', listener: (context: Context) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'line', listener: (input: string) => void): this; + once(event: 'pause', listener: () => void): this; + once(event: 'resume', listener: () => void): this; + once(event: 'SIGCONT', listener: () => void): this; + once(event: 'SIGINT', listener: () => void): this; + once(event: 'SIGTSTP', listener: () => void): this; + once(event: 'exit', listener: () => void): this; + once(event: 'reset', listener: (context: Context) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'line', listener: (input: string) => void): this; + prependListener(event: 'pause', listener: () => void): this; + prependListener(event: 'resume', listener: () => void): this; + prependListener(event: 'SIGCONT', listener: () => void): this; + prependListener(event: 'SIGINT', listener: () => void): this; + prependListener(event: 'SIGTSTP', listener: () => void): this; + prependListener(event: 'exit', listener: () => void): this; + prependListener(event: 'reset', listener: (context: Context) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'line', listener: (input: string) => void): this; + prependOnceListener(event: 'pause', listener: () => void): this; + prependOnceListener(event: 'resume', listener: () => void): this; + prependOnceListener(event: 'SIGCONT', listener: () => void): this; + prependOnceListener(event: 'SIGINT', listener: () => void): this; + prependOnceListener(event: 'SIGTSTP', listener: () => void): this; + prependOnceListener(event: 'exit', listener: () => void): this; + prependOnceListener(event: 'reset', listener: (context: Context) => void): this; + } + /** + 
* A flag passed in the REPL options. Evaluates expressions in sloppy mode. + */ + const REPL_MODE_SLOPPY: unique symbol; + /** + * A flag passed in the REPL options. Evaluates expressions in strict mode. + * This is equivalent to prefacing every repl statement with `'use strict'`. + */ + const REPL_MODE_STRICT: unique symbol; + /** + * The `repl.start()` method creates and starts a {@link REPLServer} instance. + * + * If `options` is a string, then it specifies the input prompt: + * + * ```js + * const repl = require('repl'); + * + * // a Unix style prompt + * repl.start('$ '); + * ``` + * @since v0.1.91 + */ + function start(options?: string | ReplOptions): REPLServer; + /** + * Indicates a recoverable error that a `REPLServer` can use to support multi-line input. + * + * @see https://nodejs.org/dist/latest-v10.x/docs/api/repl.html#repl_recoverable_errors + */ + class Recoverable extends SyntaxError { + err: Error; + constructor(err: Error); + } +} +declare module 'node:repl' { + export * from 'repl'; +} diff --git a/node_modules/@types/node/stream.d.ts b/node_modules/@types/node/stream.d.ts new file mode 100644 index 000000000..3927144a8 --- /dev/null +++ b/node_modules/@types/node/stream.d.ts @@ -0,0 +1,1249 @@ +/** + * A stream is an abstract interface for working with streaming data in Node.js. + * The `stream` module provides an API for implementing the stream interface. + * + * There are many stream objects provided by Node.js. For instance, a `request to an HTTP server` and `process.stdout` are both stream instances. + * + * Streams can be readable, writable, or both. All streams are instances of `EventEmitter`. + * + * To access the `stream` module: + * + * ```js + * const stream = require('stream'); + * ``` + * + * The `stream` module is useful for creating new types of stream instances. It is + * usually not necessary to use the `stream` module to consume streams. + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/stream.js) + */ +declare module 'stream' { + import { EventEmitter, Abortable } from 'node:events'; + import * as streamPromises from 'node:stream/promises'; + import * as streamConsumers from 'node:stream/consumers'; + class internal extends EventEmitter { + pipe( + destination: T, + options?: { + end?: boolean | undefined; + } + ): T; + } + namespace internal { + class Stream extends internal { + constructor(opts?: ReadableOptions); + } + interface StreamOptions extends Abortable { + emitClose?: boolean | undefined; + highWaterMark?: number | undefined; + objectMode?: boolean | undefined; + construct?(this: T, callback: (error?: Error | null) => void): void; + destroy?(this: T, error: Error | null, callback: (error: Error | null) => void): void; + autoDestroy?: boolean | undefined; + } + interface ReadableOptions extends StreamOptions { + encoding?: BufferEncoding | undefined; + read?(this: Readable, size: number): void; + } + /** + * @since v0.9.4 + */ + class Readable extends Stream implements NodeJS.ReadableStream { + /** + * A utility method for creating Readable Streams out of iterators. + */ + static from(iterable: Iterable | AsyncIterable, options?: ReadableOptions): Readable; + /** + * Returns whether the stream has been read from or cancelled. + * @since v16.8.0 + */ + static isDisturbed(stream: Readable | NodeJS.ReadableStream): boolean; + /** + * Returns whether the stream was destroyed or errored before emitting `'end'`. 
+ * @since v16.8.0 + * @experimental + */ + readonly readableAborted: boolean; + /** + * Is `true` if it is safe to call `readable.read()`, which means + * the stream has not been destroyed or emitted `'error'` or `'end'`. + * @since v11.4.0 + */ + readable: boolean; + /** + * Returns whether `'data'` has been emitted. + * @since v16.7.0 + * @experimental + */ + readonly readableDidRead: boolean; + /** + * Getter for the property `encoding` of a given `Readable` stream. The `encoding`property can be set using the `readable.setEncoding()` method. + * @since v12.7.0 + */ + readonly readableEncoding: BufferEncoding | null; + /** + * Becomes `true` when `'end'` event is emitted. + * @since v12.9.0 + */ + readonly readableEnded: boolean; + /** + * This property reflects the current state of a `Readable` stream as described + * in the `Three states` section. + * @since v9.4.0 + */ + readonly readableFlowing: boolean | null; + /** + * Returns the value of `highWaterMark` passed when creating this `Readable`. + * @since v9.3.0 + */ + readonly readableHighWaterMark: number; + /** + * This property contains the number of bytes (or objects) in the queue + * ready to be read. The value provides introspection data regarding + * the status of the `highWaterMark`. + * @since v9.4.0 + */ + readonly readableLength: number; + /** + * Getter for the property `objectMode` of a given `Readable` stream. + * @since v12.3.0 + */ + readonly readableObjectMode: boolean; + /** + * Is `true` after `readable.destroy()` has been called. + * @since v8.0.0 + */ + destroyed: boolean; + constructor(opts?: ReadableOptions); + _construct?(callback: (error?: Error | null) => void): void; + _read(size: number): void; + /** + * The `readable.read()` method pulls some data out of the internal buffer and + * returns it. If no data available to be read, `null` is returned. By default, + * the data will be returned as a `Buffer` object unless an encoding has been + * specified using the `readable.setEncoding()` method or the stream is operating + * in object mode. + * + * The optional `size` argument specifies a specific number of bytes to read. If`size` bytes are not available to be read, `null` will be returned _unless_the stream has ended, in which + * case all of the data remaining in the internal + * buffer will be returned. + * + * If the `size` argument is not specified, all of the data contained in the + * internal buffer will be returned. + * + * The `size` argument must be less than or equal to 1 GiB. + * + * The `readable.read()` method should only be called on `Readable` streams + * operating in paused mode. In flowing mode, `readable.read()` is called + * automatically until the internal buffer is fully drained. + * + * ```js + * const readable = getReadableStreamSomehow(); + * + * // 'readable' may be triggered multiple times as data is buffered in + * readable.on('readable', () => { + * let chunk; + * console.log('Stream is readable (new data received in buffer)'); + * // Use a loop to make sure we read all currently available data + * while (null !== (chunk = readable.read())) { + * console.log(`Read ${chunk.length} bytes of data...`); + * } + * }); + * + * // 'end' will be triggered once when there is no more data available + * readable.on('end', () => { + * console.log('Reached end of stream.'); + * }); + * ``` + * + * Each call to `readable.read()` returns a chunk of data, or `null`. The chunks + * are not concatenated. A `while` loop is necessary to consume all data + * currently in the buffer. 
When reading a large file `.read()` may return `null`, + * having consumed all buffered content so far, but there is still more data to + * come not yet buffered. In this case a new `'readable'` event will be emitted + * when there is more data in the buffer. Finally the `'end'` event will be + * emitted when there is no more data to come. + * + * Therefore to read a file's whole contents from a `readable`, it is necessary + * to collect chunks across multiple `'readable'` events: + * + * ```js + * const chunks = []; + * + * readable.on('readable', () => { + * let chunk; + * while (null !== (chunk = readable.read())) { + * chunks.push(chunk); + * } + * }); + * + * readable.on('end', () => { + * const content = chunks.join(''); + * }); + * ``` + * + * A `Readable` stream in object mode will always return a single item from + * a call to `readable.read(size)`, regardless of the value of the`size` argument. + * + * If the `readable.read()` method returns a chunk of data, a `'data'` event will + * also be emitted. + * + * Calling {@link read} after the `'end'` event has + * been emitted will return `null`. No runtime error will be raised. + * @since v0.9.4 + * @param size Optional argument to specify how much data to read. + */ + read(size?: number): any; + /** + * The `readable.setEncoding()` method sets the character encoding for + * data read from the `Readable` stream. + * + * By default, no encoding is assigned and stream data will be returned as`Buffer` objects. Setting an encoding causes the stream data + * to be returned as strings of the specified encoding rather than as `Buffer`objects. For instance, calling `readable.setEncoding('utf8')` will cause the + * output data to be interpreted as UTF-8 data, and passed as strings. Calling`readable.setEncoding('hex')` will cause the data to be encoded in hexadecimal + * string format. + * + * The `Readable` stream will properly handle multi-byte characters delivered + * through the stream that would otherwise become improperly decoded if simply + * pulled from the stream as `Buffer` objects. + * + * ```js + * const readable = getReadableStreamSomehow(); + * readable.setEncoding('utf8'); + * readable.on('data', (chunk) => { + * assert.equal(typeof chunk, 'string'); + * console.log('Got %d characters of string data:', chunk.length); + * }); + * ``` + * @since v0.9.4 + * @param encoding The encoding to use. + */ + setEncoding(encoding: BufferEncoding): this; + /** + * The `readable.pause()` method will cause a stream in flowing mode to stop + * emitting `'data'` events, switching out of flowing mode. Any data that + * becomes available will remain in the internal buffer. + * + * ```js + * const readable = getReadableStreamSomehow(); + * readable.on('data', (chunk) => { + * console.log(`Received ${chunk.length} bytes of data.`); + * readable.pause(); + * console.log('There will be no additional data for 1 second.'); + * setTimeout(() => { + * console.log('Now data will start flowing again.'); + * readable.resume(); + * }, 1000); + * }); + * ``` + * + * The `readable.pause()` method has no effect if there is a `'readable'`event listener. + * @since v0.9.4 + */ + pause(): this; + /** + * The `readable.resume()` method causes an explicitly paused `Readable` stream to + * resume emitting `'data'` events, switching the stream into flowing mode. 
+ * + * The `readable.resume()` method can be used to fully consume the data from a + * stream without actually processing any of that data: + * + * ```js + * getReadableStreamSomehow() + * .resume() + * .on('end', () => { + * console.log('Reached the end, but did not read anything.'); + * }); + * ``` + * + * The `readable.resume()` method has no effect if there is a `'readable'`event listener. + * @since v0.9.4 + */ + resume(): this; + /** + * The `readable.isPaused()` method returns the current operating state of the`Readable`. This is used primarily by the mechanism that underlies the`readable.pipe()` method. In most + * typical cases, there will be no reason to + * use this method directly. + * + * ```js + * const readable = new stream.Readable(); + * + * readable.isPaused(); // === false + * readable.pause(); + * readable.isPaused(); // === true + * readable.resume(); + * readable.isPaused(); // === false + * ``` + * @since v0.11.14 + */ + isPaused(): boolean; + /** + * The `readable.unpipe()` method detaches a `Writable` stream previously attached + * using the {@link pipe} method. + * + * If the `destination` is not specified, then _all_ pipes are detached. + * + * If the `destination` is specified, but no pipe is set up for it, then + * the method does nothing. + * + * ```js + * const fs = require('fs'); + * const readable = getReadableStreamSomehow(); + * const writable = fs.createWriteStream('file.txt'); + * // All the data from readable goes into 'file.txt', + * // but only for the first second. + * readable.pipe(writable); + * setTimeout(() => { + * console.log('Stop writing to file.txt.'); + * readable.unpipe(writable); + * console.log('Manually close the file stream.'); + * writable.end(); + * }, 1000); + * ``` + * @since v0.9.4 + * @param destination Optional specific stream to unpipe + */ + unpipe(destination?: NodeJS.WritableStream): this; + /** + * Passing `chunk` as `null` signals the end of the stream (EOF) and behaves the + * same as `readable.push(null)`, after which no more data can be written. The EOF + * signal is put at the end of the buffer and any buffered data will still be + * flushed. + * + * The `readable.unshift()` method pushes a chunk of data back into the internal + * buffer. This is useful in certain situations where a stream is being consumed by + * code that needs to "un-consume" some amount of data that it has optimistically + * pulled out of the source, so that the data can be passed on to some other party. + * + * The `stream.unshift(chunk)` method cannot be called after the `'end'` event + * has been emitted or a runtime error will be thrown. + * + * Developers using `stream.unshift()` often should consider switching to + * use of a `Transform` stream instead. See the `API for stream implementers` section for more information. + * + * ```js + * // Pull off a header delimited by \n\n. + * // Use unshift() if we get too much. + * // Call the callback with (error, header, stream). + * const { StringDecoder } = require('string_decoder'); + * function parseHeader(stream, callback) { + * stream.on('error', callback); + * stream.on('readable', onReadable); + * const decoder = new StringDecoder('utf8'); + * let header = ''; + * function onReadable() { + * let chunk; + * while (null !== (chunk = stream.read())) { + * const str = decoder.write(chunk); + * if (str.match(/\n\n/)) { + * // Found the header boundary. 
+ * const split = str.split(/\n\n/); + * header += split.shift(); + * const remaining = split.join('\n\n'); + * const buf = Buffer.from(remaining, 'utf8'); + * stream.removeListener('error', callback); + * // Remove the 'readable' listener before unshifting. + * stream.removeListener('readable', onReadable); + * if (buf.length) + * stream.unshift(buf); + * // Now the body of the message can be read from the stream. + * callback(null, header, stream); + * } else { + * // Still reading the header. + * header += str; + * } + * } + * } + * } + * ``` + * + * Unlike {@link push}, `stream.unshift(chunk)` will not + * end the reading process by resetting the internal reading state of the stream. + * This can cause unexpected results if `readable.unshift()` is called during a + * read (i.e. from within a {@link _read} implementation on a + * custom stream). Following the call to `readable.unshift()` with an immediate {@link push} will reset the reading state appropriately, + * however it is best to simply avoid calling `readable.unshift()` while in the + * process of performing a read. + * @since v0.9.11 + * @param chunk Chunk of data to unshift onto the read queue. For streams not operating in object mode, `chunk` must be a string, `Buffer`, `Uint8Array` or `null`. For object mode + * streams, `chunk` may be any JavaScript value. + * @param encoding Encoding of string chunks. Must be a valid `Buffer` encoding, such as `'utf8'` or `'ascii'`. + */ + unshift(chunk: any, encoding?: BufferEncoding): void; + /** + * Prior to Node.js 0.10, streams did not implement the entire `stream` module API + * as it is currently defined. (See `Compatibility` for more information.) + * + * When using an older Node.js library that emits `'data'` events and has a {@link pause} method that is advisory only, the`readable.wrap()` method can be used to create a `Readable` + * stream that uses + * the old stream as its data source. + * + * It will rarely be necessary to use `readable.wrap()` but the method has been + * provided as a convenience for interacting with older Node.js applications and + * libraries. + * + * ```js + * const { OldReader } = require('./old-api-module.js'); + * const { Readable } = require('stream'); + * const oreader = new OldReader(); + * const myReader = new Readable().wrap(oreader); + * + * myReader.on('readable', () => { + * myReader.read(); // etc. + * }); + * ``` + * @since v0.9.4 + * @param stream An "old style" readable stream + */ + wrap(stream: NodeJS.ReadableStream): this; + push(chunk: any, encoding?: BufferEncoding): boolean; + _destroy(error: Error | null, callback: (error?: Error | null) => void): void; + /** + * Destroy the stream. Optionally emit an `'error'` event, and emit a `'close'`event (unless `emitClose` is set to `false`). After this call, the readable + * stream will release any internal resources and subsequent calls to `push()`will be ignored. + * + * Once `destroy()` has been called any further calls will be a no-op and no + * further errors except from `_destroy()` may be emitted as `'error'`. + * + * Implementors should not override this method, but instead implement `readable._destroy()`. + * @since v8.0.0 + * @param error Error which will be passed as payload in `'error'` event + */ + destroy(error?: Error): void; + /** + * Event emitter + * The defined events on documents including: + * 1. close + * 2. data + * 3. end + * 4. error + * 5. pause + * 6. readable + * 7. 
resume + */ + addListener(event: 'close', listener: () => void): this; + addListener(event: 'data', listener: (chunk: any) => void): this; + addListener(event: 'end', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'pause', listener: () => void): this; + addListener(event: 'readable', listener: () => void): this; + addListener(event: 'resume', listener: () => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'close'): boolean; + emit(event: 'data', chunk: any): boolean; + emit(event: 'end'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'pause'): boolean; + emit(event: 'readable'): boolean; + emit(event: 'resume'): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'close', listener: () => void): this; + on(event: 'data', listener: (chunk: any) => void): this; + on(event: 'end', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'pause', listener: () => void): this; + on(event: 'readable', listener: () => void): this; + on(event: 'resume', listener: () => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'data', listener: (chunk: any) => void): this; + once(event: 'end', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'pause', listener: () => void): this; + once(event: 'readable', listener: () => void): this; + once(event: 'resume', listener: () => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'data', listener: (chunk: any) => void): this; + prependListener(event: 'end', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'pause', listener: () => void): this; + prependListener(event: 'readable', listener: () => void): this; + prependListener(event: 'resume', listener: () => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'data', listener: (chunk: any) => void): this; + prependOnceListener(event: 'end', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'pause', listener: () => void): this; + prependOnceListener(event: 'readable', listener: () => void): this; + prependOnceListener(event: 'resume', listener: () => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + removeListener(event: 'close', listener: () => void): this; + removeListener(event: 'data', listener: (chunk: any) => void): this; + removeListener(event: 'end', listener: () => void): this; + removeListener(event: 'error', listener: (err: Error) => void): this; + removeListener(event: 'pause', listener: () => void): this; + removeListener(event: 'readable', listener: () => void): this; + removeListener(event: 'resume', listener: () => void): this; + removeListener(event: string | symbol, listener: (...args: any[]) => void): this; + [Symbol.asyncIterator](): AsyncIterableIterator; + } + interface WritableOptions extends StreamOptions { + decodeStrings?: boolean | undefined; + 
defaultEncoding?: BufferEncoding | undefined; + write?(this: Writable, chunk: any, encoding: BufferEncoding, callback: (error?: Error | null) => void): void; + writev?( + this: Writable, + chunks: Array<{ + chunk: any; + encoding: BufferEncoding; + }>, + callback: (error?: Error | null) => void + ): void; + final?(this: Writable, callback: (error?: Error | null) => void): void; + } + /** + * @since v0.9.4 + */ + class Writable extends Stream implements NodeJS.WritableStream { + /** + * Is `true` if it is safe to call `writable.write()`, which means + * the stream has not been destroyed, errored or ended. + * @since v11.4.0 + */ + readonly writable: boolean; + /** + * Is `true` after `writable.end()` has been called. This property + * does not indicate whether the data has been flushed, for this use `writable.writableFinished` instead. + * @since v12.9.0 + */ + readonly writableEnded: boolean; + /** + * Is set to `true` immediately before the `'finish'` event is emitted. + * @since v12.6.0 + */ + readonly writableFinished: boolean; + /** + * Return the value of `highWaterMark` passed when creating this `Writable`. + * @since v9.3.0 + */ + readonly writableHighWaterMark: number; + /** + * This property contains the number of bytes (or objects) in the queue + * ready to be written. The value provides introspection data regarding + * the status of the `highWaterMark`. + * @since v9.4.0 + */ + readonly writableLength: number; + /** + * Getter for the property `objectMode` of a given `Writable` stream. + * @since v12.3.0 + */ + readonly writableObjectMode: boolean; + /** + * Number of times `writable.uncork()` needs to be + * called in order to fully uncork the stream. + * @since v13.2.0, v12.16.0 + */ + readonly writableCorked: number; + /** + * Is `true` after `writable.destroy()` has been called. + * @since v8.0.0 + */ + destroyed: boolean; + constructor(opts?: WritableOptions); + _write(chunk: any, encoding: BufferEncoding, callback: (error?: Error | null) => void): void; + _writev?( + chunks: Array<{ + chunk: any; + encoding: BufferEncoding; + }>, + callback: (error?: Error | null) => void + ): void; + _construct?(callback: (error?: Error | null) => void): void; + _destroy(error: Error | null, callback: (error?: Error | null) => void): void; + _final(callback: (error?: Error | null) => void): void; + /** + * The `writable.write()` method writes some data to the stream, and calls the + * supplied `callback` once the data has been fully handled. If an error + * occurs, the `callback` will be called with the error as its + * first argument. The `callback` is called asynchronously and before `'error'` is + * emitted. + * + * The return value is `true` if the internal buffer is less than the`highWaterMark` configured when the stream was created after admitting `chunk`. + * If `false` is returned, further attempts to write data to the stream should + * stop until the `'drain'` event is emitted. + * + * While a stream is not draining, calls to `write()` will buffer `chunk`, and + * return false. Once all currently buffered chunks are drained (accepted for + * delivery by the operating system), the `'drain'` event will be emitted. + * It is recommended that once `write()` returns false, no more chunks be written + * until the `'drain'` event is emitted. While calling `write()` on a stream that + * is not draining is allowed, Node.js will buffer all written chunks until + * maximum memory usage occurs, at which point it will abort unconditionally. 
+ * Even before it aborts, high memory usage will cause poor garbage collector + * performance and high RSS (which is not typically released back to the system, + * even after the memory is no longer required). Since TCP sockets may never + * drain if the remote peer does not read the data, writing a socket that is + * not draining may lead to a remotely exploitable vulnerability. + * + * Writing data while the stream is not draining is particularly + * problematic for a `Transform`, because the `Transform` streams are paused + * by default until they are piped or a `'data'` or `'readable'` event handler + * is added. + * + * If the data to be written can be generated or fetched on demand, it is + * recommended to encapsulate the logic into a `Readable` and use {@link pipe}. However, if calling `write()` is preferred, it is + * possible to respect backpressure and avoid memory issues using the `'drain'` event: + * + * ```js + * function write(data, cb) { + * if (!stream.write(data)) { + * stream.once('drain', cb); + * } else { + * process.nextTick(cb); + * } + * } + * + * // Wait for cb to be called before doing any other write. + * write('hello', () => { + * console.log('Write completed, do more writes now.'); + * }); + * ``` + * + * A `Writable` stream in object mode will always ignore the `encoding` argument. + * @since v0.9.4 + * @param chunk Optional data to write. For streams not operating in object mode, `chunk` must be a string, `Buffer` or `Uint8Array`. For object mode streams, `chunk` may be any + * JavaScript value other than `null`. + * @param [encoding='utf8'] The encoding, if `chunk` is a string. + * @param callback Callback for when this chunk of data is flushed. + * @return `false` if the stream wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + write(chunk: any, callback?: (error: Error | null | undefined) => void): boolean; + write(chunk: any, encoding: BufferEncoding, callback?: (error: Error | null | undefined) => void): boolean; + /** + * The `writable.setDefaultEncoding()` method sets the default `encoding` for a `Writable` stream. + * @since v0.11.15 + * @param encoding The new default encoding + */ + setDefaultEncoding(encoding: BufferEncoding): this; + /** + * Calling the `writable.end()` method signals that no more data will be written + * to the `Writable`. The optional `chunk` and `encoding` arguments allow one + * final additional chunk of data to be written immediately before closing the + * stream. + * + * Calling the {@link write} method after calling {@link end} will raise an error. + * + * ```js + * // Write 'hello, ' and then end with 'world!'. + * const fs = require('fs'); + * const file = fs.createWriteStream('example.txt'); + * file.write('hello, '); + * file.end('world!'); + * // Writing more now is not allowed! + * ``` + * @since v0.9.4 + * @param chunk Optional data to write. For streams not operating in object mode, `chunk` must be a string, `Buffer` or `Uint8Array`. For object mode streams, `chunk` may be any + * JavaScript value other than `null`. + * @param encoding The encoding if `chunk` is a string + * @param callback Callback for when the stream is finished. + */ + end(cb?: () => void): void; + end(chunk: any, cb?: () => void): void; + end(chunk: any, encoding: BufferEncoding, cb?: () => void): void; + /** + * The `writable.cork()` method forces all written data to be buffered in memory. 
+ * The buffered data will be flushed when either the {@link uncork} or {@link end} methods are called. + * + * The primary intent of `writable.cork()` is to accommodate a situation in which + * several small chunks are written to the stream in rapid succession. Instead of + * immediately forwarding them to the underlying destination, `writable.cork()`buffers all the chunks until `writable.uncork()` is called, which will pass them + * all to `writable._writev()`, if present. This prevents a head-of-line blocking + * situation where data is being buffered while waiting for the first small chunk + * to be processed. However, use of `writable.cork()` without implementing`writable._writev()` may have an adverse effect on throughput. + * + * See also: `writable.uncork()`, `writable._writev()`. + * @since v0.11.2 + */ + cork(): void; + /** + * The `writable.uncork()` method flushes all data buffered since {@link cork} was called. + * + * When using `writable.cork()` and `writable.uncork()` to manage the buffering + * of writes to a stream, it is recommended that calls to `writable.uncork()` be + * deferred using `process.nextTick()`. Doing so allows batching of all`writable.write()` calls that occur within a given Node.js event loop phase. + * + * ```js + * stream.cork(); + * stream.write('some '); + * stream.write('data '); + * process.nextTick(() => stream.uncork()); + * ``` + * + * If the `writable.cork()` method is called multiple times on a stream, the + * same number of calls to `writable.uncork()` must be called to flush the buffered + * data. + * + * ```js + * stream.cork(); + * stream.write('some '); + * stream.cork(); + * stream.write('data '); + * process.nextTick(() => { + * stream.uncork(); + * // The data will not be flushed until uncork() is called a second time. + * stream.uncork(); + * }); + * ``` + * + * See also: `writable.cork()`. + * @since v0.11.2 + */ + uncork(): void; + /** + * Destroy the stream. Optionally emit an `'error'` event, and emit a `'close'`event (unless `emitClose` is set to `false`). After this call, the writable + * stream has ended and subsequent calls to `write()` or `end()` will result in + * an `ERR_STREAM_DESTROYED` error. + * This is a destructive and immediate way to destroy a stream. Previous calls to`write()` may not have drained, and may trigger an `ERR_STREAM_DESTROYED` error. + * Use `end()` instead of destroy if data should flush before close, or wait for + * the `'drain'` event before destroying the stream. + * + * Once `destroy()` has been called any further calls will be a no-op and no + * further errors except from `_destroy()` may be emitted as `'error'`. + * + * Implementors should not override this method, + * but instead implement `writable._destroy()`. + * @since v8.0.0 + * @param error Optional, an error to emit with `'error'` event. + */ + destroy(error?: Error): void; + /** + * Event emitter + * The defined events on documents including: + * 1. close + * 2. drain + * 3. error + * 4. finish + * 5. pipe + * 6. 
unpipe + */ + addListener(event: 'close', listener: () => void): this; + addListener(event: 'drain', listener: () => void): this; + addListener(event: 'error', listener: (err: Error) => void): this; + addListener(event: 'finish', listener: () => void): this; + addListener(event: 'pipe', listener: (src: Readable) => void): this; + addListener(event: 'unpipe', listener: (src: Readable) => void): this; + addListener(event: string | symbol, listener: (...args: any[]) => void): this; + emit(event: 'close'): boolean; + emit(event: 'drain'): boolean; + emit(event: 'error', err: Error): boolean; + emit(event: 'finish'): boolean; + emit(event: 'pipe', src: Readable): boolean; + emit(event: 'unpipe', src: Readable): boolean; + emit(event: string | symbol, ...args: any[]): boolean; + on(event: 'close', listener: () => void): this; + on(event: 'drain', listener: () => void): this; + on(event: 'error', listener: (err: Error) => void): this; + on(event: 'finish', listener: () => void): this; + on(event: 'pipe', listener: (src: Readable) => void): this; + on(event: 'unpipe', listener: (src: Readable) => void): this; + on(event: string | symbol, listener: (...args: any[]) => void): this; + once(event: 'close', listener: () => void): this; + once(event: 'drain', listener: () => void): this; + once(event: 'error', listener: (err: Error) => void): this; + once(event: 'finish', listener: () => void): this; + once(event: 'pipe', listener: (src: Readable) => void): this; + once(event: 'unpipe', listener: (src: Readable) => void): this; + once(event: string | symbol, listener: (...args: any[]) => void): this; + prependListener(event: 'close', listener: () => void): this; + prependListener(event: 'drain', listener: () => void): this; + prependListener(event: 'error', listener: (err: Error) => void): this; + prependListener(event: 'finish', listener: () => void): this; + prependListener(event: 'pipe', listener: (src: Readable) => void): this; + prependListener(event: 'unpipe', listener: (src: Readable) => void): this; + prependListener(event: string | symbol, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'close', listener: () => void): this; + prependOnceListener(event: 'drain', listener: () => void): this; + prependOnceListener(event: 'error', listener: (err: Error) => void): this; + prependOnceListener(event: 'finish', listener: () => void): this; + prependOnceListener(event: 'pipe', listener: (src: Readable) => void): this; + prependOnceListener(event: 'unpipe', listener: (src: Readable) => void): this; + prependOnceListener(event: string | symbol, listener: (...args: any[]) => void): this; + removeListener(event: 'close', listener: () => void): this; + removeListener(event: 'drain', listener: () => void): this; + removeListener(event: 'error', listener: (err: Error) => void): this; + removeListener(event: 'finish', listener: () => void): this; + removeListener(event: 'pipe', listener: (src: Readable) => void): this; + removeListener(event: 'unpipe', listener: (src: Readable) => void): this; + removeListener(event: string | symbol, listener: (...args: any[]) => void): this; + } + interface DuplexOptions extends ReadableOptions, WritableOptions { + allowHalfOpen?: boolean | undefined; + readableObjectMode?: boolean | undefined; + writableObjectMode?: boolean | undefined; + readableHighWaterMark?: number | undefined; + writableHighWaterMark?: number | undefined; + writableCorked?: number | undefined; + construct?(this: Duplex, callback: (error?: Error | null) => void): void; + 
read?(this: Duplex, size: number): void; + write?(this: Duplex, chunk: any, encoding: BufferEncoding, callback: (error?: Error | null) => void): void; + writev?( + this: Duplex, + chunks: Array<{ + chunk: any; + encoding: BufferEncoding; + }>, + callback: (error?: Error | null) => void + ): void; + final?(this: Duplex, callback: (error?: Error | null) => void): void; + destroy?(this: Duplex, error: Error | null, callback: (error: Error | null) => void): void; + } + /** + * Duplex streams are streams that implement both the `Readable` and `Writable` interfaces. + * + * Examples of `Duplex` streams include: + * + * * `TCP sockets` + * * `zlib streams` + * * `crypto streams` + * @since v0.9.4 + */ + class Duplex extends Readable implements Writable { + readonly writable: boolean; + readonly writableEnded: boolean; + readonly writableFinished: boolean; + readonly writableHighWaterMark: number; + readonly writableLength: number; + readonly writableObjectMode: boolean; + readonly writableCorked: number; + /** + * If `false` then the stream will automatically end the writable side when the + * readable side ends. Set initially by the `allowHalfOpen` constructor option, + * which defaults to `false`. + * + * This can be changed manually to change the half-open behavior of an existing`Duplex` stream instance, but must be changed before the `'end'` event is + * emitted. + * @since v0.9.4 + */ + allowHalfOpen: boolean; + constructor(opts?: DuplexOptions); + /** + * A utility method for creating duplex streams. + * + * - `Stream` converts writable stream into writable `Duplex` and readable stream + * to `Duplex`. + * - `Blob` converts into readable `Duplex`. + * - `string` converts into readable `Duplex`. + * - `ArrayBuffer` converts into readable `Duplex`. + * - `AsyncIterable` converts into a readable `Duplex`. Cannot yield `null`. + * - `AsyncGeneratorFunction` converts into a readable/writable transform + * `Duplex`. Must take a source `AsyncIterable` as first parameter. Cannot yield + * `null`. + * - `AsyncFunction` converts into a writable `Duplex`. Must return + * either `null` or `undefined` + * - `Object ({ writable, readable })` converts `readable` and + * `writable` into `Stream` and then combines them into `Duplex` where the + * `Duplex` will write to the `writable` and read from the `readable`. + * - `Promise` converts into readable `Duplex`. Value `null` is ignored. 
+ * + * @since v16.8.0 + */ + static from(src: Stream | Blob | ArrayBuffer | string | Iterable | AsyncIterable | AsyncGeneratorFunction | Promise | Object): Duplex; + _write(chunk: any, encoding: BufferEncoding, callback: (error?: Error | null) => void): void; + _writev?( + chunks: Array<{ + chunk: any; + encoding: BufferEncoding; + }>, + callback: (error?: Error | null) => void + ): void; + _destroy(error: Error | null, callback: (error: Error | null) => void): void; + _final(callback: (error?: Error | null) => void): void; + write(chunk: any, encoding?: BufferEncoding, cb?: (error: Error | null | undefined) => void): boolean; + write(chunk: any, cb?: (error: Error | null | undefined) => void): boolean; + setDefaultEncoding(encoding: BufferEncoding): this; + end(cb?: () => void): void; + end(chunk: any, cb?: () => void): void; + end(chunk: any, encoding?: BufferEncoding, cb?: () => void): void; + cork(): void; + uncork(): void; + } + type TransformCallback = (error?: Error | null, data?: any) => void; + interface TransformOptions extends DuplexOptions { + construct?(this: Transform, callback: (error?: Error | null) => void): void; + read?(this: Transform, size: number): void; + write?(this: Transform, chunk: any, encoding: BufferEncoding, callback: (error?: Error | null) => void): void; + writev?( + this: Transform, + chunks: Array<{ + chunk: any; + encoding: BufferEncoding; + }>, + callback: (error?: Error | null) => void + ): void; + final?(this: Transform, callback: (error?: Error | null) => void): void; + destroy?(this: Transform, error: Error | null, callback: (error: Error | null) => void): void; + transform?(this: Transform, chunk: any, encoding: BufferEncoding, callback: TransformCallback): void; + flush?(this: Transform, callback: TransformCallback): void; + } + /** + * Transform streams are `Duplex` streams where the output is in some way + * related to the input. Like all `Duplex` streams, `Transform` streams + * implement both the `Readable` and `Writable` interfaces. + * + * Examples of `Transform` streams include: + * + * * `zlib streams` + * * `crypto streams` + * @since v0.9.4 + */ + class Transform extends Duplex { + constructor(opts?: TransformOptions); + _transform(chunk: any, encoding: BufferEncoding, callback: TransformCallback): void; + _flush(callback: TransformCallback): void; + } + /** + * The `stream.PassThrough` class is a trivial implementation of a `Transform` stream that simply passes the input bytes across to the output. Its purpose is + * primarily for examples and testing, but there are some use cases where`stream.PassThrough` is useful as a building block for novel sorts of streams. + */ + class PassThrough extends Transform {} + /** + * Attaches an AbortSignal to a readable or writeable stream. This lets code + * control stream destruction using an `AbortController`. + * + * Calling `abort` on the `AbortController` corresponding to the passed`AbortSignal` will behave the same way as calling `.destroy(new AbortError())`on the stream. 
+ * + * ```js + * const fs = require('fs'); + * + * const controller = new AbortController(); + * const read = addAbortSignal( + * controller.signal, + * fs.createReadStream(('object.json')) + * ); + * // Later, abort the operation closing the stream + * controller.abort(); + * ``` + * + * Or using an `AbortSignal` with a readable stream as an async iterable: + * + * ```js + * const controller = new AbortController(); + * setTimeout(() => controller.abort(), 10_000); // set a timeout + * const stream = addAbortSignal( + * controller.signal, + * fs.createReadStream(('object.json')) + * ); + * (async () => { + * try { + * for await (const chunk of stream) { + * await process(chunk); + * } + * } catch (e) { + * if (e.name === 'AbortError') { + * // The operation was cancelled + * } else { + * throw e; + * } + * } + * })(); + * ``` + * @since v15.4.0 + * @param signal A signal representing possible cancellation + * @param stream a stream to attach a signal to + */ + function addAbortSignal(signal: AbortSignal, stream: T): T; + interface FinishedOptions extends Abortable { + error?: boolean | undefined; + readable?: boolean | undefined; + writable?: boolean | undefined; + } + /** + * A function to get notified when a stream is no longer readable, writable + * or has experienced an error or a premature close event. + * + * ```js + * const { finished } = require('stream'); + * + * const rs = fs.createReadStream('archive.tar'); + * + * finished(rs, (err) => { + * if (err) { + * console.error('Stream failed.', err); + * } else { + * console.log('Stream is done reading.'); + * } + * }); + * + * rs.resume(); // Drain the stream. + * ``` + * + * Especially useful in error handling scenarios where a stream is destroyed + * prematurely (like an aborted HTTP request), and will not emit `'end'`or `'finish'`. + * + * The `finished` API provides promise version: + * + * ```js + * const { finished } = require('stream/promises'); + * + * const rs = fs.createReadStream('archive.tar'); + * + * async function run() { + * await finished(rs); + * console.log('Stream is done reading.'); + * } + * + * run().catch(console.error); + * rs.resume(); // Drain the stream. + * ``` + * + * `stream.finished()` leaves dangling event listeners (in particular`'error'`, `'end'`, `'finish'` and `'close'`) after `callback` has been + * invoked. The reason for this is so that unexpected `'error'` events (due to + * incorrect stream implementations) do not cause unexpected crashes. + * If this is unwanted behavior then the returned cleanup function needs to be + * invoked in the callback: + * + * ```js + * const cleanup = finished(rs, (err) => { + * cleanup(); + * // ... + * }); + * ``` + * @since v10.0.0 + * @param stream A readable and/or writable stream. + * @param callback A callback function that takes an optional error argument. + * @return A cleanup function which removes all registered listeners. 
+     */
+    function finished(stream: NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream, options: FinishedOptions, callback: (err?: NodeJS.ErrnoException | null) => void): () => void;
+    function finished(stream: NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream, callback: (err?: NodeJS.ErrnoException | null) => void): () => void;
+    namespace finished {
+        function __promisify__(stream: NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream, options?: FinishedOptions): Promise<void>;
+    }
+    type PipelineSourceFunction<T> = () => Iterable<T> | AsyncIterable<T>;
+    type PipelineSource<T> = Iterable<T> | AsyncIterable<T> | NodeJS.ReadableStream | PipelineSourceFunction<T>;
+    type PipelineTransform<S extends PipelineTransformSource<any>, U> =
+        | NodeJS.ReadWriteStream
+        | ((source: S extends (...args: any[]) => Iterable<infer ST> | AsyncIterable<infer ST> ? AsyncIterable<ST> : S) => AsyncIterable<U>);
+    type PipelineTransformSource<T> = PipelineSource<T> | PipelineTransform<any, T>;
+    type PipelineDestinationIterableFunction<T> = (source: AsyncIterable<T>) => AsyncIterable<any>;
+    type PipelineDestinationPromiseFunction<T, P> = (source: AsyncIterable<T>) => Promise<P>;
+    type PipelineDestination<S extends PipelineTransformSource<any>, P> = S extends PipelineTransformSource<infer ST>
+        ? NodeJS.WritableStream | PipelineDestinationIterableFunction<ST> | PipelineDestinationPromiseFunction<ST, P>
+        : never;
+    type PipelineCallback<S extends PipelineDestination<any, any>> = S extends PipelineDestinationPromiseFunction<any, infer P>
+        ? (err: NodeJS.ErrnoException | null, value: P) => void
+        : (err: NodeJS.ErrnoException | null) => void;
+    type PipelinePromise<S extends PipelineDestination<any, any>> = S extends PipelineDestinationPromiseFunction<any, infer P> ? Promise<P>
: Promise; + interface PipelineOptions { + signal: AbortSignal; + } + /** + * A module method to pipe between streams and generators forwarding errors and + * properly cleaning up and provide a callback when the pipeline is complete. + * + * ```js + * const { pipeline } = require('stream'); + * const fs = require('fs'); + * const zlib = require('zlib'); + * + * // Use the pipeline API to easily pipe a series of streams + * // together and get notified when the pipeline is fully done. + * + * // A pipeline to gzip a potentially huge tar file efficiently: + * + * pipeline( + * fs.createReadStream('archive.tar'), + * zlib.createGzip(), + * fs.createWriteStream('archive.tar.gz'), + * (err) => { + * if (err) { + * console.error('Pipeline failed.', err); + * } else { + * console.log('Pipeline succeeded.'); + * } + * } + * ); + * ``` + * + * The `pipeline` API provides a promise version, which can also + * receive an options argument as the last parameter with a`signal` `AbortSignal` property. When the signal is aborted,`destroy` will be called on the underlying pipeline, with + * an`AbortError`. + * + * ```js + * const { pipeline } = require('stream/promises'); + * + * async function run() { + * await pipeline( + * fs.createReadStream('archive.tar'), + * zlib.createGzip(), + * fs.createWriteStream('archive.tar.gz') + * ); + * console.log('Pipeline succeeded.'); + * } + * + * run().catch(console.error); + * ``` + * + * To use an `AbortSignal`, pass it inside an options object, + * as the last argument: + * + * ```js + * const { pipeline } = require('stream/promises'); + * + * async function run() { + * const ac = new AbortController(); + * const signal = ac.signal; + * + * setTimeout(() => ac.abort(), 1); + * await pipeline( + * fs.createReadStream('archive.tar'), + * zlib.createGzip(), + * fs.createWriteStream('archive.tar.gz'), + * { signal }, + * ); + * } + * + * run().catch(console.error); // AbortError + * ``` + * + * The `pipeline` API also supports async generators: + * + * ```js + * const { pipeline } = require('stream/promises'); + * const fs = require('fs'); + * + * async function run() { + * await pipeline( + * fs.createReadStream('lowercase.txt'), + * async function* (source, signal) { + * source.setEncoding('utf8'); // Work with strings rather than `Buffer`s. + * for await (const chunk of source) { + * yield await processChunk(chunk, { signal }); + * } + * }, + * fs.createWriteStream('uppercase.txt') + * ); + * console.log('Pipeline succeeded.'); + * } + * + * run().catch(console.error); + * ``` + * + * Remember to handle the `signal` argument passed into the async generator. + * Especially in the case where the async generator is the source for the + * pipeline (i.e. first argument) or the pipeline will never complete. + * + * ```js + * const { pipeline } = require('stream/promises'); + * const fs = require('fs'); + * + * async function run() { + * await pipeline( + * async function * (signal) { + * await someLongRunningfn({ signal }); + * yield 'asd'; + * }, + * fs.createWriteStream('uppercase.txt') + * ); + * console.log('Pipeline succeeded.'); + * } + * + * run().catch(console.error); + * ``` + * + * `stream.pipeline()` will call `stream.destroy(err)` on all streams except: + * + * * `Readable` streams which have emitted `'end'` or `'close'`. + * * `Writable` streams which have emitted `'finish'` or `'close'`. + * + * `stream.pipeline()` leaves dangling event listeners on the streams + * after the `callback` has been invoked. 
In the case of reuse of streams after
+     * failure, this can cause event listener leaks and swallowed errors.
+     * @since v10.0.0
+     * @param callback Called when the pipeline is fully done.
+     */
+    function pipeline<A extends PipelineSource<any>, B extends PipelineDestination<A, any>>(
+        source: A,
+        destination: B,
+        callback?: PipelineCallback<B>
+    ): B extends NodeJS.WritableStream ? B : NodeJS.WritableStream;
+    function pipeline<A extends PipelineSource<any>, T1 extends PipelineTransform<A, any>, B extends PipelineDestination<T1, any>>(
+        source: A,
+        transform1: T1,
+        destination: B,
+        callback?: PipelineCallback<B>
+    ): B extends NodeJS.WritableStream ? B : NodeJS.WritableStream;
+    function pipeline<A extends PipelineSource<any>, T1 extends PipelineTransform<A, any>, T2 extends PipelineTransform<T1, any>, B extends PipelineDestination<T2, any>>(
+        source: A,
+        transform1: T1,
+        transform2: T2,
+        destination: B,
+        callback?: PipelineCallback<B>
+    ): B extends NodeJS.WritableStream ? B : NodeJS.WritableStream;
+    function pipeline<
+        A extends PipelineSource<any>,
+        T1 extends PipelineTransform<A, any>,
+        T2 extends PipelineTransform<T1, any>,
+        T3 extends PipelineTransform<T2, any>,
+        B extends PipelineDestination<T3, any>
+    >(source: A, transform1: T1, transform2: T2, transform3: T3, destination: B, callback?: PipelineCallback<B>): B extends NodeJS.WritableStream ? B : NodeJS.WritableStream;
+    function pipeline<
+        A extends PipelineSource<any>,
+        T1 extends PipelineTransform<A, any>,
+        T2 extends PipelineTransform<T1, any>,
+        T3 extends PipelineTransform<T2, any>,
+        T4 extends PipelineTransform<T3, any>,
+        B extends PipelineDestination<T4, any>
+    >(source: A, transform1: T1, transform2: T2, transform3: T3, transform4: T4, destination: B, callback?: PipelineCallback<B>): B extends NodeJS.WritableStream ? B : NodeJS.WritableStream;
+    function pipeline(
+        streams: ReadonlyArray<NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream>,
+        callback?: (err: NodeJS.ErrnoException | null) => void
+    ): NodeJS.WritableStream;
+    function pipeline(
+        stream1: NodeJS.ReadableStream,
+        stream2: NodeJS.ReadWriteStream | NodeJS.WritableStream,
+        ...streams: Array<NodeJS.ReadWriteStream | NodeJS.WritableStream | ((err: NodeJS.ErrnoException | null) => void)>
+    ): NodeJS.WritableStream;
+    namespace pipeline {
+        function __promisify__<A extends PipelineSource<any>, B extends PipelineDestination<A, any>>(source: A, destination: B, options?: PipelineOptions): PipelinePromise<B>;
+        function __promisify__<A extends PipelineSource<any>, T1 extends PipelineTransform<A, any>, B extends PipelineDestination<T1, any>>(
+            source: A,
+            transform1: T1,
+            destination: B,
+            options?: PipelineOptions
+        ): PipelinePromise<B>;
+        function __promisify__<A extends PipelineSource<any>, T1 extends PipelineTransform<A, any>, T2 extends PipelineTransform<T1, any>, B extends PipelineDestination<T2, any>>(
+            source: A,
+            transform1: T1,
+            transform2: T2,
+            destination: B,
+            options?: PipelineOptions
+        ): PipelinePromise<B>;
+        function __promisify__<
+            A extends PipelineSource<any>,
+            T1 extends PipelineTransform<A, any>,
+            T2 extends PipelineTransform<T1, any>,
+            T3 extends PipelineTransform<T2, any>,
+            B extends PipelineDestination<T3, any>
+        >(source: A, transform1: T1, transform2: T2, transform3: T3, destination: B, options?: PipelineOptions): PipelinePromise<B>;
+        function __promisify__<
+            A extends PipelineSource<any>,
+            T1 extends PipelineTransform<A, any>,
+            T2 extends PipelineTransform<T1, any>,
+            T3 extends PipelineTransform<T2, any>,
+            T4 extends PipelineTransform<T3, any>,
+            B extends PipelineDestination<T4, any>
+        >(source: A, transform1: T1, transform2: T2, transform3: T3, transform4: T4, destination: B, options?: PipelineOptions): PipelinePromise<B>;
+        function __promisify__(streams: ReadonlyArray<NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream>, options?: PipelineOptions): Promise<void>;
+        function __promisify__(
+            stream1: NodeJS.ReadableStream,
+            stream2: NodeJS.ReadWriteStream | NodeJS.WritableStream,
+            ...streams: Array<NodeJS.ReadWriteStream | NodeJS.WritableStream | PipelineOptions>
+        ): Promise<void>;
+    }
+    interface Pipe {
+        close(): void;
+        hasRef(): boolean;
+        ref(): void;
+        unref(): void;
+    }
+    const promises: typeof streamPromises;
+    const consumers:
typeof streamConsumers;
+    }
+    export = internal;
+}
+declare module 'node:stream' {
+    import stream = require('stream');
+    export = stream;
+}
diff --git a/node_modules/@types/node/stream/consumers.d.ts b/node_modules/@types/node/stream/consumers.d.ts
new file mode 100644
index 000000000..ce6c9bb78
--- /dev/null
+++ b/node_modules/@types/node/stream/consumers.d.ts
@@ -0,0 +1,24 @@
+// Duplicates of interface in lib.dom.ts.
+// Duplicated here rather than referencing lib.dom.ts because doing so causes lib.dom.ts to be loaded for "test-all"
+// Which in turn causes tests to pass that shouldn't pass.
+//
+// This interface is not, and should not be, exported.
+interface Blob {
+    readonly size: number;
+    readonly type: string;
+    arrayBuffer(): Promise<ArrayBuffer>;
+    slice(start?: number, end?: number, contentType?: string): Blob;
+    stream(): NodeJS.ReadableStream;
+    text(): Promise<string>;
+}
+declare module 'stream/consumers' {
+    import { Readable } from 'node:stream';
+    function buffer(stream: NodeJS.ReadableStream | Readable | AsyncIterator<any>): Promise<Buffer>;
+    function text(stream: NodeJS.ReadableStream | Readable | AsyncIterator<any>): Promise<string>;
+    function arrayBuffer(stream: NodeJS.ReadableStream | Readable | AsyncIterator<any>): Promise<ArrayBuffer>;
+    function blob(stream: NodeJS.ReadableStream | Readable | AsyncIterator<any>): Promise<Blob>;
+    function json(stream: NodeJS.ReadableStream | Readable | AsyncIterator<any>): Promise<unknown>;
+}
+declare module 'node:stream/consumers' {
+    export * from 'stream/consumers';
+}
diff --git a/node_modules/@types/node/stream/promises.d.ts b/node_modules/@types/node/stream/promises.d.ts
new file mode 100644
index 000000000..b427073de
--- /dev/null
+++ b/node_modules/@types/node/stream/promises.d.ts
@@ -0,0 +1,42 @@
+declare module 'stream/promises' {
+    import { FinishedOptions, PipelineSource, PipelineTransform, PipelineDestination, PipelinePromise, PipelineOptions } from 'node:stream';
+    function finished(stream: NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream, options?: FinishedOptions): Promise<void>;
+    function pipeline<A extends PipelineSource<any>, B extends PipelineDestination<A, any>>(source: A, destination: B, options?: PipelineOptions): PipelinePromise<B>;
+    function pipeline<A extends PipelineSource<any>, T1 extends PipelineTransform<A, any>, B extends PipelineDestination<T1, any>>(
+        source: A,
+        transform1: T1,
+        destination: B,
+        options?: PipelineOptions
+    ): PipelinePromise<B>;
+    function pipeline<A extends PipelineSource<any>, T1 extends PipelineTransform<A, any>, T2 extends PipelineTransform<T1, any>, B extends PipelineDestination<T2, any>>(
+        source: A,
+        transform1: T1,
+        transform2: T2,
+        destination: B,
+        options?: PipelineOptions
+    ): PipelinePromise<B>;
+    function pipeline<
+        A extends PipelineSource<any>,
+        T1 extends PipelineTransform<A, any>,
+        T2 extends PipelineTransform<T1, any>,
+        T3 extends PipelineTransform<T2, any>,
+        B extends PipelineDestination<T3, any>
+    >(source: A, transform1: T1, transform2: T2, transform3: T3, destination: B, options?: PipelineOptions): PipelinePromise<B>;
+    function pipeline<
+        A extends PipelineSource<any>,
+        T1 extends PipelineTransform<A, any>,
+        T2 extends PipelineTransform<T1, any>,
+        T3 extends PipelineTransform<T2, any>,
+        T4 extends PipelineTransform<T3, any>,
+        B extends PipelineDestination<T4, any>
+    >(source: A, transform1: T1, transform2: T2, transform3: T3, transform4: T4, destination: B, options?: PipelineOptions): PipelinePromise<B>;
+    function pipeline(streams: ReadonlyArray<NodeJS.ReadableStream | NodeJS.WritableStream | NodeJS.ReadWriteStream>, options?: PipelineOptions): Promise<void>;
+    function pipeline(
+        stream1: NodeJS.ReadableStream,
+        stream2: NodeJS.ReadWriteStream | NodeJS.WritableStream,
+        ...streams: Array<NodeJS.ReadWriteStream | NodeJS.WritableStream | PipelineOptions>
+    ): Promise<void>;
+}
+declare module 'node:stream/promises' {
+    export * from 'stream/promises';
+}
diff --git
a/node_modules/@types/node/stream/web.d.ts b/node_modules/@types/node/stream/web.d.ts new file mode 100644 index 000000000..a9d75c19c --- /dev/null +++ b/node_modules/@types/node/stream/web.d.ts @@ -0,0 +1,6 @@ +declare module 'stream/web' { + // stub module, pending copy&paste from .d.ts or manual impl +} +declare module 'node:stream/web' { + export * from 'stream/web'; +} diff --git a/node_modules/@types/node/string_decoder.d.ts b/node_modules/@types/node/string_decoder.d.ts new file mode 100644 index 000000000..584e0c5d2 --- /dev/null +++ b/node_modules/@types/node/string_decoder.d.ts @@ -0,0 +1,67 @@ +/** + * The `string_decoder` module provides an API for decoding `Buffer` objects into + * strings in a manner that preserves encoded multi-byte UTF-8 and UTF-16 + * characters. It can be accessed using: + * + * ```js + * const { StringDecoder } = require('string_decoder'); + * ``` + * + * The following example shows the basic use of the `StringDecoder` class. + * + * ```js + * const { StringDecoder } = require('string_decoder'); + * const decoder = new StringDecoder('utf8'); + * + * const cent = Buffer.from([0xC2, 0xA2]); + * console.log(decoder.write(cent)); + * + * const euro = Buffer.from([0xE2, 0x82, 0xAC]); + * console.log(decoder.write(euro)); + * ``` + * + * When a `Buffer` instance is written to the `StringDecoder` instance, an + * internal buffer is used to ensure that the decoded string does not contain + * any incomplete multibyte characters. These are held in the buffer until the + * next call to `stringDecoder.write()` or until `stringDecoder.end()` is called. + * + * In the following example, the three UTF-8 encoded bytes of the European Euro + * symbol (`€`) are written over three separate operations: + * + * ```js + * const { StringDecoder } = require('string_decoder'); + * const decoder = new StringDecoder('utf8'); + * + * decoder.write(Buffer.from([0xE2])); + * decoder.write(Buffer.from([0x82])); + * console.log(decoder.end(Buffer.from([0xAC]))); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/string_decoder.js) + */ +declare module 'string_decoder' { + class StringDecoder { + constructor(encoding?: BufferEncoding); + /** + * Returns a decoded string, ensuring that any incomplete multibyte characters at + * the end of the `Buffer`, or `TypedArray`, or `DataView` are omitted from the + * returned string and stored in an internal buffer for the next call to`stringDecoder.write()` or `stringDecoder.end()`. + * @since v0.1.99 + * @param buffer A `Buffer`, or `TypedArray`, or `DataView` containing the bytes to decode. + */ + write(buffer: Buffer): string; + /** + * Returns any remaining input stored in the internal buffer as a string. Bytes + * representing incomplete UTF-8 and UTF-16 characters will be replaced with + * substitution characters appropriate for the character encoding. + * + * If the `buffer` argument is provided, one final call to `stringDecoder.write()`is performed before returning the remaining input. + * After `end()` is called, the `stringDecoder` object can be reused for new input. + * @since v0.9.3 + * @param buffer A `Buffer`, or `TypedArray`, or `DataView` containing the bytes to decode. 
+ */ + end(buffer?: Buffer): string; + } +} +declare module 'node:string_decoder' { + export * from 'string_decoder'; +} diff --git a/node_modules/@types/node/timers.d.ts b/node_modules/@types/node/timers.d.ts new file mode 100644 index 000000000..1768d7fb9 --- /dev/null +++ b/node_modules/@types/node/timers.d.ts @@ -0,0 +1,94 @@ +/** + * The `timer` module exposes a global API for scheduling functions to + * be called at some future period of time. Because the timer functions are + * globals, there is no need to call `require('timers')` to use the API. + * + * The timer functions within Node.js implement a similar API as the timers API + * provided by Web Browsers but use a different internal implementation that is + * built around the Node.js [Event Loop](https://nodejs.org/en/docs/guides/event-loop-timers-and-nexttick/#setimmediate-vs-settimeout). + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/timers.js) + */ +declare module 'timers' { + import { Abortable } from 'node:events'; + import { setTimeout as setTimeoutPromise, setImmediate as setImmediatePromise, setInterval as setIntervalPromise } from 'node:timers/promises'; + interface TimerOptions extends Abortable { + /** + * Set to `false` to indicate that the scheduled `Timeout` + * should not require the Node.js event loop to remain active. + * @default true + */ + ref?: boolean | undefined; + } + let setTimeout: typeof global.setTimeout; + let clearTimeout: typeof global.clearTimeout; + let setInterval: typeof global.setInterval; + let clearInterval: typeof global.clearInterval; + let setImmediate: typeof global.setImmediate; + let clearImmediate: typeof global.clearImmediate; + global { + namespace NodeJS { + // compatibility with older typings + interface Timer extends RefCounted { + hasRef(): boolean; + refresh(): this; + [Symbol.toPrimitive](): number; + } + interface Immediate extends RefCounted { + /** + * If true, the `Immediate` object will keep the Node.js event loop active. + * @since v11.0.0 + */ + hasRef(): boolean; + _onImmediate: Function; // to distinguish it from the Timeout class + } + interface Timeout extends Timer { + /** + * If true, the `Timeout` object will keep the Node.js event loop active. + * @since v11.0.0 + */ + hasRef(): boolean; + /** + * Sets the timer's start time to the current time, and reschedules the timer to + * call its callback at the previously specified duration adjusted to the current + * time. This is useful for refreshing a timer without allocating a new + * JavaScript object. + * + * Using this on a timer that has already called its callback will reactivate the + * timer. 
+ * @since v10.2.0 + * @return a reference to `timeout` + */ + refresh(): this; + [Symbol.toPrimitive](): number; + } + } + function setTimeout(callback: (...args: TArgs) => void, ms?: number, ...args: TArgs): NodeJS.Timeout; + // util.promisify no rest args compability + // tslint:disable-next-line void-return + function setTimeout(callback: (args: void) => void, ms?: number): NodeJS.Timeout; + namespace setTimeout { + const __promisify__: typeof setTimeoutPromise; + } + function clearTimeout(timeoutId: NodeJS.Timeout): void; + function setInterval(callback: (...args: TArgs) => void, ms?: number, ...args: TArgs): NodeJS.Timer; + // util.promisify no rest args compability + // tslint:disable-next-line void-return + function setInterval(callback: (args: void) => void, ms?: number): NodeJS.Timer; + namespace setInterval { + const __promisify__: typeof setIntervalPromise; + } + function clearInterval(intervalId: NodeJS.Timeout): void; + function setImmediate(callback: (...args: TArgs) => void, ...args: TArgs): NodeJS.Immediate; + // util.promisify no rest args compability + // tslint:disable-next-line void-return + function setImmediate(callback: (args: void) => void): NodeJS.Immediate; + namespace setImmediate { + const __promisify__: typeof setImmediatePromise; + } + function clearImmediate(immediateId: NodeJS.Immediate): void; + function queueMicrotask(callback: () => void): void; + } +} +declare module 'node:timers' { + export * from 'timers'; +} diff --git a/node_modules/@types/node/timers/promises.d.ts b/node_modules/@types/node/timers/promises.d.ts new file mode 100644 index 000000000..fd778880e --- /dev/null +++ b/node_modules/@types/node/timers/promises.d.ts @@ -0,0 +1,68 @@ +/** + * The `timers/promises` API provides an alternative set of timer functions + * that return `Promise` objects. The API is accessible via`require('timers/promises')`. + * + * ```js + * import { + * setTimeout, + * setImmediate, + * setInterval, + * } from 'timers/promises'; + * ``` + * @since v15.0.0 + */ +declare module 'timers/promises' { + import { TimerOptions } from 'node:timers'; + /** + * ```js + * import { + * setTimeout, + * } from 'timers/promises'; + * + * const res = await setTimeout(100, 'result'); + * + * console.log(res); // Prints 'result' + * ``` + * @since v15.0.0 + * @param [delay=1] The number of milliseconds to wait before fulfilling the promise. + * @param value A value with which the promise is fulfilled. + */ + function setTimeout(delay?: number, value?: T, options?: TimerOptions): Promise; + /** + * ```js + * import { + * setImmediate, + * } from 'timers/promises'; + * + * const res = await setImmediate('result'); + * + * console.log(res); // Prints 'result' + * ``` + * @since v15.0.0 + * @param value A value with which the promise is fulfilled. + */ + function setImmediate(value?: T, options?: TimerOptions): Promise; + /** + * Returns an async iterator that generates values in an interval of `delay` ms. 
+ * + * ```js + * import { + * setInterval, + * } from 'timers/promises'; + * + * const interval = 100; + * for await (const startTime of setInterval(interval, Date.now())) { + * const now = Date.now(); + * console.log(now); + * if ((now - startTime) > 1000) + * break; + * } + * console.log(Date.now()); + * ``` + * @since v15.9.0 + */ + function setInterval(delay?: number, value?: T, options?: TimerOptions): AsyncIterable; +} +declare module 'node:timers/promises' { + export * from 'timers/promises'; +} diff --git a/node_modules/@types/node/tls.d.ts b/node_modules/@types/node/tls.d.ts new file mode 100644 index 000000000..b1f4c8707 --- /dev/null +++ b/node_modules/@types/node/tls.d.ts @@ -0,0 +1,1019 @@ +/** + * The `tls` module provides an implementation of the Transport Layer Security + * (TLS) and Secure Socket Layer (SSL) protocols that is built on top of OpenSSL. + * The module can be accessed using: + * + * ```js + * const tls = require('tls'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/tls.js) + */ +declare module 'tls' { + import { X509Certificate } from 'node:crypto'; + import * as net from 'node:net'; + const CLIENT_RENEG_LIMIT: number; + const CLIENT_RENEG_WINDOW: number; + interface Certificate { + /** + * Country code. + */ + C: string; + /** + * Street. + */ + ST: string; + /** + * Locality. + */ + L: string; + /** + * Organization. + */ + O: string; + /** + * Organizational unit. + */ + OU: string; + /** + * Common name. + */ + CN: string; + } + interface PeerCertificate { + subject: Certificate; + issuer: Certificate; + subjectaltname: string; + infoAccess: NodeJS.Dict; + modulus: string; + exponent: string; + valid_from: string; + valid_to: string; + fingerprint: string; + fingerprint256: string; + ext_key_usage: string[]; + serialNumber: string; + raw: Buffer; + } + interface DetailedPeerCertificate extends PeerCertificate { + issuerCertificate: DetailedPeerCertificate; + } + interface CipherNameAndProtocol { + /** + * The cipher name. + */ + name: string; + /** + * SSL/TLS protocol version. + */ + version: string; + /** + * IETF name for the cipher suite. + */ + standardName: string; + } + interface EphemeralKeyInfo { + /** + * The supported types are 'DH' and 'ECDH'. + */ + type: string; + /** + * The name property is available only when type is 'ECDH'. + */ + name?: string | undefined; + /** + * The size of parameter of an ephemeral key exchange. + */ + size: number; + } + interface KeyObject { + /** + * Private keys in PEM format. + */ + pem: string | Buffer; + /** + * Optional passphrase. + */ + passphrase?: string | undefined; + } + interface PxfObject { + /** + * PFX or PKCS12 encoded private key and certificate chain. + */ + buf: string | Buffer; + /** + * Optional passphrase. + */ + passphrase?: string | undefined; + } + interface TLSSocketOptions extends SecureContextOptions, CommonConnectionOptions { + /** + * If true the TLS socket will be instantiated in server-mode. + * Defaults to false. + */ + isServer?: boolean | undefined; + /** + * An optional net.Server instance. + */ + server?: net.Server | undefined; + /** + * An optional Buffer instance containing a TLS session. 
+ */ + session?: Buffer | undefined; + /** + * If true, specifies that the OCSP status request extension will be + * added to the client hello and an 'OCSPResponse' event will be + * emitted on the socket before establishing a secure communication + */ + requestOCSP?: boolean | undefined; + } + /** + * Performs transparent encryption of written data and all required TLS + * negotiation. + * + * Instances of `tls.TLSSocket` implement the duplex `Stream` interface. + * + * Methods that return TLS connection metadata (e.g.{@link TLSSocket.getPeerCertificate} will only return data while the + * connection is open. + * @since v0.11.4 + */ + class TLSSocket extends net.Socket { + /** + * Construct a new tls.TLSSocket object from an existing TCP socket. + */ + constructor(socket: net.Socket, options?: TLSSocketOptions); + /** + * Returns `true` if the peer certificate was signed by one of the CAs specified + * when creating the `tls.TLSSocket` instance, otherwise `false`. + * @since v0.11.4 + */ + authorized: boolean; + /** + * Returns the reason why the peer's certificate was not been verified. This + * property is set only when `tlsSocket.authorized === false`. + * @since v0.11.4 + */ + authorizationError: Error; + /** + * Always returns `true`. This may be used to distinguish TLS sockets from regular`net.Socket` instances. + * @since v0.11.4 + */ + encrypted: boolean; + /** + * String containing the selected ALPN protocol. + * Before a handshake has completed, this value is always null. + * When a handshake is completed but not ALPN protocol was selected, tlsSocket.alpnProtocol equals false. + */ + alpnProtocol: string | false | null; + /** + * Returns an object representing the local certificate. The returned object has + * some properties corresponding to the fields of the certificate. + * + * See {@link TLSSocket.getPeerCertificate} for an example of the certificate + * structure. + * + * If there is no local certificate, an empty object will be returned. If the + * socket has been destroyed, `null` will be returned. + * @since v11.2.0 + */ + getCertificate(): PeerCertificate | object | null; + /** + * Returns an object containing information on the negotiated cipher suite. + * + * For example: + * + * ```json + * { + * "name": "AES128-SHA256", + * "standardName": "TLS_RSA_WITH_AES_128_CBC_SHA256", + * "version": "TLSv1.2" + * } + * ``` + * + * See [SSL\_CIPHER\_get\_name](https://www.openssl.org/docs/man1.1.1/man3/SSL_CIPHER_get_name.html) for more information. + * @since v0.11.4 + */ + getCipher(): CipherNameAndProtocol; + /** + * Returns an object representing the type, name, and size of parameter of + * an ephemeral key exchange in `perfect forward secrecy` on a client + * connection. It returns an empty object when the key exchange is not + * ephemeral. As this is only supported on a client socket; `null` is returned + * if called on a server socket. The supported types are `'DH'` and `'ECDH'`. The`name` property is available only when type is `'ECDH'`. + * + * For example: `{ type: 'ECDH', name: 'prime256v1', size: 256 }`. + * @since v5.0.0 + */ + getEphemeralKeyInfo(): EphemeralKeyInfo | object | null; + /** + * As the `Finished` messages are message digests of the complete handshake + * (with a total of 192 bits for TLS 1.0 and more for SSL 3.0), they can + * be used for external authentication procedures when the authentication + * provided by SSL/TLS is not desired or is not enough. 
+ * + * Corresponds to the `SSL_get_finished` routine in OpenSSL and may be used + * to implement the `tls-unique` channel binding from [RFC 5929](https://tools.ietf.org/html/rfc5929). + * @since v9.9.0 + * @return The latest `Finished` message that has been sent to the socket as part of a SSL/TLS handshake, or `undefined` if no `Finished` message has been sent yet. + */ + getFinished(): Buffer | undefined; + /** + * Returns an object representing the peer's certificate. If the peer does not + * provide a certificate, an empty object will be returned. If the socket has been + * destroyed, `null` will be returned. + * + * If the full certificate chain was requested, each certificate will include an`issuerCertificate` property containing an object representing its issuer's + * certificate. + * @since v0.11.4 + * @param detailed Include the full certificate chain if `true`, otherwise include just the peer's certificate. + * @return A certificate object. + */ + getPeerCertificate(detailed: true): DetailedPeerCertificate; + getPeerCertificate(detailed?: false): PeerCertificate; + getPeerCertificate(detailed?: boolean): PeerCertificate | DetailedPeerCertificate; + /** + * As the `Finished` messages are message digests of the complete handshake + * (with a total of 192 bits for TLS 1.0 and more for SSL 3.0), they can + * be used for external authentication procedures when the authentication + * provided by SSL/TLS is not desired or is not enough. + * + * Corresponds to the `SSL_get_peer_finished` routine in OpenSSL and may be used + * to implement the `tls-unique` channel binding from [RFC 5929](https://tools.ietf.org/html/rfc5929). + * @since v9.9.0 + * @return The latest `Finished` message that is expected or has actually been received from the socket as part of a SSL/TLS handshake, or `undefined` if there is no `Finished` message so + * far. + */ + getPeerFinished(): Buffer | undefined; + /** + * Returns a string containing the negotiated SSL/TLS protocol version of the + * current connection. The value `'unknown'` will be returned for connected + * sockets that have not completed the handshaking process. The value `null` will + * be returned for server sockets or disconnected client sockets. + * + * Protocol versions are: + * + * * `'SSLv3'` + * * `'TLSv1'` + * * `'TLSv1.1'` + * * `'TLSv1.2'` + * * `'TLSv1.3'` + * + * See the OpenSSL [`SSL_get_version`](https://www.openssl.org/docs/man1.1.1/man3/SSL_get_version.html) documentation for more information. + * @since v5.7.0 + */ + getProtocol(): string | null; + /** + * Returns the TLS session data or `undefined` if no session was + * negotiated. On the client, the data can be provided to the `session` option of {@link connect} to resume the connection. On the server, it may be useful + * for debugging. + * + * See `Session Resumption` for more information. + * + * Note: `getSession()` works only for TLSv1.2 and below. For TLSv1.3, applications + * must use the `'session'` event (it also works for TLSv1.2 and below). + * @since v0.11.4 + */ + getSession(): Buffer | undefined; + /** + * See [SSL\_get\_shared\_sigalgs](https://www.openssl.org/docs/man1.1.1/man3/SSL_get_shared_sigalgs.html) for more information. + * @since v12.11.0 + * @return List of signature algorithms shared between the server and the client in the order of decreasing preference. + */ + getSharedSigalgs(): string[]; + /** + * For a client, returns the TLS session ticket if one is available, or`undefined`. For a server, always returns `undefined`. 
+ * + * It may be useful for debugging. + * + * See `Session Resumption` for more information. + * @since v0.11.4 + */ + getTLSTicket(): Buffer | undefined; + /** + * See `Session Resumption` for more information. + * @since v0.5.6 + * @return `true` if the session was reused, `false` otherwise. + */ + isSessionReused(): boolean; + /** + * The `tlsSocket.renegotiate()` method initiates a TLS renegotiation process. + * Upon completion, the `callback` function will be passed a single argument + * that is either an `Error` (if the request failed) or `null`. + * + * This method can be used to request a peer's certificate after the secure + * connection has been established. + * + * When running as the server, the socket will be destroyed with an error after`handshakeTimeout` timeout. + * + * For TLSv1.3, renegotiation cannot be initiated, it is not supported by the + * protocol. + * @since v0.11.8 + * @param callback If `renegotiate()` returned `true`, callback is attached once to the `'secure'` event. If `renegotiate()` returned `false`, `callback` will be called in the next tick with + * an error, unless the `tlsSocket` has been destroyed, in which case `callback` will not be called at all. + * @return `true` if renegotiation was initiated, `false` otherwise. + */ + renegotiate( + options: { + rejectUnauthorized?: boolean | undefined; + requestCert?: boolean | undefined; + }, + callback: (err: Error | null) => void + ): undefined | boolean; + /** + * The `tlsSocket.setMaxSendFragment()` method sets the maximum TLS fragment size. + * Returns `true` if setting the limit succeeded; `false` otherwise. + * + * Smaller fragment sizes decrease the buffering latency on the client: larger + * fragments are buffered by the TLS layer until the entire fragment is received + * and its integrity is verified; large fragments can span multiple roundtrips + * and their processing can be delayed due to packet loss or reordering. However, + * smaller fragments add extra TLS framing bytes and CPU overhead, which may + * decrease overall server throughput. + * @since v0.11.11 + * @param [size=16384] The maximum TLS fragment size. The maximum value is `16384`. + */ + setMaxSendFragment(size: number): boolean; + /** + * Disables TLS renegotiation for this `TLSSocket` instance. Once called, attempts + * to renegotiate will trigger an `'error'` event on the `TLSSocket`. + * @since v8.4.0 + */ + disableRenegotiation(): void; + /** + * When enabled, TLS packet trace information is written to `stderr`. This can be + * used to debug TLS connection problems. + * + * Note: The format of the output is identical to the output of `openssl s_client -trace` or `openssl s_server -trace`. While it is produced by OpenSSL's`SSL_trace()` function, the format is + * undocumented, can change without notice, + * and should not be relied on. + * @since v12.2.0 + */ + enableTrace(): void; + /** + * Returns the peer certificate as an `X509Certificate` object. + * + * If there is no peer certificate, or the socket has been destroyed,`undefined` will be returned. + * @since v15.9.0 + */ + getPeerX509Certificate(): X509Certificate | undefined; + /** + * Returns the local certificate as an `X509Certificate` object. + * + * If there is no local certificate, or the socket has been destroyed,`undefined` will be returned. 
+ * @since v15.9.0 + */ + getX509Certificate(): X509Certificate | undefined; + /** + * Keying material is used for validations to prevent different kind of attacks in + * network protocols, for example in the specifications of IEEE 802.1X. + * + * Example + * + * ```js + * const keyingMaterial = tlsSocket.exportKeyingMaterial( + * 128, + * 'client finished'); + * + * + * Example return value of keyingMaterial: + * + * + * ``` + * + * See the OpenSSL [`SSL_export_keying_material`](https://www.openssl.org/docs/man1.1.1/man3/SSL_export_keying_material.html) documentation for more + * information. + * @since v13.10.0, v12.17.0 + * @param length number of bytes to retrieve from keying material + * @param label an application specific label, typically this will be a value from the [IANA Exporter Label + * Registry](https://www.iana.org/assignments/tls-parameters/tls-parameters.xhtml#exporter-labels). + * @param context Optionally provide a context. + * @return requested bytes of the keying material + */ + exportKeyingMaterial(length: number, label: string, context: Buffer): Buffer; + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'OCSPResponse', listener: (response: Buffer) => void): this; + addListener(event: 'secureConnect', listener: () => void): this; + addListener(event: 'session', listener: (session: Buffer) => void): this; + addListener(event: 'keylog', listener: (line: Buffer) => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'OCSPResponse', response: Buffer): boolean; + emit(event: 'secureConnect'): boolean; + emit(event: 'session', session: Buffer): boolean; + emit(event: 'keylog', line: Buffer): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'OCSPResponse', listener: (response: Buffer) => void): this; + on(event: 'secureConnect', listener: () => void): this; + on(event: 'session', listener: (session: Buffer) => void): this; + on(event: 'keylog', listener: (line: Buffer) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'OCSPResponse', listener: (response: Buffer) => void): this; + once(event: 'secureConnect', listener: () => void): this; + once(event: 'session', listener: (session: Buffer) => void): this; + once(event: 'keylog', listener: (line: Buffer) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'OCSPResponse', listener: (response: Buffer) => void): this; + prependListener(event: 'secureConnect', listener: () => void): this; + prependListener(event: 'session', listener: (session: Buffer) => void): this; + prependListener(event: 'keylog', listener: (line: Buffer) => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'OCSPResponse', listener: (response: Buffer) => void): this; + prependOnceListener(event: 'secureConnect', listener: () => void): this; + prependOnceListener(event: 'session', listener: (session: Buffer) => void): this; + prependOnceListener(event: 'keylog', listener: (line: Buffer) => void): this; + } + interface CommonConnectionOptions { + /** + * An optional TLS context object from tls.createSecureContext() + */ + secureContext?: SecureContext | undefined; + /** + * When enabled, TLS packet trace information is written to `stderr`. This can be + * used to debug TLS connection problems. 
+ * @default false + */ + enableTrace?: boolean | undefined; + /** + * If true the server will request a certificate from clients that + * connect and attempt to verify that certificate. Defaults to + * false. + */ + requestCert?: boolean | undefined; + /** + * An array of strings or a Buffer naming possible ALPN protocols. + * (Protocols should be ordered by their priority.) + */ + ALPNProtocols?: string[] | Uint8Array[] | Uint8Array | undefined; + /** + * SNICallback(servername, cb) A function that will be + * called if the client supports SNI TLS extension. Two arguments + * will be passed when called: servername and cb. SNICallback should + * invoke cb(null, ctx), where ctx is a SecureContext instance. + * (tls.createSecureContext(...) can be used to get a proper + * SecureContext.) If SNICallback wasn't provided the default callback + * with high-level API will be used (see below). + */ + SNICallback?: ((servername: string, cb: (err: Error | null, ctx?: SecureContext) => void) => void) | undefined; + /** + * If true the server will reject any connection which is not + * authorized with the list of supplied CAs. This option only has an + * effect if requestCert is true. + * @default true + */ + rejectUnauthorized?: boolean | undefined; + } + interface TlsOptions extends SecureContextOptions, CommonConnectionOptions, net.ServerOpts { + /** + * Abort the connection if the SSL/TLS handshake does not finish in the + * specified number of milliseconds. A 'tlsClientError' is emitted on + * the tls.Server object whenever a handshake times out. Default: + * 120000 (120 seconds). + */ + handshakeTimeout?: number | undefined; + /** + * The number of seconds after which a TLS session created by the + * server will no longer be resumable. See Session Resumption for more + * information. Default: 300. + */ + sessionTimeout?: number | undefined; + /** + * 48-bytes of cryptographically strong pseudo-random data. + */ + ticketKeys?: Buffer | undefined; + /** + * + * @param socket + * @param identity identity parameter sent from the client. + * @return pre-shared key that must either be + * a buffer or `null` to stop the negotiation process. Returned PSK must be + * compatible with the selected cipher's digest. + * + * When negotiating TLS-PSK (pre-shared keys), this function is called + * with the identity provided by the client. + * If the return value is `null` the negotiation process will stop and an + * "unknown_psk_identity" alert message will be sent to the other party. + * If the server wishes to hide the fact that the PSK identity was not known, + * the callback must provide some random data as `psk` to make the connection + * fail with "decrypt_error" before negotiation is finished. + * PSK ciphers are disabled by default, and using TLS-PSK thus + * requires explicitly specifying a cipher suite with the `ciphers` option. + * More information can be found in the RFC 4279. + */ + pskCallback?(socket: TLSSocket, identity: string): DataView | NodeJS.TypedArray | null; + /** + * hint to send to a client to help + * with selecting the identity during TLS-PSK negotiation. Will be ignored + * in TLS 1.3. Upon failing to set pskIdentityHint `tlsClientError` will be + * emitted with `ERR_TLS_PSK_SET_IDENTIY_HINT_FAILED` code. 
+ */ + pskIdentityHint?: string | undefined; + } + interface PSKCallbackNegotation { + psk: DataView | NodeJS.TypedArray; + identity: string; + } + interface ConnectionOptions extends SecureContextOptions, CommonConnectionOptions { + host?: string | undefined; + port?: number | undefined; + path?: string | undefined; // Creates unix socket connection to path. If this option is specified, `host` and `port` are ignored. + socket?: net.Socket | undefined; // Establish secure connection on a given socket rather than creating a new socket + checkServerIdentity?: typeof checkServerIdentity | undefined; + servername?: string | undefined; // SNI TLS Extension + session?: Buffer | undefined; + minDHSize?: number | undefined; + lookup?: net.LookupFunction | undefined; + timeout?: number | undefined; + /** + * When negotiating TLS-PSK (pre-shared keys), this function is called + * with optional identity `hint` provided by the server or `null` + * in case of TLS 1.3 where `hint` was removed. + * It will be necessary to provide a custom `tls.checkServerIdentity()` + * for the connection as the default one will try to check hostname/IP + * of the server against the certificate but that's not applicable for PSK + * because there won't be a certificate present. + * More information can be found in the RFC 4279. + * + * @param hint message sent from the server to help client + * decide which identity to use during negotiation. + * Always `null` if TLS 1.3 is used. + * @returns Return `null` to stop the negotiation process. `psk` must be + * compatible with the selected cipher's digest. + * `identity` must use UTF-8 encoding. + */ + pskCallback?(hint: string | null): PSKCallbackNegotation | null; + } + /** + * Accepts encrypted connections using TLS or SSL. + * @since v0.3.2 + */ + class Server extends net.Server { + constructor(secureConnectionListener?: (socket: TLSSocket) => void); + constructor(options: TlsOptions, secureConnectionListener?: (socket: TLSSocket) => void); + /** + * The `server.addContext()` method adds a secure context that will be used if + * the client request's SNI name matches the supplied `hostname` (or wildcard). + * + * When there are multiple matching contexts, the most recently added one is + * used. + * @since v0.5.3 + * @param hostname A SNI host name or wildcard (e.g. `'*'`) + * @param context An object containing any of the possible properties from the {@link createSecureContext} `options` arguments (e.g. `key`, `cert`, `ca`, etc). + */ + addContext(hostname: string, context: SecureContextOptions): void; + /** + * Returns the session ticket keys. + * + * See `Session Resumption` for more information. + * @since v3.0.0 + * @return A 48-byte buffer containing the session ticket keys. + */ + getTicketKeys(): Buffer; + /** + * The `server.setSecureContext()` method replaces the secure context of an + * existing server. Existing connections to the server are not interrupted. + * @since v11.0.0 + * @param options An object containing any of the possible properties from the {@link createSecureContext} `options` arguments (e.g. `key`, `cert`, `ca`, etc). + */ + setSecureContext(options: SecureContextOptions): void; + /** + * Sets the session ticket keys. + * + * Changes to the ticket keys are effective only for future server connections. + * Existing or currently pending server connections will use the previous keys. + * + * See `Session Resumption` for more information. + * @since v3.0.0 + * @param keys A 48-byte buffer containing the session ticket keys. 
+ */ + setTicketKeys(keys: Buffer): void; + /** + * events.EventEmitter + * 1. tlsClientError + * 2. newSession + * 3. OCSPRequest + * 4. resumeSession + * 5. secureConnection + * 6. keylog + */ + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'tlsClientError', listener: (err: Error, tlsSocket: TLSSocket) => void): this; + addListener(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + addListener(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + addListener(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + addListener(event: 'secureConnection', listener: (tlsSocket: TLSSocket) => void): this; + addListener(event: 'keylog', listener: (line: Buffer, tlsSocket: TLSSocket) => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'tlsClientError', err: Error, tlsSocket: TLSSocket): boolean; + emit(event: 'newSession', sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void): boolean; + emit(event: 'OCSPRequest', certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void): boolean; + emit(event: 'resumeSession', sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void): boolean; + emit(event: 'secureConnection', tlsSocket: TLSSocket): boolean; + emit(event: 'keylog', line: Buffer, tlsSocket: TLSSocket): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'tlsClientError', listener: (err: Error, tlsSocket: TLSSocket) => void): this; + on(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + on(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + on(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + on(event: 'secureConnection', listener: (tlsSocket: TLSSocket) => void): this; + on(event: 'keylog', listener: (line: Buffer, tlsSocket: TLSSocket) => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'tlsClientError', listener: (err: Error, tlsSocket: TLSSocket) => void): this; + once(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + once(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this; + once(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this; + once(event: 'secureConnection', listener: (tlsSocket: TLSSocket) => void): this; + once(event: 'keylog', listener: (line: Buffer, tlsSocket: TLSSocket) => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'tlsClientError', listener: (err: Error, tlsSocket: TLSSocket) => void): this; + prependListener(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this; + prependListener(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, 
resp: Buffer) => void) => void): this;
+ prependListener(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this;
+ prependListener(event: 'secureConnection', listener: (tlsSocket: TLSSocket) => void): this;
+ prependListener(event: 'keylog', listener: (line: Buffer, tlsSocket: TLSSocket) => void): this;
+ prependOnceListener(event: string, listener: (...args: any[]) => void): this;
+ prependOnceListener(event: 'tlsClientError', listener: (err: Error, tlsSocket: TLSSocket) => void): this;
+ prependOnceListener(event: 'newSession', listener: (sessionId: Buffer, sessionData: Buffer, callback: (err: Error, resp: Buffer) => void) => void): this;
+ prependOnceListener(event: 'OCSPRequest', listener: (certificate: Buffer, issuer: Buffer, callback: (err: Error | null, resp: Buffer) => void) => void): this;
+ prependOnceListener(event: 'resumeSession', listener: (sessionId: Buffer, callback: (err: Error, sessionData: Buffer) => void) => void): this;
+ prependOnceListener(event: 'secureConnection', listener: (tlsSocket: TLSSocket) => void): this;
+ prependOnceListener(event: 'keylog', listener: (line: Buffer, tlsSocket: TLSSocket) => void): this;
+ }
+ /**
+ * @deprecated since v0.11.3 Use `tls.TLSSocket` instead.
+ */
+ interface SecurePair {
+ encrypted: TLSSocket;
+ cleartext: TLSSocket;
+ }
+ type SecureVersion = 'TLSv1.3' | 'TLSv1.2' | 'TLSv1.1' | 'TLSv1';
+ interface SecureContextOptions {
+ /**
+ * Optionally override the trusted CA certificates. Default is to trust
+ * the well-known CAs curated by Mozilla. Mozilla's CAs are completely
+ * replaced when CAs are explicitly specified using this option.
+ */
+ ca?: string | Buffer | Array<string | Buffer> | undefined;
+ /**
+ * Cert chains in PEM format. One cert chain should be provided per
+ * private key. Each cert chain should consist of the PEM formatted
+ * certificate for a provided private key, followed by the PEM
+ * formatted intermediate certificates (if any), in order, and not
+ * including the root CA (the root CA must be pre-known to the peer,
+ * see ca). When providing multiple cert chains, they do not have to
+ * be in the same order as their private keys in key. If the
+ * intermediate certificates are not provided, the peer will not be
+ * able to validate the certificate, and the handshake will fail.
+ */
+ cert?: string | Buffer | Array<string | Buffer> | undefined;
+ /**
+ * Colon-separated list of supported signature algorithms. The list
+ * can contain digest algorithms (SHA256, MD5 etc.), public key
+ * algorithms (RSA-PSS, ECDSA etc.), combination of both (e.g
+ * 'RSA+SHA384') or TLS v1.3 scheme names (e.g. rsa_pss_pss_sha512).
+ */
+ sigalgs?: string | undefined;
+ /**
+ * Cipher suite specification, replacing the default. For more
+ * information, see modifying the default cipher suite. Permitted
+ * ciphers can be obtained via tls.getCiphers(). Cipher names must be
+ * uppercased in order for OpenSSL to accept them.
+ */
+ ciphers?: string | undefined;
+ /**
+ * Name of an OpenSSL engine which can provide the client certificate.
+ */
+ clientCertEngine?: string | undefined;
+ /**
+ * PEM formatted CRLs (Certificate Revocation Lists).
+ */
+ crl?: string | Buffer | Array<string | Buffer> | undefined;
+ /**
+ * Diffie Hellman parameters, required for Perfect Forward Secrecy. Use
+ * openssl dhparam to create the parameters. The key length must be
+ * greater than or equal to 1024 bits or else an error will be thrown.
+ * Although 1024 bits is permissible, use 2048 bits or larger for
+ * stronger security. If omitted or invalid, the parameters are
+ * silently discarded and DHE ciphers will not be available.
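+ *
+ * For example (a sketch, not from the Node.js docs), the parameters could be
+ * generated ahead of time with `openssl dhparam -out dhparam.pem 2048` and then
+ * passed alongside the key and certificate:
+ *
+ * ```js
+ * const tls = require('tls');
+ * const fs = require('fs');
+ * const server = tls.createServer({
+ *   key: fs.readFileSync('server-key.pem'),
+ *   cert: fs.readFileSync('server-cert.pem'),
+ *   dhparam: fs.readFileSync('dhparam.pem'), // hypothetical file name
+ * });
+ * ```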
+ */
+ dhparam?: string | Buffer | undefined;
+ /**
+ * A string describing a named curve or a colon separated list of curve
+ * NIDs or names, for example P-521:P-384:P-256, to use for ECDH key
+ * agreement. Set to auto to select the curve automatically. Use
+ * crypto.getCurves() to obtain a list of available curve names. On
+ * recent releases, openssl ecparam -list_curves will also display the
+ * name and description of each available elliptic curve. Default:
+ * tls.DEFAULT_ECDH_CURVE.
+ */
+ ecdhCurve?: string | undefined;
+ /**
+ * Attempt to use the server's cipher suite preferences instead of the
+ * client's. When true, causes SSL_OP_CIPHER_SERVER_PREFERENCE to be
+ * set in secureOptions
+ */
+ honorCipherOrder?: boolean | undefined;
+ /**
+ * Private keys in PEM format. PEM allows the option of private keys
+ * being encrypted. Encrypted keys will be decrypted with
+ * options.passphrase. Multiple keys using different algorithms can be
+ * provided either as an array of unencrypted key strings or buffers,
+ * or an array of objects in the form {pem: <string|buffer>[,
+ * passphrase: <string>]}. The object form can only occur in an array.
+ * object.passphrase is optional. Encrypted keys will be decrypted with
+ * object.passphrase if provided, or options.passphrase if it is not.
+ */
+ key?: string | Buffer | Array<Buffer | KeyObject> | undefined;
+ /**
+ * Name of an OpenSSL engine to get private key from. Should be used
+ * together with privateKeyIdentifier.
+ */
+ privateKeyEngine?: string | undefined;
+ /**
+ * Identifier of a private key managed by an OpenSSL engine. Should be
+ * used together with privateKeyEngine. Should not be set together with
+ * key, because both options define a private key in different ways.
+ */
+ privateKeyIdentifier?: string | undefined;
+ /**
+ * Optionally set the maximum TLS version to allow. One
+ * of `'TLSv1.3'`, `'TLSv1.2'`, `'TLSv1.1'`, or `'TLSv1'`. Cannot be specified along with the
+ * `secureProtocol` option, use one or the other.
+ * **Default:** `'TLSv1.3'`, unless changed using CLI options. Using
+ * `--tls-max-v1.2` sets the default to `'TLSv1.2'`. Using `--tls-max-v1.3` sets the default to
+ * `'TLSv1.3'`. If multiple of the options are provided, the highest maximum is used.
+ */
+ maxVersion?: SecureVersion | undefined;
+ /**
+ * Optionally set the minimum TLS version to allow. One
+ * of `'TLSv1.3'`, `'TLSv1.2'`, `'TLSv1.1'`, or `'TLSv1'`. Cannot be specified along with the
+ * `secureProtocol` option, use one or the other. It is not recommended to use
+ * less than TLSv1.2, but it may be required for interoperability.
+ * **Default:** `'TLSv1.2'`, unless changed using CLI options. Using
+ * `--tls-min-v1.0` sets the default to `'TLSv1'`. Using `--tls-min-v1.1` sets the default to
+ * `'TLSv1.1'`. Using `--tls-min-v1.3` sets the default to
+ * 'TLSv1.3'. If multiple of the options are provided, the lowest minimum is used.
+ */
+ minVersion?: SecureVersion | undefined;
+ /**
+ * Shared passphrase used for a single private key and/or a PFX.
+ */
+ passphrase?: string | undefined;
+ /**
+ * PFX or PKCS12 encoded private key and certificate chain. pfx is an
+ * alternative to providing key and cert individually. PFX is usually
+ * encrypted, if it is, passphrase will be used to decrypt it. Multiple
+ * PFX can be provided either as an array of unencrypted PFX buffers,
+ * or an array of objects in the form {buf: <string|buffer>[,
+ * passphrase: <string>]}. The object form can only occur in an array.
+ * object.passphrase is optional. Encrypted PFX will be decrypted with
+ * object.passphrase if provided, or options.passphrase if it is not.
+ */
+ pfx?: string | Buffer | Array<string | Buffer | PxfObject> | undefined;
+ /**
+ * Optionally affect the OpenSSL protocol behavior, which is not
+ * usually necessary. This should be used carefully if at all! Value is
+ * a numeric bitmask of the SSL_OP_* options from OpenSSL Options
+ */
+ secureOptions?: number | undefined; // Value is a numeric bitmask of the `SSL_OP_*` options
+ /**
+ * Legacy mechanism to select the TLS protocol version to use, it does
+ * not support independent control of the minimum and maximum version,
+ * and does not support limiting the protocol to TLSv1.3. Use
+ * minVersion and maxVersion instead. The possible values are listed as
+ * SSL_METHODS, use the function names as strings. For example, use
+ * 'TLSv1_1_method' to force TLS version 1.1, or 'TLS_method' to allow
+ * any TLS protocol version up to TLSv1.3. It is not recommended to use
+ * TLS versions less than 1.2, but it may be required for
+ * interoperability. Default: none, see minVersion.
+ */
+ secureProtocol?: string | undefined;
+ /**
+ * Opaque identifier used by servers to ensure session state is not
+ * shared between applications. Unused by clients.
+ */
+ sessionIdContext?: string | undefined;
+ /**
+ * 48-bytes of cryptographically strong pseudo-random data.
+ * See Session Resumption for more information.
+ */
+ ticketKeys?: Buffer | undefined;
+ /**
+ * The number of seconds after which a TLS session created by the
+ * server will no longer be resumable. See Session Resumption for more
+ * information. Default: 300.
+ */
+ sessionTimeout?: number | undefined;
+ }
+ interface SecureContext {
+ context: any;
+ }
+ /**
+ * Verifies the certificate `cert` is issued to `hostname`.
+ *
+ * Returns [Error](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Error) object, populating it with `reason`, `host`, and `cert` on
+ * failure. On success, returns [undefined](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Data_structures#Undefined_type).
+ *
+ * This function can be overwritten by providing an alternative function as part of
+ * the `options.checkServerIdentity` option passed to `tls.connect()`. The
+ * overwriting function can call `tls.checkServerIdentity()` of course, to augment
+ * the checks done with additional verification.
+ *
+ * This function is only called if the certificate passed all other checks, such as
+ * being issued by trusted CA (`options.ca`).
+ * @since v0.8.4
+ * @param hostname The host name or IP address to verify the certificate against.
+ * @param cert A `certificate object` representing the peer's certificate.
+ */
+ function checkServerIdentity(hostname: string, cert: PeerCertificate): Error | undefined;
+ /**
+ * Creates a new {@link Server}. The `secureConnectionListener`, if provided, is
+ * automatically set as a listener for the `'secureConnection'` event.
+ *
+ * The `ticketKeys` option is automatically shared between `cluster` module
+ * workers.
+ * + * The following illustrates a simple echo server: + * + * ```js + * const tls = require('tls'); + * const fs = require('fs'); + * + * const options = { + * key: fs.readFileSync('server-key.pem'), + * cert: fs.readFileSync('server-cert.pem'), + * + * // This is necessary only if using client certificate authentication. + * requestCert: true, + * + * // This is necessary only if the client uses a self-signed certificate. + * ca: [ fs.readFileSync('client-cert.pem') ] + * }; + * + * const server = tls.createServer(options, (socket) => { + * console.log('server connected', + * socket.authorized ? 'authorized' : 'unauthorized'); + * socket.write('welcome!\n'); + * socket.setEncoding('utf8'); + * socket.pipe(socket); + * }); + * server.listen(8000, () => { + * console.log('server bound'); + * }); + * ``` + * + * The server can be tested by connecting to it using the example client from {@link connect}. + * @since v0.3.2 + */ + function createServer(secureConnectionListener?: (socket: TLSSocket) => void): Server; + function createServer(options: TlsOptions, secureConnectionListener?: (socket: TLSSocket) => void): Server; + /** + * The `callback` function, if specified, will be added as a listener for the `'secureConnect'` event. + * + * `tls.connect()` returns a {@link TLSSocket} object. + * + * Unlike the `https` API, `tls.connect()` does not enable the + * SNI (Server Name Indication) extension by default, which may cause some + * servers to return an incorrect certificate or reject the connection + * altogether. To enable SNI, set the `servername` option in addition + * to `host`. + * + * The following illustrates a client for the echo server example from {@link createServer}: + * + * ```js + * // Assumes an echo server that is listening on port 8000. + * const tls = require('tls'); + * const fs = require('fs'); + * + * const options = { + * // Necessary only if the server requires client certificate authentication. + * key: fs.readFileSync('client-key.pem'), + * cert: fs.readFileSync('client-cert.pem'), + * + * // Necessary only if the server uses a self-signed certificate. + * ca: [ fs.readFileSync('server-cert.pem') ], + * + * // Necessary only if the server's cert isn't for "localhost". + * checkServerIdentity: () => { return null; }, + * }; + * + * const socket = tls.connect(8000, options, () => { + * console.log('client connected', + * socket.authorized ? 'authorized' : 'unauthorized'); + * process.stdin.pipe(socket); + * process.stdin.resume(); + * }); + * socket.setEncoding('utf8'); + * socket.on('data', (data) => { + * console.log(data); + * }); + * socket.on('end', () => { + * console.log('server ends connection'); + * }); + * ``` + * @since v0.11.3 + */ + function connect(options: ConnectionOptions, secureConnectListener?: () => void): TLSSocket; + function connect(port: number, host?: string, options?: ConnectionOptions, secureConnectListener?: () => void): TLSSocket; + function connect(port: number, options?: ConnectionOptions, secureConnectListener?: () => void): TLSSocket; + /** + * Creates a new secure pair object with two streams, one of which reads and writes + * the encrypted data and the other of which reads and writes the cleartext data. + * Generally, the encrypted stream is piped to/from an incoming encrypted data + * stream and the cleartext one is used as a replacement for the initial encrypted + * stream. + * + * `tls.createSecurePair()` returns a `tls.SecurePair` object with `cleartext` and`encrypted` stream properties. 
+ * + * Using `cleartext` has the same API as {@link TLSSocket}. + * + * The `tls.createSecurePair()` method is now deprecated in favor of`tls.TLSSocket()`. For example, the code: + * + * ```js + * pair = tls.createSecurePair(// ... ); + * pair.encrypted.pipe(socket); + * socket.pipe(pair.encrypted); + * ``` + * + * can be replaced by: + * + * ```js + * secureSocket = tls.TLSSocket(socket, options); + * ``` + * + * where `secureSocket` has the same API as `pair.cleartext`. + * @since v0.3.2 + * @deprecated Since v0.11.3 - Use {@link TLSSocket} instead. + * @param context A secure context object as returned by `tls.createSecureContext()` + * @param isServer `true` to specify that this TLS connection should be opened as a server. + * @param requestCert `true` to specify whether a server should request a certificate from a connecting client. Only applies when `isServer` is `true`. + * @param rejectUnauthorized If not `false` a server automatically reject clients with invalid certificates. Only applies when `isServer` is `true`. + */ + function createSecurePair(context?: SecureContext, isServer?: boolean, requestCert?: boolean, rejectUnauthorized?: boolean): SecurePair; + /** + * {@link createServer} sets the default value of the `honorCipherOrder` option + * to `true`, other APIs that create secure contexts leave it unset. + * + * {@link createServer} uses a 128 bit truncated SHA1 hash value generated + * from `process.argv` as the default value of the `sessionIdContext` option, other + * APIs that create secure contexts have no default value. + * + * The `tls.createSecureContext()` method creates a `SecureContext` object. It is + * usable as an argument to several `tls` APIs, such as {@link createServer} and `server.addContext()`, but has no public methods. + * + * A key is _required_ for ciphers that use certificates. Either `key` or`pfx` can be used to provide it. + * + * If the `ca` option is not given, then Node.js will default to using [Mozilla's publicly trusted list of + * CAs](https://hg.mozilla.org/mozilla-central/raw-file/tip/security/nss/lib/ckfw/builtins/certdata.txt). + * @since v0.11.13 + */ + function createSecureContext(options?: SecureContextOptions): SecureContext; + /** + * Returns an array with the names of the supported TLS ciphers. The names are + * lower-case for historical reasons, but must be uppercased to be used in + * the `ciphers` option of {@link createSecureContext}. + * + * Cipher names that start with `'tls_'` are for TLSv1.3, all the others are for + * TLSv1.2 and below. + * + * ```js + * console.log(tls.getCiphers()); // ['aes128-gcm-sha256', 'aes128-sha', ...] + * ``` + * @since v0.10.2 + */ + function getCiphers(): string[]; + /** + * The default curve name to use for ECDH key agreement in a tls server. + * The default value is 'auto'. See tls.createSecureContext() for further + * information. + */ + let DEFAULT_ECDH_CURVE: string; + /** + * The default value of the maxVersion option of + * tls.createSecureContext(). It can be assigned any of the supported TLS + * protocol versions, 'TLSv1.3', 'TLSv1.2', 'TLSv1.1', or 'TLSv1'. Default: + * 'TLSv1.3', unless changed using CLI options. Using --tls-max-v1.2 sets + * the default to 'TLSv1.2'. Using --tls-max-v1.3 sets the default to + * 'TLSv1.3'. If multiple of the options are provided, the highest maximum + * is used. + */ + let DEFAULT_MAX_VERSION: SecureVersion; + /** + * The default value of the minVersion option of tls.createSecureContext(). 
+ * It can be assigned any of the supported TLS protocol versions,
+ * 'TLSv1.3', 'TLSv1.2', 'TLSv1.1', or 'TLSv1'. Default: 'TLSv1.2', unless
+ * changed using CLI options. Using --tls-min-v1.0 sets the default to
+ * 'TLSv1'. Using --tls-min-v1.1 sets the default to 'TLSv1.1'. Using
+ * --tls-min-v1.3 sets the default to 'TLSv1.3'. If multiple of the options
+ * are provided, the lowest minimum is used.
+ */
+ let DEFAULT_MIN_VERSION: SecureVersion;
+ /**
+ * An immutable array of strings representing the root certificates (in PEM
+ * format) used for verifying peer certificates. This is the default value
+ * of the ca option to tls.createSecureContext().
+ */
+ const rootCertificates: ReadonlyArray<string>;
+}
+declare module 'node:tls' {
+ export * from 'tls';
+}
diff --git a/node_modules/@types/node/trace_events.d.ts b/node_modules/@types/node/trace_events.d.ts
new file mode 100644
index 000000000..2976eb6e5
--- /dev/null
+++ b/node_modules/@types/node/trace_events.d.ts
@@ -0,0 +1,161 @@
+/**
+ * The `trace_events` module provides a mechanism to centralize tracing information
+ * generated by V8, Node.js core, and userspace code.
+ *
+ * Tracing can be enabled with the `--trace-event-categories` command-line flag
+ * or by using the `trace_events` module. The `--trace-event-categories` flag
+ * accepts a list of comma-separated category names.
+ *
+ * The available categories are:
+ *
+ * * `node`: An empty placeholder.
+ * * `node.async_hooks`: Enables capture of detailed `async_hooks` trace data.
+ * The `async_hooks` events have a unique `asyncId` and a special `triggerId` `triggerAsyncId` property.
+ * * `node.bootstrap`: Enables capture of Node.js bootstrap milestones.
+ * * `node.console`: Enables capture of `console.time()` and `console.count()`output.
+ * * `node.dns.native`: Enables capture of trace data for DNS queries.
+ * * `node.environment`: Enables capture of Node.js Environment milestones.
+ * * `node.fs.sync`: Enables capture of trace data for file system sync methods.
+ * * `node.perf`: Enables capture of `Performance API` measurements.
+ * * `node.perf.usertiming`: Enables capture of only Performance API User Timing
+ * measures and marks.
+ * * `node.perf.timerify`: Enables capture of only Performance API timerify
+ * measurements.
+ * * `node.promises.rejections`: Enables capture of trace data tracking the number
+ * of unhandled Promise rejections and handled-after-rejections.
+ * * `node.vm.script`: Enables capture of trace data for the `vm` module's`runInNewContext()`, `runInContext()`, and `runInThisContext()` methods.
+ * * `v8`: The `V8` events are GC, compiling, and execution related.
+ *
+ * By default the `node`, `node.async_hooks`, and `v8` categories are enabled.
+ *
+ * ```bash
+ * node --trace-event-categories v8,node,node.async_hooks server.js
+ * ```
+ *
+ * Prior versions of Node.js required the use of the `--trace-events-enabled`flag to enable trace events. This requirement has been removed. However, the`--trace-events-enabled` flag _may_ still be
+ * used and will enable the`node`, `node.async_hooks`, and `v8` trace event categories by default.
+ * + * ```bash + * node --trace-events-enabled + * + * # is equivalent to + * + * node --trace-event-categories v8,node,node.async_hooks + * ``` + * + * Alternatively, trace events may be enabled using the `trace_events` module: + * + * ```js + * const trace_events = require('trace_events'); + * const tracing = trace_events.createTracing({ categories: ['node.perf'] }); + * tracing.enable(); // Enable trace event capture for the 'node.perf' category + * + * // do work + * + * tracing.disable(); // Disable trace event capture for the 'node.perf' category + * ``` + * + * Running Node.js with tracing enabled will produce log files that can be opened + * in the [`chrome://tracing`](https://www.chromium.org/developers/how-tos/trace-event-profiling-tool) tab of Chrome. + * + * The logging file is by default called `node_trace.${rotation}.log`, where`${rotation}` is an incrementing log-rotation id. The filepath pattern can + * be specified with `--trace-event-file-pattern` that accepts a template + * string that supports `${rotation}` and `${pid}`: + * + * ```bash + * node --trace-event-categories v8 --trace-event-file-pattern '${pid}-${rotation}.log' server.js + * ``` + * + * The tracing system uses the same time source + * as the one used by `process.hrtime()`. + * However the trace-event timestamps are expressed in microseconds, + * unlike `process.hrtime()` which returns nanoseconds. + * + * The features from this module are not available in `Worker` threads. + * @experimental + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/trace_events.js) + */ +declare module 'trace_events' { + /** + * The `Tracing` object is used to enable or disable tracing for sets of + * categories. Instances are created using the + * `trace_events.createTracing()` method. + * + * When created, the `Tracing` object is disabled. Calling the + * `tracing.enable()` method adds the categories to the set of enabled trace + * event categories. Calling `tracing.disable()` will remove the categories + * from the set of enabled trace event categories. + */ + interface Tracing { + /** + * A comma-separated list of the trace event categories covered by this + * `Tracing` object. + */ + readonly categories: string; + /** + * Disables this `Tracing` object. + * + * Only trace event categories _not_ covered by other enabled `Tracing` + * objects and _not_ specified by the `--trace-event-categories` flag + * will be disabled. + */ + disable(): void; + /** + * Enables this `Tracing` object for the set of categories covered by + * the `Tracing` object. + */ + enable(): void; + /** + * `true` only if the `Tracing` object has been enabled. + */ + readonly enabled: boolean; + } + interface CreateTracingOptions { + /** + * An array of trace category names. Values included in the array are + * coerced to a string when possible. An error will be thrown if the + * value cannot be coerced. + */ + categories: string[]; + } + /** + * Creates and returns a `Tracing` object for the given set of `categories`. + * + * ```js + * const trace_events = require('trace_events'); + * const categories = ['node.perf', 'node.async_hooks']; + * const tracing = trace_events.createTracing({ categories }); + * tracing.enable(); + * // do stuff + * tracing.disable(); + * ``` + * @since v10.0.0 + * @return . + */ + function createTracing(options: CreateTracingOptions): Tracing; + /** + * Returns a comma-separated list of all currently-enabled trace event + * categories. 
The current set of enabled trace event categories is determined + * by the _union_ of all currently-enabled `Tracing` objects and any categories + * enabled using the `--trace-event-categories` flag. + * + * Given the file `test.js` below, the command`node --trace-event-categories node.perf test.js` will print`'node.async_hooks,node.perf'` to the console. + * + * ```js + * const trace_events = require('trace_events'); + * const t1 = trace_events.createTracing({ categories: ['node.async_hooks'] }); + * const t2 = trace_events.createTracing({ categories: ['node.perf'] }); + * const t3 = trace_events.createTracing({ categories: ['v8'] }); + * + * t1.enable(); + * t2.enable(); + * + * console.log(trace_events.getEnabledCategories()); + * ``` + * @since v10.0.0 + */ + function getEnabledCategories(): string | undefined; +} +declare module 'node:trace_events' { + export * from 'trace_events'; +} diff --git a/node_modules/@types/node/tty.d.ts b/node_modules/@types/node/tty.d.ts new file mode 100644 index 000000000..3bce2d366 --- /dev/null +++ b/node_modules/@types/node/tty.d.ts @@ -0,0 +1,204 @@ +/** + * The `tty` module provides the `tty.ReadStream` and `tty.WriteStream` classes. + * In most cases, it will not be necessary or possible to use this module directly. + * However, it can be accessed using: + * + * ```js + * const tty = require('tty'); + * ``` + * + * When Node.js detects that it is being run with a text terminal ("TTY") + * attached, `process.stdin` will, by default, be initialized as an instance of`tty.ReadStream` and both `process.stdout` and `process.stderr` will, by + * default, be instances of `tty.WriteStream`. The preferred method of determining + * whether Node.js is being run within a TTY context is to check that the value of + * the `process.stdout.isTTY` property is `true`: + * + * ```console + * $ node -p -e "Boolean(process.stdout.isTTY)" + * true + * $ node -p -e "Boolean(process.stdout.isTTY)" | cat + * false + * ``` + * + * In most cases, there should be little to no reason for an application to + * manually create instances of the `tty.ReadStream` and `tty.WriteStream`classes. + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/tty.js) + */ +declare module 'tty' { + import * as net from 'node:net'; + /** + * The `tty.isatty()` method returns `true` if the given `fd` is associated with + * a TTY and `false` if it is not, including whenever `fd` is not a non-negative + * integer. + * @since v0.5.8 + * @param fd A numeric file descriptor + */ + function isatty(fd: number): boolean; + /** + * Represents the readable side of a TTY. In normal circumstances `process.stdin` will be the only `tty.ReadStream` instance in a Node.js + * process and there should be no reason to create additional instances. + * @since v0.5.8 + */ + class ReadStream extends net.Socket { + constructor(fd: number, options?: net.SocketConstructorOpts); + /** + * A `boolean` that is `true` if the TTY is currently configured to operate as a + * raw device. Defaults to `false`. + * @since v0.7.7 + */ + isRaw: boolean; + /** + * Allows configuration of `tty.ReadStream` so that it operates as a raw device. + * + * When in raw mode, input is always available character-by-character, not + * including modifiers. Additionally, all special processing of characters by the + * terminal is disabled, including echoing input characters.Ctrl+C will no longer cause a `SIGINT` when in this mode. + * @since v0.7.7 + * @param mode If `true`, configures the `tty.ReadStream` to operate as a raw device. 
If `false`, configures the `tty.ReadStream` to operate in its default mode. The `readStream.isRaw` + * property will be set to the resulting mode. + * @return The read stream instance. + */ + setRawMode(mode: boolean): this; + /** + * A `boolean` that is always `true` for `tty.ReadStream` instances. + * @since v0.5.8 + */ + isTTY: boolean; + } + /** + * -1 - to the left from cursor + * 0 - the entire line + * 1 - to the right from cursor + */ + type Direction = -1 | 0 | 1; + /** + * Represents the writable side of a TTY. In normal circumstances,`process.stdout` and `process.stderr` will be the only`tty.WriteStream` instances created for a Node.js process and there + * should be no reason to create additional instances. + * @since v0.5.8 + */ + class WriteStream extends net.Socket { + constructor(fd: number); + addListener(event: string, listener: (...args: any[]) => void): this; + addListener(event: 'resize', listener: () => void): this; + emit(event: string | symbol, ...args: any[]): boolean; + emit(event: 'resize'): boolean; + on(event: string, listener: (...args: any[]) => void): this; + on(event: 'resize', listener: () => void): this; + once(event: string, listener: (...args: any[]) => void): this; + once(event: 'resize', listener: () => void): this; + prependListener(event: string, listener: (...args: any[]) => void): this; + prependListener(event: 'resize', listener: () => void): this; + prependOnceListener(event: string, listener: (...args: any[]) => void): this; + prependOnceListener(event: 'resize', listener: () => void): this; + /** + * `writeStream.clearLine()` clears the current line of this `WriteStream` in a + * direction identified by `dir`. + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if the stream wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + clearLine(dir: Direction, callback?: () => void): boolean; + /** + * `writeStream.clearScreenDown()` clears this `WriteStream` from the current + * cursor down. + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if the stream wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + clearScreenDown(callback?: () => void): boolean; + /** + * `writeStream.cursorTo()` moves this `WriteStream`'s cursor to the specified + * position. + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if the stream wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + cursorTo(x: number, y?: number, callback?: () => void): boolean; + cursorTo(x: number, callback: () => void): boolean; + /** + * `writeStream.moveCursor()` moves this `WriteStream`'s cursor _relative_ to its + * current position. + * @since v0.7.7 + * @param callback Invoked once the operation completes. + * @return `false` if the stream wishes for the calling code to wait for the `'drain'` event to be emitted before continuing to write additional data; otherwise `true`. + */ + moveCursor(dx: number, dy: number, callback?: () => void): boolean; + /** + * Returns: + * + * * `1` for 2, + * * `4` for 16, + * * `8` for 256, + * * `24` for 16,777,216 colors supported. + * + * Use this to determine what colors the terminal supports. 
Due to the nature of + * colors in terminals it is possible to either have false positives or false + * negatives. It depends on process information and the environment variables that + * may lie about what terminal is used. + * It is possible to pass in an `env` object to simulate the usage of a specific + * terminal. This can be useful to check how specific environment settings behave. + * + * To enforce a specific color support, use one of the below environment settings. + * + * * 2 colors: `FORCE_COLOR = 0` (Disables colors) + * * 16 colors: `FORCE_COLOR = 1` + * * 256 colors: `FORCE_COLOR = 2` + * * 16,777,216 colors: `FORCE_COLOR = 3` + * + * Disabling color support is also possible by using the `NO_COLOR` and`NODE_DISABLE_COLORS` environment variables. + * @since v9.9.0 + * @param [env=process.env] An object containing the environment variables to check. This enables simulating the usage of a specific terminal. + */ + getColorDepth(env?: object): number; + /** + * Returns `true` if the `writeStream` supports at least as many colors as provided + * in `count`. Minimum support is 2 (black and white). + * + * This has the same false positives and negatives as described in `writeStream.getColorDepth()`. + * + * ```js + * process.stdout.hasColors(); + * // Returns true or false depending on if `stdout` supports at least 16 colors. + * process.stdout.hasColors(256); + * // Returns true or false depending on if `stdout` supports at least 256 colors. + * process.stdout.hasColors({ TMUX: '1' }); + * // Returns true. + * process.stdout.hasColors(2 ** 24, { TMUX: '1' }); + * // Returns false (the environment setting pretends to support 2 ** 8 colors). + * ``` + * @since v11.13.0, v10.16.0 + * @param [count=16] The number of colors that are requested (minimum 2). + * @param [env=process.env] An object containing the environment variables to check. This enables simulating the usage of a specific terminal. + */ + hasColors(count?: number): boolean; + hasColors(env?: object): boolean; + hasColors(count: number, env?: object): boolean; + /** + * `writeStream.getWindowSize()` returns the size of the TTY + * corresponding to this `WriteStream`. The array is of the type`[numColumns, numRows]` where `numColumns` and `numRows` represent the number + * of columns and rows in the corresponding TTY. + * @since v0.7.7 + */ + getWindowSize(): [number, number]; + /** + * A `number` specifying the number of columns the TTY currently has. This property + * is updated whenever the `'resize'` event is emitted. + * @since v0.7.7 + */ + columns: number; + /** + * A `number` specifying the number of rows the TTY currently has. This property + * is updated whenever the `'resize'` event is emitted. + * @since v0.7.7 + */ + rows: number; + /** + * A `boolean` that is always `true`. + * @since v0.5.8 + */ + isTTY: boolean; + } +} +declare module 'node:tty' { + export * from 'tty'; +} diff --git a/node_modules/@types/node/url.d.ts b/node_modules/@types/node/url.d.ts new file mode 100644 index 000000000..531f34c91 --- /dev/null +++ b/node_modules/@types/node/url.d.ts @@ -0,0 +1,798 @@ +/** + * The `url` module provides utilities for URL resolution and parsing. 
It can be + * accessed using: + * + * ```js + * import url from 'url'; + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/url.js) + */ +declare module 'url' { + import { Blob } from 'node:buffer'; + import { ClientRequestArgs } from 'node:http'; + import { ParsedUrlQuery, ParsedUrlQueryInput } from 'node:querystring'; + // Input to `url.format` + interface UrlObject { + auth?: string | null | undefined; + hash?: string | null | undefined; + host?: string | null | undefined; + hostname?: string | null | undefined; + href?: string | null | undefined; + pathname?: string | null | undefined; + protocol?: string | null | undefined; + search?: string | null | undefined; + slashes?: boolean | null | undefined; + port?: string | number | null | undefined; + query?: string | null | ParsedUrlQueryInput | undefined; + } + // Output of `url.parse` + interface Url { + auth: string | null; + hash: string | null; + host: string | null; + hostname: string | null; + href: string; + path: string | null; + pathname: string | null; + protocol: string | null; + search: string | null; + slashes: boolean | null; + port: string | null; + query: string | null | ParsedUrlQuery; + } + interface UrlWithParsedQuery extends Url { + query: ParsedUrlQuery; + } + interface UrlWithStringQuery extends Url { + query: string | null; + } + /** + * The `url.parse()` method takes a URL string, parses it, and returns a URL + * object. + * + * A `TypeError` is thrown if `urlString` is not a string. + * + * A `URIError` is thrown if the `auth` property is present but cannot be decoded. + * + * Use of the legacy `url.parse()` method is discouraged. Users should + * use the WHATWG `URL` API. Because the `url.parse()` method uses a + * lenient, non-standard algorithm for parsing URL strings, security + * issues can be introduced. Specifically, issues with [host name spoofing](https://hackerone.com/reports/678487) and + * incorrect handling of usernames and passwords have been identified. + * @since v0.1.25 + * @deprecated Legacy: Use the WHATWG URL API instead. + * @param urlString The URL string to parse. + * @param [parseQueryString=false] If `true`, the `query` property will always be set to an object returned by the {@link querystring} module's `parse()` method. If `false`, the `query` property + * on the returned URL object will be an unparsed, undecoded string. + * @param [slashesDenoteHost=false] If `true`, the first token after the literal string `//` and preceding the next `/` will be interpreted as the `host`. For instance, given `//foo/bar`, the + * result would be `{host: 'foo', pathname: '/bar'}` rather than `{pathname: '//foo/bar'}`. + */ + function parse(urlString: string): UrlWithStringQuery; + function parse(urlString: string, parseQueryString: false | undefined, slashesDenoteHost?: boolean): UrlWithStringQuery; + function parse(urlString: string, parseQueryString: true, slashesDenoteHost?: boolean): UrlWithParsedQuery; + function parse(urlString: string, parseQueryString: boolean, slashesDenoteHost?: boolean): Url; + /** + * The `url.format()` method returns a formatted URL string derived from`urlObject`. + * + * ```js + * const url = require('url'); + * url.format({ + * protocol: 'https', + * hostname: 'example.com', + * pathname: '/some/path', + * query: { + * page: 1, + * format: 'json' + * } + * }); + * + * // => 'https://example.com/some/path?page=1&format=json' + * ``` + * + * If `urlObject` is not an object or a string, `url.format()` will throw a `TypeError`. 
+ * + * The formatting process operates as follows: + * + * * A new empty string `result` is created. + * * If `urlObject.protocol` is a string, it is appended as-is to `result`. + * * Otherwise, if `urlObject.protocol` is not `undefined` and is not a string, an `Error` is thrown. + * * For all string values of `urlObject.protocol` that _do not end_ with an ASCII + * colon (`:`) character, the literal string `:` will be appended to `result`. + * * If either of the following conditions is true, then the literal string `//`will be appended to `result`: + * * `urlObject.slashes` property is true; + * * `urlObject.protocol` begins with `http`, `https`, `ftp`, `gopher`, or`file`; + * * If the value of the `urlObject.auth` property is truthy, and either`urlObject.host` or `urlObject.hostname` are not `undefined`, the value of`urlObject.auth` will be coerced into a string + * and appended to `result`followed by the literal string `@`. + * * If the `urlObject.host` property is `undefined` then: + * * If the `urlObject.hostname` is a string, it is appended to `result`. + * * Otherwise, if `urlObject.hostname` is not `undefined` and is not a string, + * an `Error` is thrown. + * * If the `urlObject.port` property value is truthy, and `urlObject.hostname`is not `undefined`: + * * The literal string `:` is appended to `result`, and + * * The value of `urlObject.port` is coerced to a string and appended to`result`. + * * Otherwise, if the `urlObject.host` property value is truthy, the value of`urlObject.host` is coerced to a string and appended to `result`. + * * If the `urlObject.pathname` property is a string that is not an empty string: + * * If the `urlObject.pathname`_does not start_ with an ASCII forward slash + * (`/`), then the literal string `'/'` is appended to `result`. + * * The value of `urlObject.pathname` is appended to `result`. + * * Otherwise, if `urlObject.pathname` is not `undefined` and is not a string, an `Error` is thrown. + * * If the `urlObject.search` property is `undefined` and if the `urlObject.query`property is an `Object`, the literal string `?` is appended to `result`followed by the output of calling the + * `querystring` module's `stringify()`method passing the value of `urlObject.query`. + * * Otherwise, if `urlObject.search` is a string: + * * If the value of `urlObject.search`_does not start_ with the ASCII question + * mark (`?`) character, the literal string `?` is appended to `result`. + * * The value of `urlObject.search` is appended to `result`. + * * Otherwise, if `urlObject.search` is not `undefined` and is not a string, an `Error` is thrown. + * * If the `urlObject.hash` property is a string: + * * If the value of `urlObject.hash`_does not start_ with the ASCII hash (`#`) + * character, the literal string `#` is appended to `result`. + * * The value of `urlObject.hash` is appended to `result`. + * * Otherwise, if the `urlObject.hash` property is not `undefined` and is not a + * string, an `Error` is thrown. + * * `result` is returned. + * @since v0.1.25 + * @deprecated Legacy: Use the WHATWG URL API instead. + * @param urlObject A URL object (as returned by `url.parse()` or constructed otherwise). If a string, it is converted to an object by passing it to `url.parse()`. + */ + function format(urlObject: URL, options?: URLFormatOptions): string; + function format(urlObject: UrlObject | string): string; + /** + * The `url.resolve()` method resolves a target URL relative to a base URL in a + * manner similar to that of a Web browser resolving an anchor tag HREF. 
+ * + * ```js + * const url = require('url'); + * url.resolve('/one/two/three', 'four'); // '/one/two/four' + * url.resolve('http://example.com/', '/one'); // 'http://example.com/one' + * url.resolve('http://example.com/one', '/two'); // 'http://example.com/two' + * ``` + * + * You can achieve the same result using the WHATWG URL API: + * + * ```js + * function resolve(from, to) { + * const resolvedUrl = new URL(to, new URL(from, 'resolve://')); + * if (resolvedUrl.protocol === 'resolve:') { + * // `from` is a relative URL. + * const { pathname, search, hash } = resolvedUrl; + * return pathname + search + hash; + * } + * return resolvedUrl.toString(); + * } + * + * resolve('/one/two/three', 'four'); // '/one/two/four' + * resolve('http://example.com/', '/one'); // 'http://example.com/one' + * resolve('http://example.com/one', '/two'); // 'http://example.com/two' + * ``` + * @since v0.1.25 + * @deprecated Legacy: Use the WHATWG URL API instead. + * @param from The Base URL being resolved against. + * @param to The HREF URL being resolved. + */ + function resolve(from: string, to: string): string; + /** + * Returns the [Punycode](https://tools.ietf.org/html/rfc5891#section-4.4) ASCII serialization of the `domain`. If `domain` is an + * invalid domain, the empty string is returned. + * + * It performs the inverse operation to {@link domainToUnicode}. + * + * This feature is only available if the `node` executable was compiled with `ICU` enabled. If not, the domain names are passed through unchanged. + * + * ```js + * import url from 'url'; + * + * console.log(url.domainToASCII('español.com')); + * // Prints xn--espaol-zwa.com + * console.log(url.domainToASCII('中文.com')); + * // Prints xn--fiq228c.com + * console.log(url.domainToASCII('xn--iñvalid.com')); + * // Prints an empty string + * ``` + * @since v7.4.0, v6.13.0 + */ + function domainToASCII(domain: string): string; + /** + * Returns the Unicode serialization of the `domain`. If `domain` is an invalid + * domain, the empty string is returned. + * + * It performs the inverse operation to {@link domainToASCII}. + * + * This feature is only available if the `node` executable was compiled with `ICU` enabled. If not, the domain names are passed through unchanged. + * + * ```js + * import url from 'url'; + * + * console.log(url.domainToUnicode('xn--espaol-zwa.com')); + * // Prints español.com + * console.log(url.domainToUnicode('xn--fiq228c.com')); + * // Prints 中文.com + * console.log(url.domainToUnicode('xn--iñvalid.com')); + * // Prints an empty string + * ``` + * @since v7.4.0, v6.13.0 + */ + function domainToUnicode(domain: string): string; + /** + * This function ensures the correct decodings of percent-encoded characters as + * well as ensuring a cross-platform valid absolute path string. 
+ * + * ```js + * import { fileURLToPath } from 'url'; + * + * const __filename = fileURLToPath(import.meta.url); + * + * new URL('file:///C:/path/').pathname; // Incorrect: /C:/path/ + * fileURLToPath('file:///C:/path/'); // Correct: C:\path\ (Windows) + * + * new URL('file://nas/foo.txt').pathname; // Incorrect: /foo.txt + * fileURLToPath('file://nas/foo.txt'); // Correct: \\nas\foo.txt (Windows) + * + * new URL('file:///你好.txt').pathname; // Incorrect: /%E4%BD%A0%E5%A5%BD.txt + * fileURLToPath('file:///你好.txt'); // Correct: /你好.txt (POSIX) + * + * new URL('file:///hello world').pathname; // Incorrect: /hello%20world + * fileURLToPath('file:///hello world'); // Correct: /hello world (POSIX) + * ``` + * @since v10.12.0 + * @param url The file URL string or URL object to convert to a path. + * @return The fully-resolved platform-specific Node.js file path. + */ + function fileURLToPath(url: string | URL): string; + /** + * This function ensures that `path` is resolved absolutely, and that the URL + * control characters are correctly encoded when converting into a File URL. + * + * ```js + * import { pathToFileURL } from 'url'; + * + * new URL('/foo#1', 'file:'); // Incorrect: file:///foo#1 + * pathToFileURL('/foo#1'); // Correct: file:///foo%231 (POSIX) + * + * new URL('/some/path%.c', 'file:'); // Incorrect: file:///some/path%.c + * pathToFileURL('/some/path%.c'); // Correct: file:///some/path%25.c (POSIX) + * ``` + * @since v10.12.0 + * @param path The path to convert to a File URL. + * @return The file URL object. + */ + function pathToFileURL(path: string): URL; + /** + * This utility function converts a URL object into an ordinary options object as + * expected by the `http.request()` and `https.request()` APIs. + * + * ```js + * import { urlToHttpOptions } from 'url'; + * const myURL = new URL('https://a:b@測試?abc#foo'); + * + * console.log(urlToHttpOptions(myURL)); + * + * { + * protocol: 'https:', + * hostname: 'xn--g6w251d', + * hash: '#foo', + * search: '?abc', + * pathname: '/', + * path: '/?abc', + * href: 'https://a:b@xn--g6w251d/?abc#foo', + * auth: 'a:b' + * } + * + * ``` + * @since v15.7.0 + * @param url The `WHATWG URL` object to convert to an options object. + * @return Options object + */ + function urlToHttpOptions(url: URL): ClientRequestArgs; + interface URLFormatOptions { + auth?: boolean | undefined; + fragment?: boolean | undefined; + search?: boolean | undefined; + unicode?: boolean | undefined; + } + /** + * Browser-compatible `URL` class, implemented by following the WHATWG URL + * Standard. [Examples of parsed URLs](https://url.spec.whatwg.org/#example-url-parsing) may be found in the Standard itself. + * The `URL` class is also available on the global object. + * + * In accordance with browser conventions, all properties of `URL` objects + * are implemented as getters and setters on the class prototype, rather than as + * data properties on the object itself. Thus, unlike `legacy urlObject` s, + * using the `delete` keyword on any properties of `URL` objects (e.g. `delete myURL.protocol`, `delete myURL.pathname`, etc) has no effect but will still + * return `true`. + * @since v7.0.0, v6.13.0 + */ + class URL { + /** + * Creates a `'blob:nodedata:...'` URL string that represents the given `Blob` object and can be used to retrieve the `Blob` later. + * + * ```js + * const { + * Blob, + * resolveObjectURL, + * } = require('buffer'); + * + * const blob = new Blob(['hello']); + * const id = URL.createObjectURL(blob); + * + * // later... 
+ * + * const otherBlob = resolveObjectURL(id); + * console.log(otherBlob.size); + * ``` + * + * The data stored by the registered `Blob` will be retained in memory until`URL.revokeObjectURL()` is called to remove it. + * + * `Blob` objects are registered within the current thread. If using Worker + * Threads, `Blob` objects registered within one Worker will not be available + * to other workers or the main thread. + * @since v16.7.0 + * @experimental + */ + static createObjectURL(blob: Blob): string; + /** + * Removes the stored `Blob` identified by the given ID. + * @since v16.7.0 + * @experimental + * @param id A `'blob:nodedata:...` URL string returned by a prior call to `URL.createObjectURL()`. + */ + static revokeObjectURL(objectUrl: string): void; + constructor(input: string, base?: string | URL); + /** + * Gets and sets the fragment portion of the URL. + * + * ```js + * const myURL = new URL('https://example.org/foo#bar'); + * console.log(myURL.hash); + * // Prints #bar + * + * myURL.hash = 'baz'; + * console.log(myURL.href); + * // Prints https://example.org/foo#baz + * ``` + * + * Invalid URL characters included in the value assigned to the `hash` property + * are `percent-encoded`. The selection of which characters to + * percent-encode may vary somewhat from what the {@link parse} and {@link format} methods would produce. + */ + hash: string; + /** + * Gets and sets the host portion of the URL. + * + * ```js + * const myURL = new URL('https://example.org:81/foo'); + * console.log(myURL.host); + * // Prints example.org:81 + * + * myURL.host = 'example.com:82'; + * console.log(myURL.href); + * // Prints https://example.com:82/foo + * ``` + * + * Invalid host values assigned to the `host` property are ignored. + */ + host: string; + /** + * Gets and sets the host name portion of the URL. The key difference between`url.host` and `url.hostname` is that `url.hostname` does _not_ include the + * port. + * + * ```js + * const myURL = new URL('https://example.org:81/foo'); + * console.log(myURL.hostname); + * // Prints example.org + * + * // Setting the hostname does not change the port + * myURL.hostname = 'example.com:82'; + * console.log(myURL.href); + * // Prints https://example.com:81/foo + * + * // Use myURL.host to change the hostname and port + * myURL.host = 'example.org:82'; + * console.log(myURL.href); + * // Prints https://example.org:82/foo + * ``` + * + * Invalid host name values assigned to the `hostname` property are ignored. + */ + hostname: string; + /** + * Gets and sets the serialized URL. + * + * ```js + * const myURL = new URL('https://example.org/foo'); + * console.log(myURL.href); + * // Prints https://example.org/foo + * + * myURL.href = 'https://example.com/bar'; + * console.log(myURL.href); + * // Prints https://example.com/bar + * ``` + * + * Getting the value of the `href` property is equivalent to calling {@link toString}. + * + * Setting the value of this property to a new value is equivalent to creating a + * new `URL` object using `new URL(value)`. Each of the `URL`object's properties will be modified. + * + * If the value assigned to the `href` property is not a valid URL, a `TypeError`will be thrown. + */ + href: string; + /** + * Gets the read-only serialization of the URL's origin. 
+ * + * ```js + * const myURL = new URL('https://example.org/foo/bar?baz'); + * console.log(myURL.origin); + * // Prints https://example.org + * ``` + * + * ```js + * const idnURL = new URL('https://測試'); + * console.log(idnURL.origin); + * // Prints https://xn--g6w251d + * + * console.log(idnURL.hostname); + * // Prints xn--g6w251d + * ``` + */ + readonly origin: string; + /** + * Gets and sets the password portion of the URL. + * + * ```js + * const myURL = new URL('https://abc:xyz@example.com'); + * console.log(myURL.password); + * // Prints xyz + * + * myURL.password = '123'; + * console.log(myURL.href); + * // Prints https://abc:123@example.com + * ``` + * + * Invalid URL characters included in the value assigned to the `password` property + * are `percent-encoded`. The selection of which characters to + * percent-encode may vary somewhat from what the {@link parse} and {@link format} methods would produce. + */ + password: string; + /** + * Gets and sets the path portion of the URL. + * + * ```js + * const myURL = new URL('https://example.org/abc/xyz?123'); + * console.log(myURL.pathname); + * // Prints /abc/xyz + * + * myURL.pathname = '/abcdef'; + * console.log(myURL.href); + * // Prints https://example.org/abcdef?123 + * ``` + * + * Invalid URL characters included in the value assigned to the `pathname`property are `percent-encoded`. The selection of which characters + * to percent-encode may vary somewhat from what the {@link parse} and {@link format} methods would produce. + */ + pathname: string; + /** + * Gets and sets the port portion of the URL. + * + * The port value may be a number or a string containing a number in the range`0` to `65535` (inclusive). Setting the value to the default port of the`URL` objects given `protocol` will + * result in the `port` value becoming + * the empty string (`''`). + * + * The port value can be an empty string in which case the port depends on + * the protocol/scheme: + * + * + * + * Upon assigning a value to the port, the value will first be converted to a + * string using `.toString()`. + * + * If that string is invalid but it begins with a number, the leading number is + * assigned to `port`. + * If the number lies outside the range denoted above, it is ignored. + * + * ```js + * const myURL = new URL('https://example.org:8888'); + * console.log(myURL.port); + * // Prints 8888 + * + * // Default ports are automatically transformed to the empty string + * // (HTTPS protocol's default port is 443) + * myURL.port = '443'; + * console.log(myURL.port); + * // Prints the empty string + * console.log(myURL.href); + * // Prints https://example.org/ + * + * myURL.port = 1234; + * console.log(myURL.port); + * // Prints 1234 + * console.log(myURL.href); + * // Prints https://example.org:1234/ + * + * // Completely invalid port strings are ignored + * myURL.port = 'abcd'; + * console.log(myURL.port); + * // Prints 1234 + * + * // Leading numbers are treated as a port number + * myURL.port = '5678abcd'; + * console.log(myURL.port); + * // Prints 5678 + * + * // Non-integers are truncated + * myURL.port = 1234.5678; + * console.log(myURL.port); + * // Prints 1234 + * + * // Out-of-range numbers which are not represented in scientific notation + * // will be ignored. 
+ * myURL.port = 1e10; // 10000000000, will be range-checked as described below + * console.log(myURL.port); + * // Prints 1234 + * ``` + * + * Numbers which contain a decimal point, + * such as floating-point numbers or numbers in scientific notation, + * are not an exception to this rule. + * Leading numbers up to the decimal point will be set as the URL's port, + * assuming they are valid: + * + * ```js + * myURL.port = 4.567e21; + * console.log(myURL.port); + * // Prints 4 (because it is the leading number in the string '4.567e21') + * ``` + */ + port: string; + /** + * Gets and sets the protocol portion of the URL. + * + * ```js + * const myURL = new URL('https://example.org'); + * console.log(myURL.protocol); + * // Prints https: + * + * myURL.protocol = 'ftp'; + * console.log(myURL.href); + * // Prints ftp://example.org/ + * ``` + * + * Invalid URL protocol values assigned to the `protocol` property are ignored. + */ + protocol: string; + /** + * Gets and sets the serialized query portion of the URL. + * + * ```js + * const myURL = new URL('https://example.org/abc?123'); + * console.log(myURL.search); + * // Prints ?123 + * + * myURL.search = 'abc=xyz'; + * console.log(myURL.href); + * // Prints https://example.org/abc?abc=xyz + * ``` + * + * Any invalid URL characters appearing in the value assigned the `search`property will be `percent-encoded`. The selection of which + * characters to percent-encode may vary somewhat from what the {@link parse} and {@link format} methods would produce. + */ + search: string; + /** + * Gets the `URLSearchParams` object representing the query parameters of the + * URL. This property is read-only but the `URLSearchParams` object it provides + * can be used to mutate the URL instance; to replace the entirety of query + * parameters of the URL, use the {@link search} setter. See `URLSearchParams` documentation for details. + * + * Use care when using `.searchParams` to modify the `URL` because, + * per the WHATWG specification, the `URLSearchParams` object uses + * different rules to determine which characters to percent-encode. For + * instance, the `URL` object will not percent encode the ASCII tilde (`~`) + * character, while `URLSearchParams` will always encode it: + * + * ```js + * const myUrl = new URL('https://example.org/abc?foo=~bar'); + * + * console.log(myUrl.search); // prints ?foo=~bar + * + * // Modify the URL via searchParams... + * myUrl.searchParams.sort(); + * + * console.log(myUrl.search); // prints ?foo=%7Ebar + * ``` + */ + readonly searchParams: URLSearchParams; + /** + * Gets and sets the username portion of the URL. + * + * ```js + * const myURL = new URL('https://abc:xyz@example.com'); + * console.log(myURL.username); + * // Prints abc + * + * myURL.username = '123'; + * console.log(myURL.href); + * // Prints https://123:xyz@example.com/ + * ``` + * + * Any invalid URL characters appearing in the value assigned the `username`property will be `percent-encoded`. The selection of which + * characters to percent-encode may vary somewhat from what the {@link parse} and {@link format} methods would produce. + */ + username: string; + /** + * The `toString()` method on the `URL` object returns the serialized URL. The + * value returned is equivalent to that of {@link href} and {@link toJSON}. + */ + toString(): string; + /** + * The `toJSON()` method on the `URL` object returns the serialized URL. The + * value returned is equivalent to that of {@link href} and {@link toString}. 
+ *
+ * This method is automatically called when an `URL` object is serialized
+ * with [`JSON.stringify()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/stringify).
+ *
+ * ```js
+ * const myURLs = [
+ * new URL('https://www.example.com'),
+ * new URL('https://test.example.org'),
+ * ];
+ * console.log(JSON.stringify(myURLs));
+ * // Prints ["https://www.example.com/","https://test.example.org/"]
+ * ```
+ */
+ toJSON(): string;
+ }
+ /**
+ * The `URLSearchParams` API provides read and write access to the query of a`URL`. The `URLSearchParams` class can also be used standalone with one of the
+ * four following constructors.
+ * The `URLSearchParams` class is also available on the global object.
+ *
+ * The WHATWG `URLSearchParams` interface and the `querystring` module have
+ * similar purpose, but the purpose of the `querystring` module is more
+ * general, as it allows the customization of delimiter characters (`&` and `=`).
+ * On the other hand, this API is designed purely for URL query strings.
+ *
+ * ```js
+ * const myURL = new URL('https://example.org/?abc=123');
+ * console.log(myURL.searchParams.get('abc'));
+ * // Prints 123
+ *
+ * myURL.searchParams.append('abc', 'xyz');
+ * console.log(myURL.href);
+ * // Prints https://example.org/?abc=123&abc=xyz
+ *
+ * myURL.searchParams.delete('abc');
+ * myURL.searchParams.set('a', 'b');
+ * console.log(myURL.href);
+ * // Prints https://example.org/?a=b
+ *
+ * const newSearchParams = new URLSearchParams(myURL.searchParams);
+ * // The above is equivalent to
+ * // const newSearchParams = new URLSearchParams(myURL.search);
+ *
+ * newSearchParams.append('a', 'c');
+ * console.log(myURL.href);
+ * // Prints https://example.org/?a=b
+ * console.log(newSearchParams.toString());
+ * // Prints a=b&a=c
+ *
+ * // newSearchParams.toString() is implicitly called
+ * myURL.search = newSearchParams;
+ * console.log(myURL.href);
+ * // Prints https://example.org/?a=b&a=c
+ * newSearchParams.delete('a');
+ * console.log(myURL.href);
+ * // Prints https://example.org/?a=b&a=c
+ * ```
+ * @since v7.5.0, v6.13.0
+ */
+ class URLSearchParams implements Iterable<[string, string]> {
+ constructor(init?: URLSearchParams | string | Record<string, string | ReadonlyArray<string>> | Iterable<[string, string]> | ReadonlyArray<[string, string]>);
+ /**
+ * Append a new name-value pair to the query string.
+ */
+ append(name: string, value: string): void;
+ /**
+ * Remove all name-value pairs whose name is `name`.
+ */
+ delete(name: string): void;
+ /**
+ * Returns an ES6 `Iterator` over each of the name-value pairs in the query.
+ * Each item of the iterator is a JavaScript `Array`. The first item of the `Array`is the `name`, the second item of the `Array` is the `value`.
+ *
+ * Alias for `urlSearchParams[@@iterator]()`.
+ */
+ entries(): IterableIterator<[string, string]>;
+ /**
+ * Iterates over each name-value pair in the query and invokes the given function.
+ *
+ * ```js
+ * const myURL = new URL('https://example.org/?a=b&c=d');
+ * myURL.searchParams.forEach((value, name, searchParams) => {
+ * console.log(name, value, myURL.searchParams === searchParams);
+ * });
+ * // Prints:
+ * // a b true
+ * // c d true
+ * ```
+ * @param fn Invoked for each name-value pair in the query
+ * @param thisArg To be used as `this` value for when `fn` is called
+ */
+ forEach<TThis = this>(callback: (this: TThis, value: string, name: string, searchParams: this) => void, thisArg?: TThis): void;
+ /**
+ * Returns the value of the first name-value pair whose name is `name`. If there
+ * are no such pairs, `null` is returned.
+ * @return or `null` if there is no name-value pair with the given `name`.
+ */
+ get(name: string): string | null;
+ /**
+ * Returns the values of all name-value pairs whose name is `name`. If there are
+ * no such pairs, an empty array is returned.
+ */
+ getAll(name: string): string[];
+ /**
+ * Returns `true` if there is at least one name-value pair whose name is `name`.
+ */
+ has(name: string): boolean;
+ /**
+ * Returns an ES6 `Iterator` over the names of each name-value pair.
+ *
+ * ```js
+ * const params = new URLSearchParams('foo=bar&foo=baz');
+ * for (const name of params.keys()) {
+ * console.log(name);
+ * }
+ * // Prints:
+ * // foo
+ * // foo
+ * ```
+ */
+ keys(): IterableIterator<string>;
+ /**
+ * Sets the value in the `URLSearchParams` object associated with `name` to`value`. If there are any pre-existing name-value pairs whose names are `name`,
+ * set the first such pair's value to `value` and remove all others. If not,
+ * append the name-value pair to the query string.
+ *
+ * ```js
+ * const params = new URLSearchParams();
+ * params.append('foo', 'bar');
+ * params.append('foo', 'baz');
+ * params.append('abc', 'def');
+ * console.log(params.toString());
+ * // Prints foo=bar&foo=baz&abc=def
+ *
+ * params.set('foo', 'def');
+ * params.set('xyz', 'opq');
+ * console.log(params.toString());
+ * // Prints foo=def&abc=def&xyz=opq
+ * ```
+ */
+ set(name: string, value: string): void;
+ /**
+ * Sort all existing name-value pairs in-place by their names. Sorting is done
+ * with a [stable sorting algorithm](https://en.wikipedia.org/wiki/Sorting_algorithm#Stability), so relative order between name-value pairs
+ * with the same name is preserved.
+ *
+ * This method can be used, in particular, to increase cache hits.
+ *
+ * ```js
+ * const params = new URLSearchParams('query[]=abc&type=search&query[]=123');
+ * params.sort();
+ * console.log(params.toString());
+ * // Prints query%5B%5D=abc&query%5B%5D=123&type=search
+ * ```
+ * @since v7.7.0, v6.13.0
+ */
+ sort(): void;
+ /**
+ * Returns the search parameters serialized as a string, with characters
+ * percent-encoded where necessary.
+ */
+ toString(): string;
+ /**
+ * Returns an ES6 `Iterator` over the values of each name-value pair.
+ */
+ values(): IterableIterator<string>;
+ [Symbol.iterator](): IterableIterator<[string, string]>;
+ }
+}
+declare module 'node:url' {
+ export * from 'url';
+}
diff --git a/node_modules/@types/node/util.d.ts b/node_modules/@types/node/util.d.ts
new file mode 100644
index 000000000..b866e89d5
--- /dev/null
+++ b/node_modules/@types/node/util.d.ts
@@ -0,0 +1,1563 @@
+/**
+ * The `util` module supports the needs of Node.js internal APIs. Many of the
+ * utilities are useful for application and module developers as well. To access
+ * it:
+ *
+ * ```js
+ * const util = require('util');
+ * ```
+ * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/util.js)
+ */
+declare module 'util' {
+ import * as types from 'node:util/types';
+ export interface InspectOptions {
+ /**
+ * If set to `true`, getters are going to be
+ * inspected as well. If set to `'get'` only getters without setter are going
+ * to be inspected. If set to `'set'` only getters having a corresponding
+ * setter are going to be inspected. This might cause side effects depending on
+ * the getter function.
+ * @default `false` + */ + getters?: 'get' | 'set' | boolean | undefined; + showHidden?: boolean | undefined; + /** + * @default 2 + */ + depth?: number | null | undefined; + colors?: boolean | undefined; + customInspect?: boolean | undefined; + showProxy?: boolean | undefined; + maxArrayLength?: number | null | undefined; + /** + * Specifies the maximum number of characters to + * include when formatting. Set to `null` or `Infinity` to show all elements. + * Set to `0` or negative to show no characters. + * @default 10000 + */ + maxStringLength?: number | null | undefined; + breakLength?: number | undefined; + /** + * Setting this to `false` causes each object key + * to be displayed on a new line. It will also add new lines to text that is + * longer than `breakLength`. If set to a number, the most `n` inner elements + * are united on a single line as long as all properties fit into + * `breakLength`. Short array elements are also grouped together. Note that no + * text will be reduced below 16 characters, no matter the `breakLength` size. + * For more information, see the example below. + * @default `true` + */ + compact?: boolean | number | undefined; + sorted?: boolean | ((a: string, b: string) => number) | undefined; + } + export type Style = 'special' | 'number' | 'bigint' | 'boolean' | 'undefined' | 'null' | 'string' | 'symbol' | 'date' | 'regexp' | 'module'; + export type CustomInspectFunction = (depth: number, options: InspectOptionsStylized) => string; + export interface InspectOptionsStylized extends InspectOptions { + stylize(text: string, styleType: Style): string; + } + /** + * The `util.format()` method returns a formatted string using the first argument + * as a `printf`\-like format string which can contain zero or more format + * specifiers. Each specifier is replaced with the converted value from the + * corresponding argument. Supported specifiers are: + * + * If a specifier does not have a corresponding argument, it is not replaced: + * + * ```js + * util.format('%s:%s', 'foo'); + * // Returns: 'foo:%s' + * ``` + * + * Values that are not part of the format string are formatted using`util.inspect()` if their type is not `string`. + * + * If there are more arguments passed to the `util.format()` method than the + * number of specifiers, the extra arguments are concatenated to the returned + * string, separated by spaces: + * + * ```js + * util.format('%s:%s', 'foo', 'bar', 'baz'); + * // Returns: 'foo:bar baz' + * ``` + * + * If the first argument does not contain a valid format specifier, `util.format()`returns a string that is the concatenation of all arguments separated by spaces: + * + * ```js + * util.format(1, 2, 3); + * // Returns: '1 2 3' + * ``` + * + * If only one argument is passed to `util.format()`, it is returned as it is + * without any formatting: + * + * ```js + * util.format('%% %s'); + * // Returns: '%% %s' + * ``` + * + * `util.format()` is a synchronous method that is intended as a debugging tool. + * Some input values can have a significant performance overhead that can block the + * event loop. Use this function with care and never in a hot code path. + * @since v0.5.3 + * @param format A `printf`-like format string. + */ + export function format(format?: any, ...param: any[]): string; + /** + * This function is identical to {@link format}, except in that it takes + * an `inspectOptions` argument which specifies options that are passed along to {@link inspect}. 
+ * + * ```js + * util.formatWithOptions({ colors: true }, 'See object %O', { foo: 42 }); + * // Returns 'See object { foo: 42 }', where `42` is colored as a number + * // when printed to a terminal. + * ``` + * @since v10.0.0 + */ + export function formatWithOptions(inspectOptions: InspectOptions, format?: any, ...param: any[]): string; + /** + * Returns a Map of all system error codes available from the Node.js API. + * The mapping between error codes and error names is platform-dependent. + * See `Common System Errors` for the names of common errors. + * + * ```js + * fs.access('file/that/does/not/exist', (err) => { + * const errorMap = util.getSystemErrorMap(); + * const name = errorMap.get(err.errno); + * console.error(name); // ENOENT + * }); + * ``` + * @since v16.0.0 + */ + export function getSystemErrorMap(): Map; + /** + * The `util.log()` method prints the given `string` to `stdout` with an included + * timestamp. + * + * ```js + * const util = require('util'); + * + * util.log('Timestamped message.'); + * ``` + * @since v0.3.0 + * @deprecated Since v6.0.0 - Use a third party module instead. + */ + export function log(string: string): void; + /** + * Returns the `string` after replacing any surrogate code points + * (or equivalently, any unpaired surrogate code units) with the + * Unicode "replacement character" U+FFFD. + * @since v16.8.0 + */ + export function toUSVString(string: string): string; + /** + * The `util.inspect()` method returns a string representation of `object` that is + * intended for debugging. The output of `util.inspect` may change at any time + * and should not be depended upon programmatically. Additional `options` may be + * passed that alter the result.`util.inspect()` will use the constructor's name and/or `@@toStringTag` to make + * an identifiable tag for an inspected value. + * + * ```js + * class Foo { + * get [Symbol.toStringTag]() { + * return 'bar'; + * } + * } + * + * class Bar {} + * + * const baz = Object.create(null, { [Symbol.toStringTag]: { value: 'foo' } }); + * + * util.inspect(new Foo()); // 'Foo [bar] {}' + * util.inspect(new Bar()); // 'Bar {}' + * util.inspect(baz); // '[foo] {}' + * ``` + * + * Circular references point to their anchor by using a reference index: + * + * ```js + * const { inspect } = require('util'); + * + * const obj = {}; + * obj.a = [obj]; + * obj.b = {}; + * obj.b.inner = obj.b; + * obj.b.obj = obj; + * + * console.log(inspect(obj)); + * // { + * // a: [ [Circular *1] ], + * // b: { inner: [Circular *2], obj: [Circular *1] } + * // } + * ``` + * + * The following example inspects all properties of the `util` object: + * + * ```js + * const util = require('util'); + * + * console.log(util.inspect(util, { showHidden: true, depth: null })); + * ``` + * + * The following example highlights the effect of the `compact` option: + * + * ```js + * const util = require('util'); + * + * const o = { + * a: [1, 2, [[ + * 'Lorem ipsum dolor sit amet,\nconsectetur adipiscing elit, sed do ' + + * 'eiusmod \ntempor incididunt ut labore et dolore magna aliqua.', + * 'test', + * 'foo']], 4], + * b: new Map([['za', 1], ['zb', 'test']]) + * }; + * console.log(util.inspect(o, { compact: true, depth: 5, breakLength: 80 })); + * + * // { a: + * // [ 1, + * // 2, + * // [ [ 'Lorem ipsum dolor sit amet,\nconsectetur [...]', // A long line + * // 'test', + * // 'foo' ] ], + * // 4 ], + * // b: Map(2) { 'za' => 1, 'zb' => 'test' } } + * + * // Setting `compact` to false or an integer creates more reader friendly output. 
+ * console.log(util.inspect(o, { compact: false, depth: 5, breakLength: 80 })); + * + * // { + * // a: [ + * // 1, + * // 2, + * // [ + * // [ + * // 'Lorem ipsum dolor sit amet,\n' + + * // 'consectetur adipiscing elit, sed do eiusmod \n' + + * // 'tempor incididunt ut labore et dolore magna aliqua.', + * // 'test', + * // 'foo' + * // ] + * // ], + * // 4 + * // ], + * // b: Map(2) { + * // 'za' => 1, + * // 'zb' => 'test' + * // } + * // } + * + * // Setting `breakLength` to e.g. 150 will print the "Lorem ipsum" text in a + * // single line. + * ``` + * + * The `showHidden` option allows [`WeakMap`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/WeakMap) and + * [`WeakSet`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/WeakSet) entries to be + * inspected. If there are more entries than `maxArrayLength`, there is no + * guarantee which entries are displayed. That means retrieving the same [`WeakSet`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/WeakSet) entries twice may + * result in different output. Furthermore, entries + * with no remaining strong references may be garbage collected at any time. + * + * ```js + * const { inspect } = require('util'); + * + * const obj = { a: 1 }; + * const obj2 = { b: 2 }; + * const weakSet = new WeakSet([obj, obj2]); + * + * console.log(inspect(weakSet, { showHidden: true })); + * // WeakSet { { a: 1 }, { b: 2 } } + * ``` + * + * The `sorted` option ensures that an object's property insertion order does not + * impact the result of `util.inspect()`. + * + * ```js + * const { inspect } = require('util'); + * const assert = require('assert'); + * + * const o1 = { + * b: [2, 3, 1], + * a: '`a` comes before `b`', + * c: new Set([2, 3, 1]) + * }; + * console.log(inspect(o1, { sorted: true })); + * // { a: '`a` comes before `b`', b: [ 2, 3, 1 ], c: Set(3) { 1, 2, 3 } } + * console.log(inspect(o1, { sorted: (a, b) => b.localeCompare(a) })); + * // { c: Set(3) { 3, 2, 1 }, b: [ 2, 3, 1 ], a: '`a` comes before `b`' } + * + * const o2 = { + * c: new Set([2, 1, 3]), + * a: '`a` comes before `b`', + * b: [2, 3, 1] + * }; + * assert.strict.equal( + * inspect(o1, { sorted: true }), + * inspect(o2, { sorted: true }) + * ); + * ``` + * + * `util.inspect()` is a synchronous method intended for debugging. Its maximum + * output length is approximately 128 MB. Inputs that result in longer output will + * be truncated. + * @since v0.3.0 + * @param object Any JavaScript primitive or `Object`. + * @return The representation of `object`. + */ + export function inspect(object: any, showHidden?: boolean, depth?: number | null, color?: boolean): string; + export function inspect(object: any, options: InspectOptions): string; + export namespace inspect { + let colors: NodeJS.Dict<[number, number]>; + let styles: { + [K in Style]: string; + }; + let defaultOptions: InspectOptions; + /** + * Allows changing inspect settings from the repl. + */ + let replDefaults: InspectOptions; + const custom: unique symbol; + } + /** + * Alias for [`Array.isArray()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/isArray). + * + * Returns `true` if the given `object` is an `Array`. Otherwise, returns `false`. 
+ * + * ```js + * const util = require('util'); + * + * util.isArray([]); + * // Returns: true + * util.isArray(new Array()); + * // Returns: true + * util.isArray({}); + * // Returns: false + * ``` + * @since v0.6.0 + * @deprecated Since v4.0.0 - Use `isArray` instead. + */ + export function isArray(object: unknown): object is unknown[]; + /** + * Returns `true` if the given `object` is a `RegExp`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isRegExp(/some regexp/); + * // Returns: true + * util.isRegExp(new RegExp('another regexp')); + * // Returns: true + * util.isRegExp({}); + * // Returns: false + * ``` + * @since v0.6.0 + * @deprecated Since v4.0.0 - Deprecated + */ + export function isRegExp(object: unknown): object is RegExp; + /** + * Returns `true` if the given `object` is a `Date`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isDate(new Date()); + * // Returns: true + * util.isDate(Date()); + * // false (without 'new' returns a String) + * util.isDate({}); + * // Returns: false + * ``` + * @since v0.6.0 + * @deprecated Since v4.0.0 - Use {@link types.isDate} instead. + */ + export function isDate(object: unknown): object is Date; + /** + * Returns `true` if the given `object` is an `Error`. Otherwise, returns`false`. + * + * ```js + * const util = require('util'); + * + * util.isError(new Error()); + * // Returns: true + * util.isError(new TypeError()); + * // Returns: true + * util.isError({ name: 'Error', message: 'an error occurred' }); + * // Returns: false + * ``` + * + * This method relies on `Object.prototype.toString()` behavior. It is + * possible to obtain an incorrect result when the `object` argument manipulates`@@toStringTag`. + * + * ```js + * const util = require('util'); + * const obj = { name: 'Error', message: 'an error occurred' }; + * + * util.isError(obj); + * // Returns: false + * obj[Symbol.toStringTag] = 'Error'; + * util.isError(obj); + * // Returns: true + * ``` + * @since v0.6.0 + * @deprecated Since v4.0.0 - Use {@link types.isNativeError} instead. + */ + export function isError(object: unknown): object is Error; + /** + * Usage of `util.inherits()` is discouraged. Please use the ES6 `class` and`extends` keywords to get language level inheritance support. Also note + * that the two styles are [semantically incompatible](https://github.com/nodejs/node/issues/4179). + * + * Inherit the prototype methods from one [constructor](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/constructor) into another. The + * prototype of `constructor` will be set to a new object created from`superConstructor`. + * + * This mainly adds some input validation on top of`Object.setPrototypeOf(constructor.prototype, superConstructor.prototype)`. + * As an additional convenience, `superConstructor` will be accessible + * through the `constructor.super_` property. 
+ * + * ```js + * const util = require('util'); + * const EventEmitter = require('events'); + * + * function MyStream() { + * EventEmitter.call(this); + * } + * + * util.inherits(MyStream, EventEmitter); + * + * MyStream.prototype.write = function(data) { + * this.emit('data', data); + * }; + * + * const stream = new MyStream(); + * + * console.log(stream instanceof EventEmitter); // true + * console.log(MyStream.super_ === EventEmitter); // true + * + * stream.on('data', (data) => { + * console.log(`Received data: "${data}"`); + * }); + * stream.write('It works!'); // Received data: "It works!" + * ``` + * + * ES6 example using `class` and `extends`: + * + * ```js + * const EventEmitter = require('events'); + * + * class MyStream extends EventEmitter { + * write(data) { + * this.emit('data', data); + * } + * } + * + * const stream = new MyStream(); + * + * stream.on('data', (data) => { + * console.log(`Received data: "${data}"`); + * }); + * stream.write('With ES6'); + * ``` + * @since v0.3.0 + * @deprecated Legacy: Use ES2015 class syntax and `extends` keyword instead. + */ + export function inherits(constructor: unknown, superConstructor: unknown): void; + export type DebugLoggerFunction = (msg: string, ...param: unknown[]) => void; + export interface DebugLogger extends DebugLoggerFunction { + enabled: boolean; + } + /** + * The `util.debuglog()` method is used to create a function that conditionally + * writes debug messages to `stderr` based on the existence of the `NODE_DEBUG`environment variable. If the `section` name appears within the value of that + * environment variable, then the returned function operates similar to `console.error()`. If not, then the returned function is a no-op. + * + * ```js + * const util = require('util'); + * const debuglog = util.debuglog('foo'); + * + * debuglog('hello from foo [%d]', 123); + * ``` + * + * If this program is run with `NODE_DEBUG=foo` in the environment, then + * it will output something like: + * + * ```console + * FOO 3245: hello from foo [123] + * ``` + * + * where `3245` is the process id. If it is not run with that + * environment variable set, then it will not print anything. + * + * The `section` supports wildcard also: + * + * ```js + * const util = require('util'); + * const debuglog = util.debuglog('foo-bar'); + * + * debuglog('hi there, it\'s foo-bar [%d]', 2333); + * ``` + * + * if it is run with `NODE_DEBUG=foo*` in the environment, then it will output + * something like: + * + * ```console + * FOO-BAR 3257: hi there, it's foo-bar [2333] + * ``` + * + * Multiple comma-separated `section` names may be specified in the `NODE_DEBUG`environment variable: `NODE_DEBUG=fs,net,tls`. + * + * The optional `callback` argument can be used to replace the logging function + * with a different function that doesn't have any initialization or + * unnecessary wrapping. + * + * ```js + * const util = require('util'); + * let debuglog = util.debuglog('internals', (debug) => { + * // Replace with a logging function that optimizes out + * // testing if the section is enabled + * debuglog = debug; + * }); + * ``` + * @since v0.11.3 + * @param section A string identifying the portion of the application for which the `debuglog` function is being created. + * @param callback A callback invoked the first time the logging function is called with a function argument that is a more optimized logging function. 
+ * @return The logging function + */ + export function debuglog(section: string, callback?: (fn: DebugLoggerFunction) => void): DebugLogger; + /** + * Returns `true` if the given `object` is a `Boolean`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isBoolean(1); + * // Returns: false + * util.isBoolean(0); + * // Returns: false + * util.isBoolean(false); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `typeof value === 'boolean'` instead. + */ + export function isBoolean(object: unknown): object is boolean; + /** + * Returns `true` if the given `object` is a `Buffer`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isBuffer({ length: 0 }); + * // Returns: false + * util.isBuffer([]); + * // Returns: false + * util.isBuffer(Buffer.from('hello world')); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `isBuffer` instead. + */ + export function isBuffer(object: unknown): object is Buffer; + /** + * Returns `true` if the given `object` is a `Function`. Otherwise, returns`false`. + * + * ```js + * const util = require('util'); + * + * function Foo() {} + * const Bar = () => {}; + * + * util.isFunction({}); + * // Returns: false + * util.isFunction(Foo); + * // Returns: true + * util.isFunction(Bar); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `typeof value === 'function'` instead. + */ + export function isFunction(object: unknown): boolean; + /** + * Returns `true` if the given `object` is strictly `null`. Otherwise, returns`false`. + * + * ```js + * const util = require('util'); + * + * util.isNull(0); + * // Returns: false + * util.isNull(undefined); + * // Returns: false + * util.isNull(null); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `value === null` instead. + */ + export function isNull(object: unknown): object is null; + /** + * Returns `true` if the given `object` is `null` or `undefined`. Otherwise, + * returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isNullOrUndefined(0); + * // Returns: false + * util.isNullOrUndefined(undefined); + * // Returns: true + * util.isNullOrUndefined(null); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `value === undefined || value === null` instead. + */ + export function isNullOrUndefined(object: unknown): object is null | undefined; + /** + * Returns `true` if the given `object` is a `Number`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isNumber(false); + * // Returns: false + * util.isNumber(Infinity); + * // Returns: true + * util.isNumber(0); + * // Returns: true + * util.isNumber(NaN); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `typeof value === 'number'` instead. + */ + export function isNumber(object: unknown): object is number; + /** + * Returns `true` if the given `object` is strictly an `Object`**and** not a`Function` (even though functions are objects in JavaScript). + * Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isObject(5); + * // Returns: false + * util.isObject(null); + * // Returns: false + * util.isObject({}); + * // Returns: true + * util.isObject(() => {}); + * // Returns: false + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Deprecated: Use `value !== null && typeof value === 'object'` instead. 
+ */ + export function isObject(object: unknown): boolean; + /** + * Returns `true` if the given `object` is a primitive type. Otherwise, returns`false`. + * + * ```js + * const util = require('util'); + * + * util.isPrimitive(5); + * // Returns: true + * util.isPrimitive('foo'); + * // Returns: true + * util.isPrimitive(false); + * // Returns: true + * util.isPrimitive(null); + * // Returns: true + * util.isPrimitive(undefined); + * // Returns: true + * util.isPrimitive({}); + * // Returns: false + * util.isPrimitive(() => {}); + * // Returns: false + * util.isPrimitive(/^$/); + * // Returns: false + * util.isPrimitive(new Date()); + * // Returns: false + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `(typeof value !== 'object' && typeof value !== 'function') || value === null` instead. + */ + export function isPrimitive(object: unknown): boolean; + /** + * Returns `true` if the given `object` is a `string`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isString(''); + * // Returns: true + * util.isString('foo'); + * // Returns: true + * util.isString(String('foo')); + * // Returns: true + * util.isString(5); + * // Returns: false + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `typeof value === 'string'` instead. + */ + export function isString(object: unknown): object is string; + /** + * Returns `true` if the given `object` is a `Symbol`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * util.isSymbol(5); + * // Returns: false + * util.isSymbol('foo'); + * // Returns: false + * util.isSymbol(Symbol('foo')); + * // Returns: true + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `typeof value === 'symbol'` instead. + */ + export function isSymbol(object: unknown): object is symbol; + /** + * Returns `true` if the given `object` is `undefined`. Otherwise, returns `false`. + * + * ```js + * const util = require('util'); + * + * const foo = undefined; + * util.isUndefined(5); + * // Returns: false + * util.isUndefined(foo); + * // Returns: true + * util.isUndefined(null); + * // Returns: false + * ``` + * @since v0.11.5 + * @deprecated Since v4.0.0 - Use `value === undefined` instead. + */ + export function isUndefined(object: unknown): object is undefined; + /** + * The `util.deprecate()` method wraps `fn` (which may be a function or class) in + * such a way that it is marked as deprecated. + * + * ```js + * const util = require('util'); + * + * exports.obsoleteFunction = util.deprecate(() => { + * // Do something here. + * }, 'obsoleteFunction() is deprecated. Use newShinyFunction() instead.'); + * ``` + * + * When called, `util.deprecate()` will return a function that will emit a`DeprecationWarning` using the `'warning'` event. The warning will + * be emitted and printed to `stderr` the first time the returned function is + * called. After the warning is emitted, the wrapped function is called without + * emitting a warning. + * + * If the same optional `code` is supplied in multiple calls to `util.deprecate()`, + * the warning will be emitted only once for that `code`. 
+ * + * ```js + * const util = require('util'); + * + * const fn1 = util.deprecate(someFunction, someMessage, 'DEP0001'); + * const fn2 = util.deprecate(someOtherFunction, someOtherMessage, 'DEP0001'); + * fn1(); // Emits a deprecation warning with code DEP0001 + * fn2(); // Does not emit a deprecation warning because it has the same code + * ``` + * + * If either the `--no-deprecation` or `--no-warnings` command-line flags are + * used, or if the `process.noDeprecation` property is set to `true`_prior_ to + * the first deprecation warning, the `util.deprecate()` method does nothing. + * + * If the `--trace-deprecation` or `--trace-warnings` command-line flags are set, + * or the `process.traceDeprecation` property is set to `true`, a warning and a + * stack trace are printed to `stderr` the first time the deprecated function is + * called. + * + * If the `--throw-deprecation` command-line flag is set, or the`process.throwDeprecation` property is set to `true`, then an exception will be + * thrown when the deprecated function is called. + * + * The `--throw-deprecation` command-line flag and `process.throwDeprecation`property take precedence over `--trace-deprecation` and`process.traceDeprecation`. + * @since v0.8.0 + * @param fn The function that is being deprecated. + * @param msg A warning message to display when the deprecated function is invoked. + * @param code A deprecation code. See the `list of deprecated APIs` for a list of codes. + * @return The deprecated function wrapped to emit a warning. + */ + export function deprecate(fn: T, msg: string, code?: string): T; + /** + * Returns `true` if there is deep strict equality between `val1` and `val2`. + * Otherwise, returns `false`. + * + * See `assert.deepStrictEqual()` for more information about deep strict + * equality. + * @since v9.0.0 + */ + export function isDeepStrictEqual(val1: unknown, val2: unknown): boolean; + /** + * Takes an `async` function (or a function that returns a `Promise`) and returns a + * function following the error-first callback style, i.e. taking + * an `(err, value) => ...` callback as the last argument. In the callback, the + * first argument will be the rejection reason (or `null` if the `Promise`resolved), and the second argument will be the resolved value. + * + * ```js + * const util = require('util'); + * + * async function fn() { + * return 'hello world'; + * } + * const callbackFunction = util.callbackify(fn); + * + * callbackFunction((err, ret) => { + * if (err) throw err; + * console.log(ret); + * }); + * ``` + * + * Will print: + * + * ```text + * hello world + * ``` + * + * The callback is executed asynchronously, and will have a limited stack trace. + * If the callback throws, the process will emit an `'uncaughtException'` event, and if not handled will exit. + * + * Since `null` has a special meaning as the first argument to a callback, if a + * wrapped function rejects a `Promise` with a falsy value as a reason, the value + * is wrapped in an `Error` with the original value stored in a field named`reason`. + * + * ```js + * function fn() { + * return Promise.reject(null); + * } + * const callbackFunction = util.callbackify(fn); + * + * callbackFunction((err, ret) => { + * // When the Promise was rejected with `null` it is wrapped with an Error and + * // the original value is stored in `reason`. 
+ * err && err.hasOwnProperty('reason') && err.reason === null; // true
+ * });
+ * ```
+ * @since v8.2.0
+ * @param original An `async` function
+ * @return a callback style function
+ */
+ export function callbackify(fn: () => Promise<void>): (callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<TResult>(fn: () => Promise<TResult>): (callback: (err: NodeJS.ErrnoException, result: TResult) => void) => void;
+ export function callbackify<T1>(fn: (arg1: T1) => Promise<void>): (arg1: T1, callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<T1, TResult>(fn: (arg1: T1) => Promise<TResult>): (arg1: T1, callback: (err: NodeJS.ErrnoException, result: TResult) => void) => void;
+ export function callbackify<T1, T2>(fn: (arg1: T1, arg2: T2) => Promise<void>): (arg1: T1, arg2: T2, callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<T1, T2, TResult>(fn: (arg1: T1, arg2: T2) => Promise<TResult>): (arg1: T1, arg2: T2, callback: (err: NodeJS.ErrnoException | null, result: TResult) => void) => void;
+ export function callbackify<T1, T2, T3>(fn: (arg1: T1, arg2: T2, arg3: T3) => Promise<void>): (arg1: T1, arg2: T2, arg3: T3, callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<T1, T2, T3, TResult>(
+ fn: (arg1: T1, arg2: T2, arg3: T3) => Promise<TResult>
+ ): (arg1: T1, arg2: T2, arg3: T3, callback: (err: NodeJS.ErrnoException | null, result: TResult) => void) => void;
+ export function callbackify<T1, T2, T3, T4>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4) => Promise<void>
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<T1, T2, T3, T4, TResult>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4) => Promise<TResult>
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, callback: (err: NodeJS.ErrnoException | null, result: TResult) => void) => void;
+ export function callbackify<T1, T2, T3, T4, T5>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5) => Promise<void>
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<T1, T2, T3, T4, T5, TResult>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5) => Promise<TResult>
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, callback: (err: NodeJS.ErrnoException | null, result: TResult) => void) => void;
+ export function callbackify<T1, T2, T3, T4, T5, T6>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, arg6: T6) => Promise<void>
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, arg6: T6, callback: (err: NodeJS.ErrnoException) => void) => void;
+ export function callbackify<T1, T2, T3, T4, T5, T6, TResult>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, arg6: T6) => Promise<TResult>
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, arg6: T6, callback: (err: NodeJS.ErrnoException | null, result: TResult) => void) => void;
+ export interface CustomPromisifyLegacy<TCustom extends Function> extends Function {
+ __promisify__: TCustom;
+ }
+ export interface CustomPromisifySymbol<TCustom extends Function> extends Function {
+ [promisify.custom]: TCustom;
+ }
+ export type CustomPromisify<TCustom extends Function> = CustomPromisifySymbol<TCustom> | CustomPromisifyLegacy<TCustom>;
+ /**
+ * Takes a function following the common error-first callback style, i.e. taking
+ * an `(err, value) => ...` callback as the last argument, and returns a version
+ * that returns promises.
+ *
+ * ```js
+ * const util = require('util');
+ * const fs = require('fs');
+ *
+ * const stat = util.promisify(fs.stat);
+ * stat('.').then((stats) => {
+ * // Do something with `stats`
+ * }).catch((error) => {
+ * // Handle the error.
+ * });
+ * ```
+ *
+ * Or, equivalently using `async function`s:
+ *
+ * ```js
+ * const util = require('util');
+ * const fs = require('fs');
+ *
+ * const stat = util.promisify(fs.stat);
+ *
+ * async function callStat() {
+ * const stats = await stat('.');
+ * console.log(`This directory is owned by ${stats.uid}`);
+ * }
+ * ```
+ *
+ * If there is an `original[util.promisify.custom]` property present, `promisify`will return its value, see `Custom promisified functions`.
+ *
+ * `promisify()` assumes that `original` is a function taking a callback as its
+ * final argument in all cases. If `original` is not a function, `promisify()`will throw an error. If `original` is a function but its last argument is not
+ * an error-first callback, it will still be passed an error-first
+ * callback as its last argument.
+ *
+ * Using `promisify()` on class methods or other methods that use `this` may not
+ * work as expected unless handled specially:
+ *
+ * ```js
+ * const util = require('util');
+ *
+ * class Foo {
+ * constructor() {
+ * this.a = 42;
+ * }
+ *
+ * bar(callback) {
+ * callback(null, this.a);
+ * }
+ * }
+ *
+ * const foo = new Foo();
+ *
+ * const naiveBar = util.promisify(foo.bar);
+ * // TypeError: Cannot read property 'a' of undefined
+ * // naiveBar().then(a => console.log(a));
+ *
+ * naiveBar.call(foo).then((a) => console.log(a)); // '42'
+ *
+ * const bindBar = naiveBar.bind(foo);
+ * bindBar().then((a) => console.log(a)); // '42'
+ * ```
+ * @since v8.0.0
+ */
+ export function promisify<TCustom extends Function>(fn: CustomPromisify<TCustom>): TCustom;
+ export function promisify<TResult>(fn: (callback: (err: any, result: TResult) => void) => void): () => Promise<TResult>;
+ export function promisify(fn: (callback: (err?: any) => void) => void): () => Promise<void>;
+ export function promisify<T1, TResult>(fn: (arg1: T1, callback: (err: any, result: TResult) => void) => void): (arg1: T1) => Promise<TResult>;
+ export function promisify<T1>(fn: (arg1: T1, callback: (err?: any) => void) => void): (arg1: T1) => Promise<void>;
+ export function promisify<T1, T2, TResult>(fn: (arg1: T1, arg2: T2, callback: (err: any, result: TResult) => void) => void): (arg1: T1, arg2: T2) => Promise<TResult>;
+ export function promisify<T1, T2>(fn: (arg1: T1, arg2: T2, callback: (err?: any) => void) => void): (arg1: T1, arg2: T2) => Promise<void>;
+ export function promisify<T1, T2, T3, TResult>(fn: (arg1: T1, arg2: T2, arg3: T3, callback: (err: any, result: TResult) => void) => void): (arg1: T1, arg2: T2, arg3: T3) => Promise<TResult>;
+ export function promisify<T1, T2, T3>(fn: (arg1: T1, arg2: T2, arg3: T3, callback: (err?: any) => void) => void): (arg1: T1, arg2: T2, arg3: T3) => Promise<void>;
+ export function promisify<T1, T2, T3, T4, TResult>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, callback: (err: any, result: TResult) => void) => void
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4) => Promise<TResult>;
+ export function promisify<T1, T2, T3, T4>(fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, callback: (err?: any) => void) => void): (arg1: T1, arg2: T2, arg3: T3, arg4: T4) => Promise<void>;
+ export function promisify<T1, T2, T3, T4, T5, TResult>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, callback: (err: any, result: TResult) => void) => void
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5) => Promise<TResult>;
+ export function promisify<T1, T2, T3, T4, T5>(
+ fn: (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5, callback: (err?: any) => void) => void
+ ): (arg1: T1, arg2: T2, arg3: T3, arg4: T4, arg5: T5) => Promise<void>;
+ export function promisify(fn: Function): Function;
+ export namespace promisify {
+ const custom: unique symbol;
+ }
+ /**
+ * An implementation of the [WHATWG Encoding Standard](https://encoding.spec.whatwg.org/)
`TextDecoder` API. + * + * ```js + * const decoder = new TextDecoder('shift_jis'); + * let string = ''; + * let buffer; + * while (buffer = getNextChunkSomehow()) { + * string += decoder.decode(buffer, { stream: true }); + * } + * string += decoder.decode(); // end-of-stream + * ``` + * @since v8.3.0 + */ + export class TextDecoder { + /** + * The encoding supported by the `TextDecoder` instance. + */ + readonly encoding: string; + /** + * The value will be `true` if decoding errors result in a `TypeError` being + * thrown. + */ + readonly fatal: boolean; + /** + * The value will be `true` if the decoding result will include the byte order + * mark. + */ + readonly ignoreBOM: boolean; + constructor( + encoding?: string, + options?: { + fatal?: boolean | undefined; + ignoreBOM?: boolean | undefined; + } + ); + /** + * Decodes the `input` and returns a string. If `options.stream` is `true`, any + * incomplete byte sequences occurring at the end of the `input` are buffered + * internally and emitted after the next call to `textDecoder.decode()`. + * + * If `textDecoder.fatal` is `true`, decoding errors that occur will result in a`TypeError` being thrown. + * @param input An `ArrayBuffer`, `DataView` or `TypedArray` instance containing the encoded data. + */ + decode( + input?: NodeJS.ArrayBufferView | ArrayBuffer | null, + options?: { + stream?: boolean | undefined; + } + ): string; + } + export interface EncodeIntoResult { + /** + * The read Unicode code units of input. + */ + read: number; + /** + * The written UTF-8 bytes of output. + */ + written: number; + } + export { types }; + /** + * An implementation of the [WHATWG Encoding Standard](https://encoding.spec.whatwg.org/) `TextEncoder` API. All + * instances of `TextEncoder` only support UTF-8 encoding. + * + * ```js + * const encoder = new TextEncoder(); + * const uint8array = encoder.encode('this is some data'); + * ``` + * + * The `TextEncoder` class is also available on the global object. + * @since v8.3.0 + */ + export class TextEncoder { + /** + * The encoding supported by the `TextEncoder` instance. Always set to `'utf-8'`. + */ + readonly encoding: string; + /** + * UTF-8 encodes the `input` string and returns a `Uint8Array` containing the + * encoded bytes. + * @param [input='an empty string'] The text to encode. + */ + encode(input?: string): Uint8Array; + /** + * UTF-8 encodes the `src` string to the `dest` Uint8Array and returns an object + * containing the read Unicode code units and written UTF-8 bytes. + * + * ```js + * const encoder = new TextEncoder(); + * const src = 'this is some data'; + * const dest = new Uint8Array(10); + * const { read, written } = encoder.encodeInto(src, dest); + * ``` + * @param src The text to encode. + * @param dest The array to hold the encode result. + */ + encodeInto(src: string, dest: Uint8Array): EncodeIntoResult; + } +} +declare module 'util/types' { + export * from 'util/types'; +} +declare module 'util/types' { + import { KeyObject, webcrypto } from 'node:crypto'; + /** + * Returns `true` if the value is a built-in [`ArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer) or + * [`SharedArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/SharedArrayBuffer) instance. + * + * See also `util.types.isArrayBuffer()` and `util.types.isSharedArrayBuffer()`. 
+ * + * ```js + * util.types.isAnyArrayBuffer(new ArrayBuffer()); // Returns true + * util.types.isAnyArrayBuffer(new SharedArrayBuffer()); // Returns true + * ``` + * @since v10.0.0 + */ + function isAnyArrayBuffer(object: unknown): object is ArrayBufferLike; + /** + * Returns `true` if the value is an `arguments` object. + * + * ```js + * function foo() { + * util.types.isArgumentsObject(arguments); // Returns true + * } + * ``` + * @since v10.0.0 + */ + function isArgumentsObject(object: unknown): object is IArguments; + /** + * Returns `true` if the value is a built-in [`ArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer) instance. + * This does _not_ include [`SharedArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/SharedArrayBuffer) instances. Usually, it is + * desirable to test for both; See `util.types.isAnyArrayBuffer()` for that. + * + * ```js + * util.types.isArrayBuffer(new ArrayBuffer()); // Returns true + * util.types.isArrayBuffer(new SharedArrayBuffer()); // Returns false + * ``` + * @since v10.0.0 + */ + function isArrayBuffer(object: unknown): object is ArrayBuffer; + /** + * Returns `true` if the value is an instance of one of the [`ArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer) views, such as typed + * array objects or [`DataView`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView). Equivalent to + * [`ArrayBuffer.isView()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/isView). + * + * ```js + * util.types.isArrayBufferView(new Int8Array()); // true + * util.types.isArrayBufferView(Buffer.from('hello world')); // true + * util.types.isArrayBufferView(new DataView(new ArrayBuffer(16))); // true + * util.types.isArrayBufferView(new ArrayBuffer()); // false + * ``` + * @since v10.0.0 + */ + function isArrayBufferView(object: unknown): object is NodeJS.ArrayBufferView; + /** + * Returns `true` if the value is an [async function](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Statements/async_function). + * This only reports back what the JavaScript engine is seeing; + * in particular, the return value may not match the original source code if + * a transpilation tool was used. + * + * ```js + * util.types.isAsyncFunction(function foo() {}); // Returns false + * util.types.isAsyncFunction(async function foo() {}); // Returns true + * ``` + * @since v10.0.0 + */ + function isAsyncFunction(object: unknown): boolean; + /** + * Returns `true` if the value is a `BigInt64Array` instance. + * + * ```js + * util.types.isBigInt64Array(new BigInt64Array()); // Returns true + * util.types.isBigInt64Array(new BigUint64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isBigInt64Array(value: unknown): value is BigInt64Array; + /** + * Returns `true` if the value is a `BigUint64Array` instance. + * + * ```js + * util.types.isBigUint64Array(new BigInt64Array()); // Returns false + * util.types.isBigUint64Array(new BigUint64Array()); // Returns true + * ``` + * @since v10.0.0 + */ + function isBigUint64Array(value: unknown): value is BigUint64Array; + /** + * Returns `true` if the value is a boolean object, e.g. created + * by `new Boolean()`. 
+ *
+ * ```js
+ * util.types.isBooleanObject(false); // Returns false
+ * util.types.isBooleanObject(true); // Returns false
+ * util.types.isBooleanObject(new Boolean(false)); // Returns true
+ * util.types.isBooleanObject(new Boolean(true)); // Returns true
+ * util.types.isBooleanObject(Boolean(false)); // Returns false
+ * util.types.isBooleanObject(Boolean(true)); // Returns false
+ * ```
+ * @since v10.0.0
+ */
+ function isBooleanObject(object: unknown): object is Boolean;
+ /**
+ * Returns `true` if the value is any boxed primitive object, e.g. created
+ * by `new Boolean()`, `new String()` or `Object(Symbol())`.
+ *
+ * For example:
+ *
+ * ```js
+ * util.types.isBoxedPrimitive(false); // Returns false
+ * util.types.isBoxedPrimitive(new Boolean(false)); // Returns true
+ * util.types.isBoxedPrimitive(Symbol('foo')); // Returns false
+ * util.types.isBoxedPrimitive(Object(Symbol('foo'))); // Returns true
+ * util.types.isBoxedPrimitive(Object(BigInt(5))); // Returns true
+ * ```
+ * @since v10.11.0
+ */
+ function isBoxedPrimitive(object: unknown): object is String | Number | BigInt | Boolean | Symbol;
+ /**
+ * Returns `true` if the value is a built-in [`DataView`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/DataView) instance.
+ *
+ * ```js
+ * const ab = new ArrayBuffer(20);
+ * util.types.isDataView(new DataView(ab)); // Returns true
+ * util.types.isDataView(new Float64Array()); // Returns false
+ * ```
+ *
+ * See also [`ArrayBuffer.isView()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/isView).
+ * @since v10.0.0
+ */
+ function isDataView(object: unknown): object is DataView;
+ /**
+ * Returns `true` if the value is a built-in [`Date`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Date) instance.
+ *
+ * ```js
+ * util.types.isDate(new Date()); // Returns true
+ * ```
+ * @since v10.0.0
+ */
+ function isDate(object: unknown): object is Date;
+ /**
+ * Returns `true` if the value is a native `External` value.
+ *
+ * A native `External` value is a special type of object that contains a
+ * raw C++ pointer (`void*`) for access from native code, and has no other
+ * properties. Such objects are created either by Node.js internals or native
+ * addons. In JavaScript, they are [frozen](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/freeze) objects with a`null` prototype.
+ *
+ * ```c
+ * #include <js_native_api.h>
+ * #include <stdlib.h>
+ * napi_value result;
+ * static napi_value MyNapi(napi_env env, napi_callback_info info) {
+ * int* raw = (int*) malloc(1024);
+ * napi_status status = napi_create_external(env, (void*) raw, NULL, NULL, &result);
+ * if (status != napi_ok) {
+ * napi_throw_error(env, NULL, "napi_create_external failed");
+ * return NULL;
+ * }
+ * return result;
+ * }
+ * ...
+ * DECLARE_NAPI_PROPERTY("myNapi", MyNapi)
+ * ...
+ * ```
+ *
+ * ```js
+ * const native = require('napi_addon.node');
+ * const data = native.myNapi();
+ * util.types.isExternal(data); // returns true
+ * util.types.isExternal(0); // returns false
+ * util.types.isExternal(new String('foo')); // returns false
+ * ```
+ *
+ * For further information on `napi_create_external`, refer to `napi_create_external()`.
+ * @since v10.0.0
+ */
+ function isExternal(object: unknown): boolean;
+ /**
+ * Returns `true` if the value is a built-in [`Float32Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Float32Array) instance.
+ * + * ```js + * util.types.isFloat32Array(new ArrayBuffer()); // Returns false + * util.types.isFloat32Array(new Float32Array()); // Returns true + * util.types.isFloat32Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isFloat32Array(object: unknown): object is Float32Array; + /** + * Returns `true` if the value is a built-in [`Float64Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Float64Array) instance. + * + * ```js + * util.types.isFloat64Array(new ArrayBuffer()); // Returns false + * util.types.isFloat64Array(new Uint8Array()); // Returns false + * util.types.isFloat64Array(new Float64Array()); // Returns true + * ``` + * @since v10.0.0 + */ + function isFloat64Array(object: unknown): object is Float64Array; + /** + * Returns `true` if the value is a generator function. + * This only reports back what the JavaScript engine is seeing; + * in particular, the return value may not match the original source code if + * a transpilation tool was used. + * + * ```js + * util.types.isGeneratorFunction(function foo() {}); // Returns false + * util.types.isGeneratorFunction(function* foo() {}); // Returns true + * ``` + * @since v10.0.0 + */ + function isGeneratorFunction(object: unknown): object is GeneratorFunction; + /** + * Returns `true` if the value is a generator object as returned from a + * built-in generator function. + * This only reports back what the JavaScript engine is seeing; + * in particular, the return value may not match the original source code if + * a transpilation tool was used. + * + * ```js + * function* foo() {} + * const generator = foo(); + * util.types.isGeneratorObject(generator); // Returns true + * ``` + * @since v10.0.0 + */ + function isGeneratorObject(object: unknown): object is Generator; + /** + * Returns `true` if the value is a built-in [`Int8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Int8Array) instance. + * + * ```js + * util.types.isInt8Array(new ArrayBuffer()); // Returns false + * util.types.isInt8Array(new Int8Array()); // Returns true + * util.types.isInt8Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isInt8Array(object: unknown): object is Int8Array; + /** + * Returns `true` if the value is a built-in [`Int16Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Int16Array) instance. + * + * ```js + * util.types.isInt16Array(new ArrayBuffer()); // Returns false + * util.types.isInt16Array(new Int16Array()); // Returns true + * util.types.isInt16Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isInt16Array(object: unknown): object is Int16Array; + /** + * Returns `true` if the value is a built-in [`Int32Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Int32Array) instance. + * + * ```js + * util.types.isInt32Array(new ArrayBuffer()); // Returns false + * util.types.isInt32Array(new Int32Array()); // Returns true + * util.types.isInt32Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isInt32Array(object: unknown): object is Int32Array; + /** + * Returns `true` if the value is a built-in [`Map`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map) instance. 
+ * + * ```js + * util.types.isMap(new Map()); // Returns true + * ``` + * @since v10.0.0 + */ + function isMap(object: T | {}): object is T extends ReadonlyMap ? (unknown extends T ? never : ReadonlyMap) : Map; + /** + * Returns `true` if the value is an iterator returned for a built-in [`Map`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map) instance. + * + * ```js + * const map = new Map(); + * util.types.isMapIterator(map.keys()); // Returns true + * util.types.isMapIterator(map.values()); // Returns true + * util.types.isMapIterator(map.entries()); // Returns true + * util.types.isMapIterator(map[Symbol.iterator]()); // Returns true + * ``` + * @since v10.0.0 + */ + function isMapIterator(object: unknown): boolean; + /** + * Returns `true` if the value is an instance of a [Module Namespace Object](https://tc39.github.io/ecma262/#sec-module-namespace-exotic-objects). + * + * ```js + * import * as ns from './a.js'; + * + * util.types.isModuleNamespaceObject(ns); // Returns true + * ``` + * @since v10.0.0 + */ + function isModuleNamespaceObject(value: unknown): boolean; + /** + * Returns `true` if the value is an instance of a built-in `Error` type. + * + * ```js + * util.types.isNativeError(new Error()); // Returns true + * util.types.isNativeError(new TypeError()); // Returns true + * util.types.isNativeError(new RangeError()); // Returns true + * ``` + * @since v10.0.0 + */ + function isNativeError(object: unknown): object is Error; + /** + * Returns `true` if the value is a number object, e.g. created + * by `new Number()`. + * + * ```js + * util.types.isNumberObject(0); // Returns false + * util.types.isNumberObject(new Number(0)); // Returns true + * ``` + * @since v10.0.0 + */ + function isNumberObject(object: unknown): object is Number; + /** + * Returns `true` if the value is a built-in [`Promise`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Promise). + * + * ```js + * util.types.isPromise(Promise.resolve(42)); // Returns true + * ``` + * @since v10.0.0 + */ + function isPromise(object: unknown): object is Promise; + /** + * Returns `true` if the value is a [`Proxy`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Proxy) instance. + * + * ```js + * const target = {}; + * const proxy = new Proxy(target, {}); + * util.types.isProxy(target); // Returns false + * util.types.isProxy(proxy); // Returns true + * ``` + * @since v10.0.0 + */ + function isProxy(object: unknown): boolean; + /** + * Returns `true` if the value is a regular expression object. + * + * ```js + * util.types.isRegExp(/abc/); // Returns true + * util.types.isRegExp(new RegExp('abc')); // Returns true + * ``` + * @since v10.0.0 + */ + function isRegExp(object: unknown): object is RegExp; + /** + * Returns `true` if the value is a built-in [`Set`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set) instance. + * + * ```js + * util.types.isSet(new Set()); // Returns true + * ``` + * @since v10.0.0 + */ + function isSet(object: T | {}): object is T extends ReadonlySet ? (unknown extends T ? never : ReadonlySet) : Set; + /** + * Returns `true` if the value is an iterator returned for a built-in [`Set`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set) instance. 
+ * + * ```js + * const set = new Set(); + * util.types.isSetIterator(set.keys()); // Returns true + * util.types.isSetIterator(set.values()); // Returns true + * util.types.isSetIterator(set.entries()); // Returns true + * util.types.isSetIterator(set[Symbol.iterator]()); // Returns true + * ``` + * @since v10.0.0 + */ + function isSetIterator(object: unknown): boolean; + /** + * Returns `true` if the value is a built-in [`SharedArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/SharedArrayBuffer) instance. + * This does _not_ include [`ArrayBuffer`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer) instances. Usually, it is + * desirable to test for both; See `util.types.isAnyArrayBuffer()` for that. + * + * ```js + * util.types.isSharedArrayBuffer(new ArrayBuffer()); // Returns false + * util.types.isSharedArrayBuffer(new SharedArrayBuffer()); // Returns true + * ``` + * @since v10.0.0 + */ + function isSharedArrayBuffer(object: unknown): object is SharedArrayBuffer; + /** + * Returns `true` if the value is a string object, e.g. created + * by `new String()`. + * + * ```js + * util.types.isStringObject('foo'); // Returns false + * util.types.isStringObject(new String('foo')); // Returns true + * ``` + * @since v10.0.0 + */ + function isStringObject(object: unknown): object is String; + /** + * Returns `true` if the value is a symbol object, created + * by calling `Object()` on a `Symbol` primitive. + * + * ```js + * const symbol = Symbol('foo'); + * util.types.isSymbolObject(symbol); // Returns false + * util.types.isSymbolObject(Object(symbol)); // Returns true + * ``` + * @since v10.0.0 + */ + function isSymbolObject(object: unknown): object is Symbol; + /** + * Returns `true` if the value is a built-in [`TypedArray`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray) instance. + * + * ```js + * util.types.isTypedArray(new ArrayBuffer()); // Returns false + * util.types.isTypedArray(new Uint8Array()); // Returns true + * util.types.isTypedArray(new Float64Array()); // Returns true + * ``` + * + * See also [`ArrayBuffer.isView()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/ArrayBuffer/isView). + * @since v10.0.0 + */ + function isTypedArray(object: unknown): object is NodeJS.TypedArray; + /** + * Returns `true` if the value is a built-in [`Uint8Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8Array) instance. + * + * ```js + * util.types.isUint8Array(new ArrayBuffer()); // Returns false + * util.types.isUint8Array(new Uint8Array()); // Returns true + * util.types.isUint8Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isUint8Array(object: unknown): object is Uint8Array; + /** + * Returns `true` if the value is a built-in [`Uint8ClampedArray`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint8ClampedArray) instance. 
+ * + * ```js + * util.types.isUint8ClampedArray(new ArrayBuffer()); // Returns false + * util.types.isUint8ClampedArray(new Uint8ClampedArray()); // Returns true + * util.types.isUint8ClampedArray(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isUint8ClampedArray(object: unknown): object is Uint8ClampedArray; + /** + * Returns `true` if the value is a built-in [`Uint16Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint16Array) instance. + * + * ```js + * util.types.isUint16Array(new ArrayBuffer()); // Returns false + * util.types.isUint16Array(new Uint16Array()); // Returns true + * util.types.isUint16Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isUint16Array(object: unknown): object is Uint16Array; + /** + * Returns `true` if the value is a built-in [`Uint32Array`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Uint32Array) instance. + * + * ```js + * util.types.isUint32Array(new ArrayBuffer()); // Returns false + * util.types.isUint32Array(new Uint32Array()); // Returns true + * util.types.isUint32Array(new Float64Array()); // Returns false + * ``` + * @since v10.0.0 + */ + function isUint32Array(object: unknown): object is Uint32Array; + /** + * Returns `true` if the value is a built-in [`WeakMap`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/WeakMap) instance. + * + * ```js + * util.types.isWeakMap(new WeakMap()); // Returns true + * ``` + * @since v10.0.0 + */ + function isWeakMap(object: unknown): object is WeakMap; + /** + * Returns `true` if the value is a built-in [`WeakSet`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/WeakSet) instance. + * + * ```js + * util.types.isWeakSet(new WeakSet()); // Returns true + * ``` + * @since v10.0.0 + */ + function isWeakSet(object: unknown): object is WeakSet; + /** + * Returns `true` if `value` is a `KeyObject`, `false` otherwise. + * @since v16.2.0 + */ + function isKeyObject(object: unknown): object is KeyObject; + /** + * Returns `true` if `value` is a `CryptoKey`, `false` otherwise. + * @since v16.2.0 + */ + function isCryptoKey(object: unknown): object is webcrypto.CryptoKey; +} +declare module 'node:util' { + export * from 'util'; +} +declare module 'node:util/types' { + export * from 'util/types'; +} diff --git a/node_modules/@types/node/v8.d.ts b/node_modules/@types/node/v8.d.ts new file mode 100644 index 000000000..f467fccf8 --- /dev/null +++ b/node_modules/@types/node/v8.d.ts @@ -0,0 +1,378 @@ +/** + * The `v8` module exposes APIs that are specific to the version of [V8](https://developers.google.com/v8/) built into the Node.js binary. It can be accessed using: + * + * ```js + * const v8 = require('v8'); + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/v8.js) + */ +declare module 'v8' { + import { Readable } from 'node:stream'; + interface HeapSpaceInfo { + space_name: string; + space_size: number; + space_used_size: number; + space_available_size: number; + physical_space_size: number; + } + // ** Signifies if the --zap_code_space option is enabled or not. 1 == enabled, 0 == disabled. 
*/ + type DoesZapCodeSpaceFlag = 0 | 1; + interface HeapInfo { + total_heap_size: number; + total_heap_size_executable: number; + total_physical_size: number; + total_available_size: number; + used_heap_size: number; + heap_size_limit: number; + malloced_memory: number; + peak_malloced_memory: number; + does_zap_garbage: DoesZapCodeSpaceFlag; + number_of_native_contexts: number; + number_of_detached_contexts: number; + } + interface HeapCodeStatistics { + code_and_metadata_size: number; + bytecode_and_metadata_size: number; + external_script_source_size: number; + } + /** + * Returns an integer representing a version tag derived from the V8 version, + * command-line flags, and detected CPU features. This is useful for determining + * whether a `vm.Script` `cachedData` buffer is compatible with this instance + * of V8. + * + * ```js + * console.log(v8.cachedDataVersionTag()); // 3947234607 + * // The value returned by v8.cachedDataVersionTag() is derived from the V8 + * // version, command-line flags, and detected CPU features. Test that the value + * // does indeed update when flags are toggled. + * v8.setFlagsFromString('--allow_natives_syntax'); + * console.log(v8.cachedDataVersionTag()); // 183726201 + * ``` + * @since v8.0.0 + */ + function cachedDataVersionTag(): number; + /** + * Returns an object with the following properties: + * + * `does_zap_garbage` is a 0/1 boolean, which signifies whether the`--zap_code_space` option is enabled or not. This makes V8 overwrite heap + * garbage with a bit pattern. The RSS footprint (resident set size) gets bigger + * because it continuously touches all heap pages and that makes them less likely + * to get swapped out by the operating system. + * + * `number_of_native_contexts` The value of native\_context is the number of the + * top-level contexts currently active. Increase of this number over time indicates + * a memory leak. + * + * `number_of_detached_contexts` The value of detached\_context is the number + * of contexts that were detached and not yet garbage collected. This number + * being non-zero indicates a potential memory leak. + * + * ```js + * { + * total_heap_size: 7326976, + * total_heap_size_executable: 4194304, + * total_physical_size: 7326976, + * total_available_size: 1152656, + * used_heap_size: 3476208, + * heap_size_limit: 1535115264, + * malloced_memory: 16384, + * peak_malloced_memory: 1127496, + * does_zap_garbage: 0, + * number_of_native_contexts: 1, + * number_of_detached_contexts: 0 + * } + * ``` + * @since v1.0.0 + */ + function getHeapStatistics(): HeapInfo; + /** + * Returns statistics about the V8 heap spaces, i.e. the segments which make up + * the V8 heap. Neither the ordering of heap spaces, nor the availability of a + * heap space can be guaranteed as the statistics are provided via the + * V8[`GetHeapSpaceStatistics`](https://v8docs.nodesource.com/node-13.2/d5/dda/classv8_1_1_isolate.html#ac673576f24fdc7a33378f8f57e1d13a4) function and may change from one V8 version to the + * next. 
+ * + * The value returned is an array of objects containing the following properties: + * + * ```json + * [ + * { + * "space_name": "new_space", + * "space_size": 2063872, + * "space_used_size": 951112, + * "space_available_size": 80824, + * "physical_space_size": 2063872 + * }, + * { + * "space_name": "old_space", + * "space_size": 3090560, + * "space_used_size": 2493792, + * "space_available_size": 0, + * "physical_space_size": 3090560 + * }, + * { + * "space_name": "code_space", + * "space_size": 1260160, + * "space_used_size": 644256, + * "space_available_size": 960, + * "physical_space_size": 1260160 + * }, + * { + * "space_name": "map_space", + * "space_size": 1094160, + * "space_used_size": 201608, + * "space_available_size": 0, + * "physical_space_size": 1094160 + * }, + * { + * "space_name": "large_object_space", + * "space_size": 0, + * "space_used_size": 0, + * "space_available_size": 1490980608, + * "physical_space_size": 0 + * } + * ] + * ``` + * @since v6.0.0 + */ + function getHeapSpaceStatistics(): HeapSpaceInfo[]; + /** + * The `v8.setFlagsFromString()` method can be used to programmatically set + * V8 command-line flags. This method should be used with care. Changing settings + * after the VM has started may result in unpredictable behavior, including + * crashes and data loss; or it may simply do nothing. + * + * The V8 options available for a version of Node.js may be determined by running`node --v8-options`. + * + * Usage: + * + * ```js + * // Print GC events to stdout for one minute. + * const v8 = require('v8'); + * v8.setFlagsFromString('--trace_gc'); + * setTimeout(() => { v8.setFlagsFromString('--notrace_gc'); }, 60e3); + * ``` + * @since v1.0.0 + */ + function setFlagsFromString(flags: string): void; + /** + * Generates a snapshot of the current V8 heap and returns a Readable + * Stream that may be used to read the JSON serialized representation. + * This JSON stream format is intended to be used with tools such as + * Chrome DevTools. The JSON schema is undocumented and specific to the + * V8 engine. Therefore, the schema may change from one version of V8 to the next. + * + * ```js + * // Print heap snapshot to the console + * const v8 = require('v8'); + * const stream = v8.getHeapSnapshot(); + * stream.pipe(process.stdout); + * ``` + * @since v11.13.0 + * @return A Readable Stream containing the V8 heap snapshot + */ + function getHeapSnapshot(): Readable; + /** + * Generates a snapshot of the current V8 heap and writes it to a JSON + * file. This file is intended to be used with tools such as Chrome + * DevTools. The JSON schema is undocumented and specific to the V8 + * engine, and may change from one version of V8 to the next. + * + * A heap snapshot is specific to a single V8 isolate. When using `worker threads`, a heap snapshot generated from the main thread will + * not contain any information about the workers, and vice versa. + * + * ```js + * const { writeHeapSnapshot } = require('v8'); + * const { + * Worker, + * isMainThread, + * parentPort + * } = require('worker_threads'); + * + * if (isMainThread) { + * const worker = new Worker(__filename); + * + * worker.once('message', (filename) => { + * console.log(`worker heapdump: ${filename}`); + * // Now get a heapdump for the main thread. + * console.log(`main thread heapdump: ${writeHeapSnapshot()}`); + * }); + * + * // Tell the worker to create a heapdump. 
+ * worker.postMessage('heapdump'); + * } else { + * parentPort.once('message', (message) => { + * if (message === 'heapdump') { + * // Generate a heapdump for the worker + * // and return the filename to the parent. + * parentPort.postMessage(writeHeapSnapshot()); + * } + * }); + * } + * ``` + * @since v11.13.0 + * @param filename The file path where the V8 heap snapshot is to be saved. If not specified, a file name with the pattern `'Heap-${yyyymmdd}-${hhmmss}-${pid}-${thread_id}.heapsnapshot'` will be + * generated, where `{pid}` will be the PID of the Node.js process, `{thread_id}` will be `0` when `writeHeapSnapshot()` is called from the main Node.js thread or the id of a + * worker thread. + * @return The filename where the snapshot was saved. + */ + function writeHeapSnapshot(filename?: string): string; + /** + * Returns an object with the following properties: + * + * ```js + * { + * code_and_metadata_size: 212208, + * bytecode_and_metadata_size: 161368, + * external_script_source_size: 1410794 + * } + * ``` + * @since v12.8.0 + */ + function getHeapCodeStatistics(): HeapCodeStatistics; + /** + * @since v8.0.0 + */ + class Serializer { + /** + * Writes out a header, which includes the serialization format version. + */ + writeHeader(): void; + /** + * Serializes a JavaScript value and adds the serialized representation to the + * internal buffer. + * + * This throws an error if `value` cannot be serialized. + */ + writeValue(val: any): boolean; + /** + * Returns the stored internal buffer. This serializer should not be used once + * the buffer is released. Calling this method results in undefined behavior + * if a previous write has failed. + */ + releaseBuffer(): Buffer; + /** + * Marks an `ArrayBuffer` as having its contents transferred out of band. + * Pass the corresponding `ArrayBuffer` in the deserializing context to `deserializer.transferArrayBuffer()`. + * @param id A 32-bit unsigned integer. + * @param arrayBuffer An `ArrayBuffer` instance. + */ + transferArrayBuffer(id: number, arrayBuffer: ArrayBuffer): void; + /** + * Write a raw 32-bit unsigned integer. + * For use inside of a custom `serializer._writeHostObject()`. + */ + writeUint32(value: number): void; + /** + * Write a raw 64-bit unsigned integer, split into high and low 32-bit parts. + * For use inside of a custom `serializer._writeHostObject()`. + */ + writeUint64(hi: number, lo: number): void; + /** + * Write a JS `number` value. + * For use inside of a custom `serializer._writeHostObject()`. + */ + writeDouble(value: number): void; + /** + * Write raw bytes into the serializer’s internal buffer. The deserializer + * will require a way to compute the length of the buffer. + * For use inside of a custom `serializer._writeHostObject()`. + */ + writeRawBytes(buffer: NodeJS.TypedArray): void; + } + /** + * A subclass of `Serializer` that serializes `TypedArray`(in particular `Buffer`) and `DataView` objects as host objects, and only + * stores the part of their underlying `ArrayBuffer`s that they are referring to. + * @since v8.0.0 + */ + class DefaultSerializer extends Serializer {} + /** + * @since v8.0.0 + */ + class Deserializer { + constructor(data: NodeJS.TypedArray); + /** + * Reads and validates a header (including the format version). + * May, for example, reject an invalid or unsupported wire format. In that case, + * an `Error` is thrown. + */ + readHeader(): boolean; + /** + * Deserializes a JavaScript value from the buffer and returns it. 
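+         *
+         * As an illustrative sketch only, a round trip through `DefaultSerializer` and
+         * `DefaultDeserializer` (roughly what `v8.serialize()`/`v8.deserialize()` wrap):
+         *
+         * ```js
+         * const v8 = require('v8');
+         *
+         * const ser = new v8.DefaultSerializer();
+         * ser.writeHeader();
+         * ser.writeValue({ answer: 42 });
+         *
+         * const des = new v8.DefaultDeserializer(ser.releaseBuffer());
+         * des.readHeader();
+         * des.readValue(); // Returns { answer: 42 }
+         * ```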
+ */ + readValue(): any; + /** + * Marks an `ArrayBuffer` as having its contents transferred out of band. + * Pass the corresponding `ArrayBuffer` in the serializing context to `serializer.transferArrayBuffer()` (or return the `id` from `serializer._getSharedArrayBufferId()` in the case of + * `SharedArrayBuffer`s). + * @param id A 32-bit unsigned integer. + * @param arrayBuffer An `ArrayBuffer` instance. + */ + transferArrayBuffer(id: number, arrayBuffer: ArrayBuffer): void; + /** + * Reads the underlying wire format version. Likely mostly to be useful to + * legacy code reading old wire format versions. May not be called before`.readHeader()`. + */ + getWireFormatVersion(): number; + /** + * Read a raw 32-bit unsigned integer and return it. + * For use inside of a custom `deserializer._readHostObject()`. + */ + readUint32(): number; + /** + * Read a raw 64-bit unsigned integer and return it as an array `[hi, lo]`with two 32-bit unsigned integer entries. + * For use inside of a custom `deserializer._readHostObject()`. + */ + readUint64(): [number, number]; + /** + * Read a JS `number` value. + * For use inside of a custom `deserializer._readHostObject()`. + */ + readDouble(): number; + /** + * Read raw bytes from the deserializer’s internal buffer. The `length` parameter + * must correspond to the length of the buffer that was passed to `serializer.writeRawBytes()`. + * For use inside of a custom `deserializer._readHostObject()`. + */ + readRawBytes(length: number): Buffer; + } + /** + * A subclass of `Deserializer` corresponding to the format written by `DefaultSerializer`. + * @since v8.0.0 + */ + class DefaultDeserializer extends Deserializer {} + /** + * Uses a `DefaultSerializer` to serialize `value` into a buffer. + * @since v8.0.0 + */ + function serialize(value: any): Buffer; + /** + * Uses a `DefaultDeserializer` with default options to read a JS value + * from a buffer. + * @since v8.0.0 + * @param buffer A buffer returned by {@link serialize}. + */ + function deserialize(buffer: NodeJS.TypedArray): any; + /** + * The `v8.takeCoverage()` method allows the user to write the coverage started by `NODE_V8_COVERAGE` to disk on demand. This method can be invoked multiple + * times during the lifetime of the process. Each time the execution counter will + * be reset and a new coverage report will be written to the directory specified + * by `NODE_V8_COVERAGE`. + * + * When the process is about to exit, one last coverage will still be written to + * disk unless {@link stopCoverage} is invoked before the process exits. + * @since v15.1.0, v12.22.0 + */ + function takeCoverage(): void; + /** + * The `v8.stopCoverage()` method allows the user to stop the coverage collection + * started by `NODE_V8_COVERAGE`, so that V8 can release the execution count + * records and optimize code. This can be used in conjunction with {@link takeCoverage} if the user wants to collect the coverage on demand. + * @since v15.1.0, v12.22.0 + */ + function stopCoverage(): void; +} +declare module 'node:v8' { + export * from 'v8'; +} diff --git a/node_modules/@types/node/vm.d.ts b/node_modules/@types/node/vm.d.ts new file mode 100644 index 000000000..a46b39190 --- /dev/null +++ b/node_modules/@types/node/vm.d.ts @@ -0,0 +1,507 @@ +/** + * The `vm` module enables compiling and running code within V8 Virtual + * Machine contexts. **The `vm` module is not a security mechanism. 
Do** + * **not use it to run untrusted code.** + * + * JavaScript code can be compiled and run immediately or + * compiled, saved, and run later. + * + * A common use case is to run the code in a different V8 Context. This means + * invoked code has a different global object than the invoking code. + * + * One can provide the context by `contextifying` an + * object. The invoked code treats any property in the context like a + * global variable. Any changes to global variables caused by the invoked + * code are reflected in the context object. + * + * ```js + * const vm = require('vm'); + * + * const x = 1; + * + * const context = { x: 2 }; + * vm.createContext(context); // Contextify the object. + * + * const code = 'x += 40; var y = 17;'; + * // `x` and `y` are global variables in the context. + * // Initially, x has the value 2 because that is the value of context.x. + * vm.runInContext(code, context); + * + * console.log(context.x); // 42 + * console.log(context.y); // 17 + * + * console.log(x); // 1; y is not defined. + * ``` + * @see [source](https://github.com/nodejs/node/blob/v16.9.0/lib/vm.js) + */ +declare module 'vm' { + interface Context extends NodeJS.Dict {} + interface BaseOptions { + /** + * Specifies the filename used in stack traces produced by this script. + * Default: `''`. + */ + filename?: string | undefined; + /** + * Specifies the line number offset that is displayed in stack traces produced by this script. + * Default: `0`. + */ + lineOffset?: number | undefined; + /** + * Specifies the column number offset that is displayed in stack traces produced by this script. + * @default 0 + */ + columnOffset?: number | undefined; + } + interface ScriptOptions extends BaseOptions { + displayErrors?: boolean | undefined; + timeout?: number | undefined; + cachedData?: Buffer | undefined; + /** @deprecated in favor of `script.createCachedData()` */ + produceCachedData?: boolean | undefined; + } + interface RunningScriptOptions extends BaseOptions { + /** + * When `true`, if an `Error` occurs while compiling the `code`, the line of code causing the error is attached to the stack trace. + * Default: `true`. + */ + displayErrors?: boolean | undefined; + /** + * Specifies the number of milliseconds to execute code before terminating execution. + * If execution is terminated, an `Error` will be thrown. This value must be a strictly positive integer. + */ + timeout?: number | undefined; + /** + * If `true`, the execution will be terminated when `SIGINT` (Ctrl+C) is received. + * Existing handlers for the event that have been attached via `process.on('SIGINT')` will be disabled during script execution, but will continue to work after that. + * If execution is terminated, an `Error` will be thrown. + * Default: `false`. + */ + breakOnSigint?: boolean | undefined; + /** + * If set to `afterEvaluate`, microtasks will be run immediately after the script has run. + */ + microtaskMode?: 'afterEvaluate' | undefined; + } + interface CompileFunctionOptions extends BaseOptions { + /** + * Provides an optional data with V8's code cache data for the supplied source. + */ + cachedData?: Buffer | undefined; + /** + * Specifies whether to produce new cache data. + * Default: `false`, + */ + produceCachedData?: boolean | undefined; + /** + * The sandbox/context in which the said function should be compiled in. 
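+         *
+         * A hedged sketch, assuming that free variables in the compiled function are
+         * resolved against the parsing context's global object:
+         *
+         * ```js
+         * const vm = require('vm');
+         *
+         * const parsingContext = vm.createContext({ base: 40 });
+         * const addToBase = vm.compileFunction('return base + x;', ['x'], { parsingContext });
+         * addToBase(2); // Expected to return 42, with `base` taken from parsingContext
+         * ```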
+ */ + parsingContext?: Context | undefined; + /** + * An array containing a collection of context extensions (objects wrapping the current scope) to be applied while compiling + */ + contextExtensions?: Object[] | undefined; + } + interface CreateContextOptions { + /** + * Human-readable name of the newly created context. + * @default 'VM Context i' Where i is an ascending numerical index of the created context. + */ + name?: string | undefined; + /** + * Corresponds to the newly created context for display purposes. + * The origin should be formatted like a `URL`, but with only the scheme, host, and port (if necessary), + * like the value of the `url.origin` property of a URL object. + * Most notably, this string should omit the trailing slash, as that denotes a path. + * @default '' + */ + origin?: string | undefined; + codeGeneration?: + | { + /** + * If set to false any calls to eval or function constructors (Function, GeneratorFunction, etc) + * will throw an EvalError. + * @default true + */ + strings?: boolean | undefined; + /** + * If set to false any attempt to compile a WebAssembly module will throw a WebAssembly.CompileError. + * @default true + */ + wasm?: boolean | undefined; + } + | undefined; + /** + * If set to `afterEvaluate`, microtasks will be run immediately after the script has run. + */ + microtaskMode?: 'afterEvaluate' | undefined; + } + type MeasureMemoryMode = 'summary' | 'detailed'; + interface MeasureMemoryOptions { + /** + * @default 'summary' + */ + mode?: MeasureMemoryMode | undefined; + context?: Context | undefined; + } + interface MemoryMeasurement { + total: { + jsMemoryEstimate: number; + jsMemoryRange: [number, number]; + }; + } + /** + * Instances of the `vm.Script` class contain precompiled scripts that can be + * executed in specific contexts. + * @since v0.3.1 + */ + class Script { + constructor(code: string, options?: ScriptOptions); + /** + * Runs the compiled code contained by the `vm.Script` object within the given`contextifiedObject` and returns the result. Running code does not have access + * to local scope. + * + * The following example compiles code that increments a global variable, sets + * the value of another global variable, then execute the code multiple times. + * The globals are contained in the `context` object. + * + * ```js + * const vm = require('vm'); + * + * const context = { + * animal: 'cat', + * count: 2 + * }; + * + * const script = new vm.Script('count += 1; name = "kitty";'); + * + * vm.createContext(context); + * for (let i = 0; i < 10; ++i) { + * script.runInContext(context); + * } + * + * console.log(context); + * // Prints: { animal: 'cat', count: 12, name: 'kitty' } + * ``` + * + * Using the `timeout` or `breakOnSigint` options will result in new event loops + * and corresponding threads being started, which have a non-zero performance + * overhead. + * @since v0.3.1 + * @param contextifiedObject A `contextified` object as returned by the `vm.createContext()` method. + * @return the result of the very last statement executed in the script. + */ + runInContext(contextifiedObject: Context, options?: RunningScriptOptions): any; + /** + * First contextifies the given `contextObject`, runs the compiled code contained + * by the `vm.Script` object within the created context, and returns the result. + * Running code does not have access to local scope. + * + * The following example compiles code that sets a global variable, then executes + * the code multiple times in different contexts. 
The globals are set on and + * contained within each individual `context`. + * + * ```js + * const vm = require('vm'); + * + * const script = new vm.Script('globalVar = "set"'); + * + * const contexts = [{}, {}, {}]; + * contexts.forEach((context) => { + * script.runInNewContext(context); + * }); + * + * console.log(contexts); + * // Prints: [{ globalVar: 'set' }, { globalVar: 'set' }, { globalVar: 'set' }] + * ``` + * @since v0.3.1 + * @param contextObject An object that will be `contextified`. If `undefined`, a new object will be created. + * @return the result of the very last statement executed in the script. + */ + runInNewContext(contextObject?: Context, options?: RunningScriptOptions): any; + /** + * Runs the compiled code contained by the `vm.Script` within the context of the + * current `global` object. Running code does not have access to local scope, but_does_ have access to the current `global` object. + * + * The following example compiles code that increments a `global` variable then + * executes that code multiple times: + * + * ```js + * const vm = require('vm'); + * + * global.globalVar = 0; + * + * const script = new vm.Script('globalVar += 1', { filename: 'myfile.vm' }); + * + * for (let i = 0; i < 1000; ++i) { + * script.runInThisContext(); + * } + * + * console.log(globalVar); + * + * // 1000 + * ``` + * @since v0.3.1 + * @return the result of the very last statement executed in the script. + */ + runInThisContext(options?: RunningScriptOptions): any; + /** + * Creates a code cache that can be used with the `Script` constructor's`cachedData` option. Returns a `Buffer`. This method may be called at any + * time and any number of times. + * + * ```js + * const script = new vm.Script(` + * function add(a, b) { + * return a + b; + * } + * + * const x = add(1, 2); + * `); + * + * const cacheWithoutX = script.createCachedData(); + * + * script.runInThisContext(); + * + * const cacheWithX = script.createCachedData(); + * ``` + * @since v10.6.0 + */ + createCachedData(): Buffer; + /** @deprecated in favor of `script.createCachedData()` */ + cachedDataProduced?: boolean | undefined; + cachedDataRejected?: boolean | undefined; + cachedData?: Buffer | undefined; + } + /** + * If given a `contextObject`, the `vm.createContext()` method will `prepare + * that object` so that it can be used in calls to {@link runInContext} or `script.runInContext()`. Inside such scripts, + * the `contextObject` will be the global object, retaining all of its existing + * properties but also having the built-in objects and functions any standard [global object](https://es5.github.io/#x15.1) has. Outside of scripts run by the vm module, global variables + * will remain unchanged. + * + * ```js + * const vm = require('vm'); + * + * global.globalVar = 3; + * + * const context = { globalVar: 1 }; + * vm.createContext(context); + * + * vm.runInContext('globalVar *= 2;', context); + * + * console.log(context); + * // Prints: { globalVar: 2 } + * + * console.log(global.globalVar); + * // Prints: 3 + * ``` + * + * If `contextObject` is omitted (or passed explicitly as `undefined`), a new, + * empty `contextified` object will be returned. + * + * The `vm.createContext()` method is primarily useful for creating a single + * context that can be used to run multiple scripts. 
For instance, if emulating a + * web browser, the method can be used to create a single context representing a + * window's global object, then run all ` + +``` + +## Documentation + +### Collections + +* [`each`](#each) +* [`eachSeries`](#eachSeries) +* [`eachLimit`](#eachLimit) +* [`map`](#map) +* [`mapSeries`](#mapSeries) +* [`mapLimit`](#mapLimit) +* [`filter`](#filter) +* [`filterSeries`](#filterSeries) +* [`reject`](#reject) +* [`rejectSeries`](#rejectSeries) +* [`reduce`](#reduce) +* [`reduceRight`](#reduceRight) +* [`detect`](#detect) +* [`detectSeries`](#detectSeries) +* [`sortBy`](#sortBy) +* [`some`](#some) +* [`every`](#every) +* [`concat`](#concat) +* [`concatSeries`](#concatSeries) + +### Control Flow + +* [`series`](#seriestasks-callback) +* [`parallel`](#parallel) +* [`parallelLimit`](#parallellimittasks-limit-callback) +* [`whilst`](#whilst) +* [`doWhilst`](#doWhilst) +* [`until`](#until) +* [`doUntil`](#doUntil) +* [`forever`](#forever) +* [`waterfall`](#waterfall) +* [`compose`](#compose) +* [`seq`](#seq) +* [`applyEach`](#applyEach) +* [`applyEachSeries`](#applyEachSeries) +* [`queue`](#queue) +* [`priorityQueue`](#priorityQueue) +* [`cargo`](#cargo) +* [`auto`](#auto) +* [`retry`](#retry) +* [`iterator`](#iterator) +* [`apply`](#apply) +* [`nextTick`](#nextTick) +* [`times`](#times) +* [`timesSeries`](#timesSeries) + +### Utils + +* [`memoize`](#memoize) +* [`unmemoize`](#unmemoize) +* [`log`](#log) +* [`dir`](#dir) +* [`noConflict`](#noConflict) + + +## Collections + + + +### each(arr, iterator, callback) + +Applies the function `iterator` to each item in `arr`, in parallel. +The `iterator` is called with an item from the list, and a callback for when it +has finished. If the `iterator` passes an error to its `callback`, the main +`callback` (for the `each` function) is immediately called with the error. + +Note, that since this function applies `iterator` to each item in parallel, +there is no guarantee that the iterator functions will complete in order. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A function to apply to each item in `arr`. + The iterator is passed a `callback(err)` which must be called once it has + completed. If no error has occurred, the `callback` should be run without + arguments or with an explicit `null` argument. +* `callback(err)` - A callback which is called when all `iterator` functions + have finished, or an error occurs. + +__Examples__ + + +```js +// assuming openFiles is an array of file names and saveFile is a function +// to save the modified contents of that file: + +async.each(openFiles, saveFile, function(err){ + // if any of the saves produced an error, err would equal that error +}); +``` + +```js +// assuming openFiles is an array of file names + +async.each(openFiles, function(file, callback) { + + // Perform operation on file here. + console.log('Processing file ' + file); + + if( file.length > 32 ) { + console.log('This file name is too long'); + callback('File name too long'); + } else { + // Do work to process file here + console.log('File processed'); + callback(); + } +}, function(err){ + // if any of the file processing produced an error, err would equal that error + if( err ) { + // One of the iterations produced an error. + // All processing will now stop. 
+ console.log('A file failed to process'); + } else { + console.log('All files have been processed successfully'); + } +}); +``` + +--------------------------------------- + + + +### eachSeries(arr, iterator, callback) + +The same as [`each`](#each), only `iterator` is applied to each item in `arr` in +series. The next `iterator` is only called once the current one has completed. +This means the `iterator` functions will complete in order. + + +--------------------------------------- + + + +### eachLimit(arr, limit, iterator, callback) + +The same as [`each`](#each), only no more than `limit` `iterator`s will be simultaneously +running at any time. + +Note that the items in `arr` are not processed in batches, so there is no guarantee that +the first `limit` `iterator` functions will complete before any others are started. + +__Arguments__ + +* `arr` - An array to iterate over. +* `limit` - The maximum number of `iterator`s to run at any time. +* `iterator(item, callback)` - A function to apply to each item in `arr`. + The iterator is passed a `callback(err)` which must be called once it has + completed. If no error has occurred, the callback should be run without + arguments or with an explicit `null` argument. +* `callback(err)` - A callback which is called when all `iterator` functions + have finished, or an error occurs. + +__Example__ + +```js +// Assume documents is an array of JSON objects and requestApi is a +// function that interacts with a rate-limited REST api. + +async.eachLimit(documents, 20, requestApi, function(err){ + // if any of the saves produced an error, err would equal that error +}); +``` + +--------------------------------------- + + +### map(arr, iterator, callback) + +Produces a new array of values by mapping each value in `arr` through +the `iterator` function. The `iterator` is called with an item from `arr` and a +callback for when it has finished processing. Each of these callback takes 2 arguments: +an `error`, and the transformed item from `arr`. If `iterator` passes an error to his +callback, the main `callback` (for the `map` function) is immediately called with the error. + +Note, that since this function applies the `iterator` to each item in parallel, +there is no guarantee that the `iterator` functions will complete in order. +However, the results array will be in the same order as the original `arr`. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A function to apply to each item in `arr`. + The iterator is passed a `callback(err, transformed)` which must be called once + it has completed with an error (which can be `null`) and a transformed item. +* `callback(err, results)` - A callback which is called when all `iterator` + functions have finished, or an error occurs. Results is an array of the + transformed items from the `arr`. + +__Example__ + +```js +async.map(['file1','file2','file3'], fs.stat, function(err, results){ + // results is now an array of stats for each file +}); +``` + +--------------------------------------- + + +### mapSeries(arr, iterator, callback) + +The same as [`map`](#map), only the `iterator` is applied to each item in `arr` in +series. The next `iterator` is only called once the current one has completed. +The results array will be in the same order as the original. + + +--------------------------------------- + + +### mapLimit(arr, limit, iterator, callback) + +The same as [`map`](#map), only no more than `limit` `iterator`s will be simultaneously +running at any time. 
+ +Note that the items are not processed in batches, so there is no guarantee that +the first `limit` `iterator` functions will complete before any others are started. + +__Arguments__ + +* `arr` - An array to iterate over. +* `limit` - The maximum number of `iterator`s to run at any time. +* `iterator(item, callback)` - A function to apply to each item in `arr`. + The iterator is passed a `callback(err, transformed)` which must be called once + it has completed with an error (which can be `null`) and a transformed item. +* `callback(err, results)` - A callback which is called when all `iterator` + calls have finished, or an error occurs. The result is an array of the + transformed items from the original `arr`. + +__Example__ + +```js +async.mapLimit(['file1','file2','file3'], 1, fs.stat, function(err, results){ + // results is now an array of stats for each file +}); +``` + +--------------------------------------- + + + +### filter(arr, iterator, callback) + +__Alias:__ `select` + +Returns a new array of all the values in `arr` which pass an async truth test. +_The callback for each `iterator` call only accepts a single argument of `true` or +`false`; it does not accept an error argument first!_ This is in-line with the +way node libraries work with truth tests like `fs.exists`. This operation is +performed in parallel, but the results array will be in the same order as the +original. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A truth test to apply to each item in `arr`. + The `iterator` is passed a `callback(truthValue)`, which must be called with a + boolean argument once it has completed. +* `callback(results)` - A callback which is called after all the `iterator` + functions have finished. + +__Example__ + +```js +async.filter(['file1','file2','file3'], fs.exists, function(results){ + // results now equals an array of the existing files +}); +``` + +--------------------------------------- + + + +### filterSeries(arr, iterator, callback) + +__Alias:__ `selectSeries` + +The same as [`filter`](#filter) only the `iterator` is applied to each item in `arr` in +series. The next `iterator` is only called once the current one has completed. +The results array will be in the same order as the original. + +--------------------------------------- + + +### reject(arr, iterator, callback) + +The opposite of [`filter`](#filter). Removes values that pass an `async` truth test. + +--------------------------------------- + + +### rejectSeries(arr, iterator, callback) + +The same as [`reject`](#reject), only the `iterator` is applied to each item in `arr` +in series. + + +--------------------------------------- + + +### reduce(arr, memo, iterator, callback) + +__Aliases:__ `inject`, `foldl` + +Reduces `arr` into a single value using an async `iterator` to return +each successive step. `memo` is the initial state of the reduction. +This function only operates in series. + +For performance reasons, it may make sense to split a call to this function into +a parallel map, and then use the normal `Array.prototype.reduce` on the results. +This function is for situations where each step in the reduction needs to be async; +if you can get the data before reducing it, then it's probably a good idea to do so. + +__Arguments__ + +* `arr` - An array to iterate over. +* `memo` - The initial state of the reduction. +* `iterator(memo, item, callback)` - A function applied to each item in the + array to produce the next step in the reduction. 
The `iterator` is passed a + `callback(err, reduction)` which accepts an optional error as its first + argument, and the state of the reduction as the second. If an error is + passed to the callback, the reduction is stopped and the main `callback` is + immediately called with the error. +* `callback(err, result)` - A callback which is called after all the `iterator` + functions have finished. Result is the reduced value. + +__Example__ + +```js +async.reduce([1,2,3], 0, function(memo, item, callback){ + // pointless async: + process.nextTick(function(){ + callback(null, memo + item) + }); +}, function(err, result){ + // result is now equal to the last value of memo, which is 6 +}); +``` + +--------------------------------------- + + +### reduceRight(arr, memo, iterator, callback) + +__Alias:__ `foldr` + +Same as [`reduce`](#reduce), only operates on `arr` in reverse order. + + +--------------------------------------- + + +### detect(arr, iterator, callback) + +Returns the first value in `arr` that passes an async truth test. The +`iterator` is applied in parallel, meaning the first iterator to return `true` will +fire the detect `callback` with that result. That means the result might not be +the first item in the original `arr` (in terms of order) that passes the test. + +If order within the original `arr` is important, then look at [`detectSeries`](#detectSeries). + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A truth test to apply to each item in `arr`. + The iterator is passed a `callback(truthValue)` which must be called with a + boolean argument once it has completed. +* `callback(result)` - A callback which is called as soon as any iterator returns + `true`, or after all the `iterator` functions have finished. Result will be + the first item in the array that passes the truth test (iterator) or the + value `undefined` if none passed. + +__Example__ + +```js +async.detect(['file1','file2','file3'], fs.exists, function(result){ + // result now equals the first file in the list that exists +}); +``` + +--------------------------------------- + + +### detectSeries(arr, iterator, callback) + +The same as [`detect`](#detect), only the `iterator` is applied to each item in `arr` +in series. This means the result is always the first in the original `arr` (in +terms of array order) that passes the truth test. + + +--------------------------------------- + + +### sortBy(arr, iterator, callback) + +Sorts a list by the results of running each `arr` value through an async `iterator`. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A function to apply to each item in `arr`. + The iterator is passed a `callback(err, sortValue)` which must be called once it + has completed with an error (which can be `null`) and a value to use as the sort + criteria. +* `callback(err, results)` - A callback which is called after all the `iterator` + functions have finished, or an error occurs. Results is the items from + the original `arr` sorted by the values returned by the `iterator` calls. 
+ +__Example__ + +```js +async.sortBy(['file1','file2','file3'], function(file, callback){ + fs.stat(file, function(err, stats){ + callback(err, stats.mtime); + }); +}, function(err, results){ + // results is now the original array of files sorted by + // modified date +}); +``` + +__Sort Order__ + +By modifying the callback parameter the sorting order can be influenced: + +```js +//ascending order +async.sortBy([1,9,3,5], function(x, callback){ + callback(null, x); +}, function(err,result){ + //result callback +} ); + +//descending order +async.sortBy([1,9,3,5], function(x, callback){ + callback(null, x*-1); //<- x*-1 instead of x, turns the order around +}, function(err,result){ + //result callback +} ); +``` + +--------------------------------------- + + +### some(arr, iterator, callback) + +__Alias:__ `any` + +Returns `true` if at least one element in the `arr` satisfies an async test. +_The callback for each iterator call only accepts a single argument of `true` or +`false`; it does not accept an error argument first!_ This is in-line with the +way node libraries work with truth tests like `fs.exists`. Once any iterator +call returns `true`, the main `callback` is immediately called. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A truth test to apply to each item in the array + in parallel. The iterator is passed a callback(truthValue) which must be + called with a boolean argument once it has completed. +* `callback(result)` - A callback which is called as soon as any iterator returns + `true`, or after all the iterator functions have finished. Result will be + either `true` or `false` depending on the values of the async tests. + +__Example__ + +```js +async.some(['file1','file2','file3'], fs.exists, function(result){ + // if result is true then at least one of the files exists +}); +``` + +--------------------------------------- + + +### every(arr, iterator, callback) + +__Alias:__ `all` + +Returns `true` if every element in `arr` satisfies an async test. +_The callback for each `iterator` call only accepts a single argument of `true` or +`false`; it does not accept an error argument first!_ This is in-line with the +way node libraries work with truth tests like `fs.exists`. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A truth test to apply to each item in the array + in parallel. The iterator is passed a callback(truthValue) which must be + called with a boolean argument once it has completed. +* `callback(result)` - A callback which is called after all the `iterator` + functions have finished. Result will be either `true` or `false` depending on + the values of the async tests. + +__Example__ + +```js +async.every(['file1','file2','file3'], fs.exists, function(result){ + // if result is true then every file exists +}); +``` + +--------------------------------------- + + +### concat(arr, iterator, callback) + +Applies `iterator` to each item in `arr`, concatenating the results. Returns the +concatenated list. The `iterator`s are called in parallel, and the results are +concatenated as they return. There is no guarantee that the results array will +be returned in the original order of `arr` passed to the `iterator` function. + +__Arguments__ + +* `arr` - An array to iterate over. +* `iterator(item, callback)` - A function to apply to each item in `arr`. 
+ The iterator is passed a `callback(err, results)` which must be called once it + has completed with an error (which can be `null`) and an array of results. +* `callback(err, results)` - A callback which is called after all the `iterator` + functions have finished, or an error occurs. Results is an array containing + the concatenated results of the `iterator` function. + +__Example__ + +```js +async.concat(['dir1','dir2','dir3'], fs.readdir, function(err, files){ + // files is now a list of filenames that exist in the 3 directories +}); +``` + +--------------------------------------- + + +### concatSeries(arr, iterator, callback) + +Same as [`concat`](#concat), but executes in series instead of parallel. + + +## Control Flow + + +### series(tasks, [callback]) + +Run the functions in the `tasks` array in series, each one running once the previous +function has completed. If any functions in the series pass an error to its +callback, no more functions are run, and `callback` is immediately called with the value of the error. +Otherwise, `callback` receives an array of results when `tasks` have completed. + +It is also possible to use an object instead of an array. Each property will be +run as a function, and the results will be passed to the final `callback` as an object +instead of an array. This can be a more readable way of handling results from +[`series`](#series). + +**Note** that while many implementations preserve the order of object properties, the +[ECMAScript Language Specifcation](http://www.ecma-international.org/ecma-262/5.1/#sec-8.6) +explicitly states that + +> The mechanics and order of enumerating the properties is not specified. + +So if you rely on the order in which your series of functions are executed, and want +this to work on all platforms, consider using an array. + +__Arguments__ + +* `tasks` - An array or object containing functions to run, each function is passed + a `callback(err, result)` it must call on completion with an error `err` (which can + be `null`) and an optional `result` value. +* `callback(err, results)` - An optional callback to run once all the functions + have completed. This function gets a results array (or object) containing all + the result arguments passed to the `task` callbacks. + +__Example__ + +```js +async.series([ + function(callback){ + // do some stuff ... + callback(null, 'one'); + }, + function(callback){ + // do some more stuff ... + callback(null, 'two'); + } +], +// optional callback +function(err, results){ + // results is now equal to ['one', 'two'] +}); + + +// an example using an object instead of an array +async.series({ + one: function(callback){ + setTimeout(function(){ + callback(null, 1); + }, 200); + }, + two: function(callback){ + setTimeout(function(){ + callback(null, 2); + }, 100); + } +}, +function(err, results) { + // results is now equal to: {one: 1, two: 2} +}); +``` + +--------------------------------------- + + +### parallel(tasks, [callback]) + +Run the `tasks` array of functions in parallel, without waiting until the previous +function has completed. If any of the functions pass an error to its +callback, the main `callback` is immediately called with the value of the error. +Once the `tasks` have completed, the results are passed to the final `callback` as an +array. + +It is also possible to use an object instead of an array. Each property will be +run as a function and the results will be passed to the final `callback` as an object +instead of an array. 
This can be a more readable way of handling results from +[`parallel`](#parallel). + + +__Arguments__ + +* `tasks` - An array or object containing functions to run. Each function is passed + a `callback(err, result)` which it must call on completion with an error `err` + (which can be `null`) and an optional `result` value. +* `callback(err, results)` - An optional callback to run once all the functions + have completed. This function gets a results array (or object) containing all + the result arguments passed to the task callbacks. + +__Example__ + +```js +async.parallel([ + function(callback){ + setTimeout(function(){ + callback(null, 'one'); + }, 200); + }, + function(callback){ + setTimeout(function(){ + callback(null, 'two'); + }, 100); + } +], +// optional callback +function(err, results){ + // the results array will equal ['one','two'] even though + // the second function had a shorter timeout. +}); + + +// an example using an object instead of an array +async.parallel({ + one: function(callback){ + setTimeout(function(){ + callback(null, 1); + }, 200); + }, + two: function(callback){ + setTimeout(function(){ + callback(null, 2); + }, 100); + } +}, +function(err, results) { + // results is now equals to: {one: 1, two: 2} +}); +``` + +--------------------------------------- + + +### parallelLimit(tasks, limit, [callback]) + +The same as [`parallel`](#parallel), only `tasks` are executed in parallel +with a maximum of `limit` tasks executing at any time. + +Note that the `tasks` are not executed in batches, so there is no guarantee that +the first `limit` tasks will complete before any others are started. + +__Arguments__ + +* `tasks` - An array or object containing functions to run, each function is passed + a `callback(err, result)` it must call on completion with an error `err` (which can + be `null`) and an optional `result` value. +* `limit` - The maximum number of `tasks` to run at any time. +* `callback(err, results)` - An optional callback to run once all the functions + have completed. This function gets a results array (or object) containing all + the result arguments passed to the `task` callbacks. + +--------------------------------------- + + +### whilst(test, fn, callback) + +Repeatedly call `fn`, while `test` returns `true`. Calls `callback` when stopped, +or an error occurs. + +__Arguments__ + +* `test()` - synchronous truth test to perform before each execution of `fn`. +* `fn(callback)` - A function which is called each time `test` passes. The function is + passed a `callback(err)`, which must be called once it has completed with an + optional `err` argument. +* `callback(err)` - A callback which is called after the test fails and repeated + execution of `fn` has stopped. + +__Example__ + +```js +var count = 0; + +async.whilst( + function () { return count < 5; }, + function (callback) { + count++; + setTimeout(callback, 1000); + }, + function (err) { + // 5 seconds have passed + } +); +``` + +--------------------------------------- + + +### doWhilst(fn, test, callback) + +The post-check version of [`whilst`](#whilst). To reflect the difference in +the order of operations, the arguments `test` and `fn` are switched. + +`doWhilst` is to `whilst` as `do while` is to `while` in plain JavaScript. + +--------------------------------------- + + +### until(test, fn, callback) + +Repeatedly call `fn` until `test` returns `true`. Calls `callback` when stopped, +or an error occurs. + +The inverse of [`whilst`](#whilst). 
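+
+__Example__
+
+As an illustrative sketch, the [`whilst`](#whilst) example above can be adapted by
+inverting the test (this assumes `until` pre-checks the test, just like `whilst`):
+
+```js
+var count = 0;
+
+async.until(
+    function () { return count >= 5; },
+    function (callback) {
+        count++;
+        setTimeout(callback, 1000);
+    },
+    function (err) {
+        // 5 seconds have passed and count is now 5
+    }
+);
+```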
+ +--------------------------------------- + + +### doUntil(fn, test, callback) + +Like [`doWhilst`](#doWhilst), except the `test` is inverted. Note the argument ordering differs from `until`. + +--------------------------------------- + + +### forever(fn, errback) + +Calls the asynchronous function `fn` with a callback parameter that allows it to +call itself again, in series, indefinitely. + +If an error is passed to the callback then `errback` is called with the +error, and execution stops, otherwise it will never be called. + +```js +async.forever( + function(next) { + // next is suitable for passing to things that need a callback(err [, whatever]); + // it will result in this function being called again. + }, + function(err) { + // if next is called with a value in its first parameter, it will appear + // in here as 'err', and execution will stop. + } +); +``` + +--------------------------------------- + + +### waterfall(tasks, [callback]) + +Runs the `tasks` array of functions in series, each passing their results to the next in +the array. However, if any of the `tasks` pass an error to their own callback, the +next function is not executed, and the main `callback` is immediately called with +the error. + +__Arguments__ + +* `tasks` - An array of functions to run, each function is passed a + `callback(err, result1, result2, ...)` it must call on completion. The first + argument is an error (which can be `null`) and any further arguments will be + passed as arguments in order to the next task. +* `callback(err, [results])` - An optional callback to run once all the functions + have completed. This will be passed the results of the last task's callback. + + + +__Example__ + +```js +async.waterfall([ + function(callback) { + callback(null, 'one', 'two'); + }, + function(arg1, arg2, callback) { + // arg1 now equals 'one' and arg2 now equals 'two' + callback(null, 'three'); + }, + function(arg1, callback) { + // arg1 now equals 'three' + callback(null, 'done'); + } +], function (err, result) { + // result now equals 'done' +}); +``` + +--------------------------------------- + +### compose(fn1, fn2...) + +Creates a function which is a composition of the passed asynchronous +functions. Each function consumes the return value of the function that +follows. Composing functions `f()`, `g()`, and `h()` would produce the result of +`f(g(h()))`, only this version uses callbacks to obtain the return values. + +Each function is executed with the `this` binding of the composed function. + +__Arguments__ + +* `functions...` - the asynchronous functions to compose + + +__Example__ + +```js +function add1(n, callback) { + setTimeout(function () { + callback(null, n + 1); + }, 10); +} + +function mul3(n, callback) { + setTimeout(function () { + callback(null, n * 3); + }, 10); +} + +var add1mul3 = async.compose(mul3, add1); + +add1mul3(4, function (err, result) { + // result now equals 15 +}); +``` + +--------------------------------------- + +### seq(fn1, fn2...) + +Version of the compose function that is more natural to read. +Each function consumes the return value of the previous function. +It is the equivalent of [`compose`](#compose) with the arguments reversed. + +Each function is executed with the `this` binding of the composed function. + +__Arguments__ + +* functions... - the asynchronous functions to compose + + +__Example__ + +```js +// Requires lodash (or underscore), express3 and dresende's orm2. +// Part of an app, that fetches cats of the logged user. 
+// This example uses `seq` function to avoid overnesting and error +// handling clutter. +app.get('/cats', function(request, response) { + var User = request.models.User; + async.seq( + _.bind(User.get, User), // 'User.get' has signature (id, callback(err, data)) + function(user, fn) { + user.getCats(fn); // 'getCats' has signature (callback(err, data)) + } + )(req.session.user_id, function (err, cats) { + if (err) { + console.error(err); + response.json({ status: 'error', message: err.message }); + } else { + response.json({ status: 'ok', message: 'Cats found', data: cats }); + } + }); +}); +``` + +--------------------------------------- + +### applyEach(fns, args..., callback) + +Applies the provided arguments to each function in the array, calling +`callback` after all functions have completed. If you only provide the first +argument, then it will return a function which lets you pass in the +arguments as if it were a single function call. + +__Arguments__ + +* `fns` - the asynchronous functions to all call with the same arguments +* `args...` - any number of separate arguments to pass to the function +* `callback` - the final argument should be the callback, called when all + functions have completed processing + + +__Example__ + +```js +async.applyEach([enableSearch, updateSchema], 'bucket', callback); + +// partial application example: +async.each( + buckets, + async.applyEach([enableSearch, updateSchema]), + callback +); +``` + +--------------------------------------- + + +### applyEachSeries(arr, iterator, callback) + +The same as [`applyEach`](#applyEach) only the functions are applied in series. + +--------------------------------------- + + +### queue(worker, concurrency) + +Creates a `queue` object with the specified `concurrency`. Tasks added to the +`queue` are processed in parallel (up to the `concurrency` limit). If all +`worker`s are in progress, the task is queued until one becomes available. +Once a `worker` completes a `task`, that `task`'s callback is called. + +__Arguments__ + +* `worker(task, callback)` - An asynchronous function for processing a queued + task, which must call its `callback(err)` argument when finished, with an + optional `error` as an argument. +* `concurrency` - An `integer` for determining how many `worker` functions should be + run in parallel. + +__Queue objects__ + +The `queue` object returned by this function has the following properties and +methods: + +* `length()` - a function returning the number of items waiting to be processed. +* `started` - a function returning whether or not any items have been pushed and processed by the queue +* `running()` - a function returning the number of items currently being processed. +* `idle()` - a function returning false if there are items waiting or being processed, or true if not. +* `concurrency` - an integer for determining how many `worker` functions should be + run in parallel. This property can be changed after a `queue` is created to + alter the concurrency on-the-fly. +* `push(task, [callback])` - add a new task to the `queue`. Calls `callback` once + the `worker` has finished processing the task. Instead of a single task, a `tasks` array + can be submitted. The respective callback is used for every task in the list. +* `unshift(task, [callback])` - add a new task to the front of the `queue`. +* `saturated` - a callback that is called when the `queue` length hits the `concurrency` limit, + and further tasks will be queued. 
+* `empty` - a callback that is called when the last item from the `queue` is given to a `worker`. +* `drain` - a callback that is called when the last item from the `queue` has returned from the `worker`. +* `paused` - a boolean for determining whether the queue is in a paused state +* `pause()` - a function that pauses the processing of tasks until `resume()` is called. +* `resume()` - a function that resumes the processing of queued tasks when the queue is paused. +* `kill()` - a function that removes the `drain` callback and empties remaining tasks from the queue forcing it to go idle. + +__Example__ + +```js +// create a queue object with concurrency 2 + +var q = async.queue(function (task, callback) { + console.log('hello ' + task.name); + callback(); +}, 2); + + +// assign a callback +q.drain = function() { + console.log('all items have been processed'); +} + +// add some items to the queue + +q.push({name: 'foo'}, function (err) { + console.log('finished processing foo'); +}); +q.push({name: 'bar'}, function (err) { + console.log('finished processing bar'); +}); + +// add some items to the queue (batch-wise) + +q.push([{name: 'baz'},{name: 'bay'},{name: 'bax'}], function (err) { + console.log('finished processing item'); +}); + +// add some items to the front of the queue + +q.unshift({name: 'bar'}, function (err) { + console.log('finished processing bar'); +}); +``` + + +--------------------------------------- + + +### priorityQueue(worker, concurrency) + +The same as [`queue`](#queue) only tasks are assigned a priority and completed in ascending priority order. There are two differences between `queue` and `priorityQueue` objects: + +* `push(task, priority, [callback])` - `priority` should be a number. If an array of + `tasks` is given, all tasks will be assigned the same priority. +* The `unshift` method was removed. + +--------------------------------------- + + +### cargo(worker, [payload]) + +Creates a `cargo` object with the specified payload. Tasks added to the +cargo will be processed altogether (up to the `payload` limit). If the +`worker` is in progress, the task is queued until it becomes available. Once +the `worker` has completed some tasks, each callback of those tasks is called. +Check out [this animation](https://camo.githubusercontent.com/6bbd36f4cf5b35a0f11a96dcd2e97711ffc2fb37/68747470733a2f2f662e636c6f75642e6769746875622e636f6d2f6173736574732f313637363837312f36383130382f62626330636662302d356632392d313165322d393734662d3333393763363464633835382e676966) for how `cargo` and `queue` work. + +While [queue](#queue) passes only one task to one of a group of workers +at a time, cargo passes an array of tasks to a single worker, repeating +when the worker is finished. + +__Arguments__ + +* `worker(tasks, callback)` - An asynchronous function for processing an array of + queued tasks, which must call its `callback(err)` argument when finished, with + an optional `err` argument. +* `payload` - An optional `integer` for determining how many tasks should be + processed per round; if omitted, the default is unlimited. + +__Cargo objects__ + +The `cargo` object returned by this function has the following properties and +methods: + +* `length()` - A function returning the number of items waiting to be processed. +* `payload` - An `integer` for determining how many tasks should be + process per round. This property can be changed after a `cargo` is created to + alter the payload on-the-fly. +* `push(task, [callback])` - Adds `task` to the `queue`. 
The callback is called
+  once the `worker` has finished processing the task. Instead of a single task, an array of `tasks`
+  can be submitted. The respective callback is used for every task in the list.
+* `saturated` - A callback that is called when the `queue.length()` hits the concurrency and further tasks will be queued.
+* `empty` - A callback that is called when the last item from the `queue` is given to a `worker`.
+* `drain` - A callback that is called when the last item from the `queue` has returned from the `worker`.
+
+__Example__
+
+```js
+// create a cargo object with payload 2
+
+var cargo = async.cargo(function (tasks, callback) {
+    for(var i=0; i<tasks.length; i++){
+      console.log('hello ' + tasks[i].name);
+    }
+    callback();
+}, 2);
+
+
+// add some items
+
+cargo.push({name: 'foo'}, function (err) {
+    console.log('finished processing foo');
+});
+cargo.push({name: 'bar'}, function (err) {
+    console.log('finished processing bar');
+});
+```
+
+---------------------------------------
+
+
+### auto(tasks, [callback])
+
+Determines the best order for running the functions in `tasks`, based on their
+requirements. Each function can optionally depend on other functions being completed
+first, and each function is run as soon as its requirements are satisfied.
+
+If any of the functions pass an error to their callback, it will not
+complete (so any other functions depending on it will not run), and the main
+`callback` is immediately called with the error. Functions also receive an
+object containing the results of functions which have completed so far.
+
+Note, all functions are called with a `results` object as a second argument,
+so it is unsafe to pass functions in the `tasks` object which cannot handle the
+extra argument.
+
+For example, this snippet of code:
+
+```js
+async.auto({
+  readData: async.apply(fs.readFile, 'data.txt', 'utf-8')
+}, callback);
+```
+
+will have the effect of calling `readFile` with the results object as the last
+argument, which will fail:
+
+```js
+fs.readFile('data.txt', 'utf-8', cb, {});
+```
+
+Instead, wrap the call to `readFile` in a function which does not forward the
+`results` object:
+
+```js
+async.auto({
+  readData: function(cb, results){
+    fs.readFile('data.txt', 'utf-8', cb);
+  }
+}, callback);
+```
+
+__Arguments__
+
+* `tasks` - An object. Each of its properties is either a function or an array of
+  requirements, with the function itself the last item in the array. The object's key
+  of a property serves as the name of the task defined by that property,
+  i.e. can be used when specifying requirements for other tasks.
+  The function receives two arguments: (1) a `callback(err, result)` which must be
+  called when finished, passing an `error` (which can be `null`) and the result of
+  the function's execution, and (2) a `results` object, containing the results of
+  the previously executed functions.
+* `callback(err, results)` - An optional callback which is called when all the
+  tasks have been completed. It receives the `err` argument if any `tasks`
+  pass an error to their callback. Results are always returned; however, if
+  an error occurs, no further `tasks` will be performed, and the results
+  object will only contain partial results.
+ + +__Example__ + +```js +async.auto({ + get_data: function(callback){ + console.log('in get_data'); + // async code to get some data + callback(null, 'data', 'converted to array'); + }, + make_folder: function(callback){ + console.log('in make_folder'); + // async code to create a directory to store a file in + // this is run at the same time as getting the data + callback(null, 'folder'); + }, + write_file: ['get_data', 'make_folder', function(callback, results){ + console.log('in write_file', JSON.stringify(results)); + // once there is some data and the directory exists, + // write the data to a file in the directory + callback(null, 'filename'); + }], + email_link: ['write_file', function(callback, results){ + console.log('in email_link', JSON.stringify(results)); + // once the file is written let's email a link to it... + // results.write_file contains the filename returned by write_file. + callback(null, {'file':results.write_file, 'email':'user@example.com'}); + }] +}, function(err, results) { + console.log('err = ', err); + console.log('results = ', results); +}); +``` + +This is a fairly trivial example, but to do this using the basic parallel and +series functions would look like this: + +```js +async.parallel([ + function(callback){ + console.log('in get_data'); + // async code to get some data + callback(null, 'data', 'converted to array'); + }, + function(callback){ + console.log('in make_folder'); + // async code to create a directory to store a file in + // this is run at the same time as getting the data + callback(null, 'folder'); + } +], +function(err, results){ + async.series([ + function(callback){ + console.log('in write_file', JSON.stringify(results)); + // once there is some data and the directory exists, + // write the data to a file in the directory + results.push('filename'); + callback(null); + }, + function(callback){ + console.log('in email_link', JSON.stringify(results)); + // once the file is written let's email a link to it... + callback(null, {'file':results.pop(), 'email':'user@example.com'}); + } + ]); +}); +``` + +For a complicated series of `async` tasks, using the [`auto`](#auto) function makes adding +new tasks much easier (and the code more readable). + + +--------------------------------------- + + +### retry([times = 5], task, [callback]) + +Attempts to get a successful response from `task` no more than `times` times before +returning an error. If the task is successful, the `callback` will be passed the result +of the successful task. If all attempts fail, the callback will be passed the error and +result (if any) of the final attempt. + +__Arguments__ + +* `times` - An integer indicating how many times to attempt the `task` before giving up. Defaults to 5. +* `task(callback, results)` - A function which receives two arguments: (1) a `callback(err, result)` + which must be called when finished, passing `err` (which can be `null`) and the `result` of + the function's execution, and (2) a `results` object, containing the results of + the previously executed functions (if nested inside another control flow). +* `callback(err, results)` - An optional callback which is called when the + task has succeeded, or after the final failed attempt. It receives the `err` and `result` arguments of the last attempt at completing the `task`. 
+ +The [`retry`](#retry) function can be used as a stand-alone control flow by passing a +callback, as shown below: + +```js +async.retry(3, apiMethod, function(err, result) { + // do something with the result +}); +``` + +It can also be embeded within other control flow functions to retry individual methods +that are not as reliable, like this: + +```js +async.auto({ + users: api.getUsers.bind(api), + payments: async.retry(3, api.getPayments.bind(api)) +}, function(err, results) { + // do something with the results +}); +``` + + +--------------------------------------- + + +### iterator(tasks) + +Creates an iterator function which calls the next function in the `tasks` array, +returning a continuation to call the next one after that. It's also possible to +“peek” at the next iterator with `iterator.next()`. + +This function is used internally by the `async` module, but can be useful when +you want to manually control the flow of functions in series. + +__Arguments__ + +* `tasks` - An array of functions to run. + +__Example__ + +```js +var iterator = async.iterator([ + function(){ sys.p('one'); }, + function(){ sys.p('two'); }, + function(){ sys.p('three'); } +]); + +node> var iterator2 = iterator(); +'one' +node> var iterator3 = iterator2(); +'two' +node> iterator3(); +'three' +node> var nextfn = iterator2.next(); +node> nextfn(); +'three' +``` + +--------------------------------------- + + +### apply(function, arguments..) + +Creates a continuation function with some arguments already applied. + +Useful as a shorthand when combined with other control flow functions. Any arguments +passed to the returned function are added to the arguments originally passed +to apply. + +__Arguments__ + +* `function` - The function you want to eventually apply all arguments to. +* `arguments...` - Any number of arguments to automatically apply when the + continuation is called. + +__Example__ + +```js +// using apply + +async.parallel([ + async.apply(fs.writeFile, 'testfile1', 'test1'), + async.apply(fs.writeFile, 'testfile2', 'test2'), +]); + + +// the same process without using apply + +async.parallel([ + function(callback){ + fs.writeFile('testfile1', 'test1', callback); + }, + function(callback){ + fs.writeFile('testfile2', 'test2', callback); + } +]); +``` + +It's possible to pass any number of additional arguments when calling the +continuation: + +```js +node> var fn = async.apply(sys.puts, 'one'); +node> fn('two', 'three'); +one +two +three +``` + +--------------------------------------- + + +### nextTick(callback), setImmediate(callback) + +Calls `callback` on a later loop around the event loop. In Node.js this just +calls `process.nextTick`; in the browser it falls back to `setImmediate(callback)` +if available, otherwise `setTimeout(callback, 0)`, which means other higher priority +events may precede the execution of `callback`. + +This is used internally for browser-compatibility purposes. + +__Arguments__ + +* `callback` - The function to call on a later loop around the event loop. + +__Example__ + +```js +var call_order = []; +async.nextTick(function(){ + call_order.push('two'); + // call_order now equals ['one','two'] +}); +call_order.push('one') +``` + + +### times(n, callback) + +Calls the `callback` function `n` times, and accumulates results in the same manner +you would use with [`map`](#map). + +__Arguments__ + +* `n` - The number of times to run the function. +* `callback` - The function to call `n` times. 
+ +__Example__ + +```js +// Pretend this is some complicated async factory +var createUser = function(id, callback) { + callback(null, { + id: 'user' + id + }) +} +// generate 5 users +async.times(5, function(n, next){ + createUser(n, function(err, user) { + next(err, user) + }) +}, function(err, users) { + // we should now have 5 users +}); +``` + + +### timesSeries(n, callback) + +The same as [`times`](#times), only the iterator is applied to each item in `arr` in +series. The next `iterator` is only called once the current one has completed. +The results array will be in the same order as the original. + + +## Utils + + +### memoize(fn, [hasher]) + +Caches the results of an `async` function. When creating a hash to store function +results against, the callback is omitted from the hash and an optional hash +function can be used. + +The cache of results is exposed as the `memo` property of the function returned +by `memoize`. + +__Arguments__ + +* `fn` - The function to proxy and cache results from. +* `hasher` - Tn optional function for generating a custom hash for storing + results. It has all the arguments applied to it apart from the callback, and + must be synchronous. + +__Example__ + +```js +var slow_fn = function (name, callback) { + // do something + callback(null, result); +}; +var fn = async.memoize(slow_fn); + +// fn can now be used as if it were slow_fn +fn('some name', function () { + // callback +}); +``` + + +### unmemoize(fn) + +Undoes a [`memoize`](#memoize)d function, reverting it to the original, unmemoized +form. Handy for testing. + +__Arguments__ + +* `fn` - the memoized function + + +### log(function, arguments) + +Logs the result of an `async` function to the `console`. Only works in Node.js or +in browsers that support `console.log` and `console.error` (such as FF and Chrome). +If multiple arguments are returned from the async function, `console.log` is +called on each argument in order. + +__Arguments__ + +* `function` - The function you want to eventually apply all arguments to. +* `arguments...` - Any number of arguments to apply to the function. + +__Example__ + +```js +var hello = function(name, callback){ + setTimeout(function(){ + callback(null, 'hello ' + name); + }, 1000); +}; +``` +```js +node> async.log(hello, 'world'); +'hello world' +``` + +--------------------------------------- + + +### dir(function, arguments) + +Logs the result of an `async` function to the `console` using `console.dir` to +display the properties of the resulting object. Only works in Node.js or +in browsers that support `console.dir` and `console.error` (such as FF and Chrome). +If multiple arguments are returned from the async function, `console.dir` is +called on each argument in order. + +__Arguments__ + +* `function` - The function you want to eventually apply all arguments to. +* `arguments...` - Any number of arguments to apply to the function. + +__Example__ + +```js +var hello = function(name, callback){ + setTimeout(function(){ + callback(null, {hello: name}); + }, 1000); +}; +``` +```js +node> async.dir(hello, 'world'); +{hello: 'world'} +``` + +--------------------------------------- + + +### noConflict() + +Changes the value of `async` back to its original value, returning a reference to the +`async` object. 
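As a rough illustration of `noConflict` (not part of the upstream README; `doSomethingAsync` is a hypothetical iterator), the usual pattern in a browser where another global `async` already exists:

```js
// Some other script defined `window.async` before this library was loaded
// via a <script> tag. Reclaim the old global and keep a local reference.
var asyncLib = async.noConflict();

// Use this library through the local reference from here on.
asyncLib.each(['a', 'b', 'c'], doSomethingAsync, function (err) {
    // all items processed (or an error occurred)
});
```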
diff --git a/node_modules/async/bower.json b/node_modules/async/bower.json new file mode 100644 index 000000000..18176881e --- /dev/null +++ b/node_modules/async/bower.json @@ -0,0 +1,38 @@ +{ + "name": "async", + "description": "Higher-order functions and common patterns for asynchronous code", + "version": "0.9.2", + "main": "lib/async.js", + "keywords": [ + "async", + "callback", + "utility", + "module" + ], + "license": "MIT", + "repository": { + "type": "git", + "url": "https://github.com/caolan/async.git" + }, + "devDependencies": { + "nodeunit": ">0.0.0", + "uglify-js": "1.2.x", + "nodelint": ">0.0.0", + "lodash": ">=2.4.1" + }, + "moduleType": [ + "amd", + "globals", + "node" + ], + "ignore": [ + "**/.*", + "node_modules", + "bower_components", + "test", + "tests" + ], + "authors": [ + "Caolan McMahon" + ] +} \ No newline at end of file diff --git a/node_modules/async/component.json b/node_modules/async/component.json new file mode 100644 index 000000000..5003a7c52 --- /dev/null +++ b/node_modules/async/component.json @@ -0,0 +1,16 @@ +{ + "name": "async", + "description": "Higher-order functions and common patterns for asynchronous code", + "version": "0.9.2", + "keywords": [ + "async", + "callback", + "utility", + "module" + ], + "license": "MIT", + "repository": "caolan/async", + "scripts": [ + "lib/async.js" + ] +} \ No newline at end of file diff --git a/node_modules/async/lib/async.js b/node_modules/async/lib/async.js new file mode 100644 index 000000000..394c41cad --- /dev/null +++ b/node_modules/async/lib/async.js @@ -0,0 +1,1123 @@ +/*! + * async + * https://github.com/caolan/async + * + * Copyright 2010-2014 Caolan McMahon + * Released under the MIT license + */ +/*jshint onevar: false, indent:4 */ +/*global setImmediate: false, setTimeout: false, console: false */ +(function () { + + var async = {}; + + // global on the server, window in the browser + var root, previous_async; + + root = this; + if (root != null) { + previous_async = root.async; + } + + async.noConflict = function () { + root.async = previous_async; + return async; + }; + + function only_once(fn) { + var called = false; + return function() { + if (called) throw new Error("Callback was already called."); + called = true; + fn.apply(root, arguments); + } + } + + //// cross-browser compatiblity functions //// + + var _toString = Object.prototype.toString; + + var _isArray = Array.isArray || function (obj) { + return _toString.call(obj) === '[object Array]'; + }; + + var _each = function (arr, iterator) { + for (var i = 0; i < arr.length; i += 1) { + iterator(arr[i], i, arr); + } + }; + + var _map = function (arr, iterator) { + if (arr.map) { + return arr.map(iterator); + } + var results = []; + _each(arr, function (x, i, a) { + results.push(iterator(x, i, a)); + }); + return results; + }; + + var _reduce = function (arr, iterator, memo) { + if (arr.reduce) { + return arr.reduce(iterator, memo); + } + _each(arr, function (x, i, a) { + memo = iterator(memo, x, i, a); + }); + return memo; + }; + + var _keys = function (obj) { + if (Object.keys) { + return Object.keys(obj); + } + var keys = []; + for (var k in obj) { + if (obj.hasOwnProperty(k)) { + keys.push(k); + } + } + return keys; + }; + + //// exported async module functions //// + + //// nextTick implementation with browser-compatible fallback //// + if (typeof process === 'undefined' || !(process.nextTick)) { + if (typeof setImmediate === 'function') { + async.nextTick = function (fn) { + // not a direct alias for IE10 compatibility + setImmediate(fn); 
+ }; + async.setImmediate = async.nextTick; + } + else { + async.nextTick = function (fn) { + setTimeout(fn, 0); + }; + async.setImmediate = async.nextTick; + } + } + else { + async.nextTick = process.nextTick; + if (typeof setImmediate !== 'undefined') { + async.setImmediate = function (fn) { + // not a direct alias for IE10 compatibility + setImmediate(fn); + }; + } + else { + async.setImmediate = async.nextTick; + } + } + + async.each = function (arr, iterator, callback) { + callback = callback || function () {}; + if (!arr.length) { + return callback(); + } + var completed = 0; + _each(arr, function (x) { + iterator(x, only_once(done) ); + }); + function done(err) { + if (err) { + callback(err); + callback = function () {}; + } + else { + completed += 1; + if (completed >= arr.length) { + callback(); + } + } + } + }; + async.forEach = async.each; + + async.eachSeries = function (arr, iterator, callback) { + callback = callback || function () {}; + if (!arr.length) { + return callback(); + } + var completed = 0; + var iterate = function () { + iterator(arr[completed], function (err) { + if (err) { + callback(err); + callback = function () {}; + } + else { + completed += 1; + if (completed >= arr.length) { + callback(); + } + else { + iterate(); + } + } + }); + }; + iterate(); + }; + async.forEachSeries = async.eachSeries; + + async.eachLimit = function (arr, limit, iterator, callback) { + var fn = _eachLimit(limit); + fn.apply(null, [arr, iterator, callback]); + }; + async.forEachLimit = async.eachLimit; + + var _eachLimit = function (limit) { + + return function (arr, iterator, callback) { + callback = callback || function () {}; + if (!arr.length || limit <= 0) { + return callback(); + } + var completed = 0; + var started = 0; + var running = 0; + + (function replenish () { + if (completed >= arr.length) { + return callback(); + } + + while (running < limit && started < arr.length) { + started += 1; + running += 1; + iterator(arr[started - 1], function (err) { + if (err) { + callback(err); + callback = function () {}; + } + else { + completed += 1; + running -= 1; + if (completed >= arr.length) { + callback(); + } + else { + replenish(); + } + } + }); + } + })(); + }; + }; + + + var doParallel = function (fn) { + return function () { + var args = Array.prototype.slice.call(arguments); + return fn.apply(null, [async.each].concat(args)); + }; + }; + var doParallelLimit = function(limit, fn) { + return function () { + var args = Array.prototype.slice.call(arguments); + return fn.apply(null, [_eachLimit(limit)].concat(args)); + }; + }; + var doSeries = function (fn) { + return function () { + var args = Array.prototype.slice.call(arguments); + return fn.apply(null, [async.eachSeries].concat(args)); + }; + }; + + + var _asyncMap = function (eachfn, arr, iterator, callback) { + arr = _map(arr, function (x, i) { + return {index: i, value: x}; + }); + if (!callback) { + eachfn(arr, function (x, callback) { + iterator(x.value, function (err) { + callback(err); + }); + }); + } else { + var results = []; + eachfn(arr, function (x, callback) { + iterator(x.value, function (err, v) { + results[x.index] = v; + callback(err); + }); + }, function (err) { + callback(err, results); + }); + } + }; + async.map = doParallel(_asyncMap); + async.mapSeries = doSeries(_asyncMap); + async.mapLimit = function (arr, limit, iterator, callback) { + return _mapLimit(limit)(arr, iterator, callback); + }; + + var _mapLimit = function(limit) { + return doParallelLimit(limit, _asyncMap); + }; + + // reduce only has a 
series version, as doing reduce in parallel won't + // work in many situations. + async.reduce = function (arr, memo, iterator, callback) { + async.eachSeries(arr, function (x, callback) { + iterator(memo, x, function (err, v) { + memo = v; + callback(err); + }); + }, function (err) { + callback(err, memo); + }); + }; + // inject alias + async.inject = async.reduce; + // foldl alias + async.foldl = async.reduce; + + async.reduceRight = function (arr, memo, iterator, callback) { + var reversed = _map(arr, function (x) { + return x; + }).reverse(); + async.reduce(reversed, memo, iterator, callback); + }; + // foldr alias + async.foldr = async.reduceRight; + + var _filter = function (eachfn, arr, iterator, callback) { + var results = []; + arr = _map(arr, function (x, i) { + return {index: i, value: x}; + }); + eachfn(arr, function (x, callback) { + iterator(x.value, function (v) { + if (v) { + results.push(x); + } + callback(); + }); + }, function (err) { + callback(_map(results.sort(function (a, b) { + return a.index - b.index; + }), function (x) { + return x.value; + })); + }); + }; + async.filter = doParallel(_filter); + async.filterSeries = doSeries(_filter); + // select alias + async.select = async.filter; + async.selectSeries = async.filterSeries; + + var _reject = function (eachfn, arr, iterator, callback) { + var results = []; + arr = _map(arr, function (x, i) { + return {index: i, value: x}; + }); + eachfn(arr, function (x, callback) { + iterator(x.value, function (v) { + if (!v) { + results.push(x); + } + callback(); + }); + }, function (err) { + callback(_map(results.sort(function (a, b) { + return a.index - b.index; + }), function (x) { + return x.value; + })); + }); + }; + async.reject = doParallel(_reject); + async.rejectSeries = doSeries(_reject); + + var _detect = function (eachfn, arr, iterator, main_callback) { + eachfn(arr, function (x, callback) { + iterator(x, function (result) { + if (result) { + main_callback(x); + main_callback = function () {}; + } + else { + callback(); + } + }); + }, function (err) { + main_callback(); + }); + }; + async.detect = doParallel(_detect); + async.detectSeries = doSeries(_detect); + + async.some = function (arr, iterator, main_callback) { + async.each(arr, function (x, callback) { + iterator(x, function (v) { + if (v) { + main_callback(true); + main_callback = function () {}; + } + callback(); + }); + }, function (err) { + main_callback(false); + }); + }; + // any alias + async.any = async.some; + + async.every = function (arr, iterator, main_callback) { + async.each(arr, function (x, callback) { + iterator(x, function (v) { + if (!v) { + main_callback(false); + main_callback = function () {}; + } + callback(); + }); + }, function (err) { + main_callback(true); + }); + }; + // all alias + async.all = async.every; + + async.sortBy = function (arr, iterator, callback) { + async.map(arr, function (x, callback) { + iterator(x, function (err, criteria) { + if (err) { + callback(err); + } + else { + callback(null, {value: x, criteria: criteria}); + } + }); + }, function (err, results) { + if (err) { + return callback(err); + } + else { + var fn = function (left, right) { + var a = left.criteria, b = right.criteria; + return a < b ? -1 : a > b ? 
1 : 0; + }; + callback(null, _map(results.sort(fn), function (x) { + return x.value; + })); + } + }); + }; + + async.auto = function (tasks, callback) { + callback = callback || function () {}; + var keys = _keys(tasks); + var remainingTasks = keys.length + if (!remainingTasks) { + return callback(); + } + + var results = {}; + + var listeners = []; + var addListener = function (fn) { + listeners.unshift(fn); + }; + var removeListener = function (fn) { + for (var i = 0; i < listeners.length; i += 1) { + if (listeners[i] === fn) { + listeners.splice(i, 1); + return; + } + } + }; + var taskComplete = function () { + remainingTasks-- + _each(listeners.slice(0), function (fn) { + fn(); + }); + }; + + addListener(function () { + if (!remainingTasks) { + var theCallback = callback; + // prevent final callback from calling itself if it errors + callback = function () {}; + + theCallback(null, results); + } + }); + + _each(keys, function (k) { + var task = _isArray(tasks[k]) ? tasks[k]: [tasks[k]]; + var taskCallback = function (err) { + var args = Array.prototype.slice.call(arguments, 1); + if (args.length <= 1) { + args = args[0]; + } + if (err) { + var safeResults = {}; + _each(_keys(results), function(rkey) { + safeResults[rkey] = results[rkey]; + }); + safeResults[k] = args; + callback(err, safeResults); + // stop subsequent errors hitting callback multiple times + callback = function () {}; + } + else { + results[k] = args; + async.setImmediate(taskComplete); + } + }; + var requires = task.slice(0, Math.abs(task.length - 1)) || []; + var ready = function () { + return _reduce(requires, function (a, x) { + return (a && results.hasOwnProperty(x)); + }, true) && !results.hasOwnProperty(k); + }; + if (ready()) { + task[task.length - 1](taskCallback, results); + } + else { + var listener = function () { + if (ready()) { + removeListener(listener); + task[task.length - 1](taskCallback, results); + } + }; + addListener(listener); + } + }); + }; + + async.retry = function(times, task, callback) { + var DEFAULT_TIMES = 5; + var attempts = []; + // Use defaults if times not passed + if (typeof times === 'function') { + callback = task; + task = times; + times = DEFAULT_TIMES; + } + // Make sure times is a number + times = parseInt(times, 10) || DEFAULT_TIMES; + var wrappedTask = function(wrappedCallback, wrappedResults) { + var retryAttempt = function(task, finalAttempt) { + return function(seriesCallback) { + task(function(err, result){ + seriesCallback(!err || finalAttempt, {err: err, result: result}); + }, wrappedResults); + }; + }; + while (times) { + attempts.push(retryAttempt(task, !(times-=1))); + } + async.series(attempts, function(done, data){ + data = data[data.length - 1]; + (wrappedCallback || callback)(data.err, data.result); + }); + } + // If a callback is passed, run this as a controll flow + return callback ? 
wrappedTask() : wrappedTask + }; + + async.waterfall = function (tasks, callback) { + callback = callback || function () {}; + if (!_isArray(tasks)) { + var err = new Error('First argument to waterfall must be an array of functions'); + return callback(err); + } + if (!tasks.length) { + return callback(); + } + var wrapIterator = function (iterator) { + return function (err) { + if (err) { + callback.apply(null, arguments); + callback = function () {}; + } + else { + var args = Array.prototype.slice.call(arguments, 1); + var next = iterator.next(); + if (next) { + args.push(wrapIterator(next)); + } + else { + args.push(callback); + } + async.setImmediate(function () { + iterator.apply(null, args); + }); + } + }; + }; + wrapIterator(async.iterator(tasks))(); + }; + + var _parallel = function(eachfn, tasks, callback) { + callback = callback || function () {}; + if (_isArray(tasks)) { + eachfn.map(tasks, function (fn, callback) { + if (fn) { + fn(function (err) { + var args = Array.prototype.slice.call(arguments, 1); + if (args.length <= 1) { + args = args[0]; + } + callback.call(null, err, args); + }); + } + }, callback); + } + else { + var results = {}; + eachfn.each(_keys(tasks), function (k, callback) { + tasks[k](function (err) { + var args = Array.prototype.slice.call(arguments, 1); + if (args.length <= 1) { + args = args[0]; + } + results[k] = args; + callback(err); + }); + }, function (err) { + callback(err, results); + }); + } + }; + + async.parallel = function (tasks, callback) { + _parallel({ map: async.map, each: async.each }, tasks, callback); + }; + + async.parallelLimit = function(tasks, limit, callback) { + _parallel({ map: _mapLimit(limit), each: _eachLimit(limit) }, tasks, callback); + }; + + async.series = function (tasks, callback) { + callback = callback || function () {}; + if (_isArray(tasks)) { + async.mapSeries(tasks, function (fn, callback) { + if (fn) { + fn(function (err) { + var args = Array.prototype.slice.call(arguments, 1); + if (args.length <= 1) { + args = args[0]; + } + callback.call(null, err, args); + }); + } + }, callback); + } + else { + var results = {}; + async.eachSeries(_keys(tasks), function (k, callback) { + tasks[k](function (err) { + var args = Array.prototype.slice.call(arguments, 1); + if (args.length <= 1) { + args = args[0]; + } + results[k] = args; + callback(err); + }); + }, function (err) { + callback(err, results); + }); + } + }; + + async.iterator = function (tasks) { + var makeCallback = function (index) { + var fn = function () { + if (tasks.length) { + tasks[index].apply(null, arguments); + } + return fn.next(); + }; + fn.next = function () { + return (index < tasks.length - 1) ? 
makeCallback(index + 1): null; + }; + return fn; + }; + return makeCallback(0); + }; + + async.apply = function (fn) { + var args = Array.prototype.slice.call(arguments, 1); + return function () { + return fn.apply( + null, args.concat(Array.prototype.slice.call(arguments)) + ); + }; + }; + + var _concat = function (eachfn, arr, fn, callback) { + var r = []; + eachfn(arr, function (x, cb) { + fn(x, function (err, y) { + r = r.concat(y || []); + cb(err); + }); + }, function (err) { + callback(err, r); + }); + }; + async.concat = doParallel(_concat); + async.concatSeries = doSeries(_concat); + + async.whilst = function (test, iterator, callback) { + if (test()) { + iterator(function (err) { + if (err) { + return callback(err); + } + async.whilst(test, iterator, callback); + }); + } + else { + callback(); + } + }; + + async.doWhilst = function (iterator, test, callback) { + iterator(function (err) { + if (err) { + return callback(err); + } + var args = Array.prototype.slice.call(arguments, 1); + if (test.apply(null, args)) { + async.doWhilst(iterator, test, callback); + } + else { + callback(); + } + }); + }; + + async.until = function (test, iterator, callback) { + if (!test()) { + iterator(function (err) { + if (err) { + return callback(err); + } + async.until(test, iterator, callback); + }); + } + else { + callback(); + } + }; + + async.doUntil = function (iterator, test, callback) { + iterator(function (err) { + if (err) { + return callback(err); + } + var args = Array.prototype.slice.call(arguments, 1); + if (!test.apply(null, args)) { + async.doUntil(iterator, test, callback); + } + else { + callback(); + } + }); + }; + + async.queue = function (worker, concurrency) { + if (concurrency === undefined) { + concurrency = 1; + } + function _insert(q, data, pos, callback) { + if (!q.started){ + q.started = true; + } + if (!_isArray(data)) { + data = [data]; + } + if(data.length == 0) { + // call drain immediately if there are no tasks + return async.setImmediate(function() { + if (q.drain) { + q.drain(); + } + }); + } + _each(data, function(task) { + var item = { + data: task, + callback: typeof callback === 'function' ? 
callback : null + }; + + if (pos) { + q.tasks.unshift(item); + } else { + q.tasks.push(item); + } + + if (q.saturated && q.tasks.length === q.concurrency) { + q.saturated(); + } + async.setImmediate(q.process); + }); + } + + var workers = 0; + var q = { + tasks: [], + concurrency: concurrency, + saturated: null, + empty: null, + drain: null, + started: false, + paused: false, + push: function (data, callback) { + _insert(q, data, false, callback); + }, + kill: function () { + q.drain = null; + q.tasks = []; + }, + unshift: function (data, callback) { + _insert(q, data, true, callback); + }, + process: function () { + if (!q.paused && workers < q.concurrency && q.tasks.length) { + var task = q.tasks.shift(); + if (q.empty && q.tasks.length === 0) { + q.empty(); + } + workers += 1; + var next = function () { + workers -= 1; + if (task.callback) { + task.callback.apply(task, arguments); + } + if (q.drain && q.tasks.length + workers === 0) { + q.drain(); + } + q.process(); + }; + var cb = only_once(next); + worker(task.data, cb); + } + }, + length: function () { + return q.tasks.length; + }, + running: function () { + return workers; + }, + idle: function() { + return q.tasks.length + workers === 0; + }, + pause: function () { + if (q.paused === true) { return; } + q.paused = true; + }, + resume: function () { + if (q.paused === false) { return; } + q.paused = false; + // Need to call q.process once per concurrent + // worker to preserve full concurrency after pause + for (var w = 1; w <= q.concurrency; w++) { + async.setImmediate(q.process); + } + } + }; + return q; + }; + + async.priorityQueue = function (worker, concurrency) { + + function _compareTasks(a, b){ + return a.priority - b.priority; + }; + + function _binarySearch(sequence, item, compare) { + var beg = -1, + end = sequence.length - 1; + while (beg < end) { + var mid = beg + ((end - beg + 1) >>> 1); + if (compare(item, sequence[mid]) >= 0) { + beg = mid; + } else { + end = mid - 1; + } + } + return beg; + } + + function _insert(q, data, priority, callback) { + if (!q.started){ + q.started = true; + } + if (!_isArray(data)) { + data = [data]; + } + if(data.length == 0) { + // call drain immediately if there are no tasks + return async.setImmediate(function() { + if (q.drain) { + q.drain(); + } + }); + } + _each(data, function(task) { + var item = { + data: task, + priority: priority, + callback: typeof callback === 'function' ? callback : null + }; + + q.tasks.splice(_binarySearch(q.tasks, item, _compareTasks) + 1, 0, item); + + if (q.saturated && q.tasks.length === q.concurrency) { + q.saturated(); + } + async.setImmediate(q.process); + }); + } + + // Start with a normal queue + var q = async.queue(worker, concurrency); + + // Override push to accept second parameter representing priority + q.push = function (data, priority, callback) { + _insert(q, data, priority, callback); + }; + + // Remove unshift function + delete q.unshift; + + return q; + }; + + async.cargo = function (worker, payload) { + var working = false, + tasks = []; + + var cargo = { + tasks: tasks, + payload: payload, + saturated: null, + empty: null, + drain: null, + drained: true, + push: function (data, callback) { + if (!_isArray(data)) { + data = [data]; + } + _each(data, function(task) { + tasks.push({ + data: task, + callback: typeof callback === 'function' ? 
callback : null + }); + cargo.drained = false; + if (cargo.saturated && tasks.length === payload) { + cargo.saturated(); + } + }); + async.setImmediate(cargo.process); + }, + process: function process() { + if (working) return; + if (tasks.length === 0) { + if(cargo.drain && !cargo.drained) cargo.drain(); + cargo.drained = true; + return; + } + + var ts = typeof payload === 'number' + ? tasks.splice(0, payload) + : tasks.splice(0, tasks.length); + + var ds = _map(ts, function (task) { + return task.data; + }); + + if(cargo.empty) cargo.empty(); + working = true; + worker(ds, function () { + working = false; + + var args = arguments; + _each(ts, function (data) { + if (data.callback) { + data.callback.apply(null, args); + } + }); + + process(); + }); + }, + length: function () { + return tasks.length; + }, + running: function () { + return working; + } + }; + return cargo; + }; + + var _console_fn = function (name) { + return function (fn) { + var args = Array.prototype.slice.call(arguments, 1); + fn.apply(null, args.concat([function (err) { + var args = Array.prototype.slice.call(arguments, 1); + if (typeof console !== 'undefined') { + if (err) { + if (console.error) { + console.error(err); + } + } + else if (console[name]) { + _each(args, function (x) { + console[name](x); + }); + } + } + }])); + }; + }; + async.log = _console_fn('log'); + async.dir = _console_fn('dir'); + /*async.info = _console_fn('info'); + async.warn = _console_fn('warn'); + async.error = _console_fn('error');*/ + + async.memoize = function (fn, hasher) { + var memo = {}; + var queues = {}; + hasher = hasher || function (x) { + return x; + }; + var memoized = function () { + var args = Array.prototype.slice.call(arguments); + var callback = args.pop(); + var key = hasher.apply(null, args); + if (key in memo) { + async.nextTick(function () { + callback.apply(null, memo[key]); + }); + } + else if (key in queues) { + queues[key].push(callback); + } + else { + queues[key] = [callback]; + fn.apply(null, args.concat([function () { + memo[key] = arguments; + var q = queues[key]; + delete queues[key]; + for (var i = 0, l = q.length; i < l; i++) { + q[i].apply(null, arguments); + } + }])); + } + }; + memoized.memo = memo; + memoized.unmemoized = fn; + return memoized; + }; + + async.unmemoize = function (fn) { + return function () { + return (fn.unmemoized || fn).apply(null, arguments); + }; + }; + + async.times = function (count, iterator, callback) { + var counter = []; + for (var i = 0; i < count; i++) { + counter.push(i); + } + return async.map(counter, iterator, callback); + }; + + async.timesSeries = function (count, iterator, callback) { + var counter = []; + for (var i = 0; i < count; i++) { + counter.push(i); + } + return async.mapSeries(counter, iterator, callback); + }; + + async.seq = function (/* functions... */) { + var fns = arguments; + return function () { + var that = this; + var args = Array.prototype.slice.call(arguments); + var callback = args.pop(); + async.reduce(fns, args, function (newargs, fn, cb) { + fn.apply(that, newargs.concat([function () { + var err = arguments[0]; + var nextargs = Array.prototype.slice.call(arguments, 1); + cb(err, nextargs); + }])) + }, + function (err, results) { + callback.apply(that, [err].concat(results)); + }); + }; + }; + + async.compose = function (/* functions... 
*/) { + return async.seq.apply(null, Array.prototype.reverse.call(arguments)); + }; + + var _applyEach = function (eachfn, fns /*args...*/) { + var go = function () { + var that = this; + var args = Array.prototype.slice.call(arguments); + var callback = args.pop(); + return eachfn(fns, function (fn, cb) { + fn.apply(that, args.concat([cb])); + }, + callback); + }; + if (arguments.length > 2) { + var args = Array.prototype.slice.call(arguments, 2); + return go.apply(this, args); + } + else { + return go; + } + }; + async.applyEach = doParallel(_applyEach); + async.applyEachSeries = doSeries(_applyEach); + + async.forever = function (fn, callback) { + function next(err) { + if (err) { + if (callback) { + return callback(err); + } + throw err; + } + fn(next); + } + next(); + }; + + // Node.js + if (typeof module !== 'undefined' && module.exports) { + module.exports = async; + } + // AMD / RequireJS + else if (typeof define !== 'undefined' && define.amd) { + define([], function () { + return async; + }); + } + // included directly via ` + +[Get supported base64-js with the Tidelift Subscription](https://tidelift.com/subscription/pkg/npm-base64-js?utm_source=npm-base64-js&utm_medium=referral&utm_campaign=readme) + +## methods + +`base64js` has three exposed functions, `byteLength`, `toByteArray` and `fromByteArray`, which both take a single argument. + +* `byteLength` - Takes a base64 string and returns length of byte array +* `toByteArray` - Takes a base64 string and returns a byte array +* `fromByteArray` - Takes a byte array and returns a base64 string + +## license + +MIT diff --git a/node_modules/base64-js/base64js.min.js b/node_modules/base64-js/base64js.min.js new file mode 100644 index 000000000..908ac83fd --- /dev/null +++ b/node_modules/base64-js/base64js.min.js @@ -0,0 +1 @@ +(function(a){if("object"==typeof exports&&"undefined"!=typeof module)module.exports=a();else if("function"==typeof define&&define.amd)define([],a);else{var b;b="undefined"==typeof window?"undefined"==typeof global?"undefined"==typeof self?this:self:global:window,b.base64js=a()}})(function(){return function(){function b(d,e,g){function a(j,i){if(!e[j]){if(!d[j]){var f="function"==typeof require&&require;if(!i&&f)return f(j,!0);if(h)return h(j,!0);var c=new Error("Cannot find module '"+j+"'");throw c.code="MODULE_NOT_FOUND",c}var k=e[j]={exports:{}};d[j][0].call(k.exports,function(b){var c=d[j][1][b];return a(c||b)},k,k.exports,b,d,e,g)}return e[j].exports}for(var h="function"==typeof require&&require,c=0;c>16,j[k++]=255&b>>8,j[k++]=255&b;return 2===h&&(b=l[a.charCodeAt(c)]<<2|l[a.charCodeAt(c+1)]>>4,j[k++]=255&b),1===h&&(b=l[a.charCodeAt(c)]<<10|l[a.charCodeAt(c+1)]<<4|l[a.charCodeAt(c+2)]>>2,j[k++]=255&b>>8,j[k++]=255&b),j}function g(a){return k[63&a>>18]+k[63&a>>12]+k[63&a>>6]+k[63&a]}function h(a,b,c){for(var d,e=[],f=b;fj?j:g+f));return 1===d?(b=a[c-1],e.push(k[b>>2]+k[63&b<<4]+"==")):2===d&&(b=(a[c-2]<<8)+a[c-1],e.push(k[b>>10]+k[63&b>>4]+k[63&b<<2]+"=")),e.join("")}c.byteLength=function(a){var b=d(a),c=b[0],e=b[1];return 3*(c+e)/4-e},c.toByteArray=f,c.fromByteArray=j;for(var k=[],l=[],m="undefined"==typeof Uint8Array?Array:Uint8Array,n="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/",o=0,p=n.length;o 0) { + throw new Error('Invalid string. 
Length must be a multiple of 4') + } + + // Trim off extra bytes after placeholder bytes are found + // See: https://github.com/beatgammit/base64-js/issues/42 + var validLen = b64.indexOf('=') + if (validLen === -1) validLen = len + + var placeHoldersLen = validLen === len + ? 0 + : 4 - (validLen % 4) + + return [validLen, placeHoldersLen] +} + +// base64 is 4/3 + up to two characters of the original data +function byteLength (b64) { + var lens = getLens(b64) + var validLen = lens[0] + var placeHoldersLen = lens[1] + return ((validLen + placeHoldersLen) * 3 / 4) - placeHoldersLen +} + +function _byteLength (b64, validLen, placeHoldersLen) { + return ((validLen + placeHoldersLen) * 3 / 4) - placeHoldersLen +} + +function toByteArray (b64) { + var tmp + var lens = getLens(b64) + var validLen = lens[0] + var placeHoldersLen = lens[1] + + var arr = new Arr(_byteLength(b64, validLen, placeHoldersLen)) + + var curByte = 0 + + // if there are placeholders, only get up to the last complete 4 chars + var len = placeHoldersLen > 0 + ? validLen - 4 + : validLen + + var i + for (i = 0; i < len; i += 4) { + tmp = + (revLookup[b64.charCodeAt(i)] << 18) | + (revLookup[b64.charCodeAt(i + 1)] << 12) | + (revLookup[b64.charCodeAt(i + 2)] << 6) | + revLookup[b64.charCodeAt(i + 3)] + arr[curByte++] = (tmp >> 16) & 0xFF + arr[curByte++] = (tmp >> 8) & 0xFF + arr[curByte++] = tmp & 0xFF + } + + if (placeHoldersLen === 2) { + tmp = + (revLookup[b64.charCodeAt(i)] << 2) | + (revLookup[b64.charCodeAt(i + 1)] >> 4) + arr[curByte++] = tmp & 0xFF + } + + if (placeHoldersLen === 1) { + tmp = + (revLookup[b64.charCodeAt(i)] << 10) | + (revLookup[b64.charCodeAt(i + 1)] << 4) | + (revLookup[b64.charCodeAt(i + 2)] >> 2) + arr[curByte++] = (tmp >> 8) & 0xFF + arr[curByte++] = tmp & 0xFF + } + + return arr +} + +function tripletToBase64 (num) { + return lookup[num >> 18 & 0x3F] + + lookup[num >> 12 & 0x3F] + + lookup[num >> 6 & 0x3F] + + lookup[num & 0x3F] +} + +function encodeChunk (uint8, start, end) { + var tmp + var output = [] + for (var i = start; i < end; i += 3) { + tmp = + ((uint8[i] << 16) & 0xFF0000) + + ((uint8[i + 1] << 8) & 0xFF00) + + (uint8[i + 2] & 0xFF) + output.push(tripletToBase64(tmp)) + } + return output.join('') +} + +function fromByteArray (uint8) { + var tmp + var len = uint8.length + var extraBytes = len % 3 // if we have 1 byte left, pad 2 bytes + var parts = [] + var maxChunkLength = 16383 // must be multiple of 3 + + // go through the array every three bytes, we'll deal with trailing stuff later + for (var i = 0, len2 = len - extraBytes; i < len2; i += maxChunkLength) { + parts.push(encodeChunk(uint8, i, (i + maxChunkLength) > len2 ? 
len2 : (i + maxChunkLength))) + } + + // pad the end with zeros, but make sure to not forget the extra bytes + if (extraBytes === 1) { + tmp = uint8[len - 1] + parts.push( + lookup[tmp >> 2] + + lookup[(tmp << 4) & 0x3F] + + '==' + ) + } else if (extraBytes === 2) { + tmp = (uint8[len - 2] << 8) + uint8[len - 1] + parts.push( + lookup[tmp >> 10] + + lookup[(tmp >> 4) & 0x3F] + + lookup[(tmp << 2) & 0x3F] + + '=' + ) + } + + return parts.join('') +} diff --git a/node_modules/base64-js/package.json b/node_modules/base64-js/package.json new file mode 100644 index 000000000..2eb69c940 --- /dev/null +++ b/node_modules/base64-js/package.json @@ -0,0 +1,75 @@ +{ + "_from": "base64-js@^1.3.1", + "_id": "base64-js@1.5.1", + "_inBundle": false, + "_integrity": "sha512-AKpaYlHn8t4SVbOHCy+b5+KKgvR4vrsD8vbvrbiQJps7fKDTkjkDry6ji0rUJjC0kzbNePLwzxq8iypo41qeWA==", + "_location": "/base64-js", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "base64-js@^1.3.1", + "name": "base64-js", + "escapedName": "base64-js", + "rawSpec": "^1.3.1", + "saveSpec": null, + "fetchSpec": "^1.3.1" + }, + "_requiredBy": [ + "/buffer" + ], + "_resolved": "https://registry.npmjs.org/base64-js/-/base64-js-1.5.1.tgz", + "_shasum": "1b1b440160a5bf7ad40b650f095963481903930a", + "_spec": "base64-js@^1.3.1", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\buffer", + "author": { + "name": "T. Jameson Little", + "email": "t.jameson.little@gmail.com" + }, + "bugs": { + "url": "https://github.com/beatgammit/base64-js/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Base64 encoding/decoding in pure JS", + "devDependencies": { + "babel-minify": "^0.5.1", + "benchmark": "^2.1.4", + "browserify": "^16.3.0", + "standard": "*", + "tape": "4.x" + }, + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "homepage": "https://github.com/beatgammit/base64-js", + "keywords": [ + "base64" + ], + "license": "MIT", + "main": "index.js", + "name": "base64-js", + "repository": { + "type": "git", + "url": "git://github.com/beatgammit/base64-js.git" + }, + "scripts": { + "build": "browserify -s base64js -r ./ | minify > base64js.min.js", + "lint": "standard", + "test": "npm run lint && npm run unit", + "unit": "tape test/*.js" + }, + "typings": "index.d.ts", + "version": "1.5.1" +} diff --git a/node_modules/binary-extensions/binary-extensions.json b/node_modules/binary-extensions/binary-extensions.json new file mode 100644 index 000000000..4aab38378 --- /dev/null +++ b/node_modules/binary-extensions/binary-extensions.json @@ -0,0 +1,260 @@ +[ + "3dm", + "3ds", + "3g2", + "3gp", + "7z", + "a", + "aac", + "adp", + "ai", + "aif", + "aiff", + "alz", + "ape", + "apk", + "appimage", + "ar", + "arj", + "asf", + "au", + "avi", + "bak", + "baml", + "bh", + "bin", + "bk", + "bmp", + "btif", + "bz2", + "bzip2", + "cab", + "caf", + "cgm", + "class", + "cmx", + "cpio", + "cr2", + "cur", + "dat", + "dcm", + "deb", + "dex", + "djvu", + "dll", + "dmg", + "dng", + "doc", + "docm", + "docx", + "dot", + "dotm", + "dra", + "DS_Store", + "dsk", + "dts", + "dtshd", + "dvb", + "dwg", + "dxf", + "ecelp4800", + "ecelp7470", + "ecelp9600", + "egg", + "eol", + "eot", + "epub", + "exe", + "f4v", + "fbs", + "fh", + "fla", + "flac", + "flatpak", + "fli", + "flv", + "fpx", + "fst", + 
"fvt", + "g3", + "gh", + "gif", + "graffle", + "gz", + "gzip", + "h261", + "h263", + "h264", + "icns", + "ico", + "ief", + "img", + "ipa", + "iso", + "jar", + "jpeg", + "jpg", + "jpgv", + "jpm", + "jxr", + "key", + "ktx", + "lha", + "lib", + "lvp", + "lz", + "lzh", + "lzma", + "lzo", + "m3u", + "m4a", + "m4v", + "mar", + "mdi", + "mht", + "mid", + "midi", + "mj2", + "mka", + "mkv", + "mmr", + "mng", + "mobi", + "mov", + "movie", + "mp3", + "mp4", + "mp4a", + "mpeg", + "mpg", + "mpga", + "mxu", + "nef", + "npx", + "numbers", + "nupkg", + "o", + "odp", + "ods", + "odt", + "oga", + "ogg", + "ogv", + "otf", + "ott", + "pages", + "pbm", + "pcx", + "pdb", + "pdf", + "pea", + "pgm", + "pic", + "png", + "pnm", + "pot", + "potm", + "potx", + "ppa", + "ppam", + "ppm", + "pps", + "ppsm", + "ppsx", + "ppt", + "pptm", + "pptx", + "psd", + "pya", + "pyc", + "pyo", + "pyv", + "qt", + "rar", + "ras", + "raw", + "resources", + "rgb", + "rip", + "rlc", + "rmf", + "rmvb", + "rpm", + "rtf", + "rz", + "s3m", + "s7z", + "scpt", + "sgi", + "shar", + "snap", + "sil", + "sketch", + "slk", + "smv", + "snk", + "so", + "stl", + "suo", + "sub", + "swf", + "tar", + "tbz", + "tbz2", + "tga", + "tgz", + "thmx", + "tif", + "tiff", + "tlz", + "ttc", + "ttf", + "txz", + "udf", + "uvh", + "uvi", + "uvm", + "uvp", + "uvs", + "uvu", + "viv", + "vob", + "war", + "wav", + "wax", + "wbmp", + "wdp", + "weba", + "webm", + "webp", + "whl", + "wim", + "wm", + "wma", + "wmv", + "wmx", + "woff", + "woff2", + "wrm", + "wvx", + "xbm", + "xif", + "xla", + "xlam", + "xls", + "xlsb", + "xlsm", + "xlsx", + "xlt", + "xltm", + "xltx", + "xm", + "xmind", + "xpi", + "xpm", + "xwd", + "xz", + "z", + "zip", + "zipx" +] diff --git a/node_modules/binary-extensions/binary-extensions.json.d.ts b/node_modules/binary-extensions/binary-extensions.json.d.ts new file mode 100644 index 000000000..94a248c2b --- /dev/null +++ b/node_modules/binary-extensions/binary-extensions.json.d.ts @@ -0,0 +1,3 @@ +declare const binaryExtensionsJson: readonly string[]; + +export = binaryExtensionsJson; diff --git a/node_modules/binary-extensions/index.d.ts b/node_modules/binary-extensions/index.d.ts new file mode 100644 index 000000000..f469ac5fb --- /dev/null +++ b/node_modules/binary-extensions/index.d.ts @@ -0,0 +1,14 @@ +/** +List of binary file extensions. 
+ +@example +``` +import binaryExtensions = require('binary-extensions'); + +console.log(binaryExtensions); +//=> ['3ds', '3g2', …] +``` +*/ +declare const binaryExtensions: readonly string[]; + +export = binaryExtensions; diff --git a/node_modules/binary-extensions/index.js b/node_modules/binary-extensions/index.js new file mode 100644 index 000000000..d46e46886 --- /dev/null +++ b/node_modules/binary-extensions/index.js @@ -0,0 +1 @@ +module.exports = require('./binary-extensions.json'); diff --git a/node_modules/binary-extensions/license b/node_modules/binary-extensions/license new file mode 100644 index 000000000..401b1c731 --- /dev/null +++ b/node_modules/binary-extensions/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) 2019 Sindre Sorhus (https://sindresorhus.com), Paul Miller (https://paulmillr.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/binary-extensions/package.json b/node_modules/binary-extensions/package.json new file mode 100644 index 000000000..86aa375e0 --- /dev/null +++ b/node_modules/binary-extensions/package.json @@ -0,0 +1,70 @@ +{ + "_from": "binary-extensions@^2.0.0", + "_id": "binary-extensions@2.2.0", + "_inBundle": false, + "_integrity": "sha512-jDctJ/IVQbZoJykoeHbhXpOlNBqGNcwXJKJog42E5HDPUwQTSdjCHdihjj0DlnheQ7blbT6dHOafNAiS8ooQKA==", + "_location": "/binary-extensions", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "binary-extensions@^2.0.0", + "name": "binary-extensions", + "escapedName": "binary-extensions", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/is-binary-path" + ], + "_resolved": "https://registry.npmjs.org/binary-extensions/-/binary-extensions-2.2.0.tgz", + "_shasum": "75f502eeaf9ffde42fc98829645be4ea76bd9e2d", + "_spec": "binary-extensions@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\is-binary-path", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/binary-extensions/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "List of binary file extensions", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts", + "binary-extensions.json", + "binary-extensions.json.d.ts" + ], + "homepage": "https://github.com/sindresorhus/binary-extensions#readme", + "keywords": [ + "binary", + "extensions", + "extension", + "file", + "json", + "list", + "array" + ], + "license": "MIT", + "name": "binary-extensions", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/binary-extensions.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "2.2.0" +} diff --git a/node_modules/binary-extensions/readme.md b/node_modules/binary-extensions/readme.md new file mode 100644 index 000000000..3e25dd835 --- /dev/null +++ b/node_modules/binary-extensions/readme.md @@ -0,0 +1,41 @@ +# binary-extensions + +> List of binary file extensions + +The list is just a [JSON file](binary-extensions.json) and can be used anywhere. + + +## Install + +``` +$ npm install binary-extensions +``` + + +## Usage + +```js +const binaryExtensions = require('binary-extensions'); + +console.log(binaryExtensions); +//=> ['3ds', '3g2', …] +``` + + +## Related + +- [is-binary-path](https://github.com/sindresorhus/is-binary-path) - Check if a filepath is a binary file +- [text-extensions](https://github.com/sindresorhus/text-extensions) - List of text file extensions + + +--- + +
+Get professional support for this package with a Tidelift subscription
+
+Tidelift helps make open source sustainable for maintainers while giving companies
+assurances about security, maintenance, and licensing for their dependencies.
+
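The readme above shows that `binary-extensions` simply exports a JSON array of extension strings. As a hedged illustration only (not part of this diff), a consumer can wrap that list in a small lookup helper, which is essentially what the related `is-binary-path` package does; the `isBinaryPath` helper and the sample file names below are hypothetical:

```js
// Sketch: wrap the exported extension list in a Set for fast lookups.
const path = require('path');
const binaryExtensions = require('binary-extensions');

const extensions = new Set(binaryExtensions);

// Returns true when the file's extension appears in the list above.
function isBinaryPath(filePath) {
  return extensions.has(path.extname(filePath).slice(1).toLowerCase());
}

console.log(isBinaryPath('photo.PNG')); //=> true
console.log(isBinaryPath('notes.txt')); //=> false
```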
diff --git a/node_modules/body-parser/HISTORY.md b/node_modules/body-parser/HISTORY.md new file mode 100644 index 000000000..a1d3fbfbb --- /dev/null +++ b/node_modules/body-parser/HISTORY.md @@ -0,0 +1,609 @@ +1.19.0 / 2019-04-25 +=================== + + * deps: bytes@3.1.0 + - Add petabyte (`pb`) support + * deps: http-errors@1.7.2 + - Set constructor name when possible + - deps: setprototypeof@1.1.1 + - deps: statuses@'>= 1.5.0 < 2' + * deps: iconv-lite@0.4.24 + - Added encoding MIK + * deps: qs@6.7.0 + - Fix parsing array brackets after index + * deps: raw-body@2.4.0 + - deps: bytes@3.1.0 + - deps: http-errors@1.7.2 + - deps: iconv-lite@0.4.24 + * deps: type-is@~1.6.17 + - deps: mime-types@~2.1.24 + - perf: prevent internal `throw` on invalid type + +1.18.3 / 2018-05-14 +=================== + + * Fix stack trace for strict json parse error + * deps: depd@~1.1.2 + - perf: remove argument reassignment + * deps: http-errors@~1.6.3 + - deps: depd@~1.1.2 + - deps: setprototypeof@1.1.0 + - deps: statuses@'>= 1.3.1 < 2' + * deps: iconv-lite@0.4.23 + - Fix loading encoding with year appended + - Fix deprecation warnings on Node.js 10+ + * deps: qs@6.5.2 + * deps: raw-body@2.3.3 + - deps: http-errors@1.6.3 + - deps: iconv-lite@0.4.23 + * deps: type-is@~1.6.16 + - deps: mime-types@~2.1.18 + +1.18.2 / 2017-09-22 +=================== + + * deps: debug@2.6.9 + * perf: remove argument reassignment + +1.18.1 / 2017-09-12 +=================== + + * deps: content-type@~1.0.4 + - perf: remove argument reassignment + - perf: skip parameter parsing when no parameters + * deps: iconv-lite@0.4.19 + - Fix ISO-8859-1 regression + - Update Windows-1255 + * deps: qs@6.5.1 + - Fix parsing & compacting very deep objects + * deps: raw-body@2.3.2 + - deps: iconv-lite@0.4.19 + +1.18.0 / 2017-09-08 +=================== + + * Fix JSON strict violation error to match native parse error + * Include the `body` property on verify errors + * Include the `type` property on all generated errors + * Use `http-errors` to set status code on errors + * deps: bytes@3.0.0 + * deps: debug@2.6.8 + * deps: depd@~1.1.1 + - Remove unnecessary `Buffer` loading + * deps: http-errors@~1.6.2 + - deps: depd@1.1.1 + * deps: iconv-lite@0.4.18 + - Add support for React Native + - Add a warning if not loaded as utf-8 + - Fix CESU-8 decoding in Node.js 8 + - Improve speed of ISO-8859-1 encoding + * deps: qs@6.5.0 + * deps: raw-body@2.3.1 + - Use `http-errors` for standard emitted errors + - deps: bytes@3.0.0 + - deps: iconv-lite@0.4.18 + - perf: skip buffer decoding on overage chunk + * perf: prevent internal `throw` when missing charset + +1.17.2 / 2017-05-17 +=================== + + * deps: debug@2.6.7 + - Fix `DEBUG_MAX_ARRAY_LENGTH` + - deps: ms@2.0.0 + * deps: type-is@~1.6.15 + - deps: mime-types@~2.1.15 + +1.17.1 / 2017-03-06 +=================== + + * deps: qs@6.4.0 + - Fix regression parsing keys starting with `[` + +1.17.0 / 2017-03-01 +=================== + + * deps: http-errors@~1.6.1 + - Make `message` property enumerable for `HttpError`s + - deps: setprototypeof@1.0.3 + * deps: qs@6.3.1 + - Fix compacting nested arrays + +1.16.1 / 2017-02-10 +=================== + + * deps: debug@2.6.1 + - Fix deprecation messages in WebStorm and other editors + - Undeprecate `DEBUG_FD` set to `1` or `2` + +1.16.0 / 2017-01-17 +=================== + + * deps: debug@2.6.0 + - Allow colors in workers + - Deprecated `DEBUG_FD` environment variable + - Fix error when running under React Native + - Use same color for same namespace + - deps: ms@0.7.2 + * 
deps: http-errors@~1.5.1 + - deps: inherits@2.0.3 + - deps: setprototypeof@1.0.2 + - deps: statuses@'>= 1.3.1 < 2' + * deps: iconv-lite@0.4.15 + - Added encoding MS-31J + - Added encoding MS-932 + - Added encoding MS-936 + - Added encoding MS-949 + - Added encoding MS-950 + - Fix GBK/GB18030 handling of Euro character + * deps: qs@6.2.1 + - Fix array parsing from skipping empty values + * deps: raw-body@~2.2.0 + - deps: iconv-lite@0.4.15 + * deps: type-is@~1.6.14 + - deps: mime-types@~2.1.13 + +1.15.2 / 2016-06-19 +=================== + + * deps: bytes@2.4.0 + * deps: content-type@~1.0.2 + - perf: enable strict mode + * deps: http-errors@~1.5.0 + - Use `setprototypeof` module to replace `__proto__` setting + - deps: statuses@'>= 1.3.0 < 2' + - perf: enable strict mode + * deps: qs@6.2.0 + * deps: raw-body@~2.1.7 + - deps: bytes@2.4.0 + - perf: remove double-cleanup on happy path + * deps: type-is@~1.6.13 + - deps: mime-types@~2.1.11 + +1.15.1 / 2016-05-05 +=================== + + * deps: bytes@2.3.0 + - Drop partial bytes on all parsed units + - Fix parsing byte string that looks like hex + * deps: raw-body@~2.1.6 + - deps: bytes@2.3.0 + * deps: type-is@~1.6.12 + - deps: mime-types@~2.1.10 + +1.15.0 / 2016-02-10 +=================== + + * deps: http-errors@~1.4.0 + - Add `HttpError` export, for `err instanceof createError.HttpError` + - deps: inherits@2.0.1 + - deps: statuses@'>= 1.2.1 < 2' + * deps: qs@6.1.0 + * deps: type-is@~1.6.11 + - deps: mime-types@~2.1.9 + +1.14.2 / 2015-12-16 +=================== + + * deps: bytes@2.2.0 + * deps: iconv-lite@0.4.13 + * deps: qs@5.2.0 + * deps: raw-body@~2.1.5 + - deps: bytes@2.2.0 + - deps: iconv-lite@0.4.13 + * deps: type-is@~1.6.10 + - deps: mime-types@~2.1.8 + +1.14.1 / 2015-09-27 +=================== + + * Fix issue where invalid charset results in 400 when `verify` used + * deps: iconv-lite@0.4.12 + - Fix CESU-8 decoding in Node.js 4.x + * deps: raw-body@~2.1.4 + - Fix masking critical errors from `iconv-lite` + - deps: iconv-lite@0.4.12 + * deps: type-is@~1.6.9 + - deps: mime-types@~2.1.7 + +1.14.0 / 2015-09-16 +=================== + + * Fix JSON strict parse error to match syntax errors + * Provide static `require` analysis in `urlencoded` parser + * deps: depd@~1.1.0 + - Support web browser loading + * deps: qs@5.1.0 + * deps: raw-body@~2.1.3 + - Fix sync callback when attaching data listener causes sync read + * deps: type-is@~1.6.8 + - Fix type error when given invalid type to match against + - deps: mime-types@~2.1.6 + +1.13.3 / 2015-07-31 +=================== + + * deps: type-is@~1.6.6 + - deps: mime-types@~2.1.4 + +1.13.2 / 2015-07-05 +=================== + + * deps: iconv-lite@0.4.11 + * deps: qs@4.0.0 + - Fix dropping parameters like `hasOwnProperty` + - Fix user-visible incompatibilities from 3.1.0 + - Fix various parsing edge cases + * deps: raw-body@~2.1.2 + - Fix error stack traces to skip `makeError` + - deps: iconv-lite@0.4.11 + * deps: type-is@~1.6.4 + - deps: mime-types@~2.1.2 + - perf: enable strict mode + - perf: remove argument reassignment + +1.13.1 / 2015-06-16 +=================== + + * deps: qs@2.4.2 + - Downgraded from 3.1.0 because of user-visible incompatibilities + +1.13.0 / 2015-06-14 +=================== + + * Add `statusCode` property on `Error`s, in addition to `status` + * Change `type` default to `application/json` for JSON parser + * Change `type` default to `application/x-www-form-urlencoded` for urlencoded parser + * Provide static `require` analysis + * Use the `http-errors` module to generate errors + * 
deps: bytes@2.1.0 + - Slight optimizations + * deps: iconv-lite@0.4.10 + - The encoding UTF-16 without BOM now defaults to UTF-16LE when detection fails + - Leading BOM is now removed when decoding + * deps: on-finished@~2.3.0 + - Add defined behavior for HTTP `CONNECT` requests + - Add defined behavior for HTTP `Upgrade` requests + - deps: ee-first@1.1.1 + * deps: qs@3.1.0 + - Fix dropping parameters like `hasOwnProperty` + - Fix various parsing edge cases + - Parsed object now has `null` prototype + * deps: raw-body@~2.1.1 + - Use `unpipe` module for unpiping requests + - deps: iconv-lite@0.4.10 + * deps: type-is@~1.6.3 + - deps: mime-types@~2.1.1 + - perf: reduce try block size + - perf: remove bitwise operations + * perf: enable strict mode + * perf: remove argument reassignment + * perf: remove delete call + +1.12.4 / 2015-05-10 +=================== + + * deps: debug@~2.2.0 + * deps: qs@2.4.2 + - Fix allowing parameters like `constructor` + * deps: on-finished@~2.2.1 + * deps: raw-body@~2.0.1 + - Fix a false-positive when unpiping in Node.js 0.8 + - deps: bytes@2.0.1 + * deps: type-is@~1.6.2 + - deps: mime-types@~2.0.11 + +1.12.3 / 2015-04-15 +=================== + + * Slight efficiency improvement when not debugging + * deps: depd@~1.0.1 + * deps: iconv-lite@0.4.8 + - Add encoding alias UNICODE-1-1-UTF-7 + * deps: raw-body@1.3.4 + - Fix hanging callback if request aborts during read + - deps: iconv-lite@0.4.8 + +1.12.2 / 2015-03-16 +=================== + + * deps: qs@2.4.1 + - Fix error when parameter `hasOwnProperty` is present + +1.12.1 / 2015-03-15 +=================== + + * deps: debug@~2.1.3 + - Fix high intensity foreground color for bold + - deps: ms@0.7.0 + * deps: type-is@~1.6.1 + - deps: mime-types@~2.0.10 + +1.12.0 / 2015-02-13 +=================== + + * add `debug` messages + * accept a function for the `type` option + * use `content-type` to parse `Content-Type` headers + * deps: iconv-lite@0.4.7 + - Gracefully support enumerables on `Object.prototype` + * deps: raw-body@1.3.3 + - deps: iconv-lite@0.4.7 + * deps: type-is@~1.6.0 + - fix argument reassignment + - fix false-positives in `hasBody` `Transfer-Encoding` check + - support wildcard for both type and subtype (`*/*`) + - deps: mime-types@~2.0.9 + +1.11.0 / 2015-01-30 +=================== + + * make internal `extended: true` depth limit infinity + * deps: type-is@~1.5.6 + - deps: mime-types@~2.0.8 + +1.10.2 / 2015-01-20 +=================== + + * deps: iconv-lite@0.4.6 + - Fix rare aliases of single-byte encodings + * deps: raw-body@1.3.2 + - deps: iconv-lite@0.4.6 + +1.10.1 / 2015-01-01 +=================== + + * deps: on-finished@~2.2.0 + * deps: type-is@~1.5.5 + - deps: mime-types@~2.0.7 + +1.10.0 / 2014-12-02 +=================== + + * make internal `extended: true` array limit dynamic + +1.9.3 / 2014-11-21 +================== + + * deps: iconv-lite@0.4.5 + - Fix Windows-31J and X-SJIS encoding support + * deps: qs@2.3.3 + - Fix `arrayLimit` behavior + * deps: raw-body@1.3.1 + - deps: iconv-lite@0.4.5 + * deps: type-is@~1.5.3 + - deps: mime-types@~2.0.3 + +1.9.2 / 2014-10-27 +================== + + * deps: qs@2.3.2 + - Fix parsing of mixed objects and values + +1.9.1 / 2014-10-22 +================== + + * deps: on-finished@~2.1.1 + - Fix handling of pipelined requests + * deps: qs@2.3.0 + - Fix parsing of mixed implicit and explicit arrays + * deps: type-is@~1.5.2 + - deps: mime-types@~2.0.2 + +1.9.0 / 2014-09-24 +================== + + * include the charset in "unsupported charset" error message + * include the 
encoding in "unsupported content encoding" error message + * deps: depd@~1.0.0 + +1.8.4 / 2014-09-23 +================== + + * fix content encoding to be case-insensitive + +1.8.3 / 2014-09-19 +================== + + * deps: qs@2.2.4 + - Fix issue with object keys starting with numbers truncated + +1.8.2 / 2014-09-15 +================== + + * deps: depd@0.4.5 + +1.8.1 / 2014-09-07 +================== + + * deps: media-typer@0.3.0 + * deps: type-is@~1.5.1 + +1.8.0 / 2014-09-05 +================== + + * make empty-body-handling consistent between chunked requests + - empty `json` produces `{}` + - empty `raw` produces `new Buffer(0)` + - empty `text` produces `''` + - empty `urlencoded` produces `{}` + * deps: qs@2.2.3 + - Fix issue where first empty value in array is discarded + * deps: type-is@~1.5.0 + - fix `hasbody` to be true for `content-length: 0` + +1.7.0 / 2014-09-01 +================== + + * add `parameterLimit` option to `urlencoded` parser + * change `urlencoded` extended array limit to 100 + * respond with 413 when over `parameterLimit` in `urlencoded` + +1.6.7 / 2014-08-29 +================== + + * deps: qs@2.2.2 + - Remove unnecessary cloning + +1.6.6 / 2014-08-27 +================== + + * deps: qs@2.2.0 + - Array parsing fix + - Performance improvements + +1.6.5 / 2014-08-16 +================== + + * deps: on-finished@2.1.0 + +1.6.4 / 2014-08-14 +================== + + * deps: qs@1.2.2 + +1.6.3 / 2014-08-10 +================== + + * deps: qs@1.2.1 + +1.6.2 / 2014-08-07 +================== + + * deps: qs@1.2.0 + - Fix parsing array of objects + +1.6.1 / 2014-08-06 +================== + + * deps: qs@1.1.0 + - Accept urlencoded square brackets + - Accept empty values in implicit array notation + +1.6.0 / 2014-08-05 +================== + + * deps: qs@1.0.2 + - Complete rewrite + - Limits array length to 20 + - Limits object depth to 5 + - Limits parameters to 1,000 + +1.5.2 / 2014-07-27 +================== + + * deps: depd@0.4.4 + - Work-around v8 generating empty stack traces + +1.5.1 / 2014-07-26 +================== + + * deps: depd@0.4.3 + - Fix exception when global `Error.stackTraceLimit` is too low + +1.5.0 / 2014-07-20 +================== + + * deps: depd@0.4.2 + - Add `TRACE_DEPRECATION` environment variable + - Remove non-standard grey color from color output + - Support `--no-deprecation` argument + - Support `--trace-deprecation` argument + * deps: iconv-lite@0.4.4 + - Added encoding UTF-7 + * deps: raw-body@1.3.0 + - deps: iconv-lite@0.4.4 + - Added encoding UTF-7 + - Fix `Cannot switch to old mode now` error on Node.js 0.10+ + * deps: type-is@~1.3.2 + +1.4.3 / 2014-06-19 +================== + + * deps: type-is@1.3.1 + - fix global variable leak + +1.4.2 / 2014-06-19 +================== + + * deps: type-is@1.3.0 + - improve type parsing + +1.4.1 / 2014-06-19 +================== + + * fix urlencoded extended deprecation message + +1.4.0 / 2014-06-19 +================== + + * add `text` parser + * add `raw` parser + * check accepted charset in content-type (accepts utf-8) + * check accepted encoding in content-encoding (accepts identity) + * deprecate `bodyParser()` middleware; use `.json()` and `.urlencoded()` as needed + * deprecate `urlencoded()` without provided `extended` option + * lazy-load urlencoded parsers + * parsers split into files for reduced mem usage + * support gzip and deflate bodies + - set `inflate: false` to turn off + * deps: raw-body@1.2.2 + - Support all encodings from `iconv-lite` + +1.3.1 / 2014-06-11 +================== + + * deps: 
type-is@1.2.1 + - Switch dependency from mime to mime-types@1.0.0 + +1.3.0 / 2014-05-31 +================== + + * add `extended` option to urlencoded parser + +1.2.2 / 2014-05-27 +================== + + * deps: raw-body@1.1.6 + - assert stream encoding on node.js 0.8 + - assert stream encoding on node.js < 0.10.6 + - deps: bytes@1 + +1.2.1 / 2014-05-26 +================== + + * invoke `next(err)` after request fully read + - prevents hung responses and socket hang ups + +1.2.0 / 2014-05-11 +================== + + * add `verify` option + * deps: type-is@1.2.0 + - support suffix matching + +1.1.2 / 2014-05-11 +================== + + * improve json parser speed + +1.1.1 / 2014-05-11 +================== + + * fix repeated limit parsing with every request + +1.1.0 / 2014-05-10 +================== + + * add `type` option + * deps: pin for safety and consistency + +1.0.2 / 2014-04-14 +================== + + * use `type-is` module + +1.0.1 / 2014-03-20 +================== + + * lower default limits to 100kb diff --git a/node_modules/body-parser/LICENSE b/node_modules/body-parser/LICENSE new file mode 100644 index 000000000..386b7b694 --- /dev/null +++ b/node_modules/body-parser/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2014 Jonathan Ong +Copyright (c) 2014-2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/body-parser/README.md b/node_modules/body-parser/README.md new file mode 100644 index 000000000..aba6297a8 --- /dev/null +++ b/node_modules/body-parser/README.md @@ -0,0 +1,443 @@ +# body-parser + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Node.js body parsing middleware. + +Parse incoming request bodies in a middleware before your handlers, available +under the `req.body` property. + +**Note** As `req.body`'s shape is based on user-controlled input, all +properties and values in this object are untrusted and should be validated +before trusting. For example, `req.body.foo.toString()` may fail in multiple +ways, for example the `foo` property may not be there or may not be a string, +and `toString` may not be a function and instead a string or other user input. + +[Learn about the anatomy of an HTTP transaction in Node.js](https://nodejs.org/en/docs/guides/anatomy-of-an-http-transaction/). + +_This does not handle multipart bodies_, due to their complex and typically +large nature. 
For multipart bodies, you may be interested in the following +modules: + + * [busboy](https://www.npmjs.org/package/busboy#readme) and + [connect-busboy](https://www.npmjs.org/package/connect-busboy#readme) + * [multiparty](https://www.npmjs.org/package/multiparty#readme) and + [connect-multiparty](https://www.npmjs.org/package/connect-multiparty#readme) + * [formidable](https://www.npmjs.org/package/formidable#readme) + * [multer](https://www.npmjs.org/package/multer#readme) + +This module provides the following parsers: + + * [JSON body parser](#bodyparserjsonoptions) + * [Raw body parser](#bodyparserrawoptions) + * [Text body parser](#bodyparsertextoptions) + * [URL-encoded form body parser](#bodyparserurlencodedoptions) + +Other body parsers you might be interested in: + +- [body](https://www.npmjs.org/package/body#readme) +- [co-body](https://www.npmjs.org/package/co-body#readme) + +## Installation + +```sh +$ npm install body-parser +``` + +## API + + + +```js +var bodyParser = require('body-parser') +``` + +The `bodyParser` object exposes various factories to create middlewares. All +middlewares will populate the `req.body` property with the parsed body when +the `Content-Type` request header matches the `type` option, or an empty +object (`{}`) if there was no body to parse, the `Content-Type` was not matched, +or an error occurred. + +The various errors returned by this module are described in the +[errors section](#errors). + +### bodyParser.json([options]) + +Returns middleware that only parses `json` and only looks at requests where +the `Content-Type` header matches the `type` option. This parser accepts any +Unicode encoding of the body and supports automatic inflation of `gzip` and +`deflate` encodings. + +A new `body` object containing the parsed data is populated on the `request` +object after the middleware (i.e. `req.body`). + +#### Options + +The `json` function takes an optional `options` object that may contain any of +the following keys: + +##### inflate + +When set to `true`, then deflated (compressed) bodies will be inflated; when +`false`, deflated bodies are rejected. Defaults to `true`. + +##### limit + +Controls the maximum request body size. If this is a number, then the value +specifies the number of bytes; if it is a string, the value is passed to the +[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults +to `'100kb'`. + +##### reviver + +The `reviver` option is passed directly to `JSON.parse` as the second +argument. You can find more information on this argument +[in the MDN documentation about JSON.parse](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse#Example.3A_Using_the_reviver_parameter). + +##### strict + +When set to `true`, will only accept arrays and objects; when `false` will +accept anything `JSON.parse` accepts. Defaults to `true`. + +##### type + +The `type` option is used to determine what media type the middleware will +parse. This option can be a string, array of strings, or a function. If not a +function, `type` option is passed directly to the +[type-is](https://www.npmjs.org/package/type-is#readme) library and this can +be an extension name (like `json`), a mime type (like `application/json`), or +a mime type with a wildcard (like `*/*` or `*/json`). If a function, the `type` +option is called as `fn(req)` and the request is parsed if it returns a truthy +value. Defaults to `application/json`. 
+ +##### verify + +The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`, +where `buf` is a `Buffer` of the raw request body and `encoding` is the +encoding of the request. The parsing can be aborted by throwing an error. + +### bodyParser.raw([options]) + +Returns middleware that parses all bodies as a `Buffer` and only looks at +requests where the `Content-Type` header matches the `type` option. This +parser supports automatic inflation of `gzip` and `deflate` encodings. + +A new `body` object containing the parsed data is populated on the `request` +object after the middleware (i.e. `req.body`). This will be a `Buffer` object +of the body. + +#### Options + +The `raw` function takes an optional `options` object that may contain any of +the following keys: + +##### inflate + +When set to `true`, then deflated (compressed) bodies will be inflated; when +`false`, deflated bodies are rejected. Defaults to `true`. + +##### limit + +Controls the maximum request body size. If this is a number, then the value +specifies the number of bytes; if it is a string, the value is passed to the +[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults +to `'100kb'`. + +##### type + +The `type` option is used to determine what media type the middleware will +parse. This option can be a string, array of strings, or a function. +If not a function, `type` option is passed directly to the +[type-is](https://www.npmjs.org/package/type-is#readme) library and this +can be an extension name (like `bin`), a mime type (like +`application/octet-stream`), or a mime type with a wildcard (like `*/*` or +`application/*`). If a function, the `type` option is called as `fn(req)` +and the request is parsed if it returns a truthy value. Defaults to +`application/octet-stream`. + +##### verify + +The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`, +where `buf` is a `Buffer` of the raw request body and `encoding` is the +encoding of the request. The parsing can be aborted by throwing an error. + +### bodyParser.text([options]) + +Returns middleware that parses all bodies as a string and only looks at +requests where the `Content-Type` header matches the `type` option. This +parser supports automatic inflation of `gzip` and `deflate` encodings. + +A new `body` string containing the parsed data is populated on the `request` +object after the middleware (i.e. `req.body`). This will be a string of the +body. + +#### Options + +The `text` function takes an optional `options` object that may contain any of +the following keys: + +##### defaultCharset + +Specify the default character set for the text content if the charset is not +specified in the `Content-Type` header of the request. Defaults to `utf-8`. + +##### inflate + +When set to `true`, then deflated (compressed) bodies will be inflated; when +`false`, deflated bodies are rejected. Defaults to `true`. + +##### limit + +Controls the maximum request body size. If this is a number, then the value +specifies the number of bytes; if it is a string, the value is passed to the +[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults +to `'100kb'`. + +##### type + +The `type` option is used to determine what media type the middleware will +parse. This option can be a string, array of strings, or a function. 
If not +a function, `type` option is passed directly to the +[type-is](https://www.npmjs.org/package/type-is#readme) library and this can +be an extension name (like `txt`), a mime type (like `text/plain`), or a mime +type with a wildcard (like `*/*` or `text/*`). If a function, the `type` +option is called as `fn(req)` and the request is parsed if it returns a +truthy value. Defaults to `text/plain`. + +##### verify + +The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`, +where `buf` is a `Buffer` of the raw request body and `encoding` is the +encoding of the request. The parsing can be aborted by throwing an error. + +### bodyParser.urlencoded([options]) + +Returns middleware that only parses `urlencoded` bodies and only looks at +requests where the `Content-Type` header matches the `type` option. This +parser accepts only UTF-8 encoding of the body and supports automatic +inflation of `gzip` and `deflate` encodings. + +A new `body` object containing the parsed data is populated on the `request` +object after the middleware (i.e. `req.body`). This object will contain +key-value pairs, where the value can be a string or array (when `extended` is +`false`), or any type (when `extended` is `true`). + +#### Options + +The `urlencoded` function takes an optional `options` object that may contain +any of the following keys: + +##### extended + +The `extended` option allows to choose between parsing the URL-encoded data +with the `querystring` library (when `false`) or the `qs` library (when +`true`). The "extended" syntax allows for rich objects and arrays to be +encoded into the URL-encoded format, allowing for a JSON-like experience +with URL-encoded. For more information, please +[see the qs library](https://www.npmjs.org/package/qs#readme). + +Defaults to `true`, but using the default has been deprecated. Please +research into the difference between `qs` and `querystring` and choose the +appropriate setting. + +##### inflate + +When set to `true`, then deflated (compressed) bodies will be inflated; when +`false`, deflated bodies are rejected. Defaults to `true`. + +##### limit + +Controls the maximum request body size. If this is a number, then the value +specifies the number of bytes; if it is a string, the value is passed to the +[bytes](https://www.npmjs.com/package/bytes) library for parsing. Defaults +to `'100kb'`. + +##### parameterLimit + +The `parameterLimit` option controls the maximum number of parameters that +are allowed in the URL-encoded data. If a request contains more parameters +than this value, a 413 will be returned to the client. Defaults to `1000`. + +##### type + +The `type` option is used to determine what media type the middleware will +parse. This option can be a string, array of strings, or a function. If not +a function, `type` option is passed directly to the +[type-is](https://www.npmjs.org/package/type-is#readme) library and this can +be an extension name (like `urlencoded`), a mime type (like +`application/x-www-form-urlencoded`), or a mime type with a wildcard (like +`*/x-www-form-urlencoded`). If a function, the `type` option is called as +`fn(req)` and the request is parsed if it returns a truthy value. Defaults +to `application/x-www-form-urlencoded`. + +##### verify + +The `verify` option, if supplied, is called as `verify(req, res, buf, encoding)`, +where `buf` is a `Buffer` of the raw request body and `encoding` is the +encoding of the request. The parsing can be aborted by throwing an error. 
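Before the error descriptions, a brief hedged illustration of the `extended` option documented above: the readme states that body-parser delegates to the `qs` library when `extended` is `true` and to Node's `querystring` module when it is `false`, so the same form body parses quite differently. The sample body string below is illustrative only.

```js
// Illustration of extended: true (qs) vs. extended: false (querystring).
var qs = require('qs')
var querystring = require('querystring')

var body = 'user[name]=Ann&user[tags][]=a&user[tags][]=b'

// qs builds nested objects and arrays from the bracket syntax.
console.log(qs.parse(body))
//=> { user: { name: 'Ann', tags: [ 'a', 'b' ] } }

// querystring treats the bracketed keys as plain, flat string keys.
console.log(querystring.parse(body))
//=> { 'user[name]': 'Ann', 'user[tags][]': [ 'a', 'b' ] }
```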
+ +## Errors + +The middlewares provided by this module create errors depending on the error +condition during parsing. The errors will typically have a `status`/`statusCode` +property that contains the suggested HTTP response code, an `expose` property +to determine if the `message` property should be displayed to the client, a +`type` property to determine the type of error without matching against the +`message`, and a `body` property containing the read body, if available. + +The following are the common errors emitted, though any error can come through +for various reasons. + +### content encoding unsupported + +This error will occur when the request had a `Content-Encoding` header that +contained an encoding but the "inflation" option was set to `false`. The +`status` property is set to `415`, the `type` property is set to +`'encoding.unsupported'`, and the `charset` property will be set to the +encoding that is unsupported. + +### request aborted + +This error will occur when the request is aborted by the client before reading +the body has finished. The `received` property will be set to the number of +bytes received before the request was aborted and the `expected` property is +set to the number of expected bytes. The `status` property is set to `400` +and `type` property is set to `'request.aborted'`. + +### request entity too large + +This error will occur when the request body's size is larger than the "limit" +option. The `limit` property will be set to the byte limit and the `length` +property will be set to the request body's length. The `status` property is +set to `413` and the `type` property is set to `'entity.too.large'`. + +### request size did not match content length + +This error will occur when the request's length did not match the length from +the `Content-Length` header. This typically occurs when the request is malformed, +typically when the `Content-Length` header was calculated based on characters +instead of bytes. The `status` property is set to `400` and the `type` property +is set to `'request.size.invalid'`. + +### stream encoding should not be set + +This error will occur when something called the `req.setEncoding` method prior +to this middleware. This module operates directly on bytes only and you cannot +call `req.setEncoding` when using this module. The `status` property is set to +`500` and the `type` property is set to `'stream.encoding.set'`. + +### too many parameters + +This error will occur when the content of the request exceeds the configured +`parameterLimit` for the `urlencoded` parser. The `status` property is set to +`413` and the `type` property is set to `'parameters.too.many'`. + +### unsupported charset "BOGUS" + +This error will occur when the request had a charset parameter in the +`Content-Type` header, but the `iconv-lite` module does not support it OR the +parser does not support it. The charset is contained in the message as well +as in the `charset` property. The `status` property is set to `415`, the +`type` property is set to `'charset.unsupported'`, and the `charset` property +is set to the charset that is unsupported. + +### unsupported content encoding "bogus" + +This error will occur when the request had a `Content-Encoding` header that +contained an unsupported encoding. The encoding is contained in the message +as well as in the `encoding` property. The `status` property is set to `415`, +the `type` property is set to `'encoding.unsupported'`, and the `encoding` +property is set to the encoding that is unsupported. 
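As a hedged sketch (not part of the upstream readme), the `status` and `type` properties described in the error list above can be checked in an Express error handler. The route, limit value, and response messages are illustrative; `'entity.too.large'` is documented above, and `'entity.parse.failed'` is the type set by `lib/read.js` elsewhere in this diff.

```js
var express = require('express')
var bodyParser = require('body-parser')

var app = express()

// Deliberately small limit so oversized payloads trigger 'entity.too.large'.
app.use(bodyParser.json({ limit: '10kb' }))

app.post('/data', function (req, res) {
  res.json(req.body)
})

// body-parser errors carry `status`/`statusCode` and `type` as documented above.
app.use(function (err, req, res, next) {
  if (err.type === 'entity.too.large') {
    return res.status(413).send('request body exceeds the configured limit')
  }
  if (err.type === 'entity.parse.failed') {
    return res.status(400).send('request body is not valid JSON')
  }
  next(err)
})
```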
+ +## Examples + +### Express/Connect top-level generic + +This example demonstrates adding a generic JSON and URL-encoded parser as a +top-level middleware, which will parse the bodies of all incoming requests. +This is the simplest setup. + +```js +var express = require('express') +var bodyParser = require('body-parser') + +var app = express() + +// parse application/x-www-form-urlencoded +app.use(bodyParser.urlencoded({ extended: false })) + +// parse application/json +app.use(bodyParser.json()) + +app.use(function (req, res) { + res.setHeader('Content-Type', 'text/plain') + res.write('you posted:\n') + res.end(JSON.stringify(req.body, null, 2)) +}) +``` + +### Express route-specific + +This example demonstrates adding body parsers specifically to the routes that +need them. In general, this is the most recommended way to use body-parser with +Express. + +```js +var express = require('express') +var bodyParser = require('body-parser') + +var app = express() + +// create application/json parser +var jsonParser = bodyParser.json() + +// create application/x-www-form-urlencoded parser +var urlencodedParser = bodyParser.urlencoded({ extended: false }) + +// POST /login gets urlencoded bodies +app.post('/login', urlencodedParser, function (req, res) { + res.send('welcome, ' + req.body.username) +}) + +// POST /api/users gets JSON bodies +app.post('/api/users', jsonParser, function (req, res) { + // create user in req.body +}) +``` + +### Change accepted type for parsers + +All the parsers accept a `type` option which allows you to change the +`Content-Type` that the middleware will parse. + +```js +var express = require('express') +var bodyParser = require('body-parser') + +var app = express() + +// parse various different custom JSON types as JSON +app.use(bodyParser.json({ type: 'application/*+json' })) + +// parse some custom thing into a Buffer +app.use(bodyParser.raw({ type: 'application/vnd.custom-type' })) + +// parse an HTML body into a string +app.use(bodyParser.text({ type: 'text/html' })) +``` + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/body-parser.svg +[npm-url]: https://npmjs.org/package/body-parser +[travis-image]: https://img.shields.io/travis/expressjs/body-parser/master.svg +[travis-url]: https://travis-ci.org/expressjs/body-parser +[coveralls-image]: https://img.shields.io/coveralls/expressjs/body-parser/master.svg +[coveralls-url]: https://coveralls.io/r/expressjs/body-parser?branch=master +[downloads-image]: https://img.shields.io/npm/dm/body-parser.svg +[downloads-url]: https://npmjs.org/package/body-parser diff --git a/node_modules/body-parser/index.js b/node_modules/body-parser/index.js new file mode 100644 index 000000000..93c3a1fff --- /dev/null +++ b/node_modules/body-parser/index.js @@ -0,0 +1,157 @@ +/*! + * body-parser + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var deprecate = require('depd')('body-parser') + +/** + * Cache of loaded parsers. + * @private + */ + +var parsers = Object.create(null) + +/** + * @typedef Parsers + * @type {function} + * @property {function} json + * @property {function} raw + * @property {function} text + * @property {function} urlencoded + */ + +/** + * Module exports. + * @type {Parsers} + */ + +exports = module.exports = deprecate.function(bodyParser, + 'bodyParser: use individual json/urlencoded middlewares') + +/** + * JSON parser. 
+ * @public + */ + +Object.defineProperty(exports, 'json', { + configurable: true, + enumerable: true, + get: createParserGetter('json') +}) + +/** + * Raw parser. + * @public + */ + +Object.defineProperty(exports, 'raw', { + configurable: true, + enumerable: true, + get: createParserGetter('raw') +}) + +/** + * Text parser. + * @public + */ + +Object.defineProperty(exports, 'text', { + configurable: true, + enumerable: true, + get: createParserGetter('text') +}) + +/** + * URL-encoded parser. + * @public + */ + +Object.defineProperty(exports, 'urlencoded', { + configurable: true, + enumerable: true, + get: createParserGetter('urlencoded') +}) + +/** + * Create a middleware to parse json and urlencoded bodies. + * + * @param {object} [options] + * @return {function} + * @deprecated + * @public + */ + +function bodyParser (options) { + var opts = {} + + // exclude type option + if (options) { + for (var prop in options) { + if (prop !== 'type') { + opts[prop] = options[prop] + } + } + } + + var _urlencoded = exports.urlencoded(opts) + var _json = exports.json(opts) + + return function bodyParser (req, res, next) { + _json(req, res, function (err) { + if (err) return next(err) + _urlencoded(req, res, next) + }) + } +} + +/** + * Create a getter for loading a parser. + * @private + */ + +function createParserGetter (name) { + return function get () { + return loadParser(name) + } +} + +/** + * Load a parser module. + * @private + */ + +function loadParser (parserName) { + var parser = parsers[parserName] + + if (parser !== undefined) { + return parser + } + + // this uses a switch for static require analysis + switch (parserName) { + case 'json': + parser = require('./lib/types/json') + break + case 'raw': + parser = require('./lib/types/raw') + break + case 'text': + parser = require('./lib/types/text') + break + case 'urlencoded': + parser = require('./lib/types/urlencoded') + break + } + + // store to prevent invoking require() + return (parsers[parserName] = parser) +} diff --git a/node_modules/body-parser/lib/read.js b/node_modules/body-parser/lib/read.js new file mode 100644 index 000000000..c10260958 --- /dev/null +++ b/node_modules/body-parser/lib/read.js @@ -0,0 +1,181 @@ +/*! + * body-parser + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var createError = require('http-errors') +var getBody = require('raw-body') +var iconv = require('iconv-lite') +var onFinished = require('on-finished') +var zlib = require('zlib') + +/** + * Module exports. + */ + +module.exports = read + +/** + * Read a request into a buffer and parse. + * + * @param {object} req + * @param {object} res + * @param {function} next + * @param {function} parse + * @param {function} debug + * @param {object} options + * @private + */ + +function read (req, res, next, parse, debug, options) { + var length + var opts = options + var stream + + // flag as parsed + req._body = true + + // read options + var encoding = opts.encoding !== null + ? opts.encoding + : null + var verify = opts.verify + + try { + // get the content stream + stream = contentstream(req, debug, opts.inflate) + length = stream.length + stream.length = undefined + } catch (err) { + return next(err) + } + + // set raw-body options + opts.length = length + opts.encoding = verify + ? 
null + : encoding + + // assert charset is supported + if (opts.encoding === null && encoding !== null && !iconv.encodingExists(encoding)) { + return next(createError(415, 'unsupported charset "' + encoding.toUpperCase() + '"', { + charset: encoding.toLowerCase(), + type: 'charset.unsupported' + })) + } + + // read body + debug('read body') + getBody(stream, opts, function (error, body) { + if (error) { + var _error + + if (error.type === 'encoding.unsupported') { + // echo back charset + _error = createError(415, 'unsupported charset "' + encoding.toUpperCase() + '"', { + charset: encoding.toLowerCase(), + type: 'charset.unsupported' + }) + } else { + // set status code on error + _error = createError(400, error) + } + + // read off entire request + stream.resume() + onFinished(req, function onfinished () { + next(createError(400, _error)) + }) + return + } + + // verify + if (verify) { + try { + debug('verify body') + verify(req, res, body, encoding) + } catch (err) { + next(createError(403, err, { + body: body, + type: err.type || 'entity.verify.failed' + })) + return + } + } + + // parse + var str = body + try { + debug('parse body') + str = typeof body !== 'string' && encoding !== null + ? iconv.decode(body, encoding) + : body + req.body = parse(str) + } catch (err) { + next(createError(400, err, { + body: str, + type: err.type || 'entity.parse.failed' + })) + return + } + + next() + }) +} + +/** + * Get the content stream of the request. + * + * @param {object} req + * @param {function} debug + * @param {boolean} [inflate=true] + * @return {object} + * @api private + */ + +function contentstream (req, debug, inflate) { + var encoding = (req.headers['content-encoding'] || 'identity').toLowerCase() + var length = req.headers['content-length'] + var stream + + debug('content-encoding "%s"', encoding) + + if (inflate === false && encoding !== 'identity') { + throw createError(415, 'content encoding unsupported', { + encoding: encoding, + type: 'encoding.unsupported' + }) + } + + switch (encoding) { + case 'deflate': + stream = zlib.createInflate() + debug('inflate body') + req.pipe(stream) + break + case 'gzip': + stream = zlib.createGunzip() + debug('gunzip body') + req.pipe(stream) + break + case 'identity': + stream = req + stream.length = length + break + default: + throw createError(415, 'unsupported content encoding "' + encoding + '"', { + encoding: encoding, + type: 'encoding.unsupported' + }) + } + + return stream +} diff --git a/node_modules/body-parser/lib/types/json.js b/node_modules/body-parser/lib/types/json.js new file mode 100644 index 000000000..2971dc14d --- /dev/null +++ b/node_modules/body-parser/lib/types/json.js @@ -0,0 +1,230 @@ +/*! + * body-parser + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var bytes = require('bytes') +var contentType = require('content-type') +var createError = require('http-errors') +var debug = require('debug')('body-parser:json') +var read = require('../read') +var typeis = require('type-is') + +/** + * Module exports. + */ + +module.exports = json + +/** + * RegExp to match the first non-space in a string. 
+ * + * Allowed whitespace is defined in RFC 7159: + * + * ws = *( + * %x20 / ; Space + * %x09 / ; Horizontal tab + * %x0A / ; Line feed or New line + * %x0D ) ; Carriage return + */ + +var FIRST_CHAR_REGEXP = /^[\x20\x09\x0a\x0d]*(.)/ // eslint-disable-line no-control-regex + +/** + * Create a middleware to parse JSON bodies. + * + * @param {object} [options] + * @return {function} + * @public + */ + +function json (options) { + var opts = options || {} + + var limit = typeof opts.limit !== 'number' + ? bytes.parse(opts.limit || '100kb') + : opts.limit + var inflate = opts.inflate !== false + var reviver = opts.reviver + var strict = opts.strict !== false + var type = opts.type || 'application/json' + var verify = opts.verify || false + + if (verify !== false && typeof verify !== 'function') { + throw new TypeError('option verify must be function') + } + + // create the appropriate type checking function + var shouldParse = typeof type !== 'function' + ? typeChecker(type) + : type + + function parse (body) { + if (body.length === 0) { + // special-case empty json body, as it's a common client-side mistake + // TODO: maybe make this configurable or part of "strict" option + return {} + } + + if (strict) { + var first = firstchar(body) + + if (first !== '{' && first !== '[') { + debug('strict violation') + throw createStrictSyntaxError(body, first) + } + } + + try { + debug('parse json') + return JSON.parse(body, reviver) + } catch (e) { + throw normalizeJsonSyntaxError(e, { + message: e.message, + stack: e.stack + }) + } + } + + return function jsonParser (req, res, next) { + if (req._body) { + debug('body already parsed') + next() + return + } + + req.body = req.body || {} + + // skip requests without bodies + if (!typeis.hasBody(req)) { + debug('skip empty body') + next() + return + } + + debug('content-type %j', req.headers['content-type']) + + // determine if request should be parsed + if (!shouldParse(req)) { + debug('skip parsing') + next() + return + } + + // assert charset per RFC 7159 sec 8.1 + var charset = getCharset(req) || 'utf-8' + if (charset.substr(0, 4) !== 'utf-') { + debug('invalid charset') + next(createError(415, 'unsupported charset "' + charset.toUpperCase() + '"', { + charset: charset, + type: 'charset.unsupported' + })) + return + } + + // read + read(req, res, next, parse, debug, { + encoding: charset, + inflate: inflate, + limit: limit, + verify: verify + }) + } +} + +/** + * Create strict violation syntax error matching native error. + * + * @param {string} str + * @param {string} char + * @return {Error} + * @private + */ + +function createStrictSyntaxError (str, char) { + var index = str.indexOf(char) + var partial = str.substring(0, index) + '#' + + try { + JSON.parse(partial); /* istanbul ignore next */ throw new SyntaxError('strict violation') + } catch (e) { + return normalizeJsonSyntaxError(e, { + message: e.message.replace('#', char), + stack: e.stack + }) + } +} + +/** + * Get the first non-whitespace character in a string. + * + * @param {string} str + * @return {function} + * @private + */ + +function firstchar (str) { + return FIRST_CHAR_REGEXP.exec(str)[1] +} + +/** + * Get the charset of a request. + * + * @param {object} req + * @api private + */ + +function getCharset (req) { + try { + return (contentType.parse(req).parameters.charset || '').toLowerCase() + } catch (e) { + return undefined + } +} + +/** + * Normalize a SyntaxError for JSON.parse. 
+ * + * @param {SyntaxError} error + * @param {object} obj + * @return {SyntaxError} + */ + +function normalizeJsonSyntaxError (error, obj) { + var keys = Object.getOwnPropertyNames(error) + + for (var i = 0; i < keys.length; i++) { + var key = keys[i] + if (key !== 'stack' && key !== 'message') { + delete error[key] + } + } + + // replace stack before message for Node.js 0.10 and below + error.stack = obj.stack.replace(error.message, obj.message) + error.message = obj.message + + return error +} + +/** + * Get the simple type checker. + * + * @param {string} type + * @return {function} + */ + +function typeChecker (type) { + return function checkType (req) { + return Boolean(typeis(req, type)) + } +} diff --git a/node_modules/body-parser/lib/types/raw.js b/node_modules/body-parser/lib/types/raw.js new file mode 100644 index 000000000..f5d1b6747 --- /dev/null +++ b/node_modules/body-parser/lib/types/raw.js @@ -0,0 +1,101 @@ +/*! + * body-parser + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + */ + +var bytes = require('bytes') +var debug = require('debug')('body-parser:raw') +var read = require('../read') +var typeis = require('type-is') + +/** + * Module exports. + */ + +module.exports = raw + +/** + * Create a middleware to parse raw bodies. + * + * @param {object} [options] + * @return {function} + * @api public + */ + +function raw (options) { + var opts = options || {} + + var inflate = opts.inflate !== false + var limit = typeof opts.limit !== 'number' + ? bytes.parse(opts.limit || '100kb') + : opts.limit + var type = opts.type || 'application/octet-stream' + var verify = opts.verify || false + + if (verify !== false && typeof verify !== 'function') { + throw new TypeError('option verify must be function') + } + + // create the appropriate type checking function + var shouldParse = typeof type !== 'function' + ? typeChecker(type) + : type + + function parse (buf) { + return buf + } + + return function rawParser (req, res, next) { + if (req._body) { + debug('body already parsed') + next() + return + } + + req.body = req.body || {} + + // skip requests without bodies + if (!typeis.hasBody(req)) { + debug('skip empty body') + next() + return + } + + debug('content-type %j', req.headers['content-type']) + + // determine if request should be parsed + if (!shouldParse(req)) { + debug('skip parsing') + next() + return + } + + // read + read(req, res, next, parse, debug, { + encoding: null, + inflate: inflate, + limit: limit, + verify: verify + }) + } +} + +/** + * Get the simple type checker. + * + * @param {string} type + * @return {function} + */ + +function typeChecker (type) { + return function checkType (req) { + return Boolean(typeis(req, type)) + } +} diff --git a/node_modules/body-parser/lib/types/text.js b/node_modules/body-parser/lib/types/text.js new file mode 100644 index 000000000..083a00908 --- /dev/null +++ b/node_modules/body-parser/lib/types/text.js @@ -0,0 +1,121 @@ +/*! + * body-parser + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + */ + +var bytes = require('bytes') +var contentType = require('content-type') +var debug = require('debug')('body-parser:text') +var read = require('../read') +var typeis = require('type-is') + +/** + * Module exports. + */ + +module.exports = text + +/** + * Create a middleware to parse text bodies. 
+ * + * @param {object} [options] + * @return {function} + * @api public + */ + +function text (options) { + var opts = options || {} + + var defaultCharset = opts.defaultCharset || 'utf-8' + var inflate = opts.inflate !== false + var limit = typeof opts.limit !== 'number' + ? bytes.parse(opts.limit || '100kb') + : opts.limit + var type = opts.type || 'text/plain' + var verify = opts.verify || false + + if (verify !== false && typeof verify !== 'function') { + throw new TypeError('option verify must be function') + } + + // create the appropriate type checking function + var shouldParse = typeof type !== 'function' + ? typeChecker(type) + : type + + function parse (buf) { + return buf + } + + return function textParser (req, res, next) { + if (req._body) { + debug('body already parsed') + next() + return + } + + req.body = req.body || {} + + // skip requests without bodies + if (!typeis.hasBody(req)) { + debug('skip empty body') + next() + return + } + + debug('content-type %j', req.headers['content-type']) + + // determine if request should be parsed + if (!shouldParse(req)) { + debug('skip parsing') + next() + return + } + + // get charset + var charset = getCharset(req) || defaultCharset + + // read + read(req, res, next, parse, debug, { + encoding: charset, + inflate: inflate, + limit: limit, + verify: verify + }) + } +} + +/** + * Get the charset of a request. + * + * @param {object} req + * @api private + */ + +function getCharset (req) { + try { + return (contentType.parse(req).parameters.charset || '').toLowerCase() + } catch (e) { + return undefined + } +} + +/** + * Get the simple type checker. + * + * @param {string} type + * @return {function} + */ + +function typeChecker (type) { + return function checkType (req) { + return Boolean(typeis(req, type)) + } +} diff --git a/node_modules/body-parser/lib/types/urlencoded.js b/node_modules/body-parser/lib/types/urlencoded.js new file mode 100644 index 000000000..b2ca8f16d --- /dev/null +++ b/node_modules/body-parser/lib/types/urlencoded.js @@ -0,0 +1,284 @@ +/*! + * body-parser + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var bytes = require('bytes') +var contentType = require('content-type') +var createError = require('http-errors') +var debug = require('debug')('body-parser:urlencoded') +var deprecate = require('depd')('body-parser') +var read = require('../read') +var typeis = require('type-is') + +/** + * Module exports. + */ + +module.exports = urlencoded + +/** + * Cache of parser modules. + */ + +var parsers = Object.create(null) + +/** + * Create a middleware to parse urlencoded bodies. + * + * @param {object} [options] + * @return {function} + * @public + */ + +function urlencoded (options) { + var opts = options || {} + + // notice because option default will flip in next major + if (opts.extended === undefined) { + deprecate('undefined extended: provide extended option') + } + + var extended = opts.extended !== false + var inflate = opts.inflate !== false + var limit = typeof opts.limit !== 'number' + ? bytes.parse(opts.limit || '100kb') + : opts.limit + var type = opts.type || 'application/x-www-form-urlencoded' + var verify = opts.verify || false + + if (verify !== false && typeof verify !== 'function') { + throw new TypeError('option verify must be function') + } + + // create the appropriate query parser + var queryparse = extended + ? 
extendedparser(opts) + : simpleparser(opts) + + // create the appropriate type checking function + var shouldParse = typeof type !== 'function' + ? typeChecker(type) + : type + + function parse (body) { + return body.length + ? queryparse(body) + : {} + } + + return function urlencodedParser (req, res, next) { + if (req._body) { + debug('body already parsed') + next() + return + } + + req.body = req.body || {} + + // skip requests without bodies + if (!typeis.hasBody(req)) { + debug('skip empty body') + next() + return + } + + debug('content-type %j', req.headers['content-type']) + + // determine if request should be parsed + if (!shouldParse(req)) { + debug('skip parsing') + next() + return + } + + // assert charset + var charset = getCharset(req) || 'utf-8' + if (charset !== 'utf-8') { + debug('invalid charset') + next(createError(415, 'unsupported charset "' + charset.toUpperCase() + '"', { + charset: charset, + type: 'charset.unsupported' + })) + return + } + + // read + read(req, res, next, parse, debug, { + debug: debug, + encoding: charset, + inflate: inflate, + limit: limit, + verify: verify + }) + } +} + +/** + * Get the extended query parser. + * + * @param {object} options + */ + +function extendedparser (options) { + var parameterLimit = options.parameterLimit !== undefined + ? options.parameterLimit + : 1000 + var parse = parser('qs') + + if (isNaN(parameterLimit) || parameterLimit < 1) { + throw new TypeError('option parameterLimit must be a positive number') + } + + if (isFinite(parameterLimit)) { + parameterLimit = parameterLimit | 0 + } + + return function queryparse (body) { + var paramCount = parameterCount(body, parameterLimit) + + if (paramCount === undefined) { + debug('too many parameters') + throw createError(413, 'too many parameters', { + type: 'parameters.too.many' + }) + } + + var arrayLimit = Math.max(100, paramCount) + + debug('parse extended urlencoding') + return parse(body, { + allowPrototypes: true, + arrayLimit: arrayLimit, + depth: Infinity, + parameterLimit: parameterLimit + }) + } +} + +/** + * Get the charset of a request. + * + * @param {object} req + * @api private + */ + +function getCharset (req) { + try { + return (contentType.parse(req).parameters.charset || '').toLowerCase() + } catch (e) { + return undefined + } +} + +/** + * Count the number of parameters, stopping once limit reached + * + * @param {string} body + * @param {number} limit + * @api private + */ + +function parameterCount (body, limit) { + var count = 0 + var index = 0 + + while ((index = body.indexOf('&', index)) !== -1) { + count++ + index++ + + if (count === limit) { + return undefined + } + } + + return count +} + +/** + * Get parser for module name dynamically. + * + * @param {string} name + * @return {function} + * @api private + */ + +function parser (name) { + var mod = parsers[name] + + if (mod !== undefined) { + return mod.parse + } + + // this uses a switch for static require analysis + switch (name) { + case 'qs': + mod = require('qs') + break + case 'querystring': + mod = require('querystring') + break + } + + // store to prevent invoking require() + parsers[name] = mod + + return mod.parse +} + +/** + * Get the simple query parser. + * + * @param {object} options + */ + +function simpleparser (options) { + var parameterLimit = options.parameterLimit !== undefined + ? 
options.parameterLimit + : 1000 + var parse = parser('querystring') + + if (isNaN(parameterLimit) || parameterLimit < 1) { + throw new TypeError('option parameterLimit must be a positive number') + } + + if (isFinite(parameterLimit)) { + parameterLimit = parameterLimit | 0 + } + + return function queryparse (body) { + var paramCount = parameterCount(body, parameterLimit) + + if (paramCount === undefined) { + debug('too many parameters') + throw createError(413, 'too many parameters', { + type: 'parameters.too.many' + }) + } + + debug('parse urlencoding') + return parse(body, undefined, undefined, { maxKeys: parameterLimit }) + } +} + +/** + * Get the simple type checker. + * + * @param {string} type + * @return {function} + */ + +function typeChecker (type) { + return function checkType (req) { + return Boolean(typeis(req, type)) + } +} diff --git a/node_modules/body-parser/package.json b/node_modules/body-parser/package.json new file mode 100644 index 000000000..65c44678f --- /dev/null +++ b/node_modules/body-parser/package.json @@ -0,0 +1,91 @@ +{ + "_from": "body-parser@1.19.0", + "_id": "body-parser@1.19.0", + "_inBundle": false, + "_integrity": "sha512-dhEPs72UPbDnAQJ9ZKMNTP6ptJaionhP5cBb541nXPlW60Jepo9RV/a4fX4XWW9CuFNK22krhrj1+rgzifNCsw==", + "_location": "/body-parser", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "body-parser@1.19.0", + "name": "body-parser", + "escapedName": "body-parser", + "rawSpec": "1.19.0", + "saveSpec": null, + "fetchSpec": "1.19.0" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/body-parser/-/body-parser-1.19.0.tgz", + "_shasum": "96b2709e57c9c4e09a6fd66a8fd979844f69f08a", + "_spec": "body-parser@1.19.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/expressjs/body-parser/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "dependencies": { + "bytes": "3.1.0", + "content-type": "~1.0.4", + "debug": "2.6.9", + "depd": "~1.1.2", + "http-errors": "1.7.2", + "iconv-lite": "0.4.24", + "on-finished": "~2.3.0", + "qs": "6.7.0", + "raw-body": "2.4.0", + "type-is": "~1.6.17" + }, + "deprecated": false, + "description": "Node.js body parsing middleware", + "devDependencies": { + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.2", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "methods": "1.1.2", + "mocha": "6.1.4", + "safe-buffer": "5.1.2", + "supertest": "4.0.2" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "lib/", + "LICENSE", + "HISTORY.md", + "index.js" + ], + "homepage": "https://github.com/expressjs/body-parser#readme", + "license": "MIT", + "name": "body-parser", + "repository": { + "type": "git", + "url": "git+https://github.com/expressjs/body-parser.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --require test/support/env --reporter spec --check-leaks --bail test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --require test/support/env --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha 
--report lcovonly -- --require test/support/env --reporter spec --check-leaks test/" + }, + "version": "1.19.0" +} diff --git a/node_modules/boxen/index.d.ts b/node_modules/boxen/index.d.ts new file mode 100644 index 000000000..69ebd426c --- /dev/null +++ b/node_modules/boxen/index.d.ts @@ -0,0 +1,200 @@ +import {LiteralUnion} from 'type-fest'; +import {BoxStyle, Boxes} from 'cli-boxes'; + +declare namespace boxen { + /** + Characters used for custom border. + + @example + ``` + // affffb + // e e + // dffffc + + const border: CustomBorderStyle = { + topLeft: 'a', + topRight: 'b', + bottomRight: 'c', + bottomLeft: 'd', + vertical: 'e', + horizontal: 'f' + }; + ``` + */ + interface CustomBorderStyle extends BoxStyle {} + + /** + Spacing used for `padding` and `margin`. + */ + interface Spacing { + readonly top: number; + readonly right: number; + readonly bottom: number; + readonly left: number; + } + + interface Options { + /** + Color of the box border. + */ + readonly borderColor?: LiteralUnion< + | 'black' + | 'red' + | 'green' + | 'yellow' + | 'blue' + | 'magenta' + | 'cyan' + | 'white' + | 'gray' + | 'grey' + | 'blackBright' + | 'redBright' + | 'greenBright' + | 'yellowBright' + | 'blueBright' + | 'magentaBright' + | 'cyanBright' + | 'whiteBright', + string + >; + + /** + Style of the box border. + + @default 'single' + */ + readonly borderStyle?: keyof Boxes | CustomBorderStyle; + + /** + Reduce opacity of the border. + + @default false + */ + readonly dimBorder?: boolean; + + /** + Space between the text and box border. + + @default 0 + */ + readonly padding?: number | Spacing; + + /** + Space around the box. + + @default 0 + */ + readonly margin?: number | Spacing; + + /** + Float the box on the available terminal screen space. + + @default 'left' + */ + readonly float?: 'left' | 'right' | 'center'; + + /** + Color of the background. + */ + readonly backgroundColor?: LiteralUnion< + | 'black' + | 'red' + | 'green' + | 'yellow' + | 'blue' + | 'magenta' + | 'cyan' + | 'white' + | 'blackBright' + | 'redBright' + | 'greenBright' + | 'yellowBright' + | 'blueBright' + | 'magentaBright' + | 'cyanBright' + | 'whiteBright', + string + >; + + /** + Align the text in the box based on the widest line. + + @default 'left' + @deprecated Use `textAlignment` instead. + */ + readonly align?: 'left' | 'right' | 'center'; + + /** + Align the text in the box based on the widest line. + + @default 'left' + */ + readonly textAlignment?: 'left' | 'right' | 'center'; + + /** + Display a title at the top of the box. + If needed, the box will horizontally expand to fit the title. + + @example + ``` + console.log(boxen('foo bar', {title: 'example'})); + // ┌ example ┐ + // │foo bar │ + // └─────────┘ + ``` + */ + readonly title?: string; + + /** + Align the title in the top bar. + + @default 'left' + + @example + ``` + console.log(boxen('foo bar foo bar', {title: 'example', titleAlignment: 'center'})); + // ┌─── example ───┐ + // │foo bar foo bar│ + // └───────────────┘ + + console.log(boxen('foo bar foo bar', {title: 'example', titleAlignment: 'right'})); + // ┌────── example ┐ + // │foo bar foo bar│ + // └───────────────┘ + ``` + */ + readonly titleAlignment?: 'left' | 'right' | 'center'; + } +} + +/** +Creates a box in the terminal. + +@param text - The text inside the box. +@returns The box. 
+ +@example +``` +import boxen = require('boxen'); + +console.log(boxen('unicorn', {padding: 1})); +// ┌─────────────┐ +// │ │ +// │ unicorn │ +// │ │ +// └─────────────┘ + +console.log(boxen('unicorn', {padding: 1, margin: 1, borderStyle: 'double'})); +// +// ╔═════════════╗ +// ║ ║ +// ║ unicorn ║ +// ║ ║ +// ╚═════════════╝ +// +``` +*/ +declare const boxen: (text: string, options?: boxen.Options) => string; + +export = boxen; diff --git a/node_modules/boxen/index.js b/node_modules/boxen/index.js new file mode 100644 index 000000000..d6bc693c5 --- /dev/null +++ b/node_modules/boxen/index.js @@ -0,0 +1,279 @@ +'use strict'; +const stringWidth = require('string-width'); +const chalk = require('chalk'); +const widestLine = require('widest-line'); +const cliBoxes = require('cli-boxes'); +const camelCase = require('camelcase'); +const ansiAlign = require('ansi-align'); +const wrapAnsi = require('wrap-ansi'); + +const NL = '\n'; +const PAD = ' '; + +const terminalColumns = () => { + const {env, stdout, stderr} = process; + + if (stdout && stdout.columns) { + return stdout.columns; + } + + if (stderr && stderr.columns) { + return stderr.columns; + } + + if (env.COLUMNS) { + return Number.parseInt(env.COLUMNS, 10); + } + + return 80; +}; + +const getObject = detail => { + return typeof detail === 'number' ? { + top: detail, + right: detail * 3, + bottom: detail, + left: detail * 3 + } : { + top: 0, + right: 0, + bottom: 0, + left: 0, + ...detail + }; +}; + +const getBorderChars = borderStyle => { + const sides = [ + 'topLeft', + 'topRight', + 'bottomRight', + 'bottomLeft', + 'vertical', + 'horizontal' + ]; + + let chararacters; + + if (typeof borderStyle === 'string') { + chararacters = cliBoxes[borderStyle]; + + if (!chararacters) { + throw new TypeError(`Invalid border style: ${borderStyle}`); + } + } else { + for (const side of sides) { + if (!borderStyle[side] || typeof borderStyle[side] !== 'string') { + throw new TypeError(`Invalid border style: ${side}`); + } + } + + chararacters = borderStyle; + } + + return chararacters; +}; + +const makeTitle = (text, horizontal, alignement) => { + let title = ''; + + const textWidth = stringWidth(text); + + switch (alignement) { + case 'left': + title = text + horizontal.slice(textWidth); + break; + case 'right': + title = horizontal.slice(textWidth) + text; + break; + default: + horizontal = horizontal.slice(textWidth); + + if (horizontal.length % 2 === 1) { // This is needed in case the length is odd + horizontal = horizontal.slice(Math.floor(horizontal.length / 2)); + title = horizontal.slice(1) + text + horizontal; // We reduce the left part of one character to avoid the bar to go beyond its limit + } else { + horizontal = horizontal.slice(horizontal.length / 2); + title = horizontal + text + horizontal; + } + + break; + } + + return title; +}; + +const makeContentText = (text, padding, columns, align) => { + text = ansiAlign(text, {align}); + let lines = text.split(NL); + const textWidth = widestLine(text); + + const max = columns - padding.left - padding.right; + + if (textWidth > max) { + const newLines = []; + for (const line of lines) { + const createdLines = wrapAnsi(line, max, {hard: true}); + const alignedLines = ansiAlign(createdLines, {align}); + const alignedLinesArray = alignedLines.split('\n'); + const longestLength = Math.max(...alignedLinesArray.map(s => stringWidth(s))); + + for (const alignedLine of alignedLinesArray) { + let paddedLine; + switch (align) { + case 'center': + paddedLine = PAD.repeat((max - longestLength) / 2) + 
alignedLine; + break; + case 'right': + paddedLine = PAD.repeat(max - longestLength) + alignedLine; + break; + default: + paddedLine = alignedLine; + break; + } + + newLines.push(paddedLine); + } + } + + lines = newLines; + } + + if (align === 'center' && textWidth < max) { + lines = lines.map(line => PAD.repeat((max - textWidth) / 2) + line); + } else if (align === 'right' && textWidth < max) { + lines = lines.map(line => PAD.repeat(max - textWidth) + line); + } + + const paddingLeft = PAD.repeat(padding.left); + const paddingRight = PAD.repeat(padding.right); + + lines = lines.map(line => paddingLeft + line + paddingRight); + + lines = lines.map(line => { + if (columns - stringWidth(line) > 0) { + switch (align) { + case 'center': + return line + PAD.repeat(columns - stringWidth(line)); + case 'right': + return line + PAD.repeat(columns - stringWidth(line)); + default: + return line + PAD.repeat(columns - stringWidth(line)); + } + } + + return line; + }); + + if (padding.top > 0) { + lines = new Array(padding.top).fill(PAD.repeat(columns)).concat(lines); + } + + if (padding.bottom > 0) { + lines = lines.concat(new Array(padding.bottom).fill(PAD.repeat(columns))); + } + + return lines.join(NL); +}; + +const isHex = color => color.match(/^#(?:[0-f]{3}){1,2}$/i); +const isColorValid = color => typeof color === 'string' && ((chalk[color]) || isHex(color)); +const getColorFn = color => isHex(color) ? chalk.hex(color) : chalk[color]; +const getBGColorFn = color => isHex(color) ? chalk.bgHex(color) : chalk[camelCase(['bg', color])]; + +module.exports = (text, options) => { + options = { + padding: 0, + borderStyle: 'single', + dimBorder: false, + textAlignment: 'left', + float: 'left', + titleAlignment: 'left', + ...options + }; + + // This option is deprecated + if (options.align) { + options.textAlignment = options.align; + } + + const BORDERS_WIDTH = 2; + + if (options.borderColor && !isColorValid(options.borderColor)) { + throw new Error(`${options.borderColor} is not a valid borderColor`); + } + + if (options.backgroundColor && !isColorValid(options.backgroundColor)) { + throw new Error(`${options.backgroundColor} is not a valid backgroundColor`); + } + + const chars = getBorderChars(options.borderStyle); + const padding = getObject(options.padding); + const margin = getObject(options.margin); + + const colorizeBorder = border => { + const newBorder = options.borderColor ? getColorFn(options.borderColor)(border) : border; + return options.dimBorder ? chalk.dim(newBorder) : newBorder; + }; + + const colorizeContent = content => options.backgroundColor ? 
getBGColorFn(options.backgroundColor)(content) : content; + + const columns = terminalColumns(); + + let contentWidth = widestLine(wrapAnsi(text, columns - BORDERS_WIDTH, {hard: true, trim: false})) + padding.left + padding.right; + + // This prevents the title bar to exceed the console's width + let title = options.title && options.title.slice(0, columns - 4 - margin.left - margin.right); + + if (title) { + title = ` ${title} `; + // Make the box larger to fit a larger title + if (stringWidth(title) > contentWidth) { + contentWidth = stringWidth(title); + } + } + + if ((margin.left && margin.right) && contentWidth + BORDERS_WIDTH + margin.left + margin.right > columns) { + // Let's assume we have margins: left = 3, right = 5, in total = 8 + const spaceForMargins = columns - contentWidth - BORDERS_WIDTH; + // Let's assume we have space = 4 + const multiplier = spaceForMargins / (margin.left + margin.right); + // Here: multiplier = 4/8 = 0.5 + margin.left = Math.max(0, Math.floor(margin.left * multiplier)); + margin.right = Math.max(0, Math.floor(margin.right * multiplier)); + // Left: 3 * 0.5 = 1.5 -> 1 + // Right: 6 * 0.5 = 3 + } + + // Prevent content from exceeding the console's width + contentWidth = Math.min(contentWidth, columns - BORDERS_WIDTH - margin.left - margin.right); + + text = makeContentText(text, padding, contentWidth, options.textAlignment); + + let marginLeft = PAD.repeat(margin.left); + + if (options.float === 'center') { + const marginWidth = Math.max((columns - contentWidth - BORDERS_WIDTH) / 2, 0); + marginLeft = PAD.repeat(marginWidth); + } else if (options.float === 'right') { + const marginWidth = Math.max(columns - contentWidth - margin.right - BORDERS_WIDTH, 0); + marginLeft = PAD.repeat(marginWidth); + } + + const horizontal = chars.horizontal.repeat(contentWidth); + const top = colorizeBorder(NL.repeat(margin.top) + marginLeft + chars.topLeft + (title ? makeTitle(title, horizontal, options.titleAlignment) : horizontal) + chars.topRight); + const bottom = colorizeBorder(marginLeft + chars.bottomLeft + horizontal + chars.bottomRight + NL.repeat(margin.bottom)); + const side = colorizeBorder(chars.vertical); + + const LINE_SEPARATOR = (contentWidth + BORDERS_WIDTH + margin.left >= columns) ? '' : NL; + + const lines = text.split(NL); + + const middle = lines.map(line => { + return marginLeft + side + colorizeContent(line) + side; + }).join(LINE_SEPARATOR); + + return top + LINE_SEPARATOR + middle + LINE_SEPARATOR + bottom; +}; + +module.exports._borderStyles = cliBoxes; diff --git a/node_modules/boxen/license b/node_modules/boxen/license new file mode 100644 index 000000000..fa7ceba3e --- /dev/null +++ b/node_modules/boxen/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (https://sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/boxen/node_modules/ansi-styles/index.d.ts b/node_modules/boxen/node_modules/ansi-styles/index.d.ts new file mode 100644 index 000000000..44a907e58 --- /dev/null +++ b/node_modules/boxen/node_modules/ansi-styles/index.d.ts @@ -0,0 +1,345 @@ +declare type CSSColor = + | 'aliceblue' + | 'antiquewhite' + | 'aqua' + | 'aquamarine' + | 'azure' + | 'beige' + | 'bisque' + | 'black' + | 'blanchedalmond' + | 'blue' + | 'blueviolet' + | 'brown' + | 'burlywood' + | 'cadetblue' + | 'chartreuse' + | 'chocolate' + | 'coral' + | 'cornflowerblue' + | 'cornsilk' + | 'crimson' + | 'cyan' + | 'darkblue' + | 'darkcyan' + | 'darkgoldenrod' + | 'darkgray' + | 'darkgreen' + | 'darkgrey' + | 'darkkhaki' + | 'darkmagenta' + | 'darkolivegreen' + | 'darkorange' + | 'darkorchid' + | 'darkred' + | 'darksalmon' + | 'darkseagreen' + | 'darkslateblue' + | 'darkslategray' + | 'darkslategrey' + | 'darkturquoise' + | 'darkviolet' + | 'deeppink' + | 'deepskyblue' + | 'dimgray' + | 'dimgrey' + | 'dodgerblue' + | 'firebrick' + | 'floralwhite' + | 'forestgreen' + | 'fuchsia' + | 'gainsboro' + | 'ghostwhite' + | 'gold' + | 'goldenrod' + | 'gray' + | 'green' + | 'greenyellow' + | 'grey' + | 'honeydew' + | 'hotpink' + | 'indianred' + | 'indigo' + | 'ivory' + | 'khaki' + | 'lavender' + | 'lavenderblush' + | 'lawngreen' + | 'lemonchiffon' + | 'lightblue' + | 'lightcoral' + | 'lightcyan' + | 'lightgoldenrodyellow' + | 'lightgray' + | 'lightgreen' + | 'lightgrey' + | 'lightpink' + | 'lightsalmon' + | 'lightseagreen' + | 'lightskyblue' + | 'lightslategray' + | 'lightslategrey' + | 'lightsteelblue' + | 'lightyellow' + | 'lime' + | 'limegreen' + | 'linen' + | 'magenta' + | 'maroon' + | 'mediumaquamarine' + | 'mediumblue' + | 'mediumorchid' + | 'mediumpurple' + | 'mediumseagreen' + | 'mediumslateblue' + | 'mediumspringgreen' + | 'mediumturquoise' + | 'mediumvioletred' + | 'midnightblue' + | 'mintcream' + | 'mistyrose' + | 'moccasin' + | 'navajowhite' + | 'navy' + | 'oldlace' + | 'olive' + | 'olivedrab' + | 'orange' + | 'orangered' + | 'orchid' + | 'palegoldenrod' + | 'palegreen' + | 'paleturquoise' + | 'palevioletred' + | 'papayawhip' + | 'peachpuff' + | 'peru' + | 'pink' + | 'plum' + | 'powderblue' + | 'purple' + | 'rebeccapurple' + | 'red' + | 'rosybrown' + | 'royalblue' + | 'saddlebrown' + | 'salmon' + | 'sandybrown' + | 'seagreen' + | 'seashell' + | 'sienna' + | 'silver' + | 'skyblue' + | 'slateblue' + | 'slategray' + | 'slategrey' + | 'snow' + | 'springgreen' + | 'steelblue' + | 'tan' + | 'teal' + | 'thistle' + | 'tomato' + | 'turquoise' + | 'violet' + | 'wheat' + | 'white' + | 'whitesmoke' + | 'yellow' + | 'yellowgreen'; + +declare namespace ansiStyles { + interface ColorConvert { + /** + The RGB color space. + + @param red - (`0`-`255`) + @param green - (`0`-`255`) + @param blue - (`0`-`255`) + */ + rgb(red: number, green: number, blue: number): string; + + /** + The RGB HEX color space. + + @param hex - A hexadecimal string containing RGB data. + */ + hex(hex: string): string; + + /** + @param keyword - A CSS color name. 
+ */ + keyword(keyword: CSSColor): string; + + /** + The HSL color space. + + @param hue - (`0`-`360`) + @param saturation - (`0`-`100`) + @param lightness - (`0`-`100`) + */ + hsl(hue: number, saturation: number, lightness: number): string; + + /** + The HSV color space. + + @param hue - (`0`-`360`) + @param saturation - (`0`-`100`) + @param value - (`0`-`100`) + */ + hsv(hue: number, saturation: number, value: number): string; + + /** + The HSV color space. + + @param hue - (`0`-`360`) + @param whiteness - (`0`-`100`) + @param blackness - (`0`-`100`) + */ + hwb(hue: number, whiteness: number, blackness: number): string; + + /** + Use a [4-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#3/4-bit) to set text color. + */ + ansi(ansi: number): string; + + /** + Use an [8-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit) to set text color. + */ + ansi256(ansi: number): string; + } + + interface CSPair { + /** + The ANSI terminal control sequence for starting this style. + */ + readonly open: string; + + /** + The ANSI terminal control sequence for ending this style. + */ + readonly close: string; + } + + interface ColorBase { + readonly ansi: ColorConvert; + readonly ansi256: ColorConvert; + readonly ansi16m: ColorConvert; + + /** + The ANSI terminal control sequence for ending this color. + */ + readonly close: string; + } + + interface Modifier { + /** + Resets the current color chain. + */ + readonly reset: CSPair; + + /** + Make text bold. + */ + readonly bold: CSPair; + + /** + Emitting only a small amount of light. + */ + readonly dim: CSPair; + + /** + Make text italic. (Not widely supported) + */ + readonly italic: CSPair; + + /** + Make text underline. (Not widely supported) + */ + readonly underline: CSPair; + + /** + Inverse background and foreground colors. + */ + readonly inverse: CSPair; + + /** + Prints the text, but makes it invisible. + */ + readonly hidden: CSPair; + + /** + Puts a horizontal line through the center of the text. (Not widely supported) + */ + readonly strikethrough: CSPair; + } + + interface ForegroundColor { + readonly black: CSPair; + readonly red: CSPair; + readonly green: CSPair; + readonly yellow: CSPair; + readonly blue: CSPair; + readonly cyan: CSPair; + readonly magenta: CSPair; + readonly white: CSPair; + + /** + Alias for `blackBright`. + */ + readonly gray: CSPair; + + /** + Alias for `blackBright`. + */ + readonly grey: CSPair; + + readonly blackBright: CSPair; + readonly redBright: CSPair; + readonly greenBright: CSPair; + readonly yellowBright: CSPair; + readonly blueBright: CSPair; + readonly cyanBright: CSPair; + readonly magentaBright: CSPair; + readonly whiteBright: CSPair; + } + + interface BackgroundColor { + readonly bgBlack: CSPair; + readonly bgRed: CSPair; + readonly bgGreen: CSPair; + readonly bgYellow: CSPair; + readonly bgBlue: CSPair; + readonly bgCyan: CSPair; + readonly bgMagenta: CSPair; + readonly bgWhite: CSPair; + + /** + Alias for `bgBlackBright`. + */ + readonly bgGray: CSPair; + + /** + Alias for `bgBlackBright`. 
+ */ + readonly bgGrey: CSPair; + + readonly bgBlackBright: CSPair; + readonly bgRedBright: CSPair; + readonly bgGreenBright: CSPair; + readonly bgYellowBright: CSPair; + readonly bgBlueBright: CSPair; + readonly bgCyanBright: CSPair; + readonly bgMagentaBright: CSPair; + readonly bgWhiteBright: CSPair; + } +} + +declare const ansiStyles: { + readonly modifier: ansiStyles.Modifier; + readonly color: ansiStyles.ForegroundColor & ansiStyles.ColorBase; + readonly bgColor: ansiStyles.BackgroundColor & ansiStyles.ColorBase; + readonly codes: ReadonlyMap; +} & ansiStyles.BackgroundColor & ansiStyles.ForegroundColor & ansiStyles.Modifier; + +export = ansiStyles; diff --git a/node_modules/boxen/node_modules/ansi-styles/index.js b/node_modules/boxen/node_modules/ansi-styles/index.js new file mode 100644 index 000000000..5d82581a1 --- /dev/null +++ b/node_modules/boxen/node_modules/ansi-styles/index.js @@ -0,0 +1,163 @@ +'use strict'; + +const wrapAnsi16 = (fn, offset) => (...args) => { + const code = fn(...args); + return `\u001B[${code + offset}m`; +}; + +const wrapAnsi256 = (fn, offset) => (...args) => { + const code = fn(...args); + return `\u001B[${38 + offset};5;${code}m`; +}; + +const wrapAnsi16m = (fn, offset) => (...args) => { + const rgb = fn(...args); + return `\u001B[${38 + offset};2;${rgb[0]};${rgb[1]};${rgb[2]}m`; +}; + +const ansi2ansi = n => n; +const rgb2rgb = (r, g, b) => [r, g, b]; + +const setLazyProperty = (object, property, get) => { + Object.defineProperty(object, property, { + get: () => { + const value = get(); + + Object.defineProperty(object, property, { + value, + enumerable: true, + configurable: true + }); + + return value; + }, + enumerable: true, + configurable: true + }); +}; + +/** @type {typeof import('color-convert')} */ +let colorConvert; +const makeDynamicStyles = (wrap, targetSpace, identity, isBackground) => { + if (colorConvert === undefined) { + colorConvert = require('color-convert'); + } + + const offset = isBackground ? 10 : 0; + const styles = {}; + + for (const [sourceSpace, suite] of Object.entries(colorConvert)) { + const name = sourceSpace === 'ansi16' ? 
'ansi' : sourceSpace; + if (sourceSpace === targetSpace) { + styles[name] = wrap(identity, offset); + } else if (typeof suite === 'object') { + styles[name] = wrap(suite[targetSpace], offset); + } + } + + return styles; +}; + +function assembleStyles() { + const codes = new Map(); + const styles = { + modifier: { + reset: [0, 0], + // 21 isn't widely supported and 22 does the same thing + bold: [1, 22], + dim: [2, 22], + italic: [3, 23], + underline: [4, 24], + inverse: [7, 27], + hidden: [8, 28], + strikethrough: [9, 29] + }, + color: { + black: [30, 39], + red: [31, 39], + green: [32, 39], + yellow: [33, 39], + blue: [34, 39], + magenta: [35, 39], + cyan: [36, 39], + white: [37, 39], + + // Bright color + blackBright: [90, 39], + redBright: [91, 39], + greenBright: [92, 39], + yellowBright: [93, 39], + blueBright: [94, 39], + magentaBright: [95, 39], + cyanBright: [96, 39], + whiteBright: [97, 39] + }, + bgColor: { + bgBlack: [40, 49], + bgRed: [41, 49], + bgGreen: [42, 49], + bgYellow: [43, 49], + bgBlue: [44, 49], + bgMagenta: [45, 49], + bgCyan: [46, 49], + bgWhite: [47, 49], + + // Bright color + bgBlackBright: [100, 49], + bgRedBright: [101, 49], + bgGreenBright: [102, 49], + bgYellowBright: [103, 49], + bgBlueBright: [104, 49], + bgMagentaBright: [105, 49], + bgCyanBright: [106, 49], + bgWhiteBright: [107, 49] + } + }; + + // Alias bright black as gray (and grey) + styles.color.gray = styles.color.blackBright; + styles.bgColor.bgGray = styles.bgColor.bgBlackBright; + styles.color.grey = styles.color.blackBright; + styles.bgColor.bgGrey = styles.bgColor.bgBlackBright; + + for (const [groupName, group] of Object.entries(styles)) { + for (const [styleName, style] of Object.entries(group)) { + styles[styleName] = { + open: `\u001B[${style[0]}m`, + close: `\u001B[${style[1]}m` + }; + + group[styleName] = styles[styleName]; + + codes.set(style[0], style[1]); + } + + Object.defineProperty(styles, groupName, { + value: group, + enumerable: false + }); + } + + Object.defineProperty(styles, 'codes', { + value: codes, + enumerable: false + }); + + styles.color.close = '\u001B[39m'; + styles.bgColor.close = '\u001B[49m'; + + setLazyProperty(styles.color, 'ansi', () => makeDynamicStyles(wrapAnsi16, 'ansi16', ansi2ansi, false)); + setLazyProperty(styles.color, 'ansi256', () => makeDynamicStyles(wrapAnsi256, 'ansi256', ansi2ansi, false)); + setLazyProperty(styles.color, 'ansi16m', () => makeDynamicStyles(wrapAnsi16m, 'rgb', rgb2rgb, false)); + setLazyProperty(styles.bgColor, 'ansi', () => makeDynamicStyles(wrapAnsi16, 'ansi16', ansi2ansi, true)); + setLazyProperty(styles.bgColor, 'ansi256', () => makeDynamicStyles(wrapAnsi256, 'ansi256', ansi2ansi, true)); + setLazyProperty(styles.bgColor, 'ansi16m', () => makeDynamicStyles(wrapAnsi16m, 'rgb', rgb2rgb, true)); + + return styles; +} + +// Make the export immutable +Object.defineProperty(module, 'exports', { + enumerable: true, + get: assembleStyles +}); diff --git a/node_modules/boxen/node_modules/ansi-styles/license b/node_modules/boxen/node_modules/ansi-styles/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/boxen/node_modules/ansi-styles/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, 
sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/boxen/node_modules/ansi-styles/package.json b/node_modules/boxen/node_modules/ansi-styles/package.json new file mode 100644 index 000000000..e7ede5fba --- /dev/null +++ b/node_modules/boxen/node_modules/ansi-styles/package.json @@ -0,0 +1,88 @@ +{ + "_from": "ansi-styles@^4.1.0", + "_id": "ansi-styles@4.3.0", + "_inBundle": false, + "_integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==", + "_location": "/boxen/ansi-styles", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "ansi-styles@^4.1.0", + "name": "ansi-styles", + "escapedName": "ansi-styles", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/boxen/chalk" + ], + "_resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz", + "_shasum": "edd803628ae71c04c85ae7a0906edad34b648937", + "_spec": "ansi-styles@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\boxen\\node_modules\\chalk", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/chalk/ansi-styles/issues" + }, + "bundleDependencies": false, + "dependencies": { + "color-convert": "^2.0.1" + }, + "deprecated": false, + "description": "ANSI escape codes for styling strings in the terminal", + "devDependencies": { + "@types/color-convert": "^1.9.0", + "ava": "^2.3.0", + "svg-term-cli": "^2.1.1", + "tsd": "^0.11.0", + "xo": "^0.25.3" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "funding": "https://github.com/chalk/ansi-styles?sponsor=1", + "homepage": "https://github.com/chalk/ansi-styles#readme", + "keywords": [ + "ansi", + "styles", + "color", + "colour", + "colors", + "terminal", + "console", + "cli", + "string", + "tty", + "escape", + "formatting", + "rgb", + "256", + "shell", + "xterm", + "log", + "logging", + "command-line", + "text" + ], + "license": "MIT", + "name": "ansi-styles", + "repository": { + "type": "git", + "url": "git+https://github.com/chalk/ansi-styles.git" + }, + "scripts": { + "screenshot": "svg-term --command='node screenshot' --out=screenshot.svg --padding=3 --width=55 --height=3 --at=1000 --no-cursor", + "test": "xo && ava && tsd" + }, + "version": "4.3.0" +} diff --git a/node_modules/boxen/node_modules/ansi-styles/readme.md b/node_modules/boxen/node_modules/ansi-styles/readme.md new file mode 100644 index 000000000..24883de80 --- /dev/null +++ b/node_modules/boxen/node_modules/ansi-styles/readme.md @@ -0,0 +1,152 @@ +# ansi-styles [![Build Status](https://travis-ci.org/chalk/ansi-styles.svg?branch=master)](https://travis-ci.org/chalk/ansi-styles) 
+ +> [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code#Colors_and_Styles) for styling strings in the terminal + +You probably want the higher-level [chalk](https://github.com/chalk/chalk) module for styling your strings. + + + +## Install + +``` +$ npm install ansi-styles +``` + +## Usage + +```js +const style = require('ansi-styles'); + +console.log(`${style.green.open}Hello world!${style.green.close}`); + + +// Color conversion between 16/256/truecolor +// NOTE: If conversion goes to 16 colors or 256 colors, the original color +// may be degraded to fit that color palette. This means terminals +// that do not support 16 million colors will best-match the +// original color. +console.log(style.bgColor.ansi.hsl(120, 80, 72) + 'Hello world!' + style.bgColor.close); +console.log(style.color.ansi256.rgb(199, 20, 250) + 'Hello world!' + style.color.close); +console.log(style.color.ansi16m.hex('#abcdef') + 'Hello world!' + style.color.close); +``` + +## API + +Each style has an `open` and `close` property. + +## Styles + +### Modifiers + +- `reset` +- `bold` +- `dim` +- `italic` *(Not widely supported)* +- `underline` +- `inverse` +- `hidden` +- `strikethrough` *(Not widely supported)* + +### Colors + +- `black` +- `red` +- `green` +- `yellow` +- `blue` +- `magenta` +- `cyan` +- `white` +- `blackBright` (alias: `gray`, `grey`) +- `redBright` +- `greenBright` +- `yellowBright` +- `blueBright` +- `magentaBright` +- `cyanBright` +- `whiteBright` + +### Background colors + +- `bgBlack` +- `bgRed` +- `bgGreen` +- `bgYellow` +- `bgBlue` +- `bgMagenta` +- `bgCyan` +- `bgWhite` +- `bgBlackBright` (alias: `bgGray`, `bgGrey`) +- `bgRedBright` +- `bgGreenBright` +- `bgYellowBright` +- `bgBlueBright` +- `bgMagentaBright` +- `bgCyanBright` +- `bgWhiteBright` + +## Advanced usage + +By default, you get a map of styles, but the styles are also available as groups. They are non-enumerable so they don't show up unless you access them explicitly. This makes it easier to expose only a subset in a higher-level module. + +- `style.modifier` +- `style.color` +- `style.bgColor` + +###### Example + +```js +console.log(style.color.green.open); +``` + +Raw escape codes (i.e. without the CSI escape prefix `\u001B[` and render mode postfix `m`) are available under `style.codes`, which returns a `Map` with the open codes as keys and close codes as values. + +###### Example + +```js +console.log(style.codes.get(36)); +//=> 39 +``` + +## [256 / 16 million (TrueColor) support](https://gist.github.com/XVilka/8346728) + +`ansi-styles` uses the [`color-convert`](https://github.com/Qix-/color-convert) package to allow for converting between various colors and ANSI escapes, with support for 256 and 16 million colors. 
+ +The following color spaces from `color-convert` are supported: + +- `rgb` +- `hex` +- `keyword` +- `hsl` +- `hsv` +- `hwb` +- `ansi` +- `ansi256` + +To use these, call the associated conversion function with the intended output, for example: + +```js +style.color.ansi.rgb(100, 200, 15); // RGB to 16 color ansi foreground code +style.bgColor.ansi.rgb(100, 200, 15); // RGB to 16 color ansi background code + +style.color.ansi256.hsl(120, 100, 60); // HSL to 256 color ansi foreground code +style.bgColor.ansi256.hsl(120, 100, 60); // HSL to 256 color ansi foreground code + +style.color.ansi16m.hex('#C0FFEE'); // Hex (RGB) to 16 million color foreground code +style.bgColor.ansi16m.hex('#C0FFEE'); // Hex (RGB) to 16 million color background code +``` + +## Related + +- [ansi-escapes](https://github.com/sindresorhus/ansi-escapes) - ANSI escape codes for manipulating the terminal + +## Maintainers + +- [Sindre Sorhus](https://github.com/sindresorhus) +- [Josh Junon](https://github.com/qix-) + +## For enterprise + +Available as part of the Tidelift Subscription. + +The maintainers of `ansi-styles` and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. [Learn more.](https://tidelift.com/subscription/pkg/npm-ansi-styles?utm_source=npm-ansi-styles&utm_medium=referral&utm_campaign=enterprise&utm_term=repo) diff --git a/node_modules/boxen/node_modules/chalk/index.d.ts b/node_modules/boxen/node_modules/chalk/index.d.ts new file mode 100644 index 000000000..9cd88f38b --- /dev/null +++ b/node_modules/boxen/node_modules/chalk/index.d.ts @@ -0,0 +1,415 @@ +/** +Basic foreground colors. + +[More colors here.](https://github.com/chalk/chalk/blob/master/readme.md#256-and-truecolor-color-support) +*/ +declare type ForegroundColor = + | 'black' + | 'red' + | 'green' + | 'yellow' + | 'blue' + | 'magenta' + | 'cyan' + | 'white' + | 'gray' + | 'grey' + | 'blackBright' + | 'redBright' + | 'greenBright' + | 'yellowBright' + | 'blueBright' + | 'magentaBright' + | 'cyanBright' + | 'whiteBright'; + +/** +Basic background colors. + +[More colors here.](https://github.com/chalk/chalk/blob/master/readme.md#256-and-truecolor-color-support) +*/ +declare type BackgroundColor = + | 'bgBlack' + | 'bgRed' + | 'bgGreen' + | 'bgYellow' + | 'bgBlue' + | 'bgMagenta' + | 'bgCyan' + | 'bgWhite' + | 'bgGray' + | 'bgGrey' + | 'bgBlackBright' + | 'bgRedBright' + | 'bgGreenBright' + | 'bgYellowBright' + | 'bgBlueBright' + | 'bgMagentaBright' + | 'bgCyanBright' + | 'bgWhiteBright'; + +/** +Basic colors. + +[More colors here.](https://github.com/chalk/chalk/blob/master/readme.md#256-and-truecolor-color-support) +*/ +declare type Color = ForegroundColor | BackgroundColor; + +declare type Modifiers = + | 'reset' + | 'bold' + | 'dim' + | 'italic' + | 'underline' + | 'inverse' + | 'hidden' + | 'strikethrough' + | 'visible'; + +declare namespace chalk { + /** + Levels: + - `0` - All colors disabled. + - `1` - Basic 16 colors support. + - `2` - ANSI 256 colors support. + - `3` - Truecolor 16 million colors support. + */ + type Level = 0 | 1 | 2 | 3; + + interface Options { + /** + Specify the color support for Chalk. + + By default, color support is automatically detected based on the environment. + + Levels: + - `0` - All colors disabled. + - `1` - Basic 16 colors support. + - `2` - ANSI 256 colors support. 
+ - `3` - Truecolor 16 million colors support. + */ + level?: Level; + } + + /** + Return a new Chalk instance. + */ + type Instance = new (options?: Options) => Chalk; + + /** + Detect whether the terminal supports color. + */ + interface ColorSupport { + /** + The color level used by Chalk. + */ + level: Level; + + /** + Return whether Chalk supports basic 16 colors. + */ + hasBasic: boolean; + + /** + Return whether Chalk supports ANSI 256 colors. + */ + has256: boolean; + + /** + Return whether Chalk supports Truecolor 16 million colors. + */ + has16m: boolean; + } + + interface ChalkFunction { + /** + Use a template string. + + @remarks Template literals are unsupported for nested calls (see [issue #341](https://github.com/chalk/chalk/issues/341)) + + @example + ``` + import chalk = require('chalk'); + + log(chalk` + CPU: {red ${cpu.totalPercent}%} + RAM: {green ${ram.used / ram.total * 100}%} + DISK: {rgb(255,131,0) ${disk.used / disk.total * 100}%} + `); + ``` + + @example + ``` + import chalk = require('chalk'); + + log(chalk.red.bgBlack`2 + 3 = {bold ${2 + 3}}`) + ``` + */ + (text: TemplateStringsArray, ...placeholders: unknown[]): string; + + (...text: unknown[]): string; + } + + interface Chalk extends ChalkFunction { + /** + Return a new Chalk instance. + */ + Instance: Instance; + + /** + The color support for Chalk. + + By default, color support is automatically detected based on the environment. + + Levels: + - `0` - All colors disabled. + - `1` - Basic 16 colors support. + - `2` - ANSI 256 colors support. + - `3` - Truecolor 16 million colors support. + */ + level: Level; + + /** + Use HEX value to set text color. + + @param color - Hexadecimal value representing the desired color. + + @example + ``` + import chalk = require('chalk'); + + chalk.hex('#DEADED'); + ``` + */ + hex(color: string): Chalk; + + /** + Use keyword color value to set text color. + + @param color - Keyword value representing the desired color. + + @example + ``` + import chalk = require('chalk'); + + chalk.keyword('orange'); + ``` + */ + keyword(color: string): Chalk; + + /** + Use RGB values to set text color. + */ + rgb(red: number, green: number, blue: number): Chalk; + + /** + Use HSL values to set text color. + */ + hsl(hue: number, saturation: number, lightness: number): Chalk; + + /** + Use HSV values to set text color. + */ + hsv(hue: number, saturation: number, value: number): Chalk; + + /** + Use HWB values to set text color. + */ + hwb(hue: number, whiteness: number, blackness: number): Chalk; + + /** + Use a [Select/Set Graphic Rendition](https://en.wikipedia.org/wiki/ANSI_escape_code#SGR_parameters) (SGR) [color code number](https://en.wikipedia.org/wiki/ANSI_escape_code#3/4_bit) to set text color. + + 30 <= code && code < 38 || 90 <= code && code < 98 + For example, 31 for red, 91 for redBright. + */ + ansi(code: number): Chalk; + + /** + Use a [8-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit) to set text color. + */ + ansi256(index: number): Chalk; + + /** + Use HEX value to set background color. + + @param color - Hexadecimal value representing the desired color. + + @example + ``` + import chalk = require('chalk'); + + chalk.bgHex('#DEADED'); + ``` + */ + bgHex(color: string): Chalk; + + /** + Use keyword color value to set background color. + + @param color - Keyword value representing the desired color. 
+ + @example + ``` + import chalk = require('chalk'); + + chalk.bgKeyword('orange'); + ``` + */ + bgKeyword(color: string): Chalk; + + /** + Use RGB values to set background color. + */ + bgRgb(red: number, green: number, blue: number): Chalk; + + /** + Use HSL values to set background color. + */ + bgHsl(hue: number, saturation: number, lightness: number): Chalk; + + /** + Use HSV values to set background color. + */ + bgHsv(hue: number, saturation: number, value: number): Chalk; + + /** + Use HWB values to set background color. + */ + bgHwb(hue: number, whiteness: number, blackness: number): Chalk; + + /** + Use a [Select/Set Graphic Rendition](https://en.wikipedia.org/wiki/ANSI_escape_code#SGR_parameters) (SGR) [color code number](https://en.wikipedia.org/wiki/ANSI_escape_code#3/4_bit) to set background color. + + 30 <= code && code < 38 || 90 <= code && code < 98 + For example, 31 for red, 91 for redBright. + Use the foreground code, not the background code (for example, not 41, nor 101). + */ + bgAnsi(code: number): Chalk; + + /** + Use a [8-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit) to set background color. + */ + bgAnsi256(index: number): Chalk; + + /** + Modifier: Resets the current color chain. + */ + readonly reset: Chalk; + + /** + Modifier: Make text bold. + */ + readonly bold: Chalk; + + /** + Modifier: Emitting only a small amount of light. + */ + readonly dim: Chalk; + + /** + Modifier: Make text italic. (Not widely supported) + */ + readonly italic: Chalk; + + /** + Modifier: Make text underline. (Not widely supported) + */ + readonly underline: Chalk; + + /** + Modifier: Inverse background and foreground colors. + */ + readonly inverse: Chalk; + + /** + Modifier: Prints the text, but makes it invisible. + */ + readonly hidden: Chalk; + + /** + Modifier: Puts a horizontal line through the center of the text. (Not widely supported) + */ + readonly strikethrough: Chalk; + + /** + Modifier: Prints the text only when Chalk has a color support level > 0. + Can be useful for things that are purely cosmetic. + */ + readonly visible: Chalk; + + readonly black: Chalk; + readonly red: Chalk; + readonly green: Chalk; + readonly yellow: Chalk; + readonly blue: Chalk; + readonly magenta: Chalk; + readonly cyan: Chalk; + readonly white: Chalk; + + /* + Alias for `blackBright`. + */ + readonly gray: Chalk; + + /* + Alias for `blackBright`. + */ + readonly grey: Chalk; + + readonly blackBright: Chalk; + readonly redBright: Chalk; + readonly greenBright: Chalk; + readonly yellowBright: Chalk; + readonly blueBright: Chalk; + readonly magentaBright: Chalk; + readonly cyanBright: Chalk; + readonly whiteBright: Chalk; + + readonly bgBlack: Chalk; + readonly bgRed: Chalk; + readonly bgGreen: Chalk; + readonly bgYellow: Chalk; + readonly bgBlue: Chalk; + readonly bgMagenta: Chalk; + readonly bgCyan: Chalk; + readonly bgWhite: Chalk; + + /* + Alias for `bgBlackBright`. + */ + readonly bgGray: Chalk; + + /* + Alias for `bgBlackBright`. + */ + readonly bgGrey: Chalk; + + readonly bgBlackBright: Chalk; + readonly bgRedBright: Chalk; + readonly bgGreenBright: Chalk; + readonly bgYellowBright: Chalk; + readonly bgBlueBright: Chalk; + readonly bgMagentaBright: Chalk; + readonly bgCyanBright: Chalk; + readonly bgWhiteBright: Chalk; + } +} + +/** +Main Chalk object that allows to chain styles together. +Call the last one as a method with a string argument. +Order doesn't matter, and later styles take precedent in case of a conflict. 
+This simply means that `chalk.red.yellow.green` is equivalent to `chalk.green`. +*/ +declare const chalk: chalk.Chalk & chalk.ChalkFunction & { + supportsColor: chalk.ColorSupport | false; + Level: chalk.Level; + Color: Color; + ForegroundColor: ForegroundColor; + BackgroundColor: BackgroundColor; + Modifiers: Modifiers; + stderr: chalk.Chalk & {supportsColor: chalk.ColorSupport | false}; +}; + +export = chalk; diff --git a/node_modules/boxen/node_modules/chalk/license b/node_modules/boxen/node_modules/chalk/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/boxen/node_modules/chalk/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/boxen/node_modules/chalk/package.json b/node_modules/boxen/node_modules/chalk/package.json new file mode 100644 index 000000000..b2c2218a3 --- /dev/null +++ b/node_modules/boxen/node_modules/chalk/package.json @@ -0,0 +1,100 @@ +{ + "_from": "chalk@^4.1.0", + "_id": "chalk@4.1.2", + "_inBundle": false, + "_integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==", + "_location": "/boxen/chalk", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "chalk@^4.1.0", + "name": "chalk", + "escapedName": "chalk", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/boxen" + ], + "_resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz", + "_shasum": "aac4e2b7734a740867aeb16bf02aad556a1e7a01", + "_spec": "chalk@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\boxen", + "bugs": { + "url": "https://github.com/chalk/chalk/issues" + }, + "bundleDependencies": false, + "dependencies": { + "ansi-styles": "^4.1.0", + "supports-color": "^7.1.0" + }, + "deprecated": false, + "description": "Terminal string styling done right", + "devDependencies": { + "ava": "^2.4.0", + "coveralls": "^3.0.7", + "execa": "^4.0.0", + "import-fresh": "^3.1.0", + "matcha": "^0.7.0", + "nyc": "^15.0.0", + "resolve-from": "^5.0.0", + "tsd": "^0.7.4", + "xo": "^0.28.2" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "source", + "index.d.ts" + ], + "funding": "https://github.com/chalk/chalk?sponsor=1", + "homepage": "https://github.com/chalk/chalk#readme", + "keywords": [ + "color", + "colour", + "colors", + "terminal", + "console", + "cli", + "string", + "str", + "ansi", + "style", + 
"styles", + "tty", + "formatting", + "rgb", + "256", + "shell", + "xterm", + "log", + "logging", + "command-line", + "text" + ], + "license": "MIT", + "main": "source", + "name": "chalk", + "repository": { + "type": "git", + "url": "git+https://github.com/chalk/chalk.git" + }, + "scripts": { + "bench": "matcha benchmark.js", + "test": "xo && nyc ava && tsd" + }, + "version": "4.1.2", + "xo": { + "rules": { + "unicorn/prefer-string-slice": "off", + "unicorn/prefer-includes": "off", + "@typescript-eslint/member-ordering": "off", + "no-redeclare": "off", + "unicorn/string-content": "off", + "unicorn/better-regex": "off" + } + } +} diff --git a/node_modules/boxen/node_modules/chalk/readme.md b/node_modules/boxen/node_modules/chalk/readme.md new file mode 100644 index 000000000..a055d21c9 --- /dev/null +++ b/node_modules/boxen/node_modules/chalk/readme.md @@ -0,0 +1,341 @@ +
+<h1 align="center">Chalk</h1>
+
+> Terminal string styling done right
+
+[![Build Status](https://travis-ci.org/chalk/chalk.svg?branch=master)](https://travis-ci.org/chalk/chalk) [![Coverage Status](https://coveralls.io/repos/github/chalk/chalk/badge.svg?branch=master)](https://coveralls.io/github/chalk/chalk?branch=master) [![npm dependents](https://badgen.net/npm/dependents/chalk)](https://www.npmjs.com/package/chalk?activeTab=dependents) [![Downloads](https://badgen.net/npm/dt/chalk)](https://www.npmjs.com/package/chalk) [![](https://img.shields.io/badge/unicorn-approved-ff69b4.svg)](https://www.youtube.com/watch?v=9auOCbH5Ns4) [![XO code style](https://img.shields.io/badge/code_style-XO-5ed9c7.svg)](https://github.com/xojs/xo) ![TypeScript-ready](https://img.shields.io/npm/types/chalk.svg) [![run on repl.it](https://repl.it/badge/github/chalk/chalk)](https://repl.it/github/chalk/chalk)
+
+---
+
+---
+
+ +## Highlights + +- Expressive API +- Highly performant +- Ability to nest styles +- [256/Truecolor color support](#256-and-truecolor-color-support) +- Auto-detects color support +- Doesn't extend `String.prototype` +- Clean and focused +- Actively maintained +- [Used by ~50,000 packages](https://www.npmjs.com/browse/depended/chalk) as of January 1, 2020 + +## Install + +```console +$ npm install chalk +``` + +## Usage + +```js +const chalk = require('chalk'); + +console.log(chalk.blue('Hello world!')); +``` + +Chalk comes with an easy to use composable API where you just chain and nest the styles you want. + +```js +const chalk = require('chalk'); +const log = console.log; + +// Combine styled and normal strings +log(chalk.blue('Hello') + ' World' + chalk.red('!')); + +// Compose multiple styles using the chainable API +log(chalk.blue.bgRed.bold('Hello world!')); + +// Pass in multiple arguments +log(chalk.blue('Hello', 'World!', 'Foo', 'bar', 'biz', 'baz')); + +// Nest styles +log(chalk.red('Hello', chalk.underline.bgBlue('world') + '!')); + +// Nest styles of the same type even (color, underline, background) +log(chalk.green( + 'I am a green line ' + + chalk.blue.underline.bold('with a blue substring') + + ' that becomes green again!' +)); + +// ES2015 template literal +log(` +CPU: ${chalk.red('90%')} +RAM: ${chalk.green('40%')} +DISK: ${chalk.yellow('70%')} +`); + +// ES2015 tagged template literal +log(chalk` +CPU: {red ${cpu.totalPercent}%} +RAM: {green ${ram.used / ram.total * 100}%} +DISK: {rgb(255,131,0) ${disk.used / disk.total * 100}%} +`); + +// Use RGB colors in terminal emulators that support it. +log(chalk.keyword('orange')('Yay for orange colored text!')); +log(chalk.rgb(123, 45, 67).underline('Underlined reddish color')); +log(chalk.hex('#DEADED').bold('Bold gray!')); +``` + +Easily define your own themes: + +```js +const chalk = require('chalk'); + +const error = chalk.bold.red; +const warning = chalk.keyword('orange'); + +console.log(error('Error!')); +console.log(warning('Warning!')); +``` + +Take advantage of console.log [string substitution](https://nodejs.org/docs/latest/api/console.html#console_console_log_data_args): + +```js +const name = 'Sindre'; +console.log(chalk.green('Hello %s'), name); +//=> 'Hello Sindre' +``` + +## API + +### chalk.` + +<% /* +Like stylesheets, scripts can also be extracted. +This script block will end up at the end of the HTML document. +*/ %> + + +
+<%= message %>
+ diff --git a/node_modules/express-ejs-layouts/lib/express-layouts.js b/node_modules/express-ejs-layouts/lib/express-layouts.js new file mode 100644 index 000000000..4eb49a5ae --- /dev/null +++ b/node_modules/express-ejs-layouts/lib/express-layouts.js @@ -0,0 +1,117 @@ +/*jslint sloppy:true indent:2 plusplus:true regexp:true*/ + +var contentPattern = '&&<>&&'; + +function contentFor(contentName) { + return contentPattern + contentName + contentPattern; +} + +function parseContents(locals) { + var name, i = 1, str = locals.body, + regex = new RegExp('\r?\n?' + contentPattern + '.+?' + contentPattern + '\r?\n?', 'g'), + split = str.split(regex), + matches = str.match(regex); + + locals.body = split[0]; + + if (matches !== null) { + matches.forEach(function (match) { + name = match.split(contentPattern)[1]; + locals[name] = split[i]; + i++; + }); + } +} + +function parseScripts(locals) { + var str = locals.body, regex = /\[\s\S]*?\<\/script\>/g; + + if (regex.test(str)) { + locals.body = str.replace(regex, ''); + locals.script = str.match(regex).join('\n'); + } +} + +function parseStyles(locals) { + var str = locals.body, regex = /(?:\[\s\S]*?\<\/style\>)|(?:\(?:\<\/link\>)?)/g; + + if (regex.test(str)) { + locals.body = str.replace(regex, ''); + locals.style = str.match(regex).join('\n'); + } +} + +function parseMetas(locals) { + var str = locals.body, regex = /\/g; + + if (regex.test(str)) { + locals.body = str.replace(regex, ''); + locals.meta = str.match(regex).join('\n'); + } +} + +module.exports = function (req, res, next) { + if(!res.__render) res.__render = res.render; + + res.render = function (view, options, fn) { + var layout, self = this, app = req.app, + defaultLayout = app.get('layout'); + + options = options || {}; + if (typeof options === 'function') { + fn = options; + options = {}; + } + + if (options.layout === false || ((options.layout || defaultLayout) === false)) { + res.__render.call(res, view, options, fn); + return; + } + + layout = options.layout || res.locals.layout || defaultLayout; + if (layout === true || layout === undefined) { + layout = 'layout'; + } + + options.contentFor = contentFor; + res.__render.call(res, view, options, function (err, str) { + var l, locals; + + if (err) { return fn ? 
fn(err) : next(err); } + + locals = { + body: str, + defineContent: function(contentName) { return locals[contentName] || ''; } + }; + for (l in options) { + if (options.hasOwnProperty(l) && l !== 'layout' && l !== 'contentFor') { + locals[l] = options[l]; + } + } + + if (typeof locals.body !== 'string') { + res.__render.call(self, view, locals, fn); + return; + } + + if (options.extractScripts === true || (options.extractScripts === undefined && app.get('layout extractScripts') === true)) { + locals.script = ''; + parseScripts(locals); + } + + if (options.extractStyles === true || (options.extractStyles === undefined && app.get('layout extractStyles') === true)) { + locals.style = ''; + parseStyles(locals); + } + + if (options.extractMetas === true || (options.extractMetas === undefined && app.get('layout extractMetas') === true)) { + locals.meta = ''; + parseMetas(locals); + } + + parseContents(locals); + res.__render.call(self, layout, locals, fn); + }); + }; + next(); +}; diff --git a/node_modules/express-ejs-layouts/package.json b/node_modules/express-ejs-layouts/package.json new file mode 100644 index 000000000..9aa3b6ce9 --- /dev/null +++ b/node_modules/express-ejs-layouts/package.json @@ -0,0 +1,61 @@ +{ + "_from": "express-ejs-layouts@^2.5.1", + "_id": "express-ejs-layouts@2.5.1", + "_inBundle": false, + "_integrity": "sha512-IXROv9n3xKga7FowT06n1Qn927JR8ZWDn5Dc9CJQoiiaaDqbhW5PDmWShzbpAa2wjWT1vJqaIM1S6vJwwX11gA==", + "_location": "/express-ejs-layouts", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "express-ejs-layouts@^2.5.1", + "name": "express-ejs-layouts", + "escapedName": "express-ejs-layouts", + "rawSpec": "^2.5.1", + "saveSpec": null, + "fetchSpec": "^2.5.1" + }, + "_requiredBy": [ + "#DEV:/", + "#USER" + ], + "_resolved": "https://registry.npmjs.org/express-ejs-layouts/-/express-ejs-layouts-2.5.1.tgz", + "_shasum": "d204d9065ee2825fcbd718d820289fc81e691ccb", + "_spec": "express-ejs-layouts@^2.5.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project", + "author": { + "name": "Igor Soarez", + "email": "igorsoarez@gmail.com" + }, + "bugs": { + "url": "https://github.com/Soarez/express-ejs-layouts/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Layout support for ejs in express.", + "devDependencies": { + "ejs": "^2.6.1", + "express": "*", + "mocha": "*", + "should": "*", + "supertest": "*" + }, + "homepage": "https://github.com/Soarez/express-ejs-layouts#readme", + "keywords": [ + "express", + "layout", + "ejs" + ], + "main": "lib/express-layouts.js", + "name": "express-ejs-layouts", + "optionalDependencies": {}, + "repository": { + "type": "git", + "url": "git://github.com/Soarez/express-ejs-layouts.git" + }, + "scripts": { + "test": "mocha" + }, + "version": "2.5.1" +} diff --git a/node_modules/express/History.md b/node_modules/express/History.md new file mode 100644 index 000000000..6e62a6ddb --- /dev/null +++ b/node_modules/express/History.md @@ -0,0 +1,3477 @@ +4.17.1 / 2019-05-25 +=================== + + * Revert "Improve error message for `null`/`undefined` to `res.status`" + +4.17.0 / 2019-05-16 +=================== + + * Add `express.raw` to parse bodies into `Buffer` + * Add `express.text` to parse bodies into string + * Improve error message for non-strings to `res.sendFile` + * Improve error message for `null`/`undefined` to `res.status` + * Support multiple hosts in `X-Forwarded-Host` + * deps: accepts@~1.3.7 + * deps: 
body-parser@1.19.0 + - Add encoding MIK + - Add petabyte (`pb`) support + - Fix parsing array brackets after index + - deps: bytes@3.1.0 + - deps: http-errors@1.7.2 + - deps: iconv-lite@0.4.24 + - deps: qs@6.7.0 + - deps: raw-body@2.4.0 + - deps: type-is@~1.6.17 + * deps: content-disposition@0.5.3 + * deps: cookie@0.4.0 + - Add `SameSite=None` support + * deps: finalhandler@~1.1.2 + - Set stricter `Content-Security-Policy` header + - deps: parseurl@~1.3.3 + - deps: statuses@~1.5.0 + * deps: parseurl@~1.3.3 + * deps: proxy-addr@~2.0.5 + - deps: ipaddr.js@1.9.0 + * deps: qs@6.7.0 + - Fix parsing array brackets after index + * deps: range-parser@~1.2.1 + * deps: send@0.17.1 + - Set stricter CSP header in redirect & error responses + - deps: http-errors@~1.7.2 + - deps: mime@1.6.0 + - deps: ms@2.1.1 + - deps: range-parser@~1.2.1 + - deps: statuses@~1.5.0 + - perf: remove redundant `path.normalize` call + * deps: serve-static@1.14.1 + - Set stricter CSP header in redirect response + - deps: parseurl@~1.3.3 + - deps: send@0.17.1 + * deps: setprototypeof@1.1.1 + * deps: statuses@~1.5.0 + - Add `103 Early Hints` + * deps: type-is@~1.6.18 + - deps: mime-types@~2.1.24 + - perf: prevent internal `throw` on invalid type + +4.16.4 / 2018-10-10 +=================== + + * Fix issue where `"Request aborted"` may be logged in `res.sendfile` + * Fix JSDoc for `Router` constructor + * deps: body-parser@1.18.3 + - Fix deprecation warnings on Node.js 10+ + - Fix stack trace for strict json parse error + - deps: depd@~1.1.2 + - deps: http-errors@~1.6.3 + - deps: iconv-lite@0.4.23 + - deps: qs@6.5.2 + - deps: raw-body@2.3.3 + - deps: type-is@~1.6.16 + * deps: proxy-addr@~2.0.4 + - deps: ipaddr.js@1.8.0 + * deps: qs@6.5.2 + * deps: safe-buffer@5.1.2 + +4.16.3 / 2018-03-12 +=================== + + * deps: accepts@~1.3.5 + - deps: mime-types@~2.1.18 + * deps: depd@~1.1.2 + - perf: remove argument reassignment + * deps: encodeurl@~1.0.2 + - Fix encoding `%` as last character + * deps: finalhandler@1.1.1 + - Fix 404 output for bad / missing pathnames + - deps: encodeurl@~1.0.2 + - deps: statuses@~1.4.0 + * deps: proxy-addr@~2.0.3 + - deps: ipaddr.js@1.6.0 + * deps: send@0.16.2 + - Fix incorrect end tag in default error & redirects + - deps: depd@~1.1.2 + - deps: encodeurl@~1.0.2 + - deps: statuses@~1.4.0 + * deps: serve-static@1.13.2 + - Fix incorrect end tag in redirects + - deps: encodeurl@~1.0.2 + - deps: send@0.16.2 + * deps: statuses@~1.4.0 + * deps: type-is@~1.6.16 + - deps: mime-types@~2.1.18 + +4.16.2 / 2017-10-09 +=================== + + * Fix `TypeError` in `res.send` when given `Buffer` and `ETag` header set + * perf: skip parsing of entire `X-Forwarded-Proto` header + +4.16.1 / 2017-09-29 +=================== + + * deps: send@0.16.1 + * deps: serve-static@1.13.1 + - Fix regression when `root` is incorrectly set to a file + - deps: send@0.16.1 + +4.16.0 / 2017-09-28 +=================== + + * Add `"json escape"` setting for `res.json` and `res.jsonp` + * Add `express.json` and `express.urlencoded` to parse bodies + * Add `options` argument to `res.download` + * Improve error message when autoloading invalid view engine + * Improve error messages when non-function provided as middleware + * Skip `Buffer` encoding when not generating ETag for small response + * Use `safe-buffer` for improved Buffer API + * deps: accepts@~1.3.4 + - deps: mime-types@~2.1.16 + * deps: content-type@~1.0.4 + - perf: remove argument reassignment + - perf: skip parameter parsing when no parameters + * deps: etag@~1.8.1 + - perf: 
replace regular expression with substring + * deps: finalhandler@1.1.0 + - Use `res.headersSent` when available + * deps: parseurl@~1.3.2 + - perf: reduce overhead for full URLs + - perf: unroll the "fast-path" `RegExp` + * deps: proxy-addr@~2.0.2 + - Fix trimming leading / trailing OWS in `X-Forwarded-For` + - deps: forwarded@~0.1.2 + - deps: ipaddr.js@1.5.2 + - perf: reduce overhead when no `X-Forwarded-For` header + * deps: qs@6.5.1 + - Fix parsing & compacting very deep objects + * deps: send@0.16.0 + - Add 70 new types for file extensions + - Add `immutable` option + - Fix missing `` in default error & redirects + - Set charset as "UTF-8" for .js and .json + - Use instance methods on steam to check for listeners + - deps: mime@1.4.1 + - perf: improve path validation speed + * deps: serve-static@1.13.0 + - Add 70 new types for file extensions + - Add `immutable` option + - Set charset as "UTF-8" for .js and .json + - deps: send@0.16.0 + * deps: setprototypeof@1.1.0 + * deps: utils-merge@1.0.1 + * deps: vary@~1.1.2 + - perf: improve header token parsing speed + * perf: re-use options object when generating ETags + * perf: remove dead `.charset` set in `res.jsonp` + +4.15.5 / 2017-09-24 +=================== + + * deps: debug@2.6.9 + * deps: finalhandler@~1.0.6 + - deps: debug@2.6.9 + - deps: parseurl@~1.3.2 + * deps: fresh@0.5.2 + - Fix handling of modified headers with invalid dates + - perf: improve ETag match loop + - perf: improve `If-None-Match` token parsing + * deps: send@0.15.6 + - Fix handling of modified headers with invalid dates + - deps: debug@2.6.9 + - deps: etag@~1.8.1 + - deps: fresh@0.5.2 + - perf: improve `If-Match` token parsing + * deps: serve-static@1.12.6 + - deps: parseurl@~1.3.2 + - deps: send@0.15.6 + - perf: improve slash collapsing + +4.15.4 / 2017-08-06 +=================== + + * deps: debug@2.6.8 + * deps: depd@~1.1.1 + - Remove unnecessary `Buffer` loading + * deps: finalhandler@~1.0.4 + - deps: debug@2.6.8 + * deps: proxy-addr@~1.1.5 + - Fix array argument being altered + - deps: ipaddr.js@1.4.0 + * deps: qs@6.5.0 + * deps: send@0.15.4 + - deps: debug@2.6.8 + - deps: depd@~1.1.1 + - deps: http-errors@~1.6.2 + * deps: serve-static@1.12.4 + - deps: send@0.15.4 + +4.15.3 / 2017-05-16 +=================== + + * Fix error when `res.set` cannot add charset to `Content-Type` + * deps: debug@2.6.7 + - Fix `DEBUG_MAX_ARRAY_LENGTH` + - deps: ms@2.0.0 + * deps: finalhandler@~1.0.3 + - Fix missing `` in HTML document + - deps: debug@2.6.7 + * deps: proxy-addr@~1.1.4 + - deps: ipaddr.js@1.3.0 + * deps: send@0.15.3 + - deps: debug@2.6.7 + - deps: ms@2.0.0 + * deps: serve-static@1.12.3 + - deps: send@0.15.3 + * deps: type-is@~1.6.15 + - deps: mime-types@~2.1.15 + * deps: vary@~1.1.1 + - perf: hoist regular expression + +4.15.2 / 2017-03-06 +=================== + + * deps: qs@6.4.0 + - Fix regression parsing keys starting with `[` + +4.15.1 / 2017-03-05 +=================== + + * deps: send@0.15.1 + - Fix issue when `Date.parse` does not return `NaN` on invalid date + - Fix strict violation in broken environments + * deps: serve-static@1.12.1 + - Fix issue when `Date.parse` does not return `NaN` on invalid date + - deps: send@0.15.1 + +4.15.0 / 2017-03-01 +=================== + + * Add debug message when loading view engine + * Add `next("router")` to exit from router + * Fix case where `router.use` skipped requests routes did not + * Remove usage of `res._headers` private field + - Improves compatibility with Node.js 8 nightly + * Skip routing when `req.url` is not set + 
* Use `%o` in path debug to tell types apart + * Use `Object.create` to setup request & response prototypes + * Use `setprototypeof` module to replace `__proto__` setting + * Use `statuses` instead of `http` module for status messages + * deps: debug@2.6.1 + - Allow colors in workers + - Deprecated `DEBUG_FD` environment variable set to `3` or higher + - Fix error when running under React Native + - Use same color for same namespace + - deps: ms@0.7.2 + * deps: etag@~1.8.0 + - Use SHA1 instead of MD5 for ETag hashing + - Works with FIPS 140-2 OpenSSL configuration + * deps: finalhandler@~1.0.0 + - Fix exception when `err` cannot be converted to a string + - Fully URL-encode the pathname in the 404 + - Only include the pathname in the 404 message + - Send complete HTML document + - Set `Content-Security-Policy: default-src 'self'` header + - deps: debug@2.6.1 + * deps: fresh@0.5.0 + - Fix false detection of `no-cache` request directive + - Fix incorrect result when `If-None-Match` has both `*` and ETags + - Fix weak `ETag` matching to match spec + - perf: delay reading header values until needed + - perf: enable strict mode + - perf: hoist regular expressions + - perf: remove duplicate conditional + - perf: remove unnecessary boolean coercions + - perf: skip checking modified time if ETag check failed + - perf: skip parsing `If-None-Match` when no `ETag` header + - perf: use `Date.parse` instead of `new Date` + * deps: qs@6.3.1 + - Fix array parsing from skipping empty values + - Fix compacting nested arrays + * deps: send@0.15.0 + - Fix false detection of `no-cache` request directive + - Fix incorrect result when `If-None-Match` has both `*` and ETags + - Fix weak `ETag` matching to match spec + - Remove usage of `res._headers` private field + - Support `If-Match` and `If-Unmodified-Since` headers + - Use `res.getHeaderNames()` when available + - Use `res.headersSent` when available + - deps: debug@2.6.1 + - deps: etag@~1.8.0 + - deps: fresh@0.5.0 + - deps: http-errors@~1.6.1 + * deps: serve-static@1.12.0 + - Fix false detection of `no-cache` request directive + - Fix incorrect result when `If-None-Match` has both `*` and ETags + - Fix weak `ETag` matching to match spec + - Remove usage of `res._headers` private field + - Send complete HTML document in redirect response + - Set default CSP header in redirect response + - Support `If-Match` and `If-Unmodified-Since` headers + - Use `res.getHeaderNames()` when available + - Use `res.headersSent` when available + - deps: send@0.15.0 + * perf: add fast match path for `*` route + * perf: improve `req.ips` performance + +4.14.1 / 2017-01-28 +=================== + + * deps: content-disposition@0.5.2 + * deps: finalhandler@0.5.1 + - Fix exception when `err.headers` is not an object + - deps: statuses@~1.3.1 + - perf: hoist regular expressions + - perf: remove duplicate validation path + * deps: proxy-addr@~1.1.3 + - deps: ipaddr.js@1.2.0 + * deps: send@0.14.2 + - deps: http-errors@~1.5.1 + - deps: ms@0.7.2 + - deps: statuses@~1.3.1 + * deps: serve-static@~1.11.2 + - deps: send@0.14.2 + * deps: type-is@~1.6.14 + - deps: mime-types@~2.1.13 + +4.14.0 / 2016-06-16 +=================== + + * Add `acceptRanges` option to `res.sendFile`/`res.sendfile` + * Add `cacheControl` option to `res.sendFile`/`res.sendfile` + * Add `options` argument to `req.range` + - Includes the `combine` option + * Encode URL in `res.location`/`res.redirect` if not already encoded + * Fix some redirect handling in `res.sendFile`/`res.sendfile` + * Fix Windows absolute path 
check using forward slashes + * Improve error with invalid arguments to `req.get()` + * Improve performance for `res.json`/`res.jsonp` in most cases + * Improve `Range` header handling in `res.sendFile`/`res.sendfile` + * deps: accepts@~1.3.3 + - Fix including type extensions in parameters in `Accept` parsing + - Fix parsing `Accept` parameters with quoted equals + - Fix parsing `Accept` parameters with quoted semicolons + - Many performance improvements + - deps: mime-types@~2.1.11 + - deps: negotiator@0.6.1 + * deps: content-type@~1.0.2 + - perf: enable strict mode + * deps: cookie@0.3.1 + - Add `sameSite` option + - Fix cookie `Max-Age` to never be a floating point number + - Improve error message when `encode` is not a function + - Improve error message when `expires` is not a `Date` + - Throw better error for invalid argument to parse + - Throw on invalid values provided to `serialize` + - perf: enable strict mode + - perf: hoist regular expression + - perf: use for loop in parse + - perf: use string concatenation for serialization + * deps: finalhandler@0.5.0 + - Change invalid or non-numeric status code to 500 + - Overwrite status message to match set status code + - Prefer `err.statusCode` if `err.status` is invalid + - Set response headers from `err.headers` object + - Use `statuses` instead of `http` module for status messages + * deps: proxy-addr@~1.1.2 + - Fix accepting various invalid netmasks + - Fix IPv6-mapped IPv4 validation edge cases + - IPv4 netmasks must be contiguous + - IPv6 addresses cannot be used as a netmask + - deps: ipaddr.js@1.1.1 + * deps: qs@6.2.0 + - Add `decoder` option in `parse` function + * deps: range-parser@~1.2.0 + - Add `combine` option to combine overlapping ranges + - Fix incorrectly returning -1 when there is at least one valid range + - perf: remove internal function + * deps: send@0.14.1 + - Add `acceptRanges` option + - Add `cacheControl` option + - Attempt to combine multiple ranges into single range + - Correctly inherit from `Stream` class + - Fix `Content-Range` header in 416 responses when using `start`/`end` options + - Fix `Content-Range` header missing from default 416 responses + - Fix redirect error when `path` contains raw non-URL characters + - Fix redirect when `path` starts with multiple forward slashes + - Ignore non-byte `Range` headers + - deps: http-errors@~1.5.0 + - deps: range-parser@~1.2.0 + - deps: statuses@~1.3.0 + - perf: remove argument reassignment + * deps: serve-static@~1.11.1 + - Add `acceptRanges` option + - Add `cacheControl` option + - Attempt to combine multiple ranges into single range + - Fix redirect error when `req.url` contains raw non-URL characters + - Ignore non-byte `Range` headers + - Use status code 301 for redirects + - deps: send@0.14.1 + * deps: type-is@~1.6.13 + - Fix type error when given invalid type to match against + - deps: mime-types@~2.1.11 + * deps: vary@~1.1.0 + - Only accept valid field names in the `field` argument + * perf: use strict equality when possible + +4.13.4 / 2016-01-21 +=================== + + * deps: content-disposition@0.5.1 + - perf: enable strict mode + * deps: cookie@0.1.5 + - Throw on invalid values provided to `serialize` + * deps: depd@~1.1.0 + - Support web browser loading + - perf: enable strict mode + * deps: escape-html@~1.0.3 + - perf: enable strict mode + - perf: optimize string replacement + - perf: use faster string coercion + * deps: finalhandler@0.4.1 + - deps: escape-html@~1.0.3 + * deps: merge-descriptors@1.0.1 + - perf: enable strict mode + * deps: 
methods@~1.1.2 + - perf: enable strict mode + * deps: parseurl@~1.3.1 + - perf: enable strict mode + * deps: proxy-addr@~1.0.10 + - deps: ipaddr.js@1.0.5 + - perf: enable strict mode + * deps: range-parser@~1.0.3 + - perf: enable strict mode + * deps: send@0.13.1 + - deps: depd@~1.1.0 + - deps: destroy@~1.0.4 + - deps: escape-html@~1.0.3 + - deps: range-parser@~1.0.3 + * deps: serve-static@~1.10.2 + - deps: escape-html@~1.0.3 + - deps: parseurl@~1.3.0 + - deps: send@0.13.1 + +4.13.3 / 2015-08-02 +=================== + + * Fix infinite loop condition using `mergeParams: true` + * Fix inner numeric indices incorrectly altering parent `req.params` + +4.13.2 / 2015-07-31 +=================== + + * deps: accepts@~1.2.12 + - deps: mime-types@~2.1.4 + * deps: array-flatten@1.1.1 + - perf: enable strict mode + * deps: path-to-regexp@0.1.7 + - Fix regression with escaped round brackets and matching groups + * deps: type-is@~1.6.6 + - deps: mime-types@~2.1.4 + +4.13.1 / 2015-07-05 +=================== + + * deps: accepts@~1.2.10 + - deps: mime-types@~2.1.2 + * deps: qs@4.0.0 + - Fix dropping parameters like `hasOwnProperty` + - Fix various parsing edge cases + * deps: type-is@~1.6.4 + - deps: mime-types@~2.1.2 + - perf: enable strict mode + - perf: remove argument reassignment + +4.13.0 / 2015-06-20 +=================== + + * Add settings to debug output + * Fix `res.format` error when only `default` provided + * Fix issue where `next('route')` in `app.param` would incorrectly skip values + * Fix hiding platform issues with `decodeURIComponent` + - Only `URIError`s are a 400 + * Fix using `*` before params in routes + * Fix using capture groups before params in routes + * Simplify `res.cookie` to call `res.append` + * Use `array-flatten` module for flattening arrays + * deps: accepts@~1.2.9 + - deps: mime-types@~2.1.1 + - perf: avoid argument reassignment & argument slice + - perf: avoid negotiator recursive construction + - perf: enable strict mode + - perf: remove unnecessary bitwise operator + * deps: cookie@0.1.3 + - perf: deduce the scope of try-catch deopt + - perf: remove argument reassignments + * deps: escape-html@1.0.2 + * deps: etag@~1.7.0 + - Always include entity length in ETags for hash length extensions + - Generate non-Stats ETags using MD5 only (no longer CRC32) + - Improve stat performance by removing hashing + - Improve support for JXcore + - Remove base64 padding in ETags to shorten + - Support "fake" stats objects in environments without fs + - Use MD5 instead of MD4 in weak ETags over 1KB + * deps: finalhandler@0.4.0 + - Fix a false-positive when unpiping in Node.js 0.8 + - Support `statusCode` property on `Error` objects + - Use `unpipe` module for unpiping requests + - deps: escape-html@1.0.2 + - deps: on-finished@~2.3.0 + - perf: enable strict mode + - perf: remove argument reassignment + * deps: fresh@0.3.0 + - Add weak `ETag` matching support + * deps: on-finished@~2.3.0 + - Add defined behavior for HTTP `CONNECT` requests + - Add defined behavior for HTTP `Upgrade` requests + - deps: ee-first@1.1.1 + * deps: path-to-regexp@0.1.6 + * deps: send@0.13.0 + - Allow Node.js HTTP server to set `Date` response header + - Fix incorrectly removing `Content-Location` on 304 response + - Improve the default redirect response headers + - Send appropriate headers on default error response + - Use `http-errors` for standard emitted errors + - Use `statuses` instead of `http` module for status messages + - deps: escape-html@1.0.2 + - deps: etag@~1.7.0 + - deps: fresh@0.3.0 + - deps: 
on-finished@~2.3.0 + - perf: enable strict mode + - perf: remove unnecessary array allocations + * deps: serve-static@~1.10.0 + - Add `fallthrough` option + - Fix reading options from options prototype + - Improve the default redirect response headers + - Malformed URLs now `next()` instead of 400 + - deps: escape-html@1.0.2 + - deps: send@0.13.0 + - perf: enable strict mode + - perf: remove argument reassignment + * deps: type-is@~1.6.3 + - deps: mime-types@~2.1.1 + - perf: reduce try block size + - perf: remove bitwise operations + * perf: enable strict mode + * perf: isolate `app.render` try block + * perf: remove argument reassignments in application + * perf: remove argument reassignments in request prototype + * perf: remove argument reassignments in response prototype + * perf: remove argument reassignments in routing + * perf: remove argument reassignments in `View` + * perf: skip attempting to decode zero length string + * perf: use saved reference to `http.STATUS_CODES` + +4.12.4 / 2015-05-17 +=================== + + * deps: accepts@~1.2.7 + - deps: mime-types@~2.0.11 + - deps: negotiator@0.5.3 + * deps: debug@~2.2.0 + - deps: ms@0.7.1 + * deps: depd@~1.0.1 + * deps: etag@~1.6.0 + - Improve support for JXcore + - Support "fake" stats objects in environments without `fs` + * deps: finalhandler@0.3.6 + - deps: debug@~2.2.0 + - deps: on-finished@~2.2.1 + * deps: on-finished@~2.2.1 + - Fix `isFinished(req)` when data buffered + * deps: proxy-addr@~1.0.8 + - deps: ipaddr.js@1.0.1 + * deps: qs@2.4.2 + - Fix allowing parameters like `constructor` + * deps: send@0.12.3 + - deps: debug@~2.2.0 + - deps: depd@~1.0.1 + - deps: etag@~1.6.0 + - deps: ms@0.7.1 + - deps: on-finished@~2.2.1 + * deps: serve-static@~1.9.3 + - deps: send@0.12.3 + * deps: type-is@~1.6.2 + - deps: mime-types@~2.0.11 + +4.12.3 / 2015-03-17 +=================== + + * deps: accepts@~1.2.5 + - deps: mime-types@~2.0.10 + * deps: debug@~2.1.3 + - Fix high intensity foreground color for bold + - deps: ms@0.7.0 + * deps: finalhandler@0.3.4 + - deps: debug@~2.1.3 + * deps: proxy-addr@~1.0.7 + - deps: ipaddr.js@0.1.9 + * deps: qs@2.4.1 + - Fix error when parameter `hasOwnProperty` is present + * deps: send@0.12.2 + - Throw errors early for invalid `extensions` or `index` options + - deps: debug@~2.1.3 + * deps: serve-static@~1.9.2 + - deps: send@0.12.2 + * deps: type-is@~1.6.1 + - deps: mime-types@~2.0.10 + +4.12.2 / 2015-03-02 +=================== + + * Fix regression where `"Request aborted"` is logged using `res.sendFile` + +4.12.1 / 2015-03-01 +=================== + + * Fix constructing application with non-configurable prototype properties + * Fix `ECONNRESET` errors from `res.sendFile` usage + * Fix `req.host` when using "trust proxy" hops count + * Fix `req.protocol`/`req.secure` when using "trust proxy" hops count + * Fix wrong `code` on aborted connections from `res.sendFile` + * deps: merge-descriptors@1.0.0 + +4.12.0 / 2015-02-23 +=================== + + * Fix `"trust proxy"` setting to inherit when app is mounted + * Generate `ETag`s for all request responses + - No longer restricted to only responses for `GET` and `HEAD` requests + * Use `content-type` to parse `Content-Type` headers + * deps: accepts@~1.2.4 + - Fix preference sorting to be stable for long acceptable lists + - deps: mime-types@~2.0.9 + - deps: negotiator@0.5.1 + * deps: cookie-signature@1.0.6 + * deps: send@0.12.1 + - Always read the stat size from the file + - Fix mutating passed-in `options` + - deps: mime@1.3.4 + * deps: serve-static@~1.9.1 + - 
deps: send@0.12.1 + * deps: type-is@~1.6.0 + - fix argument reassignment + - fix false-positives in `hasBody` `Transfer-Encoding` check + - support wildcard for both type and subtype (`*/*`) + - deps: mime-types@~2.0.9 + +4.11.2 / 2015-02-01 +=================== + + * Fix `res.redirect` double-calling `res.end` for `HEAD` requests + * deps: accepts@~1.2.3 + - deps: mime-types@~2.0.8 + * deps: proxy-addr@~1.0.6 + - deps: ipaddr.js@0.1.8 + * deps: type-is@~1.5.6 + - deps: mime-types@~2.0.8 + +4.11.1 / 2015-01-20 +=================== + + * deps: send@0.11.1 + - Fix root path disclosure + * deps: serve-static@~1.8.1 + - Fix redirect loop in Node.js 0.11.14 + - Fix root path disclosure + - deps: send@0.11.1 + +4.11.0 / 2015-01-13 +=================== + + * Add `res.append(field, val)` to append headers + * Deprecate leading `:` in `name` for `app.param(name, fn)` + * Deprecate `req.param()` -- use `req.params`, `req.body`, or `req.query` instead + * Deprecate `app.param(fn)` + * Fix `OPTIONS` responses to include the `HEAD` method properly + * Fix `res.sendFile` not always detecting aborted connection + * Match routes iteratively to prevent stack overflows + * deps: accepts@~1.2.2 + - deps: mime-types@~2.0.7 + - deps: negotiator@0.5.0 + * deps: send@0.11.0 + - deps: debug@~2.1.1 + - deps: etag@~1.5.1 + - deps: ms@0.7.0 + - deps: on-finished@~2.2.0 + * deps: serve-static@~1.8.0 + - deps: send@0.11.0 + +4.10.8 / 2015-01-13 +=================== + + * Fix crash from error within `OPTIONS` response handler + * deps: proxy-addr@~1.0.5 + - deps: ipaddr.js@0.1.6 + +4.10.7 / 2015-01-04 +=================== + + * Fix `Allow` header for `OPTIONS` to not contain duplicate methods + * Fix incorrect "Request aborted" for `res.sendFile` when `HEAD` or 304 + * deps: debug@~2.1.1 + * deps: finalhandler@0.3.3 + - deps: debug@~2.1.1 + - deps: on-finished@~2.2.0 + * deps: methods@~1.1.1 + * deps: on-finished@~2.2.0 + * deps: serve-static@~1.7.2 + - Fix potential open redirect when mounted at root + * deps: type-is@~1.5.5 + - deps: mime-types@~2.0.7 + +4.10.6 / 2014-12-12 +=================== + + * Fix exception in `req.fresh`/`req.stale` without response headers + +4.10.5 / 2014-12-10 +=================== + + * Fix `res.send` double-calling `res.end` for `HEAD` requests + * deps: accepts@~1.1.4 + - deps: mime-types@~2.0.4 + * deps: type-is@~1.5.4 + - deps: mime-types@~2.0.4 + +4.10.4 / 2014-11-24 +=================== + + * Fix `res.sendfile` logging standard write errors + +4.10.3 / 2014-11-23 +=================== + + * Fix `res.sendFile` logging standard write errors + * deps: etag@~1.5.1 + * deps: proxy-addr@~1.0.4 + - deps: ipaddr.js@0.1.5 + * deps: qs@2.3.3 + - Fix `arrayLimit` behavior + +4.10.2 / 2014-11-09 +=================== + + * Correctly invoke async router callback asynchronously + * deps: accepts@~1.1.3 + - deps: mime-types@~2.0.3 + * deps: type-is@~1.5.3 + - deps: mime-types@~2.0.3 + +4.10.1 / 2014-10-28 +=================== + + * Fix handling of URLs containing `://` in the path + * deps: qs@2.3.2 + - Fix parsing of mixed objects and values + +4.10.0 / 2014-10-23 +=================== + + * Add support for `app.set('views', array)` + - Views are looked up in sequence in array of directories + * Fix `res.send(status)` to mention `res.sendStatus(status)` + * Fix handling of invalid empty URLs + * Use `content-disposition` module for `res.attachment`/`res.download` + - Sends standards-compliant `Content-Disposition` header + - Full Unicode support + * Use `path.resolve` in view lookup + * deps: 
debug@~2.1.0 + - Implement `DEBUG_FD` env variable support + * deps: depd@~1.0.0 + * deps: etag@~1.5.0 + - Improve string performance + - Slightly improve speed for weak ETags over 1KB + * deps: finalhandler@0.3.2 + - Terminate in progress response only on error + - Use `on-finished` to determine request status + - deps: debug@~2.1.0 + - deps: on-finished@~2.1.1 + * deps: on-finished@~2.1.1 + - Fix handling of pipelined requests + * deps: qs@2.3.0 + - Fix parsing of mixed implicit and explicit arrays + * deps: send@0.10.1 + - deps: debug@~2.1.0 + - deps: depd@~1.0.0 + - deps: etag@~1.5.0 + - deps: on-finished@~2.1.1 + * deps: serve-static@~1.7.1 + - deps: send@0.10.1 + +4.9.8 / 2014-10-17 +================== + + * Fix `res.redirect` body when redirect status specified + * deps: accepts@~1.1.2 + - Fix error when media type has invalid parameter + - deps: negotiator@0.4.9 + +4.9.7 / 2014-10-10 +================== + + * Fix using same param name in array of paths + +4.9.6 / 2014-10-08 +================== + + * deps: accepts@~1.1.1 + - deps: mime-types@~2.0.2 + - deps: negotiator@0.4.8 + * deps: serve-static@~1.6.4 + - Fix redirect loop when index file serving disabled + * deps: type-is@~1.5.2 + - deps: mime-types@~2.0.2 + +4.9.5 / 2014-09-24 +================== + + * deps: etag@~1.4.0 + * deps: proxy-addr@~1.0.3 + - Use `forwarded` npm module + * deps: send@0.9.3 + - deps: etag@~1.4.0 + * deps: serve-static@~1.6.3 + - deps: send@0.9.3 + +4.9.4 / 2014-09-19 +================== + + * deps: qs@2.2.4 + - Fix issue with object keys starting with numbers truncated + +4.9.3 / 2014-09-18 +================== + + * deps: proxy-addr@~1.0.2 + - Fix a global leak when multiple subnets are trusted + - deps: ipaddr.js@0.1.3 + +4.9.2 / 2014-09-17 +================== + + * Fix regression for empty string `path` in `app.use` + * Fix `router.use` to accept array of middleware without path + * Improve error message for bad `app.use` arguments + +4.9.1 / 2014-09-16 +================== + + * Fix `app.use` to accept array of middleware without path + * deps: depd@0.4.5 + * deps: etag@~1.3.1 + * deps: send@0.9.2 + - deps: depd@0.4.5 + - deps: etag@~1.3.1 + - deps: range-parser@~1.0.2 + * deps: serve-static@~1.6.2 + - deps: send@0.9.2 + +4.9.0 / 2014-09-08 +================== + + * Add `res.sendStatus` + * Invoke callback for sendfile when client aborts + - Applies to `res.sendFile`, `res.sendfile`, and `res.download` + - `err` will be populated with request aborted error + * Support IP address host in `req.subdomains` + * Use `etag` to generate `ETag` headers + * deps: accepts@~1.1.0 + - update `mime-types` + * deps: cookie-signature@1.0.5 + * deps: debug@~2.0.0 + * deps: finalhandler@0.2.0 + - Set `X-Content-Type-Options: nosniff` header + - deps: debug@~2.0.0 + * deps: fresh@0.2.4 + * deps: media-typer@0.3.0 + - Throw error when parameter format invalid on parse + * deps: qs@2.2.3 + - Fix issue where first empty value in array is discarded + * deps: range-parser@~1.0.2 + * deps: send@0.9.1 + - Add `lastModified` option + - Use `etag` to generate `ETag` header + - deps: debug@~2.0.0 + - deps: fresh@0.2.4 + * deps: serve-static@~1.6.1 + - Add `lastModified` option + - deps: send@0.9.1 + * deps: type-is@~1.5.1 + - fix `hasbody` to be true for `content-length: 0` + - deps: media-typer@0.3.0 + - deps: mime-types@~2.0.1 + * deps: vary@~1.0.0 + - Accept valid `Vary` header string as `field` + +4.8.8 / 2014-09-04 +================== + + * deps: send@0.8.5 + - Fix a path traversal issue when using `root` + - Fix malicious 
path detection for empty string path + * deps: serve-static@~1.5.4 + - deps: send@0.8.5 + +4.8.7 / 2014-08-29 +================== + + * deps: qs@2.2.2 + - Remove unnecessary cloning + +4.8.6 / 2014-08-27 +================== + + * deps: qs@2.2.0 + - Array parsing fix + - Performance improvements + +4.8.5 / 2014-08-18 +================== + + * deps: send@0.8.3 + - deps: destroy@1.0.3 + - deps: on-finished@2.1.0 + * deps: serve-static@~1.5.3 + - deps: send@0.8.3 + +4.8.4 / 2014-08-14 +================== + + * deps: qs@1.2.2 + * deps: send@0.8.2 + - Work around `fd` leak in Node.js 0.10 for `fs.ReadStream` + * deps: serve-static@~1.5.2 + - deps: send@0.8.2 + +4.8.3 / 2014-08-10 +================== + + * deps: parseurl@~1.3.0 + * deps: qs@1.2.1 + * deps: serve-static@~1.5.1 + - Fix parsing of weird `req.originalUrl` values + - deps: parseurl@~1.3.0 + - deps: utils-merge@1.0.0 + +4.8.2 / 2014-08-07 +================== + + * deps: qs@1.2.0 + - Fix parsing array of objects + +4.8.1 / 2014-08-06 +================== + + * fix incorrect deprecation warnings on `res.download` + * deps: qs@1.1.0 + - Accept urlencoded square brackets + - Accept empty values in implicit array notation + +4.8.0 / 2014-08-05 +================== + + * add `res.sendFile` + - accepts a file system path instead of a URL + - requires an absolute path or `root` option specified + * deprecate `res.sendfile` -- use `res.sendFile` instead + * support mounted app as any argument to `app.use()` + * deps: qs@1.0.2 + - Complete rewrite + - Limits array length to 20 + - Limits object depth to 5 + - Limits parameters to 1,000 + * deps: send@0.8.1 + - Add `extensions` option + * deps: serve-static@~1.5.0 + - Add `extensions` option + - deps: send@0.8.1 + +4.7.4 / 2014-08-04 +================== + + * fix `res.sendfile` regression for serving directory index files + * deps: send@0.7.4 + - Fix incorrect 403 on Windows and Node.js 0.11 + - Fix serving index files without root dir + * deps: serve-static@~1.4.4 + - deps: send@0.7.4 + +4.7.3 / 2014-08-04 +================== + + * deps: send@0.7.3 + - Fix incorrect 403 on Windows and Node.js 0.11 + * deps: serve-static@~1.4.3 + - Fix incorrect 403 on Windows and Node.js 0.11 + - deps: send@0.7.3 + +4.7.2 / 2014-07-27 +================== + + * deps: depd@0.4.4 + - Work-around v8 generating empty stack traces + * deps: send@0.7.2 + - deps: depd@0.4.4 + * deps: serve-static@~1.4.2 + +4.7.1 / 2014-07-26 +================== + + * deps: depd@0.4.3 + - Fix exception when global `Error.stackTraceLimit` is too low + * deps: send@0.7.1 + - deps: depd@0.4.3 + * deps: serve-static@~1.4.1 + +4.7.0 / 2014-07-25 +================== + + * fix `req.protocol` for proxy-direct connections + * configurable query parser with `app.set('query parser', parser)` + - `app.set('query parser', 'extended')` parse with "qs" module + - `app.set('query parser', 'simple')` parse with "querystring" core module + - `app.set('query parser', false)` disable query string parsing + - `app.set('query parser', true)` enable simple parsing + * deprecate `res.json(status, obj)` -- use `res.status(status).json(obj)` instead + * deprecate `res.jsonp(status, obj)` -- use `res.status(status).jsonp(obj)` instead + * deprecate `res.send(status, body)` -- use `res.status(status).send(body)` instead + * deps: debug@1.0.4 + * deps: depd@0.4.2 + - Add `TRACE_DEPRECATION` environment variable + - Remove non-standard grey color from color output + - Support `--no-deprecation` argument + - Support `--trace-deprecation` argument + * deps: 
finalhandler@0.1.0 + - Respond after request fully read + - deps: debug@1.0.4 + * deps: parseurl@~1.2.0 + - Cache URLs based on original value + - Remove no-longer-needed URL mis-parse work-around + - Simplify the "fast-path" `RegExp` + * deps: send@0.7.0 + - Add `dotfiles` option + - Cap `maxAge` value to 1 year + - deps: debug@1.0.4 + - deps: depd@0.4.2 + * deps: serve-static@~1.4.0 + - deps: parseurl@~1.2.0 + - deps: send@0.7.0 + * perf: prevent multiple `Buffer` creation in `res.send` + +4.6.1 / 2014-07-12 +================== + + * fix `subapp.mountpath` regression for `app.use(subapp)` + +4.6.0 / 2014-07-11 +================== + + * accept multiple callbacks to `app.use()` + * add explicit "Rosetta Flash JSONP abuse" protection + - previous versions are not vulnerable; this is just explicit protection + * catch errors in multiple `req.param(name, fn)` handlers + * deprecate `res.redirect(url, status)` -- use `res.redirect(status, url)` instead + * fix `res.send(status, num)` to send `num` as json (not error) + * remove unnecessary escaping when `res.jsonp` returns JSON response + * support non-string `path` in `app.use(path, fn)` + - supports array of paths + - supports `RegExp` + * router: fix optimization on router exit + * router: refactor location of `try` blocks + * router: speed up standard `app.use(fn)` + * deps: debug@1.0.3 + - Add support for multiple wildcards in namespaces + * deps: finalhandler@0.0.3 + - deps: debug@1.0.3 + * deps: methods@1.1.0 + - add `CONNECT` + * deps: parseurl@~1.1.3 + - faster parsing of href-only URLs + * deps: path-to-regexp@0.1.3 + * deps: send@0.6.0 + - deps: debug@1.0.3 + * deps: serve-static@~1.3.2 + - deps: parseurl@~1.1.3 + - deps: send@0.6.0 + * perf: fix arguments reassign deopt in some `res` methods + +4.5.1 / 2014-07-06 +================== + + * fix routing regression when altering `req.method` + +4.5.0 / 2014-07-04 +================== + + * add deprecation message to non-plural `req.accepts*` + * add deprecation message to `res.send(body, status)` + * add deprecation message to `res.vary()` + * add `headers` option to `res.sendfile` + - use to set headers on successful file transfer + * add `mergeParams` option to `Router` + - merges `req.params` from parent routes + * add `req.hostname` -- correct name for what `req.host` returns + * deprecate things with `depd` module + * deprecate `req.host` -- use `req.hostname` instead + * fix behavior when handling request without routes + * fix handling when `route.all` is only route + * invoke `router.param()` only when route matches + * restore `req.params` after invoking router + * use `finalhandler` for final response handling + * use `media-typer` to alter content-type charset + * deps: accepts@~1.0.7 + * deps: send@0.5.0 + - Accept string for `maxage` (converted by `ms`) + - Include link in default redirect response + * deps: serve-static@~1.3.0 + - Accept string for `maxAge` (converted by `ms`) + - Add `setHeaders` option + - Include HTML link in redirect response + - deps: send@0.5.0 + * deps: type-is@~1.3.2 + +4.4.5 / 2014-06-26 +================== + + * deps: cookie-signature@1.0.4 + - fix for timing attacks + +4.4.4 / 2014-06-20 +================== + + * fix `res.attachment` Unicode filenames in Safari + * fix "trim prefix" debug message in `express:router` + * deps: accepts@~1.0.5 + * deps: buffer-crc32@0.2.3 + +4.4.3 / 2014-06-11 +================== + + * fix persistence of modified `req.params[name]` from `app.param()` + * deps: accepts@1.0.3 + - deps: negotiator@0.4.6 + * deps: 
debug@1.0.2 + * deps: send@0.4.3 + - Do not throw uncatchable error on file open race condition + - Use `escape-html` for HTML escaping + - deps: debug@1.0.2 + - deps: finished@1.2.2 + - deps: fresh@0.2.2 + * deps: serve-static@1.2.3 + - Do not throw uncatchable error on file open race condition + - deps: send@0.4.3 + +4.4.2 / 2014-06-09 +================== + + * fix catching errors from top-level handlers + * use `vary` module for `res.vary` + * deps: debug@1.0.1 + * deps: proxy-addr@1.0.1 + * deps: send@0.4.2 + - fix "event emitter leak" warnings + - deps: debug@1.0.1 + - deps: finished@1.2.1 + * deps: serve-static@1.2.2 + - fix "event emitter leak" warnings + - deps: send@0.4.2 + * deps: type-is@1.2.1 + +4.4.1 / 2014-06-02 +================== + + * deps: methods@1.0.1 + * deps: send@0.4.1 + - Send `max-age` in `Cache-Control` in correct format + * deps: serve-static@1.2.1 + - use `escape-html` for escaping + - deps: send@0.4.1 + +4.4.0 / 2014-05-30 +================== + + * custom etag control with `app.set('etag', val)` + - `app.set('etag', function(body, encoding){ return '"etag"' })` custom etag generation + - `app.set('etag', 'weak')` weak tag + - `app.set('etag', 'strong')` strong etag + - `app.set('etag', false)` turn off + - `app.set('etag', true)` standard etag + * mark `res.send` ETag as weak and reduce collisions + * update accepts to 1.0.2 + - Fix interpretation when header not in request + * update send to 0.4.0 + - Calculate ETag with md5 for reduced collisions + - Ignore stream errors after request ends + - deps: debug@0.8.1 + * update serve-static to 1.2.0 + - Calculate ETag with md5 for reduced collisions + - Ignore stream errors after request ends + - deps: send@0.4.0 + +4.3.2 / 2014-05-28 +================== + + * fix handling of errors from `router.param()` callbacks + +4.3.1 / 2014-05-23 +================== + + * revert "fix behavior of multiple `app.VERB` for the same path" + - this caused a regression in the order of route execution + +4.3.0 / 2014-05-21 +================== + + * add `req.baseUrl` to access the path stripped from `req.url` in routes + * fix behavior of multiple `app.VERB` for the same path + * fix issue routing requests among sub routers + * invoke `router.param()` only when necessary instead of every match + * proper proxy trust with `app.set('trust proxy', trust)` + - `app.set('trust proxy', 1)` trust first hop + - `app.set('trust proxy', 'loopback')` trust loopback addresses + - `app.set('trust proxy', '10.0.0.1')` trust single IP + - `app.set('trust proxy', '10.0.0.1/16')` trust subnet + - `app.set('trust proxy', '10.0.0.1, 10.0.0.2')` trust list + - `app.set('trust proxy', false)` turn off + - `app.set('trust proxy', true)` trust everything + * set proper `charset` in `Content-Type` for `res.send` + * update type-is to 1.2.0 + - support suffix matching + +4.2.0 / 2014-05-11 +================== + + * deprecate `app.del()` -- use `app.delete()` instead + * deprecate `res.json(obj, status)` -- use `res.json(status, obj)` instead + - the edge-case `res.json(status, num)` requires `res.status(status).json(num)` + * deprecate `res.jsonp(obj, status)` -- use `res.jsonp(status, obj)` instead + - the edge-case `res.jsonp(status, num)` requires `res.status(status).jsonp(num)` + * fix `req.next` when inside router instance + * include `ETag` header in `HEAD` requests + * keep previous `Content-Type` for `res.jsonp` + * support PURGE method + - add `app.purge` + - add `router.purge` + - include PURGE in `app.all` + * update debug to 0.8.0 + - add 
`enable()` method + - change from stderr to stdout + * update methods to 1.0.0 + - add PURGE + +4.1.2 / 2014-05-08 +================== + + * fix `req.host` for IPv6 literals + * fix `res.jsonp` error if callback param is object + +4.1.1 / 2014-04-27 +================== + + * fix package.json to reflect supported node version + +4.1.0 / 2014-04-24 +================== + + * pass options from `res.sendfile` to `send` + * preserve casing of headers in `res.header` and `res.set` + * support unicode file names in `res.attachment` and `res.download` + * update accepts to 1.0.1 + - deps: negotiator@0.4.0 + * update cookie to 0.1.2 + - Fix for maxAge == 0 + - made compat with expires field + * update send to 0.3.0 + - Accept API options in options object + - Coerce option types + - Control whether to generate etags + - Default directory access to 403 when index disabled + - Fix sending files with dots without root set + - Include file path in etag + - Make "Can't set headers after they are sent." catchable + - Send full entity-body for multi range requests + - Set etags to "weak" + - Support "If-Range" header + - Support multiple index paths + - deps: mime@1.2.11 + * update serve-static to 1.1.0 + - Accept options directly to `send` module + - Resolve relative paths at middleware setup + - Use parseurl to parse the URL from request + - deps: send@0.3.0 + * update type-is to 1.1.0 + - add non-array values support + - add `multipart` as a shorthand + +4.0.0 / 2014-04-09 +================== + + * remove: + - node 0.8 support + - connect and connect's patches except for charset handling + - express(1) - moved to [express-generator](https://github.com/expressjs/generator) + - `express.createServer()` - it has been deprecated for a long time. Use `express()` + - `app.configure` - use logic in your own app code + - `app.router` - is removed + - `req.auth` - use `basic-auth` instead + - `req.accepted*` - use `req.accepts*()` instead + - `res.location` - relative URL resolution is removed + - `res.charset` - include the charset in the content type when using `res.set()` + - all bundled middleware except `static` + * change: + - `app.route` -> `app.mountpath` when mounting an express app in another express app + - `json spaces` no longer enabled by default in development + - `req.accepts*` -> `req.accepts*s` - i.e. `req.acceptsEncoding` -> `req.acceptsEncodings` + - `req.params` is now an object instead of an array + - `res.locals` is no longer a function. It is a plain js object. Treat it as such. 
+ - `res.headerSent` -> `res.headersSent` to match node.js ServerResponse object + * refactor: + - `req.accepts*` with [accepts](https://github.com/expressjs/accepts) + - `req.is` with [type-is](https://github.com/expressjs/type-is) + - [path-to-regexp](https://github.com/component/path-to-regexp) + * add: + - `app.router()` - returns the app Router instance + - `app.route()` - Proxy to the app's `Router#route()` method to create a new route + - Router & Route - public API + +3.21.2 / 2015-07-31 +=================== + + * deps: connect@2.30.2 + - deps: body-parser@~1.13.3 + - deps: compression@~1.5.2 + - deps: errorhandler@~1.4.2 + - deps: method-override@~2.3.5 + - deps: serve-index@~1.7.2 + - deps: type-is@~1.6.6 + - deps: vhost@~3.0.1 + * deps: vary@~1.0.1 + - Fix setting empty header from empty `field` + - perf: enable strict mode + - perf: remove argument reassignments + +3.21.1 / 2015-07-05 +=================== + + * deps: basic-auth@~1.0.3 + * deps: connect@2.30.1 + - deps: body-parser@~1.13.2 + - deps: compression@~1.5.1 + - deps: errorhandler@~1.4.1 + - deps: morgan@~1.6.1 + - deps: pause@0.1.0 + - deps: qs@4.0.0 + - deps: serve-index@~1.7.1 + - deps: type-is@~1.6.4 + +3.21.0 / 2015-06-18 +=================== + + * deps: basic-auth@1.0.2 + - perf: enable strict mode + - perf: hoist regular expression + - perf: parse with regular expressions + - perf: remove argument reassignment + * deps: connect@2.30.0 + - deps: body-parser@~1.13.1 + - deps: bytes@2.1.0 + - deps: compression@~1.5.0 + - deps: cookie@0.1.3 + - deps: cookie-parser@~1.3.5 + - deps: csurf@~1.8.3 + - deps: errorhandler@~1.4.0 + - deps: express-session@~1.11.3 + - deps: finalhandler@0.4.0 + - deps: fresh@0.3.0 + - deps: morgan@~1.6.0 + - deps: serve-favicon@~2.3.0 + - deps: serve-index@~1.7.0 + - deps: serve-static@~1.10.0 + - deps: type-is@~1.6.3 + * deps: cookie@0.1.3 + - perf: deduce the scope of try-catch deopt + - perf: remove argument reassignments + * deps: escape-html@1.0.2 + * deps: etag@~1.7.0 + - Always include entity length in ETags for hash length extensions + - Generate non-Stats ETags using MD5 only (no longer CRC32) + - Improve stat performance by removing hashing + - Improve support for JXcore + - Remove base64 padding in ETags to shorten + - Support "fake" stats objects in environments without fs + - Use MD5 instead of MD4 in weak ETags over 1KB + * deps: fresh@0.3.0 + - Add weak `ETag` matching support + * deps: mkdirp@0.5.1 + - Work in global strict mode + * deps: send@0.13.0 + - Allow Node.js HTTP server to set `Date` response header + - Fix incorrectly removing `Content-Location` on 304 response + - Improve the default redirect response headers + - Send appropriate headers on default error response + - Use `http-errors` for standard emitted errors + - Use `statuses` instead of `http` module for status messages + - deps: escape-html@1.0.2 + - deps: etag@~1.7.0 + - deps: fresh@0.3.0 + - deps: on-finished@~2.3.0 + - perf: enable strict mode + - perf: remove unnecessary array allocations + +3.20.3 / 2015-05-17 +=================== + + * deps: connect@2.29.2 + - deps: body-parser@~1.12.4 + - deps: compression@~1.4.4 + - deps: connect-timeout@~1.6.2 + - deps: debug@~2.2.0 + - deps: depd@~1.0.1 + - deps: errorhandler@~1.3.6 + - deps: finalhandler@0.3.6 + - deps: method-override@~2.3.3 + - deps: morgan@~1.5.3 + - deps: qs@2.4.2 + - deps: response-time@~2.3.1 + - deps: serve-favicon@~2.2.1 + - deps: serve-index@~1.6.4 + - deps: serve-static@~1.9.3 + - deps: type-is@~1.6.2 + * deps: debug@~2.2.0 + - deps: 
ms@0.7.1 + * deps: depd@~1.0.1 + * deps: proxy-addr@~1.0.8 + - deps: ipaddr.js@1.0.1 + * deps: send@0.12.3 + - deps: debug@~2.2.0 + - deps: depd@~1.0.1 + - deps: etag@~1.6.0 + - deps: ms@0.7.1 + - deps: on-finished@~2.2.1 + +3.20.2 / 2015-03-16 +=================== + + * deps: connect@2.29.1 + - deps: body-parser@~1.12.2 + - deps: compression@~1.4.3 + - deps: connect-timeout@~1.6.1 + - deps: debug@~2.1.3 + - deps: errorhandler@~1.3.5 + - deps: express-session@~1.10.4 + - deps: finalhandler@0.3.4 + - deps: method-override@~2.3.2 + - deps: morgan@~1.5.2 + - deps: qs@2.4.1 + - deps: serve-index@~1.6.3 + - deps: serve-static@~1.9.2 + - deps: type-is@~1.6.1 + * deps: debug@~2.1.3 + - Fix high intensity foreground color for bold + - deps: ms@0.7.0 + * deps: merge-descriptors@1.0.0 + * deps: proxy-addr@~1.0.7 + - deps: ipaddr.js@0.1.9 + * deps: send@0.12.2 + - Throw errors early for invalid `extensions` or `index` options + - deps: debug@~2.1.3 + +3.20.1 / 2015-02-28 +=================== + + * Fix `req.host` when using "trust proxy" hops count + * Fix `req.protocol`/`req.secure` when using "trust proxy" hops count + +3.20.0 / 2015-02-18 +=================== + + * Fix `"trust proxy"` setting to inherit when app is mounted + * Generate `ETag`s for all request responses + - No longer restricted to only responses for `GET` and `HEAD` requests + * Use `content-type` to parse `Content-Type` headers + * deps: connect@2.29.0 + - Use `content-type` to parse `Content-Type` headers + - deps: body-parser@~1.12.0 + - deps: compression@~1.4.1 + - deps: connect-timeout@~1.6.0 + - deps: cookie-parser@~1.3.4 + - deps: cookie-signature@1.0.6 + - deps: csurf@~1.7.0 + - deps: errorhandler@~1.3.4 + - deps: express-session@~1.10.3 + - deps: http-errors@~1.3.1 + - deps: response-time@~2.3.0 + - deps: serve-index@~1.6.2 + - deps: serve-static@~1.9.1 + - deps: type-is@~1.6.0 + * deps: cookie-signature@1.0.6 + * deps: send@0.12.1 + - Always read the stat size from the file + - Fix mutating passed-in `options` + - deps: mime@1.3.4 + +3.19.2 / 2015-02-01 +=================== + + * deps: connect@2.28.3 + - deps: compression@~1.3.1 + - deps: csurf@~1.6.6 + - deps: errorhandler@~1.3.3 + - deps: express-session@~1.10.2 + - deps: serve-index@~1.6.1 + - deps: type-is@~1.5.6 + * deps: proxy-addr@~1.0.6 + - deps: ipaddr.js@0.1.8 + +3.19.1 / 2015-01-20 +=================== + + * deps: connect@2.28.2 + - deps: body-parser@~1.10.2 + - deps: serve-static@~1.8.1 + * deps: send@0.11.1 + - Fix root path disclosure + +3.19.0 / 2015-01-09 +=================== + + * Fix `OPTIONS` responses to include the `HEAD` method property + * Use `readline` for prompt in `express(1)` + * deps: commander@2.6.0 + * deps: connect@2.28.1 + - deps: body-parser@~1.10.1 + - deps: compression@~1.3.0 + - deps: connect-timeout@~1.5.0 + - deps: csurf@~1.6.4 + - deps: debug@~2.1.1 + - deps: errorhandler@~1.3.2 + - deps: express-session@~1.10.1 + - deps: finalhandler@0.3.3 + - deps: method-override@~2.3.1 + - deps: morgan@~1.5.1 + - deps: serve-favicon@~2.2.0 + - deps: serve-index@~1.6.0 + - deps: serve-static@~1.8.0 + - deps: type-is@~1.5.5 + * deps: debug@~2.1.1 + * deps: methods@~1.1.1 + * deps: proxy-addr@~1.0.5 + - deps: ipaddr.js@0.1.6 + * deps: send@0.11.0 + - deps: debug@~2.1.1 + - deps: etag@~1.5.1 + - deps: ms@0.7.0 + - deps: on-finished@~2.2.0 + +3.18.6 / 2014-12-12 +=================== + + * Fix exception in `req.fresh`/`req.stale` without response headers + +3.18.5 / 2014-12-11 +=================== + + * deps: connect@2.27.6 + - deps: compression@~1.2.2 
+ - deps: express-session@~1.9.3 + - deps: http-errors@~1.2.8 + - deps: serve-index@~1.5.3 + - deps: type-is@~1.5.4 + +3.18.4 / 2014-11-23 +=================== + + * deps: connect@2.27.4 + - deps: body-parser@~1.9.3 + - deps: compression@~1.2.1 + - deps: errorhandler@~1.2.3 + - deps: express-session@~1.9.2 + - deps: qs@2.3.3 + - deps: serve-favicon@~2.1.7 + - deps: serve-static@~1.5.1 + - deps: type-is@~1.5.3 + * deps: etag@~1.5.1 + * deps: proxy-addr@~1.0.4 + - deps: ipaddr.js@0.1.5 + +3.18.3 / 2014-11-09 +=================== + + * deps: connect@2.27.3 + - Correctly invoke async callback asynchronously + - deps: csurf@~1.6.3 + +3.18.2 / 2014-10-28 +=================== + + * deps: connect@2.27.2 + - Fix handling of URLs containing `://` in the path + - deps: body-parser@~1.9.2 + - deps: qs@2.3.2 + +3.18.1 / 2014-10-22 +=================== + + * Fix internal `utils.merge` deprecation warnings + * deps: connect@2.27.1 + - deps: body-parser@~1.9.1 + - deps: express-session@~1.9.1 + - deps: finalhandler@0.3.2 + - deps: morgan@~1.4.1 + - deps: qs@2.3.0 + - deps: serve-static@~1.7.1 + * deps: send@0.10.1 + - deps: on-finished@~2.1.1 + +3.18.0 / 2014-10-17 +=================== + + * Use `content-disposition` module for `res.attachment`/`res.download` + - Sends standards-compliant `Content-Disposition` header + - Full Unicode support + * Use `etag` module to generate `ETag` headers + * deps: connect@2.27.0 + - Use `http-errors` module for creating errors + - Use `utils-merge` module for merging objects + - deps: body-parser@~1.9.0 + - deps: compression@~1.2.0 + - deps: connect-timeout@~1.4.0 + - deps: debug@~2.1.0 + - deps: depd@~1.0.0 + - deps: express-session@~1.9.0 + - deps: finalhandler@0.3.1 + - deps: method-override@~2.3.0 + - deps: morgan@~1.4.0 + - deps: response-time@~2.2.0 + - deps: serve-favicon@~2.1.6 + - deps: serve-index@~1.5.0 + - deps: serve-static@~1.7.0 + * deps: debug@~2.1.0 + - Implement `DEBUG_FD` env variable support + * deps: depd@~1.0.0 + * deps: send@0.10.0 + - deps: debug@~2.1.0 + - deps: depd@~1.0.0 + - deps: etag@~1.5.0 + +3.17.8 / 2014-10-15 +=================== + + * deps: connect@2.26.6 + - deps: compression@~1.1.2 + - deps: csurf@~1.6.2 + - deps: errorhandler@~1.2.2 + +3.17.7 / 2014-10-08 +=================== + + * deps: connect@2.26.5 + - Fix accepting non-object arguments to `logger` + - deps: serve-static@~1.6.4 + +3.17.6 / 2014-10-02 +=================== + + * deps: connect@2.26.4 + - deps: morgan@~1.3.2 + - deps: type-is@~1.5.2 + +3.17.5 / 2014-09-24 +=================== + + * deps: connect@2.26.3 + - deps: body-parser@~1.8.4 + - deps: serve-favicon@~2.1.5 + - deps: serve-static@~1.6.3 + * deps: proxy-addr@~1.0.3 + - Use `forwarded` npm module + * deps: send@0.9.3 + - deps: etag@~1.4.0 + +3.17.4 / 2014-09-19 +=================== + + * deps: connect@2.26.2 + - deps: body-parser@~1.8.3 + - deps: qs@2.2.4 + +3.17.3 / 2014-09-18 +=================== + + * deps: proxy-addr@~1.0.2 + - Fix a global leak when multiple subnets are trusted + - deps: ipaddr.js@0.1.3 + +3.17.2 / 2014-09-15 +=================== + + * Use `crc` instead of `buffer-crc32` for speed + * deps: connect@2.26.1 + - deps: body-parser@~1.8.2 + - deps: depd@0.4.5 + - deps: express-session@~1.8.2 + - deps: morgan@~1.3.1 + - deps: serve-favicon@~2.1.3 + - deps: serve-static@~1.6.2 + * deps: depd@0.4.5 + * deps: send@0.9.2 + - deps: depd@0.4.5 + - deps: etag@~1.3.1 + - deps: range-parser@~1.0.2 + +3.17.1 / 2014-09-08 +=================== + + * Fix error in `req.subdomains` on empty host + +3.17.0 / 
2014-09-08 +=================== + + * Support `X-Forwarded-Host` in `req.subdomains` + * Support IP address host in `req.subdomains` + * deps: connect@2.26.0 + - deps: body-parser@~1.8.1 + - deps: compression@~1.1.0 + - deps: connect-timeout@~1.3.0 + - deps: cookie-parser@~1.3.3 + - deps: cookie-signature@1.0.5 + - deps: csurf@~1.6.1 + - deps: debug@~2.0.0 + - deps: errorhandler@~1.2.0 + - deps: express-session@~1.8.1 + - deps: finalhandler@0.2.0 + - deps: fresh@0.2.4 + - deps: media-typer@0.3.0 + - deps: method-override@~2.2.0 + - deps: morgan@~1.3.0 + - deps: qs@2.2.3 + - deps: serve-favicon@~2.1.3 + - deps: serve-index@~1.2.1 + - deps: serve-static@~1.6.1 + - deps: type-is@~1.5.1 + - deps: vhost@~3.0.0 + * deps: cookie-signature@1.0.5 + * deps: debug@~2.0.0 + * deps: fresh@0.2.4 + * deps: media-typer@0.3.0 + - Throw error when parameter format invalid on parse + * deps: range-parser@~1.0.2 + * deps: send@0.9.1 + - Add `lastModified` option + - Use `etag` to generate `ETag` header + - deps: debug@~2.0.0 + - deps: fresh@0.2.4 + * deps: vary@~1.0.0 + - Accept valid `Vary` header string as `field` + +3.16.10 / 2014-09-04 +==================== + + * deps: connect@2.25.10 + - deps: serve-static@~1.5.4 + * deps: send@0.8.5 + - Fix a path traversal issue when using `root` + - Fix malicious path detection for empty string path + +3.16.9 / 2014-08-29 +=================== + + * deps: connect@2.25.9 + - deps: body-parser@~1.6.7 + - deps: qs@2.2.2 + +3.16.8 / 2014-08-27 +=================== + + * deps: connect@2.25.8 + - deps: body-parser@~1.6.6 + - deps: csurf@~1.4.1 + - deps: qs@2.2.0 + +3.16.7 / 2014-08-18 +=================== + + * deps: connect@2.25.7 + - deps: body-parser@~1.6.5 + - deps: express-session@~1.7.6 + - deps: morgan@~1.2.3 + - deps: serve-static@~1.5.3 + * deps: send@0.8.3 + - deps: destroy@1.0.3 + - deps: on-finished@2.1.0 + +3.16.6 / 2014-08-14 +=================== + + * deps: connect@2.25.6 + - deps: body-parser@~1.6.4 + - deps: qs@1.2.2 + - deps: serve-static@~1.5.2 + * deps: send@0.8.2 + - Work around `fd` leak in Node.js 0.10 for `fs.ReadStream` + +3.16.5 / 2014-08-11 +=================== + + * deps: connect@2.25.5 + - Fix backwards compatibility in `logger` + +3.16.4 / 2014-08-10 +=================== + + * Fix original URL parsing in `res.location` + * deps: connect@2.25.4 + - Fix `query` middleware breaking with argument + - deps: body-parser@~1.6.3 + - deps: compression@~1.0.11 + - deps: connect-timeout@~1.2.2 + - deps: express-session@~1.7.5 + - deps: method-override@~2.1.3 + - deps: on-headers@~1.0.0 + - deps: parseurl@~1.3.0 + - deps: qs@1.2.1 + - deps: response-time@~2.0.1 + - deps: serve-index@~1.1.6 + - deps: serve-static@~1.5.1 + * deps: parseurl@~1.3.0 + +3.16.3 / 2014-08-07 +=================== + + * deps: connect@2.25.3 + - deps: multiparty@3.3.2 + +3.16.2 / 2014-08-07 +=================== + + * deps: connect@2.25.2 + - deps: body-parser@~1.6.2 + - deps: qs@1.2.0 + +3.16.1 / 2014-08-06 +=================== + + * deps: connect@2.25.1 + - deps: body-parser@~1.6.1 + - deps: qs@1.1.0 + +3.16.0 / 2014-08-05 +=================== + + * deps: connect@2.25.0 + - deps: body-parser@~1.6.0 + - deps: compression@~1.0.10 + - deps: csurf@~1.4.0 + - deps: express-session@~1.7.4 + - deps: qs@1.0.2 + - deps: serve-static@~1.5.0 + * deps: send@0.8.1 + - Add `extensions` option + +3.15.3 / 2014-08-04 +=================== + + * fix `res.sendfile` regression for serving directory index files + * deps: connect@2.24.3 + - deps: serve-index@~1.1.5 + - deps: serve-static@~1.4.4 + * deps: 
send@0.7.4 + - Fix incorrect 403 on Windows and Node.js 0.11 + - Fix serving index files without root dir + +3.15.2 / 2014-07-27 +=================== + + * deps: connect@2.24.2 + - deps: body-parser@~1.5.2 + - deps: depd@0.4.4 + - deps: express-session@~1.7.2 + - deps: morgan@~1.2.2 + - deps: serve-static@~1.4.2 + * deps: depd@0.4.4 + - Work-around v8 generating empty stack traces + * deps: send@0.7.2 + - deps: depd@0.4.4 + +3.15.1 / 2014-07-26 +=================== + + * deps: connect@2.24.1 + - deps: body-parser@~1.5.1 + - deps: depd@0.4.3 + - deps: express-session@~1.7.1 + - deps: morgan@~1.2.1 + - deps: serve-index@~1.1.4 + - deps: serve-static@~1.4.1 + * deps: depd@0.4.3 + - Fix exception when global `Error.stackTraceLimit` is too low + * deps: send@0.7.1 + - deps: depd@0.4.3 + +3.15.0 / 2014-07-22 +=================== + + * Fix `req.protocol` for proxy-direct connections + * Pass options from `res.sendfile` to `send` + * deps: connect@2.24.0 + - deps: body-parser@~1.5.0 + - deps: compression@~1.0.9 + - deps: connect-timeout@~1.2.1 + - deps: debug@1.0.4 + - deps: depd@0.4.2 + - deps: express-session@~1.7.0 + - deps: finalhandler@0.1.0 + - deps: method-override@~2.1.2 + - deps: morgan@~1.2.0 + - deps: multiparty@3.3.1 + - deps: parseurl@~1.2.0 + - deps: serve-static@~1.4.0 + * deps: debug@1.0.4 + * deps: depd@0.4.2 + - Add `TRACE_DEPRECATION` environment variable + - Remove non-standard grey color from color output + - Support `--no-deprecation` argument + - Support `--trace-deprecation` argument + * deps: parseurl@~1.2.0 + - Cache URLs based on original value + - Remove no-longer-needed URL mis-parse work-around + - Simplify the "fast-path" `RegExp` + * deps: send@0.7.0 + - Add `dotfiles` option + - Cap `maxAge` value to 1 year + - deps: debug@1.0.4 + - deps: depd@0.4.2 + +3.14.0 / 2014-07-11 +=================== + + * add explicit "Rosetta Flash JSONP abuse" protection + - previous versions are not vulnerable; this is just explicit protection + * deprecate `res.redirect(url, status)` -- use `res.redirect(status, url)` instead + * fix `res.send(status, num)` to send `num` as json (not error) + * remove unnecessary escaping when `res.jsonp` returns JSON response + * deps: basic-auth@1.0.0 + - support empty password + - support empty username + * deps: connect@2.23.0 + - deps: debug@1.0.3 + - deps: express-session@~1.6.4 + - deps: method-override@~2.1.0 + - deps: parseurl@~1.1.3 + - deps: serve-static@~1.3.1 + * deps: debug@1.0.3 + - Add support for multiple wildcards in namespaces + * deps: methods@1.1.0 + - add `CONNECT` + * deps: parseurl@~1.1.3 + - faster parsing of href-only URLs + +3.13.0 / 2014-07-03 +=================== + + * add deprecation message to `app.configure` + * add deprecation message to `req.auth` + * use `basic-auth` to parse `Authorization` header + * deps: connect@2.22.0 + - deps: csurf@~1.3.0 + - deps: express-session@~1.6.1 + - deps: multiparty@3.3.0 + - deps: serve-static@~1.3.0 + * deps: send@0.5.0 + - Accept string for `maxage` (converted by `ms`) + - Include link in default redirect response + +3.12.1 / 2014-06-26 +=================== + + * deps: connect@2.21.1 + - deps: cookie-parser@1.3.2 + - deps: cookie-signature@1.0.4 + - deps: express-session@~1.5.2 + - deps: type-is@~1.3.2 + * deps: cookie-signature@1.0.4 + - fix for timing attacks + +3.12.0 / 2014-06-21 +=================== + + * use `media-typer` to alter content-type charset + * deps: connect@2.21.0 + - deprecate `connect(middleware)` -- use `app.use(middleware)` instead + - deprecate 
`connect.createServer()` -- use `connect()` instead + - fix `res.setHeader()` patch to work with with get -> append -> set pattern + - deps: compression@~1.0.8 + - deps: errorhandler@~1.1.1 + - deps: express-session@~1.5.0 + - deps: serve-index@~1.1.3 + +3.11.0 / 2014-06-19 +=================== + + * deprecate things with `depd` module + * deps: buffer-crc32@0.2.3 + * deps: connect@2.20.2 + - deprecate `verify` option to `json` -- use `body-parser` npm module instead + - deprecate `verify` option to `urlencoded` -- use `body-parser` npm module instead + - deprecate things with `depd` module + - use `finalhandler` for final response handling + - use `media-typer` to parse `content-type` for charset + - deps: body-parser@1.4.3 + - deps: connect-timeout@1.1.1 + - deps: cookie-parser@1.3.1 + - deps: csurf@1.2.2 + - deps: errorhandler@1.1.0 + - deps: express-session@1.4.0 + - deps: multiparty@3.2.9 + - deps: serve-index@1.1.2 + - deps: type-is@1.3.1 + - deps: vhost@2.0.0 + +3.10.5 / 2014-06-11 +=================== + + * deps: connect@2.19.6 + - deps: body-parser@1.3.1 + - deps: compression@1.0.7 + - deps: debug@1.0.2 + - deps: serve-index@1.1.1 + - deps: serve-static@1.2.3 + * deps: debug@1.0.2 + * deps: send@0.4.3 + - Do not throw uncatchable error on file open race condition + - Use `escape-html` for HTML escaping + - deps: debug@1.0.2 + - deps: finished@1.2.2 + - deps: fresh@0.2.2 + +3.10.4 / 2014-06-09 +=================== + + * deps: connect@2.19.5 + - fix "event emitter leak" warnings + - deps: csurf@1.2.1 + - deps: debug@1.0.1 + - deps: serve-static@1.2.2 + - deps: type-is@1.2.1 + * deps: debug@1.0.1 + * deps: send@0.4.2 + - fix "event emitter leak" warnings + - deps: finished@1.2.1 + - deps: debug@1.0.1 + +3.10.3 / 2014-06-05 +=================== + + * use `vary` module for `res.vary` + * deps: connect@2.19.4 + - deps: errorhandler@1.0.2 + - deps: method-override@2.0.2 + - deps: serve-favicon@2.0.1 + * deps: debug@1.0.0 + +3.10.2 / 2014-06-03 +=================== + + * deps: connect@2.19.3 + - deps: compression@1.0.6 + +3.10.1 / 2014-06-03 +=================== + + * deps: connect@2.19.2 + - deps: compression@1.0.4 + * deps: proxy-addr@1.0.1 + +3.10.0 / 2014-06-02 +=================== + + * deps: connect@2.19.1 + - deprecate `methodOverride()` -- use `method-override` npm module instead + - deps: body-parser@1.3.0 + - deps: method-override@2.0.1 + - deps: multiparty@3.2.8 + - deps: response-time@2.0.0 + - deps: serve-static@1.2.1 + * deps: methods@1.0.1 + * deps: send@0.4.1 + - Send `max-age` in `Cache-Control` in correct format + +3.9.0 / 2014-05-30 +================== + + * custom etag control with `app.set('etag', val)` + - `app.set('etag', function(body, encoding){ return '"etag"' })` custom etag generation + - `app.set('etag', 'weak')` weak tag + - `app.set('etag', 'strong')` strong etag + - `app.set('etag', false)` turn off + - `app.set('etag', true)` standard etag + * Include ETag in HEAD requests + * mark `res.send` ETag as weak and reduce collisions + * update connect to 2.18.0 + - deps: compression@1.0.3 + - deps: serve-index@1.1.0 + - deps: serve-static@1.2.0 + * update send to 0.4.0 + - Calculate ETag with md5 for reduced collisions + - Ignore stream errors after request ends + - deps: debug@0.8.1 + +3.8.1 / 2014-05-27 +================== + + * update connect to 2.17.3 + - deps: body-parser@1.2.2 + - deps: express-session@1.2.1 + - deps: method-override@1.0.2 + +3.8.0 / 2014-05-21 +================== + + * keep previous `Content-Type` for `res.jsonp` + * set proper `charset` 
in `Content-Type` for `res.send` + * update connect to 2.17.1 + - fix `res.charset` appending charset when `content-type` has one + - deps: express-session@1.2.0 + - deps: morgan@1.1.1 + - deps: serve-index@1.0.3 + +3.7.0 / 2014-05-18 +================== + + * proper proxy trust with `app.set('trust proxy', trust)` + - `app.set('trust proxy', 1)` trust first hop + - `app.set('trust proxy', 'loopback')` trust loopback addresses + - `app.set('trust proxy', '10.0.0.1')` trust single IP + - `app.set('trust proxy', '10.0.0.1/16')` trust subnet + - `app.set('trust proxy', '10.0.0.1, 10.0.0.2')` trust list + - `app.set('trust proxy', false)` turn off + - `app.set('trust proxy', true)` trust everything + * update connect to 2.16.2 + - deprecate `res.headerSent` -- use `res.headersSent` + - deprecate `res.on("header")` -- use on-headers module instead + - fix edge-case in `res.appendHeader` that would append in wrong order + - json: use body-parser + - urlencoded: use body-parser + - dep: bytes@1.0.0 + - dep: cookie-parser@1.1.0 + - dep: csurf@1.2.0 + - dep: express-session@1.1.0 + - dep: method-override@1.0.1 + +3.6.0 / 2014-05-09 +================== + + * deprecate `app.del()` -- use `app.delete()` instead + * deprecate `res.json(obj, status)` -- use `res.json(status, obj)` instead + - the edge-case `res.json(status, num)` requires `res.status(status).json(num)` + * deprecate `res.jsonp(obj, status)` -- use `res.jsonp(status, obj)` instead + - the edge-case `res.jsonp(status, num)` requires `res.status(status).jsonp(num)` + * support PURGE method + - add `app.purge` + - add `router.purge` + - include PURGE in `app.all` + * update connect to 2.15.0 + * Add `res.appendHeader` + * Call error stack even when response has been sent + * Patch `res.headerSent` to return Boolean + * Patch `res.headersSent` for node.js 0.8 + * Prevent default 404 handler after response sent + * dep: compression@1.0.2 + * dep: connect-timeout@1.1.0 + * dep: debug@^0.8.0 + * dep: errorhandler@1.0.1 + * dep: express-session@1.0.4 + * dep: morgan@1.0.1 + * dep: serve-favicon@2.0.0 + * dep: serve-index@1.0.2 + * update debug to 0.8.0 + * add `enable()` method + * change from stderr to stdout + * update methods to 1.0.0 + - add PURGE + * update mkdirp to 0.5.0 + +3.5.3 / 2014-05-08 +================== + + * fix `req.host` for IPv6 literals + * fix `res.jsonp` error if callback param is object + +3.5.2 / 2014-04-24 +================== + + * update connect to 2.14.5 + * update cookie to 0.1.2 + * update mkdirp to 0.4.0 + * update send to 0.3.0 + +3.5.1 / 2014-03-25 +================== + + * pin less-middleware in generated app + +3.5.0 / 2014-03-06 +================== + + * bump deps + +3.4.8 / 2014-01-13 +================== + + * prevent incorrect automatic OPTIONS responses #1868 @dpatti + * update binary and examples for jade 1.0 #1876 @yossi, #1877 @reqshark, #1892 @matheusazzi + * throw 400 in case of malformed paths @rlidwka + +3.4.7 / 2013-12-10 +================== + + * update connect + +3.4.6 / 2013-12-01 +================== + + * update connect (raw-body) + +3.4.5 / 2013-11-27 +================== + + * update connect + * res.location: remove leading ./ #1802 @kapouer + * res.redirect: fix `res.redirect('toString') #1829 @michaelficarra + * res.send: always send ETag when content-length > 0 + * router: add Router.all() method + +3.4.4 / 2013-10-29 +================== + + * update connect + * update supertest + * update methods + * express(1): replace bodyParser() with urlencoded() and json() #1795 @chirag04 + +3.4.3 / 
2013-10-23 +================== + + * update connect + +3.4.2 / 2013-10-18 +================== + + * update connect + * downgrade commander + +3.4.1 / 2013-10-15 +================== + + * update connect + * update commander + * jsonp: check if callback is a function + * router: wrap encodeURIComponent in a try/catch #1735 (@lxe) + * res.format: now includes charset @1747 (@sorribas) + * res.links: allow multiple calls @1746 (@sorribas) + +3.4.0 / 2013-09-07 +================== + + * add res.vary(). Closes #1682 + * update connect + +3.3.8 / 2013-09-02 +================== + + * update connect + +3.3.7 / 2013-08-28 +================== + + * update connect + +3.3.6 / 2013-08-27 +================== + + * Revert "remove charset from json responses. Closes #1631" (causes issues in some clients) + * add: req.accepts take an argument list + +3.3.4 / 2013-07-08 +================== + + * update send and connect + +3.3.3 / 2013-07-04 +================== + + * update connect + +3.3.2 / 2013-07-03 +================== + + * update connect + * update send + * remove .version export + +3.3.1 / 2013-06-27 +================== + + * update connect + +3.3.0 / 2013-06-26 +================== + + * update connect + * add support for multiple X-Forwarded-Proto values. Closes #1646 + * change: remove charset from json responses. Closes #1631 + * change: return actual booleans from req.accept* functions + * fix jsonp callback array throw + +3.2.6 / 2013-06-02 +================== + + * update connect + +3.2.5 / 2013-05-21 +================== + + * update connect + * update node-cookie + * add: throw a meaningful error when there is no default engine + * change generation of ETags with res.send() to GET requests only. Closes #1619 + +3.2.4 / 2013-05-09 +================== + + * fix `req.subdomains` when no Host is present + * fix `req.host` when no Host is present, return undefined + +3.2.3 / 2013-05-07 +================== + + * update connect / qs + +3.2.2 / 2013-05-03 +================== + + * update qs + +3.2.1 / 2013-04-29 +================== + + * add app.VERB() paths array deprecation warning + * update connect + * update qs and remove all ~ semver crap + * fix: accept number as value of Signed Cookie + +3.2.0 / 2013-04-15 +================== + + * add "view" constructor setting to override view behaviour + * add req.acceptsEncoding(name) + * add req.acceptedEncodings + * revert cookie signature change causing session race conditions + * fix sorting of Accept values of the same quality + +3.1.2 / 2013-04-12 +================== + + * add support for custom Accept parameters + * update cookie-signature + +3.1.1 / 2013-04-01 +================== + + * add X-Forwarded-Host support to `req.host` + * fix relative redirects + * update mkdirp + * update buffer-crc32 + * remove legacy app.configure() method from app template. + +3.1.0 / 2013-01-25 +================== + + * add support for leading "." in "view engine" setting + * add array support to `res.set()` + * add node 0.8.x to travis.yml + * add "subdomain offset" setting for tweaking `req.subdomains` + * add `res.location(url)` implementing `res.redirect()`-like setting of Location + * use app.get() for x-powered-by setting for inheritance + * fix colons in passwords for `req.auth` + +3.0.6 / 2013-01-04 +================== + + * add http verb methods to Router + * update connect + * fix mangling of the `res.cookie()` options object + * fix jsonp whitespace escape. 
Closes #1132 + +3.0.5 / 2012-12-19 +================== + + * add throwing when a non-function is passed to a route + * fix: explicitly remove Transfer-Encoding header from 204 and 304 responses + * revert "add 'etag' option" + +3.0.4 / 2012-12-05 +================== + + * add 'etag' option to disable `res.send()` Etags + * add escaping of urls in text/plain in `res.redirect()` + for old browsers interpreting as html + * change crc32 module for a more liberal license + * update connect + +3.0.3 / 2012-11-13 +================== + + * update connect + * update cookie module + * fix cookie max-age + +3.0.2 / 2012-11-08 +================== + + * add OPTIONS to cors example. Closes #1398 + * fix route chaining regression. Closes #1397 + +3.0.1 / 2012-11-01 +================== + + * update connect + +3.0.0 / 2012-10-23 +================== + + * add `make clean` + * add "Basic" check to req.auth + * add `req.auth` test coverage + * add cb && cb(payload) to `res.jsonp()`. Closes #1374 + * add backwards compat for `res.redirect()` status. Closes #1336 + * add support for `res.json()` to retain previously defined Content-Types. Closes #1349 + * update connect + * change `res.redirect()` to utilize a pathname-relative Location again. Closes #1382 + * remove non-primitive string support for `res.send()` + * fix view-locals example. Closes #1370 + * fix route-separation example + +3.0.0rc5 / 2012-09-18 +================== + + * update connect + * add redis search example + * add static-files example + * add "x-powered-by" setting (`app.disable('x-powered-by')`) + * add "application/octet-stream" redirect Accept test case. Closes #1317 + +3.0.0rc4 / 2012-08-30 +================== + + * add `res.jsonp()`. Closes #1307 + * add "verbose errors" option to error-pages example + * add another route example to express(1) so people are not so confused + * add redis online user activity tracking example + * update connect dep + * fix etag quoting. Closes #1310 + * fix error-pages 404 status + * fix jsonp callback char restrictions + * remove old OPTIONS default response + +3.0.0rc3 / 2012-08-13 +================== + + * update connect dep + * fix signed cookies to work with `connect.cookieParser()` ("s:" prefix was missing) [tnydwrds] + * fix `res.render()` clobbering of "locals" + +3.0.0rc2 / 2012-08-03 +================== + + * add CORS example + * update connect dep + * deprecate `.createServer()` & remove old stale examples + * fix: escape `res.redirect()` link + * fix vhost example + +3.0.0rc1 / 2012-07-24 +================== + + * add more examples to view-locals + * add scheme-relative redirects (`res.redirect("//foo.com")`) support + * update cookie dep + * update connect dep + * update send dep + * fix `express(1)` -h flag, use -H for hogan. Closes #1245 + * fix `res.sendfile()` socket error handling regression + +3.0.0beta7 / 2012-07-16 +================== + + * update connect dep for `send()` root normalization regression + +3.0.0beta6 / 2012-07-13 +================== + + * add `err.view` property for view errors. 
Closes #1226 + * add "jsonp callback name" setting + * add support for "/foo/:bar*" non-greedy matches + * change `res.sendfile()` to use `send()` module + * change `res.send` to use "response-send" module + * remove `app.locals.use` and `res.locals.use`, use regular middleware + +3.0.0beta5 / 2012-07-03 +================== + + * add "make check" support + * add route-map example + * add `res.json(obj, status)` support back for BC + * add "methods" dep, remove internal methods module + * update connect dep + * update auth example to utilize cores pbkdf2 + * updated tests to use "supertest" + +3.0.0beta4 / 2012-06-25 +================== + + * Added `req.auth` + * Added `req.range(size)` + * Added `res.links(obj)` + * Added `res.send(body, status)` support back for backwards compat + * Added `.default()` support to `res.format()` + * Added 2xx / 304 check to `req.fresh` + * Revert "Added + support to the router" + * Fixed `res.send()` freshness check, respect res.statusCode + +3.0.0beta3 / 2012-06-15 +================== + + * Added hogan `--hjs` to express(1) [nullfirm] + * Added another example to content-negotiation + * Added `fresh` dep + * Changed: `res.send()` always checks freshness + * Fixed: expose connects mime module. Closes #1165 + +3.0.0beta2 / 2012-06-06 +================== + + * Added `+` support to the router + * Added `req.host` + * Changed `req.param()` to check route first + * Update connect dep + +3.0.0beta1 / 2012-06-01 +================== + + * Added `res.format()` callback to override default 406 behaviour + * Fixed `res.redirect()` 406. Closes #1154 + +3.0.0alpha5 / 2012-05-30 +================== + + * Added `req.ip` + * Added `{ signed: true }` option to `res.cookie()` + * Removed `res.signedCookie()` + * Changed: dont reverse `req.ips` + * Fixed "trust proxy" setting check for `req.ips` + +3.0.0alpha4 / 2012-05-09 +================== + + * Added: allow `[]` in jsonp callback. Closes #1128 + * Added `PORT` env var support in generated template. Closes #1118 [benatkin] + * Updated: connect 2.2.2 + +3.0.0alpha3 / 2012-05-04 +================== + + * Added public `app.routes`. Closes #887 + * Added _view-locals_ example + * Added _mvc_ example + * Added `res.locals.use()`. Closes #1120 + * Added conditional-GET support to `res.send()` + * Added: coerce `res.set()` values to strings + * Changed: moved `static()` in generated apps below router + * Changed: `res.send()` only set ETag when not previously set + * Changed connect 2.2.1 dep + * Changed: `make test` now runs unit / acceptance tests + * Fixed req/res proto inheritance + +3.0.0alpha2 / 2012-04-26 +================== + + * Added `make benchmark` back + * Added `res.send()` support for `String` objects + * Added client-side data exposing example + * Added `res.header()` and `req.header()` aliases for BC + * Added `express.createServer()` for BC + * Perf: memoize parsed urls + * Perf: connect 2.2.0 dep + * Changed: make `expressInit()` middleware self-aware + * Fixed: use app.get() for all core settings + * Fixed redis session example + * Fixed session example. Closes #1105 + * Fixed generated express dep. 
Closes #1078 + +3.0.0alpha1 / 2012-04-15 +================== + + * Added `app.locals.use(callback)` + * Added `app.locals` object + * Added `app.locals(obj)` + * Added `res.locals` object + * Added `res.locals(obj)` + * Added `res.format()` for content-negotiation + * Added `app.engine()` + * Added `res.cookie()` JSON cookie support + * Added "trust proxy" setting + * Added `req.subdomains` + * Added `req.protocol` + * Added `req.secure` + * Added `req.path` + * Added `req.ips` + * Added `req.fresh` + * Added `req.stale` + * Added comma-delimited / array support for `req.accepts()` + * Added debug instrumentation + * Added `res.set(obj)` + * Added `res.set(field, value)` + * Added `res.get(field)` + * Added `app.get(setting)`. Closes #842 + * Added `req.acceptsLanguage()` + * Added `req.acceptsCharset()` + * Added `req.accepted` + * Added `req.acceptedLanguages` + * Added `req.acceptedCharsets` + * Added "json replacer" setting + * Added "json spaces" setting + * Added X-Forwarded-Proto support to `res.redirect()`. Closes #92 + * Added `--less` support to express(1) + * Added `express.response` prototype + * Added `express.request` prototype + * Added `express.application` prototype + * Added `app.path()` + * Added `app.render()` + * Added `res.type()` to replace `res.contentType()` + * Changed: `res.redirect()` to add relative support + * Changed: enable "jsonp callback" by default + * Changed: renamed "case sensitive routes" to "case sensitive routing" + * Rewrite of all tests with mocha + * Removed "root" setting + * Removed `res.redirect('home')` support + * Removed `req.notify()` + * Removed `app.register()` + * Removed `app.redirect()` + * Removed `app.is()` + * Removed `app.helpers()` + * Removed `app.dynamicHelpers()` + * Fixed `res.sendfile()` with non-GET. Closes #723 + * Fixed express(1) public dir for windows. Closes #866 + +2.5.9/ 2012-04-02 +================== + + * Added support for PURGE request method [pbuyle] + * Fixed `express(1)` generated app `app.address()` before `listening` [mmalecki] + +2.5.8 / 2012-02-08 +================== + + * Update mkdirp dep. Closes #991 + +2.5.7 / 2012-02-06 +================== + + * Fixed `app.all` duplicate DELETE requests [mscdex] + +2.5.6 / 2012-01-13 +================== + + * Updated hamljs dev dep. Closes #953 + +2.5.5 / 2012-01-08 +================== + + * Fixed: set `filename` on cached templates [matthewleon] + +2.5.4 / 2012-01-02 +================== + + * Fixed `express(1)` eol on 0.4.x. Closes #947 + +2.5.3 / 2011-12-30 +================== + + * Fixed `req.is()` when a charset is present + +2.5.2 / 2011-12-10 +================== + + * Fixed: express(1) LF -> CRLF for windows + +2.5.1 / 2011-11-17 +================== + + * Changed: updated connect to 1.8.x + * Removed sass.js support from express(1) + +2.5.0 / 2011-10-24 +================== + + * Added ./routes dir for generated app by default + * Added npm install reminder to express(1) app gen + * Added 0.5.x support + * Removed `make test-cov` since it wont work with node 0.5.x + * Fixed express(1) public dir for windows. Closes #866 + +2.4.7 / 2011-10-05 +================== + + * Added mkdirp to express(1). Closes #795 + * Added simple _json-config_ example + * Added shorthand for the parsed request's pathname via `req.path` + * Changed connect dep to 1.7.x to fix npm issue... + * Fixed `res.redirect()` __HEAD__ support. [reported by xerox] + * Fixed `req.flash()`, only escape args + * Fixed absolute path checking on windows. 
Closes #829 [reported by andrewpmckenzie] + +2.4.6 / 2011-08-22 +================== + + * Fixed multiple param callback regression. Closes #824 [reported by TroyGoode] + +2.4.5 / 2011-08-19 +================== + + * Added support for routes to handle errors. Closes #809 + * Added `app.routes.all()`. Closes #803 + * Added "basepath" setting to work in conjunction with reverse proxies etc. + * Refactored `Route` to use a single array of callbacks + * Added support for multiple callbacks for `app.param()`. Closes #801 +Closes #805 + * Changed: removed .call(self) for route callbacks + * Dependency: `qs >= 0.3.1` + * Fixed `res.redirect()` on windows due to `join()` usage. Closes #808 + +2.4.4 / 2011-08-05 +================== + + * Fixed `res.header()` intention of a set, even when `undefined` + * Fixed `*`, value no longer required + * Fixed `res.send(204)` support. Closes #771 + +2.4.3 / 2011-07-14 +================== + + * Added docs for `status` option special-case. Closes #739 + * Fixed `options.filename`, exposing the view path to template engines + +2.4.2. / 2011-07-06 +================== + + * Revert "removed jsonp stripping" for XSS + +2.4.1 / 2011-07-06 +================== + + * Added `res.json()` JSONP support. Closes #737 + * Added _extending-templates_ example. Closes #730 + * Added "strict routing" setting for trailing slashes + * Added support for multiple envs in `app.configure()` calls. Closes #735 + * Changed: `res.send()` using `res.json()` + * Changed: when cookie `path === null` don't default it + * Changed; default cookie path to "home" setting. Closes #731 + * Removed _pids/logs_ creation from express(1) + +2.4.0 / 2011-06-28 +================== + + * Added chainable `res.status(code)` + * Added `res.json()`, an explicit version of `res.send(obj)` + * Added simple web-service example + +2.3.12 / 2011-06-22 +================== + + * \#express is now on freenode! come join! + * Added `req.get(field, param)` + * Added links to Japanese documentation, thanks @hideyukisaito! + * Added; the `express(1)` generated app outputs the env + * Added `content-negotiation` example + * Dependency: connect >= 1.5.1 < 2.0.0 + * Fixed view layout bug. Closes #720 + * Fixed; ignore body on 304. Closes #701 + +2.3.11 / 2011-06-04 +================== + + * Added `npm test` + * Removed generation of dummy test file from `express(1)` + * Fixed; `express(1)` adds express as a dep + * Fixed; prune on `prepublish` + +2.3.10 / 2011-05-27 +================== + + * Added `req.route`, exposing the current route + * Added _package.json_ generation support to `express(1)` + * Fixed call to `app.param()` function for optional params. Closes #682 + +2.3.9 / 2011-05-25 +================== + + * Fixed bug-ish with `../' in `res.partial()` calls + +2.3.8 / 2011-05-24 +================== + + * Fixed `app.options()` + +2.3.7 / 2011-05-23 +================== + + * Added route `Collection`, ex: `app.get('/user/:id').remove();` + * Added support for `app.param(fn)` to define param logic + * Removed `app.param()` support for callback with return value + * Removed module.parent check from express(1) generated app. Closes #670 + * Refactored router. 
Closes #639 + +2.3.6 / 2011-05-20 +================== + + * Changed; using devDependencies instead of git submodules + * Fixed redis session example + * Fixed markdown example + * Fixed view caching, should not be enabled in development + +2.3.5 / 2011-05-20 +================== + + * Added export `.view` as alias for `.View` + +2.3.4 / 2011-05-08 +================== + + * Added `./examples/say` + * Fixed `res.sendfile()` bug preventing the transfer of files with spaces + +2.3.3 / 2011-05-03 +================== + + * Added "case sensitive routes" option. + * Changed; split methods supported per rfc [slaskis] + * Fixed route-specific middleware when using the same callback function several times + +2.3.2 / 2011-04-27 +================== + + * Fixed view hints + +2.3.1 / 2011-04-26 +================== + + * Added `app.match()` as `app.match.all()` + * Added `app.lookup()` as `app.lookup.all()` + * Added `app.remove()` for `app.remove.all()` + * Added `app.remove.VERB()` + * Fixed template caching collision issue. Closes #644 + * Moved router over from connect and started refactor + +2.3.0 / 2011-04-25 +================== + + * Added options support to `res.clearCookie()` + * Added `res.helpers()` as alias of `res.locals()` + * Added; json defaults to UTF-8 with `res.send()`. Closes #632. [Daniel * Dependency `connect >= 1.4.0` + * Changed; auto set Content-Type in res.attachement [Aaron Heckmann] + * Renamed "cache views" to "view cache". Closes #628 + * Fixed caching of views when using several apps. Closes #637 + * Fixed gotcha invoking `app.param()` callbacks once per route middleware. +Closes #638 + * Fixed partial lookup precedence. Closes #631 +Shaw] + +2.2.2 / 2011-04-12 +================== + + * Added second callback support for `res.download()` connection errors + * Fixed `filename` option passing to template engine + +2.2.1 / 2011-04-04 +================== + + * Added `layout(path)` helper to change the layout within a view. Closes #610 + * Fixed `partial()` collection object support. + Previously only anything with `.length` would work. + When `.length` is present one must still be aware of holes, + however now `{ collection: {foo: 'bar'}}` is valid, exposes + `keyInCollection` and `keysInCollection`. + + * Performance improved with better view caching + * Removed `request` and `response` locals + * Changed; errorHandler page title is now `Express` instead of `Connect` + +2.2.0 / 2011-03-30 +================== + + * Added `app.lookup.VERB()`, ex `app.lookup.put('/user/:id')`. Closes #606 + * Added `app.match.VERB()`, ex `app.match.put('/user/12')`. Closes #606 + * Added `app.VERB(path)` as alias of `app.lookup.VERB()`. + * Dependency `connect >= 1.2.0` + +2.1.1 / 2011-03-29 +================== + + * Added; expose `err.view` object when failing to locate a view + * Fixed `res.partial()` call `next(err)` when no callback is given [reported by aheckmann] + * Fixed; `res.send(undefined)` responds with 204 [aheckmann] + +2.1.0 / 2011-03-24 +================== + + * Added `/_?` partial lookup support. Closes #447 + * Added `request`, `response`, and `app` local variables + * Added `settings` local variable, containing the app's settings + * Added `req.flash()` exception if `req.session` is not available + * Added `res.send(bool)` support (json response) + * Fixed stylus example for latest version + * Fixed; wrap try/catch around `res.render()` + +2.0.0 / 2011-03-17 +================== + + * Fixed up index view path alternative. 
+ * Changed; `res.locals()` without object returns the locals + +2.0.0rc3 / 2011-03-17 +================== + + * Added `res.locals(obj)` to compliment `res.local(key, val)` + * Added `res.partial()` callback support + * Fixed recursive error reporting issue in `res.render()` + +2.0.0rc2 / 2011-03-17 +================== + + * Changed; `partial()` "locals" are now optional + * Fixed `SlowBuffer` support. Closes #584 [reported by tyrda01] + * Fixed .filename view engine option [reported by drudge] + * Fixed blog example + * Fixed `{req,res}.app` reference when mounting [Ben Weaver] + +2.0.0rc / 2011-03-14 +================== + + * Fixed; expose `HTTPSServer` constructor + * Fixed express(1) default test charset. Closes #579 [reported by secoif] + * Fixed; default charset to utf-8 instead of utf8 for lame IE [reported by NickP] + +2.0.0beta3 / 2011-03-09 +================== + + * Added support for `res.contentType()` literal + The original `res.contentType('.json')`, + `res.contentType('application/json')`, and `res.contentType('json')` + will work now. + * Added `res.render()` status option support back + * Added charset option for `res.render()` + * Added `.charset` support (via connect 1.0.4) + * Added view resolution hints when in development and a lookup fails + * Added layout lookup support relative to the page view. + For example while rendering `./views/user/index.jade` if you create + `./views/user/layout.jade` it will be used in favour of the root layout. + * Fixed `res.redirect()`. RFC states absolute url [reported by unlink] + * Fixed; default `res.send()` string charset to utf8 + * Removed `Partial` constructor (not currently used) + +2.0.0beta2 / 2011-03-07 +================== + + * Added res.render() `.locals` support back to aid in migration process + * Fixed flash example + +2.0.0beta / 2011-03-03 +================== + + * Added HTTPS support + * Added `res.cookie()` maxAge support + * Added `req.header()` _Referrer_ / _Referer_ special-case, either works + * Added mount support for `res.redirect()`, now respects the mount-point + * Added `union()` util, taking place of `merge(clone())` combo + * Added stylus support to express(1) generated app + * Added secret to session middleware used in examples and generated app + * Added `res.local(name, val)` for progressive view locals + * Added default param support to `req.param(name, default)` + * Added `app.disabled()` and `app.enabled()` + * Added `app.register()` support for omitting leading ".", either works + * Added `res.partial()`, using the same interface as `partial()` within a view. Closes #539 + * Added `app.param()` to map route params to async/sync logic + * Added; aliased `app.helpers()` as `app.locals()`. Closes #481 + * Added extname with no leading "." support to `res.contentType()` + * Added `cache views` setting, defaulting to enabled in "production" env + * Added index file partial resolution, eg: partial('user') may try _views/user/index.jade_. + * Added `req.accepts()` support for extensions + * Changed; `res.download()` and `res.sendfile()` now utilize Connect's + static file server `connect.static.send()`. + * Changed; replaced `connect.utils.mime()` with npm _mime_ module + * Changed; allow `req.query` to be pre-defined (via middleware or other parent + * Changed view partial resolution, now relative to parent view + * Changed view engine signature. no longer `engine.render(str, options, callback)`, now `engine.compile(str, options) -> Function`, the returned function accepts `fn(locals)`. 
+ * Fixed `req.param()` bug returning Array.prototype methods. Closes #552 + * Fixed; using `Stream#pipe()` instead of `sys.pump()` in `res.sendfile()` + * Fixed; using _qs_ module instead of _querystring_ + * Fixed; strip unsafe chars from jsonp callbacks + * Removed "stream threshold" setting + +1.0.8 / 2011-03-01 +================== + + * Allow `req.query` to be pre-defined (via middleware or other parent app) + * "connect": ">= 0.5.0 < 1.0.0". Closes #547 + * Removed the long deprecated __EXPRESS_ENV__ support + +1.0.7 / 2011-02-07 +================== + + * Fixed `render()` setting inheritance. + Mounted apps would not inherit "view engine" + +1.0.6 / 2011-02-07 +================== + + * Fixed `view engine` setting bug when period is in dirname + +1.0.5 / 2011-02-05 +================== + + * Added secret to generated app `session()` call + +1.0.4 / 2011-02-05 +================== + + * Added `qs` dependency to _package.json_ + * Fixed namespaced `require()`s for latest connect support + +1.0.3 / 2011-01-13 +================== + + * Remove unsafe characters from JSONP callback names [Ryan Grove] + +1.0.2 / 2011-01-10 +================== + + * Removed nested require, using `connect.router` + +1.0.1 / 2010-12-29 +================== + + * Fixed for middleware stacked via `createServer()` + previously the `foo` middleware passed to `createServer(foo)` + would not have access to Express methods such as `res.send()` + or props like `req.query` etc. + +1.0.0 / 2010-11-16 +================== + + * Added; deduce partial object names from the last segment. + For example by default `partial('forum/post', postObject)` will + give you the _post_ object, providing a meaningful default. + * Added http status code string representation to `res.redirect()` body + * Added; `res.redirect()` supporting _text/plain_ and _text/html_ via __Accept__. + * Added `req.is()` to aid in content negotiation + * Added partial local inheritance [suggested by masylum]. Closes #102 + providing access to parent template locals. + * Added _-s, --session[s]_ flag to express(1) to add session related middleware + * Added _--template_ flag to express(1) to specify the + template engine to use. + * Added _--css_ flag to express(1) to specify the + stylesheet engine to use (or just plain css by default). + * Added `app.all()` support [thanks aheckmann] + * Added partial direct object support. + You may now `partial('user', user)` providing the "user" local, + vs previously `partial('user', { object: user })`. + * Added _route-separation_ example since many people question ways + to do this with CommonJS modules. Also view the _blog_ example for + an alternative. + * Performance; caching view path derived partial object names + * Fixed partial local inheritance precedence. [reported by Nick Poulden] Closes #454 + * Fixed jsonp support; _text/javascript_ as per mailinglist discussion + +1.0.0rc4 / 2010-10-14 +================== + + * Added _NODE_ENV_ support, _EXPRESS_ENV_ is deprecated and will be removed in 1.0.0 + * Added route-middleware support (very helpful, see the [docs](http://expressjs.com/guide.html#Route-Middleware)) + * Added _jsonp callback_ setting to enable/disable jsonp autowrapping [Dav Glass] + * Added callback query check on response.send to autowrap JSON objects for simple webservice implementations [Dav Glass] + * Added `partial()` support for array-like collections. Closes #434 + * Added support for swappable querystring parsers + * Added session usage docs. Closes #443 + * Added dynamic helper caching. 
Closes #439 [suggested by maritz] + * Added authentication example + * Added basic Range support to `res.sendfile()` (and `res.download()` etc) + * Changed; `express(1)` generated app using 2 spaces instead of 4 + * Default env to "development" again [aheckmann] + * Removed _context_ option is no more, use "scope" + * Fixed; exposing _./support_ libs to examples so they can run without installs + * Fixed mvc example + +1.0.0rc3 / 2010-09-20 +================== + + * Added confirmation for `express(1)` app generation. Closes #391 + * Added extending of flash formatters via `app.flashFormatters` + * Added flash formatter support. Closes #411 + * Added streaming support to `res.sendfile()` using `sys.pump()` when >= "stream threshold" + * Added _stream threshold_ setting for `res.sendfile()` + * Added `res.send()` __HEAD__ support + * Added `res.clearCookie()` + * Added `res.cookie()` + * Added `res.render()` headers option + * Added `res.redirect()` response bodies + * Added `res.render()` status option support. Closes #425 [thanks aheckmann] + * Fixed `res.sendfile()` responding with 403 on malicious path + * Fixed `res.download()` bug; when an error occurs remove _Content-Disposition_ + * Fixed; mounted apps settings now inherit from parent app [aheckmann] + * Fixed; stripping Content-Length / Content-Type when 204 + * Fixed `res.send()` 204. Closes #419 + * Fixed multiple _Set-Cookie_ headers via `res.header()`. Closes #402 + * Fixed bug messing with error handlers when `listenFD()` is called instead of `listen()`. [thanks guillermo] + + +1.0.0rc2 / 2010-08-17 +================== + + * Added `app.register()` for template engine mapping. Closes #390 + * Added `res.render()` callback support as second argument (no options) + * Added callback support to `res.download()` + * Added callback support for `res.sendfile()` + * Added support for middleware access via `express.middlewareName()` vs `connect.middlewareName()` + * Added "partials" setting to docs + * Added default expresso tests to `express(1)` generated app. Closes #384 + * Fixed `res.sendfile()` error handling, defer via `next()` + * Fixed `res.render()` callback when a layout is used [thanks guillermo] + * Fixed; `make install` creating ~/.node_libraries when not present + * Fixed issue preventing error handlers from being defined anywhere. Closes #387 + +1.0.0rc / 2010-07-28 +================== + + * Added mounted hook. Closes #369 + * Added connect dependency to _package.json_ + + * Removed "reload views" setting and support code + development env never caches, production always caches. + + * Removed _param_ in route callbacks, signature is now + simply (req, res, next), previously (req, res, params, next). + Use _req.params_ for path captures, _req.query_ for GET params. + + * Fixed "home" setting + * Fixed middleware/router precedence issue. Closes #366 + * Fixed; _configure()_ callbacks called immediately. Closes #368 + +1.0.0beta2 / 2010-07-23 +================== + + * Added more examples + * Added; exporting `Server` constructor + * Added `Server#helpers()` for view locals + * Added `Server#dynamicHelpers()` for dynamic view locals. Closes #349 + * Added support for absolute view paths + * Added; _home_ setting defaults to `Server#route` for mounted apps. Closes #363 + * Added Guillermo Rauch to the contributor list + * Added support for "as" for non-collection partials. Closes #341 + * Fixed _install.sh_, ensuring _~/.node_libraries_ exists. 
Closes #362 [thanks jf] + * Fixed `res.render()` exceptions, now passed to `next()` when no callback is given [thanks guillermo] + * Fixed instanceof `Array` checks, now `Array.isArray()` + * Fixed express(1) expansion of public dirs. Closes #348 + * Fixed middleware precedence. Closes #345 + * Fixed view watcher, now async [thanks aheckmann] + +1.0.0beta / 2010-07-15 +================== + + * Re-write + - much faster + - much lighter + - Check [ExpressJS.com](http://expressjs.com) for migration guide and updated docs + +0.14.0 / 2010-06-15 +================== + + * Utilize relative requires + * Added Static bufferSize option [aheckmann] + * Fixed caching of view and partial subdirectories [aheckmann] + * Fixed mime.type() comments now that ".ext" is not supported + * Updated haml submodule + * Updated class submodule + * Removed bin/express + +0.13.0 / 2010-06-01 +================== + + * Added node v0.1.97 compatibility + * Added support for deleting cookies via Request#cookie('key', null) + * Updated haml submodule + * Fixed not-found page, now using using charset utf-8 + * Fixed show-exceptions page, now using using charset utf-8 + * Fixed view support due to fs.readFile Buffers + * Changed; mime.type() no longer accepts ".type" due to node extname() changes + +0.12.0 / 2010-05-22 +================== + + * Added node v0.1.96 compatibility + * Added view `helpers` export which act as additional local variables + * Updated haml submodule + * Changed ETag; removed inode, modified time only + * Fixed LF to CRLF for setting multiple cookies + * Fixed cookie compilation; values are now urlencoded + * Fixed cookies parsing; accepts quoted values and url escaped cookies + +0.11.0 / 2010-05-06 +================== + + * Added support for layouts using different engines + - this.render('page.html.haml', { layout: 'super-cool-layout.html.ejs' }) + - this.render('page.html.haml', { layout: 'foo' }) // assumes 'foo.html.haml' + - this.render('page.html.haml', { layout: false }) // no layout + * Updated ext submodule + * Updated haml submodule + * Fixed EJS partial support by passing along the context. Issue #307 + +0.10.1 / 2010-05-03 +================== + + * Fixed binary uploads. + +0.10.0 / 2010-04-30 +================== + + * Added charset support via Request#charset (automatically assigned to 'UTF-8' when respond()'s + encoding is set to 'utf8' or 'utf-8'. + * Added "encoding" option to Request#render(). Closes #299 + * Added "dump exceptions" setting, which is enabled by default. + * Added simple ejs template engine support + * Added error response support for text/plain, application/json. Closes #297 + * Added callback function param to Request#error() + * Added Request#sendHead() + * Added Request#stream() + * Added support for Request#respond(304, null) for empty response bodies + * Added ETag support to Request#sendfile() + * Added options to Request#sendfile(), passed to fs.createReadStream() + * Added filename arg to Request#download() + * Performance enhanced due to pre-reversing plugins so that plugins.reverse() is not called on each request + * Performance enhanced by preventing several calls to toLowerCase() in Router#match() + * Changed; Request#sendfile() now streams + * Changed; Renamed Request#halt() to Request#respond(). Closes #289 + * Changed; Using sys.inspect() instead of JSON.encode() for error output + * Changed; run() returns the http.Server instance. 
Closes #298 + * Changed; Defaulting Server#host to null (INADDR_ANY) + * Changed; Logger "common" format scale of 0.4f + * Removed Logger "request" format + * Fixed; Catching ENOENT in view caching, preventing error when "views/partials" is not found + * Fixed several issues with http client + * Fixed Logger Content-Length output + * Fixed bug preventing Opera from retaining the generated session id. Closes #292 + +0.9.0 / 2010-04-14 +================== + + * Added DSL level error() route support + * Added DSL level notFound() route support + * Added Request#error() + * Added Request#notFound() + * Added Request#render() callback function. Closes #258 + * Added "max upload size" setting + * Added "magic" variables to collection partials (\_\_index\_\_, \_\_length\_\_, \_\_isFirst\_\_, \_\_isLast\_\_). Closes #254 + * Added [haml.js](http://github.com/visionmedia/haml.js) submodule; removed haml-js + * Added callback function support to Request#halt() as 3rd/4th arg + * Added preprocessing of route param wildcards using param(). Closes #251 + * Added view partial support (with collections etc) + * Fixed bug preventing falsey params (such as ?page=0). Closes #286 + * Fixed setting of multiple cookies. Closes #199 + * Changed; view naming convention is now NAME.TYPE.ENGINE (for example page.html.haml) + * Changed; session cookie is now httpOnly + * Changed; Request is no longer global + * Changed; Event is no longer global + * Changed; "sys" module is no longer global + * Changed; moved Request#download to Static plugin where it belongs + * Changed; Request instance created before body parsing. Closes #262 + * Changed; Pre-caching views in memory when "cache view contents" is enabled. Closes #253 + * Changed; Pre-caching view partials in memory when "cache view partials" is enabled + * Updated support to node --version 0.1.90 + * Updated dependencies + * Removed set("session cookie") in favour of use(Session, { cookie: { ... }}) + * Removed utils.mixin(); use Object#mergeDeep() + +0.8.0 / 2010-03-19 +================== + + * Added coffeescript example app. Closes #242 + * Changed; cache api now async friendly. Closes #240 + * Removed deprecated 'express/static' support. Use 'express/plugins/static' + +0.7.6 / 2010-03-19 +================== + + * Added Request#isXHR. Closes #229 + * Added `make install` (for the executable) + * Added `express` executable for setting up simple app templates + * Added "GET /public/*" to Static plugin, defaulting to /public + * Added Static plugin + * Fixed; Request#render() only calls cache.get() once + * Fixed; Namespacing View caches with "view:" + * Fixed; Namespacing Static caches with "static:" + * Fixed; Both example apps now use the Static plugin + * Fixed set("views"). Closes #239 + * Fixed missing space for combined log format + * Deprecated Request#sendfile() and 'express/static' + * Removed Server#running + +0.7.5 / 2010-03-16 +================== + + * Added Request#flash() support without args, now returns all flashes + * Updated ext submodule + +0.7.4 / 2010-03-16 +================== + + * Fixed session reaper + * Changed; class.js replacing js-oo Class implementation (quite a bit faster, no browser cruft) + +0.7.3 / 2010-03-16 +================== + + * Added package.json + * Fixed requiring of haml / sass due to kiwi removal + +0.7.2 / 2010-03-16 +================== + + * Fixed GIT submodules (HAH!) 
+ +0.7.1 / 2010-03-16 +================== + + * Changed; Express now using submodules again until a PM is adopted + * Changed; chat example using millisecond conversions from ext + +0.7.0 / 2010-03-15 +================== + + * Added Request#pass() support (finds the next matching route, or the given path) + * Added Logger plugin (default "common" format replaces CommonLogger) + * Removed Profiler plugin + * Removed CommonLogger plugin + +0.6.0 / 2010-03-11 +================== + + * Added seed.yml for kiwi package management support + * Added HTTP client query string support when method is GET. Closes #205 + + * Added support for arbitrary view engines. + For example "foo.engine.html" will now require('engine'), + the exports from this module are cached after the first require(). + + * Added async plugin support + + * Removed usage of RESTful route funcs as http client + get() etc, use http.get() and friends + + * Removed custom exceptions + +0.5.0 / 2010-03-10 +================== + + * Added ext dependency (library of js extensions) + * Removed extname() / basename() utils. Use path module + * Removed toArray() util. Use arguments.values + * Removed escapeRegexp() util. Use RegExp.escape() + * Removed process.mixin() dependency. Use utils.mixin() + * Removed Collection + * Removed ElementCollection + * Shameless self promotion of ebook "Advanced JavaScript" (http://dev-mag.com) ;) + +0.4.0 / 2010-02-11 +================== + + * Added flash() example to sample upload app + * Added high level restful http client module (express/http) + * Changed; RESTful route functions double as HTTP clients. Closes #69 + * Changed; throwing error when routes are added at runtime + * Changed; defaulting render() context to the current Request. Closes #197 + * Updated haml submodule + +0.3.0 / 2010-02-11 +================== + + * Updated haml / sass submodules. Closes #200 + * Added flash message support. Closes #64 + * Added accepts() now allows multiple args. fixes #117 + * Added support for plugins to halt. Closes #189 + * Added alternate layout support. Closes #119 + * Removed Route#run(). Closes #188 + * Fixed broken specs due to use(Cookie) missing + +0.2.1 / 2010-02-05 +================== + + * Added "plot" format option for Profiler (for gnuplot processing) + * Added request number to Profiler plugin + * Fixed binary encoding for multipart file uploads, was previously defaulting to UTF8 + * Fixed issue with routes not firing when not files are present. Closes #184 + * Fixed process.Promise -> events.Promise + +0.2.0 / 2010-02-03 +================== + + * Added parseParam() support for name[] etc. (allows for file inputs with "multiple" attr) Closes #180 + * Added Both Cache and Session option "reapInterval" may be "reapEvery". Closes #174 + * Added expiration support to cache api with reaper. Closes #133 + * Added cache Store.Memory#reap() + * Added Cache; cache api now uses first class Cache instances + * Added abstract session Store. Closes #172 + * Changed; cache Memory.Store#get() utilizing Collection + * Renamed MemoryStore -> Store.Memory + * Fixed use() of the same plugin several time will always use latest options. 
Closes #176 + +0.1.0 / 2010-02-03 +================== + + * Changed; Hooks (before / after) pass request as arg as well as evaluated in their context + * Updated node support to 0.1.27 Closes #169 + * Updated dirname(__filename) -> __dirname + * Updated libxmljs support to v0.2.0 + * Added session support with memory store / reaping + * Added quick uid() helper + * Added multi-part upload support + * Added Sass.js support / submodule + * Added production env caching view contents and static files + * Added static file caching. Closes #136 + * Added cache plugin with memory stores + * Added support to StaticFile so that it works with non-textual files. + * Removed dirname() helper + * Removed several globals (now their modules must be required) + +0.0.2 / 2010-01-10 +================== + + * Added view benchmarks; currently haml vs ejs + * Added Request#attachment() specs. Closes #116 + * Added use of node's parseQuery() util. Closes #123 + * Added `make init` for submodules + * Updated Haml + * Updated sample chat app to show messages on load + * Updated libxmljs parseString -> parseHtmlString + * Fixed `make init` to work with older versions of git + * Fixed specs can now run independent specs for those who can't build deps. Closes #127 + * Fixed issues introduced by the node url module changes. Closes 126. + * Fixed two assertions failing due to Collection#keys() returning strings + * Fixed faulty Collection#toArray() spec due to keys() returning strings + * Fixed `make test` now builds libxmljs.node before testing + +0.0.1 / 2010-01-03 +================== + + * Initial release diff --git a/node_modules/express/LICENSE b/node_modules/express/LICENSE new file mode 100644 index 000000000..aa927e44e --- /dev/null +++ b/node_modules/express/LICENSE @@ -0,0 +1,24 @@ +(The MIT License) + +Copyright (c) 2009-2014 TJ Holowaychuk +Copyright (c) 2013-2014 Roman Shtylman +Copyright (c) 2014-2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/express/Readme.md b/node_modules/express/Readme.md new file mode 100644 index 000000000..1f9129731 --- /dev/null +++ b/node_modules/express/Readme.md @@ -0,0 +1,155 @@ +[![Express Logo](https://i.cloudup.com/zfY6lL7eFa-3000x3000.png)](http://expressjs.com/) + + Fast, unopinionated, minimalist web framework for [node](http://nodejs.org). 
+ + [![NPM Version][npm-image]][npm-url] + [![NPM Downloads][downloads-image]][downloads-url] + [![Linux Build][travis-image]][travis-url] + [![Windows Build][appveyor-image]][appveyor-url] + [![Test Coverage][coveralls-image]][coveralls-url] + +```js +const express = require('express') +const app = express() + +app.get('/', function (req, res) { + res.send('Hello World') +}) + +app.listen(3000) +``` + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). + +Before installing, [download and install Node.js](https://nodejs.org/en/download/). +Node.js 0.10 or higher is required. + +Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```bash +$ npm install express +``` + +Follow [our installing guide](http://expressjs.com/en/starter/installing.html) +for more information. + +## Features + + * Robust routing + * Focus on high performance + * Super-high test coverage + * HTTP helpers (redirection, caching, etc) + * View system supporting 14+ template engines + * Content negotiation + * Executable for generating applications quickly + +## Docs & Community + + * [Website and Documentation](http://expressjs.com/) - [[website repo](https://github.com/expressjs/expressjs.com)] + * [#express](https://webchat.freenode.net/?channels=express) on freenode IRC + * [GitHub Organization](https://github.com/expressjs) for Official Middleware & Modules + * Visit the [Wiki](https://github.com/expressjs/express/wiki) + * [Google Group](https://groups.google.com/group/express-js) for discussion + * [Gitter](https://gitter.im/expressjs/express) for support and discussion + +**PROTIP** Be sure to read [Migrating from 3.x to 4.x](https://github.com/expressjs/express/wiki/Migrating-from-3.x-to-4.x) as well as [New features in 4.x](https://github.com/expressjs/express/wiki/New-features-in-4.x). + +### Security Issues + +If you discover a security vulnerability in Express, please see [Security Policies and Procedures](Security.md). + +## Quick Start + + The quickest way to get started with express is to utilize the executable [`express(1)`](https://github.com/expressjs/generator) to generate an application as shown below: + + Install the executable. The executable's major version will match Express's: + +```bash +$ npm install -g express-generator@4 +``` + + Create the app: + +```bash +$ express /tmp/foo && cd /tmp/foo +``` + + Install dependencies: + +```bash +$ npm install +``` + + Start the server: + +```bash +$ npm start +``` + + View the website at: http://localhost:3000 + +## Philosophy + + The Express philosophy is to provide small, robust tooling for HTTP servers, making + it a great solution for single page applications, web sites, hybrids, or public + HTTP APIs. + + Express does not force you to use any specific ORM or template engine. With support for over + 14 template engines via [Consolidate.js](https://github.com/tj/consolidate.js), + you can quickly craft your perfect framework. 
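+ For illustration, a minimal sketch of plugging in one such engine (this assumes EJS is installed and a `views/index.ejs` template exists; any supported engine is wired up the same way):
+
+```js
+const express = require('express')
+const app = express()
+
+// Point Express at the template directory and choose an engine.
+app.set('views', './views')
+app.set('view engine', 'ejs')
+
+app.get('/', function (req, res) {
+  // Renders views/index.ejs with the locals below.
+  res.render('index', { title: 'Hello from EJS' })
+})
+
+app.listen(3000)
+```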
+ +## Examples + + To view the examples, clone the Express repo and install the dependencies: + +```bash +$ git clone git://github.com/expressjs/express.git --depth 1 +$ cd express +$ npm install +``` + + Then run whichever example you want: + +```bash +$ node examples/content-negotiation +``` + +## Tests + + To run the test suite, first install the dependencies, then run `npm test`: + +```bash +$ npm install +$ npm test +``` + +## Contributing + +[Contributing Guide](Contributing.md) + +## People + +The original author of Express is [TJ Holowaychuk](https://github.com/tj) + +The current lead maintainer is [Douglas Christopher Wilson](https://github.com/dougwilson) + +[List of all contributors](https://github.com/expressjs/express/graphs/contributors) + +## License + + [MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/express.svg +[npm-url]: https://npmjs.org/package/express +[downloads-image]: https://img.shields.io/npm/dm/express.svg +[downloads-url]: https://npmjs.org/package/express +[travis-image]: https://img.shields.io/travis/expressjs/express/master.svg?label=linux +[travis-url]: https://travis-ci.org/expressjs/express +[appveyor-image]: https://img.shields.io/appveyor/ci/dougwilson/express/master.svg?label=windows +[appveyor-url]: https://ci.appveyor.com/project/dougwilson/express +[coveralls-image]: https://img.shields.io/coveralls/expressjs/express/master.svg +[coveralls-url]: https://coveralls.io/r/expressjs/express?branch=master diff --git a/node_modules/express/index.js b/node_modules/express/index.js new file mode 100644 index 000000000..d219b0c87 --- /dev/null +++ b/node_modules/express/index.js @@ -0,0 +1,11 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +module.exports = require('./lib/express'); diff --git a/node_modules/express/lib/application.js b/node_modules/express/lib/application.js new file mode 100644 index 000000000..91f77d241 --- /dev/null +++ b/node_modules/express/lib/application.js @@ -0,0 +1,644 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var finalhandler = require('finalhandler'); +var Router = require('./router'); +var methods = require('methods'); +var middleware = require('./middleware/init'); +var query = require('./middleware/query'); +var debug = require('debug')('express:application'); +var View = require('./view'); +var http = require('http'); +var compileETag = require('./utils').compileETag; +var compileQueryParser = require('./utils').compileQueryParser; +var compileTrust = require('./utils').compileTrust; +var deprecate = require('depd')('express'); +var flatten = require('array-flatten'); +var merge = require('utils-merge'); +var resolve = require('path').resolve; +var setPrototypeOf = require('setprototypeof') +var slice = Array.prototype.slice; + +/** + * Application prototype. + */ + +var app = exports = module.exports = {}; + +/** + * Variable for trust proxy inheritance back-compat + * @private + */ + +var trustProxyDefaultSymbol = '@@symbol:trust_proxy_default'; + +/** + * Initialize the server. 
+ * + * - setup default configuration + * - setup default middleware + * - setup route reflection methods + * + * @private + */ + +app.init = function init() { + this.cache = {}; + this.engines = {}; + this.settings = {}; + + this.defaultConfiguration(); +}; + +/** + * Initialize application configuration. + * @private + */ + +app.defaultConfiguration = function defaultConfiguration() { + var env = process.env.NODE_ENV || 'development'; + + // default settings + this.enable('x-powered-by'); + this.set('etag', 'weak'); + this.set('env', env); + this.set('query parser', 'extended'); + this.set('subdomain offset', 2); + this.set('trust proxy', false); + + // trust proxy inherit back-compat + Object.defineProperty(this.settings, trustProxyDefaultSymbol, { + configurable: true, + value: true + }); + + debug('booting in %s mode', env); + + this.on('mount', function onmount(parent) { + // inherit trust proxy + if (this.settings[trustProxyDefaultSymbol] === true + && typeof parent.settings['trust proxy fn'] === 'function') { + delete this.settings['trust proxy']; + delete this.settings['trust proxy fn']; + } + + // inherit protos + setPrototypeOf(this.request, parent.request) + setPrototypeOf(this.response, parent.response) + setPrototypeOf(this.engines, parent.engines) + setPrototypeOf(this.settings, parent.settings) + }); + + // setup locals + this.locals = Object.create(null); + + // top-most app is mounted at / + this.mountpath = '/'; + + // default locals + this.locals.settings = this.settings; + + // default configuration + this.set('view', View); + this.set('views', resolve('views')); + this.set('jsonp callback name', 'callback'); + + if (env === 'production') { + this.enable('view cache'); + } + + Object.defineProperty(this, 'router', { + get: function() { + throw new Error('\'app.router\' is deprecated!\nPlease see the 3.x to 4.x migration guide for details on how to update your app.'); + } + }); +}; + +/** + * lazily adds the base router if it has not yet been added. + * + * We cannot add the base router in the defaultConfiguration because + * it reads app settings which might be set after that has run. + * + * @private + */ +app.lazyrouter = function lazyrouter() { + if (!this._router) { + this._router = new Router({ + caseSensitive: this.enabled('case sensitive routing'), + strict: this.enabled('strict routing') + }); + + this._router.use(query(this.get('query parser fn'))); + this._router.use(middleware.init(this)); + } +}; + +/** + * Dispatch a req, res pair into the application. Starts pipeline processing. + * + * If no callback is provided, then default error handlers will respond + * in the event of an error bubbling through the stack. + * + * @private + */ + +app.handle = function handle(req, res, callback) { + var router = this._router; + + // final handler + var done = callback || finalhandler(req, res, { + env: this.get('env'), + onerror: logerror.bind(this) + }); + + // no routes + if (!router) { + debug('no routes defined on app'); + done(); + return; + } + + router.handle(req, res, done); +}; + +/** + * Proxy `Router#use()` to add middleware to the app router. + * See Router#use() documentation for details. + * + * If the _fn_ parameter is an express app, then it will be + * mounted at the _route_ specified. 
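+ *
+ * Illustrative sketch (not part of the original source), assuming a
+ * hypothetical `admin` sub-app:
+ *
+ *     var admin = express();
+ *     admin.get('/', function (req, res) { res.send('admin home'); });
+ *     app.use('/admin', admin); // admin now answers GET /admin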
+ * + * @public + */ + +app.use = function use(fn) { + var offset = 0; + var path = '/'; + + // default path to '/' + // disambiguate app.use([fn]) + if (typeof fn !== 'function') { + var arg = fn; + + while (Array.isArray(arg) && arg.length !== 0) { + arg = arg[0]; + } + + // first arg is the path + if (typeof arg !== 'function') { + offset = 1; + path = fn; + } + } + + var fns = flatten(slice.call(arguments, offset)); + + if (fns.length === 0) { + throw new TypeError('app.use() requires a middleware function') + } + + // setup router + this.lazyrouter(); + var router = this._router; + + fns.forEach(function (fn) { + // non-express app + if (!fn || !fn.handle || !fn.set) { + return router.use(path, fn); + } + + debug('.use app under %s', path); + fn.mountpath = path; + fn.parent = this; + + // restore .app property on req and res + router.use(path, function mounted_app(req, res, next) { + var orig = req.app; + fn.handle(req, res, function (err) { + setPrototypeOf(req, orig.request) + setPrototypeOf(res, orig.response) + next(err); + }); + }); + + // mounted an app + fn.emit('mount', this); + }, this); + + return this; +}; + +/** + * Proxy to the app `Router#route()` + * Returns a new `Route` instance for the _path_. + * + * Routes are isolated middleware stacks for specific paths. + * See the Route api docs for details. + * + * @public + */ + +app.route = function route(path) { + this.lazyrouter(); + return this._router.route(path); +}; + +/** + * Register the given template engine callback `fn` + * as `ext`. + * + * By default will `require()` the engine based on the + * file extension. For example if you try to render + * a "foo.ejs" file Express will invoke the following internally: + * + * app.engine('ejs', require('ejs').__express); + * + * For engines that do not provide `.__express` out of the box, + * or if you wish to "map" a different extension to the template engine + * you may use this method. For example mapping the EJS template engine to + * ".html" files: + * + * app.engine('html', require('ejs').renderFile); + * + * In this case EJS provides a `.renderFile()` method with + * the same signature that Express expects: `(path, options, callback)`, + * though note that it aliases this method as `ejs.__express` internally + * so if you're using ".ejs" extensions you dont need to do anything. + * + * Some template engines do not follow this convention, the + * [Consolidate.js](https://github.com/tj/consolidate.js) + * library was created to map all of node's popular template + * engines to follow this convention, thus allowing them to + * work seamlessly within Express. + * + * @param {String} ext + * @param {Function} fn + * @return {app} for chaining + * @public + */ + +app.engine = function engine(ext, fn) { + if (typeof fn !== 'function') { + throw new Error('callback function required'); + } + + // get file extension + var extension = ext[0] !== '.' + ? '.' + ext + : ext; + + // store engine + this.engines[extension] = fn; + + return this; +}; + +/** + * Proxy to `Router#param()` with one added api feature. The _name_ parameter + * can be an array of names. + * + * See the Router#param() docs for more details. 
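+ *
+ * Illustrative sketch (not part of the original source) of the array
+ * form, using hypothetical `id` and `page` placeholders:
+ *
+ *     app.param(['id', 'page'], function (req, res, next, value) {
+ *       // invoked once per matched placeholder with its value
+ *       next();
+ *     });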
+ * + * @param {String|Array} name + * @param {Function} fn + * @return {app} for chaining + * @public + */ + +app.param = function param(name, fn) { + this.lazyrouter(); + + if (Array.isArray(name)) { + for (var i = 0; i < name.length; i++) { + this.param(name[i], fn); + } + + return this; + } + + this._router.param(name, fn); + + return this; +}; + +/** + * Assign `setting` to `val`, or return `setting`'s value. + * + * app.set('foo', 'bar'); + * app.set('foo'); + * // => "bar" + * + * Mounted servers inherit their parent server's settings. + * + * @param {String} setting + * @param {*} [val] + * @return {Server} for chaining + * @public + */ + +app.set = function set(setting, val) { + if (arguments.length === 1) { + // app.get(setting) + return this.settings[setting]; + } + + debug('set "%s" to %o', setting, val); + + // set value + this.settings[setting] = val; + + // trigger matched settings + switch (setting) { + case 'etag': + this.set('etag fn', compileETag(val)); + break; + case 'query parser': + this.set('query parser fn', compileQueryParser(val)); + break; + case 'trust proxy': + this.set('trust proxy fn', compileTrust(val)); + + // trust proxy inherit back-compat + Object.defineProperty(this.settings, trustProxyDefaultSymbol, { + configurable: true, + value: false + }); + + break; + } + + return this; +}; + +/** + * Return the app's absolute pathname + * based on the parent(s) that have + * mounted it. + * + * For example if the application was + * mounted as "/admin", which itself + * was mounted as "/blog" then the + * return value would be "/blog/admin". + * + * @return {String} + * @private + */ + +app.path = function path() { + return this.parent + ? this.parent.path() + this.mountpath + : ''; +}; + +/** + * Check if `setting` is enabled (truthy). + * + * app.enabled('foo') + * // => false + * + * app.enable('foo') + * app.enabled('foo') + * // => true + * + * @param {String} setting + * @return {Boolean} + * @public + */ + +app.enabled = function enabled(setting) { + return Boolean(this.set(setting)); +}; + +/** + * Check if `setting` is disabled. + * + * app.disabled('foo') + * // => true + * + * app.enable('foo') + * app.disabled('foo') + * // => false + * + * @param {String} setting + * @return {Boolean} + * @public + */ + +app.disabled = function disabled(setting) { + return !this.set(setting); +}; + +/** + * Enable `setting`. + * + * @param {String} setting + * @return {app} for chaining + * @public + */ + +app.enable = function enable(setting) { + return this.set(setting, true); +}; + +/** + * Disable `setting`. + * + * @param {String} setting + * @return {app} for chaining + * @public + */ + +app.disable = function disable(setting) { + return this.set(setting, false); +}; + +/** + * Delegate `.VERB(...)` calls to `router.VERB(...)`. + */ + +methods.forEach(function(method){ + app[method] = function(path){ + if (method === 'get' && arguments.length === 1) { + // app.get(setting) + return this.set(path); + } + + this.lazyrouter(); + + var route = this._router.route(path); + route[method].apply(route, slice.call(arguments, 1)); + return this; + }; +}); + +/** + * Special-cased "all" method, applying the given route `path`, + * middleware, and callback to _every_ HTTP method. + * + * @param {String} path + * @param {Function} ... 
+ * @return {app} for chaining + * @public + */ + +app.all = function all(path) { + this.lazyrouter(); + + var route = this._router.route(path); + var args = slice.call(arguments, 1); + + for (var i = 0; i < methods.length; i++) { + route[methods[i]].apply(route, args); + } + + return this; +}; + +// del -> delete alias + +app.del = deprecate.function(app.delete, 'app.del: Use app.delete instead'); + +/** + * Render the given view `name` name with `options` + * and a callback accepting an error and the + * rendered template string. + * + * Example: + * + * app.render('email', { name: 'Tobi' }, function(err, html){ + * // ... + * }) + * + * @param {String} name + * @param {Object|Function} options or fn + * @param {Function} callback + * @public + */ + +app.render = function render(name, options, callback) { + var cache = this.cache; + var done = callback; + var engines = this.engines; + var opts = options; + var renderOptions = {}; + var view; + + // support callback function as second arg + if (typeof options === 'function') { + done = options; + opts = {}; + } + + // merge app.locals + merge(renderOptions, this.locals); + + // merge options._locals + if (opts._locals) { + merge(renderOptions, opts._locals); + } + + // merge options + merge(renderOptions, opts); + + // set .cache unless explicitly provided + if (renderOptions.cache == null) { + renderOptions.cache = this.enabled('view cache'); + } + + // primed cache + if (renderOptions.cache) { + view = cache[name]; + } + + // view + if (!view) { + var View = this.get('view'); + + view = new View(name, { + defaultEngine: this.get('view engine'), + root: this.get('views'), + engines: engines + }); + + if (!view.path) { + var dirs = Array.isArray(view.root) && view.root.length > 1 + ? 'directories "' + view.root.slice(0, -1).join('", "') + '" or "' + view.root[view.root.length - 1] + '"' + : 'directory "' + view.root + '"' + var err = new Error('Failed to lookup view "' + name + '" in views ' + dirs); + err.view = view; + return done(err); + } + + // prime the cache + if (renderOptions.cache) { + cache[name] = view; + } + } + + // render + tryRender(view, renderOptions, done); +}; + +/** + * Listen for connections. + * + * A node `http.Server` is returned, with this + * application (which is a `Function`) as its + * callback. If you wish to create both an HTTP + * and HTTPS server you may do so with the "http" + * and "https" modules as shown here: + * + * var http = require('http') + * , https = require('https') + * , express = require('express') + * , app = express(); + * + * http.createServer(app).listen(80); + * https.createServer({ ... }, app).listen(443); + * + * @return {http.Server} + * @public + */ + +app.listen = function listen() { + var server = http.createServer(this); + return server.listen.apply(server, arguments); +}; + +/** + * Log error using console.error. + * + * @param {Error} err + * @private + */ + +function logerror(err) { + /* istanbul ignore next */ + if (this.get('env') !== 'test') console.error(err.stack || err.toString()); +} + +/** + * Try rendering a view. + * @private + */ + +function tryRender(view, options, callback) { + try { + view.render(options, callback); + } catch (err) { + callback(err); + } +} diff --git a/node_modules/express/lib/express.js b/node_modules/express/lib/express.js new file mode 100644 index 000000000..d188a16db --- /dev/null +++ b/node_modules/express/lib/express.js @@ -0,0 +1,116 @@ +/*! 
+ * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + */ + +var bodyParser = require('body-parser') +var EventEmitter = require('events').EventEmitter; +var mixin = require('merge-descriptors'); +var proto = require('./application'); +var Route = require('./router/route'); +var Router = require('./router'); +var req = require('./request'); +var res = require('./response'); + +/** + * Expose `createApplication()`. + */ + +exports = module.exports = createApplication; + +/** + * Create an express application. + * + * @return {Function} + * @api public + */ + +function createApplication() { + var app = function(req, res, next) { + app.handle(req, res, next); + }; + + mixin(app, EventEmitter.prototype, false); + mixin(app, proto, false); + + // expose the prototype that will get set on requests + app.request = Object.create(req, { + app: { configurable: true, enumerable: true, writable: true, value: app } + }) + + // expose the prototype that will get set on responses + app.response = Object.create(res, { + app: { configurable: true, enumerable: true, writable: true, value: app } + }) + + app.init(); + return app; +} + +/** + * Expose the prototypes. + */ + +exports.application = proto; +exports.request = req; +exports.response = res; + +/** + * Expose constructors. + */ + +exports.Route = Route; +exports.Router = Router; + +/** + * Expose middleware + */ + +exports.json = bodyParser.json +exports.query = require('./middleware/query'); +exports.raw = bodyParser.raw +exports.static = require('serve-static'); +exports.text = bodyParser.text +exports.urlencoded = bodyParser.urlencoded + +/** + * Replace removed middleware with an appropriate error message. + */ + +var removedMiddlewares = [ + 'bodyParser', + 'compress', + 'cookieSession', + 'session', + 'logger', + 'cookieParser', + 'favicon', + 'responseTime', + 'errorHandler', + 'timeout', + 'methodOverride', + 'vhost', + 'csrf', + 'directory', + 'limit', + 'multipart', + 'staticCache' +] + +removedMiddlewares.forEach(function (name) { + Object.defineProperty(exports, name, { + get: function () { + throw new Error('Most middleware (like ' + name + ') is no longer bundled with Express and must be installed separately. Please see https://github.com/senchalabs/connect#middleware.'); + }, + configurable: true + }); +}); diff --git a/node_modules/express/lib/middleware/init.js b/node_modules/express/lib/middleware/init.js new file mode 100644 index 000000000..dfd042747 --- /dev/null +++ b/node_modules/express/lib/middleware/init.js @@ -0,0 +1,43 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var setPrototypeOf = require('setprototypeof') + +/** + * Initialization middleware, exposing the + * request and response to each other, as well + * as defaulting the X-Powered-By header field. 
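+ *
+ * Illustrative note (not part of the original source): application code
+ * can suppress the default header with:
+ *
+ *     app.disable('x-powered-by');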
+ * + * @param {Function} app + * @return {Function} + * @api private + */ + +exports.init = function(app){ + return function expressInit(req, res, next){ + if (app.enabled('x-powered-by')) res.setHeader('X-Powered-By', 'Express'); + req.res = res; + res.req = req; + req.next = next; + + setPrototypeOf(req, app.request) + setPrototypeOf(res, app.response) + + res.locals = res.locals || Object.create(null); + + next(); + }; +}; + diff --git a/node_modules/express/lib/middleware/query.js b/node_modules/express/lib/middleware/query.js new file mode 100644 index 000000000..7e9166947 --- /dev/null +++ b/node_modules/express/lib/middleware/query.js @@ -0,0 +1,47 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + */ + +var merge = require('utils-merge') +var parseUrl = require('parseurl'); +var qs = require('qs'); + +/** + * @param {Object} options + * @return {Function} + * @api public + */ + +module.exports = function query(options) { + var opts = merge({}, options) + var queryparse = qs.parse; + + if (typeof options === 'function') { + queryparse = options; + opts = undefined; + } + + if (opts !== undefined && opts.allowPrototypes === undefined) { + // back-compat for qs module + opts.allowPrototypes = true; + } + + return function query(req, res, next){ + if (!req.query) { + var val = parseUrl(req).query; + req.query = queryparse(val, opts); + } + + next(); + }; +}; diff --git a/node_modules/express/lib/request.js b/node_modules/express/lib/request.js new file mode 100644 index 000000000..a9400ef99 --- /dev/null +++ b/node_modules/express/lib/request.js @@ -0,0 +1,525 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var accepts = require('accepts'); +var deprecate = require('depd')('express'); +var isIP = require('net').isIP; +var typeis = require('type-is'); +var http = require('http'); +var fresh = require('fresh'); +var parseRange = require('range-parser'); +var parse = require('parseurl'); +var proxyaddr = require('proxy-addr'); + +/** + * Request prototype. + * @public + */ + +var req = Object.create(http.IncomingMessage.prototype) + +/** + * Module exports. + * @public + */ + +module.exports = req + +/** + * Return request header. + * + * The `Referrer` header field is special-cased, + * both `Referrer` and `Referer` are interchangeable. + * + * Examples: + * + * req.get('Content-Type'); + * // => "text/plain" + * + * req.get('content-type'); + * // => "text/plain" + * + * req.get('Something'); + * // => undefined + * + * Aliased as `req.header()`. + * + * @param {String} name + * @return {String} + * @public + */ + +req.get = +req.header = function header(name) { + if (!name) { + throw new TypeError('name argument is required to req.get'); + } + + if (typeof name !== 'string') { + throw new TypeError('name must be a string to req.get'); + } + + var lc = name.toLowerCase(); + + switch (lc) { + case 'referer': + case 'referrer': + return this.headers.referrer + || this.headers.referer; + default: + return this.headers[lc]; + } +}; + +/** + * To do: update docs. 
+ * + * Check if the given `type(s)` is acceptable, returning + * the best match when true, otherwise `undefined`, in which + * case you should respond with 406 "Not Acceptable". + * + * The `type` value may be a single MIME type string + * such as "application/json", an extension name + * such as "json", a comma-delimited list such as "json, html, text/plain", + * an argument list such as `"json", "html", "text/plain"`, + * or an array `["json", "html", "text/plain"]`. When a list + * or array is given, the _best_ match, if any is returned. + * + * Examples: + * + * // Accept: text/html + * req.accepts('html'); + * // => "html" + * + * // Accept: text/*, application/json + * req.accepts('html'); + * // => "html" + * req.accepts('text/html'); + * // => "text/html" + * req.accepts('json, text'); + * // => "json" + * req.accepts('application/json'); + * // => "application/json" + * + * // Accept: text/*, application/json + * req.accepts('image/png'); + * req.accepts('png'); + * // => undefined + * + * // Accept: text/*;q=.5, application/json + * req.accepts(['html', 'json']); + * req.accepts('html', 'json'); + * req.accepts('html, json'); + * // => "json" + * + * @param {String|Array} type(s) + * @return {String|Array|Boolean} + * @public + */ + +req.accepts = function(){ + var accept = accepts(this); + return accept.types.apply(accept, arguments); +}; + +/** + * Check if the given `encoding`s are accepted. + * + * @param {String} ...encoding + * @return {String|Array} + * @public + */ + +req.acceptsEncodings = function(){ + var accept = accepts(this); + return accept.encodings.apply(accept, arguments); +}; + +req.acceptsEncoding = deprecate.function(req.acceptsEncodings, + 'req.acceptsEncoding: Use acceptsEncodings instead'); + +/** + * Check if the given `charset`s are acceptable, + * otherwise you should respond with 406 "Not Acceptable". + * + * @param {String} ...charset + * @return {String|Array} + * @public + */ + +req.acceptsCharsets = function(){ + var accept = accepts(this); + return accept.charsets.apply(accept, arguments); +}; + +req.acceptsCharset = deprecate.function(req.acceptsCharsets, + 'req.acceptsCharset: Use acceptsCharsets instead'); + +/** + * Check if the given `lang`s are acceptable, + * otherwise you should respond with 406 "Not Acceptable". + * + * @param {String} ...lang + * @return {String|Array} + * @public + */ + +req.acceptsLanguages = function(){ + var accept = accepts(this); + return accept.languages.apply(accept, arguments); +}; + +req.acceptsLanguage = deprecate.function(req.acceptsLanguages, + 'req.acceptsLanguage: Use acceptsLanguages instead'); + +/** + * Parse Range header field, capping to the given `size`. + * + * Unspecified ranges such as "0-" require knowledge of your resource length. In + * the case of a byte range this is of course the total number of bytes. If the + * Range header field is not given `undefined` is returned, `-1` when unsatisfiable, + * and `-2` when syntactically invalid. + * + * When ranges are returned, the array has a "type" property which is the type of + * range that is required (most commonly, "bytes"). Each array element is an object + * with a "start" and "end" property for the portion of the range. + * + * The "combine" option can be set to `true` and overlapping & adjacent ranges + * will be combined into a single range. + * + * NOTE: remember that ranges are inclusive, so for example "Range: users=0-3" + * should respond with 4 users when available, not 3. 
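+ *
+ * Illustrative sketch (not part of the original source), assuming a
+ * 1000-byte resource and a "Range: bytes=0-499" request header:
+ *
+ *     var ranges = req.range(1000);
+ *     if (ranges && ranges.type === 'bytes') {
+ *       // ranges[0] => { start: 0, end: 499 }
+ *     }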
+ * + * @param {number} size + * @param {object} [options] + * @param {boolean} [options.combine=false] + * @return {number|array} + * @public + */ + +req.range = function range(size, options) { + var range = this.get('Range'); + if (!range) return; + return parseRange(size, range, options); +}; + +/** + * Return the value of param `name` when present or `defaultValue`. + * + * - Checks route placeholders, ex: _/user/:id_ + * - Checks body params, ex: id=12, {"id":12} + * - Checks query string params, ex: ?id=12 + * + * To utilize request bodies, `req.body` + * should be an object. This can be done by using + * the `bodyParser()` middleware. + * + * @param {String} name + * @param {Mixed} [defaultValue] + * @return {String} + * @public + */ + +req.param = function param(name, defaultValue) { + var params = this.params || {}; + var body = this.body || {}; + var query = this.query || {}; + + var args = arguments.length === 1 + ? 'name' + : 'name, default'; + deprecate('req.param(' + args + '): Use req.params, req.body, or req.query instead'); + + if (null != params[name] && params.hasOwnProperty(name)) return params[name]; + if (null != body[name]) return body[name]; + if (null != query[name]) return query[name]; + + return defaultValue; +}; + +/** + * Check if the incoming request contains the "Content-Type" + * header field, and it contains the give mime `type`. + * + * Examples: + * + * // With Content-Type: text/html; charset=utf-8 + * req.is('html'); + * req.is('text/html'); + * req.is('text/*'); + * // => true + * + * // When Content-Type is application/json + * req.is('json'); + * req.is('application/json'); + * req.is('application/*'); + * // => true + * + * req.is('html'); + * // => false + * + * @param {String|Array} types... + * @return {String|false|null} + * @public + */ + +req.is = function is(types) { + var arr = types; + + // support flattened arguments + if (!Array.isArray(types)) { + arr = new Array(arguments.length); + for (var i = 0; i < arr.length; i++) { + arr[i] = arguments[i]; + } + } + + return typeis(this, arr); +}; + +/** + * Return the protocol string "http" or "https" + * when requested with TLS. When the "trust proxy" + * setting trusts the socket address, the + * "X-Forwarded-Proto" header field will be trusted + * and used if present. + * + * If you're running behind a reverse proxy that + * supplies https for you this may be enabled. + * + * @return {String} + * @public + */ + +defineGetter(req, 'protocol', function protocol(){ + var proto = this.connection.encrypted + ? 'https' + : 'http'; + var trust = this.app.get('trust proxy fn'); + + if (!trust(this.connection.remoteAddress, 0)) { + return proto; + } + + // Note: X-Forwarded-Proto is normally only ever a + // single value, but this is to be safe. + var header = this.get('X-Forwarded-Proto') || proto + var index = header.indexOf(',') + + return index !== -1 + ? header.substring(0, index).trim() + : header.trim() +}); + +/** + * Short-hand for: + * + * req.protocol === 'https' + * + * @return {Boolean} + * @public + */ + +defineGetter(req, 'secure', function secure(){ + return this.protocol === 'https'; +}); + +/** + * Return the remote address from the trusted proxy. + * + * The is the remote address on the socket unless + * "trust proxy" is set. + * + * @return {String} + * @public + */ + +defineGetter(req, 'ip', function ip(){ + var trust = this.app.get('trust proxy fn'); + return proxyaddr(this, trust); +}); + +/** + * When "trust proxy" is set, trusted proxy addresses + client. 
+ * + * For example if the value were "client, proxy1, proxy2" + * you would receive the array `["client", "proxy1", "proxy2"]` + * where "proxy2" is the furthest down-stream and "proxy1" and + * "proxy2" were trusted. + * + * @return {Array} + * @public + */ + +defineGetter(req, 'ips', function ips() { + var trust = this.app.get('trust proxy fn'); + var addrs = proxyaddr.all(this, trust); + + // reverse the order (to farthest -> closest) + // and remove socket address + addrs.reverse().pop() + + return addrs +}); + +/** + * Return subdomains as an array. + * + * Subdomains are the dot-separated parts of the host before the main domain of + * the app. By default, the domain of the app is assumed to be the last two + * parts of the host. This can be changed by setting "subdomain offset". + * + * For example, if the domain is "tobi.ferrets.example.com": + * If "subdomain offset" is not set, req.subdomains is `["ferrets", "tobi"]`. + * If "subdomain offset" is 3, req.subdomains is `["tobi"]`. + * + * @return {Array} + * @public + */ + +defineGetter(req, 'subdomains', function subdomains() { + var hostname = this.hostname; + + if (!hostname) return []; + + var offset = this.app.get('subdomain offset'); + var subdomains = !isIP(hostname) + ? hostname.split('.').reverse() + : [hostname]; + + return subdomains.slice(offset); +}); + +/** + * Short-hand for `url.parse(req.url).pathname`. + * + * @return {String} + * @public + */ + +defineGetter(req, 'path', function path() { + return parse(this).pathname; +}); + +/** + * Parse the "Host" header field to a hostname. + * + * When the "trust proxy" setting trusts the socket + * address, the "X-Forwarded-Host" header field will + * be trusted. + * + * @return {String} + * @public + */ + +defineGetter(req, 'hostname', function hostname(){ + var trust = this.app.get('trust proxy fn'); + var host = this.get('X-Forwarded-Host'); + + if (!host || !trust(this.connection.remoteAddress, 0)) { + host = this.get('Host'); + } else if (host.indexOf(',') !== -1) { + // Note: X-Forwarded-Host is normally only ever a + // single value, but this is to be safe. + host = host.substring(0, host.indexOf(',')).trimRight() + } + + if (!host) return; + + // IPv6 literal support + var offset = host[0] === '[' + ? host.indexOf(']') + 1 + : 0; + var index = host.indexOf(':', offset); + + return index !== -1 + ? host.substring(0, index) + : host; +}); + +// TODO: change req.host to return host in next major + +defineGetter(req, 'host', deprecate.function(function host(){ + return this.hostname; +}, 'req.host: Use req.hostname instead')); + +/** + * Check if the request is fresh, aka + * Last-Modified and/or the ETag + * still match. + * + * @return {Boolean} + * @public + */ + +defineGetter(req, 'fresh', function(){ + var method = this.method; + var res = this.res + var status = res.statusCode + + // GET or HEAD for weak freshness validation only + if ('GET' !== method && 'HEAD' !== method) return false; + + // 2xx or 304 as per rfc2616 14.26 + if ((status >= 200 && status < 300) || 304 === status) { + return fresh(this.headers, { + 'etag': res.get('ETag'), + 'last-modified': res.get('Last-Modified') + }) + } + + return false; +}); + +/** + * Check if the request is stale, aka + * "Last-Modified" and / or the "ETag" for the + * resource has changed. + * + * @return {Boolean} + * @public + */ + +defineGetter(req, 'stale', function stale(){ + return !this.fresh; +}); + +/** + * Check if the request was an _XMLHttpRequest_. 
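+ *
+ * Illustrative sketch (not part of the original source):
+ *
+ *     // with an "X-Requested-With: XMLHttpRequest" request header
+ *     req.xhr
+ *     // => true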
+ * + * @return {Boolean} + * @public + */ + +defineGetter(req, 'xhr', function xhr(){ + var val = this.get('X-Requested-With') || ''; + return val.toLowerCase() === 'xmlhttprequest'; +}); + +/** + * Helper function for creating a getter on an object. + * + * @param {Object} obj + * @param {String} name + * @param {Function} getter + * @private + */ +function defineGetter(obj, name, getter) { + Object.defineProperty(obj, name, { + configurable: true, + enumerable: true, + get: getter + }); +} diff --git a/node_modules/express/lib/response.js b/node_modules/express/lib/response.js new file mode 100644 index 000000000..c9f08cd54 --- /dev/null +++ b/node_modules/express/lib/response.js @@ -0,0 +1,1142 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var Buffer = require('safe-buffer').Buffer +var contentDisposition = require('content-disposition'); +var deprecate = require('depd')('express'); +var encodeUrl = require('encodeurl'); +var escapeHtml = require('escape-html'); +var http = require('http'); +var isAbsolute = require('./utils').isAbsolute; +var onFinished = require('on-finished'); +var path = require('path'); +var statuses = require('statuses') +var merge = require('utils-merge'); +var sign = require('cookie-signature').sign; +var normalizeType = require('./utils').normalizeType; +var normalizeTypes = require('./utils').normalizeTypes; +var setCharset = require('./utils').setCharset; +var cookie = require('cookie'); +var send = require('send'); +var extname = path.extname; +var mime = send.mime; +var resolve = path.resolve; +var vary = require('vary'); + +/** + * Response prototype. + * @public + */ + +var res = Object.create(http.ServerResponse.prototype) + +/** + * Module exports. + * @public + */ + +module.exports = res + +/** + * Module variables. + * @private + */ + +var charsetRegExp = /;\s*charset\s*=/; + +/** + * Set status `code`. + * + * @param {Number} code + * @return {ServerResponse} + * @public + */ + +res.status = function status(code) { + this.statusCode = code; + return this; +}; + +/** + * Set Link header field with the given `links`. + * + * Examples: + * + * res.links({ + * next: 'http://api.example.com/users?page=2', + * last: 'http://api.example.com/users?page=5' + * }); + * + * @param {Object} links + * @return {ServerResponse} + * @public + */ + +res.links = function(links){ + var link = this.get('Link') || ''; + if (link) link += ', '; + return this.set('Link', link + Object.keys(links).map(function(rel){ + return '<' + links[rel] + '>; rel="' + rel + '"'; + }).join(', ')); +}; + +/** + * Send a response. + * + * Examples: + * + * res.send(Buffer.from('wahoo')); + * res.send({ some: 'json' }); + * res.send('
<p>some html</p>
'); + * + * @param {string|number|boolean|object|Buffer} body + * @public + */ + +res.send = function send(body) { + var chunk = body; + var encoding; + var req = this.req; + var type; + + // settings + var app = this.app; + + // allow status / body + if (arguments.length === 2) { + // res.send(body, status) backwards compat + if (typeof arguments[0] !== 'number' && typeof arguments[1] === 'number') { + deprecate('res.send(body, status): Use res.status(status).send(body) instead'); + this.statusCode = arguments[1]; + } else { + deprecate('res.send(status, body): Use res.status(status).send(body) instead'); + this.statusCode = arguments[0]; + chunk = arguments[1]; + } + } + + // disambiguate res.send(status) and res.send(status, num) + if (typeof chunk === 'number' && arguments.length === 1) { + // res.send(status) will set status message as text string + if (!this.get('Content-Type')) { + this.type('txt'); + } + + deprecate('res.send(status): Use res.sendStatus(status) instead'); + this.statusCode = chunk; + chunk = statuses[chunk] + } + + switch (typeof chunk) { + // string defaulting to html + case 'string': + if (!this.get('Content-Type')) { + this.type('html'); + } + break; + case 'boolean': + case 'number': + case 'object': + if (chunk === null) { + chunk = ''; + } else if (Buffer.isBuffer(chunk)) { + if (!this.get('Content-Type')) { + this.type('bin'); + } + } else { + return this.json(chunk); + } + break; + } + + // write strings in utf-8 + if (typeof chunk === 'string') { + encoding = 'utf8'; + type = this.get('Content-Type'); + + // reflect this in content-type + if (typeof type === 'string') { + this.set('Content-Type', setCharset(type, 'utf-8')); + } + } + + // determine if ETag should be generated + var etagFn = app.get('etag fn') + var generateETag = !this.get('ETag') && typeof etagFn === 'function' + + // populate Content-Length + var len + if (chunk !== undefined) { + if (Buffer.isBuffer(chunk)) { + // get length of Buffer + len = chunk.length + } else if (!generateETag && chunk.length < 1000) { + // just calculate length when no ETag + small chunk + len = Buffer.byteLength(chunk, encoding) + } else { + // convert chunk to Buffer and calculate + chunk = Buffer.from(chunk, encoding) + encoding = undefined; + len = chunk.length + } + + this.set('Content-Length', len); + } + + // populate ETag + var etag; + if (generateETag && len !== undefined) { + if ((etag = etagFn(chunk, encoding))) { + this.set('ETag', etag); + } + } + + // freshness + if (req.fresh) this.statusCode = 304; + + // strip irrelevant headers + if (204 === this.statusCode || 304 === this.statusCode) { + this.removeHeader('Content-Type'); + this.removeHeader('Content-Length'); + this.removeHeader('Transfer-Encoding'); + chunk = ''; + } + + if (req.method === 'HEAD') { + // skip body for HEAD + this.end(); + } else { + // respond + this.end(chunk, encoding); + } + + return this; +}; + +/** + * Send JSON response. 
+ * + * Examples: + * + * res.json(null); + * res.json({ user: 'tj' }); + * + * @param {string|number|boolean|object} obj + * @public + */ + +res.json = function json(obj) { + var val = obj; + + // allow status / body + if (arguments.length === 2) { + // res.json(body, status) backwards compat + if (typeof arguments[1] === 'number') { + deprecate('res.json(obj, status): Use res.status(status).json(obj) instead'); + this.statusCode = arguments[1]; + } else { + deprecate('res.json(status, obj): Use res.status(status).json(obj) instead'); + this.statusCode = arguments[0]; + val = arguments[1]; + } + } + + // settings + var app = this.app; + var escape = app.get('json escape') + var replacer = app.get('json replacer'); + var spaces = app.get('json spaces'); + var body = stringify(val, replacer, spaces, escape) + + // content-type + if (!this.get('Content-Type')) { + this.set('Content-Type', 'application/json'); + } + + return this.send(body); +}; + +/** + * Send JSON response with JSONP callback support. + * + * Examples: + * + * res.jsonp(null); + * res.jsonp({ user: 'tj' }); + * + * @param {string|number|boolean|object} obj + * @public + */ + +res.jsonp = function jsonp(obj) { + var val = obj; + + // allow status / body + if (arguments.length === 2) { + // res.json(body, status) backwards compat + if (typeof arguments[1] === 'number') { + deprecate('res.jsonp(obj, status): Use res.status(status).json(obj) instead'); + this.statusCode = arguments[1]; + } else { + deprecate('res.jsonp(status, obj): Use res.status(status).jsonp(obj) instead'); + this.statusCode = arguments[0]; + val = arguments[1]; + } + } + + // settings + var app = this.app; + var escape = app.get('json escape') + var replacer = app.get('json replacer'); + var spaces = app.get('json spaces'); + var body = stringify(val, replacer, spaces, escape) + var callback = this.req.query[app.get('jsonp callback name')]; + + // content-type + if (!this.get('Content-Type')) { + this.set('X-Content-Type-Options', 'nosniff'); + this.set('Content-Type', 'application/json'); + } + + // fixup callback + if (Array.isArray(callback)) { + callback = callback[0]; + } + + // jsonp + if (typeof callback === 'string' && callback.length !== 0) { + this.set('X-Content-Type-Options', 'nosniff'); + this.set('Content-Type', 'text/javascript'); + + // restrict callback charset + callback = callback.replace(/[^\[\]\w$.]/g, ''); + + // replace chars not allowed in JavaScript that are in JSON + body = body + .replace(/\u2028/g, '\\u2028') + .replace(/\u2029/g, '\\u2029'); + + // the /**/ is a specific security mitigation for "Rosetta Flash JSONP abuse" + // the typeof check is just to reduce client error noise + body = '/**/ typeof ' + callback + ' === \'function\' && ' + callback + '(' + body + ');'; + } + + return this.send(body); +}; + +/** + * Send given HTTP status code. + * + * Sets the response status to `statusCode` and the body of the + * response to the standard description from node's http.STATUS_CODES + * or the statusCode number if no description. + * + * Examples: + * + * res.sendStatus(200); + * + * @param {number} statusCode + * @public + */ + +res.sendStatus = function sendStatus(statusCode) { + var body = statuses[statusCode] || String(statusCode) + + this.statusCode = statusCode; + this.type('txt'); + + return this.send(body); +}; + +/** + * Transfer the file at the given `path`. + * + * Automatically sets the _Content-Type_ response header field. 
+ * The callback `callback(err)` is invoked when the transfer is complete + * or when an error occurs. Be sure to check `res.sentHeader` + * if you wish to attempt responding, as the header and some data + * may have already been transferred. + * + * Options: + * + * - `maxAge` defaulting to 0 (can be string converted by `ms`) + * - `root` root directory for relative filenames + * - `headers` object of headers to serve with file + * - `dotfiles` serve dotfiles, defaulting to false; can be `"allow"` to send them + * + * Other options are passed along to `send`. + * + * Examples: + * + * The following example illustrates how `res.sendFile()` may + * be used as an alternative for the `static()` middleware for + * dynamic situations. The code backing `res.sendFile()` is actually + * the same code, so HTTP cache support etc is identical. + * + * app.get('/user/:uid/photos/:file', function(req, res){ + * var uid = req.params.uid + * , file = req.params.file; + * + * req.user.mayViewFilesFrom(uid, function(yes){ + * if (yes) { + * res.sendFile('/uploads/' + uid + '/' + file); + * } else { + * res.send(403, 'Sorry! you cant see that.'); + * } + * }); + * }); + * + * @public + */ + +res.sendFile = function sendFile(path, options, callback) { + var done = callback; + var req = this.req; + var res = this; + var next = req.next; + var opts = options || {}; + + if (!path) { + throw new TypeError('path argument is required to res.sendFile'); + } + + if (typeof path !== 'string') { + throw new TypeError('path must be a string to res.sendFile') + } + + // support function as second arg + if (typeof options === 'function') { + done = options; + opts = {}; + } + + if (!opts.root && !isAbsolute(path)) { + throw new TypeError('path must be absolute or specify root to res.sendFile'); + } + + // create file stream + var pathname = encodeURI(path); + var file = send(req, pathname, opts); + + // transfer + sendfile(res, file, opts, function (err) { + if (done) return done(err); + if (err && err.code === 'EISDIR') return next(); + + // next() all but write errors + if (err && err.code !== 'ECONNABORTED' && err.syscall !== 'write') { + next(err); + } + }); +}; + +/** + * Transfer the file at the given `path`. + * + * Automatically sets the _Content-Type_ response header field. + * The callback `callback(err)` is invoked when the transfer is complete + * or when an error occurs. Be sure to check `res.sentHeader` + * if you wish to attempt responding, as the header and some data + * may have already been transferred. + * + * Options: + * + * - `maxAge` defaulting to 0 (can be string converted by `ms`) + * - `root` root directory for relative filenames + * - `headers` object of headers to serve with file + * - `dotfiles` serve dotfiles, defaulting to false; can be `"allow"` to send them + * + * Other options are passed along to `send`. + * + * Examples: + * + * The following example illustrates how `res.sendfile()` may + * be used as an alternative for the `static()` middleware for + * dynamic situations. The code backing `res.sendfile()` is actually + * the same code, so HTTP cache support etc is identical. + * + * app.get('/user/:uid/photos/:file', function(req, res){ + * var uid = req.params.uid + * , file = req.params.file; + * + * req.user.mayViewFilesFrom(uid, function(yes){ + * if (yes) { + * res.sendfile('/uploads/' + uid + '/' + file); + * } else { + * res.send(403, 'Sorry! 
you cant see that.'); + * } + * }); + * }); + * + * @public + */ + +res.sendfile = function (path, options, callback) { + var done = callback; + var req = this.req; + var res = this; + var next = req.next; + var opts = options || {}; + + // support function as second arg + if (typeof options === 'function') { + done = options; + opts = {}; + } + + // create file stream + var file = send(req, path, opts); + + // transfer + sendfile(res, file, opts, function (err) { + if (done) return done(err); + if (err && err.code === 'EISDIR') return next(); + + // next() all but write errors + if (err && err.code !== 'ECONNABORTED' && err.syscall !== 'write') { + next(err); + } + }); +}; + +res.sendfile = deprecate.function(res.sendfile, + 'res.sendfile: Use res.sendFile instead'); + +/** + * Transfer the file at the given `path` as an attachment. + * + * Optionally providing an alternate attachment `filename`, + * and optional callback `callback(err)`. The callback is invoked + * when the data transfer is complete, or when an error has + * ocurred. Be sure to check `res.headersSent` if you plan to respond. + * + * Optionally providing an `options` object to use with `res.sendFile()`. + * This function will set the `Content-Disposition` header, overriding + * any `Content-Disposition` header passed as header options in order + * to set the attachment and filename. + * + * This method uses `res.sendFile()`. + * + * @public + */ + +res.download = function download (path, filename, options, callback) { + var done = callback; + var name = filename; + var opts = options || null + + // support function as second or third arg + if (typeof filename === 'function') { + done = filename; + name = null; + opts = null + } else if (typeof options === 'function') { + done = options + opts = null + } + + // set Content-Disposition when file is sent + var headers = { + 'Content-Disposition': contentDisposition(name || path) + }; + + // merge user-provided headers + if (opts && opts.headers) { + var keys = Object.keys(opts.headers) + for (var i = 0; i < keys.length; i++) { + var key = keys[i] + if (key.toLowerCase() !== 'content-disposition') { + headers[key] = opts.headers[key] + } + } + } + + // merge user-provided options + opts = Object.create(opts) + opts.headers = headers + + // Resolve the full path for sendFile + var fullPath = resolve(path); + + // send file + return this.sendFile(fullPath, opts, done) +}; + +/** + * Set _Content-Type_ response header with `type` through `mime.lookup()` + * when it does not contain "/", or set the Content-Type to `type` otherwise. + * + * Examples: + * + * res.type('.html'); + * res.type('html'); + * res.type('json'); + * res.type('application/json'); + * res.type('png'); + * + * @param {String} type + * @return {ServerResponse} for chaining + * @public + */ + +res.contentType = +res.type = function contentType(type) { + var ct = type.indexOf('/') === -1 + ? mime.lookup(type) + : type; + + return this.set('Content-Type', ct); +}; + +/** + * Respond to the Acceptable formats using an `obj` + * of mime-type callbacks. + * + * This method uses `req.accepted`, an array of + * acceptable types ordered by their quality values. + * When "Accept" is not present the _first_ callback + * is invoked, otherwise the first match is used. When + * no match is performed the server responds with + * 406 "Not Acceptable". + * + * Content-Type is set for you, however if you choose + * you may alter this within the callback using `res.type()` + * or `res.set('Content-Type', ...)`. 
+ * + * res.format({ + * 'text/plain': function(){ + * res.send('hey'); + * }, + * + * 'text/html': function(){ + * res.send('
<p>hey</p>
'); + * }, + + * 'application/json': function(){ + * res.send({ message: 'hey' }); + * } + * }); + + * In addition to canonicalized MIME types you may + * also use extnames mapped to these types: + + * res.format({ + * text: function(){ + * res.send('hey'); + * }, + + * html: function(){ + * res.send('
<p>hey</p>
'); + * }, + * + * json: function(){ + * res.send({ message: 'hey' }); + * } + * }); + * + * By default Express passes an `Error` + * with a `.status` of 406 to `next(err)` + * if a match is not made. If you provide + * a `.default` callback it will be invoked + * instead. + * + * @param {Object} obj + * @return {ServerResponse} for chaining + * @public + */ + +res.format = function(obj){ + var req = this.req; + var next = req.next; + + var fn = obj.default; + if (fn) delete obj.default; + var keys = Object.keys(obj); + + var key = keys.length > 0 + ? req.accepts(keys) + : false; + + this.vary("Accept"); + + if (key) { + this.set('Content-Type', normalizeType(key).value); + obj[key](req, this, next); + } else if (fn) { + fn(); + } else { + var err = new Error('Not Acceptable'); + err.status = err.statusCode = 406; + err.types = normalizeTypes(keys).map(function(o){ return o.value }); + next(err); + } + + return this; +}; + +/** + * Set _Content-Disposition_ header to _attachment_ with optional `filename`. + * + * @param {String} filename + * @return {ServerResponse} + * @public + */ + +res.attachment = function attachment(filename) { + if (filename) { + this.type(extname(filename)); + } + + this.set('Content-Disposition', contentDisposition(filename)); + + return this; +}; + +/** + * Append additional header `field` with value `val`. + * + * Example: + * + * res.append('Link', ['', '']); + * res.append('Set-Cookie', 'foo=bar; Path=/; HttpOnly'); + * res.append('Warning', '199 Miscellaneous warning'); + * + * @param {String} field + * @param {String|Array} val + * @return {ServerResponse} for chaining + * @public + */ + +res.append = function append(field, val) { + var prev = this.get(field); + var value = val; + + if (prev) { + // concat the new and prev vals + value = Array.isArray(prev) ? prev.concat(val) + : Array.isArray(val) ? [prev].concat(val) + : [prev, val]; + } + + return this.set(field, value); +}; + +/** + * Set header `field` to `val`, or pass + * an object of header fields. + * + * Examples: + * + * res.set('Foo', ['bar', 'baz']); + * res.set('Accept', 'application/json'); + * res.set({ Accept: 'text/plain', 'X-API-Key': 'tobi' }); + * + * Aliased as `res.header()`. + * + * @param {String|Object} field + * @param {String|Array} val + * @return {ServerResponse} for chaining + * @public + */ + +res.set = +res.header = function header(field, val) { + if (arguments.length === 2) { + var value = Array.isArray(val) + ? val.map(String) + : String(val); + + // add charset to content-type + if (field.toLowerCase() === 'content-type') { + if (Array.isArray(value)) { + throw new TypeError('Content-Type cannot be set to an Array'); + } + if (!charsetRegExp.test(value)) { + var charset = mime.charsets.lookup(value.split(';')[0]); + if (charset) value += '; charset=' + charset.toLowerCase(); + } + } + + this.setHeader(field, value); + } else { + for (var key in field) { + this.set(key, field[key]); + } + } + return this; +}; + +/** + * Get value for header `field`. + * + * @param {String} field + * @return {String} + * @public + */ + +res.get = function(field){ + return this.getHeader(field); +}; + +/** + * Clear cookie `name`. + * + * @param {String} name + * @param {Object} [options] + * @return {ServerResponse} for chaining + * @public + */ + +res.clearCookie = function clearCookie(name, options) { + var opts = merge({ expires: new Date(1), path: '/' }, options); + + return this.cookie(name, '', opts); +}; + +/** + * Set cookie `name` to `value`, with the given `options`. 
+ * + * Options: + * + * - `maxAge` max-age in milliseconds, converted to `expires` + * - `signed` sign the cookie + * - `path` defaults to "/" + * + * Examples: + * + * // "Remember Me" for 15 minutes + * res.cookie('rememberme', '1', { expires: new Date(Date.now() + 900000), httpOnly: true }); + * + * // same as above + * res.cookie('rememberme', '1', { maxAge: 900000, httpOnly: true }) + * + * @param {String} name + * @param {String|Object} value + * @param {Object} [options] + * @return {ServerResponse} for chaining + * @public + */ + +res.cookie = function (name, value, options) { + var opts = merge({}, options); + var secret = this.req.secret; + var signed = opts.signed; + + if (signed && !secret) { + throw new Error('cookieParser("secret") required for signed cookies'); + } + + var val = typeof value === 'object' + ? 'j:' + JSON.stringify(value) + : String(value); + + if (signed) { + val = 's:' + sign(val, secret); + } + + if ('maxAge' in opts) { + opts.expires = new Date(Date.now() + opts.maxAge); + opts.maxAge /= 1000; + } + + if (opts.path == null) { + opts.path = '/'; + } + + this.append('Set-Cookie', cookie.serialize(name, String(val), opts)); + + return this; +}; + +/** + * Set the location header to `url`. + * + * The given `url` can also be "back", which redirects + * to the _Referrer_ or _Referer_ headers or "/". + * + * Examples: + * + * res.location('/foo/bar').; + * res.location('http://example.com'); + * res.location('../login'); + * + * @param {String} url + * @return {ServerResponse} for chaining + * @public + */ + +res.location = function location(url) { + var loc = url; + + // "back" is an alias for the referrer + if (url === 'back') { + loc = this.req.get('Referrer') || '/'; + } + + // set location + return this.set('Location', encodeUrl(loc)); +}; + +/** + * Redirect to the given `url` with optional response `status` + * defaulting to 302. + * + * The resulting `url` is determined by `res.location()`, so + * it will play nicely with mounted apps, relative paths, + * `"back"` etc. + * + * Examples: + * + * res.redirect('/foo/bar'); + * res.redirect('http://example.com'); + * res.redirect(301, 'http://example.com'); + * res.redirect('../login'); // /blog/post/1 -> /blog/login + * + * @public + */ + +res.redirect = function redirect(url) { + var address = url; + var body; + var status = 302; + + // allow status / url + if (arguments.length === 2) { + if (typeof arguments[0] === 'number') { + status = arguments[0]; + address = arguments[1]; + } else { + deprecate('res.redirect(url, status): Use res.redirect(status, url) instead'); + status = arguments[1]; + } + } + + // Set location header + address = this.location(address).get('Location'); + + // Support text/{plain,html} by default + this.format({ + text: function(){ + body = statuses[status] + '. Redirecting to ' + address + }, + + html: function(){ + var u = escapeHtml(address); + body = '
<p>' + statuses[status] + '. Redirecting to <a href="' + u + '">' + u + '</a></p>
' + }, + + default: function(){ + body = ''; + } + }); + + // Respond + this.statusCode = status; + this.set('Content-Length', Buffer.byteLength(body)); + + if (this.req.method === 'HEAD') { + this.end(); + } else { + this.end(body); + } +}; + +/** + * Add `field` to Vary. If already present in the Vary set, then + * this call is simply ignored. + * + * @param {Array|String} field + * @return {ServerResponse} for chaining + * @public + */ + +res.vary = function(field){ + // checks for back-compat + if (!field || (Array.isArray(field) && !field.length)) { + deprecate('res.vary(): Provide a field name'); + return this; + } + + vary(this, field); + + return this; +}; + +/** + * Render `view` with the given `options` and optional callback `fn`. + * When a callback function is given a response will _not_ be made + * automatically, otherwise a response of _200_ and _text/html_ is given. + * + * Options: + * + * - `cache` boolean hinting to the engine it should cache + * - `filename` filename of the view being rendered + * + * @public + */ + +res.render = function render(view, options, callback) { + var app = this.req.app; + var done = callback; + var opts = options || {}; + var req = this.req; + var self = this; + + // support callback function as second arg + if (typeof options === 'function') { + done = options; + opts = {}; + } + + // merge res.locals + opts._locals = self.locals; + + // default callback to respond + done = done || function (err, str) { + if (err) return req.next(err); + self.send(str); + }; + + // render + app.render(view, opts, done); +}; + +// pipe the send file stream +function sendfile(res, file, options, callback) { + var done = false; + var streaming; + + // request aborted + function onaborted() { + if (done) return; + done = true; + + var err = new Error('Request aborted'); + err.code = 'ECONNABORTED'; + callback(err); + } + + // directory + function ondirectory() { + if (done) return; + done = true; + + var err = new Error('EISDIR, read'); + err.code = 'EISDIR'; + callback(err); + } + + // errors + function onerror(err) { + if (done) return; + done = true; + callback(err); + } + + // ended + function onend() { + if (done) return; + done = true; + callback(); + } + + // file + function onfile() { + streaming = false; + } + + // finished + function onfinish(err) { + if (err && err.code === 'ECONNRESET') return onaborted(); + if (err) return onerror(err); + if (done) return; + + setImmediate(function () { + if (streaming !== false && !done) { + onaborted(); + return; + } + + if (done) return; + done = true; + callback(); + }); + } + + // streaming + function onstream() { + streaming = true; + } + + file.on('directory', ondirectory); + file.on('end', onend); + file.on('error', onerror); + file.on('file', onfile); + file.on('stream', onstream); + onFinished(res, onfinish); + + if (options.headers) { + // set headers on successful transfer + file.on('headers', function headers(res) { + var obj = options.headers; + var keys = Object.keys(obj); + + for (var i = 0; i < keys.length; i++) { + var k = keys[i]; + res.setHeader(k, obj[k]); + } + }); + } + + // pipe + file.pipe(res); +} + +/** + * Stringify JSON, like JSON.stringify, but v8 optimized, with the + * ability to escape characters that can trigger HTML sniffing. 
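+ *
+ * Illustrative sketch (not part of the original source):
+ *
+ *     stringify({ tag: '<b>' }, null, 0, true)
+ *     // => '{"tag":"\u003cb\u003e"}'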
+ * + * @param {*} value + * @param {function} replaces + * @param {number} spaces + * @param {boolean} escape + * @returns {string} + * @private + */ + +function stringify (value, replacer, spaces, escape) { + // v8 checks arguments.length for optimizing simple call + // https://bugs.chromium.org/p/v8/issues/detail?id=4730 + var json = replacer || spaces + ? JSON.stringify(value, replacer, spaces) + : JSON.stringify(value); + + if (escape) { + json = json.replace(/[<>&]/g, function (c) { + switch (c.charCodeAt(0)) { + case 0x3c: + return '\\u003c' + case 0x3e: + return '\\u003e' + case 0x26: + return '\\u0026' + /* istanbul ignore next: unreachable default */ + default: + return c + } + }) + } + + return json +} diff --git a/node_modules/express/lib/router/index.js b/node_modules/express/lib/router/index.js new file mode 100644 index 000000000..69e6d3800 --- /dev/null +++ b/node_modules/express/lib/router/index.js @@ -0,0 +1,662 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var Route = require('./route'); +var Layer = require('./layer'); +var methods = require('methods'); +var mixin = require('utils-merge'); +var debug = require('debug')('express:router'); +var deprecate = require('depd')('express'); +var flatten = require('array-flatten'); +var parseUrl = require('parseurl'); +var setPrototypeOf = require('setprototypeof') + +/** + * Module variables. + * @private + */ + +var objectRegExp = /^\[object (\S+)\]$/; +var slice = Array.prototype.slice; +var toString = Object.prototype.toString; + +/** + * Initialize a new `Router` with the given `options`. + * + * @param {Object} [options] + * @return {Router} which is an callable function + * @public + */ + +var proto = module.exports = function(options) { + var opts = options || {}; + + function router(req, res, next) { + router.handle(req, res, next); + } + + // mixin Router class functions + setPrototypeOf(router, proto) + + router.params = {}; + router._params = []; + router.caseSensitive = opts.caseSensitive; + router.mergeParams = opts.mergeParams; + router.strict = opts.strict; + router.stack = []; + + return router; +}; + +/** + * Map the given param placeholder `name`(s) to the given callback. + * + * Parameter mapping is used to provide pre-conditions to routes + * which use normalized placeholders. For example a _:user_id_ parameter + * could automatically load a user's information from the database without + * any additional code, + * + * The callback uses the same signature as middleware, the only difference + * being that the value of the placeholder is passed, in this case the _id_ + * of the user. Once the `next()` function is invoked, just like middleware + * it will continue on to execute the route, or subsequent parameter functions. + * + * Just like in middleware, you must either respond to the request or call next + * to avoid stalling the request. 
+ * + * app.param('user_id', function(req, res, next, id){ + * User.find(id, function(err, user){ + * if (err) { + * return next(err); + * } else if (!user) { + * return next(new Error('failed to load user')); + * } + * req.user = user; + * next(); + * }); + * }); + * + * @param {String} name + * @param {Function} fn + * @return {app} for chaining + * @public + */ + +proto.param = function param(name, fn) { + // param logic + if (typeof name === 'function') { + deprecate('router.param(fn): Refactor to use path params'); + this._params.push(name); + return; + } + + // apply param functions + var params = this._params; + var len = params.length; + var ret; + + if (name[0] === ':') { + deprecate('router.param(' + JSON.stringify(name) + ', fn): Use router.param(' + JSON.stringify(name.substr(1)) + ', fn) instead'); + name = name.substr(1); + } + + for (var i = 0; i < len; ++i) { + if (ret = params[i](name, fn)) { + fn = ret; + } + } + + // ensure we end up with a + // middleware function + if ('function' !== typeof fn) { + throw new Error('invalid param() call for ' + name + ', got ' + fn); + } + + (this.params[name] = this.params[name] || []).push(fn); + return this; +}; + +/** + * Dispatch a req, res into the router. + * @private + */ + +proto.handle = function handle(req, res, out) { + var self = this; + + debug('dispatching %s %s', req.method, req.url); + + var idx = 0; + var protohost = getProtohost(req.url) || '' + var removed = ''; + var slashAdded = false; + var paramcalled = {}; + + // store options for OPTIONS request + // only used if OPTIONS request + var options = []; + + // middleware and routes + var stack = self.stack; + + // manage inter-router variables + var parentParams = req.params; + var parentUrl = req.baseUrl || ''; + var done = restore(out, req, 'baseUrl', 'next', 'params'); + + // setup next layer + req.next = next; + + // for options requests, respond with a default if nothing else responds + if (req.method === 'OPTIONS') { + done = wrap(done, function(old, err) { + if (err || options.length === 0) return old(err); + sendOptionsResponse(res, options, old); + }); + } + + // setup basic req values + req.baseUrl = parentUrl; + req.originalUrl = req.originalUrl || req.url; + + next(); + + function next(err) { + var layerError = err === 'route' + ? 
null + : err; + + // remove added slash + if (slashAdded) { + req.url = req.url.substr(1); + slashAdded = false; + } + + // restore altered req.url + if (removed.length !== 0) { + req.baseUrl = parentUrl; + req.url = protohost + removed + req.url.substr(protohost.length); + removed = ''; + } + + // signal to exit router + if (layerError === 'router') { + setImmediate(done, null) + return + } + + // no more matching layers + if (idx >= stack.length) { + setImmediate(done, layerError); + return; + } + + // get pathname of request + var path = getPathname(req); + + if (path == null) { + return done(layerError); + } + + // find next matching layer + var layer; + var match; + var route; + + while (match !== true && idx < stack.length) { + layer = stack[idx++]; + match = matchLayer(layer, path); + route = layer.route; + + if (typeof match !== 'boolean') { + // hold on to layerError + layerError = layerError || match; + } + + if (match !== true) { + continue; + } + + if (!route) { + // process non-route handlers normally + continue; + } + + if (layerError) { + // routes do not match with a pending error + match = false; + continue; + } + + var method = req.method; + var has_method = route._handles_method(method); + + // build up automatic options response + if (!has_method && method === 'OPTIONS') { + appendMethods(options, route._options()); + } + + // don't even bother matching route + if (!has_method && method !== 'HEAD') { + match = false; + continue; + } + } + + // no match + if (match !== true) { + return done(layerError); + } + + // store route for dispatch on change + if (route) { + req.route = route; + } + + // Capture one-time layer values + req.params = self.mergeParams + ? mergeParams(layer.params, parentParams) + : layer.params; + var layerPath = layer.path; + + // this should be done for the layer + self.process_params(layer, paramcalled, req, res, function (err) { + if (err) { + return next(layerError || err); + } + + if (route) { + return layer.handle_request(req, res, next); + } + + trim_prefix(layer, layerError, layerPath, path); + }); + } + + function trim_prefix(layer, layerError, layerPath, path) { + if (layerPath.length !== 0) { + // Validate path breaks on a path separator + var c = path[layerPath.length] + if (c && c !== '/' && c !== '.') return next(layerError) + + // Trim off the part of the url that matches the route + // middleware (.use stuff) needs to have the path stripped + debug('trim prefix (%s) from url %s', layerPath, req.url); + removed = layerPath; + req.url = protohost + req.url.substr(protohost.length + removed.length); + + // Ensure leading slash + if (!protohost && req.url[0] !== '/') { + req.url = '/' + req.url; + slashAdded = true; + } + + // Setup base URL (no trailing slash) + req.baseUrl = parentUrl + (removed[removed.length - 1] === '/' + ? removed.substring(0, removed.length - 1) + : removed); + } + + debug('%s %s : %s', layer.name, layerPath, req.originalUrl); + + if (layerError) { + layer.handle_error(layerError, req, res, next); + } else { + layer.handle_request(req, res, next); + } + } +}; + +/** + * Process any parameters for the layer. 
+ * @private + */ + +proto.process_params = function process_params(layer, called, req, res, done) { + var params = this.params; + + // captured parameters from the layer, keys and values + var keys = layer.keys; + + // fast track + if (!keys || keys.length === 0) { + return done(); + } + + var i = 0; + var name; + var paramIndex = 0; + var key; + var paramVal; + var paramCallbacks; + var paramCalled; + + // process params in order + // param callbacks can be async + function param(err) { + if (err) { + return done(err); + } + + if (i >= keys.length ) { + return done(); + } + + paramIndex = 0; + key = keys[i++]; + name = key.name; + paramVal = req.params[name]; + paramCallbacks = params[name]; + paramCalled = called[name]; + + if (paramVal === undefined || !paramCallbacks) { + return param(); + } + + // param previously called with same value or error occurred + if (paramCalled && (paramCalled.match === paramVal + || (paramCalled.error && paramCalled.error !== 'route'))) { + // restore value + req.params[name] = paramCalled.value; + + // next param + return param(paramCalled.error); + } + + called[name] = paramCalled = { + error: null, + match: paramVal, + value: paramVal + }; + + paramCallback(); + } + + // single param callbacks + function paramCallback(err) { + var fn = paramCallbacks[paramIndex++]; + + // store updated value + paramCalled.value = req.params[key.name]; + + if (err) { + // store error + paramCalled.error = err; + param(err); + return; + } + + if (!fn) return param(); + + try { + fn(req, res, paramCallback, paramVal, key.name); + } catch (e) { + paramCallback(e); + } + } + + param(); +}; + +/** + * Use the given middleware function, with optional path, defaulting to "/". + * + * Use (like `.all`) will run for any http METHOD, but it will not add + * handlers for those methods so OPTIONS requests will not consider `.use` + * functions even if they could respond. + * + * The other difference is that _route_ path is stripped and not visible + * to the handler function. The main effect of this feature is that mounted + * handlers can operate without any code changes regardless of the "prefix" + * pathname. + * + * @public + */ + +proto.use = function use(fn) { + var offset = 0; + var path = '/'; + + // default path to '/' + // disambiguate router.use([fn]) + if (typeof fn !== 'function') { + var arg = fn; + + while (Array.isArray(arg) && arg.length !== 0) { + arg = arg[0]; + } + + // first arg is the path + if (typeof arg !== 'function') { + offset = 1; + path = fn; + } + } + + var callbacks = flatten(slice.call(arguments, offset)); + + if (callbacks.length === 0) { + throw new TypeError('Router.use() requires a middleware function') + } + + for (var i = 0; i < callbacks.length; i++) { + var fn = callbacks[i]; + + if (typeof fn !== 'function') { + throw new TypeError('Router.use() requires a middleware function but got a ' + gettype(fn)) + } + + // add the middleware + debug('use %o %s', path, fn.name || '') + + var layer = new Layer(path, { + sensitive: this.caseSensitive, + strict: false, + end: false + }, fn); + + layer.route = undefined; + + this.stack.push(layer); + } + + return this; +}; + +/** + * Create a new Route for the given path. + * + * Each route contains a separate middleware stack and VERB handlers. + * + * See the Route api documentation for details on adding handlers + * and middleware to routes. 
+ * + * @param {String} path + * @return {Route} + * @public + */ + +proto.route = function route(path) { + var route = new Route(path); + + var layer = new Layer(path, { + sensitive: this.caseSensitive, + strict: this.strict, + end: true + }, route.dispatch.bind(route)); + + layer.route = route; + + this.stack.push(layer); + return route; +}; + +// create Router#VERB functions +methods.concat('all').forEach(function(method){ + proto[method] = function(path){ + var route = this.route(path) + route[method].apply(route, slice.call(arguments, 1)); + return this; + }; +}); + +// append methods to a list of methods +function appendMethods(list, addition) { + for (var i = 0; i < addition.length; i++) { + var method = addition[i]; + if (list.indexOf(method) === -1) { + list.push(method); + } + } +} + +// get pathname of request +function getPathname(req) { + try { + return parseUrl(req).pathname; + } catch (err) { + return undefined; + } +} + +// Get get protocol + host for a URL +function getProtohost(url) { + if (typeof url !== 'string' || url.length === 0 || url[0] === '/') { + return undefined + } + + var searchIndex = url.indexOf('?') + var pathLength = searchIndex !== -1 + ? searchIndex + : url.length + var fqdnIndex = url.substr(0, pathLength).indexOf('://') + + return fqdnIndex !== -1 + ? url.substr(0, url.indexOf('/', 3 + fqdnIndex)) + : undefined +} + +// get type for error message +function gettype(obj) { + var type = typeof obj; + + if (type !== 'object') { + return type; + } + + // inspect [[Class]] for objects + return toString.call(obj) + .replace(objectRegExp, '$1'); +} + +/** + * Match path to a layer. + * + * @param {Layer} layer + * @param {string} path + * @private + */ + +function matchLayer(layer, path) { + try { + return layer.match(path); + } catch (err) { + return err; + } +} + +// merge params with parent params +function mergeParams(params, parent) { + if (typeof parent !== 'object' || !parent) { + return params; + } + + // make copy of parent for base + var obj = mixin({}, parent); + + // simple non-numeric merging + if (!(0 in params) || !(0 in parent)) { + return mixin(obj, params); + } + + var i = 0; + var o = 0; + + // determine numeric gaps + while (i in params) { + i++; + } + + while (o in parent) { + o++; + } + + // offset numeric indices in params before merge + for (i--; i >= 0; i--) { + params[i + o] = params[i]; + + // create holes for the merge when necessary + if (i < o) { + delete params[i]; + } + } + + return mixin(obj, params); +} + +// restore obj props after function +function restore(fn, obj) { + var props = new Array(arguments.length - 2); + var vals = new Array(arguments.length - 2); + + for (var i = 0; i < props.length; i++) { + props[i] = arguments[i + 2]; + vals[i] = obj[props[i]]; + } + + return function () { + // restore vals + for (var i = 0; i < props.length; i++) { + obj[props[i]] = vals[i]; + } + + return fn.apply(this, arguments); + }; +} + +// send an OPTIONS response +function sendOptionsResponse(res, options, next) { + try { + var body = options.join(','); + res.set('Allow', body); + res.send(body); + } catch (err) { + next(err); + } +} + +// wrap a function +function wrap(old, fn) { + return function proxy() { + var args = new Array(arguments.length + 1); + + args[0] = old; + for (var i = 0, len = arguments.length; i < len; i++) { + args[i + 1] = arguments[i]; + } + + fn.apply(this, args); + }; +} diff --git a/node_modules/express/lib/router/layer.js b/node_modules/express/lib/router/layer.js new file mode 100644 index 
000000000..4dc8e86d4 --- /dev/null +++ b/node_modules/express/lib/router/layer.js @@ -0,0 +1,181 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var pathRegexp = require('path-to-regexp'); +var debug = require('debug')('express:router:layer'); + +/** + * Module variables. + * @private + */ + +var hasOwnProperty = Object.prototype.hasOwnProperty; + +/** + * Module exports. + * @public + */ + +module.exports = Layer; + +function Layer(path, options, fn) { + if (!(this instanceof Layer)) { + return new Layer(path, options, fn); + } + + debug('new %o', path) + var opts = options || {}; + + this.handle = fn; + this.name = fn.name || ''; + this.params = undefined; + this.path = undefined; + this.regexp = pathRegexp(path, this.keys = [], opts); + + // set fast path flags + this.regexp.fast_star = path === '*' + this.regexp.fast_slash = path === '/' && opts.end === false +} + +/** + * Handle the error for the layer. + * + * @param {Error} error + * @param {Request} req + * @param {Response} res + * @param {function} next + * @api private + */ + +Layer.prototype.handle_error = function handle_error(error, req, res, next) { + var fn = this.handle; + + if (fn.length !== 4) { + // not a standard error handler + return next(error); + } + + try { + fn(error, req, res, next); + } catch (err) { + next(err); + } +}; + +/** + * Handle the request for the layer. + * + * @param {Request} req + * @param {Response} res + * @param {function} next + * @api private + */ + +Layer.prototype.handle_request = function handle(req, res, next) { + var fn = this.handle; + + if (fn.length > 3) { + // not a standard request handler + return next(); + } + + try { + fn(req, res, next); + } catch (err) { + next(err); + } +}; + +/** + * Check if this route matches `path`, if so + * populate `.params`. + * + * @param {String} path + * @return {Boolean} + * @api private + */ + +Layer.prototype.match = function match(path) { + var match + + if (path != null) { + // fast path non-ending match for / (any path matches) + if (this.regexp.fast_slash) { + this.params = {} + this.path = '' + return true + } + + // fast path for * (everything matched in a param) + if (this.regexp.fast_star) { + this.params = {'0': decode_param(path)} + this.path = path + return true + } + + // match the path + match = this.regexp.exec(path) + } + + if (!match) { + this.params = undefined; + this.path = undefined; + return false; + } + + // store values + this.params = {}; + this.path = match[0] + + var keys = this.keys; + var params = this.params; + + for (var i = 1; i < match.length; i++) { + var key = keys[i - 1]; + var prop = key.name; + var val = decode_param(match[i]) + + if (val !== undefined || !(hasOwnProperty.call(params, prop))) { + params[prop] = val; + } + } + + return true; +}; + +/** + * Decode param value. 
+ * + * @param {string} val + * @return {string} + * @private + */ + +function decode_param(val) { + if (typeof val !== 'string' || val.length === 0) { + return val; + } + + try { + return decodeURIComponent(val); + } catch (err) { + if (err instanceof URIError) { + err.message = 'Failed to decode param \'' + val + '\''; + err.status = err.statusCode = 400; + } + + throw err; + } +} diff --git a/node_modules/express/lib/router/route.js b/node_modules/express/lib/router/route.js new file mode 100644 index 000000000..178df0d51 --- /dev/null +++ b/node_modules/express/lib/router/route.js @@ -0,0 +1,216 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var debug = require('debug')('express:router:route'); +var flatten = require('array-flatten'); +var Layer = require('./layer'); +var methods = require('methods'); + +/** + * Module variables. + * @private + */ + +var slice = Array.prototype.slice; +var toString = Object.prototype.toString; + +/** + * Module exports. + * @public + */ + +module.exports = Route; + +/** + * Initialize `Route` with the given `path`, + * + * @param {String} path + * @public + */ + +function Route(path) { + this.path = path; + this.stack = []; + + debug('new %o', path) + + // route handlers for various http methods + this.methods = {}; +} + +/** + * Determine if the route handles a given method. + * @private + */ + +Route.prototype._handles_method = function _handles_method(method) { + if (this.methods._all) { + return true; + } + + var name = method.toLowerCase(); + + if (name === 'head' && !this.methods['head']) { + name = 'get'; + } + + return Boolean(this.methods[name]); +}; + +/** + * @return {Array} supported HTTP methods + * @private + */ + +Route.prototype._options = function _options() { + var methods = Object.keys(this.methods); + + // append automatic head + if (this.methods.get && !this.methods.head) { + methods.push('head'); + } + + for (var i = 0; i < methods.length; i++) { + // make upper case + methods[i] = methods[i].toUpperCase(); + } + + return methods; +}; + +/** + * dispatch req, res into this route + * @private + */ + +Route.prototype.dispatch = function dispatch(req, res, done) { + var idx = 0; + var stack = this.stack; + if (stack.length === 0) { + return done(); + } + + var method = req.method.toLowerCase(); + if (method === 'head' && !this.methods['head']) { + method = 'get'; + } + + req.route = this; + + next(); + + function next(err) { + // signal to exit route + if (err && err === 'route') { + return done(); + } + + // signal to exit router + if (err && err === 'router') { + return done(err) + } + + var layer = stack[idx++]; + if (!layer) { + return done(err); + } + + if (layer.method && layer.method !== method) { + return next(err); + } + + if (err) { + layer.handle_error(err, req, res, next); + } else { + layer.handle_request(req, res, next); + } + } +}; + +/** + * Add a handler for all HTTP verbs to this route. + * + * Behaves just like middleware and can respond or call `next` + * to continue processing. + * + * You can use multiple `.all` call to add multiple handlers. 
+ * + * function check_something(req, res, next){ + * next(); + * }; + * + * function validate_user(req, res, next){ + * next(); + * }; + * + * route + * .all(validate_user) + * .all(check_something) + * .get(function(req, res, next){ + * res.send('hello world'); + * }); + * + * @param {function} handler + * @return {Route} for chaining + * @api public + */ + +Route.prototype.all = function all() { + var handles = flatten(slice.call(arguments)); + + for (var i = 0; i < handles.length; i++) { + var handle = handles[i]; + + if (typeof handle !== 'function') { + var type = toString.call(handle); + var msg = 'Route.all() requires a callback function but got a ' + type + throw new TypeError(msg); + } + + var layer = Layer('/', {}, handle); + layer.method = undefined; + + this.methods._all = true; + this.stack.push(layer); + } + + return this; +}; + +methods.forEach(function(method){ + Route.prototype[method] = function(){ + var handles = flatten(slice.call(arguments)); + + for (var i = 0; i < handles.length; i++) { + var handle = handles[i]; + + if (typeof handle !== 'function') { + var type = toString.call(handle); + var msg = 'Route.' + method + '() requires a callback function but got a ' + type + throw new Error(msg); + } + + debug('%s %o', method, this.path) + + var layer = Layer('/', {}, handle); + layer.method = method; + + this.methods[method] = true; + this.stack.push(layer); + } + + return this; + }; +}); diff --git a/node_modules/express/lib/utils.js b/node_modules/express/lib/utils.js new file mode 100644 index 000000000..bd81ac7f6 --- /dev/null +++ b/node_modules/express/lib/utils.js @@ -0,0 +1,306 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @api private + */ + +var Buffer = require('safe-buffer').Buffer +var contentDisposition = require('content-disposition'); +var contentType = require('content-type'); +var deprecate = require('depd')('express'); +var flatten = require('array-flatten'); +var mime = require('send').mime; +var etag = require('etag'); +var proxyaddr = require('proxy-addr'); +var qs = require('qs'); +var querystring = require('querystring'); + +/** + * Return strong ETag for `body`. + * + * @param {String|Buffer} body + * @param {String} [encoding] + * @return {String} + * @api private + */ + +exports.etag = createETagGenerator({ weak: false }) + +/** + * Return weak ETag for `body`. + * + * @param {String|Buffer} body + * @param {String} [encoding] + * @return {String} + * @api private + */ + +exports.wetag = createETagGenerator({ weak: true }) + +/** + * Check if `path` looks absolute. + * + * @param {String} path + * @return {Boolean} + * @api private + */ + +exports.isAbsolute = function(path){ + if ('/' === path[0]) return true; + if (':' === path[1] && ('\\' === path[2] || '/' === path[2])) return true; // Windows device path + if ('\\\\' === path.substring(0, 2)) return true; // Microsoft Azure absolute path +}; + +/** + * Flatten the given `arr`. + * + * @param {Array} arr + * @return {Array} + * @api private + */ + +exports.flatten = deprecate.function(flatten, + 'utils.flatten: use array-flatten npm module instead'); + +/** + * Normalize the given `type`, for example "html" becomes "text/html". + * + * @param {String} type + * @return {Object} + * @api private + */ + +exports.normalizeType = function(type){ + return ~type.indexOf('/') + ? 
acceptParams(type) + : { value: mime.lookup(type), params: {} }; +}; + +/** + * Normalize `types`, for example "html" becomes "text/html". + * + * @param {Array} types + * @return {Array} + * @api private + */ + +exports.normalizeTypes = function(types){ + var ret = []; + + for (var i = 0; i < types.length; ++i) { + ret.push(exports.normalizeType(types[i])); + } + + return ret; +}; + +/** + * Generate Content-Disposition header appropriate for the filename. + * non-ascii filenames are urlencoded and a filename* parameter is added + * + * @param {String} filename + * @return {String} + * @api private + */ + +exports.contentDisposition = deprecate.function(contentDisposition, + 'utils.contentDisposition: use content-disposition npm module instead'); + +/** + * Parse accept params `str` returning an + * object with `.value`, `.quality` and `.params`. + * also includes `.originalIndex` for stable sorting + * + * @param {String} str + * @return {Object} + * @api private + */ + +function acceptParams(str, index) { + var parts = str.split(/ *; */); + var ret = { value: parts[0], quality: 1, params: {}, originalIndex: index }; + + for (var i = 1; i < parts.length; ++i) { + var pms = parts[i].split(/ *= */); + if ('q' === pms[0]) { + ret.quality = parseFloat(pms[1]); + } else { + ret.params[pms[0]] = pms[1]; + } + } + + return ret; +} + +/** + * Compile "etag" value to function. + * + * @param {Boolean|String|Function} val + * @return {Function} + * @api private + */ + +exports.compileETag = function(val) { + var fn; + + if (typeof val === 'function') { + return val; + } + + switch (val) { + case true: + fn = exports.wetag; + break; + case false: + break; + case 'strong': + fn = exports.etag; + break; + case 'weak': + fn = exports.wetag; + break; + default: + throw new TypeError('unknown value for etag function: ' + val); + } + + return fn; +} + +/** + * Compile "query parser" value to function. + * + * @param {String|Function} val + * @return {Function} + * @api private + */ + +exports.compileQueryParser = function compileQueryParser(val) { + var fn; + + if (typeof val === 'function') { + return val; + } + + switch (val) { + case true: + fn = querystring.parse; + break; + case false: + fn = newObject; + break; + case 'extended': + fn = parseExtendedQueryString; + break; + case 'simple': + fn = querystring.parse; + break; + default: + throw new TypeError('unknown value for query parser function: ' + val); + } + + return fn; +} + +/** + * Compile "proxy trust" value to function. + * + * @param {Boolean|String|Number|Array|Function} val + * @return {Function} + * @api private + */ + +exports.compileTrust = function(val) { + if (typeof val === 'function') return val; + + if (val === true) { + // Support plain true/false + return function(){ return true }; + } + + if (typeof val === 'number') { + // Support trusting hop count + return function(a, i){ return i < val }; + } + + if (typeof val === 'string') { + // Support comma-separated values + val = val.split(/ *, */); + } + + return proxyaddr.compile(val || []); +} + +/** + * Set the charset in a given Content-Type string. 
+ * + * @param {String} type + * @param {String} charset + * @return {String} + * @api private + */ + +exports.setCharset = function setCharset(type, charset) { + if (!type || !charset) { + return type; + } + + // parse type + var parsed = contentType.parse(type); + + // set charset + parsed.parameters.charset = charset; + + // format type + return contentType.format(parsed); +}; + +/** + * Create an ETag generator function, generating ETags with + * the given options. + * + * @param {object} options + * @return {function} + * @private + */ + +function createETagGenerator (options) { + return function generateETag (body, encoding) { + var buf = !Buffer.isBuffer(body) + ? Buffer.from(body, encoding) + : body + + return etag(buf, options) + } +} + +/** + * Parse an extended query string with qs. + * + * @return {Object} + * @private + */ + +function parseExtendedQueryString(str) { + return qs.parse(str, { + allowPrototypes: true + }); +} + +/** + * Return new empty object. + * + * @return {Object} + * @api private + */ + +function newObject() { + return {}; +} diff --git a/node_modules/express/lib/view.js b/node_modules/express/lib/view.js new file mode 100644 index 000000000..cf101caea --- /dev/null +++ b/node_modules/express/lib/view.js @@ -0,0 +1,182 @@ +/*! + * express + * Copyright(c) 2009-2013 TJ Holowaychuk + * Copyright(c) 2013 Roman Shtylman + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var debug = require('debug')('express:view'); +var path = require('path'); +var fs = require('fs'); + +/** + * Module variables. + * @private + */ + +var dirname = path.dirname; +var basename = path.basename; +var extname = path.extname; +var join = path.join; +var resolve = path.resolve; + +/** + * Module exports. + * @public + */ + +module.exports = View; + +/** + * Initialize a new `View` with the given `name`. + * + * Options: + * + * - `defaultEngine` the default template engine name + * - `engines` template engine require() cache + * - `root` root path for view lookup + * + * @param {string} name + * @param {object} options + * @public + */ + +function View(name, options) { + var opts = options || {}; + + this.defaultEngine = opts.defaultEngine; + this.ext = extname(name); + this.name = name; + this.root = opts.root; + + if (!this.ext && !this.defaultEngine) { + throw new Error('No default engine was specified and no extension was provided.'); + } + + var fileName = name; + + if (!this.ext) { + // get extension from default engine name + this.ext = this.defaultEngine[0] !== '.' + ? '.' 
+ this.defaultEngine + : this.defaultEngine; + + fileName += this.ext; + } + + if (!opts.engines[this.ext]) { + // load engine + var mod = this.ext.substr(1) + debug('require "%s"', mod) + + // default engine export + var fn = require(mod).__express + + if (typeof fn !== 'function') { + throw new Error('Module "' + mod + '" does not provide a view engine.') + } + + opts.engines[this.ext] = fn + } + + // store loaded engine + this.engine = opts.engines[this.ext]; + + // lookup path + this.path = this.lookup(fileName); +} + +/** + * Lookup view by the given `name` + * + * @param {string} name + * @private + */ + +View.prototype.lookup = function lookup(name) { + var path; + var roots = [].concat(this.root); + + debug('lookup "%s"', name); + + for (var i = 0; i < roots.length && !path; i++) { + var root = roots[i]; + + // resolve the path + var loc = resolve(root, name); + var dir = dirname(loc); + var file = basename(loc); + + // resolve the file + path = this.resolve(dir, file); + } + + return path; +}; + +/** + * Render with the given options. + * + * @param {object} options + * @param {function} callback + * @private + */ + +View.prototype.render = function render(options, callback) { + debug('render "%s"', this.path); + this.engine(this.path, options, callback); +}; + +/** + * Resolve the file within the given directory. + * + * @param {string} dir + * @param {string} file + * @private + */ + +View.prototype.resolve = function resolve(dir, file) { + var ext = this.ext; + + // . + var path = join(dir, file); + var stat = tryStat(path); + + if (stat && stat.isFile()) { + return path; + } + + // /index. + path = join(dir, basename(file, ext), 'index' + ext); + stat = tryStat(path); + + if (stat && stat.isFile()) { + return path; + } +}; + +/** + * Return a stat, maybe. 
+ * + * @param {string} path + * @return {fs.Stats} + * @private + */ + +function tryStat(path) { + debug('stat "%s"', path); + + try { + return fs.statSync(path); + } catch (e) { + return undefined; + } +} diff --git a/node_modules/express/package.json b/node_modules/express/package.json new file mode 100644 index 000000000..7b840803b --- /dev/null +++ b/node_modules/express/package.json @@ -0,0 +1,154 @@ +{ + "_from": "express@^4.17.1", + "_id": "express@4.17.1", + "_inBundle": false, + "_integrity": "sha512-mHJ9O79RqluphRrcw2X/GTh3k9tVv8YcoyY4Kkh4WDMUYKRZUq0h1o0w2rrrxBqM7VoeUVqgb27xlEMXTnYt4g==", + "_location": "/express", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "express@^4.17.1", + "name": "express", + "escapedName": "express", + "rawSpec": "^4.17.1", + "saveSpec": null, + "fetchSpec": "^4.17.1" + }, + "_requiredBy": [ + "#DEV:/", + "#USER" + ], + "_resolved": "https://registry.npmjs.org/express/-/express-4.17.1.tgz", + "_shasum": "4491fc38605cf51f8629d39c2b5d026f98a4c134", + "_spec": "express@^4.17.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project", + "author": { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca" + }, + "bugs": { + "url": "https://github.com/expressjs/express/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Aaron Heckmann", + "email": "aaron.heckmann+github@gmail.com" + }, + { + "name": "Ciaran Jessup", + "email": "ciaranj@gmail.com" + }, + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Guillermo Rauch", + "email": "rauchg@gmail.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com" + }, + { + "name": "Roman Shtylman", + "email": "shtylman+expressjs@gmail.com" + }, + { + "name": "Young Jae Sim", + "email": "hanul@hanul.me" + } + ], + "dependencies": { + "accepts": "~1.3.7", + "array-flatten": "1.1.1", + "body-parser": "1.19.0", + "content-disposition": "0.5.3", + "content-type": "~1.0.4", + "cookie": "0.4.0", + "cookie-signature": "1.0.6", + "debug": "2.6.9", + "depd": "~1.1.2", + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "etag": "~1.8.1", + "finalhandler": "~1.1.2", + "fresh": "0.5.2", + "merge-descriptors": "1.0.1", + "methods": "~1.1.2", + "on-finished": "~2.3.0", + "parseurl": "~1.3.3", + "path-to-regexp": "0.1.7", + "proxy-addr": "~2.0.5", + "qs": "6.7.0", + "range-parser": "~1.2.1", + "safe-buffer": "5.1.2", + "send": "0.17.1", + "serve-static": "1.14.1", + "setprototypeof": "1.1.1", + "statuses": "~1.5.0", + "type-is": "~1.6.18", + "utils-merge": "1.0.1", + "vary": "~1.1.2" + }, + "deprecated": false, + "description": "Fast, unopinionated, minimalist web framework", + "devDependencies": { + "after": "0.8.2", + "connect-redis": "3.4.1", + "cookie-parser": "~1.4.4", + "cookie-session": "1.3.3", + "ejs": "2.6.1", + "eslint": "2.13.1", + "express-session": "1.16.1", + "hbs": "4.0.4", + "istanbul": "0.4.5", + "marked": "0.6.2", + "method-override": "3.0.0", + "mocha": "5.2.0", + "morgan": "1.9.1", + "multiparty": "4.2.1", + "pbkdf2-password": "1.2.1", + "should": "13.2.3", + "supertest": "3.3.0", + "vhost": "~3.0.2" + }, + "engines": { + "node": ">= 0.10.0" + }, + "files": [ + "LICENSE", + "History.md", + "Readme.md", + "index.js", + "lib/" + ], + "homepage": "http://expressjs.com/", + "keywords": [ + "express", + "framework", + "sinatra", + "web", + "rest", + "restful", + "router", + "app", + "api" + ], + "license": "MIT", + "name": "express", + "repository": { 
+ "type": "git", + "url": "git+https://github.com/expressjs/express.git" + }, + "scripts": { + "lint": "eslint .", + "test": "mocha --require test/support/env --reporter spec --bail --check-leaks test/ test/acceptance/", + "test-ci": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --require test/support/env --reporter spec --check-leaks test/ test/acceptance/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --require test/support/env --reporter dot --check-leaks test/ test/acceptance/", + "test-tap": "mocha --require test/support/env --reporter tap --check-leaks test/ test/acceptance/" + }, + "version": "4.17.1" +} diff --git a/node_modules/filelist/README.md b/node_modules/filelist/README.md new file mode 100644 index 000000000..b52ebe7c5 --- /dev/null +++ b/node_modules/filelist/README.md @@ -0,0 +1,84 @@ +## FileList + +A FileList is a lazy-evaluated list of files. When given a list +of glob patterns for possible files to be included in the file +list, instead of searching the file structures to find the files, +a FileList holds the pattern for latter use. + +This allows you to define a FileList to match any number of +files, but only search out the actual files when then FileList +itself is actually used. The key is that the first time an +element of the FileList/Array is requested, the pending patterns +are resolved into a real list of file names. + +### Usage + +Add files to the list with the `include` method. You can add glob +patterns, individual files, or RegExp objects. When the Array +methods are invoked on the FileList, these items are resolved to +an actual list of files. + +```javascript +var fl = new FileList(); +fl.include('test/*.js'); +fl.exclude('test/helpers.js'); +``` + +Use the `exclude` method to override inclusions. You can use this +when your inclusions are too broad. + +### Array methods + +FileList has lazy-evaluated versions of most of the array +methods, including the following: + +* join +* pop +* push +* concat +* reverse +* shift +* unshift +* slice +* splice +* sort +* filter +* forEach +* some +* every +* map +* indexOf +* lastIndexOf +* reduce +* reduceRight + +When you call one of these methods, the items in the FileList +will be resolved to the full list of files, and the method will +be invoked on that result. + +### Special `length` method + +`length`: FileList includes a length *method* (instead of a +property) which returns the number of actual files in the list +once it's been resolved. + +### FileList-specific methods + +`include`: Add a filename/glob/regex to the list + +`exclude`: Override inclusions by excluding a filename/glob/regex + +`resolve`: Resolve the items in the FileList to the full list of +files. This method is invoked automatically when one of the array +methods is called. + +`toArray`: Immediately resolves the list of items, and returns an +actual array of filepaths. + +`clearInclusions`: Clears any pending items -- must be used +before resolving the list. + +`clearExclusions`: Clears the list of exclusions rules. 
+ + + diff --git a/node_modules/filelist/index.d.ts b/node_modules/filelist/index.d.ts new file mode 100644 index 000000000..dbf6dc5d4 --- /dev/null +++ b/node_modules/filelist/index.d.ts @@ -0,0 +1,54 @@ +// Type definitions for filelist v0.0.6 +// Project: https://github.com/mde/filelist +// Definitions by: Christophe MASSOLIN +// Definitions: https://github.com/DefinitelyTyped/DefinitelyTyped + +declare module "filelist" { + export class FileList { + pendingAdd: string[] + pending: boolean + excludes: { + pats: RegExp[], + funcs: Function[], + regex: null | RegExp + } + items: string[] + static clone(): FileList + static verbose: boolean + toArray(): string[] + include(options: any, ...items: string[]): void + exclude(...items: string[]): void + resolve(): void + clearInclusions(): void + clearExclusions(): void + length(): number + toString(): string; + toLocaleString(): string; + push(...items: string[]): number; + pop(): string | undefined; + concat(...items: ReadonlyArray[]): string[]; + concat(...items: (string | ReadonlyArray)[]): string[]; + join(separator?: string): string; + reverse(): string[]; + shift(): string | undefined; + slice(start?: number, end?: number): string[]; + sort(compareFn?: (a: string, b: string) => number): this; + splice(start: number, deleteCount?: number): string[]; + splice(start: number, deleteCount: number, ...items: string[]): string[]; + unshift(...items: string[]): number; + indexOf(searchElement: string, fromIndex?: number): number; + lastIndexOf(searchElement: string, fromIndex?: number): number; + every(callbackfn: (value: string, index: number, array: string[]) => boolean, thisArg?: any): boolean; + some(callbackfn: (value: string, index: number, array: string[]) => boolean, thisArg?: any): boolean; + forEach(callbackfn: (value: string, index: number, array: string[]) => void, thisArg?: any): void; + map(callbackfn: (value: string, index: number, array: string[]) => U, thisArg?: any): U[]; + filter(callbackfn: (value: string, index: number, array: string[]) => value is S, thisArg?: any): S[]; + filter(callbackfn: (value: string, index: number, array: string[]) => any, thisArg?: any): string[]; + reduce(callbackfn: (previousValue: string, currentValue: string, currentIndex: number, array: string[]) => string): string; + reduce(callbackfn: (previousValue: string, currentValue: string, currentIndex: number, array: string[]) => string, initialValue: string): string; + reduce(callbackfn: (previousValue: U, currentValue: string, currentIndex: number, array: string[]) => U, initialValue: U): U; + reduceRight(callbackfn: (previousValue: string, currentValue: string, currentIndex: number, array: string[]) => string): string; + reduceRight(callbackfn: (previousValue: string, currentValue: string, currentIndex: number, array: string[]) => string, initialValue: string): string; + reduceRight(callbackfn: (previousValue: U, currentValue: string, currentIndex: number, array: string[]) => U, initialValue: U): U; + } +} \ No newline at end of file diff --git a/node_modules/filelist/index.js b/node_modules/filelist/index.js new file mode 100644 index 000000000..fef873a54 --- /dev/null +++ b/node_modules/filelist/index.js @@ -0,0 +1,486 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ +var fs = require('fs') +, path = require('path') +, minimatch = require('minimatch') +, escapeRegExpChars +, merge +, basedir +, _readDir +, readdirR +, globSync; + + /** + @name escapeRegExpChars + @function + @return {String} A string of escaped characters + @description Escapes regex control-characters in strings + used to build regexes dynamically + @param {String} string The string of chars to escape + */ + escapeRegExpChars = (function () { + var specials = [ '^', '$', '/', '.', '*', '+', '?', '|', '(', ')', + '[', ']', '{', '}', '\\' ]; + var sRE = new RegExp('(\\' + specials.join('|\\') + ')', 'g'); + return function (string) { + var str = string || ''; + str = String(str); + return str.replace(sRE, '\\$1'); + }; + })(); + + /** + @name merge + @function + @return {Object} Returns the merged object + @description Merge merges `otherObject` into `object` and takes care of deep + merging of objects + @param {Object} object Object to merge into + @param {Object} otherObject Object to read from + */ + merge = function (object, otherObject) { + var obj = object || {} + , otherObj = otherObject || {} + , key, value; + + for (key in otherObj) { + value = otherObj[key]; + + // Check if a value is an Object, if so recursively add it's key/values + if (typeof value === 'object' && !(value instanceof Array)) { + // Update value of object to the one from otherObj + obj[key] = merge(obj[key], value); + } + // Value is anything other than an Object, so just add it + else { + obj[key] = value; + } + } + + return obj; + }; + /** + Given a patern, return the base directory of it (ie. the folder + that will contain all the files matching the path). + eg. file.basedir('/test/**') => '/test/' + Path ending by '/' are considerd as folder while other are considerd + as files, eg.: + file.basedir('/test/a/') => '/test/a' + file.basedir('/test/a') => '/test' + The returned path always end with a '/' so we have: + file.basedir(file.basedir(x)) == file.basedir(x) + */ + basedir = function (pathParam) { + var bd = '' + , parts + , part + , pos = 0 + , p = pathParam || ''; + + // If the path has a leading asterisk, basedir is the current dir + if (p.indexOf('*') == 0 || p.indexOf('**') == 0) { + return '.'; + } + + // always consider .. 
at the end as a folder and not a filename + if (/(?:^|\/|\\)\.\.$/.test(p.slice(-3))) { + p += '/'; + } + + parts = p.split(/\\|\//); + for (var i = 0, l = parts.length - 1; i < l; i++) { + part = parts[i]; + if (part.indexOf('*') > -1 || part.indexOf('**') > -1) { + break; + } + pos += part.length + 1; + bd += part + p[pos - 1]; + } + if (!bd) { + bd = '.'; + } + // Strip trailing slashes + if (!(bd == '\\' || bd == '/')) { + bd = bd.replace(/\\$|\/$/, ''); + } + return bd; + + }; + + // Return the contents of a given directory + _readDir = function (dirPath) { + var dir = path.normalize(dirPath) + , paths = [] + , ret = [dir] + , msg; + + try { + paths = fs.readdirSync(dir); + } + catch (e) { + msg = 'Could not read path ' + dir + '\n'; + if (e.stack) { + msg += e.stack; + } + throw new Error(msg); + } + + paths.forEach(function (p) { + var curr = path.join(dir, p); + var stat = fs.statSync(curr); + if (stat.isDirectory()) { + ret = ret.concat(_readDir(curr)); + } + else { + ret.push(curr); + } + }); + + return ret; + }; + + /** + @name file#readdirR + @function + @return {Array} Returns the contents as an Array, can be configured via opts.format + @description Reads the given directory returning it's contents + @param {String} dir The directory to read + @param {Object} opts Options to use + @param {String} [opts.format] Set the format to return(Default: Array) + */ + readdirR = function (dir, opts) { + var options = opts || {} + , format = options.format || 'array' + , ret; + ret = _readDir(dir); + return format == 'string' ? ret.join('\n') : ret; + }; + + +globSync = function (pat, opts) { + var dirname = basedir(pat) + , files + , matches; + + try { + files = readdirR(dirname).map(function(file){ + return file.replace(/\\/g, '/'); + }); + } + // Bail if path doesn't exist -- assume no files + catch(e) { + if (FileList.verbose) console.error(e.message); + } + + if (files) { + pat = path.normalize(pat); + matches = minimatch.match(files, pat, opts || {}); + } + return matches || []; +}; + +// Constants +// --------------- +// List of all the builtin Array methods we want to override +var ARRAY_METHODS = Object.getOwnPropertyNames(Array.prototype) +// Array methods that return a copy instead of affecting the original + , SPECIAL_RETURN = { + 'concat': true + , 'slice': true + , 'filter': true + , 'map': true + } +// Default file-patterns we want to ignore + , DEFAULT_IGNORE_PATTERNS = [ + /(^|[\/\\])CVS([\/\\]|$)/ + , /(^|[\/\\])\.svn([\/\\]|$)/ + , /(^|[\/\\])\.git([\/\\]|$)/ + , /\.bak$/ + , /~$/ + ] +// Ignore core files + , DEFAULT_IGNORE_FUNCS = [ + function (name) { + var isDir = false + , stats; + try { + stats = fs.statSync(name); + isDir = stats.isDirectory(); + } + catch(e) {} + return (/(^|[\/\\])core$/).test(name) && !isDir; + } + ]; + +var FileList = function () { + var self = this + , wrap; + + // List of glob-patterns or specific filenames + this.pendingAdd = []; + // Switched to false after lazy-eval of files + this.pending = true; + // Used to calculate exclusions from the list of files + this.excludes = { + pats: DEFAULT_IGNORE_PATTERNS.slice() + , funcs: DEFAULT_IGNORE_FUNCS.slice() + , regex: null + }; + this.items = []; + + // Wrap the array methods with the delegates + wrap = function (prop) { + var arr; + self[prop] = function () { + if (self.pending) { + self.resolve(); + } + if (typeof self.items[prop] == 'function') { + // Special method that return a copy + if (SPECIAL_RETURN[prop]) { + arr = self.items[prop].apply(self.items, arguments); + return 
FileList.clone(self, arr); + } + else { + return self.items[prop].apply(self.items, arguments); + } + } + else { + return self.items[prop]; + } + }; + }; + for (var i = 0, ii = ARRAY_METHODS.length; i < ii; i++) { + wrap(ARRAY_METHODS[i]); + } + + // Include whatever files got passed to the constructor + this.include.apply(this, arguments); + + // Fix constructor linkage + this.constructor = FileList; +}; + +FileList.prototype = new (function () { + var globPattern = /[*?\[\{]/; + + var _addMatching = function (item) { + var matches = globSync(item.path, item.options); + this.items = this.items.concat(matches); + } + + , _resolveAdd = function (item) { + if (globPattern.test(item.path)) { + _addMatching.call(this, item); + } + else { + this.push(item.path); + } + } + + , _calculateExcludeRe = function () { + var pats = this.excludes.pats + , pat + , excl = [] + , matches = []; + + for (var i = 0, ii = pats.length; i < ii; i++) { + pat = pats[i]; + if (typeof pat == 'string') { + // Glob, look up files + if (/[*?]/.test(pat)) { + matches = globSync(pat); + matches = matches.map(function (m) { + return escapeRegExpChars(m); + }); + excl = excl.concat(matches); + } + // String for regex + else { + excl.push(escapeRegExpChars(pat)); + } + } + // Regex, grab the string-representation + else if (pat instanceof RegExp) { + excl.push(pat.toString().replace(/^\/|\/$/g, '')); + } + } + if (excl.length) { + this.excludes.regex = new RegExp('(' + excl.join(')|(') + ')'); + } + else { + this.excludes.regex = /^$/; + } + } + + , _resolveExclude = function () { + var self = this; + _calculateExcludeRe.call(this); + // No `reject` method, so use reverse-filter + this.items = this.items.filter(function (name) { + return !self.shouldExclude(name); + }); + }; + + /** + * Includes file-patterns in the FileList. Should be called with one or more + * pattern for finding file to include in the list. Arguments should be strings + * for either a glob-pattern or a specific file-name, or an array of them + */ + this.include = function () { + var args = Array.prototype.slice.call(arguments) + , arg + , includes = { items: [], options: {} }; + + for (var i = 0, ilen = args.length; i < ilen; i++) { + arg = args[i]; + + if (typeof arg === 'object' && !Array.isArray(arg)) { + merge(includes.options, arg); + } else { + includes.items = includes.items.concat(arg).filter(function (item) { + return !!item; + }); + } + } + + var items = includes.items.map(function(item) { + return { path: item, options: includes.options }; + }); + + this.pendingAdd = this.pendingAdd.concat(items); + + return this; + }; + + /** + * Indicates whether a particular file would be filtered out by the current + * exclusion rules for this FileList. + * @param {String} name The filename to check + * @return {Boolean} Whether or not the file should be excluded + */ + this.shouldExclude = function (name) { + if (!this.excludes.regex) { + _calculateExcludeRe.call(this); + } + var excl = this.excludes; + return excl.regex.test(name) || excl.funcs.some(function (f) { + return !!f(name); + }); + }; + + /** + * Excludes file-patterns from the FileList. Should be called with one or more + * pattern for finding file to include in the list. Arguments can be: + * 1. Strings for either a glob-pattern or a specific file-name + * 2. Regular expression literals + * 3. Functions to be run on the filename that return a true/false + */ + this.exclude = function () { + var args = Array.isArray(arguments[0]) ? 
arguments[0] : arguments + , arg; + for (var i = 0, ii = args.length; i < ii; i++) { + arg = args[i]; + if (typeof arg == 'function' && !(arg instanceof RegExp)) { + this.excludes.funcs.push(arg); + } + else { + this.excludes.pats.push(arg); + } + } + if (!this.pending) { + _resolveExclude.call(this); + } + return this; + }; + + /** + * Populates the FileList from the include/exclude rules with a list of + * actual files + */ + this.resolve = function () { + var item + , uniqueFunc = function (p, c) { + if (p.indexOf(c) < 0) { + p.push(c); + } + return p; + }; + if (this.pending) { + this.pending = false; + while ((item = this.pendingAdd.shift())) { + _resolveAdd.call(this, item); + } + // Reduce to a unique list + this.items = this.items.reduce(uniqueFunc, []); + // Remove exclusions + _resolveExclude.call(this); + } + return this; + }; + + /** + * Convert to a plain-jane array + */ + this.toArray = function () { + // Call slice to ensure lazy-resolution before slicing items + var ret = this.slice().items.slice(); + return ret; + }; + + /** + * Clear any pending items -- only useful before + * calling `resolve` + */ + this.clearInclusions = function () { + this.pendingAdd = []; + return this; + }; + + /** + * Clear any current exclusion rules + */ + this.clearExclusions = function () { + this.excludes = { + pats: [] + , funcs: [] + , regex: null + }; + return this; + }; + +})(); + +// Static method, used to create copy returned by special +// array methods +FileList.clone = function (list, items) { + var clone = new FileList(); + if (items) { + clone.items = items; + } + clone.pendingAdd = list.pendingAdd; + clone.pending = list.pending; + for (var p in list.excludes) { + clone.excludes[p] = list.excludes[p]; + } + return clone; +}; + +FileList.verbose = true + +exports.FileList = FileList; diff --git a/node_modules/filelist/jakefile.js b/node_modules/filelist/jakefile.js new file mode 100644 index 000000000..18d4d7df0 --- /dev/null +++ b/node_modules/filelist/jakefile.js @@ -0,0 +1,15 @@ +testTask('FileList', function () { + this.testFiles.include('test/*.js'); +}); + +publishTask('FileList', function () { + this.packageFiles.include([ + 'jakefile.js', + 'README.md', + 'package.json', + 'index.js', + 'index.d.ts' + ]); +}); + + diff --git a/node_modules/filelist/package.json b/node_modules/filelist/package.json new file mode 100644 index 000000000..fd34d1865 --- /dev/null +++ b/node_modules/filelist/package.json @@ -0,0 +1,57 @@ +{ + "_from": "filelist@^1.0.1", + "_id": "filelist@1.0.2", + "_inBundle": false, + "_integrity": "sha512-z7O0IS8Plc39rTCq6i6iHxk43duYOn8uFJiWSewIq0Bww1RNybVHSCjahmcC87ZqAm4OTvFzlzeGu3XAzG1ctQ==", + "_location": "/filelist", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "filelist@^1.0.1", + "name": "filelist", + "escapedName": "filelist", + "rawSpec": "^1.0.1", + "saveSpec": null, + "fetchSpec": "^1.0.1" + }, + "_requiredBy": [ + "/jake" + ], + "_resolved": "https://registry.npmjs.org/filelist/-/filelist-1.0.2.tgz", + "_shasum": "80202f21462d4d1c2e214119b1807c1bc0380e5b", + "_spec": "filelist@^1.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\jake", + "author": { + "name": "Matthew Eernisse", + "email": "mde@fleegix.org", + "url": "http://fleegix.org" + }, + "bugs": { + "url": "https://github.com/mde/filelist/issues" + }, + "bundleDependencies": false, + "dependencies": { + "minimatch": "^3.0.4" + }, + "deprecated": false, + "description": "Lazy-evaluating list 
of files, based on globs or regex patterns", + "homepage": "https://github.com/mde/filelist", + "keywords": [ + "file", + "utility", + "glob" + ], + "license": "Apache-2.0", + "main": "index.js", + "name": "filelist", + "repository": { + "type": "git", + "url": "git://github.com/mde/filelist.git" + }, + "scripts": { + "test": "jake test" + }, + "types": "index.d.ts", + "version": "1.0.2" +} diff --git a/node_modules/fill-range/LICENSE b/node_modules/fill-range/LICENSE new file mode 100644 index 000000000..9af4a67d2 --- /dev/null +++ b/node_modules/fill-range/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014-present, Jon Schlinkert. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/fill-range/README.md b/node_modules/fill-range/README.md new file mode 100644 index 000000000..8d756fe90 --- /dev/null +++ b/node_modules/fill-range/README.md @@ -0,0 +1,237 @@ +# fill-range [![Donate](https://img.shields.io/badge/Donate-PayPal-green.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=W8YFZ425KND68) [![NPM version](https://img.shields.io/npm/v/fill-range.svg?style=flat)](https://www.npmjs.com/package/fill-range) [![NPM monthly downloads](https://img.shields.io/npm/dm/fill-range.svg?style=flat)](https://npmjs.org/package/fill-range) [![NPM total downloads](https://img.shields.io/npm/dt/fill-range.svg?style=flat)](https://npmjs.org/package/fill-range) [![Linux Build Status](https://img.shields.io/travis/jonschlinkert/fill-range.svg?style=flat&label=Travis)](https://travis-ci.org/jonschlinkert/fill-range) + +> Fill in a range of numbers or letters, optionally passing an increment or `step` to use, or create a regex-compatible range with `options.toRegex` + +Please consider following this project's author, [Jon Schlinkert](https://github.com/jonschlinkert), and consider starring the project to show your :heart: and support. + +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +$ npm install --save fill-range +``` + +## Usage + +Expands numbers and letters, optionally using a `step` as the last argument. _(Numbers may be defined as JavaScript numbers or strings)_. 
+ +```js +const fill = require('fill-range'); +// fill(from, to[, step, options]); + +console.log(fill('1', '10')); //=> ['1', '2', '3', '4', '5', '6', '7', '8', '9', '10'] +console.log(fill('1', '10', { toRegex: true })); //=> [1-9]|10 +``` + +**Params** + +* `from`: **{String|Number}** the number or letter to start with +* `to`: **{String|Number}** the number or letter to end with +* `step`: **{String|Number|Object|Function}** Optionally pass a [step](#optionsstep) to use. +* `options`: **{Object|Function}**: See all available [options](#options) + +## Examples + +By default, an array of values is returned. + +**Alphabetical ranges** + +```js +console.log(fill('a', 'e')); //=> ['a', 'b', 'c', 'd', 'e'] +console.log(fill('A', 'E')); //=> [ 'A', 'B', 'C', 'D', 'E' ] +``` + +**Numerical ranges** + +Numbers can be defined as actual numbers or strings. + +```js +console.log(fill(1, 5)); //=> [ 1, 2, 3, 4, 5 ] +console.log(fill('1', '5')); //=> [ 1, 2, 3, 4, 5 ] +``` + +**Negative ranges** + +Numbers can be defined as actual numbers or strings. + +```js +console.log(fill('-5', '-1')); //=> [ '-5', '-4', '-3', '-2', '-1' ] +console.log(fill('-5', '5')); //=> [ '-5', '-4', '-3', '-2', '-1', '0', '1', '2', '3', '4', '5' ] +``` + +**Steps (increments)** + +```js +// numerical ranges with increments +console.log(fill('0', '25', 4)); //=> [ '0', '4', '8', '12', '16', '20', '24' ] +console.log(fill('0', '25', 5)); //=> [ '0', '5', '10', '15', '20', '25' ] +console.log(fill('0', '25', 6)); //=> [ '0', '6', '12', '18', '24' ] + +// alphabetical ranges with increments +console.log(fill('a', 'z', 4)); //=> [ 'a', 'e', 'i', 'm', 'q', 'u', 'y' ] +console.log(fill('a', 'z', 5)); //=> [ 'a', 'f', 'k', 'p', 'u', 'z' ] +console.log(fill('a', 'z', 6)); //=> [ 'a', 'g', 'm', 's', 'y' ] +``` + +## Options + +### options.step + +**Type**: `number` (formatted as a string or number) + +**Default**: `undefined` + +**Description**: The increment to use for the range. Can be used with letters or numbers. + +**Example(s)** + +```js +// numbers +console.log(fill('1', '10', 2)); //=> [ '1', '3', '5', '7', '9' ] +console.log(fill('1', '10', 3)); //=> [ '1', '4', '7', '10' ] +console.log(fill('1', '10', 4)); //=> [ '1', '5', '9' ] + +// letters +console.log(fill('a', 'z', 5)); //=> [ 'a', 'f', 'k', 'p', 'u', 'z' ] +console.log(fill('a', 'z', 7)); //=> [ 'a', 'h', 'o', 'v' ] +console.log(fill('a', 'z', 9)); //=> [ 'a', 'j', 's' ] +``` + +### options.strictRanges + +**Type**: `boolean` + +**Default**: `false` + +**Description**: By default, `null` is returned when an invalid range is passed. Enable this option to throw a `RangeError` on invalid ranges. + +**Example(s)** + +The following are all invalid: + +```js +fill('1.1', '2'); // decimals not supported in ranges +fill('a', '2'); // incompatible range values +fill(1, 10, 'foo'); // invalid "step" argument +``` + +### options.stringify + +**Type**: `boolean` + +**Default**: `undefined` + +**Description**: Cast all returned values to strings. By default, integers are returned as numbers. + +**Example(s)** + +```js +console.log(fill(1, 5)); //=> [ 1, 2, 3, 4, 5 ] +console.log(fill(1, 5, { stringify: true })); //=> [ '1', '2', '3', '4', '5' ] +``` + +### options.toRegex + +**Type**: `boolean` + +**Default**: `undefined` + +**Description**: Create a regex-compatible source string, instead of expanding values to an array. 
+ +**Example(s)** + +```js +// alphabetical range +console.log(fill('a', 'e', { toRegex: true })); //=> '[a-e]' +// alphabetical with step +console.log(fill('a', 'z', 3, { toRegex: true })); //=> 'a|d|g|j|m|p|s|v|y' +// numerical range +console.log(fill('1', '100', { toRegex: true })); //=> '[1-9]|[1-9][0-9]|100' +// numerical range with zero padding +console.log(fill('000001', '100000', { toRegex: true })); +//=> '0{5}[1-9]|0{4}[1-9][0-9]|0{3}[1-9][0-9]{2}|0{2}[1-9][0-9]{3}|0[1-9][0-9]{4}|100000' +``` + +### options.transform + +**Type**: `function` + +**Default**: `undefined` + +**Description**: Customize each value in the returned array (or [string](#optionstoRegex)). _(you can also pass this function as the last argument to `fill()`)_. + +**Example(s)** + +```js +// add zero padding +console.log(fill(1, 5, value => String(value).padStart(4, '0'))); +//=> ['0001', '0002', '0003', '0004', '0005'] +``` + +## About + +
+Contributing + +Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new). + +
+ +
+Running Tests + +Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command: + +```sh +$ npm install && npm test +``` + +
+ +
+Building docs + +_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_ + +To generate the readme, run the following command: + +```sh +$ npm install -g verbose/verb#dev verb-generate-readme && verb +``` + +
+ +### Contributors + +| **Commits** | **Contributor** | +| --- | --- | +| 116 | [jonschlinkert](https://github.com/jonschlinkert) | +| 4 | [paulmillr](https://github.com/paulmillr) | +| 2 | [realityking](https://github.com/realityking) | +| 2 | [bluelovers](https://github.com/bluelovers) | +| 1 | [edorivai](https://github.com/edorivai) | +| 1 | [wtgtybhertgeghgtwtg](https://github.com/wtgtybhertgeghgtwtg) | + +### Author + +**Jon Schlinkert** + +* [GitHub Profile](https://github.com/jonschlinkert) +* [Twitter Profile](https://twitter.com/jonschlinkert) +* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert) + +Please consider supporting me on Patreon, or [start your own Patreon page](https://patreon.com/invite/bxpbvm)! + + + + + +### License + +Copyright © 2019, [Jon Schlinkert](https://github.com/jonschlinkert). +Released under the [MIT License](LICENSE). + +*** + +_This file was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme), v0.8.0, on April 08, 2019._ \ No newline at end of file diff --git a/node_modules/fill-range/index.js b/node_modules/fill-range/index.js new file mode 100644 index 000000000..97ce35a5b --- /dev/null +++ b/node_modules/fill-range/index.js @@ -0,0 +1,249 @@ +/*! + * fill-range + * + * Copyright (c) 2014-present, Jon Schlinkert. + * Licensed under the MIT License. + */ + +'use strict'; + +const util = require('util'); +const toRegexRange = require('to-regex-range'); + +const isObject = val => val !== null && typeof val === 'object' && !Array.isArray(val); + +const transform = toNumber => { + return value => toNumber === true ? Number(value) : String(value); +}; + +const isValidValue = value => { + return typeof value === 'number' || (typeof value === 'string' && value !== ''); +}; + +const isNumber = num => Number.isInteger(+num); + +const zeros = input => { + let value = `${input}`; + let index = -1; + if (value[0] === '-') value = value.slice(1); + if (value === '0') return false; + while (value[++index] === '0'); + return index > 0; +}; + +const stringify = (start, end, options) => { + if (typeof start === 'string' || typeof end === 'string') { + return true; + } + return options.stringify === true; +}; + +const pad = (input, maxLength, toNumber) => { + if (maxLength > 0) { + let dash = input[0] === '-' ? '-' : ''; + if (dash) input = input.slice(1); + input = (dash + input.padStart(dash ? maxLength - 1 : maxLength, '0')); + } + if (toNumber === false) { + return String(input); + } + return input; +}; + +const toMaxLen = (input, maxLength) => { + let negative = input[0] === '-' ? '-' : ''; + if (negative) { + input = input.slice(1); + maxLength--; + } + while (input.length < maxLength) input = '0' + input; + return negative ? ('-' + input) : input; +}; + +const toSequence = (parts, options) => { + parts.negatives.sort((a, b) => a < b ? -1 : a > b ? 1 : 0); + parts.positives.sort((a, b) => a < b ? -1 : a > b ? 1 : 0); + + let prefix = options.capture ? 
'' : '?:'; + let positives = ''; + let negatives = ''; + let result; + + if (parts.positives.length) { + positives = parts.positives.join('|'); + } + + if (parts.negatives.length) { + negatives = `-(${prefix}${parts.negatives.join('|')})`; + } + + if (positives && negatives) { + result = `${positives}|${negatives}`; + } else { + result = positives || negatives; + } + + if (options.wrap) { + return `(${prefix}${result})`; + } + + return result; +}; + +const toRange = (a, b, isNumbers, options) => { + if (isNumbers) { + return toRegexRange(a, b, { wrap: false, ...options }); + } + + let start = String.fromCharCode(a); + if (a === b) return start; + + let stop = String.fromCharCode(b); + return `[${start}-${stop}]`; +}; + +const toRegex = (start, end, options) => { + if (Array.isArray(start)) { + let wrap = options.wrap === true; + let prefix = options.capture ? '' : '?:'; + return wrap ? `(${prefix}${start.join('|')})` : start.join('|'); + } + return toRegexRange(start, end, options); +}; + +const rangeError = (...args) => { + return new RangeError('Invalid range arguments: ' + util.inspect(...args)); +}; + +const invalidRange = (start, end, options) => { + if (options.strictRanges === true) throw rangeError([start, end]); + return []; +}; + +const invalidStep = (step, options) => { + if (options.strictRanges === true) { + throw new TypeError(`Expected step "${step}" to be a number`); + } + return []; +}; + +const fillNumbers = (start, end, step = 1, options = {}) => { + let a = Number(start); + let b = Number(end); + + if (!Number.isInteger(a) || !Number.isInteger(b)) { + if (options.strictRanges === true) throw rangeError([start, end]); + return []; + } + + // fix negative zero + if (a === 0) a = 0; + if (b === 0) b = 0; + + let descending = a > b; + let startString = String(start); + let endString = String(end); + let stepString = String(step); + step = Math.max(Math.abs(step), 1); + + let padded = zeros(startString) || zeros(endString) || zeros(stepString); + let maxLen = padded ? Math.max(startString.length, endString.length, stepString.length) : 0; + let toNumber = padded === false && stringify(start, end, options) === false; + let format = options.transform || transform(toNumber); + + if (options.toRegex && step === 1) { + return toRange(toMaxLen(start, maxLen), toMaxLen(end, maxLen), true, options); + } + + let parts = { negatives: [], positives: [] }; + let push = num => parts[num < 0 ? 'negatives' : 'positives'].push(Math.abs(num)); + let range = []; + let index = 0; + + while (descending ? a >= b : a <= b) { + if (options.toRegex === true && step > 1) { + push(a); + } else { + range.push(pad(format(a, index), maxLen, toNumber)); + } + a = descending ? a - step : a + step; + index++; + } + + if (options.toRegex === true) { + return step > 1 + ? toSequence(parts, options) + : toRegex(range, null, { wrap: false, ...options }); + } + + return range; +}; + +const fillLetters = (start, end, step = 1, options = {}) => { + if ((!isNumber(start) && start.length > 1) || (!isNumber(end) && end.length > 1)) { + return invalidRange(start, end, options); + } + + + let format = options.transform || (val => String.fromCharCode(val)); + let a = `${start}`.charCodeAt(0); + let b = `${end}`.charCodeAt(0); + + let descending = a > b; + let min = Math.min(a, b); + let max = Math.max(a, b); + + if (options.toRegex && step === 1) { + return toRange(min, max, false, options); + } + + let range = []; + let index = 0; + + while (descending ? 
a >= b : a <= b) { + range.push(format(a, index)); + a = descending ? a - step : a + step; + index++; + } + + if (options.toRegex === true) { + return toRegex(range, null, { wrap: false, options }); + } + + return range; +}; + +const fill = (start, end, step, options = {}) => { + if (end == null && isValidValue(start)) { + return [start]; + } + + if (!isValidValue(start) || !isValidValue(end)) { + return invalidRange(start, end, options); + } + + if (typeof step === 'function') { + return fill(start, end, 1, { transform: step }); + } + + if (isObject(step)) { + return fill(start, end, 0, step); + } + + let opts = { ...options }; + if (opts.capture === true) opts.wrap = true; + step = step || opts.step || 1; + + if (!isNumber(step)) { + if (step != null && !isObject(step)) return invalidStep(step, opts); + return fill(start, end, 1, step); + } + + if (isNumber(start) && isNumber(end)) { + return fillNumbers(start, end, step, opts); + } + + return fillLetters(start, end, Math.max(Math.abs(step), 1), opts); +}; + +module.exports = fill; diff --git a/node_modules/fill-range/package.json b/node_modules/fill-range/package.json new file mode 100644 index 000000000..40ecb9189 --- /dev/null +++ b/node_modules/fill-range/package.json @@ -0,0 +1,114 @@ +{ + "_from": "fill-range@^7.0.1", + "_id": "fill-range@7.0.1", + "_inBundle": false, + "_integrity": "sha512-qOo9F+dMUmC2Lcb4BbVvnKJxTPjCm+RRpe4gDuGrzkL7mEVl/djYSu2OdQ2Pa302N4oqkSg9ir6jaLWJ2USVpQ==", + "_location": "/fill-range", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "fill-range@^7.0.1", + "name": "fill-range", + "escapedName": "fill-range", + "rawSpec": "^7.0.1", + "saveSpec": null, + "fetchSpec": "^7.0.1" + }, + "_requiredBy": [ + "/braces" + ], + "_resolved": "https://registry.npmjs.org/fill-range/-/fill-range-7.0.1.tgz", + "_shasum": "1919a6a7c75fe38b2c7c77e5198535da9acdda40", + "_spec": "fill-range@^7.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\braces", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/jonschlinkert/fill-range/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Edo Rivai", + "url": "edo.rivai.nl" + }, + { + "name": "Jon Schlinkert", + "url": "http://twitter.com/jonschlinkert" + }, + { + "name": "Paul Miller", + "url": "paulmillr.com" + }, + { + "name": "Rouven Weßling", + "url": "www.rouvenwessling.de" + }, + { + "url": "https://github.com/wtgtybhertgeghgtwtg" + } + ], + "dependencies": { + "to-regex-range": "^5.0.1" + }, + "deprecated": false, + "description": "Fill in a range of numbers or letters, optionally passing an increment or `step` to use, or create a regex-compatible range with `options.toRegex`", + "devDependencies": { + "gulp-format-md": "^2.0.0", + "mocha": "^6.1.1" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/jonschlinkert/fill-range", + "keywords": [ + "alpha", + "alphabetical", + "array", + "bash", + "brace", + "expand", + "expansion", + "fill", + "glob", + "match", + "matches", + "matching", + "number", + "numerical", + "range", + "ranges", + "regex", + "sh" + ], + "license": "MIT", + "main": "index.js", + "name": "fill-range", + "repository": { + "type": "git", + "url": "git+https://github.com/jonschlinkert/fill-range.git" + }, + "scripts": { + "test": "mocha" + }, + "verb": { + "toc": false, + "layout": "default", + "tasks": [ + 
"readme" + ], + "plugins": [ + "gulp-format-md" + ], + "lint": { + "reflinks": true + } + }, + "version": "7.0.1" +} diff --git a/node_modules/finalhandler/HISTORY.md b/node_modules/finalhandler/HISTORY.md new file mode 100644 index 000000000..920c35e58 --- /dev/null +++ b/node_modules/finalhandler/HISTORY.md @@ -0,0 +1,187 @@ +1.1.2 / 2019-05-09 +================== + + * Set stricter `Content-Security-Policy` header + * deps: parseurl@~1.3.3 + * deps: statuses@~1.5.0 + +1.1.1 / 2018-03-06 +================== + + * Fix 404 output for bad / missing pathnames + * deps: encodeurl@~1.0.2 + - Fix encoding `%` as last character + * deps: statuses@~1.4.0 + +1.1.0 / 2017-09-24 +================== + + * Use `res.headersSent` when available + +1.0.6 / 2017-09-22 +================== + + * deps: debug@2.6.9 + +1.0.5 / 2017-09-15 +================== + + * deps: parseurl@~1.3.2 + - perf: reduce overhead for full URLs + - perf: unroll the "fast-path" `RegExp` + +1.0.4 / 2017-08-03 +================== + + * deps: debug@2.6.8 + +1.0.3 / 2017-05-16 +================== + + * deps: debug@2.6.7 + - deps: ms@2.0.0 + +1.0.2 / 2017-04-22 +================== + + * deps: debug@2.6.4 + - deps: ms@0.7.3 + +1.0.1 / 2017-03-21 +================== + + * Fix missing `` in HTML document + * deps: debug@2.6.3 + - Fix: `DEBUG_MAX_ARRAY_LENGTH` + +1.0.0 / 2017-02-15 +================== + + * Fix exception when `err` cannot be converted to a string + * Fully URL-encode the pathname in the 404 message + * Only include the pathname in the 404 message + * Send complete HTML document + * Set `Content-Security-Policy: default-src 'self'` header + * deps: debug@2.6.1 + - Allow colors in workers + - Deprecated `DEBUG_FD` environment variable set to `3` or higher + - Fix error when running under React Native + - Use same color for same namespace + - deps: ms@0.7.2 + +0.5.1 / 2016-11-12 +================== + + * Fix exception when `err.headers` is not an object + * deps: statuses@~1.3.1 + * perf: hoist regular expressions + * perf: remove duplicate validation path + +0.5.0 / 2016-06-15 +================== + + * Change invalid or non-numeric status code to 500 + * Overwrite status message to match set status code + * Prefer `err.statusCode` if `err.status` is invalid + * Set response headers from `err.headers` object + * Use `statuses` instead of `http` module for status messages + - Includes all defined status messages + +0.4.1 / 2015-12-02 +================== + + * deps: escape-html@~1.0.3 + - perf: enable strict mode + - perf: optimize string replacement + - perf: use faster string coercion + +0.4.0 / 2015-06-14 +================== + + * Fix a false-positive when unpiping in Node.js 0.8 + * Support `statusCode` property on `Error` objects + * Use `unpipe` module for unpiping requests + * deps: escape-html@1.0.2 + * deps: on-finished@~2.3.0 + - Add defined behavior for HTTP `CONNECT` requests + - Add defined behavior for HTTP `Upgrade` requests + - deps: ee-first@1.1.1 + * perf: enable strict mode + * perf: remove argument reassignment + +0.3.6 / 2015-05-11 +================== + + * deps: debug@~2.2.0 + - deps: ms@0.7.1 + +0.3.5 / 2015-04-22 +================== + + * deps: on-finished@~2.2.1 + - Fix `isFinished(req)` when data buffered + +0.3.4 / 2015-03-15 +================== + + * deps: debug@~2.1.3 + - Fix high intensity foreground color for bold + - deps: ms@0.7.0 + +0.3.3 / 2015-01-01 +================== + + * deps: debug@~2.1.1 + * deps: on-finished@~2.2.0 + +0.3.2 / 2014-10-22 +================== + + * deps: on-finished@~2.1.1 
+ - Fix handling of pipelined requests + +0.3.1 / 2014-10-16 +================== + + * deps: debug@~2.1.0 + - Implement `DEBUG_FD` env variable support + +0.3.0 / 2014-09-17 +================== + + * Terminate in progress response only on error + * Use `on-finished` to determine request status + +0.2.0 / 2014-09-03 +================== + + * Set `X-Content-Type-Options: nosniff` header + * deps: debug@~2.0.0 + +0.1.0 / 2014-07-16 +================== + + * Respond after request fully read + - prevents hung responses and socket hang ups + * deps: debug@1.0.4 + +0.0.3 / 2014-07-11 +================== + + * deps: debug@1.0.3 + - Add support for multiple wildcards in namespaces + +0.0.2 / 2014-06-19 +================== + + * Handle invalid status codes + +0.0.1 / 2014-06-05 +================== + + * deps: debug@1.0.2 + +0.0.0 / 2014-06-05 +================== + + * Extracted from connect/express diff --git a/node_modules/finalhandler/LICENSE b/node_modules/finalhandler/LICENSE new file mode 100644 index 000000000..fb3098277 --- /dev/null +++ b/node_modules/finalhandler/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2014-2017 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/finalhandler/README.md b/node_modules/finalhandler/README.md new file mode 100644 index 000000000..96327f0d1 --- /dev/null +++ b/node_modules/finalhandler/README.md @@ -0,0 +1,148 @@ +# finalhandler + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-image]][node-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Node.js function to invoke as the final step to respond to HTTP request. + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install finalhandler +``` + +## API + + + +```js +var finalhandler = require('finalhandler') +``` + +### finalhandler(req, res, [options]) + +Returns function to be invoked as the final step for the given `req` and `res`. +This function is to be invoked as `fn(err)`. If `err` is falsy, the handler will +write out a 404 response to the `res`. If it is truthy, an error response will +be written out to the `res`. 
+ +When an error is written, the following information is added to the response: + + * The `res.statusCode` is set from `err.status` (or `err.statusCode`). If + this value is outside the 4xx or 5xx range, it will be set to 500. + * The `res.statusMessage` is set according to the status code. + * The body will be the HTML of the status code message if `env` is + `'production'`, otherwise will be `err.stack`. + * Any headers specified in an `err.headers` object. + +The final handler will also unpipe anything from `req` when it is invoked. + +#### options.env + +By default, the environment is determined by `NODE_ENV` variable, but it can be +overridden by this option. + +#### options.onerror + +Provide a function to be called with the `err` when it exists. Can be used for +writing errors to a central location without excessive function generation. Called +as `onerror(err, req, res)`. + +## Examples + +### always 404 + +```js +var finalhandler = require('finalhandler') +var http = require('http') + +var server = http.createServer(function (req, res) { + var done = finalhandler(req, res) + done() +}) + +server.listen(3000) +``` + +### perform simple action + +```js +var finalhandler = require('finalhandler') +var fs = require('fs') +var http = require('http') + +var server = http.createServer(function (req, res) { + var done = finalhandler(req, res) + + fs.readFile('index.html', function (err, buf) { + if (err) return done(err) + res.setHeader('Content-Type', 'text/html') + res.end(buf) + }) +}) + +server.listen(3000) +``` + +### use with middleware-style functions + +```js +var finalhandler = require('finalhandler') +var http = require('http') +var serveStatic = require('serve-static') + +var serve = serveStatic('public') + +var server = http.createServer(function (req, res) { + var done = finalhandler(req, res) + serve(req, res, done) +}) + +server.listen(3000) +``` + +### keep log of all errors + +```js +var finalhandler = require('finalhandler') +var fs = require('fs') +var http = require('http') + +var server = http.createServer(function (req, res) { + var done = finalhandler(req, res, { onerror: logerror }) + + fs.readFile('index.html', function (err, buf) { + if (err) return done(err) + res.setHeader('Content-Type', 'text/html') + res.end(buf) + }) +}) + +server.listen(3000) + +function logerror (err) { + console.error(err.stack || err.toString()) +} +``` + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/finalhandler.svg +[npm-url]: https://npmjs.org/package/finalhandler +[node-image]: https://img.shields.io/node/v/finalhandler.svg +[node-url]: https://nodejs.org/en/download +[travis-image]: https://img.shields.io/travis/pillarjs/finalhandler.svg +[travis-url]: https://travis-ci.org/pillarjs/finalhandler +[coveralls-image]: https://img.shields.io/coveralls/pillarjs/finalhandler.svg +[coveralls-url]: https://coveralls.io/r/pillarjs/finalhandler?branch=master +[downloads-image]: https://img.shields.io/npm/dm/finalhandler.svg +[downloads-url]: https://npmjs.org/package/finalhandler diff --git a/node_modules/finalhandler/index.js b/node_modules/finalhandler/index.js new file mode 100644 index 000000000..567350798 --- /dev/null +++ b/node_modules/finalhandler/index.js @@ -0,0 +1,331 @@ +/*! + * finalhandler + * Copyright(c) 2014-2017 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. 
+ * @private + */ + +var debug = require('debug')('finalhandler') +var encodeUrl = require('encodeurl') +var escapeHtml = require('escape-html') +var onFinished = require('on-finished') +var parseUrl = require('parseurl') +var statuses = require('statuses') +var unpipe = require('unpipe') + +/** + * Module variables. + * @private + */ + +var DOUBLE_SPACE_REGEXP = /\x20{2}/g +var NEWLINE_REGEXP = /\n/g + +/* istanbul ignore next */ +var defer = typeof setImmediate === 'function' + ? setImmediate + : function (fn) { process.nextTick(fn.bind.apply(fn, arguments)) } +var isFinished = onFinished.isFinished + +/** + * Create a minimal HTML document. + * + * @param {string} message + * @private + */ + +function createHtmlDocument (message) { + var body = escapeHtml(message) + .replace(NEWLINE_REGEXP, '
') + .replace(DOUBLE_SPACE_REGEXP, '  ') + + return '\n' + + '\n' + + '\n' + + '\n' + + 'Error\n' + + '\n' + + '\n' + + '
' + body + '
\n' + + '\n' + + '\n' +} + +/** + * Module exports. + * @public + */ + +module.exports = finalhandler + +/** + * Create a function to handle the final response. + * + * @param {Request} req + * @param {Response} res + * @param {Object} [options] + * @return {Function} + * @public + */ + +function finalhandler (req, res, options) { + var opts = options || {} + + // get environment + var env = opts.env || process.env.NODE_ENV || 'development' + + // get error callback + var onerror = opts.onerror + + return function (err) { + var headers + var msg + var status + + // ignore 404 on in-flight response + if (!err && headersSent(res)) { + debug('cannot 404 after headers sent') + return + } + + // unhandled error + if (err) { + // respect status code from error + status = getErrorStatusCode(err) + + if (status === undefined) { + // fallback to status code on response + status = getResponseStatusCode(res) + } else { + // respect headers from error + headers = getErrorHeaders(err) + } + + // get error message + msg = getErrorMessage(err, status, env) + } else { + // not found + status = 404 + msg = 'Cannot ' + req.method + ' ' + encodeUrl(getResourceName(req)) + } + + debug('default %s', status) + + // schedule onerror callback + if (err && onerror) { + defer(onerror, err, req, res) + } + + // cannot actually respond + if (headersSent(res)) { + debug('cannot %d after headers sent', status) + req.socket.destroy() + return + } + + // send response + send(req, res, status, headers, msg) + } +} + +/** + * Get headers from Error object. + * + * @param {Error} err + * @return {object} + * @private + */ + +function getErrorHeaders (err) { + if (!err.headers || typeof err.headers !== 'object') { + return undefined + } + + var headers = Object.create(null) + var keys = Object.keys(err.headers) + + for (var i = 0; i < keys.length; i++) { + var key = keys[i] + headers[key] = err.headers[key] + } + + return headers +} + +/** + * Get message from Error object, fallback to status message. + * + * @param {Error} err + * @param {number} status + * @param {string} env + * @return {string} + * @private + */ + +function getErrorMessage (err, status, env) { + var msg + + if (env !== 'production') { + // use err.stack, which typically includes err.message + msg = err.stack + + // fallback to err.toString() when possible + if (!msg && typeof err.toString === 'function') { + msg = err.toString() + } + } + + return msg || statuses[status] +} + +/** + * Get status code from Error object. + * + * @param {Error} err + * @return {number} + * @private + */ + +function getErrorStatusCode (err) { + // check err.status + if (typeof err.status === 'number' && err.status >= 400 && err.status < 600) { + return err.status + } + + // check err.statusCode + if (typeof err.statusCode === 'number' && err.statusCode >= 400 && err.statusCode < 600) { + return err.statusCode + } + + return undefined +} + +/** + * Get resource name for the request. + * + * This is typically just the original pathname of the request + * but will fallback to "resource" is that cannot be determined. + * + * @param {IncomingMessage} req + * @return {string} + * @private + */ + +function getResourceName (req) { + try { + return parseUrl.original(req).pathname + } catch (e) { + return 'resource' + } +} + +/** + * Get status code from response. 
+ * + * @param {OutgoingMessage} res + * @return {number} + * @private + */ + +function getResponseStatusCode (res) { + var status = res.statusCode + + // default status code to 500 if outside valid range + if (typeof status !== 'number' || status < 400 || status > 599) { + status = 500 + } + + return status +} + +/** + * Determine if the response headers have been sent. + * + * @param {object} res + * @returns {boolean} + * @private + */ + +function headersSent (res) { + return typeof res.headersSent !== 'boolean' + ? Boolean(res._header) + : res.headersSent +} + +/** + * Send response. + * + * @param {IncomingMessage} req + * @param {OutgoingMessage} res + * @param {number} status + * @param {object} headers + * @param {string} message + * @private + */ + +function send (req, res, status, headers, message) { + function write () { + // response body + var body = createHtmlDocument(message) + + // response status + res.statusCode = status + res.statusMessage = statuses[status] + + // response headers + setHeaders(res, headers) + + // security headers + res.setHeader('Content-Security-Policy', "default-src 'none'") + res.setHeader('X-Content-Type-Options', 'nosniff') + + // standard headers + res.setHeader('Content-Type', 'text/html; charset=utf-8') + res.setHeader('Content-Length', Buffer.byteLength(body, 'utf8')) + + if (req.method === 'HEAD') { + res.end() + return + } + + res.end(body, 'utf8') + } + + if (isFinished(req)) { + write() + return + } + + // unpipe everything from the request + unpipe(req) + + // flush the request + onFinished(req, write) + req.resume() +} + +/** + * Set response headers from an object. + * + * @param {OutgoingMessage} res + * @param {object} headers + * @private + */ + +function setHeaders (res, headers) { + if (!headers) { + return + } + + var keys = Object.keys(headers) + for (var i = 0; i < keys.length; i++) { + var key = keys[i] + res.setHeader(key, headers[key]) + } +} diff --git a/node_modules/finalhandler/package.json b/node_modules/finalhandler/package.json new file mode 100644 index 000000000..b8fa5f9f9 --- /dev/null +++ b/node_modules/finalhandler/package.json @@ -0,0 +1,80 @@ +{ + "_from": "finalhandler@~1.1.2", + "_id": "finalhandler@1.1.2", + "_inBundle": false, + "_integrity": "sha512-aAWcW57uxVNrQZqFXjITpW3sIUQmHGG3qSb9mUah9MgMC4NeWhNOlNjXEYq3HjRAvL6arUviZGGJsBg6z0zsWA==", + "_location": "/finalhandler", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "finalhandler@~1.1.2", + "name": "finalhandler", + "escapedName": "finalhandler", + "rawSpec": "~1.1.2", + "saveSpec": null, + "fetchSpec": "~1.1.2" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/finalhandler/-/finalhandler-1.1.2.tgz", + "_shasum": "b7e7d000ffd11938d0fdb053506f6ebabe9f587d", + "_spec": "finalhandler@~1.1.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/pillarjs/finalhandler/issues" + }, + "bundleDependencies": false, + "dependencies": { + "debug": "2.6.9", + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "on-finished": "~2.3.0", + "parseurl": "~1.3.3", + "statuses": "~1.5.0", + "unpipe": "~1.0.0" + }, + "deprecated": false, + "description": "Node.js final http responder", + "devDependencies": { + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.2", + 
"eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "mocha": "6.1.4", + "readable-stream": "2.3.6", + "safe-buffer": "5.1.2", + "supertest": "4.0.2" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "index.js" + ], + "homepage": "https://github.com/pillarjs/finalhandler#readme", + "license": "MIT", + "name": "finalhandler", + "repository": { + "type": "git", + "url": "git+https://github.com/pillarjs/finalhandler.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-ci": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/" + }, + "version": "1.1.2" +} diff --git a/node_modules/forwarded/HISTORY.md b/node_modules/forwarded/HISTORY.md new file mode 100644 index 000000000..381e6aad4 --- /dev/null +++ b/node_modules/forwarded/HISTORY.md @@ -0,0 +1,21 @@ +0.2.0 / 2021-05-31 +================== + + * Use `req.socket` over deprecated `req.connection` + +0.1.2 / 2017-09-14 +================== + + * perf: improve header parsing + * perf: reduce overhead when no `X-Forwarded-For` header + +0.1.1 / 2017-09-10 +================== + + * Fix trimming leading / trailing OWS + * perf: hoist regular expression + +0.1.0 / 2014-09-21 +================== + + * Initial release diff --git a/node_modules/forwarded/LICENSE b/node_modules/forwarded/LICENSE new file mode 100644 index 000000000..84441fbb5 --- /dev/null +++ b/node_modules/forwarded/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2014-2017 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/forwarded/README.md b/node_modules/forwarded/README.md new file mode 100644 index 000000000..fdd220bca --- /dev/null +++ b/node_modules/forwarded/README.md @@ -0,0 +1,57 @@ +# forwarded + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][ci-image]][ci-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Parse HTTP X-Forwarded-For header + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). 
Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install forwarded +``` + +## API + +```js +var forwarded = require('forwarded') +``` + +### forwarded(req) + +```js +var addresses = forwarded(req) +``` + +Parse the `X-Forwarded-For` header from the request. Returns an array +of the addresses, including the socket address for the `req`, in reverse +order (i.e. index `0` is the socket address and the last index is the +furthest address, typically the end-user). + +## Testing + +```sh +$ npm test +``` + +## License + +[MIT](LICENSE) + +[ci-image]: https://badgen.net/github/checks/jshttp/forwarded/master?label=ci +[ci-url]: https://github.com/jshttp/forwarded/actions?query=workflow%3Aci +[npm-image]: https://img.shields.io/npm/v/forwarded.svg +[npm-url]: https://npmjs.org/package/forwarded +[node-version-image]: https://img.shields.io/node/v/forwarded.svg +[node-version-url]: https://nodejs.org/en/download/ +[coveralls-image]: https://img.shields.io/coveralls/jshttp/forwarded/master.svg +[coveralls-url]: https://coveralls.io/r/jshttp/forwarded?branch=master +[downloads-image]: https://img.shields.io/npm/dm/forwarded.svg +[downloads-url]: https://npmjs.org/package/forwarded diff --git a/node_modules/forwarded/index.js b/node_modules/forwarded/index.js new file mode 100644 index 000000000..b2b6bdd3c --- /dev/null +++ b/node_modules/forwarded/index.js @@ -0,0 +1,90 @@ +/*! + * forwarded + * Copyright(c) 2014-2017 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = forwarded + +/** + * Get all addresses in the request, using the `X-Forwarded-For` header. + * + * @param {object} req + * @return {array} + * @public + */ + +function forwarded (req) { + if (!req) { + throw new TypeError('argument req is required') + } + + // simple header parsing + var proxyAddrs = parse(req.headers['x-forwarded-for'] || '') + var socketAddr = getSocketAddr(req) + var addrs = [socketAddr].concat(proxyAddrs) + + // return all addresses + return addrs +} + +/** + * Get the socket address for a request. + * + * @param {object} req + * @return {string} + * @private + */ + +function getSocketAddr (req) { + return req.socket + ? req.socket.remoteAddress + : req.connection.remoteAddress +} + +/** + * Parse the X-Forwarded-For header. 
+ * + * @param {string} header + * @private + */ + +function parse (header) { + var end = header.length + var list = [] + var start = header.length + + // gather addresses, backwards + for (var i = header.length - 1; i >= 0; i--) { + switch (header.charCodeAt(i)) { + case 0x20: /* */ + if (start === end) { + start = end = i + } + break + case 0x2c: /* , */ + if (start !== end) { + list.push(header.substring(start, end)) + } + start = end = i + break + default: + start = i + break + } + } + + // final address + if (start !== end) { + list.push(header.substring(start, end)) + } + + return list +} diff --git a/node_modules/forwarded/package.json b/node_modules/forwarded/package.json new file mode 100644 index 000000000..3a3e0829c --- /dev/null +++ b/node_modules/forwarded/package.json @@ -0,0 +1,80 @@ +{ + "_from": "forwarded@0.2.0", + "_id": "forwarded@0.2.0", + "_inBundle": false, + "_integrity": "sha512-buRG0fpBtRHSTCOASe6hD258tEubFoRLb4ZNA6NxMVHNw2gOcwHo9wyablzMzOA5z9xA9L1KNjk/Nt6MT9aYow==", + "_location": "/forwarded", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "forwarded@0.2.0", + "name": "forwarded", + "escapedName": "forwarded", + "rawSpec": "0.2.0", + "saveSpec": null, + "fetchSpec": "0.2.0" + }, + "_requiredBy": [ + "/proxy-addr" + ], + "_resolved": "https://registry.npmjs.org/forwarded/-/forwarded-0.2.0.tgz", + "_shasum": "2269936428aad4c15c7ebe9779a84bf0b2a81811", + "_spec": "forwarded@0.2.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\proxy-addr", + "bugs": { + "url": "https://github.com/jshttp/forwarded/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + } + ], + "deprecated": false, + "description": "Parse HTTP X-Forwarded-For header", + "devDependencies": { + "beautify-benchmark": "0.2.4", + "benchmark": "2.1.4", + "deep-equal": "1.0.1", + "eslint": "7.27.0", + "eslint-config-standard": "14.1.1", + "eslint-plugin-import": "2.23.4", + "eslint-plugin-node": "11.1.0", + "eslint-plugin-promise": "4.3.1", + "eslint-plugin-standard": "4.1.0", + "mocha": "8.4.0", + "nyc": "15.1.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "README.md", + "index.js" + ], + "homepage": "https://github.com/jshttp/forwarded#readme", + "keywords": [ + "x-forwarded-for", + "http", + "req" + ], + "license": "MIT", + "name": "forwarded", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/forwarded.git" + }, + "scripts": { + "bench": "node benchmark/index.js", + "lint": "eslint .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-ci": "nyc --reporter=lcov --reporter=text npm test", + "test-cov": "nyc --reporter=html --reporter=text npm test", + "version": "node scripts/version-history.js && git add HISTORY.md" + }, + "version": "0.2.0" +} diff --git a/node_modules/fresh/HISTORY.md b/node_modules/fresh/HISTORY.md new file mode 100644 index 000000000..4586996a3 --- /dev/null +++ b/node_modules/fresh/HISTORY.md @@ -0,0 +1,70 @@ +0.5.2 / 2017-09-13 +================== + + * Fix regression matching multiple ETags in `If-None-Match` + * perf: improve `If-None-Match` token parsing + +0.5.1 / 2017-09-11 +================== + + * Fix handling of modified headers with invalid dates + * perf: improve ETag match loop + +0.5.0 / 2017-02-21 +================== + + * Fix incorrect result when `If-None-Match` has both `*` and ETags + * Fix 
weak `ETag` matching to match spec + * perf: delay reading header values until needed + * perf: skip checking modified time if ETag check failed + * perf: skip parsing `If-None-Match` when no `ETag` header + * perf: use `Date.parse` instead of `new Date` + +0.4.0 / 2017-02-05 +================== + + * Fix false detection of `no-cache` request directive + * perf: enable strict mode + * perf: hoist regular expressions + * perf: remove duplicate conditional + * perf: remove unnecessary boolean coercions + +0.3.0 / 2015-05-12 +================== + + * Add weak `ETag` matching support + +0.2.4 / 2014-09-07 +================== + + * Support Node.js 0.6 + +0.2.3 / 2014-09-07 +================== + + * Move repository to jshttp + +0.2.2 / 2014-02-19 +================== + + * Revert "Fix for blank page on Safari reload" + +0.2.1 / 2014-01-29 +================== + + * Fix for blank page on Safari reload + +0.2.0 / 2013-08-11 +================== + + * Return stale for `Cache-Control: no-cache` + +0.1.0 / 2012-06-15 +================== + + * Add `If-None-Match: *` support + +0.0.1 / 2012-06-10 +================== + + * Initial release diff --git a/node_modules/fresh/LICENSE b/node_modules/fresh/LICENSE new file mode 100644 index 000000000..1434ade75 --- /dev/null +++ b/node_modules/fresh/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2012 TJ Holowaychuk +Copyright (c) 2016-2017 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/fresh/README.md b/node_modules/fresh/README.md new file mode 100644 index 000000000..1c1c680d1 --- /dev/null +++ b/node_modules/fresh/README.md @@ -0,0 +1,119 @@ +# fresh + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +HTTP response freshness testing + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +``` +$ npm install fresh +``` + +## API + + + +```js +var fresh = require('fresh') +``` + +### fresh(reqHeaders, resHeaders) + +Check freshness of the response using request and response headers. 
+ +When the response is still "fresh" in the client's cache `true` is +returned, otherwise `false` is returned to indicate that the client +cache is now stale and the full response should be sent. + +When a client sends the `Cache-Control: no-cache` request header to +indicate an end-to-end reload request, this module will return `false` +to make handling these requests transparent. + +## Known Issues + +This module is designed to only follow the HTTP specifications, not +to work-around all kinda of client bugs (especially since this module +typically does not recieve enough information to understand what the +client actually is). + +There is a known issue that in certain versions of Safari, Safari +will incorrectly make a request that allows this module to validate +freshness of the resource even when Safari does not have a +representation of the resource in the cache. The module +[jumanji](https://www.npmjs.com/package/jumanji) can be used in +an Express application to work-around this issue and also provides +links to further reading on this Safari bug. + +## Example + +### API usage + + + +```js +var reqHeaders = { 'if-none-match': '"foo"' } +var resHeaders = { 'etag': '"bar"' } +fresh(reqHeaders, resHeaders) +// => false + +var reqHeaders = { 'if-none-match': '"foo"' } +var resHeaders = { 'etag': '"foo"' } +fresh(reqHeaders, resHeaders) +// => true +``` + +### Using with Node.js http server + +```js +var fresh = require('fresh') +var http = require('http') + +var server = http.createServer(function (req, res) { + // perform server logic + // ... including adding ETag / Last-Modified response headers + + if (isFresh(req, res)) { + // client has a fresh copy of resource + res.statusCode = 304 + res.end() + return + } + + // send the resource + res.statusCode = 200 + res.end('hello, world!') +}) + +function isFresh (req, res) { + return fresh(req.headers, { + 'etag': res.getHeader('ETag'), + 'last-modified': res.getHeader('Last-Modified') + }) +} + +server.listen(3000) +``` + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/fresh.svg +[npm-url]: https://npmjs.org/package/fresh +[node-version-image]: https://img.shields.io/node/v/fresh.svg +[node-version-url]: https://nodejs.org/en/ +[travis-image]: https://img.shields.io/travis/jshttp/fresh/master.svg +[travis-url]: https://travis-ci.org/jshttp/fresh +[coveralls-image]: https://img.shields.io/coveralls/jshttp/fresh/master.svg +[coveralls-url]: https://coveralls.io/r/jshttp/fresh?branch=master +[downloads-image]: https://img.shields.io/npm/dm/fresh.svg +[downloads-url]: https://npmjs.org/package/fresh diff --git a/node_modules/fresh/index.js b/node_modules/fresh/index.js new file mode 100644 index 000000000..d154f5a7d --- /dev/null +++ b/node_modules/fresh/index.js @@ -0,0 +1,137 @@ +/*! + * fresh + * Copyright(c) 2012 TJ Holowaychuk + * Copyright(c) 2016-2017 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * RegExp to check for no-cache token in Cache-Control. + * @private + */ + +var CACHE_CONTROL_NO_CACHE_REGEXP = /(?:^|,)\s*?no-cache\s*?(?:,|$)/ + +/** + * Module exports. + * @public + */ + +module.exports = fresh + +/** + * Check freshness of the response using request and response headers. 
+ * + * @param {Object} reqHeaders + * @param {Object} resHeaders + * @return {Boolean} + * @public + */ + +function fresh (reqHeaders, resHeaders) { + // fields + var modifiedSince = reqHeaders['if-modified-since'] + var noneMatch = reqHeaders['if-none-match'] + + // unconditional request + if (!modifiedSince && !noneMatch) { + return false + } + + // Always return stale when Cache-Control: no-cache + // to support end-to-end reload requests + // https://tools.ietf.org/html/rfc2616#section-14.9.4 + var cacheControl = reqHeaders['cache-control'] + if (cacheControl && CACHE_CONTROL_NO_CACHE_REGEXP.test(cacheControl)) { + return false + } + + // if-none-match + if (noneMatch && noneMatch !== '*') { + var etag = resHeaders['etag'] + + if (!etag) { + return false + } + + var etagStale = true + var matches = parseTokenList(noneMatch) + for (var i = 0; i < matches.length; i++) { + var match = matches[i] + if (match === etag || match === 'W/' + etag || 'W/' + match === etag) { + etagStale = false + break + } + } + + if (etagStale) { + return false + } + } + + // if-modified-since + if (modifiedSince) { + var lastModified = resHeaders['last-modified'] + var modifiedStale = !lastModified || !(parseHttpDate(lastModified) <= parseHttpDate(modifiedSince)) + + if (modifiedStale) { + return false + } + } + + return true +} + +/** + * Parse an HTTP Date into a number. + * + * @param {string} date + * @private + */ + +function parseHttpDate (date) { + var timestamp = date && Date.parse(date) + + // istanbul ignore next: guard against date.js Date.parse patching + return typeof timestamp === 'number' + ? timestamp + : NaN +} + +/** + * Parse a HTTP token list. + * + * @param {string} str + * @private + */ + +function parseTokenList (str) { + var end = 0 + var list = [] + var start = 0 + + // gather tokens + for (var i = 0, len = str.length; i < len; i++) { + switch (str.charCodeAt(i)) { + case 0x20: /* */ + if (start === end) { + start = end = i + 1 + } + break + case 0x2c: /* , */ + list.push(str.substring(start, end)) + start = end = i + 1 + break + default: + end = i + 1 + break + } + } + + // final token + list.push(str.substring(start, end)) + + return list +} diff --git a/node_modules/fresh/package.json b/node_modules/fresh/package.json new file mode 100644 index 000000000..29c111fad --- /dev/null +++ b/node_modules/fresh/package.json @@ -0,0 +1,90 @@ +{ + "_from": "fresh@0.5.2", + "_id": "fresh@0.5.2", + "_inBundle": false, + "_integrity": "sha1-PYyt2Q2XZWn6g1qx+OSyOhBWBac=", + "_location": "/fresh", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "fresh@0.5.2", + "name": "fresh", + "escapedName": "fresh", + "rawSpec": "0.5.2", + "saveSpec": null, + "fetchSpec": "0.5.2" + }, + "_requiredBy": [ + "/express", + "/send" + ], + "_resolved": "https://registry.npmjs.org/fresh/-/fresh-0.5.2.tgz", + "_shasum": "3d8cadd90d976569fa835ab1f8e4b23a105605a7", + "_spec": "fresh@0.5.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca", + "url": "http://tjholowaychuk.com" + }, + "bugs": { + "url": "https://github.com/jshttp/fresh/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "deprecated": false, + "description": "HTTP response 
freshness testing", + "devDependencies": { + "beautify-benchmark": "0.2.4", + "benchmark": "2.1.4", + "eslint": "3.19.0", + "eslint-config-standard": "10.2.1", + "eslint-plugin-import": "2.7.0", + "eslint-plugin-markdown": "1.0.0-beta.6", + "eslint-plugin-node": "5.1.1", + "eslint-plugin-promise": "3.5.0", + "eslint-plugin-standard": "3.0.1", + "istanbul": "0.4.5", + "mocha": "1.21.5" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "index.js" + ], + "homepage": "https://github.com/jshttp/fresh#readme", + "keywords": [ + "fresh", + "http", + "conditional", + "cache" + ], + "license": "MIT", + "name": "fresh", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/fresh.git" + }, + "scripts": { + "bench": "node benchmark/index.js", + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" + }, + "version": "0.5.2" +} diff --git a/node_modules/get-stream/buffer-stream.js b/node_modules/get-stream/buffer-stream.js new file mode 100644 index 000000000..4121c8e56 --- /dev/null +++ b/node_modules/get-stream/buffer-stream.js @@ -0,0 +1,51 @@ +'use strict'; +const {PassThrough} = require('stream'); + +module.exports = options => { + options = Object.assign({}, options); + + const {array} = options; + let {encoding} = options; + const buffer = encoding === 'buffer'; + let objectMode = false; + + if (array) { + objectMode = !(encoding || buffer); + } else { + encoding = encoding || 'utf8'; + } + + if (buffer) { + encoding = null; + } + + let len = 0; + const ret = []; + const stream = new PassThrough({objectMode}); + + if (encoding) { + stream.setEncoding(encoding); + } + + stream.on('data', chunk => { + ret.push(chunk); + + if (objectMode) { + len = ret.length; + } else { + len += chunk.length; + } + }); + + stream.getBufferedValue = () => { + if (array) { + return ret; + } + + return buffer ? 
Buffer.concat(ret, len) : ret.join(''); + }; + + stream.getBufferedLength = () => len; + + return stream; +}; diff --git a/node_modules/get-stream/index.js b/node_modules/get-stream/index.js new file mode 100644 index 000000000..7e5584a63 --- /dev/null +++ b/node_modules/get-stream/index.js @@ -0,0 +1,50 @@ +'use strict'; +const pump = require('pump'); +const bufferStream = require('./buffer-stream'); + +class MaxBufferError extends Error { + constructor() { + super('maxBuffer exceeded'); + this.name = 'MaxBufferError'; + } +} + +function getStream(inputStream, options) { + if (!inputStream) { + return Promise.reject(new Error('Expected a stream')); + } + + options = Object.assign({maxBuffer: Infinity}, options); + + const {maxBuffer} = options; + + let stream; + return new Promise((resolve, reject) => { + const rejectPromise = error => { + if (error) { // A null check + error.bufferedData = stream.getBufferedValue(); + } + reject(error); + }; + + stream = pump(inputStream, bufferStream(options), error => { + if (error) { + rejectPromise(error); + return; + } + + resolve(); + }); + + stream.on('data', () => { + if (stream.getBufferedLength() > maxBuffer) { + rejectPromise(new MaxBufferError()); + } + }); + }).then(() => stream.getBufferedValue()); +} + +module.exports = getStream; +module.exports.buffer = (stream, options) => getStream(stream, Object.assign({}, options, {encoding: 'buffer'})); +module.exports.array = (stream, options) => getStream(stream, Object.assign({}, options, {array: true})); +module.exports.MaxBufferError = MaxBufferError; diff --git a/node_modules/get-stream/license b/node_modules/get-stream/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/get-stream/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/get-stream/package.json b/node_modules/get-stream/package.json new file mode 100644 index 000000000..f8d0809e9 --- /dev/null +++ b/node_modules/get-stream/package.json @@ -0,0 +1,78 @@ +{ + "_from": "get-stream@^4.1.0", + "_id": "get-stream@4.1.0", + "_inBundle": false, + "_integrity": "sha512-GMat4EJ5161kIy2HevLlr4luNjBgvmj413KaQA7jt4V8B4RDsfpHk7WQ9GVqfYyyx8OS/L66Kox+rJRNklLK7w==", + "_location": "/get-stream", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "get-stream@^4.1.0", + "name": "get-stream", + "escapedName": "get-stream", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/got" + ], + "_resolved": "https://registry.npmjs.org/get-stream/-/get-stream-4.1.0.tgz", + "_shasum": "c1b255575f3dc21d59bfc79cd3d2b46b1c3a54b5", + "_spec": "get-stream@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/get-stream/issues" + }, + "bundleDependencies": false, + "dependencies": { + "pump": "^3.0.0" + }, + "deprecated": false, + "description": "Get a stream as a string, buffer, or array", + "devDependencies": { + "ava": "*", + "into-stream": "^3.0.0", + "xo": "*" + }, + "engines": { + "node": ">=6" + }, + "files": [ + "index.js", + "buffer-stream.js" + ], + "homepage": "https://github.com/sindresorhus/get-stream#readme", + "keywords": [ + "get", + "stream", + "promise", + "concat", + "string", + "text", + "buffer", + "read", + "data", + "consume", + "readable", + "readablestream", + "array", + "object" + ], + "license": "MIT", + "name": "get-stream", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/get-stream.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "4.1.0" +} diff --git a/node_modules/get-stream/readme.md b/node_modules/get-stream/readme.md new file mode 100644 index 000000000..b87a4d37c --- /dev/null +++ b/node_modules/get-stream/readme.md @@ -0,0 +1,123 @@ +# get-stream [![Build Status](https://travis-ci.org/sindresorhus/get-stream.svg?branch=master)](https://travis-ci.org/sindresorhus/get-stream) + +> Get a stream as a string, buffer, or array + + +## Install + +``` +$ npm install get-stream +``` + + +## Usage + +```js +const fs = require('fs'); +const getStream = require('get-stream'); + +(async () => { + const stream = fs.createReadStream('unicorn.txt'); + + console.log(await getStream(stream)); + /* + ,,))))))));, + __)))))))))))))), + \|/ -\(((((''''((((((((. + -*-==//////(('' . `)))))), + /|\ ))| o ;-. '((((( ,(, + ( `| / ) ;))))' ,_))^;(~ + | | | ,))((((_ _____------~~~-. %,;(;(>';'~ + o_); ; )))(((` ~---~ `:: \ %%~~)(v;(`('~ + ; ''''```` `: `:::|\,__,%% );`'; ~ + | _ ) / `:|`----' `-' + ______/\/~ | / / + /~;;.____/;;' / ___--,-( `;;;/ + / // _;______;'------~~~~~ /;;/\ / + // | | / ; \;;,\ + (<_ | ; /',/-----' _> + \_| ||_ //~;~~~~~~~~~ + `\_| (,~~ + \~\ + ~~ + */ +})(); +``` + + +## API + +The methods returns a promise that resolves when the `end` event fires on the stream, indicating that there is no more data to be read. The stream is switched to flowing mode. + +### getStream(stream, [options]) + +Get the `stream` as a string. + +#### options + +Type: `Object` + +##### encoding + +Type: `string`
+Default: `utf8` + +[Encoding](https://nodejs.org/api/buffer.html#buffer_buffer) of the incoming stream. + +##### maxBuffer + +Type: `number`
+Default: `Infinity` + +Maximum length of the returned string. If it exceeds this value before the stream ends, the promise will be rejected with a `getStream.MaxBufferError` error. + +### getStream.buffer(stream, [options]) + +Get the `stream` as a buffer. + +It honors the `maxBuffer` option as above, but it refers to byte length rather than string length. + +### getStream.array(stream, [options]) + +Get the `stream` as an array of values. + +It honors both the `maxBuffer` and `encoding` options. The behavior changes slightly based on the encoding chosen: + +- When `encoding` is unset, it assumes an [object mode stream](https://nodesource.com/blog/understanding-object-streams/) and collects values emitted from `stream` unmodified. In this case `maxBuffer` refers to the number of items in the array (not the sum of their sizes). + +- When `encoding` is set to `buffer`, it collects an array of buffers. `maxBuffer` refers to the summed byte lengths of every buffer in the array. + +- When `encoding` is set to anything else, it collects an array of strings. `maxBuffer` refers to the summed character lengths of every string in the array. + + +## Errors + +If the input stream emits an `error` event, the promise will be rejected with the error. The buffered data will be attached to the `bufferedData` property of the error. + +```js +(async () => { + try { + await getStream(streamThatErrorsAtTheEnd('unicorn')); + } catch (error) { + console.log(error.bufferedData); + //=> 'unicorn' + } +})() +``` + + +## FAQ + +### How is this different from [`concat-stream`](https://github.com/maxogden/concat-stream)? + +This module accepts a stream instead of being one and returns a promise instead of using a callback. The API is simpler and it only supports returning a string, buffer, or array. It doesn't have a fragile type inference. You explicitly choose what you want. And it doesn't depend on the huge `readable-stream` package. 
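The `maxBuffer` and error notes above are not shown together in the examples, so here is a small sketch, assuming a local file large enough to trip the limit (the file name is hypothetical):

```js
const fs = require('fs');
const getStream = require('get-stream');

(async () => {
	try {
		// Reject once more than 1 MiB has been buffered
		const contents = await getStream(fs.createReadStream('big-file.txt'), {maxBuffer: 1024 * 1024});
		console.log(contents.length);
	} catch (error) {
		if (error instanceof getStream.MaxBufferError) {
			// The partially read data is still available
			console.error('Exceeded 1 MiB; got', error.bufferedData.length, 'characters so far');
		} else {
			throw error;
		}
	}
})();
```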
+ + +## Related + +- [get-stdin](https://github.com/sindresorhus/get-stdin) - Get stdin as a string or buffer + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/glob-parent/CHANGELOG.md b/node_modules/glob-parent/CHANGELOG.md new file mode 100644 index 000000000..fb9de9618 --- /dev/null +++ b/node_modules/glob-parent/CHANGELOG.md @@ -0,0 +1,110 @@ +### [5.1.2](https://github.com/gulpjs/glob-parent/compare/v5.1.1...v5.1.2) (2021-03-06) + + +### Bug Fixes + +* eliminate ReDoS ([#36](https://github.com/gulpjs/glob-parent/issues/36)) ([f923116](https://github.com/gulpjs/glob-parent/commit/f9231168b0041fea3f8f954b3cceb56269fc6366)) + +### [5.1.1](https://github.com/gulpjs/glob-parent/compare/v5.1.0...v5.1.1) (2021-01-27) + + +### Bug Fixes + +* unescape exclamation mark ([#26](https://github.com/gulpjs/glob-parent/issues/26)) ([a98874f](https://github.com/gulpjs/glob-parent/commit/a98874f1a59e407f4fb1beb0db4efa8392da60bb)) + +## [5.1.0](https://github.com/gulpjs/glob-parent/compare/v5.0.0...v5.1.0) (2021-01-27) + + +### Features + +* add `flipBackslashes` option to disable auto conversion of slashes (closes [#24](https://github.com/gulpjs/glob-parent/issues/24)) ([#25](https://github.com/gulpjs/glob-parent/issues/25)) ([eecf91d](https://github.com/gulpjs/glob-parent/commit/eecf91d5e3834ed78aee39c4eaaae654d76b87b3)) + +## [5.0.0](https://github.com/gulpjs/glob-parent/compare/v4.0.0...v5.0.0) (2021-01-27) + + +### ⚠ BREAKING CHANGES + +* Drop support for node <6 & bump dependencies + +### Miscellaneous Chores + +* Drop support for node <6 & bump dependencies ([896c0c0](https://github.com/gulpjs/glob-parent/commit/896c0c00b4e7362f60b96e7fc295ae929245255a)) + +## [4.0.0](https://github.com/gulpjs/glob-parent/compare/v3.1.0...v4.0.0) (2021-01-27) + + +### ⚠ BREAKING CHANGES + +* question marks are valid path characters on Windows so avoid flagging as a glob when alone +* Update is-glob dependency + +### Features + +* hoist regexps and strings for performance gains ([4a80667](https://github.com/gulpjs/glob-parent/commit/4a80667c69355c76a572a5892b0f133c8e1f457e)) +* question marks are valid path characters on Windows so avoid flagging as a glob when alone ([2a551dd](https://github.com/gulpjs/glob-parent/commit/2a551dd0dc3235e78bf3c94843d4107072d17841)) +* Update is-glob dependency ([e41fcd8](https://github.com/gulpjs/glob-parent/commit/e41fcd895d1f7bc617dba45c9d935a7949b9c281)) + +## [3.1.0](https://github.com/gulpjs/glob-parent/compare/v3.0.1...v3.1.0) (2021-01-27) + + +### Features + +* allow basic win32 backslash use ([272afa5](https://github.com/gulpjs/glob-parent/commit/272afa5fd070fc0f796386a5993d4ee4a846988b)) +* handle extglobs (parentheses) containing separators ([7db1bdb](https://github.com/gulpjs/glob-parent/commit/7db1bdb0756e55fd14619e8ce31aa31b17b117fd)) +* new approach to braces/brackets handling ([8269bd8](https://github.com/gulpjs/glob-parent/commit/8269bd89290d99fac9395a354fb56fdcdb80f0be)) +* pre-process braces/brackets sections ([9ef8a87](https://github.com/gulpjs/glob-parent/commit/9ef8a87f66b1a43d0591e7a8e4fc5a18415ee388)) +* preserve escaped brace/bracket at end of string ([8cfb0ba](https://github.com/gulpjs/glob-parent/commit/8cfb0ba84202d51571340dcbaf61b79d16a26c76)) + + +### Bug Fixes + +* trailing escaped square brackets ([99ec9fe](https://github.com/gulpjs/glob-parent/commit/99ec9fecc60ee488ded20a94dd4f18b4f55c4ccf)) + +### [3.0.1](https://github.com/gulpjs/glob-parent/compare/v3.0.0...v3.0.1) (2021-01-27) + + +### Features + +* use 
path-dirname ponyfill ([cdbea5f](https://github.com/gulpjs/glob-parent/commit/cdbea5f32a58a54e001a75ddd7c0fccd4776aacc)) + + +### Bug Fixes + +* unescape glob-escaped dirnames on output ([598c533](https://github.com/gulpjs/glob-parent/commit/598c533bdf49c1428bc063aa9b8db40c5a86b030)) + +## [3.0.0](https://github.com/gulpjs/glob-parent/compare/v2.0.0...v3.0.0) (2021-01-27) + + +### ⚠ BREAKING CHANGES + +* update is-glob dependency + +### Features + +* update is-glob dependency ([5c5f8ef](https://github.com/gulpjs/glob-parent/commit/5c5f8efcee362a8e7638cf8220666acd8784f6bd)) + +## [2.0.0](https://github.com/gulpjs/glob-parent/compare/v1.3.0...v2.0.0) (2021-01-27) + + +### Features + +* move up to dirname regardless of glob characters ([f97fb83](https://github.com/gulpjs/glob-parent/commit/f97fb83be2e0a9fc8d3b760e789d2ecadd6aa0c2)) + +## [1.3.0](https://github.com/gulpjs/glob-parent/compare/v1.2.0...v1.3.0) (2021-01-27) + +## [1.2.0](https://github.com/gulpjs/glob-parent/compare/v1.1.0...v1.2.0) (2021-01-27) + + +### Reverts + +* feat: make regex test strings smaller ([dc80fa9](https://github.com/gulpjs/glob-parent/commit/dc80fa9658dca20549cfeba44bbd37d5246fcce0)) + +## [1.1.0](https://github.com/gulpjs/glob-parent/compare/v1.0.0...v1.1.0) (2021-01-27) + + +### Features + +* make regex test strings smaller ([cd83220](https://github.com/gulpjs/glob-parent/commit/cd832208638f45169f986d80fcf66e401f35d233)) + +## 1.0.0 (2021-01-27) + diff --git a/node_modules/glob-parent/LICENSE b/node_modules/glob-parent/LICENSE new file mode 100644 index 000000000..63222d7a8 --- /dev/null +++ b/node_modules/glob-parent/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) 2015, 2019 Elan Shanker + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/glob-parent/README.md b/node_modules/glob-parent/README.md new file mode 100644 index 000000000..36a279384 --- /dev/null +++ b/node_modules/glob-parent/README.md @@ -0,0 +1,137 @@ +
+ +# glob-parent + +[![NPM version][npm-image]][npm-url] [![Downloads][downloads-image]][npm-url] [![Azure Pipelines Build Status][azure-pipelines-image]][azure-pipelines-url] [![Travis Build Status][travis-image]][travis-url] [![AppVeyor Build Status][appveyor-image]][appveyor-url] [![Coveralls Status][coveralls-image]][coveralls-url] [![Gitter chat][gitter-image]][gitter-url] + +Extract the non-magic parent path from a glob string. + +## Usage + +```js +var globParent = require('glob-parent'); + +globParent('path/to/*.js'); // 'path/to' +globParent('/root/path/to/*.js'); // '/root/path/to' +globParent('/*.js'); // '/' +globParent('*.js'); // '.' +globParent('**/*.js'); // '.' +globParent('path/{to,from}'); // 'path' +globParent('path/!(to|from)'); // 'path' +globParent('path/?(to|from)'); // 'path' +globParent('path/+(to|from)'); // 'path' +globParent('path/*(to|from)'); // 'path' +globParent('path/@(to|from)'); // 'path' +globParent('path/**/*'); // 'path' + +// if provided a non-glob path, returns the nearest dir +globParent('path/foo/bar.js'); // 'path/foo' +globParent('path/foo/'); // 'path/foo' +globParent('path/foo'); // 'path' (see issue #3 for details) +``` + +## API + +### `globParent(maybeGlobString, [options])` + +Takes a string and returns the part of the path before the glob begins. Be aware of Escaping rules and Limitations below. + +#### options + +```js +{ + // Disables the automatic conversion of slashes for Windows + flipBackslashes: true +} +``` + +## Escaping + +The following characters have special significance in glob patterns and must be escaped if you want them to be treated as regular path characters: + +- `?` (question mark) unless used as a path segment alone +- `*` (asterisk) +- `|` (pipe) +- `(` (opening parenthesis) +- `)` (closing parenthesis) +- `{` (opening curly brace) +- `}` (closing curly brace) +- `[` (opening bracket) +- `]` (closing bracket) + +**Example** + +```js +globParent('foo/[bar]/') // 'foo' +globParent('foo/\\[bar]/') // 'foo/[bar]' +``` + +## Limitations + +### Braces & Brackets +This library attempts a quick and imperfect method of determining which path +parts have glob magic without fully parsing/lexing the pattern. There are some +advanced use cases that can trip it up, such as nested braces where the outer +pair is escaped and the inner one contains a path separator. If you find +yourself in the unlikely circumstance of being affected by this or need to +ensure higher-fidelity glob handling in your library, it is recommended that you +pre-process your input with [expand-braces] and/or [expand-brackets]. + +### Windows +Backslashes are not valid path separators for globs. If a path with backslashes +is provided anyway, for simple cases, glob-parent will replace the path +separator for you and return the non-glob parent path (now with +forward-slashes, which are still valid as Windows path separators). + +This cannot be used in conjunction with escape characters. + +```js +// BAD +globParent('C:\\Program Files \\(x86\\)\\*.ext') // 'C:/Program Files /(x86/)' + +// GOOD +globParent('C:/Program Files\\(x86\\)/*.ext') // 'C:/Program Files (x86)' +``` + +If you are using escape characters for a pattern without path parts (i.e. +relative to `cwd`), prefix with `./` to avoid confusing glob-parent. + +```js +// BAD +globParent('foo \\[bar]') // 'foo ' +globParent('foo \\[bar]*') // 'foo ' + +// GOOD +globParent('./foo \\[bar]') // 'foo [bar]' +globParent('./foo \\[bar]*') // '.' 
+``` + +## License + +ISC + +[expand-braces]: https://github.com/jonschlinkert/expand-braces +[expand-brackets]: https://github.com/jonschlinkert/expand-brackets + +[downloads-image]: https://img.shields.io/npm/dm/glob-parent.svg +[npm-url]: https://www.npmjs.com/package/glob-parent +[npm-image]: https://img.shields.io/npm/v/glob-parent.svg + +[azure-pipelines-url]: https://dev.azure.com/gulpjs/gulp/_build/latest?definitionId=2&branchName=master +[azure-pipelines-image]: https://dev.azure.com/gulpjs/gulp/_apis/build/status/glob-parent?branchName=master + +[travis-url]: https://travis-ci.org/gulpjs/glob-parent +[travis-image]: https://img.shields.io/travis/gulpjs/glob-parent.svg?label=travis-ci + +[appveyor-url]: https://ci.appveyor.com/project/gulpjs/glob-parent +[appveyor-image]: https://img.shields.io/appveyor/ci/gulpjs/glob-parent.svg?label=appveyor + +[coveralls-url]: https://coveralls.io/r/gulpjs/glob-parent +[coveralls-image]: https://img.shields.io/coveralls/gulpjs/glob-parent/master.svg + +[gitter-url]: https://gitter.im/gulpjs/gulp +[gitter-image]: https://badges.gitter.im/gulpjs/gulp.svg diff --git a/node_modules/glob-parent/index.js b/node_modules/glob-parent/index.js new file mode 100644 index 000000000..09e257ea3 --- /dev/null +++ b/node_modules/glob-parent/index.js @@ -0,0 +1,42 @@ +'use strict'; + +var isGlob = require('is-glob'); +var pathPosixDirname = require('path').posix.dirname; +var isWin32 = require('os').platform() === 'win32'; + +var slash = '/'; +var backslash = /\\/g; +var enclosure = /[\{\[].*[\}\]]$/; +var globby = /(^|[^\\])([\{\[]|\([^\)]+$)/; +var escaped = /\\([\!\*\?\|\[\]\(\)\{\}])/g; + +/** + * @param {string} str + * @param {Object} opts + * @param {boolean} [opts.flipBackslashes=true] + * @returns {string} + */ +module.exports = function globParent(str, opts) { + var options = Object.assign({ flipBackslashes: true }, opts); + + // flip windows path separators + if (options.flipBackslashes && isWin32 && str.indexOf(slash) < 0) { + str = str.replace(backslash, slash); + } + + // special case for strings ending in enclosure containing path separator + if (enclosure.test(str)) { + str += slash; + } + + // preserves full path in case of trailing path separator + str += 'a'; + + // remove path parts that are globby + do { + str = pathPosixDirname(str); + } while (isGlob(str) || globby.test(str)); + + // remove escape chars and return result + return str.replace(escaped, '$1'); +}; diff --git a/node_modules/glob-parent/package.json b/node_modules/glob-parent/package.json new file mode 100644 index 000000000..e6266262b --- /dev/null +++ b/node_modules/glob-parent/package.json @@ -0,0 +1,90 @@ +{ + "_from": "glob-parent@~5.1.2", + "_id": "glob-parent@5.1.2", + "_inBundle": false, + "_integrity": "sha512-AOIgSQCepiJYwP3ARnGx+5VnTu2HBYdzbGP45eLw1vr3zB3vZLeyed1sC9hnbcOc9/SrMyM5RPQrkGz4aS9Zow==", + "_location": "/glob-parent", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "glob-parent@~5.1.2", + "name": "glob-parent", + "escapedName": "glob-parent", + "rawSpec": "~5.1.2", + "saveSpec": null, + "fetchSpec": "~5.1.2" + }, + "_requiredBy": [ + "/chokidar" + ], + "_resolved": "https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz", + "_shasum": "869832c58034fe68a4093c17dc15e8340d8401c4", + "_spec": "glob-parent@~5.1.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\chokidar", + "author": { + "name": "Gulp Team", + "email": "team@gulpjs.com", + "url": 
"https://gulpjs.com/" + }, + "bugs": { + "url": "https://github.com/gulpjs/glob-parent/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Elan Shanker", + "url": "https://github.com/es128" + }, + { + "name": "Blaine Bublitz", + "email": "blaine.bublitz@gmail.com" + } + ], + "dependencies": { + "is-glob": "^4.0.1" + }, + "deprecated": false, + "description": "Extract the non-magic parent path from a glob string.", + "devDependencies": { + "coveralls": "^3.0.11", + "eslint": "^2.13.1", + "eslint-config-gulp": "^3.0.1", + "expect": "^1.20.2", + "mocha": "^6.0.2", + "nyc": "^13.3.0" + }, + "engines": { + "node": ">= 6" + }, + "files": [ + "LICENSE", + "index.js" + ], + "homepage": "https://github.com/gulpjs/glob-parent#readme", + "keywords": [ + "glob", + "parent", + "strip", + "path", + "dirname", + "directory", + "base", + "wildcard" + ], + "license": "ISC", + "main": "index.js", + "name": "glob-parent", + "repository": { + "type": "git", + "url": "git+https://github.com/gulpjs/glob-parent.git" + }, + "scripts": { + "azure-pipelines": "nyc mocha --async-only --reporter xunit -O output=test.xunit", + "coveralls": "nyc report --reporter=text-lcov | coveralls", + "lint": "eslint .", + "pretest": "npm run lint", + "test": "nyc mocha --async-only" + }, + "version": "5.1.2" +} diff --git a/node_modules/global-dirs/index.d.ts b/node_modules/global-dirs/index.d.ts new file mode 100644 index 000000000..ac39a2344 --- /dev/null +++ b/node_modules/global-dirs/index.d.ts @@ -0,0 +1,60 @@ +declare namespace globalDirectories { + interface GlobalDirectories { + /** + Directory with globally installed packages. + + Equivalent to `npm root --global`. + */ + readonly packages: string; + + /** + Directory with globally installed binaries. + + Equivalent to `npm bin --global`. + */ + readonly binaries: string; + + /** + Directory with directories for packages and binaries. You probably want either of the above. + + Equivalent to `npm prefix --global`. + */ + readonly prefix: string; + } +} + +declare const globalDirectories: { + /** + Get the directory of globally installed packages and binaries. + + @example + ``` + import globalDirectories = require('global-dirs'); + + console.log(globalDirectories.npm.prefix); + //=> '/usr/local' + + console.log(globalDirectories.npm.packages); + //=> '/usr/local/lib/node_modules' + ``` + */ + readonly npm: globalDirectories.GlobalDirectories; + + /** + Get the directory of globally installed packages and binaries. + + @example + ``` + import globalDirectories = require('global-dirs'); + + console.log(globalDirectories.npm.binaries); + //=> '/usr/local/bin' + + console.log(globalDirectories.yarn.packages); + //=> '/Users/sindresorhus/.config/yarn/global/node_modules' + ``` + */ + readonly yarn: globalDirectories.GlobalDirectories; +}; + +export = globalDirectories; diff --git a/node_modules/global-dirs/index.js b/node_modules/global-dirs/index.js new file mode 100644 index 000000000..c5d24384f --- /dev/null +++ b/node_modules/global-dirs/index.js @@ -0,0 +1,120 @@ +'use strict'; +const path = require('path'); +const os = require('os'); +const fs = require('fs'); +const ini = require('ini'); + +const isWindows = process.platform === 'win32'; + +const readRc = filePath => { + try { + return ini.parse(fs.readFileSync(filePath, 'utf8')).prefix; + } catch {} +}; + +const getEnvNpmPrefix = () => { + // TODO: Remove the `.reduce` call. 
+ // eslint-disable-next-line unicorn/no-array-reduce + return Object.keys(process.env).reduce((prefix, name) => { + return /^npm_config_prefix$/i.test(name) ? process.env[name] : prefix; + }, undefined); +}; + +const getGlobalNpmrc = () => { + if (isWindows && process.env.APPDATA) { + // Hardcoded contents of `c:\Program Files\nodejs\node_modules\npm\npmrc` + return path.join(process.env.APPDATA, '/npm/etc/npmrc'); + } + + // Homebrew special case: `$(brew --prefix)/lib/node_modules/npm/npmrc` + if (process.execPath.includes('/Cellar/node')) { + const homebrewPrefix = process.execPath.slice(0, process.execPath.indexOf('/Cellar/node')); + return path.join(homebrewPrefix, '/lib/node_modules/npm/npmrc'); + } + + if (process.execPath.endsWith('/bin/node')) { + const installDir = path.dirname(path.dirname(process.execPath)); + return path.join(installDir, '/etc/npmrc'); + } +}; + +const getDefaultNpmPrefix = () => { + if (isWindows) { + // `c:\node\node.exe` → `prefix=c:\node\` + return path.dirname(process.execPath); + } + + // `/usr/local/bin/node` → `prefix=/usr/local` + return path.dirname(path.dirname(process.execPath)); +}; + +const getNpmPrefix = () => { + const envPrefix = getEnvNpmPrefix(); + if (envPrefix) { + return envPrefix; + } + + const homePrefix = readRc(path.join(os.homedir(), '.npmrc')); + if (homePrefix) { + return homePrefix; + } + + if (process.env.PREFIX) { + return process.env.PREFIX; + } + + const globalPrefix = readRc(getGlobalNpmrc()); + if (globalPrefix) { + return globalPrefix; + } + + return getDefaultNpmPrefix(); +}; + +const npmPrefix = path.resolve(getNpmPrefix()); + +const getYarnWindowsDirectory = () => { + if (isWindows && process.env.LOCALAPPDATA) { + const dir = path.join(process.env.LOCALAPPDATA, 'Yarn'); + if (fs.existsSync(dir)) { + return dir; + } + } + + return false; +}; + +const getYarnPrefix = () => { + if (process.env.PREFIX) { + return process.env.PREFIX; + } + + const windowsPrefix = getYarnWindowsDirectory(); + if (windowsPrefix) { + return windowsPrefix; + } + + const configPrefix = path.join(os.homedir(), '.config/yarn'); + if (fs.existsSync(configPrefix)) { + return configPrefix; + } + + const homePrefix = path.join(os.homedir(), '.yarn-config'); + if (fs.existsSync(homePrefix)) { + return homePrefix; + } + + // Yarn supports the npm conventions but the inverse is not true + return npmPrefix; +}; + +exports.npm = {}; +exports.npm.prefix = npmPrefix; +exports.npm.packages = path.join(npmPrefix, isWindows ? 'node_modules' : 'lib/node_modules'); +exports.npm.binaries = isWindows ? npmPrefix : path.join(npmPrefix, 'bin'); + +const yarnPrefix = path.resolve(getYarnPrefix()); +exports.yarn = {}; +exports.yarn.prefix = yarnPrefix; +exports.yarn.packages = path.join(yarnPrefix, getYarnWindowsDirectory() ? 
'Data/global/node_modules' : 'global/node_modules'); +exports.yarn.binaries = path.join(exports.yarn.packages, '.bin'); diff --git a/node_modules/global-dirs/license b/node_modules/global-dirs/license new file mode 100644 index 000000000..fa7ceba3e --- /dev/null +++ b/node_modules/global-dirs/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (https://sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/global-dirs/package.json b/node_modules/global-dirs/package.json new file mode 100644 index 000000000..837036d0d --- /dev/null +++ b/node_modules/global-dirs/package.json @@ -0,0 +1,88 @@ +{ + "_from": "global-dirs@^3.0.0", + "_id": "global-dirs@3.0.0", + "_inBundle": false, + "_integrity": "sha512-v8ho2DS5RiCjftj1nD9NmnfaOzTdud7RRnVd9kFNOjqZbISlx5DQ+OrTkywgd0dIt7oFCvKetZSHoHcP3sDdiA==", + "_location": "/global-dirs", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "global-dirs@^3.0.0", + "name": "global-dirs", + "escapedName": "global-dirs", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/is-installed-globally" + ], + "_resolved": "https://registry.npmjs.org/global-dirs/-/global-dirs-3.0.0.tgz", + "_shasum": "70a76fe84ea315ab37b1f5576cbde7d48ef72686", + "_spec": "global-dirs@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\is-installed-globally", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "https://sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/global-dirs/issues" + }, + "bundleDependencies": false, + "dependencies": { + "ini": "2.0.0" + }, + "deprecated": false, + "description": "Get the directory of globally installed packages and binaries", + "devDependencies": { + "ava": "^2.4.0", + "execa": "^5.0.0", + "import-fresh": "^3.3.0", + "tsd": "^0.14.0", + "xo": "^0.37.1" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "funding": "https://github.com/sponsors/sindresorhus", + "homepage": "https://github.com/sindresorhus/global-dirs#readme", + "keywords": [ + "global", + "prefix", + "path", + "paths", + "npm", + "yarn", + "node", + "modules", + "node-modules", + "package", + "packages", + "binary", + "binaries", + "bin", + "directory", + "directories", + "npmrc", + "rc", + "config", + "root", + "resolve" + ], + "license": "MIT", + "name": "global-dirs", + "repository": { + "type": "git", + "url": 
"git+https://github.com/sindresorhus/global-dirs.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "3.0.0" +} diff --git a/node_modules/global-dirs/readme.md b/node_modules/global-dirs/readme.md new file mode 100644 index 000000000..0276ee18e --- /dev/null +++ b/node_modules/global-dirs/readme.md @@ -0,0 +1,72 @@ +# global-dirs + +> Get the directory of globally installed packages and binaries + +Uses the same resolution logic as `npm` and `yarn`. + +## Install + +``` +$ npm install global-dirs +``` + +## Usage + +```js +const globalDirectories = require('global-dirs'); + +console.log(globalDirectories.npm.prefix); +//=> '/usr/local' + +console.log(globalDirectories.npm.packages); +//=> '/usr/local/lib/node_modules' + +console.log(globalDirectories.npm.binaries); +//=> '/usr/local/bin' + +console.log(globalDirectories.yarn.packages); +//=> '/Users/sindresorhus/.config/yarn/global/node_modules' +``` + +## API + +### globalDirectories + +#### npm +#### yarn + +##### packages + +Directory with globally installed packages. + +Equivalent to `npm root --global`. + +##### binaries + +Directory with globally installed binaries. + +Equivalent to `npm bin --global`. + +##### prefix + +Directory with directories for packages and binaries. You probably want either of the above. + +Equivalent to `npm prefix --global`. + +## Related + +- [import-global](https://github.com/sindresorhus/import-global) - Import a globally installed module +- [resolve-global](https://github.com/sindresorhus/resolve-global) - Resolve the path of a globally installed module +- [is-installed-globally](https://github.com/sindresorhus/is-installed-globally) - Check if your package was installed globally + +--- + +
+ Get professional support for this package with a Tidelift subscription
+ Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies.
+
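As a small addendum to the global-dirs readme above, a hedged sketch (the package name is an arbitrary assumption) of combining the exposed directories with `path` to locate a globally installed package:

```js
const path = require('path');
const globalDirs = require('global-dirs');

// Hypothetical package we expect to be installed globally
const packageName = 'eslint';

// Build the expected on-disk location from the documented `npm.packages` directory
const globalPackagePath = path.join(globalDirs.npm.packages, packageName);

console.log(globalPackagePath);
//=> e.g. '/usr/local/lib/node_modules/eslint'
```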
diff --git a/node_modules/got/license b/node_modules/got/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/got/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/got/package.json b/node_modules/got/package.json new file mode 100644 index 000000000..bce28097d --- /dev/null +++ b/node_modules/got/package.json @@ -0,0 +1,106 @@ +{ + "_from": "got@^9.6.0", + "_id": "got@9.6.0", + "_inBundle": false, + "_integrity": "sha512-R7eWptXuGYxwijs0eV+v3o6+XH1IqVK8dJOEecQfTmkncw9AV4dcw/Dhxi8MdlqPthxxpZyizMzyg8RTmEsG+Q==", + "_location": "/got", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "got@^9.6.0", + "name": "got", + "escapedName": "got", + "rawSpec": "^9.6.0", + "saveSpec": null, + "fetchSpec": "^9.6.0" + }, + "_requiredBy": [ + "/package-json" + ], + "_resolved": "https://registry.npmjs.org/got/-/got-9.6.0.tgz", + "_shasum": "edf45e7d67f99545705de1f7bbeeeb121765ed85", + "_spec": "got@^9.6.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\package-json", + "ava": { + "concurrency": 4 + }, + "browser": { + "decompress-response": false, + "electron": false + }, + "bugs": { + "url": "https://github.com/sindresorhus/got/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@sindresorhus/is": "^0.14.0", + "@szmarczak/http-timer": "^1.1.2", + "cacheable-request": "^6.0.0", + "decompress-response": "^3.3.0", + "duplexer3": "^0.1.4", + "get-stream": "^4.1.0", + "lowercase-keys": "^1.0.1", + "mimic-response": "^1.0.1", + "p-cancelable": "^1.0.0", + "to-readable-stream": "^1.0.0", + "url-parse-lax": "^3.0.0" + }, + "deprecated": false, + "description": "Simplified HTTP requests", + "devDependencies": { + "ava": "^1.1.0", + "coveralls": "^3.0.0", + "delay": "^4.1.0", + "form-data": "^2.3.3", + "get-port": "^4.0.0", + "np": "^3.1.0", + "nyc": "^13.1.0", + "p-event": "^2.1.0", + "pem": "^1.13.2", + "proxyquire": "^2.0.1", + "sinon": "^7.2.2", + "slow-stream": "0.0.4", + "tempfile": "^2.0.0", + "tempy": "^0.2.1", + "tough-cookie": "^3.0.0", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8.6" + }, + "files": [ + "source" + ], + "homepage": "https://github.com/sindresorhus/got#readme", + "keywords": [ + "http", + "https", + "get", + "got", + "url", + "uri", + "request", + "util", + "utility", + "simple", + "curl", + "wget", + "fetch", + "net", + "network", + "electron" + ], + "license": 
"MIT", + "main": "source", + "name": "got", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/got.git" + }, + "scripts": { + "release": "np", + "test": "xo && nyc ava" + }, + "version": "9.6.0" +} diff --git a/node_modules/got/readme.md b/node_modules/got/readme.md new file mode 100644 index 000000000..37132ab01 --- /dev/null +++ b/node_modules/got/readme.md @@ -0,0 +1,1237 @@ +
+ Got
+ +> Simplified HTTP requests + +[![Build Status: Linux](https://travis-ci.org/sindresorhus/got.svg?branch=master)](https://travis-ci.org/sindresorhus/got) [![Coverage Status](https://coveralls.io/repos/github/sindresorhus/got/badge.svg?branch=master)](https://coveralls.io/github/sindresorhus/got?branch=master) [![Downloads](https://img.shields.io/npm/dm/got.svg)](https://npmjs.com/got) [![Install size](https://packagephobia.now.sh/badge?p=got)](https://packagephobia.now.sh/result?p=got) + +Got is a human-friendly and powerful HTTP request library. + +It was created because the popular [`request`](https://github.com/request/request) package is bloated: [![Install size](https://packagephobia.now.sh/badge?p=request)](https://packagephobia.now.sh/result?p=request) + +Got is for Node.js. For browsers, we recommend [Ky](https://github.com/sindresorhus/ky). + + +## Highlights + +- [Promise & stream API](#api) +- [Request cancelation](#aborting-the-request) +- [RFC compliant caching](#cache-adapters) +- [Follows redirects](#followredirect) +- [Retries on failure](#retry) +- [Progress events](#onuploadprogress-progress) +- [Handles gzip/deflate](#decompress) +- [Timeout handling](#timeout) +- [Errors with metadata](#errors) +- [JSON mode](#json) +- [WHATWG URL support](#url) +- [Hooks](#hooks) +- [Instances with custom defaults](#instances) +- [Composable](advanced-creation.md#merging-instances) +- [Electron support](#useelectronnet) +- [Used by ~2000 packages and ~500K repos](https://github.com/sindresorhus/got/network/dependents) +- Actively maintained + +[Moving from Request?](migration-guides.md) + +[See how Got compares to other HTTP libraries](#comparison) + +## Install + +``` +$ npm install got +``` + + + + + + +## Usage + +```js +const got = require('got'); + +(async () => { + try { + const response = await got('sindresorhus.com'); + console.log(response.body); + //=> ' ...' + } catch (error) { + console.log(error.response.body); + //=> 'Internal server error ...' + } +})(); +``` + +###### Streams + +```js +const fs = require('fs'); +const got = require('got'); + +got.stream('sindresorhus.com').pipe(fs.createWriteStream('index.html')); + +// For POST, PUT, and PATCH methods `got.stream` returns a `stream.Writable` +fs.createReadStream('index.html').pipe(got.stream.post('sindresorhus.com')); +``` + + +### API + +It's a `GET` request by default, but can be changed by using different methods or in the `options`. + +#### got(url, [options]) + +Returns a Promise for a [`response` object](#response) or a [stream](#streams-1) if `options.stream` is set to true. + +##### url + +Type: `string` `Object` + +The URL to request, as a string, a [`https.request` options object](https://nodejs.org/api/https.html#https_https_request_options_callback), or a [WHATWG `URL`](https://nodejs.org/api/url.html#url_class_url). + +Properties from `options` will override properties in the parsed `url`. + +If no protocol is specified, it will default to `https`. + +##### options + +Type: `Object` + +Any of the [`https.request`](https://nodejs.org/api/https.html#https_https_request_options_callback) options. + +###### baseUrl + +Type: `string` `Object` + +When specified, `url` will be prepended by `baseUrl`.
+If you specify an absolute URL, it will skip the `baseUrl`. + +Very useful when used with `got.extend()` to create niche-specific Got instances. + +Can be a string or a [WHATWG `URL`](https://nodejs.org/api/url.html#url_class_url). + +Slash at the end of `baseUrl` and at the beginning of the `url` argument is optional: + +```js +await got('hello', {baseUrl: 'https://example.com/v1'}); +//=> 'https://example.com/v1/hello' + +await got('/hello', {baseUrl: 'https://example.com/v1/'}); +//=> 'https://example.com/v1/hello' + +await got('/hello', {baseUrl: 'https://example.com/v1'}); +//=> 'https://example.com/v1/hello' +``` + +###### headers + +Type: `Object`
+Default: `{}` + +Request headers. + +Existing headers will be overwritten. Headers set to `null` will be omitted. + +###### stream + +Type: `boolean`
+Default: `false` + +Returns a `Stream` instead of a `Promise`. This is equivalent to calling `got.stream(url, [options])`. + +###### body + +Type: `string` `Buffer` `stream.Readable` [`form-data` instance](https://github.com/form-data/form-data) + +**Note:** If you provide this option, `got.stream()` will be read-only. + +The body that will be sent with a `POST` request. + +If present in `options` and `options.method` is not set, `options.method` will be set to `POST`. + +The `content-length` header will be automatically set if `body` is a `string` / `Buffer` / `fs.createReadStream` instance / [`form-data` instance](https://github.com/form-data/form-data), and `content-length` and `transfer-encoding` are not manually set in `options.headers`. + +###### cookieJar + +Type: [`tough.CookieJar` instance](https://github.com/salesforce/tough-cookie#cookiejar) + +**Note:** If you provide this option, `options.headers.cookie` will be overridden. + +Cookie support. You don't have to care about parsing or how to store them. [Example.](#cookies) + +###### encoding + +Type: `string` `null`
+Default: `'utf8'` + +[Encoding](https://nodejs.org/api/buffer.html#buffer_buffers_and_character_encodings) to be used on `setEncoding` of the response data. If `null`, the body is returned as a [`Buffer`](https://nodejs.org/api/buffer.html) (binary data). + +###### form + +Type: `boolean`
+Default: `false` + +**Note:** If you provide this option, `got.stream()` will be read-only. +**Note:** `body` must be a plain object. It will be converted to a query string using [`(new URLSearchParams(object)).toString()`](https://nodejs.org/api/url.html#url_constructor_new_urlsearchparams_obj). + +If set to `true` and `Content-Type` header is not set, it will be set to `application/x-www-form-urlencoded`. + +###### json + +Type: `boolean`
+Default: `false` + +**Note:** If you use `got.stream()`, this option will be ignored. +**Note:** `body` must be a plain object or array and will be stringified. + +If set to `true` and `Content-Type` header is not set, it will be set to `application/json`. + +Parse response body with `JSON.parse` and set `accept` header to `application/json`. If used in conjunction with the `form` option, the `body` will the stringified as querystring and the response parsed as JSON. + +###### query + +Type: `string` `Object` [`URLSearchParams`](https://developer.mozilla.org/en-US/docs/Web/API/URLSearchParams) + +Query string that will be added to the request URL. This will override the query string in `url`. + +If you need to pass in an array, you can do it using a `URLSearchParams` instance: + +```js +const got = require('got'); + +const query = new URLSearchParams([['key', 'a'], ['key', 'b']]); + +got('https://example.com', {query}); + +console.log(query.toString()); +//=> 'key=a&key=b' +``` + +And if you need a different array format, you could use the [`query-string`](https://github.com/sindresorhus/query-string) package: + +```js +const got = require('got'); +const queryString = require('query-string'); + +const query = queryString.stringify({key: ['a', 'b']}, {arrayFormat: 'bracket'}); + +got('https://example.com', {query}); + +console.log(query); +//=> 'key[]=a&key[]=b' +``` + +###### timeout + +Type: `number` `Object` + +Milliseconds to wait for the server to end the response before aborting the request with [`got.TimeoutError`](#gottimeouterror) error (a.k.a. `request` property). By default, there's no timeout. + +This also accepts an `object` with the following fields to constrain the duration of each phase of the request lifecycle: + +- `lookup` starts when a socket is assigned and ends when the hostname has been resolved. Does not apply when using a Unix domain socket. +- `connect` starts when `lookup` completes (or when the socket is assigned if lookup does not apply to the request) and ends when the socket is connected. +- `secureConnect` starts when `connect` completes and ends when the handshaking process completes (HTTPS only). +- `socket` starts when the socket is connected. See [request.setTimeout](https://nodejs.org/api/http.html#http_request_settimeout_timeout_callback). +- `response` starts when the request has been written to the socket and ends when the response headers are received. +- `send` starts when the socket is connected and ends with the request has been written to the socket. +- `request` starts when the request is initiated and ends when the response's end event fires. + +###### retry + +Type: `number` `Object`
+Default: +- retries: `2` +- methods: `GET` `PUT` `HEAD` `DELETE` `OPTIONS` `TRACE` +- statusCodes: [`408`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/408) [`413`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/413) [`429`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/429) [`500`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/500) [`502`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/502) [`503`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/503) [`504`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/504) +- maxRetryAfter: `undefined` +- errorCodes: `ETIMEDOUT` `ECONNRESET` `EADDRINUSE` `ECONNREFUSED` `EPIPE` `ENOTFOUND` `ENETUNREACH` `EAI_AGAIN` + +An object representing `retries`, `methods`, `statusCodes`, `maxRetryAfter` and `errorCodes` fields for the time until retry, allowed methods, allowed status codes, maximum [`Retry-After`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Retry-After) time and allowed error codes. + +If `maxRetryAfter` is set to `undefined`, it will use `options.timeout`.
+If [`Retry-After`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Retry-After) header is greater than `maxRetryAfter`, it will cancel the request. + +Delays between retries counts with function `1000 * Math.pow(2, retry) + Math.random() * 100`, where `retry` is attempt number (starts from 1). + +The `retries` property can be a `number` or a `function` with `retry` and `error` arguments. The function must return a delay in milliseconds (`0` return value cancels retry). + +By default, it retries *only* on the specified methods, status codes, and on these network errors: +- `ETIMEDOUT`: One of the [timeout](#timeout) limits were reached. +- `ECONNRESET`: Connection was forcibly closed by a peer. +- `EADDRINUSE`: Could not bind to any free port. +- `ECONNREFUSED`: Connection was refused by the server. +- `EPIPE`: The remote side of the stream being written has been closed. +- `ENOTFOUND`: Couldn't resolve the hostname to an IP address. +- `ENETUNREACH`: No internet connection. +- `EAI_AGAIN`: DNS lookup timed out. + +###### followRedirect + +Type: `boolean`
+Default: `true` + +Defines if redirect responses should be followed automatically. + +Note that if a `303` is sent by the server in response to any request type (`POST`, `DELETE`, etc.), Got will automatically request the resource pointed to in the location header via `GET`. This is in accordance with [the spec](https://tools.ietf.org/html/rfc7231#section-6.4.4). + +###### decompress + +Type: `boolean`
+Default: `true` + +Decompress the response automatically. This will set the `accept-encoding` header to `gzip, deflate` unless you set it yourself. + +If this is disabled, a compressed response is returned as a `Buffer`. This may be useful if you want to handle decompression yourself or stream the raw compressed data. + +###### cache + +Type: `Object`
+Default: `false` + +[Cache adapter instance](#cache-adapters) for storing cached data. + +###### request + +Type: `Function`
+Default: `http.request` `https.request` *(depending on the protocol)* + +Custom request function. The main purpose of this is to [support HTTP2 using a wrapper](#experimental-http2-support). + +###### useElectronNet + +Type: `boolean`
+Default: `false` + +When used in Electron, Got will use [`electron.net`](https://electronjs.org/docs/api/net/) instead of the Node.js `http` module. According to the Electron docs, it should be fully compatible, but it's not entirely. See [#443](https://github.com/sindresorhus/got/issues/443) and [#461](https://github.com/sindresorhus/got/issues/461). + +###### throwHttpErrors + +Type: `boolean`
+Default: `true` + +Determines if a `got.HTTPError` is thrown for error responses (non-2xx status codes). + +If this is disabled, requests that encounter an error status code will be resolved with the `response` instead of throwing. This may be useful if you are checking for resource availability and are expecting error responses. + +###### agent + +Same as the [`agent` option](https://nodejs.org/api/http.html#http_http_request_url_options_callback) for `http.request`, but with an extra feature: + +If you require different agents for different protocols, you can pass a map of agents to the `agent` option. This is necessary because a request to one protocol might redirect to another. In such a scenario, Got will switch over to the right protocol agent for you. + +```js +const got = require('got'); +const HttpAgent = require('agentkeepalive'); +const {HttpsAgent} = HttpAgent; + +got('sindresorhus.com', { + agent: { + http: new HttpAgent(), + https: new HttpsAgent() + } +}); +``` + +###### hooks + +Type: `Object` + +Hooks allow modifications during the request lifecycle. Hook functions may be async and are run serially. + +###### hooks.init + +Type: `Function[]`
+Default: `[]` + +Called with plain [request options](#options), right before their normalization. This is especially useful in conjunction with [`got.extend()`](#instances) and [`got.create()`](advanced-creation.md) when the input needs custom handling. + +See the [Request migration guide](migration-guides.md#breaking-changes) for an example. + +**Note**: This hook must be synchronous! + +###### hooks.beforeRequest + +Type: `Function[]`
+Default: `[]` + +Called with [normalized](source/normalize-arguments.js) [request options](#options). Got will make no further changes to the request before it is sent. This is especially useful in conjunction with [`got.extend()`](#instances) and [`got.create()`](advanced-creation.md) when you want to create an API client that, for example, uses HMAC-signing. + +See the [AWS section](#aws) for an example. + +**Note:** If you modify the `body` you will need to modify the `content-length` header too, because it has already been computed and assigned. + +###### hooks.beforeRedirect + +Type: `Function[]`
+Default: `[]` + +Called with [normalized](source/normalize-arguments.js) [request options](#options). Got will make no further changes to the request. This is especially useful when you want to avoid dead sites. Example: + +```js +const got = require('got'); + +got('example.com', { + hooks: { + beforeRedirect: [ + options => { + if (options.hostname === 'deadSite') { + options.hostname = 'fallbackSite'; + } + } + ] + } +}); +``` + +###### hooks.beforeRetry + +Type: `Function[]`
+Default: `[]` + +Called with [normalized](source/normalize-arguments.js) [request options](#options), the error and the retry count. Got will make no further changes to the request. This is especially useful when some extra work is required before the next try. Example: + +```js +const got = require('got'); + +got('example.com', { + hooks: { + beforeRetry: [ + (options, error, retryCount) => { + if (error.statusCode === 413) { // Payload too large + options.body = getNewBody(); + } + } + ] + } +}); +``` + +###### hooks.afterResponse + +Type: `Function[]`
+Default: `[]` + +Called with [response object](#response) and a retry function. + +Each function should return the response. This is especially useful when you want to refresh an access token. Example: + +```js +const got = require('got'); + +const instance = got.extend({ + hooks: { + afterResponse: [ + (response, retryWithMergedOptions) => { + if (response.statusCode === 401) { // Unauthorized + const updatedOptions = { + headers: { + token: getNewToken() // Refresh the access token + } + }; + + // Save for further requests + instance.defaults.options = got.mergeOptions(instance.defaults.options, updatedOptions); + + // Make a new retry + return retryWithMergedOptions(updatedOptions); + } + + // No changes otherwise + return response; + } + ] + }, + mutableDefaults: true +}); +``` + +###### hooks.beforeError + +Type: `Function[]`
+Default: `[]` + +Called with an `Error` instance. The error is passed to the hook right before it's thrown. This is especially useful when you want to have more detailed errors. + +**Note**: Errors thrown while normalizing input options are thrown directly and not part of this hook. + +```js +const got = require('got'); + +got('api.github.com/some-endpoint', { + hooks: { + onError: [ + error => { + const {response} = error; + if (response && response.body) { + error.name = 'GitHubError'; + error.message = `${response.body.message} (${error.statusCode})`; + } + + return error; + } + ] + } +}); +``` + +#### Response + +The response object will typically be a [Node.js HTTP response stream](https://nodejs.org/api/http.html#http_class_http_incomingmessage), however, if returned from the cache it will be a [response-like object](https://github.com/lukechilds/responselike) which behaves in the same way. + +##### request + +Type: `Object` + +**Note:** This is not a [http.ClientRequest](https://nodejs.org/api/http.html#http_class_http_clientrequest). + +- `gotOptions` - The options that were set on this request. + +##### body + +Type: `string` `Object` *(depending on `options.json`)* + +The result of the request. + +##### url + +Type: `string` + +The request URL or the final URL after redirects. + +##### requestUrl + +Type: `string` + +The original request URL. + +##### timings + +Type: `Object` + +The object contains the following properties: + +- `start` - Time when the request started. +- `socket` - Time when a socket was assigned to the request. +- `lookup` - Time when the DNS lookup finished. +- `connect` - Time when the socket successfully connected. +- `upload` - Time when the request finished uploading. +- `response` - Time when the request fired the `response` event. +- `end` - Time when the response fired the `end` event. +- `error` - Time when the request fired the `error` event. +- `phases` + - `wait` - `timings.socket - timings.start` + - `dns` - `timings.lookup - timings.socket` + - `tcp` - `timings.connect - timings.lookup` + - `request` - `timings.upload - timings.connect` + - `firstByte` - `timings.response - timings.upload` + - `download` - `timings.end - timings.response` + - `total` - `timings.end - timings.start` or `timings.error - timings.start` + +**Note:** The time is a `number` representing the milliseconds elapsed since the UNIX epoch. + +##### fromCache + +Type: `boolean` + +Whether the response was retrieved from the cache. + +##### redirectUrls + +Type: `Array` + +The redirect URLs. + +##### retryCount + +Type: `number` + +The number of times the request was retried. + +#### Streams + +**Note:** Progress events, redirect events and request/response events can also be used with promises. + +#### got.stream(url, [options]) + +Sets `options.stream` to `true`. + +Returns a [duplex stream](https://nodejs.org/api/stream.html#stream_class_stream_duplex) with additional events: + +##### .on('request', request) + +`request` event to get the request object of the request. + +**Tip:** You can use `request` event to abort request: + +```js +got.stream('github.com') + .on('request', request => setTimeout(() => request.abort(), 50)); +``` + +##### .on('response', response) + +The `response` event to get the response object of the final request. + +##### .on('redirect', response, nextOptions) + +The `redirect` event to get the response object of a redirect. The second argument is options for the next request to the redirect location. 
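The `redirect` event above has no accompanying snippet; a minimal, hedged sketch (the URL is hypothetical) that logs each hop using the redirect response's `location` header:

```js
const got = require('got');

got.stream('https://example.com')
	.on('redirect', (response, nextOptions) => {
		// `response` is the redirect response; `nextOptions` is what the follow-up request will use
		console.log(`Redirecting to ${response.headers.location}`);
	})
	.resume(); // Drain the stream so the request completes
```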
+ +##### .on('uploadProgress', progress) +##### .on('downloadProgress', progress) + +Progress events for uploading (sending a request) and downloading (receiving a response). The `progress` argument is an object like: + +```js +{ + percent: 0.1, + transferred: 1024, + total: 10240 +} +``` + +If it's not possible to retrieve the body size (can happen when streaming), `total` will be `null`. + +```js +(async () => { + const response = await got('sindresorhus.com') + .on('downloadProgress', progress => { + // Report download progress + }) + .on('uploadProgress', progress => { + // Report upload progress + }); + + console.log(response); +})(); +``` + +##### .on('error', error, body, response) + +The `error` event emitted in case of a protocol error (like `ENOTFOUND` etc.) or status error (4xx or 5xx). The second argument is the body of the server response in case of status error. The third argument is a response object. + +#### got.get(url, [options]) +#### got.post(url, [options]) +#### got.put(url, [options]) +#### got.patch(url, [options]) +#### got.head(url, [options]) +#### got.delete(url, [options]) + +Sets `options.method` to the method name and makes a request. + +### Instances + +#### got.extend([options]) + +Configure a new `got` instance with default `options`. The `options` are merged with the parent instance's `defaults.options` using [`got.mergeOptions`](#gotmergeoptionsparentoptions-newoptions). You can access the resolved options with the `.defaults` property on the instance. + +```js +const client = got.extend({ + baseUrl: 'https://example.com', + headers: { + 'x-unicorn': 'rainbow' + } +}); + +client.get('/demo'); + +/* HTTP Request => + * GET /demo HTTP/1.1 + * Host: example.com + * x-unicorn: rainbow + */ +``` + +```js +(async () => { + const client = got.extend({ + baseUrl: 'httpbin.org', + headers: { + 'x-foo': 'bar' + } + }); + const {headers} = (await client.get('/headers', {json: true})).body; + //=> headers['x-foo'] === 'bar' + + const jsonClient = client.extend({ + json: true, + headers: { + 'x-baz': 'qux' + } + }); + const {headers: headers2} = (await jsonClient.get('/headers')).body; + //=> headers2['x-foo'] === 'bar' + //=> headers2['x-baz'] === 'qux' +})(); +``` + +**Tip:** Need more control over the behavior of Got? Check out the [`got.create()`](advanced-creation.md). + +#### got.mergeOptions(parentOptions, newOptions) + +Extends parent options. Avoid using [object spread](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Operators/Spread_syntax#Spread_in_object_literals) as it doesn't work recursively: + +```js +const a = {headers: {cat: 'meow', wolf: ['bark', 'wrrr']}}; +const b = {headers: {cow: 'moo', wolf: ['auuu']}}; + +{...a, ...b} // => {headers: {cow: 'moo', wolf: ['auuu']}} +got.mergeOptions(a, b) // => {headers: {cat: 'meow', cow: 'moo', wolf: ['auuu']}} +``` + +Options are deeply merged to a new object. The value of each key is determined as follows: + +- If the new property is set to `undefined`, it keeps the old one. +- If the parent property is an instance of `URL` and the new value is a `string` or `URL`, a new URL instance is created: [`new URL(new, parent)`](https://developer.mozilla.org/en-US/docs/Web/API/URL/URL#Syntax). +- If the new property is a plain `Object`: + - If the parent property is a plain `Object` too, both values are merged recursively into a new `Object`. + - Otherwise, only the new value is deeply cloned. +- If the new property is an `Array`, it overwrites the old one with a deep clone of the new property. 
+- Otherwise, the new value is assigned to the key. + +#### got.defaults + +Type: `Object` + +The default Got options. + +## Errors + +Each error contains `host`, `hostname`, `method`, `path`, `protocol`, `url` and `gotOptions` properties to make debugging easier. + +In Promise mode, the `response` is attached to the error. + +#### got.CacheError + +When a cache method fails, for example, if the database goes down or there's a filesystem error. + +#### got.RequestError + +When a request fails. Contains a `code` property with error class code, like `ECONNREFUSED`. + +#### got.ReadError + +When reading from response stream fails. + +#### got.ParseError + +When `json` option is enabled, server response code is 2xx, and `JSON.parse` fails. Includes `statusCode` and `statusMessage` properties. + +#### got.HTTPError + +When the server response code is not 2xx. Includes `body`, `statusCode`, `statusMessage`, and `redirectUrls` properties. + +#### got.MaxRedirectsError + +When the server redirects you more than ten times. Includes a `statusCode`, `statusMessage`, and `redirectUrls` property which is an array of the URLs Got was redirected to before giving up. + +#### got.UnsupportedProtocolError + +When given an unsupported protocol. + +#### got.CancelError + +When the request is aborted with `.cancel()`. + +#### got.TimeoutError + +When the request is aborted due to a [timeout](#timeout). Includes an `event` property. + +## Aborting the request + +The promise returned by Got has a [`.cancel()`](https://github.com/sindresorhus/p-cancelable) method which when called, aborts the request. + +```js +(async () => { + const request = got(url, options); + + // … + + // In another part of the code + if (something) { + request.cancel(); + } + + // … + + try { + await request; + } catch (error) { + if (request.isCanceled) { // Or `error instanceof got.CancelError` + // Handle cancelation + } + + // Handle other errors + } +})(); +``` + + +## Cache + +Got implements [RFC 7234](http://httpwg.org/specs/rfc7234.html) compliant HTTP caching which works out of the box in-memory and is easily pluggable with a wide range of storage adapters. Fresh cache entries are served directly from the cache, and stale cache entries are revalidated with `If-None-Match`/`If-Modified-Since` headers. You can read more about the underlying cache behavior in the [`cacheable-request` documentation](https://github.com/lukechilds/cacheable-request). + +You can use the JavaScript `Map` type as an in-memory cache: + +```js +const got = require('got'); +const map = new Map(); + +(async () => { + let response = await got('sindresorhus.com', {cache: map}); + console.log(response.fromCache); + //=> false + + response = await got('sindresorhus.com', {cache: map}); + console.log(response.fromCache); + //=> true +})(); +``` + +Got uses [Keyv](https://github.com/lukechilds/keyv) internally to support a wide range of storage adapters. For something more scalable you could use an [official Keyv storage adapter](https://github.com/lukechilds/keyv#official-storage-adapters): + +``` +$ npm install @keyv/redis +``` + +```js +const got = require('got'); +const KeyvRedis = require('@keyv/redis'); + +const redis = new KeyvRedis('redis://user:pass@localhost:6379'); + +got('sindresorhus.com', {cache: redis}); +``` + +Got supports anything that follows the Map API, so it's easy to write your own storage adapter or use a third-party solution. 
+ +For example, the following are all valid storage adapters: + +```js +const storageAdapter = new Map(); +// Or +const storageAdapter = require('./my-storage-adapter'); +// Or +const QuickLRU = require('quick-lru'); +const storageAdapter = new QuickLRU({maxSize: 1000}); + +got('sindresorhus.com', {cache: storageAdapter}); +``` + +View the [Keyv docs](https://github.com/lukechilds/keyv) for more information on how to use storage adapters. + + +## Proxies + +You can use the [`tunnel`](https://github.com/koichik/node-tunnel) package with the `agent` option to work with proxies: + +```js +const got = require('got'); +const tunnel = require('tunnel'); + +got('sindresorhus.com', { + agent: tunnel.httpOverHttp({ + proxy: { + host: 'localhost' + } + }) +}); +``` + +Check out [`global-tunnel`](https://github.com/np-maintain/global-tunnel) if you want to configure proxy support for all HTTP/HTTPS traffic in your app. + + +## Cookies + +You can use the [`tough-cookie`](https://github.com/salesforce/tough-cookie) package: + +```js +const got = require('got'); +const {CookieJar} = require('tough-cookie'); + +const cookieJar = new CookieJar(); +cookieJar.setCookie('foo=bar', 'https://www.google.com'); + +got('google.com', {cookieJar}); +``` + + +## Form data + +You can use the [`form-data`](https://github.com/form-data/form-data) package to create POST request with form data: + +```js +const fs = require('fs'); +const got = require('got'); +const FormData = require('form-data'); +const form = new FormData(); + +form.append('my_file', fs.createReadStream('/foo/bar.jpg')); + +got.post('google.com', { + body: form +}); +``` + + +## OAuth + +You can use the [`oauth-1.0a`](https://github.com/ddo/oauth-1.0a) package to create a signed OAuth request: + +```js +const got = require('got'); +const crypto = require('crypto'); +const OAuth = require('oauth-1.0a'); + +const oauth = OAuth({ + consumer: { + key: process.env.CONSUMER_KEY, + secret: process.env.CONSUMER_SECRET + }, + signature_method: 'HMAC-SHA1', + hash_function: (baseString, key) => crypto.createHmac('sha1', key).update(baseString).digest('base64') +}); + +const token = { + key: process.env.ACCESS_TOKEN, + secret: process.env.ACCESS_TOKEN_SECRET +}; + +const url = 'https://api.twitter.com/1.1/statuses/home_timeline.json'; + +got(url, { + headers: oauth.toHeader(oauth.authorize({url, method: 'GET'}, token)), + json: true +}); +``` + + +## Unix Domain Sockets + +Requests can also be sent via [unix domain sockets](http://serverfault.com/questions/124517/whats-the-difference-between-unix-socket-and-tcp-ip-socket). Use the following URL scheme: `PROTOCOL://unix:SOCKET:PATH`. + +- `PROTOCOL` - `http` or `https` *(optional)* +- `SOCKET` - Absolute path to a unix domain socket, for example: `/var/run/docker.sock` +- `PATH` - Request path, for example: `/v2/keys` + +```js +got('http://unix:/var/run/docker.sock:/containers/json'); + +// Or without protocol (HTTP by default) +got('unix:/var/run/docker.sock:/containers/json'); +``` + + +## AWS + +Requests to AWS services need to have their headers signed. This can be accomplished by using the [`aws4`](https://www.npmjs.com/package/aws4) package. This is an example for querying an ["API Gateway"](https://docs.aws.amazon.com/apigateway/api-reference/signing-requests/) with a signed request. 
+ +```js +const AWS = require('aws-sdk'); +const aws4 = require('aws4'); +const got = require('got'); + +const chain = new AWS.CredentialProviderChain(); + +// Create a Got instance to use relative paths and signed requests +const awsClient = got.extend({ + baseUrl: 'https://.execute-api..amazonaws.com//', + hooks: { + beforeRequest: [ + async options => { + const credentials = await chain.resolvePromise(); + aws4.sign(options, credentials); + } + ] + } +}); + +const response = await awsClient('endpoint/path', { + // Request-specific options +}); +``` + + +## Testing + +You can test your requests by using the [`nock`](https://github.com/node-nock/nock) package to mock an endpoint: + +```js +const got = require('got'); +const nock = require('nock'); + +nock('https://sindresorhus.com') + .get('/') + .reply(200, 'Hello world!'); + +(async () => { + const response = await got('sindresorhus.com'); + console.log(response.body); + //=> 'Hello world!' +})(); +``` + +If you need real integration tests you can use [`create-test-server`](https://github.com/lukechilds/create-test-server): + +```js +const got = require('got'); +const createTestServer = require('create-test-server'); + +(async () => { + const server = await createTestServer(); + server.get('/', 'Hello world!'); + + const response = await got(server.url); + console.log(response.body); + //=> 'Hello world!' + + await server.close(); +})(); +``` + + +## Tips + +### User Agent + +It's a good idea to set the `'user-agent'` header so the provider can more easily see how their resource is used. By default, it's the URL to this repo. You can omit this header by setting it to `null`. + +```js +const got = require('got'); +const pkg = require('./package.json'); + +got('sindresorhus.com', { + headers: { + 'user-agent': `my-package/${pkg.version} (https://github.com/username/my-package)` + } +}); + +got('sindresorhus.com', { + headers: { + 'user-agent': null + } +}); +``` + +### 304 Responses + +Bear in mind; if you send an `if-modified-since` header and receive a `304 Not Modified` response, the body will be empty. It's your responsibility to cache and retrieve the body contents. + +### Custom endpoints + +Use `got.extend()` to make it nicer to work with REST APIs. Especially if you use the `baseUrl` option. + +**Note:** Not to be confused with [`got.create()`](advanced-creation.md), which has no defaults. + +```js +const got = require('got'); +const pkg = require('./package.json'); + +const custom = got.extend({ + baseUrl: 'example.com', + json: true, + headers: { + 'user-agent': `my-package/${pkg.version} (https://github.com/username/my-package)` + } +}); + +// Use `custom` exactly how you use `got` +(async () => { + const list = await custom('/v1/users/list'); +})(); +``` + +**Tip:** Need to merge some instances into a single one? Check out [`got.mergeInstances()`](advanced-creation.md#merging-instances). 
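+
+For illustration, a minimal sketch of merging two instances into one; the instance names, base URL and header below are hypothetical:
+
+```js
+const clientA = got.extend({baseUrl: 'https://api.example.com'});
+const clientB = got.extend({headers: {'x-api-key': 'placeholder'}});
+
+// The merged instance combines the defaults of both clients
+const merged = got.mergeInstances(clientA, clientB);
+
+merged.get('/status');
+```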
+ +### Experimental HTTP2 support + +Got provides an experimental support for HTTP2 using the [`http2-wrapper`](https://github.com/szmarczak/http2-wrapper) package: + +```js +const got = require('got'); +const {request} = require('http2-wrapper'); + +const h2got = got.extend({request}); + +(async () => { + const {body} = await h2got('https://nghttp2.org/httpbin/headers'); + console.log(body); +})(); +``` + +## Comparison + +| | `got` | [`request`][r0] | [`node-fetch`][n0] | [`axios`][a0] | [`superagent`][s0] | +|-----------------------|:--------------:|:---------------:|:------------------:|:---------------:|:--------------------:| +| HTTP/2 support | ❔ | ✖ | ✖ | ✖ | ✔\*\* | +| Browser support | ✖ | ✖ | ✔\* | ✔ | ✔ | +| Electron support | ✔ | ✖ | ✖ | ✖ | ✖ | +| Promise API | ✔ | ✔ | ✔ | ✔ | ✔ | +| Stream API | ✔ | ✔ | Node.js only | ✖ | ✔ | +| Request cancelation | ✔ | ✖ | ✔ | ✔ | ✔ | +| RFC compliant caching | ✔ | ✖ | ✖ | ✖ | ✖ | +| Cookies (out-of-box) | ✔ | ✔ | ✖ | ✖ | ✖ | +| Follows redirects | ✔ | ✔ | ✔ | ✔ | ✔ | +| Retries on failure | ✔ | ✖ | ✖ | ✖ | ✔ | +| Progress events | ✔ | ✖ | ✖ | Browser only | ✔ | +| Handles gzip/deflate | ✔ | ✔ | ✔ | ✔ | ✔ | +| Advanced timeouts | ✔ | ✖ | ✖ | ✖ | ✖ | +| Timings | ✔ | ✔ | ✖ | ✖ | ✖ | +| Errors with metadata | ✔ | ✖ | ✖ | ✔ | ✖ | +| JSON mode | ✔ | ✔ | ✖ | ✔ | ✔ | +| Custom defaults | ✔ | ✔ | ✖ | ✔ | ✖ | +| Composable | ✔ | ✖ | ✖ | ✖ | ✔ | +| Hooks | ✔ | ✖ | ✖ | ✔ | ✖ | +| Issues open | [![][gio]][g1] | [![][rio]][r1] | [![][nio]][n1] | [![][aio]][a1] | [![][sio]][s1] | +| Issues closed | [![][gic]][g2] | [![][ric]][r2] | [![][nic]][n2] | [![][aic]][a2] | [![][sic]][s2] | +| Downloads | [![][gd]][g3] | [![][rd]][r3] | [![][nd]][n3] | [![][ad]][a3] | [![][sd]][s3] | +| Coverage | [![][gc]][g4] | [![][rc]][r4] | [![][nc]][n4] | [![][ac]][a4] | unknown | +| Build | [![][gb]][g5] | [![][rb]][r5] | [![][nb]][n5] | [![][ab]][a5] | [![][sb]][s5] | +| Bugs | [![][gbg]][g6] | [![][rbg]][r6] | [![][nbg]][n6] | [![][abg]][a6] | [![][sbg]][s6] | +| Dependents | [![][gdp]][g7] | [![][rdp]][r7] | [![][ndp]][n7] | [![][adp]][a7] | [![][sdp]][s7] | +| Install size | [![][gis]][g8] | [![][ris]][r8] | [![][nis]][n8] | [![][ais]][a8] | [![][sis]][s8] | + +\* It's almost API compatible with the browser `fetch` API.
+\*\* Need to switch the protocol manually.
+❔ Experimental support. + + +[r0]: https://github.com/request/request +[n0]: https://github.com/bitinn/node-fetch +[a0]: https://github.com/axios/axios +[s0]: https://github.com/visionmedia/superagent + + +[gio]: https://badgen.net/github/open-issues/sindresorhus/got?label +[rio]: https://badgen.net/github/open-issues/request/request?label +[nio]: https://badgen.net/github/open-issues/bitinn/node-fetch?label +[aio]: https://badgen.net/github/open-issues/axios/axios?label +[sio]: https://badgen.net/github/open-issues/visionmedia/superagent?label + +[g1]: https://github.com/sindresorhus/got/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc +[r1]: https://github.com/request/request/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc +[n1]: https://github.com/bitinn/node-fetch/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc +[a1]: https://github.com/axios/axios/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc +[s1]: https://github.com/visionmedia/superagent/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc + + +[gic]: https://badgen.net/github/closed-issues/sindresorhus/got?label +[ric]: https://badgen.net/github/closed-issues/request/request?label +[nic]: https://badgen.net/github/closed-issues/bitinn/node-fetch?label +[aic]: https://badgen.net/github/closed-issues/axios/axios?label +[sic]: https://badgen.net/github/closed-issues/visionmedia/superagent?label + +[g2]: https://github.com/sindresorhus/got/issues?q=is%3Aissue+is%3Aclosed+sort%3Aupdated-desc +[r2]: https://github.com/request/request/issues?q=is%3Aissue+is%3Aclosed+sort%3Aupdated-desc +[n2]: https://github.com/bitinn/node-fetch/issues?q=is%3Aissue+is%3Aclosed+sort%3Aupdated-desc +[a2]: https://github.com/axios/axios/issues?q=is%3Aissue+is%3Aclosed+sort%3Aupdated-desc +[s2]: https://github.com/visionmedia/superagent/issues?q=is%3Aissue+is%3Aclosed+sort%3Aupdated-desc + + +[gd]: https://badgen.net/npm/dm/got?label +[rd]: https://badgen.net/npm/dm/request?label +[nd]: https://badgen.net/npm/dm/node-fetch?label +[ad]: https://badgen.net/npm/dm/axios?label +[sd]: https://badgen.net/npm/dm/superagent?label + +[g3]: https://www.npmjs.com/package/got +[r3]: https://www.npmjs.com/package/request +[n3]: https://www.npmjs.com/package/node-fetch +[a3]: https://www.npmjs.com/package/axios +[s3]: https://www.npmjs.com/package/superagent + + +[gc]: https://badgen.net/coveralls/c/github/sindresorhus/got?label +[rc]: https://badgen.net/coveralls/c/github/request/request?label +[nc]: https://badgen.net/coveralls/c/github/bitinn/node-fetch?label +[ac]: https://badgen.net/coveralls/c/github/mzabriskie/axios?label + +[g4]: https://coveralls.io/github/sindresorhus/got +[r4]: https://coveralls.io/github/request/request +[n4]: https://coveralls.io/github/bitinn/node-fetch +[a4]: https://coveralls.io/github/mzabriskie/axios + + +[gb]: https://badgen.net/travis/sindresorhus/got?label +[rb]: https://badgen.net/travis/request/request?label +[nb]: https://badgen.net/travis/bitinn/node-fetch?label +[ab]: https://badgen.net/travis/axios/axios?label +[sb]: https://badgen.net/travis/visionmedia/superagent?label + +[g5]: https://travis-ci.org/sindresorhus/got +[r5]: https://travis-ci.org/request/request +[n5]: https://travis-ci.org/bitinn/node-fetch +[a5]: https://travis-ci.org/axios/axios +[s5]: https://travis-ci.org/visionmedia/superagent + + +[gbg]: https://badgen.net/github/label-issues/sindresorhus/got/bug/open?label +[rbg]: https://badgen.net/github/label-issues/request/request/Needs%20investigation/open?label +[nbg]: 
https://badgen.net/github/label-issues/bitinn/node-fetch/bug/open?label +[abg]: https://badgen.net/github/label-issues/axios/axios/bug/open?label +[sbg]: https://badgen.net/github/label-issues/visionmedia/superagent/Bug/open?label + +[g6]: https://github.com/sindresorhus/got/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc+label%3Abug +[r6]: https://github.com/request/request/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc+label%3A"Needs+investigation" +[n6]: https://github.com/bitinn/node-fetch/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc+label%3Abug +[a6]: https://github.com/axios/axios/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc+label%3Abug +[s6]: https://github.com/visionmedia/superagent/issues?q=is%3Aissue+is%3Aopen+sort%3Aupdated-desc+label%3ABug + + +[gdp]: https://badgen.net/npm/dependents/got?label +[rdp]: https://badgen.net/npm/dependents/request?label +[ndp]: https://badgen.net/npm/dependents/node-fetch?label +[adp]: https://badgen.net/npm/dependents/axios?label +[sdp]: https://badgen.net/npm/dependents/superagent?label + +[g7]: https://www.npmjs.com/package/got?activeTab=dependents +[r7]: https://www.npmjs.com/package/request?activeTab=dependents +[n7]: https://www.npmjs.com/package/node-fetch?activeTab=dependents +[a7]: https://www.npmjs.com/package/axios?activeTab=dependents +[s7]: https://www.npmjs.com/package/visionmedia?activeTab=dependents + + +[gis]: https://badgen.net/packagephobia/install/got?label +[ris]: https://badgen.net/packagephobia/install/request?label +[nis]: https://badgen.net/packagephobia/install/node-fetch?label +[ais]: https://badgen.net/packagephobia/install/axios?label +[sis]: https://badgen.net/packagephobia/install/superagent?label + +[g8]: https://packagephobia.now.sh/result?p=got +[r8]: https://packagephobia.now.sh/result?p=request +[n8]: https://packagephobia.now.sh/result?p=node-fetch +[a8]: https://packagephobia.now.sh/result?p=axios +[s8]: https://packagephobia.now.sh/result?p=superagent + + +## Related + +- [gh-got](https://github.com/sindresorhus/gh-got) - Got convenience wrapper to interact with the GitHub API +- [gl-got](https://github.com/singapore/gl-got) - Got convenience wrapper to interact with the GitLab API +- [travis-got](https://github.com/samverschueren/travis-got) - Got convenience wrapper to interact with the Travis API +- [graphql-got](https://github.com/kevva/graphql-got) - Got convenience wrapper to interact with GraphQL +- [GotQL](https://github.com/khaosdoctor/gotql) - Got convenience wrapper to interact with GraphQL using JSON-parsed queries instead of strings + + +## Maintainers + +[![Sindre Sorhus](https://github.com/sindresorhus.png?size=100)](https://sindresorhus.com) | [![Vsevolod Strukchinsky](https://github.com/floatdrop.png?size=100)](https://github.com/floatdrop) | [![Alexander Tesfamichael](https://github.com/AlexTes.png?size=100)](https://github.com/AlexTes) | [![Luke Childs](https://github.com/lukechilds.png?size=100)](https://github.com/lukechilds) | [![Szymon Marczak](https://github.com/szmarczak.png?size=100)](https://github.com/szmarczak) | [![Brandon Smith](https://github.com/brandon93s.png?size=100)](https://github.com/brandon93s) +---|---|---|---|---|--- +[Sindre Sorhus](https://sindresorhus.com) | [Vsevolod Strukchinsky](https://github.com/floatdrop) | [Alexander Tesfamichael](https://alextes.me) | [Luke Childs](https://github.com/lukechilds) | [Szymon Marczak](https://github.com/szmarczak) | [Brandon Smith](https://github.com/brandon93s) + + +## License + +MIT diff --git 
a/node_modules/got/source/as-promise.js b/node_modules/got/source/as-promise.js new file mode 100644 index 000000000..c5023253f --- /dev/null +++ b/node_modules/got/source/as-promise.js @@ -0,0 +1,108 @@ +'use strict'; +const EventEmitter = require('events'); +const getStream = require('get-stream'); +const is = require('@sindresorhus/is'); +const PCancelable = require('p-cancelable'); +const requestAsEventEmitter = require('./request-as-event-emitter'); +const {HTTPError, ParseError, ReadError} = require('./errors'); +const {options: mergeOptions} = require('./merge'); +const {reNormalize} = require('./normalize-arguments'); + +const asPromise = options => { + const proxy = new EventEmitter(); + + const promise = new PCancelable((resolve, reject, onCancel) => { + const emitter = requestAsEventEmitter(options); + + onCancel(emitter.abort); + + emitter.on('response', async response => { + proxy.emit('response', response); + + const stream = is.null(options.encoding) ? getStream.buffer(response) : getStream(response, options); + + let data; + try { + data = await stream; + } catch (error) { + reject(new ReadError(error, options)); + return; + } + + const limitStatusCode = options.followRedirect ? 299 : 399; + + response.body = data; + + try { + for (const [index, hook] of Object.entries(options.hooks.afterResponse)) { + // eslint-disable-next-line no-await-in-loop + response = await hook(response, updatedOptions => { + updatedOptions = reNormalize(mergeOptions(options, { + ...updatedOptions, + retry: 0, + throwHttpErrors: false + })); + + // Remove any further hooks for that request, because we we'll call them anyway. + // The loop continues. We don't want duplicates (asPromise recursion). + updatedOptions.hooks.afterResponse = options.hooks.afterResponse.slice(0, index); + + return asPromise(updatedOptions); + }); + } + } catch (error) { + reject(error); + return; + } + + const {statusCode} = response; + + if (options.json && response.body) { + try { + response.body = JSON.parse(response.body); + } catch (error) { + if (statusCode >= 200 && statusCode < 300) { + const parseError = new ParseError(error, statusCode, options, data); + Object.defineProperty(parseError, 'response', {value: response}); + reject(parseError); + return; + } + } + } + + if (statusCode !== 304 && (statusCode < 200 || statusCode > limitStatusCode)) { + const error = new HTTPError(response, options); + Object.defineProperty(error, 'response', {value: response}); + if (emitter.retry(error) === false) { + if (options.throwHttpErrors) { + reject(error); + return; + } + + resolve(response); + } + + return; + } + + resolve(response); + }); + + emitter.once('error', reject); + [ + 'request', + 'redirect', + 'uploadProgress', + 'downloadProgress' + ].forEach(event => emitter.on(event, (...args) => proxy.emit(event, ...args))); + }); + + promise.on = (name, fn) => { + proxy.on(name, fn); + return promise; + }; + + return promise; +}; + +module.exports = asPromise; diff --git a/node_modules/got/source/as-stream.js b/node_modules/got/source/as-stream.js new file mode 100644 index 000000000..98c5342df --- /dev/null +++ b/node_modules/got/source/as-stream.js @@ -0,0 +1,93 @@ +'use strict'; +const {PassThrough} = require('stream'); +const duplexer3 = require('duplexer3'); +const requestAsEventEmitter = require('./request-as-event-emitter'); +const {HTTPError, ReadError} = require('./errors'); + +module.exports = options => { + const input = new PassThrough(); + const output = new PassThrough(); + const proxy = duplexer3(input, 
output); + const piped = new Set(); + let isFinished = false; + + options.retry.retries = () => 0; + + if (options.body) { + proxy.write = () => { + throw new Error('Got\'s stream is not writable when the `body` option is used'); + }; + } + + const emitter = requestAsEventEmitter(options, input); + + // Cancels the request + proxy._destroy = emitter.abort; + + emitter.on('response', response => { + const {statusCode} = response; + + response.on('error', error => { + proxy.emit('error', new ReadError(error, options)); + }); + + if (options.throwHttpErrors && statusCode !== 304 && (statusCode < 200 || statusCode > 299)) { + proxy.emit('error', new HTTPError(response, options), null, response); + return; + } + + isFinished = true; + + response.pipe(output); + + for (const destination of piped) { + if (destination.headersSent) { + continue; + } + + for (const [key, value] of Object.entries(response.headers)) { + // Got gives *decompressed* data. Overriding `content-encoding` header would result in an error. + // It's not possible to decompress already decompressed data, is it? + const allowed = options.decompress ? key !== 'content-encoding' : true; + if (allowed) { + destination.setHeader(key, value); + } + } + + destination.statusCode = response.statusCode; + } + + proxy.emit('response', response); + }); + + [ + 'error', + 'request', + 'redirect', + 'uploadProgress', + 'downloadProgress' + ].forEach(event => emitter.on(event, (...args) => proxy.emit(event, ...args))); + + const pipe = proxy.pipe.bind(proxy); + const unpipe = proxy.unpipe.bind(proxy); + proxy.pipe = (destination, options) => { + if (isFinished) { + throw new Error('Failed to pipe. The response has been emitted already.'); + } + + const result = pipe(destination, options); + + if (Reflect.has(destination, 'setHeader')) { + piped.add(destination); + } + + return result; + }; + + proxy.unpipe = stream => { + piped.delete(stream); + return unpipe(stream); + }; + + return proxy; +}; diff --git a/node_modules/got/source/create.js b/node_modules/got/source/create.js new file mode 100644 index 000000000..b78c51f09 --- /dev/null +++ b/node_modules/got/source/create.js @@ -0,0 +1,79 @@ +'use strict'; +const errors = require('./errors'); +const asStream = require('./as-stream'); +const asPromise = require('./as-promise'); +const normalizeArguments = require('./normalize-arguments'); +const merge = require('./merge'); +const deepFreeze = require('./utils/deep-freeze'); + +const getPromiseOrStream = options => options.stream ? asStream(options) : asPromise(options); + +const aliases = [ + 'get', + 'post', + 'put', + 'patch', + 'head', + 'delete' +]; + +const create = defaults => { + defaults = merge({}, defaults); + normalizeArguments.preNormalize(defaults.options); + + if (!defaults.handler) { + // This can't be getPromiseOrStream, because when merging + // the chain would stop at this point and no further handlers would be called. 
+ defaults.handler = (options, next) => next(options); + } + + function got(url, options) { + try { + return defaults.handler(normalizeArguments(url, options, defaults), getPromiseOrStream); + } catch (error) { + if (options && options.stream) { + throw error; + } else { + return Promise.reject(error); + } + } + } + + got.create = create; + got.extend = options => { + let mutableDefaults; + if (options && Reflect.has(options, 'mutableDefaults')) { + mutableDefaults = options.mutableDefaults; + delete options.mutableDefaults; + } else { + mutableDefaults = defaults.mutableDefaults; + } + + return create({ + options: merge.options(defaults.options, options), + handler: defaults.handler, + mutableDefaults + }); + }; + + got.mergeInstances = (...args) => create(merge.instances(args)); + + got.stream = (url, options) => got(url, {...options, stream: true}); + + for (const method of aliases) { + got[method] = (url, options) => got(url, {...options, method}); + got.stream[method] = (url, options) => got.stream(url, {...options, method}); + } + + Object.assign(got, {...errors, mergeOptions: merge.options}); + Object.defineProperty(got, 'defaults', { + value: defaults.mutableDefaults ? defaults : deepFreeze(defaults), + writable: defaults.mutableDefaults, + configurable: defaults.mutableDefaults, + enumerable: true + }); + + return got; +}; + +module.exports = create; diff --git a/node_modules/got/source/errors.js b/node_modules/got/source/errors.js new file mode 100644 index 000000000..b6cbadc3c --- /dev/null +++ b/node_modules/got/source/errors.js @@ -0,0 +1,107 @@ +'use strict'; +const urlLib = require('url'); +const http = require('http'); +const PCancelable = require('p-cancelable'); +const is = require('@sindresorhus/is'); + +class GotError extends Error { + constructor(message, error, options) { + super(message); + Error.captureStackTrace(this, this.constructor); + this.name = 'GotError'; + + if (!is.undefined(error.code)) { + this.code = error.code; + } + + Object.assign(this, { + host: options.host, + hostname: options.hostname, + method: options.method, + path: options.path, + socketPath: options.socketPath, + protocol: options.protocol, + url: options.href, + gotOptions: options + }); + } +} + +module.exports.GotError = GotError; + +module.exports.CacheError = class extends GotError { + constructor(error, options) { + super(error.message, error, options); + this.name = 'CacheError'; + } +}; + +module.exports.RequestError = class extends GotError { + constructor(error, options) { + super(error.message, error, options); + this.name = 'RequestError'; + } +}; + +module.exports.ReadError = class extends GotError { + constructor(error, options) { + super(error.message, error, options); + this.name = 'ReadError'; + } +}; + +module.exports.ParseError = class extends GotError { + constructor(error, statusCode, options, data) { + super(`${error.message} in "${urlLib.format(options)}": \n${data.slice(0, 77)}...`, error, options); + this.name = 'ParseError'; + this.statusCode = statusCode; + this.statusMessage = http.STATUS_CODES[this.statusCode]; + } +}; + +module.exports.HTTPError = class extends GotError { + constructor(response, options) { + const {statusCode} = response; + let {statusMessage} = response; + + if (statusMessage) { + statusMessage = statusMessage.replace(/\r?\n/g, ' ').trim(); + } else { + statusMessage = http.STATUS_CODES[statusCode]; + } + + super(`Response code ${statusCode} (${statusMessage})`, {}, options); + this.name = 'HTTPError'; + this.statusCode = statusCode; + 
this.statusMessage = statusMessage; + this.headers = response.headers; + this.body = response.body; + } +}; + +module.exports.MaxRedirectsError = class extends GotError { + constructor(statusCode, redirectUrls, options) { + super('Redirected 10 times. Aborting.', {}, options); + this.name = 'MaxRedirectsError'; + this.statusCode = statusCode; + this.statusMessage = http.STATUS_CODES[this.statusCode]; + this.redirectUrls = redirectUrls; + } +}; + +module.exports.UnsupportedProtocolError = class extends GotError { + constructor(options) { + super(`Unsupported protocol "${options.protocol}"`, {}, options); + this.name = 'UnsupportedProtocolError'; + } +}; + +module.exports.TimeoutError = class extends GotError { + constructor(error, options) { + super(error.message, {code: 'ETIMEDOUT'}, options); + this.name = 'TimeoutError'; + this.event = error.event; + } +}; + +module.exports.CancelError = PCancelable.CancelError; diff --git a/node_modules/got/source/get-response.js b/node_modules/got/source/get-response.js new file mode 100644 index 000000000..18453c252 --- /dev/null +++ b/node_modules/got/source/get-response.js @@ -0,0 +1,31 @@ +'use strict'; +const decompressResponse = require('decompress-response'); +const is = require('@sindresorhus/is'); +const mimicResponse = require('mimic-response'); +const progress = require('./progress'); + +module.exports = (response, options, emitter) => { + const downloadBodySize = Number(response.headers['content-length']) || null; + + const progressStream = progress.download(response, emitter, downloadBodySize); + + mimicResponse(response, progressStream); + + const newResponse = options.decompress === true && + is.function(decompressResponse) && + options.method !== 'HEAD' ? decompressResponse(progressStream) : progressStream; + + if (!options.decompress && ['gzip', 'deflate'].includes(response.headers['content-encoding'])) { + options.encoding = null; + } + + emitter.emit('response', newResponse); + + emitter.emit('downloadProgress', { + percent: 0, + transferred: 0, + total: downloadBodySize + }); + + response.pipe(progressStream); +}; diff --git a/node_modules/got/source/index.js b/node_modules/got/source/index.js new file mode 100644 index 000000000..cbf7c3739 --- /dev/null +++ b/node_modules/got/source/index.js @@ -0,0 +1,60 @@ +'use strict'; +const pkg = require('../package.json'); +const create = require('./create'); + +const defaults = { + options: { + retry: { + retries: 2, + methods: [ + 'GET', + 'PUT', + 'HEAD', + 'DELETE', + 'OPTIONS', + 'TRACE' + ], + statusCodes: [ + 408, + 413, + 429, + 500, + 502, + 503, + 504 + ], + errorCodes: [ + 'ETIMEDOUT', + 'ECONNRESET', + 'EADDRINUSE', + 'ECONNREFUSED', + 'EPIPE', + 'ENOTFOUND', + 'ENETUNREACH', + 'EAI_AGAIN' + ] + }, + headers: { + 'user-agent': `${pkg.name}/${pkg.version} (https://github.com/sindresorhus/got)` + }, + hooks: { + beforeRequest: [], + beforeRedirect: [], + beforeRetry: [], + afterResponse: [] + }, + decompress: true, + throwHttpErrors: true, + followRedirect: true, + stream: false, + form: false, + json: false, + cache: false, + useElectronNet: false + }, + mutableDefaults: false +}; + +const got = create(defaults); + +module.exports = got; diff --git a/node_modules/got/source/known-hook-events.js b/node_modules/got/source/known-hook-events.js new file mode 100644 index 000000000..cd245e124 --- /dev/null +++ b/node_modules/got/source/known-hook-events.js @@ -0,0 +1,10 @@ +'use strict'; + +module.exports = [ + 'beforeError', + 'init', + 'beforeRequest', + 'beforeRedirect', + 
'beforeRetry', + 'afterResponse' +]; diff --git a/node_modules/got/source/merge.js b/node_modules/got/source/merge.js new file mode 100644 index 000000000..900f09a08 --- /dev/null +++ b/node_modules/got/source/merge.js @@ -0,0 +1,73 @@ +'use strict'; +const {URL} = require('url'); +const is = require('@sindresorhus/is'); +const knownHookEvents = require('./known-hook-events'); + +const merge = (target, ...sources) => { + for (const source of sources) { + for (const [key, sourceValue] of Object.entries(source)) { + if (is.undefined(sourceValue)) { + continue; + } + + const targetValue = target[key]; + if (is.urlInstance(targetValue) && (is.urlInstance(sourceValue) || is.string(sourceValue))) { + target[key] = new URL(sourceValue, targetValue); + } else if (is.plainObject(sourceValue)) { + if (is.plainObject(targetValue)) { + target[key] = merge({}, targetValue, sourceValue); + } else { + target[key] = merge({}, sourceValue); + } + } else if (is.array(sourceValue)) { + target[key] = merge([], sourceValue); + } else { + target[key] = sourceValue; + } + } + } + + return target; +}; + +const mergeOptions = (...sources) => { + sources = sources.map(source => source || {}); + const merged = merge({}, ...sources); + + const hooks = {}; + for (const hook of knownHookEvents) { + hooks[hook] = []; + } + + for (const source of sources) { + if (source.hooks) { + for (const hook of knownHookEvents) { + hooks[hook] = hooks[hook].concat(source.hooks[hook]); + } + } + } + + merged.hooks = hooks; + + return merged; +}; + +const mergeInstances = (instances, methods) => { + const handlers = instances.map(instance => instance.defaults.handler); + const size = instances.length - 1; + + return { + methods, + options: mergeOptions(...instances.map(instance => instance.defaults.options)), + handler: (options, next) => { + let iteration = -1; + const iterate = options => handlers[++iteration](options, iteration === size ? next : iterate); + + return iterate(options); + } + }; +}; + +module.exports = merge; +module.exports.options = mergeOptions; +module.exports.instances = mergeInstances; diff --git a/node_modules/got/source/normalize-arguments.js b/node_modules/got/source/normalize-arguments.js new file mode 100644 index 000000000..665cbceac --- /dev/null +++ b/node_modules/got/source/normalize-arguments.js @@ -0,0 +1,265 @@ +'use strict'; +const {URL, URLSearchParams} = require('url'); // TODO: Use the `URL` global when targeting Node.js 10 +const urlLib = require('url'); +const is = require('@sindresorhus/is'); +const urlParseLax = require('url-parse-lax'); +const lowercaseKeys = require('lowercase-keys'); +const urlToOptions = require('./utils/url-to-options'); +const isFormData = require('./utils/is-form-data'); +const merge = require('./merge'); +const knownHookEvents = require('./known-hook-events'); + +const retryAfterStatusCodes = new Set([413, 429, 503]); + +// `preNormalize` handles static options (e.g. headers). +// For example, when you create a custom instance and make a request +// with no static changes, they won't be normalized again. +// +// `normalize` operates on dynamic options - they cannot be saved. +// For example, `body` is everytime different per request. +// When it's done normalizing the new options, it performs merge() +// on the prenormalized options and the normalized ones. 
+ +const preNormalize = (options, defaults) => { + if (is.nullOrUndefined(options.headers)) { + options.headers = {}; + } else { + options.headers = lowercaseKeys(options.headers); + } + + if (options.baseUrl && !options.baseUrl.toString().endsWith('/')) { + options.baseUrl += '/'; + } + + if (options.stream) { + options.json = false; + } + + if (is.nullOrUndefined(options.hooks)) { + options.hooks = {}; + } else if (!is.object(options.hooks)) { + throw new TypeError(`Parameter \`hooks\` must be an object, not ${is(options.hooks)}`); + } + + for (const event of knownHookEvents) { + if (is.nullOrUndefined(options.hooks[event])) { + if (defaults) { + options.hooks[event] = [...defaults.hooks[event]]; + } else { + options.hooks[event] = []; + } + } + } + + if (is.number(options.timeout)) { + options.gotTimeout = {request: options.timeout}; + } else if (is.object(options.timeout)) { + options.gotTimeout = options.timeout; + } + + delete options.timeout; + + const {retry} = options; + options.retry = { + retries: 0, + methods: [], + statusCodes: [], + errorCodes: [] + }; + + if (is.nonEmptyObject(defaults) && retry !== false) { + options.retry = {...defaults.retry}; + } + + if (retry !== false) { + if (is.number(retry)) { + options.retry.retries = retry; + } else { + options.retry = {...options.retry, ...retry}; + } + } + + if (options.gotTimeout) { + options.retry.maxRetryAfter = Math.min(...[options.gotTimeout.request, options.gotTimeout.connection].filter(n => !is.nullOrUndefined(n))); + } + + if (is.array(options.retry.methods)) { + options.retry.methods = new Set(options.retry.methods.map(method => method.toUpperCase())); + } + + if (is.array(options.retry.statusCodes)) { + options.retry.statusCodes = new Set(options.retry.statusCodes); + } + + if (is.array(options.retry.errorCodes)) { + options.retry.errorCodes = new Set(options.retry.errorCodes); + } + + return options; +}; + +const normalize = (url, options, defaults) => { + if (is.plainObject(url)) { + options = {...url, ...options}; + url = options.url || {}; + delete options.url; + } + + if (defaults) { + options = merge({}, defaults.options, options ? preNormalize(options, defaults.options) : {}); + } else { + options = merge({}, preNormalize(options)); + } + + if (!is.string(url) && !is.object(url)) { + throw new TypeError(`Parameter \`url\` must be a string or object, not ${is(url)}`); + } + + if (is.string(url)) { + if (options.baseUrl) { + if (url.toString().startsWith('/')) { + url = url.toString().slice(1); + } + + url = urlToOptions(new URL(url, options.baseUrl)); + } else { + url = url.replace(/^unix:/, 'http://$&'); + url = urlParseLax(url); + } + } else if (is(url) === 'URL') { + url = urlToOptions(url); + } + + // Override both null/undefined with default protocol + options = merge({path: ''}, url, {protocol: url.protocol || 'https:'}, options); + + for (const hook of options.hooks.init) { + const called = hook(options); + + if (is.promise(called)) { + throw new TypeError('The `init` hook must be a synchronous function'); + } + } + + const {baseUrl} = options; + Object.defineProperty(options, 'baseUrl', { + set: () => { + throw new Error('Failed to set baseUrl. 
Options are normalized already.'); + }, + get: () => baseUrl + }); + + const {query} = options; + if (is.nonEmptyString(query) || is.nonEmptyObject(query) || query instanceof URLSearchParams) { + if (!is.string(query)) { + options.query = (new URLSearchParams(query)).toString(); + } + + options.path = `${options.path.split('?')[0]}?${options.query}`; + delete options.query; + } + + if (options.hostname === 'unix') { + const matches = /(.+?):(.+)/.exec(options.path); + + if (matches) { + const [, socketPath, path] = matches; + options = { + ...options, + socketPath, + path, + host: null + }; + } + } + + const {headers} = options; + for (const [key, value] of Object.entries(headers)) { + if (is.nullOrUndefined(value)) { + delete headers[key]; + } + } + + if (options.json && is.undefined(headers.accept)) { + headers.accept = 'application/json'; + } + + if (options.decompress && is.undefined(headers['accept-encoding'])) { + headers['accept-encoding'] = 'gzip, deflate'; + } + + const {body} = options; + if (is.nullOrUndefined(body)) { + options.method = options.method ? options.method.toUpperCase() : 'GET'; + } else { + const isObject = is.object(body) && !is.buffer(body) && !is.nodeStream(body); + if (!is.nodeStream(body) && !is.string(body) && !is.buffer(body) && !(options.form || options.json)) { + throw new TypeError('The `body` option must be a stream.Readable, string or Buffer'); + } + + if (options.json && !(isObject || is.array(body))) { + throw new TypeError('The `body` option must be an Object or Array when the `json` option is used'); + } + + if (options.form && !isObject) { + throw new TypeError('The `body` option must be an Object when the `form` option is used'); + } + + if (isFormData(body)) { + // Special case for https://github.com/form-data/form-data + headers['content-type'] = headers['content-type'] || `multipart/form-data; boundary=${body.getBoundary()}`; + } else if (options.form) { + headers['content-type'] = headers['content-type'] || 'application/x-www-form-urlencoded'; + options.body = (new URLSearchParams(body)).toString(); + } else if (options.json) { + headers['content-type'] = headers['content-type'] || 'application/json'; + options.body = JSON.stringify(body); + } + + options.method = options.method ? 
options.method.toUpperCase() : 'POST'; + } + + if (!is.function(options.retry.retries)) { + const {retries} = options.retry; + + options.retry.retries = (iteration, error) => { + if (iteration > retries) { + return 0; + } + + if ((!error || !options.retry.errorCodes.has(error.code)) && (!options.retry.methods.has(error.method) || !options.retry.statusCodes.has(error.statusCode))) { + return 0; + } + + if (Reflect.has(error, 'headers') && Reflect.has(error.headers, 'retry-after') && retryAfterStatusCodes.has(error.statusCode)) { + let after = Number(error.headers['retry-after']); + if (is.nan(after)) { + after = Date.parse(error.headers['retry-after']) - Date.now(); + } else { + after *= 1000; + } + + if (after > options.retry.maxRetryAfter) { + return 0; + } + + return after; + } + + if (error.statusCode === 413) { + return 0; + } + + const noise = Math.random() * 100; + return ((2 ** (iteration - 1)) * 1000) + noise; + }; + } + + return options; +}; + +const reNormalize = options => normalize(urlLib.format(options), options); + +module.exports = normalize; +module.exports.preNormalize = preNormalize; +module.exports.reNormalize = reNormalize; diff --git a/node_modules/got/source/progress.js b/node_modules/got/source/progress.js new file mode 100644 index 000000000..666abcfdd --- /dev/null +++ b/node_modules/got/source/progress.js @@ -0,0 +1,96 @@ +'use strict'; +const {Transform} = require('stream'); + +module.exports = { + download(response, emitter, downloadBodySize) { + let downloaded = 0; + + return new Transform({ + transform(chunk, encoding, callback) { + downloaded += chunk.length; + + const percent = downloadBodySize ? downloaded / downloadBodySize : 0; + + // Let `flush()` be responsible for emitting the last event + if (percent < 1) { + emitter.emit('downloadProgress', { + percent, + transferred: downloaded, + total: downloadBodySize + }); + } + + callback(null, chunk); + }, + + flush(callback) { + emitter.emit('downloadProgress', { + percent: 1, + transferred: downloaded, + total: downloadBodySize + }); + + callback(); + } + }); + }, + + upload(request, emitter, uploadBodySize) { + const uploadEventFrequency = 150; + let uploaded = 0; + let progressInterval; + + emitter.emit('uploadProgress', { + percent: 0, + transferred: 0, + total: uploadBodySize + }); + + request.once('error', () => { + clearInterval(progressInterval); + }); + + request.once('response', () => { + clearInterval(progressInterval); + + emitter.emit('uploadProgress', { + percent: 1, + transferred: uploaded, + total: uploadBodySize + }); + }); + + request.once('socket', socket => { + const onSocketConnect = () => { + progressInterval = setInterval(() => { + const lastUploaded = uploaded; + /* istanbul ignore next: see #490 (occurs randomly!) */ + const headersSize = request._header ? Buffer.byteLength(request._header) : 0; + uploaded = socket.bytesWritten - headersSize; + + // Don't emit events with unchanged progress and + // prevent last event from being emitted, because + // it's emitted when `response` is emitted + if (uploaded === lastUploaded || uploaded === uploadBodySize) { + return; + } + + emitter.emit('uploadProgress', { + percent: uploadBodySize ? 
uploaded / uploadBodySize : 0, + transferred: uploaded, + total: uploadBodySize + }); + }, uploadEventFrequency); + }; + + /* istanbul ignore next: hard to test */ + if (socket.connecting) { + socket.once('connect', onSocketConnect); + } else if (socket.writable) { + // The socket is being reused from pool, + // so the connect event will not be emitted + onSocketConnect(); + } + }); + } +}; diff --git a/node_modules/got/source/request-as-event-emitter.js b/node_modules/got/source/request-as-event-emitter.js new file mode 100644 index 000000000..79586af84 --- /dev/null +++ b/node_modules/got/source/request-as-event-emitter.js @@ -0,0 +1,312 @@ +'use strict'; +const {URL} = require('url'); // TODO: Use the `URL` global when targeting Node.js 10 +const util = require('util'); +const EventEmitter = require('events'); +const http = require('http'); +const https = require('https'); +const urlLib = require('url'); +const CacheableRequest = require('cacheable-request'); +const toReadableStream = require('to-readable-stream'); +const is = require('@sindresorhus/is'); +const timer = require('@szmarczak/http-timer'); +const timedOut = require('./utils/timed-out'); +const getBodySize = require('./utils/get-body-size'); +const getResponse = require('./get-response'); +const progress = require('./progress'); +const {CacheError, UnsupportedProtocolError, MaxRedirectsError, RequestError, TimeoutError} = require('./errors'); +const urlToOptions = require('./utils/url-to-options'); + +const getMethodRedirectCodes = new Set([300, 301, 302, 303, 304, 305, 307, 308]); +const allMethodRedirectCodes = new Set([300, 303, 307, 308]); + +module.exports = (options, input) => { + const emitter = new EventEmitter(); + const redirects = []; + let currentRequest; + let requestUrl; + let redirectString; + let uploadBodySize; + let retryCount = 0; + let shouldAbort = false; + + const setCookie = options.cookieJar ? util.promisify(options.cookieJar.setCookie.bind(options.cookieJar)) : null; + const getCookieString = options.cookieJar ? util.promisify(options.cookieJar.getCookieString.bind(options.cookieJar)) : null; + const agents = is.object(options.agent) ? options.agent : null; + + const emitError = async error => { + try { + for (const hook of options.hooks.beforeError) { + // eslint-disable-next-line no-await-in-loop + error = await hook(error); + } + + emitter.emit('error', error); + } catch (error2) { + emitter.emit('error', error2); + } + }; + + const get = async options => { + const currentUrl = redirectString || requestUrl; + + if (options.protocol !== 'http:' && options.protocol !== 'https:') { + throw new UnsupportedProtocolError(options); + } + + decodeURI(currentUrl); + + let fn; + if (is.function(options.request)) { + fn = {request: options.request}; + } else { + fn = options.protocol === 'https:' ? https : http; + } + + if (agents) { + const protocolName = options.protocol === 'https:' ? 
'https' : 'http'; + options.agent = agents[protocolName] || options.agent; + } + + /* istanbul ignore next: electron.net is broken */ + if (options.useElectronNet && process.versions.electron) { + const r = ({x: require})['yx'.slice(1)]; // Trick webpack + const electron = r('electron'); + fn = electron.net || electron.remote.net; + } + + if (options.cookieJar) { + const cookieString = await getCookieString(currentUrl, {}); + + if (is.nonEmptyString(cookieString)) { + options.headers.cookie = cookieString; + } + } + + let timings; + const handleResponse = async response => { + try { + /* istanbul ignore next: fixes https://github.com/electron/electron/blob/cbb460d47628a7a146adf4419ed48550a98b2923/lib/browser/api/net.js#L59-L65 */ + if (options.useElectronNet) { + response = new Proxy(response, { + get: (target, name) => { + if (name === 'trailers' || name === 'rawTrailers') { + return []; + } + + const value = target[name]; + return is.function(value) ? value.bind(target) : value; + } + }); + } + + const {statusCode} = response; + response.url = currentUrl; + response.requestUrl = requestUrl; + response.retryCount = retryCount; + response.timings = timings; + response.redirectUrls = redirects; + response.request = { + gotOptions: options + }; + + const rawCookies = response.headers['set-cookie']; + if (options.cookieJar && rawCookies) { + await Promise.all(rawCookies.map(rawCookie => setCookie(rawCookie, response.url))); + } + + if (options.followRedirect && 'location' in response.headers) { + if (allMethodRedirectCodes.has(statusCode) || (getMethodRedirectCodes.has(statusCode) && (options.method === 'GET' || options.method === 'HEAD'))) { + response.resume(); // We're being redirected, we don't care about the response. + + if (statusCode === 303) { + // Server responded with "see other", indicating that the resource exists at another location, + // and the client should request it from that location via GET or HEAD. + options.method = 'GET'; + } + + if (redirects.length >= 10) { + throw new MaxRedirectsError(statusCode, redirects, options); + } + + // Handles invalid URLs. 
See https://github.com/sindresorhus/got/issues/604 + const redirectBuffer = Buffer.from(response.headers.location, 'binary').toString(); + const redirectURL = new URL(redirectBuffer, currentUrl); + redirectString = redirectURL.toString(); + + redirects.push(redirectString); + + const redirectOptions = { + ...options, + ...urlToOptions(redirectURL) + }; + + for (const hook of options.hooks.beforeRedirect) { + // eslint-disable-next-line no-await-in-loop + await hook(redirectOptions); + } + + emitter.emit('redirect', response, redirectOptions); + + await get(redirectOptions); + return; + } + } + + getResponse(response, options, emitter); + } catch (error) { + emitError(error); + } + }; + + const handleRequest = request => { + if (shouldAbort) { + request.once('error', () => {}); + request.abort(); + return; + } + + currentRequest = request; + + request.once('error', error => { + if (request.aborted) { + return; + } + + if (error instanceof timedOut.TimeoutError) { + error = new TimeoutError(error, options); + } else { + error = new RequestError(error, options); + } + + if (emitter.retry(error) === false) { + emitError(error); + } + }); + + timings = timer(request); + + progress.upload(request, emitter, uploadBodySize); + + if (options.gotTimeout) { + timedOut(request, options.gotTimeout, options); + } + + emitter.emit('request', request); + + const uploadComplete = () => { + request.emit('upload-complete'); + }; + + try { + if (is.nodeStream(options.body)) { + options.body.once('end', uploadComplete); + options.body.pipe(request); + options.body = undefined; + } else if (options.body) { + request.end(options.body, uploadComplete); + } else if (input && (options.method === 'POST' || options.method === 'PUT' || options.method === 'PATCH')) { + input.once('end', uploadComplete); + input.pipe(request); + } else { + request.end(uploadComplete); + } + } catch (error) { + emitError(new RequestError(error, options)); + } + }; + + if (options.cache) { + const cacheableRequest = new CacheableRequest(fn.request, options.cache); + const cacheRequest = cacheableRequest(options, handleResponse); + + cacheRequest.once('error', error => { + if (error instanceof CacheableRequest.RequestError) { + emitError(new RequestError(error, options)); + } else { + emitError(new CacheError(error, options)); + } + }); + + cacheRequest.once('request', handleRequest); + } else { + // Catches errors thrown by calling fn.request(...) 
+ try { + handleRequest(fn.request(options, handleResponse)); + } catch (error) { + emitError(new RequestError(error, options)); + } + } + }; + + emitter.retry = error => { + let backoff; + + try { + backoff = options.retry.retries(++retryCount, error); + } catch (error2) { + emitError(error2); + return; + } + + if (backoff) { + const retry = async options => { + try { + for (const hook of options.hooks.beforeRetry) { + // eslint-disable-next-line no-await-in-loop + await hook(options, error, retryCount); + } + + await get(options); + } catch (error) { + emitError(error); + } + }; + + setTimeout(retry, backoff, {...options, forceRefresh: true}); + return true; + } + + return false; + }; + + emitter.abort = () => { + if (currentRequest) { + currentRequest.once('error', () => {}); + currentRequest.abort(); + } else { + shouldAbort = true; + } + }; + + setImmediate(async () => { + try { + // Convert buffer to stream to receive upload progress events (#322) + const {body} = options; + if (is.buffer(body)) { + options.body = toReadableStream(body); + uploadBodySize = body.length; + } else { + uploadBodySize = await getBodySize(options); + } + + if (is.undefined(options.headers['content-length']) && is.undefined(options.headers['transfer-encoding'])) { + if ((uploadBodySize > 0 || options.method === 'PUT') && !is.null(uploadBodySize)) { + options.headers['content-length'] = uploadBodySize; + } + } + + for (const hook of options.hooks.beforeRequest) { + // eslint-disable-next-line no-await-in-loop + await hook(options); + } + + requestUrl = options.href || (new URL(options.path, urlLib.format(options))).toString(); + + await get(options); + } catch (error) { + emitError(error); + } + }); + + return emitter; +}; diff --git a/node_modules/got/source/utils/deep-freeze.js b/node_modules/got/source/utils/deep-freeze.js new file mode 100644 index 000000000..5052b7126 --- /dev/null +++ b/node_modules/got/source/utils/deep-freeze.js @@ -0,0 +1,12 @@ +'use strict'; +const is = require('@sindresorhus/is'); + +module.exports = function deepFreeze(object) { + for (const [key, value] of Object.entries(object)) { + if (is.plainObject(value) || is.array(value)) { + deepFreeze(object[key]); + } + } + + return Object.freeze(object); +}; diff --git a/node_modules/got/source/utils/get-body-size.js b/node_modules/got/source/utils/get-body-size.js new file mode 100644 index 000000000..0df5af21e --- /dev/null +++ b/node_modules/got/source/utils/get-body-size.js @@ -0,0 +1,32 @@ +'use strict'; +const fs = require('fs'); +const util = require('util'); +const is = require('@sindresorhus/is'); +const isFormData = require('./is-form-data'); + +module.exports = async options => { + const {body} = options; + + if (options.headers['content-length']) { + return Number(options.headers['content-length']); + } + + if (!body && !options.stream) { + return 0; + } + + if (is.string(body)) { + return Buffer.byteLength(body); + } + + if (isFormData(body)) { + return util.promisify(body.getLength.bind(body))(); + } + + if (body instanceof fs.ReadStream) { + const {size} = await util.promisify(fs.stat)(body.path); + return size; + } + + return null; +}; diff --git a/node_modules/got/source/utils/is-form-data.js b/node_modules/got/source/utils/is-form-data.js new file mode 100644 index 000000000..0033618cd --- /dev/null +++ b/node_modules/got/source/utils/is-form-data.js @@ -0,0 +1,4 @@ +'use strict'; +const is = require('@sindresorhus/is'); + +module.exports = body => is.nodeStream(body) && is.function(body.getBoundary); diff --git 
a/node_modules/got/source/utils/timed-out.js b/node_modules/got/source/utils/timed-out.js new file mode 100644 index 000000000..33611a734 --- /dev/null +++ b/node_modules/got/source/utils/timed-out.js @@ -0,0 +1,160 @@ +'use strict'; +const net = require('net'); + +class TimeoutError extends Error { + constructor(threshold, event) { + super(`Timeout awaiting '${event}' for ${threshold}ms`); + this.name = 'TimeoutError'; + this.code = 'ETIMEDOUT'; + this.event = event; + } +} + +const reentry = Symbol('reentry'); + +const noop = () => {}; + +module.exports = (request, delays, options) => { + /* istanbul ignore next: this makes sure timed-out isn't called twice */ + if (request[reentry]) { + return; + } + + request[reentry] = true; + + let stopNewTimeouts = false; + + const addTimeout = (delay, callback, ...args) => { + // An error had been thrown before. Going further would result in uncaught errors. + // See https://github.com/sindresorhus/got/issues/631#issuecomment-435675051 + if (stopNewTimeouts) { + return noop; + } + + // Event loop order is timers, poll, immediates. + // The timed event may emit during the current tick poll phase, so + // defer calling the handler until the poll phase completes. + let immediate; + const timeout = setTimeout(() => { + immediate = setImmediate(callback, delay, ...args); + /* istanbul ignore next: added in node v9.7.0 */ + if (immediate.unref) { + immediate.unref(); + } + }, delay); + + /* istanbul ignore next: in order to support electron renderer */ + if (timeout.unref) { + timeout.unref(); + } + + const cancel = () => { + clearTimeout(timeout); + clearImmediate(immediate); + }; + + cancelers.push(cancel); + + return cancel; + }; + + const {host, hostname} = options; + const timeoutHandler = (delay, event) => { + request.emit('error', new TimeoutError(delay, event)); + request.once('error', () => {}); // Ignore the `socket hung up` error made by request.abort() + + request.abort(); + }; + + const cancelers = []; + const cancelTimeouts = () => { + stopNewTimeouts = true; + cancelers.forEach(cancelTimeout => cancelTimeout()); + }; + + request.once('error', cancelTimeouts); + request.once('response', response => { + response.once('end', cancelTimeouts); + }); + + if (delays.request !== undefined) { + addTimeout(delays.request, timeoutHandler, 'request'); + } + + if (delays.socket !== undefined) { + const socketTimeoutHandler = () => { + timeoutHandler(delays.socket, 'socket'); + }; + + request.setTimeout(delays.socket, socketTimeoutHandler); + + // `request.setTimeout(0)` causes a memory leak. + // We can just remove the listener and forget about the timer - it's unreffed. 
+ // See https://github.com/sindresorhus/got/issues/690 + cancelers.push(() => request.removeListener('timeout', socketTimeoutHandler)); + } + + if (delays.lookup !== undefined && !request.socketPath && !net.isIP(hostname || host)) { + request.once('socket', socket => { + /* istanbul ignore next: hard to test */ + if (socket.connecting) { + const cancelTimeout = addTimeout(delays.lookup, timeoutHandler, 'lookup'); + socket.once('lookup', cancelTimeout); + } + }); + } + + if (delays.connect !== undefined) { + request.once('socket', socket => { + /* istanbul ignore next: hard to test */ + if (socket.connecting) { + const timeConnect = () => addTimeout(delays.connect, timeoutHandler, 'connect'); + + if (request.socketPath || net.isIP(hostname || host)) { + socket.once('connect', timeConnect()); + } else { + socket.once('lookup', error => { + if (error === null) { + socket.once('connect', timeConnect()); + } + }); + } + } + }); + } + + if (delays.secureConnect !== undefined && options.protocol === 'https:') { + request.once('socket', socket => { + /* istanbul ignore next: hard to test */ + if (socket.connecting) { + socket.once('connect', () => { + const cancelTimeout = addTimeout(delays.secureConnect, timeoutHandler, 'secureConnect'); + socket.once('secureConnect', cancelTimeout); + }); + } + }); + } + + if (delays.send !== undefined) { + request.once('socket', socket => { + const timeRequest = () => addTimeout(delays.send, timeoutHandler, 'send'); + /* istanbul ignore next: hard to test */ + if (socket.connecting) { + socket.once('connect', () => { + request.once('upload-complete', timeRequest()); + }); + } else { + request.once('upload-complete', timeRequest()); + } + }); + } + + if (delays.response !== undefined) { + request.once('upload-complete', () => { + const cancelTimeout = addTimeout(delays.response, timeoutHandler, 'response'); + request.once('response', cancelTimeout); + }); + } +}; + +module.exports.TimeoutError = TimeoutError; diff --git a/node_modules/got/source/utils/url-to-options.js b/node_modules/got/source/utils/url-to-options.js new file mode 100644 index 000000000..848ef300f --- /dev/null +++ b/node_modules/got/source/utils/url-to-options.js @@ -0,0 +1,25 @@ +'use strict'; +const is = require('@sindresorhus/is'); + +module.exports = url => { + const options = { + protocol: url.protocol, + hostname: url.hostname.startsWith('[') ? url.hostname.slice(1, -1) : url.hostname, + hash: url.hash, + search: url.search, + pathname: url.pathname, + href: url.href + }; + + if (is.string(url.port) && url.port.length > 0) { + options.port = Number(url.port); + } + + if (url.username || url.password) { + options.auth = `${url.username}:${url.password}`; + } + + options.path = is.null(url.search) ? url.pathname : `${url.pathname}${url.search}`; + + return options; +}; diff --git a/node_modules/graceful-fs/LICENSE b/node_modules/graceful-fs/LICENSE new file mode 100644 index 000000000..9d2c80369 --- /dev/null +++ b/node_modules/graceful-fs/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter, Ben Noordhuis, and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/graceful-fs/README.md b/node_modules/graceful-fs/README.md new file mode 100644 index 000000000..82d6e4daf --- /dev/null +++ b/node_modules/graceful-fs/README.md @@ -0,0 +1,143 @@ +# graceful-fs + +graceful-fs functions as a drop-in replacement for the fs module, +making various improvements. + +The improvements are meant to normalize behavior across different +platforms and environments, and to make filesystem access more +resilient to errors. + +## Improvements over [fs module](https://nodejs.org/api/fs.html) + +* Queues up `open` and `readdir` calls, and retries them once + something closes if there is an EMFILE error from too many file + descriptors. +* fixes `lchmod` for Node versions prior to 0.6.2. +* implements `fs.lutimes` if possible. Otherwise it becomes a noop. +* ignores `EINVAL` and `EPERM` errors in `chown`, `fchown` or + `lchown` if the user isn't root. +* makes `lchmod` and `lchown` become noops, if not available. +* retries reading a file if `read` results in EAGAIN error. + +On Windows, it retries renaming a file for up to one second if `EACCESS` +or `EPERM` error occurs, likely because antivirus software has locked +the directory. + +## USAGE + +```javascript +// use just like fs +var fs = require('graceful-fs') + +// now go and do stuff with it... +fs.readFile('some-file-or-whatever', (err, data) => { + // Do stuff here. +}) +``` + +## Sync methods + +This module cannot intercept or handle `EMFILE` or `ENFILE` errors from sync +methods. If you use sync methods which open file descriptors then you are +responsible for dealing with any errors. + +This is a known limitation, not a bug. + +## Global Patching + +If you want to patch the global fs module (or any other fs-like +module) you can do this: + +```javascript +// Make sure to read the caveat below. +var realFs = require('fs') +var gracefulFs = require('graceful-fs') +gracefulFs.gracefulify(realFs) +``` + +This should only ever be done at the top-level application layer, in +order to delay on EMFILE errors from any fs-using dependencies. You +should **not** do this in a library, because it can cause unexpected +delays in other parts of the program. + +## Changes + +This module is fairly stable at this point, and used by a lot of +things. That being said, because it implements a subtle behavior +change in a core part of the node API, even modest changes can be +extremely breaking, and the versioning is thus biased towards +bumping the major when in doubt. + +The main change between major versions has been switching between +providing a fully-patched `fs` module vs monkey-patching the node core +builtin, and the approach by which a non-monkey-patched `fs` was +created. + +The goal is to trade `EMFILE` errors for slower fs operations. So, if +you try to open a zillion files, rather than crashing, `open` +operations will be queued up and wait for something else to `close`. + +There are advantages to each approach. Monkey-patching the fs means +that no `EMFILE` errors can possibly occur anywhere in your +application, because everything is using the same core `fs` module, +which is patched. 
However, it can also obviously cause undesirable +side-effects, especially if the module is loaded multiple times. + +Implementing a separate-but-identical patched `fs` module is more +surgical (and doesn't run the risk of patching multiple times), but +also imposes the challenge of keeping in sync with the core module. + +The current approach loads the `fs` module, and then creates a +lookalike object that has all the same methods, except a few that are +patched. It is safe to use in all versions of Node from 0.8 through +7.0. + +### v4 + +* Do not monkey-patch the fs module. This module may now be used as a + drop-in dep, and users can opt into monkey-patching the fs builtin + if their app requires it. + +### v3 + +* Monkey-patch fs, because the eval approach no longer works on recent + node. +* fixed possible type-error throw if rename fails on windows +* verify that we *never* get EMFILE errors +* Ignore ENOSYS from chmod/chown +* clarify that graceful-fs must be used as a drop-in + +### v2.1.0 + +* Use eval rather than monkey-patching fs. +* readdir: Always sort the results +* win32: requeue a file if error has an OK status + +### v2.0 + +* A return to monkey patching +* wrap process.cwd + +### v1.1 + +* wrap readFile +* Wrap fs.writeFile. +* readdir protection +* Don't clobber the fs builtin +* Handle fs.read EAGAIN errors by trying again +* Expose the curOpen counter +* No-op lchown/lchmod if not implemented +* fs.rename patch only for win32 +* Patch fs.rename to handle AV software on Windows +* Close #4 Chown should not fail on einval or eperm if non-root +* Fix isaacs/fstream#1 Only wrap fs one time +* Fix #3 Start at 1024 max files, then back off on EMFILE +* lutimes that doens't blow up on Linux +* A full on-rewrite using a queue instead of just swallowing the EMFILE error +* Wrap Read/Write streams as well + +### 1.0 + +* Update engines for node 0.6 +* Be lstat-graceful on Windows +* first diff --git a/node_modules/graceful-fs/clone.js b/node_modules/graceful-fs/clone.js new file mode 100644 index 000000000..dff3cc8c5 --- /dev/null +++ b/node_modules/graceful-fs/clone.js @@ -0,0 +1,23 @@ +'use strict' + +module.exports = clone + +var getPrototypeOf = Object.getPrototypeOf || function (obj) { + return obj.__proto__ +} + +function clone (obj) { + if (obj === null || typeof obj !== 'object') + return obj + + if (obj instanceof Object) + var copy = { __proto__: getPrototypeOf(obj) } + else + var copy = Object.create(null) + + Object.getOwnPropertyNames(obj).forEach(function (key) { + Object.defineProperty(copy, key, Object.getOwnPropertyDescriptor(obj, key)) + }) + + return copy +} diff --git a/node_modules/graceful-fs/graceful-fs.js b/node_modules/graceful-fs/graceful-fs.js new file mode 100644 index 000000000..947cd94bb --- /dev/null +++ b/node_modules/graceful-fs/graceful-fs.js @@ -0,0 +1,429 @@ +var fs = require('fs') +var polyfills = require('./polyfills.js') +var legacy = require('./legacy-streams.js') +var clone = require('./clone.js') + +var util = require('util') + +/* istanbul ignore next - node 0.x polyfill */ +var gracefulQueue +var previousSymbol + +/* istanbul ignore else - node 0.x polyfill */ +if (typeof Symbol === 'function' && typeof Symbol.for === 'function') { + gracefulQueue = Symbol.for('graceful-fs.queue') + // This is used in testing by future versions + previousSymbol = Symbol.for('graceful-fs.previous') +} else { + gracefulQueue = '___graceful-fs.queue' + previousSymbol = '___graceful-fs.previous' +} + +function noop () {} + +function publishQueue(context, 
queue) { + Object.defineProperty(context, gracefulQueue, { + get: function() { + return queue + } + }) +} + +var debug = noop +if (util.debuglog) + debug = util.debuglog('gfs4') +else if (/\bgfs4\b/i.test(process.env.NODE_DEBUG || '')) + debug = function() { + var m = util.format.apply(util, arguments) + m = 'GFS4: ' + m.split(/\n/).join('\nGFS4: ') + console.error(m) + } + +// Once time initialization +if (!fs[gracefulQueue]) { + // This queue can be shared by multiple loaded instances + var queue = global[gracefulQueue] || [] + publishQueue(fs, queue) + + // Patch fs.close/closeSync to shared queue version, because we need + // to retry() whenever a close happens *anywhere* in the program. + // This is essential when multiple graceful-fs instances are + // in play at the same time. + fs.close = (function (fs$close) { + function close (fd, cb) { + return fs$close.call(fs, fd, function (err) { + // This function uses the graceful-fs shared queue + if (!err) { + resetQueue() + } + + if (typeof cb === 'function') + cb.apply(this, arguments) + }) + } + + Object.defineProperty(close, previousSymbol, { + value: fs$close + }) + return close + })(fs.close) + + fs.closeSync = (function (fs$closeSync) { + function closeSync (fd) { + // This function uses the graceful-fs shared queue + fs$closeSync.apply(fs, arguments) + resetQueue() + } + + Object.defineProperty(closeSync, previousSymbol, { + value: fs$closeSync + }) + return closeSync + })(fs.closeSync) + + if (/\bgfs4\b/i.test(process.env.NODE_DEBUG || '')) { + process.on('exit', function() { + debug(fs[gracefulQueue]) + require('assert').equal(fs[gracefulQueue].length, 0) + }) + } +} + +if (!global[gracefulQueue]) { + publishQueue(global, fs[gracefulQueue]); +} + +module.exports = patch(clone(fs)) +if (process.env.TEST_GRACEFUL_FS_GLOBAL_PATCH && !fs.__patched) { + module.exports = patch(fs) + fs.__patched = true; +} + +function patch (fs) { + // Everything that references the open() function needs to be in here + polyfills(fs) + fs.gracefulify = patch + + fs.createReadStream = createReadStream + fs.createWriteStream = createWriteStream + var fs$readFile = fs.readFile + fs.readFile = readFile + function readFile (path, options, cb) { + if (typeof options === 'function') + cb = options, options = null + + return go$readFile(path, options, cb) + + function go$readFile (path, options, cb, startTime) { + return fs$readFile(path, options, function (err) { + if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) + enqueue([go$readFile, [path, options, cb], err, startTime || Date.now(), Date.now()]) + else { + if (typeof cb === 'function') + cb.apply(this, arguments) + } + }) + } + } + + var fs$writeFile = fs.writeFile + fs.writeFile = writeFile + function writeFile (path, data, options, cb) { + if (typeof options === 'function') + cb = options, options = null + + return go$writeFile(path, data, options, cb) + + function go$writeFile (path, data, options, cb, startTime) { + return fs$writeFile(path, data, options, function (err) { + if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) + enqueue([go$writeFile, [path, data, options, cb], err, startTime || Date.now(), Date.now()]) + else { + if (typeof cb === 'function') + cb.apply(this, arguments) + } + }) + } + } + + var fs$appendFile = fs.appendFile + if (fs$appendFile) + fs.appendFile = appendFile + function appendFile (path, data, options, cb) { + if (typeof options === 'function') + cb = options, options = null + + return go$appendFile(path, data, options, cb) + + function 
go$appendFile (path, data, options, cb, startTime) { + return fs$appendFile(path, data, options, function (err) { + if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) + enqueue([go$appendFile, [path, data, options, cb], err, startTime || Date.now(), Date.now()]) + else { + if (typeof cb === 'function') + cb.apply(this, arguments) + } + }) + } + } + + var fs$copyFile = fs.copyFile + if (fs$copyFile) + fs.copyFile = copyFile + function copyFile (src, dest, flags, cb) { + if (typeof flags === 'function') { + cb = flags + flags = 0 + } + return go$copyFile(src, dest, flags, cb) + + function go$copyFile (src, dest, flags, cb, startTime) { + return fs$copyFile(src, dest, flags, function (err) { + if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) + enqueue([go$copyFile, [src, dest, flags, cb], err, startTime || Date.now(), Date.now()]) + else { + if (typeof cb === 'function') + cb.apply(this, arguments) + } + }) + } + } + + var fs$readdir = fs.readdir + fs.readdir = readdir + function readdir (path, options, cb) { + if (typeof options === 'function') + cb = options, options = null + + return go$readdir(path, options, cb) + + function go$readdir (path, options, cb, startTime) { + return fs$readdir(path, options, function (err, files) { + if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) + enqueue([go$readdir, [path, options, cb], err, startTime || Date.now(), Date.now()]) + else { + if (files && files.sort) + files.sort() + + if (typeof cb === 'function') + cb.call(this, err, files) + } + }) + } + } + + if (process.version.substr(0, 4) === 'v0.8') { + var legStreams = legacy(fs) + ReadStream = legStreams.ReadStream + WriteStream = legStreams.WriteStream + } + + var fs$ReadStream = fs.ReadStream + if (fs$ReadStream) { + ReadStream.prototype = Object.create(fs$ReadStream.prototype) + ReadStream.prototype.open = ReadStream$open + } + + var fs$WriteStream = fs.WriteStream + if (fs$WriteStream) { + WriteStream.prototype = Object.create(fs$WriteStream.prototype) + WriteStream.prototype.open = WriteStream$open + } + + Object.defineProperty(fs, 'ReadStream', { + get: function () { + return ReadStream + }, + set: function (val) { + ReadStream = val + }, + enumerable: true, + configurable: true + }) + Object.defineProperty(fs, 'WriteStream', { + get: function () { + return WriteStream + }, + set: function (val) { + WriteStream = val + }, + enumerable: true, + configurable: true + }) + + // legacy names + var FileReadStream = ReadStream + Object.defineProperty(fs, 'FileReadStream', { + get: function () { + return FileReadStream + }, + set: function (val) { + FileReadStream = val + }, + enumerable: true, + configurable: true + }) + var FileWriteStream = WriteStream + Object.defineProperty(fs, 'FileWriteStream', { + get: function () { + return FileWriteStream + }, + set: function (val) { + FileWriteStream = val + }, + enumerable: true, + configurable: true + }) + + function ReadStream (path, options) { + if (this instanceof ReadStream) + return fs$ReadStream.apply(this, arguments), this + else + return ReadStream.apply(Object.create(ReadStream.prototype), arguments) + } + + function ReadStream$open () { + var that = this + open(that.path, that.flags, that.mode, function (err, fd) { + if (err) { + if (that.autoClose) + that.destroy() + + that.emit('error', err) + } else { + that.fd = fd + that.emit('open', fd) + that.read() + } + }) + } + + function WriteStream (path, options) { + if (this instanceof WriteStream) + return fs$WriteStream.apply(this, arguments), this + else + 
return WriteStream.apply(Object.create(WriteStream.prototype), arguments) + } + + function WriteStream$open () { + var that = this + open(that.path, that.flags, that.mode, function (err, fd) { + if (err) { + that.destroy() + that.emit('error', err) + } else { + that.fd = fd + that.emit('open', fd) + } + }) + } + + function createReadStream (path, options) { + return new fs.ReadStream(path, options) + } + + function createWriteStream (path, options) { + return new fs.WriteStream(path, options) + } + + var fs$open = fs.open + fs.open = open + function open (path, flags, mode, cb) { + if (typeof mode === 'function') + cb = mode, mode = null + + return go$open(path, flags, mode, cb) + + function go$open (path, flags, mode, cb, startTime) { + return fs$open(path, flags, mode, function (err, fd) { + if (err && (err.code === 'EMFILE' || err.code === 'ENFILE')) + enqueue([go$open, [path, flags, mode, cb], err, startTime || Date.now(), Date.now()]) + else { + if (typeof cb === 'function') + cb.apply(this, arguments) + } + }) + } + } + + return fs +} + +function enqueue (elem) { + debug('ENQUEUE', elem[0].name, elem[1]) + fs[gracefulQueue].push(elem) + retry() +} + +// keep track of the timeout between retry() calls +var retryTimer + +// reset the startTime and lastTime to now +// this resets the start of the 60 second overall timeout as well as the +// delay between attempts so that we'll retry these jobs sooner +function resetQueue () { + var now = Date.now() + for (var i = 0; i < fs[gracefulQueue].length; ++i) { + // entries that are only a length of 2 are from an older version, don't + // bother modifying those since they'll be retried anyway. + if (fs[gracefulQueue][i].length > 2) { + fs[gracefulQueue][i][3] = now // startTime + fs[gracefulQueue][i][4] = now // lastTime + } + } + // call retry to make sure we're actively processing the queue + retry() +} + +function retry () { + // clear the timer and remove it to help prevent unintended concurrency + clearTimeout(retryTimer) + retryTimer = undefined + + if (fs[gracefulQueue].length === 0) + return + + var elem = fs[gracefulQueue].shift() + var fn = elem[0] + var args = elem[1] + // these items may be unset if they were added by an older graceful-fs + var err = elem[2] + var startTime = elem[3] + var lastTime = elem[4] + + // if we don't have a startTime we have no way of knowing if we've waited + // long enough, so go ahead and retry this item now + if (startTime === undefined) { + debug('RETRY', fn.name, args) + fn.apply(null, args) + } else if (Date.now() - startTime >= 60000) { + // it's been more than 60 seconds total, bail now + debug('TIMEOUT', fn.name, args) + var cb = args.pop() + if (typeof cb === 'function') + cb.call(null, err) + } else { + // the amount of time between the last attempt and right now + var sinceAttempt = Date.now() - lastTime + // the amount of time between when we first tried, and when we last tried + // rounded up to at least 1 + var sinceStart = Math.max(lastTime - startTime, 1) + // backoff. 
wait longer than the total time we've been retrying, but only + // up to a maximum of 100ms + var desiredDelay = Math.min(sinceStart * 1.2, 100) + // it's been long enough since the last retry, do it again + if (sinceAttempt >= desiredDelay) { + debug('RETRY', fn.name, args) + fn.apply(null, args.concat([startTime])) + } else { + // if we can't do this job yet, push it to the end of the queue + // and let the next iteration check again + fs[gracefulQueue].push(elem) + } + } + + // schedule our next run if one isn't already scheduled + if (retryTimer === undefined) { + retryTimer = setTimeout(retry, 0) + } +} diff --git a/node_modules/graceful-fs/legacy-streams.js b/node_modules/graceful-fs/legacy-streams.js new file mode 100644 index 000000000..d617b50fc --- /dev/null +++ b/node_modules/graceful-fs/legacy-streams.js @@ -0,0 +1,118 @@ +var Stream = require('stream').Stream + +module.exports = legacy + +function legacy (fs) { + return { + ReadStream: ReadStream, + WriteStream: WriteStream + } + + function ReadStream (path, options) { + if (!(this instanceof ReadStream)) return new ReadStream(path, options); + + Stream.call(this); + + var self = this; + + this.path = path; + this.fd = null; + this.readable = true; + this.paused = false; + + this.flags = 'r'; + this.mode = 438; /*=0666*/ + this.bufferSize = 64 * 1024; + + options = options || {}; + + // Mixin options into this + var keys = Object.keys(options); + for (var index = 0, length = keys.length; index < length; index++) { + var key = keys[index]; + this[key] = options[key]; + } + + if (this.encoding) this.setEncoding(this.encoding); + + if (this.start !== undefined) { + if ('number' !== typeof this.start) { + throw TypeError('start must be a Number'); + } + if (this.end === undefined) { + this.end = Infinity; + } else if ('number' !== typeof this.end) { + throw TypeError('end must be a Number'); + } + + if (this.start > this.end) { + throw new Error('start must be <= end'); + } + + this.pos = this.start; + } + + if (this.fd !== null) { + process.nextTick(function() { + self._read(); + }); + return; + } + + fs.open(this.path, this.flags, this.mode, function (err, fd) { + if (err) { + self.emit('error', err); + self.readable = false; + return; + } + + self.fd = fd; + self.emit('open', fd); + self._read(); + }) + } + + function WriteStream (path, options) { + if (!(this instanceof WriteStream)) return new WriteStream(path, options); + + Stream.call(this); + + this.path = path; + this.fd = null; + this.writable = true; + + this.flags = 'w'; + this.encoding = 'binary'; + this.mode = 438; /*=0666*/ + this.bytesWritten = 0; + + options = options || {}; + + // Mixin options into this + var keys = Object.keys(options); + for (var index = 0, length = keys.length; index < length; index++) { + var key = keys[index]; + this[key] = options[key]; + } + + if (this.start !== undefined) { + if ('number' !== typeof this.start) { + throw TypeError('start must be a Number'); + } + if (this.start < 0) { + throw new Error('start must be >= zero'); + } + + this.pos = this.start; + } + + this.busy = false; + this._queue = []; + + if (this.fd === null) { + this._open = fs.open; + this._queue.push([this._open, this.path, this.flags, this.mode, undefined]); + this.flush(); + } + } +} diff --git a/node_modules/graceful-fs/package.json b/node_modules/graceful-fs/package.json new file mode 100644 index 000000000..a682a71bb --- /dev/null +++ b/node_modules/graceful-fs/package.json @@ -0,0 +1,79 @@ +{ + "_from": "graceful-fs@^4.1.2", + "_id": "graceful-fs@4.2.8", + 
"_inBundle": false, + "_integrity": "sha512-qkIilPUYcNhJpd33n0GBXTB1MMPp14TxEsEs0pTrsSVucApsYzW5V+Q8Qxhik6KU3evy+qkAAowTByymK0avdg==", + "_location": "/graceful-fs", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "graceful-fs@^4.1.2", + "name": "graceful-fs", + "escapedName": "graceful-fs", + "rawSpec": "^4.1.2", + "saveSpec": null, + "fetchSpec": "^4.1.2" + }, + "_requiredBy": [ + "/configstore" + ], + "_resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.8.tgz", + "_shasum": "e412b8d33f5e006593cbd3cee6df9f2cebbe802a", + "_spec": "graceful-fs@^4.1.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\configstore", + "bugs": { + "url": "https://github.com/isaacs/node-graceful-fs/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "A drop-in replacement for fs, making various improvements.", + "devDependencies": { + "import-fresh": "^2.0.0", + "mkdirp": "^0.5.0", + "rimraf": "^2.2.8", + "tap": "^12.7.0" + }, + "directories": { + "test": "test" + }, + "files": [ + "fs.js", + "graceful-fs.js", + "legacy-streams.js", + "polyfills.js", + "clone.js" + ], + "homepage": "https://github.com/isaacs/node-graceful-fs#readme", + "keywords": [ + "fs", + "module", + "reading", + "retry", + "retries", + "queue", + "error", + "errors", + "handling", + "EMFILE", + "EAGAIN", + "EINVAL", + "EPERM", + "EACCESS" + ], + "license": "ISC", + "main": "graceful-fs.js", + "name": "graceful-fs", + "repository": { + "type": "git", + "url": "git+https://github.com/isaacs/node-graceful-fs.git" + }, + "scripts": { + "postpublish": "git push origin --follow-tags", + "posttest": "nyc report", + "postversion": "npm publish", + "preversion": "npm test", + "test": "nyc --silent node test.js | tap -c -" + }, + "version": "4.2.8" +} diff --git a/node_modules/graceful-fs/polyfills.js b/node_modules/graceful-fs/polyfills.js new file mode 100644 index 000000000..1287da1aa --- /dev/null +++ b/node_modules/graceful-fs/polyfills.js @@ -0,0 +1,346 @@ +var constants = require('constants') + +var origCwd = process.cwd +var cwd = null + +var platform = process.env.GRACEFUL_FS_PLATFORM || process.platform + +process.cwd = function() { + if (!cwd) + cwd = origCwd.call(process) + return cwd +} +try { + process.cwd() +} catch (er) {} + +// This check is needed until node.js 12 is required +if (typeof process.chdir === 'function') { + var chdir = process.chdir + process.chdir = function (d) { + cwd = null + chdir.call(process, d) + } + if (Object.setPrototypeOf) Object.setPrototypeOf(process.chdir, chdir) +} + +module.exports = patch + +function patch (fs) { + // (re-)implement some things that are known busted or missing. + + // lchmod, broken prior to 0.6.2 + // back-port the fix here. + if (constants.hasOwnProperty('O_SYMLINK') && + process.version.match(/^v0\.6\.[0-2]|^v0\.5\./)) { + patchLchmod(fs) + } + + // lutimes implementation, or no-op + if (!fs.lutimes) { + patchLutimes(fs) + } + + // https://github.com/isaacs/node-graceful-fs/issues/4 + // Chown should not fail on einval or eperm if non-root. + // It should not fail on enosys ever, as this just indicates + // that a fs doesn't support the intended operation. 
+ + fs.chown = chownFix(fs.chown) + fs.fchown = chownFix(fs.fchown) + fs.lchown = chownFix(fs.lchown) + + fs.chmod = chmodFix(fs.chmod) + fs.fchmod = chmodFix(fs.fchmod) + fs.lchmod = chmodFix(fs.lchmod) + + fs.chownSync = chownFixSync(fs.chownSync) + fs.fchownSync = chownFixSync(fs.fchownSync) + fs.lchownSync = chownFixSync(fs.lchownSync) + + fs.chmodSync = chmodFixSync(fs.chmodSync) + fs.fchmodSync = chmodFixSync(fs.fchmodSync) + fs.lchmodSync = chmodFixSync(fs.lchmodSync) + + fs.stat = statFix(fs.stat) + fs.fstat = statFix(fs.fstat) + fs.lstat = statFix(fs.lstat) + + fs.statSync = statFixSync(fs.statSync) + fs.fstatSync = statFixSync(fs.fstatSync) + fs.lstatSync = statFixSync(fs.lstatSync) + + // if lchmod/lchown do not exist, then make them no-ops + if (!fs.lchmod) { + fs.lchmod = function (path, mode, cb) { + if (cb) process.nextTick(cb) + } + fs.lchmodSync = function () {} + } + if (!fs.lchown) { + fs.lchown = function (path, uid, gid, cb) { + if (cb) process.nextTick(cb) + } + fs.lchownSync = function () {} + } + + // on Windows, A/V software can lock the directory, causing this + // to fail with an EACCES or EPERM if the directory contains newly + // created files. Try again on failure, for up to 60 seconds. + + // Set the timeout this long because some Windows Anti-Virus, such as Parity + // bit9, may lock files for up to a minute, causing npm package install + // failures. Also, take care to yield the scheduler. Windows scheduling gives + // CPU to a busy looping process, which can cause the program causing the lock + // contention to be starved of CPU by node, so the contention doesn't resolve. + if (platform === "win32") { + fs.rename = (function (fs$rename) { return function (from, to, cb) { + var start = Date.now() + var backoff = 0; + fs$rename(from, to, function CB (er) { + if (er + && (er.code === "EACCES" || er.code === "EPERM") + && Date.now() - start < 60000) { + setTimeout(function() { + fs.stat(to, function (stater, st) { + if (stater && stater.code === "ENOENT") + fs$rename(from, to, CB); + else + cb(er) + }) + }, backoff) + if (backoff < 100) + backoff += 10; + return; + } + if (cb) cb(er) + }) + }})(fs.rename) + } + + // if read() returns EAGAIN, then just try it again. + fs.read = (function (fs$read) { + function read (fd, buffer, offset, length, position, callback_) { + var callback + if (callback_ && typeof callback_ === 'function') { + var eagCounter = 0 + callback = function (er, _, __) { + if (er && er.code === 'EAGAIN' && eagCounter < 10) { + eagCounter ++ + return fs$read.call(fs, fd, buffer, offset, length, position, callback) + } + callback_.apply(this, arguments) + } + } + return fs$read.call(fs, fd, buffer, offset, length, position, callback) + } + + // This ensures `util.promisify` works as it does for native `fs.read`. 
+ if (Object.setPrototypeOf) Object.setPrototypeOf(read, fs$read) + return read + })(fs.read) + + fs.readSync = (function (fs$readSync) { return function (fd, buffer, offset, length, position) { + var eagCounter = 0 + while (true) { + try { + return fs$readSync.call(fs, fd, buffer, offset, length, position) + } catch (er) { + if (er.code === 'EAGAIN' && eagCounter < 10) { + eagCounter ++ + continue + } + throw er + } + } + }})(fs.readSync) + + function patchLchmod (fs) { + fs.lchmod = function (path, mode, callback) { + fs.open( path + , constants.O_WRONLY | constants.O_SYMLINK + , mode + , function (err, fd) { + if (err) { + if (callback) callback(err) + return + } + // prefer to return the chmod error, if one occurs, + // but still try to close, and report closing errors if they occur. + fs.fchmod(fd, mode, function (err) { + fs.close(fd, function(err2) { + if (callback) callback(err || err2) + }) + }) + }) + } + + fs.lchmodSync = function (path, mode) { + var fd = fs.openSync(path, constants.O_WRONLY | constants.O_SYMLINK, mode) + + // prefer to return the chmod error, if one occurs, + // but still try to close, and report closing errors if they occur. + var threw = true + var ret + try { + ret = fs.fchmodSync(fd, mode) + threw = false + } finally { + if (threw) { + try { + fs.closeSync(fd) + } catch (er) {} + } else { + fs.closeSync(fd) + } + } + return ret + } + } + + function patchLutimes (fs) { + if (constants.hasOwnProperty("O_SYMLINK")) { + fs.lutimes = function (path, at, mt, cb) { + fs.open(path, constants.O_SYMLINK, function (er, fd) { + if (er) { + if (cb) cb(er) + return + } + fs.futimes(fd, at, mt, function (er) { + fs.close(fd, function (er2) { + if (cb) cb(er || er2) + }) + }) + }) + } + + fs.lutimesSync = function (path, at, mt) { + var fd = fs.openSync(path, constants.O_SYMLINK) + var ret + var threw = true + try { + ret = fs.futimesSync(fd, at, mt) + threw = false + } finally { + if (threw) { + try { + fs.closeSync(fd) + } catch (er) {} + } else { + fs.closeSync(fd) + } + } + return ret + } + + } else { + fs.lutimes = function (_a, _b, _c, cb) { if (cb) process.nextTick(cb) } + fs.lutimesSync = function () {} + } + } + + function chmodFix (orig) { + if (!orig) return orig + return function (target, mode, cb) { + return orig.call(fs, target, mode, function (er) { + if (chownErOk(er)) er = null + if (cb) cb.apply(this, arguments) + }) + } + } + + function chmodFixSync (orig) { + if (!orig) return orig + return function (target, mode) { + try { + return orig.call(fs, target, mode) + } catch (er) { + if (!chownErOk(er)) throw er + } + } + } + + + function chownFix (orig) { + if (!orig) return orig + return function (target, uid, gid, cb) { + return orig.call(fs, target, uid, gid, function (er) { + if (chownErOk(er)) er = null + if (cb) cb.apply(this, arguments) + }) + } + } + + function chownFixSync (orig) { + if (!orig) return orig + return function (target, uid, gid) { + try { + return orig.call(fs, target, uid, gid) + } catch (er) { + if (!chownErOk(er)) throw er + } + } + } + + function statFix (orig) { + if (!orig) return orig + // Older versions of Node erroneously returned signed integers for + // uid + gid. + return function (target, options, cb) { + if (typeof options === 'function') { + cb = options + options = null + } + function callback (er, stats) { + if (stats) { + if (stats.uid < 0) stats.uid += 0x100000000 + if (stats.gid < 0) stats.gid += 0x100000000 + } + if (cb) cb.apply(this, arguments) + } + return options ? 
orig.call(fs, target, options, callback) + : orig.call(fs, target, callback) + } + } + + function statFixSync (orig) { + if (!orig) return orig + // Older versions of Node erroneously returned signed integers for + // uid + gid. + return function (target, options) { + var stats = options ? orig.call(fs, target, options) + : orig.call(fs, target) + if (stats.uid < 0) stats.uid += 0x100000000 + if (stats.gid < 0) stats.gid += 0x100000000 + return stats; + } + } + + // ENOSYS means that the fs doesn't support the op. Just ignore + // that, because it doesn't matter. + // + // if there's no getuid, or if getuid() is something other + // than 0, and the error is EINVAL or EPERM, then just ignore + // it. + // + // This specific case is a silent failure in cp, install, tar, + // and most other unix tools that manage permissions. + // + // When running as root, or if other types of errors are + // encountered, then it's strict. + function chownErOk (er) { + if (!er) + return true + + if (er.code === "ENOSYS") + return true + + var nonroot = !process.getuid || process.getuid() !== 0 + if (nonroot) { + if (er.code === "EINVAL" || er.code === "EPERM") + return true + } + + return false + } +} diff --git a/node_modules/has-flag/index.js b/node_modules/has-flag/index.js new file mode 100644 index 000000000..5139728fb --- /dev/null +++ b/node_modules/has-flag/index.js @@ -0,0 +1,8 @@ +'use strict'; +module.exports = (flag, argv) => { + argv = argv || process.argv; + const prefix = flag.startsWith('-') ? '' : (flag.length === 1 ? '-' : '--'); + const pos = argv.indexOf(prefix + flag); + const terminatorPos = argv.indexOf('--'); + return pos !== -1 && (terminatorPos === -1 ? true : pos < terminatorPos); +}; diff --git a/node_modules/has-flag/license b/node_modules/has-flag/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/has-flag/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
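A minimal usage sketch of the `has-flag` module vendored above; the `argv` array is invented for illustration and is not part of this project:

```js
// Illustrative sketch of the vendored has-flag behaviour; argv values are invented for the example.
const hasFlag = require('has-flag');

// Simulated CLI arguments; has-flag stops matching at the `--` terminator.
const argv = ['--verbose', '-f', '--', '--ignored'];

console.log(hasFlag('verbose', argv)); // true  (the `--` prefix on the flag name is optional)
console.log(hasFlag('f', argv));       // true  (single-character flags resolve to `-f`)
console.log(hasFlag('ignored', argv)); // false (it only appears after the `--` terminator)
```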
diff --git a/node_modules/has-flag/package.json b/node_modules/has-flag/package.json new file mode 100644 index 000000000..b9ba8ee00 --- /dev/null +++ b/node_modules/has-flag/package.json @@ -0,0 +1,76 @@ +{ + "_from": "has-flag@^3.0.0", + "_id": "has-flag@3.0.0", + "_inBundle": false, + "_integrity": "sha1-tdRU3CGZriJWmfNGfloH87lVuv0=", + "_location": "/has-flag", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "has-flag@^3.0.0", + "name": "has-flag", + "escapedName": "has-flag", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/supports-color" + ], + "_resolved": "https://registry.npmjs.org/has-flag/-/has-flag-3.0.0.tgz", + "_shasum": "b5d454dc2199ae225699f3467e5a07f3b955bafd", + "_spec": "has-flag@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\supports-color", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/has-flag/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if argv has a specific flag", + "devDependencies": { + "ava": "*", + "xo": "*" + }, + "engines": { + "node": ">=4" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/has-flag#readme", + "keywords": [ + "has", + "check", + "detect", + "contains", + "find", + "flag", + "cli", + "command-line", + "argv", + "process", + "arg", + "args", + "argument", + "arguments", + "getopt", + "minimist", + "optimist" + ], + "license": "MIT", + "name": "has-flag", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/has-flag.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "3.0.0" +} diff --git a/node_modules/has-flag/readme.md b/node_modules/has-flag/readme.md new file mode 100644 index 000000000..677893c27 --- /dev/null +++ b/node_modules/has-flag/readme.md @@ -0,0 +1,70 @@ +# has-flag [![Build Status](https://travis-ci.org/sindresorhus/has-flag.svg?branch=master)](https://travis-ci.org/sindresorhus/has-flag) + +> Check if [`argv`](https://nodejs.org/docs/latest/api/process.html#process_process_argv) has a specific flag + +Correctly stops looking after an `--` argument terminator. + + +## Install + +``` +$ npm install has-flag +``` + + +## Usage + +```js +// foo.js +const hasFlag = require('has-flag'); + +hasFlag('unicorn'); +//=> true + +hasFlag('--unicorn'); +//=> true + +hasFlag('f'); +//=> true + +hasFlag('-f'); +//=> true + +hasFlag('foo=bar'); +//=> true + +hasFlag('foo'); +//=> false + +hasFlag('rainbow'); +//=> false +``` + +``` +$ node foo.js -f --unicorn --foo=bar -- --rainbow +``` + + +## API + +### hasFlag(flag, [argv]) + +Returns a boolean for whether the flag exists. + +#### flag + +Type: `string` + +CLI flag to look for. The `--` prefix is optional. + +#### argv + +Type: `string[]`
+Default: `process.argv` + +CLI arguments. + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/has-yarn/index.d.ts b/node_modules/has-yarn/index.d.ts new file mode 100644 index 000000000..5e2af4c93 --- /dev/null +++ b/node_modules/has-yarn/index.d.ts @@ -0,0 +1,16 @@ +declare const hasYarn: { + /** + * Check if a project is using [Yarn](https://yarnpkg.com). + * + * @param cwd - Current working directory. Default: `process.cwd()`. + * @returns Whether the project uses Yarn. + */ + (cwd?: string): boolean; + + // TODO: Remove this for the next major release, refactor the whole definition to: + // declare function hasYarn(cwd?: string): boolean; + // export = hasYarn; + default: typeof hasYarn; +}; + +export = hasYarn; diff --git a/node_modules/has-yarn/index.js b/node_modules/has-yarn/index.js new file mode 100644 index 000000000..a1f4eed86 --- /dev/null +++ b/node_modules/has-yarn/index.js @@ -0,0 +1,9 @@ +'use strict'; +const path = require('path'); +const fs = require('fs'); + +const hasYarn = (cwd = process.cwd()) => fs.existsSync(path.resolve(cwd, 'yarn.lock')); + +module.exports = hasYarn; +// TODO: Remove this for the next major release +module.exports.default = hasYarn; diff --git a/node_modules/has-yarn/license b/node_modules/has-yarn/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/has-yarn/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
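A minimal sketch of how the `has-yarn` check vendored above might be used to pick a package manager, in line with its README's stated purpose; the project path below is a placeholder, not a path from this repository:

```js
// Illustrative sketch only: decide between yarn and npm for a given project directory.
const hasYarn = require('has-yarn');

const projectDir = '/path/to/some/project'; // placeholder path for the example
const usesYarn = hasYarn(projectDir);       // true if a yarn.lock exists in that directory

const installCommand = usesYarn ? 'yarn install' : 'npm install';
console.log(installCommand);
```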
diff --git a/node_modules/has-yarn/package.json b/node_modules/has-yarn/package.json new file mode 100644 index 000000000..e92d5e38f --- /dev/null +++ b/node_modules/has-yarn/package.json @@ -0,0 +1,71 @@ +{ + "_from": "has-yarn@^2.1.0", + "_id": "has-yarn@2.1.0", + "_inBundle": false, + "_integrity": "sha512-UqBRqi4ju7T+TqGNdqAO0PaSVGsDGJUBQvk9eUWNGRY1CFGDzYhLWoM7JQEemnlvVcv/YEmc2wNW8BC24EnUsw==", + "_location": "/has-yarn", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "has-yarn@^2.1.0", + "name": "has-yarn", + "escapedName": "has-yarn", + "rawSpec": "^2.1.0", + "saveSpec": null, + "fetchSpec": "^2.1.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/has-yarn/-/has-yarn-2.1.0.tgz", + "_shasum": "137e11354a7b5bf11aa5cb649cf0c6f3ff2b2e77", + "_spec": "has-yarn@^2.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/has-yarn/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if a project is using Yarn", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.1", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/has-yarn#readme", + "keywords": [ + "yarn", + "has", + "detect", + "is", + "project", + "app", + "module", + "package", + "manager", + "npm" + ], + "license": "MIT", + "name": "has-yarn", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/has-yarn.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "2.1.0" +} diff --git a/node_modules/has-yarn/readme.md b/node_modules/has-yarn/readme.md new file mode 100644 index 000000000..0315c20c2 --- /dev/null +++ b/node_modules/has-yarn/readme.md @@ -0,0 +1,60 @@ +# has-yarn [![Build Status](https://travis-ci.org/sindresorhus/has-yarn.svg?branch=master)](https://travis-ci.org/sindresorhus/has-yarn) + +> Check if a project is using [Yarn](https://yarnpkg.com) + +Useful for tools that needs to know whether to use `yarn` or `npm` to install dependencies. + +It checks if a `yarn.lock` file is present in the working directory. + + +## Install + +``` +$ npm install has-yarn +``` + + +## Usage + +``` +. +├── foo +│   └── package.json +└── bar + ├── package.json + └── yarn.lock +``` + +```js +const hasYarn = require('has-yarn'); + +hasYarn('foo'); +//=> false + +hasYarn('bar'); +//=> true +``` + + +## API + +### hasYarn([cwd]) + +Returns a `boolean` of whether the project uses Yarn. + +#### cwd + +Type: `string`
+Default: `process.cwd()` + +Current working directory. + + +## Related + +- [has-yarn-cli](https://github.com/sindresorhus/has-yarn-cli) - CLI for this module + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/http-cache-semantics/LICENSE b/node_modules/http-cache-semantics/LICENSE new file mode 100644 index 000000000..493d2ea29 --- /dev/null +++ b/node_modules/http-cache-semantics/LICENSE @@ -0,0 +1,9 @@ +Copyright 2016-2018 Kornel Lesiński + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/http-cache-semantics/README.md b/node_modules/http-cache-semantics/README.md new file mode 100644 index 000000000..685aa55dd --- /dev/null +++ b/node_modules/http-cache-semantics/README.md @@ -0,0 +1,203 @@ +# Can I cache this? [![Build Status](https://travis-ci.org/kornelski/http-cache-semantics.svg?branch=master)](https://travis-ci.org/kornelski/http-cache-semantics) + +`CachePolicy` tells when responses can be reused from a cache, taking into account [HTTP RFC 7234](http://httpwg.org/specs/rfc7234.html) rules for user agents and shared caches. +It also implements [RFC 5861](https://tools.ietf.org/html/rfc5861), implementing `stale-if-error` and `stale-while-revalidate`. +It's aware of many tricky details such as the `Vary` header, proxy revalidation, and authenticated responses. + +## Usage + +Cacheability of an HTTP response depends on how it was requested, so both `request` and `response` are required to create the policy. + +```js +const policy = new CachePolicy(request, response, options); + +if (!policy.storable()) { + // throw the response away, it's not usable at all + return; +} + +// Cache the data AND the policy object in your cache +// (this is pseudocode, roll your own cache (lru-cache package works)) +letsPretendThisIsSomeCache.set( + request.url, + { policy, response }, + policy.timeToLive() +); +``` + +```js +// And later, when you receive a new request: +const { policy, response } = letsPretendThisIsSomeCache.get(newRequest.url); + +// It's not enough that it exists in the cache, it has to match the new request, too: +if (policy && policy.satisfiesWithoutRevalidation(newRequest)) { + // OK, the previous response can be used to respond to the `newRequest`. + // Response headers have to be updated, e.g. to add Age and remove uncacheable headers. 
+ response.headers = policy.responseHeaders(); + return response; +} +``` + +It may be surprising, but it's not enough for an HTTP response to be [fresh](#yo-fresh) to satisfy a request. It may need to match request headers specified in `Vary`. Even a matching fresh response may still not be usable if the new request restricted cacheability, etc. + +The key method is `satisfiesWithoutRevalidation(newRequest)`, which checks whether the `newRequest` is compatible with the original request and whether all caching conditions are met. + +### Constructor options + +Request and response must have a `headers` property with all header names in lower case. `url`, `status` and `method` are optional (defaults are any URL, status `200`, and `GET` method). + +```js +const request = { + url: '/', + method: 'GET', + headers: { + accept: '*/*', + }, +}; + +const response = { + status: 200, + headers: { + 'cache-control': 'public, max-age=7234', + }, +}; + +const options = { + shared: true, + cacheHeuristic: 0.1, + immutableMinTimeToLive: 24 * 3600 * 1000, // 24h + ignoreCargoCult: false, +}; +``` + +If `options.shared` is `true` (default), then the response is evaluated from a perspective of a shared cache (i.e. `private` is not cacheable and `s-maxage` is respected). If `options.shared` is `false`, then the response is evaluated from a perspective of a single-user cache (i.e. `private` is cacheable and `s-maxage` is ignored). `shared: true` is recommended for HTTP clients. + +`options.cacheHeuristic` is a fraction of response's age that is used as a fallback cache duration. The default is 0.1 (10%), e.g. if a file hasn't been modified for 100 days, it'll be cached for 100\*0.1 = 10 days. + +`options.immutableMinTimeToLive` is a number of milliseconds to assume as the default time to cache responses with `Cache-Control: immutable`. Note that [per RFC](http://httpwg.org/http-extensions/immutable.html) these can become stale, so `max-age` still overrides the default. + +If `options.ignoreCargoCult` is true, common anti-cache directives will be completely ignored if the non-standard `pre-check` and `post-check` directives are present. These two useless directives are most commonly found in bad StackOverflow answers and PHP's "session limiter" defaults. + +### `storable()` + +Returns `true` if the response can be stored in a cache. If it's `false` then you MUST NOT store either the request or the response. + +### `satisfiesWithoutRevalidation(newRequest)` + +This is the most important method. Use this method to check whether the cached response is still fresh in the context of the new request. + +If it returns `true`, then the given `request` matches the original response this cache policy has been created with, and the response can be reused without contacting the server. Note that the old response can't be returned without being updated, see `responseHeaders()`. + +If it returns `false`, then the response may not be matching at all (e.g. it's for a different URL or method), or may require to be refreshed first (see `revalidationHeaders()`). + +### `responseHeaders()` + +Returns updated, filtered set of response headers to return to clients receiving the cached response. This function is necessary, because proxies MUST always remove hop-by-hop headers (such as `TE` and `Connection`) and update response's `Age` to avoid doubling cache time. 
+ +```js +cachedResponse.headers = cachePolicy.responseHeaders(cachedResponse); +``` + +### `timeToLive()` + +Returns approximate time in _milliseconds_ until the response becomes stale (i.e. not fresh). + +After that time (when `timeToLive() <= 0`) the response might not be usable without revalidation. However, there are exceptions, e.g. a client can explicitly allow stale responses, so always check with `satisfiesWithoutRevalidation()`. +`stale-if-error` and `stale-while-revalidate` extend the time to live of the cache, that can still be used if stale. + +### `toObject()`/`fromObject(json)` + +Chances are you'll want to store the `CachePolicy` object along with the cached response. `obj = policy.toObject()` gives a plain JSON-serializable object. `policy = CachePolicy.fromObject(obj)` creates an instance from it. + +### Refreshing stale cache (revalidation) + +When a cached response has expired, it can be made fresh again by making a request to the origin server. The server may respond with status 304 (Not Modified) without sending the response body again, saving bandwidth. + +The following methods help perform the update efficiently and correctly. + +#### `revalidationHeaders(newRequest)` + +Returns updated, filtered set of request headers to send to the origin server to check if the cached response can be reused. These headers allow the origin server to return status 304 indicating the response is still fresh. All headers unrelated to caching are passed through as-is. + +Use this method when updating cache from the origin server. + +```js +updateRequest.headers = cachePolicy.revalidationHeaders(updateRequest); +``` + +#### `revalidatedPolicy(revalidationRequest, revalidationResponse)` + +Use this method to update the cache after receiving a new response from the origin server. It returns an object with two keys: + +- `policy` — A new `CachePolicy` with HTTP headers updated from `revalidationResponse`. You can always replace the old cached `CachePolicy` with the new one. +- `modified` — Boolean indicating whether the response body has changed. + - If `false`, then a valid 304 Not Modified response has been received, and you can reuse the old cached response body. This is also affected by `stale-if-error`. + - If `true`, you should use new response's body (if present), or make another request to the origin server without any conditional headers (i.e. don't use `revalidationHeaders()` this time) to get the new resource. + +```js +// When serving requests from cache: +const { oldPolicy, oldResponse } = letsPretendThisIsSomeCache.get( + newRequest.url +); + +if (!oldPolicy.satisfiesWithoutRevalidation(newRequest)) { + // Change the request to ask the origin server if the cached response can be used + newRequest.headers = oldPolicy.revalidationHeaders(newRequest); + + // Send request to the origin server. The server may respond with status 304 + const newResponse = await makeRequest(newRequest); + + // Create updated policy and combined response from the old and new data + const { policy, modified } = oldPolicy.revalidatedPolicy( + newRequest, + newResponse + ); + const response = modified ? 
newResponse : oldResponse; + + // Update the cache with the newer/fresher response + letsPretendThisIsSomeCache.set( + newRequest.url, + { policy, response }, + policy.timeToLive() + ); + + // And proceed returning cached response as usual + response.headers = policy.responseHeaders(); + return response; +} +``` + +# Yo, FRESH + +![satisfiesWithoutRevalidation](fresh.jpg) + +## Used by + +- [ImageOptim API](https://imageoptim.com/api), [make-fetch-happen](https://github.com/zkat/make-fetch-happen), [cacheable-request](https://www.npmjs.com/package/cacheable-request) ([got](https://www.npmjs.com/package/got)), [npm/registry-fetch](https://github.com/npm/registry-fetch), [etc.](https://github.com/kornelski/http-cache-semantics/network/dependents) + +## Implemented + +- `Cache-Control` response header with all the quirks. +- `Expires` with check for bad clocks. +- `Pragma` response header. +- `Age` response header. +- `Vary` response header. +- Default cacheability of statuses and methods. +- Requests for stale data. +- Filtering of hop-by-hop headers. +- Basic revalidation request +- `stale-if-error` + +## Unimplemented + +- Merging of range requests, `If-Range` (but correctly supports them as non-cacheable) +- Revalidation of multiple representations + +### Trusting server `Date` + +Per the RFC, the cache should take into account the time between server-supplied `Date` and the time it received the response. The RFC-mandated behavior creates two problems: + + * Servers with incorrectly set timezone may add several hours to cache age (or more, if the clock is completely wrong). + * Even reasonably correct clocks may be off by a couple of seconds, breaking `max-age=1` trick (which is useful for reverse proxies on high-traffic servers). + +Previous versions of this library had an option to ignore the server date if it was "too inaccurate". To support the `max-age=1` trick the library also has to ignore dates that pretty accurate. There's no point of having an option to trust dates that are only a bit inaccurate, so this library won't trust any server dates. `max-age` will be interpreted from the time the response has been received, not from when it has been sent. This will affect only [RFC 1149 networks](https://tools.ietf.org/html/rfc1149). diff --git a/node_modules/http-cache-semantics/index.js b/node_modules/http-cache-semantics/index.js new file mode 100644 index 000000000..4f6c2f304 --- /dev/null +++ b/node_modules/http-cache-semantics/index.js @@ -0,0 +1,673 @@ +'use strict'; +// rfc7231 6.1 +const statusCodeCacheableByDefault = new Set([ + 200, + 203, + 204, + 206, + 300, + 301, + 404, + 405, + 410, + 414, + 501, +]); + +// This implementation does not understand partial responses (206) +const understoodStatuses = new Set([ + 200, + 203, + 204, + 300, + 301, + 302, + 303, + 307, + 308, + 404, + 405, + 410, + 414, + 501, +]); + +const errorStatusCodes = new Set([ + 500, + 502, + 503, + 504, +]); + +const hopByHopHeaders = { + date: true, // included, because we add Age update Date + connection: true, + 'keep-alive': true, + 'proxy-authenticate': true, + 'proxy-authorization': true, + te: true, + trailer: true, + 'transfer-encoding': true, + upgrade: true, +}; + +const excludedFromRevalidationUpdate = { + // Since the old body is reused, it doesn't make sense to change properties of the body + 'content-length': true, + 'content-encoding': true, + 'transfer-encoding': true, + 'content-range': true, +}; + +function toNumberOrZero(s) { + const n = parseInt(s, 10); + return isFinite(n) ? 
n : 0; +} + +// RFC 5861 +function isErrorResponse(response) { + // consider undefined response as faulty + if(!response) { + return true + } + return errorStatusCodes.has(response.status); +} + +function parseCacheControl(header) { + const cc = {}; + if (!header) return cc; + + // TODO: When there is more than one value present for a given directive (e.g., two Expires header fields, multiple Cache-Control: max-age directives), + // the directive's value is considered invalid. Caches are encouraged to consider responses that have invalid freshness information to be stale + const parts = header.trim().split(/\s*,\s*/); // TODO: lame parsing + for (const part of parts) { + const [k, v] = part.split(/\s*=\s*/, 2); + cc[k] = v === undefined ? true : v.replace(/^"|"$/g, ''); // TODO: lame unquoting + } + + return cc; +} + +function formatCacheControl(cc) { + let parts = []; + for (const k in cc) { + const v = cc[k]; + parts.push(v === true ? k : k + '=' + v); + } + if (!parts.length) { + return undefined; + } + return parts.join(', '); +} + +module.exports = class CachePolicy { + constructor( + req, + res, + { + shared, + cacheHeuristic, + immutableMinTimeToLive, + ignoreCargoCult, + _fromObject, + } = {} + ) { + if (_fromObject) { + this._fromObject(_fromObject); + return; + } + + if (!res || !res.headers) { + throw Error('Response headers missing'); + } + this._assertRequestHasHeaders(req); + + this._responseTime = this.now(); + this._isShared = shared !== false; + this._cacheHeuristic = + undefined !== cacheHeuristic ? cacheHeuristic : 0.1; // 10% matches IE + this._immutableMinTtl = + undefined !== immutableMinTimeToLive + ? immutableMinTimeToLive + : 24 * 3600 * 1000; + + this._status = 'status' in res ? res.status : 200; + this._resHeaders = res.headers; + this._rescc = parseCacheControl(res.headers['cache-control']); + this._method = 'method' in req ? req.method : 'GET'; + this._url = req.url; + this._host = req.headers.host; + this._noAuthorization = !req.headers.authorization; + this._reqHeaders = res.headers.vary ? req.headers : null; // Don't keep all request headers if they won't be used + this._reqcc = parseCacheControl(req.headers['cache-control']); + + // Assume that if someone uses legacy, non-standard uncecessary options they don't understand caching, + // so there's no point stricly adhering to the blindly copy&pasted directives. + if ( + ignoreCargoCult && + 'pre-check' in this._rescc && + 'post-check' in this._rescc + ) { + delete this._rescc['pre-check']; + delete this._rescc['post-check']; + delete this._rescc['no-cache']; + delete this._rescc['no-store']; + delete this._rescc['must-revalidate']; + this._resHeaders = Object.assign({}, this._resHeaders, { + 'cache-control': formatCacheControl(this._rescc), + }); + delete this._resHeaders.expires; + delete this._resHeaders.pragma; + } + + // When the Cache-Control header field is not present in a request, caches MUST consider the no-cache request pragma-directive + // as having the same effect as if "Cache-Control: no-cache" were present (see Section 5.2.1). + if ( + res.headers['cache-control'] == null && + /no-cache/.test(res.headers.pragma) + ) { + this._rescc['no-cache'] = true; + } + } + + now() { + return Date.now(); + } + + storable() { + // The "no-store" request directive indicates that a cache MUST NOT store any part of either this request or any response to it. 
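+        // Every condition chained below must hold for the response to be storable;
+        // the double negation only coerces the result into a strict boolean.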
+ return !!( + !this._reqcc['no-store'] && + // A cache MUST NOT store a response to any request, unless: + // The request method is understood by the cache and defined as being cacheable, and + ('GET' === this._method || + 'HEAD' === this._method || + ('POST' === this._method && this._hasExplicitExpiration())) && + // the response status code is understood by the cache, and + understoodStatuses.has(this._status) && + // the "no-store" cache directive does not appear in request or response header fields, and + !this._rescc['no-store'] && + // the "private" response directive does not appear in the response, if the cache is shared, and + (!this._isShared || !this._rescc.private) && + // the Authorization header field does not appear in the request, if the cache is shared, + (!this._isShared || + this._noAuthorization || + this._allowsStoringAuthenticated()) && + // the response either: + // contains an Expires header field, or + (this._resHeaders.expires || + // contains a max-age response directive, or + // contains a s-maxage response directive and the cache is shared, or + // contains a public response directive. + this._rescc['max-age'] || + (this._isShared && this._rescc['s-maxage']) || + this._rescc.public || + // has a status code that is defined as cacheable by default + statusCodeCacheableByDefault.has(this._status)) + ); + } + + _hasExplicitExpiration() { + // 4.2.1 Calculating Freshness Lifetime + return ( + (this._isShared && this._rescc['s-maxage']) || + this._rescc['max-age'] || + this._resHeaders.expires + ); + } + + _assertRequestHasHeaders(req) { + if (!req || !req.headers) { + throw Error('Request headers missing'); + } + } + + satisfiesWithoutRevalidation(req) { + this._assertRequestHasHeaders(req); + + // When presented with a request, a cache MUST NOT reuse a stored response, unless: + // the presented request does not contain the no-cache pragma (Section 5.4), nor the no-cache cache directive, + // unless the stored response is successfully validated (Section 4.3), and + const requestCC = parseCacheControl(req.headers['cache-control']); + if (requestCC['no-cache'] || /no-cache/.test(req.headers.pragma)) { + return false; + } + + if (requestCC['max-age'] && this.age() > requestCC['max-age']) { + return false; + } + + if ( + requestCC['min-fresh'] && + this.timeToLive() < 1000 * requestCC['min-fresh'] + ) { + return false; + } + + // the stored response is either: + // fresh, or allowed to be served stale + if (this.stale()) { + const allowsStale = + requestCC['max-stale'] && + !this._rescc['must-revalidate'] && + (true === requestCC['max-stale'] || + requestCC['max-stale'] > this.age() - this.maxAge()); + if (!allowsStale) { + return false; + } + } + + return this._requestMatches(req, false); + } + + _requestMatches(req, allowHeadMethod) { + // The presented effective request URI and that of the stored response match, and + return ( + (!this._url || this._url === req.url) && + this._host === req.headers.host && + // the request method associated with the stored response allows it to be used for the presented request, and + (!req.method || + this._method === req.method || + (allowHeadMethod && 'HEAD' === req.method)) && + // selecting header fields nominated by the stored response (if any) match those presented, and + this._varyMatches(req) + ); + } + + _allowsStoringAuthenticated() { + // following Cache-Control response directives (Section 5.2.2) have such an effect: must-revalidate, public, and s-maxage. 
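+        // Directive values produced by parseCacheControl() are truthy whenever the
+        // directive is present (either `true` or the directive's string value).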
+ return ( + this._rescc['must-revalidate'] || + this._rescc.public || + this._rescc['s-maxage'] + ); + } + + _varyMatches(req) { + if (!this._resHeaders.vary) { + return true; + } + + // A Vary header field-value of "*" always fails to match + if (this._resHeaders.vary === '*') { + return false; + } + + const fields = this._resHeaders.vary + .trim() + .toLowerCase() + .split(/\s*,\s*/); + for (const name of fields) { + if (req.headers[name] !== this._reqHeaders[name]) return false; + } + return true; + } + + _copyWithoutHopByHopHeaders(inHeaders) { + const headers = {}; + for (const name in inHeaders) { + if (hopByHopHeaders[name]) continue; + headers[name] = inHeaders[name]; + } + // 9.1. Connection + if (inHeaders.connection) { + const tokens = inHeaders.connection.trim().split(/\s*,\s*/); + for (const name of tokens) { + delete headers[name]; + } + } + if (headers.warning) { + const warnings = headers.warning.split(/,/).filter(warning => { + return !/^\s*1[0-9][0-9]/.test(warning); + }); + if (!warnings.length) { + delete headers.warning; + } else { + headers.warning = warnings.join(',').trim(); + } + } + return headers; + } + + responseHeaders() { + const headers = this._copyWithoutHopByHopHeaders(this._resHeaders); + const age = this.age(); + + // A cache SHOULD generate 113 warning if it heuristically chose a freshness + // lifetime greater than 24 hours and the response's age is greater than 24 hours. + if ( + age > 3600 * 24 && + !this._hasExplicitExpiration() && + this.maxAge() > 3600 * 24 + ) { + headers.warning = + (headers.warning ? `${headers.warning}, ` : '') + + '113 - "rfc7234 5.5.4"'; + } + headers.age = `${Math.round(age)}`; + headers.date = new Date(this.now()).toUTCString(); + return headers; + } + + /** + * Value of the Date response header or current time if Date was invalid + * @return timestamp + */ + date() { + const serverDate = Date.parse(this._resHeaders.date); + if (isFinite(serverDate)) { + return serverDate; + } + return this._responseTime; + } + + /** + * Value of the Age header, in seconds, updated for the current time. + * May be fractional. + * + * @return Number + */ + age() { + let age = this._ageValue(); + + const residentTime = (this.now() - this._responseTime) / 1000; + return age + residentTime; + } + + _ageValue() { + return toNumberOrZero(this._resHeaders.age); + } + + /** + * Value of applicable max-age (or heuristic equivalent) in seconds. This counts since response's `Date`. + * + * For an up-to-date value, see `timeToLive()`. + * + * @return Number + */ + maxAge() { + if (!this.storable() || this._rescc['no-cache']) { + return 0; + } + + // Shared responses with cookies are cacheable according to the RFC, but IMHO it'd be unwise to do so by default + // so this implementation requires explicit opt-in via public header + if ( + this._isShared && + (this._resHeaders['set-cookie'] && + !this._rescc.public && + !this._rescc.immutable) + ) { + return 0; + } + + if (this._resHeaders.vary === '*') { + return 0; + } + + if (this._isShared) { + if (this._rescc['proxy-revalidate']) { + return 0; + } + // if a response includes the s-maxage directive, a shared cache recipient MUST ignore the Expires field. + if (this._rescc['s-maxage']) { + return toNumberOrZero(this._rescc['s-maxage']); + } + } + + // If a response includes a Cache-Control field with the max-age directive, a recipient MUST ignore the Expires field. + if (this._rescc['max-age']) { + return toNumberOrZero(this._rescc['max-age']); + } + + const defaultMinTtl = this._rescc.immutable ? 
this._immutableMinTtl : 0; + + const serverDate = this.date(); + if (this._resHeaders.expires) { + const expires = Date.parse(this._resHeaders.expires); + // A cache recipient MUST interpret invalid date formats, especially the value "0", as representing a time in the past (i.e., "already expired"). + if (Number.isNaN(expires) || expires < serverDate) { + return 0; + } + return Math.max(defaultMinTtl, (expires - serverDate) / 1000); + } + + if (this._resHeaders['last-modified']) { + const lastModified = Date.parse(this._resHeaders['last-modified']); + if (isFinite(lastModified) && serverDate > lastModified) { + return Math.max( + defaultMinTtl, + ((serverDate - lastModified) / 1000) * this._cacheHeuristic + ); + } + } + + return defaultMinTtl; + } + + timeToLive() { + const age = this.maxAge() - this.age(); + const staleIfErrorAge = age + toNumberOrZero(this._rescc['stale-if-error']); + const staleWhileRevalidateAge = age + toNumberOrZero(this._rescc['stale-while-revalidate']); + return Math.max(0, age, staleIfErrorAge, staleWhileRevalidateAge) * 1000; + } + + stale() { + return this.maxAge() <= this.age(); + } + + _useStaleIfError() { + return this.maxAge() + toNumberOrZero(this._rescc['stale-if-error']) > this.age(); + } + + useStaleWhileRevalidate() { + return this.maxAge() + toNumberOrZero(this._rescc['stale-while-revalidate']) > this.age(); + } + + static fromObject(obj) { + return new this(undefined, undefined, { _fromObject: obj }); + } + + _fromObject(obj) { + if (this._responseTime) throw Error('Reinitialized'); + if (!obj || obj.v !== 1) throw Error('Invalid serialization'); + + this._responseTime = obj.t; + this._isShared = obj.sh; + this._cacheHeuristic = obj.ch; + this._immutableMinTtl = + obj.imm !== undefined ? obj.imm : 24 * 3600 * 1000; + this._status = obj.st; + this._resHeaders = obj.resh; + this._rescc = obj.rescc; + this._method = obj.m; + this._url = obj.u; + this._host = obj.h; + this._noAuthorization = obj.a; + this._reqHeaders = obj.reqh; + this._reqcc = obj.reqcc; + } + + toObject() { + return { + v: 1, + t: this._responseTime, + sh: this._isShared, + ch: this._cacheHeuristic, + imm: this._immutableMinTtl, + st: this._status, + resh: this._resHeaders, + rescc: this._rescc, + m: this._method, + u: this._url, + h: this._host, + a: this._noAuthorization, + reqh: this._reqHeaders, + reqcc: this._reqcc, + }; + } + + /** + * Headers for sending to the origin server to revalidate stale response. + * Allows server to return 304 to allow reuse of the previous response. + * + * Hop by hop headers are always stripped. + * Revalidation headers may be added or removed, depending on request. + */ + revalidationHeaders(incomingReq) { + this._assertRequestHasHeaders(incomingReq); + const headers = this._copyWithoutHopByHopHeaders(incomingReq.headers); + + // This implementation does not understand range requests + delete headers['if-range']; + + if (!this._requestMatches(incomingReq, true) || !this.storable()) { + // revalidation allowed via HEAD + // not for the same resource, or wasn't allowed to be cached anyway + delete headers['if-none-match']; + delete headers['if-modified-since']; + return headers; + } + + /* MUST send that entity-tag in any cache validation request (using If-Match or If-None-Match) if an entity-tag has been provided by the origin server. */ + if (this._resHeaders.etag) { + headers['if-none-match'] = headers['if-none-match'] + ? 
`${headers['if-none-match']}, ${this._resHeaders.etag}` + : this._resHeaders.etag; + } + + // Clients MAY issue simple (non-subrange) GET requests with either weak validators or strong validators. Clients MUST NOT use weak validators in other forms of request. + const forbidsWeakValidators = + headers['accept-ranges'] || + headers['if-match'] || + headers['if-unmodified-since'] || + (this._method && this._method != 'GET'); + + /* SHOULD send the Last-Modified value in non-subrange cache validation requests (using If-Modified-Since) if only a Last-Modified value has been provided by the origin server. + Note: This implementation does not understand partial responses (206) */ + if (forbidsWeakValidators) { + delete headers['if-modified-since']; + + if (headers['if-none-match']) { + const etags = headers['if-none-match'] + .split(/,/) + .filter(etag => { + return !/^\s*W\//.test(etag); + }); + if (!etags.length) { + delete headers['if-none-match']; + } else { + headers['if-none-match'] = etags.join(',').trim(); + } + } + } else if ( + this._resHeaders['last-modified'] && + !headers['if-modified-since'] + ) { + headers['if-modified-since'] = this._resHeaders['last-modified']; + } + + return headers; + } + + /** + * Creates new CachePolicy with information combined from the previews response, + * and the new revalidation response. + * + * Returns {policy, modified} where modified is a boolean indicating + * whether the response body has been modified, and old cached body can't be used. + * + * @return {Object} {policy: CachePolicy, modified: Boolean} + */ + revalidatedPolicy(request, response) { + this._assertRequestHasHeaders(request); + if(this._useStaleIfError() && isErrorResponse(response)) { // I consider the revalidation request unsuccessful + return { + modified: false, + matches: false, + policy: this, + }; + } + if (!response || !response.headers) { + throw Error('Response headers missing'); + } + + // These aren't going to be supported exactly, since one CachePolicy object + // doesn't know about all the other cached objects. + let matches = false; + if (response.status !== undefined && response.status != 304) { + matches = false; + } else if ( + response.headers.etag && + !/^\s*W\//.test(response.headers.etag) + ) { + // "All of the stored responses with the same strong validator are selected. + // If none of the stored responses contain the same strong validator, + // then the cache MUST NOT use the new response to update any stored responses." + matches = + this._resHeaders.etag && + this._resHeaders.etag.replace(/^\s*W\//, '') === + response.headers.etag; + } else if (this._resHeaders.etag && response.headers.etag) { + // "If the new response contains a weak validator and that validator corresponds + // to one of the cache's stored responses, + // then the most recent of those matching stored responses is selected for update." + matches = + this._resHeaders.etag.replace(/^\s*W\//, '') === + response.headers.etag.replace(/^\s*W\//, ''); + } else if (this._resHeaders['last-modified']) { + matches = + this._resHeaders['last-modified'] === + response.headers['last-modified']; + } else { + // If the new response does not include any form of validator (such as in the case where + // a client generates an If-Modified-Since request from a source other than the Last-Modified + // response header field), and there is only one stored response, and that stored response also + // lacks a validator, then that stored response is selected for update. 
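+            // A single CachePolicy instance cannot know whether it is the only stored
+            // response, so it only checks that neither side carries any validator.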
+ if ( + !this._resHeaders.etag && + !this._resHeaders['last-modified'] && + !response.headers.etag && + !response.headers['last-modified'] + ) { + matches = true; + } + } + + if (!matches) { + return { + policy: new this.constructor(request, response), + // Client receiving 304 without body, even if it's invalid/mismatched has no option + // but to reuse a cached body. We don't have a good way to tell clients to do + // error recovery in such case. + modified: response.status != 304, + matches: false, + }; + } + + // use other header fields provided in the 304 (Not Modified) response to replace all instances + // of the corresponding header fields in the stored response. + const headers = {}; + for (const k in this._resHeaders) { + headers[k] = + k in response.headers && !excludedFromRevalidationUpdate[k] + ? response.headers[k] + : this._resHeaders[k]; + } + + const newResponse = Object.assign({}, response, { + status: this._status, + method: this._method, + headers, + }); + return { + policy: new this.constructor(request, newResponse, { + shared: this._isShared, + cacheHeuristic: this._cacheHeuristic, + immutableMinTimeToLive: this._immutableMinTtl, + }), + modified: false, + matches: true, + }; + } +}; diff --git a/node_modules/http-cache-semantics/package.json b/node_modules/http-cache-semantics/package.json new file mode 100644 index 000000000..3551e053e --- /dev/null +++ b/node_modules/http-cache-semantics/package.json @@ -0,0 +1,60 @@ +{ + "_from": "http-cache-semantics@^4.0.0", + "_id": "http-cache-semantics@4.1.0", + "_inBundle": false, + "_integrity": "sha512-carPklcUh7ROWRK7Cv27RPtdhYhUsela/ue5/jKzjegVvXDqM2ILE9Q2BGn9JZJh1g87cp56su/FgQSzcWS8cQ==", + "_location": "/http-cache-semantics", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "http-cache-semantics@^4.0.0", + "name": "http-cache-semantics", + "escapedName": "http-cache-semantics", + "rawSpec": "^4.0.0", + "saveSpec": null, + "fetchSpec": "^4.0.0" + }, + "_requiredBy": [ + "/cacheable-request" + ], + "_resolved": "https://registry.npmjs.org/http-cache-semantics/-/http-cache-semantics-4.1.0.tgz", + "_shasum": "49e91c5cbf36c9b94bcfcd71c23d5249ec74e390", + "_spec": "http-cache-semantics@^4.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cacheable-request", + "author": { + "name": "Kornel Lesiński", + "email": "kornel@geekhood.net", + "url": "https://kornel.ski/" + }, + "bugs": { + "url": "https://github.com/kornelski/http-cache-semantics/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Parses Cache-Control and other headers. 
Helps building correct HTTP caches and proxies", + "devDependencies": { + "eslint": "^5.13.0", + "eslint-plugin-prettier": "^3.0.1", + "husky": "^0.14.3", + "lint-staged": "^8.1.3", + "mocha": "^5.1.0", + "prettier": "^1.14.3", + "prettier-eslint-cli": "^4.7.1" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/kornelski/http-cache-semantics#readme", + "license": "BSD-2-Clause", + "main": "index.js", + "name": "http-cache-semantics", + "repository": { + "type": "git", + "url": "git+https://github.com/kornelski/http-cache-semantics.git" + }, + "scripts": { + "test": "mocha" + }, + "version": "4.1.0" +} diff --git a/node_modules/http-errors/HISTORY.md b/node_modules/http-errors/HISTORY.md new file mode 100644 index 000000000..efc2d4c98 --- /dev/null +++ b/node_modules/http-errors/HISTORY.md @@ -0,0 +1,149 @@ +2019-02-18 / 1.7.2 +================== + + * deps: setprototypeof@1.1.1 + +2018-09-08 / 1.7.1 +================== + + * Fix error creating objects in some environments + +2018-07-30 / 1.7.0 +================== + + * Set constructor name when possible + * Use `toidentifier` module to make class names + * deps: statuses@'>= 1.5.0 < 2' + +2018-03-29 / 1.6.3 +================== + + * deps: depd@~1.1.2 + - perf: remove argument reassignment + * deps: setprototypeof@1.1.0 + * deps: statuses@'>= 1.4.0 < 2' + +2017-08-04 / 1.6.2 +================== + + * deps: depd@1.1.1 + - Remove unnecessary `Buffer` loading + +2017-02-20 / 1.6.1 +================== + + * deps: setprototypeof@1.0.3 + - Fix shim for old browsers + +2017-02-14 / 1.6.0 +================== + + * Accept custom 4xx and 5xx status codes in factory + * Add deprecation message to `"I'mateapot"` export + * Deprecate passing status code as anything except first argument in factory + * Deprecate using non-error status codes + * Make `message` property enumerable for `HttpError`s + +2016-11-16 / 1.5.1 +================== + + * deps: inherits@2.0.3 + - Fix issue loading in browser + * deps: setprototypeof@1.0.2 + * deps: statuses@'>= 1.3.1 < 2' + +2016-05-18 / 1.5.0 +================== + + * Support new code `421 Misdirected Request` + * Use `setprototypeof` module to replace `__proto__` setting + * deps: statuses@'>= 1.3.0 < 2' + - Add `421 Misdirected Request` + - perf: enable strict mode + * perf: enable strict mode + +2016-01-28 / 1.4.0 +================== + + * Add `HttpError` export, for `err instanceof createError.HttpError` + * deps: inherits@2.0.1 + * deps: statuses@'>= 1.2.1 < 2' + - Fix message for status 451 + - Remove incorrect nginx status code + +2015-02-02 / 1.3.1 +================== + + * Fix regression where status can be overwritten in `createError` `props` + +2015-02-01 / 1.3.0 +================== + + * Construct errors using defined constructors from `createError` + * Fix error names that are not identifiers + - `createError["I'mateapot"]` is now `createError.ImATeapot` + * Set a meaningful `name` property on constructed errors + +2014-12-09 / 1.2.8 +================== + + * Fix stack trace from exported function + * Remove `arguments.callee` usage + +2014-10-14 / 1.2.7 +================== + + * Remove duplicate line + +2014-10-02 / 1.2.6 +================== + + * Fix `expose` to be `true` for `ClientError` constructor + +2014-09-28 / 1.2.5 +================== + + * deps: statuses@1 + +2014-09-21 / 1.2.4 +================== + + * Fix dependency version to work with old `npm`s + +2014-09-21 / 1.2.3 +================== + + * deps: statuses@~1.1.0 + +2014-09-21 / 1.2.2 +================== + + * Fix 
publish error + +2014-09-21 / 1.2.1 +================== + + * Support Node.js 0.6 + * Use `inherits` instead of `util` + +2014-09-09 / 1.2.0 +================== + + * Fix the way inheriting functions + * Support `expose` being provided in properties argument + +2014-09-08 / 1.1.0 +================== + + * Default status to 500 + * Support provided `error` to extend + +2014-09-08 / 1.0.1 +================== + + * Fix accepting string message + +2014-09-08 / 1.0.0 +================== + + * Initial release diff --git a/node_modules/http-errors/LICENSE b/node_modules/http-errors/LICENSE new file mode 100644 index 000000000..82af4df54 --- /dev/null +++ b/node_modules/http-errors/LICENSE @@ -0,0 +1,23 @@ + +The MIT License (MIT) + +Copyright (c) 2014 Jonathan Ong me@jongleberry.com +Copyright (c) 2016 Douglas Christopher Wilson doug@somethingdoug.com + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/http-errors/README.md b/node_modules/http-errors/README.md new file mode 100644 index 000000000..3b2548118 --- /dev/null +++ b/node_modules/http-errors/README.md @@ -0,0 +1,163 @@ +# http-errors + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][node-url] +[![Node.js Version][node-image]][node-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Create HTTP errors for Express, Koa, Connect, etc. with ease. + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```bash +$ npm install http-errors +``` + +## Example + +```js +var createError = require('http-errors') +var express = require('express') +var app = express() + +app.use(function (req, res, next) { + if (!req.user) return next(createError(401, 'Please login to view this page.')) + next() +}) +``` + +## API + +This is the current API, currently extracted from Koa and subject to change. + +### Error Properties + +- `expose` - can be used to signal if `message` should be sent to the client, + defaulting to `false` when `status` >= 500 +- `headers` - can be an object of header names to values to be sent to the + client, defaulting to `undefined`. 
When defined, the key names should all + be lower-cased +- `message` - the traditional error message, which should be kept short and all + single line +- `status` - the status code of the error, mirroring `statusCode` for general + compatibility +- `statusCode` - the status code of the error, defaulting to `500` + +### createError([status], [message], [properties]) + +Create a new error object with the given message `msg`. +The error object inherits from `createError.HttpError`. + + + +```js +var err = createError(404, 'This video does not exist!') +``` + +- `status: 500` - the status code as a number +- `message` - the message of the error, defaulting to node's text for that status code. +- `properties` - custom properties to attach to the object + +### createError([status], [error], [properties]) + +Extend the given `error` object with `createError.HttpError` +properties. This will not alter the inheritance of the given +`error` object, and the modified `error` object is the +return value. + + + +```js +fs.readFile('foo.txt', function (err, buf) { + if (err) { + if (err.code === 'ENOENT') { + var httpError = createError(404, err, { expose: false }) + } else { + var httpError = createError(500, err) + } + } +}) +``` + +- `status` - the status code as a number +- `error` - the error object to extend +- `properties` - custom properties to attach to the object + +### new createError\[code || name\](\[msg]\)) + +Create a new error object with the given message `msg`. +The error object inherits from `createError.HttpError`. + + + +```js +var err = new createError.NotFound() +``` + +- `code` - the status code as a number +- `name` - the name of the error as a "bumpy case", i.e. `NotFound` or `InternalServerError`. + +#### List of all constructors + +|Status Code|Constructor Name | +|-----------|-----------------------------| +|400 |BadRequest | +|401 |Unauthorized | +|402 |PaymentRequired | +|403 |Forbidden | +|404 |NotFound | +|405 |MethodNotAllowed | +|406 |NotAcceptable | +|407 |ProxyAuthenticationRequired | +|408 |RequestTimeout | +|409 |Conflict | +|410 |Gone | +|411 |LengthRequired | +|412 |PreconditionFailed | +|413 |PayloadTooLarge | +|414 |URITooLong | +|415 |UnsupportedMediaType | +|416 |RangeNotSatisfiable | +|417 |ExpectationFailed | +|418 |ImATeapot | +|421 |MisdirectedRequest | +|422 |UnprocessableEntity | +|423 |Locked | +|424 |FailedDependency | +|425 |UnorderedCollection | +|426 |UpgradeRequired | +|428 |PreconditionRequired | +|429 |TooManyRequests | +|431 |RequestHeaderFieldsTooLarge | +|451 |UnavailableForLegalReasons | +|500 |InternalServerError | +|501 |NotImplemented | +|502 |BadGateway | +|503 |ServiceUnavailable | +|504 |GatewayTimeout | +|505 |HTTPVersionNotSupported | +|506 |VariantAlsoNegotiates | +|507 |InsufficientStorage | +|508 |LoopDetected | +|509 |BandwidthLimitExceeded | +|510 |NotExtended | +|511 |NetworkAuthenticationRequired| + +## License + +[MIT](LICENSE) + +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/http-errors/master +[coveralls-url]: https://coveralls.io/r/jshttp/http-errors?branch=master +[node-image]: https://badgen.net/npm/node/http-errors +[node-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/http-errors +[npm-url]: https://npmjs.org/package/http-errors +[npm-version-image]: https://badgen.net/npm/v/http-errors +[travis-image]: https://badgen.net/travis/jshttp/http-errors/master +[travis-url]: https://travis-ci.org/jshttp/http-errors diff --git a/node_modules/http-errors/index.js 
b/node_modules/http-errors/index.js new file mode 100644 index 000000000..10ca4adc0 --- /dev/null +++ b/node_modules/http-errors/index.js @@ -0,0 +1,266 @@ +/*! + * http-errors + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var deprecate = require('depd')('http-errors') +var setPrototypeOf = require('setprototypeof') +var statuses = require('statuses') +var inherits = require('inherits') +var toIdentifier = require('toidentifier') + +/** + * Module exports. + * @public + */ + +module.exports = createError +module.exports.HttpError = createHttpErrorConstructor() + +// Populate exports for all constructors +populateConstructorExports(module.exports, statuses.codes, module.exports.HttpError) + +/** + * Get the code class of a status code. + * @private + */ + +function codeClass (status) { + return Number(String(status).charAt(0) + '00') +} + +/** + * Create a new HTTP Error. + * + * @returns {Error} + * @public + */ + +function createError () { + // so much arity going on ~_~ + var err + var msg + var status = 500 + var props = {} + for (var i = 0; i < arguments.length; i++) { + var arg = arguments[i] + if (arg instanceof Error) { + err = arg + status = err.status || err.statusCode || status + continue + } + switch (typeof arg) { + case 'string': + msg = arg + break + case 'number': + status = arg + if (i !== 0) { + deprecate('non-first-argument status code; replace with createError(' + arg + ', ...)') + } + break + case 'object': + props = arg + break + } + } + + if (typeof status === 'number' && (status < 400 || status >= 600)) { + deprecate('non-error status code; use only 4xx or 5xx status codes') + } + + if (typeof status !== 'number' || + (!statuses[status] && (status < 400 || status >= 600))) { + status = 500 + } + + // constructor + var HttpError = createError[status] || createError[codeClass(status)] + + if (!err) { + // create error + err = HttpError + ? new HttpError(msg) + : new Error(msg || statuses[status]) + Error.captureStackTrace(err, createError) + } + + if (!HttpError || !(err instanceof HttpError) || err.status !== status) { + // add properties to generic error + err.expose = status < 500 + err.status = err.statusCode = status + } + + for (var key in props) { + if (key !== 'status' && key !== 'statusCode') { + err[key] = props[key] + } + } + + return err +} + +/** + * Create HTTP error abstract base class. + * @private + */ + +function createHttpErrorConstructor () { + function HttpError () { + throw new TypeError('cannot construct abstract class') + } + + inherits(HttpError, Error) + + return HttpError +} + +/** + * Create a constructor for a client error. + * @private + */ + +function createClientErrorConstructor (HttpError, name, code) { + var className = name.match(/Error$/) ? name : name + 'Error' + + function ClientError (message) { + // create the error object + var msg = message != null ? 
message : statuses[code] + var err = new Error(msg) + + // capture a stack trace to the construction point + Error.captureStackTrace(err, ClientError) + + // adjust the [[Prototype]] + setPrototypeOf(err, ClientError.prototype) + + // redefine the error message + Object.defineProperty(err, 'message', { + enumerable: true, + configurable: true, + value: msg, + writable: true + }) + + // redefine the error name + Object.defineProperty(err, 'name', { + enumerable: false, + configurable: true, + value: className, + writable: true + }) + + return err + } + + inherits(ClientError, HttpError) + nameFunc(ClientError, className) + + ClientError.prototype.status = code + ClientError.prototype.statusCode = code + ClientError.prototype.expose = true + + return ClientError +} + +/** + * Create a constructor for a server error. + * @private + */ + +function createServerErrorConstructor (HttpError, name, code) { + var className = name.match(/Error$/) ? name : name + 'Error' + + function ServerError (message) { + // create the error object + var msg = message != null ? message : statuses[code] + var err = new Error(msg) + + // capture a stack trace to the construction point + Error.captureStackTrace(err, ServerError) + + // adjust the [[Prototype]] + setPrototypeOf(err, ServerError.prototype) + + // redefine the error message + Object.defineProperty(err, 'message', { + enumerable: true, + configurable: true, + value: msg, + writable: true + }) + + // redefine the error name + Object.defineProperty(err, 'name', { + enumerable: false, + configurable: true, + value: className, + writable: true + }) + + return err + } + + inherits(ServerError, HttpError) + nameFunc(ServerError, className) + + ServerError.prototype.status = code + ServerError.prototype.statusCode = code + ServerError.prototype.expose = false + + return ServerError +} + +/** + * Set the name of a function, if possible. + * @private + */ + +function nameFunc (func, name) { + var desc = Object.getOwnPropertyDescriptor(func, 'name') + + if (desc && desc.configurable) { + desc.value = name + Object.defineProperty(func, 'name', desc) + } +} + +/** + * Populate the exports object with constructors for every error class. 
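+ * After population, the numeric and the named key refer to the same constructor,
+ * e.g. exports[404] and exports.NotFound are the same function.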
+ * @private + */ + +function populateConstructorExports (exports, codes, HttpError) { + codes.forEach(function forEachCode (code) { + var CodeError + var name = toIdentifier(statuses[code]) + + switch (codeClass(code)) { + case 400: + CodeError = createClientErrorConstructor(HttpError, name, code) + break + case 500: + CodeError = createServerErrorConstructor(HttpError, name, code) + break + } + + if (CodeError) { + // export the constructor + exports[code] = CodeError + exports[name] = CodeError + } + }) + + // backwards-compatibility + exports["I'mateapot"] = deprecate.function(exports.ImATeapot, + '"I\'mateapot"; use "ImATeapot" instead') +} diff --git a/node_modules/http-errors/package.json b/node_modules/http-errors/package.json new file mode 100644 index 000000000..1981b6644 --- /dev/null +++ b/node_modules/http-errors/package.json @@ -0,0 +1,93 @@ +{ + "_from": "http-errors@1.7.2", + "_id": "http-errors@1.7.2", + "_inBundle": false, + "_integrity": "sha512-uUQBt3H/cSIVfch6i1EuPNy/YsRSOUBXTVfZ+yR7Zjez3qjBz6i9+i4zjNaoqcoFVI4lQJ5plg63TvGfRSDCRg==", + "_location": "/http-errors", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "http-errors@1.7.2", + "name": "http-errors", + "escapedName": "http-errors", + "rawSpec": "1.7.2", + "saveSpec": null, + "fetchSpec": "1.7.2" + }, + "_requiredBy": [ + "/body-parser", + "/raw-body", + "/send" + ], + "_resolved": "https://registry.npmjs.org/http-errors/-/http-errors-1.7.2.tgz", + "_shasum": "4f5029cf13239f31036e5b2e55292bcfbcc85c8f", + "_spec": "http-errors@1.7.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\body-parser", + "author": { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + }, + "bugs": { + "url": "https://github.com/jshttp/http-errors/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Alan Plum", + "email": "me@pluma.io" + }, + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + } + ], + "dependencies": { + "depd": "~1.1.2", + "inherits": "2.0.3", + "setprototypeof": "1.1.1", + "statuses": ">= 1.5.0 < 2", + "toidentifier": "1.0.0" + }, + "deprecated": false, + "description": "Create HTTP error objects", + "devDependencies": { + "eslint": "5.13.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.16.0", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "7.0.1", + "eslint-plugin-promise": "4.0.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "mocha": "5.2.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "index.js", + "HISTORY.md", + "LICENSE", + "README.md" + ], + "homepage": "https://github.com/jshttp/http-errors#readme", + "keywords": [ + "http", + "error" + ], + "license": "MIT", + "name": "http-errors", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/http-errors.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md . 
&& node ./scripts/lint-readme-list.js", + "test": "mocha --reporter spec --bail", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter dot" + }, + "version": "1.7.2" +} diff --git a/node_modules/iconv-lite/Changelog.md b/node_modules/iconv-lite/Changelog.md new file mode 100644 index 000000000..f252313f8 --- /dev/null +++ b/node_modules/iconv-lite/Changelog.md @@ -0,0 +1,162 @@ +# 0.4.24 / 2018-08-22 + + * Added MIK encoding (#196, by @Ivan-Kalatchev) + + +# 0.4.23 / 2018-05-07 + + * Fix deprecation warning in Node v10 due to the last usage of `new Buffer` (#185, by @felixbuenemann) + * Switched from NodeBuffer to Buffer in typings (#155 by @felixfbecker, #186 by @larssn) + + +# 0.4.22 / 2018-05-05 + + * Use older semver style for dependencies to be compatible with Node version 0.10 (#182, by @dougwilson) + * Fix tests to accomodate fixes in Node v10 (#182, by @dougwilson) + + +# 0.4.21 / 2018-04-06 + + * Fix encoding canonicalization (#156) + * Fix the paths in the "browser" field in package.json (#174 by @LMLB) + * Removed "contributors" section in package.json - see Git history instead. + + +# 0.4.20 / 2018-04-06 + + * Updated `new Buffer()` usages with recommended replacements as it's being deprecated in Node v10 (#176, #178 by @ChALkeR) + + +# 0.4.19 / 2017-09-09 + + * Fixed iso8859-1 codec regression in handling untranslatable characters (#162, caused by #147) + * Re-generated windows1255 codec, because it was updated in iconv project + * Fixed grammar in error message when iconv-lite is loaded with encoding other than utf8 + + +# 0.4.18 / 2017-06-13 + + * Fixed CESU-8 regression in Node v8. + + +# 0.4.17 / 2017-04-22 + + * Updated typescript definition file to support Angular 2 AoT mode (#153 by @larssn) + + +# 0.4.16 / 2017-04-22 + + * Added support for React Native (#150) + * Changed iso8859-1 encoding to usine internal 'binary' encoding, as it's the same thing (#147 by @mscdex) + * Fixed typo in Readme (#138 by @jiangzhuo) + * Fixed build for Node v6.10+ by making correct version comparison + * Added a warning if iconv-lite is loaded not as utf-8 (see #142) + + +# 0.4.15 / 2016-11-21 + + * Fixed typescript type definition (#137) + + +# 0.4.14 / 2016-11-20 + + * Preparation for v1.0 + * Added Node v6 and latest Node versions to Travis CI test rig + * Deprecated Node v0.8 support + * Typescript typings (@larssn) + * Fix encoding of Euro character in GB 18030 (inspired by @lygstate) + * Add ms prefix to dbcs windows encodings (@rokoroku) + + +# 0.4.13 / 2015-10-01 + + * Fix silly mistake in deprecation notice. + + +# 0.4.12 / 2015-09-26 + + * Node v4 support: + * Added CESU-8 decoding (#106) + * Added deprecation notice for `extendNodeEncodings` + * Added Travis tests for Node v4 and io.js latest (#105 by @Mithgol) + + +# 0.4.11 / 2015-07-03 + + * Added CESU-8 encoding. + + +# 0.4.10 / 2015-05-26 + + * Changed UTF-16 endianness heuristic to take into account any ASCII chars, not + just spaces. This should minimize the importance of "default" endianness. + + +# 0.4.9 / 2015-05-24 + + * Streamlined BOM handling: strip BOM by default, add BOM when encoding if + addBOM: true. Added docs to Readme. + * UTF16 now uses UTF16-LE by default. + * Fixed minor issue with big5 encoding. + * Added io.js testing on Travis; updated node-iconv version to test against. + Now we just skip testing SBCS encodings that node-iconv doesn't support. 
+ * (internal refactoring) Updated codec interface to use classes. + * Use strict mode in all files. + + +# 0.4.8 / 2015-04-14 + + * added alias UNICODE-1-1-UTF-7 for UTF-7 encoding (#94) + + +# 0.4.7 / 2015-02-05 + + * stop official support of Node.js v0.8. Should still work, but no guarantees. + reason: Packages needed for testing are hard to get on Travis CI. + * work in environment where Object.prototype is monkey patched with enumerable + props (#89). + + +# 0.4.6 / 2015-01-12 + + * fix rare aliases of single-byte encodings (thanks @mscdex) + * double the timeout for dbcs tests to make them less flaky on travis + + +# 0.4.5 / 2014-11-20 + + * fix windows-31j and x-sjis encoding support (@nleush) + * minor fix: undefined variable reference when internal error happens + + +# 0.4.4 / 2014-07-16 + + * added encodings UTF-7 (RFC2152) and UTF-7-IMAP (RFC3501 Section 5.1.3) + * fixed streaming base64 encoding + + +# 0.4.3 / 2014-06-14 + + * added encodings UTF-16BE and UTF-16 with BOM + + +# 0.4.2 / 2014-06-12 + + * don't throw exception if `extendNodeEncodings()` is called more than once + + +# 0.4.1 / 2014-06-11 + + * codepage 808 added + + +# 0.4.0 / 2014-06-10 + + * code is rewritten from scratch + * all widespread encodings are supported + * streaming interface added + * browserify compatibility added + * (optional) extend core primitive encodings to make usage even simpler + * moved from vows to mocha as the testing framework + + diff --git a/node_modules/iconv-lite/LICENSE b/node_modules/iconv-lite/LICENSE new file mode 100644 index 000000000..d518d8376 --- /dev/null +++ b/node_modules/iconv-lite/LICENSE @@ -0,0 +1,21 @@ +Copyright (c) 2011 Alexander Shtuchkin + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + diff --git a/node_modules/iconv-lite/README.md b/node_modules/iconv-lite/README.md new file mode 100644 index 000000000..c981c3708 --- /dev/null +++ b/node_modules/iconv-lite/README.md @@ -0,0 +1,156 @@ +## Pure JS character encoding conversion [![Build Status](https://travis-ci.org/ashtuchkin/iconv-lite.svg?branch=master)](https://travis-ci.org/ashtuchkin/iconv-lite) + + * Doesn't need native code compilation. Works on Windows and in sandboxed environments like [Cloud9](http://c9.io). + * Used in popular projects like [Express.js (body_parser)](https://github.com/expressjs/body-parser), + [Grunt](http://gruntjs.com/), [Nodemailer](http://www.nodemailer.com/), [Yeoman](http://yeoman.io/) and others. 
+ * Faster than [node-iconv](https://github.com/bnoordhuis/node-iconv) (see below for performance comparison). + * Intuitive encode/decode API + * Streaming support for Node v0.10+ + * [Deprecated] Can extend Node.js primitives (buffers, streams) to support all iconv-lite encodings. + * In-browser usage via [Browserify](https://github.com/substack/node-browserify) (~180k gzip compressed with Buffer shim included). + * Typescript [type definition file](https://github.com/ashtuchkin/iconv-lite/blob/master/lib/index.d.ts) included. + * React Native is supported (need to explicitly `npm install` two more modules: `buffer` and `stream`). + * License: MIT. + +[![NPM Stats](https://nodei.co/npm/iconv-lite.png?downloads=true&downloadRank=true)](https://npmjs.org/packages/iconv-lite/) + +## Usage +### Basic API +```javascript +var iconv = require('iconv-lite'); + +// Convert from an encoded buffer to js string. +str = iconv.decode(Buffer.from([0x68, 0x65, 0x6c, 0x6c, 0x6f]), 'win1251'); + +// Convert from js string to an encoded buffer. +buf = iconv.encode("Sample input string", 'win1251'); + +// Check if encoding is supported +iconv.encodingExists("us-ascii") +``` + +### Streaming API (Node v0.10+) +```javascript + +// Decode stream (from binary stream to js strings) +http.createServer(function(req, res) { + var converterStream = iconv.decodeStream('win1251'); + req.pipe(converterStream); + + converterStream.on('data', function(str) { + console.log(str); // Do something with decoded strings, chunk-by-chunk. + }); +}); + +// Convert encoding streaming example +fs.createReadStream('file-in-win1251.txt') + .pipe(iconv.decodeStream('win1251')) + .pipe(iconv.encodeStream('ucs2')) + .pipe(fs.createWriteStream('file-in-ucs2.txt')); + +// Sugar: all encode/decode streams have .collect(cb) method to accumulate data. +http.createServer(function(req, res) { + req.pipe(iconv.decodeStream('win1251')).collect(function(err, body) { + assert(typeof body == 'string'); + console.log(body); // full request body string + }); +}); +``` + +### [Deprecated] Extend Node.js own encodings +> NOTE: This doesn't work on latest Node versions. See [details](https://github.com/ashtuchkin/iconv-lite/wiki/Node-v4-compatibility). + +```javascript +// After this call all Node basic primitives will understand iconv-lite encodings. +iconv.extendNodeEncodings(); + +// Examples: +buf = new Buffer(str, 'win1251'); +buf.write(str, 'gbk'); +str = buf.toString('latin1'); +assert(Buffer.isEncoding('iso-8859-15')); +Buffer.byteLength(str, 'us-ascii'); + +http.createServer(function(req, res) { + req.setEncoding('big5'); + req.collect(function(err, body) { + console.log(body); + }); +}); + +fs.createReadStream("file.txt", "shift_jis"); + +// External modules are also supported (if they use Node primitives, which they probably do). +request = require('request'); +request({ + url: "http://github.com/", + encoding: "cp932" +}); + +// To remove extensions +iconv.undoExtendNodeEncodings(); +``` + +## Supported encodings + + * All node.js native encodings: utf8, ucs2 / utf16-le, ascii, binary, base64, hex. + * Additional unicode encodings: utf16, utf16-be, utf-7, utf-7-imap. + * All widespread singlebyte encodings: Windows 125x family, ISO-8859 family, + IBM/DOS codepages, Macintosh family, KOI8 family, all others supported by iconv library. + Aliases like 'latin1', 'us-ascii' also supported. + * All widespread multibyte encodings: CP932, CP936, CP949, CP950, GB2312, GBK, GB18030, Big5, Shift_JIS, EUC-JP. 
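+
+A quick sketch using names from the lists above (only API calls already shown in the Usage section; any other supported encoding or alias works the same way):
+
+```javascript
+var iconv = require('iconv-lite');
+
+// Round-trip a string through one of the multibyte encodings listed above.
+var buf = iconv.encode('hello', 'Shift_JIS');
+var str = iconv.decode(buf, 'Shift_JIS');
+
+// Aliases such as 'latin1' or 'us-ascii' are accepted as well.
+iconv.encodingExists('latin1'); // true
+```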
+ +See [all supported encodings on wiki](https://github.com/ashtuchkin/iconv-lite/wiki/Supported-Encodings). + +Most singlebyte encodings are generated automatically from [node-iconv](https://github.com/bnoordhuis/node-iconv). Thank you Ben Noordhuis and libiconv authors! + +Multibyte encodings are generated from [Unicode.org mappings](http://www.unicode.org/Public/MAPPINGS/) and [WHATWG Encoding Standard mappings](http://encoding.spec.whatwg.org/). Thank you, respective authors! + + +## Encoding/decoding speed + +Comparison with node-iconv module (1000x256kb, on MacBook Pro, Core i5/2.6 GHz, Node v0.12.0). +Note: your results may vary, so please always check on your hardware. + + operation iconv@2.1.4 iconv-lite@0.4.7 + ---------------------------------------------------------- + encode('win1251') ~96 Mb/s ~320 Mb/s + decode('win1251') ~95 Mb/s ~246 Mb/s + +## BOM handling + + * Decoding: BOM is stripped by default, unless overridden by passing `stripBOM: false` in options + (f.ex. `iconv.decode(buf, enc, {stripBOM: false})`). + A callback might also be given as a `stripBOM` parameter - it'll be called if BOM character was actually found. + * If you want to detect UTF-8 BOM when decoding other encodings, use [node-autodetect-decoder-stream](https://github.com/danielgindi/node-autodetect-decoder-stream) module. + * Encoding: No BOM added, unless overridden by `addBOM: true` option. + +## UTF-16 Encodings + +This library supports UTF-16LE, UTF-16BE and UTF-16 encodings. First two are straightforward, but UTF-16 is trying to be +smart about endianness in the following ways: + * Decoding: uses BOM and 'spaces heuristic' to determine input endianness. Default is UTF-16LE, but can be + overridden with `defaultEncoding: 'utf-16be'` option. Strips BOM unless `stripBOM: false`. + * Encoding: uses UTF-16LE and writes BOM by default. Use `addBOM: false` to override. + +## Other notes + +When decoding, be sure to supply a Buffer to decode() method, otherwise [bad things usually happen](https://github.com/ashtuchkin/iconv-lite/wiki/Use-Buffers-when-decoding). +Untranslatable characters are set to � or ?. No transliteration is currently supported. +Node versions 0.10.31 and 0.11.13 are buggy, don't use them (see #65, #77). + +## Testing + +```bash +$ git clone git@github.com:ashtuchkin/iconv-lite.git +$ cd iconv-lite +$ npm install +$ npm test + +$ # To view performance: +$ node test/performance.js + +$ # To view test coverage: +$ npm run coverage +$ open coverage/lcov-report/index.html +``` diff --git a/node_modules/iconv-lite/encodings/dbcs-codec.js b/node_modules/iconv-lite/encodings/dbcs-codec.js new file mode 100644 index 000000000..1fe3e1601 --- /dev/null +++ b/node_modules/iconv-lite/encodings/dbcs-codec.js @@ -0,0 +1,555 @@ +"use strict"; +var Buffer = require("safer-buffer").Buffer; + +// Multibyte codec. In this scheme, a character is represented by 1 or more bytes. +// Our codec supports UTF-16 surrogates, extensions for GB18030 and unicode sequences. +// To save memory and loading time, we read table files only when requested. + +exports._dbcs = DBCSCodec; + +var UNASSIGNED = -1, + GB18030_CODE = -2, + SEQ_START = -10, + NODE_START = -1000, + UNASSIGNED_NODE = new Array(0x100), + DEF_CHAR = -1; + +for (var i = 0; i < 0x100; i++) + UNASSIGNED_NODE[i] = UNASSIGNED; + + +// Class DBCSCodec reads and initializes mapping tables. 
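+// It builds two structures: a decode trie (decodeTables, with code-point sequences kept in
+// decodeTableSeq) and a sparse, bucketed encode table (encodeTable / encodeTableSeq) that is
+// filled by recursively walking the decode trie and inverting the mapping.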
+function DBCSCodec(codecOptions, iconv) { + this.encodingName = codecOptions.encodingName; + if (!codecOptions) + throw new Error("DBCS codec is called without the data.") + if (!codecOptions.table) + throw new Error("Encoding '" + this.encodingName + "' has no data."); + + // Load tables. + var mappingTable = codecOptions.table(); + + + // Decode tables: MBCS -> Unicode. + + // decodeTables is a trie, encoded as an array of arrays of integers. Internal arrays are trie nodes and all have len = 256. + // Trie root is decodeTables[0]. + // Values: >= 0 -> unicode character code. can be > 0xFFFF + // == UNASSIGNED -> unknown/unassigned sequence. + // == GB18030_CODE -> this is the end of a GB18030 4-byte sequence. + // <= NODE_START -> index of the next node in our trie to process next byte. + // <= SEQ_START -> index of the start of a character code sequence, in decodeTableSeq. + this.decodeTables = []; + this.decodeTables[0] = UNASSIGNED_NODE.slice(0); // Create root node. + + // Sometimes a MBCS char corresponds to a sequence of unicode chars. We store them as arrays of integers here. + this.decodeTableSeq = []; + + // Actual mapping tables consist of chunks. Use them to fill up decode tables. + for (var i = 0; i < mappingTable.length; i++) + this._addDecodeChunk(mappingTable[i]); + + this.defaultCharUnicode = iconv.defaultCharUnicode; + + + // Encode tables: Unicode -> DBCS. + + // `encodeTable` is array mapping from unicode char to encoded char. All its values are integers for performance. + // Because it can be sparse, it is represented as array of buckets by 256 chars each. Bucket can be null. + // Values: >= 0 -> it is a normal char. Write the value (if <=256 then 1 byte, if <=65536 then 2 bytes, etc.). + // == UNASSIGNED -> no conversion found. Output a default char. + // <= SEQ_START -> it's an index in encodeTableSeq, see below. The character starts a sequence. + this.encodeTable = []; + + // `encodeTableSeq` is used when a sequence of unicode characters is encoded as a single code. We use a tree of + // objects where keys correspond to characters in sequence and leafs are the encoded dbcs values. A special DEF_CHAR key + // means end of sequence (needed when one sequence is a strict subsequence of another). + // Objects are kept separately from encodeTable to increase performance. + this.encodeTableSeq = []; + + // Some chars can be decoded, but need not be encoded. + var skipEncodeChars = {}; + if (codecOptions.encodeSkipVals) + for (var i = 0; i < codecOptions.encodeSkipVals.length; i++) { + var val = codecOptions.encodeSkipVals[i]; + if (typeof val === 'number') + skipEncodeChars[val] = true; + else + for (var j = val.from; j <= val.to; j++) + skipEncodeChars[j] = true; + } + + // Use decode trie to recursively fill out encode tables. + this._fillEncodeTable(0, 0, skipEncodeChars); + + // Add more encoding pairs when needed. + if (codecOptions.encodeAdd) { + for (var uChar in codecOptions.encodeAdd) + if (Object.prototype.hasOwnProperty.call(codecOptions.encodeAdd, uChar)) + this._setEncodeChar(uChar.charCodeAt(0), codecOptions.encodeAdd[uChar]); + } + + this.defCharSB = this.encodeTable[0][iconv.defaultCharSingleByte.charCodeAt(0)]; + if (this.defCharSB === UNASSIGNED) this.defCharSB = this.encodeTable[0]['?']; + if (this.defCharSB === UNASSIGNED) this.defCharSB = "?".charCodeAt(0); + + + // Load & create GB18030 tables when needed. + if (typeof codecOptions.gb18030 === 'function') { + this.gb18030 = codecOptions.gb18030(); // Load GB18030 ranges. 
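+        // GB18030 4-byte sequences have the form [0x81-0xFE][0x30-0x39][0x81-0xFE][0x30-0x39];
+        // the wiring below adds the extra trie levels needed to decode them.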
+ + // Add GB18030 decode tables. + var thirdByteNodeIdx = this.decodeTables.length; + var thirdByteNode = this.decodeTables[thirdByteNodeIdx] = UNASSIGNED_NODE.slice(0); + + var fourthByteNodeIdx = this.decodeTables.length; + var fourthByteNode = this.decodeTables[fourthByteNodeIdx] = UNASSIGNED_NODE.slice(0); + + for (var i = 0x81; i <= 0xFE; i++) { + var secondByteNodeIdx = NODE_START - this.decodeTables[0][i]; + var secondByteNode = this.decodeTables[secondByteNodeIdx]; + for (var j = 0x30; j <= 0x39; j++) + secondByteNode[j] = NODE_START - thirdByteNodeIdx; + } + for (var i = 0x81; i <= 0xFE; i++) + thirdByteNode[i] = NODE_START - fourthByteNodeIdx; + for (var i = 0x30; i <= 0x39; i++) + fourthByteNode[i] = GB18030_CODE + } +} + +DBCSCodec.prototype.encoder = DBCSEncoder; +DBCSCodec.prototype.decoder = DBCSDecoder; + +// Decoder helpers +DBCSCodec.prototype._getDecodeTrieNode = function(addr) { + var bytes = []; + for (; addr > 0; addr >>= 8) + bytes.push(addr & 0xFF); + if (bytes.length == 0) + bytes.push(0); + + var node = this.decodeTables[0]; + for (var i = bytes.length-1; i > 0; i--) { // Traverse nodes deeper into the trie. + var val = node[bytes[i]]; + + if (val == UNASSIGNED) { // Create new node. + node[bytes[i]] = NODE_START - this.decodeTables.length; + this.decodeTables.push(node = UNASSIGNED_NODE.slice(0)); + } + else if (val <= NODE_START) { // Existing node. + node = this.decodeTables[NODE_START - val]; + } + else + throw new Error("Overwrite byte in " + this.encodingName + ", addr: " + addr.toString(16)); + } + return node; +} + + +DBCSCodec.prototype._addDecodeChunk = function(chunk) { + // First element of chunk is the hex mbcs code where we start. + var curAddr = parseInt(chunk[0], 16); + + // Choose the decoding node where we'll write our chars. + var writeTable = this._getDecodeTrieNode(curAddr); + curAddr = curAddr & 0xFF; + + // Write all other elements of the chunk to the table. + for (var k = 1; k < chunk.length; k++) { + var part = chunk[k]; + if (typeof part === "string") { // String, write as-is. + for (var l = 0; l < part.length;) { + var code = part.charCodeAt(l++); + if (0xD800 <= code && code < 0xDC00) { // Decode surrogate + var codeTrail = part.charCodeAt(l++); + if (0xDC00 <= codeTrail && codeTrail < 0xE000) + writeTable[curAddr++] = 0x10000 + (code - 0xD800) * 0x400 + (codeTrail - 0xDC00); + else + throw new Error("Incorrect surrogate pair in " + this.encodingName + " at chunk " + chunk[0]); + } + else if (0x0FF0 < code && code <= 0x0FFF) { // Character sequence (our own encoding used) + var len = 0xFFF - code + 2; + var seq = []; + for (var m = 0; m < len; m++) + seq.push(part.charCodeAt(l++)); // Simple variation: don't support surrogates or subsequences in seq. + + writeTable[curAddr++] = SEQ_START - this.decodeTableSeq.length; + this.decodeTableSeq.push(seq); + } + else + writeTable[curAddr++] = code; // Basic char + } + } + else if (typeof part === "number") { // Integer, meaning increasing sequence starting with prev character. 
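+                // e.g. a numeric entry of 3 fills the next 3 slots with consecutive
+                // code points continuing from the previously written character.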
+ var charCode = writeTable[curAddr - 1] + 1; + for (var l = 0; l < part; l++) + writeTable[curAddr++] = charCode++; + } + else + throw new Error("Incorrect type '" + typeof part + "' given in " + this.encodingName + " at chunk " + chunk[0]); + } + if (curAddr > 0xFF) + throw new Error("Incorrect chunk in " + this.encodingName + " at addr " + chunk[0] + ": too long" + curAddr); +} + +// Encoder helpers +DBCSCodec.prototype._getEncodeBucket = function(uCode) { + var high = uCode >> 8; // This could be > 0xFF because of astral characters. + if (this.encodeTable[high] === undefined) + this.encodeTable[high] = UNASSIGNED_NODE.slice(0); // Create bucket on demand. + return this.encodeTable[high]; +} + +DBCSCodec.prototype._setEncodeChar = function(uCode, dbcsCode) { + var bucket = this._getEncodeBucket(uCode); + var low = uCode & 0xFF; + if (bucket[low] <= SEQ_START) + this.encodeTableSeq[SEQ_START-bucket[low]][DEF_CHAR] = dbcsCode; // There's already a sequence, set a single-char subsequence of it. + else if (bucket[low] == UNASSIGNED) + bucket[low] = dbcsCode; +} + +DBCSCodec.prototype._setEncodeSequence = function(seq, dbcsCode) { + + // Get the root of character tree according to first character of the sequence. + var uCode = seq[0]; + var bucket = this._getEncodeBucket(uCode); + var low = uCode & 0xFF; + + var node; + if (bucket[low] <= SEQ_START) { + // There's already a sequence with - use it. + node = this.encodeTableSeq[SEQ_START-bucket[low]]; + } + else { + // There was no sequence object - allocate a new one. + node = {}; + if (bucket[low] !== UNASSIGNED) node[DEF_CHAR] = bucket[low]; // If a char was set before - make it a single-char subsequence. + bucket[low] = SEQ_START - this.encodeTableSeq.length; + this.encodeTableSeq.push(node); + } + + // Traverse the character tree, allocating new nodes as needed. + for (var j = 1; j < seq.length-1; j++) { + var oldVal = node[uCode]; + if (typeof oldVal === 'object') + node = oldVal; + else { + node = node[uCode] = {} + if (oldVal !== undefined) + node[DEF_CHAR] = oldVal + } + } + + // Set the leaf to given dbcsCode. + uCode = seq[seq.length-1]; + node[uCode] = dbcsCode; +} + +DBCSCodec.prototype._fillEncodeTable = function(nodeIdx, prefix, skipEncodeChars) { + var node = this.decodeTables[nodeIdx]; + for (var i = 0; i < 0x100; i++) { + var uCode = node[i]; + var mbCode = prefix + i; + if (skipEncodeChars[mbCode]) + continue; + + if (uCode >= 0) + this._setEncodeChar(uCode, mbCode); + else if (uCode <= NODE_START) + this._fillEncodeTable(NODE_START - uCode, mbCode << 8, skipEncodeChars); + else if (uCode <= SEQ_START) + this._setEncodeSequence(this.decodeTableSeq[SEQ_START - uCode], mbCode); + } +} + + + +// == Encoder ================================================================== + +function DBCSEncoder(options, codec) { + // Encoder state + this.leadSurrogate = -1; + this.seqObj = undefined; + + // Static data + this.encodeTable = codec.encodeTable; + this.encodeTableSeq = codec.encodeTableSeq; + this.defaultCharSingleByte = codec.defCharSB; + this.gb18030 = codec.gb18030; +} + +DBCSEncoder.prototype.write = function(str) { + var newBuf = Buffer.alloc(str.length * (this.gb18030 ? 4 : 3)), + leadSurrogate = this.leadSurrogate, + seqObj = this.seqObj, nextChar = -1, + i = 0, j = 0; + + while (true) { + // 0. Get next character. + if (nextChar === -1) { + if (i == str.length) break; + var uCode = str.charCodeAt(i++); + } + else { + var uCode = nextChar; + nextChar = -1; + } + + // 1. Handle surrogates. 
+ if (0xD800 <= uCode && uCode < 0xE000) { // Char is one of surrogates. + if (uCode < 0xDC00) { // We've got lead surrogate. + if (leadSurrogate === -1) { + leadSurrogate = uCode; + continue; + } else { + leadSurrogate = uCode; + // Double lead surrogate found. + uCode = UNASSIGNED; + } + } else { // We've got trail surrogate. + if (leadSurrogate !== -1) { + uCode = 0x10000 + (leadSurrogate - 0xD800) * 0x400 + (uCode - 0xDC00); + leadSurrogate = -1; + } else { + // Incomplete surrogate pair - only trail surrogate found. + uCode = UNASSIGNED; + } + + } + } + else if (leadSurrogate !== -1) { + // Incomplete surrogate pair - only lead surrogate found. + nextChar = uCode; uCode = UNASSIGNED; // Write an error, then current char. + leadSurrogate = -1; + } + + // 2. Convert uCode character. + var dbcsCode = UNASSIGNED; + if (seqObj !== undefined && uCode != UNASSIGNED) { // We are in the middle of the sequence + var resCode = seqObj[uCode]; + if (typeof resCode === 'object') { // Sequence continues. + seqObj = resCode; + continue; + + } else if (typeof resCode == 'number') { // Sequence finished. Write it. + dbcsCode = resCode; + + } else if (resCode == undefined) { // Current character is not part of the sequence. + + // Try default character for this sequence + resCode = seqObj[DEF_CHAR]; + if (resCode !== undefined) { + dbcsCode = resCode; // Found. Write it. + nextChar = uCode; // Current character will be written too in the next iteration. + + } else { + // TODO: What if we have no default? (resCode == undefined) + // Then, we should write first char of the sequence as-is and try the rest recursively. + // Didn't do it for now because no encoding has this situation yet. + // Currently, just skip the sequence and write current char. + } + } + seqObj = undefined; + } + else if (uCode >= 0) { // Regular character + var subtable = this.encodeTable[uCode >> 8]; + if (subtable !== undefined) + dbcsCode = subtable[uCode & 0xFF]; + + if (dbcsCode <= SEQ_START) { // Sequence start + seqObj = this.encodeTableSeq[SEQ_START-dbcsCode]; + continue; + } + + if (dbcsCode == UNASSIGNED && this.gb18030) { + // Use GB18030 algorithm to find character(s) to write. + var idx = findIdx(this.gb18030.uChars, uCode); + if (idx != -1) { + var dbcsCode = this.gb18030.gbChars[idx] + (uCode - this.gb18030.uChars[idx]); + newBuf[j++] = 0x81 + Math.floor(dbcsCode / 12600); dbcsCode = dbcsCode % 12600; + newBuf[j++] = 0x30 + Math.floor(dbcsCode / 1260); dbcsCode = dbcsCode % 1260; + newBuf[j++] = 0x81 + Math.floor(dbcsCode / 10); dbcsCode = dbcsCode % 10; + newBuf[j++] = 0x30 + dbcsCode; + continue; + } + } + } + + // 3. Write dbcsCode character. + if (dbcsCode === UNASSIGNED) + dbcsCode = this.defaultCharSingleByte; + + if (dbcsCode < 0x100) { + newBuf[j++] = dbcsCode; + } + else if (dbcsCode < 0x10000) { + newBuf[j++] = dbcsCode >> 8; // high byte + newBuf[j++] = dbcsCode & 0xFF; // low byte + } + else { + newBuf[j++] = dbcsCode >> 16; + newBuf[j++] = (dbcsCode >> 8) & 0xFF; + newBuf[j++] = dbcsCode & 0xFF; + } + } + + this.seqObj = seqObj; + this.leadSurrogate = leadSurrogate; + return newBuf.slice(0, j); +} + +DBCSEncoder.prototype.end = function() { + if (this.leadSurrogate === -1 && this.seqObj === undefined) + return; // All clean. Most often case. + + var newBuf = Buffer.alloc(10), j = 0; + + if (this.seqObj) { // We're in the sequence. + var dbcsCode = this.seqObj[DEF_CHAR]; + if (dbcsCode !== undefined) { // Write beginning of the sequence. 
+ if (dbcsCode < 0x100) { + newBuf[j++] = dbcsCode; + } + else { + newBuf[j++] = dbcsCode >> 8; // high byte + newBuf[j++] = dbcsCode & 0xFF; // low byte + } + } else { + // See todo above. + } + this.seqObj = undefined; + } + + if (this.leadSurrogate !== -1) { + // Incomplete surrogate pair - only lead surrogate found. + newBuf[j++] = this.defaultCharSingleByte; + this.leadSurrogate = -1; + } + + return newBuf.slice(0, j); +} + +// Export for testing +DBCSEncoder.prototype.findIdx = findIdx; + + +// == Decoder ================================================================== + +function DBCSDecoder(options, codec) { + // Decoder state + this.nodeIdx = 0; + this.prevBuf = Buffer.alloc(0); + + // Static data + this.decodeTables = codec.decodeTables; + this.decodeTableSeq = codec.decodeTableSeq; + this.defaultCharUnicode = codec.defaultCharUnicode; + this.gb18030 = codec.gb18030; +} + +DBCSDecoder.prototype.write = function(buf) { + var newBuf = Buffer.alloc(buf.length*2), + nodeIdx = this.nodeIdx, + prevBuf = this.prevBuf, prevBufOffset = this.prevBuf.length, + seqStart = -this.prevBuf.length, // idx of the start of current parsed sequence. + uCode; + + if (prevBufOffset > 0) // Make prev buf overlap a little to make it easier to slice later. + prevBuf = Buffer.concat([prevBuf, buf.slice(0, 10)]); + + for (var i = 0, j = 0; i < buf.length; i++) { + var curByte = (i >= 0) ? buf[i] : prevBuf[i + prevBufOffset]; + + // Lookup in current trie node. + var uCode = this.decodeTables[nodeIdx][curByte]; + + if (uCode >= 0) { + // Normal character, just use it. + } + else if (uCode === UNASSIGNED) { // Unknown char. + // TODO: Callback with seq. + //var curSeq = (seqStart >= 0) ? buf.slice(seqStart, i+1) : prevBuf.slice(seqStart + prevBufOffset, i+1 + prevBufOffset); + i = seqStart; // Try to parse again, after skipping first byte of the sequence ('i' will be incremented by 'for' cycle). + uCode = this.defaultCharUnicode.charCodeAt(0); + } + else if (uCode === GB18030_CODE) { + var curSeq = (seqStart >= 0) ? buf.slice(seqStart, i+1) : prevBuf.slice(seqStart + prevBufOffset, i+1 + prevBufOffset); + var ptr = (curSeq[0]-0x81)*12600 + (curSeq[1]-0x30)*1260 + (curSeq[2]-0x81)*10 + (curSeq[3]-0x30); + var idx = findIdx(this.gb18030.gbChars, ptr); + uCode = this.gb18030.uChars[idx] + ptr - this.gb18030.gbChars[idx]; + } + else if (uCode <= NODE_START) { // Go to next trie node. + nodeIdx = NODE_START - uCode; + continue; + } + else if (uCode <= SEQ_START) { // Output a sequence of chars. + var seq = this.decodeTableSeq[SEQ_START - uCode]; + for (var k = 0; k < seq.length - 1; k++) { + uCode = seq[k]; + newBuf[j++] = uCode & 0xFF; + newBuf[j++] = uCode >> 8; + } + uCode = seq[seq.length-1]; + } + else + throw new Error("iconv-lite internal error: invalid decoding table value " + uCode + " at " + nodeIdx + "/" + curByte); + + // Write the character to buffer, handling higher planes using surrogate pair. + if (uCode > 0xFFFF) { + uCode -= 0x10000; + var uCodeLead = 0xD800 + Math.floor(uCode / 0x400); + newBuf[j++] = uCodeLead & 0xFF; + newBuf[j++] = uCodeLead >> 8; + + uCode = 0xDC00 + uCode % 0x400; + } + newBuf[j++] = uCode & 0xFF; + newBuf[j++] = uCode >> 8; + + // Reset trie node. + nodeIdx = 0; seqStart = i+1; + } + + this.nodeIdx = nodeIdx; + this.prevBuf = (seqStart >= 0) ? buf.slice(seqStart) : prevBuf.slice(seqStart + prevBufOffset); + return newBuf.slice(0, j).toString('ucs2'); +} + +DBCSDecoder.prototype.end = function() { + var ret = ''; + + // Try to parse all remaining chars. 
+ while (this.prevBuf.length > 0) { + // Skip 1 character in the buffer. + ret += this.defaultCharUnicode; + var buf = this.prevBuf.slice(1); + + // Parse remaining as usual. + this.prevBuf = Buffer.alloc(0); + this.nodeIdx = 0; + if (buf.length > 0) + ret += this.write(buf); + } + + this.nodeIdx = 0; + return ret; +} + +// Binary search for GB18030. Returns largest i such that table[i] <= val. +function findIdx(table, val) { + if (table[0] > val) + return -1; + + var l = 0, r = table.length; + while (l < r-1) { // always table[l] <= val < table[r] + var mid = l + Math.floor((r-l+1)/2); + if (table[mid] <= val) + l = mid; + else + r = mid; + } + return l; +} + diff --git a/node_modules/iconv-lite/encodings/dbcs-data.js b/node_modules/iconv-lite/encodings/dbcs-data.js new file mode 100644 index 000000000..4b6191434 --- /dev/null +++ b/node_modules/iconv-lite/encodings/dbcs-data.js @@ -0,0 +1,176 @@ +"use strict"; + +// Description of supported double byte encodings and aliases. +// Tables are not require()-d until they are needed to speed up library load. +// require()-s are direct to support Browserify. + +module.exports = { + + // == Japanese/ShiftJIS ==================================================== + // All japanese encodings are based on JIS X set of standards: + // JIS X 0201 - Single-byte encoding of ASCII + ¥ + Kana chars at 0xA1-0xDF. + // JIS X 0208 - Main set of 6879 characters, placed in 94x94 plane, to be encoded by 2 bytes. + // Has several variations in 1978, 1983, 1990 and 1997. + // JIS X 0212 - Supplementary plane of 6067 chars in 94x94 plane. 1990. Effectively dead. + // JIS X 0213 - Extension and modern replacement of 0208 and 0212. Total chars: 11233. + // 2 planes, first is superset of 0208, second - revised 0212. + // Introduced in 2000, revised 2004. Some characters are in Unicode Plane 2 (0x2xxxx) + + // Byte encodings are: + // * Shift_JIS: Compatible with 0201, uses not defined chars in top half as lead bytes for double-byte + // encoding of 0208. Lead byte ranges: 0x81-0x9F, 0xE0-0xEF; Trail byte ranges: 0x40-0x7E, 0x80-0x9E, 0x9F-0xFC. + // Windows CP932 is a superset of Shift_JIS. Some companies added more chars, notably KDDI. + // * EUC-JP: Up to 3 bytes per character. Used mostly on *nixes. + // 0x00-0x7F - lower part of 0201 + // 0x8E, 0xA1-0xDF - upper part of 0201 + // (0xA1-0xFE)x2 - 0208 plane (94x94). + // 0x8F, (0xA1-0xFE)x2 - 0212 plane (94x94). + // * JIS X 208: 7-bit, direct encoding of 0208. Byte ranges: 0x21-0x7E (94 values). Uncommon. + // Used as-is in ISO2022 family. + // * ISO2022-JP: Stateful encoding, with escape sequences to switch between ASCII, + // 0201-1976 Roman, 0208-1978, 0208-1983. + // * ISO2022-JP-1: Adds esc seq for 0212-1990. + // * ISO2022-JP-2: Adds esc seq for GB2313-1980, KSX1001-1992, ISO8859-1, ISO8859-7. + // * ISO2022-JP-3: Adds esc seq for 0201-1976 Kana set, 0213-2000 Planes 1, 2. + // * ISO2022-JP-2004: Adds 0213-2004 Plane 1. + // + // After JIS X 0213 appeared, Shift_JIS-2004, EUC-JISX0213 and ISO2022-JP-2004 followed, with just changing the planes. 
+ // + // Overall, it seems that it's a mess :( http://www8.plala.or.jp/tkubota1/unicode-symbols-map2.html + + 'shiftjis': { + type: '_dbcs', + table: function() { return require('./tables/shiftjis.json') }, + encodeAdd: {'\u00a5': 0x5C, '\u203E': 0x7E}, + encodeSkipVals: [{from: 0xED40, to: 0xF940}], + }, + 'csshiftjis': 'shiftjis', + 'mskanji': 'shiftjis', + 'sjis': 'shiftjis', + 'windows31j': 'shiftjis', + 'ms31j': 'shiftjis', + 'xsjis': 'shiftjis', + 'windows932': 'shiftjis', + 'ms932': 'shiftjis', + '932': 'shiftjis', + 'cp932': 'shiftjis', + + 'eucjp': { + type: '_dbcs', + table: function() { return require('./tables/eucjp.json') }, + encodeAdd: {'\u00a5': 0x5C, '\u203E': 0x7E}, + }, + + // TODO: KDDI extension to Shift_JIS + // TODO: IBM CCSID 942 = CP932, but F0-F9 custom chars and other char changes. + // TODO: IBM CCSID 943 = Shift_JIS = CP932 with original Shift_JIS lower 128 chars. + + + // == Chinese/GBK ========================================================== + // http://en.wikipedia.org/wiki/GBK + // We mostly implement W3C recommendation: https://www.w3.org/TR/encoding/#gbk-encoder + + // Oldest GB2312 (1981, ~7600 chars) is a subset of CP936 + 'gb2312': 'cp936', + 'gb231280': 'cp936', + 'gb23121980': 'cp936', + 'csgb2312': 'cp936', + 'csiso58gb231280': 'cp936', + 'euccn': 'cp936', + + // Microsoft's CP936 is a subset and approximation of GBK. + 'windows936': 'cp936', + 'ms936': 'cp936', + '936': 'cp936', + 'cp936': { + type: '_dbcs', + table: function() { return require('./tables/cp936.json') }, + }, + + // GBK (~22000 chars) is an extension of CP936 that added user-mapped chars and some other. + 'gbk': { + type: '_dbcs', + table: function() { return require('./tables/cp936.json').concat(require('./tables/gbk-added.json')) }, + }, + 'xgbk': 'gbk', + 'isoir58': 'gbk', + + // GB18030 is an algorithmic extension of GBK. + // Main source: https://www.w3.org/TR/encoding/#gbk-encoder + // http://icu-project.org/docs/papers/gb18030.html + // http://source.icu-project.org/repos/icu/data/trunk/charset/data/xml/gb-18030-2000.xml + // http://www.khngai.com/chinese/charmap/tblgbk.php?page=0 + 'gb18030': { + type: '_dbcs', + table: function() { return require('./tables/cp936.json').concat(require('./tables/gbk-added.json')) }, + gb18030: function() { return require('./tables/gb18030-ranges.json') }, + encodeSkipVals: [0x80], + encodeAdd: {'€': 0xA2E3}, + }, + + 'chinese': 'gb18030', + + + // == Korean =============================================================== + // EUC-KR, KS_C_5601 and KS X 1001 are exactly the same. + 'windows949': 'cp949', + 'ms949': 'cp949', + '949': 'cp949', + 'cp949': { + type: '_dbcs', + table: function() { return require('./tables/cp949.json') }, + }, + + 'cseuckr': 'cp949', + 'csksc56011987': 'cp949', + 'euckr': 'cp949', + 'isoir149': 'cp949', + 'korean': 'cp949', + 'ksc56011987': 'cp949', + 'ksc56011989': 'cp949', + 'ksc5601': 'cp949', + + + // == Big5/Taiwan/Hong Kong ================================================ + // There are lots of tables for Big5 and cp950. Please see the following links for history: + // http://moztw.org/docs/big5/ http://www.haible.de/bruno/charsets/conversion-tables/Big5.html + // Variations, in roughly number of defined chars: + // * Windows CP 950: Microsoft variant of Big5. Canonical: http://www.unicode.org/Public/MAPPINGS/VENDORS/MICSFT/WINDOWS/CP950.TXT + // * Windows CP 951: Microsoft variant of Big5-HKSCS-2001. Seems to be never public. 
http://me.abelcheung.org/articles/research/what-is-cp951/ + // * Big5-2003 (Taiwan standard) almost superset of cp950. + // * Unicode-at-on (UAO) / Mozilla 1.8. Falling out of use on the Web. Not supported by other browsers. + // * Big5-HKSCS (-2001, -2004, -2008). Hong Kong standard. + // many unicode code points moved from PUA to Supplementary plane (U+2XXXX) over the years. + // Plus, it has 4 combining sequences. + // Seems that Mozilla refused to support it for 10 yrs. https://bugzilla.mozilla.org/show_bug.cgi?id=162431 https://bugzilla.mozilla.org/show_bug.cgi?id=310299 + // because big5-hkscs is the only encoding to include astral characters in non-algorithmic way. + // Implementations are not consistent within browsers; sometimes labeled as just big5. + // MS Internet Explorer switches from big5 to big5-hkscs when a patch applied. + // Great discussion & recap of what's going on https://bugzilla.mozilla.org/show_bug.cgi?id=912470#c31 + // In the encoder, it might make sense to support encoding old PUA mappings to Big5 bytes seq-s. + // Official spec: http://www.ogcio.gov.hk/en/business/tech_promotion/ccli/terms/doc/2003cmp_2008.txt + // http://www.ogcio.gov.hk/tc/business/tech_promotion/ccli/terms/doc/hkscs-2008-big5-iso.txt + // + // Current understanding of how to deal with Big5(-HKSCS) is in the Encoding Standard, http://encoding.spec.whatwg.org/#big5-encoder + // Unicode mapping (http://www.unicode.org/Public/MAPPINGS/OBSOLETE/EASTASIA/OTHER/BIG5.TXT) is said to be wrong. + + 'windows950': 'cp950', + 'ms950': 'cp950', + '950': 'cp950', + 'cp950': { + type: '_dbcs', + table: function() { return require('./tables/cp950.json') }, + }, + + // Big5 has many variations and is an extension of cp950. We use Encoding Standard's as a consensus. + 'big5': 'big5hkscs', + 'big5hkscs': { + type: '_dbcs', + table: function() { return require('./tables/cp950.json').concat(require('./tables/big5-added.json')) }, + encodeSkipVals: [0xa2cc], + }, + + 'cnbig5': 'big5hkscs', + 'csbig5': 'big5hkscs', + 'xxbig5': 'big5hkscs', +}; diff --git a/node_modules/iconv-lite/encodings/index.js b/node_modules/iconv-lite/encodings/index.js new file mode 100644 index 000000000..e30400317 --- /dev/null +++ b/node_modules/iconv-lite/encodings/index.js @@ -0,0 +1,22 @@ +"use strict"; + +// Update this array if you add/rename/remove files in this directory. +// We support Browserify by skipping automatic module discovery and requiring modules directly. +var modules = [ + require("./internal"), + require("./utf16"), + require("./utf7"), + require("./sbcs-codec"), + require("./sbcs-data"), + require("./sbcs-data-generated"), + require("./dbcs-codec"), + require("./dbcs-data"), +]; + +// Put all encoding/alias/codec definitions to single object and export it. +for (var i = 0; i < modules.length; i++) { + var module = modules[i]; + for (var enc in module) + if (Object.prototype.hasOwnProperty.call(module, enc)) + exports[enc] = module[enc]; +} diff --git a/node_modules/iconv-lite/encodings/internal.js b/node_modules/iconv-lite/encodings/internal.js new file mode 100644 index 000000000..05ce38b27 --- /dev/null +++ b/node_modules/iconv-lite/encodings/internal.js @@ -0,0 +1,188 @@ +"use strict"; +var Buffer = require("safer-buffer").Buffer; + +// Export Node.js internal encodings. 
+ +module.exports = { + // Encodings + utf8: { type: "_internal", bomAware: true}, + cesu8: { type: "_internal", bomAware: true}, + unicode11utf8: "utf8", + + ucs2: { type: "_internal", bomAware: true}, + utf16le: "ucs2", + + binary: { type: "_internal" }, + base64: { type: "_internal" }, + hex: { type: "_internal" }, + + // Codec. + _internal: InternalCodec, +}; + +//------------------------------------------------------------------------------ + +function InternalCodec(codecOptions, iconv) { + this.enc = codecOptions.encodingName; + this.bomAware = codecOptions.bomAware; + + if (this.enc === "base64") + this.encoder = InternalEncoderBase64; + else if (this.enc === "cesu8") { + this.enc = "utf8"; // Use utf8 for decoding. + this.encoder = InternalEncoderCesu8; + + // Add decoder for versions of Node not supporting CESU-8 + if (Buffer.from('eda0bdedb2a9', 'hex').toString() !== '💩') { + this.decoder = InternalDecoderCesu8; + this.defaultCharUnicode = iconv.defaultCharUnicode; + } + } +} + +InternalCodec.prototype.encoder = InternalEncoder; +InternalCodec.prototype.decoder = InternalDecoder; + +//------------------------------------------------------------------------------ + +// We use node.js internal decoder. Its signature is the same as ours. +var StringDecoder = require('string_decoder').StringDecoder; + +if (!StringDecoder.prototype.end) // Node v0.8 doesn't have this method. + StringDecoder.prototype.end = function() {}; + + +function InternalDecoder(options, codec) { + StringDecoder.call(this, codec.enc); +} + +InternalDecoder.prototype = StringDecoder.prototype; + + +//------------------------------------------------------------------------------ +// Encoder is mostly trivial + +function InternalEncoder(options, codec) { + this.enc = codec.enc; +} + +InternalEncoder.prototype.write = function(str) { + return Buffer.from(str, this.enc); +} + +InternalEncoder.prototype.end = function() { +} + + +//------------------------------------------------------------------------------ +// Except base64 encoder, which must keep its state. + +function InternalEncoderBase64(options, codec) { + this.prevStr = ''; +} + +InternalEncoderBase64.prototype.write = function(str) { + str = this.prevStr + str; + var completeQuads = str.length - (str.length % 4); + this.prevStr = str.slice(completeQuads); + str = str.slice(0, completeQuads); + + return Buffer.from(str, "base64"); +} + +InternalEncoderBase64.prototype.end = function() { + return Buffer.from(this.prevStr, "base64"); +} + + +//------------------------------------------------------------------------------ +// CESU-8 encoder is also special. + +function InternalEncoderCesu8(options, codec) { +} + +InternalEncoderCesu8.prototype.write = function(str) { + var buf = Buffer.alloc(str.length * 3), bufIdx = 0; + for (var i = 0; i < str.length; i++) { + var charCode = str.charCodeAt(i); + // Naive implementation, but it works because CESU-8 is especially easy + // to convert from UTF-16 (which all JS strings are encoded in). + if (charCode < 0x80) + buf[bufIdx++] = charCode; + else if (charCode < 0x800) { + buf[bufIdx++] = 0xC0 + (charCode >>> 6); + buf[bufIdx++] = 0x80 + (charCode & 0x3f); + } + else { // charCode will always be < 0x10000 in javascript. 
+ buf[bufIdx++] = 0xE0 + (charCode >>> 12); + buf[bufIdx++] = 0x80 + ((charCode >>> 6) & 0x3f); + buf[bufIdx++] = 0x80 + (charCode & 0x3f); + } + } + return buf.slice(0, bufIdx); +} + +InternalEncoderCesu8.prototype.end = function() { +} + +//------------------------------------------------------------------------------ +// CESU-8 decoder is not implemented in Node v4.0+ + +function InternalDecoderCesu8(options, codec) { + this.acc = 0; + this.contBytes = 0; + this.accBytes = 0; + this.defaultCharUnicode = codec.defaultCharUnicode; +} + +InternalDecoderCesu8.prototype.write = function(buf) { + var acc = this.acc, contBytes = this.contBytes, accBytes = this.accBytes, + res = ''; + for (var i = 0; i < buf.length; i++) { + var curByte = buf[i]; + if ((curByte & 0xC0) !== 0x80) { // Leading byte + if (contBytes > 0) { // Previous code is invalid + res += this.defaultCharUnicode; + contBytes = 0; + } + + if (curByte < 0x80) { // Single-byte code + res += String.fromCharCode(curByte); + } else if (curByte < 0xE0) { // Two-byte code + acc = curByte & 0x1F; + contBytes = 1; accBytes = 1; + } else if (curByte < 0xF0) { // Three-byte code + acc = curByte & 0x0F; + contBytes = 2; accBytes = 1; + } else { // Four or more are not supported for CESU-8. + res += this.defaultCharUnicode; + } + } else { // Continuation byte + if (contBytes > 0) { // We're waiting for it. + acc = (acc << 6) | (curByte & 0x3f); + contBytes--; accBytes++; + if (contBytes === 0) { + // Check for overlong encoding, but support Modified UTF-8 (encoding NULL as C0 80) + if (accBytes === 2 && acc < 0x80 && acc > 0) + res += this.defaultCharUnicode; + else if (accBytes === 3 && acc < 0x800) + res += this.defaultCharUnicode; + else + // Actually add character. + res += String.fromCharCode(acc); + } + } else { // Unexpected continuation byte + res += this.defaultCharUnicode; + } + } + } + this.acc = acc; this.contBytes = contBytes; this.accBytes = accBytes; + return res; +} + +InternalDecoderCesu8.prototype.end = function() { + var res = 0; + if (this.contBytes > 0) + res += this.defaultCharUnicode; + return res; +} diff --git a/node_modules/iconv-lite/encodings/sbcs-codec.js b/node_modules/iconv-lite/encodings/sbcs-codec.js new file mode 100644 index 000000000..abac5ffaa --- /dev/null +++ b/node_modules/iconv-lite/encodings/sbcs-codec.js @@ -0,0 +1,72 @@ +"use strict"; +var Buffer = require("safer-buffer").Buffer; + +// Single-byte codec. Needs a 'chars' string parameter that contains 256 or 128 chars that +// correspond to encoded bytes (if 128 - then lower half is ASCII). + +exports._sbcs = SBCSCodec; +function SBCSCodec(codecOptions, iconv) { + if (!codecOptions) + throw new Error("SBCS codec is called without the data.") + + // Prepare char buffer for decoding. + if (!codecOptions.chars || (codecOptions.chars.length !== 128 && codecOptions.chars.length !== 256)) + throw new Error("Encoding '"+codecOptions.type+"' has incorrect 'chars' (must be of len 128 or 256)"); + + if (codecOptions.chars.length === 128) { + var asciiString = ""; + for (var i = 0; i < 128; i++) + asciiString += String.fromCharCode(i); + codecOptions.chars = asciiString + codecOptions.chars; + } + + this.decodeBuf = Buffer.from(codecOptions.chars, 'ucs2'); + + // Encoding buffer. 
+ var encodeBuf = Buffer.alloc(65536, iconv.defaultCharSingleByte.charCodeAt(0)); + + for (var i = 0; i < codecOptions.chars.length; i++) + encodeBuf[codecOptions.chars.charCodeAt(i)] = i; + + this.encodeBuf = encodeBuf; +} + +SBCSCodec.prototype.encoder = SBCSEncoder; +SBCSCodec.prototype.decoder = SBCSDecoder; + + +function SBCSEncoder(options, codec) { + this.encodeBuf = codec.encodeBuf; +} + +SBCSEncoder.prototype.write = function(str) { + var buf = Buffer.alloc(str.length); + for (var i = 0; i < str.length; i++) + buf[i] = this.encodeBuf[str.charCodeAt(i)]; + + return buf; +} + +SBCSEncoder.prototype.end = function() { +} + + +function SBCSDecoder(options, codec) { + this.decodeBuf = codec.decodeBuf; +} + +SBCSDecoder.prototype.write = function(buf) { + // Strings are immutable in JS -> we use ucs2 buffer to speed up computations. + var decodeBuf = this.decodeBuf; + var newBuf = Buffer.alloc(buf.length*2); + var idx1 = 0, idx2 = 0; + for (var i = 0; i < buf.length; i++) { + idx1 = buf[i]*2; idx2 = i*2; + newBuf[idx2] = decodeBuf[idx1]; + newBuf[idx2+1] = decodeBuf[idx1+1]; + } + return newBuf.toString('ucs2'); +} + +SBCSDecoder.prototype.end = function() { +} diff --git a/node_modules/iconv-lite/encodings/sbcs-data-generated.js b/node_modules/iconv-lite/encodings/sbcs-data-generated.js new file mode 100644 index 000000000..9b4823607 --- /dev/null +++ b/node_modules/iconv-lite/encodings/sbcs-data-generated.js @@ -0,0 +1,451 @@ +"use strict"; + +// Generated data for sbcs codec. Don't edit manually. Regenerate using generation/gen-sbcs.js script. +module.exports = { + "437": "cp437", + "737": "cp737", + "775": "cp775", + "850": "cp850", + "852": "cp852", + "855": "cp855", + "856": "cp856", + "857": "cp857", + "858": "cp858", + "860": "cp860", + "861": "cp861", + "862": "cp862", + "863": "cp863", + "864": "cp864", + "865": "cp865", + "866": "cp866", + "869": "cp869", + "874": "windows874", + "922": "cp922", + "1046": "cp1046", + "1124": "cp1124", + "1125": "cp1125", + "1129": "cp1129", + "1133": "cp1133", + "1161": "cp1161", + "1162": "cp1162", + "1163": "cp1163", + "1250": "windows1250", + "1251": "windows1251", + "1252": "windows1252", + "1253": "windows1253", + "1254": "windows1254", + "1255": "windows1255", + "1256": "windows1256", + "1257": "windows1257", + "1258": "windows1258", + "28591": "iso88591", + "28592": "iso88592", + "28593": "iso88593", + "28594": "iso88594", + "28595": "iso88595", + "28596": "iso88596", + "28597": "iso88597", + "28598": "iso88598", + "28599": "iso88599", + "28600": "iso885910", + "28601": "iso885911", + "28603": "iso885913", + "28604": "iso885914", + "28605": "iso885915", + "28606": "iso885916", + "windows874": { + "type": "_sbcs", + "chars": "€����…�����������‘’“”•–—�������� กขฃคฅฆงจฉชซฌญฎฏฐฑฒณดตถทธนบปผฝพฟภมยรฤลฦวศษสหฬอฮฯะัาำิีึืฺุู����฿เแโใไๅๆ็่้๊๋์ํ๎๏๐๑๒๓๔๕๖๗๘๙๚๛����" + }, + "win874": "windows874", + "cp874": "windows874", + "windows1250": { + "type": "_sbcs", + "chars": "€�‚�„…†‡�‰Š‹ŚŤŽŹ�‘’“”•–—�™š›śťžź ˇ˘Ł¤Ą¦§¨©Ş«¬­®Ż°±˛ł´µ¶·¸ąş»Ľ˝ľżŔÁÂĂÄĹĆÇČÉĘËĚÍÎĎĐŃŇÓÔŐÖ×ŘŮÚŰÜÝŢßŕáâăäĺćçčéęëěíîďđńňóôőö÷řůúűüýţ˙" + }, + "win1250": "windows1250", + "cp1250": "windows1250", + "windows1251": { + "type": "_sbcs", + "chars": "ЂЃ‚ѓ„…†‡€‰Љ‹ЊЌЋЏђ‘’“”•–—�™љ›њќћџ ЎўЈ¤Ґ¦§Ё©Є«¬­®Ї°±Ііґµ¶·ё№є»јЅѕїАБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюя" + }, + "win1251": "windows1251", + "cp1251": "windows1251", + "windows1252": { + "type": "_sbcs", + "chars": "€�‚ƒ„…†‡ˆ‰Š‹Œ�Ž��‘’“”•–—˜™š›œ�žŸ 
¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖ×ØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõö÷øùúûüýþÿ" + }, + "win1252": "windows1252", + "cp1252": "windows1252", + "windows1253": { + "type": "_sbcs", + "chars": "€�‚ƒ„…†‡�‰�‹�����‘’“”•–—�™�›���� ΅Ά£¤¥¦§¨©�«¬­®―°±²³΄µ¶·ΈΉΊ»Ό½ΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡ�ΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύώ�" + }, + "win1253": "windows1253", + "cp1253": "windows1253", + "windows1254": { + "type": "_sbcs", + "chars": "€�‚ƒ„…†‡ˆ‰Š‹Œ����‘’“”•–—˜™š›œ��Ÿ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏĞÑÒÓÔÕÖ×ØÙÚÛÜİŞßàáâãäåæçèéêëìíîïğñòóôõö÷øùúûüışÿ" + }, + "win1254": "windows1254", + "cp1254": "windows1254", + "windows1255": { + "type": "_sbcs", + "chars": "€�‚ƒ„…†‡ˆ‰�‹�����‘’“”•–—˜™�›���� ¡¢£₪¥¦§¨©×«¬­®¯°±²³´µ¶·¸¹÷»¼½¾¿ְֱֲֳִֵֶַָֹֺֻּֽ־ֿ׀ׁׂ׃װױײ׳״�������אבגדהוזחטיךכלםמןנסעףפץצקרשת��‎‏�" + }, + "win1255": "windows1255", + "cp1255": "windows1255", + "windows1256": { + "type": "_sbcs", + "chars": "€پ‚ƒ„…†‡ˆ‰ٹ‹Œچژڈگ‘’“”•–—ک™ڑ›œ‌‍ں ،¢£¤¥¦§¨©ھ«¬­®¯°±²³´µ¶·¸¹؛»¼½¾؟ہءآأؤإئابةتثجحخدذرزسشصض×طظعغـفقكàلâمنهوçèéêëىيîïًٌٍَôُِ÷ّùْûü‎‏ے" + }, + "win1256": "windows1256", + "cp1256": "windows1256", + "windows1257": { + "type": "_sbcs", + "chars": "€�‚�„…†‡�‰�‹�¨ˇ¸�‘’“”•–—�™�›�¯˛� �¢£¤�¦§Ø©Ŗ«¬­®Æ°±²³´µ¶·ø¹ŗ»¼½¾æĄĮĀĆÄÅĘĒČÉŹĖĢĶĪĻŠŃŅÓŌÕÖ×ŲŁŚŪÜŻŽßąįāćäåęēčéźėģķīļšńņóōõö÷ųłśūüżž˙" + }, + "win1257": "windows1257", + "cp1257": "windows1257", + "windows1258": { + "type": "_sbcs", + "chars": "€�‚ƒ„…†‡ˆ‰�‹Œ����‘’“”•–—˜™�›œ��Ÿ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂĂÄÅÆÇÈÉÊË̀ÍÎÏĐÑ̉ÓÔƠÖ×ØÙÚÛÜỮßàáâăäåæçèéêë́íîïđṇ̃óôơö÷øùúûüư₫ÿ" + }, + "win1258": "windows1258", + "cp1258": "windows1258", + "iso88591": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖ×ØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõö÷øùúûüýþÿ" + }, + "cp28591": "iso88591", + "iso88592": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ Ą˘Ł¤ĽŚ§¨ŠŞŤŹ­ŽŻ°ą˛ł´ľśˇ¸šşťź˝žżŔÁÂĂÄĹĆÇČÉĘËĚÍÎĎĐŃŇÓÔŐÖ×ŘŮÚŰÜÝŢßŕáâăäĺćçčéęëěíîďđńňóôőö÷řůúűüýţ˙" + }, + "cp28592": "iso88592", + "iso88593": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ Ħ˘£¤�Ĥ§¨İŞĞĴ­�Ż°ħ²³´µĥ·¸ışğĵ½�żÀÁÂ�ÄĊĈÇÈÉÊËÌÍÎÏ�ÑÒÓÔĠÖ×ĜÙÚÛÜŬŜßàáâ�äċĉçèéêëìíîï�ñòóôġö÷ĝùúûüŭŝ˙" + }, + "cp28593": "iso88593", + "iso88594": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ĄĸŖ¤ĨĻ§¨ŠĒĢŦ­Ž¯°ą˛ŗ´ĩļˇ¸šēģŧŊžŋĀÁÂÃÄÅÆĮČÉĘËĖÍÎĪĐŅŌĶÔÕÖ×ØŲÚÛÜŨŪßāáâãäåæįčéęëėíîīđņōķôõö÷øųúûüũū˙" + }, + "cp28594": "iso88594", + "iso88595": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ЁЂЃЄЅІЇЈЉЊЋЌ­ЎЏАБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюя№ёђѓєѕіїјљњћќ§ўџ" + }, + "cp28595": "iso88595", + "iso88596": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ���¤�������،­�������������؛���؟�ءآأؤإئابةتثجحخدذرزسشصضطظعغ�����ـفقكلمنهوىيًٌٍَُِّْ�������������" + }, + "cp28596": "iso88596", + "iso88597": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ‘’£€₯¦§¨©ͺ«¬­�―°±²³΄΅Ά·ΈΉΊ»Ό½ΎΏΐΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡ�ΣΤΥΦΧΨΩΪΫάέήίΰαβγδεζηθικλμνξοπρςστυφχψωϊϋόύώ�" + }, + "cp28597": "iso88597", + "iso88598": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ �¢£¤¥¦§¨©×«¬­®¯°±²³´µ¶·¸¹÷»¼½¾��������������������������������‗אבגדהוזחטיךכלםמןנסעףפץצקרשת��‎‏�" + }, + "cp28598": "iso88598", + "iso88599": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏĞÑÒÓÔÕÖ×ØÙÚÛÜİŞßàáâãäåæçèéêëìíîïğñòóôõö÷øùúûüışÿ" + }, + "cp28599": "iso88599", + "iso885910": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ 
ĄĒĢĪĨĶ§ĻĐŠŦŽ­ŪŊ°ąēģīĩķ·ļđšŧž―ūŋĀÁÂÃÄÅÆĮČÉĘËĖÍÎÏÐŅŌÓÔÕÖŨØŲÚÛÜÝÞßāáâãäåæįčéęëėíîïðņōóôõöũøųúûüýþĸ" + }, + "cp28600": "iso885910", + "iso885911": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ กขฃคฅฆงจฉชซฌญฎฏฐฑฒณดตถทธนบปผฝพฟภมยรฤลฦวศษสหฬอฮฯะัาำิีึืฺุู����฿เแโใไๅๆ็่้๊๋์ํ๎๏๐๑๒๓๔๕๖๗๘๙๚๛����" + }, + "cp28601": "iso885911", + "iso885913": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ”¢£¤„¦§Ø©Ŗ«¬­®Æ°±²³“µ¶·ø¹ŗ»¼½¾æĄĮĀĆÄÅĘĒČÉŹĖĢĶĪĻŠŃŅÓŌÕÖ×ŲŁŚŪÜŻŽßąįāćäåęēčéźėģķīļšńņóōõö÷ųłśūüżž’" + }, + "cp28603": "iso885913", + "iso885914": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ Ḃḃ£ĊċḊ§Ẁ©ẂḋỲ­®ŸḞḟĠġṀṁ¶ṖẁṗẃṠỳẄẅṡÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏŴÑÒÓÔÕÖṪØÙÚÛÜÝŶßàáâãäåæçèéêëìíîïŵñòóôõöṫøùúûüýŷÿ" + }, + "cp28604": "iso885914", + "iso885915": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£€¥Š§š©ª«¬­®¯°±²³Žµ¶·ž¹º»ŒœŸ¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏÐÑÒÓÔÕÖ×ØÙÚÛÜÝÞßàáâãäåæçèéêëìíîïðñòóôõö÷øùúûüýþÿ" + }, + "cp28605": "iso885915", + "iso885916": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ĄąŁ€„Š§š©Ș«Ź­źŻ°±ČłŽ”¶·žčș»ŒœŸżÀÁÂĂÄĆÆÇÈÉÊËÌÍÎÏĐŃÒÓÔŐÖŚŰÙÚÛÜĘȚßàáâăäćæçèéêëìíîïđńòóôőöśűùúûüęțÿ" + }, + "cp28606": "iso885916", + "cp437": { + "type": "_sbcs", + "chars": "ÇüéâäàåçêëèïîìÄÅÉæÆôöòûùÿÖÜ¢£¥₧ƒáíóúñѪº¿⌐¬½¼¡«»░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + "ibm437": "cp437", + "csibm437": "cp437", + "cp737": { + "type": "_sbcs", + "chars": "ΑΒΓΔΕΖΗΘΙΚΛΜΝΞΟΠΡΣΤΥΦΧΨΩαβγδεζηθικλμνξοπρσςτυφχψ░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀ωάέήϊίόύϋώΆΈΉΊΌΎΏ±≥≤ΪΫ÷≈°∙·√ⁿ²■ " + }, + "ibm737": "cp737", + "csibm737": "cp737", + "cp775": { + "type": "_sbcs", + "chars": "ĆüéāäģåćłēŖŗīŹÄÅÉæÆōöĢ¢ŚśÖÜø£ØפĀĪóŻżź”¦©®¬½¼Ł«»░▒▓│┤ĄČĘĖ╣║╗╝ĮŠ┐└┴┬├─┼ŲŪ╚╔╩╦╠═╬Žąčęėįšųūž┘┌█▄▌▐▀ÓßŌŃõÕµńĶķĻļņĒŅ’­±“¾¶§÷„°∙·¹³²■ " + }, + "ibm775": "cp775", + "csibm775": "cp775", + "cp850": { + "type": "_sbcs", + "chars": "ÇüéâäàåçêëèïîìÄÅÉæÆôöòûùÿÖÜø£Ø׃áíóúñѪº¿®¬½¼¡«»░▒▓│┤ÁÂÀ©╣║╗╝¢¥┐└┴┬├─┼ãÃ╚╔╩╦╠═╬¤ðÐÊËÈıÍÎÏ┘┌█▄¦Ì▀ÓßÔÒõÕµþÞÚÛÙýݯ´­±‗¾¶§÷¸°¨·¹³²■ " + }, + "ibm850": "cp850", + "csibm850": "cp850", + "cp852": { + "type": "_sbcs", + "chars": "ÇüéâäůćçłëŐőîŹÄĆÉĹĺôöĽľŚśÖÜŤťŁ×čáíóúĄąŽžĘ꬟Ⱥ«»░▒▓│┤ÁÂĚŞ╣║╗╝Żż┐└┴┬├─┼Ăă╚╔╩╦╠═╬¤đĐĎËďŇÍÎě┘┌█▄ŢŮ▀ÓßÔŃńňŠšŔÚŕŰýÝţ´­˝˛ˇ˘§÷¸°¨˙űŘř■ " + }, + "ibm852": "cp852", + "csibm852": "cp852", + "cp855": { + "type": "_sbcs", + "chars": "ђЂѓЃёЁєЄѕЅіІїЇјЈљЉњЊћЋќЌўЎџЏюЮъЪаАбБцЦдДеЕфФгГ«»░▒▓│┤хХиИ╣║╗╝йЙ┐└┴┬├─┼кК╚╔╩╦╠═╬¤лЛмМнНоОп┘┌█▄Пя▀ЯрРсСтТуУжЖвВьЬ№­ыЫзЗшШэЭщЩчЧ§■ " + }, + "ibm855": "cp855", + "csibm855": "cp855", + "cp856": { + "type": "_sbcs", + "chars": "אבגדהוזחטיךכלםמןנסעףפץצקרשת�£�×����������®¬½¼�«»░▒▓│┤���©╣║╗╝¢¥┐└┴┬├─┼��╚╔╩╦╠═╬¤���������┘┌█▄¦�▀������µ�������¯´­±‗¾¶§÷¸°¨·¹³²■ " + }, + "ibm856": "cp856", + "csibm856": "cp856", + "cp857": { + "type": "_sbcs", + "chars": "ÇüéâäàåçêëèïîıÄÅÉæÆôöòûùİÖÜø£ØŞşáíóúñÑĞ𿮬½¼¡«»░▒▓│┤ÁÂÀ©╣║╗╝¢¥┐└┴┬├─┼ãÃ╚╔╩╦╠═╬¤ºªÊËÈ�ÍÎÏ┘┌█▄¦Ì▀ÓßÔÒõÕµ�×ÚÛÙìÿ¯´­±�¾¶§÷¸°¨·¹³²■ " + }, + "ibm857": "cp857", + "csibm857": "cp857", + "cp858": { + "type": "_sbcs", + "chars": "ÇüéâäàåçêëèïîìÄÅÉæÆôöòûùÿÖÜø£Ø׃áíóúñѪº¿®¬½¼¡«»░▒▓│┤ÁÂÀ©╣║╗╝¢¥┐└┴┬├─┼ãÃ╚╔╩╦╠═╬¤ðÐÊËÈ€ÍÎÏ┘┌█▄¦Ì▀ÓßÔÒõÕµþÞÚÛÙýݯ´­±‗¾¶§÷¸°¨·¹³²■ " + }, + "ibm858": "cp858", + "csibm858": "cp858", + "cp860": { + "type": "_sbcs", + "chars": "ÇüéâãàÁçêÊèÍÔìÃÂÉÀÈôõòÚùÌÕÜ¢£Ù₧ÓáíóúñѪº¿Ò¬½¼¡«»░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + "ibm860": "cp860", + "csibm860": "cp860", + "cp861": { + "type": "_sbcs", + "chars": 
"ÇüéâäàåçêëèÐðÞÄÅÉæÆôöþûÝýÖÜø£Ø₧ƒáíóúÁÍÓÚ¿⌐¬½¼¡«»░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + "ibm861": "cp861", + "csibm861": "cp861", + "cp862": { + "type": "_sbcs", + "chars": "אבגדהוזחטיךכלםמןנסעףפץצקרשת¢£¥₧ƒáíóúñѪº¿⌐¬½¼¡«»░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + "ibm862": "cp862", + "csibm862": "cp862", + "cp863": { + "type": "_sbcs", + "chars": "ÇüéâÂà¶çêëèïî‗À§ÉÈÊôËÏûù¤ÔÜ¢£ÙÛƒ¦´óú¨¸³¯Î⌐¬½¼¾«»░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + "ibm863": "cp863", + "csibm863": "cp863", + "cp864": { + "type": "_sbcs", + "chars": "\u0000\u0001\u0002\u0003\u0004\u0005\u0006\u0007\b\t\n\u000b\f\r\u000e\u000f\u0010\u0011\u0012\u0013\u0014\u0015\u0016\u0017\u0018\u0019\u001a\u001b\u001c\u001d\u001e\u001f !\"#$٪&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~°·∙√▒─│┼┤┬├┴┐┌└┘β∞φ±½¼≈«»ﻷﻸ��ﻻﻼ� ­ﺂ£¤ﺄ��ﺎﺏﺕﺙ،ﺝﺡﺥ٠١٢٣٤٥٦٧٨٩ﻑ؛ﺱﺵﺹ؟¢ﺀﺁﺃﺅﻊﺋﺍﺑﺓﺗﺛﺟﺣﺧﺩﺫﺭﺯﺳﺷﺻﺿﻁﻅﻋﻏ¦¬÷×ﻉـﻓﻗﻛﻟﻣﻧﻫﻭﻯﻳﺽﻌﻎﻍﻡﹽّﻥﻩﻬﻰﻲﻐﻕﻵﻶﻝﻙﻱ■�" + }, + "ibm864": "cp864", + "csibm864": "cp864", + "cp865": { + "type": "_sbcs", + "chars": "ÇüéâäàåçêëèïîìÄÅÉæÆôöòûùÿÖÜø£Ø₧ƒáíóúñѪº¿⌐¬½¼¡«¤░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + "ibm865": "cp865", + "csibm865": "cp865", + "cp866": { + "type": "_sbcs", + "chars": "АБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмноп░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀рстуфхцчшщъыьэюяЁёЄєЇїЎў°∙·√№¤■ " + }, + "ibm866": "cp866", + "csibm866": "cp866", + "cp869": { + "type": "_sbcs", + "chars": "������Ά�·¬¦‘’Έ―ΉΊΪΌ��ΎΫ©Ώ²³ά£έήίϊΐόύΑΒΓΔΕΖΗ½ΘΙ«»░▒▓│┤ΚΛΜΝ╣║╗╝ΞΟ┐└┴┬├─┼ΠΡ╚╔╩╦╠═╬ΣΤΥΦΧΨΩαβγ┘┌█▄δε▀ζηθικλμνξοπρσςτ΄­±υφχ§ψ΅°¨ωϋΰώ■ " + }, + "ibm869": "cp869", + "csibm869": "cp869", + "cp922": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§¨©ª«¬­®‾°±²³´µ¶·¸¹º»¼½¾¿ÀÁÂÃÄÅÆÇÈÉÊËÌÍÎÏŠÑÒÓÔÕÖ×ØÙÚÛÜÝŽßàáâãäåæçèéêëìíîïšñòóôõö÷øùúûüýžÿ" + }, + "ibm922": "cp922", + "csibm922": "cp922", + "cp1046": { + "type": "_sbcs", + "chars": "ﺈ×÷ﹱˆ■│─┐┌└┘ﹹﹻﹽﹿﹷﺊﻰﻳﻲﻎﻏﻐﻶﻸﻺﻼ ¤ﺋﺑﺗﺛﺟﺣ،­ﺧﺳ٠١٢٣٤٥٦٧٨٩ﺷ؛ﺻﺿﻊ؟ﻋءآأؤإئابةتثجحخدذرزسشصضطﻇعغﻌﺂﺄﺎﻓـفقكلمنهوىيًٌٍَُِّْﻗﻛﻟﻵﻷﻹﻻﻣﻧﻬﻩ�" + }, + "ibm1046": "cp1046", + "csibm1046": "cp1046", + "cp1124": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ЁЂҐЄЅІЇЈЉЊЋЌ­ЎЏАБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюя№ёђґєѕіїјљњћќ§ўџ" + }, + "ibm1124": "cp1124", + "csibm1124": "cp1124", + "cp1125": { + "type": "_sbcs", + "chars": "АБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмноп░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀рстуфхцчшщъыьэюяЁёҐґЄєІіЇї·√№¤■ " + }, + "ibm1125": "cp1125", + "csibm1125": "cp1125", + "cp1129": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§œ©ª«¬­®¯°±²³Ÿµ¶·Œ¹º»¼½¾¿ÀÁÂĂÄÅÆÇÈÉÊË̀ÍÎÏĐÑ̉ÓÔƠÖ×ØÙÚÛÜỮßàáâăäåæçèéêë́íîïđṇ̃óôơö÷øùúûüư₫ÿ" + }, + "ibm1129": "cp1129", + "csibm1129": "cp1129", + "cp1133": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ກຂຄງຈສຊຍດຕຖທນບປຜຝພຟມຢຣລວຫອຮ���ຯະາຳິີຶືຸູຼັົຽ���ເແໂໃໄ່້໊໋໌ໍໆ�ໜໝ₭����������������໐໑໒໓໔໕໖໗໘໙��¢¬¦�" + }, + "ibm1133": "cp1133", + "csibm1133": "cp1133", + "cp1161": { + "type": "_sbcs", + "chars": "��������������������������������่กขฃคฅฆงจฉชซฌญฎฏฐฑฒณดตถทธนบปผฝพฟภมยรฤลฦวศษสหฬอฮฯะัาำิีึืฺุู้๊๋€฿เแโใไๅๆ็่้๊๋์ํ๎๏๐๑๒๓๔๕๖๗๘๙๚๛¢¬¦ " + }, + "ibm1161": "cp1161", + "csibm1161": "cp1161", + "cp1162": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ กขฃคฅฆงจฉชซฌญฎฏฐฑฒณดตถทธนบปผฝพฟภมยรฤลฦวศษสหฬอฮฯะัาำิีึืฺุู����฿เแโใไๅๆ็่้๊๋์ํ๎๏๐๑๒๓๔๕๖๗๘๙๚๛����" + }, 
+ "ibm1162": "cp1162", + "csibm1162": "cp1162", + "cp1163": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£€¥¦§œ©ª«¬­®¯°±²³Ÿµ¶·Œ¹º»¼½¾¿ÀÁÂĂÄÅÆÇÈÉÊË̀ÍÎÏĐÑ̉ÓÔƠÖ×ØÙÚÛÜỮßàáâăäåæçèéêë́íîïđṇ̃óôơö÷øùúûüư₫ÿ" + }, + "ibm1163": "cp1163", + "csibm1163": "cp1163", + "maccroatian": { + "type": "_sbcs", + "chars": "ÄÅÇÉÑÖÜáàâäãåçéèêëíìîïñóòôöõúùûü†°¢£§•¶ß®Š™´¨≠ŽØ∞±≤≥∆µ∂∑∏š∫ªºΩžø¿¡¬√ƒ≈Ć«Č… ÀÃÕŒœĐ—“”‘’÷◊�©⁄¤‹›Æ»–·‚„‰ÂćÁčÈÍÎÏÌÓÔđÒÚÛÙıˆ˜¯πË˚¸Êæˇ" + }, + "maccyrillic": { + "type": "_sbcs", + "chars": "АБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯ†°¢£§•¶І®©™Ђђ≠Ѓѓ∞±≤≥іµ∂ЈЄєЇїЉљЊњјЅ¬√ƒ≈∆«»… ЋћЌќѕ–—“”‘’÷„ЎўЏџ№Ёёяабвгдежзийклмнопрстуфхцчшщъыьэю¤" + }, + "macgreek": { + "type": "_sbcs", + "chars": "Ĺ²É³ÖÜ΅àâä΄¨çéèê룙î‰ôö¦­ùûü†ΓΔΘΛΞΠß®©ΣΪ§≠°·Α±≤≥¥ΒΕΖΗΙΚΜΦΫΨΩάΝ¬ΟΡ≈Τ«»… ΥΧΆΈœ–―“”‘’÷ΉΊΌΎέήίόΏύαβψδεφγηιξκλμνοπώρστθωςχυζϊϋΐΰ�" + }, + "maciceland": { + "type": "_sbcs", + "chars": "ÄÅÇÉÑÖÜáàâäãåçéèêëíìîïñóòôöõúùûüÝ°¢£§•¶ß®©™´¨≠ÆØ∞±≤≥¥µ∂∑∏π∫ªºΩæø¿¡¬√ƒ≈∆«»… ÀÃÕŒœ–—“”‘’÷◊ÿŸ⁄¤ÐðÞþý·‚„‰ÂÊÁËÈÍÎÏÌÓÔ�ÒÚÛÙıˆ˜¯˘˙˚¸˝˛ˇ" + }, + "macroman": { + "type": "_sbcs", + "chars": "ÄÅÇÉÑÖÜáàâäãåçéèêëíìîïñóòôöõúùûü†°¢£§•¶ß®©™´¨≠ÆØ∞±≤≥¥µ∂∑∏π∫ªºΩæø¿¡¬√ƒ≈∆«»… ÀÃÕŒœ–—“”‘’÷◊ÿŸ⁄¤‹›fifl‡·‚„‰ÂÊÁËÈÍÎÏÌÓÔ�ÒÚÛÙıˆ˜¯˘˙˚¸˝˛ˇ" + }, + "macromania": { + "type": "_sbcs", + "chars": "ÄÅÇÉÑÖÜáàâäãåçéèêëíìîïñóòôöõúùûü†°¢£§•¶ß®©™´¨≠ĂŞ∞±≤≥¥µ∂∑∏π∫ªºΩăş¿¡¬√ƒ≈∆«»… ÀÃÕŒœ–—“”‘’÷◊ÿŸ⁄¤‹›Ţţ‡·‚„‰ÂÊÁËÈÍÎÏÌÓÔ�ÒÚÛÙıˆ˜¯˘˙˚¸˝˛ˇ" + }, + "macthai": { + "type": "_sbcs", + "chars": "«»…“”�•‘’� กขฃคฅฆงจฉชซฌญฎฏฐฑฒณดตถทธนบปผฝพฟภมยรฤลฦวศษสหฬอฮฯะัาำิีึืฺุู​–—฿เแโใไๅๆ็่้๊๋์ํ™๏๐๑๒๓๔๕๖๗๘๙®©����" + }, + "macturkish": { + "type": "_sbcs", + "chars": "ÄÅÇÉÑÖÜáàâäãåçéèêëíìîïñóòôöõúùûü†°¢£§•¶ß®©™´¨≠ÆØ∞±≤≥¥µ∂∑∏π∫ªºΩæø¿¡¬√ƒ≈∆«»… ÀÃÕŒœ–—“”‘’÷◊ÿŸĞğİıŞş‡·‚„‰ÂÊÁËÈÍÎÏÌÓÔ�ÒÚÛÙ�ˆ˜¯˘˙˚¸˝˛ˇ" + }, + "macukraine": { + "type": "_sbcs", + "chars": "АБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯ†°Ґ£§•¶І®©™Ђђ≠Ѓѓ∞±≤≥іµґЈЄєЇїЉљЊњјЅ¬√ƒ≈∆«»… ЋћЌќѕ–—“”‘’÷„ЎўЏџ№Ёёяабвгдежзийклмнопрстуфхцчшщъыьэю¤" + }, + "koi8r": { + "type": "_sbcs", + "chars": "─│┌┐└┘├┤┬┴┼▀▄█▌▐░▒▓⌠■∙√≈≤≥ ⌡°²·÷═║╒ё╓╔╕╖╗╘╙╚╛╜╝╞╟╠╡Ё╢╣╤╥╦╧╨╩╪╫╬©юабцдефгхийклмнопярстужвьызшэщчъЮАБЦДЕФГХИЙКЛМНОПЯРСТУЖВЬЫЗШЭЩЧЪ" + }, + "koi8u": { + "type": "_sbcs", + "chars": "─│┌┐└┘├┤┬┴┼▀▄█▌▐░▒▓⌠■∙√≈≤≥ ⌡°²·÷═║╒ёє╔ії╗╘╙╚╛ґ╝╞╟╠╡ЁЄ╣ІЇ╦╧╨╩╪Ґ╬©юабцдефгхийклмнопярстужвьызшэщчъЮАБЦДЕФГХИЙКЛМНОПЯРСТУЖВЬЫЗШЭЩЧЪ" + }, + "koi8ru": { + "type": "_sbcs", + "chars": "─│┌┐└┘├┤┬┴┼▀▄█▌▐░▒▓⌠■∙√≈≤≥ ⌡°²·÷═║╒ёє╔ії╗╘╙╚╛ґў╞╟╠╡ЁЄ╣ІЇ╦╧╨╩╪ҐЎ©юабцдефгхийклмнопярстужвьызшэщчъЮАБЦДЕФГХИЙКЛМНОПЯРСТУЖВЬЫЗШЭЩЧЪ" + }, + "koi8t": { + "type": "_sbcs", + "chars": "қғ‚Ғ„…†‡�‰ҳ‹ҲҷҶ�Қ‘’“”•–—�™�›�����ӯӮё¤ӣ¦§���«¬­®�°±²Ё�Ӣ¶·�№�»���©юабцдефгхийклмнопярстужвьызшэщчъЮАБЦДЕФГХИЙКЛМНОПЯРСТУЖВЬЫЗШЭЩЧЪ" + }, + "armscii8": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ �և։)(»«—.՝,-֊…՜՛՞ԱաԲբԳգԴդԵեԶզԷէԸըԹթԺժԻիԼլԽխԾծԿկՀհՁձՂղՃճՄմՅյՆնՇշՈոՉչՊպՋջՌռՍսՎվՏտՐրՑցՒւՓփՔքՕօՖֆ՚�" + }, + "rk1048": { + "type": "_sbcs", + "chars": "ЂЃ‚ѓ„…†‡€‰Љ‹ЊҚҺЏђ‘’“”•–—�™љ›њқһџ ҰұӘ¤Ө¦§Ё©Ғ«¬­®Ү°±Ііөµ¶·ё№ғ»әҢңүАБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюя" + }, + "tcvn": { + "type": "_sbcs", + "chars": "\u0000ÚỤ\u0003ỪỬỮ\u0007\b\t\n\u000b\f\r\u000e\u000f\u0010ỨỰỲỶỸÝỴ\u0018\u0019\u001a\u001b\u001c\u001d\u001e\u001f !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~ÀẢÃÁẠẶẬÈẺẼÉẸỆÌỈĨÍỊÒỎÕÓỌỘỜỞỠỚỢÙỦŨ ĂÂÊÔƠƯĐăâêôơưđẶ̀̀̉̃́àảãáạẲằẳẵắẴẮẦẨẪẤỀặầẩẫấậèỂẻẽéẹềểễếệìỉỄẾỒĩíịòỔỏõóọồổỗốộờởỡớợùỖủũúụừửữứựỳỷỹýỵỐ" + }, + "georgianacademy": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ 
¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿აბგდევზთიკლმნოპჟრსტუფქღყშჩცძწჭხჯჰჱჲჳჴჵჶçèéêëìíîïðñòóôõö÷øùúûüýþÿ" + }, + "georgianps": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ¡¢£¤¥¦§¨©ª«¬­®¯°±²³´µ¶·¸¹º»¼½¾¿აბგდევზჱთიკლმნჲოპჟრსტჳუფქღყშჩცძწჭხჴჯჰჵæçèéêëìíîïðñòóôõö÷øùúûüýþÿ" + }, + "pt154": { + "type": "_sbcs", + "chars": "ҖҒӮғ„…ҶҮҲүҠӢҢҚҺҸҗ‘’“”•–—ҳҷҡӣңқһҹ ЎўЈӨҘҰ§Ё©Ә«¬ӯ®Ҝ°ұІіҙө¶·ё№ә»јҪҫҝАБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюя" + }, + "viscii": { + "type": "_sbcs", + "chars": "\u0000\u0001Ẳ\u0003\u0004ẴẪ\u0007\b\t\n\u000b\f\r\u000e\u000f\u0010\u0011\u0012\u0013Ỷ\u0015\u0016\u0017\u0018Ỹ\u001a\u001b\u001c\u001dỴ\u001f !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}~ẠẮẰẶẤẦẨẬẼẸẾỀỂỄỆỐỒỔỖỘỢỚỜỞỊỎỌỈỦŨỤỲÕắằặấầẩậẽẹếềểễệốồổỗỠƠộờởịỰỨỪỬơớƯÀÁÂÃẢĂẳẵÈÉÊẺÌÍĨỳĐứÒÓÔạỷừửÙÚỹỵÝỡưàáâãảăữẫèéêẻìíĩỉđựòóôõỏọụùúũủýợỮ" + }, + "iso646cn": { + "type": "_sbcs", + "chars": "\u0000\u0001\u0002\u0003\u0004\u0005\u0006\u0007\b\t\n\u000b\f\r\u000e\u000f\u0010\u0011\u0012\u0013\u0014\u0015\u0016\u0017\u0018\u0019\u001a\u001b\u001c\u001d\u001e\u001f !\"#¥%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[\\]^_`abcdefghijklmnopqrstuvwxyz{|}‾��������������������������������������������������������������������������������������������������������������������������������" + }, + "iso646jp": { + "type": "_sbcs", + "chars": "\u0000\u0001\u0002\u0003\u0004\u0005\u0006\u0007\b\t\n\u000b\f\r\u000e\u000f\u0010\u0011\u0012\u0013\u0014\u0015\u0016\u0017\u0018\u0019\u001a\u001b\u001c\u001d\u001e\u001f !\"#$%&'()*+,-./0123456789:;<=>?@ABCDEFGHIJKLMNOPQRSTUVWXYZ[¥]^_`abcdefghijklmnopqrstuvwxyz{|}‾��������������������������������������������������������������������������������������������������������������������������������" + }, + "hproman8": { + "type": "_sbcs", + "chars": "€‚ƒ„…†‡ˆ‰Š‹ŒŽ‘’“”•–—˜™š›œžŸ ÀÂÈÊËÎÏ´ˋˆ¨˜ÙÛ₤¯Ýý°ÇçÑñ¡¿¤£¥§ƒ¢âêôûáéóúàèòùäëöüÅîØÆåíøæÄìÖÜÉïßÔÁÃãÐðÍÌÓÒÕõŠšÚŸÿÞþ·µ¶¾—¼½ªº«■»±�" + }, + "macintosh": { + "type": "_sbcs", + "chars": "ÄÅÇÉÑÖÜáàâäãåçéèêëíìîïñóòôöõúùûü†°¢£§•¶ß®©™´¨≠ÆØ∞±≤≥¥µ∂∑∏π∫ªºΩæø¿¡¬√ƒ≈∆«»… ÀÃÕŒœ–—“”‘’÷◊ÿŸ⁄¤‹›fifl‡·‚„‰ÂÊÁËÈÍÎÏÌÓÔ�ÒÚÛÙıˆ˜¯˘˙˚¸˝˛ˇ" + }, + "ascii": { + "type": "_sbcs", + "chars": "��������������������������������������������������������������������������������������������������������������������������������" + }, + "tis620": { + "type": "_sbcs", + "chars": "���������������������������������กขฃคฅฆงจฉชซฌญฎฏฐฑฒณดตถทธนบปผฝพฟภมยรฤลฦวศษสหฬอฮฯะัาำิีึืฺุู����฿เแโใไๅๆ็่้๊๋์ํ๎๏๐๑๒๓๔๕๖๗๘๙๚๛����" + } +} \ No newline at end of file diff --git a/node_modules/iconv-lite/encodings/sbcs-data.js b/node_modules/iconv-lite/encodings/sbcs-data.js new file mode 100644 index 000000000..fdb81a39a --- /dev/null +++ b/node_modules/iconv-lite/encodings/sbcs-data.js @@ -0,0 +1,174 @@ +"use strict"; + +// Manually added data to be used by sbcs codec in addition to generated one. + +module.exports = { + // Not supported by iconv, not sure why. 
+ "10029": "maccenteuro", + "maccenteuro": { + "type": "_sbcs", + "chars": "ÄĀāÉĄÖÜáąČäčĆć鏟ĎíďĒēĖóėôöõúĚěü†°Ę£§•¶ß®©™ę¨≠ģĮįĪ≤≥īĶ∂∑łĻļĽľĹĺŅņѬ√ńŇ∆«»… ňŐÕőŌ–—“”‘’÷◊ōŔŕŘ‹›řŖŗŠ‚„šŚśÁŤťÍŽžŪÓÔūŮÚůŰűŲųÝýķŻŁżĢˇ" + }, + + "808": "cp808", + "ibm808": "cp808", + "cp808": { + "type": "_sbcs", + "chars": "АБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмноп░▒▓│┤╡╢╖╕╣║╗╝╜╛┐└┴┬├─┼╞╟╚╔╩╦╠═╬╧╨╤╥╙╘╒╓╫╪┘┌█▄▌▐▀рстуфхцчшщъыьэюяЁёЄєЇїЎў°∙·√№€■ " + }, + + "mik": { + "type": "_sbcs", + "chars": "АБВГДЕЖЗИЙКЛМНОПРСТУФХЦЧШЩЪЫЬЭЮЯабвгдежзийклмнопрстуфхцчшщъыьэюя└┴┬├─┼╣║╚╔╩╦╠═╬┐░▒▓│┤№§╗╝┘┌█▄▌▐▀αßΓπΣσµτΦΘΩδ∞φε∩≡±≥≤⌠⌡÷≈°∙·√ⁿ²■ " + }, + + // Aliases of generated encodings. + "ascii8bit": "ascii", + "usascii": "ascii", + "ansix34": "ascii", + "ansix341968": "ascii", + "ansix341986": "ascii", + "csascii": "ascii", + "cp367": "ascii", + "ibm367": "ascii", + "isoir6": "ascii", + "iso646us": "ascii", + "iso646irv": "ascii", + "us": "ascii", + + "latin1": "iso88591", + "latin2": "iso88592", + "latin3": "iso88593", + "latin4": "iso88594", + "latin5": "iso88599", + "latin6": "iso885910", + "latin7": "iso885913", + "latin8": "iso885914", + "latin9": "iso885915", + "latin10": "iso885916", + + "csisolatin1": "iso88591", + "csisolatin2": "iso88592", + "csisolatin3": "iso88593", + "csisolatin4": "iso88594", + "csisolatincyrillic": "iso88595", + "csisolatinarabic": "iso88596", + "csisolatingreek" : "iso88597", + "csisolatinhebrew": "iso88598", + "csisolatin5": "iso88599", + "csisolatin6": "iso885910", + + "l1": "iso88591", + "l2": "iso88592", + "l3": "iso88593", + "l4": "iso88594", + "l5": "iso88599", + "l6": "iso885910", + "l7": "iso885913", + "l8": "iso885914", + "l9": "iso885915", + "l10": "iso885916", + + "isoir14": "iso646jp", + "isoir57": "iso646cn", + "isoir100": "iso88591", + "isoir101": "iso88592", + "isoir109": "iso88593", + "isoir110": "iso88594", + "isoir144": "iso88595", + "isoir127": "iso88596", + "isoir126": "iso88597", + "isoir138": "iso88598", + "isoir148": "iso88599", + "isoir157": "iso885910", + "isoir166": "tis620", + "isoir179": "iso885913", + "isoir199": "iso885914", + "isoir203": "iso885915", + "isoir226": "iso885916", + + "cp819": "iso88591", + "ibm819": "iso88591", + + "cyrillic": "iso88595", + + "arabic": "iso88596", + "arabic8": "iso88596", + "ecma114": "iso88596", + "asmo708": "iso88596", + + "greek" : "iso88597", + "greek8" : "iso88597", + "ecma118" : "iso88597", + "elot928" : "iso88597", + + "hebrew": "iso88598", + "hebrew8": "iso88598", + + "turkish": "iso88599", + "turkish8": "iso88599", + + "thai": "iso885911", + "thai8": "iso885911", + + "celtic": "iso885914", + "celtic8": "iso885914", + "isoceltic": "iso885914", + + "tis6200": "tis620", + "tis62025291": "tis620", + "tis62025330": "tis620", + + "10000": "macroman", + "10006": "macgreek", + "10007": "maccyrillic", + "10079": "maciceland", + "10081": "macturkish", + + "cspc8codepage437": "cp437", + "cspc775baltic": "cp775", + "cspc850multilingual": "cp850", + "cspcp852": "cp852", + "cspc862latinhebrew": "cp862", + "cpgr": "cp869", + + "msee": "cp1250", + "mscyrl": "cp1251", + "msansi": "cp1252", + "msgreek": "cp1253", + "msturk": "cp1254", + "mshebr": "cp1255", + "msarab": "cp1256", + "winbaltrim": "cp1257", + + "cp20866": "koi8r", + "20866": "koi8r", + "ibm878": "koi8r", + "cskoi8r": "koi8r", + + "cp21866": "koi8u", + "21866": "koi8u", + "ibm1168": "koi8u", + + "strk10482002": "rk1048", + + "tcvn5712": "tcvn", + "tcvn57121": "tcvn", + + "gb198880": "iso646cn", + "cn": "iso646cn", + + "csiso14jisc6220ro": "iso646jp", + "jisc62201969ro": "iso646jp", + "jp": 
"iso646jp", + + "cshproman8": "hproman8", + "r8": "hproman8", + "roman8": "hproman8", + "xroman8": "hproman8", + "ibm1051": "hproman8", + + "mac": "macintosh", + "csmacintosh": "macintosh", +}; + diff --git a/node_modules/iconv-lite/encodings/tables/big5-added.json b/node_modules/iconv-lite/encodings/tables/big5-added.json new file mode 100644 index 000000000..3c3d3c2f7 --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/big5-added.json @@ -0,0 +1,122 @@ +[ +["8740","䏰䰲䘃䖦䕸𧉧䵷䖳𧲱䳢𧳅㮕䜶䝄䱇䱀𤊿𣘗𧍒𦺋𧃒䱗𪍑䝏䗚䲅𧱬䴇䪤䚡𦬣爥𥩔𡩣𣸆𣽡晍囻"], +["8767","綕夝𨮹㷴霴𧯯寛𡵞媤㘥𩺰嫑宷峼杮薓𩥅瑡璝㡵𡵓𣚞𦀡㻬"], +["87a1","𥣞㫵竼龗𤅡𨤍𣇪𠪊𣉞䌊蒄龖鐯䤰蘓墖靊鈘秐稲晠権袝瑌篅枂稬剏遆㓦珄𥶹瓆鿇垳䤯呌䄱𣚎堘穲𧭥讏䚮𦺈䆁𥶙箮𢒼鿈𢓁𢓉𢓌鿉蔄𣖻䂴鿊䓡𪷿拁灮鿋"], +["8840","㇀",4,"𠄌㇅𠃑𠃍㇆㇇𠃋𡿨㇈𠃊㇉㇊㇋㇌𠄎㇍㇎ĀÁǍÀĒÉĚÈŌÓǑÒ࿿Ê̄Ế࿿Ê̌ỀÊāáǎàɑēéěèīíǐìōóǒòūúǔùǖǘǚ"], +["88a1","ǜü࿿ê̄ế࿿ê̌ềêɡ⏚⏛"], +["8940","𪎩𡅅"], +["8943","攊"], +["8946","丽滝鵎釟"], +["894c","𧜵撑会伨侨兖兴农凤务动医华发变团声处备夲头学实実岚庆总斉柾栄桥济炼电纤纬纺织经统缆缷艺苏药视设询车轧轮"], +["89a1","琑糼緍楆竉刧"], +["89ab","醌碸酞肼"], +["89b0","贋胶𠧧"], +["89b5","肟黇䳍鷉鸌䰾𩷶𧀎鸊𪄳㗁"], +["89c1","溚舾甙"], +["89c5","䤑马骏龙禇𨑬𡷊𠗐𢫦两亁亀亇亿仫伷㑌侽㹈倃傈㑽㒓㒥円夅凛凼刅争剹劐匧㗇厩㕑厰㕓参吣㕭㕲㚁咓咣咴咹哐哯唘唣唨㖘唿㖥㖿嗗㗅"], +["8a40","𧶄唥"], +["8a43","𠱂𠴕𥄫喐𢳆㧬𠍁蹆𤶸𩓥䁓𨂾睺𢰸㨴䟕𨅝𦧲𤷪擝𠵼𠾴𠳕𡃴撍蹾𠺖𠰋𠽤𢲩𨉖𤓓"], +["8a64","𠵆𩩍𨃩䟴𤺧𢳂骲㩧𩗴㿭㔆𥋇𩟔𧣈𢵄鵮頕"], +["8a76","䏙𦂥撴哣𢵌𢯊𡁷㧻𡁯"], +["8aa1","𦛚𦜖𧦠擪𥁒𠱃蹨𢆡𨭌𠜱"], +["8aac","䠋𠆩㿺塳𢶍"], +["8ab2","𤗈𠓼𦂗𠽌𠶖啹䂻䎺"], +["8abb","䪴𢩦𡂝膪飵𠶜捹㧾𢝵跀嚡摼㹃"], +["8ac9","𪘁𠸉𢫏𢳉"], +["8ace","𡃈𣧂㦒㨆𨊛㕸𥹉𢃇噒𠼱𢲲𩜠㒼氽𤸻"], +["8adf","𧕴𢺋𢈈𪙛𨳍𠹺𠰴𦠜羓𡃏𢠃𢤹㗻𥇣𠺌𠾍𠺪㾓𠼰𠵇𡅏𠹌"], +["8af6","𠺫𠮩𠵈𡃀𡄽㿹𢚖搲𠾭"], +["8b40","𣏴𧘹𢯎𠵾𠵿𢱑𢱕㨘𠺘𡃇𠼮𪘲𦭐𨳒𨶙𨳊閪哌苄喹"], +["8b55","𩻃鰦骶𧝞𢷮煀腭胬尜𦕲脴㞗卟𨂽醶𠻺𠸏𠹷𠻻㗝𤷫㘉𠳖嚯𢞵𡃉𠸐𠹸𡁸𡅈𨈇𡑕𠹹𤹐𢶤婔𡀝𡀞𡃵𡃶垜𠸑"], +["8ba1","𧚔𨋍𠾵𠹻𥅾㜃𠾶𡆀𥋘𪊽𤧚𡠺𤅷𨉼墙剨㘚𥜽箲孨䠀䬬鼧䧧鰟鮍𥭴𣄽嗻㗲嚉丨夂𡯁屮靑𠂆乛亻㔾尣彑忄㣺扌攵歺氵氺灬爫丬犭𤣩罒礻糹罓𦉪㓁"], +["8bde","𦍋耂肀𦘒𦥑卝衤见𧢲讠贝钅镸长门𨸏韦页风飞饣𩠐鱼鸟黄歯龜丷𠂇阝户钢"], +["8c40","倻淾𩱳龦㷉袏𤅎灷峵䬠𥇍㕙𥴰愢𨨲辧釶熑朙玺𣊁𪄇㲋𡦀䬐磤琂冮𨜏䀉橣𪊺䈣蘏𠩯稪𩥇𨫪靕灍匤𢁾鏴盙𨧣龧矝亣俰傼丯众龨吴綋墒壐𡶶庒庙忂𢜒斋"], +["8ca1","𣏹椙橃𣱣泿"], +["8ca7","爀𤔅玌㻛𤨓嬕璹讃𥲤𥚕窓篬糃繬苸薗龩袐龪躹龫迏蕟駠鈡龬𨶹𡐿䁱䊢娚"], +["8cc9","顨杫䉶圽"], +["8cce","藖𤥻芿𧄍䲁𦵴嵻𦬕𦾾龭龮宖龯曧繛湗秊㶈䓃𣉖𢞖䎚䔶"], +["8ce6","峕𣬚諹屸㴒𣕑嵸龲煗䕘𤃬𡸣䱷㥸㑊𠆤𦱁諌侴𠈹妿腬顖𩣺弻"], +["8d40","𠮟"], +["8d42","𢇁𨥭䄂䚻𩁹㼇龳𪆵䃸㟖䛷𦱆䅼𨚲𧏿䕭㣔𥒚䕡䔛䶉䱻䵶䗪㿈𤬏㙡䓞䒽䇭崾嵈嵖㷼㠏嶤嶹㠠㠸幂庽弥徃㤈㤔㤿㥍惗愽峥㦉憷憹懏㦸戬抐拥挘㧸嚱"], +["8da1","㨃揢揻搇摚㩋擀崕嘡龟㪗斆㪽旿晓㫲暒㬢朖㭂枤栀㭘桊梄㭲㭱㭻椉楃牜楤榟榅㮼槖㯝橥橴橱檂㯬檙㯲檫檵櫔櫶殁毁毪汵沪㳋洂洆洦涁㳯涤涱渕渘温溆𨧀溻滢滚齿滨滩漤漴㵆𣽁澁澾㵪㵵熷岙㶊瀬㶑灐灔灯灿炉𠌥䏁㗱𠻘"], +["8e40","𣻗垾𦻓焾𥟠㙎榢𨯩孴穉𥣡𩓙穥穽𥦬窻窰竂竃燑𦒍䇊竚竝竪䇯咲𥰁笋筕笩𥌎𥳾箢筯莜𥮴𦱿篐萡箒箸𥴠㶭𥱥蒒篺簆簵𥳁籄粃𤢂粦晽𤕸糉糇糦籴糳糵糎"], +["8ea1","繧䔝𦹄絝𦻖璍綉綫焵綳緒𤁗𦀩緤㴓緵𡟹緥𨍭縝𦄡𦅚繮纒䌫鑬縧罀罁罇礶𦋐駡羗𦍑羣𡙡𠁨䕜𣝦䔃𨌺翺𦒉者耈耝耨耯𪂇𦳃耻耼聡𢜔䦉𦘦𣷣𦛨朥肧𨩈脇脚墰𢛶汿𦒘𤾸擧𡒊舘𡡞橓𤩥𤪕䑺舩𠬍𦩒𣵾俹𡓽蓢荢𦬊𤦧𣔰𡝳𣷸芪椛芳䇛"], +["8f40","蕋苐茚𠸖𡞴㛁𣅽𣕚艻苢茘𣺋𦶣𦬅𦮗𣗎㶿茝嗬莅䔋𦶥莬菁菓㑾𦻔橗蕚㒖𦹂𢻯葘𥯤葱㷓䓤檧葊𣲵祘蒨𦮖𦹷𦹃蓞萏莑䒠蒓蓤𥲑䉀𥳀䕃蔴嫲𦺙䔧蕳䔖枿蘖"], +["8fa1","𨘥𨘻藁𧂈蘂𡖂𧃍䕫䕪蘨㙈𡢢号𧎚虾蝱𪃸蟮𢰧螱蟚蠏噡虬桖䘏衅衆𧗠𣶹𧗤衞袜䙛袴袵揁装睷𧜏覇覊覦覩覧覼𨨥觧𧤤𧪽誜瞓釾誐𧩙竩𧬺𣾏䜓𧬸煼謌謟𥐰𥕥謿譌譍誩𤩺讐讛誯𡛟䘕衏貛𧵔𧶏貫㜥𧵓賖𧶘𧶽贒贃𡤐賛灜贑𤳉㻐起"], +["9040","趩𨀂𡀔𤦊㭼𨆼𧄌竧躭躶軃鋔輙輭𨍥𨐒辥錃𪊟𠩐辳䤪𨧞𨔽𣶻廸𣉢迹𪀔𨚼𨔁𢌥㦀𦻗逷𨔼𧪾遡𨕬𨘋邨𨜓郄𨛦邮都酧㫰醩釄粬𨤳𡺉鈎沟鉁鉢𥖹銹𨫆𣲛𨬌𥗛"], +["90a1","𠴱錬鍫𨫡𨯫炏嫃𨫢𨫥䥥鉄𨯬𨰹𨯿鍳鑛躼閅閦鐦閠濶䊹𢙺𨛘𡉼𣸮䧟氜陻隖䅬隣𦻕懚隶磵𨫠隽双䦡𦲸𠉴𦐐𩂯𩃥𤫑𡤕𣌊霱虂霶䨏䔽䖅𤫩灵孁霛靜𩇕靗孊𩇫靟鐥僐𣂷𣂼鞉鞟鞱鞾韀韒韠𥑬韮琜𩐳響韵𩐝𧥺䫑頴頳顋顦㬎𧅵㵑𠘰𤅜"], +["9140","𥜆飊颷飈飇䫿𦴧𡛓喰飡飦飬鍸餹𤨩䭲𩡗𩤅駵騌騻騐驘𥜥㛄𩂱𩯕髠髢𩬅髴䰎鬔鬭𨘀倴鬴𦦨㣃𣁽魐魀𩴾婅𡡣鮎𤉋鰂鯿鰌𩹨鷔𩾷𪆒𪆫𪃡𪄣𪇟鵾鶃𪄴鸎梈"], +["91a1","鷄𢅛𪆓𪈠𡤻𪈳鴹𪂹𪊴麐麕麞麢䴴麪麯𤍤黁㭠㧥㴝伲㞾𨰫鼂鼈䮖鐤𦶢鼗鼖鼹嚟嚊齅馸𩂋韲葿齢齩竜龎爖䮾𤥵𤦻煷𤧸𤍈𤩑玞𨯚𡣺禟𨥾𨸶鍩鏳𨩄鋬鎁鏋𨥬𤒹爗㻫睲穃烐𤑳𤏸煾𡟯炣𡢾𣖙㻇𡢅𥐯𡟸㜢𡛻𡠹㛡𡝴𡣑𥽋㜣𡛀坛𤨥𡏾𡊨"], +["9240","𡏆𡒶蔃𣚦蔃葕𤦔𧅥𣸱𥕜𣻻𧁒䓴𣛮𩦝𦼦柹㜳㰕㷧塬𡤢栐䁗𣜿𤃡𤂋𤄏𦰡哋嚞𦚱嚒𠿟𠮨𠸍鏆𨬓鎜仸儫㠙𤐶亼𠑥𠍿佋侊𥙑婨𠆫𠏋㦙𠌊𠐔㐵伩𠋀𨺳𠉵諚𠈌亘"], +["92a1","働儍侢伃𤨎𣺊佂倮偬傁俌俥偘僼兙兛兝兞湶𣖕𣸹𣺿浲𡢄𣺉冨凃𠗠䓝𠒣𠒒𠒑赺𨪜𠜎剙劤𠡳勡鍮䙺熌𤎌𠰠𤦬𡃤槑𠸝瑹㻞璙琔瑖玘䮎𤪼𤂍叐㖄爏𤃉喴𠍅响𠯆圝鉝雴鍦埝垍坿㘾壋媙𨩆𡛺𡝯𡜐娬妸銏婾嫏娒𥥆𡧳𡡡𤊕㛵洅瑃娡𥺃"], +["9340","媁𨯗𠐓鏠璌𡌃焅䥲鐈𨧻鎽㞠尞岞幞幈𡦖𡥼𣫮廍孏𡤃𡤄㜁𡢠㛝𡛾㛓脪𨩇𡶺𣑲𨦨弌弎𡤧𡞫婫𡜻孄蘔𧗽衠恾𢡠𢘫忛㺸𢖯𢖾𩂈𦽳懀𠀾𠁆𢘛憙憘恵𢲛𢴇𤛔𩅍"], +["93a1","摱𤙥𢭪㨩𢬢𣑐𩣪𢹸挷𪑛撶挱揑𤧣𢵧护𢲡搻敫楲㯴𣂎𣊭𤦉𣊫唍𣋠𡣙𩐿曎𣊉𣆳㫠䆐𥖄𨬢𥖏𡛼𥕛𥐥磮𣄃𡠪𣈴㑤𣈏𣆂𤋉暎𦴤晫䮓昰𧡰𡷫晣𣋒𣋡昞𥡲㣑𣠺𣞼㮙𣞢𣏾瓐㮖枏𤘪梶栞㯄檾㡣𣟕𤒇樳橒櫉欅𡤒攑梘橌㯗橺歗𣿀𣲚鎠鋲𨯪𨫋"], +["9440","銉𨀞𨧜鑧涥漋𤧬浧𣽿㶏渄𤀼娽渊塇洤硂焻𤌚𤉶烱牐犇犔𤞏𤜥兹𤪤𠗫瑺𣻸𣙟𤩊𤤗𥿡㼆㺱𤫟𨰣𣼵悧㻳瓌琼鎇琷䒟𦷪䕑疃㽣𤳙𤴆㽘畕癳𪗆㬙瑨𨫌𤦫𤦎㫻"], +["94a1","㷍𤩎㻿𤧅𤣳釺圲鍂𨫣𡡤僟𥈡𥇧睸𣈲眎眏睻𤚗𣞁㩞𤣰琸璛㺿𤪺𤫇䃈𤪖𦆮錇𥖁砞碍碈磒珐祙𧝁𥛣䄎禛蒖禥樭𣻺稺秴䅮𡛦䄲鈵秱𠵌𤦌𠊙𣶺𡝮㖗啫㕰㚪𠇔𠰍竢婙𢛵𥪯𥪜娍𠉛磰娪𥯆竾䇹籝籭䈑𥮳𥺼𥺦糍𤧹𡞰粎籼粮檲緜縇緓罎𦉡"], +["9540","𦅜𧭈綗𥺂䉪𦭵𠤖柖𠁎𣗏埄𦐒𦏸𤥢翝笧𠠬𥫩𥵃笌𥸎駦虅驣樜𣐿㧢𤧷𦖭騟𦖠蒀𧄧𦳑䓪脷䐂胆脉腂𦞴飃𦩂艢艥𦩑葓𦶧蘐𧈛媆䅿𡡀嬫𡢡嫤𡣘蚠蜨𣶏蠭𧐢娂"], +["95a1","衮佅袇袿裦襥襍𥚃襔𧞅𧞄𨯵𨯙𨮜𨧹㺭蒣䛵䛏㟲訽訜𩑈彍鈫𤊄旔焩烄𡡅鵭貟賩𧷜妚矃姰䍮㛔踪躧𤰉輰轊䋴汘澻𢌡䢛潹溋𡟚鯩㚵𤤯邻邗啱䤆醻鐄𨩋䁢𨫼鐧𨰝𨰻蓥訫閙閧閗閖𨴴瑅㻂𤣿𤩂𤏪㻧𣈥随𨻧𨹦𨹥㻌𤧭𤩸𣿮琒瑫㻼靁𩂰"], +["9640","桇䨝𩂓𥟟靝鍨𨦉𨰦𨬯𦎾銺嬑譩䤼珹𤈛鞛靱餸𠼦巁𨯅𤪲頟𩓚鋶𩗗釥䓀𨭐𤩧𨭤飜𨩅㼀鈪䤥萔餻饍𧬆㷽馛䭯馪驜𨭥𥣈檏騡嫾騯𩣱䮐𩥈馼䮽䮗鍽塲𡌂堢𤦸"], 
+["96a1","𡓨硄𢜟𣶸棅㵽鑘㤧慐𢞁𢥫愇鱏鱓鱻鰵鰐魿鯏𩸭鮟𪇵𪃾鴡䲮𤄄鸘䲰鴌𪆴𪃭𪃳𩤯鶥蒽𦸒𦿟𦮂藼䔳𦶤𦺄𦷰萠藮𦸀𣟗𦁤秢𣖜𣙀䤭𤧞㵢鏛銾鍈𠊿碹鉷鑍俤㑀遤𥕝砽硔碶硋𡝗𣇉𤥁㚚佲濚濙瀞瀞吔𤆵垻壳垊鴖埗焴㒯𤆬燫𦱀𤾗嬨𡞵𨩉"], +["9740","愌嫎娋䊼𤒈㜬䭻𨧼鎻鎸𡣖𠼝葲𦳀𡐓𤋺𢰦𤏁妔𣶷𦝁綨𦅛𦂤𤦹𤦋𨧺鋥珢㻩璴𨭣𡢟㻡𤪳櫘珳珻㻖𤨾𤪔𡟙𤩦𠎧𡐤𤧥瑈𤤖炥𤥶銄珦鍟𠓾錱𨫎𨨖鎆𨯧𥗕䤵𨪂煫"], +["97a1","𤥃𠳿嚤𠘚𠯫𠲸唂秄𡟺緾𡛂𤩐𡡒䔮鐁㜊𨫀𤦭妰𡢿𡢃𧒄媡㛢𣵛㚰鉟婹𨪁𡡢鍴㳍𠪴䪖㦊僴㵩㵌𡎜煵䋻𨈘渏𩃤䓫浗𧹏灧沯㳖𣿭𣸭渂漌㵯𠏵畑㚼㓈䚀㻚䡱姄鉮䤾轁𨰜𦯀堒埈㛖𡑒烾𤍢𤩱𢿣𡊰𢎽梹楧𡎘𣓥𧯴𣛟𨪃𣟖𣏺𤲟樚𣚭𦲷萾䓟䓎"], +["9840","𦴦𦵑𦲂𦿞漗𧄉茽𡜺菭𦲀𧁓𡟛妉媂𡞳婡婱𡤅𤇼㜭姯𡜼㛇熎鎐暚𤊥婮娫𤊓樫𣻹𧜶𤑛𤋊焝𤉙𨧡侰𦴨峂𤓎𧹍𤎽樌𤉖𡌄炦焳𤏩㶥泟勇𤩏繥姫崯㷳彜𤩝𡟟綤萦"], +["98a1","咅𣫺𣌀𠈔坾𠣕𠘙㿥𡾞𪊶瀃𩅛嵰玏糓𨩙𩐠俈翧狍猐𧫴猸猹𥛶獁獈㺩𧬘遬燵𤣲珡臶㻊県㻑沢国琙琞琟㻢㻰㻴㻺瓓㼎㽓畂畭畲疍㽼痈痜㿀癍㿗癴㿜発𤽜熈嘣覀塩䀝睃䀹条䁅㗛瞘䁪䁯属瞾矋売砘点砜䂨砹硇硑硦葈𥔵礳栃礲䄃"], +["9940","䄉禑禙辻稆込䅧窑䆲窼艹䇄竏竛䇏両筢筬筻簒簛䉠䉺类粜䊌粸䊔糭输烀𠳏総緔緐緽羮羴犟䎗耠耥笹耮耱联㷌垴炠肷胩䏭脌猪脎脒畠脔䐁㬹腖腙腚"], +["99a1","䐓堺腼膄䐥膓䐭膥埯臁臤艔䒏芦艶苊苘苿䒰荗险榊萅烵葤惣蒈䔄蒾蓡蓸蔐蔸蕒䔻蕯蕰藠䕷虲蚒蚲蛯际螋䘆䘗袮裿褤襇覑𧥧訩訸誔誴豑賔賲贜䞘塟跃䟭仮踺嗘坔蹱嗵躰䠷軎転軤軭軲辷迁迊迌逳駄䢭飠鈓䤞鈨鉘鉫銱銮銿"], +["9a40","鋣鋫鋳鋴鋽鍃鎄鎭䥅䥑麿鐗匁鐝鐭鐾䥪鑔鑹锭関䦧间阳䧥枠䨤靀䨵鞲韂噔䫤惨颹䬙飱塄餎餙冴餜餷饂饝饢䭰駅䮝騼鬏窃魩鮁鯝鯱鯴䱭鰠㝯𡯂鵉鰺"], +["9aa1","黾噐鶓鶽鷀鷼银辶鹻麬麱麽黆铜黢黱黸竈齄𠂔𠊷𠎠椚铃妬𠓗塀铁㞹𠗕𠘕𠙶𡚺块煳𠫂𠫍𠮿呪吆𠯋咞𠯻𠰻𠱓𠱥𠱼惧𠲍噺𠲵𠳝𠳭𠵯𠶲𠷈楕鰯螥𠸄𠸎𠻗𠾐𠼭𠹳尠𠾼帋𡁜𡁏𡁶朞𡁻𡂈𡂖㙇𡂿𡃓𡄯𡄻卤蒭𡋣𡍵𡌶讁𡕷𡘙𡟃𡟇乸炻𡠭𡥪"], +["9b40","𡨭𡩅𡰪𡱰𡲬𡻈拃𡻕𡼕熘桕𢁅槩㛈𢉼𢏗𢏺𢜪𢡱𢥏苽𢥧𢦓𢫕覥𢫨辠𢬎鞸𢬿顇骽𢱌"], +["9b62","𢲈𢲷𥯨𢴈𢴒𢶷𢶕𢹂𢽴𢿌𣀳𣁦𣌟𣏞徱晈暿𧩹𣕧𣗳爁𤦺矗𣘚𣜖纇𠍆墵朎"], +["9ba1","椘𣪧𧙗𥿢𣸑𣺹𧗾𢂚䣐䪸𤄙𨪚𤋮𤌍𤀻𤌴𤎖𤩅𠗊凒𠘑妟𡺨㮾𣳿𤐄𤓖垈𤙴㦛𤜯𨗨𩧉㝢𢇃譞𨭎駖𤠒𤣻𤨕爉𤫀𠱸奥𤺥𤾆𠝹軚𥀬劏圿煱𥊙𥐙𣽊𤪧喼𥑆𥑮𦭒釔㑳𥔿𧘲𥕞䜘𥕢𥕦𥟇𤤿𥡝偦㓻𣏌惞𥤃䝼𨥈𥪮𥮉𥰆𡶐垡煑澶𦄂𧰒遖𦆲𤾚譢𦐂𦑊"], +["9c40","嵛𦯷輶𦒄𡤜諪𤧶𦒈𣿯𦔒䯀𦖿𦚵𢜛鑥𥟡憕娧晉侻嚹𤔡𦛼乪𤤴陖涏𦲽㘘襷𦞙𦡮𦐑𦡞營𦣇筂𩃀𠨑𦤦鄄𦤹穅鷰𦧺騦𦨭㙟𦑩𠀡禃𦨴𦭛崬𣔙菏𦮝䛐𦲤画补𦶮墶"], +["9ca1","㜜𢖍𧁋𧇍㱔𧊀𧊅銁𢅺𧊋錰𧋦𤧐氹钟𧑐𠻸蠧裵𢤦𨑳𡞱溸𤨪𡠠㦤㚹尐秣䔿暶𩲭𩢤襃𧟌𧡘囖䃟𡘊㦡𣜯𨃨𡏅熭荦𧧝𩆨婧䲷𧂯𨦫𧧽𧨊𧬋𧵦𤅺筃祾𨀉澵𪋟樃𨌘厢𦸇鎿栶靝𨅯𨀣𦦵𡏭𣈯𨁈嶅𨰰𨂃圕頣𨥉嶫𤦈斾槕叒𤪥𣾁㰑朶𨂐𨃴𨄮𡾡𨅏"], +["9d40","𨆉𨆯𨈚𨌆𨌯𨎊㗊𨑨𨚪䣺揦𨥖砈鉕𨦸䏲𨧧䏟𨧨𨭆𨯔姸𨰉輋𨿅𩃬筑𩄐𩄼㷷𩅞𤫊运犏嚋𩓧𩗩𩖰𩖸𩜲𩣑𩥉𩥪𩧃𩨨𩬎𩵚𩶛纟𩻸𩼣䲤镇𪊓熢𪋿䶑递𪗋䶜𠲜达嗁"], +["9da1","辺𢒰边𤪓䔉繿潖檱仪㓤𨬬𧢝㜺躀𡟵𨀤𨭬𨮙𧨾𦚯㷫𧙕𣲷𥘵𥥖亚𥺁𦉘嚿𠹭踎孭𣺈𤲞揞拐𡟶𡡻攰嘭𥱊吚𥌑㷆𩶘䱽嘢嘞罉𥻘奵𣵀蝰东𠿪𠵉𣚺脗鵞贘瘻鱅癎瞹鍅吲腈苷嘥脲萘肽嗪祢噃吖𠺝㗎嘅嗱曱𨋢㘭甴嗰喺咗啲𠱁𠲖廐𥅈𠹶𢱢"], +["9e40","𠺢麫絚嗞𡁵抝靭咔賍燶酶揼掹揾啩𢭃鱲𢺳冚㓟𠶧冧呍唞唓癦踭𦢊疱肶蠄螆裇膶萜𡃁䓬猄𤜆宐茋𦢓噻𢛴𧴯𤆣𧵳𦻐𧊶酰𡇙鈈𣳼𪚩𠺬𠻹牦𡲢䝎𤿂𧿹𠿫䃺"], +["9ea1","鱝攟𢶠䣳𤟠𩵼𠿬𠸊恢𧖣𠿭"], +["9ead","𦁈𡆇熣纎鵐业丄㕷嬍沲卧㚬㧜卽㚥𤘘墚𤭮舭呋垪𥪕𠥹"], +["9ec5","㩒𢑥獴𩺬䴉鯭𣳾𩼰䱛𤾩𩖞𩿞葜𣶶𧊲𦞳𣜠挮紥𣻷𣸬㨪逈勌㹴㙺䗩𠒎癀嫰𠺶硺𧼮墧䂿噼鮋嵴癔𪐴麅䳡痹㟻愙𣃚𤏲"], +["9ef5","噝𡊩垧𤥣𩸆刴𧂮㖭汊鵼"], +["9f40","籖鬹埞𡝬屓擓𩓐𦌵𧅤蚭𠴨𦴢𤫢𠵱"], +["9f4f","凾𡼏嶎霃𡷑麁遌笟鬂峑箣扨挵髿篏鬪籾鬮籂粆鰕篼鬉鼗鰛𤤾齚啳寃俽麘俲剠㸆勑坧偖妷帒韈鶫轜呩鞴饀鞺匬愰"], +["9fa1","椬叚鰊鴂䰻陁榀傦畆𡝭駚剳"], +["9fae","酙隁酜"], +["9fb2","酑𨺗捿𦴣櫊嘑醎畺抅𠏼獏籰𥰡𣳽"], +["9fc1","𤤙盖鮝个𠳔莾衂"], +["9fc9","届槀僭坺刟巵从氱𠇲伹咜哚劚趂㗾弌㗳"], +["9fdb","歒酼龥鮗頮颴骺麨麄煺笔"], +["9fe7","毺蠘罸"], +["9feb","嘠𪙊蹷齓"], +["9ff0","跔蹏鸜踁抂𨍽踨蹵竓𤩷稾磘泪詧瘇"], +["a040","𨩚鼦泎蟖痃𪊲硓咢贌狢獱謭猂瓱賫𤪻蘯徺袠䒷"], +["a055","𡠻𦸅"], +["a058","詾𢔛"], +["a05b","惽癧髗鵄鍮鮏蟵"], +["a063","蠏賷猬霡鮰㗖犲䰇籑饊𦅙慙䰄麖慽"], +["a073","坟慯抦戹拎㩜懢厪𣏵捤栂㗒"], +["a0a1","嵗𨯂迚𨸹"], +["a0a6","僙𡵆礆匲阸𠼻䁥"], +["a0ae","矾"], +["a0b0","糂𥼚糚稭聦聣絍甅瓲覔舚朌聢𧒆聛瓰脃眤覉𦟌畓𦻑螩蟎臈螌詉貭譃眫瓸蓚㘵榲趦"], +["a0d4","覩瑨涹蟁𤀑瓧㷛煶悤憜㳑煢恷"], +["a0e2","罱𨬭牐惩䭾删㰘𣳇𥻗𧙖𥔱𡥄𡋾𩤃𦷜𧂭峁𦆭𨨏𣙷𠃮𦡆𤼎䕢嬟𦍌齐麦𦉫"], +["a3c0","␀",31,"␡"], +["c6a1","①",9,"⑴",9,"ⅰ",9,"丶丿亅亠冂冖冫勹匸卩厶夊宀巛⼳广廴彐彡攴无疒癶辵隶¨ˆヽヾゝゞ〃仝々〆〇ー[]✽ぁ",23], +["c740","す",58,"ァアィイ"], +["c7a1","ゥ",81,"А",5,"ЁЖ",4], +["c840","Л",26,"ёж",25,"⇧↸↹㇏𠃌乚𠂊刂䒑"], +["c8a1","龰冈龱𧘇"], +["c8cd","¬¦'"㈱№℡゛゜⺀⺄⺆⺇⺈⺊⺌⺍⺕⺜⺝⺥⺧⺪⺬⺮⺶⺼⺾⻆⻊⻌⻍⻏⻖⻗⻞⻣"], +["c8f5","ʃɐɛɔɵœøŋʊɪ"], +["f9fe","■"], +["fa40","𠕇鋛𠗟𣿅蕌䊵珯况㙉𤥂𨧤鍄𡧛苮𣳈砼杄拟𤤳𨦪𠊠𦮳𡌅侫𢓭倈𦴩𧪄𣘀𤪱𢔓倩𠍾徤𠎀𠍇滛𠐟偽儁㑺儎顬㝃萖𤦤𠒇兠𣎴兪𠯿𢃼𠋥𢔰𠖎𣈳𡦃宂蝽𠖳𣲙冲冸"], +["faa1","鴴凉减凑㳜凓𤪦决凢卂凭菍椾𣜭彻刋刦刼劵剗劔効勅簕蕂勠蘍𦬓包𨫞啉滙𣾀𠥔𣿬匳卄𠯢泋𡜦栛珕恊㺪㣌𡛨燝䒢卭却𨚫卾卿𡖖𡘓矦厓𨪛厠厫厮玧𥝲㽙玜叁叅汉义埾叙㪫𠮏叠𣿫𢶣叶𠱷吓灹唫晗浛呭𦭓𠵴啝咏咤䞦𡜍𠻝㶴𠵍"], +["fb40","𨦼𢚘啇䳭启琗喆喩嘅𡣗𤀺䕒𤐵暳𡂴嘷曍𣊊暤暭噍噏磱囱鞇叾圀囯园𨭦㘣𡉏坆𤆥汮炋坂㚱𦱾埦𡐖堃𡑔𤍣堦𤯵塜墪㕡壠壜𡈼壻寿坃𪅐𤉸鏓㖡够梦㛃湙"], +["fba1","𡘾娤啓𡚒蔅姉𠵎𦲁𦴪𡟜姙𡟻𡞲𦶦浱𡠨𡛕姹𦹅媫婣㛦𤦩婷㜈媖瑥嫓𦾡𢕔㶅𡤑㜲𡚸広勐孶斈孼𧨎䀄䡝𠈄寕慠𡨴𥧌𠖥寳宝䴐尅𡭄尓珎尔𡲥𦬨屉䣝岅峩峯嶋𡷹𡸷崐崘嵆𡺤岺巗苼㠭𤤁𢁉𢅳芇㠶㯂帮檊幵幺𤒼𠳓厦亷廐厨𡝱帉廴𨒂"], +["fc40","廹廻㢠廼栾鐛弍𠇁弢㫞䢮𡌺强𦢈𢏐彘𢑱彣鞽𦹮彲鍀𨨶徧嶶㵟𥉐𡽪𧃸𢙨釖𠊞𨨩怱暅𡡷㥣㷇㘹垐𢞴祱㹀悞悤悳𤦂𤦏𧩓璤僡媠慤萤慂慈𦻒憁凴𠙖憇宪𣾷"], +["fca1","𢡟懓𨮝𩥝懐㤲𢦀𢣁怣慜攞掋𠄘担𡝰拕𢸍捬𤧟㨗搸揸𡎎𡟼撐澊𢸶頔𤂌𥜝擡擥鑻㩦携㩗敍漖𤨨𤨣斅敭敟𣁾斵𤥀䬷旑䃘𡠩无旣忟𣐀昘𣇷𣇸晄𣆤𣆥晋𠹵晧𥇦晳晴𡸽𣈱𨗴𣇈𥌓矅𢣷馤朂𤎜𤨡㬫槺𣟂杞杧杢𤇍𩃭柗䓩栢湐鈼栁𣏦𦶠桝"], +["fd40","𣑯槡樋𨫟楳棃𣗍椁椀㴲㨁𣘼㮀枬楡𨩊䋼椶榘㮡𠏉荣傐槹𣙙𢄪橅𣜃檝㯳枱櫈𩆜㰍欝𠤣惞欵歴𢟍溵𣫛𠎵𡥘㝀吡𣭚毡𣻼毜氷𢒋𤣱𦭑汚舦汹𣶼䓅𣶽𤆤𤤌𤤀"], +["fda1","𣳉㛥㳫𠴲鮃𣇹𢒑羏样𦴥𦶡𦷫涖浜湼漄𤥿𤂅𦹲蔳𦽴凇沜渝萮𨬡港𣸯瑓𣾂秌湏媑𣁋濸㜍澝𣸰滺𡒗𤀽䕕鏰潄潜㵎潴𩅰㴻澟𤅄濓𤂑𤅕𤀹𣿰𣾴𤄿凟𤅖𤅗𤅀𦇝灋灾炧炁烌烕烖烟䄄㷨熴熖𤉷焫煅媈煊煮岜𤍥煏鍢𤋁焬𤑚𤨧𤨢熺𨯨炽爎"], +["fe40","鑂爕夑鑃爤鍁𥘅爮牀𤥴梽牕牗㹕𣁄栍漽犂猪猫𤠣𨠫䣭𨠄猨献珏玪𠰺𦨮珉瑉𤇢𡛧𤨤昣㛅𤦷𤦍𤧻珷琕椃𤨦琹𠗃㻗瑜𢢭瑠𨺲瑇珤瑶莹瑬㜰瑴鏱樬璂䥓𤪌"], +["fea1","𤅟𤩹𨮏孆𨰃𡢞瓈𡦈甎瓩甞𨻙𡩋寗𨺬鎅畍畊畧畮𤾂㼄𤴓疎瑝疞疴瘂瘬癑癏癯癶𦏵皐臯㟸𦤑𦤎皡皥皷盌𦾟葢𥂝𥅽𡸜眞眦着撯𥈠睘𣊬瞯𨥤𨥨𡛁矴砉𡍶𤨒棊碯磇磓隥礮𥗠磗礴碱𧘌辸袄𨬫𦂃𢘜禆褀椂禀𥡗禝𧬹礼禩渪𧄦㺨秆𩄍秔"] +] diff --git a/node_modules/iconv-lite/encodings/tables/cp936.json 
b/node_modules/iconv-lite/encodings/tables/cp936.json new file mode 100644 index 000000000..49ddb9a1d --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/cp936.json @@ -0,0 +1,264 @@ +[ +["0","\u0000",127,"€"], +["8140","丂丄丅丆丏丒丗丟丠両丣並丩丮丯丱丳丵丷丼乀乁乂乄乆乊乑乕乗乚乛乢乣乤乥乧乨乪",5,"乲乴",9,"乿",6,"亇亊"], +["8180","亐亖亗亙亜亝亞亣亪亯亰亱亴亶亷亸亹亼亽亾仈仌仏仐仒仚仛仜仠仢仦仧仩仭仮仯仱仴仸仹仺仼仾伀伂",6,"伋伌伒",4,"伜伝伡伣伨伩伬伭伮伱伳伵伷伹伻伾",4,"佄佅佇",5,"佒佔佖佡佢佦佨佪佫佭佮佱佲併佷佸佹佺佽侀侁侂侅來侇侊侌侎侐侒侓侕侖侘侙侚侜侞侟価侢"], +["8240","侤侫侭侰",4,"侶",8,"俀俁係俆俇俈俉俋俌俍俒",4,"俙俛俠俢俤俥俧俫俬俰俲俴俵俶俷俹俻俼俽俿",11], +["8280","個倎倐們倓倕倖倗倛倝倞倠倢倣値倧倫倯",10,"倻倽倿偀偁偂偄偅偆偉偊偋偍偐",4,"偖偗偘偙偛偝",7,"偦",5,"偭",8,"偸偹偺偼偽傁傂傃傄傆傇傉傊傋傌傎",20,"傤傦傪傫傭",4,"傳",6,"傼"], +["8340","傽",17,"僐",5,"僗僘僙僛",10,"僨僩僪僫僯僰僱僲僴僶",4,"僼",9,"儈"], +["8380","儉儊儌",5,"儓",13,"儢",28,"兂兇兊兌兎兏児兒兓兗兘兙兛兝",4,"兣兤兦內兩兪兯兲兺兾兿冃冄円冇冊冋冎冏冐冑冓冔冘冚冝冞冟冡冣冦",4,"冭冮冴冸冹冺冾冿凁凂凃凅凈凊凍凎凐凒",5], +["8440","凘凙凚凜凞凟凢凣凥",5,"凬凮凱凲凴凷凾刄刅刉刋刌刏刐刓刔刕刜刞刟刡刢刣別刦刧刪刬刯刱刲刴刵刼刾剄",5,"剋剎剏剒剓剕剗剘"], +["8480","剙剚剛剝剟剠剢剣剤剦剨剫剬剭剮剰剱剳",9,"剾劀劃",4,"劉",6,"劑劒劔",6,"劜劤劥劦劧劮劯劰労",9,"勀勁勂勄勅勆勈勊勌勍勎勏勑勓勔動勗務",5,"勠勡勢勣勥",10,"勱",7,"勻勼勽匁匂匃匄匇匉匊匋匌匎"], +["8540","匑匒匓匔匘匛匜匞匟匢匤匥匧匨匩匫匬匭匯",9,"匼匽區卂卄卆卋卌卍卐協単卙卛卝卥卨卪卬卭卲卶卹卻卼卽卾厀厁厃厇厈厊厎厏"], +["8580","厐",4,"厖厗厙厛厜厞厠厡厤厧厪厫厬厭厯",6,"厷厸厹厺厼厽厾叀參",4,"収叏叐叒叓叕叚叜叝叞叡叢叧叴叺叾叿吀吂吅吇吋吔吘吙吚吜吢吤吥吪吰吳吶吷吺吽吿呁呂呄呅呇呉呌呍呎呏呑呚呝",4,"呣呥呧呩",7,"呴呹呺呾呿咁咃咅咇咈咉咊咍咑咓咗咘咜咞咟咠咡"], +["8640","咢咥咮咰咲咵咶咷咹咺咼咾哃哅哊哋哖哘哛哠",4,"哫哬哯哰哱哴",5,"哻哾唀唂唃唄唅唈唊",4,"唒唓唕",5,"唜唝唞唟唡唥唦"], +["8680","唨唩唫唭唲唴唵唶唸唹唺唻唽啀啂啅啇啈啋",4,"啑啒啓啔啗",4,"啝啞啟啠啢啣啨啩啫啯",5,"啹啺啽啿喅喆喌喍喎喐喒喓喕喖喗喚喛喞喠",6,"喨",8,"喲喴営喸喺喼喿",4,"嗆嗇嗈嗊嗋嗎嗏嗐嗕嗗",4,"嗞嗠嗢嗧嗩嗭嗮嗰嗱嗴嗶嗸",4,"嗿嘂嘃嘄嘅"], +["8740","嘆嘇嘊嘋嘍嘐",7,"嘙嘚嘜嘝嘠嘡嘢嘥嘦嘨嘩嘪嘫嘮嘯嘰嘳嘵嘷嘸嘺嘼嘽嘾噀",11,"噏",4,"噕噖噚噛噝",4], +["8780","噣噥噦噧噭噮噯噰噲噳噴噵噷噸噹噺噽",7,"嚇",6,"嚐嚑嚒嚔",14,"嚤",10,"嚰",6,"嚸嚹嚺嚻嚽",12,"囋",8,"囕囖囘囙囜団囥",5,"囬囮囯囲図囶囷囸囻囼圀圁圂圅圇國",6], +["8840","園",9,"圝圞圠圡圢圤圥圦圧圫圱圲圴",4,"圼圽圿坁坃坄坅坆坈坉坋坒",4,"坘坙坢坣坥坧坬坮坰坱坲坴坵坸坹坺坽坾坿垀"], +["8880","垁垇垈垉垊垍",4,"垔",6,"垜垝垞垟垥垨垪垬垯垰垱垳垵垶垷垹",8,"埄",6,"埌埍埐埑埓埖埗埛埜埞埡埢埣埥",7,"埮埰埱埲埳埵埶執埻埼埾埿堁堃堄堅堈堉堊堌堎堏堐堒堓堔堖堗堘堚堛堜堝堟堢堣堥",4,"堫",4,"報堲堳場堶",7], +["8940","堾",5,"塅",6,"塎塏塐塒塓塕塖塗塙",4,"塟",5,"塦",4,"塭",16,"塿墂墄墆墇墈墊墋墌"], +["8980","墍",4,"墔",4,"墛墜墝墠",7,"墪",17,"墽墾墿壀壂壃壄壆",10,"壒壓壔壖",13,"壥",5,"壭壯壱売壴壵壷壸壺",7,"夃夅夆夈",4,"夎夐夑夒夓夗夘夛夝夞夠夡夢夣夦夨夬夰夲夳夵夶夻"], +["8a40","夽夾夿奀奃奅奆奊奌奍奐奒奓奙奛",4,"奡奣奤奦",12,"奵奷奺奻奼奾奿妀妅妉妋妌妎妏妐妑妔妕妘妚妛妜妝妟妠妡妢妦"], +["8a80","妧妬妭妰妱妳",5,"妺妼妽妿",6,"姇姈姉姌姍姎姏姕姖姙姛姞",4,"姤姦姧姩姪姫姭",11,"姺姼姽姾娀娂娊娋娍娎娏娐娒娔娕娖娗娙娚娛娝娞娡娢娤娦娧娨娪",6,"娳娵娷",4,"娽娾娿婁",4,"婇婈婋",9,"婖婗婘婙婛",5], +["8b40","婡婣婤婥婦婨婩婫",8,"婸婹婻婼婽婾媀",17,"媓",6,"媜",13,"媫媬"], +["8b80","媭",4,"媴媶媷媹",4,"媿嫀嫃",5,"嫊嫋嫍",4,"嫓嫕嫗嫙嫚嫛嫝嫞嫟嫢嫤嫥嫧嫨嫪嫬",4,"嫲",22,"嬊",11,"嬘",25,"嬳嬵嬶嬸",7,"孁",6], +["8c40","孈",7,"孒孖孞孠孡孧孨孫孭孮孯孲孴孶孷學孹孻孼孾孿宂宆宊宍宎宐宑宒宔宖実宧宨宩宬宭宮宯宱宲宷宺宻宼寀寁寃寈寉寊寋寍寎寏"], +["8c80","寑寔",8,"寠寢寣實寧審",4,"寯寱",6,"寽対尀専尃尅將專尋尌對導尐尒尓尗尙尛尞尟尠尡尣尦尨尩尪尫尭尮尯尰尲尳尵尶尷屃屄屆屇屌屍屒屓屔屖屗屘屚屛屜屝屟屢層屧",6,"屰屲",6,"屻屼屽屾岀岃",4,"岉岊岋岎岏岒岓岕岝",4,"岤",4], +["8d40","岪岮岯岰岲岴岶岹岺岻岼岾峀峂峃峅",5,"峌",5,"峓",5,"峚",6,"峢峣峧峩峫峬峮峯峱",9,"峼",4], +["8d80","崁崄崅崈",5,"崏",4,"崕崗崘崙崚崜崝崟",4,"崥崨崪崫崬崯",4,"崵",7,"崿",7,"嵈嵉嵍",10,"嵙嵚嵜嵞",10,"嵪嵭嵮嵰嵱嵲嵳嵵",12,"嶃",21,"嶚嶛嶜嶞嶟嶠"], +["8e40","嶡",21,"嶸",12,"巆",6,"巎",12,"巜巟巠巣巤巪巬巭"], +["8e80","巰巵巶巸",4,"巿帀帄帇帉帊帋帍帎帒帓帗帞",7,"帨",4,"帯帰帲",4,"帹帺帾帿幀幁幃幆",5,"幍",6,"幖",4,"幜幝幟幠幣",14,"幵幷幹幾庁庂広庅庈庉庌庍庎庒庘庛庝庡庢庣庤庨",4,"庮",4,"庴庺庻庼庽庿",6], +["8f40","廆廇廈廋",5,"廔廕廗廘廙廚廜",11,"廩廫",8,"廵廸廹廻廼廽弅弆弇弉弌弍弎弐弒弔弖弙弚弜弝弞弡弢弣弤"], +["8f80","弨弫弬弮弰弲",6,"弻弽弾弿彁",14,"彑彔彙彚彛彜彞彟彠彣彥彧彨彫彮彯彲彴彵彶彸彺彽彾彿徃徆徍徎徏徑従徔徖徚徛徝從徟徠徢",5,"復徫徬徯",5,"徶徸徹徺徻徾",4,"忇忈忊忋忎忓忔忕忚忛応忞忟忢忣忥忦忨忩忬忯忰忲忳忴忶忷忹忺忼怇"], +["9040","怈怉怋怌怐怑怓怗怘怚怞怟怢怣怤怬怭怮怰",4,"怶",4,"怽怾恀恄",6,"恌恎恏恑恓恔恖恗恘恛恜恞恟恠恡恥恦恮恱恲恴恵恷恾悀"], +["9080","悁悂悅悆悇悈悊悋悎悏悐悑悓悕悗悘悙悜悞悡悢悤悥悧悩悪悮悰悳悵悶悷悹悺悽",7,"惇惈惉惌",4,"惒惓惔惖惗惙惛惞惡",4,"惪惱惲惵惷惸惻",4,"愂愃愄愅愇愊愋愌愐",4,"愖愗愘愙愛愜愝愞愡愢愥愨愩愪愬",18,"慀",6], +["9140","慇慉態慍慏慐慒慓慔慖",6,"慞慟慠慡慣慤慥慦慩",6,"慱慲慳慴慶慸",18,"憌憍憏",4,"憕"], +["9180","憖",6,"憞",8,"憪憫憭",9,"憸",5,"憿懀懁懃",4,"應懌",4,"懓懕",16,"懧",13,"懶",8,"戀",5,"戇戉戓戔戙戜戝戞戠戣戦戧戨戩戫戭戯戰戱戲戵戶戸",4,"扂扄扅扆扊"], 
+["9240","扏扐払扖扗扙扚扜",6,"扤扥扨扱扲扴扵扷扸扺扻扽抁抂抃抅抆抇抈抋",5,"抔抙抜抝択抣抦抧抩抪抭抮抯抰抲抳抴抶抷抸抺抾拀拁"], +["9280","拃拋拏拑拕拝拞拠拡拤拪拫拰拲拵拸拹拺拻挀挃挄挅挆挊挋挌挍挏挐挒挓挔挕挗挘挙挜挦挧挩挬挭挮挰挱挳",5,"挻挼挾挿捀捁捄捇捈捊捑捒捓捔捖",7,"捠捤捥捦捨捪捫捬捯捰捲捳捴捵捸捹捼捽捾捿掁掃掄掅掆掋掍掑掓掔掕掗掙",6,"採掤掦掫掯掱掲掵掶掹掻掽掿揀"], +["9340","揁揂揃揅揇揈揊揋揌揑揓揔揕揗",6,"揟揢揤",4,"揫揬揮揯揰揱揳揵揷揹揺揻揼揾搃搄搆",4,"損搎搑搒搕",5,"搝搟搢搣搤"], +["9380","搥搧搨搩搫搮",5,"搵",4,"搻搼搾摀摂摃摉摋",6,"摓摕摖摗摙",4,"摟",7,"摨摪摫摬摮",9,"摻",6,"撃撆撈",8,"撓撔撗撘撚撛撜撝撟",4,"撥撦撧撨撪撫撯撱撲撳撴撶撹撻撽撾撿擁擃擄擆",6,"擏擑擓擔擕擖擙據"], +["9440","擛擜擝擟擠擡擣擥擧",24,"攁",7,"攊",7,"攓",4,"攙",8], +["9480","攢攣攤攦",4,"攬攭攰攱攲攳攷攺攼攽敀",4,"敆敇敊敋敍敎敐敒敓敔敗敘敚敜敟敠敡敤敥敧敨敩敪敭敮敯敱敳敵敶數",14,"斈斉斊斍斎斏斒斔斕斖斘斚斝斞斠斢斣斦斨斪斬斮斱",7,"斺斻斾斿旀旂旇旈旉旊旍旐旑旓旔旕旘",7,"旡旣旤旪旫"], +["9540","旲旳旴旵旸旹旻",4,"昁昄昅昇昈昉昋昍昐昑昒昖昗昘昚昛昜昞昡昢昣昤昦昩昪昫昬昮昰昲昳昷",4,"昽昿晀時晄",6,"晍晎晐晑晘"], +["9580","晙晛晜晝晞晠晢晣晥晧晩",4,"晱晲晳晵晸晹晻晼晽晿暀暁暃暅暆暈暉暊暋暍暎暏暐暒暓暔暕暘",4,"暞",8,"暩",4,"暯",4,"暵暶暷暸暺暻暼暽暿",25,"曚曞",7,"曧曨曪",5,"曱曵曶書曺曻曽朁朂會"], +["9640","朄朅朆朇朌朎朏朑朒朓朖朘朙朚朜朞朠",5,"朧朩朮朰朲朳朶朷朸朹朻朼朾朿杁杄杅杇杊杋杍杒杔杕杗",4,"杝杢杣杤杦杧杫杬杮東杴杶"], +["9680","杸杹杺杻杽枀枂枃枅枆枈枊枌枍枎枏枑枒枓枔枖枙枛枟枠枡枤枦枩枬枮枱枲枴枹",7,"柂柅",9,"柕柖柗柛柟柡柣柤柦柧柨柪柫柭柮柲柵",7,"柾栁栂栃栄栆栍栐栒栔栕栘",4,"栞栟栠栢",6,"栫",6,"栴栵栶栺栻栿桇桋桍桏桒桖",5], +["9740","桜桝桞桟桪桬",7,"桵桸",8,"梂梄梇",7,"梐梑梒梔梕梖梘",9,"梣梤梥梩梪梫梬梮梱梲梴梶梷梸"], +["9780","梹",6,"棁棃",5,"棊棌棎棏棐棑棓棔棖棗棙棛",4,"棡棢棤",9,"棯棲棳棴棶棷棸棻棽棾棿椀椂椃椄椆",4,"椌椏椑椓",11,"椡椢椣椥",7,"椮椯椱椲椳椵椶椷椸椺椻椼椾楀楁楃",16,"楕楖楘楙楛楜楟"], +["9840","楡楢楤楥楧楨楩楪楬業楯楰楲",4,"楺楻楽楾楿榁榃榅榊榋榌榎",5,"榖榗榙榚榝",9,"榩榪榬榮榯榰榲榳榵榶榸榹榺榼榽"], +["9880","榾榿槀槂",7,"構槍槏槑槒槓槕",5,"槜槝槞槡",11,"槮槯槰槱槳",9,"槾樀",9,"樋",11,"標",5,"樠樢",5,"権樫樬樭樮樰樲樳樴樶",6,"樿",4,"橅橆橈",7,"橑",6,"橚"], +["9940","橜",4,"橢橣橤橦",10,"橲",6,"橺橻橽橾橿檁檂檃檅",8,"檏檒",4,"檘",7,"檡",5], +["9980","檧檨檪檭",114,"欥欦欨",6], +["9a40","欯欰欱欳欴欵欶欸欻欼欽欿歀歁歂歄歅歈歊歋歍",11,"歚",7,"歨歩歫",13,"歺歽歾歿殀殅殈"], +["9a80","殌殎殏殐殑殔殕殗殘殙殜",4,"殢",7,"殫",7,"殶殸",6,"毀毃毄毆",4,"毌毎毐毑毘毚毜",4,"毢",7,"毬毭毮毰毱毲毴毶毷毸毺毻毼毾",6,"氈",4,"氎氒気氜氝氞氠氣氥氫氬氭氱氳氶氷氹氺氻氼氾氿汃汄汅汈汋",4,"汑汒汓汖汘"], +["9b40","汙汚汢汣汥汦汧汫",4,"汱汳汵汷汸決汻汼汿沀沄沇沊沋沍沎沑沒沕沖沗沘沚沜沝沞沠沢沨沬沯沰沴沵沶沷沺泀況泂泃泆泇泈泋泍泎泏泑泒泘"], +["9b80","泙泚泜泝泟泤泦泧泩泬泭泲泴泹泿洀洂洃洅洆洈洉洊洍洏洐洑洓洔洕洖洘洜洝洟",5,"洦洨洩洬洭洯洰洴洶洷洸洺洿浀浂浄浉浌浐浕浖浗浘浛浝浟浡浢浤浥浧浨浫浬浭浰浱浲浳浵浶浹浺浻浽",4,"涃涄涆涇涊涋涍涏涐涒涖",4,"涜涢涥涬涭涰涱涳涴涶涷涹",5,"淁淂淃淈淉淊"], +["9c40","淍淎淏淐淒淓淔淕淗淚淛淜淟淢淣淥淧淨淩淪淭淯淰淲淴淵淶淸淺淽",7,"渆渇済渉渋渏渒渓渕渘渙減渜渞渟渢渦渧渨渪測渮渰渱渳渵"], +["9c80","渶渷渹渻",7,"湅",7,"湏湐湑湒湕湗湙湚湜湝湞湠",10,"湬湭湯",14,"満溁溂溄溇溈溊",4,"溑",6,"溙溚溛溝溞溠溡溣溤溦溨溩溫溬溭溮溰溳溵溸溹溼溾溿滀滃滄滅滆滈滉滊滌滍滎滐滒滖滘滙滛滜滝滣滧滪",5], +["9d40","滰滱滲滳滵滶滷滸滺",7,"漃漄漅漇漈漊",4,"漐漑漒漖",9,"漡漢漣漥漦漧漨漬漮漰漲漴漵漷",6,"漿潀潁潂"], +["9d80","潃潄潅潈潉潊潌潎",9,"潙潚潛潝潟潠潡潣潤潥潧",5,"潯潰潱潳潵潶潷潹潻潽",6,"澅澆澇澊澋澏",12,"澝澞澟澠澢",4,"澨",10,"澴澵澷澸澺",5,"濁濃",5,"濊",6,"濓",10,"濟濢濣濤濥"], +["9e40","濦",7,"濰",32,"瀒",7,"瀜",6,"瀤",6], +["9e80","瀫",9,"瀶瀷瀸瀺",17,"灍灎灐",13,"灟",11,"灮灱灲灳灴灷灹灺灻災炁炂炃炄炆炇炈炋炌炍炏炐炑炓炗炘炚炛炞",12,"炰炲炴炵炶為炾炿烄烅烆烇烉烋",12,"烚"], +["9f40","烜烝烞烠烡烢烣烥烪烮烰",6,"烸烺烻烼烾",10,"焋",4,"焑焒焔焗焛",10,"焧",7,"焲焳焴"], +["9f80","焵焷",13,"煆煇煈煉煋煍煏",12,"煝煟",4,"煥煩",4,"煯煰煱煴煵煶煷煹煻煼煾",5,"熅",4,"熋熌熍熎熐熑熒熓熕熖熗熚",4,"熡",6,"熩熪熫熭",5,"熴熶熷熸熺",8,"燄",9,"燏",4], +["a040","燖",9,"燡燢燣燤燦燨",5,"燯",9,"燺",11,"爇",19], +["a080","爛爜爞",9,"爩爫爭爮爯爲爳爴爺爼爾牀",6,"牉牊牋牎牏牐牑牓牔牕牗牘牚牜牞牠牣牤牥牨牪牫牬牭牰牱牳牴牶牷牸牻牼牽犂犃犅",4,"犌犎犐犑犓",11,"犠",11,"犮犱犲犳犵犺",6,"狅狆狇狉狊狋狌狏狑狓狔狕狖狘狚狛"], +["a1a1"," 、。·ˉˇ¨〃々—~‖…‘’“”〔〕〈",7,"〖〗【】±×÷∶∧∨∑∏∪∩∈∷√⊥∥∠⌒⊙∫∮≡≌≈∽∝≠≮≯≤≥∞∵∴♂♀°′″℃$¤¢£‰§№☆★○●◎◇◆□■△▲※→←↑↓〓"], +["a2a1","ⅰ",9], +["a2b1","⒈",19,"⑴",19,"①",9], +["a2e5","㈠",9], +["a2f1","Ⅰ",11], +["a3a1","!"#¥%",88," ̄"], +["a4a1","ぁ",82], +["a5a1","ァ",85], +["a6a1","Α",16,"Σ",6], +["a6c1","α",16,"σ",6], +["a6e0","︵︶︹︺︿﹀︽︾﹁﹂﹃﹄"], +["a6ee","︻︼︷︸︱"], +["a6f4","︳︴"], +["a7a1","А",5,"ЁЖ",25], +["a7d1","а",5,"ёж",25], +["a840","ˊˋ˙–―‥‵℅℉↖↗↘↙∕∟∣≒≦≧⊿═",35,"▁",6], +["a880","█",7,"▓▔▕▼▽◢◣◤◥☉⊕〒〝〞"], +["a8a1","āáǎàēéěèīíǐìōóǒòūúǔùǖǘǚǜüêɑ"], +["a8bd","ńň"], +["a8c0","ɡ"], +["a8c5","ㄅ",36], +["a940","〡",8,"㊣㎎㎏㎜㎝㎞㎡㏄㏎㏑㏒㏕︰¬¦"], +["a959","℡㈱"], +["a95c","‐"], +["a960","ー゛゜ヽヾ〆ゝゞ﹉",9,"﹔﹕﹖﹗﹙",8], +["a980","﹢",4,"﹨﹩﹪﹫"], +["a996","〇"], +["a9a4","─",75], 
+["aa40","狜狝狟狢",5,"狪狫狵狶狹狽狾狿猀猂猄",5,"猋猌猍猏猐猑猒猔猘猙猚猟猠猣猤猦猧猨猭猯猰猲猳猵猶猺猻猼猽獀",8], +["aa80","獉獊獋獌獎獏獑獓獔獕獖獘",7,"獡",10,"獮獰獱"], +["ab40","獲",11,"獿",4,"玅玆玈玊玌玍玏玐玒玓玔玕玗玘玙玚玜玝玞玠玡玣",5,"玪玬玭玱玴玵玶玸玹玼玽玾玿珁珃",4], +["ab80","珋珌珎珒",6,"珚珛珜珝珟珡珢珣珤珦珨珪珫珬珮珯珰珱珳",4], +["ac40","珸",10,"琄琇琈琋琌琍琎琑",8,"琜",5,"琣琤琧琩琫琭琯琱琲琷",4,"琽琾琿瑀瑂",11], +["ac80","瑎",6,"瑖瑘瑝瑠",12,"瑮瑯瑱",4,"瑸瑹瑺"], +["ad40","瑻瑼瑽瑿璂璄璅璆璈璉璊璌璍璏璑",10,"璝璟",7,"璪",15,"璻",12], +["ad80","瓈",9,"瓓",8,"瓝瓟瓡瓥瓧",6,"瓰瓱瓲"], +["ae40","瓳瓵瓸",6,"甀甁甂甃甅",7,"甎甐甒甔甕甖甗甛甝甞甠",4,"甦甧甪甮甴甶甹甼甽甿畁畂畃畄畆畇畉畊畍畐畑畒畓畕畖畗畘"], +["ae80","畝",7,"畧畨畩畫",6,"畳畵當畷畺",4,"疀疁疂疄疅疇"], +["af40","疈疉疊疌疍疎疐疓疕疘疛疜疞疢疦",4,"疭疶疷疺疻疿痀痁痆痋痌痎痏痐痑痓痗痙痚痜痝痟痠痡痥痩痬痭痮痯痲痳痵痶痷痸痺痻痽痾瘂瘄瘆瘇"], +["af80","瘈瘉瘋瘍瘎瘏瘑瘒瘓瘔瘖瘚瘜瘝瘞瘡瘣瘧瘨瘬瘮瘯瘱瘲瘶瘷瘹瘺瘻瘽癁療癄"], +["b040","癅",6,"癎",5,"癕癗",4,"癝癟癠癡癢癤",6,"癬癭癮癰",7,"癹発發癿皀皁皃皅皉皊皌皍皏皐皒皔皕皗皘皚皛"], +["b080","皜",7,"皥",8,"皯皰皳皵",9,"盀盁盃啊阿埃挨哎唉哀皑癌蔼矮艾碍爱隘鞍氨安俺按暗岸胺案肮昂盎凹敖熬翱袄傲奥懊澳芭捌扒叭吧笆八疤巴拔跋靶把耙坝霸罢爸白柏百摆佰败拜稗斑班搬扳般颁板版扮拌伴瓣半办绊邦帮梆榜膀绑棒磅蚌镑傍谤苞胞包褒剥"], +["b140","盄盇盉盋盌盓盕盙盚盜盝盞盠",4,"盦",7,"盰盳盵盶盷盺盻盽盿眀眂眃眅眆眊県眎",10,"眛眜眝眞眡眣眤眥眧眪眫"], +["b180","眬眮眰",4,"眹眻眽眾眿睂睄睅睆睈",7,"睒",7,"睜薄雹保堡饱宝抱报暴豹鲍爆杯碑悲卑北辈背贝钡倍狈备惫焙被奔苯本笨崩绷甭泵蹦迸逼鼻比鄙笔彼碧蓖蔽毕毙毖币庇痹闭敝弊必辟壁臂避陛鞭边编贬扁便变卞辨辩辫遍标彪膘表鳖憋别瘪彬斌濒滨宾摈兵冰柄丙秉饼炳"], +["b240","睝睞睟睠睤睧睩睪睭",11,"睺睻睼瞁瞂瞃瞆",5,"瞏瞐瞓",11,"瞡瞣瞤瞦瞨瞫瞭瞮瞯瞱瞲瞴瞶",4], +["b280","瞼瞾矀",12,"矎",8,"矘矙矚矝",4,"矤病并玻菠播拨钵波博勃搏铂箔伯帛舶脖膊渤泊驳捕卜哺补埠不布步簿部怖擦猜裁材才财睬踩采彩菜蔡餐参蚕残惭惨灿苍舱仓沧藏操糙槽曹草厕策侧册测层蹭插叉茬茶查碴搽察岔差诧拆柴豺搀掺蝉馋谗缠铲产阐颤昌猖"], +["b340","矦矨矪矯矰矱矲矴矵矷矹矺矻矼砃",5,"砊砋砎砏砐砓砕砙砛砞砠砡砢砤砨砪砫砮砯砱砲砳砵砶砽砿硁硂硃硄硆硈硉硊硋硍硏硑硓硔硘硙硚"], +["b380","硛硜硞",11,"硯",7,"硸硹硺硻硽",6,"场尝常长偿肠厂敞畅唱倡超抄钞朝嘲潮巢吵炒车扯撤掣彻澈郴臣辰尘晨忱沉陈趁衬撑称城橙成呈乘程惩澄诚承逞骋秤吃痴持匙池迟弛驰耻齿侈尺赤翅斥炽充冲虫崇宠抽酬畴踌稠愁筹仇绸瞅丑臭初出橱厨躇锄雏滁除楚"], +["b440","碄碅碆碈碊碋碏碐碒碔碕碖碙碝碞碠碢碤碦碨",7,"碵碶碷碸確碻碼碽碿磀磂磃磄磆磇磈磌磍磎磏磑磒磓磖磗磘磚",9], +["b480","磤磥磦磧磩磪磫磭",4,"磳磵磶磸磹磻",5,"礂礃礄礆",6,"础储矗搐触处揣川穿椽传船喘串疮窗幢床闯创吹炊捶锤垂春椿醇唇淳纯蠢戳绰疵茨磁雌辞慈瓷词此刺赐次聪葱囱匆从丛凑粗醋簇促蹿篡窜摧崔催脆瘁粹淬翠村存寸磋撮搓措挫错搭达答瘩打大呆歹傣戴带殆代贷袋待逮"], +["b540","礍",5,"礔",9,"礟",4,"礥",14,"礵",4,"礽礿祂祃祄祅祇祊",8,"祔祕祘祙祡祣"], +["b580","祤祦祩祪祫祬祮祰",6,"祹祻",4,"禂禃禆禇禈禉禋禌禍禎禐禑禒怠耽担丹单郸掸胆旦氮但惮淡诞弹蛋当挡党荡档刀捣蹈倒岛祷导到稻悼道盗德得的蹬灯登等瞪凳邓堤低滴迪敌笛狄涤翟嫡抵底地蒂第帝弟递缔颠掂滇碘点典靛垫电佃甸店惦奠淀殿碉叼雕凋刁掉吊钓调跌爹碟蝶迭谍叠"], +["b640","禓",6,"禛",11,"禨",10,"禴",4,"禼禿秂秄秅秇秈秊秌秎秏秐秓秔秖秗秙",5,"秠秡秢秥秨秪"], +["b680","秬秮秱",6,"秹秺秼秾秿稁稄稅稇稈稉稊稌稏",4,"稕稖稘稙稛稜丁盯叮钉顶鼎锭定订丢东冬董懂动栋侗恫冻洞兜抖斗陡豆逗痘都督毒犊独读堵睹赌杜镀肚度渡妒端短锻段断缎堆兑队对墩吨蹲敦顿囤钝盾遁掇哆多夺垛躲朵跺舵剁惰堕蛾峨鹅俄额讹娥恶厄扼遏鄂饿恩而儿耳尔饵洱二"], +["b740","稝稟稡稢稤",14,"稴稵稶稸稺稾穀",5,"穇",9,"穒",4,"穘",16], +["b780","穩",6,"穱穲穳穵穻穼穽穾窂窅窇窉窊窋窌窎窏窐窓窔窙窚窛窞窡窢贰发罚筏伐乏阀法珐藩帆番翻樊矾钒繁凡烦反返范贩犯饭泛坊芳方肪房防妨仿访纺放菲非啡飞肥匪诽吠肺废沸费芬酚吩氛分纷坟焚汾粉奋份忿愤粪丰封枫蜂峰锋风疯烽逢冯缝讽奉凤佛否夫敷肤孵扶拂辐幅氟符伏俘服"], +["b840","窣窤窧窩窪窫窮",4,"窴",10,"竀",10,"竌",9,"竗竘竚竛竜竝竡竢竤竧",5,"竮竰竱竲竳"], +["b880","竴",4,"竻竼竾笀笁笂笅笇笉笌笍笎笐笒笓笖笗笘笚笜笝笟笡笢笣笧笩笭浮涪福袱弗甫抚辅俯釜斧脯腑府腐赴副覆赋复傅付阜父腹负富讣附妇缚咐噶嘎该改概钙盖溉干甘杆柑竿肝赶感秆敢赣冈刚钢缸肛纲岗港杠篙皋高膏羔糕搞镐稿告哥歌搁戈鸽胳疙割革葛格蛤阁隔铬个各给根跟耕更庚羹"], +["b940","笯笰笲笴笵笶笷笹笻笽笿",5,"筆筈筊筍筎筓筕筗筙筜筞筟筡筣",10,"筯筰筳筴筶筸筺筼筽筿箁箂箃箄箆",6,"箎箏"], +["b980","箑箒箓箖箘箙箚箛箞箟箠箣箤箥箮箯箰箲箳箵箶箷箹",7,"篂篃範埂耿梗工攻功恭龚供躬公宫弓巩汞拱贡共钩勾沟苟狗垢构购够辜菇咕箍估沽孤姑鼓古蛊骨谷股故顾固雇刮瓜剐寡挂褂乖拐怪棺关官冠观管馆罐惯灌贯光广逛瑰规圭硅归龟闺轨鬼诡癸桂柜跪贵刽辊滚棍锅郭国果裹过哈"], +["ba40","篅篈築篊篋篍篎篏篐篒篔",4,"篛篜篞篟篠篢篣篤篧篨篩篫篬篭篯篰篲",4,"篸篹篺篻篽篿",7,"簈簉簊簍簎簐",5,"簗簘簙"], +["ba80","簚",4,"簠",5,"簨簩簫",12,"簹",5,"籂骸孩海氦亥害骇酣憨邯韩含涵寒函喊罕翰撼捍旱憾悍焊汗汉夯杭航壕嚎豪毫郝好耗号浩呵喝荷菏核禾和何合盒貉阂河涸赫褐鹤贺嘿黑痕很狠恨哼亨横衡恒轰哄烘虹鸿洪宏弘红喉侯猴吼厚候后呼乎忽瑚壶葫胡蝴狐糊湖"], +["bb40","籃",9,"籎",36,"籵",5,"籾",9], +["bb80","粈粊",6,"粓粔粖粙粚粛粠粡粣粦粧粨粩粫粬粭粯粰粴",4,"粺粻弧虎唬护互沪户花哗华猾滑画划化话槐徊怀淮坏欢环桓还缓换患唤痪豢焕涣宦幻荒慌黄磺蝗簧皇凰惶煌晃幌恍谎灰挥辉徽恢蛔回毁悔慧卉惠晦贿秽会烩汇讳诲绘荤昏婚魂浑混豁活伙火获或惑霍货祸击圾基机畸稽积箕"], +["bc40","粿糀糂糃糄糆糉糋糎",6,"糘糚糛糝糞糡",6,"糩",5,"糰",7,"糹糺糼",13,"紋",5], +["bc80","紑",14,"紡紣紤紥紦紨紩紪紬紭紮細",6,"肌饥迹激讥鸡姬绩缉吉极棘辑籍集及急疾汲即嫉级挤几脊己蓟技冀季伎祭剂悸济寄寂计记既忌际妓继纪嘉枷夹佳家加荚颊贾甲钾假稼价架驾嫁歼监坚尖笺间煎兼肩艰奸缄茧检柬碱硷拣捡简俭剪减荐槛鉴践贱见键箭件"], +["bd40","紷",54,"絯",7], +["bd80","絸",32,"健舰剑饯渐溅涧建僵姜将浆江疆蒋桨奖讲匠酱降蕉椒礁焦胶交郊浇骄娇嚼搅铰矫侥脚狡角饺缴绞剿教酵轿较叫窖揭接皆秸街阶截劫节桔杰捷睫竭洁结解姐戒藉芥界借介疥诫届巾筋斤金今津襟紧锦仅谨进靳晋禁近烬浸"], +["be40","継",12,"綧",6,"綯",42], 
+["be80","線",32,"尽劲荆兢茎睛晶鲸京惊精粳经井警景颈静境敬镜径痉靖竟竞净炯窘揪究纠玖韭久灸九酒厩救旧臼舅咎就疚鞠拘狙疽居驹菊局咀矩举沮聚拒据巨具距踞锯俱句惧炬剧捐鹃娟倦眷卷绢撅攫抉掘倔爵觉决诀绝均菌钧军君峻"], +["bf40","緻",62], +["bf80","縺縼",4,"繂",4,"繈",21,"俊竣浚郡骏喀咖卡咯开揩楷凯慨刊堪勘坎砍看康慷糠扛抗亢炕考拷烤靠坷苛柯棵磕颗科壳咳可渴克刻客课肯啃垦恳坑吭空恐孔控抠口扣寇枯哭窟苦酷库裤夸垮挎跨胯块筷侩快宽款匡筐狂框矿眶旷况亏盔岿窥葵奎魁傀"], +["c040","繞",35,"纃",23,"纜纝纞"], +["c080","纮纴纻纼绖绤绬绹缊缐缞缷缹缻",6,"罃罆",9,"罒罓馈愧溃坤昆捆困括扩廓阔垃拉喇蜡腊辣啦莱来赖蓝婪栏拦篮阑兰澜谰揽览懒缆烂滥琅榔狼廊郎朗浪捞劳牢老佬姥酪烙涝勒乐雷镭蕾磊累儡垒擂肋类泪棱楞冷厘梨犁黎篱狸离漓理李里鲤礼莉荔吏栗丽厉励砾历利傈例俐"], +["c140","罖罙罛罜罝罞罠罣",4,"罫罬罭罯罰罳罵罶罷罸罺罻罼罽罿羀羂",7,"羋羍羏",4,"羕",4,"羛羜羠羢羣羥羦羨",6,"羱"], +["c180","羳",4,"羺羻羾翀翂翃翄翆翇翈翉翋翍翏",4,"翖翗翙",5,"翢翣痢立粒沥隶力璃哩俩联莲连镰廉怜涟帘敛脸链恋炼练粮凉梁粱良两辆量晾亮谅撩聊僚疗燎寥辽潦了撂镣廖料列裂烈劣猎琳林磷霖临邻鳞淋凛赁吝拎玲菱零龄铃伶羚凌灵陵岭领另令溜琉榴硫馏留刘瘤流柳六龙聋咙笼窿"], +["c240","翤翧翨翪翫翬翭翯翲翴",6,"翽翾翿耂耇耈耉耊耎耏耑耓耚耛耝耞耟耡耣耤耫",5,"耲耴耹耺耼耾聀聁聄聅聇聈聉聎聏聐聑聓聕聖聗"], +["c280","聙聛",13,"聫",5,"聲",11,"隆垄拢陇楼娄搂篓漏陋芦卢颅庐炉掳卤虏鲁麓碌露路赂鹿潞禄录陆戮驴吕铝侣旅履屡缕虑氯律率滤绿峦挛孪滦卵乱掠略抡轮伦仑沦纶论萝螺罗逻锣箩骡裸落洛骆络妈麻玛码蚂马骂嘛吗埋买麦卖迈脉瞒馒蛮满蔓曼慢漫"], +["c340","聾肁肂肅肈肊肍",5,"肔肕肗肙肞肣肦肧肨肬肰肳肵肶肸肹肻胅胇",4,"胏",6,"胘胟胠胢胣胦胮胵胷胹胻胾胿脀脁脃脄脅脇脈脋"], +["c380","脌脕脗脙脛脜脝脟",12,"脭脮脰脳脴脵脷脹",4,"脿谩芒茫盲氓忙莽猫茅锚毛矛铆卯茂冒帽貌贸么玫枚梅酶霉煤没眉媒镁每美昧寐妹媚门闷们萌蒙檬盟锰猛梦孟眯醚靡糜迷谜弥米秘觅泌蜜密幂棉眠绵冕免勉娩缅面苗描瞄藐秒渺庙妙蔑灭民抿皿敏悯闽明螟鸣铭名命谬摸"], +["c440","腀",5,"腇腉腍腎腏腒腖腗腘腛",4,"腡腢腣腤腦腨腪腫腬腯腲腳腵腶腷腸膁膃",4,"膉膋膌膍膎膐膒",5,"膙膚膞",4,"膤膥"], +["c480","膧膩膫",7,"膴",5,"膼膽膾膿臄臅臇臈臉臋臍",6,"摹蘑模膜磨摩魔抹末莫墨默沫漠寞陌谋牟某拇牡亩姆母墓暮幕募慕木目睦牧穆拿哪呐钠那娜纳氖乃奶耐奈南男难囊挠脑恼闹淖呢馁内嫩能妮霓倪泥尼拟你匿腻逆溺蔫拈年碾撵捻念娘酿鸟尿捏聂孽啮镊镍涅您柠狞凝宁"], +["c540","臔",14,"臤臥臦臨臩臫臮",4,"臵",5,"臽臿舃與",4,"舎舏舑舓舕",5,"舝舠舤舥舦舧舩舮舲舺舼舽舿"], +["c580","艀艁艂艃艅艆艈艊艌艍艎艐",7,"艙艛艜艝艞艠",7,"艩拧泞牛扭钮纽脓浓农弄奴努怒女暖虐疟挪懦糯诺哦欧鸥殴藕呕偶沤啪趴爬帕怕琶拍排牌徘湃派攀潘盘磐盼畔判叛乓庞旁耪胖抛咆刨炮袍跑泡呸胚培裴赔陪配佩沛喷盆砰抨烹澎彭蓬棚硼篷膨朋鹏捧碰坯砒霹批披劈琵毗"], +["c640","艪艫艬艭艱艵艶艷艸艻艼芀芁芃芅芆芇芉芌芐芓芔芕芖芚芛芞芠芢芣芧芲芵芶芺芻芼芿苀苂苃苅苆苉苐苖苙苚苝苢苧苨苩苪苬苭苮苰苲苳苵苶苸"], +["c680","苺苼",4,"茊茋茍茐茒茓茖茘茙茝",9,"茩茪茮茰茲茷茻茽啤脾疲皮匹痞僻屁譬篇偏片骗飘漂瓢票撇瞥拼频贫品聘乒坪苹萍平凭瓶评屏坡泼颇婆破魄迫粕剖扑铺仆莆葡菩蒲埔朴圃普浦谱曝瀑期欺栖戚妻七凄漆柒沏其棋奇歧畦崎脐齐旗祈祁骑起岂乞企启契砌器气迄弃汽泣讫掐"], +["c740","茾茿荁荂荄荅荈荊",4,"荓荕",4,"荝荢荰",6,"荹荺荾",6,"莇莈莊莋莌莍莏莐莑莔莕莖莗莙莚莝莟莡",6,"莬莭莮"], +["c780","莯莵莻莾莿菂菃菄菆菈菉菋菍菎菐菑菒菓菕菗菙菚菛菞菢菣菤菦菧菨菫菬菭恰洽牵扦钎铅千迁签仟谦乾黔钱钳前潜遣浅谴堑嵌欠歉枪呛腔羌墙蔷强抢橇锹敲悄桥瞧乔侨巧鞘撬翘峭俏窍切茄且怯窃钦侵亲秦琴勤芹擒禽寝沁青轻氢倾卿清擎晴氰情顷请庆琼穷秋丘邱球求囚酋泅趋区蛆曲躯屈驱渠"], +["c840","菮華菳",4,"菺菻菼菾菿萀萂萅萇萈萉萊萐萒",5,"萙萚萛萞",5,"萩",7,"萲",5,"萹萺萻萾",7,"葇葈葉"], +["c880","葊",6,"葒",4,"葘葝葞葟葠葢葤",4,"葪葮葯葰葲葴葷葹葻葼取娶龋趣去圈颧权醛泉全痊拳犬券劝缺炔瘸却鹊榷确雀裙群然燃冉染瓤壤攘嚷让饶扰绕惹热壬仁人忍韧任认刃妊纫扔仍日戎茸蓉荣融熔溶容绒冗揉柔肉茹蠕儒孺如辱乳汝入褥软阮蕊瑞锐闰润若弱撒洒萨腮鳃塞赛三叁"], +["c940","葽",4,"蒃蒄蒅蒆蒊蒍蒏",7,"蒘蒚蒛蒝蒞蒟蒠蒢",12,"蒰蒱蒳蒵蒶蒷蒻蒼蒾蓀蓂蓃蓅蓆蓇蓈蓋蓌蓎蓏蓒蓔蓕蓗"], +["c980","蓘",4,"蓞蓡蓢蓤蓧",4,"蓭蓮蓯蓱",10,"蓽蓾蔀蔁蔂伞散桑嗓丧搔骚扫嫂瑟色涩森僧莎砂杀刹沙纱傻啥煞筛晒珊苫杉山删煽衫闪陕擅赡膳善汕扇缮墒伤商赏晌上尚裳梢捎稍烧芍勺韶少哨邵绍奢赊蛇舌舍赦摄射慑涉社设砷申呻伸身深娠绅神沈审婶甚肾慎渗声生甥牲升绳"], +["ca40","蔃",8,"蔍蔎蔏蔐蔒蔔蔕蔖蔘蔙蔛蔜蔝蔞蔠蔢",8,"蔭",9,"蔾",4,"蕄蕅蕆蕇蕋",10], +["ca80","蕗蕘蕚蕛蕜蕝蕟",4,"蕥蕦蕧蕩",8,"蕳蕵蕶蕷蕸蕼蕽蕿薀薁省盛剩胜圣师失狮施湿诗尸虱十石拾时什食蚀实识史矢使屎驶始式示士世柿事拭誓逝势是嗜噬适仕侍释饰氏市恃室视试收手首守寿授售受瘦兽蔬枢梳殊抒输叔舒淑疏书赎孰熟薯暑曙署蜀黍鼠属术述树束戍竖墅庶数漱"], +["cb40","薂薃薆薈",6,"薐",10,"薝",6,"薥薦薧薩薫薬薭薱",5,"薸薺",6,"藂",6,"藊",4,"藑藒"], +["cb80","藔藖",5,"藝",6,"藥藦藧藨藪",14,"恕刷耍摔衰甩帅栓拴霜双爽谁水睡税吮瞬顺舜说硕朔烁斯撕嘶思私司丝死肆寺嗣四伺似饲巳松耸怂颂送宋讼诵搜艘擞嗽苏酥俗素速粟僳塑溯宿诉肃酸蒜算虽隋随绥髓碎岁穗遂隧祟孙损笋蓑梭唆缩琐索锁所塌他它她塔"], +["cc40","藹藺藼藽藾蘀",4,"蘆",10,"蘒蘓蘔蘕蘗",15,"蘨蘪",13,"蘹蘺蘻蘽蘾蘿虀"], +["cc80","虁",11,"虒虓處",4,"虛虜虝號虠虡虣",7,"獭挞蹋踏胎苔抬台泰酞太态汰坍摊贪瘫滩坛檀痰潭谭谈坦毯袒碳探叹炭汤塘搪堂棠膛唐糖倘躺淌趟烫掏涛滔绦萄桃逃淘陶讨套特藤腾疼誊梯剔踢锑提题蹄啼体替嚏惕涕剃屉天添填田甜恬舔腆挑条迢眺跳贴铁帖厅听烃"], +["cd40","虭虯虰虲",6,"蚃",6,"蚎",4,"蚔蚖",5,"蚞",4,"蚥蚦蚫蚭蚮蚲蚳蚷蚸蚹蚻",4,"蛁蛂蛃蛅蛈蛌蛍蛒蛓蛕蛖蛗蛚蛜"], +["cd80","蛝蛠蛡蛢蛣蛥蛦蛧蛨蛪蛫蛬蛯蛵蛶蛷蛺蛻蛼蛽蛿蜁蜄蜅蜆蜋蜌蜎蜏蜐蜑蜔蜖汀廷停亭庭挺艇通桐酮瞳同铜彤童桶捅筒统痛偷投头透凸秃突图徒途涂屠土吐兔湍团推颓腿蜕褪退吞屯臀拖托脱鸵陀驮驼椭妥拓唾挖哇蛙洼娃瓦袜歪外豌弯湾玩顽丸烷完碗挽晚皖惋宛婉万腕汪王亡枉网往旺望忘妄威"], +["ce40","蜙蜛蜝蜟蜠蜤蜦蜧蜨蜪蜫蜬蜭蜯蜰蜲蜳蜵蜶蜸蜹蜺蜼蜽蝀",6,"蝊蝋蝍蝏蝐蝑蝒蝔蝕蝖蝘蝚",5,"蝡蝢蝦",7,"蝯蝱蝲蝳蝵"], +["ce80","蝷蝸蝹蝺蝿螀螁螄螆螇螉螊螌螎",4,"螔螕螖螘",6,"螠",4,"巍微危韦违桅围唯惟为潍维苇萎委伟伪尾纬未蔚味畏胃喂魏位渭谓尉慰卫瘟温蚊文闻纹吻稳紊问嗡翁瓮挝蜗涡窝我斡卧握沃巫呜钨乌污诬屋无芜梧吾吴毋武五捂午舞伍侮坞戊雾晤物勿务悟误昔熙析西硒矽晰嘻吸锡牺"], +["cf40","螥螦螧螩螪螮螰螱螲螴螶螷螸螹螻螼螾螿蟁",4,"蟇蟈蟉蟌",4,"蟔",6,"蟜蟝蟞蟟蟡蟢蟣蟤蟦蟧蟨蟩蟫蟬蟭蟯",9], 
+["cf80","蟺蟻蟼蟽蟿蠀蠁蠂蠄",5,"蠋",7,"蠔蠗蠘蠙蠚蠜",4,"蠣稀息希悉膝夕惜熄烯溪汐犀檄袭席习媳喜铣洗系隙戏细瞎虾匣霞辖暇峡侠狭下厦夏吓掀锨先仙鲜纤咸贤衔舷闲涎弦嫌显险现献县腺馅羡宪陷限线相厢镶香箱襄湘乡翔祥详想响享项巷橡像向象萧硝霄削哮嚣销消宵淆晓"], +["d040","蠤",13,"蠳",5,"蠺蠻蠽蠾蠿衁衂衃衆",5,"衎",5,"衕衖衘衚",6,"衦衧衪衭衯衱衳衴衵衶衸衹衺"], +["d080","衻衼袀袃袆袇袉袊袌袎袏袐袑袓袔袕袗",4,"袝",4,"袣袥",5,"小孝校肖啸笑效楔些歇蝎鞋协挟携邪斜胁谐写械卸蟹懈泄泻谢屑薪芯锌欣辛新忻心信衅星腥猩惺兴刑型形邢行醒幸杏性姓兄凶胸匈汹雄熊休修羞朽嗅锈秀袖绣墟戌需虚嘘须徐许蓄酗叙旭序畜恤絮婿绪续轩喧宣悬旋玄"], +["d140","袬袮袯袰袲",4,"袸袹袺袻袽袾袿裀裃裄裇裈裊裋裌裍裏裐裑裓裖裗裚",4,"裠裡裦裧裩",6,"裲裵裶裷裺裻製裿褀褁褃",5], +["d180","褉褋",4,"褑褔",4,"褜",4,"褢褣褤褦褧褨褩褬褭褮褯褱褲褳褵褷选癣眩绚靴薛学穴雪血勋熏循旬询寻驯巡殉汛训讯逊迅压押鸦鸭呀丫芽牙蚜崖衙涯雅哑亚讶焉咽阉烟淹盐严研蜒岩延言颜阎炎沿奄掩眼衍演艳堰燕厌砚雁唁彦焰宴谚验殃央鸯秧杨扬佯疡羊洋阳氧仰痒养样漾邀腰妖瑶"], +["d240","褸",8,"襂襃襅",24,"襠",5,"襧",19,"襼"], +["d280","襽襾覀覂覄覅覇",26,"摇尧遥窑谣姚咬舀药要耀椰噎耶爷野冶也页掖业叶曳腋夜液一壹医揖铱依伊衣颐夷遗移仪胰疑沂宜姨彝椅蚁倚已乙矣以艺抑易邑屹亿役臆逸肄疫亦裔意毅忆义益溢诣议谊译异翼翌绎茵荫因殷音阴姻吟银淫寅饮尹引隐"], +["d340","覢",30,"觃觍觓觔觕觗觘觙觛觝觟觠觡觢觤觧觨觩觪觬觭觮觰觱觲觴",6], +["d380","觻",4,"訁",5,"計",21,"印英樱婴鹰应缨莹萤营荧蝇迎赢盈影颖硬映哟拥佣臃痈庸雍踊蛹咏泳涌永恿勇用幽优悠忧尤由邮铀犹油游酉有友右佑釉诱又幼迂淤于盂榆虞愚舆余俞逾鱼愉渝渔隅予娱雨与屿禹宇语羽玉域芋郁吁遇喻峪御愈欲狱育誉"], +["d440","訞",31,"訿",8,"詉",21], +["d480","詟",25,"詺",6,"浴寓裕预豫驭鸳渊冤元垣袁原援辕园员圆猿源缘远苑愿怨院曰约越跃钥岳粤月悦阅耘云郧匀陨允运蕴酝晕韵孕匝砸杂栽哉灾宰载再在咱攒暂赞赃脏葬遭糟凿藻枣早澡蚤躁噪造皂灶燥责择则泽贼怎增憎曾赠扎喳渣札轧"], +["d540","誁",7,"誋",7,"誔",46], +["d580","諃",32,"铡闸眨栅榨咋乍炸诈摘斋宅窄债寨瞻毡詹粘沾盏斩辗崭展蘸栈占战站湛绽樟章彰漳张掌涨杖丈帐账仗胀瘴障招昭找沼赵照罩兆肇召遮折哲蛰辙者锗蔗这浙珍斟真甄砧臻贞针侦枕疹诊震振镇阵蒸挣睁征狰争怔整拯正政"], +["d640","諤",34,"謈",27], +["d680","謤謥謧",30,"帧症郑证芝枝支吱蜘知肢脂汁之织职直植殖执值侄址指止趾只旨纸志挚掷至致置帜峙制智秩稚质炙痔滞治窒中盅忠钟衷终种肿重仲众舟周州洲诌粥轴肘帚咒皱宙昼骤珠株蛛朱猪诸诛逐竹烛煮拄瞩嘱主著柱助蛀贮铸筑"], +["d740","譆",31,"譧",4,"譭",25], +["d780","讇",24,"讬讱讻诇诐诪谉谞住注祝驻抓爪拽专砖转撰赚篆桩庄装妆撞壮状椎锥追赘坠缀谆准捉拙卓桌琢茁酌啄着灼浊兹咨资姿滋淄孜紫仔籽滓子自渍字鬃棕踪宗综总纵邹走奏揍租足卒族祖诅阻组钻纂嘴醉最罪尊遵昨左佐柞做作坐座"], +["d840","谸",8,"豂豃豄豅豈豊豋豍",7,"豖豗豘豙豛",5,"豣",6,"豬",6,"豴豵豶豷豻",6,"貃貄貆貇"], +["d880","貈貋貍",6,"貕貖貗貙",20,"亍丌兀丐廿卅丕亘丞鬲孬噩丨禺丿匕乇夭爻卮氐囟胤馗毓睾鼗丶亟鼐乜乩亓芈孛啬嘏仄厍厝厣厥厮靥赝匚叵匦匮匾赜卦卣刂刈刎刭刳刿剀剌剞剡剜蒯剽劂劁劐劓冂罔亻仃仉仂仨仡仫仞伛仳伢佤仵伥伧伉伫佞佧攸佚佝"], +["d940","貮",62], +["d980","賭",32,"佟佗伲伽佶佴侑侉侃侏佾佻侪佼侬侔俦俨俪俅俚俣俜俑俟俸倩偌俳倬倏倮倭俾倜倌倥倨偾偃偕偈偎偬偻傥傧傩傺僖儆僭僬僦僮儇儋仝氽佘佥俎龠汆籴兮巽黉馘冁夔勹匍訇匐凫夙兕亠兖亳衮袤亵脔裒禀嬴蠃羸冫冱冽冼"], +["da40","贎",14,"贠赑赒赗赟赥赨赩赪赬赮赯赱赲赸",8,"趂趃趆趇趈趉趌",4,"趒趓趕",9,"趠趡"], +["da80","趢趤",12,"趲趶趷趹趻趽跀跁跂跅跇跈跉跊跍跐跒跓跔凇冖冢冥讠讦讧讪讴讵讷诂诃诋诏诎诒诓诔诖诘诙诜诟诠诤诨诩诮诰诳诶诹诼诿谀谂谄谇谌谏谑谒谔谕谖谙谛谘谝谟谠谡谥谧谪谫谮谯谲谳谵谶卩卺阝阢阡阱阪阽阼陂陉陔陟陧陬陲陴隈隍隗隰邗邛邝邙邬邡邴邳邶邺"], +["db40","跕跘跙跜跠跡跢跥跦跧跩跭跮跰跱跲跴跶跼跾",6,"踆踇踈踋踍踎踐踑踒踓踕",7,"踠踡踤",4,"踫踭踰踲踳踴踶踷踸踻踼踾"], +["db80","踿蹃蹅蹆蹌",4,"蹓",5,"蹚",11,"蹧蹨蹪蹫蹮蹱邸邰郏郅邾郐郄郇郓郦郢郜郗郛郫郯郾鄄鄢鄞鄣鄱鄯鄹酃酆刍奂劢劬劭劾哿勐勖勰叟燮矍廴凵凼鬯厶弁畚巯坌垩垡塾墼壅壑圩圬圪圳圹圮圯坜圻坂坩垅坫垆坼坻坨坭坶坳垭垤垌垲埏垧垴垓垠埕埘埚埙埒垸埴埯埸埤埝"], +["dc40","蹳蹵蹷",4,"蹽蹾躀躂躃躄躆躈",6,"躑躒躓躕",6,"躝躟",11,"躭躮躰躱躳",6,"躻",7], +["dc80","軃",10,"軏",21,"堋堍埽埭堀堞堙塄堠塥塬墁墉墚墀馨鼙懿艹艽艿芏芊芨芄芎芑芗芙芫芸芾芰苈苊苣芘芷芮苋苌苁芩芴芡芪芟苄苎芤苡茉苷苤茏茇苜苴苒苘茌苻苓茑茚茆茔茕苠苕茜荑荛荜茈莒茼茴茱莛荞茯荏荇荃荟荀茗荠茭茺茳荦荥"], +["dd40","軥",62], +["dd80","輤",32,"荨茛荩荬荪荭荮莰荸莳莴莠莪莓莜莅荼莶莩荽莸荻莘莞莨莺莼菁萁菥菘堇萘萋菝菽菖萜萸萑萆菔菟萏萃菸菹菪菅菀萦菰菡葜葑葚葙葳蒇蒈葺蒉葸萼葆葩葶蒌蒎萱葭蓁蓍蓐蓦蒽蓓蓊蒿蒺蓠蒡蒹蒴蒗蓥蓣蔌甍蔸蓰蔹蔟蔺"], +["de40","轅",32,"轪辀辌辒辝辠辡辢辤辥辦辧辪辬辭辮辯農辳辴辵辷辸辺辻込辿迀迃迆"], +["de80","迉",4,"迏迒迖迗迚迠迡迣迧迬迯迱迲迴迵迶迺迻迼迾迿逇逈逌逎逓逕逘蕖蔻蓿蓼蕙蕈蕨蕤蕞蕺瞢蕃蕲蕻薤薨薇薏蕹薮薜薅薹薷薰藓藁藜藿蘧蘅蘩蘖蘼廾弈夼奁耷奕奚奘匏尢尥尬尴扌扪抟抻拊拚拗拮挢拶挹捋捃掭揶捱捺掎掴捭掬掊捩掮掼揲揸揠揿揄揞揎摒揆掾摅摁搋搛搠搌搦搡摞撄摭撖"], +["df40","這逜連逤逥逧",5,"逰",4,"逷逹逺逽逿遀遃遅遆遈",4,"過達違遖遙遚遜",5,"遤遦遧適遪遫遬遯",4,"遶",6,"遾邁"], +["df80","還邅邆邇邉邊邌",4,"邒邔邖邘邚邜邞邟邠邤邥邧邨邩邫邭邲邷邼邽邿郀摺撷撸撙撺擀擐擗擤擢攉攥攮弋忒甙弑卟叱叽叩叨叻吒吖吆呋呒呓呔呖呃吡呗呙吣吲咂咔呷呱呤咚咛咄呶呦咝哐咭哂咴哒咧咦哓哔呲咣哕咻咿哌哙哚哜咩咪咤哝哏哞唛哧唠哽唔哳唢唣唏唑唧唪啧喏喵啉啭啁啕唿啐唼"], +["e040","郂郃郆郈郉郋郌郍郒郔郕郖郘郙郚郞郟郠郣郤郥郩郪郬郮郰郱郲郳郵郶郷郹郺郻郼郿鄀鄁鄃鄅",19,"鄚鄛鄜"], +["e080","鄝鄟鄠鄡鄤",10,"鄰鄲",6,"鄺",8,"酄唷啖啵啶啷唳唰啜喋嗒喃喱喹喈喁喟啾嗖喑啻嗟喽喾喔喙嗪嗷嗉嘟嗑嗫嗬嗔嗦嗝嗄嗯嗥嗲嗳嗌嗍嗨嗵嗤辔嘞嘈嘌嘁嘤嘣嗾嘀嘧嘭噘嘹噗嘬噍噢噙噜噌噔嚆噤噱噫噻噼嚅嚓嚯囔囗囝囡囵囫囹囿圄圊圉圜帏帙帔帑帱帻帼"], +["e140","酅酇酈酑酓酔酕酖酘酙酛酜酟酠酦酧酨酫酭酳酺酻酼醀",4,"醆醈醊醎醏醓",6,"醜",5,"醤",5,"醫醬醰醱醲醳醶醷醸醹醻"], +["e180","醼",10,"釈釋釐釒",9,"針",8,"帷幄幔幛幞幡岌屺岍岐岖岈岘岙岑岚岜岵岢岽岬岫岱岣峁岷峄峒峤峋峥崂崃崧崦崮崤崞崆崛嵘崾崴崽嵬嵛嵯嵝嵫嵋嵊嵩嵴嶂嶙嶝豳嶷巅彳彷徂徇徉後徕徙徜徨徭徵徼衢彡犭犰犴犷犸狃狁狎狍狒狨狯狩狲狴狷猁狳猃狺"], +["e240","釦",62], +["e280","鈥",32,"狻猗猓猡猊猞猝猕猢猹猥猬猸猱獐獍獗獠獬獯獾舛夥飧夤夂饣饧",5,"饴饷饽馀馄馇馊馍馐馑馓馔馕庀庑庋庖庥庠庹庵庾庳赓廒廑廛廨廪膺忄忉忖忏怃忮怄忡忤忾怅怆忪忭忸怙怵怦怛怏怍怩怫怊怿怡恸恹恻恺恂"], +["e340","鉆",45,"鉵",16], 
+["e380","銆",7,"銏",24,"恪恽悖悚悭悝悃悒悌悛惬悻悱惝惘惆惚悴愠愦愕愣惴愀愎愫慊慵憬憔憧憷懔懵忝隳闩闫闱闳闵闶闼闾阃阄阆阈阊阋阌阍阏阒阕阖阗阙阚丬爿戕氵汔汜汊沣沅沐沔沌汨汩汴汶沆沩泐泔沭泷泸泱泗沲泠泖泺泫泮沱泓泯泾"], +["e440","銨",5,"銯",24,"鋉",31], +["e480","鋩",32,"洹洧洌浃浈洇洄洙洎洫浍洮洵洚浏浒浔洳涑浯涞涠浞涓涔浜浠浼浣渚淇淅淞渎涿淠渑淦淝淙渖涫渌涮渫湮湎湫溲湟溆湓湔渲渥湄滟溱溘滠漭滢溥溧溽溻溷滗溴滏溏滂溟潢潆潇漤漕滹漯漶潋潴漪漉漩澉澍澌潸潲潼潺濑"], +["e540","錊",51,"錿",10], +["e580","鍊",31,"鍫濉澧澹澶濂濡濮濞濠濯瀚瀣瀛瀹瀵灏灞宀宄宕宓宥宸甯骞搴寤寮褰寰蹇謇辶迓迕迥迮迤迩迦迳迨逅逄逋逦逑逍逖逡逵逶逭逯遄遑遒遐遨遘遢遛暹遴遽邂邈邃邋彐彗彖彘尻咫屐屙孱屣屦羼弪弩弭艴弼鬻屮妁妃妍妩妪妣"], +["e640","鍬",34,"鎐",27], +["e680","鎬",29,"鏋鏌鏍妗姊妫妞妤姒妲妯姗妾娅娆姝娈姣姘姹娌娉娲娴娑娣娓婀婧婊婕娼婢婵胬媪媛婷婺媾嫫媲嫒嫔媸嫠嫣嫱嫖嫦嫘嫜嬉嬗嬖嬲嬷孀尕尜孚孥孳孑孓孢驵驷驸驺驿驽骀骁骅骈骊骐骒骓骖骘骛骜骝骟骠骢骣骥骧纟纡纣纥纨纩"], +["e740","鏎",7,"鏗",54], +["e780","鐎",32,"纭纰纾绀绁绂绉绋绌绐绔绗绛绠绡绨绫绮绯绱绲缍绶绺绻绾缁缂缃缇缈缋缌缏缑缒缗缙缜缛缟缡",6,"缪缫缬缭缯",4,"缵幺畿巛甾邕玎玑玮玢玟珏珂珑玷玳珀珉珈珥珙顼琊珩珧珞玺珲琏琪瑛琦琥琨琰琮琬"], +["e840","鐯",14,"鐿",43,"鑬鑭鑮鑯"], +["e880","鑰",20,"钑钖钘铇铏铓铔铚铦铻锜锠琛琚瑁瑜瑗瑕瑙瑷瑭瑾璜璎璀璁璇璋璞璨璩璐璧瓒璺韪韫韬杌杓杞杈杩枥枇杪杳枘枧杵枨枞枭枋杷杼柰栉柘栊柩枰栌柙枵柚枳柝栀柃枸柢栎柁柽栲栳桠桡桎桢桄桤梃栝桕桦桁桧桀栾桊桉栩梵梏桴桷梓桫棂楮棼椟椠棹"], +["e940","锧锳锽镃镈镋镕镚镠镮镴镵長",7,"門",42], +["e980","閫",32,"椤棰椋椁楗棣椐楱椹楠楂楝榄楫榀榘楸椴槌榇榈槎榉楦楣楹榛榧榻榫榭槔榱槁槊槟榕槠榍槿樯槭樗樘橥槲橄樾檠橐橛樵檎橹樽樨橘橼檑檐檩檗檫猷獒殁殂殇殄殒殓殍殚殛殡殪轫轭轱轲轳轵轶轸轷轹轺轼轾辁辂辄辇辋"], +["ea40","闌",27,"闬闿阇阓阘阛阞阠阣",6,"阫阬阭阯阰阷阸阹阺阾陁陃陊陎陏陑陒陓陖陗"], +["ea80","陘陙陚陜陝陞陠陣陥陦陫陭",4,"陳陸",12,"隇隉隊辍辎辏辘辚軎戋戗戛戟戢戡戥戤戬臧瓯瓴瓿甏甑甓攴旮旯旰昊昙杲昃昕昀炅曷昝昴昱昶昵耆晟晔晁晏晖晡晗晷暄暌暧暝暾曛曜曦曩贲贳贶贻贽赀赅赆赈赉赇赍赕赙觇觊觋觌觎觏觐觑牮犟牝牦牯牾牿犄犋犍犏犒挈挲掰"], +["eb40","隌階隑隒隓隕隖隚際隝",9,"隨",7,"隱隲隴隵隷隸隺隻隿雂雃雈雊雋雐雑雓雔雖",9,"雡",6,"雫"], +["eb80","雬雭雮雰雱雲雴雵雸雺電雼雽雿霂霃霅霊霋霌霐霑霒霔霕霗",4,"霝霟霠搿擘耄毪毳毽毵毹氅氇氆氍氕氘氙氚氡氩氤氪氲攵敕敫牍牒牖爰虢刖肟肜肓肼朊肽肱肫肭肴肷胧胨胩胪胛胂胄胙胍胗朐胝胫胱胴胭脍脎胲胼朕脒豚脶脞脬脘脲腈腌腓腴腙腚腱腠腩腼腽腭腧塍媵膈膂膑滕膣膪臌朦臊膻"], +["ec40","霡",8,"霫霬霮霯霱霳",4,"霺霻霼霽霿",18,"靔靕靗靘靚靜靝靟靣靤靦靧靨靪",7], +["ec80","靲靵靷",4,"靽",7,"鞆",4,"鞌鞎鞏鞐鞓鞕鞖鞗鞙",4,"臁膦欤欷欹歃歆歙飑飒飓飕飙飚殳彀毂觳斐齑斓於旆旄旃旌旎旒旖炀炜炖炝炻烀炷炫炱烨烊焐焓焖焯焱煳煜煨煅煲煊煸煺熘熳熵熨熠燠燔燧燹爝爨灬焘煦熹戾戽扃扈扉礻祀祆祉祛祜祓祚祢祗祠祯祧祺禅禊禚禧禳忑忐"], +["ed40","鞞鞟鞡鞢鞤",6,"鞬鞮鞰鞱鞳鞵",46], +["ed80","韤韥韨韮",4,"韴韷",23,"怼恝恚恧恁恙恣悫愆愍慝憩憝懋懑戆肀聿沓泶淼矶矸砀砉砗砘砑斫砭砜砝砹砺砻砟砼砥砬砣砩硎硭硖硗砦硐硇硌硪碛碓碚碇碜碡碣碲碹碥磔磙磉磬磲礅磴礓礤礞礴龛黹黻黼盱眄眍盹眇眈眚眢眙眭眦眵眸睐睑睇睃睚睨"], +["ee40","頏",62], +["ee80","顎",32,"睢睥睿瞍睽瞀瞌瞑瞟瞠瞰瞵瞽町畀畎畋畈畛畲畹疃罘罡罟詈罨罴罱罹羁罾盍盥蠲钅钆钇钋钊钌钍钏钐钔钗钕钚钛钜钣钤钫钪钭钬钯钰钲钴钶",4,"钼钽钿铄铈",6,"铐铑铒铕铖铗铙铘铛铞铟铠铢铤铥铧铨铪"], +["ef40","顯",5,"颋颎颒颕颙颣風",37,"飏飐飔飖飗飛飜飝飠",4], +["ef80","飥飦飩",30,"铩铫铮铯铳铴铵铷铹铼铽铿锃锂锆锇锉锊锍锎锏锒",4,"锘锛锝锞锟锢锪锫锩锬锱锲锴锶锷锸锼锾锿镂锵镄镅镆镉镌镎镏镒镓镔镖镗镘镙镛镞镟镝镡镢镤",8,"镯镱镲镳锺矧矬雉秕秭秣秫稆嵇稃稂稞稔"], +["f040","餈",4,"餎餏餑",28,"餯",26], +["f080","饊",9,"饖",12,"饤饦饳饸饹饻饾馂馃馉稹稷穑黏馥穰皈皎皓皙皤瓞瓠甬鸠鸢鸨",4,"鸲鸱鸶鸸鸷鸹鸺鸾鹁鹂鹄鹆鹇鹈鹉鹋鹌鹎鹑鹕鹗鹚鹛鹜鹞鹣鹦",6,"鹱鹭鹳疒疔疖疠疝疬疣疳疴疸痄疱疰痃痂痖痍痣痨痦痤痫痧瘃痱痼痿瘐瘀瘅瘌瘗瘊瘥瘘瘕瘙"], +["f140","馌馎馚",10,"馦馧馩",47], +["f180","駙",32,"瘛瘼瘢瘠癀瘭瘰瘿瘵癃瘾瘳癍癞癔癜癖癫癯翊竦穸穹窀窆窈窕窦窠窬窨窭窳衤衩衲衽衿袂袢裆袷袼裉裢裎裣裥裱褚裼裨裾裰褡褙褓褛褊褴褫褶襁襦襻疋胥皲皴矜耒耔耖耜耠耢耥耦耧耩耨耱耋耵聃聆聍聒聩聱覃顸颀颃"], +["f240","駺",62], +["f280","騹",32,"颉颌颍颏颔颚颛颞颟颡颢颥颦虍虔虬虮虿虺虼虻蚨蚍蚋蚬蚝蚧蚣蚪蚓蚩蚶蛄蚵蛎蚰蚺蚱蚯蛉蛏蚴蛩蛱蛲蛭蛳蛐蜓蛞蛴蛟蛘蛑蜃蜇蛸蜈蜊蜍蜉蜣蜻蜞蜥蜮蜚蜾蝈蜴蜱蜩蜷蜿螂蜢蝽蝾蝻蝠蝰蝌蝮螋蝓蝣蝼蝤蝙蝥螓螯螨蟒"], +["f340","驚",17,"驲骃骉骍骎骔骕骙骦骩",6,"骲骳骴骵骹骻骽骾骿髃髄髆",4,"髍髎髏髐髒體髕髖髗髙髚髛髜"], +["f380","髝髞髠髢髣髤髥髧髨髩髪髬髮髰",8,"髺髼",6,"鬄鬅鬆蟆螈螅螭螗螃螫蟥螬螵螳蟋蟓螽蟑蟀蟊蟛蟪蟠蟮蠖蠓蟾蠊蠛蠡蠹蠼缶罂罄罅舐竺竽笈笃笄笕笊笫笏筇笸笪笙笮笱笠笥笤笳笾笞筘筚筅筵筌筝筠筮筻筢筲筱箐箦箧箸箬箝箨箅箪箜箢箫箴篑篁篌篝篚篥篦篪簌篾篼簏簖簋"], +["f440","鬇鬉",5,"鬐鬑鬒鬔",10,"鬠鬡鬢鬤",10,"鬰鬱鬳",7,"鬽鬾鬿魀魆魊魋魌魎魐魒魓魕",5], +["f480","魛",32,"簟簪簦簸籁籀臾舁舂舄臬衄舡舢舣舭舯舨舫舸舻舳舴舾艄艉艋艏艚艟艨衾袅袈裘裟襞羝羟羧羯羰羲籼敉粑粝粜粞粢粲粼粽糁糇糌糍糈糅糗糨艮暨羿翎翕翥翡翦翩翮翳糸絷綦綮繇纛麸麴赳趄趔趑趱赧赭豇豉酊酐酎酏酤"], +["f540","魼",62], +["f580","鮻",32,"酢酡酰酩酯酽酾酲酴酹醌醅醐醍醑醢醣醪醭醮醯醵醴醺豕鹾趸跫踅蹙蹩趵趿趼趺跄跖跗跚跞跎跏跛跆跬跷跸跣跹跻跤踉跽踔踝踟踬踮踣踯踺蹀踹踵踽踱蹉蹁蹂蹑蹒蹊蹰蹶蹼蹯蹴躅躏躔躐躜躞豸貂貊貅貘貔斛觖觞觚觜"], +["f640","鯜",62], +["f680","鰛",32,"觥觫觯訾謦靓雩雳雯霆霁霈霏霎霪霭霰霾龀龃龅",5,"龌黾鼋鼍隹隼隽雎雒瞿雠銎銮鋈錾鍪鏊鎏鐾鑫鱿鲂鲅鲆鲇鲈稣鲋鲎鲐鲑鲒鲔鲕鲚鲛鲞",5,"鲥",4,"鲫鲭鲮鲰",7,"鲺鲻鲼鲽鳄鳅鳆鳇鳊鳋"], +["f740","鰼",62], +["f780","鱻鱽鱾鲀鲃鲄鲉鲊鲌鲏鲓鲖鲗鲘鲙鲝鲪鲬鲯鲹鲾",4,"鳈鳉鳑鳒鳚鳛鳠鳡鳌",4,"鳓鳔鳕鳗鳘鳙鳜鳝鳟鳢靼鞅鞑鞒鞔鞯鞫鞣鞲鞴骱骰骷鹘骶骺骼髁髀髅髂髋髌髑魅魃魇魉魈魍魑飨餍餮饕饔髟髡髦髯髫髻髭髹鬈鬏鬓鬟鬣麽麾縻麂麇麈麋麒鏖麝麟黛黜黝黠黟黢黩黧黥黪黯鼢鼬鼯鼹鼷鼽鼾齄"], +["f840","鳣",62], +["f880","鴢",32], +["f940","鵃",62], +["f980","鶂",32], +["fa40","鶣",62], +["fa80","鷢",32], +["fb40","鸃",27,"鸤鸧鸮鸰鸴鸻鸼鹀鹍鹐鹒鹓鹔鹖鹙鹝鹟鹠鹡鹢鹥鹮鹯鹲鹴",9,"麀"], +["fb80","麁麃麄麅麆麉麊麌",5,"麔",8,"麞麠",5,"麧麨麩麪"], 
+["fc40","麫",8,"麵麶麷麹麺麼麿",4,"黅黆黇黈黊黋黌黐黒黓黕黖黗黙黚點黡黣黤黦黨黫黬黭黮黰",8,"黺黽黿",6], +["fc80","鼆",4,"鼌鼏鼑鼒鼔鼕鼖鼘鼚",5,"鼡鼣",8,"鼭鼮鼰鼱"], +["fd40","鼲",4,"鼸鼺鼼鼿",4,"齅",10,"齒",38], +["fd80","齹",5,"龁龂龍",11,"龜龝龞龡",4,"郎凉秊裏隣"], +["fe40","兀嗀﨎﨏﨑﨓﨔礼﨟蘒﨡﨣﨤﨧﨨﨩"] +] diff --git a/node_modules/iconv-lite/encodings/tables/cp949.json b/node_modules/iconv-lite/encodings/tables/cp949.json new file mode 100644 index 000000000..2022a007f --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/cp949.json @@ -0,0 +1,273 @@ +[ +["0","\u0000",127], +["8141","갂갃갅갆갋",4,"갘갞갟갡갢갣갥",6,"갮갲갳갴"], +["8161","갵갶갷갺갻갽갾갿걁",9,"걌걎",5,"걕"], +["8181","걖걗걙걚걛걝",18,"걲걳걵걶걹걻",4,"겂겇겈겍겎겏겑겒겓겕",6,"겞겢",5,"겫겭겮겱",6,"겺겾겿곀곂곃곅곆곇곉곊곋곍",7,"곖곘",7,"곢곣곥곦곩곫곭곮곲곴곷",4,"곾곿괁괂괃괅괇",4,"괎괐괒괓"], +["8241","괔괕괖괗괙괚괛괝괞괟괡",7,"괪괫괮",5], +["8261","괶괷괹괺괻괽",6,"굆굈굊",5,"굑굒굓굕굖굗"], +["8281","굙",7,"굢굤",7,"굮굯굱굲굷굸굹굺굾궀궃",4,"궊궋궍궎궏궑",10,"궞",5,"궥",17,"궸",7,"귂귃귅귆귇귉",6,"귒귔",7,"귝귞귟귡귢귣귥",18], +["8341","귺귻귽귾긂",5,"긊긌긎",5,"긕",7], +["8361","긝",18,"긲긳긵긶긹긻긼"], +["8381","긽긾긿깂깄깇깈깉깋깏깑깒깓깕깗",4,"깞깢깣깤깦깧깪깫깭깮깯깱",6,"깺깾",5,"꺆",5,"꺍",46,"꺿껁껂껃껅",6,"껎껒",5,"껚껛껝",8], +["8441","껦껧껩껪껬껮",5,"껵껶껷껹껺껻껽",8], +["8461","꼆꼉꼊꼋꼌꼎꼏꼑",18], +["8481","꼤",7,"꼮꼯꼱꼳꼵",6,"꼾꽀꽄꽅꽆꽇꽊",5,"꽑",10,"꽞",5,"꽦",18,"꽺",5,"꾁꾂꾃꾅꾆꾇꾉",6,"꾒꾓꾔꾖",5,"꾝",26,"꾺꾻꾽꾾"], +["8541","꾿꿁",5,"꿊꿌꿏",4,"꿕",6,"꿝",4], +["8561","꿢",5,"꿪",5,"꿲꿳꿵꿶꿷꿹",6,"뀂뀃"], +["8581","뀅",6,"뀍뀎뀏뀑뀒뀓뀕",6,"뀞",9,"뀩",26,"끆끇끉끋끍끏끐끑끒끖끘끚끛끜끞",29,"끾끿낁낂낃낅",6,"낎낐낒",5,"낛낝낞낣낤"], +["8641","낥낦낧낪낰낲낶낷낹낺낻낽",6,"냆냊",5,"냒"], +["8661","냓냕냖냗냙",6,"냡냢냣냤냦",10], +["8681","냱",22,"넊넍넎넏넑넔넕넖넗넚넞",4,"넦넧넩넪넫넭",6,"넶넺",5,"녂녃녅녆녇녉",6,"녒녓녖녗녙녚녛녝녞녟녡",22,"녺녻녽녾녿놁놃",4,"놊놌놎놏놐놑놕놖놗놙놚놛놝"], +["8741","놞",9,"놩",15], +["8761","놹",18,"뇍뇎뇏뇑뇒뇓뇕"], +["8781","뇖",5,"뇞뇠",7,"뇪뇫뇭뇮뇯뇱",7,"뇺뇼뇾",5,"눆눇눉눊눍",6,"눖눘눚",5,"눡",18,"눵",6,"눽",26,"뉙뉚뉛뉝뉞뉟뉡",6,"뉪",4], +["8841","뉯",4,"뉶",5,"뉽",6,"늆늇늈늊",4], +["8861","늏늒늓늕늖늗늛",4,"늢늤늧늨늩늫늭늮늯늱늲늳늵늶늷"], +["8881","늸",15,"닊닋닍닎닏닑닓",4,"닚닜닞닟닠닡닣닧닩닪닰닱닲닶닼닽닾댂댃댅댆댇댉",6,"댒댖",5,"댝",54,"덗덙덚덝덠덡덢덣"], +["8941","덦덨덪덬덭덯덲덳덵덶덷덹",6,"뎂뎆",5,"뎍"], +["8961","뎎뎏뎑뎒뎓뎕",10,"뎢",5,"뎩뎪뎫뎭"], +["8981","뎮",21,"돆돇돉돊돍돏돑돒돓돖돘돚돜돞돟돡돢돣돥돦돧돩",18,"돽",18,"됑",6,"됙됚됛됝됞됟됡",6,"됪됬",7,"됵",15], +["8a41","둅",10,"둒둓둕둖둗둙",6,"둢둤둦"], +["8a61","둧",4,"둭",18,"뒁뒂"], +["8a81","뒃",4,"뒉",19,"뒞",5,"뒥뒦뒧뒩뒪뒫뒭",7,"뒶뒸뒺",5,"듁듂듃듅듆듇듉",6,"듑듒듓듔듖",5,"듞듟듡듢듥듧",4,"듮듰듲",5,"듹",26,"딖딗딙딚딝"], +["8b41","딞",5,"딦딫",4,"딲딳딵딶딷딹",6,"땂땆"], +["8b61","땇땈땉땊땎땏땑땒땓땕",6,"땞땢",8], +["8b81","땫",52,"떢떣떥떦떧떩떬떭떮떯떲떶",4,"떾떿뗁뗂뗃뗅",6,"뗎뗒",5,"뗙",18,"뗭",18], +["8c41","똀",15,"똒똓똕똖똗똙",4], +["8c61","똞",6,"똦",5,"똭",6,"똵",5], +["8c81","똻",12,"뙉",26,"뙥뙦뙧뙩",50,"뚞뚟뚡뚢뚣뚥",5,"뚭뚮뚯뚰뚲",16], +["8d41","뛃",16,"뛕",8], +["8d61","뛞",17,"뛱뛲뛳뛵뛶뛷뛹뛺"], +["8d81","뛻",4,"뜂뜃뜄뜆",33,"뜪뜫뜭뜮뜱",6,"뜺뜼",7,"띅띆띇띉띊띋띍",6,"띖",9,"띡띢띣띥띦띧띩",6,"띲띴띶",5,"띾띿랁랂랃랅",6,"랎랓랔랕랚랛랝랞"], +["8e41","랟랡",6,"랪랮",5,"랶랷랹",8], +["8e61","럂",4,"럈럊",19], +["8e81","럞",13,"럮럯럱럲럳럵",6,"럾렂",4,"렊렋렍렎렏렑",6,"렚렜렞",5,"렦렧렩렪렫렭",6,"렶렺",5,"롁롂롃롅",11,"롒롔",7,"롞롟롡롢롣롥",6,"롮롰롲",5,"롹롺롻롽",7], +["8f41","뢅",7,"뢎",17], +["8f61","뢠",7,"뢩",6,"뢱뢲뢳뢵뢶뢷뢹",4], +["8f81","뢾뢿룂룄룆",5,"룍룎룏룑룒룓룕",7,"룞룠룢",5,"룪룫룭룮룯룱",6,"룺룼룾",5,"뤅",18,"뤙",6,"뤡",26,"뤾뤿륁륂륃륅",6,"륍륎륐륒",5], +["9041","륚륛륝륞륟륡",6,"륪륬륮",5,"륶륷륹륺륻륽"], +["9061","륾",5,"릆릈릋릌릏",15], +["9081","릟",12,"릮릯릱릲릳릵",6,"릾맀맂",5,"맊맋맍맓",4,"맚맜맟맠맢맦맧맩맪맫맭",6,"맶맻",4,"먂",5,"먉",11,"먖",33,"먺먻먽먾먿멁멃멄멅멆"], +["9141","멇멊멌멏멐멑멒멖멗멙멚멛멝",6,"멦멪",5], +["9161","멲멳멵멶멷멹",9,"몆몈몉몊몋몍",5], +["9181","몓",20,"몪몭몮몯몱몳",4,"몺몼몾",5,"뫅뫆뫇뫉",14,"뫚",33,"뫽뫾뫿묁묂묃묅",7,"묎묐묒",5,"묙묚묛묝묞묟묡",6], +["9241","묨묪묬",7,"묷묹묺묿",4,"뭆뭈뭊뭋뭌뭎뭑뭒"], +["9261","뭓뭕뭖뭗뭙",7,"뭢뭤",7,"뭭",4], +["9281","뭲",21,"뮉뮊뮋뮍뮎뮏뮑",18,"뮥뮦뮧뮩뮪뮫뮭",6,"뮵뮶뮸",7,"믁믂믃믅믆믇믉",6,"믑믒믔",35,"믺믻믽믾밁"], +["9341","밃",4,"밊밎밐밒밓밙밚밠밡밢밣밦밨밪밫밬밮밯밲밳밵"], +["9361","밶밷밹",6,"뱂뱆뱇뱈뱊뱋뱎뱏뱑",8], 
+["9381","뱚뱛뱜뱞",37,"벆벇벉벊벍벏",4,"벖벘벛",4,"벢벣벥벦벩",6,"벲벶",5,"벾벿볁볂볃볅",7,"볎볒볓볔볖볗볙볚볛볝",22,"볷볹볺볻볽"], +["9441","볾",5,"봆봈봊",5,"봑봒봓봕",8], +["9461","봞",5,"봥",6,"봭",12], +["9481","봺",5,"뵁",6,"뵊뵋뵍뵎뵏뵑",6,"뵚",9,"뵥뵦뵧뵩",22,"붂붃붅붆붋",4,"붒붔붖붗붘붛붝",6,"붥",10,"붱",6,"붹",24], +["9541","뷒뷓뷖뷗뷙뷚뷛뷝",11,"뷪",5,"뷱"], +["9561","뷲뷳뷵뷶뷷뷹",6,"븁븂븄븆",5,"븎븏븑븒븓"], +["9581","븕",6,"븞븠",35,"빆빇빉빊빋빍빏",4,"빖빘빜빝빞빟빢빣빥빦빧빩빫",4,"빲빶",4,"빾빿뺁뺂뺃뺅",6,"뺎뺒",5,"뺚",13,"뺩",14], +["9641","뺸",23,"뻒뻓"], +["9661","뻕뻖뻙",6,"뻡뻢뻦",5,"뻭",8], +["9681","뻶",10,"뼂",5,"뼊",13,"뼚뼞",33,"뽂뽃뽅뽆뽇뽉",6,"뽒뽓뽔뽖",44], +["9741","뾃",16,"뾕",8], +["9761","뾞",17,"뾱",7], +["9781","뾹",11,"뿆",5,"뿎뿏뿑뿒뿓뿕",6,"뿝뿞뿠뿢",89,"쀽쀾쀿"], +["9841","쁀",16,"쁒",5,"쁙쁚쁛"], +["9861","쁝쁞쁟쁡",6,"쁪",15], +["9881","쁺",21,"삒삓삕삖삗삙",6,"삢삤삦",5,"삮삱삲삷",4,"삾샂샃샄샆샇샊샋샍샎샏샑",6,"샚샞",5,"샦샧샩샪샫샭",6,"샶샸샺",5,"섁섂섃섅섆섇섉",6,"섑섒섓섔섖",5,"섡섢섥섨섩섪섫섮"], +["9941","섲섳섴섵섷섺섻섽섾섿셁",6,"셊셎",5,"셖셗"], +["9961","셙셚셛셝",6,"셦셪",5,"셱셲셳셵셶셷셹셺셻"], +["9981","셼",8,"솆",5,"솏솑솒솓솕솗",4,"솞솠솢솣솤솦솧솪솫솭솮솯솱",11,"솾",5,"쇅쇆쇇쇉쇊쇋쇍",6,"쇕쇖쇙",6,"쇡쇢쇣쇥쇦쇧쇩",6,"쇲쇴",7,"쇾쇿숁숂숃숅",6,"숎숐숒",5,"숚숛숝숞숡숢숣"], +["9a41","숤숥숦숧숪숬숮숰숳숵",16], +["9a61","쉆쉇쉉",6,"쉒쉓쉕쉖쉗쉙",6,"쉡쉢쉣쉤쉦"], +["9a81","쉧",4,"쉮쉯쉱쉲쉳쉵",6,"쉾슀슂",5,"슊",5,"슑",6,"슙슚슜슞",5,"슦슧슩슪슫슮",5,"슶슸슺",33,"싞싟싡싢싥",5,"싮싰싲싳싴싵싷싺싽싾싿쌁",6,"쌊쌋쌎쌏"], +["9b41","쌐쌑쌒쌖쌗쌙쌚쌛쌝",6,"쌦쌧쌪",8], +["9b61","쌳",17,"썆",7], +["9b81","썎",25,"썪썫썭썮썯썱썳",4,"썺썻썾",5,"쎅쎆쎇쎉쎊쎋쎍",50,"쏁",22,"쏚"], +["9c41","쏛쏝쏞쏡쏣",4,"쏪쏫쏬쏮",5,"쏶쏷쏹",5], +["9c61","쏿",8,"쐉",6,"쐑",9], +["9c81","쐛",8,"쐥",6,"쐭쐮쐯쐱쐲쐳쐵",6,"쐾",9,"쑉",26,"쑦쑧쑩쑪쑫쑭",6,"쑶쑷쑸쑺",5,"쒁",18,"쒕",6,"쒝",12], +["9d41","쒪",13,"쒹쒺쒻쒽",8], +["9d61","쓆",25], +["9d81","쓠",8,"쓪",5,"쓲쓳쓵쓶쓷쓹쓻쓼쓽쓾씂",9,"씍씎씏씑씒씓씕",6,"씝",10,"씪씫씭씮씯씱",6,"씺씼씾",5,"앆앇앋앏앐앑앒앖앚앛앜앟앢앣앥앦앧앩",6,"앲앶",5,"앾앿얁얂얃얅얆얈얉얊얋얎얐얒얓얔"], +["9e41","얖얙얚얛얝얞얟얡",7,"얪",9,"얶"], +["9e61","얷얺얿",4,"엋엍엏엒엓엕엖엗엙",6,"엢엤엦엧"], +["9e81","엨엩엪엫엯엱엲엳엵엸엹엺엻옂옃옄옉옊옋옍옎옏옑",6,"옚옝",6,"옦옧옩옪옫옯옱옲옶옸옺옼옽옾옿왂왃왅왆왇왉",6,"왒왖",5,"왞왟왡",10,"왭왮왰왲",5,"왺왻왽왾왿욁",6,"욊욌욎",5,"욖욗욙욚욛욝",6,"욦"], +["9f41","욨욪",5,"욲욳욵욶욷욻",4,"웂웄웆",5,"웎"], +["9f61","웏웑웒웓웕",6,"웞웟웢",5,"웪웫웭웮웯웱웲"], +["9f81","웳",4,"웺웻웼웾",5,"윆윇윉윊윋윍",6,"윖윘윚",5,"윢윣윥윦윧윩",6,"윲윴윶윸윹윺윻윾윿읁읂읃읅",4,"읋읎읐읙읚읛읝읞읟읡",6,"읩읪읬",7,"읶읷읹읺읻읿잀잁잂잆잋잌잍잏잒잓잕잙잛",4,"잢잧",4,"잮잯잱잲잳잵잶잷"], +["a041","잸잹잺잻잾쟂",5,"쟊쟋쟍쟏쟑",6,"쟙쟚쟛쟜"], +["a061","쟞",5,"쟥쟦쟧쟩쟪쟫쟭",13], +["a081","쟻",4,"젂젃젅젆젇젉젋",4,"젒젔젗",4,"젞젟젡젢젣젥",6,"젮젰젲",5,"젹젺젻젽젾젿졁",6,"졊졋졎",5,"졕",26,"졲졳졵졶졷졹졻",4,"좂좄좈좉좊좎",5,"좕",7,"좞좠좢좣좤"], +["a141","좥좦좧좩",18,"좾좿죀죁"], +["a161","죂죃죅죆죇죉죊죋죍",6,"죖죘죚",5,"죢죣죥"], +["a181","죦",14,"죶",5,"죾죿줁줂줃줇",4,"줎 、。·‥…¨〃­―∥\∼‘’“”〔〕〈",9,"±×÷≠≤≥∞∴°′″℃Å¢£¥♂♀∠⊥⌒∂∇≡≒§※☆★○●◎◇◆□■△▲▽▼→←↑↓↔〓≪≫√∽∝∵∫∬∈∋⊆⊇⊂⊃∪∩∧∨¬"], +["a241","줐줒",5,"줙",18], +["a261","줭",6,"줵",18], +["a281","쥈",7,"쥒쥓쥕쥖쥗쥙",6,"쥢쥤",7,"쥭쥮쥯⇒⇔∀∃´~ˇ˘˝˚˙¸˛¡¿ː∮∑∏¤℉‰◁◀▷▶♤♠♡♥♧♣⊙◈▣◐◑▒▤▥▨▧▦▩♨☏☎☜☞¶†‡↕↗↙↖↘♭♩♪♬㉿㈜№㏇™㏂㏘℡€®"], +["a341","쥱쥲쥳쥵",6,"쥽",10,"즊즋즍즎즏"], +["a361","즑",6,"즚즜즞",16], +["a381","즯",16,"짂짃짅짆짉짋",4,"짒짔짗짘짛!",58,"₩]",32," ̄"], +["a441","짞짟짡짣짥짦짨짩짪짫짮짲",5,"짺짻짽짾짿쨁쨂쨃쨄"], +["a461","쨅쨆쨇쨊쨎",5,"쨕쨖쨗쨙",12], +["a481","쨦쨧쨨쨪",28,"ㄱ",93], +["a541","쩇",4,"쩎쩏쩑쩒쩓쩕",6,"쩞쩢",5,"쩩쩪"], +["a561","쩫",17,"쩾",5,"쪅쪆"], +["a581","쪇",16,"쪙",14,"ⅰ",9], +["a5b0","Ⅰ",9], +["a5c1","Α",16,"Σ",6], +["a5e1","α",16,"σ",6], +["a641","쪨",19,"쪾쪿쫁쫂쫃쫅"], +["a661","쫆",5,"쫎쫐쫒쫔쫕쫖쫗쫚",5,"쫡",6], +["a681","쫨쫩쫪쫫쫭",6,"쫵",18,"쬉쬊─│┌┐┘└├┬┤┴┼━┃┏┓┛┗┣┳┫┻╋┠┯┨┷┿┝┰┥┸╂┒┑┚┙┖┕┎┍┞┟┡┢┦┧┩┪┭┮┱┲┵┶┹┺┽┾╀╁╃",7], +["a741","쬋",4,"쬑쬒쬓쬕쬖쬗쬙",6,"쬢",7], +["a761","쬪",22,"쭂쭃쭄"], +["a781","쭅쭆쭇쭊쭋쭍쭎쭏쭑",6,"쭚쭛쭜쭞",5,"쭥",7,"㎕㎖㎗ℓ㎘㏄㎣㎤㎥㎦㎙",9,"㏊㎍㎎㎏㏏㎈㎉㏈㎧㎨㎰",9,"㎀",4,"㎺",5,"㎐",4,"Ω㏀㏁㎊㎋㎌㏖㏅㎭㎮㎯㏛㎩㎪㎫㎬㏝㏐㏓㏃㏉㏜㏆"], +["a841","쭭",10,"쭺",14], +["a861","쮉",18,"쮝",6], +["a881","쮤",19,"쮹",11,"ÆЪĦ"], +["a8a6","IJ"], +["a8a8","ĿŁØŒºÞŦŊ"], +["a8b1","㉠",27,"ⓐ",25,"①",14,"½⅓⅔¼¾⅛⅜⅝⅞"], +["a941","쯅",14,"쯕",10], +["a961","쯠쯡쯢쯣쯥쯦쯨쯪",18], 
+["a981","쯽",14,"찎찏찑찒찓찕",6,"찞찟찠찣찤æđðħıijĸŀłøœßþŧŋʼn㈀",27,"⒜",25,"⑴",14,"¹²³⁴ⁿ₁₂₃₄"], +["aa41","찥찦찪찫찭찯찱",6,"찺찿",4,"챆챇챉챊챋챍챎"], +["aa61","챏",4,"챖챚",5,"챡챢챣챥챧챩",6,"챱챲"], +["aa81","챳챴챶",29,"ぁ",82], +["ab41","첔첕첖첗첚첛첝첞첟첡",6,"첪첮",5,"첶첷첹"], +["ab61","첺첻첽",6,"쳆쳈쳊",5,"쳑쳒쳓쳕",5], +["ab81","쳛",8,"쳥",6,"쳭쳮쳯쳱",12,"ァ",85], +["ac41","쳾쳿촀촂",5,"촊촋촍촎촏촑",6,"촚촜촞촟촠"], +["ac61","촡촢촣촥촦촧촩촪촫촭",11,"촺",4], +["ac81","촿",28,"쵝쵞쵟А",5,"ЁЖ",25], +["acd1","а",5,"ёж",25], +["ad41","쵡쵢쵣쵥",6,"쵮쵰쵲",5,"쵹",7], +["ad61","춁",6,"춉",10,"춖춗춙춚춛춝춞춟"], +["ad81","춠춡춢춣춦춨춪",5,"춱",18,"췅"], +["ae41","췆",5,"췍췎췏췑",16], +["ae61","췢",5,"췩췪췫췭췮췯췱",6,"췺췼췾",4], +["ae81","츃츅츆츇츉츊츋츍",6,"츕츖츗츘츚",5,"츢츣츥츦츧츩츪츫"], +["af41","츬츭츮츯츲츴츶",19], +["af61","칊",13,"칚칛칝칞칢",5,"칪칬"], +["af81","칮",5,"칶칷칹칺칻칽",6,"캆캈캊",5,"캒캓캕캖캗캙"], +["b041","캚",5,"캢캦",5,"캮",12], +["b061","캻",5,"컂",19], +["b081","컖",13,"컦컧컩컪컭",6,"컶컺",5,"가각간갇갈갉갊감",7,"같",4,"갠갤갬갭갯갰갱갸갹갼걀걋걍걔걘걜거걱건걷걸걺검겁것겄겅겆겉겊겋게겐겔겜겝겟겠겡겨격겪견겯결겸겹겻겼경곁계곈곌곕곗고곡곤곧골곪곬곯곰곱곳공곶과곽관괄괆"], +["b141","켂켃켅켆켇켉",6,"켒켔켖",5,"켝켞켟켡켢켣"], +["b161","켥",6,"켮켲",5,"켹",11], +["b181","콅",14,"콖콗콙콚콛콝",6,"콦콨콪콫콬괌괍괏광괘괜괠괩괬괭괴괵괸괼굄굅굇굉교굔굘굡굣구국군굳굴굵굶굻굼굽굿궁궂궈궉권궐궜궝궤궷귀귁귄귈귐귑귓규균귤그극근귿글긁금급긋긍긔기긱긴긷길긺김깁깃깅깆깊까깍깎깐깔깖깜깝깟깠깡깥깨깩깬깰깸"], +["b241","콭콮콯콲콳콵콶콷콹",6,"쾁쾂쾃쾄쾆",5,"쾍"], +["b261","쾎",18,"쾢",5,"쾩"], +["b281","쾪",5,"쾱",18,"쿅",6,"깹깻깼깽꺄꺅꺌꺼꺽꺾껀껄껌껍껏껐껑께껙껜껨껫껭껴껸껼꼇꼈꼍꼐꼬꼭꼰꼲꼴꼼꼽꼿꽁꽂꽃꽈꽉꽐꽜꽝꽤꽥꽹꾀꾄꾈꾐꾑꾕꾜꾸꾹꾼꿀꿇꿈꿉꿋꿍꿎꿔꿜꿨꿩꿰꿱꿴꿸뀀뀁뀄뀌뀐뀔뀜뀝뀨끄끅끈끊끌끎끓끔끕끗끙"], +["b341","쿌",19,"쿢쿣쿥쿦쿧쿩"], +["b361","쿪",5,"쿲쿴쿶",5,"쿽쿾쿿퀁퀂퀃퀅",5], +["b381","퀋",5,"퀒",5,"퀙",19,"끝끼끽낀낄낌낍낏낑나낙낚난낟날낡낢남납낫",4,"낱낳내낵낸낼냄냅냇냈냉냐냑냔냘냠냥너넉넋넌널넒넓넘넙넛넜넝넣네넥넨넬넴넵넷넸넹녀녁년녈념녑녔녕녘녜녠노녹논놀놂놈놉놋농높놓놔놘놜놨뇌뇐뇔뇜뇝"], +["b441","퀮",5,"퀶퀷퀹퀺퀻퀽",6,"큆큈큊",5], +["b461","큑큒큓큕큖큗큙",6,"큡",10,"큮큯"], +["b481","큱큲큳큵",6,"큾큿킀킂",18,"뇟뇨뇩뇬뇰뇹뇻뇽누눅눈눋눌눔눕눗눙눠눴눼뉘뉜뉠뉨뉩뉴뉵뉼늄늅늉느늑는늘늙늚늠늡늣능늦늪늬늰늴니닉닌닐닒님닙닛닝닢다닥닦단닫",4,"닳담답닷",4,"닿대댁댄댈댐댑댓댔댕댜더덕덖던덛덜덞덟덤덥"], +["b541","킕",14,"킦킧킩킪킫킭",5], +["b561","킳킶킸킺",5,"탂탃탅탆탇탊",5,"탒탖",4], +["b581","탛탞탟탡탢탣탥",6,"탮탲",5,"탹",11,"덧덩덫덮데덱덴델뎀뎁뎃뎄뎅뎌뎐뎔뎠뎡뎨뎬도독돈돋돌돎돐돔돕돗동돛돝돠돤돨돼됐되된될됨됩됫됴두둑둔둘둠둡둣둥둬뒀뒈뒝뒤뒨뒬뒵뒷뒹듀듄듈듐듕드득든듣들듦듬듭듯등듸디딕딘딛딜딤딥딧딨딩딪따딱딴딸"], +["b641","턅",7,"턎",17], +["b661","턠",15,"턲턳턵턶턷턹턻턼턽턾"], +["b681","턿텂텆",5,"텎텏텑텒텓텕",6,"텞텠텢",5,"텩텪텫텭땀땁땃땄땅땋때땍땐땔땜땝땟땠땡떠떡떤떨떪떫떰떱떳떴떵떻떼떽뗀뗄뗌뗍뗏뗐뗑뗘뗬또똑똔똘똥똬똴뙈뙤뙨뚜뚝뚠뚤뚫뚬뚱뛔뛰뛴뛸뜀뜁뜅뜨뜩뜬뜯뜰뜸뜹뜻띄띈띌띔띕띠띤띨띰띱띳띵라락란랄람랍랏랐랑랒랖랗"], +["b741","텮",13,"텽",6,"톅톆톇톉톊"], +["b761","톋",20,"톢톣톥톦톧"], +["b781","톩",6,"톲톴톶톷톸톹톻톽톾톿퇁",14,"래랙랜랠램랩랫랬랭랴략랸럇량러럭런럴럼럽럿렀렁렇레렉렌렐렘렙렛렝려력련렬렴렵렷렸령례롄롑롓로록론롤롬롭롯롱롸롼뢍뢨뢰뢴뢸룀룁룃룅료룐룔룝룟룡루룩룬룰룸룹룻룽뤄뤘뤠뤼뤽륀륄륌륏륑류륙륜률륨륩"], +["b841","퇐",7,"퇙",17], +["b861","퇫",8,"퇵퇶퇷퇹",13], +["b881","툈툊",5,"툑",24,"륫륭르륵른를름릅릇릉릊릍릎리릭린릴림립릿링마막만많",4,"맘맙맛망맞맡맣매맥맨맬맴맵맷맸맹맺먀먁먈먕머먹먼멀멂멈멉멋멍멎멓메멕멘멜멤멥멧멨멩며멱면멸몃몄명몇몌모목몫몬몰몲몸몹못몽뫄뫈뫘뫙뫼"], +["b941","툪툫툮툯툱툲툳툵",6,"툾퉀퉂",5,"퉉퉊퉋퉌"], +["b961","퉍",14,"퉝",6,"퉥퉦퉧퉨"], +["b981","퉩",22,"튂튃튅튆튇튉튊튋튌묀묄묍묏묑묘묜묠묩묫무묵묶문묻물묽묾뭄뭅뭇뭉뭍뭏뭐뭔뭘뭡뭣뭬뮈뮌뮐뮤뮨뮬뮴뮷므믄믈믐믓미믹민믿밀밂밈밉밋밌밍및밑바",4,"받",4,"밤밥밧방밭배백밴밸뱀뱁뱃뱄뱅뱉뱌뱍뱐뱝버벅번벋벌벎범법벗"], +["ba41","튍튎튏튒튓튔튖",5,"튝튞튟튡튢튣튥",6,"튭"], +["ba61","튮튯튰튲",5,"튺튻튽튾틁틃",4,"틊틌",5], +["ba81","틒틓틕틖틗틙틚틛틝",6,"틦",9,"틲틳틵틶틷틹틺벙벚베벡벤벧벨벰벱벳벴벵벼벽변별볍볏볐병볕볘볜보복볶본볼봄봅봇봉봐봔봤봬뵀뵈뵉뵌뵐뵘뵙뵤뵨부북분붇불붉붊붐붑붓붕붙붚붜붤붰붸뷔뷕뷘뷜뷩뷰뷴뷸븀븃븅브븍븐블븜븝븟비빅빈빌빎빔빕빗빙빚빛빠빡빤"], +["bb41","틻",4,"팂팄팆",5,"팏팑팒팓팕팗",4,"팞팢팣"], +["bb61","팤팦팧팪팫팭팮팯팱",6,"팺팾",5,"퍆퍇퍈퍉"], +["bb81","퍊",31,"빨빪빰빱빳빴빵빻빼빽뺀뺄뺌뺍뺏뺐뺑뺘뺙뺨뻐뻑뻔뻗뻘뻠뻣뻤뻥뻬뼁뼈뼉뼘뼙뼛뼜뼝뽀뽁뽄뽈뽐뽑뽕뾔뾰뿅뿌뿍뿐뿔뿜뿟뿡쀼쁑쁘쁜쁠쁨쁩삐삑삔삘삠삡삣삥사삭삯산삳살삵삶삼삽삿샀상샅새색샌샐샘샙샛샜생샤"], +["bc41","퍪",17,"퍾퍿펁펂펃펅펆펇"], +["bc61","펈펉펊펋펎펒",5,"펚펛펝펞펟펡",6,"펪펬펮"], +["bc81","펯",4,"펵펶펷펹펺펻펽",6,"폆폇폊",5,"폑",5,"샥샨샬샴샵샷샹섀섄섈섐섕서",4,"섣설섦섧섬섭섯섰성섶세섹센셀셈셉셋셌셍셔셕션셜셤셥셧셨셩셰셴셸솅소속솎손솔솖솜솝솟송솥솨솩솬솰솽쇄쇈쇌쇔쇗쇘쇠쇤쇨쇰쇱쇳쇼쇽숀숄숌숍숏숑수숙순숟술숨숩숫숭"], +["bd41","폗폙",7,"폢폤",7,"폮폯폱폲폳폵폶폷"], +["bd61","폸폹폺폻폾퐀퐂",5,"퐉",13], +["bd81","퐗",5,"퐞",25,"숯숱숲숴쉈쉐쉑쉔쉘쉠쉥쉬쉭쉰쉴쉼쉽쉿슁슈슉슐슘슛슝스슥슨슬슭슴습슷승시식신싣실싫심십싯싱싶싸싹싻싼쌀쌈쌉쌌쌍쌓쌔쌕쌘쌜쌤쌥쌨쌩썅써썩썬썰썲썸썹썼썽쎄쎈쎌쏀쏘쏙쏜쏟쏠쏢쏨쏩쏭쏴쏵쏸쐈쐐쐤쐬쐰"], +["be41","퐸",7,"푁푂푃푅",14], +["be61","푔",7,"푝푞푟푡푢푣푥",7,"푮푰푱푲"], 
+["be81","푳",4,"푺푻푽푾풁풃",4,"풊풌풎",5,"풕",8,"쐴쐼쐽쑈쑤쑥쑨쑬쑴쑵쑹쒀쒔쒜쒸쒼쓩쓰쓱쓴쓸쓺쓿씀씁씌씐씔씜씨씩씬씰씸씹씻씽아악안앉않알앍앎앓암압앗았앙앝앞애액앤앨앰앱앳앴앵야약얀얄얇얌얍얏양얕얗얘얜얠얩어억언얹얻얼얽얾엄",6,"엌엎"], +["bf41","풞",10,"풪",14], +["bf61","풹",18,"퓍퓎퓏퓑퓒퓓퓕"], +["bf81","퓖",5,"퓝퓞퓠",7,"퓩퓪퓫퓭퓮퓯퓱",6,"퓹퓺퓼에엑엔엘엠엡엣엥여역엮연열엶엷염",5,"옅옆옇예옌옐옘옙옛옜오옥온올옭옮옰옳옴옵옷옹옻와왁완왈왐왑왓왔왕왜왝왠왬왯왱외왹왼욀욈욉욋욍요욕욘욜욤욥욧용우욱운울욹욺움웁웃웅워웍원월웜웝웠웡웨"], +["c041","퓾",5,"픅픆픇픉픊픋픍",6,"픖픘",5], +["c061","픞",25], +["c081","픸픹픺픻픾픿핁핂핃핅",6,"핎핐핒",5,"핚핛핝핞핟핡핢핣웩웬웰웸웹웽위윅윈윌윔윕윗윙유육윤율윰윱윳융윷으윽은을읊음읍읏응",7,"읜읠읨읫이익인일읽읾잃임입잇있잉잊잎자작잔잖잗잘잚잠잡잣잤장잦재잭잰잴잼잽잿쟀쟁쟈쟉쟌쟎쟐쟘쟝쟤쟨쟬저적전절젊"], +["c141","핤핦핧핪핬핮",5,"핶핷핹핺핻핽",6,"햆햊햋"], +["c161","햌햍햎햏햑",19,"햦햧"], +["c181","햨",31,"점접젓정젖제젝젠젤젬젭젯젱져젼졀졈졉졌졍졔조족존졸졺좀좁좃종좆좇좋좌좍좔좝좟좡좨좼좽죄죈죌죔죕죗죙죠죡죤죵주죽준줄줅줆줌줍줏중줘줬줴쥐쥑쥔쥘쥠쥡쥣쥬쥰쥴쥼즈즉즌즐즘즙즛증지직진짇질짊짐집짓"], +["c241","헊헋헍헎헏헑헓",4,"헚헜헞",5,"헦헧헩헪헫헭헮"], +["c261","헯",4,"헶헸헺",5,"혂혃혅혆혇혉",6,"혒"], +["c281","혖",5,"혝혞혟혡혢혣혥",7,"혮",9,"혺혻징짖짙짚짜짝짠짢짤짧짬짭짯짰짱째짹짼쨀쨈쨉쨋쨌쨍쨔쨘쨩쩌쩍쩐쩔쩜쩝쩟쩠쩡쩨쩽쪄쪘쪼쪽쫀쫄쫌쫍쫏쫑쫓쫘쫙쫠쫬쫴쬈쬐쬔쬘쬠쬡쭁쭈쭉쭌쭐쭘쭙쭝쭤쭸쭹쮜쮸쯔쯤쯧쯩찌찍찐찔찜찝찡찢찧차착찬찮찰참찹찻"], +["c341","혽혾혿홁홂홃홄홆홇홊홌홎홏홐홒홓홖홗홙홚홛홝",4], +["c361","홢",4,"홨홪",5,"홲홳홵",11], +["c381","횁횂횄횆",5,"횎횏횑횒횓횕",7,"횞횠횢",5,"횩횪찼창찾채책챈챌챔챕챗챘챙챠챤챦챨챰챵처척천철첨첩첫첬청체첵첸첼쳄쳅쳇쳉쳐쳔쳤쳬쳰촁초촉촌촐촘촙촛총촤촨촬촹최쵠쵤쵬쵭쵯쵱쵸춈추축춘출춤춥춧충춰췄췌췐취췬췰췸췹췻췽츄츈츌츔츙츠측츤츨츰츱츳층"], +["c441","횫횭횮횯횱",7,"횺횼",7,"훆훇훉훊훋"], +["c461","훍훎훏훐훒훓훕훖훘훚",5,"훡훢훣훥훦훧훩",4], +["c481","훮훯훱훲훳훴훶",5,"훾훿휁휂휃휅",11,"휒휓휔치칙친칟칠칡침칩칫칭카칵칸칼캄캅캇캉캐캑캔캘캠캡캣캤캥캬캭컁커컥컨컫컬컴컵컷컸컹케켁켄켈켐켑켓켕켜켠켤켬켭켯켰켱켸코콕콘콜콤콥콧콩콰콱콴콸쾀쾅쾌쾡쾨쾰쿄쿠쿡쿤쿨쿰쿱쿳쿵쿼퀀퀄퀑퀘퀭퀴퀵퀸퀼"], +["c541","휕휖휗휚휛휝휞휟휡",6,"휪휬휮",5,"휶휷휹"], +["c561","휺휻휽",6,"흅흆흈흊",5,"흒흓흕흚",4], +["c581","흟흢흤흦흧흨흪흫흭흮흯흱흲흳흵",6,"흾흿힀힂",5,"힊힋큄큅큇큉큐큔큘큠크큭큰클큼큽킁키킥킨킬킴킵킷킹타탁탄탈탉탐탑탓탔탕태택탠탤탬탭탯탰탱탸턍터턱턴털턺텀텁텃텄텅테텍텐텔템텝텟텡텨텬텼톄톈토톡톤톨톰톱톳통톺톼퇀퇘퇴퇸툇툉툐투툭툰툴툼툽툿퉁퉈퉜"], +["c641","힍힎힏힑",6,"힚힜힞",5], +["c6a1","퉤튀튁튄튈튐튑튕튜튠튤튬튱트특튼튿틀틂틈틉틋틔틘틜틤틥티틱틴틸팀팁팃팅파팍팎판팔팖팜팝팟팠팡팥패팩팬팰팸팹팻팼팽퍄퍅퍼퍽펀펄펌펍펏펐펑페펙펜펠펨펩펫펭펴편펼폄폅폈평폐폘폡폣포폭폰폴폼폽폿퐁"], +["c7a1","퐈퐝푀푄표푠푤푭푯푸푹푼푿풀풂품풉풋풍풔풩퓌퓐퓔퓜퓟퓨퓬퓰퓸퓻퓽프픈플픔픕픗피픽핀필핌핍핏핑하학한할핥함합핫항해핵핸핼햄햅햇했행햐향허헉헌헐헒험헙헛헝헤헥헨헬헴헵헷헹혀혁현혈혐협혓혔형혜혠"], +["c8a1","혤혭호혹혼홀홅홈홉홋홍홑화확환활홧황홰홱홴횃횅회획횐횔횝횟횡효횬횰횹횻후훅훈훌훑훔훗훙훠훤훨훰훵훼훽휀휄휑휘휙휜휠휨휩휫휭휴휵휸휼흄흇흉흐흑흔흖흗흘흙흠흡흣흥흩희흰흴흼흽힁히힉힌힐힘힙힛힝"], +["caa1","伽佳假價加可呵哥嘉嫁家暇架枷柯歌珂痂稼苛茄街袈訶賈跏軻迦駕刻却各恪慤殼珏脚覺角閣侃刊墾奸姦干幹懇揀杆柬桿澗癎看磵稈竿簡肝艮艱諫間乫喝曷渴碣竭葛褐蝎鞨勘坎堪嵌感憾戡敢柑橄減甘疳監瞰紺邯鑑鑒龕"], +["cba1","匣岬甲胛鉀閘剛堈姜岡崗康强彊慷江畺疆糠絳綱羌腔舡薑襁講鋼降鱇介价個凱塏愷愾慨改槪漑疥皆盖箇芥蓋豈鎧開喀客坑更粳羹醵倨去居巨拒据據擧渠炬祛距踞車遽鉅鋸乾件健巾建愆楗腱虔蹇鍵騫乞傑杰桀儉劍劒檢"], +["cca1","瞼鈐黔劫怯迲偈憩揭擊格檄激膈覡隔堅牽犬甄絹繭肩見譴遣鵑抉決潔結缺訣兼慊箝謙鉗鎌京俓倞傾儆勁勍卿坰境庚徑慶憬擎敬景暻更梗涇炅烱璟璥瓊痙硬磬竟競絅經耕耿脛莖警輕逕鏡頃頸驚鯨係啓堺契季屆悸戒桂械"], +["cda1","棨溪界癸磎稽系繫繼計誡谿階鷄古叩告呱固姑孤尻庫拷攷故敲暠枯槁沽痼皐睾稿羔考股膏苦苽菰藁蠱袴誥賈辜錮雇顧高鼓哭斛曲梏穀谷鵠困坤崑昆梱棍滾琨袞鯤汨滑骨供公共功孔工恐恭拱控攻珙空蚣貢鞏串寡戈果瓜"], +["cea1","科菓誇課跨過鍋顆廓槨藿郭串冠官寬慣棺款灌琯瓘管罐菅觀貫關館刮恝括适侊光匡壙廣曠洸炚狂珖筐胱鑛卦掛罫乖傀塊壞怪愧拐槐魁宏紘肱轟交僑咬喬嬌嶠巧攪敎校橋狡皎矯絞翹膠蕎蛟較轎郊餃驕鮫丘久九仇俱具勾"], +["cfa1","區口句咎嘔坵垢寇嶇廐懼拘救枸柩構歐毆毬求溝灸狗玖球瞿矩究絿耉臼舅舊苟衢謳購軀逑邱鉤銶駒驅鳩鷗龜國局菊鞠鞫麴君窘群裙軍郡堀屈掘窟宮弓穹窮芎躬倦券勸卷圈拳捲權淃眷厥獗蕨蹶闕机櫃潰詭軌饋句晷歸貴"], +["d0a1","鬼龜叫圭奎揆槻珪硅窺竅糾葵規赳逵閨勻均畇筠菌鈞龜橘克剋劇戟棘極隙僅劤勤懃斤根槿瑾筋芹菫覲謹近饉契今妗擒昑檎琴禁禽芩衾衿襟金錦伋及急扱汲級給亘兢矜肯企伎其冀嗜器圻基埼夔奇妓寄岐崎己幾忌技旗旣"], +["d1a1","朞期杞棋棄機欺氣汽沂淇玘琦琪璂璣畸畿碁磯祁祇祈祺箕紀綺羈耆耭肌記譏豈起錡錤飢饑騎騏驥麒緊佶吉拮桔金喫儺喇奈娜懦懶拏拿癩",5,"那樂",4,"諾酪駱亂卵暖欄煖爛蘭難鸞捏捺南嵐枏楠湳濫男藍襤拉"], +["d2a1","納臘蠟衲囊娘廊",4,"乃來內奈柰耐冷女年撚秊念恬拈捻寧寗努勞奴弩怒擄櫓爐瑙盧",5,"駑魯",10,"濃籠聾膿農惱牢磊腦賂雷尿壘",7,"嫩訥杻紐勒",5,"能菱陵尼泥匿溺多茶"], +["d3a1","丹亶但單團壇彖斷旦檀段湍短端簞緞蛋袒鄲鍛撻澾獺疸達啖坍憺擔曇淡湛潭澹痰聃膽蕁覃談譚錟沓畓答踏遝唐堂塘幢戇撞棠當糖螳黨代垈坮大對岱帶待戴擡玳臺袋貸隊黛宅德悳倒刀到圖堵塗導屠島嶋度徒悼挑掉搗桃"], +["d4a1","棹櫂淘渡滔濤燾盜睹禱稻萄覩賭跳蹈逃途道都鍍陶韜毒瀆牘犢獨督禿篤纛讀墩惇敦旽暾沌焞燉豚頓乭突仝冬凍動同憧東桐棟洞潼疼瞳童胴董銅兜斗杜枓痘竇荳讀豆逗頭屯臀芚遁遯鈍得嶝橙燈登等藤謄鄧騰喇懶拏癩羅"], +["d5a1","蘿螺裸邏樂洛烙珞絡落諾酪駱丹亂卵欄欒瀾爛蘭鸞剌辣嵐擥攬欖濫籃纜藍襤覽拉臘蠟廊朗浪狼琅瑯螂郞來崍徠萊冷掠略亮倆兩凉梁樑粮粱糧良諒輛量侶儷勵呂廬慮戾旅櫚濾礪藜蠣閭驢驪麗黎力曆歷瀝礫轢靂憐戀攣漣"], +["d6a1","煉璉練聯蓮輦連鍊冽列劣洌烈裂廉斂殮濂簾獵令伶囹寧岺嶺怜玲笭羚翎聆逞鈴零靈領齡例澧禮醴隷勞怒撈擄櫓潞瀘爐盧老蘆虜路輅露魯鷺鹵碌祿綠菉錄鹿麓論壟弄朧瀧瓏籠聾儡瀨牢磊賂賚賴雷了僚寮廖料燎療瞭聊蓼"], +["d7a1","遼鬧龍壘婁屢樓淚漏瘻累縷蔞褸鏤陋劉旒柳榴流溜瀏琉瑠留瘤硫謬類六戮陸侖倫崙淪綸輪律慄栗率隆勒肋凜凌楞稜綾菱陵俚利厘吏唎履悧李梨浬犁狸理璃異痢籬罹羸莉裏裡里釐離鯉吝潾燐璘藺躪隣鱗麟林淋琳臨霖砬"], +["d8a1","立笠粒摩瑪痲碼磨馬魔麻寞幕漠膜莫邈万卍娩巒彎慢挽晩曼滿漫灣瞞萬蔓蠻輓饅鰻唜抹末沫茉襪靺亡妄忘忙望網罔芒茫莽輞邙埋妹媒寐昧枚梅每煤罵買賣邁魅脈貊陌驀麥孟氓猛盲盟萌冪覓免冕勉棉沔眄眠綿緬面麵滅"], 
+["d9a1","蔑冥名命明暝椧溟皿瞑茗蓂螟酩銘鳴袂侮冒募姆帽慕摸摹暮某模母毛牟牡瑁眸矛耗芼茅謀謨貌木沐牧目睦穆鶩歿沒夢朦蒙卯墓妙廟描昴杳渺猫竗苗錨務巫憮懋戊拇撫无楙武毋無珷畝繆舞茂蕪誣貿霧鵡墨默們刎吻問文"], +["daa1","汶紊紋聞蚊門雯勿沕物味媚尾嵋彌微未梶楣渼湄眉米美薇謎迷靡黴岷悶愍憫敏旻旼民泯玟珉緡閔密蜜謐剝博拍搏撲朴樸泊珀璞箔粕縛膊舶薄迫雹駁伴半反叛拌搬攀斑槃泮潘班畔瘢盤盼磐磻礬絆般蟠返頒飯勃拔撥渤潑"], +["dba1","發跋醱鉢髮魃倣傍坊妨尨幇彷房放方旁昉枋榜滂磅紡肪膀舫芳蒡蚌訪謗邦防龐倍俳北培徘拜排杯湃焙盃背胚裴裵褙賠輩配陪伯佰帛柏栢白百魄幡樊煩燔番磻繁蕃藩飜伐筏罰閥凡帆梵氾汎泛犯範范法琺僻劈壁擘檗璧癖"], +["dca1","碧蘗闢霹便卞弁變辨辯邊別瞥鱉鼈丙倂兵屛幷昞昺柄棅炳甁病秉竝輧餠騈保堡報寶普步洑湺潽珤甫菩補褓譜輔伏僕匐卜宓復服福腹茯蔔複覆輹輻馥鰒本乶俸奉封峯峰捧棒烽熢琫縫蓬蜂逢鋒鳳不付俯傅剖副否咐埠夫婦"], +["dda1","孚孵富府復扶敷斧浮溥父符簿缶腐腑膚艀芙莩訃負賦賻赴趺部釜阜附駙鳧北分吩噴墳奔奮忿憤扮昐汾焚盆粉糞紛芬賁雰不佛弗彿拂崩朋棚硼繃鵬丕備匕匪卑妃婢庇悲憊扉批斐枇榧比毖毗毘沸泌琵痺砒碑秕秘粃緋翡肥"], +["dea1","脾臂菲蜚裨誹譬費鄙非飛鼻嚬嬪彬斌檳殯浜濱瀕牝玭貧賓頻憑氷聘騁乍事些仕伺似使俟僿史司唆嗣四士奢娑寫寺射巳師徙思捨斜斯柶査梭死沙泗渣瀉獅砂社祀祠私篩紗絲肆舍莎蓑蛇裟詐詞謝賜赦辭邪飼駟麝削數朔索"], +["dfa1","傘刪山散汕珊産疝算蒜酸霰乷撒殺煞薩三參杉森渗芟蔘衫揷澁鈒颯上傷像償商喪嘗孀尙峠常床庠廂想桑橡湘爽牀狀相祥箱翔裳觴詳象賞霜塞璽賽嗇塞穡索色牲生甥省笙墅壻嶼序庶徐恕抒捿敍暑曙書栖棲犀瑞筮絮緖署"], +["e0a1","胥舒薯西誓逝鋤黍鼠夕奭席惜昔晳析汐淅潟石碩蓆釋錫仙僊先善嬋宣扇敾旋渲煽琁瑄璇璿癬禪線繕羨腺膳船蘚蟬詵跣選銑鐥饍鮮卨屑楔泄洩渫舌薛褻設說雪齧剡暹殲纖蟾贍閃陝攝涉燮葉城姓宬性惺成星晟猩珹盛省筬"], +["e1a1","聖聲腥誠醒世勢歲洗稅笹細說貰召嘯塑宵小少巢所掃搔昭梳沼消溯瀟炤燒甦疏疎瘙笑篠簫素紹蔬蕭蘇訴逍遡邵銷韶騷俗屬束涑粟續謖贖速孫巽損蓀遜飡率宋悚松淞訟誦送頌刷殺灑碎鎖衰釗修受嗽囚垂壽嫂守岫峀帥愁"], +["e2a1","戍手授搜收數樹殊水洙漱燧狩獸琇璲瘦睡秀穗竪粹綏綬繡羞脩茱蒐蓚藪袖誰讐輸遂邃酬銖銹隋隧隨雖需須首髓鬚叔塾夙孰宿淑潚熟琡璹肅菽巡徇循恂旬栒楯橓殉洵淳珣盾瞬筍純脣舜荀蓴蕣詢諄醇錞順馴戌術述鉥崇崧"], +["e3a1","嵩瑟膝蝨濕拾習褶襲丞乘僧勝升承昇繩蠅陞侍匙嘶始媤尸屎屍市弑恃施是時枾柴猜矢示翅蒔蓍視試詩諡豕豺埴寔式息拭植殖湜熄篒蝕識軾食飾伸侁信呻娠宸愼新晨燼申神紳腎臣莘薪藎蜃訊身辛辰迅失室實悉審尋心沁"], +["e4a1","沈深瀋甚芯諶什十拾雙氏亞俄兒啞娥峨我牙芽莪蛾衙訝阿雅餓鴉鵝堊岳嶽幄惡愕握樂渥鄂鍔顎鰐齷安岸按晏案眼雁鞍顔鮟斡謁軋閼唵岩巖庵暗癌菴闇壓押狎鴨仰央怏昻殃秧鴦厓哀埃崖愛曖涯碍艾隘靄厄扼掖液縊腋額"], +["e5a1","櫻罌鶯鸚也倻冶夜惹揶椰爺耶若野弱掠略約若葯蒻藥躍亮佯兩凉壤孃恙揚攘敭暘梁楊樣洋瀁煬痒瘍禳穰糧羊良襄諒讓釀陽量養圄御於漁瘀禦語馭魚齬億憶抑檍臆偃堰彦焉言諺孼蘖俺儼嚴奄掩淹嶪業円予余勵呂女如廬"], +["e6a1","旅歟汝濾璵礖礪與艅茹輿轝閭餘驪麗黎亦力域役易曆歷疫繹譯轢逆驛嚥堧姸娟宴年延憐戀捐挻撚椽沇沿涎涓淵演漣烟然煙煉燃燕璉硏硯秊筵緣練縯聯衍軟輦蓮連鉛鍊鳶列劣咽悅涅烈熱裂說閱厭廉念捻染殮炎焰琰艶苒"], +["e7a1","簾閻髥鹽曄獵燁葉令囹塋寧嶺嶸影怜映暎楹榮永泳渶潁濚瀛瀯煐營獰玲瑛瑩瓔盈穎纓羚聆英詠迎鈴鍈零霙靈領乂倪例刈叡曳汭濊猊睿穢芮藝蘂禮裔詣譽豫醴銳隸霓預五伍俉傲午吾吳嗚塢墺奧娛寤悟惡懊敖旿晤梧汚澳"], +["e8a1","烏熬獒筽蜈誤鰲鼇屋沃獄玉鈺溫瑥瘟穩縕蘊兀壅擁瓮甕癰翁邕雍饔渦瓦窩窪臥蛙蝸訛婉完宛梡椀浣玩琓琬碗緩翫脘腕莞豌阮頑曰往旺枉汪王倭娃歪矮外嵬巍猥畏了僚僥凹堯夭妖姚寥寮尿嶢拗搖撓擾料曜樂橈燎燿瑤療"], +["e9a1","窈窯繇繞耀腰蓼蟯要謠遙遼邀饒慾欲浴縟褥辱俑傭冗勇埇墉容庸慂榕涌湧溶熔瑢用甬聳茸蓉踊鎔鏞龍于佑偶優又友右宇寓尤愚憂旴牛玗瑀盂祐禑禹紆羽芋藕虞迂遇郵釪隅雨雩勖彧旭昱栯煜稶郁頊云暈橒殞澐熉耘芸蕓"], +["eaa1","運隕雲韻蔚鬱亐熊雄元原員圓園垣媛嫄寃怨愿援沅洹湲源爰猿瑗苑袁轅遠阮院願鴛月越鉞位偉僞危圍委威尉慰暐渭爲瑋緯胃萎葦蔿蝟衛褘謂違韋魏乳侑儒兪劉唯喩孺宥幼幽庾悠惟愈愉揄攸有杻柔柚柳楡楢油洧流游溜"], +["eba1","濡猶猷琉瑜由留癒硫紐維臾萸裕誘諛諭踰蹂遊逾遺酉釉鍮類六堉戮毓肉育陸倫允奫尹崙淪潤玧胤贇輪鈗閏律慄栗率聿戎瀜絨融隆垠恩慇殷誾銀隱乙吟淫蔭陰音飮揖泣邑凝應膺鷹依倚儀宜意懿擬椅毅疑矣義艤薏蟻衣誼"], +["eca1","議醫二以伊利吏夷姨履已弛彛怡易李梨泥爾珥理異痍痢移罹而耳肄苡荑裏裡貽貳邇里離飴餌匿溺瀷益翊翌翼謚人仁刃印吝咽因姻寅引忍湮燐璘絪茵藺蚓認隣靭靷鱗麟一佚佾壹日溢逸鎰馹任壬妊姙恁林淋稔臨荏賃入卄"], +["eda1","立笠粒仍剩孕芿仔刺咨姉姿子字孜恣慈滋炙煮玆瓷疵磁紫者自茨蔗藉諮資雌作勺嚼斫昨灼炸爵綽芍酌雀鵲孱棧殘潺盞岑暫潛箴簪蠶雜丈仗匠場墻壯奬將帳庄張掌暲杖樟檣欌漿牆狀獐璋章粧腸臟臧莊葬蔣薔藏裝贓醬長"], +["eea1","障再哉在宰才材栽梓渽滓災縡裁財載齋齎爭箏諍錚佇低儲咀姐底抵杵楮樗沮渚狙猪疽箸紵苧菹著藷詛貯躇這邸雎齟勣吊嫡寂摘敵滴狄炙的積笛籍績翟荻謫賊赤跡蹟迪迹適鏑佃佺傳全典前剪塡塼奠專展廛悛戰栓殿氈澱"], +["efa1","煎琠田甸畑癲筌箋箭篆纏詮輾轉鈿銓錢鐫電顚顫餞切截折浙癤竊節絶占岾店漸点粘霑鮎點接摺蝶丁井亭停偵呈姃定幀庭廷征情挺政整旌晶晸柾楨檉正汀淀淨渟湞瀞炡玎珽町睛碇禎程穽精綎艇訂諪貞鄭酊釘鉦鋌錠霆靖"], +["f0a1","靜頂鼎制劑啼堤帝弟悌提梯濟祭第臍薺製諸蹄醍除際霽題齊俎兆凋助嘲弔彫措操早晁曺曹朝條棗槽漕潮照燥爪璪眺祖祚租稠窕粗糟組繰肇藻蚤詔調趙躁造遭釣阻雕鳥族簇足鏃存尊卒拙猝倧宗從悰慫棕淙琮種終綜縱腫"], +["f1a1","踪踵鍾鐘佐坐左座挫罪主住侏做姝胄呪周嗾奏宙州廚晝朱柱株注洲湊澍炷珠疇籌紂紬綢舟蛛註誅走躊輳週酎酒鑄駐竹粥俊儁准埈寯峻晙樽浚準濬焌畯竣蠢逡遵雋駿茁中仲衆重卽櫛楫汁葺增憎曾拯烝甑症繒蒸證贈之只"], +["f2a1","咫地址志持指摯支旨智枝枳止池沚漬知砥祉祗紙肢脂至芝芷蜘誌識贄趾遲直稙稷織職唇嗔塵振搢晉晋桭榛殄津溱珍瑨璡畛疹盡眞瞋秦縉縝臻蔯袗診賑軫辰進鎭陣陳震侄叱姪嫉帙桎瓆疾秩窒膣蛭質跌迭斟朕什執潗緝輯"], +["f3a1","鏶集徵懲澄且侘借叉嗟嵯差次此磋箚茶蹉車遮捉搾着窄錯鑿齪撰澯燦璨瓚竄簒纂粲纘讚贊鑽餐饌刹察擦札紮僭參塹慘慙懺斬站讒讖倉倡創唱娼廠彰愴敞昌昶暢槍滄漲猖瘡窓脹艙菖蒼債埰寀寨彩採砦綵菜蔡采釵冊柵策"], +["f4a1","責凄妻悽處倜刺剔尺慽戚拓擲斥滌瘠脊蹠陟隻仟千喘天川擅泉淺玔穿舛薦賤踐遷釧闡阡韆凸哲喆徹撤澈綴輟轍鐵僉尖沾添甛瞻簽籤詹諂堞妾帖捷牒疊睫諜貼輒廳晴淸聽菁請靑鯖切剃替涕滯締諦逮遞體初剿哨憔抄招梢"], +["f5a1","椒楚樵炒焦硝礁礎秒稍肖艸苕草蕉貂超酢醋醮促囑燭矗蜀觸寸忖村邨叢塚寵悤憁摠總聰蔥銃撮催崔最墜抽推椎楸樞湫皺秋芻萩諏趨追鄒酋醜錐錘鎚雛騶鰍丑畜祝竺筑築縮蓄蹙蹴軸逐春椿瑃出朮黜充忠沖蟲衝衷悴膵萃"], +["f6a1","贅取吹嘴娶就炊翠聚脆臭趣醉驟鷲側仄厠惻測層侈値嗤峙幟恥梔治淄熾痔痴癡稚穉緇緻置致蚩輜雉馳齒則勅飭親七柒漆侵寢枕沈浸琛砧針鍼蟄秤稱快他咤唾墮妥惰打拖朶楕舵陀馱駝倬卓啄坼度托拓擢晫柝濁濯琢琸託"], +["f7a1","鐸呑嘆坦彈憚歎灘炭綻誕奪脫探眈耽貪塔搭榻宕帑湯糖蕩兌台太怠態殆汰泰笞胎苔跆邰颱宅擇澤撑攄兎吐土討慟桶洞痛筒統通堆槌腿褪退頹偸套妬投透鬪慝特闖坡婆巴把播擺杷波派爬琶破罷芭跛頗判坂板版瓣販辦鈑"], +["f8a1","阪八叭捌佩唄悖敗沛浿牌狽稗覇貝彭澎烹膨愎便偏扁片篇編翩遍鞭騙貶坪平枰萍評吠嬖幣廢弊斃肺蔽閉陛佈包匍匏咆哺圃布怖抛抱捕暴泡浦疱砲胞脯苞葡蒲袍褒逋鋪飽鮑幅暴曝瀑爆輻俵剽彪慓杓標漂瓢票表豹飇飄驃"], 
+["f9a1","品稟楓諷豊風馮彼披疲皮被避陂匹弼必泌珌畢疋筆苾馝乏逼下何厦夏廈昰河瑕荷蝦賀遐霞鰕壑學虐謔鶴寒恨悍旱汗漢澣瀚罕翰閑閒限韓割轄函含咸啣喊檻涵緘艦銜陷鹹合哈盒蛤閤闔陜亢伉姮嫦巷恒抗杭桁沆港缸肛航"], +["faa1","行降項亥偕咳垓奚孩害懈楷海瀣蟹解該諧邂駭骸劾核倖幸杏荇行享向嚮珦鄕響餉饗香噓墟虛許憲櫶獻軒歇險驗奕爀赫革俔峴弦懸晛泫炫玄玹現眩睍絃絢縣舷衒見賢鉉顯孑穴血頁嫌俠協夾峽挾浹狹脅脇莢鋏頰亨兄刑型"], +["fba1","形泂滎瀅灐炯熒珩瑩荊螢衡逈邢鎣馨兮彗惠慧暳蕙蹊醯鞋乎互呼壕壺好岵弧戶扈昊晧毫浩淏湖滸澔濠濩灝狐琥瑚瓠皓祜糊縞胡芦葫蒿虎號蝴護豪鎬頀顥惑或酷婚昏混渾琿魂忽惚笏哄弘汞泓洪烘紅虹訌鴻化和嬅樺火畵"], +["fca1","禍禾花華話譁貨靴廓擴攫確碻穫丸喚奐宦幻患換歡晥桓渙煥環紈還驩鰥活滑猾豁闊凰幌徨恍惶愰慌晃晄榥況湟滉潢煌璜皇篁簧荒蝗遑隍黃匯回廻徊恢悔懷晦會檜淮澮灰獪繪膾茴蛔誨賄劃獲宖橫鐄哮嚆孝效斅曉梟涍淆"], +["fda1","爻肴酵驍侯候厚后吼喉嗅帿後朽煦珝逅勛勳塤壎焄熏燻薰訓暈薨喧暄煊萱卉喙毁彙徽揮暉煇諱輝麾休携烋畦虧恤譎鷸兇凶匈洶胸黑昕欣炘痕吃屹紇訖欠欽歆吸恰洽翕興僖凞喜噫囍姬嬉希憙憘戱晞曦熙熹熺犧禧稀羲詰"] +] diff --git a/node_modules/iconv-lite/encodings/tables/cp950.json b/node_modules/iconv-lite/encodings/tables/cp950.json new file mode 100644 index 000000000..d8bc87178 --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/cp950.json @@ -0,0 +1,177 @@ +[ +["0","\u0000",127], +["a140"," ,、。.‧;:?!︰…‥﹐﹑﹒·﹔﹕﹖﹗|–︱—︳╴︴﹏()︵︶{}︷︸〔〕︹︺【】︻︼《》︽︾〈〉︿﹀「」﹁﹂『』﹃﹄﹙﹚"], +["a1a1","﹛﹜﹝﹞‘’“”〝〞‵′#&*※§〃○●△▲◎☆★◇◆□■▽▼㊣℅¯ ̄_ˍ﹉﹊﹍﹎﹋﹌﹟﹠﹡+-×÷±√<>=≦≧≠∞≒≡﹢",4,"~∩∪⊥∠∟⊿㏒㏑∫∮∵∴♀♂⊕⊙↑↓←→↖↗↙↘∥∣/"], +["a240","\∕﹨$¥〒¢£%@℃℉﹩﹪﹫㏕㎜㎝㎞㏎㎡㎎㎏㏄°兙兛兞兝兡兣嗧瓩糎▁",7,"▏▎▍▌▋▊▉┼┴┬┤├▔─│▕┌┐└┘╭"], +["a2a1","╮╰╯═╞╪╡◢◣◥◤╱╲╳0",9,"Ⅰ",9,"〡",8,"十卄卅A",25,"a",21], +["a340","wxyzΑ",16,"Σ",6,"α",16,"σ",6,"ㄅ",10], +["a3a1","ㄐ",25,"˙ˉˊˇˋ"], +["a3e1","€"], +["a440","一乙丁七乃九了二人儿入八几刀刁力匕十卜又三下丈上丫丸凡久么也乞于亡兀刃勺千叉口土士夕大女子孑孓寸小尢尸山川工己已巳巾干廾弋弓才"], +["a4a1","丑丐不中丰丹之尹予云井互五亢仁什仃仆仇仍今介仄元允內六兮公冗凶分切刈勻勾勿化匹午升卅卞厄友及反壬天夫太夭孔少尤尺屯巴幻廿弔引心戈戶手扎支文斗斤方日曰月木欠止歹毋比毛氏水火爪父爻片牙牛犬王丙"], +["a540","世丕且丘主乍乏乎以付仔仕他仗代令仙仞充兄冉冊冬凹出凸刊加功包匆北匝仟半卉卡占卯卮去可古右召叮叩叨叼司叵叫另只史叱台句叭叻四囚外"], +["a5a1","央失奴奶孕它尼巨巧左市布平幼弁弘弗必戊打扔扒扑斥旦朮本未末札正母民氐永汁汀氾犯玄玉瓜瓦甘生用甩田由甲申疋白皮皿目矛矢石示禾穴立丞丟乒乓乩亙交亦亥仿伉伙伊伕伍伐休伏仲件任仰仳份企伋光兇兆先全"], +["a640","共再冰列刑划刎刖劣匈匡匠印危吉吏同吊吐吁吋各向名合吃后吆吒因回囝圳地在圭圬圯圩夙多夷夸妄奸妃好她如妁字存宇守宅安寺尖屹州帆并年"], +["a6a1","式弛忙忖戎戌戍成扣扛托收早旨旬旭曲曳有朽朴朱朵次此死氖汝汗汙江池汐汕污汛汍汎灰牟牝百竹米糸缶羊羽老考而耒耳聿肉肋肌臣自至臼舌舛舟艮色艾虫血行衣西阡串亨位住佇佗佞伴佛何估佐佑伽伺伸佃佔似但佣"], +["a740","作你伯低伶余佝佈佚兌克免兵冶冷別判利刪刨劫助努劬匣即卵吝吭吞吾否呎吧呆呃吳呈呂君吩告吹吻吸吮吵吶吠吼呀吱含吟听囪困囤囫坊坑址坍"], +["a7a1","均坎圾坐坏圻壯夾妝妒妨妞妣妙妖妍妤妓妊妥孝孜孚孛完宋宏尬局屁尿尾岐岑岔岌巫希序庇床廷弄弟彤形彷役忘忌志忍忱快忸忪戒我抄抗抖技扶抉扭把扼找批扳抒扯折扮投抓抑抆改攻攸旱更束李杏材村杜杖杞杉杆杠"], +["a840","杓杗步每求汞沙沁沈沉沅沛汪決沐汰沌汨沖沒汽沃汲汾汴沆汶沍沔沘沂灶灼災灸牢牡牠狄狂玖甬甫男甸皂盯矣私秀禿究系罕肖肓肝肘肛肚育良芒"], +["a8a1","芋芍見角言谷豆豕貝赤走足身車辛辰迂迆迅迄巡邑邢邪邦那酉釆里防阮阱阪阬並乖乳事些亞享京佯依侍佳使佬供例來侃佰併侈佩佻侖佾侏侑佺兔兒兕兩具其典冽函刻券刷刺到刮制剁劾劻卒協卓卑卦卷卸卹取叔受味呵"], +["a940","咖呸咕咀呻呷咄咒咆呼咐呱呶和咚呢周咋命咎固垃坷坪坩坡坦坤坼夜奉奇奈奄奔妾妻委妹妮姑姆姐姍始姓姊妯妳姒姅孟孤季宗定官宜宙宛尚屈居"], +["a9a1","屆岷岡岸岩岫岱岳帘帚帖帕帛帑幸庚店府底庖延弦弧弩往征彿彼忝忠忽念忿怏怔怯怵怖怪怕怡性怩怫怛或戕房戾所承拉拌拄抿拂抹拒招披拓拔拋拈抨抽押拐拙拇拍抵拚抱拘拖拗拆抬拎放斧於旺昔易昌昆昂明昀昏昕昊"], +["aa40","昇服朋杭枋枕東果杳杷枇枝林杯杰板枉松析杵枚枓杼杪杲欣武歧歿氓氛泣注泳沱泌泥河沽沾沼波沫法泓沸泄油況沮泗泅泱沿治泡泛泊沬泯泜泖泠"], +["aaa1","炕炎炒炊炙爬爭爸版牧物狀狎狙狗狐玩玨玟玫玥甽疝疙疚的盂盲直知矽社祀祁秉秈空穹竺糾罔羌羋者肺肥肢肱股肫肩肴肪肯臥臾舍芳芝芙芭芽芟芹花芬芥芯芸芣芰芾芷虎虱初表軋迎返近邵邸邱邶采金長門阜陀阿阻附"], +["ab40","陂隹雨青非亟亭亮信侵侯便俠俑俏保促侶俘俟俊俗侮俐俄係俚俎俞侷兗冒冑冠剎剃削前剌剋則勇勉勃勁匍南卻厚叛咬哀咨哎哉咸咦咳哇哂咽咪品"], +["aba1","哄哈咯咫咱咻咩咧咿囿垂型垠垣垢城垮垓奕契奏奎奐姜姘姿姣姨娃姥姪姚姦威姻孩宣宦室客宥封屎屏屍屋峙峒巷帝帥帟幽庠度建弈弭彥很待徊律徇後徉怒思怠急怎怨恍恰恨恢恆恃恬恫恪恤扁拜挖按拼拭持拮拽指拱拷"], +["ac40","拯括拾拴挑挂政故斫施既春昭映昧是星昨昱昤曷柿染柱柔某柬架枯柵柩柯柄柑枴柚查枸柏柞柳枰柙柢柝柒歪殃殆段毒毗氟泉洋洲洪流津洌洱洞洗"], +["aca1","活洽派洶洛泵洹洧洸洩洮洵洎洫炫為炳炬炯炭炸炮炤爰牲牯牴狩狠狡玷珊玻玲珍珀玳甚甭畏界畎畋疫疤疥疢疣癸皆皇皈盈盆盃盅省盹相眉看盾盼眇矜砂研砌砍祆祉祈祇禹禺科秒秋穿突竿竽籽紂紅紀紉紇約紆缸美羿耄"], +["ad40","耐耍耑耶胖胥胚胃胄背胡胛胎胞胤胝致舢苧范茅苣苛苦茄若茂茉苒苗英茁苜苔苑苞苓苟苯茆虐虹虻虺衍衫要觔計訂訃貞負赴赳趴軍軌述迦迢迪迥"], +["ada1","迭迫迤迨郊郎郁郃酋酊重閂限陋陌降面革韋韭音頁風飛食首香乘亳倌倍倣俯倦倥俸倩倖倆值借倚倒們俺倀倔倨俱倡個候倘俳修倭倪俾倫倉兼冤冥冢凍凌准凋剖剜剔剛剝匪卿原厝叟哨唐唁唷哼哥哲唆哺唔哩哭員唉哮哪"], +["ae40","哦唧唇哽唏圃圄埂埔埋埃堉夏套奘奚娑娘娜娟娛娓姬娠娣娩娥娌娉孫屘宰害家宴宮宵容宸射屑展屐峭峽峻峪峨峰島崁峴差席師庫庭座弱徒徑徐恙"], +["aea1","恣恥恐恕恭恩息悄悟悚悍悔悌悅悖扇拳挈拿捎挾振捕捂捆捏捉挺捐挽挪挫挨捍捌效敉料旁旅時晉晏晃晒晌晅晁書朔朕朗校核案框桓根桂桔栩梳栗桌桑栽柴桐桀格桃株桅栓栘桁殊殉殷氣氧氨氦氤泰浪涕消涇浦浸海浙涓"], +["af40","浬涉浮浚浴浩涌涊浹涅浥涔烊烘烤烙烈烏爹特狼狹狽狸狷玆班琉珮珠珪珞畔畝畜畚留疾病症疲疳疽疼疹痂疸皋皰益盍盎眩真眠眨矩砰砧砸砝破砷"], +["afa1","砥砭砠砟砲祕祐祠祟祖神祝祗祚秤秣秧租秦秩秘窄窈站笆笑粉紡紗紋紊素索純紐紕級紜納紙紛缺罟羔翅翁耆耘耕耙耗耽耿胱脂胰脅胭胴脆胸胳脈能脊胼胯臭臬舀舐航舫舨般芻茫荒荔荊茸荐草茵茴荏茲茹茶茗荀茱茨荃"], +["b040","虔蚊蚪蚓蚤蚩蚌蚣蚜衰衷袁袂衽衹記訐討訌訕訊託訓訖訏訑豈豺豹財貢起躬軒軔軏辱送逆迷退迺迴逃追逅迸邕郡郝郢酒配酌釘針釗釜釙閃院陣陡"], 
+["b0a1","陛陝除陘陞隻飢馬骨高鬥鬲鬼乾偺偽停假偃偌做偉健偶偎偕偵側偷偏倏偯偭兜冕凰剪副勒務勘動匐匏匙匿區匾參曼商啪啦啄啞啡啃啊唱啖問啕唯啤唸售啜唬啣唳啁啗圈國圉域堅堊堆埠埤基堂堵執培夠奢娶婁婉婦婪婀"], +["b140","娼婢婚婆婊孰寇寅寄寂宿密尉專將屠屜屝崇崆崎崛崖崢崑崩崔崙崤崧崗巢常帶帳帷康庸庶庵庾張強彗彬彩彫得徙從徘御徠徜恿患悉悠您惋悴惦悽"], +["b1a1","情悻悵惜悼惘惕惆惟悸惚惇戚戛扈掠控捲掖探接捷捧掘措捱掩掉掃掛捫推掄授掙採掬排掏掀捻捩捨捺敝敖救教敗啟敏敘敕敔斜斛斬族旋旌旎晝晚晤晨晦晞曹勗望梁梯梢梓梵桿桶梱梧梗械梃棄梭梆梅梔條梨梟梡梂欲殺"], +["b240","毫毬氫涎涼淳淙液淡淌淤添淺清淇淋涯淑涮淞淹涸混淵淅淒渚涵淚淫淘淪深淮淨淆淄涪淬涿淦烹焉焊烽烯爽牽犁猜猛猖猓猙率琅琊球理現琍瓠瓶"], +["b2a1","瓷甜產略畦畢異疏痔痕疵痊痍皎盔盒盛眷眾眼眶眸眺硫硃硎祥票祭移窒窕笠笨笛第符笙笞笮粒粗粕絆絃統紮紹紼絀細紳組累終紲紱缽羞羚翌翎習耜聊聆脯脖脣脫脩脰脤舂舵舷舶船莎莞莘荸莢莖莽莫莒莊莓莉莠荷荻荼"], +["b340","莆莧處彪蛇蛀蚶蛄蚵蛆蛋蚱蚯蛉術袞袈被袒袖袍袋覓規訪訝訣訥許設訟訛訢豉豚販責貫貨貪貧赧赦趾趺軛軟這逍通逗連速逝逐逕逞造透逢逖逛途"], +["b3a1","部郭都酗野釵釦釣釧釭釩閉陪陵陳陸陰陴陶陷陬雀雪雩章竟頂頃魚鳥鹵鹿麥麻傢傍傅備傑傀傖傘傚最凱割剴創剩勞勝勛博厥啻喀喧啼喊喝喘喂喜喪喔喇喋喃喳單喟唾喲喚喻喬喱啾喉喫喙圍堯堪場堤堰報堡堝堠壹壺奠"], +["b440","婷媚婿媒媛媧孳孱寒富寓寐尊尋就嵌嵐崴嵇巽幅帽幀幃幾廊廁廂廄弼彭復循徨惑惡悲悶惠愜愣惺愕惰惻惴慨惱愎惶愉愀愒戟扉掣掌描揀揩揉揆揍"], +["b4a1","插揣提握揖揭揮捶援揪換摒揚揹敞敦敢散斑斐斯普晰晴晶景暑智晾晷曾替期朝棺棕棠棘棗椅棟棵森棧棹棒棲棣棋棍植椒椎棉棚楮棻款欺欽殘殖殼毯氮氯氬港游湔渡渲湧湊渠渥渣減湛湘渤湖湮渭渦湯渴湍渺測湃渝渾滋"], +["b540","溉渙湎湣湄湲湩湟焙焚焦焰無然煮焜牌犄犀猶猥猴猩琺琪琳琢琥琵琶琴琯琛琦琨甥甦畫番痢痛痣痙痘痞痠登發皖皓皴盜睏短硝硬硯稍稈程稅稀窘"], +["b5a1","窗窖童竣等策筆筐筒答筍筋筏筑粟粥絞結絨絕紫絮絲絡給絢絰絳善翔翕耋聒肅腕腔腋腑腎脹腆脾腌腓腴舒舜菩萃菸萍菠菅萋菁華菱菴著萊菰萌菌菽菲菊萸萎萄菜萇菔菟虛蛟蛙蛭蛔蛛蛤蛐蛞街裁裂袱覃視註詠評詞証詁"], +["b640","詔詛詐詆訴診訶詖象貂貯貼貳貽賁費賀貴買貶貿貸越超趁跎距跋跚跑跌跛跆軻軸軼辜逮逵週逸進逶鄂郵鄉郾酣酥量鈔鈕鈣鈉鈞鈍鈐鈇鈑閔閏開閑"], +["b6a1","間閒閎隊階隋陽隅隆隍陲隄雁雅雄集雇雯雲韌項順須飧飪飯飩飲飭馮馭黃黍黑亂傭債傲傳僅傾催傷傻傯僇剿剷剽募勦勤勢勣匯嗟嗨嗓嗦嗎嗜嗇嗑嗣嗤嗯嗚嗡嗅嗆嗥嗉園圓塞塑塘塗塚塔填塌塭塊塢塒塋奧嫁嫉嫌媾媽媼"], +["b740","媳嫂媲嵩嵯幌幹廉廈弒彙徬微愚意慈感想愛惹愁愈慎慌慄慍愾愴愧愍愆愷戡戢搓搾搞搪搭搽搬搏搜搔損搶搖搗搆敬斟新暗暉暇暈暖暄暘暍會榔業"], +["b7a1","楚楷楠楔極椰概楊楨楫楞楓楹榆楝楣楛歇歲毀殿毓毽溢溯滓溶滂源溝滇滅溥溘溼溺溫滑準溜滄滔溪溧溴煎煙煩煤煉照煜煬煦煌煥煞煆煨煖爺牒猷獅猿猾瑯瑚瑕瑟瑞瑁琿瑙瑛瑜當畸瘀痰瘁痲痱痺痿痴痳盞盟睛睫睦睞督"], +["b840","睹睪睬睜睥睨睢矮碎碰碗碘碌碉硼碑碓硿祺祿禁萬禽稜稚稠稔稟稞窟窠筷節筠筮筧粱粳粵經絹綑綁綏絛置罩罪署義羨群聖聘肆肄腱腰腸腥腮腳腫"], +["b8a1","腹腺腦舅艇蒂葷落萱葵葦葫葉葬葛萼萵葡董葩葭葆虞虜號蛹蜓蜈蜇蜀蛾蛻蜂蜃蜆蜊衙裟裔裙補裘裝裡裊裕裒覜解詫該詳試詩詰誇詼詣誠話誅詭詢詮詬詹詻訾詨豢貊貉賊資賈賄貲賃賂賅跡跟跨路跳跺跪跤跦躲較載軾輊"], +["b940","辟農運遊道遂達逼違遐遇遏過遍遑逾遁鄒鄗酬酪酩釉鈷鉗鈸鈽鉀鈾鉛鉋鉤鉑鈴鉉鉍鉅鈹鈿鉚閘隘隔隕雍雋雉雊雷電雹零靖靴靶預頑頓頊頒頌飼飴"], +["b9a1","飽飾馳馱馴髡鳩麂鼎鼓鼠僧僮僥僖僭僚僕像僑僱僎僩兢凳劃劂匱厭嗾嘀嘛嘗嗽嘔嘆嘉嘍嘎嗷嘖嘟嘈嘐嗶團圖塵塾境墓墊塹墅塽壽夥夢夤奪奩嫡嫦嫩嫗嫖嫘嫣孵寞寧寡寥實寨寢寤察對屢嶄嶇幛幣幕幗幔廓廖弊彆彰徹慇"], +["ba40","愿態慷慢慣慟慚慘慵截撇摘摔撤摸摟摺摑摧搴摭摻敲斡旗旖暢暨暝榜榨榕槁榮槓構榛榷榻榫榴槐槍榭槌榦槃榣歉歌氳漳演滾漓滴漩漾漠漬漏漂漢"], +["baa1","滿滯漆漱漸漲漣漕漫漯澈漪滬漁滲滌滷熔熙煽熊熄熒爾犒犖獄獐瑤瑣瑪瑰瑭甄疑瘧瘍瘋瘉瘓盡監瞄睽睿睡磁碟碧碳碩碣禎福禍種稱窪窩竭端管箕箋筵算箝箔箏箸箇箄粹粽精綻綰綜綽綾綠緊綴網綱綺綢綿綵綸維緒緇綬"], +["bb40","罰翠翡翟聞聚肇腐膀膏膈膊腿膂臧臺與舔舞艋蓉蒿蓆蓄蒙蒞蒲蒜蓋蒸蓀蓓蒐蒼蓑蓊蜿蜜蜻蜢蜥蜴蜘蝕蜷蜩裳褂裴裹裸製裨褚裯誦誌語誣認誡誓誤"], +["bba1","說誥誨誘誑誚誧豪貍貌賓賑賒赫趙趕跼輔輒輕輓辣遠遘遜遣遙遞遢遝遛鄙鄘鄞酵酸酷酴鉸銀銅銘銖鉻銓銜銨鉼銑閡閨閩閣閥閤隙障際雌雒需靼鞅韶頗領颯颱餃餅餌餉駁骯骰髦魁魂鳴鳶鳳麼鼻齊億儀僻僵價儂儈儉儅凜"], +["bc40","劇劈劉劍劊勰厲嘮嘻嘹嘲嘿嘴嘩噓噎噗噴嘶嘯嘰墀墟增墳墜墮墩墦奭嬉嫻嬋嫵嬌嬈寮寬審寫層履嶝嶔幢幟幡廢廚廟廝廣廠彈影德徵慶慧慮慝慕憂"], +["bca1","慼慰慫慾憧憐憫憎憬憚憤憔憮戮摩摯摹撞撲撈撐撰撥撓撕撩撒撮播撫撚撬撙撢撳敵敷數暮暫暴暱樣樟槨樁樞標槽模樓樊槳樂樅槭樑歐歎殤毅毆漿潼澄潑潦潔澆潭潛潸潮澎潺潰潤澗潘滕潯潠潟熟熬熱熨牖犛獎獗瑩璋璃"], +["bd40","瑾璀畿瘠瘩瘟瘤瘦瘡瘢皚皺盤瞎瞇瞌瞑瞋磋磅確磊碾磕碼磐稿稼穀稽稷稻窯窮箭箱範箴篆篇篁箠篌糊締練緯緻緘緬緝編緣線緞緩綞緙緲緹罵罷羯"], +["bda1","翩耦膛膜膝膠膚膘蔗蔽蔚蓮蔬蔭蔓蔑蔣蔡蔔蓬蔥蓿蔆螂蝴蝶蝠蝦蝸蝨蝙蝗蝌蝓衛衝褐複褒褓褕褊誼諒談諄誕請諸課諉諂調誰論諍誶誹諛豌豎豬賠賞賦賤賬賭賢賣賜質賡赭趟趣踫踐踝踢踏踩踟踡踞躺輝輛輟輩輦輪輜輞"], +["be40","輥適遮遨遭遷鄰鄭鄧鄱醇醉醋醃鋅銻銷鋪銬鋤鋁銳銼鋒鋇鋰銲閭閱霄霆震霉靠鞍鞋鞏頡頫頜颳養餓餒餘駝駐駟駛駑駕駒駙骷髮髯鬧魅魄魷魯鴆鴉"], +["bea1","鴃麩麾黎墨齒儒儘儔儐儕冀冪凝劑劓勳噙噫噹噩噤噸噪器噥噱噯噬噢噶壁墾壇壅奮嬝嬴學寰導彊憲憑憩憊懍憶憾懊懈戰擅擁擋撻撼據擄擇擂操撿擒擔撾整曆曉暹曄曇暸樽樸樺橙橫橘樹橄橢橡橋橇樵機橈歙歷氅濂澱澡"], +["bf40","濃澤濁澧澳激澹澶澦澠澴熾燉燐燒燈燕熹燎燙燜燃燄獨璜璣璘璟璞瓢甌甍瘴瘸瘺盧盥瞠瞞瞟瞥磨磚磬磧禦積穎穆穌穋窺篙簑築篤篛篡篩篦糕糖縊"], +["bfa1","縑縈縛縣縞縝縉縐罹羲翰翱翮耨膳膩膨臻興艘艙蕊蕙蕈蕨蕩蕃蕉蕭蕪蕞螃螟螞螢融衡褪褲褥褫褡親覦諦諺諫諱謀諜諧諮諾謁謂諷諭諳諶諼豫豭貓賴蹄踱踴蹂踹踵輻輯輸輳辨辦遵遴選遲遼遺鄴醒錠錶鋸錳錯錢鋼錫錄錚"], +["c040","錐錦錡錕錮錙閻隧隨險雕霎霑霖霍霓霏靛靜靦鞘頰頸頻頷頭頹頤餐館餞餛餡餚駭駢駱骸骼髻髭鬨鮑鴕鴣鴦鴨鴒鴛默黔龍龜優償儡儲勵嚎嚀嚐嚅嚇"], +["c0a1","嚏壕壓壑壎嬰嬪嬤孺尷屨嶼嶺嶽嶸幫彌徽應懂懇懦懋戲戴擎擊擘擠擰擦擬擱擢擭斂斃曙曖檀檔檄檢檜櫛檣橾檗檐檠歜殮毚氈濘濱濟濠濛濤濫濯澀濬濡濩濕濮濰燧營燮燦燥燭燬燴燠爵牆獰獲璩環璦璨癆療癌盪瞳瞪瞰瞬"], +["c140","瞧瞭矯磷磺磴磯礁禧禪穗窿簇簍篾篷簌篠糠糜糞糢糟糙糝縮績繆縷縲繃縫總縱繅繁縴縹繈縵縿縯罄翳翼聱聲聰聯聳臆臃膺臂臀膿膽臉膾臨舉艱薪"], +["c1a1","薄蕾薜薑薔薯薛薇薨薊虧蟀蟑螳蟒蟆螫螻螺蟈蟋褻褶襄褸褽覬謎謗謙講謊謠謝謄謐豁谿豳賺賽購賸賻趨蹉蹋蹈蹊轄輾轂轅輿避遽還邁邂邀鄹醣醞醜鍍鎂錨鍵鍊鍥鍋錘鍾鍬鍛鍰鍚鍔闊闋闌闈闆隱隸雖霜霞鞠韓顆颶餵騁"], +["c240","駿鮮鮫鮪鮭鴻鴿麋黏點黜黝黛鼾齋叢嚕嚮壙壘嬸彝懣戳擴擲擾攆擺擻擷斷曜朦檳檬櫃檻檸櫂檮檯歟歸殯瀉瀋濾瀆濺瀑瀏燻燼燾燸獷獵璧璿甕癖癘"], +["c2a1","癒瞽瞿瞻瞼礎禮穡穢穠竄竅簫簧簪簞簣簡糧織繕繞繚繡繒繙罈翹翻職聶臍臏舊藏薩藍藐藉薰薺薹薦蟯蟬蟲蟠覆覲觴謨謹謬謫豐贅蹙蹣蹦蹤蹟蹕軀轉轍邇邃邈醫醬釐鎔鎊鎖鎢鎳鎮鎬鎰鎘鎚鎗闔闖闐闕離雜雙雛雞霤鞣鞦"], +["c340","鞭韹額顏題顎顓颺餾餿餽餮馥騎髁鬃鬆魏魎魍鯊鯉鯽鯈鯀鵑鵝鵠黠鼕鼬儳嚥壞壟壢寵龐廬懲懷懶懵攀攏曠曝櫥櫝櫚櫓瀛瀟瀨瀚瀝瀕瀘爆爍牘犢獸"], 
+["c3a1","獺璽瓊瓣疇疆癟癡矇礙禱穫穩簾簿簸簽簷籀繫繭繹繩繪羅繳羶羹羸臘藩藝藪藕藤藥藷蟻蠅蠍蟹蟾襠襟襖襞譁譜識證譚譎譏譆譙贈贊蹼蹲躇蹶蹬蹺蹴轔轎辭邊邋醱醮鏡鏑鏟鏃鏈鏜鏝鏖鏢鏍鏘鏤鏗鏨關隴難霪霧靡韜韻類"], +["c440","願顛颼饅饉騖騙鬍鯨鯧鯖鯛鶉鵡鵲鵪鵬麒麗麓麴勸嚨嚷嚶嚴嚼壤孀孃孽寶巉懸懺攘攔攙曦朧櫬瀾瀰瀲爐獻瓏癢癥礦礪礬礫竇競籌籃籍糯糰辮繽繼"], +["c4a1","纂罌耀臚艦藻藹蘑藺蘆蘋蘇蘊蠔蠕襤覺觸議譬警譯譟譫贏贍躉躁躅躂醴釋鐘鐃鏽闡霰飄饒饑馨騫騰騷騵鰓鰍鹹麵黨鼯齟齣齡儷儸囁囀囂夔屬巍懼懾攝攜斕曩櫻欄櫺殲灌爛犧瓖瓔癩矓籐纏續羼蘗蘭蘚蠣蠢蠡蠟襪襬覽譴"], +["c540","護譽贓躊躍躋轟辯醺鐮鐳鐵鐺鐸鐲鐫闢霸霹露響顧顥饗驅驃驀騾髏魔魑鰭鰥鶯鶴鷂鶸麝黯鼙齜齦齧儼儻囈囊囉孿巔巒彎懿攤權歡灑灘玀瓤疊癮癬"], +["c5a1","禳籠籟聾聽臟襲襯觼讀贖贗躑躓轡酈鑄鑑鑒霽霾韃韁顫饕驕驍髒鬚鱉鰱鰾鰻鷓鷗鼴齬齪龔囌巖戀攣攫攪曬欐瓚竊籤籣籥纓纖纔臢蘸蘿蠱變邐邏鑣鑠鑤靨顯饜驚驛驗髓體髑鱔鱗鱖鷥麟黴囑壩攬灞癱癲矗罐羈蠶蠹衢讓讒"], +["c640","讖艷贛釀鑪靂靈靄韆顰驟鬢魘鱟鷹鷺鹼鹽鼇齷齲廳欖灣籬籮蠻觀躡釁鑲鑰顱饞髖鬣黌灤矚讚鑷韉驢驥纜讜躪釅鑽鑾鑼鱷鱸黷豔鑿鸚爨驪鬱鸛鸞籲"], +["c940","乂乜凵匚厂万丌乇亍囗兀屮彳丏冇与丮亓仂仉仈冘勼卬厹圠夃夬尐巿旡殳毌气爿丱丼仨仜仩仡仝仚刌匜卌圢圣夗夯宁宄尒尻屴屳帄庀庂忉戉扐氕"], +["c9a1","氶汃氿氻犮犰玊禸肊阞伎优伬仵伔仱伀价伈伝伂伅伢伓伄仴伒冱刓刉刐劦匢匟卍厊吇囡囟圮圪圴夼妀奼妅奻奾奷奿孖尕尥屼屺屻屾巟幵庄异弚彴忕忔忏扜扞扤扡扦扢扙扠扚扥旯旮朾朹朸朻机朿朼朳氘汆汒汜汏汊汔汋"], +["ca40","汌灱牞犴犵玎甪癿穵网艸艼芀艽艿虍襾邙邗邘邛邔阢阤阠阣佖伻佢佉体佤伾佧佒佟佁佘伭伳伿佡冏冹刜刞刡劭劮匉卣卲厎厏吰吷吪呔呅吙吜吥吘"], +["caa1","吽呏呁吨吤呇囮囧囥坁坅坌坉坋坒夆奀妦妘妠妗妎妢妐妏妧妡宎宒尨尪岍岏岈岋岉岒岊岆岓岕巠帊帎庋庉庌庈庍弅弝彸彶忒忑忐忭忨忮忳忡忤忣忺忯忷忻怀忴戺抃抌抎抏抔抇扱扻扺扰抁抈扷扽扲扴攷旰旴旳旲旵杅杇"], +["cb40","杙杕杌杈杝杍杚杋毐氙氚汸汧汫沄沋沏汱汯汩沚汭沇沕沜汦汳汥汻沎灴灺牣犿犽狃狆狁犺狅玕玗玓玔玒町甹疔疕皁礽耴肕肙肐肒肜芐芏芅芎芑芓"], +["cba1","芊芃芄豸迉辿邟邡邥邞邧邠阰阨阯阭丳侘佼侅佽侀侇佶佴侉侄佷佌侗佪侚佹侁佸侐侜侔侞侒侂侕佫佮冞冼冾刵刲刳剆刱劼匊匋匼厒厔咇呿咁咑咂咈呫呺呾呥呬呴呦咍呯呡呠咘呣呧呤囷囹坯坲坭坫坱坰坶垀坵坻坳坴坢"], +["cc40","坨坽夌奅妵妺姏姎妲姌姁妶妼姃姖妱妽姀姈妴姇孢孥宓宕屄屇岮岤岠岵岯岨岬岟岣岭岢岪岧岝岥岶岰岦帗帔帙弨弢弣弤彔徂彾彽忞忥怭怦怙怲怋"], +["cca1","怴怊怗怳怚怞怬怢怍怐怮怓怑怌怉怜戔戽抭抴拑抾抪抶拊抮抳抯抻抩抰抸攽斨斻昉旼昄昒昈旻昃昋昍昅旽昑昐曶朊枅杬枎枒杶杻枘枆构杴枍枌杺枟枑枙枃杽极杸杹枔欥殀歾毞氝沓泬泫泮泙沶泔沭泧沷泐泂沺泃泆泭泲"], +["cd40","泒泝沴沊沝沀泞泀洰泍泇沰泹泏泩泑炔炘炅炓炆炄炑炖炂炚炃牪狖狋狘狉狜狒狔狚狌狑玤玡玭玦玢玠玬玝瓝瓨甿畀甾疌疘皯盳盱盰盵矸矼矹矻矺"], +["cda1","矷祂礿秅穸穻竻籵糽耵肏肮肣肸肵肭舠芠苀芫芚芘芛芵芧芮芼芞芺芴芨芡芩苂芤苃芶芢虰虯虭虮豖迒迋迓迍迖迕迗邲邴邯邳邰阹阽阼阺陃俍俅俓侲俉俋俁俔俜俙侻侳俛俇俖侺俀侹俬剄剉勀勂匽卼厗厖厙厘咺咡咭咥哏"], +["ce40","哃茍咷咮哖咶哅哆咠呰咼咢咾呲哞咰垵垞垟垤垌垗垝垛垔垘垏垙垥垚垕壴复奓姡姞姮娀姱姝姺姽姼姶姤姲姷姛姩姳姵姠姾姴姭宨屌峐峘峌峗峋峛"], +["cea1","峞峚峉峇峊峖峓峔峏峈峆峎峟峸巹帡帢帣帠帤庰庤庢庛庣庥弇弮彖徆怷怹恔恲恞恅恓恇恉恛恌恀恂恟怤恄恘恦恮扂扃拏挍挋拵挎挃拫拹挏挌拸拶挀挓挔拺挕拻拰敁敃斪斿昶昡昲昵昜昦昢昳昫昺昝昴昹昮朏朐柁柲柈枺"], +["cf40","柜枻柸柘柀枷柅柫柤柟枵柍枳柷柶柮柣柂枹柎柧柰枲柼柆柭柌枮柦柛柺柉柊柃柪柋欨殂殄殶毖毘毠氠氡洨洴洭洟洼洿洒洊泚洳洄洙洺洚洑洀洝浂"], +["cfa1","洁洘洷洃洏浀洇洠洬洈洢洉洐炷炟炾炱炰炡炴炵炩牁牉牊牬牰牳牮狊狤狨狫狟狪狦狣玅珌珂珈珅玹玶玵玴珫玿珇玾珃珆玸珋瓬瓮甮畇畈疧疪癹盄眈眃眄眅眊盷盻盺矧矨砆砑砒砅砐砏砎砉砃砓祊祌祋祅祄秕种秏秖秎窀"], +["d040","穾竑笀笁籺籸籹籿粀粁紃紈紁罘羑羍羾耇耎耏耔耷胘胇胠胑胈胂胐胅胣胙胜胊胕胉胏胗胦胍臿舡芔苙苾苹茇苨茀苕茺苫苖苴苬苡苲苵茌苻苶苰苪"], +["d0a1","苤苠苺苳苭虷虴虼虳衁衎衧衪衩觓訄訇赲迣迡迮迠郱邽邿郕郅邾郇郋郈釔釓陔陏陑陓陊陎倞倅倇倓倢倰倛俵俴倳倷倬俶俷倗倜倠倧倵倯倱倎党冔冓凊凄凅凈凎剡剚剒剞剟剕剢勍匎厞唦哢唗唒哧哳哤唚哿唄唈哫唑唅哱"], +["d140","唊哻哷哸哠唎唃唋圁圂埌堲埕埒垺埆垽垼垸垶垿埇埐垹埁夎奊娙娖娭娮娕娏娗娊娞娳孬宧宭宬尃屖屔峬峿峮峱峷崀峹帩帨庨庮庪庬弳弰彧恝恚恧"], +["d1a1","恁悢悈悀悒悁悝悃悕悛悗悇悜悎戙扆拲挐捖挬捄捅挶捃揤挹捋捊挼挩捁挴捘捔捙挭捇挳捚捑挸捗捀捈敊敆旆旃旄旂晊晟晇晑朒朓栟栚桉栲栳栻桋桏栖栱栜栵栫栭栯桎桄栴栝栒栔栦栨栮桍栺栥栠欬欯欭欱欴歭肂殈毦毤"], +["d240","毨毣毢毧氥浺浣浤浶洍浡涒浘浢浭浯涑涍淯浿涆浞浧浠涗浰浼浟涂涘洯浨涋浾涀涄洖涃浻浽浵涐烜烓烑烝烋缹烢烗烒烞烠烔烍烅烆烇烚烎烡牂牸"], +["d2a1","牷牶猀狺狴狾狶狳狻猁珓珙珥珖玼珧珣珩珜珒珛珔珝珚珗珘珨瓞瓟瓴瓵甡畛畟疰痁疻痄痀疿疶疺皊盉眝眛眐眓眒眣眑眕眙眚眢眧砣砬砢砵砯砨砮砫砡砩砳砪砱祔祛祏祜祓祒祑秫秬秠秮秭秪秜秞秝窆窉窅窋窌窊窇竘笐"], +["d340","笄笓笅笏笈笊笎笉笒粄粑粊粌粈粍粅紞紝紑紎紘紖紓紟紒紏紌罜罡罞罠罝罛羖羒翃翂翀耖耾耹胺胲胹胵脁胻脀舁舯舥茳茭荄茙荑茥荖茿荁茦茜茢"], +["d3a1","荂荎茛茪茈茼荍茖茤茠茷茯茩荇荅荌荓茞茬荋茧荈虓虒蚢蚨蚖蚍蚑蚞蚇蚗蚆蚋蚚蚅蚥蚙蚡蚧蚕蚘蚎蚝蚐蚔衃衄衭衵衶衲袀衱衿衯袃衾衴衼訒豇豗豻貤貣赶赸趵趷趶軑軓迾迵适迿迻逄迼迶郖郠郙郚郣郟郥郘郛郗郜郤酐"], +["d440","酎酏釕釢釚陜陟隼飣髟鬯乿偰偪偡偞偠偓偋偝偲偈偍偁偛偊偢倕偅偟偩偫偣偤偆偀偮偳偗偑凐剫剭剬剮勖勓匭厜啵啶唼啍啐唴唪啑啢唶唵唰啒啅"], +["d4a1","唌唲啥啎唹啈唭唻啀啋圊圇埻堔埢埶埜埴堀埭埽堈埸堋埳埏堇埮埣埲埥埬埡堎埼堐埧堁堌埱埩埰堍堄奜婠婘婕婧婞娸娵婭婐婟婥婬婓婤婗婃婝婒婄婛婈媎娾婍娹婌婰婩婇婑婖婂婜孲孮寁寀屙崞崋崝崚崠崌崨崍崦崥崏"], +["d540","崰崒崣崟崮帾帴庱庴庹庲庳弶弸徛徖徟悊悐悆悾悰悺惓惔惏惤惙惝惈悱惛悷惊悿惃惍惀挲捥掊掂捽掽掞掭掝掗掫掎捯掇掐据掯捵掜捭掮捼掤挻掟"], +["d5a1","捸掅掁掑掍捰敓旍晥晡晛晙晜晢朘桹梇梐梜桭桮梮梫楖桯梣梬梩桵桴梲梏桷梒桼桫桲梪梀桱桾梛梖梋梠梉梤桸桻梑梌梊桽欶欳欷欸殑殏殍殎殌氪淀涫涴涳湴涬淩淢涷淶淔渀淈淠淟淖涾淥淜淝淛淴淊涽淭淰涺淕淂淏淉"], +["d640","淐淲淓淽淗淍淣涻烺焍烷焗烴焌烰焄烳焐烼烿焆焓焀烸烶焋焂焎牾牻牼牿猝猗猇猑猘猊猈狿猏猞玈珶珸珵琄琁珽琇琀珺珼珿琌琋珴琈畤畣痎痒痏"], +["d6a1","痋痌痑痐皏皉盓眹眯眭眱眲眴眳眽眥眻眵硈硒硉硍硊硌砦硅硐祤祧祩祪祣祫祡离秺秸秶秷窏窔窐笵筇笴笥笰笢笤笳笘笪笝笱笫笭笯笲笸笚笣粔粘粖粣紵紽紸紶紺絅紬紩絁絇紾紿絊紻紨罣羕羜羝羛翊翋翍翐翑翇翏翉耟"], +["d740","耞耛聇聃聈脘脥脙脛脭脟脬脞脡脕脧脝脢舑舸舳舺舴舲艴莐莣莨莍荺荳莤荴莏莁莕莙荵莔莩荽莃莌莝莛莪莋荾莥莯莈莗莰荿莦莇莮荶莚虙虖蚿蚷"], +["d7a1","蛂蛁蛅蚺蚰蛈蚹蚳蚸蛌蚴蚻蚼蛃蚽蚾衒袉袕袨袢袪袚袑袡袟袘袧袙袛袗袤袬袌袓袎覂觖觙觕訰訧訬訞谹谻豜豝豽貥赽赻赹趼跂趹趿跁軘軞軝軜軗軠軡逤逋逑逜逌逡郯郪郰郴郲郳郔郫郬郩酖酘酚酓酕釬釴釱釳釸釤釹釪"], +["d840","釫釷釨釮镺閆閈陼陭陫陱陯隿靪頄飥馗傛傕傔傞傋傣傃傌傎傝偨傜傒傂傇兟凔匒匑厤厧喑喨喥喭啷噅喢喓喈喏喵喁喣喒喤啽喌喦啿喕喡喎圌堩堷"], +["d8a1","堙堞堧堣堨埵塈堥堜堛堳堿堶堮堹堸堭堬堻奡媯媔媟婺媢媞婸媦婼媥媬媕媮娷媄媊媗媃媋媩婻婽媌媜媏媓媝寪寍寋寔寑寊寎尌尰崷嵃嵫嵁嵋崿崵嵑嵎嵕崳崺嵒崽崱嵙嵂崹嵉崸崼崲崶嵀嵅幄幁彘徦徥徫惉悹惌惢惎惄愔"], 
+["d940","惲愊愖愅惵愓惸惼惾惁愃愘愝愐惿愄愋扊掔掱掰揎揥揨揯揃撝揳揊揠揶揕揲揵摡揟掾揝揜揄揘揓揂揇揌揋揈揰揗揙攲敧敪敤敜敨敥斌斝斞斮旐旒"], +["d9a1","晼晬晻暀晱晹晪晲朁椌棓椄棜椪棬棪棱椏棖棷棫棤棶椓椐棳棡椇棌椈楰梴椑棯棆椔棸棐棽棼棨椋椊椗棎棈棝棞棦棴棑椆棔棩椕椥棇欹欻欿欼殔殗殙殕殽毰毲毳氰淼湆湇渟湉溈渼渽湅湢渫渿湁湝湳渜渳湋湀湑渻渃渮湞"], +["da40","湨湜湡渱渨湠湱湫渹渢渰湓湥渧湸湤湷湕湹湒湦渵渶湚焠焞焯烻焮焱焣焥焢焲焟焨焺焛牋牚犈犉犆犅犋猒猋猰猢猱猳猧猲猭猦猣猵猌琮琬琰琫琖"], +["daa1","琚琡琭琱琤琣琝琩琠琲瓻甯畯畬痧痚痡痦痝痟痤痗皕皒盚睆睇睄睍睅睊睎睋睌矞矬硠硤硥硜硭硱硪确硰硩硨硞硢祴祳祲祰稂稊稃稌稄窙竦竤筊笻筄筈筌筎筀筘筅粢粞粨粡絘絯絣絓絖絧絪絏絭絜絫絒絔絩絑絟絎缾缿罥"], +["db40","罦羢羠羡翗聑聏聐胾胔腃腊腒腏腇脽腍脺臦臮臷臸臹舄舼舽舿艵茻菏菹萣菀菨萒菧菤菼菶萐菆菈菫菣莿萁菝菥菘菿菡菋菎菖菵菉萉萏菞萑萆菂菳"], +["dba1","菕菺菇菑菪萓菃菬菮菄菻菗菢萛菛菾蛘蛢蛦蛓蛣蛚蛪蛝蛫蛜蛬蛩蛗蛨蛑衈衖衕袺裗袹袸裀袾袶袼袷袽袲褁裉覕覘覗觝觚觛詎詍訹詙詀詗詘詄詅詒詈詑詊詌詏豟貁貀貺貾貰貹貵趄趀趉跘跓跍跇跖跜跏跕跙跈跗跅軯軷軺"], +["dc40","軹軦軮軥軵軧軨軶軫軱軬軴軩逭逴逯鄆鄬鄄郿郼鄈郹郻鄁鄀鄇鄅鄃酡酤酟酢酠鈁鈊鈥鈃鈚鈦鈏鈌鈀鈒釿釽鈆鈄鈧鈂鈜鈤鈙鈗鈅鈖镻閍閌閐隇陾隈"], +["dca1","隉隃隀雂雈雃雱雰靬靰靮頇颩飫鳦黹亃亄亶傽傿僆傮僄僊傴僈僂傰僁傺傱僋僉傶傸凗剺剸剻剼嗃嗛嗌嗐嗋嗊嗝嗀嗔嗄嗩喿嗒喍嗏嗕嗢嗖嗈嗲嗍嗙嗂圔塓塨塤塏塍塉塯塕塎塝塙塥塛堽塣塱壼嫇嫄嫋媺媸媱媵媰媿嫈媻嫆"], +["dd40","媷嫀嫊媴媶嫍媹媐寖寘寙尟尳嵱嵣嵊嵥嵲嵬嵞嵨嵧嵢巰幏幎幊幍幋廅廌廆廋廇彀徯徭惷慉慊愫慅愶愲愮慆愯慏愩慀戠酨戣戥戤揅揱揫搐搒搉搠搤"], +["dda1","搳摃搟搕搘搹搷搢搣搌搦搰搨摁搵搯搊搚摀搥搧搋揧搛搮搡搎敯斒旓暆暌暕暐暋暊暙暔晸朠楦楟椸楎楢楱椿楅楪椹楂楗楙楺楈楉椵楬椳椽楥棰楸椴楩楀楯楄楶楘楁楴楌椻楋椷楜楏楑椲楒椯楻椼歆歅歃歂歈歁殛嗀毻毼"], +["de40","毹毷毸溛滖滈溏滀溟溓溔溠溱溹滆滒溽滁溞滉溷溰滍溦滏溲溾滃滜滘溙溒溎溍溤溡溿溳滐滊溗溮溣煇煔煒煣煠煁煝煢煲煸煪煡煂煘煃煋煰煟煐煓"], +["dea1","煄煍煚牏犍犌犑犐犎猼獂猻猺獀獊獉瑄瑊瑋瑒瑑瑗瑀瑏瑐瑎瑂瑆瑍瑔瓡瓿瓾瓽甝畹畷榃痯瘏瘃痷痾痼痹痸瘐痻痶痭痵痽皙皵盝睕睟睠睒睖睚睩睧睔睙睭矠碇碚碔碏碄碕碅碆碡碃硹碙碀碖硻祼禂祽祹稑稘稙稒稗稕稢稓"], +["df40","稛稐窣窢窞竫筦筤筭筴筩筲筥筳筱筰筡筸筶筣粲粴粯綈綆綀綍絿綅絺綎絻綃絼綌綔綄絽綒罭罫罧罨罬羦羥羧翛翜耡腤腠腷腜腩腛腢腲朡腞腶腧腯"], +["dfa1","腄腡舝艉艄艀艂艅蓱萿葖葶葹蒏蒍葥葑葀蒆葧萰葍葽葚葙葴葳葝蔇葞萷萺萴葺葃葸萲葅萩菙葋萯葂萭葟葰萹葎葌葒葯蓅蒎萻葇萶萳葨葾葄萫葠葔葮葐蜋蜄蛷蜌蛺蛖蛵蝍蛸蜎蜉蜁蛶蜍蜅裖裋裍裎裞裛裚裌裐覅覛觟觥觤"], +["e040","觡觠觢觜触詶誆詿詡訿詷誂誄詵誃誁詴詺谼豋豊豥豤豦貆貄貅賌赨赩趑趌趎趏趍趓趔趐趒跰跠跬跱跮跐跩跣跢跧跲跫跴輆軿輁輀輅輇輈輂輋遒逿"], +["e0a1","遄遉逽鄐鄍鄏鄑鄖鄔鄋鄎酮酯鉈鉒鈰鈺鉦鈳鉥鉞銃鈮鉊鉆鉭鉬鉏鉠鉧鉯鈶鉡鉰鈱鉔鉣鉐鉲鉎鉓鉌鉖鈲閟閜閞閛隒隓隑隗雎雺雽雸雵靳靷靸靲頏頍頎颬飶飹馯馲馰馵骭骫魛鳪鳭鳧麀黽僦僔僗僨僳僛僪僝僤僓僬僰僯僣僠"], +["e140","凘劀劁勩勫匰厬嘧嘕嘌嘒嗼嘏嘜嘁嘓嘂嗺嘝嘄嗿嗹墉塼墐墘墆墁塿塴墋塺墇墑墎塶墂墈塻墔墏壾奫嫜嫮嫥嫕嫪嫚嫭嫫嫳嫢嫠嫛嫬嫞嫝嫙嫨嫟孷寠"], +["e1a1","寣屣嶂嶀嵽嶆嵺嶁嵷嶊嶉嶈嵾嵼嶍嵹嵿幘幙幓廘廑廗廎廜廕廙廒廔彄彃彯徶愬愨慁慞慱慳慒慓慲慬憀慴慔慺慛慥愻慪慡慖戩戧戫搫摍摛摝摴摶摲摳摽摵摦撦摎撂摞摜摋摓摠摐摿搿摬摫摙摥摷敳斠暡暠暟朅朄朢榱榶槉"], +["e240","榠槎榖榰榬榼榑榙榎榧榍榩榾榯榿槄榽榤槔榹槊榚槏榳榓榪榡榞槙榗榐槂榵榥槆歊歍歋殞殟殠毃毄毾滎滵滱漃漥滸漷滻漮漉潎漙漚漧漘漻漒滭漊"], +["e2a1","漶潳滹滮漭潀漰漼漵滫漇漎潃漅滽滶漹漜滼漺漟漍漞漈漡熇熐熉熀熅熂熏煻熆熁熗牄牓犗犕犓獃獍獑獌瑢瑳瑱瑵瑲瑧瑮甀甂甃畽疐瘖瘈瘌瘕瘑瘊瘔皸瞁睼瞅瞂睮瞀睯睾瞃碲碪碴碭碨硾碫碞碥碠碬碢碤禘禊禋禖禕禔禓"], +["e340","禗禈禒禐稫穊稰稯稨稦窨窫窬竮箈箜箊箑箐箖箍箌箛箎箅箘劄箙箤箂粻粿粼粺綧綷緂綣綪緁緀緅綝緎緄緆緋緌綯綹綖綼綟綦綮綩綡緉罳翢翣翥翞"], +["e3a1","耤聝聜膉膆膃膇膍膌膋舕蒗蒤蒡蒟蒺蓎蓂蒬蒮蒫蒹蒴蓁蓍蒪蒚蒱蓐蒝蒧蒻蒢蒔蓇蓌蒛蒩蒯蒨蓖蒘蒶蓏蒠蓗蓔蓒蓛蒰蒑虡蜳蜣蜨蝫蝀蜮蜞蜡蜙蜛蝃蜬蝁蜾蝆蜠蜲蜪蜭蜼蜒蜺蜱蜵蝂蜦蜧蜸蜤蜚蜰蜑裷裧裱裲裺裾裮裼裶裻"], +["e440","裰裬裫覝覡覟覞觩觫觨誫誙誋誒誏誖谽豨豩賕賏賗趖踉踂跿踍跽踊踃踇踆踅跾踀踄輐輑輎輍鄣鄜鄠鄢鄟鄝鄚鄤鄡鄛酺酲酹酳銥銤鉶銛鉺銠銔銪銍"], +["e4a1","銦銚銫鉹銗鉿銣鋮銎銂銕銢鉽銈銡銊銆銌銙銧鉾銇銩銝銋鈭隞隡雿靘靽靺靾鞃鞀鞂靻鞄鞁靿韎韍頖颭颮餂餀餇馝馜駃馹馻馺駂馽駇骱髣髧鬾鬿魠魡魟鳱鳲鳵麧僿儃儰僸儆儇僶僾儋儌僽儊劋劌勱勯噈噂噌嘵噁噊噉噆噘"], +["e540","噚噀嘳嘽嘬嘾嘸嘪嘺圚墫墝墱墠墣墯墬墥墡壿嫿嫴嫽嫷嫶嬃嫸嬂嫹嬁嬇嬅嬏屧嶙嶗嶟嶒嶢嶓嶕嶠嶜嶡嶚嶞幩幝幠幜緳廛廞廡彉徲憋憃慹憱憰憢憉"], +["e5a1","憛憓憯憭憟憒憪憡憍慦憳戭摮摰撖撠撅撗撜撏撋撊撌撣撟摨撱撘敶敺敹敻斲斳暵暰暩暲暷暪暯樀樆樗槥槸樕槱槤樠槿槬槢樛樝槾樧槲槮樔槷槧橀樈槦槻樍槼槫樉樄樘樥樏槶樦樇槴樖歑殥殣殢殦氁氀毿氂潁漦潾澇濆澒"], +["e640","澍澉澌潢潏澅潚澖潶潬澂潕潲潒潐潗澔澓潝漀潡潫潽潧澐潓澋潩潿澕潣潷潪潻熲熯熛熰熠熚熩熵熝熥熞熤熡熪熜熧熳犘犚獘獒獞獟獠獝獛獡獚獙"], +["e6a1","獢璇璉璊璆璁瑽璅璈瑼瑹甈甇畾瘥瘞瘙瘝瘜瘣瘚瘨瘛皜皝皞皛瞍瞏瞉瞈磍碻磏磌磑磎磔磈磃磄磉禚禡禠禜禢禛歶稹窲窴窳箷篋箾箬篎箯箹篊箵糅糈糌糋緷緛緪緧緗緡縃緺緦緶緱緰緮緟罶羬羰羭翭翫翪翬翦翨聤聧膣膟"], +["e740","膞膕膢膙膗舖艏艓艒艐艎艑蔤蔻蔏蔀蔩蔎蔉蔍蔟蔊蔧蔜蓻蔫蓺蔈蔌蓴蔪蓲蔕蓷蓫蓳蓼蔒蓪蓩蔖蓾蔨蔝蔮蔂蓽蔞蓶蔱蔦蓧蓨蓰蓯蓹蔘蔠蔰蔋蔙蔯虢"], +["e7a1","蝖蝣蝤蝷蟡蝳蝘蝔蝛蝒蝡蝚蝑蝞蝭蝪蝐蝎蝟蝝蝯蝬蝺蝮蝜蝥蝏蝻蝵蝢蝧蝩衚褅褌褔褋褗褘褙褆褖褑褎褉覢覤覣觭觰觬諏諆誸諓諑諔諕誻諗誾諀諅諘諃誺誽諙谾豍貏賥賟賙賨賚賝賧趠趜趡趛踠踣踥踤踮踕踛踖踑踙踦踧"], +["e840","踔踒踘踓踜踗踚輬輤輘輚輠輣輖輗遳遰遯遧遫鄯鄫鄩鄪鄲鄦鄮醅醆醊醁醂醄醀鋐鋃鋄鋀鋙銶鋏鋱鋟鋘鋩鋗鋝鋌鋯鋂鋨鋊鋈鋎鋦鋍鋕鋉鋠鋞鋧鋑鋓"], +["e8a1","銵鋡鋆銴镼閬閫閮閰隤隢雓霅霈霂靚鞊鞎鞈韐韏頞頝頦頩頨頠頛頧颲餈飺餑餔餖餗餕駜駍駏駓駔駎駉駖駘駋駗駌骳髬髫髳髲髱魆魃魧魴魱魦魶魵魰魨魤魬鳼鳺鳽鳿鳷鴇鴀鳹鳻鴈鴅鴄麃黓鼏鼐儜儓儗儚儑凞匴叡噰噠噮"], +["e940","噳噦噣噭噲噞噷圜圛壈墽壉墿墺壂墼壆嬗嬙嬛嬡嬔嬓嬐嬖嬨嬚嬠嬞寯嶬嶱嶩嶧嶵嶰嶮嶪嶨嶲嶭嶯嶴幧幨幦幯廩廧廦廨廥彋徼憝憨憖懅憴懆懁懌憺"], +["e9a1","憿憸憌擗擖擐擏擉撽撉擃擛擳擙攳敿敼斢曈暾曀曊曋曏暽暻暺曌朣樴橦橉橧樲橨樾橝橭橶橛橑樨橚樻樿橁橪橤橐橏橔橯橩橠樼橞橖橕橍橎橆歕歔歖殧殪殫毈毇氄氃氆澭濋澣濇澼濎濈潞濄澽澞濊澨瀄澥澮澺澬澪濏澿澸"], +["ea40","澢濉澫濍澯澲澰燅燂熿熸燖燀燁燋燔燊燇燏熽燘熼燆燚燛犝犞獩獦獧獬獥獫獪瑿璚璠璔璒璕璡甋疀瘯瘭瘱瘽瘳瘼瘵瘲瘰皻盦瞚瞝瞡瞜瞛瞢瞣瞕瞙"], +["eaa1","瞗磝磩磥磪磞磣磛磡磢磭磟磠禤穄穈穇窶窸窵窱窷篞篣篧篝篕篥篚篨篹篔篪篢篜篫篘篟糒糔糗糐糑縒縡縗縌縟縠縓縎縜縕縚縢縋縏縖縍縔縥縤罃罻罼罺羱翯耪耩聬膱膦膮膹膵膫膰膬膴膲膷膧臲艕艖艗蕖蕅蕫蕍蕓蕡蕘"], +["eb40","蕀蕆蕤蕁蕢蕄蕑蕇蕣蔾蕛蕱蕎蕮蕵蕕蕧蕠薌蕦蕝蕔蕥蕬虣虥虤螛螏螗螓螒螈螁螖螘蝹螇螣螅螐螑螝螄螔螜螚螉褞褦褰褭褮褧褱褢褩褣褯褬褟觱諠"], +["eba1","諢諲諴諵諝謔諤諟諰諈諞諡諨諿諯諻貑貒貐賵賮賱賰賳赬赮趥趧踳踾踸蹀蹅踶踼踽蹁踰踿躽輶輮輵輲輹輷輴遶遹遻邆郺鄳鄵鄶醓醐醑醍醏錧錞錈錟錆錏鍺錸錼錛錣錒錁鍆錭錎錍鋋錝鋺錥錓鋹鋷錴錂錤鋿錩錹錵錪錔錌"], 
+["ec40","錋鋾錉錀鋻錖閼闍閾閹閺閶閿閵閽隩雔霋霒霐鞙鞗鞔韰韸頵頯頲餤餟餧餩馞駮駬駥駤駰駣駪駩駧骹骿骴骻髶髺髹髷鬳鮀鮅鮇魼魾魻鮂鮓鮒鮐魺鮕"], +["eca1","魽鮈鴥鴗鴠鴞鴔鴩鴝鴘鴢鴐鴙鴟麈麆麇麮麭黕黖黺鼒鼽儦儥儢儤儠儩勴嚓嚌嚍嚆嚄嚃噾嚂噿嚁壖壔壏壒嬭嬥嬲嬣嬬嬧嬦嬯嬮孻寱寲嶷幬幪徾徻懃憵憼懧懠懥懤懨懞擯擩擣擫擤擨斁斀斶旚曒檍檖檁檥檉檟檛檡檞檇檓檎"], +["ed40","檕檃檨檤檑橿檦檚檅檌檒歛殭氉濌澩濴濔濣濜濭濧濦濞濲濝濢濨燡燱燨燲燤燰燢獳獮獯璗璲璫璐璪璭璱璥璯甐甑甒甏疄癃癈癉癇皤盩瞵瞫瞲瞷瞶"], +["eda1","瞴瞱瞨矰磳磽礂磻磼磲礅磹磾礄禫禨穜穛穖穘穔穚窾竀竁簅簏篲簀篿篻簎篴簋篳簂簉簃簁篸篽簆篰篱簐簊糨縭縼繂縳顈縸縪繉繀繇縩繌縰縻縶繄縺罅罿罾罽翴翲耬膻臄臌臊臅臇膼臩艛艚艜薃薀薏薧薕薠薋薣蕻薤薚薞"], +["ee40","蕷蕼薉薡蕺蕸蕗薎薖薆薍薙薝薁薢薂薈薅蕹蕶薘薐薟虨螾螪螭蟅螰螬螹螵螼螮蟉蟃蟂蟌螷螯蟄蟊螴螶螿螸螽蟞螲褵褳褼褾襁襒褷襂覭覯覮觲觳謞"], +["eea1","謘謖謑謅謋謢謏謒謕謇謍謈謆謜謓謚豏豰豲豱豯貕貔賹赯蹎蹍蹓蹐蹌蹇轃轀邅遾鄸醚醢醛醙醟醡醝醠鎡鎃鎯鍤鍖鍇鍼鍘鍜鍶鍉鍐鍑鍠鍭鎏鍌鍪鍹鍗鍕鍒鍏鍱鍷鍻鍡鍞鍣鍧鎀鍎鍙闇闀闉闃闅閷隮隰隬霠霟霘霝霙鞚鞡鞜"], +["ef40","鞞鞝韕韔韱顁顄顊顉顅顃餥餫餬餪餳餲餯餭餱餰馘馣馡騂駺駴駷駹駸駶駻駽駾駼騃骾髾髽鬁髼魈鮚鮨鮞鮛鮦鮡鮥鮤鮆鮢鮠鮯鴳鵁鵧鴶鴮鴯鴱鴸鴰"], +["efa1","鵅鵂鵃鴾鴷鵀鴽翵鴭麊麉麍麰黈黚黻黿鼤鼣鼢齔龠儱儭儮嚘嚜嚗嚚嚝嚙奰嬼屩屪巀幭幮懘懟懭懮懱懪懰懫懖懩擿攄擽擸攁攃擼斔旛曚曛曘櫅檹檽櫡櫆檺檶檷櫇檴檭歞毉氋瀇瀌瀍瀁瀅瀔瀎濿瀀濻瀦濼濷瀊爁燿燹爃燽獶"], +["f040","璸瓀璵瓁璾璶璻瓂甔甓癜癤癙癐癓癗癚皦皽盬矂瞺磿礌礓礔礉礐礒礑禭禬穟簜簩簙簠簟簭簝簦簨簢簥簰繜繐繖繣繘繢繟繑繠繗繓羵羳翷翸聵臑臒"], +["f0a1","臐艟艞薴藆藀藃藂薳薵薽藇藄薿藋藎藈藅薱薶藒蘤薸薷薾虩蟧蟦蟢蟛蟫蟪蟥蟟蟳蟤蟔蟜蟓蟭蟘蟣螤蟗蟙蠁蟴蟨蟝襓襋襏襌襆襐襑襉謪謧謣謳謰謵譇謯謼謾謱謥謷謦謶謮謤謻謽謺豂豵貙貘貗賾贄贂贀蹜蹢蹠蹗蹖蹞蹥蹧"], +["f140","蹛蹚蹡蹝蹩蹔轆轇轈轋鄨鄺鄻鄾醨醥醧醯醪鎵鎌鎒鎷鎛鎝鎉鎧鎎鎪鎞鎦鎕鎈鎙鎟鎍鎱鎑鎲鎤鎨鎴鎣鎥闒闓闑隳雗雚巂雟雘雝霣霢霥鞬鞮鞨鞫鞤鞪"], +["f1a1","鞢鞥韗韙韖韘韺顐顑顒颸饁餼餺騏騋騉騍騄騑騊騅騇騆髀髜鬈鬄鬅鬩鬵魊魌魋鯇鯆鯃鮿鯁鮵鮸鯓鮶鯄鮹鮽鵜鵓鵏鵊鵛鵋鵙鵖鵌鵗鵒鵔鵟鵘鵚麎麌黟鼁鼀鼖鼥鼫鼪鼩鼨齌齕儴儵劖勷厴嚫嚭嚦嚧嚪嚬壚壝壛夒嬽嬾嬿巃幰"], +["f240","徿懻攇攐攍攉攌攎斄旞旝曞櫧櫠櫌櫑櫙櫋櫟櫜櫐櫫櫏櫍櫞歠殰氌瀙瀧瀠瀖瀫瀡瀢瀣瀩瀗瀤瀜瀪爌爊爇爂爅犥犦犤犣犡瓋瓅璷瓃甖癠矉矊矄矱礝礛"], +["f2a1","礡礜礗礞禰穧穨簳簼簹簬簻糬糪繶繵繸繰繷繯繺繲繴繨罋罊羃羆羷翽翾聸臗臕艤艡艣藫藱藭藙藡藨藚藗藬藲藸藘藟藣藜藑藰藦藯藞藢蠀蟺蠃蟶蟷蠉蠌蠋蠆蟼蠈蟿蠊蠂襢襚襛襗襡襜襘襝襙覈覷覶觶譐譈譊譀譓譖譔譋譕"], +["f340","譑譂譒譗豃豷豶貚贆贇贉趬趪趭趫蹭蹸蹳蹪蹯蹻軂轒轑轏轐轓辴酀鄿醰醭鏞鏇鏏鏂鏚鏐鏹鏬鏌鏙鎩鏦鏊鏔鏮鏣鏕鏄鏎鏀鏒鏧镽闚闛雡霩霫霬霨霦"], +["f3a1","鞳鞷鞶韝韞韟顜顙顝顗颿颽颻颾饈饇饃馦馧騚騕騥騝騤騛騢騠騧騣騞騜騔髂鬋鬊鬎鬌鬷鯪鯫鯠鯞鯤鯦鯢鯰鯔鯗鯬鯜鯙鯥鯕鯡鯚鵷鶁鶊鶄鶈鵱鶀鵸鶆鶋鶌鵽鵫鵴鵵鵰鵩鶅鵳鵻鶂鵯鵹鵿鶇鵨麔麑黀黼鼭齀齁齍齖齗齘匷嚲"], +["f440","嚵嚳壣孅巆巇廮廯忀忁懹攗攖攕攓旟曨曣曤櫳櫰櫪櫨櫹櫱櫮櫯瀼瀵瀯瀷瀴瀱灂瀸瀿瀺瀹灀瀻瀳灁爓爔犨獽獼璺皫皪皾盭矌矎矏矍矲礥礣礧礨礤礩"], +["f4a1","禲穮穬穭竷籉籈籊籇籅糮繻繾纁纀羺翿聹臛臙舋艨艩蘢藿蘁藾蘛蘀藶蘄蘉蘅蘌藽蠙蠐蠑蠗蠓蠖襣襦覹觷譠譪譝譨譣譥譧譭趮躆躈躄轙轖轗轕轘轚邍酃酁醷醵醲醳鐋鐓鏻鐠鐏鐔鏾鐕鐐鐨鐙鐍鏵鐀鏷鐇鐎鐖鐒鏺鐉鏸鐊鏿"], +["f540","鏼鐌鏶鐑鐆闞闠闟霮霯鞹鞻韽韾顠顢顣顟飁飂饐饎饙饌饋饓騲騴騱騬騪騶騩騮騸騭髇髊髆鬐鬒鬑鰋鰈鯷鰅鰒鯸鱀鰇鰎鰆鰗鰔鰉鶟鶙鶤鶝鶒鶘鶐鶛"], +["f5a1","鶠鶔鶜鶪鶗鶡鶚鶢鶨鶞鶣鶿鶩鶖鶦鶧麙麛麚黥黤黧黦鼰鼮齛齠齞齝齙龑儺儹劘劗囃嚽嚾孈孇巋巏廱懽攛欂櫼欃櫸欀灃灄灊灈灉灅灆爝爚爙獾甗癪矐礭礱礯籔籓糲纊纇纈纋纆纍罍羻耰臝蘘蘪蘦蘟蘣蘜蘙蘧蘮蘡蘠蘩蘞蘥"], +["f640","蠩蠝蠛蠠蠤蠜蠫衊襭襩襮襫觺譹譸譅譺譻贐贔趯躎躌轞轛轝酆酄酅醹鐿鐻鐶鐩鐽鐼鐰鐹鐪鐷鐬鑀鐱闥闤闣霵霺鞿韡顤飉飆飀饘饖騹騽驆驄驂驁騺"], +["f6a1","騿髍鬕鬗鬘鬖鬺魒鰫鰝鰜鰬鰣鰨鰩鰤鰡鶷鶶鶼鷁鷇鷊鷏鶾鷅鷃鶻鶵鷎鶹鶺鶬鷈鶱鶭鷌鶳鷍鶲鹺麜黫黮黭鼛鼘鼚鼱齎齥齤龒亹囆囅囋奱孋孌巕巑廲攡攠攦攢欋欈欉氍灕灖灗灒爞爟犩獿瓘瓕瓙瓗癭皭礵禴穰穱籗籜籙籛籚"], +["f740","糴糱纑罏羇臞艫蘴蘵蘳蘬蘲蘶蠬蠨蠦蠪蠥襱覿覾觻譾讄讂讆讅譿贕躕躔躚躒躐躖躗轠轢酇鑌鑐鑊鑋鑏鑇鑅鑈鑉鑆霿韣顪顩飋饔饛驎驓驔驌驏驈驊"], +["f7a1","驉驒驐髐鬙鬫鬻魖魕鱆鱈鰿鱄鰹鰳鱁鰼鰷鰴鰲鰽鰶鷛鷒鷞鷚鷋鷐鷜鷑鷟鷩鷙鷘鷖鷵鷕鷝麶黰鼵鼳鼲齂齫龕龢儽劙壨壧奲孍巘蠯彏戁戃戄攩攥斖曫欑欒欏毊灛灚爢玂玁玃癰矔籧籦纕艬蘺虀蘹蘼蘱蘻蘾蠰蠲蠮蠳襶襴襳觾"], +["f840","讌讎讋讈豅贙躘轤轣醼鑢鑕鑝鑗鑞韄韅頀驖驙鬞鬟鬠鱒鱘鱐鱊鱍鱋鱕鱙鱌鱎鷻鷷鷯鷣鷫鷸鷤鷶鷡鷮鷦鷲鷰鷢鷬鷴鷳鷨鷭黂黐黲黳鼆鼜鼸鼷鼶齃齏"], +["f8a1","齱齰齮齯囓囍孎屭攭曭曮欓灟灡灝灠爣瓛瓥矕礸禷禶籪纗羉艭虃蠸蠷蠵衋讔讕躞躟躠躝醾醽釂鑫鑨鑩雥靆靃靇韇韥驞髕魙鱣鱧鱦鱢鱞鱠鸂鷾鸇鸃鸆鸅鸀鸁鸉鷿鷽鸄麠鼞齆齴齵齶囔攮斸欘欙欗欚灢爦犪矘矙礹籩籫糶纚"], +["f940","纘纛纙臠臡虆虇虈襹襺襼襻觿讘讙躥躤躣鑮鑭鑯鑱鑳靉顲饟鱨鱮鱭鸋鸍鸐鸏鸒鸑麡黵鼉齇齸齻齺齹圞灦籯蠼趲躦釃鑴鑸鑶鑵驠鱴鱳鱱鱵鸔鸓黶鼊"], +["f9a1","龤灨灥糷虪蠾蠽蠿讞貜躩軉靋顳顴飌饡馫驤驦驧鬤鸕鸗齈戇欞爧虌躨钂钀钁驩驨鬮鸙爩虋讟钃鱹麷癵驫鱺鸝灩灪麤齾齉龘碁銹裏墻恒粧嫺╔╦╗╠╬╣╚╩╝╒╤╕╞╪╡╘╧╛╓╥╖╟╫╢╙╨╜║═╭╮╰╯▓"] +] diff --git a/node_modules/iconv-lite/encodings/tables/eucjp.json b/node_modules/iconv-lite/encodings/tables/eucjp.json new file mode 100644 index 000000000..4fa61ca11 --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/eucjp.json @@ -0,0 +1,182 @@ +[ +["0","\u0000",127], +["8ea1","。",62], +["a1a1"," 、。,.・:;?!゛゜´`¨^ ̄_ヽヾゝゞ〃仝々〆〇ー―‐/\~∥|…‥‘’“”()〔〕[]{}〈",9,"+-±×÷=≠<>≦≧∞∴♂♀°′″℃¥$¢£%#&*@§☆★○●◎◇"], +["a2a1","◆□■△▲▽▼※〒→←↑↓〓"], +["a2ba","∈∋⊆⊇⊂⊃∪∩"], +["a2ca","∧∨¬⇒⇔∀∃"], +["a2dc","∠⊥⌒∂∇≡≒≪≫√∽∝∵∫∬"], +["a2f2","ʼn♯♭♪†‡¶"], +["a2fe","◯"], +["a3b0","0",9], +["a3c1","A",25], +["a3e1","a",25], +["a4a1","ぁ",82], +["a5a1","ァ",85], +["a6a1","Α",16,"Σ",6], +["a6c1","α",16,"σ",6], +["a7a1","А",5,"ЁЖ",25], +["a7d1","а",5,"ёж",25], +["a8a1","─│┌┐┘└├┬┤┴┼━┃┏┓┛┗┣┳┫┻╋┠┯┨┷┿┝┰┥┸╂"], +["ada1","①",19,"Ⅰ",9], +["adc0","㍉㌔㌢㍍㌘㌧㌃㌶㍑㍗㌍㌦㌣㌫㍊㌻㎜㎝㎞㎎㎏㏄㎡"], +["addf","㍻〝〟№㏍℡㊤",4,"㈱㈲㈹㍾㍽㍼≒≡∫∮∑√⊥∠∟⊿∵∩∪"], 
+["b0a1","亜唖娃阿哀愛挨姶逢葵茜穐悪握渥旭葦芦鯵梓圧斡扱宛姐虻飴絢綾鮎或粟袷安庵按暗案闇鞍杏以伊位依偉囲夷委威尉惟意慰易椅為畏異移維緯胃萎衣謂違遺医井亥域育郁磯一壱溢逸稲茨芋鰯允印咽員因姻引飲淫胤蔭"], +["b1a1","院陰隠韻吋右宇烏羽迂雨卯鵜窺丑碓臼渦嘘唄欝蔚鰻姥厩浦瓜閏噂云運雲荏餌叡営嬰影映曳栄永泳洩瑛盈穎頴英衛詠鋭液疫益駅悦謁越閲榎厭円園堰奄宴延怨掩援沿演炎焔煙燕猿縁艶苑薗遠鉛鴛塩於汚甥凹央奥往応"], +["b2a1","押旺横欧殴王翁襖鴬鴎黄岡沖荻億屋憶臆桶牡乙俺卸恩温穏音下化仮何伽価佳加可嘉夏嫁家寡科暇果架歌河火珂禍禾稼箇花苛茄荷華菓蝦課嘩貨迦過霞蚊俄峨我牙画臥芽蛾賀雅餓駕介会解回塊壊廻快怪悔恢懐戒拐改"], +["b3a1","魁晦械海灰界皆絵芥蟹開階貝凱劾外咳害崖慨概涯碍蓋街該鎧骸浬馨蛙垣柿蛎鈎劃嚇各廓拡撹格核殻獲確穫覚角赫較郭閣隔革学岳楽額顎掛笠樫橿梶鰍潟割喝恰括活渇滑葛褐轄且鰹叶椛樺鞄株兜竃蒲釜鎌噛鴨栢茅萱"], +["b4a1","粥刈苅瓦乾侃冠寒刊勘勧巻喚堪姦完官寛干幹患感慣憾換敢柑桓棺款歓汗漢澗潅環甘監看竿管簡緩缶翰肝艦莞観諌貫還鑑間閑関陥韓館舘丸含岸巌玩癌眼岩翫贋雁頑顔願企伎危喜器基奇嬉寄岐希幾忌揮机旗既期棋棄"], +["b5a1","機帰毅気汽畿祈季稀紀徽規記貴起軌輝飢騎鬼亀偽儀妓宜戯技擬欺犠疑祇義蟻誼議掬菊鞠吉吃喫桔橘詰砧杵黍却客脚虐逆丘久仇休及吸宮弓急救朽求汲泣灸球究窮笈級糾給旧牛去居巨拒拠挙渠虚許距鋸漁禦魚亨享京"], +["b6a1","供侠僑兇競共凶協匡卿叫喬境峡強彊怯恐恭挟教橋況狂狭矯胸脅興蕎郷鏡響饗驚仰凝尭暁業局曲極玉桐粁僅勤均巾錦斤欣欽琴禁禽筋緊芹菌衿襟謹近金吟銀九倶句区狗玖矩苦躯駆駈駒具愚虞喰空偶寓遇隅串櫛釧屑屈"], +["b7a1","掘窟沓靴轡窪熊隈粂栗繰桑鍬勲君薫訓群軍郡卦袈祁係傾刑兄啓圭珪型契形径恵慶慧憩掲携敬景桂渓畦稽系経継繋罫茎荊蛍計詣警軽頚鶏芸迎鯨劇戟撃激隙桁傑欠決潔穴結血訣月件倹倦健兼券剣喧圏堅嫌建憲懸拳捲"], +["b8a1","検権牽犬献研硯絹県肩見謙賢軒遣鍵険顕験鹸元原厳幻弦減源玄現絃舷言諺限乎個古呼固姑孤己庫弧戸故枯湖狐糊袴股胡菰虎誇跨鈷雇顧鼓五互伍午呉吾娯後御悟梧檎瑚碁語誤護醐乞鯉交佼侯候倖光公功効勾厚口向"], +["b9a1","后喉坑垢好孔孝宏工巧巷幸広庚康弘恒慌抗拘控攻昂晃更杭校梗構江洪浩港溝甲皇硬稿糠紅紘絞綱耕考肯肱腔膏航荒行衡講貢購郊酵鉱砿鋼閤降項香高鴻剛劫号合壕拷濠豪轟麹克刻告国穀酷鵠黒獄漉腰甑忽惚骨狛込"], +["baa1","此頃今困坤墾婚恨懇昏昆根梱混痕紺艮魂些佐叉唆嵯左差査沙瑳砂詐鎖裟坐座挫債催再最哉塞妻宰彩才採栽歳済災采犀砕砦祭斎細菜裁載際剤在材罪財冴坂阪堺榊肴咲崎埼碕鷺作削咋搾昨朔柵窄策索錯桜鮭笹匙冊刷"], +["bba1","察拶撮擦札殺薩雑皐鯖捌錆鮫皿晒三傘参山惨撒散桟燦珊産算纂蚕讃賛酸餐斬暫残仕仔伺使刺司史嗣四士始姉姿子屍市師志思指支孜斯施旨枝止死氏獅祉私糸紙紫肢脂至視詞詩試誌諮資賜雌飼歯事似侍児字寺慈持時"], +["bca1","次滋治爾璽痔磁示而耳自蒔辞汐鹿式識鴫竺軸宍雫七叱執失嫉室悉湿漆疾質実蔀篠偲柴芝屡蕊縞舎写射捨赦斜煮社紗者謝車遮蛇邪借勺尺杓灼爵酌釈錫若寂弱惹主取守手朱殊狩珠種腫趣酒首儒受呪寿授樹綬需囚収周"], +["bda1","宗就州修愁拾洲秀秋終繍習臭舟蒐衆襲讐蹴輯週酋酬集醜什住充十従戎柔汁渋獣縦重銃叔夙宿淑祝縮粛塾熟出術述俊峻春瞬竣舜駿准循旬楯殉淳準潤盾純巡遵醇順処初所暑曙渚庶緒署書薯藷諸助叙女序徐恕鋤除傷償"], +["bea1","勝匠升召哨商唱嘗奨妾娼宵将小少尚庄床廠彰承抄招掌捷昇昌昭晶松梢樟樵沼消渉湘焼焦照症省硝礁祥称章笑粧紹肖菖蒋蕉衝裳訟証詔詳象賞醤鉦鍾鐘障鞘上丈丞乗冗剰城場壌嬢常情擾条杖浄状畳穣蒸譲醸錠嘱埴飾"], +["bfa1","拭植殖燭織職色触食蝕辱尻伸信侵唇娠寝審心慎振新晋森榛浸深申疹真神秦紳臣芯薪親診身辛進針震人仁刃塵壬尋甚尽腎訊迅陣靭笥諏須酢図厨逗吹垂帥推水炊睡粋翠衰遂酔錐錘随瑞髄崇嵩数枢趨雛据杉椙菅頗雀裾"], +["c0a1","澄摺寸世瀬畝是凄制勢姓征性成政整星晴棲栖正清牲生盛精聖声製西誠誓請逝醒青静斉税脆隻席惜戚斥昔析石積籍績脊責赤跡蹟碩切拙接摂折設窃節説雪絶舌蝉仙先千占宣専尖川戦扇撰栓栴泉浅洗染潜煎煽旋穿箭線"], +["c1a1","繊羨腺舛船薦詮賎践選遷銭銑閃鮮前善漸然全禅繕膳糎噌塑岨措曾曽楚狙疏疎礎祖租粗素組蘇訴阻遡鼠僧創双叢倉喪壮奏爽宋層匝惣想捜掃挿掻操早曹巣槍槽漕燥争痩相窓糟総綜聡草荘葬蒼藻装走送遭鎗霜騒像増憎"], +["c2a1","臓蔵贈造促側則即息捉束測足速俗属賊族続卒袖其揃存孫尊損村遜他多太汰詑唾堕妥惰打柁舵楕陀駄騨体堆対耐岱帯待怠態戴替泰滞胎腿苔袋貸退逮隊黛鯛代台大第醍題鷹滝瀧卓啄宅托択拓沢濯琢託鐸濁諾茸凧蛸只"], +["c3a1","叩但達辰奪脱巽竪辿棚谷狸鱈樽誰丹単嘆坦担探旦歎淡湛炭短端箪綻耽胆蛋誕鍛団壇弾断暖檀段男談値知地弛恥智池痴稚置致蜘遅馳築畜竹筑蓄逐秩窒茶嫡着中仲宙忠抽昼柱注虫衷註酎鋳駐樗瀦猪苧著貯丁兆凋喋寵"], +["c4a1","帖帳庁弔張彫徴懲挑暢朝潮牒町眺聴脹腸蝶調諜超跳銚長頂鳥勅捗直朕沈珍賃鎮陳津墜椎槌追鎚痛通塚栂掴槻佃漬柘辻蔦綴鍔椿潰坪壷嬬紬爪吊釣鶴亭低停偵剃貞呈堤定帝底庭廷弟悌抵挺提梯汀碇禎程締艇訂諦蹄逓"], +["c5a1","邸鄭釘鼎泥摘擢敵滴的笛適鏑溺哲徹撤轍迭鉄典填天展店添纏甜貼転顛点伝殿澱田電兎吐堵塗妬屠徒斗杜渡登菟賭途都鍍砥砺努度土奴怒倒党冬凍刀唐塔塘套宕島嶋悼投搭東桃梼棟盗淘湯涛灯燈当痘祷等答筒糖統到"], +["c6a1","董蕩藤討謄豆踏逃透鐙陶頭騰闘働動同堂導憧撞洞瞳童胴萄道銅峠鴇匿得徳涜特督禿篤毒独読栃橡凸突椴届鳶苫寅酉瀞噸屯惇敦沌豚遁頓呑曇鈍奈那内乍凪薙謎灘捺鍋楢馴縄畷南楠軟難汝二尼弐迩匂賑肉虹廿日乳入"], +["c7a1","如尿韮任妊忍認濡禰祢寧葱猫熱年念捻撚燃粘乃廼之埜嚢悩濃納能脳膿農覗蚤巴把播覇杷波派琶破婆罵芭馬俳廃拝排敗杯盃牌背肺輩配倍培媒梅楳煤狽買売賠陪這蝿秤矧萩伯剥博拍柏泊白箔粕舶薄迫曝漠爆縛莫駁麦"], +["c8a1","函箱硲箸肇筈櫨幡肌畑畠八鉢溌発醗髪伐罰抜筏閥鳩噺塙蛤隼伴判半反叛帆搬斑板氾汎版犯班畔繁般藩販範釆煩頒飯挽晩番盤磐蕃蛮匪卑否妃庇彼悲扉批披斐比泌疲皮碑秘緋罷肥被誹費避非飛樋簸備尾微枇毘琵眉美"], +["c9a1","鼻柊稗匹疋髭彦膝菱肘弼必畢筆逼桧姫媛紐百謬俵彪標氷漂瓢票表評豹廟描病秒苗錨鋲蒜蛭鰭品彬斌浜瀕貧賓頻敏瓶不付埠夫婦富冨布府怖扶敷斧普浮父符腐膚芙譜負賦赴阜附侮撫武舞葡蕪部封楓風葺蕗伏副復幅服"], +["caa1","福腹複覆淵弗払沸仏物鮒分吻噴墳憤扮焚奮粉糞紛雰文聞丙併兵塀幣平弊柄並蔽閉陛米頁僻壁癖碧別瞥蔑箆偏変片篇編辺返遍便勉娩弁鞭保舗鋪圃捕歩甫補輔穂募墓慕戊暮母簿菩倣俸包呆報奉宝峰峯崩庖抱捧放方朋"], +["cba1","法泡烹砲縫胞芳萌蓬蜂褒訪豊邦鋒飽鳳鵬乏亡傍剖坊妨帽忘忙房暴望某棒冒紡肪膨謀貌貿鉾防吠頬北僕卜墨撲朴牧睦穆釦勃没殆堀幌奔本翻凡盆摩磨魔麻埋妹昧枚毎哩槙幕膜枕鮪柾鱒桝亦俣又抹末沫迄侭繭麿万慢満"], +["cca1","漫蔓味未魅巳箕岬密蜜湊蓑稔脈妙粍民眠務夢無牟矛霧鵡椋婿娘冥名命明盟迷銘鳴姪牝滅免棉綿緬面麺摸模茂妄孟毛猛盲網耗蒙儲木黙目杢勿餅尤戻籾貰問悶紋門匁也冶夜爺耶野弥矢厄役約薬訳躍靖柳薮鑓愉愈油癒"], +["cda1","諭輸唯佑優勇友宥幽悠憂揖有柚湧涌猶猷由祐裕誘遊邑郵雄融夕予余与誉輿預傭幼妖容庸揚揺擁曜楊様洋溶熔用窯羊耀葉蓉要謡踊遥陽養慾抑欲沃浴翌翼淀羅螺裸来莱頼雷洛絡落酪乱卵嵐欄濫藍蘭覧利吏履李梨理璃"], +["cea1","痢裏裡里離陸律率立葎掠略劉流溜琉留硫粒隆竜龍侶慮旅虜了亮僚両凌寮料梁涼猟療瞭稜糧良諒遼量陵領力緑倫厘林淋燐琳臨輪隣鱗麟瑠塁涙累類令伶例冷励嶺怜玲礼苓鈴隷零霊麗齢暦歴列劣烈裂廉恋憐漣煉簾練聯"], +["cfa1","蓮連錬呂魯櫓炉賂路露労婁廊弄朗楼榔浪漏牢狼篭老聾蝋郎六麓禄肋録論倭和話歪賄脇惑枠鷲亙亘鰐詫藁蕨椀湾碗腕"], +["d0a1","弌丐丕个丱丶丼丿乂乖乘亂亅豫亊舒弍于亞亟亠亢亰亳亶从仍仄仆仂仗仞仭仟价伉佚估佛佝佗佇佶侈侏侘佻佩佰侑佯來侖儘俔俟俎俘俛俑俚俐俤俥倚倨倔倪倥倅伜俶倡倩倬俾俯們倆偃假會偕偐偈做偖偬偸傀傚傅傴傲"], 
+["d1a1","僉僊傳僂僖僞僥僭僣僮價僵儉儁儂儖儕儔儚儡儺儷儼儻儿兀兒兌兔兢竸兩兪兮冀冂囘册冉冏冑冓冕冖冤冦冢冩冪冫决冱冲冰况冽凅凉凛几處凩凭凰凵凾刄刋刔刎刧刪刮刳刹剏剄剋剌剞剔剪剴剩剳剿剽劍劔劒剱劈劑辨"], +["d2a1","辧劬劭劼劵勁勍勗勞勣勦飭勠勳勵勸勹匆匈甸匍匐匏匕匚匣匯匱匳匸區卆卅丗卉卍凖卞卩卮夘卻卷厂厖厠厦厥厮厰厶參簒雙叟曼燮叮叨叭叺吁吽呀听吭吼吮吶吩吝呎咏呵咎呟呱呷呰咒呻咀呶咄咐咆哇咢咸咥咬哄哈咨"], +["d3a1","咫哂咤咾咼哘哥哦唏唔哽哮哭哺哢唹啀啣啌售啜啅啖啗唸唳啝喙喀咯喊喟啻啾喘喞單啼喃喩喇喨嗚嗅嗟嗄嗜嗤嗔嘔嗷嘖嗾嗽嘛嗹噎噐營嘴嘶嘲嘸噫噤嘯噬噪嚆嚀嚊嚠嚔嚏嚥嚮嚶嚴囂嚼囁囃囀囈囎囑囓囗囮囹圀囿圄圉"], +["d4a1","圈國圍圓團圖嗇圜圦圷圸坎圻址坏坩埀垈坡坿垉垓垠垳垤垪垰埃埆埔埒埓堊埖埣堋堙堝塲堡塢塋塰毀塒堽塹墅墹墟墫墺壞墻墸墮壅壓壑壗壙壘壥壜壤壟壯壺壹壻壼壽夂夊夐夛梦夥夬夭夲夸夾竒奕奐奎奚奘奢奠奧奬奩"], +["d5a1","奸妁妝佞侫妣妲姆姨姜妍姙姚娥娟娑娜娉娚婀婬婉娵娶婢婪媚媼媾嫋嫂媽嫣嫗嫦嫩嫖嫺嫻嬌嬋嬖嬲嫐嬪嬶嬾孃孅孀孑孕孚孛孥孩孰孳孵學斈孺宀它宦宸寃寇寉寔寐寤實寢寞寥寫寰寶寳尅將專對尓尠尢尨尸尹屁屆屎屓"], +["d6a1","屐屏孱屬屮乢屶屹岌岑岔妛岫岻岶岼岷峅岾峇峙峩峽峺峭嶌峪崋崕崗嵜崟崛崑崔崢崚崙崘嵌嵒嵎嵋嵬嵳嵶嶇嶄嶂嶢嶝嶬嶮嶽嶐嶷嶼巉巍巓巒巖巛巫已巵帋帚帙帑帛帶帷幄幃幀幎幗幔幟幢幤幇幵并幺麼广庠廁廂廈廐廏"], +["d7a1","廖廣廝廚廛廢廡廨廩廬廱廳廰廴廸廾弃弉彝彜弋弑弖弩弭弸彁彈彌彎弯彑彖彗彙彡彭彳彷徃徂彿徊很徑徇從徙徘徠徨徭徼忖忻忤忸忱忝悳忿怡恠怙怐怩怎怱怛怕怫怦怏怺恚恁恪恷恟恊恆恍恣恃恤恂恬恫恙悁悍惧悃悚"], +["d8a1","悄悛悖悗悒悧悋惡悸惠惓悴忰悽惆悵惘慍愕愆惶惷愀惴惺愃愡惻惱愍愎慇愾愨愧慊愿愼愬愴愽慂慄慳慷慘慙慚慫慴慯慥慱慟慝慓慵憙憖憇憬憔憚憊憑憫憮懌懊應懷懈懃懆憺懋罹懍懦懣懶懺懴懿懽懼懾戀戈戉戍戌戔戛"], +["d9a1","戞戡截戮戰戲戳扁扎扞扣扛扠扨扼抂抉找抒抓抖拔抃抔拗拑抻拏拿拆擔拈拜拌拊拂拇抛拉挌拮拱挧挂挈拯拵捐挾捍搜捏掖掎掀掫捶掣掏掉掟掵捫捩掾揩揀揆揣揉插揶揄搖搴搆搓搦搶攝搗搨搏摧摯摶摎攪撕撓撥撩撈撼"], +["daa1","據擒擅擇撻擘擂擱擧舉擠擡抬擣擯攬擶擴擲擺攀擽攘攜攅攤攣攫攴攵攷收攸畋效敖敕敍敘敞敝敲數斂斃變斛斟斫斷旃旆旁旄旌旒旛旙无旡旱杲昊昃旻杳昵昶昴昜晏晄晉晁晞晝晤晧晨晟晢晰暃暈暎暉暄暘暝曁暹曉暾暼"], +["dba1","曄暸曖曚曠昿曦曩曰曵曷朏朖朞朦朧霸朮朿朶杁朸朷杆杞杠杙杣杤枉杰枩杼杪枌枋枦枡枅枷柯枴柬枳柩枸柤柞柝柢柮枹柎柆柧檜栞框栩桀桍栲桎梳栫桙档桷桿梟梏梭梔條梛梃檮梹桴梵梠梺椏梍桾椁棊椈棘椢椦棡椌棍"], +["dca1","棔棧棕椶椒椄棗棣椥棹棠棯椨椪椚椣椡棆楹楷楜楸楫楔楾楮椹楴椽楙椰楡楞楝榁楪榲榮槐榿槁槓榾槎寨槊槝榻槃榧樮榑榠榜榕榴槞槨樂樛槿權槹槲槧樅榱樞槭樔槫樊樒櫁樣樓橄樌橲樶橸橇橢橙橦橈樸樢檐檍檠檄檢檣"], +["dda1","檗蘗檻櫃櫂檸檳檬櫞櫑櫟檪櫚櫪櫻欅蘖櫺欒欖鬱欟欸欷盜欹飮歇歃歉歐歙歔歛歟歡歸歹歿殀殄殃殍殘殕殞殤殪殫殯殲殱殳殷殼毆毋毓毟毬毫毳毯麾氈氓气氛氤氣汞汕汢汪沂沍沚沁沛汾汨汳沒沐泄泱泓沽泗泅泝沮沱沾"], +["dea1","沺泛泯泙泪洟衍洶洫洽洸洙洵洳洒洌浣涓浤浚浹浙涎涕濤涅淹渕渊涵淇淦涸淆淬淞淌淨淒淅淺淙淤淕淪淮渭湮渮渙湲湟渾渣湫渫湶湍渟湃渺湎渤滿渝游溂溪溘滉溷滓溽溯滄溲滔滕溏溥滂溟潁漑灌滬滸滾漿滲漱滯漲滌"], +["dfa1","漾漓滷澆潺潸澁澀潯潛濳潭澂潼潘澎澑濂潦澳澣澡澤澹濆澪濟濕濬濔濘濱濮濛瀉瀋濺瀑瀁瀏濾瀛瀚潴瀝瀘瀟瀰瀾瀲灑灣炙炒炯烱炬炸炳炮烟烋烝烙焉烽焜焙煥煕熈煦煢煌煖煬熏燻熄熕熨熬燗熹熾燒燉燔燎燠燬燧燵燼"], +["e0a1","燹燿爍爐爛爨爭爬爰爲爻爼爿牀牆牋牘牴牾犂犁犇犒犖犢犧犹犲狃狆狄狎狒狢狠狡狹狷倏猗猊猜猖猝猴猯猩猥猾獎獏默獗獪獨獰獸獵獻獺珈玳珎玻珀珥珮珞璢琅瑯琥珸琲琺瑕琿瑟瑙瑁瑜瑩瑰瑣瑪瑶瑾璋璞璧瓊瓏瓔珱"], +["e1a1","瓠瓣瓧瓩瓮瓲瓰瓱瓸瓷甄甃甅甌甎甍甕甓甞甦甬甼畄畍畊畉畛畆畚畩畤畧畫畭畸當疆疇畴疊疉疂疔疚疝疥疣痂疳痃疵疽疸疼疱痍痊痒痙痣痞痾痿痼瘁痰痺痲痳瘋瘍瘉瘟瘧瘠瘡瘢瘤瘴瘰瘻癇癈癆癜癘癡癢癨癩癪癧癬癰"], +["e2a1","癲癶癸發皀皃皈皋皎皖皓皙皚皰皴皸皹皺盂盍盖盒盞盡盥盧盪蘯盻眈眇眄眩眤眞眥眦眛眷眸睇睚睨睫睛睥睿睾睹瞎瞋瞑瞠瞞瞰瞶瞹瞿瞼瞽瞻矇矍矗矚矜矣矮矼砌砒礦砠礪硅碎硴碆硼碚碌碣碵碪碯磑磆磋磔碾碼磅磊磬"], +["e3a1","磧磚磽磴礇礒礑礙礬礫祀祠祗祟祚祕祓祺祿禊禝禧齋禪禮禳禹禺秉秕秧秬秡秣稈稍稘稙稠稟禀稱稻稾稷穃穗穉穡穢穩龝穰穹穽窈窗窕窘窖窩竈窰窶竅竄窿邃竇竊竍竏竕竓站竚竝竡竢竦竭竰笂笏笊笆笳笘笙笞笵笨笶筐"], +["e4a1","筺笄筍笋筌筅筵筥筴筧筰筱筬筮箝箘箟箍箜箚箋箒箏筝箙篋篁篌篏箴篆篝篩簑簔篦篥籠簀簇簓篳篷簗簍篶簣簧簪簟簷簫簽籌籃籔籏籀籐籘籟籤籖籥籬籵粃粐粤粭粢粫粡粨粳粲粱粮粹粽糀糅糂糘糒糜糢鬻糯糲糴糶糺紆"], +["e5a1","紂紜紕紊絅絋紮紲紿紵絆絳絖絎絲絨絮絏絣經綉絛綏絽綛綺綮綣綵緇綽綫總綢綯緜綸綟綰緘緝緤緞緻緲緡縅縊縣縡縒縱縟縉縋縢繆繦縻縵縹繃縷縲縺繧繝繖繞繙繚繹繪繩繼繻纃緕繽辮繿纈纉續纒纐纓纔纖纎纛纜缸缺"], +["e6a1","罅罌罍罎罐网罕罔罘罟罠罨罩罧罸羂羆羃羈羇羌羔羞羝羚羣羯羲羹羮羶羸譱翅翆翊翕翔翡翦翩翳翹飜耆耄耋耒耘耙耜耡耨耿耻聊聆聒聘聚聟聢聨聳聲聰聶聹聽聿肄肆肅肛肓肚肭冐肬胛胥胙胝胄胚胖脉胯胱脛脩脣脯腋"], +["e7a1","隋腆脾腓腑胼腱腮腥腦腴膃膈膊膀膂膠膕膤膣腟膓膩膰膵膾膸膽臀臂膺臉臍臑臙臘臈臚臟臠臧臺臻臾舁舂舅與舊舍舐舖舩舫舸舳艀艙艘艝艚艟艤艢艨艪艫舮艱艷艸艾芍芒芫芟芻芬苡苣苟苒苴苳苺莓范苻苹苞茆苜茉苙"], +["e8a1","茵茴茖茲茱荀茹荐荅茯茫茗茘莅莚莪莟莢莖茣莎莇莊荼莵荳荵莠莉莨菴萓菫菎菽萃菘萋菁菷萇菠菲萍萢萠莽萸蔆菻葭萪萼蕚蒄葷葫蒭葮蒂葩葆萬葯葹萵蓊葢蒹蒿蒟蓙蓍蒻蓚蓐蓁蓆蓖蒡蔡蓿蓴蔗蔘蔬蔟蔕蔔蓼蕀蕣蕘蕈"], +["e9a1","蕁蘂蕋蕕薀薤薈薑薊薨蕭薔薛藪薇薜蕷蕾薐藉薺藏薹藐藕藝藥藜藹蘊蘓蘋藾藺蘆蘢蘚蘰蘿虍乕虔號虧虱蚓蚣蚩蚪蚋蚌蚶蚯蛄蛆蚰蛉蠣蚫蛔蛞蛩蛬蛟蛛蛯蜒蜆蜈蜀蜃蛻蜑蜉蜍蛹蜊蜴蜿蜷蜻蜥蜩蜚蝠蝟蝸蝌蝎蝴蝗蝨蝮蝙"], +["eaa1","蝓蝣蝪蠅螢螟螂螯蟋螽蟀蟐雖螫蟄螳蟇蟆螻蟯蟲蟠蠏蠍蟾蟶蟷蠎蟒蠑蠖蠕蠢蠡蠱蠶蠹蠧蠻衄衂衒衙衞衢衫袁衾袞衵衽袵衲袂袗袒袮袙袢袍袤袰袿袱裃裄裔裘裙裝裹褂裼裴裨裲褄褌褊褓襃褞褥褪褫襁襄褻褶褸襌褝襠襞"], +["eba1","襦襤襭襪襯襴襷襾覃覈覊覓覘覡覩覦覬覯覲覺覽覿觀觚觜觝觧觴觸訃訖訐訌訛訝訥訶詁詛詒詆詈詼詭詬詢誅誂誄誨誡誑誥誦誚誣諄諍諂諚諫諳諧諤諱謔諠諢諷諞諛謌謇謚諡謖謐謗謠謳鞫謦謫謾謨譁譌譏譎證譖譛譚譫"], +["eca1","譟譬譯譴譽讀讌讎讒讓讖讙讚谺豁谿豈豌豎豐豕豢豬豸豺貂貉貅貊貍貎貔豼貘戝貭貪貽貲貳貮貶賈賁賤賣賚賽賺賻贄贅贊贇贏贍贐齎贓賍贔贖赧赭赱赳趁趙跂趾趺跏跚跖跌跛跋跪跫跟跣跼踈踉跿踝踞踐踟蹂踵踰踴蹊"], +["eda1","蹇蹉蹌蹐蹈蹙蹤蹠踪蹣蹕蹶蹲蹼躁躇躅躄躋躊躓躑躔躙躪躡躬躰軆躱躾軅軈軋軛軣軼軻軫軾輊輅輕輒輙輓輜輟輛輌輦輳輻輹轅轂輾轌轉轆轎轗轜轢轣轤辜辟辣辭辯辷迚迥迢迪迯邇迴逅迹迺逑逕逡逍逞逖逋逧逶逵逹迸"], +["eea1","遏遐遑遒逎遉逾遖遘遞遨遯遶隨遲邂遽邁邀邊邉邏邨邯邱邵郢郤扈郛鄂鄒鄙鄲鄰酊酖酘酣酥酩酳酲醋醉醂醢醫醯醪醵醴醺釀釁釉釋釐釖釟釡釛釼釵釶鈞釿鈔鈬鈕鈑鉞鉗鉅鉉鉤鉈銕鈿鉋鉐銜銖銓銛鉚鋏銹銷鋩錏鋺鍄錮"], +["efa1","錙錢錚錣錺錵錻鍜鍠鍼鍮鍖鎰鎬鎭鎔鎹鏖鏗鏨鏥鏘鏃鏝鏐鏈鏤鐚鐔鐓鐃鐇鐐鐶鐫鐵鐡鐺鑁鑒鑄鑛鑠鑢鑞鑪鈩鑰鑵鑷鑽鑚鑼鑾钁鑿閂閇閊閔閖閘閙閠閨閧閭閼閻閹閾闊濶闃闍闌闕闔闖關闡闥闢阡阨阮阯陂陌陏陋陷陜陞"], +["f0a1","陝陟陦陲陬隍隘隕隗險隧隱隲隰隴隶隸隹雎雋雉雍襍雜霍雕雹霄霆霈霓霎霑霏霖霙霤霪霰霹霽霾靄靆靈靂靉靜靠靤靦靨勒靫靱靹鞅靼鞁靺鞆鞋鞏鞐鞜鞨鞦鞣鞳鞴韃韆韈韋韜韭齏韲竟韶韵頏頌頸頤頡頷頽顆顏顋顫顯顰"], 
+["f1a1","顱顴顳颪颯颱颶飄飃飆飩飫餃餉餒餔餘餡餝餞餤餠餬餮餽餾饂饉饅饐饋饑饒饌饕馗馘馥馭馮馼駟駛駝駘駑駭駮駱駲駻駸騁騏騅駢騙騫騷驅驂驀驃騾驕驍驛驗驟驢驥驤驩驫驪骭骰骼髀髏髑髓體髞髟髢髣髦髯髫髮髴髱髷"], +["f2a1","髻鬆鬘鬚鬟鬢鬣鬥鬧鬨鬩鬪鬮鬯鬲魄魃魏魍魎魑魘魴鮓鮃鮑鮖鮗鮟鮠鮨鮴鯀鯊鮹鯆鯏鯑鯒鯣鯢鯤鯔鯡鰺鯲鯱鯰鰕鰔鰉鰓鰌鰆鰈鰒鰊鰄鰮鰛鰥鰤鰡鰰鱇鰲鱆鰾鱚鱠鱧鱶鱸鳧鳬鳰鴉鴈鳫鴃鴆鴪鴦鶯鴣鴟鵄鴕鴒鵁鴿鴾鵆鵈"], +["f3a1","鵝鵞鵤鵑鵐鵙鵲鶉鶇鶫鵯鵺鶚鶤鶩鶲鷄鷁鶻鶸鶺鷆鷏鷂鷙鷓鷸鷦鷭鷯鷽鸚鸛鸞鹵鹹鹽麁麈麋麌麒麕麑麝麥麩麸麪麭靡黌黎黏黐黔黜點黝黠黥黨黯黴黶黷黹黻黼黽鼇鼈皷鼕鼡鼬鼾齊齒齔齣齟齠齡齦齧齬齪齷齲齶龕龜龠"], +["f4a1","堯槇遙瑤凜熙"], +["f9a1","纊褜鍈銈蓜俉炻昱棈鋹曻彅丨仡仼伀伃伹佖侒侊侚侔俍偀倢俿倞偆偰偂傔僴僘兊兤冝冾凬刕劜劦勀勛匀匇匤卲厓厲叝﨎咜咊咩哿喆坙坥垬埈埇﨏塚增墲夋奓奛奝奣妤妺孖寀甯寘寬尞岦岺峵崧嵓﨑嵂嵭嶸嶹巐弡弴彧德"], +["faa1","忞恝悅悊惞惕愠惲愑愷愰憘戓抦揵摠撝擎敎昀昕昻昉昮昞昤晥晗晙晴晳暙暠暲暿曺朎朗杦枻桒柀栁桄棏﨓楨﨔榘槢樰橫橆橳橾櫢櫤毖氿汜沆汯泚洄涇浯涖涬淏淸淲淼渹湜渧渼溿澈澵濵瀅瀇瀨炅炫焏焄煜煆煇凞燁燾犱"], +["fba1","犾猤猪獷玽珉珖珣珒琇珵琦琪琩琮瑢璉璟甁畯皂皜皞皛皦益睆劯砡硎硤硺礰礼神祥禔福禛竑竧靖竫箞精絈絜綷綠緖繒罇羡羽茁荢荿菇菶葈蒴蕓蕙蕫﨟薰蘒﨡蠇裵訒訷詹誧誾諟諸諶譓譿賰賴贒赶﨣軏﨤逸遧郞都鄕鄧釚"], +["fca1","釗釞釭釮釤釥鈆鈐鈊鈺鉀鈼鉎鉙鉑鈹鉧銧鉷鉸鋧鋗鋙鋐﨧鋕鋠鋓錥錡鋻﨨錞鋿錝錂鍰鍗鎤鏆鏞鏸鐱鑅鑈閒隆﨩隝隯霳霻靃靍靏靑靕顗顥飯飼餧館馞驎髙髜魵魲鮏鮱鮻鰀鵰鵫鶴鸙黑"], +["fcf1","ⅰ",9,"¬¦'""], +["8fa2af","˘ˇ¸˙˝¯˛˚~΄΅"], +["8fa2c2","¡¦¿"], +["8fa2eb","ºª©®™¤№"], +["8fa6e1","ΆΈΉΊΪ"], +["8fa6e7","Ό"], +["8fa6e9","ΎΫ"], +["8fa6ec","Ώ"], +["8fa6f1","άέήίϊΐόςύϋΰώ"], +["8fa7c2","Ђ",10,"ЎЏ"], +["8fa7f2","ђ",10,"ўџ"], +["8fa9a1","ÆĐ"], +["8fa9a4","Ħ"], +["8fa9a6","IJ"], +["8fa9a8","ŁĿ"], +["8fa9ab","ŊØŒ"], +["8fa9af","ŦÞ"], +["8fa9c1","æđðħıijĸłŀʼnŋøœßŧþ"], +["8faaa1","ÁÀÄÂĂǍĀĄÅÃĆĈČÇĊĎÉÈËÊĚĖĒĘ"], +["8faaba","ĜĞĢĠĤÍÌÏÎǏİĪĮĨĴĶĹĽĻŃŇŅÑÓÒÖÔǑŐŌÕŔŘŖŚŜŠŞŤŢÚÙÜÛŬǓŰŪŲŮŨǗǛǙǕŴÝŸŶŹŽŻ"], +["8faba1","áàäâăǎāąåãćĉčçċďéèëêěėēęǵĝğ"], +["8fabbd","ġĥíìïîǐ"], +["8fabc5","īįĩĵķĺľļńňņñóòöôǒőōõŕřŗśŝšşťţúùüûŭǔűūųůũǘǜǚǖŵýÿŷźžż"], +["8fb0a1","丂丄丅丌丒丟丣两丨丫丮丯丰丵乀乁乄乇乑乚乜乣乨乩乴乵乹乿亍亖亗亝亯亹仃仐仚仛仠仡仢仨仯仱仳仵份仾仿伀伂伃伈伋伌伒伕伖众伙伮伱你伳伵伷伹伻伾佀佂佈佉佋佌佒佔佖佘佟佣佪佬佮佱佷佸佹佺佽佾侁侂侄"], +["8fb1a1","侅侉侊侌侎侐侒侓侔侗侙侚侞侟侲侷侹侻侼侽侾俀俁俅俆俈俉俋俌俍俏俒俜俠俢俰俲俼俽俿倀倁倄倇倊倌倎倐倓倗倘倛倜倝倞倢倧倮倰倲倳倵偀偁偂偅偆偊偌偎偑偒偓偗偙偟偠偢偣偦偧偪偭偰偱倻傁傃傄傆傊傎傏傐"], +["8fb2a1","傒傓傔傖傛傜傞",4,"傪傯傰傹傺傽僀僃僄僇僌僎僐僓僔僘僜僝僟僢僤僦僨僩僯僱僶僺僾儃儆儇儈儋儌儍儎僲儐儗儙儛儜儝儞儣儧儨儬儭儯儱儳儴儵儸儹兂兊兏兓兕兗兘兟兤兦兾冃冄冋冎冘冝冡冣冭冸冺冼冾冿凂"], +["8fb3a1","凈减凑凒凓凕凘凞凢凥凮凲凳凴凷刁刂刅划刓刕刖刘刢刨刱刲刵刼剅剉剕剗剘剚剜剟剠剡剦剮剷剸剹劀劂劅劊劌劓劕劖劗劘劚劜劤劥劦劧劯劰劶劷劸劺劻劽勀勄勆勈勌勏勑勔勖勛勜勡勥勨勩勪勬勰勱勴勶勷匀匃匊匋"], +["8fb4a1","匌匑匓匘匛匜匞匟匥匧匨匩匫匬匭匰匲匵匼匽匾卂卌卋卙卛卡卣卥卬卭卲卹卾厃厇厈厎厓厔厙厝厡厤厪厫厯厲厴厵厷厸厺厽叀叅叏叒叓叕叚叝叞叠另叧叵吂吓吚吡吧吨吪启吱吴吵呃呄呇呍呏呞呢呤呦呧呩呫呭呮呴呿"], +["8fb5a1","咁咃咅咈咉咍咑咕咖咜咟咡咦咧咩咪咭咮咱咷咹咺咻咿哆哊响哎哠哪哬哯哶哼哾哿唀唁唅唈唉唌唍唎唕唪唫唲唵唶唻唼唽啁啇啉啊啍啐啑啘啚啛啞啠啡啤啦啿喁喂喆喈喎喏喑喒喓喔喗喣喤喭喲喿嗁嗃嗆嗉嗋嗌嗎嗑嗒"], +["8fb6a1","嗓嗗嗘嗛嗞嗢嗩嗶嗿嘅嘈嘊嘍",5,"嘙嘬嘰嘳嘵嘷嘹嘻嘼嘽嘿噀噁噃噄噆噉噋噍噏噔噞噠噡噢噣噦噩噭噯噱噲噵嚄嚅嚈嚋嚌嚕嚙嚚嚝嚞嚟嚦嚧嚨嚩嚫嚬嚭嚱嚳嚷嚾囅囉囊囋囏囐囌囍囙囜囝囟囡囤",4,"囱囫园"], +["8fb7a1","囶囷圁圂圇圊圌圑圕圚圛圝圠圢圣圤圥圩圪圬圮圯圳圴圽圾圿坅坆坌坍坒坢坥坧坨坫坭",4,"坳坴坵坷坹坺坻坼坾垁垃垌垔垗垙垚垜垝垞垟垡垕垧垨垩垬垸垽埇埈埌埏埕埝埞埤埦埧埩埭埰埵埶埸埽埾埿堃堄堈堉埡"], +["8fb8a1","堌堍堛堞堟堠堦堧堭堲堹堿塉塌塍塏塐塕塟塡塤塧塨塸塼塿墀墁墇墈墉墊墌墍墏墐墔墖墝墠墡墢墦墩墱墲壄墼壂壈壍壎壐壒壔壖壚壝壡壢壩壳夅夆夋夌夒夓夔虁夝夡夣夤夨夯夰夳夵夶夿奃奆奒奓奙奛奝奞奟奡奣奫奭"], +["8fb9a1","奯奲奵奶她奻奼妋妌妎妒妕妗妟妤妧妭妮妯妰妳妷妺妼姁姃姄姈姊姍姒姝姞姟姣姤姧姮姯姱姲姴姷娀娄娌娍娎娒娓娞娣娤娧娨娪娭娰婄婅婇婈婌婐婕婞婣婥婧婭婷婺婻婾媋媐媓媖媙媜媞媟媠媢媧媬媱媲媳媵媸媺媻媿"], +["8fbaa1","嫄嫆嫈嫏嫚嫜嫠嫥嫪嫮嫵嫶嫽嬀嬁嬈嬗嬴嬙嬛嬝嬡嬥嬭嬸孁孋孌孒孖孞孨孮孯孼孽孾孿宁宄宆宊宎宐宑宓宔宖宨宩宬宭宯宱宲宷宺宼寀寁寍寏寖",4,"寠寯寱寴寽尌尗尞尟尣尦尩尫尬尮尰尲尵尶屙屚屜屢屣屧屨屩"], +["8fbba1","屭屰屴屵屺屻屼屽岇岈岊岏岒岝岟岠岢岣岦岪岲岴岵岺峉峋峒峝峗峮峱峲峴崁崆崍崒崫崣崤崦崧崱崴崹崽崿嵂嵃嵆嵈嵕嵑嵙嵊嵟嵠嵡嵢嵤嵪嵭嵰嵹嵺嵾嵿嶁嶃嶈嶊嶒嶓嶔嶕嶙嶛嶟嶠嶧嶫嶰嶴嶸嶹巃巇巋巐巎巘巙巠巤"], +["8fbca1","巩巸巹帀帇帍帒帔帕帘帟帠帮帨帲帵帾幋幐幉幑幖幘幛幜幞幨幪",4,"幰庀庋庎庢庤庥庨庪庬庱庳庽庾庿廆廌廋廎廑廒廔廕廜廞廥廫异弆弇弈弎弙弜弝弡弢弣弤弨弫弬弮弰弴弶弻弽弿彀彄彅彇彍彐彔彘彛彠彣彤彧"], +["8fbda1","彯彲彴彵彸彺彽彾徉徍徏徖徜徝徢徧徫徤徬徯徰徱徸忄忇忈忉忋忐",4,"忞忡忢忨忩忪忬忭忮忯忲忳忶忺忼怇怊怍怓怔怗怘怚怟怤怭怳怵恀恇恈恉恌恑恔恖恗恝恡恧恱恾恿悂悆悈悊悎悑悓悕悘悝悞悢悤悥您悰悱悷"], +["8fbea1","悻悾惂惄惈惉惊惋惎惏惔惕惙惛惝惞惢惥惲惵惸惼惽愂愇愊愌愐",4,"愖愗愙愜愞愢愪愫愰愱愵愶愷愹慁慅慆慉慞慠慬慲慸慻慼慿憀憁憃憄憋憍憒憓憗憘憜憝憟憠憥憨憪憭憸憹憼懀懁懂懎懏懕懜懝懞懟懡懢懧懩懥"], +["8fbfa1","懬懭懯戁戃戄戇戓戕戜戠戢戣戧戩戫戹戽扂扃扄扆扌扐扑扒扔扖扚扜扤扭扯扳扺扽抍抎抏抐抦抨抳抶抷抺抾抿拄拎拕拖拚拪拲拴拼拽挃挄挊挋挍挐挓挖挘挩挪挭挵挶挹挼捁捂捃捄捆捊捋捎捒捓捔捘捛捥捦捬捭捱捴捵"], +["8fc0a1","捸捼捽捿掂掄掇掊掐掔掕掙掚掞掤掦掭掮掯掽揁揅揈揎揑揓揔揕揜揠揥揪揬揲揳揵揸揹搉搊搐搒搔搘搞搠搢搤搥搩搪搯搰搵搽搿摋摏摑摒摓摔摚摛摜摝摟摠摡摣摭摳摴摻摽撅撇撏撐撑撘撙撛撝撟撡撣撦撨撬撳撽撾撿"], +["8fc1a1","擄擉擊擋擌擎擐擑擕擗擤擥擩擪擭擰擵擷擻擿攁攄攈攉攊攏攓攔攖攙攛攞攟攢攦攩攮攱攺攼攽敃敇敉敐敒敔敟敠敧敫敺敽斁斅斊斒斕斘斝斠斣斦斮斲斳斴斿旂旈旉旎旐旔旖旘旟旰旲旴旵旹旾旿昀昄昈昉昍昑昒昕昖昝"], +["8fc2a1","昞昡昢昣昤昦昩昪昫昬昮昰昱昳昹昷晀晅晆晊晌晑晎晗晘晙晛晜晠晡曻晪晫晬晾晳晵晿晷晸晹晻暀晼暋暌暍暐暒暙暚暛暜暟暠暤暭暱暲暵暻暿曀曂曃曈曌曎曏曔曛曟曨曫曬曮曺朅朇朎朓朙朜朠朢朳朾杅杇杈杌杔杕杝"], 
+["8fc3a1","杦杬杮杴杶杻极构枎枏枑枓枖枘枙枛枰枱枲枵枻枼枽柹柀柂柃柅柈柉柒柗柙柜柡柦柰柲柶柷桒栔栙栝栟栨栧栬栭栯栰栱栳栻栿桄桅桊桌桕桗桘桛桫桮",4,"桵桹桺桻桼梂梄梆梈梖梘梚梜梡梣梥梩梪梮梲梻棅棈棌棏"], +["8fc4a1","棐棑棓棖棙棜棝棥棨棪棫棬棭棰棱棵棶棻棼棽椆椉椊椐椑椓椖椗椱椳椵椸椻楂楅楉楎楗楛楣楤楥楦楨楩楬楰楱楲楺楻楿榀榍榒榖榘榡榥榦榨榫榭榯榷榸榺榼槅槈槑槖槗槢槥槮槯槱槳槵槾樀樁樃樏樑樕樚樝樠樤樨樰樲"], +["8fc5a1","樴樷樻樾樿橅橆橉橊橎橐橑橒橕橖橛橤橧橪橱橳橾檁檃檆檇檉檋檑檛檝檞檟檥檫檯檰檱檴檽檾檿櫆櫉櫈櫌櫐櫔櫕櫖櫜櫝櫤櫧櫬櫰櫱櫲櫼櫽欂欃欆欇欉欏欐欑欗欛欞欤欨欫欬欯欵欶欻欿歆歊歍歒歖歘歝歠歧歫歮歰歵歽"], +["8fc6a1","歾殂殅殗殛殟殠殢殣殨殩殬殭殮殰殸殹殽殾毃毄毉毌毖毚毡毣毦毧毮毱毷毹毿氂氄氅氉氍氎氐氒氙氟氦氧氨氬氮氳氵氶氺氻氿汊汋汍汏汒汔汙汛汜汫汭汯汴汶汸汹汻沅沆沇沉沔沕沗沘沜沟沰沲沴泂泆泍泏泐泑泒泔泖"], +["8fc7a1","泚泜泠泧泩泫泬泮泲泴洄洇洊洎洏洑洓洚洦洧洨汧洮洯洱洹洼洿浗浞浟浡浥浧浯浰浼涂涇涑涒涔涖涗涘涪涬涴涷涹涽涿淄淈淊淎淏淖淛淝淟淠淢淥淩淯淰淴淶淼渀渄渞渢渧渲渶渹渻渼湄湅湈湉湋湏湑湒湓湔湗湜湝湞"], +["8fc8a1","湢湣湨湳湻湽溍溓溙溠溧溭溮溱溳溻溿滀滁滃滇滈滊滍滎滏滫滭滮滹滻滽漄漈漊漌漍漖漘漚漛漦漩漪漯漰漳漶漻漼漭潏潑潒潓潗潙潚潝潞潡潢潨潬潽潾澃澇澈澋澌澍澐澒澓澔澖澚澟澠澥澦澧澨澮澯澰澵澶澼濅濇濈濊"], +["8fc9a1","濚濞濨濩濰濵濹濼濽瀀瀅瀆瀇瀍瀗瀠瀣瀯瀴瀷瀹瀼灃灄灈灉灊灋灔灕灝灞灎灤灥灬灮灵灶灾炁炅炆炔",4,"炛炤炫炰炱炴炷烊烑烓烔烕烖烘烜烤烺焃",4,"焋焌焏焞焠焫焭焯焰焱焸煁煅煆煇煊煋煐煒煗煚煜煞煠"], +["8fcaa1","煨煹熀熅熇熌熒熚熛熠熢熯熰熲熳熺熿燀燁燄燋燌燓燖燙燚燜燸燾爀爇爈爉爓爗爚爝爟爤爫爯爴爸爹牁牂牃牅牎牏牐牓牕牖牚牜牞牠牣牨牫牮牯牱牷牸牻牼牿犄犉犍犎犓犛犨犭犮犱犴犾狁狇狉狌狕狖狘狟狥狳狴狺狻"], +["8fcba1","狾猂猄猅猇猋猍猒猓猘猙猞猢猤猧猨猬猱猲猵猺猻猽獃獍獐獒獖獘獝獞獟獠獦獧獩獫獬獮獯獱獷獹獼玀玁玃玅玆玎玐玓玕玗玘玜玞玟玠玢玥玦玪玫玭玵玷玹玼玽玿珅珆珉珋珌珏珒珓珖珙珝珡珣珦珧珩珴珵珷珹珺珻珽"], +["8fcca1","珿琀琁琄琇琊琑琚琛琤琦琨",9,"琹瑀瑃瑄瑆瑇瑋瑍瑑瑒瑗瑝瑢瑦瑧瑨瑫瑭瑮瑱瑲璀璁璅璆璇璉璏璐璑璒璘璙璚璜璟璠璡璣璦璨璩璪璫璮璯璱璲璵璹璻璿瓈瓉瓌瓐瓓瓘瓚瓛瓞瓟瓤瓨瓪瓫瓯瓴瓺瓻瓼瓿甆"], +["8fcda1","甒甖甗甠甡甤甧甩甪甯甶甹甽甾甿畀畃畇畈畎畐畒畗畞畟畡畯畱畹",5,"疁疅疐疒疓疕疙疜疢疤疴疺疿痀痁痄痆痌痎痏痗痜痟痠痡痤痧痬痮痯痱痹瘀瘂瘃瘄瘇瘈瘊瘌瘏瘒瘓瘕瘖瘙瘛瘜瘝瘞瘣瘥瘦瘩瘭瘲瘳瘵瘸瘹"], +["8fcea1","瘺瘼癊癀癁癃癄癅癉癋癕癙癟癤癥癭癮癯癱癴皁皅皌皍皕皛皜皝皟皠皢",6,"皪皭皽盁盅盉盋盌盎盔盙盠盦盨盬盰盱盶盹盼眀眆眊眎眒眔眕眗眙眚眜眢眨眭眮眯眴眵眶眹眽眾睂睅睆睊睍睎睏睒睖睗睜睞睟睠睢"], +["8fcfa1","睤睧睪睬睰睲睳睴睺睽瞀瞄瞌瞍瞔瞕瞖瞚瞟瞢瞧瞪瞮瞯瞱瞵瞾矃矉矑矒矕矙矞矟矠矤矦矪矬矰矱矴矸矻砅砆砉砍砎砑砝砡砢砣砭砮砰砵砷硃硄硇硈硌硎硒硜硞硠硡硣硤硨硪确硺硾碊碏碔碘碡碝碞碟碤碨碬碭碰碱碲碳"], +["8fd0a1","碻碽碿磇磈磉磌磎磒磓磕磖磤磛磟磠磡磦磪磲磳礀磶磷磺磻磿礆礌礐礚礜礞礟礠礥礧礩礭礱礴礵礻礽礿祄祅祆祊祋祏祑祔祘祛祜祧祩祫祲祹祻祼祾禋禌禑禓禔禕禖禘禛禜禡禨禩禫禯禱禴禸离秂秄秇秈秊秏秔秖秚秝秞"], +["8fd1a1","秠秢秥秪秫秭秱秸秼稂稃稇稉稊稌稑稕稛稞稡稧稫稭稯稰稴稵稸稹稺穄穅穇穈穌穕穖穙穜穝穟穠穥穧穪穭穵穸穾窀窂窅窆窊窋窐窑窔窞窠窣窬窳窵窹窻窼竆竉竌竎竑竛竨竩竫竬竱竴竻竽竾笇笔笟笣笧笩笪笫笭笮笯笰"], +["8fd2a1","笱笴笽笿筀筁筇筎筕筠筤筦筩筪筭筯筲筳筷箄箉箎箐箑箖箛箞箠箥箬箯箰箲箵箶箺箻箼箽篂篅篈篊篔篖篗篙篚篛篨篪篲篴篵篸篹篺篼篾簁簂簃簄簆簉簋簌簎簏簙簛簠簥簦簨簬簱簳簴簶簹簺籆籊籕籑籒籓籙",5], +["8fd3a1","籡籣籧籩籭籮籰籲籹籼籽粆粇粏粔粞粠粦粰粶粷粺粻粼粿糄糇糈糉糍糏糓糔糕糗糙糚糝糦糩糫糵紃紇紈紉紏紑紒紓紖紝紞紣紦紪紭紱紼紽紾絀絁絇絈絍絑絓絗絙絚絜絝絥絧絪絰絸絺絻絿綁綂綃綅綆綈綋綌綍綑綖綗綝"], +["8fd4a1","綞綦綧綪綳綶綷綹緂",4,"緌緍緎緗緙縀緢緥緦緪緫緭緱緵緶緹緺縈縐縑縕縗縜縝縠縧縨縬縭縯縳縶縿繄繅繇繎繐繒繘繟繡繢繥繫繮繯繳繸繾纁纆纇纊纍纑纕纘纚纝纞缼缻缽缾缿罃罄罇罏罒罓罛罜罝罡罣罤罥罦罭"], +["8fd5a1","罱罽罾罿羀羋羍羏羐羑羖羗羜羡羢羦羪羭羴羼羿翀翃翈翎翏翛翟翣翥翨翬翮翯翲翺翽翾翿耇耈耊耍耎耏耑耓耔耖耝耞耟耠耤耦耬耮耰耴耵耷耹耺耼耾聀聄聠聤聦聭聱聵肁肈肎肜肞肦肧肫肸肹胈胍胏胒胔胕胗胘胠胭胮"], +["8fd6a1","胰胲胳胶胹胺胾脃脋脖脗脘脜脞脠脤脧脬脰脵脺脼腅腇腊腌腒腗腠腡腧腨腩腭腯腷膁膐膄膅膆膋膎膖膘膛膞膢膮膲膴膻臋臃臅臊臎臏臕臗臛臝臞臡臤臫臬臰臱臲臵臶臸臹臽臿舀舃舏舓舔舙舚舝舡舢舨舲舴舺艃艄艅艆"], +["8fd7a1","艋艎艏艑艖艜艠艣艧艭艴艻艽艿芀芁芃芄芇芉芊芎芑芔芖芘芚芛芠芡芣芤芧芨芩芪芮芰芲芴芷芺芼芾芿苆苐苕苚苠苢苤苨苪苭苯苶苷苽苾茀茁茇茈茊茋荔茛茝茞茟茡茢茬茭茮茰茳茷茺茼茽荂荃荄荇荍荎荑荕荖荗荰荸"], +["8fd8a1","荽荿莀莂莄莆莍莒莔莕莘莙莛莜莝莦莧莩莬莾莿菀菇菉菏菐菑菔菝荓菨菪菶菸菹菼萁萆萊萏萑萕萙莭萯萹葅葇葈葊葍葏葑葒葖葘葙葚葜葠葤葥葧葪葰葳葴葶葸葼葽蒁蒅蒒蒓蒕蒞蒦蒨蒩蒪蒯蒱蒴蒺蒽蒾蓀蓂蓇蓈蓌蓏蓓"], +["8fd9a1","蓜蓧蓪蓯蓰蓱蓲蓷蔲蓺蓻蓽蔂蔃蔇蔌蔎蔐蔜蔞蔢蔣蔤蔥蔧蔪蔫蔯蔳蔴蔶蔿蕆蕏",4,"蕖蕙蕜",6,"蕤蕫蕯蕹蕺蕻蕽蕿薁薅薆薉薋薌薏薓薘薝薟薠薢薥薧薴薶薷薸薼薽薾薿藂藇藊藋藎薭藘藚藟藠藦藨藭藳藶藼"], +["8fdaa1","藿蘀蘄蘅蘍蘎蘐蘑蘒蘘蘙蘛蘞蘡蘧蘩蘶蘸蘺蘼蘽虀虂虆虒虓虖虗虘虙虝虠",4,"虩虬虯虵虶虷虺蚍蚑蚖蚘蚚蚜蚡蚦蚧蚨蚭蚱蚳蚴蚵蚷蚸蚹蚿蛀蛁蛃蛅蛑蛒蛕蛗蛚蛜蛠蛣蛥蛧蚈蛺蛼蛽蜄蜅蜇蜋蜎蜏蜐蜓蜔蜙蜞蜟蜡蜣"], +["8fdba1","蜨蜮蜯蜱蜲蜹蜺蜼蜽蜾蝀蝃蝅蝍蝘蝝蝡蝤蝥蝯蝱蝲蝻螃",6,"螋螌螐螓螕螗螘螙螞螠螣螧螬螭螮螱螵螾螿蟁蟈蟉蟊蟎蟕蟖蟙蟚蟜蟟蟢蟣蟤蟪蟫蟭蟱蟳蟸蟺蟿蠁蠃蠆蠉蠊蠋蠐蠙蠒蠓蠔蠘蠚蠛蠜蠞蠟蠨蠭蠮蠰蠲蠵"], +["8fdca1","蠺蠼衁衃衅衈衉衊衋衎衑衕衖衘衚衜衟衠衤衩衱衹衻袀袘袚袛袜袟袠袨袪袺袽袾裀裊",4,"裑裒裓裛裞裧裯裰裱裵裷褁褆褍褎褏褕褖褘褙褚褜褠褦褧褨褰褱褲褵褹褺褾襀襂襅襆襉襏襒襗襚襛襜襡襢襣襫襮襰襳襵襺"], +["8fdda1","襻襼襽覉覍覐覔覕覛覜覟覠覥覰覴覵覶覷覼觔",4,"觥觩觫觭觱觳觶觹觽觿訄訅訇訏訑訒訔訕訞訠訢訤訦訫訬訯訵訷訽訾詀詃詅詇詉詍詎詓詖詗詘詜詝詡詥詧詵詶詷詹詺詻詾詿誀誃誆誋誏誐誒誖誗誙誟誧誩誮誯誳"], +["8fdea1","誶誷誻誾諃諆諈諉諊諑諓諔諕諗諝諟諬諰諴諵諶諼諿謅謆謋謑謜謞謟謊謭謰謷謼譂",4,"譈譒譓譔譙譍譞譣譭譶譸譹譼譾讁讄讅讋讍讏讔讕讜讞讟谸谹谽谾豅豇豉豋豏豑豓豔豗豘豛豝豙豣豤豦豨豩豭豳豵豶豻豾貆"], +["8fdfa1","貇貋貐貒貓貙貛貜貤貹貺賅賆賉賋賏賖賕賙賝賡賨賬賯賰賲賵賷賸賾賿贁贃贉贒贗贛赥赩赬赮赿趂趄趈趍趐趑趕趞趟趠趦趫趬趯趲趵趷趹趻跀跅跆跇跈跊跎跑跔跕跗跙跤跥跧跬跰趼跱跲跴跽踁踄踅踆踋踑踔踖踠踡踢"], +["8fe0a1","踣踦踧踱踳踶踷踸踹踽蹀蹁蹋蹍蹎蹏蹔蹛蹜蹝蹞蹡蹢蹩蹬蹭蹯蹰蹱蹹蹺蹻躂躃躉躐躒躕躚躛躝躞躢躧躩躭躮躳躵躺躻軀軁軃軄軇軏軑軔軜軨軮軰軱軷軹軺軭輀輂輇輈輏輐輖輗輘輞輠輡輣輥輧輨輬輭輮輴輵輶輷輺轀轁"], +["8fe1a1","轃轇轏轑",4,"轘轝轞轥辝辠辡辤辥辦辵辶辸达迀迁迆迊迋迍运迒迓迕迠迣迤迨迮迱迵迶迻迾适逄逈逌逘逛逨逩逯逪逬逭逳逴逷逿遃遄遌遛遝遢遦遧遬遰遴遹邅邈邋邌邎邐邕邗邘邙邛邠邡邢邥邰邲邳邴邶邽郌邾郃"], +["8fe2a1","郄郅郇郈郕郗郘郙郜郝郟郥郒郶郫郯郰郴郾郿鄀鄄鄅鄆鄈鄍鄐鄔鄖鄗鄘鄚鄜鄞鄠鄥鄢鄣鄧鄩鄮鄯鄱鄴鄶鄷鄹鄺鄼鄽酃酇酈酏酓酗酙酚酛酡酤酧酭酴酹酺酻醁醃醅醆醊醎醑醓醔醕醘醞醡醦醨醬醭醮醰醱醲醳醶醻醼醽醿"], 
+["8fe3a1","釂釃釅釓釔釗釙釚釞釤釥釩釪釬",5,"釷釹釻釽鈀鈁鈄鈅鈆鈇鈉鈊鈌鈐鈒鈓鈖鈘鈜鈝鈣鈤鈥鈦鈨鈮鈯鈰鈳鈵鈶鈸鈹鈺鈼鈾鉀鉂鉃鉆鉇鉊鉍鉎鉏鉑鉘鉙鉜鉝鉠鉡鉥鉧鉨鉩鉮鉯鉰鉵",4,"鉻鉼鉽鉿銈銉銊銍銎銒銗"], +["8fe4a1","銙銟銠銤銥銧銨銫銯銲銶銸銺銻銼銽銿",4,"鋅鋆鋇鋈鋋鋌鋍鋎鋐鋓鋕鋗鋘鋙鋜鋝鋟鋠鋡鋣鋥鋧鋨鋬鋮鋰鋹鋻鋿錀錂錈錍錑錔錕錜錝錞錟錡錤錥錧錩錪錳錴錶錷鍇鍈鍉鍐鍑鍒鍕鍗鍘鍚鍞鍤鍥鍧鍩鍪鍭鍯鍰鍱鍳鍴鍶"], +["8fe5a1","鍺鍽鍿鎀鎁鎂鎈鎊鎋鎍鎏鎒鎕鎘鎛鎞鎡鎣鎤鎦鎨鎫鎴鎵鎶鎺鎩鏁鏄鏅鏆鏇鏉",4,"鏓鏙鏜鏞鏟鏢鏦鏧鏹鏷鏸鏺鏻鏽鐁鐂鐄鐈鐉鐍鐎鐏鐕鐖鐗鐟鐮鐯鐱鐲鐳鐴鐻鐿鐽鑃鑅鑈鑊鑌鑕鑙鑜鑟鑡鑣鑨鑫鑭鑮鑯鑱鑲钄钃镸镹"], +["8fe6a1","镾閄閈閌閍閎閝閞閟閡閦閩閫閬閴閶閺閽閿闆闈闉闋闐闑闒闓闙闚闝闞闟闠闤闦阝阞阢阤阥阦阬阱阳阷阸阹阺阼阽陁陒陔陖陗陘陡陮陴陻陼陾陿隁隂隃隄隉隑隖隚隝隟隤隥隦隩隮隯隳隺雊雒嶲雘雚雝雞雟雩雯雱雺霂"], +["8fe7a1","霃霅霉霚霛霝霡霢霣霨霱霳靁靃靊靎靏靕靗靘靚靛靣靧靪靮靳靶靷靸靻靽靿鞀鞉鞕鞖鞗鞙鞚鞞鞟鞢鞬鞮鞱鞲鞵鞶鞸鞹鞺鞼鞾鞿韁韄韅韇韉韊韌韍韎韐韑韔韗韘韙韝韞韠韛韡韤韯韱韴韷韸韺頇頊頙頍頎頔頖頜頞頠頣頦"], +["8fe8a1","頫頮頯頰頲頳頵頥頾顄顇顊顑顒顓顖顗顙顚顢顣顥顦顪顬颫颭颮颰颴颷颸颺颻颿飂飅飈飌飡飣飥飦飧飪飳飶餂餇餈餑餕餖餗餚餛餜餟餢餦餧餫餱",4,"餹餺餻餼饀饁饆饇饈饍饎饔饘饙饛饜饞饟饠馛馝馟馦馰馱馲馵"], +["8fe9a1","馹馺馽馿駃駉駓駔駙駚駜駞駧駪駫駬駰駴駵駹駽駾騂騃騄騋騌騐騑騖騞騠騢騣騤騧騭騮騳騵騶騸驇驁驄驊驋驌驎驑驔驖驝骪骬骮骯骲骴骵骶骹骻骾骿髁髃髆髈髎髐髒髕髖髗髛髜髠髤髥髧髩髬髲髳髵髹髺髽髿",4], +["8feaa1","鬄鬅鬈鬉鬋鬌鬍鬎鬐鬒鬖鬙鬛鬜鬠鬦鬫鬭鬳鬴鬵鬷鬹鬺鬽魈魋魌魕魖魗魛魞魡魣魥魦魨魪",4,"魳魵魷魸魹魿鮀鮄鮅鮆鮇鮉鮊鮋鮍鮏鮐鮔鮚鮝鮞鮦鮧鮩鮬鮰鮱鮲鮷鮸鮻鮼鮾鮿鯁鯇鯈鯎鯐鯗鯘鯝鯟鯥鯧鯪鯫鯯鯳鯷鯸"], +["8feba1","鯹鯺鯽鯿鰀鰂鰋鰏鰑鰖鰘鰙鰚鰜鰞鰢鰣鰦",4,"鰱鰵鰶鰷鰽鱁鱃鱄鱅鱉鱊鱎鱏鱐鱓鱔鱖鱘鱛鱝鱞鱟鱣鱩鱪鱜鱫鱨鱮鱰鱲鱵鱷鱻鳦鳲鳷鳹鴋鴂鴑鴗鴘鴜鴝鴞鴯鴰鴲鴳鴴鴺鴼鵅鴽鵂鵃鵇鵊鵓鵔鵟鵣鵢鵥鵩鵪鵫鵰鵶鵷鵻"], +["8feca1","鵼鵾鶃鶄鶆鶊鶍鶎鶒鶓鶕鶖鶗鶘鶡鶪鶬鶮鶱鶵鶹鶼鶿鷃鷇鷉鷊鷔鷕鷖鷗鷚鷞鷟鷠鷥鷧鷩鷫鷮鷰鷳鷴鷾鸊鸂鸇鸎鸐鸑鸒鸕鸖鸙鸜鸝鹺鹻鹼麀麂麃麄麅麇麎麏麖麘麛麞麤麨麬麮麯麰麳麴麵黆黈黋黕黟黤黧黬黭黮黰黱黲黵"], +["8feda1","黸黿鼂鼃鼉鼏鼐鼑鼒鼔鼖鼗鼙鼚鼛鼟鼢鼦鼪鼫鼯鼱鼲鼴鼷鼹鼺鼼鼽鼿齁齃",4,"齓齕齖齗齘齚齝齞齨齩齭",4,"齳齵齺齽龏龐龑龒龔龖龗龞龡龢龣龥"] +] diff --git a/node_modules/iconv-lite/encodings/tables/gb18030-ranges.json b/node_modules/iconv-lite/encodings/tables/gb18030-ranges.json new file mode 100644 index 000000000..85c693475 --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/gb18030-ranges.json @@ -0,0 +1 @@ +{"uChars":[128,165,169,178,184,216,226,235,238,244,248,251,253,258,276,284,300,325,329,334,364,463,465,467,469,471,473,475,477,506,594,610,712,716,730,930,938,962,970,1026,1104,1106,8209,8215,8218,8222,8231,8241,8244,8246,8252,8365,8452,8454,8458,8471,8482,8556,8570,8596,8602,8713,8720,8722,8726,8731,8737,8740,8742,8748,8751,8760,8766,8777,8781,8787,8802,8808,8816,8854,8858,8870,8896,8979,9322,9372,9548,9588,9616,9622,9634,9652,9662,9672,9676,9680,9702,9735,9738,9793,9795,11906,11909,11913,11917,11928,11944,11947,11951,11956,11960,11964,11979,12284,12292,12312,12319,12330,12351,12436,12447,12535,12543,12586,12842,12850,12964,13200,13215,13218,13253,13263,13267,13270,13384,13428,13727,13839,13851,14617,14703,14801,14816,14964,15183,15471,15585,16471,16736,17208,17325,17330,17374,17623,17997,18018,18212,18218,18301,18318,18760,18811,18814,18820,18823,18844,18848,18872,19576,19620,19738,19887,40870,59244,59336,59367,59413,59417,59423,59431,59437,59443,59452,59460,59478,59493,63789,63866,63894,63976,63986,64016,64018,64021,64025,64034,64037,64042,65074,65093,65107,65112,65127,65132,65375,65510,65536],"gbChars":[0,36,38,45,50,81,89,95,96,100,103,104,105,109,126,133,148,172,175,179,208,306,307,308,309,310,311,312,313,341,428,443,544,545,558,741,742,749,750,805,819,820,7922,7924,7925,7927,7934,7943,7944,7945,7950,8062,8148,8149,8152,8164,8174,8236,8240,8262,8264,8374,8380,8381,8384,8388,8390,8392,8393,8394,8396,8401,8406,8416,8419,8424,8437,8439,8445,8482,8485,8496,8521,8603,8936,8946,9046,9050,9063,9066,9076,9092,9100,9108,9111,9113,9131,9162,9164,9218,9219,11329,11331,11334,11336,11346,11361,11363,11366,11370,11372,11375,11389,11682,11686,11687,11692,11694,11714,11716,11723,11725,11730,11736,11982,11989,12102,12336,12348,12350,12384,12393,12395,12397,12510,12553,12851,12962,12973,13738,13823,13919,13933,14080,14298,14585,14698,15583,15847,16318,16434,16438,16481,16729,17102,17122,17315,17320,17402,17418,17859,17909,17911,17915,17916,17936,17939,17961,18664,18703,18814,18962,19043,33469,33470,33471,33484,33485,33490,33497,33501,33505,33513,33520,33536,3
3550,37845,37921,37948,38029,38038,38064,38065,38066,38069,38075,38076,38078,39108,39109,39113,39114,39115,39116,39265,39394,189000]} \ No newline at end of file diff --git a/node_modules/iconv-lite/encodings/tables/gbk-added.json b/node_modules/iconv-lite/encodings/tables/gbk-added.json new file mode 100644 index 000000000..8abfa9f7b --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/gbk-added.json @@ -0,0 +1,55 @@ +[ +["a140","",62], +["a180","",32], +["a240","",62], +["a280","",32], +["a2ab","",5], +["a2e3","€"], +["a2ef",""], +["a2fd",""], +["a340","",62], +["a380","",31," "], +["a440","",62], +["a480","",32], +["a4f4","",10], +["a540","",62], +["a580","",32], +["a5f7","",7], +["a640","",62], +["a680","",32], +["a6b9","",7], +["a6d9","",6], +["a6ec",""], +["a6f3",""], +["a6f6","",8], +["a740","",62], +["a780","",32], +["a7c2","",14], +["a7f2","",12], +["a896","",10], +["a8bc",""], +["a8bf","ǹ"], +["a8c1",""], +["a8ea","",20], +["a958",""], +["a95b",""], +["a95d",""], +["a989","〾⿰",11], +["a997","",12], +["a9f0","",14], +["aaa1","",93], +["aba1","",93], +["aca1","",93], +["ada1","",93], +["aea1","",93], +["afa1","",93], +["d7fa","",4], +["f8a1","",93], +["f9a1","",93], +["faa1","",93], +["fba1","",93], +["fca1","",93], +["fda1","",93], +["fe50","⺁⺄㑳㑇⺈⺋㖞㘚㘎⺌⺗㥮㤘㧏㧟㩳㧐㭎㱮㳠⺧⺪䁖䅟⺮䌷⺳⺶⺷䎱䎬⺻䏝䓖䙡䙌"], +["fe80","䜣䜩䝼䞍⻊䥇䥺䥽䦂䦃䦅䦆䦟䦛䦷䦶䲣䲟䲠䲡䱷䲢䴓",6,"䶮",93] +] diff --git a/node_modules/iconv-lite/encodings/tables/shiftjis.json b/node_modules/iconv-lite/encodings/tables/shiftjis.json new file mode 100644 index 000000000..5a3a43cf8 --- /dev/null +++ b/node_modules/iconv-lite/encodings/tables/shiftjis.json @@ -0,0 +1,125 @@ +[ +["0","\u0000",128], +["a1","。",62], +["8140"," 、。,.・:;?!゛゜´`¨^ ̄_ヽヾゝゞ〃仝々〆〇ー―‐/\~∥|…‥‘’“”()〔〕[]{}〈",9,"+-±×"], +["8180","÷=≠<>≦≧∞∴♂♀°′″℃¥$¢£%#&*@§☆★○●◎◇◆□■△▲▽▼※〒→←↑↓〓"], +["81b8","∈∋⊆⊇⊂⊃∪∩"], +["81c8","∧∨¬⇒⇔∀∃"], +["81da","∠⊥⌒∂∇≡≒≪≫√∽∝∵∫∬"], +["81f0","ʼn♯♭♪†‡¶"], +["81fc","◯"], +["824f","0",9], +["8260","A",25], +["8281","a",25], +["829f","ぁ",82], +["8340","ァ",62], +["8380","ム",22], +["839f","Α",16,"Σ",6], +["83bf","α",16,"σ",6], +["8440","А",5,"ЁЖ",25], +["8470","а",5,"ёж",7], +["8480","о",17], +["849f","─│┌┐┘└├┬┤┴┼━┃┏┓┛┗┣┳┫┻╋┠┯┨┷┿┝┰┥┸╂"], +["8740","①",19,"Ⅰ",9], +["875f","㍉㌔㌢㍍㌘㌧㌃㌶㍑㍗㌍㌦㌣㌫㍊㌻㎜㎝㎞㎎㎏㏄㎡"], +["877e","㍻"], +["8780","〝〟№㏍℡㊤",4,"㈱㈲㈹㍾㍽㍼≒≡∫∮∑√⊥∠∟⊿∵∩∪"], +["889f","亜唖娃阿哀愛挨姶逢葵茜穐悪握渥旭葦芦鯵梓圧斡扱宛姐虻飴絢綾鮎或粟袷安庵按暗案闇鞍杏以伊位依偉囲夷委威尉惟意慰易椅為畏異移維緯胃萎衣謂違遺医井亥域育郁磯一壱溢逸稲茨芋鰯允印咽員因姻引飲淫胤蔭"], +["8940","院陰隠韻吋右宇烏羽迂雨卯鵜窺丑碓臼渦嘘唄欝蔚鰻姥厩浦瓜閏噂云運雲荏餌叡営嬰影映曳栄永泳洩瑛盈穎頴英衛詠鋭液疫益駅悦謁越閲榎厭円"], +["8980","園堰奄宴延怨掩援沿演炎焔煙燕猿縁艶苑薗遠鉛鴛塩於汚甥凹央奥往応押旺横欧殴王翁襖鴬鴎黄岡沖荻億屋憶臆桶牡乙俺卸恩温穏音下化仮何伽価佳加可嘉夏嫁家寡科暇果架歌河火珂禍禾稼箇花苛茄荷華菓蝦課嘩貨迦過霞蚊俄峨我牙画臥芽蛾賀雅餓駕介会解回塊壊廻快怪悔恢懐戒拐改"], +["8a40","魁晦械海灰界皆絵芥蟹開階貝凱劾外咳害崖慨概涯碍蓋街該鎧骸浬馨蛙垣柿蛎鈎劃嚇各廓拡撹格核殻獲確穫覚角赫較郭閣隔革学岳楽額顎掛笠樫"], +["8a80","橿梶鰍潟割喝恰括活渇滑葛褐轄且鰹叶椛樺鞄株兜竃蒲釜鎌噛鴨栢茅萱粥刈苅瓦乾侃冠寒刊勘勧巻喚堪姦完官寛干幹患感慣憾換敢柑桓棺款歓汗漢澗潅環甘監看竿管簡緩缶翰肝艦莞観諌貫還鑑間閑関陥韓館舘丸含岸巌玩癌眼岩翫贋雁頑顔願企伎危喜器基奇嬉寄岐希幾忌揮机旗既期棋棄"], +["8b40","機帰毅気汽畿祈季稀紀徽規記貴起軌輝飢騎鬼亀偽儀妓宜戯技擬欺犠疑祇義蟻誼議掬菊鞠吉吃喫桔橘詰砧杵黍却客脚虐逆丘久仇休及吸宮弓急救"], +["8b80","朽求汲泣灸球究窮笈級糾給旧牛去居巨拒拠挙渠虚許距鋸漁禦魚亨享京供侠僑兇競共凶協匡卿叫喬境峡強彊怯恐恭挟教橋況狂狭矯胸脅興蕎郷鏡響饗驚仰凝尭暁業局曲極玉桐粁僅勤均巾錦斤欣欽琴禁禽筋緊芹菌衿襟謹近金吟銀九倶句区狗玖矩苦躯駆駈駒具愚虞喰空偶寓遇隅串櫛釧屑屈"], +["8c40","掘窟沓靴轡窪熊隈粂栗繰桑鍬勲君薫訓群軍郡卦袈祁係傾刑兄啓圭珪型契形径恵慶慧憩掲携敬景桂渓畦稽系経継繋罫茎荊蛍計詣警軽頚鶏芸迎鯨"], +["8c80","劇戟撃激隙桁傑欠決潔穴結血訣月件倹倦健兼券剣喧圏堅嫌建憲懸拳捲検権牽犬献研硯絹県肩見謙賢軒遣鍵険顕験鹸元原厳幻弦減源玄現絃舷言諺限乎個古呼固姑孤己庫弧戸故枯湖狐糊袴股胡菰虎誇跨鈷雇顧鼓五互伍午呉吾娯後御悟梧檎瑚碁語誤護醐乞鯉交佼侯候倖光公功効勾厚口向"], +["8d40","后喉坑垢好孔孝宏工巧巷幸広庚康弘恒慌抗拘控攻昂晃更杭校梗構江洪浩港溝甲皇硬稿糠紅紘絞綱耕考肯肱腔膏航荒行衡講貢購郊酵鉱砿鋼閤降"], 
+["8d80","項香高鴻剛劫号合壕拷濠豪轟麹克刻告国穀酷鵠黒獄漉腰甑忽惚骨狛込此頃今困坤墾婚恨懇昏昆根梱混痕紺艮魂些佐叉唆嵯左差査沙瑳砂詐鎖裟坐座挫債催再最哉塞妻宰彩才採栽歳済災采犀砕砦祭斎細菜裁載際剤在材罪財冴坂阪堺榊肴咲崎埼碕鷺作削咋搾昨朔柵窄策索錯桜鮭笹匙冊刷"], +["8e40","察拶撮擦札殺薩雑皐鯖捌錆鮫皿晒三傘参山惨撒散桟燦珊産算纂蚕讃賛酸餐斬暫残仕仔伺使刺司史嗣四士始姉姿子屍市師志思指支孜斯施旨枝止"], +["8e80","死氏獅祉私糸紙紫肢脂至視詞詩試誌諮資賜雌飼歯事似侍児字寺慈持時次滋治爾璽痔磁示而耳自蒔辞汐鹿式識鴫竺軸宍雫七叱執失嫉室悉湿漆疾質実蔀篠偲柴芝屡蕊縞舎写射捨赦斜煮社紗者謝車遮蛇邪借勺尺杓灼爵酌釈錫若寂弱惹主取守手朱殊狩珠種腫趣酒首儒受呪寿授樹綬需囚収周"], +["8f40","宗就州修愁拾洲秀秋終繍習臭舟蒐衆襲讐蹴輯週酋酬集醜什住充十従戎柔汁渋獣縦重銃叔夙宿淑祝縮粛塾熟出術述俊峻春瞬竣舜駿准循旬楯殉淳"], +["8f80","準潤盾純巡遵醇順処初所暑曙渚庶緒署書薯藷諸助叙女序徐恕鋤除傷償勝匠升召哨商唱嘗奨妾娼宵将小少尚庄床廠彰承抄招掌捷昇昌昭晶松梢樟樵沼消渉湘焼焦照症省硝礁祥称章笑粧紹肖菖蒋蕉衝裳訟証詔詳象賞醤鉦鍾鐘障鞘上丈丞乗冗剰城場壌嬢常情擾条杖浄状畳穣蒸譲醸錠嘱埴飾"], +["9040","拭植殖燭織職色触食蝕辱尻伸信侵唇娠寝審心慎振新晋森榛浸深申疹真神秦紳臣芯薪親診身辛進針震人仁刃塵壬尋甚尽腎訊迅陣靭笥諏須酢図厨"], +["9080","逗吹垂帥推水炊睡粋翠衰遂酔錐錘随瑞髄崇嵩数枢趨雛据杉椙菅頗雀裾澄摺寸世瀬畝是凄制勢姓征性成政整星晴棲栖正清牲生盛精聖声製西誠誓請逝醒青静斉税脆隻席惜戚斥昔析石積籍績脊責赤跡蹟碩切拙接摂折設窃節説雪絶舌蝉仙先千占宣専尖川戦扇撰栓栴泉浅洗染潜煎煽旋穿箭線"], +["9140","繊羨腺舛船薦詮賎践選遷銭銑閃鮮前善漸然全禅繕膳糎噌塑岨措曾曽楚狙疏疎礎祖租粗素組蘇訴阻遡鼠僧創双叢倉喪壮奏爽宋層匝惣想捜掃挿掻"], +["9180","操早曹巣槍槽漕燥争痩相窓糟総綜聡草荘葬蒼藻装走送遭鎗霜騒像増憎臓蔵贈造促側則即息捉束測足速俗属賊族続卒袖其揃存孫尊損村遜他多太汰詑唾堕妥惰打柁舵楕陀駄騨体堆対耐岱帯待怠態戴替泰滞胎腿苔袋貸退逮隊黛鯛代台大第醍題鷹滝瀧卓啄宅托択拓沢濯琢託鐸濁諾茸凧蛸只"], +["9240","叩但達辰奪脱巽竪辿棚谷狸鱈樽誰丹単嘆坦担探旦歎淡湛炭短端箪綻耽胆蛋誕鍛団壇弾断暖檀段男談値知地弛恥智池痴稚置致蜘遅馳築畜竹筑蓄"], +["9280","逐秩窒茶嫡着中仲宙忠抽昼柱注虫衷註酎鋳駐樗瀦猪苧著貯丁兆凋喋寵帖帳庁弔張彫徴懲挑暢朝潮牒町眺聴脹腸蝶調諜超跳銚長頂鳥勅捗直朕沈珍賃鎮陳津墜椎槌追鎚痛通塚栂掴槻佃漬柘辻蔦綴鍔椿潰坪壷嬬紬爪吊釣鶴亭低停偵剃貞呈堤定帝底庭廷弟悌抵挺提梯汀碇禎程締艇訂諦蹄逓"], +["9340","邸鄭釘鼎泥摘擢敵滴的笛適鏑溺哲徹撤轍迭鉄典填天展店添纏甜貼転顛点伝殿澱田電兎吐堵塗妬屠徒斗杜渡登菟賭途都鍍砥砺努度土奴怒倒党冬"], +["9380","凍刀唐塔塘套宕島嶋悼投搭東桃梼棟盗淘湯涛灯燈当痘祷等答筒糖統到董蕩藤討謄豆踏逃透鐙陶頭騰闘働動同堂導憧撞洞瞳童胴萄道銅峠鴇匿得徳涜特督禿篤毒独読栃橡凸突椴届鳶苫寅酉瀞噸屯惇敦沌豚遁頓呑曇鈍奈那内乍凪薙謎灘捺鍋楢馴縄畷南楠軟難汝二尼弐迩匂賑肉虹廿日乳入"], +["9440","如尿韮任妊忍認濡禰祢寧葱猫熱年念捻撚燃粘乃廼之埜嚢悩濃納能脳膿農覗蚤巴把播覇杷波派琶破婆罵芭馬俳廃拝排敗杯盃牌背肺輩配倍培媒梅"], +["9480","楳煤狽買売賠陪這蝿秤矧萩伯剥博拍柏泊白箔粕舶薄迫曝漠爆縛莫駁麦函箱硲箸肇筈櫨幡肌畑畠八鉢溌発醗髪伐罰抜筏閥鳩噺塙蛤隼伴判半反叛帆搬斑板氾汎版犯班畔繁般藩販範釆煩頒飯挽晩番盤磐蕃蛮匪卑否妃庇彼悲扉批披斐比泌疲皮碑秘緋罷肥被誹費避非飛樋簸備尾微枇毘琵眉美"], +["9540","鼻柊稗匹疋髭彦膝菱肘弼必畢筆逼桧姫媛紐百謬俵彪標氷漂瓢票表評豹廟描病秒苗錨鋲蒜蛭鰭品彬斌浜瀕貧賓頻敏瓶不付埠夫婦富冨布府怖扶敷"], +["9580","斧普浮父符腐膚芙譜負賦赴阜附侮撫武舞葡蕪部封楓風葺蕗伏副復幅服福腹複覆淵弗払沸仏物鮒分吻噴墳憤扮焚奮粉糞紛雰文聞丙併兵塀幣平弊柄並蔽閉陛米頁僻壁癖碧別瞥蔑箆偏変片篇編辺返遍便勉娩弁鞭保舗鋪圃捕歩甫補輔穂募墓慕戊暮母簿菩倣俸包呆報奉宝峰峯崩庖抱捧放方朋"], +["9640","法泡烹砲縫胞芳萌蓬蜂褒訪豊邦鋒飽鳳鵬乏亡傍剖坊妨帽忘忙房暴望某棒冒紡肪膨謀貌貿鉾防吠頬北僕卜墨撲朴牧睦穆釦勃没殆堀幌奔本翻凡盆"], +["9680","摩磨魔麻埋妹昧枚毎哩槙幕膜枕鮪柾鱒桝亦俣又抹末沫迄侭繭麿万慢満漫蔓味未魅巳箕岬密蜜湊蓑稔脈妙粍民眠務夢無牟矛霧鵡椋婿娘冥名命明盟迷銘鳴姪牝滅免棉綿緬面麺摸模茂妄孟毛猛盲網耗蒙儲木黙目杢勿餅尤戻籾貰問悶紋門匁也冶夜爺耶野弥矢厄役約薬訳躍靖柳薮鑓愉愈油癒"], +["9740","諭輸唯佑優勇友宥幽悠憂揖有柚湧涌猶猷由祐裕誘遊邑郵雄融夕予余与誉輿預傭幼妖容庸揚揺擁曜楊様洋溶熔用窯羊耀葉蓉要謡踊遥陽養慾抑欲"], +["9780","沃浴翌翼淀羅螺裸来莱頼雷洛絡落酪乱卵嵐欄濫藍蘭覧利吏履李梨理璃痢裏裡里離陸律率立葎掠略劉流溜琉留硫粒隆竜龍侶慮旅虜了亮僚両凌寮料梁涼猟療瞭稜糧良諒遼量陵領力緑倫厘林淋燐琳臨輪隣鱗麟瑠塁涙累類令伶例冷励嶺怜玲礼苓鈴隷零霊麗齢暦歴列劣烈裂廉恋憐漣煉簾練聯"], +["9840","蓮連錬呂魯櫓炉賂路露労婁廊弄朗楼榔浪漏牢狼篭老聾蝋郎六麓禄肋録論倭和話歪賄脇惑枠鷲亙亘鰐詫藁蕨椀湾碗腕"], +["989f","弌丐丕个丱丶丼丿乂乖乘亂亅豫亊舒弍于亞亟亠亢亰亳亶从仍仄仆仂仗仞仭仟价伉佚估佛佝佗佇佶侈侏侘佻佩佰侑佯來侖儘俔俟俎俘俛俑俚俐俤俥倚倨倔倪倥倅伜俶倡倩倬俾俯們倆偃假會偕偐偈做偖偬偸傀傚傅傴傲"], +["9940","僉僊傳僂僖僞僥僭僣僮價僵儉儁儂儖儕儔儚儡儺儷儼儻儿兀兒兌兔兢竸兩兪兮冀冂囘册冉冏冑冓冕冖冤冦冢冩冪冫决冱冲冰况冽凅凉凛几處凩凭"], +["9980","凰凵凾刄刋刔刎刧刪刮刳刹剏剄剋剌剞剔剪剴剩剳剿剽劍劔劒剱劈劑辨辧劬劭劼劵勁勍勗勞勣勦飭勠勳勵勸勹匆匈甸匍匐匏匕匚匣匯匱匳匸區卆卅丗卉卍凖卞卩卮夘卻卷厂厖厠厦厥厮厰厶參簒雙叟曼燮叮叨叭叺吁吽呀听吭吼吮吶吩吝呎咏呵咎呟呱呷呰咒呻咀呶咄咐咆哇咢咸咥咬哄哈咨"], +["9a40","咫哂咤咾咼哘哥哦唏唔哽哮哭哺哢唹啀啣啌售啜啅啖啗唸唳啝喙喀咯喊喟啻啾喘喞單啼喃喩喇喨嗚嗅嗟嗄嗜嗤嗔嘔嗷嘖嗾嗽嘛嗹噎噐營嘴嘶嘲嘸"], +["9a80","噫噤嘯噬噪嚆嚀嚊嚠嚔嚏嚥嚮嚶嚴囂嚼囁囃囀囈囎囑囓囗囮囹圀囿圄圉圈國圍圓團圖嗇圜圦圷圸坎圻址坏坩埀垈坡坿垉垓垠垳垤垪垰埃埆埔埒埓堊埖埣堋堙堝塲堡塢塋塰毀塒堽塹墅墹墟墫墺壞墻墸墮壅壓壑壗壙壘壥壜壤壟壯壺壹壻壼壽夂夊夐夛梦夥夬夭夲夸夾竒奕奐奎奚奘奢奠奧奬奩"], +["9b40","奸妁妝佞侫妣妲姆姨姜妍姙姚娥娟娑娜娉娚婀婬婉娵娶婢婪媚媼媾嫋嫂媽嫣嫗嫦嫩嫖嫺嫻嬌嬋嬖嬲嫐嬪嬶嬾孃孅孀孑孕孚孛孥孩孰孳孵學斈孺宀"], +["9b80","它宦宸寃寇寉寔寐寤實寢寞寥寫寰寶寳尅將專對尓尠尢尨尸尹屁屆屎屓屐屏孱屬屮乢屶屹岌岑岔妛岫岻岶岼岷峅岾峇峙峩峽峺峭嶌峪崋崕崗嵜崟崛崑崔崢崚崙崘嵌嵒嵎嵋嵬嵳嵶嶇嶄嶂嶢嶝嶬嶮嶽嶐嶷嶼巉巍巓巒巖巛巫已巵帋帚帙帑帛帶帷幄幃幀幎幗幔幟幢幤幇幵并幺麼广庠廁廂廈廐廏"], +["9c40","廖廣廝廚廛廢廡廨廩廬廱廳廰廴廸廾弃弉彝彜弋弑弖弩弭弸彁彈彌彎弯彑彖彗彙彡彭彳彷徃徂彿徊很徑徇從徙徘徠徨徭徼忖忻忤忸忱忝悳忿怡恠"], +["9c80","怙怐怩怎怱怛怕怫怦怏怺恚恁恪恷恟恊恆恍恣恃恤恂恬恫恙悁悍惧悃悚悄悛悖悗悒悧悋惡悸惠惓悴忰悽惆悵惘慍愕愆惶惷愀惴惺愃愡惻惱愍愎慇愾愨愧慊愿愼愬愴愽慂慄慳慷慘慙慚慫慴慯慥慱慟慝慓慵憙憖憇憬憔憚憊憑憫憮懌懊應懷懈懃懆憺懋罹懍懦懣懶懺懴懿懽懼懾戀戈戉戍戌戔戛"], +["9d40","戞戡截戮戰戲戳扁扎扞扣扛扠扨扼抂抉找抒抓抖拔抃抔拗拑抻拏拿拆擔拈拜拌拊拂拇抛拉挌拮拱挧挂挈拯拵捐挾捍搜捏掖掎掀掫捶掣掏掉掟掵捫"], +["9d80","捩掾揩揀揆揣揉插揶揄搖搴搆搓搦搶攝搗搨搏摧摯摶摎攪撕撓撥撩撈撼據擒擅擇撻擘擂擱擧舉擠擡抬擣擯攬擶擴擲擺攀擽攘攜攅攤攣攫攴攵攷收攸畋效敖敕敍敘敞敝敲數斂斃變斛斟斫斷旃旆旁旄旌旒旛旙无旡旱杲昊昃旻杳昵昶昴昜晏晄晉晁晞晝晤晧晨晟晢晰暃暈暎暉暄暘暝曁暹曉暾暼"], 
+["9e40","曄暸曖曚曠昿曦曩曰曵曷朏朖朞朦朧霸朮朿朶杁朸朷杆杞杠杙杣杤枉杰枩杼杪枌枋枦枡枅枷柯枴柬枳柩枸柤柞柝柢柮枹柎柆柧檜栞框栩桀桍栲桎"], +["9e80","梳栫桙档桷桿梟梏梭梔條梛梃檮梹桴梵梠梺椏梍桾椁棊椈棘椢椦棡椌棍棔棧棕椶椒椄棗棣椥棹棠棯椨椪椚椣椡棆楹楷楜楸楫楔楾楮椹楴椽楙椰楡楞楝榁楪榲榮槐榿槁槓榾槎寨槊槝榻槃榧樮榑榠榜榕榴槞槨樂樛槿權槹槲槧樅榱樞槭樔槫樊樒櫁樣樓橄樌橲樶橸橇橢橙橦橈樸樢檐檍檠檄檢檣"], +["9f40","檗蘗檻櫃櫂檸檳檬櫞櫑櫟檪櫚櫪櫻欅蘖櫺欒欖鬱欟欸欷盜欹飮歇歃歉歐歙歔歛歟歡歸歹歿殀殄殃殍殘殕殞殤殪殫殯殲殱殳殷殼毆毋毓毟毬毫毳毯"], +["9f80","麾氈氓气氛氤氣汞汕汢汪沂沍沚沁沛汾汨汳沒沐泄泱泓沽泗泅泝沮沱沾沺泛泯泙泪洟衍洶洫洽洸洙洵洳洒洌浣涓浤浚浹浙涎涕濤涅淹渕渊涵淇淦涸淆淬淞淌淨淒淅淺淙淤淕淪淮渭湮渮渙湲湟渾渣湫渫湶湍渟湃渺湎渤滿渝游溂溪溘滉溷滓溽溯滄溲滔滕溏溥滂溟潁漑灌滬滸滾漿滲漱滯漲滌"], +["e040","漾漓滷澆潺潸澁澀潯潛濳潭澂潼潘澎澑濂潦澳澣澡澤澹濆澪濟濕濬濔濘濱濮濛瀉瀋濺瀑瀁瀏濾瀛瀚潴瀝瀘瀟瀰瀾瀲灑灣炙炒炯烱炬炸炳炮烟烋烝"], +["e080","烙焉烽焜焙煥煕熈煦煢煌煖煬熏燻熄熕熨熬燗熹熾燒燉燔燎燠燬燧燵燼燹燿爍爐爛爨爭爬爰爲爻爼爿牀牆牋牘牴牾犂犁犇犒犖犢犧犹犲狃狆狄狎狒狢狠狡狹狷倏猗猊猜猖猝猴猯猩猥猾獎獏默獗獪獨獰獸獵獻獺珈玳珎玻珀珥珮珞璢琅瑯琥珸琲琺瑕琿瑟瑙瑁瑜瑩瑰瑣瑪瑶瑾璋璞璧瓊瓏瓔珱"], +["e140","瓠瓣瓧瓩瓮瓲瓰瓱瓸瓷甄甃甅甌甎甍甕甓甞甦甬甼畄畍畊畉畛畆畚畩畤畧畫畭畸當疆疇畴疊疉疂疔疚疝疥疣痂疳痃疵疽疸疼疱痍痊痒痙痣痞痾痿"], +["e180","痼瘁痰痺痲痳瘋瘍瘉瘟瘧瘠瘡瘢瘤瘴瘰瘻癇癈癆癜癘癡癢癨癩癪癧癬癰癲癶癸發皀皃皈皋皎皖皓皙皚皰皴皸皹皺盂盍盖盒盞盡盥盧盪蘯盻眈眇眄眩眤眞眥眦眛眷眸睇睚睨睫睛睥睿睾睹瞎瞋瞑瞠瞞瞰瞶瞹瞿瞼瞽瞻矇矍矗矚矜矣矮矼砌砒礦砠礪硅碎硴碆硼碚碌碣碵碪碯磑磆磋磔碾碼磅磊磬"], +["e240","磧磚磽磴礇礒礑礙礬礫祀祠祗祟祚祕祓祺祿禊禝禧齋禪禮禳禹禺秉秕秧秬秡秣稈稍稘稙稠稟禀稱稻稾稷穃穗穉穡穢穩龝穰穹穽窈窗窕窘窖窩竈窰"], +["e280","窶竅竄窿邃竇竊竍竏竕竓站竚竝竡竢竦竭竰笂笏笊笆笳笘笙笞笵笨笶筐筺笄筍笋筌筅筵筥筴筧筰筱筬筮箝箘箟箍箜箚箋箒箏筝箙篋篁篌篏箴篆篝篩簑簔篦篥籠簀簇簓篳篷簗簍篶簣簧簪簟簷簫簽籌籃籔籏籀籐籘籟籤籖籥籬籵粃粐粤粭粢粫粡粨粳粲粱粮粹粽糀糅糂糘糒糜糢鬻糯糲糴糶糺紆"], +["e340","紂紜紕紊絅絋紮紲紿紵絆絳絖絎絲絨絮絏絣經綉絛綏絽綛綺綮綣綵緇綽綫總綢綯緜綸綟綰緘緝緤緞緻緲緡縅縊縣縡縒縱縟縉縋縢繆繦縻縵縹繃縷"], +["e380","縲縺繧繝繖繞繙繚繹繪繩繼繻纃緕繽辮繿纈纉續纒纐纓纔纖纎纛纜缸缺罅罌罍罎罐网罕罔罘罟罠罨罩罧罸羂羆羃羈羇羌羔羞羝羚羣羯羲羹羮羶羸譱翅翆翊翕翔翡翦翩翳翹飜耆耄耋耒耘耙耜耡耨耿耻聊聆聒聘聚聟聢聨聳聲聰聶聹聽聿肄肆肅肛肓肚肭冐肬胛胥胙胝胄胚胖脉胯胱脛脩脣脯腋"], +["e440","隋腆脾腓腑胼腱腮腥腦腴膃膈膊膀膂膠膕膤膣腟膓膩膰膵膾膸膽臀臂膺臉臍臑臙臘臈臚臟臠臧臺臻臾舁舂舅與舊舍舐舖舩舫舸舳艀艙艘艝艚艟艤"], +["e480","艢艨艪艫舮艱艷艸艾芍芒芫芟芻芬苡苣苟苒苴苳苺莓范苻苹苞茆苜茉苙茵茴茖茲茱荀茹荐荅茯茫茗茘莅莚莪莟莢莖茣莎莇莊荼莵荳荵莠莉莨菴萓菫菎菽萃菘萋菁菷萇菠菲萍萢萠莽萸蔆菻葭萪萼蕚蒄葷葫蒭葮蒂葩葆萬葯葹萵蓊葢蒹蒿蒟蓙蓍蒻蓚蓐蓁蓆蓖蒡蔡蓿蓴蔗蔘蔬蔟蔕蔔蓼蕀蕣蕘蕈"], +["e540","蕁蘂蕋蕕薀薤薈薑薊薨蕭薔薛藪薇薜蕷蕾薐藉薺藏薹藐藕藝藥藜藹蘊蘓蘋藾藺蘆蘢蘚蘰蘿虍乕虔號虧虱蚓蚣蚩蚪蚋蚌蚶蚯蛄蛆蚰蛉蠣蚫蛔蛞蛩蛬"], +["e580","蛟蛛蛯蜒蜆蜈蜀蜃蛻蜑蜉蜍蛹蜊蜴蜿蜷蜻蜥蜩蜚蝠蝟蝸蝌蝎蝴蝗蝨蝮蝙蝓蝣蝪蠅螢螟螂螯蟋螽蟀蟐雖螫蟄螳蟇蟆螻蟯蟲蟠蠏蠍蟾蟶蟷蠎蟒蠑蠖蠕蠢蠡蠱蠶蠹蠧蠻衄衂衒衙衞衢衫袁衾袞衵衽袵衲袂袗袒袮袙袢袍袤袰袿袱裃裄裔裘裙裝裹褂裼裴裨裲褄褌褊褓襃褞褥褪褫襁襄褻褶褸襌褝襠襞"], +["e640","襦襤襭襪襯襴襷襾覃覈覊覓覘覡覩覦覬覯覲覺覽覿觀觚觜觝觧觴觸訃訖訐訌訛訝訥訶詁詛詒詆詈詼詭詬詢誅誂誄誨誡誑誥誦誚誣諄諍諂諚諫諳諧"], +["e680","諤諱謔諠諢諷諞諛謌謇謚諡謖謐謗謠謳鞫謦謫謾謨譁譌譏譎證譖譛譚譫譟譬譯譴譽讀讌讎讒讓讖讙讚谺豁谿豈豌豎豐豕豢豬豸豺貂貉貅貊貍貎貔豼貘戝貭貪貽貲貳貮貶賈賁賤賣賚賽賺賻贄贅贊贇贏贍贐齎贓賍贔贖赧赭赱赳趁趙跂趾趺跏跚跖跌跛跋跪跫跟跣跼踈踉跿踝踞踐踟蹂踵踰踴蹊"], +["e740","蹇蹉蹌蹐蹈蹙蹤蹠踪蹣蹕蹶蹲蹼躁躇躅躄躋躊躓躑躔躙躪躡躬躰軆躱躾軅軈軋軛軣軼軻軫軾輊輅輕輒輙輓輜輟輛輌輦輳輻輹轅轂輾轌轉轆轎轗轜"], +["e780","轢轣轤辜辟辣辭辯辷迚迥迢迪迯邇迴逅迹迺逑逕逡逍逞逖逋逧逶逵逹迸遏遐遑遒逎遉逾遖遘遞遨遯遶隨遲邂遽邁邀邊邉邏邨邯邱邵郢郤扈郛鄂鄒鄙鄲鄰酊酖酘酣酥酩酳酲醋醉醂醢醫醯醪醵醴醺釀釁釉釋釐釖釟釡釛釼釵釶鈞釿鈔鈬鈕鈑鉞鉗鉅鉉鉤鉈銕鈿鉋鉐銜銖銓銛鉚鋏銹銷鋩錏鋺鍄錮"], +["e840","錙錢錚錣錺錵錻鍜鍠鍼鍮鍖鎰鎬鎭鎔鎹鏖鏗鏨鏥鏘鏃鏝鏐鏈鏤鐚鐔鐓鐃鐇鐐鐶鐫鐵鐡鐺鑁鑒鑄鑛鑠鑢鑞鑪鈩鑰鑵鑷鑽鑚鑼鑾钁鑿閂閇閊閔閖閘閙"], +["e880","閠閨閧閭閼閻閹閾闊濶闃闍闌闕闔闖關闡闥闢阡阨阮阯陂陌陏陋陷陜陞陝陟陦陲陬隍隘隕隗險隧隱隲隰隴隶隸隹雎雋雉雍襍雜霍雕雹霄霆霈霓霎霑霏霖霙霤霪霰霹霽霾靄靆靈靂靉靜靠靤靦靨勒靫靱靹鞅靼鞁靺鞆鞋鞏鞐鞜鞨鞦鞣鞳鞴韃韆韈韋韜韭齏韲竟韶韵頏頌頸頤頡頷頽顆顏顋顫顯顰"], +["e940","顱顴顳颪颯颱颶飄飃飆飩飫餃餉餒餔餘餡餝餞餤餠餬餮餽餾饂饉饅饐饋饑饒饌饕馗馘馥馭馮馼駟駛駝駘駑駭駮駱駲駻駸騁騏騅駢騙騫騷驅驂驀驃"], +["e980","騾驕驍驛驗驟驢驥驤驩驫驪骭骰骼髀髏髑髓體髞髟髢髣髦髯髫髮髴髱髷髻鬆鬘鬚鬟鬢鬣鬥鬧鬨鬩鬪鬮鬯鬲魄魃魏魍魎魑魘魴鮓鮃鮑鮖鮗鮟鮠鮨鮴鯀鯊鮹鯆鯏鯑鯒鯣鯢鯤鯔鯡鰺鯲鯱鯰鰕鰔鰉鰓鰌鰆鰈鰒鰊鰄鰮鰛鰥鰤鰡鰰鱇鰲鱆鰾鱚鱠鱧鱶鱸鳧鳬鳰鴉鴈鳫鴃鴆鴪鴦鶯鴣鴟鵄鴕鴒鵁鴿鴾鵆鵈"], +["ea40","鵝鵞鵤鵑鵐鵙鵲鶉鶇鶫鵯鵺鶚鶤鶩鶲鷄鷁鶻鶸鶺鷆鷏鷂鷙鷓鷸鷦鷭鷯鷽鸚鸛鸞鹵鹹鹽麁麈麋麌麒麕麑麝麥麩麸麪麭靡黌黎黏黐黔黜點黝黠黥黨黯"], +["ea80","黴黶黷黹黻黼黽鼇鼈皷鼕鼡鼬鼾齊齒齔齣齟齠齡齦齧齬齪齷齲齶龕龜龠堯槇遙瑤凜熙"], +["ed40","纊褜鍈銈蓜俉炻昱棈鋹曻彅丨仡仼伀伃伹佖侒侊侚侔俍偀倢俿倞偆偰偂傔僴僘兊兤冝冾凬刕劜劦勀勛匀匇匤卲厓厲叝﨎咜咊咩哿喆坙坥垬埈埇﨏"], +["ed80","塚增墲夋奓奛奝奣妤妺孖寀甯寘寬尞岦岺峵崧嵓﨑嵂嵭嶸嶹巐弡弴彧德忞恝悅悊惞惕愠惲愑愷愰憘戓抦揵摠撝擎敎昀昕昻昉昮昞昤晥晗晙晴晳暙暠暲暿曺朎朗杦枻桒柀栁桄棏﨓楨﨔榘槢樰橫橆橳橾櫢櫤毖氿汜沆汯泚洄涇浯涖涬淏淸淲淼渹湜渧渼溿澈澵濵瀅瀇瀨炅炫焏焄煜煆煇凞燁燾犱"], +["ee40","犾猤猪獷玽珉珖珣珒琇珵琦琪琩琮瑢璉璟甁畯皂皜皞皛皦益睆劯砡硎硤硺礰礼神祥禔福禛竑竧靖竫箞精絈絜綷綠緖繒罇羡羽茁荢荿菇菶葈蒴蕓蕙"], +["ee80","蕫﨟薰蘒﨡蠇裵訒訷詹誧誾諟諸諶譓譿賰賴贒赶﨣軏﨤逸遧郞都鄕鄧釚釗釞釭釮釤釥鈆鈐鈊鈺鉀鈼鉎鉙鉑鈹鉧銧鉷鉸鋧鋗鋙鋐﨧鋕鋠鋓錥錡鋻﨨錞鋿錝錂鍰鍗鎤鏆鏞鏸鐱鑅鑈閒隆﨩隝隯霳霻靃靍靏靑靕顗顥飯飼餧館馞驎髙髜魵魲鮏鮱鮻鰀鵰鵫鶴鸙黑"], +["eeef","ⅰ",9,"¬¦'""], +["f040","",62], +["f080","",124], +["f140","",62], +["f180","",124], +["f240","",62], +["f280","",124], +["f340","",62], +["f380","",124], +["f440","",62], +["f480","",124], +["f540","",62], +["f580","",124], +["f640","",62], +["f680","",124], +["f740","",62], +["f780","",124], +["f840","",62], +["f880","",124], +["f940",""], 
+["fa40","ⅰ",9,"Ⅰ",9,"¬¦'"㈱№℡∵纊褜鍈銈蓜俉炻昱棈鋹曻彅丨仡仼伀伃伹佖侒侊侚侔俍偀倢俿倞偆偰偂傔僴僘兊"], +["fa80","兤冝冾凬刕劜劦勀勛匀匇匤卲厓厲叝﨎咜咊咩哿喆坙坥垬埈埇﨏塚增墲夋奓奛奝奣妤妺孖寀甯寘寬尞岦岺峵崧嵓﨑嵂嵭嶸嶹巐弡弴彧德忞恝悅悊惞惕愠惲愑愷愰憘戓抦揵摠撝擎敎昀昕昻昉昮昞昤晥晗晙晴晳暙暠暲暿曺朎朗杦枻桒柀栁桄棏﨓楨﨔榘槢樰橫橆橳橾櫢櫤毖氿汜沆汯泚洄涇浯"], +["fb40","涖涬淏淸淲淼渹湜渧渼溿澈澵濵瀅瀇瀨炅炫焏焄煜煆煇凞燁燾犱犾猤猪獷玽珉珖珣珒琇珵琦琪琩琮瑢璉璟甁畯皂皜皞皛皦益睆劯砡硎硤硺礰礼神"], +["fb80","祥禔福禛竑竧靖竫箞精絈絜綷綠緖繒罇羡羽茁荢荿菇菶葈蒴蕓蕙蕫﨟薰蘒﨡蠇裵訒訷詹誧誾諟諸諶譓譿賰賴贒赶﨣軏﨤逸遧郞都鄕鄧釚釗釞釭釮釤釥鈆鈐鈊鈺鉀鈼鉎鉙鉑鈹鉧銧鉷鉸鋧鋗鋙鋐﨧鋕鋠鋓錥錡鋻﨨錞鋿錝錂鍰鍗鎤鏆鏞鏸鐱鑅鑈閒隆﨩隝隯霳霻靃靍靏靑靕顗顥飯飼餧館馞驎髙"], +["fc40","髜魵魲鮏鮱鮻鰀鵰鵫鶴鸙黑"] +] diff --git a/node_modules/iconv-lite/encodings/utf16.js b/node_modules/iconv-lite/encodings/utf16.js new file mode 100644 index 000000000..54765aeee --- /dev/null +++ b/node_modules/iconv-lite/encodings/utf16.js @@ -0,0 +1,177 @@ +"use strict"; +var Buffer = require("safer-buffer").Buffer; + +// Note: UTF16-LE (or UCS2) codec is Node.js native. See encodings/internal.js + +// == UTF16-BE codec. ========================================================== + +exports.utf16be = Utf16BECodec; +function Utf16BECodec() { +} + +Utf16BECodec.prototype.encoder = Utf16BEEncoder; +Utf16BECodec.prototype.decoder = Utf16BEDecoder; +Utf16BECodec.prototype.bomAware = true; + + +// -- Encoding + +function Utf16BEEncoder() { +} + +Utf16BEEncoder.prototype.write = function(str) { + var buf = Buffer.from(str, 'ucs2'); + for (var i = 0; i < buf.length; i += 2) { + var tmp = buf[i]; buf[i] = buf[i+1]; buf[i+1] = tmp; + } + return buf; +} + +Utf16BEEncoder.prototype.end = function() { +} + + +// -- Decoding + +function Utf16BEDecoder() { + this.overflowByte = -1; +} + +Utf16BEDecoder.prototype.write = function(buf) { + if (buf.length == 0) + return ''; + + var buf2 = Buffer.alloc(buf.length + 1), + i = 0, j = 0; + + if (this.overflowByte !== -1) { + buf2[0] = buf[0]; + buf2[1] = this.overflowByte; + i = 1; j = 2; + } + + for (; i < buf.length-1; i += 2, j+= 2) { + buf2[j] = buf[i+1]; + buf2[j+1] = buf[i]; + } + + this.overflowByte = (i == buf.length-1) ? buf[buf.length-1] : -1; + + return buf2.slice(0, j).toString('ucs2'); +} + +Utf16BEDecoder.prototype.end = function() { +} + + +// == UTF-16 codec ============================================================= +// Decoder chooses automatically from UTF-16LE and UTF-16BE using BOM and space-based heuristic. +// Defaults to UTF-16LE, as it's prevalent and default in Node. +// http://en.wikipedia.org/wiki/UTF-16 and http://encoding.spec.whatwg.org/#utf-16le +// Decoder default can be changed: iconv.decode(buf, 'utf16', {defaultEncoding: 'utf-16be'}); + +// Encoder uses UTF-16LE and prepends BOM (which can be overridden with addBOM: false). + +exports.utf16 = Utf16Codec; +function Utf16Codec(codecOptions, iconv) { + this.iconv = iconv; +} + +Utf16Codec.prototype.encoder = Utf16Encoder; +Utf16Codec.prototype.decoder = Utf16Decoder; + + +// -- Encoding (pass-through) + +function Utf16Encoder(options, codec) { + options = options || {}; + if (options.addBOM === undefined) + options.addBOM = true; + this.encoder = codec.iconv.getEncoder('utf-16le', options); +} + +Utf16Encoder.prototype.write = function(str) { + return this.encoder.write(str); +} + +Utf16Encoder.prototype.end = function() { + return this.encoder.end(); +} + + +// -- Decoding + +function Utf16Decoder(options, codec) { + this.decoder = null; + this.initialBytes = []; + this.initialBytesLen = 0; + + this.options = options || {}; + this.iconv = codec.iconv; +} + +Utf16Decoder.prototype.write = function(buf) { + if (!this.decoder) { + // Codec is not chosen yet. Accumulate initial bytes. 
+ this.initialBytes.push(buf); + this.initialBytesLen += buf.length; + + if (this.initialBytesLen < 16) // We need more bytes to use space heuristic (see below) + return ''; + + // We have enough bytes -> detect endianness. + var buf = Buffer.concat(this.initialBytes), + encoding = detectEncoding(buf, this.options.defaultEncoding); + this.decoder = this.iconv.getDecoder(encoding, this.options); + this.initialBytes.length = this.initialBytesLen = 0; + } + + return this.decoder.write(buf); +} + +Utf16Decoder.prototype.end = function() { + if (!this.decoder) { + var buf = Buffer.concat(this.initialBytes), + encoding = detectEncoding(buf, this.options.defaultEncoding); + this.decoder = this.iconv.getDecoder(encoding, this.options); + + var res = this.decoder.write(buf), + trail = this.decoder.end(); + + return trail ? (res + trail) : res; + } + return this.decoder.end(); +} + +function detectEncoding(buf, defaultEncoding) { + var enc = defaultEncoding || 'utf-16le'; + + if (buf.length >= 2) { + // Check BOM. + if (buf[0] == 0xFE && buf[1] == 0xFF) // UTF-16BE BOM + enc = 'utf-16be'; + else if (buf[0] == 0xFF && buf[1] == 0xFE) // UTF-16LE BOM + enc = 'utf-16le'; + else { + // No BOM found. Try to deduce encoding from initial content. + // Most of the time, the content has ASCII chars (U+00**), but the opposite (U+**00) is uncommon. + // So, we count ASCII as if it was LE or BE, and decide from that. + var asciiCharsLE = 0, asciiCharsBE = 0, // Counts of chars in both positions + _len = Math.min(buf.length - (buf.length % 2), 64); // Len is always even. + + for (var i = 0; i < _len; i += 2) { + if (buf[i] === 0 && buf[i+1] !== 0) asciiCharsBE++; + if (buf[i] !== 0 && buf[i+1] === 0) asciiCharsLE++; + } + + if (asciiCharsBE > asciiCharsLE) + enc = 'utf-16be'; + else if (asciiCharsBE < asciiCharsLE) + enc = 'utf-16le'; + } + } + + return enc; +} + + diff --git a/node_modules/iconv-lite/encodings/utf7.js b/node_modules/iconv-lite/encodings/utf7.js new file mode 100644 index 000000000..b7631c23a --- /dev/null +++ b/node_modules/iconv-lite/encodings/utf7.js @@ -0,0 +1,290 @@ +"use strict"; +var Buffer = require("safer-buffer").Buffer; + +// UTF-7 codec, according to https://tools.ietf.org/html/rfc2152 +// See also below a UTF-7-IMAP codec, according to http://tools.ietf.org/html/rfc3501#section-5.1.3 + +exports.utf7 = Utf7Codec; +exports.unicode11utf7 = 'utf7'; // Alias UNICODE-1-1-UTF-7 +function Utf7Codec(codecOptions, iconv) { + this.iconv = iconv; +}; + +Utf7Codec.prototype.encoder = Utf7Encoder; +Utf7Codec.prototype.decoder = Utf7Decoder; +Utf7Codec.prototype.bomAware = true; + + +// -- Encoding + +var nonDirectChars = /[^A-Za-z0-9'\(\),-\.\/:\? \n\r\t]+/g; + +function Utf7Encoder(options, codec) { + this.iconv = codec.iconv; +} + +Utf7Encoder.prototype.write = function(str) { + // Naive implementation. + // Non-direct chars are encoded as "+-"; single "+" char is encoded as "+-". + return Buffer.from(str.replace(nonDirectChars, function(chunk) { + return "+" + (chunk === '+' ? 
'' : + this.iconv.encode(chunk, 'utf16-be').toString('base64').replace(/=+$/, '')) + + "-"; + }.bind(this))); +} + +Utf7Encoder.prototype.end = function() { +} + + +// -- Decoding + +function Utf7Decoder(options, codec) { + this.iconv = codec.iconv; + this.inBase64 = false; + this.base64Accum = ''; +} + +var base64Regex = /[A-Za-z0-9\/+]/; +var base64Chars = []; +for (var i = 0; i < 256; i++) + base64Chars[i] = base64Regex.test(String.fromCharCode(i)); + +var plusChar = '+'.charCodeAt(0), + minusChar = '-'.charCodeAt(0), + andChar = '&'.charCodeAt(0); + +Utf7Decoder.prototype.write = function(buf) { + var res = "", lastI = 0, + inBase64 = this.inBase64, + base64Accum = this.base64Accum; + + // The decoder is more involved as we must handle chunks in stream. + + for (var i = 0; i < buf.length; i++) { + if (!inBase64) { // We're in direct mode. + // Write direct chars until '+' + if (buf[i] == plusChar) { + res += this.iconv.decode(buf.slice(lastI, i), "ascii"); // Write direct chars. + lastI = i+1; + inBase64 = true; + } + } else { // We decode base64. + if (!base64Chars[buf[i]]) { // Base64 ended. + if (i == lastI && buf[i] == minusChar) {// "+-" -> "+" + res += "+"; + } else { + var b64str = base64Accum + buf.slice(lastI, i).toString(); + res += this.iconv.decode(Buffer.from(b64str, 'base64'), "utf16-be"); + } + + if (buf[i] != minusChar) // Minus is absorbed after base64. + i--; + + lastI = i+1; + inBase64 = false; + base64Accum = ''; + } + } + } + + if (!inBase64) { + res += this.iconv.decode(buf.slice(lastI), "ascii"); // Write direct chars. + } else { + var b64str = base64Accum + buf.slice(lastI).toString(); + + var canBeDecoded = b64str.length - (b64str.length % 8); // Minimal chunk: 2 quads -> 2x3 bytes -> 3 chars. + base64Accum = b64str.slice(canBeDecoded); // The rest will be decoded in future. + b64str = b64str.slice(0, canBeDecoded); + + res += this.iconv.decode(Buffer.from(b64str, 'base64'), "utf16-be"); + } + + this.inBase64 = inBase64; + this.base64Accum = base64Accum; + + return res; +} + +Utf7Decoder.prototype.end = function() { + var res = ""; + if (this.inBase64 && this.base64Accum.length > 0) + res = this.iconv.decode(Buffer.from(this.base64Accum, 'base64'), "utf16-be"); + + this.inBase64 = false; + this.base64Accum = ''; + return res; +} + + +// UTF-7-IMAP codec. +// RFC3501 Sec. 5.1.3 Modified UTF-7 (http://tools.ietf.org/html/rfc3501#section-5.1.3) +// Differences: +// * Base64 part is started by "&" instead of "+" +// * Direct characters are 0x20-0x7E, except "&" (0x26) +// * In Base64, "," is used instead of "/" +// * Base64 must not be used to represent direct characters. +// * No implicit shift back from Base64 (should always end with '-') +// * String must end in non-shifted position. +// * "-&" while in base64 is not allowed. 
+ + +exports.utf7imap = Utf7IMAPCodec; +function Utf7IMAPCodec(codecOptions, iconv) { + this.iconv = iconv; +}; + +Utf7IMAPCodec.prototype.encoder = Utf7IMAPEncoder; +Utf7IMAPCodec.prototype.decoder = Utf7IMAPDecoder; +Utf7IMAPCodec.prototype.bomAware = true; + + +// -- Encoding + +function Utf7IMAPEncoder(options, codec) { + this.iconv = codec.iconv; + this.inBase64 = false; + this.base64Accum = Buffer.alloc(6); + this.base64AccumIdx = 0; +} + +Utf7IMAPEncoder.prototype.write = function(str) { + var inBase64 = this.inBase64, + base64Accum = this.base64Accum, + base64AccumIdx = this.base64AccumIdx, + buf = Buffer.alloc(str.length*5 + 10), bufIdx = 0; + + for (var i = 0; i < str.length; i++) { + var uChar = str.charCodeAt(i); + if (0x20 <= uChar && uChar <= 0x7E) { // Direct character or '&'. + if (inBase64) { + if (base64AccumIdx > 0) { + bufIdx += buf.write(base64Accum.slice(0, base64AccumIdx).toString('base64').replace(/\//g, ',').replace(/=+$/, ''), bufIdx); + base64AccumIdx = 0; + } + + buf[bufIdx++] = minusChar; // Write '-', then go to direct mode. + inBase64 = false; + } + + if (!inBase64) { + buf[bufIdx++] = uChar; // Write direct character + + if (uChar === andChar) // Ampersand -> '&-' + buf[bufIdx++] = minusChar; + } + + } else { // Non-direct character + if (!inBase64) { + buf[bufIdx++] = andChar; // Write '&', then go to base64 mode. + inBase64 = true; + } + if (inBase64) { + base64Accum[base64AccumIdx++] = uChar >> 8; + base64Accum[base64AccumIdx++] = uChar & 0xFF; + + if (base64AccumIdx == base64Accum.length) { + bufIdx += buf.write(base64Accum.toString('base64').replace(/\//g, ','), bufIdx); + base64AccumIdx = 0; + } + } + } + } + + this.inBase64 = inBase64; + this.base64AccumIdx = base64AccumIdx; + + return buf.slice(0, bufIdx); +} + +Utf7IMAPEncoder.prototype.end = function() { + var buf = Buffer.alloc(10), bufIdx = 0; + if (this.inBase64) { + if (this.base64AccumIdx > 0) { + bufIdx += buf.write(this.base64Accum.slice(0, this.base64AccumIdx).toString('base64').replace(/\//g, ',').replace(/=+$/, ''), bufIdx); + this.base64AccumIdx = 0; + } + + buf[bufIdx++] = minusChar; // Write '-', then go to direct mode. + this.inBase64 = false; + } + + return buf.slice(0, bufIdx); +} + + +// -- Decoding + +function Utf7IMAPDecoder(options, codec) { + this.iconv = codec.iconv; + this.inBase64 = false; + this.base64Accum = ''; +} + +var base64IMAPChars = base64Chars.slice(); +base64IMAPChars[','.charCodeAt(0)] = true; + +Utf7IMAPDecoder.prototype.write = function(buf) { + var res = "", lastI = 0, + inBase64 = this.inBase64, + base64Accum = this.base64Accum; + + // The decoder is more involved as we must handle chunks in stream. + // It is forgiving, closer to standard UTF-7 (for example, '-' is optional at the end). + + for (var i = 0; i < buf.length; i++) { + if (!inBase64) { // We're in direct mode. + // Write direct chars until '&' + if (buf[i] == andChar) { + res += this.iconv.decode(buf.slice(lastI, i), "ascii"); // Write direct chars. + lastI = i+1; + inBase64 = true; + } + } else { // We decode base64. + if (!base64IMAPChars[buf[i]]) { // Base64 ended. + if (i == lastI && buf[i] == minusChar) { // "&-" -> "&" + res += "&"; + } else { + var b64str = base64Accum + buf.slice(lastI, i).toString().replace(/,/g, '/'); + res += this.iconv.decode(Buffer.from(b64str, 'base64'), "utf16-be"); + } + + if (buf[i] != minusChar) // Minus may be absorbed after base64. 
+ i--; + + lastI = i+1; + inBase64 = false; + base64Accum = ''; + } + } + } + + if (!inBase64) { + res += this.iconv.decode(buf.slice(lastI), "ascii"); // Write direct chars. + } else { + var b64str = base64Accum + buf.slice(lastI).toString().replace(/,/g, '/'); + + var canBeDecoded = b64str.length - (b64str.length % 8); // Minimal chunk: 2 quads -> 2x3 bytes -> 3 chars. + base64Accum = b64str.slice(canBeDecoded); // The rest will be decoded in future. + b64str = b64str.slice(0, canBeDecoded); + + res += this.iconv.decode(Buffer.from(b64str, 'base64'), "utf16-be"); + } + + this.inBase64 = inBase64; + this.base64Accum = base64Accum; + + return res; +} + +Utf7IMAPDecoder.prototype.end = function() { + var res = ""; + if (this.inBase64 && this.base64Accum.length > 0) + res = this.iconv.decode(Buffer.from(this.base64Accum, 'base64'), "utf16-be"); + + this.inBase64 = false; + this.base64Accum = ''; + return res; +} + + diff --git a/node_modules/iconv-lite/lib/bom-handling.js b/node_modules/iconv-lite/lib/bom-handling.js new file mode 100644 index 000000000..105087238 --- /dev/null +++ b/node_modules/iconv-lite/lib/bom-handling.js @@ -0,0 +1,52 @@ +"use strict"; + +var BOMChar = '\uFEFF'; + +exports.PrependBOM = PrependBOMWrapper +function PrependBOMWrapper(encoder, options) { + this.encoder = encoder; + this.addBOM = true; +} + +PrependBOMWrapper.prototype.write = function(str) { + if (this.addBOM) { + str = BOMChar + str; + this.addBOM = false; + } + + return this.encoder.write(str); +} + +PrependBOMWrapper.prototype.end = function() { + return this.encoder.end(); +} + + +//------------------------------------------------------------------------------ + +exports.StripBOM = StripBOMWrapper; +function StripBOMWrapper(decoder, options) { + this.decoder = decoder; + this.pass = false; + this.options = options || {}; +} + +StripBOMWrapper.prototype.write = function(buf) { + var res = this.decoder.write(buf); + if (this.pass || !res) + return res; + + if (res[0] === BOMChar) { + res = res.slice(1); + if (typeof this.options.stripBOM === 'function') + this.options.stripBOM(); + } + + this.pass = true; + return res; +} + +StripBOMWrapper.prototype.end = function() { + return this.decoder.end(); +} + diff --git a/node_modules/iconv-lite/lib/extend-node.js b/node_modules/iconv-lite/lib/extend-node.js new file mode 100644 index 000000000..87f5394a4 --- /dev/null +++ b/node_modules/iconv-lite/lib/extend-node.js @@ -0,0 +1,217 @@ +"use strict"; +var Buffer = require("buffer").Buffer; +// Note: not polyfilled with safer-buffer on a purpose, as overrides Buffer + +// == Extend Node primitives to use iconv-lite ================================= + +module.exports = function (iconv) { + var original = undefined; // Place to keep original methods. + + // Node authors rewrote Buffer internals to make it compatible with + // Uint8Array and we cannot patch key functions since then. 
+ // Note: this does use older Buffer API on a purpose + iconv.supportsNodeEncodingsExtension = !(Buffer.from || new Buffer(0) instanceof Uint8Array); + + iconv.extendNodeEncodings = function extendNodeEncodings() { + if (original) return; + original = {}; + + if (!iconv.supportsNodeEncodingsExtension) { + console.error("ACTION NEEDED: require('iconv-lite').extendNodeEncodings() is not supported in your version of Node"); + console.error("See more info at https://github.com/ashtuchkin/iconv-lite/wiki/Node-v4-compatibility"); + return; + } + + var nodeNativeEncodings = { + 'hex': true, 'utf8': true, 'utf-8': true, 'ascii': true, 'binary': true, + 'base64': true, 'ucs2': true, 'ucs-2': true, 'utf16le': true, 'utf-16le': true, + }; + + Buffer.isNativeEncoding = function(enc) { + return enc && nodeNativeEncodings[enc.toLowerCase()]; + } + + // -- SlowBuffer ----------------------------------------------------------- + var SlowBuffer = require('buffer').SlowBuffer; + + original.SlowBufferToString = SlowBuffer.prototype.toString; + SlowBuffer.prototype.toString = function(encoding, start, end) { + encoding = String(encoding || 'utf8').toLowerCase(); + + // Use native conversion when possible + if (Buffer.isNativeEncoding(encoding)) + return original.SlowBufferToString.call(this, encoding, start, end); + + // Otherwise, use our decoding method. + if (typeof start == 'undefined') start = 0; + if (typeof end == 'undefined') end = this.length; + return iconv.decode(this.slice(start, end), encoding); + } + + original.SlowBufferWrite = SlowBuffer.prototype.write; + SlowBuffer.prototype.write = function(string, offset, length, encoding) { + // Support both (string, offset, length, encoding) + // and the legacy (string, encoding, offset, length) + if (isFinite(offset)) { + if (!isFinite(length)) { + encoding = length; + length = undefined; + } + } else { // legacy + var swap = encoding; + encoding = offset; + offset = length; + length = swap; + } + + offset = +offset || 0; + var remaining = this.length - offset; + if (!length) { + length = remaining; + } else { + length = +length; + if (length > remaining) { + length = remaining; + } + } + encoding = String(encoding || 'utf8').toLowerCase(); + + // Use native conversion when possible + if (Buffer.isNativeEncoding(encoding)) + return original.SlowBufferWrite.call(this, string, offset, length, encoding); + + if (string.length > 0 && (length < 0 || offset < 0)) + throw new RangeError('attempt to write beyond buffer bounds'); + + // Otherwise, use our encoding method. + var buf = iconv.encode(string, encoding); + if (buf.length < length) length = buf.length; + buf.copy(this, offset, 0, length); + return length; + } + + // -- Buffer --------------------------------------------------------------- + + original.BufferIsEncoding = Buffer.isEncoding; + Buffer.isEncoding = function(encoding) { + return Buffer.isNativeEncoding(encoding) || iconv.encodingExists(encoding); + } + + original.BufferByteLength = Buffer.byteLength; + Buffer.byteLength = SlowBuffer.byteLength = function(str, encoding) { + encoding = String(encoding || 'utf8').toLowerCase(); + + // Use native conversion when possible + if (Buffer.isNativeEncoding(encoding)) + return original.BufferByteLength.call(this, str, encoding); + + // Slow, I know, but we don't have a better way yet. 
+ return iconv.encode(str, encoding).length; + } + + original.BufferToString = Buffer.prototype.toString; + Buffer.prototype.toString = function(encoding, start, end) { + encoding = String(encoding || 'utf8').toLowerCase(); + + // Use native conversion when possible + if (Buffer.isNativeEncoding(encoding)) + return original.BufferToString.call(this, encoding, start, end); + + // Otherwise, use our decoding method. + if (typeof start == 'undefined') start = 0; + if (typeof end == 'undefined') end = this.length; + return iconv.decode(this.slice(start, end), encoding); + } + + original.BufferWrite = Buffer.prototype.write; + Buffer.prototype.write = function(string, offset, length, encoding) { + var _offset = offset, _length = length, _encoding = encoding; + // Support both (string, offset, length, encoding) + // and the legacy (string, encoding, offset, length) + if (isFinite(offset)) { + if (!isFinite(length)) { + encoding = length; + length = undefined; + } + } else { // legacy + var swap = encoding; + encoding = offset; + offset = length; + length = swap; + } + + encoding = String(encoding || 'utf8').toLowerCase(); + + // Use native conversion when possible + if (Buffer.isNativeEncoding(encoding)) + return original.BufferWrite.call(this, string, _offset, _length, _encoding); + + offset = +offset || 0; + var remaining = this.length - offset; + if (!length) { + length = remaining; + } else { + length = +length; + if (length > remaining) { + length = remaining; + } + } + + if (string.length > 0 && (length < 0 || offset < 0)) + throw new RangeError('attempt to write beyond buffer bounds'); + + // Otherwise, use our encoding method. + var buf = iconv.encode(string, encoding); + if (buf.length < length) length = buf.length; + buf.copy(this, offset, 0, length); + return length; + + // TODO: Set _charsWritten. + } + + + // -- Readable ------------------------------------------------------------- + if (iconv.supportsStreams) { + var Readable = require('stream').Readable; + + original.ReadableSetEncoding = Readable.prototype.setEncoding; + Readable.prototype.setEncoding = function setEncoding(enc, options) { + // Use our own decoder, it has the same interface. + // We cannot use original function as it doesn't handle BOM-s. + this._readableState.decoder = iconv.getDecoder(enc, options); + this._readableState.encoding = enc; + } + + Readable.prototype.collect = iconv._collect; + } + } + + // Remove iconv-lite Node primitive extensions. 
+ iconv.undoExtendNodeEncodings = function undoExtendNodeEncodings() { + if (!iconv.supportsNodeEncodingsExtension) + return; + if (!original) + throw new Error("require('iconv-lite').undoExtendNodeEncodings(): Nothing to undo; extendNodeEncodings() is not called.") + + delete Buffer.isNativeEncoding; + + var SlowBuffer = require('buffer').SlowBuffer; + + SlowBuffer.prototype.toString = original.SlowBufferToString; + SlowBuffer.prototype.write = original.SlowBufferWrite; + + Buffer.isEncoding = original.BufferIsEncoding; + Buffer.byteLength = original.BufferByteLength; + Buffer.prototype.toString = original.BufferToString; + Buffer.prototype.write = original.BufferWrite; + + if (iconv.supportsStreams) { + var Readable = require('stream').Readable; + + Readable.prototype.setEncoding = original.ReadableSetEncoding; + delete Readable.prototype.collect; + } + + original = undefined; + } +} diff --git a/node_modules/iconv-lite/lib/index.d.ts b/node_modules/iconv-lite/lib/index.d.ts new file mode 100644 index 000000000..0547eb346 --- /dev/null +++ b/node_modules/iconv-lite/lib/index.d.ts @@ -0,0 +1,24 @@ +/*--------------------------------------------------------------------------------------------- + * Copyright (c) Microsoft Corporation. All rights reserved. + * Licensed under the MIT License. + * REQUIREMENT: This definition is dependent on the @types/node definition. + * Install with `npm install @types/node --save-dev` + *--------------------------------------------------------------------------------------------*/ + +declare module 'iconv-lite' { + export function decode(buffer: Buffer, encoding: string, options?: Options): string; + + export function encode(content: string, encoding: string, options?: Options): Buffer; + + export function encodingExists(encoding: string): boolean; + + export function decodeStream(encoding: string, options?: Options): NodeJS.ReadWriteStream; + + export function encodeStream(encoding: string, options?: Options): NodeJS.ReadWriteStream; +} + +export interface Options { + stripBOM?: boolean; + addBOM?: boolean; + defaultEncoding?: string; +} diff --git a/node_modules/iconv-lite/lib/index.js b/node_modules/iconv-lite/lib/index.js new file mode 100644 index 000000000..5391919ca --- /dev/null +++ b/node_modules/iconv-lite/lib/index.js @@ -0,0 +1,153 @@ +"use strict"; + +// Some environments don't have global Buffer (e.g. React Native). +// Solution would be installing npm modules "buffer" and "stream" explicitly. +var Buffer = require("safer-buffer").Buffer; + +var bomHandling = require("./bom-handling"), + iconv = module.exports; + +// All codecs and aliases are kept here, keyed by encoding name/alias. +// They are lazy loaded in `iconv.getCodec` from `encodings/index.js`. +iconv.encodings = null; + +// Characters emitted in case of error. +iconv.defaultCharUnicode = '�'; +iconv.defaultCharSingleByte = '?'; + +// Public API. +iconv.encode = function encode(str, encoding, options) { + str = "" + (str || ""); // Ensure string. + + var encoder = iconv.getEncoder(encoding, options); + + var res = encoder.write(str); + var trail = encoder.end(); + + return (trail && trail.length > 0) ? Buffer.concat([res, trail]) : res; +} + +iconv.decode = function decode(buf, encoding, options) { + if (typeof buf === 'string') { + if (!iconv.skipDecodeWarning) { + console.error('Iconv-lite warning: decode()-ing strings is deprecated. 
Refer to https://github.com/ashtuchkin/iconv-lite/wiki/Use-Buffers-when-decoding'); + iconv.skipDecodeWarning = true; + } + + buf = Buffer.from("" + (buf || ""), "binary"); // Ensure buffer. + } + + var decoder = iconv.getDecoder(encoding, options); + + var res = decoder.write(buf); + var trail = decoder.end(); + + return trail ? (res + trail) : res; +} + +iconv.encodingExists = function encodingExists(enc) { + try { + iconv.getCodec(enc); + return true; + } catch (e) { + return false; + } +} + +// Legacy aliases to convert functions +iconv.toEncoding = iconv.encode; +iconv.fromEncoding = iconv.decode; + +// Search for a codec in iconv.encodings. Cache codec data in iconv._codecDataCache. +iconv._codecDataCache = {}; +iconv.getCodec = function getCodec(encoding) { + if (!iconv.encodings) + iconv.encodings = require("../encodings"); // Lazy load all encoding definitions. + + // Canonicalize encoding name: strip all non-alphanumeric chars and appended year. + var enc = iconv._canonicalizeEncoding(encoding); + + // Traverse iconv.encodings to find actual codec. + var codecOptions = {}; + while (true) { + var codec = iconv._codecDataCache[enc]; + if (codec) + return codec; + + var codecDef = iconv.encodings[enc]; + + switch (typeof codecDef) { + case "string": // Direct alias to other encoding. + enc = codecDef; + break; + + case "object": // Alias with options. Can be layered. + for (var key in codecDef) + codecOptions[key] = codecDef[key]; + + if (!codecOptions.encodingName) + codecOptions.encodingName = enc; + + enc = codecDef.type; + break; + + case "function": // Codec itself. + if (!codecOptions.encodingName) + codecOptions.encodingName = enc; + + // The codec function must load all tables and return object with .encoder and .decoder methods. + // It'll be called only once (for each different options object). + codec = new codecDef(codecOptions, iconv); + + iconv._codecDataCache[codecOptions.encodingName] = codec; // Save it to be reused later. + return codec; + + default: + throw new Error("Encoding not recognized: '" + encoding + "' (searched as: '"+enc+"')"); + } + } +} + +iconv._canonicalizeEncoding = function(encoding) { + // Canonicalize encoding name: strip all non-alphanumeric chars and appended year. + return (''+encoding).toLowerCase().replace(/:\d{4}$|[^0-9a-z]/g, ""); +} + +iconv.getEncoder = function getEncoder(encoding, options) { + var codec = iconv.getCodec(encoding), + encoder = new codec.encoder(options, codec); + + if (codec.bomAware && options && options.addBOM) + encoder = new bomHandling.PrependBOM(encoder, options); + + return encoder; +} + +iconv.getDecoder = function getDecoder(encoding, options) { + var codec = iconv.getCodec(encoding), + decoder = new codec.decoder(options, codec); + + if (codec.bomAware && !(options && options.stripBOM === false)) + decoder = new bomHandling.StripBOM(decoder, options); + + return decoder; +} + + +// Load extensions in Node. All of them are omitted in Browserify build via 'browser' field in package.json. +var nodeVer = typeof process !== 'undefined' && process.versions && process.versions.node; +if (nodeVer) { + + // Load streaming support in Node v0.10+ + var nodeVerArr = nodeVer.split(".").map(Number); + if (nodeVerArr[0] > 0 || nodeVerArr[1] >= 10) { + require("./streams")(iconv); + } + + // Load Node primitive extensions. + require("./extend-node")(iconv); +} + +if ("Ā" != "\u0100") { + console.error("iconv-lite warning: javascript files use encoding different from utf-8. 
See https://github.com/ashtuchkin/iconv-lite/wiki/Javascript-source-file-encodings for more info."); +} diff --git a/node_modules/iconv-lite/lib/streams.js b/node_modules/iconv-lite/lib/streams.js new file mode 100644 index 000000000..440955295 --- /dev/null +++ b/node_modules/iconv-lite/lib/streams.js @@ -0,0 +1,121 @@ +"use strict"; + +var Buffer = require("buffer").Buffer, + Transform = require("stream").Transform; + + +// == Exports ================================================================== +module.exports = function(iconv) { + + // Additional Public API. + iconv.encodeStream = function encodeStream(encoding, options) { + return new IconvLiteEncoderStream(iconv.getEncoder(encoding, options), options); + } + + iconv.decodeStream = function decodeStream(encoding, options) { + return new IconvLiteDecoderStream(iconv.getDecoder(encoding, options), options); + } + + iconv.supportsStreams = true; + + + // Not published yet. + iconv.IconvLiteEncoderStream = IconvLiteEncoderStream; + iconv.IconvLiteDecoderStream = IconvLiteDecoderStream; + iconv._collect = IconvLiteDecoderStream.prototype.collect; +}; + + +// == Encoder stream ======================================================= +function IconvLiteEncoderStream(conv, options) { + this.conv = conv; + options = options || {}; + options.decodeStrings = false; // We accept only strings, so we don't need to decode them. + Transform.call(this, options); +} + +IconvLiteEncoderStream.prototype = Object.create(Transform.prototype, { + constructor: { value: IconvLiteEncoderStream } +}); + +IconvLiteEncoderStream.prototype._transform = function(chunk, encoding, done) { + if (typeof chunk != 'string') + return done(new Error("Iconv encoding stream needs strings as its input.")); + try { + var res = this.conv.write(chunk); + if (res && res.length) this.push(res); + done(); + } + catch (e) { + done(e); + } +} + +IconvLiteEncoderStream.prototype._flush = function(done) { + try { + var res = this.conv.end(); + if (res && res.length) this.push(res); + done(); + } + catch (e) { + done(e); + } +} + +IconvLiteEncoderStream.prototype.collect = function(cb) { + var chunks = []; + this.on('error', cb); + this.on('data', function(chunk) { chunks.push(chunk); }); + this.on('end', function() { + cb(null, Buffer.concat(chunks)); + }); + return this; +} + + +// == Decoder stream ======================================================= +function IconvLiteDecoderStream(conv, options) { + this.conv = conv; + options = options || {}; + options.encoding = this.encoding = 'utf8'; // We output strings. 
+ Transform.call(this, options); +} + +IconvLiteDecoderStream.prototype = Object.create(Transform.prototype, { + constructor: { value: IconvLiteDecoderStream } +}); + +IconvLiteDecoderStream.prototype._transform = function(chunk, encoding, done) { + if (!Buffer.isBuffer(chunk)) + return done(new Error("Iconv decoding stream needs buffers as its input.")); + try { + var res = this.conv.write(chunk); + if (res && res.length) this.push(res, this.encoding); + done(); + } + catch (e) { + done(e); + } +} + +IconvLiteDecoderStream.prototype._flush = function(done) { + try { + var res = this.conv.end(); + if (res && res.length) this.push(res, this.encoding); + done(); + } + catch (e) { + done(e); + } +} + +IconvLiteDecoderStream.prototype.collect = function(cb) { + var res = ''; + this.on('error', cb); + this.on('data', function(chunk) { res += chunk; }); + this.on('end', function() { + cb(null, res); + }); + return this; +} + diff --git a/node_modules/iconv-lite/package.json b/node_modules/iconv-lite/package.json new file mode 100644 index 000000000..27d52f2ef --- /dev/null +++ b/node_modules/iconv-lite/package.json @@ -0,0 +1,77 @@ +{ + "_from": "iconv-lite@0.4.24", + "_id": "iconv-lite@0.4.24", + "_inBundle": false, + "_integrity": "sha512-v3MXnZAcvnywkTUEZomIActle7RXXeedOR31wwl7VlyoXO4Qi9arvSenNQWne1TcRwhCL1HwLI21bEqdpj8/rA==", + "_location": "/iconv-lite", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "iconv-lite@0.4.24", + "name": "iconv-lite", + "escapedName": "iconv-lite", + "rawSpec": "0.4.24", + "saveSpec": null, + "fetchSpec": "0.4.24" + }, + "_requiredBy": [ + "/body-parser", + "/raw-body" + ], + "_resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.4.24.tgz", + "_shasum": "2022b4b25fbddc21d2f524974a474aafe733908b", + "_spec": "iconv-lite@0.4.24", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\body-parser", + "author": { + "name": "Alexander Shtuchkin", + "email": "ashtuchkin@gmail.com" + }, + "browser": { + "./lib/extend-node": false, + "./lib/streams": false + }, + "bugs": { + "url": "https://github.com/ashtuchkin/iconv-lite/issues" + }, + "bundleDependencies": false, + "dependencies": { + "safer-buffer": ">= 2.1.2 < 3" + }, + "deprecated": false, + "description": "Convert character encodings in pure javascript.", + "devDependencies": { + "async": "*", + "errto": "*", + "iconv": "*", + "istanbul": "*", + "mocha": "^3.1.0", + "request": "~2.87.0", + "semver": "*", + "unorm": "*" + }, + "engines": { + "node": ">=0.10.0" + }, + "homepage": "https://github.com/ashtuchkin/iconv-lite", + "keywords": [ + "iconv", + "convert", + "charset", + "icu" + ], + "license": "MIT", + "main": "./lib/index.js", + "name": "iconv-lite", + "repository": { + "type": "git", + "url": "git://github.com/ashtuchkin/iconv-lite.git" + }, + "scripts": { + "coverage": "istanbul cover _mocha -- --grep .", + "coverage-open": "open coverage/lcov-report/index.html", + "test": "mocha --reporter spec --grep ." + }, + "typings": "./lib/index.d.ts", + "version": "0.4.24" +} diff --git a/node_modules/ieee754/LICENSE b/node_modules/ieee754/LICENSE new file mode 100644 index 000000000..5aac82c78 --- /dev/null +++ b/node_modules/ieee754/LICENSE @@ -0,0 +1,11 @@ +Copyright 2008 Fair Oaks Labs, Inc. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + +1. 
Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + +3. Neither the name of the copyright holder nor the names of its contributors may be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/ieee754/README.md b/node_modules/ieee754/README.md new file mode 100644 index 000000000..cb7527b3c --- /dev/null +++ b/node_modules/ieee754/README.md @@ -0,0 +1,51 @@ +# ieee754 [![travis][travis-image]][travis-url] [![npm][npm-image]][npm-url] [![downloads][downloads-image]][downloads-url] [![javascript style guide][standard-image]][standard-url] + +[travis-image]: https://img.shields.io/travis/feross/ieee754/master.svg +[travis-url]: https://travis-ci.org/feross/ieee754 +[npm-image]: https://img.shields.io/npm/v/ieee754.svg +[npm-url]: https://npmjs.org/package/ieee754 +[downloads-image]: https://img.shields.io/npm/dm/ieee754.svg +[downloads-url]: https://npmjs.org/package/ieee754 +[standard-image]: https://img.shields.io/badge/code_style-standard-brightgreen.svg +[standard-url]: https://standardjs.com + +[![saucelabs][saucelabs-image]][saucelabs-url] + +[saucelabs-image]: https://saucelabs.com/browser-matrix/ieee754.svg +[saucelabs-url]: https://saucelabs.com/u/ieee754 + +### Read/write IEEE754 floating point numbers from/to a Buffer or array-like object. + +## install + +``` +npm install ieee754 +``` + +## methods + +`var ieee754 = require('ieee754')` + +The `ieee754` object has the following functions: + +``` +ieee754.read = function (buffer, offset, isLE, mLen, nBytes) +ieee754.write = function (buffer, value, offset, isLE, mLen, nBytes) +``` + +The arguments mean the following: + +- buffer = the buffer +- offset = offset into the buffer +- value = value to set (only for `write`) +- isLe = is little endian? +- mLen = mantissa length +- nBytes = number of bytes + +## what is ieee754? + +The IEEE Standard for Floating-Point Arithmetic (IEEE 754) is a technical standard for floating-point computation. [Read more](http://en.wikipedia.org/wiki/IEEE_floating_point). + +## license + +BSD 3 Clause. Copyright (c) 2008, Fair Oaks Labs, Inc. 
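As a quick illustration of the `read`/`write` signatures documented in the ieee754 README above, here is a minimal sketch for a 64-bit little-endian double (so `mLen = 52`, `nBytes = 8`); the buffer name and value are only examples:

```js
// Sketch: round-trip a 64-bit little-endian double through ieee754.
const ieee754 = require('ieee754')

const buf = new Uint8Array(8)                 // 8 bytes for a double
ieee754.write(buf, 3.14159, 0, true, 52, 8)   // value, offset 0, isLE, mantissa bits, byte count
const value = ieee754.read(buf, 0, true, 52, 8)
console.log(value)                            // => 3.14159
```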
diff --git a/node_modules/ieee754/index.d.ts b/node_modules/ieee754/index.d.ts new file mode 100644 index 000000000..f1e435487 --- /dev/null +++ b/node_modules/ieee754/index.d.ts @@ -0,0 +1,10 @@ +declare namespace ieee754 { + export function read( + buffer: Uint8Array, offset: number, isLE: boolean, mLen: number, + nBytes: number): number; + export function write( + buffer: Uint8Array, value: number, offset: number, isLE: boolean, + mLen: number, nBytes: number): void; + } + + export = ieee754; \ No newline at end of file diff --git a/node_modules/ieee754/index.js b/node_modules/ieee754/index.js new file mode 100644 index 000000000..81d26c343 --- /dev/null +++ b/node_modules/ieee754/index.js @@ -0,0 +1,85 @@ +/*! ieee754. BSD-3-Clause License. Feross Aboukhadijeh */ +exports.read = function (buffer, offset, isLE, mLen, nBytes) { + var e, m + var eLen = (nBytes * 8) - mLen - 1 + var eMax = (1 << eLen) - 1 + var eBias = eMax >> 1 + var nBits = -7 + var i = isLE ? (nBytes - 1) : 0 + var d = isLE ? -1 : 1 + var s = buffer[offset + i] + + i += d + + e = s & ((1 << (-nBits)) - 1) + s >>= (-nBits) + nBits += eLen + for (; nBits > 0; e = (e * 256) + buffer[offset + i], i += d, nBits -= 8) {} + + m = e & ((1 << (-nBits)) - 1) + e >>= (-nBits) + nBits += mLen + for (; nBits > 0; m = (m * 256) + buffer[offset + i], i += d, nBits -= 8) {} + + if (e === 0) { + e = 1 - eBias + } else if (e === eMax) { + return m ? NaN : ((s ? -1 : 1) * Infinity) + } else { + m = m + Math.pow(2, mLen) + e = e - eBias + } + return (s ? -1 : 1) * m * Math.pow(2, e - mLen) +} + +exports.write = function (buffer, value, offset, isLE, mLen, nBytes) { + var e, m, c + var eLen = (nBytes * 8) - mLen - 1 + var eMax = (1 << eLen) - 1 + var eBias = eMax >> 1 + var rt = (mLen === 23 ? Math.pow(2, -24) - Math.pow(2, -77) : 0) + var i = isLE ? 0 : (nBytes - 1) + var d = isLE ? 1 : -1 + var s = value < 0 || (value === 0 && 1 / value < 0) ? 1 : 0 + + value = Math.abs(value) + + if (isNaN(value) || value === Infinity) { + m = isNaN(value) ? 
1 : 0 + e = eMax + } else { + e = Math.floor(Math.log(value) / Math.LN2) + if (value * (c = Math.pow(2, -e)) < 1) { + e-- + c *= 2 + } + if (e + eBias >= 1) { + value += rt / c + } else { + value += rt * Math.pow(2, 1 - eBias) + } + if (value * c >= 2) { + e++ + c /= 2 + } + + if (e + eBias >= eMax) { + m = 0 + e = eMax + } else if (e + eBias >= 1) { + m = ((value * c) - 1) * Math.pow(2, mLen) + e = e + eBias + } else { + m = value * Math.pow(2, eBias - 1) * Math.pow(2, mLen) + e = 0 + } + } + + for (; mLen >= 8; buffer[offset + i] = m & 0xff, i += d, m /= 256, mLen -= 8) {} + + e = (e << mLen) | m + eLen += mLen + for (; eLen > 0; buffer[offset + i] = e & 0xff, i += d, e /= 256, eLen -= 8) {} + + buffer[offset + i - d] |= s * 128 +} diff --git a/node_modules/ieee754/package.json b/node_modules/ieee754/package.json new file mode 100644 index 000000000..f97583b43 --- /dev/null +++ b/node_modules/ieee754/package.json @@ -0,0 +1,84 @@ +{ + "_from": "ieee754@^1.1.13", + "_id": "ieee754@1.2.1", + "_inBundle": false, + "_integrity": "sha512-dcyqhDvX1C46lXZcVqCpK+FtMRQVdIMN6/Df5js2zouUsqG7I6sFxitIC+7KYK29KdXOLHdu9zL4sFnoVQnqaA==", + "_location": "/ieee754", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "ieee754@^1.1.13", + "name": "ieee754", + "escapedName": "ieee754", + "rawSpec": "^1.1.13", + "saveSpec": null, + "fetchSpec": "^1.1.13" + }, + "_requiredBy": [ + "/buffer" + ], + "_resolved": "https://registry.npmjs.org/ieee754/-/ieee754-1.2.1.tgz", + "_shasum": "8eb7a10a63fff25d15a57b001586d177d1b0d352", + "_spec": "ieee754@^1.1.13", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\buffer", + "author": { + "name": "Feross Aboukhadijeh", + "email": "feross@feross.org", + "url": "https://feross.org" + }, + "bugs": { + "url": "https://github.com/feross/ieee754/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Romain Beauxis", + "email": "toots@rastageeks.org" + } + ], + "deprecated": false, + "description": "Read/write IEEE754 floating point numbers from/to a Buffer or array-like object", + "devDependencies": { + "airtap": "^3.0.0", + "standard": "*", + "tape": "^5.0.1" + }, + "funding": [ + { + "type": "github", + "url": "https://github.com/sponsors/feross" + }, + { + "type": "patreon", + "url": "https://www.patreon.com/feross" + }, + { + "type": "consulting", + "url": "https://feross.org/support" + } + ], + "homepage": "https://github.com/feross/ieee754#readme", + "keywords": [ + "IEEE 754", + "buffer", + "convert", + "floating point", + "ieee754" + ], + "license": "BSD-3-Clause", + "main": "index.js", + "name": "ieee754", + "repository": { + "type": "git", + "url": "git://github.com/feross/ieee754.git" + }, + "scripts": { + "test": "standard && npm run test-node && npm run test-browser", + "test-browser": "airtap -- test/*.js", + "test-browser-local": "airtap --local -- test/*.js", + "test-node": "tape test/*.js" + }, + "types": "index.d.ts", + "version": "1.2.1" +} diff --git a/node_modules/ignore-by-default/LICENSE b/node_modules/ignore-by-default/LICENSE new file mode 100644 index 000000000..ee1e36789 --- /dev/null +++ b/node_modules/ignore-by-default/LICENSE @@ -0,0 +1,14 @@ +ISC License (ISC) +Copyright (c) 2016, Mark Wubben + +Permission to use, copy, modify, and/or distribute this software for any purpose +with or without fee is hereby granted, provided that the above copyright notice +and this permission notice appear in all copies. 
+ +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH +REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND +FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, +INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS +OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER +TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF +THIS SOFTWARE. diff --git a/node_modules/ignore-by-default/README.md b/node_modules/ignore-by-default/README.md new file mode 100644 index 000000000..ee7719119 --- /dev/null +++ b/node_modules/ignore-by-default/README.md @@ -0,0 +1,26 @@ +# ignore-by-default + +This is a package aimed at Node.js development tools. It provides a list of +directories that should probably be ignored by such tools, e.g. when watching +for file changes. + +It's used by [AVA](https://www.npmjs.com/package/ava) and +[nodemon](https://www.npmjs.com/package/nodemon). + +[Please contribute!](./CONTRIBUTING.md) + +## Installation + +``` +npm install --save ignore-by-default +``` + +## Usage + +The `ignore-by-default` module exports a `directories()` function, which will +return an array of directory names. These are the ones you should ignore. + +```js +// ['.git', '.sass_cache', …] +var ignoredDirectories = require('ignore-by-default').directories() +``` diff --git a/node_modules/ignore-by-default/index.js b/node_modules/ignore-by-default/index.js new file mode 100644 index 000000000..c65857da5 --- /dev/null +++ b/node_modules/ignore-by-default/index.js @@ -0,0 +1,12 @@ +'use strict' + +exports.directories = function () { + return [ + '.git', // Git repository files, see + '.nyc_output', // Temporary directory where nyc stores coverage data, see + '.sass-cache', // Cache folder for node-sass, see + 'bower_components', // Where Bower packages are installed, see + 'coverage', // Standard output directory for code coverage reports, see + 'node_modules' // Where Node modules are installed, see + ] +} diff --git a/node_modules/ignore-by-default/package.json b/node_modules/ignore-by-default/package.json new file mode 100644 index 000000000..d47fbe0f5 --- /dev/null +++ b/node_modules/ignore-by-default/package.json @@ -0,0 +1,62 @@ +{ + "_from": "ignore-by-default@^1.0.1", + "_id": "ignore-by-default@1.0.1", + "_inBundle": false, + "_integrity": "sha1-SMptcvbGo68Aqa1K5odr44ieKwk=", + "_location": "/ignore-by-default", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "ignore-by-default@^1.0.1", + "name": "ignore-by-default", + "escapedName": "ignore-by-default", + "rawSpec": "^1.0.1", + "saveSpec": null, + "fetchSpec": "^1.0.1" + }, + "_requiredBy": [ + "/nodemon" + ], + "_resolved": "https://registry.npmjs.org/ignore-by-default/-/ignore-by-default-1.0.1.tgz", + "_shasum": "48ca6d72f6c6a3af00a9ad4ae6876be3889e2b09", + "_spec": "ignore-by-default@^1.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon", + "author": { + "name": "Mark Wubben", + "url": "https://novemberborn.net/" + }, + "bugs": { + "url": "https://github.com/novemberborn/ignore-by-default/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "A list of directories you should ignore by default", + "devDependencies": { + "figures": "^1.4.0", + "standard": "^6.0.4" + }, + "files": [ + "index.js" + ], + "homepage": 
"https://github.com/novemberborn/ignore-by-default#readme", + "keywords": [ + "ignore", + "chokidar", + "watcher", + "exclude", + "glob", + "pattern" + ], + "license": "ISC", + "main": "index.js", + "name": "ignore-by-default", + "repository": { + "type": "git", + "url": "git+https://github.com/novemberborn/ignore-by-default.git" + }, + "scripts": { + "test": "standard && node test.js" + }, + "version": "1.0.1" +} diff --git a/node_modules/import-lazy/index.js b/node_modules/import-lazy/index.js new file mode 100644 index 000000000..307f08f39 --- /dev/null +++ b/node_modules/import-lazy/index.js @@ -0,0 +1,53 @@ +'use strict'; +const lazy = (mod, fn, id) => mod === undefined ? fn(id) : mod; + +module.exports = fn => { + return id => { + let mod; + + return function () { + if (arguments.length === 0) { + mod = lazy(mod, fn, id); + return mod; + } + + const ret = {}; + + [].forEach.call(arguments, prop => { + Object.defineProperty(ret, prop, { + get: () => { + mod = lazy(mod, fn, id); + if (typeof mod[prop] === 'function') { + return function () { + return mod[prop].apply(mod, arguments); + }; + } + + return mod[prop]; + } + }); + }); + + return ret; + }; + }; +}; + +module.exports.proxy = fn => { + return id => { + let mod; + + const handler = { + get: (target, property) => { + mod = lazy(mod, fn, id); + return Reflect.get(mod, property); + }, + apply: (target, thisArg, argumentsList) => { + mod = lazy(mod, fn, id); + return Reflect.apply(mod, thisArg, argumentsList); + } + }; + + return new Proxy(() => {}, handler); + }; +}; diff --git a/node_modules/import-lazy/license b/node_modules/import-lazy/license new file mode 100644 index 000000000..654d0bfe9 --- /dev/null +++ b/node_modules/import-lazy/license @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
diff --git a/node_modules/import-lazy/package.json b/node_modules/import-lazy/package.json new file mode 100644 index 000000000..2245ade41 --- /dev/null +++ b/node_modules/import-lazy/package.json @@ -0,0 +1,76 @@ +{ + "_from": "import-lazy@^2.1.0", + "_id": "import-lazy@2.1.0", + "_inBundle": false, + "_integrity": "sha1-BWmOPUXIjo1+nZLLBYTnfwlvPkM=", + "_location": "/import-lazy", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "import-lazy@^2.1.0", + "name": "import-lazy", + "escapedName": "import-lazy", + "rawSpec": "^2.1.0", + "saveSpec": null, + "fetchSpec": "^2.1.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/import-lazy/-/import-lazy-2.1.0.tgz", + "_shasum": "05698e3d45c88e8d7e9d92cb0584e77f096f3e43", + "_spec": "import-lazy@^2.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/import-lazy/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Jorge Bucaran", + "email": "jbucaran@me.com" + } + ], + "deprecated": false, + "description": "Import modules lazily", + "devDependencies": { + "ava": "*", + "xo": "*" + }, + "engines": { + "node": ">=4" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/import-lazy#readme", + "keywords": [ + "import", + "require", + "load", + "module", + "modules", + "lazy", + "lazily", + "defer", + "deferred", + "proxy", + "proxies" + ], + "license": "MIT", + "name": "import-lazy", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/import-lazy.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "2.1.0" +} diff --git a/node_modules/import-lazy/readme.md b/node_modules/import-lazy/readme.md new file mode 100644 index 000000000..233e42e23 --- /dev/null +++ b/node_modules/import-lazy/readme.md @@ -0,0 +1,64 @@ +# import-lazy [![Build Status](https://travis-ci.org/sindresorhus/import-lazy.svg?branch=master)](https://travis-ci.org/sindresorhus/import-lazy) + +> Import modules lazily + + +## Install + +``` +$ npm install --save import-lazy +``` + + +## Usage + +```js +// Pass in `require` or a custom import function +const importLazy = require('import-lazy')(require); +const _ = importLazy('lodash'); + +// Where you would normally do +_.isNumber(2); + +// You now instead call it as a function +_().isNumber(2); + +// It's cached on consecutive calls +_().isString('unicorn'); + +// Extract lazy variations of the props you need +const members = importLazy('lodash')('isNumber', 'isString'); + +// Useful when using destructuring assignment in ES2015 +const {isNumber, isString} = importLazy('lodash')('isNumber', 'isString'); + +// Works out of the box for functions and regular properties +const stuff = importLazy('./math-lib')('sum', 'PHI'); +console.log(stuff.sum(1, 2)); // => 3 +console.log(stuff.PHI); // => 1.618033 +``` + +### Proxy support in Node.js 6 or later + +If you use Node.js 6 or later, you can take advantage of ES2015 proxies and don't need to call it as a function. 
+ +```js +const importLazy = require('import-lazy').proxy(require); +const _ = importLazy('lodash'); + +// No need to call it as a function but still lazily imported +_.isNumber(2); +``` + +## Related + +- [resolve-from](https://github.com/sindresorhus/resolve-from) - Resolve the path of a module from a given path +- [import-from](https://github.com/sindresorhus/import-from) - Import a module from a given path +- [resolve-pkg](https://github.com/sindresorhus/resolve-pkg) - Resolve the path of a package regardless of it having an entry point +- [lazy-value](https://github.com/sindresorhus/lazy-value) - Create a lazily evaluated value +- [define-lazy-prop](https://github.com/sindresorhus/define-lazy-prop) - Define a lazily evaluated property on an object + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/imurmurhash/README.md b/node_modules/imurmurhash/README.md new file mode 100644 index 000000000..f35b20a0e --- /dev/null +++ b/node_modules/imurmurhash/README.md @@ -0,0 +1,122 @@ +iMurmurHash.js +============== + +An incremental implementation of the MurmurHash3 (32-bit) hashing algorithm for JavaScript based on [Gary Court's implementation](https://github.com/garycourt/murmurhash-js) with [kazuyukitanimura's modifications](https://github.com/kazuyukitanimura/murmurhash-js). + +This version works significantly faster than the non-incremental version if you need to hash many small strings into a single hash, since string concatenation (to build the single string to pass the non-incremental version) is fairly costly. In one case tested, using the incremental version was about 50% faster than concatenating 5-10 strings and then hashing. + +Installation +------------ + +To use iMurmurHash in the browser, [download the latest version](https://raw.github.com/jensyt/imurmurhash-js/master/imurmurhash.min.js) and include it as a script on your site. + +```html + + +``` + +--- + +To use iMurmurHash in Node.js, install the module using NPM: + +```bash +npm install imurmurhash +``` + +Then simply include it in your scripts: + +```javascript +MurmurHash3 = require('imurmurhash'); +``` + +Quick Example +------------- + +```javascript +// Create the initial hash +var hashState = MurmurHash3('string'); + +// Incrementally add text +hashState.hash('more strings'); +hashState.hash('even more strings'); + +// All calls can be chained if desired +hashState.hash('and').hash('some').hash('more'); + +// Get a result +hashState.result(); +// returns 0xe4ccfe6b +``` + +Functions +--------- + +### MurmurHash3 ([string], [seed]) +Get a hash state object, optionally initialized with the given _string_ and _seed_. _Seed_ must be a positive integer if provided. Calling this function without the `new` keyword will return a cached state object that has been reset. This is safe to use as long as the object is only used from a single thread and no other hashes are created while operating on this one. If this constraint cannot be met, you can use `new` to create a new state object. For example: + +```javascript +// Use the cached object, calling the function again will return the same +// object (but reset, so the current state would be lost) +hashState = MurmurHash3(); +... + +// Create a new object that can be safely used however you wish. Calling the +// function again will simply return a new state object, and no state loss +// will occur, at the cost of creating more objects. 
+hashState = new MurmurHash3(); +``` + +Both methods can be mixed however you like if you have different use cases. + +--- + +### MurmurHash3.prototype.hash (string) +Incrementally add _string_ to the hash. This can be called as many times as you want for the hash state object, including after a call to `result()`. Returns `this` so calls can be chained. + +--- + +### MurmurHash3.prototype.result () +Get the result of the hash as a 32-bit positive integer. This performs the tail and finalizer portions of the algorithm, but does not store the result in the state object. This means that it is perfectly safe to get results and then continue adding strings via `hash`. + +```javascript +// Do the whole string at once +MurmurHash3('this is a test string').result(); +// 0x70529328 + +// Do part of the string, get a result, then the other part +var m = MurmurHash3('this is a'); +m.result(); +// 0xbfc4f834 +m.hash(' test string').result(); +// 0x70529328 (same as above) +``` + +--- + +### MurmurHash3.prototype.reset ([seed]) +Reset the state object for reuse, optionally using the given _seed_ (defaults to 0 like the constructor). Returns `this` so calls can be chained. + +--- + +License (MIT) +------------- +Copyright (c) 2013 Gary Court, Jens Taylor + +Permission is hereby granted, free of charge, to any person obtaining a copy of +this software and associated documentation files (the "Software"), to deal in +the Software without restriction, including without limitation the rights to +use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of +the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS +FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR +COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER +IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN +CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/imurmurhash/imurmurhash.js b/node_modules/imurmurhash/imurmurhash.js new file mode 100644 index 000000000..e63146a2b --- /dev/null +++ b/node_modules/imurmurhash/imurmurhash.js @@ -0,0 +1,138 @@ +/** + * @preserve + * JS Implementation of incremental MurmurHash3 (r150) (as of May 10, 2013) + * + * @author Jens Taylor + * @see http://github.com/homebrewing/brauhaus-diff + * @author Gary Court + * @see http://github.com/garycourt/murmurhash-js + * @author Austin Appleby + * @see http://sites.google.com/site/murmurhash/ + */ +(function(){ + var cache; + + // Call this function without `new` to use the cached object (good for + // single-threaded environments), or with `new` to create a new object. + // + // @param {string} key A UTF-16 or ASCII string + // @param {number} seed An optional positive integer + // @return {object} A MurmurHash3 object for incremental hashing + function MurmurHash3(key, seed) { + var m = this instanceof MurmurHash3 ? 
this : cache; + m.reset(seed) + if (typeof key === 'string' && key.length > 0) { + m.hash(key); + } + + if (m !== this) { + return m; + } + }; + + // Incrementally add a string to this hash + // + // @param {string} key A UTF-16 or ASCII string + // @return {object} this + MurmurHash3.prototype.hash = function(key) { + var h1, k1, i, top, len; + + len = key.length; + this.len += len; + + k1 = this.k1; + i = 0; + switch (this.rem) { + case 0: k1 ^= len > i ? (key.charCodeAt(i++) & 0xffff) : 0; + case 1: k1 ^= len > i ? (key.charCodeAt(i++) & 0xffff) << 8 : 0; + case 2: k1 ^= len > i ? (key.charCodeAt(i++) & 0xffff) << 16 : 0; + case 3: + k1 ^= len > i ? (key.charCodeAt(i) & 0xff) << 24 : 0; + k1 ^= len > i ? (key.charCodeAt(i++) & 0xff00) >> 8 : 0; + } + + this.rem = (len + this.rem) & 3; // & 3 is same as % 4 + len -= this.rem; + if (len > 0) { + h1 = this.h1; + while (1) { + k1 = (k1 * 0x2d51 + (k1 & 0xffff) * 0xcc9e0000) & 0xffffffff; + k1 = (k1 << 15) | (k1 >>> 17); + k1 = (k1 * 0x3593 + (k1 & 0xffff) * 0x1b870000) & 0xffffffff; + + h1 ^= k1; + h1 = (h1 << 13) | (h1 >>> 19); + h1 = (h1 * 5 + 0xe6546b64) & 0xffffffff; + + if (i >= len) { + break; + } + + k1 = ((key.charCodeAt(i++) & 0xffff)) ^ + ((key.charCodeAt(i++) & 0xffff) << 8) ^ + ((key.charCodeAt(i++) & 0xffff) << 16); + top = key.charCodeAt(i++); + k1 ^= ((top & 0xff) << 24) ^ + ((top & 0xff00) >> 8); + } + + k1 = 0; + switch (this.rem) { + case 3: k1 ^= (key.charCodeAt(i + 2) & 0xffff) << 16; + case 2: k1 ^= (key.charCodeAt(i + 1) & 0xffff) << 8; + case 1: k1 ^= (key.charCodeAt(i) & 0xffff); + } + + this.h1 = h1; + } + + this.k1 = k1; + return this; + }; + + // Get the result of this hash + // + // @return {number} The 32-bit hash + MurmurHash3.prototype.result = function() { + var k1, h1; + + k1 = this.k1; + h1 = this.h1; + + if (k1 > 0) { + k1 = (k1 * 0x2d51 + (k1 & 0xffff) * 0xcc9e0000) & 0xffffffff; + k1 = (k1 << 15) | (k1 >>> 17); + k1 = (k1 * 0x3593 + (k1 & 0xffff) * 0x1b870000) & 0xffffffff; + h1 ^= k1; + } + + h1 ^= this.len; + + h1 ^= h1 >>> 16; + h1 = (h1 * 0xca6b + (h1 & 0xffff) * 0x85eb0000) & 0xffffffff; + h1 ^= h1 >>> 13; + h1 = (h1 * 0xae35 + (h1 & 0xffff) * 0xc2b20000) & 0xffffffff; + h1 ^= h1 >>> 16; + + return h1 >>> 0; + }; + + // Reset the hash object for reuse + // + // @param {number} seed An optional positive integer + MurmurHash3.prototype.reset = function(seed) { + this.h1 = typeof seed === 'number' ? seed : 0; + this.rem = this.k1 = this.len = 0; + return this; + }; + + // A cached object to use. This can be safely used if you're in a single- + // threaded environment, otherwise you need to create new hashes to use. 
+ cache = new MurmurHash3(); + + if (typeof(module) != 'undefined') { + module.exports = MurmurHash3; + } else { + this.MurmurHash3 = MurmurHash3; + } +}()); diff --git a/node_modules/imurmurhash/imurmurhash.min.js b/node_modules/imurmurhash/imurmurhash.min.js new file mode 100644 index 000000000..dc0ee88d6 --- /dev/null +++ b/node_modules/imurmurhash/imurmurhash.min.js @@ -0,0 +1,12 @@ +/** + * @preserve + * JS Implementation of incremental MurmurHash3 (r150) (as of May 10, 2013) + * + * @author Jens Taylor + * @see http://github.com/homebrewing/brauhaus-diff + * @author Gary Court + * @see http://github.com/garycourt/murmurhash-js + * @author Austin Appleby + * @see http://sites.google.com/site/murmurhash/ + */ +!function(){function t(h,r){var s=this instanceof t?this:e;return s.reset(r),"string"==typeof h&&h.length>0&&s.hash(h),s!==this?s:void 0}var e;t.prototype.hash=function(t){var e,h,r,s,i;switch(i=t.length,this.len+=i,h=this.k1,r=0,this.rem){case 0:h^=i>r?65535&t.charCodeAt(r++):0;case 1:h^=i>r?(65535&t.charCodeAt(r++))<<8:0;case 2:h^=i>r?(65535&t.charCodeAt(r++))<<16:0;case 3:h^=i>r?(255&t.charCodeAt(r))<<24:0,h^=i>r?(65280&t.charCodeAt(r++))>>8:0}if(this.rem=3&i+this.rem,i-=this.rem,i>0){for(e=this.h1;;){if(h=4294967295&11601*h+3432906752*(65535&h),h=h<<15|h>>>17,h=4294967295&13715*h+461832192*(65535&h),e^=h,e=e<<13|e>>>19,e=4294967295&5*e+3864292196,r>=i)break;h=65535&t.charCodeAt(r++)^(65535&t.charCodeAt(r++))<<8^(65535&t.charCodeAt(r++))<<16,s=t.charCodeAt(r++),h^=(255&s)<<24^(65280&s)>>8}switch(h=0,this.rem){case 3:h^=(65535&t.charCodeAt(r+2))<<16;case 2:h^=(65535&t.charCodeAt(r+1))<<8;case 1:h^=65535&t.charCodeAt(r)}this.h1=e}return this.k1=h,this},t.prototype.result=function(){var t,e;return t=this.k1,e=this.h1,t>0&&(t=4294967295&11601*t+3432906752*(65535&t),t=t<<15|t>>>17,t=4294967295&13715*t+461832192*(65535&t),e^=t),e^=this.len,e^=e>>>16,e=4294967295&51819*e+2246770688*(65535&e),e^=e>>>13,e=4294967295&44597*e+3266445312*(65535&e),e^=e>>>16,e>>>0},t.prototype.reset=function(t){return this.h1="number"==typeof t?t:0,this.rem=this.k1=this.len=0,this},e=new t,"undefined"!=typeof module?module.exports=t:this.MurmurHash3=t}(); \ No newline at end of file diff --git a/node_modules/imurmurhash/package.json b/node_modules/imurmurhash/package.json new file mode 100644 index 000000000..bf352c895 --- /dev/null +++ b/node_modules/imurmurhash/package.json @@ -0,0 +1,63 @@ +{ + "_from": "imurmurhash@^0.1.4", + "_id": "imurmurhash@0.1.4", + "_inBundle": false, + "_integrity": "sha1-khi5srkoojixPcT7a21XbyMUU+o=", + "_location": "/imurmurhash", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "imurmurhash@^0.1.4", + "name": "imurmurhash", + "escapedName": "imurmurhash", + "rawSpec": "^0.1.4", + "saveSpec": null, + "fetchSpec": "^0.1.4" + }, + "_requiredBy": [ + "/write-file-atomic" + ], + "_resolved": "https://registry.npmjs.org/imurmurhash/-/imurmurhash-0.1.4.tgz", + "_shasum": "9218b9b2b928a238b13dc4fb6b6d576f231453ea", + "_spec": "imurmurhash@^0.1.4", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\write-file-atomic", + "author": { + "name": "Jens Taylor", + "email": "jensyt@gmail.com", + "url": "https://github.com/homebrewing" + }, + "bugs": { + "url": "https://github.com/jensyt/imurmurhash-js/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "An incremental implementation of MurmurHash3", + "devDependencies": {}, + "engines": { + "node": 
">=0.8.19" + }, + "files": [ + "imurmurhash.js", + "imurmurhash.min.js", + "package.json", + "README.md" + ], + "homepage": "https://github.com/jensyt/imurmurhash-js", + "keywords": [ + "murmur", + "murmurhash", + "murmurhash3", + "hash", + "incremental" + ], + "license": "MIT", + "main": "imurmurhash.js", + "name": "imurmurhash", + "repository": { + "type": "git", + "url": "git+https://github.com/jensyt/imurmurhash-js.git" + }, + "version": "0.1.4" +} diff --git a/node_modules/inherits/LICENSE b/node_modules/inherits/LICENSE new file mode 100644 index 000000000..dea3013d6 --- /dev/null +++ b/node_modules/inherits/LICENSE @@ -0,0 +1,16 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH +REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY AND +FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, +INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM +LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR +OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR +PERFORMANCE OF THIS SOFTWARE. + diff --git a/node_modules/inherits/README.md b/node_modules/inherits/README.md new file mode 100644 index 000000000..b1c566585 --- /dev/null +++ b/node_modules/inherits/README.md @@ -0,0 +1,42 @@ +Browser-friendly inheritance fully compatible with standard node.js +[inherits](http://nodejs.org/api/util.html#util_util_inherits_constructor_superconstructor). + +This package exports standard `inherits` from node.js `util` module in +node environment, but also provides alternative browser-friendly +implementation through [browser +field](https://gist.github.com/shtylman/4339901). Alternative +implementation is a literal copy of standard one located in standalone +module to avoid requiring of `util`. It also has a shim for old +browsers with no `Object.create` support. + +While keeping you sure you are using standard `inherits` +implementation in node.js environment, it allows bundlers such as +[browserify](https://github.com/substack/node-browserify) to not +include full `util` package to your client code if all you need is +just `inherits` function. It worth, because browser shim for `util` +package is large and `inherits` is often the single function you need +from it. + +It's recommended to use this package instead of +`require('util').inherits` for any code that has chances to be used +not only in node.js but in browser too. + +## usage + +```js +var inherits = require('inherits'); +// then use exactly as the standard one +``` + +## note on version ~1.0 + +Version ~1.0 had completely different motivation and is not compatible +neither with 2.0 nor with standard node.js `inherits`. 
+ +If you are using version ~1.0 and planning to switch to ~2.0, be +careful: + +* new version uses `super_` instead of `super` for referencing + superclass +* new version overwrites current prototype while old one preserves any + existing fields on it diff --git a/node_modules/inherits/inherits.js b/node_modules/inherits/inherits.js new file mode 100644 index 000000000..3b94763a7 --- /dev/null +++ b/node_modules/inherits/inherits.js @@ -0,0 +1,7 @@ +try { + var util = require('util'); + if (typeof util.inherits !== 'function') throw ''; + module.exports = util.inherits; +} catch (e) { + module.exports = require('./inherits_browser.js'); +} diff --git a/node_modules/inherits/inherits_browser.js b/node_modules/inherits/inherits_browser.js new file mode 100644 index 000000000..c1e78a75e --- /dev/null +++ b/node_modules/inherits/inherits_browser.js @@ -0,0 +1,23 @@ +if (typeof Object.create === 'function') { + // implementation from standard node.js 'util' module + module.exports = function inherits(ctor, superCtor) { + ctor.super_ = superCtor + ctor.prototype = Object.create(superCtor.prototype, { + constructor: { + value: ctor, + enumerable: false, + writable: true, + configurable: true + } + }); + }; +} else { + // old school shim for old browsers + module.exports = function inherits(ctor, superCtor) { + ctor.super_ = superCtor + var TempCtor = function () {} + TempCtor.prototype = superCtor.prototype + ctor.prototype = new TempCtor() + ctor.prototype.constructor = ctor + } +} diff --git a/node_modules/inherits/package.json b/node_modules/inherits/package.json new file mode 100644 index 000000000..12df5cde4 --- /dev/null +++ b/node_modules/inherits/package.json @@ -0,0 +1,61 @@ +{ + "_from": "inherits@2.0.3", + "_id": "inherits@2.0.3", + "_inBundle": false, + "_integrity": "sha1-Yzwsg+PaQqUC9SRmAiSA9CCCYd4=", + "_location": "/inherits", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "inherits@2.0.3", + "name": "inherits", + "escapedName": "inherits", + "rawSpec": "2.0.3", + "saveSpec": null, + "fetchSpec": "2.0.3" + }, + "_requiredBy": [ + "/http-errors" + ], + "_resolved": "https://registry.npmjs.org/inherits/-/inherits-2.0.3.tgz", + "_shasum": "633c2c83e3da42a502f52466022480f4208261de", + "_spec": "inherits@2.0.3", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\http-errors", + "browser": "./inherits_browser.js", + "bugs": { + "url": "https://github.com/isaacs/inherits/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Browser-friendly inheritance fully compatible with standard node.js inherits()", + "devDependencies": { + "tap": "^7.1.0" + }, + "files": [ + "inherits.js", + "inherits_browser.js" + ], + "homepage": "https://github.com/isaacs/inherits#readme", + "keywords": [ + "inheritance", + "class", + "klass", + "oop", + "object-oriented", + "inherits", + "browser", + "browserify" + ], + "license": "ISC", + "main": "./inherits.js", + "name": "inherits", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/inherits.git" + }, + "scripts": { + "test": "node test" + }, + "version": "2.0.3" +} diff --git a/node_modules/ini/LICENSE b/node_modules/ini/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/ini/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. 
Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/ini/README.md b/node_modules/ini/README.md new file mode 100644 index 000000000..33df25829 --- /dev/null +++ b/node_modules/ini/README.md @@ -0,0 +1,102 @@ +An ini format parser and serializer for node. + +Sections are treated as nested objects. Items before the first +heading are saved on the object directly. + +## Usage + +Consider an ini-file `config.ini` that looks like this: + + ; this comment is being ignored + scope = global + + [database] + user = dbuser + password = dbpassword + database = use_this_database + + [paths.default] + datadir = /var/lib/data + array[] = first value + array[] = second value + array[] = third value + +You can read, manipulate and write the ini-file like so: + + var fs = require('fs') + , ini = require('ini') + + var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8')) + + config.scope = 'local' + config.database.database = 'use_another_database' + config.paths.default.tmpdir = '/tmp' + delete config.paths.default.datadir + config.paths.default.array.push('fourth value') + + fs.writeFileSync('./config_modified.ini', ini.stringify(config, { section: 'section' })) + +This will result in a file called `config_modified.ini` being written +to the filesystem with the following content: + + [section] + scope=local + [section.database] + user=dbuser + password=dbpassword + database=use_another_database + [section.paths.default] + tmpdir=/tmp + array[]=first value + array[]=second value + array[]=third value + array[]=fourth value + + +## API + +### decode(inistring) + +Decode the ini-style formatted `inistring` into a nested object. + +### parse(inistring) + +Alias for `decode(inistring)` + +### encode(object, [options]) + +Encode the object `object` into an ini-style formatted string. If the +optional parameter `section` is given, then all top-level properties +of the object are put into this section and the `section`-string is +prepended to all sub-sections, see the usage example above. + +The `options` object may contain the following: + +* `section` A string which will be the first `section` in the encoded + ini data. Defaults to none. +* `whitespace` Boolean to specify whether to put whitespace around the + `=` character. By default, whitespace is omitted, to be friendly to + some persnickety old parsers that don't tolerate it well. But some + find that it's more human-readable and pretty with the whitespace. + +For backwards compatibility reasons, if a `string` options is passed +in, then it is assumed to be the `section` value. + +### stringify(object, [options]) + +Alias for `encode(object, [options])` + +### safe(val) + +Escapes the string `val` such that it is safe to be used as a key or +value in an ini-file. Basically escapes quotes. 
For example + + ini.safe('"unsafe string"') + +would result in + + "\"unsafe string\"" + +### unsafe(val) + +Unescapes the string `val` diff --git a/node_modules/ini/ini.js b/node_modules/ini/ini.js new file mode 100644 index 000000000..7d05a719b --- /dev/null +++ b/node_modules/ini/ini.js @@ -0,0 +1,206 @@ +const { hasOwnProperty } = Object.prototype + +const eol = typeof process !== 'undefined' && + process.platform === 'win32' ? '\r\n' : '\n' + +const encode = (obj, opt) => { + const children = [] + let out = '' + + if (typeof opt === 'string') { + opt = { + section: opt, + whitespace: false, + } + } else { + opt = opt || Object.create(null) + opt.whitespace = opt.whitespace === true + } + + const separator = opt.whitespace ? ' = ' : '=' + + for (const k of Object.keys(obj)) { + const val = obj[k] + if (val && Array.isArray(val)) { + for (const item of val) + out += safe(k + '[]') + separator + safe(item) + '\n' + } else if (val && typeof val === 'object') + children.push(k) + else + out += safe(k) + separator + safe(val) + eol + } + + if (opt.section && out.length) + out = '[' + safe(opt.section) + ']' + eol + out + + for (const k of children) { + const nk = dotSplit(k).join('\\.') + const section = (opt.section ? opt.section + '.' : '') + nk + const { whitespace } = opt + const child = encode(obj[k], { + section, + whitespace, + }) + if (out.length && child.length) + out += eol + + out += child + } + + return out +} + +const dotSplit = str => + str.replace(/\1/g, '\u0002LITERAL\\1LITERAL\u0002') + .replace(/\\\./g, '\u0001') + .split(/\./) + .map(part => + part.replace(/\1/g, '\\.') + .replace(/\2LITERAL\\1LITERAL\2/g, '\u0001')) + +const decode = str => { + const out = Object.create(null) + let p = out + let section = null + // section |key = value + const re = /^\[([^\]]*)\]$|^([^=]+)(=(.*))?$/i + const lines = str.split(/[\r\n]+/g) + + for (const line of lines) { + if (!line || line.match(/^\s*[;#]/)) + continue + const match = line.match(re) + if (!match) + continue + if (match[1] !== undefined) { + section = unsafe(match[1]) + if (section === '__proto__') { + // not allowed + // keep parsing the section, but don't attach it. + p = Object.create(null) + continue + } + p = out[section] = out[section] || Object.create(null) + continue + } + const keyRaw = unsafe(match[2]) + const isArray = keyRaw.length > 2 && keyRaw.slice(-2) === '[]' + const key = isArray ? keyRaw.slice(0, -2) : keyRaw + if (key === '__proto__') + continue + const valueRaw = match[3] ? unsafe(match[4]) : true + const value = valueRaw === 'true' || + valueRaw === 'false' || + valueRaw === 'null' ? JSON.parse(valueRaw) + : valueRaw + + // Convert keys with '[]' suffix to an array + if (isArray) { + if (!hasOwnProperty.call(p, key)) + p[key] = [] + else if (!Array.isArray(p[key])) + p[key] = [p[key]] + } + + // safeguard against resetting a previously defined + // array by accidentally forgetting the brackets + if (Array.isArray(p[key])) + p[key].push(value) + else + p[key] = value + } + + // {a:{y:1},"a.b":{x:2}} --> {a:{y:1,b:{x:2}}} + // use a filter to return the keys that have to be deleted. + const remove = [] + for (const k of Object.keys(out)) { + if (!hasOwnProperty.call(out, k) || + typeof out[k] !== 'object' || + Array.isArray(out[k])) + continue + + // see if the parent section is also an object. 
+ // if so, add it to that, and mark this one for deletion + const parts = dotSplit(k) + let p = out + const l = parts.pop() + const nl = l.replace(/\\\./g, '.') + for (const part of parts) { + if (part === '__proto__') + continue + if (!hasOwnProperty.call(p, part) || typeof p[part] !== 'object') + p[part] = Object.create(null) + p = p[part] + } + if (p === out && nl === l) + continue + + p[nl] = out[k] + remove.push(k) + } + for (const del of remove) + delete out[del] + + return out +} + +const isQuoted = val => + (val.charAt(0) === '"' && val.slice(-1) === '"') || + (val.charAt(0) === "'" && val.slice(-1) === "'") + +const safe = val => + (typeof val !== 'string' || + val.match(/[=\r\n]/) || + val.match(/^\[/) || + (val.length > 1 && + isQuoted(val)) || + val !== val.trim()) + ? JSON.stringify(val) + : val.replace(/;/g, '\\;').replace(/#/g, '\\#') + +const unsafe = (val, doUnesc) => { + val = (val || '').trim() + if (isQuoted(val)) { + // remove the single quotes before calling JSON.parse + if (val.charAt(0) === "'") + val = val.substr(1, val.length - 2) + + try { + val = JSON.parse(val) + } catch (_) {} + } else { + // walk the val to find the first not-escaped ; character + let esc = false + let unesc = '' + for (let i = 0, l = val.length; i < l; i++) { + const c = val.charAt(i) + if (esc) { + if ('\\;#'.indexOf(c) !== -1) + unesc += c + else + unesc += '\\' + c + + esc = false + } else if (';#'.indexOf(c) !== -1) + break + else if (c === '\\') + esc = true + else + unesc += c + } + if (esc) + unesc += '\\' + + return unesc.trim() + } + return val +} + +module.exports = { + parse: decode, + decode, + stringify: encode, + encode, + safe, + unsafe, +} diff --git a/node_modules/ini/package.json b/node_modules/ini/package.json new file mode 100644 index 000000000..1e273c5bc --- /dev/null +++ b/node_modules/ini/package.json @@ -0,0 +1,69 @@ +{ + "_from": "ini@2.0.0", + "_id": "ini@2.0.0", + "_inBundle": false, + "_integrity": "sha512-7PnF4oN3CvZF23ADhA5wRaYEQpJ8qygSkbtTXWBeXWXmEVRXK+1ITciHWwHhsjv1TmW0MgacIv6hEi5pX5NQdA==", + "_location": "/ini", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "ini@2.0.0", + "name": "ini", + "escapedName": "ini", + "rawSpec": "2.0.0", + "saveSpec": null, + "fetchSpec": "2.0.0" + }, + "_requiredBy": [ + "/global-dirs" + ], + "_resolved": "https://registry.npmjs.org/ini/-/ini-2.0.0.tgz", + "_shasum": "e5fd556ecdd5726be978fa1001862eacb0a94bc5", + "_spec": "ini@2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\global-dirs", + "author": { + "name": "Isaac Z. 
Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me/" + }, + "bugs": { + "url": "https://github.com/isaacs/ini/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "An ini encoder/decoder for node", + "devDependencies": { + "eslint": "^7.9.0", + "eslint-plugin-import": "^2.22.0", + "eslint-plugin-node": "^11.1.0", + "eslint-plugin-promise": "^4.2.1", + "eslint-plugin-standard": "^4.0.1", + "tap": "14" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "ini.js" + ], + "homepage": "https://github.com/isaacs/ini#readme", + "license": "ISC", + "main": "ini.js", + "name": "ini", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/ini.git" + }, + "scripts": { + "eslint": "eslint", + "lint": "npm run eslint -- ini.js test/*.js", + "lintfix": "npm run lint -- --fix", + "posttest": "npm run lint", + "postversion": "npm publish", + "prepublishOnly": "git push origin --follow-tags", + "preversion": "npm test", + "test": "tap" + }, + "version": "2.0.0" +} diff --git a/node_modules/ipaddr.js/LICENSE b/node_modules/ipaddr.js/LICENSE new file mode 100644 index 000000000..f6b37b52d --- /dev/null +++ b/node_modules/ipaddr.js/LICENSE @@ -0,0 +1,19 @@ +Copyright (C) 2011-2017 whitequark + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/ipaddr.js/README.md b/node_modules/ipaddr.js/README.md new file mode 100644 index 000000000..f57725b0f --- /dev/null +++ b/node_modules/ipaddr.js/README.md @@ -0,0 +1,233 @@ +# ipaddr.js — an IPv6 and IPv4 address manipulation library [![Build Status](https://travis-ci.org/whitequark/ipaddr.js.svg)](https://travis-ci.org/whitequark/ipaddr.js) + +ipaddr.js is a small (1.9K minified and gzipped) library for manipulating +IP addresses in JavaScript environments. It runs on both CommonJS runtimes +(e.g. [nodejs]) and in a web browser. + +ipaddr.js allows you to verify and parse string representation of an IP +address, match it against a CIDR range or range list, determine if it falls +into some reserved ranges (examples include loopback and private ranges), +and convert between IPv4 and IPv4-mapped IPv6 addresses. + +[nodejs]: http://nodejs.org + +## Installation + +`npm install ipaddr.js` + +or + +`bower install ipaddr.js` + +## API + +ipaddr.js defines one object in the global scope: `ipaddr`. In CommonJS, +it is exported from the module: + +```js +var ipaddr = require('ipaddr.js'); +``` + +The API consists of several global methods and two classes: ipaddr.IPv6 and ipaddr.IPv4. 
+ +### Global methods + +There are three global methods defined: `ipaddr.isValid`, `ipaddr.parse` and +`ipaddr.process`. All of them receive a string as a single parameter. + +The `ipaddr.isValid` method returns `true` if the address is a valid IPv4 or +IPv6 address, and `false` otherwise. It does not throw any exceptions. + +The `ipaddr.parse` method returns an object representing the IP address, +or throws an `Error` if the passed string is not a valid representation of an +IP address. + +The `ipaddr.process` method works just like the `ipaddr.parse` one, but it +automatically converts IPv4-mapped IPv6 addresses to their IPv4 counterparts +before returning. It is useful when you have a Node.js instance listening +on an IPv6 socket, and the `net.ivp6.bindv6only` sysctl parameter (or its +equivalent on non-Linux OS) is set to 0. In this case, you can accept IPv4 +connections on your IPv6-only socket, but the remote address will be mangled. +Use `ipaddr.process` method to automatically demangle it. + +### Object representation + +Parsing methods return an object which descends from `ipaddr.IPv6` or +`ipaddr.IPv4`. These objects share some properties, but most of them differ. + +#### Shared properties + +One can determine the type of address by calling `addr.kind()`. It will return +either `"ipv6"` or `"ipv4"`. + +An address can be converted back to its string representation with `addr.toString()`. +Note that this method: + * does not return the original string used to create the object (in fact, there is + no way of getting that string) + * returns a compact representation (when it is applicable) + +A `match(range, bits)` method can be used to check if the address falls into a +certain CIDR range. +Note that an address can be (obviously) matched only against an address of the same type. + +For example: + +```js +var addr = ipaddr.parse("2001:db8:1234::1"); +var range = ipaddr.parse("2001:db8::"); + +addr.match(range, 32); // => true +``` + +Alternatively, `match` can also be called as `match([range, bits])`. In this way, +it can be used together with the `parseCIDR(string)` method, which parses an IP +address together with a CIDR range. + +For example: + +```js +var addr = ipaddr.parse("2001:db8:1234::1"); + +addr.match(ipaddr.parseCIDR("2001:db8::/32")); // => true +``` + +A `range()` method returns one of predefined names for several special ranges defined +by IP protocols. The exact names (and their respective CIDR ranges) can be looked up +in the source: [IPv6 ranges] and [IPv4 ranges]. Some common ones include `"unicast"` +(the default one) and `"reserved"`. + +You can match against your own range list by using +`ipaddr.subnetMatch(address, rangeList, defaultName)` method. It can work with a mix of IPv6 or IPv4 addresses, and accepts a name-to-subnet map as the range list. For example: + +```js +var rangeList = { + documentationOnly: [ ipaddr.parse('2001:db8::'), 32 ], + tunnelProviders: [ + [ ipaddr.parse('2001:470::'), 32 ], // he.net + [ ipaddr.parse('2001:5c0::'), 32 ] // freenet6 + ] +}; +ipaddr.subnetMatch(ipaddr.parse('2001:470:8:66::1'), rangeList, 'unknown'); // => "tunnelProviders" +``` + +The addresses can be converted to their byte representation with `toByteArray()`. +(Actually, JavaScript mostly does not know about byte buffers. They are emulated with +arrays of numbers, each in range of 0..255.) 
+ +```js +var bytes = ipaddr.parse('2a00:1450:8007::68').toByteArray(); // ipv6.google.com +bytes // => [42, 0x00, 0x14, 0x50, 0x80, 0x07, 0x00, , 0x00, 0x68 ] +``` + +The `ipaddr.IPv4` and `ipaddr.IPv6` objects have some methods defined, too. All of them +have the same interface for both protocols, and are similar to global methods. + +`ipaddr.IPvX.isValid(string)` can be used to check if the string is a valid address +for particular protocol, and `ipaddr.IPvX.parse(string)` is the error-throwing parser. + +`ipaddr.IPvX.isValid(string)` uses the same format for parsing as the POSIX `inet_ntoa` function, which accepts unusual formats like `0xc0.168.1.1` or `0x10000000`. The function `ipaddr.IPv4.isValidFourPartDecimal(string)` validates the IPv4 address and also ensures that it is written in four-part decimal format. + +[IPv6 ranges]: https://github.com/whitequark/ipaddr.js/blob/master/src/ipaddr.coffee#L186 +[IPv4 ranges]: https://github.com/whitequark/ipaddr.js/blob/master/src/ipaddr.coffee#L71 + +#### IPv6 properties + +Sometimes you will want to convert IPv6 not to a compact string representation (with +the `::` substitution); the `toNormalizedString()` method will return an address where +all zeroes are explicit. + +For example: + +```js +var addr = ipaddr.parse("2001:0db8::0001"); +addr.toString(); // => "2001:db8::1" +addr.toNormalizedString(); // => "2001:db8:0:0:0:0:0:1" +``` + +The `isIPv4MappedAddress()` method will return `true` if this address is an IPv4-mapped +one, and `toIPv4Address()` will return an IPv4 object address. + +To access the underlying binary representation of the address, use `addr.parts`. + +```js +var addr = ipaddr.parse("2001:db8:10::1234:DEAD"); +addr.parts // => [0x2001, 0xdb8, 0x10, 0, 0, 0, 0x1234, 0xdead] +``` + +A IPv6 zone index can be accessed via `addr.zoneId`: + +```js +var addr = ipaddr.parse("2001:db8::%eth0"); +addr.zoneId // => 'eth0' +``` + +#### IPv4 properties + +`toIPv4MappedAddress()` will return a corresponding IPv4-mapped IPv6 address. + +To access the underlying representation of the address, use `addr.octets`. + +```js +var addr = ipaddr.parse("192.168.1.1"); +addr.octets // => [192, 168, 1, 1] +``` + +`prefixLengthFromSubnetMask()` will return a CIDR prefix length for a valid IPv4 netmask or +null if the netmask is not valid. + +```js +ipaddr.IPv4.parse('255.255.255.240').prefixLengthFromSubnetMask() == 28 +ipaddr.IPv4.parse('255.192.164.0').prefixLengthFromSubnetMask() == null +``` + +`subnetMaskFromPrefixLength()` will return an IPv4 netmask for a valid CIDR prefix length. + +```js +ipaddr.IPv4.subnetMaskFromPrefixLength(24) == "255.255.255.0" +ipaddr.IPv4.subnetMaskFromPrefixLength(29) == "255.255.255.248" +``` + +`broadcastAddressFromCIDR()` will return the broadcast address for a given IPv4 interface and netmask in CIDR notation. +```js +ipaddr.IPv4.broadcastAddressFromCIDR("172.0.0.1/24") == "172.0.0.255" +``` +`networkAddressFromCIDR()` will return the network address for a given IPv4 interface and netmask in CIDR notation. +```js +ipaddr.IPv4.networkAddressFromCIDR("172.0.0.1/24") == "172.0.0.0" +``` + +#### Conversion + +IPv4 and IPv6 can be converted bidirectionally to and from network byte order (MSB) byte arrays. + +The `fromByteArray()` method will take an array and create an appropriate IPv4 or IPv6 object +if the input satisfies the requirements. For IPv4 it has to be an array of four 8-bit values, +while for IPv6 it has to be an array of sixteen 8-bit values. 
+ +For example: +```js +var addr = ipaddr.fromByteArray([0x7f, 0, 0, 1]); +addr.toString(); // => "127.0.0.1" +``` + +or + +```js +var addr = ipaddr.fromByteArray([0x20, 1, 0xd, 0xb8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1]) +addr.toString(); // => "2001:db8::1" +``` + +Both objects also offer a `toByteArray()` method, which returns an array in network byte order (MSB). + +For example: +```js +var addr = ipaddr.parse("127.0.0.1"); +addr.toByteArray(); // => [0x7f, 0, 0, 1] +``` + +or + +```js +var addr = ipaddr.parse("2001:db8::1"); +addr.toByteArray(); // => [0x20, 1, 0xd, 0xb8, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1] +``` diff --git a/node_modules/ipaddr.js/ipaddr.min.js b/node_modules/ipaddr.js/ipaddr.min.js new file mode 100644 index 000000000..b54a7cc42 --- /dev/null +++ b/node_modules/ipaddr.js/ipaddr.min.js @@ -0,0 +1 @@ +(function(){var r,t,n,e,i,o,a,s;t={},s=this,"undefined"!=typeof module&&null!==module&&module.exports?module.exports=t:s.ipaddr=t,a=function(r,t,n,e){var i,o;if(r.length!==t.length)throw new Error("ipaddr: cannot match CIDR for objects with different lengths");for(i=0;e>0;){if((o=n-e)<0&&(o=0),r[i]>>o!=t[i]>>o)return!1;e-=n,i+=1}return!0},t.subnetMatch=function(r,t,n){var e,i,o,a,s;null==n&&(n="unicast");for(o in t)for(!(a=t[o])[0]||a[0]instanceof Array||(a=[a]),e=0,i=a.length;e=0;t=n+=-1){if(!((e=this.octets[t])in a))return null;if(o=a[e],i&&0!==o)return null;8!==o&&(i=!0),r+=o}return 32-r},r}(),n="(0?\\d+|0x[a-f0-9]+)",e={fourOctet:new RegExp("^"+n+"\\."+n+"\\."+n+"\\."+n+"$","i"),longValue:new RegExp("^"+n+"$","i")},t.IPv4.parser=function(r){var t,n,i,o,a;if(n=function(r){return"0"===r[0]&&"x"!==r[1]?parseInt(r,8):parseInt(r)},t=r.match(e.fourOctet))return function(){var r,e,o,a;for(a=[],r=0,e=(o=t.slice(1,6)).length;r4294967295||a<0)throw new Error("ipaddr: address outside defined range");return function(){var r,t;for(t=[],o=r=0;r<=24;o=r+=8)t.push(a>>o&255);return t}().reverse()}return null},t.IPv6=function(){function r(r,t){var n,e,i,o,a,s;if(16===r.length)for(this.parts=[],n=e=0;e<=14;n=e+=2)this.parts.push(r[n]<<8|r[n+1]);else{if(8!==r.length)throw new Error("ipaddr: ipv6 part count should be 8 or 16");this.parts=r}for(i=0,o=(s=this.parts).length;it&&(r=n.index,t=n[0].length);return t<0?i:i.substring(0,r)+"::"+i.substring(r+t)},r.prototype.toByteArray=function(){var r,t,n,e,i;for(r=[],t=0,n=(i=this.parts).length;t>8),r.push(255&e);return r},r.prototype.toNormalizedString=function(){var r,t,n;return r=function(){var r,n,e,i;for(i=[],r=0,n=(e=this.parts).length;r>8,255&r,n>>8,255&n])},r.prototype.prefixLengthFromSubnetMask=function(){var r,t,n,e,i,o,a;for(a={0:16,32768:15,49152:14,57344:13,61440:12,63488:11,64512:10,65024:9,65280:8,65408:7,65472:6,65504:5,65520:4,65528:3,65532:2,65534:1,65535:0},r=0,i=!1,t=n=7;n>=0;t=n+=-1){if(!((e=this.parts[t])in a))return null;if(o=a[e],i&&0!==o)return null;16!==o&&(i=!0),r+=o}return 128-r},r}(),i="(?:[0-9a-f]+::?)+",o={zoneIndex:new RegExp("%[0-9a-z]{1,}","i"),native:new RegExp("^(::)?("+i+")?([0-9a-f]+)?(::)?(%[0-9a-z]{1,})?$","i"),transitional:new RegExp("^((?:"+i+")|(?:::)(?:"+i+")?)"+n+"\\."+n+"\\."+n+"\\."+n+"(%[0-9a-z]{1,})?$","i")},r=function(r,t){var n,e,i,a,s,p;if(r.indexOf("::")!==r.lastIndexOf("::"))return null;for((p=(r.match(o.zoneIndex)||[])[0])&&(p=p.substring(1),r=r.replace(/%.+$/,"")),n=0,e=-1;(e=r.indexOf(":",e+1))>=0;)n++;if("::"===r.substr(0,2)&&n--,"::"===r.substr(-2,2)&&n--,n>t)return 
null;for(s=t-n,a=":";s--;)a+="0:";return":"===(r=r.replace("::",a))[0]&&(r=r.slice(1)),":"===r[r.length-1]&&(r=r.slice(0,-1)),t=function(){var t,n,e,o;for(o=[],t=0,n=(e=r.split(":")).length;t=0&&t<=32)return e=[this.parse(n[1]),t],Object.defineProperty(e,"toString",{value:function(){return this.join("/")}}),e;throw new Error("ipaddr: string is not formatted like an IPv4 CIDR range")},t.IPv4.subnetMaskFromPrefixLength=function(r){var t,n,e;if((r=parseInt(r))<0||r>32)throw new Error("ipaddr: invalid IPv4 prefix length");for(e=[0,0,0,0],n=0,t=Math.floor(r/8);n=0&&t<=128)return e=[this.parse(n[1]),t],Object.defineProperty(e,"toString",{value:function(){return this.join("/")}}),e;throw new Error("ipaddr: string is not formatted like an IPv6 CIDR range")},t.isValid=function(r){return t.IPv6.isValid(r)||t.IPv4.isValid(r)},t.parse=function(r){if(t.IPv6.isValid(r))return t.IPv6.parse(r);if(t.IPv4.isValid(r))return t.IPv4.parse(r);throw new Error("ipaddr: the address has neither IPv6 nor IPv4 format")},t.parseCIDR=function(r){try{return t.IPv6.parseCIDR(r)}catch(n){n;try{return t.IPv4.parseCIDR(r)}catch(r){throw r,new Error("ipaddr: the address has neither IPv6 nor IPv4 CIDR format")}}},t.fromByteArray=function(r){var n;if(4===(n=r.length))return new t.IPv4(r);if(16===n)return new t.IPv6(r);throw new Error("ipaddr: the binary input is neither an IPv6 nor IPv4 address")},t.process=function(r){var t;return t=this.parse(r),"ipv6"===t.kind()&&t.isIPv4MappedAddress()?t.toIPv4Address():t}}).call(this); \ No newline at end of file diff --git a/node_modules/ipaddr.js/lib/ipaddr.js b/node_modules/ipaddr.js/lib/ipaddr.js new file mode 100644 index 000000000..18bd93b5e --- /dev/null +++ b/node_modules/ipaddr.js/lib/ipaddr.js @@ -0,0 +1,673 @@ +(function() { + var expandIPv6, ipaddr, ipv4Part, ipv4Regexes, ipv6Part, ipv6Regexes, matchCIDR, root, zoneIndex; + + ipaddr = {}; + + root = this; + + if ((typeof module !== "undefined" && module !== null) && module.exports) { + module.exports = ipaddr; + } else { + root['ipaddr'] = ipaddr; + } + + matchCIDR = function(first, second, partSize, cidrBits) { + var part, shift; + if (first.length !== second.length) { + throw new Error("ipaddr: cannot match CIDR for objects with different lengths"); + } + part = 0; + while (cidrBits > 0) { + shift = partSize - cidrBits; + if (shift < 0) { + shift = 0; + } + if (first[part] >> shift !== second[part] >> shift) { + return false; + } + cidrBits -= partSize; + part += 1; + } + return true; + }; + + ipaddr.subnetMatch = function(address, rangeList, defaultName) { + var k, len, rangeName, rangeSubnets, subnet; + if (defaultName == null) { + defaultName = 'unicast'; + } + for (rangeName in rangeList) { + rangeSubnets = rangeList[rangeName]; + if (rangeSubnets[0] && !(rangeSubnets[0] instanceof Array)) { + rangeSubnets = [rangeSubnets]; + } + for (k = 0, len = rangeSubnets.length; k < len; k++) { + subnet = rangeSubnets[k]; + if (address.kind() === subnet[0].kind()) { + if (address.match.apply(address, subnet)) { + return rangeName; + } + } + } + } + return defaultName; + }; + + ipaddr.IPv4 = (function() { + function IPv4(octets) { + var k, len, octet; + if (octets.length !== 4) { + throw new Error("ipaddr: ipv4 octet count should be 4"); + } + for (k = 0, len = octets.length; k < len; k++) { + octet = octets[k]; + if (!((0 <= octet && octet <= 255))) { + throw new Error("ipaddr: ipv4 octet should fit in 8 bits"); + } + } + this.octets = octets; + } + + IPv4.prototype.kind = function() { + return 'ipv4'; + }; + + 
IPv4.prototype.toString = function() { + return this.octets.join("."); + }; + + IPv4.prototype.toNormalizedString = function() { + return this.toString(); + }; + + IPv4.prototype.toByteArray = function() { + return this.octets.slice(0); + }; + + IPv4.prototype.match = function(other, cidrRange) { + var ref; + if (cidrRange === void 0) { + ref = other, other = ref[0], cidrRange = ref[1]; + } + if (other.kind() !== 'ipv4') { + throw new Error("ipaddr: cannot match ipv4 address with non-ipv4 one"); + } + return matchCIDR(this.octets, other.octets, 8, cidrRange); + }; + + IPv4.prototype.SpecialRanges = { + unspecified: [[new IPv4([0, 0, 0, 0]), 8]], + broadcast: [[new IPv4([255, 255, 255, 255]), 32]], + multicast: [[new IPv4([224, 0, 0, 0]), 4]], + linkLocal: [[new IPv4([169, 254, 0, 0]), 16]], + loopback: [[new IPv4([127, 0, 0, 0]), 8]], + carrierGradeNat: [[new IPv4([100, 64, 0, 0]), 10]], + "private": [[new IPv4([10, 0, 0, 0]), 8], [new IPv4([172, 16, 0, 0]), 12], [new IPv4([192, 168, 0, 0]), 16]], + reserved: [[new IPv4([192, 0, 0, 0]), 24], [new IPv4([192, 0, 2, 0]), 24], [new IPv4([192, 88, 99, 0]), 24], [new IPv4([198, 51, 100, 0]), 24], [new IPv4([203, 0, 113, 0]), 24], [new IPv4([240, 0, 0, 0]), 4]] + }; + + IPv4.prototype.range = function() { + return ipaddr.subnetMatch(this, this.SpecialRanges); + }; + + IPv4.prototype.toIPv4MappedAddress = function() { + return ipaddr.IPv6.parse("::ffff:" + (this.toString())); + }; + + IPv4.prototype.prefixLengthFromSubnetMask = function() { + var cidr, i, k, octet, stop, zeros, zerotable; + zerotable = { + 0: 8, + 128: 7, + 192: 6, + 224: 5, + 240: 4, + 248: 3, + 252: 2, + 254: 1, + 255: 0 + }; + cidr = 0; + stop = false; + for (i = k = 3; k >= 0; i = k += -1) { + octet = this.octets[i]; + if (octet in zerotable) { + zeros = zerotable[octet]; + if (stop && zeros !== 0) { + return null; + } + if (zeros !== 8) { + stop = true; + } + cidr += zeros; + } else { + return null; + } + } + return 32 - cidr; + }; + + return IPv4; + + })(); + + ipv4Part = "(0?\\d+|0x[a-f0-9]+)"; + + ipv4Regexes = { + fourOctet: new RegExp("^" + ipv4Part + "\\." + ipv4Part + "\\." + ipv4Part + "\\." 
+ ipv4Part + "$", 'i'), + longValue: new RegExp("^" + ipv4Part + "$", 'i') + }; + + ipaddr.IPv4.parser = function(string) { + var match, parseIntAuto, part, shift, value; + parseIntAuto = function(string) { + if (string[0] === "0" && string[1] !== "x") { + return parseInt(string, 8); + } else { + return parseInt(string); + } + }; + if (match = string.match(ipv4Regexes.fourOctet)) { + return (function() { + var k, len, ref, results; + ref = match.slice(1, 6); + results = []; + for (k = 0, len = ref.length; k < len; k++) { + part = ref[k]; + results.push(parseIntAuto(part)); + } + return results; + })(); + } else if (match = string.match(ipv4Regexes.longValue)) { + value = parseIntAuto(match[1]); + if (value > 0xffffffff || value < 0) { + throw new Error("ipaddr: address outside defined range"); + } + return ((function() { + var k, results; + results = []; + for (shift = k = 0; k <= 24; shift = k += 8) { + results.push((value >> shift) & 0xff); + } + return results; + })()).reverse(); + } else { + return null; + } + }; + + ipaddr.IPv6 = (function() { + function IPv6(parts, zoneId) { + var i, k, l, len, part, ref; + if (parts.length === 16) { + this.parts = []; + for (i = k = 0; k <= 14; i = k += 2) { + this.parts.push((parts[i] << 8) | parts[i + 1]); + } + } else if (parts.length === 8) { + this.parts = parts; + } else { + throw new Error("ipaddr: ipv6 part count should be 8 or 16"); + } + ref = this.parts; + for (l = 0, len = ref.length; l < len; l++) { + part = ref[l]; + if (!((0 <= part && part <= 0xffff))) { + throw new Error("ipaddr: ipv6 part should fit in 16 bits"); + } + } + if (zoneId) { + this.zoneId = zoneId; + } + } + + IPv6.prototype.kind = function() { + return 'ipv6'; + }; + + IPv6.prototype.toString = function() { + return this.toNormalizedString().replace(/((^|:)(0(:|$))+)/, '::'); + }; + + IPv6.prototype.toRFC5952String = function() { + var bestMatchIndex, bestMatchLength, match, regex, string; + regex = /((^|:)(0(:|$)){2,})/g; + string = this.toNormalizedString(); + bestMatchIndex = 0; + bestMatchLength = -1; + while ((match = regex.exec(string))) { + if (match[0].length > bestMatchLength) { + bestMatchIndex = match.index; + bestMatchLength = match[0].length; + } + } + if (bestMatchLength < 0) { + return string; + } + return string.substring(0, bestMatchIndex) + '::' + string.substring(bestMatchIndex + bestMatchLength); + }; + + IPv6.prototype.toByteArray = function() { + var bytes, k, len, part, ref; + bytes = []; + ref = this.parts; + for (k = 0, len = ref.length; k < len; k++) { + part = ref[k]; + bytes.push(part >> 8); + bytes.push(part & 0xff); + } + return bytes; + }; + + IPv6.prototype.toNormalizedString = function() { + var addr, part, suffix; + addr = ((function() { + var k, len, ref, results; + ref = this.parts; + results = []; + for (k = 0, len = ref.length; k < len; k++) { + part = ref[k]; + results.push(part.toString(16)); + } + return results; + }).call(this)).join(":"); + suffix = ''; + if (this.zoneId) { + suffix = '%' + this.zoneId; + } + return addr + suffix; + }; + + IPv6.prototype.toFixedLengthString = function() { + var addr, part, suffix; + addr = ((function() { + var k, len, ref, results; + ref = this.parts; + results = []; + for (k = 0, len = ref.length; k < len; k++) { + part = ref[k]; + results.push(part.toString(16).padStart(4, '0')); + } + return results; + }).call(this)).join(":"); + suffix = ''; + if (this.zoneId) { + suffix = '%' + this.zoneId; + } + return addr + suffix; + }; + + IPv6.prototype.match = function(other, cidrRange) { + var ref; 
+ if (cidrRange === void 0) { + ref = other, other = ref[0], cidrRange = ref[1]; + } + if (other.kind() !== 'ipv6') { + throw new Error("ipaddr: cannot match ipv6 address with non-ipv6 one"); + } + return matchCIDR(this.parts, other.parts, 16, cidrRange); + }; + + IPv6.prototype.SpecialRanges = { + unspecified: [new IPv6([0, 0, 0, 0, 0, 0, 0, 0]), 128], + linkLocal: [new IPv6([0xfe80, 0, 0, 0, 0, 0, 0, 0]), 10], + multicast: [new IPv6([0xff00, 0, 0, 0, 0, 0, 0, 0]), 8], + loopback: [new IPv6([0, 0, 0, 0, 0, 0, 0, 1]), 128], + uniqueLocal: [new IPv6([0xfc00, 0, 0, 0, 0, 0, 0, 0]), 7], + ipv4Mapped: [new IPv6([0, 0, 0, 0, 0, 0xffff, 0, 0]), 96], + rfc6145: [new IPv6([0, 0, 0, 0, 0xffff, 0, 0, 0]), 96], + rfc6052: [new IPv6([0x64, 0xff9b, 0, 0, 0, 0, 0, 0]), 96], + '6to4': [new IPv6([0x2002, 0, 0, 0, 0, 0, 0, 0]), 16], + teredo: [new IPv6([0x2001, 0, 0, 0, 0, 0, 0, 0]), 32], + reserved: [[new IPv6([0x2001, 0xdb8, 0, 0, 0, 0, 0, 0]), 32]] + }; + + IPv6.prototype.range = function() { + return ipaddr.subnetMatch(this, this.SpecialRanges); + }; + + IPv6.prototype.isIPv4MappedAddress = function() { + return this.range() === 'ipv4Mapped'; + }; + + IPv6.prototype.toIPv4Address = function() { + var high, low, ref; + if (!this.isIPv4MappedAddress()) { + throw new Error("ipaddr: trying to convert a generic ipv6 address to ipv4"); + } + ref = this.parts.slice(-2), high = ref[0], low = ref[1]; + return new ipaddr.IPv4([high >> 8, high & 0xff, low >> 8, low & 0xff]); + }; + + IPv6.prototype.prefixLengthFromSubnetMask = function() { + var cidr, i, k, part, stop, zeros, zerotable; + zerotable = { + 0: 16, + 32768: 15, + 49152: 14, + 57344: 13, + 61440: 12, + 63488: 11, + 64512: 10, + 65024: 9, + 65280: 8, + 65408: 7, + 65472: 6, + 65504: 5, + 65520: 4, + 65528: 3, + 65532: 2, + 65534: 1, + 65535: 0 + }; + cidr = 0; + stop = false; + for (i = k = 7; k >= 0; i = k += -1) { + part = this.parts[i]; + if (part in zerotable) { + zeros = zerotable[part]; + if (stop && zeros !== 0) { + return null; + } + if (zeros !== 16) { + stop = true; + } + cidr += zeros; + } else { + return null; + } + } + return 128 - cidr; + }; + + return IPv6; + + })(); + + ipv6Part = "(?:[0-9a-f]+::?)+"; + + zoneIndex = "%[0-9a-z]{1,}"; + + ipv6Regexes = { + zoneIndex: new RegExp(zoneIndex, 'i'), + "native": new RegExp("^(::)?(" + ipv6Part + ")?([0-9a-f]+)?(::)?(" + zoneIndex + ")?$", 'i'), + transitional: new RegExp(("^((?:" + ipv6Part + ")|(?:::)(?:" + ipv6Part + ")?)") + (ipv4Part + "\\." + ipv4Part + "\\." + ipv4Part + "\\." 
+ ipv4Part) + ("(" + zoneIndex + ")?$"), 'i') + }; + + expandIPv6 = function(string, parts) { + var colonCount, lastColon, part, replacement, replacementCount, zoneId; + if (string.indexOf('::') !== string.lastIndexOf('::')) { + return null; + } + zoneId = (string.match(ipv6Regexes['zoneIndex']) || [])[0]; + if (zoneId) { + zoneId = zoneId.substring(1); + string = string.replace(/%.+$/, ''); + } + colonCount = 0; + lastColon = -1; + while ((lastColon = string.indexOf(':', lastColon + 1)) >= 0) { + colonCount++; + } + if (string.substr(0, 2) === '::') { + colonCount--; + } + if (string.substr(-2, 2) === '::') { + colonCount--; + } + if (colonCount > parts) { + return null; + } + replacementCount = parts - colonCount; + replacement = ':'; + while (replacementCount--) { + replacement += '0:'; + } + string = string.replace('::', replacement); + if (string[0] === ':') { + string = string.slice(1); + } + if (string[string.length - 1] === ':') { + string = string.slice(0, -1); + } + parts = (function() { + var k, len, ref, results; + ref = string.split(":"); + results = []; + for (k = 0, len = ref.length; k < len; k++) { + part = ref[k]; + results.push(parseInt(part, 16)); + } + return results; + })(); + return { + parts: parts, + zoneId: zoneId + }; + }; + + ipaddr.IPv6.parser = function(string) { + var addr, k, len, match, octet, octets, zoneId; + if (ipv6Regexes['native'].test(string)) { + return expandIPv6(string, 8); + } else if (match = string.match(ipv6Regexes['transitional'])) { + zoneId = match[6] || ''; + addr = expandIPv6(match[1].slice(0, -1) + zoneId, 6); + if (addr.parts) { + octets = [parseInt(match[2]), parseInt(match[3]), parseInt(match[4]), parseInt(match[5])]; + for (k = 0, len = octets.length; k < len; k++) { + octet = octets[k]; + if (!((0 <= octet && octet <= 255))) { + return null; + } + } + addr.parts.push(octets[0] << 8 | octets[1]); + addr.parts.push(octets[2] << 8 | octets[3]); + return { + parts: addr.parts, + zoneId: addr.zoneId + }; + } + } + return null; + }; + + ipaddr.IPv4.isIPv4 = ipaddr.IPv6.isIPv6 = function(string) { + return this.parser(string) !== null; + }; + + ipaddr.IPv4.isValid = function(string) { + var e; + try { + new this(this.parser(string)); + return true; + } catch (error1) { + e = error1; + return false; + } + }; + + ipaddr.IPv4.isValidFourPartDecimal = function(string) { + if (ipaddr.IPv4.isValid(string) && string.match(/^(0|[1-9]\d*)(\.(0|[1-9]\d*)){3}$/)) { + return true; + } else { + return false; + } + }; + + ipaddr.IPv6.isValid = function(string) { + var addr, e; + if (typeof string === "string" && string.indexOf(":") === -1) { + return false; + } + try { + addr = this.parser(string); + new this(addr.parts, addr.zoneId); + return true; + } catch (error1) { + e = error1; + return false; + } + }; + + ipaddr.IPv4.parse = function(string) { + var parts; + parts = this.parser(string); + if (parts === null) { + throw new Error("ipaddr: string is not formatted like ip address"); + } + return new this(parts); + }; + + ipaddr.IPv6.parse = function(string) { + var addr; + addr = this.parser(string); + if (addr.parts === null) { + throw new Error("ipaddr: string is not formatted like ip address"); + } + return new this(addr.parts, addr.zoneId); + }; + + ipaddr.IPv4.parseCIDR = function(string) { + var maskLength, match, parsed; + if (match = string.match(/^(.+)\/(\d+)$/)) { + maskLength = parseInt(match[2]); + if (maskLength >= 0 && maskLength <= 32) { + parsed = [this.parse(match[1]), maskLength]; + Object.defineProperty(parsed, 'toString', { + 
value: function() { + return this.join('/'); + } + }); + return parsed; + } + } + throw new Error("ipaddr: string is not formatted like an IPv4 CIDR range"); + }; + + ipaddr.IPv4.subnetMaskFromPrefixLength = function(prefix) { + var filledOctetCount, j, octets; + prefix = parseInt(prefix); + if (prefix < 0 || prefix > 32) { + throw new Error('ipaddr: invalid IPv4 prefix length'); + } + octets = [0, 0, 0, 0]; + j = 0; + filledOctetCount = Math.floor(prefix / 8); + while (j < filledOctetCount) { + octets[j] = 255; + j++; + } + if (filledOctetCount < 4) { + octets[filledOctetCount] = Math.pow(2, prefix % 8) - 1 << 8 - (prefix % 8); + } + return new this(octets); + }; + + ipaddr.IPv4.broadcastAddressFromCIDR = function(string) { + var cidr, error, i, ipInterfaceOctets, octets, subnetMaskOctets; + try { + cidr = this.parseCIDR(string); + ipInterfaceOctets = cidr[0].toByteArray(); + subnetMaskOctets = this.subnetMaskFromPrefixLength(cidr[1]).toByteArray(); + octets = []; + i = 0; + while (i < 4) { + octets.push(parseInt(ipInterfaceOctets[i], 10) | parseInt(subnetMaskOctets[i], 10) ^ 255); + i++; + } + return new this(octets); + } catch (error1) { + error = error1; + throw new Error('ipaddr: the address does not have IPv4 CIDR format'); + } + }; + + ipaddr.IPv4.networkAddressFromCIDR = function(string) { + var cidr, error, i, ipInterfaceOctets, octets, subnetMaskOctets; + try { + cidr = this.parseCIDR(string); + ipInterfaceOctets = cidr[0].toByteArray(); + subnetMaskOctets = this.subnetMaskFromPrefixLength(cidr[1]).toByteArray(); + octets = []; + i = 0; + while (i < 4) { + octets.push(parseInt(ipInterfaceOctets[i], 10) & parseInt(subnetMaskOctets[i], 10)); + i++; + } + return new this(octets); + } catch (error1) { + error = error1; + throw new Error('ipaddr: the address does not have IPv4 CIDR format'); + } + }; + + ipaddr.IPv6.parseCIDR = function(string) { + var maskLength, match, parsed; + if (match = string.match(/^(.+)\/(\d+)$/)) { + maskLength = parseInt(match[2]); + if (maskLength >= 0 && maskLength <= 128) { + parsed = [this.parse(match[1]), maskLength]; + Object.defineProperty(parsed, 'toString', { + value: function() { + return this.join('/'); + } + }); + return parsed; + } + } + throw new Error("ipaddr: string is not formatted like an IPv6 CIDR range"); + }; + + ipaddr.isValid = function(string) { + return ipaddr.IPv6.isValid(string) || ipaddr.IPv4.isValid(string); + }; + + ipaddr.parse = function(string) { + if (ipaddr.IPv6.isValid(string)) { + return ipaddr.IPv6.parse(string); + } else if (ipaddr.IPv4.isValid(string)) { + return ipaddr.IPv4.parse(string); + } else { + throw new Error("ipaddr: the address has neither IPv6 nor IPv4 format"); + } + }; + + ipaddr.parseCIDR = function(string) { + var e; + try { + return ipaddr.IPv6.parseCIDR(string); + } catch (error1) { + e = error1; + try { + return ipaddr.IPv4.parseCIDR(string); + } catch (error1) { + e = error1; + throw new Error("ipaddr: the address has neither IPv6 nor IPv4 CIDR format"); + } + } + }; + + ipaddr.fromByteArray = function(bytes) { + var length; + length = bytes.length; + if (length === 4) { + return new ipaddr.IPv4(bytes); + } else if (length === 16) { + return new ipaddr.IPv6(bytes); + } else { + throw new Error("ipaddr: the binary input is neither an IPv6 nor IPv4 address"); + } + }; + + ipaddr.process = function(string) { + var addr; + addr = this.parse(string); + if (addr.kind() === 'ipv6' && addr.isIPv4MappedAddress()) { + return addr.toIPv4Address(); + } else { + return addr; + } + }; + +}).call(this); diff 
--git a/node_modules/ipaddr.js/lib/ipaddr.js.d.ts b/node_modules/ipaddr.js/lib/ipaddr.js.d.ts new file mode 100644 index 000000000..52174b6b6 --- /dev/null +++ b/node_modules/ipaddr.js/lib/ipaddr.js.d.ts @@ -0,0 +1,68 @@ +declare module "ipaddr.js" { + type IPv4Range = 'unicast' | 'unspecified' | 'broadcast' | 'multicast' | 'linkLocal' | 'loopback' | 'carrierGradeNat' | 'private' | 'reserved'; + type IPv6Range = 'unicast' | 'unspecified' | 'linkLocal' | 'multicast' | 'loopback' | 'uniqueLocal' | 'ipv4Mapped' | 'rfc6145' | 'rfc6052' | '6to4' | 'teredo' | 'reserved'; + + interface RangeList { + [name: string]: [T, number] | [T, number][]; + } + + // Common methods/properties for IPv4 and IPv6 classes. + class IP { + prefixLengthFromSubnetMask(): number | null; + toByteArray(): number[]; + toNormalizedString(): string; + toString(): string; + } + + namespace Address { + export function isValid(addr: string): boolean; + export function fromByteArray(bytes: number[]): IPv4 | IPv6; + export function parse(addr: string): IPv4 | IPv6; + export function parseCIDR(mask: string): [IPv4 | IPv6, number]; + export function process(addr: string): IPv4 | IPv6; + export function subnetMatch(addr: IPv4, rangeList: RangeList, defaultName?: string): string; + export function subnetMatch(addr: IPv6, rangeList: RangeList, defaultName?: string): string; + + export class IPv4 extends IP { + static broadcastAddressFromCIDR(addr: string): IPv4; + static isIPv4(addr: string): boolean; + static isValidFourPartDecimal(addr: string): boolean; + static isValid(addr: string): boolean; + static networkAddressFromCIDR(addr: string): IPv4; + static parse(addr: string): IPv4; + static parseCIDR(addr: string): [IPv4, number]; + static subnetMaskFromPrefixLength(prefix: number): IPv4; + constructor(octets: number[]); + octets: number[] + + kind(): 'ipv4'; + match(addr: IPv4, bits: number): boolean; + match(mask: [IPv4, number]): boolean; + range(): IPv4Range; + subnetMatch(rangeList: RangeList, defaultName?: string): string; + toIPv4MappedAddress(): IPv6; + } + + export class IPv6 extends IP { + static broadcastAddressFromCIDR(addr: string): IPv6; + static isIPv6(addr: string): boolean; + static isValid(addr: string): boolean; + static parse(addr: string): IPv6; + static parseCIDR(addr: string): [IPv6, number]; + static subnetMaskFromPrefixLength(prefix: number): IPv6; + constructor(parts: number[]); + parts: number[] + zoneId?: string + + isIPv4MappedAddress(): boolean; + kind(): 'ipv6'; + match(addr: IPv6, bits: number): boolean; + match(mask: [IPv6, number]): boolean; + range(): IPv6Range; + subnetMatch(rangeList: RangeList, defaultName?: string): string; + toIPv4Address(): IPv4; + } + } + + export = Address; +} diff --git a/node_modules/ipaddr.js/package.json b/node_modules/ipaddr.js/package.json new file mode 100644 index 000000000..65d9c7f11 --- /dev/null +++ b/node_modules/ipaddr.js/package.json @@ -0,0 +1,70 @@ +{ + "_from": "ipaddr.js@1.9.1", + "_id": "ipaddr.js@1.9.1", + "_inBundle": false, + "_integrity": "sha512-0KI/607xoxSToH7GjN1FfSbLoU0+btTicjsQSWQlh/hZykN8KpmMf7uYwPW3R+akZ6R/w18ZlXSHBYXiYUPO3g==", + "_location": "/ipaddr.js", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "ipaddr.js@1.9.1", + "name": "ipaddr.js", + "escapedName": "ipaddr.js", + "rawSpec": "1.9.1", + "saveSpec": null, + "fetchSpec": "1.9.1" + }, + "_requiredBy": [ + "/proxy-addr" + ], + "_resolved": "https://registry.npmjs.org/ipaddr.js/-/ipaddr.js-1.9.1.tgz", + "_shasum": 
"bff38543eeb8984825079ff3a2a8e6cbd46781b3", + "_spec": "ipaddr.js@1.9.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\proxy-addr", + "author": { + "name": "whitequark", + "email": "whitequark@whitequark.org" + }, + "bugs": { + "url": "https://github.com/whitequark/ipaddr.js/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "A library for manipulating IPv4 and IPv6 addresses in JavaScript.", + "devDependencies": { + "coffee-script": "~1.12.6", + "nodeunit": "^0.11.3", + "uglify-js": "~3.0.19" + }, + "directories": { + "lib": "./lib" + }, + "engines": { + "node": ">= 0.10" + }, + "files": [ + "lib/", + "LICENSE", + "ipaddr.min.js" + ], + "homepage": "https://github.com/whitequark/ipaddr.js#readme", + "keywords": [ + "ip", + "ipv4", + "ipv6" + ], + "license": "MIT", + "main": "./lib/ipaddr.js", + "name": "ipaddr.js", + "repository": { + "type": "git", + "url": "git://github.com/whitequark/ipaddr.js.git" + }, + "scripts": { + "test": "cake build test" + }, + "types": "./lib/ipaddr.js.d.ts", + "version": "1.9.1" +} diff --git a/node_modules/is-binary-path/index.d.ts b/node_modules/is-binary-path/index.d.ts new file mode 100644 index 000000000..19dcd4327 --- /dev/null +++ b/node_modules/is-binary-path/index.d.ts @@ -0,0 +1,17 @@ +/** +Check if a file path is a binary file. + +@example +``` +import isBinaryPath = require('is-binary-path'); + +isBinaryPath('source/unicorn.png'); +//=> true + +isBinaryPath('source/unicorn.txt'); +//=> false +``` +*/ +declare function isBinaryPath(filePath: string): boolean; + +export = isBinaryPath; diff --git a/node_modules/is-binary-path/index.js b/node_modules/is-binary-path/index.js new file mode 100644 index 000000000..ef7548c83 --- /dev/null +++ b/node_modules/is-binary-path/index.js @@ -0,0 +1,7 @@ +'use strict'; +const path = require('path'); +const binaryExtensions = require('binary-extensions'); + +const extensions = new Set(binaryExtensions); + +module.exports = filePath => extensions.has(path.extname(filePath).slice(1).toLowerCase()); diff --git a/node_modules/is-binary-path/license b/node_modules/is-binary-path/license new file mode 100644 index 000000000..401b1c731 --- /dev/null +++ b/node_modules/is-binary-path/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) 2019 Sindre Sorhus (https://sindresorhus.com), Paul Miller (https://paulmillr.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/is-binary-path/package.json b/node_modules/is-binary-path/package.json new file mode 100644 index 000000000..13568cd0f --- /dev/null +++ b/node_modules/is-binary-path/package.json @@ -0,0 +1,72 @@ +{ + "_from": "is-binary-path@~2.1.0", + "_id": "is-binary-path@2.1.0", + "_inBundle": false, + "_integrity": "sha512-ZMERYes6pDydyuGidse7OsHxtbI7WVeUEozgR/g7rd0xUimYNlvZRE/K2MgZTjWy725IfelLeVcEM97mmtRGXw==", + "_location": "/is-binary-path", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-binary-path@~2.1.0", + "name": "is-binary-path", + "escapedName": "is-binary-path", + "rawSpec": "~2.1.0", + "saveSpec": null, + "fetchSpec": "~2.1.0" + }, + "_requiredBy": [ + "/chokidar" + ], + "_resolved": "https://registry.npmjs.org/is-binary-path/-/is-binary-path-2.1.0.tgz", + "_shasum": "ea1f7f3b80f064236e83470f86c09c254fb45b09", + "_spec": "is-binary-path@~2.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\chokidar", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is-binary-path/issues" + }, + "bundleDependencies": false, + "dependencies": { + "binary-extensions": "^2.0.0" + }, + "deprecated": false, + "description": "Check if a file path is a binary file", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/is-binary-path#readme", + "keywords": [ + "binary", + "extensions", + "extension", + "file", + "path", + "check", + "detect", + "is" + ], + "license": "MIT", + "name": "is-binary-path", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is-binary-path.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "2.1.0" +} diff --git a/node_modules/is-binary-path/readme.md b/node_modules/is-binary-path/readme.md new file mode 100644 index 000000000..b4ab02519 --- /dev/null +++ b/node_modules/is-binary-path/readme.md @@ -0,0 +1,34 @@ +# is-binary-path [![Build Status](https://travis-ci.org/sindresorhus/is-binary-path.svg?branch=master)](https://travis-ci.org/sindresorhus/is-binary-path) + +> Check if a file path is a binary file + + +## Install + +``` +$ npm install is-binary-path +``` + + +## Usage + +```js +const isBinaryPath = require('is-binary-path'); + +isBinaryPath('source/unicorn.png'); +//=> true + +isBinaryPath('source/unicorn.txt'); +//=> false +``` + + +## Related + +- [binary-extensions](https://github.com/sindresorhus/binary-extensions) - List of binary file extensions +- [is-text-path](https://github.com/sindresorhus/is-text-path) - Check if a filepath is a text file + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com), [Paul Miller](https://paulmillr.com) diff --git a/node_modules/is-ci/CHANGELOG.md b/node_modules/is-ci/CHANGELOG.md new file mode 100644 index 000000000..c51927769 --- /dev/null +++ b/node_modules/is-ci/CHANGELOG.md @@ -0,0 +1,14 @@ +# Changelog + +## v2.0.0 + +Breaking changes: + +* Drop support for Node.js end-of-life versions: 0.10, 0.12, 4, 5, 7, + and 9 + +Other changes: + +See [ci-info +changelog](https://github.com/watson/ci-info/blob/master/CHANGELOG.md#v200) +for a list of newly supported CI servers. 
diff --git a/node_modules/is-ci/LICENSE b/node_modules/is-ci/LICENSE new file mode 100644 index 000000000..67846832e --- /dev/null +++ b/node_modules/is-ci/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016-2018 Thomas Watson Steen + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/is-ci/README.md b/node_modules/is-ci/README.md new file mode 100644 index 000000000..bc3840a22 --- /dev/null +++ b/node_modules/is-ci/README.md @@ -0,0 +1,50 @@ +# is-ci + +Returns `true` if the current environment is a Continuous Integration +server. + +Please [open an issue](https://github.com/watson/is-ci/issues) if your +CI server isn't properly detected :) + +[![npm](https://img.shields.io/npm/v/is-ci.svg)](https://www.npmjs.com/package/is-ci) +[![Build status](https://travis-ci.org/watson/is-ci.svg?branch=master)](https://travis-ci.org/watson/is-ci) +[![js-standard-style](https://img.shields.io/badge/code%20style-standard-brightgreen.svg?style=flat)](https://github.com/feross/standard) + +## Installation + +```bash +npm install is-ci --save +``` + +## Programmatic Usage + +```js +const isCI = require('is-ci') + +if (isCI) { + console.log('The code is running on a CI server') +} +``` + +## CLI Usage + +For CLI usage you need to have the `is-ci` executable in your `PATH`. +There's a few ways to do that: + +- Either install the module globally using `npm install is-ci -g` +- Or add the module as a dependency to your app in which case it can be + used inside your package.json scripts as is +- Or provide the full path to the executable, e.g. + `./node_modules/.bin/is-ci` + +```bash +is-ci && echo "This is a CI server" +``` + +## Supported CI tools + +Refer to [ci-info](https://github.com/watson/ci-info#supported-ci-tools) docs for all supported CI's + +## License + +[MIT](LICENSE) diff --git a/node_modules/is-ci/bin.js b/node_modules/is-ci/bin.js new file mode 100644 index 000000000..0c56c01f2 --- /dev/null +++ b/node_modules/is-ci/bin.js @@ -0,0 +1,4 @@ +#!/usr/bin/env node +'use strict' + +process.exit(require('./') ? 
0 : 1) diff --git a/node_modules/is-ci/index.js b/node_modules/is-ci/index.js new file mode 100644 index 000000000..d4cb67aa9 --- /dev/null +++ b/node_modules/is-ci/index.js @@ -0,0 +1,3 @@ +'use strict' + +module.exports = require('ci-info').isCI diff --git a/node_modules/is-ci/package.json b/node_modules/is-ci/package.json new file mode 100644 index 000000000..b69fb91d8 --- /dev/null +++ b/node_modules/is-ci/package.json @@ -0,0 +1,69 @@ +{ + "_from": "is-ci@^2.0.0", + "_id": "is-ci@2.0.0", + "_inBundle": false, + "_integrity": "sha512-YfJT7rkpQB0updsdHLGWrvhBJfcfzNNawYDNIyQXJz0IViGf75O8EBPKSdvw2rF+LGCsX4FZ8tcr3b19LcZq4w==", + "_location": "/is-ci", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-ci@^2.0.0", + "name": "is-ci", + "escapedName": "is-ci", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/is-ci/-/is-ci-2.0.0.tgz", + "_shasum": "6bc6334181810e04b5c22b3d589fdca55026404c", + "_spec": "is-ci@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Thomas Watson Steen", + "email": "w@tson.dk", + "url": "https://twitter.com/wa7son" + }, + "bin": { + "is-ci": "bin.js" + }, + "bugs": { + "url": "https://github.com/watson/is-ci/issues" + }, + "bundleDependencies": false, + "coordinates": [ + 55.778272, + 12.593116 + ], + "dependencies": { + "ci-info": "^2.0.0" + }, + "deprecated": false, + "description": "Detect if the current environment is a CI server", + "devDependencies": { + "clear-module": "^3.0.0", + "standard": "^12.0.1" + }, + "homepage": "https://github.com/watson/is-ci", + "keywords": [ + "ci", + "continuous", + "integration", + "test", + "detect" + ], + "license": "MIT", + "main": "index.js", + "name": "is-ci", + "repository": { + "type": "git", + "url": "git+https://github.com/watson/is-ci.git" + }, + "scripts": { + "test": "standard && node test.js" + }, + "version": "2.0.0" +} diff --git a/node_modules/is-extglob/LICENSE b/node_modules/is-extglob/LICENSE new file mode 100644 index 000000000..842218cf0 --- /dev/null +++ b/node_modules/is-extglob/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014-2016, Jon Schlinkert + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
diff --git a/node_modules/is-extglob/README.md b/node_modules/is-extglob/README.md new file mode 100644 index 000000000..0416af5c3 --- /dev/null +++ b/node_modules/is-extglob/README.md @@ -0,0 +1,107 @@ +# is-extglob [![NPM version](https://img.shields.io/npm/v/is-extglob.svg?style=flat)](https://www.npmjs.com/package/is-extglob) [![NPM downloads](https://img.shields.io/npm/dm/is-extglob.svg?style=flat)](https://npmjs.org/package/is-extglob) [![Build Status](https://img.shields.io/travis/jonschlinkert/is-extglob.svg?style=flat)](https://travis-ci.org/jonschlinkert/is-extglob) + +> Returns true if a string has an extglob. + +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +$ npm install --save is-extglob +``` + +## Usage + +```js +var isExtglob = require('is-extglob'); +``` + +**True** + +```js +isExtglob('?(abc)'); +isExtglob('@(abc)'); +isExtglob('!(abc)'); +isExtglob('*(abc)'); +isExtglob('+(abc)'); +``` + +**False** + +Escaped extglobs: + +```js +isExtglob('\\?(abc)'); +isExtglob('\\@(abc)'); +isExtglob('\\!(abc)'); +isExtglob('\\*(abc)'); +isExtglob('\\+(abc)'); +``` + +Everything else... + +```js +isExtglob('foo.js'); +isExtglob('!foo.js'); +isExtglob('*.js'); +isExtglob('**/abc.js'); +isExtglob('abc/*.js'); +isExtglob('abc/(aaa|bbb).js'); +isExtglob('abc/[a-z].js'); +isExtglob('abc/{a,b}.js'); +isExtglob('abc/?.js'); +isExtglob('abc.js'); +isExtglob('abc/def/ghi.js'); +``` + +## History + +**v2.0** + +Adds support for escaping. Escaped exglobs no longer return true. + +## About + +### Related projects + +* [has-glob](https://www.npmjs.com/package/has-glob): Returns `true` if an array has a glob pattern. | [homepage](https://github.com/jonschlinkert/has-glob "Returns `true` if an array has a glob pattern.") +* [is-glob](https://www.npmjs.com/package/is-glob): Returns `true` if the given string looks like a glob pattern or an extglob pattern… [more](https://github.com/jonschlinkert/is-glob) | [homepage](https://github.com/jonschlinkert/is-glob "Returns `true` if the given string looks like a glob pattern or an extglob pattern. This makes it easy to create code that only uses external modules like node-glob when necessary, resulting in much faster code execution and initialization time, and a bet") +* [micromatch](https://www.npmjs.com/package/micromatch): Glob matching for javascript/node.js. A drop-in replacement and faster alternative to minimatch and multimatch. | [homepage](https://github.com/jonschlinkert/micromatch "Glob matching for javascript/node.js. A drop-in replacement and faster alternative to minimatch and multimatch.") + +### Contributing + +Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new). + +### Building docs + +_(This document was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme) (a [verb](https://github.com/verbose/verb) generator), please don't edit the readme directly. Any changes to the readme must be made in [.verb.md](.verb.md).)_ + +To generate the readme and API documentation with [verb](https://github.com/verbose/verb): + +```sh +$ npm install -g verb verb-generate-readme && verb +``` + +### Running tests + +Install dev dependencies: + +```sh +$ npm install -d && npm test +``` + +### Author + +**Jon Schlinkert** + +* [github/jonschlinkert](https://github.com/jonschlinkert) +* [twitter/jonschlinkert](http://twitter.com/jonschlinkert) + +### License + +Copyright © 2016, [Jon Schlinkert](https://github.com/jonschlinkert). 
+Released under the [MIT license](https://github.com/jonschlinkert/is-extglob/blob/master/LICENSE). + +*** + +_This file was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme), v0.1.31, on October 12, 2016._ \ No newline at end of file diff --git a/node_modules/is-extglob/index.js b/node_modules/is-extglob/index.js new file mode 100644 index 000000000..c1d986fc5 --- /dev/null +++ b/node_modules/is-extglob/index.js @@ -0,0 +1,20 @@ +/*! + * is-extglob + * + * Copyright (c) 2014-2016, Jon Schlinkert. + * Licensed under the MIT License. + */ + +module.exports = function isExtglob(str) { + if (typeof str !== 'string' || str === '') { + return false; + } + + var match; + while ((match = /(\\).|([@?!+*]\(.*\))/g.exec(str))) { + if (match[2]) return true; + str = str.slice(match.index + match[0].length); + } + + return false; +}; diff --git a/node_modules/is-extglob/package.json b/node_modules/is-extglob/package.json new file mode 100644 index 000000000..b05160bfd --- /dev/null +++ b/node_modules/is-extglob/package.json @@ -0,0 +1,100 @@ +{ + "_from": "is-extglob@^2.1.1", + "_id": "is-extglob@2.1.1", + "_inBundle": false, + "_integrity": "sha1-qIwCU1eR8C7TfHahueqXc8gz+MI=", + "_location": "/is-extglob", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-extglob@^2.1.1", + "name": "is-extglob", + "escapedName": "is-extglob", + "rawSpec": "^2.1.1", + "saveSpec": null, + "fetchSpec": "^2.1.1" + }, + "_requiredBy": [ + "/is-glob" + ], + "_resolved": "https://registry.npmjs.org/is-extglob/-/is-extglob-2.1.1.tgz", + "_shasum": "a88c02535791f02ed37c76a1b9ea9773c833f8c2", + "_spec": "is-extglob@^2.1.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\is-glob", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/jonschlinkert/is-extglob/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Returns true if a string has an extglob.", + "devDependencies": { + "gulp-format-md": "^0.1.10", + "mocha": "^3.0.2" + }, + "engines": { + "node": ">=0.10.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/jonschlinkert/is-extglob", + "keywords": [ + "bash", + "braces", + "check", + "exec", + "expression", + "extglob", + "glob", + "globbing", + "globstar", + "is", + "match", + "matches", + "pattern", + "regex", + "regular", + "string", + "test" + ], + "license": "MIT", + "main": "index.js", + "name": "is-extglob", + "repository": { + "type": "git", + "url": "git+https://github.com/jonschlinkert/is-extglob.git" + }, + "scripts": { + "test": "mocha" + }, + "verb": { + "toc": false, + "layout": "default", + "tasks": [ + "readme" + ], + "plugins": [ + "gulp-format-md" + ], + "related": { + "list": [ + "has-glob", + "is-glob", + "micromatch" + ] + }, + "reflinks": [ + "verb", + "verb-generate-readme" + ], + "lint": { + "reflinks": true + } + }, + "version": "2.1.1" +} diff --git a/node_modules/is-fullwidth-code-point/index.d.ts b/node_modules/is-fullwidth-code-point/index.d.ts new file mode 100644 index 000000000..729d20205 --- /dev/null +++ b/node_modules/is-fullwidth-code-point/index.d.ts @@ -0,0 +1,17 @@ +/** +Check if the character represented by a given [Unicode code point](https://en.wikipedia.org/wiki/Code_point) is [fullwidth](https://en.wikipedia.org/wiki/Halfwidth_and_fullwidth_forms). 
+ +@param codePoint - The [code point](https://en.wikipedia.org/wiki/Code_point) of a character. + +@example +``` +import isFullwidthCodePoint from 'is-fullwidth-code-point'; + +isFullwidthCodePoint('谢'.codePointAt(0)); +//=> true + +isFullwidthCodePoint('a'.codePointAt(0)); +//=> false +``` +*/ +export default function isFullwidthCodePoint(codePoint: number): boolean; diff --git a/node_modules/is-fullwidth-code-point/index.js b/node_modules/is-fullwidth-code-point/index.js new file mode 100644 index 000000000..671f97f76 --- /dev/null +++ b/node_modules/is-fullwidth-code-point/index.js @@ -0,0 +1,50 @@ +/* eslint-disable yoda */ +'use strict'; + +const isFullwidthCodePoint = codePoint => { + if (Number.isNaN(codePoint)) { + return false; + } + + // Code points are derived from: + // http://www.unix.org/Public/UNIDATA/EastAsianWidth.txt + if ( + codePoint >= 0x1100 && ( + codePoint <= 0x115F || // Hangul Jamo + codePoint === 0x2329 || // LEFT-POINTING ANGLE BRACKET + codePoint === 0x232A || // RIGHT-POINTING ANGLE BRACKET + // CJK Radicals Supplement .. Enclosed CJK Letters and Months + (0x2E80 <= codePoint && codePoint <= 0x3247 && codePoint !== 0x303F) || + // Enclosed CJK Letters and Months .. CJK Unified Ideographs Extension A + (0x3250 <= codePoint && codePoint <= 0x4DBF) || + // CJK Unified Ideographs .. Yi Radicals + (0x4E00 <= codePoint && codePoint <= 0xA4C6) || + // Hangul Jamo Extended-A + (0xA960 <= codePoint && codePoint <= 0xA97C) || + // Hangul Syllables + (0xAC00 <= codePoint && codePoint <= 0xD7A3) || + // CJK Compatibility Ideographs + (0xF900 <= codePoint && codePoint <= 0xFAFF) || + // Vertical Forms + (0xFE10 <= codePoint && codePoint <= 0xFE19) || + // CJK Compatibility Forms .. Small Form Variants + (0xFE30 <= codePoint && codePoint <= 0xFE6B) || + // Halfwidth and Fullwidth Forms + (0xFF01 <= codePoint && codePoint <= 0xFF60) || + (0xFFE0 <= codePoint && codePoint <= 0xFFE6) || + // Kana Supplement + (0x1B000 <= codePoint && codePoint <= 0x1B001) || + // Enclosed Ideographic Supplement + (0x1F200 <= codePoint && codePoint <= 0x1F251) || + // CJK Unified Ideographs Extension B .. Tertiary Ideographic Plane + (0x20000 <= codePoint && codePoint <= 0x3FFFD) + ) + ) { + return true; + } + + return false; +}; + +module.exports = isFullwidthCodePoint; +module.exports.default = isFullwidthCodePoint; diff --git a/node_modules/is-fullwidth-code-point/license b/node_modules/is-fullwidth-code-point/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/is-fullwidth-code-point/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/is-fullwidth-code-point/package.json b/node_modules/is-fullwidth-code-point/package.json new file mode 100644 index 000000000..8e8bfaffd --- /dev/null +++ b/node_modules/is-fullwidth-code-point/package.json @@ -0,0 +1,74 @@ +{ + "_from": "is-fullwidth-code-point@^3.0.0", + "_id": "is-fullwidth-code-point@3.0.0", + "_inBundle": false, + "_integrity": "sha512-zymm5+u+sCsSWyD9qNaejV3DFvhCKclKdizYaJUuHA83RLjb7nSuGnddCHGv0hk+KY7BMAlsWeK4Ueg6EV6XQg==", + "_location": "/is-fullwidth-code-point", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-fullwidth-code-point@^3.0.0", + "name": "is-fullwidth-code-point", + "escapedName": "is-fullwidth-code-point", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/string-width" + ], + "_resolved": "https://registry.npmjs.org/is-fullwidth-code-point/-/is-fullwidth-code-point-3.0.0.tgz", + "_shasum": "f116f8064fe90b3f7844a38997c0b75051269f1d", + "_spec": "is-fullwidth-code-point@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\string-width", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is-fullwidth-code-point/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if the character represented by a given Unicode code point is fullwidth", + "devDependencies": { + "ava": "^1.3.1", + "tsd-check": "^0.5.0", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/is-fullwidth-code-point#readme", + "keywords": [ + "fullwidth", + "full-width", + "full", + "width", + "unicode", + "character", + "string", + "codepoint", + "code", + "point", + "is", + "detect", + "check" + ], + "license": "MIT", + "name": "is-fullwidth-code-point", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is-fullwidth-code-point.git" + }, + "scripts": { + "test": "xo && ava && tsd-check" + }, + "version": "3.0.0" +} diff --git a/node_modules/is-fullwidth-code-point/readme.md b/node_modules/is-fullwidth-code-point/readme.md new file mode 100644 index 000000000..4236bba98 --- /dev/null +++ b/node_modules/is-fullwidth-code-point/readme.md @@ -0,0 +1,39 @@ +# is-fullwidth-code-point [![Build Status](https://travis-ci.org/sindresorhus/is-fullwidth-code-point.svg?branch=master)](https://travis-ci.org/sindresorhus/is-fullwidth-code-point) + +> Check if the character represented by a given [Unicode code point](https://en.wikipedia.org/wiki/Code_point) is [fullwidth](https://en.wikipedia.org/wiki/Halfwidth_and_fullwidth_forms) + + +## Install + +``` +$ npm install is-fullwidth-code-point +``` + + +## Usage + +```js +const isFullwidthCodePoint = require('is-fullwidth-code-point'); + +isFullwidthCodePoint('谢'.codePointAt(0)); +//=> true + +isFullwidthCodePoint('a'.codePointAt(0)); +//=> false +``` + + +## API + +### isFullwidthCodePoint(codePoint) + +#### codePoint + +Type: `number` + +The [code point](https://en.wikipedia.org/wiki/Code_point) of a character. 
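+
+A small combined sketch (a hypothetical snippet; the sample string is arbitrary) that
+feeds each character's code point through the check:
+
+```js
+const isFullwidthCodePoint = require('is-fullwidth-code-point');
+
+// Count the fullwidth characters in a mixed string.
+const fullwidthCount = [...'abc谢谢'].filter(character =>
+  isFullwidthCodePoint(character.codePointAt(0))
+).length;
+
+fullwidthCount;
+//=> 2
+```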
+ + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/is-glob/LICENSE b/node_modules/is-glob/LICENSE new file mode 100644 index 000000000..3f2eca18f --- /dev/null +++ b/node_modules/is-glob/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014-2017, Jon Schlinkert. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/is-glob/README.md b/node_modules/is-glob/README.md new file mode 100644 index 000000000..740724b27 --- /dev/null +++ b/node_modules/is-glob/README.md @@ -0,0 +1,206 @@ +# is-glob [![NPM version](https://img.shields.io/npm/v/is-glob.svg?style=flat)](https://www.npmjs.com/package/is-glob) [![NPM monthly downloads](https://img.shields.io/npm/dm/is-glob.svg?style=flat)](https://npmjs.org/package/is-glob) [![NPM total downloads](https://img.shields.io/npm/dt/is-glob.svg?style=flat)](https://npmjs.org/package/is-glob) [![Build Status](https://img.shields.io/github/workflow/status/micromatch/is-glob/dev)](https://github.com/micromatch/is-glob/actions) + +> Returns `true` if the given string looks like a glob pattern or an extglob pattern. This makes it easy to create code that only uses external modules like node-glob when necessary, resulting in much faster code execution and initialization time, and a better user experience. + +Please consider following this project's author, [Jon Schlinkert](https://github.com/jonschlinkert), and consider starring the project to show your :heart: and support. + +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +$ npm install --save is-glob +``` + +You might also be interested in [is-valid-glob](https://github.com/jonschlinkert/is-valid-glob) and [has-glob](https://github.com/jonschlinkert/has-glob). 
+ +## Usage + +```js +var isGlob = require('is-glob'); +``` + +### Default behavior + +**True** + +Patterns that have glob characters or regex patterns will return `true`: + +```js +isGlob('!foo.js'); +isGlob('*.js'); +isGlob('**/abc.js'); +isGlob('abc/*.js'); +isGlob('abc/(aaa|bbb).js'); +isGlob('abc/[a-z].js'); +isGlob('abc/{a,b}.js'); +//=> true +``` + +Extglobs + +```js +isGlob('abc/@(a).js'); +isGlob('abc/!(a).js'); +isGlob('abc/+(a).js'); +isGlob('abc/*(a).js'); +isGlob('abc/?(a).js'); +//=> true +``` + +**False** + +Escaped globs or extglobs return `false`: + +```js +isGlob('abc/\\@(a).js'); +isGlob('abc/\\!(a).js'); +isGlob('abc/\\+(a).js'); +isGlob('abc/\\*(a).js'); +isGlob('abc/\\?(a).js'); +isGlob('\\!foo.js'); +isGlob('\\*.js'); +isGlob('\\*\\*/abc.js'); +isGlob('abc/\\*.js'); +isGlob('abc/\\(aaa|bbb).js'); +isGlob('abc/\\[a-z].js'); +isGlob('abc/\\{a,b}.js'); +//=> false +``` + +Patterns that do not have glob patterns return `false`: + +```js +isGlob('abc.js'); +isGlob('abc/def/ghi.js'); +isGlob('foo.js'); +isGlob('abc/@.js'); +isGlob('abc/+.js'); +isGlob('abc/?.js'); +isGlob(); +isGlob(null); +//=> false +``` + +Arrays are also `false` (If you want to check if an array has a glob pattern, use [has-glob](https://github.com/jonschlinkert/has-glob)): + +```js +isGlob(['**/*.js']); +isGlob(['foo.js']); +//=> false +``` + +### Option strict + +When `options.strict === false` the behavior is less strict in determining if a pattern is a glob. Meaning that +some patterns that would return `false` may return `true`. This is done so that matching libraries like [micromatch](https://github.com/micromatch/micromatch) have a chance at determining if the pattern is a glob or not. + +**True** + +Patterns that have glob characters or regex patterns will return `true`: + +```js +isGlob('!foo.js', {strict: false}); +isGlob('*.js', {strict: false}); +isGlob('**/abc.js', {strict: false}); +isGlob('abc/*.js', {strict: false}); +isGlob('abc/(aaa|bbb).js', {strict: false}); +isGlob('abc/[a-z].js', {strict: false}); +isGlob('abc/{a,b}.js', {strict: false}); +//=> true +``` + +Extglobs + +```js +isGlob('abc/@(a).js', {strict: false}); +isGlob('abc/!(a).js', {strict: false}); +isGlob('abc/+(a).js', {strict: false}); +isGlob('abc/*(a).js', {strict: false}); +isGlob('abc/?(a).js', {strict: false}); +//=> true +``` + +**False** + +Escaped globs or extglobs return `false`: + +```js +isGlob('\\!foo.js', {strict: false}); +isGlob('\\*.js', {strict: false}); +isGlob('\\*\\*/abc.js', {strict: false}); +isGlob('abc/\\*.js', {strict: false}); +isGlob('abc/\\(aaa|bbb).js', {strict: false}); +isGlob('abc/\\[a-z].js', {strict: false}); +isGlob('abc/\\{a,b}.js', {strict: false}); +//=> false +``` + +## About + +
+
+**Contributing**
+
+Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new).
+
+ +
+
+**Running Tests**
+
+Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command:
+
+```sh
+$ npm install && npm test
+```
+
+ +
+
+**Building docs**
+
+_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_
+
+To generate the readme, run the following command:
+
+```sh
+$ npm install -g verbose/verb#dev verb-generate-readme && verb
+```
+
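The package description above notes that the main reason for `is-glob` is to avoid loading a real glob library when the input is a plain path, but the readme never shows that guard pattern directly. A hedged sketch of it (the `glob` package and the example paths are placeholders, not part of this project):

```js
// Only pay for glob expansion when the input actually looks like a glob pattern.
const isGlob = require('is-glob');

function resolveFiles(input) {
  if (isGlob(input)) {
    // Placeholder: any glob implementation could be used here (node-glob, fast-glob, ...).
    const glob = require('glob');
    return glob.sync(input);
  }
  // Plain path: return it as-is, no pattern matching needed.
  return [input];
}

console.log(resolveFiles('src/**/*.js'));  // expanded by the glob library
console.log(resolveFiles('src/index.js')); // returned untouched
```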
+ +### Related projects + +You might also be interested in these projects: + +* [assemble](https://www.npmjs.com/package/assemble): Get the rocks out of your socks! Assemble makes you fast at creating web projects… [more](https://github.com/assemble/assemble) | [homepage](https://github.com/assemble/assemble "Get the rocks out of your socks! Assemble makes you fast at creating web projects. Assemble is used by thousands of projects for rapid prototyping, creating themes, scaffolds, boilerplates, e-books, UI components, API documentation, blogs, building websit") +* [base](https://www.npmjs.com/package/base): Framework for rapidly creating high quality, server-side node.js applications, using plugins like building blocks | [homepage](https://github.com/node-base/base "Framework for rapidly creating high quality, server-side node.js applications, using plugins like building blocks") +* [update](https://www.npmjs.com/package/update): Be scalable! Update is a new, open source developer framework and CLI for automating updates… [more](https://github.com/update/update) | [homepage](https://github.com/update/update "Be scalable! Update is a new, open source developer framework and CLI for automating updates of any kind in code projects.") +* [verb](https://www.npmjs.com/package/verb): Documentation generator for GitHub projects. Verb is extremely powerful, easy to use, and is used… [more](https://github.com/verbose/verb) | [homepage](https://github.com/verbose/verb "Documentation generator for GitHub projects. Verb is extremely powerful, easy to use, and is used on hundreds of projects of all sizes to generate everything from API docs to readmes.") + +### Contributors + +| **Commits** | **Contributor** | +| --- | --- | +| 47 | [jonschlinkert](https://github.com/jonschlinkert) | +| 5 | [doowb](https://github.com/doowb) | +| 1 | [phated](https://github.com/phated) | +| 1 | [danhper](https://github.com/danhper) | +| 1 | [paulmillr](https://github.com/paulmillr) | + +### Author + +**Jon Schlinkert** + +* [GitHub Profile](https://github.com/jonschlinkert) +* [Twitter Profile](https://twitter.com/jonschlinkert) +* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert) + +### License + +Copyright © 2019, [Jon Schlinkert](https://github.com/jonschlinkert). +Released under the [MIT License](LICENSE). + +*** + +_This file was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme), v0.8.0, on March 27, 2019._ \ No newline at end of file diff --git a/node_modules/is-glob/index.js b/node_modules/is-glob/index.js new file mode 100644 index 000000000..620f563ec --- /dev/null +++ b/node_modules/is-glob/index.js @@ -0,0 +1,150 @@ +/*! + * is-glob + * + * Copyright (c) 2014-2017, Jon Schlinkert. + * Released under the MIT License. + */ + +var isExtglob = require('is-extglob'); +var chars = { '{': '}', '(': ')', '[': ']'}; +var strictCheck = function(str) { + if (str[0] === '!') { + return true; + } + var index = 0; + var pipeIndex = -2; + var closeSquareIndex = -2; + var closeCurlyIndex = -2; + var closeParenIndex = -2; + var backSlashIndex = -2; + while (index < str.length) { + if (str[index] === '*') { + return true; + } + + if (str[index + 1] === '?' 
&& /[\].+)]/.test(str[index])) { + return true; + } + + if (closeSquareIndex !== -1 && str[index] === '[' && str[index + 1] !== ']') { + if (closeSquareIndex < index) { + closeSquareIndex = str.indexOf(']', index); + } + if (closeSquareIndex > index) { + if (backSlashIndex === -1 || backSlashIndex > closeSquareIndex) { + return true; + } + backSlashIndex = str.indexOf('\\', index); + if (backSlashIndex === -1 || backSlashIndex > closeSquareIndex) { + return true; + } + } + } + + if (closeCurlyIndex !== -1 && str[index] === '{' && str[index + 1] !== '}') { + closeCurlyIndex = str.indexOf('}', index); + if (closeCurlyIndex > index) { + backSlashIndex = str.indexOf('\\', index); + if (backSlashIndex === -1 || backSlashIndex > closeCurlyIndex) { + return true; + } + } + } + + if (closeParenIndex !== -1 && str[index] === '(' && str[index + 1] === '?' && /[:!=]/.test(str[index + 2]) && str[index + 3] !== ')') { + closeParenIndex = str.indexOf(')', index); + if (closeParenIndex > index) { + backSlashIndex = str.indexOf('\\', index); + if (backSlashIndex === -1 || backSlashIndex > closeParenIndex) { + return true; + } + } + } + + if (pipeIndex !== -1 && str[index] === '(' && str[index + 1] !== '|') { + if (pipeIndex < index) { + pipeIndex = str.indexOf('|', index); + } + if (pipeIndex !== -1 && str[pipeIndex + 1] !== ')') { + closeParenIndex = str.indexOf(')', pipeIndex); + if (closeParenIndex > pipeIndex) { + backSlashIndex = str.indexOf('\\', pipeIndex); + if (backSlashIndex === -1 || backSlashIndex > closeParenIndex) { + return true; + } + } + } + } + + if (str[index] === '\\') { + var open = str[index + 1]; + index += 2; + var close = chars[open]; + + if (close) { + var n = str.indexOf(close, index); + if (n !== -1) { + index = n + 1; + } + } + + if (str[index] === '!') { + return true; + } + } else { + index++; + } + } + return false; +}; + +var relaxedCheck = function(str) { + if (str[0] === '!') { + return true; + } + var index = 0; + while (index < str.length) { + if (/[*?{}()[\]]/.test(str[index])) { + return true; + } + + if (str[index] === '\\') { + var open = str[index + 1]; + index += 2; + var close = chars[open]; + + if (close) { + var n = str.indexOf(close, index); + if (n !== -1) { + index = n + 1; + } + } + + if (str[index] === '!') { + return true; + } + } else { + index++; + } + } + return false; +}; + +module.exports = function isGlob(str, options) { + if (typeof str !== 'string' || str === '') { + return false; + } + + if (isExtglob(str)) { + return true; + } + + var check = strictCheck; + + // optionally relax check + if (options && options.strict === false) { + check = relaxedCheck; + } + + return check(str); +}; diff --git a/node_modules/is-glob/package.json b/node_modules/is-glob/package.json new file mode 100644 index 000000000..2b4a3f739 --- /dev/null +++ b/node_modules/is-glob/package.json @@ -0,0 +1,122 @@ +{ + "_from": "is-glob@~4.0.1", + "_id": "is-glob@4.0.3", + "_inBundle": false, + "_integrity": "sha512-xelSayHH36ZgE7ZWhli7pW34hNbNl8Ojv5KVmkJD4hBdD3th8Tfk9vYasLM+mXWOZhFkgZfxhLSnrwRr4elSSg==", + "_location": "/is-glob", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-glob@~4.0.1", + "name": "is-glob", + "escapedName": "is-glob", + "rawSpec": "~4.0.1", + "saveSpec": null, + "fetchSpec": "~4.0.1" + }, + "_requiredBy": [ + "/chokidar", + "/glob-parent" + ], + "_resolved": "https://registry.npmjs.org/is-glob/-/is-glob-4.0.3.tgz", + "_shasum": "64f61e42cbbb2eec2071a9dac0b28ba1e65d5084", + "_spec": "is-glob@~4.0.1", + 
"_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\chokidar", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/micromatch/is-glob/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Brian Woodward", + "url": "https://twitter.com/doowb" + }, + { + "name": "Daniel Perez", + "url": "https://tuvistavie.com" + }, + { + "name": "Jon Schlinkert", + "url": "http://twitter.com/jonschlinkert" + } + ], + "dependencies": { + "is-extglob": "^2.1.1" + }, + "deprecated": false, + "description": "Returns `true` if the given string looks like a glob pattern or an extglob pattern. This makes it easy to create code that only uses external modules like node-glob when necessary, resulting in much faster code execution and initialization time, and a better user experience.", + "devDependencies": { + "gulp-format-md": "^0.1.10", + "mocha": "^3.0.2" + }, + "engines": { + "node": ">=0.10.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/micromatch/is-glob", + "keywords": [ + "bash", + "braces", + "check", + "exec", + "expression", + "extglob", + "glob", + "globbing", + "globstar", + "is", + "match", + "matches", + "pattern", + "regex", + "regular", + "string", + "test" + ], + "license": "MIT", + "main": "index.js", + "name": "is-glob", + "repository": { + "type": "git", + "url": "git+https://github.com/micromatch/is-glob.git" + }, + "scripts": { + "test": "mocha && node benchmark.js" + }, + "verb": { + "layout": "default", + "plugins": [ + "gulp-format-md" + ], + "related": { + "list": [ + "assemble", + "base", + "update", + "verb" + ] + }, + "reflinks": [ + "assemble", + "bach", + "base", + "composer", + "gulp", + "has-glob", + "is-valid-glob", + "micromatch", + "npm", + "scaffold", + "verb", + "vinyl" + ] + }, + "version": "4.0.3" +} diff --git a/node_modules/is-installed-globally/index.d.ts b/node_modules/is-installed-globally/index.d.ts new file mode 100644 index 000000000..4c2cbbaec --- /dev/null +++ b/node_modules/is-installed-globally/index.d.ts @@ -0,0 +1,19 @@ +/** +Check if your package was installed globally. 
+ +@example +``` +import isInstalledGlobally = require('is-installed-globally'); + +// With `npm install your-package` +console.log(isInstalledGlobally); +//=> false + +// With `npm install --global your-package` +console.log(isInstalledGlobally); +//=> true +``` +*/ +declare const isInstalledGlobally: boolean; + +export = isInstalledGlobally; diff --git a/node_modules/is-installed-globally/index.js b/node_modules/is-installed-globally/index.js new file mode 100644 index 000000000..7fefcda3e --- /dev/null +++ b/node_modules/is-installed-globally/index.js @@ -0,0 +1,15 @@ +'use strict'; +const fs = require('fs'); +const globalDirs = require('global-dirs'); +const isPathInside = require('is-path-inside'); + +module.exports = (() => { + try { + return ( + isPathInside(__dirname, globalDirs.yarn.packages) || + isPathInside(__dirname, fs.realpathSync(globalDirs.npm.packages)) + ); + } catch { + return false; + } +})(); diff --git a/node_modules/is-installed-globally/license b/node_modules/is-installed-globally/license new file mode 100644 index 000000000..fa7ceba3e --- /dev/null +++ b/node_modules/is-installed-globally/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (https://sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
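One detail worth calling out from the `is-installed-globally` files above: the check runs inside an IIFE when the module is first required, so the export is a plain boolean (as the `index.d.ts` declares), not a function to call. A minimal sketch, assuming the package is installed as a regular dependency:

```js
// is-installed-globally exports a boolean computed at require time — read it, don't call it.
const isInstalledGlobally = require('is-installed-globally');

if (isInstalledGlobally) {
  console.log('Running from a global npm/yarn install');
} else {
  console.log('Running from a local node_modules directory');
}
```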
diff --git a/node_modules/is-installed-globally/package.json b/node_modules/is-installed-globally/package.json new file mode 100644 index 000000000..b1e11c10f --- /dev/null +++ b/node_modules/is-installed-globally/package.json @@ -0,0 +1,86 @@ +{ + "_from": "is-installed-globally@^0.4.0", + "_id": "is-installed-globally@0.4.0", + "_inBundle": false, + "_integrity": "sha512-iwGqO3J21aaSkC7jWnHP/difazwS7SFeIqxv6wEtLU8Y5KlzFTjyqcSIT0d8s4+dDhKytsk9PJZ2BkS5eZwQRQ==", + "_location": "/is-installed-globally", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-installed-globally@^0.4.0", + "name": "is-installed-globally", + "escapedName": "is-installed-globally", + "rawSpec": "^0.4.0", + "saveSpec": null, + "fetchSpec": "^0.4.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/is-installed-globally/-/is-installed-globally-0.4.0.tgz", + "_shasum": "9a0fd407949c30f86eb6959ef1b7994ed0b7b520", + "_spec": "is-installed-globally@^0.4.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "https://sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is-installed-globally/issues" + }, + "bundleDependencies": false, + "dependencies": { + "global-dirs": "^3.0.0", + "is-path-inside": "^3.0.2" + }, + "deprecated": false, + "description": "Check if your package was installed globally", + "devDependencies": { + "ava": "^2.4.0", + "cpy": "^8.1.1", + "del": "^6.0.0", + "execa": "^5.0.0", + "make-dir": "^3.1.0", + "tsd": "^0.14.0", + "xo": "^0.37.1" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "funding": "https://github.com/sponsors/sindresorhus", + "homepage": "https://github.com/sindresorhus/is-installed-globally#readme", + "keywords": [ + "global", + "package", + "globally", + "module", + "install", + "installed", + "npm", + "yarn", + "is", + "check", + "detect", + "local", + "locally", + "cli", + "bin", + "binary" + ], + "license": "MIT", + "name": "is-installed-globally", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is-installed-globally.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "0.4.0" +} diff --git a/node_modules/is-installed-globally/readme.md b/node_modules/is-installed-globally/readme.md new file mode 100644 index 000000000..ef9e18a68 --- /dev/null +++ b/node_modules/is-installed-globally/readme.md @@ -0,0 +1,31 @@ +# is-installed-globally + +> Check if your package was installed globally + +Can be useful if your CLI needs different behavior when installed globally and locally. 
+ +## Install + +``` +$ npm install is-installed-globally +``` + +## Usage + +```js +const isInstalledGlobally = require('is-installed-globally'); + +// With `npm install your-package` +console.log(isInstalledGlobally); +//=> false + +// With `npm install --global your-package` +console.log(isInstalledGlobally); +//=> true +``` + +## Related + +- [import-global](https://github.com/sindresorhus/import-global) - Import a globally installed module +- [resolve-global](https://github.com/sindresorhus/resolve-global) - Resolve the path of a globally installed module +- [global-dirs](https://github.com/sindresorhus/global-dirs) - Get the directory of globally installed packages and binaries diff --git a/node_modules/is-npm/index.d.ts b/node_modules/is-npm/index.d.ts new file mode 100644 index 000000000..533877500 --- /dev/null +++ b/node_modules/is-npm/index.d.ts @@ -0,0 +1,41 @@ +/** +Check if your code is running as an [npm](https://docs.npmjs.com/misc/scripts) or [yarn](https://yarnpkg.com/lang/en/docs/cli/run/) script. + +@example +``` +import {isNpmOrYarn} from 'is-npm'; + +if (isNpmOrYarn) { + console.log('Running as an npm or yarn script!'); +} +``` +*/ +export const isNpmOrYarn: boolean; + +/** +Check if your code is running as an [npm](https://docs.npmjs.com/misc/scripts) script. + +@example +``` +import {isNpm} from 'is-npm'; + +if (isNpm) { + console.log('Running as an npm script!'); +} +``` +*/ +export const isNpm: boolean; + +/** +Check if your code is running as a [yarn](https://yarnpkg.com/lang/en/docs/cli/run/) script. + +@example +``` +import {isYarn} from 'is-npm'; + +if (isYarn) { + console.log('Running as a yarn script!'); +} +``` +*/ +export const isYarn: boolean; diff --git a/node_modules/is-npm/index.js b/node_modules/is-npm/index.js new file mode 100644 index 000000000..000a893cf --- /dev/null +++ b/node_modules/is-npm/index.js @@ -0,0 +1,11 @@ +'use strict'; + +const packageJson = process.env.npm_package_json; +const userAgent = process.env.npm_config_user_agent; +const isYarn = Boolean(userAgent && userAgent.startsWith('yarn')); +const isNpm = Boolean(userAgent && userAgent.startsWith('npm')); +const isNpm7 = Boolean(packageJson && packageJson.endsWith('package.json')); + +module.exports.isNpmOrYarn = isNpm || isNpm7 || isYarn; +module.exports.isNpm = isNpm || isNpm7; +module.exports.isYarn = isYarn; diff --git a/node_modules/is-npm/license b/node_modules/is-npm/license new file mode 100644 index 000000000..fa7ceba3e --- /dev/null +++ b/node_modules/is-npm/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (https://sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/is-npm/package.json b/node_modules/is-npm/package.json new file mode 100644 index 000000000..5c3d4f9e2 --- /dev/null +++ b/node_modules/is-npm/package.json @@ -0,0 +1,71 @@ +{ + "_from": "is-npm@^5.0.0", + "_id": "is-npm@5.0.0", + "_inBundle": false, + "_integrity": "sha512-WW/rQLOazUq+ST/bCAVBp/2oMERWLsR7OrKyt052dNDk4DHcDE0/7QSXITlmi+VBcV13DfIbysG3tZJm5RfdBA==", + "_location": "/is-npm", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-npm@^5.0.0", + "name": "is-npm", + "escapedName": "is-npm", + "rawSpec": "^5.0.0", + "saveSpec": null, + "fetchSpec": "^5.0.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/is-npm/-/is-npm-5.0.0.tgz", + "_shasum": "43e8d65cc56e1b67f8d47262cf667099193f45a8", + "_spec": "is-npm@^5.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "https://sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is-npm/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if your code is running as an npm script", + "devDependencies": { + "ava": "^2.4.0", + "tsd": "^0.11.0", + "xo": "^0.30.0" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "funding": "https://github.com/sponsors/sindresorhus", + "homepage": "https://github.com/sindresorhus/is-npm#readme", + "keywords": [ + "npm", + "yarn", + "is", + "check", + "detect", + "env", + "environment", + "run", + "script" + ], + "license": "MIT", + "name": "is-npm", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is-npm.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "5.0.0" +} diff --git a/node_modules/is-npm/readme.md b/node_modules/is-npm/readme.md new file mode 100644 index 000000000..b18ba102c --- /dev/null +++ b/node_modules/is-npm/readme.md @@ -0,0 +1,60 @@ +# is-npm [![Build Status](https://travis-ci.com/sindresorhus/is-npm.svg?branch=master)](https://travis-ci.com/sindresorhus/is-npm) + +> Check if your code is running as an [npm](https://docs.npmjs.com/misc/scripts) or [yarn](https://yarnpkg.com/lang/en/docs/cli/run/) script + +## Install + +``` +$ npm install is-npm +``` + +## Usage + +```js +const {isNpmOrYarn, isNpm, isYarn} = require('is-npm'); + +console.table({isNpmOrYarn, isNpm, isYarn}); +``` + +```sh +$ node foo.js +# ┌─────────────┬────────┐ +# │ (index) │ Values │ +# ├─────────────┼────────┤ +# │ isNpmOrYarn │ false │ +# │ isNpm │ false │ +# │ isYarn │ false │ +# └─────────────┴────────┘ +$ npm run foo +# ┌─────────────┬────────┐ +# │ (index) │ Values │ +# ├─────────────┼────────┤ +# │ isNpmOrYarn │ true │ +# │ isNpm │ true │ +# │ isYarn │ false │ +# └─────────────┴────────┘ +$ yarn run foo +# ┌─────────────┬────────┐ +# │ (index) │ Values │ +# ├─────────────┼────────┤ +# │ isNpmOrYarn │ true │ +# │ isNpm │ false │ +# │ isYarn │ true │ +# └─────────────┴────────┘ +``` + +## Related + +- [is-npm-cli](https://github.com/sindresorhus/is-npm-cli) - CLI for this module + +--- + +
+
+Get professional support for this package with a Tidelift subscription. Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies.
+
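The `is-npm` detection above relies entirely on environment variables that npm and yarn set for scripts: `npm_config_user_agent`, plus `npm_package_json` for newer npm versions. A quick way to inspect those values yourself, sketched with a hypothetical script file name:

```js
// check-env.js (hypothetical name) — run it three ways and compare the output:
//   node check-env.js        -> both variables are usually undefined
//   npm run <your-script>    -> user agent starts with "npm"
//   yarn run <your-script>   -> user agent starts with "yarn"
console.log('npm_config_user_agent:', process.env.npm_config_user_agent);
console.log('npm_package_json:     ', process.env.npm_package_json);
```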
diff --git a/node_modules/is-number/LICENSE b/node_modules/is-number/LICENSE new file mode 100644 index 000000000..9af4a67d2 --- /dev/null +++ b/node_modules/is-number/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014-present, Jon Schlinkert. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/is-number/README.md b/node_modules/is-number/README.md new file mode 100644 index 000000000..eb8149e8c --- /dev/null +++ b/node_modules/is-number/README.md @@ -0,0 +1,187 @@ +# is-number [![NPM version](https://img.shields.io/npm/v/is-number.svg?style=flat)](https://www.npmjs.com/package/is-number) [![NPM monthly downloads](https://img.shields.io/npm/dm/is-number.svg?style=flat)](https://npmjs.org/package/is-number) [![NPM total downloads](https://img.shields.io/npm/dt/is-number.svg?style=flat)](https://npmjs.org/package/is-number) [![Linux Build Status](https://img.shields.io/travis/jonschlinkert/is-number.svg?style=flat&label=Travis)](https://travis-ci.org/jonschlinkert/is-number) + +> Returns true if the value is a finite number. + +Please consider following this project's author, [Jon Schlinkert](https://github.com/jonschlinkert), and consider starring the project to show your :heart: and support. + +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +$ npm install --save is-number +``` + +## Why is this needed? + +In JavaScript, it's not always as straightforward as it should be to reliably check if a value is a number. It's common for devs to use `+`, `-`, or `Number()` to cast a string value to a number (for example, when values are returned from user input, regex matches, parsers, etc). But there are many non-intuitive edge cases that yield unexpected results: + +```js +console.log(+[]); //=> 0 +console.log(+''); //=> 0 +console.log(+' '); //=> 0 +console.log(typeof NaN); //=> 'number' +``` + +This library offers a performant way to smooth out edge cases like these. + +## Usage + +```js +const isNumber = require('is-number'); +``` + +See the [tests](./test.js) for more examples. 
+ +### true + +```js +isNumber(5e3); // true +isNumber(0xff); // true +isNumber(-1.1); // true +isNumber(0); // true +isNumber(1); // true +isNumber(1.1); // true +isNumber(10); // true +isNumber(10.10); // true +isNumber(100); // true +isNumber('-1.1'); // true +isNumber('0'); // true +isNumber('012'); // true +isNumber('0xff'); // true +isNumber('1'); // true +isNumber('1.1'); // true +isNumber('10'); // true +isNumber('10.10'); // true +isNumber('100'); // true +isNumber('5e3'); // true +isNumber(parseInt('012')); // true +isNumber(parseFloat('012')); // true +``` + +### False + +Everything else is false, as you would expect: + +```js +isNumber(Infinity); // false +isNumber(NaN); // false +isNumber(null); // false +isNumber(undefined); // false +isNumber(''); // false +isNumber(' '); // false +isNumber('foo'); // false +isNumber([1]); // false +isNumber([]); // false +isNumber(function () {}); // false +isNumber({}); // false +``` + +## Release history + +### 7.0.0 + +* Refactor. Now uses `.isFinite` if it exists. +* Performance is about the same as v6.0 when the value is a string or number. But it's now 3x-4x faster when the value is not a string or number. + +### 6.0.0 + +* Optimizations, thanks to @benaadams. + +### 5.0.0 + +**Breaking changes** + +* removed support for `instanceof Number` and `instanceof String` + +## Benchmarks + +As with all benchmarks, take these with a grain of salt. See the [benchmarks](./benchmark/index.js) for more detail. + +``` +# all +v7.0 x 413,222 ops/sec ±2.02% (86 runs sampled) +v6.0 x 111,061 ops/sec ±1.29% (85 runs sampled) +parseFloat x 317,596 ops/sec ±1.36% (86 runs sampled) +fastest is 'v7.0' + +# string +v7.0 x 3,054,496 ops/sec ±1.05% (89 runs sampled) +v6.0 x 2,957,781 ops/sec ±0.98% (88 runs sampled) +parseFloat x 3,071,060 ops/sec ±1.13% (88 runs sampled) +fastest is 'parseFloat,v7.0' + +# number +v7.0 x 3,146,895 ops/sec ±0.89% (89 runs sampled) +v6.0 x 3,214,038 ops/sec ±1.07% (89 runs sampled) +parseFloat x 3,077,588 ops/sec ±1.07% (87 runs sampled) +fastest is 'v6.0' +``` + +## About + +
+
+**Contributing**
+
+Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new).
+
+ +
+
+**Running Tests**
+
+Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command:
+
+```sh
+$ npm install && npm test
+```
+
+ +
+
+**Building docs**
+
+_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_
+
+To generate the readme, run the following command:
+
+```sh
+$ npm install -g verbose/verb#dev verb-generate-readme && verb
+```
+
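A detail the readme glosses over (see the `index.js` later in this diff): for values that are already numbers, `is-number` uses `num - num === 0`, which cheaply rejects both `NaN` and `±Infinity`. A short illustration of why that works:

```js
// Why `num - num === 0` filters out NaN and Infinity:
console.log(5 - 5 === 0);                 //=> true  (any finite number passes)
console.log(NaN - NaN === 0);             //=> false (NaN - NaN is NaN)
console.log(Infinity - Infinity === 0);   //=> false (Infinity - Infinity is NaN)
console.log(-Infinity - -Infinity === 0); //=> false (same, with the sign flipped)
```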
+ +### Related projects + +You might also be interested in these projects: + +* [is-plain-object](https://www.npmjs.com/package/is-plain-object): Returns true if an object was created by the `Object` constructor. | [homepage](https://github.com/jonschlinkert/is-plain-object "Returns true if an object was created by the `Object` constructor.") +* [is-primitive](https://www.npmjs.com/package/is-primitive): Returns `true` if the value is a primitive. | [homepage](https://github.com/jonschlinkert/is-primitive "Returns `true` if the value is a primitive. ") +* [isobject](https://www.npmjs.com/package/isobject): Returns true if the value is an object and not an array or null. | [homepage](https://github.com/jonschlinkert/isobject "Returns true if the value is an object and not an array or null.") +* [kind-of](https://www.npmjs.com/package/kind-of): Get the native type of a value. | [homepage](https://github.com/jonschlinkert/kind-of "Get the native type of a value.") + +### Contributors + +| **Commits** | **Contributor** | +| --- | --- | +| 49 | [jonschlinkert](https://github.com/jonschlinkert) | +| 5 | [charlike-old](https://github.com/charlike-old) | +| 1 | [benaadams](https://github.com/benaadams) | +| 1 | [realityking](https://github.com/realityking) | + +### Author + +**Jon Schlinkert** + +* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert) +* [GitHub Profile](https://github.com/jonschlinkert) +* [Twitter Profile](https://twitter.com/jonschlinkert) + +### License + +Copyright © 2018, [Jon Schlinkert](https://github.com/jonschlinkert). +Released under the [MIT License](LICENSE). + +*** + +_This file was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme), v0.6.0, on June 15, 2018._ \ No newline at end of file diff --git a/node_modules/is-number/index.js b/node_modules/is-number/index.js new file mode 100644 index 000000000..27f19b757 --- /dev/null +++ b/node_modules/is-number/index.js @@ -0,0 +1,18 @@ +/*! + * is-number + * + * Copyright (c) 2014-present, Jon Schlinkert. + * Released under the MIT License. + */ + +'use strict'; + +module.exports = function(num) { + if (typeof num === 'number') { + return num - num === 0; + } + if (typeof num === 'string' && num.trim() !== '') { + return Number.isFinite ? 
Number.isFinite(+num) : isFinite(+num); + } + return false; +}; diff --git a/node_modules/is-number/package.json b/node_modules/is-number/package.json new file mode 100644 index 000000000..5015f4f49 --- /dev/null +++ b/node_modules/is-number/package.json @@ -0,0 +1,122 @@ +{ + "_from": "is-number@^7.0.0", + "_id": "is-number@7.0.0", + "_inBundle": false, + "_integrity": "sha512-41Cifkg6e8TylSpdtTpeLVMqvSBEVzTttHvERD741+pnZ8ANv0004MRL43QKPDlK9cGvNp6NZWZUBlbGXYxxng==", + "_location": "/is-number", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-number@^7.0.0", + "name": "is-number", + "escapedName": "is-number", + "rawSpec": "^7.0.0", + "saveSpec": null, + "fetchSpec": "^7.0.0" + }, + "_requiredBy": [ + "/to-regex-range" + ], + "_resolved": "https://registry.npmjs.org/is-number/-/is-number-7.0.0.tgz", + "_shasum": "7535345b896734d5f80c4d06c50955527a14f12b", + "_spec": "is-number@^7.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\to-regex-range", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/jonschlinkert/is-number/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Jon Schlinkert", + "url": "http://twitter.com/jonschlinkert" + }, + { + "name": "Olsten Larck", + "url": "https://i.am.charlike.online" + }, + { + "name": "Rouven Weßling", + "url": "www.rouvenwessling.de" + } + ], + "deprecated": false, + "description": "Returns true if a number or string value is a finite number. Useful for regex matches, parsing, user input, etc.", + "devDependencies": { + "ansi": "^0.3.1", + "benchmark": "^2.1.4", + "gulp-format-md": "^1.0.0", + "mocha": "^3.5.3" + }, + "engines": { + "node": ">=0.12.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/jonschlinkert/is-number", + "keywords": [ + "cast", + "check", + "coerce", + "coercion", + "finite", + "integer", + "is", + "isnan", + "is-nan", + "is-num", + "is-number", + "isnumber", + "isfinite", + "istype", + "kind", + "math", + "nan", + "num", + "number", + "numeric", + "parseFloat", + "parseInt", + "test", + "type", + "typeof", + "value" + ], + "license": "MIT", + "main": "index.js", + "name": "is-number", + "repository": { + "type": "git", + "url": "git+https://github.com/jonschlinkert/is-number.git" + }, + "scripts": { + "test": "mocha" + }, + "verb": { + "toc": false, + "layout": "default", + "tasks": [ + "readme" + ], + "related": { + "list": [ + "is-plain-object", + "is-primitive", + "isobject", + "kind-of" + ] + }, + "plugins": [ + "gulp-format-md" + ], + "lint": { + "reflinks": true + } + }, + "version": "7.0.0" +} diff --git a/node_modules/is-obj/index.d.ts b/node_modules/is-obj/index.d.ts new file mode 100644 index 000000000..e8a985a9d --- /dev/null +++ b/node_modules/is-obj/index.d.ts @@ -0,0 +1,22 @@ +/** +Check if a value is an object. + +Keep in mind that array, function, regexp, etc, are objects in JavaScript. 
+ +@example +``` +import isObject = require('is-obj'); + +isObject({foo: 'bar'}); +//=> true + +isObject([1, 2, 3]); +//=> true + +isObject('foo'); +//=> false +``` +*/ +declare function isObject(value: unknown): value is object; + +export = isObject; diff --git a/node_modules/is-obj/index.js b/node_modules/is-obj/index.js new file mode 100644 index 000000000..c1755902d --- /dev/null +++ b/node_modules/is-obj/index.js @@ -0,0 +1,6 @@ +'use strict'; + +module.exports = value => { + const type = typeof value; + return value !== null && (type === 'object' || type === 'function'); +}; diff --git a/node_modules/is-obj/license b/node_modules/is-obj/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/is-obj/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
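The one-line `is-obj` implementation above has to check `value !== null` explicitly because of a long-standing `typeof` quirk, and it deliberately counts functions as objects. A brief illustration, assuming the package is installed:

```js
// Why is-obj checks `value !== null` and includes the 'function' type:
console.log(typeof null);           //=> 'object'   (historical typeof quirk)
console.log(typeof function () {}); //=> 'function' (functions are objects in JavaScript)

const isObject = require('is-obj');
console.log(isObject(null));        //=> false
console.log(isObject(() => {}));    //=> true
console.log(isObject(/regex/));     //=> true (regexps are objects, as the docs note)
```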
diff --git a/node_modules/is-obj/package.json b/node_modules/is-obj/package.json new file mode 100644 index 000000000..60c0b2656 --- /dev/null +++ b/node_modules/is-obj/package.json @@ -0,0 +1,66 @@ +{ + "_from": "is-obj@^2.0.0", + "_id": "is-obj@2.0.0", + "_inBundle": false, + "_integrity": "sha512-drqDG3cbczxxEJRoOXcOjtdp1J/lyp1mNn0xaznRs8+muBhgQcrnbspox5X5fOw0HnMnbfDzvnEMEtqDEJEo8w==", + "_location": "/is-obj", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-obj@^2.0.0", + "name": "is-obj", + "escapedName": "is-obj", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/dot-prop" + ], + "_resolved": "https://registry.npmjs.org/is-obj/-/is-obj-2.0.0.tgz", + "_shasum": "473fb05d973705e3fd9620545018ca8e22ef4982", + "_spec": "is-obj@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\dot-prop", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is-obj/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if a value is an object", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/is-obj#readme", + "keywords": [ + "object", + "is", + "check", + "test", + "type" + ], + "license": "MIT", + "name": "is-obj", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is-obj.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "2.0.0" +} diff --git a/node_modules/is-obj/readme.md b/node_modules/is-obj/readme.md new file mode 100644 index 000000000..127ee2694 --- /dev/null +++ b/node_modules/is-obj/readme.md @@ -0,0 +1,39 @@ +# is-obj [![Build Status](https://travis-ci.org/sindresorhus/is-obj.svg?branch=master)](https://travis-ci.org/sindresorhus/is-obj) + +> Check if a value is an object + +Keep in mind that array, function, regexp, etc, are objects in JavaScript.
+See [`is-plain-obj`](https://github.com/sindresorhus/is-plain-obj) if you want to check for plain objects. + + +## Install + +``` +$ npm install is-obj +``` + + +## Usage + +```js +const isObject = require('is-obj'); + +isObject({foo: 'bar'}); +//=> true + +isObject([1, 2, 3]); +//=> true + +isObject('foo'); +//=> false +``` + + +## Related + +- [is](https://github.com/sindresorhus/is) - Type check values + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/is-path-inside/index.d.ts b/node_modules/is-path-inside/index.d.ts new file mode 100644 index 000000000..5cc3d804d --- /dev/null +++ b/node_modules/is-path-inside/index.d.ts @@ -0,0 +1,27 @@ +/** +Check if a path is inside another path. + +Note that relative paths are resolved against `process.cwd()` to make them absolute. + +_Important:_ This package is meant for use with path manipulation. It does not check if the paths exist nor does it resolve symlinks. You should not use this as a security mechanism to guard against access to certain places on the file system. + +@example +``` +import isPathInside = require('is-path-inside'); + +isPathInside('a/b/c', 'a/b'); +//=> true + +isPathInside('a/b/c', 'x/y'); +//=> false + +isPathInside('a/b/c', 'a/b/c'); +//=> false + +isPathInside('/Users/sindresorhus/dev/unicorn', '/Users/sindresorhus'); +//=> true +``` +*/ +declare function isPathInside(childPath: string, parentPath: string): boolean; + +export = isPathInside; diff --git a/node_modules/is-path-inside/index.js b/node_modules/is-path-inside/index.js new file mode 100644 index 000000000..28ed79c0f --- /dev/null +++ b/node_modules/is-path-inside/index.js @@ -0,0 +1,12 @@ +'use strict'; +const path = require('path'); + +module.exports = (childPath, parentPath) => { + const relation = path.relative(parentPath, childPath); + return Boolean( + relation && + relation !== '..' && + !relation.startsWith(`..${path.sep}`) && + relation !== path.resolve(childPath) + ); +}; diff --git a/node_modules/is-path-inside/license b/node_modules/is-path-inside/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/is-path-inside/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
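The `is-path-inside` implementation above leans entirely on `path.relative`: a child path is inside a parent when the relative path from parent to child is non-empty and does not climb out through `..`. An illustration of the intermediate values (POSIX-style paths, Node core `path` only):

```js
// What path.relative returns for the cases the module distinguishes:
const path = require('path');

console.log(path.relative('/a/b', '/a/b/c'));   //=> 'c'          (inside: no leading '..')
console.log(path.relative('/a/b', '/x/y'));     //=> '../../x/y'  (outside: escapes the parent)
console.log(path.relative('/a/b/c', '/a/b/c')); //=> ''           (same path: falsy, so not "inside")
```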
diff --git a/node_modules/is-path-inside/package.json b/node_modules/is-path-inside/package.json new file mode 100644 index 000000000..923ee25fd --- /dev/null +++ b/node_modules/is-path-inside/package.json @@ -0,0 +1,68 @@ +{ + "_from": "is-path-inside@^3.0.2", + "_id": "is-path-inside@3.0.3", + "_inBundle": false, + "_integrity": "sha512-Fd4gABb+ycGAmKou8eMftCupSir5lRxqf4aD/vd0cD2qc4HL07OjCeuHMr8Ro4CoMaeCKDB0/ECBOVWjTwUvPQ==", + "_location": "/is-path-inside", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-path-inside@^3.0.2", + "name": "is-path-inside", + "escapedName": "is-path-inside", + "rawSpec": "^3.0.2", + "saveSpec": null, + "fetchSpec": "^3.0.2" + }, + "_requiredBy": [ + "/is-installed-globally" + ], + "_resolved": "https://registry.npmjs.org/is-path-inside/-/is-path-inside-3.0.3.tgz", + "_shasum": "d231362e53a07ff2b0e0ea7fed049161ffd16283", + "_spec": "is-path-inside@^3.0.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\is-installed-globally", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/is-path-inside/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if a path is inside another path", + "devDependencies": { + "ava": "^2.1.0", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/is-path-inside#readme", + "keywords": [ + "path", + "inside", + "folder", + "directory", + "dir", + "file", + "resolve" + ], + "license": "MIT", + "name": "is-path-inside", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/is-path-inside.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "3.0.3" +} diff --git a/node_modules/is-path-inside/readme.md b/node_modules/is-path-inside/readme.md new file mode 100644 index 000000000..e8c4f928e --- /dev/null +++ b/node_modules/is-path-inside/readme.md @@ -0,0 +1,63 @@ +# is-path-inside + +> Check if a path is inside another path + + +## Install + +``` +$ npm install is-path-inside +``` + + +## Usage + +```js +const isPathInside = require('is-path-inside'); + +isPathInside('a/b/c', 'a/b'); +//=> true + +isPathInside('a/b/c', 'x/y'); +//=> false + +isPathInside('a/b/c', 'a/b/c'); +//=> false + +isPathInside('/Users/sindresorhus/dev/unicorn', '/Users/sindresorhus'); +//=> true +``` + + +## API + +### isPathInside(childPath, parentPath) + +Note that relative paths are resolved against `process.cwd()` to make them absolute. + +**Important:** This package is meant for use with path manipulation. It does not check if the paths exist nor does it resolve symlinks. You should not use this as a security mechanism to guard against access to certain places on the file system. + +#### childPath + +Type: `string` + +The path that should be inside `parentPath`. + +#### parentPath + +Type: `string` + +The path that should contain `childPath`. + + +--- + +
+
+Get professional support for this package with a Tidelift subscription. Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies.
+
diff --git a/node_modules/is-typedarray/LICENSE.md b/node_modules/is-typedarray/LICENSE.md new file mode 100644 index 000000000..ee27ba4b4 --- /dev/null +++ b/node_modules/is-typedarray/LICENSE.md @@ -0,0 +1,18 @@ +This software is released under the MIT license: + +Permission is hereby granted, free of charge, to any person obtaining a copy of +this software and associated documentation files (the "Software"), to deal in +the Software without restriction, including without limitation the rights to +use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of +the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS +FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR +COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER +IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN +CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/is-typedarray/README.md b/node_modules/is-typedarray/README.md new file mode 100644 index 000000000..275286391 --- /dev/null +++ b/node_modules/is-typedarray/README.md @@ -0,0 +1,16 @@ +# is-typedarray [![locked](http://badges.github.io/stability-badges/dist/locked.svg)](http://github.com/badges/stability-badges) + +Detect whether or not an object is a +[Typed Array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Typed_arrays). + +## Usage + +[![NPM](https://nodei.co/npm/is-typedarray.png)](https://nodei.co/npm/is-typedarray/) + +### isTypedArray(array) + +Returns `true` when array is a Typed Array, and `false` when it is not. + +## License + +MIT. See [LICENSE.md](http://github.com/hughsk/is-typedarray/blob/master/LICENSE.md) for details. 
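The `is-typedarray` readme above documents `isTypedArray(array)` but doesn't include a snippet; based on the `index.js` that follows, a minimal usage sketch:

```js
const isTypedArray = require('is-typedarray');

console.log(isTypedArray(new Uint8Array(4))); //=> true
console.log(isTypedArray([1, 2, 3]));         //=> false

// The two strategies it combines are exposed as well:
console.log(isTypedArray.strict(new Float32Array(2))); //=> true (instanceof checks)
console.log(isTypedArray.loose(new Float64Array(2)));  //=> true (Object.prototype.toString check)
```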
diff --git a/node_modules/is-typedarray/index.js b/node_modules/is-typedarray/index.js new file mode 100644 index 000000000..58596036c --- /dev/null +++ b/node_modules/is-typedarray/index.js @@ -0,0 +1,41 @@ +module.exports = isTypedArray +isTypedArray.strict = isStrictTypedArray +isTypedArray.loose = isLooseTypedArray + +var toString = Object.prototype.toString +var names = { + '[object Int8Array]': true + , '[object Int16Array]': true + , '[object Int32Array]': true + , '[object Uint8Array]': true + , '[object Uint8ClampedArray]': true + , '[object Uint16Array]': true + , '[object Uint32Array]': true + , '[object Float32Array]': true + , '[object Float64Array]': true +} + +function isTypedArray(arr) { + return ( + isStrictTypedArray(arr) + || isLooseTypedArray(arr) + ) +} + +function isStrictTypedArray(arr) { + return ( + arr instanceof Int8Array + || arr instanceof Int16Array + || arr instanceof Int32Array + || arr instanceof Uint8Array + || arr instanceof Uint8ClampedArray + || arr instanceof Uint16Array + || arr instanceof Uint32Array + || arr instanceof Float32Array + || arr instanceof Float64Array + ) +} + +function isLooseTypedArray(arr) { + return names[toString.call(arr)] +} diff --git a/node_modules/is-typedarray/package.json b/node_modules/is-typedarray/package.json new file mode 100644 index 000000000..15d757ed0 --- /dev/null +++ b/node_modules/is-typedarray/package.json @@ -0,0 +1,60 @@ +{ + "_from": "is-typedarray@^1.0.0", + "_id": "is-typedarray@1.0.0", + "_inBundle": false, + "_integrity": "sha1-5HnICFjfDBsR3dppQPlgEfzaSpo=", + "_location": "/is-typedarray", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-typedarray@^1.0.0", + "name": "is-typedarray", + "escapedName": "is-typedarray", + "rawSpec": "^1.0.0", + "saveSpec": null, + "fetchSpec": "^1.0.0" + }, + "_requiredBy": [ + "/typedarray-to-buffer", + "/write-file-atomic" + ], + "_resolved": "https://registry.npmjs.org/is-typedarray/-/is-typedarray-1.0.0.tgz", + "_shasum": "e479c80858df0c1b11ddda6940f96011fcda4a9a", + "_spec": "is-typedarray@^1.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\write-file-atomic", + "author": { + "name": "Hugh Kennedy", + "email": "hughskennedy@gmail.com", + "url": "http://hughsk.io/" + }, + "bugs": { + "url": "https://github.com/hughsk/is-typedarray/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Detect whether or not an object is a Typed Array", + "devDependencies": { + "tape": "^2.13.1" + }, + "homepage": "https://github.com/hughsk/is-typedarray", + "keywords": [ + "typed", + "array", + "detect", + "is", + "util" + ], + "license": "MIT", + "main": "index.js", + "name": "is-typedarray", + "repository": { + "type": "git", + "url": "git://github.com/hughsk/is-typedarray.git" + }, + "scripts": { + "test": "node test" + }, + "version": "1.0.0" +} diff --git a/node_modules/is-typedarray/test.js b/node_modules/is-typedarray/test.js new file mode 100644 index 000000000..b0c176fa3 --- /dev/null +++ b/node_modules/is-typedarray/test.js @@ -0,0 +1,34 @@ +var test = require('tape') +var ista = require('./') + +test('strict', function(t) { + t.ok(ista.strict(new Int8Array), 'Int8Array') + t.ok(ista.strict(new Int16Array), 'Int16Array') + t.ok(ista.strict(new Int32Array), 'Int32Array') + t.ok(ista.strict(new Uint8Array), 'Uint8Array') + t.ok(ista.strict(new Uint16Array), 'Uint16Array') + t.ok(ista.strict(new Uint32Array), 'Uint32Array') + 
t.ok(ista.strict(new Float32Array), 'Float32Array') + t.ok(ista.strict(new Float64Array), 'Float64Array') + + t.ok(!ista.strict(new Array), 'Array') + t.ok(!ista.strict([]), '[]') + + t.end() +}) + +test('loose', function(t) { + t.ok(ista.loose(new Int8Array), 'Int8Array') + t.ok(ista.loose(new Int16Array), 'Int16Array') + t.ok(ista.loose(new Int32Array), 'Int32Array') + t.ok(ista.loose(new Uint8Array), 'Uint8Array') + t.ok(ista.loose(new Uint16Array), 'Uint16Array') + t.ok(ista.loose(new Uint32Array), 'Uint32Array') + t.ok(ista.loose(new Float32Array), 'Float32Array') + t.ok(ista.loose(new Float64Array), 'Float64Array') + + t.ok(!ista.loose(new Array), 'Array') + t.ok(!ista.loose([]), '[]') + + t.end() +}) diff --git a/node_modules/is-yarn-global/.travis.yml b/node_modules/is-yarn-global/.travis.yml new file mode 100644 index 000000000..c576a02f5 --- /dev/null +++ b/node_modules/is-yarn-global/.travis.yml @@ -0,0 +1,4 @@ +language: node_js +node_js: + - "8" + - "6" diff --git a/node_modules/is-yarn-global/LICENSE b/node_modules/is-yarn-global/LICENSE new file mode 100644 index 000000000..04d3e3191 --- /dev/null +++ b/node_modules/is-yarn-global/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2018 LitoMore + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/is-yarn-global/README.md b/node_modules/is-yarn-global/README.md new file mode 100644 index 000000000..d070e2a54 --- /dev/null +++ b/node_modules/is-yarn-global/README.md @@ -0,0 +1,28 @@ +# is-yarn-global + +[![](https://img.shields.io/travis/LitoMore/is-yarn-global/master.svg)](https://travis-ci.org/LitoMore/is-yarn-global) +[![](https://img.shields.io/npm/v/is-yarn-global.svg)](https://www.npmjs.com/package/is-yarn-global) +[![](https://img.shields.io/npm/l/is-yarn-global.svg)](https://github.com/LitoMore/is-yarn-global/blob/master/LICENSE) +[![](https://img.shields.io/badge/code_style-XO-5ed9c7.svg)](https://github.com/sindresorhus/xo) + +Check if installed by yarn globally without any `fs` calls + +## Install + +```bash +$ npm install is-yarn-global +``` + +## Usage + +Just require it in your package. 
+ +```javascript +const isYarnGlobal = require('is-yarn-global'); + +console.log(isYarnGlobal()); +``` + +## License + +MIT © [LitoMore](https://github.com/LitoMore) diff --git a/node_modules/is-yarn-global/index.js b/node_modules/is-yarn-global/index.js new file mode 100644 index 000000000..4d0c642cf --- /dev/null +++ b/node_modules/is-yarn-global/index.js @@ -0,0 +1,12 @@ +'use strict'; + +const path = require('path'); + +module.exports = function () { + const isWindows = process.platform === 'win32'; + const yarnPath = isWindows ? path.join('Yarn', 'config', 'global') : path.join('.config', 'yarn', 'global'); + if (__dirname.includes(yarnPath)) { + return true; + } + return false; +}; diff --git a/node_modules/is-yarn-global/package.json b/node_modules/is-yarn-global/package.json new file mode 100644 index 000000000..ef59a6493 --- /dev/null +++ b/node_modules/is-yarn-global/package.json @@ -0,0 +1,50 @@ +{ + "_from": "is-yarn-global@^0.3.0", + "_id": "is-yarn-global@0.3.0", + "_inBundle": false, + "_integrity": "sha512-VjSeb/lHmkoyd8ryPVIKvOCn4D1koMqY+vqyjjUfc3xyKtP4dYOxM44sZrnqQSzSds3xyOrUTLTC9LVCVgLngw==", + "_location": "/is-yarn-global", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "is-yarn-global@^0.3.0", + "name": "is-yarn-global", + "escapedName": "is-yarn-global", + "rawSpec": "^0.3.0", + "saveSpec": null, + "fetchSpec": "^0.3.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/is-yarn-global/-/is-yarn-global-0.3.0.tgz", + "_shasum": "d502d3382590ea3004893746754c89139973e232", + "_spec": "is-yarn-global@^0.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "LitoMore", + "url": "litomore@gmail.com" + }, + "bugs": { + "url": "https://github.com/LitoMore/is-yarn-global/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Check if installed by yarn globally without any `fs` calls", + "devDependencies": { + "ava": "^0.24.0", + "xo": "^0.18.2" + }, + "homepage": "https://github.com/LitoMore/is-yarn-global#readme", + "license": "MIT", + "name": "is-yarn-global", + "repository": { + "type": "git", + "url": "git+ssh://git@github.com/LitoMore/is-yarn-global.git" + }, + "scripts": { + "test": "xo" + }, + "version": "0.3.0" +} diff --git a/node_modules/jake/Makefile b/node_modules/jake/Makefile new file mode 100644 index 000000000..3d0574ece --- /dev/null +++ b/node_modules/jake/Makefile @@ -0,0 +1,44 @@ +# +# Jake JavaScript build tool +# Copyright 2112 Matthew Eernisse (mde@fleegix.org) +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +# + +.PHONY: all build install clean uninstall + +PREFIX=/usr/local +DESTDIR= + +all: build + +build: + @echo 'Jake built.' 
+ +install: + @mkdir -p $(DESTDIR)$(PREFIX)/bin && \ + mkdir -p $(DESTDIR)$(PREFIX)/lib/node_modules/jake && \ + mkdir -p ./node_modules && \ + npm install utilities minimatch && \ + cp -R ./* $(DESTDIR)$(PREFIX)/lib/node_modules/jake/ && \ + ln -snf ../lib/node_modules/jake/bin/cli.js $(DESTDIR)$(PREFIX)/bin/jake && \ + chmod 755 $(DESTDIR)$(PREFIX)/lib/node_modules/jake/bin/cli.js && \ + echo 'Jake installed.' + +clean: + @true + +uninstall: + @rm -f $(DESTDIR)$(PREFIX)/bin/jake && \ + rm -fr $(DESTDIR)$(PREFIX)/lib/node_modules/jake/ && \ + echo 'Jake uninstalled.' diff --git a/node_modules/jake/README.md b/node_modules/jake/README.md new file mode 100644 index 000000000..e9388501d --- /dev/null +++ b/node_modules/jake/README.md @@ -0,0 +1,17 @@ +### Jake -- the JavaScript build tool for Node.js + +[![Build Status](https://travis-ci.org/jakejs/jake.svg?branch=master)](https://travis-ci.org/jakejs/jake) + +Documentation site at [http://jakejs.com](http://jakejs.com/) + +### Contributing +1. [Install node](http://nodejs.org/#download). +2. Clone this repository `$ git clone git@github.com:jakejs/jake.git`. +3. Install dependencies `$ npm install`. +4. Run tests with `$ npm test`. +5. Start Hacking! + +### License + +Licensed under the Apache License, Version 2.0 +() diff --git a/node_modules/jake/bin/bash_completion.sh b/node_modules/jake/bin/bash_completion.sh new file mode 100644 index 000000000..bb2599562 --- /dev/null +++ b/node_modules/jake/bin/bash_completion.sh @@ -0,0 +1,41 @@ +#!/bin/bash + +# http://stackoverflow.com/a/246128 +SOURCE="${BASH_SOURCE[0]}" +while [ -h "$SOURCE" ]; do # resolve $SOURCE until the file is no longer a symlink + DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )" + SOURCE="$(readlink "$SOURCE")" + [[ $SOURCE != /* ]] && SOURCE="$DIR/$SOURCE" # if $SOURCE was a relative symlink, we need to resolve it relative to the path where the symlink file was located +done +JAKE_BIN_DIR="$( cd -P "$( dirname "$SOURCE" )" && pwd )" + +# http://stackoverflow.com/a/12495480 +# http://stackoverflow.com/a/28647824 +_auto_jake() +{ + local cur + local -a COMPGEN=() + _get_comp_words_by_ref -n : -c cur + + # run auto-completions in jake via our auto_complete.js wrapper + local -a auto_complete_info=( $(export COMP_LINE="${COMP_LINE}" && ${JAKE_BIN_DIR}/auto_complete.js "$cur" "${3}") ) + # check reply flag + local reply_flag="${auto_complete_info[0]}" + if [[ "${reply_flag}" == "no-complete" ]]; then + return 1 + fi + local auto_completions=("${auto_complete_info[@]:1}") + COMPGEN=( $(compgen -W "${auto_completions[*]}" -- "$cur") ) + COMPREPLY=( "${COMPGEN[@]}" ) + + __ltrim_colon_completions "$cur" + + # do we need another space?? + if [[ "${reply_flag}" == "yes-space" ]]; then + COMPREPLY=( "${COMPGEN[@]}" " " ) + fi + + return 0 +} + +complete -o default -F _auto_jake jake diff --git a/node_modules/jake/bin/cli.js b/node_modules/jake/bin/cli.js new file mode 100644 index 000000000..9f68abbf7 --- /dev/null +++ b/node_modules/jake/bin/cli.js @@ -0,0 +1,31 @@ +#!/usr/bin/env node +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +// Try to load a local jake +try { + require(`${ process.cwd() }/node_modules/jake`); +} +// If that fails, likely running globally +catch(e) { + require('../lib/jake'); +} + +var args = process.argv.slice(2); + +jake.run.apply(jake, args); diff --git a/node_modules/jake/jakefile.js b/node_modules/jake/jakefile.js new file mode 100644 index 000000000..b0ae79bb8 --- /dev/null +++ b/node_modules/jake/jakefile.js @@ -0,0 +1,105 @@ +let fs = require('fs') +let path = require('path'); +let proc = require('child_process'); + +const PROJECT_DIR = process.cwd(); +process.env.PROJECT_DIR = PROJECT_DIR; + +namespace('doc', function () { + task('generate', ['doc:clobber'], function () { + var cmd = '../node-jsdoc-toolkit/app/run.js -n -r=100 ' + + '-t=../node-jsdoc-toolkit/templates/codeview -d=./doc/ ./lib'; + jake.logger.log('Generating docs ...'); + jake.exec([cmd], function () { + jake.logger.log('Done.'); + complete(); + }); + }, {async: true}); + + task('clobber', function () { + var cmd = 'rm -fr ./doc/*'; + jake.exec([cmd], function () { + jake.logger.log('Clobbered old docs.'); + complete(); + }); + }, {async: true}); + +}); + +desc('Generate docs for Jake'); +task('doc', ['doc:generate']); + +npmPublishTask('jake', function () { + this.packageFiles.include([ + 'Makefile', + 'jakefile.js', + 'README.md', + 'package.json', + 'usage.txt', + 'lib/**', + 'bin/**', + 'test/**' + ]); + this.packageFiles.exclude([ + 'test/tmp' + ]); +}); + +jake.Task['publish:package'].directory = PROJECT_DIR; + +namespace('test', function () { + + let integrationTest = task('integration', ['publish:package'], async function () { + let pkg = JSON.parse(fs.readFileSync(`${PROJECT_DIR}/package.json`).toString()); + let version = pkg.version; + + proc.execSync('rm -rf ./node_modules'); + // Install from the actual package, run tests from the packaged binary + proc.execSync(`mkdir -p node_modules/.bin && mv ${PROJECT_DIR}/pkg/jake-v` + + `${version} node_modules/jake && ln -s ${process.cwd()}` + + '/node_modules/jake/bin/cli.js ./node_modules/.bin/jake'); + + let testArgs = []; + if (process.env.filter) { + testArgs.push(process.env.filter); + } + else { + testArgs.push('*.js'); + } + let spawned = proc.spawn(`${PROJECT_DIR}/node_modules/.bin/mocha`, testArgs, { + stdio: 'inherit' + }); + return new Promise((resolve, reject) => { + spawned.on('exit', () => { + if (!(process.env.noclobber || process.env.noClobber)) { + proc.execSync('rm -rf tmp_publish && rm -rf package.json' + + ' && rm -rf package-lock.json && rm -rf node_modules'); + // Rather than invoking 'clobber' task + jake.rmRf(`${PROJECT_DIR}/pkg`); + } + resolve(); + }); + }); + + }); + + integrationTest.directory = `${PROJECT_DIR}/test/integration`; + + let unitTest = task('unit', async function () { + let testArgs = []; + if (process.env.filter) { + testArgs.push(process.env.filter); + } + else { + testArgs.push('*.js'); + } + let spawned = proc.spawn(`${PROJECT_DIR}/node_modules/.bin/mocha`, testArgs, { + stdio: 'inherit' + }); + }); + + unitTest.directory = `${PROJECT_DIR}/test/unit`; +}); + +desc('Runs all tests'); 
+task('test', ['test:unit', 'test:integration']); diff --git a/node_modules/jake/lib/api.js b/node_modules/jake/lib/api.js new file mode 100644 index 000000000..9f09140eb --- /dev/null +++ b/node_modules/jake/lib/api.js @@ -0,0 +1,409 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ +let { uuid } = require('./utils'); + +let api = new (function () { + /** + @name task + @static + @function + @description Creates a Jake Task + ` + @param {String} name The name of the Task + @param {Array} [prereqs] Prerequisites to be run before this task + @param {Function} [action] The action to perform for this task + @param {Object} [opts] + @param {Boolean} [opts.asyc=false] Perform this task asynchronously. + If you flag a task with this option, you must call the global + `complete` method inside the task's action, for execution to proceed + to the next task. + + @example + desc('This is the default task.'); + task('default', function (params) { + console.log('This is the default task.'); + }); + + desc('This task has prerequisites.'); + task('hasPrereqs', ['foo', 'bar', 'baz'], function (params) { + console.log('Ran some prereqs first.'); + }); + + desc('This is an asynchronous task.'); + task('asyncTask', function () { + setTimeout(complete, 1000); + }, {async: true}); + */ + this.task = function (name, prereqs, action, opts) { + let args = Array.prototype.slice.call(arguments); + let createdTask; + args.unshift('task'); + createdTask = jake.createTask.apply(global, args); + jake.currentTaskDescription = null; + return createdTask; + }; + + /** + @name rule + @static + @function + @description Creates a Jake Suffix Rule + ` + @param {String} pattern The suffix name of the objective + @param {String} source The suffix name of the objective + @param {Array} [prereqs] Prerequisites to be run before this task + @param {Function} [action] The action to perform for this task + @param {Object} [opts] + @param {Boolean} [opts.asyc=false] Perform this task asynchronously. + If you flag a task with this option, you must call the global + `complete` method inside the task's action, for execution to proceed + to the next task. 
+ @example + desc('This is a rule, which does not support namespace or pattern.'); + rule('.o', '.c', {async: true}, function () { + let cmd = util.format('gcc -o %s %s', this.name, this.source); + jake.exec([cmd], function () { + complete(); + }, {printStdout: true}); + }); + + desc('This rule has prerequisites.'); + rule('.o', '.c', ['util.h'], {async: true}, function () { + let cmd = util.format('gcc -o %s %s', this.name, this.source); + jake.exec([cmd], function () { + complete(); + }, {printStdout: true}); + }); + + desc('This is a rule with patterns.'); + rule('%.o', '%.c', {async: true}, function () { + let cmd = util.format('gcc -o %s %s', this.name, this.source); + jake.exec([cmd], function () { + complete(); + }, {printStdout: true}); + }); + + desc('This is another rule with patterns.'); + rule('obj/%.o', 'src/%.c', {async: true}, function () { + let cmd = util.format('gcc -o %s %s', this.name, this.source); + jake.exec([cmd], function () { + complete(); + }, {printStdout: true}); + }); + + desc('This is an example with chain rules.'); + rule('%.pdf', '%.dvi', {async: true}, function () { + let cmd = util.format('dvipdfm %s',this.source); + jake.exec([cmd], function () { + complete(); + }, {printStdout: true}); + }); + + rule('%.dvi', '%.tex', {async: true}, function () { + let cmd = util.format('latex %s',this.source); + jake.exec([cmd], function () { + complete(); + }, {printStdout: true}); + }); + + desc('This rule has a namespace.'); + task('default', ['debug:obj/main.o]); + + namespace('debug', {async: true}, function() { + rule('obj/%.o', 'src/%.c', function () { + // ... + }); + } + */ + this.rule = function () { + let args = Array.prototype.slice.call(arguments); + let arg; + let pattern = args.shift(); + let source = args.shift(); + let prereqs = []; + let action = function () {}; + let opts = {}; + let key = pattern.toString(); // May be a RegExp + + while ((arg = args.shift())) { + if (typeof arg == 'function') { + action = arg; + } + else if (Array.isArray(arg)) { + prereqs = arg; + } + else { + opts = arg; + } + } + + jake.currentNamespace.rules[key] = new jake.Rule({ + pattern: pattern, + source: source, + prereqs: prereqs, + action: action, + opts: opts, + desc: jake.currentTaskDescription, + ns: jake.currentNamespace + }); + jake.currentTaskDescription = null; + }; + + /** + @name directory + @static + @function + @description Creates a Jake DirectoryTask. Can be used as a prerequisite + for FileTasks, or for simply ensuring a directory exists for use with a + Task's action. + ` + @param {String} name The name of the DiretoryTask + + @example + + // Creates the package directory for distribution + directory('pkg'); + */ + this.directory = function (name) { + let args = Array.prototype.slice.call(arguments); + let createdTask; + args.unshift('directory'); + createdTask = jake.createTask.apply(global, args); + jake.currentTaskDescription = null; + return createdTask; + }; + + /** + @name file + @static + @function + @description Creates a Jake FileTask. + ` + @param {String} name The name of the FileTask + @param {Array} [prereqs] Prerequisites to be run before this task + @param {Function} [action] The action to create this file, if it doesn't + exist already. + @param {Object} [opts] + @param {Array} [opts.asyc=false] Perform this task asynchronously. + If you flag a task with this option, you must call the global + `complete` method inside the task's action, for execution to proceed + to the next task. 
+ + */ + this.file = function (name, prereqs, action, opts) { + let args = Array.prototype.slice.call(arguments); + let createdTask; + args.unshift('file'); + createdTask = jake.createTask.apply(global, args); + jake.currentTaskDescription = null; + return createdTask; + }; + + /** + @name desc + @static + @function + @description Creates a description for a Jake Task (or FileTask, + DirectoryTask). When invoked, the description that iscreated will + be associated with whatever Task is created next. + ` + @param {String} description The description for the Task + */ + this.desc = function (description) { + jake.currentTaskDescription = description; + }; + + /** + @name namespace + @static + @function + @description Creates a namespace which allows logical grouping + of tasks, and prevents name-collisions with task-names. Namespaces + can be nested inside of other namespaces. + ` + @param {String} name The name of the namespace + @param {Function} scope The enclosing scope for the namespaced tasks + + @example + namespace('doc', function () { + task('generate', ['doc:clobber'], function () { + // Generate some docs + }); + + task('clobber', function () { + // Clobber the doc directory first + }); + }); + */ + this.namespace = function (name, closure) { + let curr = jake.currentNamespace; + let ns = curr.childNamespaces[name] || new jake.Namespace(name, curr); + let fn = closure || function () {}; + curr.childNamespaces[name] = ns; + jake.currentNamespace = ns; + fn(); + jake.currentNamespace = curr; + jake.currentTaskDescription = null; + return ns; + }; + + /** + @name complete + @static + @function + @description Completes an asynchronous task, allowing Jake's + execution to proceed to the next task. Calling complete globally or without + arguments completes the last task on the invocationChain. If you use parallel + execution of prereqs this will probably complete a wrong task. You should call this + function with this task as the first argument, before the optional return value. + Alternatively you can call task.complete() + ` + @example + task('generate', ['doc:clobber'], function () { + exec('./generate_docs.sh', function (err, stdout, stderr) { + if (err || stderr) { + fail(err || stderr); + } + else { + console.log(stdout); + complete(); + } + }); + }, {async: true}); + */ + this.complete = function (task, val) { + //this should detect if the first arg is a task, but I guess it should be more thorough + if(task && task. _currentPrereqIndex >=0 ) { + task.complete(val); + } + else { + val = task; + if(jake._invocationChain.length > 0) { + jake._invocationChain[jake._invocationChain.length-1].complete(val); + } + } + }; + + /** + @name fail + @static + @function + @description Causes Jake execution to abort with an error. + Allows passing an optional error code, which will be used to + set the exit-code of exiting process. + ` + @param {Error|String} err The error to thow when aborting execution. + If this argument is an Error object, it will simply be thrown. If + a String, it will be used as the error-message. (If it is a multi-line + String, the first line will be used as the Error message, and the + remaining lines will be used as the error-stack.) + + @example + task('createTests, function () { + if (!fs.existsSync('./tests')) { + fail('Test directory does not exist.'); + } + else { + // Do some testing stuff ... 
+ } + }); + */ + this.fail = function (err, code) { + let msg; + let errObj; + if (code) { + jake.errorCode = code; + } + if (err) { + if (typeof err == 'string') { + // Use the initial or only line of the error as the error-message + // If there was a multi-line error, use the rest as the stack + msg = err.split('\n'); + errObj = new Error(msg.shift()); + if (msg.length) { + errObj.stack = msg.join('\n'); + } + throw errObj; + } + else if (err instanceof Error) { + throw err; + } + else { + throw new Error(err.toString()); + } + } + else { + throw new Error(); + } + }; + + this.packageTask = function (name, version, prereqs, definition) { + return new jake.PackageTask(name, version, prereqs, definition); + }; + + this.publishTask = function (name, prereqs, opts, definition) { + return new jake.PublishTask(name, prereqs, opts, definition); + }; + + // Backward-compat + this.npmPublishTask = function (name, prereqs, opts, definition) { + return new jake.PublishTask(name, prereqs, opts, definition); + }; + + this.testTask = function () { + let ctor = function () {}; + let t; + ctor.prototype = jake.TestTask.prototype; + t = new ctor(); + jake.TestTask.apply(t, arguments); + return t; + }; + + this.setTaskTimeout = function (t) { + this._taskTimeout = t; + }; + + this.setSeriesAutoPrefix = function (prefix) { + this._seriesAutoPrefix = prefix; + }; + + this.series = function (...args) { + let prereqs = args.map((arg) => { + let name = (this._seriesAutoPrefix || '') + arg.name; + jake.task(name, arg); + return name; + }); + let seriesName = uuid(); + let seriesTask = jake.task(seriesName, prereqs); + seriesTask._internal = true; + let res = function () { + return new Promise((resolve) => { + seriesTask.invoke(); + seriesTask.on('complete', (val) => { + resolve(val); + }); + }); + }; + Object.defineProperty(res, 'name', {value: uuid(), + writable: false}); + return res; + }; + +})(); + +module.exports = api; diff --git a/node_modules/jake/lib/jake.js b/node_modules/jake/lib/jake.js new file mode 100644 index 000000000..a463163c1 --- /dev/null +++ b/node_modules/jake/lib/jake.js @@ -0,0 +1,319 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * +*/ + +if (!global.jake) { + + let EventEmitter = require('events').EventEmitter; + // And so it begins + global.jake = new EventEmitter(); + + let fs = require('fs'); + let chalk = require('chalk'); + let taskNs = require('./task'); + let Task = taskNs.Task; + let FileTask = taskNs.FileTask; + let DirectoryTask = taskNs.DirectoryTask; + let Rule = require('./rule').Rule; + let Namespace = require('./namespace').Namespace; + let RootNamespace = require('./namespace').RootNamespace; + let api = require('./api'); + let utils = require('./utils'); + let Program = require('./program').Program; + let loader = require('./loader')(); + let pkg = JSON.parse(fs.readFileSync(__dirname + '/../package.json').toString()); + + const MAX_RULE_RECURSION_LEVEL = 16; + + // Globalize jake and top-level API methods (e.g., `task`, `desc`) + Object.assign(global, api); + + // Copy utils onto base jake + jake.logger = utils.logger; + jake.exec = utils.exec; + + // File utils should be aliased directly on base jake as well + Object.assign(jake, utils.file); + + // Also add top-level API methods to exported object for those who don't want to + // use the globals (`file` here will overwrite the 'file' utils namespace) + Object.assign(jake, api); + + Object.assign(jake, new (function () { + + this._invocationChain = []; + this._taskTimeout = 30000; + + // Public properties + // ================= + this.version = pkg.version; + // Used when Jake exits with a specific error-code + this.errorCode = null; + // Loads Jakefiles/jakelibdirs + this.loader = loader; + // The root of all ... namespaces + this.rootNamespace = new RootNamespace(); + // Non-namespaced tasks are placed into the default + this.defaultNamespace = this.rootNamespace; + // Start in the default + this.currentNamespace = this.defaultNamespace; + // Saves the description created by a 'desc' call that prefaces a + // 'task' call that defines a task. + this.currentTaskDescription = null; + this.program = new Program(); + this.FileList = require('filelist').FileList; + this.PackageTask = require('./package_task').PackageTask; + this.PublishTask = require('./publish_task').PublishTask; + this.TestTask = require('./test_task').TestTask; + this.Task = Task; + this.FileTask = FileTask; + this.DirectoryTask = DirectoryTask; + this.Namespace = Namespace; + this.Rule = Rule; + + this.parseAllTasks = function () { + let _parseNs = function (ns) { + let nsTasks = ns.tasks; + let nsNamespaces = ns.childNamespaces; + for (let q in nsTasks) { + let nsTask = nsTasks[q]; + jake.Task[nsTask.fullName] = nsTask; + } + for (let p in nsNamespaces) { + let nsNamespace = nsNamespaces[p]; + _parseNs(nsNamespace); + } + }; + _parseNs(jake.defaultNamespace); + }; + + /** + * Displays the list of descriptions avaliable for tasks defined in + * a Jakefile + */ + this.showAllTaskDescriptions = function (f) { + let p; + let maxTaskNameLength = 0; + let task; + let padding; + let name; + let descr; + let filter = typeof f == 'string' ? f : null; + + for (p in jake.Task) { + if (!Object.prototype.hasOwnProperty.call(jake.Task, p)) { + continue; + } + if (filter && p.indexOf(filter) == -1) { + continue; + } + task = jake.Task[p]; + // Record the length of the longest task name -- used for + // pretty alignment of the task descriptions + if (task.description) { + maxTaskNameLength = p.length > maxTaskNameLength ? 
+ p.length : maxTaskNameLength; + } + } + // Print out each entry with descriptions neatly aligned + for (p in jake.Task) { + if (!Object.prototype.hasOwnProperty.call(jake.Task, p)) { + continue; + } + if (filter && p.indexOf(filter) == -1) { + continue; + } + task = jake.Task[p]; + + //name = '\033[32m' + p + '\033[39m '; + name = chalk.green(p); + + descr = task.description; + if (descr) { + descr = chalk.gray('# ' + descr); + + // Create padding-string with calculated length + padding = (new Array(maxTaskNameLength - p.length + 2)).join(' '); + + console.log('jake ' + name + padding + descr); + } + } + }; + + this.createTask = function () { + let args = Array.prototype.slice.call(arguments); + let arg; + let obj; + let task; + let type; + let name; + let action; + let opts = {}; + let prereqs = []; + + type = args.shift(); + + // name, [deps], [action] + // Name (string) + deps (array) format + if (typeof args[0] == 'string') { + name = args.shift(); + if (Array.isArray(args[0])) { + prereqs = args.shift(); + } + } + // name:deps, [action] + // Legacy object-literal syntax, e.g.: {'name': ['depA', 'depB']} + else { + obj = args.shift(); + for (let p in obj) { + prereqs = prereqs.concat(obj[p]); + name = p; + } + } + + // Optional opts/callback or callback/opts + while ((arg = args.shift())) { + if (typeof arg == 'function') { + action = arg; + } + else { + opts = Object.assign(Object.create(null), arg); + } + } + + task = jake.currentNamespace.resolveTask(name); + if (task && !action) { + // Task already exists and no action, just update prereqs, and return it. + task.prereqs = task.prereqs.concat(prereqs); + return task; + } + + switch (type) { + case 'directory': + action = function () { + jake.mkdirP(name); + }; + task = new DirectoryTask(name, prereqs, action, opts); + break; + case 'file': + task = new FileTask(name, prereqs, action, opts); + break; + default: + task = new Task(name, prereqs, action, opts); + } + + jake.currentNamespace.addTask(task); + + if (jake.currentTaskDescription) { + task.description = jake.currentTaskDescription; + jake.currentTaskDescription = null; + } + + // FIXME: Should only need to add a new entry for the current + // task-definition, not reparse the entire structure + jake.parseAllTasks(); + + return task; + }; + + this.attemptRule = function (name, ns, level) { + let prereqRule; + let prereq; + if (level > MAX_RULE_RECURSION_LEVEL) { + return null; + } + // Check Rule + prereqRule = ns.matchRule(name); + if (prereqRule) { + prereq = prereqRule.createTask(name, level); + } + return prereq || null; + }; + + this.createPlaceholderFileTask = function (name, namespace) { + let parsed = name.split(':'); + let filePath = parsed.pop(); // Strip any namespace + let task; + + task = namespace.resolveTask(name); + + // If there's not already an existing dummy FileTask for it, + // create one + if (!task) { + // Create a dummy FileTask only if file actually exists + if (fs.existsSync(filePath)) { + task = new jake.FileTask(filePath); + task.dummy = true; + let ns; + if (parsed.length) { + ns = namespace.resolveNamespace(parsed.join(':')); + } + else { + ns = namespace; + } + if (!namespace) { + throw new Error('Invalid namespace, cannot add FileTask'); + } + ns.addTask(task); + // Put this dummy Task in the global Tasks list so + // modTime will be eval'd correctly + jake.Task[`${ns.path}:${filePath}`] = task; + } + } + + return task || null; + }; + + + this.run = function () { + let args = Array.prototype.slice.call(arguments); + let program = this.program; 
+ let loader = this.loader; + let preempt; + let opts; + + program.parseArgs(args); + program.init(); + + preempt = program.firstPreemptiveOption(); + if (preempt) { + preempt(); + } + else { + opts = program.opts; + // jakefile flag set but no jakefile yet + if (opts.autocomplete && opts.jakefile === true) { + process.stdout.write('no-complete'); + return; + } + // Load Jakefile and jakelibdir files + let jakefileLoaded = loader.loadFile(opts.jakefile); + let jakelibdirLoaded = loader.loadDirectory(opts.jakelibdir); + + if(!jakefileLoaded && !jakelibdirLoaded && !opts.autocomplete) { + fail('No Jakefile. Specify a valid path with -f/--jakefile, ' + + 'or place one in the current directory.'); + } + + program.run(); + } + }; + + })()); +} + +module.exports = jake; diff --git a/node_modules/jake/lib/loader.js b/node_modules/jake/lib/loader.js new file mode 100644 index 000000000..02ad26282 --- /dev/null +++ b/node_modules/jake/lib/loader.js @@ -0,0 +1,165 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +let path = require('path'); +let fs = require('fs'); +let existsSync = fs.existsSync; +let utils = require('./utils'); + +// Files like jakelib/foobar.jake.js +const JAKELIB_FILE_PAT = /\.jake$|\.js$/; +const SUPPORTED_EXTENSIONS = { + 'js': null, + 'coffee': function () { + try { + let cs = require('coffeescript'); + if (typeof cs.register == 'function') { + cs.register(); + } + } + catch(e) { + throw new Error('You have a CoffeeScript Jakefile, but have not installed CoffeeScript'); + } + }, + 'ls': function () { + try { + require('livescript'); + } + catch (e) { + throw new Error('You have a LiveScript Jakefile, but have not installed LiveScript'); + } + } +}; +const IMPLICIT_JAKEFILE_NAMES = [ + 'Jakefile', + 'Gulpfile' +]; + +let Loader = function () { + // Load a Jakefile, running the code inside -- this may result in + // tasks getting defined using the original Jake API, e.g., + // `task('foo' ['bar', 'baz']);`, or can also auto-create tasks + // from any functions exported from the file + function loadFile(filePath) { + let exported = require(filePath); + for (let [key, value] of Object.entries(exported)) { + let t; + if (typeof value == 'function') { + t = jake.task(key, value); + t.description = '(Exported function)'; + } + } + } + + function fileExists(name) { + let nameWithExt = null; + // Support no file extension as well + let exts = Object.keys(SUPPORTED_EXTENSIONS).concat(['']); + exts.some((ext) => { + let fname = ext ? 
`${name}.${ext}` : name; + if (existsSync(fname)) { + nameWithExt = fname; + return true; + } + }); + return nameWithExt; + } + + // Recursive + function findImplicitJakefile() { + let cwd = process.cwd(); + let names = IMPLICIT_JAKEFILE_NAMES; + let found = null; + names.some((name) => { + let n; + // Prefer all-lowercase + n = name.toLowerCase(); + if ((found = fileExists(n))) { + return found; + } + // Check mixed-case as well + n = name; + if ((found = fileExists(n))) { + return found; + } + }); + if (found) { + return found; + } + else { + process.chdir(".."); + // If we've walked all the way up the directory tree, + // bail out with no result + if (cwd === process.cwd()) { + return null; + } + return findImplicitJakefile(); + } + } + + this.loadFile = function (fileSpecified) { + let jakefile; + let origCwd = process.cwd(); + + if (fileSpecified) { + if (existsSync(fileSpecified)) { + jakefile = fileSpecified; + } + } + else { + jakefile = findImplicitJakefile(); + } + + if (jakefile) { + let ext = jakefile.split('.')[1]; + let loaderFunc = SUPPORTED_EXTENSIONS[ext]; + loaderFunc && loaderFunc(); + + loadFile(utils.file.absolutize(jakefile)); + return true; + } + else { + if (!fileSpecified) { + // Restore the working directory on failure + process.chdir(origCwd); + } + return false; + } + }; + + this.loadDirectory = function (d) { + let dirname = d || 'jakelib'; + let dirlist; + dirname = utils.file.absolutize(dirname); + if (existsSync(dirname)) { + dirlist = fs.readdirSync(dirname); + dirlist.forEach(function (filePath) { + if (JAKELIB_FILE_PAT.test(filePath)) { + loadFile(path.join(dirname, filePath)); + } + }); + return true; + } + return false; + }; + +}; + +module.exports = function () { + return new Loader(); +}; diff --git a/node_modules/jake/lib/namespace.js b/node_modules/jake/lib/namespace.js new file mode 100644 index 000000000..a3c2787a7 --- /dev/null +++ b/node_modules/jake/lib/namespace.js @@ -0,0 +1,115 @@ +const ROOT_NAMESPACE_NAME = '__rootNamespace__'; + +class Namespace { + constructor(name, parentNamespace) { + this.name = name; + this.parentNamespace = parentNamespace; + this.childNamespaces = {}; + this.tasks = {}; + this.rules = {}; + this.path = this.getPath(); + } + + get fullName() { + return this._getFullName(); + } + + addTask(task) { + this.tasks[task.name] = task; + task.namespace = this; + } + + resolveTask(name) { + if (!name) { + return; + } + + let taskPath = name.split(':'); + let taskName = taskPath.pop(); + let task; + let ns; + + // Namespaced, return either relative to current, or from root + if (taskPath.length) { + taskPath = taskPath.join(':'); + ns = this.resolveNamespace(taskPath) || + Namespace.ROOT_NAMESPACE.resolveNamespace(taskPath); + task = (ns && ns.resolveTask(taskName)); + } + // Bare task, return either local, or top-level + else { + task = this.tasks[name] || Namespace.ROOT_NAMESPACE.tasks[name]; + } + + return task || null; + } + + + resolveNamespace(relativeName) { + if (!relativeName) { + return this; + } + + let parts = relativeName.split(':'); + let ns = this; + + for (let i = 0, ii = parts.length; (ns && i < ii); i++) { + ns = ns.childNamespaces[parts[i]]; + } + + return ns || null; + } + + matchRule(relativeName) { + let parts = relativeName.split(':'); + parts.pop(); + let ns = this.resolveNamespace(parts.join(':')); + let rules = ns ? 
ns.rules : []; + let r; + let match; + + for (let p in rules) { + r = rules[p]; + if (r.match(relativeName)) { + match = r; + } + } + + return (ns && match) || + (this.parentNamespace && + this.parentNamespace.matchRule(relativeName)); + } + + getPath() { + let parts = []; + let next = this.parentNamespace; + while (next) { + parts.push(next.name); + next = next.parentNamespace; + } + parts.pop(); // Remove '__rootNamespace__' + return parts.reverse().join(':'); + } + + _getFullName() { + let path = this.path; + path = (path && path.split(':')) || []; + path.push(this.name); + return path.join(':'); + } + + isRootNamespace() { + return !this.parentNamespace; + } +} + +class RootNamespace extends Namespace { + constructor() { + super(ROOT_NAMESPACE_NAME, null); + Namespace.ROOT_NAMESPACE = this; + } +} + +module.exports.Namespace = Namespace; +module.exports.RootNamespace = RootNamespace; + diff --git a/node_modules/jake/lib/package_task.js b/node_modules/jake/lib/package_task.js new file mode 100644 index 000000000..527aca70e --- /dev/null +++ b/node_modules/jake/lib/package_task.js @@ -0,0 +1,406 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +let path = require('path'); +let fs = require('fs'); +let exec = require('child_process').exec; +let FileList = require('filelist').FileList; + +/** + @name jake + @namespace jake +*/ +/** + @name jake.PackageTask + @constructor + @description Instantiating a PackageTask creates a number of Jake + Tasks that make packaging and distributing your software easy. + + @param {String} name The name of the project + @param {String} version The current project version (will be + appended to the project-name in the package-archive + @param {Function} definition Defines the contents of the package, + and format of the package-archive. Will be executed on the instantiated + PackageTask (i.e., 'this', will be the PackageTask instance), + to set the various instance-propertiess. 
+ + @example + let t = new jake.PackageTask('rous', 'v' + version, function () { + let files = [ + 'Capfile' + , 'Jakefile' + , 'README.md' + , 'package.json' + , 'app/*' + , 'bin/*' + , 'config/*' + , 'lib/*' + , 'node_modules/*' + ]; + this.packageFiles.include(files); + this.packageFiles.exclude('node_modules/foobar'); + this.needTarGz = true; + }); + + */ +let PackageTask = function () { + let args = Array.prototype.slice.call(arguments); + let name = args.shift(); + let version = args.shift(); + let definition = args.pop(); + let prereqs = args.pop() || []; // Optional + + prereqs = [].concat(prereqs); // Accept string or list + + /** + @name jake.PackageTask#name + @public + @type {String} + @description The name of the project + */ + this.name = name; + /** + @name jake.PackageTask#version + @public + @type {String} + @description The project version-string + */ + this.version = version; + /** + @name jake.PackageTask#prereqs + @public + @type {Array} + @description Tasks to run before packaging + */ + this.prereqs = prereqs; + /** + @name jake.PackageTask#packageDir + @public + @type {String='pkg'} + @description The directory-name to use for packaging the software + */ + this.packageDir = 'pkg'; + /** + @name jake.PackageTask#packageFiles + @public + @type {jake.FileList} + @description The list of files and directories to include in the + package-archive + */ + this.packageFiles = new FileList(); + /** + @name jake.PackageTask#needTar + @public + @type {Boolean=false} + @description If set to true, uses the `tar` utility to create + a gzip .tgz archive of the package + */ + this.needTar = false; + /** + @name jake.PackageTask#needTarGz + @public + @type {Boolean=false} + @description If set to true, uses the `tar` utility to create + a gzip .tar.gz archive of the package + */ + this.needTarGz = false; + /** + @name jake.PackageTask#needTarBz2 + @public + @type {Boolean=false} + @description If set to true, uses the `tar` utility to create + a bzip2 .bz2 archive of the package + */ + this.needTarBz2 = false; + /** + @name jake.PackageTask#needJar + @public + @type {Boolean=false} + @description If set to true, uses the `jar` utility to create + a .jar archive of the package + */ + this.needJar = false; + /** + @name jake.PackageTask#needZip + @public + @type {Boolean=false} + @description If set to true, uses the `zip` utility to create + a .zip archive of the package + */ + this.needZip = false; + /** + @name jake.PackageTask#manifestFile + @public + @type {String=null} + @description Can be set to point the `jar` utility at a manifest + file to use in a .jar archive. If unset, one will be automatically + created by the `jar` utility. This path should be relative to the + root of the package directory (this.packageDir above, likely 'pkg') + */ + this.manifestFile = null; + /** + @name jake.PackageTask#tarCommand + @public + @type {String='tar'} + @description The shell-command to use for creating tar archives. + */ + this.tarCommand = 'tar'; + /** + @name jake.PackageTask#jarCommand + @public + @type {String='jar'} + @description The shell-command to use for creating jar archives. + */ + this.jarCommand = 'jar'; + /** + @name jake.PackageTask#zipCommand + @public + @type {String='zip'} + @description The shell-command to use for creating zip archives. 
+ */ + this.zipCommand = 'zip'; + /** + @name jake.PackageTask#archiveNoBaseDir + @public + @type {Boolean=false} + @description Simple option for performing the archive on the + contents of the directory instead of the directory itself + */ + this.archiveNoBaseDir = false; + /** + @name jake.PackageTask#archiveChangeDir + @public + @type {String=null} + @description Equivalent to the '-C' command for the `tar` and `jar` + commands. ("Change to this directory before adding files.") + */ + this.archiveChangeDir = null; + /** + @name jake.PackageTask#archiveContentDir + @public + @type {String=null} + @description Specifies the files and directories to include in the + package-archive. If unset, this will default to the main package + directory -- i.e., name + version. + */ + this.archiveContentDir = null; + + if (typeof definition == 'function') { + definition.call(this); + } + this.define(); +}; + +PackageTask.prototype = new (function () { + + let _compressOpts = { + Tar: { + ext: '.tgz', + flags: 'czf', + cmd: 'tar' + }, + TarGz: { + ext: '.tar.gz', + flags: 'czf', + cmd: 'tar' + }, + TarBz2: { + ext: '.tar.bz2', + flags: 'cjf', + cmd: 'tar' + }, + Jar: { + ext: '.jar', + flags: 'cf', + cmd: 'jar' + }, + Zip: { + ext: '.zip', + flags: 'qr', + cmd: 'zip' + } + }; + + this.define = function () { + let self = this; + let packageDirPath = this.packageDirPath(); + let compressTaskArr = []; + + desc('Build the package for distribution'); + task('package', self.prereqs.concat(['clobberPackage', 'buildPackage'])); + // Backward-compat alias + task('repackage', ['package']); + + task('clobberPackage', function () { + jake.rmRf(self.packageDir, {silent: true}); + }); + + desc('Remove the package'); + task('clobber', ['clobberPackage']); + + let doCommand = function (p) { + let filename = path.resolve(self.packageDir + '/' + self.packageName() + + _compressOpts[p].ext); + if (process.platform == 'win32') { + // Windows full path may have drive letter, which is going to cause + // namespace problems, so strip it. 
+ if (filename.length > 2 && filename[1] == ':') { + filename = filename.substr(2); + } + } + compressTaskArr.push(filename); + + file(filename, [packageDirPath], function () { + let cmd; + let opts = _compressOpts[p]; + // Directory to move to when doing the compression-task + // Changes in the case of zip for emulating -C option + let chdir = self.packageDir; + // Save the current dir so it's possible to pop back up + // after compressing + let currDir = process.cwd(); + let archiveChangeDir; + let archiveContentDir; + + if (self.archiveNoBaseDir) { + archiveChangeDir = self.packageName(); + archiveContentDir = '.'; + } + else { + archiveChangeDir = self.archiveChangeDir; + archiveContentDir = self.archiveContentDir; + } + + cmd = self[opts.cmd + 'Command']; + cmd += ' -' + opts.flags; + if (opts.cmd == 'jar' && self.manifestFile) { + cmd += 'm'; + } + + // The name of the archive to create -- use full path + // so compression can be performed from a different dir + // if needed + cmd += ' ' + filename; + + if (opts.cmd == 'jar' && self.manifestFile) { + cmd += ' ' + self.manifestFile; + } + + // Where to perform the compression -- -C option isn't + // supported in zip, so actually do process.chdir for this + if (archiveChangeDir) { + if (opts.cmd == 'zip') { + chdir = path.join(chdir, archiveChangeDir); + } + else { + cmd += ' -C ' + archiveChangeDir; + } + } + + // Where to get the archive content + if (archiveContentDir) { + cmd += ' ' + archiveContentDir; + } + else { + cmd += ' ' + self.packageName(); + } + + // Move into the desired dir (usually packageDir) to compress + // Return back up to the current dir after the exec + process.chdir(chdir); + + exec(cmd, function (err, stdout, stderr) { + if (err) { throw err; } + + // Return back up to the starting directory (see above, + // before exec) + process.chdir(currDir); + + complete(); + }); + }, {async: true}); + }; + + for (let p in _compressOpts) { + if (this['need' + p]) { + doCommand(p); + } + } + + task('buildPackage', compressTaskArr, function () {}); + + directory(this.packageDir); + + file(packageDirPath, this.packageFiles, function () { + jake.mkdirP(packageDirPath); + let fileList = []; + self.packageFiles.forEach(function (name) { + let f = path.join(self.packageDirPath(), name); + let fDir = path.dirname(f); + jake.mkdirP(fDir, {silent: true}); + + // Add both files and directories + fileList.push({ + from: name, + to: f + }); + }); + let _copyFile = function () { + let file = fileList.pop(); + let stat; + if (file) { + stat = fs.statSync(file.from); + // Target is a directory, just create it + if (stat.isDirectory()) { + jake.mkdirP(file.to, {silent: true}); + _copyFile(); + } + // Otherwise copy the file + else { + jake.cpR(file.from, file.to, {silent: true}); + _copyFile(); + } + } + else { + complete(); + } + }; + _copyFile(); + }, {async: true}); + + + }; + + this.packageName = function () { + if (this.version) { + return this.name + '-' + this.version; + } + else { + return this.name; + } + }; + + this.packageDirPath = function () { + return this.packageDir + '/' + this.packageName(); + }; + +})(); + +jake.PackageTask = PackageTask; +exports.PackageTask = PackageTask; + diff --git a/node_modules/jake/lib/parseargs.js b/node_modules/jake/lib/parseargs.js new file mode 100644 index 000000000..1bd24c9b4 --- /dev/null +++ b/node_modules/jake/lib/parseargs.js @@ -0,0 +1,134 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 
(the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +let parseargs = {}; +let isOpt = function (arg) { return arg.indexOf('-') === 0 }; +let removeOptPrefix = function (opt) { return opt.replace(/^--/, '').replace(/^-/, '') }; + +/** + * @constructor + * Parses a list of command-line args into a key/value object of + * options and an array of positional commands. + * @ param {Array} opts A list of options in the following format: + * [{full: 'foo', abbr: 'f'}, {full: 'bar', abbr: 'b'}]] + */ +parseargs.Parser = function (opts) { + // A key/value object of matching options parsed out of the args + this.opts = {}; + this.taskNames = null; + this.envVars = null; + + // Data structures used for parsing + this.reg = opts; + this.shortOpts = {}; + this.longOpts = {}; + + let self = this; + [].forEach.call(opts, function (item) { + self.shortOpts[item.abbr] = item; + self.longOpts[item.full] = item; + }); +}; + +parseargs.Parser.prototype = new function () { + + let _trueOrNextVal = function (argParts, args) { + if (argParts[1]) { + return argParts[1]; + } + else { + return (!args[0] || isOpt(args[0])) ? + true : args.shift(); + } + }; + + /** + * Parses an array of arguments into options and positional commands + * @param {Array} args The command-line args to parse + */ + this.parse = function (args) { + let cmds = []; + let cmd; + let envVars = {}; + let opts = {}; + let arg; + let argItem; + let argParts; + let cmdItems; + let taskNames = []; + let preempt; + + while (args.length) { + arg = args.shift(); + + if (isOpt(arg)) { + arg = removeOptPrefix(arg); + argParts = arg.split('='); + argItem = this.longOpts[argParts[0]] || this.shortOpts[argParts[0]]; + if (argItem) { + // First-encountered preemptive opt takes precedence -- no further opts + // or possibility of ambiguity, so just look for a value, or set to + // true and then bail + if (argItem.preempts) { + opts[argItem.full] = _trueOrNextVal(argParts, args); + preempt = true; + break; + } + // If the opt requires a value, see if we can get a value from the + // next arg, or infer true from no-arg -- if it's followed by another + // opt, throw an error + if (argItem.expectValue || argItem.allowValue) { + opts[argItem.full] = _trueOrNextVal(argParts, args); + if (argItem.expectValue && !opts[argItem.full]) { + throw new Error(argItem.full + ' option expects a value.'); + } + } + else { + opts[argItem.full] = true; + } + } + } + else { + cmds.unshift(arg); + } + } + + if (!preempt) { + // Parse out any env-vars and task-name + while ((cmd = cmds.pop())) { + cmdItems = cmd.split('='); + if (cmdItems.length > 1) { + envVars[cmdItems[0]] = cmdItems[1]; + } + else { + taskNames.push(cmd); + } + } + + } + + return { + opts: opts, + envVars: envVars, + taskNames: taskNames + }; + }; + +}; + +module.exports = parseargs; diff --git a/node_modules/jake/lib/program.js b/node_modules/jake/lib/program.js new file mode 100644 index 000000000..121632fb1 --- /dev/null +++ b/node_modules/jake/lib/program.js @@ -0,0 +1,282 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse 
(mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +let fs = require('fs'); +let parseargs = require('./parseargs'); +let utils = require('./utils'); +let Program; +let usage = require('fs').readFileSync(`${__dirname}/../usage.txt`).toString(); +let { Task } = require('./task/task'); + +function die(msg) { + console.log(msg); + process.stdout.write('', function () { + process.stderr.write('', function () { + process.exit(); + }); + }); +} + +let preempts = { + version: function () { + die(jake.version); + }, + help: function () { + die(usage); + } +}; + +let AVAILABLE_OPTS = [ + { full: 'jakefile', + abbr: 'f', + expectValue: true + }, + { full: 'quiet', + abbr: 'q', + expectValue: false + }, + { full: 'directory', + abbr: 'C', + expectValue: true + }, + { full: 'always-make', + abbr: 'B', + expectValue: false + }, + { full: 'tasks', + abbr: 'T', + expectValue: false, + allowValue: true + }, + // Alias t + { full: 'tasks', + abbr: 't', + expectValue: false, + allowValue: true + }, + // Alias ls + { full: 'tasks', + abbr: 'ls', + expectValue: false, + allowValue: true + }, + { full: 'help', + abbr: 'h', + }, + { full: 'version', + abbr: 'V', + }, + // Alias lowercase v + { full: 'version', + abbr: 'v', + }, + { full: 'jakelibdir', + abbr: 'J', + expectValue: true + }, + { full: 'allow-rejection', + abbr: 'ar', + expectValue: false + } +]; + +Program = function () { + this.availableOpts = AVAILABLE_OPTS; + this.opts = {}; + this.taskNames = null; + this.taskArgs = null; + this.envVars = null; + this.die = die; +}; + +Program.prototype = new (function () { + + this.handleErr = function (err) { + if (jake.listeners('error').length !== 0) { + jake.emit('error', err); + return; + } + + if (jake.listeners('error').length) { + jake.emit('error', err); + return; + } + + utils.logger.error('jake aborted.'); + if (err.stack) { + utils.logger.error(err.stack); + } + else { + utils.logger.error(err.message); + } + + process.stdout.write('', function () { + process.stderr.write('', function () { + jake.errorCode = jake.errorCode || 1; + process.exit(jake.errorCode); + }); + }); + }; + + this.parseArgs = function (args) { + let result = (new parseargs.Parser(this.availableOpts)).parse(args); + this.setOpts(result.opts); + this.setTaskNames(result.taskNames); + this.setEnvVars(result.envVars); + }; + + this.setOpts = function (options) { + let opts = options || {}; + Object.assign(this.opts, opts); + }; + + this.internalOpts = function (options) { + this.availableOpts = this.availableOpts.concat(options); + }; + + this.autocompletions = function (cur) { + let p; let i; let task; + let commonPrefix = ''; + let matches = []; + + for (p in jake.Task) { + task = jake.Task[p]; + if ( + 'fullName' in task + && ( + // if empty string, program converts to true + cur === true || + task.fullName.indexOf(cur) === 0 + ) + ) { + if (matches.length === 0) { + commonPrefix = task.fullName; + } + else { + for (i = commonPrefix.length; i > -1; --i) { + commonPrefix = commonPrefix.substr(0, 
i); + if (task.fullName.indexOf(commonPrefix) === 0) { + break; + } + } + } + matches.push(task.fullName); + } + } + + if (matches.length > 1 && commonPrefix === cur) { + matches.unshift('yes-space'); + } + else { + matches.unshift('no-space'); + } + + process.stdout.write(matches.join(' ')); + }; + + this.setTaskNames = function (names) { + if (names && !Array.isArray(names)) { + throw new Error('Task names must be an array'); + } + this.taskNames = (names && names.length) ? names : ['default']; + }; + + this.setEnvVars = function (vars) { + this.envVars = vars || null; + }; + + this.firstPreemptiveOption = function () { + let opts = this.opts; + for (let p in opts) { + if (preempts[p]) { + return preempts[p]; + } + } + return false; + }; + + this.init = function (configuration) { + let self = this; + let config = configuration || {}; + if (config.options) { + this.setOpts(config.options); + } + if (config.taskNames) { + this.setTaskNames(config.taskNames); + } + if (config.envVars) { + this.setEnvVars(config.envVars); + } + process.addListener('uncaughtException', function (err) { + self.handleErr(err); + }); + if (!this.opts['allow-rejection']) { + process.addListener('unhandledRejection', (reason, promise) => { + utils.logger.error('Unhandled rejection at:', promise, 'reason:', reason); + self.handleErr(reason); + }); + } + if (this.envVars) { + Object.assign(process.env, this.envVars); + } + }; + + this.run = function () { + let rootTask; + let taskNames; + let dirname; + let opts = this.opts; + + if (opts.autocomplete) { + return this.autocompletions(opts['autocomplete-cur'], opts['autocomplete-prev']); + } + // Run with `jake -T`, just show descriptions + if (opts.tasks) { + return jake.showAllTaskDescriptions(opts.tasks); + } + + taskNames = this.taskNames; + if (!(Array.isArray(taskNames) && taskNames.length)) { + throw new Error('Please pass jake.runTasks an array of task-names'); + } + + // Set working dir + dirname = opts.directory; + if (dirname) { + if (fs.existsSync(dirname) && + fs.statSync(dirname).isDirectory()) { + process.chdir(dirname); + } + else { + throw new Error(dirname + ' is not a valid directory path'); + } + } + + rootTask = task(Task.ROOT_TASK_NAME, taskNames, function () {}); + rootTask._internal = true; + + rootTask.once('complete', function () { + jake.emit('complete'); + }); + jake.emit('start'); + rootTask.invoke(); + }; + +})(); + +module.exports.Program = Program; diff --git a/node_modules/jake/lib/publish_task.js b/node_modules/jake/lib/publish_task.js new file mode 100644 index 000000000..f0cacfda0 --- /dev/null +++ b/node_modules/jake/lib/publish_task.js @@ -0,0 +1,290 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * +*/ + +let fs = require('fs'); +let path = require('path'); +let exec = require('child_process').execSync; +let FileList = require('filelist').FileList; + +let PublishTask = function () { + let args = Array.prototype.slice.call(arguments).filter(function (item) { + return typeof item != 'undefined'; + }); + let arg; + let opts = {}; + let definition; + let prereqs = []; + let createDef = function (arg) { + return function () { + this.packageFiles.include(arg); + }; + }; + + this.name = args.shift(); + + // Old API, just name + list of files + if (args.length == 1 && (Array.isArray(args[0]) || typeof args[0] == 'string')) { + definition = createDef(args.pop()); + } + // Current API, name + [prereqs] + [opts] + definition + else { + while ((arg = args.pop())) { + // Definition func + if (typeof arg == 'function') { + definition = arg; + } + // Prereqs + else if (Array.isArray(arg) || typeof arg == 'string') { + prereqs = arg; + } + // Opts + else { + opts = arg; + } + } + } + + this.prereqs = prereqs; + this.packageFiles = new FileList(); + this.publishCmd = opts.publishCmd || 'npm publish %filename'; + this.publishMessage = opts.publishMessage || 'BOOM! Published.'; + this.gitCmd = opts.gitCmd || 'git'; + this.versionFiles = opts.versionFiles || ['package.json']; + this.scheduleDelay = 5000; + + // Override utility funcs for testing + this._ensureRepoClean = function (stdout) { + if (stdout.length) { + fail(new Error('Git repository is not clean.')); + } + }; + this._getCurrentBranch = function (stdout) { + return String(stdout).trim(); + }; + + if (typeof definition == 'function') { + definition.call(this); + } + this.define(); +}; + + +PublishTask.prototype = new (function () { + + let _currentBranch = null; + + let getPackage = function () { + let pkg = JSON.parse(fs.readFileSync(path.join(process.cwd(), + '/package.json')).toString()); + return pkg; + }; + let getPackageVersionNumber = function () { + return getPackage().version; + }; + + this.define = function () { + let self = this; + + namespace('publish', function () { + task('fetchTags', function () { + // Make sure local tags are up to date + exec(self.gitCmd + ' fetch --tags'); + console.log('Fetched remote tags.'); + }); + + task('getCurrentBranch', function () { + // Figure out what branch to push to + let stdout = exec(self.gitCmd + ' symbolic-ref --short HEAD').toString(); + if (!stdout) { + throw new Error('No current Git branch found'); + } + _currentBranch = self._getCurrentBranch(stdout); + console.log('On branch ' + _currentBranch); + }); + + task('ensureClean', function () { + // Only bump, push, and tag if the Git repo is clean + let stdout = exec(self.gitCmd + ' status --porcelain --untracked-files=no').toString(); + // Throw if there's output + self._ensureRepoClean(stdout); + }); + + task('updateVersionFiles', function () { + let pkg; + let version; + let arr; + let patch; + + // Grab the current version-string + pkg = getPackage(); + version = pkg.version; + // Increment the patch-number for the version + arr = version.split('.'); + patch = parseInt(arr.pop(), 10) + 1; + arr.push(patch); + version = arr.join('.'); + + // Update package.json or other files with the new version-info + self.versionFiles.forEach(function (file) { + let p = path.join(process.cwd(), file); + let data = JSON.parse(fs.readFileSync(p).toString()); + data.version = version; + fs.writeFileSync(p, JSON.stringify(data, true, 2) + '\n'); + }); + // Return the version string so that listeners for the 'complete' event + // for this task can 
use it (e.g., to update other files before pushing + // to Git) + return version; + }); + + task('pushVersion', ['ensureClean', 'updateVersionFiles'], function () { + let version = getPackageVersionNumber(); + let message = 'Version ' + version; + let cmds = [ + self.gitCmd + ' commit -a -m "' + message + '"', + self.gitCmd + ' push origin ' + _currentBranch, + self.gitCmd + ' tag -a v' + version + ' -m "' + message + '"', + self.gitCmd + ' push --tags' + ]; + cmds.forEach((cmd) => { + exec(cmd); + }); + version = getPackageVersionNumber(); + console.log('Bumped version number to v' + version + '.'); + }); + + let defineTask = task('definePackage', function () { + let version = getPackageVersionNumber(); + new jake.PackageTask(self.name, 'v' + version, self.prereqs, function () { + // Replace the PackageTask's FileList with the PublishTask's FileList + this.packageFiles = self.packageFiles; + this.needTarGz = true; // Default to tar.gz + // If any of the need or archive opts are set + // proxy them to the PackageTask + for (let p in this) { + if (p.indexOf('need') === 0 || p.indexOf('archive') === 0) { + if (typeof self[p] != 'undefined') { + this[p] = self[p]; + } + } + } + }); + }); + defineTask._internal = true; + + task('package', function () { + let definePack = jake.Task['publish:definePackage']; + let pack = jake.Task['package']; + let version = getPackageVersionNumber(); + + // May have already been run + if (definePack.taskStatus == jake.Task.runStatuses.DONE) { + definePack.reenable(true); + } + definePack.invoke(); + // Set manually, completion happens in next tick, creating deadlock + definePack.taskStatus = jake.Task.runStatuses.DONE; + pack.invoke(); + console.log('Created package for ' + self.name + ' v' + version); + }); + + task('publish', function () { + return new Promise((resolve) => { + let version = getPackageVersionNumber(); + let filename; + let cmd; + + console.log('Publishing ' + self.name + ' v' + version); + + if (typeof self.createPublishCommand == 'function') { + cmd = self.createPublishCommand(version); + } + else { + filename = './pkg/' + self.name + '-v' + version + '.tar.gz'; + cmd = self.publishCmd.replace(/%filename/gi, filename); + } + + if (typeof cmd == 'function') { + cmd(function (err) { + if (err) { + throw err; + } + console.log(self.publishMessage); + resolve(); + }); + } + else { + // Hackity hack -- NPM publish sometimes returns errror like: + // Error sending version data\nnpm ERR! 
+ // Error: forbidden 0.2.4 is modified, should match modified time + setTimeout(function () { + let stdout = exec(cmd).toString() || ''; + stdout = stdout.trim(); + if (stdout) { + console.log(stdout); + } + console.log(self.publishMessage); + resolve(); + }, self.scheduleDelay); + } + }); + }); + + task('cleanup', function () { + return new Promise((resolve) => { + let clobber = jake.Task.clobber; + clobber.reenable(true); + clobber.on('complete', function () { + console.log('Cleaned up package'); + resolve(); + }); + clobber.invoke(); + }); + }); + + }); + + let prefixNs = function (item) { + return 'publish:' + item; + }; + + // Create aliases in the default namespace + desc('Create a new version and release.'); + task('publish', self.prereqs.concat(['version', 'release'] + .map(prefixNs))); + + desc('Release the existing version.'); + task('publishExisting', self.prereqs.concat(['release'] + .map(prefixNs))); + + task('version', ['fetchTags', 'getCurrentBranch', 'pushVersion'] + .map(prefixNs)); + + task('release', ['package', 'publish', 'cleanup'] + .map(prefixNs)); + + // Invoke proactively so there will be a callable 'package' task + // which can be used apart from 'publish' + jake.Task['publish:definePackage'].invoke(); + }; + +})(); + +jake.PublishTask = PublishTask; +exports.PublishTask = PublishTask; + diff --git a/node_modules/jake/lib/rule.js b/node_modules/jake/lib/rule.js new file mode 100644 index 000000000..25f51ae4b --- /dev/null +++ b/node_modules/jake/lib/rule.js @@ -0,0 +1,311 @@ +let path = require('path'); +let fs = require('fs'); +let Task = require('./task/task').Task; + +// Split a task to two parts, name space and task name. +// For example, given 'foo:bin/a%.c', return an object with +// - 'ns' : foo +// - 'name' : bin/a%.c +function splitNs(task) { + let parts = task.split(':'); + let name = parts.pop(); + let ns = resolveNs(parts); + return { + 'name' : name, + 'ns' : ns + }; +} + +// Return the namespace based on an array of names. +// For example, given ['foo', 'baz' ], return the namespace +// +// default -> foo -> baz +// +// where default is the global root namespace +// and -> means child namespace. +function resolveNs(parts) { + let ns = jake.defaultNamespace; + for(let i = 0, l = parts.length; ns && i < l; i++) { + ns = ns.childNamespaces[parts[i]]; + } + return ns; +} + +// Given a pattern p, say 'foo:bin/a%.c' +// Return an object with +// - 'ns' : foo +// - 'dir' : bin +// - 'prefix' : a +// - 'suffix' : .c +function resolve(p) { + let task = splitNs(p); + let name = task.name; + let ns = task.ns; + let split = path.basename(name).split('%'); + return { + ns: ns, + dir: path.dirname(name), + prefix: split[0], + suffix: split[1] + }; +} + +// Test whether string a is a suffix of string b +function stringEndWith(a, b) { + let l; + return (l = b.lastIndexOf(a)) == -1 ? false : l + a.length == b.length; +} + +// Replace the suffix a of the string s with b. +// Note that, it is assumed a is a suffix of s. 
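+// e.g. (illustrative): stringReplaceSuffix('main.o', '.o', '.c') returns 'main.c'.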
+function stringReplaceSuffix(s, a, b) { + return s.slice(0, s.lastIndexOf(a)) + b; +} + +class Rule { + constructor(opts) { + this.pattern = opts.pattern; + this.source = opts.source; + this.prereqs = opts.prereqs; + this.action = opts.action; + this.opts = opts.opts; + this.desc = opts.desc; + this.ns = opts.ns; + } + + // Create a file task based on this rule for the specified + // task-name + // ====== + // FIXME: Right now this just throws away any passed-in args + // for the synthsized task (taskArgs param) + // ====== + createTask(fullName, level) { + let self = this; + let pattern; + let source; + let action; + let opts; + let prereqs; + let valid; + let src; + let tNs; + let createdTask; + let name = Task.getBaseTaskName(fullName); + let nsPath = Task.getBaseNamespacePath(fullName); + let ns = this.ns.resolveNamespace(nsPath); + + pattern = this.pattern; + source = this.source; + + if (typeof source == 'string') { + src = Rule.getSource(name, pattern, source); + } + else { + src = source(name); + } + + // TODO: Write a utility function that appends a + // taskname to a namespace path + src = nsPath.split(':').filter(function (item) { + return !!item; + }).concat(src).join(':'); + + // Generate the prerequisite for the matching task. + // It is the original prerequisites plus the prerequisite + // representing source file, i.e., + // + // rule( '%.o', '%.c', ['some.h'] ... + // + // If the objective is main.o, then new task should be + // + // file( 'main.o', ['main.c', 'some.h' ] ... + prereqs = this.prereqs.slice(); // Get a copy to work with + prereqs.unshift(src); + + // Prereq should be: + // 1. an existing task + // 2. an existing file on disk + // 3. a valid rule (i.e., not at too deep a level) + valid = prereqs.some(function (p) { + let ns = self.ns; + return ns.resolveTask(p) || + fs.existsSync(Task.getBaseTaskName(p)) || + jake.attemptRule(p, ns, level + 1); + }); + + // If any of the prereqs aren't valid, the rule isn't valid + if (!valid) { + return null; + } + // Otherwise, hunky-dory, finish creating the task for the rule + else { + // Create the action for the task + action = function () { + let task = this; + self.action.apply(task); + }; + + opts = this.opts; + + // Insert the file task into Jake + // + // Since createTask function stores the task as a child task + // of currentNamespace. Here we temporariliy switch the namespace. + // FIXME: Should allow optional ns passed in instead of this hack + tNs = jake.currentNamespace; + jake.currentNamespace = ns; + createdTask = jake.createTask('file', name, prereqs, action, opts); + createdTask.source = src.split(':').pop(); + jake.currentNamespace = tNs; + + return createdTask; + } + } + + match(name) { + return Rule.match(this.pattern, name); + } + + // Test wether the a prerequisite matchs the pattern. + // The arg 'pattern' does not have namespace as prefix. + // For example, the following tests are true + // + // pattern | name + // bin/%.o | bin/main.o + // bin/%.o | foo:bin/main.o + // + // The following tests are false (trivally) + // + // pattern | name + // bin/%.o | foobin/main.o + // bin/%.o | bin/main.oo + static match(pattern, name) { + let p; + let task; + let obj; + let filename; + + if (pattern instanceof RegExp) { + return pattern.test(name); + } + else if (pattern.indexOf('%') == -1) { + // No Pattern. No Folder. No Namespace. + // A Simple Suffix Rule. 
Just test suffix + return stringEndWith(pattern, name); + } + else { + // Resolve the dir, prefix and suffix of pattern + p = resolve(pattern); + + // Resolve the namespace and task-name + task = splitNs(name); + name = task.name; + + // Set the objective as the task-name + obj = name; + + // Namespace is already matched. + + // Check dir + if (path.dirname(obj) != p.dir) { + return false; + } + + filename = path.basename(obj); + + // Check file name length + if ((p.prefix.length + p.suffix.length + 1) > filename.length) { + // Length does not match. + return false; + } + + // Check prefix + if (filename.indexOf(p.prefix) !== 0) { + return false; + } + + // Check suffix + if (!stringEndWith(p.suffix, filename)) { + return false; + } + + // OK. Find a match. + return true; + } + } + + // Generate the source based on + // - name name for the synthesized task + // - pattern pattern for the objective + // - source pattern for the source + // + // Return the source with properties + // - dep the prerequisite of source + // (with the namespace) + // + // - file the file name of source + // (without the namespace) + // + // For example, given + // + // - name foo:bin/main.o + // - pattern bin/%.o + // - source src/%.c + // + // return 'foo:src/main.c', + // + static getSource(name, pattern, source) { + let dep; + let pat; + let match; + let file; + let src; + + // Regex pattern -- use to look up the extension + if (pattern instanceof RegExp) { + match = pattern.exec(name); + if (match) { + if (typeof source == 'function') { + src = source(name); + } + else { + src = stringReplaceSuffix(name, match[0], source); + } + } + } + // Assume string + else { + // Simple string suffix replacement + if (pattern.indexOf('%') == -1) { + if (typeof source == 'function') { + src = source(name); + } + else { + src = stringReplaceSuffix(name, pattern, source); + } + } + // Percent-based substitution + else { + pat = pattern.replace('%', '(.*?)'); + pat = new RegExp(pat); + match = pat.exec(name); + if (match) { + if (typeof source == 'function') { + src = source(name); + } + else { + file = match[1]; + file = source.replace('%', file); + dep = match[0]; + src = name.replace(dep, file); + } + } + } + } + + return src; + } +} + + +exports.Rule = Rule; diff --git a/node_modules/jake/lib/task/directory_task.js b/node_modules/jake/lib/task/directory_task.js new file mode 100644 index 000000000..b17b62466 --- /dev/null +++ b/node_modules/jake/lib/task/directory_task.js @@ -0,0 +1,30 @@ +let fs = require('fs'); +let FileTask = require('./file_task').FileTask; + +/** + @name jake + @namespace jake +*/ +/** + @name jake.DirectoryTask + @constructor + @augments EventEmitter + @augments jake.Task + @augments jake.FileTask + @description A Jake DirectoryTask + + @param {String} name The name of the directory to create. 
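+ @example
+ // illustrative usage -- a DirectoryTask is normally defined via the global directory() helper in a Jakefile:
+ directory('build/tmp');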
+ */ +class DirectoryTask extends FileTask { + constructor(...args) { + super(...args); + if (fs.existsSync(this.name)) { + this.updateModTime(); + } + else { + this.modTime = null; + } + } +} + +exports.DirectoryTask = DirectoryTask; diff --git a/node_modules/jake/lib/task/file_task.js b/node_modules/jake/lib/task/file_task.js new file mode 100644 index 000000000..6fad84ba5 --- /dev/null +++ b/node_modules/jake/lib/task/file_task.js @@ -0,0 +1,124 @@ +let fs = require('fs'); +let Task = require('./task').Task; + +function isFileOrDirectory(t) { + return (t instanceof FileTask || + t instanceof DirectoryTask); +} + +function isFile(t) { + return (t instanceof FileTask && !(t instanceof DirectoryTask)); +} + +/** + @name jake + @namespace jake +*/ +/** + @name jake.FileTask + @class` + @extentds Task + @description A Jake FileTask + + @param {String} name The name of the Task + @param {Array} [prereqs] Prerequisites to be run before this task + @param {Function} [action] The action to perform to create this file + @param {Object} [opts] + @param {Array} [opts.asyc=false] Perform this task asynchronously. + If you flag a task with this option, you must call the global + `complete` method inside the task's action, for execution to proceed + to the next task. + */ +class FileTask extends Task { + constructor(...args) { + super(...args); + this.dummy = false; + if (fs.existsSync(this.name)) { + this.updateModTime(); + } + else { + this.modTime = null; + } + } + + isNeeded() { + let prereqs = this.prereqs; + let prereqName; + let prereqTask; + + // No repeatsies + if (this.taskStatus == Task.runStatuses.DONE) { + return false; + } + // The always-make override + else if (jake.program.opts['always-make']) { + return true; + } + // Default case + else { + + // We need either an existing file, or an action to create one. + // First try grabbing the actual mod-time of the file + try { + this.updateModTime(); + } + // Then fall back to looking for an action + catch(e) { + if (typeof this.action == 'function') { + return true; + } + else { + throw new Error('File-task ' + this.fullName + ' has no ' + + 'existing file, and no action to create one.'); + } + } + + // Compare mod-time of all the prereqs with its mod-time + // If any prereqs are newer, need to run the action to update + if (prereqs && prereqs.length) { + for (let i = 0, ii = prereqs.length; i < ii; i++) { + prereqName = prereqs[i]; + prereqTask = this.namespace.resolveTask(prereqName) || + jake.createPlaceholderFileTask(prereqName, this.namespace); + // Run the action if: + // 1. The prereq is a normal task (not file/dir) + // 2. 
The prereq is a file-task with a mod-date more recent than + // the one for this file/dir + if (prereqTask) { + if (!isFileOrDirectory(prereqTask) || + (isFile(prereqTask) && prereqTask.modTime > this.modTime)) { + return true; + } + } + } + } + // File/dir has no prereqs, and exists -- no need to run + else { + // Effectively done + this.taskStatus = Task.runStatuses.DONE; + return false; + } + } + } + + updateModTime() { + let stats = fs.statSync(this.name); + this.modTime = stats.mtime; + } + + complete() { + if (!this.dummy) { + this.updateModTime(); + } + // Hackity hack + Task.prototype.complete.apply(this, arguments); + } + +} + +exports.FileTask = FileTask; + +// DirectoryTask is a subclass of FileTask, depends on it +// being defined +let DirectoryTask = require('./directory_task').DirectoryTask; + diff --git a/node_modules/jake/lib/task/index.js b/node_modules/jake/lib/task/index.js new file mode 100644 index 000000000..bc93f4147 --- /dev/null +++ b/node_modules/jake/lib/task/index.js @@ -0,0 +1,9 @@ + +let Task = require('./task').Task; +let FileTask = require('./file_task').FileTask; +let DirectoryTask = require('./directory_task').DirectoryTask; + +exports.Task = Task; +exports.FileTask = FileTask; +exports.DirectoryTask = DirectoryTask; + diff --git a/node_modules/jake/lib/task/task.js b/node_modules/jake/lib/task/task.js new file mode 100644 index 000000000..9e8886fa0 --- /dev/null +++ b/node_modules/jake/lib/task/task.js @@ -0,0 +1,439 @@ +let EventEmitter = require('events').EventEmitter; +let async = require('async'); +let chalk = require('chalk'); +// 'rule' module is required at the bottom because circular deps + +// Used for task value, so better not to use +// null, since value should be unset/uninitialized +let UNDEFINED_VALUE; + +const ROOT_TASK_NAME = '__rootTask__'; +const POLLING_INTERVAL = 100; + +// Parse any positional args attached to the task-name +function parsePrereqName(name) { + let taskArr = name.split('['); + let taskName = taskArr[0]; + let taskArgs = []; + if (taskArr[1]) { + taskArgs = taskArr[1].replace(/\]$/, ''); + taskArgs = taskArgs.split(','); + } + return { + name: taskName, + args: taskArgs + }; +} + +/** + @name jake.Task + @class + @extends EventEmitter + @description A Jake Task + + @param {String} name The name of the Task + @param {Array} [prereqs] Prerequisites to be run before this task + @param {Function} [action] The action to perform for this task + @param {Object} [opts] + @param {Array} [opts.asyc=false] Perform this task asynchronously. + If you flag a task with this option, you must call the global + `complete` method inside the task's action, for execution to proceed + to the next task. + */ +class Task extends EventEmitter { + + constructor(name, prereqs, action, options) { + // EventEmitter ctor takes no args + super(); + + if (name.indexOf(':') > -1) { + throw new Error('Task name cannot include a colon. 
It is used internally as namespace delimiter.'); + } + let opts = options || {}; + + this._currentPrereqIndex = 0; + this._internal = false; + this._skipped = false; + + this.name = name; + this.prereqs = prereqs; + this.action = action; + this.async = false; + this.taskStatus = Task.runStatuses.UNSTARTED; + this.description = null; + this.args = []; + this.value = UNDEFINED_VALUE; + this.concurrency = 1; + this.startTime = null; + this.endTime = null; + this.directory = null; + this.namespace = null; + + // Support legacy async-flag -- if not explicitly passed or falsy, will + // be set to empty-object + if (typeof opts == 'boolean' && opts === true) { + this.async = true; + } + else { + if (opts.async) { + this.async = true; + } + if (opts.concurrency) { + this.concurrency = opts.concurrency; + } + } + + //Do a test on self dependencies for this task + if(Array.isArray(this.prereqs) && this.prereqs.indexOf(this.name) !== -1) { + throw new Error("Cannot use prereq " + this.name + " as a dependency of itself"); + } + } + + get fullName() { + return this._getFullName(); + } + + _initInvocationChain() { + // Legacy global invocation chain + jake._invocationChain.push(this); + + // New root chain + if (!this._invocationChain) { + this._invocationChainRoot = true; + this._invocationChain = []; + if (jake.currentRunningTask) { + jake.currentRunningTask._waitForChains = jake.currentRunningTask._waitForChains || []; + jake.currentRunningTask._waitForChains.push(this._invocationChain); + } + } + } + + /** + @name jake.Task#invoke + @function + @description Runs prerequisites, then this task. If the task has already + been run, will not run the task again. + */ + invoke() { + this._initInvocationChain(); + + this.args = Array.prototype.slice.call(arguments); + this.reenabled = false + this.runPrereqs(); + } + + /** + @name jake.Task#execute + @function + @description Run only this task, without prereqs. If the task has already + been run, *will* run the task again. + */ + execute() { + this._initInvocationChain(); + + this.args = Array.prototype.slice.call(arguments); + this.reenable(); + this.reenabled = true + this.run(); + } + + runPrereqs() { + if (this.prereqs && this.prereqs.length) { + + if (this.concurrency > 1) { + async.eachLimit(this.prereqs, this.concurrency, + + (name, cb) => { + let parsed = parsePrereqName(name); + + let prereq = this.namespace.resolveTask(parsed.name) || + jake.attemptRule(name, this.namespace, 0) || + jake.createPlaceholderFileTask(name, this.namespace); + + if (!prereq) { + throw new Error('Unknown task "' + name + '"'); + } + + //Test for circular invocation + if(prereq === this) { + setImmediate(function () { + cb(new Error("Cannot use prereq " + prereq.name + " as a dependency of itself")); + }); + } + + if (prereq.taskStatus == Task.runStatuses.DONE) { + //prereq already done, return + setImmediate(cb); + } + else { + //wait for complete before calling cb + prereq.once('_done', () => { + prereq.removeAllListeners('_done'); + setImmediate(cb); + }); + // Start the prereq if we are the first to encounter it + if (prereq.taskStatus === Task.runStatuses.UNSTARTED) { + prereq.taskStatus = Task.runStatuses.STARTED; + prereq.invoke.apply(prereq, parsed.args); + } + } + }, + + (err) => { + //async callback is called after all prereqs have run. 
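+ //Any error passed by the iterator above (e.g. a circular-prereq error) is re-thrown; otherwise the task itself is scheduled to run.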
+ if (err) { + throw err; + } + else { + setImmediate(this.run.bind(this)); + } + } + ); + } + else { + setImmediate(this.nextPrereq.bind(this)); + } + } + else { + setImmediate(this.run.bind(this)); + } + } + + nextPrereq() { + let self = this; + let index = this._currentPrereqIndex; + let name = this.prereqs[index]; + let prereq; + let parsed; + + if (name) { + + parsed = parsePrereqName(name); + + prereq = this.namespace.resolveTask(parsed.name) || + jake.attemptRule(name, this.namespace, 0) || + jake.createPlaceholderFileTask(name, this.namespace); + + if (!prereq) { + throw new Error('Unknown task "' + name + '"'); + } + + // Do when done + if (prereq.taskStatus == Task.runStatuses.DONE) { + self.handlePrereqDone(prereq); + } + else { + prereq.once('_done', () => { + this.handlePrereqDone(prereq); + prereq.removeAllListeners('_done'); + }); + if (prereq.taskStatus == Task.runStatuses.UNSTARTED) { + prereq.taskStatus = Task.runStatuses.STARTED; + prereq._invocationChain = this._invocationChain; + prereq.invoke.apply(prereq, parsed.args); + } + } + } + } + + /** + @name jake.Task#reenable + @function + @description Reenables a task so that it can be run again. + */ + reenable(deep) { + let prereqs; + let prereq; + this._skipped = false; + this.taskStatus = Task.runStatuses.UNSTARTED; + this.value = UNDEFINED_VALUE; + if (deep && this.prereqs) { + prereqs = this.prereqs; + for (let i = 0, ii = prereqs.length; i < ii; i++) { + prereq = jake.Task[prereqs[i]]; + if (prereq) { + prereq.reenable(deep); + } + } + } + } + + handlePrereqDone(prereq) { + this._currentPrereqIndex++; + if (this._currentPrereqIndex < this.prereqs.length) { + setImmediate(this.nextPrereq.bind(this)); + } + else { + setImmediate(this.run.bind(this)); + } + } + + isNeeded() { + let needed = true; + if (this.taskStatus == Task.runStatuses.DONE) { + needed = false; + } + return needed; + } + + run() { + let val, previous; + let hasAction = typeof this.action == 'function'; + + if (!this.isNeeded()) { + this.emit('skip'); + this.emit('_done'); + } + else { + if (this._invocationChain.length) { + previous = this._invocationChain[this._invocationChain.length - 1]; + // If this task is repeating and its previous is equal to this, don't check its status because it was set to UNSTARTED by the reenable() method + if (!(this.reenabled && previous == this)) { + if (previous.taskStatus != Task.runStatuses.DONE) { + let now = (new Date()).getTime(); + if (now - this.startTime > jake._taskTimeout) { + return jake.fail(`Timed out waiting for task: ${previous.name} with status of ${previous.taskStatus}`); + } + setTimeout(this.run.bind(this), POLLING_INTERVAL); + return; + } + } + } + if (!(this.reenabled && previous == this)) { + this._invocationChain.push(this); + } + + if (!(this._internal || jake.program.opts.quiet)) { + console.log("Starting '" + chalk.green(this.fullName) + "'..."); + } + + this.startTime = (new Date()).getTime(); + this.emit('start'); + + jake.currentRunningTask = this; + + if (hasAction) { + try { + if (this.directory) { + process.chdir(this.directory); + } + + val = this.action.apply(this, this.args); + + if (typeof val == 'object' && typeof val.then == 'function') { + this.async = true; + + val.then( + (result) => { + setImmediate(() => { + this.complete(result); + }); + }, + (err) => { + setImmediate(() => { + this.errorOut(err); + }); + }); + } + } + catch (err) { + this.errorOut(err); + return; // Bail out, not complete + } + } + + if (!(hasAction && this.async)) { + setImmediate(() => { + 
this.complete(val); + }); + } + } + } + + errorOut(err) { + this.taskStatus = Task.runStatuses.ERROR; + this._invocationChain.chainStatus = Task.runStatuses.ERROR; + this.emit('error', err); + } + + complete(val) { + + if (Array.isArray(this._waitForChains)) { + let stillWaiting = this._waitForChains.some((chain) => { + return !(chain.chainStatus == Task.runStatuses.DONE || + chain.chainStatus == Task.runStatuses.ERROR); + }); + if (stillWaiting) { + let now = (new Date()).getTime(); + let elapsed = now - this.startTime; + if (elapsed > jake._taskTimeout) { + return jake.fail(`Timed out waiting for task: ${this.name} with status of ${this.taskStatus}. Elapsed: ${elapsed}`); + } + setTimeout(() => { + this.complete(val); + }, POLLING_INTERVAL); + return; + } + } + + jake._invocationChain.splice(jake._invocationChain.indexOf(this), 1); + + if (this._invocationChainRoot) { + this._invocationChain.chainStatus = Task.runStatuses.DONE; + } + + this._currentPrereqIndex = 0; + + // If 'complete' getting called because task has been + // run already, value will not be passed -- leave in place + if (!this._skipped) { + this.taskStatus = Task.runStatuses.DONE; + this.value = val; + + this.emit('complete', this.value); + this.emit('_done'); + + this.endTime = (new Date()).getTime(); + let taskTime = this.endTime - this.startTime; + + if (!(this._internal || jake.program.opts.quiet)) { + console.log("Finished '" + chalk.green(this.fullName) + "' after " + chalk.magenta(taskTime + ' ms')); + } + + } + } + + _getFullName() { + let ns = this.namespace; + let path = (ns && ns.path) || ''; + path = (path && path.split(':')) || []; + if (this.namespace !== jake.defaultNamespace) { + path.push(this.namespace.name); + } + path.push(this.name); + return path.join(':'); + } + + static getBaseNamespacePath(fullName) { + return fullName.split(':').slice(0, -1).join(':'); + } + + static getBaseTaskName(fullName) { + return fullName.split(':').pop(); + } +} + +Task.runStatuses = { + UNSTARTED: 'unstarted', + DONE: 'done', + STARTED: 'started', + ERROR: 'error' +}; + +Task.ROOT_TASK_NAME = ROOT_TASK_NAME; + +exports.Task = Task; + +// Required here because circular deps +require('../rule'); + diff --git a/node_modules/jake/lib/test_task.js b/node_modules/jake/lib/test_task.js new file mode 100644 index 000000000..6482bf14d --- /dev/null +++ b/node_modules/jake/lib/test_task.js @@ -0,0 +1,270 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +let path = require('path'); +let currDir = process.cwd(); + +/** + @name jake + @namespace jake +*/ +/** + @name jake.TestTask + @constructor + @description Instantiating a TestTask creates a number of Jake + Tasks that make running tests for your software easy. + + @param {String} name The name of the project + @param {Function} definition Defines the list of files containing the tests, + and the name of the namespace/task for running them. 
Will be executed on the + instantiated TestTask (i.e., 'this', will be the TestTask instance), to set + the various instance-propertiess. + + @example + let t = new jake.TestTask('bij-js', function () { + this.testName = 'testSpecial'; + this.testFiles.include('test/**'); + }); + + */ +let TestTask = function () { + let self = this; + let args = Array.prototype.slice.call(arguments); + let name = args.shift(); + let definition = args.pop(); + let prereqs = args.pop() || []; + + /** + @name jake.TestTask#testNam + @public + @type {String} + @description The name of the namespace to place the tests in, and + the top-level task for running tests. Defaults to "test" + */ + this.testName = 'test'; + + /** + @name jake.TestTask#testFiles + @public + @type {jake.FileList} + @description The list of files containing tests to load + */ + this.testFiles = new jake.FileList(); + + /** + @name jake.TestTask#showDescription + @public + @type {Boolean} + @description Show the created task when doing Jake -T + */ + this.showDescription = true; + + /* + @name jake.TestTask#totalTests + @public + @type {Number} + @description The total number of tests to run + */ + this.totalTests = 0; + + /* + @name jake.TestTask#executedTests + @public + @type {Number} + @description The number of tests successfully run + */ + this.executedTests = 0; + + if (typeof definition == 'function') { + definition.call(this); + } + + if (this.showDescription) { + desc('Run the tests for ' + name); + } + + task(this.testName, prereqs, {async: true}, function () { + let t = jake.Task[this.fullName + ':run']; + t.on('complete', function () { + complete(); + }); + // Pass args to the namespaced test + t.invoke.apply(t, arguments); + }); + + namespace(self.testName, function () { + + let runTask = task('run', {async: true}, function (pat) { + let re; + let testFiles; + + // Don't nest; make a top-level namespace. Don't want + // re-calling from inside to nest infinitely + jake.currentNamespace = jake.defaultNamespace; + + re = new RegExp(pat); + // Get test files that match the passed-in pattern + testFiles = self.testFiles.toArray() + .filter(function (f) { + return (re).test(f); + }) // Don't load the same file multiple times -- should this be in FileList? + .reduce(function (p, c) { + if (p.indexOf(c) < 0) { + p.push(c); + } + return p; + }, []); + + // Create a namespace for all the testing tasks to live in + namespace(self.testName + 'Exec', function () { + // Each test will be a prereq for the dummy top-level task + let prereqs = []; + // Continuation to pass to the async tests, wrapping `continune` + let next = function () { + complete(); + }; + // Create the task for this test-function + let createTask = function (name, action) { + // If the test-function is defined with a continuation + // param, flag the task as async + let t; + let isAsync = !!action.length; + + // Define the actual namespaced task with the name, the + // wrapped action, and the correc async-flag + t = task(name, createAction(name, action), { + async: isAsync + }); + t.once('complete', function () { + self.executedTests++; + }); + t._internal = true; + return t; + }; + // Used as the action for the defined task for each test. 
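+ // Illustrative note: a test function exported with a parameter, e.g. exports.testFoo = function (next) {...},
+ // is flagged async by createTask above and receives the `next` continuation; it must call it to complete.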
+ let createAction = function (n, a) { + // A wrapped function that passes in the `next` function + // for any tasks that run asynchronously + return function () { + let cb; + if (a.length) { + cb = next; + } + if (!(n == 'before' || n == 'after' || + /_beforeEach$/.test(n) || /_afterEach$/.test(n))) { + jake.logger.log(n); + } + // 'this' will be the task when action is run + return a.call(this, cb); + }; + }; + // Dummy top-level task for everything to be prereqs for + let topLevel; + + // Pull in each test-file, and iterate over any exported + // test-functions. Register each test-function as a prereq task + testFiles.forEach(function (file) { + let exp = require(path.join(currDir, file)); + + // Create a namespace for each filename, so test-name collisions + // won't be a problem + namespace(file, function () { + let testPrefix = self.testName + 'Exec:' + file + ':'; + let testName; + // Dummy task for displaying file banner + testName = '*** Running ' + file + ' ***'; + prereqs.push(testPrefix + testName); + createTask(testName, function () {}); + + // 'before' setup + if (typeof exp.before == 'function') { + prereqs.push(testPrefix + 'before'); + // Create the task + createTask('before', exp.before); + } + + // Walk each exported function, and create a task for each + for (let p in exp) { + if (p == 'before' || p == 'after' || + p == 'beforeEach' || p == 'afterEach') { + continue; + } + + if (typeof exp.beforeEach == 'function') { + prereqs.push(testPrefix + p + '_beforeEach'); + // Create the task + createTask(p + '_beforeEach', exp.beforeEach); + } + + // Add the namespace:name of this test to the list of prereqs + // for the dummy top-level task + prereqs.push(testPrefix + p); + // Create the task + createTask(p, exp[p]); + + if (typeof exp.afterEach == 'function') { + prereqs.push(testPrefix + p + '_afterEach'); + // Create the task + createTask(p + '_afterEach', exp.afterEach); + } + } + + // 'after' teardown + if (typeof exp.after == 'function') { + prereqs.push(testPrefix + 'after'); + // Create the task + let afterTask = createTask('after', exp.after); + afterTask._internal = true; + } + + }); + }); + + self.totalTests = prereqs.length; + process.on('exit', function () { + // Throw in the case where the process exits without + // finishing tests, but no error was thrown + if (!jake.errorCode && (self.totalTests > self.executedTests)) { + throw new Error('Process exited without all tests completing.'); + } + }); + + // Create the dummy top-level task. When calling a task internally + // with `invoke` that is async (or has async prereqs), have to listen + // for the 'complete' event to know when it's done + topLevel = task('__top__', prereqs); + topLevel._internal = true; + topLevel.addListener('complete', function () { + jake.logger.log('All tests ran successfully'); + complete(); + }); + + topLevel.invoke(); // Do the thing! + }); + + }); + runTask._internal = true; + + }); + + +}; + +jake.TestTask = TestTask; +exports.TestTask = TestTask; + diff --git a/node_modules/jake/lib/utils/file.js b/node_modules/jake/lib/utils/file.js new file mode 100644 index 000000000..a436defc9 --- /dev/null +++ b/node_modules/jake/lib/utils/file.js @@ -0,0 +1,286 @@ +/* + * Utilities: A classic collection of JavaScript utilities + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. 
+ * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +let fs = require('fs'); +let path = require('path'); + +/** + @name file + @namespace file +*/ + +let fileUtils = new (function () { + + // Recursively copy files and directories + let _copyFile = function (fromPath, toPath, opts) { + let from = path.normalize(fromPath) + let to = path.normalize(toPath) + let options = opts || {} + let fromStat; + let toStat; + let destExists; + let destDoesNotExistErr; + let content; + let filename; + let dirContents; + let targetDir; + + fromStat = fs.statSync(from); + + try { + //console.dir(to + ' destExists'); + toStat = fs.statSync(to); + destExists = true; + } + catch(e) { + //console.dir(to + ' does not exist'); + destDoesNotExistErr = e; + destExists = false; + } + // Destination dir or file exists, copy into (directory) + // or overwrite (file) + if (destExists) { + + // If there's a rename-via-copy file/dir name passed, use it. + // Otherwise use the actual file/dir name + filename = options.rename || path.basename(from); + + // Copying a directory + if (fromStat.isDirectory()) { + dirContents = fs.readdirSync(from); + targetDir = path.join(to, filename); + // We don't care if the target dir already exists + try { + fs.mkdirSync(targetDir, {mode: fromStat.mode & 0o777}); + } + catch(e) { + if (e.code !== 'EEXIST') { + throw e; + } + } + for (let i = 0, ii = dirContents.length; i < ii; i++) { + _copyFile(path.join(from, dirContents[i]), targetDir, {preserveMode: options.preserveMode}); + } + } + // Copying a file + else { + content = fs.readFileSync(from); + let mode = fromStat.mode & 0o777; + let targetFile = to; + + if (toStat.isDirectory()) { + targetFile = path.join(to, filename); + } + + let fileExists = fs.existsSync(targetFile); + fs.writeFileSync(targetFile, content); + + // If the file didn't already exist, use the original file mode. + // Otherwise, only update the mode if preserverMode is true. + if(!fileExists || options.preserveMode) { + fs.chmodSync(targetFile, mode); + } + } + } + // Dest doesn't exist, can't create it + else { + throw destDoesNotExistErr; + } + }; + + // Remove the given directory + let _rmDir = function (dirPath) { + let dir = path.normalize(dirPath); + let paths = []; + paths = fs.readdirSync(dir); + paths.forEach(function (p) { + let curr = path.join(dir, p); + let stat = fs.lstatSync(curr); + if (stat.isDirectory()) { + _rmDir(curr); + } + else { + try { + fs.unlinkSync(curr); + } catch(e) { + if (e.code === 'EPERM') { + fs.chmodSync(curr, parseInt(666, 8)); + fs.unlinkSync(curr); + } else { + throw e; + } + } + } + }); + fs.rmdirSync(dir); + }; + + /** + @name file#cpR + @public + @function + @description Copies a directory/file to a destination + @param {String} fromPath The source path to copy from + @param {String} toPath The destination path to copy to + @param {Object} opts Options to use + @param {Boolean} [opts.preserveMode] If target file already exists, this + determines whether the original file's mode is copied over. The default of + false mimics the behavior of the `cp` command line tool. 
(Default: false) + */ + this.cpR = function (fromPath, toPath, options) { + let from = path.normalize(fromPath); + let to = path.normalize(toPath); + let toStat; + let doesNotExistErr; + let filename; + let opts = options || {}; + + if (from == to) { + throw new Error('Cannot copy ' + from + ' to itself.'); + } + + // Handle rename-via-copy + try { + toStat = fs.statSync(to); + } + catch(e) { + doesNotExistErr = e; + + // Get abs path so it's possible to check parent dir + if (!this.isAbsolute(to)) { + to = path.join(process.cwd(), to); + } + + // Save the file/dir name + filename = path.basename(to); + // See if a parent dir exists, so there's a place to put the + /// renamed file/dir (resets the destination for the copy) + to = path.dirname(to); + try { + toStat = fs.statSync(to); + } + catch(e) {} + if (toStat && toStat.isDirectory()) { + // Set the rename opt to pass to the copy func, will be used + // as the new file/dir name + opts.rename = filename; + //console.log('filename ' + filename); + } + else { + throw doesNotExistErr; + } + } + + _copyFile(from, to, opts); + }; + + /** + @name file#mkdirP + @public + @function + @description Create the given directory(ies) using the given mode permissions + @param {String} dir The directory to create + @param {Number} mode The mode to give the created directory(ies)(Default: 0755) + */ + this.mkdirP = function (dir, mode) { + let dirPath = path.normalize(dir); + let paths = dirPath.split(/\/|\\/); + let currPath = ''; + let next; + + if (paths[0] == '' || /^[A-Za-z]+:/.test(paths[0])) { + currPath = paths.shift() || '/'; + currPath = path.join(currPath, paths.shift()); + //console.log('basedir'); + } + while ((next = paths.shift())) { + if (next == '..') { + currPath = path.join(currPath, next); + continue; + } + currPath = path.join(currPath, next); + try { + //console.log('making ' + currPath); + fs.mkdirSync(currPath, mode || parseInt(755, 8)); + } + catch(e) { + if (e.code != 'EEXIST') { + throw e; + } + } + } + }; + + /** + @name file#rmRf + @public + @function + @description Deletes the given directory/file + @param {String} p The path to delete, can be a directory or file + */ + this.rmRf = function (p, options) { + let stat; + try { + stat = fs.lstatSync(p); + if (stat.isDirectory()) { + _rmDir(p); + } + else { + fs.unlinkSync(p); + } + } + catch (e) {} + }; + + /** + @name file#isAbsolute + @public + @function + @return {Boolean/String} If it's absolute the first character is returned otherwise false + @description Checks if a given path is absolute or relative + @param {String} p Path to check + */ + this.isAbsolute = function (p) { + let match = /^[A-Za-z]+:\\|^\//.exec(p); + if (match && match.length) { + return match[0]; + } + return false; + }; + + /** + @name file#absolutize + @public + @function + @return {String} Returns the absolute path for the given path + @description Returns the absolute path for the given path + @param {String} p The path to get the absolute path for + */ + this.absolutize = function (p) { + if (this.isAbsolute(p)) { + return p; + } + else { + return path.join(process.cwd(), p); + } + }; + +})(); + +module.exports = fileUtils; + diff --git a/node_modules/jake/lib/utils/index.js b/node_modules/jake/lib/utils/index.js new file mode 100644 index 000000000..17d686bdc --- /dev/null +++ b/node_modules/jake/lib/utils/index.js @@ -0,0 +1,297 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * 
you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + + +let util = require('util'); // Native Node util module +let spawn = require('child_process').spawn; +let EventEmitter = require('events').EventEmitter; +let logger = require('./logger'); +let file = require('./file'); +let Exec; + +const _UUID_CHARS = '0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz'.split(''); + +let parseArgs = function (argumentsObj) { + let args; + let arg; + let cmds; + let callback; + let opts = { + interactive: false, + printStdout: false, + printStderr: false, + breakOnError: true + }; + + args = Array.prototype.slice.call(argumentsObj); + + cmds = args.shift(); + // Arrayize if passed a single string command + if (typeof cmds == 'string') { + cmds = [cmds]; + } + // Make a copy if it's an actual list + else { + cmds = cmds.slice(); + } + + // Get optional callback or opts + while((arg = args.shift())) { + if (typeof arg == 'function') { + callback = arg; + } + else if (typeof arg == 'object') { + opts = Object.assign(opts, arg); + } + } + + // Backward-compat shim + if (typeof opts.stdout != 'undefined') { + opts.printStdout = opts.stdout; + delete opts.stdout; + } + if (typeof opts.stderr != 'undefined') { + opts.printStderr = opts.stderr; + delete opts.stderr; + } + + return { + cmds: cmds, + opts: opts, + callback: callback + }; +}; + +/** + @name jake + @namespace jake +*/ +let utils = new (function () { + /** + @name jake.exec + @static + @function + @description Executes shell-commands asynchronously with an optional + final callback. + ` + @param {String[]} cmds The list of shell-commands to execute + @param {Object} [opts] + @param {Boolean} [opts.printStdout=false] Print stdout from each command + @param {Boolean} [opts.printStderr=false] Print stderr from each command + @param {Boolean} [opts.breakOnError=true] Stop further execution on + the first error. + @param {Boolean} [opts.windowsVerbatimArguments=false] Don't translate + arguments on Windows. 
+ @param {Function} [callback] Callback to run after executing the + commands + + @example + let cmds = [ + 'echo "showing directories"' + , 'ls -al | grep ^d' + , 'echo "moving up a directory"' + , 'cd ../' + ] + , callback = function () { + console.log('Finished running commands.'); + } + jake.exec(cmds, {stdout: true}, callback); + */ + this.exec = function (a, b, c) { + let parsed = parseArgs(arguments); + let cmds = parsed.cmds; + let opts = parsed.opts; + let callback = parsed.callback; + + let ex = new Exec(cmds, opts, callback); + + ex.addListener('error', function (msg, code) { + if (opts.breakOnError) { + fail(msg, code); + } + }); + ex.run(); + + return ex; + }; + + this.createExec = function (a, b, c) { + return new Exec(a, b, c); + }; + + // From Math.uuid.js, https://github.com/broofa/node-uuid + // Robert Kieffer (robert@broofa.com), MIT license + this.uuid = function (length, radix) { + var chars = _UUID_CHARS + , uuid = [] + , r + , i; + + radix = radix || chars.length; + + if (length) { + // Compact form + i = -1; + while (++i < length) { + uuid[i] = chars[0 | Math.random()*radix]; + } + } else { + // rfc4122, version 4 form + + // rfc4122 requires these characters + uuid[8] = uuid[13] = uuid[18] = uuid[23] = '-'; + uuid[14] = '4'; + + // Fill in random data. At i==19 set the high bits of clock sequence as + // per rfc4122, sec. 4.1.5 + i = -1; + while (++i < 36) { + if (!uuid[i]) { + r = 0 | Math.random()*16; + uuid[i] = chars[(i == 19) ? (r & 0x3) | 0x8 : r]; + } + } + } + + return uuid.join(''); + }; + +})(); + +Exec = function () { + let parsed = parseArgs(arguments); + let cmds = parsed.cmds; + let opts = parsed.opts; + let callback = parsed.callback; + + this._cmds = cmds; + this._callback = callback; + this._config = opts; +}; + +util.inherits(Exec, EventEmitter); + +Object.assign(Exec.prototype, new (function () { + + let _run = function () { + let self = this; + let sh; + let cmd; + let args; + let next = this._cmds.shift(); + let config = this._config; + let errData = ''; + let shStdio; + let handleStdoutData = function (data) { + self.emit('stdout', data); + }; + let handleStderrData = function (data) { + let d = data.toString(); + self.emit('stderr', data); + // Accumulate the error-data so we can use it as the + // stack if the process exits with an error + errData += d; + }; + + // Keep running as long as there are commands in the array + if (next) { + let spawnOpts = {}; + this.emit('cmdStart', next); + + // Ganking part of Node's child_process.exec to get cmdline args parsed + if (process.platform == 'win32') { + cmd = 'cmd'; + args = ['/c', next]; + if (config.windowsVerbatimArguments) { + spawnOpts.windowsVerbatimArguments = true; + } + } + else { + cmd = '/bin/sh'; + args = ['-c', next]; + } + + if (config.interactive) { + spawnOpts.stdio = 'inherit'; + sh = spawn(cmd, args, spawnOpts); + } + else { + shStdio = [ + process.stdin + ]; + if (config.printStdout) { + shStdio.push(process.stdout); + } + else { + shStdio.push('pipe'); + } + if (config.printStderr) { + shStdio.push(process.stderr); + } + else { + shStdio.push('pipe'); + } + spawnOpts.stdio = shStdio; + sh = spawn(cmd, args, spawnOpts); + if (!config.printStdout) { + sh.stdout.addListener('data', handleStdoutData); + } + if (!config.printStderr) { + sh.stderr.addListener('data', handleStderrData); + } + } + + // Exit, handle err or run next + sh.on('exit', function (code) { + let msg; + if (code !== 0) { + msg = errData || 'Process exited with error.'; + msg = msg.trim(); + self.emit('error', 
msg, code); + } + if (code === 0 || !config.breakOnError) { + self.emit('cmdEnd', next); + setTimeout(function () { _run.call(self); }, 0); + } + }); + + } + else { + self.emit('end'); + if (typeof self._callback == 'function') { + self._callback(); + } + } + }; + + this.append = function (cmd) { + this._cmds.push(cmd); + }; + + this.run = function () { + _run.call(this); + }; + +})()); + +utils.Exec = Exec; +utils.file = file; +utils.logger = logger; + +module.exports = utils; + diff --git a/node_modules/jake/lib/utils/logger.js b/node_modules/jake/lib/utils/logger.js new file mode 100644 index 000000000..8f72686fa --- /dev/null +++ b/node_modules/jake/lib/utils/logger.js @@ -0,0 +1,24 @@ +let util = require('util'); + +let logger = new (function () { + let _output = function (type, out) { + let quiet = typeof jake != 'undefined' && jake.program && + jake.program.opts && jake.program.opts.quiet; + let msg; + if (!quiet) { + msg = typeof out == 'string' ? out : util.inspect(out); + console[type](msg); + } + }; + + this.log = function (out) { + _output('log', out); + }; + + this.error = function (out) { + _output('error', out); + }; + +})(); + +module.exports = logger; diff --git a/node_modules/jake/package.json b/node_modules/jake/package.json new file mode 100644 index 000000000..15e712cb6 --- /dev/null +++ b/node_modules/jake/package.json @@ -0,0 +1,75 @@ +{ + "_from": "jake@^10.6.1", + "_id": "jake@10.8.2", + "_inBundle": false, + "_integrity": "sha512-eLpKyrfG3mzvGE2Du8VoPbeSkRry093+tyNjdYaBbJS9v17knImYGNXQCUV0gLxQtF82m3E8iRb/wdSQZLoq7A==", + "_location": "/jake", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "jake@^10.6.1", + "name": "jake", + "escapedName": "jake", + "rawSpec": "^10.6.1", + "saveSpec": null, + "fetchSpec": "^10.6.1" + }, + "_requiredBy": [ + "/ejs" + ], + "_resolved": "https://registry.npmjs.org/jake/-/jake-10.8.2.tgz", + "_shasum": "ebc9de8558160a66d82d0eadc6a2e58fbc500a7b", + "_spec": "jake@^10.6.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\ejs", + "author": { + "name": "Matthew Eernisse", + "email": "mde@fleegix.org", + "url": "http://fleegix.org" + }, + "bin": { + "jake": "bin/cli.js" + }, + "bugs": { + "url": "https://github.com/jakejs/jake/issues" + }, + "bundleDependencies": false, + "dependencies": { + "async": "0.9.x", + "chalk": "^2.4.2", + "filelist": "^1.0.1", + "minimatch": "^3.0.4" + }, + "deprecated": false, + "description": "JavaScript build tool, similar to Make or Rake", + "devDependencies": { + "eslint": "^6.8.0", + "mocha": "^7.1.1", + "q": "^1.5.1" + }, + "engines": { + "node": "*" + }, + "homepage": "https://github.com/jakejs/jake#readme", + "keywords": [ + "build", + "cli", + "make", + "rake" + ], + "license": "Apache-2.0", + "main": "./lib/jake.js", + "name": "jake", + "preferGlobal": true, + "repository": { + "type": "git", + "url": "git://github.com/jakejs/jake.git" + }, + "scripts": { + "lint": "eslint --format codeframe \"lib/**/*.js\" \"test/**/*.js\"", + "lint:fix": "eslint --fix \"lib/**/*.js\" \"test/**/*.js\"", + "test": "./bin/cli.js test", + "test:ci": "npm run lint && npm run test" + }, + "version": "10.8.2" +} diff --git a/node_modules/jake/test/integration/concurrent.js b/node_modules/jake/test/integration/concurrent.js new file mode 100644 index 000000000..4ae41e810 --- /dev/null +++ b/node_modules/jake/test/integration/concurrent.js @@ -0,0 +1,42 @@ +let assert = require('assert'); +let exec = 
require('child_process').execSync; + +suite('concurrent', function () { + + this.timeout(7000); + + test(' simple concurrent prerequisites 1', function () { + let out = exec('./node_modules/.bin/jake -q concurrent:simple1').toString().trim() + assert.equal('Started A\nStarted B\nFinished B\nFinished A', out); + }); + + test(' simple concurrent prerequisites 2', function () { + let out = exec('./node_modules/.bin/jake -q concurrent:simple2').toString().trim() + assert.equal('Started C\nStarted D\nFinished C\nFinished D', out); + }); + + test(' sequential concurrent prerequisites', function () { + let out = exec('./node_modules/.bin/jake -q concurrent:seqconcurrent').toString().trim() + assert.equal('Started A\nStarted B\nFinished B\nFinished A\nStarted C\nStarted D\nFinished C\nFinished D', out); + }); + + test(' concurrent concurrent prerequisites', function () { + let out = exec('./node_modules/.bin/jake -q concurrent:concurrentconcurrent').toString().trim() + assert.equal('Started A\nStarted B\nStarted C\nStarted D\nFinished B\nFinished C\nFinished A\nFinished D', out); + }); + + test(' concurrent prerequisites with subdependency', function () { + let out = exec('./node_modules/.bin/jake -q concurrent:subdep').toString().trim() + assert.equal('Started A\nFinished A\nStarted Ba\nFinished Ba', out); + }); + + test(' failing in concurrent prerequisites', function () { + try { + exec('./node_modules/.bin/jake -q concurrent:Cfail'); + } + catch(err) { + assert(err.message.indexOf('Command failed') > -1); + } + }); + +}); diff --git a/node_modules/jake/test/integration/file.js b/node_modules/jake/test/integration/file.js new file mode 100644 index 000000000..97ed0d63b --- /dev/null +++ b/node_modules/jake/test/integration/file.js @@ -0,0 +1,228 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +let assert = require('assert'); +let fs = require('fs'); +let path = require('path'); +let file = require(`${PROJECT_DIR}/lib/utils/file`); +let existsSync = fs.existsSync || path.existsSync; +let exec = require('child_process').execSync; + +suite('fileUtils', function () { + + test('mkdirP', function () { + let expected = [ + ['foo'], + ['foo', 'bar'], + ['foo', 'bar', 'baz'], + ['foo', 'bar', 'baz', 'qux'] + ]; + file.mkdirP('foo/bar/baz/qux'); + let res = exec('find foo').toString().trim().split('\n'); + for (let i = 0, ii = res.length; i < ii; i++) { + assert.equal(path.join.apply(path, expected[i]), res[i]); + } + file.rmRf('foo'); + }); + + test('rmRf', function () { + file.mkdirP('foo/bar/baz/qux'); + file.rmRf('foo/bar'); + let res = exec('find foo').toString().trim().split('\n'); + assert.equal(1, res.length); + assert.equal('foo', res[0]); + fs.rmdirSync('foo'); + }); + + test('rmRf with symlink subdir', function () { + file.mkdirP('foo'); + file.mkdirP('bar'); + fs.writeFileSync('foo/hello.txt', 'hello, it\'s me'); + fs.symlinkSync('../foo', 'bar/foo'); file.rmRf('bar'); + + // Make sure the bar directory was successfully deleted + let barDeleted = false; + try { + fs.statSync('bar'); + } catch(err) { + if(err.code == 'ENOENT') { + barDeleted = true; + } + } + assert.equal(true, barDeleted); + + // Make sure that the file inside the linked folder wasn't deleted + let res = fs.readdirSync('foo'); + assert.equal(1, res.length); + assert.equal('hello.txt', res[0]); + + // Cleanup + fs.unlinkSync('foo/hello.txt'); + fs.rmdirSync('foo'); + }); + + test('rmRf with symlinked dir', function () { + file.mkdirP('foo'); + fs.writeFileSync('foo/hello.txt', 'hello!'); + fs.symlinkSync('foo', 'bar'); + file.rmRf('bar'); + + // Make sure the bar directory was successfully deleted + let barDeleted = false; + try { + fs.statSync('bar'); + } catch(err) { + if(err.code == 'ENOENT') { + barDeleted = true; + } + } + assert.equal(true, barDeleted); + + // Make sure that the file inside the linked folder wasn't deleted + let res = fs.readdirSync('foo'); + assert.equal(1, res.length); + assert.equal('hello.txt', res[0]); + + // Cleanup + fs.unlinkSync('foo/hello.txt'); + fs.rmdirSync('foo'); + }); + + test('cpR with same name and different directory', function () { + file.mkdirP('foo'); + fs.writeFileSync('foo/bar.txt', 'w00t'); + file.cpR('foo', 'bar'); + assert.ok(existsSync('bar/bar.txt')); + file.rmRf('foo'); + file.rmRf('bar'); + }); + + test('cpR with same to and from will throw', function () { + assert.throws(function () { + file.cpR('foo.txt', 'foo.txt'); + }); + }); + + test('cpR rename via copy in directory', function () { + file.mkdirP('foo'); + fs.writeFileSync('foo/bar.txt', 'w00t'); + file.cpR('foo/bar.txt', 'foo/baz.txt'); + assert.ok(existsSync('foo/baz.txt')); + file.rmRf('foo'); + }); + + test('cpR rename via copy in base', function () { + fs.writeFileSync('bar.txt', 'w00t'); + file.cpR('bar.txt', 'baz.txt'); + assert.ok(existsSync('baz.txt')); + file.rmRf('bar.txt'); + file.rmRf('baz.txt'); + }); + + test('cpR keeps file mode', function () { + fs.writeFileSync('bar.txt', 'w00t', {mode: 0o750}); + fs.writeFileSync('bar1.txt', 'w00t!', {mode: 0o744}); + file.cpR('bar.txt', 'baz.txt'); + file.cpR('bar1.txt', 'baz1.txt'); + + assert.ok(existsSync('baz.txt')); + assert.ok(existsSync('baz1.txt')); + let bazStat = fs.statSync('baz.txt'); + let bazStat1 = fs.statSync('baz1.txt'); + assert.equal(0o750, bazStat.mode & 0o7777); + 
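+ // Note: the 0o7777 mask also covers setuid/setgid/sticky bits; other tests in this file mask with 0o777.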
assert.equal(0o744, bazStat1.mode & 0o7777); + + file.rmRf('bar.txt'); + file.rmRf('baz.txt'); + file.rmRf('bar1.txt'); + file.rmRf('baz1.txt'); + }); + + test('cpR keeps file mode when overwriting with preserveMode', function () { + fs.writeFileSync('bar.txt', 'w00t', {mode: 0o755}); + fs.writeFileSync('baz.txt', 'w00t!', {mode: 0o744}); + file.cpR('bar.txt', 'baz.txt', {silent: true, preserveMode: true}); + + assert.ok(existsSync('baz.txt')); + let bazStat = fs.statSync('baz.txt'); + assert.equal(0o755, bazStat.mode & 0o777); + + file.rmRf('bar.txt'); + file.rmRf('baz.txt'); + }); + + test('cpR does not keep file mode when overwriting', function () { + fs.writeFileSync('bar.txt', 'w00t', {mode: 0o766}); + fs.writeFileSync('baz.txt', 'w00t!', {mode: 0o744}); + file.cpR('bar.txt', 'baz.txt'); + + assert.ok(existsSync('baz.txt')); + let bazStat = fs.statSync('baz.txt'); + assert.equal(0o744, bazStat.mode & 0o777); + + file.rmRf('bar.txt'); + file.rmRf('baz.txt'); + }); + + test('cpR copies file mode recursively', function () { + fs.mkdirSync('foo'); + fs.writeFileSync('foo/bar.txt', 'w00t', {mode: 0o740}); + file.cpR('foo', 'baz'); + + assert.ok(existsSync('baz')); + let barStat = fs.statSync('baz/bar.txt'); + assert.equal(0o740, barStat.mode & 0o777); + + file.rmRf('foo'); + file.rmRf('baz'); + }); + + test('cpR keeps file mode recursively', function () { + fs.mkdirSync('foo'); + fs.writeFileSync('foo/bar.txt', 'w00t', {mode: 0o740}); + fs.mkdirSync('baz'); + fs.mkdirSync('baz/foo'); + fs.writeFileSync('baz/foo/bar.txt', 'w00t!', {mode: 0o755}); + file.cpR('foo', 'baz', {silent: true, preserveMode: true}); + + assert.ok(existsSync('baz')); + let barStat = fs.statSync('baz/foo/bar.txt'); + assert.equal(0o740, barStat.mode & 0o777); + + file.rmRf('foo'); + file.rmRf('baz'); + }); + + test('cpR copies directory mode recursively', function () { + fs.mkdirSync('foo', 0o755); + fs.mkdirSync('foo/bar', 0o700); + file.cpR('foo', 'bar'); + + assert.ok(existsSync('foo')); + let fooBarStat = fs.statSync('bar/bar'); + assert.equal(0o700, fooBarStat.mode & 0o777); + + file.rmRf('foo'); + file.rmRf('bar'); + }); + +}); + + diff --git a/node_modules/jake/test/integration/file_task.js b/node_modules/jake/test/integration/file_task.js new file mode 100644 index 000000000..b48f07e7a --- /dev/null +++ b/node_modules/jake/test/integration/file_task.js @@ -0,0 +1,125 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +let assert = require('assert'); +let fs = require('fs'); +let exec = require('child_process').execSync; +let { rmRf } = require(`${PROJECT_DIR}/lib/jake`); + +let cleanUpAndNext = function (callback) { + rmRf('./foo', { + silent: true + }); + callback && callback(); +}; + +suite('fileTask', function () { + this.timeout(7000); + + setup(function () { + cleanUpAndNext(); + }); + + test('where a file-task prereq does not change with --always-make', function () { + let out; + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-src1.txt').toString().trim(); + assert.equal('fileTest:foo/src1.txt task\nfileTest:foo/from-src1.txt task', + out); + out = exec('./node_modules/.bin/jake -q -B fileTest:foo/from-src1.txt').toString().trim(); + assert.equal('fileTest:foo/src1.txt task\nfileTest:foo/from-src1.txt task', + out); + cleanUpAndNext(); + }); + + test('concating two files', function () { + let out; + out = exec('./node_modules/.bin/jake -q fileTest:foo/concat.txt').toString().trim(); + assert.equal('fileTest:foo/src1.txt task\ndefault task\nfileTest:foo/src2.txt task\n' + + 'fileTest:foo/concat.txt task', out); + // Check to see the two files got concat'd + let data = fs.readFileSync(process.cwd() + '/foo/concat.txt'); + assert.equal('src1src2', data.toString()); + cleanUpAndNext(); + }); + + test('where a file-task prereq does not change', function () { + let out; + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-src1.txt').toString().trim(); + assert.equal('fileTest:foo/src1.txt task\nfileTest:foo/from-src1.txt task', out); + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-src1.txt').toString().trim(); + // Second time should be a no-op + assert.equal('', out); + cleanUpAndNext(); + }); + + test('where a file-task prereq does change, then does not', function (next) { + exec('mkdir -p ./foo'); + exec('touch ./foo/from-src1.txt'); + setTimeout(() => { + fs.writeFileSync('./foo/src1.txt', '-SRC'); + // Task should run the first time + let out; + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-src1.txt').toString().trim(); + assert.equal('fileTest:foo/from-src1.txt task', out); + // Task should not run on subsequent invocation + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-src1.txt').toString().trim(); + assert.equal('', out); + cleanUpAndNext(next); + }, 1000); + }); + + test('a preexisting file', function () { + let prereqData = 'howdy'; + exec('mkdir -p ./foo'); + fs.writeFileSync('foo/prereq.txt', prereqData); + let out; + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-prereq.txt').toString().trim(); + assert.equal('fileTest:foo/from-prereq.txt task', out); + let data = fs.readFileSync(process.cwd() + '/foo/from-prereq.txt'); + assert.equal(prereqData, data.toString()); + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-prereq.txt').toString().trim(); + // Second time should be a no-op + assert.equal('', out); + cleanUpAndNext(); + }); + + test('a preexisting file with --always-make flag', function () { + let prereqData = 'howdy'; + exec('mkdir -p ./foo'); + fs.writeFileSync('foo/prereq.txt', prereqData); + let out; + out = exec('./node_modules/.bin/jake -q fileTest:foo/from-prereq.txt').toString().trim(); + assert.equal('fileTest:foo/from-prereq.txt task', out); + let data = fs.readFileSync(process.cwd() + '/foo/from-prereq.txt'); + assert.equal(prereqData, data.toString()); + out = exec('./node_modules/.bin/jake -q -B 
fileTest:foo/from-prereq.txt').toString().trim(); + assert.equal('fileTest:foo/from-prereq.txt task', out); + cleanUpAndNext(); + }); + + test('nested directory-task', function () { + exec('./node_modules/.bin/jake -q fileTest:foo/bar/baz/bamf.txt'); + let data = fs.readFileSync(process.cwd() + '/foo/bar/baz/bamf.txt'); + assert.equal('w00t', data); + cleanUpAndNext(); + }); + +}); + diff --git a/node_modules/jake/test/integration/helpers.js b/node_modules/jake/test/integration/helpers.js new file mode 100644 index 000000000..9caaa4e7f --- /dev/null +++ b/node_modules/jake/test/integration/helpers.js @@ -0,0 +1,80 @@ +var exec = require('child_process').exec; + +var helpers = new (function () { + var _tests; + var _names = []; + var _name; + var _callback; + var _runner = function () { + if ((_name = _names.shift())) { + console.log('Running ' + _name); + _tests[_name](); + } + else { + _callback(); + } + }; + + this.exec = function () { + var args = Array.prototype.slice.call(arguments); + var arg; + var cmd = args.shift(); + var opts = {}; + var callback; + // Optional opts/callback or callback/opts + while ((arg = args.shift())) { + if (typeof arg == 'function') { + callback = arg; + } + else { + opts = arg; + } + } + + cmd += ' --trace'; + var execOpts = opts.execOpts ? opts.execOpts : {}; + exec(cmd, execOpts, function (err, stdout, stderr) { + var out = helpers.trim(stdout); + if (err) { + if (opts.breakOnError === false) { + return callback(err); + } + else { + throw err; + } + } + if (stderr) { + callback(stderr); + } + else { + callback(out); + } + }); + }; + + this.trim = function (s) { + var str = s || ''; + return str.replace(/^\s*|\s*$/g, ''); + }; + + this.parse = function (s) { + var str = s || ''; + str = helpers.trim(str); + str = str.replace(/'/g, '"'); + return JSON.parse(str); + }; + + this.run = function (tests, callback) { + _tests = tests; + _names = Object.keys(tests); + _callback = callback; + _runner(); + }; + + this.next = function () { + _runner(); + }; + +})(); + +module.exports = helpers; diff --git a/node_modules/jake/test/integration/jakefile.js b/node_modules/jake/test/integration/jakefile.js new file mode 100644 index 000000000..f3b7d1a4c --- /dev/null +++ b/node_modules/jake/test/integration/jakefile.js @@ -0,0 +1,337 @@ +let fs = require('fs'); +let Q = require('q'); + +desc('The default t.'); +task('default', function () { + console.log('default task'); +}); + +desc('No action.'); +task({'noAction': ['default']}); + +desc('No action, no prereqs.'); +task('noActionNoPrereqs'); + +desc('Top-level zerbofrangazoomy task'); +task('zerbofrangazoomy', function () { + console.log('Whaaaaaaaa? 
Ran the zerbofrangazoomy task!') +}); + +desc('Task that throws'); +task('throwy', function () { + let errorListener = function (err) { + console.log('Emitted'); + console.log(err.toString()); + + jake.removeListener('error', errorListener); + }; + + jake.on('error', errorListener); + + throw new Error('I am bad'); +}); + +desc('Task that rejects a Promise'); +task('promiseRejecter', function () { + const originalOption = jake.program.opts['allow-rejection']; + + const errorListener = function (err) { + console.log(err.toString()); + jake.removeListener('error', errorListener); + jake.program.opts['allow-rejection'] = originalOption; // Restore original 'allow-rejection' option + }; + jake.on('error', errorListener); + + jake.program.opts['allow-rejection'] = false; // Do not allow rejection so the rejection is passed to error handlers + + Promise.reject(''); +}); + +desc('Accepts args and env vars.'); +task('argsEnvVars', function () { + let res = { + args: arguments + , env: { + foo: process.env.foo + , baz: process.env.baz + } + }; + console.log(JSON.stringify(res)); +}); + +namespace('foo', function () { + desc('The foo:bar t.'); + task('bar', function () { + if (arguments.length) { + console.log('foo:bar[' + + Array.prototype.join.call(arguments, ',') + + '] task'); + } + else { + console.log('foo:bar task'); + } + }); + + desc('The foo:baz task, calls foo:bar as a prerequisite.'); + task('baz', ['foo:bar'], function () { + console.log('foo:baz task'); + }); + + desc('The foo:qux task, calls foo:bar with cmdline args as a prerequisite.'); + task('qux', ['foo:bar[asdf,qwer]'], function () { + console.log('foo:qux task'); + }); + + desc('The foo:frang task,`invokes` foo:bar with passed args as a prerequisite.'); + task('frang', function () { + let t = jake.Task['foo:bar']; + // Do args pass-through + t.invoke.apply(t, arguments); + t.on('complete', () => { + console.log('foo:frang task'); + }); + }); + + desc('The foo:zerb task, `executes` foo:bar with passed args as a prerequisite.'); + task('zerb', function () { + let t = jake.Task['foo:bar']; + // Do args pass-through + t.execute.apply(t, arguments); + t.on('complete', () => { + console.log('foo:zerb task'); + }); + }); + + desc('The foo:zoobie task, has no prerequisites.'); + task('zoobie', function () { + console.log('foo:zoobie task'); + }); + + desc('The foo:voom task, run the foo:zoobie task repeatedly.'); + task('voom', function () { + let t = jake.Task['foo:bar']; + t.on('complete', function () { + console.log('complete'); + }); + t.execute.apply(t); + t.execute.apply(t); + }); + + desc('The foo:asdf task, has the same prereq twice.'); + task('asdf', ['foo:bar', 'foo:baz'], function () { + console.log('foo:asdf task'); + }); + +}); + +namespace('bar', function () { + desc('The bar:foo task, has no prerequisites, is async, returns Promise which resolves.'); + task('foo', async function () { + return new Promise((resolve, reject) => { + console.log('bar:foo task'); + resolve(); + }); + }); + + desc('The bar:promise task has no prerequisites, is async, returns Q-based promise.'); + task('promise', function () { + return Q() + .then(function () { + console.log('bar:promise task'); + return 123654; + }); + }); + + desc('The bar:dependOnpromise task waits for a promise based async test'); + task('dependOnpromise', ['promise'], function () { + console.log('bar:dependOnpromise task saw value', jake.Task["bar:promise"].value); + }); + + desc('The bar:brokenPromise task is a failing Q-promise based async task.'); + task('brokenPromise', 
function () { + return Q() + .then(function () { + throw new Error("nom nom nom"); + }); + }); + + desc('The bar:bar task, has the async bar:foo task as a prerequisite.'); + task('bar', ['bar:foo'], function () { + console.log('bar:bar task'); + }); + +}); + +namespace('hoge', function () { + desc('The hoge:hoge task, has no prerequisites.'); + task('hoge', function () { + console.log('hoge:hoge task'); + }); + + desc('The hoge:piyo task, has no prerequisites.'); + task('piyo', function () { + console.log('hoge:piyo task'); + }); + + desc('The hoge:fuga task, has hoge:hoge and hoge:piyo as prerequisites.'); + task('fuga', ['hoge:hoge', 'hoge:piyo'], function () { + console.log('hoge:fuga task'); + }); + + desc('The hoge:charan task, has hoge:fuga as a prerequisite.'); + task('charan', ['hoge:fuga'], function () { + console.log('hoge:charan task'); + }); + + desc('The hoge:gero task, has hoge:fuga as a prerequisite.'); + task('gero', ['hoge:fuga'], function () { + console.log('hoge:gero task'); + }); + + desc('The hoge:kira task, has hoge:charan and hoge:gero as prerequisites.'); + task('kira', ['hoge:charan', 'hoge:gero'], function () { + console.log('hoge:kira task'); + }); + +}); + +namespace('fileTest', function () { + directory('foo'); + + desc('File task, concatenating two files together'); + file('foo/concat.txt', ['fileTest:foo', 'fileTest:foo/src1.txt', 'fileTest:foo/src2.txt'], function () { + console.log('fileTest:foo/concat.txt task'); + let data1 = fs.readFileSync('foo/src1.txt'); + let data2 = fs.readFileSync('foo/src2.txt'); + fs.writeFileSync('foo/concat.txt', data1 + data2); + }); + + desc('File task, async creation with writeFile'); + file('foo/src1.txt', function () { + return new Promise(function (resolve, reject) { + fs.writeFile('foo/src1.txt', 'src1', function (err) { + if (err) { + reject(err); + } + else { + console.log('fileTest:foo/src1.txt task'); + resolve(); + } + }); + }); + }); + + desc('File task, sync creation with writeFileSync'); + file('foo/src2.txt', ['default'], function () { + fs.writeFileSync('foo/src2.txt', 'src2'); + console.log('fileTest:foo/src2.txt task'); + }); + + desc('File task, do not run unless the prereq file changes'); + file('foo/from-src1.txt', ['fileTest:foo', 'fileTest:foo/src1.txt'], function () { + let data = fs.readFileSync('foo/src1.txt').toString(); + fs.writeFileSync('foo/from-src1.txt', data); + console.log('fileTest:foo/from-src1.txt task'); + }); + + desc('File task, run if the prereq file changes'); + task('touch-prereq', function () { + fs.writeFileSync('foo/prereq.txt', 'UPDATED'); + }) + + desc('File task, has a preexisting file (with no associated task) as a prereq'); + file('foo/from-prereq.txt', ['fileTest:foo', 'foo/prereq.txt'], function () { + let data = fs.readFileSync('foo/prereq.txt'); + fs.writeFileSync('foo/from-prereq.txt', data); + console.log('fileTest:foo/from-prereq.txt task'); + }); + + directory('foo/bar/baz'); + + desc('Write a file in a nested subdirectory'); + file('foo/bar/baz/bamf.txt', ['foo/bar/baz'], function () { + fs.writeFileSync('foo/bar/baz/bamf.txt', 'w00t'); + }); + +}); + +task('blammo'); +// Define task +task('voom', ['blammo'], function () { + console.log(this.prereqs.length); +}); + +// Modify, add a prereq +task('voom', ['noActionNoPrereqs']); + +namespace('vronk', function () { + task('groo', function () { + let t = jake.Task['vronk:zong']; + t.addListener('error', function (e) { + console.log(e.message); + }); + t.invoke(); + }); + task('zong', function () { + throw new 
Error('OMFGZONG'); + }); +}); + +// define namespace +namespace('one', function () { + task('one', function () { + console.log('one:one'); + }); +}); + +// modify namespace (add task) +namespace('one', function () { + task('two', ['one:one'], function () { + console.log('one:two'); + }); +}); + +task('selfdepconst', [], function () { + task('selfdep', ['selfdep'], function () { + console.log("I made a task that depends on itself"); + }); +}); +task('selfdepdyn', function () { + task('selfdeppar', [], {concurrency: 2}, function () { + console.log("I will depend on myself and will fail at runtime"); + }); + task('selfdeppar', ['selfdeppar']); + jake.Task['selfdeppar'].invoke(); +}); + +namespace("large", function () { + task("leaf", function () { + console.log("large:leaf"); + }); + + const same = []; + for (let i = 0; i < 2000; i++) { + same.push("leaf"); + } + + desc("Task with a large number of same prereqs"); + task("same", same, { concurrency: 2 }, function () { + console.log("large:same"); + }); + + const different = []; + for (let i = 0; i < 2000; i++) { + const name = "leaf-" + i; + task(name, function () { + if (name === "leaf-12" || name === "leaf-123") { + console.log(name); + } + }); + different.push(name); + } + + desc("Task with a large number of different prereqs"); + task("different", different, { concurrency: 2 } , function () { + console.log("large:different") + }) +}); diff --git a/node_modules/jake/test/integration/jakelib/concurrent.jake.js b/node_modules/jake/test/integration/jakelib/concurrent.jake.js new file mode 100644 index 000000000..684c86f1a --- /dev/null +++ b/node_modules/jake/test/integration/jakelib/concurrent.jake.js @@ -0,0 +1,113 @@ + +namespace('concurrent', function () { + task('A', function () { + console.log('Started A'); + return new Promise((resolve, reject) => { + setTimeout(() => { + console.log('Finished A'); + resolve(); + }, 200); + }); + }); + + task('B', function () { + console.log('Started B'); + return new Promise((resolve, reject) => { + setTimeout(() => { + console.log('Finished B'); + resolve(); + }, 50); + }); + }); + + task('C', function () { + console.log('Started C'); + return new Promise((resolve, reject) => { + setTimeout(() => { + console.log('Finished C'); + resolve(); + }, 100); + }); + }); + + task('D', function () { + console.log('Started D'); + return new Promise((resolve, reject) => { + setTimeout(() => { + console.log('Finished D'); + resolve(); + }, 300); + }); + }); + + task('Ba', ['A'], function () { + console.log('Started Ba'); + return new Promise((resolve, reject) => { + setTimeout(() => { + console.log('Finished Ba'); + resolve(); + }, 50); + }); + }); + + task('Afail', function () { + console.log('Started failing task'); + return new Promise((resolve, reject) => { + setTimeout(() => { + console.log('Failing B with error'); + throw new Error('I failed'); + }, 50); + }); + }); + + task('simple1', ['A','B'], {concurrency: 2}, function () { + return new Promise((resolve, reject) => { + setTimeout(() => { + resolve(); + }, 50); + }); + }); + + task('simple2', ['C','D'], {concurrency: 2}, function () { + return new Promise((resolve, reject) => { + setTimeout(() => { + resolve(); + }, 50); + }); + }); + + task('seqconcurrent', ['simple1','simple2'], function () { + return new Promise((resolve, reject) => { + setTimeout(() => { + resolve(); + }, 50); + }); + }); + + task('concurrentconcurrent', ['simple1','simple2'], {concurrency: 2}, function () { + return new Promise((resolve, reject) => { + setTimeout(() => { + 
resolve(); + }, 50); + }); + }); + + task('subdep', ['A','Ba'], {concurrency: 2}, function () { + return new Promise((resolve, reject) => { + setTimeout(() => { + resolve(); + }, 50); + }); + }); + + task('fail', ['A', 'B', 'Afail'], {concurrency: 3}, function () { + return new Promise((resolve, reject) => { + setTimeout(() => { + resolve(); + }, 50); + }); + }); + +}); + + diff --git a/node_modules/jake/test/integration/jakelib/publish.jake.js b/node_modules/jake/test/integration/jakelib/publish.jake.js new file mode 100644 index 000000000..52dd04a0f --- /dev/null +++ b/node_modules/jake/test/integration/jakelib/publish.jake.js @@ -0,0 +1,49 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +let fs = require('fs'); +let { publishTask, rmRf, mkdirP } = require(`${PROJECT_DIR}/lib/jake`); + +fs.writeFileSync('package.json', '{"version": "0.0.1"}'); +mkdirP('tmp_publish'); +fs.writeFileSync('tmp_publish/foo.txt', 'FOO'); + +publishTask('zerb', function () { + this.packageFiles.include([ + 'package.json' + , 'tmp_publish/**' + ]); + this.publishCmd = 'node -p -e "\'%filename\'"'; + this.gitCmd = 'echo' + this.scheduleDelay = 0; + + this._ensureRepoClean = function () {}; + this._getCurrentBranch = function () { + return 'v0.0' + }; +}); + +jake.setTaskTimeout(5000); + +jake.Task['publish'].on('complete', function () { + rmRf('tmp_publish', {silent: true}); + rmRf('package.json', {silent: true}); +}); + diff --git a/node_modules/jake/test/integration/jakelib/required_module.jake.js b/node_modules/jake/test/integration/jakelib/required_module.jake.js new file mode 100644 index 000000000..c63751d72 --- /dev/null +++ b/node_modules/jake/test/integration/jakelib/required_module.jake.js @@ -0,0 +1,10 @@ +let { task, namespace } = require("jake"); + +namespace('usingRequire', function () { + task('test', () => { + console.log('howdy test'); + }); +}); + + + diff --git a/node_modules/jake/test/integration/jakelib/rule.jake.js b/node_modules/jake/test/integration/jakelib/rule.jake.js new file mode 100644 index 000000000..8e977dd95 --- /dev/null +++ b/node_modules/jake/test/integration/jakelib/rule.jake.js @@ -0,0 +1,222 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. 
+ * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +let exec = require('child_process').execSync; +let fs = require('fs'); +let util = require('util'); +let { rule, rmRf } = require(`${PROJECT_DIR}/lib/jake`); + +directory('tmpsrc'); +directory('tmpbin'); + +//////////////////////////////////////////////////////////// +// Simple Suffix Rule +file('tmp', ['tmp_init', 'tmp_dep1.o', 'tmp_dep2.o'], function (params) { + console.log('tmp task'); + let data1 = fs.readFileSync('tmp_dep1.o'); + let data2 = fs.readFileSync('tmp_dep2.o'); + fs.writeFileSync('tmp', data1 + data2); +}); + +rule('.o', '.c', function () { + let cmd = util.format('cp %s %s', this.source, this.name); + console.log(cmd + ' task'); + exec(cmd); +}); + +file('tmp_dep1.c', function () { + fs.writeFileSync('tmp_dep1.c', 'src_1'); + console.log('tmp_dep1.c task'); +}); + +// note that tmp_dep2.o depends on tmp_dep2.c, which is a +// static file. +task('tmp_init', function () { + fs.writeFileSync('tmp_dep2.c', 'src_2'); + console.log('tmp_dep2.c task'); +}); +//////////////////////////////////////////////////////////// + +//////////////////////////////////////////////////////////// +// Pattern Rule +file('tmp_p', ['tmp_init', 'tmp_dep1.oo', 'tmp_dep2.oo'], function (params) { + console.log('tmp pattern task'); + let data1 = fs.readFileSync('tmp_dep1.oo'); + let data2 = fs.readFileSync('tmp_dep2.oo'); + fs.writeFileSync('tmp_p', data1 + data2 + ' pattern'); +}); + +rule('%.oo', '%.c', function () { + let cmd = util.format('cp %s %s', this.source, this.name); + console.log(cmd + ' task'); + exec(cmd); +}); +//////////////////////////////////////////////////////////// + +//////////////////////////////////////////////////////////// +// Pattern Rule with Folder +// i.e. rule('tmpbin/%.oo', 'tmpsrc/%.c', ... +file('tmp_pf', [ + 'tmp_src_init' + , 'tmpbin' + , 'tmpbin/tmp_dep1.oo' + , 'tmpbin/tmp_dep2.oo' ], function (params) { + console.log('tmp pattern folder task'); + let data1 = fs.readFileSync('tmpbin/tmp_dep1.oo'); + let data2 = fs.readFileSync('tmpbin/tmp_dep2.oo'); + fs.writeFileSync('tmp_pf', data1 + data2 + ' pattern folder'); +}); + +rule('tmpbin/%.oo', 'tmpsrc/%.c', function () { + let cmd = util.format('cp %s %s', this.source, this.name); + console.log(cmd + ' task'); + exec(cmd); +}); + +file('tmpsrc/tmp_dep2.c',['tmpsrc'], function () { + fs.writeFileSync('tmpsrc/tmp_dep2.c', 'src/src_2'); + console.log('tmpsrc/tmp_dep2.c task'); +}); + +// Create static files in folder tmpsrc. +task('tmp_src_init', ['tmpsrc'], function () { + fs.writeFileSync('tmpsrc/tmp_dep1.c', 'src/src_1'); + console.log('tmpsrc/tmp_dep1.c task'); +}); +//////////////////////////////////////////////////////////// + + +//////////////////////////////////////////////////////////// +// Namespace Test. This is a Mixed Test. +// Test for +// - rules belonging to different namespace. +// - rules with folder and pattern +task('tmp_ns', [ + 'tmpbin' + , 'rule:init' + , 'tmpbin/tmp_dep2.oo' // *** This relies on a rule defined before. 
+ , 'rule:tmpbin/dep1.oo' + , 'rule:tmpbin/file2.oo' ], function () { + console.log('tmp pattern folder namespace task'); + let data1 = fs.readFileSync('tmpbin/dep1.oo'); + let data2 = fs.readFileSync('tmpbin/tmp_dep2.oo'); + let data3 = fs.readFileSync('tmpbin/file2.oo'); + fs.writeFileSync('tmp_ns', data1 + data2 + data3 + ' pattern folder namespace'); +}); + +namespace('rule', function () { + task('init', ['tmpsrc'], function () { + fs.writeFileSync('tmpsrc/file2.c', 'src/src_3'); + console.log('tmpsrc/file2.c init task'); + }); + + file('tmpsrc/dep1.c',['tmpsrc'], function () { + fs.writeFileSync('tmpsrc/dep1.c', 'src/src_1'); + console.log('tmpsrc/dep1.c task'); + }, {async: true}); + + rule('tmpbin/%.oo', 'tmpsrc/%.c', function () { + let cmd = util.format('cp %s %s', this.source, this.name); + console.log(cmd + ' ns task'); + exec(cmd); + }); +}); +//////////////////////////////////////////////////////////// + +//////////////////////////////////////////////////////////// +// Chain rule +// rule('tmpbin/%.pdf', 'tmpbin/%.dvi', function() { ... +// rule('tmpbin/%.dvi', 'tmpsrc/%.tex', ['tmpbin'], function() { ... +task('tmp_cr', [ + 'chainrule:init' + , 'chainrule:tmpbin/file1.pdf' + , 'chainrule:tmpbin/file2.pdf' ], function () { + console.log('tmp chainrule namespace task'); + let data1 = fs.readFileSync('tmpbin/file1.pdf'); + let data2 = fs.readFileSync('tmpbin/file2.pdf'); + fs.writeFileSync('tmp_cr', data1 + data2 + ' chainrule namespace'); +}); + +namespace('chainrule', function () { + task('init', ['tmpsrc', 'tmpbin'], function () { + fs.writeFileSync('tmpsrc/file1.tex', 'tex1 '); + fs.writeFileSync('tmpsrc/file2.tex', 'tex2 '); + console.log('chainrule init task'); + }); + + rule('tmpbin/%.pdf', 'tmpbin/%.dvi', function () { + let cmd = util.format('cp %s %s', this.source, this.name); + console.log(cmd + ' dvi->pdf task'); + exec(cmd); + }); + + rule('tmpbin/%.dvi', 'tmpsrc/%.tex', ['tmpbin'], function () { + let cmd = util.format('cp %s %s', this.source, this.name); + console.log(cmd + ' tex->dvi task'); + exec(cmd); + }); +}); +//////////////////////////////////////////////////////////// +namespace('precedence', function () { + task('test', ['foo.html'], function () { + console.log('ran test'); + }); + + rule('.html', '.txt', function () { + console.log('created html'); + let data = fs.readFileSync(this.source); + fs.writeFileSync(this.name, data.toString()); + }); +}); + +namespace('regexPattern', function () { + task('test', ['foo.html'], function () { + console.log('ran test'); + }); + + rule(/\.html$/, '.txt', function () { + console.log('created html'); + let data = fs.readFileSync(this.source); + fs.writeFileSync(this.name, data.toString()); + }); +}); + +namespace('sourceFunction', function () { + + let srcFunc = function (taskName) { + return taskName.replace(/\.[^.]+$/, '.txt'); + }; + + task('test', ['foo.html'], function () { + console.log('ran test'); + }); + + rule('.html', srcFunc, function () { + console.log('created html'); + let data = fs.readFileSync(this.source); + fs.writeFileSync(this.name, data.toString()); + }); +}); + +//////////////////////////////////////////////////////////// +task('clean', function () { + rmRf('./foo'); + rmRf('./tmp'); +}); diff --git a/node_modules/jake/test/integration/publish_task.js b/node_modules/jake/test/integration/publish_task.js new file mode 100644 index 000000000..034fd94aa --- /dev/null +++ b/node_modules/jake/test/integration/publish_task.js @@ -0,0 +1,24 @@ +let assert = require('assert'); +let exec = 
require('child_process').execSync; + +suite('publishTask', function () { + + this.timeout(7000); + + test('default task', function () { + let out = exec('./node_modules/.bin/jake -q publish').toString().trim(); + let expected = [ + 'Fetched remote tags.' + , 'On branch v0.0' + , 'Bumped version number to v0.0.2.' + , 'Created package for zerb v0.0.2' + , 'Publishing zerb v0.0.2' + , './pkg/zerb-v0.0.2.tar.gz' + , 'BOOM! Published.' + , 'Cleaned up package' + ].join('\n'); + assert.equal(expected, out); + }); + +}); + diff --git a/node_modules/jake/test/integration/rule.js b/node_modules/jake/test/integration/rule.js new file mode 100644 index 000000000..b837b1dd3 --- /dev/null +++ b/node_modules/jake/test/integration/rule.js @@ -0,0 +1,216 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +let assert = require('assert'); +let exec = require('child_process').execSync; +let fs = require('fs'); +let { Rule } = require(`${PROJECT_DIR}/lib/rule`); +let { rmRf } = require(`${PROJECT_DIR}/lib/jake`); + +let cleanUpAndNext = function (callback) { + // Gotta add globbing to file utils rmRf + let tmpFiles = [ + 'tmp' + , 'tmp_ns' + , 'tmp_cr' + , 'tmp_p' + , 'tmp_pf' + , 'tmpbin' + , 'tmpsrc' + , 'tmp_dep1.c' + , 'tmp_dep1.o' + , 'tmp_dep1.oo' + , 'tmp_dep2.c' + , 'tmp_dep2.o' + , 'tmp_dep2.oo' + , 'foo' + , 'foo.html' + ]; + tmpFiles.forEach(function (f) { + rmRf(f, { + silent: true + }); + }); + callback && callback(); +}; + +suite('rule', function () { + + this.timeout(7000); + + setup(function (next) { + cleanUpAndNext(next); + }); + + + // - name foo:bin/main.o + // - pattern bin/%.o + // - source src/%.c + // + // return { + // 'dep' : 'foo:src/main.c', + // 'file': 'src/main.c' + // }; + test('Rule.getSource', function () { + let src = Rule.getSource('foo:bin/main.o', 'bin/%.o', 'src/%.c'); + assert.equal('foo:src/main.c', src); + }); + + test('rule w/o pattern', function () { + let out = exec( './node_modules/.bin/jake -q tmp').toString().trim(); + let output = [ + "tmp_dep2.c task" + , "tmp_dep1.c task" + , "cp tmp_dep1.c tmp_dep1.o task" + , "cp tmp_dep2.c tmp_dep2.o task" + , "tmp task"]; + assert.equal( output.join('\n'), out); + let data = fs.readFileSync(process.cwd() + '/tmp'); + assert.equal('src_1src_2', data.toString()); + cleanUpAndNext(); + }); + + test('rule w pattern w/o folder w/o namespace', function () { + let out = exec( './node_modules/.bin/jake -q tmp_p').toString().trim(); + let output = [ + "tmp_dep2.c task" + , "tmp_dep1.c task" + , "cp tmp_dep1.c tmp_dep1.oo task" + , "cp tmp_dep2.c tmp_dep2.oo task" + , "tmp pattern task"]; + let data; + assert.equal( output.join('\n'), out); + data = fs.readFileSync(process.cwd() + '/tmp_p'); + assert.equal('src_1src_2 pattern', data.toString()); + cleanUpAndNext(); + }); + + test('rule w pattern w folder w/o namespace', function () { + let out = exec( './node_modules/.bin/jake 
-q tmp_pf').toString().trim(); + let output = [ + "tmpsrc/tmp_dep1.c task" + , "cp tmpsrc/tmp_dep1.c tmpbin/tmp_dep1.oo task" + , "tmpsrc/tmp_dep2.c task" + , "cp tmpsrc/tmp_dep2.c tmpbin/tmp_dep2.oo task" + , "tmp pattern folder task"]; + let data; + assert.equal( output.join('\n'), out); + data = fs.readFileSync(process.cwd() + '/tmp_pf'); + assert.equal('src/src_1src/src_2 pattern folder', data.toString()); + cleanUpAndNext(); + }); + + test.skip('rule w pattern w folder w namespace', function () { + let out = exec( './node_modules/.bin/jake -q tmp_ns').toString().trim(); + let output = [ + "tmpsrc/file2.c init task" // yes + , "tmpsrc/tmp_dep2.c task" // no + , "cp tmpsrc/tmp_dep2.c tmpbin/tmp_dep2.oo task" // no + , "tmpsrc/dep1.c task" // no + , "cp tmpsrc/dep1.c tmpbin/dep1.oo ns task" // no + , "cp tmpsrc/file2.c tmpbin/file2.oo ns task" // yes + , "tmp pattern folder namespace task"]; // yes + let data; + assert.equal( output.join('\n'), out); + data = fs.readFileSync(process.cwd() + '/tmp_ns'); + assert.equal('src/src_1src/src_2src/src_3 pattern folder namespace', data.toString()); + cleanUpAndNext(); + }); + + test.skip('rule w chain w pattern w folder w namespace', function () { + let out = exec( './node_modules/.bin/jake -q tmp_cr').toString().trim(); + let output = [ + "chainrule init task" + , "cp tmpsrc/file1.tex tmpbin/file1.dvi tex->dvi task" + , "cp tmpbin/file1.dvi tmpbin/file1.pdf dvi->pdf task" + , "cp tmpsrc/file2.tex tmpbin/file2.dvi tex->dvi task" + , "cp tmpbin/file2.dvi tmpbin/file2.pdf dvi->pdf task" + , "tmp chainrule namespace task"]; + let data; + assert.equal( output.join('\n'), out); + data = fs.readFileSync(process.cwd() + '/tmp_cr'); + assert.equal('tex1 tex2 chainrule namespace', data.toString()); + cleanUpAndNext(); + }); + + + ['precedence', 'regexPattern', 'sourceFunction'].forEach(function (key) { + + test('rule with source file not created yet (' + key + ')', function () { + let write = process.stderr.write; + process.stderr.write = () => {}; + rmRf('foo.txt', {silent: true}); + rmRf('foo.html', {silent: true}); + try { + exec('./node_modules/.bin/jake ' + key + ':test'); + } + catch(err) { + // foo.txt prereq doesn't exist yet + assert.ok(err.message.indexOf('Unknown task "foo.html"') > -1); + } + process.stderr.write = write; + }); + + test('rule with source file now created (' + key + ')', function () { + fs.writeFileSync('foo.txt', ''); + let out = exec('./node_modules/.bin/jake -q ' + key + ':test').toString().trim(); + // Should run prereq and test task + let output = [ + 'created html' + , 'ran test' + ]; + assert.equal(output.join('\n'), out); + }); + + test('rule with source file modified (' + key + ')', function (next) { + setTimeout(function () { + fs.writeFileSync('foo.txt', ''); + let out = exec('./node_modules/.bin/jake -q ' + key + ':test').toString().trim(); + // Should again run both prereq and test task + let output = [ + 'created html' + , 'ran test' + ]; + assert.equal(output.join('\n'), out); + //next(); + cleanUpAndNext(next); + }, 1000); // Wait to do the touch to ensure mod-time is different + }); + + test('rule with existing objective file and no source ' + + ' (should be normal file-task) (' + key + ')', function () { + // Remove just the source file + fs.writeFileSync('foo.html', ''); + rmRf('foo.txt', {silent: true}); + let out = exec('./node_modules/.bin/jake -q ' + key + ':test').toString().trim(); + // Should treat existing objective file as plain file-task, + // and just run test-task + let output = [ + 'ran test' + 
]; + assert.equal(output.join('\n'), out); + cleanUpAndNext(); + }); + + }); + +}); + + diff --git a/node_modules/jake/test/integration/selfdep.js b/node_modules/jake/test/integration/selfdep.js new file mode 100644 index 000000000..22d58d125 --- /dev/null +++ b/node_modules/jake/test/integration/selfdep.js @@ -0,0 +1,39 @@ +let assert = require('assert'); +let exec = require('child_process').execSync; + +suite('selfDep', function () { + + this.timeout(7000); + + let origStderrWrite; + + setup(function () { + origStderrWrite = process.stderr.write; + process.stderr.write = function () {}; + }); + + teardown(function () { + process.stderr.write = origStderrWrite; + }); + + test('self dep const', function () { + try { + exec('./node_modules/.bin/jake selfdepconst'); + } + catch(e) { + assert(e.message.indexOf('dependency of itself') > -1) + } + }); + + test('self dep dyn', function () { + try { + exec('./node_modules/.bin/jake selfdepdyn'); + } + catch(e) { + assert(e.message.indexOf('dependency of itself') > -1) + } + }); + +}); + + diff --git a/node_modules/jake/test/integration/task_base.js b/node_modules/jake/test/integration/task_base.js new file mode 100644 index 000000000..36e20e883 --- /dev/null +++ b/node_modules/jake/test/integration/task_base.js @@ -0,0 +1,164 @@ +let assert = require('assert'); +let h = require('./helpers'); +let exec = require('child_process').execSync; + +suite('taskBase', function () { + + this.timeout(7000); + + test('default task', function () { + let out; + out = exec('./node_modules/.bin/jake -q').toString().trim(); + assert.equal(out, 'default task'); + out = exec('./node_modules/.bin/jake -q default').toString().trim(); + assert.equal(out, 'default task'); + }); + + test('task with no action', function () { + let out = exec('./node_modules/.bin/jake -q noAction').toString().trim(); + assert.equal(out, 'default task'); + }); + + test('a task with no action and no prereqs', function () { + exec('./node_modules/.bin/jake noActionNoPrereqs'); + }); + + test('a task that exists at the top-level, and not in the specified namespace, should error', function () { + let res = require('child_process').spawnSync('./node_modules/.bin/jake', + ['asdfasdfasdf:zerbofrangazoomy']); + let err = res.stderr.toString(); + assert.ok(err.indexOf('Unknown task' > -1)); + }); + + test('passing args to a task', function () { + let out = exec('./node_modules/.bin/jake -q argsEnvVars[foo,bar]').toString().trim(); + let parsed = h.parse(out); + let args = parsed.args; + assert.equal(args[0], 'foo'); + assert.equal(args[1], 'bar'); + }); + + test('a task with environment vars', function () { + let out = exec('./node_modules/.bin/jake -q argsEnvVars foo=bar baz=qux').toString().trim(); + let parsed = h.parse(out); + let env = parsed.env; + assert.equal(env.foo, 'bar'); + assert.equal(env.baz, 'qux'); + }); + + test('passing args and using environment vars', function () { + let out = exec('./node_modules/.bin/jake -q argsEnvVars[foo,bar] foo=bar baz=qux').toString().trim(); + let parsed = h.parse(out); + let args = parsed.args; + let env = parsed.env; + assert.equal(args[0], 'foo'); + assert.equal(args[1], 'bar'); + assert.equal(env.foo, 'bar'); + assert.equal(env.baz, 'qux'); + }); + + test('a simple prereq', function () { + let out = exec('./node_modules/.bin/jake -q foo:baz').toString().trim(); + assert.equal(out, 'foo:bar task\nfoo:baz task'); + }); + + test('a duplicate prereq only runs once', function () { + let out = exec('./node_modules/.bin/jake -q 
foo:asdf').toString().trim(); + assert.equal(out, 'foo:bar task\nfoo:baz task\nfoo:asdf task'); + }); + + test('a prereq with command-line args', function () { + let out = exec('./node_modules/.bin/jake -q foo:qux').toString().trim(); + assert.equal(out, 'foo:bar[asdf,qwer] task\nfoo:qux task'); + }); + + test('a prereq with args via invoke', function () { + let out = exec('./node_modules/.bin/jake -q foo:frang[zxcv,uiop]').toString().trim(); + assert.equal(out, 'foo:bar[zxcv,uiop] task\nfoo:frang task'); + }); + + test('a prereq with args via execute', function () { + let out = exec('./node_modules/.bin/jake -q foo:zerb[zxcv,uiop]').toString().trim(); + assert.equal(out, 'foo:bar[zxcv,uiop] task\nfoo:zerb task'); + }); + + test('repeating the task via execute', function () { + let out = exec('./node_modules/.bin/jake -q foo:voom').toString().trim(); + assert.equal(out, 'foo:bar task\nfoo:bar task\ncomplete\ncomplete'); + }); + + test('prereq execution-order', function () { + let out = exec('./node_modules/.bin/jake -q hoge:fuga').toString().trim(); + assert.equal(out, 'hoge:hoge task\nhoge:piyo task\nhoge:fuga task'); + }); + + test('basic async task', function () { + let out = exec('./node_modules/.bin/jake -q bar:bar').toString().trim(); + assert.equal(out, 'bar:foo task\nbar:bar task'); + }); + + test('promise async task', function () { + let out = exec('./node_modules/.bin/jake -q bar:dependOnpromise').toString().trim(); + assert.equal(out, 'bar:promise task\nbar:dependOnpromise task saw value 123654'); + }); + + test('failing promise async task', function () { + try { + exec('./node_modules/.bin/jake -q bar:brokenPromise'); + } + catch(e) { + assert(e.message.indexOf('Command failed') > -1); + } + }); + + test('that current-prereq index gets reset', function () { + let out = exec('./node_modules/.bin/jake -q hoge:kira').toString().trim(); + assert.equal(out, 'hoge:hoge task\nhoge:piyo task\nhoge:fuga task\n' + + 'hoge:charan task\nhoge:gero task\nhoge:kira task'); + }); + + test('modifying a task by adding prereq during execution', function () { + let out = exec('./node_modules/.bin/jake -q voom').toString().trim(); + assert.equal(out, 2); + }); + + test('listening for task error-event', function () { + try { + exec('./node_modules/.bin/jake -q vronk:groo').toString().trim(); + } + catch(e) { + assert(e.message.indexOf('OMFGZONG') > -1); + } + }); + + test('listening for jake error-event', function () { + let out = exec('./node_modules/.bin/jake -q throwy').toString().trim(); + assert(out.indexOf('Emitted\nError: I am bad') > -1); + }); + + test('listening for jake unhandledRejection-event', function () { + let out = exec('./node_modules/.bin/jake -q promiseRejecter').toString().trim(); + assert.equal(out, ''); + }); + + test('large number of same prereqs', function () { + let out = exec('./node_modules/.bin/jake -q large:same').toString().trim(); + assert.equal(out, 'large:leaf\nlarge:same'); + }); + + test('large number of different prereqs', function () { + let out = exec('./node_modules/.bin/jake -q large:different').toString().trim(); + assert.equal(out, 'leaf-12\nleaf-123\nlarge:different'); + }); + + test('large number of different prereqs', function () { + let out = exec('./node_modules/.bin/jake -q usingRequire:test').toString().trim(); + assert.equal(out, 'howdy test'); + }); + + test('modifying a namespace by adding a new task', function () { + let out = exec('./node_modules/.bin/jake -q one:two').toString().trim(); + assert.equal('one:one\none:two', out); + }); + +}); 
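The task_base.js suite above drives jake through its CLI and asserts on the task output it prints. As a minimal sketch (not one of the files added by this diff; the task names below are purely illustrative), a Jakefile using the same API would look like the following, relying on the desc/task/namespace globals that the jake CLI injects when it loads a Jakefile, just as test/integration/jakefile.js does.

// Illustrative Jakefile sketch (not part of this diff).
// Shows the prerequisite, positional-argument, and namespace patterns
// that the assertions in task_base.js exercise.

desc('Prerequisite task; runs before build.');
task('clean', function () {
  console.log('clean task');
});

desc('Runs clean first, then its own action.');
task('build', ['clean'], function () {
  console.log('build task');
});

desc('Accepts positional args, invoked as: jake greet[world]');
task('greet', function (name) {
  console.log('hello ' + name);
});

namespace('db', function () {
  desc('Namespaced task, invoked as db:migrate.');
  task('migrate', ['clean'], function () {
    console.log('db:migrate task');
  });
});

Run as "jake -q build", this sketch would print "clean task" followed by "build task", the same prerequisite ordering the suite checks for; "jake -q greet[world]" would print "hello world", matching the bracketed-argument syntax used in the argsEnvVars tests.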
diff --git a/node_modules/jake/test/unit/jakefile.js b/node_modules/jake/test/unit/jakefile.js new file mode 100644 index 000000000..89ff52331 --- /dev/null +++ b/node_modules/jake/test/unit/jakefile.js @@ -0,0 +1,36 @@ + +task('foo', function () { + console.log('ran top-level foo'); +}); + +task('bar', function () { + console.log('ran top-level bar'); +}); + +task('zerb', function () { + console.log('ran zerb'); +}); + +namespace('zooby', function () { + task('zerp', function () {}); + + task('derp', ['zerp'], function () {}); + + namespace('frang', function () { + + namespace('w00t', function () { + task('bar', function () { + console.log('ran zooby:frang:w00t:bar'); + }); + }); + + task('asdf', function () {}); + }); + +}); + +namespace('hurr', function () { + namespace('durr'); +}); + + diff --git a/node_modules/jake/test/unit/namespace.js b/node_modules/jake/test/unit/namespace.js new file mode 100644 index 000000000..c6b3ff56a --- /dev/null +++ b/node_modules/jake/test/unit/namespace.js @@ -0,0 +1,77 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +// Load the jake global +require(`${PROJECT_DIR}/lib/jake`); +let { Namespace } = require(`${PROJECT_DIR}/lib/namespace`); + +require('./jakefile'); + +let assert = require('assert'); + +suite('namespace', function () { + + this.timeout(7000); + + test('resolve namespace by relative name', function () { + let aaa, bbb, ccc; + aaa = namespace('aaa', function () { + bbb = namespace('bbb', function () { + ccc = namespace('ccc', function () { + }); + }); + }); + + assert.ok(aaa, Namespace.ROOT_NAMESPACE.resolveNamespace('aaa')); + assert.ok(bbb === aaa.resolveNamespace('bbb')); + assert.ok(ccc === aaa.resolveNamespace('bbb:ccc')); + }); + + test('resolve task in sub-namespace by relative path', function () { + let curr = Namespace.ROOT_NAMESPACE.resolveNamespace('zooby'); + let task = curr.resolveTask('frang:w00t:bar'); + assert.ok(task.action.toString().indexOf('zooby:frang:w00t:bar') > -1); + }); + + test('prefer local to top-level', function () { + let curr = Namespace.ROOT_NAMESPACE.resolveNamespace('zooby:frang:w00t'); + let task = curr.resolveTask('bar'); + assert.ok(task.action.toString().indexOf('zooby:frang:w00t:bar') > -1); + }); + + test('does resolve top-level', function () { + let curr = Namespace.ROOT_NAMESPACE.resolveNamespace('zooby:frang:w00t'); + let task = curr.resolveTask('foo'); + assert.ok(task.action.toString().indexOf('top-level foo') > -1); + }); + + test('absolute lookup works from sub-namespaces', function () { + let curr = Namespace.ROOT_NAMESPACE.resolveNamespace('hurr:durr'); + let task = curr.resolveTask('zooby:frang:w00t:bar'); + assert.ok(task.action.toString().indexOf('zooby:frang:w00t:bar') > -1); + }); + + test('resolution miss with throw error', function () { + let curr = Namespace.ROOT_NAMESPACE; + let task = curr.resolveTask('asdf:qwer'); + 
assert.ok(!task); + }); + +}); diff --git a/node_modules/jake/test/unit/parseargs.js b/node_modules/jake/test/unit/parseargs.js new file mode 100644 index 000000000..7a3ddd5ad --- /dev/null +++ b/node_modules/jake/test/unit/parseargs.js @@ -0,0 +1,169 @@ +/* + * Jake JavaScript build tool + * Copyright 2112 Matthew Eernisse (mde@fleegix.org) + * + * Licensed under the Apache License, Version 2.0 (the "License"); + * you may not use this file except in compliance with the License. + * You may obtain a copy of the License at + * + * http://www.apache.org/licenses/LICENSE-2.0 + * + * Unless required by applicable law or agreed to in writing, software + * distributed under the License is distributed on an "AS IS" BASIS, + * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + * See the License for the specific language governing permissions and + * limitations under the License. + * +*/ + +const PROJECT_DIR = process.env.PROJECT_DIR; + +let parseargs = require(`${PROJECT_DIR}/lib/parseargs`); +let assert = require('assert'); +let optsReg = [ + { full: 'directory', + abbr: 'C', + preempts: false, + expectValue: true + }, + { full: 'jakefile', + abbr: 'f', + preempts: false, + expectValue: true + }, + { full: 'tasks', + abbr: 'T', + preempts: true + }, + { full: 'tasks', + abbr: 'ls', + preempts: true + }, + { full: 'trace', + abbr: 't', + preempts: false, + expectValue: false + }, + { full: 'help', + abbr: 'h', + preempts: true + }, + { full: 'version', + abbr: 'V', + preempts: true + } +]; +let p = new parseargs.Parser(optsReg); +let z = function (s) { return s.split(' '); }; +let res; + +suite('parseargs', function () { + + test('long preemptive opt and val with equal-sign, ignore further opts', function () { + res = p.parse(z('--tasks=foo --jakefile=asdf')); + assert.equal('foo', res.opts.tasks); + assert.equal(undefined, res.opts.jakefile); + }); + + test('long preemptive opt and val without equal-sign, ignore further opts', function () { + res = p.parse(z('--tasks foo --jakefile=asdf')); + assert.equal('foo', res.opts.tasks); + assert.equal(undefined, res.opts.jakefile); + }); + + test('long preemptive opt and no val, ignore further opts', function () { + res = p.parse(z('--tasks --jakefile=asdf')); + assert.equal(true, res.opts.tasks); + assert.equal(undefined, res.opts.jakefile); + }); + + test('preemptive opt with no val, should be true', function () { + res = p.parse(z('-T')); + assert.equal(true, res.opts.tasks); + }); + + test('preemptive opt with no val, should be true and ignore further opts', function () { + res = p.parse(z('-T -f')); + assert.equal(true, res.opts.tasks); + assert.equal(undefined, res.opts.jakefile); + }); + + test('preemptive opt with val, should be val', function () { + res = p.parse(z('-T zoobie -f foo/bar/baz')); + assert.equal('zoobie', res.opts.tasks); + assert.equal(undefined, res.opts.jakefile); + }); + + test('-f expects a value, -t does not (howdy is task-name)', function () { + res = p.parse(z('-f zoobie -t howdy')); + assert.equal('zoobie', res.opts.jakefile); + assert.equal(true, res.opts.trace); + assert.equal('howdy', res.taskNames[0]); + }); + + test('different order, -f expects a value, -t does not (howdy is task-name)', function () { + res = p.parse(z('-f zoobie howdy -t')); + assert.equal('zoobie', res.opts.jakefile); + assert.equal(true, res.opts.trace); + assert.equal('howdy', res.taskNames[0]); + }); + + test('-f expects a value, -t does not (foo=bar is env var)', function () { + res = p.parse(z('-f zoobie -t foo=bar')); 
+ assert.equal('zoobie', res.opts.jakefile); + assert.equal(true, res.opts.trace); + assert.equal('bar', res.envVars.foo); + assert.equal(undefined, res.taskNames[0]); + }); + + test('-f expects a value, -t does not (foo=bar is env-var, task-name follows)', function () { + res = p.parse(z('-f zoobie -t howdy foo=bar')); + assert.equal('zoobie', res.opts.jakefile); + assert.equal(true, res.opts.trace); + assert.equal('bar', res.envVars.foo); + assert.equal('howdy', res.taskNames[0]); + }); + + test('-t does not expect a value, -f does (howdy is task-name)', function () { + res = p.parse(z('-t howdy -f zoobie')); + assert.equal(true, res.opts.trace); + assert.equal('zoobie', res.opts.jakefile); + assert.equal('howdy', res.taskNames[0]); + }); + + test('--trace does not expect a value, -f does (howdy is task-name)', function () { + res = p.parse(z('--trace howdy --jakefile zoobie')); + assert.equal(true, res.opts.trace); + assert.equal('zoobie', res.opts.jakefile); + assert.equal('howdy', res.taskNames[0]); + }); + + test('--trace does not expect a value (equal), -f does (throw howdy away)', function () { + res = p.parse(z('--trace=howdy --jakefile=zoobie')); + assert.equal(true, res.opts.trace); + assert.equal('zoobie', res.opts.jakefile); + assert.equal(undefined, res.taskNames[0]); + }); + + /* +, test('task-name with positional args', function () { + res = p.parse(z('foo:bar[asdf,qwer]')); + assert.equal('asdf', p.taskArgs[0]); + assert.equal('qwer', p.taskArgs[1]); + } + +, test('opts, env vars, task-name with positional args', function () { + res = p.parse(z('-f ./tests/Jakefile -t default[asdf,qwer] foo=bar')); + assert.equal('./tests/Jakefile', res.opts.jakefile); + assert.equal(true, res.opts.trace); + assert.equal('bar', res.envVars.foo); + assert.equal('default', res.taskName); + assert.equal('asdf', p.taskArgs[0]); + assert.equal('qwer', p.taskArgs[1]); + } +*/ + + +}); + + diff --git a/node_modules/jake/usage.txt b/node_modules/jake/usage.txt new file mode 100644 index 000000000..392b6d8f8 --- /dev/null +++ b/node_modules/jake/usage.txt @@ -0,0 +1,16 @@ +Jake JavaScript build tool +******************************************************************************** +If no flags are given, Jake looks for a Jakefile or Jakefile.js in the current directory. +******************************************************************************** +{Usage}: jake [options ...] [env variables ...] target + +{Options}: + -f, --jakefile FILE Use FILE as the Jakefile. + -C, --directory DIRECTORY Change to DIRECTORY before running tasks. + -B, --always-make Unconditionally make all targets. + -T/ls, --tasks Display the tasks (matching optional PATTERN) with descriptions, then exit. + -J, --jakelibdir JAKELIBDIR Auto-import any .jake files in JAKELIBDIR. (default is \'jakelib\') + -h, --help Display this help message. + -V/v, --version Display the Jake version. + -ar, --allow-rejection Keep running even after unhandled promise rejection + diff --git a/node_modules/joi/LICENSE.md b/node_modules/joi/LICENSE.md new file mode 100644 index 000000000..35577405a --- /dev/null +++ b/node_modules/joi/LICENSE.md @@ -0,0 +1,10 @@ +Copyright (c) 2012-2020, Sideway. Inc, and project contributors.
+Copyright (c) 2012-2014, Walmart.
+All rights reserved. + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: +* Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. +* Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. +* The names of any contributors may not be used to endorse or promote products derived from this software without specific prior written permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/joi/README.md b/node_modules/joi/README.md new file mode 100644 index 000000000..c5f092fb0 --- /dev/null +++ b/node_modules/joi/README.md @@ -0,0 +1,15 @@ +# joi + +#### The most powerful schema description language and data validator for JavaScript. + +## Installation +`npm install joi` + +### Visit the [joi.dev](https://joi.dev) Developer Portal for tutorials, documentation, and support + +## Useful resources + +- [Documentation and API](https://joi.dev/api/) +- [Versions status](https://joi.dev/resources/status/#joi) +- [Changelog](https://joi.dev/resources/changelog/) +- [Project policies](https://joi.dev/policies/) diff --git a/node_modules/joi/dist/joi-browser.min.js b/node_modules/joi/dist/joi-browser.min.js new file mode 100644 index 000000000..0288827cd --- /dev/null +++ b/node_modules/joi/dist/joi-browser.min.js @@ -0,0 +1 @@ +!function(e,t){"object"==typeof exports&&"object"==typeof module?module.exports=t():"function"==typeof define&&define.amd?define([],t):"object"==typeof exports?exports.joi=t():e.joi=t()}(self,(function(){return e={1238:e=>{"use strict";e.exports={version:"17.4.2"}},7629:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(9474),o=r(1687),i=r(8652),l=r(8160),c=r(3292),u=r(6354),f=r(8901),m=r(9708),h=r(6914),d=r(2294),p=r(6133),g=r(1152),y=r(8863),b=r(2036),v={Base:class{constructor(e){this.type=e,this.$_root=null,this._definition={},this._reset()}_reset(){this._ids=new d.Ids,this._preferences=null,this._refs=new p.Manager,this._cache=null,this._valids=null,this._invalids=null,this._flags={},this._rules=[],this._singleRules=new Map,this.$_terms={},this.$_temp={ruleset:null,whens:{}}}describe(){return s("function"==typeof m.describe,"Manifest functionality disabled"),m.describe(this)}allow(...e){return l.verifyFlat(e,"allow"),this._values(e,"_valids")}alter(e){s(e&&"object"==typeof e&&!Array.isArray(e),"Invalid targets argument"),s(!this._inRuleset(),"Cannot set alterations inside a ruleset");const t=this.clone();t.$_terms.alterations=t.$_terms.alterations||[];for(const r in e){const n=e[r];s("function"==typeof n,"Alteration adjuster for",r,"must be 
a function"),t.$_terms.alterations.push({target:r,adjuster:n})}return t.$_temp.ruleset=!1,t}artifact(e){return s(void 0!==e,"Artifact cannot be undefined"),s(!this._cache,"Cannot set an artifact with a rule cache"),this.$_setFlag("artifact",e)}cast(e){return s(!1===e||"string"==typeof e,"Invalid to value"),s(!1===e||this._definition.cast[e],"Type",this.type,"does not support casting to",e),this.$_setFlag("cast",!1===e?void 0:e)}default(e,t){return this._default("default",e,t)}description(e){return s(e&&"string"==typeof e,"Description must be a non-empty string"),this.$_setFlag("description",e)}empty(e){const t=this.clone();return void 0!==e&&(e=t.$_compile(e,{override:!1})),t.$_setFlag("empty",e,{clone:!1})}error(e){return s(e,"Missing error"),s(e instanceof Error||"function"==typeof e,"Must provide a valid Error object or a function"),this.$_setFlag("error",e)}example(e,t={}){return s(void 0!==e,"Missing example"),l.assertOptions(t,["override"]),this._inner("examples",e,{single:!0,override:t.override})}external(e,t){return"object"==typeof e&&(s(!t,"Cannot combine options with description"),t=e.description,e=e.method),s("function"==typeof e,"Method must be a function"),s(void 0===t||t&&"string"==typeof t,"Description must be a non-empty string"),this._inner("externals",{method:e,description:t},{single:!0})}failover(e,t){return this._default("failover",e,t)}forbidden(){return this.presence("forbidden")}id(e){return e?(s("string"==typeof e,"id must be a non-empty string"),s(/^[^\.]+$/.test(e),"id cannot contain period character"),this.$_setFlag("id",e)):this.$_setFlag("id",void 0)}invalid(...e){return this._values(e,"_invalids")}label(e){return s(e&&"string"==typeof e,"Label name must be a non-empty string"),this.$_setFlag("label",e)}meta(e){return s(void 0!==e,"Meta cannot be undefined"),this._inner("metas",e,{single:!0})}note(...e){s(e.length,"Missing notes");for(const t of e)s(t&&"string"==typeof t,"Notes must be non-empty strings");return this._inner("notes",e)}only(e=!0){return s("boolean"==typeof e,"Invalid mode:",e),this.$_setFlag("only",e)}optional(){return this.presence("optional")}prefs(e){s(e,"Missing preferences"),s(void 0===e.context,"Cannot override context"),s(void 0===e.externals,"Cannot override externals"),s(void 0===e.warnings,"Cannot override warnings"),s(void 0===e.debug,"Cannot override debug"),l.checkPreferences(e);const t=this.clone();return t._preferences=l.preferences(t._preferences,e),t}presence(e){return s(["optional","required","forbidden"].includes(e),"Unknown presence mode",e),this.$_setFlag("presence",e)}raw(e=!0){return this.$_setFlag("result",e?"raw":void 0)}result(e){return s(["raw","strip"].includes(e),"Unknown result mode",e),this.$_setFlag("result",e)}required(){return this.presence("required")}strict(e){const t=this.clone(),r=void 0!==e&&!e;return t._preferences=l.preferences(t._preferences,{convert:r}),t}strip(e=!0){return this.$_setFlag("result",e?"strip":void 0)}tag(...e){s(e.length,"Missing tags");for(const t of e)s(t&&"string"==typeof t,"Tags must be non-empty strings");return this._inner("tags",e)}unit(e){return s(e&&"string"==typeof e,"Unit name must be a non-empty string"),this.$_setFlag("unit",e)}valid(...e){l.verifyFlat(e,"valid");const t=this.allow(...e);return t.$_setFlag("only",!!t._valids,{clone:!1}),t}when(e,t){const r=this.clone();r.$_terms.whens||(r.$_terms.whens=[]);const n=c.when(r,e,t);if(!["any","link"].includes(r.type)){const e=n.is?[n]:n.switch;for(const t of e)s(!t.then||"any"===t.then.type||t.then.type===r.type,"Cannot 
combine",r.type,"with",t.then&&t.then.type),s(!t.otherwise||"any"===t.otherwise.type||t.otherwise.type===r.type,"Cannot combine",r.type,"with",t.otherwise&&t.otherwise.type)}return r.$_terms.whens.push(n),r.$_mutateRebuild()}cache(e){s(!this._inRuleset(),"Cannot set caching inside a ruleset"),s(!this._cache,"Cannot override schema cache"),s(void 0===this._flags.artifact,"Cannot cache a rule with an artifact");const t=this.clone();return t._cache=e||i.provider.provision(),t.$_temp.ruleset=!1,t}clone(){const e=Object.create(Object.getPrototypeOf(this));return this._assign(e)}concat(e){s(l.isSchema(e),"Invalid schema object"),s("any"===this.type||"any"===e.type||e.type===this.type,"Cannot merge type",this.type,"with another type:",e.type),s(!this._inRuleset(),"Cannot concatenate onto a schema with open ruleset"),s(!e._inRuleset(),"Cannot concatenate a schema with open ruleset");let t=this.clone();if("any"===this.type&&"any"!==e.type){const r=e.clone();for(const e of Object.keys(t))"type"!==e&&(r[e]=t[e]);t=r}t._ids.concat(e._ids),t._refs.register(e,p.toSibling),t._preferences=t._preferences?l.preferences(t._preferences,e._preferences):e._preferences,t._valids=b.merge(t._valids,e._valids,e._invalids),t._invalids=b.merge(t._invalids,e._invalids,e._valids);for(const r of e._singleRules.keys())t._singleRules.has(r)&&(t._rules=t._rules.filter((e=>e.keep||e.name!==r)),t._singleRules.delete(r));for(const r of e._rules)e._definition.rules[r.method].multi||t._singleRules.set(r.name,r),t._rules.push(r);if(t._flags.empty&&e._flags.empty){t._flags.empty=t._flags.empty.concat(e._flags.empty);const r=Object.assign({},e._flags);delete r.empty,o(t._flags,r)}else if(e._flags.empty){t._flags.empty=e._flags.empty;const r=Object.assign({},e._flags);delete r.empty,o(t._flags,r)}else o(t._flags,e._flags);for(const r in e.$_terms){const s=e.$_terms[r];s?t.$_terms[r]?t.$_terms[r]=t.$_terms[r].concat(s):t.$_terms[r]=s.slice():t.$_terms[r]||(t.$_terms[r]=s)}return this.$_root._tracer&&this.$_root._tracer._combine(t,[this,e]),t.$_mutateRebuild()}extend(e){return s(!e.base,"Cannot extend type with another base"),f.type(this,e)}extract(e){return e=Array.isArray(e)?e:e.split("."),this._ids.reach(e)}fork(e,t){s(!this._inRuleset(),"Cannot fork inside a ruleset");let r=this;for(let s of[].concat(e))s=Array.isArray(s)?s:s.split("."),r=r._ids.fork(s,t,r);return r.$_temp.ruleset=!1,r}rule(e){const t=this._definition;l.assertOptions(e,Object.keys(t.modifiers)),s(!1!==this.$_temp.ruleset,"Cannot apply rules to empty ruleset or the last rule added does not support rule properties");const r=null===this.$_temp.ruleset?this._rules.length-1:this.$_temp.ruleset;s(r>=0&&rt.tailor(e),ref:!1}),t.$_temp.ruleset=!1,t.$_mutateRebuild()}tracer(){return g.location?g.location(this):this}validate(e,t){return y.entry(e,this,t)}validateAsync(e,t){return y.entryAsync(e,this,t)}$_addRule(e){"string"==typeof e&&(e={name:e}),s(e&&"object"==typeof e,"Invalid options"),s(e.name&&"string"==typeof e.name,"Invalid rule name");for(const t in e)s("_"!==t[0],"Cannot set private rule properties");const t=Object.assign({},e);t._resolve=[],t.method=t.method||t.name;const r=this._definition.rules[t.method],n=t.args;s(r,"Unknown rule",t.method);const a=this.clone();if(n){s(1===Object.keys(n).length||Object.keys(n).length===this._definition.rules[t.name].args.length,"Invalid rule definition for",this.type,t.name);for(const e in n){let o=n[e];if(void 0!==o){if(r.argsByName){const 
i=r.argsByName.get(e);if(i.ref&&l.isResolvable(o))t._resolve.push(e),a.$_mutateRegister(o);else if(i.normalize&&(o=i.normalize(o),n[e]=o),i.assert){const t=l.validateArg(o,e,i);s(!t,t,"or reference")}}n[e]=o}else delete n[e]}}return r.multi||(a._ruleRemove(t.name,{clone:!1}),a._singleRules.set(t.name,t)),!1===a.$_temp.ruleset&&(a.$_temp.ruleset=null),r.priority?a._rules.unshift(t):a._rules.push(t),a}$_compile(e,t){return c.schema(this.$_root,e,t)}$_createError(e,t,r,s,n,a={}){const o=!1!==a.flags?this._flags:{},i=a.messages?h.merge(this._definition.messages,a.messages):this._definition.messages;return new u.Report(e,t,r,o,i,s,n)}$_getFlag(e){return this._flags[e]}$_getRule(e){return this._singleRules.get(e)}$_mapLabels(e){return e=Array.isArray(e)?e:e.split("."),this._ids.labels(e)}$_match(e,t,r,s){(r=Object.assign({},r)).abortEarly=!0,r._externals=!1,t.snapshot();const n=!y.validate(e,this,t,r,s).errors;return t.restore(),n}$_modify(e){return l.assertOptions(e,["each","once","ref","schema"]),d.schema(this,e)||this}$_mutateRebuild(){return s(!this._inRuleset(),"Cannot add this rule inside a ruleset"),this._refs.reset(),this._ids.reset(),this.$_modify({each:(e,{source:t,name:r,path:s,key:n})=>{const a=this._definition[t][r]&&this._definition[t][r].register;!1!==a&&this.$_mutateRegister(e,{family:a,key:n})}}),this._definition.rebuild&&this._definition.rebuild(this),this.$_temp.ruleset=!1,this}$_mutateRegister(e,{family:t,key:r}={}){this._refs.register(e,t),this._ids.register(e,{key:r})}$_property(e){return this._definition.properties[e]}$_reach(e){return this._ids.reach(e)}$_rootReferences(){return this._refs.roots()}$_setFlag(e,t,r={}){s("_"===e[0]||!this._inRuleset(),"Cannot set flag inside a ruleset");const n=this._definition.flags[e]||{};if(a(t,n.default)&&(t=void 0),a(t,this._flags[e]))return this;const o=!1!==r.clone?this.clone():this;return void 0!==t?(o._flags[e]=t,o.$_mutateRegister(t)):delete o._flags[e],"_"!==e[0]&&(o.$_temp.ruleset=!1),o}$_parent(e,...t){return this[e][l.symbols.parent].call(this,...t)}$_validate(e,t,r){return y.validate(e,this,t,r)}_assign(e){e.type=this.type,e.$_root=this.$_root,e.$_temp=Object.assign({},this.$_temp),e.$_temp.whens={},e._ids=this._ids.clone(),e._preferences=this._preferences,e._valids=this._valids&&this._valids.clone(),e._invalids=this._invalids&&this._invalids.clone(),e._rules=this._rules.slice(),e._singleRules=n(this._singleRules,{shallow:!0}),e._refs=this._refs.clone(),e._flags=Object.assign({},this._flags),e._cache=null,e.$_terms={};for(const t in this.$_terms)e.$_terms[t]=this.$_terms[t]?this.$_terms[t].slice():null;e.$_super={};for(const t in this.$_super)e.$_super[t]=this._super[t].bind(e);return e}_bare(){const e=this.clone();e._reset();const t=e._definition.terms;for(const r in t){const s=t[r];e.$_terms[r]=s.init}return e.$_mutateRebuild()}_default(e,t,r={}){return l.assertOptions(r,"literal"),s(void 0!==t,"Missing",e,"value"),s("function"==typeof t||!r.literal,"Only function value supports literal option"),"function"==typeof t&&r.literal&&(t={[l.symbols.literal]:!0,literal:t}),this.$_setFlag(e,t)}_generate(e,t,r){if(!this.$_terms.whens)return{schema:this};const s=[],n=[];for(let a=0;ac)break}const a=n.join(", ");if(t.mainstay.tracer.debug(t,"rule","when",a),!a)return{schema:this};if(!t.mainstay.tracer.active&&this.$_temp.whens[a])return{schema:this.$_temp.whens[a],id:a};let o=this;this._definition.generate&&(o=this._definition.generate(this,e,t,r));for(const e of s)o=o.concat(e);return 
this.$_root._tracer&&this.$_root._tracer._combine(o,[this,...s]),this.$_temp.whens[a]=o,{schema:o,id:a}}_inner(e,t,r={}){s(!this._inRuleset(),"Cannot set ".concat(e," inside a ruleset"));const n=this.clone();return n.$_terms[e]&&!r.override||(n.$_terms[e]=[]),r.single?n.$_terms[e].push(t):n.$_terms[e].push(...t),n.$_temp.ruleset=!1,n}_inRuleset(){return null!==this.$_temp.ruleset&&!1!==this.$_temp.ruleset}_ruleRemove(e,t={}){if(!this._singleRules.has(e))return this;const r=!1!==t.clone?this.clone():this;r._singleRules.delete(e);const s=[];for(let t=0;t{"use strict";const s=r(375),n=r(8571),a=r(8160),o={max:1e3,supported:new Set(["undefined","boolean","number","string"])};t.provider={provision:e=>new o.Cache(e)},o.Cache=class{constructor(e={}){a.assertOptions(e,["max"]),s(void 0===e.max||e.max&&e.max>0&&isFinite(e.max),"Invalid max cache size"),this._max=e.max||o.max,this._map=new Map,this._list=new o.List}get length(){return this._map.size}set(e,t){if(null!==e&&!o.supported.has(typeof e))return;let r=this._map.get(e);if(r)return r.value=t,void this._list.first(r);r=this._list.unshift({key:e,value:t}),this._map.set(e,r),this._compact()}get(e){const t=this._map.get(e);if(t)return this._list.first(t),n(t.value)}_compact(){if(this._map.size>this._max){const e=this._list.pop();this._map.delete(e.key)}}},o.List=class{constructor(){this.tail=null,this.head=null}unshift(e){return e.next=null,e.prev=this.head,this.head&&(this.head.next=e),this.head=e,this.tail||(this.tail=e),e}first(e){e!==this.head&&(this._remove(e),this.unshift(e))}pop(){return this._remove(this.tail)}_remove(e){const{next:t,prev:r}=e;return t.prev=r,r&&(r.next=t),e===this.tail&&(this.tail=t),e.prev=null,e.next=null,e}}},8160:(e,t,r)=>{"use strict";const s=r(375),n=r(7916),a=r(1238);let o,i;const l={isoDate:/^(?:[-+]\d{2})?(?:\d{4}(?!\d{2}\b))(?:(-?)(?:(?:0[1-9]|1[0-2])(?:\1(?:[12]\d|0[1-9]|3[01]))?|W(?:[0-4]\d|5[0-2])(?:-?[1-7])?|(?:00[1-9]|0[1-9]\d|[12]\d{2}|3(?:[0-5]\d|6[1-6])))(?![T]$|[T][\d]+Z$)(?:[T\s](?:(?:(?:[01]\d|2[0-3])(?:(:?)[0-5]\d)?|24\:?00)(?:[.,]\d+(?!:))?)(?:\2[0-5]\d(?:[.,]\d+)?)?(?:[Z]|(?:[+-])(?:[01]\d|2[0-3])(?::?[0-5]\d)?)?)?)?$/};t.version=a.version,t.defaults={abortEarly:!0,allowUnknown:!1,artifacts:!1,cache:!0,context:null,convert:!0,dateFormat:"iso",errors:{escapeHtml:!1,label:"path",language:null,render:!0,stack:!1,wrap:{label:'"',array:"[]"}},externals:!0,messages:{},nonEnumerables:!1,noDefaults:!1,presence:"optional",skipFunctions:!1,stripUnknown:!1,warnings:!1},t.symbols={any:Symbol.for("@hapi/joi/schema"),arraySingle:Symbol("arraySingle"),deepDefault:Symbol("deepDefault"),errors:Symbol("errors"),literal:Symbol("literal"),override:Symbol("override"),parent:Symbol("parent"),prefs:Symbol("prefs"),ref:Symbol("ref"),template:Symbol("template"),values:Symbol("values")},t.assertOptions=function(e,t,r="Options"){s(e&&"object"==typeof e&&!Array.isArray(e),"Options must be of type object");const n=Object.keys(e).filter((e=>!t.includes(e)));s(0===n.length,"".concat(r," contain unknown keys: ").concat(n))},t.checkPreferences=function(e){i=i||r(3378);const t=i.preferences.validate(e);if(t.error)throw new n([t.error.details[0].message])},t.compare=function(e,t,r){switch(r){case"=":return e===t;case">":return e>t;case"<":return e=":return e>=t;case"<=":return e<=t}},t.default=function(e,t){return void 0===e?t:e},t.isIsoDate=function(e){return l.isoDate.test(e)},t.isNumber=function(e){return"number"==typeof 
e&&!isNaN(e)},t.isResolvable=function(e){return!!e&&(e[t.symbols.ref]||e[t.symbols.template])},t.isSchema=function(e,r={}){const n=e&&e[t.symbols.any];return!!n&&(s(r.legacy||n.version===t.version,"Cannot mix different versions of joi schemas"),!0)},t.isValues=function(e){return e[t.symbols.values]},t.limit=function(e){return Number.isSafeInteger(e)&&e>=0},t.preferences=function(e,s){o=o||r(6914),e=e||{},s=s||{};const n=Object.assign({},e,s);return s.errors&&e.errors&&(n.errors=Object.assign({},e.errors,s.errors),n.errors.wrap=Object.assign({},e.errors.wrap,s.errors.wrap)),s.messages&&(n.messages=o.compile(s.messages,e.messages)),delete n[t.symbols.prefs],n},t.tryWithPath=function(e,t,r={}){try{return e()}catch(e){throw void 0!==e.path?e.path=t+"."+e.path:e.path=t,r.append&&(e.message="".concat(e.message," (").concat(e.path,")")),e}},t.validateArg=function(e,r,{assert:s,message:n}){if(t.isSchema(s)){const t=s.validate(e);if(!t.error)return;return t.error.message}if(!s(e))return r?"".concat(r," ").concat(n):n},t.verifyFlat=function(e,t){for(const r of e)s(!Array.isArray(r),"Method no longer accepts array arguments:",t)}},3292:(e,t,r)=>{"use strict";const s=r(375),n=r(8160),a=r(6133),o={};t.schema=function(e,t,r={}){n.assertOptions(r,["appendPath","override"]);try{return o.schema(e,t,r)}catch(e){throw r.appendPath&&void 0!==e.path&&(e.message="".concat(e.message," (").concat(e.path,")")),e}},o.schema=function(e,t,r){s(void 0!==t,"Invalid undefined schema"),Array.isArray(t)&&(s(t.length,"Invalid empty array schema"),1===t.length&&(t=t[0]));const a=(t,...s)=>!1!==r.override?t.valid(e.override,...s):t.valid(...s);if(o.simple(t))return a(e,t);if("function"==typeof t)return e.custom(t);if(s("object"==typeof t,"Invalid schema content:",typeof t),n.isResolvable(t))return a(e,t);if(n.isSchema(t))return t;if(Array.isArray(t)){for(const r of t)if(!o.simple(r))return e.alternatives().try(...t);return a(e,...t)}return t instanceof RegExp?e.string().regex(t):t instanceof Date?a(e.date(),t):(s(Object.getPrototypeOf(t)===Object.getPrototypeOf({}),"Schema can only contain plain objects"),e.object().keys(t))},t.ref=function(e,t){return a.isRef(e)?e:a.create(e,t)},t.compile=function(e,r,a={}){n.assertOptions(a,["legacy"]);const i=r&&r[n.symbols.any];if(i)return s(a.legacy||i.version===n.version,"Cannot mix different versions of joi schemas:",i.version,n.version),r;if("object"!=typeof r||!a.legacy)return t.schema(e,r,{appendPath:!0});const l=o.walk(r);return l?l.compile(l.root,r):t.schema(e,r,{appendPath:!0})},o.walk=function(e){if("object"!=typeof e)return null;if(Array.isArray(e)){for(const t of e){const e=o.walk(t);if(e)return e}return null}const t=e[n.symbols.any];if(t)return{root:e[t.root],compile:t.compile};s(Object.getPrototypeOf(e)===Object.getPrototypeOf({}),"Schema can only contain plain objects");for(const t in e){const r=o.walk(e[t]);if(r)return r}return null},o.simple=function(e){return null===e||["boolean","string","number"].includes(typeof e)},t.when=function(e,r,i){if(void 0===i&&(s(r&&"object"==typeof r,"Missing options"),i=r,r=a.create(".")),Array.isArray(i)&&(i={switch:i}),n.assertOptions(i,["is","not","then","otherwise","switch","break"]),n.isSchema(r))return s(void 0===i.is,'"is" can not be used with a schema condition'),s(void 0===i.not,'"not" can not be used with a schema condition'),s(void 0===i.switch,'"switch" can not be used with a schema condition'),o.condition(e,{is:r,then:i.then,otherwise:i.otherwise,break:i.break});if(s(a.isRef(r)||"string"==typeof r,"Invalid 
condition:",r),s(void 0===i.not||void 0===i.is,'Cannot combine "is" with "not"'),void 0===i.switch){let l=i;void 0!==i.not&&(l={is:i.not,then:i.otherwise,otherwise:i.then,break:i.break});let c=void 0!==l.is?e.$_compile(l.is):e.$_root.invalid(null,!1,0,"").required();return s(void 0!==l.then||void 0!==l.otherwise,'options must have at least one of "then", "otherwise", or "switch"'),s(void 0===l.break||void 0===l.then||void 0===l.otherwise,"Cannot specify then, otherwise, and break all together"),void 0===i.is||a.isRef(i.is)||n.isSchema(i.is)||(c=c.required()),o.condition(e,{ref:t.ref(r),is:c,then:l.then,otherwise:l.otherwise,break:l.break})}s(Array.isArray(i.switch),'"switch" must be an array'),s(void 0===i.is,'Cannot combine "switch" with "is"'),s(void 0===i.not,'Cannot combine "switch" with "not"'),s(void 0===i.then,'Cannot combine "switch" with "then"');const l={ref:t.ref(r),switch:[],break:i.break};for(let t=0;t{"use strict";const s=r(5688),n=r(8160),a=r(3328);t.Report=class{constructor(e,r,s,n,a,o,i){if(this.code=e,this.flags=n,this.messages=a,this.path=o.path,this.prefs=i,this.state=o,this.value=r,this.message=null,this.template=null,this.local=s||{},this.local.label=t.label(this.flags,this.state,this.prefs,this.messages),void 0===this.value||this.local.hasOwnProperty("value")||(this.local.value=this.value),this.path.length){const e=this.path[this.path.length-1];"object"!=typeof e&&(this.local.key=e)}}_setTemplate(e){if(this.template=e,!this.flags.label&&0===this.path.length){const e=this._template(this.template,"root");e&&(this.local.label=e)}}toString(){if(this.message)return this.message;const e=this.code;if(!this.prefs.errors.render)return this.code;const t=this._template(this.template)||this._template(this.prefs.messages)||this._template(this.messages);return void 0===t?'Error code "'.concat(e,'" is not defined, your custom type is missing the correct messages definition'):(this.message=t.render(this.value,this.state,this.prefs,this.local,{errors:this.prefs.errors,messages:[this.prefs.messages,this.messages]}),this.prefs.errors.label||(this.message=this.message.replace(/^"" /,"").trim()),this.message)}_template(e,r){return t.template(this.value,e,r||this.code,this.state,this.prefs)}},t.path=function(e){let t="";for(const r of e)"object"!=typeof r&&("string"==typeof r?(t&&(t+="."),t+=r):t+="[".concat(r,"]"));return t},t.template=function(e,t,r,s,o){if(!t)return;if(a.isTemplate(t))return"root"!==r?t:null;let i=o.errors.language;return n.isResolvable(i)&&(i=i.resolve(e,s,o)),i&&t[i]&&void 0!==t[i][r]?t[i][r]:t[r]},t.label=function(e,r,s,n){if(e.label)return e.label;if(!s.errors.label)return"";let a=r.path;"key"===s.errors.label&&r.path.length>1&&(a=r.path.slice(-1));return t.path(a)||t.template(null,s.messages,"root",r,s)||n&&t.template(null,n,"root",r,s)||"value"},t.process=function(e,r,s){if(!e)return null;const{override:n,message:a,details:o}=t.details(e);if(n)return n;if(s.errors.stack)return new t.ValidationError(a,o,r);const i=Error.stackTraceLimit;Error.stackTraceLimit=0;const l=new t.ValidationError(a,o,r);return Error.stackTraceLimit=i,l},t.details=function(e,t={}){let r=[];const s=[];for(const n of e){if(n instanceof Error){if(!1!==t.override)return{override:n};const e=n.toString();r.push(e),s.push({message:e,type:"override",context:{error:n}});continue}const e=n.toString();r.push(e),s.push({message:e,path:n.path.filter((e=>"object"!=typeof e)),type:n.code,context:n.local})}return r.length>1&&(r=[...new Set(r)]),{message:r.join(". 
"),details:s}},t.ValidationError=class extends Error{constructor(e,t,r){super(e),this._original=r,this.details=t}static isError(e){return e instanceof t.ValidationError}},t.ValidationError.prototype.isJoi=!0,t.ValidationError.prototype.name="ValidationError",t.ValidationError.prototype.annotate=s.error},8901:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(8160),o=r(6914),i={};t.type=function(e,t){const r=Object.getPrototypeOf(e),l=n(r),c=e._assign(Object.create(l)),u=Object.assign({},t);delete u.base,l._definition=u;const f=r._definition||{};u.messages=o.merge(f.messages,u.messages),u.properties=Object.assign({},f.properties,u.properties),c.type=u.type,u.flags=Object.assign({},f.flags,u.flags);const m=Object.assign({},f.terms);if(u.terms)for(const e in u.terms){const t=u.terms[e];s(void 0===c.$_terms[e],"Invalid term override for",u.type,e),c.$_terms[e]=t.init,m[e]=t}u.terms=m,u.args||(u.args=f.args),u.prepare=i.prepare(u.prepare,f.prepare),u.coerce&&("function"==typeof u.coerce&&(u.coerce={method:u.coerce}),u.coerce.from&&!Array.isArray(u.coerce.from)&&(u.coerce={method:u.coerce.method,from:[].concat(u.coerce.from)})),u.coerce=i.coerce(u.coerce,f.coerce),u.validate=i.validate(u.validate,f.validate);const h=Object.assign({},f.rules);if(u.rules)for(const e in u.rules){const t=u.rules[e];s("object"==typeof t,"Invalid rule definition for",u.type,e);let r=t.method;if(void 0===r&&(r=function(){return this.$_addRule(e)}),r&&(s(!l[e],"Rule conflict in",u.type,e),l[e]=r),s(!h[e],"Rule conflict in",u.type,e),h[e]=t,t.alias){const e=[].concat(t.alias);for(const r of e)l[r]=t.method}t.args&&(t.argsByName=new Map,t.args=t.args.map((e=>("string"==typeof e&&(e={name:e}),s(!t.argsByName.has(e.name),"Duplicated argument name",e.name),a.isSchema(e.assert)&&(e.assert=e.assert.strict().label(e.name)),t.argsByName.set(e.name,e),e))))}u.rules=h;const d=Object.assign({},f.modifiers);if(u.modifiers)for(const e in u.modifiers){s(!l[e],"Rule conflict in",u.type,e);const t=u.modifiers[e];s("function"==typeof t,"Invalid modifier definition for",u.type,e);const r=function(t){return this.rule({[e]:t})};l[e]=r,d[e]=t}if(u.modifiers=d,u.overrides){l._super=r,c.$_super={};for(const e in u.overrides)s(r[e],"Cannot override missing",e),u.overrides[e][a.symbols.parent]=r[e],c.$_super[e]=r[e].bind(c);Object.assign(l,u.overrides)}u.cast=Object.assign({},f.cast,u.cast);const p=Object.assign({},f.manifest,u.manifest);return p.build=i.build(u.manifest&&u.manifest.build,f.manifest&&f.manifest.build),u.manifest=p,u.rebuild=i.rebuild(u.rebuild,f.rebuild),c},i.build=function(e,t){return e&&t?function(r,s){return t(e(r,s),s)}:e||t},i.coerce=function(e,t){return e&&t?{from:e.from&&t.from?[...new Set([...e.from,...t.from])]:null,method(r,s){let n;if((!t.from||t.from.includes(typeof r))&&(n=t.method(r,s),n)){if(n.errors||void 0===n.value)return n;r=n.value}if(!e.from||e.from.includes(typeof r)){const t=e.method(r,s);if(t)return t}return n}}:e||t},i.prepare=function(e,t){return e&&t?function(r,s){const n=e(r,s);if(n){if(n.errors||void 0===n.value)return n;r=n.value}return t(r,s)||n}:e||t},i.rebuild=function(e,t){return e&&t?function(r){t(r),e(r)}:e||t},i.validate=function(e,t){return e&&t?function(r,s){const n=t(r,s);if(n){if(n.errors&&(!Array.isArray(n.errors)||n.errors.length))return n;r=n.value}return e(r,s)||n}:e||t}},5107:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(8652),o=r(8160),i=r(3292),l=r(6354),c=r(8901),u=r(9708),f=r(6133),m=r(3328),h=r(1152);let d;const 
p={types:{alternatives:r(4946),any:r(8068),array:r(546),boolean:r(4937),date:r(7500),function:r(390),link:r(8785),number:r(3832),object:r(8966),string:r(7417),symbol:r(8826)},aliases:{alt:"alternatives",bool:"boolean",func:"function"},root:function(){const e={_types:new Set(Object.keys(p.types))};for(const t of e._types)e[t]=function(...e){return s(!e.length||["alternatives","link","object"].includes(t),"The",t,"type does not allow arguments"),p.generate(this,p.types[t],e)};for(const t of["allow","custom","disallow","equal","exist","forbidden","invalid","not","only","optional","options","prefs","preferences","required","strip","valid","when"])e[t]=function(...e){return this.any()[t](...e)};Object.assign(e,p.methods);for(const t in p.aliases){const r=p.aliases[t];e[t]=e[r]}return e.x=e.expression,h.setup&&h.setup(e),e}};p.methods={ValidationError:l.ValidationError,version:o.version,cache:a.provider,assert(e,t,...r){p.assert(e,t,!0,r)},attempt:(e,t,...r)=>p.assert(e,t,!1,r),build(e){return s("function"==typeof u.build,"Manifest functionality disabled"),u.build(this,e)},checkPreferences(e){o.checkPreferences(e)},compile(e,t){return i.compile(this,e,t)},defaults(e){s("function"==typeof e,"modifier must be a function");const t=Object.assign({},this);for(const r of t._types){const n=e(t[r]());s(o.isSchema(n),"modifier must return a valid schema object"),t[r]=function(...e){return p.generate(this,n,e)}}return t},expression:(...e)=>new m(...e),extend(...e){o.verifyFlat(e,"extend"),d=d||r(3378),s(e.length,"You need to provide at least one extension"),this.assert(e,d.extensions);const t=Object.assign({},this);t._types=new Set(t._types);for(let r of e){"function"==typeof r&&(r=r(t)),this.assert(r,d.extension);const e=p.expandExtension(r,t);for(const r of e){s(void 0===t[r.type]||t._types.has(r.type),"Cannot override name",r.type);const e=r.base||this.any(),n=c.type(e,r);t._types.add(r.type),t[r.type]=function(...e){return p.generate(this,n,e)}}}return t},isError:l.ValidationError.isError,isExpression:m.isTemplate,isRef:f.isRef,isSchema:o.isSchema,in:(...e)=>f.in(...e),override:o.symbols.override,ref:(...e)=>f.create(...e),types(){const e={};for(const t of this._types)e[t]=this[t]();for(const t in p.aliases)e[t]=this[t]();return e}},p.assert=function(e,t,r,s){const a=s[0]instanceof Error||"string"==typeof s[0]?s[0]:null,i=a?s[1]:s[0],c=t.validate(e,o.preferences({errors:{stack:!0}},i||{}));let u=c.error;if(!u)return c.value;if(a instanceof Error)throw a;const f=r&&"function"==typeof u.annotate?u.annotate():u.message;throw u instanceof l.ValidationError==0&&(u=n(u)),u.message=a?"".concat(a," ").concat(f):f,u},p.generate=function(e,t,r){return s(e,"Must be invoked on a Joi instance."),t.$_root=e,t._definition.args&&r.length?t._definition.args(t,...r):t},p.expandExtension=function(e,t){if("string"==typeof e.type)return[e];const r=[];for(const s of t._types)if(e.type.test(s)){const n=Object.assign({},e);n.type=s,n.base=t[s](),r.push(n)}return r},e.exports=p.root()},6914:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(3328);t.compile=function(e,t){if("string"==typeof e)return s(!t,"Cannot set single message string"),new a(e);if(a.isTemplate(e))return s(!t,"Cannot set single message template"),e;s("object"==typeof e&&!Array.isArray(e),"Invalid message options"),t=t?n(t):{};for(let r in e){const n=e[r];if("root"===r||a.isTemplate(n)){t[r]=n;continue}if("string"==typeof n){t[r]=new a(n);continue}s("object"==typeof n&&!Array.isArray(n),"Invalid message for",r);const o=r;for(r in t[o]=t[o]||{},n){const 
e=n[r];"root"===r||a.isTemplate(e)?t[o][r]=e:(s("string"==typeof e,"Invalid message for",r,"in",o),t[o][r]=new a(e))}}return t},t.decompile=function(e){const t={};for(let r in e){const s=e[r];if("root"===r){t[r]=s;continue}if(a.isTemplate(s)){t[r]=s.describe({compact:!0});continue}const n=r;for(r in t[n]={},s){const e=s[r];t[n][r]="root"!==r?e.describe({compact:!0}):e}}return t},t.merge=function(e,r){if(!e)return t.compile(r);if(!r)return e;if("string"==typeof r)return new a(r);if(a.isTemplate(r))return r;const o=n(e);for(let e in r){const t=r[e];if("root"===e||a.isTemplate(t)){o[e]=t;continue}if("string"==typeof t){o[e]=new a(t);continue}s("object"==typeof t&&!Array.isArray(t),"Invalid message for",e);const n=e;for(e in o[n]=o[n]||{},t){const r=t[e];"root"===e||a.isTemplate(r)?o[n][e]=r:(s("string"==typeof r,"Invalid message for",e,"in",n),o[n][e]=new a(r))}}return o}},2294:(e,t,r)=>{"use strict";function s(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var s=Object.getOwnPropertySymbols(e);t&&(s=s.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,s)}return r}function n(e){for(var t=1;t{if(r===(e._flags.id||t))return s},ref:!1});return n?n.$_mutateRebuild():e},t.schema=function(e,t){let r;for(const s in e._flags){if("_"===s[0])continue;const n=c.scan(e._flags[s],{source:"flags",name:s},t);void 0!==n&&(r=r||e.clone(),r._flags[s]=n)}for(let s=0;s{"use strict";function s(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var s=Object.getOwnPropertySymbols(e);t&&(s=s.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,s)}return r}function n(e){for(var t=1;t=0&&this.refs.push({ancestor:t.ancestor-s,root:t.root});else t.isRef(e)&&"value"===e.type&&e.ancestor-s>=0&&this.refs.push({ancestor:e.ancestor-s,root:e.root}),u=u||r(3328),u.isTemplate(e)&&this.register(e.refs(),s)}get length(){return this.refs.length}clone(){const e=new t.Manager;return e.refs=i(this.refs),e}reset(){this.refs=[]}roots(){return this.refs.filter((e=>!e.ancestor)).map((e=>e.root))}}},3378:(e,t,r)=>{"use strict";const 
s=r(5107),n={};n.wrap=s.string().min(1).max(2).allow(!1),t.preferences=s.object({allowUnknown:s.boolean(),abortEarly:s.boolean(),artifacts:s.boolean(),cache:s.boolean(),context:s.object(),convert:s.boolean(),dateFormat:s.valid("date","iso","string","time","utc"),debug:s.boolean(),errors:{escapeHtml:s.boolean(),label:s.valid("path","key",!1),language:[s.string(),s.object().ref()],render:s.boolean(),stack:s.boolean(),wrap:{label:n.wrap,array:n.wrap}},externals:s.boolean(),messages:s.object(),noDefaults:s.boolean(),nonEnumerables:s.boolean(),presence:s.valid("required","optional","forbidden"),skipFunctions:s.boolean(),stripUnknown:s.object({arrays:s.boolean(),objects:s.boolean()}).or("arrays","objects").allow(!0,!1),warnings:s.boolean()}).strict(),n.nameRx=/^[a-zA-Z0-9]\w*$/,n.rule=s.object({alias:s.array().items(s.string().pattern(n.nameRx)).single(),args:s.array().items(s.string(),s.object({name:s.string().pattern(n.nameRx).required(),ref:s.boolean(),assert:s.alternatives([s.function(),s.object().schema()]).conditional("ref",{is:!0,then:s.required()}),normalize:s.function(),message:s.string().when("assert",{is:s.function(),then:s.required()})})),convert:s.boolean(),manifest:s.boolean(),method:s.function().allow(!1),multi:s.boolean(),validate:s.function()}),t.extension=s.object({type:s.alternatives([s.string(),s.object().regex()]).required(),args:s.function(),cast:s.object().pattern(n.nameRx,s.object({from:s.function().maxArity(1).required(),to:s.function().minArity(1).maxArity(2).required()})),base:s.object().schema().when("type",{is:s.object().regex(),then:s.forbidden()}),coerce:[s.function().maxArity(3),s.object({method:s.function().maxArity(3).required(),from:s.array().items(s.string()).single()})],flags:s.object().pattern(n.nameRx,s.object({setter:s.string(),default:s.any()})),manifest:{build:s.function().arity(2)},messages:[s.object(),s.string()],modifiers:s.object().pattern(n.nameRx,s.function().minArity(1).maxArity(2)),overrides:s.object().pattern(n.nameRx,s.function()),prepare:s.function().maxArity(3),rebuild:s.function().arity(1),rules:s.object().pattern(n.nameRx,n.rule),terms:s.object().pattern(n.nameRx,s.object({init:s.array().allow(null).required(),manifest:s.object().pattern(/.+/,[s.valid("schema","single"),s.object({mapped:s.object({from:s.string().required(),to:s.string().required()}).required()})])})),validate:s.function().maxArity(3)}).strict(),t.extensions=s.array().items(s.object(),s.function().arity(1)).strict(),n.desc={buffer:s.object({buffer:s.string()}),func:s.object({function:s.function().required(),options:{literal:!0}}),override:s.object({override:!0}),ref:s.object({ref:s.object({type:s.valid("value","global","local"),path:s.array().required(),separator:s.string().length(1).allow(!1),ancestor:s.number().min(0).integer().allow("root"),map:s.array().items(s.array().length(2)).min(1),adjust:s.function(),iterables:s.boolean(),in:s.boolean(),render:s.boolean()}).required()}),regex:s.object({regex:s.string().min(3)}),special:s.object({special:s.valid("deep").required()}),template:s.object({template:s.string().required(),options:s.object()}),value:s.object({value:s.alternatives([s.object(),s.array()]).required()})},n.desc.entity=s.alternatives([s.array().items(s.link("...")),s.boolean(),s.function(),s.number(),s.string(),n.desc.buffer,n.desc.func,n.desc.ref,n.desc.regex,n.desc.special,n.desc.template,n.desc.value,s.link("/")]),n.desc.values=s.array().items(null,s.boolean(),s.function(),s.number().allow(1/0,-1/0),s.string().allow(""),s.symbol(),n.desc.buffer,n.desc.func,n.d
esc.override,n.desc.ref,n.desc.regex,n.desc.template,n.desc.value),n.desc.messages=s.object().pattern(/.+/,[s.string(),n.desc.template,s.object().pattern(/.+/,[s.string(),n.desc.template])]),t.description=s.object({type:s.string().required(),flags:s.object({cast:s.string(),default:s.any(),description:s.string(),empty:s.link("/"),failover:n.desc.entity,id:s.string(),label:s.string(),only:!0,presence:["optional","required","forbidden"],result:["raw","strip"],strip:s.boolean(),unit:s.string()}).unknown(),preferences:{allowUnknown:s.boolean(),abortEarly:s.boolean(),artifacts:s.boolean(),cache:s.boolean(),convert:s.boolean(),dateFormat:["date","iso","string","time","utc"],errors:{escapeHtml:s.boolean(),label:["path","key"],language:[s.string(),n.desc.ref],wrap:{label:n.wrap,array:n.wrap}},externals:s.boolean(),messages:n.desc.messages,noDefaults:s.boolean(),nonEnumerables:s.boolean(),presence:["required","optional","forbidden"],skipFunctions:s.boolean(),stripUnknown:s.object({arrays:s.boolean(),objects:s.boolean()}).or("arrays","objects").allow(!0,!1),warnings:s.boolean()},allow:n.desc.values,invalid:n.desc.values,rules:s.array().min(1).items({name:s.string().required(),args:s.object().min(1),keep:s.boolean(),message:[s.string(),n.desc.messages],warn:s.boolean()}),keys:s.object().pattern(/.*/,s.link("/")),link:n.desc.ref}).pattern(/^[a-z]\w*$/,s.any())},493:(e,t,r)=>{"use strict";const s=r(8571),n=r(9621),a=r(8160),o={value:Symbol("value")};e.exports=o.State=class{constructor(e,t,r){this.path=e,this.ancestors=t,this.mainstay=r.mainstay,this.schemas=r.schemas,this.debug=null}localize(e,t=null,r=null){const s=new o.State(e,t,this);return r&&s.schemas&&(s.schemas=[o.schemas(r),...s.schemas]),s}nest(e,t){const r=new o.State(this.path,this.ancestors,this);return r.schemas=r.schemas&&[o.schemas(e),...r.schemas],r.debug=t,r}shadow(e,t){this.mainstay.shadow=this.mainstay.shadow||new o.Shadow,this.mainstay.shadow.set(this.path,e,t)}snapshot(){this.mainstay.shadow&&(this._snapshot=s(this.mainstay.shadow.node(this.path)))}restore(){this.mainstay.shadow&&(this.mainstay.shadow.override(this.path,this._snapshot),this._snapshot=void 0)}},o.schemas=function(e){return a.isSchema(e)?{schema:e}:e},o.Shadow=class{constructor(){this._values=null}set(e,t,r){if(!e.length)return;if("strip"===r&&"number"==typeof e[e.length-1])return;this._values=this._values||new Map;let s=this._values;for(let t=0;t{"use strict";function s(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var s=Object.getOwnPropertySymbols(e);t&&(s=s.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,s)}return r}function n(e,t,r){return t in e?Object.defineProperty(e,t,{value:r,enumerable:!0,configurable:!0,writable:!0}):e[t]=r,e}const a=r(375),o=r(8571),i=r(5277),l=r(1447),c=r(8160),u=r(6354),f=r(6133),m={symbol:Symbol("template"),opens:new Array(1e3).join("\0"),closes:new Array(1e3).join(""),dateFormat:{date:Date.prototype.toDateString,iso:Date.prototype.toISOString,string:Date.prototype.toString,time:Date.prototype.toTimeString,utc:Date.prototype.toUTCString}};e.exports=m.Template=class{constructor(e,t){a("string"==typeof e,"Template source must be a string"),a(!e.includes("\0")&&!e.includes(""),"Template source cannot contain reserved control characters"),this.source=e,this.rendered=e,this._template=null,this._settings=o(t),this._parse()}_parse(){if(!this.source.includes("{"))return;const e=m.encode(this.source),t=m.split(e);let r=!1;const s=[],n=t.shift();n&&s.push(n);for(const e of t){const 
t="{"!==e[0],n=t?"}":"}}",a=e.indexOf(n);if(-1===a||"{"===e[1]){s.push("{".concat(m.decode(e)));continue}let o=e.slice(t?0:1,a);const i=":"===o[0];i&&(o=o.slice(1));const l=this._ref(m.decode(o),{raw:t,wrapped:i});s.push(l),"string"!=typeof l&&(r=!0);const c=e.slice(a+n.length);c&&s.push(m.decode(c))}r?this._template=s:this.rendered=s.join("")}static date(e,t){return m.dateFormat[t.dateFormat].call(e)}describe(e={}){if(!this._settings&&e.compact)return this.source;const t={template:this.source};return this._settings&&(t.options=this._settings),t}static build(e){return new m.Template(e.template,e.options)}isDynamic(){return!!this._template}static isTemplate(e){return!!e&&!!e[c.symbols.template]}refs(){if(!this._template)return;const e=[];for(const t of this._template)"string"!=typeof t&&e.push(...t.refs);return e}resolve(e,t,r,s){return this._template&&1===this._template.length?this._part(this._template[0],e,t,r,s,{}):this.render(e,t,r,s)}_part(e,...t){return e.ref?e.ref.resolve(...t):e.formula.evaluate(t)}render(e,t,r,s,n={}){if(!this.isDynamic())return this.rendered;const a=[];for(const o of this._template)if("string"==typeof o)a.push(o);else{const l=this._part(o,e,t,r,s,n),c=m.stringify(l,e,t,r,s,n);if(void 0!==c){const e=o.raw||!1===(n.errors&&n.errors.escapeHtml)?c:i(c);a.push(m.wrap(e,o.wrapped&&r.errors.wrap.label))}}return a.join("")}_ref(e,{raw:t,wrapped:r}){const s=[],n=e=>{const t=f.create(e,this._settings);return s.push(t),e=>t.resolve(...e)};try{var a=new l.Parser(e,{reference:n,functions:m.functions,constants:m.constants})}catch(t){throw t.message='Invalid template variable "'.concat(e,'" fails due to: ').concat(t.message),t}if(a.single){if("reference"===a.single.type){const e=s[0];return{ref:e,raw:t,refs:s,wrapped:r||"local"===e.type&&"label"===e.key}}return m.stringify(a.single.value)}return{formula:a,raw:t,refs:s}}toString(){return this.source}},m.Template.prototype[c.symbols.template]=!0,m.Template.prototype.isImmutable=!0,m.encode=function(e){return e.replace(/\\(\{+)/g,((e,t)=>m.opens.slice(0,t.length))).replace(/\\(\}+)/g,((e,t)=>m.closes.slice(0,t.length)))},m.decode=function(e){return e.replace(/\u0000/g,"{").replace(/\u0001/g,"}")},m.split=function(e){const t=[];let r="";for(let s=0;s ").concat(s.toString()));e=t}if(!Array.isArray(e))return e.toString();let u="";for(const s of e)u=u+(u.length?", ":"")+m.stringify(s,t,r,a,o,i);return c?u:m.wrap(u,a.errors.wrap.array)},m.constants={true:!0,false:!1,null:null,second:1e3,minute:6e4,hour:36e5,day:864e5},m.functions={if:(e,t,r)=>e?t:r,msg(e){const[t,r,s,n,a]=this,o=a.messages;if(!o)return"";const i=u.template(t,o[0],e,r,s)||u.template(t,o[1],e,r,s);return i?i.render(t,r,s,n,a):""},number:e=>"number"==typeof e?e:"string"==typeof e?parseFloat(e):"boolean"==typeof e?e?1:0:e instanceof Date?e.getTime():null}},4946:(e,t,r)=>{"use strict";const s=r(375),n=r(1687),a=r(8068),o=r(8160),i=r(3292),l=r(6354),c=r(6133),u={};e.exports=a.extend({type:"alternatives",flags:{match:{default:"any"}},terms:{matches:{init:[],register:c.toSibling}},args:(e,...t)=>1===t.length&&Array.isArray(t[0])?e.try(...t[0]):e.try(...t),validate(e,t){const{schema:r,error:s,state:a,prefs:o}=t;if(r._flags.match){const t=[];for(let s=0;se&&"object"===t.schema.type),!0)?{value:t.reduce(((e,t)=>n(e,t,{mergeArrays:!1})))}:{value:t[t.length-1]}}const i=[];for(let t=0;t"is"!==r.path[0]?t.label(e):void 0,ref:!1})}},rebuild(e){e.$_modify({each:t=>{o.isSchema(t)&&"array"===t.type&&e.$_setFlag("_arrayItems",!0,{clone:!1})}})},manifest:{build(e,t){if(t.matches)for(const r 
of t.matches){const{schema:t,ref:s,is:n,not:a,then:o,otherwise:i}=r;e=t?e.try(t):s?e.conditional(s,{is:n,then:o,not:a,otherwise:i,switch:r.switch}):e.conditional(n,{then:o,otherwise:i})}return e}},messages:{"alternatives.all":"{{#label}} does not match all of the required types","alternatives.any":"{{#label}} does not match any of the allowed types","alternatives.match":"{{#label}} does not match any of the allowed types","alternatives.one":"{{#label}} matches more than one allowed type","alternatives.types":"{{#label}} must be one of {{#types}}"}}),u.errors=function(e,{error:t,state:r}){if(!e.length)return{errors:t("alternatives.any")};if(1===e.length)return{errors:e[0].reports};const s=new Set,n=[];for(const{reports:a,schema:o}of e){if(a.length>1)return u.unmatched(e,t);const i=a[0];if(i instanceof l.Report==0)return u.unmatched(e,t);if(i.state.path.length!==r.path.length){n.push({type:o.type,report:i});continue}if("any.only"===i.code){for(const e of i.local.valids)s.add(e);continue}const[c,f]=i.code.split(".");"base"===f?s.add(c):n.push({type:o.type,report:i})}return n.length?1===n.length?{errors:n[0].report}:u.unmatched(e,t):{errors:t("alternatives.types",{types:[...s]})}},u.unmatched=function(e,t){const r=[];for(const t of e)r.push(...t.reports);return{errors:t("alternatives.match",l.details(r,{override:!1}))}}},8068:(e,t,r)=>{"use strict";const s=r(375),n=r(7629),a=r(8160),o=r(6914);e.exports=n.extend({type:"any",flags:{only:{default:!1}},terms:{alterations:{init:null},examples:{init:null},externals:{init:null},metas:{init:[]},notes:{init:[]},shared:{init:null},tags:{init:[]},whens:{init:null}},rules:{custom:{method(e,t){return s("function"==typeof e,"Method must be a function"),s(void 0===t||t&&"string"==typeof t,"Description must be a non-empty string"),this.$_addRule({name:"custom",args:{method:e,description:t}})},validate(e,t,{method:r}){try{return r(e,t)}catch(e){return t.error("any.custom",{error:e})}},args:["method","description"],multi:!0},messages:{method(e){return this.prefs({messages:e})}},shared:{method(e){s(a.isSchema(e)&&e._flags.id,"Schema must be a schema with an id");const t=this.clone();return t.$_terms.shared=t.$_terms.shared||[],t.$_terms.shared.push(e),t.$_mutateRegister(e),t}},warning:{method(e,t){return s(e&&"string"==typeof e,"Invalid warning code"),this.$_addRule({name:"warning",args:{code:e,local:t},warn:!0})},validate:(e,t,{code:r,local:s})=>t.error(r,s),args:["code","local"],multi:!0}},modifiers:{keep(e,t=!0){e.keep=t},message(e,t){e.message=o.compile(t)},warn(e,t=!0){e.warn=t}},manifest:{build(e,t){for(const r in t){const s=t[r];if(["examples","externals","metas","notes","tags"].includes(r))for(const t of s)e=e[r.slice(0,-1)](t);else if("alterations"!==r)if("whens"!==r){if("shared"===r)for(const t of s)e=e.shared(t)}else for(const t of s){const{ref:r,is:s,not:n,then:a,otherwise:o,concat:i}=t;e=i?e.concat(i):r?e.when(r,{is:s,not:n,then:a,otherwise:o,switch:t.switch,break:t.break}):e.when(s,{then:a,otherwise:o,break:t.break})}else{const t={};for(const{target:e,adjuster:r}of s)t[e]=r;e=e.alter(t)}}return e}},messages:{"any.custom":"{{#label}} failed custom validation because {{#error.message}}","any.default":"{{#label}} threw an error when running default method","any.failover":"{{#label}} threw an error when running failover method","any.invalid":"{{#label}} contains an invalid value","any.only":'{{#label}} must be {if(#valids.length == 1, "", "one of ")}{{#valids}}',"any.ref":"{{#label}} {{#arg}} references {{:#ref}} which 
{{#reason}}","any.required":"{{#label}} is required","any.unknown":"{{#label}} is not allowed"}})},546:(e,t,r)=>{"use strict";const s=r(375),n=r(9474),a=r(9621),o=r(8068),i=r(8160),l=r(3292),c={};e.exports=o.extend({type:"array",flags:{single:{default:!1},sparse:{default:!1}},terms:{items:{init:[],manifest:"schema"},ordered:{init:[],manifest:"schema"},_exclusions:{init:[]},_inclusions:{init:[]},_requireds:{init:[]}},coerce:{from:"object",method(e,{schema:t,state:r,prefs:s}){if(!Array.isArray(e))return;const n=t.$_getRule("sort");return n?c.sort(t,e,n.args.options,r,s):void 0}},validate(e,{schema:t,error:r}){if(!Array.isArray(e)){if(t._flags.single){const t=[e];return t[i.symbols.arraySingle]=!0,{value:t}}return{errors:r("array.base")}}if(t.$_getRule("items")||t.$_terms.externals)return{value:e.slice()}},rules:{has:{method(e){e=this.$_compile(e,{appendPath:!0});const t=this.$_addRule({name:"has",args:{schema:e}});return t.$_mutateRegister(e),t},validate(e,{state:t,prefs:r,error:s},{schema:n}){const a=[e,...t.ancestors];for(let s=0;sthis.$_compile(e[r])),r,{append:!0});t.$_terms.items.push(s)}return t.$_mutateRebuild()},validate(e,{schema:t,error:r,state:s,prefs:n,errorsArray:a}){const o=t.$_terms._requireds.slice(),l=t.$_terms.ordered.slice(),u=[...t.$_terms._inclusions,...o],f=!e[i.symbols.arraySingle];delete e[i.symbols.arraySingle];const m=a();let h=e.length;for(let a=0;a="})}},ordered:{method(...e){i.verifyFlat(e,"ordered");const t=this.$_addRule("items");for(let r=0;rthis.$_compile(e[r])),r,{append:!0});c.validateSingle(s,t),t.$_mutateRegister(s),t.$_terms.ordered.push(s)}return t.$_mutateRebuild()}},single:{method(e){const t=void 0===e||!!e;return s(!t||!this._flags._arrayItems,"Cannot specify single rule when array has array items"),this.$_setFlag("single",t)}},sort:{method(e={}){i.assertOptions(e,["by","order"]);const t={order:e.order||"ascending"};return e.by&&(t.by=l.ref(e.by,{ancestor:0}),s(!t.by.ancestor,"Cannot sort by ancestor")),this.$_addRule({name:"sort",args:{options:t}})},validate(e,{error:t,state:r,prefs:s,schema:n},{options:a}){const{value:o,errors:i}=c.sort(n,e,a,r,s);if(i)return i;for(let r=0;rnew Set(e)}},rebuild(e){e.$_terms._inclusions=[],e.$_terms._exclusions=[],e.$_terms._requireds=[];for(const t of e.$_terms.items)c.validateSingle(t,e),"required"===t._flags.presence?e.$_terms._requireds.push(t):"forbidden"===t._flags.presence?e.$_terms._exclusions.push(t):e.$_terms._inclusions.push(t);for(const t of e.$_terms.ordered)c.validateSingle(t,e)},manifest:{build:(e,t)=>(t.items&&(e=e.items(...t.items)),t.ordered&&(e=e.ordered(...t.ordered)),e)},messages:{"array.base":"{{#label}} must be an array","array.excludes":"{{#label}} contains an excluded value","array.hasKnown":"{{#label}} does not contain at least one required match for type {:#patternLabel}","array.hasUnknown":"{{#label}} does not contain at least one required match","array.includes":"{{#label}} does not match any of the allowed types","array.includesRequiredBoth":"{{#label}} does not contain {{#knownMisses}} and {{#unknownMisses}} other required value(s)","array.includesRequiredKnowns":"{{#label}} does not contain {{#knownMisses}}","array.includesRequiredUnknowns":"{{#label}} does not contain {{#unknownMisses}} required value(s)","array.length":"{{#label}} must contain {{#limit}} items","array.max":"{{#label}} must contain less than or equal to {{#limit}} items","array.min":"{{#label}} must contain at least {{#limit}} items","array.orderedLength":"{{#label}} must contain at most {{#limit}} 
items","array.sort":"{{#label}} must be sorted in {#order} order by {{#by}}","array.sort.mismatching":"{{#label}} cannot be sorted due to mismatching types","array.sort.unsupported":"{{#label}} cannot be sorted due to unsupported type {#type}","array.sparse":"{{#label}} must not be a sparse array item","array.unique":"{{#label}} contains a duplicate value"}}),c.fillMissedErrors=function(e,t,r,s,n,a){const o=[];let i=0;for(const e of r){const t=e._flags.label;t?o.push(t):++i}o.length?i?t.push(e.$_createError("array.includesRequiredBoth",s,{knownMisses:o,unknownMisses:i},n,a)):t.push(e.$_createError("array.includesRequiredKnowns",s,{knownMisses:o},n,a)):t.push(e.$_createError("array.includesRequiredUnknowns",s,{unknownMisses:i},n,a))},c.fillOrderedErrors=function(e,t,r,s,n,a){const o=[];for(const e of r)"required"===e._flags.presence&&o.push(e);o.length&&c.fillMissedErrors(e,t,o,s,n,a)},c.fillDefault=function(e,t,r,s){const n=[];let a=!0;for(let o=e.length-1;o>=0;--o){const i=e[o],l=[t,...r.ancestors],c=i.$_validate(void 0,r.localize(r.path,l,i),s).value;if(a){if(void 0===c)continue;a=!1}n.unshift(c)}n.length&&t.push(...n)},c.fastSplice=function(e,t){let r=t;for(;r{let f=c.compare(l,u,o,i);if(null!==f)return f;if(r.by&&(l=r.by.resolve(l,s,n),u=r.by.resolve(u,s,n)),f=c.compare(l,u,o,i),null!==f)return f;const m=typeof l;if(m!==typeof u)throw e.$_createError("array.sort.mismatching",t,null,s,n);if("number"!==m&&"string"!==m)throw e.$_createError("array.sort.unsupported",t,{type:m},s,n);return"number"===m?(l-u)*a:l{"use strict";const s=r(375),n=r(8068),a=r(8160),o=r(2036),i={isBool:function(e){return"boolean"==typeof e}};e.exports=n.extend({type:"boolean",flags:{sensitive:{default:!1}},terms:{falsy:{init:null,manifest:"values"},truthy:{init:null,manifest:"values"}},coerce(e,{schema:t}){if("boolean"!=typeof e){if("string"==typeof e){const r=t._flags.sensitive?e:e.toLowerCase();e="true"===r||"false"!==r&&e}return"boolean"!=typeof e&&(e=t.$_terms.truthy&&t.$_terms.truthy.has(e,null,null,!t._flags.sensitive)||(!t.$_terms.falsy||!t.$_terms.falsy.has(e,null,null,!t._flags.sensitive))&&e),{value:e}}},validate(e,{error:t}){if("boolean"!=typeof e)return{value:e,errors:t("boolean.base")}},rules:{truthy:{method(...e){a.verifyFlat(e,"truthy");const t=this.clone();t.$_terms.truthy=t.$_terms.truthy||new o;for(let r=0;re?1:0},string:{from:i.isBool,to:(e,t)=>e?"true":"false"}},manifest:{build:(e,t)=>(t.truthy&&(e=e.truthy(...t.truthy)),t.falsy&&(e=e.falsy(...t.falsy)),e)},messages:{"boolean.base":"{{#label}} must be a boolean"}})},7500:(e,t,r)=>{"use strict";const s=r(375),n=r(8068),a=r(8160),o=r(3328),i={isDate:function(e){return e instanceof Date}};e.exports=n.extend({type:"date",coerce:{from:["number","string"],method:(e,{schema:t})=>({value:i.parse(e,t._flags.format)||e})},validate(e,{schema:t,error:r,prefs:s}){if(e instanceof Date&&!isNaN(e.getTime()))return;const n=t._flags.format;return s.convert&&n&&"string"==typeof e?{value:e,errors:r("date.format",{format:n})}:{value:e,errors:r("date.base")}},rules:{compare:{method:!1,validate(e,t,{date:r},{name:s,operator:n,args:o}){const i="now"===r?Date.now():r.getTime();return a.compare(e.getTime(),i,n)?e:t.error("date."+s,{limit:o.date,value:e})},args:[{name:"date",ref:!0,normalize:e=>"now"===e?e:i.parse(e),assert:e=>null!==e,message:"must have a valid date format"}]},format:{method(e){return s(["iso","javascript","unix"].includes(e),"Unknown date format",e),this.$_setFlag("format",e)}},greater:{method(e){return 
this.$_addRule({name:"greater",method:"compare",args:{date:e},operator:">"})}},iso:{method(){return this.format("iso")}},less:{method(e){return this.$_addRule({name:"less",method:"compare",args:{date:e},operator:"<"})}},max:{method(e){return this.$_addRule({name:"max",method:"compare",args:{date:e},operator:"<="})}},min:{method(e){return this.$_addRule({name:"min",method:"compare",args:{date:e},operator:">="})}},timestamp:{method(e="javascript"){return s(["javascript","unix"].includes(e),'"type" must be one of "javascript, unix"'),this.format(e)}}},cast:{number:{from:i.isDate,to:(e,t)=>e.getTime()},string:{from:i.isDate,to:(e,{prefs:t})=>o.date(e,t)}},messages:{"date.base":"{{#label}} must be a valid date","date.format":'{{#label}} must be in {msg("date.format." + #format) || #format} format',"date.greater":"{{#label}} must be greater than {{:#limit}}","date.less":"{{#label}} must be less than {{:#limit}}","date.max":"{{#label}} must be less than or equal to {{:#limit}}","date.min":"{{#label}} must be greater than or equal to {{:#limit}}","date.format.iso":"ISO 8601 date","date.format.javascript":"timestamp or number of milliseconds","date.format.unix":"timestamp or number of seconds"}}),i.parse=function(e,t){if(e instanceof Date)return e;if("string"!=typeof e&&(isNaN(e)||!isFinite(e)))return null;if(/^\s*$/.test(e))return null;if("iso"===t)return a.isIsoDate(e)?i.date(e.toString()):null;const r=e;if("string"==typeof e&&/^[+-]?\d+(\.\d+)?$/.test(e)&&(e=parseFloat(e)),t){if("javascript"===t)return i.date(1*e);if("unix"===t)return i.date(1e3*e);if("string"==typeof r)return null}return i.date(e)},i.date=function(e){const t=new Date(e);return isNaN(t.getTime())?null:t}},390:(e,t,r)=>{"use strict";const s=r(375),n=r(7824);e.exports=n.extend({type:"function",properties:{typeof:"function"},rules:{arity:{method(e){return s(Number.isSafeInteger(e)&&e>=0,"n must be a positive integer"),this.$_addRule({name:"arity",args:{n:e}})},validate:(e,t,{n:r})=>e.length===r?e:t.error("function.arity",{n:r})},class:{method(){return this.$_addRule("class")},validate:(e,t)=>/^\s*class\s/.test(e.toString())?e:t.error("function.class",{value:e})},minArity:{method(e){return s(Number.isSafeInteger(e)&&e>0,"n must be a strict positive integer"),this.$_addRule({name:"minArity",args:{n:e}})},validate:(e,t,{n:r})=>e.length>=r?e:t.error("function.minArity",{n:r})},maxArity:{method(e){return s(Number.isSafeInteger(e)&&e>=0,"n must be a positive integer"),this.$_addRule({name:"maxArity",args:{n:e}})},validate:(e,t,{n:r})=>e.length<=r?e:t.error("function.maxArity",{n:r})}},messages:{"function.arity":"{{#label}} must have an arity of {{#n}}","function.class":"{{#label}} must be a class","function.maxArity":"{{#label}} must have an arity lesser or equal to {{#n}}","function.minArity":"{{#label}} must have an arity greater or equal to {{#n}}"}})},7824:(e,t,r)=>{"use strict";const s=r(978),n=r(375),a=r(8571),o=r(3652),i=r(8068),l=r(8160),c=r(3292),u=r(6354),f=r(6133),m=r(3328),h={renameDefaults:{alias:!1,multiple:!1,override:!1}};e.exports=i.extend({type:"_keys",properties:{typeof:"object"},flags:{unknown:{default:!1}},terms:{dependencies:{init:null},keys:{init:null,manifest:{mapped:{from:"schema",to:"key"}}},patterns:{init:null},renames:{init:null}},args:(e,t)=>e.keys(t),validate(e,{schema:t,error:r,state:s,prefs:n}){if(!e||typeof 
e!==t.$_property("typeof")||Array.isArray(e))return{value:e,errors:r("object.base",{type:t.$_property("typeof")})};if(!(t.$_terms.renames||t.$_terms.dependencies||t.$_terms.keys||t.$_terms.patterns||t.$_terms.externals))return;e=h.clone(e,n);const a=[];if(t.$_terms.renames&&!h.rename(t,e,s,n,a))return{value:e,errors:a};if(!t.$_terms.keys&&!t.$_terms.patterns&&!t.$_terms.dependencies)return{value:e,errors:a};const o=new Set(Object.keys(e));if(t.$_terms.keys){const r=[e,...s.ancestors];for(const i of t.$_terms.keys){const t=i.key,l=e[t];o.delete(t);const c=s.localize([...s.path,t],r,i),u=i.schema.$_validate(l,c,n);if(u.errors){if(n.abortEarly)return{value:e,errors:u.errors};void 0!==u.value&&(e[t]=u.value),a.push(...u.errors)}else"strip"===i.schema._flags.result||void 0===u.value&&void 0!==l?delete e[t]:void 0!==u.value&&(e[t]=u.value)}}if(o.size||t._flags._hasPatternMatch){const r=h.unknown(t,e,o,a,s,n);if(r)return r}if(t.$_terms.dependencies)for(const r of t.$_terms.dependencies){if(r.key&&void 0===r.key.resolve(e,s,n,null,{shadow:!1}))continue;const o=h.dependencies[r.rel](t,r,e,s,n);if(o){const r=t.$_createError(o.code,e,o.context,s,n);if(n.abortEarly)return{value:e,errors:r};a.push(r)}}return{value:e,errors:a}},rules:{and:{method(...e){return l.verifyFlat(e,"and"),h.dependency(this,"and",null,e)}},append:{method(e){return null==e||0===Object.keys(e).length?this:this.keys(e)}},assert:{method(e,t,r){m.isTemplate(e)||(e=c.ref(e)),n(void 0===r||"string"==typeof r,"Message must be a string"),t=this.$_compile(t,{appendPath:!0});const s=this.$_addRule({name:"assert",args:{subject:e,schema:t,message:r}});return s.$_mutateRegister(e),s.$_mutateRegister(t),s},validate(e,{error:t,prefs:r,state:s},{subject:n,schema:a,message:o}){const i=n.resolve(e,s,r),l=f.isRef(n)?n.absolute(s):[];return a.$_match(i,s.localize(l,[e,...s.ancestors],a),r)?e:t("object.assert",{subject:n,message:o})},args:["subject","schema","message"],multi:!0},instance:{method(e,t){return n("function"==typeof e,"constructor must be a function"),t=t||e.name,this.$_addRule({name:"instance",args:{constructor:e,name:t}})},validate:(e,t,{constructor:r,name:s})=>e instanceof r?e:t.error("object.instance",{type:s,value:e}),args:["constructor","name"]},keys:{method(e){n(void 0===e||"object"==typeof e,"Object schema must be a valid object"),n(!l.isSchema(e),"Object schema cannot be a joi schema");const t=this.clone();if(e)if(Object.keys(e).length){t.$_terms.keys=t.$_terms.keys?t.$_terms.keys.filter((t=>!e.hasOwnProperty(t.key))):new h.Keys;for(const r in e)l.tryWithPath((()=>t.$_terms.keys.push({key:r,schema:this.$_compile(e[r])})),r)}else t.$_terms.keys=new h.Keys;else t.$_terms.keys=null;return t.$_mutateRebuild()}},length:{method(e){return this.$_addRule({name:"length",args:{limit:e},operator:"="})},validate:(e,t,{limit:r},{name:s,operator:n,args:a})=>l.compare(Object.keys(e).length,r,n)?e:t.error("object."+s,{limit:a.limit,value:e}),args:[{name:"limit",ref:!0,assert:l.limit,message:"must be a positive integer"}]},max:{method(e){return this.$_addRule({name:"max",method:"length",args:{limit:e},operator:"<="})}},min:{method(e){return this.$_addRule({name:"min",method:"length",args:{limit:e},operator:">="})}},nand:{method(...e){return l.verifyFlat(e,"nand"),h.dependency(this,"nand",null,e)}},or:{method(...e){return l.verifyFlat(e,"or"),h.dependency(this,"or",null,e)}},oxor:{method(...e){return h.dependency(this,"oxor",null,e)}},pattern:{method(e,t,r={}){const s=e instanceof RegExp;s||(e=this.$_compile(e,{appendPath:!0})),n(void 
0!==t,"Invalid rule"),l.assertOptions(r,["fallthrough","matches"]),s&&n(!e.flags.includes("g")&&!e.flags.includes("y"),"pattern should not use global or sticky mode"),t=this.$_compile(t,{appendPath:!0});const a=this.clone();a.$_terms.patterns=a.$_terms.patterns||[];const o={[s?"regex":"schema"]:e,rule:t};return r.matches&&(o.matches=this.$_compile(r.matches),"array"!==o.matches.type&&(o.matches=o.matches.$_root.array().items(o.matches)),a.$_mutateRegister(o.matches),a.$_setFlag("_hasPatternMatch",!0,{clone:!1})),r.fallthrough&&(o.fallthrough=!0),a.$_terms.patterns.push(o),a.$_mutateRegister(t),a}},ref:{method(){return this.$_addRule("ref")},validate:(e,t)=>f.isRef(e)?e:t.error("object.refType",{value:e})},regex:{method(){return this.$_addRule("regex")},validate:(e,t)=>e instanceof RegExp?e:t.error("object.regex",{value:e})},rename:{method(e,t,r={}){n("string"==typeof e||e instanceof RegExp,"Rename missing the from argument"),n("string"==typeof t||t instanceof m,"Invalid rename to argument"),n(t!==e,"Cannot rename key to same name:",e),l.assertOptions(r,["alias","ignoreUndefined","override","multiple"]);const a=this.clone();a.$_terms.renames=a.$_terms.renames||[];for(const t of a.$_terms.renames)n(t.from!==e,"Cannot rename the same key multiple times");return t instanceof m&&a.$_mutateRegister(t),a.$_terms.renames.push({from:e,to:t,options:s(h.renameDefaults,r)}),a}},schema:{method(e="any"){return this.$_addRule({name:"schema",args:{type:e}})},validate:(e,t,{type:r})=>!l.isSchema(e)||"any"!==r&&e.type!==r?t.error("object.schema",{type:r}):e},unknown:{method(e){return this.$_setFlag("unknown",!1!==e)}},with:{method(e,t,r={}){return h.dependency(this,"with",e,t,r)}},without:{method(e,t,r={}){return h.dependency(this,"without",e,t,r)}},xor:{method(...e){return l.verifyFlat(e,"xor"),h.dependency(this,"xor",null,e)}}},overrides:{default(e,t){return void 0===e&&(e=l.symbols.deepDefault),this.$_parent("default",e,t)}},rebuild(e){if(e.$_terms.keys){const t=new o.Sorter;for(const r of e.$_terms.keys)l.tryWithPath((()=>t.add(r,{after:r.schema.$_rootReferences(),group:r.key})),r.key);e.$_terms.keys=new h.Keys(...t.nodes)}},manifest:{build(e,t){if(t.keys&&(e=e.keys(t.keys)),t.dependencies)for(const{rel:r,key:s=null,peers:n,options:a}of t.dependencies)e=h.dependency(e,r,s,n,a);if(t.patterns)for(const{regex:r,schema:s,rule:n,fallthrough:a,matches:o}of t.patterns)e=e.pattern(r||s,n,{fallthrough:a,matches:o});if(t.renames)for(const{from:r,to:s,options:n}of t.renames)e=e.rename(r,s,n);return e}},messages:{"object.and":"{{#label}} contains {{#presentWithLabels}} without its required peers {{#missingWithLabels}}","object.assert":'{{#label}} is invalid because {if(#subject.key, `"` + #subject.key + `" failed to ` + (#message || "pass the assertion test"), #message || "the assertion failed")}',"object.base":"{{#label}} must be of type {{#type}}","object.instance":"{{#label}} must be an instance of {{:#type}}","object.length":'{{#label}} must have {{#limit}} key{if(#limit == 1, "", "s")}',"object.max":'{{#label}} must have less than or equal to {{#limit}} key{if(#limit == 1, "", "s")}',"object.min":'{{#label}} must have at least {{#limit}} key{if(#limit == 1, "", "s")}',"object.missing":"{{#label}} must contain at least one of {{#peersWithLabels}}","object.nand":"{{:#mainWithLabel}} must not exist simultaneously with {{#peersWithLabels}}","object.oxor":"{{#label}} contains a conflict between optional exclusive peers {{#peersWithLabels}}","object.pattern.match":"{{#label}} keys failed to match pattern 
requirements","object.refType":"{{#label}} must be a Joi reference","object.regex":"{{#label}} must be a RegExp object","object.rename.multiple":"{{#label}} cannot rename {{:#from}} because multiple renames are disabled and another key was already renamed to {{:#to}}","object.rename.override":"{{#label}} cannot rename {{:#from}} because override is disabled and target {{:#to}} exists","object.schema":"{{#label}} must be a Joi schema of {{#type}} type","object.unknown":"{{#label}} is not allowed","object.with":"{{:#mainWithLabel}} missing required peer {{:#peerWithLabel}}","object.without":"{{:#mainWithLabel}} conflict with forbidden peer {{:#peerWithLabel}}","object.xor":"{{#label}} contains a conflict between exclusive peers {{#peersWithLabels}}"}}),h.clone=function(e,t){if("object"==typeof e){if(t.nonEnumerables)return a(e,{shallow:!0});const r=Object.create(Object.getPrototypeOf(e));return Object.assign(r,e),r}const r=function(...t){return e.apply(this,t)};return r.prototype=a(e.prototype),Object.defineProperty(r,"name",{value:e.name,writable:!1}),Object.defineProperty(r,"length",{value:e.length,writable:!1}),Object.assign(r,e),r},h.dependency=function(e,t,r,s,a){n(null===r||"string"==typeof r,t,"key must be a strings"),a||(a=s.length>1&&"object"==typeof s[s.length-1]?s.pop():{}),l.assertOptions(a,["separator"]),s=[].concat(s);const o=l.default(a.separator,"."),i=[];for(const e of s)n("string"==typeof e,t,"peers must be strings"),i.push(c.ref(e,{separator:o,ancestor:0,prefix:!1}));null!==r&&(r=c.ref(r,{separator:o,ancestor:0,prefix:!1}));const u=e.clone();return u.$_terms.dependencies=u.$_terms.dependencies||[],u.$_terms.dependencies.push(new h.Dependency(t,r,i,s)),u},h.dependencies={and(e,t,r,s,n){const a=[],o=[],i=t.peers.length;for(const e of t.peers)void 0===e.resolve(r,s,n,null,{shadow:!1})?a.push(e.key):o.push(e.key);if(a.length!==i&&o.length!==i)return{code:"object.and",context:{present:o,presentWithLabels:h.keysToLabels(e,o),missing:a,missingWithLabels:h.keysToLabels(e,a)}}},nand(e,t,r,s,n){const a=[];for(const e of t.peers)void 0!==e.resolve(r,s,n,null,{shadow:!1})&&a.push(e.key);if(a.length!==t.peers.length)return;const o=t.paths[0],i=t.paths.slice(1);return{code:"object.nand",context:{main:o,mainWithLabel:h.keysToLabels(e,o),peers:i,peersWithLabels:h.keysToLabels(e,i)}}},or(e,t,r,s,n){for(const e of t.peers)if(void 0!==e.resolve(r,s,n,null,{shadow:!1}))return;return{code:"object.missing",context:{peers:t.paths,peersWithLabels:h.keysToLabels(e,t.paths)}}},oxor(e,t,r,s,n){const a=[];for(const e of t.peers)void 0!==e.resolve(r,s,n,null,{shadow:!1})&&a.push(e.key);if(!a.length||1===a.length)return;const o={peers:t.paths,peersWithLabels:h.keysToLabels(e,t.paths)};return o.present=a,o.presentWithLabels=h.keysToLabels(e,a),{code:"object.oxor",context:o}},with(e,t,r,s,n){for(const a of t.peers)if(void 0===a.resolve(r,s,n,null,{shadow:!1}))return{code:"object.with",context:{main:t.key.key,mainWithLabel:h.keysToLabels(e,t.key.key),peer:a.key,peerWithLabel:h.keysToLabels(e,a.key)}}},without(e,t,r,s,n){for(const a of t.peers)if(void 0!==a.resolve(r,s,n,null,{shadow:!1}))return{code:"object.without",context:{main:t.key.key,mainWithLabel:h.keysToLabels(e,t.key.key),peer:a.key,peerWithLabel:h.keysToLabels(e,a.key)}}},xor(e,t,r,s,n){const a=[];for(const e of t.peers)void 0!==e.resolve(r,s,n,null,{shadow:!1})&&a.push(e.key);if(1===a.length)return;const o={peers:t.paths,peersWithLabels:h.keysToLabels(e,t.paths)};return 
0===a.length?{code:"object.missing",context:o}:(o.present=a,o.presentWithLabels=h.keysToLabels(e,a),{code:"object.xor",context:o})}},h.keysToLabels=function(e,t){return Array.isArray(t)?t.map((t=>e.$_mapLabels(t))):e.$_mapLabels(t)},h.rename=function(e,t,r,s,n){const a={};for(const o of e.$_terms.renames){const i=[],l="string"!=typeof o.from;if(l)for(const e in t){if(void 0===t[e]&&o.options.ignoreUndefined)continue;if(e===o.to)continue;const r=o.from.exec(e);r&&i.push({from:e,to:o.to,match:r})}else!Object.prototype.hasOwnProperty.call(t,o.from)||void 0===t[o.from]&&o.options.ignoreUndefined||i.push(o);for(const c of i){const i=c.from;let u=c.to;if(u instanceof m&&(u=u.render(t,r,s,c.match)),i!==u){if(!o.options.multiple&&a[u]&&(n.push(e.$_createError("object.rename.multiple",t,{from:i,to:u,pattern:l},r,s)),s.abortEarly))return!1;if(Object.prototype.hasOwnProperty.call(t,u)&&!o.options.override&&!a[u]&&(n.push(e.$_createError("object.rename.override",t,{from:i,to:u,pattern:l},r,s)),s.abortEarly))return!1;void 0===t[i]?delete t[u]:t[u]=t[i],a[u]=!0,o.options.alias||delete t[i]}}}return!0},h.unknown=function(e,t,r,s,n,a){if(e.$_terms.patterns){let o=!1;const i=e.$_terms.patterns.map((e=>{if(e.matches)return o=!0,[]})),l=[t,...n.ancestors];for(const o of r){const c=t[o],u=[...n.path,o];for(let f=0;f{"use strict";const s=r(375),n=r(8068),a=r(8160),o=r(3292),i=r(6354),l={};e.exports=n.extend({type:"link",properties:{schemaChain:!0},terms:{link:{init:null,manifest:"single",register:!1}},args:(e,t)=>e.ref(t),validate(e,{schema:t,state:r,prefs:n}){s(t.$_terms.link,"Uninitialized link schema");const a=l.generate(t,e,r,n),o=t.$_terms.link[0].ref;return a.$_validate(e,r.nest(a,"link:".concat(o.display,":").concat(a.type)),n)},generate:(e,t,r,s)=>l.generate(e,t,r,s),rules:{ref:{method(e){s(!this.$_terms.link,"Cannot reinitialize schema"),e=o.ref(e),s("value"===e.type||"local"===e.type,"Invalid reference type:",e.type),s("local"===e.type||"root"===e.ancestor||e.ancestor>0,"Link cannot reference itself");const t=this.clone();return t.$_terms.link=[{ref:e}],t}},relative:{method(e=!0){return this.$_setFlag("relative",e)}}},overrides:{concat(e){s(this.$_terms.link,"Uninitialized link schema"),s(a.isSchema(e),"Invalid schema object"),s("link"!==e.type,"Cannot merge type link with another link");const t=this.clone();return t.$_terms.whens||(t.$_terms.whens=[]),t.$_terms.whens.push({concat:e}),t.$_mutateRebuild()}},manifest:{build:(e,t)=>(s(t.link,"Invalid link description missing link"),e.ref(t.link))}}),l.generate=function(e,t,r,s){let n=r.mainstay.links.get(e);if(n)return n._generate(t,r,s).schema;const a=e.$_terms.link[0].ref,{perspective:o,path:i}=l.perspective(a,r);l.assert(o,"which is outside of schema boundaries",a,e,r,s);try{n=i.length?o.$_reach(i):o}catch(t){l.assert(!1,"to non-existing schema",a,e,r,s)}return l.assert("link"!==n.type,"which is another link",a,e,r,s),e._flags.relative||r.mainstay.links.set(e,n),n._generate(t,r,s).schema},l.perspective=function(e,t){if("local"===e.type){for(const{schema:r,key:s}of t.schemas){if((r._flags.id||s)===e.path[0])return{perspective:r,path:e.path.slice(1)};if(r.$_terms.shared)for(const t of r.$_terms.shared)if(t._flags.id===e.path[0])return{perspective:t,path:e.path.slice(1)}}return{perspective:null,path:null}}return"root"===e.ancestor?{perspective:t.schemas[t.schemas.length-1].schema,path:e.path}:{perspective:t.schemas[e.ancestor]&&t.schemas[e.ancestor].schema,path:e.path}},l.assert=function(e,t,r,n,a,o){e||s(!1,'"'.concat(i.label(n._flags,a,o),'" contains 
link reference "').concat(r.display,'" ').concat(t))}},3832:(e,t,r)=>{"use strict";const s=r(375),n=r(8068),a=r(8160),o={numberRx:/^\s*[+-]?(?:(?:\d+(?:\.\d*)?)|(?:\.\d+))(?:e([+-]?\d+))?\s*$/i,precisionRx:/(?:\.(\d+))?(?:[eE]([+-]?\d+))?$/};e.exports=n.extend({type:"number",flags:{unsafe:{default:!1}},coerce:{from:"string",method(e,{schema:t,error:r}){const s=e.match(o.numberRx);if(!s)return;e=e.trim();const n={value:parseFloat(e)};if(0===n.value&&(n.value=0),!t._flags.unsafe)if(e.match(/e/i)){if(o.normalizeExponent("".concat(n.value/Math.pow(10,s[1]),"e").concat(s[1]))!==o.normalizeExponent(e))return n.errors=r("number.unsafe"),n}else{const t=n.value.toString();if(t.match(/e/i))return n;if(t!==o.normalizeDecimal(e))return n.errors=r("number.unsafe"),n}return n}},validate(e,{schema:t,error:r,prefs:s}){if(e===1/0||e===-1/0)return{value:e,errors:r("number.infinity")};if(!a.isNumber(e))return{value:e,errors:r("number.base")};const n={value:e};if(s.convert){const e=t.$_getRule("precision");if(e){const t=Math.pow(10,e.args.limit);n.value=Math.round(n.value*t)/t}}return 0===n.value&&(n.value=0),!t._flags.unsafe&&(e>Number.MAX_SAFE_INTEGER||ea.compare(e,r,n)?e:t.error("number."+s,{limit:o.limit,value:e}),args:[{name:"limit",ref:!0,assert:a.isNumber,message:"must be a number"}]},greater:{method(e){return this.$_addRule({name:"greater",method:"compare",args:{limit:e},operator:">"})}},integer:{method(){return this.$_addRule("integer")},validate:(e,t)=>Math.trunc(e)-e==0?e:t.error("number.integer")},less:{method(e){return this.$_addRule({name:"less",method:"compare",args:{limit:e},operator:"<"})}},max:{method(e){return this.$_addRule({name:"max",method:"compare",args:{limit:e},operator:"<="})}},min:{method(e){return this.$_addRule({name:"min",method:"compare",args:{limit:e},operator:">="})}},multiple:{method(e){return this.$_addRule({name:"multiple",args:{base:e}})},validate:(e,t,{base:r},s)=>e%r==0?e:t.error("number.multiple",{multiple:s.args.base,value:e}),args:[{name:"base",ref:!0,assert:e=>"number"==typeof e&&isFinite(e)&&e>0,message:"must be a positive number"}],multi:!0},negative:{method(){return this.sign("negative")}},port:{method(){return this.$_addRule("port")},validate:(e,t)=>Number.isSafeInteger(e)&&e>=0&&e<=65535?e:t.error("number.port")},positive:{method(){return this.sign("positive")}},precision:{method(e){return s(Number.isSafeInteger(e),"limit must be an integer"),this.$_addRule({name:"precision",args:{limit:e}})},validate(e,t,{limit:r}){const s=e.toString().match(o.precisionRx);return Math.max((s[1]?s[1].length:0)-(s[2]?parseInt(s[2],10):0),0)<=r?e:t.error("number.precision",{limit:r,value:e})},convert:!0},sign:{method(e){return s(["negative","positive"].includes(e),"Invalid sign",e),this.$_addRule({name:"sign",args:{sign:e}})},validate:(e,t,{sign:r})=>"negative"===r&&e<0||"positive"===r&&e>0?e:t.error("number.".concat(r))},unsafe:{method(e=!0){return s("boolean"==typeof e,"enabled must be a boolean"),this.$_setFlag("unsafe",e)}}},cast:{string:{from:e=>"number"==typeof e,to:(e,t)=>e.toString()}},messages:{"number.base":"{{#label}} must be a number","number.greater":"{{#label}} must be greater than {{#limit}}","number.infinity":"{{#label}} cannot be infinity","number.integer":"{{#label}} must be an integer","number.less":"{{#label}} must be less than {{#limit}}","number.max":"{{#label}} must be less than or equal to {{#limit}}","number.min":"{{#label}} must be greater than or equal to {{#limit}}","number.multiple":"{{#label}} must be a multiple of 
{{#multiple}}","number.negative":"{{#label}} must be a negative number","number.port":"{{#label}} must be a valid port","number.positive":"{{#label}} must be a positive number","number.precision":"{{#label}} must have no more than {{#limit}} decimal places","number.unsafe":"{{#label}} must be a safe number"}}),o.normalizeExponent=function(e){return e.replace(/E/,"e").replace(/\.(\d*[1-9])?0+e/,".$1e").replace(/\.e/,"e").replace(/e\+/,"e").replace(/^\+/,"").replace(/^(-?)0+([1-9])/,"$1$2")},o.normalizeDecimal=function(e){return(e=e.replace(/^\+/,"").replace(/\.0*$/,"").replace(/^(-?)\.([^\.]*)$/,"$10.$2").replace(/^(-?)0+([0-9])/,"$1$2")).includes(".")&&e.endsWith("0")&&(e=e.replace(/0+$/,"")),"-0"===e?"0":e}},8966:(e,t,r)=>{"use strict";const s=r(7824);e.exports=s.extend({type:"object",cast:{map:{from:e=>e&&"object"==typeof e,to:(e,t)=>new Map(Object.entries(e))}}})},7417:(e,t,r)=>{"use strict";function s(e,t){var r=Object.keys(e);if(Object.getOwnPropertySymbols){var s=Object.getOwnPropertySymbols(e);t&&(s=s.filter((function(t){return Object.getOwnPropertyDescriptor(e,t).enumerable}))),r.push.apply(r,s)}return r}function n(e){for(var t=1;t"string"!=typeof e?{value:e,errors:t("string.base")}:""===e?{value:e,errors:t("string.empty")}:void 0,rules:{alphanum:{method(){return this.$_addRule("alphanum")},validate:(e,t)=>/^[a-zA-Z0-9]+$/.test(e)?e:t.error("string.alphanum")},base64:{method(e={}){return d.assertOptions(e,["paddingRequired","urlSafe"]),e=n({urlSafe:!1,paddingRequired:!0},e),o("boolean"==typeof e.paddingRequired,"paddingRequired must be boolean"),o("boolean"==typeof e.urlSafe,"urlSafe must be boolean"),this.$_addRule({name:"base64",args:{options:e}})},validate:(e,t,{options:r})=>p.base64Regex[r.paddingRequired][r.urlSafe].test(e)?e:t.error("string.base64")},case:{method(e){return o(["lower","upper"].includes(e),"Invalid case:",e),this.$_addRule({name:"case",args:{direction:e}})},validate:(e,t,{direction:r})=>"lower"===r&&e===e.toLocaleLowerCase()||"upper"===r&&e===e.toLocaleUpperCase()?e:t.error("string.".concat(r,"case")),convert:!0},creditCard:{method(){return this.$_addRule("creditCard")},validate(e,t){let r=e.length,s=0,n=1;for(;r--;){const t=e.charAt(r)*n;s+=t-9*(t>9),n^=3}return s>0&&s%10==0?e:t.error("string.creditCard")}},dataUri:{method(e={}){return d.assertOptions(e,["paddingRequired"]),e=n({paddingRequired:!0},e),o("boolean"==typeof e.paddingRequired,"paddingRequired must be boolean"),this.$_addRule({name:"dataUri",args:{options:e}})},validate(e,t,{options:r}){const s=e.match(p.dataUriRegex);if(s){if(!s[2])return e;if("base64"!==s[2])return e;if(p.base64Regex[r.paddingRequired].false.test(s[3]))return e}return t.error("string.dataUri")}},domain:{method(e){e&&d.assertOptions(e,["allowUnicode","maxDomainSegments","minDomainSegments","tlds"]);const t=p.addressOptions(e);return this.$_addRule({name:"domain",args:{options:e},address:t})},validate:(e,t,r,{address:s})=>i.isValid(e,s)?e:t.error("string.domain")},email:{method(e={}){d.assertOptions(e,["allowUnicode","ignoreLength","maxDomainSegments","minDomainSegments","multiple","separator","tlds"]),o(void 0===e.multiple||"boolean"==typeof e.multiple,"multiple option must be an boolean");const t=p.addressOptions(e),r=new RegExp("\\s*[".concat(e.separator?u(e.separator):",","]\\s*"));return this.$_addRule({name:"email",args:{options:e},regex:r,address:t})},validate(e,t,{options:r},{regex:s,address:n}){const a=r.multiple?e.split(s):[e],o=[];for(const e of a)l.isValid(e,n)||o.push(e);return 
o.length?t.error("string.email",{value:e,invalids:o}):e}},guid:{alias:"uuid",method(e={}){d.assertOptions(e,["version","separator"]);let t="";if(e.version){const r=[].concat(e.version);o(r.length>=1,"version must have at least 1 valid version specified");const s=new Set;for(let e=0;ep.hexRegex.test(e)?r.byteAligned&&e.length%2!=0?t.error("string.hexAlign"):e:t.error("string.hex")},hostname:{method(){return this.$_addRule("hostname")},validate:(e,t)=>i.isValid(e,{minDomainSegments:1})||p.ipRegex.test(e)?e:t.error("string.hostname")},insensitive:{method(){return this.$_setFlag("insensitive",!0)}},ip:{method(e={}){d.assertOptions(e,["cidr","version"]);const{cidr:t,versions:r,regex:s}=c.regex(e),n=e.version?r:void 0;return this.$_addRule({name:"ip",args:{options:{cidr:t,version:n}},regex:s})},validate:(e,t,{options:r},{regex:s})=>s.test(e)?e:r.version?t.error("string.ipVersion",{value:e,cidr:r.cidr,version:r.version}):t.error("string.ip",{value:e,cidr:r.cidr})},isoDate:{method(){return this.$_addRule("isoDate")},validate:(e,{error:t})=>p.isoDate(e)?e:t("string.isoDate")},isoDuration:{method(){return this.$_addRule("isoDuration")},validate:(e,t)=>p.isoDurationRegex.test(e)?e:t.error("string.isoDuration")},length:{method(e,t){return p.length(this,"length",e,"=",t)},validate(e,t,{limit:r,encoding:s},{name:n,operator:a,args:o}){const i=!s&&e.length;return d.compare(i,r,a)?e:t.error("string."+n,{limit:o.limit,value:e,encoding:s})},args:[{name:"limit",ref:!0,assert:d.limit,message:"must be a positive integer"},"encoding"]},lowercase:{method(){return this.case("lower")}},max:{method(e,t){return p.length(this,"max",e,"<=",t)},args:["limit","encoding"]},min:{method(e,t){return p.length(this,"min",e,">=",t)},args:["limit","encoding"]},normalize:{method(e="NFC"){return o(p.normalizationForms.includes(e),"normalization form must be one of "+p.normalizationForms.join(", ")),this.$_addRule({name:"normalize",args:{form:e}})},validate:(e,{error:t},{form:r})=>e===e.normalize(r)?e:t("string.normalize",{value:e,form:r}),convert:!0},pattern:{alias:"regex",method(e,t={}){o(e instanceof RegExp,"regex must be a RegExp"),o(!e.flags.includes("g")&&!e.flags.includes("y"),"regex should not use global or sticky mode"),"string"==typeof t&&(t={name:t}),d.assertOptions(t,["invert","name"]);const r=["string.pattern",t.invert?".invert":"",t.name?".name":".base"].join("");return this.$_addRule({name:"pattern",args:{regex:e,options:t},errorCode:r})},validate:(e,t,{regex:r,options:s},{errorCode:n})=>r.test(e)^s.invert?e:t.error(n,{name:s.name,regex:r,value:e}),args:["regex","options"],multi:!0},replace:{method(e,t){"string"==typeof e&&(e=new RegExp(u(e),"g")),o(e instanceof RegExp,"pattern must be a RegExp"),o("string"==typeof t,"replacement must be a String");const r=this.clone();return r.$_terms.replacements||(r.$_terms.replacements=[]),r.$_terms.replacements.push({pattern:e,replacement:t}),r}},token:{method(){return this.$_addRule("token")},validate:(e,t)=>/^\w+$/.test(e)?e:t.error("string.token")},trim:{method(e=!0){return o("boolean"==typeof e,"enabled must be a boolean"),this.$_addRule({name:"trim",args:{enabled:e}})},validate:(e,t,{enabled:r})=>r&&e!==e.trim()?t.error("string.trim"):e,convert:!0},truncate:{method(e=!0){return o("boolean"==typeof e,"enabled must be a boolean"),this.$_setFlag("truncate",e)}},uppercase:{method(){return 
this.case("upper")}},uri:{method(e={}){d.assertOptions(e,["allowRelative","allowQuerySquareBrackets","domain","relativeOnly","scheme"]),e.domain&&d.assertOptions(e.domain,["allowUnicode","maxDomainSegments","minDomainSegments","tlds"]);const{regex:t,scheme:r}=m.regex(e),s=e.domain?p.addressOptions(e.domain):null;return this.$_addRule({name:"uri",args:{options:e},regex:t,domain:s,scheme:r})},validate(e,t,{options:r},{regex:s,domain:n,scheme:a}){if(["http:/","https:/"].includes(e))return t.error("string.uri");const o=s.exec(e);if(o){const s=o[1]||o[2];return!n||r.allowRelative&&!s||i.isValid(s,n)?e:t.error("string.domain",{value:s})}return r.relativeOnly?t.error("string.uriRelativeOnly"):r.scheme?t.error("string.uriCustomScheme",{scheme:a,value:e}):t.error("string.uri")}}},manifest:{build(e,t){if(t.replacements)for(const{pattern:r,replacement:s}of t.replacements)e=e.replace(r,s);return e}},messages:{"string.alphanum":"{{#label}} must only contain alpha-numeric characters","string.base":"{{#label}} must be a string","string.base64":"{{#label}} must be a valid base64 string","string.creditCard":"{{#label}} must be a credit card","string.dataUri":"{{#label}} must be a valid dataUri string","string.domain":"{{#label}} must contain a valid domain name","string.email":"{{#label}} must be a valid email","string.empty":"{{#label}} is not allowed to be empty","string.guid":"{{#label}} must be a valid GUID","string.hex":"{{#label}} must only contain hexadecimal characters","string.hexAlign":"{{#label}} hex decoded representation must be byte aligned","string.hostname":"{{#label}} must be a valid hostname","string.ip":"{{#label}} must be a valid ip address with a {{#cidr}} CIDR","string.ipVersion":"{{#label}} must be a valid ip address of one of the following versions {{#version}} with a {{#cidr}} CIDR","string.isoDate":"{{#label}} must be in iso format","string.isoDuration":"{{#label}} must be a valid ISO 8601 duration","string.length":"{{#label}} length must be {{#limit}} characters long","string.lowercase":"{{#label}} must only contain lowercase characters","string.max":"{{#label}} length must be less than or equal to {{#limit}} characters long","string.min":"{{#label}} length must be at least {{#limit}} characters long","string.normalize":"{{#label}} must be unicode normalized in the {{#form}} form","string.token":"{{#label}} must only contain alpha-numeric and underscore characters","string.pattern.base":"{{#label}} with value {:[.]} fails to match the required pattern: {{#regex}}","string.pattern.name":"{{#label}} with value {:[.]} fails to match the {{#name}} pattern","string.pattern.invert.base":"{{#label}} with value {:[.]} matches the inverted pattern: {{#regex}}","string.pattern.invert.name":"{{#label}} with value {:[.]} matches the inverted {{#name}} pattern","string.trim":"{{#label}} must not have leading or trailing whitespace","string.uri":"{{#label}} must be a valid uri","string.uriCustomScheme":"{{#label}} must be a valid uri with a scheme matching the {{#scheme}} pattern","string.uriRelativeOnly":"{{#label}} must be a valid relative uri","string.uppercase":"{{#label}} must only contain uppercase characters"}}),p.addressOptions=function(e){if(!e)return e;if(o(void 0===e.minDomainSegments||Number.isSafeInteger(e.minDomainSegments)&&e.minDomainSegments>0,"minDomainSegments must be a positive integer"),o(void 0===e.maxDomainSegments||Number.isSafeInteger(e.maxDomainSegments)&&e.maxDomainSegments>0,"maxDomainSegments must be a positive integer"),!1===e.tlds)return e;if(!0===e.tlds||void 
0===e.tlds)return o(p.tlds,"Built-in TLD list disabled"),Object.assign({},e,p.tlds);o("object"==typeof e.tlds,"tlds must be true, false, or an object");const t=e.tlds.deny;if(t)return Array.isArray(t)&&(e=Object.assign({},e,{tlds:{deny:new Set(t)}})),o(e.tlds.deny instanceof Set,"tlds.deny must be an array, Set, or boolean"),o(!e.tlds.allow,"Cannot specify both tlds.allow and tlds.deny lists"),p.validateTlds(e.tlds.deny,"tlds.deny"),e;const r=e.tlds.allow;return r?!0===r?(o(p.tlds,"Built-in TLD list disabled"),Object.assign({},e,p.tlds)):(Array.isArray(r)&&(e=Object.assign({},e,{tlds:{allow:new Set(r)}})),o(e.tlds.allow instanceof Set,"tlds.allow must be an array, Set, or boolean"),p.validateTlds(e.tlds.allow,"tlds.allow"),e):e},p.validateTlds=function(e,t){for(const r of e)o(i.isValid(r,{minDomainSegments:1,maxDomainSegments:1}),"".concat(t," must contain valid top level domain names"))},p.isoDate=function(e){if(!d.isIsoDate(e))return null;/.*T.*[+-]\d\d$/.test(e)&&(e+="00");const t=new Date(e);return isNaN(t.getTime())?null:t.toISOString()},p.length=function(e,t,r,s,n){return o(!n||!1,"Invalid encoding:",n),e.$_addRule({name:t,method:"length",args:{limit:r,encoding:n},operator:s})}},8826:(e,t,r)=>{"use strict";const s=r(375),n=r(8068),a={};a.Map=class extends Map{slice(){return new a.Map(this)}},e.exports=n.extend({type:"symbol",terms:{map:{init:new a.Map}},coerce:{method(e,{schema:t,error:r}){const s=t.$_terms.map.get(e);return s&&(e=s),t._flags.only&&"symbol"!=typeof e?{value:e,errors:r("symbol.map",{map:t.$_terms.map})}:{value:e}}},validate(e,{error:t}){if("symbol"!=typeof e)return{value:e,errors:t("symbol.base")}},rules:{map:{method(e){e&&!e[Symbol.iterator]&&"object"==typeof e&&(e=Object.entries(e)),s(e&&e[Symbol.iterator],"Iterable must be an iterable or object");const t=this.clone(),r=[];for(const n of e){s(n&&n[Symbol.iterator],"Entry must be an iterable");const[e,a]=n;s("object"!=typeof e&&"function"!=typeof e&&"symbol"!=typeof e,"Key must not be of type object, function, or Symbol"),s("symbol"==typeof a,"Value must be a Symbol"),t.$_terms.map.set(e,a),r.push(a)}return t.valid(...r)}}},manifest:{build:(e,t)=>(t.map&&(e=e.map(t.map)),e)},messages:{"symbol.base":"{{#label}} must be a symbol","symbol.map":"{{#label}} must be one of {{#map}}"}})},8863:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(738),o=r(9621),i=r(8160),l=r(6354),c=r(493),u={result:Symbol("result")};t.entry=function(e,t,r){let n=i.defaults;r&&(s(void 0===r.warnings,"Cannot override warnings preference in synchronous validation"),s(void 0===r.artifacts,"Cannot override artifacts preference in synchronous validation"),n=i.preferences(i.defaults,r));const a=u.entry(e,t,n);s(!a.mainstay.externals.length,"Schema with external rules must use validateAsync()");const o={value:a.value};return a.error&&(o.error=a.error),a.mainstay.warnings.length&&(o.warning=l.details(a.mainstay.warnings)),a.mainstay.debug&&(o.debug=a.mainstay.debug),a.mainstay.artifacts&&(o.artifacts=a.mainstay.artifacts),o},t.entryAsync=async function(e,t,r){let s=i.defaults;r&&(s=i.preferences(i.defaults,r));const n=u.entry(e,t,s),a=n.mainstay;if(n.error)throw a.debug&&(n.error.debug=a.debug),n.error;if(a.externals.length){let e=n.value;for(const{method:t,path:s,label:n}of a.externals){let a,i,l=e;s.length&&(a=s[s.length-1],i=o(e,s.slice(0,-1)),l=i[a]);try{const s=await t(l,{prefs:r});if(void 0===s||s===l)continue;i?i[a]=s:e=s}catch(e){throw e.message+=" (".concat(n,")"),e}}n.value=e}if(!s.warnings&&!s.debug&&!s.artifacts)return n.value;const 
c={value:n.value};return a.warnings.length&&(c.warning=l.details(a.warnings)),a.debug&&(c.debug=a.debug),a.artifacts&&(c.artifacts=a.artifacts),c},u.entry=function(e,r,s){const{tracer:n,cleanup:a}=u.tracer(r,s),o={externals:[],warnings:[],tracer:n,debug:s.debug?[]:null,links:r._ids._schemaChain?new Map:null},i=r._ids._schemaChain?[{schema:r}]:null,f=new c([],[],{mainstay:o,schemas:i}),m=t.validate(e,r,f,s);a&&r.$_root.untrace();const h=l.process(m.errors,e,s);return{value:m.value,error:h,mainstay:o}},u.tracer=function(e,t){return e.$_root._tracer?{tracer:e.$_root._tracer._register(e)}:t.debug?(s(e.$_root.trace,"Debug mode not supported"),{tracer:e.$_root.trace()._register(e),cleanup:!0}):{tracer:u.ignore}},t.validate=function(e,t,r,s,n={}){if(t.$_terms.whens&&(t=t._generate(e,r,s).schema),t._preferences&&(s=u.prefs(t,s)),t._cache&&s.cache){const s=t._cache.get(e);if(r.mainstay.tracer.debug(r,"validate","cached",!!s),s)return s}const a=(n,a,o)=>t.$_createError(n,e,a,o||r,s),o={original:e,prefs:s,schema:t,state:r,error:a,errorsArray:u.errorsArray,warn:(e,t,s)=>r.mainstay.warnings.push(a(e,t,s)),message:(n,a)=>t.$_createError("custom",e,a,r,s,{messages:n})};r.mainstay.tracer.entry(t,r);const l=t._definition;if(l.prepare&&void 0!==e&&s.convert){const t=l.prepare(e,o);if(t){if(r.mainstay.tracer.value(r,"prepare",e,t.value),t.errors)return u.finalize(t.value,[].concat(t.errors),o);e=t.value}}if(l.coerce&&void 0!==e&&s.convert&&(!l.coerce.from||l.coerce.from.includes(typeof e))){const t=l.coerce.method(e,o);if(t){if(r.mainstay.tracer.value(r,"coerced",e,t.value),t.errors)return u.finalize(t.value,[].concat(t.errors),o);e=t.value}}const c=t._flags.empty;c&&c.$_match(u.trim(e,t),r.nest(c),i.defaults)&&(r.mainstay.tracer.value(r,"empty",e,void 0),e=void 0);const f=n.presence||t._flags.presence||(t._flags._endedSwitch?null:s.presence);if(void 0===e){if("forbidden"===f)return u.finalize(e,null,o);if("required"===f)return u.finalize(e,[t.$_createError("any.required",e,null,r,s)],o);if("optional"===f){if(t._flags.default!==i.symbols.deepDefault)return u.finalize(e,null,o);r.mainstay.tracer.value(r,"default",e,{}),e={}}}else if("forbidden"===f)return u.finalize(e,[t.$_createError("any.unknown",e,null,r,s)],o);const m=[];if(t._valids){const n=t._valids.get(e,r,s,t._flags.insensitive);if(n)return s.convert&&(r.mainstay.tracer.value(r,"valids",e,n.value),e=n.value),r.mainstay.tracer.filter(t,r,"valid",n),u.finalize(e,null,o);if(t._flags.only){const n=t.$_createError("any.only",e,{valids:t._valids.values({display:!0})},r,s);if(s.abortEarly)return u.finalize(e,[n],o);m.push(n)}}if(t._invalids){const n=t._invalids.get(e,r,s,t._flags.insensitive);if(n){r.mainstay.tracer.filter(t,r,"invalid",n);const a=t.$_createError("any.invalid",e,{invalids:t._invalids.values({display:!0})},r,s);if(s.abortEarly)return u.finalize(e,[a],o);m.push(a)}}if(l.validate){const t=l.validate(e,o);if(t&&(r.mainstay.tracer.value(r,"base",e,t.value),e=t.value,t.errors)){if(!Array.isArray(t.errors))return m.push(t.errors),u.finalize(e,m,o);if(t.errors.length)return m.push(...t.errors),u.finalize(e,m,o)}}return t._rules.length?u.rules(e,m,o):u.finalize(e,m,o)},u.rules=function(e,t,r){const{schema:s,state:n,prefs:a}=r;for(const o of s._rules){const l=s._definition.rules[o.method];if(l.convert&&a.convert){n.mainstay.tracer.log(s,n,"rule",o.name,"full");continue}let c,f=o.args;if(o._resolve.length){f=Object.assign({},f);for(const t of o._resolve){const 
r=l.argsByName.get(t),o=f[t].resolve(e,n,a),u=r.normalize?r.normalize(o):o,m=i.validateArg(u,null,r);if(m){c=s.$_createError("any.ref",o,{arg:t,ref:f[t],reason:m},n,a);break}f[t]=u}}c=c||l.validate(e,r,f,o);const m=u.rule(c,o);if(m.errors){if(n.mainstay.tracer.log(s,n,"rule",o.name,"error"),o.warn){n.mainstay.warnings.push(...m.errors);continue}if(a.abortEarly)return u.finalize(e,m.errors,r);t.push(...m.errors)}else n.mainstay.tracer.log(s,n,"rule",o.name,"pass"),n.mainstay.tracer.value(n,"rule",e,m.value,o.name),e=m.value}return u.finalize(e,t,r)},u.rule=function(e,t){return e instanceof l.Report?(u.error(e,t),{errors:[e],value:null}):Array.isArray(e)&&e[i.symbols.errors]?(e.forEach((e=>u.error(e,t))),{errors:e,value:null}):{errors:null,value:e}},u.error=function(e,t){return t.message&&e._setTemplate(t.message),e},u.finalize=function(e,t,r){t=t||[];const{schema:n,state:a,prefs:o}=r;if(t.length){const s=u.default("failover",void 0,t,r);void 0!==s&&(a.mainstay.tracer.value(a,"failover",e,s),e=s,t=[])}if(t.length&&n._flags.error)if("function"==typeof n._flags.error){t=n._flags.error(t),Array.isArray(t)||(t=[t]);for(const e of t)s(e instanceof Error||e instanceof l.Report,"error() must return an Error object")}else t=[n._flags.error];if(void 0===e){const s=u.default("default",e,t,r);a.mainstay.tracer.value(a,"default",e,s),e=s}if(n._flags.cast&&void 0!==e){const t=n._definition.cast[n._flags.cast];if(t.from(e)){const s=t.to(e,r);a.mainstay.tracer.value(a,"cast",e,s,n._flags.cast),e=s}}if(n.$_terms.externals&&o.externals&&!1!==o._externals)for(const{method:e}of n.$_terms.externals)a.mainstay.externals.push({method:e,path:a.path,label:l.label(n._flags,a,o)});const i={value:e,errors:t.length?t:null};return n._flags.result&&(i.value="strip"===n._flags.result?void 0:r.original,a.mainstay.tracer.value(a,n._flags.result,e,i.value),a.shadow(e,n._flags.result)),n._cache&&!1!==o.cache&&!n._refs.length&&n._cache.set(r.original,i),void 0===e||i.errors||void 0===n._flags.artifact||(a.mainstay.artifacts=a.mainstay.artifacts||new Map,a.mainstay.artifacts.has(n._flags.artifact)||a.mainstay.artifacts.set(n._flags.artifact,[]),a.mainstay.artifacts.get(n._flags.artifact).push(a.path)),i},u.prefs=function(e,t){const r=t===i.defaults;return r&&e._preferences[i.symbols.prefs]?e._preferences[i.symbols.prefs]:(t=i.preferences(t,e._preferences),r&&(e._preferences[i.symbols.prefs]=t),t)},u.default=function(e,t,r,s){const{schema:a,state:o,prefs:l}=s,c=a._flags[e];if(l.noDefaults||void 0===c)return t;if(o.mainstay.tracer.log(a,o,"rule",e,"full"),!c)return c;if("function"==typeof c){const t=c.length?[n(o.ancestors[0]),s]:[];try{return c(...t)}catch(t){return void r.push(a.$_createError("any.".concat(e),null,{error:t},o,l))}}return"object"!=typeof c?c:c[i.symbols.literal]?c.literal:i.isResolvable(c)?c.resolve(t,o,l):n(c)},u.trim=function(e,t){if("string"!=typeof e)return e;const r=t.$_getRule("trim");return r&&r.args.enabled?e.trim():e},u.ignore={active:!1,debug:a,entry:a,filter:a,log:a,resolve:a,value:a},u.errorsArray=function(){const e=[];return e[i.symbols.errors]=!0,e}},2036:(e,t,r)=>{"use strict";const s=r(375),n=r(9474),a=r(8160),o={};e.exports=o.Values=class{constructor(e,t){this._values=new Set(e),this._refs=new Set(t),this._lowercase=o.lowercases(e),this._override=!1}get length(){return this._values.size+this._refs.size}add(e,t){a.isResolvable(e)?this._refs.has(e)||(this._refs.add(e),t&&t.register(e)):this.has(e,null,null,!1)||(this._values.add(e),"string"==typeof e&&this._lowercase.set(e.toLowerCase(),e))}static 
merge(e,t,r){if(e=e||new o.Values,t){if(t._override)return t.clone();for(const r of[...t._values,...t._refs])e.add(r)}if(r)for(const t of[...r._values,...r._refs])e.remove(t);return e.length?e:null}remove(e){a.isResolvable(e)?this._refs.delete(e):(this._values.delete(e),"string"==typeof e&&this._lowercase.delete(e.toLowerCase()))}has(e,t,r,s){return!!this.get(e,t,r,s)}get(e,t,r,s){if(!this.length)return!1;if(this._values.has(e))return{value:e};if("string"==typeof e&&e&&s){const t=this._lowercase.get(e.toLowerCase());if(t)return{value:t}}if(!this._refs.size&&"object"!=typeof e)return!1;if("object"==typeof e)for(const t of this._values)if(n(t,e))return{value:t};if(t)for(const a of this._refs){const o=a.resolve(e,t,r,null,{in:!0});if(void 0===o)continue;const i=a.in&&"object"==typeof o?Array.isArray(o)?o:Object.keys(o):[o];for(const t of i)if(typeof t==typeof e)if(s&&e&&"string"==typeof e){if(t.toLowerCase()===e.toLowerCase())return{value:t,ref:a}}else if(n(t,e))return{value:t,ref:a}}return!1}override(){this._override=!0}values(e){if(e&&e.display){const e=[];for(const t of[...this._values,...this._refs])void 0!==t&&e.push(t);return e}return Array.from([...this._values,...this._refs])}clone(){const e=new o.Values(this._values,this._refs);return e._override=this._override,e}concat(e){s(!e._override,"Cannot concat override set of values");const t=new o.Values([...this._values,...e._values],[...this._refs,...e._refs]);return t._override=this._override,t}describe(){const e=[];this._override&&e.push({override:!0});for(const t of this._values.values())e.push(t&&"object"==typeof t?{value:t}:t);for(const t of this._refs.values())e.push(t.describe());return e}},o.Values.prototype[a.symbols.values]=!0,o.Values.prototype.slice=o.Values.prototype.clone,o.lowercases=function(e){const t=new Map;if(e)for(const r of e)"string"==typeof r&&t.set(r.toLowerCase(),r);return t}},978:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(1687),o=r(9621),i={};e.exports=function(e,t,r={}){if(s(e&&"object"==typeof e,"Invalid defaults value: must be an object"),s(!t||!0===t||"object"==typeof t,"Invalid source value: must be true, falsy or an object"),s("object"==typeof r,"Invalid options: must be an object"),!t)return null;if(r.shallow)return i.applyToDefaultsWithShallow(e,t,r);const o=n(e);if(!0===t)return o;const l=void 0!==r.nullOverride&&r.nullOverride;return a(o,t,{nullOverride:l,mergeArrays:!1})},i.applyToDefaultsWithShallow=function(e,t,r){const l=r.shallow;s(Array.isArray(l),"Invalid keys");const c=new Map,u=!0===t?null:new Set;for(let r of l){r=Array.isArray(r)?r:r.split(".");const s=o(e,r);s&&"object"==typeof s?c.set(s,u&&o(t,r)||s):u&&u.add(r)}const f=n(e,{},c);if(!u)return f;for(const e of u)i.reachCopy(f,t,e);const m=void 0!==r.nullOverride&&r.nullOverride;return a(f,t,{nullOverride:m,mergeArrays:!1})},i.reachCopy=function(e,t,r){for(const e of r){if(!(e in t))return;const r=t[e];if("object"!=typeof r||null===r)return;t=r}const s=t;let n=e;for(let e=0;e{"use strict";const s=r(7916);e.exports=function(e,...t){if(!e){if(1===t.length&&t[0]instanceof Error)throw t[0];throw new s(t)}}},8571:(e,t,r)=>{"use strict";const s=r(9621),n=r(4277),a=r(7043),o={needsProtoHack:new Set([n.set,n.map,n.weakSet,n.weakMap])};e.exports=o.clone=function(e,t={},r=null){if("object"!=typeof e||null===e)return e;let s=o.clone,i=r;if(t.shallow){if(!0!==t.shallow)return o.cloneWithShallow(e,t);s=e=>e}else if(i){const t=i.get(e);if(t)return t}else i=new Map;const l=n.getInternalProto(e);if(l===n.buffer)return!1;if(l===n.date)return new 
Date(e.getTime());if(l===n.regex)return new RegExp(e);const c=o.base(e,l,t);if(c===e)return e;if(i&&i.set(e,c),l===n.set)for(const r of e)c.add(s(r,t,i));else if(l===n.map)for(const[r,n]of e)c.set(r,s(n,t,i));const u=a.keys(e,t);for(const r of u){if("__proto__"===r)continue;if(l===n.array&&"length"===r){c.length=e.length;continue}const a=Object.getOwnPropertyDescriptor(e,r);a?a.get||a.set?Object.defineProperty(c,r,a):a.enumerable?c[r]=s(e[r],t,i):Object.defineProperty(c,r,{enumerable:!1,writable:!0,configurable:!0,value:s(e[r],t,i)}):Object.defineProperty(c,r,{enumerable:!0,writable:!0,configurable:!0,value:s(e[r],t,i)})}return c},o.cloneWithShallow=function(e,t){const r=t.shallow;(t=Object.assign({},t)).shallow=!1;const n=new Map;for(const t of r){const r=s(e,t);"object"!=typeof r&&"function"!=typeof r||n.set(r,r)}return o.clone(e,t,n)},o.base=function(e,t,r){if(!1===r.prototype)return o.needsProtoHack.has(t)?new t.constructor:t===n.array?[]:{};const s=Object.getPrototypeOf(e);if(s&&s.isImmutable)return e;if(t===n.array){const e=[];return s!==t&&Object.setPrototypeOf(e,s),e}if(o.needsProtoHack.has(t)){const e=new s.constructor;return s!==t&&Object.setPrototypeOf(e,s),e}return Object.create(s)}},9474:(e,t,r)=>{"use strict";const s=r(4277),n={mismatched:null};e.exports=function(e,t,r){return r=Object.assign({prototype:!0},r),!!n.isDeepEqual(e,t,r,[])},n.isDeepEqual=function(e,t,r,a){if(e===t)return 0!==e||1/e==1/t;const o=typeof e;if(o!==typeof t)return!1;if(null===e||null===t)return!1;if("function"===o){if(!r.deepFunction||e.toString()!==t.toString())return!1}else if("object"!==o)return e!=e&&t!=t;const i=n.getSharedType(e,t,!!r.prototype);switch(i){case s.buffer:return!1;case s.promise:return e===t;case s.regex:return e.toString()===t.toString();case n.mismatched:return!1}for(let r=a.length-1;r>=0;--r)if(a[r].isSame(e,t))return!0;a.push(new n.SeenEntry(e,t));try{return!!n.isDeepEqualObj(i,e,t,r,a)}finally{a.pop()}},n.getSharedType=function(e,t,r){if(r)return Object.getPrototypeOf(e)!==Object.getPrototypeOf(t)?n.mismatched:s.getInternalProto(e);const a=s.getInternalProto(e);return a!==s.getInternalProto(t)?n.mismatched:a},n.valueOf=function(e){const t=e.valueOf;if(void 0===t)return e;try{return t.call(e)}catch(e){return e}},n.hasOwnEnumerableProperty=function(e,t){return Object.prototype.propertyIsEnumerable.call(e,t)},n.isSetSimpleEqual=function(e,t){for(const r of Set.prototype.values.call(e))if(!Set.prototype.has.call(t,r))return!1;return!0},n.isDeepEqualObj=function(e,t,r,a,o){const{isDeepEqual:i,valueOf:l,hasOwnEnumerableProperty:c}=n,{keys:u,getOwnPropertySymbols:f}=Object;if(e===s.array){if(!a.part){if(t.length!==r.length)return!1;for(let e=0;e{"use strict";const s=r(8761);e.exports=class extends Error{constructor(e){super(e.filter((e=>""!==e)).map((e=>"string"==typeof e?e:e instanceof Error?e.message:s(e))).join(" ")||"Unknown error"),"function"==typeof Error.captureStackTrace&&Error.captureStackTrace(this,t.assert)}}},5277:e=>{"use strict";const t={};e.exports=function(e){if(!e)return"";let r="";for(let s=0;s=256)return"&#"+e+";";const s=e.toString(16).padStart(2,"0");return"&#x".concat(s,";")},t.isSafe=function(e){return void 0!==t.safeCharCodes[e]},t.namedHtml={38:"&",60:"<",62:">",34:""",160:" ",162:"¢",163:"£",164:"¤",169:"©",174:"®"},t.safeCharCodes=function(){const e={};for(let t=32;t<123;++t)(t>=97||t>=65&&t<=90||t>=48&&t<=57||32===t||46===t||44===t||45===t||58===t||95===t)&&(e[t]=null);return e}()},6064:e=>{"use strict";e.exports=function(e){return 
e.replace(/[\^\$\.\*\+\-\?\=\!\:\|\\\/\(\)\[\]\{\}\,]/g,"\\$&")}},738:e=>{"use strict";e.exports=function(){}},1687:(e,t,r)=>{"use strict";const s=r(375),n=r(8571),a=r(7043),o={};e.exports=o.merge=function(e,t,r){if(s(e&&"object"==typeof e,"Invalid target value: must be an object"),s(null==t||"object"==typeof t,"Invalid source value: must be null, undefined, or an object"),!t)return e;if(r=Object.assign({nullOverride:!0,mergeArrays:!0},r),Array.isArray(t)){s(Array.isArray(e),"Cannot merge array onto an object"),r.mergeArrays||(e.length=0);for(let s=0;s{"use strict";const s=r(375),n={};e.exports=function(e,t,r){if(!1===t||null==t)return e;"string"==typeof(r=r||{})&&(r={separator:r});const a=Array.isArray(t);s(!a||!r.separator,"Separator option no valid for array-based chain");const o=a?t:t.split(r.separator||".");let i=e;for(let e=0;e{"use strict";e.exports=function(...e){try{return JSON.stringify.apply(null,e)}catch(e){return"[Cannot display object: "+e.message+"]"}}},4277:(e,t)=>{"use strict";const r={};t=e.exports={array:Array.prototype,buffer:!1,date:Date.prototype,error:Error.prototype,generic:Object.prototype,map:Map.prototype,promise:Promise.prototype,regex:RegExp.prototype,set:Set.prototype,weakMap:WeakMap.prototype,weakSet:WeakSet.prototype},r.typeMap=new Map([["[object Error]",t.error],["[object Map]",t.map],["[object Promise]",t.promise],["[object Set]",t.set],["[object WeakMap]",t.weakMap],["[object WeakSet]",t.weakSet]]),t.getInternalProto=function(e){if(Array.isArray(e))return t.array;if(e instanceof Date)return t.date;if(e instanceof RegExp)return t.regex;if(e instanceof Error)return t.error;const s=Object.prototype.toString.call(e);return r.typeMap.get(s)||t.generic}},7043:(e,t)=>{"use strict";t.keys=function(e,t={}){return!1!==t.symbols?Reflect.ownKeys(e):Object.getOwnPropertyNames(e)}},3652:(e,t,r)=>{"use strict";const s=r(375),n={};t.Sorter=class{constructor(){this._items=[],this.nodes=[]}add(e,t){const r=[].concat((t=t||{}).before||[]),n=[].concat(t.after||[]),a=t.group||"?",o=t.sort||0;s(!r.includes(a),"Item cannot come before itself: ".concat(a)),s(!r.includes("?"),"Item cannot come before unassociated items"),s(!n.includes(a),"Item cannot come after itself: ".concat(a)),s(!n.includes("?"),"Item cannot come after unassociated items"),Array.isArray(e)||(e=[e]);for(const t of e){const e={seq:this._items.length,sort:o,before:r,after:n,group:a,node:t};this._items.push(e)}if(!t.manual){const e=this._sort();s(e,"item","?"!==a?"added into group ".concat(a):"","created a dependencies error")}return this.nodes}merge(e){Array.isArray(e)||(e=[e]);for(const t of e)if(t)for(const e of t._items)this._items.push(Object.assign({},e));this._items.sort(n.mergeSort);for(let e=0;ee.sort===t.sort?0:e.sort{"use strict";const s=r(443),n=r(2178),a={minDomainSegments:2,nonAsciiRx:/[^\x00-\x7f]/,domainControlRx:/[\x00-\x20@\:\/\\#!\$&\'\(\)\*\+,;=\?]/,tldSegmentRx:/^[a-zA-Z](?:[a-zA-Z0-9\-]*[a-zA-Z0-9])?$/,domainSegmentRx:/^[a-zA-Z0-9](?:[a-zA-Z0-9\-]*[a-zA-Z0-9])?$/,URL:s.URL||URL};t.analyze=function(e,t={}){if("string"!=typeof e)throw new Error("Invalid input: domain must be a string");if(!e)return n.code("DOMAIN_NON_EMPTY_STRING");if(e.length>256)return n.code("DOMAIN_TOO_LONG");if(a.nonAsciiRx.test(e)){if(!1===t.allowUnicode)return n.code("DOMAIN_INVALID_UNICODE_CHARS");e=e.normalize("NFC")}if(a.domainControlRx.test(e))return n.code("DOMAIN_INVALID_CHARS");e=a.punycode(e);const r=t.minDomainSegments||a.minDomainSegments,s=e.split(".");if(s.lengtht.maxDomainSegments)return 
n.code("DOMAIN_SEGMENTS_COUNT_MAX");const o=t.tlds;if(o){const e=s[s.length-1].toLowerCase();if(o.deny&&o.deny.has(e)||o.allow&&!o.allow.has(e))return n.code("DOMAIN_FORBIDDEN_TLDS")}for(let e=0;e63)return n.code("DOMAIN_LONG_SEGMENT");if(e{"use strict";const s=r(9848),n=r(5380),a=r(2178),o={nonAsciiRx:/[^\x00-\x7f]/,encoder:new(s.TextEncoder||TextEncoder)};t.analyze=function(e,t){return o.email(e,t)},t.isValid=function(e,t){return!o.email(e,t)},o.email=function(e,t={}){if("string"!=typeof e)throw new Error("Invalid input: email must be a string");if(!e)return a.code("EMPTY_STRING");const r=!o.nonAsciiRx.test(e);if(!r){if(!1===t.allowUnicode)return a.code("FORBIDDEN_UNICODE");e=e.normalize("NFC")}const s=e.split("@");if(2!==s.length)return s.length>2?a.code("MULTIPLE_AT_CHAR"):a.code("MISSING_AT_CHAR");const[i,l]=s;if(!i)return a.code("EMPTY_LOCAL");if(!t.ignoreLength){if(e.length>254)return a.code("ADDRESS_TOO_LONG");if(o.encoder.encode(i).length>64)return a.code("LOCAL_TOO_LONG")}return o.local(i,r)||n.analyze(l,t)},o.local=function(e,t){const r=e.split(".");for(const e of r){if(!e.length)return a.code("EMPTY_LOCAL_SEGMENT");if(t){if(!o.atextRx.test(e))return a.code("INVALID_LOCAL_CHARS")}else for(const t of e){if(o.atextRx.test(t))continue;const e=o.binary(t);if(!o.atomRx.test(e))return a.code("INVALID_LOCAL_CHARS")}}},o.binary=function(e){return Array.from(o.encoder.encode(e)).map((e=>String.fromCharCode(e))).join("")},o.atextRx=/^[\w!#\$%&'\*\+\-/=\?\^`\{\|\}~]+$/,o.atomRx=new RegExp(["(?:[\\xc2-\\xdf][\\x80-\\xbf])","(?:\\xe0[\\xa0-\\xbf][\\x80-\\xbf])|(?:[\\xe1-\\xec][\\x80-\\xbf]{2})|(?:\\xed[\\x80-\\x9f][\\x80-\\xbf])|(?:[\\xee-\\xef][\\x80-\\xbf]{2})","(?:\\xf0[\\x90-\\xbf][\\x80-\\xbf]{2})|(?:[\\xf1-\\xf3][\\x80-\\xbf]{3})|(?:\\xf4[\\x80-\\x8f][\\x80-\\xbf]{2})"].join("|"))},2178:(e,t)=>{"use strict";t.codes={EMPTY_STRING:"Address must be a non-empty string",FORBIDDEN_UNICODE:"Address contains forbidden Unicode characters",MULTIPLE_AT_CHAR:"Address cannot contain more than one @ character",MISSING_AT_CHAR:"Address must contain one @ character",EMPTY_LOCAL:"Address local part cannot be empty",ADDRESS_TOO_LONG:"Address too long",LOCAL_TOO_LONG:"Address local part too long",EMPTY_LOCAL_SEGMENT:"Address local part contains empty dot-separated segment",INVALID_LOCAL_CHARS:"Address local part contains invalid character",DOMAIN_NON_EMPTY_STRING:"Domain must be a non-empty string",DOMAIN_TOO_LONG:"Domain too long",DOMAIN_INVALID_UNICODE_CHARS:"Domain contains forbidden Unicode characters",DOMAIN_INVALID_CHARS:"Domain contains invalid character",DOMAIN_INVALID_TLDS_CHARS:"Domain contains invalid tld character",DOMAIN_SEGMENTS_COUNT:"Domain lacks the minimum required number of segments",DOMAIN_SEGMENTS_COUNT_MAX:"Domain contains too many segments",DOMAIN_FORBIDDEN_TLDS:"Domain uses forbidden TLD",DOMAIN_EMPTY_SEGMENT:"Domain contains empty dot-separated segment",DOMAIN_LONG_SEGMENT:"Domain contains dot-separated segment that is too long"},t.code=function(e){return{code:e,error:t.codes[e]}}},9959:(e,t,r)=>{"use strict";const s=r(375),n=r(5752);t.regex=function(e={}){s(void 0===e.cidr||"string"==typeof e.cidr,"options.cidr must be a string");const t=e.cidr?e.cidr.toLowerCase():"optional";s(["required","optional","forbidden"].includes(t),"options.cidr must be one of required, optional, forbidden"),s(void 0===e.version||"string"==typeof e.version||Array.isArray(e.version),"options.version must be a string or an array of string");let 
r=e.version||["ipv4","ipv6","ipvfuture"];Array.isArray(r)||(r=[r]),s(r.length>=1,"options.version must have at least 1 version specified");for(let e=0;e{if("forbidden"===t)return n.ip[e];const r="\\/".concat("ipv4"===e?n.ip.v4Cidr:n.ip.v6Cidr);return"required"===t?"".concat(n.ip[e]).concat(r):"".concat(n.ip[e],"(?:").concat(r,")?")})),o="(?:".concat(a.join("|"),")"),i=new RegExp("^".concat(o,"$"));return{cidr:t,versions:r,regex:i,raw:o}}},5752:(e,t,r)=>{"use strict";const s=r(375),n=r(6064),a={generate:function(){const e={},t="!\\$&'\\(\\)\\*\\+,;=",r="\\w-\\.~%\\dA-Fa-f"+t+":@",s="["+r+"]",n="(?:0{0,2}\\d|0?[1-9]\\d|1\\d\\d|2[0-4]\\d|25[0-5])";e.ipv4address="(?:"+n+"\\.){3}"+n;const a="[\\dA-Fa-f]{1,4}",o="(?:"+a+":"+a+"|"+e.ipv4address+")",i="(?:"+a+":){6}"+o,l="::(?:"+a+":){5}"+o,c="(?:"+a+")?::(?:"+a+":){4}"+o,u="(?:(?:"+a+":){0,1}"+a+")?::(?:"+a+":){3}"+o,f="(?:(?:"+a+":){0,2}"+a+")?::(?:"+a+":){2}"+o,m="(?:(?:"+a+":){0,3}"+a+")?::"+a+":"+o,h="(?:(?:"+a+":){0,4}"+a+")?::"+o;e.ipv4Cidr="(?:\\d|[1-2]\\d|3[0-2])",e.ipv6Cidr="(?:0{0,2}\\d|0?[1-9]\\d|1[01]\\d|12[0-8])",e.ipv6address="(?:"+i+"|"+l+"|"+c+"|"+u+"|"+f+"|"+m+"|"+h+"|(?:(?:[\\dA-Fa-f]{1,4}:){0,5}[\\dA-Fa-f]{1,4})?::[\\dA-Fa-f]{1,4}|(?:(?:[\\dA-Fa-f]{1,4}:){0,6}[\\dA-Fa-f]{1,4})?::)",e.ipvFuture="v[\\dA-Fa-f]+\\.[\\w-\\.~"+t+":]+",e.scheme="[a-zA-Z][a-zA-Z\\d+-\\.]*",e.schemeRegex=new RegExp(e.scheme);const d="[\\w-\\.~%\\dA-Fa-f"+t+":]*",p="(?:\\[(?:"+e.ipv6address+"|"+e.ipvFuture+")\\]|"+e.ipv4address+"|[\\w-\\.~%\\dA-Fa-f!\\$&'\\(\\)\\*\\+,;=]{1,255})",g="(?:"+d+"@)?"+p+"(?::\\d*)?",y="(?:"+d+"@)?("+p+")(?::\\d*)?",b=s+"+",v="(?:\\/[\\w-\\.~%\\dA-Fa-f!\\$&'\\(\\)\\*\\+,;=:@]*)*",_="\\/(?:"+b+v+")?",w=b+v,$="[\\w-\\.~%\\dA-Fa-f!\\$&'\\(\\)\\*\\+,;=@]+"+v;return e.hierPart="(?:(?:\\/\\/"+g+v+")|"+_+"|"+w+"|(?:\\/\\/\\/[\\w-\\.~%\\dA-Fa-f!\\$&'\\(\\)\\*\\+,;=:@]*(?:\\/[\\w-\\.~%\\dA-Fa-f!\\$&'\\(\\)\\*\\+,;=:@]*)*))",e.hierPartCapture="(?:(?:\\/\\/"+y+v+")|"+_+"|"+w+")",e.relativeRef="(?:(?:\\/\\/"+g+v+")|"+_+"|"+$+"|)",e.relativeRefCapture="(?:(?:\\/\\/"+y+v+")|"+_+"|"+$+"|)",e.query="["+r+"\\/\\?]*(?=#|$)",e.queryWithSquareBrackets="["+r+"\\[\\]\\/\\?]*(?=#|$)",e.fragment="["+r+"\\/\\?]*",e}};a.rfc3986=a.generate(),t.ip={v4Cidr:a.rfc3986.ipv4Cidr,v6Cidr:a.rfc3986.ipv6Cidr,ipv4:a.rfc3986.ipv4address,ipv6:a.rfc3986.ipv6address,ipvfuture:a.rfc3986.ipvFuture},a.createRegex=function(e){const t=a.rfc3986,r="(?:\\?"+(e.allowQuerySquareBrackets?t.queryWithSquareBrackets:t.query)+")?(?:#"+t.fragment+")?",o=e.domain?t.relativeRefCapture:t.relativeRef;if(e.relativeOnly)return a.wrap(o+r);let i="";if(e.scheme){s(e.scheme instanceof RegExp||"string"==typeof e.scheme||Array.isArray(e.scheme),"scheme must be a RegExp, String, or Array");const r=[].concat(e.scheme);s(r.length>=1,"scheme must have at least 1 scheme specified");const a=[];for(let e=0;e{"use strict";const r={operators:["!","^","*","/","%","+","-","<","<=",">",">=","==","!=","&&","||","??"],operatorCharacters:["!","^","*","/","%","+","-","<","=",">","&","|","?"],operatorsOrder:[["^"],["*","/","%"],["+","-"],["<","<=",">",">="],["==","!="],["&&"],["||","??"]],operatorsPrefix:["!","n"],literals:{'"':'"',"`":"`","'":"'","[":"]"},numberRx:/^(?:[0-9]*\.?[0-9]*){1}$/,tokenRx:/^[\w\$\#\.\@\:\{\}]+$/,symbol:Symbol("formula"),settings:Symbol("settings")};t.Parser=class{constructor(e,t={}){if(!t[r.settings]&&t.constants)for(const e in t.constants){const r=t.constants[e];if(null!==r&&!["boolean","number","string"].includes(typeof r))throw new Error("Formula constant ".concat(e," contains 
invalid ").concat(typeof r," value type"))}this.settings=t[r.settings]?t:Object.assign({[r.settings]:!0,constants:{},functions:{}},t),this.single=null,this._parts=null,this._parse(e)}_parse(e){let s=[],n="",a=0,o=!1;const i=e=>{if(a)throw new Error("Formula missing closing parenthesis");const i=s.length?s[s.length-1]:null;if(o||n||e){if(i&&"reference"===i.type&&")"===e)return i.type="function",i.value=this._subFormula(n,i.value),void(n="");if(")"===e){const e=new t.Parser(n,this.settings);s.push({type:"segment",value:e})}else if(o){if("]"===o)return s.push({type:"reference",value:n}),void(n="");s.push({type:"literal",value:n})}else if(r.operatorCharacters.includes(n))i&&"operator"===i.type&&r.operators.includes(i.value+n)?i.value+=n:s.push({type:"operator",value:n});else if(n.match(r.numberRx))s.push({type:"constant",value:parseFloat(n)});else if(void 0!==this.settings.constants[n])s.push({type:"constant",value:this.settings.constants[n]});else{if(!n.match(r.tokenRx))throw new Error("Formula contains invalid token: ".concat(n));s.push({type:"reference",value:n})}n=""}};for(const t of e)o?t===o?(i(),o=!1):n+=t:a?"("===t?(n+=t,++a):")"===t?(--a,a?n+=t:i(t)):n+=t:t in r.literals?o=r.literals[t]:"("===t?(i(),++a):r.operatorCharacters.includes(t)?(i(),n=t,i()):" "!==t?n+=t:i();i(),s=s.map(((e,t)=>"operator"!==e.type||"-"!==e.value||t&&"operator"!==s[t-1].type?e:{type:"operator",value:"n"}));let l=!1;for(const e of s){if("operator"===e.type){if(r.operatorsPrefix.includes(e.value))continue;if(!l)throw new Error("Formula contains an operator in invalid position");if(!r.operators.includes(e.value))throw new Error("Formula contains an unknown operator ".concat(e.value))}else if(l)throw new Error("Formula missing expected operator");l=!l}if(!l)throw new Error("Formula contains invalid trailing operator");1===s.length&&["reference","literal","constant"].includes(s[0].type)&&(this.single={type:"reference"===s[0].type?"reference":"value",value:s[0].value}),this._parts=s.map((e=>{if("operator"===e.type)return r.operatorsPrefix.includes(e.value)?e:e.value;if("reference"!==e.type)return e.value;if(this.settings.tokenRx&&!this.settings.tokenRx.test(e.value))throw new Error("Formula contains invalid reference ".concat(e.value));return this.settings.reference?this.settings.reference(e.value):r.reference(e.value)}))}_subFormula(e,s){const n=this.settings.functions[s];if("function"!=typeof n)throw new Error("Formula contains unknown function ".concat(s));let a=[];if(e){let t="",n=0,o=!1;const i=()=>{if(!t)throw new Error("Formula contains function ".concat(s," with invalid arguments ").concat(e));a.push(t),t=""};for(let s=0;snew t.Parser(e,this.settings))),function(e){const t=[];for(const r of a)t.push(r.evaluate(e));return n.call(e,...t)}}evaluate(e){const t=this._parts.slice();for(let s=t.length-2;s>=0;--s){const n=t[s];if(n&&"operator"===n.type){const a=t[s+1];t.splice(s+1,1);const o=r.evaluate(a,e);t[s]=r.single(n.value,o)}}return r.operatorsOrder.forEach((s=>{for(let n=1;n":return t>s;case">=":return t>=s;case"==":return t===s;case"!=":return t!==s;case"&&":return t&&s;case"||":return t||s}return null},r.exists=function(e){return null!=e}},9926:()=>{},5688:()=>{},9708:()=>{},1152:()=>{},443:()=>{},9848:()=>{}},t={},function r(s){var n=t[s];if(void 0!==n)return n.exports;var a=t[s]={exports:{}};return e[s](a,a.exports,r),a.exports}(5107);var e,t})); \ No newline at end of file diff --git a/node_modules/joi/lib/annotate.js b/node_modules/joi/lib/annotate.js new file mode 100644 index 000000000..42798fc3e --- 
/dev/null +++ b/node_modules/joi/lib/annotate.js @@ -0,0 +1,175 @@ +'use strict'; + +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); + + +const internals = { + annotations: Symbol('annotations') +}; + + +exports.error = function (stripColorCodes) { + + if (!this._original || + typeof this._original !== 'object') { + + return this.details[0].message; + } + + const redFgEscape = stripColorCodes ? '' : '\u001b[31m'; + const redBgEscape = stripColorCodes ? '' : '\u001b[41m'; + const endColor = stripColorCodes ? '' : '\u001b[0m'; + + const obj = Clone(this._original); + + for (let i = this.details.length - 1; i >= 0; --i) { // Reverse order to process deepest child first + const pos = i + 1; + const error = this.details[i]; + const path = error.path; + let node = obj; + for (let j = 0; ; ++j) { + const seg = path[j]; + + if (Common.isSchema(node)) { + node = node.clone(); // joi schemas are not cloned by hoek, we have to take this extra step + } + + if (j + 1 < path.length && + typeof node[seg] !== 'string') { + + node = node[seg]; + } + else { + const refAnnotations = node[internals.annotations] || { errors: {}, missing: {} }; + node[internals.annotations] = refAnnotations; + + const cacheKey = seg || error.context.key; + + if (node[seg] !== undefined) { + refAnnotations.errors[cacheKey] = refAnnotations.errors[cacheKey] || []; + refAnnotations.errors[cacheKey].push(pos); + } + else { + refAnnotations.missing[cacheKey] = pos; + } + + break; + } + } + } + + const replacers = { + key: /_\$key\$_([, \d]+)_\$end\$_"/g, + missing: /"_\$miss\$_([^|]+)\|(\d+)_\$end\$_": "__missing__"/g, + arrayIndex: /\s*"_\$idx\$_([, \d]+)_\$end\$_",?\n(.*)/g, + specials: /"\[(NaN|Symbol.*|-?Infinity|function.*|\(.*)]"/g + }; + + let message = internals.safeStringify(obj, 2) + .replace(replacers.key, ($0, $1) => `" ${redFgEscape}[${$1}]${endColor}`) + .replace(replacers.missing, ($0, $1, $2) => `${redBgEscape}"${$1}"${endColor}${redFgEscape} [${$2}]: -- missing --${endColor}`) + .replace(replacers.arrayIndex, ($0, $1, $2) => `\n${$2} ${redFgEscape}[${$1}]${endColor}`) + .replace(replacers.specials, ($0, $1) => $1); + + message = `${message}\n${redFgEscape}`; + + for (let i = 0; i < this.details.length; ++i) { + const pos = i + 1; + message = `${message}\n[${pos}] ${this.details[i].message}`; + } + + message = message + endColor; + + return message; +}; + + +// Inspired by json-stringify-safe + +internals.safeStringify = function (obj, spaces) { + + return JSON.stringify(obj, internals.serializer(), spaces); +}; + + +internals.serializer = function () { + + const keys = []; + const stack = []; + + const cycleReplacer = (key, value) => { + + if (stack[0] === value) { + return '[Circular ~]'; + } + + return '[Circular ~.' 
+ keys.slice(0, stack.indexOf(value)).join('.') + ']'; + }; + + return function (key, value) { + + if (stack.length > 0) { + const thisPos = stack.indexOf(this); + if (~thisPos) { + stack.length = thisPos + 1; + keys.length = thisPos + 1; + keys[thisPos] = key; + } + else { + stack.push(this); + keys.push(key); + } + + if (~stack.indexOf(value)) { + value = cycleReplacer.call(this, key, value); + } + } + else { + stack.push(value); + } + + if (value) { + const annotations = value[internals.annotations]; + if (annotations) { + if (Array.isArray(value)) { + const annotated = []; + + for (let i = 0; i < value.length; ++i) { + if (annotations.errors[i]) { + annotated.push(`_$idx$_${annotations.errors[i].sort().join(', ')}_$end$_`); + } + + annotated.push(value[i]); + } + + value = annotated; + } + else { + for (const errorKey in annotations.errors) { + value[`${errorKey}_$key$_${annotations.errors[errorKey].sort().join(', ')}_$end$_`] = value[errorKey]; + value[errorKey] = undefined; + } + + for (const missingKey in annotations.missing) { + value[`_$miss$_${missingKey}|${annotations.missing[missingKey]}_$end$_`] = '__missing__'; + } + } + + return value; + } + } + + if (value === Infinity || + value === -Infinity || + Number.isNaN(value) || + typeof value === 'function' || + typeof value === 'symbol') { + + return '[' + value.toString() + ']'; + } + + return value; + }; +}; diff --git a/node_modules/joi/lib/base.js b/node_modules/joi/lib/base.js new file mode 100644 index 000000000..309ae41e8 --- /dev/null +++ b/node_modules/joi/lib/base.js @@ -0,0 +1,1068 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); +const Merge = require('@hapi/hoek/lib/merge'); + +const Cache = require('./cache'); +const Common = require('./common'); +const Compile = require('./compile'); +const Errors = require('./errors'); +const Extend = require('./extend'); +const Manifest = require('./manifest'); +const Messages = require('./messages'); +const Modify = require('./modify'); +const Ref = require('./ref'); +const Trace = require('./trace'); +const Validator = require('./validator'); +const Values = require('./values'); + + +const internals = {}; + + +internals.Base = class { + + constructor(type) { + + // Naming: public, _private, $_extension, $_mutate{action} + + this.type = type; + + this.$_root = null; + this._definition = {}; + this._reset(); + } + + _reset() { + + this._ids = new Modify.Ids(); + this._preferences = null; + this._refs = new Ref.Manager(); + this._cache = null; + + this._valids = null; + this._invalids = null; + + this._flags = {}; + this._rules = []; + this._singleRules = new Map(); // The rule options passed for non-multi rules + + this.$_terms = {}; // Hash of arrays of immutable objects (extended by other types) + + this.$_temp = { // Runtime state (not cloned) + ruleset: null, // null: use last, false: error, number: start position + whens: {} // Runtime cache of generated whens + }; + } + + // Manifest + + describe() { + + Assert(typeof Manifest.describe === 'function', 'Manifest functionality disabled'); + return Manifest.describe(this); + } + + // Rules + + allow(...values) { + + Common.verifyFlat(values, 'allow'); + return this._values(values, '_valids'); + } + + alter(targets) { + + Assert(targets && typeof targets === 'object' && !Array.isArray(targets), 'Invalid targets argument'); + Assert(!this._inRuleset(), 'Cannot set alterations inside a ruleset'); + + const 
obj = this.clone(); + obj.$_terms.alterations = obj.$_terms.alterations || []; + for (const target in targets) { + const adjuster = targets[target]; + Assert(typeof adjuster === 'function', 'Alteration adjuster for', target, 'must be a function'); + obj.$_terms.alterations.push({ target, adjuster }); + } + + obj.$_temp.ruleset = false; + return obj; + } + + artifact(id) { + + Assert(id !== undefined, 'Artifact cannot be undefined'); + Assert(!this._cache, 'Cannot set an artifact with a rule cache'); + + return this.$_setFlag('artifact', id); + } + + cast(to) { + + Assert(to === false || typeof to === 'string', 'Invalid to value'); + Assert(to === false || this._definition.cast[to], 'Type', this.type, 'does not support casting to', to); + + return this.$_setFlag('cast', to === false ? undefined : to); + } + + default(value, options) { + + return this._default('default', value, options); + } + + description(desc) { + + Assert(desc && typeof desc === 'string', 'Description must be a non-empty string'); + + return this.$_setFlag('description', desc); + } + + empty(schema) { + + const obj = this.clone(); + + if (schema !== undefined) { + schema = obj.$_compile(schema, { override: false }); + } + + return obj.$_setFlag('empty', schema, { clone: false }); + } + + error(err) { + + Assert(err, 'Missing error'); + Assert(err instanceof Error || typeof err === 'function', 'Must provide a valid Error object or a function'); + + return this.$_setFlag('error', err); + } + + example(example, options = {}) { + + Assert(example !== undefined, 'Missing example'); + Common.assertOptions(options, ['override']); + + return this._inner('examples', example, { single: true, override: options.override }); + } + + external(method, description) { + + if (typeof method === 'object') { + Assert(!description, 'Cannot combine options with description'); + description = method.description; + method = method.method; + } + + Assert(typeof method === 'function', 'Method must be a function'); + Assert(description === undefined || description && typeof description === 'string', 'Description must be a non-empty string'); + + return this._inner('externals', { method, description }, { single: true }); + } + + failover(value, options) { + + return this._default('failover', value, options); + } + + forbidden() { + + return this.presence('forbidden'); + } + + id(id) { + + if (!id) { + return this.$_setFlag('id', undefined); + } + + Assert(typeof id === 'string', 'id must be a non-empty string'); + Assert(/^[^\.]+$/.test(id), 'id cannot contain period character'); + + return this.$_setFlag('id', id); + } + + invalid(...values) { + + return this._values(values, '_invalids'); + } + + label(name) { + + Assert(name && typeof name === 'string', 'Label name must be a non-empty string'); + + return this.$_setFlag('label', name); + } + + meta(meta) { + + Assert(meta !== undefined, 'Meta cannot be undefined'); + + return this._inner('metas', meta, { single: true }); + } + + note(...notes) { + + Assert(notes.length, 'Missing notes'); + for (const note of notes) { + Assert(note && typeof note === 'string', 'Notes must be non-empty strings'); + } + + return this._inner('notes', notes); + } + + only(mode = true) { + + Assert(typeof mode === 'boolean', 'Invalid mode:', mode); + + return this.$_setFlag('only', mode); + } + + optional() { + + return this.presence('optional'); + } + + prefs(prefs) { + + Assert(prefs, 'Missing preferences'); + Assert(prefs.context === undefined, 'Cannot override context'); + Assert(prefs.externals === undefined, 
'Cannot override externals'); + Assert(prefs.warnings === undefined, 'Cannot override warnings'); + Assert(prefs.debug === undefined, 'Cannot override debug'); + + Common.checkPreferences(prefs); + + const obj = this.clone(); + obj._preferences = Common.preferences(obj._preferences, prefs); + return obj; + } + + presence(mode) { + + Assert(['optional', 'required', 'forbidden'].includes(mode), 'Unknown presence mode', mode); + + return this.$_setFlag('presence', mode); + } + + raw(enabled = true) { + + return this.$_setFlag('result', enabled ? 'raw' : undefined); + } + + result(mode) { + + Assert(['raw', 'strip'].includes(mode), 'Unknown result mode', mode); + + return this.$_setFlag('result', mode); + } + + required() { + + return this.presence('required'); + } + + strict(enabled) { + + const obj = this.clone(); + + const convert = enabled === undefined ? false : !enabled; + obj._preferences = Common.preferences(obj._preferences, { convert }); + return obj; + } + + strip(enabled = true) { + + return this.$_setFlag('result', enabled ? 'strip' : undefined); + } + + tag(...tags) { + + Assert(tags.length, 'Missing tags'); + for (const tag of tags) { + Assert(tag && typeof tag === 'string', 'Tags must be non-empty strings'); + } + + return this._inner('tags', tags); + } + + unit(name) { + + Assert(name && typeof name === 'string', 'Unit name must be a non-empty string'); + + return this.$_setFlag('unit', name); + } + + valid(...values) { + + Common.verifyFlat(values, 'valid'); + + const obj = this.allow(...values); + obj.$_setFlag('only', !!obj._valids, { clone: false }); + return obj; + } + + when(condition, options) { + + const obj = this.clone(); + + if (!obj.$_terms.whens) { + obj.$_terms.whens = []; + } + + const when = Compile.when(obj, condition, options); + if (!['any', 'link'].includes(obj.type)) { + const conditions = when.is ? 
[when] : when.switch; + for (const item of conditions) { + Assert(!item.then || item.then.type === 'any' || item.then.type === obj.type, 'Cannot combine', obj.type, 'with', item.then && item.then.type); + Assert(!item.otherwise || item.otherwise.type === 'any' || item.otherwise.type === obj.type, 'Cannot combine', obj.type, 'with', item.otherwise && item.otherwise.type); + + } + } + + obj.$_terms.whens.push(when); + return obj.$_mutateRebuild(); + } + + // Helpers + + cache(cache) { + + Assert(!this._inRuleset(), 'Cannot set caching inside a ruleset'); + Assert(!this._cache, 'Cannot override schema cache'); + Assert(this._flags.artifact === undefined, 'Cannot cache a rule with an artifact'); + + const obj = this.clone(); + obj._cache = cache || Cache.provider.provision(); + obj.$_temp.ruleset = false; + return obj; + } + + clone() { + + const obj = Object.create(Object.getPrototypeOf(this)); + return this._assign(obj); + } + + concat(source) { + + Assert(Common.isSchema(source), 'Invalid schema object'); + Assert(this.type === 'any' || source.type === 'any' || source.type === this.type, 'Cannot merge type', this.type, 'with another type:', source.type); + Assert(!this._inRuleset(), 'Cannot concatenate onto a schema with open ruleset'); + Assert(!source._inRuleset(), 'Cannot concatenate a schema with open ruleset'); + + let obj = this.clone(); + + if (this.type === 'any' && + source.type !== 'any') { + + // Change obj to match source type + + const tmpObj = source.clone(); + for (const key of Object.keys(obj)) { + if (key !== 'type') { + tmpObj[key] = obj[key]; + } + } + + obj = tmpObj; + } + + obj._ids.concat(source._ids); + obj._refs.register(source, Ref.toSibling); + + obj._preferences = obj._preferences ? Common.preferences(obj._preferences, source._preferences) : source._preferences; + obj._valids = Values.merge(obj._valids, source._valids, source._invalids); + obj._invalids = Values.merge(obj._invalids, source._invalids, source._valids); + + // Remove unique rules present in source + + for (const name of source._singleRules.keys()) { + if (obj._singleRules.has(name)) { + obj._rules = obj._rules.filter((target) => target.keep || target.name !== name); + obj._singleRules.delete(name); + } + } + + // Rules + + for (const test of source._rules) { + if (!source._definition.rules[test.method].multi) { + obj._singleRules.set(test.name, test); + } + + obj._rules.push(test); + } + + // Flags + + if (obj._flags.empty && + source._flags.empty) { + + obj._flags.empty = obj._flags.empty.concat(source._flags.empty); + const flags = Object.assign({}, source._flags); + delete flags.empty; + Merge(obj._flags, flags); + } + else if (source._flags.empty) { + obj._flags.empty = source._flags.empty; + const flags = Object.assign({}, source._flags); + delete flags.empty; + Merge(obj._flags, flags); + } + else { + Merge(obj._flags, source._flags); + } + + // Terms + + for (const key in source.$_terms) { + const terms = source.$_terms[key]; + if (!terms) { + if (!obj.$_terms[key]) { + obj.$_terms[key] = terms; + } + + continue; + } + + if (!obj.$_terms[key]) { + obj.$_terms[key] = terms.slice(); + continue; + } + + obj.$_terms[key] = obj.$_terms[key].concat(terms); + } + + // Tracing + + if (this.$_root._tracer) { + this.$_root._tracer._combine(obj, [this, source]); + } + + // Rebuild + + return obj.$_mutateRebuild(); + } + + extend(options) { + + Assert(!options.base, 'Cannot extend type with another base'); + + return Extend.type(this, options); + } + + extract(path) { + + path = Array.isArray(path) ? 
path : path.split('.'); + return this._ids.reach(path); + } + + fork(paths, adjuster) { + + Assert(!this._inRuleset(), 'Cannot fork inside a ruleset'); + + let obj = this; // eslint-disable-line consistent-this + for (let path of [].concat(paths)) { + path = Array.isArray(path) ? path : path.split('.'); + obj = obj._ids.fork(path, adjuster, obj); + } + + obj.$_temp.ruleset = false; + return obj; + } + + rule(options) { + + const def = this._definition; + Common.assertOptions(options, Object.keys(def.modifiers)); + + Assert(this.$_temp.ruleset !== false, 'Cannot apply rules to empty ruleset or the last rule added does not support rule properties'); + const start = this.$_temp.ruleset === null ? this._rules.length - 1 : this.$_temp.ruleset; + Assert(start >= 0 && start < this._rules.length, 'Cannot apply rules to empty ruleset'); + + const obj = this.clone(); + + for (let i = start; i < obj._rules.length; ++i) { + const original = obj._rules[i]; + const rule = Clone(original); + + for (const name in options) { + def.modifiers[name](rule, options[name]); + Assert(rule.name === original.name, 'Cannot change rule name'); + } + + obj._rules[i] = rule; + + if (obj._singleRules.get(rule.name) === original) { + obj._singleRules.set(rule.name, rule); + } + } + + obj.$_temp.ruleset = false; + return obj.$_mutateRebuild(); + } + + get ruleset() { + + Assert(!this._inRuleset(), 'Cannot start a new ruleset without closing the previous one'); + + const obj = this.clone(); + obj.$_temp.ruleset = obj._rules.length; + return obj; + } + + get $() { + + return this.ruleset; + } + + tailor(targets) { + + targets = [].concat(targets); + + Assert(!this._inRuleset(), 'Cannot tailor inside a ruleset'); + + let obj = this; // eslint-disable-line consistent-this + + if (this.$_terms.alterations) { + for (const { target, adjuster } of this.$_terms.alterations) { + if (targets.includes(target)) { + obj = adjuster(obj); + Assert(Common.isSchema(obj), 'Alteration adjuster for', target, 'failed to return a schema object'); + } + } + } + + obj = obj.$_modify({ each: (item) => item.tailor(targets), ref: false }); + obj.$_temp.ruleset = false; + return obj.$_mutateRebuild(); + } + + tracer() { + + return Trace.location ? 
Trace.location(this) : this; // $lab:coverage:ignore$ + } + + validate(value, options) { + + return Validator.entry(value, this, options); + } + + validateAsync(value, options) { + + return Validator.entryAsync(value, this, options); + } + + // Extensions + + $_addRule(options) { + + // Normalize rule + + if (typeof options === 'string') { + options = { name: options }; + } + + Assert(options && typeof options === 'object', 'Invalid options'); + Assert(options.name && typeof options.name === 'string', 'Invalid rule name'); + + for (const key in options) { + Assert(key[0] !== '_', 'Cannot set private rule properties'); + } + + const rule = Object.assign({}, options); // Shallow cloned + rule._resolve = []; + rule.method = rule.method || rule.name; + + const definition = this._definition.rules[rule.method]; + const args = rule.args; + + Assert(definition, 'Unknown rule', rule.method); + + // Args + + const obj = this.clone(); + + if (args) { + Assert(Object.keys(args).length === 1 || Object.keys(args).length === this._definition.rules[rule.name].args.length, 'Invalid rule definition for', this.type, rule.name); + + for (const key in args) { + let arg = args[key]; + if (arg === undefined) { + delete args[key]; + continue; + } + + if (definition.argsByName) { + const resolver = definition.argsByName.get(key); + + if (resolver.ref && + Common.isResolvable(arg)) { + + rule._resolve.push(key); + obj.$_mutateRegister(arg); + } + else { + if (resolver.normalize) { + arg = resolver.normalize(arg); + args[key] = arg; + } + + if (resolver.assert) { + const error = Common.validateArg(arg, key, resolver); + Assert(!error, error, 'or reference'); + } + } + } + + args[key] = arg; + } + } + + // Unique rules + + if (!definition.multi) { + obj._ruleRemove(rule.name, { clone: false }); + obj._singleRules.set(rule.name, rule); + } + + if (obj.$_temp.ruleset === false) { + obj.$_temp.ruleset = null; + } + + if (definition.priority) { + obj._rules.unshift(rule); + } + else { + obj._rules.push(rule); + } + + return obj; + } + + $_compile(schema, options) { + + return Compile.schema(this.$_root, schema, options); + } + + $_createError(code, value, local, state, prefs, options = {}) { + + const flags = options.flags !== false ? this._flags : {}; + const messages = options.messages ? Messages.merge(this._definition.messages, options.messages) : this._definition.messages; + return new Errors.Report(code, value, local, flags, messages, state, prefs); + } + + $_getFlag(name) { + + return this._flags[name]; + } + + $_getRule(name) { + + return this._singleRules.get(name); + } + + $_mapLabels(path) { + + path = Array.isArray(path) ? 
path : path.split('.'); + return this._ids.labels(path); + } + + $_match(value, state, prefs, overrides) { + + prefs = Object.assign({}, prefs); // Shallow cloned + prefs.abortEarly = true; + prefs._externals = false; + + state.snapshot(); + const result = !Validator.validate(value, this, state, prefs, overrides).errors; + state.restore(); + + return result; + } + + $_modify(options) { + + Common.assertOptions(options, ['each', 'once', 'ref', 'schema']); + return Modify.schema(this, options) || this; + } + + $_mutateRebuild() { + + Assert(!this._inRuleset(), 'Cannot add this rule inside a ruleset'); + + this._refs.reset(); + this._ids.reset(); + + const each = (item, { source, name, path, key }) => { + + const family = this._definition[source][name] && this._definition[source][name].register; + if (family !== false) { + this.$_mutateRegister(item, { family, key }); + } + }; + + this.$_modify({ each }); + + if (this._definition.rebuild) { + this._definition.rebuild(this); + } + + this.$_temp.ruleset = false; + return this; + } + + $_mutateRegister(schema, { family, key } = {}) { + + this._refs.register(schema, family); + this._ids.register(schema, { key }); + } + + $_property(name) { + + return this._definition.properties[name]; + } + + $_reach(path) { + + return this._ids.reach(path); + } + + $_rootReferences() { + + return this._refs.roots(); + } + + $_setFlag(name, value, options = {}) { + + Assert(name[0] === '_' || !this._inRuleset(), 'Cannot set flag inside a ruleset'); + + const flag = this._definition.flags[name] || {}; + if (DeepEqual(value, flag.default)) { + value = undefined; + } + + if (DeepEqual(value, this._flags[name])) { + return this; + } + + const obj = options.clone !== false ? this.clone() : this; + + if (value !== undefined) { + obj._flags[name] = value; + obj.$_mutateRegister(value); + } + else { + delete obj._flags[name]; + } + + if (name[0] !== '_') { + obj.$_temp.ruleset = false; + } + + return obj; + } + + $_parent(method, ...args) { + + return this[method][Common.symbols.parent].call(this, ...args); + } + + $_validate(value, state, prefs) { + + return Validator.validate(value, this, state, prefs); + } + + // Internals + + _assign(target) { + + target.type = this.type; + + target.$_root = this.$_root; + + target.$_temp = Object.assign({}, this.$_temp); + target.$_temp.whens = {}; + + target._ids = this._ids.clone(); + target._preferences = this._preferences; + target._valids = this._valids && this._valids.clone(); + target._invalids = this._invalids && this._invalids.clone(); + target._rules = this._rules.slice(); + target._singleRules = Clone(this._singleRules, { shallow: true }); + target._refs = this._refs.clone(); + target._flags = Object.assign({}, this._flags); + target._cache = null; + + target.$_terms = {}; + for (const key in this.$_terms) { + target.$_terms[key] = this.$_terms[key] ? 
this.$_terms[key].slice() : null; + } + + // Backwards compatibility + + target.$_super = {}; + for (const override in this.$_super) { + target.$_super[override] = this._super[override].bind(target); + } + + return target; + } + + _bare() { + + const obj = this.clone(); + obj._reset(); + + const terms = obj._definition.terms; + for (const name in terms) { + const term = terms[name]; + obj.$_terms[name] = term.init; + } + + return obj.$_mutateRebuild(); + } + + _default(flag, value, options = {}) { + + Common.assertOptions(options, 'literal'); + + Assert(value !== undefined, 'Missing', flag, 'value'); + Assert(typeof value === 'function' || !options.literal, 'Only function value supports literal option'); + + if (typeof value === 'function' && + options.literal) { + + value = { + [Common.symbols.literal]: true, + literal: value + }; + } + + const obj = this.$_setFlag(flag, value); + return obj; + } + + _generate(value, state, prefs) { + + if (!this.$_terms.whens) { + return { schema: this }; + } + + // Collect matching whens + + const whens = []; + const ids = []; + for (let i = 0; i < this.$_terms.whens.length; ++i) { + const when = this.$_terms.whens[i]; + + if (when.concat) { + whens.push(when.concat); + ids.push(`${i}.concat`); + continue; + } + + const input = when.ref ? when.ref.resolve(value, state, prefs) : value; + const tests = when.is ? [when] : when.switch; + const before = ids.length; + + for (let j = 0; j < tests.length; ++j) { + const { is, then, otherwise } = tests[j]; + + const baseId = `${i}${when.switch ? '.' + j : ''}`; + if (is.$_match(input, state.nest(is, `${baseId}.is`), prefs)) { + if (then) { + const localState = state.localize([...state.path, `${baseId}.then`], state.ancestors, state.schemas); + const { schema: generated, id } = then._generate(value, localState, prefs); + whens.push(generated); + ids.push(`${baseId}.then${id ? `(${id})` : ''}`); + break; + } + } + else if (otherwise) { + const localState = state.localize([...state.path, `${baseId}.otherwise`], state.ancestors, state.schemas); + const { schema: generated, id } = otherwise._generate(value, localState, prefs); + whens.push(generated); + ids.push(`${baseId}.otherwise${id ? 
`(${id})` : ''}`); + break; + } + } + + if (when.break && + ids.length > before) { // Something matched + + break; + } + } + + // Check cache + + const id = ids.join(', '); + state.mainstay.tracer.debug(state, 'rule', 'when', id); + + if (!id) { + return { schema: this }; + } + + if (!state.mainstay.tracer.active && + this.$_temp.whens[id]) { + + return { schema: this.$_temp.whens[id], id }; + } + + // Generate dynamic schema + + let obj = this; // eslint-disable-line consistent-this + if (this._definition.generate) { + obj = this._definition.generate(this, value, state, prefs); + } + + // Apply whens + + for (const when of whens) { + obj = obj.concat(when); + } + + // Tracing + + if (this.$_root._tracer) { + this.$_root._tracer._combine(obj, [this, ...whens]); + } + + // Cache result + + this.$_temp.whens[id] = obj; + return { schema: obj, id }; + } + + _inner(type, values, options = {}) { + + Assert(!this._inRuleset(), `Cannot set ${type} inside a ruleset`); + + const obj = this.clone(); + if (!obj.$_terms[type] || + options.override) { + + obj.$_terms[type] = []; + } + + if (options.single) { + obj.$_terms[type].push(values); + } + else { + obj.$_terms[type].push(...values); + } + + obj.$_temp.ruleset = false; + return obj; + } + + _inRuleset() { + + return this.$_temp.ruleset !== null && this.$_temp.ruleset !== false; + } + + _ruleRemove(name, options = {}) { + + if (!this._singleRules.has(name)) { + return this; + } + + const obj = options.clone !== false ? this.clone() : this; + + obj._singleRules.delete(name); + + const filtered = []; + for (let i = 0; i < obj._rules.length; ++i) { + const test = obj._rules[i]; + if (test.name === name && + !test.keep) { + + if (obj._inRuleset() && + i < obj.$_temp.ruleset) { + + --obj.$_temp.ruleset; + } + + continue; + } + + filtered.push(test); + } + + obj._rules = filtered; + return obj; + } + + _values(values, key) { + + Common.verifyFlat(values, key.slice(1, -1)); + + const obj = this.clone(); + + const override = values[0] === Common.symbols.override; + if (override) { + values = values.slice(1); + } + + if (!obj[key] && + values.length) { + + obj[key] = new Values(); + } + else if (override) { + obj[key] = values.length ? new Values() : null; + obj.$_mutateRebuild(); + } + + if (!obj[key]) { + return obj; + } + + if (override) { + obj[key].override(); + } + + for (const value of values) { + Assert(value !== undefined, 'Cannot call allow/valid/invalid with undefined'); + Assert(value !== Common.symbols.override, 'Override must be the first value'); + + const other = key === '_invalids' ? 
'_valids' : '_invalids'; + if (obj[other]) { + obj[other].remove(value); + if (!obj[other].length) { + Assert(key === '_valids' || !obj._flags.only, 'Setting invalid value', value, 'leaves schema rejecting all values due to previous valid rule'); + obj[other] = null; + } + } + + obj[key].add(value, obj._refs); + } + + return obj; + } +}; + + +internals.Base.prototype[Common.symbols.any] = { + version: Common.version, + compile: Compile.compile, + root: '$_root' +}; + + +internals.Base.prototype.isImmutable = true; // Prevents Hoek from deep cloning schema objects (must be on prototype) + + +// Aliases + +internals.Base.prototype.deny = internals.Base.prototype.invalid; +internals.Base.prototype.disallow = internals.Base.prototype.invalid; +internals.Base.prototype.equal = internals.Base.prototype.valid; +internals.Base.prototype.exist = internals.Base.prototype.required; +internals.Base.prototype.not = internals.Base.prototype.invalid; +internals.Base.prototype.options = internals.Base.prototype.prefs; +internals.Base.prototype.preferences = internals.Base.prototype.prefs; + + +module.exports = new internals.Base(); diff --git a/node_modules/joi/lib/cache.js b/node_modules/joi/lib/cache.js new file mode 100644 index 000000000..32c61e0e4 --- /dev/null +++ b/node_modules/joi/lib/cache.js @@ -0,0 +1,143 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); + + +const internals = { + max: 1000, + supported: new Set(['undefined', 'boolean', 'number', 'string']) +}; + + +exports.provider = { + + provision(options) { + + return new internals.Cache(options); + } +}; + + +// Least Recently Used (LRU) Cache + +internals.Cache = class { + + constructor(options = {}) { + + Common.assertOptions(options, ['max']); + Assert(options.max === undefined || options.max && options.max > 0 && isFinite(options.max), 'Invalid max cache size'); + + this._max = options.max || internals.max; + + this._map = new Map(); // Map of nodes by key + this._list = new internals.List(); // List of nodes (most recently used in head) + } + + get length() { + + return this._map.size; + } + + set(key, value) { + + if (key !== null && + !internals.supported.has(typeof key)) { + + return; + } + + let node = this._map.get(key); + if (node) { + node.value = value; + this._list.first(node); + return; + } + + node = this._list.unshift({ key, value }); + this._map.set(key, node); + this._compact(); + } + + get(key) { + + const node = this._map.get(key); + if (node) { + this._list.first(node); + return Clone(node.value); + } + } + + _compact() { + + if (this._map.size > this._max) { + const node = this._list.pop(); + this._map.delete(node.key); + } + } +}; + + +internals.List = class { + + constructor() { + + this.tail = null; + this.head = null; + } + + unshift(node) { + + node.next = null; + node.prev = this.head; + + if (this.head) { + this.head.next = node; + } + + this.head = node; + + if (!this.tail) { + this.tail = node; + } + + return node; + } + + first(node) { + + if (node === this.head) { + return; + } + + this._remove(node); + this.unshift(node); + } + + pop() { + + return this._remove(this.tail); + } + + _remove(node) { + + const { next, prev } = node; + + next.prev = prev; + + if (prev) { + prev.next = next; + } + + if (node === this.tail) { + this.tail = next; + } + + node.prev = null; + node.next = null; + + return node; + } +}; diff --git a/node_modules/joi/lib/common.js b/node_modules/joi/lib/common.js new file mode 
100644 index 000000000..7d572c139 --- /dev/null +++ b/node_modules/joi/lib/common.js @@ -0,0 +1,216 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const AssertError = require('@hapi/hoek/lib/error'); + +const Pkg = require('../package.json'); + +let Messages; +let Schemas; + + +const internals = { + isoDate: /^(?:[-+]\d{2})?(?:\d{4}(?!\d{2}\b))(?:(-?)(?:(?:0[1-9]|1[0-2])(?:\1(?:[12]\d|0[1-9]|3[01]))?|W(?:[0-4]\d|5[0-2])(?:-?[1-7])?|(?:00[1-9]|0[1-9]\d|[12]\d{2}|3(?:[0-5]\d|6[1-6])))(?![T]$|[T][\d]+Z$)(?:[T\s](?:(?:(?:[01]\d|2[0-3])(?:(:?)[0-5]\d)?|24\:?00)(?:[.,]\d+(?!:))?)(?:\2[0-5]\d(?:[.,]\d+)?)?(?:[Z]|(?:[+-])(?:[01]\d|2[0-3])(?::?[0-5]\d)?)?)?)?$/ +}; + + +exports.version = Pkg.version; + + +exports.defaults = { + abortEarly: true, + allowUnknown: false, + artifacts: false, + cache: true, + context: null, + convert: true, + dateFormat: 'iso', + errors: { + escapeHtml: false, + label: 'path', + language: null, + render: true, + stack: false, + wrap: { + label: '"', + array: '[]' + } + }, + externals: true, + messages: {}, + nonEnumerables: false, + noDefaults: false, + presence: 'optional', + skipFunctions: false, + stripUnknown: false, + warnings: false +}; + + +exports.symbols = { + any: Symbol.for('@hapi/joi/schema'), // Used to internally identify any-based types (shared with other joi versions) + arraySingle: Symbol('arraySingle'), + deepDefault: Symbol('deepDefault'), + errors: Symbol('errors'), + literal: Symbol('literal'), + override: Symbol('override'), + parent: Symbol('parent'), + prefs: Symbol('prefs'), + ref: Symbol('ref'), + template: Symbol('template'), + values: Symbol('values') +}; + + +exports.assertOptions = function (options, keys, name = 'Options') { + + Assert(options && typeof options === 'object' && !Array.isArray(options), 'Options must be of type object'); + const unknownKeys = Object.keys(options).filter((k) => !keys.includes(k)); + Assert(unknownKeys.length === 0, `${name} contain unknown keys: ${unknownKeys}`); +}; + + +exports.checkPreferences = function (prefs) { + + Schemas = Schemas || require('./schemas'); + + const result = Schemas.preferences.validate(prefs); + + if (result.error) { + throw new AssertError([result.error.details[0].message]); + } +}; + + +exports.compare = function (a, b, operator) { + + switch (operator) { + case '=': return a === b; + case '>': return a > b; + case '<': return a < b; + case '>=': return a >= b; + case '<=': return a <= b; + } +}; + + +exports.default = function (value, defaultValue) { + + return value === undefined ? 
defaultValue : value; +}; + + +exports.isIsoDate = function (date) { + + return internals.isoDate.test(date); +}; + + +exports.isNumber = function (value) { + + return typeof value === 'number' && !isNaN(value); +}; + + +exports.isResolvable = function (obj) { + + if (!obj) { + return false; + } + + return obj[exports.symbols.ref] || obj[exports.symbols.template]; +}; + + +exports.isSchema = function (schema, options = {}) { + + const any = schema && schema[exports.symbols.any]; + if (!any) { + return false; + } + + Assert(options.legacy || any.version === exports.version, 'Cannot mix different versions of joi schemas'); + return true; +}; + + +exports.isValues = function (obj) { + + return obj[exports.symbols.values]; +}; + + +exports.limit = function (value) { + + return Number.isSafeInteger(value) && value >= 0; +}; + + +exports.preferences = function (target, source) { + + Messages = Messages || require('./messages'); + + target = target || {}; + source = source || {}; + + const merged = Object.assign({}, target, source); + if (source.errors && + target.errors) { + + merged.errors = Object.assign({}, target.errors, source.errors); + merged.errors.wrap = Object.assign({}, target.errors.wrap, source.errors.wrap); + } + + if (source.messages) { + merged.messages = Messages.compile(source.messages, target.messages); + } + + delete merged[exports.symbols.prefs]; + return merged; +}; + + +exports.tryWithPath = function (fn, key, options = {}) { + + try { + return fn(); + } + catch (err) { + if (err.path !== undefined) { + err.path = key + '.' + err.path; + } + else { + err.path = key; + } + + if (options.append) { + err.message = `${err.message} (${err.path})`; + } + + throw err; + } +}; + + +exports.validateArg = function (value, label, { assert, message }) { + + if (exports.isSchema(assert)) { + const result = assert.validate(value); + if (!result.error) { + return; + } + + return result.error.message; + } + else if (!assert(value)) { + return label ? 
`${label} ${message}` : message; + } +}; + + +exports.verifyFlat = function (args, method) { + + for (const arg of args) { + Assert(!Array.isArray(arg), 'Method no longer accepts array arguments:', method); + } +}; diff --git a/node_modules/joi/lib/compile.js b/node_modules/joi/lib/compile.js new file mode 100644 index 000000000..5593b7c90 --- /dev/null +++ b/node_modules/joi/lib/compile.js @@ -0,0 +1,283 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Common = require('./common'); +const Ref = require('./ref'); + + +const internals = {}; + + +exports.schema = function (Joi, config, options = {}) { + + Common.assertOptions(options, ['appendPath', 'override']); + + try { + return internals.schema(Joi, config, options); + } + catch (err) { + if (options.appendPath && + err.path !== undefined) { + + err.message = `${err.message} (${err.path})`; + } + + throw err; + } +}; + + +internals.schema = function (Joi, config, options) { + + Assert(config !== undefined, 'Invalid undefined schema'); + + if (Array.isArray(config)) { + Assert(config.length, 'Invalid empty array schema'); + + if (config.length === 1) { + config = config[0]; + } + } + + const valid = (base, ...values) => { + + if (options.override !== false) { + return base.valid(Joi.override, ...values); + } + + return base.valid(...values); + }; + + if (internals.simple(config)) { + return valid(Joi, config); + } + + if (typeof config === 'function') { + return Joi.custom(config); + } + + Assert(typeof config === 'object', 'Invalid schema content:', typeof config); + + if (Common.isResolvable(config)) { + return valid(Joi, config); + } + + if (Common.isSchema(config)) { + return config; + } + + if (Array.isArray(config)) { + for (const item of config) { + if (!internals.simple(item)) { + return Joi.alternatives().try(...config); + } + } + + return valid(Joi, ...config); + } + + if (config instanceof RegExp) { + return Joi.string().regex(config); + } + + if (config instanceof Date) { + return valid(Joi.date(), config); + } + + Assert(Object.getPrototypeOf(config) === Object.getPrototypeOf({}), 'Schema can only contain plain objects'); + + return Joi.object().keys(config); +}; + + +exports.ref = function (id, options) { + + return Ref.isRef(id) ? 
id : Ref.create(id, options); +}; + + +exports.compile = function (root, schema, options = {}) { + + Common.assertOptions(options, ['legacy']); + + // Compiled by any supported version + + const any = schema && schema[Common.symbols.any]; + if (any) { + Assert(options.legacy || any.version === Common.version, 'Cannot mix different versions of joi schemas:', any.version, Common.version); + return schema; + } + + // Uncompiled root + + if (typeof schema !== 'object' || + !options.legacy) { + + return exports.schema(root, schema, { appendPath: true }); // Will error if schema contains other versions + } + + // Scan schema for compiled parts + + const compiler = internals.walk(schema); + if (!compiler) { + return exports.schema(root, schema, { appendPath: true }); + } + + return compiler.compile(compiler.root, schema); +}; + + +internals.walk = function (schema) { + + if (typeof schema !== 'object') { + return null; + } + + if (Array.isArray(schema)) { + for (const item of schema) { + const compiler = internals.walk(item); + if (compiler) { + return compiler; + } + } + + return null; + } + + const any = schema[Common.symbols.any]; + if (any) { + return { root: schema[any.root], compile: any.compile }; + } + + Assert(Object.getPrototypeOf(schema) === Object.getPrototypeOf({}), 'Schema can only contain plain objects'); + + for (const key in schema) { + const compiler = internals.walk(schema[key]); + if (compiler) { + return compiler; + } + } + + return null; +}; + + +internals.simple = function (value) { + + return value === null || ['boolean', 'string', 'number'].includes(typeof value); +}; + + +exports.when = function (schema, condition, options) { + + if (options === undefined) { + Assert(condition && typeof condition === 'object', 'Missing options'); + + options = condition; + condition = Ref.create('.'); + } + + if (Array.isArray(options)) { + options = { switch: options }; + } + + Common.assertOptions(options, ['is', 'not', 'then', 'otherwise', 'switch', 'break']); + + // Schema condition + + if (Common.isSchema(condition)) { + Assert(options.is === undefined, '"is" can not be used with a schema condition'); + Assert(options.not === undefined, '"not" can not be used with a schema condition'); + Assert(options.switch === undefined, '"switch" can not be used with a schema condition'); + + return internals.condition(schema, { is: condition, then: options.then, otherwise: options.otherwise, break: options.break }); + } + + // Single condition + + Assert(Ref.isRef(condition) || typeof condition === 'string', 'Invalid condition:', condition); + Assert(options.not === undefined || options.is === undefined, 'Cannot combine "is" with "not"'); + + if (options.switch === undefined) { + let rule = options; + if (options.not !== undefined) { + rule = { is: options.not, then: options.otherwise, otherwise: options.then, break: options.break }; + } + + let is = rule.is !== undefined ? 
schema.$_compile(rule.is) : schema.$_root.invalid(null, false, 0, '').required(); + Assert(rule.then !== undefined || rule.otherwise !== undefined, 'options must have at least one of "then", "otherwise", or "switch"'); + Assert(rule.break === undefined || rule.then === undefined || rule.otherwise === undefined, 'Cannot specify then, otherwise, and break all together'); + + if (options.is !== undefined && + !Ref.isRef(options.is) && + !Common.isSchema(options.is)) { + + is = is.required(); // Only apply required if this wasn't already a schema or a ref + } + + return internals.condition(schema, { ref: exports.ref(condition), is, then: rule.then, otherwise: rule.otherwise, break: rule.break }); + } + + // Switch statement + + Assert(Array.isArray(options.switch), '"switch" must be an array'); + Assert(options.is === undefined, 'Cannot combine "switch" with "is"'); + Assert(options.not === undefined, 'Cannot combine "switch" with "not"'); + Assert(options.then === undefined, 'Cannot combine "switch" with "then"'); + + const rule = { + ref: exports.ref(condition), + switch: [], + break: options.break + }; + + for (let i = 0; i < options.switch.length; ++i) { + const test = options.switch[i]; + const last = i === options.switch.length - 1; + + Common.assertOptions(test, last ? ['is', 'then', 'otherwise'] : ['is', 'then']); + + Assert(test.is !== undefined, 'Switch statement missing "is"'); + Assert(test.then !== undefined, 'Switch statement missing "then"'); + + const item = { + is: schema.$_compile(test.is), + then: schema.$_compile(test.then) + }; + + if (!Ref.isRef(test.is) && + !Common.isSchema(test.is)) { + + item.is = item.is.required(); // Only apply required if this wasn't already a schema or a ref + } + + if (last) { + Assert(options.otherwise === undefined || test.otherwise === undefined, 'Cannot specify "otherwise" inside and outside a "switch"'); + const otherwise = options.otherwise !== undefined ? 
options.otherwise : test.otherwise; + if (otherwise !== undefined) { + Assert(rule.break === undefined, 'Cannot specify both otherwise and break'); + item.otherwise = schema.$_compile(otherwise); + } + } + + rule.switch.push(item); + } + + return rule; +}; + + +internals.condition = function (schema, condition) { + + for (const key of ['then', 'otherwise']) { + if (condition[key] === undefined) { + delete condition[key]; + } + else { + condition[key] = schema.$_compile(condition[key]); + } + } + + return condition; +}; diff --git a/node_modules/joi/lib/errors.js b/node_modules/joi/lib/errors.js new file mode 100644 index 000000000..0823f924f --- /dev/null +++ b/node_modules/joi/lib/errors.js @@ -0,0 +1,262 @@ +'use strict'; + +const Annotate = require('./annotate'); +const Common = require('./common'); +const Template = require('./template'); + + +const internals = {}; + + +exports.Report = class { + + constructor(code, value, local, flags, messages, state, prefs) { + + this.code = code; + this.flags = flags; + this.messages = messages; + this.path = state.path; + this.prefs = prefs; + this.state = state; + this.value = value; + + this.message = null; + this.template = null; + + this.local = local || {}; + this.local.label = exports.label(this.flags, this.state, this.prefs, this.messages); + + if (this.value !== undefined && + !this.local.hasOwnProperty('value')) { + + this.local.value = this.value; + } + + if (this.path.length) { + const key = this.path[this.path.length - 1]; + if (typeof key !== 'object') { + this.local.key = key; + } + } + } + + _setTemplate(template) { + + this.template = template; + + if (!this.flags.label && + this.path.length === 0) { + + const localized = this._template(this.template, 'root'); + if (localized) { + this.local.label = localized; + } + } + } + + toString() { + + if (this.message) { + return this.message; + } + + const code = this.code; + + if (!this.prefs.errors.render) { + return this.code; + } + + const template = this._template(this.template) || + this._template(this.prefs.messages) || + this._template(this.messages); + + if (template === undefined) { + return `Error code "${code}" is not defined, your custom type is missing the correct messages definition`; + } + + // Render and cache result + + this.message = template.render(this.value, this.state, this.prefs, this.local, { errors: this.prefs.errors, messages: [this.prefs.messages, this.messages] }); + if (!this.prefs.errors.label) { + this.message = this.message.replace(/^"" /, '').trim(); + } + + return this.message; + } + + _template(messages, code) { + + return exports.template(this.value, messages, code || this.code, this.state, this.prefs); + } +}; + + +exports.path = function (path) { + + let label = ''; + for (const segment of path) { + if (typeof segment === 'object') { // Exclude array single path segment + continue; + } + + if (typeof segment === 'string') { + if (label) { + label += '.'; + } + + label += segment; + } + else { + label += `[${segment}]`; + } + } + + return label; +}; + + +exports.template = function (value, messages, code, state, prefs) { + + if (!messages) { + return; + } + + if (Template.isTemplate(messages)) { + return code !== 'root' ? 
messages : null; + } + + let lang = prefs.errors.language; + if (Common.isResolvable(lang)) { + lang = lang.resolve(value, state, prefs); + } + + if (lang && + messages[lang] && + messages[lang][code] !== undefined) { + + return messages[lang][code]; + } + + return messages[code]; +}; + + +exports.label = function (flags, state, prefs, messages) { + + if (flags.label) { + return flags.label; + } + + if (!prefs.errors.label) { + return ''; + } + + let path = state.path; + if (prefs.errors.label === 'key' && + state.path.length > 1) { + + path = state.path.slice(-1); + } + + const normalized = exports.path(path); + if (normalized) { + return normalized; + } + + return exports.template(null, prefs.messages, 'root', state, prefs) || + messages && exports.template(null, messages, 'root', state, prefs) || + 'value'; +}; + + +exports.process = function (errors, original, prefs) { + + if (!errors) { + return null; + } + + const { override, message, details } = exports.details(errors); + if (override) { + return override; + } + + if (prefs.errors.stack) { + return new exports.ValidationError(message, details, original); + } + + const limit = Error.stackTraceLimit; + Error.stackTraceLimit = 0; + const validationError = new exports.ValidationError(message, details, original); + Error.stackTraceLimit = limit; + return validationError; +}; + + +exports.details = function (errors, options = {}) { + + let messages = []; + const details = []; + + for (const item of errors) { + + // Override + + if (item instanceof Error) { + if (options.override !== false) { + return { override: item }; + } + + const message = item.toString(); + messages.push(message); + + details.push({ + message, + type: 'override', + context: { error: item } + }); + + continue; + } + + // Report + + const message = item.toString(); + messages.push(message); + + details.push({ + message, + path: item.path.filter((v) => typeof v !== 'object'), + type: item.code, + context: item.local + }); + } + + if (messages.length > 1) { + messages = [...new Set(messages)]; + } + + return { message: messages.join('. 
'), details }; +}; + + +exports.ValidationError = class extends Error { + + constructor(message, details, original) { + + super(message); + this._original = original; + this.details = details; + } + + static isError(err) { + + return err instanceof exports.ValidationError; + } +}; + + +exports.ValidationError.prototype.isJoi = true; + +exports.ValidationError.prototype.name = 'ValidationError'; + +exports.ValidationError.prototype.annotate = Annotate.error; diff --git a/node_modules/joi/lib/extend.js b/node_modules/joi/lib/extend.js new file mode 100644 index 000000000..2a5acf7b4 --- /dev/null +++ b/node_modules/joi/lib/extend.js @@ -0,0 +1,312 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); +const Messages = require('./messages'); + + +const internals = {}; + + +exports.type = function (from, options) { + + const base = Object.getPrototypeOf(from); + const prototype = Clone(base); + const schema = from._assign(Object.create(prototype)); + const def = Object.assign({}, options); // Shallow cloned + delete def.base; + + prototype._definition = def; + + const parent = base._definition || {}; + def.messages = Messages.merge(parent.messages, def.messages); + def.properties = Object.assign({}, parent.properties, def.properties); + + // Type + + schema.type = def.type; + + // Flags + + def.flags = Object.assign({}, parent.flags, def.flags); + + // Terms + + const terms = Object.assign({}, parent.terms); + if (def.terms) { + for (const name in def.terms) { // Only apply own terms + const term = def.terms[name]; + Assert(schema.$_terms[name] === undefined, 'Invalid term override for', def.type, name); + schema.$_terms[name] = term.init; + terms[name] = term; + } + } + + def.terms = terms; + + // Constructor arguments + + if (!def.args) { + def.args = parent.args; + } + + // Prepare + + def.prepare = internals.prepare(def.prepare, parent.prepare); + + // Coerce + + if (def.coerce) { + if (typeof def.coerce === 'function') { + def.coerce = { method: def.coerce }; + } + + if (def.coerce.from && + !Array.isArray(def.coerce.from)) { + + def.coerce = { method: def.coerce.method, from: [].concat(def.coerce.from) }; + } + } + + def.coerce = internals.coerce(def.coerce, parent.coerce); + + // Validate + + def.validate = internals.validate(def.validate, parent.validate); + + // Rules + + const rules = Object.assign({}, parent.rules); + if (def.rules) { + for (const name in def.rules) { + const rule = def.rules[name]; + Assert(typeof rule === 'object', 'Invalid rule definition for', def.type, name); + + let method = rule.method; + if (method === undefined) { + method = function () { + + return this.$_addRule(name); + }; + } + + if (method) { + Assert(!prototype[name], 'Rule conflict in', def.type, name); + prototype[name] = method; + } + + Assert(!rules[name], 'Rule conflict in', def.type, name); + rules[name] = rule; + + if (rule.alias) { + const aliases = [].concat(rule.alias); + for (const alias of aliases) { + prototype[alias] = rule.method; + } + } + + if (rule.args) { + rule.argsByName = new Map(); + rule.args = rule.args.map((arg) => { + + if (typeof arg === 'string') { + arg = { name: arg }; + } + + Assert(!rule.argsByName.has(arg.name), 'Duplicated argument name', arg.name); + + if (Common.isSchema(arg.assert)) { + arg.assert = arg.assert.strict().label(arg.name); + } + + rule.argsByName.set(arg.name, arg); + return arg; + }); + } + } + } + + def.rules = rules; + + // Modifiers + + const 
modifiers = Object.assign({}, parent.modifiers); + if (def.modifiers) { + for (const name in def.modifiers) { + Assert(!prototype[name], 'Rule conflict in', def.type, name); + + const modifier = def.modifiers[name]; + Assert(typeof modifier === 'function', 'Invalid modifier definition for', def.type, name); + + const method = function (arg) { + + return this.rule({ [name]: arg }); + }; + + prototype[name] = method; + modifiers[name] = modifier; + } + } + + def.modifiers = modifiers; + + // Overrides + + if (def.overrides) { + prototype._super = base; + schema.$_super = {}; // Backwards compatibility + for (const override in def.overrides) { + Assert(base[override], 'Cannot override missing', override); + def.overrides[override][Common.symbols.parent] = base[override]; + schema.$_super[override] = base[override].bind(schema); // Backwards compatibility + } + + Object.assign(prototype, def.overrides); + } + + // Casts + + def.cast = Object.assign({}, parent.cast, def.cast); + + // Manifest + + const manifest = Object.assign({}, parent.manifest, def.manifest); + manifest.build = internals.build(def.manifest && def.manifest.build, parent.manifest && parent.manifest.build); + def.manifest = manifest; + + // Rebuild + + def.rebuild = internals.rebuild(def.rebuild, parent.rebuild); + + return schema; +}; + + +// Helpers + +internals.build = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (obj, desc) { + + return parent(child(obj, desc), desc); + }; +}; + + +internals.coerce = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return { + from: child.from && parent.from ? [...new Set([...child.from, ...parent.from])] : null, + method(value, helpers) { + + let coerced; + if (!parent.from || + parent.from.includes(typeof value)) { + + coerced = parent.method(value, helpers); + if (coerced) { + if (coerced.errors || + coerced.value === undefined) { + + return coerced; + } + + value = coerced.value; + } + } + + if (!child.from || + child.from.includes(typeof value)) { + + const own = child.method(value, helpers); + if (own) { + return own; + } + } + + return coerced; + } + }; +}; + + +internals.prepare = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (value, helpers) { + + const prepared = child(value, helpers); + if (prepared) { + if (prepared.errors || + prepared.value === undefined) { + + return prepared; + } + + value = prepared.value; + } + + return parent(value, helpers) || prepared; + }; +}; + + +internals.rebuild = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (schema) { + + parent(schema); + child(schema); + }; +}; + + +internals.validate = function (child, parent) { + + if (!child || + !parent) { + + return child || parent; + } + + return function (value, helpers) { + + const result = parent(value, helpers); + if (result) { + if (result.errors && + (!Array.isArray(result.errors) || result.errors.length)) { + + return result; + } + + value = result.value; + } + + return child(value, helpers) || result; + }; +}; diff --git a/node_modules/joi/lib/index.d.ts b/node_modules/joi/lib/index.d.ts new file mode 100644 index 000000000..dae991eeb --- /dev/null +++ b/node_modules/joi/lib/index.d.ts @@ -0,0 +1,2230 @@ +// The following definitions have been copied (almost) as-is from: +// https://github.com/DefinitelyTyped/DefinitelyTyped/tree/master/types/hapi__joi +// +// 
Note: This file is expected to change dramatically in the next major release and have been +// imported here to make migrating back to the "joi" module name simpler. It include known bugs +// and other issues. It does not include some new features included in version 17.2.0 or newer. +// +// TypeScript Version: 2.8 + +// TODO express type of Schema in a type-parameter (.default, .valid, .example etc) + +declare namespace Joi { + type Types = + | 'any' + | 'alternatives' + | 'array' + | 'boolean' + | 'binary' + | 'date' + | 'function' + | 'link' + | 'number' + | 'object' + | 'string' + | 'symbol'; + + type BasicType = boolean | number | string | any[] | object | null; + + type LanguageMessages = Record; + + type PresenceMode = 'optional' | 'required' | 'forbidden'; + + interface ErrorFormattingOptions { + /** + * when true, error message templates will escape special characters to HTML entities, for security purposes. + * + * @default false + */ + escapeHtml?: boolean; + /** + * defines the value used to set the label context variable. + */ + label?: 'path' | 'key' | false; + /** + * The preferred language code for error messages. + * The value is matched against keys at the root of the messages object, and then the error code as a child key of that. + * Can be a reference to the value, global context, or local context which is the root value passed to the validation function. + * + * Note that references to the value are usually not what you want as they move around the value structure relative to where the error happens. + * Instead, either use the global context, or the absolute value (e.g. `Joi.ref('/variable')`) + */ + language?: keyof LanguageMessages; + /** + * when false, skips rendering error templates. Useful when error messages are generated elsewhere to save processing time. + * + * @default true + */ + render?: boolean; + /** + * when true, the main error will possess a stack trace, otherwise it will be disabled. + * Defaults to false for performances reasons. Has no effect on platforms other than V8/node.js as it uses the Stack trace API. + * + * @default false + */ + stack?: boolean; + /** + * overrides the way values are wrapped (e.g. `[]` arround arrays, `""` around labels). + * Each key can be set to a string with one (same character before and after the value) or two characters (first character + * before and second character after), or `false` to disable wrapping. + */ + wrap?: { + /** + * the characters used around `{#label}` references. Defaults to `'"'`. + * + * @default '"' + */ + label?: string | false, + + /** + * the characters used around array avlues. Defaults to `'[]'` + * + * @default '[]' + */ + array?: string | false + }; + } + + interface BaseValidationOptions { + /** + * when true, stops validation on the first error, otherwise returns all the errors found. + * + * @default true + */ + abortEarly?: boolean; + /** + * when true, allows object to contain unknown keys which are ignored. + * + * @default false + */ + allowUnknown?: boolean; + /** + * when true, schema caching is enabled (for schemas with explicit caching rules). + * + * @default false + */ + cache?: boolean; + /** + * provides an external data set to be used in references + */ + context?: Context; + /** + * when true, attempts to cast values to the required types (e.g. a string to a number). + * + * @default true + */ + convert?: boolean; + /** + * sets the string format used when converting dates to strings in error messages and casting. 
+ * + * @default 'iso' + */ + dateFormat?: 'date' | 'iso' | 'string' | 'time' | 'utc'; + /** + * when true, valid results and throw errors are decorated with a debug property which includes an array of the validation steps used to generate the returned result. + * + * @default false + */ + debug?: boolean; + /** + * error formatting settings. + */ + errors?: ErrorFormattingOptions; + /** + * if false, the external rules set with `any.external()` are ignored, which is required to ignore any external validations in synchronous mode (or an exception is thrown). + * + * @default true + */ + externals?: boolean; + /** + * when true, do not apply default values. + * + * @default false + */ + noDefaults?: boolean; + /** + * when true, inputs are shallow cloned to include non-enumerables properties. + * + * @default false + */ + nonEnumerables?: boolean; + /** + * sets the default presence requirements. Supported modes: 'optional', 'required', and 'forbidden'. + * + * @default 'optional' + */ + presence?: PresenceMode; + /** + * when true, ignores unknown keys with a function value. + * + * @default false + */ + skipFunctions?: boolean; + /** + * remove unknown elements from objects and arrays. + * - when true, all unknown elements will be removed + * - when an object: + * - objects - set to true to remove unknown keys from objects + * + * @default false + */ + stripUnknown?: boolean | { arrays?: boolean; objects?: boolean }; + } + + interface ValidationOptions extends BaseValidationOptions { + /** + * overrides individual error messages. Defaults to no override (`{}`). + * Messages use the same rules as templates. + * Variables in double braces `{{var}}` are HTML escaped if the option `errors.escapeHtml` is set to true. + * + * @default {} + */ + messages?: LanguageMessages; + } + + interface AsyncValidationOptions extends ValidationOptions { + /** + * when true, warnings are returned alongside the value (i.e. `{ value, warning }`). + * + * @default false + */ + warnings?: boolean; + } + + interface LanguageMessageTemplate { + source: string; + rendered: string; + } + + interface ErrorValidationOptions extends BaseValidationOptions { + messages?: Record; + } + + interface RenameOptions { + /** + * if true, does not delete the old key name, keeping both the new and old keys in place. + * + * @default false + */ + alias?: boolean; + /** + * if true, allows renaming multiple keys to the same destination where the last rename wins. + * + * @default false + */ + multiple?: boolean; + /** + * if true, allows renaming a key over an existing key. + * + * @default false + */ + override?: boolean; + /** + * if true, skip renaming of a key if it's undefined. + * + * @default false + */ + ignoreUndefined?: boolean; + } + + interface TopLevelDomainOptions { + /** + * - `true` to use the IANA list of registered TLDs. This is the default value. + * - `false` to allow any TLD not listed in the `deny` list, if present. + * - A `Set` or array of the allowed TLDs. Cannot be used together with `deny`. + */ + allow?: Set | string[] | boolean; + /** + * - A `Set` or array of the forbidden TLDs. Cannot be used together with a custom `allow` list. + */ + deny?: Set | string[]; + } + + interface HierarchySeparatorOptions { + /** + * overrides the default `.` hierarchy separator. Set to false to treat the key as a literal value. + * + * @default '.' 
+ */ + separator?: string | false; + } + + interface EmailOptions { + /** + * If `true`, Unicode characters are permitted + * + * @default true + */ + allowUnicode?: boolean; + /** + * if `true`, ignore invalid email length errors. + * + * @default false + */ + ignoreLength?: boolean; + /** + * if true, allows multiple email addresses in a single string, separated by , or the separator characters. + * + * @default false + */ + multiple?: boolean; + /** + * when multiple is true, overrides the default , separator. String can be a single character or multiple separator characters. + * + * @default ',' + */ + separator?: string | string[]; + /** + * Options for TLD (top level domain) validation. By default, the TLD must be a valid name listed on the [IANA registry](http://data.iana.org/TLD/tlds-alpha-by-domain.txt) + * + * @default { allow: true } + */ + tlds?: TopLevelDomainOptions | false; + /** + * Number of segments required for the domain. Be careful since some domains, such as `io`, directly allow email. + * + * @default 2 + */ + minDomainSegments?: number; + } + + interface DomainOptions { + /** + * If `true`, Unicode characters are permitted + * + * @default true + */ + allowUnicode?: boolean; + + /** + * Options for TLD (top level domain) validation. By default, the TLD must be a valid name listed on the [IANA registry](http://data.iana.org/TLD/tlds-alpha-by-domain.txt) + * + * @default { allow: true } + */ + tlds?: TopLevelDomainOptions | false; + /** + * Number of segments required for the domain. + * + * @default 2 + */ + minDomainSegments?: number; + } + + interface HexOptions { + /** + * hex decoded representation must be byte aligned. + * @default false + */ + byteAligned?: boolean; + } + + interface IpOptions { + /** + * One or more IP address versions to validate against. Valid values: ipv4, ipv6, ipvfuture + */ + version?: string | string[]; + /** + * Used to determine if a CIDR is allowed or not. Valid values: optional, required, forbidden + */ + cidr?: PresenceMode; + } + + type GuidVersions = 'uuidv1' | 'uuidv2' | 'uuidv3' | 'uuidv4' | 'uuidv5'; + + interface GuidOptions { + version?: GuidVersions[] | GuidVersions; + separator?: boolean | '-' | ':'; + } + + interface UriOptions { + /** + * Specifies one or more acceptable Schemes, should only include the scheme name. + * Can be an Array or String (strings are automatically escaped for use in a Regular Expression). + */ + scheme?: string | RegExp | Array; + /** + * Allow relative URIs. + * + * @default false + */ + allowRelative?: boolean; + /** + * Restrict only relative URIs. + * + * @default false + */ + relativeOnly?: boolean; + /** + * Allows unencoded square brackets inside the query string. + * This is NOT RFC 3986 compliant but query strings like abc[]=123&abc[]=456 are very common these days. + * + * @default false + */ + allowQuerySquareBrackets?: boolean; + /** + * Validate the domain component using the options specified in `string.domain()`. + */ + domain?: DomainOptions; + } + + interface DataUriOptions { + /** + * optional parameter defaulting to true which will require `=` padding if true or make padding optional if false + * + * @default true + */ + paddingRequired?: boolean; + } + + interface Base64Options extends Pick { + /** + * if true, uses the URI-safe base64 format which replaces `+` with `-` and `\` with `_`. + * + * @default false + */ + urlSafe?: boolean; + } + + interface SwitchCases { + /** + * the required condition joi type. 
+ */ + is: SchemaLike; + /** + * the alternative schema type if the condition is true. + */ + then: SchemaLike; + } + + interface SwitchDefault { + /** + * the alternative schema type if no cases matched. + * Only one otherwise statement is allowed in switch as the last array item. + */ + otherwise: SchemaLike; + } + + interface WhenOptions { + /** + * the required condition joi type. + */ + is?: SchemaLike; + + /** + * the negative version of `is` (`then` and `otherwise` have reverse + * roles). + */ + not?: SchemaLike; + + /** + * the alternative schema type if the condition is true. Required if otherwise or switch are missing. + */ + then?: SchemaLike; + + /** + * the alternative schema type if the condition is false. Required if then or switch are missing. + */ + otherwise?: SchemaLike; + + /** + * the list of cases. Required if then is missing. Required if then or otherwise are missing. + */ + switch?: Array; + + /** + * whether to stop applying further conditions if the condition is true. + */ + break?: boolean; + } + + interface WhenSchemaOptions { + /** + * the alternative schema type if the condition is true. Required if otherwise is missing. + */ + then?: SchemaLike; + /** + * the alternative schema type if the condition is false. Required if then is missing. + */ + otherwise?: SchemaLike; + } + + interface Cache { + /** + * Add an item to the cache. + * + * Note that key and value can be anything including objects, array, etc. + */ + set(key: any, value: any): void; + + /** + * Retrieve an item from the cache. + * + * Note that key and value can be anything including objects, array, etc. + */ + get(key: any): any; + } + interface CacheProvisionOptions { + /** + * number of items to store in the cache before the least used items are dropped. + * + * @default 1000 + */ + max: number; + } + + interface CacheConfiguration { + /** + * Provisions a simple LRU cache for caching simple inputs (`undefined`, `null`, strings, numbers, and booleans). + */ + provision(options?: CacheProvisionOptions): void; + } + + interface CompileOptions { + /** + * If true and the provided schema is (or contains parts) using an older version of joi, will return a compiled schema that is compatible with the older version. + * If false, the schema is always compiled using the current version and if older schema components are found, an error is thrown. + */ + legacy: boolean; + } + + interface IsSchemaOptions { + /** + * If true, will identify schemas from older versions of joi, otherwise will throw an error. + * + * @default false + */ + legacy: boolean; + } + + interface ReferenceOptions extends HierarchySeparatorOptions { + /** + * a function with the signature `function(value)` where `value` is the resolved reference value and the return value is the adjusted value to use. + * Note that the adjust feature will not perform any type validation on the adjusted value and it must match the value expected by the rule it is used in. + * Cannot be used with `map`. + * + * @example `(value) => value + 5` + */ + adjust?: (value: any) => any; + + /** + * an array of array pairs using the format `[[key, value], [key, value]]` used to maps the resolved reference value to another value. + * If the resolved value is not in the map, it is returned as-is. + * Cannot be used with `adjust`. + */ + map?: Array<[any, any]>; + + /** + * overrides default prefix characters. + */ + prefix?: { + /** + * references to the globally provided context preference. 
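+ * (e.g. `Joi.ref('$limit')` — an illustrative name — resolves against the `context` object passed in the validation options)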
+ * + * @default '$' + */ + global?: string; + + /** + * references to error-specific or rule specific context. + * + * @default '#' + */ + local?: string; + + /** + * references to the root value being validated. + * + * @default '/' + */ + root?: string; + }; + + /** + * If set to a number, sets the reference relative starting point. + * Cannot be combined with separator prefix characters. + * Defaults to the reference key prefix (or 1 if none present) + */ + ancestor?: number; + + /** + * creates an in-reference. + */ + in?: boolean; + + /** + * when true, the reference resolves by reaching into maps and sets. + */ + iterables?: boolean; + + /** + * when true, the value of the reference is used instead of its name in error messages + * and template rendering. Defaults to false. + */ + render?: boolean; + } + + interface StringRegexOptions { + /** + * optional pattern name. + */ + name?: string; + + /** + * when true, the provided pattern will be disallowed instead of required. + * + * @default false + */ + invert?: boolean; + } + + interface RuleOptions { + /** + * if true, the rules will not be replaced by the same unique rule later. + * + * For example, `Joi.number().min(1).rule({ keep: true }).min(2)` will keep both `min()` rules instead of the later rule overriding the first. + * + * @default false + */ + keep?: boolean; + + /** + * a single message string or a messages object where each key is an error code and corresponding message string as value. + * + * The object is the same as the messages used as an option in `any.validate()`. + * The strings can be plain messages or a message template. + */ + message?: string | LanguageMessages; + + /** + * if true, turns any error generated by the ruleset to warnings. + */ + warn?: boolean; + } + + interface ErrorReport extends Error { + code: string; + flags: Record; + path: string[]; + prefs: ErrorValidationOptions; + messages: LanguageMessages; + state: State; + value: any; + } + + interface ValidationError extends Error { + name: 'ValidationError'; + + isJoi: boolean; + + /** + * array of errors. + */ + details: ValidationErrorItem[]; + + /** + * function that returns a string with an annotated version of the object pointing at the places where errors occurred. + * + * NOTE: This method does not exist in browser builds of Joi + * + * @param stripColors - if truthy, will strip the colors out of the output. 
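+ *
+ * @example
+ * // hedged usage sketch (schema and input assumed):
+ * // const { error } = schema.validate(input);
+ * // if (error) console.error(error.annotate(true));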
+ */ + annotate(stripColors?: boolean): string; + + _original: any; + } + + interface ValidationErrorItem { + message: string; + path: Array; + type: string; + context?: Context; + } + + type ValidationErrorFunction = (errors: ErrorReport[]) => string | ValidationErrorItem | Error; + + interface ValidationResult { + error?: ValidationError; + warning?: ValidationError; + value: any; + } + + interface CreateErrorOptions { + flags?: boolean; + messages?: LanguageMessages; + } + + interface ModifyOptions { + each?: boolean; + once?: boolean; + ref?: boolean; + schema?: boolean; + } + + interface MutateRegisterOptions { + family?: any; + key?: any; + } + + interface SetFlagOptions { + clone: boolean; + } + + interface CustomHelpers { + schema: ExtensionBoundSchema; + state: State; + prefs: ValidationOptions; + original: V; + warn: (code: string, local?: Context) => void; + error: (code: string, local?: Context) => ErrorReport; + message: (messages: LanguageMessages, local?: Context) => ErrorReport; + } + + type CustomValidator = (value: V, helpers: CustomHelpers) => V | ErrorReport; + + type ExternalValidationFunction = (value: any) => any; + + type SchemaLikeWithoutArray = string | number | boolean | null | Schema | SchemaMap; + type SchemaLike = SchemaLikeWithoutArray | object; + + type NullableType = undefined | null | T + + type ObjectPropertiesSchema = + T extends NullableType + ? Joi.StringSchema + : T extends NullableType + ? Joi.NumberSchema + : T extends NullableType + ? Joi.NumberSchema + : T extends NullableType + ? Joi.BooleanSchema + : T extends NullableType> + ? Joi.ArraySchema + : T extends NullableType + ? ObjectSchema> + : never + + type PartialSchemaMap = { + [key in keyof TSchema]?: SchemaLike | SchemaLike[]; + } + + type StrictSchemaMap = { + [key in keyof TSchema]-?: ObjectPropertiesSchema + }; + + type SchemaMap = isStrict extends true ? StrictSchemaMap : PartialSchemaMap + + type Schema
= + | AnySchema + | ArraySchema + | AlternativesSchema + | BinarySchema + | BooleanSchema + | DateSchema + | FunctionSchema + | NumberSchema + | ObjectSchema
+ | StringSchema + | LinkSchema + | SymbolSchema; + + type SchemaFunction = (schema: Schema) => Schema; + + interface AddRuleOptions { + name: string; + args?: { + [key: string]: any; + }; + } + + interface GetRuleOptions { + args?: Record; + method?: string; + name: string; + operator?: string; + } + + interface SchemaInternals { + /** + * Parent schema object. + */ + $_super: Schema; + + /** + * Terms of current schema. + */ + $_terms: Record; + + /** + * Adds a rule to current validation schema. + */ + $_addRule(rule: string | AddRuleOptions): Schema; + + /** + * Internally compiles schema. + */ + $_compile(schema: SchemaLike, options?: CompileOptions): Schema; + + /** + * Creates a joi error object. + */ + $_createError( + code: string, + value: any, + context: Context, + state: State, + prefs: ValidationOptions, + options?: CreateErrorOptions, + ): Err; + + /** + * Get value from given flag. + */ + $_getFlag(name: string): any; + + /** + * Retrieve some rule configuration. + */ + $_getRule(name: string): GetRuleOptions | undefined; + + $_mapLabels(path: string | string[]): string; + + /** + * Returns true if validations runs fine on given value. + */ + $_match(value: any, state: State, prefs: ValidationOptions): boolean; + + $_modify(options?: ModifyOptions): Schema; + + /** + * Resets current schema. + */ + $_mutateRebuild(): this; + + $_mutateRegister(schema: Schema, options?: MutateRegisterOptions): void; + + /** + * Get value from given property. + */ + $_property(name: string): any; + + /** + * Get schema at given path. + */ + $_reach(path: string[]): Schema; + + /** + * Get current schema root references. + */ + $_rootReferences(): any; + + /** + * Set flag to given value. + */ + $_setFlag(flag: string, value: any, options?: SetFlagOptions): void; + + /** + * Runs internal validations against given value. + */ + $_validate(value: any, state: State, prefs: ValidationOptions): ValidationResult; + } + + interface AnySchema extends SchemaInternals { + /** + * Flags of current schema. + */ + _flags: Record; + + /** + * Starts a ruleset in order to apply multiple rule options. The set ends when `rule()`, `keep()`, `message()`, or `warn()` is called. + */ + $: this; + + /** + * Starts a ruleset in order to apply multiple rule options. The set ends when `rule()`, `keep()`, `message()`, or `warn()` is called. + */ + ruleset: this; + + type?: Types | string; + + /** + * Whitelists a value + */ + allow(...values: any[]): this; + + /** + * Assign target alteration options to a schema that are applied when `any.tailor()` is called. + * @param targets - an object where each key is a target name, and each value is a function that takes an schema and returns an schema. + */ + alter(targets: Record Schema>): this; + + /** + * By default, some Joi methods to function properly need to rely on the Joi instance they are attached to because + * they use `this` internally. + * So `Joi.string()` works but if you extract the function from it and call `string()` it won't. + * `bind()` creates a new Joi instance where all the functions relying on `this` are bound to the Joi instance. + */ + bind(): this; + + /** + * Adds caching to the schema which will attempt to cache the validation results (success and failures) of incoming inputs. + * If no cache is passed, a default cache is provisioned by using `cache.provision()` internally. + */ + cache(cache?: Cache): this; + + /** + * Casts the validated value to the specified type. 
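+ *
+ * @example
+ * // illustrative sketch: with convert enabled (the default), the validated number is returned as a string
+ * // Joi.number().cast('string').validate(5); // -> { value: '5' }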
+ */ + cast(to: 'map' | 'number' | 'set' | 'string'): this; + + /** + * Returns a new type that is the result of adding the rules of one type to another. + */ + concat(schema: this): this; + + /** + * Adds a custom validation function. + */ + custom(fn: CustomValidator, description?: string): this; + + /** + * Sets a default value if the original value is `undefined` where: + * @param value - the default value. One of: + * - a literal value (string, number, object, etc.) + * - a [references](#refkey-options) + * - a function which returns the default value using the signature `function(parent, helpers)` where: + * - `parent` - a clone of the object containing the value being validated. Note that since specifying a + * `parent` argument performs cloning, do not declare format arguments if you are not using them. + * - `helpers` - same as thsoe described in [`any.custom()`](anycustomermethod_description) + * + * When called without any `value` on an object schema type, a default value will be automatically generated + * based on the default values of the object keys. + * + * Note that if value is an object, any changes to the object after `default()` is called will change the + * reference and any future assignment. + */ + default(value?: BasicType | Reference | ((parent: any, helpers: CustomHelpers) => BasicType | Reference)): this; + + /** + * Returns a plain object representing the schema's rules and properties + */ + describe(): Description; + + /** + * Annotates the key + */ + description(desc: string): this; + + /** + * Disallows values. + */ + disallow(...values: any[]): this; + + /** + * Considers anything that matches the schema to be empty (undefined). + * @param schema - any object or joi schema to match. An undefined schema unsets that rule. + */ + empty(schema?: SchemaLike): this; + + /** + * Adds the provided values into the allowed whitelist and marks them as the only valid values allowed. + */ + equal(...values: any[]): this; + + /** + * Overrides the default joi error with a custom error if the rule fails where: + * @param err - can be: + * an instance of `Error` - the override error. + * a `function(errors)`, taking an array of errors as argument, where it must either: + * return a `string` - substitutes the error message with this text + * return a single ` object` or an `Array` of it, where: + * `type` - optional parameter providing the type of the error (eg. `number.min`). + * `message` - optional parameter if `template` is provided, containing the text of the error. + * `template` - optional parameter if `message` is provided, containing a template string, using the same format as usual joi language errors. + * `context` - optional parameter, to provide context to your error if you are using the `template`. + * return an `Error` - same as when you directly provide an `Error`, but you can customize the error message based on the errors. + * + * Note that if you provide an `Error`, it will be returned as-is, unmodified and undecorated with any of the + * normal joi error properties. If validation fails and another error is found before the error + * override, that error will be returned and the override will be ignored (unless the `abortEarly` + * option has been set to `false`). + */ + error(err: Error | ValidationErrorFunction): this; + + /** + * Annotates the key with an example value, must be valid. + */ + example(value: any, options?: { override: boolean }): this; + + /** + * Marks a key as required which will not allow undefined as value. 
All keys are optional by default. + */ + exist(): this; + + /** + * Adds an external validation rule. + * + * Note that external validation rules are only called after the all other validation rules for the entire schema (from the value root) are checked. + * This means that any changes made to the value by the external rules are not available to any other validation rules during the non-external validation phase. + * If schema validation failed, no external validation rules are called. + */ + external(method: ExternalValidationFunction, description?: string): this; + + /** + * Returns a sub-schema based on a path of object keys or schema ids. + * + * @param path - a dot `.` separated path string or a pre-split array of path keys. The keys must match the sub-schema id or object key (if no id was explicitly set). + */ + extract(path: string | string[]): Schema; + + /** + * Sets a failover value if the original value fails passing validation. + * + * @param value - the failover value. value supports references. value may be assigned a function which returns the default value. + * + * If value is specified as a function that accepts a single parameter, that parameter will be a context object that can be used to derive the resulting value. + * Note that if value is an object, any changes to the object after `failover()` is called will change the reference and any future assignment. + * Use a function when setting a dynamic value (e.g. the current time). + * Using a function with a single argument performs some internal cloning which has a performance impact. + * If you do not need access to the context, define the function without any arguments. + */ + failover(value: any): this; + + /** + * Marks a key as forbidden which will not allow any value except undefined. Used to explicitly forbid keys. + */ + forbidden(): this; + + /** + * Returns a new schema where each of the path keys listed have been modified. + * + * @param key - an array of key strings, a single key string, or an array of arrays of pre-split key strings. + * @param adjuster - a function which must return a modified schema. + */ + fork(key: string | string[] | string[][], adjuster: SchemaFunction): this; + + /** + * Sets a schema id for reaching into the schema via `any.extract()`. + * If no id is set, the schema id defaults to the object key it is associated with. + * If the schema is used in an array or alternatives type and no id is set, the schema in unreachable. + */ + id(name?: string): this; + + /** + * Disallows values. + */ + invalid(...values: any[]): this; + + /** + * Same as `rule({ keep: true })`. + * + * Note that `keep()` will terminate the current ruleset and cannot be followed by another rule option. + * Use `rule()` to apply multiple rule options. + */ + keep(): this; + + /** + * Overrides the key name in error messages. + */ + label(name: string): this; + + /** + * Same as `rule({ message })`. + * + * Note that `message()` will terminate the current ruleset and cannot be followed by another rule option. + * Use `rule()` to apply multiple rule options. + */ + message(message: string): this; + + /** + * Same as `any.prefs({ messages })`. + * Note that while `any.message()` applies only to the last rule or ruleset, `any.messages()` applies to the entire schema. + */ + messages(messages: LanguageMessages): this; + + /** + * Attaches metadata to the key. + */ + meta(meta: object): this; + + /** + * Disallows values. 
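+ *
+ * @example
+ * // sketch (values assumed): Joi.string().not('admin', 'root') rejects those two strings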
+ */ + not(...values: any[]): this; + + /** + * Annotates the key + */ + note(...notes: string[]): this; + + /** + * Requires the validated value to match of the provided `any.allow()` values. + * It has not effect when called together with `any.valid()` since it already sets the requirements. + * When used with `any.allow()` it converts it to an `any.valid()`. + */ + only(): this; + + /** + * Marks a key as optional which will allow undefined as values. Used to annotate the schema for readability as all keys are optional by default. + */ + optional(): this; + + /** + * Overrides the global validate() options for the current key and any sub-key. + */ + options(options: ValidationOptions): this; + + /** + * Overrides the global validate() options for the current key and any sub-key. + */ + prefs(options: ValidationOptions): this; + + /** + * Overrides the global validate() options for the current key and any sub-key. + */ + preferences(options: ValidationOptions): this; + + /** + * Sets the presence mode for the schema. + */ + presence(mode: PresenceMode): this; + + /** + * Outputs the original untouched value instead of the casted value. + */ + raw(enabled?: boolean): this; + + /** + * Marks a key as required which will not allow undefined as value. All keys are optional by default. + */ + required(): this; + + /** + * Applies a set of rule options to the current ruleset or last rule added. + * + * When applying rule options, the last rule (e.g. `min()`) is used unless there is an active ruleset defined (e.g. `$.min().max()`) + * in which case the options are applied to all the provided rules. + * Once `rule()` is called, the previous rules can no longer be modified and any active ruleset is terminated. + * + * Rule modifications can only be applied to supported rules. + * Most of the `any` methods do not support rule modifications because they are implemented using schema flags (e.g. `required()`) or special + * internal implementation (e.g. `valid()`). + * In those cases, use the `any.messages()` method to override the error codes for the errors you want to customize. + */ + rule(options: RuleOptions): this; + + /** + * Registers a schema to be used by decendents of the current schema in named link references. + */ + shared(ref: Schema): this; + + /** + * Sets the options.convert options to false which prevent type casting for the current key and any child keys. + */ + strict(isStrict?: boolean): this; + + /** + * Marks a key to be removed from a resulting object or array after validation. Used to sanitize output. + * @param [enabled=true] - if true, the value is stripped, otherwise the validated value is retained. Defaults to true. + */ + strip(enabled?: boolean): this; + + /** + * Annotates the key + */ + tag(...tags: string[]): this; + + /** + * Applies any assigned target alterations to a copy of the schema that were applied via `any.alter()`. + */ + tailor(targets: string | string[]): Schema; + + /** + * Annotates the key with an unit name. + */ + unit(name: string): this; + + /** + * Adds the provided values into the allowed whitelist and marks them as the only valid values allowed. + */ + valid(...values: any[]): this; + + /** + * Validates a value using the schema and options. + */ + validate(value: any, options?: ValidationOptions): ValidationResult; + + /** + * Validates a value using the schema and options. + */ + validateAsync(value: any, options?: AsyncValidationOptions): Promise; + + /** + * Same as `rule({ warn: true })`. 
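+ * (e.g. `Joi.string().max(255).warn()` — an illustrative sketch — reports a too-long value as a warning rather than an error)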
+ * Note that `warn()` will terminate the current ruleset and cannot be followed by another rule option. + * Use `rule()` to apply multiple rule options. + */ + warn(): this; + + /** + * Generates a warning. + * When calling `any.validateAsync()`, set the `warning` option to true to enable warnings. + * Warnings are reported separately from errors alongside the result value via the warning key (i.e. `{ value, warning }`). + * Warning are always included when calling `any.validate()`. + */ + warning(code: string, context: Context): this; + + /** + * Converts the type into an alternatives type where the conditions are merged into the type definition where: + */ + when(ref: string | Reference, options: WhenOptions | WhenOptions[]): this; + + /** + * Converts the type into an alternatives type where the conditions are merged into the type definition where: + */ + when(ref: Schema, options: WhenSchemaOptions): this; + } + + interface Description { + type?: Types | string; + label?: string; + description?: string; + flags?: object; + notes?: string[]; + tags?: string[]; + meta?: any[]; + example?: any[]; + valids?: any[]; + invalids?: any[]; + unit?: string; + options?: ValidationOptions; + [key: string]: any; + } + + interface Context { + [key: string]: any; + key?: string; + label?: string; + value?: any; + } + + interface State { + key?: string; + path?: string; + parent?: any; + reference?: any; + ancestors?: any; + localize?(...args: any[]): State; + } + + interface BooleanSchema extends AnySchema { + /** + * Allows for additional values to be considered valid booleans by converting them to false during validation. + * String comparisons are by default case insensitive, + * see `boolean.sensitive()` to change this behavior. + * @param values - strings, numbers or arrays of them + */ + falsy(...values: Array): this; + + /** + * Allows the values provided to truthy and falsy as well as the "true" and "false" default conversion + * (when not in `strict()` mode) to be matched in a case insensitive manner. + */ + sensitive(enabled?: boolean): this; + + /** + * Allows for additional values to be considered valid booleans by converting them to true during validation. + * String comparisons are by default case insensitive, see `boolean.sensitive()` to change this behavior. + * @param values - strings, numbers or arrays of them + */ + truthy(...values: Array): this; + } + + interface NumberSchema extends AnySchema { + /** + * Specifies that the value must be greater than limit. + * It can also be a reference to another field. + */ + greater(limit: number | Reference): this; + + /** + * Requires the number to be an integer (no floating point). + */ + integer(): this; + + /** + * Specifies that the value must be less than limit. + * It can also be a reference to another field. + */ + less(limit: number | Reference): this; + + /** + * Specifies the maximum value. + * It can also be a reference to another field. + */ + max(limit: number | Reference): this; + + /** + * Specifies the minimum value. + * It can also be a reference to another field. + */ + min(limit: number | Reference): this; + + /** + * Specifies that the value must be a multiple of base. + */ + multiple(base: number | Reference): this; + + /** + * Requires the number to be negative. + */ + negative(): this; + + /** + * Requires the number to be a TCP port, so between 0 and 65535. + */ + port(): this; + + /** + * Requires the number to be positive. 
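+ *
+ * @example
+ * // sketch: Joi.number().integer().positive() accepts 3 but rejects 0 and -3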
+ */ + positive(): this; + + /** + * Specifies the maximum number of decimal places where: + * @param limit - the maximum number of decimal places allowed. + */ + precision(limit: number): this; + + /** + * Requires the number to be negative or positive. + */ + sign(sign: 'positive' | 'negative'): this; + + /** + * Allows the number to be outside of JavaScript's safety range (Number.MIN_SAFE_INTEGER & Number.MAX_SAFE_INTEGER). + */ + unsafe(enabled?: any): this; + } + + interface StringSchema extends AnySchema { + /** + * Requires the string value to only contain a-z, A-Z, and 0-9. + */ + alphanum(): this; + + /** + * Requires the string value to be a valid base64 string; does not check the decoded value. + */ + base64(options?: Base64Options): this; + + /** + * Sets the required string case. + */ + case(direction: 'upper' | 'lower'): this; + + /** + * Requires the number to be a credit card number (Using Lunh Algorithm). + */ + creditCard(): this; + + /** + * Requires the string value to be a valid data URI string. + */ + dataUri(options?: DataUriOptions): this; + + /** + * Requires the string value to be a valid domain. + */ + domain(options?: DomainOptions): this; + + /** + * Requires the string value to be a valid email address. + */ + email(options?: EmailOptions): this; + + /** + * Requires the string value to be a valid GUID. + */ + guid(options?: GuidOptions): this; + + /** + * Requires the string value to be a valid hexadecimal string. + */ + hex(options?: HexOptions): this; + + /** + * Requires the string value to be a valid hostname as per RFC1123. + */ + hostname(): this; + + /** + * Allows the value to match any whitelist of blacklist item in a case insensitive comparison. + */ + insensitive(): this; + + /** + * Requires the string value to be a valid ip address. + */ + ip(options?: IpOptions): this; + + /** + * Requires the string value to be in valid ISO 8601 date format. + */ + isoDate(): this; + + /** + * Requires the string value to be in valid ISO 8601 duration format. + */ + isoDuration(): this; + + /** + * Specifies the exact string length required + * @param limit - the required string length. It can also be a reference to another field. + * @param encoding - if specified, the string length is calculated in bytes using the provided encoding. + */ + length(limit: number | Reference, encoding?: string): this; + + /** + * Requires the string value to be all lowercase. If the validation convert option is on (enabled by default), the string will be forced to lowercase. + */ + lowercase(): this; + + /** + * Specifies the maximum number of string characters. + * @param limit - the maximum number of string characters allowed. It can also be a reference to another field. + * @param encoding - if specified, the string length is calculated in bytes using the provided encoding. + */ + max(limit: number | Reference, encoding?: string): this; + + /** + * Specifies the minimum number string characters. + * @param limit - the minimum number of string characters required. It can also be a reference to another field. + * @param encoding - if specified, the string length is calculated in bytes using the provided encoding. + */ + min(limit: number | Reference, encoding?: string): this; + + /** + * Requires the string value to be in a unicode normalized form. If the validation convert option is on (enabled by default), the string will be normalized. + * @param [form='NFC'] - The unicode normalization form to use. 
Valid values: NFC [default], NFD, NFKC, NFKD + */ + normalize(form?: 'NFC' | 'NFD' | 'NFKC' | 'NFKD'): this; + + /** + * Defines a regular expression rule. + * @param pattern - a regular expression object the string value must match against. + * @param options - optional, can be: + * Name for patterns (useful with multiple patterns). Defaults to 'required'. + * An optional configuration object with the following supported properties: + * name - optional pattern name. + * invert - optional boolean flag. Defaults to false behavior. If specified as true, the provided pattern will be disallowed instead of required. + */ + pattern(pattern: RegExp, options?: string | StringRegexOptions): this; + + /** + * Defines a regular expression rule. + * @param pattern - a regular expression object the string value must match against. + * @param options - optional, can be: + * Name for patterns (useful with multiple patterns). Defaults to 'required'. + * An optional configuration object with the following supported properties: + * name - optional pattern name. + * invert - optional boolean flag. Defaults to false behavior. If specified as true, the provided pattern will be disallowed instead of required. + */ + regex(pattern: RegExp, options?: string | StringRegexOptions): this; + + /** + * Replace characters matching the given pattern with the specified replacement string where: + * @param pattern - a regular expression object to match against, or a string of which all occurrences will be replaced. + * @param replacement - the string that will replace the pattern. + */ + replace(pattern: RegExp | string, replacement: string): this; + + /** + * Requires the string value to only contain a-z, A-Z, 0-9, and underscore _. + */ + token(): this; + + /** + * Requires the string value to contain no whitespace before or after. If the validation convert option is on (enabled by default), the string will be trimmed. + * @param [enabled=true] - optional parameter defaulting to true which allows you to reset the behavior of trim by providing a falsy value. + */ + trim(enabled?: any): this; + + /** + * Specifies whether the string.max() limit should be used as a truncation. + * @param [enabled=true] - optional parameter defaulting to true which allows you to reset the behavior of truncate by providing a falsy value. + */ + truncate(enabled?: boolean): this; + + /** + * Requires the string value to be all uppercase. If the validation convert option is on (enabled by default), the string will be forced to uppercase. + */ + uppercase(): this; + + /** + * Requires the string value to be a valid RFC 3986 URI. + */ + uri(options?: UriOptions): this; + + /** + * Requires the string value to be a valid GUID. + */ + uuid(options?: GuidOptions): this; + } + + interface SymbolSchema extends AnySchema { + // TODO: support number and symbol index + map(iterable: Iterable<[string | number | boolean | symbol, symbol]> | { [key: string]: symbol }): this; + } + + interface ArraySortOptions { + /** + * @default 'ascending' + */ + order?: 'ascending' | 'descending'; + by?: string | Reference; + } + + interface ArrayUniqueOptions extends HierarchySeparatorOptions { + /** + * if true, undefined values for the dot notation string comparator will not cause the array to fail on uniqueness. 
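+ * (hedged sketch, key name assumed: `Joi.array().items(Joi.object({ id: Joi.number() })).unique('id', { ignoreUndefined: true })` skips items without an `id` when checking uniqueness)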
+ * + * @default false + */ + ignoreUndefined?: boolean; + } + + type ComparatorFunction = (a: any, b: any) => boolean; + + interface ArraySchema extends AnySchema { + /** + * Verifies that an assertion passes for at least one item in the array, where: + * `schema` - the validation rules required to satisfy the assertion. If the `schema` includes references, they are resolved against + * the array item being tested, not the value of the `ref` target. + */ + has(schema: SchemaLike): this; + + /** + * List the types allowed for the array values. + * If a given type is .required() then there must be a matching item in the array. + * If a type is .forbidden() then it cannot appear in the array. + * Required items can be added multiple times to signify that multiple items must be found. + * Errors will contain the number of items that didn't match. + * Any unmatched item having a label will be mentioned explicitly. + * + * @param type - a joi schema object to validate each array item against. + */ + items(...types: SchemaLikeWithoutArray[]): this; + + /** + * Specifies the exact number of items in the array. + */ + length(limit: number | Reference): this; + + /** + * Specifies the maximum number of items in the array. + */ + max(limit: number | Reference): this; + + /** + * Specifies the minimum number of items in the array. + */ + min(limit: number | Reference): this; + + /** + * Lists the types in sequence order for the array values where: + * @param type - a joi schema object to validate against each array item in sequence order. type can be multiple values passed as individual arguments. + * If a given type is .required() then there must be a matching item with the same index position in the array. + * Errors will contain the number of items that didn't match. + * Any unmatched item having a label will be mentioned explicitly. + */ + ordered(...types: SchemaLikeWithoutArray[]): this; + + /** + * Allow single values to be checked against rules as if it were provided as an array. + * enabled can be used with a falsy value to go back to the default behavior. + */ + single(enabled?: any): this; + + /** + * Sorts the array by given order. + */ + sort(options?: ArraySortOptions): this; + + /** + * Allow this array to be sparse. + * enabled can be used with a falsy value to go back to the default behavior. + */ + sparse(enabled?: any): this; + + /** + * Requires the array values to be unique. + * Remember that if you provide a custom comparator function, + * different types can be passed as parameter depending on the rules you set on items. + * Be aware that a deep equality is performed on elements of the array having a type of object, + * a performance penalty is to be expected for this kind of operation. + */ + unique(comparator?: string | ComparatorFunction, options?: ArrayUniqueOptions): this; + } + + interface ObjectPatternOptions { + fallthrough?: boolean; + matches: SchemaLike | Reference; + } + + interface ObjectSchema extends AnySchema { + /** + * Defines an all-or-nothing relationship between keys where if one of the peers is present, all of them are required as well. + * + * Optional settings must be the last argument. + */ + and(...peers: Array): this; + + /** + * Appends the allowed object keys. If schema is null, undefined, or {}, no changes will be applied. + */ + append(schema?: SchemaMap): this; + append(schema?: SchemaMap): ObjectSchema + + /** + * Verifies an assertion where. 
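+ *
+ * @example
+ * // hedged sketch (keys assumed): require d.e to match a.c
+ * // Joi.object({ a: { c: Joi.number() }, d: { e: Joi.any() } }).assert('.d.e', Joi.ref('a.c'), 'must equal a.c')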
+ */ + assert(ref: string | Reference, schema: SchemaLike, message?: string): this; + + /** + * Requires the object to be an instance of a given constructor. + * + * @param constructor - the constructor function that the object must be an instance of. + * @param name - an alternate name to use in validation errors. This is useful when the constructor function does not have a name. + */ + // tslint:disable-next-line:ban-types + instance(constructor: Function, name?: string): this; + + /** + * Sets or extends the allowed object keys. + */ + keys(schema?: SchemaMap): this; + + /** + * Specifies the exact number of keys in the object. + */ + length(limit: number): this; + + /** + * Specifies the maximum number of keys in the object. + */ + max(limit: number | Reference): this; + + /** + * Specifies the minimum number of keys in the object. + */ + min(limit: number | Reference): this; + + /** + * Defines a relationship between keys where not all peers can be present at the same time. + * + * Optional settings must be the last argument. + */ + nand(...peers: Array): this; + + /** + * Defines a relationship between keys where one of the peers is required (and more than one is allowed). + * + * Optional settings must be the last argument. + */ + or(...peers: Array): this; + + /** + * Defines an exclusive relationship between a set of keys where only one is allowed but none are required. + * + * Optional settings must be the last argument. + */ + oxor(...peers: Array): this; + + /** + * Specify validation rules for unknown keys matching a pattern. + * + * @param pattern - a pattern that can be either a regular expression or a joi schema that will be tested against the unknown key names + * @param schema - the schema object matching keys must validate against + */ + pattern(pattern: RegExp | SchemaLike, schema: SchemaLike, options?: ObjectPatternOptions): this; + + /** + * Requires the object to be a Joi reference. + */ + ref(): this; + + /** + * Requires the object to be a `RegExp` object. + */ + regex(): this; + + /** + * Renames a key to another name (deletes the renamed key). + */ + rename(from: string | RegExp, to: string, options?: RenameOptions): this; + + /** + * Requires the object to be a Joi schema instance. + */ + schema(type?: SchemaLike): this; + + /** + * Overrides the handling of unknown keys for the scope of the current object only (does not apply to children). + */ + unknown(allow?: boolean): this; + + /** + * Requires the presence of other keys whenever the specified key is present. + */ + with(key: string, peers: string | string[], options?: HierarchySeparatorOptions): this; + + /** + * Forbids the presence of other keys whenever the specified is present. + */ + without(key: string, peers: string | string[], options?: HierarchySeparatorOptions): this; + + /** + * Defines an exclusive relationship between a set of keys. one of them is required but not at the same time. + * + * Optional settings must be the last argument. + */ + xor(...peers: Array): this; + } + + interface BinarySchema extends AnySchema { + /** + * Sets the string encoding format if a string input is converted to a buffer. + */ + encoding(encoding: string): this; + + /** + * Specifies the minimum length of the buffer. + */ + min(limit: number | Reference): this; + + /** + * Specifies the maximum length of the buffer. 
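+ *
+ * @example
+ * // sketch: Joi.binary().encoding('base64').max(1024) caps the decoded buffer at 1024 bytes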
+ */ + max(limit: number | Reference): this; + + /** + * Specifies the exact length of the buffer: + */ + length(limit: number | Reference): this; + } + + interface DateSchema extends AnySchema { + /** + * Specifies that the value must be greater than date. + * Notes: 'now' can be passed in lieu of date so as to always compare relatively to the current date, + * allowing to explicitly ensure a date is either in the past or in the future. + * It can also be a reference to another field. + */ + greater(date: 'now' | Date | number | string | Reference): this; + + /** + * Requires the string value to be in valid ISO 8601 date format. + */ + iso(): this; + + /** + * Specifies that the value must be less than date. + * Notes: 'now' can be passed in lieu of date so as to always compare relatively to the current date, + * allowing to explicitly ensure a date is either in the past or in the future. + * It can also be a reference to another field. + */ + less(date: 'now' | Date | number | string | Reference): this; + + /** + * Specifies the oldest date allowed. + * Notes: 'now' can be passed in lieu of date so as to always compare relatively to the current date, + * allowing to explicitly ensure a date is either in the past or in the future. + * It can also be a reference to another field. + */ + min(date: 'now' | Date | number | string | Reference): this; + + /** + * Specifies the latest date allowed. + * Notes: 'now' can be passed in lieu of date so as to always compare relatively to the current date, + * allowing to explicitly ensure a date is either in the past or in the future. + * It can also be a reference to another field. + */ + max(date: 'now' | Date | number | string | Reference): this; + + /** + * Requires the value to be a timestamp interval from Unix Time. + * @param type - the type of timestamp (allowed values are unix or javascript [default]) + */ + timestamp(type?: 'javascript' | 'unix'): this; + } + + interface FunctionSchema extends ObjectSchema { + /** + * Specifies the arity of the function where: + * @param n - the arity expected. + */ + arity(n: number): this; + + /** + * Requires the function to be a class. + */ + class(): this; + + /** + * Specifies the minimal arity of the function where: + * @param n - the minimal arity expected. + */ + minArity(n: number): this; + + /** + * Specifies the minimal arity of the function where: + * @param n - the minimal arity expected. + */ + maxArity(n: number): this; + } + + interface AlternativesSchema extends AnySchema { + /** + * Adds a conditional alternative schema type, either based on another key value, or a schema peeking into the current value. + */ + conditional(ref: string | Reference, options: WhenOptions | WhenOptions[]): this; + conditional(ref: Schema, options: WhenSchemaOptions): this; + + /** + * Requires the validated value to match a specific set of the provided alternative.try() schemas. + * Cannot be combined with `alternatives.conditional()`. + */ + match(mode: 'any' | 'all' | 'one'): this; + + /** + * Adds an alternative schema type for attempting to match against the validated value. + */ + try(...types: SchemaLikeWithoutArray[]): this; + } + + interface LinkSchema extends AnySchema { + /** + * Same as `any.concat()` but the schema is merged after the link is resolved which allows merging with schemas of the same type as the resolved link. + * Will throw an exception during validation if the merged types are not compatible. 
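+ *
+ * @example
+ * // hedged sketch (schema id assumed): add a key to whatever object schema '#node' resolves to
+ * // Joi.link('#node').concat(Joi.object({ label: Joi.string() }))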
+ */ + concat(schema: Schema): this; + + /** + * Initializes the schema after constructions for cases where the schema has to be constructed first and then initialized. + * If `ref` was not passed to the constructor, `link.ref()` must be called prior to usaged. + */ + ref(ref: string): this; + } + + interface Reference extends Exclude { + depth: number; + type: string; + key: string; + root: string; + path: string[]; + display: string; + toString(): string; + } + + type ExtensionBoundSchema = Schema & SchemaInternals; + + interface RuleArgs { + name: string; + ref?: boolean; + assert?: ((value: any) => boolean) | AnySchema; + message?: string; + + /** + * Undocumented properties + */ + normalize?(value: any): any; + } + + type RuleMethod = (...args: any[]) => any; + + interface ExtensionRule { + /** + * alternative name for this rule. + */ + alias?: string; + /** + * whether rule supports multiple invocations. + */ + multi?: boolean; + /** + * Dual rule: converts or validates. + */ + convert?: boolean; + /** + * list of arguments accepted by `method`. + */ + args?: Array; + /** + * rule body. + */ + method?: RuleMethod | false; + /** + * validation function. + */ + validate?(value: any, helpers: any, args: Record, options: any): any; + + /** + * undocumented flags. + */ + priority?: boolean; + manifest?: boolean; + } + + interface CoerceResult { + errors?: ErrorReport[]; + value?: any; + } + + type CoerceFunction = (value: any, helpers: CustomHelpers) => CoerceResult; + + interface CoerceObject { + method: CoerceFunction; + from?: string | string[]; + } + + interface ExtensionFlag { + setter?: string; + default?: any; + } + + interface ExtensionTermManifest { + mapped: { + from: string; + to: string; + }; + } + + interface ExtensionTerm { + init: any[] | null; + register?: any; + manifest?: Record; + } + + interface Extension { + type: string | RegExp; + args?(...args: SchemaLike[]): Schema; + base?: Schema; + coerce?: CoerceFunction | CoerceObject; + flags?: Record; + manifest?: { + build?(obj: ExtensionBoundSchema, desc: Record): any; + }; + messages?: LanguageMessages | string; + modifiers?: Record any>; + overrides?: Record Schema>; + prepare?(value: any, helpers: CustomHelpers): any; + rebuild?(schema: ExtensionBoundSchema): void; + rules?: Record>; + terms?: Record; + validate?(value: any, helpers: CustomHelpers): any; + + /** + * undocumented options + */ + cast?: Record; + properties?: Record; + } + + type ExtensionFactory = (joi: Root) => Extension; + + interface Err { + toString(): string; + } + + // --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- + + interface Root { + /** + * Current version of the joi package. + */ + version: string; + + ValidationError: new (message: string, details: any, original: any) => ValidationError; + + /** + * Generates a schema object that matches any data type. + */ + any(): AnySchema; + + /** + * Generates a schema object that matches an array data type. + */ + array(): ArraySchema; + + /** + * Generates a schema object that matches a boolean data type (as well as the strings 'true', 'false', 'yes', and 'no'). Can also be called via bool(). + */ + bool(): BooleanSchema; + + /** + * Generates a schema object that matches a boolean data type (as well as the strings 'true', 'false', 'yes', and 'no'). Can also be called via bool(). + */ + boolean(): BooleanSchema; + + /** + * Generates a schema object that matches a Buffer data type (as well as the strings which will be converted to Buffers). 
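+ *
+ * @example
+ * // sketch: Joi.binary().validate('hello') converts the string to a Buffer when convert is enabled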
+ */ + binary(): BinarySchema; + + /** + * Generates a schema object that matches a date type (as well as a JavaScript date string or number of milliseconds). + */ + date(): DateSchema; + + /** + * Generates a schema object that matches a function type. + */ + func(): FunctionSchema; + + /** + * Generates a schema object that matches a function type. + */ + function(): FunctionSchema; + + /** + * Generates a schema object that matches a number data type (as well as strings that can be converted to numbers). + */ + number(): NumberSchema; + + /** + * Generates a schema object that matches an object data type (as well as JSON strings that have been parsed into objects). + */ + // tslint:disable-next-line:no-unnecessary-generics + object(schema?: SchemaMap): ObjectSchema; + + /** + * Generates a schema object that matches a string data type. Note that empty strings are not allowed by default and must be enabled with allow(''). + */ + string(): StringSchema; + + /** + * Generates a schema object that matches any symbol. + */ + symbol(): SymbolSchema; + + /** + * Generates a type that will match one of the provided alternative schemas + */ + alternatives(types: SchemaLike[]): AlternativesSchema; + alternatives(...types: SchemaLike[]): AlternativesSchema; + + /** + * Alias for `alternatives` + */ + alt(types: SchemaLike[]): AlternativesSchema; + alt(...types: SchemaLike[]): AlternativesSchema; + + /** + * Links to another schema node and reuses it for validation, typically for creative recursive schemas. + * + * @param ref - the reference to the linked schema node. + * Cannot reference itself or its children as well as other links. + * Links can be expressed in relative terms like value references (`Joi.link('...')`), + * in absolute terms from the schema run-time root (`Joi.link('/a')`), + * or using schema ids implicitly using object keys or explicitly using `any.id()` (`Joi.link('#a.b.c')`). + */ + link(ref?: string): LinkSchema; + + /** + * Validates a value against a schema and throws if validation fails. + * + * @param value - the value to validate. + * @param schema - the schema object. + * @param message - optional message string prefix added in front of the error message. may also be an Error object. + */ + assert(value: any, schema: Schema, options?: ValidationOptions): void; + assert(value: any, schema: Schema, message: string | Error, options?: ValidationOptions): void; + + /** + * Validates a value against a schema, returns valid object, and throws if validation fails. + * + * @param value - the value to validate. + * @param schema - the schema object. + * @param message - optional message string prefix added in front of the error message. may also be an Error object. + */ + attempt(value: any, schema: Schema, options?: ValidationOptions): any; + attempt(value: any, schema: Schema, message: string | Error, options?: ValidationOptions): any; + + cache: CacheConfiguration; + + /** + * Converts literal schema definition to joi schema object (or returns the same back if already a joi schema object). + */ + compile(schema: SchemaLike, options?: CompileOptions): Schema; + + /** + * Checks if the provided preferences are valid. + * + * Throws an exception if the prefs object is invalid. + * + * The method is provided to perform inputs validation for the `any.validate()` and `any.validateAsync()` methods. + * Validation is not performed automatically for performance reasons. Instead, manually validate the preferences passed once and reuse. 
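+ *
+ * @example
+ * // sketch: throws if an unknown or misspelled option is passed
+ * // Joi.checkPreferences({ abortEarly: false, presence: 'required' }); // valid, does not throw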
+ */ + checkPreferences(prefs: ValidationOptions): void; + + /** + * Creates a custom validation schema. + */ + custom(fn: CustomValidator, description?: string): Schema; + + /** + * Creates a new Joi instance that will apply defaults onto newly created schemas + * through the use of the fn function that takes exactly one argument, the schema being created. + * + * @param fn - The function must always return a schema, even if untransformed. + */ + defaults(fn: SchemaFunction): Root; + + /** + * Generates a dynamic expression using a template string. + */ + expression(template: string, options?: ReferenceOptions): any; + + /** + * Creates a new Joi instance customized with the extension(s) you provide included. + */ + extend(...extensions: Array): any; + + /** + * Creates a reference that when resolved, is used as an array of values to match against the rule. + */ + in(ref: string, options?: ReferenceOptions): Reference; + + /** + * Checks whether or not the provided argument is an instance of ValidationError + */ + isError(error: any): error is ValidationError; + + /** + * Checks whether or not the provided argument is an expression. + */ + isExpression(expression: any): boolean; + + /** + * Checks whether or not the provided argument is a reference. It's especially useful if you want to post-process error messages. + */ + isRef(ref: any): ref is Reference; + + /** + * Checks whether or not the provided argument is a joi schema. + */ + isSchema(schema: any, options?: CompileOptions): schema is AnySchema; + + /** + * A special value used with `any.allow()`, `any.invalid()`, and `any.valid()` as the first value to reset any previously set values. + */ + override: symbol; + + /** + * Generates a reference to the value of the named key. + */ + ref(key: string, options?: ReferenceOptions): Reference; + + /** + * Returns an object where each key is a plain joi schema type. + * Useful for creating type shortcuts using deconstruction. + * Note that the types are already formed and do not need to be called as functions (e.g. `string`, not `string()`). + */ + types(): { + alternatives: AlternativesSchema; + any: AnySchema; + array: ArraySchema; + binary: BinarySchema; + boolean: BooleanSchema; + date: DateSchema; + function: FunctionSchema; + link: LinkSchema; + number: NumberSchema; + object: ObjectSchema; + string: StringSchema; + symbol: SymbolSchema; + }; + + /** + * Generates a dynamic expression using a template string. + */ + x(template: string, options?: ReferenceOptions): any; + + // --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- + // Below are undocumented APIs. use at your own risk + // --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- --- + + /** + * Whitelists a value + */ + allow(...values: any[]): Schema; + + /** + * Adds the provided values into the allowed whitelist and marks them as the only valid values allowed. + */ + valid(...values: any[]): Schema; + equal(...values: any[]): Schema; + + /** + * Blacklists a value + */ + invalid(...values: any[]): Schema; + disallow(...values: any[]): Schema; + not(...values: any[]): Schema; + + /** + * Marks a key as required which will not allow undefined as value. All keys are optional by default. + */ + required(): Schema; + + /** + * Alias of `required`. + */ + exist(): Schema; + + /** + * Marks a key as optional which will allow undefined as values. Used to annotate the schema for readability as all keys are optional by default. 
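+ * (per the shortcut implementation in lib/index.js below, `Joi.optional()` is equivalent to `Joi.any().optional()`)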
+ */ + optional(): Schema; + + /** + * Marks a key as forbidden which will not allow any value except undefined. Used to explicitly forbid keys. + */ + forbidden(): Schema; + + /** + * Overrides the global validate() options for the current key and any sub-key. + */ + preferences(options: ValidationOptions): Schema; + + /** + * Overrides the global validate() options for the current key and any sub-key. + */ + prefs(options: ValidationOptions): Schema; + + /** + * Converts the type into an alternatives type where the conditions are merged into the type definition where: + */ + when(ref: string | Reference, options: WhenOptions | WhenOptions[]): AlternativesSchema; + when(ref: Schema, options: WhenSchemaOptions): AlternativesSchema; + + /** + * Unsure, maybe alias for `compile`? + */ + build(...args: any[]): any; + + /** + * Unsure, maybe alias for `preferences`? + */ + options(...args: any[]): any; + + /** + * Unsure, maybe leaked from `@hapi/lab/coverage/initialize` + */ + trace(...args: any[]): any; + untrace(...args: any[]): any; + } +} + +declare const Joi: Joi.Root; +export = Joi; diff --git a/node_modules/joi/lib/index.js b/node_modules/joi/lib/index.js new file mode 100644 index 000000000..8813f516d --- /dev/null +++ b/node_modules/joi/lib/index.js @@ -0,0 +1,283 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Cache = require('./cache'); +const Common = require('./common'); +const Compile = require('./compile'); +const Errors = require('./errors'); +const Extend = require('./extend'); +const Manifest = require('./manifest'); +const Ref = require('./ref'); +const Template = require('./template'); +const Trace = require('./trace'); + +let Schemas; + + +const internals = { + types: { + alternatives: require('./types/alternatives'), + any: require('./types/any'), + array: require('./types/array'), + boolean: require('./types/boolean'), + date: require('./types/date'), + function: require('./types/function'), + link: require('./types/link'), + number: require('./types/number'), + object: require('./types/object'), + string: require('./types/string'), + symbol: require('./types/symbol') + }, + aliases: { + alt: 'alternatives', + bool: 'boolean', + func: 'function' + } +}; + + +if (Buffer) { // $lab:coverage:ignore$ + internals.types.binary = require('./types/binary'); +} + + +internals.root = function () { + + const root = { + _types: new Set(Object.keys(internals.types)) + }; + + // Types + + for (const type of root._types) { + root[type] = function (...args) { + + Assert(!args.length || ['alternatives', 'link', 'object'].includes(type), 'The', type, 'type does not allow arguments'); + return internals.generate(this, internals.types[type], args); + }; + } + + // Shortcuts + + for (const method of ['allow', 'custom', 'disallow', 'equal', 'exist', 'forbidden', 'invalid', 'not', 'only', 'optional', 'options', 'prefs', 'preferences', 'required', 'strip', 'valid', 'when']) { + root[method] = function (...args) { + + return this.any()[method](...args); + }; + } + + // Methods + + Object.assign(root, internals.methods); + + // Aliases + + for (const alias in internals.aliases) { + const target = internals.aliases[alias]; + root[alias] = root[target]; + } + + root.x = root.expression; + + // Trace + + if (Trace.setup) { // $lab:coverage:ignore$ + Trace.setup(root); + } + + return root; +}; + + +internals.methods = { + + ValidationError: Errors.ValidationError, + version: Common.version, + cache: Cache.provider, + + 
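+    // assert() throws on validation failure (annotating the failing value in the message) while
+    // attempt() returns the validated value; both delegate to internals.assert() further below.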
assert(value, schema, ...args /* [message], [options] */) { + + internals.assert(value, schema, true, args); + }, + + attempt(value, schema, ...args /* [message], [options] */) { + + return internals.assert(value, schema, false, args); + }, + + build(desc) { + + Assert(typeof Manifest.build === 'function', 'Manifest functionality disabled'); + return Manifest.build(this, desc); + }, + + checkPreferences(prefs) { + + Common.checkPreferences(prefs); + }, + + compile(schema, options) { + + return Compile.compile(this, schema, options); + }, + + defaults(modifier) { + + Assert(typeof modifier === 'function', 'modifier must be a function'); + + const joi = Object.assign({}, this); + for (const type of joi._types) { + const schema = modifier(joi[type]()); + Assert(Common.isSchema(schema), 'modifier must return a valid schema object'); + + joi[type] = function (...args) { + + return internals.generate(this, schema, args); + }; + } + + return joi; + }, + + expression(...args) { + + return new Template(...args); + }, + + extend(...extensions) { + + Common.verifyFlat(extensions, 'extend'); + + Schemas = Schemas || require('./schemas'); + + Assert(extensions.length, 'You need to provide at least one extension'); + this.assert(extensions, Schemas.extensions); + + const joi = Object.assign({}, this); + joi._types = new Set(joi._types); + + for (let extension of extensions) { + if (typeof extension === 'function') { + extension = extension(joi); + } + + this.assert(extension, Schemas.extension); + + const expanded = internals.expandExtension(extension, joi); + for (const item of expanded) { + Assert(joi[item.type] === undefined || joi._types.has(item.type), 'Cannot override name', item.type); + + const base = item.base || this.any(); + const schema = Extend.type(base, item); + + joi._types.add(item.type); + joi[item.type] = function (...args) { + + return internals.generate(this, schema, args); + }; + } + } + + return joi; + }, + + isError: Errors.ValidationError.isError, + isExpression: Template.isTemplate, + isRef: Ref.isRef, + isSchema: Common.isSchema, + + in(...args) { + + return Ref.in(...args); + }, + + override: Common.symbols.override, + + ref(...args) { + + return Ref.create(...args); + }, + + types() { + + const types = {}; + for (const type of this._types) { + types[type] = this[type](); + } + + for (const target in internals.aliases) { + types[target] = this[target](); + } + + return types; + } +}; + + +// Helpers + +internals.assert = function (value, schema, annotate, args /* [message], [options] */) { + + const message = args[0] instanceof Error || typeof args[0] === 'string' ? args[0] : null; + const options = message ? args[1] : args[0]; + const result = schema.validate(value, Common.preferences({ errors: { stack: true } }, options || {})); + + let error = result.error; + if (!error) { + return result.value; + } + + if (message instanceof Error) { + throw message; + } + + const display = annotate && typeof error.annotate === 'function' ? error.annotate() : error.message; + + if (error instanceof Errors.ValidationError === false) { + error = Clone(error); + } + + error.message = message ? 
`${message} ${display}` : display; + throw error; +}; + + +internals.generate = function (root, schema, args) { + + Assert(root, 'Must be invoked on a Joi instance.'); + + schema.$_root = root; + + if (!schema._definition.args || + !args.length) { + + return schema; + } + + return schema._definition.args(schema, ...args); +}; + + +internals.expandExtension = function (extension, joi) { + + if (typeof extension.type === 'string') { + return [extension]; + } + + const extended = []; + for (const type of joi._types) { + if (extension.type.test(type)) { + const item = Object.assign({}, extension); + item.type = type; + item.base = joi[type](); + extended.push(item); + } + } + + return extended; +}; + + +module.exports = internals.root(); diff --git a/node_modules/joi/lib/manifest.js b/node_modules/joi/lib/manifest.js new file mode 100644 index 000000000..8fed3c923 --- /dev/null +++ b/node_modules/joi/lib/manifest.js @@ -0,0 +1,476 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Common = require('./common'); +const Messages = require('./messages'); +const Ref = require('./ref'); +const Template = require('./template'); + +let Schemas; + + +const internals = {}; + + +exports.describe = function (schema) { + + const def = schema._definition; + + // Type + + const desc = { + type: schema.type, + flags: {}, + rules: [] + }; + + // Flags + + for (const flag in schema._flags) { + if (flag[0] !== '_') { + desc.flags[flag] = internals.describe(schema._flags[flag]); + } + } + + if (!Object.keys(desc.flags).length) { + delete desc.flags; + } + + // Preferences + + if (schema._preferences) { + desc.preferences = Clone(schema._preferences, { shallow: ['messages'] }); + delete desc.preferences[Common.symbols.prefs]; + if (desc.preferences.messages) { + desc.preferences.messages = Messages.decompile(desc.preferences.messages); + } + } + + // Allow / Invalid + + if (schema._valids) { + desc.allow = schema._valids.describe(); + } + + if (schema._invalids) { + desc.invalid = schema._invalids.describe(); + } + + // Rules + + for (const rule of schema._rules) { + const ruleDef = def.rules[rule.name]; + if (ruleDef.manifest === false) { // Defaults to true + continue; + } + + const item = { name: rule.name }; + + for (const custom in def.modifiers) { + if (rule[custom] !== undefined) { + item[custom] = internals.describe(rule[custom]); + } + } + + if (rule.args) { + item.args = {}; + for (const key in rule.args) { + const arg = rule.args[key]; + if (key === 'options' && + !Object.keys(arg).length) { + + continue; + } + + item.args[key] = internals.describe(arg, { assign: key }); + } + + if (!Object.keys(item.args).length) { + delete item.args; + } + } + + desc.rules.push(item); + } + + if (!desc.rules.length) { + delete desc.rules; + } + + // Terms (must be last to verify no name conflicts) + + for (const term in schema.$_terms) { + if (term[0] === '_') { + continue; + } + + Assert(!desc[term], 'Cannot describe schema due to internal name conflict with', term); + + const items = schema.$_terms[term]; + if (!items) { + continue; + } + + if (items instanceof Map) { + if (items.size) { + desc[term] = [...items.entries()]; + } + + continue; + } + + if (Common.isValues(items)) { + desc[term] = items.describe(); + continue; + } + + Assert(def.terms[term], 'Term', term, 'missing configuration'); + const manifest = def.terms[term].manifest; + const mapped = typeof manifest === 'object'; + if (!items.length && + !mapped) { + + continue; + } + + 
const normalized = []; + for (const item of items) { + normalized.push(internals.describe(item)); + } + + // Mapped + + if (mapped) { + const { from, to } = manifest.mapped; + desc[term] = {}; + for (const item of normalized) { + desc[term][item[to]] = item[from]; + } + + continue; + } + + // Single + + if (manifest === 'single') { + Assert(normalized.length === 1, 'Term', term, 'contains more than one item'); + desc[term] = normalized[0]; + continue; + } + + // Array + + desc[term] = normalized; + } + + internals.validate(schema.$_root, desc); + return desc; +}; + + +internals.describe = function (item, options = {}) { + + if (Array.isArray(item)) { + return item.map(internals.describe); + } + + if (item === Common.symbols.deepDefault) { + return { special: 'deep' }; + } + + if (typeof item !== 'object' || + item === null) { + + return item; + } + + if (options.assign === 'options') { + return Clone(item); + } + + if (Buffer && Buffer.isBuffer(item)) { // $lab:coverage:ignore$ + return { buffer: item.toString('binary') }; + } + + if (item instanceof Date) { + return item.toISOString(); + } + + if (item instanceof Error) { + return item; + } + + if (item instanceof RegExp) { + if (options.assign === 'regex') { + return item.toString(); + } + + return { regex: item.toString() }; + } + + if (item[Common.symbols.literal]) { + return { function: item.literal }; + } + + if (typeof item.describe === 'function') { + if (options.assign === 'ref') { + return item.describe().ref; + } + + return item.describe(); + } + + const normalized = {}; + for (const key in item) { + const value = item[key]; + if (value === undefined) { + continue; + } + + normalized[key] = internals.describe(value, { assign: key }); + } + + return normalized; +}; + + +exports.build = function (joi, desc) { + + const builder = new internals.Builder(joi); + return builder.parse(desc); +}; + + +internals.Builder = class { + + constructor(joi) { + + this.joi = joi; + } + + parse(desc) { + + internals.validate(this.joi, desc); + + // Type + + let schema = this.joi[desc.type]()._bare(); + const def = schema._definition; + + // Flags + + if (desc.flags) { + for (const flag in desc.flags) { + const setter = def.flags[flag] && def.flags[flag].setter || flag; + Assert(typeof schema[setter] === 'function', 'Invalid flag', flag, 'for type', desc.type); + schema = schema[setter](this.build(desc.flags[flag])); + } + } + + // Preferences + + if (desc.preferences) { + schema = schema.preferences(this.build(desc.preferences)); + } + + // Allow / Invalid + + if (desc.allow) { + schema = schema.allow(...this.build(desc.allow)); + } + + if (desc.invalid) { + schema = schema.invalid(...this.build(desc.invalid)); + } + + // Rules + + if (desc.rules) { + for (const rule of desc.rules) { + Assert(typeof schema[rule.name] === 'function', 'Invalid rule', rule.name, 'for type', desc.type); + + const args = []; + if (rule.args) { + const built = {}; + for (const key in rule.args) { + built[key] = this.build(rule.args[key], { assign: key }); + } + + const keys = Object.keys(built); + const definition = def.rules[rule.name].args; + if (definition) { + Assert(keys.length <= definition.length, 'Invalid number of arguments for', desc.type, rule.name, '(expected up to', definition.length, ', found', keys.length, ')'); + for (const { name } of definition) { + args.push(built[name]); + } + } + else { + Assert(keys.length === 1, 'Invalid number of arguments for', desc.type, rule.name, '(expected up to 1, found', keys.length, ')'); + args.push(built[keys[0]]); + } 
+ } + + // Apply + + schema = schema[rule.name](...args); + + // Ruleset + + const options = {}; + for (const custom in def.modifiers) { + if (rule[custom] !== undefined) { + options[custom] = this.build(rule[custom]); + } + } + + if (Object.keys(options).length) { + schema = schema.rule(options); + } + } + } + + // Terms + + const terms = {}; + for (const key in desc) { + if (['allow', 'flags', 'invalid', 'whens', 'preferences', 'rules', 'type'].includes(key)) { + continue; + } + + Assert(def.terms[key], 'Term', key, 'missing configuration'); + const manifest = def.terms[key].manifest; + + if (manifest === 'schema') { + terms[key] = desc[key].map((item) => this.parse(item)); + continue; + } + + if (manifest === 'values') { + terms[key] = desc[key].map((item) => this.build(item)); + continue; + } + + if (manifest === 'single') { + terms[key] = this.build(desc[key]); + continue; + } + + if (typeof manifest === 'object') { + terms[key] = {}; + for (const name in desc[key]) { + const value = desc[key][name]; + terms[key][name] = this.parse(value); + } + + continue; + } + + terms[key] = this.build(desc[key]); + } + + if (desc.whens) { + terms.whens = desc.whens.map((when) => this.build(when)); + } + + schema = def.manifest.build(schema, terms); + schema.$_temp.ruleset = false; + return schema; + } + + build(desc, options = {}) { + + if (desc === null) { + return null; + } + + if (Array.isArray(desc)) { + return desc.map((item) => this.build(item)); + } + + if (desc instanceof Error) { + return desc; + } + + if (options.assign === 'options') { + return Clone(desc); + } + + if (options.assign === 'regex') { + return internals.regex(desc); + } + + if (options.assign === 'ref') { + return Ref.build(desc); + } + + if (typeof desc !== 'object') { + return desc; + } + + if (Object.keys(desc).length === 1) { + if (desc.buffer) { + Assert(Buffer, 'Buffers are not supported'); + return Buffer && Buffer.from(desc.buffer, 'binary'); // $lab:coverage:ignore$ + } + + if (desc.function) { + return { [Common.symbols.literal]: true, literal: desc.function }; + } + + if (desc.override) { + return Common.symbols.override; + } + + if (desc.ref) { + return Ref.build(desc.ref); + } + + if (desc.regex) { + return internals.regex(desc.regex); + } + + if (desc.special) { + Assert(['deep'].includes(desc.special), 'Unknown special value', desc.special); + return Common.symbols.deepDefault; + } + + if (desc.value) { + return Clone(desc.value); + } + } + + if (desc.type) { + return this.parse(desc); + } + + if (desc.template) { + return Template.build(desc); + } + + const normalized = {}; + for (const key in desc) { + normalized[key] = this.build(desc[key], { assign: key }); + } + + return normalized; + } +}; + + +internals.regex = function (string) { + + const end = string.lastIndexOf('/'); + const exp = string.slice(1, end); + const flags = string.slice(end + 1); + return new RegExp(exp, flags); +}; + + +internals.validate = function (joi, desc) { + + Schemas = Schemas || require('./schemas'); + + joi.assert(desc, Schemas.description); +}; diff --git a/node_modules/joi/lib/messages.js b/node_modules/joi/lib/messages.js new file mode 100644 index 000000000..f719779b2 --- /dev/null +++ b/node_modules/joi/lib/messages.js @@ -0,0 +1,178 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); + +const Template = require('./template'); + + +const internals = {}; + + +exports.compile = function (messages, target) { + + // Single value string ('plain error message', 
'template {error} message') + + if (typeof messages === 'string') { + Assert(!target, 'Cannot set single message string'); + return new Template(messages); + } + + // Single value template + + if (Template.isTemplate(messages)) { + Assert(!target, 'Cannot set single message template'); + return messages; + } + + // By error code { 'number.min': } + + Assert(typeof messages === 'object' && !Array.isArray(messages), 'Invalid message options'); + + target = target ? Clone(target) : {}; + + for (let code in messages) { + const message = messages[code]; + + if (code === 'root' || + Template.isTemplate(message)) { + + target[code] = message; + continue; + } + + if (typeof message === 'string') { + target[code] = new Template(message); + continue; + } + + // By language { english: { 'number.min': } } + + Assert(typeof message === 'object' && !Array.isArray(message), 'Invalid message for', code); + + const language = code; + target[language] = target[language] || {}; + + for (code in message) { + const localized = message[code]; + + if (code === 'root' || + Template.isTemplate(localized)) { + + target[language][code] = localized; + continue; + } + + Assert(typeof localized === 'string', 'Invalid message for', code, 'in', language); + target[language][code] = new Template(localized); + } + } + + return target; +}; + + +exports.decompile = function (messages) { + + // By error code { 'number.min': } + + const target = {}; + for (let code in messages) { + const message = messages[code]; + + if (code === 'root') { + target[code] = message; + continue; + } + + if (Template.isTemplate(message)) { + target[code] = message.describe({ compact: true }); + continue; + } + + // By language { english: { 'number.min': } } + + const language = code; + target[language] = {}; + + for (code in message) { + const localized = message[code]; + + if (code === 'root') { + target[language][code] = localized; + continue; + } + + target[language][code] = localized.describe({ compact: true }); + } + } + + return target; +}; + + +exports.merge = function (base, extended) { + + if (!base) { + return exports.compile(extended); + } + + if (!extended) { + return base; + } + + // Single value string + + if (typeof extended === 'string') { + return new Template(extended); + } + + // Single value template + + if (Template.isTemplate(extended)) { + return extended; + } + + // By error code { 'number.min': } + + const target = Clone(base); + + for (let code in extended) { + const message = extended[code]; + + if (code === 'root' || + Template.isTemplate(message)) { + + target[code] = message; + continue; + } + + if (typeof message === 'string') { + target[code] = new Template(message); + continue; + } + + // By language { english: { 'number.min': } } + + Assert(typeof message === 'object' && !Array.isArray(message), 'Invalid message for', code); + + const language = code; + target[language] = target[language] || {}; + + for (code in message) { + const localized = message[code]; + + if (code === 'root' || + Template.isTemplate(localized)) { + + target[language][code] = localized; + continue; + } + + Assert(typeof localized === 'string', 'Invalid message for', code, 'in', language); + target[language][code] = new Template(localized); + } + } + + return target; +}; diff --git a/node_modules/joi/lib/modify.js b/node_modules/joi/lib/modify.js new file mode 100644 index 000000000..6f1484847 --- /dev/null +++ b/node_modules/joi/lib/modify.js @@ -0,0 +1,267 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Common 
= require('./common'); +const Ref = require('./ref'); + + +const internals = {}; + + + +exports.Ids = internals.Ids = class { + + constructor() { + + this._byId = new Map(); + this._byKey = new Map(); + this._schemaChain = false; + } + + clone() { + + const clone = new internals.Ids(); + clone._byId = new Map(this._byId); + clone._byKey = new Map(this._byKey); + clone._schemaChain = this._schemaChain; + return clone; + } + + concat(source) { + + if (source._schemaChain) { + this._schemaChain = true; + } + + for (const [id, value] of source._byId.entries()) { + Assert(!this._byKey.has(id), 'Schema id conflicts with existing key:', id); + this._byId.set(id, value); + } + + for (const [key, value] of source._byKey.entries()) { + Assert(!this._byId.has(key), 'Schema key conflicts with existing id:', key); + this._byKey.set(key, value); + } + } + + fork(path, adjuster, root) { + + const chain = this._collect(path); + chain.push({ schema: root }); + const tail = chain.shift(); + let adjusted = { id: tail.id, schema: adjuster(tail.schema) }; + + Assert(Common.isSchema(adjusted.schema), 'adjuster function failed to return a joi schema type'); + + for (const node of chain) { + adjusted = { id: node.id, schema: internals.fork(node.schema, adjusted.id, adjusted.schema) }; + } + + return adjusted.schema; + } + + labels(path, behind = []) { + + const current = path[0]; + const node = this._get(current); + if (!node) { + return [...behind, ...path].join('.'); + } + + const forward = path.slice(1); + behind = [...behind, node.schema._flags.label || current]; + if (!forward.length) { + return behind.join('.'); + } + + return node.schema._ids.labels(forward, behind); + } + + reach(path, behind = []) { + + const current = path[0]; + const node = this._get(current); + Assert(node, 'Schema does not contain path', [...behind, ...path].join('.')); + + const forward = path.slice(1); + if (!forward.length) { + return node.schema; + } + + return node.schema._ids.reach(forward, [...behind, current]); + } + + register(schema, { key } = {}) { + + if (!schema || + !Common.isSchema(schema)) { + + return; + } + + if (schema.$_property('schemaChain') || + schema._ids._schemaChain) { + + this._schemaChain = true; + } + + const id = schema._flags.id; + if (id) { + const existing = this._byId.get(id); + Assert(!existing || existing.schema === schema, 'Cannot add different schemas with the same id:', id); + Assert(!this._byKey.has(id), 'Schema id conflicts with existing key:', id); + + this._byId.set(id, { schema, id }); + } + + if (key) { + Assert(!this._byKey.has(key), 'Schema already contains key:', key); + Assert(!this._byId.has(key), 'Schema key conflicts with existing id:', key); + + this._byKey.set(key, { schema, id: key }); + } + } + + reset() { + + this._byId = new Map(); + this._byKey = new Map(); + this._schemaChain = false; + } + + _collect(path, behind = [], nodes = []) { + + const current = path[0]; + const node = this._get(current); + Assert(node, 'Schema does not contain path', [...behind, ...path].join('.')); + + nodes = [node, ...nodes]; + + const forward = path.slice(1); + if (!forward.length) { + return nodes; + } + + return node.schema._ids._collect(forward, [...behind, current], nodes); + } + + _get(id) { + + return this._byId.get(id) || this._byKey.get(id); + } +}; + + +internals.fork = function (schema, id, replacement) { + + const each = (item, { key }) => { + + if (id === (item._flags.id || key)) { + return replacement; + } + }; + + const obj = exports.schema(schema, { each, ref: false }); + return 
obj ? obj.$_mutateRebuild() : schema; +}; + + +exports.schema = function (schema, options) { + + let obj; + + for (const name in schema._flags) { + if (name[0] === '_') { + continue; + } + + const result = internals.scan(schema._flags[name], { source: 'flags', name }, options); + if (result !== undefined) { + obj = obj || schema.clone(); + obj._flags[name] = result; + } + } + + for (let i = 0; i < schema._rules.length; ++i) { + const rule = schema._rules[i]; + const result = internals.scan(rule.args, { source: 'rules', name: rule.name }, options); + if (result !== undefined) { + obj = obj || schema.clone(); + const clone = Object.assign({}, rule); + clone.args = result; + obj._rules[i] = clone; + + const existingUnique = obj._singleRules.get(rule.name); + if (existingUnique === rule) { + obj._singleRules.set(rule.name, clone); + } + } + } + + for (const name in schema.$_terms) { + if (name[0] === '_') { + continue; + } + + const result = internals.scan(schema.$_terms[name], { source: 'terms', name }, options); + if (result !== undefined) { + obj = obj || schema.clone(); + obj.$_terms[name] = result; + } + } + + return obj; +}; + + +internals.scan = function (item, source, options, _path, _key) { + + const path = _path || []; + + if (item === null || + typeof item !== 'object') { + + return; + } + + let clone; + + if (Array.isArray(item)) { + for (let i = 0; i < item.length; ++i) { + const key = source.source === 'terms' && source.name === 'keys' && item[i].key; + const result = internals.scan(item[i], source, options, [i, ...path], key); + if (result !== undefined) { + clone = clone || item.slice(); + clone[i] = result; + } + } + + return clone; + } + + if (options.schema !== false && Common.isSchema(item) || + options.ref !== false && Ref.isRef(item)) { + + const result = options.each(item, { ...source, path, key: _key }); + if (result === item) { + return; + } + + return result; + } + + for (const key in item) { + if (key[0] === '_') { + continue; + } + + const result = internals.scan(item[key], source, options, [key, ...path], _key); + if (result !== undefined) { + clone = clone || Object.assign({}, item); + clone[key] = result; + } + } + + return clone; +}; diff --git a/node_modules/joi/lib/ref.js b/node_modules/joi/lib/ref.js new file mode 100644 index 000000000..9f84a7b6d --- /dev/null +++ b/node_modules/joi/lib/ref.js @@ -0,0 +1,414 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Common = require('./common'); + +let Template; + + +const internals = { + symbol: Symbol('ref'), // Used to internally identify references (shared with other joi versions) + defaults: { + adjust: null, + in: false, + iterables: null, + map: null, + separator: '.', + type: 'value' + } +}; + + +exports.create = function (key, options = {}) { + + Assert(typeof key === 'string', 'Invalid reference key:', key); + Common.assertOptions(options, ['adjust', 'ancestor', 'in', 'iterables', 'map', 'prefix', 'render', 'separator']); + Assert(!options.prefix || typeof options.prefix === 'object', 'options.prefix must be of type object'); + + const ref = Object.assign({}, internals.defaults, options); + delete ref.prefix; + + const separator = ref.separator; + const context = internals.context(key, separator, options.prefix); + ref.type = context.type; + key = context.key; + + if (ref.type === 'value') { + if (context.root) { + Assert(!separator || key[0] !== separator, 'Cannot specify relative 
path with root prefix'); + ref.ancestor = 'root'; + if (!key) { + key = null; + } + } + + if (separator && + separator === key) { + + key = null; + ref.ancestor = 0; + } + else { + if (ref.ancestor !== undefined) { + Assert(!separator || !key || key[0] !== separator, 'Cannot combine prefix with ancestor option'); + } + else { + const [ancestor, slice] = internals.ancestor(key, separator); + if (slice) { + key = key.slice(slice); + if (key === '') { + key = null; + } + } + + ref.ancestor = ancestor; + } + } + } + + ref.path = separator ? (key === null ? [] : key.split(separator)) : [key]; + + return new internals.Ref(ref); +}; + + +exports.in = function (key, options = {}) { + + return exports.create(key, { ...options, in: true }); +}; + + +exports.isRef = function (ref) { + + return ref ? !!ref[Common.symbols.ref] : false; +}; + + +internals.Ref = class { + + constructor(options) { + + Assert(typeof options === 'object', 'Invalid reference construction'); + Common.assertOptions(options, [ + 'adjust', 'ancestor', 'in', 'iterables', 'map', 'path', 'render', 'separator', 'type', // Copied + 'depth', 'key', 'root', 'display' // Overridden + ]); + + Assert([false, undefined].includes(options.separator) || typeof options.separator === 'string' && options.separator.length === 1, 'Invalid separator'); + Assert(!options.adjust || typeof options.adjust === 'function', 'options.adjust must be a function'); + Assert(!options.map || Array.isArray(options.map), 'options.map must be an array'); + Assert(!options.map || !options.adjust, 'Cannot set both map and adjust options'); + + Object.assign(this, internals.defaults, options); + + Assert(this.type === 'value' || this.ancestor === undefined, 'Non-value references cannot reference ancestors'); + + if (Array.isArray(this.map)) { + this.map = new Map(this.map); + } + + this.depth = this.path.length; + this.key = this.path.length ? 
this.path.join(this.separator) : null; + this.root = this.path[0]; + + this.updateDisplay(); + } + + resolve(value, state, prefs, local, options = {}) { + + Assert(!this.in || options.in, 'Invalid in() reference usage'); + + if (this.type === 'global') { + return this._resolve(prefs.context, state, options); + } + + if (this.type === 'local') { + return this._resolve(local, state, options); + } + + if (!this.ancestor) { + return this._resolve(value, state, options); + } + + if (this.ancestor === 'root') { + return this._resolve(state.ancestors[state.ancestors.length - 1], state, options); + } + + Assert(this.ancestor <= state.ancestors.length, 'Invalid reference exceeds the schema root:', this.display); + return this._resolve(state.ancestors[this.ancestor - 1], state, options); + } + + _resolve(target, state, options) { + + let resolved; + + if (this.type === 'value' && + state.mainstay.shadow && + options.shadow !== false) { + + resolved = state.mainstay.shadow.get(this.absolute(state)); + } + + if (resolved === undefined) { + resolved = Reach(target, this.path, { iterables: this.iterables, functions: true }); + } + + if (this.adjust) { + resolved = this.adjust(resolved); + } + + if (this.map) { + const mapped = this.map.get(resolved); + if (mapped !== undefined) { + resolved = mapped; + } + } + + if (state.mainstay) { + state.mainstay.tracer.resolve(state, this, resolved); + } + + return resolved; + } + + toString() { + + return this.display; + } + + absolute(state) { + + return [...state.path.slice(0, -this.ancestor), ...this.path]; + } + + clone() { + + return new internals.Ref(this); + } + + describe() { + + const ref = { path: this.path }; + + if (this.type !== 'value') { + ref.type = this.type; + } + + if (this.separator !== '.') { + ref.separator = this.separator; + } + + if (this.type === 'value' && + this.ancestor !== 1) { + + ref.ancestor = this.ancestor; + } + + if (this.map) { + ref.map = [...this.map]; + } + + for (const key of ['adjust', 'iterables', 'render']) { + if (this[key] !== null && + this[key] !== undefined) { + + ref[key] = this[key]; + } + } + + if (this.in !== false) { + ref.in = true; + } + + return { ref }; + } + + updateDisplay() { + + const key = this.key !== null ? this.key : ''; + if (this.type !== 'value') { + this.display = `ref:${this.type}:${key}`; + return; + } + + if (!this.separator) { + this.display = `ref:${key}`; + return; + } + + if (!this.ancestor) { + this.display = `ref:${this.separator}${key}`; + return; + } + + if (this.ancestor === 'root') { + this.display = `ref:root:${key}`; + return; + } + + if (this.ancestor === 1) { + this.display = `ref:${key || '..'}`; + return; + } + + const lead = new Array(this.ancestor + 1).fill(this.separator).join(''); + this.display = `ref:${lead}${key || ''}`; + } +}; + + +internals.Ref.prototype[Common.symbols.ref] = true; + + +exports.build = function (desc) { + + desc = Object.assign({}, internals.defaults, desc); + if (desc.type === 'value' && + desc.ancestor === undefined) { + + desc.ancestor = 1; + } + + return new internals.Ref(desc); +}; + + +internals.context = function (key, separator, prefix = {}) { + + key = key.trim(); + + if (prefix) { + const globalp = prefix.global === undefined ? '$' : prefix.global; + if (globalp !== separator && + key.startsWith(globalp)) { + + return { key: key.slice(globalp.length), type: 'global' }; + } + + const local = prefix.local === undefined ? 
'#' : prefix.local; + if (local !== separator && + key.startsWith(local)) { + + return { key: key.slice(local.length), type: 'local' }; + } + + const root = prefix.root === undefined ? '/' : prefix.root; + if (root !== separator && + key.startsWith(root)) { + + return { key: key.slice(root.length), type: 'value', root: true }; + } + } + + return { key, type: 'value' }; +}; + + +internals.ancestor = function (key, separator) { + + if (!separator) { + return [1, 0]; // 'a_b' -> 1 (parent) + } + + if (key[0] !== separator) { // 'a.b' -> 1 (parent) + return [1, 0]; + } + + if (key[1] !== separator) { // '.a.b' -> 0 (self) + return [0, 1]; + } + + let i = 2; + while (key[i] === separator) { + ++i; + } + + return [i - 1, i]; // '...a.b.' -> 2 (grandparent) +}; + + +exports.toSibling = 0; + +exports.toParent = 1; + + +exports.Manager = class { + + constructor() { + + this.refs = []; // 0: [self refs], 1: [parent refs], 2: [grandparent refs], ... + } + + register(source, target) { + + if (!source) { + return; + } + + target = target === undefined ? exports.toParent : target; + + // Array + + if (Array.isArray(source)) { + for (const ref of source) { + this.register(ref, target); + } + + return; + } + + // Schema + + if (Common.isSchema(source)) { + for (const item of source._refs.refs) { + if (item.ancestor - target >= 0) { + this.refs.push({ ancestor: item.ancestor - target, root: item.root }); + } + } + + return; + } + + // Reference + + if (exports.isRef(source) && + source.type === 'value' && + source.ancestor - target >= 0) { + + this.refs.push({ ancestor: source.ancestor - target, root: source.root }); + } + + // Template + + Template = Template || require('./template'); + + if (Template.isTemplate(source)) { + this.register(source.refs(), target); + } + } + + get length() { + + return this.refs.length; + } + + clone() { + + const copy = new exports.Manager(); + copy.refs = Clone(this.refs); + return copy; + } + + reset() { + + this.refs = []; + } + + roots() { + + return this.refs.filter((ref) => !ref.ancestor).map((ref) => ref.root); + } +}; diff --git a/node_modules/joi/lib/schemas.js b/node_modules/joi/lib/schemas.js new file mode 100644 index 000000000..ab56109c3 --- /dev/null +++ b/node_modules/joi/lib/schemas.js @@ -0,0 +1,301 @@ +'use strict'; + +const Joi = require('./index'); + + +const internals = {}; + + +// Preferences + +internals.wrap = Joi.string() + .min(1) + .max(2) + .allow(false); + + +exports.preferences = Joi.object({ + allowUnknown: Joi.boolean(), + abortEarly: Joi.boolean(), + artifacts: Joi.boolean(), + cache: Joi.boolean(), + context: Joi.object(), + convert: Joi.boolean(), + dateFormat: Joi.valid('date', 'iso', 'string', 'time', 'utc'), + debug: Joi.boolean(), + errors: { + escapeHtml: Joi.boolean(), + label: Joi.valid('path', 'key', false), + language: [ + Joi.string(), + Joi.object().ref() + ], + render: Joi.boolean(), + stack: Joi.boolean(), + wrap: { + label: internals.wrap, + array: internals.wrap + } + }, + externals: Joi.boolean(), + messages: Joi.object(), + noDefaults: Joi.boolean(), + nonEnumerables: Joi.boolean(), + presence: Joi.valid('required', 'optional', 'forbidden'), + skipFunctions: Joi.boolean(), + stripUnknown: Joi.object({ + arrays: Joi.boolean(), + objects: Joi.boolean() + }) + .or('arrays', 'objects') + .allow(true, false), + warnings: Joi.boolean() +}) + .strict(); + + +// Extensions + +internals.nameRx = /^[a-zA-Z0-9]\w*$/; + + +internals.rule = Joi.object({ + alias: Joi.array().items(Joi.string().pattern(internals.nameRx)).single(), + 
args: Joi.array().items( + Joi.string(), + Joi.object({ + name: Joi.string().pattern(internals.nameRx).required(), + ref: Joi.boolean(), + assert: Joi.alternatives([ + Joi.function(), + Joi.object().schema() + ]) + .conditional('ref', { is: true, then: Joi.required() }), + normalize: Joi.function(), + message: Joi.string().when('assert', { is: Joi.function(), then: Joi.required() }) + }) + ), + convert: Joi.boolean(), + manifest: Joi.boolean(), + method: Joi.function().allow(false), + multi: Joi.boolean(), + validate: Joi.function() +}); + + +exports.extension = Joi.object({ + type: Joi.alternatives([ + Joi.string(), + Joi.object().regex() + ]) + .required(), + args: Joi.function(), + cast: Joi.object().pattern(internals.nameRx, Joi.object({ + from: Joi.function().maxArity(1).required(), + to: Joi.function().minArity(1).maxArity(2).required() + })), + base: Joi.object().schema() + .when('type', { is: Joi.object().regex(), then: Joi.forbidden() }), + coerce: [ + Joi.function().maxArity(3), + Joi.object({ method: Joi.function().maxArity(3).required(), from: Joi.array().items(Joi.string()).single() }) + ], + flags: Joi.object().pattern(internals.nameRx, Joi.object({ + setter: Joi.string(), + default: Joi.any() + })), + manifest: { + build: Joi.function().arity(2) + }, + messages: [Joi.object(), Joi.string()], + modifiers: Joi.object().pattern(internals.nameRx, Joi.function().minArity(1).maxArity(2)), + overrides: Joi.object().pattern(internals.nameRx, Joi.function()), + prepare: Joi.function().maxArity(3), + rebuild: Joi.function().arity(1), + rules: Joi.object().pattern(internals.nameRx, internals.rule), + terms: Joi.object().pattern(internals.nameRx, Joi.object({ + init: Joi.array().allow(null).required(), + manifest: Joi.object().pattern(/.+/, [ + Joi.valid('schema', 'single'), + Joi.object({ + mapped: Joi.object({ + from: Joi.string().required(), + to: Joi.string().required() + }) + .required() + }) + ]) + })), + validate: Joi.function().maxArity(3) +}) + .strict(); + + +exports.extensions = Joi.array().items(Joi.object(), Joi.function().arity(1)).strict(); + + +// Manifest + +internals.desc = { + + buffer: Joi.object({ + buffer: Joi.string() + }), + + func: Joi.object({ + function: Joi.function().required(), + options: { + literal: true + } + }), + + override: Joi.object({ + override: true + }), + + ref: Joi.object({ + ref: Joi.object({ + type: Joi.valid('value', 'global', 'local'), + path: Joi.array().required(), + separator: Joi.string().length(1).allow(false), + ancestor: Joi.number().min(0).integer().allow('root'), + map: Joi.array().items(Joi.array().length(2)).min(1), + adjust: Joi.function(), + iterables: Joi.boolean(), + in: Joi.boolean(), + render: Joi.boolean() + }) + .required() + }), + + regex: Joi.object({ + regex: Joi.string().min(3) + }), + + special: Joi.object({ + special: Joi.valid('deep').required() + }), + + template: Joi.object({ + template: Joi.string().required(), + options: Joi.object() + }), + + value: Joi.object({ + value: Joi.alternatives([Joi.object(), Joi.array()]).required() + }) +}; + + +internals.desc.entity = Joi.alternatives([ + Joi.array().items(Joi.link('...')), + Joi.boolean(), + Joi.function(), + Joi.number(), + Joi.string(), + internals.desc.buffer, + internals.desc.func, + internals.desc.ref, + internals.desc.regex, + internals.desc.special, + internals.desc.template, + internals.desc.value, + Joi.link('/') +]); + + +internals.desc.values = Joi.array() + .items( + null, + Joi.boolean(), + Joi.function(), + Joi.number().allow(Infinity, -Infinity), + 
Joi.string().allow(''), + Joi.symbol(), + internals.desc.buffer, + internals.desc.func, + internals.desc.override, + internals.desc.ref, + internals.desc.regex, + internals.desc.template, + internals.desc.value + ); + + +internals.desc.messages = Joi.object() + .pattern(/.+/, [ + Joi.string(), + internals.desc.template, + Joi.object().pattern(/.+/, [Joi.string(), internals.desc.template]) + ]); + + +exports.description = Joi.object({ + type: Joi.string().required(), + flags: Joi.object({ + cast: Joi.string(), + default: Joi.any(), + description: Joi.string(), + empty: Joi.link('/'), + failover: internals.desc.entity, + id: Joi.string(), + label: Joi.string(), + only: true, + presence: ['optional', 'required', 'forbidden'], + result: ['raw', 'strip'], + strip: Joi.boolean(), + unit: Joi.string() + }) + .unknown(), + preferences: { + allowUnknown: Joi.boolean(), + abortEarly: Joi.boolean(), + artifacts: Joi.boolean(), + cache: Joi.boolean(), + convert: Joi.boolean(), + dateFormat: ['date', 'iso', 'string', 'time', 'utc'], + errors: { + escapeHtml: Joi.boolean(), + label: ['path', 'key'], + language: [ + Joi.string(), + internals.desc.ref + ], + wrap: { + label: internals.wrap, + array: internals.wrap + } + }, + externals: Joi.boolean(), + messages: internals.desc.messages, + noDefaults: Joi.boolean(), + nonEnumerables: Joi.boolean(), + presence: ['required', 'optional', 'forbidden'], + skipFunctions: Joi.boolean(), + stripUnknown: Joi.object({ + arrays: Joi.boolean(), + objects: Joi.boolean() + }) + .or('arrays', 'objects') + .allow(true, false), + warnings: Joi.boolean() + }, + allow: internals.desc.values, + invalid: internals.desc.values, + rules: Joi.array().min(1).items({ + name: Joi.string().required(), + args: Joi.object().min(1), + keep: Joi.boolean(), + message: [ + Joi.string(), + internals.desc.messages + ], + warn: Joi.boolean() + }), + + // Terms + + keys: Joi.object().pattern(/.*/, Joi.link('/')), + link: internals.desc.ref +}) + .pattern(/^[a-z]\w*$/, Joi.any()); diff --git a/node_modules/joi/lib/state.js b/node_modules/joi/lib/state.js new file mode 100644 index 000000000..8db251b76 --- /dev/null +++ b/node_modules/joi/lib/state.js @@ -0,0 +1,152 @@ +'use strict'; + +const Clone = require('@hapi/hoek/lib/clone'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Common = require('./common'); + + +const internals = { + value: Symbol('value') +}; + + +module.exports = internals.State = class { + + constructor(path, ancestors, state) { + + this.path = path; + this.ancestors = ancestors; // [parent, ..., root] + + this.mainstay = state.mainstay; + this.schemas = state.schemas; // [current, ..., root] + this.debug = null; + } + + localize(path, ancestors = null, schema = null) { + + const state = new internals.State(path, ancestors, this); + + if (schema && + state.schemas) { + + state.schemas = [internals.schemas(schema), ...state.schemas]; + } + + return state; + } + + nest(schema, debug) { + + const state = new internals.State(this.path, this.ancestors, this); + state.schemas = state.schemas && [internals.schemas(schema), ...state.schemas]; + state.debug = debug; + return state; + } + + shadow(value, reason) { + + this.mainstay.shadow = this.mainstay.shadow || new internals.Shadow(); + this.mainstay.shadow.set(this.path, value, reason); + } + + snapshot() { + + if (this.mainstay.shadow) { + this._snapshot = Clone(this.mainstay.shadow.node(this.path)); + } + } + + restore() { + + if (this.mainstay.shadow) { + this.mainstay.shadow.override(this.path, this._snapshot); + 
this._snapshot = undefined; + } + } +}; + + +internals.schemas = function (schema) { + + if (Common.isSchema(schema)) { + return { schema }; + } + + return schema; +}; + + +internals.Shadow = class { + + constructor() { + + this._values = null; + } + + set(path, value, reason) { + + if (!path.length) { // No need to store root value + return; + } + + if (reason === 'strip' && + typeof path[path.length - 1] === 'number') { // Cannot store stripped array values (due to shift) + + return; + } + + this._values = this._values || new Map(); + + let node = this._values; + for (let i = 0; i < path.length; ++i) { + const segment = path[i]; + let next = node.get(segment); + if (!next) { + next = new Map(); + node.set(segment, next); + } + + node = next; + } + + node[internals.value] = value; + } + + get(path) { + + const node = this.node(path); + if (node) { + return node[internals.value]; + } + } + + node(path) { + + if (!this._values) { + return; + } + + return Reach(this._values, path, { iterables: true }); + } + + override(path, node) { + + if (!this._values) { + return; + } + + const parents = path.slice(0, -1); + const own = path[path.length - 1]; + const parent = Reach(this._values, parents, { iterables: true }); + + if (node) { + parent.set(own, node); + return; + } + + if (parent) { + parent.delete(own); + } + } +}; diff --git a/node_modules/joi/lib/template.js b/node_modules/joi/lib/template.js new file mode 100644 index 000000000..c21eb70e6 --- /dev/null +++ b/node_modules/joi/lib/template.js @@ -0,0 +1,427 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const EscapeHtml = require('@hapi/hoek/lib/escapeHtml'); +const Formula = require('@sideway/formula'); + +const Common = require('./common'); +const Errors = require('./errors'); +const Ref = require('./ref'); + + +const internals = { + symbol: Symbol('template'), + + opens: new Array(1000).join('\u0000'), + closes: new Array(1000).join('\u0001'), + + dateFormat: { + date: Date.prototype.toDateString, + iso: Date.prototype.toISOString, + string: Date.prototype.toString, + time: Date.prototype.toTimeString, + utc: Date.prototype.toUTCString + } +}; + + +module.exports = exports = internals.Template = class { + + constructor(source, options) { + + Assert(typeof source === 'string', 'Template source must be a string'); + Assert(!source.includes('\u0000') && !source.includes('\u0001'), 'Template source cannot contain reserved control characters'); + + this.source = source; + this.rendered = source; + + this._template = null; + this._settings = Clone(options); + + this._parse(); + } + + _parse() { + + // 'text {raw} {{ref}} \\{{ignore}} {{ignore\\}} {{ignore {{ignore}' + + if (!this.source.includes('{')) { + return; + } + + // Encode escaped \\{{{{{ + + const encoded = internals.encode(this.source); + + // Split on first { in each set + + const parts = internals.split(encoded); + + // Process parts + + let refs = false; + const processed = []; + const head = parts.shift(); + if (head) { + processed.push(head); + } + + for (const part of parts) { + const raw = part[0] !== '{'; + const ender = raw ? '}' : '}}'; + const end = part.indexOf(ender); + if (end === -1 || // Ignore non-matching closing + part[1] === '{') { // Ignore more than two { + + processed.push(`{${internals.decode(part)}`); + continue; + } + + let variable = part.slice(raw ? 
0 : 1, end); + const wrapped = variable[0] === ':'; + if (wrapped) { + variable = variable.slice(1); + } + + const dynamic = this._ref(internals.decode(variable), { raw, wrapped }); + processed.push(dynamic); + if (typeof dynamic !== 'string') { + refs = true; + } + + const rest = part.slice(end + ender.length); + if (rest) { + processed.push(internals.decode(rest)); + } + } + + if (!refs) { + this.rendered = processed.join(''); + return; + } + + this._template = processed; + } + + static date(date, prefs) { + + return internals.dateFormat[prefs.dateFormat].call(date); + } + + describe(options = {}) { + + if (!this._settings && + options.compact) { + + return this.source; + } + + const desc = { template: this.source }; + if (this._settings) { + desc.options = this._settings; + } + + return desc; + } + + static build(desc) { + + return new internals.Template(desc.template, desc.options); + } + + isDynamic() { + + return !!this._template; + } + + static isTemplate(template) { + + return template ? !!template[Common.symbols.template] : false; + } + + refs() { + + if (!this._template) { + return; + } + + const refs = []; + for (const part of this._template) { + if (typeof part !== 'string') { + refs.push(...part.refs); + } + } + + return refs; + } + + resolve(value, state, prefs, local) { + + if (this._template && + this._template.length === 1) { + + return this._part(this._template[0], /* context -> [*/ value, state, prefs, local, {} /*] */); + } + + return this.render(value, state, prefs, local); + } + + _part(part, ...args) { + + if (part.ref) { + return part.ref.resolve(...args); + } + + return part.formula.evaluate(args); + } + + render(value, state, prefs, local, options = {}) { + + if (!this.isDynamic()) { + return this.rendered; + } + + const parts = []; + for (const part of this._template) { + if (typeof part === 'string') { + parts.push(part); + } + else { + const rendered = this._part(part, /* context -> [*/ value, state, prefs, local, options /*] */); + const string = internals.stringify(rendered, value, state, prefs, local, options); + if (string !== undefined) { + const result = part.raw || (options.errors && options.errors.escapeHtml) === false ? 
string : EscapeHtml(string); + parts.push(internals.wrap(result, part.wrapped && prefs.errors.wrap.label)); + } + } + } + + return parts.join(''); + } + + _ref(content, { raw, wrapped }) { + + const refs = []; + const reference = (variable) => { + + const ref = Ref.create(variable, this._settings); + refs.push(ref); + return (context) => ref.resolve(...context); + }; + + try { + var formula = new Formula.Parser(content, { reference, functions: internals.functions, constants: internals.constants }); + } + catch (err) { + err.message = `Invalid template variable "${content}" fails due to: ${err.message}`; + throw err; + } + + if (formula.single) { + if (formula.single.type === 'reference') { + const ref = refs[0]; + return { ref, raw, refs, wrapped: wrapped || ref.type === 'local' && ref.key === 'label' }; + } + + return internals.stringify(formula.single.value); + } + + return { formula, raw, refs }; + } + + toString() { + + return this.source; + } +}; + + +internals.Template.prototype[Common.symbols.template] = true; +internals.Template.prototype.isImmutable = true; // Prevents Hoek from deep cloning schema objects + + +internals.encode = function (string) { + + return string + .replace(/\\(\{+)/g, ($0, $1) => { + + return internals.opens.slice(0, $1.length); + }) + .replace(/\\(\}+)/g, ($0, $1) => { + + return internals.closes.slice(0, $1.length); + }); +}; + + +internals.decode = function (string) { + + return string + .replace(/\u0000/g, '{') + .replace(/\u0001/g, '}'); +}; + + +internals.split = function (string) { + + const parts = []; + let current = ''; + + for (let i = 0; i < string.length; ++i) { + const char = string[i]; + + if (char === '{') { + let next = ''; + while (i + 1 < string.length && + string[i + 1] === '{') { + + next += '{'; + ++i; + } + + parts.push(current); + current = next; + } + else { + current += char; + } + } + + parts.push(current); + return parts; +}; + + +internals.wrap = function (value, ends) { + + if (!ends) { + return value; + } + + if (ends.length === 1) { + return `${ends}${value}${ends}`; + } + + return `${ends[0]}${value}${ends[1]}`; +}; + + +internals.stringify = function (value, original, state, prefs, local, options) { + + const type = typeof value; + + let skipWrap = false; + if (Ref.isRef(value) && + value.render) { + + skipWrap = value.in; + value = value.resolve(original, state, prefs, local, { in: value.in, ...options }); + } + + if (value === null) { + return 'null'; + } + + if (type === 'string') { + return value; + } + + if (type === 'number' || + type === 'function' || + type === 'symbol') { + + return value.toString(); + } + + if (type !== 'object') { + return JSON.stringify(value); + } + + if (value instanceof Date) { + return internals.Template.date(value, prefs); + } + + if (value instanceof Map) { + const pairs = []; + for (const [key, sym] of value.entries()) { + pairs.push(`${key.toString()} -> ${sym.toString()}`); + } + + value = pairs; + } + + if (!Array.isArray(value)) { + return value.toString(); + } + + let partial = ''; + for (const item of value) { + partial = partial + (partial.length ? 
', ' : '') + internals.stringify(item, original, state, prefs, local, options); + } + + if (skipWrap) { + return partial; + } + + return internals.wrap(partial, prefs.errors.wrap.array); +}; + + +internals.constants = { + + true: true, + false: false, + null: null, + + second: 1000, + minute: 60 * 1000, + hour: 60 * 60 * 1000, + day: 24 * 60 * 60 * 1000 +}; + + +internals.functions = { + + if(condition, then, otherwise) { + + return condition ? then : otherwise; + }, + + msg(code) { + + const [value, state, prefs, local, options] = this; + const messages = options.messages; + if (!messages) { + return ''; + } + + const template = Errors.template(value, messages[0], code, state, prefs) || Errors.template(value, messages[1], code, state, prefs); + if (!template) { + return ''; + } + + return template.render(value, state, prefs, local, options); + }, + + number(value) { + + if (typeof value === 'number') { + return value; + } + + if (typeof value === 'string') { + return parseFloat(value); + } + + if (typeof value === 'boolean') { + return value ? 1 : 0; + } + + if (value instanceof Date) { + return value.getTime(); + } + + return null; + } +}; diff --git a/node_modules/joi/lib/trace.js b/node_modules/joi/lib/trace.js new file mode 100644 index 000000000..ded4d7e76 --- /dev/null +++ b/node_modules/joi/lib/trace.js @@ -0,0 +1,346 @@ +'use strict'; + +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); +const Pinpoint = require('@sideway/pinpoint'); + +const Errors = require('./errors'); + + +const internals = { + codes: { + error: 1, + pass: 2, + full: 3 + }, + labels: { + 0: 'never used', + 1: 'always error', + 2: 'always pass' + } +}; + + +exports.setup = function (root) { + + const trace = function () { + + root._tracer = root._tracer || new internals.Tracer(); + return root._tracer; + }; + + root.trace = trace; + root[Symbol.for('@hapi/lab/coverage/initialize')] = trace; + + root.untrace = () => { + + root._tracer = null; + }; +}; + + +exports.location = function (schema) { + + return schema.$_setFlag('_tracerLocation', Pinpoint.location(2)); // base.tracer(), caller +}; + + +internals.Tracer = class { + + constructor() { + + this.name = 'Joi'; + this._schemas = new Map(); + } + + _register(schema) { + + const existing = this._schemas.get(schema); + if (existing) { + return existing.store; + } + + const store = new internals.Store(schema); + const { filename, line } = schema._flags._tracerLocation || Pinpoint.location(5); // internals.tracer(), internals.entry(), exports.entry(), validate(), caller + this._schemas.set(schema, { filename, line, store }); + return store; + } + + _combine(merged, sources) { + + for (const { store } of this._schemas.values()) { + store._combine(merged, sources); + } + } + + report(file) { + + const coverage = []; + + // Process each registered schema + + for (const { filename, line, store } of this._schemas.values()) { + if (file && + file !== filename) { + + continue; + } + + // Process sub schemas of the registered root + + const missing = []; + const skipped = []; + + for (const [schema, log] of store._sources.entries()) { + + // Check if sub schema parent skipped + + if (internals.sub(log.paths, skipped)) { + continue; + } + + // Check if sub schema reached + + if (!log.entry) { + missing.push({ + status: 'never reached', + paths: [...log.paths] + }); + + skipped.push(...log.paths); + continue; + } + + // Check values + + for (const type of ['valid', 'invalid']) { + const set = schema[`_${type}s`]; + if (!set) { + continue; + } + + const values = new 
Set(set._values); + const refs = new Set(set._refs); + for (const { value, ref } of log[type]) { + values.delete(value); + refs.delete(ref); + } + + if (values.size || + refs.size) { + + missing.push({ + status: [...values, ...[...refs].map((ref) => ref.display)], + rule: `${type}s` + }); + } + } + + // Check rules status + + const rules = schema._rules.map((rule) => rule.name); + for (const type of ['default', 'failover']) { + if (schema._flags[type] !== undefined) { + rules.push(type); + } + } + + for (const name of rules) { + const status = internals.labels[log.rule[name] || 0]; + if (status) { + const report = { rule: name, status }; + if (log.paths.size) { + report.paths = [...log.paths]; + } + + missing.push(report); + } + } + } + + if (missing.length) { + coverage.push({ + filename, + line, + missing, + severity: 'error', + message: `Schema missing tests for ${missing.map(internals.message).join(', ')}` + }); + } + } + + return coverage.length ? coverage : null; + } +}; + + +internals.Store = class { + + constructor(schema) { + + this.active = true; + this._sources = new Map(); // schema -> { paths, entry, rule, valid, invalid } + this._combos = new Map(); // merged -> [sources] + this._scan(schema); + } + + debug(state, source, name, result) { + + state.mainstay.debug && state.mainstay.debug.push({ type: source, name, result, path: state.path }); + } + + entry(schema, state) { + + internals.debug(state, { type: 'entry' }); + + this._record(schema, (log) => { + + log.entry = true; + }); + } + + filter(schema, state, source, value) { + + internals.debug(state, { type: source, ...value }); + + this._record(schema, (log) => { + + log[source].add(value); + }); + } + + log(schema, state, source, name, result) { + + internals.debug(state, { type: source, name, result: result === 'full' ? 'pass' : result }); + + this._record(schema, (log) => { + + log[source][name] = log[source][name] || 0; + log[source][name] |= internals.codes[result]; + }); + } + + resolve(state, ref, to) { + + if (!state.mainstay.debug) { + return; + } + + const log = { type: 'resolve', ref: ref.display, to, path: state.path }; + state.mainstay.debug.push(log); + } + + value(state, by, from, to, name) { + + if (!state.mainstay.debug || + DeepEqual(from, to)) { + + return; + } + + const log = { type: 'value', by, from, to, path: state.path }; + if (name) { + log.name = name; + } + + state.mainstay.debug.push(log); + } + + _record(schema, each) { + + const log = this._sources.get(schema); + if (log) { + each(log); + return; + } + + const sources = this._combos.get(schema); + for (const source of sources) { + this._record(source, each); + } + } + + _scan(schema, _path) { + + const path = _path || []; + + let log = this._sources.get(schema); + if (!log) { + log = { + paths: new Set(), + entry: false, + rule: {}, + valid: new Set(), + invalid: new Set() + }; + + this._sources.set(schema, log); + } + + if (path.length) { + log.paths.add(path); + } + + const each = (sub, source) => { + + const subId = internals.id(sub, source); + this._scan(sub, path.concat(subId)); + }; + + schema.$_modify({ each, ref: false }); + } + + _combine(merged, sources) { + + this._combos.set(merged, sources); + } +}; + + +internals.message = function (item) { + + const path = item.paths ? Errors.path(item.paths[0]) + (item.rule ? 
':' : '') : ''; + return `${path}${item.rule || ''} (${item.status})`; +}; + + +internals.id = function (schema, { source, name, path, key }) { + + if (schema._flags.id) { + return schema._flags.id; + } + + if (key) { + return key; + } + + name = `@${name}`; + + if (source === 'terms') { + return [name, path[Math.min(path.length - 1, 1)]]; + } + + return name; +}; + + +internals.sub = function (paths, skipped) { + + for (const path of paths) { + for (const skip of skipped) { + if (DeepEqual(path.slice(0, skip.length), skip)) { + return true; + } + } + } + + return false; +}; + + +internals.debug = function (state, event) { + + if (state.mainstay.debug) { + event.path = state.debug ? [...state.path, state.debug] : state.path; + state.mainstay.debug.push(event); + } +}; diff --git a/node_modules/joi/lib/types/alternatives.js b/node_modules/joi/lib/types/alternatives.js new file mode 100644 index 000000000..875f2b090 --- /dev/null +++ b/node_modules/joi/lib/types/alternatives.js @@ -0,0 +1,333 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Merge = require('@hapi/hoek/lib/merge'); + +const Any = require('./any'); +const Common = require('../common'); +const Compile = require('../compile'); +const Errors = require('../errors'); +const Ref = require('../ref'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'alternatives', + + flags: { + + match: { default: 'any' } // 'any', 'one', 'all' + }, + + terms: { + + matches: { init: [], register: Ref.toSibling } + }, + + args(schema, ...schemas) { + + if (schemas.length === 1) { + if (Array.isArray(schemas[0])) { + return schema.try(...schemas[0]); + } + } + + return schema.try(...schemas); + }, + + validate(value, helpers) { + + const { schema, error, state, prefs } = helpers; + + // Match all or one + + if (schema._flags.match) { + const matched = []; + + for (let i = 0; i < schema.$_terms.matches.length; ++i) { + const item = schema.$_terms.matches[i]; + const localState = state.nest(item.schema, `match.${i}`); + localState.snapshot(); + + const result = item.schema.$_validate(value, localState, prefs); + if (!result.errors) { + matched.push(result.value); + } + else { + localState.restore(); + } + } + + if (matched.length === 0) { + return { errors: error('alternatives.any') }; + } + + if (schema._flags.match === 'one') { + return matched.length === 1 ? { value: matched[0] } : { errors: error('alternatives.one') }; + } + + if (matched.length !== schema.$_terms.matches.length) { + return { errors: error('alternatives.all') }; + } + + const allobj = schema.$_terms.matches.reduce((acc, v) => acc && v.schema.type === 'object', true); + return allobj ? { value: matched.reduce((acc, v) => Merge(acc, v, { mergeArrays: false })) } : { value: matched[matched.length - 1] }; + } + + // Match any + + const errors = []; + for (let i = 0; i < schema.$_terms.matches.length; ++i) { + const item = schema.$_terms.matches[i]; + + // Try + + if (item.schema) { + const localState = state.nest(item.schema, `match.${i}`); + localState.snapshot(); + + const result = item.schema.$_validate(value, localState, prefs); + if (!result.errors) { + return result; + } + + localState.restore(); + errors.push({ schema: item.schema, reports: result.errors }); + continue; + } + + // Conditional + + const input = item.ref ? item.ref.resolve(value, state, prefs) : value; + const tests = item.is ? 
[item] : item.switch; + + for (let j = 0; j < tests.length; ++j) { + const test = tests[j]; + const { is, then, otherwise } = test; + + const id = `match.${i}${item.switch ? '.' + j : ''}`; + if (!is.$_match(input, state.nest(is, `${id}.is`), prefs)) { + if (otherwise) { + return otherwise.$_validate(value, state.nest(otherwise, `${id}.otherwise`), prefs); + } + } + else if (then) { + return then.$_validate(value, state.nest(then, `${id}.then`), prefs); + } + } + } + + return internals.errors(errors, helpers); + }, + + rules: { + + conditional: { + method(condition, options) { + + Assert(!this._flags._endedSwitch, 'Unreachable condition'); + Assert(!this._flags.match, 'Cannot combine match mode', this._flags.match, 'with conditional rule'); + Assert(options.break === undefined, 'Cannot use break option with alternatives conditional'); + + const obj = this.clone(); + + const match = Compile.when(obj, condition, options); + const conditions = match.is ? [match] : match.switch; + for (const item of conditions) { + if (item.then && + item.otherwise) { + + obj.$_setFlag('_endedSwitch', true, { clone: false }); + break; + } + } + + obj.$_terms.matches.push(match); + return obj.$_mutateRebuild(); + } + }, + + match: { + method(mode) { + + Assert(['any', 'one', 'all'].includes(mode), 'Invalid alternatives match mode', mode); + + if (mode !== 'any') { + for (const match of this.$_terms.matches) { + Assert(match.schema, 'Cannot combine match mode', mode, 'with conditional rules'); + } + } + + return this.$_setFlag('match', mode); + } + }, + + try: { + method(...schemas) { + + Assert(schemas.length, 'Missing alternative schemas'); + Common.verifyFlat(schemas, 'try'); + + Assert(!this._flags._endedSwitch, 'Unreachable condition'); + + const obj = this.clone(); + for (const schema of schemas) { + obj.$_terms.matches.push({ schema: obj.$_compile(schema) }); + } + + return obj.$_mutateRebuild(); + } + } + }, + + overrides: { + + label(name) { + + const obj = this.$_parent('label', name); + const each = (item, source) => (source.path[0] !== 'is' ? 
item.label(name) : undefined); + return obj.$_modify({ each, ref: false }); + } + }, + + rebuild(schema) { + + // Flag when an alternative type is an array + + const each = (item) => { + + if (Common.isSchema(item) && + item.type === 'array') { + + schema.$_setFlag('_arrayItems', true, { clone: false }); + } + }; + + schema.$_modify({ each }); + }, + + manifest: { + + build(obj, desc) { + + if (desc.matches) { + for (const match of desc.matches) { + const { schema, ref, is, not, then, otherwise } = match; + if (schema) { + obj = obj.try(schema); + } + else if (ref) { + obj = obj.conditional(ref, { is, then, not, otherwise, switch: match.switch }); + } + else { + obj = obj.conditional(is, { then, otherwise }); + } + } + } + + return obj; + } + }, + + messages: { + 'alternatives.all': '{{#label}} does not match all of the required types', + 'alternatives.any': '{{#label}} does not match any of the allowed types', + 'alternatives.match': '{{#label}} does not match any of the allowed types', + 'alternatives.one': '{{#label}} matches more than one allowed type', + 'alternatives.types': '{{#label}} must be one of {{#types}}' + } +}); + + +// Helpers + +internals.errors = function (failures, { error, state }) { + + // Nothing matched due to type criteria rules + + if (!failures.length) { + return { errors: error('alternatives.any') }; + } + + // Single error + + if (failures.length === 1) { + return { errors: failures[0].reports }; + } + + // Analyze reasons + + const valids = new Set(); + const complex = []; + + for (const { reports, schema } of failures) { + + // Multiple errors (!abortEarly) + + if (reports.length > 1) { + return internals.unmatched(failures, error); + } + + // Custom error + + const report = reports[0]; + if (report instanceof Errors.Report === false) { + return internals.unmatched(failures, error); + } + + // Internal object or array error + + if (report.state.path.length !== state.path.length) { + complex.push({ type: schema.type, report }); + continue; + } + + // Valids + + if (report.code === 'any.only') { + for (const valid of report.local.valids) { + valids.add(valid); + } + + continue; + } + + // Base type + + const [type, code] = report.code.split('.'); + if (code !== 'base') { + complex.push({ type: schema.type, report }); + continue; + } + + valids.add(type); + } + + // All errors are base types or valids + + if (!complex.length) { + return { errors: error('alternatives.types', { types: [...valids] }) }; + } + + // Single complex error + + if (complex.length === 1) { + return { errors: complex[0].report }; + } + + return internals.unmatched(failures, error); +}; + + +internals.unmatched = function (failures, error) { + + const errors = []; + for (const failure of failures) { + errors.push(...failure.reports); + } + + return { errors: error('alternatives.match', Errors.details(errors, { override: false })) }; +}; diff --git a/node_modules/joi/lib/types/any.js b/node_modules/joi/lib/types/any.js new file mode 100644 index 000000000..2b1ad58b8 --- /dev/null +++ b/node_modules/joi/lib/types/any.js @@ -0,0 +1,174 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Base = require('../base'); +const Common = require('../common'); +const Messages = require('../messages'); + + +const internals = {}; + + +module.exports = Base.extend({ + + type: 'any', + + flags: { + + only: { default: false } + }, + + terms: { + + alterations: { init: null }, + examples: { init: null }, + externals: { init: null }, + metas: { init: [] }, + notes: { init: [] }, + 
shared: { init: null }, + tags: { init: [] }, + whens: { init: null } + }, + + rules: { + + custom: { + method(method, description) { + + Assert(typeof method === 'function', 'Method must be a function'); + Assert(description === undefined || description && typeof description === 'string', 'Description must be a non-empty string'); + + return this.$_addRule({ name: 'custom', args: { method, description } }); + }, + validate(value, helpers, { method }) { + + try { + return method(value, helpers); + } + catch (err) { + return helpers.error('any.custom', { error: err }); + } + }, + args: ['method', 'description'], + multi: true + }, + + messages: { + method(messages) { + + return this.prefs({ messages }); + } + }, + + shared: { + method(schema) { + + Assert(Common.isSchema(schema) && schema._flags.id, 'Schema must be a schema with an id'); + + const obj = this.clone(); + obj.$_terms.shared = obj.$_terms.shared || []; + obj.$_terms.shared.push(schema); + obj.$_mutateRegister(schema); + return obj; + } + }, + + warning: { + method(code, local) { + + Assert(code && typeof code === 'string', 'Invalid warning code'); + + return this.$_addRule({ name: 'warning', args: { code, local }, warn: true }); + }, + validate(value, helpers, { code, local }) { + + return helpers.error(code, local); + }, + args: ['code', 'local'], + multi: true + } + }, + + modifiers: { + + keep(rule, enabled = true) { + + rule.keep = enabled; + }, + + message(rule, message) { + + rule.message = Messages.compile(message); + }, + + warn(rule, enabled = true) { + + rule.warn = enabled; + } + }, + + manifest: { + + build(obj, desc) { + + for (const key in desc) { + const values = desc[key]; + + if (['examples', 'externals', 'metas', 'notes', 'tags'].includes(key)) { + for (const value of values) { + obj = obj[key.slice(0, -1)](value); + } + + continue; + } + + if (key === 'alterations') { + const alter = {}; + for (const { target, adjuster } of values) { + alter[target] = adjuster; + } + + obj = obj.alter(alter); + continue; + } + + if (key === 'whens') { + for (const value of values) { + const { ref, is, not, then, otherwise, concat } = value; + if (concat) { + obj = obj.concat(concat); + } + else if (ref) { + obj = obj.when(ref, { is, not, then, otherwise, switch: value.switch, break: value.break }); + } + else { + obj = obj.when(is, { then, otherwise, break: value.break }); + } + } + + continue; + } + + if (key === 'shared') { + for (const value of values) { + obj = obj.shared(value); + } + } + } + + return obj; + } + }, + + messages: { + 'any.custom': '{{#label}} failed custom validation because {{#error.message}}', + 'any.default': '{{#label}} threw an error when running default method', + 'any.failover': '{{#label}} threw an error when running failover method', + 'any.invalid': '{{#label}} contains an invalid value', + 'any.only': '{{#label}} must be {if(#valids.length == 1, "", "one of ")}{{#valids}}', + 'any.ref': '{{#label}} {{#arg}} references {{:#ref}} which {{#reason}}', + 'any.required': '{{#label}} is required', + 'any.unknown': '{{#label}} is not allowed' + } +}); diff --git a/node_modules/joi/lib/types/array.js b/node_modules/joi/lib/types/array.js new file mode 100644 index 000000000..7682e8293 --- /dev/null +++ b/node_modules/joi/lib/types/array.js @@ -0,0 +1,806 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Any = require('./any'); +const Common = require('../common'); +const 
Compile = require('../compile'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'array', + + flags: { + + single: { default: false }, + sparse: { default: false } + }, + + terms: { + + items: { init: [], manifest: 'schema' }, + ordered: { init: [], manifest: 'schema' }, + + _exclusions: { init: [] }, + _inclusions: { init: [] }, + _requireds: { init: [] } + }, + + coerce: { + from: 'object', + method(value, { schema, state, prefs }) { + + if (!Array.isArray(value)) { + return; + } + + const sort = schema.$_getRule('sort'); + if (!sort) { + return; + } + + return internals.sort(schema, value, sort.args.options, state, prefs); + } + }, + + validate(value, { schema, error }) { + + if (!Array.isArray(value)) { + if (schema._flags.single) { + const single = [value]; + single[Common.symbols.arraySingle] = true; + return { value: single }; + } + + return { errors: error('array.base') }; + } + + if (!schema.$_getRule('items') && + !schema.$_terms.externals) { + + return; + } + + return { value: value.slice() }; // Clone the array so that we don't modify the original + }, + + rules: { + + has: { + method(schema) { + + schema = this.$_compile(schema, { appendPath: true }); + const obj = this.$_addRule({ name: 'has', args: { schema } }); + obj.$_mutateRegister(schema); + return obj; + }, + validate(value, { state, prefs, error }, { schema: has }) { + + const ancestors = [value, ...state.ancestors]; + for (let i = 0; i < value.length; ++i) { + const localState = state.localize([...state.path, i], ancestors, has); + if (has.$_match(value[i], localState, prefs)) { + return value; + } + } + + const patternLabel = has._flags.label; + if (patternLabel) { + return error('array.hasKnown', { patternLabel }); + } + + return error('array.hasUnknown', null); + }, + multi: true + }, + + items: { + method(...schemas) { + + Common.verifyFlat(schemas, 'items'); + + const obj = this.$_addRule('items'); + + for (let i = 0; i < schemas.length; ++i) { + const type = Common.tryWithPath(() => this.$_compile(schemas[i]), i, { append: true }); + obj.$_terms.items.push(type); + } + + return obj.$_mutateRebuild(); + }, + validate(value, { schema, error, state, prefs, errorsArray }) { + + const requireds = schema.$_terms._requireds.slice(); + const ordereds = schema.$_terms.ordered.slice(); + const inclusions = [...schema.$_terms._inclusions, ...requireds]; + + const wasArray = !value[Common.symbols.arraySingle]; + delete value[Common.symbols.arraySingle]; + + const errors = errorsArray(); + + let il = value.length; + for (let i = 0; i < il; ++i) { + const item = value[i]; + + let errored = false; + let isValid = false; + + const key = wasArray ? 
i : new Number(i); // eslint-disable-line no-new-wrappers + const path = [...state.path, key]; + + // Sparse + + if (!schema._flags.sparse && + item === undefined) { + + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + + ordereds.shift(); + continue; + } + + // Exclusions + + const ancestors = [value, ...state.ancestors]; + + for (const exclusion of schema.$_terms._exclusions) { + if (!exclusion.$_match(item, state.localize(path, ancestors, exclusion), prefs, { presence: 'ignore' })) { + continue; + } + + errors.push(error('array.excludes', { pos: i, value: item }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + + errored = true; + ordereds.shift(); + break; + } + + if (errored) { + continue; + } + + // Ordered + + if (schema.$_terms.ordered.length) { + if (ordereds.length) { + const ordered = ordereds.shift(); + const res = ordered.$_validate(item, state.localize(path, ancestors, ordered), prefs); + if (!res.errors) { + if (ordered._flags.result === 'strip') { + internals.fastSplice(value, i); + --i; + --il; + } + else if (!schema._flags.sparse && res.value === undefined) { + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + + continue; + } + else { + value[i] = res.value; + } + } + else { + errors.push(...res.errors); + if (prefs.abortEarly) { + return errors; + } + } + + continue; + } + else if (!schema.$_terms.items.length) { + errors.push(error('array.orderedLength', { pos: i, limit: schema.$_terms.ordered.length })); + if (prefs.abortEarly) { + return errors; + } + + break; // No reason to continue since there are no other rules to validate other than array.orderedLength + } + } + + // Requireds + + const requiredChecks = []; + let jl = requireds.length; + for (let j = 0; j < jl; ++j) { + const localState = state.localize(path, ancestors, requireds[j]); + localState.snapshot(); + + const res = requireds[j].$_validate(item, localState, prefs); + requiredChecks[j] = res; + + if (!res.errors) { + value[i] = res.value; + isValid = true; + internals.fastSplice(requireds, j); + --j; + --jl; + + if (!schema._flags.sparse && + res.value === undefined) { + + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + } + + break; + } + + localState.restore(); + } + + if (isValid) { + continue; + } + + // Inclusions + + const stripUnknown = prefs.stripUnknown && !!prefs.stripUnknown.arrays || false; + + jl = inclusions.length; + for (const inclusion of inclusions) { + + // Avoid re-running requireds that already didn't match in the previous loop + + let res; + const previousCheck = requireds.indexOf(inclusion); + if (previousCheck !== -1) { + res = requiredChecks[previousCheck]; + } + else { + const localState = state.localize(path, ancestors, inclusion); + localState.snapshot(); + + res = inclusion.$_validate(item, localState, prefs); + if (!res.errors) { + if (inclusion._flags.result === 'strip') { + internals.fastSplice(value, i); + --i; + --il; + } + else if (!schema._flags.sparse && + res.value === undefined) { + + errors.push(error('array.sparse', { key, path, pos: i, value: undefined }, state.localize(path))); + errored = true; + } + else { + value[i] = res.value; + } + + isValid = true; + break; + } + + localState.restore(); + } + + // Return the actual error if only one inclusion 
defined + + if (jl === 1) { + if (stripUnknown) { + internals.fastSplice(value, i); + --i; + --il; + isValid = true; + break; + } + + errors.push(...res.errors); + if (prefs.abortEarly) { + return errors; + } + + errored = true; + break; + } + } + + if (errored) { + continue; + } + + if ((schema.$_terms._inclusions.length || schema.$_terms._requireds.length) && + !isValid) { + + if (stripUnknown) { + internals.fastSplice(value, i); + --i; + --il; + continue; + } + + errors.push(error('array.includes', { pos: i, value: item }, state.localize(path))); + if (prefs.abortEarly) { + return errors; + } + } + } + + if (requireds.length) { + internals.fillMissedErrors(schema, errors, requireds, value, state, prefs); + } + + if (ordereds.length) { + internals.fillOrderedErrors(schema, errors, ordereds, value, state, prefs); + + if (!errors.length) { + internals.fillDefault(ordereds, value, state, prefs); + } + } + + return errors.length ? errors : value; + }, + + priority: true, + manifest: false + }, + + length: { + method(limit) { + + return this.$_addRule({ name: 'length', args: { limit }, operator: '=' }); + }, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(value.length, limit, operator)) { + return value; + } + + return helpers.error('array.' + name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + } + ] + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'length', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'length', args: { limit }, operator: '>=' }); + } + }, + + ordered: { + method(...schemas) { + + Common.verifyFlat(schemas, 'ordered'); + + const obj = this.$_addRule('items'); + + for (let i = 0; i < schemas.length; ++i) { + const type = Common.tryWithPath(() => this.$_compile(schemas[i]), i, { append: true }); + internals.validateSingle(type, obj); + + obj.$_mutateRegister(type); + obj.$_terms.ordered.push(type); + } + + return obj.$_mutateRebuild(); + } + }, + + single: { + method(enabled) { + + const value = enabled === undefined ? true : !!enabled; + Assert(!value || !this._flags._arrayItems, 'Cannot specify single rule when array has array items'); + + return this.$_setFlag('single', value); + } + }, + + sort: { + method(options = {}) { + + Common.assertOptions(options, ['by', 'order']); + + const settings = { + order: options.order || 'ascending' + }; + + if (options.by) { + settings.by = Compile.ref(options.by, { ancestor: 0 }); + Assert(!settings.by.ancestor, 'Cannot sort by ancestor'); + } + + return this.$_addRule({ name: 'sort', args: { options: settings } }); + }, + validate(value, { error, state, prefs, schema }, { options }) { + + const { value: sorted, errors } = internals.sort(schema, value, options, state, prefs); + if (errors) { + return errors; + } + + for (let i = 0; i < value.length; ++i) { + if (value[i] !== sorted[i]) { + return error('array.sort', { order: options.order, by: options.by ? options.by.key : 'value' }); + } + } + + return value; + }, + convert: true + }, + + sparse: { + method(enabled) { + + const value = enabled === undefined ? true : !!enabled; + + if (this._flags.sparse === value) { + return this; + } + + const obj = value ? 
this.clone() : this.$_addRule('items'); + return obj.$_setFlag('sparse', value, { clone: false }); + } + }, + + unique: { + method(comparator, options = {}) { + + Assert(!comparator || typeof comparator === 'function' || typeof comparator === 'string', 'comparator must be a function or a string'); + Common.assertOptions(options, ['ignoreUndefined', 'separator']); + + const rule = { name: 'unique', args: { options, comparator } }; + + if (comparator) { + if (typeof comparator === 'string') { + const separator = Common.default(options.separator, '.'); + rule.path = separator ? comparator.split(separator) : [comparator]; + } + else { + rule.comparator = comparator; + } + } + + return this.$_addRule(rule); + }, + validate(value, { state, error, schema }, { comparator: raw, options }, { comparator, path }) { + + const found = { + string: Object.create(null), + number: Object.create(null), + undefined: Object.create(null), + boolean: Object.create(null), + object: new Map(), + function: new Map(), + custom: new Map() + }; + + const compare = comparator || DeepEqual; + const ignoreUndefined = options.ignoreUndefined; + + for (let i = 0; i < value.length; ++i) { + const item = path ? Reach(value[i], path) : value[i]; + const records = comparator ? found.custom : found[typeof item]; + Assert(records, 'Failed to find unique map container for type', typeof item); + + if (records instanceof Map) { + const entries = records.entries(); + let current; + while (!(current = entries.next()).done) { + if (compare(current.value[0], item)) { + const localState = state.localize([...state.path, i], [value, ...state.ancestors]); + const context = { + pos: i, + value: value[i], + dupePos: current.value[1], + dupeValue: value[current.value[1]] + }; + + if (path) { + context.path = raw; + } + + return error('array.unique', context, localState); + } + } + + records.set(item, i); + } + else { + if ((!ignoreUndefined || item !== undefined) && + records[item] !== undefined) { + + const context = { + pos: i, + value: value[i], + dupePos: records[item], + dupeValue: value[records[item]] + }; + + if (path) { + context.path = raw; + } + + const localState = state.localize([...state.path, i], [value, ...state.ancestors]); + return error('array.unique', context, localState); + } + + records[item] = i; + } + } + + return value; + }, + args: ['comparator', 'options'], + multi: true + } + }, + + cast: { + set: { + from: Array.isArray, + to(value, helpers) { + + return new Set(value); + } + } + }, + + rebuild(schema) { + + schema.$_terms._inclusions = []; + schema.$_terms._exclusions = []; + schema.$_terms._requireds = []; + + for (const type of schema.$_terms.items) { + internals.validateSingle(type, schema); + + if (type._flags.presence === 'required') { + schema.$_terms._requireds.push(type); + } + else if (type._flags.presence === 'forbidden') { + schema.$_terms._exclusions.push(type); + } + else { + schema.$_terms._inclusions.push(type); + } + } + + for (const type of schema.$_terms.ordered) { + internals.validateSingle(type, schema); + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.items) { + obj = obj.items(...desc.items); + } + + if (desc.ordered) { + obj = obj.ordered(...desc.ordered); + } + + return obj; + } + }, + + messages: { + 'array.base': '{{#label}} must be an array', + 'array.excludes': '{{#label}} contains an excluded value', + 'array.hasKnown': '{{#label}} does not contain at least one required match for type {:#patternLabel}', + 'array.hasUnknown': '{{#label}} does not contain at least one 
required match', + 'array.includes': '{{#label}} does not match any of the allowed types', + 'array.includesRequiredBoth': '{{#label}} does not contain {{#knownMisses}} and {{#unknownMisses}} other required value(s)', + 'array.includesRequiredKnowns': '{{#label}} does not contain {{#knownMisses}}', + 'array.includesRequiredUnknowns': '{{#label}} does not contain {{#unknownMisses}} required value(s)', + 'array.length': '{{#label}} must contain {{#limit}} items', + 'array.max': '{{#label}} must contain less than or equal to {{#limit}} items', + 'array.min': '{{#label}} must contain at least {{#limit}} items', + 'array.orderedLength': '{{#label}} must contain at most {{#limit}} items', + 'array.sort': '{{#label}} must be sorted in {#order} order by {{#by}}', + 'array.sort.mismatching': '{{#label}} cannot be sorted due to mismatching types', + 'array.sort.unsupported': '{{#label}} cannot be sorted due to unsupported type {#type}', + 'array.sparse': '{{#label}} must not be a sparse array item', + 'array.unique': '{{#label}} contains a duplicate value' + } +}); + + +// Helpers + +internals.fillMissedErrors = function (schema, errors, requireds, value, state, prefs) { + + const knownMisses = []; + let unknownMisses = 0; + for (const required of requireds) { + const label = required._flags.label; + if (label) { + knownMisses.push(label); + } + else { + ++unknownMisses; + } + } + + if (knownMisses.length) { + if (unknownMisses) { + errors.push(schema.$_createError('array.includesRequiredBoth', value, { knownMisses, unknownMisses }, state, prefs)); + } + else { + errors.push(schema.$_createError('array.includesRequiredKnowns', value, { knownMisses }, state, prefs)); + } + } + else { + errors.push(schema.$_createError('array.includesRequiredUnknowns', value, { unknownMisses }, state, prefs)); + } +}; + + +internals.fillOrderedErrors = function (schema, errors, ordereds, value, state, prefs) { + + const requiredOrdereds = []; + + for (const ordered of ordereds) { + if (ordered._flags.presence === 'required') { + requiredOrdereds.push(ordered); + } + } + + if (requiredOrdereds.length) { + internals.fillMissedErrors(schema, errors, requiredOrdereds, value, state, prefs); + } +}; + + +internals.fillDefault = function (ordereds, value, state, prefs) { + + const overrides = []; + let trailingUndefined = true; + + for (let i = ordereds.length - 1; i >= 0; --i) { + const ordered = ordereds[i]; + const ancestors = [value, ...state.ancestors]; + const override = ordered.$_validate(undefined, state.localize(state.path, ancestors, ordered), prefs).value; + + if (trailingUndefined) { + if (override === undefined) { + continue; + } + + trailingUndefined = false; + } + + overrides.unshift(override); + } + + if (overrides.length) { + value.push(...overrides); + } +}; + + +internals.fastSplice = function (arr, i) { + + let pos = i; + while (pos < arr.length) { + arr[pos++] = arr[pos]; + } + + --arr.length; +}; + + +internals.validateSingle = function (type, obj) { + + if (type.type === 'array' || + type._flags._arrayItems) { + + Assert(!obj._flags.single, 'Cannot specify array item with single rule enabled'); + obj.$_setFlag('_arrayItems', true, { clone: false }); + } +}; + + +internals.sort = function (schema, value, settings, state, prefs) { + + const order = settings.order === 'ascending' ? 
1 : -1; + const aFirst = -1 * order; + const bFirst = order; + + const sort = (a, b) => { + + let compare = internals.compare(a, b, aFirst, bFirst); + if (compare !== null) { + return compare; + } + + if (settings.by) { + a = settings.by.resolve(a, state, prefs); + b = settings.by.resolve(b, state, prefs); + } + + compare = internals.compare(a, b, aFirst, bFirst); + if (compare !== null) { + return compare; + } + + const type = typeof a; + if (type !== typeof b) { + throw schema.$_createError('array.sort.mismatching', value, null, state, prefs); + } + + if (type !== 'number' && + type !== 'string') { + + throw schema.$_createError('array.sort.unsupported', value, { type }, state, prefs); + } + + if (type === 'number') { + return (a - b) * order; + } + + return a < b ? aFirst : bFirst; + }; + + try { + return { value: value.slice().sort(sort) }; + } + catch (err) { + return { errors: err }; + } +}; + + +internals.compare = function (a, b, aFirst, bFirst) { + + if (a === b) { + return 0; + } + + if (a === undefined) { + return 1; // Always last regardless of sort order + } + + if (b === undefined) { + return -1; // Always last regardless of sort order + } + + if (a === null) { + return bFirst; + } + + if (b === null) { + return aFirst; + } + + return null; +}; diff --git a/node_modules/joi/lib/types/binary.js b/node_modules/joi/lib/types/binary.js new file mode 100644 index 000000000..9147166ab --- /dev/null +++ b/node_modules/joi/lib/types/binary.js @@ -0,0 +1,98 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'binary', + + coerce: { + from: 'string', + method(value, { schema }) { + + try { + return { value: Buffer.from(value, schema._flags.encoding) }; + } + catch (ignoreErr) { } + } + }, + + validate(value, { error }) { + + if (!Buffer.isBuffer(value)) { + return { value, errors: error('binary.base') }; + } + }, + + rules: { + encoding: { + method(encoding) { + + Assert(Buffer.isEncoding(encoding), 'Invalid encoding:', encoding); + + return this.$_setFlag('encoding', encoding); + } + }, + + length: { + method(limit) { + + return this.$_addRule({ name: 'length', method: 'length', args: { limit }, operator: '=' }); + }, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(value.length, limit, operator)) { + return value; + } + + return helpers.error('binary.' 
+ name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + } + ] + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'length', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'length', args: { limit }, operator: '>=' }); + } + } + }, + + cast: { + string: { + from: (value) => Buffer.isBuffer(value), + to(value, helpers) { + + return value.toString(); + } + } + }, + + messages: { + 'binary.base': '{{#label}} must be a buffer or a string', + 'binary.length': '{{#label}} must be {{#limit}} bytes', + 'binary.max': '{{#label}} must be less than or equal to {{#limit}} bytes', + 'binary.min': '{{#label}} must be at least {{#limit}} bytes' + } +}); diff --git a/node_modules/joi/lib/types/boolean.js b/node_modules/joi/lib/types/boolean.js new file mode 100644 index 000000000..686586648 --- /dev/null +++ b/node_modules/joi/lib/types/boolean.js @@ -0,0 +1,150 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); +const Values = require('../values'); + + +const internals = {}; + + +internals.isBool = function (value) { + + return typeof value === 'boolean'; +}; + + +module.exports = Any.extend({ + + type: 'boolean', + + flags: { + + sensitive: { default: false } + }, + + terms: { + + falsy: { + init: null, + manifest: 'values' + }, + + truthy: { + init: null, + manifest: 'values' + } + }, + + coerce(value, { schema }) { + + if (typeof value === 'boolean') { + return; + } + + if (typeof value === 'string') { + const normalized = schema._flags.sensitive ? value : value.toLowerCase(); + value = normalized === 'true' ? true : (normalized === 'false' ? false : value); + } + + if (typeof value !== 'boolean') { + value = schema.$_terms.truthy && schema.$_terms.truthy.has(value, null, null, !schema._flags.sensitive) || + (schema.$_terms.falsy && schema.$_terms.falsy.has(value, null, null, !schema._flags.sensitive) ? false : value); + } + + return { value }; + }, + + validate(value, { error }) { + + if (typeof value !== 'boolean') { + return { value, errors: error('boolean.base') }; + } + }, + + rules: { + truthy: { + method(...values) { + + Common.verifyFlat(values, 'truthy'); + + const obj = this.clone(); + obj.$_terms.truthy = obj.$_terms.truthy || new Values(); + + for (let i = 0; i < values.length; ++i) { + const value = values[i]; + + Assert(value !== undefined, 'Cannot call truthy with undefined'); + obj.$_terms.truthy.add(value); + } + + return obj; + } + }, + + falsy: { + method(...values) { + + Common.verifyFlat(values, 'falsy'); + + const obj = this.clone(); + obj.$_terms.falsy = obj.$_terms.falsy || new Values(); + + for (let i = 0; i < values.length; ++i) { + const value = values[i]; + + Assert(value !== undefined, 'Cannot call falsy with undefined'); + obj.$_terms.falsy.add(value); + } + + return obj; + } + }, + + sensitive: { + method(enabled = true) { + + return this.$_setFlag('sensitive', enabled); + } + } + }, + + cast: { + number: { + from: internals.isBool, + to(value, helpers) { + + return value ? 1 : 0; + } + }, + string: { + from: internals.isBool, + to(value, helpers) { + + return value ? 
'true' : 'false'; + } + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.truthy) { + obj = obj.truthy(...desc.truthy); + } + + if (desc.falsy) { + obj = obj.falsy(...desc.falsy); + } + + return obj; + } + }, + + messages: { + 'boolean.base': '{{#label}} must be a boolean' + } +}); diff --git a/node_modules/joi/lib/types/date.js b/node_modules/joi/lib/types/date.js new file mode 100644 index 000000000..b8206c6f8 --- /dev/null +++ b/node_modules/joi/lib/types/date.js @@ -0,0 +1,233 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); +const Template = require('../template'); + + +const internals = {}; + + +internals.isDate = function (value) { + + return value instanceof Date; +}; + + +module.exports = Any.extend({ + + type: 'date', + + coerce: { + from: ['number', 'string'], + method(value, { schema }) { + + return { value: internals.parse(value, schema._flags.format) || value }; + } + }, + + validate(value, { schema, error, prefs }) { + + if (value instanceof Date && + !isNaN(value.getTime())) { + + return; + } + + const format = schema._flags.format; + + if (!prefs.convert || + !format || + typeof value !== 'string') { + + return { value, errors: error('date.base') }; + } + + return { value, errors: error('date.format', { format }) }; + }, + + rules: { + + compare: { + method: false, + validate(value, helpers, { date }, { name, operator, args }) { + + const to = date === 'now' ? Date.now() : date.getTime(); + if (Common.compare(value.getTime(), to, operator)) { + return value; + } + + return helpers.error('date.' + name, { limit: args.date, value }); + }, + args: [ + { + name: 'date', + ref: true, + normalize: (date) => { + + return date === 'now' ? date : internals.parse(date); + }, + assert: (date) => date !== null, + message: 'must have a valid date format' + } + ] + }, + + format: { + method(format) { + + Assert(['iso', 'javascript', 'unix'].includes(format), 'Unknown date format', format); + + return this.$_setFlag('format', format); + } + }, + + greater: { + method(date) { + + return this.$_addRule({ name: 'greater', method: 'compare', args: { date }, operator: '>' }); + } + }, + + iso: { + method() { + + return this.format('iso'); + } + }, + + less: { + method(date) { + + return this.$_addRule({ name: 'less', method: 'compare', args: { date }, operator: '<' }); + } + }, + + max: { + method(date) { + + return this.$_addRule({ name: 'max', method: 'compare', args: { date }, operator: '<=' }); + } + }, + + min: { + method(date) { + + return this.$_addRule({ name: 'min', method: 'compare', args: { date }, operator: '>=' }); + } + }, + + timestamp: { + method(type = 'javascript') { + + Assert(['javascript', 'unix'].includes(type), '"type" must be one of "javascript, unix"'); + + return this.format(type); + } + } + }, + + cast: { + number: { + from: internals.isDate, + to(value, helpers) { + + return value.getTime(); + } + }, + string: { + from: internals.isDate, + to(value, { prefs }) { + + return Template.date(value, prefs); + } + } + }, + + messages: { + 'date.base': '{{#label}} must be a valid date', + 'date.format': '{{#label}} must be in {msg("date.format." 
+ #format) || #format} format', + 'date.greater': '{{#label}} must be greater than {{:#limit}}', + 'date.less': '{{#label}} must be less than {{:#limit}}', + 'date.max': '{{#label}} must be less than or equal to {{:#limit}}', + 'date.min': '{{#label}} must be greater than or equal to {{:#limit}}', + + // Messages used in date.format + + 'date.format.iso': 'ISO 8601 date', + 'date.format.javascript': 'timestamp or number of milliseconds', + 'date.format.unix': 'timestamp or number of seconds' + } +}); + + +// Helpers + +internals.parse = function (value, format) { + + if (value instanceof Date) { + return value; + } + + if (typeof value !== 'string' && + (isNaN(value) || !isFinite(value))) { + + return null; + } + + if (/^\s*$/.test(value)) { + return null; + } + + // ISO + + if (format === 'iso') { + if (!Common.isIsoDate(value)) { + return null; + } + + return internals.date(value.toString()); + } + + // Normalize number string + + const original = value; + if (typeof value === 'string' && + /^[+-]?\d+(\.\d+)?$/.test(value)) { + + value = parseFloat(value); + } + + // Timestamp + + if (format) { + if (format === 'javascript') { + return internals.date(1 * value); // Casting to number + } + + if (format === 'unix') { + return internals.date(1000 * value); + } + + if (typeof original === 'string') { + return null; + } + } + + // Plain + + return internals.date(value); +}; + + +internals.date = function (value) { + + const date = new Date(value); + if (!isNaN(date.getTime())) { + return date; + } + + return null; +}; diff --git a/node_modules/joi/lib/types/function.js b/node_modules/joi/lib/types/function.js new file mode 100644 index 000000000..eb8792e43 --- /dev/null +++ b/node_modules/joi/lib/types/function.js @@ -0,0 +1,93 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Keys = require('./keys'); + + +const internals = {}; + + +module.exports = Keys.extend({ + + type: 'function', + + properties: { + typeof: 'function' + }, + + rules: { + arity: { + method(n) { + + Assert(Number.isSafeInteger(n) && n >= 0, 'n must be a positive integer'); + + return this.$_addRule({ name: 'arity', args: { n } }); + }, + validate(value, helpers, { n }) { + + if (value.length === n) { + return value; + } + + return helpers.error('function.arity', { n }); + } + }, + + class: { + method() { + + return this.$_addRule('class'); + }, + validate(value, helpers) { + + if ((/^\s*class\s/).test(value.toString())) { + return value; + } + + return helpers.error('function.class', { value }); + } + }, + + minArity: { + method(n) { + + Assert(Number.isSafeInteger(n) && n > 0, 'n must be a strict positive integer'); + + return this.$_addRule({ name: 'minArity', args: { n } }); + }, + validate(value, helpers, { n }) { + + if (value.length >= n) { + return value; + } + + return helpers.error('function.minArity', { n }); + } + }, + + maxArity: { + method(n) { + + Assert(Number.isSafeInteger(n) && n >= 0, 'n must be a positive integer'); + + return this.$_addRule({ name: 'maxArity', args: { n } }); + }, + validate(value, helpers, { n }) { + + if (value.length <= n) { + return value; + } + + return helpers.error('function.maxArity', { n }); + } + } + }, + + messages: { + 'function.arity': '{{#label}} must have an arity of {{#n}}', + 'function.class': '{{#label}} must be a class', + 'function.maxArity': '{{#label}} must have an arity lesser or equal to {{#n}}', + 'function.minArity': '{{#label}} must have an arity greater or equal to {{#n}}' + } +}); diff --git a/node_modules/joi/lib/types/keys.js 
b/node_modules/joi/lib/types/keys.js new file mode 100644 index 000000000..8f8a950f2 --- /dev/null +++ b/node_modules/joi/lib/types/keys.js @@ -0,0 +1,1047 @@ +'use strict'; + +const ApplyToDefaults = require('@hapi/hoek/lib/applyToDefaults'); +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const Topo = require('@hapi/topo'); + +const Any = require('./any'); +const Common = require('../common'); +const Compile = require('../compile'); +const Errors = require('../errors'); +const Ref = require('../ref'); +const Template = require('../template'); + + +const internals = { + renameDefaults: { + alias: false, // Keep old value in place + multiple: false, // Allow renaming multiple keys into the same target + override: false // Overrides an existing key + } +}; + + +module.exports = Any.extend({ + + type: '_keys', + + properties: { + + typeof: 'object' + }, + + flags: { + + unknown: { default: false } + }, + + terms: { + + dependencies: { init: null }, + keys: { init: null, manifest: { mapped: { from: 'schema', to: 'key' } } }, + patterns: { init: null }, + renames: { init: null } + }, + + args(schema, keys) { + + return schema.keys(keys); + }, + + validate(value, { schema, error, state, prefs }) { + + if (!value || + typeof value !== schema.$_property('typeof') || + Array.isArray(value)) { + + return { value, errors: error('object.base', { type: schema.$_property('typeof') }) }; + } + + // Skip if there are no other rules to test + + if (!schema.$_terms.renames && + !schema.$_terms.dependencies && + !schema.$_terms.keys && // null allows any keys + !schema.$_terms.patterns && + !schema.$_terms.externals) { + + return; + } + + // Shallow clone value + + value = internals.clone(value, prefs); + const errors = []; + + // Rename keys + + if (schema.$_terms.renames && + !internals.rename(schema, value, state, prefs, errors)) { + + return { value, errors }; + } + + // Anything allowed + + if (!schema.$_terms.keys && // null allows any keys + !schema.$_terms.patterns && + !schema.$_terms.dependencies) { + + return { value, errors }; + } + + // Defined keys + + const unprocessed = new Set(Object.keys(value)); + + if (schema.$_terms.keys) { + const ancestors = [value, ...state.ancestors]; + + for (const child of schema.$_terms.keys) { + const key = child.key; + const item = value[key]; + + unprocessed.delete(key); + + const localState = state.localize([...state.path, key], ancestors, child); + const result = child.schema.$_validate(item, localState, prefs); + + if (result.errors) { + if (prefs.abortEarly) { + return { value, errors: result.errors }; + } + + if (result.value !== undefined) { + value[key] = result.value; + } + + errors.push(...result.errors); + } + else if (child.schema._flags.result === 'strip' || + result.value === undefined && item !== undefined) { + + delete value[key]; + } + else if (result.value !== undefined) { + value[key] = result.value; + } + } + } + + // Unknown keys + + if (unprocessed.size || + schema._flags._hasPatternMatch) { + + const early = internals.unknown(schema, value, unprocessed, errors, state, prefs); + if (early) { + return early; + } + } + + // Validate dependencies + + if (schema.$_terms.dependencies) { + for (const dep of schema.$_terms.dependencies) { + if (dep.key && + dep.key.resolve(value, state, prefs, null, { shadow: false }) === undefined) { + + continue; + } + + const failed = internals.dependencies[dep.rel](schema, dep, value, state, prefs); + if (failed) { + const report = 
schema.$_createError(failed.code, value, failed.context, state, prefs); + if (prefs.abortEarly) { + return { value, errors: report }; + } + + errors.push(report); + } + } + } + + return { value, errors }; + }, + + rules: { + + and: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'and'); + + return internals.dependency(this, 'and', null, peers); + } + }, + + append: { + method(schema) { + + if (schema === null || + schema === undefined || + Object.keys(schema).length === 0) { + + return this; + } + + return this.keys(schema); + } + }, + + assert: { + method(subject, schema, message) { + + if (!Template.isTemplate(subject)) { + subject = Compile.ref(subject); + } + + Assert(message === undefined || typeof message === 'string', 'Message must be a string'); + + schema = this.$_compile(schema, { appendPath: true }); + + const obj = this.$_addRule({ name: 'assert', args: { subject, schema, message } }); + obj.$_mutateRegister(subject); + obj.$_mutateRegister(schema); + return obj; + }, + validate(value, { error, prefs, state }, { subject, schema, message }) { + + const about = subject.resolve(value, state, prefs); + const path = Ref.isRef(subject) ? subject.absolute(state) : []; + if (schema.$_match(about, state.localize(path, [value, ...state.ancestors], schema), prefs)) { + return value; + } + + return error('object.assert', { subject, message }); + }, + args: ['subject', 'schema', 'message'], + multi: true + }, + + instance: { + method(constructor, name) { + + Assert(typeof constructor === 'function', 'constructor must be a function'); + + name = name || constructor.name; + + return this.$_addRule({ name: 'instance', args: { constructor, name } }); + }, + validate(value, helpers, { constructor, name }) { + + if (value instanceof constructor) { + return value; + } + + return helpers.error('object.instance', { type: name, value }); + }, + args: ['constructor', 'name'] + }, + + keys: { + method(schema) { + + Assert(schema === undefined || typeof schema === 'object', 'Object schema must be a valid object'); + Assert(!Common.isSchema(schema), 'Object schema cannot be a joi schema'); + + const obj = this.clone(); + + if (!schema) { // Allow all + obj.$_terms.keys = null; + } + else if (!Object.keys(schema).length) { // Allow none + obj.$_terms.keys = new internals.Keys(); + } + else { + obj.$_terms.keys = obj.$_terms.keys ? obj.$_terms.keys.filter((child) => !schema.hasOwnProperty(child.key)) : new internals.Keys(); + for (const key in schema) { + Common.tryWithPath(() => obj.$_terms.keys.push({ key, schema: this.$_compile(schema[key]) }), key); + } + } + + return obj.$_mutateRebuild(); + } + }, + + length: { + method(limit) { + + return this.$_addRule({ name: 'length', args: { limit }, operator: '=' }); + }, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(Object.keys(value).length, limit, operator)) { + return value; + } + + return helpers.error('object.' 
+ name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + } + ] + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'length', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'length', args: { limit }, operator: '>=' }); + } + }, + + nand: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'nand'); + + return internals.dependency(this, 'nand', null, peers); + } + }, + + or: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'or'); + + return internals.dependency(this, 'or', null, peers); + } + }, + + oxor: { + method(...peers /*, [options] */) { + + return internals.dependency(this, 'oxor', null, peers); + } + }, + + pattern: { + method(pattern, schema, options = {}) { + + const isRegExp = pattern instanceof RegExp; + if (!isRegExp) { + pattern = this.$_compile(pattern, { appendPath: true }); + } + + Assert(schema !== undefined, 'Invalid rule'); + Common.assertOptions(options, ['fallthrough', 'matches']); + + if (isRegExp) { + Assert(!pattern.flags.includes('g') && !pattern.flags.includes('y'), 'pattern should not use global or sticky mode'); + } + + schema = this.$_compile(schema, { appendPath: true }); + + const obj = this.clone(); + obj.$_terms.patterns = obj.$_terms.patterns || []; + const config = { [isRegExp ? 'regex' : 'schema']: pattern, rule: schema }; + if (options.matches) { + config.matches = this.$_compile(options.matches); + if (config.matches.type !== 'array') { + config.matches = config.matches.$_root.array().items(config.matches); + } + + obj.$_mutateRegister(config.matches); + obj.$_setFlag('_hasPatternMatch', true, { clone: false }); + } + + if (options.fallthrough) { + config.fallthrough = true; + } + + obj.$_terms.patterns.push(config); + obj.$_mutateRegister(schema); + return obj; + } + }, + + ref: { + method() { + + return this.$_addRule('ref'); + }, + validate(value, helpers) { + + if (Ref.isRef(value)) { + return value; + } + + return helpers.error('object.refType', { value }); + } + }, + + regex: { + method() { + + return this.$_addRule('regex'); + }, + validate(value, helpers) { + + if (value instanceof RegExp) { + return value; + } + + return helpers.error('object.regex', { value }); + } + }, + + rename: { + method(from, to, options = {}) { + + Assert(typeof from === 'string' || from instanceof RegExp, 'Rename missing the from argument'); + Assert(typeof to === 'string' || to instanceof Template, 'Invalid rename to argument'); + Assert(to !== from, 'Cannot rename key to same name:', from); + + Common.assertOptions(options, ['alias', 'ignoreUndefined', 'override', 'multiple']); + + const obj = this.clone(); + + obj.$_terms.renames = obj.$_terms.renames || []; + for (const rename of obj.$_terms.renames) { + Assert(rename.from !== from, 'Cannot rename the same key multiple times'); + } + + if (to instanceof Template) { + obj.$_mutateRegister(to); + } + + obj.$_terms.renames.push({ + from, + to, + options: ApplyToDefaults(internals.renameDefaults, options) + }); + + return obj; + } + }, + + schema: { + method(type = 'any') { + + return this.$_addRule({ name: 'schema', args: { type } }); + }, + validate(value, helpers, { type }) { + + if (Common.isSchema(value) && + (type === 'any' || value.type === type)) { + + return value; + } + + return helpers.error('object.schema', { type }); + } + }, + + unknown: { + method(allow) { + + return 
this.$_setFlag('unknown', allow !== false); + } + }, + + with: { + method(key, peers, options = {}) { + + return internals.dependency(this, 'with', key, peers, options); + } + }, + + without: { + method(key, peers, options = {}) { + + return internals.dependency(this, 'without', key, peers, options); + } + }, + + xor: { + method(...peers /*, [options] */) { + + Common.verifyFlat(peers, 'xor'); + + return internals.dependency(this, 'xor', null, peers); + } + } + }, + + overrides: { + + default(value, options) { + + if (value === undefined) { + value = Common.symbols.deepDefault; + } + + return this.$_parent('default', value, options); + } + }, + + rebuild(schema) { + + if (schema.$_terms.keys) { + const topo = new Topo.Sorter(); + for (const child of schema.$_terms.keys) { + Common.tryWithPath(() => topo.add(child, { after: child.schema.$_rootReferences(), group: child.key }), child.key); + } + + schema.$_terms.keys = new internals.Keys(...topo.nodes); + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.keys) { + obj = obj.keys(desc.keys); + } + + if (desc.dependencies) { + for (const { rel, key = null, peers, options } of desc.dependencies) { + obj = internals.dependency(obj, rel, key, peers, options); + } + } + + if (desc.patterns) { + for (const { regex, schema, rule, fallthrough, matches } of desc.patterns) { + obj = obj.pattern(regex || schema, rule, { fallthrough, matches }); + } + } + + if (desc.renames) { + for (const { from, to, options } of desc.renames) { + obj = obj.rename(from, to, options); + } + } + + return obj; + } + }, + + messages: { + 'object.and': '{{#label}} contains {{#presentWithLabels}} without its required peers {{#missingWithLabels}}', + 'object.assert': '{{#label}} is invalid because {if(#subject.key, `"` + #subject.key + `" failed to ` + (#message || "pass the assertion test"), #message || "the assertion failed")}', + 'object.base': '{{#label}} must be of type {{#type}}', + 'object.instance': '{{#label}} must be an instance of {{:#type}}', + 'object.length': '{{#label}} must have {{#limit}} key{if(#limit == 1, "", "s")}', + 'object.max': '{{#label}} must have less than or equal to {{#limit}} key{if(#limit == 1, "", "s")}', + 'object.min': '{{#label}} must have at least {{#limit}} key{if(#limit == 1, "", "s")}', + 'object.missing': '{{#label}} must contain at least one of {{#peersWithLabels}}', + 'object.nand': '{{:#mainWithLabel}} must not exist simultaneously with {{#peersWithLabels}}', + 'object.oxor': '{{#label}} contains a conflict between optional exclusive peers {{#peersWithLabels}}', + 'object.pattern.match': '{{#label}} keys failed to match pattern requirements', + 'object.refType': '{{#label}} must be a Joi reference', + 'object.regex': '{{#label}} must be a RegExp object', + 'object.rename.multiple': '{{#label}} cannot rename {{:#from}} because multiple renames are disabled and another key was already renamed to {{:#to}}', + 'object.rename.override': '{{#label}} cannot rename {{:#from}} because override is disabled and target {{:#to}} exists', + 'object.schema': '{{#label}} must be a Joi schema of {{#type}} type', + 'object.unknown': '{{#label}} is not allowed', + 'object.with': '{{:#mainWithLabel}} missing required peer {{:#peerWithLabel}}', + 'object.without': '{{:#mainWithLabel}} conflict with forbidden peer {{:#peerWithLabel}}', + 'object.xor': '{{#label}} contains a conflict between exclusive peers {{#peersWithLabels}}' + } +}); + + +// Helpers + +internals.clone = function (value, prefs) { + + // Object + + if (typeof value === 'object') 
{ + if (prefs.nonEnumerables) { + return Clone(value, { shallow: true }); + } + + const clone = Object.create(Object.getPrototypeOf(value)); + Object.assign(clone, value); + return clone; + } + + // Function + + const clone = function (...args) { + + return value.apply(this, args); + }; + + clone.prototype = Clone(value.prototype); + Object.defineProperty(clone, 'name', { value: value.name, writable: false }); + Object.defineProperty(clone, 'length', { value: value.length, writable: false }); + Object.assign(clone, value); + return clone; +}; + + +internals.dependency = function (schema, rel, key, peers, options) { + + Assert(key === null || typeof key === 'string', rel, 'key must be a strings'); + + // Extract options from peers array + + if (!options) { + options = peers.length > 1 && typeof peers[peers.length - 1] === 'object' ? peers.pop() : {}; + } + + Common.assertOptions(options, ['separator']); + + peers = [].concat(peers); + + // Cast peer paths + + const separator = Common.default(options.separator, '.'); + const paths = []; + for (const peer of peers) { + Assert(typeof peer === 'string', rel, 'peers must be strings'); + paths.push(Compile.ref(peer, { separator, ancestor: 0, prefix: false })); + } + + // Cast key + + if (key !== null) { + key = Compile.ref(key, { separator, ancestor: 0, prefix: false }); + } + + // Add rule + + const obj = schema.clone(); + obj.$_terms.dependencies = obj.$_terms.dependencies || []; + obj.$_terms.dependencies.push(new internals.Dependency(rel, key, paths, peers)); + return obj; +}; + + +internals.dependencies = { + + and(schema, dep, value, state, prefs) { + + const missing = []; + const present = []; + const count = dep.peers.length; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) === undefined) { + missing.push(peer.key); + } + else { + present.push(peer.key); + } + } + + if (missing.length !== count && + present.length !== count) { + + return { + code: 'object.and', + context: { + present, + presentWithLabels: internals.keysToLabels(schema, present), + missing, + missingWithLabels: internals.keysToLabels(schema, missing) + } + }; + } + }, + + nand(schema, dep, value, state, prefs) { + + const present = []; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + present.push(peer.key); + } + } + + if (present.length !== dep.peers.length) { + return; + } + + const main = dep.paths[0]; + const values = dep.paths.slice(1); + return { + code: 'object.nand', + context: { + main, + mainWithLabel: internals.keysToLabels(schema, main), + peers: values, + peersWithLabels: internals.keysToLabels(schema, values) + } + }; + }, + + or(schema, dep, value, state, prefs) { + + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + return; + } + } + + return { + code: 'object.missing', + context: { + peers: dep.paths, + peersWithLabels: internals.keysToLabels(schema, dep.paths) + } + }; + }, + + oxor(schema, dep, value, state, prefs) { + + const present = []; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + present.push(peer.key); + } + } + + if (!present.length || + present.length === 1) { + + return; + } + + const context = { peers: dep.paths, peersWithLabels: internals.keysToLabels(schema, dep.paths) }; + context.present = present; + context.presentWithLabels = internals.keysToLabels(schema, present); + return { code: 
'object.oxor', context }; + }, + + with(schema, dep, value, state, prefs) { + + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) === undefined) { + return { + code: 'object.with', + context: { + main: dep.key.key, + mainWithLabel: internals.keysToLabels(schema, dep.key.key), + peer: peer.key, + peerWithLabel: internals.keysToLabels(schema, peer.key) + } + }; + } + } + }, + + without(schema, dep, value, state, prefs) { + + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + return { + code: 'object.without', + context: { + main: dep.key.key, + mainWithLabel: internals.keysToLabels(schema, dep.key.key), + peer: peer.key, + peerWithLabel: internals.keysToLabels(schema, peer.key) + } + }; + } + } + }, + + xor(schema, dep, value, state, prefs) { + + const present = []; + for (const peer of dep.peers) { + if (peer.resolve(value, state, prefs, null, { shadow: false }) !== undefined) { + present.push(peer.key); + } + } + + if (present.length === 1) { + return; + } + + const context = { peers: dep.paths, peersWithLabels: internals.keysToLabels(schema, dep.paths) }; + if (present.length === 0) { + return { code: 'object.missing', context }; + } + + context.present = present; + context.presentWithLabels = internals.keysToLabels(schema, present); + return { code: 'object.xor', context }; + } +}; + + +internals.keysToLabels = function (schema, keys) { + + if (Array.isArray(keys)) { + return keys.map((key) => schema.$_mapLabels(key)); + } + + return schema.$_mapLabels(keys); +}; + + +internals.rename = function (schema, value, state, prefs, errors) { + + const renamed = {}; + for (const rename of schema.$_terms.renames) { + const matches = []; + const pattern = typeof rename.from !== 'string'; + + if (!pattern) { + if (Object.prototype.hasOwnProperty.call(value, rename.from) && + (value[rename.from] !== undefined || !rename.options.ignoreUndefined)) { + + matches.push(rename); + } + } + else { + for (const from in value) { + if (value[from] === undefined && + rename.options.ignoreUndefined) { + + continue; + } + + if (from === rename.to) { + continue; + } + + const match = rename.from.exec(from); + if (!match) { + continue; + } + + matches.push({ from, to: rename.to, match }); + } + } + + for (const match of matches) { + const from = match.from; + let to = match.to; + if (to instanceof Template) { + to = to.render(value, state, prefs, match.match); + } + + if (from === to) { + continue; + } + + if (!rename.options.multiple && + renamed[to]) { + + errors.push(schema.$_createError('object.rename.multiple', value, { from, to, pattern }, state, prefs)); + if (prefs.abortEarly) { + return false; + } + } + + if (Object.prototype.hasOwnProperty.call(value, to) && + !rename.options.override && + !renamed[to]) { + + errors.push(schema.$_createError('object.rename.override', value, { from, to, pattern }, state, prefs)); + if (prefs.abortEarly) { + return false; + } + } + + if (value[from] === undefined) { + delete value[to]; + } + else { + value[to] = value[from]; + } + + renamed[to] = true; + + if (!rename.options.alias) { + delete value[from]; + } + } + } + + return true; +}; + + +internals.unknown = function (schema, value, unprocessed, errors, state, prefs) { + + if (schema.$_terms.patterns) { + let hasMatches = false; + const matches = schema.$_terms.patterns.map((pattern) => { + + if (pattern.matches) { + hasMatches = true; + return []; + } + }); + + const ancestors = [value, ...state.ancestors]; 
+ + for (const key of unprocessed) { + const item = value[key]; + const path = [...state.path, key]; + + for (let i = 0; i < schema.$_terms.patterns.length; ++i) { + const pattern = schema.$_terms.patterns[i]; + if (pattern.regex) { + const match = pattern.regex.test(key); + state.mainstay.tracer.debug(state, 'rule', `pattern.${i}`, match ? 'pass' : 'error'); + if (!match) { + continue; + } + } + else { + if (!pattern.schema.$_match(key, state.nest(pattern.schema, `pattern.${i}`), prefs)) { + continue; + } + } + + unprocessed.delete(key); + + const localState = state.localize(path, ancestors, { schema: pattern.rule, key }); + const result = pattern.rule.$_validate(item, localState, prefs); + if (result.errors) { + if (prefs.abortEarly) { + return { value, errors: result.errors }; + } + + errors.push(...result.errors); + } + + if (pattern.matches) { + matches[i].push(key); + } + + value[key] = result.value; + if (!pattern.fallthrough) { + break; + } + } + } + + // Validate pattern matches rules + + if (hasMatches) { + for (let i = 0; i < matches.length; ++i) { + const match = matches[i]; + if (!match) { + continue; + } + + const stpm = schema.$_terms.patterns[i].matches; + const localState = state.localize(state.path, ancestors, stpm); + const result = stpm.$_validate(match, localState, prefs); + if (result.errors) { + const details = Errors.details(result.errors, { override: false }); + details.matches = match; + const report = schema.$_createError('object.pattern.match', value, details, state, prefs); + if (prefs.abortEarly) { + return { value, errors: report }; + } + + errors.push(report); + } + } + } + } + + if (!unprocessed.size || + !schema.$_terms.keys && !schema.$_terms.patterns) { // If no keys or patterns specified, unknown keys allowed + + return; + } + + if (prefs.stripUnknown && !schema._flags.unknown || + prefs.skipFunctions) { + + const stripUnknown = prefs.stripUnknown ? (prefs.stripUnknown === true ? 
true : !!prefs.stripUnknown.objects) : false; + + for (const key of unprocessed) { + if (stripUnknown) { + delete value[key]; + unprocessed.delete(key); + } + else if (typeof value[key] === 'function') { + unprocessed.delete(key); + } + } + } + + const forbidUnknown = !Common.default(schema._flags.unknown, prefs.allowUnknown); + if (forbidUnknown) { + for (const unprocessedKey of unprocessed) { + const localState = state.localize([...state.path, unprocessedKey], []); + const report = schema.$_createError('object.unknown', value[unprocessedKey], { child: unprocessedKey }, localState, prefs, { flags: false }); + if (prefs.abortEarly) { + return { value, errors: report }; + } + + errors.push(report); + } + } +}; + + +internals.Dependency = class { + + constructor(rel, key, peers, paths) { + + this.rel = rel; + this.key = key; + this.peers = peers; + this.paths = paths; + } + + describe() { + + const desc = { + rel: this.rel, + peers: this.paths + }; + + if (this.key !== null) { + desc.key = this.key.key; + } + + if (this.peers[0].separator !== '.') { + desc.options = { separator: this.peers[0].separator }; + } + + return desc; + } +}; + + +internals.Keys = class extends Array { + + concat(source) { + + const result = this.slice(); + + const keys = new Map(); + for (let i = 0; i < result.length; ++i) { + keys.set(result[i].key, i); + } + + for (const item of source) { + const key = item.key; + const pos = keys.get(key); + if (pos !== undefined) { + result[pos] = { key, schema: result[pos].schema.concat(item.schema) }; + } + else { + result.push(item); + } + } + + return result; + } +}; diff --git a/node_modules/joi/lib/types/link.js b/node_modules/joi/lib/types/link.js new file mode 100644 index 000000000..d99d0025c --- /dev/null +++ b/node_modules/joi/lib/types/link.js @@ -0,0 +1,168 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); +const Compile = require('../compile'); +const Errors = require('../errors'); + + +const internals = {}; + + +module.exports = Any.extend({ + + type: 'link', + + properties: { + schemaChain: true + }, + + terms: { + + link: { init: null, manifest: 'single', register: false } + }, + + args(schema, ref) { + + return schema.ref(ref); + }, + + validate(value, { schema, state, prefs }) { + + Assert(schema.$_terms.link, 'Uninitialized link schema'); + + const linked = internals.generate(schema, value, state, prefs); + const ref = schema.$_terms.link[0].ref; + return linked.$_validate(value, state.nest(linked, `link:${ref.display}:${linked.type}`), prefs); + }, + + generate(schema, value, state, prefs) { + + return internals.generate(schema, value, state, prefs); + }, + + rules: { + + ref: { + method(ref) { + + Assert(!this.$_terms.link, 'Cannot reinitialize schema'); + + ref = Compile.ref(ref); + + Assert(ref.type === 'value' || ref.type === 'local', 'Invalid reference type:', ref.type); + Assert(ref.type === 'local' || ref.ancestor === 'root' || ref.ancestor > 0, 'Link cannot reference itself'); + + const obj = this.clone(); + obj.$_terms.link = [{ ref }]; + return obj; + } + }, + + relative: { + method(enabled = true) { + + return this.$_setFlag('relative', enabled); + } + } + }, + + overrides: { + + concat(source) { + + Assert(this.$_terms.link, 'Uninitialized link schema'); + Assert(Common.isSchema(source), 'Invalid schema object'); + Assert(source.type !== 'link', 'Cannot merge type link with another link'); + + const obj = this.clone(); + + if (!obj.$_terms.whens) { + 
obj.$_terms.whens = []; + } + + obj.$_terms.whens.push({ concat: source }); + return obj.$_mutateRebuild(); + } + }, + + manifest: { + + build(obj, desc) { + + Assert(desc.link, 'Invalid link description missing link'); + return obj.ref(desc.link); + } + } +}); + + +// Helpers + +internals.generate = function (schema, value, state, prefs) { + + let linked = state.mainstay.links.get(schema); + if (linked) { + return linked._generate(value, state, prefs).schema; + } + + const ref = schema.$_terms.link[0].ref; + const { perspective, path } = internals.perspective(ref, state); + internals.assert(perspective, 'which is outside of schema boundaries', ref, schema, state, prefs); + + try { + linked = path.length ? perspective.$_reach(path) : perspective; + } + catch (ignoreErr) { + internals.assert(false, 'to non-existing schema', ref, schema, state, prefs); + } + + internals.assert(linked.type !== 'link', 'which is another link', ref, schema, state, prefs); + + if (!schema._flags.relative) { + state.mainstay.links.set(schema, linked); + } + + return linked._generate(value, state, prefs).schema; +}; + + +internals.perspective = function (ref, state) { + + if (ref.type === 'local') { + for (const { schema, key } of state.schemas) { // From parent to root + const id = schema._flags.id || key; + if (id === ref.path[0]) { + return { perspective: schema, path: ref.path.slice(1) }; + } + + if (schema.$_terms.shared) { + for (const shared of schema.$_terms.shared) { + if (shared._flags.id === ref.path[0]) { + return { perspective: shared, path: ref.path.slice(1) }; + } + } + } + } + + return { perspective: null, path: null }; + } + + if (ref.ancestor === 'root') { + return { perspective: state.schemas[state.schemas.length - 1].schema, path: ref.path }; + } + + return { perspective: state.schemas[ref.ancestor] && state.schemas[ref.ancestor].schema, path: ref.path }; +}; + + +internals.assert = function (condition, message, ref, schema, state, prefs) { + + if (condition) { // Manual check to avoid generating error message on success + return; + } + + Assert(false, `"${Errors.label(schema._flags, state, prefs)}" contains link reference "${ref.display}" ${message}`); +}; diff --git a/node_modules/joi/lib/types/number.js b/node_modules/joi/lib/types/number.js new file mode 100644 index 000000000..191db4743 --- /dev/null +++ b/node_modules/joi/lib/types/number.js @@ -0,0 +1,335 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); +const Common = require('../common'); + + +const internals = { + numberRx: /^\s*[+-]?(?:(?:\d+(?:\.\d*)?)|(?:\.\d+))(?:e([+-]?\d+))?\s*$/i, + precisionRx: /(?:\.(\d+))?(?:[eE]([+-]?\d+))?$/ +}; + + +module.exports = Any.extend({ + + type: 'number', + + flags: { + + unsafe: { default: false } + }, + + coerce: { + from: 'string', + method(value, { schema, error }) { + + const matches = value.match(internals.numberRx); + if (!matches) { + return; + } + + value = value.trim(); + const result = { value: parseFloat(value) }; + + if (result.value === 0) { + result.value = 0; // -0 + } + + if (!schema._flags.unsafe) { + if (value.match(/e/i)) { + const constructed = internals.normalizeExponent(`${result.value / Math.pow(10, matches[1])}e${matches[1]}`); + if (constructed !== internals.normalizeExponent(value)) { + result.errors = error('number.unsafe'); + return result; + } + } + else { + const string = result.value.toString(); + if (string.match(/e/i)) { + return result; + } + + if (string !== internals.normalizeDecimal(value)) { + 
result.errors = error('number.unsafe'); + return result; + } + } + } + + return result; + } + }, + + validate(value, { schema, error, prefs }) { + + if (value === Infinity || + value === -Infinity) { + + return { value, errors: error('number.infinity') }; + } + + if (!Common.isNumber(value)) { + return { value, errors: error('number.base') }; + } + + const result = { value }; + + if (prefs.convert) { + const rule = schema.$_getRule('precision'); + if (rule) { + const precision = Math.pow(10, rule.args.limit); // This is conceptually equivalent to using toFixed but it should be much faster + result.value = Math.round(result.value * precision) / precision; + } + } + + if (result.value === 0) { + result.value = 0; // -0 + } + + if (!schema._flags.unsafe && + (value > Number.MAX_SAFE_INTEGER || value < Number.MIN_SAFE_INTEGER)) { + + result.errors = error('number.unsafe'); + } + + return result; + }, + + rules: { + + compare: { + method: false, + validate(value, helpers, { limit }, { name, operator, args }) { + + if (Common.compare(value, limit, operator)) { + return value; + } + + return helpers.error('number.' + name, { limit: args.limit, value }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.isNumber, + message: 'must be a number' + } + ] + }, + + greater: { + method(limit) { + + return this.$_addRule({ name: 'greater', method: 'compare', args: { limit }, operator: '>' }); + } + }, + + integer: { + method() { + + return this.$_addRule('integer'); + }, + validate(value, helpers) { + + if (Math.trunc(value) - value === 0) { + return value; + } + + return helpers.error('number.integer'); + } + }, + + less: { + method(limit) { + + return this.$_addRule({ name: 'less', method: 'compare', args: { limit }, operator: '<' }); + } + }, + + max: { + method(limit) { + + return this.$_addRule({ name: 'max', method: 'compare', args: { limit }, operator: '<=' }); + } + }, + + min: { + method(limit) { + + return this.$_addRule({ name: 'min', method: 'compare', args: { limit }, operator: '>=' }); + } + }, + + multiple: { + method(base) { + + return this.$_addRule({ name: 'multiple', args: { base } }); + }, + validate(value, helpers, { base }, options) { + + if (value % base === 0) { + return value; + } + + return helpers.error('number.multiple', { multiple: options.args.base, value }); + }, + args: [ + { + name: 'base', + ref: true, + assert: (value) => typeof value === 'number' && isFinite(value) && value > 0, + message: 'must be a positive number' + } + ], + multi: true + }, + + negative: { + method() { + + return this.sign('negative'); + } + }, + + port: { + method() { + + return this.$_addRule('port'); + }, + validate(value, helpers) { + + if (Number.isSafeInteger(value) && + value >= 0 && + value <= 65535) { + + return value; + } + + return helpers.error('number.port'); + } + }, + + positive: { + method() { + + return this.sign('positive'); + } + }, + + precision: { + method(limit) { + + Assert(Number.isSafeInteger(limit), 'limit must be an integer'); + + return this.$_addRule({ name: 'precision', args: { limit } }); + }, + validate(value, helpers, { limit }) { + + const places = value.toString().match(internals.precisionRx); + const decimals = Math.max((places[1] ? places[1].length : 0) - (places[2] ? 
parseInt(places[2], 10) : 0), 0); + if (decimals <= limit) { + return value; + } + + return helpers.error('number.precision', { limit, value }); + }, + convert: true + }, + + sign: { + method(sign) { + + Assert(['negative', 'positive'].includes(sign), 'Invalid sign', sign); + + return this.$_addRule({ name: 'sign', args: { sign } }); + }, + validate(value, helpers, { sign }) { + + if (sign === 'negative' && value < 0 || + sign === 'positive' && value > 0) { + + return value; + } + + return helpers.error(`number.${sign}`); + } + }, + + unsafe: { + method(enabled = true) { + + Assert(typeof enabled === 'boolean', 'enabled must be a boolean'); + + return this.$_setFlag('unsafe', enabled); + } + } + }, + + cast: { + string: { + from: (value) => typeof value === 'number', + to(value, helpers) { + + return value.toString(); + } + } + }, + + messages: { + 'number.base': '{{#label}} must be a number', + 'number.greater': '{{#label}} must be greater than {{#limit}}', + 'number.infinity': '{{#label}} cannot be infinity', + 'number.integer': '{{#label}} must be an integer', + 'number.less': '{{#label}} must be less than {{#limit}}', + 'number.max': '{{#label}} must be less than or equal to {{#limit}}', + 'number.min': '{{#label}} must be greater than or equal to {{#limit}}', + 'number.multiple': '{{#label}} must be a multiple of {{#multiple}}', + 'number.negative': '{{#label}} must be a negative number', + 'number.port': '{{#label}} must be a valid port', + 'number.positive': '{{#label}} must be a positive number', + 'number.precision': '{{#label}} must have no more than {{#limit}} decimal places', + 'number.unsafe': '{{#label}} must be a safe number' + } +}); + + +// Helpers + +internals.normalizeExponent = function (str) { + + return str + .replace(/E/, 'e') + .replace(/\.(\d*[1-9])?0+e/, '.$1e') + .replace(/\.e/, 'e') + .replace(/e\+/, 'e') + .replace(/^\+/, '') + .replace(/^(-?)0+([1-9])/, '$1$2'); +}; + + +internals.normalizeDecimal = function (str) { + + str = str + // Remove leading plus signs + .replace(/^\+/, '') + // Remove trailing zeros if there is a decimal point and unecessary decimal points + .replace(/\.0*$/, '') + // Add a integer 0 if the numbers starts with a decimal point + .replace(/^(-?)\.([^\.]*)$/, '$10.$2') + // Remove leading zeros + .replace(/^(-?)0+([0-9])/, '$1$2'); + + if (str.includes('.') && + str.endsWith('0')) { + + str = str.replace(/0+$/, ''); + } + + if (str === '-0') { + return '0'; + } + + return str; +}; diff --git a/node_modules/joi/lib/types/object.js b/node_modules/joi/lib/types/object.js new file mode 100644 index 000000000..57046fdd4 --- /dev/null +++ b/node_modules/joi/lib/types/object.js @@ -0,0 +1,22 @@ +'use strict'; + +const Keys = require('./keys'); + + +const internals = {}; + + +module.exports = Keys.extend({ + + type: 'object', + + cast: { + map: { + from: (value) => value && typeof value === 'object', + to(value, helpers) { + + return new Map(Object.entries(value)); + } + } + } +}); diff --git a/node_modules/joi/lib/types/string.js b/node_modules/joi/lib/types/string.js new file mode 100644 index 000000000..d907d8901 --- /dev/null +++ b/node_modules/joi/lib/types/string.js @@ -0,0 +1,821 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Domain = require('@sideway/address/lib/domain'); +const Email = require('@sideway/address/lib/email'); +const Ip = require('@sideway/address/lib/ip'); +const EscapeRegex = require('@hapi/hoek/lib/escapeRegex'); +const Tlds = require('@sideway/address/lib/tlds'); +const Uri = 
require('@sideway/address/lib/uri'); + +const Any = require('./any'); +const Common = require('../common'); + + +const internals = { + tlds: Tlds instanceof Set ? { tlds: { allow: Tlds, deny: null } } : false, // $lab:coverage:ignore$ + base64Regex: { + // paddingRequired + true: { + // urlSafe + true: /^(?:[\w\-]{2}[\w\-]{2})*(?:[\w\-]{2}==|[\w\-]{3}=)?$/, + false: /^(?:[A-Za-z0-9+\/]{2}[A-Za-z0-9+\/]{2})*(?:[A-Za-z0-9+\/]{2}==|[A-Za-z0-9+\/]{3}=)?$/ + }, + false: { + true: /^(?:[\w\-]{2}[\w\-]{2})*(?:[\w\-]{2}(==)?|[\w\-]{3}=?)?$/, + false: /^(?:[A-Za-z0-9+\/]{2}[A-Za-z0-9+\/]{2})*(?:[A-Za-z0-9+\/]{2}(==)?|[A-Za-z0-9+\/]{3}=?)?$/ + } + }, + dataUriRegex: /^data:[\w+.-]+\/[\w+.-]+;((charset=[\w-]+|base64),)?(.*)$/, + hexRegex: /^[a-f0-9]+$/i, + ipRegex: Ip.regex().regex, + isoDurationRegex: /^P(?!$)(\d+Y)?(\d+M)?(\d+W)?(\d+D)?(T(?=\d)(\d+H)?(\d+M)?(\d+S)?)?$/, + + guidBrackets: { + '{': '}', '[': ']', '(': ')', '': '' + }, + guidVersions: { + uuidv1: '1', + uuidv2: '2', + uuidv3: '3', + uuidv4: '4', + uuidv5: '5' + }, + guidSeparators: new Set([undefined, true, false, '-', ':']), + + normalizationForms: ['NFC', 'NFD', 'NFKC', 'NFKD'] +}; + + +module.exports = Any.extend({ + + type: 'string', + + flags: { + + insensitive: { default: false }, + truncate: { default: false } + }, + + terms: { + + replacements: { init: null } + }, + + coerce: { + from: 'string', + method(value, { schema, state, prefs }) { + + const normalize = schema.$_getRule('normalize'); + if (normalize) { + value = value.normalize(normalize.args.form); + } + + const casing = schema.$_getRule('case'); + if (casing) { + value = casing.args.direction === 'upper' ? value.toLocaleUpperCase() : value.toLocaleLowerCase(); + } + + const trim = schema.$_getRule('trim'); + if (trim && + trim.args.enabled) { + + value = value.trim(); + } + + if (schema.$_terms.replacements) { + for (const replacement of schema.$_terms.replacements) { + value = value.replace(replacement.pattern, replacement.replacement); + } + } + + const hex = schema.$_getRule('hex'); + if (hex && + hex.args.options.byteAligned && + value.length % 2 !== 0) { + + value = `0${value}`; + } + + if (schema.$_getRule('isoDate')) { + const iso = internals.isoDate(value); + if (iso) { + value = iso; + } + } + + if (schema._flags.truncate) { + const rule = schema.$_getRule('max'); + if (rule) { + let limit = rule.args.limit; + if (Common.isResolvable(limit)) { + limit = limit.resolve(value, state, prefs); + if (!Common.limit(limit)) { + return { value, errors: schema.$_createError('any.ref', limit, { ref: rule.args.limit, arg: 'limit', reason: 'must be a positive integer' }, state, prefs) }; + } + } + + value = value.slice(0, limit); + } + } + + return { value }; + } + }, + + validate(value, { error }) { + + if (typeof value !== 'string') { + return { value, errors: error('string.base') }; + } + + if (value === '') { + return { value, errors: error('string.empty') }; + } + }, + + rules: { + + alphanum: { + method() { + + return this.$_addRule('alphanum'); + }, + validate(value, helpers) { + + if (/^[a-zA-Z0-9]+$/.test(value)) { + return value; + } + + return helpers.error('string.alphanum'); + } + }, + + base64: { + method(options = {}) { + + Common.assertOptions(options, ['paddingRequired', 'urlSafe']); + + options = { urlSafe: false, paddingRequired: true, ...options }; + Assert(typeof options.paddingRequired === 'boolean', 'paddingRequired must be boolean'); + Assert(typeof options.urlSafe === 'boolean', 'urlSafe must be boolean'); + + return this.$_addRule({ name: 
'base64', args: { options } }); + }, + validate(value, helpers, { options }) { + + const regex = internals.base64Regex[options.paddingRequired][options.urlSafe]; + if (regex.test(value)) { + return value; + } + + return helpers.error('string.base64'); + } + }, + + case: { + method(direction) { + + Assert(['lower', 'upper'].includes(direction), 'Invalid case:', direction); + + return this.$_addRule({ name: 'case', args: { direction } }); + }, + validate(value, helpers, { direction }) { + + if (direction === 'lower' && value === value.toLocaleLowerCase() || + direction === 'upper' && value === value.toLocaleUpperCase()) { + + return value; + } + + return helpers.error(`string.${direction}case`); + }, + convert: true + }, + + creditCard: { + method() { + + return this.$_addRule('creditCard'); + }, + validate(value, helpers) { + + let i = value.length; + let sum = 0; + let mul = 1; + + while (i--) { + const char = value.charAt(i) * mul; + sum = sum + (char - (char > 9) * 9); + mul = mul ^ 3; + } + + if (sum > 0 && + sum % 10 === 0) { + + return value; + } + + return helpers.error('string.creditCard'); + } + }, + + dataUri: { + method(options = {}) { + + Common.assertOptions(options, ['paddingRequired']); + + options = { paddingRequired: true, ...options }; + Assert(typeof options.paddingRequired === 'boolean', 'paddingRequired must be boolean'); + + return this.$_addRule({ name: 'dataUri', args: { options } }); + }, + validate(value, helpers, { options }) { + + const matches = value.match(internals.dataUriRegex); + + if (matches) { + if (!matches[2]) { + return value; + } + + if (matches[2] !== 'base64') { + return value; + } + + const base64regex = internals.base64Regex[options.paddingRequired].false; + if (base64regex.test(matches[3])) { + return value; + } + } + + return helpers.error('string.dataUri'); + } + }, + + domain: { + method(options) { + + if (options) { + Common.assertOptions(options, ['allowUnicode', 'maxDomainSegments', 'minDomainSegments', 'tlds']); + } + + const address = internals.addressOptions(options); + return this.$_addRule({ name: 'domain', args: { options }, address }); + }, + validate(value, helpers, args, { address }) { + + if (Domain.isValid(value, address)) { + return value; + } + + return helpers.error('string.domain'); + } + }, + + email: { + method(options = {}) { + + Common.assertOptions(options, ['allowUnicode', 'ignoreLength', 'maxDomainSegments', 'minDomainSegments', 'multiple', 'separator', 'tlds']); + Assert(options.multiple === undefined || typeof options.multiple === 'boolean', 'multiple option must be an boolean'); + + const address = internals.addressOptions(options); + const regex = new RegExp(`\\s*[${options.separator ? EscapeRegex(options.separator) : ','}]\\s*`); + + return this.$_addRule({ name: 'email', args: { options }, regex, address }); + }, + validate(value, helpers, { options }, { regex, address }) { + + const emails = options.multiple ? 
value.split(regex) : [value]; + const invalids = []; + for (const email of emails) { + if (!Email.isValid(email, address)) { + invalids.push(email); + } + } + + if (!invalids.length) { + return value; + } + + return helpers.error('string.email', { value, invalids }); + } + }, + + guid: { + alias: 'uuid', + method(options = {}) { + + Common.assertOptions(options, ['version', 'separator']); + + let versionNumbers = ''; + + if (options.version) { + const versions = [].concat(options.version); + + Assert(versions.length >= 1, 'version must have at least 1 valid version specified'); + const set = new Set(); + + for (let i = 0; i < versions.length; ++i) { + const version = versions[i]; + Assert(typeof version === 'string', 'version at position ' + i + ' must be a string'); + const versionNumber = internals.guidVersions[version.toLowerCase()]; + Assert(versionNumber, 'version at position ' + i + ' must be one of ' + Object.keys(internals.guidVersions).join(', ')); + Assert(!set.has(versionNumber), 'version at position ' + i + ' must not be a duplicate'); + + versionNumbers += versionNumber; + set.add(versionNumber); + } + } + + Assert(internals.guidSeparators.has(options.separator), 'separator must be one of true, false, "-", or ":"'); + const separator = options.separator === undefined ? '[:-]?' : + options.separator === true ? '[:-]' : + options.separator === false ? '[]?' : `\\${options.separator}`; + + const regex = new RegExp(`^([\\[{\\(]?)[0-9A-F]{8}(${separator})[0-9A-F]{4}\\2?[${versionNumbers || '0-9A-F'}][0-9A-F]{3}\\2?[${versionNumbers ? '89AB' : '0-9A-F'}][0-9A-F]{3}\\2?[0-9A-F]{12}([\\]}\\)]?)$`, 'i'); + + return this.$_addRule({ name: 'guid', args: { options }, regex }); + }, + validate(value, helpers, args, { regex }) { + + const results = regex.exec(value); + + if (!results) { + return helpers.error('string.guid'); + } + + // Matching braces + + if (internals.guidBrackets[results[1]] !== results[results.length - 1]) { + return helpers.error('string.guid'); + } + + return value; + } + }, + + hex: { + method(options = {}) { + + Common.assertOptions(options, ['byteAligned']); + + options = { byteAligned: false, ...options }; + Assert(typeof options.byteAligned === 'boolean', 'byteAligned must be boolean'); + + return this.$_addRule({ name: 'hex', args: { options } }); + }, + validate(value, helpers, { options }) { + + if (!internals.hexRegex.test(value)) { + return helpers.error('string.hex'); + } + + if (options.byteAligned && + value.length % 2 !== 0) { + + return helpers.error('string.hexAlign'); + } + + return value; + } + }, + + hostname: { + method() { + + return this.$_addRule('hostname'); + }, + validate(value, helpers) { + + if (Domain.isValid(value, { minDomainSegments: 1 }) || + internals.ipRegex.test(value)) { + + return value; + } + + return helpers.error('string.hostname'); + } + }, + + insensitive: { + method() { + + return this.$_setFlag('insensitive', true); + } + }, + + ip: { + method(options = {}) { + + Common.assertOptions(options, ['cidr', 'version']); + + const { cidr, versions, regex } = Ip.regex(options); + const version = options.version ? 
versions : undefined; + return this.$_addRule({ name: 'ip', args: { options: { cidr, version } }, regex }); + }, + validate(value, helpers, { options }, { regex }) { + + if (regex.test(value)) { + return value; + } + + if (options.version) { + return helpers.error('string.ipVersion', { value, cidr: options.cidr, version: options.version }); + } + + return helpers.error('string.ip', { value, cidr: options.cidr }); + } + }, + + isoDate: { + method() { + + return this.$_addRule('isoDate'); + }, + validate(value, { error }) { + + if (internals.isoDate(value)) { + return value; + } + + return error('string.isoDate'); + } + }, + + isoDuration: { + method() { + + return this.$_addRule('isoDuration'); + }, + validate(value, helpers) { + + if (internals.isoDurationRegex.test(value)) { + return value; + } + + return helpers.error('string.isoDuration'); + } + }, + + length: { + method(limit, encoding) { + + return internals.length(this, 'length', limit, '=', encoding); + }, + validate(value, helpers, { limit, encoding }, { name, operator, args }) { + + const length = encoding ? Buffer && Buffer.byteLength(value, encoding) : value.length; // $lab:coverage:ignore$ + if (Common.compare(length, limit, operator)) { + return value; + } + + return helpers.error('string.' + name, { limit: args.limit, value, encoding }); + }, + args: [ + { + name: 'limit', + ref: true, + assert: Common.limit, + message: 'must be a positive integer' + }, + 'encoding' + ] + }, + + lowercase: { + method() { + + return this.case('lower'); + } + }, + + max: { + method(limit, encoding) { + + return internals.length(this, 'max', limit, '<=', encoding); + }, + args: ['limit', 'encoding'] + }, + + min: { + method(limit, encoding) { + + return internals.length(this, 'min', limit, '>=', encoding); + }, + args: ['limit', 'encoding'] + }, + + normalize: { + method(form = 'NFC') { + + Assert(internals.normalizationForms.includes(form), 'normalization form must be one of ' + internals.normalizationForms.join(', ')); + + return this.$_addRule({ name: 'normalize', args: { form } }); + }, + validate(value, { error }, { form }) { + + if (value === value.normalize(form)) { + return value; + } + + return error('string.normalize', { value, form }); + }, + convert: true + }, + + pattern: { + alias: 'regex', + method(regex, options = {}) { + + Assert(regex instanceof RegExp, 'regex must be a RegExp'); + Assert(!regex.flags.includes('g') && !regex.flags.includes('y'), 'regex should not use global or sticky mode'); + + if (typeof options === 'string') { + options = { name: options }; + } + + Common.assertOptions(options, ['invert', 'name']); + + const errorCode = ['string.pattern', options.invert ? '.invert' : '', options.name ? 
'.name' : '.base'].join(''); + return this.$_addRule({ name: 'pattern', args: { regex, options }, errorCode }); + }, + validate(value, helpers, { regex, options }, { errorCode }) { + + const patternMatch = regex.test(value); + + if (patternMatch ^ options.invert) { + return value; + } + + return helpers.error(errorCode, { name: options.name, regex, value }); + }, + args: ['regex', 'options'], + multi: true + }, + + replace: { + method(pattern, replacement) { + + if (typeof pattern === 'string') { + pattern = new RegExp(EscapeRegex(pattern), 'g'); + } + + Assert(pattern instanceof RegExp, 'pattern must be a RegExp'); + Assert(typeof replacement === 'string', 'replacement must be a String'); + + const obj = this.clone(); + + if (!obj.$_terms.replacements) { + obj.$_terms.replacements = []; + } + + obj.$_terms.replacements.push({ pattern, replacement }); + return obj; + } + }, + + token: { + method() { + + return this.$_addRule('token'); + }, + validate(value, helpers) { + + if (/^\w+$/.test(value)) { + return value; + } + + return helpers.error('string.token'); + } + }, + + trim: { + method(enabled = true) { + + Assert(typeof enabled === 'boolean', 'enabled must be a boolean'); + + return this.$_addRule({ name: 'trim', args: { enabled } }); + }, + validate(value, helpers, { enabled }) { + + if (!enabled || + value === value.trim()) { + + return value; + } + + return helpers.error('string.trim'); + }, + convert: true + }, + + truncate: { + method(enabled = true) { + + Assert(typeof enabled === 'boolean', 'enabled must be a boolean'); + + return this.$_setFlag('truncate', enabled); + } + }, + + uppercase: { + method() { + + return this.case('upper'); + } + }, + + uri: { + method(options = {}) { + + Common.assertOptions(options, ['allowRelative', 'allowQuerySquareBrackets', 'domain', 'relativeOnly', 'scheme']); + + if (options.domain) { + Common.assertOptions(options.domain, ['allowUnicode', 'maxDomainSegments', 'minDomainSegments', 'tlds']); + } + + const { regex, scheme } = Uri.regex(options); + const domain = options.domain ? 
internals.addressOptions(options.domain) : null; + return this.$_addRule({ name: 'uri', args: { options }, regex, domain, scheme }); + }, + validate(value, helpers, { options }, { regex, domain, scheme }) { + + if (['http:/', 'https:/'].includes(value)) { // scheme:/ is technically valid but makes no sense + return helpers.error('string.uri'); + } + + const match = regex.exec(value); + if (match) { + const matched = match[1] || match[2]; + if (domain && + (!options.allowRelative || matched) && + !Domain.isValid(matched, domain)) { + + return helpers.error('string.domain', { value: matched }); + } + + return value; + } + + if (options.relativeOnly) { + return helpers.error('string.uriRelativeOnly'); + } + + if (options.scheme) { + return helpers.error('string.uriCustomScheme', { scheme, value }); + } + + return helpers.error('string.uri'); + } + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.replacements) { + for (const { pattern, replacement } of desc.replacements) { + obj = obj.replace(pattern, replacement); + } + } + + return obj; + } + }, + + messages: { + 'string.alphanum': '{{#label}} must only contain alpha-numeric characters', + 'string.base': '{{#label}} must be a string', + 'string.base64': '{{#label}} must be a valid base64 string', + 'string.creditCard': '{{#label}} must be a credit card', + 'string.dataUri': '{{#label}} must be a valid dataUri string', + 'string.domain': '{{#label}} must contain a valid domain name', + 'string.email': '{{#label}} must be a valid email', + 'string.empty': '{{#label}} is not allowed to be empty', + 'string.guid': '{{#label}} must be a valid GUID', + 'string.hex': '{{#label}} must only contain hexadecimal characters', + 'string.hexAlign': '{{#label}} hex decoded representation must be byte aligned', + 'string.hostname': '{{#label}} must be a valid hostname', + 'string.ip': '{{#label}} must be a valid ip address with a {{#cidr}} CIDR', + 'string.ipVersion': '{{#label}} must be a valid ip address of one of the following versions {{#version}} with a {{#cidr}} CIDR', + 'string.isoDate': '{{#label}} must be in iso format', + 'string.isoDuration': '{{#label}} must be a valid ISO 8601 duration', + 'string.length': '{{#label}} length must be {{#limit}} characters long', + 'string.lowercase': '{{#label}} must only contain lowercase characters', + 'string.max': '{{#label}} length must be less than or equal to {{#limit}} characters long', + 'string.min': '{{#label}} length must be at least {{#limit}} characters long', + 'string.normalize': '{{#label}} must be unicode normalized in the {{#form}} form', + 'string.token': '{{#label}} must only contain alpha-numeric and underscore characters', + 'string.pattern.base': '{{#label}} with value {:[.]} fails to match the required pattern: {{#regex}}', + 'string.pattern.name': '{{#label}} with value {:[.]} fails to match the {{#name}} pattern', + 'string.pattern.invert.base': '{{#label}} with value {:[.]} matches the inverted pattern: {{#regex}}', + 'string.pattern.invert.name': '{{#label}} with value {:[.]} matches the inverted {{#name}} pattern', + 'string.trim': '{{#label}} must not have leading or trailing whitespace', + 'string.uri': '{{#label}} must be a valid uri', + 'string.uriCustomScheme': '{{#label}} must be a valid uri with a scheme matching the {{#scheme}} pattern', + 'string.uriRelativeOnly': '{{#label}} must be a valid relative uri', + 'string.uppercase': '{{#label}} must only contain uppercase characters' + } +}); + + +// Helpers + +internals.addressOptions = function (options) { + + if 
(!options) { + return options; + } + + // minDomainSegments + + Assert(options.minDomainSegments === undefined || + Number.isSafeInteger(options.minDomainSegments) && options.minDomainSegments > 0, 'minDomainSegments must be a positive integer'); + + // maxDomainSegments + + Assert(options.maxDomainSegments === undefined || + Number.isSafeInteger(options.maxDomainSegments) && options.maxDomainSegments > 0, 'maxDomainSegments must be a positive integer'); + + // tlds + + if (options.tlds === false) { + return options; + } + + if (options.tlds === true || + options.tlds === undefined) { + + Assert(internals.tlds, 'Built-in TLD list disabled'); + return Object.assign({}, options, internals.tlds); + } + + Assert(typeof options.tlds === 'object', 'tlds must be true, false, or an object'); + + const deny = options.tlds.deny; + if (deny) { + if (Array.isArray(deny)) { + options = Object.assign({}, options, { tlds: { deny: new Set(deny) } }); + } + + Assert(options.tlds.deny instanceof Set, 'tlds.deny must be an array, Set, or boolean'); + Assert(!options.tlds.allow, 'Cannot specify both tlds.allow and tlds.deny lists'); + internals.validateTlds(options.tlds.deny, 'tlds.deny'); + return options; + } + + const allow = options.tlds.allow; + if (!allow) { + return options; + } + + if (allow === true) { + Assert(internals.tlds, 'Built-in TLD list disabled'); + return Object.assign({}, options, internals.tlds); + } + + if (Array.isArray(allow)) { + options = Object.assign({}, options, { tlds: { allow: new Set(allow) } }); + } + + Assert(options.tlds.allow instanceof Set, 'tlds.allow must be an array, Set, or boolean'); + internals.validateTlds(options.tlds.allow, 'tlds.allow'); + return options; +}; + + +internals.validateTlds = function (set, source) { + + for (const tld of set) { + Assert(Domain.isValid(tld, { minDomainSegments: 1, maxDomainSegments: 1 }), `${source} must contain valid top level domain names`); + } +}; + + +internals.isoDate = function (value) { + + if (!Common.isIsoDate(value)) { + return null; + } + + if (/.*T.*[+-]\d\d$/.test(value)) { // Add missing trailing zeros to timeshift + value += '00'; + } + + const date = new Date(value); + if (isNaN(date.getTime())) { + return null; + } + + return date.toISOString(); +}; + + +internals.length = function (schema, name, limit, operator, encoding) { + + Assert(!encoding || Buffer && Buffer.isEncoding(encoding), 'Invalid encoding:', encoding); // $lab:coverage:ignore$ + + return schema.$_addRule({ name, method: 'length', args: { limit, encoding }, operator }); +}; diff --git a/node_modules/joi/lib/types/symbol.js b/node_modules/joi/lib/types/symbol.js new file mode 100644 index 000000000..eafe9ae53 --- /dev/null +++ b/node_modules/joi/lib/types/symbol.js @@ -0,0 +1,102 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); + +const Any = require('./any'); + + +const internals = {}; + + +internals.Map = class extends Map { + + slice() { + + return new internals.Map(this); + } +}; + + +module.exports = Any.extend({ + + type: 'symbol', + + terms: { + + map: { init: new internals.Map() } + }, + + coerce: { + method(value, { schema, error }) { + + const lookup = schema.$_terms.map.get(value); + if (lookup) { + value = lookup; + } + + if (!schema._flags.only || + typeof value === 'symbol') { + + return { value }; + } + + return { value, errors: error('symbol.map', { map: schema.$_terms.map }) }; + } + }, + + validate(value, { error }) { + + if (typeof value !== 'symbol') { + return { value, errors: error('symbol.base') }; + } + }, 
+ + rules: { + map: { + method(iterable) { + + if (iterable && + !iterable[Symbol.iterator] && + typeof iterable === 'object') { + + iterable = Object.entries(iterable); + } + + Assert(iterable && iterable[Symbol.iterator], 'Iterable must be an iterable or object'); + + const obj = this.clone(); + + const symbols = []; + for (const entry of iterable) { + Assert(entry && entry[Symbol.iterator], 'Entry must be an iterable'); + const [key, value] = entry; + + Assert(typeof key !== 'object' && typeof key !== 'function' && typeof key !== 'symbol', 'Key must not be of type object, function, or Symbol'); + Assert(typeof value === 'symbol', 'Value must be a Symbol'); + + obj.$_terms.map.set(key, value); + symbols.push(value); + } + + return obj.valid(...symbols); + } + } + }, + + manifest: { + + build(obj, desc) { + + if (desc.map) { + obj = obj.map(desc.map); + } + + return obj; + } + }, + + messages: { + 'symbol.base': '{{#label}} must be a symbol', + 'symbol.map': '{{#label}} must be one of {{#map}}' + } +}); diff --git a/node_modules/joi/lib/validator.js b/node_modules/joi/lib/validator.js new file mode 100644 index 000000000..c7fe231e0 --- /dev/null +++ b/node_modules/joi/lib/validator.js @@ -0,0 +1,650 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const Clone = require('@hapi/hoek/lib/clone'); +const Ignore = require('@hapi/hoek/lib/ignore'); +const Reach = require('@hapi/hoek/lib/reach'); + +const Common = require('./common'); +const Errors = require('./errors'); +const State = require('./state'); + + +const internals = { + result: Symbol('result') +}; + + +exports.entry = function (value, schema, prefs) { + + let settings = Common.defaults; + if (prefs) { + Assert(prefs.warnings === undefined, 'Cannot override warnings preference in synchronous validation'); + Assert(prefs.artifacts === undefined, 'Cannot override artifacts preference in synchronous validation'); + settings = Common.preferences(Common.defaults, prefs); + } + + const result = internals.entry(value, schema, settings); + Assert(!result.mainstay.externals.length, 'Schema with external rules must use validateAsync()'); + const outcome = { value: result.value }; + + if (result.error) { + outcome.error = result.error; + } + + if (result.mainstay.warnings.length) { + outcome.warning = Errors.details(result.mainstay.warnings); + } + + if (result.mainstay.debug) { + outcome.debug = result.mainstay.debug; + } + + if (result.mainstay.artifacts) { + outcome.artifacts = result.mainstay.artifacts; + } + + return outcome; +}; + + +exports.entryAsync = async function (value, schema, prefs) { + + let settings = Common.defaults; + if (prefs) { + settings = Common.preferences(Common.defaults, prefs); + } + + const result = internals.entry(value, schema, settings); + const mainstay = result.mainstay; + if (result.error) { + if (mainstay.debug) { + result.error.debug = mainstay.debug; + } + + throw result.error; + } + + if (mainstay.externals.length) { + let root = result.value; + for (const { method, path, label } of mainstay.externals) { + let node = root; + let key; + let parent; + + if (path.length) { + key = path[path.length - 1]; + parent = Reach(root, path.slice(0, -1)); + node = parent[key]; + } + + try { + const output = await method(node, { prefs }); + if (output === undefined || + output === node) { + + continue; + } + + if (parent) { + parent[key] = output; + } + else { + root = output; + } + } + catch (err) { + err.message += ` (${label})`; // Change message to include path + throw err; + } + } + + 
result.value = root; + } + + if (!settings.warnings && + !settings.debug && + !settings.artifacts) { + + return result.value; + } + + const outcome = { value: result.value }; + if (mainstay.warnings.length) { + outcome.warning = Errors.details(mainstay.warnings); + } + + if (mainstay.debug) { + outcome.debug = mainstay.debug; + } + + if (mainstay.artifacts) { + outcome.artifacts = mainstay.artifacts; + } + + return outcome; +}; + + +internals.entry = function (value, schema, prefs) { + + // Prepare state + + const { tracer, cleanup } = internals.tracer(schema, prefs); + const debug = prefs.debug ? [] : null; + const links = schema._ids._schemaChain ? new Map() : null; + const mainstay = { externals: [], warnings: [], tracer, debug, links }; + const schemas = schema._ids._schemaChain ? [{ schema }] : null; + const state = new State([], [], { mainstay, schemas }); + + // Validate value + + const result = exports.validate(value, schema, state, prefs); + + // Process value and errors + + if (cleanup) { + schema.$_root.untrace(); + } + + const error = Errors.process(result.errors, value, prefs); + return { value: result.value, error, mainstay }; +}; + + +internals.tracer = function (schema, prefs) { + + if (schema.$_root._tracer) { + return { tracer: schema.$_root._tracer._register(schema) }; + } + + if (prefs.debug) { + Assert(schema.$_root.trace, 'Debug mode not supported'); + return { tracer: schema.$_root.trace()._register(schema), cleanup: true }; + } + + return { tracer: internals.ignore }; +}; + + +exports.validate = function (value, schema, state, prefs, overrides = {}) { + + if (schema.$_terms.whens) { + schema = schema._generate(value, state, prefs).schema; + } + + // Setup state and settings + + if (schema._preferences) { + prefs = internals.prefs(schema, prefs); + } + + // Cache + + if (schema._cache && + prefs.cache) { + + const result = schema._cache.get(value); + state.mainstay.tracer.debug(state, 'validate', 'cached', !!result); + if (result) { + return result; + } + } + + // Helpers + + const createError = (code, local, localState) => schema.$_createError(code, value, local, localState || state, prefs); + const helpers = { + original: value, + prefs, + schema, + state, + error: createError, + errorsArray: internals.errorsArray, + warn: (code, local, localState) => state.mainstay.warnings.push(createError(code, local, localState)), + message: (messages, local) => schema.$_createError('custom', value, local, state, prefs, { messages }) + }; + + // Prepare + + state.mainstay.tracer.entry(schema, state); + + const def = schema._definition; + if (def.prepare && + value !== undefined && + prefs.convert) { + + const prepared = def.prepare(value, helpers); + if (prepared) { + state.mainstay.tracer.value(state, 'prepare', value, prepared.value); + if (prepared.errors) { + return internals.finalize(prepared.value, [].concat(prepared.errors), helpers); // Prepare error always aborts early + } + + value = prepared.value; + } + } + + // Type coercion + + if (def.coerce && + value !== undefined && + prefs.convert && + (!def.coerce.from || def.coerce.from.includes(typeof value))) { + + const coerced = def.coerce.method(value, helpers); + if (coerced) { + state.mainstay.tracer.value(state, 'coerced', value, coerced.value); + if (coerced.errors) { + return internals.finalize(coerced.value, [].concat(coerced.errors), helpers); // Coerce error always aborts early + } + + value = coerced.value; + } + } + + // Empty value + + const empty = schema._flags.empty; + if (empty && + 
empty.$_match(internals.trim(value, schema), state.nest(empty), Common.defaults)) { + + state.mainstay.tracer.value(state, 'empty', value, undefined); + value = undefined; + } + + // Presence requirements (required, optional, forbidden) + + const presence = overrides.presence || schema._flags.presence || (schema._flags._endedSwitch ? null : prefs.presence); + if (value === undefined) { + if (presence === 'forbidden') { + return internals.finalize(value, null, helpers); + } + + if (presence === 'required') { + return internals.finalize(value, [schema.$_createError('any.required', value, null, state, prefs)], helpers); + } + + if (presence === 'optional') { + if (schema._flags.default !== Common.symbols.deepDefault) { + return internals.finalize(value, null, helpers); + } + + state.mainstay.tracer.value(state, 'default', value, {}); + value = {}; + } + } + else if (presence === 'forbidden') { + return internals.finalize(value, [schema.$_createError('any.unknown', value, null, state, prefs)], helpers); + } + + // Allowed values + + const errors = []; + + if (schema._valids) { + const match = schema._valids.get(value, state, prefs, schema._flags.insensitive); + if (match) { + if (prefs.convert) { + state.mainstay.tracer.value(state, 'valids', value, match.value); + value = match.value; + } + + state.mainstay.tracer.filter(schema, state, 'valid', match); + return internals.finalize(value, null, helpers); + } + + if (schema._flags.only) { + const report = schema.$_createError('any.only', value, { valids: schema._valids.values({ display: true }) }, state, prefs); + if (prefs.abortEarly) { + return internals.finalize(value, [report], helpers); + } + + errors.push(report); + } + } + + // Denied values + + if (schema._invalids) { + const match = schema._invalids.get(value, state, prefs, schema._flags.insensitive); + if (match) { + state.mainstay.tracer.filter(schema, state, 'invalid', match); + const report = schema.$_createError('any.invalid', value, { invalids: schema._invalids.values({ display: true }) }, state, prefs); + if (prefs.abortEarly) { + return internals.finalize(value, [report], helpers); + } + + errors.push(report); + } + } + + // Base type + + if (def.validate) { + const base = def.validate(value, helpers); + if (base) { + state.mainstay.tracer.value(state, 'base', value, base.value); + value = base.value; + + if (base.errors) { + if (!Array.isArray(base.errors)) { + errors.push(base.errors); + return internals.finalize(value, errors, helpers); // Base error always aborts early + } + + if (base.errors.length) { + errors.push(...base.errors); + return internals.finalize(value, errors, helpers); // Base error always aborts early + } + } + } + } + + // Validate tests + + if (!schema._rules.length) { + return internals.finalize(value, errors, helpers); + } + + return internals.rules(value, errors, helpers); +}; + + +internals.rules = function (value, errors, helpers) { + + const { schema, state, prefs } = helpers; + + for (const rule of schema._rules) { + const definition = schema._definition.rules[rule.method]; + + // Skip rules that are also applied in coerce step + + if (definition.convert && + prefs.convert) { + + state.mainstay.tracer.log(schema, state, 'rule', rule.name, 'full'); + continue; + } + + // Resolve references + + let ret; + let args = rule.args; + if (rule._resolve.length) { + args = Object.assign({}, args); // Shallow copy + for (const key of rule._resolve) { + const resolver = definition.argsByName.get(key); + + const resolved = args[key].resolve(value, state, prefs); 
+ const normalized = resolver.normalize ? resolver.normalize(resolved) : resolved; + + const invalid = Common.validateArg(normalized, null, resolver); + if (invalid) { + ret = schema.$_createError('any.ref', resolved, { arg: key, ref: args[key], reason: invalid }, state, prefs); + break; + } + + args[key] = normalized; + } + } + + // Test rule + + ret = ret || definition.validate(value, helpers, args, rule); // Use ret if already set to reference error + + const result = internals.rule(ret, rule); + if (result.errors) { + state.mainstay.tracer.log(schema, state, 'rule', rule.name, 'error'); + + if (rule.warn) { + state.mainstay.warnings.push(...result.errors); + continue; + } + + if (prefs.abortEarly) { + return internals.finalize(value, result.errors, helpers); + } + + errors.push(...result.errors); + } + else { + state.mainstay.tracer.log(schema, state, 'rule', rule.name, 'pass'); + state.mainstay.tracer.value(state, 'rule', value, result.value, rule.name); + value = result.value; + } + } + + return internals.finalize(value, errors, helpers); +}; + + +internals.rule = function (ret, rule) { + + if (ret instanceof Errors.Report) { + internals.error(ret, rule); + return { errors: [ret], value: null }; + } + + if (Array.isArray(ret) && + ret[Common.symbols.errors]) { + + ret.forEach((report) => internals.error(report, rule)); + return { errors: ret, value: null }; + } + + return { errors: null, value: ret }; +}; + + +internals.error = function (report, rule) { + + if (rule.message) { + report._setTemplate(rule.message); + } + + return report; +}; + + +internals.finalize = function (value, errors, helpers) { + + errors = errors || []; + const { schema, state, prefs } = helpers; + + // Failover value + + if (errors.length) { + const failover = internals.default('failover', undefined, errors, helpers); + if (failover !== undefined) { + state.mainstay.tracer.value(state, 'failover', value, failover); + value = failover; + errors = []; + } + } + + // Error override + + if (errors.length && + schema._flags.error) { + + if (typeof schema._flags.error === 'function') { + errors = schema._flags.error(errors); + if (!Array.isArray(errors)) { + errors = [errors]; + } + + for (const error of errors) { + Assert(error instanceof Error || error instanceof Errors.Report, 'error() must return an Error object'); + } + } + else { + errors = [schema._flags.error]; + } + } + + // Default + + if (value === undefined) { + const defaulted = internals.default('default', value, errors, helpers); + state.mainstay.tracer.value(state, 'default', value, defaulted); + value = defaulted; + } + + // Cast + + if (schema._flags.cast && + value !== undefined) { + + const caster = schema._definition.cast[schema._flags.cast]; + if (caster.from(value)) { + const casted = caster.to(value, helpers); + state.mainstay.tracer.value(state, 'cast', value, casted, schema._flags.cast); + value = casted; + } + } + + // Externals + + if (schema.$_terms.externals && + prefs.externals && + prefs._externals !== false) { // Disabled for matching + + for (const { method } of schema.$_terms.externals) { + state.mainstay.externals.push({ method, path: state.path, label: Errors.label(schema._flags, state, prefs) }); + } + } + + // Result + + const result = { value, errors: errors.length ? errors : null }; + + if (schema._flags.result) { + result.value = schema._flags.result === 'strip' ? 
undefined : /* raw */ helpers.original; + state.mainstay.tracer.value(state, schema._flags.result, value, result.value); + state.shadow(value, schema._flags.result); + } + + // Cache + + if (schema._cache && + prefs.cache !== false && + !schema._refs.length) { + + schema._cache.set(helpers.original, result); + } + + // Artifacts + + if (value !== undefined && + !result.errors && + schema._flags.artifact !== undefined) { + + state.mainstay.artifacts = state.mainstay.artifacts || new Map(); + if (!state.mainstay.artifacts.has(schema._flags.artifact)) { + state.mainstay.artifacts.set(schema._flags.artifact, []); + } + + state.mainstay.artifacts.get(schema._flags.artifact).push(state.path); + } + + return result; +}; + + +internals.prefs = function (schema, prefs) { + + const isDefaultOptions = prefs === Common.defaults; + if (isDefaultOptions && + schema._preferences[Common.symbols.prefs]) { + + return schema._preferences[Common.symbols.prefs]; + } + + prefs = Common.preferences(prefs, schema._preferences); + if (isDefaultOptions) { + schema._preferences[Common.symbols.prefs] = prefs; + } + + return prefs; +}; + + +internals.default = function (flag, value, errors, helpers) { + + const { schema, state, prefs } = helpers; + const source = schema._flags[flag]; + if (prefs.noDefaults || + source === undefined) { + + return value; + } + + state.mainstay.tracer.log(schema, state, 'rule', flag, 'full'); + + if (!source) { + return source; + } + + if (typeof source === 'function') { + const args = source.length ? [Clone(state.ancestors[0]), helpers] : []; + + try { + return source(...args); + } + catch (err) { + errors.push(schema.$_createError(`any.${flag}`, null, { error: err }, state, prefs)); + return; + } + } + + if (typeof source !== 'object') { + return source; + } + + if (source[Common.symbols.literal]) { + return source.literal; + } + + if (Common.isResolvable(source)) { + return source.resolve(value, state, prefs); + } + + return Clone(source); +}; + + +internals.trim = function (value, schema) { + + if (typeof value !== 'string') { + return value; + } + + const trim = schema.$_getRule('trim'); + if (!trim || + !trim.args.enabled) { + + return value; + } + + return value.trim(); +}; + + +internals.ignore = { + active: false, + debug: Ignore, + entry: Ignore, + filter: Ignore, + log: Ignore, + resolve: Ignore, + value: Ignore +}; + + +internals.errorsArray = function () { + + const errors = []; + errors[Common.symbols.errors] = true; + return errors; +}; diff --git a/node_modules/joi/lib/values.js b/node_modules/joi/lib/values.js new file mode 100644 index 000000000..bfdb90b41 --- /dev/null +++ b/node_modules/joi/lib/values.js @@ -0,0 +1,263 @@ +'use strict'; + +const Assert = require('@hapi/hoek/lib/assert'); +const DeepEqual = require('@hapi/hoek/lib/deepEqual'); + +const Common = require('./common'); + + +const internals = {}; + + +module.exports = internals.Values = class { + + constructor(values, refs) { + + this._values = new Set(values); + this._refs = new Set(refs); + this._lowercase = internals.lowercases(values); + + this._override = false; + } + + get length() { + + return this._values.size + this._refs.size; + } + + add(value, refs) { + + // Reference + + if (Common.isResolvable(value)) { + if (!this._refs.has(value)) { + this._refs.add(value); + + if (refs) { // Skipped in a merge + refs.register(value); + } + } + + return; + } + + // Value + + if (!this.has(value, null, null, false)) { + this._values.add(value); + + if (typeof value === 'string') { + 
this._lowercase.set(value.toLowerCase(), value); + } + } + } + + static merge(target, source, remove) { + + target = target || new internals.Values(); + + if (source) { + if (source._override) { + return source.clone(); + } + + for (const item of [...source._values, ...source._refs]) { + target.add(item); + } + } + + if (remove) { + for (const item of [...remove._values, ...remove._refs]) { + target.remove(item); + } + } + + return target.length ? target : null; + } + + remove(value) { + + // Reference + + if (Common.isResolvable(value)) { + this._refs.delete(value); + return; + } + + // Value + + this._values.delete(value); + + if (typeof value === 'string') { + this._lowercase.delete(value.toLowerCase()); + } + } + + has(value, state, prefs, insensitive) { + + return !!this.get(value, state, prefs, insensitive); + } + + get(value, state, prefs, insensitive) { + + if (!this.length) { + return false; + } + + // Simple match + + if (this._values.has(value)) { + return { value }; + } + + // Case insensitive string match + + if (typeof value === 'string' && + value && + insensitive) { + + const found = this._lowercase.get(value.toLowerCase()); + if (found) { + return { value: found }; + } + } + + if (!this._refs.size && + typeof value !== 'object') { + + return false; + } + + // Objects + + if (typeof value === 'object') { + for (const item of this._values) { + if (DeepEqual(item, value)) { + return { value: item }; + } + } + } + + // References + + if (state) { + for (const ref of this._refs) { + const resolved = ref.resolve(value, state, prefs, null, { in: true }); + if (resolved === undefined) { + continue; + } + + const items = !ref.in || typeof resolved !== 'object' + ? [resolved] + : Array.isArray(resolved) ? resolved : Object.keys(resolved); + + for (const item of items) { + if (typeof item !== typeof value) { + continue; + } + + if (insensitive && + value && + typeof value === 'string') { + + if (item.toLowerCase() === value.toLowerCase()) { + return { value: item, ref }; + } + } + else { + if (DeepEqual(item, value)) { + return { value: item, ref }; + } + } + } + } + } + + return false; + } + + override() { + + this._override = true; + } + + values(options) { + + if (options && + options.display) { + + const values = []; + + for (const item of [...this._values, ...this._refs]) { + if (item !== undefined) { + values.push(item); + } + } + + return values; + } + + return Array.from([...this._values, ...this._refs]); + } + + clone() { + + const set = new internals.Values(this._values, this._refs); + set._override = this._override; + return set; + } + + concat(source) { + + Assert(!source._override, 'Cannot concat override set of values'); + + const set = new internals.Values([...this._values, ...source._values], [...this._refs, ...source._refs]); + set._override = this._override; + return set; + } + + describe() { + + const normalized = []; + + if (this._override) { + normalized.push({ override: true }); + } + + for (const value of this._values.values()) { + normalized.push(value && typeof value === 'object' ? 
{ value } : value); + } + + for (const value of this._refs.values()) { + normalized.push(value.describe()); + } + + return normalized; + } +}; + + +internals.Values.prototype[Common.symbols.values] = true; + + +// Aliases + +internals.Values.prototype.slice = internals.Values.prototype.clone; + + +// Helpers + +internals.lowercases = function (from) { + + const map = new Map(); + + if (from) { + for (const value of from) { + if (typeof value === 'string') { + map.set(value.toLowerCase(), value); + } + } + } + + return map; +}; diff --git a/node_modules/joi/package.json b/node_modules/joi/package.json new file mode 100644 index 000000000..5b2e1ec9a --- /dev/null +++ b/node_modules/joi/package.json @@ -0,0 +1,70 @@ +{ + "_from": "joi", + "_id": "joi@17.4.2", + "_inBundle": false, + "_integrity": "sha512-Lm56PP+n0+Z2A2rfRvsfWVDXGEWjXxatPopkQ8qQ5mxCEhwHG+Ettgg5o98FFaxilOxozoa14cFhrE/hOzh/Nw==", + "_location": "/joi", + "_phantomChildren": {}, + "_requested": { + "type": "tag", + "registry": true, + "raw": "joi", + "name": "joi", + "escapedName": "joi", + "rawSpec": "", + "saveSpec": null, + "fetchSpec": "latest" + }, + "_requiredBy": [ + "#USER", + "/" + ], + "_resolved": "https://registry.npmjs.org/joi/-/joi-17.4.2.tgz", + "_shasum": "02f4eb5cf88e515e614830239379dcbbe28ce7f7", + "_spec": "joi", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project", + "browser": "dist/joi-browser.min.js", + "bugs": { + "url": "https://github.com/sideway/joi/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@hapi/hoek": "^9.0.0", + "@hapi/topo": "^5.0.0", + "@sideway/address": "^4.1.0", + "@sideway/formula": "^3.0.0", + "@sideway/pinpoint": "^2.0.0" + }, + "deprecated": false, + "description": "Object schema validation", + "devDependencies": { + "@hapi/bourne": "2.x.x", + "@hapi/code": "8.x.x", + "@hapi/joi-legacy-test": "npm:@hapi/joi@15.x.x", + "@hapi/lab": "24.x.x", + "typescript": "4.3.x" + }, + "files": [ + "lib/**/*", + "dist/*" + ], + "homepage": "https://github.com/sideway/joi#readme", + "keywords": [ + "schema", + "validation" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "joi", + "repository": { + "type": "git", + "url": "git://github.com/sideway/joi.git" + }, + "scripts": { + "prepublishOnly": "cd browser && npm install && npm run build", + "test": "lab -t 100 -a @hapi/code -L -Y", + "test-cov-html": "lab -r html -o coverage.html -a @hapi/code" + }, + "types": "lib/index.d.ts", + "version": "17.4.2" +} diff --git a/node_modules/json-buffer/.npmignore b/node_modules/json-buffer/.npmignore new file mode 100644 index 000000000..13abef4f5 --- /dev/null +++ b/node_modules/json-buffer/.npmignore @@ -0,0 +1,3 @@ +node_modules +node_modules/* +npm_debug.log diff --git a/node_modules/json-buffer/.travis.yml b/node_modules/json-buffer/.travis.yml new file mode 100644 index 000000000..244b7e88e --- /dev/null +++ b/node_modules/json-buffer/.travis.yml @@ -0,0 +1,3 @@ +language: node_js +node_js: + - '0.10' diff --git a/node_modules/json-buffer/LICENSE b/node_modules/json-buffer/LICENSE new file mode 100644 index 000000000..b799ec00c --- /dev/null +++ b/node_modules/json-buffer/LICENSE @@ -0,0 +1,22 @@ +Copyright (c) 2013 Dominic Tarr + +Permission is hereby granted, free of charge, +to any person obtaining a copy of this software and +associated documentation files (the "Software"), to +deal in the Software without restriction, including +without limitation the rights to use, copy, modify, +merge, publish, distribute, sublicense, and/or sell 
+copies of the Software, and to permit persons to whom +the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice +shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES +OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR +ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/json-buffer/README.md b/node_modules/json-buffer/README.md new file mode 100644 index 000000000..43857bb1a --- /dev/null +++ b/node_modules/json-buffer/README.md @@ -0,0 +1,24 @@ +# json-buffer + +JSON functions that can convert buffers! + +[![build status](https://secure.travis-ci.org/dominictarr/json-buffer.png)](http://travis-ci.org/dominictarr/json-buffer) + +[![testling badge](https://ci.testling.com/dominictarr/json-buffer.png)](https://ci.testling.com/dominictarr/json-buffer) + +JSON mangles buffers by converting to an array... +which isn't helpful. json-buffers converts to base64 instead, +and deconverts base64 to a buffer. + +``` js +var JSONB = require('json-buffer') +var Buffer = require('buffer').Buffer + +var str = JSONB.stringify(new Buffer('hello there!')) + +console.log(JSONB.parse(str)) //GET a BUFFER back +``` + +## License + +MIT diff --git a/node_modules/json-buffer/index.js b/node_modules/json-buffer/index.js new file mode 100644 index 000000000..9cafed821 --- /dev/null +++ b/node_modules/json-buffer/index.js @@ -0,0 +1,58 @@ +//TODO: handle reviver/dehydrate function like normal +//and handle indentation, like normal. +//if anyone needs this... please send pull request. + +exports.stringify = function stringify (o) { + if('undefined' == typeof o) return o + + if(o && Buffer.isBuffer(o)) + return JSON.stringify(':base64:' + o.toString('base64')) + + if(o && o.toJSON) + o = o.toJSON() + + if(o && 'object' === typeof o) { + var s = '' + var array = Array.isArray(o) + s = array ? '[' : '{' + var first = true + + for(var k in o) { + var ignore = 'function' == typeof o[k] || (!array && 'undefined' === typeof o[k]) + if(Object.hasOwnProperty.call(o, k) && !ignore) { + if(!first) + s += ',' + first = false + if (array) { + if(o[k] == undefined) + s += 'null' + else + s += stringify(o[k]) + } else if (o[k] !== void(0)) { + s += stringify(k) + ':' + stringify(o[k]) + } + } + } + + s += array ? ']' : '}' + + return s + } else if ('string' === typeof o) { + return JSON.stringify(/^:/.test(o) ? ':' + o : o) + } else if ('undefined' === typeof o) { + return 'null'; + } else + return JSON.stringify(o) +} + +exports.parse = function (s) { + return JSON.parse(s, function (key, value) { + if('string' === typeof value) { + if(/^:base64:/.test(value)) + return new Buffer(value.substring(8), 'base64') + else + return /^:/.test(value) ? 
value.substring(1) : value + } + return value + }) +} diff --git a/node_modules/json-buffer/package.json b/node_modules/json-buffer/package.json new file mode 100644 index 000000000..390856e6a --- /dev/null +++ b/node_modules/json-buffer/package.json @@ -0,0 +1,66 @@ +{ + "_from": "json-buffer@3.0.0", + "_id": "json-buffer@3.0.0", + "_inBundle": false, + "_integrity": "sha1-Wx85evx11ne96Lz8Dkfh+aPZqJg=", + "_location": "/json-buffer", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "json-buffer@3.0.0", + "name": "json-buffer", + "escapedName": "json-buffer", + "rawSpec": "3.0.0", + "saveSpec": null, + "fetchSpec": "3.0.0" + }, + "_requiredBy": [ + "/keyv" + ], + "_resolved": "https://registry.npmjs.org/json-buffer/-/json-buffer-3.0.0.tgz", + "_shasum": "5b1f397afc75d677bde8bcfc0e47e1f9a3d9a898", + "_spec": "json-buffer@3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\keyv", + "author": { + "name": "Dominic Tarr", + "email": "dominic.tarr@gmail.com", + "url": "http://dominictarr.com" + }, + "bugs": { + "url": "https://github.com/dominictarr/json-buffer/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "JSON parse & stringify that supports binary via bops & base64", + "devDependencies": { + "tape": "^4.6.3" + }, + "homepage": "https://github.com/dominictarr/json-buffer", + "license": "MIT", + "name": "json-buffer", + "repository": { + "type": "git", + "url": "git://github.com/dominictarr/json-buffer.git" + }, + "scripts": { + "test": "set -e; for t in test/*.js; do node $t; done" + }, + "testling": { + "files": "test/*.js", + "browsers": [ + "ie/8..latest", + "firefox/17..latest", + "firefox/nightly", + "chrome/22..latest", + "chrome/canary", + "opera/12..latest", + "opera/next", + "safari/5.1..latest", + "ipad/6.0..latest", + "iphone/6.0..latest", + "android-browser/4.2..latest" + ] + }, + "version": "3.0.0" +} diff --git a/node_modules/json-buffer/test/index.js b/node_modules/json-buffer/test/index.js new file mode 100644 index 000000000..83518042c --- /dev/null +++ b/node_modules/json-buffer/test/index.js @@ -0,0 +1,63 @@ + +var test = require('tape') +var _JSON = require('../') + +function clone (o) { + return JSON.parse(JSON.stringify(o)) +} + +var examples = { + simple: { foo: [], bar: {}, baz: new Buffer('some binary data') }, + just_buffer: new Buffer('JUST A BUFFER'), + all_types: { + string:'hello', + number: 3145, + null: null, + object: {}, + array: [], + boolean: true, + boolean2: false + }, + foo: new Buffer('foo'), + foo2: new Buffer('foo2'), + escape: { + buffer: new Buffer('x'), + string: _JSON.stringify(new Buffer('x')) + }, + escape2: { + buffer: new Buffer('x'), + string: ':base64:'+ new Buffer('x').toString('base64') + }, + undefined: { + empty: undefined, test: true + }, + undefined2: { + first: 1, empty: undefined, test: true + }, + undefinedArray: { + array: [undefined, 1, 'two'] + }, + fn: { + fn: function () {} + }, + undefined: undefined +} + +for(k in examples) +(function (value, k) { + test(k, function (t) { + var s = _JSON.stringify(value) + console.log('parse', s) + if(JSON.stringify(value) !== undefined) { + console.log(s) + var _value = _JSON.parse(s) + t.deepEqual(clone(_value), clone(value)) + } + else + t.equal(s, undefined) + t.end() + }) +})(examples[k], k) + + + diff --git a/node_modules/kareem/.travis.yml b/node_modules/kareem/.travis.yml new file mode 100644 index 000000000..0a388b901 --- /dev/null +++ 
b/node_modules/kareem/.travis.yml @@ -0,0 +1,12 @@ +language: node_js +node_js: + - "12" + - "10" + - "9" + - "8" + - "7" + - "6" + - "5" + - "4" +script: "npm run-script test-travis" +after_script: "npm install coveralls@2.10.0 && cat ./coverage/lcov.info | coveralls" diff --git a/node_modules/kareem/CHANGELOG.md b/node_modules/kareem/CHANGELOG.md new file mode 100644 index 000000000..b0cbb5221 --- /dev/null +++ b/node_modules/kareem/CHANGELOG.md @@ -0,0 +1,793 @@ +# Changelog + + +## 2.3.2 (2020-12-08) + +* fix: handle sync errors in pre hooks if there are multiple hooks + + +## 2.3.0 (2018-09-24) + +* chore(release): 2.2.3 ([c8f2695](https://github.com/vkarpov15/kareem/commit/c8f2695)) +* chore(release): 2.2.4 ([a377a4f](https://github.com/vkarpov15/kareem/commit/a377a4f)) +* chore(release): 2.2.5 ([5a495e3](https://github.com/vkarpov15/kareem/commit/5a495e3)) +* fix(filter): copy async pres correctly with `filter()` ([1b1ed8a](https://github.com/vkarpov15/kareem/commit/1b1ed8a)), closes [Automattic/mongoose#3054](https://github.com/Automattic/mongoose/issues/3054) +* feat: add filter() function ([1f641f4](https://github.com/vkarpov15/kareem/commit/1f641f4)) +* feat: support storing options on pre and post hooks ([59220b9](https://github.com/vkarpov15/kareem/commit/59220b9)) + + + + +## 2.2.3 (2018-09-10) + +* chore: release 2.2.3 ([af653a3](https://github.com/vkarpov15/kareem/commit/af653a3)) + + + + +## 2.2.2 (2018-09-10) + +* chore: release 2.2.2 ([3f0144d](https://github.com/vkarpov15/kareem/commit/3f0144d)) +* fix: allow merge() to not clone ([e628d65](https://github.com/vkarpov15/kareem/commit/e628d65)) + + + + +## 2.2.1 (2018-06-05) + +* chore: release 2.2.1 ([4625a64](https://github.com/vkarpov15/kareem/commit/4625a64)) +* chore: remove lockfile from git ([7f3e4e6](https://github.com/vkarpov15/kareem/commit/7f3e4e6)) +* fix: handle numAsync correctly when merging ([fef8e7e](https://github.com/vkarpov15/kareem/commit/fef8e7e)) +* test: repro issue with not copying numAsync ([952d9db](https://github.com/vkarpov15/kareem/commit/952d9db)) + + + + +## 2.2.0 (2018-06-05) + +* chore: release 2.2.0 ([ff9ad03](https://github.com/vkarpov15/kareem/commit/ff9ad03)) +* fix: use maps instead of objects for _pres and _posts so `toString()` doesn't get reported as having ([55df303](https://github.com/vkarpov15/kareem/commit/55df303)), closes [Automattic/mongoose#6538](https://github.com/Automattic/mongoose/issues/6538) + + + + +## 2.1.0 (2018-05-16) + +* chore: release 2.1.0 ([ba5f1bc](https://github.com/vkarpov15/kareem/commit/ba5f1bc)) +* feat: add option to check wrapped function return value for promises ([c9d7dd1](https://github.com/vkarpov15/kareem/commit/c9d7dd1)) +* refactor: use const in wrap() ([0fc21f9](https://github.com/vkarpov15/kareem/commit/0fc21f9)) + + + + +## 2.0.7 (2018-04-28) + +* chore: release 2.0.7 ([0bf91e6](https://github.com/vkarpov15/kareem/commit/0bf91e6)) +* feat: add `hasHooks()` ([225f18d](https://github.com/vkarpov15/kareem/commit/225f18d)), closes [Automattic/mongoose#6385](https://github.com/Automattic/mongoose/issues/6385) + + + + +## 2.0.6 (2018-03-22) + +* chore: release 2.0.6 ([f3d406b](https://github.com/vkarpov15/kareem/commit/f3d406b)) +* fix(wrap): ensure fast path still wraps function in `nextTick()` for chaining ([7000494](https://github.com/vkarpov15/kareem/commit/7000494)), closes [Automattic/mongoose#6250](https://github.com/Automattic/mongoose/issues/6250) [dsanel/mongoose-delete#36](https://github.com/dsanel/mongoose-delete/issues/36) + + + + 
+## 2.0.5 (2018-02-22) + +* chore: release 2.0.5 ([3286612](https://github.com/vkarpov15/kareem/commit/3286612)) +* perf(createWrapper): don't create wrapper if there are no hooks ([5afc5b9](https://github.com/vkarpov15/kareem/commit/5afc5b9)), closes [Automattic/mongoose#6126](https://github.com/Automattic/mongoose/issues/6126) + + + + +## 2.0.4 (2018-02-08) + +* chore: release 2.0.4 ([2ab0293](https://github.com/vkarpov15/kareem/commit/2ab0293)) + + + + +## 2.0.3 (2018-02-01) + +* chore: release 2.0.3 ([3c1abe5](https://github.com/vkarpov15/kareem/commit/3c1abe5)) +* fix: use process.nextTick() re: Automattic/mongoose#6074 ([e5bfe33](https://github.com/vkarpov15/kareem/commit/e5bfe33)), closes [Automattic/mongoose#6074](https://github.com/Automattic/mongoose/issues/6074) + + + + +## 2.0.2 (2018-01-24) + +* chore: fix license ([a9d755c](https://github.com/vkarpov15/kareem/commit/a9d755c)), closes [#10](https://github.com/vkarpov15/kareem/issues/10) +* chore: release 2.0.2 ([fe87ab6](https://github.com/vkarpov15/kareem/commit/fe87ab6)) + + + + +## 2.0.1 (2018-01-09) + +* chore: release 2.0.1 with lockfile bump ([09c44fb](https://github.com/vkarpov15/kareem/commit/09c44fb)) + + + + +## 2.0.0 (2018-01-09) + +* chore: bump marked re: security ([cc564a9](https://github.com/vkarpov15/kareem/commit/cc564a9)) +* chore: release 2.0.0 ([f511d1c](https://github.com/vkarpov15/kareem/commit/f511d1c)) + + + + +## 2.0.0-rc5 (2017-12-23) + +* chore: fix build on node 4+5 ([6dac5a4](https://github.com/vkarpov15/kareem/commit/6dac5a4)) +* chore: fix built on node 4 + 5 again ([434ef0a](https://github.com/vkarpov15/kareem/commit/434ef0a)) +* chore: release 2.0.0-rc5 ([25a32ee](https://github.com/vkarpov15/kareem/commit/25a32ee)) + + + + +## 2.0.0-rc4 (2017-12-22) + +* chore: release 2.0.0-rc4 ([49fc083](https://github.com/vkarpov15/kareem/commit/49fc083)) +* BREAKING CHANGE: deduplicate when merging hooks re: Automattic/mongoose#2945 ([d458573](https://github.com/vkarpov15/kareem/commit/d458573)), closes [Automattic/mongoose#2945](https://github.com/Automattic/mongoose/issues/2945) + + + + +## 2.0.0-rc3 (2017-12-22) + +* chore: release 2.0.0-rc3 ([adaaa00](https://github.com/vkarpov15/kareem/commit/adaaa00)) +* feat: support returning promises from middleware functions ([05b4480](https://github.com/vkarpov15/kareem/commit/05b4480)), closes [Automattic/mongoose#3779](https://github.com/Automattic/mongoose/issues/3779) + + + + +## 2.0.0-rc2 (2017-12-21) + +* chore: release 2.0.0-rc2 ([76325fa](https://github.com/vkarpov15/kareem/commit/76325fa)) +* fix: ensure next() and done() run in next tick ([6c20684](https://github.com/vkarpov15/kareem/commit/6c20684)) + + + + +## 2.0.0-rc1 (2017-12-21) + +* chore: improve test coverage re: Automattic/mongoose#3232 ([7b45cf0](https://github.com/vkarpov15/kareem/commit/7b45cf0)), closes [Automattic/mongoose#3232](https://github.com/Automattic/mongoose/issues/3232) +* chore: release 2.0.0-rc1 ([9b83f52](https://github.com/vkarpov15/kareem/commit/9b83f52)) +* BREAKING CHANGE: report sync exceptions as errors, only allow calling next() and done() once ([674adcc](https://github.com/vkarpov15/kareem/commit/674adcc)), closes [Automattic/mongoose#3483](https://github.com/Automattic/mongoose/issues/3483) + + + + +## 2.0.0-rc0 (2017-12-17) + +* chore: release 2.0.0-rc0 ([16b44b5](https://github.com/vkarpov15/kareem/commit/16b44b5)) +* BREAKING CHANGE: drop support for node < 4 ([9cbb8c7](https://github.com/vkarpov15/kareem/commit/9cbb8c7)) +* BREAKING CHANGE: remove useLegacyPost 
and add several new features ([6dd8531](https://github.com/vkarpov15/kareem/commit/6dd8531)), closes [Automattic/mongoose#3232](https://github.com/Automattic/mongoose/issues/3232) + + + + +## 1.5.0 (2017-07-20) + +* chore: release 1.5.0 ([9c491a0](https://github.com/vkarpov15/kareem/commit/9c491a0)) +* fix: improve post error handlers results ([9928dd5](https://github.com/vkarpov15/kareem/commit/9928dd5)), closes [Automattic/mongoose#5466](https://github.com/Automattic/mongoose/issues/5466) + + + + +## 1.4.2 (2017-07-06) + +* chore: release 1.4.2 ([8d14ac5](https://github.com/vkarpov15/kareem/commit/8d14ac5)) +* fix: correct args re: Automattic/mongoose#5405 ([3f28ae6](https://github.com/vkarpov15/kareem/commit/3f28ae6)), closes [Automattic/mongoose#5405](https://github.com/Automattic/mongoose/issues/5405) + + + + +## 1.4.1 (2017-04-25) + +* chore: release 1.4.1 ([5ecf0c2](https://github.com/vkarpov15/kareem/commit/5ecf0c2)) +* fix: handle numAsyncPres with clone() ([c72e857](https://github.com/vkarpov15/kareem/commit/c72e857)), closes [#8](https://github.com/vkarpov15/kareem/issues/8) +* test: repro #8 ([9b4d6b2](https://github.com/vkarpov15/kareem/commit/9b4d6b2)), closes [#8](https://github.com/vkarpov15/kareem/issues/8) + + + + +## 1.4.0 (2017-04-19) + +* chore: release 1.4.0 ([101c5f5](https://github.com/vkarpov15/kareem/commit/101c5f5)) +* feat: add merge() function ([285325e](https://github.com/vkarpov15/kareem/commit/285325e)) + + + + +## 1.3.0 (2017-03-26) + +* chore: release 1.3.0 ([f3a9e50](https://github.com/vkarpov15/kareem/commit/f3a9e50)) +* feat: pass function args to execPre ([4dd466d](https://github.com/vkarpov15/kareem/commit/4dd466d)) + + + + +## 1.2.1 (2017-02-03) + +* chore: release 1.2.1 ([d97081f](https://github.com/vkarpov15/kareem/commit/d97081f)) +* fix: filter out _kareemIgnored args for error handlers re: Automattic/mongoose#4925 ([ddc7aeb](https://github.com/vkarpov15/kareem/commit/ddc7aeb)), closes [Automattic/mongoose#4925](https://github.com/Automattic/mongoose/issues/4925) +* fix: make error handlers handle errors in pre hooks ([af38033](https://github.com/vkarpov15/kareem/commit/af38033)), closes [Automattic/mongoose#4927](https://github.com/Automattic/mongoose/issues/4927) + + + + +## 1.2.0 (2017-01-02) + +* chore: release 1.2.0 ([033225c](https://github.com/vkarpov15/kareem/commit/033225c)) +* chore: upgrade deps ([f9e9a09](https://github.com/vkarpov15/kareem/commit/f9e9a09)) +* feat: add _kareemIgnore re: Automattic/mongoose#4836 ([7957771](https://github.com/vkarpov15/kareem/commit/7957771)), closes [Automattic/mongoose#4836](https://github.com/Automattic/mongoose/issues/4836) + + + + +## 1.1.5 (2016-12-13) + +* chore: release 1.1.5 ([1a9f684](https://github.com/vkarpov15/kareem/commit/1a9f684)) +* fix: correct field name ([04a0e9d](https://github.com/vkarpov15/kareem/commit/04a0e9d)) + + + + +## 1.1.4 (2016-12-09) + +* chore: release 1.1.4 ([ece401c](https://github.com/vkarpov15/kareem/commit/ece401c)) +* chore: run tests on node 6 ([e0cb1cb](https://github.com/vkarpov15/kareem/commit/e0cb1cb)) +* fix: only copy own properties in clone() ([dfe28ce](https://github.com/vkarpov15/kareem/commit/dfe28ce)), closes [#7](https://github.com/vkarpov15/kareem/issues/7) + + + + +## 1.1.3 (2016-06-27) + +* chore: release 1.1.3 ([87171c8](https://github.com/vkarpov15/kareem/commit/87171c8)) +* fix: couple more issues with arg processing ([c65f523](https://github.com/vkarpov15/kareem/commit/c65f523)) + + + + +## 1.1.2 (2016-06-27) + +* chore: release 1.1.2 
([8e102b6](https://github.com/vkarpov15/kareem/commit/8e102b6)) +* fix: add early return ([4feda4e](https://github.com/vkarpov15/kareem/commit/4feda4e)) + + + + +## 1.1.1 (2016-06-27) + +* chore: release 1.1.1 ([8bb3050](https://github.com/vkarpov15/kareem/commit/8bb3050)) +* fix: skip error handlers if no error ([0eb3a44](https://github.com/vkarpov15/kareem/commit/0eb3a44)) + + + + +## 1.1.0 (2016-05-11) + +* chore: release 1.1.0 ([85332d9](https://github.com/vkarpov15/kareem/commit/85332d9)) +* chore: test on node 4 and node 5 ([1faefa1](https://github.com/vkarpov15/kareem/commit/1faefa1)) +* 100% coverage again ([c9aee4e](https://github.com/vkarpov15/kareem/commit/c9aee4e)) +* add support for error post hooks ([d378113](https://github.com/vkarpov15/kareem/commit/d378113)) +* basic setup for sync hooks #4 ([55aa081](https://github.com/vkarpov15/kareem/commit/55aa081)), closes [#4](https://github.com/vkarpov15/kareem/issues/4) +* proof of concept for error handlers ([e4a07d9](https://github.com/vkarpov15/kareem/commit/e4a07d9)) +* refactor out handleWrapError helper ([b19af38](https://github.com/vkarpov15/kareem/commit/b19af38)) + + + + +## 1.0.1 (2015-05-10) + +* Fix #1 ([de60dc6](https://github.com/vkarpov15/kareem/commit/de60dc6)), closes [#1](https://github.com/vkarpov15/kareem/issues/1) +* release 1.0.1 ([6971088](https://github.com/vkarpov15/kareem/commit/6971088)) +* Run tests on iojs in travis ([adcd201](https://github.com/vkarpov15/kareem/commit/adcd201)) +* support legacy post hook behavior in wrap() ([23fa74c](https://github.com/vkarpov15/kareem/commit/23fa74c)) +* Use node 0.12 in travis ([834689d](https://github.com/vkarpov15/kareem/commit/834689d)) + + + + +## 1.0.0 (2015-01-28) + +* Tag 1.0.0 ([4c5a35a](https://github.com/vkarpov15/kareem/commit/4c5a35a)) + + + + +## 0.0.8 (2015-01-27) + +* Add clone function ([688bba7](https://github.com/vkarpov15/kareem/commit/688bba7)) +* Add jscs for style checking ([5c93149](https://github.com/vkarpov15/kareem/commit/5c93149)) +* Bump 0.0.8 ([03c0d2f](https://github.com/vkarpov15/kareem/commit/03c0d2f)) +* Fix jscs config, add gulp rules ([9989abf](https://github.com/vkarpov15/kareem/commit/9989abf)) +* fix Makefile typo ([1f7e61a](https://github.com/vkarpov15/kareem/commit/1f7e61a)) + + + + +## 0.0.7 (2015-01-04) + +* Bump 0.0.7 ([98ef173](https://github.com/vkarpov15/kareem/commit/98ef173)) +* fix LearnBoost/mongoose#2553 - use null instead of undefined for err ([9157b48](https://github.com/vkarpov15/kareem/commit/9157b48)), closes [LearnBoost/mongoose#2553](https://github.com/LearnBoost/mongoose/issues/2553) +* Regenerate docs ([2331cdf](https://github.com/vkarpov15/kareem/commit/2331cdf)) + + + + +## 0.0.6 (2015-01-01) + +* Update docs and bump 0.0.6 ([92c12a7](https://github.com/vkarpov15/kareem/commit/92c12a7)) + + + + +## 0.0.5 (2015-01-01) + +* Add coverage rule to Makefile ([825a91c](https://github.com/vkarpov15/kareem/commit/825a91c)) +* Add coveralls to README ([fb52369](https://github.com/vkarpov15/kareem/commit/fb52369)) +* Add coveralls to travis ([93f6f15](https://github.com/vkarpov15/kareem/commit/93f6f15)) +* Add createWrapper() function ([ea77741](https://github.com/vkarpov15/kareem/commit/ea77741)) +* Add istanbul code coverage ([6eceeef](https://github.com/vkarpov15/kareem/commit/6eceeef)) +* Add some more comments for examples ([c5b0c6f](https://github.com/vkarpov15/kareem/commit/c5b0c6f)) +* Add travis ([e6dcb06](https://github.com/vkarpov15/kareem/commit/e6dcb06)) +* Add travis badge to docs 
([ad8c9b3](https://github.com/vkarpov15/kareem/commit/ad8c9b3)) +* Add wrap() tests, 100% coverage ([6945be4](https://github.com/vkarpov15/kareem/commit/6945be4)) +* Better test coverage for execPost ([d9ad539](https://github.com/vkarpov15/kareem/commit/d9ad539)) +* Bump 0.0.5 ([69875b1](https://github.com/vkarpov15/kareem/commit/69875b1)) +* Docs fix ([15b7098](https://github.com/vkarpov15/kareem/commit/15b7098)) +* Fix silly mistake in docs generation ([50373eb](https://github.com/vkarpov15/kareem/commit/50373eb)) +* Fix typo in readme ([fec4925](https://github.com/vkarpov15/kareem/commit/fec4925)) +* Linkify travis badge ([92b25fe](https://github.com/vkarpov15/kareem/commit/92b25fe)) +* Make travis run coverage ([747157b](https://github.com/vkarpov15/kareem/commit/747157b)) +* Move travis status badge ([d52e89b](https://github.com/vkarpov15/kareem/commit/d52e89b)) +* Quick fix for coverage ([50bbddb](https://github.com/vkarpov15/kareem/commit/50bbddb)) +* Typo fix ([adea794](https://github.com/vkarpov15/kareem/commit/adea794)) + + + + +## 0.0.4 (2014-12-13) + +* Bump 0.0.4, run docs generation ([51a15fe](https://github.com/vkarpov15/kareem/commit/51a15fe)) +* Use correct post parameters in wrap() ([9bb5da3](https://github.com/vkarpov15/kareem/commit/9bb5da3)) + + + + +## 0.0.3 (2014-12-12) + +* Add npm test script, fix small bug with args not getting passed through post ([49e3e68](https://github.com/vkarpov15/kareem/commit/49e3e68)) +* Bump 0.0.3 ([65621d8](https://github.com/vkarpov15/kareem/commit/65621d8)) +* Update readme ([901388b](https://github.com/vkarpov15/kareem/commit/901388b)) + + + + +## 0.0.2 (2014-12-12) + +* Add github repo and bump 0.0.2 ([59db8be](https://github.com/vkarpov15/kareem/commit/59db8be)) + + + + +## 0.0.1 (2014-12-12) + +* Add basic docs ([ad29ea4](https://github.com/vkarpov15/kareem/commit/ad29ea4)) +* Add pre hooks ([2ffc356](https://github.com/vkarpov15/kareem/commit/2ffc356)) +* Add wrap function ([68c540c](https://github.com/vkarpov15/kareem/commit/68c540c)) +* Bump to version 0.0.1 ([a4bfd68](https://github.com/vkarpov15/kareem/commit/a4bfd68)) +* Initial commit ([4002458](https://github.com/vkarpov15/kareem/commit/4002458)) +* Initial deposit ([98fc489](https://github.com/vkarpov15/kareem/commit/98fc489)) +* Post hooks ([395b67c](https://github.com/vkarpov15/kareem/commit/395b67c)) +* Some basic setup work ([82df75e](https://github.com/vkarpov15/kareem/commit/82df75e)) +* Support sync pre hooks ([1cc1b9f](https://github.com/vkarpov15/kareem/commit/1cc1b9f)) +* Update package.json description ([978da18](https://github.com/vkarpov15/kareem/commit/978da18)) + + + + +## 2.2.5 (2018-09-24) + + + + + +## 2.2.4 (2018-09-24) + + + + + +## 2.2.3 (2018-09-24) + +* fix(filter): copy async pres correctly with `filter()` ([1b1ed8a](https://github.com/vkarpov15/kareem/commit/1b1ed8a)), closes [Automattic/mongoose#3054](https://github.com/Automattic/mongoose/issues/3054) +* feat: add filter() function ([1f641f4](https://github.com/vkarpov15/kareem/commit/1f641f4)) +* feat: support storing options on pre and post hooks ([59220b9](https://github.com/vkarpov15/kareem/commit/59220b9)) + + + + +## 2.2.3 (2018-09-10) + +* chore: release 2.2.3 ([af653a3](https://github.com/vkarpov15/kareem/commit/af653a3)) + + + + +## 2.2.2 (2018-09-10) + +* chore: release 2.2.2 ([3f0144d](https://github.com/vkarpov15/kareem/commit/3f0144d)) +* fix: allow merge() to not clone ([e628d65](https://github.com/vkarpov15/kareem/commit/e628d65)) + + + + +## 2.2.1 (2018-06-05) + +* chore: release 
2.2.1 ([4625a64](https://github.com/vkarpov15/kareem/commit/4625a64)) +* chore: remove lockfile from git ([7f3e4e6](https://github.com/vkarpov15/kareem/commit/7f3e4e6)) +* fix: handle numAsync correctly when merging ([fef8e7e](https://github.com/vkarpov15/kareem/commit/fef8e7e)) +* test: repro issue with not copying numAsync ([952d9db](https://github.com/vkarpov15/kareem/commit/952d9db)) + + + + +## 2.2.0 (2018-06-05) + +* chore: release 2.2.0 ([ff9ad03](https://github.com/vkarpov15/kareem/commit/ff9ad03)) +* fix: use maps instead of objects for _pres and _posts so `toString()` doesn't get reported as having ([55df303](https://github.com/vkarpov15/kareem/commit/55df303)), closes [Automattic/mongoose#6538](https://github.com/Automattic/mongoose/issues/6538) + + + + +## 2.1.0 (2018-05-16) + +* chore: release 2.1.0 ([ba5f1bc](https://github.com/vkarpov15/kareem/commit/ba5f1bc)) +* feat: add option to check wrapped function return value for promises ([c9d7dd1](https://github.com/vkarpov15/kareem/commit/c9d7dd1)) +* refactor: use const in wrap() ([0fc21f9](https://github.com/vkarpov15/kareem/commit/0fc21f9)) + + + + +## 2.0.7 (2018-04-28) + +* chore: release 2.0.7 ([0bf91e6](https://github.com/vkarpov15/kareem/commit/0bf91e6)) +* feat: add `hasHooks()` ([225f18d](https://github.com/vkarpov15/kareem/commit/225f18d)), closes [Automattic/mongoose#6385](https://github.com/Automattic/mongoose/issues/6385) + + + + +## 2.0.6 (2018-03-22) + +* chore: release 2.0.6 ([f3d406b](https://github.com/vkarpov15/kareem/commit/f3d406b)) +* fix(wrap): ensure fast path still wraps function in `nextTick()` for chaining ([7000494](https://github.com/vkarpov15/kareem/commit/7000494)), closes [Automattic/mongoose#6250](https://github.com/Automattic/mongoose/issues/6250) [dsanel/mongoose-delete#36](https://github.com/dsanel/mongoose-delete/issues/36) + + + + +## 2.0.5 (2018-02-22) + +* chore: release 2.0.5 ([3286612](https://github.com/vkarpov15/kareem/commit/3286612)) +* perf(createWrapper): don't create wrapper if there are no hooks ([5afc5b9](https://github.com/vkarpov15/kareem/commit/5afc5b9)), closes [Automattic/mongoose#6126](https://github.com/Automattic/mongoose/issues/6126) + + + + +## 2.0.4 (2018-02-08) + +* chore: release 2.0.4 ([2ab0293](https://github.com/vkarpov15/kareem/commit/2ab0293)) + + + + +## 2.0.3 (2018-02-01) + +* chore: release 2.0.3 ([3c1abe5](https://github.com/vkarpov15/kareem/commit/3c1abe5)) +* fix: use process.nextTick() re: Automattic/mongoose#6074 ([e5bfe33](https://github.com/vkarpov15/kareem/commit/e5bfe33)), closes [Automattic/mongoose#6074](https://github.com/Automattic/mongoose/issues/6074) + + + + +## 2.0.2 (2018-01-24) + +* chore: fix license ([a9d755c](https://github.com/vkarpov15/kareem/commit/a9d755c)), closes [#10](https://github.com/vkarpov15/kareem/issues/10) +* chore: release 2.0.2 ([fe87ab6](https://github.com/vkarpov15/kareem/commit/fe87ab6)) + + + + +## 2.0.1 (2018-01-09) + +* chore: release 2.0.1 with lockfile bump ([09c44fb](https://github.com/vkarpov15/kareem/commit/09c44fb)) + + + + +## 2.0.0 (2018-01-09) + +* chore: bump marked re: security ([cc564a9](https://github.com/vkarpov15/kareem/commit/cc564a9)) +* chore: release 2.0.0 ([f511d1c](https://github.com/vkarpov15/kareem/commit/f511d1c)) + + + + +## 2.0.0-rc5 (2017-12-23) + +* chore: fix build on node 4+5 ([6dac5a4](https://github.com/vkarpov15/kareem/commit/6dac5a4)) +* chore: fix built on node 4 + 5 again ([434ef0a](https://github.com/vkarpov15/kareem/commit/434ef0a)) +* chore: release 2.0.0-rc5 
([25a32ee](https://github.com/vkarpov15/kareem/commit/25a32ee)) + + + + +## 2.0.0-rc4 (2017-12-22) + +* chore: release 2.0.0-rc4 ([49fc083](https://github.com/vkarpov15/kareem/commit/49fc083)) +* BREAKING CHANGE: deduplicate when merging hooks re: Automattic/mongoose#2945 ([d458573](https://github.com/vkarpov15/kareem/commit/d458573)), closes [Automattic/mongoose#2945](https://github.com/Automattic/mongoose/issues/2945) + + + + +## 2.0.0-rc3 (2017-12-22) + +* chore: release 2.0.0-rc3 ([adaaa00](https://github.com/vkarpov15/kareem/commit/adaaa00)) +* feat: support returning promises from middleware functions ([05b4480](https://github.com/vkarpov15/kareem/commit/05b4480)), closes [Automattic/mongoose#3779](https://github.com/Automattic/mongoose/issues/3779) + + + + +## 2.0.0-rc2 (2017-12-21) + +* chore: release 2.0.0-rc2 ([76325fa](https://github.com/vkarpov15/kareem/commit/76325fa)) +* fix: ensure next() and done() run in next tick ([6c20684](https://github.com/vkarpov15/kareem/commit/6c20684)) + + + + +## 2.0.0-rc1 (2017-12-21) + +* chore: improve test coverage re: Automattic/mongoose#3232 ([7b45cf0](https://github.com/vkarpov15/kareem/commit/7b45cf0)), closes [Automattic/mongoose#3232](https://github.com/Automattic/mongoose/issues/3232) +* chore: release 2.0.0-rc1 ([9b83f52](https://github.com/vkarpov15/kareem/commit/9b83f52)) +* BREAKING CHANGE: report sync exceptions as errors, only allow calling next() and done() once ([674adcc](https://github.com/vkarpov15/kareem/commit/674adcc)), closes [Automattic/mongoose#3483](https://github.com/Automattic/mongoose/issues/3483) + + + + +## 2.0.0-rc0 (2017-12-17) + +* chore: release 2.0.0-rc0 ([16b44b5](https://github.com/vkarpov15/kareem/commit/16b44b5)) +* BREAKING CHANGE: drop support for node < 4 ([9cbb8c7](https://github.com/vkarpov15/kareem/commit/9cbb8c7)) +* BREAKING CHANGE: remove useLegacyPost and add several new features ([6dd8531](https://github.com/vkarpov15/kareem/commit/6dd8531)), closes [Automattic/mongoose#3232](https://github.com/Automattic/mongoose/issues/3232) + + + + +## 1.5.0 (2017-07-20) + +* chore: release 1.5.0 ([9c491a0](https://github.com/vkarpov15/kareem/commit/9c491a0)) +* fix: improve post error handlers results ([9928dd5](https://github.com/vkarpov15/kareem/commit/9928dd5)), closes [Automattic/mongoose#5466](https://github.com/Automattic/mongoose/issues/5466) + + + + +## 1.4.2 (2017-07-06) + +* chore: release 1.4.2 ([8d14ac5](https://github.com/vkarpov15/kareem/commit/8d14ac5)) +* fix: correct args re: Automattic/mongoose#5405 ([3f28ae6](https://github.com/vkarpov15/kareem/commit/3f28ae6)), closes [Automattic/mongoose#5405](https://github.com/Automattic/mongoose/issues/5405) + + + + +## 1.4.1 (2017-04-25) + +* chore: release 1.4.1 ([5ecf0c2](https://github.com/vkarpov15/kareem/commit/5ecf0c2)) +* fix: handle numAsyncPres with clone() ([c72e857](https://github.com/vkarpov15/kareem/commit/c72e857)), closes [#8](https://github.com/vkarpov15/kareem/issues/8) +* test: repro #8 ([9b4d6b2](https://github.com/vkarpov15/kareem/commit/9b4d6b2)), closes [#8](https://github.com/vkarpov15/kareem/issues/8) + + + + +## 1.4.0 (2017-04-19) + +* chore: release 1.4.0 ([101c5f5](https://github.com/vkarpov15/kareem/commit/101c5f5)) +* feat: add merge() function ([285325e](https://github.com/vkarpov15/kareem/commit/285325e)) + + + + +## 1.3.0 (2017-03-26) + +* chore: release 1.3.0 ([f3a9e50](https://github.com/vkarpov15/kareem/commit/f3a9e50)) +* feat: pass function args to execPre 
([4dd466d](https://github.com/vkarpov15/kareem/commit/4dd466d)) + + + + +## 1.2.1 (2017-02-03) + +* chore: release 1.2.1 ([d97081f](https://github.com/vkarpov15/kareem/commit/d97081f)) +* fix: filter out _kareemIgnored args for error handlers re: Automattic/mongoose#4925 ([ddc7aeb](https://github.com/vkarpov15/kareem/commit/ddc7aeb)), closes [Automattic/mongoose#4925](https://github.com/Automattic/mongoose/issues/4925) +* fix: make error handlers handle errors in pre hooks ([af38033](https://github.com/vkarpov15/kareem/commit/af38033)), closes [Automattic/mongoose#4927](https://github.com/Automattic/mongoose/issues/4927) + + + + +## 1.2.0 (2017-01-02) + +* chore: release 1.2.0 ([033225c](https://github.com/vkarpov15/kareem/commit/033225c)) +* chore: upgrade deps ([f9e9a09](https://github.com/vkarpov15/kareem/commit/f9e9a09)) +* feat: add _kareemIgnore re: Automattic/mongoose#4836 ([7957771](https://github.com/vkarpov15/kareem/commit/7957771)), closes [Automattic/mongoose#4836](https://github.com/Automattic/mongoose/issues/4836) + + + + +## 1.1.5 (2016-12-13) + +* chore: release 1.1.5 ([1a9f684](https://github.com/vkarpov15/kareem/commit/1a9f684)) +* fix: correct field name ([04a0e9d](https://github.com/vkarpov15/kareem/commit/04a0e9d)) + + + + +## 1.1.4 (2016-12-09) + +* chore: release 1.1.4 ([ece401c](https://github.com/vkarpov15/kareem/commit/ece401c)) +* chore: run tests on node 6 ([e0cb1cb](https://github.com/vkarpov15/kareem/commit/e0cb1cb)) +* fix: only copy own properties in clone() ([dfe28ce](https://github.com/vkarpov15/kareem/commit/dfe28ce)), closes [#7](https://github.com/vkarpov15/kareem/issues/7) + + + + +## 1.1.3 (2016-06-27) + +* chore: release 1.1.3 ([87171c8](https://github.com/vkarpov15/kareem/commit/87171c8)) +* fix: couple more issues with arg processing ([c65f523](https://github.com/vkarpov15/kareem/commit/c65f523)) + + + + +## 1.1.2 (2016-06-27) + +* chore: release 1.1.2 ([8e102b6](https://github.com/vkarpov15/kareem/commit/8e102b6)) +* fix: add early return ([4feda4e](https://github.com/vkarpov15/kareem/commit/4feda4e)) + + + + +## 1.1.1 (2016-06-27) + +* chore: release 1.1.1 ([8bb3050](https://github.com/vkarpov15/kareem/commit/8bb3050)) +* fix: skip error handlers if no error ([0eb3a44](https://github.com/vkarpov15/kareem/commit/0eb3a44)) + + + + +## 1.1.0 (2016-05-11) + +* chore: release 1.1.0 ([85332d9](https://github.com/vkarpov15/kareem/commit/85332d9)) +* chore: test on node 4 and node 5 ([1faefa1](https://github.com/vkarpov15/kareem/commit/1faefa1)) +* 100% coverage again ([c9aee4e](https://github.com/vkarpov15/kareem/commit/c9aee4e)) +* add support for error post hooks ([d378113](https://github.com/vkarpov15/kareem/commit/d378113)) +* basic setup for sync hooks #4 ([55aa081](https://github.com/vkarpov15/kareem/commit/55aa081)), closes [#4](https://github.com/vkarpov15/kareem/issues/4) +* proof of concept for error handlers ([e4a07d9](https://github.com/vkarpov15/kareem/commit/e4a07d9)) +* refactor out handleWrapError helper ([b19af38](https://github.com/vkarpov15/kareem/commit/b19af38)) + + + + +## 1.0.1 (2015-05-10) + +* Fix #1 ([de60dc6](https://github.com/vkarpov15/kareem/commit/de60dc6)), closes [#1](https://github.com/vkarpov15/kareem/issues/1) +* release 1.0.1 ([6971088](https://github.com/vkarpov15/kareem/commit/6971088)) +* Run tests on iojs in travis ([adcd201](https://github.com/vkarpov15/kareem/commit/adcd201)) +* support legacy post hook behavior in wrap() ([23fa74c](https://github.com/vkarpov15/kareem/commit/23fa74c)) +* Use node 0.12 in travis 
([834689d](https://github.com/vkarpov15/kareem/commit/834689d)) + + + + +## 1.0.0 (2015-01-28) + +* Tag 1.0.0 ([4c5a35a](https://github.com/vkarpov15/kareem/commit/4c5a35a)) + + + + +## 0.0.8 (2015-01-27) + +* Add clone function ([688bba7](https://github.com/vkarpov15/kareem/commit/688bba7)) +* Add jscs for style checking ([5c93149](https://github.com/vkarpov15/kareem/commit/5c93149)) +* Bump 0.0.8 ([03c0d2f](https://github.com/vkarpov15/kareem/commit/03c0d2f)) +* Fix jscs config, add gulp rules ([9989abf](https://github.com/vkarpov15/kareem/commit/9989abf)) +* fix Makefile typo ([1f7e61a](https://github.com/vkarpov15/kareem/commit/1f7e61a)) + + + + +## 0.0.7 (2015-01-04) + +* Bump 0.0.7 ([98ef173](https://github.com/vkarpov15/kareem/commit/98ef173)) +* fix LearnBoost/mongoose#2553 - use null instead of undefined for err ([9157b48](https://github.com/vkarpov15/kareem/commit/9157b48)), closes [LearnBoost/mongoose#2553](https://github.com/LearnBoost/mongoose/issues/2553) +* Regenerate docs ([2331cdf](https://github.com/vkarpov15/kareem/commit/2331cdf)) + + + + +## 0.0.6 (2015-01-01) + +* Update docs and bump 0.0.6 ([92c12a7](https://github.com/vkarpov15/kareem/commit/92c12a7)) + + + + +## 0.0.5 (2015-01-01) + +* Add coverage rule to Makefile ([825a91c](https://github.com/vkarpov15/kareem/commit/825a91c)) +* Add coveralls to README ([fb52369](https://github.com/vkarpov15/kareem/commit/fb52369)) +* Add coveralls to travis ([93f6f15](https://github.com/vkarpov15/kareem/commit/93f6f15)) +* Add createWrapper() function ([ea77741](https://github.com/vkarpov15/kareem/commit/ea77741)) +* Add istanbul code coverage ([6eceeef](https://github.com/vkarpov15/kareem/commit/6eceeef)) +* Add some more comments for examples ([c5b0c6f](https://github.com/vkarpov15/kareem/commit/c5b0c6f)) +* Add travis ([e6dcb06](https://github.com/vkarpov15/kareem/commit/e6dcb06)) +* Add travis badge to docs ([ad8c9b3](https://github.com/vkarpov15/kareem/commit/ad8c9b3)) +* Add wrap() tests, 100% coverage ([6945be4](https://github.com/vkarpov15/kareem/commit/6945be4)) +* Better test coverage for execPost ([d9ad539](https://github.com/vkarpov15/kareem/commit/d9ad539)) +* Bump 0.0.5 ([69875b1](https://github.com/vkarpov15/kareem/commit/69875b1)) +* Docs fix ([15b7098](https://github.com/vkarpov15/kareem/commit/15b7098)) +* Fix silly mistake in docs generation ([50373eb](https://github.com/vkarpov15/kareem/commit/50373eb)) +* Fix typo in readme ([fec4925](https://github.com/vkarpov15/kareem/commit/fec4925)) +* Linkify travis badge ([92b25fe](https://github.com/vkarpov15/kareem/commit/92b25fe)) +* Make travis run coverage ([747157b](https://github.com/vkarpov15/kareem/commit/747157b)) +* Move travis status badge ([d52e89b](https://github.com/vkarpov15/kareem/commit/d52e89b)) +* Quick fix for coverage ([50bbddb](https://github.com/vkarpov15/kareem/commit/50bbddb)) +* Typo fix ([adea794](https://github.com/vkarpov15/kareem/commit/adea794)) + + + + +## 0.0.4 (2014-12-13) + +* Bump 0.0.4, run docs generation ([51a15fe](https://github.com/vkarpov15/kareem/commit/51a15fe)) +* Use correct post parameters in wrap() ([9bb5da3](https://github.com/vkarpov15/kareem/commit/9bb5da3)) + + + + +## 0.0.3 (2014-12-12) + +* Add npm test script, fix small bug with args not getting passed through post ([49e3e68](https://github.com/vkarpov15/kareem/commit/49e3e68)) +* Bump 0.0.3 ([65621d8](https://github.com/vkarpov15/kareem/commit/65621d8)) +* Update readme ([901388b](https://github.com/vkarpov15/kareem/commit/901388b)) + + + + +## 0.0.2 (2014-12-12) 
+ +* Add github repo and bump 0.0.2 ([59db8be](https://github.com/vkarpov15/kareem/commit/59db8be)) + + + + +## 0.0.1 (2014-12-12) + +* Add basic docs ([ad29ea4](https://github.com/vkarpov15/kareem/commit/ad29ea4)) +* Add pre hooks ([2ffc356](https://github.com/vkarpov15/kareem/commit/2ffc356)) +* Add wrap function ([68c540c](https://github.com/vkarpov15/kareem/commit/68c540c)) +* Bump to version 0.0.1 ([a4bfd68](https://github.com/vkarpov15/kareem/commit/a4bfd68)) +* Initial commit ([4002458](https://github.com/vkarpov15/kareem/commit/4002458)) +* Initial deposit ([98fc489](https://github.com/vkarpov15/kareem/commit/98fc489)) +* Post hooks ([395b67c](https://github.com/vkarpov15/kareem/commit/395b67c)) +* Some basic setup work ([82df75e](https://github.com/vkarpov15/kareem/commit/82df75e)) +* Support sync pre hooks ([1cc1b9f](https://github.com/vkarpov15/kareem/commit/1cc1b9f)) +* Update package.json description ([978da18](https://github.com/vkarpov15/kareem/commit/978da18)) diff --git a/node_modules/kareem/LICENSE b/node_modules/kareem/LICENSE new file mode 100644 index 000000000..e06d20818 --- /dev/null +++ b/node_modules/kareem/LICENSE @@ -0,0 +1,202 @@ +Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. 
+ + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. 
You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. 
In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "{}" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright {yyyy} {name of copyright owner} + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. + diff --git a/node_modules/kareem/Makefile b/node_modules/kareem/Makefile new file mode 100644 index 000000000..f71ba9008 --- /dev/null +++ b/node_modules/kareem/Makefile @@ -0,0 +1,5 @@ +docs: + node ./docs.js + +coverage: + ./node_modules/istanbul/lib/cli.js cover ./node_modules/mocha/bin/_mocha -- -R spec ./test/* diff --git a/node_modules/kareem/README.md b/node_modules/kareem/README.md new file mode 100644 index 000000000..aaf84717b --- /dev/null +++ b/node_modules/kareem/README.md @@ -0,0 +1,420 @@ +# kareem + + [![Build Status](https://travis-ci.org/vkarpov15/kareem.svg?branch=master)](https://travis-ci.org/vkarpov15/kareem) + [![Coverage Status](https://img.shields.io/coveralls/vkarpov15/kareem.svg)](https://coveralls.io/r/vkarpov15/kareem) + +Re-imagined take on the [hooks](http://npmjs.org/package/hooks) module, meant to offer additional flexibility in allowing you to execute hooks whenever necessary, as opposed to simply wrapping a single function. 
+ +Named for the NBA's all-time leading scorer Kareem Abdul-Jabbar, known for his mastery of the [hook shot](http://en.wikipedia.org/wiki/Kareem_Abdul-Jabbar#Skyhook) + + + +# API + +## pre hooks + +Much like [hooks](https://npmjs.org/package/hooks), kareem lets you define +pre and post hooks: pre hooks are called before a given function executes. +Unlike hooks, kareem stores hooks and other internal state in a separate +object, rather than relying on inheritance. Furthermore, kareem exposes +an `execPre()` function that allows you to execute your pre hooks when +appropriate, giving you more fine-grained control over your function hooks. + + +#### It runs without any hooks specified + +```javascript +hooks.execPre('cook', null, function() { + // ... +}); +``` + +#### It runs basic serial pre hooks + +pre hook functions take one parameter, a "done" function that you execute +when your pre hook is finished. + + +```javascript +var count = 0; + +hooks.pre('cook', function(done) { + ++count; + done(); +}); + +hooks.execPre('cook', null, function() { + assert.equal(1, count); +}); +``` + +#### It can run multipe pre hooks + +```javascript +var count1 = 0; +var count2 = 0; + +hooks.pre('cook', function(done) { + ++count1; + done(); +}); + +hooks.pre('cook', function(done) { + ++count2; + done(); +}); + +hooks.execPre('cook', null, function() { + assert.equal(1, count1); + assert.equal(1, count2); +}); +``` + +#### It can run fully synchronous pre hooks + +If your pre hook function takes no parameters, its assumed to be +fully synchronous. + + +```javascript +var count1 = 0; +var count2 = 0; + +hooks.pre('cook', function() { + ++count1; +}); + +hooks.pre('cook', function() { + ++count2; +}); + +hooks.execPre('cook', null, function(error) { + assert.equal(null, error); + assert.equal(1, count1); + assert.equal(1, count2); +}); +``` + +#### It properly attaches context to pre hooks + +Pre save hook functions are bound to the second parameter to `execPre()` + + +```javascript +hooks.pre('cook', function(done) { + this.bacon = 3; + done(); +}); + +hooks.pre('cook', function(done) { + this.eggs = 4; + done(); +}); + +var obj = { bacon: 0, eggs: 0 }; + +// In the pre hooks, `this` will refer to `obj` +hooks.execPre('cook', obj, function(error) { + assert.equal(null, error); + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); +}); +``` + +#### It can execute parallel (async) pre hooks + +Like the hooks module, you can declare "async" pre hooks - these take two +parameters, the functions `next()` and `done()`. `next()` passes control to +the next pre hook, but the underlying function won't be called until all +async pre hooks have called `done()`. + + +```javascript +hooks.pre('cook', true, function(next, done) { + this.bacon = 3; + next(); + setTimeout(function() { + done(); + }, 5); +}); + +hooks.pre('cook', true, function(next, done) { + next(); + var _this = this; + setTimeout(function() { + _this.eggs = 4; + done(); + }, 10); +}); + +hooks.pre('cook', function(next) { + this.waffles = false; + next(); +}); + +var obj = { bacon: 0, eggs: 0 }; + +hooks.execPre('cook', obj, function() { + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); +}); +``` + +#### It supports returning a promise + +You can also return a promise from your pre hooks instead of calling +`next()`. When the returned promise resolves, kareem will kick off the +next middleware. 
+ + +```javascript +hooks.pre('cook', function() { + return new Promise(resolve => { + setTimeout(() => { + this.bacon = 3; + resolve(); + }, 100); + }); +}); + +var obj = { bacon: 0 }; + +hooks.execPre('cook', obj, function() { + assert.equal(3, obj.bacon); +}); +``` + +## post hooks + +acquit:ignore:end + +#### It runs without any hooks specified + +```javascript +hooks.execPost('cook', null, [1], function(error, eggs) { + assert.ifError(error); + assert.equal(1, eggs); + done(); +}); +``` + +#### It executes with parameters passed in + +```javascript +hooks.post('cook', function(eggs, bacon, callback) { + assert.equal(1, eggs); + assert.equal(2, bacon); + callback(); +}); + +hooks.execPost('cook', null, [1, 2], function(error, eggs, bacon) { + assert.ifError(error); + assert.equal(1, eggs); + assert.equal(2, bacon); +}); +``` + +#### It can use synchronous post hooks + +```javascript +var execed = {}; + +hooks.post('cook', function(eggs, bacon) { + execed.first = true; + assert.equal(1, eggs); + assert.equal(2, bacon); +}); + +hooks.post('cook', function(eggs, bacon, callback) { + execed.second = true; + assert.equal(1, eggs); + assert.equal(2, bacon); + callback(); +}); + +hooks.execPost('cook', null, [1, 2], function(error, eggs, bacon) { + assert.ifError(error); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + assert.equal(1, eggs); + assert.equal(2, bacon); +}); +``` + +#### It supports returning a promise + +You can also return a promise from your post hooks instead of calling +`next()`. When the returned promise resolves, kareem will kick off the +next middleware. + + +```javascript +hooks.post('cook', function(bacon) { + return new Promise(resolve => { + setTimeout(() => { + this.bacon = 3; + resolve(); + }, 100); + }); +}); + +var obj = { bacon: 0 }; + +hooks.execPost('cook', obj, obj, function() { + assert.equal(obj.bacon, 3); +}); +``` + +## wrap() + +acquit:ignore:end + +#### It wraps pre and post calls into one call + +```javascript +hooks.pre('cook', true, function(next, done) { + this.bacon = 3; + next(); + setTimeout(function() { + done(); + }, 5); +}); + +hooks.pre('cook', true, function(next, done) { + next(); + var _this = this; + setTimeout(function() { + _this.eggs = 4; + done(); + }, 10); +}); + +hooks.pre('cook', function(next) { + this.waffles = false; + next(); +}); + +hooks.post('cook', function(obj) { + obj.tofu = 'no'; +}); + +var obj = { bacon: 0, eggs: 0 }; + +var args = [obj]; +args.push(function(error, result) { + assert.ifError(error); + assert.equal(null, error); + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal('no', obj.tofu); + + assert.equal(obj, result); +}); + +hooks.wrap( + 'cook', + function(o, callback) { + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal(undefined, obj.tofu); + callback(null, o); + }, + obj, + args); +``` + +## createWrapper() + +#### It wraps wrap() into a callable function + +```javascript +hooks.pre('cook', true, function(next, done) { + this.bacon = 3; + next(); + setTimeout(function() { + done(); + }, 5); +}); + +hooks.pre('cook', true, function(next, done) { + next(); + var _this = this; + setTimeout(function() { + _this.eggs = 4; + done(); + }, 10); +}); + +hooks.pre('cook', function(next) { + this.waffles = false; + next(); +}); + +hooks.post('cook', function(obj) { + obj.tofu = 'no'; +}); + +var obj = { bacon: 0, eggs: 0 }; + +var cook = 
hooks.createWrapper( + 'cook', + function(o, callback) { + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal(undefined, obj.tofu); + callback(null, o); + }, + obj); + +cook(obj, function(error, result) { + assert.ifError(error); + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal('no', obj.tofu); + + assert.equal(obj, result); +}); +``` + +## clone() + +acquit:ignore:end + +#### It clones a Kareem object + +```javascript +var k1 = new Kareem(); +k1.pre('cook', function() {}); +k1.post('cook', function() {}); + +var k2 = k1.clone(); +assert.deepEqual(Array.from(k2._pres.keys()), ['cook']); +assert.deepEqual(Array.from(k2._posts.keys()), ['cook']); +``` + +## merge() + +#### It pulls hooks from another Kareem object + +```javascript +var k1 = new Kareem(); +var test1 = function() {}; +k1.pre('cook', test1); +k1.post('cook', function() {}); + +var k2 = new Kareem(); +var test2 = function() {}; +k2.pre('cook', test2); +var k3 = k2.merge(k1); +assert.equal(k3._pres.get('cook').length, 2); +assert.equal(k3._pres.get('cook')[0].fn, test2); +assert.equal(k3._pres.get('cook')[1].fn, test1); +assert.equal(k3._posts.get('cook').length, 1); +``` + diff --git a/node_modules/kareem/docs.js b/node_modules/kareem/docs.js new file mode 100644 index 000000000..85ae50215 --- /dev/null +++ b/node_modules/kareem/docs.js @@ -0,0 +1,39 @@ +var acquit = require('acquit'); + +require('acquit-ignore')(); + +var content = require('fs').readFileSync('./test/examples.test.js').toString(); +var blocks = acquit.parse(content); + +var mdOutput = + '# kareem\n\n' + + ' [![Build Status](https://travis-ci.org/vkarpov15/kareem.svg?branch=master)](https://travis-ci.org/vkarpov15/kareem)\n' + + ' [![Coverage Status](https://img.shields.io/coveralls/vkarpov15/kareem.svg)](https://coveralls.io/r/vkarpov15/kareem)\n\n' + + 'Re-imagined take on the [hooks](http://npmjs.org/package/hooks) module, ' + + 'meant to offer additional flexibility in allowing you to execute hooks ' + + 'whenever necessary, as opposed to simply wrapping a single function.\n\n' + + 'Named for the NBA\'s all-time leading scorer Kareem Abdul-Jabbar, known ' + + 'for his mastery of the [hook shot](http://en.wikipedia.org/wiki/Kareem_Abdul-Jabbar#Skyhook)\n\n' + + '\n\n' + + '# API\n\n'; + +for (var i = 0; i < blocks.length; ++i) { + var describe = blocks[i]; + mdOutput += '## ' + describe.contents + '\n\n'; + mdOutput += describe.comments[0] ? + acquit.trimEachLine(describe.comments[0]) + '\n\n' : + ''; + + for (var j = 0; j < describe.blocks.length; ++j) { + var it = describe.blocks[j]; + mdOutput += '#### It ' + it.contents + '\n\n'; + mdOutput += it.comments[0] ? + acquit.trimEachLine(it.comments[0]) + '\n\n' : + ''; + mdOutput += '```javascript\n'; + mdOutput += it.code + '\n'; + mdOutput += '```\n\n'; + } +} + +require('fs').writeFileSync('README.md', mdOutput); diff --git a/node_modules/kareem/gulpfile.js b/node_modules/kareem/gulpfile.js new file mode 100644 index 000000000..635bfc75f --- /dev/null +++ b/node_modules/kareem/gulpfile.js @@ -0,0 +1,18 @@ +var gulp = require('gulp'); +var mocha = require('gulp-mocha'); +var config = require('./package.json'); +var jscs = require('gulp-jscs'); + +gulp.task('mocha', function() { + return gulp.src('./test/*'). + pipe(mocha({ reporter: 'dot' })); +}); + +gulp.task('jscs', function() { + return gulp.src('./index.js'). 
+ pipe(jscs(config.jscsConfig)); +}); + +gulp.task('watch', function() { + gulp.watch('./index.js', ['jscs', 'mocha']); +}); diff --git a/node_modules/kareem/index.js b/node_modules/kareem/index.js new file mode 100644 index 000000000..1484ecd28 --- /dev/null +++ b/node_modules/kareem/index.js @@ -0,0 +1,512 @@ +'use strict'; + +function Kareem() { + this._pres = new Map(); + this._posts = new Map(); +} + +Kareem.prototype.execPre = function(name, context, args, callback) { + if (arguments.length === 3) { + callback = args; + args = []; + } + var pres = get(this._pres, name, []); + var numPres = pres.length; + var numAsyncPres = pres.numAsync || 0; + var currentPre = 0; + var asyncPresLeft = numAsyncPres; + var done = false; + var $args = args; + + if (!numPres) { + return process.nextTick(function() { + callback(null); + }); + } + + var next = function() { + if (currentPre >= numPres) { + return; + } + var pre = pres[currentPre]; + + if (pre.isAsync) { + var args = [ + decorateNextFn(_next), + decorateNextFn(function(error) { + if (error) { + if (done) { + return; + } + done = true; + return callback(error); + } + if (--asyncPresLeft === 0 && currentPre >= numPres) { + return callback(null); + } + }) + ]; + + callMiddlewareFunction(pre.fn, context, args, args[0]); + } else if (pre.fn.length > 0) { + var args = [decorateNextFn(_next)]; + var _args = arguments.length >= 2 ? arguments : [null].concat($args); + for (var i = 1; i < _args.length; ++i) { + args.push(_args[i]); + } + + callMiddlewareFunction(pre.fn, context, args, args[0]); + } else { + let maybePromise = null; + try { + maybePromise = pre.fn.call(context); + } catch (err) { + if (err != null) { + return callback(err); + } + } + + if (isPromise(maybePromise)) { + maybePromise.then(() => _next(), err => _next(err)); + } else { + if (++currentPre >= numPres) { + if (asyncPresLeft > 0) { + // Leave parallel hooks to run + return; + } else { + return process.nextTick(function() { + callback(null); + }); + } + } + next(); + } + } + }; + + next.apply(null, [null].concat(args)); + + function _next(error) { + if (error) { + if (done) { + return; + } + done = true; + return callback(error); + } + + if (++currentPre >= numPres) { + if (asyncPresLeft > 0) { + // Leave parallel hooks to run + return; + } else { + return callback(null); + } + } + + next.apply(context, arguments); + } +}; + +Kareem.prototype.execPreSync = function(name, context, args) { + var pres = get(this._pres, name, []); + var numPres = pres.length; + + for (var i = 0; i < numPres; ++i) { + pres[i].fn.apply(context, args || []); + } +}; + +Kareem.prototype.execPost = function(name, context, args, options, callback) { + if (arguments.length < 5) { + callback = options; + options = null; + } + var posts = get(this._posts, name, []); + var numPosts = posts.length; + var currentPost = 0; + + var firstError = null; + if (options && options.error) { + firstError = options.error; + } + + if (!numPosts) { + return process.nextTick(function() { + callback.apply(null, [firstError].concat(args)); + }); + } + + var next = function() { + var post = posts[currentPost].fn; + var numArgs = 0; + var argLength = args.length; + var newArgs = []; + for (var i = 0; i < argLength; ++i) { + numArgs += args[i] && args[i]._kareemIgnore ? 
0 : 1; + if (!args[i] || !args[i]._kareemIgnore) { + newArgs.push(args[i]); + } + } + + if (firstError) { + if (post.length === numArgs + 2) { + var _cb = decorateNextFn(function(error) { + if (error) { + firstError = error; + } + if (++currentPost >= numPosts) { + return callback.call(null, firstError); + } + next(); + }); + + callMiddlewareFunction(post, context, + [firstError].concat(newArgs).concat([_cb]), _cb); + } else { + if (++currentPost >= numPosts) { + return callback.call(null, firstError); + } + next(); + } + } else { + const _cb = decorateNextFn(function(error) { + if (error) { + firstError = error; + return next(); + } + + if (++currentPost >= numPosts) { + return callback.apply(null, [null].concat(args)); + } + + next(); + }); + + if (post.length === numArgs + 2) { + // Skip error handlers if no error + if (++currentPost >= numPosts) { + return callback.apply(null, [null].concat(args)); + } + return next(); + } + if (post.length === numArgs + 1) { + callMiddlewareFunction(post, context, newArgs.concat([_cb]), _cb); + } else { + let error; + let maybePromise; + try { + maybePromise = post.apply(context, newArgs); + } catch (err) { + error = err; + firstError = err; + } + + if (isPromise(maybePromise)) { + return maybePromise.then(() => _cb(), err => _cb(err)); + } + + if (++currentPost >= numPosts) { + return callback.apply(null, [error].concat(args)); + } + + next(error); + } + } + }; + + next(); +}; + +Kareem.prototype.execPostSync = function(name, context, args) { + const posts = get(this._posts, name, []); + const numPosts = posts.length; + + for (let i = 0; i < numPosts; ++i) { + posts[i].fn.apply(context, args || []); + } +}; + +Kareem.prototype.createWrapperSync = function(name, fn) { + var kareem = this; + return function syncWrapper() { + kareem.execPreSync(name, this, arguments); + + var toReturn = fn.apply(this, arguments); + + kareem.execPostSync(name, this, [toReturn]); + + return toReturn; + }; +} + +function _handleWrapError(instance, error, name, context, args, options, callback) { + if (options.useErrorHandlers) { + var _options = { error: error }; + return instance.execPost(name, context, args, _options, function(error) { + return typeof callback === 'function' && callback(error); + }); + } else { + return typeof callback === 'function' ? + callback(error) : + undefined; + } +} + +Kareem.prototype.wrap = function(name, fn, context, args, options) { + const lastArg = (args.length > 0 ? args[args.length - 1] : null); + const argsWithoutCb = typeof lastArg === 'function' ? + args.slice(0, args.length - 1) : + args; + const _this = this; + + options = options || {}; + const checkForPromise = options.checkForPromise; + + this.execPre(name, context, args, function(error) { + if (error) { + const numCallbackParams = options.numCallbackParams || 0; + const errorArgs = options.contextParameter ? [context] : []; + for (var i = errorArgs.length; i < numCallbackParams; ++i) { + errorArgs.push(null); + } + return _handleWrapError(_this, error, name, context, errorArgs, + options, lastArg); + } + + const end = (typeof lastArg === 'function' ? 
args.length - 1 : args.length); + const numParameters = fn.length; + const ret = fn.apply(context, args.slice(0, end).concat(_cb)); + + if (checkForPromise) { + if (ret != null && typeof ret.then === 'function') { + // Thenable, use it + return ret.then( + res => _cb(null, res), + err => _cb(err) + ); + } + + // If `fn()` doesn't have a callback argument and doesn't return a + // promise, assume it is sync + if (numParameters < end + 1) { + return _cb(null, ret); + } + } + + function _cb() { + const args = arguments; + const argsWithoutError = Array.prototype.slice.call(arguments, 1); + if (options.nullResultByDefault && argsWithoutError.length === 0) { + argsWithoutError.push(null); + } + if (arguments[0]) { + // Assume error + return _handleWrapError(_this, arguments[0], name, context, + argsWithoutError, options, lastArg); + } else { + _this.execPost(name, context, argsWithoutError, function() { + if (arguments[0]) { + return typeof lastArg === 'function' ? + lastArg(arguments[0]) : + undefined; + } + + return typeof lastArg === 'function' ? + lastArg.apply(context, arguments) : + undefined; + }); + } + } + }); +}; + +Kareem.prototype.filter = function(fn) { + const clone = this.clone(); + + const pres = Array.from(clone._pres.keys()); + for (const name of pres) { + const hooks = this._pres.get(name). + map(h => Object.assign({}, h, { name: name })). + filter(fn); + + if (hooks.length === 0) { + clone._pres.delete(name); + continue; + } + + hooks.numAsync = hooks.filter(h => h.isAsync).length; + + clone._pres.set(name, hooks); + } + + const posts = Array.from(clone._posts.keys()); + for (const name of posts) { + const hooks = this._posts.get(name). + map(h => Object.assign({}, h, { name: name })). + filter(fn); + + if (hooks.length === 0) { + clone._posts.delete(name); + continue; + } + + clone._posts.set(name, hooks); + } + + return clone; +}; + +Kareem.prototype.hasHooks = function(name) { + return this._pres.has(name) || this._posts.has(name); +}; + +Kareem.prototype.createWrapper = function(name, fn, context, options) { + var _this = this; + if (!this.hasHooks(name)) { + // Fast path: if there's no hooks for this function, just return the + // function wrapped in a nextTick() + return function() { + process.nextTick(() => fn.apply(this, arguments)); + }; + } + return function() { + var _context = context || this; + var args = Array.prototype.slice.call(arguments); + _this.wrap(name, fn, _context, args, options); + }; +}; + +Kareem.prototype.pre = function(name, isAsync, fn, error, unshift) { + let options = {}; + if (typeof isAsync === 'object' && isAsync != null) { + options = isAsync; + isAsync = options.isAsync; + } else if (typeof arguments[1] !== 'boolean') { + error = fn; + fn = isAsync; + isAsync = false; + } + + const pres = get(this._pres, name, []); + this._pres.set(name, pres); + + if (isAsync) { + pres.numAsync = pres.numAsync || 0; + ++pres.numAsync; + } + + if (typeof fn !== 'function') { + throw new Error('pre() requires a function, got "' + typeof fn + '"'); + } + + if (unshift) { + pres.unshift(Object.assign({}, options, { fn: fn, isAsync: isAsync })); + } else { + pres.push(Object.assign({}, options, { fn: fn, isAsync: isAsync })); + } + + return this; +}; + +Kareem.prototype.post = function(name, options, fn, unshift) { + const hooks = get(this._posts, name, []); + + if (typeof options === 'function') { + unshift = !!fn; + fn = options; + options = {}; + } + + if (typeof fn !== 'function') { + throw new Error('post() requires a function, got "' + typeof fn + '"'); 
+ } + + if (unshift) { + hooks.unshift(Object.assign({}, options, { fn: fn })); + } else { + hooks.push(Object.assign({}, options, { fn: fn })); + } + this._posts.set(name, hooks); + return this; +}; + +Kareem.prototype.clone = function() { + const n = new Kareem(); + + for (let key of this._pres.keys()) { + const clone = this._pres.get(key).slice(); + clone.numAsync = this._pres.get(key).numAsync; + n._pres.set(key, clone); + } + for (let key of this._posts.keys()) { + n._posts.set(key, this._posts.get(key).slice()); + } + + return n; +}; + +Kareem.prototype.merge = function(other, clone) { + clone = arguments.length === 1 ? true : clone; + var ret = clone ? this.clone() : this; + + for (let key of other._pres.keys()) { + const sourcePres = get(ret._pres, key, []); + const deduplicated = other._pres.get(key). + // Deduplicate based on `fn` + filter(p => sourcePres.map(_p => _p.fn).indexOf(p.fn) === -1); + const combined = sourcePres.concat(deduplicated); + combined.numAsync = sourcePres.numAsync || 0; + combined.numAsync += deduplicated.filter(p => p.isAsync).length; + ret._pres.set(key, combined); + } + for (let key of other._posts.keys()) { + const sourcePosts = get(ret._posts, key, []); + const deduplicated = other._posts.get(key). + filter(p => sourcePosts.indexOf(p) === -1); + ret._posts.set(key, sourcePosts.concat(deduplicated)); + } + + return ret; +}; + +function get(map, key, def) { + if (map.has(key)) { + return map.get(key); + } + return def; +} + +function callMiddlewareFunction(fn, context, args, next) { + let maybePromise; + try { + maybePromise = fn.apply(context, args); + } catch (error) { + return next(error); + } + + if (isPromise(maybePromise)) { + maybePromise.then(() => next(), err => next(err)); + } +} + +function isPromise(v) { + return v != null && typeof v.then === 'function'; +} + +function decorateNextFn(fn) { + var called = false; + var _this = this; + return function() { + // Ensure this function can only be called once + if (called) { + return; + } + called = true; + // Make sure to clear the stack so try/catch doesn't catch errors + // in subsequent middleware + return process.nextTick(() => fn.apply(_this, arguments)); + }; +} + +module.exports = Kareem; diff --git a/node_modules/kareem/package.json b/node_modules/kareem/package.json new file mode 100644 index 000000000..ca1ddca85 --- /dev/null +++ b/node_modules/kareem/package.json @@ -0,0 +1,54 @@ +{ + "_from": "kareem@2.3.2", + "_id": "kareem@2.3.2", + "_inBundle": false, + "_integrity": "sha512-STHz9P7X2L4Kwn72fA4rGyqyXdmrMSdxqHx9IXon/FXluXieaFA6KJ2upcHAHxQPQ0LeM/OjLrhFxifHewOALQ==", + "_location": "/kareem", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "kareem@2.3.2", + "name": "kareem", + "escapedName": "kareem", + "rawSpec": "2.3.2", + "saveSpec": null, + "fetchSpec": "2.3.2" + }, + "_requiredBy": [ + "/mongoose" + ], + "_resolved": "https://registry.npmjs.org/kareem/-/kareem-2.3.2.tgz", + "_shasum": "78c4508894985b8d38a0dc15e1a8e11078f2ca93", + "_spec": "kareem@2.3.2", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "Valeri Karpov", + "email": "val@karpov.io" + }, + "bugs": { + "url": "https://github.com/vkarpov15/kareem/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Next-generation take on pre/post function hooks", + "devDependencies": { + "acquit": "1.x", + "acquit-ignore": "0.1.x", + "mocha": "5.x", + "nyc": "11.x" + }, + "homepage": 
"https://github.com/vkarpov15/kareem#readme", + "license": "Apache-2.0", + "main": "index.js", + "name": "kareem", + "repository": { + "type": "git", + "url": "git://github.com/vkarpov15/kareem.git" + }, + "scripts": { + "test": "mocha ./test/*", + "test-travis": "nyc mocha ./test/*" + }, + "version": "2.3.2" +} diff --git a/node_modules/kareem/test/examples.test.js b/node_modules/kareem/test/examples.test.js new file mode 100644 index 000000000..cbde98444 --- /dev/null +++ b/node_modules/kareem/test/examples.test.js @@ -0,0 +1,426 @@ +var assert = require('assert'); +var Kareem = require('../'); + +/* Much like [hooks](https://npmjs.org/package/hooks), kareem lets you define + * pre and post hooks: pre hooks are called before a given function executes. + * Unlike hooks, kareem stores hooks and other internal state in a separate + * object, rather than relying on inheritance. Furthermore, kareem exposes + * an `execPre()` function that allows you to execute your pre hooks when + * appropriate, giving you more fine-grained control over your function hooks. + */ +describe('pre hooks', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('runs without any hooks specified', function(done) { + hooks.execPre('cook', null, function() { + // ... + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + /* pre hook functions take one parameter, a "done" function that you execute + * when your pre hook is finished. + */ + it('runs basic serial pre hooks', function(done) { + var count = 0; + + hooks.pre('cook', function(done) { + ++count; + done(); + }); + + hooks.execPre('cook', null, function() { + assert.equal(1, count); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + it('can run multipe pre hooks', function(done) { + var count1 = 0; + var count2 = 0; + + hooks.pre('cook', function(done) { + ++count1; + done(); + }); + + hooks.pre('cook', function(done) { + ++count2; + done(); + }); + + hooks.execPre('cook', null, function() { + assert.equal(1, count1); + assert.equal(1, count2); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + /* If your pre hook function takes no parameters, its assumed to be + * fully synchronous. + */ + it('can run fully synchronous pre hooks', function(done) { + var count1 = 0; + var count2 = 0; + + hooks.pre('cook', function() { + ++count1; + }); + + hooks.pre('cook', function() { + ++count2; + }); + + hooks.execPre('cook', null, function(error) { + assert.equal(null, error); + assert.equal(1, count1); + assert.equal(1, count2); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + /* Pre save hook functions are bound to the second parameter to `execPre()` + */ + it('properly attaches context to pre hooks', function(done) { + hooks.pre('cook', function(done) { + this.bacon = 3; + done(); + }); + + hooks.pre('cook', function(done) { + this.eggs = 4; + done(); + }); + + var obj = { bacon: 0, eggs: 0 }; + + // In the pre hooks, `this` will refer to `obj` + hooks.execPre('cook', obj, function(error) { + assert.equal(null, error); + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + /* Like the hooks module, you can declare "async" pre hooks - these take two + * parameters, the functions `next()` and `done()`. `next()` passes control to + * the next pre hook, but the underlying function won't be called until all + * async pre hooks have called `done()`. 
+ */ + it('can execute parallel (async) pre hooks', function(done) { + hooks.pre('cook', true, function(next, done) { + this.bacon = 3; + next(); + setTimeout(function() { + done(); + }, 5); + }); + + hooks.pre('cook', true, function(next, done) { + next(); + var _this = this; + setTimeout(function() { + _this.eggs = 4; + done(); + }, 10); + }); + + hooks.pre('cook', function(next) { + this.waffles = false; + next(); + }); + + var obj = { bacon: 0, eggs: 0 }; + + hooks.execPre('cook', obj, function() { + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + /* You can also return a promise from your pre hooks instead of calling + * `next()`. When the returned promise resolves, kareem will kick off the + * next middleware. + */ + it('supports returning a promise', function(done) { + hooks.pre('cook', function() { + return new Promise(resolve => { + setTimeout(() => { + this.bacon = 3; + resolve(); + }, 100); + }); + }); + + var obj = { bacon: 0 }; + + hooks.execPre('cook', obj, function() { + assert.equal(3, obj.bacon); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); +}); + +describe('post hooks', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('runs without any hooks specified', function(done) { + hooks.execPost('cook', null, [1], function(error, eggs) { + assert.ifError(error); + assert.equal(1, eggs); + done(); + }); + }); + + it('executes with parameters passed in', function(done) { + hooks.post('cook', function(eggs, bacon, callback) { + assert.equal(1, eggs); + assert.equal(2, bacon); + callback(); + }); + + hooks.execPost('cook', null, [1, 2], function(error, eggs, bacon) { + assert.ifError(error); + assert.equal(1, eggs); + assert.equal(2, bacon); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + it('can use synchronous post hooks', function(done) { + var execed = {}; + + hooks.post('cook', function(eggs, bacon) { + execed.first = true; + assert.equal(1, eggs); + assert.equal(2, bacon); + }); + + hooks.post('cook', function(eggs, bacon, callback) { + execed.second = true; + assert.equal(1, eggs); + assert.equal(2, bacon); + callback(); + }); + + hooks.execPost('cook', null, [1, 2], function(error, eggs, bacon) { + assert.ifError(error); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + assert.equal(1, eggs); + assert.equal(2, bacon); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); + + /* You can also return a promise from your post hooks instead of calling + * `next()`. When the returned promise resolves, kareem will kick off the + * next middleware. 
+ */ + it('supports returning a promise', function(done) { + hooks.post('cook', function(bacon) { + return new Promise(resolve => { + setTimeout(() => { + this.bacon = 3; + resolve(); + }, 100); + }); + }); + + var obj = { bacon: 0 }; + + hooks.execPost('cook', obj, obj, function() { + assert.equal(obj.bacon, 3); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); +}); + +describe('wrap()', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('wraps pre and post calls into one call', function(done) { + hooks.pre('cook', true, function(next, done) { + this.bacon = 3; + next(); + setTimeout(function() { + done(); + }, 5); + }); + + hooks.pre('cook', true, function(next, done) { + next(); + var _this = this; + setTimeout(function() { + _this.eggs = 4; + done(); + }, 10); + }); + + hooks.pre('cook', function(next) { + this.waffles = false; + next(); + }); + + hooks.post('cook', function(obj) { + obj.tofu = 'no'; + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = [obj]; + args.push(function(error, result) { + assert.ifError(error); + assert.equal(null, error); + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal('no', obj.tofu); + + assert.equal(obj, result); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + + hooks.wrap( + 'cook', + function(o, callback) { + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal(undefined, obj.tofu); + callback(null, o); + }, + obj, + args); + }); +}); + +describe('createWrapper()', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('wraps wrap() into a callable function', function(done) { + hooks.pre('cook', true, function(next, done) { + this.bacon = 3; + next(); + setTimeout(function() { + done(); + }, 5); + }); + + hooks.pre('cook', true, function(next, done) { + next(); + var _this = this; + setTimeout(function() { + _this.eggs = 4; + done(); + }, 10); + }); + + hooks.pre('cook', function(next) { + this.waffles = false; + next(); + }); + + hooks.post('cook', function(obj) { + obj.tofu = 'no'; + }); + + var obj = { bacon: 0, eggs: 0 }; + + var cook = hooks.createWrapper( + 'cook', + function(o, callback) { + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal(undefined, obj.tofu); + callback(null, o); + }, + obj); + + cook(obj, function(error, result) { + assert.ifError(error); + assert.equal(3, obj.bacon); + assert.equal(4, obj.eggs); + assert.equal(false, obj.waffles); + assert.equal('no', obj.tofu); + + assert.equal(obj, result); + // acquit:ignore:start + done(); + // acquit:ignore:end + }); + }); +}); + +describe('clone()', function() { + it('clones a Kareem object', function() { + var k1 = new Kareem(); + k1.pre('cook', function() {}); + k1.post('cook', function() {}); + + var k2 = k1.clone(); + assert.deepEqual(Array.from(k2._pres.keys()), ['cook']); + assert.deepEqual(Array.from(k2._posts.keys()), ['cook']); + }); +}); + +describe('merge()', function() { + it('pulls hooks from another Kareem object', function() { + var k1 = new Kareem(); + var test1 = function() {}; + k1.pre('cook', test1); + k1.post('cook', function() {}); + + var k2 = new Kareem(); + var test2 = function() {}; + k2.pre('cook', test2); + var k3 = k2.merge(k1); + assert.equal(k3._pres.get('cook').length, 2); + assert.equal(k3._pres.get('cook')[0].fn, test2); + assert.equal(k3._pres.get('cook')[1].fn, 
test1); + assert.equal(k3._posts.get('cook').length, 1); + }); +}); diff --git a/node_modules/kareem/test/misc.test.js b/node_modules/kareem/test/misc.test.js new file mode 100644 index 000000000..531b1d49d --- /dev/null +++ b/node_modules/kareem/test/misc.test.js @@ -0,0 +1,71 @@ +'use strict'; + +const assert = require('assert'); +const Kareem = require('../'); + +describe('hasHooks', function() { + it('returns false for toString (Automattic/mongoose#6538)', function() { + const k = new Kareem(); + assert.ok(!k.hasHooks('toString')); + }); +}); + +describe('merge', function() { + it('handles async pres if source doesnt have them', function() { + const k1 = new Kareem(); + k1.pre('cook', true, function(next, done) { + execed.first = true; + setTimeout( + function() { + done('error!'); + }, + 5); + + next(); + }); + + assert.equal(k1._pres.get('cook').numAsync, 1); + + const k2 = new Kareem(); + const k3 = k2.merge(k1); + assert.equal(k3._pres.get('cook').numAsync, 1); + }); +}); + +describe('filter', function() { + it('returns clone with only hooks that match `fn()`', function() { + const k1 = new Kareem(); + + k1.pre('update', { document: true }, f1); + k1.pre('update', { query: true }, f2); + k1.pre('remove', { document: true }, f3); + + k1.post('update', { document: true }, f1); + k1.post('update', { query: true }, f2); + k1.post('remove', { document: true }, f3); + + const k2 = k1.filter(hook => hook.document); + assert.equal(k2._pres.get('update').length, 1); + assert.equal(k2._pres.get('update')[0].fn, f1); + assert.equal(k2._pres.get('remove').length, 1); + assert.equal(k2._pres.get('remove')[0].fn, f3); + + assert.equal(k2._posts.get('update').length, 1); + assert.equal(k2._posts.get('update')[0].fn, f1); + assert.equal(k2._posts.get('remove').length, 1); + assert.equal(k2._posts.get('remove')[0].fn, f3); + + const k3 = k1.filter(hook => hook.query); + assert.equal(k3._pres.get('update').length, 1); + assert.equal(k3._pres.get('update')[0].fn, f2); + assert.ok(!k3._pres.has('remove')); + + assert.equal(k3._posts.get('update').length, 1); + assert.equal(k3._posts.get('update')[0].fn, f2); + assert.ok(!k3._posts.has('remove')); + + function f1() {} + function f2() {} + function f3() {} + }); +}); diff --git a/node_modules/kareem/test/post.test.js b/node_modules/kareem/test/post.test.js new file mode 100644 index 000000000..b9a776d24 --- /dev/null +++ b/node_modules/kareem/test/post.test.js @@ -0,0 +1,198 @@ +'use strict'; + +const assert = require('assert'); +const Kareem = require('../'); + +describe('execPost', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('handles errors', function(done) { + hooks.post('cook', function(eggs, callback) { + callback('error!'); + }); + + hooks.execPost('cook', null, [4], function(error, eggs) { + assert.equal('error!', error); + assert.ok(!eggs); + done(); + }); + }); + + it('unshift', function() { + var f1 = function() {}; + var f2 = function() {}; + hooks.post('cook', f1); + hooks.post('cook', f2, true); + assert.strictEqual(hooks._posts.get('cook')[0].fn, f2); + assert.strictEqual(hooks._posts.get('cook')[1].fn, f1); + }); + + it('arbitrary options', function() { + const f1 = function() {}; + const f2 = function() {}; + hooks.post('cook', { foo: 'bar' }, f1); + hooks.post('cook', { bar: 'baz' }, f2, true); + assert.equal(hooks._posts.get('cook')[1].foo, 'bar'); + assert.equal(hooks._posts.get('cook')[0].bar, 'baz'); + }); + + it('throws error if no function', function() { + assert.throws(() => 
hooks.post('test'), /got "undefined"/); + }); + + it('multiple posts', function(done) { + hooks.post('cook', function(eggs, callback) { + setTimeout( + function() { + callback(); + }, + 5); + }); + + hooks.post('cook', function(eggs, callback) { + setTimeout( + function() { + callback(); + }, + 5); + }); + + hooks.execPost('cook', null, [4], function(error, eggs) { + assert.ifError(error); + assert.equal(4, eggs); + done(); + }); + }); + + it('error posts', function(done) { + var called = {}; + hooks.post('cook', function(eggs, callback) { + called.first = true; + callback(); + }); + + hooks.post('cook', function(eggs, callback) { + called.second = true; + callback(new Error('fail')); + }); + + hooks.post('cook', function(eggs, callback) { + assert.ok(false); + }); + + hooks.post('cook', function(error, eggs, callback) { + called.fourth = true; + assert.equal(error.message, 'fail'); + callback(new Error('fourth')); + }); + + hooks.post('cook', function(error, eggs, callback) { + called.fifth = true; + assert.equal(error.message, 'fourth'); + callback(new Error('fifth')); + }); + + hooks.execPost('cook', null, [4], function(error, eggs) { + assert.ok(error); + assert.equal(error.message, 'fifth'); + assert.deepEqual(called, { + first: true, + second: true, + fourth: true, + fifth: true + }); + done(); + }); + }); + + it('error posts with initial error', function(done) { + var called = {}; + + hooks.post('cook', function(eggs, callback) { + assert.ok(false); + }); + + hooks.post('cook', function(error, eggs, callback) { + called.second = true; + assert.equal(error.message, 'fail'); + callback(new Error('second')); + }); + + hooks.post('cook', function(error, eggs, callback) { + called.third = true; + assert.equal(error.message, 'second'); + callback(new Error('third')); + }); + + hooks.post('cook', function(error, eggs, callback) { + called.fourth = true; + assert.equal(error.message, 'third'); + callback(); + }); + + var options = { error: new Error('fail') }; + hooks.execPost('cook', null, [4], options, function(error, eggs) { + assert.ok(error); + assert.equal(error.message, 'third'); + assert.deepEqual(called, { + second: true, + third: true, + fourth: true + }); + done(); + }); + }); + + it('supports returning a promise', function(done) { + var calledPost = 0; + + hooks.post('cook', function() { + return new Promise(resolve => { + setTimeout(() => { + ++calledPost; + resolve(); + }, 100); + }); + }); + + hooks.execPost('cook', null, [], {}, function(error) { + assert.ifError(error); + assert.equal(calledPost, 1); + done(); + }); + }); +}); + +describe('execPostSync', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('executes hooks synchronously', function() { + var execed = {}; + + hooks.post('cook', function() { + execed.first = true; + }); + + hooks.post('cook', function() { + execed.second = true; + }); + + hooks.execPostSync('cook', null); + assert.ok(execed.first); + assert.ok(execed.second); + }); + + it('works with no hooks specified', function() { + assert.doesNotThrow(function() { + hooks.execPostSync('cook', null); + }); + }); +}); diff --git a/node_modules/kareem/test/pre.test.js b/node_modules/kareem/test/pre.test.js new file mode 100644 index 000000000..ebf7bfbde --- /dev/null +++ b/node_modules/kareem/test/pre.test.js @@ -0,0 +1,340 @@ +'use strict'; + +const assert = require('assert'); +const Kareem = require('../'); + +describe('execPre', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + 
it('handles errors with multiple pres', function(done) { + var execed = {}; + + hooks.pre('cook', function(done) { + execed.first = true; + done(); + }); + + hooks.pre('cook', function(done) { + execed.second = true; + done('error!'); + }); + + hooks.pre('cook', function(done) { + execed.third = true; + done(); + }); + + hooks.execPre('cook', null, function(err) { + assert.equal('error!', err); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + done(); + }); + }); + + it('sync errors', function(done) { + var called = 0; + + hooks.pre('cook', function(next) { + throw new Error('woops!'); + }); + + hooks.pre('cook', function(next) { + ++called; + next(); + }); + + hooks.execPre('cook', null, function(err) { + assert.equal(err.message, 'woops!'); + assert.equal(called, 0); + done(); + }); + }); + + it('unshift', function() { + var f1 = function() {}; + var f2 = function() {}; + hooks.pre('cook', false, f1); + hooks.pre('cook', false, f2, null, true); + assert.strictEqual(hooks._pres.get('cook')[0].fn, f2); + assert.strictEqual(hooks._pres.get('cook')[1].fn, f1); + }); + + it('throws error if no function', function() { + assert.throws(() => hooks.pre('test'), /got "undefined"/); + }); + + it('arbitrary options', function() { + const f1 = function() {}; + const f2 = function() {}; + hooks.pre('cook', { foo: 'bar' }, f1); + hooks.pre('cook', { bar: 'baz' }, f2, null, true); + assert.equal(hooks._pres.get('cook')[1].foo, 'bar'); + assert.equal(hooks._pres.get('cook')[0].bar, 'baz'); + }); + + it('handles async errors', function(done) { + var execed = {}; + + hooks.pre('cook', true, function(next, done) { + execed.first = true; + setTimeout( + function() { + done('error!'); + }, + 5); + + next(); + }); + + hooks.pre('cook', true, function(next, done) { + execed.second = true; + setTimeout( + function() { + done('other error!'); + }, + 10); + + next(); + }); + + hooks.execPre('cook', null, function(err) { + assert.equal('error!', err); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + done(); + }); + }); + + it('handles async errors in next()', function(done) { + var execed = {}; + + hooks.pre('cook', true, function(next, done) { + execed.first = true; + setTimeout( + function() { + done('other error!'); + }, + 15); + + next(); + }); + + hooks.pre('cook', true, function(next, done) { + execed.second = true; + setTimeout( + function() { + next('error!'); + done('another error!'); + }, + 5); + }); + + hooks.execPre('cook', null, function(err) { + assert.equal('error!', err); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + done(); + }); + }); + + it('handles async errors in next() when already done', function(done) { + var execed = {}; + + hooks.pre('cook', true, function(next, done) { + execed.first = true; + setTimeout( + function() { + done('other error!'); + }, + 5); + + next(); + }); + + hooks.pre('cook', true, function(next, done) { + execed.second = true; + setTimeout( + function() { + next('error!'); + done('another error!'); + }, + 25); + }); + + hooks.execPre('cook', null, function(err) { + assert.equal('other error!', err); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + done(); + }); + }); + + it('async pres with clone()', function(done) { + var execed = false; + + hooks.pre('cook', true, function(next, done) { + execed = true; + setTimeout( + function() { + done(); + }, + 
5); + + next(); + }); + + hooks.clone().execPre('cook', null, function(err) { + assert.ifError(err); + assert.ok(execed); + done(); + }); + }); + + it('returns correct error when async pre errors', function(done) { + var execed = {}; + + hooks.pre('cook', true, function(next, done) { + execed.first = true; + setTimeout( + function() { + done('other error!'); + }, + 5); + + next(); + }); + + hooks.pre('cook', function(next) { + execed.second = true; + setTimeout( + function() { + next('error!'); + }, + 15); + }); + + hooks.execPre('cook', null, function(err) { + assert.equal('other error!', err); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + done(); + }); + }); + + it('lets async pres run when fully sync pres are done', function(done) { + var execed = {}; + + hooks.pre('cook', true, function(next, done) { + execed.first = true; + setTimeout( + function() { + done(); + }, + 5); + + next(); + }); + + hooks.pre('cook', function() { + execed.second = true; + }); + + hooks.execPre('cook', null, function(err) { + assert.ifError(err); + assert.equal(2, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + done(); + }); + }); + + it('allows passing arguments to the next pre', function(done) { + var execed = {}; + + hooks.pre('cook', function(next) { + execed.first = true; + next(null, 'test'); + }); + + hooks.pre('cook', function(next, p) { + execed.second = true; + assert.equal(p, 'test'); + next(); + }); + + hooks.pre('cook', function(next, p) { + execed.third = true; + assert.ok(!p); + next(); + }); + + hooks.execPre('cook', null, function(err) { + assert.ifError(err); + assert.equal(3, Object.keys(execed).length); + assert.ok(execed.first); + assert.ok(execed.second); + assert.ok(execed.third); + done(); + }); + }); + + it('handles sync errors in pre if there are more hooks', function(done) { + var execed = {}; + + hooks.pre('cook', function() { + execed.first = true; + throw new Error('Oops!'); + }); + + hooks.pre('cook', function() { + execed.second = true; + }); + + hooks.execPre('cook', null, function(err) { + assert.ok(err); + assert.ok(execed.first); + assert.equal(err.message, 'Oops!'); + done(); + }); + }); +}); + +describe('execPreSync', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('executes hooks synchronously', function() { + var execed = {}; + + hooks.pre('cook', function() { + execed.first = true; + }); + + hooks.pre('cook', function() { + execed.second = true; + }); + + hooks.execPreSync('cook', null); + assert.ok(execed.first); + assert.ok(execed.second); + }); + + it('works with no hooks specified', function() { + assert.doesNotThrow(function() { + hooks.execPreSync('cook', null); + }); + }); +}); diff --git a/node_modules/kareem/test/wrap.test.js b/node_modules/kareem/test/wrap.test.js new file mode 100644 index 000000000..dd9196e03 --- /dev/null +++ b/node_modules/kareem/test/wrap.test.js @@ -0,0 +1,342 @@ +var assert = require('assert'); +var Kareem = require('../'); + +describe('wrap()', function() { + var hooks; + + beforeEach(function() { + hooks = new Kareem(); + }); + + it('handles pre errors', function(done) { + hooks.pre('cook', function(done) { + done('error!'); + }); + + hooks.post('cook', function(obj) { + obj.tofu = 'no'; + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = [obj]; + args.push(function(error, result) { + assert.equal('error!', error); + assert.ok(!result); + assert.equal(undefined, obj.tofu); + done(); + }); + 
+ hooks.wrap( + 'cook', + function(o, callback) { + // Should never get called + assert.ok(false); + callback(null, o); + }, + obj, + args); + }); + + it('handles pre errors when no callback defined', function(done) { + hooks.pre('cook', function(done) { + done('error!'); + }); + + hooks.post('cook', function(obj) { + obj.tofu = 'no'; + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = [obj]; + + hooks.wrap( + 'cook', + function(o, callback) { + // Should never get called + assert.ok(false); + callback(null, o); + }, + obj, + args); + + setTimeout( + function() { + done(); + }, + 25); + }); + + it('handles errors in wrapped function', function(done) { + hooks.pre('cook', function(done) { + done(); + }); + + hooks.post('cook', function(obj) { + obj.tofu = 'no'; + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = [obj]; + args.push(function(error, result) { + assert.equal('error!', error); + assert.ok(!result); + assert.equal(undefined, obj.tofu); + done(); + }); + + hooks.wrap( + 'cook', + function(o, callback) { + callback('error!'); + }, + obj, + args); + }); + + it('handles errors in post', function(done) { + hooks.pre('cook', function(done) { + done(); + }); + + hooks.post('cook', function(obj, callback) { + obj.tofu = 'no'; + callback('error!'); + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = [obj]; + args.push(function(error, result) { + assert.equal('error!', error); + assert.ok(!result); + assert.equal('no', obj.tofu); + done(); + }); + + hooks.wrap( + 'cook', + function(o, callback) { + callback(null, o); + }, + obj, + args); + }); + + it('defers errors to post hooks if enabled', function(done) { + hooks.pre('cook', function(done) { + done(new Error('fail')); + }); + + hooks.post('cook', function(error, res, callback) { + callback(new Error('another error occurred')); + }); + + var args = []; + args.push(function(error) { + assert.equal(error.message, 'another error occurred'); + done(); + }); + + hooks.wrap( + 'cook', + function(callback) { + assert.ok(false); + callback(); + }, + null, + args, + { useErrorHandlers: true, numCallbackParams: 1 }); + }); + + it('error handlers with no callback', function(done) { + hooks.pre('cook', function(done) { + done(new Error('fail')); + }); + + hooks.post('cook', function(error, callback) { + assert.equal(error.message, 'fail'); + done(); + }); + + var args = []; + + hooks.wrap( + 'cook', + function(callback) { + assert.ok(false); + callback(); + }, + null, + args, + { useErrorHandlers: true }); + }); + + it('error handlers with no error', function(done) { + hooks.post('cook', function(error, callback) { + callback(new Error('another error occurred')); + }); + + var args = []; + args.push(function(error) { + assert.ifError(error); + done(); + }); + + hooks.wrap( + 'cook', + function(callback) { + callback(); + }, + null, + args, + { useErrorHandlers: true }); + }); + + it('works with no args', function(done) { + hooks.pre('cook', function(done) { + done(); + }); + + hooks.post('cook', function(callback) { + obj.tofu = 'no'; + callback(); + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = []; + + hooks.wrap( + 'cook', + function(callback) { + callback(null); + }, + obj, + args); + + setTimeout( + function() { + assert.equal('no', obj.tofu); + done(); + }, + 25); + }); + + it('handles pre errors with no args', function(done) { + hooks.pre('cook', function(done) { + done('error!'); + }); + + hooks.post('cook', function(callback) { + obj.tofu = 'no'; + callback(); + }); + + var obj = { bacon: 0, eggs: 0 }; + + var 
args = []; + + hooks.wrap( + 'cook', + function(callback) { + callback(null); + }, + obj, + args); + + setTimeout( + function() { + assert.equal(undefined, obj.tofu); + done(); + }, + 25); + }); + + it('handles wrapped function errors with no args', function(done) { + hooks.pre('cook', function(done) { + obj.waffles = false; + done(); + }); + + hooks.post('cook', function(callback) { + obj.tofu = 'no'; + callback(); + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = []; + + hooks.wrap( + 'cook', + function(callback) { + callback('error!'); + }, + obj, + args); + + setTimeout( + function() { + assert.equal(false, obj.waffles); + assert.equal(undefined, obj.tofu); + done(); + }, + 25); + }); + + it('handles post errors with no args', function(done) { + hooks.pre('cook', function(done) { + obj.waffles = false; + done(); + }); + + hooks.post('cook', function(callback) { + obj.tofu = 'no'; + callback('error!'); + }); + + var obj = { bacon: 0, eggs: 0 }; + + var args = []; + + hooks.wrap( + 'cook', + function(callback) { + callback(); + }, + obj, + args); + + setTimeout( + function() { + assert.equal(false, obj.waffles); + assert.equal('no', obj.tofu); + done(); + }, + 25); + }); + + it('sync wrappers', function() { + var calledPre = 0; + var calledFn = 0; + var calledPost = 0; + hooks.pre('cook', function() { + ++calledPre; + }); + + hooks.post('cook', function() { + ++calledPost; + }); + + var wrapper = hooks.createWrapperSync('cook', function() { ++calledFn; }); + + wrapper(); + + assert.equal(calledPre, 1); + assert.equal(calledFn, 1); + assert.equal(calledPost, 1); + }); +}); diff --git a/node_modules/keygrip/HISTORY.md b/node_modules/keygrip/HISTORY.md new file mode 100644 index 000000000..0f4cc31a4 --- /dev/null +++ b/node_modules/keygrip/HISTORY.md @@ -0,0 +1,25 @@ +1.1.0 / 2019-05-07 +================== + + * Use `tsscmp` module for timing-safe signature verification + +1.0.3 / 2018-09-12 +================== + + * perf: enable strict mode + +1.0.2 / 2017-08-26 +================== + + * perf: improve comparison speed + +1.0.1 / 2014-05-07 +================== + + * Readme changes + * Update repository for organization move + +1.0.0 / 2013-12-21 +================== + + * Remove default key generation and associated expectations diff --git a/node_modules/keygrip/LICENSE b/node_modules/keygrip/LICENSE new file mode 100644 index 000000000..38b64a175 --- /dev/null +++ b/node_modules/keygrip/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2011-2014 Jed Schmidt (http://jedschmidt.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/keygrip/README.md b/node_modules/keygrip/README.md new file mode 100644 index 000000000..925ecef41 --- /dev/null +++ b/node_modules/keygrip/README.md @@ -0,0 +1,103 @@ +# keygrip + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Keygrip is a [node.js](http://nodejs.org/) module for signing and verifying data (such as cookies or URLs) through a rotating credential system, in which new server keys can be added and old ones removed regularly, without invalidating client credentials. + +## Install + + $ npm install keygrip + +## API + +### keys = new Keygrip([keylist], [hmacAlgorithm], [encoding]) + +This creates a new Keygrip based on the provided keylist, an array of secret keys used for SHA1 HMAC digests. `keylist` is obligatory. `hmacAlgorithm` defaults to `'sha1'` and `encoding` defaults to `'base64'`. + +Note that the `new` operator is also optional, so all of the following will work when `Keygrip = require("keygrip")`: + +```javascript +keys = new Keygrip(["SEKRIT2", "SEKRIT1"]) +keys = Keygrip(["SEKRIT2", "SEKRIT1"]) +keys = require("keygrip")() +keys = Keygrip(["SEKRIT2", "SEKRIT1"], 'sha256', 'hex') +keys = Keygrip(["SEKRIT2", "SEKRIT1"], 'sha256') +keys = Keygrip(["SEKRIT2", "SEKRIT1"], undefined, 'hex') +``` + +The keylist is an array of all valid keys for signing, in descending order of freshness; new keys should be `unshift`ed into the array and old keys should be `pop`ped. + +The tradeoff here is that adding more keys to the keylist allows for more granular freshness for key validation, at the cost of a more expensive worst-case scenario for old or invalid hashes. + +Keygrip keeps a reference to this array to automatically reflect any changes. This reference is stored using a closure to prevent external access. + +### keys.sign(data) + +This creates a SHA1 HMAC based on the _first_ key in the keylist, and outputs it as a 27-byte url-safe base64 digest (base64 without padding, replacing `+` with `-` and `/` with `_`). + +### keys.index(data, digest) + +This loops through all of the keys currently in the keylist until the digest of the current key matches the given digest, at which point the current index is returned. If no key is matched, `-1` is returned. + +The idea is that if the index returned is greater than `0`, the data should be re-signed to prevent premature credential invalidation, and enable better performance for subsequent challenges. + +### keys.verify(data, digest) + +This uses `index` to return `true` if the digest matches any existing keys, and `false` otherwise. + +## Example + +```javascript +// ./test.js +var assert = require("assert") + , Keygrip = require("keygrip") + , keylist, keys, hash, index + +// but we're going to use our list. 
+// (note that the 'new' operator is optional) +keylist = ["SEKRIT3", "SEKRIT2", "SEKRIT1"] +keys = Keygrip(keylist) +// .sign returns the hash for the first key +// all hashes are SHA1 HMACs in url-safe base64 +hash = keys.sign("bieberschnitzel") +assert.ok(/^[\w\-]{27}$/.test(hash)) + +// .index returns the index of the first matching key +index = keys.index("bieberschnitzel", hash) +assert.equal(index, 0) + +// .verify returns the a boolean indicating a matched key +matched = keys.verify("bieberschnitzel", hash) +assert.ok(matched) + +index = keys.index("bieberschnitzel", "o_O") +assert.equal(index, -1) + +// rotate a new key in, and an old key out +keylist.unshift("SEKRIT4") +keylist.pop() + +// if index > 0, it's time to re-sign +index = keys.index("bieberschnitzel", hash) +assert.equal(index, 1) +hash = keys.sign("bieberschnitzel") +``` + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/keygrip.svg +[npm-url]: https://npmjs.org/package/keygrip +[travis-image]: https://img.shields.io/travis/crypto-utils/keygrip/master.svg +[travis-url]: https://travis-ci.org/crypto-utils/keygrip +[coveralls-image]: https://img.shields.io/coveralls/crypto-utils/keygrip/master.svg +[coveralls-url]: https://coveralls.io/r/crypto-utils/keygrip +[downloads-image]: https://img.shields.io/npm/dm/keygrip.svg +[downloads-url]: https://npmjs.org/package/keygrip +[node-version-image]: https://img.shields.io/node/v/keygrip.svg +[node-version-url]: https://nodejs.org/en/download/ diff --git a/node_modules/keygrip/index.js b/node_modules/keygrip/index.js new file mode 100644 index 000000000..dbdb1e74e --- /dev/null +++ b/node_modules/keygrip/index.js @@ -0,0 +1,51 @@ +/*! + * keygrip + * Copyright(c) 2011-2014 Jed Schmidt + * MIT Licensed + */ + +'use strict' + +var compare = require('tsscmp') +var crypto = require("crypto") + +function Keygrip(keys, algorithm, encoding) { + if (!algorithm) algorithm = "sha1"; + if (!encoding) encoding = "base64"; + if (!(this instanceof Keygrip)) return new Keygrip(keys, algorithm, encoding) + + if (!keys || !(0 in keys)) { + throw new Error("Keys must be provided.") + } + + function sign(data, key) { + return crypto + .createHmac(algorithm, key) + .update(data).digest(encoding) + .replace(/\/|\+|=/g, function(x) { + return ({ "/": "_", "+": "-", "=": "" })[x] + }) + } + + this.sign = function(data){ return sign(data, keys[0]) } + + this.verify = function(data, digest) { + return this.index(data, digest) > -1 + } + + this.index = function(data, digest) { + for (var i = 0, l = keys.length; i < l; i++) { + if (compare(digest, sign(data, keys[i]))) { + return i + } + } + + return -1 + } +} + +Keygrip.sign = Keygrip.verify = Keygrip.index = function() { + throw new Error("Usage: require('keygrip')()") +} + +module.exports = Keygrip diff --git a/node_modules/keygrip/package.json b/node_modules/keygrip/package.json new file mode 100644 index 000000000..85fc42bbe --- /dev/null +++ b/node_modules/keygrip/package.json @@ -0,0 +1,60 @@ +{ + "_from": "keygrip@~1.1.0", + "_id": "keygrip@1.1.0", + "_inBundle": false, + "_integrity": "sha512-iYSchDJ+liQ8iwbSI2QqsQOvqv58eJCEanyJPJi+Khyu8smkcKSFUCbPwzFcL7YVtZ6eONjqRX/38caJ7QjRAQ==", + "_location": "/keygrip", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "keygrip@~1.1.0", + "name": "keygrip", + "escapedName": "keygrip", + "rawSpec": "~1.1.0", + "saveSpec": null, + "fetchSpec": "~1.1.0" + }, + "_requiredBy": [ + "/cookies" + ], + "_resolved": 
"https://registry.npmjs.org/keygrip/-/keygrip-1.1.0.tgz", + "_shasum": "871b1681d5e159c62a445b0c74b615e0917e7226", + "_spec": "keygrip@~1.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cookies", + "bugs": { + "url": "https://github.com/crypto-utils/keygrip/issues" + }, + "bundleDependencies": false, + "dependencies": { + "tsscmp": "1.0.6" + }, + "deprecated": false, + "description": "Key signing and verification for rotated credentials", + "devDependencies": { + "mocha": "6.1.4", + "nyc": "14.0.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "README.md", + "index.js" + ], + "homepage": "https://github.com/crypto-utils/keygrip#readme", + "license": "MIT", + "name": "keygrip", + "repository": { + "type": "git", + "url": "git+https://github.com/crypto-utils/keygrip.git" + }, + "scripts": { + "test": "mocha --reporter spec test/", + "test-cov": "nyc --reporter=html --reporter=text npm test", + "test-travis": "nyc --reporter=text npm test" + }, + "version": "1.1.0" +} diff --git a/node_modules/keyv/LICENSE b/node_modules/keyv/LICENSE new file mode 100644 index 000000000..f27ee9b41 --- /dev/null +++ b/node_modules/keyv/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2017 Luke Childs + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/keyv/README.md b/node_modules/keyv/README.md new file mode 100644 index 000000000..2a9287c13 --- /dev/null +++ b/node_modules/keyv/README.md @@ -0,0 +1,276 @@ +

+ keyv
+ +> Simple key-value storage with support for multiple backends + +[![Build Status](https://travis-ci.org/lukechilds/keyv.svg?branch=master)](https://travis-ci.org/lukechilds/keyv) +[![Coverage Status](https://coveralls.io/repos/github/lukechilds/keyv/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/keyv?branch=master) +[![npm](https://img.shields.io/npm/dm/keyv.svg)](https://www.npmjs.com/package/keyv) +[![npm](https://img.shields.io/npm/v/keyv.svg)](https://www.npmjs.com/package/keyv) + +Keyv provides a consistent interface for key-value storage across multiple backends via storage adapters. It supports TTL based expiry, making it suitable as a cache or a persistent key-value store. + +## Features + +There are a few existing modules similar to Keyv, however Keyv is different because it: + +- Isn't bloated +- Has a simple Promise based API +- Suitable as a TTL based cache or persistent key-value store +- [Easily embeddable](#add-cache-support-to-your-module) inside another module +- Works with any storage that implements the [`Map`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map) API +- Handles all JSON types plus `Buffer` +- Supports namespaces +- Wide range of [**efficient, well tested**](#official-storage-adapters) storage adapters +- Connection errors are passed through (db failures won't kill your app) +- Supports the current active LTS version of Node.js or higher + +## Usage + +Install Keyv. + +``` +npm install --save keyv +``` + +By default everything is stored in memory, you can optionally also install a storage adapter. + +``` +npm install --save @keyv/redis +npm install --save @keyv/mongo +npm install --save @keyv/sqlite +npm install --save @keyv/postgres +npm install --save @keyv/mysql +``` + +Create a new Keyv instance, passing your connection string if applicable. Keyv will automatically load the correct storage adapter. + +```js +const Keyv = require('keyv'); + +// One of the following +const keyv = new Keyv(); +const keyv = new Keyv('redis://user:pass@localhost:6379'); +const keyv = new Keyv('mongodb://user:pass@localhost:27017/dbname'); +const keyv = new Keyv('sqlite://path/to/database.sqlite'); +const keyv = new Keyv('postgresql://user:pass@localhost:5432/dbname'); +const keyv = new Keyv('mysql://user:pass@localhost:3306/dbname'); + +// Handle DB connection errors +keyv.on('error', err => console.log('Connection Error', err)); + +await keyv.set('foo', 'expires in 1 second', 1000); // true +await keyv.set('foo', 'never expires'); // true +await keyv.get('foo'); // 'never expires' +await keyv.delete('foo'); // true +await keyv.clear(); // undefined +``` + +### Namespaces + +You can namespace your Keyv instance to avoid key collisions and allow you to clear only a certain namespace while using the same database. + +```js +const users = new Keyv('redis://user:pass@localhost:6379', { namespace: 'users' }); +const cache = new Keyv('redis://user:pass@localhost:6379', { namespace: 'cache' }); + +await users.set('foo', 'users'); // true +await cache.set('foo', 'cache'); // true +await users.get('foo'); // 'users' +await cache.get('foo'); // 'cache' +await users.clear(); // undefined +await users.get('foo'); // undefined +await cache.get('foo'); // 'cache' +``` + +### Custom Serializers + +Keyv uses [`json-buffer`](https://github.com/dominictarr/json-buffer) for data serialization to ensure consistency across different backends. 
+ +You can optionally provide your own serialization functions to support extra data types or to serialize to something other than JSON. + +```js +const keyv = new Keyv({ serialize: JSON.stringify, deserialize: JSON.parse }); +``` + +**Warning:** Using custom serializers means you lose any guarantee of data consistency. You should do extensive testing with your serialisation functions and chosen storage engine. + +## Official Storage Adapters + +The official storage adapters are covered by [over 150 integration tests](https://travis-ci.org/lukechilds/keyv/jobs/260418145) to guarantee consistent behaviour. They are lightweight, efficient wrappers over the DB clients making use of indexes and native TTLs where available. + +Database | Adapter | Native TTL | Status +---|---|---|--- +Redis | [@keyv/redis](https://github.com/lukechilds/keyv-redis) | Yes | [![Build Status](https://travis-ci.org/lukechilds/keyv-redis.svg?branch=master)](https://travis-ci.org/lukechilds/keyv-redis) [![Coverage Status](https://coveralls.io/repos/github/lukechilds/keyv-redis/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/keyv-redis?branch=master) +MongoDB | [@keyv/mongo](https://github.com/lukechilds/keyv-mongo) | Yes | [![Build Status](https://travis-ci.org/lukechilds/keyv-mongo.svg?branch=master)](https://travis-ci.org/lukechilds/keyv-mongo) [![Coverage Status](https://coveralls.io/repos/github/lukechilds/keyv-mongo/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/keyv-mongo?branch=master) +SQLite | [@keyv/sqlite](https://github.com/lukechilds/keyv-sqlite) | No | [![Build Status](https://travis-ci.org/lukechilds/keyv-sqlite.svg?branch=master)](https://travis-ci.org/lukechilds/keyv-sqlite) [![Coverage Status](https://coveralls.io/repos/github/lukechilds/keyv-sqlite/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/keyv-sqlite?branch=master) +PostgreSQL | [@keyv/postgres](https://github.com/lukechilds/keyv-postgres) | No | [![Build Status](https://travis-ci.org/lukechilds/keyv-postgres.svg?branch=master)](https://travis-ci.org/lukechildskeyv-postgreskeyv) [![Coverage Status](https://coveralls.io/repos/github/lukechilds/keyv-postgres/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/keyv-postgres?branch=master) +MySQL | [@keyv/mysql](https://github.com/lukechilds/keyv-mysql) | No | [![Build Status](https://travis-ci.org/lukechilds/keyv-mysql.svg?branch=master)](https://travis-ci.org/lukechilds/keyv-mysql) [![Coverage Status](https://coveralls.io/repos/github/lukechilds/keyv-mysql/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/keyv-mysql?branch=master) + +## Third-party Storage Adapters + +You can also use third-party storage adapters or build your own. Keyv will wrap these storage adapters in TTL functionality and handle complex types internally. + +```js +const Keyv = require('keyv'); +const myAdapter = require('./my-storage-adapter'); + +const keyv = new Keyv({ store: myAdapter }); +``` + +Any store that follows the [`Map`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map) api will work. + +```js +new Keyv({ store: new Map() }); +``` + +For example, [`quick-lru`](https://github.com/sindresorhus/quick-lru) is a completely unrelated module that implements the Map API. 
+ +```js +const Keyv = require('keyv'); +const QuickLRU = require('quick-lru'); + +const lru = new QuickLRU({ maxSize: 1000 }); +const keyv = new Keyv({ store: lru }); +``` + +The following are third-party storage adapters compatible with Keyv: + +- [quick-lru](https://github.com/sindresorhus/quick-lru) - Simple "Least Recently Used" (LRU) cache +- [keyv-file](https://github.com/zaaack/keyv-file) - File system storage adapter for Keyv +- [keyv-dynamodb](https://www.npmjs.com/package/keyv-dynamodb) - DynamoDB storage adapter for Keyv + +## Add Cache Support to your Module + +Keyv is designed to be easily embedded into other modules to add cache support. The recommended pattern is to expose a `cache` option in your modules options which is passed through to Keyv. Caching will work in memory by default and users have the option to also install a Keyv storage adapter and pass in a connection string, or any other storage that implements the `Map` API. + +You should also set a namespace for your module so you can safely call `.clear()` without clearing unrelated app data. + +Inside your module: + +```js +class AwesomeModule { + constructor(opts) { + this.cache = new Keyv({ + uri: typeof opts.cache === 'string' && opts.cache, + store: typeof opts.cache !== 'string' && opts.cache, + namespace: 'awesome-module' + }); + } +} +``` + +Now it can be consumed like this: + +```js +const AwesomeModule = require('awesome-module'); + +// Caches stuff in memory by default +const awesomeModule = new AwesomeModule(); + +// After npm install --save keyv-redis +const awesomeModule = new AwesomeModule({ cache: 'redis://localhost' }); + +// Some third-party module that implements the Map API +const awesomeModule = new AwesomeModule({ cache: some3rdPartyStore }); +``` + +## API + +### new Keyv([uri], [options]) + +Returns a new Keyv instance. + +The Keyv instance is also an `EventEmitter` that will emit an `'error'` event if the storage adapter connection fails. + +### uri + +Type: `String`
+Default: `undefined` + +The connection string URI. + +Merged into the options object as options.uri. + +### options + +Type: `Object` + +The options object is also passed through to the storage adapter. Check your storage adapter docs for any extra options. + +#### options.namespace + +Type: `String`
+Default: `'keyv'` + +Namespace for the current instance. + +#### options.ttl + +Type: `Number`
+Default: `undefined` + +Default TTL. Can be overridden by specifying a TTL on `.set()`. + +#### options.serialize + +Type: `Function`
+Default: `JSONB.stringify` + +A custom serialization function. + +#### options.deserialize + +Type: `Function`
+Default: `JSONB.parse` + +A custom deserialization function. + +#### options.store + +Type: `Storage adapter instance`
+Default: `new Map()` + +The storage adapter instance to be used by Keyv. + +#### options.adapter + +Type: `String`
+Default: `undefined` + +Specify an adapter to use. e.g `'redis'` or `'mongodb'`. + +### Instance + +Keys must always be strings. Values can be of any type. + +#### .set(key, value, [ttl]) + +Set a value. + +By default keys are persistent. You can set an expiry TTL in milliseconds. + +Returns `true`. + +#### .get(key) + +Returns the value. + +#### .delete(key) + +Deletes an entry. + +Returns `true` if the key existed, `false` if not. + +#### .clear() + +Delete all entries in the current namespace. + +Returns `undefined`. + +## License + +MIT © Luke Childs diff --git a/node_modules/keyv/package.json b/node_modules/keyv/package.json new file mode 100644 index 000000000..3fccdf6b2 --- /dev/null +++ b/node_modules/keyv/package.json @@ -0,0 +1,78 @@ +{ + "_from": "keyv@^3.0.0", + "_id": "keyv@3.1.0", + "_inBundle": false, + "_integrity": "sha512-9ykJ/46SN/9KPM/sichzQ7OvXyGDYKGTaDlKMGCAlg2UK8KRy4jb0d8sFc+0Tt0YYnThq8X2RZgCg74RPxgcVA==", + "_location": "/keyv", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "keyv@^3.0.0", + "name": "keyv", + "escapedName": "keyv", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/cacheable-request" + ], + "_resolved": "https://registry.npmjs.org/keyv/-/keyv-3.1.0.tgz", + "_shasum": "ecc228486f69991e49e9476485a5be1e8fc5c4d9", + "_spec": "keyv@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cacheable-request", + "author": { + "name": "Luke Childs", + "email": "lukechilds123@gmail.com", + "url": "http://lukechilds.co.uk" + }, + "bugs": { + "url": "https://github.com/lukechilds/keyv/issues" + }, + "bundleDependencies": false, + "dependencies": { + "json-buffer": "3.0.0" + }, + "deprecated": false, + "description": "Simple key-value storage with support for multiple backends", + "devDependencies": { + "@keyv/mongo": "*", + "@keyv/mysql": "*", + "@keyv/postgres": "*", + "@keyv/redis": "*", + "@keyv/sqlite": "*", + "@keyv/test-suite": "*", + "ava": "^0.25.0", + "coveralls": "^3.0.0", + "eslint-config-xo-lukechilds": "^1.0.0", + "nyc": "^11.0.3", + "this": "^1.0.2", + "timekeeper": "^2.0.0", + "xo": "^0.20.1" + }, + "homepage": "https://github.com/lukechilds/keyv", + "keywords": [ + "key", + "value", + "store", + "cache", + "ttl" + ], + "license": "MIT", + "main": "src/index.js", + "name": "keyv", + "repository": { + "type": "git", + "url": "git+https://github.com/lukechilds/keyv.git" + }, + "scripts": { + "coverage": "nyc report --reporter=text-lcov | coveralls", + "test": "xo && nyc ava test/keyv.js", + "test:full": "xo && nyc ava --serial" + }, + "version": "3.1.0", + "xo": { + "extends": "xo-lukechilds" + } +} diff --git a/node_modules/keyv/src/index.js b/node_modules/keyv/src/index.js new file mode 100644 index 000000000..02af495a7 --- /dev/null +++ b/node_modules/keyv/src/index.js @@ -0,0 +1,103 @@ +'use strict'; + +const EventEmitter = require('events'); +const JSONB = require('json-buffer'); + +const loadStore = opts => { + const adapters = { + redis: '@keyv/redis', + mongodb: '@keyv/mongo', + mongo: '@keyv/mongo', + sqlite: '@keyv/sqlite', + postgresql: '@keyv/postgres', + postgres: '@keyv/postgres', + mysql: '@keyv/mysql' + }; + if (opts.adapter || opts.uri) { + const adapter = opts.adapter || /^[^:]*/.exec(opts.uri)[0]; + return new (require(adapters[adapter]))(opts); + } + return new Map(); +}; + +class Keyv extends EventEmitter { + constructor(uri, opts) { + super(); + this.opts = Object.assign( + { + namespace: 
'keyv', + serialize: JSONB.stringify, + deserialize: JSONB.parse + }, + (typeof uri === 'string') ? { uri } : uri, + opts + ); + + if (!this.opts.store) { + const adapterOpts = Object.assign({}, this.opts); + this.opts.store = loadStore(adapterOpts); + } + + if (typeof this.opts.store.on === 'function') { + this.opts.store.on('error', err => this.emit('error', err)); + } + + this.opts.store.namespace = this.opts.namespace; + } + + _getKeyPrefix(key) { + return `${this.opts.namespace}:${key}`; + } + + get(key) { + key = this._getKeyPrefix(key); + const store = this.opts.store; + return Promise.resolve() + .then(() => store.get(key)) + .then(data => { + data = (typeof data === 'string') ? this.opts.deserialize(data) : data; + if (data === undefined) { + return undefined; + } + if (typeof data.expires === 'number' && Date.now() > data.expires) { + this.delete(key); + return undefined; + } + return data.value; + }); + } + + set(key, value, ttl) { + key = this._getKeyPrefix(key); + if (typeof ttl === 'undefined') { + ttl = this.opts.ttl; + } + if (ttl === 0) { + ttl = undefined; + } + const store = this.opts.store; + + return Promise.resolve() + .then(() => { + const expires = (typeof ttl === 'number') ? (Date.now() + ttl) : null; + value = { value, expires }; + return store.set(key, this.opts.serialize(value), ttl); + }) + .then(() => true); + } + + delete(key) { + key = this._getKeyPrefix(key); + const store = this.opts.store; + return Promise.resolve() + .then(() => store.delete(key)); + } + + clear() { + const store = this.opts.store; + return Promise.resolve() + .then(() => store.clear()); + } +} + +module.exports = Keyv; diff --git a/node_modules/latest-version/index.d.ts b/node_modules/latest-version/index.d.ts new file mode 100644 index 000000000..8b3a89a4c --- /dev/null +++ b/node_modules/latest-version/index.d.ts @@ -0,0 +1,42 @@ +declare namespace latestVersion { + interface Options { + /** + A semver range or [dist-tag](https://docs.npmjs.com/cli/dist-tag). + */ + readonly version?: string; + } +} + +declare const latestVersion: { + /** + Get the latest version of an npm package. 
+ + @example + ``` + import latestVersion = require('latest-version'); + + (async () => { + console.log(await latestVersion('ava')); + //=> '0.18.0' + + console.log(await latestVersion('@sindresorhus/df')); + //=> '1.0.1' + + // Also works with semver ranges and dist-tags + console.log(await latestVersion('npm', {version: 'latest-5'})); + //=> '5.5.1' + })(); + ``` + */ + (packageName: string, options?: latestVersion.Options): Promise; + + // TODO: Remove this for the next major release, refactor the whole definition to: + // declare function latestVersion( + // packageName: string, + // options?: latestVersion.Options + // ): Promise; + // export = latestVersion; + default: typeof latestVersion; +}; + +export = latestVersion; diff --git a/node_modules/latest-version/index.js b/node_modules/latest-version/index.js new file mode 100644 index 000000000..b22e3aba8 --- /dev/null +++ b/node_modules/latest-version/index.js @@ -0,0 +1,11 @@ +'use strict'; +const packageJson = require('package-json'); + +const lastestVersion = async (packageName, options) => { + const {version} = await packageJson(packageName.toLowerCase(), options); + return version; +}; + +module.exports = lastestVersion; +// TODO: Remove this for the next major release +module.exports.default = lastestVersion; diff --git a/node_modules/latest-version/license b/node_modules/latest-version/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/latest-version/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
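The `latest-version` implementation above is a thin wrapper: it lowercases the package name, asks `package-json` for the metadata, and returns the resolved `version` field. A minimal usage sketch follows; the package names and the range passed via `version` are only illustrative, and the `try/catch` is generic error handling rather than part of the module.

```js
const latestVersion = require('latest-version');

(async () => {
  try {
    // Resolves whatever the "latest" dist-tag currently points to
    console.log(await latestVersion('ava'));

    // A semver range or dist-tag is forwarded to package-json via the `version` option
    console.log(await latestVersion('npm', {version: '^6.0.0'}));
  } catch (error) {
    // Unknown packages or network problems reject the promise
    console.error('Could not resolve version:', error.message);
  }
})();
```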
diff --git a/node_modules/latest-version/package.json b/node_modules/latest-version/package.json new file mode 100644 index 000000000..011bef373 --- /dev/null +++ b/node_modules/latest-version/package.json @@ -0,0 +1,74 @@ +{ + "_from": "latest-version@^5.1.0", + "_id": "latest-version@5.1.0", + "_inBundle": false, + "_integrity": "sha512-weT+r0kTkRQdCdYCNtkMwWXQTMEswKrFBkm4ckQOMVhhqhIMI1UT2hMj+1iigIhgSZm5gTmrRXBNoGUgaTY1xA==", + "_location": "/latest-version", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "latest-version@^5.1.0", + "name": "latest-version", + "escapedName": "latest-version", + "rawSpec": "^5.1.0", + "saveSpec": null, + "fetchSpec": "^5.1.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/latest-version/-/latest-version-5.1.0.tgz", + "_shasum": "119dfe908fe38d15dfa43ecd13fa12ec8832face", + "_spec": "latest-version@^5.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/latest-version/issues" + }, + "bundleDependencies": false, + "dependencies": { + "package-json": "^6.3.0" + }, + "deprecated": false, + "description": "Get the latest version of an npm package", + "devDependencies": { + "ava": "^1.4.1", + "semver": "^6.0.0", + "semver-regex": "^2.0.0", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/latest-version#readme", + "keywords": [ + "latest", + "version", + "npm", + "pkg", + "package", + "package.json", + "current", + "module" + ], + "license": "MIT", + "name": "latest-version", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/latest-version.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "5.1.0" +} diff --git a/node_modules/latest-version/readme.md b/node_modules/latest-version/readme.md new file mode 100644 index 000000000..ee572662b --- /dev/null +++ b/node_modules/latest-version/readme.md @@ -0,0 +1,42 @@ +# latest-version [![Build Status](https://travis-ci.org/sindresorhus/latest-version.svg?branch=master)](https://travis-ci.org/sindresorhus/latest-version) + +> Get the latest version of an npm package + +Fetches the version directly from the registry instead of depending on the massive [npm](https://github.com/npm/npm/blob/8b5e7b6ae5b4cd2d7d62eaf93b1428638b387072/package.json#L37-L85) module like the [latest](https://github.com/bahamas10/node-latest) module does. 
+ + +## Install + +``` +$ npm install latest-version +``` + + +## Usage + +```js +const latestVersion = require('latest-version'); + +(async () => { + console.log(await latestVersion('ava')); + //=> '0.18.0' + + console.log(await latestVersion('@sindresorhus/df')); + //=> '1.0.1' + + // Also works with semver ranges and dist-tags + console.log(await latestVersion('npm', {version: 'latest-5'})); + //=> '5.5.1' +})(); +``` + + +## Related + +- [latest-version-cli](https://github.com/sindresorhus/latest-version-cli) - CLI for this module +- [package-json](https://github.com/sindresorhus/package-json) - Get the package.json of a package from the npm registry + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/lowercase-keys/index.js b/node_modules/lowercase-keys/index.js new file mode 100644 index 000000000..b8d889836 --- /dev/null +++ b/node_modules/lowercase-keys/index.js @@ -0,0 +1,11 @@ +'use strict'; +module.exports = function (obj) { + var ret = {}; + var keys = Object.keys(Object(obj)); + + for (var i = 0; i < keys.length; i++) { + ret[keys[i].toLowerCase()] = obj[keys[i]]; + } + + return ret; +}; diff --git a/node_modules/lowercase-keys/license b/node_modules/lowercase-keys/license new file mode 100644 index 000000000..654d0bfe9 --- /dev/null +++ b/node_modules/lowercase-keys/license @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
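`lowercase-keys` itself is only a few lines (shown above): it copies every enumerable key of the input object onto a new object with the key lowercased. A short sketch of the kind of HTTP-header normalization it is typically used for; the header object here is made up for illustration.

```js
const lowercaseKeys = require('lowercase-keys');

// HTTP header names are case-insensitive, so normalizing keys up front
// lets every later lookup use one canonical, lowercase spelling.
const rawHeaders = {'Content-Type': 'application/json', 'X-Request-ID': 'abc123'};
const headers = lowercaseKeys(rawHeaders);

console.log(headers['content-type']); //=> 'application/json'
console.log(headers['x-request-id']); //=> 'abc123'
```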
diff --git a/node_modules/lowercase-keys/package.json b/node_modules/lowercase-keys/package.json new file mode 100644 index 000000000..fdf9a848e --- /dev/null +++ b/node_modules/lowercase-keys/package.json @@ -0,0 +1,68 @@ +{ + "_from": "lowercase-keys@^1.0.1", + "_id": "lowercase-keys@1.0.1", + "_inBundle": false, + "_integrity": "sha512-G2Lj61tXDnVFFOi8VZds+SoQjtQC3dgokKdDG2mTm1tx4m50NUHBOZSBwQQHyy0V12A0JTG4icfZQH+xPyh8VA==", + "_location": "/lowercase-keys", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "lowercase-keys@^1.0.1", + "name": "lowercase-keys", + "escapedName": "lowercase-keys", + "rawSpec": "^1.0.1", + "saveSpec": null, + "fetchSpec": "^1.0.1" + }, + "_requiredBy": [ + "/got", + "/responselike" + ], + "_resolved": "https://registry.npmjs.org/lowercase-keys/-/lowercase-keys-1.0.1.tgz", + "_shasum": "6f9e30b47084d971a7c820ff15a6c5167b74c26f", + "_spec": "lowercase-keys@^1.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/lowercase-keys/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Lowercase the keys of an object", + "devDependencies": { + "ava": "*" + }, + "engines": { + "node": ">=0.10.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/lowercase-keys#readme", + "keywords": [ + "object", + "assign", + "extend", + "properties", + "lowercase", + "lower-case", + "case", + "keys", + "key" + ], + "license": "MIT", + "name": "lowercase-keys", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/lowercase-keys.git" + }, + "scripts": { + "test": "ava" + }, + "version": "1.0.1" +} diff --git a/node_modules/lowercase-keys/readme.md b/node_modules/lowercase-keys/readme.md new file mode 100644 index 000000000..dc65770a3 --- /dev/null +++ b/node_modules/lowercase-keys/readme.md @@ -0,0 +1,33 @@ +# lowercase-keys [![Build Status](https://travis-ci.org/sindresorhus/lowercase-keys.svg?branch=master)](https://travis-ci.org/sindresorhus/lowercase-keys) + +> Lowercase the keys of an object + + +## Install + +``` +$ npm install --save lowercase-keys +``` + + +## Usage + +```js +var lowercaseKeys = require('lowercase-keys'); + +lowercaseKeys({FOO: true, bAr: false}); +//=> {foo: true, bar: false} +``` + + +## API + +### lowercaseKeys(object) + +Lowercases the keys and returns a new object. + + + +## License + +MIT © [Sindre Sorhus](http://sindresorhus.com) diff --git a/node_modules/lru-cache/LICENSE b/node_modules/lru-cache/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/lru-cache/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/lru-cache/README.md b/node_modules/lru-cache/README.md new file mode 100644 index 000000000..435dfebb7 --- /dev/null +++ b/node_modules/lru-cache/README.md @@ -0,0 +1,166 @@ +# lru cache + +A cache object that deletes the least-recently-used items. + +[![Build Status](https://travis-ci.org/isaacs/node-lru-cache.svg?branch=master)](https://travis-ci.org/isaacs/node-lru-cache) [![Coverage Status](https://coveralls.io/repos/isaacs/node-lru-cache/badge.svg?service=github)](https://coveralls.io/github/isaacs/node-lru-cache) + +## Installation: + +```javascript +npm install lru-cache --save +``` + +## Usage: + +```javascript +var LRU = require("lru-cache") + , options = { max: 500 + , length: function (n, key) { return n * 2 + key.length } + , dispose: function (key, n) { n.close() } + , maxAge: 1000 * 60 * 60 } + , cache = new LRU(options) + , otherCache = new LRU(50) // sets just the max size + +cache.set("key", "value") +cache.get("key") // "value" + +// non-string keys ARE fully supported +// but note that it must be THE SAME object, not +// just a JSON-equivalent object. +var someObject = { a: 1 } +cache.set(someObject, 'a value') +// Object keys are not toString()-ed +cache.set('[object Object]', 'a different value') +assert.equal(cache.get(someObject), 'a value') +// A similar object with same keys/values won't work, +// because it's a different object identity +assert.equal(cache.get({ a: 1 }), undefined) + +cache.reset() // empty the cache +``` + +If you put more stuff in it, then items will fall out. + +If you try to put an oversized thing in it, then it'll fall out right +away. + +## Options + +* `max` The maximum size of the cache, checked by applying the length + function to all values in the cache. Not setting this is kind of + silly, since that's the whole purpose of this lib, but it defaults + to `Infinity`. Setting it to a non-number or negative number will + throw a `TypeError`. Setting it to 0 makes it be `Infinity`. +* `maxAge` Maximum age in ms. Items are not pro-actively pruned out + as they age, but if you try to get an item that is too old, it'll + drop it and return undefined instead of giving it to you. + Setting this to a negative value will make everything seem old! + Setting it to a non-number will throw a `TypeError`. +* `length` Function that is used to calculate the length of stored + items. If you're storing strings or buffers, then you probably want + to do something like `function(n, key){return n.length}`. The default is + `function(){return 1}`, which is fine if you want to store `max` + like-sized things. The item is passed as the first argument, and + the key is passed as the second argumnet. +* `dispose` Function that is called on items when they are dropped + from the cache. This can be handy if you want to close file + descriptors or do other cleanup tasks when items are no longer + accessible. Called with `key, value`. It's called *before* + actually removing the item from the internal cache, so if you want + to immediately put it back in, you'll have to do that in a + `nextTick` or `setTimeout` callback or it won't do anything. 
+* `stale` By default, if you set a `maxAge`, it'll only actually pull + stale items out of the cache when you `get(key)`. (That is, it's + not pre-emptively doing a `setTimeout` or anything.) If you set + `stale:true`, it'll return the stale value before deleting it. If + you don't set this, then it'll return `undefined` when you try to + get a stale entry, as if it had already been deleted. +* `noDisposeOnSet` By default, if you set a `dispose()` method, then + it'll be called whenever a `set()` operation overwrites an existing + key. If you set this option, `dispose()` will only be called when a + key falls out of the cache, not when it is overwritten. +* `updateAgeOnGet` When using time-expiring entries with `maxAge`, + setting this to `true` will make each item's effective time update + to the current time whenever it is retrieved from cache, causing it + to not expire. (It can still fall out of cache based on recency of + use, of course.) + +## API + +* `set(key, value, maxAge)` +* `get(key) => value` + + Both of these will update the "recently used"-ness of the key. + They do what you think. `maxAge` is optional and overrides the + cache `maxAge` option if provided. + + If the key is not found, `get()` will return `undefined`. + + The key and val can be any value. + +* `peek(key)` + + Returns the key value (or `undefined` if not found) without + updating the "recently used"-ness of the key. + + (If you find yourself using this a lot, you *might* be using the + wrong sort of data structure, but there are some use cases where + it's handy.) + +* `del(key)` + + Deletes a key out of the cache. + +* `reset()` + + Clear the cache entirely, throwing away all values. + +* `has(key)` + + Check if a key is in the cache, without updating the recent-ness + or deleting it for being stale. + +* `forEach(function(value,key,cache), [thisp])` + + Just like `Array.prototype.forEach`. Iterates over all the keys + in the cache, in order of recent-ness. (Ie, more recently used + items are iterated over first.) + +* `rforEach(function(value,key,cache), [thisp])` + + The same as `cache.forEach(...)` but items are iterated over in + reverse order. (ie, less recently used items are iterated over + first.) + +* `keys()` + + Return an array of the keys in the cache. + +* `values()` + + Return an array of the values in the cache. + +* `length` + + Return total length of objects in cache taking into account + `length` options function. + +* `itemCount` + + Return total quantity of objects currently in cache. Note, that + `stale` (see options) items are returned as part of this item + count. + +* `dump()` + + Return an array of the cache entries ready for serialization and usage + with 'destinationCache.load(arr)`. + +* `load(cacheEntriesArray)` + + Loads another cache entries array, obtained with `sourceCache.dump()`, + into the cache. 
The destination cache is reset before loading new entries + +* `prune()` + + Manually iterates over the entire cache proactively pruning old entries diff --git a/node_modules/lru-cache/index.js b/node_modules/lru-cache/index.js new file mode 100644 index 000000000..573b6b85b --- /dev/null +++ b/node_modules/lru-cache/index.js @@ -0,0 +1,334 @@ +'use strict' + +// A linked list to keep track of recently-used-ness +const Yallist = require('yallist') + +const MAX = Symbol('max') +const LENGTH = Symbol('length') +const LENGTH_CALCULATOR = Symbol('lengthCalculator') +const ALLOW_STALE = Symbol('allowStale') +const MAX_AGE = Symbol('maxAge') +const DISPOSE = Symbol('dispose') +const NO_DISPOSE_ON_SET = Symbol('noDisposeOnSet') +const LRU_LIST = Symbol('lruList') +const CACHE = Symbol('cache') +const UPDATE_AGE_ON_GET = Symbol('updateAgeOnGet') + +const naiveLength = () => 1 + +// lruList is a yallist where the head is the youngest +// item, and the tail is the oldest. the list contains the Hit +// objects as the entries. +// Each Hit object has a reference to its Yallist.Node. This +// never changes. +// +// cache is a Map (or PseudoMap) that matches the keys to +// the Yallist.Node object. +class LRUCache { + constructor (options) { + if (typeof options === 'number') + options = { max: options } + + if (!options) + options = {} + + if (options.max && (typeof options.max !== 'number' || options.max < 0)) + throw new TypeError('max must be a non-negative number') + // Kind of weird to have a default max of Infinity, but oh well. + const max = this[MAX] = options.max || Infinity + + const lc = options.length || naiveLength + this[LENGTH_CALCULATOR] = (typeof lc !== 'function') ? naiveLength : lc + this[ALLOW_STALE] = options.stale || false + if (options.maxAge && typeof options.maxAge !== 'number') + throw new TypeError('maxAge must be a number') + this[MAX_AGE] = options.maxAge || 0 + this[DISPOSE] = options.dispose + this[NO_DISPOSE_ON_SET] = options.noDisposeOnSet || false + this[UPDATE_AGE_ON_GET] = options.updateAgeOnGet || false + this.reset() + } + + // resize the cache when the max changes. + set max (mL) { + if (typeof mL !== 'number' || mL < 0) + throw new TypeError('max must be a non-negative number') + + this[MAX] = mL || Infinity + trim(this) + } + get max () { + return this[MAX] + } + + set allowStale (allowStale) { + this[ALLOW_STALE] = !!allowStale + } + get allowStale () { + return this[ALLOW_STALE] + } + + set maxAge (mA) { + if (typeof mA !== 'number') + throw new TypeError('maxAge must be a non-negative number') + + this[MAX_AGE] = mA + trim(this) + } + get maxAge () { + return this[MAX_AGE] + } + + // resize the cache when the lengthCalculator changes. 
+ set lengthCalculator (lC) { + if (typeof lC !== 'function') + lC = naiveLength + + if (lC !== this[LENGTH_CALCULATOR]) { + this[LENGTH_CALCULATOR] = lC + this[LENGTH] = 0 + this[LRU_LIST].forEach(hit => { + hit.length = this[LENGTH_CALCULATOR](hit.value, hit.key) + this[LENGTH] += hit.length + }) + } + trim(this) + } + get lengthCalculator () { return this[LENGTH_CALCULATOR] } + + get length () { return this[LENGTH] } + get itemCount () { return this[LRU_LIST].length } + + rforEach (fn, thisp) { + thisp = thisp || this + for (let walker = this[LRU_LIST].tail; walker !== null;) { + const prev = walker.prev + forEachStep(this, fn, walker, thisp) + walker = prev + } + } + + forEach (fn, thisp) { + thisp = thisp || this + for (let walker = this[LRU_LIST].head; walker !== null;) { + const next = walker.next + forEachStep(this, fn, walker, thisp) + walker = next + } + } + + keys () { + return this[LRU_LIST].toArray().map(k => k.key) + } + + values () { + return this[LRU_LIST].toArray().map(k => k.value) + } + + reset () { + if (this[DISPOSE] && + this[LRU_LIST] && + this[LRU_LIST].length) { + this[LRU_LIST].forEach(hit => this[DISPOSE](hit.key, hit.value)) + } + + this[CACHE] = new Map() // hash of items by key + this[LRU_LIST] = new Yallist() // list of items in order of use recency + this[LENGTH] = 0 // length of items in the list + } + + dump () { + return this[LRU_LIST].map(hit => + isStale(this, hit) ? false : { + k: hit.key, + v: hit.value, + e: hit.now + (hit.maxAge || 0) + }).toArray().filter(h => h) + } + + dumpLru () { + return this[LRU_LIST] + } + + set (key, value, maxAge) { + maxAge = maxAge || this[MAX_AGE] + + if (maxAge && typeof maxAge !== 'number') + throw new TypeError('maxAge must be a number') + + const now = maxAge ? Date.now() : 0 + const len = this[LENGTH_CALCULATOR](value, key) + + if (this[CACHE].has(key)) { + if (len > this[MAX]) { + del(this, this[CACHE].get(key)) + return false + } + + const node = this[CACHE].get(key) + const item = node.value + + // dispose of the old one before overwriting + // split out into 2 ifs for better coverage tracking + if (this[DISPOSE]) { + if (!this[NO_DISPOSE_ON_SET]) + this[DISPOSE](key, item.value) + } + + item.now = now + item.maxAge = maxAge + item.value = value + this[LENGTH] += len - item.length + item.length = len + this.get(key) + trim(this) + return true + } + + const hit = new Entry(key, value, len, now, maxAge) + + // oversized objects fall out of cache automatically. 
+ if (hit.length > this[MAX]) { + if (this[DISPOSE]) + this[DISPOSE](key, value) + + return false + } + + this[LENGTH] += hit.length + this[LRU_LIST].unshift(hit) + this[CACHE].set(key, this[LRU_LIST].head) + trim(this) + return true + } + + has (key) { + if (!this[CACHE].has(key)) return false + const hit = this[CACHE].get(key).value + return !isStale(this, hit) + } + + get (key) { + return get(this, key, true) + } + + peek (key) { + return get(this, key, false) + } + + pop () { + const node = this[LRU_LIST].tail + if (!node) + return null + + del(this, node) + return node.value + } + + del (key) { + del(this, this[CACHE].get(key)) + } + + load (arr) { + // reset the cache + this.reset() + + const now = Date.now() + // A previous serialized cache has the most recent items first + for (let l = arr.length - 1; l >= 0; l--) { + const hit = arr[l] + const expiresAt = hit.e || 0 + if (expiresAt === 0) + // the item was created without expiration in a non aged cache + this.set(hit.k, hit.v) + else { + const maxAge = expiresAt - now + // dont add already expired items + if (maxAge > 0) { + this.set(hit.k, hit.v, maxAge) + } + } + } + } + + prune () { + this[CACHE].forEach((value, key) => get(this, key, false)) + } +} + +const get = (self, key, doUse) => { + const node = self[CACHE].get(key) + if (node) { + const hit = node.value + if (isStale(self, hit)) { + del(self, node) + if (!self[ALLOW_STALE]) + return undefined + } else { + if (doUse) { + if (self[UPDATE_AGE_ON_GET]) + node.value.now = Date.now() + self[LRU_LIST].unshiftNode(node) + } + } + return hit.value + } +} + +const isStale = (self, hit) => { + if (!hit || (!hit.maxAge && !self[MAX_AGE])) + return false + + const diff = Date.now() - hit.now + return hit.maxAge ? diff > hit.maxAge + : self[MAX_AGE] && (diff > self[MAX_AGE]) +} + +const trim = self => { + if (self[LENGTH] > self[MAX]) { + for (let walker = self[LRU_LIST].tail; + self[LENGTH] > self[MAX] && walker !== null;) { + // We know that we're about to delete this one, and also + // what the next least recently used key will be, so just + // go ahead and set it now. 
+ const prev = walker.prev + del(self, walker) + walker = prev + } + } +} + +const del = (self, node) => { + if (node) { + const hit = node.value + if (self[DISPOSE]) + self[DISPOSE](hit.key, hit.value) + + self[LENGTH] -= hit.length + self[CACHE].delete(hit.key) + self[LRU_LIST].removeNode(node) + } +} + +class Entry { + constructor (key, value, length, now, maxAge) { + this.key = key + this.value = value + this.length = length + this.now = now + this.maxAge = maxAge || 0 + } +} + +const forEachStep = (self, fn, node, thisp) => { + let hit = node.value + if (isStale(self, hit)) { + del(self, node) + if (!self[ALLOW_STALE]) + hit = undefined + } + if (hit) + fn.call(thisp, hit.value, hit.key, self) +} + +module.exports = LRUCache diff --git a/node_modules/lru-cache/package.json b/node_modules/lru-cache/package.json new file mode 100644 index 000000000..955fdacda --- /dev/null +++ b/node_modules/lru-cache/package.json @@ -0,0 +1,69 @@ +{ + "_from": "lru-cache@^6.0.0", + "_id": "lru-cache@6.0.0", + "_inBundle": false, + "_integrity": "sha512-Jo6dJ04CmSjuznwJSS3pUeWmd/H0ffTlkXXgwZi+eq1UCmqQwCh+eLsYOYCwY991i2Fah4h1BEMCx4qThGbsiA==", + "_location": "/lru-cache", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "lru-cache@^6.0.0", + "name": "lru-cache", + "escapedName": "lru-cache", + "rawSpec": "^6.0.0", + "saveSpec": null, + "fetchSpec": "^6.0.0" + }, + "_requiredBy": [ + "/update-notifier/semver" + ], + "_resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-6.0.0.tgz", + "_shasum": "6d6fe6570ebd96aaf90fcad1dafa3b2566db3a94", + "_spec": "lru-cache@^6.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier\\node_modules\\semver", + "author": { + "name": "Isaac Z. Schlueter", + "email": "i@izs.me" + }, + "bugs": { + "url": "https://github.com/isaacs/node-lru-cache/issues" + }, + "bundleDependencies": false, + "dependencies": { + "yallist": "^4.0.0" + }, + "deprecated": false, + "description": "A cache object that deletes the least-recently-used items.", + "devDependencies": { + "benchmark": "^2.1.4", + "tap": "^14.10.7" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/isaacs/node-lru-cache#readme", + "keywords": [ + "mru", + "lru", + "cache" + ], + "license": "ISC", + "main": "index.js", + "name": "lru-cache", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/node-lru-cache.git" + }, + "scripts": { + "postversion": "npm publish", + "prepublishOnly": "git push origin --follow-tags", + "preversion": "npm test", + "snap": "tap", + "test": "tap" + }, + "version": "6.0.0" +} diff --git a/node_modules/make-dir/index.d.ts b/node_modules/make-dir/index.d.ts new file mode 100644 index 000000000..3a7825121 --- /dev/null +++ b/node_modules/make-dir/index.d.ts @@ -0,0 +1,66 @@ +/// +import * as fs from 'fs'; + +declare namespace makeDir { + interface Options { + /** + Directory [permissions](https://x-team.com/blog/file-system-permissions-umask-node-js/). + + @default 0o777 + */ + readonly mode?: number; + + /** + Use a custom `fs` implementation. For example [`graceful-fs`](https://github.com/isaacs/node-graceful-fs). + + Using a custom `fs` implementation will block the use of the native `recursive` option if `fs.mkdir` or `fs.mkdirSync` is not the native function. 
+ + @default require('fs') + */ + readonly fs?: typeof fs; + } +} + +declare const makeDir: { + /** + Make a directory and its parents if needed - Think `mkdir -p`. + + @param path - Directory to create. + @returns The path to the created directory. + + @example + ``` + import makeDir = require('make-dir'); + + (async () => { + const path = await makeDir('unicorn/rainbow/cake'); + + console.log(path); + //=> '/Users/sindresorhus/fun/unicorn/rainbow/cake' + + // Multiple directories: + const paths = await Promise.all([ + makeDir('unicorn/rainbow'), + makeDir('foo/bar') + ]); + + console.log(paths); + // [ + // '/Users/sindresorhus/fun/unicorn/rainbow', + // '/Users/sindresorhus/fun/foo/bar' + // ] + })(); + ``` + */ + (path: string, options?: makeDir.Options): Promise; + + /** + Synchronously make a directory and its parents if needed - Think `mkdir -p`. + + @param path - Directory to create. + @returns The path to the created directory. + */ + sync(path: string, options?: makeDir.Options): string; +}; + +export = makeDir; diff --git a/node_modules/make-dir/index.js b/node_modules/make-dir/index.js new file mode 100644 index 000000000..75889d84a --- /dev/null +++ b/node_modules/make-dir/index.js @@ -0,0 +1,156 @@ +'use strict'; +const fs = require('fs'); +const path = require('path'); +const {promisify} = require('util'); +const semver = require('semver'); + +const useNativeRecursiveOption = semver.satisfies(process.version, '>=10.12.0'); + +// https://github.com/nodejs/node/issues/8987 +// https://github.com/libuv/libuv/pull/1088 +const checkPath = pth => { + if (process.platform === 'win32') { + const pathHasInvalidWinCharacters = /[<>:"|?*]/.test(pth.replace(path.parse(pth).root, '')); + + if (pathHasInvalidWinCharacters) { + const error = new Error(`Path contains invalid characters: ${pth}`); + error.code = 'EINVAL'; + throw error; + } + } +}; + +const processOptions = options => { + // https://github.com/sindresorhus/make-dir/issues/18 + const defaults = { + mode: 0o777, + fs + }; + + return { + ...defaults, + ...options + }; +}; + +const permissionError = pth => { + // This replicates the exception of `fs.mkdir` with native the + // `recusive` option when run on an invalid drive under Windows. 
+ const error = new Error(`operation not permitted, mkdir '${pth}'`); + error.code = 'EPERM'; + error.errno = -4048; + error.path = pth; + error.syscall = 'mkdir'; + return error; +}; + +const makeDir = async (input, options) => { + checkPath(input); + options = processOptions(options); + + const mkdir = promisify(options.fs.mkdir); + const stat = promisify(options.fs.stat); + + if (useNativeRecursiveOption && options.fs.mkdir === fs.mkdir) { + const pth = path.resolve(input); + + await mkdir(pth, { + mode: options.mode, + recursive: true + }); + + return pth; + } + + const make = async pth => { + try { + await mkdir(pth, options.mode); + + return pth; + } catch (error) { + if (error.code === 'EPERM') { + throw error; + } + + if (error.code === 'ENOENT') { + if (path.dirname(pth) === pth) { + throw permissionError(pth); + } + + if (error.message.includes('null bytes')) { + throw error; + } + + await make(path.dirname(pth)); + + return make(pth); + } + + try { + const stats = await stat(pth); + if (!stats.isDirectory()) { + throw new Error('The path is not a directory'); + } + } catch (_) { + throw error; + } + + return pth; + } + }; + + return make(path.resolve(input)); +}; + +module.exports = makeDir; + +module.exports.sync = (input, options) => { + checkPath(input); + options = processOptions(options); + + if (useNativeRecursiveOption && options.fs.mkdirSync === fs.mkdirSync) { + const pth = path.resolve(input); + + fs.mkdirSync(pth, { + mode: options.mode, + recursive: true + }); + + return pth; + } + + const make = pth => { + try { + options.fs.mkdirSync(pth, options.mode); + } catch (error) { + if (error.code === 'EPERM') { + throw error; + } + + if (error.code === 'ENOENT') { + if (path.dirname(pth) === pth) { + throw permissionError(pth); + } + + if (error.message.includes('null bytes')) { + throw error; + } + + make(path.dirname(pth)); + return make(pth); + } + + try { + if (!options.fs.statSync(pth).isDirectory()) { + throw new Error('The path is not a directory'); + } + } catch (_) { + throw error; + } + } + + return pth; + }; + + return make(path.resolve(input)); +}; diff --git a/node_modules/make-dir/license b/node_modules/make-dir/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/make-dir/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
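As the `index.js` above shows, `make-dir` takes the native recursive fast path (`fs.mkdir(pth, {recursive: true})`) when running on Node >= 10.12.0 with the built-in `fs`, and otherwise walks up the path creating each missing parent itself. A minimal sketch of both entry points; the directory paths are illustrative.

```js
const makeDir = require('make-dir');

(async () => {
  // Asynchronous: creates any missing parents and resolves with the absolute path
  const created = await makeDir('tmp/example/nested/dir');
  console.log(created);
})();

// Synchronous variant for startup or build scripts
const createdSync = makeDir.sync('tmp/example/other');
console.log(createdSync);

// Per the bundled index.d.ts, a custom fs implementation (e.g. graceful-fs)
// can be supplied via the `fs` option, at the cost of the native fast path.
```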
diff --git a/node_modules/make-dir/node_modules/.bin/semver b/node_modules/make-dir/node_modules/.bin/semver new file mode 100644 index 000000000..7e365277d --- /dev/null +++ b/node_modules/make-dir/node_modules/.bin/semver @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../semver/bin/semver.js" "$@" + ret=$? +else + node "$basedir/../semver/bin/semver.js" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/make-dir/node_modules/.bin/semver.cmd b/node_modules/make-dir/node_modules/.bin/semver.cmd new file mode 100644 index 000000000..164cdeac5 --- /dev/null +++ b/node_modules/make-dir/node_modules/.bin/semver.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\semver\bin\semver.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/make-dir/node_modules/.bin/semver.ps1 b/node_modules/make-dir/node_modules/.bin/semver.ps1 new file mode 100644 index 000000000..6a85e3405 --- /dev/null +++ b/node_modules/make-dir/node_modules/.bin/semver.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/make-dir/node_modules/semver/CHANGELOG.md b/node_modules/make-dir/node_modules/semver/CHANGELOG.md new file mode 100644 index 000000000..f567dd3fe --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/CHANGELOG.md @@ -0,0 +1,70 @@ +# changes log + +## 6.2.0 + +* Coerce numbers to strings when passed to semver.coerce() +* Add `rtl` option to coerce from right to left + +## 6.1.3 + +* Handle X-ranges properly in includePrerelease mode + +## 6.1.2 + +* Do not throw when testing invalid version strings + +## 6.1.1 + +* Add options support for semver.coerce() +* Handle undefined version passed to Range.test + +## 6.1.0 + +* Add semver.compareBuild function +* Support `*` in semver.intersects + +## 6.0 + +* Fix `intersects` logic. + + This is technically a bug fix, but since it is also a change to behavior + that may require users updating their code, it is marked as a major + version increment. + +## 5.7 + +* Add `minVersion` method + +## 5.6 + +* Move boolean `loose` param to an options object, with + backwards-compatibility protection. +* Add ability to opt out of special prerelease version handling with + the `includePrerelease` option flag. 
+ +## 5.5 + +* Add version coercion capabilities + +## 5.4 + +* Add intersection checking + +## 5.3 + +* Add `minSatisfying` method + +## 5.2 + +* Add `prerelease(v)` that returns prerelease components + +## 5.1 + +* Add Backus-Naur for ranges +* Remove excessively cute inspection methods + +## 5.0 + +* Remove AMD/Browserified build artifacts +* Fix ltr and gtr when using the `*` range +* Fix for range `*` with a prerelease identifier diff --git a/node_modules/make-dir/node_modules/semver/LICENSE b/node_modules/make-dir/node_modules/semver/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/make-dir/node_modules/semver/README.md b/node_modules/make-dir/node_modules/semver/README.md new file mode 100644 index 000000000..2293a14fd --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/README.md @@ -0,0 +1,443 @@ +semver(1) -- The semantic versioner for npm +=========================================== + +## Install + +```bash +npm install semver +```` + +## Usage + +As a node module: + +```js +const semver = require('semver') + +semver.valid('1.2.3') // '1.2.3' +semver.valid('a.b.c') // null +semver.clean(' =v1.2.3 ') // '1.2.3' +semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true +semver.gt('1.2.3', '9.8.7') // false +semver.lt('1.2.3', '9.8.7') // true +semver.minVersion('>=1.0.0') // '1.0.0' +semver.valid(semver.coerce('v2')) // '2.0.0' +semver.valid(semver.coerce('42.6.7.9.3-alpha')) // '42.6.7' +``` + +As a command-line utility: + +``` +$ semver -h + +A JavaScript implementation of the https://semver.org/ specification +Copyright Isaac Z. Schlueter + +Usage: semver [options] [ [...]] +Prints valid versions sorted by SemVer precedence + +Options: +-r --range + Print versions that match the specified range. + +-i --increment [] + Increment a version by the specified level. Level can + be one of: major, minor, patch, premajor, preminor, + prepatch, or prerelease. Default level is 'patch'. + Only one version may be specified. + +--preid + Identifier to be used to prefix premajor, preminor, + prepatch or prerelease version increments. + +-l --loose + Interpret versions and ranges loosely + +-p --include-prerelease + Always include prerelease versions in range matching + +-c --coerce + Coerce a string into SemVer if possible + (does not imply --loose) + +--rtl + Coerce version strings right to left + +--ltr + Coerce version strings left to right (default) + +Program exits successfully if any valid version satisfies +all supplied ranges, and prints all satisfying versions. + +If no satisfying versions are found, then exits failure. 
+ +Versions are printed in ascending order, so supplying +multiple versions to the utility will just sort them. +``` + +## Versions + +A "version" is described by the `v2.0.0` specification found at +. + +A leading `"="` or `"v"` character is stripped off and ignored. + +## Ranges + +A `version range` is a set of `comparators` which specify versions +that satisfy the range. + +A `comparator` is composed of an `operator` and a `version`. The set +of primitive `operators` is: + +* `<` Less than +* `<=` Less than or equal to +* `>` Greater than +* `>=` Greater than or equal to +* `=` Equal. If no operator is specified, then equality is assumed, + so this operator is optional, but MAY be included. + +For example, the comparator `>=1.2.7` would match the versions +`1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` +or `1.1.0`. + +Comparators can be joined by whitespace to form a `comparator set`, +which is satisfied by the **intersection** of all of the comparators +it includes. + +A range is composed of one or more comparator sets, joined by `||`. A +version matches a range if and only if every comparator in at least +one of the `||`-separated comparator sets is satisfied by the version. + +For example, the range `>=1.2.7 <1.3.0` would match the versions +`1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, +or `1.1.0`. + +The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, +`1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. + +### Prerelease Tags + +If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then +it will only be allowed to satisfy comparator sets if at least one +comparator with the same `[major, minor, patch]` tuple also has a +prerelease tag. + +For example, the range `>1.2.3-alpha.3` would be allowed to match the +version `1.2.3-alpha.7`, but it would *not* be satisfied by +`3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater +than" `1.2.3-alpha.3` according to the SemVer sort rules. The version +range only accepts prerelease tags on the `1.2.3` version. The +version `3.4.5` *would* satisfy the range, because it does not have a +prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. + +The purpose for this behavior is twofold. First, prerelease versions +frequently are updated very quickly, and contain many breaking changes +that are (by the author's design) not yet fit for public consumption. +Therefore, by default, they are excluded from range matching +semantics. + +Second, a user who has opted into using a prerelease version has +clearly indicated the intent to use *that specific* set of +alpha/beta/rc versions. By including a prerelease tag in the range, +the user is indicating that they are aware of the risk. However, it +is still not appropriate to assume that they have opted into taking a +similar risk on the *next* set of prerelease versions. + +Note that this behavior can be suppressed (treating all prerelease +versions as if they were normal versions, for the purpose of range +matching) by setting the `includePrerelease` flag on the options +object to any +[functions](https://github.com/npm/node-semver#functions) that do +range matching. 
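A brief sketch of the prerelease behaviour described above, reusing the same example versions; the last line shows the `includePrerelease` opt-out mentioned at the end of this section:

```js
const semver = require('semver')

// A prerelease only satisfies a range whose comparator carries a
// prerelease tag on the same [major, minor, patch] tuple.
semver.satisfies('1.2.3-alpha.7', '>1.2.3-alpha.3') // true
semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3') // false
semver.satisfies('3.4.5', '>1.2.3-alpha.3')         // true (no prerelease flag)

// Opting out of the special handling treats prereleases like any other
// version for the purpose of range matching.
semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3', { includePrerelease: true }) // true
```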
+ +#### Prerelease Identifiers + +The method `.inc` takes an additional `identifier` string argument that +will append the value of the string as a prerelease identifier: + +```javascript +semver.inc('1.2.3', 'prerelease', 'beta') +// '1.2.4-beta.0' +``` + +command-line example: + +```bash +$ semver 1.2.3 -i prerelease --preid beta +1.2.4-beta.0 +``` + +Which then can be used to increment further: + +```bash +$ semver 1.2.4-beta.0 -i prerelease +1.2.4-beta.1 +``` + +### Advanced Range Syntax + +Advanced range syntax desugars to primitive comparators in +deterministic ways. + +Advanced ranges may be combined in the same way as primitive +comparators using white space or `||`. + +#### Hyphen Ranges `X.Y.Z - A.B.C` + +Specifies an inclusive set. + +* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` + +If a partial version is provided as the first version in the inclusive +range, then the missing pieces are replaced with zeroes. + +* `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` + +If a partial version is provided as the second version in the +inclusive range, then all versions that start with the supplied parts +of the tuple are accepted, but nothing that would be greater than the +provided tuple parts. + +* `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` +* `1.2.3 - 2` := `>=1.2.3 <3.0.0` + +#### X-Ranges `1.2.x` `1.X` `1.2.*` `*` + +Any of `X`, `x`, or `*` may be used to "stand in" for one of the +numeric values in the `[major, minor, patch]` tuple. + +* `*` := `>=0.0.0` (Any version satisfies) +* `1.x` := `>=1.0.0 <2.0.0` (Matching major version) +* `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) + +A partial version range is treated as an X-Range, so the special +character is in fact optional. + +* `""` (empty string) := `*` := `>=0.0.0` +* `1` := `1.x.x` := `>=1.0.0 <2.0.0` +* `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` + +#### Tilde Ranges `~1.2.3` `~1.2` `~1` + +Allows patch-level changes if a minor version is specified on the +comparator. Allows minor-level changes if not. + +* `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` +* `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) +* `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) +* `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` +* `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) +* `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) +* `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. + +#### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` + +Allows changes that do not modify the left-most non-zero element in the +`[major, minor, patch]` tuple. In other words, this allows patch and +minor updates for versions `1.0.0` and above, patch updates for +versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. + +Many authors treat a `0.x` version as if the `x` were the major +"breaking-change" indicator. + +Caret ranges are ideal when an author may make breaking changes +between `0.2.4` and `0.3.0` releases, which is a common practice. +However, it presumes that there will *not* be breaking changes between +`0.2.4` and `0.2.5`. It allows for changes that are presumed to be +additive (but non-breaking), according to commonly observed practices. 
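As a quick check of the tilde and caret rules (the full caret desugarings are listed immediately below), a small sketch with illustrative version numbers:

```js
const semver = require('semver')

// Tilde: patch-level changes only, when a minor version is specified.
semver.satisfies('1.2.9', '~1.2.3') // true  (within >=1.2.3 <1.3.0)
semver.satisfies('1.3.0', '~1.2.3') // false

// Caret: never changes the left-most non-zero element.
semver.satisfies('1.9.0', '^1.2.3') // true  (within >=1.2.3 <2.0.0)
semver.satisfies('0.2.9', '^0.2.4') // true  (within >=0.2.4 <0.3.0)
semver.satisfies('0.3.0', '^0.2.4') // false
```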
+ +* `^1.2.3` := `>=1.2.3 <2.0.0` +* `^0.2.3` := `>=0.2.3 <0.3.0` +* `^0.0.3` := `>=0.0.3 <0.0.4` +* `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. +* `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the + `0.0.3` version *only* will be allowed, if they are greater than or + equal to `beta`. So, `0.0.3-pr.2` would be allowed. + +When parsing caret ranges, a missing `patch` value desugars to the +number `0`, but will allow flexibility within that value, even if the +major and minor versions are both `0`. + +* `^1.2.x` := `>=1.2.0 <2.0.0` +* `^0.0.x` := `>=0.0.0 <0.1.0` +* `^0.0` := `>=0.0.0 <0.1.0` + +A missing `minor` and `patch` values will desugar to zero, but also +allow flexibility within those values, even if the major version is +zero. + +* `^1.x` := `>=1.0.0 <2.0.0` +* `^0.x` := `>=0.0.0 <1.0.0` + +### Range Grammar + +Putting all this together, here is a Backus-Naur grammar for ranges, +for the benefit of parser authors: + +```bnf +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | ['1'-'9'] ( ['0'-'9'] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' part ) * +part ::= nr | [-0-9A-Za-z]+ +``` + +## Functions + +All methods and classes take a final `options` object argument. All +options in this object are `false` by default. The options supported +are: + +- `loose` Be more forgiving about not-quite-valid semver strings. + (Any resulting output will always be 100% strict compliant, of + course.) For backwards compatibility reasons, if the `options` + argument is a boolean value instead of an object, it is interpreted + to be the `loose` param. +- `includePrerelease` Set to suppress the [default + behavior](https://github.com/npm/node-semver#prerelease-tags) of + excluding prerelease tagged versions from ranges unless they are + explicitly opted into. + +Strict-mode Comparators and Ranges will be strict about the SemVer +strings that they parse. + +* `valid(v)`: Return the parsed version, or null if it's not valid. +* `inc(v, release)`: Return the version incremented by the release + type (`major`, `premajor`, `minor`, `preminor`, `patch`, + `prepatch`, or `prerelease`), or null if it's not valid + * `premajor` in one call will bump the version up to the next major + version and down to a prerelease of that major version. + `preminor`, and `prepatch` work the same way. + * If called from a non-prerelease version, the `prerelease` will work the + same as `prepatch`. It increments the patch version, then makes a + prerelease. If the input version is already a prerelease it simply + increments it. +* `prerelease(v)`: Returns an array of prerelease components, or null + if none exist. Example: `prerelease('1.2.3-alpha.1') -> ['alpha', 1]` +* `major(v)`: Return the major version number. +* `minor(v)`: Return the minor version number. +* `patch(v)`: Return the patch version number. 
+* `intersects(r1, r2, loose)`: Return true if the two supplied ranges + or comparators intersect. +* `parse(v)`: Attempt to parse a string as a semantic version, returning either + a `SemVer` object or `null`. + +### Comparison + +* `gt(v1, v2)`: `v1 > v2` +* `gte(v1, v2)`: `v1 >= v2` +* `lt(v1, v2)`: `v1 < v2` +* `lte(v1, v2)`: `v1 <= v2` +* `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, + even if they're not the exact same string. You already know how to + compare strings. +* `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. +* `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call + the corresponding function above. `"==="` and `"!=="` do simple + string comparison, but are included for completeness. Throws if an + invalid comparison string is provided. +* `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `rcompare(v1, v2)`: The reverse of compare. Sorts an array of versions + in descending order when passed to `Array.sort()`. +* `compareBuild(v1, v2)`: The same as `compare` but considers `build` when two versions + are equal. Sorts in ascending order if passed to `Array.sort()`. + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `diff(v1, v2)`: Returns difference between two versions by the release type + (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), + or null if the versions are the same. + +### Comparators + +* `intersects(comparator)`: Return true if the comparators intersect + +### Ranges + +* `validRange(range)`: Return the valid range or null if it's not valid +* `satisfies(version, range)`: Return true if the version satisfies the + range. +* `maxSatisfying(versions, range)`: Return the highest version in the list + that satisfies the range, or `null` if none of them do. +* `minSatisfying(versions, range)`: Return the lowest version in the list + that satisfies the range, or `null` if none of them do. +* `minVersion(range)`: Return the lowest version that can possibly match + the given range. +* `gtr(version, range)`: Return `true` if version is greater than all the + versions possible in the range. +* `ltr(version, range)`: Return `true` if version is less than all the + versions possible in the range. +* `outside(version, range, hilo)`: Return true if the version is outside + the bounds of the range in either the high or low direction. The + `hilo` argument must be either the string `'>'` or `'<'`. (This is + the function called by `gtr` and `ltr`.) +* `intersects(range)`: Return true if any of the ranges comparators intersect + +Note that, since ranges may be non-contiguous, a version might not be +greater than a range, less than a range, *or* satisfy a range! For +example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` +until `2.0.0`, so the version `1.2.10` would not be greater than the +range (because `2.0.1` satisfies, which is higher), nor less than the +range (since `1.2.8` satisfies, which is lower), and it also does not +satisfy the range. + +If you want to know if a version satisfies or does not satisfy a +range, use the `satisfies(version, range)` function. + +### Coercion + +* `coerce(version, options)`: Coerces a string to semver if possible + +This aims to provide a very forgiving translation of a non-semver string to +semver. 
It looks for the first digit in a string, and consumes all +remaining characters which satisfy at least a partial semver (e.g., `1`, +`1.2`, `1.2.3`) up to the max permitted length (256 characters). Longer +versions are simply truncated (`4.6.3.9.2-alpha2` becomes `4.6.3`). All +surrounding text is simply ignored (`v3.4 replaces v3.3.1` becomes +`3.4.0`). Only text which lacks digits will fail coercion (`version one` +is not valid). The maximum length for any semver component considered for +coercion is 16 characters; longer components will be ignored +(`10000000000000000.4.7.4` becomes `4.7.4`). The maximum value for any +semver component is `Integer.MAX_SAFE_INTEGER || (2**53 - 1)`; higher value +components are invalid (`9999999999999999.4.7.4` is likely invalid). + +If the `options.rtl` flag is set, then `coerce` will return the right-most +coercible tuple that does not share an ending index with a longer coercible +tuple. For example, `1.2.3.4` will return `2.3.4` in rtl mode, not +`4.0.0`. `1.2.3/4` will return `4.0.0`, because the `4` is not a part of +any other overlapping SemVer tuple. + +### Clean + +* `clean(version)`: Clean a string to be a valid semver if possible + +This will return a cleaned and trimmed semver version. If the provided version is not valid a null will be returned. This does not work for ranges. + +ex. +* `s.clean(' = v 2.1.5foo')`: `null` +* `s.clean(' = v 2.1.5foo', { loose: true })`: `'2.1.5-foo'` +* `s.clean(' = v 2.1.5-foo')`: `null` +* `s.clean(' = v 2.1.5-foo', { loose: true })`: `'2.1.5-foo'` +* `s.clean('=v2.1.5')`: `'2.1.5'` +* `s.clean(' =v2.1.5')`: `2.1.5` +* `s.clean(' 2.1.5 ')`: `'2.1.5'` +* `s.clean('~1.0.0')`: `null` diff --git a/node_modules/make-dir/node_modules/semver/bin/semver.js b/node_modules/make-dir/node_modules/semver/bin/semver.js new file mode 100644 index 000000000..666034a75 --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/bin/semver.js @@ -0,0 +1,174 @@ +#!/usr/bin/env node +// Standalone semver comparison program. +// Exits successfully and prints matching version(s) if +// any supplied version is valid and passes all tests. 
+ +var argv = process.argv.slice(2) + +var versions = [] + +var range = [] + +var inc = null + +var version = require('../package.json').version + +var loose = false + +var includePrerelease = false + +var coerce = false + +var rtl = false + +var identifier + +var semver = require('../semver') + +var reverse = false + +var options = {} + +main() + +function main () { + if (!argv.length) return help() + while (argv.length) { + var a = argv.shift() + var indexOfEqualSign = a.indexOf('=') + if (indexOfEqualSign !== -1) { + a = a.slice(0, indexOfEqualSign) + argv.unshift(a.slice(indexOfEqualSign + 1)) + } + switch (a) { + case '-rv': case '-rev': case '--rev': case '--reverse': + reverse = true + break + case '-l': case '--loose': + loose = true + break + case '-p': case '--include-prerelease': + includePrerelease = true + break + case '-v': case '--version': + versions.push(argv.shift()) + break + case '-i': case '--inc': case '--increment': + switch (argv[0]) { + case 'major': case 'minor': case 'patch': case 'prerelease': + case 'premajor': case 'preminor': case 'prepatch': + inc = argv.shift() + break + default: + inc = 'patch' + break + } + break + case '--preid': + identifier = argv.shift() + break + case '-r': case '--range': + range.push(argv.shift()) + break + case '-c': case '--coerce': + coerce = true + break + case '--rtl': + rtl = true + break + case '--ltr': + rtl = false + break + case '-h': case '--help': case '-?': + return help() + default: + versions.push(a) + break + } + } + + var options = { loose: loose, includePrerelease: includePrerelease, rtl: rtl } + + versions = versions.map(function (v) { + return coerce ? (semver.coerce(v, options) || { version: v }).version : v + }).filter(function (v) { + return semver.valid(v) + }) + if (!versions.length) return fail() + if (inc && (versions.length !== 1 || range.length)) { return failInc() } + + for (var i = 0, l = range.length; i < l; i++) { + versions = versions.filter(function (v) { + return semver.satisfies(v, range[i], options) + }) + if (!versions.length) return fail() + } + return success(versions) +} + +function failInc () { + console.error('--inc can only be used on a single version with no range') + fail() +} + +function fail () { process.exit(1) } + +function success () { + var compare = reverse ? 'rcompare' : 'compare' + versions.sort(function (a, b) { + return semver[compare](a, b, options) + }).map(function (v) { + return semver.clean(v, options) + }).map(function (v) { + return inc ? semver.inc(v, inc, options, identifier) : v + }).forEach(function (v, i, _) { console.log(v) }) +} + +function help () { + console.log(['SemVer ' + version, + '', + 'A JavaScript implementation of the https://semver.org/ specification', + 'Copyright Isaac Z. Schlueter', + '', + 'Usage: semver [options] [ [...]]', + 'Prints valid versions sorted by SemVer precedence', + '', + 'Options:', + '-r --range ', + ' Print versions that match the specified range.', + '', + '-i --increment []', + ' Increment a version by the specified level. Level can', + ' be one of: major, minor, patch, premajor, preminor,', + " prepatch, or prerelease. 
Default level is 'patch'.", + ' Only one version may be specified.', + '', + '--preid ', + ' Identifier to be used to prefix premajor, preminor,', + ' prepatch or prerelease version increments.', + '', + '-l --loose', + ' Interpret versions and ranges loosely', + '', + '-p --include-prerelease', + ' Always include prerelease versions in range matching', + '', + '-c --coerce', + ' Coerce a string into SemVer if possible', + ' (does not imply --loose)', + '', + '--rtl', + ' Coerce version strings right to left', + '', + '--ltr', + ' Coerce version strings left to right (default)', + '', + 'Program exits successfully if any valid version satisfies', + 'all supplied ranges, and prints all satisfying versions.', + '', + 'If no satisfying versions are found, then exits failure.', + '', + 'Versions are printed in ascending order, so supplying', + 'multiple versions to the utility will just sort them.' + ].join('\n')) +} diff --git a/node_modules/make-dir/node_modules/semver/package.json b/node_modules/make-dir/node_modules/semver/package.json new file mode 100644 index 000000000..4d05cd0da --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/package.json @@ -0,0 +1,60 @@ +{ + "_from": "semver@^6.0.0", + "_id": "semver@6.3.0", + "_inBundle": false, + "_integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==", + "_location": "/make-dir/semver", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "semver@^6.0.0", + "name": "semver", + "escapedName": "semver", + "rawSpec": "^6.0.0", + "saveSpec": null, + "fetchSpec": "^6.0.0" + }, + "_requiredBy": [ + "/make-dir" + ], + "_resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz", + "_shasum": "ee0a64c8af5e8ceea67687b133761e1becbd1d3d", + "_spec": "semver@^6.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\make-dir", + "bin": { + "semver": "bin/semver.js" + }, + "bugs": { + "url": "https://github.com/npm/node-semver/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "The semantic version parser used by npm.", + "devDependencies": { + "tap": "^14.3.1" + }, + "files": [ + "bin", + "range.bnf", + "semver.js" + ], + "homepage": "https://github.com/npm/node-semver#readme", + "license": "ISC", + "main": "semver.js", + "name": "semver", + "repository": { + "type": "git", + "url": "git+https://github.com/npm/node-semver.git" + }, + "scripts": { + "postpublish": "git push origin --follow-tags", + "postversion": "npm publish", + "preversion": "npm test", + "test": "tap" + }, + "tap": { + "check-coverage": true + }, + "version": "6.3.0" +} diff --git a/node_modules/make-dir/node_modules/semver/range.bnf b/node_modules/make-dir/node_modules/semver/range.bnf new file mode 100644 index 000000000..d4c6ae0d7 --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/range.bnf @@ -0,0 +1,16 @@ +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | [1-9] ( [0-9] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' 
part ) * +part ::= nr | [-0-9A-Za-z]+ diff --git a/node_modules/make-dir/node_modules/semver/semver.js b/node_modules/make-dir/node_modules/semver/semver.js new file mode 100644 index 000000000..636fa4365 --- /dev/null +++ b/node_modules/make-dir/node_modules/semver/semver.js @@ -0,0 +1,1596 @@ +exports = module.exports = SemVer + +var debug +/* istanbul ignore next */ +if (typeof process === 'object' && + process.env && + process.env.NODE_DEBUG && + /\bsemver\b/i.test(process.env.NODE_DEBUG)) { + debug = function () { + var args = Array.prototype.slice.call(arguments, 0) + args.unshift('SEMVER') + console.log.apply(console, args) + } +} else { + debug = function () {} +} + +// Note: this is the semver.org version of the spec that it implements +// Not necessarily the package version of this code. +exports.SEMVER_SPEC_VERSION = '2.0.0' + +var MAX_LENGTH = 256 +var MAX_SAFE_INTEGER = Number.MAX_SAFE_INTEGER || + /* istanbul ignore next */ 9007199254740991 + +// Max safe segment length for coercion. +var MAX_SAFE_COMPONENT_LENGTH = 16 + +// The actual regexps go on exports.re +var re = exports.re = [] +var src = exports.src = [] +var t = exports.tokens = {} +var R = 0 + +function tok (n) { + t[n] = R++ +} + +// The following Regular Expressions can be used for tokenizing, +// validating, and parsing SemVer version strings. + +// ## Numeric Identifier +// A single `0`, or a non-zero digit followed by zero or more digits. + +tok('NUMERICIDENTIFIER') +src[t.NUMERICIDENTIFIER] = '0|[1-9]\\d*' +tok('NUMERICIDENTIFIERLOOSE') +src[t.NUMERICIDENTIFIERLOOSE] = '[0-9]+' + +// ## Non-numeric Identifier +// Zero or more digits, followed by a letter or hyphen, and then zero or +// more letters, digits, or hyphens. + +tok('NONNUMERICIDENTIFIER') +src[t.NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*' + +// ## Main Version +// Three dot-separated numeric identifiers. + +tok('MAINVERSION') +src[t.MAINVERSION] = '(' + src[t.NUMERICIDENTIFIER] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIER] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIER] + ')' + +tok('MAINVERSIONLOOSE') +src[t.MAINVERSIONLOOSE] = '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')' + +// ## Pre-release Version Identifier +// A numeric identifier, or a non-numeric identifier. + +tok('PRERELEASEIDENTIFIER') +src[t.PRERELEASEIDENTIFIER] = '(?:' + src[t.NUMERICIDENTIFIER] + + '|' + src[t.NONNUMERICIDENTIFIER] + ')' + +tok('PRERELEASEIDENTIFIERLOOSE') +src[t.PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[t.NUMERICIDENTIFIERLOOSE] + + '|' + src[t.NONNUMERICIDENTIFIER] + ')' + +// ## Pre-release Version +// Hyphen, followed by one or more dot-separated pre-release version +// identifiers. + +tok('PRERELEASE') +src[t.PRERELEASE] = '(?:-(' + src[t.PRERELEASEIDENTIFIER] + + '(?:\\.' + src[t.PRERELEASEIDENTIFIER] + ')*))' + +tok('PRERELEASELOOSE') +src[t.PRERELEASELOOSE] = '(?:-?(' + src[t.PRERELEASEIDENTIFIERLOOSE] + + '(?:\\.' + src[t.PRERELEASEIDENTIFIERLOOSE] + ')*))' + +// ## Build Metadata Identifier +// Any combination of digits, letters, or hyphens. + +tok('BUILDIDENTIFIER') +src[t.BUILDIDENTIFIER] = '[0-9A-Za-z-]+' + +// ## Build Metadata +// Plus sign, followed by one or more period-separated build metadata +// identifiers. + +tok('BUILD') +src[t.BUILD] = '(?:\\+(' + src[t.BUILDIDENTIFIER] + + '(?:\\.' + src[t.BUILDIDENTIFIER] + ')*))' + +// ## Full Version String +// A main version, followed optionally by a pre-release version and +// build metadata. 
+ +// Note that the only major, minor, patch, and pre-release sections of +// the version string are capturing groups. The build metadata is not a +// capturing group, because it should not ever be used in version +// comparison. + +tok('FULL') +tok('FULLPLAIN') +src[t.FULLPLAIN] = 'v?' + src[t.MAINVERSION] + + src[t.PRERELEASE] + '?' + + src[t.BUILD] + '?' + +src[t.FULL] = '^' + src[t.FULLPLAIN] + '$' + +// like full, but allows v1.2.3 and =1.2.3, which people do sometimes. +// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty +// common in the npm registry. +tok('LOOSEPLAIN') +src[t.LOOSEPLAIN] = '[v=\\s]*' + src[t.MAINVERSIONLOOSE] + + src[t.PRERELEASELOOSE] + '?' + + src[t.BUILD] + '?' + +tok('LOOSE') +src[t.LOOSE] = '^' + src[t.LOOSEPLAIN] + '$' + +tok('GTLT') +src[t.GTLT] = '((?:<|>)?=?)' + +// Something like "2.*" or "1.2.x". +// Note that "x.x" is a valid xRange identifer, meaning "any version" +// Only the first item is strictly required. +tok('XRANGEIDENTIFIERLOOSE') +src[t.XRANGEIDENTIFIERLOOSE] = src[t.NUMERICIDENTIFIERLOOSE] + '|x|X|\\*' +tok('XRANGEIDENTIFIER') +src[t.XRANGEIDENTIFIER] = src[t.NUMERICIDENTIFIER] + '|x|X|\\*' + +tok('XRANGEPLAIN') +src[t.XRANGEPLAIN] = '[v=\\s]*(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:' + src[t.PRERELEASE] + ')?' + + src[t.BUILD] + '?' + + ')?)?' + +tok('XRANGEPLAINLOOSE') +src[t.XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:' + src[t.PRERELEASELOOSE] + ')?' + + src[t.BUILD] + '?' + + ')?)?' + +tok('XRANGE') +src[t.XRANGE] = '^' + src[t.GTLT] + '\\s*' + src[t.XRANGEPLAIN] + '$' +tok('XRANGELOOSE') +src[t.XRANGELOOSE] = '^' + src[t.GTLT] + '\\s*' + src[t.XRANGEPLAINLOOSE] + '$' + +// Coercion. +// Extract anything that could conceivably be a part of a valid semver +tok('COERCE') +src[t.COERCE] = '(^|[^\\d])' + + '(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '})' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:$|[^\\d])' +tok('COERCERTL') +re[t.COERCERTL] = new RegExp(src[t.COERCE], 'g') + +// Tilde ranges. +// Meaning is "reasonably at or greater than" +tok('LONETILDE') +src[t.LONETILDE] = '(?:~>?)' + +tok('TILDETRIM') +src[t.TILDETRIM] = '(\\s*)' + src[t.LONETILDE] + '\\s+' +re[t.TILDETRIM] = new RegExp(src[t.TILDETRIM], 'g') +var tildeTrimReplace = '$1~' + +tok('TILDE') +src[t.TILDE] = '^' + src[t.LONETILDE] + src[t.XRANGEPLAIN] + '$' +tok('TILDELOOSE') +src[t.TILDELOOSE] = '^' + src[t.LONETILDE] + src[t.XRANGEPLAINLOOSE] + '$' + +// Caret ranges. 
+// Meaning is "at least and backwards compatible with" +tok('LONECARET') +src[t.LONECARET] = '(?:\\^)' + +tok('CARETTRIM') +src[t.CARETTRIM] = '(\\s*)' + src[t.LONECARET] + '\\s+' +re[t.CARETTRIM] = new RegExp(src[t.CARETTRIM], 'g') +var caretTrimReplace = '$1^' + +tok('CARET') +src[t.CARET] = '^' + src[t.LONECARET] + src[t.XRANGEPLAIN] + '$' +tok('CARETLOOSE') +src[t.CARETLOOSE] = '^' + src[t.LONECARET] + src[t.XRANGEPLAINLOOSE] + '$' + +// A simple gt/lt/eq thing, or just "" to indicate "any version" +tok('COMPARATORLOOSE') +src[t.COMPARATORLOOSE] = '^' + src[t.GTLT] + '\\s*(' + src[t.LOOSEPLAIN] + ')$|^$' +tok('COMPARATOR') +src[t.COMPARATOR] = '^' + src[t.GTLT] + '\\s*(' + src[t.FULLPLAIN] + ')$|^$' + +// An expression to strip any whitespace between the gtlt and the thing +// it modifies, so that `> 1.2.3` ==> `>1.2.3` +tok('COMPARATORTRIM') +src[t.COMPARATORTRIM] = '(\\s*)' + src[t.GTLT] + + '\\s*(' + src[t.LOOSEPLAIN] + '|' + src[t.XRANGEPLAIN] + ')' + +// this one has to use the /g flag +re[t.COMPARATORTRIM] = new RegExp(src[t.COMPARATORTRIM], 'g') +var comparatorTrimReplace = '$1$2$3' + +// Something like `1.2.3 - 1.2.4` +// Note that these all use the loose form, because they'll be +// checked against either the strict or loose comparator form +// later. +tok('HYPHENRANGE') +src[t.HYPHENRANGE] = '^\\s*(' + src[t.XRANGEPLAIN] + ')' + + '\\s+-\\s+' + + '(' + src[t.XRANGEPLAIN] + ')' + + '\\s*$' + +tok('HYPHENRANGELOOSE') +src[t.HYPHENRANGELOOSE] = '^\\s*(' + src[t.XRANGEPLAINLOOSE] + ')' + + '\\s+-\\s+' + + '(' + src[t.XRANGEPLAINLOOSE] + ')' + + '\\s*$' + +// Star ranges basically just allow anything at all. +tok('STAR') +src[t.STAR] = '(<|>)?=?\\s*\\*' + +// Compile to actual regexp objects. +// All are flag-free, unless they were created above with a flag. +for (var i = 0; i < R; i++) { + debug(i, src[i]) + if (!re[i]) { + re[i] = new RegExp(src[i]) + } +} + +exports.parse = parse +function parse (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (version instanceof SemVer) { + return version + } + + if (typeof version !== 'string') { + return null + } + + if (version.length > MAX_LENGTH) { + return null + } + + var r = options.loose ? re[t.LOOSE] : re[t.FULL] + if (!r.test(version)) { + return null + } + + try { + return new SemVer(version, options) + } catch (er) { + return null + } +} + +exports.valid = valid +function valid (version, options) { + var v = parse(version, options) + return v ? v.version : null +} + +exports.clean = clean +function clean (version, options) { + var s = parse(version.trim().replace(/^[=v]+/, ''), options) + return s ? s.version : null +} + +exports.SemVer = SemVer + +function SemVer (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + if (version instanceof SemVer) { + if (version.loose === options.loose) { + return version + } else { + version = version.version + } + } else if (typeof version !== 'string') { + throw new TypeError('Invalid Version: ' + version) + } + + if (version.length > MAX_LENGTH) { + throw new TypeError('version is longer than ' + MAX_LENGTH + ' characters') + } + + if (!(this instanceof SemVer)) { + return new SemVer(version, options) + } + + debug('SemVer', version, options) + this.options = options + this.loose = !!options.loose + + var m = version.trim().match(options.loose ? 
re[t.LOOSE] : re[t.FULL]) + + if (!m) { + throw new TypeError('Invalid Version: ' + version) + } + + this.raw = version + + // these are actually numbers + this.major = +m[1] + this.minor = +m[2] + this.patch = +m[3] + + if (this.major > MAX_SAFE_INTEGER || this.major < 0) { + throw new TypeError('Invalid major version') + } + + if (this.minor > MAX_SAFE_INTEGER || this.minor < 0) { + throw new TypeError('Invalid minor version') + } + + if (this.patch > MAX_SAFE_INTEGER || this.patch < 0) { + throw new TypeError('Invalid patch version') + } + + // numberify any prerelease numeric ids + if (!m[4]) { + this.prerelease = [] + } else { + this.prerelease = m[4].split('.').map(function (id) { + if (/^[0-9]+$/.test(id)) { + var num = +id + if (num >= 0 && num < MAX_SAFE_INTEGER) { + return num + } + } + return id + }) + } + + this.build = m[5] ? m[5].split('.') : [] + this.format() +} + +SemVer.prototype.format = function () { + this.version = this.major + '.' + this.minor + '.' + this.patch + if (this.prerelease.length) { + this.version += '-' + this.prerelease.join('.') + } + return this.version +} + +SemVer.prototype.toString = function () { + return this.version +} + +SemVer.prototype.compare = function (other) { + debug('SemVer.compare', this.version, this.options, other) + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return this.compareMain(other) || this.comparePre(other) +} + +SemVer.prototype.compareMain = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return compareIdentifiers(this.major, other.major) || + compareIdentifiers(this.minor, other.minor) || + compareIdentifiers(this.patch, other.patch) +} + +SemVer.prototype.comparePre = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + // NOT having a prerelease is > having one + if (this.prerelease.length && !other.prerelease.length) { + return -1 + } else if (!this.prerelease.length && other.prerelease.length) { + return 1 + } else if (!this.prerelease.length && !other.prerelease.length) { + return 0 + } + + var i = 0 + do { + var a = this.prerelease[i] + var b = other.prerelease[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +SemVer.prototype.compareBuild = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + var i = 0 + do { + var a = this.build[i] + var b = other.build[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +// preminor will bump the version up to the next minor release, and immediately +// down to pre-release. premajor and prepatch work the same way. 
+SemVer.prototype.inc = function (release, identifier) { + switch (release) { + case 'premajor': + this.prerelease.length = 0 + this.patch = 0 + this.minor = 0 + this.major++ + this.inc('pre', identifier) + break + case 'preminor': + this.prerelease.length = 0 + this.patch = 0 + this.minor++ + this.inc('pre', identifier) + break + case 'prepatch': + // If this is already a prerelease, it will bump to the next version + // drop any prereleases that might already exist, since they are not + // relevant at this point. + this.prerelease.length = 0 + this.inc('patch', identifier) + this.inc('pre', identifier) + break + // If the input is a non-prerelease version, this acts the same as + // prepatch. + case 'prerelease': + if (this.prerelease.length === 0) { + this.inc('patch', identifier) + } + this.inc('pre', identifier) + break + + case 'major': + // If this is a pre-major version, bump up to the same major version. + // Otherwise increment major. + // 1.0.0-5 bumps to 1.0.0 + // 1.1.0 bumps to 2.0.0 + if (this.minor !== 0 || + this.patch !== 0 || + this.prerelease.length === 0) { + this.major++ + } + this.minor = 0 + this.patch = 0 + this.prerelease = [] + break + case 'minor': + // If this is a pre-minor version, bump up to the same minor version. + // Otherwise increment minor. + // 1.2.0-5 bumps to 1.2.0 + // 1.2.1 bumps to 1.3.0 + if (this.patch !== 0 || this.prerelease.length === 0) { + this.minor++ + } + this.patch = 0 + this.prerelease = [] + break + case 'patch': + // If this is not a pre-release version, it will increment the patch. + // If it is a pre-release it will bump up to the same patch version. + // 1.2.0-5 patches to 1.2.0 + // 1.2.0 patches to 1.2.1 + if (this.prerelease.length === 0) { + this.patch++ + } + this.prerelease = [] + break + // This probably shouldn't be used publicly. + // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. 
+ case 'pre': + if (this.prerelease.length === 0) { + this.prerelease = [0] + } else { + var i = this.prerelease.length + while (--i >= 0) { + if (typeof this.prerelease[i] === 'number') { + this.prerelease[i]++ + i = -2 + } + } + if (i === -1) { + // didn't increment anything + this.prerelease.push(0) + } + } + if (identifier) { + // 1.2.0-beta.1 bumps to 1.2.0-beta.2, + // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 + if (this.prerelease[0] === identifier) { + if (isNaN(this.prerelease[1])) { + this.prerelease = [identifier, 0] + } + } else { + this.prerelease = [identifier, 0] + } + } + break + + default: + throw new Error('invalid increment argument: ' + release) + } + this.format() + this.raw = this.version + return this +} + +exports.inc = inc +function inc (version, release, loose, identifier) { + if (typeof (loose) === 'string') { + identifier = loose + loose = undefined + } + + try { + return new SemVer(version, loose).inc(release, identifier).version + } catch (er) { + return null + } +} + +exports.diff = diff +function diff (version1, version2) { + if (eq(version1, version2)) { + return null + } else { + var v1 = parse(version1) + var v2 = parse(version2) + var prefix = '' + if (v1.prerelease.length || v2.prerelease.length) { + prefix = 'pre' + var defaultResult = 'prerelease' + } + for (var key in v1) { + if (key === 'major' || key === 'minor' || key === 'patch') { + if (v1[key] !== v2[key]) { + return prefix + key + } + } + } + return defaultResult // may be undefined + } +} + +exports.compareIdentifiers = compareIdentifiers + +var numeric = /^[0-9]+$/ +function compareIdentifiers (a, b) { + var anum = numeric.test(a) + var bnum = numeric.test(b) + + if (anum && bnum) { + a = +a + b = +b + } + + return a === b ? 0 + : (anum && !bnum) ? -1 + : (bnum && !anum) ? 1 + : a < b ? 
-1 + : 1 +} + +exports.rcompareIdentifiers = rcompareIdentifiers +function rcompareIdentifiers (a, b) { + return compareIdentifiers(b, a) +} + +exports.major = major +function major (a, loose) { + return new SemVer(a, loose).major +} + +exports.minor = minor +function minor (a, loose) { + return new SemVer(a, loose).minor +} + +exports.patch = patch +function patch (a, loose) { + return new SemVer(a, loose).patch +} + +exports.compare = compare +function compare (a, b, loose) { + return new SemVer(a, loose).compare(new SemVer(b, loose)) +} + +exports.compareLoose = compareLoose +function compareLoose (a, b) { + return compare(a, b, true) +} + +exports.compareBuild = compareBuild +function compareBuild (a, b, loose) { + var versionA = new SemVer(a, loose) + var versionB = new SemVer(b, loose) + return versionA.compare(versionB) || versionA.compareBuild(versionB) +} + +exports.rcompare = rcompare +function rcompare (a, b, loose) { + return compare(b, a, loose) +} + +exports.sort = sort +function sort (list, loose) { + return list.sort(function (a, b) { + return exports.compareBuild(a, b, loose) + }) +} + +exports.rsort = rsort +function rsort (list, loose) { + return list.sort(function (a, b) { + return exports.compareBuild(b, a, loose) + }) +} + +exports.gt = gt +function gt (a, b, loose) { + return compare(a, b, loose) > 0 +} + +exports.lt = lt +function lt (a, b, loose) { + return compare(a, b, loose) < 0 +} + +exports.eq = eq +function eq (a, b, loose) { + return compare(a, b, loose) === 0 +} + +exports.neq = neq +function neq (a, b, loose) { + return compare(a, b, loose) !== 0 +} + +exports.gte = gte +function gte (a, b, loose) { + return compare(a, b, loose) >= 0 +} + +exports.lte = lte +function lte (a, b, loose) { + return compare(a, b, loose) <= 0 +} + +exports.cmp = cmp +function cmp (a, op, b, loose) { + switch (op) { + case '===': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a === b + + case '!==': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a !== b + + case '': + case '=': + case '==': + return eq(a, b, loose) + + case '!=': + return neq(a, b, loose) + + case '>': + return gt(a, b, loose) + + case '>=': + return gte(a, b, loose) + + case '<': + return lt(a, b, loose) + + case '<=': + return lte(a, b, loose) + + default: + throw new TypeError('Invalid operator: ' + op) + } +} + +exports.Comparator = Comparator +function Comparator (comp, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (comp instanceof Comparator) { + if (comp.loose === !!options.loose) { + return comp + } else { + comp = comp.value + } + } + + if (!(this instanceof Comparator)) { + return new Comparator(comp, options) + } + + debug('comparator', comp, options) + this.options = options + this.loose = !!options.loose + this.parse(comp) + + if (this.semver === ANY) { + this.value = '' + } else { + this.value = this.operator + this.semver.version + } + + debug('comp', this) +} + +var ANY = {} +Comparator.prototype.parse = function (comp) { + var r = this.options.loose ? re[t.COMPARATORLOOSE] : re[t.COMPARATOR] + var m = comp.match(r) + + if (!m) { + throw new TypeError('Invalid comparator: ' + comp) + } + + this.operator = m[1] !== undefined ? m[1] : '' + if (this.operator === '=') { + this.operator = '' + } + + // if it literally is just '>' or '' then allow anything. 
+ if (!m[2]) { + this.semver = ANY + } else { + this.semver = new SemVer(m[2], this.options.loose) + } +} + +Comparator.prototype.toString = function () { + return this.value +} + +Comparator.prototype.test = function (version) { + debug('Comparator.test', version, this.options.loose) + + if (this.semver === ANY || version === ANY) { + return true + } + + if (typeof version === 'string') { + try { + version = new SemVer(version, this.options) + } catch (er) { + return false + } + } + + return cmp(version, this.operator, this.semver, this.options) +} + +Comparator.prototype.intersects = function (comp, options) { + if (!(comp instanceof Comparator)) { + throw new TypeError('a Comparator is required') + } + + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + var rangeTmp + + if (this.operator === '') { + if (this.value === '') { + return true + } + rangeTmp = new Range(comp.value, options) + return satisfies(this.value, rangeTmp, options) + } else if (comp.operator === '') { + if (comp.value === '') { + return true + } + rangeTmp = new Range(this.value, options) + return satisfies(comp.semver, rangeTmp, options) + } + + var sameDirectionIncreasing = + (this.operator === '>=' || this.operator === '>') && + (comp.operator === '>=' || comp.operator === '>') + var sameDirectionDecreasing = + (this.operator === '<=' || this.operator === '<') && + (comp.operator === '<=' || comp.operator === '<') + var sameSemVer = this.semver.version === comp.semver.version + var differentDirectionsInclusive = + (this.operator === '>=' || this.operator === '<=') && + (comp.operator === '>=' || comp.operator === '<=') + var oppositeDirectionsLessThan = + cmp(this.semver, '<', comp.semver, options) && + ((this.operator === '>=' || this.operator === '>') && + (comp.operator === '<=' || comp.operator === '<')) + var oppositeDirectionsGreaterThan = + cmp(this.semver, '>', comp.semver, options) && + ((this.operator === '<=' || this.operator === '<') && + (comp.operator === '>=' || comp.operator === '>')) + + return sameDirectionIncreasing || sameDirectionDecreasing || + (sameSemVer && differentDirectionsInclusive) || + oppositeDirectionsLessThan || oppositeDirectionsGreaterThan +} + +exports.Range = Range +function Range (range, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (range instanceof Range) { + if (range.loose === !!options.loose && + range.includePrerelease === !!options.includePrerelease) { + return range + } else { + return new Range(range.raw, options) + } + } + + if (range instanceof Comparator) { + return new Range(range.value, options) + } + + if (!(this instanceof Range)) { + return new Range(range, options) + } + + this.options = options + this.loose = !!options.loose + this.includePrerelease = !!options.includePrerelease + + // First, split based on boolean or || + this.raw = range + this.set = range.split(/\s*\|\|\s*/).map(function (range) { + return this.parseRange(range.trim()) + }, this).filter(function (c) { + // throw out any that are not relevant for whatever reason + return c.length + }) + + if (!this.set.length) { + throw new TypeError('Invalid SemVer Range: ' + range) + } + + this.format() +} + +Range.prototype.format = function () { + this.range = this.set.map(function (comps) { + return comps.join(' ').trim() + }).join('||').trim() + return this.range +} + +Range.prototype.toString = function () { + return this.range +} + 
+Range.prototype.parseRange = function (range) { + var loose = this.options.loose + range = range.trim() + // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` + var hr = loose ? re[t.HYPHENRANGELOOSE] : re[t.HYPHENRANGE] + range = range.replace(hr, hyphenReplace) + debug('hyphen replace', range) + // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` + range = range.replace(re[t.COMPARATORTRIM], comparatorTrimReplace) + debug('comparator trim', range, re[t.COMPARATORTRIM]) + + // `~ 1.2.3` => `~1.2.3` + range = range.replace(re[t.TILDETRIM], tildeTrimReplace) + + // `^ 1.2.3` => `^1.2.3` + range = range.replace(re[t.CARETTRIM], caretTrimReplace) + + // normalize spaces + range = range.split(/\s+/).join(' ') + + // At this point, the range is completely trimmed and + // ready to be split into comparators. + + var compRe = loose ? re[t.COMPARATORLOOSE] : re[t.COMPARATOR] + var set = range.split(' ').map(function (comp) { + return parseComparator(comp, this.options) + }, this).join(' ').split(/\s+/) + if (this.options.loose) { + // in loose mode, throw out any that are not valid comparators + set = set.filter(function (comp) { + return !!comp.match(compRe) + }) + } + set = set.map(function (comp) { + return new Comparator(comp, this.options) + }, this) + + return set +} + +Range.prototype.intersects = function (range, options) { + if (!(range instanceof Range)) { + throw new TypeError('a Range is required') + } + + return this.set.some(function (thisComparators) { + return ( + isSatisfiable(thisComparators, options) && + range.set.some(function (rangeComparators) { + return ( + isSatisfiable(rangeComparators, options) && + thisComparators.every(function (thisComparator) { + return rangeComparators.every(function (rangeComparator) { + return thisComparator.intersects(rangeComparator, options) + }) + }) + ) + }) + ) + }) +} + +// take a set of comparators and determine whether there +// exists a version which can satisfy it +function isSatisfiable (comparators, options) { + var result = true + var remainingComparators = comparators.slice() + var testComparator = remainingComparators.pop() + + while (result && remainingComparators.length) { + result = remainingComparators.every(function (otherComparator) { + return testComparator.intersects(otherComparator, options) + }) + + testComparator = remainingComparators.pop() + } + + return result +} + +// Mostly just for testing and legacy API reasons +exports.toComparators = toComparators +function toComparators (range, options) { + return new Range(range, options).set.map(function (comp) { + return comp.map(function (c) { + return c.value + }).join(' ').trim().split(' ') + }) +} + +// comprised of xranges, tildes, stars, and gtlt's at this point. +// already replaced the hyphen ranges +// turn into a set of JUST comparators. 
+function parseComparator (comp, options) { + debug('comp', comp, options) + comp = replaceCarets(comp, options) + debug('caret', comp) + comp = replaceTildes(comp, options) + debug('tildes', comp) + comp = replaceXRanges(comp, options) + debug('xrange', comp) + comp = replaceStars(comp, options) + debug('stars', comp) + return comp +} + +function isX (id) { + return !id || id.toLowerCase() === 'x' || id === '*' +} + +// ~, ~> --> * (any, kinda silly) +// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 +// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 +// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 +// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 +// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 +function replaceTildes (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceTilde(comp, options) + }).join(' ') +} + +function replaceTilde (comp, options) { + var r = options.loose ? re[t.TILDELOOSE] : re[t.TILDE] + return comp.replace(r, function (_, M, m, p, pr) { + debug('tilde', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + // ~1.2 == >=1.2.0 <1.3.0 + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else if (pr) { + debug('replaceTilde pr', pr) + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } else { + // ~1.2.3 == >=1.2.3 <1.3.0 + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + + debug('tilde return', ret) + return ret + }) +} + +// ^ --> * (any, kinda silly) +// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 +// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 +// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 +// ^1.2.3 --> >=1.2.3 <2.0.0 +// ^1.2.0 --> >=1.2.0 <2.0.0 +function replaceCarets (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceCaret(comp, options) + }).join(' ') +} + +function replaceCaret (comp, options) { + debug('caret', comp, options) + var r = options.loose ? re[t.CARETLOOSE] : re[t.CARET] + return comp.replace(r, function (_, M, m, p, pr) { + debug('caret', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + if (M === '0') { + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else { + ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0' + } + } else if (pr) { + debug('replaceCaret pr', pr) + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + (+M + 1) + '.0.0' + } + } else { + debug('no pr') + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + (+M + 1) + '.0.0' + } + } + + debug('caret return', ret) + return ret + }) +} + +function replaceXRanges (comp, options) { + debug('replaceXRanges', comp, options) + return comp.split(/\s+/).map(function (comp) { + return replaceXRange(comp, options) + }).join(' ') +} + +function replaceXRange (comp, options) { + comp = comp.trim() + var r = options.loose ? 
re[t.XRANGELOOSE] : re[t.XRANGE] + return comp.replace(r, function (ret, gtlt, M, m, p, pr) { + debug('xRange', comp, ret, gtlt, M, m, p, pr) + var xM = isX(M) + var xm = xM || isX(m) + var xp = xm || isX(p) + var anyX = xp + + if (gtlt === '=' && anyX) { + gtlt = '' + } + + // if we're including prereleases in the match, then we need + // to fix this to -0, the lowest possible prerelease value + pr = options.includePrerelease ? '-0' : '' + + if (xM) { + if (gtlt === '>' || gtlt === '<') { + // nothing is allowed + ret = '<0.0.0-0' + } else { + // nothing is forbidden + ret = '*' + } + } else if (gtlt && anyX) { + // we know patch is an x, because we have any x at all. + // replace X with 0 + if (xm) { + m = 0 + } + p = 0 + + if (gtlt === '>') { + // >1 => >=2.0.0 + // >1.2 => >=1.3.0 + // >1.2.3 => >= 1.2.4 + gtlt = '>=' + if (xm) { + M = +M + 1 + m = 0 + p = 0 + } else { + m = +m + 1 + p = 0 + } + } else if (gtlt === '<=') { + // <=0.7.x is actually <0.8.0, since any 0.7.x should + // pass. Similarly, <=7.x is actually <8.0.0, etc. + gtlt = '<' + if (xm) { + M = +M + 1 + } else { + m = +m + 1 + } + } + + ret = gtlt + M + '.' + m + '.' + p + pr + } else if (xm) { + ret = '>=' + M + '.0.0' + pr + ' <' + (+M + 1) + '.0.0' + pr + } else if (xp) { + ret = '>=' + M + '.' + m + '.0' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + pr + } + + debug('xRange return', ret) + + return ret + }) +} + +// Because * is AND-ed with everything else in the comparator, +// and '' means "any version", just remove the *s entirely. +function replaceStars (comp, options) { + debug('replaceStars', comp, options) + // Looseness is ignored here. star is always as loose as it gets! + return comp.trim().replace(re[t.STAR], '') +} + +// This function is passed to string.replace(re[t.HYPHENRANGE]) +// M, m, patch, prerelease, build +// 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 +// 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do +// 1.2 - 3.4 => >=1.2.0 <3.5.0 +function hyphenReplace ($0, + from, fM, fm, fp, fpr, fb, + to, tM, tm, tp, tpr, tb) { + if (isX(fM)) { + from = '' + } else if (isX(fm)) { + from = '>=' + fM + '.0.0' + } else if (isX(fp)) { + from = '>=' + fM + '.' + fm + '.0' + } else { + from = '>=' + from + } + + if (isX(tM)) { + to = '' + } else if (isX(tm)) { + to = '<' + (+tM + 1) + '.0.0' + } else if (isX(tp)) { + to = '<' + tM + '.' + (+tm + 1) + '.0' + } else if (tpr) { + to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr + } else { + to = '<=' + to + } + + return (from + ' ' + to).trim() +} + +// if ANY of the sets match ALL of its comparators, then pass +Range.prototype.test = function (version) { + if (!version) { + return false + } + + if (typeof version === 'string') { + try { + version = new SemVer(version, this.options) + } catch (er) { + return false + } + } + + for (var i = 0; i < this.set.length; i++) { + if (testSet(this.set[i], version, this.options)) { + return true + } + } + return false +} + +function testSet (set, version, options) { + for (var i = 0; i < set.length; i++) { + if (!set[i].test(version)) { + return false + } + } + + if (version.prerelease.length && !options.includePrerelease) { + // Find the set of versions that are allowed to have prereleases + // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 + // That should allow `1.2.3-pr.2` to pass. + // However, `1.2.4-alpha.notready` should NOT be allowed, + // even though it's within the range set by the comparators. 
+ for (i = 0; i < set.length; i++) { + debug(set[i].semver) + if (set[i].semver === ANY) { + continue + } + + if (set[i].semver.prerelease.length > 0) { + var allowed = set[i].semver + if (allowed.major === version.major && + allowed.minor === version.minor && + allowed.patch === version.patch) { + return true + } + } + } + + // Version has a -pre, but it's not one of the ones we like. + return false + } + + return true +} + +exports.satisfies = satisfies +function satisfies (version, range, options) { + try { + range = new Range(range, options) + } catch (er) { + return false + } + return range.test(version) +} + +exports.maxSatisfying = maxSatisfying +function maxSatisfying (versions, range, options) { + var max = null + var maxSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!max || maxSV.compare(v) === -1) { + // compare(max, v, true) + max = v + maxSV = new SemVer(max, options) + } + } + }) + return max +} + +exports.minSatisfying = minSatisfying +function minSatisfying (versions, range, options) { + var min = null + var minSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!min || minSV.compare(v) === 1) { + // compare(min, v, true) + min = v + minSV = new SemVer(min, options) + } + } + }) + return min +} + +exports.minVersion = minVersion +function minVersion (range, loose) { + range = new Range(range, loose) + + var minver = new SemVer('0.0.0') + if (range.test(minver)) { + return minver + } + + minver = new SemVer('0.0.0-0') + if (range.test(minver)) { + return minver + } + + minver = null + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + comparators.forEach(function (comparator) { + // Clone to avoid manipulating the comparator's semver object. + var compver = new SemVer(comparator.semver.version) + switch (comparator.operator) { + case '>': + if (compver.prerelease.length === 0) { + compver.patch++ + } else { + compver.prerelease.push(0) + } + compver.raw = compver.format() + /* fallthrough */ + case '': + case '>=': + if (!minver || gt(minver, compver)) { + minver = compver + } + break + case '<': + case '<=': + /* Ignore maximum versions */ + break + /* istanbul ignore next */ + default: + throw new Error('Unexpected operation: ' + comparator.operator) + } + }) + } + + if (minver && range.test(minver)) { + return minver + } + + return null +} + +exports.validRange = validRange +function validRange (range, options) { + try { + // Return '*' instead of '' so that truthiness works. + // This will throw if it's invalid anyway + return new Range(range, options).range || '*' + } catch (er) { + return null + } +} + +// Determine if version is less than all the versions possible in the range +exports.ltr = ltr +function ltr (version, range, options) { + return outside(version, range, '<', options) +} + +// Determine if version is greater than all the versions possible in the range. 
+exports.gtr = gtr +function gtr (version, range, options) { + return outside(version, range, '>', options) +} + +exports.outside = outside +function outside (version, range, hilo, options) { + version = new SemVer(version, options) + range = new Range(range, options) + + var gtfn, ltefn, ltfn, comp, ecomp + switch (hilo) { + case '>': + gtfn = gt + ltefn = lte + ltfn = lt + comp = '>' + ecomp = '>=' + break + case '<': + gtfn = lt + ltefn = gte + ltfn = gt + comp = '<' + ecomp = '<=' + break + default: + throw new TypeError('Must provide a hilo val of "<" or ">"') + } + + // If it satisifes the range it is not outside + if (satisfies(version, range, options)) { + return false + } + + // From now on, variable terms are as if we're in "gtr" mode. + // but note that everything is flipped for the "ltr" function. + + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + var high = null + var low = null + + comparators.forEach(function (comparator) { + if (comparator.semver === ANY) { + comparator = new Comparator('>=0.0.0') + } + high = high || comparator + low = low || comparator + if (gtfn(comparator.semver, high.semver, options)) { + high = comparator + } else if (ltfn(comparator.semver, low.semver, options)) { + low = comparator + } + }) + + // If the edge version comparator has a operator then our version + // isn't outside it + if (high.operator === comp || high.operator === ecomp) { + return false + } + + // If the lowest version comparator has an operator and our version + // is less than it then it isn't higher than the range + if ((!low.operator || low.operator === comp) && + ltefn(version, low.semver)) { + return false + } else if (low.operator === ecomp && ltfn(version, low.semver)) { + return false + } + } + return true +} + +exports.prerelease = prerelease +function prerelease (version, options) { + var parsed = parse(version, options) + return (parsed && parsed.prerelease.length) ? parsed.prerelease : null +} + +exports.intersects = intersects +function intersects (r1, r2, options) { + r1 = new Range(r1, options) + r2 = new Range(r2, options) + return r1.intersects(r2) +} + +exports.coerce = coerce +function coerce (version, options) { + if (version instanceof SemVer) { + return version + } + + if (typeof version === 'number') { + version = String(version) + } + + if (typeof version !== 'string') { + return null + } + + options = options || {} + + var match = null + if (!options.rtl) { + match = version.match(re[t.COERCE]) + } else { + // Find the right-most coercible string that does not share + // a terminus with a more left-ward coercible string. + // Eg, '1.2.3.4' wants to coerce '2.3.4', not '3.4' or '4' + // + // Walk through the string checking with a /g regexp + // Manually set the index so as to pick up overlapping matches. + // Stop when we get a match that ends at the string end, since no + // coercible string can be more right-ward without the same terminus. + var next + while ((next = re[t.COERCERTL].exec(version)) && + (!match || match.index + match[0].length !== version.length) + ) { + if (!match || + next.index + next[0].length !== match.index + match[0].length) { + match = next + } + re[t.COERCERTL].lastIndex = next.index + next[1].length + next[2].length + } + // leave it in a clean state + re[t.COERCERTL].lastIndex = -1 + } + + if (match === null) { + return null + } + + return parse(match[2] + + '.' + (match[3] || '0') + + '.' 
+ (match[4] || '0'), options) +} diff --git a/node_modules/make-dir/package.json b/node_modules/make-dir/package.json new file mode 100644 index 000000000..08f9a7bfa --- /dev/null +++ b/node_modules/make-dir/package.json @@ -0,0 +1,91 @@ +{ + "_from": "make-dir@^3.0.0", + "_id": "make-dir@3.1.0", + "_inBundle": false, + "_integrity": "sha512-g3FeP20LNwhALb/6Cz6Dd4F2ngze0jz7tbzrD2wAV+o9FeNHe4rL+yK2md0J/fiSf1sa1ADhXqi5+oVwOM/eGw==", + "_location": "/make-dir", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "make-dir@^3.0.0", + "name": "make-dir", + "escapedName": "make-dir", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/configstore" + ], + "_resolved": "https://registry.npmjs.org/make-dir/-/make-dir-3.1.0.tgz", + "_shasum": "415e967046b3a7f1d185277d84aa58203726a13f", + "_spec": "make-dir@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\configstore", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/make-dir/issues" + }, + "bundleDependencies": false, + "dependencies": { + "semver": "^6.0.0" + }, + "deprecated": false, + "description": "Make a directory and its parents if needed - Think `mkdir -p`", + "devDependencies": { + "@types/graceful-fs": "^4.1.3", + "@types/node": "^13.7.1", + "ava": "^1.4.0", + "codecov": "^3.2.0", + "graceful-fs": "^4.1.15", + "nyc": "^15.0.0", + "path-type": "^4.0.0", + "tempy": "^0.2.1", + "tsd": "^0.11.0", + "xo": "^0.25.4" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "funding": "https://github.com/sponsors/sindresorhus", + "homepage": "https://github.com/sindresorhus/make-dir#readme", + "keywords": [ + "mkdir", + "mkdirp", + "make", + "directories", + "dir", + "dirs", + "folders", + "directory", + "folder", + "path", + "parent", + "parents", + "intermediate", + "recursively", + "recursive", + "create", + "fs", + "filesystem", + "file-system" + ], + "license": "MIT", + "name": "make-dir", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/make-dir.git" + }, + "scripts": { + "test": "xo && nyc ava && tsd" + }, + "version": "3.1.0" +} diff --git a/node_modules/make-dir/readme.md b/node_modules/make-dir/readme.md new file mode 100644 index 000000000..a10a1a424 --- /dev/null +++ b/node_modules/make-dir/readme.md @@ -0,0 +1,125 @@ +# make-dir [![Build Status](https://travis-ci.org/sindresorhus/make-dir.svg?branch=master)](https://travis-ci.org/sindresorhus/make-dir) [![codecov](https://codecov.io/gh/sindresorhus/make-dir/branch/master/graph/badge.svg)](https://codecov.io/gh/sindresorhus/make-dir) + +> Make a directory and its parents if needed - Think `mkdir -p` + +## Advantages over [`mkdirp`](https://github.com/substack/node-mkdirp) + +- Promise API *(Async/await ready!)* +- Fixes many `mkdirp` issues: [#96](https://github.com/substack/node-mkdirp/pull/96) [#70](https://github.com/substack/node-mkdirp/issues/70) [#66](https://github.com/substack/node-mkdirp/issues/66) +- 100% test coverage +- CI-tested on macOS, Linux, and Windows +- Actively maintained +- Doesn't bundle a CLI +- Uses the native `fs.mkdir/mkdirSync` [`recursive` option](https://nodejs.org/dist/latest/docs/api/fs.html#fs_fs_mkdir_path_options_callback) in Node.js >=10.12.0 unless [overridden](#fs) + +## Install + +``` +$ npm install make-dir +``` + +## Usage + 
+``` +$ pwd +/Users/sindresorhus/fun +$ tree +. +``` + +```js +const makeDir = require('make-dir'); + +(async () => { + const path = await makeDir('unicorn/rainbow/cake'); + + console.log(path); + //=> '/Users/sindresorhus/fun/unicorn/rainbow/cake' +})(); +``` + +``` +$ tree +. +└── unicorn + └── rainbow + └── cake +``` + +Multiple directories: + +```js +const makeDir = require('make-dir'); + +(async () => { + const paths = await Promise.all([ + makeDir('unicorn/rainbow'), + makeDir('foo/bar') + ]); + + console.log(paths); + /* + [ + '/Users/sindresorhus/fun/unicorn/rainbow', + '/Users/sindresorhus/fun/foo/bar' + ] + */ +})(); +``` + +## API + +### makeDir(path, options?) + +Returns a `Promise` for the path to the created directory. + +### makeDir.sync(path, options?) + +Returns the path to the created directory. + +#### path + +Type: `string` + +Directory to create. + +#### options + +Type: `object` + +##### mode + +Type: `integer`\ +Default: `0o777` + +Directory [permissions](https://x-team.com/blog/file-system-permissions-umask-node-js/). + +##### fs + +Type: `object`\ +Default: `require('fs')` + +Use a custom `fs` implementation. For example [`graceful-fs`](https://github.com/isaacs/node-graceful-fs). + +Using a custom `fs` implementation will block the use of the native `recursive` option if `fs.mkdir` or `fs.mkdirSync` is not the native function. + +## Related + +- [make-dir-cli](https://github.com/sindresorhus/make-dir-cli) - CLI for this module +- [del](https://github.com/sindresorhus/del) - Delete files and directories +- [globby](https://github.com/sindresorhus/globby) - User-friendly glob matching +- [cpy](https://github.com/sindresorhus/cpy) - Copy files +- [cpy-cli](https://github.com/sindresorhus/cpy-cli) - Copy files on the command-line +- [move-file](https://github.com/sindresorhus/move-file) - Move a file + +--- + +
+Get professional support for this package with a Tidelift subscription.
+
+Tidelift helps make open source sustainable for maintainers while giving companies
+assurances about security, maintenance, and licensing for their dependencies.
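The `semver` hunks earlier in this diff add only the library's implementation, with no usage documentation of their own. As an illustrative sketch (not part of any vendored file), here is how the exported range helpers such as `satisfies`, `validRange`, `maxSatisfying`, and `coerce` are typically called; make-dir's package.json above depends on `semver@^6.0.0`, so the 6.x CommonJS API is assumed:

```js
// Illustrative only: exercises the semver range helpers added in the hunks above.
const semver = require('semver')

semver.satisfies('1.2.3', '^1.0.0')                          // true
semver.validRange('1.2 - 3.4')                               // '>=1.2.0 <3.5.0'
semver.maxSatisfying(['1.2.3', '1.3.0', '2.0.1'], '^1.2.0')  // '1.3.0'
semver.coerce('v2 is the latest').version                    // '2.0.0'
```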
diff --git a/node_modules/media-typer/HISTORY.md b/node_modules/media-typer/HISTORY.md new file mode 100644 index 000000000..62c200316 --- /dev/null +++ b/node_modules/media-typer/HISTORY.md @@ -0,0 +1,22 @@ +0.3.0 / 2014-09-07 +================== + + * Support Node.js 0.6 + * Throw error when parameter format invalid on parse + +0.2.0 / 2014-06-18 +================== + + * Add `typer.format()` to format media types + +0.1.0 / 2014-06-17 +================== + + * Accept `req` as argument to `parse` + * Accept `res` as argument to `parse` + * Parse media type with extra LWS between type and first parameter + +0.0.0 / 2014-06-13 +================== + + * Initial implementation diff --git a/node_modules/media-typer/LICENSE b/node_modules/media-typer/LICENSE new file mode 100644 index 000000000..b7dce6cf9 --- /dev/null +++ b/node_modules/media-typer/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2014 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/media-typer/README.md b/node_modules/media-typer/README.md new file mode 100644 index 000000000..d8df62347 --- /dev/null +++ b/node_modules/media-typer/README.md @@ -0,0 +1,81 @@ +# media-typer + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Simple RFC 6838 media type parser + +## Installation + +```sh +$ npm install media-typer +``` + +## API + +```js +var typer = require('media-typer') +``` + +### typer.parse(string) + +```js +var obj = typer.parse('image/svg+xml; charset=utf-8') +``` + +Parse a media type string. This will return an object with the following +properties (examples are shown for the string `'image/svg+xml; charset=utf-8'`): + + - `type`: The type of the media type (always lower case). Example: `'image'` + + - `subtype`: The subtype of the media type (always lower case). Example: `'svg'` + + - `suffix`: The suffix of the media type (always lower case). Example: `'xml'` + + - `parameters`: An object of the parameters in the media type (name of parameter always lower case). Example: `{charset: 'utf-8'}` + +### typer.parse(req) + +```js +var obj = typer.parse(req) +``` + +Parse the `content-type` header from the given `req`. Short-cut for +`typer.parse(req.headers['content-type'])`. 
+ +### typer.parse(res) + +```js +var obj = typer.parse(res) +``` + +Parse the `content-type` header set on the given `res`. Short-cut for +`typer.parse(res.getHeader('content-type'))`. + +### typer.format(obj) + +```js +var obj = typer.format({type: 'image', subtype: 'svg', suffix: 'xml'}) +``` + +Format an object into a media type string. This will return a string of the +mime type for the given object. For the properties of the object, see the +documentation for `typer.parse(string)`. + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/media-typer.svg?style=flat +[npm-url]: https://npmjs.org/package/media-typer +[node-version-image]: https://img.shields.io/badge/node.js-%3E%3D_0.6-brightgreen.svg?style=flat +[node-version-url]: http://nodejs.org/download/ +[travis-image]: https://img.shields.io/travis/jshttp/media-typer.svg?style=flat +[travis-url]: https://travis-ci.org/jshttp/media-typer +[coveralls-image]: https://img.shields.io/coveralls/jshttp/media-typer.svg?style=flat +[coveralls-url]: https://coveralls.io/r/jshttp/media-typer +[downloads-image]: https://img.shields.io/npm/dm/media-typer.svg?style=flat +[downloads-url]: https://npmjs.org/package/media-typer diff --git a/node_modules/media-typer/index.js b/node_modules/media-typer/index.js new file mode 100644 index 000000000..07f7295ee --- /dev/null +++ b/node_modules/media-typer/index.js @@ -0,0 +1,270 @@ +/*! + * media-typer + * Copyright(c) 2014 Douglas Christopher Wilson + * MIT Licensed + */ + +/** + * RegExp to match *( ";" parameter ) in RFC 2616 sec 3.7 + * + * parameter = token "=" ( token | quoted-string ) + * token = 1* + * separators = "(" | ")" | "<" | ">" | "@" + * | "," | ";" | ":" | "\" | <"> + * | "/" | "[" | "]" | "?" | "=" + * | "{" | "}" | SP | HT + * quoted-string = ( <"> *(qdtext | quoted-pair ) <"> ) + * qdtext = > + * quoted-pair = "\" CHAR + * CHAR = + * TEXT = + * LWS = [CRLF] 1*( SP | HT ) + * CRLF = CR LF + * CR = + * LF = + * SP = + * SHT = + * CTL = + * OCTET = + */ +var paramRegExp = /; *([!#$%&'\*\+\-\.0-9A-Z\^_`a-z\|~]+) *= *("(?:[ !\u0023-\u005b\u005d-\u007e\u0080-\u00ff]|\\[\u0020-\u007e])*"|[!#$%&'\*\+\-\.0-9A-Z\^_`a-z\|~]+) */g; +var textRegExp = /^[\u0020-\u007e\u0080-\u00ff]+$/ +var tokenRegExp = /^[!#$%&'\*\+\-\.0-9A-Z\^_`a-z\|~]+$/ + +/** + * RegExp to match quoted-pair in RFC 2616 + * + * quoted-pair = "\" CHAR + * CHAR = + */ +var qescRegExp = /\\([\u0000-\u007f])/g; + +/** + * RegExp to match chars that must be quoted-pair in RFC 2616 + */ +var quoteRegExp = /([\\"])/g; + +/** + * RegExp to match type in RFC 6838 + * + * type-name = restricted-name + * subtype-name = restricted-name + * restricted-name = restricted-name-first *126restricted-name-chars + * restricted-name-first = ALPHA / DIGIT + * restricted-name-chars = ALPHA / DIGIT / "!" / "#" / + * "$" / "&" / "-" / "^" / "_" + * restricted-name-chars =/ "." ; Characters before first dot always + * ; specify a facet name + * restricted-name-chars =/ "+" ; Characters after last plus always + * ; specify a structured syntax suffix + * ALPHA = %x41-5A / %x61-7A ; A-Z / a-z + * DIGIT = %x30-39 ; 0-9 + */ +var subtypeNameRegExp = /^[A-Za-z0-9][A-Za-z0-9!#$&^_.-]{0,126}$/ +var typeNameRegExp = /^[A-Za-z0-9][A-Za-z0-9!#$&^_-]{0,126}$/ +var typeRegExp = /^ *([A-Za-z0-9][A-Za-z0-9!#$&^_-]{0,126})\/([A-Za-z0-9][A-Za-z0-9!#$&^_.+-]{0,126}) *$/; + +/** + * Module exports. + */ + +exports.format = format +exports.parse = parse + +/** + * Format object to media type. 
+ * + * @param {object} obj + * @return {string} + * @api public + */ + +function format(obj) { + if (!obj || typeof obj !== 'object') { + throw new TypeError('argument obj is required') + } + + var parameters = obj.parameters + var subtype = obj.subtype + var suffix = obj.suffix + var type = obj.type + + if (!type || !typeNameRegExp.test(type)) { + throw new TypeError('invalid type') + } + + if (!subtype || !subtypeNameRegExp.test(subtype)) { + throw new TypeError('invalid subtype') + } + + // format as type/subtype + var string = type + '/' + subtype + + // append +suffix + if (suffix) { + if (!typeNameRegExp.test(suffix)) { + throw new TypeError('invalid suffix') + } + + string += '+' + suffix + } + + // append parameters + if (parameters && typeof parameters === 'object') { + var param + var params = Object.keys(parameters).sort() + + for (var i = 0; i < params.length; i++) { + param = params[i] + + if (!tokenRegExp.test(param)) { + throw new TypeError('invalid parameter name') + } + + string += '; ' + param + '=' + qstring(parameters[param]) + } + } + + return string +} + +/** + * Parse media type to object. + * + * @param {string|object} string + * @return {Object} + * @api public + */ + +function parse(string) { + if (!string) { + throw new TypeError('argument string is required') + } + + // support req/res-like objects as argument + if (typeof string === 'object') { + string = getcontenttype(string) + } + + if (typeof string !== 'string') { + throw new TypeError('argument string is required to be a string') + } + + var index = string.indexOf(';') + var type = index !== -1 + ? string.substr(0, index) + : string + + var key + var match + var obj = splitType(type) + var params = {} + var value + + paramRegExp.lastIndex = index + + while (match = paramRegExp.exec(string)) { + if (match.index !== index) { + throw new TypeError('invalid parameter format') + } + + index += match[0].length + key = match[1].toLowerCase() + value = match[2] + + if (value[0] === '"') { + // remove quotes and escapes + value = value + .substr(1, value.length - 2) + .replace(qescRegExp, '$1') + } + + params[key] = value + } + + if (index !== -1 && index !== string.length) { + throw new TypeError('invalid parameter format') + } + + obj.parameters = params + + return obj +} + +/** + * Get content-type from req/res objects. + * + * @param {object} + * @return {Object} + * @api private + */ + +function getcontenttype(obj) { + if (typeof obj.getHeader === 'function') { + // res-like + return obj.getHeader('content-type') + } + + if (typeof obj.headers === 'object') { + // req-like + return obj.headers && obj.headers['content-type'] + } +} + +/** + * Quote a string if necessary. + * + * @param {string} val + * @return {string} + * @api private + */ + +function qstring(val) { + var str = String(val) + + // no need to quote tokens + if (tokenRegExp.test(str)) { + return str + } + + if (str.length > 0 && !textRegExp.test(str)) { + throw new TypeError('invalid parameter value') + } + + return '"' + str.replace(quoteRegExp, '\\$1') + '"' +} + +/** + * Simply "type/subtype+siffx" into parts. 
+ * + * @param {string} string + * @return {Object} + * @api private + */ + +function splitType(string) { + var match = typeRegExp.exec(string.toLowerCase()) + + if (!match) { + throw new TypeError('invalid media type') + } + + var type = match[1] + var subtype = match[2] + var suffix + + // suffix after last + + var index = subtype.lastIndexOf('+') + if (index !== -1) { + suffix = subtype.substr(index + 1) + subtype = subtype.substr(0, index) + } + + var obj = { + type: type, + subtype: subtype, + suffix: suffix + } + + return obj +} diff --git a/node_modules/media-typer/package.json b/node_modules/media-typer/package.json new file mode 100644 index 000000000..c66e2f343 --- /dev/null +++ b/node_modules/media-typer/package.json @@ -0,0 +1,61 @@ +{ + "_from": "media-typer@0.3.0", + "_id": "media-typer@0.3.0", + "_inBundle": false, + "_integrity": "sha1-hxDXrwqmJvj/+hzgAWhUUmMlV0g=", + "_location": "/media-typer", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "media-typer@0.3.0", + "name": "media-typer", + "escapedName": "media-typer", + "rawSpec": "0.3.0", + "saveSpec": null, + "fetchSpec": "0.3.0" + }, + "_requiredBy": [ + "/type-is" + ], + "_resolved": "https://registry.npmjs.org/media-typer/-/media-typer-0.3.0.tgz", + "_shasum": "8710d7af0aa626f8fffa1ce00168545263255748", + "_spec": "media-typer@0.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\type-is", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/jshttp/media-typer/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Simple RFC 6838 media type parser and formatter", + "devDependencies": { + "istanbul": "0.3.2", + "mocha": "~1.21.4", + "should": "~4.0.4" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "index.js" + ], + "homepage": "https://github.com/jshttp/media-typer#readme", + "license": "MIT", + "name": "media-typer", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/media-typer.git" + }, + "scripts": { + "test": "mocha --reporter spec --check-leaks --bail test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" + }, + "version": "0.3.0" +} diff --git a/node_modules/memory-pager/.travis.yml b/node_modules/memory-pager/.travis.yml new file mode 100644 index 000000000..1c4ab31e9 --- /dev/null +++ b/node_modules/memory-pager/.travis.yml @@ -0,0 +1,4 @@ +language: node_js +node_js: + - '4' + - '6' diff --git a/node_modules/memory-pager/LICENSE b/node_modules/memory-pager/LICENSE new file mode 100644 index 000000000..56fce0895 --- /dev/null +++ b/node_modules/memory-pager/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2017 Mathias Buus + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or 
substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/memory-pager/README.md b/node_modules/memory-pager/README.md new file mode 100644 index 000000000..aed176140 --- /dev/null +++ b/node_modules/memory-pager/README.md @@ -0,0 +1,65 @@ +# memory-pager + +Access memory using small fixed sized buffers instead of allocating a huge buffer. +Useful if you are implementing sparse data structures (such as large bitfield). + +![travis](https://travis-ci.org/mafintosh/memory-pager.svg?branch=master) + +``` +npm install memory-pager +``` + +## Usage + +``` js +var pager = require('paged-memory') + +var pages = pager(1024) // use 1kb per page + +var page = pages.get(10) // get page #10 + +console.log(page.offset) // 10240 +console.log(page.buffer) // a blank 1kb buffer +``` + +## API + +#### `var pages = pager(pageSize)` + +Create a new pager. `pageSize` defaults to `1024`. + +#### `var page = pages.get(pageNumber, [noAllocate])` + +Get a page. The page will be allocated at first access. + +Optionally you can set the `noAllocate` flag which will make the +method return undefined if no page has been allocated already + +A page looks like this + +``` js +{ + offset: byteOffset, + buffer: bufferWithPageSize +} +``` + +#### `pages.set(pageNumber, buffer)` + +Explicitly set the buffer for a page. + +#### `pages.updated(page)` + +Mark a page as updated. + +#### `pages.lastUpdate()` + +Get the last page that was updated. + +#### `var buf = pages.toBuffer()` + +Concat all pages allocated pages into a single buffer + +## License + +MIT diff --git a/node_modules/memory-pager/index.js b/node_modules/memory-pager/index.js new file mode 100644 index 000000000..687f346f3 --- /dev/null +++ b/node_modules/memory-pager/index.js @@ -0,0 +1,160 @@ +module.exports = Pager + +function Pager (pageSize, opts) { + if (!(this instanceof Pager)) return new Pager(pageSize, opts) + + this.length = 0 + this.updates = [] + this.path = new Uint16Array(4) + this.pages = new Array(32768) + this.maxPages = this.pages.length + this.level = 0 + this.pageSize = pageSize || 1024 + this.deduplicate = opts ? opts.deduplicate : null + this.zeros = this.deduplicate ? 
alloc(this.deduplicate.length) : null +} + +Pager.prototype.updated = function (page) { + while (this.deduplicate && page.buffer[page.deduplicate] === this.deduplicate[page.deduplicate]) { + page.deduplicate++ + if (page.deduplicate === this.deduplicate.length) { + page.deduplicate = 0 + if (page.buffer.equals && page.buffer.equals(this.deduplicate)) page.buffer = this.deduplicate + break + } + } + if (page.updated || !this.updates) return + page.updated = true + this.updates.push(page) +} + +Pager.prototype.lastUpdate = function () { + if (!this.updates || !this.updates.length) return null + var page = this.updates.pop() + page.updated = false + return page +} + +Pager.prototype._array = function (i, noAllocate) { + if (i >= this.maxPages) { + if (noAllocate) return + grow(this, i) + } + + factor(i, this.path) + + var arr = this.pages + + for (var j = this.level; j > 0; j--) { + var p = this.path[j] + var next = arr[p] + + if (!next) { + if (noAllocate) return + next = arr[p] = new Array(32768) + } + + arr = next + } + + return arr +} + +Pager.prototype.get = function (i, noAllocate) { + var arr = this._array(i, noAllocate) + var first = this.path[0] + var page = arr && arr[first] + + if (!page && !noAllocate) { + page = arr[first] = new Page(i, alloc(this.pageSize)) + if (i >= this.length) this.length = i + 1 + } + + if (page && page.buffer === this.deduplicate && this.deduplicate && !noAllocate) { + page.buffer = copy(page.buffer) + page.deduplicate = 0 + } + + return page +} + +Pager.prototype.set = function (i, buf) { + var arr = this._array(i, false) + var first = this.path[0] + + if (i >= this.length) this.length = i + 1 + + if (!buf || (this.zeros && buf.equals && buf.equals(this.zeros))) { + arr[first] = undefined + return + } + + if (this.deduplicate && buf.equals && buf.equals(this.deduplicate)) { + buf = this.deduplicate + } + + var page = arr[first] + var b = truncate(buf, this.pageSize) + + if (page) page.buffer = b + else arr[first] = new Page(i, b) +} + +Pager.prototype.toBuffer = function () { + var list = new Array(this.length) + var empty = alloc(this.pageSize) + var ptr = 0 + + while (ptr < list.length) { + var arr = this._array(ptr, true) + for (var i = 0; i < 32768 && ptr < list.length; i++) { + list[ptr++] = (arr && arr[i]) ? arr[i].buffer : empty + } + } + + return Buffer.concat(list) +} + +function grow (pager, index) { + while (pager.maxPages < index) { + var old = pager.pages + pager.pages = new Array(32768) + pager.pages[0] = old + pager.level++ + pager.maxPages *= 32768 + } +} + +function truncate (buf, len) { + if (buf.length === len) return buf + if (buf.length > len) return buf.slice(0, len) + var cpy = alloc(len) + buf.copy(cpy) + return cpy +} + +function alloc (size) { + if (Buffer.alloc) return Buffer.alloc(size) + var buf = new Buffer(size) + buf.fill(0) + return buf +} + +function copy (buf) { + var cpy = Buffer.allocUnsafe ? 
Buffer.allocUnsafe(buf.length) : new Buffer(buf.length) + buf.copy(cpy) + return cpy +} + +function Page (i, buf) { + this.offset = i * buf.length + this.buffer = buf + this.updated = false + this.deduplicate = 0 +} + +function factor (n, out) { + n = (n - (out[0] = (n & 32767))) / 32768 + n = (n - (out[1] = (n & 32767))) / 32768 + out[3] = ((n - (out[2] = (n & 32767))) / 32768) & 32767 +} diff --git a/node_modules/memory-pager/package.json b/node_modules/memory-pager/package.json new file mode 100644 index 000000000..d51cd4c7b --- /dev/null +++ b/node_modules/memory-pager/package.json @@ -0,0 +1,52 @@ +{ + "_from": "memory-pager@^1.0.2", + "_id": "memory-pager@1.5.0", + "_inBundle": false, + "_integrity": "sha512-ZS4Bp4r/Zoeq6+NLJpP+0Zzm0pR8whtGPf1XExKLJBAczGMnSi3It14OiNCStjQjM6NU1okjQGSxgEZN8eBYKg==", + "_location": "/memory-pager", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "memory-pager@^1.0.2", + "name": "memory-pager", + "escapedName": "memory-pager", + "rawSpec": "^1.0.2", + "saveSpec": null, + "fetchSpec": "^1.0.2" + }, + "_requiredBy": [ + "/sparse-bitfield" + ], + "_resolved": "https://registry.npmjs.org/memory-pager/-/memory-pager-1.5.0.tgz", + "_shasum": "d8751655d22d384682741c972f2c3d6dfa3e66b5", + "_spec": "memory-pager@^1.0.2", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\sparse-bitfield", + "author": { + "name": "Mathias Buus", + "url": "@mafintosh" + }, + "bugs": { + "url": "https://github.com/mafintosh/memory-pager/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Access memory using small fixed sized buffers", + "devDependencies": { + "standard": "^9.0.0", + "tape": "^4.6.3" + }, + "homepage": "https://github.com/mafintosh/memory-pager", + "license": "MIT", + "main": "index.js", + "name": "memory-pager", + "repository": { + "type": "git", + "url": "git+https://github.com/mafintosh/memory-pager.git" + }, + "scripts": { + "test": "standard && tape test.js" + }, + "version": "1.5.0" +} diff --git a/node_modules/memory-pager/test.js b/node_modules/memory-pager/test.js new file mode 100644 index 000000000..163821003 --- /dev/null +++ b/node_modules/memory-pager/test.js @@ -0,0 +1,80 @@ +var tape = require('tape') +var pager = require('./') + +tape('get page', function (t) { + var pages = pager(1024) + + var page = pages.get(0) + + t.same(page.offset, 0) + t.same(page.buffer, Buffer.alloc(1024)) + t.end() +}) + +tape('get page twice', function (t) { + var pages = pager(1024) + t.same(pages.length, 0) + + var page = pages.get(0) + + t.same(page.offset, 0) + t.same(page.buffer, Buffer.alloc(1024)) + t.same(pages.length, 1) + + var other = pages.get(0) + + t.same(other, page) + t.end() +}) + +tape('get no mutable page', function (t) { + var pages = pager(1024) + + t.ok(!pages.get(141, true)) + t.ok(pages.get(141)) + t.ok(pages.get(141, true)) + + t.end() +}) + +tape('get far out page', function (t) { + var pages = pager(1024) + + var page = pages.get(1000000) + + t.same(page.offset, 1000000 * 1024) + t.same(page.buffer, Buffer.alloc(1024)) + t.same(pages.length, 1000000 + 1) + + var other = pages.get(1) + + t.same(other.offset, 1024) + t.same(other.buffer, Buffer.alloc(1024)) + t.same(pages.length, 1000000 + 1) + t.ok(other !== page) + + t.end() +}) + +tape('updates', function (t) { + var pages = pager(1024) + + t.same(pages.lastUpdate(), null) + + var page = pages.get(10) + + page.buffer[42] = 1 + pages.updated(page) + + 
t.same(pages.lastUpdate(), page) + t.same(pages.lastUpdate(), null) + + page.buffer[42] = 2 + pages.updated(page) + pages.updated(page) + + t.same(pages.lastUpdate(), page) + t.same(pages.lastUpdate(), null) + + t.end() +}) diff --git a/node_modules/merge-descriptors/HISTORY.md b/node_modules/merge-descriptors/HISTORY.md new file mode 100644 index 000000000..486771f08 --- /dev/null +++ b/node_modules/merge-descriptors/HISTORY.md @@ -0,0 +1,21 @@ +1.0.1 / 2016-01-17 +================== + + * perf: enable strict mode + +1.0.0 / 2015-03-01 +================== + + * Add option to only add new descriptors + * Add simple argument validation + * Add jsdoc to source file + +0.0.2 / 2013-12-14 +================== + + * Move repository to `component` organization + +0.0.1 / 2013-10-29 +================== + + * Initial release diff --git a/node_modules/merge-descriptors/LICENSE b/node_modules/merge-descriptors/LICENSE new file mode 100644 index 000000000..274bfd82b --- /dev/null +++ b/node_modules/merge-descriptors/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2013 Jonathan Ong +Copyright (c) 2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/merge-descriptors/README.md b/node_modules/merge-descriptors/README.md new file mode 100644 index 000000000..d593c0ebd --- /dev/null +++ b/node_modules/merge-descriptors/README.md @@ -0,0 +1,48 @@ +# Merge Descriptors + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Merge objects using descriptors. + +```js +var thing = { + get name() { + return 'jon' + } +} + +var animal = { + +} + +merge(animal, thing) + +animal.name === 'jon' +``` + +## API + +### merge(destination, source) + +Redefines `destination`'s descriptors with `source`'s. + +### merge(destination, source, false) + +Defines `source`'s descriptors on `destination` if `destination` does not have +a descriptor by the same name. 
+ +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/merge-descriptors.svg +[npm-url]: https://npmjs.org/package/merge-descriptors +[travis-image]: https://img.shields.io/travis/component/merge-descriptors/master.svg +[travis-url]: https://travis-ci.org/component/merge-descriptors +[coveralls-image]: https://img.shields.io/coveralls/component/merge-descriptors/master.svg +[coveralls-url]: https://coveralls.io/r/component/merge-descriptors?branch=master +[downloads-image]: https://img.shields.io/npm/dm/merge-descriptors.svg +[downloads-url]: https://npmjs.org/package/merge-descriptors diff --git a/node_modules/merge-descriptors/index.js b/node_modules/merge-descriptors/index.js new file mode 100644 index 000000000..573b132eb --- /dev/null +++ b/node_modules/merge-descriptors/index.js @@ -0,0 +1,60 @@ +/*! + * merge-descriptors + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = merge + +/** + * Module variables. + * @private + */ + +var hasOwnProperty = Object.prototype.hasOwnProperty + +/** + * Merge the property descriptors of `src` into `dest` + * + * @param {object} dest Object to add descriptors to + * @param {object} src Object to clone descriptors from + * @param {boolean} [redefine=true] Redefine `dest` properties with `src` properties + * @returns {object} Reference to dest + * @public + */ + +function merge(dest, src, redefine) { + if (!dest) { + throw new TypeError('argument dest is required') + } + + if (!src) { + throw new TypeError('argument src is required') + } + + if (redefine === undefined) { + // Default to true + redefine = true + } + + Object.getOwnPropertyNames(src).forEach(function forEachOwnPropertyName(name) { + if (!redefine && hasOwnProperty.call(dest, name)) { + // Skip desriptor + return + } + + // Copy descriptor + var descriptor = Object.getOwnPropertyDescriptor(src, name) + Object.defineProperty(dest, name, descriptor) + }) + + return dest +} diff --git a/node_modules/merge-descriptors/package.json b/node_modules/merge-descriptors/package.json new file mode 100644 index 000000000..eff5ce40e --- /dev/null +++ b/node_modules/merge-descriptors/package.json @@ -0,0 +1,69 @@ +{ + "_from": "merge-descriptors@1.0.1", + "_id": "merge-descriptors@1.0.1", + "_inBundle": false, + "_integrity": "sha1-sAqqVW3YtEVoFQ7J0blT8/kMu2E=", + "_location": "/merge-descriptors", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "merge-descriptors@1.0.1", + "name": "merge-descriptors", + "escapedName": "merge-descriptors", + "rawSpec": "1.0.1", + "saveSpec": null, + "fetchSpec": "1.0.1" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/merge-descriptors/-/merge-descriptors-1.0.1.tgz", + "_shasum": "b00aaa556dd8b44568150ec9d1b953f3f90cbb61", + "_spec": "merge-descriptors@1.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + }, + "bugs": { + "url": "https://github.com/component/merge-descriptors/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Mike Grabowski", + "email": "grabbou@gmail.com" + } + ], + "deprecated": false, + "description": "Merge objects using 
descriptors", + "devDependencies": { + "istanbul": "0.4.1", + "mocha": "1.21.5" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "README.md", + "index.js" + ], + "homepage": "https://github.com/component/merge-descriptors#readme", + "license": "MIT", + "name": "merge-descriptors", + "repository": { + "type": "git", + "url": "git+https://github.com/component/merge-descriptors.git" + }, + "scripts": { + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-ci": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/" + }, + "version": "1.0.1" +} diff --git a/node_modules/methods/HISTORY.md b/node_modules/methods/HISTORY.md new file mode 100644 index 000000000..c0ecf072d --- /dev/null +++ b/node_modules/methods/HISTORY.md @@ -0,0 +1,29 @@ +1.1.2 / 2016-01-17 +================== + + * perf: enable strict mode + +1.1.1 / 2014-12-30 +================== + + * Improve `browserify` support + +1.1.0 / 2014-07-05 +================== + + * Add `CONNECT` method + +1.0.1 / 2014-06-02 +================== + + * Fix module to work with harmony transform + +1.0.0 / 2014-05-08 +================== + + * Add `PURGE` method + +0.1.0 / 2013-10-28 +================== + + * Add `http.METHODS` support diff --git a/node_modules/methods/LICENSE b/node_modules/methods/LICENSE new file mode 100644 index 000000000..220dc1a24 --- /dev/null +++ b/node_modules/methods/LICENSE @@ -0,0 +1,24 @@ +(The MIT License) + +Copyright (c) 2013-2014 TJ Holowaychuk +Copyright (c) 2015-2016 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + diff --git a/node_modules/methods/README.md b/node_modules/methods/README.md new file mode 100644 index 000000000..672a32bfe --- /dev/null +++ b/node_modules/methods/README.md @@ -0,0 +1,51 @@ +# Methods + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +HTTP verbs that Node.js core's HTTP parser supports. + +This module provides an export that is just like `http.METHODS` from Node.js core, +with the following differences: + + * All method names are lower-cased. + * Contains a fallback list of methods for Node.js versions that do not have a + `http.METHODS` export (0.10 and lower). 
+ * Provides the fallback list when using tools like `browserify` without pulling + in the `http` shim module. + +## Install + +```bash +$ npm install methods +``` + +## API + +```js +var methods = require('methods') +``` + +### methods + +This is an array of lower-cased method names that Node.js supports. If Node.js +provides the `http.METHODS` export, then this is the same array lower-cased, +otherwise it is a snapshot of the verbs from Node.js 0.10. + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/methods.svg?style=flat +[npm-url]: https://npmjs.org/package/methods +[node-version-image]: https://img.shields.io/node/v/methods.svg?style=flat +[node-version-url]: https://nodejs.org/en/download/ +[travis-image]: https://img.shields.io/travis/jshttp/methods.svg?style=flat +[travis-url]: https://travis-ci.org/jshttp/methods +[coveralls-image]: https://img.shields.io/coveralls/jshttp/methods.svg?style=flat +[coveralls-url]: https://coveralls.io/r/jshttp/methods?branch=master +[downloads-image]: https://img.shields.io/npm/dm/methods.svg?style=flat +[downloads-url]: https://npmjs.org/package/methods diff --git a/node_modules/methods/index.js b/node_modules/methods/index.js new file mode 100644 index 000000000..667a50bde --- /dev/null +++ b/node_modules/methods/index.js @@ -0,0 +1,69 @@ +/*! + * methods + * Copyright(c) 2013-2014 TJ Holowaychuk + * Copyright(c) 2015-2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module dependencies. + * @private + */ + +var http = require('http'); + +/** + * Module exports. + * @public + */ + +module.exports = getCurrentNodeMethods() || getBasicNodeMethods(); + +/** + * Get the current Node.js methods. + * @private + */ + +function getCurrentNodeMethods() { + return http.METHODS && http.METHODS.map(function lowerCaseMethod(method) { + return method.toLowerCase(); + }); +} + +/** + * Get the "basic" Node.js methods, a snapshot from Node.js 0.10. 
+ * @private + */ + +function getBasicNodeMethods() { + return [ + 'get', + 'post', + 'put', + 'head', + 'delete', + 'options', + 'trace', + 'copy', + 'lock', + 'mkcol', + 'move', + 'purge', + 'propfind', + 'proppatch', + 'unlock', + 'report', + 'mkactivity', + 'checkout', + 'merge', + 'm-search', + 'notify', + 'subscribe', + 'unsubscribe', + 'patch', + 'search', + 'connect' + ]; +} diff --git a/node_modules/methods/package.json b/node_modules/methods/package.json new file mode 100644 index 000000000..0c2fe0fa3 --- /dev/null +++ b/node_modules/methods/package.json @@ -0,0 +1,79 @@ +{ + "_from": "methods@~1.1.2", + "_id": "methods@1.1.2", + "_inBundle": false, + "_integrity": "sha1-VSmk1nZUE07cxSZmVoNbD4Ua/O4=", + "_location": "/methods", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "methods@~1.1.2", + "name": "methods", + "escapedName": "methods", + "rawSpec": "~1.1.2", + "saveSpec": null, + "fetchSpec": "~1.1.2" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/methods/-/methods-1.1.2.tgz", + "_shasum": "5529a4d67654134edcc5266656835b0f851afcee", + "_spec": "methods@~1.1.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "browser": { + "http": false + }, + "bugs": { + "url": "https://github.com/jshttp/methods/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + }, + { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca", + "url": "http://tjholowaychuk.com" + } + ], + "deprecated": false, + "description": "HTTP methods that node supports", + "devDependencies": { + "istanbul": "0.4.1", + "mocha": "1.21.5" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "index.js", + "HISTORY.md", + "LICENSE" + ], + "homepage": "https://github.com/jshttp/methods#readme", + "keywords": [ + "http", + "methods" + ], + "license": "MIT", + "name": "methods", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/methods.git" + }, + "scripts": { + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" + }, + "version": "1.1.2" +} diff --git a/node_modules/mime-db/HISTORY.md b/node_modules/mime-db/HISTORY.md new file mode 100644 index 000000000..622adb90c --- /dev/null +++ b/node_modules/mime-db/HISTORY.md @@ -0,0 +1,492 @@ +1.50.0 / 2021-09-15 +=================== + + * Add deprecated iWorks mime types and extensions + * Add new upstream MIME types + +1.49.0 / 2021-07-26 +=================== + + * Add extension `.trig` to `application/trig` + * Add new upstream MIME types + +1.48.0 / 2021-05-30 +=================== + + * Add extension `.mvt` to `application/vnd.mapbox-vector-tile` + * Add new upstream MIME types + * Mark `text/yaml` as compressible + +1.47.0 / 2021-04-01 +=================== + + * Add new upstream MIME types + * Remove ambigious extensions from IANA for `application/*+xml` types + * Update primary extension to `.es` for `application/ecmascript` + +1.46.0 / 2021-02-13 +=================== + + * Add extension `.amr` to `audio/amr` + * Add extension `.m4s` to `video/iso.segment` + * Add extension `.opus` to 
`audio/ogg` + * Add new upstream MIME types + +1.45.0 / 2020-09-22 +=================== + + * Add `application/ubjson` with extension `.ubj` + * Add `image/avif` with extension `.avif` + * Add `image/ktx2` with extension `.ktx2` + * Add extension `.dbf` to `application/vnd.dbf` + * Add extension `.rar` to `application/vnd.rar` + * Add extension `.td` to `application/urc-targetdesc+xml` + * Add new upstream MIME types + * Fix extension of `application/vnd.apple.keynote` to be `.key` + +1.44.0 / 2020-04-22 +=================== + + * Add charsets from IANA + * Add extension `.cjs` to `application/node` + * Add new upstream MIME types + +1.43.0 / 2020-01-05 +=================== + + * Add `application/x-keepass2` with extension `.kdbx` + * Add extension `.mxmf` to `audio/mobile-xmf` + * Add extensions from IANA for `application/*+xml` types + * Add new upstream MIME types + +1.42.0 / 2019-09-25 +=================== + + * Add `image/vnd.ms-dds` with extension `.dds` + * Add new upstream MIME types + * Remove compressible from `multipart/mixed` + +1.41.0 / 2019-08-30 +=================== + + * Add new upstream MIME types + * Add `application/toml` with extension `.toml` + * Mark `font/ttf` as compressible + +1.40.0 / 2019-04-20 +=================== + + * Add extensions from IANA for `model/*` types + * Add `text/mdx` with extension `.mdx` + +1.39.0 / 2019-04-04 +=================== + + * Add extensions `.siv` and `.sieve` to `application/sieve` + * Add new upstream MIME types + +1.38.0 / 2019-02-04 +=================== + + * Add extension `.nq` to `application/n-quads` + * Add extension `.nt` to `application/n-triples` + * Add new upstream MIME types + * Mark `text/less` as compressible + +1.37.0 / 2018-10-19 +=================== + + * Add extensions to HEIC image types + * Add new upstream MIME types + +1.36.0 / 2018-08-20 +=================== + + * Add Apple file extensions from IANA + * Add extensions from IANA for `image/*` types + * Add new upstream MIME types + +1.35.0 / 2018-07-15 +=================== + + * Add extension `.owl` to `application/rdf+xml` + * Add new upstream MIME types + - Removes extension `.woff` from `application/font-woff` + +1.34.0 / 2018-06-03 +=================== + + * Add extension `.csl` to `application/vnd.citationstyles.style+xml` + * Add extension `.es` to `application/ecmascript` + * Add new upstream MIME types + * Add `UTF-8` as default charset for `text/turtle` + * Mark all XML-derived types as compressible + +1.33.0 / 2018-02-15 +=================== + + * Add extensions from IANA for `message/*` types + * Add new upstream MIME types + * Fix some incorrect OOXML types + * Remove `application/font-woff2` + +1.32.0 / 2017-11-29 +=================== + + * Add new upstream MIME types + * Update `text/hjson` to registered `application/hjson` + * Add `text/shex` with extension `.shex` + +1.31.0 / 2017-10-25 +=================== + + * Add `application/raml+yaml` with extension `.raml` + * Add `application/wasm` with extension `.wasm` + * Add new `font` type from IANA + * Add new upstream font extensions + * Add new upstream MIME types + * Add extensions for JPEG-2000 images + +1.30.0 / 2017-08-27 +=================== + + * Add `application/vnd.ms-outlook` + * Add `application/x-arj` + * Add extension `.mjs` to `application/javascript` + * Add glTF types and extensions + * Add new upstream MIME types + * Add `text/x-org` + * Add VirtualBox MIME types + * Fix `source` records for `video/*` types that are IANA + * Update `font/opentype` to registered `font/otf` + 
+1.29.0 / 2017-07-10 +=================== + + * Add `application/fido.trusted-apps+json` + * Add extension `.wadl` to `application/vnd.sun.wadl+xml` + * Add new upstream MIME types + * Add `UTF-8` as default charset for `text/css` + +1.28.0 / 2017-05-14 +=================== + + * Add new upstream MIME types + * Add extension `.gz` to `application/gzip` + * Update extensions `.md` and `.markdown` to be `text/markdown` + +1.27.0 / 2017-03-16 +=================== + + * Add new upstream MIME types + * Add `image/apng` with extension `.apng` + +1.26.0 / 2017-01-14 +=================== + + * Add new upstream MIME types + * Add extension `.geojson` to `application/geo+json` + +1.25.0 / 2016-11-11 +=================== + + * Add new upstream MIME types + +1.24.0 / 2016-09-18 +=================== + + * Add `audio/mp3` + * Add new upstream MIME types + +1.23.0 / 2016-05-01 +=================== + + * Add new upstream MIME types + * Add extension `.3gpp` to `audio/3gpp` + +1.22.0 / 2016-02-15 +=================== + + * Add `text/slim` + * Add extension `.rng` to `application/xml` + * Add new upstream MIME types + * Fix extension of `application/dash+xml` to be `.mpd` + * Update primary extension to `.m4a` for `audio/mp4` + +1.21.0 / 2016-01-06 +=================== + + * Add Google document types + * Add new upstream MIME types + +1.20.0 / 2015-11-10 +=================== + + * Add `text/x-suse-ymp` + * Add new upstream MIME types + +1.19.0 / 2015-09-17 +=================== + + * Add `application/vnd.apple.pkpass` + * Add new upstream MIME types + +1.18.0 / 2015-09-03 +=================== + + * Add new upstream MIME types + +1.17.0 / 2015-08-13 +=================== + + * Add `application/x-msdos-program` + * Add `audio/g711-0` + * Add `image/vnd.mozilla.apng` + * Add extension `.exe` to `application/x-msdos-program` + +1.16.0 / 2015-07-29 +=================== + + * Add `application/vnd.uri-map` + +1.15.0 / 2015-07-13 +=================== + + * Add `application/x-httpd-php` + +1.14.0 / 2015-06-25 +=================== + + * Add `application/scim+json` + * Add `application/vnd.3gpp.ussd+xml` + * Add `application/vnd.biopax.rdf+xml` + * Add `text/x-processing` + +1.13.0 / 2015-06-07 +=================== + + * Add nginx as a source + * Add `application/x-cocoa` + * Add `application/x-java-archive-diff` + * Add `application/x-makeself` + * Add `application/x-perl` + * Add `application/x-pilot` + * Add `application/x-redhat-package-manager` + * Add `application/x-sea` + * Add `audio/x-m4a` + * Add `audio/x-realaudio` + * Add `image/x-jng` + * Add `text/mathml` + +1.12.0 / 2015-06-05 +=================== + + * Add `application/bdoc` + * Add `application/vnd.hyperdrive+json` + * Add `application/x-bdoc` + * Add extension `.rtf` to `text/rtf` + +1.11.0 / 2015-05-31 +=================== + + * Add `audio/wav` + * Add `audio/wave` + * Add extension `.litcoffee` to `text/coffeescript` + * Add extension `.sfd-hdstx` to `application/vnd.hydrostatix.sof-data` + * Add extension `.n-gage` to `application/vnd.nokia.n-gage.symbian.install` + +1.10.0 / 2015-05-19 +=================== + + * Add `application/vnd.balsamiq.bmpr` + * Add `application/vnd.microsoft.portable-executable` + * Add `application/x-ns-proxy-autoconfig` + +1.9.1 / 2015-04-19 +================== + + * Remove `.json` extension from `application/manifest+json` + - This is causing bugs downstream + +1.9.0 / 2015-04-19 +================== + + * Add `application/manifest+json` + * Add `application/vnd.micro+json` + * Add `image/vnd.zbrush.pcx` + * Add 
`image/x-ms-bmp` + +1.8.0 / 2015-03-13 +================== + + * Add `application/vnd.citationstyles.style+xml` + * Add `application/vnd.fastcopy-disk-image` + * Add `application/vnd.gov.sk.xmldatacontainer+xml` + * Add extension `.jsonld` to `application/ld+json` + +1.7.0 / 2015-02-08 +================== + + * Add `application/vnd.gerber` + * Add `application/vnd.msa-disk-image` + +1.6.1 / 2015-02-05 +================== + + * Community extensions ownership transferred from `node-mime` + +1.6.0 / 2015-01-29 +================== + + * Add `application/jose` + * Add `application/jose+json` + * Add `application/json-seq` + * Add `application/jwk+json` + * Add `application/jwk-set+json` + * Add `application/jwt` + * Add `application/rdap+json` + * Add `application/vnd.gov.sk.e-form+xml` + * Add `application/vnd.ims.imsccv1p3` + +1.5.0 / 2014-12-30 +================== + + * Add `application/vnd.oracle.resource+json` + * Fix various invalid MIME type entries + - `application/mbox+xml` + - `application/oscp-response` + - `application/vwg-multiplexed` + - `audio/g721` + +1.4.0 / 2014-12-21 +================== + + * Add `application/vnd.ims.imsccv1p2` + * Fix various invalid MIME type entries + - `application/vnd-acucobol` + - `application/vnd-curl` + - `application/vnd-dart` + - `application/vnd-dxr` + - `application/vnd-fdf` + - `application/vnd-mif` + - `application/vnd-sema` + - `application/vnd-wap-wmlc` + - `application/vnd.adobe.flash-movie` + - `application/vnd.dece-zip` + - `application/vnd.dvb_service` + - `application/vnd.micrografx-igx` + - `application/vnd.sealed-doc` + - `application/vnd.sealed-eml` + - `application/vnd.sealed-mht` + - `application/vnd.sealed-ppt` + - `application/vnd.sealed-tiff` + - `application/vnd.sealed-xls` + - `application/vnd.sealedmedia.softseal-html` + - `application/vnd.sealedmedia.softseal-pdf` + - `application/vnd.wap-slc` + - `application/vnd.wap-wbxml` + - `audio/vnd.sealedmedia.softseal-mpeg` + - `image/vnd-djvu` + - `image/vnd-svf` + - `image/vnd-wap-wbmp` + - `image/vnd.sealed-png` + - `image/vnd.sealedmedia.softseal-gif` + - `image/vnd.sealedmedia.softseal-jpg` + - `model/vnd-dwf` + - `model/vnd.parasolid.transmit-binary` + - `model/vnd.parasolid.transmit-text` + - `text/vnd-a` + - `text/vnd-curl` + - `text/vnd.wap-wml` + * Remove example template MIME types + - `application/example` + - `audio/example` + - `image/example` + - `message/example` + - `model/example` + - `multipart/example` + - `text/example` + - `video/example` + +1.3.1 / 2014-12-16 +================== + + * Fix missing extensions + - `application/json5` + - `text/hjson` + +1.3.0 / 2014-12-07 +================== + + * Add `application/a2l` + * Add `application/aml` + * Add `application/atfx` + * Add `application/atxml` + * Add `application/cdfx+xml` + * Add `application/dii` + * Add `application/json5` + * Add `application/lxf` + * Add `application/mf4` + * Add `application/vnd.apache.thrift.compact` + * Add `application/vnd.apache.thrift.json` + * Add `application/vnd.coffeescript` + * Add `application/vnd.enphase.envoy` + * Add `application/vnd.ims.imsccv1p1` + * Add `text/csv-schema` + * Add `text/hjson` + * Add `text/markdown` + * Add `text/yaml` + +1.2.0 / 2014-11-09 +================== + + * Add `application/cea` + * Add `application/dit` + * Add `application/vnd.gov.sk.e-form+zip` + * Add `application/vnd.tmd.mediaflex.api+xml` + * Type `application/epub+zip` is now IANA-registered + +1.1.2 / 2014-10-23 +================== + + * Rebuild database for 
`application/x-www-form-urlencoded` change + +1.1.1 / 2014-10-20 +================== + + * Mark `application/x-www-form-urlencoded` as compressible. + +1.1.0 / 2014-09-28 +================== + + * Add `application/font-woff2` + +1.0.3 / 2014-09-25 +================== + + * Fix engine requirement in package + +1.0.2 / 2014-09-25 +================== + + * Add `application/coap-group+json` + * Add `application/dcd` + * Add `application/vnd.apache.thrift.binary` + * Add `image/vnd.tencent.tap` + * Mark all JSON-derived types as compressible + * Update `text/vtt` data + +1.0.1 / 2014-08-30 +================== + + * Fix extension ordering + +1.0.0 / 2014-08-30 +================== + + * Add `application/atf` + * Add `application/merge-patch+json` + * Add `multipart/x-mixed-replace` + * Add `source: 'apache'` metadata + * Add `source: 'iana'` metadata + * Remove badly-assumed charset data diff --git a/node_modules/mime-db/LICENSE b/node_modules/mime-db/LICENSE new file mode 100644 index 000000000..a7ae8ee9b --- /dev/null +++ b/node_modules/mime-db/LICENSE @@ -0,0 +1,22 @@ + +The MIT License (MIT) + +Copyright (c) 2014 Jonathan Ong me@jongleberry.com + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/mime-db/README.md b/node_modules/mime-db/README.md new file mode 100644 index 000000000..41c696a30 --- /dev/null +++ b/node_modules/mime-db/README.md @@ -0,0 +1,100 @@ +# mime-db + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Node.js Version][node-image]][node-url] +[![Build Status][ci-image]][ci-url] +[![Coverage Status][coveralls-image]][coveralls-url] + +This is a database of all mime types. +It consists of a single, public JSON file and does not include any logic, +allowing it to remain as un-opinionated as possible with an API. +It aggregates data from the following sources: + +- http://www.iana.org/assignments/media-types/media-types.xhtml +- http://svn.apache.org/repos/asf/httpd/httpd/trunk/docs/conf/mime.types +- http://hg.nginx.org/nginx/raw-file/default/conf/mime.types + +## Installation + +```bash +npm install mime-db +``` + +### Database Download + +If you're crazy enough to use this in the browser, you can just grab the +JSON file using [jsDelivr](https://www.jsdelivr.com/). It is recommended to +replace `master` with [a release tag](https://github.com/jshttp/mime-db/tags) +as the JSON format may change in the future. 
+ +``` +https://cdn.jsdelivr.net/gh/jshttp/mime-db@master/db.json +``` + +## Usage + +```js +var db = require('mime-db') + +// grab data on .js files +var data = db['application/javascript'] +``` + +## Data Structure + +The JSON file is a map lookup for lowercased mime types. +Each mime type has the following properties: + +- `.source` - where the mime type is defined. + If not set, it's probably a custom media type. + - `apache` - [Apache common media types](http://svn.apache.org/repos/asf/httpd/httpd/trunk/docs/conf/mime.types) + - `iana` - [IANA-defined media types](http://www.iana.org/assignments/media-types/media-types.xhtml) + - `nginx` - [nginx media types](http://hg.nginx.org/nginx/raw-file/default/conf/mime.types) +- `.extensions[]` - known extensions associated with this mime type. +- `.compressible` - whether a file of this type can be gzipped. +- `.charset` - the default charset associated with this type, if any. + +If unknown, every property could be `undefined`. + +## Contributing + +To edit the database, only make PRs against `src/custom-types.json` or +`src/custom-suffix.json`. + +The `src/custom-types.json` file is a JSON object with the MIME type as the +keys and the values being an object with the following keys: + +- `compressible` - leave out if you don't know, otherwise `true`/`false` to + indicate whether the data represented by the type is typically compressible. +- `extensions` - include an array of file extensions that are associated with + the type. +- `notes` - human-readable notes about the type, typically what the type is. +- `sources` - include an array of URLs of where the MIME type and the associated + extensions are sourced from. This needs to be a [primary source](https://en.wikipedia.org/wiki/Primary_source); + links to type aggregating sites and Wikipedia are _not acceptable_. + +To update the build, run `npm run build`. + +### Adding Custom Media Types + +The best way to get new media types included in this library is to register +them with the IANA. The community registration procedure is outlined in +[RFC 6838 section 5](http://tools.ietf.org/html/rfc6838#section-5). Types +registered with the IANA are automatically pulled into this library. + +If that is not possible / feasible, they can be added directly here as a +"custom" type. To do this, it is required to have a primary source that +definitively lists the media type. If an extension is going to be listed as +associated with this media type, the source must definitively link the +media type and extension as well.
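Since `mime-db` ships only the JSON data described above, any lookup logic has to live elsewhere. The snippet below is a minimal sketch, based only on the documented `db.json` shape, of how an extension-to-type table can be derived from it; the `extensionToType` helper and the sample outputs are illustrative and are not part of mime-db's API (packages such as `mime-types` provide a production version of this kind of lookup).

```js
// Minimal sketch (not part of mime-db): build an extension -> MIME type
// lookup from the db.json shape documented above. mime-db ships only the
// data; packages such as mime-types layer lookup logic like this on top.
var db = require('mime-db')

var extensionToType = {}
Object.keys(db).forEach(function (type) {
  var exts = db[type].extensions || []
  exts.forEach(function (ext) {
    // keep the first type that claims an extension; real resolvers
    // break ties using the source field instead
    if (!extensionToType[ext]) extensionToType[ext] = type
  })
})

console.log(extensionToType['mpd'])              // 'application/dash+xml'
console.log(db['application/json'].extensions)   // [ 'json', 'map' ]
console.log(db['application/json'].compressible) // true
```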
+ +[ci-image]: https://badgen.net/github/checks/jshttp/mime-db/master?label=ci +[ci-url]: https://github.com/jshttp/mime-db/actions?query=workflow%3Aci +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/mime-db/master +[coveralls-url]: https://coveralls.io/r/jshttp/mime-db?branch=master +[node-image]: https://badgen.net/npm/node/mime-db +[node-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/mime-db +[npm-url]: https://npmjs.org/package/mime-db +[npm-version-image]: https://badgen.net/npm/v/mime-db diff --git a/node_modules/mime-db/db.json b/node_modules/mime-db/db.json new file mode 100644 index 000000000..29786a79e --- /dev/null +++ b/node_modules/mime-db/db.json @@ -0,0 +1,8453 @@ +{ + "application/1d-interleaved-parityfec": { + "source": "iana" + }, + "application/3gpdash-qoe-report+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/3gpp-ims+xml": { + "source": "iana", + "compressible": true + }, + "application/3gpphal+json": { + "source": "iana", + "compressible": true + }, + "application/3gpphalforms+json": { + "source": "iana", + "compressible": true + }, + "application/a2l": { + "source": "iana" + }, + "application/ace+cbor": { + "source": "iana" + }, + "application/activemessage": { + "source": "iana" + }, + "application/activity+json": { + "source": "iana", + "compressible": true + }, + "application/alto-costmap+json": { + "source": "iana", + "compressible": true + }, + "application/alto-costmapfilter+json": { + "source": "iana", + "compressible": true + }, + "application/alto-directory+json": { + "source": "iana", + "compressible": true + }, + "application/alto-endpointcost+json": { + "source": "iana", + "compressible": true + }, + "application/alto-endpointcostparams+json": { + "source": "iana", + "compressible": true + }, + "application/alto-endpointprop+json": { + "source": "iana", + "compressible": true + }, + "application/alto-endpointpropparams+json": { + "source": "iana", + "compressible": true + }, + "application/alto-error+json": { + "source": "iana", + "compressible": true + }, + "application/alto-networkmap+json": { + "source": "iana", + "compressible": true + }, + "application/alto-networkmapfilter+json": { + "source": "iana", + "compressible": true + }, + "application/alto-updatestreamcontrol+json": { + "source": "iana", + "compressible": true + }, + "application/alto-updatestreamparams+json": { + "source": "iana", + "compressible": true + }, + "application/aml": { + "source": "iana" + }, + "application/andrew-inset": { + "source": "iana", + "extensions": ["ez"] + }, + "application/applefile": { + "source": "iana" + }, + "application/applixware": { + "source": "apache", + "extensions": ["aw"] + }, + "application/at+jwt": { + "source": "iana" + }, + "application/atf": { + "source": "iana" + }, + "application/atfx": { + "source": "iana" + }, + "application/atom+xml": { + "source": "iana", + "compressible": true, + "extensions": ["atom"] + }, + "application/atomcat+xml": { + "source": "iana", + "compressible": true, + "extensions": ["atomcat"] + }, + "application/atomdeleted+xml": { + "source": "iana", + "compressible": true, + "extensions": ["atomdeleted"] + }, + "application/atomicmail": { + "source": "iana" + }, + "application/atomsvc+xml": { + "source": "iana", + "compressible": true, + "extensions": ["atomsvc"] + }, + "application/atsc-dwd+xml": { + "source": "iana", + "compressible": true, + "extensions": ["dwd"] + }, + "application/atsc-dynamic-event-message": { + 
"source": "iana" + }, + "application/atsc-held+xml": { + "source": "iana", + "compressible": true, + "extensions": ["held"] + }, + "application/atsc-rdt+json": { + "source": "iana", + "compressible": true + }, + "application/atsc-rsat+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rsat"] + }, + "application/atxml": { + "source": "iana" + }, + "application/auth-policy+xml": { + "source": "iana", + "compressible": true + }, + "application/bacnet-xdd+zip": { + "source": "iana", + "compressible": false + }, + "application/batch-smtp": { + "source": "iana" + }, + "application/bdoc": { + "compressible": false, + "extensions": ["bdoc"] + }, + "application/beep+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/calendar+json": { + "source": "iana", + "compressible": true + }, + "application/calendar+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xcs"] + }, + "application/call-completion": { + "source": "iana" + }, + "application/cals-1840": { + "source": "iana" + }, + "application/captive+json": { + "source": "iana", + "compressible": true + }, + "application/cbor": { + "source": "iana" + }, + "application/cbor-seq": { + "source": "iana" + }, + "application/cccex": { + "source": "iana" + }, + "application/ccmp+xml": { + "source": "iana", + "compressible": true + }, + "application/ccxml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["ccxml"] + }, + "application/cdfx+xml": { + "source": "iana", + "compressible": true, + "extensions": ["cdfx"] + }, + "application/cdmi-capability": { + "source": "iana", + "extensions": ["cdmia"] + }, + "application/cdmi-container": { + "source": "iana", + "extensions": ["cdmic"] + }, + "application/cdmi-domain": { + "source": "iana", + "extensions": ["cdmid"] + }, + "application/cdmi-object": { + "source": "iana", + "extensions": ["cdmio"] + }, + "application/cdmi-queue": { + "source": "iana", + "extensions": ["cdmiq"] + }, + "application/cdni": { + "source": "iana" + }, + "application/cea": { + "source": "iana" + }, + "application/cea-2018+xml": { + "source": "iana", + "compressible": true + }, + "application/cellml+xml": { + "source": "iana", + "compressible": true + }, + "application/cfw": { + "source": "iana" + }, + "application/clr": { + "source": "iana" + }, + "application/clue+xml": { + "source": "iana", + "compressible": true + }, + "application/clue_info+xml": { + "source": "iana", + "compressible": true + }, + "application/cms": { + "source": "iana" + }, + "application/cnrp+xml": { + "source": "iana", + "compressible": true + }, + "application/coap-group+json": { + "source": "iana", + "compressible": true + }, + "application/coap-payload": { + "source": "iana" + }, + "application/commonground": { + "source": "iana" + }, + "application/conference-info+xml": { + "source": "iana", + "compressible": true + }, + "application/cose": { + "source": "iana" + }, + "application/cose-key": { + "source": "iana" + }, + "application/cose-key-set": { + "source": "iana" + }, + "application/cpl+xml": { + "source": "iana", + "compressible": true + }, + "application/csrattrs": { + "source": "iana" + }, + "application/csta+xml": { + "source": "iana", + "compressible": true + }, + "application/cstadata+xml": { + "source": "iana", + "compressible": true + }, + "application/csvm+json": { + "source": "iana", + "compressible": true + }, + "application/cu-seeme": { + "source": "apache", + "extensions": ["cu"] + }, + "application/cwt": { + "source": "iana" + }, + 
"application/cybercash": { + "source": "iana" + }, + "application/dart": { + "compressible": true + }, + "application/dash+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mpd"] + }, + "application/dashdelta": { + "source": "iana" + }, + "application/davmount+xml": { + "source": "iana", + "compressible": true, + "extensions": ["davmount"] + }, + "application/dca-rft": { + "source": "iana" + }, + "application/dcd": { + "source": "iana" + }, + "application/dec-dx": { + "source": "iana" + }, + "application/dialog-info+xml": { + "source": "iana", + "compressible": true + }, + "application/dicom": { + "source": "iana" + }, + "application/dicom+json": { + "source": "iana", + "compressible": true + }, + "application/dicom+xml": { + "source": "iana", + "compressible": true + }, + "application/dii": { + "source": "iana" + }, + "application/dit": { + "source": "iana" + }, + "application/dns": { + "source": "iana" + }, + "application/dns+json": { + "source": "iana", + "compressible": true + }, + "application/dns-message": { + "source": "iana" + }, + "application/docbook+xml": { + "source": "apache", + "compressible": true, + "extensions": ["dbk"] + }, + "application/dots+cbor": { + "source": "iana" + }, + "application/dskpp+xml": { + "source": "iana", + "compressible": true + }, + "application/dssc+der": { + "source": "iana", + "extensions": ["dssc"] + }, + "application/dssc+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xdssc"] + }, + "application/dvcs": { + "source": "iana" + }, + "application/ecmascript": { + "source": "iana", + "compressible": true, + "extensions": ["es","ecma"] + }, + "application/edi-consent": { + "source": "iana" + }, + "application/edi-x12": { + "source": "iana", + "compressible": false + }, + "application/edifact": { + "source": "iana", + "compressible": false + }, + "application/efi": { + "source": "iana" + }, + "application/elm+json": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/elm+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.cap+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/emergencycalldata.comment+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.control+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.deviceinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.ecall.msd": { + "source": "iana" + }, + "application/emergencycalldata.providerinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.serviceinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.subscriberinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/emergencycalldata.veds+xml": { + "source": "iana", + "compressible": true + }, + "application/emma+xml": { + "source": "iana", + "compressible": true, + "extensions": ["emma"] + }, + "application/emotionml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["emotionml"] + }, + "application/encaprtp": { + "source": "iana" + }, + "application/epp+xml": { + "source": "iana", + "compressible": true + }, + "application/epub+zip": { + "source": "iana", + "compressible": false, + "extensions": ["epub"] + }, + "application/eshop": { + "source": "iana" + }, + "application/exi": { + "source": "iana", + "extensions": ["exi"] + }, + 
"application/expect-ct-report+json": { + "source": "iana", + "compressible": true + }, + "application/express": { + "source": "iana", + "extensions": ["exp"] + }, + "application/fastinfoset": { + "source": "iana" + }, + "application/fastsoap": { + "source": "iana" + }, + "application/fdt+xml": { + "source": "iana", + "compressible": true, + "extensions": ["fdt"] + }, + "application/fhir+json": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/fhir+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/fido.trusted-apps+json": { + "compressible": true + }, + "application/fits": { + "source": "iana" + }, + "application/flexfec": { + "source": "iana" + }, + "application/font-sfnt": { + "source": "iana" + }, + "application/font-tdpfr": { + "source": "iana", + "extensions": ["pfr"] + }, + "application/font-woff": { + "source": "iana", + "compressible": false + }, + "application/framework-attributes+xml": { + "source": "iana", + "compressible": true + }, + "application/geo+json": { + "source": "iana", + "compressible": true, + "extensions": ["geojson"] + }, + "application/geo+json-seq": { + "source": "iana" + }, + "application/geopackage+sqlite3": { + "source": "iana" + }, + "application/geoxacml+xml": { + "source": "iana", + "compressible": true + }, + "application/gltf-buffer": { + "source": "iana" + }, + "application/gml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["gml"] + }, + "application/gpx+xml": { + "source": "apache", + "compressible": true, + "extensions": ["gpx"] + }, + "application/gxf": { + "source": "apache", + "extensions": ["gxf"] + }, + "application/gzip": { + "source": "iana", + "compressible": false, + "extensions": ["gz"] + }, + "application/h224": { + "source": "iana" + }, + "application/held+xml": { + "source": "iana", + "compressible": true + }, + "application/hjson": { + "extensions": ["hjson"] + }, + "application/http": { + "source": "iana" + }, + "application/hyperstudio": { + "source": "iana", + "extensions": ["stk"] + }, + "application/ibe-key-request+xml": { + "source": "iana", + "compressible": true + }, + "application/ibe-pkg-reply+xml": { + "source": "iana", + "compressible": true + }, + "application/ibe-pp-data": { + "source": "iana" + }, + "application/iges": { + "source": "iana" + }, + "application/im-iscomposing+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/index": { + "source": "iana" + }, + "application/index.cmd": { + "source": "iana" + }, + "application/index.obj": { + "source": "iana" + }, + "application/index.response": { + "source": "iana" + }, + "application/index.vnd": { + "source": "iana" + }, + "application/inkml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["ink","inkml"] + }, + "application/iotp": { + "source": "iana" + }, + "application/ipfix": { + "source": "iana", + "extensions": ["ipfix"] + }, + "application/ipp": { + "source": "iana" + }, + "application/isup": { + "source": "iana" + }, + "application/its+xml": { + "source": "iana", + "compressible": true, + "extensions": ["its"] + }, + "application/java-archive": { + "source": "apache", + "compressible": false, + "extensions": ["jar","war","ear"] + }, + "application/java-serialized-object": { + "source": "apache", + "compressible": false, + "extensions": ["ser"] + }, + "application/java-vm": { + "source": "apache", + "compressible": false, + "extensions": ["class"] + }, + "application/javascript": { + "source": "iana", + 
"charset": "UTF-8", + "compressible": true, + "extensions": ["js","mjs"] + }, + "application/jf2feed+json": { + "source": "iana", + "compressible": true + }, + "application/jose": { + "source": "iana" + }, + "application/jose+json": { + "source": "iana", + "compressible": true + }, + "application/jrd+json": { + "source": "iana", + "compressible": true + }, + "application/jscalendar+json": { + "source": "iana", + "compressible": true + }, + "application/json": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["json","map"] + }, + "application/json-patch+json": { + "source": "iana", + "compressible": true + }, + "application/json-seq": { + "source": "iana" + }, + "application/json5": { + "extensions": ["json5"] + }, + "application/jsonml+json": { + "source": "apache", + "compressible": true, + "extensions": ["jsonml"] + }, + "application/jwk+json": { + "source": "iana", + "compressible": true + }, + "application/jwk-set+json": { + "source": "iana", + "compressible": true + }, + "application/jwt": { + "source": "iana" + }, + "application/kpml-request+xml": { + "source": "iana", + "compressible": true + }, + "application/kpml-response+xml": { + "source": "iana", + "compressible": true + }, + "application/ld+json": { + "source": "iana", + "compressible": true, + "extensions": ["jsonld"] + }, + "application/lgr+xml": { + "source": "iana", + "compressible": true, + "extensions": ["lgr"] + }, + "application/link-format": { + "source": "iana" + }, + "application/load-control+xml": { + "source": "iana", + "compressible": true + }, + "application/lost+xml": { + "source": "iana", + "compressible": true, + "extensions": ["lostxml"] + }, + "application/lostsync+xml": { + "source": "iana", + "compressible": true + }, + "application/lpf+zip": { + "source": "iana", + "compressible": false + }, + "application/lxf": { + "source": "iana" + }, + "application/mac-binhex40": { + "source": "iana", + "extensions": ["hqx"] + }, + "application/mac-compactpro": { + "source": "apache", + "extensions": ["cpt"] + }, + "application/macwriteii": { + "source": "iana" + }, + "application/mads+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mads"] + }, + "application/manifest+json": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["webmanifest"] + }, + "application/marc": { + "source": "iana", + "extensions": ["mrc"] + }, + "application/marcxml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mrcx"] + }, + "application/mathematica": { + "source": "iana", + "extensions": ["ma","nb","mb"] + }, + "application/mathml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mathml"] + }, + "application/mathml-content+xml": { + "source": "iana", + "compressible": true + }, + "application/mathml-presentation+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-associated-procedure-description+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-deregister+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-envelope+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-msk+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-msk-response+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-protection-description+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-reception-report+xml": { + "source": "iana", + "compressible": true + }, + 
"application/mbms-register+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-register-response+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-schedule+xml": { + "source": "iana", + "compressible": true + }, + "application/mbms-user-service-description+xml": { + "source": "iana", + "compressible": true + }, + "application/mbox": { + "source": "iana", + "extensions": ["mbox"] + }, + "application/media-policy-dataset+xml": { + "source": "iana", + "compressible": true + }, + "application/media_control+xml": { + "source": "iana", + "compressible": true + }, + "application/mediaservercontrol+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mscml"] + }, + "application/merge-patch+json": { + "source": "iana", + "compressible": true + }, + "application/metalink+xml": { + "source": "apache", + "compressible": true, + "extensions": ["metalink"] + }, + "application/metalink4+xml": { + "source": "iana", + "compressible": true, + "extensions": ["meta4"] + }, + "application/mets+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mets"] + }, + "application/mf4": { + "source": "iana" + }, + "application/mikey": { + "source": "iana" + }, + "application/mipc": { + "source": "iana" + }, + "application/missing-blocks+cbor-seq": { + "source": "iana" + }, + "application/mmt-aei+xml": { + "source": "iana", + "compressible": true, + "extensions": ["maei"] + }, + "application/mmt-usd+xml": { + "source": "iana", + "compressible": true, + "extensions": ["musd"] + }, + "application/mods+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mods"] + }, + "application/moss-keys": { + "source": "iana" + }, + "application/moss-signature": { + "source": "iana" + }, + "application/mosskey-data": { + "source": "iana" + }, + "application/mosskey-request": { + "source": "iana" + }, + "application/mp21": { + "source": "iana", + "extensions": ["m21","mp21"] + }, + "application/mp4": { + "source": "iana", + "extensions": ["mp4s","m4p"] + }, + "application/mpeg4-generic": { + "source": "iana" + }, + "application/mpeg4-iod": { + "source": "iana" + }, + "application/mpeg4-iod-xmt": { + "source": "iana" + }, + "application/mrb-consumer+xml": { + "source": "iana", + "compressible": true + }, + "application/mrb-publish+xml": { + "source": "iana", + "compressible": true + }, + "application/msc-ivr+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/msc-mixer+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/msword": { + "source": "iana", + "compressible": false, + "extensions": ["doc","dot"] + }, + "application/mud+json": { + "source": "iana", + "compressible": true + }, + "application/multipart-core": { + "source": "iana" + }, + "application/mxf": { + "source": "iana", + "extensions": ["mxf"] + }, + "application/n-quads": { + "source": "iana", + "extensions": ["nq"] + }, + "application/n-triples": { + "source": "iana", + "extensions": ["nt"] + }, + "application/nasdata": { + "source": "iana" + }, + "application/news-checkgroups": { + "source": "iana", + "charset": "US-ASCII" + }, + "application/news-groupinfo": { + "source": "iana", + "charset": "US-ASCII" + }, + "application/news-transmission": { + "source": "iana" + }, + "application/nlsml+xml": { + "source": "iana", + "compressible": true + }, + "application/node": { + "source": "iana", + "extensions": ["cjs"] + }, + "application/nss": { + "source": "iana" + }, + 
"application/oauth-authz-req+jwt": { + "source": "iana" + }, + "application/ocsp-request": { + "source": "iana" + }, + "application/ocsp-response": { + "source": "iana" + }, + "application/octet-stream": { + "source": "iana", + "compressible": false, + "extensions": ["bin","dms","lrf","mar","so","dist","distz","pkg","bpk","dump","elc","deploy","exe","dll","deb","dmg","iso","img","msi","msp","msm","buffer"] + }, + "application/oda": { + "source": "iana", + "extensions": ["oda"] + }, + "application/odm+xml": { + "source": "iana", + "compressible": true + }, + "application/odx": { + "source": "iana" + }, + "application/oebps-package+xml": { + "source": "iana", + "compressible": true, + "extensions": ["opf"] + }, + "application/ogg": { + "source": "iana", + "compressible": false, + "extensions": ["ogx"] + }, + "application/omdoc+xml": { + "source": "apache", + "compressible": true, + "extensions": ["omdoc"] + }, + "application/onenote": { + "source": "apache", + "extensions": ["onetoc","onetoc2","onetmp","onepkg"] + }, + "application/opc-nodeset+xml": { + "source": "iana", + "compressible": true + }, + "application/oscore": { + "source": "iana" + }, + "application/oxps": { + "source": "iana", + "extensions": ["oxps"] + }, + "application/p21": { + "source": "iana" + }, + "application/p21+zip": { + "source": "iana", + "compressible": false + }, + "application/p2p-overlay+xml": { + "source": "iana", + "compressible": true, + "extensions": ["relo"] + }, + "application/parityfec": { + "source": "iana" + }, + "application/passport": { + "source": "iana" + }, + "application/patch-ops-error+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xer"] + }, + "application/pdf": { + "source": "iana", + "compressible": false, + "extensions": ["pdf"] + }, + "application/pdx": { + "source": "iana" + }, + "application/pem-certificate-chain": { + "source": "iana" + }, + "application/pgp-encrypted": { + "source": "iana", + "compressible": false, + "extensions": ["pgp"] + }, + "application/pgp-keys": { + "source": "iana" + }, + "application/pgp-signature": { + "source": "iana", + "extensions": ["asc","sig"] + }, + "application/pics-rules": { + "source": "apache", + "extensions": ["prf"] + }, + "application/pidf+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/pidf-diff+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/pkcs10": { + "source": "iana", + "extensions": ["p10"] + }, + "application/pkcs12": { + "source": "iana" + }, + "application/pkcs7-mime": { + "source": "iana", + "extensions": ["p7m","p7c"] + }, + "application/pkcs7-signature": { + "source": "iana", + "extensions": ["p7s"] + }, + "application/pkcs8": { + "source": "iana", + "extensions": ["p8"] + }, + "application/pkcs8-encrypted": { + "source": "iana" + }, + "application/pkix-attr-cert": { + "source": "iana", + "extensions": ["ac"] + }, + "application/pkix-cert": { + "source": "iana", + "extensions": ["cer"] + }, + "application/pkix-crl": { + "source": "iana", + "extensions": ["crl"] + }, + "application/pkix-pkipath": { + "source": "iana", + "extensions": ["pkipath"] + }, + "application/pkixcmp": { + "source": "iana", + "extensions": ["pki"] + }, + "application/pls+xml": { + "source": "iana", + "compressible": true, + "extensions": ["pls"] + }, + "application/poc-settings+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/postscript": { + "source": "iana", + "compressible": true, + "extensions": 
["ai","eps","ps"] + }, + "application/ppsp-tracker+json": { + "source": "iana", + "compressible": true + }, + "application/problem+json": { + "source": "iana", + "compressible": true + }, + "application/problem+xml": { + "source": "iana", + "compressible": true + }, + "application/provenance+xml": { + "source": "iana", + "compressible": true, + "extensions": ["provx"] + }, + "application/prs.alvestrand.titrax-sheet": { + "source": "iana" + }, + "application/prs.cww": { + "source": "iana", + "extensions": ["cww"] + }, + "application/prs.cyn": { + "source": "iana", + "charset": "7-BIT" + }, + "application/prs.hpub+zip": { + "source": "iana", + "compressible": false + }, + "application/prs.nprend": { + "source": "iana" + }, + "application/prs.plucker": { + "source": "iana" + }, + "application/prs.rdf-xml-crypt": { + "source": "iana" + }, + "application/prs.xsf+xml": { + "source": "iana", + "compressible": true + }, + "application/pskc+xml": { + "source": "iana", + "compressible": true, + "extensions": ["pskcxml"] + }, + "application/pvd+json": { + "source": "iana", + "compressible": true + }, + "application/qsig": { + "source": "iana" + }, + "application/raml+yaml": { + "compressible": true, + "extensions": ["raml"] + }, + "application/raptorfec": { + "source": "iana" + }, + "application/rdap+json": { + "source": "iana", + "compressible": true + }, + "application/rdf+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rdf","owl"] + }, + "application/reginfo+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rif"] + }, + "application/relax-ng-compact-syntax": { + "source": "iana", + "extensions": ["rnc"] + }, + "application/remote-printing": { + "source": "iana" + }, + "application/reputon+json": { + "source": "iana", + "compressible": true + }, + "application/resource-lists+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rl"] + }, + "application/resource-lists-diff+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rld"] + }, + "application/rfc+xml": { + "source": "iana", + "compressible": true + }, + "application/riscos": { + "source": "iana" + }, + "application/rlmi+xml": { + "source": "iana", + "compressible": true + }, + "application/rls-services+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rs"] + }, + "application/route-apd+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rapd"] + }, + "application/route-s-tsid+xml": { + "source": "iana", + "compressible": true, + "extensions": ["sls"] + }, + "application/route-usd+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rusd"] + }, + "application/rpki-ghostbusters": { + "source": "iana", + "extensions": ["gbr"] + }, + "application/rpki-manifest": { + "source": "iana", + "extensions": ["mft"] + }, + "application/rpki-publication": { + "source": "iana" + }, + "application/rpki-roa": { + "source": "iana", + "extensions": ["roa"] + }, + "application/rpki-updown": { + "source": "iana" + }, + "application/rsd+xml": { + "source": "apache", + "compressible": true, + "extensions": ["rsd"] + }, + "application/rss+xml": { + "source": "apache", + "compressible": true, + "extensions": ["rss"] + }, + "application/rtf": { + "source": "iana", + "compressible": true, + "extensions": ["rtf"] + }, + "application/rtploopback": { + "source": "iana" + }, + "application/rtx": { + "source": "iana" + }, + "application/samlassertion+xml": { + "source": "iana", + "compressible": true + }, + "application/samlmetadata+xml": { + 
"source": "iana", + "compressible": true + }, + "application/sarif+json": { + "source": "iana", + "compressible": true + }, + "application/sarif-external-properties+json": { + "source": "iana", + "compressible": true + }, + "application/sbe": { + "source": "iana" + }, + "application/sbml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["sbml"] + }, + "application/scaip+xml": { + "source": "iana", + "compressible": true + }, + "application/scim+json": { + "source": "iana", + "compressible": true + }, + "application/scvp-cv-request": { + "source": "iana", + "extensions": ["scq"] + }, + "application/scvp-cv-response": { + "source": "iana", + "extensions": ["scs"] + }, + "application/scvp-vp-request": { + "source": "iana", + "extensions": ["spq"] + }, + "application/scvp-vp-response": { + "source": "iana", + "extensions": ["spp"] + }, + "application/sdp": { + "source": "iana", + "extensions": ["sdp"] + }, + "application/secevent+jwt": { + "source": "iana" + }, + "application/senml+cbor": { + "source": "iana" + }, + "application/senml+json": { + "source": "iana", + "compressible": true + }, + "application/senml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["senmlx"] + }, + "application/senml-etch+cbor": { + "source": "iana" + }, + "application/senml-etch+json": { + "source": "iana", + "compressible": true + }, + "application/senml-exi": { + "source": "iana" + }, + "application/sensml+cbor": { + "source": "iana" + }, + "application/sensml+json": { + "source": "iana", + "compressible": true + }, + "application/sensml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["sensmlx"] + }, + "application/sensml-exi": { + "source": "iana" + }, + "application/sep+xml": { + "source": "iana", + "compressible": true + }, + "application/sep-exi": { + "source": "iana" + }, + "application/session-info": { + "source": "iana" + }, + "application/set-payment": { + "source": "iana" + }, + "application/set-payment-initiation": { + "source": "iana", + "extensions": ["setpay"] + }, + "application/set-registration": { + "source": "iana" + }, + "application/set-registration-initiation": { + "source": "iana", + "extensions": ["setreg"] + }, + "application/sgml": { + "source": "iana" + }, + "application/sgml-open-catalog": { + "source": "iana" + }, + "application/shf+xml": { + "source": "iana", + "compressible": true, + "extensions": ["shf"] + }, + "application/sieve": { + "source": "iana", + "extensions": ["siv","sieve"] + }, + "application/simple-filter+xml": { + "source": "iana", + "compressible": true + }, + "application/simple-message-summary": { + "source": "iana" + }, + "application/simplesymbolcontainer": { + "source": "iana" + }, + "application/sipc": { + "source": "iana" + }, + "application/slate": { + "source": "iana" + }, + "application/smil": { + "source": "iana" + }, + "application/smil+xml": { + "source": "iana", + "compressible": true, + "extensions": ["smi","smil"] + }, + "application/smpte336m": { + "source": "iana" + }, + "application/soap+fastinfoset": { + "source": "iana" + }, + "application/soap+xml": { + "source": "iana", + "compressible": true + }, + "application/sparql-query": { + "source": "iana", + "extensions": ["rq"] + }, + "application/sparql-results+xml": { + "source": "iana", + "compressible": true, + "extensions": ["srx"] + }, + "application/spirits-event+xml": { + "source": "iana", + "compressible": true + }, + "application/sql": { + "source": "iana" + }, + "application/srgs": { + "source": "iana", + "extensions": ["gram"] + }, 
+ "application/srgs+xml": { + "source": "iana", + "compressible": true, + "extensions": ["grxml"] + }, + "application/sru+xml": { + "source": "iana", + "compressible": true, + "extensions": ["sru"] + }, + "application/ssdl+xml": { + "source": "apache", + "compressible": true, + "extensions": ["ssdl"] + }, + "application/ssml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["ssml"] + }, + "application/stix+json": { + "source": "iana", + "compressible": true + }, + "application/swid+xml": { + "source": "iana", + "compressible": true, + "extensions": ["swidtag"] + }, + "application/tamp-apex-update": { + "source": "iana" + }, + "application/tamp-apex-update-confirm": { + "source": "iana" + }, + "application/tamp-community-update": { + "source": "iana" + }, + "application/tamp-community-update-confirm": { + "source": "iana" + }, + "application/tamp-error": { + "source": "iana" + }, + "application/tamp-sequence-adjust": { + "source": "iana" + }, + "application/tamp-sequence-adjust-confirm": { + "source": "iana" + }, + "application/tamp-status-query": { + "source": "iana" + }, + "application/tamp-status-response": { + "source": "iana" + }, + "application/tamp-update": { + "source": "iana" + }, + "application/tamp-update-confirm": { + "source": "iana" + }, + "application/tar": { + "compressible": true + }, + "application/taxii+json": { + "source": "iana", + "compressible": true + }, + "application/td+json": { + "source": "iana", + "compressible": true + }, + "application/tei+xml": { + "source": "iana", + "compressible": true, + "extensions": ["tei","teicorpus"] + }, + "application/tetra_isi": { + "source": "iana" + }, + "application/thraud+xml": { + "source": "iana", + "compressible": true, + "extensions": ["tfi"] + }, + "application/timestamp-query": { + "source": "iana" + }, + "application/timestamp-reply": { + "source": "iana" + }, + "application/timestamped-data": { + "source": "iana", + "extensions": ["tsd"] + }, + "application/tlsrpt+gzip": { + "source": "iana" + }, + "application/tlsrpt+json": { + "source": "iana", + "compressible": true + }, + "application/tnauthlist": { + "source": "iana" + }, + "application/token-introspection+jwt": { + "source": "iana" + }, + "application/toml": { + "compressible": true, + "extensions": ["toml"] + }, + "application/trickle-ice-sdpfrag": { + "source": "iana" + }, + "application/trig": { + "source": "iana", + "extensions": ["trig"] + }, + "application/ttml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["ttml"] + }, + "application/tve-trigger": { + "source": "iana" + }, + "application/tzif": { + "source": "iana" + }, + "application/tzif-leap": { + "source": "iana" + }, + "application/ubjson": { + "compressible": false, + "extensions": ["ubj"] + }, + "application/ulpfec": { + "source": "iana" + }, + "application/urc-grpsheet+xml": { + "source": "iana", + "compressible": true + }, + "application/urc-ressheet+xml": { + "source": "iana", + "compressible": true, + "extensions": ["rsheet"] + }, + "application/urc-targetdesc+xml": { + "source": "iana", + "compressible": true, + "extensions": ["td"] + }, + "application/urc-uisocketdesc+xml": { + "source": "iana", + "compressible": true + }, + "application/vcard+json": { + "source": "iana", + "compressible": true + }, + "application/vcard+xml": { + "source": "iana", + "compressible": true + }, + "application/vemmi": { + "source": "iana" + }, + "application/vividence.scriptfile": { + "source": "apache" + }, + "application/vnd.1000minds.decision-model+xml": { + "source": 
"iana", + "compressible": true, + "extensions": ["1km"] + }, + "application/vnd.3gpp-prose+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp-prose-pc3ch+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp-v2x-local-service-information": { + "source": "iana" + }, + "application/vnd.3gpp.5gnas": { + "source": "iana" + }, + "application/vnd.3gpp.access-transfer-events+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.bsf+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.gmop+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.gtpc": { + "source": "iana" + }, + "application/vnd.3gpp.interworking-data": { + "source": "iana" + }, + "application/vnd.3gpp.lpp": { + "source": "iana" + }, + "application/vnd.3gpp.mc-signalling-ear": { + "source": "iana" + }, + "application/vnd.3gpp.mcdata-affiliation-command+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcdata-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcdata-payload": { + "source": "iana" + }, + "application/vnd.3gpp.mcdata-service-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcdata-signalling": { + "source": "iana" + }, + "application/vnd.3gpp.mcdata-ue-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcdata-user-profile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-affiliation-command+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-floor-request+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-location-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-mbms-usage-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-service-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-signed+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-ue-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-ue-init-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcptt-user-profile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-affiliation-command+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-affiliation-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-location-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-mbms-usage-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-service-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-transmission-request+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-ue-config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mcvideo-user-profile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.mid-call+xml": { + "source": "iana", + "compressible": true + }, + 
"application/vnd.3gpp.ngap": { + "source": "iana" + }, + "application/vnd.3gpp.pfcp": { + "source": "iana" + }, + "application/vnd.3gpp.pic-bw-large": { + "source": "iana", + "extensions": ["plb"] + }, + "application/vnd.3gpp.pic-bw-small": { + "source": "iana", + "extensions": ["psb"] + }, + "application/vnd.3gpp.pic-bw-var": { + "source": "iana", + "extensions": ["pvb"] + }, + "application/vnd.3gpp.s1ap": { + "source": "iana" + }, + "application/vnd.3gpp.sms": { + "source": "iana" + }, + "application/vnd.3gpp.sms+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.srvcc-ext+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.srvcc-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.state-and-event-info+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp.ussd+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp2.bcmcsinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.3gpp2.sms": { + "source": "iana" + }, + "application/vnd.3gpp2.tcap": { + "source": "iana", + "extensions": ["tcap"] + }, + "application/vnd.3lightssoftware.imagescal": { + "source": "iana" + }, + "application/vnd.3m.post-it-notes": { + "source": "iana", + "extensions": ["pwn"] + }, + "application/vnd.accpac.simply.aso": { + "source": "iana", + "extensions": ["aso"] + }, + "application/vnd.accpac.simply.imp": { + "source": "iana", + "extensions": ["imp"] + }, + "application/vnd.acucobol": { + "source": "iana", + "extensions": ["acu"] + }, + "application/vnd.acucorp": { + "source": "iana", + "extensions": ["atc","acutc"] + }, + "application/vnd.adobe.air-application-installer-package+zip": { + "source": "apache", + "compressible": false, + "extensions": ["air"] + }, + "application/vnd.adobe.flash.movie": { + "source": "iana" + }, + "application/vnd.adobe.formscentral.fcdt": { + "source": "iana", + "extensions": ["fcdt"] + }, + "application/vnd.adobe.fxp": { + "source": "iana", + "extensions": ["fxp","fxpl"] + }, + "application/vnd.adobe.partial-upload": { + "source": "iana" + }, + "application/vnd.adobe.xdp+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xdp"] + }, + "application/vnd.adobe.xfdf": { + "source": "iana", + "extensions": ["xfdf"] + }, + "application/vnd.aether.imp": { + "source": "iana" + }, + "application/vnd.afpc.afplinedata": { + "source": "iana" + }, + "application/vnd.afpc.afplinedata-pagedef": { + "source": "iana" + }, + "application/vnd.afpc.cmoca-cmresource": { + "source": "iana" + }, + "application/vnd.afpc.foca-charset": { + "source": "iana" + }, + "application/vnd.afpc.foca-codedfont": { + "source": "iana" + }, + "application/vnd.afpc.foca-codepage": { + "source": "iana" + }, + "application/vnd.afpc.modca": { + "source": "iana" + }, + "application/vnd.afpc.modca-cmtable": { + "source": "iana" + }, + "application/vnd.afpc.modca-formdef": { + "source": "iana" + }, + "application/vnd.afpc.modca-mediummap": { + "source": "iana" + }, + "application/vnd.afpc.modca-objectcontainer": { + "source": "iana" + }, + "application/vnd.afpc.modca-overlay": { + "source": "iana" + }, + "application/vnd.afpc.modca-pagesegment": { + "source": "iana" + }, + "application/vnd.ah-barcode": { + "source": "iana" + }, + "application/vnd.ahead.space": { + "source": "iana", + "extensions": ["ahead"] + }, + "application/vnd.airzip.filesecure.azf": { + "source": "iana", + "extensions": ["azf"] + }, + "application/vnd.airzip.filesecure.azs": { 
+ "source": "iana", + "extensions": ["azs"] + }, + "application/vnd.amadeus+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.amazon.ebook": { + "source": "apache", + "extensions": ["azw"] + }, + "application/vnd.amazon.mobi8-ebook": { + "source": "iana" + }, + "application/vnd.americandynamics.acc": { + "source": "iana", + "extensions": ["acc"] + }, + "application/vnd.amiga.ami": { + "source": "iana", + "extensions": ["ami"] + }, + "application/vnd.amundsen.maze+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.android.ota": { + "source": "iana" + }, + "application/vnd.android.package-archive": { + "source": "apache", + "compressible": false, + "extensions": ["apk"] + }, + "application/vnd.anki": { + "source": "iana" + }, + "application/vnd.anser-web-certificate-issue-initiation": { + "source": "iana", + "extensions": ["cii"] + }, + "application/vnd.anser-web-funds-transfer-initiation": { + "source": "apache", + "extensions": ["fti"] + }, + "application/vnd.antix.game-component": { + "source": "iana", + "extensions": ["atx"] + }, + "application/vnd.apache.arrow.file": { + "source": "iana" + }, + "application/vnd.apache.arrow.stream": { + "source": "iana" + }, + "application/vnd.apache.thrift.binary": { + "source": "iana" + }, + "application/vnd.apache.thrift.compact": { + "source": "iana" + }, + "application/vnd.apache.thrift.json": { + "source": "iana" + }, + "application/vnd.api+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.aplextor.warrp+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.apothekende.reservation+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.apple.installer+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mpkg"] + }, + "application/vnd.apple.keynote": { + "source": "iana", + "extensions": ["key"] + }, + "application/vnd.apple.mpegurl": { + "source": "iana", + "extensions": ["m3u8"] + }, + "application/vnd.apple.numbers": { + "source": "iana", + "extensions": ["numbers"] + }, + "application/vnd.apple.pages": { + "source": "iana", + "extensions": ["pages"] + }, + "application/vnd.apple.pkpass": { + "compressible": false, + "extensions": ["pkpass"] + }, + "application/vnd.arastra.swi": { + "source": "iana" + }, + "application/vnd.aristanetworks.swi": { + "source": "iana", + "extensions": ["swi"] + }, + "application/vnd.artisan+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.artsquare": { + "source": "iana" + }, + "application/vnd.astraea-software.iota": { + "source": "iana", + "extensions": ["iota"] + }, + "application/vnd.audiograph": { + "source": "iana", + "extensions": ["aep"] + }, + "application/vnd.autopackage": { + "source": "iana" + }, + "application/vnd.avalon+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.avistar+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.balsamiq.bmml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["bmml"] + }, + "application/vnd.balsamiq.bmpr": { + "source": "iana" + }, + "application/vnd.banana-accounting": { + "source": "iana" + }, + "application/vnd.bbf.usp.error": { + "source": "iana" + }, + "application/vnd.bbf.usp.msg": { + "source": "iana" + }, + "application/vnd.bbf.usp.msg+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.bekitzur-stech+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.bint.med-content": { + "source": 
"iana" + }, + "application/vnd.biopax.rdf+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.blink-idb-value-wrapper": { + "source": "iana" + }, + "application/vnd.blueice.multipass": { + "source": "iana", + "extensions": ["mpm"] + }, + "application/vnd.bluetooth.ep.oob": { + "source": "iana" + }, + "application/vnd.bluetooth.le.oob": { + "source": "iana" + }, + "application/vnd.bmi": { + "source": "iana", + "extensions": ["bmi"] + }, + "application/vnd.bpf": { + "source": "iana" + }, + "application/vnd.bpf3": { + "source": "iana" + }, + "application/vnd.businessobjects": { + "source": "iana", + "extensions": ["rep"] + }, + "application/vnd.byu.uapi+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.cab-jscript": { + "source": "iana" + }, + "application/vnd.canon-cpdl": { + "source": "iana" + }, + "application/vnd.canon-lips": { + "source": "iana" + }, + "application/vnd.capasystems-pg+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.cendio.thinlinc.clientconf": { + "source": "iana" + }, + "application/vnd.century-systems.tcp_stream": { + "source": "iana" + }, + "application/vnd.chemdraw+xml": { + "source": "iana", + "compressible": true, + "extensions": ["cdxml"] + }, + "application/vnd.chess-pgn": { + "source": "iana" + }, + "application/vnd.chipnuts.karaoke-mmd": { + "source": "iana", + "extensions": ["mmd"] + }, + "application/vnd.ciedi": { + "source": "iana" + }, + "application/vnd.cinderella": { + "source": "iana", + "extensions": ["cdy"] + }, + "application/vnd.cirpack.isdn-ext": { + "source": "iana" + }, + "application/vnd.citationstyles.style+xml": { + "source": "iana", + "compressible": true, + "extensions": ["csl"] + }, + "application/vnd.claymore": { + "source": "iana", + "extensions": ["cla"] + }, + "application/vnd.cloanto.rp9": { + "source": "iana", + "extensions": ["rp9"] + }, + "application/vnd.clonk.c4group": { + "source": "iana", + "extensions": ["c4g","c4d","c4f","c4p","c4u"] + }, + "application/vnd.cluetrust.cartomobile-config": { + "source": "iana", + "extensions": ["c11amc"] + }, + "application/vnd.cluetrust.cartomobile-config-pkg": { + "source": "iana", + "extensions": ["c11amz"] + }, + "application/vnd.coffeescript": { + "source": "iana" + }, + "application/vnd.collabio.xodocuments.document": { + "source": "iana" + }, + "application/vnd.collabio.xodocuments.document-template": { + "source": "iana" + }, + "application/vnd.collabio.xodocuments.presentation": { + "source": "iana" + }, + "application/vnd.collabio.xodocuments.presentation-template": { + "source": "iana" + }, + "application/vnd.collabio.xodocuments.spreadsheet": { + "source": "iana" + }, + "application/vnd.collabio.xodocuments.spreadsheet-template": { + "source": "iana" + }, + "application/vnd.collection+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.collection.doc+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.collection.next+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.comicbook+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.comicbook-rar": { + "source": "iana" + }, + "application/vnd.commerce-battelle": { + "source": "iana" + }, + "application/vnd.commonspace": { + "source": "iana", + "extensions": ["csp"] + }, + "application/vnd.contact.cmsg": { + "source": "iana", + "extensions": ["cdbcmsg"] + }, + "application/vnd.coreos.ignition+json": { + "source": "iana", + "compressible": true + }, + 
"application/vnd.cosmocaller": { + "source": "iana", + "extensions": ["cmc"] + }, + "application/vnd.crick.clicker": { + "source": "iana", + "extensions": ["clkx"] + }, + "application/vnd.crick.clicker.keyboard": { + "source": "iana", + "extensions": ["clkk"] + }, + "application/vnd.crick.clicker.palette": { + "source": "iana", + "extensions": ["clkp"] + }, + "application/vnd.crick.clicker.template": { + "source": "iana", + "extensions": ["clkt"] + }, + "application/vnd.crick.clicker.wordbank": { + "source": "iana", + "extensions": ["clkw"] + }, + "application/vnd.criticaltools.wbs+xml": { + "source": "iana", + "compressible": true, + "extensions": ["wbs"] + }, + "application/vnd.cryptii.pipe+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.crypto-shade-file": { + "source": "iana" + }, + "application/vnd.cryptomator.encrypted": { + "source": "iana" + }, + "application/vnd.cryptomator.vault": { + "source": "iana" + }, + "application/vnd.ctc-posml": { + "source": "iana", + "extensions": ["pml"] + }, + "application/vnd.ctct.ws+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.cups-pdf": { + "source": "iana" + }, + "application/vnd.cups-postscript": { + "source": "iana" + }, + "application/vnd.cups-ppd": { + "source": "iana", + "extensions": ["ppd"] + }, + "application/vnd.cups-raster": { + "source": "iana" + }, + "application/vnd.cups-raw": { + "source": "iana" + }, + "application/vnd.curl": { + "source": "iana" + }, + "application/vnd.curl.car": { + "source": "apache", + "extensions": ["car"] + }, + "application/vnd.curl.pcurl": { + "source": "apache", + "extensions": ["pcurl"] + }, + "application/vnd.cyan.dean.root+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.cybank": { + "source": "iana" + }, + "application/vnd.cyclonedx+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.cyclonedx+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.d2l.coursepackage1p0+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.d3m-dataset": { + "source": "iana" + }, + "application/vnd.d3m-problem": { + "source": "iana" + }, + "application/vnd.dart": { + "source": "iana", + "compressible": true, + "extensions": ["dart"] + }, + "application/vnd.data-vision.rdz": { + "source": "iana", + "extensions": ["rdz"] + }, + "application/vnd.datapackage+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.dataresource+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.dbf": { + "source": "iana", + "extensions": ["dbf"] + }, + "application/vnd.debian.binary-package": { + "source": "iana" + }, + "application/vnd.dece.data": { + "source": "iana", + "extensions": ["uvf","uvvf","uvd","uvvd"] + }, + "application/vnd.dece.ttml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["uvt","uvvt"] + }, + "application/vnd.dece.unspecified": { + "source": "iana", + "extensions": ["uvx","uvvx"] + }, + "application/vnd.dece.zip": { + "source": "iana", + "extensions": ["uvz","uvvz"] + }, + "application/vnd.denovo.fcselayout-link": { + "source": "iana", + "extensions": ["fe_launch"] + }, + "application/vnd.desmume.movie": { + "source": "iana" + }, + "application/vnd.dir-bi.plate-dl-nosuffix": { + "source": "iana" + }, + "application/vnd.dm.delegation+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dna": { + "source": "iana", + "extensions": ["dna"] + }, + "application/vnd.document+json": { + "source": 
"iana", + "compressible": true + }, + "application/vnd.dolby.mlp": { + "source": "apache", + "extensions": ["mlp"] + }, + "application/vnd.dolby.mobile.1": { + "source": "iana" + }, + "application/vnd.dolby.mobile.2": { + "source": "iana" + }, + "application/vnd.doremir.scorecloud-binary-document": { + "source": "iana" + }, + "application/vnd.dpgraph": { + "source": "iana", + "extensions": ["dpg"] + }, + "application/vnd.dreamfactory": { + "source": "iana", + "extensions": ["dfac"] + }, + "application/vnd.drive+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.ds-keypoint": { + "source": "apache", + "extensions": ["kpxx"] + }, + "application/vnd.dtg.local": { + "source": "iana" + }, + "application/vnd.dtg.local.flash": { + "source": "iana" + }, + "application/vnd.dtg.local.html": { + "source": "iana" + }, + "application/vnd.dvb.ait": { + "source": "iana", + "extensions": ["ait"] + }, + "application/vnd.dvb.dvbisl+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.dvbj": { + "source": "iana" + }, + "application/vnd.dvb.esgcontainer": { + "source": "iana" + }, + "application/vnd.dvb.ipdcdftnotifaccess": { + "source": "iana" + }, + "application/vnd.dvb.ipdcesgaccess": { + "source": "iana" + }, + "application/vnd.dvb.ipdcesgaccess2": { + "source": "iana" + }, + "application/vnd.dvb.ipdcesgpdd": { + "source": "iana" + }, + "application/vnd.dvb.ipdcroaming": { + "source": "iana" + }, + "application/vnd.dvb.iptv.alfec-base": { + "source": "iana" + }, + "application/vnd.dvb.iptv.alfec-enhancement": { + "source": "iana" + }, + "application/vnd.dvb.notif-aggregate-root+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.notif-container+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.notif-generic+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.notif-ia-msglist+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.notif-ia-registration-request+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.notif-ia-registration-response+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.notif-init+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.dvb.pfr": { + "source": "iana" + }, + "application/vnd.dvb.service": { + "source": "iana", + "extensions": ["svc"] + }, + "application/vnd.dxr": { + "source": "iana" + }, + "application/vnd.dynageo": { + "source": "iana", + "extensions": ["geo"] + }, + "application/vnd.dzr": { + "source": "iana" + }, + "application/vnd.easykaraoke.cdgdownload": { + "source": "iana" + }, + "application/vnd.ecdis-update": { + "source": "iana" + }, + "application/vnd.ecip.rlp": { + "source": "iana" + }, + "application/vnd.ecowin.chart": { + "source": "iana", + "extensions": ["mag"] + }, + "application/vnd.ecowin.filerequest": { + "source": "iana" + }, + "application/vnd.ecowin.fileupdate": { + "source": "iana" + }, + "application/vnd.ecowin.series": { + "source": "iana" + }, + "application/vnd.ecowin.seriesrequest": { + "source": "iana" + }, + "application/vnd.ecowin.seriesupdate": { + "source": "iana" + }, + "application/vnd.efi.img": { + "source": "iana" + }, + "application/vnd.efi.iso": { + "source": "iana" + }, + "application/vnd.emclient.accessrequest+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.enliven": { + "source": "iana", + "extensions": ["nml"] + }, + "application/vnd.enphase.envoy": { + "source": "iana" + 
}, + "application/vnd.eprints.data+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.epson.esf": { + "source": "iana", + "extensions": ["esf"] + }, + "application/vnd.epson.msf": { + "source": "iana", + "extensions": ["msf"] + }, + "application/vnd.epson.quickanime": { + "source": "iana", + "extensions": ["qam"] + }, + "application/vnd.epson.salt": { + "source": "iana", + "extensions": ["slt"] + }, + "application/vnd.epson.ssf": { + "source": "iana", + "extensions": ["ssf"] + }, + "application/vnd.ericsson.quickcall": { + "source": "iana" + }, + "application/vnd.espass-espass+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.eszigno3+xml": { + "source": "iana", + "compressible": true, + "extensions": ["es3","et3"] + }, + "application/vnd.etsi.aoc+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.asic-e+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.etsi.asic-s+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.etsi.cug+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvcommand+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvdiscovery+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvprofile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvsad-bc+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvsad-cod+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvsad-npvr+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvservice+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvsync+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.iptvueprofile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.mcid+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.mheg5": { + "source": "iana" + }, + "application/vnd.etsi.overload-control-policy-dataset+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.pstn+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.sci+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.simservs+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.timestamp-token": { + "source": "iana" + }, + "application/vnd.etsi.tsl+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.etsi.tsl.der": { + "source": "iana" + }, + "application/vnd.eudora.data": { + "source": "iana" + }, + "application/vnd.evolv.ecig.profile": { + "source": "iana" + }, + "application/vnd.evolv.ecig.settings": { + "source": "iana" + }, + "application/vnd.evolv.ecig.theme": { + "source": "iana" + }, + "application/vnd.exstream-empower+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.exstream-package": { + "source": "iana" + }, + "application/vnd.ezpix-album": { + "source": "iana", + "extensions": ["ez2"] + }, + "application/vnd.ezpix-package": { + "source": "iana", + "extensions": ["ez3"] + }, + "application/vnd.f-secure.mobile": { + "source": "iana" + }, + "application/vnd.fastcopy-disk-image": { + "source": "iana" + }, + "application/vnd.fdf": { + "source": "iana", + "extensions": ["fdf"] + }, + "application/vnd.fdsn.mseed": { + "source": "iana", + "extensions": ["mseed"] + }, + 
"application/vnd.fdsn.seed": { + "source": "iana", + "extensions": ["seed","dataless"] + }, + "application/vnd.ffsns": { + "source": "iana" + }, + "application/vnd.ficlab.flb+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.filmit.zfc": { + "source": "iana" + }, + "application/vnd.fints": { + "source": "iana" + }, + "application/vnd.firemonkeys.cloudcell": { + "source": "iana" + }, + "application/vnd.flographit": { + "source": "iana", + "extensions": ["gph"] + }, + "application/vnd.fluxtime.clip": { + "source": "iana", + "extensions": ["ftc"] + }, + "application/vnd.font-fontforge-sfd": { + "source": "iana" + }, + "application/vnd.framemaker": { + "source": "iana", + "extensions": ["fm","frame","maker","book"] + }, + "application/vnd.frogans.fnc": { + "source": "iana", + "extensions": ["fnc"] + }, + "application/vnd.frogans.ltf": { + "source": "iana", + "extensions": ["ltf"] + }, + "application/vnd.fsc.weblaunch": { + "source": "iana", + "extensions": ["fsc"] + }, + "application/vnd.fujifilm.fb.docuworks": { + "source": "iana" + }, + "application/vnd.fujifilm.fb.docuworks.binder": { + "source": "iana" + }, + "application/vnd.fujifilm.fb.docuworks.container": { + "source": "iana" + }, + "application/vnd.fujifilm.fb.jfi+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.fujitsu.oasys": { + "source": "iana", + "extensions": ["oas"] + }, + "application/vnd.fujitsu.oasys2": { + "source": "iana", + "extensions": ["oa2"] + }, + "application/vnd.fujitsu.oasys3": { + "source": "iana", + "extensions": ["oa3"] + }, + "application/vnd.fujitsu.oasysgp": { + "source": "iana", + "extensions": ["fg5"] + }, + "application/vnd.fujitsu.oasysprs": { + "source": "iana", + "extensions": ["bh2"] + }, + "application/vnd.fujixerox.art-ex": { + "source": "iana" + }, + "application/vnd.fujixerox.art4": { + "source": "iana" + }, + "application/vnd.fujixerox.ddd": { + "source": "iana", + "extensions": ["ddd"] + }, + "application/vnd.fujixerox.docuworks": { + "source": "iana", + "extensions": ["xdw"] + }, + "application/vnd.fujixerox.docuworks.binder": { + "source": "iana", + "extensions": ["xbd"] + }, + "application/vnd.fujixerox.docuworks.container": { + "source": "iana" + }, + "application/vnd.fujixerox.hbpl": { + "source": "iana" + }, + "application/vnd.fut-misnet": { + "source": "iana" + }, + "application/vnd.futoin+cbor": { + "source": "iana" + }, + "application/vnd.futoin+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.fuzzysheet": { + "source": "iana", + "extensions": ["fzs"] + }, + "application/vnd.genomatix.tuxedo": { + "source": "iana", + "extensions": ["txd"] + }, + "application/vnd.gentics.grd+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.geo+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.geocube+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.geogebra.file": { + "source": "iana", + "extensions": ["ggb"] + }, + "application/vnd.geogebra.slides": { + "source": "iana" + }, + "application/vnd.geogebra.tool": { + "source": "iana", + "extensions": ["ggt"] + }, + "application/vnd.geometry-explorer": { + "source": "iana", + "extensions": ["gex","gre"] + }, + "application/vnd.geonext": { + "source": "iana", + "extensions": ["gxt"] + }, + "application/vnd.geoplan": { + "source": "iana", + "extensions": ["g2w"] + }, + "application/vnd.geospace": { + "source": "iana", + "extensions": ["g3w"] + }, + "application/vnd.gerber": { + "source": "iana" + }, + 
"application/vnd.globalplatform.card-content-mgt": { + "source": "iana" + }, + "application/vnd.globalplatform.card-content-mgt-response": { + "source": "iana" + }, + "application/vnd.gmx": { + "source": "iana", + "extensions": ["gmx"] + }, + "application/vnd.google-apps.document": { + "compressible": false, + "extensions": ["gdoc"] + }, + "application/vnd.google-apps.presentation": { + "compressible": false, + "extensions": ["gslides"] + }, + "application/vnd.google-apps.spreadsheet": { + "compressible": false, + "extensions": ["gsheet"] + }, + "application/vnd.google-earth.kml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["kml"] + }, + "application/vnd.google-earth.kmz": { + "source": "iana", + "compressible": false, + "extensions": ["kmz"] + }, + "application/vnd.gov.sk.e-form+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.gov.sk.e-form+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.gov.sk.xmldatacontainer+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.grafeq": { + "source": "iana", + "extensions": ["gqf","gqs"] + }, + "application/vnd.gridmp": { + "source": "iana" + }, + "application/vnd.groove-account": { + "source": "iana", + "extensions": ["gac"] + }, + "application/vnd.groove-help": { + "source": "iana", + "extensions": ["ghf"] + }, + "application/vnd.groove-identity-message": { + "source": "iana", + "extensions": ["gim"] + }, + "application/vnd.groove-injector": { + "source": "iana", + "extensions": ["grv"] + }, + "application/vnd.groove-tool-message": { + "source": "iana", + "extensions": ["gtm"] + }, + "application/vnd.groove-tool-template": { + "source": "iana", + "extensions": ["tpl"] + }, + "application/vnd.groove-vcard": { + "source": "iana", + "extensions": ["vcg"] + }, + "application/vnd.hal+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.hal+xml": { + "source": "iana", + "compressible": true, + "extensions": ["hal"] + }, + "application/vnd.handheld-entertainment+xml": { + "source": "iana", + "compressible": true, + "extensions": ["zmm"] + }, + "application/vnd.hbci": { + "source": "iana", + "extensions": ["hbci"] + }, + "application/vnd.hc+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.hcl-bireports": { + "source": "iana" + }, + "application/vnd.hdt": { + "source": "iana" + }, + "application/vnd.heroku+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.hhe.lesson-player": { + "source": "iana", + "extensions": ["les"] + }, + "application/vnd.hp-hpgl": { + "source": "iana", + "extensions": ["hpgl"] + }, + "application/vnd.hp-hpid": { + "source": "iana", + "extensions": ["hpid"] + }, + "application/vnd.hp-hps": { + "source": "iana", + "extensions": ["hps"] + }, + "application/vnd.hp-jlyt": { + "source": "iana", + "extensions": ["jlt"] + }, + "application/vnd.hp-pcl": { + "source": "iana", + "extensions": ["pcl"] + }, + "application/vnd.hp-pclxl": { + "source": "iana", + "extensions": ["pclxl"] + }, + "application/vnd.httphone": { + "source": "iana" + }, + "application/vnd.hydrostatix.sof-data": { + "source": "iana", + "extensions": ["sfd-hdstx"] + }, + "application/vnd.hyper+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.hyper-item+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.hyperdrive+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.hzn-3d-crossword": { + "source": "iana" + }, + 
"application/vnd.ibm.afplinedata": { + "source": "iana" + }, + "application/vnd.ibm.electronic-media": { + "source": "iana" + }, + "application/vnd.ibm.minipay": { + "source": "iana", + "extensions": ["mpy"] + }, + "application/vnd.ibm.modcap": { + "source": "iana", + "extensions": ["afp","listafp","list3820"] + }, + "application/vnd.ibm.rights-management": { + "source": "iana", + "extensions": ["irm"] + }, + "application/vnd.ibm.secure-container": { + "source": "iana", + "extensions": ["sc"] + }, + "application/vnd.iccprofile": { + "source": "iana", + "extensions": ["icc","icm"] + }, + "application/vnd.ieee.1905": { + "source": "iana" + }, + "application/vnd.igloader": { + "source": "iana", + "extensions": ["igl"] + }, + "application/vnd.imagemeter.folder+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.imagemeter.image+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.immervision-ivp": { + "source": "iana", + "extensions": ["ivp"] + }, + "application/vnd.immervision-ivu": { + "source": "iana", + "extensions": ["ivu"] + }, + "application/vnd.ims.imsccv1p1": { + "source": "iana" + }, + "application/vnd.ims.imsccv1p2": { + "source": "iana" + }, + "application/vnd.ims.imsccv1p3": { + "source": "iana" + }, + "application/vnd.ims.lis.v2.result+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.ims.lti.v2.toolconsumerprofile+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.ims.lti.v2.toolproxy+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.ims.lti.v2.toolproxy.id+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.ims.lti.v2.toolsettings+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.ims.lti.v2.toolsettings.simple+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.informedcontrol.rms+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.informix-visionary": { + "source": "iana" + }, + "application/vnd.infotech.project": { + "source": "iana" + }, + "application/vnd.infotech.project+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.innopath.wamp.notification": { + "source": "iana" + }, + "application/vnd.insors.igm": { + "source": "iana", + "extensions": ["igm"] + }, + "application/vnd.intercon.formnet": { + "source": "iana", + "extensions": ["xpw","xpx"] + }, + "application/vnd.intergeo": { + "source": "iana", + "extensions": ["i2g"] + }, + "application/vnd.intertrust.digibox": { + "source": "iana" + }, + "application/vnd.intertrust.nncp": { + "source": "iana" + }, + "application/vnd.intu.qbo": { + "source": "iana", + "extensions": ["qbo"] + }, + "application/vnd.intu.qfx": { + "source": "iana", + "extensions": ["qfx"] + }, + "application/vnd.iptc.g2.catalogitem+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.iptc.g2.conceptitem+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.iptc.g2.knowledgeitem+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.iptc.g2.newsitem+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.iptc.g2.newsmessage+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.iptc.g2.packageitem+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.iptc.g2.planningitem+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.ipunplugged.rcprofile": { + "source": "iana", + 
"extensions": ["rcprofile"] + }, + "application/vnd.irepository.package+xml": { + "source": "iana", + "compressible": true, + "extensions": ["irp"] + }, + "application/vnd.is-xpr": { + "source": "iana", + "extensions": ["xpr"] + }, + "application/vnd.isac.fcs": { + "source": "iana", + "extensions": ["fcs"] + }, + "application/vnd.iso11783-10+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.jam": { + "source": "iana", + "extensions": ["jam"] + }, + "application/vnd.japannet-directory-service": { + "source": "iana" + }, + "application/vnd.japannet-jpnstore-wakeup": { + "source": "iana" + }, + "application/vnd.japannet-payment-wakeup": { + "source": "iana" + }, + "application/vnd.japannet-registration": { + "source": "iana" + }, + "application/vnd.japannet-registration-wakeup": { + "source": "iana" + }, + "application/vnd.japannet-setstore-wakeup": { + "source": "iana" + }, + "application/vnd.japannet-verification": { + "source": "iana" + }, + "application/vnd.japannet-verification-wakeup": { + "source": "iana" + }, + "application/vnd.jcp.javame.midlet-rms": { + "source": "iana", + "extensions": ["rms"] + }, + "application/vnd.jisp": { + "source": "iana", + "extensions": ["jisp"] + }, + "application/vnd.joost.joda-archive": { + "source": "iana", + "extensions": ["joda"] + }, + "application/vnd.jsk.isdn-ngn": { + "source": "iana" + }, + "application/vnd.kahootz": { + "source": "iana", + "extensions": ["ktz","ktr"] + }, + "application/vnd.kde.karbon": { + "source": "iana", + "extensions": ["karbon"] + }, + "application/vnd.kde.kchart": { + "source": "iana", + "extensions": ["chrt"] + }, + "application/vnd.kde.kformula": { + "source": "iana", + "extensions": ["kfo"] + }, + "application/vnd.kde.kivio": { + "source": "iana", + "extensions": ["flw"] + }, + "application/vnd.kde.kontour": { + "source": "iana", + "extensions": ["kon"] + }, + "application/vnd.kde.kpresenter": { + "source": "iana", + "extensions": ["kpr","kpt"] + }, + "application/vnd.kde.kspread": { + "source": "iana", + "extensions": ["ksp"] + }, + "application/vnd.kde.kword": { + "source": "iana", + "extensions": ["kwd","kwt"] + }, + "application/vnd.kenameaapp": { + "source": "iana", + "extensions": ["htke"] + }, + "application/vnd.kidspiration": { + "source": "iana", + "extensions": ["kia"] + }, + "application/vnd.kinar": { + "source": "iana", + "extensions": ["kne","knp"] + }, + "application/vnd.koan": { + "source": "iana", + "extensions": ["skp","skd","skt","skm"] + }, + "application/vnd.kodak-descriptor": { + "source": "iana", + "extensions": ["sse"] + }, + "application/vnd.las": { + "source": "iana" + }, + "application/vnd.las.las+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.las.las+xml": { + "source": "iana", + "compressible": true, + "extensions": ["lasxml"] + }, + "application/vnd.laszip": { + "source": "iana" + }, + "application/vnd.leap+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.liberty-request+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.llamagraphics.life-balance.desktop": { + "source": "iana", + "extensions": ["lbd"] + }, + "application/vnd.llamagraphics.life-balance.exchange+xml": { + "source": "iana", + "compressible": true, + "extensions": ["lbe"] + }, + "application/vnd.logipipe.circuit+zip": { + "source": "iana", + "compressible": false + }, + "application/vnd.loom": { + "source": "iana" + }, + "application/vnd.lotus-1-2-3": { + "source": "iana", + "extensions": ["123"] + }, + 
"application/vnd.lotus-approach": { + "source": "iana", + "extensions": ["apr"] + }, + "application/vnd.lotus-freelance": { + "source": "iana", + "extensions": ["pre"] + }, + "application/vnd.lotus-notes": { + "source": "iana", + "extensions": ["nsf"] + }, + "application/vnd.lotus-organizer": { + "source": "iana", + "extensions": ["org"] + }, + "application/vnd.lotus-screencam": { + "source": "iana", + "extensions": ["scm"] + }, + "application/vnd.lotus-wordpro": { + "source": "iana", + "extensions": ["lwp"] + }, + "application/vnd.macports.portpkg": { + "source": "iana", + "extensions": ["portpkg"] + }, + "application/vnd.mapbox-vector-tile": { + "source": "iana", + "extensions": ["mvt"] + }, + "application/vnd.marlin.drm.actiontoken+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.marlin.drm.conftoken+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.marlin.drm.license+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.marlin.drm.mdcf": { + "source": "iana" + }, + "application/vnd.mason+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.maxmind.maxmind-db": { + "source": "iana" + }, + "application/vnd.mcd": { + "source": "iana", + "extensions": ["mcd"] + }, + "application/vnd.medcalcdata": { + "source": "iana", + "extensions": ["mc1"] + }, + "application/vnd.mediastation.cdkey": { + "source": "iana", + "extensions": ["cdkey"] + }, + "application/vnd.meridian-slingshot": { + "source": "iana" + }, + "application/vnd.mfer": { + "source": "iana", + "extensions": ["mwf"] + }, + "application/vnd.mfmp": { + "source": "iana", + "extensions": ["mfm"] + }, + "application/vnd.micro+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.micrografx.flo": { + "source": "iana", + "extensions": ["flo"] + }, + "application/vnd.micrografx.igx": { + "source": "iana", + "extensions": ["igx"] + }, + "application/vnd.microsoft.portable-executable": { + "source": "iana" + }, + "application/vnd.microsoft.windows.thumbnail-cache": { + "source": "iana" + }, + "application/vnd.miele+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.mif": { + "source": "iana", + "extensions": ["mif"] + }, + "application/vnd.minisoft-hp3000-save": { + "source": "iana" + }, + "application/vnd.mitsubishi.misty-guard.trustweb": { + "source": "iana" + }, + "application/vnd.mobius.daf": { + "source": "iana", + "extensions": ["daf"] + }, + "application/vnd.mobius.dis": { + "source": "iana", + "extensions": ["dis"] + }, + "application/vnd.mobius.mbk": { + "source": "iana", + "extensions": ["mbk"] + }, + "application/vnd.mobius.mqy": { + "source": "iana", + "extensions": ["mqy"] + }, + "application/vnd.mobius.msl": { + "source": "iana", + "extensions": ["msl"] + }, + "application/vnd.mobius.plc": { + "source": "iana", + "extensions": ["plc"] + }, + "application/vnd.mobius.txf": { + "source": "iana", + "extensions": ["txf"] + }, + "application/vnd.mophun.application": { + "source": "iana", + "extensions": ["mpn"] + }, + "application/vnd.mophun.certificate": { + "source": "iana", + "extensions": ["mpc"] + }, + "application/vnd.motorola.flexsuite": { + "source": "iana" + }, + "application/vnd.motorola.flexsuite.adsi": { + "source": "iana" + }, + "application/vnd.motorola.flexsuite.fis": { + "source": "iana" + }, + "application/vnd.motorola.flexsuite.gotap": { + "source": "iana" + }, + "application/vnd.motorola.flexsuite.kmr": { + "source": "iana" + }, + "application/vnd.motorola.flexsuite.ttc": { + 
"source": "iana" + }, + "application/vnd.motorola.flexsuite.wem": { + "source": "iana" + }, + "application/vnd.motorola.iprm": { + "source": "iana" + }, + "application/vnd.mozilla.xul+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xul"] + }, + "application/vnd.ms-3mfdocument": { + "source": "iana" + }, + "application/vnd.ms-artgalry": { + "source": "iana", + "extensions": ["cil"] + }, + "application/vnd.ms-asf": { + "source": "iana" + }, + "application/vnd.ms-cab-compressed": { + "source": "iana", + "extensions": ["cab"] + }, + "application/vnd.ms-color.iccprofile": { + "source": "apache" + }, + "application/vnd.ms-excel": { + "source": "iana", + "compressible": false, + "extensions": ["xls","xlm","xla","xlc","xlt","xlw"] + }, + "application/vnd.ms-excel.addin.macroenabled.12": { + "source": "iana", + "extensions": ["xlam"] + }, + "application/vnd.ms-excel.sheet.binary.macroenabled.12": { + "source": "iana", + "extensions": ["xlsb"] + }, + "application/vnd.ms-excel.sheet.macroenabled.12": { + "source": "iana", + "extensions": ["xlsm"] + }, + "application/vnd.ms-excel.template.macroenabled.12": { + "source": "iana", + "extensions": ["xltm"] + }, + "application/vnd.ms-fontobject": { + "source": "iana", + "compressible": true, + "extensions": ["eot"] + }, + "application/vnd.ms-htmlhelp": { + "source": "iana", + "extensions": ["chm"] + }, + "application/vnd.ms-ims": { + "source": "iana", + "extensions": ["ims"] + }, + "application/vnd.ms-lrm": { + "source": "iana", + "extensions": ["lrm"] + }, + "application/vnd.ms-office.activex+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.ms-officetheme": { + "source": "iana", + "extensions": ["thmx"] + }, + "application/vnd.ms-opentype": { + "source": "apache", + "compressible": true + }, + "application/vnd.ms-outlook": { + "compressible": false, + "extensions": ["msg"] + }, + "application/vnd.ms-package.obfuscated-opentype": { + "source": "apache" + }, + "application/vnd.ms-pki.seccat": { + "source": "apache", + "extensions": ["cat"] + }, + "application/vnd.ms-pki.stl": { + "source": "apache", + "extensions": ["stl"] + }, + "application/vnd.ms-playready.initiator+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.ms-powerpoint": { + "source": "iana", + "compressible": false, + "extensions": ["ppt","pps","pot"] + }, + "application/vnd.ms-powerpoint.addin.macroenabled.12": { + "source": "iana", + "extensions": ["ppam"] + }, + "application/vnd.ms-powerpoint.presentation.macroenabled.12": { + "source": "iana", + "extensions": ["pptm"] + }, + "application/vnd.ms-powerpoint.slide.macroenabled.12": { + "source": "iana", + "extensions": ["sldm"] + }, + "application/vnd.ms-powerpoint.slideshow.macroenabled.12": { + "source": "iana", + "extensions": ["ppsm"] + }, + "application/vnd.ms-powerpoint.template.macroenabled.12": { + "source": "iana", + "extensions": ["potm"] + }, + "application/vnd.ms-printdevicecapabilities+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.ms-printing.printticket+xml": { + "source": "apache", + "compressible": true + }, + "application/vnd.ms-printschematicket+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.ms-project": { + "source": "iana", + "extensions": ["mpp","mpt"] + }, + "application/vnd.ms-tnef": { + "source": "iana" + }, + "application/vnd.ms-windows.devicepairing": { + "source": "iana" + }, + "application/vnd.ms-windows.nwprinting.oob": { + "source": "iana" + }, + 
"application/vnd.ms-windows.printerpairing": { + "source": "iana" + }, + "application/vnd.ms-windows.wsd.oob": { + "source": "iana" + }, + "application/vnd.ms-wmdrm.lic-chlg-req": { + "source": "iana" + }, + "application/vnd.ms-wmdrm.lic-resp": { + "source": "iana" + }, + "application/vnd.ms-wmdrm.meter-chlg-req": { + "source": "iana" + }, + "application/vnd.ms-wmdrm.meter-resp": { + "source": "iana" + }, + "application/vnd.ms-word.document.macroenabled.12": { + "source": "iana", + "extensions": ["docm"] + }, + "application/vnd.ms-word.template.macroenabled.12": { + "source": "iana", + "extensions": ["dotm"] + }, + "application/vnd.ms-works": { + "source": "iana", + "extensions": ["wps","wks","wcm","wdb"] + }, + "application/vnd.ms-wpl": { + "source": "iana", + "extensions": ["wpl"] + }, + "application/vnd.ms-xpsdocument": { + "source": "iana", + "compressible": false, + "extensions": ["xps"] + }, + "application/vnd.msa-disk-image": { + "source": "iana" + }, + "application/vnd.mseq": { + "source": "iana", + "extensions": ["mseq"] + }, + "application/vnd.msign": { + "source": "iana" + }, + "application/vnd.multiad.creator": { + "source": "iana" + }, + "application/vnd.multiad.creator.cif": { + "source": "iana" + }, + "application/vnd.music-niff": { + "source": "iana" + }, + "application/vnd.musician": { + "source": "iana", + "extensions": ["mus"] + }, + "application/vnd.muvee.style": { + "source": "iana", + "extensions": ["msty"] + }, + "application/vnd.mynfc": { + "source": "iana", + "extensions": ["taglet"] + }, + "application/vnd.ncd.control": { + "source": "iana" + }, + "application/vnd.ncd.reference": { + "source": "iana" + }, + "application/vnd.nearst.inv+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.nebumind.line": { + "source": "iana" + }, + "application/vnd.nervana": { + "source": "iana" + }, + "application/vnd.netfpx": { + "source": "iana" + }, + "application/vnd.neurolanguage.nlu": { + "source": "iana", + "extensions": ["nlu"] + }, + "application/vnd.nimn": { + "source": "iana" + }, + "application/vnd.nintendo.nitro.rom": { + "source": "iana" + }, + "application/vnd.nintendo.snes.rom": { + "source": "iana" + }, + "application/vnd.nitf": { + "source": "iana", + "extensions": ["ntf","nitf"] + }, + "application/vnd.noblenet-directory": { + "source": "iana", + "extensions": ["nnd"] + }, + "application/vnd.noblenet-sealer": { + "source": "iana", + "extensions": ["nns"] + }, + "application/vnd.noblenet-web": { + "source": "iana", + "extensions": ["nnw"] + }, + "application/vnd.nokia.catalogs": { + "source": "iana" + }, + "application/vnd.nokia.conml+wbxml": { + "source": "iana" + }, + "application/vnd.nokia.conml+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.nokia.iptv.config+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.nokia.isds-radio-presets": { + "source": "iana" + }, + "application/vnd.nokia.landmark+wbxml": { + "source": "iana" + }, + "application/vnd.nokia.landmark+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.nokia.landmarkcollection+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.nokia.n-gage.ac+xml": { + "source": "iana", + "compressible": true, + "extensions": ["ac"] + }, + "application/vnd.nokia.n-gage.data": { + "source": "iana", + "extensions": ["ngdat"] + }, + "application/vnd.nokia.n-gage.symbian.install": { + "source": "iana", + "extensions": ["n-gage"] + }, + "application/vnd.nokia.ncd": { + "source": "iana" + }, + 
"application/vnd.nokia.pcd+wbxml": { + "source": "iana" + }, + "application/vnd.nokia.pcd+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.nokia.radio-preset": { + "source": "iana", + "extensions": ["rpst"] + }, + "application/vnd.nokia.radio-presets": { + "source": "iana", + "extensions": ["rpss"] + }, + "application/vnd.novadigm.edm": { + "source": "iana", + "extensions": ["edm"] + }, + "application/vnd.novadigm.edx": { + "source": "iana", + "extensions": ["edx"] + }, + "application/vnd.novadigm.ext": { + "source": "iana", + "extensions": ["ext"] + }, + "application/vnd.ntt-local.content-share": { + "source": "iana" + }, + "application/vnd.ntt-local.file-transfer": { + "source": "iana" + }, + "application/vnd.ntt-local.ogw_remote-access": { + "source": "iana" + }, + "application/vnd.ntt-local.sip-ta_remote": { + "source": "iana" + }, + "application/vnd.ntt-local.sip-ta_tcp_stream": { + "source": "iana" + }, + "application/vnd.oasis.opendocument.chart": { + "source": "iana", + "extensions": ["odc"] + }, + "application/vnd.oasis.opendocument.chart-template": { + "source": "iana", + "extensions": ["otc"] + }, + "application/vnd.oasis.opendocument.database": { + "source": "iana", + "extensions": ["odb"] + }, + "application/vnd.oasis.opendocument.formula": { + "source": "iana", + "extensions": ["odf"] + }, + "application/vnd.oasis.opendocument.formula-template": { + "source": "iana", + "extensions": ["odft"] + }, + "application/vnd.oasis.opendocument.graphics": { + "source": "iana", + "compressible": false, + "extensions": ["odg"] + }, + "application/vnd.oasis.opendocument.graphics-template": { + "source": "iana", + "extensions": ["otg"] + }, + "application/vnd.oasis.opendocument.image": { + "source": "iana", + "extensions": ["odi"] + }, + "application/vnd.oasis.opendocument.image-template": { + "source": "iana", + "extensions": ["oti"] + }, + "application/vnd.oasis.opendocument.presentation": { + "source": "iana", + "compressible": false, + "extensions": ["odp"] + }, + "application/vnd.oasis.opendocument.presentation-template": { + "source": "iana", + "extensions": ["otp"] + }, + "application/vnd.oasis.opendocument.spreadsheet": { + "source": "iana", + "compressible": false, + "extensions": ["ods"] + }, + "application/vnd.oasis.opendocument.spreadsheet-template": { + "source": "iana", + "extensions": ["ots"] + }, + "application/vnd.oasis.opendocument.text": { + "source": "iana", + "compressible": false, + "extensions": ["odt"] + }, + "application/vnd.oasis.opendocument.text-master": { + "source": "iana", + "extensions": ["odm"] + }, + "application/vnd.oasis.opendocument.text-template": { + "source": "iana", + "extensions": ["ott"] + }, + "application/vnd.oasis.opendocument.text-web": { + "source": "iana", + "extensions": ["oth"] + }, + "application/vnd.obn": { + "source": "iana" + }, + "application/vnd.ocf+cbor": { + "source": "iana" + }, + "application/vnd.oci.image.manifest.v1+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.oftn.l10n+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.contentaccessdownload+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.contentaccessstreaming+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.cspg-hexbinary": { + "source": "iana" + }, + "application/vnd.oipf.dae.svg+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.dae.xhtml+xml": { + "source": "iana", + "compressible": true + }, + 
"application/vnd.oipf.mippvcontrolmessage+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.pae.gem": { + "source": "iana" + }, + "application/vnd.oipf.spdiscovery+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.spdlist+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.ueprofile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oipf.userprofile+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.olpc-sugar": { + "source": "iana", + "extensions": ["xo"] + }, + "application/vnd.oma-scws-config": { + "source": "iana" + }, + "application/vnd.oma-scws-http-request": { + "source": "iana" + }, + "application/vnd.oma-scws-http-response": { + "source": "iana" + }, + "application/vnd.oma.bcast.associated-procedure-parameter+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.drm-trigger+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.imd+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.ltkm": { + "source": "iana" + }, + "application/vnd.oma.bcast.notification+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.provisioningtrigger": { + "source": "iana" + }, + "application/vnd.oma.bcast.sgboot": { + "source": "iana" + }, + "application/vnd.oma.bcast.sgdd+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.sgdu": { + "source": "iana" + }, + "application/vnd.oma.bcast.simple-symbol-container": { + "source": "iana" + }, + "application/vnd.oma.bcast.smartcard-trigger+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.sprov+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.bcast.stkm": { + "source": "iana" + }, + "application/vnd.oma.cab-address-book+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.cab-feature-handler+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.cab-pcc+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.cab-subs-invite+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.cab-user-prefs+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.dcd": { + "source": "iana" + }, + "application/vnd.oma.dcdc": { + "source": "iana" + }, + "application/vnd.oma.dd2+xml": { + "source": "iana", + "compressible": true, + "extensions": ["dd2"] + }, + "application/vnd.oma.drm.risd+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.group-usage-list+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.lwm2m+cbor": { + "source": "iana" + }, + "application/vnd.oma.lwm2m+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.lwm2m+tlv": { + "source": "iana" + }, + "application/vnd.oma.pal+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.poc.detailed-progress-report+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.poc.final-report+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.poc.groups+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.poc.invocation-descriptor+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.poc.optimized-progress-report+xml": { + "source": "iana", + 
"compressible": true + }, + "application/vnd.oma.push": { + "source": "iana" + }, + "application/vnd.oma.scidm.messages+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oma.xcap-directory+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.omads-email+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/vnd.omads-file+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/vnd.omads-folder+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/vnd.omaloc-supl-init": { + "source": "iana" + }, + "application/vnd.onepager": { + "source": "iana" + }, + "application/vnd.onepagertamp": { + "source": "iana" + }, + "application/vnd.onepagertamx": { + "source": "iana" + }, + "application/vnd.onepagertat": { + "source": "iana" + }, + "application/vnd.onepagertatp": { + "source": "iana" + }, + "application/vnd.onepagertatx": { + "source": "iana" + }, + "application/vnd.openblox.game+xml": { + "source": "iana", + "compressible": true, + "extensions": ["obgx"] + }, + "application/vnd.openblox.game-binary": { + "source": "iana" + }, + "application/vnd.openeye.oeb": { + "source": "iana" + }, + "application/vnd.openofficeorg.extension": { + "source": "apache", + "extensions": ["oxt"] + }, + "application/vnd.openstreetmap.data+xml": { + "source": "iana", + "compressible": true, + "extensions": ["osm"] + }, + "application/vnd.opentimestamps.ots": { + "source": "iana" + }, + "application/vnd.openxmlformats-officedocument.custom-properties+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.customxmlproperties+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawing+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawingml.chart+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawingml.chartshapes+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawingml.diagramcolors+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawingml.diagramdata+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawingml.diagramlayout+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.drawingml.diagramstyle+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.extended-properties+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.commentauthors+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.comments+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.handoutmaster+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.notesmaster+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.notesslide+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.presentation": { + "source": "iana", + 
"compressible": false, + "extensions": ["pptx"] + }, + "application/vnd.openxmlformats-officedocument.presentationml.presentation.main+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.presprops+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.slide": { + "source": "iana", + "extensions": ["sldx"] + }, + "application/vnd.openxmlformats-officedocument.presentationml.slide+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.slidelayout+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.slidemaster+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.slideshow": { + "source": "iana", + "extensions": ["ppsx"] + }, + "application/vnd.openxmlformats-officedocument.presentationml.slideshow.main+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.slideupdateinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.tablestyles+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.tags+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.template": { + "source": "iana", + "extensions": ["potx"] + }, + "application/vnd.openxmlformats-officedocument.presentationml.template.main+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.presentationml.viewprops+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.calcchain+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.chartsheet+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.comments+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.connections+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.dialogsheet+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.externallink+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcachedefinition+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.pivotcacherecords+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.pivottable+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.querytable+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.revisionheaders+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.revisionlog+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.sharedstrings+xml": { + 
"source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet": { + "source": "iana", + "compressible": false, + "extensions": ["xlsx"] + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet.main+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.sheetmetadata+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.styles+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.table+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.tablesinglecells+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.template": { + "source": "iana", + "extensions": ["xltx"] + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.template.main+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.usernames+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.volatiledependencies+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.spreadsheetml.worksheet+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.theme+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.themeoverride+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.vmldrawing": { + "source": "iana" + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.comments+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.document": { + "source": "iana", + "compressible": false, + "extensions": ["docx"] + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.document.glossary+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.document.main+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.endnotes+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.fonttable+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.footer+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.footnotes+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.numbering+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.settings+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.styles+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.template": { + "source": "iana", + "extensions": ["dotx"] + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.template.main+xml": { + "source": "iana", + "compressible": 
true + }, + "application/vnd.openxmlformats-officedocument.wordprocessingml.websettings+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-package.core-properties+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-package.digital-signature-xmlsignature+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.openxmlformats-package.relationships+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oracle.resource+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.orange.indata": { + "source": "iana" + }, + "application/vnd.osa.netdeploy": { + "source": "iana" + }, + "application/vnd.osgeo.mapguide.package": { + "source": "iana", + "extensions": ["mgp"] + }, + "application/vnd.osgi.bundle": { + "source": "iana" + }, + "application/vnd.osgi.dp": { + "source": "iana", + "extensions": ["dp"] + }, + "application/vnd.osgi.subsystem": { + "source": "iana", + "extensions": ["esa"] + }, + "application/vnd.otps.ct-kip+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.oxli.countgraph": { + "source": "iana" + }, + "application/vnd.pagerduty+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.palm": { + "source": "iana", + "extensions": ["pdb","pqa","oprc"] + }, + "application/vnd.panoply": { + "source": "iana" + }, + "application/vnd.paos.xml": { + "source": "iana" + }, + "application/vnd.patentdive": { + "source": "iana" + }, + "application/vnd.patientecommsdoc": { + "source": "iana" + }, + "application/vnd.pawaafile": { + "source": "iana", + "extensions": ["paw"] + }, + "application/vnd.pcos": { + "source": "iana" + }, + "application/vnd.pg.format": { + "source": "iana", + "extensions": ["str"] + }, + "application/vnd.pg.osasli": { + "source": "iana", + "extensions": ["ei6"] + }, + "application/vnd.piaccess.application-licence": { + "source": "iana" + }, + "application/vnd.picsel": { + "source": "iana", + "extensions": ["efif"] + }, + "application/vnd.pmi.widget": { + "source": "iana", + "extensions": ["wg"] + }, + "application/vnd.poc.group-advertisement+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.pocketlearn": { + "source": "iana", + "extensions": ["plf"] + }, + "application/vnd.powerbuilder6": { + "source": "iana", + "extensions": ["pbd"] + }, + "application/vnd.powerbuilder6-s": { + "source": "iana" + }, + "application/vnd.powerbuilder7": { + "source": "iana" + }, + "application/vnd.powerbuilder7-s": { + "source": "iana" + }, + "application/vnd.powerbuilder75": { + "source": "iana" + }, + "application/vnd.powerbuilder75-s": { + "source": "iana" + }, + "application/vnd.preminet": { + "source": "iana" + }, + "application/vnd.previewsystems.box": { + "source": "iana", + "extensions": ["box"] + }, + "application/vnd.proteus.magazine": { + "source": "iana", + "extensions": ["mgz"] + }, + "application/vnd.psfs": { + "source": "iana" + }, + "application/vnd.publishare-delta-tree": { + "source": "iana", + "extensions": ["qps"] + }, + "application/vnd.pvi.ptid1": { + "source": "iana", + "extensions": ["ptid"] + }, + "application/vnd.pwg-multiplexed": { + "source": "iana" + }, + "application/vnd.pwg-xhtml-print+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.qualcomm.brew-app-res": { + "source": "iana" + }, + "application/vnd.quarantainenet": { + "source": "iana" + }, + "application/vnd.quark.quarkxpress": { + "source": "iana", + "extensions": 
["qxd","qxt","qwd","qwt","qxl","qxb"] + }, + "application/vnd.quobject-quoxdocument": { + "source": "iana" + }, + "application/vnd.radisys.moml+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-audit+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-audit-conf+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-audit-conn+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-audit-dialog+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-audit-stream+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-conf+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog-base+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog-fax-detect+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog-fax-sendrecv+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog-group+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog-speech+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.radisys.msml-dialog-transform+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.rainstor.data": { + "source": "iana" + }, + "application/vnd.rapid": { + "source": "iana" + }, + "application/vnd.rar": { + "source": "iana", + "extensions": ["rar"] + }, + "application/vnd.realvnc.bed": { + "source": "iana", + "extensions": ["bed"] + }, + "application/vnd.recordare.musicxml": { + "source": "iana", + "extensions": ["mxl"] + }, + "application/vnd.recordare.musicxml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["musicxml"] + }, + "application/vnd.renlearn.rlprint": { + "source": "iana" + }, + "application/vnd.resilient.logic": { + "source": "iana" + }, + "application/vnd.restful+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.rig.cryptonote": { + "source": "iana", + "extensions": ["cryptonote"] + }, + "application/vnd.rim.cod": { + "source": "apache", + "extensions": ["cod"] + }, + "application/vnd.rn-realmedia": { + "source": "apache", + "extensions": ["rm"] + }, + "application/vnd.rn-realmedia-vbr": { + "source": "apache", + "extensions": ["rmvb"] + }, + "application/vnd.route66.link66+xml": { + "source": "iana", + "compressible": true, + "extensions": ["link66"] + }, + "application/vnd.rs-274x": { + "source": "iana" + }, + "application/vnd.ruckus.download": { + "source": "iana" + }, + "application/vnd.s3sms": { + "source": "iana" + }, + "application/vnd.sailingtracker.track": { + "source": "iana", + "extensions": ["st"] + }, + "application/vnd.sar": { + "source": "iana" + }, + "application/vnd.sbm.cid": { + "source": "iana" + }, + "application/vnd.sbm.mid2": { + "source": "iana" + }, + "application/vnd.scribus": { + "source": "iana" + }, + "application/vnd.sealed.3df": { + "source": "iana" + }, + "application/vnd.sealed.csf": { + "source": "iana" + }, + "application/vnd.sealed.doc": { + "source": "iana" + }, + "application/vnd.sealed.eml": { + "source": "iana" + }, + "application/vnd.sealed.mht": { + "source": "iana" + }, + 
"application/vnd.sealed.net": { + "source": "iana" + }, + "application/vnd.sealed.ppt": { + "source": "iana" + }, + "application/vnd.sealed.tiff": { + "source": "iana" + }, + "application/vnd.sealed.xls": { + "source": "iana" + }, + "application/vnd.sealedmedia.softseal.html": { + "source": "iana" + }, + "application/vnd.sealedmedia.softseal.pdf": { + "source": "iana" + }, + "application/vnd.seemail": { + "source": "iana", + "extensions": ["see"] + }, + "application/vnd.seis+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.sema": { + "source": "iana", + "extensions": ["sema"] + }, + "application/vnd.semd": { + "source": "iana", + "extensions": ["semd"] + }, + "application/vnd.semf": { + "source": "iana", + "extensions": ["semf"] + }, + "application/vnd.shade-save-file": { + "source": "iana" + }, + "application/vnd.shana.informed.formdata": { + "source": "iana", + "extensions": ["ifm"] + }, + "application/vnd.shana.informed.formtemplate": { + "source": "iana", + "extensions": ["itp"] + }, + "application/vnd.shana.informed.interchange": { + "source": "iana", + "extensions": ["iif"] + }, + "application/vnd.shana.informed.package": { + "source": "iana", + "extensions": ["ipk"] + }, + "application/vnd.shootproof+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.shopkick+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.shp": { + "source": "iana" + }, + "application/vnd.shx": { + "source": "iana" + }, + "application/vnd.sigrok.session": { + "source": "iana" + }, + "application/vnd.simtech-mindmapper": { + "source": "iana", + "extensions": ["twd","twds"] + }, + "application/vnd.siren+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.smaf": { + "source": "iana", + "extensions": ["mmf"] + }, + "application/vnd.smart.notebook": { + "source": "iana" + }, + "application/vnd.smart.teacher": { + "source": "iana", + "extensions": ["teacher"] + }, + "application/vnd.snesdev-page-table": { + "source": "iana" + }, + "application/vnd.software602.filler.form+xml": { + "source": "iana", + "compressible": true, + "extensions": ["fo"] + }, + "application/vnd.software602.filler.form-xml-zip": { + "source": "iana" + }, + "application/vnd.solent.sdkm+xml": { + "source": "iana", + "compressible": true, + "extensions": ["sdkm","sdkd"] + }, + "application/vnd.spotfire.dxp": { + "source": "iana", + "extensions": ["dxp"] + }, + "application/vnd.spotfire.sfs": { + "source": "iana", + "extensions": ["sfs"] + }, + "application/vnd.sqlite3": { + "source": "iana" + }, + "application/vnd.sss-cod": { + "source": "iana" + }, + "application/vnd.sss-dtf": { + "source": "iana" + }, + "application/vnd.sss-ntf": { + "source": "iana" + }, + "application/vnd.stardivision.calc": { + "source": "apache", + "extensions": ["sdc"] + }, + "application/vnd.stardivision.draw": { + "source": "apache", + "extensions": ["sda"] + }, + "application/vnd.stardivision.impress": { + "source": "apache", + "extensions": ["sdd"] + }, + "application/vnd.stardivision.math": { + "source": "apache", + "extensions": ["smf"] + }, + "application/vnd.stardivision.writer": { + "source": "apache", + "extensions": ["sdw","vor"] + }, + "application/vnd.stardivision.writer-global": { + "source": "apache", + "extensions": ["sgl"] + }, + "application/vnd.stepmania.package": { + "source": "iana", + "extensions": ["smzip"] + }, + "application/vnd.stepmania.stepchart": { + "source": "iana", + "extensions": ["sm"] + }, + "application/vnd.street-stream": { + "source": 
"iana" + }, + "application/vnd.sun.wadl+xml": { + "source": "iana", + "compressible": true, + "extensions": ["wadl"] + }, + "application/vnd.sun.xml.calc": { + "source": "apache", + "extensions": ["sxc"] + }, + "application/vnd.sun.xml.calc.template": { + "source": "apache", + "extensions": ["stc"] + }, + "application/vnd.sun.xml.draw": { + "source": "apache", + "extensions": ["sxd"] + }, + "application/vnd.sun.xml.draw.template": { + "source": "apache", + "extensions": ["std"] + }, + "application/vnd.sun.xml.impress": { + "source": "apache", + "extensions": ["sxi"] + }, + "application/vnd.sun.xml.impress.template": { + "source": "apache", + "extensions": ["sti"] + }, + "application/vnd.sun.xml.math": { + "source": "apache", + "extensions": ["sxm"] + }, + "application/vnd.sun.xml.writer": { + "source": "apache", + "extensions": ["sxw"] + }, + "application/vnd.sun.xml.writer.global": { + "source": "apache", + "extensions": ["sxg"] + }, + "application/vnd.sun.xml.writer.template": { + "source": "apache", + "extensions": ["stw"] + }, + "application/vnd.sus-calendar": { + "source": "iana", + "extensions": ["sus","susp"] + }, + "application/vnd.svd": { + "source": "iana", + "extensions": ["svd"] + }, + "application/vnd.swiftview-ics": { + "source": "iana" + }, + "application/vnd.sycle+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.symbian.install": { + "source": "apache", + "extensions": ["sis","sisx"] + }, + "application/vnd.syncml+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["xsm"] + }, + "application/vnd.syncml.dm+wbxml": { + "source": "iana", + "charset": "UTF-8", + "extensions": ["bdm"] + }, + "application/vnd.syncml.dm+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["xdm"] + }, + "application/vnd.syncml.dm.notification": { + "source": "iana" + }, + "application/vnd.syncml.dmddf+wbxml": { + "source": "iana" + }, + "application/vnd.syncml.dmddf+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["ddf"] + }, + "application/vnd.syncml.dmtnds+wbxml": { + "source": "iana" + }, + "application/vnd.syncml.dmtnds+xml": { + "source": "iana", + "charset": "UTF-8", + "compressible": true + }, + "application/vnd.syncml.ds.notification": { + "source": "iana" + }, + "application/vnd.tableschema+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.tao.intent-module-archive": { + "source": "iana", + "extensions": ["tao"] + }, + "application/vnd.tcpdump.pcap": { + "source": "iana", + "extensions": ["pcap","cap","dmp"] + }, + "application/vnd.think-cell.ppttc+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.tmd.mediaflex.api+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.tml": { + "source": "iana" + }, + "application/vnd.tmobile-livetv": { + "source": "iana", + "extensions": ["tmo"] + }, + "application/vnd.tri.onesource": { + "source": "iana" + }, + "application/vnd.trid.tpt": { + "source": "iana", + "extensions": ["tpt"] + }, + "application/vnd.triscape.mxs": { + "source": "iana", + "extensions": ["mxs"] + }, + "application/vnd.trueapp": { + "source": "iana", + "extensions": ["tra"] + }, + "application/vnd.truedoc": { + "source": "iana" + }, + "application/vnd.ubisoft.webplayer": { + "source": "iana" + }, + "application/vnd.ufdl": { + "source": "iana", + "extensions": ["ufd","ufdl"] + }, + "application/vnd.uiq.theme": { + "source": "iana", + "extensions": ["utz"] + }, + 
"application/vnd.umajin": { + "source": "iana", + "extensions": ["umj"] + }, + "application/vnd.unity": { + "source": "iana", + "extensions": ["unityweb"] + }, + "application/vnd.uoml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["uoml"] + }, + "application/vnd.uplanet.alert": { + "source": "iana" + }, + "application/vnd.uplanet.alert-wbxml": { + "source": "iana" + }, + "application/vnd.uplanet.bearer-choice": { + "source": "iana" + }, + "application/vnd.uplanet.bearer-choice-wbxml": { + "source": "iana" + }, + "application/vnd.uplanet.cacheop": { + "source": "iana" + }, + "application/vnd.uplanet.cacheop-wbxml": { + "source": "iana" + }, + "application/vnd.uplanet.channel": { + "source": "iana" + }, + "application/vnd.uplanet.channel-wbxml": { + "source": "iana" + }, + "application/vnd.uplanet.list": { + "source": "iana" + }, + "application/vnd.uplanet.list-wbxml": { + "source": "iana" + }, + "application/vnd.uplanet.listcmd": { + "source": "iana" + }, + "application/vnd.uplanet.listcmd-wbxml": { + "source": "iana" + }, + "application/vnd.uplanet.signal": { + "source": "iana" + }, + "application/vnd.uri-map": { + "source": "iana" + }, + "application/vnd.valve.source.material": { + "source": "iana" + }, + "application/vnd.vcx": { + "source": "iana", + "extensions": ["vcx"] + }, + "application/vnd.vd-study": { + "source": "iana" + }, + "application/vnd.vectorworks": { + "source": "iana" + }, + "application/vnd.vel+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.verimatrix.vcas": { + "source": "iana" + }, + "application/vnd.veritone.aion+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.veryant.thin": { + "source": "iana" + }, + "application/vnd.ves.encrypted": { + "source": "iana" + }, + "application/vnd.vidsoft.vidconference": { + "source": "iana" + }, + "application/vnd.visio": { + "source": "iana", + "extensions": ["vsd","vst","vss","vsw"] + }, + "application/vnd.visionary": { + "source": "iana", + "extensions": ["vis"] + }, + "application/vnd.vividence.scriptfile": { + "source": "iana" + }, + "application/vnd.vsf": { + "source": "iana", + "extensions": ["vsf"] + }, + "application/vnd.wap.sic": { + "source": "iana" + }, + "application/vnd.wap.slc": { + "source": "iana" + }, + "application/vnd.wap.wbxml": { + "source": "iana", + "charset": "UTF-8", + "extensions": ["wbxml"] + }, + "application/vnd.wap.wmlc": { + "source": "iana", + "extensions": ["wmlc"] + }, + "application/vnd.wap.wmlscriptc": { + "source": "iana", + "extensions": ["wmlsc"] + }, + "application/vnd.webturbo": { + "source": "iana", + "extensions": ["wtb"] + }, + "application/vnd.wfa.dpp": { + "source": "iana" + }, + "application/vnd.wfa.p2p": { + "source": "iana" + }, + "application/vnd.wfa.wsc": { + "source": "iana" + }, + "application/vnd.windows.devicepairing": { + "source": "iana" + }, + "application/vnd.wmc": { + "source": "iana" + }, + "application/vnd.wmf.bootstrap": { + "source": "iana" + }, + "application/vnd.wolfram.mathematica": { + "source": "iana" + }, + "application/vnd.wolfram.mathematica.package": { + "source": "iana" + }, + "application/vnd.wolfram.player": { + "source": "iana", + "extensions": ["nbp"] + }, + "application/vnd.wordperfect": { + "source": "iana", + "extensions": ["wpd"] + }, + "application/vnd.wqd": { + "source": "iana", + "extensions": ["wqd"] + }, + "application/vnd.wrq-hp3000-labelled": { + "source": "iana" + }, + "application/vnd.wt.stf": { + "source": "iana", + "extensions": ["stf"] + }, + 
"application/vnd.wv.csp+wbxml": { + "source": "iana" + }, + "application/vnd.wv.csp+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.wv.ssp+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.xacml+json": { + "source": "iana", + "compressible": true + }, + "application/vnd.xara": { + "source": "iana", + "extensions": ["xar"] + }, + "application/vnd.xfdl": { + "source": "iana", + "extensions": ["xfdl"] + }, + "application/vnd.xfdl.webform": { + "source": "iana" + }, + "application/vnd.xmi+xml": { + "source": "iana", + "compressible": true + }, + "application/vnd.xmpie.cpkg": { + "source": "iana" + }, + "application/vnd.xmpie.dpkg": { + "source": "iana" + }, + "application/vnd.xmpie.plan": { + "source": "iana" + }, + "application/vnd.xmpie.ppkg": { + "source": "iana" + }, + "application/vnd.xmpie.xlim": { + "source": "iana" + }, + "application/vnd.yamaha.hv-dic": { + "source": "iana", + "extensions": ["hvd"] + }, + "application/vnd.yamaha.hv-script": { + "source": "iana", + "extensions": ["hvs"] + }, + "application/vnd.yamaha.hv-voice": { + "source": "iana", + "extensions": ["hvp"] + }, + "application/vnd.yamaha.openscoreformat": { + "source": "iana", + "extensions": ["osf"] + }, + "application/vnd.yamaha.openscoreformat.osfpvg+xml": { + "source": "iana", + "compressible": true, + "extensions": ["osfpvg"] + }, + "application/vnd.yamaha.remote-setup": { + "source": "iana" + }, + "application/vnd.yamaha.smaf-audio": { + "source": "iana", + "extensions": ["saf"] + }, + "application/vnd.yamaha.smaf-phrase": { + "source": "iana", + "extensions": ["spf"] + }, + "application/vnd.yamaha.through-ngn": { + "source": "iana" + }, + "application/vnd.yamaha.tunnel-udpencap": { + "source": "iana" + }, + "application/vnd.yaoweme": { + "source": "iana" + }, + "application/vnd.yellowriver-custom-menu": { + "source": "iana", + "extensions": ["cmp"] + }, + "application/vnd.youtube.yt": { + "source": "iana" + }, + "application/vnd.zul": { + "source": "iana", + "extensions": ["zir","zirz"] + }, + "application/vnd.zzazz.deck+xml": { + "source": "iana", + "compressible": true, + "extensions": ["zaz"] + }, + "application/voicexml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["vxml"] + }, + "application/voucher-cms+json": { + "source": "iana", + "compressible": true + }, + "application/vq-rtcpxr": { + "source": "iana" + }, + "application/wasm": { + "source": "iana", + "compressible": true, + "extensions": ["wasm"] + }, + "application/watcherinfo+xml": { + "source": "iana", + "compressible": true + }, + "application/webpush-options+json": { + "source": "iana", + "compressible": true + }, + "application/whoispp-query": { + "source": "iana" + }, + "application/whoispp-response": { + "source": "iana" + }, + "application/widget": { + "source": "iana", + "extensions": ["wgt"] + }, + "application/winhlp": { + "source": "apache", + "extensions": ["hlp"] + }, + "application/wita": { + "source": "iana" + }, + "application/wordperfect5.1": { + "source": "iana" + }, + "application/wsdl+xml": { + "source": "iana", + "compressible": true, + "extensions": ["wsdl"] + }, + "application/wspolicy+xml": { + "source": "iana", + "compressible": true, + "extensions": ["wspolicy"] + }, + "application/x-7z-compressed": { + "source": "apache", + "compressible": false, + "extensions": ["7z"] + }, + "application/x-abiword": { + "source": "apache", + "extensions": ["abw"] + }, + "application/x-ace-compressed": { + "source": "apache", + "extensions": ["ace"] + }, + 
"application/x-amf": { + "source": "apache" + }, + "application/x-apple-diskimage": { + "source": "apache", + "extensions": ["dmg"] + }, + "application/x-arj": { + "compressible": false, + "extensions": ["arj"] + }, + "application/x-authorware-bin": { + "source": "apache", + "extensions": ["aab","x32","u32","vox"] + }, + "application/x-authorware-map": { + "source": "apache", + "extensions": ["aam"] + }, + "application/x-authorware-seg": { + "source": "apache", + "extensions": ["aas"] + }, + "application/x-bcpio": { + "source": "apache", + "extensions": ["bcpio"] + }, + "application/x-bdoc": { + "compressible": false, + "extensions": ["bdoc"] + }, + "application/x-bittorrent": { + "source": "apache", + "extensions": ["torrent"] + }, + "application/x-blorb": { + "source": "apache", + "extensions": ["blb","blorb"] + }, + "application/x-bzip": { + "source": "apache", + "compressible": false, + "extensions": ["bz"] + }, + "application/x-bzip2": { + "source": "apache", + "compressible": false, + "extensions": ["bz2","boz"] + }, + "application/x-cbr": { + "source": "apache", + "extensions": ["cbr","cba","cbt","cbz","cb7"] + }, + "application/x-cdlink": { + "source": "apache", + "extensions": ["vcd"] + }, + "application/x-cfs-compressed": { + "source": "apache", + "extensions": ["cfs"] + }, + "application/x-chat": { + "source": "apache", + "extensions": ["chat"] + }, + "application/x-chess-pgn": { + "source": "apache", + "extensions": ["pgn"] + }, + "application/x-chrome-extension": { + "extensions": ["crx"] + }, + "application/x-cocoa": { + "source": "nginx", + "extensions": ["cco"] + }, + "application/x-compress": { + "source": "apache" + }, + "application/x-conference": { + "source": "apache", + "extensions": ["nsc"] + }, + "application/x-cpio": { + "source": "apache", + "extensions": ["cpio"] + }, + "application/x-csh": { + "source": "apache", + "extensions": ["csh"] + }, + "application/x-deb": { + "compressible": false + }, + "application/x-debian-package": { + "source": "apache", + "extensions": ["deb","udeb"] + }, + "application/x-dgc-compressed": { + "source": "apache", + "extensions": ["dgc"] + }, + "application/x-director": { + "source": "apache", + "extensions": ["dir","dcr","dxr","cst","cct","cxt","w3d","fgd","swa"] + }, + "application/x-doom": { + "source": "apache", + "extensions": ["wad"] + }, + "application/x-dtbncx+xml": { + "source": "apache", + "compressible": true, + "extensions": ["ncx"] + }, + "application/x-dtbook+xml": { + "source": "apache", + "compressible": true, + "extensions": ["dtb"] + }, + "application/x-dtbresource+xml": { + "source": "apache", + "compressible": true, + "extensions": ["res"] + }, + "application/x-dvi": { + "source": "apache", + "compressible": false, + "extensions": ["dvi"] + }, + "application/x-envoy": { + "source": "apache", + "extensions": ["evy"] + }, + "application/x-eva": { + "source": "apache", + "extensions": ["eva"] + }, + "application/x-font-bdf": { + "source": "apache", + "extensions": ["bdf"] + }, + "application/x-font-dos": { + "source": "apache" + }, + "application/x-font-framemaker": { + "source": "apache" + }, + "application/x-font-ghostscript": { + "source": "apache", + "extensions": ["gsf"] + }, + "application/x-font-libgrx": { + "source": "apache" + }, + "application/x-font-linux-psf": { + "source": "apache", + "extensions": ["psf"] + }, + "application/x-font-pcf": { + "source": "apache", + "extensions": ["pcf"] + }, + "application/x-font-snf": { + "source": "apache", + "extensions": ["snf"] + }, + "application/x-font-speedo": { 
+ "source": "apache" + }, + "application/x-font-sunos-news": { + "source": "apache" + }, + "application/x-font-type1": { + "source": "apache", + "extensions": ["pfa","pfb","pfm","afm"] + }, + "application/x-font-vfont": { + "source": "apache" + }, + "application/x-freearc": { + "source": "apache", + "extensions": ["arc"] + }, + "application/x-futuresplash": { + "source": "apache", + "extensions": ["spl"] + }, + "application/x-gca-compressed": { + "source": "apache", + "extensions": ["gca"] + }, + "application/x-glulx": { + "source": "apache", + "extensions": ["ulx"] + }, + "application/x-gnumeric": { + "source": "apache", + "extensions": ["gnumeric"] + }, + "application/x-gramps-xml": { + "source": "apache", + "extensions": ["gramps"] + }, + "application/x-gtar": { + "source": "apache", + "extensions": ["gtar"] + }, + "application/x-gzip": { + "source": "apache" + }, + "application/x-hdf": { + "source": "apache", + "extensions": ["hdf"] + }, + "application/x-httpd-php": { + "compressible": true, + "extensions": ["php"] + }, + "application/x-install-instructions": { + "source": "apache", + "extensions": ["install"] + }, + "application/x-iso9660-image": { + "source": "apache", + "extensions": ["iso"] + }, + "application/x-iwork-keynote-sffkey": { + "extensions": ["key"] + }, + "application/x-iwork-numbers-sffnumbers": { + "extensions": ["numbers"] + }, + "application/x-iwork-pages-sffpages": { + "extensions": ["pages"] + }, + "application/x-java-archive-diff": { + "source": "nginx", + "extensions": ["jardiff"] + }, + "application/x-java-jnlp-file": { + "source": "apache", + "compressible": false, + "extensions": ["jnlp"] + }, + "application/x-javascript": { + "compressible": true + }, + "application/x-keepass2": { + "extensions": ["kdbx"] + }, + "application/x-latex": { + "source": "apache", + "compressible": false, + "extensions": ["latex"] + }, + "application/x-lua-bytecode": { + "extensions": ["luac"] + }, + "application/x-lzh-compressed": { + "source": "apache", + "extensions": ["lzh","lha"] + }, + "application/x-makeself": { + "source": "nginx", + "extensions": ["run"] + }, + "application/x-mie": { + "source": "apache", + "extensions": ["mie"] + }, + "application/x-mobipocket-ebook": { + "source": "apache", + "extensions": ["prc","mobi"] + }, + "application/x-mpegurl": { + "compressible": false + }, + "application/x-ms-application": { + "source": "apache", + "extensions": ["application"] + }, + "application/x-ms-shortcut": { + "source": "apache", + "extensions": ["lnk"] + }, + "application/x-ms-wmd": { + "source": "apache", + "extensions": ["wmd"] + }, + "application/x-ms-wmz": { + "source": "apache", + "extensions": ["wmz"] + }, + "application/x-ms-xbap": { + "source": "apache", + "extensions": ["xbap"] + }, + "application/x-msaccess": { + "source": "apache", + "extensions": ["mdb"] + }, + "application/x-msbinder": { + "source": "apache", + "extensions": ["obd"] + }, + "application/x-mscardfile": { + "source": "apache", + "extensions": ["crd"] + }, + "application/x-msclip": { + "source": "apache", + "extensions": ["clp"] + }, + "application/x-msdos-program": { + "extensions": ["exe"] + }, + "application/x-msdownload": { + "source": "apache", + "extensions": ["exe","dll","com","bat","msi"] + }, + "application/x-msmediaview": { + "source": "apache", + "extensions": ["mvb","m13","m14"] + }, + "application/x-msmetafile": { + "source": "apache", + "extensions": ["wmf","wmz","emf","emz"] + }, + "application/x-msmoney": { + "source": "apache", + "extensions": ["mny"] + }, + 
"application/x-mspublisher": { + "source": "apache", + "extensions": ["pub"] + }, + "application/x-msschedule": { + "source": "apache", + "extensions": ["scd"] + }, + "application/x-msterminal": { + "source": "apache", + "extensions": ["trm"] + }, + "application/x-mswrite": { + "source": "apache", + "extensions": ["wri"] + }, + "application/x-netcdf": { + "source": "apache", + "extensions": ["nc","cdf"] + }, + "application/x-ns-proxy-autoconfig": { + "compressible": true, + "extensions": ["pac"] + }, + "application/x-nzb": { + "source": "apache", + "extensions": ["nzb"] + }, + "application/x-perl": { + "source": "nginx", + "extensions": ["pl","pm"] + }, + "application/x-pilot": { + "source": "nginx", + "extensions": ["prc","pdb"] + }, + "application/x-pkcs12": { + "source": "apache", + "compressible": false, + "extensions": ["p12","pfx"] + }, + "application/x-pkcs7-certificates": { + "source": "apache", + "extensions": ["p7b","spc"] + }, + "application/x-pkcs7-certreqresp": { + "source": "apache", + "extensions": ["p7r"] + }, + "application/x-pki-message": { + "source": "iana" + }, + "application/x-rar-compressed": { + "source": "apache", + "compressible": false, + "extensions": ["rar"] + }, + "application/x-redhat-package-manager": { + "source": "nginx", + "extensions": ["rpm"] + }, + "application/x-research-info-systems": { + "source": "apache", + "extensions": ["ris"] + }, + "application/x-sea": { + "source": "nginx", + "extensions": ["sea"] + }, + "application/x-sh": { + "source": "apache", + "compressible": true, + "extensions": ["sh"] + }, + "application/x-shar": { + "source": "apache", + "extensions": ["shar"] + }, + "application/x-shockwave-flash": { + "source": "apache", + "compressible": false, + "extensions": ["swf"] + }, + "application/x-silverlight-app": { + "source": "apache", + "extensions": ["xap"] + }, + "application/x-sql": { + "source": "apache", + "extensions": ["sql"] + }, + "application/x-stuffit": { + "source": "apache", + "compressible": false, + "extensions": ["sit"] + }, + "application/x-stuffitx": { + "source": "apache", + "extensions": ["sitx"] + }, + "application/x-subrip": { + "source": "apache", + "extensions": ["srt"] + }, + "application/x-sv4cpio": { + "source": "apache", + "extensions": ["sv4cpio"] + }, + "application/x-sv4crc": { + "source": "apache", + "extensions": ["sv4crc"] + }, + "application/x-t3vm-image": { + "source": "apache", + "extensions": ["t3"] + }, + "application/x-tads": { + "source": "apache", + "extensions": ["gam"] + }, + "application/x-tar": { + "source": "apache", + "compressible": true, + "extensions": ["tar"] + }, + "application/x-tcl": { + "source": "apache", + "extensions": ["tcl","tk"] + }, + "application/x-tex": { + "source": "apache", + "extensions": ["tex"] + }, + "application/x-tex-tfm": { + "source": "apache", + "extensions": ["tfm"] + }, + "application/x-texinfo": { + "source": "apache", + "extensions": ["texinfo","texi"] + }, + "application/x-tgif": { + "source": "apache", + "extensions": ["obj"] + }, + "application/x-ustar": { + "source": "apache", + "extensions": ["ustar"] + }, + "application/x-virtualbox-hdd": { + "compressible": true, + "extensions": ["hdd"] + }, + "application/x-virtualbox-ova": { + "compressible": true, + "extensions": ["ova"] + }, + "application/x-virtualbox-ovf": { + "compressible": true, + "extensions": ["ovf"] + }, + "application/x-virtualbox-vbox": { + "compressible": true, + "extensions": ["vbox"] + }, + "application/x-virtualbox-vbox-extpack": { + "compressible": false, + "extensions": 
["vbox-extpack"] + }, + "application/x-virtualbox-vdi": { + "compressible": true, + "extensions": ["vdi"] + }, + "application/x-virtualbox-vhd": { + "compressible": true, + "extensions": ["vhd"] + }, + "application/x-virtualbox-vmdk": { + "compressible": true, + "extensions": ["vmdk"] + }, + "application/x-wais-source": { + "source": "apache", + "extensions": ["src"] + }, + "application/x-web-app-manifest+json": { + "compressible": true, + "extensions": ["webapp"] + }, + "application/x-www-form-urlencoded": { + "source": "iana", + "compressible": true + }, + "application/x-x509-ca-cert": { + "source": "iana", + "extensions": ["der","crt","pem"] + }, + "application/x-x509-ca-ra-cert": { + "source": "iana" + }, + "application/x-x509-next-ca-cert": { + "source": "iana" + }, + "application/x-xfig": { + "source": "apache", + "extensions": ["fig"] + }, + "application/x-xliff+xml": { + "source": "apache", + "compressible": true, + "extensions": ["xlf"] + }, + "application/x-xpinstall": { + "source": "apache", + "compressible": false, + "extensions": ["xpi"] + }, + "application/x-xz": { + "source": "apache", + "extensions": ["xz"] + }, + "application/x-zmachine": { + "source": "apache", + "extensions": ["z1","z2","z3","z4","z5","z6","z7","z8"] + }, + "application/x400-bp": { + "source": "iana" + }, + "application/xacml+xml": { + "source": "iana", + "compressible": true + }, + "application/xaml+xml": { + "source": "apache", + "compressible": true, + "extensions": ["xaml"] + }, + "application/xcap-att+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xav"] + }, + "application/xcap-caps+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xca"] + }, + "application/xcap-diff+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xdf"] + }, + "application/xcap-el+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xel"] + }, + "application/xcap-error+xml": { + "source": "iana", + "compressible": true + }, + "application/xcap-ns+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xns"] + }, + "application/xcon-conference-info+xml": { + "source": "iana", + "compressible": true + }, + "application/xcon-conference-info-diff+xml": { + "source": "iana", + "compressible": true + }, + "application/xenc+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xenc"] + }, + "application/xhtml+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xhtml","xht"] + }, + "application/xhtml-voice+xml": { + "source": "apache", + "compressible": true + }, + "application/xliff+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xlf"] + }, + "application/xml": { + "source": "iana", + "compressible": true, + "extensions": ["xml","xsl","xsd","rng"] + }, + "application/xml-dtd": { + "source": "iana", + "compressible": true, + "extensions": ["dtd"] + }, + "application/xml-external-parsed-entity": { + "source": "iana" + }, + "application/xml-patch+xml": { + "source": "iana", + "compressible": true + }, + "application/xmpp+xml": { + "source": "iana", + "compressible": true + }, + "application/xop+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xop"] + }, + "application/xproc+xml": { + "source": "apache", + "compressible": true, + "extensions": ["xpl"] + }, + "application/xslt+xml": { + "source": "iana", + "compressible": true, + "extensions": ["xsl","xslt"] + }, + "application/xspf+xml": { + "source": "apache", + "compressible": true, + "extensions": ["xspf"] + }, + 
"application/xv+xml": { + "source": "iana", + "compressible": true, + "extensions": ["mxml","xhvml","xvml","xvm"] + }, + "application/yang": { + "source": "iana", + "extensions": ["yang"] + }, + "application/yang-data+json": { + "source": "iana", + "compressible": true + }, + "application/yang-data+xml": { + "source": "iana", + "compressible": true + }, + "application/yang-patch+json": { + "source": "iana", + "compressible": true + }, + "application/yang-patch+xml": { + "source": "iana", + "compressible": true + }, + "application/yin+xml": { + "source": "iana", + "compressible": true, + "extensions": ["yin"] + }, + "application/zip": { + "source": "iana", + "compressible": false, + "extensions": ["zip"] + }, + "application/zlib": { + "source": "iana" + }, + "application/zstd": { + "source": "iana" + }, + "audio/1d-interleaved-parityfec": { + "source": "iana" + }, + "audio/32kadpcm": { + "source": "iana" + }, + "audio/3gpp": { + "source": "iana", + "compressible": false, + "extensions": ["3gpp"] + }, + "audio/3gpp2": { + "source": "iana" + }, + "audio/aac": { + "source": "iana" + }, + "audio/ac3": { + "source": "iana" + }, + "audio/adpcm": { + "source": "apache", + "extensions": ["adp"] + }, + "audio/amr": { + "source": "iana", + "extensions": ["amr"] + }, + "audio/amr-wb": { + "source": "iana" + }, + "audio/amr-wb+": { + "source": "iana" + }, + "audio/aptx": { + "source": "iana" + }, + "audio/asc": { + "source": "iana" + }, + "audio/atrac-advanced-lossless": { + "source": "iana" + }, + "audio/atrac-x": { + "source": "iana" + }, + "audio/atrac3": { + "source": "iana" + }, + "audio/basic": { + "source": "iana", + "compressible": false, + "extensions": ["au","snd"] + }, + "audio/bv16": { + "source": "iana" + }, + "audio/bv32": { + "source": "iana" + }, + "audio/clearmode": { + "source": "iana" + }, + "audio/cn": { + "source": "iana" + }, + "audio/dat12": { + "source": "iana" + }, + "audio/dls": { + "source": "iana" + }, + "audio/dsr-es201108": { + "source": "iana" + }, + "audio/dsr-es202050": { + "source": "iana" + }, + "audio/dsr-es202211": { + "source": "iana" + }, + "audio/dsr-es202212": { + "source": "iana" + }, + "audio/dv": { + "source": "iana" + }, + "audio/dvi4": { + "source": "iana" + }, + "audio/eac3": { + "source": "iana" + }, + "audio/encaprtp": { + "source": "iana" + }, + "audio/evrc": { + "source": "iana" + }, + "audio/evrc-qcp": { + "source": "iana" + }, + "audio/evrc0": { + "source": "iana" + }, + "audio/evrc1": { + "source": "iana" + }, + "audio/evrcb": { + "source": "iana" + }, + "audio/evrcb0": { + "source": "iana" + }, + "audio/evrcb1": { + "source": "iana" + }, + "audio/evrcnw": { + "source": "iana" + }, + "audio/evrcnw0": { + "source": "iana" + }, + "audio/evrcnw1": { + "source": "iana" + }, + "audio/evrcwb": { + "source": "iana" + }, + "audio/evrcwb0": { + "source": "iana" + }, + "audio/evrcwb1": { + "source": "iana" + }, + "audio/evs": { + "source": "iana" + }, + "audio/flexfec": { + "source": "iana" + }, + "audio/fwdred": { + "source": "iana" + }, + "audio/g711-0": { + "source": "iana" + }, + "audio/g719": { + "source": "iana" + }, + "audio/g722": { + "source": "iana" + }, + "audio/g7221": { + "source": "iana" + }, + "audio/g723": { + "source": "iana" + }, + "audio/g726-16": { + "source": "iana" + }, + "audio/g726-24": { + "source": "iana" + }, + "audio/g726-32": { + "source": "iana" + }, + "audio/g726-40": { + "source": "iana" + }, + "audio/g728": { + "source": "iana" + }, + "audio/g729": { + "source": "iana" + }, + "audio/g7291": { + "source": "iana" + }, + 
"audio/g729d": { + "source": "iana" + }, + "audio/g729e": { + "source": "iana" + }, + "audio/gsm": { + "source": "iana" + }, + "audio/gsm-efr": { + "source": "iana" + }, + "audio/gsm-hr-08": { + "source": "iana" + }, + "audio/ilbc": { + "source": "iana" + }, + "audio/ip-mr_v2.5": { + "source": "iana" + }, + "audio/isac": { + "source": "apache" + }, + "audio/l16": { + "source": "iana" + }, + "audio/l20": { + "source": "iana" + }, + "audio/l24": { + "source": "iana", + "compressible": false + }, + "audio/l8": { + "source": "iana" + }, + "audio/lpc": { + "source": "iana" + }, + "audio/melp": { + "source": "iana" + }, + "audio/melp1200": { + "source": "iana" + }, + "audio/melp2400": { + "source": "iana" + }, + "audio/melp600": { + "source": "iana" + }, + "audio/mhas": { + "source": "iana" + }, + "audio/midi": { + "source": "apache", + "extensions": ["mid","midi","kar","rmi"] + }, + "audio/mobile-xmf": { + "source": "iana", + "extensions": ["mxmf"] + }, + "audio/mp3": { + "compressible": false, + "extensions": ["mp3"] + }, + "audio/mp4": { + "source": "iana", + "compressible": false, + "extensions": ["m4a","mp4a"] + }, + "audio/mp4a-latm": { + "source": "iana" + }, + "audio/mpa": { + "source": "iana" + }, + "audio/mpa-robust": { + "source": "iana" + }, + "audio/mpeg": { + "source": "iana", + "compressible": false, + "extensions": ["mpga","mp2","mp2a","mp3","m2a","m3a"] + }, + "audio/mpeg4-generic": { + "source": "iana" + }, + "audio/musepack": { + "source": "apache" + }, + "audio/ogg": { + "source": "iana", + "compressible": false, + "extensions": ["oga","ogg","spx","opus"] + }, + "audio/opus": { + "source": "iana" + }, + "audio/parityfec": { + "source": "iana" + }, + "audio/pcma": { + "source": "iana" + }, + "audio/pcma-wb": { + "source": "iana" + }, + "audio/pcmu": { + "source": "iana" + }, + "audio/pcmu-wb": { + "source": "iana" + }, + "audio/prs.sid": { + "source": "iana" + }, + "audio/qcelp": { + "source": "iana" + }, + "audio/raptorfec": { + "source": "iana" + }, + "audio/red": { + "source": "iana" + }, + "audio/rtp-enc-aescm128": { + "source": "iana" + }, + "audio/rtp-midi": { + "source": "iana" + }, + "audio/rtploopback": { + "source": "iana" + }, + "audio/rtx": { + "source": "iana" + }, + "audio/s3m": { + "source": "apache", + "extensions": ["s3m"] + }, + "audio/scip": { + "source": "iana" + }, + "audio/silk": { + "source": "apache", + "extensions": ["sil"] + }, + "audio/smv": { + "source": "iana" + }, + "audio/smv-qcp": { + "source": "iana" + }, + "audio/smv0": { + "source": "iana" + }, + "audio/sofa": { + "source": "iana" + }, + "audio/sp-midi": { + "source": "iana" + }, + "audio/speex": { + "source": "iana" + }, + "audio/t140c": { + "source": "iana" + }, + "audio/t38": { + "source": "iana" + }, + "audio/telephone-event": { + "source": "iana" + }, + "audio/tetra_acelp": { + "source": "iana" + }, + "audio/tetra_acelp_bb": { + "source": "iana" + }, + "audio/tone": { + "source": "iana" + }, + "audio/tsvcis": { + "source": "iana" + }, + "audio/uemclip": { + "source": "iana" + }, + "audio/ulpfec": { + "source": "iana" + }, + "audio/usac": { + "source": "iana" + }, + "audio/vdvi": { + "source": "iana" + }, + "audio/vmr-wb": { + "source": "iana" + }, + "audio/vnd.3gpp.iufp": { + "source": "iana" + }, + "audio/vnd.4sb": { + "source": "iana" + }, + "audio/vnd.audiokoz": { + "source": "iana" + }, + "audio/vnd.celp": { + "source": "iana" + }, + "audio/vnd.cisco.nse": { + "source": "iana" + }, + "audio/vnd.cmles.radio-events": { + "source": "iana" + }, + "audio/vnd.cns.anp1": { + "source": 
"iana" + }, + "audio/vnd.cns.inf1": { + "source": "iana" + }, + "audio/vnd.dece.audio": { + "source": "iana", + "extensions": ["uva","uvva"] + }, + "audio/vnd.digital-winds": { + "source": "iana", + "extensions": ["eol"] + }, + "audio/vnd.dlna.adts": { + "source": "iana" + }, + "audio/vnd.dolby.heaac.1": { + "source": "iana" + }, + "audio/vnd.dolby.heaac.2": { + "source": "iana" + }, + "audio/vnd.dolby.mlp": { + "source": "iana" + }, + "audio/vnd.dolby.mps": { + "source": "iana" + }, + "audio/vnd.dolby.pl2": { + "source": "iana" + }, + "audio/vnd.dolby.pl2x": { + "source": "iana" + }, + "audio/vnd.dolby.pl2z": { + "source": "iana" + }, + "audio/vnd.dolby.pulse.1": { + "source": "iana" + }, + "audio/vnd.dra": { + "source": "iana", + "extensions": ["dra"] + }, + "audio/vnd.dts": { + "source": "iana", + "extensions": ["dts"] + }, + "audio/vnd.dts.hd": { + "source": "iana", + "extensions": ["dtshd"] + }, + "audio/vnd.dts.uhd": { + "source": "iana" + }, + "audio/vnd.dvb.file": { + "source": "iana" + }, + "audio/vnd.everad.plj": { + "source": "iana" + }, + "audio/vnd.hns.audio": { + "source": "iana" + }, + "audio/vnd.lucent.voice": { + "source": "iana", + "extensions": ["lvp"] + }, + "audio/vnd.ms-playready.media.pya": { + "source": "iana", + "extensions": ["pya"] + }, + "audio/vnd.nokia.mobile-xmf": { + "source": "iana" + }, + "audio/vnd.nortel.vbk": { + "source": "iana" + }, + "audio/vnd.nuera.ecelp4800": { + "source": "iana", + "extensions": ["ecelp4800"] + }, + "audio/vnd.nuera.ecelp7470": { + "source": "iana", + "extensions": ["ecelp7470"] + }, + "audio/vnd.nuera.ecelp9600": { + "source": "iana", + "extensions": ["ecelp9600"] + }, + "audio/vnd.octel.sbc": { + "source": "iana" + }, + "audio/vnd.presonus.multitrack": { + "source": "iana" + }, + "audio/vnd.qcelp": { + "source": "iana" + }, + "audio/vnd.rhetorex.32kadpcm": { + "source": "iana" + }, + "audio/vnd.rip": { + "source": "iana", + "extensions": ["rip"] + }, + "audio/vnd.rn-realaudio": { + "compressible": false + }, + "audio/vnd.sealedmedia.softseal.mpeg": { + "source": "iana" + }, + "audio/vnd.vmx.cvsd": { + "source": "iana" + }, + "audio/vnd.wave": { + "compressible": false + }, + "audio/vorbis": { + "source": "iana", + "compressible": false + }, + "audio/vorbis-config": { + "source": "iana" + }, + "audio/wav": { + "compressible": false, + "extensions": ["wav"] + }, + "audio/wave": { + "compressible": false, + "extensions": ["wav"] + }, + "audio/webm": { + "source": "apache", + "compressible": false, + "extensions": ["weba"] + }, + "audio/x-aac": { + "source": "apache", + "compressible": false, + "extensions": ["aac"] + }, + "audio/x-aiff": { + "source": "apache", + "extensions": ["aif","aiff","aifc"] + }, + "audio/x-caf": { + "source": "apache", + "compressible": false, + "extensions": ["caf"] + }, + "audio/x-flac": { + "source": "apache", + "extensions": ["flac"] + }, + "audio/x-m4a": { + "source": "nginx", + "extensions": ["m4a"] + }, + "audio/x-matroska": { + "source": "apache", + "extensions": ["mka"] + }, + "audio/x-mpegurl": { + "source": "apache", + "extensions": ["m3u"] + }, + "audio/x-ms-wax": { + "source": "apache", + "extensions": ["wax"] + }, + "audio/x-ms-wma": { + "source": "apache", + "extensions": ["wma"] + }, + "audio/x-pn-realaudio": { + "source": "apache", + "extensions": ["ram","ra"] + }, + "audio/x-pn-realaudio-plugin": { + "source": "apache", + "extensions": ["rmp"] + }, + "audio/x-realaudio": { + "source": "nginx", + "extensions": ["ra"] + }, + "audio/x-tta": { + "source": "apache" + }, + "audio/x-wav": { + 
"source": "apache", + "extensions": ["wav"] + }, + "audio/xm": { + "source": "apache", + "extensions": ["xm"] + }, + "chemical/x-cdx": { + "source": "apache", + "extensions": ["cdx"] + }, + "chemical/x-cif": { + "source": "apache", + "extensions": ["cif"] + }, + "chemical/x-cmdf": { + "source": "apache", + "extensions": ["cmdf"] + }, + "chemical/x-cml": { + "source": "apache", + "extensions": ["cml"] + }, + "chemical/x-csml": { + "source": "apache", + "extensions": ["csml"] + }, + "chemical/x-pdb": { + "source": "apache" + }, + "chemical/x-xyz": { + "source": "apache", + "extensions": ["xyz"] + }, + "font/collection": { + "source": "iana", + "extensions": ["ttc"] + }, + "font/otf": { + "source": "iana", + "compressible": true, + "extensions": ["otf"] + }, + "font/sfnt": { + "source": "iana" + }, + "font/ttf": { + "source": "iana", + "compressible": true, + "extensions": ["ttf"] + }, + "font/woff": { + "source": "iana", + "extensions": ["woff"] + }, + "font/woff2": { + "source": "iana", + "extensions": ["woff2"] + }, + "image/aces": { + "source": "iana", + "extensions": ["exr"] + }, + "image/apng": { + "compressible": false, + "extensions": ["apng"] + }, + "image/avci": { + "source": "iana" + }, + "image/avcs": { + "source": "iana" + }, + "image/avif": { + "source": "iana", + "compressible": false, + "extensions": ["avif"] + }, + "image/bmp": { + "source": "iana", + "compressible": true, + "extensions": ["bmp"] + }, + "image/cgm": { + "source": "iana", + "extensions": ["cgm"] + }, + "image/dicom-rle": { + "source": "iana", + "extensions": ["drle"] + }, + "image/emf": { + "source": "iana", + "extensions": ["emf"] + }, + "image/fits": { + "source": "iana", + "extensions": ["fits"] + }, + "image/g3fax": { + "source": "iana", + "extensions": ["g3"] + }, + "image/gif": { + "source": "iana", + "compressible": false, + "extensions": ["gif"] + }, + "image/heic": { + "source": "iana", + "extensions": ["heic"] + }, + "image/heic-sequence": { + "source": "iana", + "extensions": ["heics"] + }, + "image/heif": { + "source": "iana", + "extensions": ["heif"] + }, + "image/heif-sequence": { + "source": "iana", + "extensions": ["heifs"] + }, + "image/hej2k": { + "source": "iana", + "extensions": ["hej2"] + }, + "image/hsj2": { + "source": "iana", + "extensions": ["hsj2"] + }, + "image/ief": { + "source": "iana", + "extensions": ["ief"] + }, + "image/jls": { + "source": "iana", + "extensions": ["jls"] + }, + "image/jp2": { + "source": "iana", + "compressible": false, + "extensions": ["jp2","jpg2"] + }, + "image/jpeg": { + "source": "iana", + "compressible": false, + "extensions": ["jpeg","jpg","jpe"] + }, + "image/jph": { + "source": "iana", + "extensions": ["jph"] + }, + "image/jphc": { + "source": "iana", + "extensions": ["jhc"] + }, + "image/jpm": { + "source": "iana", + "compressible": false, + "extensions": ["jpm"] + }, + "image/jpx": { + "source": "iana", + "compressible": false, + "extensions": ["jpx","jpf"] + }, + "image/jxr": { + "source": "iana", + "extensions": ["jxr"] + }, + "image/jxra": { + "source": "iana", + "extensions": ["jxra"] + }, + "image/jxrs": { + "source": "iana", + "extensions": ["jxrs"] + }, + "image/jxs": { + "source": "iana", + "extensions": ["jxs"] + }, + "image/jxsc": { + "source": "iana", + "extensions": ["jxsc"] + }, + "image/jxsi": { + "source": "iana", + "extensions": ["jxsi"] + }, + "image/jxss": { + "source": "iana", + "extensions": ["jxss"] + }, + "image/ktx": { + "source": "iana", + "extensions": ["ktx"] + }, + "image/ktx2": { + "source": "iana", + "extensions": 
["ktx2"] + }, + "image/naplps": { + "source": "iana" + }, + "image/pjpeg": { + "compressible": false + }, + "image/png": { + "source": "iana", + "compressible": false, + "extensions": ["png"] + }, + "image/prs.btif": { + "source": "iana", + "extensions": ["btif"] + }, + "image/prs.pti": { + "source": "iana", + "extensions": ["pti"] + }, + "image/pwg-raster": { + "source": "iana" + }, + "image/sgi": { + "source": "apache", + "extensions": ["sgi"] + }, + "image/svg+xml": { + "source": "iana", + "compressible": true, + "extensions": ["svg","svgz"] + }, + "image/t38": { + "source": "iana", + "extensions": ["t38"] + }, + "image/tiff": { + "source": "iana", + "compressible": false, + "extensions": ["tif","tiff"] + }, + "image/tiff-fx": { + "source": "iana", + "extensions": ["tfx"] + }, + "image/vnd.adobe.photoshop": { + "source": "iana", + "compressible": true, + "extensions": ["psd"] + }, + "image/vnd.airzip.accelerator.azv": { + "source": "iana", + "extensions": ["azv"] + }, + "image/vnd.cns.inf2": { + "source": "iana" + }, + "image/vnd.dece.graphic": { + "source": "iana", + "extensions": ["uvi","uvvi","uvg","uvvg"] + }, + "image/vnd.djvu": { + "source": "iana", + "extensions": ["djvu","djv"] + }, + "image/vnd.dvb.subtitle": { + "source": "iana", + "extensions": ["sub"] + }, + "image/vnd.dwg": { + "source": "iana", + "extensions": ["dwg"] + }, + "image/vnd.dxf": { + "source": "iana", + "extensions": ["dxf"] + }, + "image/vnd.fastbidsheet": { + "source": "iana", + "extensions": ["fbs"] + }, + "image/vnd.fpx": { + "source": "iana", + "extensions": ["fpx"] + }, + "image/vnd.fst": { + "source": "iana", + "extensions": ["fst"] + }, + "image/vnd.fujixerox.edmics-mmr": { + "source": "iana", + "extensions": ["mmr"] + }, + "image/vnd.fujixerox.edmics-rlc": { + "source": "iana", + "extensions": ["rlc"] + }, + "image/vnd.globalgraphics.pgb": { + "source": "iana" + }, + "image/vnd.microsoft.icon": { + "source": "iana", + "extensions": ["ico"] + }, + "image/vnd.mix": { + "source": "iana" + }, + "image/vnd.mozilla.apng": { + "source": "iana" + }, + "image/vnd.ms-dds": { + "extensions": ["dds"] + }, + "image/vnd.ms-modi": { + "source": "iana", + "extensions": ["mdi"] + }, + "image/vnd.ms-photo": { + "source": "apache", + "extensions": ["wdp"] + }, + "image/vnd.net-fpx": { + "source": "iana", + "extensions": ["npx"] + }, + "image/vnd.pco.b16": { + "source": "iana", + "extensions": ["b16"] + }, + "image/vnd.radiance": { + "source": "iana" + }, + "image/vnd.sealed.png": { + "source": "iana" + }, + "image/vnd.sealedmedia.softseal.gif": { + "source": "iana" + }, + "image/vnd.sealedmedia.softseal.jpg": { + "source": "iana" + }, + "image/vnd.svf": { + "source": "iana" + }, + "image/vnd.tencent.tap": { + "source": "iana", + "extensions": ["tap"] + }, + "image/vnd.valve.source.texture": { + "source": "iana", + "extensions": ["vtf"] + }, + "image/vnd.wap.wbmp": { + "source": "iana", + "extensions": ["wbmp"] + }, + "image/vnd.xiff": { + "source": "iana", + "extensions": ["xif"] + }, + "image/vnd.zbrush.pcx": { + "source": "iana", + "extensions": ["pcx"] + }, + "image/webp": { + "source": "apache", + "extensions": ["webp"] + }, + "image/wmf": { + "source": "iana", + "extensions": ["wmf"] + }, + "image/x-3ds": { + "source": "apache", + "extensions": ["3ds"] + }, + "image/x-cmu-raster": { + "source": "apache", + "extensions": ["ras"] + }, + "image/x-cmx": { + "source": "apache", + "extensions": ["cmx"] + }, + "image/x-freehand": { + "source": "apache", + "extensions": ["fh","fhc","fh4","fh5","fh7"] + }, + "image/x-icon": 
{ + "source": "apache", + "compressible": true, + "extensions": ["ico"] + }, + "image/x-jng": { + "source": "nginx", + "extensions": ["jng"] + }, + "image/x-mrsid-image": { + "source": "apache", + "extensions": ["sid"] + }, + "image/x-ms-bmp": { + "source": "nginx", + "compressible": true, + "extensions": ["bmp"] + }, + "image/x-pcx": { + "source": "apache", + "extensions": ["pcx"] + }, + "image/x-pict": { + "source": "apache", + "extensions": ["pic","pct"] + }, + "image/x-portable-anymap": { + "source": "apache", + "extensions": ["pnm"] + }, + "image/x-portable-bitmap": { + "source": "apache", + "extensions": ["pbm"] + }, + "image/x-portable-graymap": { + "source": "apache", + "extensions": ["pgm"] + }, + "image/x-portable-pixmap": { + "source": "apache", + "extensions": ["ppm"] + }, + "image/x-rgb": { + "source": "apache", + "extensions": ["rgb"] + }, + "image/x-tga": { + "source": "apache", + "extensions": ["tga"] + }, + "image/x-xbitmap": { + "source": "apache", + "extensions": ["xbm"] + }, + "image/x-xcf": { + "compressible": false + }, + "image/x-xpixmap": { + "source": "apache", + "extensions": ["xpm"] + }, + "image/x-xwindowdump": { + "source": "apache", + "extensions": ["xwd"] + }, + "message/cpim": { + "source": "iana" + }, + "message/delivery-status": { + "source": "iana" + }, + "message/disposition-notification": { + "source": "iana", + "extensions": [ + "disposition-notification" + ] + }, + "message/external-body": { + "source": "iana" + }, + "message/feedback-report": { + "source": "iana" + }, + "message/global": { + "source": "iana", + "extensions": ["u8msg"] + }, + "message/global-delivery-status": { + "source": "iana", + "extensions": ["u8dsn"] + }, + "message/global-disposition-notification": { + "source": "iana", + "extensions": ["u8mdn"] + }, + "message/global-headers": { + "source": "iana", + "extensions": ["u8hdr"] + }, + "message/http": { + "source": "iana", + "compressible": false + }, + "message/imdn+xml": { + "source": "iana", + "compressible": true + }, + "message/news": { + "source": "iana" + }, + "message/partial": { + "source": "iana", + "compressible": false + }, + "message/rfc822": { + "source": "iana", + "compressible": true, + "extensions": ["eml","mime"] + }, + "message/s-http": { + "source": "iana" + }, + "message/sip": { + "source": "iana" + }, + "message/sipfrag": { + "source": "iana" + }, + "message/tracking-status": { + "source": "iana" + }, + "message/vnd.si.simp": { + "source": "iana" + }, + "message/vnd.wfa.wsc": { + "source": "iana", + "extensions": ["wsc"] + }, + "model/3mf": { + "source": "iana", + "extensions": ["3mf"] + }, + "model/e57": { + "source": "iana" + }, + "model/gltf+json": { + "source": "iana", + "compressible": true, + "extensions": ["gltf"] + }, + "model/gltf-binary": { + "source": "iana", + "compressible": true, + "extensions": ["glb"] + }, + "model/iges": { + "source": "iana", + "compressible": false, + "extensions": ["igs","iges"] + }, + "model/mesh": { + "source": "iana", + "compressible": false, + "extensions": ["msh","mesh","silo"] + }, + "model/mtl": { + "source": "iana", + "extensions": ["mtl"] + }, + "model/obj": { + "source": "iana", + "extensions": ["obj"] + }, + "model/step": { + "source": "iana" + }, + "model/step+xml": { + "source": "iana", + "compressible": true, + "extensions": ["stpx"] + }, + "model/step+zip": { + "source": "iana", + "compressible": false, + "extensions": ["stpz"] + }, + "model/step-xml+zip": { + "source": "iana", + "compressible": false, + "extensions": ["stpxz"] + }, + "model/stl": { + 
"source": "iana", + "extensions": ["stl"] + }, + "model/vnd.collada+xml": { + "source": "iana", + "compressible": true, + "extensions": ["dae"] + }, + "model/vnd.dwf": { + "source": "iana", + "extensions": ["dwf"] + }, + "model/vnd.flatland.3dml": { + "source": "iana" + }, + "model/vnd.gdl": { + "source": "iana", + "extensions": ["gdl"] + }, + "model/vnd.gs-gdl": { + "source": "apache" + }, + "model/vnd.gs.gdl": { + "source": "iana" + }, + "model/vnd.gtw": { + "source": "iana", + "extensions": ["gtw"] + }, + "model/vnd.moml+xml": { + "source": "iana", + "compressible": true + }, + "model/vnd.mts": { + "source": "iana", + "extensions": ["mts"] + }, + "model/vnd.opengex": { + "source": "iana", + "extensions": ["ogex"] + }, + "model/vnd.parasolid.transmit.binary": { + "source": "iana", + "extensions": ["x_b"] + }, + "model/vnd.parasolid.transmit.text": { + "source": "iana", + "extensions": ["x_t"] + }, + "model/vnd.pytha.pyox": { + "source": "iana" + }, + "model/vnd.rosette.annotated-data-model": { + "source": "iana" + }, + "model/vnd.sap.vds": { + "source": "iana", + "extensions": ["vds"] + }, + "model/vnd.usdz+zip": { + "source": "iana", + "compressible": false, + "extensions": ["usdz"] + }, + "model/vnd.valve.source.compiled-map": { + "source": "iana", + "extensions": ["bsp"] + }, + "model/vnd.vtu": { + "source": "iana", + "extensions": ["vtu"] + }, + "model/vrml": { + "source": "iana", + "compressible": false, + "extensions": ["wrl","vrml"] + }, + "model/x3d+binary": { + "source": "apache", + "compressible": false, + "extensions": ["x3db","x3dbz"] + }, + "model/x3d+fastinfoset": { + "source": "iana", + "extensions": ["x3db"] + }, + "model/x3d+vrml": { + "source": "apache", + "compressible": false, + "extensions": ["x3dv","x3dvz"] + }, + "model/x3d+xml": { + "source": "iana", + "compressible": true, + "extensions": ["x3d","x3dz"] + }, + "model/x3d-vrml": { + "source": "iana", + "extensions": ["x3dv"] + }, + "multipart/alternative": { + "source": "iana", + "compressible": false + }, + "multipart/appledouble": { + "source": "iana" + }, + "multipart/byteranges": { + "source": "iana" + }, + "multipart/digest": { + "source": "iana" + }, + "multipart/encrypted": { + "source": "iana", + "compressible": false + }, + "multipart/form-data": { + "source": "iana", + "compressible": false + }, + "multipart/header-set": { + "source": "iana" + }, + "multipart/mixed": { + "source": "iana" + }, + "multipart/multilingual": { + "source": "iana" + }, + "multipart/parallel": { + "source": "iana" + }, + "multipart/related": { + "source": "iana", + "compressible": false + }, + "multipart/report": { + "source": "iana" + }, + "multipart/signed": { + "source": "iana", + "compressible": false + }, + "multipart/vnd.bint.med-plus": { + "source": "iana" + }, + "multipart/voice-message": { + "source": "iana" + }, + "multipart/x-mixed-replace": { + "source": "iana" + }, + "text/1d-interleaved-parityfec": { + "source": "iana" + }, + "text/cache-manifest": { + "source": "iana", + "compressible": true, + "extensions": ["appcache","manifest"] + }, + "text/calendar": { + "source": "iana", + "extensions": ["ics","ifb"] + }, + "text/calender": { + "compressible": true + }, + "text/cmd": { + "compressible": true + }, + "text/coffeescript": { + "extensions": ["coffee","litcoffee"] + }, + "text/cql": { + "source": "iana" + }, + "text/cql-expression": { + "source": "iana" + }, + "text/cql-identifier": { + "source": "iana" + }, + "text/css": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["css"] 
+ }, + "text/csv": { + "source": "iana", + "compressible": true, + "extensions": ["csv"] + }, + "text/csv-schema": { + "source": "iana" + }, + "text/directory": { + "source": "iana" + }, + "text/dns": { + "source": "iana" + }, + "text/ecmascript": { + "source": "iana" + }, + "text/encaprtp": { + "source": "iana" + }, + "text/enriched": { + "source": "iana" + }, + "text/fhirpath": { + "source": "iana" + }, + "text/flexfec": { + "source": "iana" + }, + "text/fwdred": { + "source": "iana" + }, + "text/gff3": { + "source": "iana" + }, + "text/grammar-ref-list": { + "source": "iana" + }, + "text/html": { + "source": "iana", + "compressible": true, + "extensions": ["html","htm","shtml"] + }, + "text/jade": { + "extensions": ["jade"] + }, + "text/javascript": { + "source": "iana", + "compressible": true + }, + "text/jcr-cnd": { + "source": "iana" + }, + "text/jsx": { + "compressible": true, + "extensions": ["jsx"] + }, + "text/less": { + "compressible": true, + "extensions": ["less"] + }, + "text/markdown": { + "source": "iana", + "compressible": true, + "extensions": ["markdown","md"] + }, + "text/mathml": { + "source": "nginx", + "extensions": ["mml"] + }, + "text/mdx": { + "compressible": true, + "extensions": ["mdx"] + }, + "text/mizar": { + "source": "iana" + }, + "text/n3": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["n3"] + }, + "text/parameters": { + "source": "iana", + "charset": "UTF-8" + }, + "text/parityfec": { + "source": "iana" + }, + "text/plain": { + "source": "iana", + "compressible": true, + "extensions": ["txt","text","conf","def","list","log","in","ini"] + }, + "text/provenance-notation": { + "source": "iana", + "charset": "UTF-8" + }, + "text/prs.fallenstein.rst": { + "source": "iana" + }, + "text/prs.lines.tag": { + "source": "iana", + "extensions": ["dsc"] + }, + "text/prs.prop.logic": { + "source": "iana" + }, + "text/raptorfec": { + "source": "iana" + }, + "text/red": { + "source": "iana" + }, + "text/rfc822-headers": { + "source": "iana" + }, + "text/richtext": { + "source": "iana", + "compressible": true, + "extensions": ["rtx"] + }, + "text/rtf": { + "source": "iana", + "compressible": true, + "extensions": ["rtf"] + }, + "text/rtp-enc-aescm128": { + "source": "iana" + }, + "text/rtploopback": { + "source": "iana" + }, + "text/rtx": { + "source": "iana" + }, + "text/sgml": { + "source": "iana", + "extensions": ["sgml","sgm"] + }, + "text/shaclc": { + "source": "iana" + }, + "text/shex": { + "source": "iana", + "extensions": ["shex"] + }, + "text/slim": { + "extensions": ["slim","slm"] + }, + "text/spdx": { + "source": "iana", + "extensions": ["spdx"] + }, + "text/strings": { + "source": "iana" + }, + "text/stylus": { + "extensions": ["stylus","styl"] + }, + "text/t140": { + "source": "iana" + }, + "text/tab-separated-values": { + "source": "iana", + "compressible": true, + "extensions": ["tsv"] + }, + "text/troff": { + "source": "iana", + "extensions": ["t","tr","roff","man","me","ms"] + }, + "text/turtle": { + "source": "iana", + "charset": "UTF-8", + "extensions": ["ttl"] + }, + "text/ulpfec": { + "source": "iana" + }, + "text/uri-list": { + "source": "iana", + "compressible": true, + "extensions": ["uri","uris","urls"] + }, + "text/vcard": { + "source": "iana", + "compressible": true, + "extensions": ["vcard"] + }, + "text/vnd.a": { + "source": "iana" + }, + "text/vnd.abc": { + "source": "iana" + }, + "text/vnd.ascii-art": { + "source": "iana" + }, + "text/vnd.curl": { + "source": "iana", + "extensions": ["curl"] + }, + 
"text/vnd.curl.dcurl": { + "source": "apache", + "extensions": ["dcurl"] + }, + "text/vnd.curl.mcurl": { + "source": "apache", + "extensions": ["mcurl"] + }, + "text/vnd.curl.scurl": { + "source": "apache", + "extensions": ["scurl"] + }, + "text/vnd.debian.copyright": { + "source": "iana", + "charset": "UTF-8" + }, + "text/vnd.dmclientscript": { + "source": "iana" + }, + "text/vnd.dvb.subtitle": { + "source": "iana", + "extensions": ["sub"] + }, + "text/vnd.esmertec.theme-descriptor": { + "source": "iana", + "charset": "UTF-8" + }, + "text/vnd.ficlab.flt": { + "source": "iana" + }, + "text/vnd.fly": { + "source": "iana", + "extensions": ["fly"] + }, + "text/vnd.fmi.flexstor": { + "source": "iana", + "extensions": ["flx"] + }, + "text/vnd.gml": { + "source": "iana" + }, + "text/vnd.graphviz": { + "source": "iana", + "extensions": ["gv"] + }, + "text/vnd.hans": { + "source": "iana" + }, + "text/vnd.hgl": { + "source": "iana" + }, + "text/vnd.in3d.3dml": { + "source": "iana", + "extensions": ["3dml"] + }, + "text/vnd.in3d.spot": { + "source": "iana", + "extensions": ["spot"] + }, + "text/vnd.iptc.newsml": { + "source": "iana" + }, + "text/vnd.iptc.nitf": { + "source": "iana" + }, + "text/vnd.latex-z": { + "source": "iana" + }, + "text/vnd.motorola.reflex": { + "source": "iana" + }, + "text/vnd.ms-mediapackage": { + "source": "iana" + }, + "text/vnd.net2phone.commcenter.command": { + "source": "iana" + }, + "text/vnd.radisys.msml-basic-layout": { + "source": "iana" + }, + "text/vnd.senx.warpscript": { + "source": "iana" + }, + "text/vnd.si.uricatalogue": { + "source": "iana" + }, + "text/vnd.sosi": { + "source": "iana" + }, + "text/vnd.sun.j2me.app-descriptor": { + "source": "iana", + "charset": "UTF-8", + "extensions": ["jad"] + }, + "text/vnd.trolltech.linguist": { + "source": "iana", + "charset": "UTF-8" + }, + "text/vnd.wap.si": { + "source": "iana" + }, + "text/vnd.wap.sl": { + "source": "iana" + }, + "text/vnd.wap.wml": { + "source": "iana", + "extensions": ["wml"] + }, + "text/vnd.wap.wmlscript": { + "source": "iana", + "extensions": ["wmls"] + }, + "text/vtt": { + "source": "iana", + "charset": "UTF-8", + "compressible": true, + "extensions": ["vtt"] + }, + "text/x-asm": { + "source": "apache", + "extensions": ["s","asm"] + }, + "text/x-c": { + "source": "apache", + "extensions": ["c","cc","cxx","cpp","h","hh","dic"] + }, + "text/x-component": { + "source": "nginx", + "extensions": ["htc"] + }, + "text/x-fortran": { + "source": "apache", + "extensions": ["f","for","f77","f90"] + }, + "text/x-gwt-rpc": { + "compressible": true + }, + "text/x-handlebars-template": { + "extensions": ["hbs"] + }, + "text/x-java-source": { + "source": "apache", + "extensions": ["java"] + }, + "text/x-jquery-tmpl": { + "compressible": true + }, + "text/x-lua": { + "extensions": ["lua"] + }, + "text/x-markdown": { + "compressible": true, + "extensions": ["mkd"] + }, + "text/x-nfo": { + "source": "apache", + "extensions": ["nfo"] + }, + "text/x-opml": { + "source": "apache", + "extensions": ["opml"] + }, + "text/x-org": { + "compressible": true, + "extensions": ["org"] + }, + "text/x-pascal": { + "source": "apache", + "extensions": ["p","pas"] + }, + "text/x-processing": { + "compressible": true, + "extensions": ["pde"] + }, + "text/x-sass": { + "extensions": ["sass"] + }, + "text/x-scss": { + "extensions": ["scss"] + }, + "text/x-setext": { + "source": "apache", + "extensions": ["etx"] + }, + "text/x-sfv": { + "source": "apache", + "extensions": ["sfv"] + }, + "text/x-suse-ymp": { + "compressible": true, + 
"extensions": ["ymp"] + }, + "text/x-uuencode": { + "source": "apache", + "extensions": ["uu"] + }, + "text/x-vcalendar": { + "source": "apache", + "extensions": ["vcs"] + }, + "text/x-vcard": { + "source": "apache", + "extensions": ["vcf"] + }, + "text/xml": { + "source": "iana", + "compressible": true, + "extensions": ["xml"] + }, + "text/xml-external-parsed-entity": { + "source": "iana" + }, + "text/yaml": { + "compressible": true, + "extensions": ["yaml","yml"] + }, + "video/1d-interleaved-parityfec": { + "source": "iana" + }, + "video/3gpp": { + "source": "iana", + "extensions": ["3gp","3gpp"] + }, + "video/3gpp-tt": { + "source": "iana" + }, + "video/3gpp2": { + "source": "iana", + "extensions": ["3g2"] + }, + "video/av1": { + "source": "iana" + }, + "video/bmpeg": { + "source": "iana" + }, + "video/bt656": { + "source": "iana" + }, + "video/celb": { + "source": "iana" + }, + "video/dv": { + "source": "iana" + }, + "video/encaprtp": { + "source": "iana" + }, + "video/ffv1": { + "source": "iana" + }, + "video/flexfec": { + "source": "iana" + }, + "video/h261": { + "source": "iana", + "extensions": ["h261"] + }, + "video/h263": { + "source": "iana", + "extensions": ["h263"] + }, + "video/h263-1998": { + "source": "iana" + }, + "video/h263-2000": { + "source": "iana" + }, + "video/h264": { + "source": "iana", + "extensions": ["h264"] + }, + "video/h264-rcdo": { + "source": "iana" + }, + "video/h264-svc": { + "source": "iana" + }, + "video/h265": { + "source": "iana" + }, + "video/iso.segment": { + "source": "iana", + "extensions": ["m4s"] + }, + "video/jpeg": { + "source": "iana", + "extensions": ["jpgv"] + }, + "video/jpeg2000": { + "source": "iana" + }, + "video/jpm": { + "source": "apache", + "extensions": ["jpm","jpgm"] + }, + "video/jxsv": { + "source": "iana" + }, + "video/mj2": { + "source": "iana", + "extensions": ["mj2","mjp2"] + }, + "video/mp1s": { + "source": "iana" + }, + "video/mp2p": { + "source": "iana" + }, + "video/mp2t": { + "source": "iana", + "extensions": ["ts"] + }, + "video/mp4": { + "source": "iana", + "compressible": false, + "extensions": ["mp4","mp4v","mpg4"] + }, + "video/mp4v-es": { + "source": "iana" + }, + "video/mpeg": { + "source": "iana", + "compressible": false, + "extensions": ["mpeg","mpg","mpe","m1v","m2v"] + }, + "video/mpeg4-generic": { + "source": "iana" + }, + "video/mpv": { + "source": "iana" + }, + "video/nv": { + "source": "iana" + }, + "video/ogg": { + "source": "iana", + "compressible": false, + "extensions": ["ogv"] + }, + "video/parityfec": { + "source": "iana" + }, + "video/pointer": { + "source": "iana" + }, + "video/quicktime": { + "source": "iana", + "compressible": false, + "extensions": ["qt","mov"] + }, + "video/raptorfec": { + "source": "iana" + }, + "video/raw": { + "source": "iana" + }, + "video/rtp-enc-aescm128": { + "source": "iana" + }, + "video/rtploopback": { + "source": "iana" + }, + "video/rtx": { + "source": "iana" + }, + "video/scip": { + "source": "iana" + }, + "video/smpte291": { + "source": "iana" + }, + "video/smpte292m": { + "source": "iana" + }, + "video/ulpfec": { + "source": "iana" + }, + "video/vc1": { + "source": "iana" + }, + "video/vc2": { + "source": "iana" + }, + "video/vnd.cctv": { + "source": "iana" + }, + "video/vnd.dece.hd": { + "source": "iana", + "extensions": ["uvh","uvvh"] + }, + "video/vnd.dece.mobile": { + "source": "iana", + "extensions": ["uvm","uvvm"] + }, + "video/vnd.dece.mp4": { + "source": "iana" + }, + "video/vnd.dece.pd": { + "source": "iana", + "extensions": ["uvp","uvvp"] + }, + 
"video/vnd.dece.sd": { + "source": "iana", + "extensions": ["uvs","uvvs"] + }, + "video/vnd.dece.video": { + "source": "iana", + "extensions": ["uvv","uvvv"] + }, + "video/vnd.directv.mpeg": { + "source": "iana" + }, + "video/vnd.directv.mpeg-tts": { + "source": "iana" + }, + "video/vnd.dlna.mpeg-tts": { + "source": "iana" + }, + "video/vnd.dvb.file": { + "source": "iana", + "extensions": ["dvb"] + }, + "video/vnd.fvt": { + "source": "iana", + "extensions": ["fvt"] + }, + "video/vnd.hns.video": { + "source": "iana" + }, + "video/vnd.iptvforum.1dparityfec-1010": { + "source": "iana" + }, + "video/vnd.iptvforum.1dparityfec-2005": { + "source": "iana" + }, + "video/vnd.iptvforum.2dparityfec-1010": { + "source": "iana" + }, + "video/vnd.iptvforum.2dparityfec-2005": { + "source": "iana" + }, + "video/vnd.iptvforum.ttsavc": { + "source": "iana" + }, + "video/vnd.iptvforum.ttsmpeg2": { + "source": "iana" + }, + "video/vnd.motorola.video": { + "source": "iana" + }, + "video/vnd.motorola.videop": { + "source": "iana" + }, + "video/vnd.mpegurl": { + "source": "iana", + "extensions": ["mxu","m4u"] + }, + "video/vnd.ms-playready.media.pyv": { + "source": "iana", + "extensions": ["pyv"] + }, + "video/vnd.nokia.interleaved-multimedia": { + "source": "iana" + }, + "video/vnd.nokia.mp4vr": { + "source": "iana" + }, + "video/vnd.nokia.videovoip": { + "source": "iana" + }, + "video/vnd.objectvideo": { + "source": "iana" + }, + "video/vnd.radgamettools.bink": { + "source": "iana" + }, + "video/vnd.radgamettools.smacker": { + "source": "iana" + }, + "video/vnd.sealed.mpeg1": { + "source": "iana" + }, + "video/vnd.sealed.mpeg4": { + "source": "iana" + }, + "video/vnd.sealed.swf": { + "source": "iana" + }, + "video/vnd.sealedmedia.softseal.mov": { + "source": "iana" + }, + "video/vnd.uvvu.mp4": { + "source": "iana", + "extensions": ["uvu","uvvu"] + }, + "video/vnd.vivo": { + "source": "iana", + "extensions": ["viv"] + }, + "video/vnd.youtube.yt": { + "source": "iana" + }, + "video/vp8": { + "source": "iana" + }, + "video/vp9": { + "source": "iana" + }, + "video/webm": { + "source": "apache", + "compressible": false, + "extensions": ["webm"] + }, + "video/x-f4v": { + "source": "apache", + "extensions": ["f4v"] + }, + "video/x-fli": { + "source": "apache", + "extensions": ["fli"] + }, + "video/x-flv": { + "source": "apache", + "compressible": false, + "extensions": ["flv"] + }, + "video/x-m4v": { + "source": "apache", + "extensions": ["m4v"] + }, + "video/x-matroska": { + "source": "apache", + "compressible": false, + "extensions": ["mkv","mk3d","mks"] + }, + "video/x-mng": { + "source": "apache", + "extensions": ["mng"] + }, + "video/x-ms-asf": { + "source": "apache", + "extensions": ["asf","asx"] + }, + "video/x-ms-vob": { + "source": "apache", + "extensions": ["vob"] + }, + "video/x-ms-wm": { + "source": "apache", + "extensions": ["wm"] + }, + "video/x-ms-wmv": { + "source": "apache", + "compressible": false, + "extensions": ["wmv"] + }, + "video/x-ms-wmx": { + "source": "apache", + "extensions": ["wmx"] + }, + "video/x-ms-wvx": { + "source": "apache", + "extensions": ["wvx"] + }, + "video/x-msvideo": { + "source": "apache", + "extensions": ["avi"] + }, + "video/x-sgi-movie": { + "source": "apache", + "extensions": ["movie"] + }, + "video/x-smv": { + "source": "apache", + "extensions": ["smv"] + }, + "x-conference/x-cooltalk": { + "source": "apache", + "extensions": ["ice"] + }, + "x-shader/x-fragment": { + "compressible": true + }, + "x-shader/x-vertex": { + "compressible": true + } +} diff --git 
a/node_modules/mime-db/index.js b/node_modules/mime-db/index.js new file mode 100644 index 000000000..551031f69 --- /dev/null +++ b/node_modules/mime-db/index.js @@ -0,0 +1,11 @@ +/*! + * mime-db + * Copyright(c) 2014 Jonathan Ong + * MIT Licensed + */ + +/** + * Module exports. + */ + +module.exports = require('./db.json') diff --git a/node_modules/mime-db/package.json b/node_modules/mime-db/package.json new file mode 100644 index 000000000..aabb474f3 --- /dev/null +++ b/node_modules/mime-db/package.json @@ -0,0 +1,102 @@ +{ + "_from": "mime-db@1.50.0", + "_id": "mime-db@1.50.0", + "_inBundle": false, + "_integrity": "sha512-9tMZCDlYHqeERXEHO9f/hKfNXhre5dK2eE/krIvUjZbS2KPcqGDfNShIWS1uW9XOTKQKqK6qbeOci18rbfW77A==", + "_location": "/mime-db", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "mime-db@1.50.0", + "name": "mime-db", + "escapedName": "mime-db", + "rawSpec": "1.50.0", + "saveSpec": null, + "fetchSpec": "1.50.0" + }, + "_requiredBy": [ + "/mime-types" + ], + "_resolved": "https://registry.npmjs.org/mime-db/-/mime-db-1.50.0.tgz", + "_shasum": "abd4ac94e98d3c0e185016c67ab45d5fde40c11f", + "_spec": "mime-db@1.50.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\mime-types", + "bugs": { + "url": "https://github.com/jshttp/mime-db/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + }, + { + "name": "Robert Kieffer", + "email": "robert@broofa.com", + "url": "http://github.com/broofa" + } + ], + "deprecated": false, + "description": "Media Type Database", + "devDependencies": { + "bluebird": "3.7.2", + "co": "4.6.0", + "cogent": "1.0.1", + "csv-parse": "4.16.3", + "eslint": "7.32.0", + "eslint-config-standard": "15.0.1", + "eslint-plugin-import": "2.24.2", + "eslint-plugin-markdown": "2.2.1", + "eslint-plugin-node": "11.1.0", + "eslint-plugin-promise": "5.1.0", + "eslint-plugin-standard": "4.1.0", + "gnode": "0.1.2", + "mocha": "9.1.1", + "nyc": "15.1.0", + "raw-body": "2.4.1", + "stream-to-array": "2.3.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "README.md", + "db.json", + "index.js" + ], + "homepage": "https://github.com/jshttp/mime-db#readme", + "keywords": [ + "mime", + "db", + "type", + "types", + "database", + "charset", + "charsets" + ], + "license": "MIT", + "name": "mime-db", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/mime-db.git" + }, + "scripts": { + "build": "node scripts/build", + "fetch": "node scripts/fetch-apache && gnode scripts/fetch-iana && node scripts/fetch-nginx", + "lint": "eslint .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-ci": "nyc --reporter=lcov --reporter=text npm test", + "test-cov": "nyc --reporter=html --reporter=text npm test", + "update": "npm run fetch && npm run build", + "version": "node scripts/version-history.js && git add HISTORY.md" + }, + "version": "1.50.0" +} diff --git a/node_modules/mime-types/HISTORY.md b/node_modules/mime-types/HISTORY.md new file mode 100644 index 000000000..d5545c178 --- /dev/null +++ b/node_modules/mime-types/HISTORY.md @@ -0,0 +1,385 @@ +2.1.33 / 2021-10-01 +=================== + + * deps: mime-db@1.50.0 + - Add deprecated iWorks mime types and extensions + - Add new upstream MIME types + +2.1.32 / 2021-07-27 
+=================== + + * deps: mime-db@1.49.0 + - Add extension `.trig` to `application/trig` + - Add new upstream MIME types + +2.1.31 / 2021-06-01 +=================== + + * deps: mime-db@1.48.0 + - Add extension `.mvt` to `application/vnd.mapbox-vector-tile` + - Add new upstream MIME types + - Mark `text/yaml` as compressible + +2.1.30 / 2021-04-02 +=================== + + * deps: mime-db@1.47.0 + - Add extension `.amr` to `audio/amr` + - Remove ambigious extensions from IANA for `application/*+xml` types + - Update primary extension to `.es` for `application/ecmascript` + +2.1.29 / 2021-02-17 +=================== + + * deps: mime-db@1.46.0 + - Add extension `.amr` to `audio/amr` + - Add extension `.m4s` to `video/iso.segment` + - Add extension `.opus` to `audio/ogg` + - Add new upstream MIME types + +2.1.28 / 2021-01-01 +=================== + + * deps: mime-db@1.45.0 + - Add `application/ubjson` with extension `.ubj` + - Add `image/avif` with extension `.avif` + - Add `image/ktx2` with extension `.ktx2` + - Add extension `.dbf` to `application/vnd.dbf` + - Add extension `.rar` to `application/vnd.rar` + - Add extension `.td` to `application/urc-targetdesc+xml` + - Add new upstream MIME types + - Fix extension of `application/vnd.apple.keynote` to be `.key` + +2.1.27 / 2020-04-23 +=================== + + * deps: mime-db@1.44.0 + - Add charsets from IANA + - Add extension `.cjs` to `application/node` + - Add new upstream MIME types + +2.1.26 / 2020-01-05 +=================== + + * deps: mime-db@1.43.0 + - Add `application/x-keepass2` with extension `.kdbx` + - Add extension `.mxmf` to `audio/mobile-xmf` + - Add extensions from IANA for `application/*+xml` types + - Add new upstream MIME types + +2.1.25 / 2019-11-12 +=================== + + * deps: mime-db@1.42.0 + - Add new upstream MIME types + - Add `application/toml` with extension `.toml` + - Add `image/vnd.ms-dds` with extension `.dds` + +2.1.24 / 2019-04-20 +=================== + + * deps: mime-db@1.40.0 + - Add extensions from IANA for `model/*` types + - Add `text/mdx` with extension `.mdx` + +2.1.23 / 2019-04-17 +=================== + + * deps: mime-db@~1.39.0 + - Add extensions `.siv` and `.sieve` to `application/sieve` + - Add new upstream MIME types + +2.1.22 / 2019-02-14 +=================== + + * deps: mime-db@~1.38.0 + - Add extension `.nq` to `application/n-quads` + - Add extension `.nt` to `application/n-triples` + - Add new upstream MIME types + - Mark `text/less` as compressible + +2.1.21 / 2018-10-19 +=================== + + * deps: mime-db@~1.37.0 + - Add extensions to HEIC image types + - Add new upstream MIME types + +2.1.20 / 2018-08-26 +=================== + + * deps: mime-db@~1.36.0 + - Add Apple file extensions from IANA + - Add extensions from IANA for `image/*` types + - Add new upstream MIME types + +2.1.19 / 2018-07-17 +=================== + + * deps: mime-db@~1.35.0 + - Add extension `.csl` to `application/vnd.citationstyles.style+xml` + - Add extension `.es` to `application/ecmascript` + - Add extension `.owl` to `application/rdf+xml` + - Add new upstream MIME types + - Add UTF-8 as default charset for `text/turtle` + +2.1.18 / 2018-02-16 +=================== + + * deps: mime-db@~1.33.0 + - Add `application/raml+yaml` with extension `.raml` + - Add `application/wasm` with extension `.wasm` + - Add `text/shex` with extension `.shex` + - Add extensions for JPEG-2000 images + - Add extensions from IANA for `message/*` types + - Add new upstream MIME types + - Update font MIME types + - Update `text/hjson` 
to registered `application/hjson` + +2.1.17 / 2017-09-01 +=================== + + * deps: mime-db@~1.30.0 + - Add `application/vnd.ms-outlook` + - Add `application/x-arj` + - Add extension `.mjs` to `application/javascript` + - Add glTF types and extensions + - Add new upstream MIME types + - Add `text/x-org` + - Add VirtualBox MIME types + - Fix `source` records for `video/*` types that are IANA + - Update `font/opentype` to registered `font/otf` + +2.1.16 / 2017-07-24 +=================== + + * deps: mime-db@~1.29.0 + - Add `application/fido.trusted-apps+json` + - Add extension `.wadl` to `application/vnd.sun.wadl+xml` + - Add extension `.gz` to `application/gzip` + - Add new upstream MIME types + - Update extensions `.md` and `.markdown` to be `text/markdown` + +2.1.15 / 2017-03-23 +=================== + + * deps: mime-db@~1.27.0 + - Add new mime types + - Add `image/apng` + +2.1.14 / 2017-01-14 +=================== + + * deps: mime-db@~1.26.0 + - Add new mime types + +2.1.13 / 2016-11-18 +=================== + + * deps: mime-db@~1.25.0 + - Add new mime types + +2.1.12 / 2016-09-18 +=================== + + * deps: mime-db@~1.24.0 + - Add new mime types + - Add `audio/mp3` + +2.1.11 / 2016-05-01 +=================== + + * deps: mime-db@~1.23.0 + - Add new mime types + +2.1.10 / 2016-02-15 +=================== + + * deps: mime-db@~1.22.0 + - Add new mime types + - Fix extension of `application/dash+xml` + - Update primary extension for `audio/mp4` + +2.1.9 / 2016-01-06 +================== + + * deps: mime-db@~1.21.0 + - Add new mime types + +2.1.8 / 2015-11-30 +================== + + * deps: mime-db@~1.20.0 + - Add new mime types + +2.1.7 / 2015-09-20 +================== + + * deps: mime-db@~1.19.0 + - Add new mime types + +2.1.6 / 2015-09-03 +================== + + * deps: mime-db@~1.18.0 + - Add new mime types + +2.1.5 / 2015-08-20 +================== + + * deps: mime-db@~1.17.0 + - Add new mime types + +2.1.4 / 2015-07-30 +================== + + * deps: mime-db@~1.16.0 + - Add new mime types + +2.1.3 / 2015-07-13 +================== + + * deps: mime-db@~1.15.0 + - Add new mime types + +2.1.2 / 2015-06-25 +================== + + * deps: mime-db@~1.14.0 + - Add new mime types + +2.1.1 / 2015-06-08 +================== + + * perf: fix deopt during mapping + +2.1.0 / 2015-06-07 +================== + + * Fix incorrectly treating extension-less file name as extension + - i.e. 
`'path/to/json'` will no longer return `application/json` + * Fix `.charset(type)` to accept parameters + * Fix `.charset(type)` to match case-insensitive + * Improve generation of extension to MIME mapping + * Refactor internals for readability and no argument reassignment + * Prefer `application/*` MIME types from the same source + * Prefer any type over `application/octet-stream` + * deps: mime-db@~1.13.0 + - Add nginx as a source + - Add new mime types + +2.0.14 / 2015-06-06 +=================== + + * deps: mime-db@~1.12.0 + - Add new mime types + +2.0.13 / 2015-05-31 +=================== + + * deps: mime-db@~1.11.0 + - Add new mime types + +2.0.12 / 2015-05-19 +=================== + + * deps: mime-db@~1.10.0 + - Add new mime types + +2.0.11 / 2015-05-05 +=================== + + * deps: mime-db@~1.9.1 + - Add new mime types + +2.0.10 / 2015-03-13 +=================== + + * deps: mime-db@~1.8.0 + - Add new mime types + +2.0.9 / 2015-02-09 +================== + + * deps: mime-db@~1.7.0 + - Add new mime types + - Community extensions ownership transferred from `node-mime` + +2.0.8 / 2015-01-29 +================== + + * deps: mime-db@~1.6.0 + - Add new mime types + +2.0.7 / 2014-12-30 +================== + + * deps: mime-db@~1.5.0 + - Add new mime types + - Fix various invalid MIME type entries + +2.0.6 / 2014-12-30 +================== + + * deps: mime-db@~1.4.0 + - Add new mime types + - Fix various invalid MIME type entries + - Remove example template MIME types + +2.0.5 / 2014-12-29 +================== + + * deps: mime-db@~1.3.1 + - Fix missing extensions + +2.0.4 / 2014-12-10 +================== + + * deps: mime-db@~1.3.0 + - Add new mime types + +2.0.3 / 2014-11-09 +================== + + * deps: mime-db@~1.2.0 + - Add new mime types + +2.0.2 / 2014-09-28 +================== + + * deps: mime-db@~1.1.0 + - Add new mime types + - Add additional compressible + - Update charsets + +2.0.1 / 2014-09-07 +================== + + * Support Node.js 0.6 + +2.0.0 / 2014-09-02 +================== + + * Use `mime-db` + * Remove `.define()` + +1.0.2 / 2014-08-04 +================== + + * Set charset=utf-8 for `text/javascript` + +1.0.1 / 2014-06-24 +================== + + * Add `text/jsx` type + +1.0.0 / 2014-05-12 +================== + + * Return `false` for unknown types + * Set charset=utf-8 for `application/json` + +0.1.0 / 2014-05-02 +================== + + * Initial release diff --git a/node_modules/mime-types/LICENSE b/node_modules/mime-types/LICENSE new file mode 100644 index 000000000..06166077b --- /dev/null +++ b/node_modules/mime-types/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2014 Jonathan Ong +Copyright (c) 2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
+IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/mime-types/README.md b/node_modules/mime-types/README.md new file mode 100644 index 000000000..c978ac27a --- /dev/null +++ b/node_modules/mime-types/README.md @@ -0,0 +1,113 @@ +# mime-types + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][ci-image]][ci-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +The ultimate javascript content-type utility. + +Similar to [the `mime@1.x` module](https://www.npmjs.com/package/mime), except: + +- __No fallbacks.__ Instead of naively returning the first available type, + `mime-types` simply returns `false`, so do + `var type = mime.lookup('unrecognized') || 'application/octet-stream'`. +- No `new Mime()` business, so you could do `var lookup = require('mime-types').lookup`. +- No `.define()` functionality +- Bug fixes for `.lookup(path)` + +Otherwise, the API is compatible with `mime` 1.x. + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install mime-types +``` + +## Adding Types + +All mime types are based on [mime-db](https://www.npmjs.com/package/mime-db), +so open a PR there if you'd like to add mime types. + +## API + +```js +var mime = require('mime-types') +``` + +All functions return `false` if input is invalid or not found. + +### mime.lookup(path) + +Lookup the content-type associated with a file. + +```js +mime.lookup('json') // 'application/json' +mime.lookup('.md') // 'text/markdown' +mime.lookup('file.html') // 'text/html' +mime.lookup('folder/file.js') // 'application/javascript' +mime.lookup('folder/.htaccess') // false + +mime.lookup('cats') // false +``` + +### mime.contentType(type) + +Create a full content-type header given a content-type or extension. +When given an extension, `mime.lookup` is used to get the matching +content-type, otherwise the given content-type is used. Then if the +content-type does not already have a `charset` parameter, `mime.charset` +is used to get the default charset and add to the returned content-type. + +```js +mime.contentType('markdown') // 'text/x-markdown; charset=utf-8' +mime.contentType('file.json') // 'application/json; charset=utf-8' +mime.contentType('text/html') // 'text/html; charset=utf-8' +mime.contentType('text/html; charset=iso-8859-1') // 'text/html; charset=iso-8859-1' + +// from a full path +mime.contentType(path.extname('/path/to/file.json')) // 'application/json; charset=utf-8' +``` + +### mime.extension(type) + +Get the default extension for a content-type. + +```js +mime.extension('application/octet-stream') // 'bin' +``` + +### mime.charset(type) + +Lookup the implied default charset of a content-type. + +```js +mime.charset('text/markdown') // 'UTF-8' +``` + +### var type = mime.types[extension] + +A map of content-types by extension. + +### [extensions...] = mime.extensions[type] + +A map of extensions by content-type. 
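+ +### Example: serving files with the correct Content-Type + +A minimal usage sketch (not part of the upstream README), assuming a plain Node `http` server; it combines `lookup` and `contentType` as documented above, falling back to `application/octet-stream` when the extension is unknown: + +```js +var http = require('http') +var mime = require('mime-types') + +http.createServer(function (req, res) { +  // Derive a type from the requested path; lookup returns false for unknown extensions. +  var type = mime.lookup(req.url) || 'application/octet-stream' +  // contentType adds a default charset for text-like types (e.g. "; charset=utf-8"). +  res.setHeader('Content-Type', mime.contentType(type)) +  res.end('served as ' + type + '\n') +}).listen(3000) +```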
+ +## License + +[MIT](LICENSE) + +[ci-image]: https://badgen.net/github/checks/jshttp/mime-types/master?label=ci +[ci-url]: https://github.com/jshttp/mime-types/actions?query=workflow%3Aci +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/mime-types/master +[coveralls-url]: https://coveralls.io/r/jshttp/mime-types?branch=master +[node-version-image]: https://badgen.net/npm/node/mime-types +[node-version-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/mime-types +[npm-url]: https://npmjs.org/package/mime-types +[npm-version-image]: https://badgen.net/npm/v/mime-types diff --git a/node_modules/mime-types/index.js b/node_modules/mime-types/index.js new file mode 100644 index 000000000..b9f34d599 --- /dev/null +++ b/node_modules/mime-types/index.js @@ -0,0 +1,188 @@ +/*! + * mime-types + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var db = require('mime-db') +var extname = require('path').extname + +/** + * Module variables. + * @private + */ + +var EXTRACT_TYPE_REGEXP = /^\s*([^;\s]*)(?:;|\s|$)/ +var TEXT_TYPE_REGEXP = /^text\//i + +/** + * Module exports. + * @public + */ + +exports.charset = charset +exports.charsets = { lookup: charset } +exports.contentType = contentType +exports.extension = extension +exports.extensions = Object.create(null) +exports.lookup = lookup +exports.types = Object.create(null) + +// Populate the extensions/types maps +populateMaps(exports.extensions, exports.types) + +/** + * Get the default charset for a MIME type. + * + * @param {string} type + * @return {boolean|string} + */ + +function charset (type) { + if (!type || typeof type !== 'string') { + return false + } + + // TODO: use media-typer + var match = EXTRACT_TYPE_REGEXP.exec(type) + var mime = match && db[match[1].toLowerCase()] + + if (mime && mime.charset) { + return mime.charset + } + + // default text/* to utf-8 + if (match && TEXT_TYPE_REGEXP.test(match[1])) { + return 'UTF-8' + } + + return false +} + +/** + * Create a full Content-Type header given a MIME type or extension. + * + * @param {string} str + * @return {boolean|string} + */ + +function contentType (str) { + // TODO: should this even be in this module? + if (!str || typeof str !== 'string') { + return false + } + + var mime = str.indexOf('/') === -1 + ? exports.lookup(str) + : str + + if (!mime) { + return false + } + + // TODO: use content-type or other module + if (mime.indexOf('charset') === -1) { + var charset = exports.charset(mime) + if (charset) mime += '; charset=' + charset.toLowerCase() + } + + return mime +} + +/** + * Get the default extension for a MIME type. + * + * @param {string} type + * @return {boolean|string} + */ + +function extension (type) { + if (!type || typeof type !== 'string') { + return false + } + + // TODO: use media-typer + var match = EXTRACT_TYPE_REGEXP.exec(type) + + // get extensions + var exts = match && exports.extensions[match[1].toLowerCase()] + + if (!exts || !exts.length) { + return false + } + + return exts[0] +} + +/** + * Lookup the MIME type for a file path/extension. + * + * @param {string} path + * @return {boolean|string} + */ + +function lookup (path) { + if (!path || typeof path !== 'string') { + return false + } + + // get the extension ("ext" or ".ext" or full path) + var extension = extname('x.' 
+ path) + .toLowerCase() + .substr(1) + + if (!extension) { + return false + } + + return exports.types[extension] || false +} + +/** + * Populate the extensions and types maps. + * @private + */ + +function populateMaps (extensions, types) { + // source preference (least -> most) + var preference = ['nginx', 'apache', undefined, 'iana'] + + Object.keys(db).forEach(function forEachMimeType (type) { + var mime = db[type] + var exts = mime.extensions + + if (!exts || !exts.length) { + return + } + + // mime -> extensions + extensions[type] = exts + + // extension -> mime + for (var i = 0; i < exts.length; i++) { + var extension = exts[i] + + if (types[extension]) { + var from = preference.indexOf(db[types[extension]].source) + var to = preference.indexOf(mime.source) + + if (types[extension] !== 'application/octet-stream' && + (from > to || (from === to && types[extension].substr(0, 12) === 'application/'))) { + // skip the remapping + continue + } + } + + // set the extension -> mime + types[extension] = type + } + }) +} diff --git a/node_modules/mime-types/package.json b/node_modules/mime-types/package.json new file mode 100644 index 000000000..382ce854e --- /dev/null +++ b/node_modules/mime-types/package.json @@ -0,0 +1,88 @@ +{ + "_from": "mime-types@~2.1.24", + "_id": "mime-types@2.1.33", + "_inBundle": false, + "_integrity": "sha512-plLElXp7pRDd0bNZHw+nMd52vRYjLwQjygaNg7ddJ2uJtTlmnTCjWuPKxVu6//AdaRuME84SvLW91sIkBqGT0g==", + "_location": "/mime-types", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "mime-types@~2.1.24", + "name": "mime-types", + "escapedName": "mime-types", + "rawSpec": "~2.1.24", + "saveSpec": null, + "fetchSpec": "~2.1.24" + }, + "_requiredBy": [ + "/accepts", + "/type-is" + ], + "_resolved": "https://registry.npmjs.org/mime-types/-/mime-types-2.1.33.tgz", + "_shasum": "1fa12a904472fafd068e48d9e8401f74d3f70edb", + "_spec": "mime-types@~2.1.24", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\accepts", + "bugs": { + "url": "https://github.com/jshttp/mime-types/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jeremiah Senkpiel", + "email": "fishrock123@rocketmail.com", + "url": "https://searchbeam.jit.su" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "dependencies": { + "mime-db": "1.50.0" + }, + "deprecated": false, + "description": "The ultimate javascript content-type utility.", + "devDependencies": { + "eslint": "7.32.0", + "eslint-config-standard": "14.1.1", + "eslint-plugin-import": "2.24.2", + "eslint-plugin-markdown": "2.2.1", + "eslint-plugin-node": "11.1.0", + "eslint-plugin-promise": "5.1.0", + "eslint-plugin-standard": "4.1.0", + "mocha": "9.1.2", + "nyc": "15.1.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "index.js" + ], + "homepage": "https://github.com/jshttp/mime-types#readme", + "keywords": [ + "mime", + "types" + ], + "license": "MIT", + "name": "mime-types", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/mime-types.git" + }, + "scripts": { + "lint": "eslint .", + "test": "mocha --reporter spec test/test.js", + "test-ci": "nyc --reporter=lcov --reporter=text npm test", + "test-cov": "nyc --reporter=html --reporter=text npm test" + }, + "version": "2.1.33" +} diff --git a/node_modules/mime/.npmignore 
b/node_modules/mime/.npmignore new file mode 100644 index 000000000..e69de29bb diff --git a/node_modules/mime/CHANGELOG.md b/node_modules/mime/CHANGELOG.md new file mode 100644 index 000000000..f12753505 --- /dev/null +++ b/node_modules/mime/CHANGELOG.md @@ -0,0 +1,164 @@ +# Changelog + +## v1.6.0 (24/11/2017) +*No changelog for this release.* + +--- + +## v2.0.4 (24/11/2017) +- [**closed**] Switch to mime-score module for resolving extension contention issues. [#182](https://github.com/broofa/node-mime/issues/182) +- [**closed**] Update mime-db to 1.31.0 in v1.x branch [#181](https://github.com/broofa/node-mime/issues/181) + +--- + +## v1.5.0 (22/11/2017) +- [**closed**] need ES5 version ready in npm package [#179](https://github.com/broofa/node-mime/issues/179) +- [**closed**] mime-db no trace of iWork - pages / numbers / etc. [#178](https://github.com/broofa/node-mime/issues/178) +- [**closed**] How it works in brownser ? [#176](https://github.com/broofa/node-mime/issues/176) +- [**closed**] Missing `./Mime` [#175](https://github.com/broofa/node-mime/issues/175) +- [**closed**] Vulnerable Regular Expression [#167](https://github.com/broofa/node-mime/issues/167) + +--- + +## v2.0.3 (25/09/2017) +*No changelog for this release.* + +--- + +## v1.4.1 (25/09/2017) +- [**closed**] Issue when bundling with webpack [#172](https://github.com/broofa/node-mime/issues/172) + +--- + +## v2.0.2 (15/09/2017) +- [**V2**] fs.readFileSync is not a function [#165](https://github.com/broofa/node-mime/issues/165) +- [**closed**] The extension for video/quicktime should map to .mov, not .qt [#164](https://github.com/broofa/node-mime/issues/164) +- [**V2**] [v2 Feedback request] Mime class API [#163](https://github.com/broofa/node-mime/issues/163) +- [**V2**] [v2 Feedback request] Resolving conflicts over extensions [#162](https://github.com/broofa/node-mime/issues/162) +- [**V2**] Allow callers to load module with official, full, or no defined types. 
[#161](https://github.com/broofa/node-mime/issues/161) +- [**V2**] Use "facets" to resolve extension conflicts [#160](https://github.com/broofa/node-mime/issues/160) +- [**V2**] Remove fs and path dependencies [#152](https://github.com/broofa/node-mime/issues/152) +- [**V2**] Default content-type should not be application/octet-stream [#139](https://github.com/broofa/node-mime/issues/139) +- [**V2**] reset mime-types [#124](https://github.com/broofa/node-mime/issues/124) +- [**V2**] Extensionless paths should return null or false [#113](https://github.com/broofa/node-mime/issues/113) + +--- + +## v2.0.1 (14/09/2017) +- [**closed**] Changelog for v2.0 does not mention breaking changes [#171](https://github.com/broofa/node-mime/issues/171) +- [**closed**] MIME breaking with 'class' declaration as it is without 'use strict mode' [#170](https://github.com/broofa/node-mime/issues/170) + +--- + +## v2.0.0 (12/09/2017) +- [**closed**] woff and woff2 [#168](https://github.com/broofa/node-mime/issues/168) + +--- + +## v1.4.0 (28/08/2017) +- [**closed**] support for ac3 voc files [#159](https://github.com/broofa/node-mime/issues/159) +- [**closed**] Help understanding change from application/xml to text/xml [#158](https://github.com/broofa/node-mime/issues/158) +- [**closed**] no longer able to override mimetype [#157](https://github.com/broofa/node-mime/issues/157) +- [**closed**] application/vnd.adobe.photoshop [#147](https://github.com/broofa/node-mime/issues/147) +- [**closed**] Directories should appear as something other than application/octet-stream [#135](https://github.com/broofa/node-mime/issues/135) +- [**closed**] requested features [#131](https://github.com/broofa/node-mime/issues/131) +- [**closed**] Make types.json loading optional? [#129](https://github.com/broofa/node-mime/issues/129) +- [**closed**] Cannot find module './types.json' [#120](https://github.com/broofa/node-mime/issues/120) +- [**V2**] .wav files show up as "audio/x-wav" instead of "audio/x-wave" [#118](https://github.com/broofa/node-mime/issues/118) +- [**closed**] Don't be a pain in the ass for node community [#108](https://github.com/broofa/node-mime/issues/108) +- [**closed**] don't make default_type global [#78](https://github.com/broofa/node-mime/issues/78) +- [**closed**] mime.extension() fails if the content-type is parameterized [#74](https://github.com/broofa/node-mime/issues/74) + +--- + +## v1.3.6 (11/05/2017) +- [**closed**] .md should be text/markdown as of March 2016 [#154](https://github.com/broofa/node-mime/issues/154) +- [**closed**] Error while installing mime [#153](https://github.com/broofa/node-mime/issues/153) +- [**closed**] application/manifest+json [#149](https://github.com/broofa/node-mime/issues/149) +- [**closed**] Dynamic adaptive streaming over HTTP (DASH) file extension typo [#141](https://github.com/broofa/node-mime/issues/141) +- [**closed**] charsets image/png undefined [#140](https://github.com/broofa/node-mime/issues/140) +- [**closed**] Mime-db dependency out of date [#130](https://github.com/broofa/node-mime/issues/130) +- [**closed**] how to support plist? [#126](https://github.com/broofa/node-mime/issues/126) +- [**closed**] how does .types file format look like? 
[#123](https://github.com/broofa/node-mime/issues/123) +- [**closed**] Feature: support for expanding MIME patterns [#121](https://github.com/broofa/node-mime/issues/121) +- [**closed**] DEBUG_MIME doesn't work [#117](https://github.com/broofa/node-mime/issues/117) + +--- + +## v1.3.4 (06/02/2015) +*No changelog for this release.* + +--- + +## v1.3.3 (06/02/2015) +*No changelog for this release.* + +--- + +## v1.3.1 (05/02/2015) +- [**closed**] Consider adding support for Handlebars .hbs file ending [#111](https://github.com/broofa/node-mime/issues/111) +- [**closed**] Consider adding support for hjson. [#110](https://github.com/broofa/node-mime/issues/110) +- [**closed**] Add mime type for Opus audio files [#94](https://github.com/broofa/node-mime/issues/94) +- [**closed**] Consider making the `Requesting New Types` information more visible [#77](https://github.com/broofa/node-mime/issues/77) + +--- + +## v1.3.0 (05/02/2015) +- [**closed**] Add common name? [#114](https://github.com/broofa/node-mime/issues/114) +- [**closed**] application/x-yaml [#104](https://github.com/broofa/node-mime/issues/104) +- [**closed**] Add mime type for WOFF file format 2.0 [#102](https://github.com/broofa/node-mime/issues/102) +- [**closed**] application/x-msi for .msi [#99](https://github.com/broofa/node-mime/issues/99) +- [**closed**] Add mimetype for gettext translation files [#98](https://github.com/broofa/node-mime/issues/98) +- [**closed**] collaborators [#88](https://github.com/broofa/node-mime/issues/88) +- [**closed**] getting errot in installation of mime module...any1 can help? [#87](https://github.com/broofa/node-mime/issues/87) +- [**closed**] should application/json's charset be utf8? [#86](https://github.com/broofa/node-mime/issues/86) +- [**closed**] Add "license" and "licenses" to package.json [#81](https://github.com/broofa/node-mime/issues/81) +- [**closed**] lookup with extension-less file on Windows returns wrong type [#68](https://github.com/broofa/node-mime/issues/68) + +--- + +## v1.2.11 (15/08/2013) +- [**closed**] Update mime.types [#65](https://github.com/broofa/node-mime/issues/65) +- [**closed**] Publish a new version [#63](https://github.com/broofa/node-mime/issues/63) +- [**closed**] README should state upfront that "application/octet-stream" is default for unknown extension [#55](https://github.com/broofa/node-mime/issues/55) +- [**closed**] Suggested improvement to the charset API [#52](https://github.com/broofa/node-mime/issues/52) + +--- + +## v1.2.10 (25/07/2013) +- [**closed**] Mime type for woff files should be application/font-woff and not application/x-font-woff [#62](https://github.com/broofa/node-mime/issues/62) +- [**closed**] node.types in conflict with mime.types [#51](https://github.com/broofa/node-mime/issues/51) + +--- + +## v1.2.9 (17/01/2013) +- [**closed**] Please update "mime" NPM [#49](https://github.com/broofa/node-mime/issues/49) +- [**closed**] Please add semicolon [#46](https://github.com/broofa/node-mime/issues/46) +- [**closed**] parse full mime types [#43](https://github.com/broofa/node-mime/issues/43) + +--- + +## v1.2.8 (10/01/2013) +- [**closed**] /js directory mime is application/javascript. Is it correct? [#47](https://github.com/broofa/node-mime/issues/47) +- [**closed**] Add mime types for lua code. 
[#45](https://github.com/broofa/node-mime/issues/45) + +--- + +## v1.2.7 (19/10/2012) +- [**closed**] cannot install 1.2.7 via npm [#41](https://github.com/broofa/node-mime/issues/41) +- [**closed**] Transfer ownership to @broofa [#36](https://github.com/broofa/node-mime/issues/36) +- [**closed**] it's wrong to set charset to UTF-8 for text [#30](https://github.com/broofa/node-mime/issues/30) +- [**closed**] Allow multiple instances of MIME types container [#27](https://github.com/broofa/node-mime/issues/27) + +--- + +## v1.2.5 (16/02/2012) +- [**closed**] When looking up a types, check hasOwnProperty [#23](https://github.com/broofa/node-mime/issues/23) +- [**closed**] Bump version to 1.2.2 [#18](https://github.com/broofa/node-mime/issues/18) +- [**closed**] No license [#16](https://github.com/broofa/node-mime/issues/16) +- [**closed**] Some types missing that are used by html5/css3 [#13](https://github.com/broofa/node-mime/issues/13) +- [**closed**] npm install fails for 1.2.1 [#12](https://github.com/broofa/node-mime/issues/12) +- [**closed**] image/pjpeg + image/x-png [#10](https://github.com/broofa/node-mime/issues/10) +- [**closed**] symlink [#8](https://github.com/broofa/node-mime/issues/8) +- [**closed**] gzip [#2](https://github.com/broofa/node-mime/issues/2) +- [**closed**] ALL CAPS filenames return incorrect mime type [#1](https://github.com/broofa/node-mime/issues/1) diff --git a/node_modules/mime/LICENSE b/node_modules/mime/LICENSE new file mode 100644 index 000000000..d3f46f7e1 --- /dev/null +++ b/node_modules/mime/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2010 Benjamin Thomas, Robert Kieffer + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/mime/README.md b/node_modules/mime/README.md new file mode 100644 index 000000000..506fbe550 --- /dev/null +++ b/node_modules/mime/README.md @@ -0,0 +1,90 @@ +# mime + +Comprehensive MIME type mapping API based on mime-db module. + +## Install + +Install with [npm](http://github.com/isaacs/npm): + + npm install mime + +## Contributing / Testing + + npm run test + +## Command Line + + mime [path_string] + +E.g. + + > mime scripts/jquery.js + application/javascript + +## API - Queries + +### mime.lookup(path) +Get the mime type associated with a file, if no mime type is found `application/octet-stream` is returned. Performs a case-insensitive lookup using the extension in `path` (the substring after the last '/' or '.'). E.g. 
+ +```js +var mime = require('mime'); + +mime.lookup('/path/to/file.txt'); // => 'text/plain' +mime.lookup('file.txt'); // => 'text/plain' +mime.lookup('.TXT'); // => 'text/plain' +mime.lookup('htm'); // => 'text/html' +``` + +### mime.default_type +Sets the mime type returned when `mime.lookup` fails to find the extension searched for. (Default is `application/octet-stream`.) + +### mime.extension(type) +Get the default extension for `type` + +```js +mime.extension('text/html'); // => 'html' +mime.extension('application/octet-stream'); // => 'bin' +``` + +### mime.charsets.lookup() + +Map mime-type to charset + +```js +mime.charsets.lookup('text/plain'); // => 'UTF-8' +``` + +(The logic for charset lookups is pretty rudimentary. Feel free to suggest improvements.) + +## API - Defining Custom Types + +Custom type mappings can be added on a per-project basis via the following APIs. + +### mime.define() + +Add custom mime/extension mappings + +```js +mime.define({ + 'text/x-some-format': ['x-sf', 'x-sft', 'x-sfml'], + 'application/x-my-type': ['x-mt', 'x-mtt'], + // etc ... +}); + +mime.lookup('x-sft'); // => 'text/x-some-format' +``` + +The first entry in the extensions array is returned by `mime.extension()`. E.g. + +```js +mime.extension('text/x-some-format'); // => 'x-sf' +``` + +### mime.load(filepath) + +Load mappings from an Apache ".types" format file + +```js +mime.load('./my_project.types'); +``` +The .types file format is simple - See the `types` dir for examples. diff --git a/node_modules/mime/cli.js b/node_modules/mime/cli.js new file mode 100644 index 000000000..20b1ffeb2 --- /dev/null +++ b/node_modules/mime/cli.js @@ -0,0 +1,8 @@ +#!/usr/bin/env node + +var mime = require('./mime.js'); +var file = process.argv[2]; +var type = mime.lookup(file); + +process.stdout.write(type + '\n'); + diff --git a/node_modules/mime/mime.js b/node_modules/mime/mime.js new file mode 100644 index 000000000..d7efbde70 --- /dev/null +++ b/node_modules/mime/mime.js @@ -0,0 +1,108 @@ +var path = require('path'); +var fs = require('fs'); + +function Mime() { + // Map of extension -> mime type + this.types = Object.create(null); + + // Map of mime type -> extension + this.extensions = Object.create(null); +} + +/** + * Define mimetype -> extension mappings. Each key is a mime-type that maps + * to an array of extensions associated with the type. The first extension is + * used as the default extension for the type. + * + * e.g. mime.define({'audio/ogg', ['oga', 'ogg', 'spx']}); + * + * @param map (Object) type definitions + */ +Mime.prototype.define = function (map) { + for (var type in map) { + var exts = map[type]; + for (var i = 0; i < exts.length; i++) { + if (process.env.DEBUG_MIME && this.types[exts[i]]) { + console.warn((this._loading || "define()").replace(/.*\//, ''), 'changes "' + exts[i] + '" extension type from ' + + this.types[exts[i]] + ' to ' + type); + } + + this.types[exts[i]] = type; + } + + // Default extension is the first one we encounter + if (!this.extensions[type]) { + this.extensions[type] = exts[0]; + } + } +}; + +/** + * Load an Apache2-style ".types" file + * + * This may be called multiple times (it's expected). Where files declare + * overlapping types/extensions, the last file wins. + * + * @param file (String) path of file to load. 
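+ * + * Illustrative note (not in the upstream source): each line of such a file lists a + * MIME type followed by its extensions, e.g. "text/html  html htm shtml".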
+ */ +Mime.prototype.load = function(file) { + this._loading = file; + // Read file and split into lines + var map = {}, + content = fs.readFileSync(file, 'ascii'), + lines = content.split(/[\r\n]+/); + + lines.forEach(function(line) { + // Clean up whitespace/comments, and split into fields + var fields = line.replace(/\s*#.*|^\s*|\s*$/g, '').split(/\s+/); + map[fields.shift()] = fields; + }); + + this.define(map); + + this._loading = null; +}; + +/** + * Lookup a mime type based on extension + */ +Mime.prototype.lookup = function(path, fallback) { + var ext = path.replace(/^.*[\.\/\\]/, '').toLowerCase(); + + return this.types[ext] || fallback || this.default_type; +}; + +/** + * Return file extension associated with a mime type + */ +Mime.prototype.extension = function(mimeType) { + var type = mimeType.match(/^\s*([^;\s]*)(?:;|\s|$)/)[1].toLowerCase(); + return this.extensions[type]; +}; + +// Default instance +var mime = new Mime(); + +// Define built-in types +mime.define(require('./types.json')); + +// Default type +mime.default_type = mime.lookup('bin'); + +// +// Additional API specific to the default instance +// + +mime.Mime = Mime; + +/** + * Lookup a charset based on mime type. + */ +mime.charsets = { + lookup: function(mimeType, fallback) { + // Assume text types are utf8 + return (/^text\/|^application\/(javascript|json)/).test(mimeType) ? 'UTF-8' : fallback; + } +}; + +module.exports = mime; diff --git a/node_modules/mime/package.json b/node_modules/mime/package.json new file mode 100644 index 000000000..e361faf15 --- /dev/null +++ b/node_modules/mime/package.json @@ -0,0 +1,73 @@ +{ + "_from": "mime@1.6.0", + "_id": "mime@1.6.0", + "_inBundle": false, + "_integrity": "sha512-x0Vn8spI+wuJ1O6S7gnbaQg8Pxh4NNHb7KSINmEWKiPE4RKOplvijn+NkmYmmRgP68mc70j2EbeTFRsrswaQeg==", + "_location": "/mime", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "mime@1.6.0", + "name": "mime", + "escapedName": "mime", + "rawSpec": "1.6.0", + "saveSpec": null, + "fetchSpec": "1.6.0" + }, + "_requiredBy": [ + "/send" + ], + "_resolved": "https://registry.npmjs.org/mime/-/mime-1.6.0.tgz", + "_shasum": "32cd9e5c64553bd58d19a568af452acff04981b1", + "_spec": "mime@1.6.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\send", + "author": { + "name": "Robert Kieffer", + "email": "robert@broofa.com", + "url": "http://github.com/broofa" + }, + "bin": { + "mime": "cli.js" + }, + "bugs": { + "url": "https://github.com/broofa/node-mime/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Benjamin Thomas", + "email": "benjamin@benjaminthomas.org", + "url": "http://github.com/bentomas" + } + ], + "dependencies": {}, + "deprecated": false, + "description": "A comprehensive library for mime-type mapping", + "devDependencies": { + "github-release-notes": "0.13.1", + "mime-db": "1.31.0", + "mime-score": "1.1.0" + }, + "engines": { + "node": ">=4" + }, + "homepage": "https://github.com/broofa/node-mime#readme", + "keywords": [ + "util", + "mime" + ], + "license": "MIT", + "main": "mime.js", + "name": "mime", + "repository": { + "url": "git+https://github.com/broofa/node-mime.git", + "type": "git" + }, + "scripts": { + "changelog": "gren changelog --tags=all --generate --override", + "prepare": "node src/build.js", + "test": "node src/test.js" + }, + "version": "1.6.0" +} diff --git a/node_modules/mime/src/build.js b/node_modules/mime/src/build.js new file mode 100644 index 000000000..4928e48bc --- 
/dev/null +++ b/node_modules/mime/src/build.js @@ -0,0 +1,53 @@ +#!/usr/bin/env node + +'use strict'; + +const fs = require('fs'); +const path = require('path'); +const mimeScore = require('mime-score'); + +let db = require('mime-db'); +let chalk = require('chalk'); + +const STANDARD_FACET_SCORE = 900; + +const byExtension = {}; + +// Clear out any conflict extensions in mime-db +for (let type in db) { + let entry = db[type]; + entry.type = type; + + if (!entry.extensions) continue; + + entry.extensions.forEach(ext => { + if (ext in byExtension) { + const e0 = entry; + const e1 = byExtension[ext]; + e0.pri = mimeScore(e0.type, e0.source); + e1.pri = mimeScore(e1.type, e1.source); + + let drop = e0.pri < e1.pri ? e0 : e1; + let keep = e0.pri >= e1.pri ? e0 : e1; + drop.extensions = drop.extensions.filter(e => e !== ext); + + console.log(`${ext}: Keeping ${chalk.green(keep.type)} (${keep.pri}), dropping ${chalk.red(drop.type)} (${drop.pri})`); + } + byExtension[ext] = entry; + }); +} + +function writeTypesFile(types, path) { + fs.writeFileSync(path, JSON.stringify(types)); +} + +// Segregate into standard and non-standard types based on facet per +// https://tools.ietf.org/html/rfc6838#section-3.1 +const types = {}; + +Object.keys(db).sort().forEach(k => { + const entry = db[k]; + types[entry.type] = entry.extensions; +}); + +writeTypesFile(types, path.join(__dirname, '..', 'types.json')); diff --git a/node_modules/mime/src/test.js b/node_modules/mime/src/test.js new file mode 100644 index 000000000..42958a20d --- /dev/null +++ b/node_modules/mime/src/test.js @@ -0,0 +1,60 @@ +/** + * Usage: node test.js + */ + +var mime = require('../mime'); +var assert = require('assert'); +var path = require('path'); + +// +// Test mime lookups +// + +assert.equal('text/plain', mime.lookup('text.txt')); // normal file +assert.equal('text/plain', mime.lookup('TEXT.TXT')); // uppercase +assert.equal('text/plain', mime.lookup('dir/text.txt')); // dir + file +assert.equal('text/plain', mime.lookup('.text.txt')); // hidden file +assert.equal('text/plain', mime.lookup('.txt')); // nameless +assert.equal('text/plain', mime.lookup('txt')); // extension-only +assert.equal('text/plain', mime.lookup('/txt')); // extension-less () +assert.equal('text/plain', mime.lookup('\\txt')); // Windows, extension-less +assert.equal('application/octet-stream', mime.lookup('text.nope')); // unrecognized +assert.equal('fallback', mime.lookup('text.fallback', 'fallback')); // alternate default + +// +// Test extensions +// + +assert.equal('txt', mime.extension(mime.types.text)); +assert.equal('html', mime.extension(mime.types.htm)); +assert.equal('bin', mime.extension('application/octet-stream')); +assert.equal('bin', mime.extension('application/octet-stream ')); +assert.equal('html', mime.extension(' text/html; charset=UTF-8')); +assert.equal('html', mime.extension('text/html; charset=UTF-8 ')); +assert.equal('html', mime.extension('text/html; charset=UTF-8')); +assert.equal('html', mime.extension('text/html ; charset=UTF-8')); +assert.equal('html', mime.extension('text/html;charset=UTF-8')); +assert.equal('html', mime.extension('text/Html;charset=UTF-8')); +assert.equal(undefined, mime.extension('unrecognized')); + +// +// Test node.types lookups +// + +assert.equal('font/woff', mime.lookup('file.woff')); +assert.equal('application/octet-stream', mime.lookup('file.buffer')); +// TODO: Uncomment once #157 is resolved +// assert.equal('audio/mp4', mime.lookup('file.m4a')); +assert.equal('font/otf', mime.lookup('file.otf')); + +// 
+// Test charsets +// + +assert.equal('UTF-8', mime.charsets.lookup('text/plain')); +assert.equal('UTF-8', mime.charsets.lookup(mime.types.js)); +assert.equal('UTF-8', mime.charsets.lookup(mime.types.json)); +assert.equal(undefined, mime.charsets.lookup(mime.types.bin)); +assert.equal('fallback', mime.charsets.lookup('application/octet-stream', 'fallback')); + +console.log('\nAll tests passed'); diff --git a/node_modules/mime/types.json b/node_modules/mime/types.json new file mode 100644 index 000000000..bec78abd4 --- /dev/null +++ b/node_modules/mime/types.json @@ -0,0 +1 @@ +{"application/andrew-inset":["ez"],"application/applixware":["aw"],"application/atom+xml":["atom"],"application/atomcat+xml":["atomcat"],"application/atomsvc+xml":["atomsvc"],"application/bdoc":["bdoc"],"application/ccxml+xml":["ccxml"],"application/cdmi-capability":["cdmia"],"application/cdmi-container":["cdmic"],"application/cdmi-domain":["cdmid"],"application/cdmi-object":["cdmio"],"application/cdmi-queue":["cdmiq"],"application/cu-seeme":["cu"],"application/dash+xml":["mpd"],"application/davmount+xml":["davmount"],"application/docbook+xml":["dbk"],"application/dssc+der":["dssc"],"application/dssc+xml":["xdssc"],"application/ecmascript":["ecma"],"application/emma+xml":["emma"],"application/epub+zip":["epub"],"application/exi":["exi"],"application/font-tdpfr":["pfr"],"application/font-woff":[],"application/font-woff2":[],"application/geo+json":["geojson"],"application/gml+xml":["gml"],"application/gpx+xml":["gpx"],"application/gxf":["gxf"],"application/gzip":["gz"],"application/hyperstudio":["stk"],"application/inkml+xml":["ink","inkml"],"application/ipfix":["ipfix"],"application/java-archive":["jar","war","ear"],"application/java-serialized-object":["ser"],"application/java-vm":["class"],"application/javascript":["js","mjs"],"application/json":["json","map"],"application/json5":["json5"],"application/jsonml+json":["jsonml"],"application/ld+json":["jsonld"],"application/lost+xml":["lostxml"],"application/mac-binhex40":["hqx"],"application/mac-compactpro":["cpt"],"application/mads+xml":["mads"],"application/manifest+json":["webmanifest"],"application/marc":["mrc"],"application/marcxml+xml":["mrcx"],"application/mathematica":["ma","nb","mb"],"application/mathml+xml":["mathml"],"application/mbox":["mbox"],"application/mediaservercontrol+xml":["mscml"],"application/metalink+xml":["metalink"],"application/metalink4+xml":["meta4"],"application/mets+xml":["mets"],"application/mods+xml":["mods"],"application/mp21":["m21","mp21"],"application/mp4":["mp4s","m4p"],"application/msword":["doc","dot"],"application/mxf":["mxf"],"application/octet-stream":["bin","dms","lrf","mar","so","dist","distz","pkg","bpk","dump","elc","deploy","exe","dll","deb","dmg","iso","img","msi","msp","msm","buffer"],"application/oda":["oda"],"application/oebps-package+xml":["opf"],"application/ogg":["ogx"],"application/omdoc+xml":["omdoc"],"application/onenote":["onetoc","onetoc2","onetmp","onepkg"],"application/oxps":["oxps"],"application/patch-ops-error+xml":["xer"],"application/pdf":["pdf"],"application/pgp-encrypted":["pgp"],"application/pgp-signature":["asc","sig"],"application/pics-rules":["prf"],"application/pkcs10":["p10"],"application/pkcs7-mime":["p7m","p7c"],"application/pkcs7-signature":["p7s"],"application/pkcs8":["p8"],"application/pkix-attr-cert":["ac"],"application/pkix-cert":["cer"],"application/pkix-crl":["crl"],"application/pkix-pkipath":["pkipath"],"application/pkixcmp":["pki"],"application/pls+xml":["pls"],"application/postscript":
["ai","eps","ps"],"application/prs.cww":["cww"],"application/pskc+xml":["pskcxml"],"application/raml+yaml":["raml"],"application/rdf+xml":["rdf"],"application/reginfo+xml":["rif"],"application/relax-ng-compact-syntax":["rnc"],"application/resource-lists+xml":["rl"],"application/resource-lists-diff+xml":["rld"],"application/rls-services+xml":["rs"],"application/rpki-ghostbusters":["gbr"],"application/rpki-manifest":["mft"],"application/rpki-roa":["roa"],"application/rsd+xml":["rsd"],"application/rss+xml":["rss"],"application/rtf":["rtf"],"application/sbml+xml":["sbml"],"application/scvp-cv-request":["scq"],"application/scvp-cv-response":["scs"],"application/scvp-vp-request":["spq"],"application/scvp-vp-response":["spp"],"application/sdp":["sdp"],"application/set-payment-initiation":["setpay"],"application/set-registration-initiation":["setreg"],"application/shf+xml":["shf"],"application/smil+xml":["smi","smil"],"application/sparql-query":["rq"],"application/sparql-results+xml":["srx"],"application/srgs":["gram"],"application/srgs+xml":["grxml"],"application/sru+xml":["sru"],"application/ssdl+xml":["ssdl"],"application/ssml+xml":["ssml"],"application/tei+xml":["tei","teicorpus"],"application/thraud+xml":["tfi"],"application/timestamped-data":["tsd"],"application/vnd.3gpp.pic-bw-large":["plb"],"application/vnd.3gpp.pic-bw-small":["psb"],"application/vnd.3gpp.pic-bw-var":["pvb"],"application/vnd.3gpp2.tcap":["tcap"],"application/vnd.3m.post-it-notes":["pwn"],"application/vnd.accpac.simply.aso":["aso"],"application/vnd.accpac.simply.imp":["imp"],"application/vnd.acucobol":["acu"],"application/vnd.acucorp":["atc","acutc"],"application/vnd.adobe.air-application-installer-package+zip":["air"],"application/vnd.adobe.formscentral.fcdt":["fcdt"],"application/vnd.adobe.fxp":["fxp","fxpl"],"application/vnd.adobe.xdp+xml":["xdp"],"application/vnd.adobe.xfdf":["xfdf"],"application/vnd.ahead.space":["ahead"],"application/vnd.airzip.filesecure.azf":["azf"],"application/vnd.airzip.filesecure.azs":["azs"],"application/vnd.amazon.ebook":["azw"],"application/vnd.americandynamics.acc":["acc"],"application/vnd.amiga.ami":["ami"],"application/vnd.android.package-archive":["apk"],"application/vnd.anser-web-certificate-issue-initiation":["cii"],"application/vnd.anser-web-funds-transfer-initiation":["fti"],"application/vnd.antix.game-component":["atx"],"application/vnd.apple.installer+xml":["mpkg"],"application/vnd.apple.mpegurl":["m3u8"],"application/vnd.apple.pkpass":["pkpass"],"application/vnd.aristanetworks.swi":["swi"],"application/vnd.astraea-software.iota":["iota"],"application/vnd.audiograph":["aep"],"application/vnd.blueice.multipass":["mpm"],"application/vnd.bmi":["bmi"],"application/vnd.businessobjects":["rep"],"application/vnd.chemdraw+xml":["cdxml"],"application/vnd.chipnuts.karaoke-mmd":["mmd"],"application/vnd.cinderella":["cdy"],"application/vnd.claymore":["cla"],"application/vnd.cloanto.rp9":["rp9"],"application/vnd.clonk.c4group":["c4g","c4d","c4f","c4p","c4u"],"application/vnd.cluetrust.cartomobile-config":["c11amc"],"application/vnd.cluetrust.cartomobile-config-pkg":["c11amz"],"application/vnd.commonspace":["csp"],"application/vnd.contact.cmsg":["cdbcmsg"],"application/vnd.cosmocaller":["cmc"],"application/vnd.crick.clicker":["clkx"],"application/vnd.crick.clicker.keyboard":["clkk"],"application/vnd.crick.clicker.palette":["clkp"],"application/vnd.crick.clicker.template":["clkt"],"application/vnd.crick.clicker.wordbank":["clkw"],"application/vnd.criticaltools.wbs+xml":["wbs"],"application/vnd.ctc-
posml":["pml"],"application/vnd.cups-ppd":["ppd"],"application/vnd.curl.car":["car"],"application/vnd.curl.pcurl":["pcurl"],"application/vnd.dart":["dart"],"application/vnd.data-vision.rdz":["rdz"],"application/vnd.dece.data":["uvf","uvvf","uvd","uvvd"],"application/vnd.dece.ttml+xml":["uvt","uvvt"],"application/vnd.dece.unspecified":["uvx","uvvx"],"application/vnd.dece.zip":["uvz","uvvz"],"application/vnd.denovo.fcselayout-link":["fe_launch"],"application/vnd.dna":["dna"],"application/vnd.dolby.mlp":["mlp"],"application/vnd.dpgraph":["dpg"],"application/vnd.dreamfactory":["dfac"],"application/vnd.ds-keypoint":["kpxx"],"application/vnd.dvb.ait":["ait"],"application/vnd.dvb.service":["svc"],"application/vnd.dynageo":["geo"],"application/vnd.ecowin.chart":["mag"],"application/vnd.enliven":["nml"],"application/vnd.epson.esf":["esf"],"application/vnd.epson.msf":["msf"],"application/vnd.epson.quickanime":["qam"],"application/vnd.epson.salt":["slt"],"application/vnd.epson.ssf":["ssf"],"application/vnd.eszigno3+xml":["es3","et3"],"application/vnd.ezpix-album":["ez2"],"application/vnd.ezpix-package":["ez3"],"application/vnd.fdf":["fdf"],"application/vnd.fdsn.mseed":["mseed"],"application/vnd.fdsn.seed":["seed","dataless"],"application/vnd.flographit":["gph"],"application/vnd.fluxtime.clip":["ftc"],"application/vnd.framemaker":["fm","frame","maker","book"],"application/vnd.frogans.fnc":["fnc"],"application/vnd.frogans.ltf":["ltf"],"application/vnd.fsc.weblaunch":["fsc"],"application/vnd.fujitsu.oasys":["oas"],"application/vnd.fujitsu.oasys2":["oa2"],"application/vnd.fujitsu.oasys3":["oa3"],"application/vnd.fujitsu.oasysgp":["fg5"],"application/vnd.fujitsu.oasysprs":["bh2"],"application/vnd.fujixerox.ddd":["ddd"],"application/vnd.fujixerox.docuworks":["xdw"],"application/vnd.fujixerox.docuworks.binder":["xbd"],"application/vnd.fuzzysheet":["fzs"],"application/vnd.genomatix.tuxedo":["txd"],"application/vnd.geogebra.file":["ggb"],"application/vnd.geogebra.tool":["ggt"],"application/vnd.geometry-explorer":["gex","gre"],"application/vnd.geonext":["gxt"],"application/vnd.geoplan":["g2w"],"application/vnd.geospace":["g3w"],"application/vnd.gmx":["gmx"],"application/vnd.google-apps.document":["gdoc"],"application/vnd.google-apps.presentation":["gslides"],"application/vnd.google-apps.spreadsheet":["gsheet"],"application/vnd.google-earth.kml+xml":["kml"],"application/vnd.google-earth.kmz":["kmz"],"application/vnd.grafeq":["gqf","gqs"],"application/vnd.groove-account":["gac"],"application/vnd.groove-help":["ghf"],"application/vnd.groove-identity-message":["gim"],"application/vnd.groove-injector":["grv"],"application/vnd.groove-tool-message":["gtm"],"application/vnd.groove-tool-template":["tpl"],"application/vnd.groove-vcard":["vcg"],"application/vnd.hal+xml":["hal"],"application/vnd.handheld-entertainment+xml":["zmm"],"application/vnd.hbci":["hbci"],"application/vnd.hhe.lesson-player":["les"],"application/vnd.hp-hpgl":["hpgl"],"application/vnd.hp-hpid":["hpid"],"application/vnd.hp-hps":["hps"],"application/vnd.hp-jlyt":["jlt"],"application/vnd.hp-pcl":["pcl"],"application/vnd.hp-pclxl":["pclxl"],"application/vnd.hydrostatix.sof-data":["sfd-hdstx"],"application/vnd.ibm.minipay":["mpy"],"application/vnd.ibm.modcap":["afp","listafp","list3820"],"application/vnd.ibm.rights-management":["irm"],"application/vnd.ibm.secure-container":["sc"],"application/vnd.iccprofile":["icc","icm"],"application/vnd.igloader":["igl"],"application/vnd.immervision-ivp":["ivp"],"application/vnd.immervision-ivu":["ivu"],"application/vnd.
insors.igm":["igm"],"application/vnd.intercon.formnet":["xpw","xpx"],"application/vnd.intergeo":["i2g"],"application/vnd.intu.qbo":["qbo"],"application/vnd.intu.qfx":["qfx"],"application/vnd.ipunplugged.rcprofile":["rcprofile"],"application/vnd.irepository.package+xml":["irp"],"application/vnd.is-xpr":["xpr"],"application/vnd.isac.fcs":["fcs"],"application/vnd.jam":["jam"],"application/vnd.jcp.javame.midlet-rms":["rms"],"application/vnd.jisp":["jisp"],"application/vnd.joost.joda-archive":["joda"],"application/vnd.kahootz":["ktz","ktr"],"application/vnd.kde.karbon":["karbon"],"application/vnd.kde.kchart":["chrt"],"application/vnd.kde.kformula":["kfo"],"application/vnd.kde.kivio":["flw"],"application/vnd.kde.kontour":["kon"],"application/vnd.kde.kpresenter":["kpr","kpt"],"application/vnd.kde.kspread":["ksp"],"application/vnd.kde.kword":["kwd","kwt"],"application/vnd.kenameaapp":["htke"],"application/vnd.kidspiration":["kia"],"application/vnd.kinar":["kne","knp"],"application/vnd.koan":["skp","skd","skt","skm"],"application/vnd.kodak-descriptor":["sse"],"application/vnd.las.las+xml":["lasxml"],"application/vnd.llamagraphics.life-balance.desktop":["lbd"],"application/vnd.llamagraphics.life-balance.exchange+xml":["lbe"],"application/vnd.lotus-1-2-3":["123"],"application/vnd.lotus-approach":["apr"],"application/vnd.lotus-freelance":["pre"],"application/vnd.lotus-notes":["nsf"],"application/vnd.lotus-organizer":["org"],"application/vnd.lotus-screencam":["scm"],"application/vnd.lotus-wordpro":["lwp"],"application/vnd.macports.portpkg":["portpkg"],"application/vnd.mcd":["mcd"],"application/vnd.medcalcdata":["mc1"],"application/vnd.mediastation.cdkey":["cdkey"],"application/vnd.mfer":["mwf"],"application/vnd.mfmp":["mfm"],"application/vnd.micrografx.flo":["flo"],"application/vnd.micrografx.igx":["igx"],"application/vnd.mif":["mif"],"application/vnd.mobius.daf":["daf"],"application/vnd.mobius.dis":["dis"],"application/vnd.mobius.mbk":["mbk"],"application/vnd.mobius.mqy":["mqy"],"application/vnd.mobius.msl":["msl"],"application/vnd.mobius.plc":["plc"],"application/vnd.mobius.txf":["txf"],"application/vnd.mophun.application":["mpn"],"application/vnd.mophun.certificate":["mpc"],"application/vnd.mozilla.xul+xml":["xul"],"application/vnd.ms-artgalry":["cil"],"application/vnd.ms-cab-compressed":["cab"],"application/vnd.ms-excel":["xls","xlm","xla","xlc","xlt","xlw"],"application/vnd.ms-excel.addin.macroenabled.12":["xlam"],"application/vnd.ms-excel.sheet.binary.macroenabled.12":["xlsb"],"application/vnd.ms-excel.sheet.macroenabled.12":["xlsm"],"application/vnd.ms-excel.template.macroenabled.12":["xltm"],"application/vnd.ms-fontobject":["eot"],"application/vnd.ms-htmlhelp":["chm"],"application/vnd.ms-ims":["ims"],"application/vnd.ms-lrm":["lrm"],"application/vnd.ms-officetheme":["thmx"],"application/vnd.ms-outlook":["msg"],"application/vnd.ms-pki.seccat":["cat"],"application/vnd.ms-pki.stl":["stl"],"application/vnd.ms-powerpoint":["ppt","pps","pot"],"application/vnd.ms-powerpoint.addin.macroenabled.12":["ppam"],"application/vnd.ms-powerpoint.presentation.macroenabled.12":["pptm"],"application/vnd.ms-powerpoint.slide.macroenabled.12":["sldm"],"application/vnd.ms-powerpoint.slideshow.macroenabled.12":["ppsm"],"application/vnd.ms-powerpoint.template.macroenabled.12":["potm"],"application/vnd.ms-project":["mpp","mpt"],"application/vnd.ms-word.document.macroenabled.12":["docm"],"application/vnd.ms-word.template.macroenabled.12":["dotm"],"application/vnd.ms-works":["wps","wks","wcm","wdb"],"application/vnd.ms-wpl"
:["wpl"],"application/vnd.ms-xpsdocument":["xps"],"application/vnd.mseq":["mseq"],"application/vnd.musician":["mus"],"application/vnd.muvee.style":["msty"],"application/vnd.mynfc":["taglet"],"application/vnd.neurolanguage.nlu":["nlu"],"application/vnd.nitf":["ntf","nitf"],"application/vnd.noblenet-directory":["nnd"],"application/vnd.noblenet-sealer":["nns"],"application/vnd.noblenet-web":["nnw"],"application/vnd.nokia.n-gage.data":["ngdat"],"application/vnd.nokia.n-gage.symbian.install":["n-gage"],"application/vnd.nokia.radio-preset":["rpst"],"application/vnd.nokia.radio-presets":["rpss"],"application/vnd.novadigm.edm":["edm"],"application/vnd.novadigm.edx":["edx"],"application/vnd.novadigm.ext":["ext"],"application/vnd.oasis.opendocument.chart":["odc"],"application/vnd.oasis.opendocument.chart-template":["otc"],"application/vnd.oasis.opendocument.database":["odb"],"application/vnd.oasis.opendocument.formula":["odf"],"application/vnd.oasis.opendocument.formula-template":["odft"],"application/vnd.oasis.opendocument.graphics":["odg"],"application/vnd.oasis.opendocument.graphics-template":["otg"],"application/vnd.oasis.opendocument.image":["odi"],"application/vnd.oasis.opendocument.image-template":["oti"],"application/vnd.oasis.opendocument.presentation":["odp"],"application/vnd.oasis.opendocument.presentation-template":["otp"],"application/vnd.oasis.opendocument.spreadsheet":["ods"],"application/vnd.oasis.opendocument.spreadsheet-template":["ots"],"application/vnd.oasis.opendocument.text":["odt"],"application/vnd.oasis.opendocument.text-master":["odm"],"application/vnd.oasis.opendocument.text-template":["ott"],"application/vnd.oasis.opendocument.text-web":["oth"],"application/vnd.olpc-sugar":["xo"],"application/vnd.oma.dd2+xml":["dd2"],"application/vnd.openofficeorg.extension":["oxt"],"application/vnd.openxmlformats-officedocument.presentationml.presentation":["pptx"],"application/vnd.openxmlformats-officedocument.presentationml.slide":["sldx"],"application/vnd.openxmlformats-officedocument.presentationml.slideshow":["ppsx"],"application/vnd.openxmlformats-officedocument.presentationml.template":["potx"],"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet":["xlsx"],"application/vnd.openxmlformats-officedocument.spreadsheetml.template":["xltx"],"application/vnd.openxmlformats-officedocument.wordprocessingml.document":["docx"],"application/vnd.openxmlformats-officedocument.wordprocessingml.template":["dotx"],"application/vnd.osgeo.mapguide.package":["mgp"],"application/vnd.osgi.dp":["dp"],"application/vnd.osgi.subsystem":["esa"],"application/vnd.palm":["pdb","pqa","oprc"],"application/vnd.pawaafile":["paw"],"application/vnd.pg.format":["str"],"application/vnd.pg.osasli":["ei6"],"application/vnd.picsel":["efif"],"application/vnd.pmi.widget":["wg"],"application/vnd.pocketlearn":["plf"],"application/vnd.powerbuilder6":["pbd"],"application/vnd.previewsystems.box":["box"],"application/vnd.proteus.magazine":["mgz"],"application/vnd.publishare-delta-tree":["qps"],"application/vnd.pvi.ptid1":["ptid"],"application/vnd.quark.quarkxpress":["qxd","qxt","qwd","qwt","qxl","qxb"],"application/vnd.realvnc.bed":["bed"],"application/vnd.recordare.musicxml":["mxl"],"application/vnd.recordare.musicxml+xml":["musicxml"],"application/vnd.rig.cryptonote":["cryptonote"],"application/vnd.rim.cod":["cod"],"application/vnd.rn-realmedia":["rm"],"application/vnd.rn-realmedia-vbr":["rmvb"],"application/vnd.route66.link66+xml":["link66"],"application/vnd.sailingtracker.track":["st"],"application/vnd.seemail":[
"see"],"application/vnd.sema":["sema"],"application/vnd.semd":["semd"],"application/vnd.semf":["semf"],"application/vnd.shana.informed.formdata":["ifm"],"application/vnd.shana.informed.formtemplate":["itp"],"application/vnd.shana.informed.interchange":["iif"],"application/vnd.shana.informed.package":["ipk"],"application/vnd.simtech-mindmapper":["twd","twds"],"application/vnd.smaf":["mmf"],"application/vnd.smart.teacher":["teacher"],"application/vnd.solent.sdkm+xml":["sdkm","sdkd"],"application/vnd.spotfire.dxp":["dxp"],"application/vnd.spotfire.sfs":["sfs"],"application/vnd.stardivision.calc":["sdc"],"application/vnd.stardivision.draw":["sda"],"application/vnd.stardivision.impress":["sdd"],"application/vnd.stardivision.math":["smf"],"application/vnd.stardivision.writer":["sdw","vor"],"application/vnd.stardivision.writer-global":["sgl"],"application/vnd.stepmania.package":["smzip"],"application/vnd.stepmania.stepchart":["sm"],"application/vnd.sun.wadl+xml":["wadl"],"application/vnd.sun.xml.calc":["sxc"],"application/vnd.sun.xml.calc.template":["stc"],"application/vnd.sun.xml.draw":["sxd"],"application/vnd.sun.xml.draw.template":["std"],"application/vnd.sun.xml.impress":["sxi"],"application/vnd.sun.xml.impress.template":["sti"],"application/vnd.sun.xml.math":["sxm"],"application/vnd.sun.xml.writer":["sxw"],"application/vnd.sun.xml.writer.global":["sxg"],"application/vnd.sun.xml.writer.template":["stw"],"application/vnd.sus-calendar":["sus","susp"],"application/vnd.svd":["svd"],"application/vnd.symbian.install":["sis","sisx"],"application/vnd.syncml+xml":["xsm"],"application/vnd.syncml.dm+wbxml":["bdm"],"application/vnd.syncml.dm+xml":["xdm"],"application/vnd.tao.intent-module-archive":["tao"],"application/vnd.tcpdump.pcap":["pcap","cap","dmp"],"application/vnd.tmobile-livetv":["tmo"],"application/vnd.trid.tpt":["tpt"],"application/vnd.triscape.mxs":["mxs"],"application/vnd.trueapp":["tra"],"application/vnd.ufdl":["ufd","ufdl"],"application/vnd.uiq.theme":["utz"],"application/vnd.umajin":["umj"],"application/vnd.unity":["unityweb"],"application/vnd.uoml+xml":["uoml"],"application/vnd.vcx":["vcx"],"application/vnd.visio":["vsd","vst","vss","vsw"],"application/vnd.visionary":["vis"],"application/vnd.vsf":["vsf"],"application/vnd.wap.wbxml":["wbxml"],"application/vnd.wap.wmlc":["wmlc"],"application/vnd.wap.wmlscriptc":["wmlsc"],"application/vnd.webturbo":["wtb"],"application/vnd.wolfram.player":["nbp"],"application/vnd.wordperfect":["wpd"],"application/vnd.wqd":["wqd"],"application/vnd.wt.stf":["stf"],"application/vnd.xara":["xar"],"application/vnd.xfdl":["xfdl"],"application/vnd.yamaha.hv-dic":["hvd"],"application/vnd.yamaha.hv-script":["hvs"],"application/vnd.yamaha.hv-voice":["hvp"],"application/vnd.yamaha.openscoreformat":["osf"],"application/vnd.yamaha.openscoreformat.osfpvg+xml":["osfpvg"],"application/vnd.yamaha.smaf-audio":["saf"],"application/vnd.yamaha.smaf-phrase":["spf"],"application/vnd.yellowriver-custom-menu":["cmp"],"application/vnd.zul":["zir","zirz"],"application/vnd.zzazz.deck+xml":["zaz"],"application/voicexml+xml":["vxml"],"application/wasm":["wasm"],"application/widget":["wgt"],"application/winhlp":["hlp"],"application/wsdl+xml":["wsdl"],"application/wspolicy+xml":["wspolicy"],"application/x-7z-compressed":["7z"],"application/x-abiword":["abw"],"application/x-ace-compressed":["ace"],"application/x-apple-diskimage":[],"application/x-arj":["arj"],"application/x-authorware-bin":["aab","x32","u32","vox"],"application/x-authorware-map":["aam"],"application/x-authorware-seg":["aas
"],"application/x-bcpio":["bcpio"],"application/x-bdoc":[],"application/x-bittorrent":["torrent"],"application/x-blorb":["blb","blorb"],"application/x-bzip":["bz"],"application/x-bzip2":["bz2","boz"],"application/x-cbr":["cbr","cba","cbt","cbz","cb7"],"application/x-cdlink":["vcd"],"application/x-cfs-compressed":["cfs"],"application/x-chat":["chat"],"application/x-chess-pgn":["pgn"],"application/x-chrome-extension":["crx"],"application/x-cocoa":["cco"],"application/x-conference":["nsc"],"application/x-cpio":["cpio"],"application/x-csh":["csh"],"application/x-debian-package":["udeb"],"application/x-dgc-compressed":["dgc"],"application/x-director":["dir","dcr","dxr","cst","cct","cxt","w3d","fgd","swa"],"application/x-doom":["wad"],"application/x-dtbncx+xml":["ncx"],"application/x-dtbook+xml":["dtb"],"application/x-dtbresource+xml":["res"],"application/x-dvi":["dvi"],"application/x-envoy":["evy"],"application/x-eva":["eva"],"application/x-font-bdf":["bdf"],"application/x-font-ghostscript":["gsf"],"application/x-font-linux-psf":["psf"],"application/x-font-pcf":["pcf"],"application/x-font-snf":["snf"],"application/x-font-type1":["pfa","pfb","pfm","afm"],"application/x-freearc":["arc"],"application/x-futuresplash":["spl"],"application/x-gca-compressed":["gca"],"application/x-glulx":["ulx"],"application/x-gnumeric":["gnumeric"],"application/x-gramps-xml":["gramps"],"application/x-gtar":["gtar"],"application/x-hdf":["hdf"],"application/x-httpd-php":["php"],"application/x-install-instructions":["install"],"application/x-iso9660-image":[],"application/x-java-archive-diff":["jardiff"],"application/x-java-jnlp-file":["jnlp"],"application/x-latex":["latex"],"application/x-lua-bytecode":["luac"],"application/x-lzh-compressed":["lzh","lha"],"application/x-makeself":["run"],"application/x-mie":["mie"],"application/x-mobipocket-ebook":["prc","mobi"],"application/x-ms-application":["application"],"application/x-ms-shortcut":["lnk"],"application/x-ms-wmd":["wmd"],"application/x-ms-wmz":["wmz"],"application/x-ms-xbap":["xbap"],"application/x-msaccess":["mdb"],"application/x-msbinder":["obd"],"application/x-mscardfile":["crd"],"application/x-msclip":["clp"],"application/x-msdos-program":[],"application/x-msdownload":["com","bat"],"application/x-msmediaview":["mvb","m13","m14"],"application/x-msmetafile":["wmf","emf","emz"],"application/x-msmoney":["mny"],"application/x-mspublisher":["pub"],"application/x-msschedule":["scd"],"application/x-msterminal":["trm"],"application/x-mswrite":["wri"],"application/x-netcdf":["nc","cdf"],"application/x-ns-proxy-autoconfig":["pac"],"application/x-nzb":["nzb"],"application/x-perl":["pl","pm"],"application/x-pilot":[],"application/x-pkcs12":["p12","pfx"],"application/x-pkcs7-certificates":["p7b","spc"],"application/x-pkcs7-certreqresp":["p7r"],"application/x-rar-compressed":["rar"],"application/x-redhat-package-manager":["rpm"],"application/x-research-info-systems":["ris"],"application/x-sea":["sea"],"application/x-sh":["sh"],"application/x-shar":["shar"],"application/x-shockwave-flash":["swf"],"application/x-silverlight-app":["xap"],"application/x-sql":["sql"],"application/x-stuffit":["sit"],"application/x-stuffitx":["sitx"],"application/x-subrip":["srt"],"application/x-sv4cpio":["sv4cpio"],"application/x-sv4crc":["sv4crc"],"application/x-t3vm-image":["t3"],"application/x-tads":["gam"],"application/x-tar":["tar"],"application/x-tcl":["tcl","tk"],"application/x-tex":["tex"],"application/x-tex-tfm":["tfm"],"application/x-texinfo":["texinfo","texi"],"application/x-tgif":["obj"],
"application/x-ustar":["ustar"],"application/x-virtualbox-hdd":["hdd"],"application/x-virtualbox-ova":["ova"],"application/x-virtualbox-ovf":["ovf"],"application/x-virtualbox-vbox":["vbox"],"application/x-virtualbox-vbox-extpack":["vbox-extpack"],"application/x-virtualbox-vdi":["vdi"],"application/x-virtualbox-vhd":["vhd"],"application/x-virtualbox-vmdk":["vmdk"],"application/x-wais-source":["src"],"application/x-web-app-manifest+json":["webapp"],"application/x-x509-ca-cert":["der","crt","pem"],"application/x-xfig":["fig"],"application/x-xliff+xml":["xlf"],"application/x-xpinstall":["xpi"],"application/x-xz":["xz"],"application/x-zmachine":["z1","z2","z3","z4","z5","z6","z7","z8"],"application/xaml+xml":["xaml"],"application/xcap-diff+xml":["xdf"],"application/xenc+xml":["xenc"],"application/xhtml+xml":["xhtml","xht"],"application/xml":["xml","xsl","xsd","rng"],"application/xml-dtd":["dtd"],"application/xop+xml":["xop"],"application/xproc+xml":["xpl"],"application/xslt+xml":["xslt"],"application/xspf+xml":["xspf"],"application/xv+xml":["mxml","xhvml","xvml","xvm"],"application/yang":["yang"],"application/yin+xml":["yin"],"application/zip":["zip"],"audio/3gpp":[],"audio/adpcm":["adp"],"audio/basic":["au","snd"],"audio/midi":["mid","midi","kar","rmi"],"audio/mp3":[],"audio/mp4":["m4a","mp4a"],"audio/mpeg":["mpga","mp2","mp2a","mp3","m2a","m3a"],"audio/ogg":["oga","ogg","spx"],"audio/s3m":["s3m"],"audio/silk":["sil"],"audio/vnd.dece.audio":["uva","uvva"],"audio/vnd.digital-winds":["eol"],"audio/vnd.dra":["dra"],"audio/vnd.dts":["dts"],"audio/vnd.dts.hd":["dtshd"],"audio/vnd.lucent.voice":["lvp"],"audio/vnd.ms-playready.media.pya":["pya"],"audio/vnd.nuera.ecelp4800":["ecelp4800"],"audio/vnd.nuera.ecelp7470":["ecelp7470"],"audio/vnd.nuera.ecelp9600":["ecelp9600"],"audio/vnd.rip":["rip"],"audio/wav":["wav"],"audio/wave":[],"audio/webm":["weba"],"audio/x-aac":["aac"],"audio/x-aiff":["aif","aiff","aifc"],"audio/x-caf":["caf"],"audio/x-flac":["flac"],"audio/x-m4a":[],"audio/x-matroska":["mka"],"audio/x-mpegurl":["m3u"],"audio/x-ms-wax":["wax"],"audio/x-ms-wma":["wma"],"audio/x-pn-realaudio":["ram","ra"],"audio/x-pn-realaudio-plugin":["rmp"],"audio/x-realaudio":[],"audio/x-wav":[],"audio/xm":["xm"],"chemical/x-cdx":["cdx"],"chemical/x-cif":["cif"],"chemical/x-cmdf":["cmdf"],"chemical/x-cml":["cml"],"chemical/x-csml":["csml"],"chemical/x-xyz":["xyz"],"font/collection":["ttc"],"font/otf":["otf"],"font/ttf":["ttf"],"font/woff":["woff"],"font/woff2":["woff2"],"image/apng":["apng"],"image/bmp":["bmp"],"image/cgm":["cgm"],"image/g3fax":["g3"],"image/gif":["gif"],"image/ief":["ief"],"image/jp2":["jp2","jpg2"],"image/jpeg":["jpeg","jpg","jpe"],"image/jpm":["jpm"],"image/jpx":["jpx","jpf"],"image/ktx":["ktx"],"image/png":["png"],"image/prs.btif":["btif"],"image/sgi":["sgi"],"image/svg+xml":["svg","svgz"],"image/tiff":["tiff","tif"],"image/vnd.adobe.photoshop":["psd"],"image/vnd.dece.graphic":["uvi","uvvi","uvg","uvvg"],"image/vnd.djvu":["djvu","djv"],"image/vnd.dvb.subtitle":[],"image/vnd.dwg":["dwg"],"image/vnd.dxf":["dxf"],"image/vnd.fastbidsheet":["fbs"],"image/vnd.fpx":["fpx"],"image/vnd.fst":["fst"],"image/vnd.fujixerox.edmics-mmr":["mmr"],"image/vnd.fujixerox.edmics-rlc":["rlc"],"image/vnd.ms-modi":["mdi"],"image/vnd.ms-photo":["wdp"],"image/vnd.net-fpx":["npx"],"image/vnd.wap.wbmp":["wbmp"],"image/vnd.xiff":["xif"],"image/webp":["webp"],"image/x-3ds":["3ds"],"image/x-cmu-raster":["ras"],"image/x-cmx":["cmx"],"image/x-freehand":["fh","fhc","fh4","fh5","fh7"],"image/x-icon":["ico"],"image/x-jng":["jng"],
"image/x-mrsid-image":["sid"],"image/x-ms-bmp":[],"image/x-pcx":["pcx"],"image/x-pict":["pic","pct"],"image/x-portable-anymap":["pnm"],"image/x-portable-bitmap":["pbm"],"image/x-portable-graymap":["pgm"],"image/x-portable-pixmap":["ppm"],"image/x-rgb":["rgb"],"image/x-tga":["tga"],"image/x-xbitmap":["xbm"],"image/x-xpixmap":["xpm"],"image/x-xwindowdump":["xwd"],"message/rfc822":["eml","mime"],"model/gltf+json":["gltf"],"model/gltf-binary":["glb"],"model/iges":["igs","iges"],"model/mesh":["msh","mesh","silo"],"model/vnd.collada+xml":["dae"],"model/vnd.dwf":["dwf"],"model/vnd.gdl":["gdl"],"model/vnd.gtw":["gtw"],"model/vnd.mts":["mts"],"model/vnd.vtu":["vtu"],"model/vrml":["wrl","vrml"],"model/x3d+binary":["x3db","x3dbz"],"model/x3d+vrml":["x3dv","x3dvz"],"model/x3d+xml":["x3d","x3dz"],"text/cache-manifest":["appcache","manifest"],"text/calendar":["ics","ifb"],"text/coffeescript":["coffee","litcoffee"],"text/css":["css"],"text/csv":["csv"],"text/hjson":["hjson"],"text/html":["html","htm","shtml"],"text/jade":["jade"],"text/jsx":["jsx"],"text/less":["less"],"text/markdown":["markdown","md"],"text/mathml":["mml"],"text/n3":["n3"],"text/plain":["txt","text","conf","def","list","log","in","ini"],"text/prs.lines.tag":["dsc"],"text/richtext":["rtx"],"text/rtf":[],"text/sgml":["sgml","sgm"],"text/slim":["slim","slm"],"text/stylus":["stylus","styl"],"text/tab-separated-values":["tsv"],"text/troff":["t","tr","roff","man","me","ms"],"text/turtle":["ttl"],"text/uri-list":["uri","uris","urls"],"text/vcard":["vcard"],"text/vnd.curl":["curl"],"text/vnd.curl.dcurl":["dcurl"],"text/vnd.curl.mcurl":["mcurl"],"text/vnd.curl.scurl":["scurl"],"text/vnd.dvb.subtitle":["sub"],"text/vnd.fly":["fly"],"text/vnd.fmi.flexstor":["flx"],"text/vnd.graphviz":["gv"],"text/vnd.in3d.3dml":["3dml"],"text/vnd.in3d.spot":["spot"],"text/vnd.sun.j2me.app-descriptor":["jad"],"text/vnd.wap.wml":["wml"],"text/vnd.wap.wmlscript":["wmls"],"text/vtt":["vtt"],"text/x-asm":["s","asm"],"text/x-c":["c","cc","cxx","cpp","h","hh","dic"],"text/x-component":["htc"],"text/x-fortran":["f","for","f77","f90"],"text/x-handlebars-template":["hbs"],"text/x-java-source":["java"],"text/x-lua":["lua"],"text/x-markdown":["mkd"],"text/x-nfo":["nfo"],"text/x-opml":["opml"],"text/x-org":[],"text/x-pascal":["p","pas"],"text/x-processing":["pde"],"text/x-sass":["sass"],"text/x-scss":["scss"],"text/x-setext":["etx"],"text/x-sfv":["sfv"],"text/x-suse-ymp":["ymp"],"text/x-uuencode":["uu"],"text/x-vcalendar":["vcs"],"text/x-vcard":["vcf"],"text/xml":[],"text/yaml":["yaml","yml"],"video/3gpp":["3gp","3gpp"],"video/3gpp2":["3g2"],"video/h261":["h261"],"video/h263":["h263"],"video/h264":["h264"],"video/jpeg":["jpgv"],"video/jpm":["jpgm"],"video/mj2":["mj2","mjp2"],"video/mp2t":["ts"],"video/mp4":["mp4","mp4v","mpg4"],"video/mpeg":["mpeg","mpg","mpe","m1v","m2v"],"video/ogg":["ogv"],"video/quicktime":["qt","mov"],"video/vnd.dece.hd":["uvh","uvvh"],"video/vnd.dece.mobile":["uvm","uvvm"],"video/vnd.dece.pd":["uvp","uvvp"],"video/vnd.dece.sd":["uvs","uvvs"],"video/vnd.dece.video":["uvv","uvvv"],"video/vnd.dvb.file":["dvb"],"video/vnd.fvt":["fvt"],"video/vnd.mpegurl":["mxu","m4u"],"video/vnd.ms-playready.media.pyv":["pyv"],"video/vnd.uvvu.mp4":["uvu","uvvu"],"video/vnd.vivo":["viv"],"video/webm":["webm"],"video/x-f4v":["f4v"],"video/x-fli":["fli"],"video/x-flv":["flv"],"video/x-m4v":["m4v"],"video/x-matroska":["mkv","mk3d","mks"],"video/x-mng":["mng"],"video/x-ms-asf":["asf","asx"],"video/x-ms-vob":["vob"],"video/x-ms-wm":["wm"],"video/x-ms-wmv":["wmv"],"video/x-ms-wmx":[
"wmx"],"video/x-ms-wvx":["wvx"],"video/x-msvideo":["avi"],"video/x-sgi-movie":["movie"],"video/x-smv":["smv"],"x-conference/x-cooltalk":["ice"]} \ No newline at end of file diff --git a/node_modules/mimic-response/index.js b/node_modules/mimic-response/index.js new file mode 100644 index 000000000..d5e33be47 --- /dev/null +++ b/node_modules/mimic-response/index.js @@ -0,0 +1,32 @@ +'use strict'; + +// We define these manually to ensure they're always copied +// even if they would move up the prototype chain +// https://nodejs.org/api/http.html#http_class_http_incomingmessage +const knownProps = [ + 'destroy', + 'setTimeout', + 'socket', + 'headers', + 'trailers', + 'rawHeaders', + 'statusCode', + 'httpVersion', + 'httpVersionMinor', + 'httpVersionMajor', + 'rawTrailers', + 'statusMessage' +]; + +module.exports = (fromStream, toStream) => { + const fromProps = new Set(Object.keys(fromStream).concat(knownProps)); + + for (const prop of fromProps) { + // Don't overwrite existing properties + if (prop in toStream) { + continue; + } + + toStream[prop] = typeof fromStream[prop] === 'function' ? fromStream[prop].bind(fromStream) : fromStream[prop]; + } +}; diff --git a/node_modules/mimic-response/license b/node_modules/mimic-response/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/mimic-response/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/mimic-response/package.json b/node_modules/mimic-response/package.json new file mode 100644 index 000000000..af468a9d3 --- /dev/null +++ b/node_modules/mimic-response/package.json @@ -0,0 +1,71 @@ +{ + "_from": "mimic-response@^1.0.1", + "_id": "mimic-response@1.0.1", + "_inBundle": false, + "_integrity": "sha512-j5EctnkH7amfV/q5Hgmoal1g2QHFJRraOtmx0JpIqkxhBhI/lJSl1nMpQ45hVarwNETOoWEimndZ4QK0RHxuxQ==", + "_location": "/mimic-response", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "mimic-response@^1.0.1", + "name": "mimic-response", + "escapedName": "mimic-response", + "rawSpec": "^1.0.1", + "saveSpec": null, + "fetchSpec": "^1.0.1" + }, + "_requiredBy": [ + "/clone-response", + "/decompress-response", + "/got" + ], + "_resolved": "https://registry.npmjs.org/mimic-response/-/mimic-response-1.0.1.tgz", + "_shasum": "4923538878eef42063cb8a3e3b0798781487ab1b", + "_spec": "mimic-response@^1.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/mimic-response/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Mimic a Node.js HTTP response stream", + "devDependencies": { + "ava": "*", + "create-test-server": "^0.1.0", + "pify": "^3.0.0", + "xo": "*" + }, + "engines": { + "node": ">=4" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/mimic-response#readme", + "keywords": [ + "mimic", + "response", + "stream", + "http", + "https", + "request", + "get", + "core" + ], + "license": "MIT", + "name": "mimic-response", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/mimic-response.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "1.0.1" +} diff --git a/node_modules/mimic-response/readme.md b/node_modules/mimic-response/readme.md new file mode 100644 index 000000000..e07ec661c --- /dev/null +++ b/node_modules/mimic-response/readme.md @@ -0,0 +1,54 @@ +# mimic-response [![Build Status](https://travis-ci.org/sindresorhus/mimic-response.svg?branch=master)](https://travis-ci.org/sindresorhus/mimic-response) + +> Mimic a [Node.js HTTP response stream](https://nodejs.org/api/http.html#http_class_http_incomingmessage) + + +## Install + +``` +$ npm install mimic-response +``` + + +## Usage + +```js +const stream = require('stream'); +const mimicResponse = require('mimic-response'); + +const responseStream = getHttpResponseStream(); +const myStream = new stream.PassThrough(); + +mimicResponse(responseStream, myStream); + +console.log(myStream.statusCode); +//=> 200 +``` + + +## API + +### mimicResponse(from, to) + +#### from + +Type: `Stream` + +[Node.js HTTP response stream.](https://nodejs.org/api/http.html#http_class_http_incomingmessage) + +#### to + +Type: `Stream` + +Any stream. + + +## Related + +- [mimic-fn](https://github.com/sindresorhus/mimic-fn) - Make a function mimic another one +- [clone-response](https://github.com/lukechilds/clone-response) - Clone a Node.js response stream + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/minimatch/LICENSE b/node_modules/minimatch/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/minimatch/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. 
Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/minimatch/README.md b/node_modules/minimatch/README.md new file mode 100644 index 000000000..ad72b8133 --- /dev/null +++ b/node_modules/minimatch/README.md @@ -0,0 +1,209 @@ +# minimatch + +A minimal matching utility. + +[![Build Status](https://secure.travis-ci.org/isaacs/minimatch.svg)](http://travis-ci.org/isaacs/minimatch) + + +This is the matching library used internally by npm. + +It works by converting glob expressions into JavaScript `RegExp` +objects. + +## Usage + +```javascript +var minimatch = require("minimatch") + +minimatch("bar.foo", "*.foo") // true! +minimatch("bar.foo", "*.bar") // false! +minimatch("bar.foo", "*.+(bar|foo)", { debug: true }) // true, and noisy! +``` + +## Features + +Supports these glob features: + +* Brace Expansion +* Extended glob matching +* "Globstar" `**` matching + +See: + +* `man sh` +* `man bash` +* `man 3 fnmatch` +* `man 5 gitignore` + +## Minimatch Class + +Create a minimatch object by instantiating the `minimatch.Minimatch` class. + +```javascript +var Minimatch = require("minimatch").Minimatch +var mm = new Minimatch(pattern, options) +``` + +### Properties + +* `pattern` The original pattern the minimatch object represents. +* `options` The options supplied to the constructor. +* `set` A 2-dimensional array of regexp or string expressions. + Each row in the + array corresponds to a brace-expanded pattern. Each item in the row + corresponds to a single path-part. For example, the pattern + `{a,b/c}/d` would expand to a set of patterns like: + + [ [ a, d ] + , [ b, c, d ] ] + + If a portion of the pattern doesn't have any "magic" in it + (that is, it's something like `"foo"` rather than `fo*o?`), then it + will be left as a string rather than converted to a regular + expression. + +* `regexp` Created by the `makeRe` method. A single regular expression + expressing the entire pattern. This is useful in cases where you wish + to use the pattern somewhat like `fnmatch(3)` with `FNM_PATH` enabled. +* `negate` True if the pattern is negated. +* `comment` True if the pattern is a comment. +* `empty` True if the pattern is `""`. + +### Methods + +* `makeRe` Generate the `regexp` member if necessary, and return it. + Will return `false` if the pattern is invalid. +* `match(fname)` Return true if the filename matches the pattern, or + false otherwise. +* `matchOne(fileArray, patternArray, partial)` Take a `/`-split + filename, and match it against a single row in the `regExpSet`. This + method is mainly for internal use, but is exposed so that it can be + used by a glob-walker that needs to avoid excessive filesystem calls. + +All other methods are internal, and will be called as necessary. + +### minimatch(path, pattern, options) + +Main export. Tests a path against the pattern using the options. 
+ +```javascript +var isJS = minimatch(file, "*.js", { matchBase: true }) +``` + +### minimatch.filter(pattern, options) + +Returns a function that tests its +supplied argument, suitable for use with `Array.filter`. Example: + +```javascript +var javascripts = fileList.filter(minimatch.filter("*.js", {matchBase: true})) +``` + +### minimatch.match(list, pattern, options) + +Match against the list of +files, in the style of fnmatch or glob. If nothing is matched, and +options.nonull is set, then return a list containing the pattern itself. + +```javascript +var javascripts = minimatch.match(fileList, "*.js", {matchBase: true})) +``` + +### minimatch.makeRe(pattern, options) + +Make a regular expression object from the pattern. + +## Options + +All options are `false` by default. + +### debug + +Dump a ton of stuff to stderr. + +### nobrace + +Do not expand `{a,b}` and `{1..3}` brace sets. + +### noglobstar + +Disable `**` matching against multiple folder names. + +### dot + +Allow patterns to match filenames starting with a period, even if +the pattern does not explicitly have a period in that spot. + +Note that by default, `a/**/b` will **not** match `a/.d/b`, unless `dot` +is set. + +### noext + +Disable "extglob" style patterns like `+(a|b)`. + +### nocase + +Perform a case-insensitive match. + +### nonull + +When a match is not found by `minimatch.match`, return a list containing +the pattern itself if this option is set. When not set, an empty list +is returned if there are no matches. + +### matchBase + +If set, then patterns without slashes will be matched +against the basename of the path if it contains slashes. For example, +`a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. + +### nocomment + +Suppress the behavior of treating `#` at the start of a pattern as a +comment. + +### nonegate + +Suppress the behavior of treating a leading `!` character as negation. + +### flipNegate + +Returns from negate expressions the same as if they were not negated. +(Ie, true on a hit, false on a miss.) + + +## Comparisons to other fnmatch/glob implementations + +While strict compliance with the existing standards is a worthwhile +goal, some discrepancies exist between minimatch and other +implementations, and are intentional. + +If the pattern starts with a `!` character, then it is negated. Set the +`nonegate` flag to suppress this behavior, and treat leading `!` +characters normally. This is perhaps relevant if you wish to start the +pattern with a negative extglob pattern like `!(a|B)`. Multiple `!` +characters at the start of a pattern will negate the pattern multiple +times. + +If a pattern starts with `#`, then it is treated as a comment, and +will not match anything. Use `\#` to match a literal `#` at the +start of a line, or set the `nocomment` flag to suppress this behavior. + +The double-star character `**` is supported by default, unless the +`noglobstar` flag is set. This is supported in the manner of bsdglob +and bash 4.1, where `**` only has special significance if it is the only +thing in a path part. That is, `a/**/b` will match `a/x/y/b`, but +`a/**b` will not. + +If an escaped pattern has no matches, and the `nonull` flag is set, +then minimatch.match returns the pattern as-provided, rather than +interpreting the character escapes. For example, +`minimatch.match([], "\\*a\\?")` will return `"\\*a\\?"` rather than +`"*a?"`. This is akin to setting the `nullglob` option in bash, except +that it does not resolve escaped pattern characters. 
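
To see how the options above combine, here is a minimal sketch assuming a hypothetical `fileList` array; it uses only the calls documented in this readme (`minimatch.match`, `minimatch.filter`, `minimatch.makeRe`), so treat the exact results as illustrative rather than authoritative.

```javascript
var minimatch = require("minimatch")

// Hypothetical file list, purely for illustration.
var fileList = [".env", "server.js", "views/index.ejs", "public/style.css"]

// nonull: when nothing matches, the pattern itself comes back instead of []
minimatch.match(fileList, "*.ts", { nonull: true })
//=> ["*.ts"]

// dot: lets the pattern match names that start with a period
minimatch.match(fileList, "*", { dot: true })
//=> [".env", "server.js"]  (paths containing "/" still need a "/" or "**" in the pattern)

// matchBase: a slash-less pattern is tested against the basename
fileList.filter(minimatch.filter("*.ejs", { matchBase: true }))
//=> ["views/index.ejs"]

// makeRe: inspect the RegExp a pattern compiles to (nocase adds the "i" flag)
minimatch.makeRe("*.js", { nocase: true })
```
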
+ +If brace expansion is not disabled, then it is performed before any +other interpretation of the glob pattern. Thus, a pattern like +`+(a|{b),c)}`, which would not be valid in bash or zsh, is expanded +**first** into the set of `+(a|b)` and `+(a|c)`, and those patterns are +checked for validity. Since those two are valid, matching proceeds. diff --git a/node_modules/minimatch/minimatch.js b/node_modules/minimatch/minimatch.js new file mode 100644 index 000000000..5b5f8cf44 --- /dev/null +++ b/node_modules/minimatch/minimatch.js @@ -0,0 +1,923 @@ +module.exports = minimatch +minimatch.Minimatch = Minimatch + +var path = { sep: '/' } +try { + path = require('path') +} catch (er) {} + +var GLOBSTAR = minimatch.GLOBSTAR = Minimatch.GLOBSTAR = {} +var expand = require('brace-expansion') + +var plTypes = { + '!': { open: '(?:(?!(?:', close: '))[^/]*?)'}, + '?': { open: '(?:', close: ')?' }, + '+': { open: '(?:', close: ')+' }, + '*': { open: '(?:', close: ')*' }, + '@': { open: '(?:', close: ')' } +} + +// any single thing other than / +// don't need to escape / when using new RegExp() +var qmark = '[^/]' + +// * => any number of characters +var star = qmark + '*?' + +// ** when dots are allowed. Anything goes, except .. and . +// not (^ or / followed by one or two dots followed by $ or /), +// followed by anything, any number of times. +var twoStarDot = '(?:(?!(?:\\\/|^)(?:\\.{1,2})($|\\\/)).)*?' + +// not a ^ or / followed by a dot, +// followed by anything, any number of times. +var twoStarNoDot = '(?:(?!(?:\\\/|^)\\.).)*?' + +// characters that need to be escaped in RegExp. +var reSpecials = charSet('().*{}+?[]^$\\!') + +// "abc" -> { a:true, b:true, c:true } +function charSet (s) { + return s.split('').reduce(function (set, c) { + set[c] = true + return set + }, {}) +} + +// normalizes slashes. +var slashSplit = /\/+/ + +minimatch.filter = filter +function filter (pattern, options) { + options = options || {} + return function (p, i, list) { + return minimatch(p, pattern, options) + } +} + +function ext (a, b) { + a = a || {} + b = b || {} + var t = {} + Object.keys(b).forEach(function (k) { + t[k] = b[k] + }) + Object.keys(a).forEach(function (k) { + t[k] = a[k] + }) + return t +} + +minimatch.defaults = function (def) { + if (!def || !Object.keys(def).length) return minimatch + + var orig = minimatch + + var m = function minimatch (p, pattern, options) { + return orig.minimatch(p, pattern, ext(def, options)) + } + + m.Minimatch = function Minimatch (pattern, options) { + return new orig.Minimatch(pattern, ext(def, options)) + } + + return m +} + +Minimatch.defaults = function (def) { + if (!def || !Object.keys(def).length) return Minimatch + return minimatch.defaults(def).Minimatch +} + +function minimatch (p, pattern, options) { + if (typeof pattern !== 'string') { + throw new TypeError('glob pattern string required') + } + + if (!options) options = {} + + // shortcut: comments match nothing. 
+ if (!options.nocomment && pattern.charAt(0) === '#') { + return false + } + + // "" only matches "" + if (pattern.trim() === '') return p === '' + + return new Minimatch(pattern, options).match(p) +} + +function Minimatch (pattern, options) { + if (!(this instanceof Minimatch)) { + return new Minimatch(pattern, options) + } + + if (typeof pattern !== 'string') { + throw new TypeError('glob pattern string required') + } + + if (!options) options = {} + pattern = pattern.trim() + + // windows support: need to use /, not \ + if (path.sep !== '/') { + pattern = pattern.split(path.sep).join('/') + } + + this.options = options + this.set = [] + this.pattern = pattern + this.regexp = null + this.negate = false + this.comment = false + this.empty = false + + // make the set of regexps etc. + this.make() +} + +Minimatch.prototype.debug = function () {} + +Minimatch.prototype.make = make +function make () { + // don't do it more than once. + if (this._made) return + + var pattern = this.pattern + var options = this.options + + // empty patterns and comments match nothing. + if (!options.nocomment && pattern.charAt(0) === '#') { + this.comment = true + return + } + if (!pattern) { + this.empty = true + return + } + + // step 1: figure out negation, etc. + this.parseNegate() + + // step 2: expand braces + var set = this.globSet = this.braceExpand() + + if (options.debug) this.debug = console.error + + this.debug(this.pattern, set) + + // step 3: now we have a set, so turn each one into a series of path-portion + // matching patterns. + // These will be regexps, except in the case of "**", which is + // set to the GLOBSTAR object for globstar behavior, + // and will not contain any / characters + set = this.globParts = set.map(function (s) { + return s.split(slashSplit) + }) + + this.debug(this.pattern, set) + + // glob --> regexps + set = set.map(function (s, si, set) { + return s.map(this.parse, this) + }, this) + + this.debug(this.pattern, set) + + // filter out everything that didn't compile properly. + set = set.filter(function (s) { + return s.indexOf(false) === -1 + }) + + this.debug(this.pattern, set) + + this.set = set +} + +Minimatch.prototype.parseNegate = parseNegate +function parseNegate () { + var pattern = this.pattern + var negate = false + var options = this.options + var negateOffset = 0 + + if (options.nonegate) return + + for (var i = 0, l = pattern.length + ; i < l && pattern.charAt(i) === '!' + ; i++) { + negate = !negate + negateOffset++ + } + + if (negateOffset) this.pattern = pattern.substr(negateOffset) + this.negate = negate +} + +// Brace expansion: +// a{b,c}d -> abd acd +// a{b,}c -> abc ac +// a{0..3}d -> a0d a1d a2d a3d +// a{b,c{d,e}f}g -> abg acdfg acefg +// a{b,c}d{e,f}g -> abdeg acdeg abdeg abdfg +// +// Invalid sets are not expanded. +// a{2..}b -> a{2..}b +// a{b}c -> a{b}c +minimatch.braceExpand = function (pattern, options) { + return braceExpand(pattern, options) +} + +Minimatch.prototype.braceExpand = braceExpand + +function braceExpand (pattern, options) { + if (!options) { + if (this instanceof Minimatch) { + options = this.options + } else { + options = {} + } + } + + pattern = typeof pattern === 'undefined' + ? this.pattern : pattern + + if (typeof pattern === 'undefined') { + throw new TypeError('undefined pattern') + } + + if (options.nobrace || + !pattern.match(/\{.*\}/)) { + // shortcut. no need to expand. + return [pattern] + } + + return expand(pattern) +} + +// parse a component of the expanded set. 
+// At this point, no pattern may contain "/" in it +// so we're going to return a 2d array, where each entry is the full +// pattern, split on '/', and then turned into a regular expression. +// A regexp is made at the end which joins each array with an +// escaped /, and another full one which joins each regexp with |. +// +// Following the lead of Bash 4.1, note that "**" only has special meaning +// when it is the *only* thing in a path portion. Otherwise, any series +// of * is equivalent to a single *. Globstar behavior is enabled by +// default, and can be disabled by setting options.noglobstar. +Minimatch.prototype.parse = parse +var SUBPARSE = {} +function parse (pattern, isSub) { + if (pattern.length > 1024 * 64) { + throw new TypeError('pattern is too long') + } + + var options = this.options + + // shortcuts + if (!options.noglobstar && pattern === '**') return GLOBSTAR + if (pattern === '') return '' + + var re = '' + var hasMagic = !!options.nocase + var escaping = false + // ? => one single character + var patternListStack = [] + var negativeLists = [] + var stateChar + var inClass = false + var reClassStart = -1 + var classStart = -1 + // . and .. never match anything that doesn't start with ., + // even when options.dot is set. + var patternStart = pattern.charAt(0) === '.' ? '' // anything + // not (start or / followed by . or .. followed by / or end) + : options.dot ? '(?!(?:^|\\\/)\\.{1,2}(?:$|\\\/))' + : '(?!\\.)' + var self = this + + function clearStateChar () { + if (stateChar) { + // we had some state-tracking character + // that wasn't consumed by this pass. + switch (stateChar) { + case '*': + re += star + hasMagic = true + break + case '?': + re += qmark + hasMagic = true + break + default: + re += '\\' + stateChar + break + } + self.debug('clearStateChar %j %j', stateChar, re) + stateChar = false + } + } + + for (var i = 0, len = pattern.length, c + ; (i < len) && (c = pattern.charAt(i)) + ; i++) { + this.debug('%s\t%s %s %j', pattern, i, re, c) + + // skip over any that are escaped. + if (escaping && reSpecials[c]) { + re += '\\' + c + escaping = false + continue + } + + switch (c) { + case '/': + // completely not allowed, even escaped. + // Should already be path-split by now. + return false + + case '\\': + clearStateChar() + escaping = true + continue + + // the various stateChar values + // for the "extglob" stuff. + case '?': + case '*': + case '+': + case '@': + case '!': + this.debug('%s\t%s %s %j <-- stateChar', pattern, i, re, c) + + // all of those are literals inside a class, except that + // the glob [!a] means [^a] in regexp + if (inClass) { + this.debug(' in class') + if (c === '!' && i === classStart + 1) c = '^' + re += c + continue + } + + // if we already have a stateChar, then it means + // that there was something like ** or +? in there. + // Handle the stateChar, then proceed with this one. + self.debug('call clearStateChar %j', stateChar) + clearStateChar() + stateChar = c + // if extglob is disabled, then +(asdf|foo) isn't a thing. + // just clear the statechar *now*, rather than even diving into + // the patternList stuff. + if (options.noext) clearStateChar() + continue + + case '(': + if (inClass) { + re += '(' + continue + } + + if (!stateChar) { + re += '\\(' + continue + } + + patternListStack.push({ + type: stateChar, + start: i - 1, + reStart: re.length, + open: plTypes[stateChar].open, + close: plTypes[stateChar].close + }) + // negation is (?:(?!js)[^/]*) + re += stateChar === '!' ? 
'(?:(?!(?:' : '(?:' + this.debug('plType %j %j', stateChar, re) + stateChar = false + continue + + case ')': + if (inClass || !patternListStack.length) { + re += '\\)' + continue + } + + clearStateChar() + hasMagic = true + var pl = patternListStack.pop() + // negation is (?:(?!js)[^/]*) + // The others are (?:) + re += pl.close + if (pl.type === '!') { + negativeLists.push(pl) + } + pl.reEnd = re.length + continue + + case '|': + if (inClass || !patternListStack.length || escaping) { + re += '\\|' + escaping = false + continue + } + + clearStateChar() + re += '|' + continue + + // these are mostly the same in regexp and glob + case '[': + // swallow any state-tracking char before the [ + clearStateChar() + + if (inClass) { + re += '\\' + c + continue + } + + inClass = true + classStart = i + reClassStart = re.length + re += c + continue + + case ']': + // a right bracket shall lose its special + // meaning and represent itself in + // a bracket expression if it occurs + // first in the list. -- POSIX.2 2.8.3.2 + if (i === classStart + 1 || !inClass) { + re += '\\' + c + escaping = false + continue + } + + // handle the case where we left a class open. + // "[z-a]" is valid, equivalent to "\[z-a\]" + if (inClass) { + // split where the last [ was, make sure we don't have + // an invalid re. if so, re-walk the contents of the + // would-be class to re-translate any characters that + // were passed through as-is + // TODO: It would probably be faster to determine this + // without a try/catch and a new RegExp, but it's tricky + // to do safely. For now, this is safe and works. + var cs = pattern.substring(classStart + 1, i) + try { + RegExp('[' + cs + ']') + } catch (er) { + // not a valid class! + var sp = this.parse(cs, SUBPARSE) + re = re.substr(0, reClassStart) + '\\[' + sp[0] + '\\]' + hasMagic = hasMagic || sp[1] + inClass = false + continue + } + } + + // finish up the class. + hasMagic = true + inClass = false + re += c + continue + + default: + // swallow any state char that wasn't consumed + clearStateChar() + + if (escaping) { + // no need + escaping = false + } else if (reSpecials[c] + && !(c === '^' && inClass)) { + re += '\\' + } + + re += c + + } // switch + } // for + + // handle the case where we left a class open. + // "[abc" is valid, equivalent to "\[abc" + if (inClass) { + // split where the last [ was, and escape it + // this is a huge pita. We now have to re-walk + // the contents of the would-be class to re-translate + // any characters that were passed through as-is + cs = pattern.substr(classStart + 1) + sp = this.parse(cs, SUBPARSE) + re = re.substr(0, reClassStart) + '\\[' + sp[0] + hasMagic = hasMagic || sp[1] + } + + // handle the case where we had a +( thing at the *end* + // of the pattern. + // each pattern list stack adds 3 chars, and we need to go through + // and escape any | chars that were passed through as-is for the regexp. + // Go through and escape them, taking care not to double-escape any + // | chars that were already escaped. + for (pl = patternListStack.pop(); pl; pl = patternListStack.pop()) { + var tail = re.slice(pl.reStart + pl.open.length) + this.debug('setting tail', re, pl) + // maybe some even number of \, then maybe 1 \, followed by a | + tail = tail.replace(/((?:\\{2}){0,64})(\\?)\|/g, function (_, $1, $2) { + if (!$2) { + // the | isn't already escaped, so escape it. + $2 = '\\' + } + + // need to escape all those slashes *again*, without escaping the + // one that we need for escaping the | character. 
As it works out, + // escaping an even number of slashes can be done by simply repeating + // it exactly after itself. That's why this trick works. + // + // I am sorry that you have to see this. + return $1 + $1 + $2 + '|' + }) + + this.debug('tail=%j\n %s', tail, tail, pl, re) + var t = pl.type === '*' ? star + : pl.type === '?' ? qmark + : '\\' + pl.type + + hasMagic = true + re = re.slice(0, pl.reStart) + t + '\\(' + tail + } + + // handle trailing things that only matter at the very end. + clearStateChar() + if (escaping) { + // trailing \\ + re += '\\\\' + } + + // only need to apply the nodot start if the re starts with + // something that could conceivably capture a dot + var addPatternStart = false + switch (re.charAt(0)) { + case '.': + case '[': + case '(': addPatternStart = true + } + + // Hack to work around lack of negative lookbehind in JS + // A pattern like: *.!(x).!(y|z) needs to ensure that a name + // like 'a.xyz.yz' doesn't match. So, the first negative + // lookahead, has to look ALL the way ahead, to the end of + // the pattern. + for (var n = negativeLists.length - 1; n > -1; n--) { + var nl = negativeLists[n] + + var nlBefore = re.slice(0, nl.reStart) + var nlFirst = re.slice(nl.reStart, nl.reEnd - 8) + var nlLast = re.slice(nl.reEnd - 8, nl.reEnd) + var nlAfter = re.slice(nl.reEnd) + + nlLast += nlAfter + + // Handle nested stuff like *(*.js|!(*.json)), where open parens + // mean that we should *not* include the ) in the bit that is considered + // "after" the negated section. + var openParensBefore = nlBefore.split('(').length - 1 + var cleanAfter = nlAfter + for (i = 0; i < openParensBefore; i++) { + cleanAfter = cleanAfter.replace(/\)[+*?]?/, '') + } + nlAfter = cleanAfter + + var dollar = '' + if (nlAfter === '' && isSub !== SUBPARSE) { + dollar = '$' + } + var newRe = nlBefore + nlFirst + nlAfter + dollar + nlLast + re = newRe + } + + // if the re is not "" at this point, then we need to make sure + // it doesn't match against an empty path part. + // Otherwise a/* will match a/, which it should not. + if (re !== '' && hasMagic) { + re = '(?=.)' + re + } + + if (addPatternStart) { + re = patternStart + re + } + + // parsing just a piece of a larger pattern. + if (isSub === SUBPARSE) { + return [re, hasMagic] + } + + // skip the regexp for non-magical patterns + // unescape anything in it, though, so that it'll be + // an exact match against a file etc. + if (!hasMagic) { + return globUnescape(pattern) + } + + var flags = options.nocase ? 'i' : '' + try { + var regExp = new RegExp('^' + re + '$', flags) + } catch (er) { + // If it was an invalid regular expression, then it can't match + // anything. This trick looks for a character after the end of + // the string, which is of course impossible, except in multi-line + // mode, but it's not a /m regex. + return new RegExp('$.') + } + + regExp._glob = pattern + regExp._src = re + + return regExp +} + +minimatch.makeRe = function (pattern, options) { + return new Minimatch(pattern, options || {}).makeRe() +} + +Minimatch.prototype.makeRe = makeRe +function makeRe () { + if (this.regexp || this.regexp === false) return this.regexp + + // at this point, this.set is a 2d array of partial + // pattern strings, or "**". + // + // It's better to use .match(). This function shouldn't + // be used, really, but it's pretty convenient sometimes, + // when you just want to work with a regex. 
+ var set = this.set + + if (!set.length) { + this.regexp = false + return this.regexp + } + var options = this.options + + var twoStar = options.noglobstar ? star + : options.dot ? twoStarDot + : twoStarNoDot + var flags = options.nocase ? 'i' : '' + + var re = set.map(function (pattern) { + return pattern.map(function (p) { + return (p === GLOBSTAR) ? twoStar + : (typeof p === 'string') ? regExpEscape(p) + : p._src + }).join('\\\/') + }).join('|') + + // must match entire pattern + // ending in a * or ** will make it less strict. + re = '^(?:' + re + ')$' + + // can match anything, as long as it's not this. + if (this.negate) re = '^(?!' + re + ').*$' + + try { + this.regexp = new RegExp(re, flags) + } catch (ex) { + this.regexp = false + } + return this.regexp +} + +minimatch.match = function (list, pattern, options) { + options = options || {} + var mm = new Minimatch(pattern, options) + list = list.filter(function (f) { + return mm.match(f) + }) + if (mm.options.nonull && !list.length) { + list.push(pattern) + } + return list +} + +Minimatch.prototype.match = match +function match (f, partial) { + this.debug('match', f, this.pattern) + // short-circuit in the case of busted things. + // comments, etc. + if (this.comment) return false + if (this.empty) return f === '' + + if (f === '/' && partial) return true + + var options = this.options + + // windows: need to use /, not \ + if (path.sep !== '/') { + f = f.split(path.sep).join('/') + } + + // treat the test path as a set of pathparts. + f = f.split(slashSplit) + this.debug(this.pattern, 'split', f) + + // just ONE of the pattern sets in this.set needs to match + // in order for it to be valid. If negating, then just one + // match means that we have failed. + // Either way, return on the first hit. + + var set = this.set + this.debug(this.pattern, 'set', set) + + // Find the basename of the path by looking for the last non-empty segment + var filename + var i + for (i = f.length - 1; i >= 0; i--) { + filename = f[i] + if (filename) break + } + + for (i = 0; i < set.length; i++) { + var pattern = set[i] + var file = f + if (options.matchBase && pattern.length === 1) { + file = [filename] + } + var hit = this.matchOne(file, pattern, partial) + if (hit) { + if (options.flipNegate) return true + return !this.negate + } + } + + // didn't get any hits. this is success if it's a negative + // pattern, failure otherwise. + if (options.flipNegate) return false + return this.negate +} + +// set partial to true to test if, for example, +// "/a/b" matches the start of "/*/b/*/d" +// Partial means, if you run out of file before you run +// out of pattern, then that's fine, as long as all +// the parts match. +Minimatch.prototype.matchOne = function (file, pattern, partial) { + var options = this.options + + this.debug('matchOne', + { 'this': this, file: file, pattern: pattern }) + + this.debug('matchOne', file.length, pattern.length) + + for (var fi = 0, + pi = 0, + fl = file.length, + pl = pattern.length + ; (fi < fl) && (pi < pl) + ; fi++, pi++) { + this.debug('matchOne loop') + var p = pattern[pi] + var f = file[fi] + + this.debug(pattern, p, f) + + // should be impossible. + // some invalid regexp stuff in the set. + if (p === false) return false + + if (p === GLOBSTAR) { + this.debug('GLOBSTAR', [pattern, p, f]) + + // "**" + // a/**/b/**/c would match the following: + // a/b/x/y/z/c + // a/x/y/z/b/c + // a/b/x/b/x/c + // a/b/c + // To do this, take the rest of the pattern after + // the **, and see if it would match the file remainder. 
+ // If so, return success. + // If not, the ** "swallows" a segment, and try again. + // This is recursively awful. + // + // a/**/b/**/c matching a/b/x/y/z/c + // - a matches a + // - doublestar + // - matchOne(b/x/y/z/c, b/**/c) + // - b matches b + // - doublestar + // - matchOne(x/y/z/c, c) -> no + // - matchOne(y/z/c, c) -> no + // - matchOne(z/c, c) -> no + // - matchOne(c, c) yes, hit + var fr = fi + var pr = pi + 1 + if (pr === pl) { + this.debug('** at the end') + // a ** at the end will just swallow the rest. + // We have found a match. + // however, it will not swallow /.x, unless + // options.dot is set. + // . and .. are *never* matched by **, for explosively + // exponential reasons. + for (; fi < fl; fi++) { + if (file[fi] === '.' || file[fi] === '..' || + (!options.dot && file[fi].charAt(0) === '.')) return false + } + return true + } + + // ok, let's see if we can swallow whatever we can. + while (fr < fl) { + var swallowee = file[fr] + + this.debug('\nglobstar while', file, fr, pattern, pr, swallowee) + + // XXX remove this slice. Just pass the start index. + if (this.matchOne(file.slice(fr), pattern.slice(pr), partial)) { + this.debug('globstar found match!', fr, fl, swallowee) + // found a match. + return true + } else { + // can't swallow "." or ".." ever. + // can only swallow ".foo" when explicitly asked. + if (swallowee === '.' || swallowee === '..' || + (!options.dot && swallowee.charAt(0) === '.')) { + this.debug('dot detected!', file, fr, pattern, pr) + break + } + + // ** swallows a segment, and continue. + this.debug('globstar swallow a segment, and continue') + fr++ + } + } + + // no match was found. + // However, in partial mode, we can't say this is necessarily over. + // If there's more *pattern* left, then + if (partial) { + // ran out of file + this.debug('\n>>> no match, partial?', file, fr, pattern, pr) + if (fr === fl) return true + } + return false + } + + // something other than ** + // non-magic patterns just have to match exactly + // patterns with magic have been turned into regexps. + var hit + if (typeof p === 'string') { + if (options.nocase) { + hit = f.toLowerCase() === p.toLowerCase() + } else { + hit = f === p + } + this.debug('string match', p, f, hit) + } else { + hit = f.match(p) + this.debug('pattern match', p, f, hit) + } + + if (!hit) return false + } + + // Note: ending in / means that we'll get a final "" + // at the end of the pattern. This can only match a + // corresponding "" at the end of the file. + // If the file ends in /, then it can only match a + // a pattern that ends in /, unless the pattern just + // doesn't have any more for it. But, a/b/ should *not* + // match "a/b/*", even though "" matches against the + // [^/]*? pattern, except in partial mode, where it might + // simply not be reached yet. + // However, a/b/ should still satisfy a/* + + // now either we fell off the end of the pattern, or we're done. + if (fi === fl && pi === pl) { + // ran out of pattern and filename at the same time. + // an exact hit! + return true + } else if (fi === fl) { + // ran out of file, but still had pattern left. + // this is ok if we're doing the match as part of + // a glob fs traversal. + return partial + } else if (pi === pl) { + // ran out of pattern, still have file left. + // this is only acceptable if we're on the very last + // empty segment of a file with a trailing slash. + // a/* should match a/b/ + var emptyFileEnd = (fi === fl - 1) && (file[fi] === '') + return emptyFileEnd + } + + // should be unreachable. 
+ throw new Error('wtf?') +} + +// replace stuff like \* with * +function globUnescape (s) { + return s.replace(/\\(.)/g, '$1') +} + +function regExpEscape (s) { + return s.replace(/[-[\]{}()*+?.,\\^$|#\s]/g, '\\$&') +} diff --git a/node_modules/minimatch/package.json b/node_modules/minimatch/package.json new file mode 100644 index 000000000..a17b4b3c8 --- /dev/null +++ b/node_modules/minimatch/package.json @@ -0,0 +1,64 @@ +{ + "_from": "minimatch@^3.0.4", + "_id": "minimatch@3.0.4", + "_inBundle": false, + "_integrity": "sha512-yJHVQEhyqPLUTgt9B83PXu6W3rx4MvvHvSUvToogpwoGDOUQ+yDrR0HRot+yOCdCO7u4hX3pWft6kWBBcqh0UA==", + "_location": "/minimatch", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "minimatch@^3.0.4", + "name": "minimatch", + "escapedName": "minimatch", + "rawSpec": "^3.0.4", + "saveSpec": null, + "fetchSpec": "^3.0.4" + }, + "_requiredBy": [ + "/filelist", + "/jake" + ], + "_resolved": "https://registry.npmjs.org/minimatch/-/minimatch-3.0.4.tgz", + "_shasum": "5166e286457f03306064be5497e8dbb0c3d32083", + "_spec": "minimatch@^3.0.4", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\jake", + "author": { + "name": "Isaac Z. Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me" + }, + "bugs": { + "url": "https://github.com/isaacs/minimatch/issues" + }, + "bundleDependencies": false, + "dependencies": { + "brace-expansion": "^1.1.7" + }, + "deprecated": false, + "description": "a glob matcher in javascript", + "devDependencies": { + "tap": "^10.3.2" + }, + "engines": { + "node": "*" + }, + "files": [ + "minimatch.js" + ], + "homepage": "https://github.com/isaacs/minimatch#readme", + "license": "ISC", + "main": "minimatch.js", + "name": "minimatch", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/minimatch.git" + }, + "scripts": { + "postpublish": "git push origin --all; git push origin --tags", + "postversion": "npm publish", + "preversion": "npm test", + "test": "tap test/*.js --cov" + }, + "version": "3.0.4" +} diff --git a/node_modules/minimist/.travis.yml b/node_modules/minimist/.travis.yml new file mode 100644 index 000000000..74c57bf15 --- /dev/null +++ b/node_modules/minimist/.travis.yml @@ -0,0 +1,8 @@ +language: node_js +node_js: + - "0.8" + - "0.10" + - "0.12" + - "iojs" +before_install: + - npm install -g npm@~1.4.6 diff --git a/node_modules/minimist/LICENSE b/node_modules/minimist/LICENSE new file mode 100644 index 000000000..ee27ba4b4 --- /dev/null +++ b/node_modules/minimist/LICENSE @@ -0,0 +1,18 @@ +This software is released under the MIT license: + +Permission is hereby granted, free of charge, to any person obtaining a copy of +this software and associated documentation files (the "Software"), to deal in +the Software without restriction, including without limitation the rights to +use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of +the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS +FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR +COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER +IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN +CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/minimist/example/parse.js b/node_modules/minimist/example/parse.js new file mode 100644 index 000000000..f7c8d4980 --- /dev/null +++ b/node_modules/minimist/example/parse.js @@ -0,0 +1,2 @@ +var argv = require('../')(process.argv.slice(2)); +console.log(argv); diff --git a/node_modules/minimist/index.js b/node_modules/minimist/index.js new file mode 100644 index 000000000..d2afe5e4d --- /dev/null +++ b/node_modules/minimist/index.js @@ -0,0 +1,245 @@ +module.exports = function (args, opts) { + if (!opts) opts = {}; + + var flags = { bools : {}, strings : {}, unknownFn: null }; + + if (typeof opts['unknown'] === 'function') { + flags.unknownFn = opts['unknown']; + } + + if (typeof opts['boolean'] === 'boolean' && opts['boolean']) { + flags.allBools = true; + } else { + [].concat(opts['boolean']).filter(Boolean).forEach(function (key) { + flags.bools[key] = true; + }); + } + + var aliases = {}; + Object.keys(opts.alias || {}).forEach(function (key) { + aliases[key] = [].concat(opts.alias[key]); + aliases[key].forEach(function (x) { + aliases[x] = [key].concat(aliases[key].filter(function (y) { + return x !== y; + })); + }); + }); + + [].concat(opts.string).filter(Boolean).forEach(function (key) { + flags.strings[key] = true; + if (aliases[key]) { + flags.strings[aliases[key]] = true; + } + }); + + var defaults = opts['default'] || {}; + + var argv = { _ : [] }; + Object.keys(flags.bools).forEach(function (key) { + setArg(key, defaults[key] === undefined ? false : defaults[key]); + }); + + var notFlags = []; + + if (args.indexOf('--') !== -1) { + notFlags = args.slice(args.indexOf('--')+1); + args = args.slice(0, args.indexOf('--')); + } + + function argDefined(key, arg) { + return (flags.allBools && /^--[^=]+$/.test(arg)) || + flags.strings[key] || flags.bools[key] || aliases[key]; + } + + function setArg (key, val, arg) { + if (arg && flags.unknownFn && !argDefined(key, arg)) { + if (flags.unknownFn(arg) === false) return; + } + + var value = !flags.strings[key] && isNumber(val) + ? Number(val) : val + ; + setKey(argv, key.split('.'), value); + + (aliases[key] || []).forEach(function (x) { + setKey(argv, x.split('.'), value); + }); + } + + function setKey (obj, keys, value) { + var o = obj; + for (var i = 0; i < keys.length-1; i++) { + var key = keys[i]; + if (key === '__proto__') return; + if (o[key] === undefined) o[key] = {}; + if (o[key] === Object.prototype || o[key] === Number.prototype + || o[key] === String.prototype) o[key] = {}; + if (o[key] === Array.prototype) o[key] = []; + o = o[key]; + } + + var key = keys[keys.length - 1]; + if (key === '__proto__') return; + if (o === Object.prototype || o === Number.prototype + || o === String.prototype) o = {}; + if (o === Array.prototype) o = []; + if (o[key] === undefined || flags.bools[key] || typeof o[key] === 'boolean') { + o[key] = value; + } + else if (Array.isArray(o[key])) { + o[key].push(value); + } + else { + o[key] = [ o[key], value ]; + } + } + + function aliasIsBoolean(key) { + return aliases[key].some(function (x) { + return flags.bools[x]; + }); + } + + for (var i = 0; i < args.length; i++) { + var arg = args[i]; + + if (/^--.+=/.test(arg)) { + // Using [\s\S] instead of . because js doesn't support the + // 'dotall' regex modifier. 
See: + // http://stackoverflow.com/a/1068308/13216 + var m = arg.match(/^--([^=]+)=([\s\S]*)$/); + var key = m[1]; + var value = m[2]; + if (flags.bools[key]) { + value = value !== 'false'; + } + setArg(key, value, arg); + } + else if (/^--no-.+/.test(arg)) { + var key = arg.match(/^--no-(.+)/)[1]; + setArg(key, false, arg); + } + else if (/^--.+/.test(arg)) { + var key = arg.match(/^--(.+)/)[1]; + var next = args[i + 1]; + if (next !== undefined && !/^-/.test(next) + && !flags.bools[key] + && !flags.allBools + && (aliases[key] ? !aliasIsBoolean(key) : true)) { + setArg(key, next, arg); + i++; + } + else if (/^(true|false)$/.test(next)) { + setArg(key, next === 'true', arg); + i++; + } + else { + setArg(key, flags.strings[key] ? '' : true, arg); + } + } + else if (/^-[^-]+/.test(arg)) { + var letters = arg.slice(1,-1).split(''); + + var broken = false; + for (var j = 0; j < letters.length; j++) { + var next = arg.slice(j+2); + + if (next === '-') { + setArg(letters[j], next, arg) + continue; + } + + if (/[A-Za-z]/.test(letters[j]) && /=/.test(next)) { + setArg(letters[j], next.split('=')[1], arg); + broken = true; + break; + } + + if (/[A-Za-z]/.test(letters[j]) + && /-?\d+(\.\d*)?(e-?\d+)?$/.test(next)) { + setArg(letters[j], next, arg); + broken = true; + break; + } + + if (letters[j+1] && letters[j+1].match(/\W/)) { + setArg(letters[j], arg.slice(j+2), arg); + broken = true; + break; + } + else { + setArg(letters[j], flags.strings[letters[j]] ? '' : true, arg); + } + } + + var key = arg.slice(-1)[0]; + if (!broken && key !== '-') { + if (args[i+1] && !/^(-|--)[^-]/.test(args[i+1]) + && !flags.bools[key] + && (aliases[key] ? !aliasIsBoolean(key) : true)) { + setArg(key, args[i+1], arg); + i++; + } + else if (args[i+1] && /^(true|false)$/.test(args[i+1])) { + setArg(key, args[i+1] === 'true', arg); + i++; + } + else { + setArg(key, flags.strings[key] ? '' : true, arg); + } + } + } + else { + if (!flags.unknownFn || flags.unknownFn(arg) !== false) { + argv._.push( + flags.strings['_'] || !isNumber(arg) ? 
arg : Number(arg) + ); + } + if (opts.stopEarly) { + argv._.push.apply(argv._, args.slice(i + 1)); + break; + } + } + } + + Object.keys(defaults).forEach(function (key) { + if (!hasKey(argv, key.split('.'))) { + setKey(argv, key.split('.'), defaults[key]); + + (aliases[key] || []).forEach(function (x) { + setKey(argv, x.split('.'), defaults[key]); + }); + } + }); + + if (opts['--']) { + argv['--'] = new Array(); + notFlags.forEach(function(key) { + argv['--'].push(key); + }); + } + else { + notFlags.forEach(function(key) { + argv._.push(key); + }); + } + + return argv; +}; + +function hasKey (obj, keys) { + var o = obj; + keys.slice(0,-1).forEach(function (key) { + o = (o[key] || {}); + }); + + var key = keys[keys.length - 1]; + return key in o; +} + +function isNumber (x) { + if (typeof x === 'number') return true; + if (/^0x[0-9a-f]+$/i.test(x)) return true; + return /^[-+]?(?:\d+(?:\.\d*)?|\.\d+)(e[-+]?\d+)?$/.test(x); +} + diff --git a/node_modules/minimist/package.json b/node_modules/minimist/package.json new file mode 100644 index 000000000..a9061371c --- /dev/null +++ b/node_modules/minimist/package.json @@ -0,0 +1,73 @@ +{ + "_from": "minimist@^1.2.0", + "_id": "minimist@1.2.5", + "_inBundle": false, + "_integrity": "sha512-FM9nNUYrRBAELZQT3xeZQ7fmMOBg6nWNmJKTcgsJeaLstP/UODVpGsr5OhXhhXg6f+qtJ8uiZ+PUxkDWcgIXLw==", + "_location": "/minimist", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "minimist@^1.2.0", + "name": "minimist", + "escapedName": "minimist", + "rawSpec": "^1.2.0", + "saveSpec": null, + "fetchSpec": "^1.2.0" + }, + "_requiredBy": [ + "/rc" + ], + "_resolved": "https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz", + "_shasum": "67d66014b66a6a8aaa0c083c5fd58df4e4e97602", + "_spec": "minimist@^1.2.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\rc", + "author": { + "name": "James Halliday", + "email": "mail@substack.net", + "url": "http://substack.net" + }, + "bugs": { + "url": "https://github.com/substack/minimist/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "parse argument options", + "devDependencies": { + "covert": "^1.0.0", + "tap": "~0.4.0", + "tape": "^3.5.0" + }, + "homepage": "https://github.com/substack/minimist", + "keywords": [ + "argv", + "getopt", + "parser", + "optimist" + ], + "license": "MIT", + "main": "index.js", + "name": "minimist", + "repository": { + "type": "git", + "url": "git://github.com/substack/minimist.git" + }, + "scripts": { + "coverage": "covert test/*.js", + "test": "tap test/*.js" + }, + "testling": { + "files": "test/*.js", + "browsers": [ + "ie/6..latest", + "ff/5", + "firefox/latest", + "chrome/10", + "chrome/latest", + "safari/5.1", + "safari/latest", + "opera/12" + ] + }, + "version": "1.2.5" +} diff --git a/node_modules/minimist/readme.markdown b/node_modules/minimist/readme.markdown new file mode 100644 index 000000000..5fd97ab11 --- /dev/null +++ b/node_modules/minimist/readme.markdown @@ -0,0 +1,95 @@ +# minimist + +parse argument options + +This module is the guts of optimist's argument parser without all the +fanciful decoration. 
+ +# example + +``` js +var argv = require('minimist')(process.argv.slice(2)); +console.log(argv); +``` + +``` +$ node example/parse.js -a beep -b boop +{ _: [], a: 'beep', b: 'boop' } +``` + +``` +$ node example/parse.js -x 3 -y 4 -n5 -abc --beep=boop foo bar baz +{ _: [ 'foo', 'bar', 'baz' ], + x: 3, + y: 4, + n: 5, + a: true, + b: true, + c: true, + beep: 'boop' } +``` + +# security + +Previous versions had a prototype pollution bug that could cause privilege +escalation in some circumstances when handling untrusted user input. + +Please use version 1.2.3 or later: https://snyk.io/vuln/SNYK-JS-MINIMIST-559764 + +# methods + +``` js +var parseArgs = require('minimist') +``` + +## var argv = parseArgs(args, opts={}) + +Return an argument object `argv` populated with the array arguments from `args`. + +`argv._` contains all the arguments that didn't have an option associated with +them. + +Numeric-looking arguments will be returned as numbers unless `opts.string` or +`opts.boolean` is set for that argument name. + +Any arguments after `'--'` will not be parsed and will end up in `argv._`. + +options can be: + +* `opts.string` - a string or array of strings argument names to always treat as +strings +* `opts.boolean` - a boolean, string or array of strings to always treat as +booleans. if `true` will treat all double hyphenated arguments without equal signs +as boolean (e.g. affects `--foo`, not `-f` or `--foo=bar`) +* `opts.alias` - an object mapping string names to strings or arrays of string +argument names to use as aliases +* `opts.default` - an object mapping string argument names to default values +* `opts.stopEarly` - when true, populate `argv._` with everything after the +first non-option +* `opts['--']` - when true, populate `argv._` with everything before the `--` +and `argv['--']` with everything after the `--`. Here's an example: + + ``` + > require('./')('one two three -- four five --six'.split(' '), { '--': true }) + { _: [ 'one', 'two', 'three' ], + '--': [ 'four', 'five', '--six' ] } + ``` + + Note that with `opts['--']` set, parsing for arguments still stops after the + `--`. + +* `opts.unknown` - a function which is invoked with a command line parameter not +defined in the `opts` configuration object. If the function returns `false`, the +unknown option is not added to `argv`. 
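Illustrative aside (not part of the vendored readme): a minimal sketch of how the `opts.unknown` hook described above behaves; the option names here are made up for the example.

``` js
var parseArgs = require('minimist');

// Collect anything the parser does not recognize and keep it out of argv.
var unrecognized = [];
var argv = parseArgs(['--port', '8080', '--verbose'], {
  string: ['port'],            // '--port' is a declared option, so it is not "unknown"
  unknown: function (arg) {
    unrecognized.push(arg);    // remember the unrecognized token
    return false;              // returning false keeps it out of argv
  }
});
// argv         => { _: [], port: '8080' }
// unrecognized => [ '--verbose' ]
```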
+ +# install + +With [npm](https://npmjs.org) do: + +``` +npm install minimist +``` + +# license + +MIT diff --git a/node_modules/minimist/test/all_bool.js b/node_modules/minimist/test/all_bool.js new file mode 100644 index 000000000..ac835483d --- /dev/null +++ b/node_modules/minimist/test/all_bool.js @@ -0,0 +1,32 @@ +var parse = require('../'); +var test = require('tape'); + +test('flag boolean true (default all --args to boolean)', function (t) { + var argv = parse(['moo', '--honk', 'cow'], { + boolean: true + }); + + t.deepEqual(argv, { + honk: true, + _: ['moo', 'cow'] + }); + + t.deepEqual(typeof argv.honk, 'boolean'); + t.end(); +}); + +test('flag boolean true only affects double hyphen arguments without equals signs', function (t) { + var argv = parse(['moo', '--honk', 'cow', '-p', '55', '--tacos=good'], { + boolean: true + }); + + t.deepEqual(argv, { + honk: true, + tacos: 'good', + p: 55, + _: ['moo', 'cow'] + }); + + t.deepEqual(typeof argv.honk, 'boolean'); + t.end(); +}); diff --git a/node_modules/minimist/test/bool.js b/node_modules/minimist/test/bool.js new file mode 100644 index 000000000..5f7dbde16 --- /dev/null +++ b/node_modules/minimist/test/bool.js @@ -0,0 +1,178 @@ +var parse = require('../'); +var test = require('tape'); + +test('flag boolean default false', function (t) { + var argv = parse(['moo'], { + boolean: ['t', 'verbose'], + default: { verbose: false, t: false } + }); + + t.deepEqual(argv, { + verbose: false, + t: false, + _: ['moo'] + }); + + t.deepEqual(typeof argv.verbose, 'boolean'); + t.deepEqual(typeof argv.t, 'boolean'); + t.end(); + +}); + +test('boolean groups', function (t) { + var argv = parse([ '-x', '-z', 'one', 'two', 'three' ], { + boolean: ['x','y','z'] + }); + + t.deepEqual(argv, { + x : true, + y : false, + z : true, + _ : [ 'one', 'two', 'three' ] + }); + + t.deepEqual(typeof argv.x, 'boolean'); + t.deepEqual(typeof argv.y, 'boolean'); + t.deepEqual(typeof argv.z, 'boolean'); + t.end(); +}); +test('boolean and alias with chainable api', function (t) { + var aliased = [ '-h', 'derp' ]; + var regular = [ '--herp', 'derp' ]; + var opts = { + herp: { alias: 'h', boolean: true } + }; + var aliasedArgv = parse(aliased, { + boolean: 'herp', + alias: { h: 'herp' } + }); + var propertyArgv = parse(regular, { + boolean: 'herp', + alias: { h: 'herp' } + }); + var expected = { + herp: true, + h: true, + '_': [ 'derp' ] + }; + + t.same(aliasedArgv, expected); + t.same(propertyArgv, expected); + t.end(); +}); + +test('boolean and alias with options hash', function (t) { + var aliased = [ '-h', 'derp' ]; + var regular = [ '--herp', 'derp' ]; + var opts = { + alias: { 'h': 'herp' }, + boolean: 'herp' + }; + var aliasedArgv = parse(aliased, opts); + var propertyArgv = parse(regular, opts); + var expected = { + herp: true, + h: true, + '_': [ 'derp' ] + }; + t.same(aliasedArgv, expected); + t.same(propertyArgv, expected); + t.end(); +}); + +test('boolean and alias array with options hash', function (t) { + var aliased = [ '-h', 'derp' ]; + var regular = [ '--herp', 'derp' ]; + var alt = [ '--harp', 'derp' ]; + var opts = { + alias: { 'h': ['herp', 'harp'] }, + boolean: 'h' + }; + var aliasedArgv = parse(aliased, opts); + var propertyArgv = parse(regular, opts); + var altPropertyArgv = parse(alt, opts); + var expected = { + harp: true, + herp: true, + h: true, + '_': [ 'derp' ] + }; + t.same(aliasedArgv, expected); + t.same(propertyArgv, expected); + t.same(altPropertyArgv, expected); + t.end(); +}); + +test('boolean and alias using explicit true', function 
(t) { + var aliased = [ '-h', 'true' ]; + var regular = [ '--herp', 'true' ]; + var opts = { + alias: { h: 'herp' }, + boolean: 'h' + }; + var aliasedArgv = parse(aliased, opts); + var propertyArgv = parse(regular, opts); + var expected = { + herp: true, + h: true, + '_': [ ] + }; + + t.same(aliasedArgv, expected); + t.same(propertyArgv, expected); + t.end(); +}); + +// regression, see https://github.com/substack/node-optimist/issues/71 +test('boolean and --x=true', function(t) { + var parsed = parse(['--boool', '--other=true'], { + boolean: 'boool' + }); + + t.same(parsed.boool, true); + t.same(parsed.other, 'true'); + + parsed = parse(['--boool', '--other=false'], { + boolean: 'boool' + }); + + t.same(parsed.boool, true); + t.same(parsed.other, 'false'); + t.end(); +}); + +test('boolean --boool=true', function (t) { + var parsed = parse(['--boool=true'], { + default: { + boool: false + }, + boolean: ['boool'] + }); + + t.same(parsed.boool, true); + t.end(); +}); + +test('boolean --boool=false', function (t) { + var parsed = parse(['--boool=false'], { + default: { + boool: true + }, + boolean: ['boool'] + }); + + t.same(parsed.boool, false); + t.end(); +}); + +test('boolean using something similar to true', function (t) { + var opts = { boolean: 'h' }; + var result = parse(['-h', 'true.txt'], opts); + var expected = { + h: true, + '_': ['true.txt'] + }; + + t.same(result, expected); + t.end(); +}); \ No newline at end of file diff --git a/node_modules/minimist/test/dash.js b/node_modules/minimist/test/dash.js new file mode 100644 index 000000000..5a4fa5be4 --- /dev/null +++ b/node_modules/minimist/test/dash.js @@ -0,0 +1,31 @@ +var parse = require('../'); +var test = require('tape'); + +test('-', function (t) { + t.plan(5); + t.deepEqual(parse([ '-n', '-' ]), { n: '-', _: [] }); + t.deepEqual(parse([ '-' ]), { _: [ '-' ] }); + t.deepEqual(parse([ '-f-' ]), { f: '-', _: [] }); + t.deepEqual( + parse([ '-b', '-' ], { boolean: 'b' }), + { b: true, _: [ '-' ] } + ); + t.deepEqual( + parse([ '-s', '-' ], { string: 's' }), + { s: '-', _: [] } + ); +}); + +test('-a -- b', function (t) { + t.plan(3); + t.deepEqual(parse([ '-a', '--', 'b' ]), { a: true, _: [ 'b' ] }); + t.deepEqual(parse([ '--a', '--', 'b' ]), { a: true, _: [ 'b' ] }); + t.deepEqual(parse([ '--a', '--', 'b' ]), { a: true, _: [ 'b' ] }); +}); + +test('move arguments after the -- into their own `--` array', function(t) { + t.plan(1); + t.deepEqual( + parse([ '--name', 'John', 'before', '--', 'after' ], { '--': true }), + { name: 'John', _: [ 'before' ], '--': [ 'after' ] }); +}); diff --git a/node_modules/minimist/test/default_bool.js b/node_modules/minimist/test/default_bool.js new file mode 100644 index 000000000..780a31127 --- /dev/null +++ b/node_modules/minimist/test/default_bool.js @@ -0,0 +1,35 @@ +var test = require('tape'); +var parse = require('../'); + +test('boolean default true', function (t) { + var argv = parse([], { + boolean: 'sometrue', + default: { sometrue: true } + }); + t.equal(argv.sometrue, true); + t.end(); +}); + +test('boolean default false', function (t) { + var argv = parse([], { + boolean: 'somefalse', + default: { somefalse: false } + }); + t.equal(argv.somefalse, false); + t.end(); +}); + +test('boolean default to null', function (t) { + var argv = parse([], { + boolean: 'maybe', + default: { maybe: null } + }); + t.equal(argv.maybe, null); + var argv = parse(['--maybe'], { + boolean: 'maybe', + default: { maybe: null } + }); + t.equal(argv.maybe, true); + t.end(); + +}) diff --git 
a/node_modules/minimist/test/dotted.js b/node_modules/minimist/test/dotted.js new file mode 100644 index 000000000..d8b3e856e --- /dev/null +++ b/node_modules/minimist/test/dotted.js @@ -0,0 +1,22 @@ +var parse = require('../'); +var test = require('tape'); + +test('dotted alias', function (t) { + var argv = parse(['--a.b', '22'], {default: {'a.b': 11}, alias: {'a.b': 'aa.bb'}}); + t.equal(argv.a.b, 22); + t.equal(argv.aa.bb, 22); + t.end(); +}); + +test('dotted default', function (t) { + var argv = parse('', {default: {'a.b': 11}, alias: {'a.b': 'aa.bb'}}); + t.equal(argv.a.b, 11); + t.equal(argv.aa.bb, 11); + t.end(); +}); + +test('dotted default with no alias', function (t) { + var argv = parse('', {default: {'a.b': 11}}); + t.equal(argv.a.b, 11); + t.end(); +}); diff --git a/node_modules/minimist/test/kv_short.js b/node_modules/minimist/test/kv_short.js new file mode 100644 index 000000000..f813b3050 --- /dev/null +++ b/node_modules/minimist/test/kv_short.js @@ -0,0 +1,16 @@ +var parse = require('../'); +var test = require('tape'); + +test('short -k=v' , function (t) { + t.plan(1); + + var argv = parse([ '-b=123' ]); + t.deepEqual(argv, { b: 123, _: [] }); +}); + +test('multi short -k=v' , function (t) { + t.plan(1); + + var argv = parse([ '-a=whatever', '-b=robots' ]); + t.deepEqual(argv, { a: 'whatever', b: 'robots', _: [] }); +}); diff --git a/node_modules/minimist/test/long.js b/node_modules/minimist/test/long.js new file mode 100644 index 000000000..5d3a1e09d --- /dev/null +++ b/node_modules/minimist/test/long.js @@ -0,0 +1,31 @@ +var test = require('tape'); +var parse = require('../'); + +test('long opts', function (t) { + t.deepEqual( + parse([ '--bool' ]), + { bool : true, _ : [] }, + 'long boolean' + ); + t.deepEqual( + parse([ '--pow', 'xixxle' ]), + { pow : 'xixxle', _ : [] }, + 'long capture sp' + ); + t.deepEqual( + parse([ '--pow=xixxle' ]), + { pow : 'xixxle', _ : [] }, + 'long capture eq' + ); + t.deepEqual( + parse([ '--host', 'localhost', '--port', '555' ]), + { host : 'localhost', port : 555, _ : [] }, + 'long captures sp' + ); + t.deepEqual( + parse([ '--host=localhost', '--port=555' ]), + { host : 'localhost', port : 555, _ : [] }, + 'long captures eq' + ); + t.end(); +}); diff --git a/node_modules/minimist/test/num.js b/node_modules/minimist/test/num.js new file mode 100644 index 000000000..2cc77f4d6 --- /dev/null +++ b/node_modules/minimist/test/num.js @@ -0,0 +1,36 @@ +var parse = require('../'); +var test = require('tape'); + +test('nums', function (t) { + var argv = parse([ + '-x', '1234', + '-y', '5.67', + '-z', '1e7', + '-w', '10f', + '--hex', '0xdeadbeef', + '789' + ]); + t.deepEqual(argv, { + x : 1234, + y : 5.67, + z : 1e7, + w : '10f', + hex : 0xdeadbeef, + _ : [ 789 ] + }); + t.deepEqual(typeof argv.x, 'number'); + t.deepEqual(typeof argv.y, 'number'); + t.deepEqual(typeof argv.z, 'number'); + t.deepEqual(typeof argv.w, 'string'); + t.deepEqual(typeof argv.hex, 'number'); + t.deepEqual(typeof argv._[0], 'number'); + t.end(); +}); + +test('already a number', function (t) { + var argv = parse([ '-x', 1234, 789 ]); + t.deepEqual(argv, { x : 1234, _ : [ 789 ] }); + t.deepEqual(typeof argv.x, 'number'); + t.deepEqual(typeof argv._[0], 'number'); + t.end(); +}); diff --git a/node_modules/minimist/test/parse.js b/node_modules/minimist/test/parse.js new file mode 100644 index 000000000..7b4a2a17c --- /dev/null +++ b/node_modules/minimist/test/parse.js @@ -0,0 +1,197 @@ +var parse = require('../'); +var test = require('tape'); + +test('parse args', function (t) 
{ + t.deepEqual( + parse([ '--no-moo' ]), + { moo : false, _ : [] }, + 'no' + ); + t.deepEqual( + parse([ '-v', 'a', '-v', 'b', '-v', 'c' ]), + { v : ['a','b','c'], _ : [] }, + 'multi' + ); + t.end(); +}); + +test('comprehensive', function (t) { + t.deepEqual( + parse([ + '--name=meowmers', 'bare', '-cats', 'woo', + '-h', 'awesome', '--multi=quux', + '--key', 'value', + '-b', '--bool', '--no-meep', '--multi=baz', + '--', '--not-a-flag', 'eek' + ]), + { + c : true, + a : true, + t : true, + s : 'woo', + h : 'awesome', + b : true, + bool : true, + key : 'value', + multi : [ 'quux', 'baz' ], + meep : false, + name : 'meowmers', + _ : [ 'bare', '--not-a-flag', 'eek' ] + } + ); + t.end(); +}); + +test('flag boolean', function (t) { + var argv = parse([ '-t', 'moo' ], { boolean: 't' }); + t.deepEqual(argv, { t : true, _ : [ 'moo' ] }); + t.deepEqual(typeof argv.t, 'boolean'); + t.end(); +}); + +test('flag boolean value', function (t) { + var argv = parse(['--verbose', 'false', 'moo', '-t', 'true'], { + boolean: [ 't', 'verbose' ], + default: { verbose: true } + }); + + t.deepEqual(argv, { + verbose: false, + t: true, + _: ['moo'] + }); + + t.deepEqual(typeof argv.verbose, 'boolean'); + t.deepEqual(typeof argv.t, 'boolean'); + t.end(); +}); + +test('newlines in params' , function (t) { + var args = parse([ '-s', "X\nX" ]) + t.deepEqual(args, { _ : [], s : "X\nX" }); + + // reproduce in bash: + // VALUE="new + // line" + // node program.js --s="$VALUE" + args = parse([ "--s=X\nX" ]) + t.deepEqual(args, { _ : [], s : "X\nX" }); + t.end(); +}); + +test('strings' , function (t) { + var s = parse([ '-s', '0001234' ], { string: 's' }).s; + t.equal(s, '0001234'); + t.equal(typeof s, 'string'); + + var x = parse([ '-x', '56' ], { string: 'x' }).x; + t.equal(x, '56'); + t.equal(typeof x, 'string'); + t.end(); +}); + +test('stringArgs', function (t) { + var s = parse([ ' ', ' ' ], { string: '_' })._; + t.same(s.length, 2); + t.same(typeof s[0], 'string'); + t.same(s[0], ' '); + t.same(typeof s[1], 'string'); + t.same(s[1], ' '); + t.end(); +}); + +test('empty strings', function(t) { + var s = parse([ '-s' ], { string: 's' }).s; + t.equal(s, ''); + t.equal(typeof s, 'string'); + + var str = parse([ '--str' ], { string: 'str' }).str; + t.equal(str, ''); + t.equal(typeof str, 'string'); + + var letters = parse([ '-art' ], { + string: [ 'a', 't' ] + }); + + t.equal(letters.a, ''); + t.equal(letters.r, true); + t.equal(letters.t, ''); + + t.end(); +}); + + +test('string and alias', function(t) { + var x = parse([ '--str', '000123' ], { + string: 's', + alias: { s: 'str' } + }); + + t.equal(x.str, '000123'); + t.equal(typeof x.str, 'string'); + t.equal(x.s, '000123'); + t.equal(typeof x.s, 'string'); + + var y = parse([ '-s', '000123' ], { + string: 'str', + alias: { str: 's' } + }); + + t.equal(y.str, '000123'); + t.equal(typeof y.str, 'string'); + t.equal(y.s, '000123'); + t.equal(typeof y.s, 'string'); + t.end(); +}); + +test('slashBreak', function (t) { + t.same( + parse([ '-I/foo/bar/baz' ]), + { I : '/foo/bar/baz', _ : [] } + ); + t.same( + parse([ '-xyz/foo/bar/baz' ]), + { x : true, y : true, z : '/foo/bar/baz', _ : [] } + ); + t.end(); +}); + +test('alias', function (t) { + var argv = parse([ '-f', '11', '--zoom', '55' ], { + alias: { z: 'zoom' } + }); + t.equal(argv.zoom, 55); + t.equal(argv.z, argv.zoom); + t.equal(argv.f, 11); + t.end(); +}); + +test('multiAlias', function (t) { + var argv = parse([ '-f', '11', '--zoom', '55' ], { + alias: { z: [ 'zm', 'zoom' ] } + }); + t.equal(argv.zoom, 55); + 
t.equal(argv.z, argv.zoom); + t.equal(argv.z, argv.zm); + t.equal(argv.f, 11); + t.end(); +}); + +test('nested dotted objects', function (t) { + var argv = parse([ + '--foo.bar', '3', '--foo.baz', '4', + '--foo.quux.quibble', '5', '--foo.quux.o_O', + '--beep.boop' + ]); + + t.same(argv.foo, { + bar : 3, + baz : 4, + quux : { + quibble : 5, + o_O : true + } + }); + t.same(argv.beep, { boop : true }); + t.end(); +}); diff --git a/node_modules/minimist/test/parse_modified.js b/node_modules/minimist/test/parse_modified.js new file mode 100644 index 000000000..ab620dc5e --- /dev/null +++ b/node_modules/minimist/test/parse_modified.js @@ -0,0 +1,9 @@ +var parse = require('../'); +var test = require('tape'); + +test('parse with modifier functions' , function (t) { + t.plan(1); + + var argv = parse([ '-b', '123' ], { boolean: 'b' }); + t.deepEqual(argv, { b: true, _: [123] }); +}); diff --git a/node_modules/minimist/test/proto.js b/node_modules/minimist/test/proto.js new file mode 100644 index 000000000..8649107ec --- /dev/null +++ b/node_modules/minimist/test/proto.js @@ -0,0 +1,44 @@ +var parse = require('../'); +var test = require('tape'); + +test('proto pollution', function (t) { + var argv = parse(['--__proto__.x','123']); + t.equal({}.x, undefined); + t.equal(argv.__proto__.x, undefined); + t.equal(argv.x, undefined); + t.end(); +}); + +test('proto pollution (array)', function (t) { + var argv = parse(['--x','4','--x','5','--x.__proto__.z','789']); + t.equal({}.z, undefined); + t.deepEqual(argv.x, [4,5]); + t.equal(argv.x.z, undefined); + t.equal(argv.x.__proto__.z, undefined); + t.end(); +}); + +test('proto pollution (number)', function (t) { + var argv = parse(['--x','5','--x.__proto__.z','100']); + t.equal({}.z, undefined); + t.equal((4).z, undefined); + t.equal(argv.x, 5); + t.equal(argv.x.z, undefined); + t.end(); +}); + +test('proto pollution (string)', function (t) { + var argv = parse(['--x','abc','--x.__proto__.z','def']); + t.equal({}.z, undefined); + t.equal('...'.z, undefined); + t.equal(argv.x, 'abc'); + t.equal(argv.x.z, undefined); + t.end(); +}); + +test('proto pollution (constructor)', function (t) { + var argv = parse(['--constructor.prototype.y','123']); + t.equal({}.y, undefined); + t.equal(argv.y, undefined); + t.end(); +}); diff --git a/node_modules/minimist/test/short.js b/node_modules/minimist/test/short.js new file mode 100644 index 000000000..d513a1c25 --- /dev/null +++ b/node_modules/minimist/test/short.js @@ -0,0 +1,67 @@ +var parse = require('../'); +var test = require('tape'); + +test('numeric short args', function (t) { + t.plan(2); + t.deepEqual(parse([ '-n123' ]), { n: 123, _: [] }); + t.deepEqual( + parse([ '-123', '456' ]), + { 1: true, 2: true, 3: 456, _: [] } + ); +}); + +test('short', function (t) { + t.deepEqual( + parse([ '-b' ]), + { b : true, _ : [] }, + 'short boolean' + ); + t.deepEqual( + parse([ 'foo', 'bar', 'baz' ]), + { _ : [ 'foo', 'bar', 'baz' ] }, + 'bare' + ); + t.deepEqual( + parse([ '-cats' ]), + { c : true, a : true, t : true, s : true, _ : [] }, + 'group' + ); + t.deepEqual( + parse([ '-cats', 'meow' ]), + { c : true, a : true, t : true, s : 'meow', _ : [] }, + 'short group next' + ); + t.deepEqual( + parse([ '-h', 'localhost' ]), + { h : 'localhost', _ : [] }, + 'short capture' + ); + t.deepEqual( + parse([ '-h', 'localhost', '-p', '555' ]), + { h : 'localhost', p : 555, _ : [] }, + 'short captures' + ); + t.end(); +}); + +test('mixed short bool and capture', function (t) { + t.same( + parse([ '-h', 'localhost', '-fp', '555', 
'script.js' ]), + { + f : true, p : 555, h : 'localhost', + _ : [ 'script.js' ] + } + ); + t.end(); +}); + +test('short and long', function (t) { + t.deepEqual( + parse([ '-h', 'localhost', '-fp', '555', 'script.js' ]), + { + f : true, p : 555, h : 'localhost', + _ : [ 'script.js' ] + } + ); + t.end(); +}); diff --git a/node_modules/minimist/test/stop_early.js b/node_modules/minimist/test/stop_early.js new file mode 100644 index 000000000..bdf9fbcb0 --- /dev/null +++ b/node_modules/minimist/test/stop_early.js @@ -0,0 +1,15 @@ +var parse = require('../'); +var test = require('tape'); + +test('stops parsing on the first non-option when stopEarly is set', function (t) { + var argv = parse(['--aaa', 'bbb', 'ccc', '--ddd'], { + stopEarly: true + }); + + t.deepEqual(argv, { + aaa: 'bbb', + _: ['ccc', '--ddd'] + }); + + t.end(); +}); diff --git a/node_modules/minimist/test/unknown.js b/node_modules/minimist/test/unknown.js new file mode 100644 index 000000000..462a36bdd --- /dev/null +++ b/node_modules/minimist/test/unknown.js @@ -0,0 +1,102 @@ +var parse = require('../'); +var test = require('tape'); + +test('boolean and alias is not unknown', function (t) { + var unknown = []; + function unknownFn(arg) { + unknown.push(arg); + return false; + } + var aliased = [ '-h', 'true', '--derp', 'true' ]; + var regular = [ '--herp', 'true', '-d', 'true' ]; + var opts = { + alias: { h: 'herp' }, + boolean: 'h', + unknown: unknownFn + }; + var aliasedArgv = parse(aliased, opts); + var propertyArgv = parse(regular, opts); + + t.same(unknown, ['--derp', '-d']); + t.end(); +}); + +test('flag boolean true any double hyphen argument is not unknown', function (t) { + var unknown = []; + function unknownFn(arg) { + unknown.push(arg); + return false; + } + var argv = parse(['--honk', '--tacos=good', 'cow', '-p', '55'], { + boolean: true, + unknown: unknownFn + }); + t.same(unknown, ['--tacos=good', 'cow', '-p']); + t.same(argv, { + honk: true, + _: [] + }); + t.end(); +}); + +test('string and alias is not unknown', function (t) { + var unknown = []; + function unknownFn(arg) { + unknown.push(arg); + return false; + } + var aliased = [ '-h', 'hello', '--derp', 'goodbye' ]; + var regular = [ '--herp', 'hello', '-d', 'moon' ]; + var opts = { + alias: { h: 'herp' }, + string: 'h', + unknown: unknownFn + }; + var aliasedArgv = parse(aliased, opts); + var propertyArgv = parse(regular, opts); + + t.same(unknown, ['--derp', '-d']); + t.end(); +}); + +test('default and alias is not unknown', function (t) { + var unknown = []; + function unknownFn(arg) { + unknown.push(arg); + return false; + } + var aliased = [ '-h', 'hello' ]; + var regular = [ '--herp', 'hello' ]; + var opts = { + default: { 'h': 'bar' }, + alias: { 'h': 'herp' }, + unknown: unknownFn + }; + var aliasedArgv = parse(aliased, opts); + var propertyArgv = parse(regular, opts); + + t.same(unknown, []); + t.end(); + unknownFn(); // exercise fn for 100% coverage +}); + +test('value following -- is not unknown', function (t) { + var unknown = []; + function unknownFn(arg) { + unknown.push(arg); + return false; + } + var aliased = [ '--bad', '--', 'good', 'arg' ]; + var opts = { + '--': true, + unknown: unknownFn + }; + var argv = parse(aliased, opts); + + t.same(unknown, ['--bad']); + t.same(argv, { + '--': ['good', 'arg'], + '_': [] + }) + t.end(); +}); diff --git a/node_modules/minimist/test/whitespace.js b/node_modules/minimist/test/whitespace.js new file mode 100644 index 000000000..8a52a58ce --- /dev/null +++ b/node_modules/minimist/test/whitespace.js @@ 
-0,0 +1,8 @@ +var parse = require('../'); +var test = require('tape'); + +test('whitespace should be whitespace' , function (t) { + t.plan(1); + var x = parse([ '-x', '\t' ]).x; + t.equal(x, '\t'); +}); diff --git a/node_modules/mongodb-connection-string-url/.esm-wrapper.mjs b/node_modules/mongodb-connection-string-url/.esm-wrapper.mjs new file mode 100644 index 000000000..4cd3c0461 --- /dev/null +++ b/node_modules/mongodb-connection-string-url/.esm-wrapper.mjs @@ -0,0 +1,4 @@ +import mod from "./lib/index.js"; + +export default mod["default"]; +export const CommaAndColonSeparatedRecord = mod.CommaAndColonSeparatedRecord; diff --git a/node_modules/mongodb-connection-string-url/LICENSE b/node_modules/mongodb-connection-string-url/LICENSE new file mode 100644 index 000000000..d57f55f4e --- /dev/null +++ b/node_modules/mongodb-connection-string-url/LICENSE @@ -0,0 +1,192 @@ + + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. 
For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. 
The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + Copyright 2020 MongoDB Inc. 
+ + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. + diff --git a/node_modules/mongodb-connection-string-url/README.md b/node_modules/mongodb-connection-string-url/README.md new file mode 100644 index 000000000..0eb65d00f --- /dev/null +++ b/node_modules/mongodb-connection-string-url/README.md @@ -0,0 +1,25 @@ +# mongodb-connection-string-url + +MongoDB connection strings, based on the WhatWG URL API + +```js +import ConnectionString from 'mongodb-connection-string-url'; + +const cs = new ConnectionString('mongodb://localhost'); +cs.searchParams.set('readPreference', 'secondary'); +console.log(cs.href); // 'mongodb://localhost/?readPreference=secondary' +``` + +## Deviations from the WhatWG URL package + +- URL parameters are case-insensitive +- The `.host`, `.hostname` and `.port` properties cannot be set, and reading + them does not return meaningful results (and are typed as `never`in TypeScript) +- The `.hosts` property contains a list of all hosts in the connection string +- The `.href` property cannot be set, only read +- There is an additional `.isSRV` property, set to `true` for `mongodb+srv://` +- There is an additional `.clone()` utility method on the prototype + +## LICENSE + +Apache-2.0 diff --git a/node_modules/mongodb-connection-string-url/lib/index.d.ts b/node_modules/mongodb-connection-string-url/lib/index.d.ts new file mode 100644 index 000000000..1020a12da --- /dev/null +++ b/node_modules/mongodb-connection-string-url/lib/index.d.ts @@ -0,0 +1,40 @@ +import { URL } from 'whatwg-url'; +declare class CaseInsensitiveMap extends Map { + delete(name: any): boolean; + get(name: any): string | undefined; + has(name: any): boolean; + set(name: any, value: any): this; + _normalizeKey(name: any): string; +} +declare abstract class URLWithoutHost extends URL { + abstract get host(): never; + abstract set host(value: never); + abstract get hostname(): never; + abstract set hostname(value: never); + abstract get port(): never; + abstract set port(value: never); + abstract get href(): string; + abstract set href(value: string); +} +export default class ConnectionString extends URLWithoutHost { + _hosts: string[]; + constructor(uri: string); + get host(): never; + set host(_ignored: never); + get hostname(): never; + set hostname(_ignored: never); + get port(): never; + set port(_ignored: never); + get href(): string; + set href(_ignored: string); + get isSRV(): boolean; + get hosts(): string[]; + set hosts(list: string[]); + toString(): string; + clone(): ConnectionString; +} +export declare class CommaAndColonSeparatedRecord extends CaseInsensitiveMap { + constructor(from?: string | null); + toString(): string; +} +export {}; diff --git a/node_modules/mongodb-connection-string-url/lib/index.js b/node_modules/mongodb-connection-string-url/lib/index.js new file mode 100644 index 000000000..82b4f98f9 --- /dev/null +++ b/node_modules/mongodb-connection-string-url/lib/index.js @@ -0,0 +1,161 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); 
+exports.CommaAndColonSeparatedRecord = void 0;
+const whatwg_url_1 = require("whatwg-url");
+const DUMMY_HOSTNAME = '__this_is_a_placeholder__';
+const HOSTS_REGEX = new RegExp(String.raw `(?<protocol>mongodb(?:\+srv|)):\/\/(?:(?<username>[^:]*)(?::(?<password>[^@]*))?@)?(?<hosts>(?!:)[^\/?@]+)(?<rest>.*)`);
+class CaseInsensitiveMap extends Map {
+    delete(name) {
+        return super.delete(this._normalizeKey(name));
+    }
+    get(name) {
+        return super.get(this._normalizeKey(name));
+    }
+    has(name) {
+        return super.has(this._normalizeKey(name));
+    }
+    set(name, value) {
+        return super.set(this._normalizeKey(name), value);
+    }
+    _normalizeKey(name) {
+        name = `${name}`;
+        for (const key of this.keys()) {
+            if (key.toLowerCase() === name.toLowerCase()) {
+                name = key;
+                break;
+            }
+        }
+        return name;
+    }
+}
+const caseInsenstiveURLSearchParams = (Ctor) => class CaseInsenstiveURLSearchParams extends Ctor {
+    append(name, value) {
+        return super.append(this._normalizeKey(name), value);
+    }
+    delete(name) {
+        return super.delete(this._normalizeKey(name));
+    }
+    get(name) {
+        return super.get(this._normalizeKey(name));
+    }
+    getAll(name) {
+        return super.getAll(this._normalizeKey(name));
+    }
+    has(name) {
+        return super.has(this._normalizeKey(name));
+    }
+    set(name, value) {
+        return super.set(this._normalizeKey(name), value);
+    }
+    _normalizeKey(name) {
+        return CaseInsensitiveMap.prototype._normalizeKey.call(this, name);
+    }
+};
+class URLWithoutHost extends whatwg_url_1.URL {
+}
+class MongoParseError extends Error {
+    get name() {
+        return 'MongoParseError';
+    }
+}
+class ConnectionString extends URLWithoutHost {
+    constructor(uri) {
+        var _a;
+        const match = uri.match(HOSTS_REGEX);
+        if (!match) {
+            throw new MongoParseError(`Invalid connection string "${uri}"`);
+        }
+        const { protocol, username, password, hosts, rest } = (_a = match.groups) !== null && _a !== void 0 ? _a : {};
+        if (!protocol || !hosts) {
+            throw new MongoParseError(`Protocol and host list are required in "${uri}"`);
+        }
+        try {
+            decodeURIComponent(username !== null && username !== void 0 ? username : '');
+            decodeURIComponent(password !== null && password !== void 0 ? password : '');
+        }
+        catch (err) {
+            throw new MongoParseError(err.message);
+        }
+        const illegalCharacters = new RegExp(String.raw `[:/?#\[\]@]`, 'gi');
+        if (username === null || username === void 0 ? void 0 : username.match(illegalCharacters)) {
+            throw new MongoParseError(`Username contains unescaped characters ${username}`);
+        }
+        if (!username || !password) {
+            const uriWithoutProtocol = uri.replace(`${protocol}://`, '');
+            if (uriWithoutProtocol.startsWith('@') || uriWithoutProtocol.startsWith(':')) {
+                throw new MongoParseError('URI contained empty userinfo section');
+            }
+        }
+        if (password === null || password === void 0 ?
void 0 : password.match(illegalCharacters)) { + throw new MongoParseError('Password contains unescaped characters'); + } + let authString = ''; + if (typeof username === 'string') + authString += username; + if (typeof password === 'string') + authString += `:${password}`; + if (authString) + authString += '@'; + super(`${protocol.toLowerCase()}://${authString}${DUMMY_HOSTNAME}${rest}`); + this._hosts = hosts.split(','); + if (this.isSRV && this.hosts.length !== 1) { + throw new MongoParseError('mongodb+srv URI cannot have multiple service names'); + } + if (this.isSRV && this.hosts.some(host => host.includes(':'))) { + throw new MongoParseError('mongodb+srv URI cannot have port number'); + } + if (!this.pathname) { + this.pathname = '/'; + } + Object.setPrototypeOf(this.searchParams, caseInsenstiveURLSearchParams(this.searchParams.constructor).prototype); + } + get host() { return DUMMY_HOSTNAME; } + set host(_ignored) { throw new Error('No single host for connection string'); } + get hostname() { return DUMMY_HOSTNAME; } + set hostname(_ignored) { throw new Error('No single host for connection string'); } + get port() { return ''; } + set port(_ignored) { throw new Error('No single host for connection string'); } + get href() { return this.toString(); } + set href(_ignored) { throw new Error('Cannot set href for connection strings'); } + get isSRV() { + return this.protocol.includes('srv'); + } + get hosts() { + return this._hosts; + } + set hosts(list) { + this._hosts = list; + } + toString() { + return super.toString().replace(DUMMY_HOSTNAME, this.hosts.join(',')); + } + clone() { + return new ConnectionString(this.toString()); + } + [Symbol.for('nodejs.util.inspect.custom')]() { + const { href, origin, protocol, username, password, hosts, pathname, search, searchParams, hash } = this; + return { href, origin, protocol, username, password, hosts, pathname, search, searchParams, hash }; + } +} +exports.default = ConnectionString; +class CommaAndColonSeparatedRecord extends CaseInsensitiveMap { + constructor(from) { + super(); + for (const entry of (from !== null && from !== void 0 ? 
from : '').split(',')) { + if (!entry) + continue; + const colonIndex = entry.indexOf(':'); + if (colonIndex === -1) { + this.set(entry, ''); + } + else { + this.set(entry.slice(0, colonIndex), entry.slice(colonIndex + 1)); + } + } + } + toString() { + return [...this].map(entry => entry.join(':')).join(','); + } +} +exports.CommaAndColonSeparatedRecord = CommaAndColonSeparatedRecord; +//# sourceMappingURL=index.js.map \ No newline at end of file diff --git a/node_modules/mongodb-connection-string-url/lib/index.js.map b/node_modules/mongodb-connection-string-url/lib/index.js.map new file mode 100644 index 000000000..df059f01f --- /dev/null +++ b/node_modules/mongodb-connection-string-url/lib/index.js.map @@ -0,0 +1 @@ +{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;AAAA,2CAAkD;AAElD,MAAM,cAAc,GAAG,2BAA2B,CAAC;AAInD,MAAM,WAAW,GAAG,IAAI,MAAM,CAC5B,MAAM,CAAC,GAAG,CAAA,uHAAuH,CAClI,CAAC;AAEF,MAAM,kBAAmB,SAAQ,GAAmB;IAClD,MAAM,CAAC,IAAS;QACd,OAAO,KAAK,CAAC,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAChD,CAAC;IAED,GAAG,CAAC,IAAS;QACX,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAC7C,CAAC;IAED,GAAG,CAAC,IAAS;QACX,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAC7C,CAAC;IAED,GAAG,CAAC,IAAS,EAAE,KAAU;QACvB,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,EAAE,KAAK,CAAC,CAAC;IACpD,CAAC;IAED,aAAa,CAAC,IAAS;QACrB,IAAI,GAAG,GAAG,IAAI,EAAE,CAAC;QACjB,KAAK,MAAM,GAAG,IAAI,IAAI,CAAC,IAAI,EAAE,EAAE;YAC7B,IAAI,GAAG,CAAC,WAAW,EAAE,KAAK,IAAI,CAAC,WAAW,EAAE,EAAE;gBAC5C,IAAI,GAAG,GAAG,CAAC;gBACX,MAAM;aACP;SACF;QACD,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AAED,MAAM,6BAA6B,GAAG,CAAC,IAA4B,EAAE,EAAE,CACrE,MAAM,6BAA8B,SAAQ,IAAI;IAC9C,MAAM,CAAC,IAAS,EAAE,KAAU;QAC1B,OAAO,KAAK,CAAC,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,EAAE,KAAK,CAAC,CAAC;IACvD,CAAC;IAED,MAAM,CAAC,IAAS;QACd,OAAO,KAAK,CAAC,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAChD,CAAC;IAED,GAAG,CAAC,IAAS;QACX,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAC7C,CAAC;IAED,MAAM,CAAC,IAAS;QACd,OAAO,KAAK,CAAC,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAChD,CAAC;IAED,GAAG,CAAC,IAAS;QACX,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,CAAC,CAAC;IAC7C,CAAC;IAED,GAAG,CAAC,IAAS,EAAE,KAAU;QACvB,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,CAAC,IAAI,CAAC,EAAE,KAAK,CAAC,CAAC;IACpD,CAAC;IAED,aAAa,CAAC,IAAS;QACrB,OAAO,kBAAkB,CAAC,SAAS,CAAC,aAAa,CAAC,IAAI,CAAC,IAAI,EAAE,IAAI,CAAC,CAAC;IACrE,CAAC;CACF,CAAC;AAGJ,MAAe,cAAe,SAAQ,gBAAG;CASxC;AAED,MAAM,eAAgB,SAAQ,KAAK;IACjC,IAAI,IAAI;QACN,OAAO,iBAAiB,CAAC;IAC3B,CAAC;CACF;AAMD,MAAqB,gBAAiB,SAAQ,cAAc;IAI1D,YAAY,GAAW;;QACrB,MAAM,KAAK,GAAG,GAAG,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;QACrC,IAAI,CAAC,KAAK,EAAE;YACV,MAAM,IAAI,eAAe,CAAC,8BAA8B,GAAG,GAAG,CAAC,CAAC;SACjE;QAED,MAAM,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,KAAK,EAAE,IAAI,EAAE,GAAG,MAAA,KAAK,CAAC,MAAM,mCAAI,EAAE,CAAC;QAEzE,IAAI,CAAC,QAAQ,IAAI,CAAC,KAAK,EAAE;YACvB,MAAM,IAAI,eAAe,CAAC,2CAA2C,GAAG,GAAG,CAAC,CAAC;SAC9E;QAED,IAAI;YACF,kBAAkB,CAAC,QAAQ,aAAR,QAAQ,cAAR,QAAQ,GAAI,EAAE,CAAC,CAAC;YACnC,kBAAkB,CAAC,QAAQ,aAAR,QAAQ,cAAR,QAAQ,GAAI,EAAE,CAAC,CAAC;SACpC;QAAC,OAAO,GAAG,EAAE;YACZ,MAAM,IAAI,eAAe,CAAE,GAAa,CAAC,OAAO,CAAC,CAAC;SACnD;QAGD,MAAM,iBAAiB,GAAG,IAAI,MAAM,CAAC,MAAM,CAAC,GAAG,CAAA,aAAa,EAAE,IAAI,CAAC,CAAC;QACpE,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,KAAK,CAAC,iBAAiB,CAAC,EAAE;YACtC,MAAM,IAAI,eAAe,CAAC,0CAA0C,QAAQ,EAAE,CAAC,CAAC;SACjF;QACD,IAAI,CAAC,QAAQ,IAAI,CAAC,QAAQ,EAAE;YAC1B,MAAM,kBAAkB,GAAG,GAAG,CAAC,OAAO,CAAC,GAAG,QAAQ,KAAK,EAAE,EAAE,CAAC,CAAC;YAC7D,IAAI,
kBAAkB,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,kBAAkB,CAAC,UAAU,CAAC,GAAG,CAAC,EAAE;gBAC5E,MAAM,IAAI,eAAe,CAAC,sCAAsC,CAAC,CAAC;aACnE;SACF;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,KAAK,CAAC,iBAAiB,CAAC,EAAE;YACtC,MAAM,IAAI,eAAe,CAAC,wCAAwC,CAAC,CAAC;SACrE;QAED,IAAI,UAAU,GAAG,EAAE,CAAC;QACpB,IAAI,OAAO,QAAQ,KAAK,QAAQ;YAAE,UAAU,IAAI,QAAQ,CAAC;QACzD,IAAI,OAAO,QAAQ,KAAK,QAAQ;YAAE,UAAU,IAAI,IAAI,QAAQ,EAAE,CAAC;QAC/D,IAAI,UAAU;YAAE,UAAU,IAAI,GAAG,CAAC;QAElC,KAAK,CAAC,GAAG,QAAQ,CAAC,WAAW,EAAE,MAAM,UAAU,GAAG,cAAc,GAAG,IAAI,EAAE,CAAC,CAAC;QAC3E,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;QAE/B,IAAI,IAAI,CAAC,KAAK,IAAI,IAAI,CAAC,KAAK,CAAC,MAAM,KAAK,CAAC,EAAE;YACzC,MAAM,IAAI,eAAe,CAAC,oDAAoD,CAAC,CAAC;SACjF;QACD,IAAI,IAAI,CAAC,KAAK,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,EAAE;YAC7D,MAAM,IAAI,eAAe,CAAC,yCAAyC,CAAC,CAAC;SACtE;QACD,IAAI,CAAC,IAAI,CAAC,QAAQ,EAAE;YAClB,IAAI,CAAC,QAAQ,GAAG,GAAG,CAAC;SACrB;QACD,MAAM,CAAC,cAAc,CAAC,IAAI,CAAC,YAAY,EAAE,6BAA6B,CAAC,IAAI,CAAC,YAAY,CAAC,WAAkB,CAAC,CAAC,SAAS,CAAC,CAAC;IAC1H,CAAC;IAKD,IAAI,IAAI,KAAY,OAAO,cAAuB,CAAC,CAAC,CAAC;IACrD,IAAI,IAAI,CAAC,QAAe,IAAI,MAAM,IAAI,KAAK,CAAC,sCAAsC,CAAC,CAAC,CAAC,CAAC;IACtF,IAAI,QAAQ,KAAY,OAAO,cAAuB,CAAC,CAAC,CAAC;IACzD,IAAI,QAAQ,CAAC,QAAe,IAAI,MAAM,IAAI,KAAK,CAAC,sCAAsC,CAAC,CAAC,CAAC,CAAC;IAC1F,IAAI,IAAI,KAAY,OAAO,EAAW,CAAC,CAAC,CAAC;IACzC,IAAI,IAAI,CAAC,QAAe,IAAI,MAAM,IAAI,KAAK,CAAC,sCAAsC,CAAC,CAAC,CAAC,CAAC;IACtF,IAAI,IAAI,KAAa,OAAO,IAAI,CAAC,QAAQ,EAAE,CAAC,CAAC,CAAC;IAC9C,IAAI,IAAI,CAAC,QAAgB,IAAI,MAAM,IAAI,KAAK,CAAC,wCAAwC,CAAC,CAAC,CAAC,CAAC;IAEzF,IAAI,KAAK;QACP,OAAO,IAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC;IACvC,CAAC;IAED,IAAI,KAAK;QACP,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;IAED,IAAI,KAAK,CAAC,IAAc;QACtB,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;IACrB,CAAC;IAED,QAAQ;QACN,OAAO,KAAK,CAAC,QAAQ,EAAE,CAAC,OAAO,CAAC,cAAc,EAAE,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC;IACxE,CAAC;IAED,KAAK;QACH,OAAO,IAAI,gBAAgB,CAAC,IAAI,CAAC,QAAQ,EAAE,CAAC,CAAC;IAC/C,CAAC;IAED,CAAC,MAAM,CAAC,GAAG,CAAC,4BAA4B,CAAC,CAAC;QACxC,MAAM,EAAE,IAAI,EAAE,MAAM,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,KAAK,EAAE,QAAQ,EAAE,MAAM,EAAE,YAAY,EAAE,IAAI,EAAE,GAAG,IAAI,CAAC;QACzG,OAAO,EAAE,IAAI,EAAE,MAAM,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,KAAK,EAAE,QAAQ,EAAE,MAAM,EAAE,YAAY,EAAE,IAAI,EAAE,CAAC;IACrG,CAAC;CACF;AA/FD,mCA+FC;AAMD,MAAa,4BAA6B,SAAQ,kBAAkB;IAClE,YAAY,IAAoB;QAC9B,KAAK,EAAE,CAAC;QACR,KAAK,MAAM,KAAK,IAAI,CAAC,IAAI,aAAJ,IAAI,cAAJ,IAAI,GAAI,EAAE,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE;YAC3C,IAAI,CAAC,KAAK;gBAAE,SAAS;YACrB,MAAM,UAAU,GAAG,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;YAEtC,IAAI,UAAU,KAAK,CAAC,CAAC,EAAE;gBACrB,IAAI,CAAC,GAAG,CAAC,KAAK,EAAE,EAAE,CAAC,CAAC;aACrB;iBAAM;gBACL,IAAI,CAAC,GAAG,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC,EAAE,UAAU,CAAC,EAAE,KAAK,CAAC,KAAK,CAAC,UAAU,GAAG,CAAC,CAAC,CAAC,CAAC;aACnE;SACF;IACH,CAAC;IAED,QAAQ;QACN,OAAO,CAAC,GAAG,IAAI,CAAC,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;IAC3D,CAAC;CACF;AAlBD,oEAkBC"} \ No newline at end of file diff --git a/node_modules/mongodb-connection-string-url/package.json b/node_modules/mongodb-connection-string-url/package.json new file mode 100644 index 000000000..89b91a6d9 --- /dev/null +++ b/node_modules/mongodb-connection-string-url/package.json @@ -0,0 +1,87 @@ +{ + "_from": "mongodb-connection-string-url@^2.0.0", + "_id": "mongodb-connection-string-url@2.1.0", + "_inBundle": false, + "_integrity": "sha512-Qf9Zw7KGiRljWvMrrUFDdVqo46KIEiDuCzvEN97rh/PcKzk2bd6n9KuzEwBwW9xo5glwx69y1mI6s+jFUD/aIQ==", + "_location": 
"/mongodb-connection-string-url", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "mongodb-connection-string-url@^2.0.0", + "name": "mongodb-connection-string-url", + "escapedName": "mongodb-connection-string-url", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/mongodb" + ], + "_resolved": "https://registry.npmjs.org/mongodb-connection-string-url/-/mongodb-connection-string-url-2.1.0.tgz", + "_shasum": "9c522c11c37f571fecddcb267ac4a76ef432aeb7", + "_spec": "mongodb-connection-string-url@^2.0.0", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongodb", + "bugs": { + "url": "https://github.com/mongodb-js/mongodb-connection-string-url/issues" + }, + "bundleDependencies": false, + "dependencies": { + "@types/whatwg-url": "^8.2.1", + "whatwg-url": "^9.1.0" + }, + "deprecated": false, + "description": "MongoDB connection strings, based on the WhatWG URL API", + "devDependencies": { + "@types/chai": "^4.2.5", + "@types/mocha": "^8.0.3", + "@types/node": "^14.11.1", + "@typescript-eslint/eslint-plugin": "^4.2.0", + "@typescript-eslint/parser": "^4.2.0", + "chai": "^4.2.0", + "eslint": "^7.9.0", + "eslint-config-semistandard": "^15.0.1", + "eslint-config-standard": "^14.1.1", + "eslint-plugin-import": "^2.22.0", + "eslint-plugin-node": "^11.1.0", + "eslint-plugin-promise": "^4.2.1", + "eslint-plugin-standard": "^4.0.1", + "gen-esm-wrapper": "^1.1.3", + "mocha": "^8.1.3", + "nyc": "^15.1.0", + "ts-node": "^9.0.0", + "typescript": "^4.0.3" + }, + "exports": { + "require": "./lib/index.js", + "import": "./.esm-wrapper.mjs" + }, + "files": [ + "LICENSE", + "lib", + "package.json", + "README.md", + ".esm-wrapper.mjs" + ], + "homepage": "https://github.com/mongodb-js/mongodb-connection-string-url", + "keywords": [ + "password", + "prompt", + "tty" + ], + "license": "Apache-2.0", + "main": "lib/index.js", + "name": "mongodb-connection-string-url", + "repository": { + "type": "git", + "url": "git+https://github.com/mongodb-js/mongodb-connection-string-url.git" + }, + "scripts": { + "build": "npm run compile-ts && gen-esm-wrapper . ./.esm-wrapper.mjs", + "compile-ts": "tsc -p tsconfig.json", + "lint": "eslint \"{src,test}/**/*.ts\"", + "prepack": "npm run build", + "test": "npm run lint && npm run build && nyc mocha --colors -r ts-node/register test/*.ts" + }, + "version": "2.1.0" +} diff --git a/node_modules/mongodb/LICENSE.md b/node_modules/mongodb/LICENSE.md new file mode 100644 index 000000000..ad410e113 --- /dev/null +++ b/node_modules/mongodb/LICENSE.md @@ -0,0 +1,201 @@ +Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. 
+ + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). + + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. 
If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. 
Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "{}" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright {yyyy} {name of copyright owner} + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. \ No newline at end of file diff --git a/node_modules/mongodb/README.md b/node_modules/mongodb/README.md new file mode 100644 index 000000000..7100cce86 --- /dev/null +++ b/node_modules/mongodb/README.md @@ -0,0 +1,278 @@ +# MongoDB NodeJS Driver + +The official [MongoDB](https://www.mongodb.com/) driver for Node.js. + +**Upgrading to version 4? 
Take a look at our [upgrade guide here](https://github.com/mongodb/node-mongodb-native/blob/4.1/docs/CHANGES_4.0.0.md)!** + +## Quick Links + +| what | where | +| ------------- | ------------------------------------------------------------------------------------------------------ | +| documentation | [docs.mongodb.com/drivers/node](https://docs.mongodb.com/drivers/node) | +| api-doc | [mongodb.github.io/node-mongodb-native/4.1/](https://mongodb.github.io/node-mongodb-native/4.1/) | +| npm package | [www.npmjs.com/package/mongodb](https://www.npmjs.com/package/mongodb) | +| source | [github.com/mongodb/node-mongodb-native](https://github.com/mongodb/node-mongodb-native) | +| mongodb | [www.mongodb.com](https://www.mongodb.com) | +| changelog | [HISTORY.md](https://github.com/mongodb/node-mongodb-native/blob/4.1/HISTORY.md) | +| upgrade to v4 | [docs/CHANGES_4.0.0.md](https://github.com/mongodb/node-mongodb-native/blob/4.1/docs/CHANGES_4.0.0.md) | + +### Bugs / Feature Requests + +Think you’ve found a bug? Want to see a new feature in `node-mongodb-native`? Please open a +case in our issue management tool, JIRA: + +- Create an account and login [jira.mongodb.org](https://jira.mongodb.org). +- Navigate to the NODE project [jira.mongodb.org/browse/NODE](https://jira.mongodb.org/browse/NODE). +- Click **Create Issue** - Please provide as much information as possible about the issue type and how to reproduce it. + +Bug reports in JIRA for all driver projects (i.e. NODE, PYTHON, CSHARP, JAVA) and the +Core Server (i.e. SERVER) project are **public**. + +### Support / Feedback + +For issues with, questions about, or feedback for the Node.js driver, please look into our [support channels](https://docs.mongodb.com/manual/support). Please do not email any of the driver developers directly with issues or questions - you're more likely to get an answer on the [MongoDB Community Forums](https://community.mongodb.com/tags/c/drivers-odms-connectors/7/node-js-driver). + +### Change Log + +Change history can be found in [`HISTORY.md`](https://github.com/mongodb/node-mongodb-native/blob/4.1/HISTORY.md). + +### Compatibility + +For version compatibility matrices, please refer to the following links: + +- [MongoDB](https://docs.mongodb.com/drivers/node/compatibility#mongodb-compatibility) +- [NodeJS](https://docs.mongodb.com/drivers/node/compatibility#language-compatibility) + +#### Typescript Version + +We recommend using the latest version of typescript, however we do provide a [downleveled](https://github.com/sandersn/downlevel-dts#readme) version of the type definitions that we test compiling against `typescript@4.0.2`. +Since typescript [does not restrict breaking changes to major versions](https://github.com/Microsoft/TypeScript/wiki/Breaking-Changes) we consider this support best effort. +If you run into any unexpected compiler failures please let us know and we will do our best to correct it. + +## Installation + +The recommended way to get started using the Node.js 4.x driver is by using the `npm` (Node Package Manager) to install the dependency in your project. + +After you've created your own project using `npm init`, you can run: + +```bash +npm install mongodb +# or ... +yarn add mongodb +``` + +This will download the MongoDB driver and add a dependency entry in your `package.json` file. 
+ +If you are a Typescript user, you will need the Node.js type definitions to use the driver's definitions: + +```sh +npm install -D @types/node +``` + +## Troubleshooting + +The MongoDB driver depends on several other packages. These are: + +- [bson](https://github.com/mongodb/js-bson) +- [bson-ext](https://github.com/mongodb-js/bson-ext) +- [kerberos](https://github.com/mongodb-js/kerberos) +- [mongodb-client-encryption](https://github.com/mongodb/libmongocrypt#readme) + +Some of these packages include native C++ extensions, consult the [trouble shooting guide here](https://github.com/mongodb/node-mongodb-native/blob/4.1/docs/native-extensions.md) if you run into issues. + +## Quick Start + +This guide will show you how to set up a simple application using Node.js and MongoDB. Its scope is only how to set up the driver and perform the simple CRUD operations. For more in-depth coverage, see the [official documentation](https://docs.mongodb.com/drivers/node/). + +### Create the `package.json` file + +First, create a directory where your application will live. + +```bash +mkdir myProject +cd myProject +``` + +Enter the following command and answer the questions to create the initial structure for your new project: + +```bash +npm init -y +``` + +Next, install the driver as a dependency. + +```bash +npm install mongodb +``` + +### Start a MongoDB Server + +For complete MongoDB installation instructions, see [the manual](https://docs.mongodb.org/manual/installation/). + +1. Download the right MongoDB version from [MongoDB](https://www.mongodb.org/downloads) +2. Create a database directory (in this case under **/data**). +3. Install and start a `mongod` process. + +```bash +mongod --dbpath=/data +``` + +You should see the **mongod** process start up and print some status information. + +### Connect to MongoDB + +Create a new **app.js** file and add the following code to try out some basic CRUD +operations using the MongoDB driver. + +Add code to connect to the server and the database **myProject**: + +> **NOTE:** All the examples below use async/await syntax. +> +> However, all async API calls support an optional callback as the final argument, +> if a callback is provided a Promise will not be returned. + +```js +const { MongoClient } = require('mongodb'); +// or as an es module: +// import { MongoClient } from 'mongodb' + +// Connection URL +const url = 'mongodb://localhost:27017'; +const client = new MongoClient(url); + +// Database Name +const dbName = 'myProject'; + +async function main() { + // Use connect method to connect to the server + await client.connect(); + console.log('Connected successfully to server'); + const db = client.db(dbName); + const collection = db.collection('documents'); + + // the following code examples can be pasted here... + + return 'done.'; +} + +main() + .then(console.log) + .catch(console.error) + .finally(() => client.close()); +``` + +Run your app from the command line with: + +```bash +node app.js +``` + +The application should print **Connected successfully to server** to the console. + +### Insert a Document + +Add to **app.js** the following function which uses the **insertMany** +method to add three documents to the **documents** collection. + +```js +const insertResult = await collection.insertMany([{ a: 1 }, { a: 2 }, { a: 3 }]); +console.log('Inserted documents =>', insertResult); +``` + +The **insertMany** command returns an object with information about the insert operations. 
+ +### Find All Documents + +Add a query that returns all the documents. + +```js +const findResult = await collection.find({}).toArray(); +console.log('Found documents =>', findResult); +``` + +This query returns all the documents in the **documents** collection. +If you add this below the insertMany example you'll see the document's you've inserted. + +### Find Documents with a Query Filter + +Add a query filter to find only documents which meet the query criteria. + +```js +const filteredDocs = await collection.find({ a: 3 }).toArray(); +console.log('Found documents filtered by { a: 3 } =>', filteredDocs); +``` + +Only the documents which match `'a' : 3` should be returned. + +### Update a document + +The following operation updates a document in the **documents** collection. + +```js +const updateResult = await collection.updateOne({ a: 3 }, { $set: { b: 1 } }); +console.log('Updated documents =>', updateResult); +``` + +The method updates the first document where the field **a** is equal to **3** by adding a new field **b** to the document set to **1**. `updateResult` contains information about whether there was a matching document to update or not. + +### Remove a document + +Remove the document where the field **a** is equal to **3**. + +```js +const deleteResult = await collection.deleteMany({ a: 3 }); +console.log('Deleted documents =>', deleteResult); +``` + +### Index a Collection + +[Indexes](https://docs.mongodb.org/manual/indexes/) can improve your application's +performance. The following function creates an index on the **a** field in the +**documents** collection. + +```js +const indexName = await collection.createIndex({ a: 1 }); +console.log('index name =', indexName); +``` + +For more detailed information, see the [indexing strategies page](https://docs.mongodb.com/manual/applications/indexes/). + +## Error Handling + +If you need to filter certain errors from our driver we have a helpful tree of errors described in [docs/errors.md](https://github.com/mongodb/node-mongodb-native/blob/4.1/docs/errors.md). + +It is our recommendation to use `instanceof` checks on errors and to avoid relying on parsing `error.message` and `error.name` strings in your code. +We guarantee `instanceof` checks will pass according to semver guidelines, but errors may be sub-classed or their messages may change at any time, even patch releases, as we see fit to increase the helpfulness of the errors. + +Any new errors we add to the driver will directly extend an existing error class and no existing error will be moved to a different parent class outside of a major release. +This means `instanceof` will always be able to accurately capture the errors that our driver throws (or returns in a callback). 
+ +```typescript +const client = new MongoClient(url); +await client.connect(); +const collection = client.db().collection('collection'); + +try { + await collection.insertOne({ _id: 1 }); + await collection.insertOne({ _id: 1 }); // duplicate key error +} catch (error) { + if (error instanceof MongoServerError) { + console.log(`Error worth logging: ${error}`); // special case for some reason + } + throw error; // still want to crash +} +``` + +## Next Steps + +- [MongoDB Documentation](https://docs.mongodb.com/manual/) +- [MongoDB Node Driver Documentation](https://docs.mongodb.com/drivers/node/) +- [Read about Schemas](https://docs.mongodb.com/manual/core/data-modeling-introduction/) +- [Star us on GitHub](https://github.com/mongodb/node-mongodb-native) + +## License + +[Apache 2.0](LICENSE.md) + +© 2009-2012 Christian Amor Kvalheim +© 2012-present MongoDB [Contributors](https://github.com/mongodb/node-mongodb-native/blob/4.1/CONTRIBUTORS.md) diff --git a/node_modules/mongodb/etc/prepare.js b/node_modules/mongodb/etc/prepare.js new file mode 100644 index 000000000..2039d0b33 --- /dev/null +++ b/node_modules/mongodb/etc/prepare.js @@ -0,0 +1,12 @@ +#! /usr/bin/env node +var cp = require('child_process'); +var fs = require('fs'); +var os = require('os'); + +if (fs.existsSync('src')) { + cp.spawn('npm', ['run', 'build:dts'], { stdio: 'inherit', shell: os.platform() === 'win32' }); +} else { + if (!fs.existsSync('lib')) { + console.warn('MongoDB: No compiled javascript present, the driver is not installed correctly.'); + } +} diff --git a/node_modules/mongodb/lib/admin.js b/node_modules/mongodb/lib/admin.js new file mode 100644 index 000000000..09a59befb --- /dev/null +++ b/node_modules/mongodb/lib/admin.js @@ -0,0 +1,124 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Admin = void 0; +const add_user_1 = require("./operations/add_user"); +const remove_user_1 = require("./operations/remove_user"); +const validate_collection_1 = require("./operations/validate_collection"); +const list_databases_1 = require("./operations/list_databases"); +const execute_operation_1 = require("./operations/execute_operation"); +const run_command_1 = require("./operations/run_command"); +const utils_1 = require("./utils"); +/** + * The **Admin** class is an internal class that allows convenient access to + * the admin functionality and commands for MongoDB. 
+ * + * **ADMIN Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Use the admin database for the operation + * const adminDb = client.db(dbName).admin(); + * + * // List all the available databases + * adminDb.listDatabases(function(err, dbs) { + * expect(err).to.not.exist; + * test.ok(dbs.databases.length > 0); + * client.close(); + * }); + * }); + * ``` + */ +class Admin { + /** + * Create a new Admin instance + * @internal + */ + constructor(db) { + this.s = { db }; + } + command(command, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = Object.assign({ dbName: 'admin' }, options); + return execute_operation_1.executeOperation(utils_1.getTopology(this.s.db), new run_command_1.RunCommandOperation(this.s.db, command, options), callback); + } + buildInfo(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.command({ buildinfo: 1 }, options, callback); + } + serverInfo(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.command({ buildinfo: 1 }, options, callback); + } + serverStatus(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.command({ serverStatus: 1 }, options, callback); + } + ping(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.command({ ping: 1 }, options, callback); + } + addUser(username, password, options, callback) { + if (typeof password === 'function') { + (callback = password), (password = undefined), (options = {}); + } + else if (typeof password !== 'string') { + if (typeof options === 'function') { + (callback = options), (options = password), (password = undefined); + } + else { + (options = password), (callback = undefined), (password = undefined); + } + } + else { + if (typeof options === 'function') + (callback = options), (options = {}); + } + options = Object.assign({ dbName: 'admin' }, options); + return execute_operation_1.executeOperation(utils_1.getTopology(this.s.db), new add_user_1.AddUserOperation(this.s.db, username, password, options), callback); + } + removeUser(username, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = Object.assign({ dbName: 'admin' }, options); + return execute_operation_1.executeOperation(utils_1.getTopology(this.s.db), new remove_user_1.RemoveUserOperation(this.s.db, username, options), callback); + } + validateCollection(collectionName, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? 
options : {}; + return execute_operation_1.executeOperation(utils_1.getTopology(this.s.db), new validate_collection_1.ValidateCollectionOperation(this, collectionName, options), callback); + } + listDatabases(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return execute_operation_1.executeOperation(utils_1.getTopology(this.s.db), new list_databases_1.ListDatabasesOperation(this.s.db, options), callback); + } + replSetGetStatus(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.command({ replSetGetStatus: 1 }, options, callback); + } +} +exports.Admin = Admin; +//# sourceMappingURL=admin.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/admin.js.map b/node_modules/mongodb/lib/admin.js.map new file mode 100644 index 000000000..c782035b1 --- /dev/null +++ b/node_modules/mongodb/lib/admin.js.map @@ -0,0 +1 @@ +{"version":3,"file":"admin.js","sourceRoot":"","sources":["../src/admin.ts"],"names":[],"mappings":";;;AAAA,oDAAyE;AACzE,0DAAkF;AAClF,0EAG0C;AAC1C,gEAIqC;AACrC,sEAAkE;AAClE,0DAAkF;AAClF,mCAAgD;AAUhD;;;;;;;;;;;;;;;;;;;;;;;;;;;;;GA6BG;AACH,MAAa,KAAK;IAIhB;;;OAGG;IACH,YAAY,EAAM;QAChB,IAAI,CAAC,CAAC,GAAG,EAAE,EAAE,EAAE,CAAC;IAClB,CAAC;IAaD,OAAO,CACL,OAAiB,EACjB,OAAgD,EAChD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,MAAM,EAAE,OAAO,EAAE,EAAE,OAAO,CAAC,CAAC;QAEtD,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EACtB,IAAI,iCAAmB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,OAAO,EAAE,OAAO,CAAC,EACpD,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,SAAS,CACP,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,OAAO,IAAI,CAAC,OAAO,CAAC,EAAE,SAAS,EAAE,CAAC,EAAE,EAAE,OAAO,EAAE,QAA8B,CAAC,CAAC;IACjF,CAAC;IAYD,UAAU,CACR,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,OAAO,IAAI,CAAC,OAAO,CAAC,EAAE,SAAS,EAAE,CAAC,EAAE,EAAE,OAAO,EAAE,QAA8B,CAAC,CAAC;IACjF,CAAC;IAYD,YAAY,CACV,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,OAAO,IAAI,CAAC,OAAO,CAAC,EAAE,YAAY,EAAE,CAAC,EAAE,EAAE,OAAO,EAAE,QAA8B,CAAC,CAAC;IACpF,CAAC;IAYD,IAAI,CACF,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,OAAO,IAAI,CAAC,OAAO,CAAC,EAAE,IAAI,EAAE,CAAC,EAAE,EAAE,OAAO,EAAE,QAA8B,CAAC,CAAC;IAC5E,CAAC;IAuBD,OAAO,CACL,QAAgB,EAChB,QAAuD,EACvD,OAA6C,EAC7C,QAA6B;QAE7B,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,CAAC,QAAQ,GAAG,QAAQ,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SAC/D;aAAM,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;YACvC,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;gBACjC,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,QAAQ,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,CAAC;aACpE;iBAAM;gBACL,CAAC,OAAO,GAAG,QAAQ,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,CAAC;aACtE;SACF;aAAM;YACL,IAAI,OAAO,OAAO,KAAK,UAAU;gBAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SACzE;QAED,OAAO,GAAG,MAAM,
CAAC,MAAM,CAAC,EAAE,MAAM,EAAE,OAAO,EAAE,EAAE,OAAO,CAAC,CAAC;QAEtD,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EACtB,IAAI,2BAAgB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,QAAQ,EAAE,QAAQ,EAAE,OAAO,CAAC,EAC5D,QAAQ,CACT,CAAC;IACJ,CAAC;IAaD,UAAU,CACR,QAAgB,EAChB,OAA+C,EAC/C,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,MAAM,EAAE,OAAO,EAAE,EAAE,OAAO,CAAC,CAAC;QAEtD,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EACtB,IAAI,iCAAmB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,CAAC,EACrD,QAAQ,CACT,CAAC;IACJ,CAAC;IAiBD,kBAAkB,CAChB,cAAsB,EACtB,OAAwD,EACxD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EACtB,IAAI,iDAA2B,CAAC,IAAI,EAAE,cAAc,EAAE,OAAO,CAAC,EAC9D,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,aAAa,CACX,OAA8D,EAC9D,QAAwC;QAExC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EACtB,IAAI,uCAAsB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,OAAO,CAAC,EAC9C,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,gBAAgB,CACd,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,OAAO,IAAI,CAAC,OAAO,CAAC,EAAE,gBAAgB,EAAE,CAAC,EAAE,EAAE,OAAO,EAAE,QAA8B,CAAC,CAAC;IACxF,CAAC;CACF;AApQD,sBAoQC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/bson.js b/node_modules/mongodb/lib/bson.js new file mode 100644 index 000000000..74385c8a9 --- /dev/null +++ b/node_modules/mongodb/lib/bson.js @@ -0,0 +1,67 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.resolveBSONOptions = exports.pluckBSONSerializeOptions = exports.Map = exports.BSONSymbol = exports.BSONRegExp = exports.DBRef = exports.Double = exports.Int32 = exports.Decimal128 = exports.MaxKey = exports.MinKey = exports.Code = exports.Timestamp = exports.ObjectId = exports.Binary = exports.Long = exports.calculateObjectSize = exports.serialize = exports.deserialize = void 0; +// eslint-disable-next-line @typescript-eslint/no-var-requires +let BSON = require('bson'); +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + BSON = require('bson-ext'); +} +catch { } // eslint-disable-line +/** @internal */ +exports.deserialize = BSON.deserialize; +/** @internal */ +exports.serialize = BSON.serialize; +/** @internal */ +exports.calculateObjectSize = BSON.calculateObjectSize; +var bson_1 = require("bson"); +Object.defineProperty(exports, "Long", { enumerable: true, get: function () { return bson_1.Long; } }); +Object.defineProperty(exports, "Binary", { enumerable: true, get: function () { return bson_1.Binary; } }); +Object.defineProperty(exports, "ObjectId", { enumerable: true, get: function () { return bson_1.ObjectId; } }); +Object.defineProperty(exports, "Timestamp", { enumerable: true, get: function () { return bson_1.Timestamp; } }); +Object.defineProperty(exports, "Code", { enumerable: true, get: function () { return bson_1.Code; } }); +Object.defineProperty(exports, "MinKey", { enumerable: true, get: function () { return bson_1.MinKey; } }); +Object.defineProperty(exports, "MaxKey", { enumerable: true, get: function () { return bson_1.MaxKey; } 
}); +Object.defineProperty(exports, "Decimal128", { enumerable: true, get: function () { return bson_1.Decimal128; } }); +Object.defineProperty(exports, "Int32", { enumerable: true, get: function () { return bson_1.Int32; } }); +Object.defineProperty(exports, "Double", { enumerable: true, get: function () { return bson_1.Double; } }); +Object.defineProperty(exports, "DBRef", { enumerable: true, get: function () { return bson_1.DBRef; } }); +Object.defineProperty(exports, "BSONRegExp", { enumerable: true, get: function () { return bson_1.BSONRegExp; } }); +Object.defineProperty(exports, "BSONSymbol", { enumerable: true, get: function () { return bson_1.BSONSymbol; } }); +Object.defineProperty(exports, "Map", { enumerable: true, get: function () { return bson_1.Map; } }); +function pluckBSONSerializeOptions(options) { + const { fieldsAsRaw, promoteValues, promoteBuffers, promoteLongs, serializeFunctions, ignoreUndefined, bsonRegExp, raw } = options; + return { + fieldsAsRaw, + promoteValues, + promoteBuffers, + promoteLongs, + serializeFunctions, + ignoreUndefined, + bsonRegExp, + raw + }; +} +exports.pluckBSONSerializeOptions = pluckBSONSerializeOptions; +/** + * Merge the given BSONSerializeOptions, preferring options over the parent's options, and + * substituting defaults for values not set. + * + * @internal + */ +function resolveBSONOptions(options, parent) { + var _a, _b, _c, _d, _e, _f, _g, _h, _j, _k, _l, _m, _o, _p, _q, _r; + const parentOptions = parent === null || parent === void 0 ? void 0 : parent.bsonOptions; + return { + raw: (_b = (_a = options === null || options === void 0 ? void 0 : options.raw) !== null && _a !== void 0 ? _a : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.raw) !== null && _b !== void 0 ? _b : false, + promoteLongs: (_d = (_c = options === null || options === void 0 ? void 0 : options.promoteLongs) !== null && _c !== void 0 ? _c : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.promoteLongs) !== null && _d !== void 0 ? _d : true, + promoteValues: (_f = (_e = options === null || options === void 0 ? void 0 : options.promoteValues) !== null && _e !== void 0 ? _e : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.promoteValues) !== null && _f !== void 0 ? _f : true, + promoteBuffers: (_h = (_g = options === null || options === void 0 ? void 0 : options.promoteBuffers) !== null && _g !== void 0 ? _g : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.promoteBuffers) !== null && _h !== void 0 ? _h : false, + ignoreUndefined: (_k = (_j = options === null || options === void 0 ? void 0 : options.ignoreUndefined) !== null && _j !== void 0 ? _j : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.ignoreUndefined) !== null && _k !== void 0 ? _k : false, + bsonRegExp: (_m = (_l = options === null || options === void 0 ? void 0 : options.bsonRegExp) !== null && _l !== void 0 ? _l : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.bsonRegExp) !== null && _m !== void 0 ? _m : false, + serializeFunctions: (_p = (_o = options === null || options === void 0 ? void 0 : options.serializeFunctions) !== null && _o !== void 0 ? _o : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.serializeFunctions) !== null && _p !== void 0 ? _p : false, + fieldsAsRaw: (_r = (_q = options === null || options === void 0 ? void 0 : options.fieldsAsRaw) !== null && _q !== void 0 ? 
_q : parentOptions === null || parentOptions === void 0 ? void 0 : parentOptions.fieldsAsRaw) !== null && _r !== void 0 ? _r : {} + }; +} +exports.resolveBSONOptions = resolveBSONOptions; +//# sourceMappingURL=bson.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/bson.js.map b/node_modules/mongodb/lib/bson.js.map new file mode 100644 index 000000000..2f9bfd70b --- /dev/null +++ b/node_modules/mongodb/lib/bson.js.map @@ -0,0 +1 @@ +{"version":3,"file":"bson.js","sourceRoot":"","sources":["../src/bson.ts"],"names":[],"mappings":";;;AAMA,8DAA8D;AAC9D,IAAI,IAAI,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC;AAC3B,IAAI;IACF,wEAAwE;IACxE,IAAI,GAAG,OAAO,CAAC,UAAU,CAAC,CAAC;CAC5B;AAAC,MAAM,GAAE,CAAC,sBAAsB;AAEjC,gBAAgB;AACH,QAAA,WAAW,GAAG,IAAI,CAAC,WAAmC,CAAC;AACpE,gBAAgB;AACH,QAAA,SAAS,GAAG,IAAI,CAAC,SAA+B,CAAC;AAC9D,gBAAgB;AACH,QAAA,mBAAmB,GAAG,IAAI,CAAC,mBAAmD,CAAC;AAE5F,6BAgBc;AAfZ,4FAAA,IAAI,OAAA;AACJ,8FAAA,MAAM,OAAA;AACN,gGAAA,QAAQ,OAAA;AACR,iGAAA,SAAS,OAAA;AACT,4FAAA,IAAI,OAAA;AACJ,8FAAA,MAAM,OAAA;AACN,8FAAA,MAAM,OAAA;AACN,kGAAA,UAAU,OAAA;AACV,6FAAA,KAAK,OAAA;AACL,8FAAA,MAAM,OAAA;AACN,6FAAA,KAAK,OAAA;AACL,kGAAA,UAAU,OAAA;AACV,kGAAA,UAAU,OAAA;AACV,2FAAA,GAAG,OAAA;AAwBL,SAAgB,yBAAyB,CAAC,OAA6B;IACrE,MAAM,EACJ,WAAW,EACX,aAAa,EACb,cAAc,EACd,YAAY,EACZ,kBAAkB,EAClB,eAAe,EACf,UAAU,EACV,GAAG,EACJ,GAAG,OAAO,CAAC;IACZ,OAAO;QACL,WAAW;QACX,aAAa;QACb,cAAc;QACd,YAAY;QACZ,kBAAkB;QAClB,eAAe;QACf,UAAU;QACV,GAAG;KACJ,CAAC;AACJ,CAAC;AArBD,8DAqBC;AAED;;;;;GAKG;AACH,SAAgB,kBAAkB,CAChC,OAA8B,EAC9B,MAA+C;;IAE/C,MAAM,aAAa,GAAG,MAAM,aAAN,MAAM,uBAAN,MAAM,CAAE,WAAW,CAAC;IAC1C,OAAO;QACL,GAAG,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,GAAG,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,GAAG,mCAAI,KAAK;QAChD,YAAY,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,YAAY,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,YAAY,mCAAI,IAAI;QAC1E,aAAa,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,aAAa,mCAAI,IAAI;QAC7E,cAAc,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,cAAc,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,cAAc,mCAAI,KAAK;QACjF,eAAe,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,eAAe,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,eAAe,mCAAI,KAAK;QACpF,UAAU,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,UAAU,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,UAAU,mCAAI,KAAK;QACrE,kBAAkB,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,kBAAkB,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,kBAAkB,mCAAI,KAAK;QAC7F,WAAW,EAAE,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,WAAW,mCAAI,aAAa,aAAb,aAAa,uBAAb,aAAa,CAAE,WAAW,mCAAI,EAAE;KACtE,CAAC;AACJ,CAAC;AAfD,gDAeC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/bulk/common.js b/node_modules/mongodb/lib/bulk/common.js new file mode 100644 index 000000000..19550413b --- /dev/null +++ b/node_modules/mongodb/lib/bulk/common.js @@ -0,0 +1,941 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.BulkOperationBase = exports.FindOperators = exports.MongoBulkWriteError = exports.WriteError = exports.WriteConcernError = exports.BulkWriteResult = exports.Batch = exports.BatchType = void 0; +const promise_provider_1 = require("../promise_provider"); +const bson_1 = require("../bson"); +const error_1 = require("../error"); +const utils_1 = require("../utils"); +const execute_operation_1 = require("../operations/execute_operation"); +const insert_1 = require("../operations/insert"); +const update_1 = require("../operations/update"); +const delete_1 = require("../operations/delete"); +const write_concern_1 = require("../write_concern"); +/** @internal */ +const 
kServerError = Symbol('serverError'); +/** @public */ +exports.BatchType = Object.freeze({ + INSERT: 1, + UPDATE: 2, + DELETE: 3 +}); +/** + * Keeps the state of a unordered batch so we can rewrite the results + * correctly after command execution + * + * @public + */ +class Batch { + constructor(batchType, originalZeroIndex) { + this.originalZeroIndex = originalZeroIndex; + this.currentIndex = 0; + this.originalIndexes = []; + this.batchType = batchType; + this.operations = []; + this.size = 0; + this.sizeBytes = 0; + } +} +exports.Batch = Batch; +/** + * @public + * The result of a bulk write. + */ +class BulkWriteResult { + /** + * Create a new BulkWriteResult instance + * @internal + */ + constructor(bulkResult) { + this.result = bulkResult; + } + /** Number of documents inserted. */ + get insertedCount() { + var _a; + return (_a = this.result.nInserted) !== null && _a !== void 0 ? _a : 0; + } + /** Number of documents matched for update. */ + get matchedCount() { + var _a; + return (_a = this.result.nMatched) !== null && _a !== void 0 ? _a : 0; + } + /** Number of documents modified. */ + get modifiedCount() { + var _a; + return (_a = this.result.nModified) !== null && _a !== void 0 ? _a : 0; + } + /** Number of documents deleted. */ + get deletedCount() { + var _a; + return (_a = this.result.nRemoved) !== null && _a !== void 0 ? _a : 0; + } + /** Number of documents upserted. */ + get upsertedCount() { + var _a; + return (_a = this.result.upserted.length) !== null && _a !== void 0 ? _a : 0; + } + /** Upserted document generated Id's, hash key is the index of the originating operation */ + get upsertedIds() { + var _a; + const upserted = {}; + for (const doc of (_a = this.result.upserted) !== null && _a !== void 0 ? _a : []) { + upserted[doc.index] = doc._id; + } + return upserted; + } + /** Inserted document generated Id's, hash key is the index of the originating operation */ + get insertedIds() { + var _a; + const inserted = {}; + for (const doc of (_a = this.result.insertedIds) !== null && _a !== void 0 ? 
_a : []) { + inserted[doc.index] = doc._id; + } + return inserted; + } + /** Evaluates to true if the bulk operation correctly executes */ + get ok() { + return this.result.ok; + } + /** The number of inserted documents */ + get nInserted() { + return this.result.nInserted; + } + /** Number of upserted documents */ + get nUpserted() { + return this.result.nUpserted; + } + /** Number of matched documents */ + get nMatched() { + return this.result.nMatched; + } + /** Number of documents updated physically on disk */ + get nModified() { + return this.result.nModified; + } + /** Number of removed documents */ + get nRemoved() { + return this.result.nRemoved; + } + /** Returns an array of all inserted ids */ + getInsertedIds() { + return this.result.insertedIds; + } + /** Returns an array of all upserted ids */ + getUpsertedIds() { + return this.result.upserted; + } + /** Returns the upserted id at the given index */ + getUpsertedIdAt(index) { + return this.result.upserted[index]; + } + /** Returns raw internal result */ + getRawResponse() { + return this.result; + } + /** Returns true if the bulk operation contains a write error */ + hasWriteErrors() { + return this.result.writeErrors.length > 0; + } + /** Returns the number of write errors off the bulk operation */ + getWriteErrorCount() { + return this.result.writeErrors.length; + } + /** Returns a specific write error object */ + getWriteErrorAt(index) { + if (index < this.result.writeErrors.length) { + return this.result.writeErrors[index]; + } + } + /** Retrieve all write errors */ + getWriteErrors() { + return this.result.writeErrors; + } + /** Retrieve lastOp if available */ + getLastOp() { + return this.result.opTime; + } + /** Retrieve the write concern error if one exists */ + getWriteConcernError() { + if (this.result.writeConcernErrors.length === 0) { + return; + } + else if (this.result.writeConcernErrors.length === 1) { + // Return the error + return this.result.writeConcernErrors[0]; + } + else { + // Combine the errors + let errmsg = ''; + for (let i = 0; i < this.result.writeConcernErrors.length; i++) { + const err = this.result.writeConcernErrors[i]; + errmsg = errmsg + err.errmsg; + // TODO: Something better + if (i === 0) + errmsg = errmsg + ' and '; + } + return new WriteConcernError({ errmsg, code: error_1.MONGODB_ERROR_CODES.WriteConcernFailed }); + } + } + toJSON() { + return this.result; + } + toString() { + return `BulkWriteResult(${this.toJSON()})`; + } + isOk() { + return this.result.ok === 1; + } +} +exports.BulkWriteResult = BulkWriteResult; +/** + * An error representing a failure by the server to apply the requested write concern to the bulk operation. + * @public + * @category Error + */ +class WriteConcernError { + constructor(error) { + this[kServerError] = error; + } + /** Write concern error code. */ + get code() { + return this[kServerError].code; + } + /** Write concern error message. */ + get errmsg() { + return this[kServerError].errmsg; + } + /** Write concern error info. */ + get errInfo() { + return this[kServerError].errInfo; + } + /** @deprecated The `err` prop that contained a MongoServerError has been deprecated. */ + get err() { + return this[kServerError]; + } + toJSON() { + return this[kServerError]; + } + toString() { + return `WriteConcernError(${this.errmsg})`; + } +} +exports.WriteConcernError = WriteConcernError; +/** + * An error that occurred during a BulkWrite on the server. 
+ * @public + * @category Error + */ +class WriteError { + constructor(err) { + this.err = err; + } + /** WriteError code. */ + get code() { + return this.err.code; + } + /** WriteError original bulk operation index. */ + get index() { + return this.err.index; + } + /** WriteError message. */ + get errmsg() { + return this.err.errmsg; + } + /** WriteError details. */ + get errInfo() { + return this.err.errInfo; + } + /** Returns the underlying operation that caused the error */ + getOperation() { + return this.err.op; + } + toJSON() { + return { code: this.err.code, index: this.err.index, errmsg: this.err.errmsg, op: this.err.op }; + } + toString() { + return `WriteError(${JSON.stringify(this.toJSON())})`; + } +} +exports.WriteError = WriteError; +/** Merges results into shared data structure */ +function mergeBatchResults(batch, bulkResult, err, result) { + // If we have an error set the result to be the err object + if (err) { + result = err; + } + else if (result && result.result) { + result = result.result; + } + if (result == null) { + return; + } + // Do we have a top level error stop processing and return + if (result.ok === 0 && bulkResult.ok === 1) { + bulkResult.ok = 0; + const writeError = { + index: 0, + code: result.code || 0, + errmsg: result.message, + errInfo: result.errInfo, + op: batch.operations[0] + }; + bulkResult.writeErrors.push(new WriteError(writeError)); + return; + } + else if (result.ok === 0 && bulkResult.ok === 0) { + return; + } + // Deal with opTime if available + if (result.opTime || result.lastOp) { + const opTime = result.lastOp || result.opTime; + let lastOpTS = null; + let lastOpT = null; + // We have a time stamp + if (opTime && opTime._bsontype === 'Timestamp') { + if (bulkResult.opTime == null) { + bulkResult.opTime = opTime; + } + else if (opTime.greaterThan(bulkResult.opTime)) { + bulkResult.opTime = opTime; + } + } + else { + // Existing TS + if (bulkResult.opTime) { + lastOpTS = + typeof bulkResult.opTime.ts === 'number' + ? bson_1.Long.fromNumber(bulkResult.opTime.ts) + : bulkResult.opTime.ts; + lastOpT = + typeof bulkResult.opTime.t === 'number' + ? bson_1.Long.fromNumber(bulkResult.opTime.t) + : bulkResult.opTime.t; + } + // Current OpTime TS + const opTimeTS = typeof opTime.ts === 'number' ? bson_1.Long.fromNumber(opTime.ts) : opTime.ts; + const opTimeT = typeof opTime.t === 'number' ? 
bson_1.Long.fromNumber(opTime.t) : opTime.t; + // Compare the opTime's + if (bulkResult.opTime == null) { + bulkResult.opTime = opTime; + } + else if (opTimeTS.greaterThan(lastOpTS)) { + bulkResult.opTime = opTime; + } + else if (opTimeTS.equals(lastOpTS)) { + if (opTimeT.greaterThan(lastOpT)) { + bulkResult.opTime = opTime; + } + } + } + } + // If we have an insert Batch type + if (isInsertBatch(batch) && result.n) { + bulkResult.nInserted = bulkResult.nInserted + result.n; + } + // If we have an insert Batch type + if (isDeleteBatch(batch) && result.n) { + bulkResult.nRemoved = bulkResult.nRemoved + result.n; + } + let nUpserted = 0; + // We have an array of upserted values, we need to rewrite the indexes + if (Array.isArray(result.upserted)) { + nUpserted = result.upserted.length; + for (let i = 0; i < result.upserted.length; i++) { + bulkResult.upserted.push({ + index: result.upserted[i].index + batch.originalZeroIndex, + _id: result.upserted[i]._id + }); + } + } + else if (result.upserted) { + nUpserted = 1; + bulkResult.upserted.push({ + index: batch.originalZeroIndex, + _id: result.upserted + }); + } + // If we have an update Batch type + if (isUpdateBatch(batch) && result.n) { + const nModified = result.nModified; + bulkResult.nUpserted = bulkResult.nUpserted + nUpserted; + bulkResult.nMatched = bulkResult.nMatched + (result.n - nUpserted); + if (typeof nModified === 'number') { + bulkResult.nModified = bulkResult.nModified + nModified; + } + else { + bulkResult.nModified = 0; + } + } + if (Array.isArray(result.writeErrors)) { + for (let i = 0; i < result.writeErrors.length; i++) { + const writeError = { + index: batch.originalIndexes[result.writeErrors[i].index], + code: result.writeErrors[i].code, + errmsg: result.writeErrors[i].errmsg, + errInfo: result.writeErrors[i].errInfo, + op: batch.operations[result.writeErrors[i].index] + }; + bulkResult.writeErrors.push(new WriteError(writeError)); + } + } + if (result.writeConcernError) { + bulkResult.writeConcernErrors.push(new WriteConcernError(result.writeConcernError)); + } +} +function executeCommands(bulkOperation, options, callback) { + if (bulkOperation.s.batches.length === 0) { + return callback(undefined, new BulkWriteResult(bulkOperation.s.bulkResult)); + } + const batch = bulkOperation.s.batches.shift(); + function resultHandler(err, result) { + // Error is a driver related error not a bulk op error, return early + if (err && 'message' in err && !(err instanceof error_1.MongoWriteConcernError)) { + return callback(new MongoBulkWriteError(err, new BulkWriteResult(bulkOperation.s.bulkResult))); + } + if (err instanceof error_1.MongoWriteConcernError) { + return handleMongoWriteConcernError(batch, bulkOperation.s.bulkResult, err, callback); + } + // Merge the results together + const writeResult = new BulkWriteResult(bulkOperation.s.bulkResult); + const mergeResult = mergeBatchResults(batch, bulkOperation.s.bulkResult, err, result); + if (mergeResult != null) { + return callback(undefined, writeResult); + } + if (bulkOperation.handleWriteError(callback, writeResult)) + return; + // Execute the next command in line + executeCommands(bulkOperation, options, callback); + } + const finalOptions = utils_1.resolveOptions(bulkOperation, { + ...options, + ordered: bulkOperation.isOrdered + }); + if (finalOptions.bypassDocumentValidation !== true) { + delete finalOptions.bypassDocumentValidation; + } + // Set an operationIf if provided + if (bulkOperation.operationId) { + resultHandler.operationId = bulkOperation.operationId; + } + // 
Is the bypassDocumentValidation options specific + if (bulkOperation.s.bypassDocumentValidation === true) { + finalOptions.bypassDocumentValidation = true; + } + // Is the checkKeys option disabled + if (bulkOperation.s.checkKeys === false) { + finalOptions.checkKeys = false; + } + if (finalOptions.retryWrites) { + if (isUpdateBatch(batch)) { + finalOptions.retryWrites = finalOptions.retryWrites && !batch.operations.some(op => op.multi); + } + if (isDeleteBatch(batch)) { + finalOptions.retryWrites = + finalOptions.retryWrites && !batch.operations.some(op => op.limit === 0); + } + } + try { + if (isInsertBatch(batch)) { + execute_operation_1.executeOperation(bulkOperation.s.topology, new insert_1.InsertOperation(bulkOperation.s.namespace, batch.operations, finalOptions), resultHandler); + } + else if (isUpdateBatch(batch)) { + execute_operation_1.executeOperation(bulkOperation.s.topology, new update_1.UpdateOperation(bulkOperation.s.namespace, batch.operations, finalOptions), resultHandler); + } + else if (isDeleteBatch(batch)) { + execute_operation_1.executeOperation(bulkOperation.s.topology, new delete_1.DeleteOperation(bulkOperation.s.namespace, batch.operations, finalOptions), resultHandler); + } + } + catch (err) { + // Force top level error + err.ok = 0; + // Merge top level error and return + mergeBatchResults(batch, bulkOperation.s.bulkResult, err, undefined); + callback(); + } +} +function handleMongoWriteConcernError(batch, bulkResult, err, callback) { + var _a, _b; + mergeBatchResults(batch, bulkResult, undefined, err.result); + callback(new MongoBulkWriteError({ + message: (_a = err.result) === null || _a === void 0 ? void 0 : _a.writeConcernError.errmsg, + code: (_b = err.result) === null || _b === void 0 ? void 0 : _b.writeConcernError.result + }, new BulkWriteResult(bulkResult))); +} +/** + * An error indicating an unsuccessful Bulk Write + * @public + * @category Error + */ +class MongoBulkWriteError extends error_1.MongoServerError { + /** Creates a new MongoBulkWriteError */ + constructor(error, result) { + var _a; + super(error); + this.writeErrors = []; + if (error instanceof WriteConcernError) + this.err = error; + else if (!(error instanceof Error)) { + this.message = error.message; + this.code = error.code; + this.writeErrors = (_a = error.writeErrors) !== null && _a !== void 0 ? _a : []; + } + this.result = result; + Object.assign(this, error); + } + get name() { + return 'MongoBulkWriteError'; + } + /** Number of documents inserted. */ + get insertedCount() { + return this.result.insertedCount; + } + /** Number of documents matched for update. */ + get matchedCount() { + return this.result.matchedCount; + } + /** Number of documents modified. */ + get modifiedCount() { + return this.result.modifiedCount; + } + /** Number of documents deleted. */ + get deletedCount() { + return this.result.deletedCount; + } + /** Number of documents upserted. */ + get upsertedCount() { + return this.result.upsertedCount; + } + /** Inserted document generated Id's, hash key is the index of the originating operation */ + get insertedIds() { + return this.result.insertedIds; + } + /** Upserted document generated Id's, hash key is the index of the originating operation */ + get upsertedIds() { + return this.result.upsertedIds; + } +} +exports.MongoBulkWriteError = MongoBulkWriteError; +/** + * A builder object that is returned from {@link BulkOperationBase#find}. + * Is used to build a write operation that involves a query filter. 
+ * + * @public + */ +class FindOperators { + /** + * Creates a new FindOperators object. + * @internal + */ + constructor(bulkOperation) { + this.bulkOperation = bulkOperation; + } + /** Add a multiple update operation to the bulk operation */ + update(updateDocument) { + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList(exports.BatchType.UPDATE, update_1.makeUpdateStatement(currentOp.selector, updateDocument, { + ...currentOp, + multi: true + })); + } + /** Add a single update operation to the bulk operation */ + updateOne(updateDocument) { + if (!utils_1.hasAtomicOperators(updateDocument)) { + throw new error_1.MongoInvalidArgumentError('Update document requires atomic operators'); + } + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList(exports.BatchType.UPDATE, update_1.makeUpdateStatement(currentOp.selector, updateDocument, { ...currentOp, multi: false })); + } + /** Add a replace one operation to the bulk operation */ + replaceOne(replacement) { + if (utils_1.hasAtomicOperators(replacement)) { + throw new error_1.MongoInvalidArgumentError('Replacement document must not use atomic operators'); + } + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList(exports.BatchType.UPDATE, update_1.makeUpdateStatement(currentOp.selector, replacement, { ...currentOp, multi: false })); + } + /** Add a delete one operation to the bulk operation */ + deleteOne() { + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList(exports.BatchType.DELETE, delete_1.makeDeleteStatement(currentOp.selector, { ...currentOp, limit: 1 })); + } + /** Add a delete many operation to the bulk operation */ + delete() { + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList(exports.BatchType.DELETE, delete_1.makeDeleteStatement(currentOp.selector, { ...currentOp, limit: 0 })); + } + /** Upsert modifier for update bulk operation, noting that this operation is an upsert. */ + upsert() { + if (!this.bulkOperation.s.currentOp) { + this.bulkOperation.s.currentOp = {}; + } + this.bulkOperation.s.currentOp.upsert = true; + return this; + } + /** Specifies the collation for the query condition. */ + collation(collation) { + if (!this.bulkOperation.s.currentOp) { + this.bulkOperation.s.currentOp = {}; + } + this.bulkOperation.s.currentOp.collation = collation; + return this; + } + /** Specifies arrayFilters for UpdateOne or UpdateMany bulk operations. */ + arrayFilters(arrayFilters) { + if (!this.bulkOperation.s.currentOp) { + this.bulkOperation.s.currentOp = {}; + } + this.bulkOperation.s.currentOp.arrayFilters = arrayFilters; + return this; + } +} +exports.FindOperators = FindOperators; +/** @public */ +class BulkOperationBase { + /** + * Create a new OrderedBulkOperation or UnorderedBulkOperation instance + * @internal + */ + constructor(collection, options, isOrdered) { + // determine whether bulkOperation is ordered or unordered + this.isOrdered = isOrdered; + const topology = utils_1.getTopology(collection); + options = options == null ? 
{} : options; + // TODO Bring from driver information in isMaster + // Get the namespace for the write operations + const namespace = collection.s.namespace; + // Used to mark operation as executed + const executed = false; + // Current item + const currentOp = undefined; + // Set max byte size + const isMaster = topology.lastIsMaster(); + // If we have autoEncryption on, batch-splitting must be done on 2mb chunks, but single documents + // over 2mb are still allowed + const usingAutoEncryption = !!(topology.s.options && topology.s.options.autoEncrypter); + const maxBsonObjectSize = isMaster && isMaster.maxBsonObjectSize ? isMaster.maxBsonObjectSize : 1024 * 1024 * 16; + const maxBatchSizeBytes = usingAutoEncryption ? 1024 * 1024 * 2 : maxBsonObjectSize; + const maxWriteBatchSize = isMaster && isMaster.maxWriteBatchSize ? isMaster.maxWriteBatchSize : 1000; + // Calculates the largest possible size of an Array key, represented as a BSON string + // element. This calculation: + // 1 byte for BSON type + // # of bytes = length of (string representation of (maxWriteBatchSize - 1)) + // + 1 bytes for null terminator + const maxKeySize = (maxWriteBatchSize - 1).toString(10).length + 2; + // Final options for retryable writes + let finalOptions = Object.assign({}, options); + finalOptions = utils_1.applyRetryableWrites(finalOptions, collection.s.db); + // Final results + const bulkResult = { + ok: 1, + writeErrors: [], + writeConcernErrors: [], + insertedIds: [], + nInserted: 0, + nUpserted: 0, + nMatched: 0, + nModified: 0, + nRemoved: 0, + upserted: [] + }; + // Internal state + this.s = { + // Final result + bulkResult, + // Current batch state + currentBatch: undefined, + currentIndex: 0, + // ordered specific + currentBatchSize: 0, + currentBatchSizeBytes: 0, + // unordered specific + currentInsertBatch: undefined, + currentUpdateBatch: undefined, + currentRemoveBatch: undefined, + batches: [], + // Write concern + writeConcern: write_concern_1.WriteConcern.fromOptions(options), + // Max batch size options + maxBsonObjectSize, + maxBatchSizeBytes, + maxWriteBatchSize, + maxKeySize, + // Namespace + namespace, + // Topology + topology, + // Options + options: finalOptions, + // BSON options + bsonOptions: bson_1.resolveBSONOptions(options), + // Current operation + currentOp, + // Executed + executed, + // Collection + collection, + // Fundamental error + err: undefined, + // check keys + checkKeys: typeof options.checkKeys === 'boolean' ? options.checkKeys : false + }; + // bypass Validation + if (options.bypassDocumentValidation === true) { + this.s.bypassDocumentValidation = true; + } + } + /** + * Add a single insert document to the bulk operation + * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Adds three inserts to the bulkOp. + * bulkOp + * .insert({ a: 1 }) + * .insert({ b: 2 }) + * .insert({ c: 3 }); + * await bulkOp.execute(); + * ``` + */ + insert(document) { + if (document._id == null && !shouldForceServerObjectId(this)) { + document._id = new bson_1.ObjectId(); + } + return this.addToOperationsList(exports.BatchType.INSERT, document); + } + /** + * Builds a find operation for an update/updateOne/delete/deleteOne/replaceOne. + * Returns a builder object used to complete the definition of the operation. 
+ * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Add an updateOne to the bulkOp + * bulkOp.find({ a: 1 }).updateOne({ $set: { b: 2 } }); + * + * // Add an updateMany to the bulkOp + * bulkOp.find({ c: 3 }).update({ $set: { d: 4 } }); + * + * // Add an upsert + * bulkOp.find({ e: 5 }).upsert().updateOne({ $set: { f: 6 } }); + * + * // Add a deletion + * bulkOp.find({ g: 7 }).deleteOne(); + * + * // Add a multi deletion + * bulkOp.find({ h: 8 }).delete(); + * + * // Add a replaceOne + * bulkOp.find({ i: 9 }).replaceOne({writeConcern: { j: 10 }}); + * + * // Update using a pipeline (requires Mongodb 4.2 or higher) + * bulk.find({ k: 11, y: { $exists: true }, z: { $exists: true } }).updateOne([ + * { $set: { total: { $sum: [ '$y', '$z' ] } } } + * ]); + * + * // All of the ops will now be executed + * await bulkOp.execute(); + * ``` + */ + find(selector) { + if (!selector) { + throw new error_1.MongoInvalidArgumentError('Bulk find operation must specify a selector'); + } + // Save a current selector + this.s.currentOp = { + selector: selector + }; + return new FindOperators(this); + } + /** Specifies a raw operation to perform in the bulk write. */ + raw(op) { + if ('insertOne' in op) { + const forceServerObjectId = shouldForceServerObjectId(this); + if (op.insertOne && op.insertOne.document == null) { + // NOTE: provided for legacy support, but this is a malformed operation + if (forceServerObjectId !== true && op.insertOne._id == null) { + op.insertOne._id = new bson_1.ObjectId(); + } + return this.addToOperationsList(exports.BatchType.INSERT, op.insertOne); + } + if (forceServerObjectId !== true && op.insertOne.document._id == null) { + op.insertOne.document._id = new bson_1.ObjectId(); + } + return this.addToOperationsList(exports.BatchType.INSERT, op.insertOne.document); + } + if ('replaceOne' in op || 'updateOne' in op || 'updateMany' in op) { + if ('replaceOne' in op) { + if ('q' in op.replaceOne) { + throw new error_1.MongoInvalidArgumentError('Raw operations are not allowed'); + } + const updateStatement = update_1.makeUpdateStatement(op.replaceOne.filter, op.replaceOne.replacement, { ...op.replaceOne, multi: false }); + if (utils_1.hasAtomicOperators(updateStatement.u)) { + throw new error_1.MongoInvalidArgumentError('Replacement document must not use atomic operators'); + } + return this.addToOperationsList(exports.BatchType.UPDATE, updateStatement); + } + if ('updateOne' in op) { + if ('q' in op.updateOne) { + throw new error_1.MongoInvalidArgumentError('Raw operations are not allowed'); + } + const updateStatement = update_1.makeUpdateStatement(op.updateOne.filter, op.updateOne.update, { + ...op.updateOne, + multi: false + }); + if (!utils_1.hasAtomicOperators(updateStatement.u)) { + throw new error_1.MongoInvalidArgumentError('Update document requires atomic operators'); + } + return this.addToOperationsList(exports.BatchType.UPDATE, updateStatement); + } + if ('updateMany' in op) { + if ('q' in op.updateMany) { + throw new error_1.MongoInvalidArgumentError('Raw operations are not allowed'); + } + const updateStatement = update_1.makeUpdateStatement(op.updateMany.filter, op.updateMany.update, { + ...op.updateMany, + multi: true + }); + if (!utils_1.hasAtomicOperators(updateStatement.u)) { + throw new error_1.MongoInvalidArgumentError('Update document requires atomic operators'); + } + return this.addToOperationsList(exports.BatchType.UPDATE, updateStatement); + } + } + if ('deleteOne' in op) { + if ('q' in op.deleteOne) { + throw new 
error_1.MongoInvalidArgumentError('Raw operations are not allowed'); + } + return this.addToOperationsList(exports.BatchType.DELETE, delete_1.makeDeleteStatement(op.deleteOne.filter, { ...op.deleteOne, limit: 1 })); + } + if ('deleteMany' in op) { + if ('q' in op.deleteMany) { + throw new error_1.MongoInvalidArgumentError('Raw operations are not allowed'); + } + return this.addToOperationsList(exports.BatchType.DELETE, delete_1.makeDeleteStatement(op.deleteMany.filter, { ...op.deleteMany, limit: 0 })); + } + // otherwise an unknown operation was provided + throw new error_1.MongoInvalidArgumentError('bulkWrite only supports insertOne, updateOne, updateMany, deleteOne, deleteMany'); + } + get bsonOptions() { + return this.s.bsonOptions; + } + get writeConcern() { + return this.s.writeConcern; + } + get batches() { + const batches = [...this.s.batches]; + if (this.isOrdered) { + if (this.s.currentBatch) + batches.push(this.s.currentBatch); + } + else { + if (this.s.currentInsertBatch) + batches.push(this.s.currentInsertBatch); + if (this.s.currentUpdateBatch) + batches.push(this.s.currentUpdateBatch); + if (this.s.currentRemoveBatch) + batches.push(this.s.currentRemoveBatch); + } + return batches; + } + /** An internal helper method. Do not invoke directly. Will be going away in the future */ + execute(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + if (this.s.executed) { + return handleEarlyError(new error_1.MongoBatchReExecutionError(), callback); + } + const writeConcern = write_concern_1.WriteConcern.fromOptions(options); + if (writeConcern) { + this.s.writeConcern = writeConcern; + } + // If we have current batch + if (this.isOrdered) { + if (this.s.currentBatch) + this.s.batches.push(this.s.currentBatch); + } + else { + if (this.s.currentInsertBatch) + this.s.batches.push(this.s.currentInsertBatch); + if (this.s.currentUpdateBatch) + this.s.batches.push(this.s.currentUpdateBatch); + if (this.s.currentRemoveBatch) + this.s.batches.push(this.s.currentRemoveBatch); + } + // If we have no operations in the bulk raise an error + if (this.s.batches.length === 0) { + const emptyBatchError = new error_1.MongoInvalidArgumentError('Invalid BulkOperation, Batch cannot be empty'); + return handleEarlyError(emptyBatchError, callback); + } + this.s.executed = true; + const finalOptions = { ...this.s.options, ...options }; + return utils_1.executeLegacyOperation(this.s.topology, executeCommands, [this, finalOptions, callback]); + } + /** + * Handles the write error before executing commands + * @internal + */ + handleWriteError(callback, writeResult) { + if (this.s.bulkResult.writeErrors.length > 0) { + const msg = this.s.bulkResult.writeErrors[0].errmsg + ? 
this.s.bulkResult.writeErrors[0].errmsg + : 'write operation failed'; + callback(new MongoBulkWriteError({ + message: msg, + code: this.s.bulkResult.writeErrors[0].code, + writeErrors: this.s.bulkResult.writeErrors + }, writeResult)); + return true; + } + const writeConcernError = writeResult.getWriteConcernError(); + if (writeConcernError) { + callback(new MongoBulkWriteError(writeConcernError, writeResult)); + return true; + } + } +} +exports.BulkOperationBase = BulkOperationBase; +Object.defineProperty(BulkOperationBase.prototype, 'length', { + enumerable: true, + get() { + return this.s.currentIndex; + } +}); +/** helper function to assist with promiseOrCallback behavior */ +function handleEarlyError(err, callback) { + const Promise = promise_provider_1.PromiseProvider.get(); + if (typeof callback === 'function') { + callback(err); + return; + } + return Promise.reject(err); +} +function shouldForceServerObjectId(bulkOperation) { + var _a, _b; + if (typeof bulkOperation.s.options.forceServerObjectId === 'boolean') { + return bulkOperation.s.options.forceServerObjectId; + } + if (typeof ((_a = bulkOperation.s.collection.s.db.options) === null || _a === void 0 ? void 0 : _a.forceServerObjectId) === 'boolean') { + return (_b = bulkOperation.s.collection.s.db.options) === null || _b === void 0 ? void 0 : _b.forceServerObjectId; + } + return false; +} +function isInsertBatch(batch) { + return batch.batchType === exports.BatchType.INSERT; +} +function isUpdateBatch(batch) { + return batch.batchType === exports.BatchType.UPDATE; +} +function isDeleteBatch(batch) { + return batch.batchType === exports.BatchType.DELETE; +} +function buildCurrentOp(bulkOp) { + let { currentOp } = bulkOp.s; + bulkOp.s.currentOp = undefined; + if (!currentOp) + currentOp = {}; + return currentOp; +} +//# sourceMappingURL=common.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/bulk/common.js.map b/node_modules/mongodb/lib/bulk/common.js.map new file mode 100644 index 000000000..3b9a767a3 --- /dev/null +++ b/node_modules/mongodb/lib/bulk/common.js.map @@ -0,0 +1 @@ 
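The vendored `lib/bulk/common.js` above is the compiled core of the driver's bulk-write machinery: it merges per-batch server replies into a single `BulkWriteResult`, defines `MongoBulkWriteError`, and exposes the fluent `FindOperators` / `BulkOperationBase` API. For orientation only — this is not code from the project itself, and the database, collection, and connection string below are placeholders — a typical ordered bulk operation built on these classes looks roughly like this:

```js
// Illustrative sketch only: assumes a reachable MongoDB instance; the "demo"
// database, "items" collection, and the URI are made-up placeholders.
const { MongoClient } = require('mongodb');

async function runOrderedBulk() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  await client.connect();
  try {
    const items = client.db('demo').collection('items');

    // The fluent API implemented by BulkOperationBase / FindOperators above.
    const bulk = items.initializeOrderedBulkOp();
    bulk.insert({ name: 'first item', done: false });
    bulk.find({ name: 'first item' }).updateOne({ $set: { done: true } });
    bulk.find({ name: 'missing item' }).upsert().updateOne({ $set: { done: false } });
    bulk.find({ done: true }).deleteOne();

    // execute() drains the queued batches through executeCommands() shown above.
    const result = await bulk.execute();
    console.log(result.insertedCount, result.modifiedCount, result.upsertedCount, result.deletedCount);
  } finally {
    await client.close();
  }
}

runOrderedBulk().catch(console.error);
```

The counters read at the end are the `insertedCount` / `modifiedCount` / `upsertedCount` / `deletedCount` getters of `BulkWriteResult`, which wrap the `nInserted` / `nModified` / `nUpserted` / `nRemoved` totals that `mergeBatchResults` accumulates above.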
+{"version":3,"file":"common.js","sourceRoot":"","sources":["../../src/bulk/common.ts"],"names":[],"mappings":";;;AAAA,0DAAsD;AACtD,kCAA6F;AAC7F,oCAOkB;AAClB,oCAQkB;AAClB,uEAAmE;AACnE,iDAAuD;AACvD,iDAA6F;AAC7F,iDAA6F;AAC7F,oDAAgD;AAOhD,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAE3C,cAAc;AACD,QAAA,SAAS,GAAG,MAAM,CAAC,MAAM,CAAC;IACrC,MAAM,EAAE,CAAC;IACT,MAAM,EAAE,CAAC;IACT,MAAM,EAAE,CAAC;CACD,CAAC,CAAC;AAqGZ;;;;;GAKG;AACH,MAAa,KAAK;IAShB,YAAY,SAAoB,EAAE,iBAAyB;QACzD,IAAI,CAAC,iBAAiB,GAAG,iBAAiB,CAAC;QAC3C,IAAI,CAAC,YAAY,GAAG,CAAC,CAAC;QACtB,IAAI,CAAC,eAAe,GAAG,EAAE,CAAC;QAC1B,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAC3B,IAAI,CAAC,UAAU,GAAG,EAAE,CAAC;QACrB,IAAI,CAAC,IAAI,GAAG,CAAC,CAAC;QACd,IAAI,CAAC,SAAS,GAAG,CAAC,CAAC;IACrB,CAAC;CACF;AAlBD,sBAkBC;AAED;;;GAGG;AACH,MAAa,eAAe;IAG1B;;;OAGG;IACH,YAAY,UAAsB;QAChC,IAAI,CAAC,MAAM,GAAG,UAAU,CAAC;IAC3B,CAAC;IAED,oCAAoC;IACpC,IAAI,aAAa;;QACf,OAAO,MAAA,IAAI,CAAC,MAAM,CAAC,SAAS,mCAAI,CAAC,CAAC;IACpC,CAAC;IACD,8CAA8C;IAC9C,IAAI,YAAY;;QACd,OAAO,MAAA,IAAI,CAAC,MAAM,CAAC,QAAQ,mCAAI,CAAC,CAAC;IACnC,CAAC;IACD,oCAAoC;IACpC,IAAI,aAAa;;QACf,OAAO,MAAA,IAAI,CAAC,MAAM,CAAC,SAAS,mCAAI,CAAC,CAAC;IACpC,CAAC;IACD,mCAAmC;IACnC,IAAI,YAAY;;QACd,OAAO,MAAA,IAAI,CAAC,MAAM,CAAC,QAAQ,mCAAI,CAAC,CAAC;IACnC,CAAC;IACD,oCAAoC;IACpC,IAAI,aAAa;;QACf,OAAO,MAAA,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,MAAM,mCAAI,CAAC,CAAC;IAC1C,CAAC;IAED,2FAA2F;IAC3F,IAAI,WAAW;;QACb,MAAM,QAAQ,GAA6B,EAAE,CAAC;QAC9C,KAAK,MAAM,GAAG,IAAI,MAAA,IAAI,CAAC,MAAM,CAAC,QAAQ,mCAAI,EAAE,EAAE;YAC5C,QAAQ,CAAC,GAAG,CAAC,KAAK,CAAC,GAAG,GAAG,CAAC,GAAG,CAAC;SAC/B;QACD,OAAO,QAAQ,CAAC;IAClB,CAAC;IAED,2FAA2F;IAC3F,IAAI,WAAW;;QACb,MAAM,QAAQ,GAA6B,EAAE,CAAC;QAC9C,KAAK,MAAM,GAAG,IAAI,MAAA,IAAI,CAAC,MAAM,CAAC,WAAW,mCAAI,EAAE,EAAE;YAC/C,QAAQ,CAAC,GAAG,CAAC,KAAK,CAAC,GAAG,GAAG,CAAC,GAAG,CAAC;SAC/B;QACD,OAAO,QAAQ,CAAC;IAClB,CAAC;IAED,iEAAiE;IACjE,IAAI,EAAE;QACJ,OAAO,IAAI,CAAC,MAAM,CAAC,EAAE,CAAC;IACxB,CAAC;IAED,uCAAuC;IACvC,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,MAAM,CAAC,SAAS,CAAC;IAC/B,CAAC;IAED,mCAAmC;IACnC,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,MAAM,CAAC,SAAS,CAAC;IAC/B,CAAC;IAED,kCAAkC;IAClC,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC;IAC9B,CAAC;IAED,qDAAqD;IACrD,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,MAAM,CAAC,SAAS,CAAC;IAC/B,CAAC;IAED,kCAAkC;IAClC,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC;IAC9B,CAAC;IAED,2CAA2C;IAC3C,cAAc;QACZ,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC;IACjC,CAAC;IAED,2CAA2C;IAC3C,cAAc;QACZ,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC;IAC9B,CAAC;IAED,iDAAiD;IACjD,eAAe,CAAC,KAAa;QAC3B,OAAO,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC;IACrC,CAAC;IAED,kCAAkC;IAClC,cAAc;QACZ,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;IAED,gEAAgE;IAChE,cAAc;QACZ,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,MAAM,GAAG,CAAC,CAAC;IAC5C,CAAC;IAED,gEAAgE;IAChE,kBAAkB;QAChB,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,MAAM,CAAC;IACxC,CAAC;IAED,4CAA4C;IAC5C,eAAe,CAAC,KAAa;QAC3B,IAAI,KAAK,GAAG,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,MAAM,EAAE;YAC1C,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC,KAAK,CAAC,CAAC;SACvC;IACH,CAAC;IAED,gCAAgC;IAChC,cAAc;QACZ,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC;IACjC,CAAC;IAED,mCAAmC;IACnC,SAAS;QACP,OAAO,IAAI,CAAC,MAAM,CAAC,MAAM,CAAC;IAC5B,CAAC;IAED,qDAAqD;IACrD,oBAAoB;QAClB,IAAI,IAAI,CAAC,MAAM,CAAC,kBAAkB,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/C,OAAO;SACR;aAAM,IAAI,IAAI,CAAC,MAAM,CAAC,kBAAkB,CAAC,MAAM,KAAK,CAAC,EAAE;YACtD,mBAAmB;YACnB,OAAO,IAAI,CAAC,MAAM,CAAC,kBAAkB,CAAC,CAAC,CAAC,CAAC;SAC1C;aAAM;YACL,qBAAqB;YACrB,IAAI,MAAM,GAAG,EAAE,CAAC;YAChB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,MAAM,CAAC,kBAAkB,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;gBAC9D,MAAM,GAAG,GAAG,IAAI,CAAC,MAAM,CAAC,kBAAkB,CAAC,CAAC,CAAC,CAAC
;gBAC9C,MAAM,GAAG,MAAM,GAAG,GAAG,CAAC,MAAM,CAAC;gBAE7B,yBAAyB;gBACzB,IAAI,CAAC,KAAK,CAAC;oBAAE,MAAM,GAAG,MAAM,GAAG,OAAO,CAAC;aACxC;YAED,OAAO,IAAI,iBAAiB,CAAC,EAAE,MAAM,EAAE,IAAI,EAAE,2BAAmB,CAAC,kBAAkB,EAAE,CAAC,CAAC;SACxF;IACH,CAAC;IAED,MAAM;QACJ,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;IAED,QAAQ;QACN,OAAO,mBAAmB,IAAI,CAAC,MAAM,EAAE,GAAG,CAAC;IAC7C,CAAC;IAED,IAAI;QACF,OAAO,IAAI,CAAC,MAAM,CAAC,EAAE,KAAK,CAAC,CAAC;IAC9B,CAAC;CACF;AAhKD,0CAgKC;AASD;;;;GAIG;AACH,MAAa,iBAAiB;IAI5B,YAAY,KAA4B;QACtC,IAAI,CAAC,YAAY,CAAC,GAAG,KAAK,CAAC;IAC7B,CAAC;IAED,gCAAgC;IAChC,IAAI,IAAI;QACN,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,IAAI,CAAC;IACjC,CAAC;IAED,mCAAmC;IACnC,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,MAAM,CAAC;IACnC,CAAC;IAED,gCAAgC;IAChC,IAAI,OAAO;QACT,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,OAAO,CAAC;IACpC,CAAC;IAED,wFAAwF;IACxF,IAAI,GAAG;QACL,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC;IAC5B,CAAC;IAED,MAAM;QACJ,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC;IAC5B,CAAC;IAED,QAAQ;QACN,OAAO,qBAAqB,IAAI,CAAC,MAAM,GAAG,CAAC;IAC7C,CAAC;CACF;AAnCD,8CAmCC;AAWD;;;;GAIG;AACH,MAAa,UAAU;IAGrB,YAAY,GAA4B;QACtC,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;IACjB,CAAC;IAED,uBAAuB;IACvB,IAAI,IAAI;QACN,OAAO,IAAI,CAAC,GAAG,CAAC,IAAI,CAAC;IACvB,CAAC;IAED,gDAAgD;IAChD,IAAI,KAAK;QACP,OAAO,IAAI,CAAC,GAAG,CAAC,KAAK,CAAC;IACxB,CAAC;IAED,0BAA0B;IAC1B,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,GAAG,CAAC,MAAM,CAAC;IACzB,CAAC;IAED,0BAA0B;IAC1B,IAAI,OAAO;QACT,OAAO,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC;IAC1B,CAAC;IAED,6DAA6D;IAC7D,YAAY;QACV,OAAO,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC;IACrB,CAAC;IAED,MAAM;QACJ,OAAO,EAAE,IAAI,EAAE,IAAI,CAAC,GAAG,CAAC,IAAI,EAAE,KAAK,EAAE,IAAI,CAAC,GAAG,CAAC,KAAK,EAAE,MAAM,EAAE,IAAI,CAAC,GAAG,CAAC,MAAM,EAAE,EAAE,EAAE,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,CAAC;IAClG,CAAC;IAED,QAAQ;QACN,OAAO,cAAc,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,MAAM,EAAE,CAAC,GAAG,CAAC;IACxD,CAAC;CACF;AAvCD,gCAuCC;AAED,gDAAgD;AAChD,SAAS,iBAAiB,CACxB,KAAY,EACZ,UAAsB,EACtB,GAAc,EACd,MAAiB;IAEjB,0DAA0D;IAC1D,IAAI,GAAG,EAAE;QACP,MAAM,GAAG,GAAG,CAAC;KACd;SAAM,IAAI,MAAM,IAAI,MAAM,CAAC,MAAM,EAAE;QAClC,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC;KACxB;IAED,IAAI,MAAM,IAAI,IAAI,EAAE;QAClB,OAAO;KACR;IAED,0DAA0D;IAC1D,IAAI,MAAM,CAAC,EAAE,KAAK,CAAC,IAAI,UAAU,CAAC,EAAE,KAAK,CAAC,EAAE;QAC1C,UAAU,CAAC,EAAE,GAAG,CAAC,CAAC;QAElB,MAAM,UAAU,GAAG;YACjB,KAAK,EAAE,CAAC;YACR,IAAI,EAAE,MAAM,CAAC,IAAI,IAAI,CAAC;YACtB,MAAM,EAAE,MAAM,CAAC,OAAO;YACtB,OAAO,EAAE,MAAM,CAAC,OAAO;YACvB,EAAE,EAAE,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC;SACxB,CAAC;QAEF,UAAU,CAAC,WAAW,CAAC,IAAI,CAAC,IAAI,UAAU,CAAC,UAAU,CAAC,CAAC,CAAC;QACxD,OAAO;KACR;SAAM,IAAI,MAAM,CAAC,EAAE,KAAK,CAAC,IAAI,UAAU,CAAC,EAAE,KAAK,CAAC,EAAE;QACjD,OAAO;KACR;IAED,gCAAgC;IAChC,IAAI,MAAM,CAAC,MAAM,IAAI,MAAM,CAAC,MAAM,EAAE;QAClC,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,IAAI,MAAM,CAAC,MAAM,CAAC;QAC9C,IAAI,QAAQ,GAAG,IAAI,CAAC;QACpB,IAAI,OAAO,GAAG,IAAI,CAAC;QAEnB,uBAAuB;QACvB,IAAI,MAAM,IAAI,MAAM,CAAC,SAAS,KAAK,WAAW,EAAE;YAC9C,IAAI,UAAU,CAAC,MAAM,IAAI,IAAI,EAAE;gBAC7B,UAAU,CAAC,MAAM,GAAG,MAAM,CAAC;aAC5B;iBAAM,IAAI,MAAM,CAAC,WAAW,CAAC,UAAU,CAAC,MAAM,CAAC,EAAE;gBAChD,UAAU,CAAC,MAAM,GAAG,MAAM,CAAC;aAC5B;SACF;aAAM;YACL,cAAc;YACd,IAAI,UAAU,CAAC,MAAM,EAAE;gBACrB,QAAQ;oBACN,OAAO,UAAU,CAAC,MAAM,CAAC,EAAE,KAAK,QAAQ;wBACtC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,UAAU,CAAC,MAAM,CAAC,EAAE,CAAC;wBACvC,CAAC,CAAC,UAAU,CAAC,MAAM,CAAC,EAAE,CAAC;gBAC3B,OAAO;oBACL,OAAO,UAAU,CAAC,MAAM,CAAC,CAAC,KAAK,QAAQ;wBACrC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC;wBACtC,CAAC,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC;aAC3B;YAED,oBAAoB;YACpB,MAAM,QAAQ,GAAG,OAAO,MAAM,CAAC,EAAE,KAAK,QAAQ,CAAC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,MAAM,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,EAAE,CAAC;YACxF,MAAM,OAAO,
GAAG,OAAO,MAAM,CAAC,CAAC,KAAK,QAAQ,CAAC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC;YAEpF,uBAAuB;YACvB,IAAI,UAAU,CAAC,MAAM,IAAI,IAAI,EAAE;gBAC7B,UAAU,CAAC,MAAM,GAAG,MAAM,CAAC;aAC5B;iBAAM,IAAI,QAAQ,CAAC,WAAW,CAAC,QAAQ,CAAC,EAAE;gBACzC,UAAU,CAAC,MAAM,GAAG,MAAM,CAAC;aAC5B;iBAAM,IAAI,QAAQ,CAAC,MAAM,CAAC,QAAQ,CAAC,EAAE;gBACpC,IAAI,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,EAAE;oBAChC,UAAU,CAAC,MAAM,GAAG,MAAM,CAAC;iBAC5B;aACF;SACF;KACF;IAED,kCAAkC;IAClC,IAAI,aAAa,CAAC,KAAK,CAAC,IAAI,MAAM,CAAC,CAAC,EAAE;QACpC,UAAU,CAAC,SAAS,GAAG,UAAU,CAAC,SAAS,GAAG,MAAM,CAAC,CAAC,CAAC;KACxD;IAED,kCAAkC;IAClC,IAAI,aAAa,CAAC,KAAK,CAAC,IAAI,MAAM,CAAC,CAAC,EAAE;QACpC,UAAU,CAAC,QAAQ,GAAG,UAAU,CAAC,QAAQ,GAAG,MAAM,CAAC,CAAC,CAAC;KACtD;IAED,IAAI,SAAS,GAAG,CAAC,CAAC;IAElB,sEAAsE;IACtE,IAAI,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,QAAQ,CAAC,EAAE;QAClC,SAAS,GAAG,MAAM,CAAC,QAAQ,CAAC,MAAM,CAAC;QAEnC,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,MAAM,CAAC,QAAQ,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YAC/C,UAAU,CAAC,QAAQ,CAAC,IAAI,CAAC;gBACvB,KAAK,EAAE,MAAM,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,KAAK,GAAG,KAAK,CAAC,iBAAiB;gBACzD,GAAG,EAAE,MAAM,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,GAAG;aAC5B,CAAC,CAAC;SACJ;KACF;SAAM,IAAI,MAAM,CAAC,QAAQ,EAAE;QAC1B,SAAS,GAAG,CAAC,CAAC;QAEd,UAAU,CAAC,QAAQ,CAAC,IAAI,CAAC;YACvB,KAAK,EAAE,KAAK,CAAC,iBAAiB;YAC9B,GAAG,EAAE,MAAM,CAAC,QAAQ;SACrB,CAAC,CAAC;KACJ;IAED,kCAAkC;IAClC,IAAI,aAAa,CAAC,KAAK,CAAC,IAAI,MAAM,CAAC,CAAC,EAAE;QACpC,MAAM,SAAS,GAAG,MAAM,CAAC,SAAS,CAAC;QACnC,UAAU,CAAC,SAAS,GAAG,UAAU,CAAC,SAAS,GAAG,SAAS,CAAC;QACxD,UAAU,CAAC,QAAQ,GAAG,UAAU,CAAC,QAAQ,GAAG,CAAC,MAAM,CAAC,CAAC,GAAG,SAAS,CAAC,CAAC;QAEnE,IAAI,OAAO,SAAS,KAAK,QAAQ,EAAE;YACjC,UAAU,CAAC,SAAS,GAAG,UAAU,CAAC,SAAS,GAAG,SAAS,CAAC;SACzD;aAAM;YACL,UAAU,CAAC,SAAS,GAAG,CAAC,CAAC;SAC1B;KACF;IAED,IAAI,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,WAAW,CAAC,EAAE;QACrC,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,MAAM,CAAC,WAAW,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YAClD,MAAM,UAAU,GAAG;gBACjB,KAAK,EAAE,KAAK,CAAC,eAAe,CAAC,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;gBACzD,IAAI,EAAE,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,IAAI;gBAChC,MAAM,EAAE,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,MAAM;gBACpC,OAAO,EAAE,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,OAAO;gBACtC,EAAE,EAAE,KAAK,CAAC,UAAU,CAAC,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;aAClD,CAAC;YAEF,UAAU,CAAC,WAAW,CAAC,IAAI,CAAC,IAAI,UAAU,CAAC,UAAU,CAAC,CAAC,CAAC;SACzD;KACF;IAED,IAAI,MAAM,CAAC,iBAAiB,EAAE;QAC5B,UAAU,CAAC,kBAAkB,CAAC,IAAI,CAAC,IAAI,iBAAiB,CAAC,MAAM,CAAC,iBAAiB,CAAC,CAAC,CAAC;KACrF;AACH,CAAC;AAED,SAAS,eAAe,CACtB,aAAgC,EAChC,OAAyB,EACzB,QAAmC;IAEnC,IAAI,aAAa,CAAC,CAAC,CAAC,OAAO,CAAC,MAAM,KAAK,CAAC,EAAE;QACxC,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,eAAe,CAAC,aAAa,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC;KAC7E;IAED,MAAM,KAAK,GAAG,aAAa,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,EAAW,CAAC;IAEvD,SAAS,aAAa,CAAC,GAAc,EAAE,MAAiB;QACtD,oEAAoE;QACpE,IAAI,GAAG,IAAI,SAAS,IAAI,GAAG,IAAI,CAAC,CAAC,GAAG,YAAY,8BAAsB,CAAC,EAAE;YACvE,OAAO,QAAQ,CACb,IAAI,mBAAmB,CAAC,GAAG,EAAE,IAAI,eAAe,CAAC,aAAa,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAC9E,CAAC;SACH;QAED,IAAI,GAAG,YAAY,8BAAsB,EAAE;YACzC,OAAO,4BAA4B,CAAC,KAAK,EAAE,aAAa,CAAC,CAAC,CAAC,UAAU,EAAE,GAAG,EAAE,QAAQ,CAAC,CAAC;SACvF;QAED,6BAA6B;QAC7B,MAAM,WAAW,GAAG,IAAI,eAAe,CAAC,aAAa,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC;QACpE,MAAM,WAAW,GAAG,iBAAiB,CAAC,KAAK,EAAE,aAAa,CAAC,CAAC,CAAC,UAAU,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC;QACtF,IAAI,WAAW,IAAI,IAAI,EAAE;YACvB,OAAO,QAAQ,CAAC,SAAS,EAAE,WAAW,CAAC,CAAC;SACzC;QAED,IAAI,aAAa,CAAC,gBAAgB,CAAC,QAAQ,EAAE,WAAW,CAAC;YAAE,OAAO;QAElE,mCAAmC;QACnC,eAAe,CAAC,aAAa,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IACpD,CAAC;IAED,MAAM,YAAY,GAAG,sBAAc,CAAC,aAAa
,EAAE;QACjD,GAAG,OAAO;QACV,OAAO,EAAE,aAAa,CAAC,SAAS;KACjC,CAAC,CAAC;IAEH,IAAI,YAAY,CAAC,wBAAwB,KAAK,IAAI,EAAE;QAClD,OAAO,YAAY,CAAC,wBAAwB,CAAC;KAC9C;IAED,iCAAiC;IACjC,IAAI,aAAa,CAAC,WAAW,EAAE;QAC7B,aAAa,CAAC,WAAW,GAAG,aAAa,CAAC,WAAW,CAAC;KACvD;IAED,mDAAmD;IACnD,IAAI,aAAa,CAAC,CAAC,CAAC,wBAAwB,KAAK,IAAI,EAAE;QACrD,YAAY,CAAC,wBAAwB,GAAG,IAAI,CAAC;KAC9C;IAED,mCAAmC;IACnC,IAAI,aAAa,CAAC,CAAC,CAAC,SAAS,KAAK,KAAK,EAAE;QACvC,YAAY,CAAC,SAAS,GAAG,KAAK,CAAC;KAChC;IAED,IAAI,YAAY,CAAC,WAAW,EAAE;QAC5B,IAAI,aAAa,CAAC,KAAK,CAAC,EAAE;YACxB,YAAY,CAAC,WAAW,GAAG,YAAY,CAAC,WAAW,IAAI,CAAC,KAAK,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,KAAK,CAAC,CAAC;SAC/F;QAED,IAAI,aAAa,CAAC,KAAK,CAAC,EAAE;YACxB,YAAY,CAAC,WAAW;gBACtB,YAAY,CAAC,WAAW,IAAI,CAAC,KAAK,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,KAAK,KAAK,CAAC,CAAC,CAAC;SAC5E;KACF;IAED,IAAI;QACF,IAAI,aAAa,CAAC,KAAK,CAAC,EAAE;YACxB,oCAAgB,CACd,aAAa,CAAC,CAAC,CAAC,QAAQ,EACxB,IAAI,wBAAe,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,EAAE,KAAK,CAAC,UAAU,EAAE,YAAY,CAAC,EAC9E,aAAa,CACd,CAAC;SACH;aAAM,IAAI,aAAa,CAAC,KAAK,CAAC,EAAE;YAC/B,oCAAgB,CACd,aAAa,CAAC,CAAC,CAAC,QAAQ,EACxB,IAAI,wBAAe,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,EAAE,KAAK,CAAC,UAAU,EAAE,YAAY,CAAC,EAC9E,aAAa,CACd,CAAC;SACH;aAAM,IAAI,aAAa,CAAC,KAAK,CAAC,EAAE;YAC/B,oCAAgB,CACd,aAAa,CAAC,CAAC,CAAC,QAAQ,EACxB,IAAI,wBAAe,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,EAAE,KAAK,CAAC,UAAU,EAAE,YAAY,CAAC,EAC9E,aAAa,CACd,CAAC;SACH;KACF;IAAC,OAAO,GAAG,EAAE;QACZ,wBAAwB;QACxB,GAAG,CAAC,EAAE,GAAG,CAAC,CAAC;QACX,mCAAmC;QACnC,iBAAiB,CAAC,KAAK,EAAE,aAAa,CAAC,CAAC,CAAC,UAAU,EAAE,GAAG,EAAE,SAAS,CAAC,CAAC;QACrE,QAAQ,EAAE,CAAC;KACZ;AACH,CAAC;AAED,SAAS,4BAA4B,CACnC,KAAY,EACZ,UAAsB,EACtB,GAA2B,EAC3B,QAAmC;;IAEnC,iBAAiB,CAAC,KAAK,EAAE,UAAU,EAAE,SAAS,EAAE,GAAG,CAAC,MAAM,CAAC,CAAC;IAE5D,QAAQ,CACN,IAAI,mBAAmB,CACrB;QACE,OAAO,EAAE,MAAA,GAAG,CAAC,MAAM,0CAAE,iBAAiB,CAAC,MAAM;QAC7C,IAAI,EAAE,MAAA,GAAG,CAAC,MAAM,0CAAE,iBAAiB,CAAC,MAAM;KAC3C,EACD,IAAI,eAAe,CAAC,UAAU,CAAC,CAChC,CACF,CAAC;AACJ,CAAC;AAED;;;;GAIG;AACH,MAAa,mBAAoB,SAAQ,wBAAgB;IAKvD,wCAAwC;IACxC,YACE,KAGY,EACZ,MAAuB;;QAEvB,KAAK,CAAC,KAAK,CAAC,CAAC;QAXf,gBAAW,GAA0B,EAAE,CAAC;QAatC,IAAI,KAAK,YAAY,iBAAiB;YAAE,IAAI,CAAC,GAAG,GAAG,KAAK,CAAC;aACpD,IAAI,CAAC,CAAC,KAAK,YAAY,KAAK,CAAC,EAAE;YAClC,IAAI,CAAC,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC;YAC7B,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC,IAAI,CAAC;YACvB,IAAI,CAAC,WAAW,GAAG,MAAA,KAAK,CAAC,WAAW,mCAAI,EAAE,CAAC;SAC5C;QAED,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QACrB,MAAM,CAAC,MAAM,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;IAC7B,CAAC;IAED,IAAI,IAAI;QACN,OAAO,qBAAqB,CAAC;IAC/B,CAAC;IAED,oCAAoC;IACpC,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,MAAM,CAAC,aAAa,CAAC;IACnC,CAAC;IACD,8CAA8C;IAC9C,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC;IAClC,CAAC;IACD,oCAAoC;IACpC,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,MAAM,CAAC,aAAa,CAAC;IACnC,CAAC;IACD,mCAAmC;IACnC,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,MAAM,CAAC,YAAY,CAAC;IAClC,CAAC;IACD,oCAAoC;IACpC,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,MAAM,CAAC,aAAa,CAAC;IACnC,CAAC;IACD,2FAA2F;IAC3F,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC;IACjC,CAAC;IACD,2FAA2F;IAC3F,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,MAAM,CAAC,WAAW,CAAC;IACjC,CAAC;CACF;AA1DD,kDA0DC;AAED;;;;;GAKG;AACH,MAAa,aAAa;IAGxB;;;OAGG;IACH,YAAY,aAAgC;QAC1C,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;IACrC,CAAC;IAED,4DAA4D;IAC5D,MAAM,CAAC,cAAwB;QAC7B,MAAM,SAAS,GAAG,cAAc,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;QACrD,OAAO,IAAI,CAAC,aAAa,CAAC,mBAAmB,CAC3C,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,SAAS,CAAC,QAAQ,EAAE,cAAc,EAAE;YACtD,GAAG,SAAS;YACZ,KAAK,EAAE,IAAI;SACZ,CAAC,CACH,CAAC;IACJ,CAAC;IAED,0DAA0D;IAC1D,SAAS,CAAC,cAAwB;QAChC,IAAI,CAAC,0BAAkB,CAAC,cAAc,CAAC,EAAE;YACvC,MAAM,IAAI,iCAAy
B,CAAC,2CAA2C,CAAC,CAAC;SAClF;QAED,MAAM,SAAS,GAAG,cAAc,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;QACrD,OAAO,IAAI,CAAC,aAAa,CAAC,mBAAmB,CAC3C,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,SAAS,CAAC,QAAQ,EAAE,cAAc,EAAE,EAAE,GAAG,SAAS,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CACxF,CAAC;IACJ,CAAC;IAED,wDAAwD;IACxD,UAAU,CAAC,WAAqB;QAC9B,IAAI,0BAAkB,CAAC,WAAW,CAAC,EAAE;YACnC,MAAM,IAAI,iCAAyB,CAAC,oDAAoD,CAAC,CAAC;SAC3F;QAED,MAAM,SAAS,GAAG,cAAc,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;QACrD,OAAO,IAAI,CAAC,aAAa,CAAC,mBAAmB,CAC3C,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,SAAS,CAAC,QAAQ,EAAE,WAAW,EAAE,EAAE,GAAG,SAAS,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CACrF,CAAC;IACJ,CAAC;IAED,uDAAuD;IACvD,SAAS;QACP,MAAM,SAAS,GAAG,cAAc,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;QACrD,OAAO,IAAI,CAAC,aAAa,CAAC,mBAAmB,CAC3C,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,SAAS,CAAC,QAAQ,EAAE,EAAE,GAAG,SAAS,EAAE,KAAK,EAAE,CAAC,EAAE,CAAC,CACpE,CAAC;IACJ,CAAC;IAED,wDAAwD;IACxD,MAAM;QACJ,MAAM,SAAS,GAAG,cAAc,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;QACrD,OAAO,IAAI,CAAC,aAAa,CAAC,mBAAmB,CAC3C,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,SAAS,CAAC,QAAQ,EAAE,EAAE,GAAG,SAAS,EAAE,KAAK,EAAE,CAAC,EAAE,CAAC,CACpE,CAAC;IACJ,CAAC;IAED,0FAA0F;IAC1F,MAAM;QACJ,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,EAAE;YACnC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,GAAG,EAAE,CAAC;SACrC;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,CAAC,MAAM,GAAG,IAAI,CAAC;QAC7C,OAAO,IAAI,CAAC;IACd,CAAC;IAED,uDAAuD;IACvD,SAAS,CAAC,SAA2B;QACnC,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,EAAE;YACnC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,GAAG,EAAE,CAAC;SACrC;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,CAAC,SAAS,GAAG,SAAS,CAAC;QACrD,OAAO,IAAI,CAAC;IACd,CAAC;IAED,0EAA0E;IAC1E,YAAY,CAAC,YAAwB;QACnC,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,EAAE;YACnC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,GAAG,EAAE,CAAC;SACrC;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,SAAS,CAAC,YAAY,GAAG,YAAY,CAAC;QAC3D,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AAhGD,sCAgGC;AAuDD,cAAc;AACd,MAAsB,iBAAiB;IAMrC;;;OAGG;IACH,YAAY,UAAsB,EAAE,OAAyB,EAAE,SAAkB;QAC/E,0DAA0D;QAC1D,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAE3B,MAAM,QAAQ,GAAG,mBAAW,CAAC,UAAU,CAAC,CAAC;QACzC,OAAO,GAAG,OAAO,IAAI,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC;QACzC,iDAAiD;QACjD,6CAA6C;QAC7C,MAAM,SAAS,GAAG,UAAU,CAAC,CAAC,CAAC,SAAS,CAAC;QACzC,qCAAqC;QACrC,MAAM,QAAQ,GAAG,KAAK,CAAC;QAEvB,eAAe;QACf,MAAM,SAAS,GAAG,SAAS,CAAC;QAE5B,oBAAoB;QACpB,MAAM,QAAQ,GAAG,QAAQ,CAAC,YAAY,EAAE,CAAC;QAEzC,iGAAiG;QACjG,6BAA6B;QAC7B,MAAM,mBAAmB,GAAG,CAAC,CAAC,CAAC,QAAQ,CAAC,CAAC,CAAC,OAAO,IAAI,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,CAAC,CAAC;QACvF,MAAM,iBAAiB,GACrB,QAAQ,IAAI,QAAQ,CAAC,iBAAiB,CAAC,CAAC,CAAC,QAAQ,CAAC,iBAAiB,CAAC,CAAC,CAAC,IAAI,GAAG,IAAI,GAAG,EAAE,CAAC;QACzF,MAAM,iBAAiB,GAAG,mBAAmB,CAAC,CAAC,CAAC,IAAI,GAAG,IAAI,GAAG,CAAC,CAAC,CAAC,CAAC,iBAAiB,CAAC;QACpF,MAAM,iBAAiB,GACrB,QAAQ,IAAI,QAAQ,CAAC,iBAAiB,CAAC,CAAC,CAAC,QAAQ,CAAC,iBAAiB,CAAC,CAAC,CAAC,IAAI,CAAC;QAE7E,qFAAqF;QACrF,6BAA6B;QAC7B,2BAA2B;QAC3B,gFAAgF;QAChF,kCAAkC;QAClC,MAAM,UAAU,GAAG,CAAC,iBAAiB,GAAG,CAAC,CAAC,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,MAAM,GAAG,CAAC,CAAC;QAEnE,qCAAqC;QACrC,IAAI,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAC9C,YAAY,GAAG,4BAAoB,CAAC,YAAY,EAAE,UAAU,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC;QAEnE,gBAAgB;QAChB,MAAM,UAAU,GAAe;YAC7B,EAAE,EAAE,CAAC;YACL,WAAW,EAAE,EAAE;YACf,kBAAkB,EAAE,EAAE;YACtB,WAAW,EAAE,EAAE;YACf,SAAS,EAAE,CAAC;YACZ,SAAS,EAAE,CAAC;YACZ,QAAQ,EAAE,CAAC;YACX,SAAS,EAAE,CAAC;YACZ,QAAQ,EAAE,CAAC;YACX,QAAQ,EAAE,EAAE;SACb,CAAC;QAEF,iBAAiB;QACjB,IAAI,CAAC,CAAC,GAAG;YACP,eAAe;YACf,UAAU;YACV,sBAAsB;YACtB,YAAY,EAAE,SAAS;YACvB,YAAY,EAAE,CAAC;YACf,mBAAmB;YACnB,gBAAgB,EAAE,CAAC;YACnB,qBAAqB,EAAE,CAAC;YACxB,qBAAqB;YACrB,kBAAkB,EAAE,S
AAS;YAC7B,kBAAkB,EAAE,SAAS;YAC7B,kBAAkB,EAAE,SAAS;YAC7B,OAAO,EAAE,EAAE;YACX,gBAAgB;YAChB,YAAY,EAAE,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC;YAC/C,yBAAyB;YACzB,iBAAiB;YACjB,iBAAiB;YACjB,iBAAiB;YACjB,UAAU;YACV,YAAY;YACZ,SAAS;YACT,WAAW;YACX,QAAQ;YACR,UAAU;YACV,OAAO,EAAE,YAAY;YACrB,eAAe;YACf,WAAW,EAAE,yBAAkB,CAAC,OAAO,CAAC;YACxC,oBAAoB;YACpB,SAAS;YACT,WAAW;YACX,QAAQ;YACR,aAAa;YACb,UAAU;YACV,oBAAoB;YACpB,GAAG,EAAE,SAAS;YACd,aAAa;YACb,SAAS,EAAE,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK;SAC9E,CAAC;QAEF,oBAAoB;QACpB,IAAI,OAAO,CAAC,wBAAwB,KAAK,IAAI,EAAE;YAC7C,IAAI,CAAC,CAAC,CAAC,wBAAwB,GAAG,IAAI,CAAC;SACxC;IACH,CAAC;IAED;;;;;;;;;;;;;;OAcG;IACH,MAAM,CAAC,QAAkB;QACvB,IAAI,QAAQ,CAAC,GAAG,IAAI,IAAI,IAAI,CAAC,yBAAyB,CAAC,IAAI,CAAC,EAAE;YAC5D,QAAQ,CAAC,GAAG,GAAG,IAAI,eAAQ,EAAE,CAAC;SAC/B;QAED,OAAO,IAAI,CAAC,mBAAmB,CAAC,iBAAS,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;IAC9D,CAAC;IAED;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;OAkCG;IACH,IAAI,CAAC,QAAkB;QACrB,IAAI,CAAC,QAAQ,EAAE;YACb,MAAM,IAAI,iCAAyB,CAAC,6CAA6C,CAAC,CAAC;SACpF;QAED,0BAA0B;QAC1B,IAAI,CAAC,CAAC,CAAC,SAAS,GAAG;YACjB,QAAQ,EAAE,QAAQ;SACnB,CAAC;QAEF,OAAO,IAAI,aAAa,CAAC,IAAI,CAAC,CAAC;IACjC,CAAC;IAED,8DAA8D;IAC9D,GAAG,CAAC,EAAyB;QAC3B,IAAI,WAAW,IAAI,EAAE,EAAE;YACrB,MAAM,mBAAmB,GAAG,yBAAyB,CAAC,IAAI,CAAC,CAAC;YAC5D,IAAI,EAAE,CAAC,SAAS,IAAI,EAAE,CAAC,SAAS,CAAC,QAAQ,IAAI,IAAI,EAAE;gBACjD,uEAAuE;gBACvE,IAAI,mBAAmB,KAAK,IAAI,IAAK,EAAE,CAAC,SAAsB,CAAC,GAAG,IAAI,IAAI,EAAE;oBACzE,EAAE,CAAC,SAAsB,CAAC,GAAG,GAAG,IAAI,eAAQ,EAAE,CAAC;iBACjD;gBAED,OAAO,IAAI,CAAC,mBAAmB,CAAC,iBAAS,CAAC,MAAM,EAAE,EAAE,CAAC,SAAS,CAAC,CAAC;aACjE;YAED,IAAI,mBAAmB,KAAK,IAAI,IAAI,EAAE,CAAC,SAAS,CAAC,QAAQ,CAAC,GAAG,IAAI,IAAI,EAAE;gBACrE,EAAE,CAAC,SAAS,CAAC,QAAQ,CAAC,GAAG,GAAG,IAAI,eAAQ,EAAE,CAAC;aAC5C;YAED,OAAO,IAAI,CAAC,mBAAmB,CAAC,iBAAS,CAAC,MAAM,EAAE,EAAE,CAAC,SAAS,CAAC,QAAQ,CAAC,CAAC;SAC1E;QAED,IAAI,YAAY,IAAI,EAAE,IAAI,WAAW,IAAI,EAAE,IAAI,YAAY,IAAI,EAAE,EAAE;YACjE,IAAI,YAAY,IAAI,EAAE,EAAE;gBACtB,IAAI,GAAG,IAAI,EAAE,CAAC,UAAU,EAAE;oBACxB,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;iBACvE;gBACD,MAAM,eAAe,GAAG,4BAAmB,CACzC,EAAE,CAAC,UAAU,CAAC,MAAM,EACpB,EAAE,CAAC,UAAU,CAAC,WAAW,EACzB,EAAE,GAAG,EAAE,CAAC,UAAU,EAAE,KAAK,EAAE,KAAK,EAAE,CACnC,CAAC;gBACF,IAAI,0BAAkB,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE;oBACzC,MAAM,IAAI,iCAAyB,CAAC,oDAAoD,CAAC,CAAC;iBAC3F;gBACD,OAAO,IAAI,CAAC,mBAAmB,CAAC,iBAAS,CAAC,MAAM,EAAE,eAAe,CAAC,CAAC;aACpE;YAED,IAAI,WAAW,IAAI,EAAE,EAAE;gBACrB,IAAI,GAAG,IAAI,EAAE,CAAC,SAAS,EAAE;oBACvB,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;iBACvE;gBACD,MAAM,eAAe,GAAG,4BAAmB,CAAC,EAAE,CAAC,SAAS,CAAC,MAAM,EAAE,EAAE,CAAC,SAAS,CAAC,MAAM,EAAE;oBACpF,GAAG,EAAE,CAAC,SAAS;oBACf,KAAK,EAAE,KAAK;iBACb,CAAC,CAAC;gBACH,IAAI,CAAC,0BAAkB,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE;oBAC1C,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;iBAClF;gBACD,OAAO,IAAI,CAAC,mBAAmB,CAAC,iBAAS,CAAC,MAAM,EAAE,eAAe,CAAC,CAAC;aACpE;YAED,IAAI,YAAY,IAAI,EAAE,EAAE;gBACtB,IAAI,GAAG,IAAI,EAAE,CAAC,UAAU,EAAE;oBACxB,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;iBACvE;gBACD,MAAM,eAAe,GAAG,4BAAmB,CAAC,EAAE,CAAC,UAAU,CAAC,MAAM,EAAE,EAAE,CAAC,UAAU,CAAC,MAAM,EAAE;oBACtF,GAAG,EAAE,CAAC,UAAU;oBAChB,KAAK,EAAE,IAAI;iBACZ,CAAC,CAAC;gBACH,IAAI,CAAC,0BAAkB,CAAC,eAAe,CAAC,CAAC,CAAC,EAAE;oBAC1C,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;iBAClF;gBACD,OAAO,IAAI,CAAC,mBAAmB,CAAC,iBAAS,CAAC,MAAM,EAAE,eAAe,CAAC,CAAC;aACpE;SACF;QAED,IAAI,WAAW,IAAI,EAAE,EAAE;YACrB,IAAI,GAAG,IAAI,EAAE,CAAC,SAAS,EAAE;gBACvB,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;aACvE;YACD,OAAO,IAAI,CAAC,mBAAmB,CAC7B,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,EAAE,CAAC,SAAS,CAAC,MAAM,EAAE,EAAE,GAAG,EAAE,CAAC,SAAS,EAAE,KAAK,EA
AE,CAAC,EAAE,CAAC,CACxE,CAAC;SACH;QAED,IAAI,YAAY,IAAI,EAAE,EAAE;YACtB,IAAI,GAAG,IAAI,EAAE,CAAC,UAAU,EAAE;gBACxB,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;aACvE;YACD,OAAO,IAAI,CAAC,mBAAmB,CAC7B,iBAAS,CAAC,MAAM,EAChB,4BAAmB,CAAC,EAAE,CAAC,UAAU,CAAC,MAAM,EAAE,EAAE,GAAG,EAAE,CAAC,UAAU,EAAE,KAAK,EAAE,CAAC,EAAE,CAAC,CAC1E,CAAC;SACH;QAED,8CAA8C;QAC9C,MAAM,IAAI,iCAAyB,CACjC,iFAAiF,CAClF,CAAC;IACJ,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;IAC7B,CAAC;IAED,IAAI,OAAO;QACT,MAAM,OAAO,GAAG,CAAC,GAAG,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;QACpC,IAAI,IAAI,CAAC,SAAS,EAAE;YAClB,IAAI,IAAI,CAAC,CAAC,CAAC,YAAY;gBAAE,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;SAC5D;aAAM;YACL,IAAI,IAAI,CAAC,CAAC,CAAC,kBAAkB;gBAAE,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC;YACvE,IAAI,IAAI,CAAC,CAAC,CAAC,kBAAkB;gBAAE,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC;YACvE,IAAI,IAAI,CAAC,CAAC,CAAC,kBAAkB;gBAAE,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC;SACxE;QACD,OAAO,OAAO,CAAC;IACjB,CAAC;IAED,0FAA0F;IAC1F,OAAO,CACL,OAA0B,EAC1B,QAAoC;QAEpC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,IAAI,IAAI,CAAC,CAAC,CAAC,QAAQ,EAAE;YACnB,OAAO,gBAAgB,CAAC,IAAI,kCAA0B,EAAE,EAAE,QAAQ,CAAC,CAAC;SACrE;QAED,MAAM,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QACvD,IAAI,YAAY,EAAE;YAChB,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,YAAY,CAAC;SACpC;QAED,2BAA2B;QAC3B,IAAI,IAAI,CAAC,SAAS,EAAE;YAClB,IAAI,IAAI,CAAC,CAAC,CAAC,YAAY;gBAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;SACnE;aAAM;YACL,IAAI,IAAI,CAAC,CAAC,CAAC,kBAAkB;gBAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC;YAC9E,IAAI,IAAI,CAAC,CAAC,CAAC,kBAAkB;gBAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC;YAC9E,IAAI,IAAI,CAAC,CAAC,CAAC,kBAAkB;gBAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC;SAC/E;QACD,sDAAsD;QACtD,IAAI,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/B,MAAM,eAAe,GAAG,IAAI,iCAAyB,CACnD,8CAA8C,CAC/C,CAAC;YACF,OAAO,gBAAgB,CAAC,eAAe,EAAE,QAAQ,CAAC,CAAC;SACpD;QAED,IAAI,CAAC,CAAC,CAAC,QAAQ,GAAG,IAAI,CAAC;QACvB,MAAM,YAAY,GAAG,EAAE,GAAG,IAAI,CAAC,CAAC,CAAC,OAAO,EAAE,GAAG,OAAO,EAAE,CAAC;QACvD,OAAO,8BAAsB,CAAC,IAAI,CAAC,CAAC,CAAC,QAAQ,EAAE,eAAe,EAAE,CAAC,IAAI,EAAE,YAAY,EAAE,QAAQ,CAAC,CAAC,CAAC;IAClG,CAAC;IAED;;;OAGG;IACH,gBAAgB,CACd,QAAmC,EACnC,WAA4B;QAE5B,IAAI,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,MAAM,GAAG,CAAC,EAAE;YAC5C,MAAM,GAAG,GAAG,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,MAAM;gBACjD,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,MAAM;gBACzC,CAAC,CAAC,wBAAwB,CAAC;YAE7B,QAAQ,CACN,IAAI,mBAAmB,CACrB;gBACE,OAAO,EAAE,GAAG;gBACZ,IAAI,EAAE,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,IAAI;gBAC3C,WAAW,EAAE,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW;aAC3C,EACD,WAAW,CACZ,CACF,CAAC;YAEF,OAAO,IAAI,CAAC;SACb;QAED,MAAM,iBAAiB,GAAG,WAAW,CAAC,oBAAoB,EAAE,CAAC;QAC7D,IAAI,iBAAiB,EAAE;YACrB,QAAQ,CAAC,IAAI,mBAAmB,CAAC,iBAAiB,EAAE,WAAW,CAAC,CAAC,CAAC;YAClE,OAAO,IAAI,CAAC;SACb;IACH,CAAC;CAMF;AAhXD,8CAgXC;AAED,MAAM,CAAC,cAAc,CAAC,iBAAiB,CAAC,SAAS,EAAE,QAAQ,EAAE;IAC3D,UAAU,EAAE,IAAI;IAChB,GAAG;QACD,OAAO,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;IAC7B,CAAC;CACF,CAAC,CAAC;AAEH,gEAAgE;AAChE,SAAS,gBAAgB,CACvB,GAAc,EACd,QAAoC;IAEpC,MAAM,OAAO,GAAG,kCAAe,CAAC,GAAG,EAAE,CAAC;IACtC,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;QAClC,QAAQ,CAAC,GAAG,CAAC,CAAC;QACd,OAAO;KACR
;IAED,OAAO,OAAO,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;AAC7B,CAAC;AAED,SAAS,yBAAyB,CAAC,aAAgC;;IACjE,IAAI,OAAO,aAAa,CAAC,CAAC,CAAC,OAAO,CAAC,mBAAmB,KAAK,SAAS,EAAE;QACpE,OAAO,aAAa,CAAC,CAAC,CAAC,OAAO,CAAC,mBAAmB,CAAC;KACpD;IAED,IAAI,OAAO,CAAA,MAAA,aAAa,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,EAAE,CAAC,OAAO,0CAAE,mBAAmB,CAAA,KAAK,SAAS,EAAE;QACrF,OAAO,MAAA,aAAa,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,EAAE,CAAC,OAAO,0CAAE,mBAAmB,CAAC;KACrE;IAED,OAAO,KAAK,CAAC;AACf,CAAC;AAED,SAAS,aAAa,CAAC,KAAY;IACjC,OAAO,KAAK,CAAC,SAAS,KAAK,iBAAS,CAAC,MAAM,CAAC;AAC9C,CAAC;AAED,SAAS,aAAa,CAAC,KAAY;IACjC,OAAO,KAAK,CAAC,SAAS,KAAK,iBAAS,CAAC,MAAM,CAAC;AAC9C,CAAC;AAED,SAAS,aAAa,CAAC,KAAY;IACjC,OAAO,KAAK,CAAC,SAAS,KAAK,iBAAS,CAAC,MAAM,CAAC;AAC9C,CAAC;AAED,SAAS,cAAc,CAAC,MAAyB;IAC/C,IAAI,EAAE,SAAS,EAAE,GAAG,MAAM,CAAC,CAAC,CAAC;IAC7B,MAAM,CAAC,CAAC,CAAC,SAAS,GAAG,SAAS,CAAC;IAC/B,IAAI,CAAC,SAAS;QAAE,SAAS,GAAG,EAAE,CAAC;IAC/B,OAAO,SAAS,CAAC;AACnB,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/bulk/ordered.js b/node_modules/mongodb/lib/bulk/ordered.js new file mode 100644 index 000000000..bf4b3f385 --- /dev/null +++ b/node_modules/mongodb/lib/bulk/ordered.js @@ -0,0 +1,66 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.OrderedBulkOperation = void 0; +const BSON = require("../bson"); +const common_1 = require("./common"); +const error_1 = require("../error"); +/** @public */ +class OrderedBulkOperation extends common_1.BulkOperationBase { + constructor(collection, options) { + super(collection, options, true); + } + addToOperationsList(batchType, document) { + // Get the bsonSize + const bsonSize = BSON.calculateObjectSize(document, { + checkKeys: false, + // Since we don't know what the user selected for BSON options here, + // err on the safe side, and check the size with ignoreUndefined: false. + ignoreUndefined: false + }); + // Throw error if the doc is bigger than the max BSON size + if (bsonSize >= this.s.maxBsonObjectSize) + // TODO(NODE-3483): Change this to MongoBSONError + throw new error_1.MongoInvalidArgumentError(`Document is larger than the maximum size ${this.s.maxBsonObjectSize}`); + // Create a new batch object if we don't have a current one + if (this.s.currentBatch == null) { + this.s.currentBatch = new common_1.Batch(batchType, this.s.currentIndex); + } + const maxKeySize = this.s.maxKeySize; + // Check if we need to create a new batch + if ( + // New batch if we exceed the max batch op size + this.s.currentBatchSize + 1 >= this.s.maxWriteBatchSize || + // New batch if we exceed the maxBatchSizeBytes. 
Only matters if batch already has a doc, + // since we can't sent an empty batch + (this.s.currentBatchSize > 0 && + this.s.currentBatchSizeBytes + maxKeySize + bsonSize >= this.s.maxBatchSizeBytes) || + // New batch if the new op does not have the same op type as the current batch + this.s.currentBatch.batchType !== batchType) { + // Save the batch to the execution stack + this.s.batches.push(this.s.currentBatch); + // Create a new batch + this.s.currentBatch = new common_1.Batch(batchType, this.s.currentIndex); + // Reset the current size trackers + this.s.currentBatchSize = 0; + this.s.currentBatchSizeBytes = 0; + } + if (batchType === common_1.BatchType.INSERT) { + this.s.bulkResult.insertedIds.push({ + index: this.s.currentIndex, + _id: document._id + }); + } + // We have an array of documents + if (Array.isArray(document)) { + throw new error_1.MongoInvalidArgumentError('Operation passed in cannot be an Array'); + } + this.s.currentBatch.originalIndexes.push(this.s.currentIndex); + this.s.currentBatch.operations.push(document); + this.s.currentBatchSize += 1; + this.s.currentBatchSizeBytes += maxKeySize + bsonSize; + this.s.currentIndex += 1; + return this; + } +} +exports.OrderedBulkOperation = OrderedBulkOperation; +//# sourceMappingURL=ordered.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/bulk/ordered.js.map b/node_modules/mongodb/lib/bulk/ordered.js.map new file mode 100644 index 000000000..2ad7c9d44 --- /dev/null +++ b/node_modules/mongodb/lib/bulk/ordered.js.map @@ -0,0 +1 @@ +{"version":3,"file":"ordered.js","sourceRoot":"","sources":["../../src/bulk/ordered.ts"],"names":[],"mappings":";;;AAAA,gCAAgC;AAChC,qCAAiF;AAKjF,oCAAqD;AAErD,cAAc;AACd,MAAa,oBAAqB,SAAQ,0BAAiB;IACzD,YAAY,UAAsB,EAAE,OAAyB;QAC3D,KAAK,CAAC,UAAU,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;IACnC,CAAC;IAED,mBAAmB,CACjB,SAAoB,EACpB,QAAsD;QAEtD,mBAAmB;QACnB,MAAM,QAAQ,GAAG,IAAI,CAAC,mBAAmB,CAAC,QAAQ,EAAE;YAClD,SAAS,EAAE,KAAK;YAChB,oEAAoE;YACpE,wEAAwE;YACxE,eAAe,EAAE,KAAK;SAChB,CAAC,CAAC;QAEV,0DAA0D;QAC1D,IAAI,QAAQ,IAAI,IAAI,CAAC,CAAC,CAAC,iBAAiB;YACtC,iDAAiD;YACjD,MAAM,IAAI,iCAAyB,CACjC,4CAA4C,IAAI,CAAC,CAAC,CAAC,iBAAiB,EAAE,CACvE,CAAC;QAEJ,2DAA2D;QAC3D,IAAI,IAAI,CAAC,CAAC,CAAC,YAAY,IAAI,IAAI,EAAE;YAC/B,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,cAAK,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;SACjE;QAED,MAAM,UAAU,GAAG,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC;QAErC,yCAAyC;QACzC;QACE,+CAA+C;QAC/C,IAAI,CAAC,CAAC,CAAC,gBAAgB,GAAG,CAAC,IAAI,IAAI,CAAC,CAAC,CAAC,iBAAiB;YACvD,yFAAyF;YACzF,qCAAqC;YACrC,CAAC,IAAI,CAAC,CAAC,CAAC,gBAAgB,GAAG,CAAC;gBAC1B,IAAI,CAAC,CAAC,CAAC,qBAAqB,GAAG,UAAU,GAAG,QAAQ,IAAI,IAAI,CAAC,CAAC,CAAC,iBAAiB,CAAC;YACnF,8EAA8E;YAC9E,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,SAAS,KAAK,SAAS,EAC3C;YACA,wCAAwC;YACxC,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;YAEzC,qBAAqB;YACrB,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,cAAK,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;YAEhE,kCAAkC;YAClC,IAAI,CAAC,CAAC,CAAC,gBAAgB,GAAG,CAAC,CAAC;YAC5B,IAAI,CAAC,CAAC,CAAC,qBAAqB,GAAG,CAAC,CAAC;SAClC;QAED,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YAClC,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,IAAI,CAAC;gBACjC,KAAK,EAAE,IAAI,CAAC,CAAC,CAAC,YAAY;gBAC1B,GAAG,EAAG,QAAqB,CAAC,GAAG;aAChC,CAAC,CAAC;SACJ;QAED,gCAAgC;QAChC,IAAI,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC3B,MAAM,IAAI,iCAAyB,CAAC,wCAAwC,CAAC,CAAC;SAC/E;QAED,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;QAC9D,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,UAAU,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QAC9C,IAAI,CAAC,CAAC,CAAC,gBAAgB,IAAI,CAAC,CAAC;QAC7B,IAA
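The `lib/bulk/ordered.js` listing above appends every queued document to a single `currentBatch`, starting a new batch whenever the operation type changes or the `maxWriteBatchSize` / `maxBatchSizeBytes` limits would be exceeded, and those batches then run strictly in order. One consequence worth noting — again only an illustrative sketch with invented data and a hypothetical `items` collection handle — is that an ordered bulk op stops at the first write error and reports the partial counts on the resulting `MongoBulkWriteError`:

```js
// Illustrative sketch: the "items" collection handle and the duplicate _id are
// invented to show how an ordered bulk op surfaces MongoBulkWriteError.
const { MongoBulkWriteError } = require('mongodb');

async function orderedStopsOnFirstError(items) {
  const bulk = items.initializeOrderedBulkOp();
  bulk.insert({ _id: 1, name: 'a' });
  bulk.insert({ _id: 1, name: 'duplicate key, fails' });
  bulk.insert({ _id: 2, name: 'not attempted in ordered mode' });

  try {
    await bulk.execute();
  } catch (err) {
    if (err instanceof MongoBulkWriteError) {
      // err.result is the partial BulkWriteResult; err.writeErrors lists the failures.
      console.log('inserted before failure:', err.result.insertedCount);
      console.log('first error code:', err.writeErrors[0].code);
    } else {
      throw err;
    }
  }
}
```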
I,CAAC,CAAC,CAAC,qBAAqB,IAAI,UAAU,GAAG,QAAQ,CAAC;QACtD,IAAI,CAAC,CAAC,CAAC,YAAY,IAAI,CAAC,CAAC;QACzB,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AAxED,oDAwEC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/bulk/unordered.js b/node_modules/mongodb/lib/bulk/unordered.js new file mode 100644 index 000000000..e65657cfa --- /dev/null +++ b/node_modules/mongodb/lib/bulk/unordered.js @@ -0,0 +1,91 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.UnorderedBulkOperation = void 0; +const BSON = require("../bson"); +const common_1 = require("./common"); +const error_1 = require("../error"); +/** @public */ +class UnorderedBulkOperation extends common_1.BulkOperationBase { + constructor(collection, options) { + super(collection, options, false); + } + handleWriteError(callback, writeResult) { + if (this.s.batches.length) { + return false; + } + return super.handleWriteError(callback, writeResult); + } + addToOperationsList(batchType, document) { + // Get the bsonSize + const bsonSize = BSON.calculateObjectSize(document, { + checkKeys: false, + // Since we don't know what the user selected for BSON options here, + // err on the safe side, and check the size with ignoreUndefined: false. + ignoreUndefined: false + }); + // Throw error if the doc is bigger than the max BSON size + if (bsonSize >= this.s.maxBsonObjectSize) { + // TODO(NODE-3483): Change this to MongoBSONError + throw new error_1.MongoInvalidArgumentError(`Document is larger than the maximum size ${this.s.maxBsonObjectSize}`); + } + // Holds the current batch + this.s.currentBatch = undefined; + // Get the right type of batch + if (batchType === common_1.BatchType.INSERT) { + this.s.currentBatch = this.s.currentInsertBatch; + } + else if (batchType === common_1.BatchType.UPDATE) { + this.s.currentBatch = this.s.currentUpdateBatch; + } + else if (batchType === common_1.BatchType.DELETE) { + this.s.currentBatch = this.s.currentRemoveBatch; + } + const maxKeySize = this.s.maxKeySize; + // Create a new batch object if we don't have a current one + if (this.s.currentBatch == null) { + this.s.currentBatch = new common_1.Batch(batchType, this.s.currentIndex); + } + // Check if we need to create a new batch + if ( + // New batch if we exceed the max batch op size + this.s.currentBatch.size + 1 >= this.s.maxWriteBatchSize || + // New batch if we exceed the maxBatchSizeBytes. 
Only matters if batch already has a doc, + // since we can't sent an empty batch + (this.s.currentBatch.size > 0 && + this.s.currentBatch.sizeBytes + maxKeySize + bsonSize >= this.s.maxBatchSizeBytes) || + // New batch if the new op does not have the same op type as the current batch + this.s.currentBatch.batchType !== batchType) { + // Save the batch to the execution stack + this.s.batches.push(this.s.currentBatch); + // Create a new batch + this.s.currentBatch = new common_1.Batch(batchType, this.s.currentIndex); + } + // We have an array of documents + if (Array.isArray(document)) { + throw new error_1.MongoInvalidArgumentError('Operation passed in cannot be an Array'); + } + this.s.currentBatch.operations.push(document); + this.s.currentBatch.originalIndexes.push(this.s.currentIndex); + this.s.currentIndex = this.s.currentIndex + 1; + // Save back the current Batch to the right type + if (batchType === common_1.BatchType.INSERT) { + this.s.currentInsertBatch = this.s.currentBatch; + this.s.bulkResult.insertedIds.push({ + index: this.s.bulkResult.insertedIds.length, + _id: document._id + }); + } + else if (batchType === common_1.BatchType.UPDATE) { + this.s.currentUpdateBatch = this.s.currentBatch; + } + else if (batchType === common_1.BatchType.DELETE) { + this.s.currentRemoveBatch = this.s.currentBatch; + } + // Update current batch size + this.s.currentBatch.size += 1; + this.s.currentBatch.sizeBytes += maxKeySize + bsonSize; + return this; + } +} +exports.UnorderedBulkOperation = UnorderedBulkOperation; +//# sourceMappingURL=unordered.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/bulk/unordered.js.map b/node_modules/mongodb/lib/bulk/unordered.js.map new file mode 100644 index 000000000..7e970f70e --- /dev/null +++ b/node_modules/mongodb/lib/bulk/unordered.js.map @@ -0,0 +1 @@ 
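By contrast, `lib/bulk/unordered.js` above keeps separate `currentInsertBatch` / `currentUpdateBatch` / `currentRemoveBatch` queues and overrides `handleWriteError` so that remaining batches still run after an individual write error. The same path is reachable through `collection.bulkWrite()` with `ordered: false`; a hedged sketch with hypothetical documents:

```js
// Illustrative sketch: demonstrates the unordered path that
// UnorderedBulkOperation above implements. Documents are hypothetical.
async function unorderedContinuesPastErrors(items) {
  try {
    await items.bulkWrite(
      [
        { insertOne: { document: { _id: 10, name: 'a' } } },
        { insertOne: { document: { _id: 10, name: 'duplicate key, fails' } } },
        { insertOne: { document: { _id: 11, name: 'still attempted' } } },
        { updateOne: { filter: { _id: 11 }, update: { $set: { done: true } } } }
      ],
      { ordered: false } // routes through UnorderedBulkOperation
    );
  } catch (err) {
    // With ordered: false the operations after the duplicate are still executed,
    // so insertedCount is expected to be 2 here rather than 1.
    console.log(err.result.insertedCount, err.result.modifiedCount);
  }
}
```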
+{"version":3,"file":"unordered.js","sourceRoot":"","sources":["../../src/bulk/unordered.ts"],"names":[],"mappings":";;;AAAA,gCAAgC;AAChC,qCAAkG;AAMlG,oCAAqD;AAErD,cAAc;AACd,MAAa,sBAAuB,SAAQ,0BAAiB;IAC3D,YAAY,UAAsB,EAAE,OAAyB;QAC3D,KAAK,CAAC,UAAU,EAAE,OAAO,EAAE,KAAK,CAAC,CAAC;IACpC,CAAC;IAED,gBAAgB,CAAC,QAAkB,EAAE,WAA4B;QAC/D,IAAI,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,MAAM,EAAE;YACzB,OAAO,KAAK,CAAC;SACd;QAED,OAAO,KAAK,CAAC,gBAAgB,CAAC,QAAQ,EAAE,WAAW,CAAC,CAAC;IACvD,CAAC;IAED,mBAAmB,CACjB,SAAoB,EACpB,QAAsD;QAEtD,mBAAmB;QACnB,MAAM,QAAQ,GAAG,IAAI,CAAC,mBAAmB,CAAC,QAAQ,EAAE;YAClD,SAAS,EAAE,KAAK;YAEhB,oEAAoE;YACpE,wEAAwE;YACxE,eAAe,EAAE,KAAK;SAChB,CAAC,CAAC;QAEV,0DAA0D;QAC1D,IAAI,QAAQ,IAAI,IAAI,CAAC,CAAC,CAAC,iBAAiB,EAAE;YACxC,iDAAiD;YACjD,MAAM,IAAI,iCAAyB,CACjC,4CAA4C,IAAI,CAAC,CAAC,CAAC,iBAAiB,EAAE,CACvE,CAAC;SACH;QAED,0BAA0B;QAC1B,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,SAAS,CAAC;QAChC,8BAA8B;QAC9B,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YAClC,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC;SACjD;aAAM,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YACzC,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC;SACjD;aAAM,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YACzC,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,CAAC,CAAC,CAAC,kBAAkB,CAAC;SACjD;QAED,MAAM,UAAU,GAAG,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC;QAErC,2DAA2D;QAC3D,IAAI,IAAI,CAAC,CAAC,CAAC,YAAY,IAAI,IAAI,EAAE;YAC/B,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,cAAK,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;SACjE;QAED,yCAAyC;QACzC;QACE,+CAA+C;QAC/C,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,IAAI,GAAG,CAAC,IAAI,IAAI,CAAC,CAAC,CAAC,iBAAiB;YACxD,yFAAyF;YACzF,qCAAqC;YACrC,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,IAAI,GAAG,CAAC;gBAC3B,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,SAAS,GAAG,UAAU,GAAG,QAAQ,IAAI,IAAI,CAAC,CAAC,CAAC,iBAAiB,CAAC;YACpF,8EAA8E;YAC9E,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,SAAS,KAAK,SAAS,EAC3C;YACA,wCAAwC;YACxC,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;YAEzC,qBAAqB;YACrB,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,cAAK,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;SACjE;QAED,gCAAgC;QAChC,IAAI,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC3B,MAAM,IAAI,iCAAyB,CAAC,wCAAwC,CAAC,CAAC;SAC/E;QAED,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,UAAU,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QAC9C,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC;QAC9D,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,IAAI,CAAC,CAAC,CAAC,YAAY,GAAG,CAAC,CAAC;QAE9C,gDAAgD;QAChD,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YAClC,IAAI,CAAC,CAAC,CAAC,kBAAkB,GAAG,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;YAChD,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,IAAI,CAAC;gBACjC,KAAK,EAAE,IAAI,CAAC,CAAC,CAAC,UAAU,CAAC,WAAW,CAAC,MAAM;gBAC3C,GAAG,EAAG,QAAqB,CAAC,GAAG;aAChC,CAAC,CAAC;SACJ;aAAM,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YACzC,IAAI,CAAC,CAAC,CAAC,kBAAkB,GAAG,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;SACjD;aAAM,IAAI,SAAS,KAAK,kBAAS,CAAC,MAAM,EAAE;YACzC,IAAI,CAAC,CAAC,CAAC,kBAAkB,GAAG,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;SACjD;QAED,4BAA4B;QAC5B,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,IAAI,IAAI,CAAC,CAAC;QAC9B,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC,SAAS,IAAI,UAAU,GAAG,QAAQ,CAAC;QAEvD,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AAlGD,wDAkGC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/change_stream.js b/node_modules/mongodb/lib/change_stream.js new file mode 100644 index 000000000..1a846638a --- /dev/null +++ b/node_modules/mongodb/lib/change_stream.js @@ -0,0 +1,495 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ChangeStreamCursor = exports.ChangeStream = void 0; +const Denque = require("denque"); +const error_1 = require("./error"); +const 
aggregate_1 = require("./operations/aggregate"); +const utils_1 = require("./utils"); +const mongo_client_1 = require("./mongo_client"); +const db_1 = require("./db"); +const collection_1 = require("./collection"); +const abstract_cursor_1 = require("./cursor/abstract_cursor"); +const execute_operation_1 = require("./operations/execute_operation"); +const mongo_types_1 = require("./mongo_types"); +/** @internal */ +const kResumeQueue = Symbol('resumeQueue'); +/** @internal */ +const kCursorStream = Symbol('cursorStream'); +/** @internal */ +const kClosed = Symbol('closed'); +/** @internal */ +const kMode = Symbol('mode'); +const CHANGE_STREAM_OPTIONS = ['resumeAfter', 'startAfter', 'startAtOperationTime', 'fullDocument']; +const CURSOR_OPTIONS = ['batchSize', 'maxAwaitTimeMS', 'collation', 'readPreference'].concat(CHANGE_STREAM_OPTIONS); +const CHANGE_DOMAIN_TYPES = { + COLLECTION: Symbol('Collection'), + DATABASE: Symbol('Database'), + CLUSTER: Symbol('Cluster') +}; +const NO_RESUME_TOKEN_ERROR = 'A change stream document has been received that lacks a resume token (_id).'; +const NO_CURSOR_ERROR = 'ChangeStream has no cursor'; +const CHANGESTREAM_CLOSED_ERROR = 'ChangeStream is closed'; +/** + * Creates a new Change Stream instance. Normally created using {@link Collection#watch|Collection.watch()}. + * @public + */ +class ChangeStream extends mongo_types_1.TypedEventEmitter { + /** + * @internal + * + * @param parent - The parent object that created this change stream + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents + */ + constructor(parent, pipeline = [], options = {}) { + super(); + this.pipeline = pipeline; + this.options = options; + if (parent instanceof collection_1.Collection) { + this.type = CHANGE_DOMAIN_TYPES.COLLECTION; + } + else if (parent instanceof db_1.Db) { + this.type = CHANGE_DOMAIN_TYPES.DATABASE; + } + else if (parent instanceof mongo_client_1.MongoClient) { + this.type = CHANGE_DOMAIN_TYPES.CLUSTER; + } + else { + throw new error_1.MongoChangeStreamError('Parent provided to ChangeStream constructor must be an instance of Collection, Db, or MongoClient'); + } + this.parent = parent; + this.namespace = parent.s.namespace; + if (!this.options.readPreference && parent.readPreference) { + this.options.readPreference = parent.readPreference; + } + this[kResumeQueue] = new Denque(); + // Create contained Change Stream cursor + this.cursor = createChangeStreamCursor(this, options); + this[kClosed] = false; + this[kMode] = false; + // Listen for any `change` listeners being added to ChangeStream + this.on('newListener', eventName => { + if (eventName === 'change' && this.cursor && this.listenerCount('change') === 0) { + streamEvents(this, this.cursor); + } + }); + this.on('removeListener', eventName => { + var _a; + if (eventName === 'change' && this.listenerCount('change') === 0 && this.cursor) { + (_a = this[kCursorStream]) === null || _a === void 0 ? void 0 : _a.removeAllListeners('data'); + } + }); + } + /** @internal */ + get cursorStream() { + return this[kCursorStream]; + } + /** The cached resume token that is used to resume after the most recently returned change. */ + get resumeToken() { + var _a; + return (_a = this.cursor) === null || _a === void 0 ? 
void 0 : _a.resumeToken; + } + /** Check if there is any document still available in the Change Stream */ + hasNext(callback) { + setIsIterator(this); + return utils_1.maybePromise(callback, cb => { + getCursor(this, (err, cursor) => { + if (err || !cursor) + return cb(err); // failed to resume, raise an error + cursor.hasNext(cb); + }); + }); + } + next(callback) { + setIsIterator(this); + return utils_1.maybePromise(callback, cb => { + getCursor(this, (err, cursor) => { + if (err || !cursor) + return cb(err); // failed to resume, raise an error + cursor.next((error, change) => { + if (error) { + this[kResumeQueue].push(() => this.next(cb)); + processError(this, error, cb); + return; + } + processNewChange(this, change, cb); + }); + }); + }); + } + /** Is the cursor closed */ + get closed() { + var _a, _b; + return this[kClosed] || ((_b = (_a = this.cursor) === null || _a === void 0 ? void 0 : _a.closed) !== null && _b !== void 0 ? _b : false); + } + /** Close the Change Stream */ + close(callback) { + this[kClosed] = true; + return utils_1.maybePromise(callback, cb => { + if (!this.cursor) { + return cb(); + } + const cursor = this.cursor; + return cursor.close(err => { + endStream(this); + this.cursor = undefined; + return cb(err); + }); + }); + } + /** + * Return a modified Readable stream including a possible transform method. + * @throws MongoDriverError if this.cursor is undefined + */ + stream(options) { + this.streamOptions = options; + if (!this.cursor) + throw new error_1.MongoChangeStreamError(NO_CURSOR_ERROR); + return this.cursor.stream(options); + } + tryNext(callback) { + setIsIterator(this); + return utils_1.maybePromise(callback, cb => { + getCursor(this, (err, cursor) => { + if (err || !cursor) + return cb(err); // failed to resume, raise an error + return cursor.tryNext(cb); + }); + }); + } +} +exports.ChangeStream = ChangeStream; +/** @event */ +ChangeStream.RESPONSE = 'response'; +/** @event */ +ChangeStream.MORE = 'more'; +/** @event */ +ChangeStream.INIT = 'init'; +/** @event */ +ChangeStream.CLOSE = 'close'; +/** + * Fired for each new matching change in the specified namespace. Attaching a `change` + * event listener to a Change Stream will switch the stream into flowing mode. Data will + * then be passed as soon as it is available. + * @event + */ +ChangeStream.CHANGE = 'change'; +/** @event */ +ChangeStream.END = 'end'; +/** @event */ +ChangeStream.ERROR = 'error'; +/** + * Emitted each time the change stream stores a new resume token. 
+ * @event + */ +ChangeStream.RESUME_TOKEN_CHANGED = 'resumeTokenChanged'; +/** @internal */ +class ChangeStreamCursor extends abstract_cursor_1.AbstractCursor { + constructor(topology, namespace, pipeline = [], options = {}) { + super(topology, namespace, options); + this.pipeline = pipeline; + this.options = options; + this._resumeToken = null; + this.startAtOperationTime = options.startAtOperationTime; + if (options.startAfter) { + this.resumeToken = options.startAfter; + } + else if (options.resumeAfter) { + this.resumeToken = options.resumeAfter; + } + } + set resumeToken(token) { + this._resumeToken = token; + this.emit(ChangeStream.RESUME_TOKEN_CHANGED, token); + } + get resumeToken() { + return this._resumeToken; + } + get resumeOptions() { + const result = {}; + for (const optionName of CURSOR_OPTIONS) { + if (Reflect.has(this.options, optionName)) { + Reflect.set(result, optionName, Reflect.get(this.options, optionName)); + } + } + if (this.resumeToken || this.startAtOperationTime) { + ['resumeAfter', 'startAfter', 'startAtOperationTime'].forEach(key => Reflect.deleteProperty(result, key)); + if (this.resumeToken) { + const resumeKey = this.options.startAfter && !this.hasReceived ? 'startAfter' : 'resumeAfter'; + Reflect.set(result, resumeKey, this.resumeToken); + } + else if (this.startAtOperationTime && utils_1.maxWireVersion(this.server) >= 7) { + result.startAtOperationTime = this.startAtOperationTime; + } + } + return result; + } + cacheResumeToken(resumeToken) { + if (this.bufferedCount() === 0 && this.postBatchResumeToken) { + this.resumeToken = this.postBatchResumeToken; + } + else { + this.resumeToken = resumeToken; + } + this.hasReceived = true; + } + _processBatch(batchName, response) { + const cursor = (response === null || response === void 0 ? 
void 0 : response.cursor) || {}; + if (cursor.postBatchResumeToken) { + this.postBatchResumeToken = cursor.postBatchResumeToken; + if (cursor[batchName].length === 0) { + this.resumeToken = cursor.postBatchResumeToken; + } + } + } + clone() { + return new ChangeStreamCursor(this.topology, this.namespace, this.pipeline, { + ...this.cursorOptions + }); + } + _initialize(session, callback) { + const aggregateOperation = new aggregate_1.AggregateOperation(this.namespace, this.pipeline, { + ...this.cursorOptions, + ...this.options, + session + }); + execute_operation_1.executeOperation(this.topology, aggregateOperation, (err, response) => { + if (err || response == null) { + return callback(err); + } + const server = aggregateOperation.server; + if (this.startAtOperationTime == null && + this.resumeAfter == null && + this.startAfter == null && + utils_1.maxWireVersion(server) >= 7) { + this.startAtOperationTime = response.operationTime; + } + this._processBatch('firstBatch', response); + this.emit(ChangeStream.INIT, response); + this.emit(ChangeStream.RESPONSE); + // TODO: NODE-2882 + callback(undefined, { server, session, response }); + }); + } + _getMore(batchSize, callback) { + super._getMore(batchSize, (err, response) => { + if (err) { + return callback(err); + } + this._processBatch('nextBatch', response); + this.emit(ChangeStream.MORE, response); + this.emit(ChangeStream.RESPONSE); + callback(err, response); + }); + } +} +exports.ChangeStreamCursor = ChangeStreamCursor; +const CHANGE_STREAM_EVENTS = [ + ChangeStream.RESUME_TOKEN_CHANGED, + ChangeStream.END, + ChangeStream.CLOSE +]; +function setIsEmitter(changeStream) { + if (changeStream[kMode] === 'iterator') { + // TODO(NODE-3485): Replace with MongoChangeStreamModeError + throw new error_1.MongoAPIError('ChangeStream cannot be used as an EventEmitter after being used as an iterator'); + } + changeStream[kMode] = 'emitter'; +} +function setIsIterator(changeStream) { + if (changeStream[kMode] === 'emitter') { + // TODO(NODE-3485): Replace with MongoChangeStreamModeError + throw new error_1.MongoAPIError('ChangeStream cannot be used as an iterator after being used as an EventEmitter'); + } + changeStream[kMode] = 'iterator'; +} +/** + * Create a new change stream cursor based on self's configuration + * @internal + */ +function createChangeStreamCursor(changeStream, options) { + const changeStreamStageOptions = { fullDocument: options.fullDocument || 'default' }; + applyKnownOptions(changeStreamStageOptions, options, CHANGE_STREAM_OPTIONS); + if (changeStream.type === CHANGE_DOMAIN_TYPES.CLUSTER) { + changeStreamStageOptions.allChangesForCluster = true; + } + const pipeline = [{ $changeStream: changeStreamStageOptions }].concat(changeStream.pipeline); + const cursorOptions = applyKnownOptions({}, options, CURSOR_OPTIONS); + const changeStreamCursor = new ChangeStreamCursor(utils_1.getTopology(changeStream.parent), changeStream.namespace, pipeline, cursorOptions); + for (const event of CHANGE_STREAM_EVENTS) { + changeStreamCursor.on(event, e => changeStream.emit(event, e)); + } + if (changeStream.listenerCount(ChangeStream.CHANGE) > 0) { + streamEvents(changeStream, changeStreamCursor); + } + return changeStreamCursor; +} +function applyKnownOptions(target, source, optionNames) { + optionNames.forEach(name => { + if (source[name]) { + target[name] = source[name]; + } + }); + return target; +} +// This method performs a basic server selection loop, satisfying the requirements of +// ChangeStream resumability until the new SDAM layer can be 
used. +const SELECTION_TIMEOUT = 30000; +function waitForTopologyConnected(topology, options, callback) { + setTimeout(() => { + if (options && options.start == null) { + options.start = utils_1.now(); + } + const start = options.start || utils_1.now(); + const timeout = options.timeout || SELECTION_TIMEOUT; + if (topology.isConnected()) { + return callback(); + } + if (utils_1.calculateDurationInMs(start) > timeout) { + // TODO(NODE-3497): Replace with MongoNetworkTimeoutError + return callback(new error_1.MongoRuntimeError('Timed out waiting for connection')); + } + waitForTopologyConnected(topology, options, callback); + }, 500); // this is an arbitrary wait time to allow SDAM to transition +} +function closeWithError(changeStream, error, callback) { + if (!callback) { + changeStream.emit(ChangeStream.ERROR, error); + } + changeStream.close(() => callback && callback(error)); +} +function streamEvents(changeStream, cursor) { + setIsEmitter(changeStream); + const stream = changeStream[kCursorStream] || cursor.stream(); + changeStream[kCursorStream] = stream; + stream.on('data', change => processNewChange(changeStream, change)); + stream.on('error', error => processError(changeStream, error)); +} +function endStream(changeStream) { + const cursorStream = changeStream[kCursorStream]; + if (cursorStream) { + ['data', 'close', 'end', 'error'].forEach(event => cursorStream.removeAllListeners(event)); + cursorStream.destroy(); + } + changeStream[kCursorStream] = undefined; +} +function processNewChange(changeStream, change, callback) { + var _a; + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + if (callback) + callback(new error_1.MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + // a null change means the cursor has been notified, implicitly closing the change stream + if (change == null) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + return closeWithError(changeStream, new error_1.MongoRuntimeError(CHANGESTREAM_CLOSED_ERROR), callback); + } + if (change && !change._id) { + return closeWithError(changeStream, new error_1.MongoChangeStreamError(NO_RESUME_TOKEN_ERROR), callback); + } + // cache the resume token + (_a = changeStream.cursor) === null || _a === void 0 ? void 0 : _a.cacheResumeToken(change._id); + // wipe the startAtOperationTime if there was one so that there won't be a conflict + // between resumeToken and startAtOperationTime if we need to reconnect the cursor + changeStream.options.startAtOperationTime = undefined; + // Return the change + if (!callback) + return changeStream.emit(ChangeStream.CHANGE, change); + return callback(undefined, change); +} +function processError(changeStream, error, callback) { + const cursor = changeStream.cursor; + // If the change stream has been closed explicitly, do not process error. 
+ if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + if (callback) + callback(new error_1.MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + // if the resume succeeds, continue with the new cursor + function resumeWithCursor(newCursor) { + changeStream.cursor = newCursor; + processResumeQueue(changeStream); + } + // otherwise, raise an error and close the change stream + function unresumableError(err) { + if (!callback) { + changeStream.emit(ChangeStream.ERROR, err); + } + changeStream.close(() => processResumeQueue(changeStream, err)); + } + if (cursor && error_1.isResumableError(error, utils_1.maxWireVersion(cursor.server))) { + changeStream.cursor = undefined; + // stop listening to all events from old cursor + endStream(changeStream); + // close internal cursor, ignore errors + cursor.close(); + const topology = utils_1.getTopology(changeStream.parent); + waitForTopologyConnected(topology, { readPreference: cursor.readPreference }, err => { + // if the topology can't reconnect, close the stream + if (err) + return unresumableError(err); + // create a new cursor, preserving the old cursor's options + const newCursor = createChangeStreamCursor(changeStream, cursor.resumeOptions); + // attempt to continue in emitter mode + if (!callback) + return resumeWithCursor(newCursor); + // attempt to continue in iterator mode + newCursor.hasNext(err => { + // if there's an error immediately after resuming, close the stream + if (err) + return unresumableError(err); + resumeWithCursor(newCursor); + }); + }); + return; + } + // if initial error wasn't resumable, raise an error and close the change stream + return closeWithError(changeStream, error, callback); +} +/** + * Safely provides a cursor across resume attempts + * + * @param changeStream - the parent ChangeStream + */ +function getCursor(changeStream, callback) { + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + callback(new error_1.MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + // if a cursor exists and it is open, return it + if (changeStream.cursor) { + callback(undefined, changeStream.cursor); + return; + } + // no cursor, queue callback until topology reconnects + changeStream[kResumeQueue].push(callback); +} +/** + * Drain the resume queue when a new cursor has become available + * + * @param changeStream - the parent ChangeStream + * @param err - error getting a new cursor + */ +function processResumeQueue(changeStream, err) { + while (changeStream[kResumeQueue].length) { + const request = changeStream[kResumeQueue].pop(); + if (!request) + break; // Should never occur but TS can't use the length check in the while condition + if (!err) { + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + request(new error_1.MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + if (!changeStream.cursor) { + request(new error_1.MongoChangeStreamError(NO_CURSOR_ERROR)); + return; + } + } + request(err, changeStream.cursor); + } +} +//# sourceMappingURL=change_stream.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/change_stream.js.map b/node_modules/mongodb/lib/change_stream.js.map new file mode 100644 index 000000000..81449be23 --- /dev/null +++ b/node_modules/mongodb/lib/change_stream.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"change_stream.js","sourceRoot":"","sources":["../src/change_stream.ts"],"names":[],"mappings":";;;AAAA,iCAAkC;AAClC,mCAOiB;AACjB,sDAA8E;AAC9E,mCAQiB;AAKjB,iDAA6C;AAC7C,6BAA0B;AAC1B,6CAA0C;AAE1C,8DAKkC;AAElC,sEAAmF;AACnF,+CAAyE;AAEzE,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,aAAa,GAAG,MAAM,CAAC,cAAc,CAAC,CAAC;AAC7C,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,KAAK,GAAG,MAAM,CAAC,MAAM,CAAC,CAAC;AAE7B,MAAM,qBAAqB,GAAG,CAAC,aAAa,EAAE,YAAY,EAAE,sBAAsB,EAAE,cAAc,CAAC,CAAC;AACpG,MAAM,cAAc,GAAG,CAAC,WAAW,EAAE,gBAAgB,EAAE,WAAW,EAAE,gBAAgB,CAAC,CAAC,MAAM,CAC1F,qBAAqB,CACtB,CAAC;AAEF,MAAM,mBAAmB,GAAG;IAC1B,UAAU,EAAE,MAAM,CAAC,YAAY,CAAC;IAChC,QAAQ,EAAE,MAAM,CAAC,UAAU,CAAC;IAC5B,OAAO,EAAE,MAAM,CAAC,SAAS,CAAC;CAC3B,CAAC;AAEF,MAAM,qBAAqB,GACzB,6EAA6E,CAAC;AAChF,MAAM,eAAe,GAAG,4BAA4B,CAAC;AACrD,MAAM,yBAAyB,GAAG,wBAAwB,CAAC;AAwI3D;;;GAGG;AACH,MAAa,YAEX,SAAQ,+BAAqC;IA2C7C;;;;;OAKG;IACH,YACE,MAAuB,EACvB,WAAuB,EAAE,EACzB,UAA+B,EAAE;QAEjC,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QAEvB,IAAI,MAAM,YAAY,uBAAU,EAAE;YAChC,IAAI,CAAC,IAAI,GAAG,mBAAmB,CAAC,UAAU,CAAC;SAC5C;aAAM,IAAI,MAAM,YAAY,OAAE,EAAE;YAC/B,IAAI,CAAC,IAAI,GAAG,mBAAmB,CAAC,QAAQ,CAAC;SAC1C;aAAM,IAAI,MAAM,YAAY,0BAAW,EAAE;YACxC,IAAI,CAAC,IAAI,GAAG,mBAAmB,CAAC,OAAO,CAAC;SACzC;aAAM;YACL,MAAM,IAAI,8BAAsB,CAC9B,mGAAmG,CACpG,CAAC;SACH;QAED,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QACrB,IAAI,CAAC,SAAS,GAAG,MAAM,CAAC,CAAC,CAAC,SAAS,CAAC;QACpC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,cAAc,IAAI,MAAM,CAAC,cAAc,EAAE;YACzD,IAAI,CAAC,OAAO,CAAC,cAAc,GAAG,MAAM,CAAC,cAAc,CAAC;SACrD;QAED,IAAI,CAAC,YAAY,CAAC,GAAG,IAAI,MAAM,EAAE,CAAC;QAElC,wCAAwC;QACxC,IAAI,CAAC,MAAM,GAAG,wBAAwB,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC;QAEtD,IAAI,CAAC,OAAO,CAAC,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,KAAK,CAAC,GAAG,KAAK,CAAC;QAEpB,gEAAgE;QAChE,IAAI,CAAC,EAAE,CAAC,aAAa,EAAE,SAAS,CAAC,EAAE;YACjC,IAAI,SAAS,KAAK,QAAQ,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,aAAa,CAAC,QAAQ,CAAC,KAAK,CAAC,EAAE;gBAC/E,YAAY,CAAC,IAAI,EAAE,IAAI,CAAC,MAAM,CAAC,CAAC;aACjC;QACH,CAAC,CAAC,CAAC;QAEH,IAAI,CAAC,EAAE,CAAC,gBAAgB,EAAE,SAAS,CAAC,EAAE;;YACpC,IAAI,SAAS,KAAK,QAAQ,IAAI,IAAI,CAAC,aAAa,CAAC,QAAQ,CAAC,KAAK,CAAC,IAAI,IAAI,CAAC,MAAM,EAAE;gBAC/E,MAAA,IAAI,CAAC,aAAa,CAAC,0CAAE,kBAAkB,CAAC,MAAM,CAAC,CAAC;aACjD;QACH,CAAC,CAAC,CAAC;IACL,CAAC;IAED,gBAAgB;IAChB,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,aAAa,CAAC,CAAC;IAC7B,CAAC;IAED,8FAA8F;IAC9F,IAAI,WAAW;;QACb,OAAO,MAAA,IAAI,CAAC,MAAM,0CAAE,WAAW,CAAC;IAClC,CAAC;IAED,0EAA0E;IAC1E,OAAO,CAAC,QAAmB;QACzB,aAAa,CAAC,IAAI,CAAC,CAAC;QACpB,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE;YACjC,SAAS,CAAC,IAAI,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;gBAC9B,IAAI,GAAG,IAAI,CAAC,MAAM;oBAAE,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC,CAAC,mCAAmC;gBACvE,MAAM,CAAC,OAAO,CAAC,EAAE,CAAC,CAAC;YACrB,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;IAKD,IAAI,CACF,QAAkD;QAElD,aAAa,CAAC,IAAI,CAAC,CAAC;QACpB,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE;YACjC,SAAS,CAAC,IAAI,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;gBAC9B,IAAI,GAAG,IAAI,CAAC,MAAM;oBAAE,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC,CAAC,mCAAmC;gBACvE,MAAM,CAAC,IAAI,CAAC,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;oBAC5B,IAAI,KAAK,EAAE;wBACT,IAAI,CAAC,YAAY,CAAC,CAAC,IAAI,CAAC,GAAG,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC;wBAC7C,YAAY,CAAC,IAAI,EAAE,KAAK,EAAE,EAAE,CAAC,CAAC;wBAC9B,OAAO;qBACR;oBACD,gBAAgB,CAAU,IAAI,EAAE,MAAM,EAAE,EAAE,CAAC,CAAC;gBAC9C,CAAC,CAAC,CAAC;YACL,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;IAED,2BAA2B;IAC3B,IAAI,MAAM;;QACR,OAAO,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,MAAA,MAAA,IAAI,CAAC,MAAM,0CAAE,MAAM,mCAAI,KAAK,
CAAC,CAAC;IACzD,CAAC;IAED,8BAA8B;IAC9B,KAAK,CAAC,QAAmB;QACvB,IAAI,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC;QAErB,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE;YACjC,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE;gBAChB,OAAO,EAAE,EAAE,CAAC;aACb;YAED,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC;YAC3B,OAAO,MAAM,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE;gBACxB,SAAS,CAAC,IAAI,CAAC,CAAC;gBAChB,IAAI,CAAC,MAAM,GAAG,SAAS,CAAC;gBACxB,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;YACjB,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;IAED;;;OAGG;IACH,MAAM,CAAC,OAA6B;QAClC,IAAI,CAAC,aAAa,GAAG,OAAO,CAAC;QAC7B,IAAI,CAAC,IAAI,CAAC,MAAM;YAAE,MAAM,IAAI,8BAAsB,CAAC,eAAe,CAAC,CAAC;QACpE,OAAO,IAAI,CAAC,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;IACrC,CAAC;IAOD,OAAO,CAAC,QAAoC;QAC1C,aAAa,CAAC,IAAI,CAAC,CAAC;QACpB,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE;YACjC,SAAS,CAAC,IAAI,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;gBAC9B,IAAI,GAAG,IAAI,CAAC,MAAM;oBAAE,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC,CAAC,mCAAmC;gBACvE,OAAO,MAAM,CAAC,OAAO,CAAC,EAAE,CAAC,CAAC;YAC5B,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;;AA9LH,oCA+LC;AA3KC,aAAa;AACG,qBAAQ,GAAG,UAAmB,CAAC;AAC/C,aAAa;AACG,iBAAI,GAAG,MAAe,CAAC;AACvC,aAAa;AACG,iBAAI,GAAG,MAAe,CAAC;AACvC,aAAa;AACG,kBAAK,GAAG,OAAgB,CAAC;AACzC;;;;;GAKG;AACa,mBAAM,GAAG,QAAiB,CAAC;AAC3C,aAAa;AACG,gBAAG,GAAG,KAAc,CAAC;AACrC,aAAa;AACG,kBAAK,GAAG,OAAgB,CAAC;AACzC;;;GAGG;AACa,iCAAoB,GAAG,oBAA6B,CAAC;AA6JvE,gBAAgB;AAChB,MAAa,kBAAwD,SAAQ,gCAG5E;IAWC,YACE,QAAkB,EAClB,SAA2B,EAC3B,WAAuB,EAAE,EACzB,UAAqC,EAAE;QAEvC,KAAK,CAAC,QAAQ,EAAE,SAAS,EAAE,OAAO,CAAC,CAAC;QAEpC,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,YAAY,GAAG,IAAI,CAAC;QACzB,IAAI,CAAC,oBAAoB,GAAG,OAAO,CAAC,oBAAoB,CAAC;QAEzD,IAAI,OAAO,CAAC,UAAU,EAAE;YACtB,IAAI,CAAC,WAAW,GAAG,OAAO,CAAC,UAAU,CAAC;SACvC;aAAM,IAAI,OAAO,CAAC,WAAW,EAAE;YAC9B,IAAI,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC;SACxC;IACH,CAAC;IAED,IAAI,WAAW,CAAC,KAAkB;QAChC,IAAI,CAAC,YAAY,GAAG,KAAK,CAAC;QAC1B,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,oBAAoB,EAAE,KAAK,CAAC,CAAC;IACtD,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,YAAY,CAAC;IAC3B,CAAC;IAED,IAAI,aAAa;QACf,MAAM,MAAM,GAAG,EAAmB,CAAC;QACnC,KAAK,MAAM,UAAU,IAAI,cAAc,EAAE;YACvC,IAAI,OAAO,CAAC,GAAG,CAAC,IAAI,CAAC,OAAO,EAAE,UAAU,CAAC,EAAE;gBACzC,OAAO,CAAC,GAAG,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,CAAC,GAAG,CAAC,IAAI,CAAC,OAAO,EAAE,UAAU,CAAC,CAAC,CAAC;aACxE;SACF;QAED,IAAI,IAAI,CAAC,WAAW,IAAI,IAAI,CAAC,oBAAoB,EAAE;YACjD,CAAC,aAAa,EAAE,YAAY,EAAE,sBAAsB,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE,CAClE,OAAO,CAAC,cAAc,CAAC,MAAM,EAAE,GAAG,CAAC,CACpC,CAAC;YAEF,IAAI,IAAI,CAAC,WAAW,EAAE;gBACpB,MAAM,SAAS,GACb,IAAI,CAAC,OAAO,CAAC,UAAU,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC,CAAC,aAAa,CAAC;gBAC9E,OAAO,CAAC,GAAG,CAAC,MAAM,EAAE,SAAS,EAAE,IAAI,CAAC,WAAW,CAAC,CAAC;aAClD;iBAAM,IAAI,IAAI,CAAC,oBAAoB,IAAI,sBAAc,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE;gBACxE,MAAM,CAAC,oBAAoB,GAAG,IAAI,CAAC,oBAAoB,CAAC;aACzD;SACF;QAED,OAAO,MAAM,CAAC;IAChB,CAAC;IAED,gBAAgB,CAAC,WAAwB;QACvC,IAAI,IAAI,CAAC,aAAa,EAAE,KAAK,CAAC,IAAI,IAAI,CAAC,oBAAoB,EAAE;YAC3D,IAAI,CAAC,WAAW,GAAG,IAAI,CAAC,oBAAoB,CAAC;SAC9C;aAAM;YACL,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;SAChC;QACD,IAAI,CAAC,WAAW,GAAG,IAAI,CAAC;IAC1B,CAAC;IAED,aAAa,CAAC,SAAiB,EAAE,QAAmB;QAClD,MAAM,MAAM,GAAG,CAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,MAAM,KAAI,EAAE,CAAC;QACtC,IAAI,MAAM,CAAC,oBAAoB,EAAE;YAC/B,IAAI,CAAC,oBAAoB,GAAG,MAAM,CAAC,oBAAoB,CAAC;YAExD,IAAI,MAAM,CAAC,SAAS,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE;gBAClC,IAAI,CAAC,WAAW,GAAG,MAAM,CAAC,oBAAoB,CAAC;aAChD;SACF;IACH,CAAC;IAED,KAAK;QACH,OAAO,IAAI,kBAAkB,CAAC,IAAI,CAAC,QAAQ,EAAE,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,QAAQ,EAAE;YAC1E,GAAG,IAAI,CAAC,aAAa;SACtB,CAAC,CAAC;IACL,CAAC;IAED,
WAAW,CAAC,OAAsB,EAAE,QAAmC;QACrE,MAAM,kBAAkB,GAAG,IAAI,8BAAkB,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,QAAQ,EAAE;YAC/E,GAAG,IAAI,CAAC,aAAa;YACrB,GAAG,IAAI,CAAC,OAAO;YACf,OAAO;SACR,CAAC,CAAC;QAEH,oCAAgB,CAAC,IAAI,CAAC,QAAQ,EAAE,kBAAkB,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YACpE,IAAI,GAAG,IAAI,QAAQ,IAAI,IAAI,EAAE;gBAC3B,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,MAAM,MAAM,GAAG,kBAAkB,CAAC,MAAM,CAAC;YACzC,IACE,IAAI,CAAC,oBAAoB,IAAI,IAAI;gBACjC,IAAI,CAAC,WAAW,IAAI,IAAI;gBACxB,IAAI,CAAC,UAAU,IAAI,IAAI;gBACvB,sBAAc,CAAC,MAAM,CAAC,IAAI,CAAC,EAC3B;gBACA,IAAI,CAAC,oBAAoB,GAAG,QAAQ,CAAC,aAAa,CAAC;aACpD;YAED,IAAI,CAAC,aAAa,CAAC,YAAY,EAAE,QAAQ,CAAC,CAAC;YAE3C,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,QAAQ,CAAC,CAAC;YACvC,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,QAAQ,CAAC,CAAC;YAEjC,kBAAkB;YAClB,QAAQ,CAAC,SAAS,EAAE,EAAE,MAAM,EAAE,OAAO,EAAE,QAAQ,EAAE,CAAC,CAAC;QACrD,CAAC,CAAC,CAAC;IACL,CAAC;IAED,QAAQ,CAAC,SAAiB,EAAE,QAAkB;QAC5C,KAAK,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAC1C,IAAI,GAAG,EAAE;gBACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,IAAI,CAAC,aAAa,CAAC,WAAW,EAAE,QAAQ,CAAC,CAAC;YAE1C,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,QAAQ,CAAC,CAAC;YACvC,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,QAAQ,CAAC,CAAC;YACjC,QAAQ,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;QAC1B,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA3ID,gDA2IC;AAED,MAAM,oBAAoB,GAAG;IAC3B,YAAY,CAAC,oBAAoB;IACjC,YAAY,CAAC,GAAG;IAChB,YAAY,CAAC,KAAK;CACnB,CAAC;AAEF,SAAS,YAAY,CAAU,YAAmC;IAChE,IAAI,YAAY,CAAC,KAAK,CAAC,KAAK,UAAU,EAAE;QACtC,2DAA2D;QAC3D,MAAM,IAAI,qBAAa,CACrB,gFAAgF,CACjF,CAAC;KACH;IACD,YAAY,CAAC,KAAK,CAAC,GAAG,SAAS,CAAC;AAClC,CAAC;AAED,SAAS,aAAa,CAAU,YAAmC;IACjE,IAAI,YAAY,CAAC,KAAK,CAAC,KAAK,SAAS,EAAE;QACrC,2DAA2D;QAC3D,MAAM,IAAI,qBAAa,CACrB,gFAAgF,CACjF,CAAC;KACH;IACD,YAAY,CAAC,KAAK,CAAC,GAAG,UAAU,CAAC;AACnC,CAAC;AACD;;;GAGG;AACH,SAAS,wBAAwB,CAC/B,YAAmC,EACnC,OAA4B;IAE5B,MAAM,wBAAwB,GAAa,EAAE,YAAY,EAAE,OAAO,CAAC,YAAY,IAAI,SAAS,EAAE,CAAC;IAC/F,iBAAiB,CAAC,wBAAwB,EAAE,OAAO,EAAE,qBAAqB,CAAC,CAAC;IAC5E,IAAI,YAAY,CAAC,IAAI,KAAK,mBAAmB,CAAC,OAAO,EAAE;QACrD,wBAAwB,CAAC,oBAAoB,GAAG,IAAI,CAAC;KACtD;IAED,MAAM,QAAQ,GAAG,CAAC,EAAE,aAAa,EAAE,wBAAwB,EAAc,CAAC,CAAC,MAAM,CAC/E,YAAY,CAAC,QAAQ,CACtB,CAAC;IAEF,MAAM,aAAa,GAAG,iBAAiB,CAAC,EAAE,EAAE,OAAO,EAAE,cAAc,CAAC,CAAC;IACrE,MAAM,kBAAkB,GAAG,IAAI,kBAAkB,CAC/C,mBAAW,CAAC,YAAY,CAAC,MAAM,CAAC,EAChC,YAAY,CAAC,SAAS,EACtB,QAAQ,EACR,aAAa,CACd,CAAC;IAEF,KAAK,MAAM,KAAK,IAAI,oBAAoB,EAAE;QACxC,kBAAkB,CAAC,EAAE,CAAC,KAAK,EAAE,CAAC,CAAC,EAAE,CAAC,YAAY,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;KAChE;IAED,IAAI,YAAY,CAAC,aAAa,CAAC,YAAY,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;QACvD,YAAY,CAAC,YAAY,EAAE,kBAAkB,CAAC,CAAC;KAChD;IAED,OAAO,kBAAkB,CAAC;AAC5B,CAAC;AAED,SAAS,iBAAiB,CAAC,MAAgB,EAAE,MAAgB,EAAE,WAAqB;IAClF,WAAW,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;QACzB,IAAI,MAAM,CAAC,IAAI,CAAC,EAAE;YAChB,MAAM,CAAC,IAAI,CAAC,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC;SAC7B;IACH,CAAC,CAAC,CAAC;IAEH,OAAO,MAAM,CAAC;AAChB,CAAC;AAOD,qFAAqF;AACrF,kEAAkE;AAClE,MAAM,iBAAiB,GAAG,KAAK,CAAC;AAChC,SAAS,wBAAwB,CAC/B,QAAkB,EAClB,OAA4B,EAC5B,QAAkB;IAElB,UAAU,CAAC,GAAG,EAAE;QACd,IAAI,OAAO,IAAI,OAAO,CAAC,KAAK,IAAI,IAAI,EAAE;YACpC,OAAO,CAAC,KAAK,GAAG,WAAG,EAAE,CAAC;SACvB;QAED,MAAM,KAAK,GAAG,OAAO,CAAC,KAAK,IAAI,WAAG,EAAE,CAAC;QACrC,MAAM,OAAO,GAAG,OAAO,CAAC,OAAO,IAAI,iBAAiB,CAAC;QACrD,IAAI,QAAQ,CAAC,WAAW,EAAE,EAAE;YAC1B,OAAO,QAAQ,EAAE,CAAC;SACnB;QAED,IAAI,6BAAqB,CAAC,KAAK,CAAC,GAAG,OAAO,EAAE;YAC1C,yDAAyD;YACzD,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,kCAAkC,CAAC,CAAC,CAAC;SAC5E;QAED,wBAAwB,CAAC,QAAQ,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IACxD,CAAC,EAAE,GAAG,CAAC,CAAC,CAAC,6DAA6D;AACxE,CAAC;AAED,SAAS,cAAc,CACrB,YAA6B,EAC7B,KAAe,EACf,QAAmB;IA
EnB,IAAI,CAAC,QAAQ,EAAE;QACb,YAAY,CAAC,IAAI,CAAC,YAAY,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;KAC9C;IAED,YAAY,CAAC,KAAK,CAAC,GAAG,EAAE,CAAC,QAAQ,IAAI,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC;AACxD,CAAC;AAED,SAAS,YAAY,CACnB,YAAmC,EACnC,MAAmC;IAEnC,YAAY,CAAC,YAAY,CAAC,CAAC;IAC3B,MAAM,MAAM,GAAG,YAAY,CAAC,aAAa,CAAC,IAAI,MAAM,CAAC,MAAM,EAAE,CAAC;IAC9D,YAAY,CAAC,aAAa,CAAC,GAAG,MAAM,CAAC;IACrC,MAAM,CAAC,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,EAAE,CAAC,gBAAgB,CAAC,YAAY,EAAE,MAAM,CAAC,CAAC,CAAC;IACpE,MAAM,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE,CAAC,YAAY,CAAC,YAAY,EAAE,KAAK,CAAC,CAAC,CAAC;AACjE,CAAC;AAED,SAAS,SAAS,CAAU,YAAmC;IAC7D,MAAM,YAAY,GAAG,YAAY,CAAC,aAAa,CAAC,CAAC;IACjD,IAAI,YAAY,EAAE;QAChB,CAAC,MAAM,EAAE,OAAO,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,YAAY,CAAC,kBAAkB,CAAC,KAAK,CAAC,CAAC,CAAC;QAC3F,YAAY,CAAC,OAAO,EAAE,CAAC;KACxB;IAED,YAAY,CAAC,aAAa,CAAC,GAAG,SAAS,CAAC;AAC1C,CAAC;AAED,SAAS,gBAAgB,CACvB,YAAmC,EACnC,MAA+C,EAC/C,QAAkD;;IAElD,IAAI,YAAY,CAAC,OAAO,CAAC,EAAE;QACzB,6DAA6D;QAC7D,IAAI,QAAQ;YAAE,QAAQ,CAAC,IAAI,qBAAa,CAAC,yBAAyB,CAAC,CAAC,CAAC;QACrE,OAAO;KACR;IAED,yFAAyF;IACzF,IAAI,MAAM,IAAI,IAAI,EAAE;QAClB,6DAA6D;QAC7D,OAAO,cAAc,CAAC,YAAY,EAAE,IAAI,yBAAiB,CAAC,yBAAyB,CAAC,EAAE,QAAQ,CAAC,CAAC;KACjG;IAED,IAAI,MAAM,IAAI,CAAC,MAAM,CAAC,GAAG,EAAE;QACzB,OAAO,cAAc,CACnB,YAAY,EACZ,IAAI,8BAAsB,CAAC,qBAAqB,CAAC,EACjD,QAAQ,CACT,CAAC;KACH;IAED,yBAAyB;IACzB,MAAA,YAAY,CAAC,MAAM,0CAAE,gBAAgB,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;IAElD,mFAAmF;IACnF,kFAAkF;IAClF,YAAY,CAAC,OAAO,CAAC,oBAAoB,GAAG,SAAS,CAAC;IAEtD,oBAAoB;IACpB,IAAI,CAAC,QAAQ;QAAE,OAAO,YAAY,CAAC,IAAI,CAAC,YAAY,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IACrE,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;AACrC,CAAC;AAED,SAAS,YAAY,CACnB,YAAmC,EACnC,KAAe,EACf,QAAmB;IAEnB,MAAM,MAAM,GAAG,YAAY,CAAC,MAAM,CAAC;IAEnC,yEAAyE;IACzE,IAAI,YAAY,CAAC,OAAO,CAAC,EAAE;QACzB,6DAA6D;QAC7D,IAAI,QAAQ;YAAE,QAAQ,CAAC,IAAI,qBAAa,CAAC,yBAAyB,CAAC,CAAC,CAAC;QACrE,OAAO;KACR;IAED,uDAAuD;IACvD,SAAS,gBAAgB,CAAC,SAAsC;QAC9D,YAAY,CAAC,MAAM,GAAG,SAAS,CAAC;QAChC,kBAAkB,CAAC,YAAY,CAAC,CAAC;IACnC,CAAC;IAED,wDAAwD;IACxD,SAAS,gBAAgB,CAAC,GAAa;QACrC,IAAI,CAAC,QAAQ,EAAE;YACb,YAAY,CAAC,IAAI,CAAC,YAAY,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC;SAC5C;QAED,YAAY,CAAC,KAAK,CAAC,GAAG,EAAE,CAAC,kBAAkB,CAAC,YAAY,EAAE,GAAG,CAAC,CAAC,CAAC;IAClE,CAAC;IAED,IAAI,MAAM,IAAI,wBAAgB,CAAC,KAAmB,EAAE,sBAAc,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,EAAE;QAClF,YAAY,CAAC,MAAM,GAAG,SAAS,CAAC;QAEhC,+CAA+C;QAC/C,SAAS,CAAC,YAAY,CAAC,CAAC;QAExB,uCAAuC;QACvC,MAAM,CAAC,KAAK,EAAE,CAAC;QAEf,MAAM,QAAQ,GAAG,mBAAW,CAAC,YAAY,CAAC,MAAM,CAAC,CAAC;QAClD,wBAAwB,CAAC,QAAQ,EAAE,EAAE,cAAc,EAAE,MAAM,CAAC,cAAc,EAAE,EAAE,GAAG,CAAC,EAAE;YAClF,oDAAoD;YACpD,IAAI,GAAG;gBAAE,OAAO,gBAAgB,CAAC,GAAG,CAAC,CAAC;YAEtC,2DAA2D;YAC3D,MAAM,SAAS,GAAG,wBAAwB,CAAC,YAAY,EAAE,MAAM,CAAC,aAAa,CAAC,CAAC;YAE/E,sCAAsC;YACtC,IAAI,CAAC,QAAQ;gBAAE,OAAO,gBAAgB,CAAC,SAAS,CAAC,CAAC;YAElD,uCAAuC;YACvC,SAAS,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;gBACtB,mEAAmE;gBACnE,IAAI,GAAG;oBAAE,OAAO,gBAAgB,CAAC,GAAG,CAAC,CAAC;gBACtC,gBAAgB,CAAC,SAAS,CAAC,CAAC;YAC9B,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;QACH,OAAO;KACR;IAED,gFAAgF;IAChF,OAAO,cAAc,CAAC,YAAY,EAAE,KAAK,EAAE,QAAQ,CAAC,CAAC;AACvD,CAAC;AAED;;;;GAIG;AACH,SAAS,SAAS,CAAI,YAA6B,EAAE,QAAyC;IAC5F,IAAI,YAAY,CAAC,OAAO,CAAC,EAAE;QACzB,6DAA6D;QAC7D,QAAQ,CAAC,IAAI,qBAAa,CAAC,yBAAyB,CAAC,CAAC,CAAC;QACvD,OAAO;KACR;IAED,+CAA+C;IAC/C,IAAI,YAAY,CAAC,MAAM,EAAE;QACvB,QAAQ,CAAC,SAAS,EAAE,YAAY,CAAC,MAAM,CAAC,CAAC;QACzC,OAAO;KACR;IAED,sDAAsD;IACtD,YAAY,CAAC,YAAY,CAAC,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;AAC5C,CAAC;AAED;;;;;GAKG;AACH,SAAS,kBAAkB,CAAU,YAAmC,EAAE,GAAW;IACnF,OAAO,YAAY,CAAC,YAAY,CAAC,CAAC,MAAM,EAAE;QACxC,M
AAM,OAAO,GAAG,YAAY,CAAC,YAAY,CAAC,CAAC,GAAG,EAAE,CAAC;QACjD,IAAI,CAAC,OAAO;YAAE,MAAM,CAAC,8EAA8E;QAEnG,IAAI,CAAC,GAAG,EAAE;YACR,IAAI,YAAY,CAAC,OAAO,CAAC,EAAE;gBACzB,6DAA6D;gBAC7D,OAAO,CAAC,IAAI,qBAAa,CAAC,yBAAyB,CAAC,CAAC,CAAC;gBACtD,OAAO;aACR;YACD,IAAI,CAAC,YAAY,CAAC,MAAM,EAAE;gBACxB,OAAO,CAAC,IAAI,8BAAsB,CAAC,eAAe,CAAC,CAAC,CAAC;gBACrD,OAAO;aACR;SACF;QACD,OAAO,CAAC,GAAG,EAAE,YAAY,CAAC,MAAM,CAAC,CAAC;KACnC;AACH,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/auth_provider.js b/node_modules/mongodb/lib/cmap/auth/auth_provider.js new file mode 100644 index 000000000..b9a840ee8 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/auth_provider.js @@ -0,0 +1,36 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.AuthProvider = exports.AuthContext = void 0; +const error_1 = require("../../error"); +/** Context used during authentication */ +class AuthContext { + constructor(connection, credentials, options) { + this.connection = connection; + this.credentials = credentials; + this.options = options; + } +} +exports.AuthContext = AuthContext; +class AuthProvider { + /** + * Prepare the handshake document before the initial handshake. + * + * @param handshakeDoc - The document used for the initial handshake on a connection + * @param authContext - Context for authentication flow + */ + prepare(handshakeDoc, authContext, callback) { + callback(undefined, handshakeDoc); + } + /** + * Authenticate + * + * @param context - A shared context for authentication flow + * @param callback - The callback to return the result from the authentication + */ + auth(context, callback) { + // TODO(NODE-3483): Replace this with MongoMethodOverrideError + callback(new error_1.MongoRuntimeError('`auth` method must be overridden by subclass')); + } +} +exports.AuthProvider = AuthProvider; +//# sourceMappingURL=auth_provider.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/auth_provider.js.map b/node_modules/mongodb/lib/cmap/auth/auth_provider.js.map new file mode 100644 index 000000000..6bcee0e3d --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/auth_provider.js.map @@ -0,0 +1 @@ +{"version":3,"file":"auth_provider.js","sourceRoot":"","sources":["../../../src/cmap/auth/auth_provider.ts"],"names":[],"mappings":";;;AAKA,uCAAgD;AAIhD,yCAAyC;AACzC,MAAa,WAAW;IAatB,YACE,UAAsB,EACtB,WAAyC,EACzC,OAA2B;QAE3B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;QAC/B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;CACF;AAtBD,kCAsBC;AAED,MAAa,YAAY;IACvB;;;;;OAKG;IACH,OAAO,CACL,YAA+B,EAC/B,WAAwB,EACxB,QAAqC;QAErC,QAAQ,CAAC,SAAS,EAAE,YAAY,CAAC,CAAC;IACpC,CAAC;IAED;;;;;OAKG;IACH,IAAI,CAAC,OAAoB,EAAE,QAAkB;QAC3C,8DAA8D;QAC9D,QAAQ,CAAC,IAAI,yBAAiB,CAAC,8CAA8C,CAAC,CAAC,CAAC;IAClF,CAAC;CACF;AAzBD,oCAyBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/defaultAuthProviders.js b/node_modules/mongodb/lib/cmap/auth/defaultAuthProviders.js new file mode 100644 index 000000000..ad71c1887 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/defaultAuthProviders.js @@ -0,0 +1,30 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.AUTH_PROVIDERS = exports.AuthMechanism = void 0; +const mongocr_1 = require("./mongocr"); +const x509_1 = require("./x509"); +const plain_1 = require("./plain"); +const gssapi_1 = require("./gssapi"); +const scram_1 = require("./scram"); +const mongodb_aws_1 = require("./mongodb_aws"); +/** @public */ +exports.AuthMechanism = 
Object.freeze({ + MONGODB_AWS: 'MONGODB-AWS', + MONGODB_CR: 'MONGODB-CR', + MONGODB_DEFAULT: 'DEFAULT', + MONGODB_GSSAPI: 'GSSAPI', + MONGODB_PLAIN: 'PLAIN', + MONGODB_SCRAM_SHA1: 'SCRAM-SHA-1', + MONGODB_SCRAM_SHA256: 'SCRAM-SHA-256', + MONGODB_X509: 'MONGODB-X509' +}); +exports.AUTH_PROVIDERS = new Map([ + [exports.AuthMechanism.MONGODB_AWS, new mongodb_aws_1.MongoDBAWS()], + [exports.AuthMechanism.MONGODB_CR, new mongocr_1.MongoCR()], + [exports.AuthMechanism.MONGODB_GSSAPI, new gssapi_1.GSSAPI()], + [exports.AuthMechanism.MONGODB_PLAIN, new plain_1.Plain()], + [exports.AuthMechanism.MONGODB_SCRAM_SHA1, new scram_1.ScramSHA1()], + [exports.AuthMechanism.MONGODB_SCRAM_SHA256, new scram_1.ScramSHA256()], + [exports.AuthMechanism.MONGODB_X509, new x509_1.X509()] +]); +//# sourceMappingURL=defaultAuthProviders.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/defaultAuthProviders.js.map b/node_modules/mongodb/lib/cmap/auth/defaultAuthProviders.js.map new file mode 100644 index 000000000..924e4dc12 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/defaultAuthProviders.js.map @@ -0,0 +1 @@ +{"version":3,"file":"defaultAuthProviders.js","sourceRoot":"","sources":["../../../src/cmap/auth/defaultAuthProviders.ts"],"names":[],"mappings":";;;AAAA,uCAAoC;AACpC,iCAA8B;AAC9B,mCAAgC;AAChC,qCAAkC;AAClC,mCAAiD;AACjD,+CAA2C;AAG3C,cAAc;AACD,QAAA,aAAa,GAAG,MAAM,CAAC,MAAM,CAAC;IACzC,WAAW,EAAE,aAAa;IAC1B,UAAU,EAAE,YAAY;IACxB,eAAe,EAAE,SAAS;IAC1B,cAAc,EAAE,QAAQ;IACxB,aAAa,EAAE,OAAO;IACtB,kBAAkB,EAAE,aAAa;IACjC,oBAAoB,EAAE,eAAe;IACrC,YAAY,EAAE,cAAc;CACpB,CAAC,CAAC;AAKC,QAAA,cAAc,GAAG,IAAI,GAAG,CAAuC;IAC1E,CAAC,qBAAa,CAAC,WAAW,EAAE,IAAI,wBAAU,EAAE,CAAC;IAC7C,CAAC,qBAAa,CAAC,UAAU,EAAE,IAAI,iBAAO,EAAE,CAAC;IACzC,CAAC,qBAAa,CAAC,cAAc,EAAE,IAAI,eAAM,EAAE,CAAC;IAC5C,CAAC,qBAAa,CAAC,aAAa,EAAE,IAAI,aAAK,EAAE,CAAC;IAC1C,CAAC,qBAAa,CAAC,kBAAkB,EAAE,IAAI,iBAAS,EAAE,CAAC;IACnD,CAAC,qBAAa,CAAC,oBAAoB,EAAE,IAAI,mBAAW,EAAE,CAAC;IACvD,CAAC,qBAAa,CAAC,YAAY,EAAE,IAAI,WAAI,EAAE,CAAC;CACzC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/gssapi.js b/node_modules/mongodb/lib/cmap/auth/gssapi.js new file mode 100644 index 000000000..78d02c996 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/gssapi.js @@ -0,0 +1,148 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.GSSAPI = void 0; +const auth_provider_1 = require("./auth_provider"); +const error_1 = require("../../error"); +const deps_1 = require("../../deps"); +const utils_1 = require("../../utils"); +const dns = require("dns"); +class GSSAPI extends auth_provider_1.AuthProvider { + auth(authContext, callback) { + const { connection, credentials } = authContext; + if (credentials == null) + return callback(new error_1.MongoMissingCredentialsError('Credentials required for GSSAPI authentication')); + const { username } = credentials; + function externalCommand(command, cb) { + return connection.command(utils_1.ns('$external.$cmd'), command, undefined, cb); + } + makeKerberosClient(authContext, (err, client) => { + if (err) + return callback(err); + if (client == null) + return callback(new error_1.MongoMissingDependencyError('GSSAPI client missing')); + client.step('', (err, payload) => { + if (err) + return callback(err); + externalCommand(saslStart(payload), (err, result) => { + if (err) + return callback(err); + if (result == null) + return callback(); + negotiate(client, 10, result.payload, (err, payload) => { + if (err) + return callback(err); + 
externalCommand(saslContinue(payload, result.conversationId), (err, result) => { + if (err) + return callback(err); + if (result == null) + return callback(); + finalize(client, username, result.payload, (err, payload) => { + if (err) + return callback(err); + externalCommand({ + saslContinue: 1, + conversationId: result.conversationId, + payload + }, (err, result) => { + if (err) + return callback(err); + callback(undefined, result); + }); + }); + }); + }); + }); + }); + }); + } +} +exports.GSSAPI = GSSAPI; +function makeKerberosClient(authContext, callback) { + var _a; + const { hostAddress } = authContext.options; + const { credentials } = authContext; + if (!hostAddress || typeof hostAddress.host !== 'string' || !credentials) { + return callback(new error_1.MongoInvalidArgumentError('Connection must have host and port and credentials defined.')); + } + if ('kModuleError' in deps_1.Kerberos) { + return callback(deps_1.Kerberos['kModuleError']); + } + const { initializeClient } = deps_1.Kerberos; + const { username, password } = credentials; + const mechanismProperties = credentials.mechanismProperties; + const serviceName = (_a = mechanismProperties.SERVICE_NAME) !== null && _a !== void 0 ? _a : 'mongodb'; + performGssapiCanonicalizeHostName(hostAddress.host, mechanismProperties, (err, host) => { + if (err) + return callback(err); + const initOptions = {}; + if (password != null) { + Object.assign(initOptions, { user: username, password: password }); + } + let spn = `${serviceName}${process.platform === 'win32' ? '/' : '@'}${host}`; + if ('SERVICE_REALM' in mechanismProperties) { + spn = `${spn}@${mechanismProperties.SERVICE_REALM}`; + } + initializeClient(spn, initOptions, (err, client) => { + // TODO(NODE-3483) + if (err) + return callback(new error_1.MongoRuntimeError(err)); + callback(undefined, client); + }); + }); +} +function saslStart(payload) { + return { + saslStart: 1, + mechanism: 'GSSAPI', + payload, + autoAuthorize: 1 + }; +} +function saslContinue(payload, conversationId) { + return { + saslContinue: 1, + conversationId, + payload + }; +} +function negotiate(client, retries, payload, callback) { + client.step(payload, (err, response) => { + // Retries exhausted, raise error + if (err && retries === 0) + return callback(err); + // Adjust number of retries and call step again + if (err) + return negotiate(client, retries - 1, payload, callback); + // Return the payload + callback(undefined, response || ''); + }); +} +function finalize(client, user, payload, callback) { + // GSS Client Unwrap + client.unwrap(payload, (err, response) => { + if (err) + return callback(err); + // Wrap the response + client.wrap(response || '', { user }, (err, wrapped) => { + if (err) + return callback(err); + // Return the payload + callback(undefined, wrapped); + }); + }); +} +function performGssapiCanonicalizeHostName(host, mechanismProperties, callback) { + if (!mechanismProperties.gssapiCanonicalizeHostName) + return callback(undefined, host); + // Attempt to resolve the host name + dns.resolveCname(host, (err, r) => { + if (err) + return callback(err); + // Get the first resolve host id + if (Array.isArray(r) && r.length > 0) { + return callback(undefined, r[0]); + } + callback(undefined, host); + }); +} +//# sourceMappingURL=gssapi.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/gssapi.js.map b/node_modules/mongodb/lib/cmap/auth/gssapi.js.map new file mode 100644 index 000000000..630b4bb8f --- /dev/null +++ 
b/node_modules/mongodb/lib/cmap/auth/gssapi.js.map @@ -0,0 +1 @@ +{"version":3,"file":"gssapi.js","sourceRoot":"","sources":["../../../src/cmap/auth/gssapi.ts"],"names":[],"mappings":";;;AAAA,mDAA4D;AAC5D,uCAMqB;AACrB,qCAAsD;AACtD,uCAA2C;AAS3C,2BAA2B;AAE3B,MAAa,MAAO,SAAQ,4BAAY;IACtC,IAAI,CAAC,WAAwB,EAAE,QAAkB;QAC/C,MAAM,EAAE,UAAU,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;QAChD,IAAI,WAAW,IAAI,IAAI;YACrB,OAAO,QAAQ,CACb,IAAI,oCAA4B,CAAC,gDAAgD,CAAC,CACnF,CAAC;QACJ,MAAM,EAAE,QAAQ,EAAE,GAAG,WAAW,CAAC;QACjC,SAAS,eAAe,CACtB,OAAiB,EACjB,EAAsD;YAEtD,OAAO,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,gBAAgB,CAAC,EAAE,OAAO,EAAE,SAAS,EAAE,EAAE,CAAC,CAAC;QAC1E,CAAC;QACD,kBAAkB,CAAC,WAAW,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YAC9C,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,IAAI,MAAM,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,mCAA2B,CAAC,uBAAuB,CAAC,CAAC,CAAC;YAC9F,MAAM,CAAC,IAAI,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,OAAO,EAAE,EAAE;gBAC/B,IAAI,GAAG;oBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;gBAE9B,eAAe,CAAC,SAAS,CAAC,OAAO,CAAC,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;oBAClD,IAAI,GAAG;wBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;oBAC9B,IAAI,MAAM,IAAI,IAAI;wBAAE,OAAO,QAAQ,EAAE,CAAC;oBACtC,SAAS,CAAC,MAAM,EAAE,EAAE,EAAE,MAAM,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,OAAO,EAAE,EAAE;wBACrD,IAAI,GAAG;4BAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;wBAE9B,eAAe,CAAC,YAAY,CAAC,OAAO,EAAE,MAAM,CAAC,cAAc,CAAC,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;4BAC5E,IAAI,GAAG;gCAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;4BAC9B,IAAI,MAAM,IAAI,IAAI;gCAAE,OAAO,QAAQ,EAAE,CAAC;4BACtC,QAAQ,CAAC,MAAM,EAAE,QAAQ,EAAE,MAAM,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,OAAO,EAAE,EAAE;gCAC1D,IAAI,GAAG;oCAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;gCAE9B,eAAe,CACb;oCACE,YAAY,EAAE,CAAC;oCACf,cAAc,EAAE,MAAM,CAAC,cAAc;oCACrC,OAAO;iCACR,EACD,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;oCACd,IAAI,GAAG;wCAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;oCAE9B,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;gCAC9B,CAAC,CACF,CAAC;4BACJ,CAAC,CAAC,CAAC;wBACL,CAAC,CAAC,CAAC;oBACL,CAAC,CAAC,CAAC;gBACL,CAAC,CAAC,CAAC;YACL,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAnDD,wBAmDC;AACD,SAAS,kBAAkB,CAAC,WAAwB,EAAE,QAAkC;;IACtF,MAAM,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC,OAAO,CAAC;IAC5C,MAAM,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;IACpC,IAAI,CAAC,WAAW,IAAI,OAAO,WAAW,CAAC,IAAI,KAAK,QAAQ,IAAI,CAAC,WAAW,EAAE;QACxE,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAAC,6DAA6D,CAAC,CAC7F,CAAC;KACH;IAED,IAAI,cAAc,IAAI,eAAQ,EAAE;QAC9B,OAAO,QAAQ,CAAC,eAAQ,CAAC,cAAc,CAAC,CAAC,CAAC;KAC3C;IACD,MAAM,EAAE,gBAAgB,EAAE,GAAG,eAAQ,CAAC;IAEtC,MAAM,EAAE,QAAQ,EAAE,QAAQ,EAAE,GAAG,WAAW,CAAC;IAC3C,MAAM,mBAAmB,GAAG,WAAW,CAAC,mBAA0C,CAAC;IAEnF,MAAM,WAAW,GAAG,MAAA,mBAAmB,CAAC,YAAY,mCAAI,SAAS,CAAC;IAElE,iCAAiC,CAC/B,WAAW,CAAC,IAAI,EAChB,mBAAmB,EACnB,CAAC,GAAwB,EAAE,IAAa,EAAE,EAAE;QAC1C,IAAI,GAAG;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAE9B,MAAM,WAAW,GAAG,EAAE,CAAC;QACvB,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,MAAM,CAAC,MAAM,CAAC,WAAW,EAAE,EAAE,IAAI,EAAE,QAAQ,EAAE,QAAQ,EAAE,QAAQ,EAAE,CAAC,CAAC;SACpE;QAED,IAAI,GAAG,GAAG,GAAG,WAAW,GAAG,OAAO,CAAC,QAAQ,KAAK,OAAO,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,GAAG,GAAG,IAAI,EAAE,CAAC;QAC7E,IAAI,eAAe,IAAI,mBAAmB,EAAE;YAC1C,GAAG,GAAG,GAAG,GAAG,IAAI,mBAAmB,CAAC,aAAa,EAAE,CAAC;SACrD;QAED,gBAAgB,CAAC,GAAG,EAAE,WAAW,EAAE,CAAC,GAAW,EAAE,MAAsB,EAAQ,EAAE;YAC/E,kBAAkB;YAClB,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,GAAG,CAAC,CAAC,CAAC;YACrD,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;QAC9B,CAAC,CAAC,CAAC;IACL,CAAC,CACF,CAAC;AACJ,CAAC;AAED,SAAS,SAAS,CAAC,OAAgB;IACjC,OAAO;QACL,SAAS,EAAE,CAAC;QACZ,SAAS,EAAE,QAAQ;QACnB,OAAO;QACP,aAAa,EAAE,CAAC;KACjB,CAAC;AACJ,CAAC;AAED,SAAS,YAAY,CAAC,OAAgB,EAAE,cAAuB;IAC7D,OAAO;QACL,YAAY,EAAE,CAAC;QACf,cAAc;QACd,OAAO;KACR,CAAC;AACJ,CAAC;AAED,SAAS
,SAAS,CAChB,MAAsB,EACtB,OAAe,EACf,OAAe,EACf,QAA0B;IAE1B,MAAM,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;QACrC,iCAAiC;QACjC,IAAI,GAAG,IAAI,OAAO,KAAK,CAAC;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAE/C,+CAA+C;QAC/C,IAAI,GAAG;YAAE,OAAO,SAAS,CAAC,MAAM,EAAE,OAAO,GAAG,CAAC,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;QAElE,qBAAqB;QACrB,QAAQ,CAAC,SAAS,EAAE,QAAQ,IAAI,EAAE,CAAC,CAAC;IACtC,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,QAAQ,CACf,MAAsB,EACtB,IAAY,EACZ,OAAe,EACf,QAA0B;IAE1B,oBAAoB;IACpB,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;QACvC,IAAI,GAAG;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAE9B,oBAAoB;QACpB,MAAM,CAAC,IAAI,CAAC,QAAQ,IAAI,EAAE,EAAE,EAAE,IAAI,EAAE,EAAE,CAAC,GAAG,EAAE,OAAO,EAAE,EAAE;YACrD,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAE9B,qBAAqB;YACrB,QAAQ,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC/B,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,iCAAiC,CACxC,IAAY,EACZ,mBAAwC,EACxC,QAA0B;IAE1B,IAAI,CAAC,mBAAmB,CAAC,0BAA0B;QAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;IAEtF,mCAAmC;IACnC,GAAG,CAAC,YAAY,CAAC,IAAI,EAAE,CAAC,GAAG,EAAE,CAAC,EAAE,EAAE;QAChC,IAAI,GAAG;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAE9B,gCAAgC;QAChC,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;YACpC,OAAO,QAAQ,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;SAClC;QAED,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;IAC5B,CAAC,CAAC,CAAC;AACL,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/mongo_credentials.js b/node_modules/mongodb/lib/cmap/auth/mongo_credentials.js new file mode 100644 index 000000000..1368227b2 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/mongo_credentials.js @@ -0,0 +1,125 @@ +"use strict"; +// Resolves the default auth mechanism according to +Object.defineProperty(exports, "__esModule", { value: true }); +exports.MongoCredentials = void 0; +const error_1 = require("../../error"); +const defaultAuthProviders_1 = require("./defaultAuthProviders"); +// https://github.com/mongodb/specifications/blob/master/source/auth/auth.rst +function getDefaultAuthMechanism(ismaster) { + if (ismaster) { + // If ismaster contains saslSupportedMechs, use scram-sha-256 + // if it is available, else scram-sha-1 + if (Array.isArray(ismaster.saslSupportedMechs)) { + return ismaster.saslSupportedMechs.includes(defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA256) + ? defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA256 + : defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA1; + } + // Fallback to legacy selection method. 
If wire version >= 3, use scram-sha-1 + if (ismaster.maxWireVersion >= 3) { + return defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA1; + } + } + // Default for wireprotocol < 3 + return defaultAuthProviders_1.AuthMechanism.MONGODB_CR; +} +/** + * A representation of the credentials used by MongoDB + * @public + */ +class MongoCredentials { + constructor(options) { + this.username = options.username; + this.password = options.password; + this.source = options.source; + if (!this.source && options.db) { + this.source = options.db; + } + this.mechanism = options.mechanism || defaultAuthProviders_1.AuthMechanism.MONGODB_DEFAULT; + this.mechanismProperties = options.mechanismProperties || {}; + if (this.mechanism.match(/MONGODB-AWS/i)) { + if (!this.username && process.env.AWS_ACCESS_KEY_ID) { + this.username = process.env.AWS_ACCESS_KEY_ID; + } + if (!this.password && process.env.AWS_SECRET_ACCESS_KEY) { + this.password = process.env.AWS_SECRET_ACCESS_KEY; + } + if (this.mechanismProperties.AWS_SESSION_TOKEN == null && + process.env.AWS_SESSION_TOKEN != null) { + this.mechanismProperties = { + ...this.mechanismProperties, + AWS_SESSION_TOKEN: process.env.AWS_SESSION_TOKEN + }; + } + } + Object.freeze(this.mechanismProperties); + Object.freeze(this); + } + /** Determines if two MongoCredentials objects are equivalent */ + equals(other) { + return (this.mechanism === other.mechanism && + this.username === other.username && + this.password === other.password && + this.source === other.source); + } + /** + * If the authentication mechanism is set to "default", resolves the authMechanism + * based on the server version and server supported sasl mechanisms. + * + * @param ismaster - An ismaster response from the server + */ + resolveAuthMechanism(ismaster) { + // If the mechanism is not "default", then it does not need to be resolved + if (this.mechanism.match(/DEFAULT/i)) { + return new MongoCredentials({ + username: this.username, + password: this.password, + source: this.source, + mechanism: getDefaultAuthMechanism(ismaster), + mechanismProperties: this.mechanismProperties + }); + } + return this; + } + validate() { + if ((this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_GSSAPI || + this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_CR || + this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_PLAIN || + this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA1 || + this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA256) && + !this.username) { + throw new error_1.MongoMissingCredentialsError(`Username required for mechanism '${this.mechanism}'`); + } + if (this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_GSSAPI || + this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_AWS || + this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_X509) { + if (this.source != null && this.source !== '$external') { + // TODO(NODE-3485): Replace this with a MongoAuthValidationError + throw new error_1.MongoAPIError(`Invalid source '${this.source}' for mechanism '${this.mechanism}' specified.`); + } + } + if (this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_PLAIN && this.source == null) { + // TODO(NODE-3485): Replace this with a MongoAuthValidationError + throw new error_1.MongoAPIError('PLAIN Authentication Mechanism needs an auth source'); + } + if (this.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_X509 && this.password != null) { + if (this.password === '') { + 
Reflect.set(this, 'password', undefined); + return; + } + // TODO(NODE-3485): Replace this with a MongoAuthValidationError + throw new error_1.MongoAPIError(`Password not allowed for mechanism MONGODB-X509`); + } + } + static merge(creds, options) { + var _a, _b, _c, _d, _e, _f, _g, _h, _j, _k, _l; + return new MongoCredentials({ + username: (_b = (_a = options.username) !== null && _a !== void 0 ? _a : creds === null || creds === void 0 ? void 0 : creds.username) !== null && _b !== void 0 ? _b : '', + password: (_d = (_c = options.password) !== null && _c !== void 0 ? _c : creds === null || creds === void 0 ? void 0 : creds.password) !== null && _d !== void 0 ? _d : '', + mechanism: (_f = (_e = options.mechanism) !== null && _e !== void 0 ? _e : creds === null || creds === void 0 ? void 0 : creds.mechanism) !== null && _f !== void 0 ? _f : defaultAuthProviders_1.AuthMechanism.MONGODB_DEFAULT, + mechanismProperties: (_h = (_g = options.mechanismProperties) !== null && _g !== void 0 ? _g : creds === null || creds === void 0 ? void 0 : creds.mechanismProperties) !== null && _h !== void 0 ? _h : {}, + source: (_l = (_k = (_j = options.source) !== null && _j !== void 0 ? _j : options.db) !== null && _k !== void 0 ? _k : creds === null || creds === void 0 ? void 0 : creds.source) !== null && _l !== void 0 ? _l : 'admin' + }); + } +} +exports.MongoCredentials = MongoCredentials; +//# sourceMappingURL=mongo_credentials.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/mongo_credentials.js.map b/node_modules/mongodb/lib/cmap/auth/mongo_credentials.js.map new file mode 100644 index 000000000..523fec2c5 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/mongo_credentials.js.map @@ -0,0 +1 @@ +{"version":3,"file":"mongo_credentials.js","sourceRoot":"","sources":["../../../src/cmap/auth/mongo_credentials.ts"],"names":[],"mappings":";AAAA,mDAAmD;;;AAGnD,uCAA0E;AAC1E,iEAAuD;AAEvD,6EAA6E;AAC7E,SAAS,uBAAuB,CAAC,QAAmB;IAClD,IAAI,QAAQ,EAAE;QACZ,6DAA6D;QAC7D,uCAAuC;QACvC,IAAI,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,kBAAkB,CAAC,EAAE;YAC9C,OAAO,QAAQ,CAAC,kBAAkB,CAAC,QAAQ,CAAC,oCAAa,CAAC,oBAAoB,CAAC;gBAC7E,CAAC,CAAC,oCAAa,CAAC,oBAAoB;gBACpC,CAAC,CAAC,oCAAa,CAAC,kBAAkB,CAAC;SACtC;QAED,6EAA6E;QAC7E,IAAI,QAAQ,CAAC,cAAc,IAAI,CAAC,EAAE;YAChC,OAAO,oCAAa,CAAC,kBAAkB,CAAC;SACzC;KACF;IAED,+BAA+B;IAC/B,OAAO,oCAAa,CAAC,UAAU,CAAC;AAClC,CAAC;AAoBD;;;GAGG;AACH,MAAa,gBAAgB;IAY3B,YAAY,OAAgC;QAC1C,IAAI,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC;QACjC,IAAI,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC;QACjC,IAAI,CAAC,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC;QAC7B,IAAI,CAAC,IAAI,CAAC,MAAM,IAAI,OAAO,CAAC,EAAE,EAAE;YAC9B,IAAI,CAAC,MAAM,GAAG,OAAO,CAAC,EAAE,CAAC;SAC1B;QACD,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,IAAI,oCAAa,CAAC,eAAe,CAAC;QACpE,IAAI,CAAC,mBAAmB,GAAG,OAAO,CAAC,mBAAmB,IAAI,EAAE,CAAC;QAE7D,IAAI,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,cAAc,CAAC,EAAE;YACxC,IAAI,CAAC,IAAI,CAAC,QAAQ,IAAI,OAAO,CAAC,GAAG,CAAC,iBAAiB,EAAE;gBACnD,IAAI,CAAC,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC;aAC/C;YAED,IAAI,CAAC,IAAI,CAAC,QAAQ,IAAI,OAAO,CAAC,GAAG,CAAC,qBAAqB,EAAE;gBACvD,IAAI,CAAC,QAAQ,GAAG,OAAO,CAAC,GAAG,CAAC,qBAAqB,CAAC;aACnD;YAED,IACE,IAAI,CAAC,mBAAmB,CAAC,iBAAiB,IAAI,IAAI;gBAClD,OAAO,CAAC,GAAG,CAAC,iBAAiB,IAAI,IAAI,EACrC;gBACA,IAAI,CAAC,mBAAmB,GAAG;oBACzB,GAAG,IAAI,CAAC,mBAAmB;oBAC3B,iBAAiB,EAAE,OAAO,CAAC,GAAG,CAAC,iBAAiB;iBACjD,CAAC;aACH;SACF;QAED,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,mBAAmB,CAAC,CAAC;QACxC,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;IACtB,CAAC;IAED,gEAAgE;IAChE,MAAM,CAAC,KAAuB;QAC5B,OAAO,CACL,IAAI,CAAC,SAAS,KAAK,KAAK,CAAC,SAAS;YAClC,IAAI,CAAC,QAAQ,KAAK,
KAAK,CAAC,QAAQ;YAChC,IAAI,CAAC,QAAQ,KAAK,KAAK,CAAC,QAAQ;YAChC,IAAI,CAAC,MAAM,KAAK,KAAK,CAAC,MAAM,CAC7B,CAAC;IACJ,CAAC;IAED;;;;;OAKG;IACH,oBAAoB,CAAC,QAAmB;QACtC,0EAA0E;QAC1E,IAAI,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,UAAU,CAAC,EAAE;YACpC,OAAO,IAAI,gBAAgB,CAAC;gBAC1B,QAAQ,EAAE,IAAI,CAAC,QAAQ;gBACvB,QAAQ,EAAE,IAAI,CAAC,QAAQ;gBACvB,MAAM,EAAE,IAAI,CAAC,MAAM;gBACnB,SAAS,EAAE,uBAAuB,CAAC,QAAQ,CAAC;gBAC5C,mBAAmB,EAAE,IAAI,CAAC,mBAAmB;aAC9C,CAAC,CAAC;SACJ;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAED,QAAQ;QACN,IACE,CAAC,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,cAAc;YAC9C,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,UAAU;YAC3C,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,aAAa;YAC9C,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,kBAAkB;YACnD,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,oBAAoB,CAAC;YACxD,CAAC,IAAI,CAAC,QAAQ,EACd;YACA,MAAM,IAAI,oCAA4B,CAAC,oCAAoC,IAAI,CAAC,SAAS,GAAG,CAAC,CAAC;SAC/F;QAED,IACE,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,cAAc;YAC/C,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,WAAW;YAC5C,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,YAAY,EAC7C;YACA,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,IAAI,IAAI,CAAC,MAAM,KAAK,WAAW,EAAE;gBACtD,gEAAgE;gBAChE,MAAM,IAAI,qBAAa,CACrB,mBAAmB,IAAI,CAAC,MAAM,oBAAoB,IAAI,CAAC,SAAS,cAAc,CAC/E,CAAC;aACH;SACF;QAED,IAAI,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,aAAa,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,EAAE;YACzE,gEAAgE;YAChE,MAAM,IAAI,qBAAa,CAAC,qDAAqD,CAAC,CAAC;SAChF;QAED,IAAI,IAAI,CAAC,SAAS,KAAK,oCAAa,CAAC,YAAY,IAAI,IAAI,CAAC,QAAQ,IAAI,IAAI,EAAE;YAC1E,IAAI,IAAI,CAAC,QAAQ,KAAK,EAAE,EAAE;gBACxB,OAAO,CAAC,GAAG,CAAC,IAAI,EAAE,UAAU,EAAE,SAAS,CAAC,CAAC;gBACzC,OAAO;aACR;YACD,gEAAgE;YAChE,MAAM,IAAI,qBAAa,CAAC,iDAAiD,CAAC,CAAC;SAC5E;IACH,CAAC;IAED,MAAM,CAAC,KAAK,CACV,KAAmC,EACnC,OAAyC;;QAEzC,OAAO,IAAI,gBAAgB,CAAC;YAC1B,QAAQ,EAAE,MAAA,MAAA,OAAO,CAAC,QAAQ,mCAAI,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,QAAQ,mCAAI,EAAE;YACnD,QAAQ,EAAE,MAAA,MAAA,OAAO,CAAC,QAAQ,mCAAI,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,QAAQ,mCAAI,EAAE;YACnD,SAAS,EAAE,MAAA,MAAA,OAAO,CAAC,SAAS,mCAAI,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,SAAS,mCAAI,oCAAa,CAAC,eAAe;YACjF,mBAAmB,EAAE,MAAA,MAAA,OAAO,CAAC,mBAAmB,mCAAI,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,mBAAmB,mCAAI,EAAE;YACpF,MAAM,EAAE,MAAA,MAAA,MAAA,OAAO,CAAC,MAAM,mCAAI,OAAO,CAAC,EAAE,mCAAI,KAAK,aAAL,KAAK,uBAAL,KAAK,CAAE,MAAM,mCAAI,OAAO;SACjE,CAAC,CAAC;IACL,CAAC;CACF;AAjID,4CAiIC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/mongocr.js b/node_modules/mongodb/lib/cmap/auth/mongocr.js new file mode 100644 index 000000000..6d6b25614 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/mongocr.js @@ -0,0 +1,44 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.MongoCR = void 0; +const crypto = require("crypto"); +const auth_provider_1 = require("./auth_provider"); +const utils_1 = require("../../utils"); +const error_1 = require("../../error"); +class MongoCR extends auth_provider_1.AuthProvider { + auth(authContext, callback) { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + const username = credentials.username; + const password = credentials.password; + const source = credentials.source; + connection.command(utils_1.ns(`${source}.$cmd`), { getnonce: 1 }, undefined, (err, r) => { + let nonce = null; + let key = null; + // Get nonce + if (err == null) { + nonce = r.nonce; + // Use node md5 generator + let md5 = crypto.createHash('md5'); + // Generate keys used for authentication + md5.update(`${username}:mongo:${password}`, 'utf8'); + const hash_password = md5.digest('hex'); + // Final key + md5 = crypto.createHash('md5'); + 
md5.update(nonce + username + hash_password, 'utf8'); + key = md5.digest('hex'); + } + const authenticateCommand = { + authenticate: 1, + user: username, + nonce, + key + }; + connection.command(utils_1.ns(`${source}.$cmd`), authenticateCommand, undefined, callback); + }); + } +} +exports.MongoCR = MongoCR; +//# sourceMappingURL=mongocr.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/mongocr.js.map b/node_modules/mongodb/lib/cmap/auth/mongocr.js.map new file mode 100644 index 000000000..d463397df --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/mongocr.js.map @@ -0,0 +1 @@ +{"version":3,"file":"mongocr.js","sourceRoot":"","sources":["../../../src/cmap/auth/mongocr.ts"],"names":[],"mappings":";;;AAAA,iCAAiC;AACjC,mDAA4D;AAC5D,uCAA2C;AAC3C,uCAA2D;AAE3D,MAAa,OAAQ,SAAQ,4BAAY;IACvC,IAAI,CAAC,WAAwB,EAAE,QAAkB;QAC/C,MAAM,EAAE,UAAU,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;QAChD,IAAI,CAAC,WAAW,EAAE;YAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;SAC5F;QACD,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;QACtC,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;QACtC,MAAM,MAAM,GAAG,WAAW,CAAC,MAAM,CAAC;QAClC,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,MAAM,OAAO,CAAC,EAAE,EAAE,QAAQ,EAAE,CAAC,EAAE,EAAE,SAAS,EAAE,CAAC,GAAG,EAAE,CAAC,EAAE,EAAE;YAC9E,IAAI,KAAK,GAAG,IAAI,CAAC;YACjB,IAAI,GAAG,GAAG,IAAI,CAAC;YAEf,YAAY;YACZ,IAAI,GAAG,IAAI,IAAI,EAAE;gBACf,KAAK,GAAG,CAAC,CAAC,KAAK,CAAC;gBAEhB,yBAAyB;gBACzB,IAAI,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,KAAK,CAAC,CAAC;gBAEnC,wCAAwC;gBACxC,GAAG,CAAC,MAAM,CAAC,GAAG,QAAQ,UAAU,QAAQ,EAAE,EAAE,MAAM,CAAC,CAAC;gBACpD,MAAM,aAAa,GAAG,GAAG,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;gBAExC,YAAY;gBACZ,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,KAAK,CAAC,CAAC;gBAC/B,GAAG,CAAC,MAAM,CAAC,KAAK,GAAG,QAAQ,GAAG,aAAa,EAAE,MAAM,CAAC,CAAC;gBACrD,GAAG,GAAG,GAAG,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;aACzB;YAED,MAAM,mBAAmB,GAAG;gBAC1B,YAAY,EAAE,CAAC;gBACf,IAAI,EAAE,QAAQ;gBACd,KAAK;gBACL,GAAG;aACJ,CAAC;YAEF,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,MAAM,OAAO,CAAC,EAAE,mBAAmB,EAAE,SAAS,EAAE,QAAQ,CAAC,CAAC;QACrF,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAxCD,0BAwCC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/mongodb_aws.js b/node_modules/mongodb/lib/cmap/auth/mongodb_aws.js new file mode 100644 index 000000000..6ecf9fba5 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/mongodb_aws.js @@ -0,0 +1,201 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.MongoDBAWS = void 0; +const http = require("http"); +const crypto = require("crypto"); +const url = require("url"); +const BSON = require("../../bson"); +const auth_provider_1 = require("./auth_provider"); +const mongo_credentials_1 = require("./mongo_credentials"); +const error_1 = require("../../error"); +const utils_1 = require("../../utils"); +const deps_1 = require("../../deps"); +const defaultAuthProviders_1 = require("./defaultAuthProviders"); +const ASCII_N = 110; +const AWS_RELATIVE_URI = 'http://169.254.170.2'; +const AWS_EC2_URI = 'http://169.254.169.254'; +const AWS_EC2_PATH = '/latest/meta-data/iam/security-credentials'; +const bsonOptions = { + promoteLongs: true, + promoteValues: true, + promoteBuffers: false, + bsonRegExp: false +}; +class MongoDBAWS extends auth_provider_1.AuthProvider { + auth(authContext, callback) { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if ('kModuleError' in deps_1.aws4) { + return callback(deps_1.aws4['kModuleError']); + } + const { sign } = 
deps_1.aws4; + if (utils_1.maxWireVersion(connection) < 9) { + callback(new error_1.MongoCompatibilityError('MONGODB-AWS authentication requires MongoDB version 4.4 or later')); + return; + } + if (!credentials.username) { + makeTempCredentials(credentials, (err, tempCredentials) => { + if (err || !tempCredentials) + return callback(err); + authContext.credentials = tempCredentials; + this.auth(authContext, callback); + }); + return; + } + const accessKeyId = credentials.username; + const secretAccessKey = credentials.password; + const sessionToken = credentials.mechanismProperties.AWS_SESSION_TOKEN; + // If all three defined, include sessionToken, else include username and pass, else no credentials + const awsCredentials = accessKeyId && secretAccessKey && sessionToken + ? { accessKeyId, secretAccessKey, sessionToken } + : accessKeyId && secretAccessKey + ? { accessKeyId, secretAccessKey } + : undefined; + const db = credentials.source; + crypto.randomBytes(32, (err, nonce) => { + if (err) { + callback(err); + return; + } + const saslStart = { + saslStart: 1, + mechanism: 'MONGODB-AWS', + payload: BSON.serialize({ r: nonce, p: ASCII_N }, bsonOptions) + }; + connection.command(utils_1.ns(`${db}.$cmd`), saslStart, undefined, (err, res) => { + if (err) + return callback(err); + const serverResponse = BSON.deserialize(res.payload.buffer, bsonOptions); + const host = serverResponse.h; + const serverNonce = serverResponse.s.buffer; + if (serverNonce.length !== 64) { + callback( + // TODO(NODE-3483) + new error_1.MongoRuntimeError(`Invalid server nonce length ${serverNonce.length}, expected 64`)); + return; + } + if (serverNonce.compare(nonce, 0, nonce.length, 0, nonce.length) !== 0) { + // TODO(NODE-3483) + callback(new error_1.MongoRuntimeError('Server nonce does not begin with client nonce')); + return; + } + if (host.length < 1 || host.length > 255 || host.indexOf('..') !== -1) { + // TODO(NODE-3483) + callback(new error_1.MongoRuntimeError(`Server returned an invalid host: "${host}"`)); + return; + } + const body = 'Action=GetCallerIdentity&Version=2011-06-15'; + const options = sign({ + method: 'POST', + host, + region: deriveRegion(serverResponse.h), + service: 'sts', + headers: { + 'Content-Type': 'application/x-www-form-urlencoded', + 'Content-Length': body.length, + 'X-MongoDB-Server-Nonce': serverNonce.toString('base64'), + 'X-MongoDB-GS2-CB-Flag': 'n' + }, + path: '/', + body + }, awsCredentials); + const payload = { + a: options.headers.Authorization, + d: options.headers['X-Amz-Date'] + }; + if (sessionToken) { + payload.t = sessionToken; + } + const saslContinue = { + saslContinue: 1, + conversationId: 1, + payload: BSON.serialize(payload, bsonOptions) + }; + connection.command(utils_1.ns(`${db}.$cmd`), saslContinue, undefined, callback); + }); + }); + } +} +exports.MongoDBAWS = MongoDBAWS; +function makeTempCredentials(credentials, callback) { + function done(creds) { + if (!creds.AccessKeyId || !creds.SecretAccessKey || !creds.Token) { + callback(new error_1.MongoMissingCredentialsError('Could not obtain temporary MONGODB-AWS credentials')); + return; + } + callback(undefined, new mongo_credentials_1.MongoCredentials({ + username: creds.AccessKeyId, + password: creds.SecretAccessKey, + source: credentials.source, + mechanism: defaultAuthProviders_1.AuthMechanism.MONGODB_AWS, + mechanismProperties: { + AWS_SESSION_TOKEN: creds.Token + } + })); + } + // If the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI + // is set then drivers MUST assume that it was set by an AWS 
ECS agent + if (process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI) { + request(`${AWS_RELATIVE_URI}${process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI}`, undefined, (err, res) => { + if (err) + return callback(err); + done(res); + }); + return; + } + // Otherwise assume we are on an EC2 instance + // get a token + request(`${AWS_EC2_URI}/latest/api/token`, { method: 'PUT', json: false, headers: { 'X-aws-ec2-metadata-token-ttl-seconds': 30 } }, (err, token) => { + if (err) + return callback(err); + // get role name + request(`${AWS_EC2_URI}/${AWS_EC2_PATH}`, { json: false, headers: { 'X-aws-ec2-metadata-token': token } }, (err, roleName) => { + if (err) + return callback(err); + // get temp credentials + request(`${AWS_EC2_URI}/${AWS_EC2_PATH}/${roleName}`, { headers: { 'X-aws-ec2-metadata-token': token } }, (err, creds) => { + if (err) + return callback(err); + done(creds); + }); + }); + }); +} +function deriveRegion(host) { + const parts = host.split('.'); + if (parts.length === 1 || parts[1] === 'amazonaws') { + return 'us-east-1'; + } + return parts[1]; +} +function request(uri, _options, callback) { + const options = Object.assign({ + method: 'GET', + timeout: 10000, + json: true + }, url.parse(uri), _options); + const req = http.request(options, res => { + res.setEncoding('utf8'); + let data = ''; + res.on('data', d => (data += d)); + res.on('end', () => { + if (options.json === false) { + callback(undefined, data); + return; + } + try { + const parsed = JSON.parse(data); + callback(undefined, parsed); + } + catch (err) { + // TODO(NODE-3483) + callback(new error_1.MongoRuntimeError(`Invalid JSON response: "${data}"`)); + } + }); + }); + req.on('error', err => callback(err)); + req.end(); +} +//# sourceMappingURL=mongodb_aws.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/mongodb_aws.js.map b/node_modules/mongodb/lib/cmap/auth/mongodb_aws.js.map new file mode 100644 index 000000000..2d12f9ab3 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/mongodb_aws.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"mongodb_aws.js","sourceRoot":"","sources":["../../../src/cmap/auth/mongodb_aws.ts"],"names":[],"mappings":";;;AAAA,6BAA6B;AAC7B,iCAAiC;AACjC,2BAA2B;AAC3B,mCAAmC;AACnC,mDAA4D;AAC5D,2DAAuD;AACvD,uCAIqB;AACrB,uCAA2D;AAG3D,qCAAkC;AAClC,iEAAuD;AAGvD,MAAM,OAAO,GAAG,GAAG,CAAC;AACpB,MAAM,gBAAgB,GAAG,sBAAsB,CAAC;AAChD,MAAM,WAAW,GAAG,wBAAwB,CAAC;AAC7C,MAAM,YAAY,GAAG,4CAA4C,CAAC;AAClE,MAAM,WAAW,GAAyB;IACxC,YAAY,EAAE,IAAI;IAClB,aAAa,EAAE,IAAI;IACnB,cAAc,EAAE,KAAK;IACrB,UAAU,EAAE,KAAK;CAClB,CAAC;AAQF,MAAa,UAAW,SAAQ,4BAAY;IAC1C,IAAI,CAAC,WAAwB,EAAE,QAAkB;QAC/C,MAAM,EAAE,UAAU,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;QAChD,IAAI,CAAC,WAAW,EAAE;YAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;SAC5F;QAED,IAAI,cAAc,IAAI,WAAI,EAAE;YAC1B,OAAO,QAAQ,CAAC,WAAI,CAAC,cAAc,CAAC,CAAC,CAAC;SACvC;QACD,MAAM,EAAE,IAAI,EAAE,GAAG,WAAI,CAAC;QAEtB,IAAI,sBAAc,CAAC,UAAU,CAAC,GAAG,CAAC,EAAE;YAClC,QAAQ,CACN,IAAI,+BAAuB,CACzB,kEAAkE,CACnE,CACF,CAAC;YACF,OAAO;SACR;QAED,IAAI,CAAC,WAAW,CAAC,QAAQ,EAAE;YACzB,mBAAmB,CAAC,WAAW,EAAE,CAAC,GAAG,EAAE,eAAe,EAAE,EAAE;gBACxD,IAAI,GAAG,IAAI,CAAC,eAAe;oBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;gBAElD,WAAW,CAAC,WAAW,GAAG,eAAe,CAAC;gBAC1C,IAAI,CAAC,IAAI,CAAC,WAAW,EAAE,QAAQ,CAAC,CAAC;YACnC,CAAC,CAAC,CAAC;YAEH,OAAO;SACR;QAED,MAAM,WAAW,GAAG,WAAW,CAAC,QAAQ,CAAC;QACzC,MAAM,eAAe,GAAG,WAAW,CAAC,QAAQ,CAAC;QAC7C,MAAM,YAAY,GAAG,WAAW,CAAC,mBAAmB,CAAC,iBAAiB,CAAC;QAEvE,kGAAkG;QAClG,MAAM,cAAc,GAClB,WAAW,IAAI,eAAe,IAAI,YAAY;YAC5C,CAAC,CAAC,EAAE,WAAW,EAAE,eAAe,EAAE,YAAY,EAAE;YAChD,CAAC,CAAC,WAAW,IAAI,eAAe;gBAChC,CAAC,CAAC,EAAE,WAAW,EAAE,eAAe,EAAE;gBAClC,CAAC,CAAC,SAAS,CAAC;QAEhB,MAAM,EAAE,GAAG,WAAW,CAAC,MAAM,CAAC;QAC9B,MAAM,CAAC,WAAW,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;YACpC,IAAI,GAAG,EAAE;gBACP,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,MAAM,SAAS,GAAG;gBAChB,SAAS,EAAE,CAAC;gBACZ,SAAS,EAAE,aAAa;gBACxB,OAAO,EAAE,IAAI,CAAC,SAAS,CAAC,EAAE,CAAC,EAAE,KAAK,EAAE,CAAC,EAAE,OAAO,EAAE,EAAE,WAAW,CAAC;aAC/D,CAAC;YAEF,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,EAAE,OAAO,CAAC,EAAE,SAAS,EAAE,SAAS,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;gBACtE,IAAI,GAAG;oBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;gBAE9B,MAAM,cAAc,GAAG,IAAI,CAAC,WAAW,CAAC,GAAG,CAAC,OAAO,CAAC,MAAM,EAAE,WAAW,CAGtE,CAAC;gBACF,MAAM,IAAI,GAAG,cAAc,CAAC,CAAC,CAAC;gBAC9B,MAAM,WAAW,GAAG,cAAc,CAAC,CAAC,CAAC,MAAM,CAAC;gBAC5C,IAAI,WAAW,CAAC,MAAM,KAAK,EAAE,EAAE;oBAC7B,QAAQ;oBACN,kBAAkB;oBAClB,IAAI,yBAAiB,CAAC,+BAA+B,WAAW,CAAC,MAAM,eAAe,CAAC,CACxF,CAAC;oBAEF,OAAO;iBACR;gBAED,IAAI,WAAW,CAAC,OAAO,CAAC,KAAK,EAAE,CAAC,EAAE,KAAK,CAAC,MAAM,EAAE,CAAC,EAAE,KAAK,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE;oBACtE,kBAAkB;oBAClB,QAAQ,CAAC,IAAI,yBAAiB,CAAC,+CAA+C,CAAC,CAAC,CAAC;oBACjF,OAAO;iBACR;gBAED,IAAI,IAAI,CAAC,MAAM,GAAG,CAAC,IAAI,IAAI,CAAC,MAAM,GAAG,GAAG,IAAI,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,EAAE;oBACrE,kBAAkB;oBAClB,QAAQ,CAAC,IAAI,yBAAiB,CAAC,qCAAqC,IAAI,GAAG,CAAC,CAAC,CAAC;oBAC9E,OAAO;iBACR;gBAED,MAAM,IAAI,GAAG,6CAA6C,CAAC;gBAC3D,MAAM,OAAO,GAAG,IAAI,CAClB;oBACE,MAAM,EAAE,MAAM;oBACd,IAAI;oBACJ,MAAM,EAAE,YAAY,CAAC,cAAc,CAAC,CAAC,CAAC;oBACtC,OAAO,EAAE,KAAK;oBACd,OAAO,EAAE;wBACP,cAAc,EAAE,mCAAmC;wBACnD,gBAAgB,EAAE,IAAI,CAAC,MAAM;wBAC7B,wBAAwB,EAAE,WAAW,CAAC,QAAQ,CAAC,QAAQ,CAAC;wBACxD,uBAAuB,EAAE,GAAG;qBAC7B;oBACD,IAAI,EAAE,GAAG;oBACT,IAAI;iBACL,EACD,cAAc,CACf,CAAC;gBAEF,MAAM,OAAO,GAA2B;oBACtC,CAAC,EAAE,OAAO,CAAC,OAAO,CAAC,aAAa;oBAChC,CAAC,EAAE,OAAO,CAAC,OAAO,CAAC,YAAY,CAAC;iBACjC,CAAC;gBACF,IAAI,YAAY,EAAE;oBAChB,OAAO,CAAC,CAAC,GAAG,YAAY,CAAC;iBAC1B;gBAED,MAAM,YAAY,GAAG;oBACnB,YAAY,EAAE,CAAC;oBACf,cAAc,EAAE,CAAC;oBACjB,OAAO,EAAE,IAAI,CAAC,SAAS,CAAC,OAAO,EAAE,WAAW,CAAC;iBAC9C,CAAC;gB
AEF,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,EAAE,OAAO,CAAC,EAAE,YAAY,EAAE,SAAS,EAAE,QAAQ,CAAC,CAAC;YAC1E,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA5HD,gCA4HC;AAUD,SAAS,mBAAmB,CAAC,WAA6B,EAAE,QAAoC;IAC9F,SAAS,IAAI,CAAC,KAAyB;QACrC,IAAI,CAAC,KAAK,CAAC,WAAW,IAAI,CAAC,KAAK,CAAC,eAAe,IAAI,CAAC,KAAK,CAAC,KAAK,EAAE;YAChE,QAAQ,CACN,IAAI,oCAA4B,CAAC,oDAAoD,CAAC,CACvF,CAAC;YACF,OAAO;SACR;QAED,QAAQ,CACN,SAAS,EACT,IAAI,oCAAgB,CAAC;YACnB,QAAQ,EAAE,KAAK,CAAC,WAAW;YAC3B,QAAQ,EAAE,KAAK,CAAC,eAAe;YAC/B,MAAM,EAAE,WAAW,CAAC,MAAM;YAC1B,SAAS,EAAE,oCAAa,CAAC,WAAW;YACpC,mBAAmB,EAAE;gBACnB,iBAAiB,EAAE,KAAK,CAAC,KAAK;aAC/B;SACF,CAAC,CACH,CAAC;IACJ,CAAC;IAED,qEAAqE;IACrE,sEAAsE;IACtE,IAAI,OAAO,CAAC,GAAG,CAAC,sCAAsC,EAAE;QACtD,OAAO,CACL,GAAG,gBAAgB,GAAG,OAAO,CAAC,GAAG,CAAC,sCAAsC,EAAE,EAC1E,SAAS,EACT,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;YACX,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,IAAI,CAAC,GAAG,CAAC,CAAC;QACZ,CAAC,CACF,CAAC;QAEF,OAAO;KACR;IAED,6CAA6C;IAE7C,cAAc;IACd,OAAO,CACL,GAAG,WAAW,mBAAmB,EACjC,EAAE,MAAM,EAAE,KAAK,EAAE,IAAI,EAAE,KAAK,EAAE,OAAO,EAAE,EAAE,sCAAsC,EAAE,EAAE,EAAE,EAAE,EACvF,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;QACb,IAAI,GAAG;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAE9B,gBAAgB;QAChB,OAAO,CACL,GAAG,WAAW,IAAI,YAAY,EAAE,EAChC,EAAE,IAAI,EAAE,KAAK,EAAE,OAAO,EAAE,EAAE,0BAA0B,EAAE,KAAK,EAAE,EAAE,EAC/D,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAChB,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAE9B,uBAAuB;YACvB,OAAO,CACL,GAAG,WAAW,IAAI,YAAY,IAAI,QAAQ,EAAE,EAC5C,EAAE,OAAO,EAAE,EAAE,0BAA0B,EAAE,KAAK,EAAE,EAAE,EAClD,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;gBACb,IAAI,GAAG;oBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;gBAC9B,IAAI,CAAC,KAAK,CAAC,CAAC;YACd,CAAC,CACF,CAAC;QACJ,CAAC,CACF,CAAC;IACJ,CAAC,CACF,CAAC;AACJ,CAAC;AAED,SAAS,YAAY,CAAC,IAAY;IAChC,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IAC9B,IAAI,KAAK,CAAC,MAAM,KAAK,CAAC,IAAI,KAAK,CAAC,CAAC,CAAC,KAAK,WAAW,EAAE;QAClD,OAAO,WAAW,CAAC;KACpB;IAED,OAAO,KAAK,CAAC,CAAC,CAAC,CAAC;AAClB,CAAC;AASD,SAAS,OAAO,CAAC,GAAW,EAAE,QAAoC,EAAE,QAAkB;IACpF,MAAM,OAAO,GAAG,MAAM,CAAC,MAAM,CAC3B;QACE,MAAM,EAAE,KAAK;QACb,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,IAAI;KACX,EACD,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,EACd,QAAQ,CACT,CAAC;IAEF,MAAM,GAAG,GAAG,IAAI,CAAC,OAAO,CAAC,OAAO,EAAE,GAAG,CAAC,EAAE;QACtC,GAAG,CAAC,WAAW,CAAC,MAAM,CAAC,CAAC;QAExB,IAAI,IAAI,GAAG,EAAE,CAAC;QACd,GAAG,CAAC,EAAE,CAAC,MAAM,EAAE,CAAC,CAAC,EAAE,CAAC,CAAC,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC;QACjC,GAAG,CAAC,EAAE,CAAC,KAAK,EAAE,GAAG,EAAE;YACjB,IAAI,OAAO,CAAC,IAAI,KAAK,KAAK,EAAE;gBAC1B,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;gBAC1B,OAAO;aACR;YAED,IAAI;gBACF,MAAM,MAAM,GAAG,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC;gBAChC,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;aAC7B;YAAC,OAAO,GAAG,EAAE;gBACZ,kBAAkB;gBAClB,QAAQ,CAAC,IAAI,yBAAiB,CAAC,2BAA2B,IAAI,GAAG,CAAC,CAAC,CAAC;aACrE;QACH,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;IAEH,GAAG,CAAC,EAAE,CAAC,OAAO,EAAE,GAAG,CAAC,EAAE,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC;IACtC,GAAG,CAAC,GAAG,EAAE,CAAC;AACZ,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/plain.js b/node_modules/mongodb/lib/cmap/auth/plain.js new file mode 100644 index 000000000..a128b1b8c --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/plain.js @@ -0,0 +1,27 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Plain = void 0; +const bson_1 = require("../../bson"); +const auth_provider_1 = require("./auth_provider"); +const error_1 = require("../../error"); +const utils_1 = require("../../utils"); +class Plain extends auth_provider_1.AuthProvider { + auth(authContext, callback) { + const { connection, credentials } = 
authContext; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + const username = credentials.username; + const password = credentials.password; + const payload = new bson_1.Binary(Buffer.from(`\x00${username}\x00${password}`)); + const command = { + saslStart: 1, + mechanism: 'PLAIN', + payload: payload, + autoAuthorize: 1 + }; + connection.command(utils_1.ns('$external.$cmd'), command, undefined, callback); + } +} +exports.Plain = Plain; +//# sourceMappingURL=plain.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/plain.js.map b/node_modules/mongodb/lib/cmap/auth/plain.js.map new file mode 100644 index 000000000..aa85a393d --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/plain.js.map @@ -0,0 +1 @@ +{"version":3,"file":"plain.js","sourceRoot":"","sources":["../../../src/cmap/auth/plain.ts"],"names":[],"mappings":";;;AAAA,qCAAoC;AACpC,mDAA4D;AAC5D,uCAA2D;AAC3D,uCAA2C;AAE3C,MAAa,KAAM,SAAQ,4BAAY;IACrC,IAAI,CAAC,WAAwB,EAAE,QAAkB;QAC/C,MAAM,EAAE,UAAU,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;QAChD,IAAI,CAAC,WAAW,EAAE;YAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;SAC5F;QACD,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;QACtC,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;QAEtC,MAAM,OAAO,GAAG,IAAI,aAAM,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,QAAQ,OAAO,QAAQ,EAAE,CAAC,CAAC,CAAC;QAC1E,MAAM,OAAO,GAAG;YACd,SAAS,EAAE,CAAC;YACZ,SAAS,EAAE,OAAO;YAClB,OAAO,EAAE,OAAO;YAChB,aAAa,EAAE,CAAC;SACjB,CAAC;QAEF,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,gBAAgB,CAAC,EAAE,OAAO,EAAE,SAAS,EAAE,QAAQ,CAAC,CAAC;IACzE,CAAC;CACF;AAnBD,sBAmBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/scram.js b/node_modules/mongodb/lib/cmap/auth/scram.js new file mode 100644 index 000000000..5e191040f --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/scram.js @@ -0,0 +1,276 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ScramSHA256 = exports.ScramSHA1 = void 0; +const crypto = require("crypto"); +const bson_1 = require("../../bson"); +const error_1 = require("../../error"); +const auth_provider_1 = require("./auth_provider"); +const utils_1 = require("../../utils"); +const deps_1 = require("../../deps"); +const defaultAuthProviders_1 = require("./defaultAuthProviders"); +class ScramSHA extends auth_provider_1.AuthProvider { + constructor(cryptoMethod) { + super(); + this.cryptoMethod = cryptoMethod || 'sha1'; + } + prepare(handshakeDoc, authContext, callback) { + const cryptoMethod = this.cryptoMethod; + const credentials = authContext.credentials; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if (cryptoMethod === 'sha256' && deps_1.saslprep == null) { + utils_1.emitWarning('Warning: no saslprep library specified. 
Passwords will not be sanitized'); + } + crypto.randomBytes(24, (err, nonce) => { + if (err) { + return callback(err); + } + // store the nonce for later use + Object.assign(authContext, { nonce }); + const request = Object.assign({}, handshakeDoc, { + speculativeAuthenticate: Object.assign(makeFirstMessage(cryptoMethod, credentials, nonce), { + db: credentials.source + }) + }); + callback(undefined, request); + }); + } + auth(authContext, callback) { + const response = authContext.response; + if (response && response.speculativeAuthenticate) { + continueScramConversation(this.cryptoMethod, response.speculativeAuthenticate, authContext, callback); + return; + } + executeScram(this.cryptoMethod, authContext, callback); + } +} +function cleanUsername(username) { + return username.replace('=', '=3D').replace(',', '=2C'); +} +function clientFirstMessageBare(username, nonce) { + // NOTE: This is done b/c Javascript uses UTF-16, but the server is hashing in UTF-8. + // Since the username is not sasl-prep-d, we need to do this here. + return Buffer.concat([ + Buffer.from('n=', 'utf8'), + Buffer.from(username, 'utf8'), + Buffer.from(',r=', 'utf8'), + Buffer.from(nonce.toString('base64'), 'utf8') + ]); +} +function makeFirstMessage(cryptoMethod, credentials, nonce) { + const username = cleanUsername(credentials.username); + const mechanism = cryptoMethod === 'sha1' ? defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA1 : defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA256; + // NOTE: This is done b/c Javascript uses UTF-16, but the server is hashing in UTF-8. + // Since the username is not sasl-prep-d, we need to do this here. + return { + saslStart: 1, + mechanism, + payload: new bson_1.Binary(Buffer.concat([Buffer.from('n,,', 'utf8'), clientFirstMessageBare(username, nonce)])), + autoAuthorize: 1, + options: { skipEmptyExchange: true } + }; +} +function executeScram(cryptoMethod, authContext, callback) { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if (!authContext.nonce) { + return callback(new error_1.MongoInvalidArgumentError('AuthContext must contain a valid nonce property')); + } + const nonce = authContext.nonce; + const db = credentials.source; + const saslStartCmd = makeFirstMessage(cryptoMethod, credentials, nonce); + connection.command(utils_1.ns(`${db}.$cmd`), saslStartCmd, undefined, (_err, result) => { + const err = resolveError(_err, result); + if (err) { + return callback(err); + } + continueScramConversation(cryptoMethod, result, authContext, callback); + }); +} +function continueScramConversation(cryptoMethod, response, authContext, callback) { + const connection = authContext.connection; + const credentials = authContext.credentials; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if (!authContext.nonce) { + return callback(new error_1.MongoInvalidArgumentError('Unable to continue SCRAM without valid nonce')); + } + const nonce = authContext.nonce; + const db = credentials.source; + const username = cleanUsername(credentials.username); + const password = credentials.password; + let processedPassword; + if (cryptoMethod === 'sha256') { + processedPassword = 'kModuleError' in deps_1.saslprep ? 
password : deps_1.saslprep(password); + } + else { + try { + processedPassword = passwordDigest(username, password); + } + catch (e) { + return callback(e); + } + } + const payload = Buffer.isBuffer(response.payload) + ? new bson_1.Binary(response.payload) + : response.payload; + const dict = parsePayload(payload.value()); + const iterations = parseInt(dict.i, 10); + if (iterations && iterations < 4096) { + callback( + // TODO(NODE-3483) + new error_1.MongoRuntimeError(`Server returned an invalid iteration count ${iterations}`), false); + return; + } + const salt = dict.s; + const rnonce = dict.r; + if (rnonce.startsWith('nonce')) { + // TODO(NODE-3483) + callback(new error_1.MongoRuntimeError(`Server returned an invalid nonce: ${rnonce}`), false); + return; + } + // Set up start of proof + const withoutProof = `c=biws,r=${rnonce}`; + const saltedPassword = HI(processedPassword, Buffer.from(salt, 'base64'), iterations, cryptoMethod); + const clientKey = HMAC(cryptoMethod, saltedPassword, 'Client Key'); + const serverKey = HMAC(cryptoMethod, saltedPassword, 'Server Key'); + const storedKey = H(cryptoMethod, clientKey); + const authMessage = [clientFirstMessageBare(username, nonce), payload.value(), withoutProof].join(','); + const clientSignature = HMAC(cryptoMethod, storedKey, authMessage); + const clientProof = `p=${xor(clientKey, clientSignature)}`; + const clientFinal = [withoutProof, clientProof].join(','); + const serverSignature = HMAC(cryptoMethod, serverKey, authMessage); + const saslContinueCmd = { + saslContinue: 1, + conversationId: response.conversationId, + payload: new bson_1.Binary(Buffer.from(clientFinal)) + }; + connection.command(utils_1.ns(`${db}.$cmd`), saslContinueCmd, undefined, (_err, r) => { + const err = resolveError(_err, r); + if (err) { + return callback(err); + } + const parsedResponse = parsePayload(r.payload.value()); + if (!compareDigest(Buffer.from(parsedResponse.v, 'base64'), serverSignature)) { + callback(new error_1.MongoRuntimeError('Server returned an invalid signature')); + return; + } + if (!r || r.done !== false) { + return callback(err, r); + } + const retrySaslContinueCmd = { + saslContinue: 1, + conversationId: r.conversationId, + payload: Buffer.alloc(0) + }; + connection.command(utils_1.ns(`${db}.$cmd`), retrySaslContinueCmd, undefined, callback); + }); +} +function parsePayload(payload) { + const dict = {}; + const parts = payload.split(','); + for (let i = 0; i < parts.length; i++) { + const valueParts = parts[i].split('='); + dict[valueParts[0]] = valueParts[1]; + } + return dict; +} +function passwordDigest(username, password) { + if (typeof username !== 'string') { + throw new error_1.MongoInvalidArgumentError('Username must be a string'); + } + if (typeof password !== 'string') { + throw new error_1.MongoInvalidArgumentError('Password must be a string'); + } + if (password.length === 0) { + throw new error_1.MongoInvalidArgumentError('Password cannot be empty'); + } + const md5 = crypto.createHash('md5'); + md5.update(`${username}:mongo:${password}`, 'utf8'); + return md5.digest('hex'); +} +// XOR two buffers +function xor(a, b) { + if (!Buffer.isBuffer(a)) { + a = Buffer.from(a); + } + if (!Buffer.isBuffer(b)) { + b = Buffer.from(b); + } + const length = Math.max(a.length, b.length); + const res = []; + for (let i = 0; i < length; i += 1) { + res.push(a[i] ^ b[i]); + } + return Buffer.from(res).toString('base64'); +} +function H(method, text) { + return crypto.createHash(method).update(text).digest(); +} +function HMAC(method, key, text) 
{ + return crypto.createHmac(method, key).update(text).digest(); +} +let _hiCache = {}; +let _hiCacheCount = 0; +function _hiCachePurge() { + _hiCache = {}; + _hiCacheCount = 0; +} +const hiLengthMap = { + sha256: 32, + sha1: 20 +}; +function HI(data, salt, iterations, cryptoMethod) { + // omit the work if already generated + const key = [data, salt.toString('base64'), iterations].join('_'); + if (_hiCache[key] != null) { + return _hiCache[key]; + } + // generate the salt + const saltedData = crypto.pbkdf2Sync(data, salt, iterations, hiLengthMap[cryptoMethod], cryptoMethod); + // cache a copy to speed up the next lookup, but prevent unbounded cache growth + if (_hiCacheCount >= 200) { + _hiCachePurge(); + } + _hiCache[key] = saltedData; + _hiCacheCount += 1; + return saltedData; +} +function compareDigest(lhs, rhs) { + if (lhs.length !== rhs.length) { + return false; + } + if (typeof crypto.timingSafeEqual === 'function') { + return crypto.timingSafeEqual(lhs, rhs); + } + let result = 0; + for (let i = 0; i < lhs.length; i++) { + result |= lhs[i] ^ rhs[i]; + } + return result === 0; +} +function resolveError(err, result) { + if (err) + return err; + if (result) { + if (result.$err || result.errmsg) + return new error_1.MongoServerError(result); + } +} +class ScramSHA1 extends ScramSHA { + constructor() { + super('sha1'); + } +} +exports.ScramSHA1 = ScramSHA1; +class ScramSHA256 extends ScramSHA { + constructor() { + super('sha256'); + } +} +exports.ScramSHA256 = ScramSHA256; +//# sourceMappingURL=scram.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/scram.js.map b/node_modules/mongodb/lib/cmap/auth/scram.js.map new file mode 100644 index 000000000..e77df0cd6 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/scram.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"scram.js","sourceRoot":"","sources":["../../../src/cmap/auth/scram.ts"],"names":[],"mappings":";;;AAAA,iCAAiC;AACjC,qCAA8C;AAC9C,uCAMqB;AACrB,mDAA4D;AAC5D,uCAAwD;AAIxD,qCAAsC;AACtC,iEAAuD;AAIvD,MAAM,QAAS,SAAQ,4BAAY;IAEjC,YAAY,YAA0B;QACpC,KAAK,EAAE,CAAC;QACR,IAAI,CAAC,YAAY,GAAG,YAAY,IAAI,MAAM,CAAC;IAC7C,CAAC;IAED,OAAO,CAAC,YAA+B,EAAE,WAAwB,EAAE,QAAkB;QACnF,MAAM,YAAY,GAAG,IAAI,CAAC,YAAY,CAAC;QACvC,MAAM,WAAW,GAAG,WAAW,CAAC,WAAW,CAAC;QAC5C,IAAI,CAAC,WAAW,EAAE;YAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;SAC5F;QACD,IAAI,YAAY,KAAK,QAAQ,IAAI,eAAQ,IAAI,IAAI,EAAE;YACjD,mBAAW,CAAC,yEAAyE,CAAC,CAAC;SACxF;QAED,MAAM,CAAC,WAAW,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;YACpC,IAAI,GAAG,EAAE;gBACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,gCAAgC;YAChC,MAAM,CAAC,MAAM,CAAC,WAAW,EAAE,EAAE,KAAK,EAAE,CAAC,CAAC;YAEtC,MAAM,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,YAAY,EAAE;gBAC9C,uBAAuB,EAAE,MAAM,CAAC,MAAM,CAAC,gBAAgB,CAAC,YAAY,EAAE,WAAW,EAAE,KAAK,CAAC,EAAE;oBACzF,EAAE,EAAE,WAAW,CAAC,MAAM;iBACvB,CAAC;aACH,CAAC,CAAC;YAEH,QAAQ,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC/B,CAAC,CAAC,CAAC;IACL,CAAC;IAED,IAAI,CAAC,WAAwB,EAAE,QAAkB;QAC/C,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;QACtC,IAAI,QAAQ,IAAI,QAAQ,CAAC,uBAAuB,EAAE;YAChD,yBAAyB,CACvB,IAAI,CAAC,YAAY,EACjB,QAAQ,CAAC,uBAAuB,EAChC,WAAW,EACX,QAAQ,CACT,CAAC;YAEF,OAAO;SACR;QAED,YAAY,CAAC,IAAI,CAAC,YAAY,EAAE,WAAW,EAAE,QAAQ,CAAC,CAAC;IACzD,CAAC;CACF;AAED,SAAS,aAAa,CAAC,QAAgB;IACrC,OAAO,QAAQ,CAAC,OAAO,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC,OAAO,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;AAC1D,CAAC;AAED,SAAS,sBAAsB,CAAC,QAAgB,EAAE,KAAa;IAC7D,qFAAqF;IACrF,kEAAkE;IAClE,OAAO,MAAM,CAAC,MAAM,CAAC;QACnB,MAAM,CAAC,IAAI,CAAC,IAAI,EAAE,MAAM,CAAC;QACzB,MAAM,CAAC,IAAI,CAAC,QAAQ,EAAE,MAAM,CAAC;QAC7B,MAAM,CAAC,IAAI,CAAC,KAAK,EAAE,MAAM,CAAC;QAC1B,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,QAAQ,CAAC,QAAQ,CAAC,EAAE,MAAM,CAAC;KAC9C,CAAC,CAAC;AACL,CAAC;AAED,SAAS,gBAAgB,CACvB,YAA0B,EAC1B,WAA6B,EAC7B,KAAa;IAEb,MAAM,QAAQ,GAAG,aAAa,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC;IACrD,MAAM,SAAS,GACb,YAAY,KAAK,MAAM,CAAC,CAAC,CAAC,oCAAa,CAAC,kBAAkB,CAAC,CAAC,CAAC,oCAAa,CAAC,oBAAoB,CAAC;IAElG,qFAAqF;IACrF,kEAAkE;IAClE,OAAO;QACL,SAAS,EAAE,CAAC;QACZ,SAAS;QACT,OAAO,EAAE,IAAI,aAAM,CACjB,MAAM,CAAC,MAAM,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,KAAK,EAAE,MAAM,CAAC,EAAE,sBAAsB,CAAC,QAAQ,EAAE,KAAK,CAAC,CAAC,CAAC,CACrF;QACD,aAAa,EAAE,CAAC;QAChB,OAAO,EAAE,EAAE,iBAAiB,EAAE,IAAI,EAAE;KACrC,CAAC;AACJ,CAAC;AAED,SAAS,YAAY,CAAC,YAA0B,EAAE,WAAwB,EAAE,QAAkB;IAC5F,MAAM,EAAE,UAAU,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;IAChD,IAAI,CAAC,WAAW,EAAE;QAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;KAC5F;IACD,IAAI,CAAC,WAAW,CAAC,KAAK,EAAE;QACtB,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAAC,iDAAiD,CAAC,CACjF,CAAC;KACH;IACD,MAAM,KAAK,GAAG,WAAW,CAAC,KAAK,CAAC;IAChC,MAAM,EAAE,GAAG,WAAW,CAAC,MAAM,CAAC;IAE9B,MAAM,YAAY,GAAG,gBAAgB,CAAC,YAAY,EAAE,WAAW,EAAE,KAAK,CAAC,CAAC;IACxE,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,EAAE,OAAO,CAAC,EAAE,YAAY,EAAE,SAAS,EAAE,CAAC,IAAI,EAAE,MAAM,EAAE,EAAE;QAC7E,MAAM,GAAG,GAAG,YAAY,CAAC,IAAI,EAAE,MAAM,CAAC,CAAC;QACvC,IAAI,GAAG,EAAE;YACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,yBAAyB,CAAC,YAAY,EAAE,MAAM,EAAE,WAAW,EAAE,QAAQ,CAAC,CAAC;IACzE,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,yBAAyB,CAChC,YAA0B,EAC1B,QAAkB,EAClB,WAAwB,EACxB,QAAkB;IAElB,MAAM,UAAU,GAAG,WAAW,CAAC,UAAU,CAAC;IAC1C,MAAM,WAAW,GAAG,WAAW,CAAC,WAAW,CAAC;IAC5C,IAAI,CAAC,WAAW,EAAE;QAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;KAC5F;IACD,IAAI,CAAC,WAAW,CAAC,KAAK,EAAE;QACtB,OAAO,QAAQ,CAAC,IAAI,iCAAyB,CAAC,8CAA8C,CAAC,CAAC,CAAC;KAChG;IACD,MAAM,KAAK,GAAG,WAAW,CAAC,KAAK,CAAC;IAEhC,MAAM,EAAE,GAAG,WAAW,CAA
C,MAAM,CAAC;IAC9B,MAAM,QAAQ,GAAG,aAAa,CAAC,WAAW,CAAC,QAAQ,CAAC,CAAC;IACrD,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;IAEtC,IAAI,iBAAiB,CAAC;IACtB,IAAI,YAAY,KAAK,QAAQ,EAAE;QAC7B,iBAAiB,GAAG,cAAc,IAAI,eAAQ,CAAC,CAAC,CAAC,QAAQ,CAAC,CAAC,CAAC,eAAQ,CAAC,QAAQ,CAAC,CAAC;KAChF;SAAM;QACL,IAAI;YACF,iBAAiB,GAAG,cAAc,CAAC,QAAQ,EAAE,QAAQ,CAAC,CAAC;SACxD;QAAC,OAAO,CAAC,EAAE;YACV,OAAO,QAAQ,CAAC,CAAC,CAAC,CAAC;SACpB;KACF;IAED,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,QAAQ,CAAC,OAAO,CAAC;QAC/C,CAAC,CAAC,IAAI,aAAM,CAAC,QAAQ,CAAC,OAAO,CAAC;QAC9B,CAAC,CAAC,QAAQ,CAAC,OAAO,CAAC;IACrB,MAAM,IAAI,GAAG,YAAY,CAAC,OAAO,CAAC,KAAK,EAAE,CAAC,CAAC;IAE3C,MAAM,UAAU,GAAG,QAAQ,CAAC,IAAI,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC;IACxC,IAAI,UAAU,IAAI,UAAU,GAAG,IAAI,EAAE;QACnC,QAAQ;QACN,kBAAkB;QAClB,IAAI,yBAAiB,CAAC,8CAA8C,UAAU,EAAE,CAAC,EACjF,KAAK,CACN,CAAC;QACF,OAAO;KACR;IAED,MAAM,IAAI,GAAG,IAAI,CAAC,CAAC,CAAC;IACpB,MAAM,MAAM,GAAG,IAAI,CAAC,CAAC,CAAC;IACtB,IAAI,MAAM,CAAC,UAAU,CAAC,OAAO,CAAC,EAAE;QAC9B,kBAAkB;QAClB,QAAQ,CAAC,IAAI,yBAAiB,CAAC,qCAAqC,MAAM,EAAE,CAAC,EAAE,KAAK,CAAC,CAAC;QACtF,OAAO;KACR;IAED,wBAAwB;IACxB,MAAM,YAAY,GAAG,YAAY,MAAM,EAAE,CAAC;IAC1C,MAAM,cAAc,GAAG,EAAE,CACvB,iBAAiB,EACjB,MAAM,CAAC,IAAI,CAAC,IAAI,EAAE,QAAQ,CAAC,EAC3B,UAAU,EACV,YAAY,CACb,CAAC;IAEF,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,EAAE,cAAc,EAAE,YAAY,CAAC,CAAC;IACnE,MAAM,SAAS,GAAG,IAAI,CAAC,YAAY,EAAE,cAAc,EAAE,YAAY,CAAC,CAAC;IACnE,MAAM,SAAS,GAAG,CAAC,CAAC,YAAY,EAAE,SAAS,CAAC,CAAC;IAC7C,MAAM,WAAW,GAAG,CAAC,sBAAsB,CAAC,QAAQ,EAAE,KAAK,CAAC,EAAE,OAAO,CAAC,KAAK,EAAE,EAAE,YAAY,CAAC,CAAC,IAAI,CAC/F,GAAG,CACJ,CAAC;IAEF,MAAM,eAAe,GAAG,IAAI,CAAC,YAAY,EAAE,SAAS,EAAE,WAAW,CAAC,CAAC;IACnE,MAAM,WAAW,GAAG,KAAK,GAAG,CAAC,SAAS,EAAE,eAAe,CAAC,EAAE,CAAC;IAC3D,MAAM,WAAW,GAAG,CAAC,YAAY,EAAE,WAAW,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;IAE1D,MAAM,eAAe,GAAG,IAAI,CAAC,YAAY,EAAE,SAAS,EAAE,WAAW,CAAC,CAAC;IACnE,MAAM,eAAe,GAAG;QACtB,YAAY,EAAE,CAAC;QACf,cAAc,EAAE,QAAQ,CAAC,cAAc;QACvC,OAAO,EAAE,IAAI,aAAM,CAAC,MAAM,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC;KAC9C,CAAC;IAEF,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,EAAE,OAAO,CAAC,EAAE,eAAe,EAAE,SAAS,EAAE,CAAC,IAAI,EAAE,CAAC,EAAE,EAAE;QAC3E,MAAM,GAAG,GAAG,YAAY,CAAC,IAAI,EAAE,CAAC,CAAC,CAAC;QAClC,IAAI,GAAG,EAAE;YACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,MAAM,cAAc,GAAG,YAAY,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,EAAE,CAAC,CAAC;QACvD,IAAI,CAAC,aAAa,CAAC,MAAM,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC,EAAE,QAAQ,CAAC,EAAE,eAAe,CAAC,EAAE;YAC5E,QAAQ,CAAC,IAAI,yBAAiB,CAAC,sCAAsC,CAAC,CAAC,CAAC;YACxE,OAAO;SACR;QAED,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,IAAI,KAAK,KAAK,EAAE;YAC1B,OAAO,QAAQ,CAAC,GAAG,EAAE,CAAC,CAAC,CAAC;SACzB;QAED,MAAM,oBAAoB,GAAG;YAC3B,YAAY,EAAE,CAAC;YACf,cAAc,EAAE,CAAC,CAAC,cAAc;YAChC,OAAO,EAAE,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC;SACzB,CAAC;QAEF,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,GAAG,EAAE,OAAO,CAAC,EAAE,oBAAoB,EAAE,SAAS,EAAE,QAAQ,CAAC,CAAC;IAClF,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,YAAY,CAAC,OAAe;IACnC,MAAM,IAAI,GAAa,EAAE,CAAC;IAC1B,MAAM,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IACjC,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,KAAK,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;QACrC,MAAM,UAAU,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;QACvC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;KACrC;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AAED,SAAS,cAAc,CAAC,QAAgB,EAAE,QAAgB;IACxD,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;QAChC,MAAM,IAAI,iCAAyB,CAAC,2BAA2B,CAAC,CAAC;KAClE;IAED,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;QAChC,MAAM,IAAI,iCAAyB,CAAC,2BAA2B,CAAC,CAAC;KAClE;IAED,IAAI,QAAQ,CAAC,MAAM,KAAK,CAAC,EAAE;QACzB,MAAM,IAAI,iCAAyB,CAAC,0BAA0B,CAAC,CAAC;KACjE;IAED,MAAM,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,KAAK,CAAC,CAAC;IACrC,GAAG,CAAC,MAAM,CAAC,GAAG,QAAQ,UAAU
,QAAQ,EAAE,EAAE,MAAM,CAAC,CAAC;IACpD,OAAO,GAAG,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;AAC3B,CAAC;AAED,kBAAkB;AAClB,SAAS,GAAG,CAAC,CAAS,EAAE,CAAS;IAC/B,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,CAAC,EAAE;QACvB,CAAC,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;KACpB;IAED,IAAI,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,CAAC,EAAE;QACvB,CAAC,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;KACpB;IAED,MAAM,MAAM,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,MAAM,EAAE,CAAC,CAAC,MAAM,CAAC,CAAC;IAC5C,MAAM,GAAG,GAAG,EAAE,CAAC;IAEf,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,IAAI,CAAC,EAAE;QAClC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;KACvB;IAED,OAAO,MAAM,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,QAAQ,CAAC,QAAQ,CAAC,CAAC;AAC7C,CAAC;AAED,SAAS,CAAC,CAAC,MAAoB,EAAE,IAAY;IAC3C,OAAO,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,MAAM,EAAE,CAAC;AACzD,CAAC;AAED,SAAS,IAAI,CAAC,MAAoB,EAAE,GAAW,EAAE,IAAqB;IACpE,OAAO,MAAM,CAAC,UAAU,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,MAAM,EAAE,CAAC;AAC9D,CAAC;AAMD,IAAI,QAAQ,GAAY,EAAE,CAAC;AAC3B,IAAI,aAAa,GAAG,CAAC,CAAC;AACtB,SAAS,aAAa;IACpB,QAAQ,GAAG,EAAE,CAAC;IACd,aAAa,GAAG,CAAC,CAAC;AACpB,CAAC;AAED,MAAM,WAAW,GAAG;IAClB,MAAM,EAAE,EAAE;IACV,IAAI,EAAE,EAAE;CACT,CAAC;AAEF,SAAS,EAAE,CAAC,IAAY,EAAE,IAAY,EAAE,UAAkB,EAAE,YAA0B;IACpF,qCAAqC;IACrC,MAAM,GAAG,GAAG,CAAC,IAAI,EAAE,IAAI,CAAC,QAAQ,CAAC,QAAQ,CAAC,EAAE,UAAU,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;IAClE,IAAI,QAAQ,CAAC,GAAG,CAAC,IAAI,IAAI,EAAE;QACzB,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;KACtB;IAED,oBAAoB;IACpB,MAAM,UAAU,GAAG,MAAM,CAAC,UAAU,CAClC,IAAI,EACJ,IAAI,EACJ,UAAU,EACV,WAAW,CAAC,YAAY,CAAC,EACzB,YAAY,CACb,CAAC;IAEF,+EAA+E;IAC/E,IAAI,aAAa,IAAI,GAAG,EAAE;QACxB,aAAa,EAAE,CAAC;KACjB;IAED,QAAQ,CAAC,GAAG,CAAC,GAAG,UAAU,CAAC;IAC3B,aAAa,IAAI,CAAC,CAAC;IACnB,OAAO,UAAU,CAAC;AACpB,CAAC;AAED,SAAS,aAAa,CAAC,GAAW,EAAE,GAAe;IACjD,IAAI,GAAG,CAAC,MAAM,KAAK,GAAG,CAAC,MAAM,EAAE;QAC7B,OAAO,KAAK,CAAC;KACd;IAED,IAAI,OAAO,MAAM,CAAC,eAAe,KAAK,UAAU,EAAE;QAChD,OAAO,MAAM,CAAC,eAAe,CAAC,GAAG,EAAE,GAAG,CAAC,CAAC;KACzC;IAED,IAAI,MAAM,GAAG,CAAC,CAAC;IACf,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,GAAG,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;QACnC,MAAM,IAAI,GAAG,CAAC,CAAC,CAAC,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC;KAC3B;IAED,OAAO,MAAM,KAAK,CAAC,CAAC;AACtB,CAAC;AAED,SAAS,YAAY,CAAC,GAAc,EAAE,MAAiB;IACrD,IAAI,GAAG;QAAE,OAAO,GAAG,CAAC;IACpB,IAAI,MAAM,EAAE;QACV,IAAI,MAAM,CAAC,IAAI,IAAI,MAAM,CAAC,MAAM;YAAE,OAAO,IAAI,wBAAgB,CAAC,MAAM,CAAC,CAAC;KACvE;AACH,CAAC;AAED,MAAa,SAAU,SAAQ,QAAQ;IACrC;QACE,KAAK,CAAC,MAAM,CAAC,CAAC;IAChB,CAAC;CACF;AAJD,8BAIC;AAED,MAAa,WAAY,SAAQ,QAAQ;IACvC;QACE,KAAK,CAAC,QAAQ,CAAC,CAAC;IAClB,CAAC;CACF;AAJD,kCAIC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/x509.js b/node_modules/mongodb/lib/cmap/auth/x509.js new file mode 100644 index 000000000..0c9897d05 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/x509.js @@ -0,0 +1,39 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.X509 = void 0; +const auth_provider_1 = require("./auth_provider"); +const error_1 = require("../../error"); +const utils_1 = require("../../utils"); +class X509 extends auth_provider_1.AuthProvider { + prepare(handshakeDoc, authContext, callback) { + const { credentials } = authContext; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + Object.assign(handshakeDoc, { + speculativeAuthenticate: x509AuthenticateCommand(credentials) + }); + callback(undefined, handshakeDoc); + } + auth(authContext, callback) { + const connection = 
authContext.connection; + const credentials = authContext.credentials; + if (!credentials) { + return callback(new error_1.MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + const response = authContext.response; + if (response && response.speculativeAuthenticate) { + return callback(); + } + connection.command(utils_1.ns('$external.$cmd'), x509AuthenticateCommand(credentials), undefined, callback); + } +} +exports.X509 = X509; +function x509AuthenticateCommand(credentials) { + const command = { authenticate: 1, mechanism: 'MONGODB-X509' }; + if (credentials.username) { + command.user = credentials.username; + } + return command; +} +//# sourceMappingURL=x509.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/auth/x509.js.map b/node_modules/mongodb/lib/cmap/auth/x509.js.map new file mode 100644 index 000000000..4a80bd309 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/auth/x509.js.map @@ -0,0 +1 @@ +{"version":3,"file":"x509.js","sourceRoot":"","sources":["../../../src/cmap/auth/x509.ts"],"names":[],"mappings":";;;AAAA,mDAA4D;AAC5D,uCAA2D;AAE3D,uCAA2C;AAI3C,MAAa,IAAK,SAAQ,4BAAY;IACpC,OAAO,CAAC,YAA+B,EAAE,WAAwB,EAAE,QAAkB;QACnF,MAAM,EAAE,WAAW,EAAE,GAAG,WAAW,CAAC;QACpC,IAAI,CAAC,WAAW,EAAE;YAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;SAC5F;QACD,MAAM,CAAC,MAAM,CAAC,YAAY,EAAE;YAC1B,uBAAuB,EAAE,uBAAuB,CAAC,WAAW,CAAC;SAC9D,CAAC,CAAC;QAEH,QAAQ,CAAC,SAAS,EAAE,YAAY,CAAC,CAAC;IACpC,CAAC;IAED,IAAI,CAAC,WAAwB,EAAE,QAAkB;QAC/C,MAAM,UAAU,GAAG,WAAW,CAAC,UAAU,CAAC;QAC1C,MAAM,WAAW,GAAG,WAAW,CAAC,WAAW,CAAC;QAC5C,IAAI,CAAC,WAAW,EAAE;YAChB,OAAO,QAAQ,CAAC,IAAI,oCAA4B,CAAC,uCAAuC,CAAC,CAAC,CAAC;SAC5F;QACD,MAAM,QAAQ,GAAG,WAAW,CAAC,QAAQ,CAAC;QAEtC,IAAI,QAAQ,IAAI,QAAQ,CAAC,uBAAuB,EAAE;YAChD,OAAO,QAAQ,EAAE,CAAC;SACnB;QAED,UAAU,CAAC,OAAO,CAChB,UAAE,CAAC,gBAAgB,CAAC,EACpB,uBAAuB,CAAC,WAAW,CAAC,EACpC,SAAS,EACT,QAAQ,CACT,CAAC;IACJ,CAAC;CACF;AAhCD,oBAgCC;AAED,SAAS,uBAAuB,CAAC,WAA6B;IAC5D,MAAM,OAAO,GAAa,EAAE,YAAY,EAAE,CAAC,EAAE,SAAS,EAAE,cAAc,EAAE,CAAC;IACzE,IAAI,WAAW,CAAC,QAAQ,EAAE;QACxB,OAAO,CAAC,IAAI,GAAG,WAAW,CAAC,QAAQ,CAAC;KACrC;IAED,OAAO,OAAO,CAAC;AACjB,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/command_monitoring_events.js b/node_modules/mongodb/lib/cmap/command_monitoring_events.js new file mode 100644 index 000000000..a82b0f919 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/command_monitoring_events.js @@ -0,0 +1,272 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CommandFailedEvent = exports.CommandSucceededEvent = exports.CommandStartedEvent = void 0; +const commands_1 = require("./commands"); +const utils_1 = require("../utils"); +/** + * An event indicating the start of a given + * @public + * @category Event + */ +class CommandStartedEvent { + /** + * Create a started event + * + * @internal + * @param pool - the pool that originated the command + * @param command - the command + */ + constructor(connection, command) { + const cmd = extractCommand(command); + const commandName = extractCommandName(cmd); + const { address, connectionId, serviceId } = extractConnectionDetails(connection); + // TODO: remove in major revision, this is not spec behavior + if (SENSITIVE_COMMANDS.has(commandName)) { + this.commandObj = {}; + this.commandObj[commandName] = true; + } + this.address = address; + this.connectionId = connectionId; + this.serviceId = serviceId; + this.requestId = command.requestId; + this.databaseName = databaseName(command); + this.commandName = 
commandName; + this.command = maybeRedact(commandName, cmd, cmd); + } + /* @internal */ + get hasServiceId() { + return !!this.serviceId; + } +} +exports.CommandStartedEvent = CommandStartedEvent; +/** + * An event indicating the success of a given command + * @public + * @category Event + */ +class CommandSucceededEvent { + /** + * Create a succeeded event + * + * @internal + * @param pool - the pool that originated the command + * @param command - the command + * @param reply - the reply for this command from the server + * @param started - a high resolution tuple timestamp of when the command was first sent, to calculate duration + */ + constructor(connection, command, reply, started) { + const cmd = extractCommand(command); + const commandName = extractCommandName(cmd); + const { address, connectionId, serviceId } = extractConnectionDetails(connection); + this.address = address; + this.connectionId = connectionId; + this.serviceId = serviceId; + this.requestId = command.requestId; + this.commandName = commandName; + this.duration = utils_1.calculateDurationInMs(started); + this.reply = maybeRedact(commandName, cmd, extractReply(command, reply)); + } + /* @internal */ + get hasServiceId() { + return !!this.serviceId; + } +} +exports.CommandSucceededEvent = CommandSucceededEvent; +/** + * An event indicating the failure of a given command + * @public + * @category Event + */ +class CommandFailedEvent { + /** + * Create a failure event + * + * @internal + * @param pool - the pool that originated the command + * @param command - the command + * @param error - the generated error or a server error response + * @param started - a high resolution tuple timestamp of when the command was first sent, to calculate duration + */ + constructor(connection, command, error, started) { + const cmd = extractCommand(command); + const commandName = extractCommandName(cmd); + const { address, connectionId, serviceId } = extractConnectionDetails(connection); + this.address = address; + this.connectionId = connectionId; + this.serviceId = serviceId; + this.requestId = command.requestId; + this.commandName = commandName; + this.duration = utils_1.calculateDurationInMs(started); + this.failure = maybeRedact(commandName, cmd, error); + } + /* @internal */ + get hasServiceId() { + return !!this.serviceId; + } +} +exports.CommandFailedEvent = CommandFailedEvent; +/** Commands that we want to redact because of the sensitive nature of their contents */ +const SENSITIVE_COMMANDS = new Set([ + 'authenticate', + 'saslStart', + 'saslContinue', + 'getnonce', + 'createUser', + 'updateUser', + 'copydbgetnonce', + 'copydbsaslstart', + 'copydb' +]); +const HELLO_COMMANDS = new Set(['hello', 'ismaster', 'isMaster']); +// helper methods +const extractCommandName = (commandDoc) => Object.keys(commandDoc)[0]; +const namespace = (command) => command.ns; +const databaseName = (command) => command.ns.split('.')[0]; +const collectionName = (command) => command.ns.split('.')[1]; +const maybeRedact = (commandName, commandDoc, result) => SENSITIVE_COMMANDS.has(commandName) || + (HELLO_COMMANDS.has(commandName) && commandDoc.speculativeAuthenticate) + ? 
{} + : result; +const LEGACY_FIND_QUERY_MAP = { + $query: 'filter', + $orderby: 'sort', + $hint: 'hint', + $comment: 'comment', + $maxScan: 'maxScan', + $max: 'max', + $min: 'min', + $returnKey: 'returnKey', + $showDiskLoc: 'showRecordId', + $maxTimeMS: 'maxTimeMS', + $snapshot: 'snapshot' +}; +const LEGACY_FIND_OPTIONS_MAP = { + numberToSkip: 'skip', + numberToReturn: 'batchSize', + returnFieldSelector: 'projection' +}; +const OP_QUERY_KEYS = [ + 'tailable', + 'oplogReplay', + 'noCursorTimeout', + 'awaitData', + 'partial', + 'exhaust' +]; +/** Extract the actual command from the query, possibly up-converting if it's a legacy format */ +function extractCommand(command) { + var _a; + if (command instanceof commands_1.GetMore) { + return { + getMore: utils_1.deepCopy(command.cursorId), + collection: collectionName(command), + batchSize: command.numberToReturn + }; + } + if (command instanceof commands_1.KillCursor) { + return { + killCursors: collectionName(command), + cursors: utils_1.deepCopy(command.cursorIds) + }; + } + if (command instanceof commands_1.Msg) { + return utils_1.deepCopy(command.command); + } + if ((_a = command.query) === null || _a === void 0 ? void 0 : _a.$query) { + let result; + if (command.ns === 'admin.$cmd') { + // up-convert legacy command + result = Object.assign({}, command.query.$query); + } + else { + // up-convert legacy find command + result = { find: collectionName(command) }; + Object.keys(LEGACY_FIND_QUERY_MAP).forEach(key => { + if (command.query[key] != null) { + result[LEGACY_FIND_QUERY_MAP[key]] = utils_1.deepCopy(command.query[key]); + } + }); + } + Object.keys(LEGACY_FIND_OPTIONS_MAP).forEach(key => { + const legacyKey = key; + if (command[legacyKey] != null) { + result[LEGACY_FIND_OPTIONS_MAP[legacyKey]] = utils_1.deepCopy(command[legacyKey]); + } + }); + OP_QUERY_KEYS.forEach(key => { + const opKey = key; + if (command[opKey]) { + result[opKey] = command[opKey]; + } + }); + if (command.pre32Limit != null) { + result.limit = command.pre32Limit; + } + if (command.query.$explain) { + return { explain: result }; + } + return result; + } + const clonedQuery = {}; + const clonedCommand = {}; + if (command.query) { + for (const k in command.query) { + clonedQuery[k] = utils_1.deepCopy(command.query[k]); + } + clonedCommand.query = clonedQuery; + } + for (const k in command) { + if (k === 'query') + continue; + clonedCommand[k] = utils_1.deepCopy(command[k]); + } + return command.query ? clonedQuery : clonedCommand; +} +function extractReply(command, reply) { + if (command instanceof commands_1.KillCursor) { + return { + ok: 1, + cursorsUnknown: command.cursorIds + }; + } + if (!reply) { + return reply; + } + if (command instanceof commands_1.GetMore) { + return { + ok: 1, + cursor: { + id: utils_1.deepCopy(reply.cursorId), + ns: namespace(command), + nextBatch: utils_1.deepCopy(reply.documents) + } + }; + } + if (command instanceof commands_1.Msg) { + return utils_1.deepCopy(reply.result ? reply.result : reply); + } + // is this a legacy find command? + if (command.query && command.query.$query != null) { + return { + ok: 1, + cursor: { + id: utils_1.deepCopy(reply.cursorId), + ns: namespace(command), + firstBatch: utils_1.deepCopy(reply.documents) + } + }; + } + return utils_1.deepCopy(reply.result ? 
reply.result : reply); +} +function extractConnectionDetails(connection) { + let connectionId; + if ('id' in connection) { + connectionId = connection.id; + } + return { + address: connection.address, + serviceId: connection.serviceId, + connectionId + }; +} +//# sourceMappingURL=command_monitoring_events.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/command_monitoring_events.js.map b/node_modules/mongodb/lib/cmap/command_monitoring_events.js.map new file mode 100644 index 000000000..6b333f6f9 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/command_monitoring_events.js.map @@ -0,0 +1 @@ +{"version":3,"file":"command_monitoring_events.js","sourceRoot":"","sources":["../../src/cmap/command_monitoring_events.ts"],"names":[],"mappings":";;;AAAA,yCAAgF;AAChF,oCAA2D;AAI3D;;;;GAIG;AACH,MAAa,mBAAmB;IAU9B;;;;;;OAMG;IACH,YAAY,UAAsB,EAAE,OAAiC;QACnE,MAAM,GAAG,GAAG,cAAc,CAAC,OAAO,CAAC,CAAC;QACpC,MAAM,WAAW,GAAG,kBAAkB,CAAC,GAAG,CAAC,CAAC;QAC5C,MAAM,EAAE,OAAO,EAAE,YAAY,EAAE,SAAS,EAAE,GAAG,wBAAwB,CAAC,UAAU,CAAC,CAAC;QAElF,4DAA4D;QAC5D,IAAI,kBAAkB,CAAC,GAAG,CAAC,WAAW,CAAC,EAAE;YACvC,IAAI,CAAC,UAAU,GAAG,EAAE,CAAC;YACrB,IAAI,CAAC,UAAU,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC;SACrC;QAED,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;QACjC,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAC3B,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;QACnC,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC,OAAO,CAAC,CAAC;QAC1C,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;QAC/B,IAAI,CAAC,OAAO,GAAG,WAAW,CAAC,WAAW,EAAE,GAAG,EAAE,GAAG,CAAC,CAAC;IACpD,CAAC;IAED,eAAe;IACf,IAAI,YAAY;QACd,OAAO,CAAC,CAAC,IAAI,CAAC,SAAS,CAAC;IAC1B,CAAC;CACF;AAzCD,kDAyCC;AAED;;;;GAIG;AACH,MAAa,qBAAqB;IAShC;;;;;;;;OAQG;IACH,YACE,UAAsB,EACtB,OAAiC,EACjC,KAA2B,EAC3B,OAAe;QAEf,MAAM,GAAG,GAAG,cAAc,CAAC,OAAO,CAAC,CAAC;QACpC,MAAM,WAAW,GAAG,kBAAkB,CAAC,GAAG,CAAC,CAAC;QAC5C,MAAM,EAAE,OAAO,EAAE,YAAY,EAAE,SAAS,EAAE,GAAG,wBAAwB,CAAC,UAAU,CAAC,CAAC;QAElF,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;QACjC,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAC3B,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;QACnC,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;QAC/B,IAAI,CAAC,QAAQ,GAAG,6BAAqB,CAAC,OAAO,CAAC,CAAC;QAC/C,IAAI,CAAC,KAAK,GAAG,WAAW,CAAC,WAAW,EAAE,GAAG,EAAE,YAAY,CAAC,OAAO,EAAE,KAAK,CAAC,CAAC,CAAC;IAC3E,CAAC;IAED,eAAe;IACf,IAAI,YAAY;QACd,OAAO,CAAC,CAAC,IAAI,CAAC,SAAS,CAAC;IAC1B,CAAC;CACF;AAzCD,sDAyCC;AAED;;;;GAIG;AACH,MAAa,kBAAkB;IAS7B;;;;;;;;OAQG;IACH,YACE,UAAsB,EACtB,OAAiC,EACjC,KAAuB,EACvB,OAAe;QAEf,MAAM,GAAG,GAAG,cAAc,CAAC,OAAO,CAAC,CAAC;QACpC,MAAM,WAAW,GAAG,kBAAkB,CAAC,GAAG,CAAC,CAAC;QAC5C,MAAM,EAAE,OAAO,EAAE,YAAY,EAAE,SAAS,EAAE,GAAG,wBAAwB,CAAC,UAAU,CAAC,CAAC;QAElF,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;QACjC,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAE3B,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;QACnC,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;QAC/B,IAAI,CAAC,QAAQ,GAAG,6BAAqB,CAAC,OAAO,CAAC,CAAC;QAC/C,IAAI,CAAC,OAAO,GAAG,WAAW,CAAC,WAAW,EAAE,GAAG,EAAE,KAAK,CAAU,CAAC;IAC/D,CAAC;IAED,eAAe;IACf,IAAI,YAAY;QACd,OAAO,CAAC,CAAC,IAAI,CAAC,SAAS,CAAC;IAC1B,CAAC;CACF;AA1CD,gDA0CC;AAED,wFAAwF;AACxF,MAAM,kBAAkB,GAAG,IAAI,GAAG,CAAC;IACjC,cAAc;IACd,WAAW;IACX,cAAc;IACd,UAAU;IACV,YAAY;IACZ,YAAY;IACZ,gBAAgB;IAChB,iBAAiB;IACjB,QAAQ;CACT,CAAC,CAAC;AAEH,MAAM,cAAc,GAAG,IAAI,GAAG,CAAC,CAAC,OAAO,EAAE,UAAU,EAAE,UAAU,CAAC,CAAC,CAAC;AAElE,iBAAiB;AACjB,MAAM,kBAAkB,GAAG,CAAC,UAAoB,EAAE,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;AAChF,MAAM,SAAS,GAAG,CAAC,OAAiC,EAAE,EAAE,CAAC,OAAO,CAAC,EAAE,CAAC;AACpE,MAAM,YAAY,GAAG,CAAC,OAAiC,EAAE,EAAE,CAAC,OAAO,CAAC,EAAE,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC;AAC
rF,MAAM,cAAc,GAAG,CAAC,OAAiC,EAAE,EAAE,CAAC,OAAO,CAAC,EAAE,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC;AACvF,MAAM,WAAW,GAAG,CAAC,WAAmB,EAAE,UAAoB,EAAE,MAAwB,EAAE,EAAE,CAC1F,kBAAkB,CAAC,GAAG,CAAC,WAAW,CAAC;IACnC,CAAC,cAAc,CAAC,GAAG,CAAC,WAAW,CAAC,IAAI,UAAU,CAAC,uBAAuB,CAAC;IACrE,CAAC,CAAC,EAAE;IACJ,CAAC,CAAC,MAAM,CAAC;AAEb,MAAM,qBAAqB,GAA8B;IACvD,MAAM,EAAE,QAAQ;IAChB,QAAQ,EAAE,MAAM;IAChB,KAAK,EAAE,MAAM;IACb,QAAQ,EAAE,SAAS;IACnB,QAAQ,EAAE,SAAS;IACnB,IAAI,EAAE,KAAK;IACX,IAAI,EAAE,KAAK;IACX,UAAU,EAAE,WAAW;IACvB,YAAY,EAAE,cAAc;IAC5B,UAAU,EAAE,WAAW;IACvB,SAAS,EAAE,UAAU;CACtB,CAAC;AAEF,MAAM,uBAAuB,GAAG;IAC9B,YAAY,EAAE,MAAM;IACpB,cAAc,EAAE,WAAW;IAC3B,mBAAmB,EAAE,YAAY;CACzB,CAAC;AAEX,MAAM,aAAa,GAAG;IACpB,UAAU;IACV,aAAa;IACb,iBAAiB;IACjB,WAAW;IACX,SAAS;IACT,SAAS;CACD,CAAC;AAEX,gGAAgG;AAChG,SAAS,cAAc,CAAC,OAAiC;;IACvD,IAAI,OAAO,YAAY,kBAAO,EAAE;QAC9B,OAAO;YACL,OAAO,EAAE,gBAAQ,CAAC,OAAO,CAAC,QAAQ,CAAC;YACnC,UAAU,EAAE,cAAc,CAAC,OAAO,CAAC;YACnC,SAAS,EAAE,OAAO,CAAC,cAAc;SAClC,CAAC;KACH;IAED,IAAI,OAAO,YAAY,qBAAU,EAAE;QACjC,OAAO;YACL,WAAW,EAAE,cAAc,CAAC,OAAO,CAAC;YACpC,OAAO,EAAE,gBAAQ,CAAC,OAAO,CAAC,SAAS,CAAC;SACrC,CAAC;KACH;IAED,IAAI,OAAO,YAAY,cAAG,EAAE;QAC1B,OAAO,gBAAQ,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;KAClC;IAED,IAAI,MAAA,OAAO,CAAC,KAAK,0CAAE,MAAM,EAAE;QACzB,IAAI,MAAgB,CAAC;QACrB,IAAI,OAAO,CAAC,EAAE,KAAK,YAAY,EAAE;YAC/B,4BAA4B;YAC5B,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC;SAClD;aAAM;YACL,iCAAiC;YACjC,MAAM,GAAG,EAAE,IAAI,EAAE,cAAc,CAAC,OAAO,CAAC,EAAE,CAAC;YAC3C,MAAM,CAAC,IAAI,CAAC,qBAAqB,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;gBAC/C,IAAI,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,IAAI,IAAI,EAAE;oBAC9B,MAAM,CAAC,qBAAqB,CAAC,GAAG,CAAC,CAAC,GAAG,gBAAQ,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC;iBACnE;YACH,CAAC,CAAC,CAAC;SACJ;QAED,MAAM,CAAC,IAAI,CAAC,uBAAuB,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;YACjD,MAAM,SAAS,GAAG,GAA2C,CAAC;YAC9D,IAAI,OAAO,CAAC,SAAS,CAAC,IAAI,IAAI,EAAE;gBAC9B,MAAM,CAAC,uBAAuB,CAAC,SAAS,CAAC,CAAC,GAAG,gBAAQ,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC;aAC3E;QACH,CAAC,CAAC,CAAC;QAEH,aAAa,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;YAC1B,MAAM,KAAK,GAAG,GAAmC,CAAC;YAClD,IAAI,OAAO,CAAC,KAAK,CAAC,EAAE;gBAClB,MAAM,CAAC,KAAK,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC;aAChC;QACH,CAAC,CAAC,CAAC;QAEH,IAAI,OAAO,CAAC,UAAU,IAAI,IAAI,EAAE;YAC9B,MAAM,CAAC,KAAK,GAAG,OAAO,CAAC,UAAU,CAAC;SACnC;QAED,IAAI,OAAO,CAAC,KAAK,CAAC,QAAQ,EAAE;YAC1B,OAAO,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC;SAC5B;QACD,OAAO,MAAM,CAAC;KACf;IAED,MAAM,WAAW,GAA4B,EAAE,CAAC;IAChD,MAAM,aAAa,GAA4B,EAAE,CAAC;IAClD,IAAI,OAAO,CAAC,KAAK,EAAE;QACjB,KAAK,MAAM,CAAC,IAAI,OAAO,CAAC,KAAK,EAAE;YAC7B,WAAW,CAAC,CAAC,CAAC,GAAG,gBAAQ,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;SAC7C;QACD,aAAa,CAAC,KAAK,GAAG,WAAW,CAAC;KACnC;IAED,KAAK,MAAM,CAAC,IAAI,OAAO,EAAE;QACvB,IAAI,CAAC,KAAK,OAAO;YAAE,SAAS;QAC5B,aAAa,CAAC,CAAC,CAAC,GAAG,gBAAQ,CAAE,OAA8C,CAAC,CAAC,CAAC,CAAC,CAAC;KACjF;IACD,OAAO,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC,CAAC,aAAa,CAAC;AACrD,CAAC;AAED,SAAS,YAAY,CAAC,OAAiC,EAAE,KAAgB;IACvE,IAAI,OAAO,YAAY,qBAAU,EAAE;QACjC,OAAO;YACL,EAAE,EAAE,CAAC;YACL,cAAc,EAAE,OAAO,CAAC,SAAS;SAClC,CAAC;KACH;IAED,IAAI,CAAC,KAAK,EAAE;QACV,OAAO,KAAK,CAAC;KACd;IAED,IAAI,OAAO,YAAY,kBAAO,EAAE;QAC9B,OAAO;YACL,EAAE,EAAE,CAAC;YACL,MAAM,EAAE;gBACN,EAAE,EAAE,gBAAQ,CAAC,KAAK,CAAC,QAAQ,CAAC;gBAC5B,EAAE,EAAE,SAAS,CAAC,OAAO,CAAC;gBACtB,SAAS,EAAE,gBAAQ,CAAC,KAAK,CAAC,SAAS,CAAC;aACrC;SACF,CAAC;KACH;IAED,IAAI,OAAO,YAAY,cAAG,EAAE;QAC1B,OAAO,gBAAQ,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;KACtD;IAED,iCAAiC;IACjC,IAAI,OAAO,CAAC,KAAK,IAAI,OAAO,CAAC,KAAK,CAAC,MAAM,IAAI,IAAI,EAAE;QACj
D,OAAO;YACL,EAAE,EAAE,CAAC;YACL,MAAM,EAAE;gBACN,EAAE,EAAE,gBAAQ,CAAC,KAAK,CAAC,QAAQ,CAAC;gBAC5B,EAAE,EAAE,SAAS,CAAC,OAAO,CAAC;gBACtB,UAAU,EAAE,gBAAQ,CAAC,KAAK,CAAC,SAAS,CAAC;aACtC;SACF,CAAC;KACH;IAED,OAAO,gBAAQ,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;AACvD,CAAC;AAED,SAAS,wBAAwB,CAAC,UAAsB;IACtD,IAAI,YAAY,CAAC;IACjB,IAAI,IAAI,IAAI,UAAU,EAAE;QACtB,YAAY,GAAG,UAAU,CAAC,EAAE,CAAC;KAC9B;IACD,OAAO;QACL,OAAO,EAAE,UAAU,CAAC,OAAO;QAC3B,SAAS,EAAE,UAAU,CAAC,SAAS;QAC/B,YAAY;KACb,CAAC;AACJ,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/commands.js b/node_modules/mongodb/lib/cmap/commands.js new file mode 100644 index 000000000..869b91bc3 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/commands.js @@ -0,0 +1,623 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.BinMsg = exports.Msg = exports.Response = exports.KillCursor = exports.GetMore = exports.Query = void 0; +const read_preference_1 = require("../read_preference"); +const BSON = require("../bson"); +const utils_1 = require("../utils"); +const constants_1 = require("./wire_protocol/constants"); +const error_1 = require("../error"); +// Incrementing request id +let _requestId = 0; +// Query flags +const OPTS_TAILABLE_CURSOR = 2; +const OPTS_SLAVE = 4; +const OPTS_OPLOG_REPLAY = 8; +const OPTS_NO_CURSOR_TIMEOUT = 16; +const OPTS_AWAIT_DATA = 32; +const OPTS_EXHAUST = 64; +const OPTS_PARTIAL = 128; +// Response flags +const CURSOR_NOT_FOUND = 1; +const QUERY_FAILURE = 2; +const SHARD_CONFIG_STALE = 4; +const AWAIT_CAPABLE = 8; +/************************************************************** + * QUERY + **************************************************************/ +/** @internal */ +class Query { + constructor(ns, query, options) { + // Basic options needed to be passed in + // TODO(NODE-3483): Replace with MongoCommandError + if (ns == null) + throw new error_1.MongoRuntimeError('Namespace must be specified for query'); + // TODO(NODE-3483): Replace with MongoCommandError + if (query == null) + throw new error_1.MongoRuntimeError('A query document must be specified for query'); + // Validate that we are not passing 0x00 in the collection name + if (ns.indexOf('\x00') !== -1) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new error_1.MongoRuntimeError('Namespace cannot contain a null character'); + } + // Basic options + this.ns = ns; + this.query = query; + // Additional options + this.numberToSkip = options.numberToSkip || 0; + this.numberToReturn = options.numberToReturn || 0; + this.returnFieldSelector = options.returnFieldSelector || undefined; + this.requestId = Query.getRequestId(); + // special case for pre-3.2 find commands, delete ASAP + this.pre32Limit = options.pre32Limit; + // Serialization option + this.serializeFunctions = + typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false; + this.ignoreUndefined = + typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : false; + this.maxBsonSize = options.maxBsonSize || 1024 * 1024 * 16; + this.checkKeys = typeof options.checkKeys === 'boolean' ? options.checkKeys : false; + this.batchSize = this.numberToReturn; + // Flags + this.tailable = false; + this.slaveOk = typeof options.slaveOk === 'boolean' ? options.slaveOk : false; + this.oplogReplay = false; + this.noCursorTimeout = false; + this.awaitData = false; + this.exhaust = false; + this.partial = false; + } + /** Assign next request Id. 
*/ + incRequestId() { + this.requestId = _requestId++; + } + /** Peek next request Id. */ + nextRequestId() { + return _requestId + 1; + } + /** Increment then return next request Id. */ + static getRequestId() { + return ++_requestId; + } + // Uses a single allocated buffer for the process, avoiding multiple memory allocations + toBin() { + const buffers = []; + let projection = null; + // Set up the flags + let flags = 0; + if (this.tailable) { + flags |= OPTS_TAILABLE_CURSOR; + } + if (this.slaveOk) { + flags |= OPTS_SLAVE; + } + if (this.oplogReplay) { + flags |= OPTS_OPLOG_REPLAY; + } + if (this.noCursorTimeout) { + flags |= OPTS_NO_CURSOR_TIMEOUT; + } + if (this.awaitData) { + flags |= OPTS_AWAIT_DATA; + } + if (this.exhaust) { + flags |= OPTS_EXHAUST; + } + if (this.partial) { + flags |= OPTS_PARTIAL; + } + // If batchSize is different to this.numberToReturn + if (this.batchSize !== this.numberToReturn) + this.numberToReturn = this.batchSize; + // Allocate write protocol header buffer + const header = Buffer.alloc(4 * 4 + // Header + 4 + // Flags + Buffer.byteLength(this.ns) + + 1 + // namespace + 4 + // numberToSkip + 4 // numberToReturn + ); + // Add header to buffers + buffers.push(header); + // Serialize the query + const query = BSON.serialize(this.query, { + checkKeys: this.checkKeys, + serializeFunctions: this.serializeFunctions, + ignoreUndefined: this.ignoreUndefined + }); + // Add query document + buffers.push(query); + if (this.returnFieldSelector && Object.keys(this.returnFieldSelector).length > 0) { + // Serialize the projection document + projection = BSON.serialize(this.returnFieldSelector, { + checkKeys: this.checkKeys, + serializeFunctions: this.serializeFunctions, + ignoreUndefined: this.ignoreUndefined + }); + // Add projection document + buffers.push(projection); + } + // Total message size + const totalLength = header.length + query.length + (projection ? 
projection.length : 0); + // Set up the index + let index = 4; + // Write total document length + header[3] = (totalLength >> 24) & 0xff; + header[2] = (totalLength >> 16) & 0xff; + header[1] = (totalLength >> 8) & 0xff; + header[0] = totalLength & 0xff; + // Write header information requestId + header[index + 3] = (this.requestId >> 24) & 0xff; + header[index + 2] = (this.requestId >> 16) & 0xff; + header[index + 1] = (this.requestId >> 8) & 0xff; + header[index] = this.requestId & 0xff; + index = index + 4; + // Write header information responseTo + header[index + 3] = (0 >> 24) & 0xff; + header[index + 2] = (0 >> 16) & 0xff; + header[index + 1] = (0 >> 8) & 0xff; + header[index] = 0 & 0xff; + index = index + 4; + // Write header information OP_QUERY + header[index + 3] = (constants_1.OP_QUERY >> 24) & 0xff; + header[index + 2] = (constants_1.OP_QUERY >> 16) & 0xff; + header[index + 1] = (constants_1.OP_QUERY >> 8) & 0xff; + header[index] = constants_1.OP_QUERY & 0xff; + index = index + 4; + // Write header information flags + header[index + 3] = (flags >> 24) & 0xff; + header[index + 2] = (flags >> 16) & 0xff; + header[index + 1] = (flags >> 8) & 0xff; + header[index] = flags & 0xff; + index = index + 4; + // Write collection name + index = index + header.write(this.ns, index, 'utf8') + 1; + header[index - 1] = 0; + // Write header information flags numberToSkip + header[index + 3] = (this.numberToSkip >> 24) & 0xff; + header[index + 2] = (this.numberToSkip >> 16) & 0xff; + header[index + 1] = (this.numberToSkip >> 8) & 0xff; + header[index] = this.numberToSkip & 0xff; + index = index + 4; + // Write header information flags numberToReturn + header[index + 3] = (this.numberToReturn >> 24) & 0xff; + header[index + 2] = (this.numberToReturn >> 16) & 0xff; + header[index + 1] = (this.numberToReturn >> 8) & 0xff; + header[index] = this.numberToReturn & 0xff; + index = index + 4; + // Return the buffers + return buffers; + } +} +exports.Query = Query; +/************************************************************** + * GETMORE + **************************************************************/ +/** @internal */ +class GetMore { + constructor(ns, cursorId, opts = {}) { + this.numberToReturn = opts.numberToReturn || 0; + this.requestId = _requestId++; + this.ns = ns; + this.cursorId = cursorId; + } + // Uses a single allocated buffer for the process, avoiding multiple memory allocations + toBin() { + const length = 4 + Buffer.byteLength(this.ns) + 1 + 4 + 8 + 4 * 4; + // Create command buffer + let index = 0; + // Allocate buffer + const _buffer = Buffer.alloc(length); + // Write header information + // index = write32bit(index, _buffer, length); + _buffer[index + 3] = (length >> 24) & 0xff; + _buffer[index + 2] = (length >> 16) & 0xff; + _buffer[index + 1] = (length >> 8) & 0xff; + _buffer[index] = length & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, requestId); + _buffer[index + 3] = (this.requestId >> 24) & 0xff; + _buffer[index + 2] = (this.requestId >> 16) & 0xff; + _buffer[index + 1] = (this.requestId >> 8) & 0xff; + _buffer[index] = this.requestId & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, OP_GETMORE); + _buffer[index + 3] = (constants_1.OP_GETMORE >> 24) & 0xff; + _buffer[index + 2] = (constants_1.OP_GETMORE >> 16) & 0xff; + 
_buffer[index + 1] = (constants_1.OP_GETMORE >> 8) & 0xff; + _buffer[index] = constants_1.OP_GETMORE & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + // Write collection name + index = index + _buffer.write(this.ns, index, 'utf8') + 1; + _buffer[index - 1] = 0; + // Write batch size + // index = write32bit(index, _buffer, numberToReturn); + _buffer[index + 3] = (this.numberToReturn >> 24) & 0xff; + _buffer[index + 2] = (this.numberToReturn >> 16) & 0xff; + _buffer[index + 1] = (this.numberToReturn >> 8) & 0xff; + _buffer[index] = this.numberToReturn & 0xff; + index = index + 4; + // Write cursor id + // index = write32bit(index, _buffer, cursorId.getLowBits()); + _buffer[index + 3] = (this.cursorId.getLowBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorId.getLowBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorId.getLowBits() >> 8) & 0xff; + _buffer[index] = this.cursorId.getLowBits() & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, cursorId.getHighBits()); + _buffer[index + 3] = (this.cursorId.getHighBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorId.getHighBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorId.getHighBits() >> 8) & 0xff; + _buffer[index] = this.cursorId.getHighBits() & 0xff; + index = index + 4; + // Return buffer + return [_buffer]; + } +} +exports.GetMore = GetMore; +/************************************************************** + * KILLCURSOR + **************************************************************/ +/** @internal */ +class KillCursor { + constructor(ns, cursorIds) { + this.ns = ns; + this.requestId = _requestId++; + this.cursorIds = cursorIds; + } + // Uses a single allocated buffer for the process, avoiding multiple memory allocations + toBin() { + const length = 4 + 4 + 4 * 4 + this.cursorIds.length * 8; + // Create command buffer + let index = 0; + const _buffer = Buffer.alloc(length); + // Write header information + // index = write32bit(index, _buffer, length); + _buffer[index + 3] = (length >> 24) & 0xff; + _buffer[index + 2] = (length >> 16) & 0xff; + _buffer[index + 1] = (length >> 8) & 0xff; + _buffer[index] = length & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, requestId); + _buffer[index + 3] = (this.requestId >> 24) & 0xff; + _buffer[index + 2] = (this.requestId >> 16) & 0xff; + _buffer[index + 1] = (this.requestId >> 8) & 0xff; + _buffer[index] = this.requestId & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, OP_KILL_CURSORS); + _buffer[index + 3] = (constants_1.OP_KILL_CURSORS >> 24) & 0xff; + _buffer[index + 2] = (constants_1.OP_KILL_CURSORS >> 16) & 0xff; + _buffer[index + 1] = (constants_1.OP_KILL_CURSORS >> 8) & 0xff; + _buffer[index] = constants_1.OP_KILL_CURSORS & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + // Write batch size + // index = write32bit(index, _buffer, this.cursorIds.length); + _buffer[index + 3] = (this.cursorIds.length >> 24) & 
0xff; + _buffer[index + 2] = (this.cursorIds.length >> 16) & 0xff; + _buffer[index + 1] = (this.cursorIds.length >> 8) & 0xff; + _buffer[index] = this.cursorIds.length & 0xff; + index = index + 4; + // Write all the cursor ids into the array + for (let i = 0; i < this.cursorIds.length; i++) { + // Write cursor id + // index = write32bit(index, _buffer, cursorIds[i].getLowBits()); + _buffer[index + 3] = (this.cursorIds[i].getLowBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorIds[i].getLowBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorIds[i].getLowBits() >> 8) & 0xff; + _buffer[index] = this.cursorIds[i].getLowBits() & 0xff; + index = index + 4; + // index = write32bit(index, _buffer, cursorIds[i].getHighBits()); + _buffer[index + 3] = (this.cursorIds[i].getHighBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorIds[i].getHighBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorIds[i].getHighBits() >> 8) & 0xff; + _buffer[index] = this.cursorIds[i].getHighBits() & 0xff; + index = index + 4; + } + // Return buffer + return [_buffer]; + } +} +exports.KillCursor = KillCursor; +/** @internal */ +class Response { + constructor(message, msgHeader, msgBody, opts) { + this.parsed = false; + this.raw = message; + this.data = msgBody; + this.opts = opts !== null && opts !== void 0 ? opts : { + promoteLongs: true, + promoteValues: true, + promoteBuffers: false, + bsonRegExp: false + }; + // Read the message header + this.length = msgHeader.length; + this.requestId = msgHeader.requestId; + this.responseTo = msgHeader.responseTo; + this.opCode = msgHeader.opCode; + this.fromCompressed = msgHeader.fromCompressed; + // Read the message body + this.responseFlags = msgBody.readInt32LE(0); + this.cursorId = new BSON.Long(msgBody.readInt32LE(4), msgBody.readInt32LE(8)); + this.startingFrom = msgBody.readInt32LE(12); + this.numberReturned = msgBody.readInt32LE(16); + // Preallocate document array + this.documents = new Array(this.numberReturned); + // Flag values + this.cursorNotFound = (this.responseFlags & CURSOR_NOT_FOUND) !== 0; + this.queryFailure = (this.responseFlags & QUERY_FAILURE) !== 0; + this.shardConfigStale = (this.responseFlags & SHARD_CONFIG_STALE) !== 0; + this.awaitCapable = (this.responseFlags & AWAIT_CAPABLE) !== 0; + this.promoteLongs = typeof this.opts.promoteLongs === 'boolean' ? this.opts.promoteLongs : true; + this.promoteValues = + typeof this.opts.promoteValues === 'boolean' ? this.opts.promoteValues : true; + this.promoteBuffers = + typeof this.opts.promoteBuffers === 'boolean' ? this.opts.promoteBuffers : false; + this.bsonRegExp = typeof this.opts.bsonRegExp === 'boolean' ? this.opts.bsonRegExp : false; + } + isParsed() { + return this.parsed; + } + parse(options) { + var _a, _b, _c, _d; + // Don't parse again if not needed + if (this.parsed) + return; + options = options !== null && options !== void 0 ? options : {}; + // Allow the return of raw documents instead of parsing + const raw = options.raw || false; + const documentsReturnedIn = options.documentsReturnedIn || null; + const promoteLongs = (_a = options.promoteLongs) !== null && _a !== void 0 ? _a : this.opts.promoteLongs; + const promoteValues = (_b = options.promoteValues) !== null && _b !== void 0 ? _b : this.opts.promoteValues; + const promoteBuffers = (_c = options.promoteBuffers) !== null && _c !== void 0 ? _c : this.opts.promoteBuffers; + const bsonRegExp = (_d = options.bsonRegExp) !== null && _d !== void 0 ? 
_d : this.opts.bsonRegExp; + let bsonSize; + // Set up the options + const _options = { + promoteLongs, + promoteValues, + promoteBuffers, + bsonRegExp + }; + // Position within OP_REPLY at which documents start + // (See https://docs.mongodb.com/manual/reference/mongodb-wire-protocol/#wire-op-reply) + this.index = 20; + // Parse Body + for (let i = 0; i < this.numberReturned; i++) { + bsonSize = + this.data[this.index] | + (this.data[this.index + 1] << 8) | + (this.data[this.index + 2] << 16) | + (this.data[this.index + 3] << 24); + // If we have raw results specified slice the return document + if (raw) { + this.documents[i] = this.data.slice(this.index, this.index + bsonSize); + } + else { + this.documents[i] = BSON.deserialize(this.data.slice(this.index, this.index + bsonSize), _options); + } + // Adjust the index + this.index = this.index + bsonSize; + } + if (this.documents.length === 1 && documentsReturnedIn != null && raw) { + const fieldsAsRaw = {}; + fieldsAsRaw[documentsReturnedIn] = true; + _options.fieldsAsRaw = fieldsAsRaw; + const doc = BSON.deserialize(this.documents[0], _options); + this.documents = [doc]; + } + // Set parsed + this.parsed = true; + } +} +exports.Response = Response; +// Implementation of OP_MSG spec: +// https://github.com/mongodb/specifications/blob/master/source/message/OP_MSG.rst +// +// struct Section { +// uint8 payloadType; +// union payload { +// document document; // payloadType == 0 +// struct sequence { // payloadType == 1 +// int32 size; +// cstring identifier; +// document* documents; +// }; +// }; +// }; +// struct OP_MSG { +// struct MsgHeader { +// int32 messageLength; +// int32 requestID; +// int32 responseTo; +// int32 opCode = 2013; +// }; +// uint32 flagBits; +// Section+ sections; +// [uint32 checksum;] +// }; +// Msg Flags +const OPTS_CHECKSUM_PRESENT = 1; +const OPTS_MORE_TO_COME = 2; +const OPTS_EXHAUST_ALLOWED = 1 << 16; +/** @internal */ +class Msg { + constructor(ns, command, options) { + // Basic options needed to be passed in + if (command == null) + throw new error_1.MongoInvalidArgumentError('Query document must be specified for query'); + // Basic options + this.ns = ns; + this.command = command; + this.command.$db = utils_1.databaseNamespace(ns); + if (options.readPreference && options.readPreference.mode !== read_preference_1.ReadPreference.PRIMARY) { + this.command.$readPreference = options.readPreference.toJSON(); + } + // Ensure empty options + this.options = options !== null && options !== void 0 ? options : {}; + // Additional options + this.requestId = options.requestId ? options.requestId : Msg.getRequestId(); + // Serialization option + this.serializeFunctions = + typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false; + this.ignoreUndefined = + typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : false; + this.checkKeys = typeof options.checkKeys === 'boolean' ? options.checkKeys : false; + this.maxBsonSize = options.maxBsonSize || 1024 * 1024 * 16; + // flags + this.checksumPresent = false; + this.moreToCome = options.moreToCome || false; + this.exhaustAllowed = + typeof options.exhaustAllowed === 'boolean' ? 
options.exhaustAllowed : false; + } + toBin() { + const buffers = []; + let flags = 0; + if (this.checksumPresent) { + flags |= OPTS_CHECKSUM_PRESENT; + } + if (this.moreToCome) { + flags |= OPTS_MORE_TO_COME; + } + if (this.exhaustAllowed) { + flags |= OPTS_EXHAUST_ALLOWED; + } + const header = Buffer.alloc(4 * 4 + // Header + 4 // Flags + ); + buffers.push(header); + let totalLength = header.length; + const command = this.command; + totalLength += this.makeDocumentSegment(buffers, command); + header.writeInt32LE(totalLength, 0); // messageLength + header.writeInt32LE(this.requestId, 4); // requestID + header.writeInt32LE(0, 8); // responseTo + header.writeInt32LE(constants_1.OP_MSG, 12); // opCode + header.writeUInt32LE(flags, 16); // flags + return buffers; + } + makeDocumentSegment(buffers, document) { + const payloadTypeBuffer = Buffer.alloc(1); + payloadTypeBuffer[0] = 0; + const documentBuffer = this.serializeBson(document); + buffers.push(payloadTypeBuffer); + buffers.push(documentBuffer); + return payloadTypeBuffer.length + documentBuffer.length; + } + serializeBson(document) { + return BSON.serialize(document, { + checkKeys: this.checkKeys, + serializeFunctions: this.serializeFunctions, + ignoreUndefined: this.ignoreUndefined + }); + } + static getRequestId() { + _requestId = (_requestId + 1) & 0x7fffffff; + return _requestId; + } +} +exports.Msg = Msg; +/** @internal */ +class BinMsg { + constructor(message, msgHeader, msgBody, opts) { + this.parsed = false; + this.raw = message; + this.data = msgBody; + this.opts = opts !== null && opts !== void 0 ? opts : { + promoteLongs: true, + promoteValues: true, + promoteBuffers: false, + bsonRegExp: false + }; + // Read the message header + this.length = msgHeader.length; + this.requestId = msgHeader.requestId; + this.responseTo = msgHeader.responseTo; + this.opCode = msgHeader.opCode; + this.fromCompressed = msgHeader.fromCompressed; + // Read response flags + this.responseFlags = msgBody.readInt32LE(0); + this.checksumPresent = (this.responseFlags & OPTS_CHECKSUM_PRESENT) !== 0; + this.moreToCome = (this.responseFlags & OPTS_MORE_TO_COME) !== 0; + this.exhaustAllowed = (this.responseFlags & OPTS_EXHAUST_ALLOWED) !== 0; + this.promoteLongs = typeof this.opts.promoteLongs === 'boolean' ? this.opts.promoteLongs : true; + this.promoteValues = + typeof this.opts.promoteValues === 'boolean' ? this.opts.promoteValues : true; + this.promoteBuffers = + typeof this.opts.promoteBuffers === 'boolean' ? this.opts.promoteBuffers : false; + this.bsonRegExp = typeof this.opts.bsonRegExp === 'boolean' ? this.opts.bsonRegExp : false; + this.documents = []; + } + isParsed() { + return this.parsed; + } + parse(options) { + var _a, _b, _c, _d; + // Don't parse again if not needed + if (this.parsed) + return; + options = options !== null && options !== void 0 ? options : {}; + this.index = 4; + // Allow the return of raw documents instead of parsing + const raw = options.raw || false; + const documentsReturnedIn = options.documentsReturnedIn || null; + const promoteLongs = (_a = options.promoteLongs) !== null && _a !== void 0 ? _a : this.opts.promoteLongs; + const promoteValues = (_b = options.promoteValues) !== null && _b !== void 0 ? _b : this.opts.promoteValues; + const promoteBuffers = (_c = options.promoteBuffers) !== null && _c !== void 0 ? _c : this.opts.promoteBuffers; + const bsonRegExp = (_d = options.bsonRegExp) !== null && _d !== void 0 ? 
_d : this.opts.bsonRegExp; + // Set up the options + const _options = { + promoteLongs, + promoteValues, + promoteBuffers, + bsonRegExp + }; + while (this.index < this.data.length) { + const payloadType = this.data.readUInt8(this.index++); + if (payloadType === 0) { + const bsonSize = this.data.readUInt32LE(this.index); + const bin = this.data.slice(this.index, this.index + bsonSize); + this.documents.push(raw ? bin : BSON.deserialize(bin, _options)); + this.index += bsonSize; + } + else if (payloadType === 1) { + // It was decided that no driver makes use of payload type 1 + // TODO(NODE-3483): Replace with MongoDeprecationError + throw new error_1.MongoRuntimeError('OP_MSG Payload Type 1 detected unsupported protocol'); + } + } + if (this.documents.length === 1 && documentsReturnedIn != null && raw) { + const fieldsAsRaw = {}; + fieldsAsRaw[documentsReturnedIn] = true; + _options.fieldsAsRaw = fieldsAsRaw; + const doc = BSON.deserialize(this.documents[0], _options); + this.documents = [doc]; + } + this.parsed = true; + } +} +exports.BinMsg = BinMsg; +//# sourceMappingURL=commands.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/commands.js.map b/node_modules/mongodb/lib/cmap/commands.js.map new file mode 100644 index 000000000..0dc3cd13c --- /dev/null +++ b/node_modules/mongodb/lib/cmap/commands.js.map @@ -0,0 +1 @@ +{"version":3,"file":"commands.js","sourceRoot":"","sources":["../../src/cmap/commands.ts"],"names":[],"mappings":";;;AAAA,wDAAoD;AACpD,gCAAgC;AAChC,oCAA6C;AAC7C,yDAA0F;AAI1F,oCAAwE;AAExE,0BAA0B;AAC1B,IAAI,UAAU,GAAG,CAAC,CAAC;AAEnB,cAAc;AACd,MAAM,oBAAoB,GAAG,CAAC,CAAC;AAC/B,MAAM,UAAU,GAAG,CAAC,CAAC;AACrB,MAAM,iBAAiB,GAAG,CAAC,CAAC;AAC5B,MAAM,sBAAsB,GAAG,EAAE,CAAC;AAClC,MAAM,eAAe,GAAG,EAAE,CAAC;AAC3B,MAAM,YAAY,GAAG,EAAE,CAAC;AACxB,MAAM,YAAY,GAAG,GAAG,CAAC;AAEzB,iBAAiB;AACjB,MAAM,gBAAgB,GAAG,CAAC,CAAC;AAC3B,MAAM,aAAa,GAAG,CAAC,CAAC;AACxB,MAAM,kBAAkB,GAAG,CAAC,CAAC;AAC7B,MAAM,aAAa,GAAG,CAAC,CAAC;AA0BxB;;gEAEgE;AAChE,gBAAgB;AAChB,MAAa,KAAK;IAsBhB,YAAY,EAAU,EAAE,KAAe,EAAE,OAAuB;QAC9D,uCAAuC;QACvC,kDAAkD;QAClD,IAAI,EAAE,IAAI,IAAI;YAAE,MAAM,IAAI,yBAAiB,CAAC,uCAAuC,CAAC,CAAC;QACrF,kDAAkD;QAClD,IAAI,KAAK,IAAI,IAAI;YAAE,MAAM,IAAI,yBAAiB,CAAC,8CAA8C,CAAC,CAAC;QAE/F,+DAA+D;QAC/D,IAAI,EAAE,CAAC,OAAO,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,EAAE;YAC7B,oDAAoD;YACpD,MAAM,IAAI,yBAAiB,CAAC,2CAA2C,CAAC,CAAC;SAC1E;QAED,gBAAgB;QAChB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;QAEnB,qBAAqB;QACrB,IAAI,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,IAAI,CAAC,CAAC;QAC9C,IAAI,CAAC,cAAc,GAAG,OAAO,CAAC,cAAc,IAAI,CAAC,CAAC;QAClD,IAAI,CAAC,mBAAmB,GAAG,OAAO,CAAC,mBAAmB,IAAI,SAAS,CAAC;QACpE,IAAI,CAAC,SAAS,GAAG,KAAK,CAAC,YAAY,EAAE,CAAC;QAEtC,sDAAsD;QACtD,IAAI,CAAC,UAAU,GAAG,OAAO,CAAC,UAAU,CAAC;QAErC,uBAAuB;QACvB,IAAI,CAAC,kBAAkB;YACrB,OAAO,OAAO,CAAC,kBAAkB,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,kBAAkB,CAAC,CAAC,CAAC,KAAK,CAAC;QACvF,IAAI,CAAC,eAAe;YAClB,OAAO,OAAO,CAAC,eAAe,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,eAAe,CAAC,CAAC,CAAC,KAAK,CAAC;QACjF,IAAI,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,IAAI,IAAI,GAAG,IAAI,GAAG,EAAE,CAAC;QAC3D,IAAI,CAAC,SAAS,GAAG,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC;QACpF,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC,cAAc,CAAC;QAErC,QAAQ;QACR,IAAI,CAAC,QAAQ,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,OAAO,GAAG,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC,KAAK,CAAC;QAC9E,IAAI,CAAC,WAAW,GAAG,KAAK,CAAC;QACzB,IAAI,CAAC,eAAe,GAAG,KAAK,CAAC;QAC7B,IAAI,CAAC,SAAS,GAAG,KAAK,CAAC;QACvB,IAAI,CAAC,OAAO,GAAG,KAAK,CAAC;QACrB,IAAI,CAAC,OAAO,GAAG,KAAK,C
AAC;IACvB,CAAC;IAED,8BAA8B;IAC9B,YAAY;QACV,IAAI,CAAC,SAAS,GAAG,UAAU,EAAE,CAAC;IAChC,CAAC;IAED,4BAA4B;IAC5B,aAAa;QACX,OAAO,UAAU,GAAG,CAAC,CAAC;IACxB,CAAC;IAED,6CAA6C;IAC7C,MAAM,CAAC,YAAY;QACjB,OAAO,EAAE,UAAU,CAAC;IACtB,CAAC;IAED,uFAAuF;IACvF,KAAK;QACH,MAAM,OAAO,GAAG,EAAE,CAAC;QACnB,IAAI,UAAU,GAAG,IAAI,CAAC;QAEtB,mBAAmB;QACnB,IAAI,KAAK,GAAG,CAAC,CAAC;QACd,IAAI,IAAI,CAAC,QAAQ,EAAE;YACjB,KAAK,IAAI,oBAAoB,CAAC;SAC/B;QAED,IAAI,IAAI,CAAC,OAAO,EAAE;YAChB,KAAK,IAAI,UAAU,CAAC;SACrB;QAED,IAAI,IAAI,CAAC,WAAW,EAAE;YACpB,KAAK,IAAI,iBAAiB,CAAC;SAC5B;QAED,IAAI,IAAI,CAAC,eAAe,EAAE;YACxB,KAAK,IAAI,sBAAsB,CAAC;SACjC;QAED,IAAI,IAAI,CAAC,SAAS,EAAE;YAClB,KAAK,IAAI,eAAe,CAAC;SAC1B;QAED,IAAI,IAAI,CAAC,OAAO,EAAE;YAChB,KAAK,IAAI,YAAY,CAAC;SACvB;QAED,IAAI,IAAI,CAAC,OAAO,EAAE;YAChB,KAAK,IAAI,YAAY,CAAC;SACvB;QAED,mDAAmD;QACnD,IAAI,IAAI,CAAC,SAAS,KAAK,IAAI,CAAC,cAAc;YAAE,IAAI,CAAC,cAAc,GAAG,IAAI,CAAC,SAAS,CAAC;QAEjF,wCAAwC;QACxC,MAAM,MAAM,GAAG,MAAM,CAAC,KAAK,CACzB,CAAC,GAAG,CAAC,GAAG,SAAS;YACf,CAAC,GAAG,QAAQ;YACZ,MAAM,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC;YAC1B,CAAC,GAAG,YAAY;YAChB,CAAC,GAAG,eAAe;YACnB,CAAC,CAAC,iBAAiB;SACtB,CAAC;QAEF,wBAAwB;QACxB,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;QAErB,sBAAsB;QACtB,MAAM,KAAK,GAAG,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,KAAK,EAAE;YACvC,SAAS,EAAE,IAAI,CAAC,SAAS;YACzB,kBAAkB,EAAE,IAAI,CAAC,kBAAkB;YAC3C,eAAe,EAAE,IAAI,CAAC,eAAe;SACtC,CAAC,CAAC;QAEH,qBAAqB;QACrB,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;QAEpB,IAAI,IAAI,CAAC,mBAAmB,IAAI,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,mBAAmB,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;YAChF,oCAAoC;YACpC,UAAU,GAAG,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,mBAAmB,EAAE;gBACpD,SAAS,EAAE,IAAI,CAAC,SAAS;gBACzB,kBAAkB,EAAE,IAAI,CAAC,kBAAkB;gBAC3C,eAAe,EAAE,IAAI,CAAC,eAAe;aACtC,CAAC,CAAC;YACH,0BAA0B;YAC1B,OAAO,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC;SAC1B;QAED,qBAAqB;QACrB,MAAM,WAAW,GAAG,MAAM,CAAC,MAAM,GAAG,KAAK,CAAC,MAAM,GAAG,CAAC,UAAU,CAAC,CAAC,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;QAExF,mBAAmB;QACnB,IAAI,KAAK,GAAG,CAAC,CAAC;QAEd,8BAA8B;QAC9B,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,WAAW,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACvC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,WAAW,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACvC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,WAAW,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACtC,MAAM,CAAC,CAAC,CAAC,GAAG,WAAW,GAAG,IAAI,CAAC;QAE/B,qCAAqC;QACrC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAClD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAClD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACjD,MAAM,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;QACtC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,sCAAsC;QACtC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACrC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACrC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACpC,MAAM,CAAC,KAAK,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;QACzB,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,oCAAoC;QACpC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,oBAAQ,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC5C,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,oBAAQ,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC5C,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,oBAAQ,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAC3C,MAAM,CAAC,KAAK,CAAC,GAAG,oBAAQ,GAAG,IAAI,CAAC;QAChC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,iCAAiC;QACjC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,KAAK,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACzC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,KAAK,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACzC,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACxC,MAA
M,CAAC,KAAK,CAAC,GAAG,KAAK,GAAG,IAAI,CAAC;QAC7B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,wBAAwB;QACxB,KAAK,GAAG,KAAK,GAAG,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE,EAAE,KAAK,EAAE,MAAM,CAAC,GAAG,CAAC,CAAC;QACzD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC;QAEtB,8CAA8C;QAC9C,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,YAAY,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACrD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,YAAY,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACrD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,YAAY,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACpD,MAAM,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,YAAY,GAAG,IAAI,CAAC;QACzC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,gDAAgD;QAChD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACvD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACvD,MAAM,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACtD,MAAM,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,cAAc,GAAG,IAAI,CAAC;QAC3C,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,qBAAqB;QACrB,OAAO,OAAO,CAAC;IACjB,CAAC;CACF;AAvND,sBAuNC;AAOD;;gEAEgE;AAChE,gBAAgB;AAChB,MAAa,OAAO;IAMlB,YAAY,EAAU,EAAE,QAAc,EAAE,OAAyB,EAAE;QACjE,IAAI,CAAC,cAAc,GAAG,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;QAC/C,IAAI,CAAC,SAAS,GAAG,UAAU,EAAE,CAAC;QAC9B,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;IAC3B,CAAC;IAED,uFAAuF;IACvF,KAAK;QACH,MAAM,MAAM,GAAG,CAAC,GAAG,MAAM,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;QAClE,wBAAwB;QACxB,IAAI,KAAK,GAAG,CAAC,CAAC;QACd,kBAAkB;QAClB,MAAM,OAAO,GAAG,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC;QAErC,2BAA2B;QAC3B,8CAA8C;QAC9C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,MAAM,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC3C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,MAAM,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC3C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,MAAM,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAC1C,OAAO,CAAC,KAAK,CAAC,GAAG,MAAM,GAAG,IAAI,CAAC;QAC/B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,iDAAiD;QACjD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACnD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACnD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAClD,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;QACvC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,yCAAyC;QACzC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACrC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;QAC1B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,kDAAkD;QAClD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,sBAAU,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC/C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,sBAAU,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC/C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,sBAAU,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAC9C,OAAO,CAAC,KAAK,CAAC,GAAG,sBAAU,GAAG,IAAI,CAAC;QACnC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,yCAAyC;QACzC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACrC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;QAC1B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,wBAAwB;QACxB,KAAK,GAAG,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE,EAAE,KAAK,EAAE,MAAM,CAAC,GAAG,CAAC,CAAC;QAC1D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC;QAEvB,mBAAmB;QACnB,sDAA
sD;QACtD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACxD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACxD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACvD,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,cAAc,GAAG,IAAI,CAAC;QAC5C,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,kBAAkB;QAClB,6DAA6D;QAC7D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,CAAC,UAAU,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC/D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,CAAC,UAAU,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC/D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,CAAC,UAAU,EAAE,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAC9D,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,UAAU,EAAE,GAAG,IAAI,CAAC;QACnD,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,8DAA8D;QAC9D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,CAAC,WAAW,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAChE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,CAAC,WAAW,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAChE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,QAAQ,CAAC,WAAW,EAAE,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAC/D,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,WAAW,EAAE,GAAG,IAAI,CAAC;QACpD,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,gBAAgB;QAChB,OAAO,CAAC,OAAO,CAAC,CAAC;IACnB,CAAC;CACF;AAvFD,0BAuFC;AAED;;gEAEgE;AAChE,gBAAgB;AAChB,MAAa,UAAU;IAKrB,YAAY,EAAU,EAAE,SAAiB;QACvC,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,SAAS,GAAG,UAAU,EAAE,CAAC;QAC9B,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;IAC7B,CAAC;IAED,uFAAuF;IACvF,KAAK;QACH,MAAM,MAAM,GAAG,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,CAAC,CAAC;QAEzD,wBAAwB;QACxB,IAAI,KAAK,GAAG,CAAC,CAAC;QACd,MAAM,OAAO,GAAG,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC;QAErC,2BAA2B;QAC3B,8CAA8C;QAC9C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,MAAM,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC3C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,MAAM,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC3C,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,MAAM,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAC1C,OAAO,CAAC,KAAK,CAAC,GAAG,MAAM,GAAG,IAAI,CAAC;QAC/B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,iDAAiD;QACjD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACnD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACnD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QAClD,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;QACvC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,yCAAyC;QACzC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACrC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;QAC1B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,uDAAuD;QACvD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,2BAAe,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACpD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,2BAAe,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACpD,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,2BAAe,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACnD,OAAO,CAAC,KAAK,CAAC,GAAG,2BAAe,GAAG,IAAI,CAAC;QACxC,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,yCAAyC;QACzC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QACtC,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACrC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC;QAC1B,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,mBAAmB;QACnB,6DAA6
D;QAC7D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC1D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;QAC1D,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,MAAM,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;QACzD,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,GAAG,IAAI,CAAC;QAC9C,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;QAElB,0CAA0C;QAC1C,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YAC9C,kBAAkB;YAClB,iEAAiE;YACjE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,UAAU,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;YACnE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,UAAU,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;YACnE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,UAAU,EAAE,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;YAClE,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,UAAU,EAAE,GAAG,IAAI,CAAC;YACvD,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;YAElB,kEAAkE;YAClE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;YACpE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,IAAI,EAAE,CAAC,GAAG,IAAI,CAAC;YACpE,OAAO,CAAC,KAAK,GAAG,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC;YACnE,OAAO,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,GAAG,IAAI,CAAC;YACxD,KAAK,GAAG,KAAK,GAAG,CAAC,CAAC;SACnB;QAED,gBAAgB;QAChB,OAAO,CAAC,OAAO,CAAC,CAAC;IACnB,CAAC;CACF;AApFD,gCAoFC;AAeD,gBAAgB;AAChB,MAAa,QAAQ;IAyBnB,YACE,OAAe,EACf,SAAwB,EACxB,OAAe,EACf,IAAwB;QAExB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,GAAG,GAAG,OAAO,CAAC;QACnB,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;QACpB,IAAI,CAAC,IAAI,GAAG,IAAI,aAAJ,IAAI,cAAJ,IAAI,GAAI;YAClB,YAAY,EAAE,IAAI;YAClB,aAAa,EAAE,IAAI;YACnB,cAAc,EAAE,KAAK;YACrB,UAAU,EAAE,KAAK;SAClB,CAAC;QAEF,0BAA0B;QAC1B,IAAI,CAAC,MAAM,GAAG,SAAS,CAAC,MAAM,CAAC;QAC/B,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC,SAAS,CAAC;QACrC,IAAI,CAAC,UAAU,GAAG,SAAS,CAAC,UAAU,CAAC;QACvC,IAAI,CAAC,MAAM,GAAG,SAAS,CAAC,MAAM,CAAC;QAC/B,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC,cAAc,CAAC;QAE/C,wBAAwB;QACxB,IAAI,CAAC,aAAa,GAAG,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC;QAC5C,IAAI,CAAC,QAAQ,GAAG,IAAI,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,EAAE,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC;QAC9E,IAAI,CAAC,YAAY,GAAG,OAAO,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;QAC5C,IAAI,CAAC,cAAc,GAAG,OAAO,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;QAE9C,6BAA6B;QAC7B,IAAI,CAAC,SAAS,GAAG,IAAI,KAAK,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;QAEhD,cAAc;QACd,IAAI,CAAC,cAAc,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,gBAAgB,CAAC,KAAK,CAAC,CAAC;QACpE,IAAI,CAAC,YAAY,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC,KAAK,CAAC,CAAC;QAC/D,IAAI,CAAC,gBAAgB,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,kBAAkB,CAAC,KAAK,CAAC,CAAC;QACxE,IAAI,CAAC,YAAY,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC,KAAK,CAAC,CAAC;QAC/D,IAAI,CAAC,YAAY,GAAG,OAAO,IAAI,CAAC,IAAI,CAAC,YAAY,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC,IAAI,CAAC;QAChG,IAAI,CAAC,aAAa;YAChB,OAAO,IAAI,CAAC,IAAI,CAAC,aAAa,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,IAAI,CAAC;QAChF,IAAI,CAAC,cAAc;YACjB,OAAO,IAAI,CAAC,IAAI,CAAC,cAAc,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC,CAAC,KAAK,CAAC;QACnF,IAAI,CAAC,UAAU,GAAG,OAAO,IAAI,CAAC,IAAI,CAAC,UAAU,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK,CAAC;IAC7F,CAAC;IAED,QAAQ;QACN,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;IAED,KAAK,CAAC,OAA0B;;QAC9B,kCAAkC;QAClC,IAAI,IAAI,CAA
C,MAAM;YAAE,OAAO;QACxB,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,uDAAuD;QACvD,MAAM,GAAG,GAAG,OAAO,CAAC,GAAG,IAAI,KAAK,CAAC;QACjC,MAAM,mBAAmB,GAAG,OAAO,CAAC,mBAAmB,IAAI,IAAI,CAAC;QAChE,MAAM,YAAY,GAAG,MAAA,OAAO,CAAC,YAAY,mCAAI,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC;QACpE,MAAM,aAAa,GAAG,MAAA,OAAO,CAAC,aAAa,mCAAI,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC;QACvE,MAAM,cAAc,GAAG,MAAA,OAAO,CAAC,cAAc,mCAAI,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC;QAC1E,MAAM,UAAU,GAAG,MAAA,OAAO,CAAC,UAAU,mCAAI,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC;QAC9D,IAAI,QAAQ,CAAC;QAEb,qBAAqB;QACrB,MAAM,QAAQ,GAAyB;YACrC,YAAY;YACZ,aAAa;YACb,cAAc;YACd,UAAU;SACX,CAAC;QAEF,oDAAoD;QACpD,uFAAuF;QACvF,IAAI,CAAC,KAAK,GAAG,EAAE,CAAC;QAEhB,aAAa;QACb,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,cAAc,EAAE,CAAC,EAAE,EAAE;YAC5C,QAAQ;gBACN,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC;oBACrB,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,GAAG,CAAC,CAAC,IAAI,CAAC,CAAC;oBAChC,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC;oBACjC,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,GAAG,CAAC,CAAC,IAAI,EAAE,CAAC,CAAC;YAEpC,6DAA6D;YAC7D,IAAI,GAAG,EAAE;gBACP,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,KAAK,EAAE,IAAI,CAAC,KAAK,GAAG,QAAQ,CAAC,CAAC;aACxE;iBAAM;gBACL,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,WAAW,CAClC,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,KAAK,EAAE,IAAI,CAAC,KAAK,GAAG,QAAQ,CAAC,EAClD,QAAQ,CACT,CAAC;aACH;YAED,mBAAmB;YACnB,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,GAAG,QAAQ,CAAC;SACpC;QAED,IAAI,IAAI,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,IAAI,mBAAmB,IAAI,IAAI,IAAI,GAAG,EAAE;YACrE,MAAM,WAAW,GAAa,EAAE,CAAC;YACjC,WAAW,CAAC,mBAAmB,CAAC,GAAG,IAAI,CAAC;YACxC,QAAQ,CAAC,WAAW,GAAG,WAAW,CAAC;YAEnC,MAAM,GAAG,GAAG,IAAI,CAAC,WAAW,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAW,EAAE,QAAQ,CAAC,CAAC;YACpE,IAAI,CAAC,SAAS,GAAG,CAAC,GAAG,CAAC,CAAC;SACxB;QAED,aAAa;QACb,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;IACrB,CAAC;CACF;AAtID,4BAsIC;AAED,iCAAiC;AACjC,kFAAkF;AAClF,EAAE;AACF,mBAAmB;AACnB,uBAAuB;AACvB,oBAAoB;AACpB,gDAAgD;AAChD,8CAA8C;AAC9C,6BAA6B;AAC7B,mCAAmC;AACnC,kCAAkC;AAClC,WAAW;AACX,OAAO;AACP,KAAK;AAEL,kBAAkB;AAClB,uBAAuB;AACvB,8BAA8B;AAC9B,0BAA0B;AAC1B,2BAA2B;AAC3B,8BAA8B;AAC9B,OAAO;AACP,0BAA0B;AAC1B,0BAA0B;AAC1B,2BAA2B;AAC3B,KAAK;AAEL,YAAY;AACZ,MAAM,qBAAqB,GAAG,CAAC,CAAC;AAChC,MAAM,iBAAiB,GAAG,CAAC,CAAC;AAC5B,MAAM,oBAAoB,GAAG,CAAC,IAAI,EAAE,CAAC;AAarC,gBAAgB;AAChB,MAAa,GAAG;IAad,YAAY,EAAU,EAAE,OAAiB,EAAE,OAAuB;QAChE,uCAAuC;QACvC,IAAI,OAAO,IAAI,IAAI;YACjB,MAAM,IAAI,iCAAyB,CAAC,4CAA4C,CAAC,CAAC;QAEpF,gBAAgB;QAChB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,OAAO,CAAC,GAAG,GAAG,yBAAiB,CAAC,EAAE,CAAC,CAAC;QAEzC,IAAI,OAAO,CAAC,cAAc,IAAI,OAAO,CAAC,cAAc,CAAC,IAAI,KAAK,gCAAc,CAAC,OAAO,EAAE;YACpF,IAAI,CAAC,OAAO,CAAC,eAAe,GAAG,OAAO,CAAC,cAAc,CAAC,MAAM,EAAE,CAAC;SAChE;QAED,uBAAuB;QACvB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAE7B,qBAAqB;QACrB,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,CAAC,YAAY,EAAE,CAAC;QAE5E,uBAAuB;QACvB,IAAI,CAAC,kBAAkB;YACrB,OAAO,OAAO,CAAC,kBAAkB,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,kBAAkB,CAAC,CAAC,CAAC,KAAK,CAAC;QACvF,IAAI,CAAC,eAAe;YAClB,OAAO,OAAO,CAAC,eAAe,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,eAAe,CAAC,CAAC,CAAC,KAAK,CAAC;QACjF,IAAI,CAAC,SAAS,GAAG,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC;QACpF,IAAI,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,IAAI,IAAI,GAAG,IAAI,GAAG,EAAE,CAAC;QAE3D,QAAQ;QACR,IAAI,CAAC,eAAe,GAAG,KAAK,CAAC;QAC7B,IAAI,CAAC,UAAU,GAAG,OAAO,CAAC,UAAU,IAAI,KAAK,CAAC;QAC9C,IAAI,CAAC,cAAc;YACjB,OAAO,OAAO,CAAC,cAAc,KAAK,SAAS,CAAC,CAAC
,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,KAAK,CAAC;IACjF,CAAC;IAED,KAAK;QACH,MAAM,OAAO,GAAa,EAAE,CAAC;QAC7B,IAAI,KAAK,GAAG,CAAC,CAAC;QAEd,IAAI,IAAI,CAAC,eAAe,EAAE;YACxB,KAAK,IAAI,qBAAqB,CAAC;SAChC;QAED,IAAI,IAAI,CAAC,UAAU,EAAE;YACnB,KAAK,IAAI,iBAAiB,CAAC;SAC5B;QAED,IAAI,IAAI,CAAC,cAAc,EAAE;YACvB,KAAK,IAAI,oBAAoB,CAAC;SAC/B;QAED,MAAM,MAAM,GAAG,MAAM,CAAC,KAAK,CACzB,CAAC,GAAG,CAAC,GAAG,SAAS;YACf,CAAC,CAAC,QAAQ;SACb,CAAC;QAEF,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;QAErB,IAAI,WAAW,GAAG,MAAM,CAAC,MAAM,CAAC;QAChC,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAC7B,WAAW,IAAI,IAAI,CAAC,mBAAmB,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;QAE1D,MAAM,CAAC,YAAY,CAAC,WAAW,EAAE,CAAC,CAAC,CAAC,CAAC,gBAAgB;QACrD,MAAM,CAAC,YAAY,CAAC,IAAI,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC,CAAC,YAAY;QACpD,MAAM,CAAC,YAAY,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,aAAa;QACxC,MAAM,CAAC,YAAY,CAAC,kBAAM,EAAE,EAAE,CAAC,CAAC,CAAC,SAAS;QAC1C,MAAM,CAAC,aAAa,CAAC,KAAK,EAAE,EAAE,CAAC,CAAC,CAAC,QAAQ;QACzC,OAAO,OAAO,CAAC;IACjB,CAAC;IAED,mBAAmB,CAAC,OAAiB,EAAE,QAAkB;QACvD,MAAM,iBAAiB,GAAG,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;QAC1C,iBAAiB,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;QAEzB,MAAM,cAAc,GAAG,IAAI,CAAC,aAAa,CAAC,QAAQ,CAAC,CAAC;QACpD,OAAO,CAAC,IAAI,CAAC,iBAAiB,CAAC,CAAC;QAChC,OAAO,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;QAE7B,OAAO,iBAAiB,CAAC,MAAM,GAAG,cAAc,CAAC,MAAM,CAAC;IAC1D,CAAC;IAED,aAAa,CAAC,QAAkB;QAC9B,OAAO,IAAI,CAAC,SAAS,CAAC,QAAQ,EAAE;YAC9B,SAAS,EAAE,IAAI,CAAC,SAAS;YACzB,kBAAkB,EAAE,IAAI,CAAC,kBAAkB;YAC3C,eAAe,EAAE,IAAI,CAAC,eAAe;SACtC,CAAC,CAAC;IACL,CAAC;IAED,MAAM,CAAC,YAAY;QACjB,UAAU,GAAG,CAAC,UAAU,GAAG,CAAC,CAAC,GAAG,UAAU,CAAC;QAC3C,OAAO,UAAU,CAAC;IACpB,CAAC;CACF;AA1GD,kBA0GC;AAED,gBAAgB;AAChB,MAAa,MAAM;IAqBjB,YACE,OAAe,EACf,SAAwB,EACxB,OAAe,EACf,IAAwB;QAExB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,GAAG,GAAG,OAAO,CAAC;QACnB,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;QACpB,IAAI,CAAC,IAAI,GAAG,IAAI,aAAJ,IAAI,cAAJ,IAAI,GAAI;YAClB,YAAY,EAAE,IAAI;YAClB,aAAa,EAAE,IAAI;YACnB,cAAc,EAAE,KAAK;YACrB,UAAU,EAAE,KAAK;SAClB,CAAC;QAEF,0BAA0B;QAC1B,IAAI,CAAC,MAAM,GAAG,SAAS,CAAC,MAAM,CAAC;QAC/B,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC,SAAS,CAAC;QACrC,IAAI,CAAC,UAAU,GAAG,SAAS,CAAC,UAAU,CAAC;QACvC,IAAI,CAAC,MAAM,GAAG,SAAS,CAAC,MAAM,CAAC;QAC/B,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC,cAAc,CAAC;QAE/C,sBAAsB;QACtB,IAAI,CAAC,aAAa,GAAG,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC;QAC5C,IAAI,CAAC,eAAe,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,qBAAqB,CAAC,KAAK,CAAC,CAAC;QAC1E,IAAI,CAAC,UAAU,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,iBAAiB,CAAC,KAAK,CAAC,CAAC;QACjE,IAAI,CAAC,cAAc,GAAG,CAAC,IAAI,CAAC,aAAa,GAAG,oBAAoB,CAAC,KAAK,CAAC,CAAC;QACxE,IAAI,CAAC,YAAY,GAAG,OAAO,IAAI,CAAC,IAAI,CAAC,YAAY,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC,IAAI,CAAC;QAChG,IAAI,CAAC,aAAa;YAChB,OAAO,IAAI,CAAC,IAAI,CAAC,aAAa,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC,IAAI,CAAC;QAChF,IAAI,CAAC,cAAc;YACjB,OAAO,IAAI,CAAC,IAAI,CAAC,cAAc,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC,CAAC,KAAK,CAAC;QACnF,IAAI,CAAC,UAAU,GAAG,OAAO,IAAI,CAAC,IAAI,CAAC,UAAU,KAAK,SAAS,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK,CAAC;QAE3F,IAAI,CAAC,SAAS,GAAG,EAAE,CAAC;IACtB,CAAC;IAED,QAAQ;QACN,OAAO,IAAI,CAAC,MAAM,CAAC;IACrB,CAAC;IAED,KAAK,CAAC,OAA0B;;QAC9B,kCAAkC;QAClC,IAAI,IAAI,CAAC,MAAM;YAAE,OAAO;QACxB,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,IAAI,CAAC,KAAK,GAAG,CAAC,CAAC;QACf,uDAAuD;QACvD,MAAM,GAAG,GAAG,OAAO,CAAC,GAAG,IAAI,KAAK,CAAC;QACjC,MAAM,mBAAmB,GAAG,OAAO,CAAC,mBAAmB,IAAI,IAAI,CAAC;QAChE,MAAM,YAAY,GAAG,MAAA,OAAO,CAAC,YAAY,mCAAI,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC;QACpE,MAAM,aAAa,GAAG,MAAA,OAAO,CAAC,aAAa,mCAAI,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC;QACvE,MA
AM,cAAc,GAAG,MAAA,OAAO,CAAC,cAAc,mCAAI,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC;QAC1E,MAAM,UAAU,GAAG,MAAA,OAAO,CAAC,UAAU,mCAAI,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC;QAE9D,qBAAqB;QACrB,MAAM,QAAQ,GAAyB;YACrC,YAAY;YACZ,aAAa;YACb,cAAc;YACd,UAAU;SACX,CAAC;QAEF,OAAO,IAAI,CAAC,KAAK,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,EAAE;YACpC,MAAM,WAAW,GAAG,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC;YACtD,IAAI,WAAW,KAAK,CAAC,EAAE;gBACrB,MAAM,QAAQ,GAAG,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;gBACpD,MAAM,GAAG,GAAG,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,KAAK,EAAE,IAAI,CAAC,KAAK,GAAG,QAAQ,CAAC,CAAC;gBAC/D,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI,CAAC,WAAW,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC,CAAC;gBAEjE,IAAI,CAAC,KAAK,IAAI,QAAQ,CAAC;aACxB;iBAAM,IAAI,WAAW,KAAK,CAAC,EAAE;gBAC5B,4DAA4D;gBAE5D,sDAAsD;gBACtD,MAAM,IAAI,yBAAiB,CAAC,qDAAqD,CAAC,CAAC;aACpF;SACF;QAED,IAAI,IAAI,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,IAAI,mBAAmB,IAAI,IAAI,IAAI,GAAG,EAAE;YACrE,MAAM,WAAW,GAAa,EAAE,CAAC;YACjC,WAAW,CAAC,mBAAmB,CAAC,GAAG,IAAI,CAAC;YACxC,QAAQ,CAAC,WAAW,GAAG,WAAW,CAAC;YAEnC,MAAM,GAAG,GAAG,IAAI,CAAC,WAAW,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAW,EAAE,QAAQ,CAAC,CAAC;YACpE,IAAI,CAAC,SAAS,GAAG,CAAC,GAAG,CAAC,CAAC;SACxB;QAED,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;IACrB,CAAC;CACF;AAhHD,wBAgHC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connect.js b/node_modules/mongodb/lib/cmap/connect.js new file mode 100644 index 000000000..eaacbdec1 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connect.js @@ -0,0 +1,304 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.LEGAL_TCP_SOCKET_OPTIONS = exports.LEGAL_TLS_SOCKET_OPTIONS = exports.connect = void 0; +const net = require("net"); +const tls = require("tls"); +const connection_1 = require("./connection"); +const error_1 = require("../error"); +const defaultAuthProviders_1 = require("./auth/defaultAuthProviders"); +const auth_provider_1 = require("./auth/auth_provider"); +const utils_1 = require("../utils"); +const constants_1 = require("./wire_protocol/constants"); +const bson_1 = require("../bson"); +const FAKE_MONGODB_SERVICE_ID = typeof process.env.FAKE_MONGODB_SERVICE_ID === 'string' && + process.env.FAKE_MONGODB_SERVICE_ID.toLowerCase() === 'true'; +function connect(options, callback) { + makeConnection(options, (err, socket) => { + var _a; + if (err || !socket) { + return callback(err); + } + let ConnectionType = (_a = options.connectionType) !== null && _a !== void 0 ? 
_a : connection_1.Connection; + if (options.autoEncrypter) { + ConnectionType = connection_1.CryptoConnection; + } + performInitialHandshake(new ConnectionType(socket, options), options, callback); + }); +} +exports.connect = connect; +function checkSupportedServer(ismaster, options) { + var _a; + const serverVersionHighEnough = ismaster && + (typeof ismaster.maxWireVersion === 'number' || ismaster.maxWireVersion instanceof bson_1.Int32) && + ismaster.maxWireVersion >= constants_1.MIN_SUPPORTED_WIRE_VERSION; + const serverVersionLowEnough = ismaster && + (typeof ismaster.minWireVersion === 'number' || ismaster.minWireVersion instanceof bson_1.Int32) && + ismaster.minWireVersion <= constants_1.MAX_SUPPORTED_WIRE_VERSION; + if (serverVersionHighEnough) { + if (serverVersionLowEnough) { + return null; + } + const message = `Server at ${options.hostAddress} reports minimum wire version ${JSON.stringify(ismaster.minWireVersion)}, but this version of the Node.js Driver requires at most ${constants_1.MAX_SUPPORTED_WIRE_VERSION} (MongoDB ${constants_1.MAX_SUPPORTED_SERVER_VERSION})`; + return new error_1.MongoCompatibilityError(message); + } + const message = `Server at ${options.hostAddress} reports maximum wire version ${(_a = JSON.stringify(ismaster.maxWireVersion)) !== null && _a !== void 0 ? _a : 0}, but this version of the Node.js Driver requires at least ${constants_1.MIN_SUPPORTED_WIRE_VERSION} (MongoDB ${constants_1.MIN_SUPPORTED_SERVER_VERSION})`; + return new error_1.MongoCompatibilityError(message); +} +function performInitialHandshake(conn, options, _callback) { + const callback = function (err, ret) { + if (err && conn) { + conn.destroy(); + } + _callback(err, ret); + }; + const credentials = options.credentials; + if (credentials) { + if (!(credentials.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_DEFAULT) && + !defaultAuthProviders_1.AUTH_PROVIDERS.get(credentials.mechanism)) { + callback(new error_1.MongoInvalidArgumentError(`AuthMechanism '${credentials.mechanism}' not supported`)); + return; + } + } + const authContext = new auth_provider_1.AuthContext(conn, credentials, options); + prepareHandshakeDocument(authContext, (err, handshakeDoc) => { + if (err || !handshakeDoc) { + return callback(err); + } + const handshakeOptions = Object.assign({}, options); + if (typeof options.connectTimeoutMS === 'number') { + // The handshake technically is a monitoring check, so its socket timeout should be connectTimeoutMS + handshakeOptions.socketTimeoutMS = options.connectTimeoutMS; + } + const start = new Date().getTime(); + conn.command(utils_1.ns('admin.$cmd'), handshakeDoc, handshakeOptions, (err, response) => { + if (err) { + callback(err); + return; + } + if ((response === null || response === void 0 ? void 0 : response.ok) === 0) { + callback(new error_1.MongoServerError(response)); + return; + } + if ('isWritablePrimary' in response) { + // Provide pre-hello-style response document. + response.ismaster = response.isWritablePrimary; + } + if (response.helloOk) { + conn.helloOk = true; + } + const supportedServerErr = checkSupportedServer(response, options); + if (supportedServerErr) { + callback(supportedServerErr); + return; + } + if (options.loadBalanced) { + // TODO: Durran: Remove when server support exists. 
(NODE-3431) + if (FAKE_MONGODB_SERVICE_ID) { + response.serviceId = response.topologyVersion.processId; + } + if (!response.serviceId) { + return callback(new error_1.MongoCompatibilityError('Driver attempted to initialize in load balancing mode, ' + + 'but the server does not support this mode.')); + } + } + // NOTE: This is metadata attached to the connection while porting away from + // handshake being done in the `Server` class. Likely, it should be + // relocated, or at very least restructured. + conn.ismaster = response; + conn.lastIsMasterMS = new Date().getTime() - start; + if (!response.arbiterOnly && credentials) { + // store the response on auth context + authContext.response = response; + const resolvedCredentials = credentials.resolveAuthMechanism(response); + const provider = defaultAuthProviders_1.AUTH_PROVIDERS.get(resolvedCredentials.mechanism); + if (!provider) { + return callback(new error_1.MongoInvalidArgumentError(`No AuthProvider for ${resolvedCredentials.mechanism} defined.`)); + } + provider.auth(authContext, err => { + if (err) + return callback(err); + callback(undefined, conn); + }); + return; + } + callback(undefined, conn); + }); + }); +} +function prepareHandshakeDocument(authContext, callback) { + const options = authContext.options; + const compressors = options.compressors ? options.compressors : []; + const { serverApi } = authContext.connection; + const handshakeDoc = { + [(serverApi === null || serverApi === void 0 ? void 0 : serverApi.version) ? 'hello' : 'ismaster']: true, + helloOk: true, + client: options.metadata || utils_1.makeClientMetadata(options), + compression: compressors, + loadBalanced: options.loadBalanced + }; + const credentials = authContext.credentials; + if (credentials) { + if (credentials.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_DEFAULT && credentials.username) { + handshakeDoc.saslSupportedMechs = `${credentials.source}.${credentials.username}`; + const provider = defaultAuthProviders_1.AUTH_PROVIDERS.get(defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA256); + if (!provider) { + // This auth mechanism is always present. 
+ return callback(new error_1.MongoInvalidArgumentError(`No AuthProvider for ${defaultAuthProviders_1.AuthMechanism.MONGODB_SCRAM_SHA256} defined.`)); + } + return provider.prepare(handshakeDoc, authContext, callback); + } + const provider = defaultAuthProviders_1.AUTH_PROVIDERS.get(credentials.mechanism); + if (!provider) { + return callback(new error_1.MongoInvalidArgumentError(`No AuthProvider for ${credentials.mechanism} defined.`)); + } + return provider.prepare(handshakeDoc, authContext, callback); + } + callback(undefined, handshakeDoc); +} +/** @public */ +exports.LEGAL_TLS_SOCKET_OPTIONS = [ + 'ALPNProtocols', + 'ca', + 'cert', + 'checkServerIdentity', + 'ciphers', + 'crl', + 'ecdhCurve', + 'key', + 'minDHSize', + 'passphrase', + 'pfx', + 'rejectUnauthorized', + 'secureContext', + 'secureProtocol', + 'servername', + 'session' +]; +/** @public */ +exports.LEGAL_TCP_SOCKET_OPTIONS = [ + 'family', + 'hints', + 'localAddress', + 'localPort', + 'lookup' +]; +function parseConnectOptions(options) { + const hostAddress = options.hostAddress; + if (!hostAddress) + throw new error_1.MongoInvalidArgumentError('Option "hostAddress" is required'); + const result = {}; + for (const name of exports.LEGAL_TCP_SOCKET_OPTIONS) { + if (options[name] != null) { + result[name] = options[name]; + } + } + if (typeof hostAddress.socketPath === 'string') { + result.path = hostAddress.socketPath; + return result; + } + else if (typeof hostAddress.host === 'string') { + result.host = hostAddress.host; + result.port = hostAddress.port; + return result; + } + else { + // This should never happen since we set up HostAddresses + // But if we don't throw here the socket could hang until timeout + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError(`Unexpected HostAddress ${JSON.stringify(hostAddress)}`); + } +} +function parseSslOptions(options) { + const result = parseConnectOptions(options); + // Merge in valid SSL options + for (const name of exports.LEGAL_TLS_SOCKET_OPTIONS) { + if (options[name] != null) { + result[name] = options[name]; + } + } + // Set default sni servername to be the same as host + if (result.servername == null && result.host && !net.isIP(result.host)) { + result.servername = result.host; + } + return result; +} +const SOCKET_ERROR_EVENT_LIST = ['error', 'close', 'timeout', 'parseError']; +const SOCKET_ERROR_EVENTS = new Set(SOCKET_ERROR_EVENT_LIST); +function makeConnection(options, _callback) { + var _a, _b, _c, _d, _e, _f, _g, _h, _j; + const useTLS = (_a = options.tls) !== null && _a !== void 0 ? _a : false; + const keepAlive = (_b = options.keepAlive) !== null && _b !== void 0 ? _b : true; + const socketTimeoutMS = (_d = (_c = options.socketTimeoutMS) !== null && _c !== void 0 ? _c : Reflect.get(options, 'socketTimeout')) !== null && _d !== void 0 ? _d : 0; + const noDelay = (_e = options.noDelay) !== null && _e !== void 0 ? _e : true; + const connectionTimeout = (_f = options.connectTimeoutMS) !== null && _f !== void 0 ? _f : 30000; + const rejectUnauthorized = (_g = options.rejectUnauthorized) !== null && _g !== void 0 ? _g : true; + const keepAliveInitialDelay = (_j = (((_h = options.keepAliveInitialDelay) !== null && _h !== void 0 ? _h : 120000) > socketTimeoutMS + ? Math.round(socketTimeoutMS / 2) + : options.keepAliveInitialDelay)) !== null && _j !== void 0 ? 
_j : 120000; + let socket; + const callback = function (err, ret) { + if (err && socket) { + socket.destroy(); + } + _callback(err, ret); + }; + if (useTLS) { + const tlsSocket = tls.connect(parseSslOptions(options)); + if (typeof tlsSocket.disableRenegotiation === 'function') { + tlsSocket.disableRenegotiation(); + } + socket = tlsSocket; + } + else { + socket = net.createConnection(parseConnectOptions(options)); + } + socket.setKeepAlive(keepAlive, keepAliveInitialDelay); + socket.setTimeout(connectionTimeout); + socket.setNoDelay(noDelay); + const connectEvent = useTLS ? 'secureConnect' : 'connect'; + let cancellationHandler; + function errorHandler(eventName) { + return (err) => { + SOCKET_ERROR_EVENTS.forEach(event => socket.removeAllListeners(event)); + if (cancellationHandler && options.cancellationToken) { + options.cancellationToken.removeListener('cancel', cancellationHandler); + } + socket.removeListener(connectEvent, connectHandler); + callback(connectionFailureError(eventName, err)); + }; + } + function connectHandler() { + SOCKET_ERROR_EVENTS.forEach(event => socket.removeAllListeners(event)); + if (cancellationHandler && options.cancellationToken) { + options.cancellationToken.removeListener('cancel', cancellationHandler); + } + if ('authorizationError' in socket) { + if (socket.authorizationError && rejectUnauthorized) { + return callback(socket.authorizationError); + } + } + socket.setTimeout(socketTimeoutMS); + callback(undefined, socket); + } + SOCKET_ERROR_EVENTS.forEach(event => socket.once(event, errorHandler(event))); + if (options.cancellationToken) { + cancellationHandler = errorHandler('cancel'); + options.cancellationToken.once('cancel', cancellationHandler); + } + socket.once(connectEvent, connectHandler); +} +function connectionFailureError(type, err) { + switch (type) { + case 'error': + return new error_1.MongoNetworkError(err); + case 'timeout': + return new error_1.MongoNetworkTimeoutError('connection timed out'); + case 'close': + return new error_1.MongoNetworkError('connection closed'); + case 'cancel': + return new error_1.MongoNetworkError('connection establishment was cancelled'); + default: + return new error_1.MongoNetworkError('unknown network error'); + } +} +//# sourceMappingURL=connect.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connect.js.map b/node_modules/mongodb/lib/cmap/connect.js.map new file mode 100644 index 000000000..dbee3c2c8 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connect.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"connect.js","sourceRoot":"","sources":["../../src/cmap/connect.ts"],"names":[],"mappings":";;;AAAA,2BAA2B;AAC3B,2BAA2B;AAC3B,6CAA+E;AAC/E,oCAQkB;AAClB,sEAA4E;AAC5E,wDAAmD;AACnD,oCAA8F;AAC9F,yDAKmC;AAEnC,kCAAgC;AAKhC,MAAM,uBAAuB,GAC3B,OAAO,OAAO,CAAC,GAAG,CAAC,uBAAuB,KAAK,QAAQ;IACvD,OAAO,CAAC,GAAG,CAAC,uBAAuB,CAAC,WAAW,EAAE,KAAK,MAAM,CAAC;AAK/D,SAAgB,OAAO,CAAC,OAA0B,EAAE,QAA8B;IAChF,cAAc,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;;QACtC,IAAI,GAAG,IAAI,CAAC,MAAM,EAAE;YAClB,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,IAAI,cAAc,GAAG,MAAA,OAAO,CAAC,cAAc,mCAAI,uBAAU,CAAC;QAC1D,IAAI,OAAO,CAAC,aAAa,EAAE;YACzB,cAAc,GAAG,6BAAgB,CAAC;SACnC;QACD,uBAAuB,CAAC,IAAI,cAAc,CAAC,MAAM,EAAE,OAAO,CAAC,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAClF,CAAC,CAAC,CAAC;AACL,CAAC;AAZD,0BAYC;AAED,SAAS,oBAAoB,CAAC,QAAkB,EAAE,OAA0B;;IAC1E,MAAM,uBAAuB,GAC3B,QAAQ;QACR,CAAC,OAAO,QAAQ,CAAC,cAAc,KAAK,QAAQ,IAAI,QAAQ,CAAC,cAAc,YAAY,YAAK,CAAC;QACzF,QAAQ,CAAC,cAAc,IAAI,sCAA0B,CAAC;IACxD,MAAM,sBAAsB,GAC1B,QAAQ;QACR,CAAC,OAAO,QAAQ,CAAC,cAAc,KAAK,QAAQ,IAAI,QAAQ,CAAC,cAAc,YAAY,YAAK,CAAC;QACzF,QAAQ,CAAC,cAAc,IAAI,sCAA0B,CAAC;IAExD,IAAI,uBAAuB,EAAE;QAC3B,IAAI,sBAAsB,EAAE;YAC1B,OAAO,IAAI,CAAC;SACb;QAED,MAAM,OAAO,GAAG,aAAa,OAAO,CAAC,WAAW,iCAAiC,IAAI,CAAC,SAAS,CAC7F,QAAQ,CAAC,cAAc,CACxB,6DAA6D,sCAA0B,aAAa,wCAA4B,GAAG,CAAC;QACrI,OAAO,IAAI,+BAAuB,CAAC,OAAO,CAAC,CAAC;KAC7C;IAED,MAAM,OAAO,GAAG,aAAa,OAAO,CAAC,WAAW,iCAC9C,MAAA,IAAI,CAAC,SAAS,CAAC,QAAQ,CAAC,cAAc,CAAC,mCAAI,CAC7C,8DAA8D,sCAA0B,aAAa,wCAA4B,GAAG,CAAC;IACrI,OAAO,IAAI,+BAAuB,CAAC,OAAO,CAAC,CAAC;AAC9C,CAAC;AAED,SAAS,uBAAuB,CAC9B,IAAgB,EAChB,OAA0B,EAC1B,SAAmB;IAEnB,MAAM,QAAQ,GAAuB,UAAU,GAAG,EAAE,GAAG;QACrD,IAAI,GAAG,IAAI,IAAI,EAAE;YACf,IAAI,CAAC,OAAO,EAAE,CAAC;SAChB;QACD,SAAS,CAAC,GAAG,EAAE,GAAG,CAAC,CAAC;IACtB,CAAC,CAAC;IAEF,MAAM,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC;IACxC,IAAI,WAAW,EAAE;QACf,IACE,CAAC,CAAC,WAAW,CAAC,SAAS,KAAK,oCAAa,CAAC,eAAe,CAAC;YAC1D,CAAC,qCAAc,CAAC,GAAG,CAAC,WAAW,CAAC,SAAS,CAAC,EAC1C;YACA,QAAQ,CACN,IAAI,iCAAyB,CAAC,kBAAkB,WAAW,CAAC,SAAS,iBAAiB,CAAC,CACxF,CAAC;YACF,OAAO;SACR;KACF;IAED,MAAM,WAAW,GAAG,IAAI,2BAAW,CAAC,IAAI,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;IAChE,wBAAwB,CAAC,WAAW,EAAE,CAAC,GAAG,EAAE,YAAY,EAAE,EAAE;QAC1D,IAAI,GAAG,IAAI,CAAC,YAAY,EAAE;YACxB,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,MAAM,gBAAgB,GAAa,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAC9D,IAAI,OAAO,OAAO,CAAC,gBAAgB,KAAK,QAAQ,EAAE;YAChD,oGAAoG;YACpG,gBAAgB,CAAC,eAAe,GAAG,OAAO,CAAC,gBAAgB,CAAC;SAC7D;QAED,MAAM,KAAK,GAAG,IAAI,IAAI,EAAE,CAAC,OAAO,EAAE,CAAC;QACnC,IAAI,CAAC,OAAO,CAAC,UAAE,CAAC,YAAY,CAAC,EAAE,YAAY,EAAE,gBAAgB,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAC/E,IAAI,GAAG,EAAE;gBACP,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,IAAI,CAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,EAAE,MAAK,CAAC,EAAE;gBACtB,QAAQ,CAAC,IAAI,wBAAgB,CAAC,QAAQ,CAAC,CAAC,CAAC;gBACzC,OAAO;aACR;YAED,IAAI,mBAAmB,IAAI,QAAQ,EAAE;gBACnC,6CAA6C;gBAC7C,QAAQ,CAAC,QAAQ,GAAG,QAAQ,CAAC,iBAAiB,CAAC;aAChD;YAED,IAAI,QAAQ,CAAC,OAAO,EAAE;gBACpB,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC;aACrB;YAED,MAAM,kBAAkB,GAAG,oBAAoB,CAAC,QAAQ,EAAE,OAAO,CAAC,CAAC;YACnE,IAAI,kBAAkB,EAAE;gBACtB,QAAQ,CAAC,kBAAkB,CAAC,CAAC;gBAC7B,OAAO;aACR;YAED,IAAI,OAAO,CAAC,YAAY,EAAE;gBACxB,+DAA+D;gBAC/D,IAAI,uBAAuB,EAAE;oBAC3B,QAAQ,CAAC,SAAS,GAAG,QAAQ,CAAC,eAAe,CAAC,SAAS,CAAC;iBACzD;gBACD,IAAI,CAAC,QAAQ,CAAC,SAAS,EAAE;oBACvB,OAAO,QAAQ,CACb,IAAI,+BAAuB,CACzB,yDAAyD;wBACvD,4CAA4C,CAC/C,CACF,CAAC;iBACH;aACF;YAED,4EAA4E;YAC5E,yEAAyE;YACzE,kDAAkD;YAClD,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;YACzB,IAAI,CAAC,cAAc,GAAG,IAAI,IAAI,EAAE,CAAC,OAAO,EAAE,GAAG,KAAK,CAAC;YAEnD,IAAI,CAAC,QAAQ,CAAC,WAAW,IAAI,WAAW,EAAE;gBACxC,
qCAAqC;gBACrC,WAAW,CAAC,QAAQ,GAAG,QAAQ,CAAC;gBAEhC,MAAM,mBAAmB,GAAG,WAAW,CAAC,oBAAoB,CAAC,QAAQ,CAAC,CAAC;gBACvE,MAAM,QAAQ,GAAG,qCAAc,CAAC,GAAG,CAAC,mBAAmB,CAAC,SAAS,CAAC,CAAC;gBACnE,IAAI,CAAC,QAAQ,EAAE;oBACb,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAC3B,uBAAuB,mBAAmB,CAAC,SAAS,WAAW,CAChE,CACF,CAAC;iBACH;gBACD,QAAQ,CAAC,IAAI,CAAC,WAAW,EAAE,GAAG,CAAC,EAAE;oBAC/B,IAAI,GAAG;wBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;oBAC9B,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;gBAC5B,CAAC,CAAC,CAAC;gBAEH,OAAO;aACR;YAED,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;QAC5B,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC;AAYD,SAAS,wBAAwB,CAAC,WAAwB,EAAE,QAAqC;IAC/F,MAAM,OAAO,GAAG,WAAW,CAAC,OAAO,CAAC;IACpC,MAAM,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC,EAAE,CAAC;IACnE,MAAM,EAAE,SAAS,EAAE,GAAG,WAAW,CAAC,UAAU,CAAC;IAE7C,MAAM,YAAY,GAAsB;QACtC,CAAC,CAAA,SAAS,aAAT,SAAS,uBAAT,SAAS,CAAE,OAAO,EAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,IAAI;QACjD,OAAO,EAAE,IAAI;QACb,MAAM,EAAE,OAAO,CAAC,QAAQ,IAAI,0BAAkB,CAAC,OAAO,CAAC;QACvD,WAAW,EAAE,WAAW;QACxB,YAAY,EAAE,OAAO,CAAC,YAAY;KACnC,CAAC;IAEF,MAAM,WAAW,GAAG,WAAW,CAAC,WAAW,CAAC;IAC5C,IAAI,WAAW,EAAE;QACf,IAAI,WAAW,CAAC,SAAS,KAAK,oCAAa,CAAC,eAAe,IAAI,WAAW,CAAC,QAAQ,EAAE;YACnF,YAAY,CAAC,kBAAkB,GAAG,GAAG,WAAW,CAAC,MAAM,IAAI,WAAW,CAAC,QAAQ,EAAE,CAAC;YAElF,MAAM,QAAQ,GAAG,qCAAc,CAAC,GAAG,CAAC,oCAAa,CAAC,oBAAoB,CAAC,CAAC;YACxE,IAAI,CAAC,QAAQ,EAAE;gBACb,yCAAyC;gBACzC,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAC3B,uBAAuB,oCAAa,CAAC,oBAAoB,WAAW,CACrE,CACF,CAAC;aACH;YACD,OAAO,QAAQ,CAAC,OAAO,CAAC,YAAY,EAAE,WAAW,EAAE,QAAQ,CAAC,CAAC;SAC9D;QACD,MAAM,QAAQ,GAAG,qCAAc,CAAC,GAAG,CAAC,WAAW,CAAC,SAAS,CAAC,CAAC;QAC3D,IAAI,CAAC,QAAQ,EAAE;YACb,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAAC,uBAAuB,WAAW,CAAC,SAAS,WAAW,CAAC,CACvF,CAAC;SACH;QACD,OAAO,QAAQ,CAAC,OAAO,CAAC,YAAY,EAAE,WAAW,EAAE,QAAQ,CAAC,CAAC;KAC9D;IACD,QAAQ,CAAC,SAAS,EAAE,YAAY,CAAC,CAAC;AACpC,CAAC;AAED,cAAc;AACD,QAAA,wBAAwB,GAAG;IACtC,eAAe;IACf,IAAI;IACJ,MAAM;IACN,qBAAqB;IACrB,SAAS;IACT,KAAK;IACL,WAAW;IACX,KAAK;IACL,WAAW;IACX,YAAY;IACZ,KAAK;IACL,oBAAoB;IACpB,eAAe;IACf,gBAAgB;IAChB,YAAY;IACZ,SAAS;CACD,CAAC;AAEX,cAAc;AACD,QAAA,wBAAwB,GAAG;IACtC,QAAQ;IACR,OAAO;IACP,cAAc;IACd,WAAW;IACX,QAAQ;CACA,CAAC;AAEX,SAAS,mBAAmB,CAAC,OAA0B;IACrD,MAAM,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC;IACxC,IAAI,CAAC,WAAW;QAAE,MAAM,IAAI,iCAAyB,CAAC,kCAAkC,CAAC,CAAC;IAE1F,MAAM,MAAM,GAA2D,EAAE,CAAC;IAC1E,KAAK,MAAM,IAAI,IAAI,gCAAwB,EAAE;QAC3C,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,IAAI,EAAE;YACxB,MAAmB,CAAC,IAAI,CAAC,GAAG,OAAO,CAAC,IAAI,CAAC,CAAC;SAC5C;KACF;IAED,IAAI,OAAO,WAAW,CAAC,UAAU,KAAK,QAAQ,EAAE;QAC9C,MAAM,CAAC,IAAI,GAAG,WAAW,CAAC,UAAU,CAAC;QACrC,OAAO,MAA+B,CAAC;KACxC;SAAM,IAAI,OAAO,WAAW,CAAC,IAAI,KAAK,QAAQ,EAAE;QAC/C,MAAM,CAAC,IAAI,GAAG,WAAW,CAAC,IAAI,CAAC;QAC/B,MAAM,CAAC,IAAI,GAAG,WAAW,CAAC,IAAI,CAAC;QAC/B,OAAO,MAA+B,CAAC;KACxC;SAAM;QACL,yDAAyD;QACzD,iEAAiE;QACjE,kBAAkB;QAClB,MAAM,IAAI,yBAAiB,CAAC,0BAA0B,IAAI,CAAC,SAAS,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;KACtF;AACH,CAAC;AAED,SAAS,eAAe,CAAC,OAA0B;IACjD,MAAM,MAAM,GAAsB,mBAAmB,CAAC,OAAO,CAAC,CAAC;IAC/D,6BAA6B;IAC7B,KAAK,MAAM,IAAI,IAAI,gCAAwB,EAAE;QAC3C,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,IAAI,EAAE;YACxB,MAAmB,CAAC,IAAI,CAAC,GAAG,OAAO,CAAC,IAAI,CAAC,CAAC;SAC5C;KACF;IAED,oDAAoD;IACpD,IAAI,MAAM,CAAC,UAAU,IAAI,IAAI,IAAI,MAAM,CAAC,IAAI,IAAI,CAAC,GAAG,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE;QACtE,MAAM,CAAC,UAAU,GAAG,MAAM,CAAC,IAAI,CAAC;KACjC;IAED,OAAO,MAAM,CAAC;AAChB,CAAC;AAED,MAAM,uBAAuB,GAAG,CAAC,OAAO,EAAE,OAAO,EAAE,SAAS,EAAE,YAAY,CAAU,CAAC;AAErF,MAAM,mBAAmB,GAAG,IAAI,GAAG,CAAC,uBAAuB,CAAC,CAAC;AAE7D,SAAS,cAAc,CAAC,OAA0B,EAAE,SAA6C;;IAC/F,MAAM,MAAM,GAAG,MAAA,OAAO,CAAC,GAAG,mCAAI,KAAK,CAAC;I
ACpC,MAAM,SAAS,GAAG,MAAA,OAAO,CAAC,SAAS,mCAAI,IAAI,CAAC;IAC5C,MAAM,eAAe,GAAG,MAAA,MAAA,OAAO,CAAC,eAAe,mCAAI,OAAO,CAAC,GAAG,CAAC,OAAO,EAAE,eAAe,CAAC,mCAAI,CAAC,CAAC;IAC9F,MAAM,OAAO,GAAG,MAAA,OAAO,CAAC,OAAO,mCAAI,IAAI,CAAC;IACxC,MAAM,iBAAiB,GAAG,MAAA,OAAO,CAAC,gBAAgB,mCAAI,KAAK,CAAC;IAC5D,MAAM,kBAAkB,GAAG,MAAA,OAAO,CAAC,kBAAkB,mCAAI,IAAI,CAAC;IAC9D,MAAM,qBAAqB,GACzB,MAAA,CAAC,CAAC,MAAA,OAAO,CAAC,qBAAqB,mCAAI,MAAM,CAAC,GAAG,eAAe;QAC1D,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,eAAe,GAAG,CAAC,CAAC;QACjC,CAAC,CAAC,OAAO,CAAC,qBAAqB,CAAC,mCAAI,MAAM,CAAC;IAE/C,IAAI,MAAc,CAAC;IACnB,MAAM,QAAQ,GAAqB,UAAU,GAAG,EAAE,GAAG;QACnD,IAAI,GAAG,IAAI,MAAM,EAAE;YACjB,MAAM,CAAC,OAAO,EAAE,CAAC;SAClB;QAED,SAAS,CAAC,GAAG,EAAE,GAAG,CAAC,CAAC;IACtB,CAAC,CAAC;IAEF,IAAI,MAAM,EAAE;QACV,MAAM,SAAS,GAAG,GAAG,CAAC,OAAO,CAAC,eAAe,CAAC,OAAO,CAAC,CAAC,CAAC;QACxD,IAAI,OAAO,SAAS,CAAC,oBAAoB,KAAK,UAAU,EAAE;YACxD,SAAS,CAAC,oBAAoB,EAAE,CAAC;SAClC;QACD,MAAM,GAAG,SAAS,CAAC;KACpB;SAAM;QACL,MAAM,GAAG,GAAG,CAAC,gBAAgB,CAAC,mBAAmB,CAAC,OAAO,CAAC,CAAC,CAAC;KAC7D;IAED,MAAM,CAAC,YAAY,CAAC,SAAS,EAAE,qBAAqB,CAAC,CAAC;IACtD,MAAM,CAAC,UAAU,CAAC,iBAAiB,CAAC,CAAC;IACrC,MAAM,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC;IAE3B,MAAM,YAAY,GAAG,MAAM,CAAC,CAAC,CAAC,eAAe,CAAC,CAAC,CAAC,SAAS,CAAC;IAC1D,IAAI,mBAAyC,CAAC;IAC9C,SAAS,YAAY,CAAC,SAAgC;QACpD,OAAO,CAAC,GAAU,EAAE,EAAE;YACpB,mBAAmB,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,MAAM,CAAC,kBAAkB,CAAC,KAAK,CAAC,CAAC,CAAC;YACvE,IAAI,mBAAmB,IAAI,OAAO,CAAC,iBAAiB,EAAE;gBACpD,OAAO,CAAC,iBAAiB,CAAC,cAAc,CAAC,QAAQ,EAAE,mBAAmB,CAAC,CAAC;aACzE;YAED,MAAM,CAAC,cAAc,CAAC,YAAY,EAAE,cAAc,CAAC,CAAC;YACpD,QAAQ,CAAC,sBAAsB,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC,CAAC;QACnD,CAAC,CAAC;IACJ,CAAC;IAED,SAAS,cAAc;QACrB,mBAAmB,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,MAAM,CAAC,kBAAkB,CAAC,KAAK,CAAC,CAAC,CAAC;QACvE,IAAI,mBAAmB,IAAI,OAAO,CAAC,iBAAiB,EAAE;YACpD,OAAO,CAAC,iBAAiB,CAAC,cAAc,CAAC,QAAQ,EAAE,mBAAmB,CAAC,CAAC;SACzE;QAED,IAAI,oBAAoB,IAAI,MAAM,EAAE;YAClC,IAAI,MAAM,CAAC,kBAAkB,IAAI,kBAAkB,EAAE;gBACnD,OAAO,QAAQ,CAAC,MAAM,CAAC,kBAAkB,CAAC,CAAC;aAC5C;SACF;QAED,MAAM,CAAC,UAAU,CAAC,eAAe,CAAC,CAAC;QACnC,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;IAC9B,CAAC;IAED,mBAAmB,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,KAAK,EAAE,YAAY,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;IAC9E,IAAI,OAAO,CAAC,iBAAiB,EAAE;QAC7B,mBAAmB,GAAG,YAAY,CAAC,QAAQ,CAAC,CAAC;QAC7C,OAAO,CAAC,iBAAiB,CAAC,IAAI,CAAC,QAAQ,EAAE,mBAAmB,CAAC,CAAC;KAC/D;IAED,MAAM,CAAC,IAAI,CAAC,YAAY,EAAE,cAAc,CAAC,CAAC;AAC5C,CAAC;AAED,SAAS,sBAAsB,CAAC,IAAY,EAAE,GAAU;IACtD,QAAQ,IAAI,EAAE;QACZ,KAAK,OAAO;YACV,OAAO,IAAI,yBAAiB,CAAC,GAAG,CAAC,CAAC;QACpC,KAAK,SAAS;YACZ,OAAO,IAAI,gCAAwB,CAAC,sBAAsB,CAAC,CAAC;QAC9D,KAAK,OAAO;YACV,OAAO,IAAI,yBAAiB,CAAC,mBAAmB,CAAC,CAAC;QACpD,KAAK,QAAQ;YACX,OAAO,IAAI,yBAAiB,CAAC,wCAAwC,CAAC,CAAC;QACzE;YACE,OAAO,IAAI,yBAAiB,CAAC,uBAAuB,CAAC,CAAC;KACzD;AACH,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connection.js b/node_modules/mongodb/lib/cmap/connection.js new file mode 100644 index 000000000..bd4e0d94b --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connection.js @@ -0,0 +1,557 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.hasSessionSupport = exports.CryptoConnection = exports.APM_EVENTS = exports.Connection = void 0; +const message_stream_1 = require("./message_stream"); +const stream_description_1 = require("./stream_description"); +const command_monitoring_events_1 = require("./command_monitoring_events"); +const sessions_1 = require("../sessions"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +const 
commands_1 = require("./commands"); +const bson_1 = require("../bson"); +const shared_1 = require("./wire_protocol/shared"); +const read_preference_1 = require("../read_preference"); +const mongo_types_1 = require("../mongo_types"); +/** @internal */ +const kStream = Symbol('stream'); +/** @internal */ +const kQueue = Symbol('queue'); +/** @internal */ +const kMessageStream = Symbol('messageStream'); +/** @internal */ +const kGeneration = Symbol('generation'); +/** @internal */ +const kLastUseTime = Symbol('lastUseTime'); +/** @internal */ +const kClusterTime = Symbol('clusterTime'); +/** @internal */ +const kDescription = Symbol('description'); +/** @internal */ +const kIsMaster = Symbol('ismaster'); +/** @internal */ +const kAutoEncrypter = Symbol('autoEncrypter'); +/** @internal */ +const kFullResult = Symbol('fullResult'); +/** @internal */ +class Connection extends mongo_types_1.TypedEventEmitter { + constructor(stream, options) { + var _a, _b; + super(); + this.id = options.id; + this.address = streamIdentifier(stream); + this.socketTimeoutMS = (_a = options.socketTimeoutMS) !== null && _a !== void 0 ? _a : 0; + this.monitorCommands = options.monitorCommands; + this.serverApi = options.serverApi; + this.closed = false; + this.destroyed = false; + this[kDescription] = new stream_description_1.StreamDescription(this.address, options); + this[kGeneration] = options.generation; + this[kLastUseTime] = utils_1.now(); + // setup parser stream and message handling + this[kQueue] = new Map(); + this[kMessageStream] = new message_stream_1.MessageStream({ + ...options, + maxBsonMessageSize: (_b = this.ismaster) === null || _b === void 0 ? void 0 : _b.maxBsonMessageSize + }); + this[kMessageStream].on('message', messageHandler(this)); + this[kStream] = stream; + stream.on('error', () => { + /* ignore errors, listen to `close` instead */ + }); + this[kMessageStream].on('error', error => this.handleIssue({ destroy: error })); + stream.on('close', () => this.handleIssue({ isClose: true })); + stream.on('timeout', () => this.handleIssue({ isTimeout: true, destroy: true })); + // hook the message stream up to the passed in stream + stream.pipe(this[kMessageStream]); + this[kMessageStream].pipe(stream); + } + get description() { + return this[kDescription]; + } + get ismaster() { + return this[kIsMaster]; + } + // the `connect` method stores the result of the handshake ismaster on the connection + set ismaster(response) { + this[kDescription].receiveResponse(response); + this[kDescription] = Object.freeze(this[kDescription]); + // TODO: remove this, and only use the `StreamDescription` in the future + this[kIsMaster] = response; + } + get serviceId() { + var _a; + return (_a = this.ismaster) === null || _a === void 0 ? void 0 : _a.serviceId; + } + get loadBalanced() { + return this.description.loadBalanced; + } + get generation() { + return this[kGeneration] || 0; + } + set generation(generation) { + this[kGeneration] = generation; + } + get idleTime() { + return utils_1.calculateDurationInMs(this[kLastUseTime]); + } + get clusterTime() { + return this[kClusterTime]; + } + get stream() { + return this[kStream]; + } + markAvailable() { + this[kLastUseTime] = utils_1.now(); + } + handleIssue(issue) { + if (this.closed) { + return; + } + if (issue.destroy) { + this[kStream].destroy(typeof issue.destroy === 'boolean' ? 
undefined : issue.destroy); + } + this.closed = true; + for (const [, op] of this[kQueue]) { + if (issue.isTimeout) { + op.cb(new error_1.MongoNetworkTimeoutError(`connection ${this.id} to ${this.address} timed out`, { + beforeHandshake: this.ismaster == null + })); + } + else if (issue.isClose) { + op.cb(new error_1.MongoNetworkError(`connection ${this.id} to ${this.address} closed`)); + } + else { + op.cb(typeof issue.destroy === 'boolean' ? undefined : issue.destroy); + } + } + this[kQueue].clear(); + this.emit(Connection.CLOSE); + } + destroy(options, callback) { + if (typeof options === 'function') { + callback = options; + options = { force: false }; + } + this.removeAllListeners(Connection.PINNED); + this.removeAllListeners(Connection.UNPINNED); + options = Object.assign({ force: false }, options); + if (this[kStream] == null || this.destroyed) { + this.destroyed = true; + if (typeof callback === 'function') { + callback(); + } + return; + } + if (options.force) { + this[kStream].destroy(); + this.destroyed = true; + if (typeof callback === 'function') { + callback(); + } + return; + } + this[kStream].end(() => { + this.destroyed = true; + if (typeof callback === 'function') { + callback(); + } + }); + } + /** @internal */ + command(ns, cmd, options, callback) { + if (!(ns instanceof utils_1.MongoDBNamespace)) { + // TODO(NODE-3483): Replace this with a MongoCommandError + throw new error_1.MongoRuntimeError('Must provide a MongoDBNamespace instance'); + } + const readPreference = shared_1.getReadPreference(cmd, options); + const shouldUseOpMsg = supportsOpMsg(this); + const session = options === null || options === void 0 ? void 0 : options.session; + let clusterTime = this.clusterTime; + let finalCmd = Object.assign({}, cmd); + if (this.serverApi) { + const { version, strict, deprecationErrors } = this.serverApi; + finalCmd.apiVersion = version; + if (strict != null) + finalCmd.apiStrict = strict; + if (deprecationErrors != null) + finalCmd.apiDeprecationErrors = deprecationErrors; + } + if (hasSessionSupport(this) && session) { + if (session.clusterTime && + clusterTime && + session.clusterTime.clusterTime.greaterThan(clusterTime.clusterTime)) { + clusterTime = session.clusterTime; + } + const err = sessions_1.applySession(session, finalCmd, options); + if (err) { + return callback(err); + } + } + // if we have a known cluster time, gossip it + if (clusterTime) { + finalCmd.$clusterTime = clusterTime; + } + if (shared_1.isSharded(this) && !shouldUseOpMsg && readPreference && readPreference.mode !== 'primary') { + finalCmd = { + $query: finalCmd, + $readPreference: readPreference.toJSON() + }; + } + const commandOptions = Object.assign({ + command: true, + numberToSkip: 0, + numberToReturn: -1, + checkKeys: false, + // This value is not overridable + slaveOk: readPreference.slaveOk() + }, options); + const cmdNs = `${ns.db}.$cmd`; + const message = shouldUseOpMsg + ? new commands_1.Msg(cmdNs, finalCmd, commandOptions) + : new commands_1.Query(cmdNs, finalCmd, commandOptions); + try { + write(this, message, commandOptions, callback); + } + catch (err) { + callback(err); + } + } + /** @internal */ + query(ns, cmd, options, callback) { + var _a; + const isExplain = cmd.$explain != null; + const readPreference = (_a = options.readPreference) !== null && _a !== void 0 ? 
_a : read_preference_1.ReadPreference.primary; + const batchSize = options.batchSize || 0; + const limit = options.limit; + const numberToSkip = options.skip || 0; + let numberToReturn = 0; + if (limit && + (limit < 0 || (limit !== 0 && limit < batchSize) || (limit > 0 && batchSize === 0))) { + numberToReturn = limit; + } + else { + numberToReturn = batchSize; + } + if (isExplain) { + // nToReturn must be 0 (match all) or negative (match N and close cursor) + // nToReturn > 0 will give explain results equivalent to limit(0) + numberToReturn = -Math.abs(limit || 0); + } + const queryOptions = { + numberToSkip, + numberToReturn, + pre32Limit: typeof limit === 'number' ? limit : undefined, + checkKeys: false, + slaveOk: readPreference.slaveOk() + }; + if (options.projection) { + queryOptions.returnFieldSelector = options.projection; + } + const query = new commands_1.Query(ns.toString(), cmd, queryOptions); + if (typeof options.tailable === 'boolean') { + query.tailable = options.tailable; + } + if (typeof options.oplogReplay === 'boolean') { + query.oplogReplay = options.oplogReplay; + } + if (typeof options.timeout === 'boolean') { + query.noCursorTimeout = !options.timeout; + } + else if (typeof options.noCursorTimeout === 'boolean') { + query.noCursorTimeout = options.noCursorTimeout; + } + if (typeof options.awaitData === 'boolean') { + query.awaitData = options.awaitData; + } + if (typeof options.partial === 'boolean') { + query.partial = options.partial; + } + write(this, query, { [kFullResult]: true, ...bson_1.pluckBSONSerializeOptions(options) }, (err, result) => { + if (err || !result) + return callback(err, result); + if (isExplain && result.documents && result.documents[0]) { + return callback(undefined, result.documents[0]); + } + callback(undefined, result); + }); + } + /** @internal */ + getMore(ns, cursorId, options, callback) { + const fullResult = !!options[kFullResult]; + const wireVersion = utils_1.maxWireVersion(this); + if (!cursorId) { + // TODO(NODE-3483): Replace this with a MongoCommandError + callback(new error_1.MongoRuntimeError('Invalid internal cursor state, no known cursor id')); + return; + } + if (wireVersion < 4) { + const getMoreOp = new commands_1.GetMore(ns.toString(), cursorId, { numberToReturn: options.batchSize }); + const queryOptions = shared_1.applyCommonQueryOptions({}, Object.assign(options, { ...bson_1.pluckBSONSerializeOptions(options) })); + queryOptions[kFullResult] = true; + queryOptions.command = true; + write(this, getMoreOp, queryOptions, (err, response) => { + if (fullResult) + return callback(err, response); + if (err) + return callback(err); + callback(undefined, { cursor: { id: response.cursorId, nextBatch: response.documents } }); + }); + return; + } + const getMoreCmd = { + getMore: cursorId, + collection: ns.collection + }; + if (typeof options.batchSize === 'number') { + getMoreCmd.batchSize = Math.abs(options.batchSize); + } + if (typeof options.maxAwaitTimeMS === 'number') { + getMoreCmd.maxTimeMS = options.maxAwaitTimeMS; + } + const commandOptions = Object.assign({ + returnFieldSelector: null, + documentsReturnedIn: 'nextBatch' + }, options); + this.command(ns, getMoreCmd, commandOptions, callback); + } + /** @internal */ + killCursors(ns, cursorIds, options, callback) { + if (!cursorIds || !Array.isArray(cursorIds)) { + // TODO(NODE-3483): Replace this with a MongoCommandError + throw new error_1.MongoRuntimeError(`Invalid list of cursor ids provided: ${cursorIds}`); + } + if (utils_1.maxWireVersion(this) < 4) { + try { + 
write(this, new commands_1.KillCursor(ns.toString(), cursorIds), { noResponse: true, ...options }, callback); + } + catch (err) { + callback(err); + } + return; + } + this.command(ns, { killCursors: ns.collection, cursors: cursorIds }, { [kFullResult]: true, ...options }, (err, response) => { + if (err || !response) + return callback(err); + if (response.cursorNotFound) { + return callback(new error_1.MongoNetworkError('cursor killed or timed out'), null); + } + if (!Array.isArray(response.documents) || response.documents.length === 0) { + return callback( + // TODO(NODE-3483) + new error_1.MongoRuntimeError(`invalid killCursors result returned for cursor id ${cursorIds[0]}`)); + } + callback(undefined, response.documents[0]); + }); + } +} +exports.Connection = Connection; +/** @event */ +Connection.COMMAND_STARTED = 'commandStarted'; +/** @event */ +Connection.COMMAND_SUCCEEDED = 'commandSucceeded'; +/** @event */ +Connection.COMMAND_FAILED = 'commandFailed'; +/** @event */ +Connection.CLUSTER_TIME_RECEIVED = 'clusterTimeReceived'; +/** @event */ +Connection.CLOSE = 'close'; +/** @event */ +Connection.MESSAGE = 'message'; +/** @event */ +Connection.PINNED = 'pinned'; +/** @event */ +Connection.UNPINNED = 'unpinned'; +/** @public */ +exports.APM_EVENTS = [ + Connection.COMMAND_STARTED, + Connection.COMMAND_SUCCEEDED, + Connection.COMMAND_FAILED +]; +/** @internal */ +class CryptoConnection extends Connection { + constructor(stream, options) { + super(stream, options); + this[kAutoEncrypter] = options.autoEncrypter; + } + /** @internal @override */ + command(ns, cmd, options, callback) { + const autoEncrypter = this[kAutoEncrypter]; + if (!autoEncrypter) { + return callback(new error_1.MongoMissingDependencyError('No AutoEncrypter available for encryption')); + } + const serverWireVersion = utils_1.maxWireVersion(this); + if (serverWireVersion === 0) { + // This means the initial handshake hasn't happened yet + return super.command(ns, cmd, options, callback); + } + if (serverWireVersion < 8) { + callback(new error_1.MongoCompatibilityError('Auto-encryption requires a minimum MongoDB version of 4.2')); + return; + } + autoEncrypter.encrypt(ns.toString(), cmd, options, (err, encrypted) => { + if (err || encrypted == null) { + callback(err, null); + return; + } + super.command(ns, encrypted, options, (err, response) => { + if (err || response == null) { + callback(err, response); + return; + } + autoEncrypter.decrypt(response, options, callback); + }); + }); + } +} +exports.CryptoConnection = CryptoConnection; +/** @internal */ +function hasSessionSupport(conn) { + const description = conn.description; + return description.logicalSessionTimeoutMinutes != null || !!description.loadBalanced; +} +exports.hasSessionSupport = hasSessionSupport; +function supportsOpMsg(conn) { + const description = conn.description; + if (description == null) { + return false; + } + return utils_1.maxWireVersion(conn) >= 6 && !description.__nodejs_mock_server__; +} +function messageHandler(conn) { + return function messageHandler(message) { + // always emit the message, in case we are streaming + conn.emit('message', message); + const operationDescription = conn[kQueue].get(message.responseTo); + if (!operationDescription) { + return; + } + const callback = operationDescription.cb; + // SERVER-45775: For exhaust responses we should be able to use the same requestId to + // track response, however the server currently synthetically produces remote requests + // making the `responseTo` change on each response + 
conn[kQueue].delete(message.responseTo); + if ('moreToCome' in message && message.moreToCome) { + // requeue the callback for next synthetic request + conn[kQueue].set(message.requestId, operationDescription); + } + else if (operationDescription.socketTimeoutOverride) { + conn[kStream].setTimeout(conn.socketTimeoutMS); + } + try { + // Pass in the entire description because it has BSON parsing options + message.parse(operationDescription); + } + catch (err) { + // If this error is generated by our own code, it will already have the correct class applied + // if it is not, then it is coming from a catastrophic data parse failure or the BSON library + // in either case, it should not be wrapped + callback(err); + return; + } + if (message.documents[0]) { + const document = message.documents[0]; + const session = operationDescription.session; + if (session) { + sessions_1.updateSessionFromResponse(session, document); + } + if (document.$clusterTime) { + conn[kClusterTime] = document.$clusterTime; + conn.emit(Connection.CLUSTER_TIME_RECEIVED, document.$clusterTime); + } + if (operationDescription.command) { + if (document.writeConcernError) { + callback(new error_1.MongoWriteConcernError(document.writeConcernError, document)); + return; + } + if (document.ok === 0 || document.$err || document.errmsg || document.code) { + callback(new error_1.MongoServerError(document)); + return; + } + } + else { + // Pre 3.2 support + if (document.ok === 0 || document.$err || document.errmsg) { + callback(new error_1.MongoServerError(document)); + return; + } + } + } + callback(undefined, operationDescription.fullResult ? message : message.documents[0]); + }; +} +function streamIdentifier(stream) { + if (typeof stream.address === 'function') { + return `${stream.remoteAddress}:${stream.remotePort}`; + } + return utils_1.uuidV4().toString('hex'); +} +function write(conn, command, options, callback) { + if (typeof options === 'function') { + callback = options; + } + options = options !== null && options !== void 0 ? options : {}; + const operationDescription = { + requestId: command.requestId, + cb: callback, + session: options.session, + fullResult: !!options[kFullResult], + noResponse: typeof options.noResponse === 'boolean' ? options.noResponse : false, + documentsReturnedIn: options.documentsReturnedIn, + command: !!options.command, + // for BSON parsing + promoteLongs: typeof options.promoteLongs === 'boolean' ? options.promoteLongs : true, + promoteValues: typeof options.promoteValues === 'boolean' ? options.promoteValues : true, + promoteBuffers: typeof options.promoteBuffers === 'boolean' ? options.promoteBuffers : false, + bsonRegExp: typeof options.bsonRegExp === 'boolean' ? options.bsonRegExp : false, + raw: typeof options.raw === 'boolean' ? 
options.raw : false, + started: 0 + }; + if (conn[kDescription] && conn[kDescription].compressor) { + operationDescription.agreedCompressor = conn[kDescription].compressor; + if (conn[kDescription].zlibCompressionLevel) { + operationDescription.zlibCompressionLevel = conn[kDescription].zlibCompressionLevel; + } + } + if (typeof options.socketTimeoutMS === 'number') { + operationDescription.socketTimeoutOverride = true; + conn[kStream].setTimeout(options.socketTimeoutMS); + } + // if command monitoring is enabled we need to modify the callback here + if (conn.monitorCommands) { + conn.emit(Connection.COMMAND_STARTED, new command_monitoring_events_1.CommandStartedEvent(conn, command)); + operationDescription.started = utils_1.now(); + operationDescription.cb = (err, reply) => { + if (err) { + conn.emit(Connection.COMMAND_FAILED, new command_monitoring_events_1.CommandFailedEvent(conn, command, err, operationDescription.started)); + } + else { + if (reply && (reply.ok === 0 || reply.$err)) { + conn.emit(Connection.COMMAND_FAILED, new command_monitoring_events_1.CommandFailedEvent(conn, command, reply, operationDescription.started)); + } + else { + conn.emit(Connection.COMMAND_SUCCEEDED, new command_monitoring_events_1.CommandSucceededEvent(conn, command, reply, operationDescription.started)); + } + } + if (typeof callback === 'function') { + callback(err, reply); + } + }; + } + if (!operationDescription.noResponse) { + conn[kQueue].set(operationDescription.requestId, operationDescription); + } + try { + conn[kMessageStream].writeCommand(command, operationDescription); + } + catch (e) { + if (!operationDescription.noResponse) { + conn[kQueue].delete(operationDescription.requestId); + operationDescription.cb(e); + return; + } + } + if (operationDescription.noResponse) { + operationDescription.cb(); + } +} +//# sourceMappingURL=connection.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connection.js.map b/node_modules/mongodb/lib/cmap/connection.js.map new file mode 100644 index 000000000..a8a214a5b --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connection.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"connection.js","sourceRoot":"","sources":["../../src/cmap/connection.ts"],"names":[],"mappings":";;;AAAA,qDAAuE;AACvE,6DAAmF;AACnF,2EAIqC;AACrC,0CAAqF;AACrF,oCASkB;AAClB,oCAQkB;AAClB,yCASoB;AACpB,kCAAoG;AAIpG,mDAA+F;AAC/F,wDAAwE;AAGxE,gDAAsE;AAEtE,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,MAAM,GAAG,MAAM,CAAC,OAAO,CAAC,CAAC;AAC/B,gBAAgB;AAChB,MAAM,cAAc,GAAG,MAAM,CAAC,eAAe,CAAC,CAAC;AAC/C,gBAAgB;AAChB,MAAM,WAAW,GAAG,MAAM,CAAC,YAAY,CAAC,CAAC;AACzC,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,SAAS,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC;AACrC,gBAAgB;AAChB,MAAM,cAAc,GAAG,MAAM,CAAC,eAAe,CAAC,CAAC;AAC/C,gBAAgB;AAChB,MAAM,WAAW,GAAG,MAAM,CAAC,YAAY,CAAC,CAAC;AA6FzC,gBAAgB;AAChB,MAAa,UAAW,SAAQ,+BAAmC;IA4CjE,YAAY,MAAc,EAAE,OAA0B;;QACpD,KAAK,EAAE,CAAC;QACR,IAAI,CAAC,EAAE,GAAG,OAAO,CAAC,EAAE,CAAC;QACrB,IAAI,CAAC,OAAO,GAAG,gBAAgB,CAAC,MAAM,CAAC,CAAC;QACxC,IAAI,CAAC,eAAe,GAAG,MAAA,OAAO,CAAC,eAAe,mCAAI,CAAC,CAAC;QACpD,IAAI,CAAC,eAAe,GAAG,OAAO,CAAC,eAAe,CAAC;QAC/C,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;QACnC,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,SAAS,GAAG,KAAK,CAAC;QAEvB,IAAI,CAAC,YAAY,CAAC,GAAG,IAAI,sCAAiB,CAAC,IAAI,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;QAClE,IAAI,CAAC,WAAW,CAAC,GAAG,OAAO,CAAC,UAAU,CAAC;QACvC,IAAI,CAAC,YAAY,CAAC,GAAG,WAAG,EAAE,CAAC;QAE3B,2CAA2C;QAC3C,IAAI,CAAC,MAAM,CAAC,GAAG,IAAI,GAAG,EAAE,CAAC;QACzB,IAAI,CAAC,cAAc,CAAC,GAAG,IAAI,8BAAa,CAAC;YACvC,GAAG,OAAO;YACV,kBAAkB,EAAE,MAAA,IAAI,CAAC,QAAQ,0CAAE,kBAAkB;SACtD,CAAC,CAAC;QACH,IAAI,CAAC,cAAc,CAAC,CAAC,EAAE,CAAC,SAAS,EAAE,cAAc,CAAC,IAAI,CAAC,CAAC,CAAC;QACzD,IAAI,CAAC,OAAO,CAAC,GAAG,MAAM,CAAC;QACvB,MAAM,CAAC,EAAE,CAAC,OAAO,EAAE,GAAG,EAAE;YACtB,8CAA8C;QAChD,CAAC,CAAC,CAAC;QAEH,IAAI,CAAC,cAAc,CAAC,CAAC,EAAE,CAAC,OAAO,EAAE,KAAK,CAAC,EAAE,CAAC,IAAI,CAAC,WAAW,CAAC,EAAE,OAAO,EAAE,KAAK,EAAE,CAAC,CAAC,CAAC;QAChF,MAAM,CAAC,EAAE,CAAC,OAAO,EAAE,GAAG,EAAE,CAAC,IAAI,CAAC,WAAW,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC;QAC9D,MAAM,CAAC,EAAE,CAAC,SAAS,EAAE,GAAG,EAAE,CAAC,IAAI,CAAC,WAAW,CAAC,EAAE,SAAS,EAAE,IAAI,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC,CAAC,CAAC;QAEjF,qDAAqD;QACrD,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC,CAAC;QAClC,IAAI,CAAC,cAAc,CAAC,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;IACpC,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC;IAC5B,CAAC;IAED,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,SAAS,CAAC,CAAC;IACzB,CAAC;IAED,qFAAqF;IACrF,IAAI,QAAQ,CAAC,QAAkB;QAC7B,IAAI,CAAC,YAAY,CAAC,CAAC,eAAe,CAAC,QAAQ,CAAC,CAAC;QAC7C,IAAI,CAAC,YAAY,CAAC,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC;QAEvD,wEAAwE;QACxE,IAAI,CAAC,SAAS,CAAC,GAAG,QAAQ,CAAC;IAC7B,CAAC;IAED,IAAI,SAAS;;QACX,OAAO,MAAA,IAAI,CAAC,QAAQ,0CAAE,SAAS,CAAC;IAClC,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,WAAW,CAAC,YAAY,CAAC;IACvC,CAAC;IAED,IAAI,UAAU;QACZ,OAAO,IAAI,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;IAChC,CAAC;IAED,IAAI,UAAU,CAAC,UAAkB;QAC/B,IAAI,CAAC,WAAW,CAAC,GAAG,UAAU,CAAC;IACjC,CAAC;IAED,IAAI,QAAQ;QACV,OAAO,6BAAqB,CAAC,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC;IACnD,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC;IAC5B,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,OAAO,CAAC,CAAC;IACvB,CAAC;IAED,aAAa;QACX,IAAI,CAAC,YAAY,CAAC,GAAG,WAAG,EAAE,CAAC;IAC7B,CAAC;IAED,WAAW,CAAC,KAA4E;QACtF,IAAI,IAAI,CAAC,MAAM,EAAE;YACf,OAAO;SACR;QAED,IAAI,KAAK,CAAC,OAAO,EAAE;YACjB,IAAI,CAAC,OAAO,CAAC,CAAC,OAAO,CAAC,OAAO,KAAK,CAAC,OAAO,KAAK,SAAS,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;SACvF;QAED,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;QAEnB,KAAK,MAAM,C
AAC,EAAE,EAAE,CAAC,IAAI,IAAI,CAAC,MAAM,CAAC,EAAE;YACjC,IAAI,KAAK,CAAC,SAAS,EAAE;gBACnB,EAAE,CAAC,EAAE,CACH,IAAI,gCAAwB,CAAC,cAAc,IAAI,CAAC,EAAE,OAAO,IAAI,CAAC,OAAO,YAAY,EAAE;oBACjF,eAAe,EAAE,IAAI,CAAC,QAAQ,IAAI,IAAI;iBACvC,CAAC,CACH,CAAC;aACH;iBAAM,IAAI,KAAK,CAAC,OAAO,EAAE;gBACxB,EAAE,CAAC,EAAE,CAAC,IAAI,yBAAiB,CAAC,cAAc,IAAI,CAAC,EAAE,OAAO,IAAI,CAAC,OAAO,SAAS,CAAC,CAAC,CAAC;aACjF;iBAAM;gBACL,EAAE,CAAC,EAAE,CAAC,OAAO,KAAK,CAAC,OAAO,KAAK,SAAS,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,CAAC;aACvE;SACF;QAED,IAAI,CAAC,MAAM,CAAC,CAAC,KAAK,EAAE,CAAC;QACrB,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,KAAK,CAAC,CAAC;IAC9B,CAAC;IAMD,OAAO,CAAC,OAAmC,EAAE,QAAmB;QAC9D,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,QAAQ,GAAG,OAAO,CAAC;YACnB,OAAO,GAAG,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC;SAC5B;QAED,IAAI,CAAC,kBAAkB,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC;QAC3C,IAAI,CAAC,kBAAkB,CAAC,UAAU,CAAC,QAAQ,CAAC,CAAC;QAE7C,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,KAAK,EAAE,KAAK,EAAE,EAAE,OAAO,CAAC,CAAC;QACnD,IAAI,IAAI,CAAC,OAAO,CAAC,IAAI,IAAI,IAAI,IAAI,CAAC,SAAS,EAAE;YAC3C,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;YACtB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,EAAE,CAAC;aACZ;YAED,OAAO;SACR;QAED,IAAI,OAAO,CAAC,KAAK,EAAE;YACjB,IAAI,CAAC,OAAO,CAAC,CAAC,OAAO,EAAE,CAAC;YACxB,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;YACtB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,EAAE,CAAC;aACZ;YAED,OAAO;SACR;QAED,IAAI,CAAC,OAAO,CAAC,CAAC,GAAG,CAAC,GAAG,EAAE;YACrB,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;YACtB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,EAAE,CAAC;aACZ;QACH,CAAC,CAAC,CAAC;IACL,CAAC;IAED,gBAAgB;IAChB,OAAO,CACL,EAAoB,EACpB,GAAa,EACb,OAAmC,EACnC,QAAkB;QAElB,IAAI,CAAC,CAAC,EAAE,YAAY,wBAAgB,CAAC,EAAE;YACrC,yDAAyD;YACzD,MAAM,IAAI,yBAAiB,CAAC,0CAA0C,CAAC,CAAC;SACzE;QAED,MAAM,cAAc,GAAG,0BAAiB,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;QACvD,MAAM,cAAc,GAAG,aAAa,CAAC,IAAI,CAAC,CAAC;QAC3C,MAAM,OAAO,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,OAAO,CAAC;QAEjC,IAAI,WAAW,GAAG,IAAI,CAAC,WAAW,CAAC;QACnC,IAAI,QAAQ,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,GAAG,CAAC,CAAC;QAEtC,IAAI,IAAI,CAAC,SAAS,EAAE;YAClB,MAAM,EAAE,OAAO,EAAE,MAAM,EAAE,iBAAiB,EAAE,GAAG,IAAI,CAAC,SAAS,CAAC;YAC9D,QAAQ,CAAC,UAAU,GAAG,OAAO,CAAC;YAC9B,IAAI,MAAM,IAAI,IAAI;gBAAE,QAAQ,CAAC,SAAS,GAAG,MAAM,CAAC;YAChD,IAAI,iBAAiB,IAAI,IAAI;gBAAE,QAAQ,CAAC,oBAAoB,GAAG,iBAAiB,CAAC;SAClF;QAED,IAAI,iBAAiB,CAAC,IAAI,CAAC,IAAI,OAAO,EAAE;YACtC,IACE,OAAO,CAAC,WAAW;gBACnB,WAAW;gBACX,OAAO,CAAC,WAAW,CAAC,WAAW,CAAC,WAAW,CAAC,WAAW,CAAC,WAAW,CAAC,EACpE;gBACA,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC;aACnC;YAED,MAAM,GAAG,GAAG,uBAAY,CAAC,OAAO,EAAE,QAAQ,EAAE,OAAyB,CAAC,CAAC;YACvE,IAAI,GAAG,EAAE;gBACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;SACF;QAED,6CAA6C;QAC7C,IAAI,WAAW,EAAE;YACf,QAAQ,CAAC,YAAY,GAAG,WAAW,CAAC;SACrC;QAED,IAAI,kBAAS,CAAC,IAAI,CAAC,IAAI,CAAC,cAAc,IAAI,cAAc,IAAI,cAAc,CAAC,IAAI,KAAK,SAAS,EAAE;YAC7F,QAAQ,GAAG;gBACT,MAAM,EAAE,QAAQ;gBAChB,eAAe,EAAE,cAAc,CAAC,MAAM,EAAE;aACzC,CAAC;SACH;QAED,MAAM,cAAc,GAAa,MAAM,CAAC,MAAM,CAC5C;YACE,OAAO,EAAE,IAAI;YACb,YAAY,EAAE,CAAC;YACf,cAAc,EAAE,CAAC,CAAC;YAClB,SAAS,EAAE,KAAK;YAChB,gCAAgC;YAChC,OAAO,EAAE,cAAc,CAAC,OAAO,EAAE;SAClC,EACD,OAAO,CACR,CAAC;QAEF,MAAM,KAAK,GAAG,GAAG,EAAE,CAAC,EAAE,OAAO,CAAC;QAC9B,MAAM,OAAO,GAAG,cAAc;YAC5B,CAAC,CAAC,IAAI,cAAG,CAAC,KAAK,EAAE,QAAQ,EAAE,cAAc,CAAC;YAC1C,CAAC,CAAC,IAAI,gBAAK,CAAC,KAAK,EAAE,QAAQ,EAAE,cAAc,CAAC,CAAC;QAE/C,IAAI;YACF,KAAK,CAAC,IAAI,EAAE,OAAO,EAAE,cAAc,EAAE,QAAQ,CAAC,CAAC;SAChD;QAAC,OAAO,GAAG,EAAE;YACZ,QAAQ,CAAC,GAAG,CAAC,CAAC;SACf;IACH,CAAC;IAED,gBAAgB;IAChB,KAAK,CAAC,EAAoB,EAAE,GAAa,EAAE,OAAqB,EAAE,QAAkB;;QAClF,MAAM,SAAS,GAAG,GAAG,CAAC,QAAQ,IAAI,IAAI,CAAC;QACvC,MAAM,cAAc,GAAG,MAAA,OAAO,CAAC,cAAc,mCAAI,gCAAc,CAAC,OAAO,CAAC
;QACxE,MAAM,SAAS,GAAG,OAAO,CAAC,SAAS,IAAI,CAAC,CAAC;QACzC,MAAM,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC;QAC5B,MAAM,YAAY,GAAG,OAAO,CAAC,IAAI,IAAI,CAAC,CAAC;QACvC,IAAI,cAAc,GAAG,CAAC,CAAC;QACvB,IACE,KAAK;YACL,CAAC,KAAK,GAAG,CAAC,IAAI,CAAC,KAAK,KAAK,CAAC,IAAI,KAAK,GAAG,SAAS,CAAC,IAAI,CAAC,KAAK,GAAG,CAAC,IAAI,SAAS,KAAK,CAAC,CAAC,CAAC,EACnF;YACA,cAAc,GAAG,KAAK,CAAC;SACxB;aAAM;YACL,cAAc,GAAG,SAAS,CAAC;SAC5B;QAED,IAAI,SAAS,EAAE;YACb,yEAAyE;YACzE,iEAAiE;YACjE,cAAc,GAAG,CAAC,IAAI,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,CAAC,CAAC;SACxC;QAED,MAAM,YAAY,GAAmB;YACnC,YAAY;YACZ,cAAc;YACd,UAAU,EAAE,OAAO,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,SAAS;YACzD,SAAS,EAAE,KAAK;YAChB,OAAO,EAAE,cAAc,CAAC,OAAO,EAAE;SAClC,CAAC;QAEF,IAAI,OAAO,CAAC,UAAU,EAAE;YACtB,YAAY,CAAC,mBAAmB,GAAG,OAAO,CAAC,UAAU,CAAC;SACvD;QAED,MAAM,KAAK,GAAG,IAAI,gBAAK,CAAC,EAAE,CAAC,QAAQ,EAAE,EAAE,GAAG,EAAE,YAAY,CAAC,CAAC;QAC1D,IAAI,OAAO,OAAO,CAAC,QAAQ,KAAK,SAAS,EAAE;YACzC,KAAK,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC;SACnC;QAED,IAAI,OAAO,OAAO,CAAC,WAAW,KAAK,SAAS,EAAE;YAC5C,KAAK,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC;SACzC;QAED,IAAI,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,EAAE;YACxC,KAAK,CAAC,eAAe,GAAG,CAAC,OAAO,CAAC,OAAO,CAAC;SAC1C;aAAM,IAAI,OAAO,OAAO,CAAC,eAAe,KAAK,SAAS,EAAE;YACvD,KAAK,CAAC,eAAe,GAAG,OAAO,CAAC,eAAe,CAAC;SACjD;QAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,EAAE;YAC1C,KAAK,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SACrC;QAED,IAAI,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,EAAE;YACxC,KAAK,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;SACjC;QAED,KAAK,CACH,IAAI,EACJ,KAAK,EACL,EAAE,CAAC,WAAW,CAAC,EAAE,IAAI,EAAE,GAAG,gCAAyB,CAAC,OAAO,CAAC,EAAE,EAC9D,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACd,IAAI,GAAG,IAAI,CAAC,MAAM;gBAAE,OAAO,QAAQ,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;YACjD,IAAI,SAAS,IAAI,MAAM,CAAC,SAAS,IAAI,MAAM,CAAC,SAAS,CAAC,CAAC,CAAC,EAAE;gBACxD,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC;aACjD;YAED,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;QAC9B,CAAC,CACF,CAAC;IACJ,CAAC;IAED,gBAAgB;IAChB,OAAO,CACL,EAAoB,EACpB,QAAc,EACd,OAAuB,EACvB,QAA4B;QAE5B,MAAM,UAAU,GAAG,CAAC,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC;QAC1C,MAAM,WAAW,GAAG,sBAAc,CAAC,IAAI,CAAC,CAAC;QACzC,IAAI,CAAC,QAAQ,EAAE;YACb,yDAAyD;YACzD,QAAQ,CAAC,IAAI,yBAAiB,CAAC,mDAAmD,CAAC,CAAC,CAAC;YACrF,OAAO;SACR;QAED,IAAI,WAAW,GAAG,CAAC,EAAE;YACnB,MAAM,SAAS,GAAG,IAAI,kBAAO,CAAC,EAAE,CAAC,QAAQ,EAAE,EAAE,QAAQ,EAAE,EAAE,cAAc,EAAE,OAAO,CAAC,SAAS,EAAE,CAAC,CAAC;YAC9F,MAAM,YAAY,GAAG,gCAAuB,CAC1C,EAAE,EACF,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,EAAE,GAAG,gCAAyB,CAAC,OAAO,CAAC,EAAE,CAAC,CAClE,CAAC;YAEF,YAAY,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC;YACjC,YAAY,CAAC,OAAO,GAAG,IAAI,CAAC;YAC5B,KAAK,CAAC,IAAI,EAAE,SAAS,EAAE,YAAY,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;gBACrD,IAAI,UAAU;oBAAE,OAAO,QAAQ,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;gBAC/C,IAAI,GAAG;oBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;gBAC9B,QAAQ,CAAC,SAAS,EAAE,EAAE,MAAM,EAAE,EAAE,EAAE,EAAE,QAAQ,CAAC,QAAQ,EAAE,SAAS,EAAE,QAAQ,CAAC,SAAS,EAAE,EAAE,CAAC,CAAC;YAC5F,CAAC,CAAC,CAAC;YAEH,OAAO;SACR;QAED,MAAM,UAAU,GAAa;YAC3B,OAAO,EAAE,QAAQ;YACjB,UAAU,EAAE,EAAE,CAAC,UAAU;SAC1B,CAAC;QAEF,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YACzC,UAAU,CAAC,SAAS,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC;SACpD;QAED,IAAI,OAAO,OAAO,CAAC,cAAc,KAAK,QAAQ,EAAE;YAC9C,UAAU,CAAC,SAAS,GAAG,OAAO,CAAC,cAAc,CAAC;SAC/C;QAED,MAAM,cAAc,GAAG,MAAM,CAAC,MAAM,CAClC;YACE,mBAAmB,EAAE,IAAI;YACzB,mBAAmB,EAAE,WAAW;SACjC,EACD,OAAO,CACR,CAAC;QAEF,IAAI,CAAC,OAAO,CAAC,EAAE,EAAE,UAAU,EAAE,cAAc,EAAE,QAAQ,CAAC,CAAC;IACzD,CAAC;IAED,gBAAgB;IAChB,WAAW,CACT,EAAoB,EACpB,SAAiB,EACjB,OAAuB,EACvB,QAAkB;QAElB,IAAI,CAAC,SAAS,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE;YAC3C,yDAAyD;YACzD,MAAM,IAAI,yBAAiB,CAAC,wCAAwC,SAA
S,EAAE,CAAC,CAAC;SAClF;QAED,IAAI,sBAAc,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE;YAC5B,IAAI;gBACF,KAAK,CACH,IAAI,EACJ,IAAI,qBAAU,CAAC,EAAE,CAAC,QAAQ,EAAE,EAAE,SAAS,CAAC,EACxC,EAAE,UAAU,EAAE,IAAI,EAAE,GAAG,OAAO,EAAE,EAChC,QAAQ,CACT,CAAC;aACH;YAAC,OAAO,GAAG,EAAE;gBACZ,QAAQ,CAAC,GAAG,CAAC,CAAC;aACf;YAED,OAAO;SACR;QAED,IAAI,CAAC,OAAO,CACV,EAAE,EACF,EAAE,WAAW,EAAE,EAAE,CAAC,UAAU,EAAE,OAAO,EAAE,SAAS,EAAE,EAClD,EAAE,CAAC,WAAW,CAAC,EAAE,IAAI,EAAE,GAAG,OAAO,EAAE,EACnC,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAChB,IAAI,GAAG,IAAI,CAAC,QAAQ;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC3C,IAAI,QAAQ,CAAC,cAAc,EAAE;gBAC3B,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,4BAA4B,CAAC,EAAE,IAAI,CAAC,CAAC;aAC5E;YAED,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,SAAS,CAAC,IAAI,QAAQ,CAAC,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;gBACzE,OAAO,QAAQ;gBACb,kBAAkB;gBAClB,IAAI,yBAAiB,CACnB,qDAAqD,SAAS,CAAC,CAAC,CAAC,EAAE,CACpE,CACF,CAAC;aACH;YAED,QAAQ,CAAC,SAAS,EAAE,QAAQ,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC;QAC7C,CAAC,CACF,CAAC;IACJ,CAAC;;AAtcH,gCAucC;AA5aC,aAAa;AACG,0BAAe,GAAG,gBAAyB,CAAC;AAC5D,aAAa;AACG,4BAAiB,GAAG,kBAA2B,CAAC;AAChE,aAAa;AACG,yBAAc,GAAG,eAAwB,CAAC;AAC1D,aAAa;AACG,gCAAqB,GAAG,qBAA8B,CAAC;AACvE,aAAa;AACG,gBAAK,GAAG,OAAgB,CAAC;AACzC,aAAa;AACG,kBAAO,GAAG,SAAkB,CAAC;AAC7C,aAAa;AACG,iBAAM,GAAG,QAAiB,CAAC;AAC3C,aAAa;AACG,mBAAQ,GAAG,UAAmB,CAAC;AA+ZjD,cAAc;AACD,QAAA,UAAU,GAAG;IACxB,UAAU,CAAC,eAAe;IAC1B,UAAU,CAAC,iBAAiB;IAC5B,UAAU,CAAC,cAAc;CAC1B,CAAC;AAEF,gBAAgB;AAChB,MAAa,gBAAiB,SAAQ,UAAU;IAI9C,YAAY,MAAc,EAAE,OAA0B;QACpD,KAAK,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;QACvB,IAAI,CAAC,cAAc,CAAC,GAAG,OAAO,CAAC,aAAa,CAAC;IAC/C,CAAC;IAED,0BAA0B;IAC1B,OAAO,CAAC,EAAoB,EAAE,GAAa,EAAE,OAAuB,EAAE,QAAkB;QACtF,MAAM,aAAa,GAAG,IAAI,CAAC,cAAc,CAAC,CAAC;QAC3C,IAAI,CAAC,aAAa,EAAE;YAClB,OAAO,QAAQ,CAAC,IAAI,mCAA2B,CAAC,2CAA2C,CAAC,CAAC,CAAC;SAC/F;QAED,MAAM,iBAAiB,GAAG,sBAAc,CAAC,IAAI,CAAC,CAAC;QAC/C,IAAI,iBAAiB,KAAK,CAAC,EAAE;YAC3B,uDAAuD;YACvD,OAAO,KAAK,CAAC,OAAO,CAAC,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;SAClD;QAED,IAAI,iBAAiB,GAAG,CAAC,EAAE;YACzB,QAAQ,CACN,IAAI,+BAAuB,CAAC,2DAA2D,CAAC,CACzF,CAAC;YACF,OAAO;SACR;QAED,aAAa,CAAC,OAAO,CAAC,EAAE,CAAC,QAAQ,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,SAAS,EAAE,EAAE;YACpE,IAAI,GAAG,IAAI,SAAS,IAAI,IAAI,EAAE;gBAC5B,QAAQ,CAAC,GAAG,EAAE,IAAI,CAAC,CAAC;gBACpB,OAAO;aACR;YAED,KAAK,CAAC,OAAO,CAAC,EAAE,EAAE,SAAS,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;gBACtD,IAAI,GAAG,IAAI,QAAQ,IAAI,IAAI,EAAE;oBAC3B,QAAQ,CAAC,GAAG,EAAE,QAAQ,CAAC,CAAC;oBACxB,OAAO;iBACR;gBAED,aAAa,CAAC,OAAO,CAAC,QAAQ,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;YACrD,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA7CD,4CA6CC;AAED,gBAAgB;AAChB,SAAgB,iBAAiB,CAAC,IAAgB;IAChD,MAAM,WAAW,GAAG,IAAI,CAAC,WAAW,CAAC;IACrC,OAAO,WAAW,CAAC,4BAA4B,IAAI,IAAI,IAAI,CAAC,CAAC,WAAW,CAAC,YAAY,CAAC;AACxF,CAAC;AAHD,8CAGC;AAED,SAAS,aAAa,CAAC,IAAgB;IACrC,MAAM,WAAW,GAAG,IAAI,CAAC,WAAW,CAAC;IACrC,IAAI,WAAW,IAAI,IAAI,EAAE;QACvB,OAAO,KAAK,CAAC;KACd;IAED,OAAO,sBAAc,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,sBAAsB,CAAC;AAC1E,CAAC;AAED,SAAS,cAAc,CAAC,IAAgB;IACtC,OAAO,SAAS,cAAc,CAAC,OAA0B;QACvD,oDAAoD;QACpD,IAAI,CAAC,IAAI,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC9B,MAAM,oBAAoB,GAAG,IAAI,CAAC,MAAM,CAAC,CAAC,GAAG,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC;QAClE,IAAI,CAAC,oBAAoB,EAAE;YACzB,OAAO;SACR;QAED,MAAM,QAAQ,GAAG,oBAAoB,CAAC,EAAE,CAAC;QAEzC,qFAAqF;QACrF,sFAAsF;QACtF,kDAAkD;QAClD,IAAI,CAAC,MAAM,CAAC,CAAC,MAAM,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC;QACxC,IAAI,YAAY,IAAI,OAAO,IAAI,OAAO,CAAC,UAAU,EAAE;YACjD,kDAAkD;YAClD,IAAI,CAAC,MAAM,CAAC,CAAC,GAAG,CAAC,OAAO,CAAC,SAAS,EAAE,oBAAoB,CAAC,CAAC;SAC3D;aAAM,IAAI,oBAAoB,CAAC,qBAAqB,EAAE;YACrD,IAAI,CAAC,OAAO,CAAC,CAA
C,UAAU,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;SAChD;QAED,IAAI;YACF,qEAAqE;YACrE,OAAO,CAAC,KAAK,CAAC,oBAAoB,CAAC,CAAC;SACrC;QAAC,OAAO,GAAG,EAAE;YACZ,6FAA6F;YAC7F,6FAA6F;YAC7F,2CAA2C;YAC3C,QAAQ,CAAC,GAAG,CAAC,CAAC;YACd,OAAO;SACR;QAED,IAAI,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,EAAE;YACxB,MAAM,QAAQ,GAAa,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC;YAChD,MAAM,OAAO,GAAG,oBAAoB,CAAC,OAAO,CAAC;YAC7C,IAAI,OAAO,EAAE;gBACX,oCAAyB,CAAC,OAAO,EAAE,QAAQ,CAAC,CAAC;aAC9C;YAED,IAAI,QAAQ,CAAC,YAAY,EAAE;gBACzB,IAAI,CAAC,YAAY,CAAC,GAAG,QAAQ,CAAC,YAAY,CAAC;gBAC3C,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,qBAAqB,EAAE,QAAQ,CAAC,YAAY,CAAC,CAAC;aACpE;YAED,IAAI,oBAAoB,CAAC,OAAO,EAAE;gBAChC,IAAI,QAAQ,CAAC,iBAAiB,EAAE;oBAC9B,QAAQ,CAAC,IAAI,8BAAsB,CAAC,QAAQ,CAAC,iBAAiB,EAAE,QAAQ,CAAC,CAAC,CAAC;oBAC3E,OAAO;iBACR;gBAED,IAAI,QAAQ,CAAC,EAAE,KAAK,CAAC,IAAI,QAAQ,CAAC,IAAI,IAAI,QAAQ,CAAC,MAAM,IAAI,QAAQ,CAAC,IAAI,EAAE;oBAC1E,QAAQ,CAAC,IAAI,wBAAgB,CAAC,QAAQ,CAAC,CAAC,CAAC;oBACzC,OAAO;iBACR;aACF;iBAAM;gBACL,kBAAkB;gBAClB,IAAI,QAAQ,CAAC,EAAE,KAAK,CAAC,IAAI,QAAQ,CAAC,IAAI,IAAI,QAAQ,CAAC,MAAM,EAAE;oBACzD,QAAQ,CAAC,IAAI,wBAAgB,CAAC,QAAQ,CAAC,CAAC,CAAC;oBACzC,OAAO;iBACR;aACF;SACF;QAED,QAAQ,CAAC,SAAS,EAAE,oBAAoB,CAAC,UAAU,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC;IACxF,CAAC,CAAC;AACJ,CAAC;AAED,SAAS,gBAAgB,CAAC,MAAc;IACtC,IAAI,OAAO,MAAM,CAAC,OAAO,KAAK,UAAU,EAAE;QACxC,OAAO,GAAG,MAAM,CAAC,aAAa,IAAI,MAAM,CAAC,UAAU,EAAE,CAAC;KACvD;IAED,OAAO,cAAM,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC;AAClC,CAAC;AAED,SAAS,KAAK,CACZ,IAAgB,EAChB,OAAiC,EACjC,OAAuB,EACvB,QAAkB;IAElB,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;QACjC,QAAQ,GAAG,OAAO,CAAC;KACpB;IAED,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IACxB,MAAM,oBAAoB,GAAyB;QACjD,SAAS,EAAE,OAAO,CAAC,SAAS;QAC5B,EAAE,EAAE,QAAQ;QACZ,OAAO,EAAE,OAAO,CAAC,OAAO;QACxB,UAAU,EAAE,CAAC,CAAC,OAAO,CAAC,WAAW,CAAC;QAClC,UAAU,EAAE,OAAO,OAAO,CAAC,UAAU,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK;QAChF,mBAAmB,EAAE,OAAO,CAAC,mBAAmB;QAChD,OAAO,EAAE,CAAC,CAAC,OAAO,CAAC,OAAO;QAE1B,mBAAmB;QACnB,YAAY,EAAE,OAAO,OAAO,CAAC,YAAY,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC,CAAC,IAAI;QACrF,aAAa,EAAE,OAAO,OAAO,CAAC,aAAa,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,CAAC,CAAC,CAAC,IAAI;QACxF,cAAc,EAAE,OAAO,OAAO,CAAC,cAAc,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,KAAK;QAC5F,UAAU,EAAE,OAAO,OAAO,CAAC,UAAU,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK;QAChF,GAAG,EAAE,OAAO,OAAO,CAAC,GAAG,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK;QAC3D,OAAO,EAAE,CAAC;KACX,CAAC;IAEF,IAAI,IAAI,CAAC,YAAY,CAAC,IAAI,IAAI,CAAC,YAAY,CAAC,CAAC,UAAU,EAAE;QACvD,oBAAoB,CAAC,gBAAgB,GAAG,IAAI,CAAC,YAAY,CAAC,CAAC,UAAU,CAAC;QAEtE,IAAI,IAAI,CAAC,YAAY,CAAC,CAAC,oBAAoB,EAAE;YAC3C,oBAAoB,CAAC,oBAAoB,GAAG,IAAI,CAAC,YAAY,CAAC,CAAC,oBAAoB,CAAC;SACrF;KACF;IAED,IAAI,OAAO,OAAO,CAAC,eAAe,KAAK,QAAQ,EAAE;QAC/C,oBAAoB,CAAC,qBAAqB,GAAG,IAAI,CAAC;QAClD,IAAI,CAAC,OAAO,CAAC,CAAC,UAAU,CAAC,OAAO,CAAC,eAAe,CAAC,CAAC;KACnD;IAED,uEAAuE;IACvE,IAAI,IAAI,CAAC,eAAe,EAAE;QACxB,IAAI,CAAC,IAAI,CAAC,UAAU,CAAC,eAAe,EAAE,IAAI,+CAAmB,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;QAE9E,oBAAoB,CAAC,OAAO,GAAG,WAAG,EAAE,CAAC;QACrC,oBAAoB,CAAC,EAAE,GAAG,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;YACvC,IAAI,GAAG,EAAE;gBACP,IAAI,CAAC,IAAI,CACP,UAAU,CAAC,cAAc,EACzB,IAAI,8CAAkB,CAAC,IAAI,EAAE,OAAO,EAAE,GAAG,EAAE,oBAAoB,CAAC,OAAO,CAAC,CACzE,CAAC;aACH;iBAAM;gBACL,IAAI,KAAK,IAAI,CAAC,KAAK,CAAC,EAAE,KAAK,CAAC,IAAI,KAAK,CAAC,IAAI,CAAC,EAAE;oBAC3C,IAAI,CAAC,IAAI,CACP,UAAU,CAAC,cAAc,EACzB,IAAI,8CAAkB,CAAC,IAAI,EAAE,OAAO,EAAE,KAAK,EAAE,oBAAoB,CAAC,OAAO,CAAC,CAC3E,CAAC;iBACH;qBAAM;oBACL,IAAI,CAAC,IAAI,CACP,UA
AU,CAAC,iBAAiB,EAC5B,IAAI,iDAAqB,CAAC,IAAI,EAAE,OAAO,EAAE,KAAK,EAAE,oBAAoB,CAAC,OAAO,CAAC,CAC9E,CAAC;iBACH;aACF;YAED,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;aACtB;QACH,CAAC,CAAC;KACH;IAED,IAAI,CAAC,oBAAoB,CAAC,UAAU,EAAE;QACpC,IAAI,CAAC,MAAM,CAAC,CAAC,GAAG,CAAC,oBAAoB,CAAC,SAAS,EAAE,oBAAoB,CAAC,CAAC;KACxE;IAED,IAAI;QACF,IAAI,CAAC,cAAc,CAAC,CAAC,YAAY,CAAC,OAAO,EAAE,oBAAoB,CAAC,CAAC;KAClE;IAAC,OAAO,CAAC,EAAE;QACV,IAAI,CAAC,oBAAoB,CAAC,UAAU,EAAE;YACpC,IAAI,CAAC,MAAM,CAAC,CAAC,MAAM,CAAC,oBAAoB,CAAC,SAAS,CAAC,CAAC;YACpD,oBAAoB,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC;YAC3B,OAAO;SACR;KACF;IAED,IAAI,oBAAoB,CAAC,UAAU,EAAE;QACnC,oBAAoB,CAAC,EAAE,EAAE,CAAC;KAC3B;AACH,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connection_pool.js b/node_modules/mongodb/lib/cmap/connection_pool.js new file mode 100644 index 000000000..85ffb9a5e --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connection_pool.js @@ -0,0 +1,495 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CMAP_EVENTS = exports.ConnectionPool = void 0; +const Denque = require("denque"); +const connection_1 = require("./connection"); +const logger_1 = require("../logger"); +const metrics_1 = require("./metrics"); +const connect_1 = require("./connect"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +const errors_1 = require("./errors"); +const connection_pool_events_1 = require("./connection_pool_events"); +const mongo_types_1 = require("../mongo_types"); +/** @internal */ +const kLogger = Symbol('logger'); +/** @internal */ +const kConnections = Symbol('connections'); +/** @internal */ +const kPermits = Symbol('permits'); +/** @internal */ +const kMinPoolSizeTimer = Symbol('minPoolSizeTimer'); +/** @internal */ +const kGeneration = Symbol('generation'); +/** @internal */ +const kServiceGenerations = Symbol('serviceGenerations'); +/** @internal */ +const kConnectionCounter = Symbol('connectionCounter'); +/** @internal */ +const kCancellationToken = Symbol('cancellationToken'); +/** @internal */ +const kWaitQueue = Symbol('waitQueue'); +/** @internal */ +const kCancelled = Symbol('cancelled'); +/** @internal */ +const kMetrics = Symbol('metrics'); +/** @internal */ +const kCheckedOut = Symbol('checkedOut'); +/** @internal */ +const kProcessingWaitQueue = Symbol('processingWaitQueue'); +/** + * A pool of connections which dynamically resizes, and emit events related to pool activity + * @internal + */ +class ConnectionPool extends mongo_types_1.TypedEventEmitter { + /** @internal */ + constructor(options) { + var _a, _b, _c, _d; + super(); + this.closed = false; + this.options = Object.freeze({ + ...options, + connectionType: connection_1.Connection, + maxPoolSize: (_a = options.maxPoolSize) !== null && _a !== void 0 ? _a : 100, + minPoolSize: (_b = options.minPoolSize) !== null && _b !== void 0 ? _b : 0, + maxIdleTimeMS: (_c = options.maxIdleTimeMS) !== null && _c !== void 0 ? _c : 0, + waitQueueTimeoutMS: (_d = options.waitQueueTimeoutMS) !== null && _d !== void 0 ? 
_d : 0, + autoEncrypter: options.autoEncrypter, + metadata: options.metadata + }); + if (this.options.minPoolSize > this.options.maxPoolSize) { + throw new error_1.MongoInvalidArgumentError('Connection pool minimum size must not be greater than maximum pool size'); + } + this[kLogger] = new logger_1.Logger('ConnectionPool'); + this[kConnections] = new Denque(); + this[kPermits] = this.options.maxPoolSize; + this[kMinPoolSizeTimer] = undefined; + this[kGeneration] = 0; + this[kServiceGenerations] = new Map(); + this[kConnectionCounter] = utils_1.makeCounter(1); + this[kCancellationToken] = new mongo_types_1.CancellationToken(); + this[kCancellationToken].setMaxListeners(Infinity); + this[kWaitQueue] = new Denque(); + this[kMetrics] = new metrics_1.ConnectionPoolMetrics(); + this[kCheckedOut] = 0; + this[kProcessingWaitQueue] = false; + process.nextTick(() => { + this.emit(ConnectionPool.CONNECTION_POOL_CREATED, new connection_pool_events_1.ConnectionPoolCreatedEvent(this)); + ensureMinPoolSize(this); + }); + } + /** The address of the endpoint the pool is connected to */ + get address() { + return this.options.hostAddress.toString(); + } + /** An integer representing the SDAM generation of the pool */ + get generation() { + return this[kGeneration]; + } + /** An integer expressing how many total connections (active + in use) the pool currently has */ + get totalConnectionCount() { + return this[kConnections].length + (this.options.maxPoolSize - this[kPermits]); + } + /** An integer expressing how many connections are currently available in the pool. */ + get availableConnectionCount() { + return this[kConnections].length; + } + get waitQueueSize() { + return this[kWaitQueue].length; + } + get loadBalanced() { + return this.options.loadBalanced; + } + get serviceGenerations() { + return this[kServiceGenerations]; + } + get currentCheckedOutCount() { + return this[kCheckedOut]; + } + /** + * Get the metrics information for the pool when a wait queue timeout occurs. + */ + waitQueueErrorMetrics() { + return this[kMetrics].info(this.options.maxPoolSize); + } + /** + * Check a connection out of this pool. The connection will continue to be tracked, but no reference to it + * will be held by the pool. This means that if a connection is checked out it MUST be checked back in or + * explicitly destroyed by the new owner. + */ + checkOut(callback) { + this.emit(ConnectionPool.CONNECTION_CHECK_OUT_STARTED, new connection_pool_events_1.ConnectionCheckOutStartedEvent(this)); + if (this.closed) { + this.emit(ConnectionPool.CONNECTION_CHECK_OUT_FAILED, new connection_pool_events_1.ConnectionCheckOutFailedEvent(this, 'poolClosed')); + callback(new errors_1.PoolClosedError(this)); + return; + } + const waitQueueMember = { callback }; + const waitQueueTimeoutMS = this.options.waitQueueTimeoutMS; + if (waitQueueTimeoutMS) { + waitQueueMember.timer = setTimeout(() => { + waitQueueMember[kCancelled] = true; + waitQueueMember.timer = undefined; + this.emit(ConnectionPool.CONNECTION_CHECK_OUT_FAILED, new connection_pool_events_1.ConnectionCheckOutFailedEvent(this, 'timeout')); + waitQueueMember.callback(new errors_1.WaitQueueTimeoutError(this.loadBalanced + ? this.waitQueueErrorMetrics() + : 'Timed out while checking out a connection from connection pool', this.address)); + }, waitQueueTimeoutMS); + } + this[kCheckedOut] = this[kCheckedOut] + 1; + this[kWaitQueue].push(waitQueueMember); + process.nextTick(processWaitQueue, this); + } + /** + * Check a connection into the pool. 
+ * + * @param connection - The connection to check in + */ + checkIn(connection) { + const poolClosed = this.closed; + const stale = connectionIsStale(this, connection); + const willDestroy = !!(poolClosed || stale || connection.closed); + if (!willDestroy) { + connection.markAvailable(); + this[kConnections].unshift(connection); + } + this[kCheckedOut] = this[kCheckedOut] - 1; + this.emit(ConnectionPool.CONNECTION_CHECKED_IN, new connection_pool_events_1.ConnectionCheckedInEvent(this, connection)); + if (willDestroy) { + const reason = connection.closed ? 'error' : poolClosed ? 'poolClosed' : 'stale'; + destroyConnection(this, connection, reason); + } + process.nextTick(processWaitQueue, this); + } + /** + * Clear the pool + * + * Pool reset is handled by incrementing the pool's generation count. Any existing connection of a + * previous generation will eventually be pruned during subsequent checkouts. + */ + clear(serviceId) { + if (this.loadBalanced && serviceId) { + const sid = serviceId.toHexString(); + const generation = this.serviceGenerations.get(sid); + // Only need to worry if the generation exists, since it should + // always be there but typescript needs the check. + if (generation == null) { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('Service generations are required in load balancer mode.'); + } + else { + // Increment the generation for the service id. + this.serviceGenerations.set(sid, generation + 1); + } + } + else { + this[kGeneration] += 1; + } + this.emit('connectionPoolCleared', new connection_pool_events_1.ConnectionPoolClearedEvent(this, serviceId)); + } + close(_options, _cb) { + let options = _options; + const callback = (_cb !== null && _cb !== void 0 ? _cb : _options); + if (typeof options === 'function') { + options = {}; + } + options = Object.assign({ force: false }, options); + if (this.closed) { + return callback(); + } + // immediately cancel any in-flight connections + this[kCancellationToken].emit('cancel'); + // drain the wait queue + while (this.waitQueueSize) { + const waitQueueMember = this[kWaitQueue].pop(); + if (waitQueueMember) { + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + if (!waitQueueMember[kCancelled]) { + // TODO(NODE-3483): Replace with MongoConnectionPoolClosedError + waitQueueMember.callback(new error_1.MongoRuntimeError('Connection pool closed')); + } + } + } + // clear the min pool size timer + const minPoolSizeTimer = this[kMinPoolSizeTimer]; + if (minPoolSizeTimer) { + clearTimeout(minPoolSizeTimer); + } + // end the connection counter + if (typeof this[kConnectionCounter].return === 'function') { + this[kConnectionCounter].return(undefined); + } + // mark the pool as closed immediately + this.closed = true; + utils_1.eachAsync(this[kConnections].toArray(), (conn, cb) => { + this.emit(ConnectionPool.CONNECTION_CLOSED, new connection_pool_events_1.ConnectionClosedEvent(this, conn, 'poolClosed')); + conn.destroy(options, cb); + }, err => { + this[kConnections].clear(); + this.emit(ConnectionPool.CONNECTION_POOL_CLOSED, new connection_pool_events_1.ConnectionPoolClosedEvent(this)); + callback(err); + }); + } + /** + * Runs a lambda with an implicitly checked out connection, checking that connection back in when the lambda + * has completed by calling back. + * + * NOTE: please note the required signature of `fn` + * + * @remarks When in load balancer mode, connections can be pinned to cursors or transactions. 
+ * In these cases we pass the connection in to this method to ensure it is used and a new + * connection is not checked out. + * + * @param conn - A pinned connection for use in load balancing mode. + * @param fn - A function which operates on a managed connection + * @param callback - The original callback + */ + withConnection(conn, fn, callback) { + if (conn) { + // use the provided connection, and do _not_ check it in after execution + fn(undefined, conn, (fnErr, result) => { + if (typeof callback === 'function') { + if (fnErr) { + callback(fnErr); + } + else { + callback(undefined, result); + } + } + }); + return; + } + this.checkOut((err, conn) => { + // don't callback with `err` here, we might want to act upon it inside `fn` + fn(err, conn, (fnErr, result) => { + if (typeof callback === 'function') { + if (fnErr) { + callback(fnErr); + } + else { + callback(undefined, result); + } + } + if (conn) { + this.checkIn(conn); + } + }); + }); + } +} +exports.ConnectionPool = ConnectionPool; +/** + * Emitted when the connection pool is created. + * @event + */ +ConnectionPool.CONNECTION_POOL_CREATED = 'connectionPoolCreated'; +/** + * Emitted once when the connection pool is closed + * @event + */ +ConnectionPool.CONNECTION_POOL_CLOSED = 'connectionPoolClosed'; +/** + * Emitted each time the connection pool is cleared and it's generation incremented + * @event + */ +ConnectionPool.CONNECTION_POOL_CLEARED = 'connectionPoolCleared'; +/** + * Emitted when a connection is created. + * @event + */ +ConnectionPool.CONNECTION_CREATED = 'connectionCreated'; +/** + * Emitted when a connection becomes established, and is ready to use + * @event + */ +ConnectionPool.CONNECTION_READY = 'connectionReady'; +/** + * Emitted when a connection is closed + * @event + */ +ConnectionPool.CONNECTION_CLOSED = 'connectionClosed'; +/** + * Emitted when an attempt to check out a connection begins + * @event + */ +ConnectionPool.CONNECTION_CHECK_OUT_STARTED = 'connectionCheckOutStarted'; +/** + * Emitted when an attempt to check out a connection fails + * @event + */ +ConnectionPool.CONNECTION_CHECK_OUT_FAILED = 'connectionCheckOutFailed'; +/** + * Emitted each time a connection is successfully checked out of the connection pool + * @event + */ +ConnectionPool.CONNECTION_CHECKED_OUT = 'connectionCheckedOut'; +/** + * Emitted each time a connection is successfully checked into the connection pool + * @event + */ +ConnectionPool.CONNECTION_CHECKED_IN = 'connectionCheckedIn'; +function ensureMinPoolSize(pool) { + if (pool.closed || pool.options.minPoolSize === 0) { + return; + } + const minPoolSize = pool.options.minPoolSize; + for (let i = pool.totalConnectionCount; i < minPoolSize; ++i) { + createConnection(pool); + } + pool[kMinPoolSizeTimer] = setTimeout(() => ensureMinPoolSize(pool), 10); +} +function connectionIsStale(pool, connection) { + const serviceId = connection.serviceId; + if (pool.loadBalanced && serviceId) { + const sid = serviceId.toHexString(); + const generation = pool.serviceGenerations.get(sid); + return connection.generation !== generation; + } + return connection.generation !== pool[kGeneration]; +} +function connectionIsIdle(pool, connection) { + return !!(pool.options.maxIdleTimeMS && connection.idleTime > pool.options.maxIdleTimeMS); +} +function createConnection(pool, callback) { + const connectOptions = { + ...pool.options, + id: pool[kConnectionCounter].next().value, + generation: pool[kGeneration], + cancellationToken: pool[kCancellationToken] + }; + pool[kPermits]--; + 
connect_1.connect(connectOptions, (err, connection) => { + if (err || !connection) { + pool[kPermits]++; + pool[kLogger].debug(`connection attempt failed with error [${JSON.stringify(err)}]`); + if (typeof callback === 'function') { + callback(err); + } + return; + } + // The pool might have closed since we started trying to create a connection + if (pool.closed) { + connection.destroy({ force: true }); + return; + } + // forward all events from the connection to the pool + for (const event of [...connection_1.APM_EVENTS, connection_1.Connection.CLUSTER_TIME_RECEIVED]) { + connection.on(event, (e) => pool.emit(event, e)); + } + pool.emit(ConnectionPool.CONNECTION_CREATED, new connection_pool_events_1.ConnectionCreatedEvent(pool, connection)); + if (pool.loadBalanced) { + connection.on(connection_1.Connection.PINNED, pinType => pool[kMetrics].markPinned(pinType)); + connection.on(connection_1.Connection.UNPINNED, pinType => pool[kMetrics].markUnpinned(pinType)); + const serviceId = connection.serviceId; + if (serviceId) { + let generation; + const sid = serviceId.toHexString(); + if ((generation = pool.serviceGenerations.get(sid))) { + connection.generation = generation; + } + else { + pool.serviceGenerations.set(sid, 0); + connection.generation = 0; + } + } + } + connection.markAvailable(); + pool.emit(ConnectionPool.CONNECTION_READY, new connection_pool_events_1.ConnectionReadyEvent(pool, connection)); + // if a callback has been provided, check out the connection immediately + if (typeof callback === 'function') { + callback(undefined, connection); + return; + } + // otherwise add it to the pool for later acquisition, and try to process the wait queue + pool[kConnections].push(connection); + process.nextTick(processWaitQueue, pool); + }); +} +function destroyConnection(pool, connection, reason) { + pool.emit(ConnectionPool.CONNECTION_CLOSED, new connection_pool_events_1.ConnectionClosedEvent(pool, connection, reason)); + // allow more connections to be created + pool[kPermits]++; + // destroy the connection + process.nextTick(() => connection.destroy()); +} +function processWaitQueue(pool) { + if (pool.closed || pool[kProcessingWaitQueue]) { + return; + } + pool[kProcessingWaitQueue] = true; + while (pool.waitQueueSize) { + const waitQueueMember = pool[kWaitQueue].peekFront(); + if (!waitQueueMember) { + pool[kWaitQueue].shift(); + continue; + } + if (waitQueueMember[kCancelled]) { + pool[kWaitQueue].shift(); + continue; + } + if (!pool.availableConnectionCount) { + break; + } + const connection = pool[kConnections].shift(); + if (!connection) { + break; + } + const isStale = connectionIsStale(pool, connection); + const isIdle = connectionIsIdle(pool, connection); + if (!isStale && !isIdle && !connection.closed) { + pool.emit(ConnectionPool.CONNECTION_CHECKED_OUT, new connection_pool_events_1.ConnectionCheckedOutEvent(pool, connection)); + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + pool[kWaitQueue].shift(); + waitQueueMember.callback(undefined, connection); + } + else { + const reason = connection.closed ? 'error' : isStale ? 
'stale' : 'idle'; + destroyConnection(pool, connection, reason); + } + } + const maxPoolSize = pool.options.maxPoolSize; + if (pool.waitQueueSize && (maxPoolSize <= 0 || pool.totalConnectionCount < maxPoolSize)) { + createConnection(pool, (err, connection) => { + const waitQueueMember = pool[kWaitQueue].shift(); + if (!waitQueueMember || waitQueueMember[kCancelled]) { + if (!err && connection) { + pool[kConnections].push(connection); + } + pool[kProcessingWaitQueue] = false; + return; + } + if (err) { + pool.emit(ConnectionPool.CONNECTION_CHECK_OUT_FAILED, new connection_pool_events_1.ConnectionCheckOutFailedEvent(pool, err)); + } + else if (connection) { + pool.emit(ConnectionPool.CONNECTION_CHECKED_OUT, new connection_pool_events_1.ConnectionCheckedOutEvent(pool, connection)); + } + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + waitQueueMember.callback(err, connection); + pool[kProcessingWaitQueue] = false; + process.nextTick(() => processWaitQueue(pool)); + }); + } + else { + pool[kProcessingWaitQueue] = false; + } +} +exports.CMAP_EVENTS = [ + ConnectionPool.CONNECTION_POOL_CREATED, + ConnectionPool.CONNECTION_POOL_CLOSED, + ConnectionPool.CONNECTION_CREATED, + ConnectionPool.CONNECTION_READY, + ConnectionPool.CONNECTION_CLOSED, + ConnectionPool.CONNECTION_CHECK_OUT_STARTED, + ConnectionPool.CONNECTION_CHECK_OUT_FAILED, + ConnectionPool.CONNECTION_CHECKED_OUT, + ConnectionPool.CONNECTION_CHECKED_IN, + ConnectionPool.CONNECTION_POOL_CLEARED +]; +//# sourceMappingURL=connection_pool.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connection_pool.js.map b/node_modules/mongodb/lib/cmap/connection_pool.js.map new file mode 100644 index 000000000..204241290 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connection_pool.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"connection_pool.js","sourceRoot":"","sources":["../../src/cmap/connection_pool.ts"],"names":[],"mappings":";;;AAAA,iCAAkC;AAClC,6CAA2F;AAE3F,sCAAmC;AACnC,uCAAkD;AAClD,uCAAoC;AACpC,oCAA4D;AAC5D,oCAAoF;AACpF,qCAAkE;AAClE,qEAWkC;AAClC,gDAAsE;AAEtE,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AACnC,gBAAgB;AAChB,MAAM,iBAAiB,GAAG,MAAM,CAAC,kBAAkB,CAAC,CAAC;AACrD,gBAAgB;AAChB,MAAM,WAAW,GAAG,MAAM,CAAC,YAAY,CAAC,CAAC;AACzC,gBAAgB;AAChB,MAAM,mBAAmB,GAAG,MAAM,CAAC,oBAAoB,CAAC,CAAC;AACzD,gBAAgB;AAChB,MAAM,kBAAkB,GAAG,MAAM,CAAC,mBAAmB,CAAC,CAAC;AACvD,gBAAgB;AAChB,MAAM,kBAAkB,GAAG,MAAM,CAAC,mBAAmB,CAAC,CAAC;AACvD,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AACnC,gBAAgB;AAChB,MAAM,WAAW,GAAG,MAAM,CAAC,YAAY,CAAC,CAAC;AACzC,gBAAgB;AAChB,MAAM,oBAAoB,GAAG,MAAM,CAAC,qBAAqB,CAAC,CAAC;AA0C3D;;;GAGG;AACH,MAAa,cAAe,SAAQ,+BAAuC;IAuFzE,gBAAgB;IAChB,YAAY,OAA8B;;QACxC,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC;YAC3B,GAAG,OAAO;YACV,cAAc,EAAE,uBAAU;YAC1B,WAAW,EAAE,MAAA,OAAO,CAAC,WAAW,mCAAI,GAAG;YACvC,WAAW,EAAE,MAAA,OAAO,CAAC,WAAW,mCAAI,CAAC;YACrC,aAAa,EAAE,MAAA,OAAO,CAAC,aAAa,mCAAI,CAAC;YACzC,kBAAkB,EAAE,MAAA,OAAO,CAAC,kBAAkB,mCAAI,CAAC;YACnD,aAAa,EAAE,OAAO,CAAC,aAAa;YACpC,QAAQ,EAAE,OAAO,CAAC,QAAQ;SAC3B,CAAC,CAAC;QAEH,IAAI,IAAI,CAAC,OAAO,CAAC,WAAW,GAAG,IAAI,CAAC,OAAO,CAAC,WAAW,EAAE;YACvD,MAAM,IAAI,iCAAyB,CACjC,yEAAyE,CAC1E,CAAC;SACH;QAED,IAAI,CAAC,OAAO,CAAC,GAAG,IAAI,eAAM,CAAC,gBAAgB,CAAC,CAAC;QAC7C,IAAI,CAAC,YAAY,CAAC,GAAG,IAAI,MAAM,EAAE,CAAC;QAClC,IAAI,CAAC,QAAQ,CAAC,GAAG,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC;QAC1C,IAAI,CAAC,iBAAiB,CAAC,GAAG,SAAS,CAAC;QACpC,IAAI,CAAC,WAAW,CAAC,GAAG,CAAC,CAAC;QACtB,IAAI,CAAC,mBAAmB,CAAC,GAAG,IAAI,GAAG,EAAE,CAAC;QACtC,IAAI,CAAC,kBAAkB,CAAC,GAAG,mBAAW,CAAC,CAAC,CAAC,CAAC;QAC1C,IAAI,CAAC,kBAAkB,CAAC,GAAG,IAAI,+BAAiB,EAAE,CAAC;QACnD,IAAI,CAAC,kBAAkB,CAAC,CAAC,eAAe,CAAC,QAAQ,CAAC,CAAC;QACnD,IAAI,CAAC,UAAU,CAAC,GAAG,IAAI,MAAM,EAAE,CAAC;QAChC,IAAI,CAAC,QAAQ,CAAC,GAAG,IAAI,+BAAqB,EAAE,CAAC;QAC7C,IAAI,CAAC,WAAW,CAAC,GAAG,CAAC,CAAC;QACtB,IAAI,CAAC,oBAAoB,CAAC,GAAG,KAAK,CAAC;QAEnC,OAAO,CAAC,QAAQ,CAAC,GAAG,EAAE;YACpB,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,uBAAuB,EAAE,IAAI,mDAA0B,CAAC,IAAI,CAAC,CAAC,CAAC;YACxF,iBAAiB,CAAC,IAAI,CAAC,CAAC;QAC1B,CAAC,CAAC,CAAC;IACL,CAAC;IAED,2DAA2D;IAC3D,IAAI,OAAO;QACT,OAAO,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,QAAQ,EAAE,CAAC;IAC7C,CAAC;IAED,8DAA8D;IAC9D,IAAI,UAAU;QACZ,OAAO,IAAI,CAAC,WAAW,CAAC,CAAC;IAC3B,CAAC;IAED,gGAAgG;IAChG,IAAI,oBAAoB;QACtB,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,CAAC,OAAO,CAAC,WAAW,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC;IACjF,CAAC;IAED,sFAAsF;IACtF,IAAI,wBAAwB;QAC1B,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,MAAM,CAAC;IACnC,CAAC;IAED,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC;IACjC,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,OAAO,CAAC,YAAY,CAAC;IACnC,CAAC;IAED,IAAI,kBAAkB;QACpB,OAAO,IAAI,CAAC,mBAAmB,CAAC,CAAC;IACnC,CAAC;IAED,IAAI,sBAAsB;QACxB,OAAO,IAAI,CAAC,WAAW,CAAC,CAAC;IAC3B,CAAC;IAED;;OAEG;IACK,qBAAqB;QAC3B,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC;IACvD,CAAC;IAED;;;;OAIG;IACH,QAAQ,CAAC,QAA8B;QACrC,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,4BAA4B,EAC3C,IAAI,uDAA8B,CAAC,IAAI,CAAC,CACzC,CAAC;QAEF,IAAI,IAAI,CAAC,MAAM,EAAE;YACf,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,2BAA2B,EAC1C,IAAI,sDAA6B,CAAC,IAAI,EAAE,YAAY,CAAC,CACtD,CAAC;YACF,QAAQ,C
AAC,IAAI,wBAAe,CAAC,IAAI,CAAC,CAAC,CAAC;YACpC,OAAO;SACR;QAED,MAAM,eAAe,GAAoB,EAAE,QAAQ,EAAE,CAAC;QACtD,MAAM,kBAAkB,GAAG,IAAI,CAAC,OAAO,CAAC,kBAAkB,CAAC;QAC3D,IAAI,kBAAkB,EAAE;YACtB,eAAe,CAAC,KAAK,GAAG,UAAU,CAAC,GAAG,EAAE;gBACtC,eAAe,CAAC,UAAU,CAAC,GAAG,IAAI,CAAC;gBACnC,eAAe,CAAC,KAAK,GAAG,SAAS,CAAC;gBAElC,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,2BAA2B,EAC1C,IAAI,sDAA6B,CAAC,IAAI,EAAE,SAAS,CAAC,CACnD,CAAC;gBACF,eAAe,CAAC,QAAQ,CACtB,IAAI,8BAAqB,CACvB,IAAI,CAAC,YAAY;oBACf,CAAC,CAAC,IAAI,CAAC,qBAAqB,EAAE;oBAC9B,CAAC,CAAC,gEAAgE,EACpE,IAAI,CAAC,OAAO,CACb,CACF,CAAC;YACJ,CAAC,EAAE,kBAAkB,CAAC,CAAC;SACxB;QAED,IAAI,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC,WAAW,CAAC,GAAG,CAAC,CAAC;QAC1C,IAAI,CAAC,UAAU,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;QACvC,OAAO,CAAC,QAAQ,CAAC,gBAAgB,EAAE,IAAI,CAAC,CAAC;IAC3C,CAAC;IAED;;;;OAIG;IACH,OAAO,CAAC,UAAsB;QAC5B,MAAM,UAAU,GAAG,IAAI,CAAC,MAAM,CAAC;QAC/B,MAAM,KAAK,GAAG,iBAAiB,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC;QAClD,MAAM,WAAW,GAAG,CAAC,CAAC,CAAC,UAAU,IAAI,KAAK,IAAI,UAAU,CAAC,MAAM,CAAC,CAAC;QAEjE,IAAI,CAAC,WAAW,EAAE;YAChB,UAAU,CAAC,aAAa,EAAE,CAAC;YAC3B,IAAI,CAAC,YAAY,CAAC,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC;SACxC;QAED,IAAI,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC,WAAW,CAAC,GAAG,CAAC,CAAC;QAC1C,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,qBAAqB,EAAE,IAAI,iDAAwB,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC,CAAC;QAEhG,IAAI,WAAW,EAAE;YACf,MAAM,MAAM,GAAG,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,UAAU,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC,CAAC,OAAO,CAAC;YACjF,iBAAiB,CAAC,IAAI,EAAE,UAAU,EAAE,MAAM,CAAC,CAAC;SAC7C;QAED,OAAO,CAAC,QAAQ,CAAC,gBAAgB,EAAE,IAAI,CAAC,CAAC;IAC3C,CAAC;IAED;;;;;OAKG;IACH,KAAK,CAAC,SAAoB;QACxB,IAAI,IAAI,CAAC,YAAY,IAAI,SAAS,EAAE;YAClC,MAAM,GAAG,GAAG,SAAS,CAAC,WAAW,EAAE,CAAC;YACpC,MAAM,UAAU,GAAG,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;YACpD,+DAA+D;YAC/D,kDAAkD;YAClD,IAAI,UAAU,IAAI,IAAI,EAAE;gBACtB,kBAAkB;gBAClB,MAAM,IAAI,yBAAiB,CAAC,yDAAyD,CAAC,CAAC;aACxF;iBAAM;gBACL,+CAA+C;gBAC/C,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,GAAG,EAAE,UAAU,GAAG,CAAC,CAAC,CAAC;aAClD;SACF;aAAM;YACL,IAAI,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;SACxB;QAED,IAAI,CAAC,IAAI,CAAC,uBAAuB,EAAE,IAAI,mDAA0B,CAAC,IAAI,EAAE,SAAS,CAAC,CAAC,CAAC;IACtF,CAAC;IAKD,KAAK,CAAC,QAAwC,EAAE,GAAoB;QAClE,IAAI,OAAO,GAAG,QAAwB,CAAC;QACvC,MAAM,QAAQ,GAAG,CAAC,GAAG,aAAH,GAAG,cAAH,GAAG,GAAI,QAAQ,CAAmB,CAAC;QACrD,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,OAAO,GAAG,EAAE,CAAC;SACd;QAED,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,KAAK,EAAE,KAAK,EAAE,EAAE,OAAO,CAAC,CAAC;QACnD,IAAI,IAAI,CAAC,MAAM,EAAE;YACf,OAAO,QAAQ,EAAE,CAAC;SACnB;QAED,+CAA+C;QAC/C,IAAI,CAAC,kBAAkB,CAAC,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QAExC,uBAAuB;QACvB,OAAO,IAAI,CAAC,aAAa,EAAE;YACzB,MAAM,eAAe,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC,GAAG,EAAE,CAAC;YAC/C,IAAI,eAAe,EAAE;gBACnB,IAAI,eAAe,CAAC,KAAK,EAAE;oBACzB,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;iBACrC;gBACD,IAAI,CAAC,eAAe,CAAC,UAAU,CAAC,EAAE;oBAChC,+DAA+D;oBAC/D,eAAe,CAAC,QAAQ,CAAC,IAAI,yBAAiB,CAAC,wBAAwB,CAAC,CAAC,CAAC;iBAC3E;aACF;SACF;QAED,gCAAgC;QAChC,MAAM,gBAAgB,GAAG,IAAI,CAAC,iBAAiB,CAAC,CAAC;QACjD,IAAI,gBAAgB,EAAE;YACpB,YAAY,CAAC,gBAAgB,CAAC,CAAC;SAChC;QAED,6BAA6B;QAC7B,IAAI,OAAO,IAAI,CAAC,kBAAkB,CAAC,CAAC,MAAM,KAAK,UAAU,EAAE;YACzD,IAAI,CAAC,kBAAkB,CAAC,CAAC,MAAM,CAAC,SAAS,CAAC,CAAC;SAC5C;QAED,sCAAsC;QACtC,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;QACnB,iBAAS,CACP,IAAI,CAAC,YAAY,CAAC,CAAC,OAAO,EAAE,EAC5B,CAAC,IAAI,EAAE,EAAE,EAAE,EAAE;YACX,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,iBAAiB,EAChC,IAAI,8CAAqB,CAAC,IAAI,EAAE,IAAI,EAAE,YAAY,CAAC,CACpD,CAAC;YACF,IAAI,CAAC,OAAO,CAAC,OAAO,EAAE,EAAE,CAAC,CAAC;QAC5B,CAAC,EACD,GAAG,CAAC,EAAE;YACJ,IAAI,CAAC,YAAY,CAAC,CAAC,KAAK,EAAE,CAAC;YAC3B,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,sBAAsB,EAAE,IAAI,kDAAyB,CAAC,I
AAI,CAAC,CAAC,CAAC;YACtF,QAAQ,CAAC,GAAG,CAAC,CAAC;QAChB,CAAC,CACF,CAAC;IACJ,CAAC;IAED;;;;;;;;;;;;;OAaG;IACH,cAAc,CACZ,IAA4B,EAC5B,EAA0B,EAC1B,QAA+B;QAE/B,IAAI,IAAI,EAAE;YACR,wEAAwE;YACxE,EAAE,CAAC,SAAS,EAAE,IAAI,EAAE,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;gBACpC,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;oBAClC,IAAI,KAAK,EAAE;wBACT,QAAQ,CAAC,KAAK,CAAC,CAAC;qBACjB;yBAAM;wBACL,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;qBAC7B;iBACF;YACH,CAAC,CAAC,CAAC;YAEH,OAAO;SACR;QAED,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE;YAC1B,2EAA2E;YAC3E,EAAE,CAAC,GAAiB,EAAE,IAAI,EAAE,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE;gBAC5C,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;oBAClC,IAAI,KAAK,EAAE;wBACT,QAAQ,CAAC,KAAK,CAAC,CAAC;qBACjB;yBAAM;wBACL,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;qBAC7B;iBACF;gBAED,IAAI,IAAI,EAAE;oBACR,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;iBACpB;YACH,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;;AA/XH,wCAgYC;AA5VC;;;GAGG;AACa,sCAAuB,GAAG,uBAAgC,CAAC;AAC3E;;;GAGG;AACa,qCAAsB,GAAG,sBAA+B,CAAC;AACzE;;;GAGG;AACa,sCAAuB,GAAG,uBAAgC,CAAC;AAC3E;;;GAGG;AACa,iCAAkB,GAAG,mBAA4B,CAAC;AAClE;;;GAGG;AACa,+BAAgB,GAAG,iBAA0B,CAAC;AAC9D;;;GAGG;AACa,gCAAiB,GAAG,kBAA2B,CAAC;AAChE;;;GAGG;AACa,2CAA4B,GAAG,2BAAoC,CAAC;AACpF;;;GAGG;AACa,0CAA2B,GAAG,0BAAmC,CAAC;AAClF;;;GAGG;AACa,qCAAsB,GAAG,sBAA+B,CAAC;AACzE;;;GAGG;AACa,oCAAqB,GAAG,qBAA8B,CAAC;AA6SzE,SAAS,iBAAiB,CAAC,IAAoB;IAC7C,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,OAAO,CAAC,WAAW,KAAK,CAAC,EAAE;QACjD,OAAO;KACR;IAED,MAAM,WAAW,GAAG,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC;IAC7C,KAAK,IAAI,CAAC,GAAG,IAAI,CAAC,oBAAoB,EAAE,CAAC,GAAG,WAAW,EAAE,EAAE,CAAC,EAAE;QAC5D,gBAAgB,CAAC,IAAI,CAAC,CAAC;KACxB;IAED,IAAI,CAAC,iBAAiB,CAAC,GAAG,UAAU,CAAC,GAAG,EAAE,CAAC,iBAAiB,CAAC,IAAI,CAAC,EAAE,EAAE,CAAC,CAAC;AAC1E,CAAC;AAED,SAAS,iBAAiB,CAAC,IAAoB,EAAE,UAAsB;IACrE,MAAM,SAAS,GAAG,UAAU,CAAC,SAAS,CAAC;IACvC,IAAI,IAAI,CAAC,YAAY,IAAI,SAAS,EAAE;QAClC,MAAM,GAAG,GAAG,SAAS,CAAC,WAAW,EAAE,CAAC;QACpC,MAAM,UAAU,GAAG,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;QACpD,OAAO,UAAU,CAAC,UAAU,KAAK,UAAU,CAAC;KAC7C;IAED,OAAO,UAAU,CAAC,UAAU,KAAK,IAAI,CAAC,WAAW,CAAC,CAAC;AACrD,CAAC;AAED,SAAS,gBAAgB,CAAC,IAAoB,EAAE,UAAsB;IACpE,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,OAAO,CAAC,aAAa,IAAI,UAAU,CAAC,QAAQ,GAAG,IAAI,CAAC,OAAO,CAAC,aAAa,CAAC,CAAC;AAC5F,CAAC;AAED,SAAS,gBAAgB,CAAC,IAAoB,EAAE,QAA+B;IAC7E,MAAM,cAAc,GAAsB;QACxC,GAAG,IAAI,CAAC,OAAO;QACf,EAAE,EAAE,IAAI,CAAC,kBAAkB,CAAC,CAAC,IAAI,EAAE,CAAC,KAAK;QACzC,UAAU,EAAE,IAAI,CAAC,WAAW,CAAC;QAC7B,iBAAiB,EAAE,IAAI,CAAC,kBAAkB,CAAC;KAC5C,CAAC;IAEF,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC;IACjB,iBAAO,CAAC,cAAc,EAAE,CAAC,GAAG,EAAE,UAAU,EAAE,EAAE;QAC1C,IAAI,GAAG,IAAI,CAAC,UAAU,EAAE;YACtB,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC;YACjB,IAAI,CAAC,OAAO,CAAC,CAAC,KAAK,CAAC,yCAAyC,IAAI,CAAC,SAAS,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;YACrF,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,CAAC,GAAG,CAAC,CAAC;aACf;YAED,OAAO;SACR;QAED,4EAA4E;QAC5E,IAAI,IAAI,CAAC,MAAM,EAAE;YACf,UAAU,CAAC,OAAO,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;YACpC,OAAO;SACR;QAED,qDAAqD;QACrD,KAAK,MAAM,KAAK,IAAI,CAAC,GAAG,uBAAU,EAAE,uBAAU,CAAC,qBAAqB,CAAC,EAAE;YACrE,UAAU,CAAC,EAAE,CAAC,KAAK,EAAE,CAAC,CAAM,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;SACvD;QAED,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,kBAAkB,EAAE,IAAI,+CAAsB,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC,CAAC;QAE3F,IAAI,IAAI,CAAC,YAAY,EAAE;YACrB,UAAU,CAAC,EAAE,CAAC,uBAAU,CAAC,MAAM,EAAE,OAAO,CAAC,EAAE,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC,UAAU,CAAC,OAAO,CAAC,CAAC,CAAC;YAChF,UAAU,CAAC,EAAE,CAAC,uBAAU,CAAC,QAAQ,EAAE,OAAO,CAAC,EAAE,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC,YAAY,CAAC,OAAO,CAAC,CAAC,CAAC;YAEpF,MAAM,SAAS,GAAG,UAAU,CAAC,SAAS,CAAC;YACvC,IAAI,SAAS,EAAE;gBACb,IAAI,UAAU,CAAC;gBACf,M
AAM,GAAG,GAAG,SAAS,CAAC,WAAW,EAAE,CAAC;gBACpC,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC,EAAE;oBACnD,UAAU,CAAC,UAAU,GAAG,UAAU,CAAC;iBACpC;qBAAM;oBACL,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,GAAG,EAAE,CAAC,CAAC,CAAC;oBACpC,UAAU,CAAC,UAAU,GAAG,CAAC,CAAC;iBAC3B;aACF;SACF;QAED,UAAU,CAAC,aAAa,EAAE,CAAC;QAC3B,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,gBAAgB,EAAE,IAAI,6CAAoB,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC,CAAC;QAEvF,wEAAwE;QACxE,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,QAAQ,CAAC,SAAS,EAAE,UAAU,CAAC,CAAC;YAChC,OAAO;SACR;QAED,wFAAwF;QACxF,IAAI,CAAC,YAAY,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC;QACpC,OAAO,CAAC,QAAQ,CAAC,gBAAgB,EAAE,IAAI,CAAC,CAAC;IAC3C,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,iBAAiB,CAAC,IAAoB,EAAE,UAAsB,EAAE,MAAc;IACrF,IAAI,CAAC,IAAI,CAAC,cAAc,CAAC,iBAAiB,EAAE,IAAI,8CAAqB,CAAC,IAAI,EAAE,UAAU,EAAE,MAAM,CAAC,CAAC,CAAC;IAEjG,uCAAuC;IACvC,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC;IAEjB,yBAAyB;IACzB,OAAO,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC,UAAU,CAAC,OAAO,EAAE,CAAC,CAAC;AAC/C,CAAC;AAED,SAAS,gBAAgB,CAAC,IAAoB;IAC5C,IAAI,IAAI,CAAC,MAAM,IAAI,IAAI,CAAC,oBAAoB,CAAC,EAAE;QAC7C,OAAO;KACR;IAED,IAAI,CAAC,oBAAoB,CAAC,GAAG,IAAI,CAAC;IAClC,OAAO,IAAI,CAAC,aAAa,EAAE;QACzB,MAAM,eAAe,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC,SAAS,EAAE,CAAC;QACrD,IAAI,CAAC,eAAe,EAAE;YACpB,IAAI,CAAC,UAAU,CAAC,CAAC,KAAK,EAAE,CAAC;YACzB,SAAS;SACV;QAED,IAAI,eAAe,CAAC,UAAU,CAAC,EAAE;YAC/B,IAAI,CAAC,UAAU,CAAC,CAAC,KAAK,EAAE,CAAC;YACzB,SAAS;SACV;QAED,IAAI,CAAC,IAAI,CAAC,wBAAwB,EAAE;YAClC,MAAM;SACP;QAED,MAAM,UAAU,GAAG,IAAI,CAAC,YAAY,CAAC,CAAC,KAAK,EAAE,CAAC;QAC9C,IAAI,CAAC,UAAU,EAAE;YACf,MAAM;SACP;QAED,MAAM,OAAO,GAAG,iBAAiB,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC;QACpD,MAAM,MAAM,GAAG,gBAAgB,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC;QAClD,IAAI,CAAC,OAAO,IAAI,CAAC,MAAM,IAAI,CAAC,UAAU,CAAC,MAAM,EAAE;YAC7C,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,sBAAsB,EACrC,IAAI,kDAAyB,CAAC,IAAI,EAAE,UAAU,CAAC,CAChD,CAAC;YACF,IAAI,eAAe,CAAC,KAAK,EAAE;gBACzB,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;aACrC;YAED,IAAI,CAAC,UAAU,CAAC,CAAC,KAAK,EAAE,CAAC;YACzB,eAAe,CAAC,QAAQ,CAAC,SAAS,EAAE,UAAU,CAAC,CAAC;SACjD;aAAM;YACL,MAAM,MAAM,GAAG,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,CAAC;YACxE,iBAAiB,CAAC,IAAI,EAAE,UAAU,EAAE,MAAM,CAAC,CAAC;SAC7C;KACF;IAED,MAAM,WAAW,GAAG,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC;IAC7C,IAAI,IAAI,CAAC,aAAa,IAAI,CAAC,WAAW,IAAI,CAAC,IAAI,IAAI,CAAC,oBAAoB,GAAG,WAAW,CAAC,EAAE;QACvF,gBAAgB,CAAC,IAAI,EAAE,CAAC,GAAG,EAAE,UAAU,EAAE,EAAE;YACzC,MAAM,eAAe,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC,KAAK,EAAE,CAAC;YACjD,IAAI,CAAC,eAAe,IAAI,eAAe,CAAC,UAAU,CAAC,EAAE;gBACnD,IAAI,CAAC,GAAG,IAAI,UAAU,EAAE;oBACtB,IAAI,CAAC,YAAY,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC;iBACrC;gBAED,IAAI,CAAC,oBAAoB,CAAC,GAAG,KAAK,CAAC;gBACnC,OAAO;aACR;YAED,IAAI,GAAG,EAAE;gBACP,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,2BAA2B,EAC1C,IAAI,sDAA6B,CAAC,IAAI,EAAE,GAAG,CAAC,CAC7C,CAAC;aACH;iBAAM,IAAI,UAAU,EAAE;gBACrB,IAAI,CAAC,IAAI,CACP,cAAc,CAAC,sBAAsB,EACrC,IAAI,kDAAyB,CAAC,IAAI,EAAE,UAAU,CAAC,CAChD,CAAC;aACH;YAED,IAAI,eAAe,CAAC,KAAK,EAAE;gBACzB,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;aACrC;YACD,eAAe,CAAC,QAAQ,CAAC,GAAG,EAAE,UAAU,CAAC,CAAC;YAC1C,IAAI,CAAC,oBAAoB,CAAC,GAAG,KAAK,CAAC;YACnC,OAAO,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC,CAAC;QACjD,CAAC,CAAC,CAAC;KACJ;SAAM;QACL,IAAI,CAAC,oBAAoB,CAAC,GAAG,KAAK,CAAC;KACpC;AACH,CAAC;AAEY,QAAA,WAAW,GAAG;IACzB,cAAc,CAAC,uBAAuB;IACtC,cAAc,CAAC,sBAAsB;IACrC,cAAc,CAAC,kBAAkB;IACjC,cAAc,CAAC,gBAAgB;IAC/B,cAAc,CAAC,iBAAiB;IAChC,cAAc,CAAC,4BAA4B;IAC3C,cAAc,CAAC,2BAA2B;IAC1C,cAAc,CAAC,sBAAsB;IACrC,cAAc,CAAC,qBAAqB;IACpC,cAAc,CAAC,uBAAuB;CAC9B,CAAC"} \ No newline at end of file diff --git 
a/node_modules/mongodb/lib/cmap/connection_pool_events.js b/node_modules/mongodb/lib/cmap/connection_pool_events.js new file mode 100644 index 000000000..86aeaef8a --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connection_pool_events.js @@ -0,0 +1,147 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ConnectionPoolClearedEvent = exports.ConnectionCheckedInEvent = exports.ConnectionCheckedOutEvent = exports.ConnectionCheckOutFailedEvent = exports.ConnectionCheckOutStartedEvent = exports.ConnectionClosedEvent = exports.ConnectionReadyEvent = exports.ConnectionCreatedEvent = exports.ConnectionPoolClosedEvent = exports.ConnectionPoolCreatedEvent = exports.ConnectionPoolMonitoringEvent = void 0; +/** + * The base export class for all monitoring events published from the connection pool + * @public + * @category Event + */ +class ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool) { + this.time = new Date(); + this.address = pool.address; + } +} +exports.ConnectionPoolMonitoringEvent = ConnectionPoolMonitoringEvent; +/** + * An event published when a connection pool is created + * @public + * @category Event + */ +class ConnectionPoolCreatedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool) { + super(pool); + this.options = pool.options; + } +} +exports.ConnectionPoolCreatedEvent = ConnectionPoolCreatedEvent; +/** + * An event published when a connection pool is closed + * @public + * @category Event + */ +class ConnectionPoolClosedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool) { + super(pool); + } +} +exports.ConnectionPoolClosedEvent = ConnectionPoolClosedEvent; +/** + * An event published when a connection pool creates a new connection + * @public + * @category Event + */ +class ConnectionCreatedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, connection) { + super(pool); + this.connectionId = connection.id; + } +} +exports.ConnectionCreatedEvent = ConnectionCreatedEvent; +/** + * An event published when a connection is ready for use + * @public + * @category Event + */ +class ConnectionReadyEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, connection) { + super(pool); + this.connectionId = connection.id; + } +} +exports.ConnectionReadyEvent = ConnectionReadyEvent; +/** + * An event published when a connection is closed + * @public + * @category Event + */ +class ConnectionClosedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, connection, reason) { + super(pool); + this.connectionId = connection.id; + this.reason = reason || 'unknown'; + this.serviceId = connection.serviceId; + } +} +exports.ConnectionClosedEvent = ConnectionClosedEvent; +/** + * An event published when a request to check a connection out begins + * @public + * @category Event + */ +class ConnectionCheckOutStartedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool) { + super(pool); + } +} +exports.ConnectionCheckOutStartedEvent = ConnectionCheckOutStartedEvent; +/** + * An event published when a request to check a connection out fails + * @public + * @category Event + */ +class ConnectionCheckOutFailedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, reason) { + super(pool); + this.reason = reason; + } +} +exports.ConnectionCheckOutFailedEvent = ConnectionCheckOutFailedEvent; +/** + * An event published when a connection 
is checked out of the connection pool + * @public + * @category Event + */ +class ConnectionCheckedOutEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, connection) { + super(pool); + this.connectionId = connection.id; + } +} +exports.ConnectionCheckedOutEvent = ConnectionCheckedOutEvent; +/** + * An event published when a connection is checked into the connection pool + * @public + * @category Event + */ +class ConnectionCheckedInEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, connection) { + super(pool); + this.connectionId = connection.id; + } +} +exports.ConnectionCheckedInEvent = ConnectionCheckedInEvent; +/** + * An event published when a connection pool is cleared + * @public + * @category Event + */ +class ConnectionPoolClearedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool, serviceId) { + super(pool); + this.serviceId = serviceId; + } +} +exports.ConnectionPoolClearedEvent = ConnectionPoolClearedEvent; +//# sourceMappingURL=connection_pool_events.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/connection_pool_events.js.map b/node_modules/mongodb/lib/cmap/connection_pool_events.js.map new file mode 100644 index 000000000..a24be17f7 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/connection_pool_events.js.map @@ -0,0 +1 @@ +{"version":3,"file":"connection_pool_events.js","sourceRoot":"","sources":["../../src/cmap/connection_pool_events.ts"],"names":[],"mappings":";;;AAKA;;;;GAIG;AACH,MAAa,6BAA6B;IAMxC,gBAAgB;IAChB,YAAY,IAAoB;QAC9B,IAAI,CAAC,IAAI,GAAG,IAAI,IAAI,EAAE,CAAC;QACvB,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;IAC9B,CAAC;CACF;AAXD,sEAWC;AAED;;;;GAIG;AACH,MAAa,0BAA2B,SAAQ,6BAA6B;IAI3E,gBAAgB;IAChB,YAAY,IAAoB;QAC9B,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;IAC9B,CAAC;CACF;AATD,gEASC;AAED;;;;GAIG;AACH,MAAa,yBAA0B,SAAQ,6BAA6B;IAC1E,gBAAgB;IAChB,YAAY,IAAoB;QAC9B,KAAK,CAAC,IAAI,CAAC,CAAC;IACd,CAAC;CACF;AALD,8DAKC;AAED;;;;GAIG;AACH,MAAa,sBAAuB,SAAQ,6BAA6B;IAIvE,gBAAgB;IAChB,YAAY,IAAoB,EAAE,UAAsB;QACtD,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,YAAY,GAAG,UAAU,CAAC,EAAE,CAAC;IACpC,CAAC;CACF;AATD,wDASC;AAED;;;;GAIG;AACH,MAAa,oBAAqB,SAAQ,6BAA6B;IAIrE,gBAAgB;IAChB,YAAY,IAAoB,EAAE,UAAsB;QACtD,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,YAAY,GAAG,UAAU,CAAC,EAAE,CAAC;IACpC,CAAC;CACF;AATD,oDASC;AAED;;;;GAIG;AACH,MAAa,qBAAsB,SAAQ,6BAA6B;IAOtE,gBAAgB;IAChB,YAAY,IAAoB,EAAE,UAAsB,EAAE,MAAc;QACtE,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,YAAY,GAAG,UAAU,CAAC,EAAE,CAAC;QAClC,IAAI,CAAC,MAAM,GAAG,MAAM,IAAI,SAAS,CAAC;QAClC,IAAI,CAAC,SAAS,GAAG,UAAU,CAAC,SAAS,CAAC;IACxC,CAAC;CACF;AAdD,sDAcC;AAED;;;;GAIG;AACH,MAAa,8BAA+B,SAAQ,6BAA6B;IAC/E,gBAAgB;IAChB,YAAY,IAAoB;QAC9B,KAAK,CAAC,IAAI,CAAC,CAAC;IACd,CAAC;CACF;AALD,wEAKC;AAED;;;;GAIG;AACH,MAAa,6BAA8B,SAAQ,6BAA6B;IAI9E,gBAAgB;IAChB,YAAY,IAAoB,EAAE,MAAyB;QACzD,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;IACvB,CAAC;CACF;AATD,sEASC;AAED;;;;GAIG;AACH,MAAa,yBAA0B,SAAQ,6BAA6B;IAI1E,gBAAgB;IAChB,YAAY,IAAoB,EAAE,UAAsB;QACtD,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,YAAY,GAAG,UAAU,CAAC,EAAE,CAAC;IACpC,CAAC;CACF;AATD,8DASC;AAED;;;;GAIG;AACH,MAAa,wBAAyB,SAAQ,6BAA6B;IAIzE,gBAAgB;IAChB,YAAY,IAAoB,EAAE,UAAsB;QACtD,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,YAAY,GAAG,UAAU,CAAC,EAAE,CAAC;IACpC,CAAC;CACF;AATD,4DASC;AAED;;;;GAIG;AACH,MAAa,0BAA2B,SAAQ,6BAA6B;IAI3E,gBAAgB;IAChB,YAAY,IAAoB,EAAE,SAAoB;QACpD,KAAK,CAAC,IAAI,CAAC,CAAC;QACZ,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;IAC7B,CAAC;CACF;AATD,gEASC"} \ No newline at end 
of file diff --git a/node_modules/mongodb/lib/cmap/errors.js b/node_modules/mongodb/lib/cmap/errors.js new file mode 100644 index 000000000..63112628a --- /dev/null +++ b/node_modules/mongodb/lib/cmap/errors.js @@ -0,0 +1,33 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.WaitQueueTimeoutError = exports.PoolClosedError = void 0; +const error_1 = require("../error"); +/** + * An error indicating a connection pool is closed + * @category Error + */ +class PoolClosedError extends error_1.MongoDriverError { + constructor(pool) { + super('Attempted to check out a connection from closed connection pool'); + this.address = pool.address; + } + get name() { + return 'MongoPoolClosedError'; + } +} +exports.PoolClosedError = PoolClosedError; +/** + * An error thrown when a request to check out a connection times out + * @category Error + */ +class WaitQueueTimeoutError extends error_1.MongoDriverError { + constructor(message, address) { + super(message); + this.address = address; + } + get name() { + return 'MongoWaitQueueTimeoutError'; + } +} +exports.WaitQueueTimeoutError = WaitQueueTimeoutError; +//# sourceMappingURL=errors.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/errors.js.map b/node_modules/mongodb/lib/cmap/errors.js.map new file mode 100644 index 000000000..c1c0959dc --- /dev/null +++ b/node_modules/mongodb/lib/cmap/errors.js.map @@ -0,0 +1 @@ +{"version":3,"file":"errors.js","sourceRoot":"","sources":["../../src/cmap/errors.ts"],"names":[],"mappings":";;;AAAA,oCAA4C;AAG5C;;;GAGG;AACH,MAAa,eAAgB,SAAQ,wBAAgB;IAInD,YAAY,IAAoB;QAC9B,KAAK,CAAC,iEAAiE,CAAC,CAAC;QACzE,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;IAC9B,CAAC;IAED,IAAI,IAAI;QACN,OAAO,sBAAsB,CAAC;IAChC,CAAC;CACF;AAZD,0CAYC;AAED;;;GAGG;AACH,MAAa,qBAAsB,SAAQ,wBAAgB;IAIzD,YAAY,OAAe,EAAE,OAAe;QAC1C,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,4BAA4B,CAAC;IACtC,CAAC;CACF;AAZD,sDAYC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/message_stream.js b/node_modules/mongodb/lib/cmap/message_stream.js new file mode 100644 index 000000000..08b188aa2 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/message_stream.js @@ -0,0 +1,142 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.MessageStream = void 0; +const stream_1 = require("stream"); +const commands_1 = require("./commands"); +const error_1 = require("../error"); +const constants_1 = require("./wire_protocol/constants"); +const compression_1 = require("./wire_protocol/compression"); +const utils_1 = require("../utils"); +const MESSAGE_HEADER_SIZE = 16; +const COMPRESSION_DETAILS_SIZE = 9; // originalOpcode + uncompressedSize, compressorID +const kDefaultMaxBsonMessageSize = 1024 * 1024 * 16 * 4; +/** @internal */ +const kBuffer = Symbol('buffer'); +/** + * A duplex stream that is capable of reading and writing raw wire protocol messages, with + * support for optional compression + * @internal + */ +class MessageStream extends stream_1.Duplex { + constructor(options = {}) { + super(options); + this.maxBsonMessageSize = options.maxBsonMessageSize || kDefaultMaxBsonMessageSize; + this[kBuffer] = new utils_1.BufferPool(); + } + _write(chunk, _, callback) { + this[kBuffer].append(chunk); + processIncomingData(this, callback); + } + _read( /* size */) { + // NOTE: This implementation is empty because we explicitly push data to be read + // when `writeMessage` is called. 
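+        // Outbound wire bytes are pushed onto the readable side from `writeCommand`;
+        // inbound socket data arrives through `_write` and is surfaced as 'message'
+        // events by `processIncomingData`, so there is nothing to do here.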
+ return; + } + writeCommand(command, operationDescription) { + // TODO: agreed compressor should live in `StreamDescription` + const compressorName = operationDescription && operationDescription.agreedCompressor + ? operationDescription.agreedCompressor + : 'none'; + if (compressorName === 'none' || !canCompress(command)) { + const data = command.toBin(); + this.push(Array.isArray(data) ? Buffer.concat(data) : data); + return; + } + // otherwise, compress the message + const concatenatedOriginalCommandBuffer = Buffer.concat(command.toBin()); + const messageToBeCompressed = concatenatedOriginalCommandBuffer.slice(MESSAGE_HEADER_SIZE); + // Extract information needed for OP_COMPRESSED from the uncompressed message + const originalCommandOpCode = concatenatedOriginalCommandBuffer.readInt32LE(12); + // Compress the message body + compression_1.compress({ options: operationDescription }, messageToBeCompressed, (err, compressedMessage) => { + if (err || !compressedMessage) { + operationDescription.cb(err); + return; + } + // Create the msgHeader of OP_COMPRESSED + const msgHeader = Buffer.alloc(MESSAGE_HEADER_SIZE); + msgHeader.writeInt32LE(MESSAGE_HEADER_SIZE + COMPRESSION_DETAILS_SIZE + compressedMessage.length, 0); // messageLength + msgHeader.writeInt32LE(command.requestId, 4); // requestID + msgHeader.writeInt32LE(0, 8); // responseTo (zero) + msgHeader.writeInt32LE(constants_1.OP_COMPRESSED, 12); // opCode + // Create the compression details of OP_COMPRESSED + const compressionDetails = Buffer.alloc(COMPRESSION_DETAILS_SIZE); + compressionDetails.writeInt32LE(originalCommandOpCode, 0); // originalOpcode + compressionDetails.writeInt32LE(messageToBeCompressed.length, 4); // Size of the uncompressed compressedMessage, excluding the MsgHeader + compressionDetails.writeUInt8(compression_1.Compressor[compressorName], 8); // compressorID + this.push(Buffer.concat([msgHeader, compressionDetails, compressedMessage])); + }); + } +} +exports.MessageStream = MessageStream; +// Return whether a command contains an uncompressible command term +// Will return true if command contains no uncompressible command terms +function canCompress(command) { + const commandDoc = command instanceof commands_1.Msg ? command.command : command.query; + const commandName = Object.keys(commandDoc)[0]; + return !compression_1.uncompressibleCommands.has(commandName); +} +function processIncomingData(stream, callback) { + const buffer = stream[kBuffer]; + if (buffer.length < 4) { + callback(); + return; + } + const sizeOfMessage = buffer.peek(4).readInt32LE(); + if (sizeOfMessage < 0) { + callback(new error_1.MongoParseError(`Invalid message size: ${sizeOfMessage}`)); + return; + } + if (sizeOfMessage > stream.maxBsonMessageSize) { + callback(new error_1.MongoParseError(`Invalid message size: ${sizeOfMessage}, max allowed: ${stream.maxBsonMessageSize}`)); + return; + } + if (sizeOfMessage > buffer.length) { + callback(); + return; + } + const message = buffer.read(sizeOfMessage); + const messageHeader = { + length: message.readInt32LE(0), + requestId: message.readInt32LE(4), + responseTo: message.readInt32LE(8), + opCode: message.readInt32LE(12) + }; + let ResponseType = messageHeader.opCode === constants_1.OP_MSG ? 
commands_1.BinMsg : commands_1.Response; + if (messageHeader.opCode !== constants_1.OP_COMPRESSED) { + const messageBody = message.slice(MESSAGE_HEADER_SIZE); + stream.emit('message', new ResponseType(message, messageHeader, messageBody)); + if (buffer.length >= 4) { + processIncomingData(stream, callback); + } + else { + callback(); + } + return; + } + messageHeader.fromCompressed = true; + messageHeader.opCode = message.readInt32LE(MESSAGE_HEADER_SIZE); + messageHeader.length = message.readInt32LE(MESSAGE_HEADER_SIZE + 4); + const compressorID = message[MESSAGE_HEADER_SIZE + 8]; + const compressedBuffer = message.slice(MESSAGE_HEADER_SIZE + 9); + // recalculate based on wrapped opcode + ResponseType = messageHeader.opCode === constants_1.OP_MSG ? commands_1.BinMsg : commands_1.Response; + compression_1.decompress(compressorID, compressedBuffer, (err, messageBody) => { + if (err || !messageBody) { + callback(err); + return; + } + if (messageBody.length !== messageHeader.length) { + callback(new error_1.MongoDecompressionError('Message body and message header must be the same length')); + return; + } + stream.emit('message', new ResponseType(message, messageHeader, messageBody)); + if (buffer.length >= 4) { + processIncomingData(stream, callback); + } + else { + callback(); + } + }); +} +//# sourceMappingURL=message_stream.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/message_stream.js.map b/node_modules/mongodb/lib/cmap/message_stream.js.map new file mode 100644 index 000000000..80370cea4 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/message_stream.js.map @@ -0,0 +1 @@ +{"version":3,"file":"message_stream.js","sourceRoot":"","sources":["../../src/cmap/message_stream.ts"],"names":[],"mappings":";;;AAAA,mCAA+C;AAC/C,yCAAmG;AACnG,oCAAoE;AACpE,yDAAkE;AAClE,6DAMqC;AAErC,oCAAgD;AAGhD,MAAM,mBAAmB,GAAG,EAAE,CAAC;AAC/B,MAAM,wBAAwB,GAAG,CAAC,CAAC,CAAC,kDAAkD;AAEtF,MAAM,0BAA0B,GAAG,IAAI,GAAG,IAAI,GAAG,EAAE,GAAG,CAAC,CAAC;AACxD,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AAwBjC;;;;GAIG;AACH,MAAa,aAAc,SAAQ,eAAM;IAMvC,YAAY,UAAgC,EAAE;QAC5C,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,kBAAkB,GAAG,OAAO,CAAC,kBAAkB,IAAI,0BAA0B,CAAC;QACnF,IAAI,CAAC,OAAO,CAAC,GAAG,IAAI,kBAAU,EAAE,CAAC;IACnC,CAAC;IAED,MAAM,CAAC,KAAa,EAAE,CAAU,EAAE,QAA0B;QAC1D,IAAI,CAAC,OAAO,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;QAC5B,mBAAmB,CAAC,IAAI,EAAE,QAAQ,CAAC,CAAC;IACtC,CAAC;IAED,KAAK,EAAC,UAAU;QACd,gFAAgF;QAChF,uCAAuC;QACvC,OAAO;IACT,CAAC;IAED,YAAY,CACV,OAAiC,EACjC,oBAA0C;QAE1C,6DAA6D;QAC7D,MAAM,cAAc,GAClB,oBAAoB,IAAI,oBAAoB,CAAC,gBAAgB;YAC3D,CAAC,CAAC,oBAAoB,CAAC,gBAAgB;YACvC,CAAC,CAAC,MAAM,CAAC;QACb,IAAI,cAAc,KAAK,MAAM,IAAI,CAAC,WAAW,CAAC,OAAO,CAAC,EAAE;YACtD,MAAM,IAAI,GAAG,OAAO,CAAC,KAAK,EAAE,CAAC;YAC7B,IAAI,CAAC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC;YAC5D,OAAO;SACR;QACD,kCAAkC;QAClC,MAAM,iCAAiC,GAAG,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,KAAK,EAAE,CAAC,CAAC;QACzE,MAAM,qBAAqB,GAAG,iCAAiC,CAAC,KAAK,CAAC,mBAAmB,CAAC,CAAC;QAE3F,6EAA6E;QAC7E,MAAM,qBAAqB,GAAG,iCAAiC,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;QAEhF,4BAA4B;QAC5B,sBAAQ,CAAC,EAAE,OAAO,EAAE,oBAAoB,EAAE,EAAE,qBAAqB,EAAE,CAAC,GAAG,EAAE,iBAAiB,EAAE,EAAE;YAC5F,IAAI,GAAG,IAAI,CAAC,iBAAiB,EAAE;gBAC7B,oBAAoB,CAAC,EAAE,CAAC,GAAG,CAAC,CAAC;gBAC7B,OAAO;aACR;YAED,wCAAwC;YACxC,MAAM,SAAS,GAAG,MAAM,CAAC,KAAK,CAAC,mBAAmB,CAAC,CAAC;YACpD,SAAS,CAAC,YAAY,CACpB,mBAAmB,GAAG,wBAAwB,GAAG,iBAAiB,CAAC,MAAM,EACzE,CAAC,CACF,CAAC,CAAC,gBAAgB;YACnB,SAAS,CAAC,YAAY,CAAC,OAAO,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC,CAAC,YAAY;
YAC1D,SAAS,CAAC,YAAY,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,oBAAoB;YAClD,SAAS,CAAC,YAAY,CAAC,yBAAa,EAAE,EAAE,CAAC,CAAC,CAAC,SAAS;YAEpD,kDAAkD;YAClD,MAAM,kBAAkB,GAAG,MAAM,CAAC,KAAK,CAAC,wBAAwB,CAAC,CAAC;YAClE,kBAAkB,CAAC,YAAY,CAAC,qBAAqB,EAAE,CAAC,CAAC,CAAC,CAAC,iBAAiB;YAC5E,kBAAkB,CAAC,YAAY,CAAC,qBAAqB,CAAC,MAAM,EAAE,CAAC,CAAC,CAAC,CAAC,sEAAsE;YACxI,kBAAkB,CAAC,UAAU,CAAC,wBAAU,CAAC,cAAc,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,eAAe;YAC7E,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,SAAS,EAAE,kBAAkB,EAAE,iBAAiB,CAAC,CAAC,CAAC,CAAC;QAC/E,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AArED,sCAqEC;AAED,mEAAmE;AACnE,uEAAuE;AACvE,SAAS,WAAW,CAAC,OAAiC;IACpD,MAAM,UAAU,GAAG,OAAO,YAAY,cAAG,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAE,OAAiB,CAAC,KAAK,CAAC;IACvF,MAAM,WAAW,GAAG,MAAM,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;IAC/C,OAAO,CAAC,oCAAsB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC;AAClD,CAAC;AAED,SAAS,mBAAmB,CAAC,MAAqB,EAAE,QAA0B;IAC5E,MAAM,MAAM,GAAG,MAAM,CAAC,OAAO,CAAC,CAAC;IAC/B,IAAI,MAAM,CAAC,MAAM,GAAG,CAAC,EAAE;QACrB,QAAQ,EAAE,CAAC;QACX,OAAO;KACR;IAED,MAAM,aAAa,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC;IACnD,IAAI,aAAa,GAAG,CAAC,EAAE;QACrB,QAAQ,CAAC,IAAI,uBAAe,CAAC,yBAAyB,aAAa,EAAE,CAAC,CAAC,CAAC;QACxE,OAAO;KACR;IAED,IAAI,aAAa,GAAG,MAAM,CAAC,kBAAkB,EAAE;QAC7C,QAAQ,CACN,IAAI,uBAAe,CACjB,yBAAyB,aAAa,kBAAkB,MAAM,CAAC,kBAAkB,EAAE,CACpF,CACF,CAAC;QACF,OAAO;KACR;IAED,IAAI,aAAa,GAAG,MAAM,CAAC,MAAM,EAAE;QACjC,QAAQ,EAAE,CAAC;QACX,OAAO;KACR;IAED,MAAM,OAAO,GAAG,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;IAC3C,MAAM,aAAa,GAAkB;QACnC,MAAM,EAAE,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC;QAC9B,SAAS,EAAE,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC;QACjC,UAAU,EAAE,OAAO,CAAC,WAAW,CAAC,CAAC,CAAC;QAClC,MAAM,EAAE,OAAO,CAAC,WAAW,CAAC,EAAE,CAAC;KAChC,CAAC;IAEF,IAAI,YAAY,GAAG,aAAa,CAAC,MAAM,KAAK,kBAAM,CAAC,CAAC,CAAC,iBAAM,CAAC,CAAC,CAAC,mBAAQ,CAAC;IACvE,IAAI,aAAa,CAAC,MAAM,KAAK,yBAAa,EAAE;QAC1C,MAAM,WAAW,GAAG,OAAO,CAAC,KAAK,CAAC,mBAAmB,CAAC,CAAC;QACvD,MAAM,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,YAAY,CAAC,OAAO,EAAE,aAAa,EAAE,WAAW,CAAC,CAAC,CAAC;QAE9E,IAAI,MAAM,CAAC,MAAM,IAAI,CAAC,EAAE;YACtB,mBAAmB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;SACvC;aAAM;YACL,QAAQ,EAAE,CAAC;SACZ;QAED,OAAO;KACR;IAED,aAAa,CAAC,cAAc,GAAG,IAAI,CAAC;IACpC,aAAa,CAAC,MAAM,GAAG,OAAO,CAAC,WAAW,CAAC,mBAAmB,CAAC,CAAC;IAChE,aAAa,CAAC,MAAM,GAAG,OAAO,CAAC,WAAW,CAAC,mBAAmB,GAAG,CAAC,CAAC,CAAC;IACpE,MAAM,YAAY,GAAe,OAAO,CAAC,mBAAmB,GAAG,CAAC,CAAe,CAAC;IAChF,MAAM,gBAAgB,GAAG,OAAO,CAAC,KAAK,CAAC,mBAAmB,GAAG,CAAC,CAAC,CAAC;IAEhE,sCAAsC;IACtC,YAAY,GAAG,aAAa,CAAC,MAAM,KAAK,kBAAM,CAAC,CAAC,CAAC,iBAAM,CAAC,CAAC,CAAC,mBAAQ,CAAC;IACnE,wBAAU,CAAC,YAAY,EAAE,gBAAgB,EAAE,CAAC,GAAG,EAAE,WAAW,EAAE,EAAE;QAC9D,IAAI,GAAG,IAAI,CAAC,WAAW,EAAE;YACvB,QAAQ,CAAC,GAAG,CAAC,CAAC;YACd,OAAO;SACR;QAED,IAAI,WAAW,CAAC,MAAM,KAAK,aAAa,CAAC,MAAM,EAAE;YAC/C,QAAQ,CACN,IAAI,+BAAuB,CAAC,yDAAyD,CAAC,CACvF,CAAC;YAEF,OAAO;SACR;QAED,MAAM,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,YAAY,CAAC,OAAO,EAAE,aAAa,EAAE,WAAW,CAAC,CAAC,CAAC;QAE9E,IAAI,MAAM,CAAC,MAAM,IAAI,CAAC,EAAE;YACtB,mBAAmB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;SACvC;aAAM;YACL,QAAQ,EAAE,CAAC;SACZ;IACH,CAAC,CAAC,CAAC;AACL,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/metrics.js b/node_modules/mongodb/lib/cmap/metrics.js new file mode 100644 index 000000000..89c221091 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/metrics.js @@ -0,0 +1,62 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ConnectionPoolMetrics = void 0; +/** @internal */ +class ConnectionPoolMetrics { + constructor() { + this.txnConnections = 0; + this.cursorConnections = 0; + this.otherConnections = 0; 
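+        // These counters track connections currently pinned to transactions, cursors,
+        // and other operations; `info()` folds them into the wait-queue timeout
+        // message and `reset()` zeroes them again.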
+ } + /** + * Mark a connection as pinned for a specific operation. + */ + markPinned(pinType) { + if (pinType === ConnectionPoolMetrics.TXN) { + this.txnConnections += 1; + } + else if (pinType === ConnectionPoolMetrics.CURSOR) { + this.cursorConnections += 1; + } + else { + this.otherConnections += 1; + } + } + /** + * Unmark a connection as pinned for an operation. + */ + markUnpinned(pinType) { + if (pinType === ConnectionPoolMetrics.TXN) { + this.txnConnections -= 1; + } + else if (pinType === ConnectionPoolMetrics.CURSOR) { + this.cursorConnections -= 1; + } + else { + this.otherConnections -= 1; + } + } + /** + * Return information about the cmap metrics as a string. + */ + info(maxPoolSize) { + return ('Timed out while checking out a connection from connection pool: ' + + `maxPoolSize: ${maxPoolSize}, ` + + `connections in use by cursors: ${this.cursorConnections}, ` + + `connections in use by transactions: ${this.txnConnections}, ` + + `connections in use by other operations: ${this.otherConnections}`); + } + /** + * Reset the metrics to the initial values. + */ + reset() { + this.txnConnections = 0; + this.cursorConnections = 0; + this.otherConnections = 0; + } +} +exports.ConnectionPoolMetrics = ConnectionPoolMetrics; +ConnectionPoolMetrics.TXN = 'txn'; +ConnectionPoolMetrics.CURSOR = 'cursor'; +ConnectionPoolMetrics.OTHER = 'other'; +//# sourceMappingURL=metrics.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/metrics.js.map b/node_modules/mongodb/lib/cmap/metrics.js.map new file mode 100644 index 000000000..270cefe79 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/metrics.js.map @@ -0,0 +1 @@ +{"version":3,"file":"metrics.js","sourceRoot":"","sources":["../../src/cmap/metrics.ts"],"names":[],"mappings":";;;AAAA,gBAAgB;AAChB,MAAa,qBAAqB;IAAlC;QAKE,mBAAc,GAAG,CAAC,CAAC;QACnB,sBAAiB,GAAG,CAAC,CAAC;QACtB,qBAAgB,GAAG,CAAC,CAAC;IAiDvB,CAAC;IA/CC;;OAEG;IACH,UAAU,CAAC,OAAe;QACxB,IAAI,OAAO,KAAK,qBAAqB,CAAC,GAAG,EAAE;YACzC,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;SAC1B;aAAM,IAAI,OAAO,KAAK,qBAAqB,CAAC,MAAM,EAAE;YACnD,IAAI,CAAC,iBAAiB,IAAI,CAAC,CAAC;SAC7B;aAAM;YACL,IAAI,CAAC,gBAAgB,IAAI,CAAC,CAAC;SAC5B;IACH,CAAC;IAED;;OAEG;IACH,YAAY,CAAC,OAAe;QAC1B,IAAI,OAAO,KAAK,qBAAqB,CAAC,GAAG,EAAE;YACzC,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;SAC1B;aAAM,IAAI,OAAO,KAAK,qBAAqB,CAAC,MAAM,EAAE;YACnD,IAAI,CAAC,iBAAiB,IAAI,CAAC,CAAC;SAC7B;aAAM;YACL,IAAI,CAAC,gBAAgB,IAAI,CAAC,CAAC;SAC5B;IACH,CAAC;IAED;;OAEG;IACH,IAAI,CAAC,WAAmB;QACtB,OAAO,CACL,kEAAkE;YAClE,gBAAgB,WAAW,IAAI;YAC/B,kCAAkC,IAAI,CAAC,iBAAiB,IAAI;YAC5D,uCAAuC,IAAI,CAAC,cAAc,IAAI;YAC9D,2CAA2C,IAAI,CAAC,gBAAgB,EAAE,CACnE,CAAC;IACJ,CAAC;IAED;;OAEG;IACH,KAAK;QACH,IAAI,CAAC,cAAc,GAAG,CAAC,CAAC;QACxB,IAAI,CAAC,iBAAiB,GAAG,CAAC,CAAC;QAC3B,IAAI,CAAC,gBAAgB,GAAG,CAAC,CAAC;IAC5B,CAAC;;AAvDH,sDAwDC;AAvDiB,yBAAG,GAAG,KAAc,CAAC;AACrB,4BAAM,GAAG,QAAiB,CAAC;AAC3B,2BAAK,GAAG,OAAgB,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/stream_description.js b/node_modules/mongodb/lib/cmap/stream_description.js new file mode 100644 index 000000000..b9b2d27c9 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/stream_description.js @@ -0,0 +1,48 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.StreamDescription = void 0; +const server_description_1 = require("../sdam/server_description"); +const common_1 = require("../sdam/common"); +const RESPONSE_FIELDS = [ + 'minWireVersion', + 'maxWireVersion', + 'maxBsonObjectSize', + 'maxMessageSizeBytes', + 'maxWriteBatchSize', + 
'logicalSessionTimeoutMinutes' +]; +/** @public */ +class StreamDescription { + constructor(address, options) { + this.address = address; + this.type = common_1.ServerType.Unknown; + this.minWireVersion = undefined; + this.maxWireVersion = undefined; + this.maxBsonObjectSize = 16777216; + this.maxMessageSizeBytes = 48000000; + this.maxWriteBatchSize = 100000; + this.logicalSessionTimeoutMinutes = options === null || options === void 0 ? void 0 : options.logicalSessionTimeoutMinutes; + this.loadBalanced = !!(options === null || options === void 0 ? void 0 : options.loadBalanced); + this.compressors = + options && options.compressors && Array.isArray(options.compressors) + ? options.compressors + : []; + } + receiveResponse(response) { + this.type = server_description_1.parseServerType(response); + RESPONSE_FIELDS.forEach(field => { + if (typeof response[field] != null) { + this[field] = response[field]; + } + // testing case + if ('__nodejs_mock_server__' in response) { + this.__nodejs_mock_server__ = response['__nodejs_mock_server__']; + } + }); + if (response.compression) { + this.compressor = this.compressors.filter(c => { var _a; return (_a = response.compression) === null || _a === void 0 ? void 0 : _a.includes(c); })[0]; + } + } +} +exports.StreamDescription = StreamDescription; +//# sourceMappingURL=stream_description.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/stream_description.js.map b/node_modules/mongodb/lib/cmap/stream_description.js.map new file mode 100644 index 000000000..1d281b2d9 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/stream_description.js.map @@ -0,0 +1 @@ +{"version":3,"file":"stream_description.js","sourceRoot":"","sources":["../../src/cmap/stream_description.ts"],"names":[],"mappings":";;;AAAA,mEAA6D;AAG7D,2CAA4C;AAE5C,MAAM,eAAe,GAAG;IACtB,gBAAgB;IAChB,gBAAgB;IAChB,mBAAmB;IACnB,qBAAqB;IACrB,mBAAmB;IACnB,8BAA8B;CACtB,CAAC;AASX,cAAc;AACd,MAAa,iBAAiB;IAiB5B,YAAY,OAAe,EAAE,OAAkC;QAC7D,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,IAAI,GAAG,mBAAU,CAAC,OAAO,CAAC;QAC/B,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC;QAChC,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC;QAChC,IAAI,CAAC,iBAAiB,GAAG,QAAQ,CAAC;QAClC,IAAI,CAAC,mBAAmB,GAAG,QAAQ,CAAC;QACpC,IAAI,CAAC,iBAAiB,GAAG,MAAM,CAAC;QAChC,IAAI,CAAC,4BAA4B,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,4BAA4B,CAAC;QAC1E,IAAI,CAAC,YAAY,GAAG,CAAC,CAAC,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,YAAY,CAAA,CAAC;QAC5C,IAAI,CAAC,WAAW;YACd,OAAO,IAAI,OAAO,CAAC,WAAW,IAAI,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,WAAW,CAAC;gBAClE,CAAC,CAAC,OAAO,CAAC,WAAW;gBACrB,CAAC,CAAC,EAAE,CAAC;IACX,CAAC;IAED,eAAe,CAAC,QAAkB;QAChC,IAAI,CAAC,IAAI,GAAG,oCAAe,CAAC,QAAQ,CAAC,CAAC;QACtC,eAAe,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE;YAC9B,IAAI,OAAO,QAAQ,CAAC,KAAK,CAAC,IAAI,IAAI,EAAE;gBAClC,IAAI,CAAC,KAAK,CAAC,GAAG,QAAQ,CAAC,KAAK,CAAC,CAAC;aAC/B;YAED,eAAe;YACf,IAAI,wBAAwB,IAAI,QAAQ,EAAE;gBACxC,IAAI,CAAC,sBAAsB,GAAG,QAAQ,CAAC,wBAAwB,CAAC,CAAC;aAClE;QACH,CAAC,CAAC,CAAC;QAEH,IAAI,QAAQ,CAAC,WAAW,EAAE;YACxB,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC,WAAW,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,WAAC,OAAA,MAAA,QAAQ,CAAC,WAAW,0CAAE,QAAQ,CAAC,CAAC,CAAC,CAAA,EAAA,CAAC,CAAC,CAAC,CAAC,CAAC;SACtF;IACH,CAAC;CACF;AAlDD,8CAkDC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/wire_protocol/compression.js b/node_modules/mongodb/lib/cmap/wire_protocol/compression.js new file mode 100644 index 000000000..7994189e4 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/wire_protocol/compression.js @@ -0,0 +1,83 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.decompress = 
exports.compress = exports.uncompressibleCommands = exports.Compressor = void 0; +const zlib = require("zlib"); +const deps_1 = require("../../deps"); +const error_1 = require("../../error"); +/** @public */ +exports.Compressor = Object.freeze({ + none: 0, + snappy: 1, + zlib: 2 +}); +exports.uncompressibleCommands = new Set([ + 'ismaster', + 'saslStart', + 'saslContinue', + 'getnonce', + 'authenticate', + 'createUser', + 'updateUser', + 'copydbSaslStart', + 'copydbgetnonce', + 'copydb' +]); +// Facilitate compressing a message using an agreed compressor +function compress(self, dataToBeCompressed, callback) { + const zlibOptions = {}; + switch (self.options.agreedCompressor) { + case 'snappy': { + if ('kModuleError' in deps_1.Snappy) { + return callback(deps_1.Snappy['kModuleError']); + } + if (deps_1.Snappy[deps_1.PKG_VERSION].major <= 6) { + deps_1.Snappy.compress(dataToBeCompressed, callback); + } + else { + deps_1.Snappy.compress(dataToBeCompressed) + .then(buffer => callback(undefined, buffer)) + .catch(error => callback(error)); + } + break; + } + case 'zlib': + // Determine zlibCompressionLevel + if (self.options.zlibCompressionLevel) { + zlibOptions.level = self.options.zlibCompressionLevel; + } + zlib.deflate(dataToBeCompressed, zlibOptions, callback); + break; + default: + throw new error_1.MongoInvalidArgumentError(`Unknown compressor ${self.options.agreedCompressor} failed to compress`); + } +} +exports.compress = compress; +// Decompress a message using the given compressor +function decompress(compressorID, compressedData, callback) { + if (compressorID < 0 || compressorID > Math.max(2)) { + throw new error_1.MongoDecompressionError(`Server sent message compressed using an unsupported compressor. (Received compressor ID ${compressorID})`); + } + switch (compressorID) { + case exports.Compressor.snappy: { + if ('kModuleError' in deps_1.Snappy) { + return callback(deps_1.Snappy['kModuleError']); + } + if (deps_1.Snappy[deps_1.PKG_VERSION].major <= 6) { + deps_1.Snappy.uncompress(compressedData, { asBuffer: true }, callback); + } + else { + deps_1.Snappy.uncompress(compressedData, { asBuffer: true }) + .then(buffer => callback(undefined, buffer)) + .catch(error => callback(error)); + } + break; + } + case exports.Compressor.zlib: + zlib.inflate(compressedData, callback); + break; + default: + callback(undefined, compressedData); + } +} +exports.decompress = decompress; +//# sourceMappingURL=compression.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/wire_protocol/compression.js.map b/node_modules/mongodb/lib/cmap/wire_protocol/compression.js.map new file mode 100644 index 000000000..18a16c7fe --- /dev/null +++ b/node_modules/mongodb/lib/cmap/wire_protocol/compression.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"compression.js","sourceRoot":"","sources":["../../../src/cmap/wire_protocol/compression.ts"],"names":[],"mappings":";;;AAAA,6BAA6B;AAI7B,qCAAiD;AACjD,uCAAiF;AAEjF,cAAc;AACD,QAAA,UAAU,GAAG,MAAM,CAAC,MAAM,CAAC;IACtC,IAAI,EAAE,CAAC;IACP,MAAM,EAAE,CAAC;IACT,IAAI,EAAE,CAAC;CACC,CAAC,CAAC;AAQC,QAAA,sBAAsB,GAAG,IAAI,GAAG,CAAC;IAC5C,UAAU;IACV,WAAW;IACX,cAAc;IACd,UAAU;IACV,cAAc;IACd,YAAY;IACZ,YAAY;IACZ,iBAAiB;IACjB,gBAAgB;IAChB,QAAQ;CACT,CAAC,CAAC;AAEH,8DAA8D;AAC9D,SAAgB,QAAQ,CACtB,IAA0D,EAC1D,kBAA0B,EAC1B,QAA0B;IAE1B,MAAM,WAAW,GAAG,EAAsB,CAAC;IAC3C,QAAQ,IAAI,CAAC,OAAO,CAAC,gBAAgB,EAAE;QACrC,KAAK,QAAQ,CAAC,CAAC;YACb,IAAI,cAAc,IAAI,aAAM,EAAE;gBAC5B,OAAO,QAAQ,CAAC,aAAM,CAAC,cAAc,CAAC,CAAC,CAAC;aACzC;YAED,IAAI,aAAM,CAAC,kBAAW,CAAC,CAAC,KAAK,IAAI,CAAC,EAAE;gBAClC,aAAM,CAAC,QAAQ,CAAC,kBAAkB,EAAE,QAAQ,CAAC,CAAC;aAC/C;iBAAM;gBACL,aAAM,CAAC,QAAQ,CAAC,kBAAkB,CAAC;qBAChC,IAAI,CAAC,MAAM,CAAC,EAAE,CAAC,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;qBAC3C,KAAK,CAAC,KAAK,CAAC,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC;aACpC;YACD,MAAM;SACP;QACD,KAAK,MAAM;YACT,iCAAiC;YACjC,IAAI,IAAI,CAAC,OAAO,CAAC,oBAAoB,EAAE;gBACrC,WAAW,CAAC,KAAK,GAAG,IAAI,CAAC,OAAO,CAAC,oBAAoB,CAAC;aACvD;YACD,IAAI,CAAC,OAAO,CAAC,kBAAkB,EAAE,WAAW,EAAE,QAAiC,CAAC,CAAC;YACjF,MAAM;QACR;YACE,MAAM,IAAI,iCAAyB,CACjC,sBAAsB,IAAI,CAAC,OAAO,CAAC,gBAAgB,qBAAqB,CACzE,CAAC;KACL;AACH,CAAC;AAjCD,4BAiCC;AAED,kDAAkD;AAClD,SAAgB,UAAU,CACxB,YAAwB,EACxB,cAAsB,EACtB,QAA0B;IAE1B,IAAI,YAAY,GAAG,CAAC,IAAI,YAAY,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE;QAClD,MAAM,IAAI,+BAAuB,CAC/B,2FAA2F,YAAY,GAAG,CAC3G,CAAC;KACH;IAED,QAAQ,YAAY,EAAE;QACpB,KAAK,kBAAU,CAAC,MAAM,CAAC,CAAC;YACtB,IAAI,cAAc,IAAI,aAAM,EAAE;gBAC5B,OAAO,QAAQ,CAAC,aAAM,CAAC,cAAc,CAAC,CAAC,CAAC;aACzC;YAED,IAAI,aAAM,CAAC,kBAAW,CAAC,CAAC,KAAK,IAAI,CAAC,EAAE;gBAClC,aAAM,CAAC,UAAU,CAAC,cAAc,EAAE,EAAE,QAAQ,EAAE,IAAI,EAAE,EAAE,QAAQ,CAAC,CAAC;aACjE;iBAAM;gBACL,aAAM,CAAC,UAAU,CAAC,cAAc,EAAE,EAAE,QAAQ,EAAE,IAAI,EAAE,CAAC;qBAClD,IAAI,CAAC,MAAM,CAAC,EAAE,CAAC,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;qBAC3C,KAAK,CAAC,KAAK,CAAC,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC;aACpC;YACD,MAAM;SACP;QACD,KAAK,kBAAU,CAAC,IAAI;YAClB,IAAI,CAAC,OAAO,CAAC,cAAc,EAAE,QAAiC,CAAC,CAAC;YAChE,MAAM;QACR;YACE,QAAQ,CAAC,SAAS,EAAE,cAAc,CAAC,CAAC;KACvC;AACH,CAAC;AAhCD,gCAgCC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/wire_protocol/constants.js b/node_modules/mongodb/lib/cmap/wire_protocol/constants.js new file mode 100644 index 000000000..352d2e7d2 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/wire_protocol/constants.js @@ -0,0 +1,17 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.OP_MSG = exports.OP_COMPRESSED = exports.OP_KILL_CURSORS = exports.OP_DELETE = exports.OP_GETMORE = exports.OP_QUERY = exports.OP_INSERT = exports.OP_UPDATE = exports.OP_REPLY = exports.MAX_SUPPORTED_WIRE_VERSION = exports.MIN_SUPPORTED_WIRE_VERSION = exports.MAX_SUPPORTED_SERVER_VERSION = exports.MIN_SUPPORTED_SERVER_VERSION = void 0; +exports.MIN_SUPPORTED_SERVER_VERSION = '2.6'; +exports.MAX_SUPPORTED_SERVER_VERSION = '5.0'; +exports.MIN_SUPPORTED_WIRE_VERSION = 2; +exports.MAX_SUPPORTED_WIRE_VERSION = 13; +exports.OP_REPLY = 1; +exports.OP_UPDATE = 2001; +exports.OP_INSERT = 2002; +exports.OP_QUERY = 2004; +exports.OP_GETMORE = 2005; +exports.OP_DELETE = 2006; +exports.OP_KILL_CURSORS = 2007; +exports.OP_COMPRESSED = 2012; +exports.OP_MSG = 2013; +//# sourceMappingURL=constants.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/wire_protocol/constants.js.map 
b/node_modules/mongodb/lib/cmap/wire_protocol/constants.js.map new file mode 100644 index 000000000..9dc053b22 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/wire_protocol/constants.js.map @@ -0,0 +1 @@ +{"version":3,"file":"constants.js","sourceRoot":"","sources":["../../../src/cmap/wire_protocol/constants.ts"],"names":[],"mappings":";;;AAAa,QAAA,4BAA4B,GAAG,KAAK,CAAC;AACrC,QAAA,4BAA4B,GAAG,KAAK,CAAC;AACrC,QAAA,0BAA0B,GAAG,CAAC,CAAC;AAC/B,QAAA,0BAA0B,GAAG,EAAE,CAAC;AAChC,QAAA,QAAQ,GAAG,CAAC,CAAC;AACb,QAAA,SAAS,GAAG,IAAI,CAAC;AACjB,QAAA,SAAS,GAAG,IAAI,CAAC;AACjB,QAAA,QAAQ,GAAG,IAAI,CAAC;AAChB,QAAA,UAAU,GAAG,IAAI,CAAC;AAClB,QAAA,SAAS,GAAG,IAAI,CAAC;AACjB,QAAA,eAAe,GAAG,IAAI,CAAC;AACvB,QAAA,aAAa,GAAG,IAAI,CAAC;AACrB,QAAA,MAAM,GAAG,IAAI,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/wire_protocol/shared.js b/node_modules/mongodb/lib/cmap/wire_protocol/shared.js new file mode 100644 index 000000000..44fdf0632 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/wire_protocol/shared.js @@ -0,0 +1,51 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.isSharded = exports.applyCommonQueryOptions = exports.getReadPreference = void 0; +const common_1 = require("../../sdam/common"); +const topology_description_1 = require("../../sdam/topology_description"); +const error_1 = require("../../error"); +const read_preference_1 = require("../../read_preference"); +function getReadPreference(cmd, options) { + // Default to command version of the readPreference + let readPreference = cmd.readPreference || read_preference_1.ReadPreference.primary; + // If we have an option readPreference override the command one + if (options === null || options === void 0 ? void 0 : options.readPreference) { + readPreference = options.readPreference; + } + if (typeof readPreference === 'string') { + readPreference = read_preference_1.ReadPreference.fromString(readPreference); + } + if (!(readPreference instanceof read_preference_1.ReadPreference)) { + throw new error_1.MongoInvalidArgumentError('Option "readPreference" must be a ReadPreference instance'); + } + return readPreference; +} +exports.getReadPreference = getReadPreference; +function applyCommonQueryOptions(queryOptions, options) { + Object.assign(queryOptions, { + raw: typeof options.raw === 'boolean' ? options.raw : false, + promoteLongs: typeof options.promoteLongs === 'boolean' ? options.promoteLongs : true, + promoteValues: typeof options.promoteValues === 'boolean' ? options.promoteValues : true, + promoteBuffers: typeof options.promoteBuffers === 'boolean' ? options.promoteBuffers : false, + bsonRegExp: typeof options.bsonRegExp === 'boolean' ? options.bsonRegExp : false + }); + if (options.session) { + queryOptions.session = options.session; + } + return queryOptions; +} +exports.applyCommonQueryOptions = applyCommonQueryOptions; +function isSharded(topologyOrServer) { + if (topologyOrServer.description && topologyOrServer.description.type === common_1.ServerType.Mongos) { + return true; + } + // NOTE: This is incredibly inefficient, and should be removed once command construction + // happens based on `Server` not `Topology`. 
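// Illustrative sketch, not driver source: it mirrors the check performed just below,
// scanning a TopologyDescription-like object's servers map for a mongos router.
// MONGOS_TYPE, sketchIsShardedTopology and fakeDescription are placeholders; the
// string 'Mongos' is assumed here to match ServerType.Mongos from ../../sdam/common.
const MONGOS_TYPE = 'Mongos';

function sketchIsShardedTopology(description) {
  const servers = Array.from(description.servers.values());
  return servers.some(server => server.type === MONGOS_TYPE);
}

// A two-member description where the second member is a mongos router:
const fakeDescription = {
  servers: new Map([
    ['host1:27017', { type: 'RSPrimary' }],
    ['host2:27017', { type: MONGOS_TYPE }]
  ])
};
// sketchIsShardedTopology(fakeDescription) returns true.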
+ if (topologyOrServer.description && topologyOrServer.description instanceof topology_description_1.TopologyDescription) { + const servers = Array.from(topologyOrServer.description.servers.values()); + return servers.some((server) => server.type === common_1.ServerType.Mongos); + } + return false; +} +exports.isSharded = isSharded; +//# sourceMappingURL=shared.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cmap/wire_protocol/shared.js.map b/node_modules/mongodb/lib/cmap/wire_protocol/shared.js.map new file mode 100644 index 000000000..b7f7ae1e9 --- /dev/null +++ b/node_modules/mongodb/lib/cmap/wire_protocol/shared.js.map @@ -0,0 +1 @@ +{"version":3,"file":"shared.js","sourceRoot":"","sources":["../../../src/cmap/wire_protocol/shared.ts"],"names":[],"mappings":";;;AAAA,8CAA+C;AAC/C,0EAAsE;AACtE,uCAAwD;AACxD,2DAAuD;AAcvD,SAAgB,iBAAiB,CAAC,GAAa,EAAE,OAA8B;IAC7E,mDAAmD;IACnD,IAAI,cAAc,GAAG,GAAG,CAAC,cAAc,IAAI,gCAAc,CAAC,OAAO,CAAC;IAClE,+DAA+D;IAC/D,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,cAAc,EAAE;QAC3B,cAAc,GAAG,OAAO,CAAC,cAAc,CAAC;KACzC;IAED,IAAI,OAAO,cAAc,KAAK,QAAQ,EAAE;QACtC,cAAc,GAAG,gCAAc,CAAC,UAAU,CAAC,cAAc,CAAC,CAAC;KAC5D;IAED,IAAI,CAAC,CAAC,cAAc,YAAY,gCAAc,CAAC,EAAE;QAC/C,MAAM,IAAI,iCAAyB,CACjC,2DAA2D,CAC5D,CAAC;KACH;IAED,OAAO,cAAc,CAAC;AACxB,CAAC;AAnBD,8CAmBC;AAED,SAAgB,uBAAuB,CACrC,YAA4B,EAC5B,OAAuB;IAEvB,MAAM,CAAC,MAAM,CAAC,YAAY,EAAE;QAC1B,GAAG,EAAE,OAAO,OAAO,CAAC,GAAG,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,KAAK;QAC3D,YAAY,EAAE,OAAO,OAAO,CAAC,YAAY,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC,CAAC,IAAI;QACrF,aAAa,EAAE,OAAO,OAAO,CAAC,aAAa,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,CAAC,CAAC,CAAC,IAAI;QACxF,cAAc,EAAE,OAAO,OAAO,CAAC,cAAc,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,KAAK;QAC5F,UAAU,EAAE,OAAO,OAAO,CAAC,UAAU,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK;KACjF,CAAC,CAAC;IAEH,IAAI,OAAO,CAAC,OAAO,EAAE;QACnB,YAAY,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;KACxC;IAED,OAAO,YAAY,CAAC;AACtB,CAAC;AAjBD,0DAiBC;AAED,SAAgB,SAAS,CAAC,gBAAgD;IACxE,IAAI,gBAAgB,CAAC,WAAW,IAAI,gBAAgB,CAAC,WAAW,CAAC,IAAI,KAAK,mBAAU,CAAC,MAAM,EAAE;QAC3F,OAAO,IAAI,CAAC;KACb;IAED,wFAAwF;IACxF,kDAAkD;IAClD,IAAI,gBAAgB,CAAC,WAAW,IAAI,gBAAgB,CAAC,WAAW,YAAY,0CAAmB,EAAE;QAC/F,MAAM,OAAO,GAAwB,KAAK,CAAC,IAAI,CAAC,gBAAgB,CAAC,WAAW,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC;QAC/F,OAAO,OAAO,CAAC,IAAI,CAAC,CAAC,MAAyB,EAAE,EAAE,CAAC,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,MAAM,CAAC,CAAC;KACvF;IAED,OAAO,KAAK,CAAC;AACf,CAAC;AAbD,8BAaC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/collection.js b/node_modules/mongodb/lib/collection.js new file mode 100644 index 000000000..e5aafebe2 --- /dev/null +++ b/node_modules/mongodb/lib/collection.js @@ -0,0 +1,491 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Collection = void 0; +const utils_1 = require("./utils"); +const read_preference_1 = require("./read_preference"); +const utils_2 = require("./utils"); +const bson_1 = require("./bson"); +const error_1 = require("./error"); +const unordered_1 = require("./bulk/unordered"); +const ordered_1 = require("./bulk/ordered"); +const change_stream_1 = require("./change_stream"); +const write_concern_1 = require("./write_concern"); +const read_concern_1 = require("./read_concern"); +const aggregation_cursor_1 = require("./cursor/aggregation_cursor"); +const bulk_write_1 = require("./operations/bulk_write"); +const count_documents_1 = require("./operations/count_documents"); +const indexes_1 = require("./operations/indexes"); +const 
distinct_1 = require("./operations/distinct"); +const drop_1 = require("./operations/drop"); +const estimated_document_count_1 = require("./operations/estimated_document_count"); +const find_and_modify_1 = require("./operations/find_and_modify"); +const insert_1 = require("./operations/insert"); +const update_1 = require("./operations/update"); +const delete_1 = require("./operations/delete"); +const is_capped_1 = require("./operations/is_capped"); +const map_reduce_1 = require("./operations/map_reduce"); +const options_operation_1 = require("./operations/options_operation"); +const rename_1 = require("./operations/rename"); +const stats_1 = require("./operations/stats"); +const execute_operation_1 = require("./operations/execute_operation"); +const find_cursor_1 = require("./cursor/find_cursor"); +/** + * The **Collection** class is an internal class that embodies a MongoDB collection + * allowing for insert/update/remove/find and other command operation on that MongoDB collection. + * + * **COLLECTION Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Create a collection we want to drop later + * const col = client.db(dbName).collection('createIndexExample1'); + * // Show that duplicate records got dropped + * col.find({}).toArray(function(err, items) { + * expect(err).to.not.exist; + * test.equal(4, items.length); + * client.close(); + * }); + * }); + * ``` + */ +class Collection { + /** + * Create a new Collection instance + * @internal + */ + constructor(db, name, options) { + var _a, _b; + utils_2.checkCollectionName(name); + // Internal state + this.s = { + db, + options, + namespace: new utils_2.MongoDBNamespace(db.databaseName, name), + pkFactory: (_b = (_a = db.options) === null || _a === void 0 ? void 0 : _a.pkFactory) !== null && _b !== void 0 ? _b : utils_1.DEFAULT_PK_FACTORY, + readPreference: read_preference_1.ReadPreference.fromOptions(options), + bsonOptions: bson_1.resolveBSONOptions(options, db), + readConcern: read_concern_1.ReadConcern.fromOptions(options), + writeConcern: write_concern_1.WriteConcern.fromOptions(options), + slaveOk: options == null || options.slaveOk == null ? db.slaveOk : options.slaveOk + }; + } + /** + * The name of the database this collection belongs to + */ + get dbName() { + return this.s.namespace.db; + } + /** + * The name of this collection + */ + get collectionName() { + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + return this.s.namespace.collection; + } + /** + * The namespace of this collection, in the format `${this.dbName}.${this.collectionName}` + */ + get namespace() { + return this.s.namespace.toString(); + } + /** + * The current readConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get readConcern() { + if (this.s.readConcern == null) { + return this.s.db.readConcern; + } + return this.s.readConcern; + } + /** + * The current readPreference of the collection. 
If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get readPreference() { + if (this.s.readPreference == null) { + return this.s.db.readPreference; + } + return this.s.readPreference; + } + get bsonOptions() { + return this.s.bsonOptions; + } + /** + * The current writeConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get writeConcern() { + if (this.s.writeConcern == null) { + return this.s.db.writeConcern; + } + return this.s.writeConcern; + } + /** The current index hint for the collection */ + get hint() { + return this.s.collectionHint; + } + set hint(v) { + this.s.collectionHint = utils_2.normalizeHintField(v); + } + insertOne(doc, options, callback) { + if (typeof options === 'function') { + callback = options; + options = {}; + } + // CSFLE passes in { w: 'majority' } to ensure the lib works in both 3.x and 4.x + // we support that option style here only + if (options && Reflect.get(options, 'w')) { + options.writeConcern = write_concern_1.WriteConcern.fromOptions(Reflect.get(options, 'w')); + } + return execute_operation_1.executeOperation(utils_2.getTopology(this), new insert_1.InsertOneOperation(this, doc, utils_1.resolveOptions(this, options)), callback); + } + insertMany(docs, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options ? Object.assign({}, options) : { ordered: true }; + return execute_operation_1.executeOperation(utils_2.getTopology(this), new insert_1.InsertManyOperation(this, docs, utils_1.resolveOptions(this, options)), callback); + } + bulkWrite(operations, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options || { ordered: true }; + if (!Array.isArray(operations)) { + throw new error_1.MongoInvalidArgumentError('Argument "operations" must be an array of documents'); + } + return execute_operation_1.executeOperation(utils_2.getTopology(this), new bulk_write_1.BulkWriteOperation(this, operations, utils_1.resolveOptions(this, options)), callback); + } + updateOne(filter, update, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new update_1.UpdateOneOperation(this, filter, update, utils_1.resolveOptions(this, options)), callback); + } + replaceOne(filter, replacement, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new update_1.ReplaceOneOperation(this, filter, replacement, utils_1.resolveOptions(this, options)), callback); + } + updateMany(filter, update, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new update_1.UpdateManyOperation(this, filter, update, utils_1.resolveOptions(this, options)), callback); + } + deleteOne(filter, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new delete_1.DeleteOneOperation(this, filter, utils_1.resolveOptions(this, options)), callback); + } + deleteMany(filter, options, callback) { + if (filter == null) { + filter = {}; + options = {}; + callback = undefined; + } + else if (typeof filter === 'function') { + 
callback = filter; + filter = {}; + options = {}; + } + else if (typeof options === 'function') { + callback = options; + options = {}; + } + return execute_operation_1.executeOperation(utils_2.getTopology(this), new delete_1.DeleteManyOperation(this, filter, utils_1.resolveOptions(this, options)), callback); + } + rename(newName, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + // Intentionally, we do not inherit options from parent for this operation. + return execute_operation_1.executeOperation(utils_2.getTopology(this), new rename_1.RenameOperation(this, newName, { + ...options, + readPreference: read_preference_1.ReadPreference.PRIMARY + }), callback); + } + drop(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return execute_operation_1.executeOperation(utils_2.getTopology(this), new drop_1.DropCollectionOperation(this.s.db, this.collectionName, options), callback); + } + findOne(filter, options, callback) { + if (callback != null && typeof callback !== 'function') { + throw new error_1.MongoInvalidArgumentError('Third parameter to `findOne()` must be a callback or undefined'); + } + if (typeof filter === 'function') { + callback = filter; + filter = {}; + options = {}; + } + if (typeof options === 'function') { + callback = options; + options = {}; + } + const finalFilter = filter !== null && filter !== void 0 ? filter : {}; + const finalOptions = options !== null && options !== void 0 ? options : {}; + return this.find(finalFilter, finalOptions).limit(-1).batchSize(1).next(callback); + } + find(filter, options) { + if (arguments.length > 2) { + throw new error_1.MongoInvalidArgumentError('Method "collection.find()" accepts at most two arguments'); + } + if (typeof options === 'function') { + throw new error_1.MongoInvalidArgumentError('Argument "options" must not be function'); + } + return new find_cursor_1.FindCursor(utils_2.getTopology(this), this.s.namespace, filter, utils_1.resolveOptions(this, options)); + } + options(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new options_operation_1.OptionsOperation(this, utils_1.resolveOptions(this, options)), callback); + } + isCapped(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new is_capped_1.IsCappedOperation(this, utils_1.resolveOptions(this, options)), callback); + } + createIndex(indexSpec, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.CreateIndexOperation(this, this.collectionName, indexSpec, utils_1.resolveOptions(this, options)), callback); + } + createIndexes(indexSpecs, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options ? 
Object.assign({}, options) : {}; + if (typeof options.maxTimeMS !== 'number') + delete options.maxTimeMS; + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.CreateIndexesOperation(this, this.collectionName, indexSpecs, utils_1.resolveOptions(this, options)), callback); + } + dropIndex(indexName, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = utils_1.resolveOptions(this, options); + // Run only against primary + options.readPreference = read_preference_1.ReadPreference.primary; + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.DropIndexOperation(this, indexName, options), callback); + } + dropIndexes(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.DropIndexesOperation(this, utils_1.resolveOptions(this, options)), callback); + } + /** + * Get the list of all indexes information for the collection. + * + * @param options - Optional settings for the command + */ + listIndexes(options) { + return new indexes_1.ListIndexesCursor(this, utils_1.resolveOptions(this, options)); + } + indexExists(indexes, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.IndexExistsOperation(this, indexes, utils_1.resolveOptions(this, options)), callback); + } + indexInformation(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.IndexInformationOperation(this.s.db, this.collectionName, utils_1.resolveOptions(this, options)), callback); + } + estimatedDocumentCount(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new estimated_document_count_1.EstimatedDocumentCountOperation(this, utils_1.resolveOptions(this, options)), callback); + } + countDocuments(filter, options, callback) { + if (filter == null) { + (filter = {}), (options = {}), (callback = undefined); + } + else if (typeof filter === 'function') { + (callback = filter), (filter = {}), (options = {}); + } + else { + if (arguments.length === 2) { + if (typeof options === 'function') + (callback = options), (options = {}); + } + } + filter !== null && filter !== void 0 ? filter : (filter = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new count_documents_1.CountDocumentsOperation(this, filter, utils_1.resolveOptions(this, options)), callback); + } + // Implementation + distinct(key, filter, options, callback) { + if (typeof filter === 'function') { + (callback = filter), (filter = {}), (options = {}); + } + else { + if (arguments.length === 3 && typeof options === 'function') { + (callback = options), (options = {}); + } + } + filter !== null && filter !== void 0 ? 
filter : (filter = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new distinct_1.DistinctOperation(this, key, filter, utils_1.resolveOptions(this, options)), callback); + } + indexes(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new indexes_1.IndexesOperation(this, utils_1.resolveOptions(this, options)), callback); + } + stats(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return execute_operation_1.executeOperation(utils_2.getTopology(this), new stats_1.CollStatsOperation(this, options), callback); + } + findOneAndDelete(filter, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new find_and_modify_1.FindOneAndDeleteOperation(this, filter, utils_1.resolveOptions(this, options)), callback); + } + findOneAndReplace(filter, replacement, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new find_and_modify_1.FindOneAndReplaceOperation(this, filter, replacement, utils_1.resolveOptions(this, options)), callback); + } + findOneAndUpdate(filter, update, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new find_and_modify_1.FindOneAndUpdateOperation(this, filter, update, utils_1.resolveOptions(this, options)), callback); + } + /** + * Execute an aggregation framework pipeline against the collection, needs MongoDB \>= 2.2 + * + * @param pipeline - An array of aggregation pipelines to execute + * @param options - Optional settings for the command + */ + aggregate(pipeline = [], options) { + if (arguments.length > 2) { + throw new error_1.MongoInvalidArgumentError('Method "collection.aggregate()" accepts at most two arguments'); + } + if (!Array.isArray(pipeline)) { + throw new error_1.MongoInvalidArgumentError('Argument "pipeline" must be an array of aggregation stages'); + } + if (typeof options === 'function') { + throw new error_1.MongoInvalidArgumentError('Argument "options" must not be function'); + } + return new aggregation_cursor_1.AggregationCursor(utils_2.getTopology(this), this.s.namespace, pipeline, utils_1.resolveOptions(this, options)); + } + /** + * Create a new Change Stream, watching for new changes (insertions, updates, replacements, deletions, and invalidations) in this collection. + * + * @since 3.0.0 + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. 
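 * @example Illustrative sketch, not taken from the driver's own docs; it assumes
 * `client` is an already-connected MongoClient and that a `tasks` collection exists.
 * ```js
 * const changeStream = client.db('test').collection('tasks').watch([
 *   { $match: { operationType: 'insert' } }
 * ]);
 * changeStream.on('change', change => console.log(change.fullDocument));
 * // Later, when done listening: await changeStream.close();
 * ```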
+ * @param options - Optional settings for the command + */ + watch(pipeline = [], options = {}) { + // Allow optionally not specifying a pipeline + if (!Array.isArray(pipeline)) { + options = pipeline; + pipeline = []; + } + return new change_stream_1.ChangeStream(this, pipeline, utils_1.resolveOptions(this, options)); + } + mapReduce(map, reduce, options, callback) { + if ('function' === typeof options) + (callback = options), (options = {}); + // Out must always be defined (make sure we don't break weirdly on pre 1.8+ servers) + // TODO NODE-3339: Figure out if this is still necessary given we no longer officially support pre-1.8 + if ((options === null || options === void 0 ? void 0 : options.out) == null) { + throw new error_1.MongoInvalidArgumentError('Option "out" must be defined, see mongodb docs for possible values'); + } + if ('function' === typeof map) { + map = map.toString(); + } + if ('function' === typeof reduce) { + reduce = reduce.toString(); + } + if ('function' === typeof options.finalize) { + options.finalize = options.finalize.toString(); + } + return execute_operation_1.executeOperation(utils_2.getTopology(this), new map_reduce_1.MapReduceOperation(this, map, reduce, utils_1.resolveOptions(this, options)), callback); + } + /** Initiate an Out of order batch write operation. All operations will be buffered into insert/update/remove commands executed out of order. */ + initializeUnorderedBulkOp(options) { + return new unordered_1.UnorderedBulkOperation(this, utils_1.resolveOptions(this, options)); + } + /** Initiate an In order bulk write operation. Operations will be serially executed in the order they are added, creating a new operation for each switch in types. */ + initializeOrderedBulkOp(options) { + return new ordered_1.OrderedBulkOperation(this, utils_1.resolveOptions(this, options)); + } + /** Get the db scoped logger */ + getLogger() { + return this.s.db.s.logger; + } + get logger() { + return this.s.db.s.logger; + } + /** + * Inserts a single document or a an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @deprecated Use insertOne, insertMany or bulkWrite instead. + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insert(docs, options, callback) { + utils_1.emitWarningOnce('collection.insert is deprecated. Use insertOne, insertMany or bulkWrite instead.'); + if (typeof options === 'function') + (callback = options), (options = {}); + options = options || { ordered: false }; + docs = !Array.isArray(docs) ? [docs] : docs; + if (options.keepGoing === true) { + options.ordered = false; + } + return this.insertMany(docs, options, callback); + } + /** + * Updates documents. + * + * @deprecated use updateOne, updateMany or bulkWrite + * @param selector - The selector for the update operation. + * @param update - The update operations to be applied to the documents + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + update(selector, update, options, callback) { + utils_1.emitWarningOnce('collection.update is deprecated. 
Use updateOne, updateMany, or bulkWrite instead.'); + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.updateMany(selector, update, options, callback); + } + /** + * Remove documents. + * + * @deprecated use deleteOne, deleteMany or bulkWrite + * @param selector - The selector for the update operation. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + remove(selector, options, callback) { + utils_1.emitWarningOnce('collection.remove is deprecated. Use deleteOne, deleteMany, or bulkWrite instead.'); + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.deleteMany(selector, options, callback); + } + count(filter, options, callback) { + if (typeof filter === 'function') { + (callback = filter), (filter = {}), (options = {}); + } + else { + if (typeof options === 'function') + (callback = options), (options = {}); + } + filter !== null && filter !== void 0 ? filter : (filter = {}); + return execute_operation_1.executeOperation(utils_2.getTopology(this), new count_documents_1.CountDocumentsOperation(this, filter, utils_1.resolveOptions(this, options)), callback); + } +} +exports.Collection = Collection; +//# sourceMappingURL=collection.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/collection.js.map b/node_modules/mongodb/lib/collection.js.map new file mode 100644 index 000000000..e18e76718 --- /dev/null +++ b/node_modules/mongodb/lib/collection.js.map @@ -0,0 +1 @@ +{"version":3,"file":"collection.js","sourceRoot":"","sources":["../src/collection.ts"],"names":[],"mappings":";;;AAAA,mCAA8E;AAC9E,uDAAuE;AACvE,mCAMiB;AACjB,iCAA4E;AAC5E,mCAAoD;AACpD,gDAA0D;AAC1D,4CAAsD;AACtD,mDAAoE;AACpE,mDAAoE;AACpE,iDAA8D;AAC9D,oEAAgE;AAEhE,wDAA6D;AAC7D,kEAA8F;AAC9F,kDAc8B;AAC9B,oDAA2E;AAC3E,4CAAmF;AACnF,oFAG+C;AAE/C,kEAOsC;AACtC,gDAM6B;AAC7B,gDAO6B;AAC7B,gDAK6B;AAC7B,sDAA2D;AAC3D,wDAKiC;AACjC,sEAAkE;AAClE,gDAAqE;AACrE,8CAAqF;AACrF,sEAAkE;AAOlE,sDAAkD;AA4ClD;;;;;;;;;;;;;;;;;;;;;;;;;;;GA2BG;AACH,MAAa,UAAU;IAIrB;;;OAGG;IACH,YAAY,EAAM,EAAE,IAAY,EAAE,OAA2B;;QAC3D,2BAAmB,CAAC,IAAI,CAAC,CAAC;QAE1B,iBAAiB;QACjB,IAAI,CAAC,CAAC,GAAG;YACP,EAAE;YACF,OAAO;YACP,SAAS,EAAE,IAAI,wBAAgB,CAAC,EAAE,CAAC,YAAY,EAAE,IAAI,CAAC;YACtD,SAAS,EAAE,MAAA,MAAA,EAAE,CAAC,OAAO,0CAAE,SAAS,mCAAI,0BAAkB;YACtD,cAAc,EAAE,gCAAc,CAAC,WAAW,CAAC,OAAO,CAAC;YACnD,WAAW,EAAE,yBAAkB,CAAC,OAAO,EAAE,EAAE,CAAC;YAC5C,WAAW,EAAE,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC;YAC7C,YAAY,EAAE,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC;YAC/C,OAAO,EAAE,OAAO,IAAI,IAAI,IAAI,OAAO,CAAC,OAAO,IAAI,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO;SACnF,CAAC;IACJ,CAAC;IAED;;OAEG;IACH,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,EAAE,CAAC;IAC7B,CAAC;IAED;;OAEG;IACH,IAAI,cAAc;QAChB,oEAAoE;QACpE,OAAO,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,UAAW,CAAC;IACtC,CAAC;IAED;;OAEG;IACH,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,QAAQ,EAAE,CAAC;IACrC,CAAC;IAED;;;OAGG;IACH,IAAI,WAAW;QACb,IAAI,IAAI,CAAC,CAAC,CAAC,WAAW,IAAI,IAAI,EAAE;YAC9B,OAAO,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,WAAW,CAAC;SAC9B;QACD,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED;;;OAGG;IACH,IAAI,cAAc;QAChB,IAAI,IAAI,CAAC,CAAC,CAAC,cAAc,IAAI,IAAI,EAAE;YACjC,OAAO,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,cAAc,CAAC;SACjC;QAED,OAAO,IAAI,CAAC,CAAC,CAAC,cAAc,CAAC;IAC/B,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW
,CAAC;IAC5B,CAAC;IAED;;;OAGG;IACH,IAAI,YAAY;QACd,IAAI,IAAI,CAAC,CAAC,CAAC,YAAY,IAAI,IAAI,EAAE;YAC/B,OAAO,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,YAAY,CAAC;SAC/B;QACD,OAAO,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;IAC7B,CAAC;IAED,gDAAgD;IAChD,IAAI,IAAI;QACN,OAAO,IAAI,CAAC,CAAC,CAAC,cAAc,CAAC;IAC/B,CAAC;IAED,IAAI,IAAI,CAAC,CAAmB;QAC1B,IAAI,CAAC,CAAC,CAAC,cAAc,GAAG,0BAAkB,CAAC,CAAC,CAAC,CAAC;IAChD,CAAC;IAmBD,SAAS,CACP,GAAwB,EACxB,OAA+D,EAC/D,QAA6C;QAE7C,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,QAAQ,GAAG,OAAO,CAAC;YACnB,OAAO,GAAG,EAAE,CAAC;SACd;QAED,gFAAgF;QAChF,yCAAyC;QACzC,IAAI,OAAO,IAAI,OAAO,CAAC,GAAG,CAAC,OAAO,EAAE,GAAG,CAAC,EAAE;YACxC,OAAO,CAAC,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,GAAG,CAAC,OAAO,EAAE,GAAG,CAAC,CAAC,CAAC;SAC5E;QAED,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,2BAAkB,CACpB,IAAsB,EACtB,GAAG,EACH,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CACZ,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IAsBD,UAAU,CACR,IAA2B,EAC3B,OAAgE,EAChE,QAA8C;QAE9C,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC,CAAC,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC;QAEnE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAmB,CACrB,IAAsB,EACtB,IAAI,EACJ,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CACZ,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IA+CD,SAAS,CACP,UAA4C,EAC5C,OAAsD,EACtD,QAAoC;QAEpC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,IAAI,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC;QAEvC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,UAAU,CAAC,EAAE;YAC9B,MAAM,IAAI,iCAAyB,CAAC,qDAAqD,CAAC,CAAC;SAC5F;QAED,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,+BAAkB,CACpB,IAAsB,EACtB,UAA4B,EAC5B,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IA8BD,SAAS,CACP,MAAuB,EACvB,MAAgD,EAChD,OAAgD,EAChD,QAAiC;QAEjC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,2BAAkB,CACpB,IAAsB,EACtB,MAAM,EACN,MAAM,EACN,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CACZ,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IA2BD,UAAU,CACR,MAAuB,EACvB,WAAoB,EACpB,OAA4D,EAC5D,QAA4C;QAE5C,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAmB,CACrB,IAAsB,EACtB,MAAM,EACN,WAAW,EACX,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IA8BD,UAAU,CACR,MAAuB,EACvB,MAA6B,EAC7B,OAA2D,EAC3D,QAA4C;QAE5C,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAmB,CACrB,IAAsB,EACtB,MAAM,EACN,MAAM,EACN,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAiBD,SAAS,CACP,MAAuB,EACvB,OAAgD,EAChD,QAAiC;QAEjC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,2BAAkB,CAAC,IAAsB,EAAE,MAAM,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACrF,QAAQ,CACT,CAAC;IACJ,CAAC;IAiBD,UAAU,CACR,MAAuB,EACvB,OAAgD,EAChD,QAAiC;QAEjC,IAAI,MAAM,IAAI,IAAI,EAAE;YAClB,MAAM,GAAG,EAAE,CAAC;YACZ,OAAO,GAAG,EAAE,CAAC;YACb,QAAQ,GAAG,SAAS,CAAC;SACtB;aAAM,IAAI,OAAO,MAAM,KAAK,UAAU,EAAE;YACvC,QAAQ,GAAG,MAAgC,CAAC;YAC5C,MAAM,GAAG,EAAE,CAAC;YACZ,OAAO,GAAG,EAAE,CAAC;SACd;aAAM,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACxC,QAAQ,GAAG,OAAO,CAAC;YACnB,OAAO,GAAG,EAAE,CAAC;SACd;QAED,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAmB,CAAC,IAAsB,EAAE,MAAM,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACtF,QAAQ,CACT,CAAC;IACJ,CAAC;IAgBD,MAAM,CACJ,
OAAe,EACf,OAA8C,EAC9C,QAA+B;QAE/B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,2EAA2E;QAC3E,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,wBAAe,CAAC,IAAsB,EAAE,OAAO,EAAE;YACnD,GAAG,OAAO;YACV,cAAc,EAAE,gCAAc,CAAC,OAAO;SACvC,CAAC,EACF,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,IAAI,CACF,OAAmD,EACnD,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,8BAAuB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,CAAC,EACpE,QAAQ,CACT,CAAC;IACJ,CAAC;IA2BD,OAAO,CACL,MAAmD,EACnD,OAAgD,EAChD,QAAmC;QAEnC,IAAI,QAAQ,IAAI,IAAI,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YACtD,MAAM,IAAI,iCAAyB,CACjC,gEAAgE,CACjE,CAAC;SACH;QAED,IAAI,OAAO,MAAM,KAAK,UAAU,EAAE;YAChC,QAAQ,GAAG,MAAkC,CAAC;YAC9C,MAAM,GAAG,EAAE,CAAC;YACZ,OAAO,GAAG,EAAE,CAAC;SACd;QACD,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,QAAQ,GAAG,OAAO,CAAC;YACnB,OAAO,GAAG,EAAE,CAAC;SACd;QAED,MAAM,WAAW,GAAG,MAAM,aAAN,MAAM,cAAN,MAAM,GAAI,EAAE,CAAC;QACjC,MAAM,YAAY,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACnC,OAAO,IAAI,CAAC,IAAI,CAAC,WAAW,EAAE,YAAY,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;IACpF,CAAC;IAUD,IAAI,CAAC,MAAwB,EAAE,OAAqB;QAClD,IAAI,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YACxB,MAAM,IAAI,iCAAyB,CACjC,0DAA0D,CAC3D,CAAC;SACH;QACD,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,MAAM,IAAI,iCAAyB,CAAC,yCAAyC,CAAC,CAAC;SAChF;QAED,OAAO,IAAI,wBAAU,CACnB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,CAAC,CAAC,CAAC,SAAS,EAChB,MAAM,EACN,sBAAc,CAAC,IAAsB,EAAE,OAAO,CAAC,CAChD,CAAC;IACJ,CAAC;IAYD,OAAO,CACL,OAA+C,EAC/C,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,oCAAgB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC3E,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,QAAQ,CACN,OAA8C,EAC9C,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,6BAAiB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC5E,QAAQ,CACT,CAAC;IACJ,CAAC;IAuCD,WAAW,CACT,SAA6B,EAC7B,OAAiD,EACjD,QAA2B;QAE3B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,8BAAoB,CACtB,IAAsB,EACtB,IAAI,CAAC,cAAc,EACnB,SAAS,EACT,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IA0CD,aAAa,CACX,UAA8B,EAC9B,OAAmD,EACnD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;QACpD,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ;YAAE,OAAO,OAAO,CAAC,SAAS,CAAC;QAEpE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,gCAAsB,CACxB,IAAsB,EACtB,IAAI,CAAC,cAAc,EACnB,UAAU,EACV,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAaD,SAAS,CACP,SAAiB,EACjB,OAAiD,EACjD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC;QAExC,2BAA2B;QAC3B,OAAO,CAAC,cAAc,GAAG,gCAAc,CAAC,OAAO,CAAC;QAEhD,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAkB,CAAC,IAAsB,EAAE,SAAS,EAAE,OAAO,CAAC,EAClE,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,WAAW,CACT,OAAiD,EACjD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,C
AAC,IAAI,CAAC,EACjB,IAAI,8BAAoB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC/E,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;;OAIG;IACH,WAAW,CAAC,OAA4B;QACtC,OAAO,IAAI,2BAAiB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IACtF,CAAC;IAiBD,WAAW,CACT,OAA0B,EAC1B,OAAqD,EACrD,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,8BAAoB,CAAC,IAAsB,EAAE,OAAO,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACxF,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,gBAAgB,CACd,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,mCAAyB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,IAAI,CAAC,cAAc,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC5F,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,sBAAsB,CACpB,OAA0D,EAC1D,QAA2B;QAE3B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,0DAA+B,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC1F,QAAQ,CACT,CAAC;IACJ,CAAC;IAuCD,cAAc,CACZ,MAA4D,EAC5D,OAAkD,EAClD,QAA2B;QAE3B,IAAI,MAAM,IAAI,IAAI,EAAE;YAClB,CAAC,MAAM,GAAG,EAAE,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,CAAC;SACvD;aAAM,IAAI,OAAO,MAAM,KAAK,UAAU,EAAE;YACvC,CAAC,QAAQ,GAAG,MAA0B,CAAC,EAAE,CAAC,MAAM,GAAG,EAAE,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SACxE;aAAM;YACL,IAAI,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;gBAC1B,IAAI,OAAO,OAAO,KAAK,UAAU;oBAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;aACzE;SACF;QAED,MAAM,aAAN,MAAM,cAAN,MAAM,IAAN,MAAM,GAAK,EAAE,EAAC;QACd,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,yCAAuB,CACzB,IAAsB,EACtB,MAAkB,EAClB,sBAAc,CAAC,IAAI,EAAE,OAAgC,CAAC,CACvD,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAkDD,iBAAiB;IACjB,QAAQ,CACN,GAAQ,EACR,MAA4D,EAC5D,OAA2C,EAC3C,QAA0B;QAE1B,IAAI,OAAO,MAAM,KAAK,UAAU,EAAE;YAChC,CAAC,QAAQ,GAAG,MAAyB,CAAC,EAAE,CAAC,MAAM,GAAG,EAAE,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SACvE;aAAM;YACL,IAAI,SAAS,CAAC,MAAM,KAAK,CAAC,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;gBAC3D,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;aACtC;SACF;QAED,MAAM,aAAN,MAAM,cAAN,MAAM,IAAN,MAAM,GAAK,EAAE,EAAC;QACd,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAiB,CACnB,IAAsB,EACtB,GAAqB,EACrB,MAAM,EACN,sBAAc,CAAC,IAAI,EAAE,OAA0B,CAAC,CACjD,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,OAAO,CACL,OAAwD,EACxD,QAA+B;QAE/B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,0BAAgB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC3E,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,KAAK,CACH,OAAgD,EAChD,QAA8B;QAE9B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,0BAAkB,CAAC,IAAsB,EAAE,OAAO,CAAC,EACvD,QAAQ,CACT,CAAC;IACJ,CAAC;IAoBD,gBAAgB,CACd,MAAuB,EACvB,OAAmE,EACnE,QAA0C;QAE1C,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,2CAAyB,CAC3B,IAAsB,EACtB,MAAM,EACN,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CACZ,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IA2BD,iBAAiB,CACf,MAAuB,EACvB,WAAqB,EACrB,OAAoE,EACpE,QAA0C;QAE1C,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4CAA0B,CAC5B,IAAsB,EACtB,MAAM,EACN,WAAW,EACX,sBAAc,CAAC,IAAI,E
AAE,OAAO,CAAC,CACZ,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IA8BD,gBAAgB,CACd,MAAuB,EACvB,MAA6B,EAC7B,OAAmE,EACnE,QAA0C;QAE1C,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,2CAAyB,CAC3B,IAAsB,EACtB,MAAM,EACN,MAAM,EACN,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CACZ,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;;;OAKG;IACH,SAAS,CACP,WAAuB,EAAE,EACzB,OAA0B;QAE1B,IAAI,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YACxB,MAAM,IAAI,iCAAyB,CACjC,+DAA+D,CAChE,CAAC;SACH;QACD,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC5B,MAAM,IAAI,iCAAyB,CACjC,4DAA4D,CAC7D,CAAC;SACH;QACD,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,MAAM,IAAI,iCAAyB,CAAC,yCAAyC,CAAC,CAAC;SAChF;QAED,OAAO,IAAI,sCAAiB,CAC1B,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,CAAC,CAAC,CAAC,SAAS,EAChB,QAAQ,EACR,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,CAAC;IACJ,CAAC;IAED;;;;;;OAMG;IACH,KAAK,CACH,WAAuB,EAAE,EACzB,UAA+B,EAAE;QAEjC,6CAA6C;QAC7C,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC5B,OAAO,GAAG,QAAQ,CAAC;YACnB,QAAQ,GAAG,EAAE,CAAC;SACf;QAED,OAAO,IAAI,4BAAY,CAAS,IAAI,EAAE,QAAQ,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IACjF,CAAC;IA8BD,SAAS,CACP,GAAkC,EAClC,MAA6C,EAC7C,OAA0E,EAC1E,QAA0C;QAE1C,IAAI,UAAU,KAAK,OAAO,OAAO;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,oFAAoF;QACpF,sGAAsG;QACtG,IAAI,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,GAAG,KAAI,IAAI,EAAE;YACxB,MAAM,IAAI,iCAAyB,CACjC,oEAAoE,CACrE,CAAC;SACH;QAED,IAAI,UAAU,KAAK,OAAO,GAAG,EAAE;YAC7B,GAAG,GAAG,GAAG,CAAC,QAAQ,EAAE,CAAC;SACtB;QAED,IAAI,UAAU,KAAK,OAAO,MAAM,EAAE;YAChC,MAAM,GAAG,MAAM,CAAC,QAAQ,EAAE,CAAC;SAC5B;QAED,IAAI,UAAU,KAAK,OAAO,OAAO,CAAC,QAAQ,EAAE;YAC1C,OAAO,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC,QAAQ,EAAE,CAAC;SAChD;QAED,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,+BAAkB,CACpB,IAAsB,EACtB,GAAG,EACH,MAAM,EACN,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAmB,CAChD,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAED,gJAAgJ;IAChJ,yBAAyB,CAAC,OAA0B;QAClD,OAAO,IAAI,kCAAsB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IAC3F,CAAC;IAED,sKAAsK;IACtK,uBAAuB,CAAC,OAA0B;QAChD,OAAO,IAAI,8BAAoB,CAAC,IAAsB,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IACzF,CAAC;IAED,+BAA+B;IAC/B,SAAS;QACP,OAAO,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,MAAM,CAAC;IAC5B,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,MAAM,CAAC;IAC5B,CAAC;IAED;;;;;;;;;OASG;IACH,MAAM,CACJ,IAA2B,EAC3B,OAAyB,EACzB,QAA6C;QAE7C,uBAAe,CACb,kFAAkF,CACnF,CAAC;QACF,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,IAAI,EAAE,OAAO,EAAE,KAAK,EAAE,CAAC;QACxC,IAAI,GAAG,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QAE5C,IAAI,OAAO,CAAC,SAAS,KAAK,IAAI,EAAE;YAC9B,OAAO,CAAC,OAAO,GAAG,KAAK,CAAC;SACzB;QAED,OAAO,IAAI,CAAC,UAAU,CAAC,IAAI,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAClD,CAAC;IAED;;;;;;;;OAQG;IACH,MAAM,CACJ,QAAyB,EACzB,MAA6B,EAC7B,OAAsB,EACtB,QAA4B;QAE5B,uBAAe,CACb,mFAAmF,CACpF,CAAC;QACF,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,IAAI,CAAC,UAAU,CAAC,QAAQ,EAAE,MAAM,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC9D,CAAC;IAED;;;;;;;OAOG;IACH,MAAM,CACJ,QAAyB,EACzB,OAAsB,EACtB,QAAkB;QAElB,uBAAe,CACb,mFAAmF,CACpF,CAAC;QACF,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,IAAI,CAAC,UAAU,CAAC,QAAQ,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IACtD,CAAC;IAyBD,KAAK,CACH,MAA0D,EAC1D,OAAyC,EACzC,QAA2B;QAE3B,IAAI,OAAO
,MAAM,KAAK,UAAU,EAAE;YAChC,CAAC,QAAQ,GAAG,MAA0B,CAAC,EAAE,CAAC,MAAM,GAAG,EAAE,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SACxE;aAAM;YACL,IAAI,OAAO,OAAO,KAAK,UAAU;gBAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SACzE;QAED,MAAM,aAAN,MAAM,cAAN,MAAM,IAAN,MAAM,GAAK,EAAE,EAAC;QACd,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,yCAAuB,CAAC,IAAsB,EAAE,MAAM,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC1F,QAAQ,CACT,CAAC;IACJ,CAAC;CACF;AAh7CD,gCAg7CC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/connection_string.js b/node_modules/mongodb/lib/connection_string.js new file mode 100644 index 000000000..0e8ae2f33 --- /dev/null +++ b/node_modules/mongodb/lib/connection_string.js @@ -0,0 +1,996 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.DEFAULT_OPTIONS = exports.OPTIONS = exports.parseOptions = exports.checkTLSOptions = exports.resolveSRVRecord = void 0; +const dns = require("dns"); +const fs = require("fs"); +const mongodb_connection_string_url_1 = require("mongodb-connection-string-url"); +const url_1 = require("url"); +const defaultAuthProviders_1 = require("./cmap/auth/defaultAuthProviders"); +const read_preference_1 = require("./read_preference"); +const read_concern_1 = require("./read_concern"); +const write_concern_1 = require("./write_concern"); +const error_1 = require("./error"); +const utils_1 = require("./utils"); +const mongo_client_1 = require("./mongo_client"); +const mongo_credentials_1 = require("./cmap/auth/mongo_credentials"); +const logger_1 = require("./logger"); +const promise_provider_1 = require("./promise_provider"); +const encrypter_1 = require("./encrypter"); +const compression_1 = require("./cmap/wire_protocol/compression"); +const VALID_TXT_RECORDS = ['authSource', 'replicaSet', 'loadBalanced']; +const LB_SINGLE_HOST_ERROR = 'loadBalanced option only supported with a single host in the URI'; +const LB_REPLICA_SET_ERROR = 'loadBalanced option not supported with a replicaSet option'; +const LB_DIRECT_CONNECTION_ERROR = 'loadBalanced option not supported when directConnection is provided'; +/** + * Determines whether a provided address matches the provided parent domain in order + * to avoid certain attack vectors. + * + * @param srvAddress - The address to check against a domain + * @param parentDomain - The domain to check the provided address against + * @returns Whether the provided address matches the parent domain + */ +function matchesParentDomain(srvAddress, parentDomain) { + const regex = /^.*?\./; + const srv = `.${srvAddress.replace(regex, '')}`; + const parent = `.${parentDomain.replace(regex, '')}`; + return srv.endsWith(parent); +} +/** + * Lookup a `mongodb+srv` connection string, combine the parts and reparse it as a normal + * connection string. + * + * @param uri - The connection string to parse + * @param options - Optional user provided connection string options + */ +function resolveSRVRecord(options, callback) { + if (typeof options.srvHost !== 'string') { + return callback(new error_1.MongoAPIError('Option "srvHost" must not be empty')); + } + if (options.srvHost.split('.').length < 3) { + // TODO(NODE-3484): Replace with MongoConnectionStringError + return callback(new error_1.MongoAPIError('URI must include hostname, domain name, and tld')); + } + // Resolve the SRV record and use the result as the list of hosts to connect to. 
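// Illustrative sketch, not driver source: the code below performs this same lookup
// for a `mongodb+srv://` host, turning the SRV records under `_mongodb._tcp.<srvHost>`
// into the seed list of hosts (port defaulting to 27017). The names exampleDns,
// sketchResolveSeedList and the sample host are placeholders.
const exampleDns = require('dns');

function sketchResolveSeedList(srvHost, done) {
  exampleDns.resolveSrv(`_mongodb._tcp.${srvHost}`, (err, addresses) => {
    if (err) return done(err);
    // Each SRV record contributes "name:port"; fall back to 27017 when no port is set.
    done(undefined, addresses.map(r => `${r.name}:${r.port || 27017}`));
  });
}

// sketchResolveSeedList('cluster0.example.mongodb.net', (err, hosts) => console.log(err || hosts));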
+ const lookupAddress = options.srvHost; + dns.resolveSrv(`_mongodb._tcp.${lookupAddress}`, (err, addresses) => { + if (err) + return callback(err); + if (addresses.length === 0) { + return callback(new error_1.MongoAPIError('No addresses found at host')); + } + for (const { name } of addresses) { + if (!matchesParentDomain(name, lookupAddress)) { + return callback(new error_1.MongoAPIError('Server record does not share hostname with parent URI')); + } + } + const hostAddresses = addresses.map(r => { var _a; return utils_1.HostAddress.fromString(`${r.name}:${(_a = r.port) !== null && _a !== void 0 ? _a : 27017}`); }); + const lbError = validateLoadBalancedOptions(hostAddresses, options); + if (lbError) { + return callback(lbError); + } + // Resolve TXT record and add options from there if they exist. + dns.resolveTxt(lookupAddress, (err, record) => { + var _a, _b, _c; + if (err) { + if (err.code !== 'ENODATA' && err.code !== 'ENOTFOUND') { + return callback(err); + } + } + else { + if (record.length > 1) { + return callback(new error_1.MongoParseError('Multiple text records not allowed')); + } + const txtRecordOptions = new url_1.URLSearchParams(record[0].join('')); + const txtRecordOptionKeys = [...txtRecordOptions.keys()]; + if (txtRecordOptionKeys.some(key => !VALID_TXT_RECORDS.includes(key))) { + return callback(new error_1.MongoParseError(`Text record may only set any of: ${VALID_TXT_RECORDS.join(', ')}`)); + } + const source = (_a = txtRecordOptions.get('authSource')) !== null && _a !== void 0 ? _a : undefined; + const replicaSet = (_b = txtRecordOptions.get('replicaSet')) !== null && _b !== void 0 ? _b : undefined; + const loadBalanced = (_c = txtRecordOptions.get('loadBalanced')) !== null && _c !== void 0 ? _c : undefined; + if (source === '' || replicaSet === '') { + return callback(new error_1.MongoParseError('Cannot have empty URI params in DNS TXT Record')); + } + if (!options.userSpecifiedAuthSource && source) { + options.credentials = mongo_credentials_1.MongoCredentials.merge(options.credentials, { source }); + } + if (!options.userSpecifiedReplicaSet && replicaSet) { + options.replicaSet = replicaSet; + } + if (loadBalanced === 'true') { + options.loadBalanced = true; + } + const lbError = validateLoadBalancedOptions(hostAddresses, options); + if (lbError) { + return callback(lbError); + } + } + callback(undefined, hostAddresses); + }); + }); +} +exports.resolveSRVRecord = resolveSRVRecord; +/** + * Checks if TLS options are valid + * + * @param options - The options used for options parsing + * @throws MongoParseError if TLS options are invalid + */ +function checkTLSOptions(options) { + if (!options) + return; + const check = (a, b) => { + if (Reflect.has(options, a) && Reflect.has(options, b)) { + throw new error_1.MongoParseError(`The '${a}' option cannot be used with '${b}'`); + } + }; + check('tlsInsecure', 'tlsAllowInvalidCertificates'); + check('tlsInsecure', 'tlsAllowInvalidHostnames'); + check('tlsInsecure', 'tlsDisableCertificateRevocationCheck'); + check('tlsInsecure', 'tlsDisableOCSPEndpointCheck'); + check('tlsAllowInvalidCertificates', 'tlsDisableCertificateRevocationCheck'); + check('tlsAllowInvalidCertificates', 'tlsDisableOCSPEndpointCheck'); + check('tlsDisableCertificateRevocationCheck', 'tlsDisableOCSPEndpointCheck'); +} +exports.checkTLSOptions = checkTLSOptions; +const TRUTHS = new Set(['true', 't', '1', 'y', 'yes']); +const FALSEHOODS = new Set(['false', 'f', '0', 'n', 'no', '-1']); +function getBoolean(name, value) { + if (typeof value === 
'boolean') + return value; + const valueString = String(value).toLowerCase(); + if (TRUTHS.has(valueString)) + return true; + if (FALSEHOODS.has(valueString)) + return false; + throw new error_1.MongoParseError(`Expected ${name} to be stringified boolean value, got: ${value}`); +} +function getInt(name, value) { + if (typeof value === 'number') + return Math.trunc(value); + const parsedValue = Number.parseInt(String(value), 10); + if (!Number.isNaN(parsedValue)) + return parsedValue; + throw new error_1.MongoParseError(`Expected ${name} to be stringified int value, got: ${value}`); +} +function getUint(name, value) { + const parsedValue = getInt(name, value); + if (parsedValue < 0) { + throw new error_1.MongoParseError(`${name} can only be a positive int value, got: ${value}`); + } + return parsedValue; +} +function toRecord(value) { + const record = Object.create(null); + const keyValuePairs = value.split(','); + for (const keyValue of keyValuePairs) { + const [key, value] = keyValue.split(':'); + if (value == null) { + throw new error_1.MongoParseError('Cannot have undefined values in key value pairs'); + } + try { + // try to get a boolean + record[key] = getBoolean('', value); + } + catch { + try { + // try to get a number + record[key] = getInt('', value); + } + catch { + // keep value as a string + record[key] = value; + } + } + } + return record; +} +class CaseInsensitiveMap extends Map { + constructor(entries = []) { + super(entries.map(([k, v]) => [k.toLowerCase(), v])); + } + has(k) { + return super.has(k.toLowerCase()); + } + get(k) { + return super.get(k.toLowerCase()); + } + set(k, v) { + return super.set(k.toLowerCase(), v); + } + delete(k) { + return super.delete(k.toLowerCase()); + } +} +function parseOptions(uri, mongoClient = undefined, options = {}) { + if (mongoClient != null && !(mongoClient instanceof mongo_client_1.MongoClient)) { + options = mongoClient; + mongoClient = undefined; + } + const url = new mongodb_connection_string_url_1.default(uri); + const { hosts, isSRV } = url; + const mongoOptions = Object.create(null); + mongoOptions.hosts = isSRV ? [] : hosts.map(utils_1.HostAddress.fromString); + if (isSRV) { + // SRV Record is resolved upon connecting + mongoOptions.srvHost = hosts[0]; + if (!url.searchParams.has('tls') && !url.searchParams.has('ssl')) { + options.tls = true; + } + } + const urlOptions = new CaseInsensitiveMap(); + if (url.pathname !== '/' && url.pathname !== '') { + const dbName = decodeURIComponent(url.pathname[0] === '/' ? 
url.pathname.slice(1) : url.pathname); + if (dbName) { + urlOptions.set('dbName', [dbName]); + } + } + if (url.username !== '') { + const auth = { + username: decodeURIComponent(url.username) + }; + if (typeof url.password === 'string') { + auth.password = decodeURIComponent(url.password); + } + urlOptions.set('auth', [auth]); + } + for (const key of url.searchParams.keys()) { + const values = [...url.searchParams.getAll(key)]; + if (values.includes('')) { + throw new error_1.MongoAPIError('URI cannot contain options with no value'); + } + if (key.toLowerCase() === 'serverapi') { + throw new error_1.MongoParseError('URI cannot contain `serverApi`, it can only be passed to the client'); + } + if (key.toLowerCase() === 'authsource' && urlOptions.has('authSource')) { + // If authSource is an explicit key in the urlOptions we need to remove the implicit dbName + urlOptions.delete('authSource'); + } + if (!urlOptions.has(key)) { + urlOptions.set(key, values); + } + } + const objectOptions = new CaseInsensitiveMap(Object.entries(options).filter(([, v]) => v != null)); + if (objectOptions.has('loadBalanced')) { + throw new error_1.MongoParseError('loadBalanced is only a valid option in the URI'); + } + const allOptions = new CaseInsensitiveMap(); + const allKeys = new Set([ + ...urlOptions.keys(), + ...objectOptions.keys(), + ...exports.DEFAULT_OPTIONS.keys() + ]); + for (const key of allKeys) { + const values = []; + if (objectOptions.has(key)) { + values.push(objectOptions.get(key)); + } + if (urlOptions.has(key)) { + values.push(...urlOptions.get(key)); + } + if (exports.DEFAULT_OPTIONS.has(key)) { + values.push(exports.DEFAULT_OPTIONS.get(key)); + } + allOptions.set(key, values); + } + if (allOptions.has('tlsCertificateKeyFile') && !allOptions.has('tlsCertificateFile')) { + allOptions.set('tlsCertificateFile', allOptions.get('tlsCertificateKeyFile')); + } + if (allOptions.has('tls') || allOptions.has('ssl')) { + const tlsAndSslOpts = (allOptions.get('tls') || []) + .concat(allOptions.get('ssl') || []) + .map(getBoolean.bind(null, 'tls/ssl')); + if (new Set(tlsAndSslOpts).size !== 1) { + throw new error_1.MongoParseError('All values of tls/ssl must be the same.'); + } + } + const unsupportedOptions = utils_1.setDifference(allKeys, Array.from(Object.keys(exports.OPTIONS)).map(s => s.toLowerCase())); + if (unsupportedOptions.size !== 0) { + const optionWord = unsupportedOptions.size > 1 ? 'options' : 'option'; + const isOrAre = unsupportedOptions.size > 1 ? 
'are' : 'is'; + throw new error_1.MongoParseError(`${optionWord} ${Array.from(unsupportedOptions).join(', ')} ${isOrAre} not supported`); + } + for (const [key, descriptor] of Object.entries(exports.OPTIONS)) { + const values = allOptions.get(key); + if (!values || values.length === 0) + continue; + setOption(mongoOptions, key, descriptor, values); + } + if (mongoOptions.credentials) { + const isGssapi = mongoOptions.credentials.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_GSSAPI; + const isX509 = mongoOptions.credentials.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_X509; + const isAws = mongoOptions.credentials.mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_AWS; + if ((isGssapi || isX509) && + allOptions.has('authSource') && + mongoOptions.credentials.source !== '$external') { + // If authSource was explicitly given and its incorrect, we error + throw new error_1.MongoParseError(`${mongoOptions.credentials} can only have authSource set to '$external'`); + } + if (!(isGssapi || isX509 || isAws) && mongoOptions.dbName && !allOptions.has('authSource')) { + // inherit the dbName unless GSSAPI or X509, then silently ignore dbName + // and there was no specific authSource given + mongoOptions.credentials = mongo_credentials_1.MongoCredentials.merge(mongoOptions.credentials, { + source: mongoOptions.dbName + }); + } + mongoOptions.credentials.validate(); + } + if (!mongoOptions.dbName) { + // dbName default is applied here because of the credential validation above + mongoOptions.dbName = 'test'; + } + checkTLSOptions(mongoOptions); + if (options.promiseLibrary) + promise_provider_1.PromiseProvider.set(options.promiseLibrary); + if (mongoOptions.directConnection && typeof mongoOptions.srvHost === 'string') { + throw new error_1.MongoAPIError('SRV URI does not support directConnection'); + } + const lbError = validateLoadBalancedOptions(hosts, mongoOptions); + if (lbError) { + throw lbError; + } + // Potential SRV Overrides + mongoOptions.userSpecifiedAuthSource = + objectOptions.has('authSource') || urlOptions.has('authSource'); + mongoOptions.userSpecifiedReplicaSet = + objectOptions.has('replicaSet') || urlOptions.has('replicaSet'); + if (mongoClient && mongoOptions.autoEncryption) { + encrypter_1.Encrypter.checkForMongoCrypt(); + mongoOptions.encrypter = new encrypter_1.Encrypter(mongoClient, uri, options); + mongoOptions.autoEncrypter = mongoOptions.encrypter.autoEncrypter; + } + return mongoOptions; +} +exports.parseOptions = parseOptions; +function validateLoadBalancedOptions(hosts, mongoOptions) { + if (mongoOptions.loadBalanced) { + if (hosts.length > 1) { + return new error_1.MongoParseError(LB_SINGLE_HOST_ERROR); + } + if (mongoOptions.replicaSet) { + return new error_1.MongoParseError(LB_REPLICA_SET_ERROR); + } + if (mongoOptions.directConnection) { + return new error_1.MongoParseError(LB_DIRECT_CONNECTION_ERROR); + } + } +} +function setOption(mongoOptions, key, descriptor, values) { + const { target, type, transform, deprecated } = descriptor; + const name = target !== null && target !== void 0 ? target : key; + if (deprecated) { + const deprecatedMsg = typeof deprecated === 'string' ? 
`: ${deprecated}` : ''; + utils_1.emitWarning(`${key} is a deprecated option${deprecatedMsg}`); + } + switch (type) { + case 'boolean': + mongoOptions[name] = getBoolean(name, values[0]); + break; + case 'int': + mongoOptions[name] = getInt(name, values[0]); + break; + case 'uint': + mongoOptions[name] = getUint(name, values[0]); + break; + case 'string': + if (values[0] == null) { + break; + } + mongoOptions[name] = String(values[0]); + break; + case 'record': + if (!utils_1.isRecord(values[0])) { + throw new error_1.MongoParseError(`${name} must be an object`); + } + mongoOptions[name] = values[0]; + break; + case 'any': + mongoOptions[name] = values[0]; + break; + default: { + if (!transform) { + throw new error_1.MongoParseError('Descriptors missing a type must define a transform'); + } + const transformValue = transform({ name, options: mongoOptions, values }); + mongoOptions[name] = transformValue; + break; + } + } +} +exports.OPTIONS = { + appName: { + target: 'metadata', + transform({ options, values: [value] }) { + return utils_1.makeClientMetadata({ ...options.driverInfo, appName: String(value) }); + } + }, + auth: { + target: 'credentials', + transform({ name, options, values: [value] }) { + if (!utils_1.isRecord(value, ['username', 'password'])) { + throw new error_1.MongoParseError(`${name} must be an object with 'username' and 'password' properties`); + } + return mongo_credentials_1.MongoCredentials.merge(options.credentials, { + username: value.username, + password: value.password + }); + } + }, + authMechanism: { + target: 'credentials', + transform({ options, values: [value] }) { + var _a, _b; + const mechanisms = Object.values(defaultAuthProviders_1.AuthMechanism); + const [mechanism] = mechanisms.filter(m => m.match(RegExp(String.raw `\b${value}\b`, 'i'))); + if (!mechanism) { + throw new error_1.MongoParseError(`authMechanism one of ${mechanisms}, got ${value}`); + } + let source = (_a = options.credentials) === null || _a === void 0 ? void 0 : _a.source; + if (mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_PLAIN || + mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_GSSAPI || + mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_AWS || + mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_X509) { + // some mechanisms have '$external' as the Auth Source + source = '$external'; + } + let password = (_b = options.credentials) === null || _b === void 0 ? void 0 : _b.password; + if (mechanism === defaultAuthProviders_1.AuthMechanism.MONGODB_X509 && password === '') { + password = undefined; + } + return mongo_credentials_1.MongoCredentials.merge(options.credentials, { + mechanism, + source, + password + }); + } + }, + authMechanismProperties: { + target: 'credentials', + transform({ options, values: [value] }) { + if (typeof value === 'string') { + value = toRecord(value); + } + if (!utils_1.isRecord(value)) { + throw new error_1.MongoParseError('AuthMechanismProperties must be an object'); + } + return mongo_credentials_1.MongoCredentials.merge(options.credentials, { mechanismProperties: value }); + } + }, + authSource: { + target: 'credentials', + transform({ options, values: [value] }) { + const source = String(value); + return mongo_credentials_1.MongoCredentials.merge(options.credentials, { source }); + } + }, + autoEncryption: { + type: 'record' + }, + bsonRegExp: { + type: 'boolean' + }, + serverApi: { + target: 'serverApi', + transform({ values: [version] }) { + const serverApiToValidate = typeof version === 'string' ? 
{ version } : version; + const versionToValidate = serverApiToValidate && serverApiToValidate.version; + if (!versionToValidate) { + throw new error_1.MongoParseError(`Invalid \`serverApi\` property; must specify a version from the following enum: ["${Object.values(mongo_client_1.ServerApiVersion).join('", "')}"]`); + } + if (!Object.values(mongo_client_1.ServerApiVersion).some(v => v === versionToValidate)) { + throw new error_1.MongoParseError(`Invalid server API version=${versionToValidate}; must be in the following enum: ["${Object.values(mongo_client_1.ServerApiVersion).join('", "')}"]`); + } + return serverApiToValidate; + } + }, + checkKeys: { + type: 'boolean' + }, + compressors: { + default: 'none', + target: 'compressors', + transform({ values }) { + const compressionList = new Set(); + for (const compVal of values) { + const compValArray = typeof compVal === 'string' ? compVal.split(',') : compVal; + if (!Array.isArray(compValArray)) { + throw new error_1.MongoInvalidArgumentError('compressors must be an array or a comma-delimited list of strings'); + } + for (const c of compValArray) { + if (Object.keys(compression_1.Compressor).includes(String(c))) { + compressionList.add(String(c)); + } + else { + throw new error_1.MongoInvalidArgumentError(`${c} is not a valid compression mechanism. Must be one of: ${Object.keys(compression_1.Compressor)}.`); + } + } + } + return [...compressionList]; + } + }, + connectTimeoutMS: { + default: 30000, + type: 'uint' + }, + dbName: { + type: 'string' + }, + directConnection: { + default: false, + type: 'boolean' + }, + driverInfo: { + target: 'metadata', + default: utils_1.makeClientMetadata(), + transform({ options, values: [value] }) { + var _a, _b; + if (!utils_1.isRecord(value)) + throw new error_1.MongoParseError('DriverInfo must be an object'); + return utils_1.makeClientMetadata({ + driverInfo: value, + appName: (_b = (_a = options.metadata) === null || _a === void 0 ? void 0 : _a.application) === null || _b === void 0 ? 
void 0 : _b.name + }); + } + }, + family: { + transform({ name, values: [value] }) { + const transformValue = getInt(name, value); + if (transformValue === 4 || transformValue === 6) { + return transformValue; + } + throw new error_1.MongoParseError(`Option 'family' must be 4 or 6 got ${transformValue}.`); + } + }, + fieldsAsRaw: { + type: 'record' + }, + forceServerObjectId: { + default: false, + type: 'boolean' + }, + fsync: { + deprecated: 'Please use journal instead', + target: 'writeConcern', + transform({ name, options, values: [value] }) { + const wc = write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + fsync: getBoolean(name, value) + } + }); + if (!wc) + throw new error_1.MongoParseError(`Unable to make a writeConcern from fsync=${value}`); + return wc; + } + }, + heartbeatFrequencyMS: { + default: 10000, + type: 'uint' + }, + ignoreUndefined: { + type: 'boolean' + }, + j: { + deprecated: 'Please use journal instead', + target: 'writeConcern', + transform({ name, options, values: [value] }) { + const wc = write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + journal: getBoolean(name, value) + } + }); + if (!wc) + throw new error_1.MongoParseError(`Unable to make a writeConcern from journal=${value}`); + return wc; + } + }, + journal: { + target: 'writeConcern', + transform({ name, options, values: [value] }) { + const wc = write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + journal: getBoolean(name, value) + } + }); + if (!wc) + throw new error_1.MongoParseError(`Unable to make a writeConcern from journal=${value}`); + return wc; + } + }, + keepAlive: { + default: true, + type: 'boolean' + }, + keepAliveInitialDelay: { + default: 120000, + type: 'uint' + }, + loadBalanced: { + default: false, + type: 'boolean' + }, + localThresholdMS: { + default: 15, + type: 'uint' + }, + logger: { + default: new logger_1.Logger('MongoClient'), + transform({ values: [value] }) { + if (value instanceof logger_1.Logger) { + return value; + } + utils_1.emitWarning('Alternative loggers might not be supported'); + // TODO: make Logger an interface that others can implement, make usage consistent in driver + // DRIVERS-1204 + } + }, + loggerLevel: { + target: 'logger', + transform({ values: [value] }) { + return new logger_1.Logger('MongoClient', { loggerLevel: value }); + } + }, + maxIdleTimeMS: { + default: 0, + type: 'uint' + }, + maxPoolSize: { + default: 100, + type: 'uint' + }, + maxStalenessSeconds: { + target: 'readPreference', + transform({ name, options, values: [value] }) { + const maxStalenessSeconds = getUint(name, value); + if (options.readPreference) { + return read_preference_1.ReadPreference.fromOptions({ + readPreference: { ...options.readPreference, maxStalenessSeconds } + }); + } + else { + return new read_preference_1.ReadPreference('secondary', undefined, { maxStalenessSeconds }); + } + } + }, + minInternalBufferSize: { + type: 'uint' + }, + minPoolSize: { + default: 0, + type: 'uint' + }, + minHeartbeatFrequencyMS: { + default: 500, + type: 'uint' + }, + monitorCommands: { + default: false, + type: 'boolean' + }, + name: { + target: 'driverInfo', + transform({ values: [value], options }) { + return { ...options.driverInfo, name: String(value) }; + } + }, + noDelay: { + default: true, + type: 'boolean' + }, + pkFactory: { + default: utils_1.DEFAULT_PK_FACTORY, + transform({ values: [value] }) { + if (utils_1.isRecord(value, ['createPk']) && typeof value.createPk === 
'function') { + return value; + } + throw new error_1.MongoParseError(`Option pkFactory must be an object with a createPk function, got ${value}`); + } + }, + promiseLibrary: { + deprecated: true, + type: 'any' + }, + promoteBuffers: { + type: 'boolean' + }, + promoteLongs: { + type: 'boolean' + }, + promoteValues: { + type: 'boolean' + }, + raw: { + default: false, + type: 'boolean' + }, + readConcern: { + transform({ values: [value], options }) { + if (value instanceof read_concern_1.ReadConcern || utils_1.isRecord(value, ['level'])) { + return read_concern_1.ReadConcern.fromOptions({ ...options.readConcern, ...value }); + } + throw new error_1.MongoParseError(`ReadConcern must be an object, got ${JSON.stringify(value)}`); + } + }, + readConcernLevel: { + target: 'readConcern', + transform({ values: [level], options }) { + return read_concern_1.ReadConcern.fromOptions({ + ...options.readConcern, + level: level + }); + } + }, + readPreference: { + default: read_preference_1.ReadPreference.primary, + transform({ values: [value], options }) { + var _a, _b, _c; + if (value instanceof read_preference_1.ReadPreference) { + return read_preference_1.ReadPreference.fromOptions({ + readPreference: { ...options.readPreference, ...value }, + ...value + }); + } + if (utils_1.isRecord(value, ['mode'])) { + const rp = read_preference_1.ReadPreference.fromOptions({ + readPreference: { ...options.readPreference, ...value }, + ...value + }); + if (rp) + return rp; + else + throw new error_1.MongoParseError(`Cannot make read preference from ${JSON.stringify(value)}`); + } + if (typeof value === 'string') { + const rpOpts = { + hedge: (_a = options.readPreference) === null || _a === void 0 ? void 0 : _a.hedge, + maxStalenessSeconds: (_b = options.readPreference) === null || _b === void 0 ? void 0 : _b.maxStalenessSeconds + }; + return new read_preference_1.ReadPreference(value, (_c = options.readPreference) === null || _c === void 0 ? 
void 0 : _c.tags, rpOpts); + } + } + }, + readPreferenceTags: { + target: 'readPreference', + transform({ values, options }) { + const readPreferenceTags = []; + for (const tag of values) { + const readPreferenceTag = Object.create(null); + if (typeof tag === 'string') { + for (const [k, v] of Object.entries(toRecord(tag))) { + readPreferenceTag[k] = v; + } + } + if (utils_1.isRecord(tag)) { + for (const [k, v] of Object.entries(tag)) { + readPreferenceTag[k] = v; + } + } + readPreferenceTags.push(readPreferenceTag); + } + return read_preference_1.ReadPreference.fromOptions({ + readPreference: options.readPreference, + readPreferenceTags + }); + } + }, + replicaSet: { + type: 'string' + }, + retryReads: { + default: true, + type: 'boolean' + }, + retryWrites: { + default: true, + type: 'boolean' + }, + serializeFunctions: { + type: 'boolean' + }, + serverSelectionTimeoutMS: { + default: 30000, + type: 'uint' + }, + servername: { + type: 'string' + }, + socketTimeoutMS: { + default: 0, + type: 'uint' + }, + ssl: { + target: 'tls', + type: 'boolean' + }, + sslCA: { + target: 'ca', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslCRL: { + target: 'crl', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslCert: { + target: 'cert', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslKey: { + target: 'key', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslPass: { + deprecated: true, + target: 'passphrase', + type: 'string' + }, + sslValidate: { + target: 'rejectUnauthorized', + type: 'boolean' + }, + tls: { + type: 'boolean' + }, + tlsAllowInvalidCertificates: { + target: 'rejectUnauthorized', + transform({ name, values: [value] }) { + // allowInvalidCertificates is the inverse of rejectUnauthorized + return !getBoolean(name, value); + } + }, + tlsAllowInvalidHostnames: { + target: 'checkServerIdentity', + transform({ name, values: [value] }) { + // tlsAllowInvalidHostnames means setting the checkServerIdentity function to a noop + return getBoolean(name, value) ? () => undefined : undefined; + } + }, + tlsCAFile: { + target: 'ca', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + tlsCertificateFile: { + target: 'cert', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + tlsCertificateKeyFile: { + target: 'key', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + tlsCertificateKeyFilePassword: { + target: 'passphrase', + type: 'any' + }, + tlsInsecure: { + transform({ name, options, values: [value] }) { + const tlsInsecure = getBoolean(name, value); + if (tlsInsecure) { + options.checkServerIdentity = () => undefined; + options.rejectUnauthorized = false; + } + else { + options.checkServerIdentity = options.tlsAllowInvalidHostnames + ? () => undefined + : undefined; + options.rejectUnauthorized = options.tlsAllowInvalidCertificates ? 
false : true; + } + return tlsInsecure; + } + }, + w: { + target: 'writeConcern', + transform({ values: [value], options }) { + return write_concern_1.WriteConcern.fromOptions({ writeConcern: { ...options.writeConcern, w: value } }); + } + }, + waitQueueTimeoutMS: { + default: 0, + type: 'uint' + }, + writeConcern: { + target: 'writeConcern', + transform({ values: [value], options }) { + if (utils_1.isRecord(value) || value instanceof write_concern_1.WriteConcern) { + return write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + ...value + } + }); + } + else if (value === 'majority' || typeof value === 'number') { + return write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + w: value + } + }); + } + throw new error_1.MongoParseError(`Invalid WriteConcern cannot parse: ${JSON.stringify(value)}`); + } + }, + wtimeout: { + deprecated: 'Please use wtimeoutMS instead', + target: 'writeConcern', + transform({ values: [value], options }) { + const wc = write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + wtimeout: getUint('wtimeout', value) + } + }); + if (wc) + return wc; + throw new error_1.MongoParseError(`Cannot make WriteConcern from wtimeout`); + } + }, + wtimeoutMS: { + target: 'writeConcern', + transform({ values: [value], options }) { + const wc = write_concern_1.WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + wtimeoutMS: getUint('wtimeoutMS', value) + } + }); + if (wc) + return wc; + throw new error_1.MongoParseError(`Cannot make WriteConcern from wtimeout`); + } + }, + zlibCompressionLevel: { + default: 0, + type: 'int' + }, + // Custom types for modifying core behavior + connectionType: { type: 'any' }, + srvPoller: { type: 'any' }, + // Accepted NodeJS Options + minDHSize: { type: 'any' }, + pskCallback: { type: 'any' }, + secureContext: { type: 'any' }, + enableTrace: { type: 'any' }, + requestCert: { type: 'any' }, + rejectUnauthorized: { type: 'any' }, + checkServerIdentity: { type: 'any' }, + ALPNProtocols: { type: 'any' }, + SNICallback: { type: 'any' }, + session: { type: 'any' }, + requestOCSP: { type: 'any' }, + localAddress: { type: 'any' }, + localPort: { type: 'any' }, + hints: { type: 'any' }, + lookup: { type: 'any' }, + ca: { type: 'any' }, + cert: { type: 'any' }, + ciphers: { type: 'any' }, + crl: { type: 'any' }, + ecdhCurve: { type: 'any' }, + key: { type: 'any' }, + passphrase: { type: 'any' }, + pfx: { type: 'any' }, + secureProtocol: { type: 'any' }, + index: { type: 'any' }, + // Legacy Options, these are unused but left here to avoid errors with CSFLE lib + useNewUrlParser: { type: 'boolean' }, + useUnifiedTopology: { type: 'boolean' } +}; +exports.DEFAULT_OPTIONS = new CaseInsensitiveMap(Object.entries(exports.OPTIONS) + .filter(([, descriptor]) => descriptor.default != null) + .map(([k, d]) => [k, d.default])); +//# sourceMappingURL=connection_string.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/connection_string.js.map b/node_modules/mongodb/lib/connection_string.js.map new file mode 100644 index 000000000..866db7549 --- /dev/null +++ b/node_modules/mongodb/lib/connection_string.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"connection_string.js","sourceRoot":"","sources":["../src/connection_string.ts"],"names":[],"mappings":";;;AAAA,2BAA2B;AAC3B,yBAAyB;AACzB,iFAA6D;AAC7D,6BAAsC;AACtC,2EAAiE;AACjE,uDAAuE;AACvE,iDAA+D;AAC/D,mDAAkD;AAClD,mCAAoF;AACpF,mCASiB;AAEjB,iDAQwB;AACxB,qEAAiE;AAEjE,qCAA+C;AAC/C,yDAAqD;AACrD,2CAAwC;AACxC,kEAA8E;AAE9E,MAAM,iBAAiB,GAAG,CAAC,YAAY,EAAE,YAAY,EAAE,cAAc,CAAC,CAAC;AAEvE,MAAM,oBAAoB,GAAG,kEAAkE,CAAC;AAChG,MAAM,oBAAoB,GAAG,4DAA4D,CAAC;AAC1F,MAAM,0BAA0B,GAC9B,qEAAqE,CAAC;AAExE;;;;;;;GAOG;AACH,SAAS,mBAAmB,CAAC,UAAkB,EAAE,YAAoB;IACnE,MAAM,KAAK,GAAG,QAAQ,CAAC;IACvB,MAAM,GAAG,GAAG,IAAI,UAAU,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,EAAE,CAAC;IAChD,MAAM,MAAM,GAAG,IAAI,YAAY,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,EAAE,CAAC;IACrD,OAAO,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC;AAC9B,CAAC;AAED;;;;;;GAMG;AACH,SAAgB,gBAAgB,CAAC,OAAqB,EAAE,QAAiC;IACvF,IAAI,OAAO,OAAO,CAAC,OAAO,KAAK,QAAQ,EAAE;QACvC,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,oCAAoC,CAAC,CAAC,CAAC;KAC1E;IAED,IAAI,OAAO,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;QACzC,2DAA2D;QAC3D,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,iDAAiD,CAAC,CAAC,CAAC;KACvF;IAED,gFAAgF;IAChF,MAAM,aAAa,GAAG,OAAO,CAAC,OAAO,CAAC;IACtC,GAAG,CAAC,UAAU,CAAC,iBAAiB,aAAa,EAAE,EAAE,CAAC,GAAG,EAAE,SAAS,EAAE,EAAE;QAClE,IAAI,GAAG;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAE9B,IAAI,SAAS,CAAC,MAAM,KAAK,CAAC,EAAE;YAC1B,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,4BAA4B,CAAC,CAAC,CAAC;SAClE;QAED,KAAK,MAAM,EAAE,IAAI,EAAE,IAAI,SAAS,EAAE;YAChC,IAAI,CAAC,mBAAmB,CAAC,IAAI,EAAE,aAAa,CAAC,EAAE;gBAC7C,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,uDAAuD,CAAC,CAAC,CAAC;aAC7F;SACF;QAED,MAAM,aAAa,GAAG,SAAS,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,WACtC,OAAA,mBAAW,CAAC,UAAU,CAAC,GAAG,CAAC,CAAC,IAAI,IAAI,MAAA,CAAC,CAAC,IAAI,mCAAI,KAAK,EAAE,CAAC,CAAA,EAAA,CACvD,CAAC;QAEF,MAAM,OAAO,GAAG,2BAA2B,CAAC,aAAa,EAAE,OAAO,CAAC,CAAC;QACpE,IAAI,OAAO,EAAE;YACX,OAAO,QAAQ,CAAC,OAAO,CAAC,CAAC;SAC1B;QAED,+DAA+D;QAC/D,GAAG,CAAC,UAAU,CAAC,aAAa,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;;YAC5C,IAAI,GAAG,EAAE;gBACP,IAAI,GAAG,CAAC,IAAI,KAAK,SAAS,IAAI,GAAG,CAAC,IAAI,KAAK,WAAW,EAAE;oBACtD,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;iBACtB;aACF;iBAAM;gBACL,IAAI,MAAM,CAAC,MAAM,GAAG,CAAC,EAAE;oBACrB,OAAO,QAAQ,CAAC,IAAI,uBAAe,CAAC,mCAAmC,CAAC,CAAC,CAAC;iBAC3E;gBAED,MAAM,gBAAgB,GAAG,IAAI,qBAAe,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC;gBACjE,MAAM,mBAAmB,GAAG,CAAC,GAAG,gBAAgB,CAAC,IAAI,EAAE,CAAC,CAAC;gBACzD,IAAI,mBAAmB,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC,iBAAiB,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,EAAE;oBACrE,OAAO,QAAQ,CACb,IAAI,uBAAe,CAAC,oCAAoC,iBAAiB,CAAC,IAAI,CAAC,IAAI,CAAC,EAAE,CAAC,CACxF,CAAC;iBACH;gBAED,MAAM,MAAM,GAAG,MAAA,gBAAgB,CAAC,GAAG,CAAC,YAAY,CAAC,mCAAI,SAAS,CAAC;gBAC/D,MAAM,UAAU,GAAG,MAAA,gBAAgB,CAAC,GAAG,CAAC,YAAY,CAAC,mCAAI,SAAS,CAAC;gBACnE,MAAM,YAAY,GAAG,MAAA,gBAAgB,CAAC,GAAG,CAAC,cAAc,CAAC,mCAAI,SAAS,CAAC;gBAEvE,IAAI,MAAM,KAAK,EAAE,IAAI,UAAU,KAAK,EAAE,EAAE;oBACtC,OAAO,QAAQ,CAAC,IAAI,uBAAe,CAAC,gDAAgD,CAAC,CAAC,CAAC;iBACxF;gBAED,IAAI,CAAC,OAAO,CAAC,uBAAuB,IAAI,MAAM,EAAE;oBAC9C,OAAO,CAAC,WAAW,GAAG,oCAAgB,CAAC,KAAK,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,MAAM,EAAE,CAAC,CAAC;iBAC/E;gBAED,IAAI,CAAC,OAAO,CAAC,uBAAuB,IAAI,UAAU,EAAE;oBAClD,OAAO,CAAC,UAAU,GAAG,UAAU,CAAC;iBACjC;gBAED,IAAI,YAAY,KAAK,MAAM,EAAE;oBAC3B,OAAO,CAAC,YAAY,GAAG,IAAI,CAAC;iBAC7B;gBAED,MAAM,OAAO,GAAG,2BAA2B,CAAC,aAAa,EAAE,OAAO,CAAC,CAAC;gBACpE,IAAI,OAAO,EAAE;oBACX,OAAO,QAAQ,CAAC,OAAO,CAAC,CAAC;iBAC1B;aACF;YAED,QAAQ,CAAC,SAAS,EAAE,aAAa,CAAC,CAAC;QACrC,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC;AAlFD,4CAkFC;AAED;;;;;GAKG;AACH,SAAgB,eAAe,CAAC,OAAmB;IACjD,IAAI,CAAC,OAAO;QAAE,OAAO;IACrB,MAAM,KAAK,GAAG,CAA
C,CAAS,EAAE,CAAS,EAAE,EAAE;QACrC,IAAI,OAAO,CAAC,GAAG,CAAC,OAAO,EAAE,CAAC,CAAC,IAAI,OAAO,CAAC,GAAG,CAAC,OAAO,EAAE,CAAC,CAAC,EAAE;YACtD,MAAM,IAAI,uBAAe,CAAC,QAAQ,CAAC,iCAAiC,CAAC,GAAG,CAAC,CAAC;SAC3E;IACH,CAAC,CAAC;IACF,KAAK,CAAC,aAAa,EAAE,6BAA6B,CAAC,CAAC;IACpD,KAAK,CAAC,aAAa,EAAE,0BAA0B,CAAC,CAAC;IACjD,KAAK,CAAC,aAAa,EAAE,sCAAsC,CAAC,CAAC;IAC7D,KAAK,CAAC,aAAa,EAAE,6BAA6B,CAAC,CAAC;IACpD,KAAK,CAAC,6BAA6B,EAAE,sCAAsC,CAAC,CAAC;IAC7E,KAAK,CAAC,6BAA6B,EAAE,6BAA6B,CAAC,CAAC;IACpE,KAAK,CAAC,sCAAsC,EAAE,6BAA6B,CAAC,CAAC;AAC/E,CAAC;AAdD,0CAcC;AAED,MAAM,MAAM,GAAG,IAAI,GAAG,CAAC,CAAC,MAAM,EAAE,GAAG,EAAE,GAAG,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC,CAAC;AACvD,MAAM,UAAU,GAAG,IAAI,GAAG,CAAC,CAAC,OAAO,EAAE,GAAG,EAAE,GAAG,EAAE,GAAG,EAAE,IAAI,EAAE,IAAI,CAAC,CAAC,CAAC;AACjE,SAAS,UAAU,CAAC,IAAY,EAAE,KAAc;IAC9C,IAAI,OAAO,KAAK,KAAK,SAAS;QAAE,OAAO,KAAK,CAAC;IAC7C,MAAM,WAAW,GAAG,MAAM,CAAC,KAAK,CAAC,CAAC,WAAW,EAAE,CAAC;IAChD,IAAI,MAAM,CAAC,GAAG,CAAC,WAAW,CAAC;QAAE,OAAO,IAAI,CAAC;IACzC,IAAI,UAAU,CAAC,GAAG,CAAC,WAAW,CAAC;QAAE,OAAO,KAAK,CAAC;IAC9C,MAAM,IAAI,uBAAe,CAAC,YAAY,IAAI,0CAA0C,KAAK,EAAE,CAAC,CAAC;AAC/F,CAAC;AAED,SAAS,MAAM,CAAC,IAAY,EAAE,KAAc;IAC1C,IAAI,OAAO,KAAK,KAAK,QAAQ;QAAE,OAAO,IAAI,CAAC,KAAK,CAAC,KAAK,CAAC,CAAC;IACxD,MAAM,WAAW,GAAG,MAAM,CAAC,QAAQ,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,CAAC,CAAC;IACvD,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,WAAW,CAAC;QAAE,OAAO,WAAW,CAAC;IACnD,MAAM,IAAI,uBAAe,CAAC,YAAY,IAAI,sCAAsC,KAAK,EAAE,CAAC,CAAC;AAC3F,CAAC;AAED,SAAS,OAAO,CAAC,IAAY,EAAE,KAAc;IAC3C,MAAM,WAAW,GAAG,MAAM,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;IACxC,IAAI,WAAW,GAAG,CAAC,EAAE;QACnB,MAAM,IAAI,uBAAe,CAAC,GAAG,IAAI,2CAA2C,KAAK,EAAE,CAAC,CAAC;KACtF;IACD,OAAO,WAAW,CAAC;AACrB,CAAC;AAED,SAAS,QAAQ,CAAC,KAAa;IAC7B,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;IACnC,MAAM,aAAa,GAAG,KAAK,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;IACvC,KAAK,MAAM,QAAQ,IAAI,aAAa,EAAE;QACpC,MAAM,CAAC,GAAG,EAAE,KAAK,CAAC,GAAG,QAAQ,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;QACzC,IAAI,KAAK,IAAI,IAAI,EAAE;YACjB,MAAM,IAAI,uBAAe,CAAC,iDAAiD,CAAC,CAAC;SAC9E;QACD,IAAI;YACF,uBAAuB;YACvB,MAAM,CAAC,GAAG,CAAC,GAAG,UAAU,CAAC,EAAE,EAAE,KAAK,CAAC,CAAC;SACrC;QAAC,MAAM;YACN,IAAI;gBACF,sBAAsB;gBACtB,MAAM,CAAC,GAAG,CAAC,GAAG,MAAM,CAAC,EAAE,EAAE,KAAK,CAAC,CAAC;aACjC;YAAC,MAAM;gBACN,yBAAyB;gBACzB,MAAM,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;aACrB;SACF;KACF;IACD,OAAO,MAAM,CAAC;AAChB,CAAC;AAED,MAAM,kBAAmB,SAAQ,GAAgB;IAC/C,YAAY,UAAgC,EAAE;QAC5C,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,WAAW,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;IACvD,CAAC;IACD,GAAG,CAAC,CAAS;QACX,OAAO,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAAC;IACpC,CAAC;IACD,GAAG,CAAC,CAAS;QACX,OAAO,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAAC;IACpC,CAAC;IACD,GAAG,CAAC,CAAS,EAAE,CAAM;QACnB,OAAO,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,WAAW,EAAE,EAAE,CAAC,CAAC,CAAC;IACvC,CAAC;IACD,MAAM,CAAC,CAAS;QACd,OAAO,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAAC;IACvC,CAAC;CACF;AAED,SAAgB,YAAY,CAC1B,GAAW,EACX,cAA4D,SAAS,EACrE,UAA8B,EAAE;IAEhC,IAAI,WAAW,IAAI,IAAI,IAAI,CAAC,CAAC,WAAW,YAAY,0BAAW,CAAC,EAAE;QAChE,OAAO,GAAG,WAAW,CAAC;QACtB,WAAW,GAAG,SAAS,CAAC;KACzB;IAED,MAAM,GAAG,GAAG,IAAI,uCAAgB,CAAC,GAAG,CAAC,CAAC;IACtC,MAAM,EAAE,KAAK,EAAE,KAAK,EAAE,GAAG,GAAG,CAAC;IAE7B,MAAM,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;IACzC,YAAY,CAAC,KAAK,GAAG,KAAK,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,mBAAW,CAAC,UAAU,CAAC,CAAC;IACpE,IAAI,KAAK,EAAE;QACT,yCAAyC;QACzC,YAAY,CAAC,OAAO,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC;QAChC,IAAI,CAAC,GAAG,CAAC,YAAY,CAAC,GAAG,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,CAAC,YAAY,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE;YAChE,OAAO,CAAC,GAAG,G
AAG,IAAI,CAAC;SACpB;KACF;IAED,MAAM,UAAU,GAAG,IAAI,kBAAkB,EAAE,CAAC;IAE5C,IAAI,GAAG,CAAC,QAAQ,KAAK,GAAG,IAAI,GAAG,CAAC,QAAQ,KAAK,EAAE,EAAE;QAC/C,MAAM,MAAM,GAAG,kBAAkB,CAC/B,GAAG,CAAC,QAAQ,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAC/D,CAAC;QACF,IAAI,MAAM,EAAE;YACV,UAAU,CAAC,GAAG,CAAC,QAAQ,EAAE,CAAC,MAAM,CAAC,CAAC,CAAC;SACpC;KACF;IAED,IAAI,GAAG,CAAC,QAAQ,KAAK,EAAE,EAAE;QACvB,MAAM,IAAI,GAAa;YACrB,QAAQ,EAAE,kBAAkB,CAAC,GAAG,CAAC,QAAQ,CAAC;SAC3C,CAAC;QAEF,IAAI,OAAO,GAAG,CAAC,QAAQ,KAAK,QAAQ,EAAE;YACpC,IAAI,CAAC,QAAQ,GAAG,kBAAkB,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC;SAClD;QAED,UAAU,CAAC,GAAG,CAAC,MAAM,EAAE,CAAC,IAAI,CAAC,CAAC,CAAC;KAChC;IAED,KAAK,MAAM,GAAG,IAAI,GAAG,CAAC,YAAY,CAAC,IAAI,EAAE,EAAE;QACzC,MAAM,MAAM,GAAG,CAAC,GAAG,GAAG,CAAC,YAAY,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC,CAAC;QAEjD,IAAI,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,EAAE;YACvB,MAAM,IAAI,qBAAa,CAAC,0CAA0C,CAAC,CAAC;SACrE;QAED,IAAI,GAAG,CAAC,WAAW,EAAE,KAAK,WAAW,EAAE;YACrC,MAAM,IAAI,uBAAe,CACvB,qEAAqE,CACtE,CAAC;SACH;QAED,IAAI,GAAG,CAAC,WAAW,EAAE,KAAK,YAAY,IAAI,UAAU,CAAC,GAAG,CAAC,YAAY,CAAC,EAAE;YACtE,2FAA2F;YAC3F,UAAU,CAAC,MAAM,CAAC,YAAY,CAAC,CAAC;SACjC;QAED,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE;YACxB,UAAU,CAAC,GAAG,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;SAC7B;KACF;IAED,MAAM,aAAa,GAAG,IAAI,kBAAkB,CAC1C,MAAM,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,IAAI,IAAI,CAAC,CACrD,CAAC;IAEF,IAAI,aAAa,CAAC,GAAG,CAAC,cAAc,CAAC,EAAE;QACrC,MAAM,IAAI,uBAAe,CAAC,gDAAgD,CAAC,CAAC;KAC7E;IAED,MAAM,UAAU,GAAG,IAAI,kBAAkB,EAAE,CAAC;IAE5C,MAAM,OAAO,GAAG,IAAI,GAAG,CAAS;QAC9B,GAAG,UAAU,CAAC,IAAI,EAAE;QACpB,GAAG,aAAa,CAAC,IAAI,EAAE;QACvB,GAAG,uBAAe,CAAC,IAAI,EAAE;KAC1B,CAAC,CAAC;IAEH,KAAK,MAAM,GAAG,IAAI,OAAO,EAAE;QACzB,MAAM,MAAM,GAAG,EAAE,CAAC;QAClB,IAAI,aAAa,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE;YAC1B,MAAM,CAAC,IAAI,CAAC,aAAa,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC,CAAC;SACrC;QACD,IAAI,UAAU,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE;YACvB,MAAM,CAAC,IAAI,CAAC,GAAG,UAAU,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC,CAAC;SACrC;QACD,IAAI,uBAAe,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE;YAC5B,MAAM,CAAC,IAAI,CAAC,uBAAe,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC,CAAC;SACvC;QACD,UAAU,CAAC,GAAG,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;KAC7B;IAED,IAAI,UAAU,CAAC,GAAG,CAAC,uBAAuB,CAAC,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,oBAAoB,CAAC,EAAE;QACpF,UAAU,CAAC,GAAG,CAAC,oBAAoB,EAAE,UAAU,CAAC,GAAG,CAAC,uBAAuB,CAAC,CAAC,CAAC;KAC/E;IAED,IAAI,UAAU,CAAC,GAAG,CAAC,KAAK,CAAC,IAAI,UAAU,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE;QAClD,MAAM,aAAa,GAAG,CAAC,UAAU,CAAC,GAAG,CAAC,KAAK,CAAC,IAAI,EAAE,CAAC;aAChD,MAAM,CAAC,UAAU,CAAC,GAAG,CAAC,KAAK,CAAC,IAAI,EAAE,CAAC;aACnC,GAAG,CAAC,UAAU,CAAC,IAAI,CAAC,IAAI,EAAE,SAAS,CAAC,CAAC,CAAC;QACzC,IAAI,IAAI,GAAG,CAAC,aAAa,CAAC,CAAC,IAAI,KAAK,CAAC,EAAE;YACrC,MAAM,IAAI,uBAAe,CAAC,yCAAyC,CAAC,CAAC;SACtE;KACF;IAED,MAAM,kBAAkB,GAAG,qBAAa,CACtC,OAAO,EACP,KAAK,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,eAAO,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,WAAW,EAAE,CAAC,CAC3D,CAAC;IACF,IAAI,kBAAkB,CAAC,IAAI,KAAK,CAAC,EAAE;QACjC,MAAM,UAAU,GAAG,kBAAkB,CAAC,IAAI,GAAG,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,QAAQ,CAAC;QACtE,MAAM,OAAO,GAAG,kBAAkB,CAAC,IAAI,GAAG,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC;QAC3D,MAAM,IAAI,uBAAe,CACvB,GAAG,UAAU,IAAI,KAAK,CAAC,IAAI,CAAC,kBAAkB,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,OAAO,gBAAgB,CACtF,CAAC;KACH;IAED,KAAK,MAAM,CAAC,GAAG,EAAE,UAAU,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,eAAO,CAAC,EAAE;QACvD,MAAM,MAAM,GAAG,UAAU,CAAC,GAAG,CAAC,GAAG,CAAC,CAAC;QACnC,IAAI,CAAC,MAAM,IAAI,MAAM,CAAC,MAAM,KAAK,CAAC;YAAE,SAAS;QAC7C,SAAS,CAAC,YAAY,EAAE,GAAG,EAAE,UAAU,EAAE,MAAM,CAAC,
CAAC;KAClD;IAED,IAAI,YAAY,CAAC,WAAW,EAAE;QAC5B,MAAM,QAAQ,GAAG,YAAY,CAAC,WAAW,CAAC,SAAS,KAAK,oCAAa,CAAC,cAAc,CAAC;QACrF,MAAM,MAAM,GAAG,YAAY,CAAC,WAAW,CAAC,SAAS,KAAK,oCAAa,CAAC,YAAY,CAAC;QACjF,MAAM,KAAK,GAAG,YAAY,CAAC,WAAW,CAAC,SAAS,KAAK,oCAAa,CAAC,WAAW,CAAC;QAC/E,IACE,CAAC,QAAQ,IAAI,MAAM,CAAC;YACpB,UAAU,CAAC,GAAG,CAAC,YAAY,CAAC;YAC5B,YAAY,CAAC,WAAW,CAAC,MAAM,KAAK,WAAW,EAC/C;YACA,iEAAiE;YACjE,MAAM,IAAI,uBAAe,CACvB,GAAG,YAAY,CAAC,WAAW,8CAA8C,CAC1E,CAAC;SACH;QAED,IAAI,CAAC,CAAC,QAAQ,IAAI,MAAM,IAAI,KAAK,CAAC,IAAI,YAAY,CAAC,MAAM,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,YAAY,CAAC,EAAE;YAC1F,wEAAwE;YACxE,6CAA6C;YAC7C,YAAY,CAAC,WAAW,GAAG,oCAAgB,CAAC,KAAK,CAAC,YAAY,CAAC,WAAW,EAAE;gBAC1E,MAAM,EAAE,YAAY,CAAC,MAAM;aAC5B,CAAC,CAAC;SACJ;QAED,YAAY,CAAC,WAAW,CAAC,QAAQ,EAAE,CAAC;KACrC;IAED,IAAI,CAAC,YAAY,CAAC,MAAM,EAAE;QACxB,4EAA4E;QAC5E,YAAY,CAAC,MAAM,GAAG,MAAM,CAAC;KAC9B;IAED,eAAe,CAAC,YAAY,CAAC,CAAC;IAE9B,IAAI,OAAO,CAAC,cAAc;QAAE,kCAAe,CAAC,GAAG,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC;IAExE,IAAI,YAAY,CAAC,gBAAgB,IAAI,OAAO,YAAY,CAAC,OAAO,KAAK,QAAQ,EAAE;QAC7E,MAAM,IAAI,qBAAa,CAAC,2CAA2C,CAAC,CAAC;KACtE;IAED,MAAM,OAAO,GAAG,2BAA2B,CAAC,KAAK,EAAE,YAAY,CAAC,CAAC;IACjE,IAAI,OAAO,EAAE;QACX,MAAM,OAAO,CAAC;KACf;IAED,0BAA0B;IAC1B,YAAY,CAAC,uBAAuB;QAClC,aAAa,CAAC,GAAG,CAAC,YAAY,CAAC,IAAI,UAAU,CAAC,GAAG,CAAC,YAAY,CAAC,CAAC;IAClE,YAAY,CAAC,uBAAuB;QAClC,aAAa,CAAC,GAAG,CAAC,YAAY,CAAC,IAAI,UAAU,CAAC,GAAG,CAAC,YAAY,CAAC,CAAC;IAElE,IAAI,WAAW,IAAI,YAAY,CAAC,cAAc,EAAE;QAC9C,qBAAS,CAAC,kBAAkB,EAAE,CAAC;QAC/B,YAAY,CAAC,SAAS,GAAG,IAAI,qBAAS,CAAC,WAAW,EAAE,GAAG,EAAE,OAAO,CAAC,CAAC;QAClE,YAAY,CAAC,aAAa,GAAG,YAAY,CAAC,SAAS,CAAC,aAAa,CAAC;KACnE;IAED,OAAO,YAAY,CAAC;AACtB,CAAC;AA3LD,oCA2LC;AAED,SAAS,2BAA2B,CAClC,KAA+B,EAC/B,YAA0B;IAE1B,IAAI,YAAY,CAAC,YAAY,EAAE;QAC7B,IAAI,KAAK,CAAC,MAAM,GAAG,CAAC,EAAE;YACpB,OAAO,IAAI,uBAAe,CAAC,oBAAoB,CAAC,CAAC;SAClD;QACD,IAAI,YAAY,CAAC,UAAU,EAAE;YAC3B,OAAO,IAAI,uBAAe,CAAC,oBAAoB,CAAC,CAAC;SAClD;QACD,IAAI,YAAY,CAAC,gBAAgB,EAAE;YACjC,OAAO,IAAI,uBAAe,CAAC,0BAA0B,CAAC,CAAC;SACxD;KACF;AACH,CAAC;AAED,SAAS,SAAS,CAChB,YAAiB,EACjB,GAAW,EACX,UAA4B,EAC5B,MAAiB;IAEjB,MAAM,EAAE,MAAM,EAAE,IAAI,EAAE,SAAS,EAAE,UAAU,EAAE,GAAG,UAAU,CAAC;IAC3D,MAAM,IAAI,GAAG,MAAM,aAAN,MAAM,cAAN,MAAM,GAAI,GAAG,CAAC;IAE3B,IAAI,UAAU,EAAE;QACd,MAAM,aAAa,GAAG,OAAO,UAAU,KAAK,QAAQ,CAAC,CAAC,CAAC,KAAK,UAAU,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC;QAC9E,mBAAW,CAAC,GAAG,GAAG,0BAA0B,aAAa,EAAE,CAAC,CAAC;KAC9D;IAED,QAAQ,IAAI,EAAE;QACZ,KAAK,SAAS;YACZ,YAAY,CAAC,IAAI,CAAC,GAAG,UAAU,CAAC,IAAI,EAAE,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;YACjD,MAAM;QACR,KAAK,KAAK;YACR,YAAY,CAAC,IAAI,CAAC,GAAG,MAAM,CAAC,IAAI,EAAE,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;YAC7C,MAAM;QACR,KAAK,MAAM;YACT,YAAY,CAAC,IAAI,CAAC,GAAG,OAAO,CAAC,IAAI,EAAE,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;YAC9C,MAAM;QACR,KAAK,QAAQ;YACX,IAAI,MAAM,CAAC,CAAC,CAAC,IAAI,IAAI,EAAE;gBACrB,MAAM;aACP;YACD,YAAY,CAAC,IAAI,CAAC,GAAG,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;YACvC,MAAM;QACR,KAAK,QAAQ;YACX,IAAI,CAAC,gBAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,EAAE;gBACxB,MAAM,IAAI,uBAAe,CAAC,GAAG,IAAI,oBAAoB,CAAC,CAAC;aACxD;YACD,YAAY,CAAC,IAAI,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;YAC/B,MAAM;QACR,KAAK,KAAK;YACR,YAAY,CAAC,IAAI,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;YAC/B,MAAM;QACR,OAAO,CAAC,CAAC;YACP,IAAI,CAAC,SAAS,EAAE;gBACd,MAAM,IAAI,uBAAe,CAAC,oDAAoD,CAAC,CAAC;aACjF;YACD,MAAM,cAAc,GAAG,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,YAAY,EAAE,MAAM,EAAE,CAAC,CAAC;YAC1E,YAAY,CAAC,IAAI,CAAC,GAAG,cAAc,CAAC;YACpC,MAAM;SACP;KACF;AACH,CAAC;AAgBY,QAAA,OAAO,GAAG;IACrB,OAAO,EAAE;QACP,MAAM,EAAE,UAAU;QAClB,SAAS,CAAC,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YACpC,OAAO,0BAAkB,CAAC,EAAE,GAAG,OAAO,CAAC,UAAU,
EAAE,OAAO,EAAE,MAAM,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;QAC/E,CAAC;KACF;IACD,IAAI,EAAE;QACJ,MAAM,EAAE,aAAa;QACrB,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC1C,IAAI,CAAC,gBAAQ,CAAC,KAAK,EAAE,CAAC,UAAU,EAAE,UAAU,CAAU,CAAC,EAAE;gBACvD,MAAM,IAAI,uBAAe,CACvB,GAAG,IAAI,8DAA8D,CACtE,CAAC;aACH;YACD,OAAO,oCAAgB,CAAC,KAAK,CAAC,OAAO,CAAC,WAAW,EAAE;gBACjD,QAAQ,EAAE,KAAK,CAAC,QAAQ;gBACxB,QAAQ,EAAE,KAAK,CAAC,QAAQ;aACzB,CAAC,CAAC;QACL,CAAC;KACF;IACD,aAAa,EAAE;QACb,MAAM,EAAE,aAAa;QACrB,SAAS,CAAC,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;;YACpC,MAAM,UAAU,GAAG,MAAM,CAAC,MAAM,CAAC,oCAAa,CAAC,CAAC;YAChD,MAAM,CAAC,SAAS,CAAC,GAAG,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,KAAK,CAAC,MAAM,CAAC,MAAM,CAAC,GAAG,CAAA,KAAK,KAAK,IAAI,EAAE,GAAG,CAAC,CAAC,CAAC,CAAC;YAC3F,IAAI,CAAC,SAAS,EAAE;gBACd,MAAM,IAAI,uBAAe,CAAC,wBAAwB,UAAU,SAAS,KAAK,EAAE,CAAC,CAAC;aAC/E;YACD,IAAI,MAAM,GAAG,MAAA,OAAO,CAAC,WAAW,0CAAE,MAAM,CAAC;YACzC,IACE,SAAS,KAAK,oCAAa,CAAC,aAAa;gBACzC,SAAS,KAAK,oCAAa,CAAC,cAAc;gBAC1C,SAAS,KAAK,oCAAa,CAAC,WAAW;gBACvC,SAAS,KAAK,oCAAa,CAAC,YAAY,EACxC;gBACA,sDAAsD;gBACtD,MAAM,GAAG,WAAW,CAAC;aACtB;YAED,IAAI,QAAQ,GAAG,MAAA,OAAO,CAAC,WAAW,0CAAE,QAAQ,CAAC;YAC7C,IAAI,SAAS,KAAK,oCAAa,CAAC,YAAY,IAAI,QAAQ,KAAK,EAAE,EAAE;gBAC/D,QAAQ,GAAG,SAAS,CAAC;aACtB;YACD,OAAO,oCAAgB,CAAC,KAAK,CAAC,OAAO,CAAC,WAAW,EAAE;gBACjD,SAAS;gBACT,MAAM;gBACN,QAAQ;aACT,CAAC,CAAC;QACL,CAAC;KACF;IACD,uBAAuB,EAAE;QACvB,MAAM,EAAE,aAAa;QACrB,SAAS,CAAC,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YACpC,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;gBAC7B,KAAK,GAAG,QAAQ,CAAC,KAAK,CAAC,CAAC;aACzB;YACD,IAAI,CAAC,gBAAQ,CAAC,KAAK,CAAC,EAAE;gBACpB,MAAM,IAAI,uBAAe,CAAC,2CAA2C,CAAC,CAAC;aACxE;YACD,OAAO,oCAAgB,CAAC,KAAK,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,mBAAmB,EAAE,KAAK,EAAE,CAAC,CAAC;QACrF,CAAC;KACF;IACD,UAAU,EAAE;QACV,MAAM,EAAE,aAAa;QACrB,SAAS,CAAC,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YACpC,MAAM,MAAM,GAAG,MAAM,CAAC,KAAK,CAAC,CAAC;YAC7B,OAAO,oCAAgB,CAAC,KAAK,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,MAAM,EAAE,CAAC,CAAC;QACjE,CAAC;KACF;IACD,cAAc,EAAE;QACd,IAAI,EAAE,QAAQ;KACf;IACD,UAAU,EAAE;QACV,IAAI,EAAE,SAAS;KAChB;IACD,SAAS,EAAE;QACT,MAAM,EAAE,WAAW;QACnB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,OAAO,CAAC,EAAE;YAC7B,MAAM,mBAAmB,GACvB,OAAO,OAAO,KAAK,QAAQ,CAAC,CAAC,CAAE,EAAE,OAAO,EAAgB,CAAC,CAAC,CAAE,OAAqB,CAAC;YACpF,MAAM,iBAAiB,GAAG,mBAAmB,IAAI,mBAAmB,CAAC,OAAO,CAAC;YAC7E,IAAI,CAAC,iBAAiB,EAAE;gBACtB,MAAM,IAAI,uBAAe,CACvB,qFAAqF,MAAM,CAAC,MAAM,CAChG,+BAAgB,CACjB,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CACnB,CAAC;aACH;YACD,IAAI,CAAC,MAAM,CAAC,MAAM,CAAC,+BAAgB,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,KAAK,iBAAiB,CAAC,EAAE;gBACvE,MAAM,IAAI,uBAAe,CACvB,8BAA8B,iBAAiB,sCAAsC,MAAM,CAAC,MAAM,CAChG,+BAAgB,CACjB,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CACnB,CAAC;aACH;YACD,OAAO,mBAAmB,CAAC;QAC7B,CAAC;KACF;IACD,SAAS,EAAE;QACT,IAAI,EAAE,SAAS;KAChB;IACD,WAAW,EAAE;QACX,OAAO,EAAE,MAAM;QACf,MAAM,EAAE,aAAa;QACrB,SAAS,CAAC,EAAE,MAAM,EAAE;YAClB,MAAM,eAAe,GAAG,IAAI,GAAG,EAAE,CAAC;YAClC,KAAK,MAAM,OAAO,IAAI,MAAuC,EAAE;gBAC7D,MAAM,YAAY,GAAG,OAAO,OAAO,KAAK,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC;gBAChF,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,YAAY,CAAC,EAAE;oBAChC,MAAM,IAAI,iCAAyB,CACjC,mEAAmE,CACpE,CAAC;iBACH;gBACD,KAAK,MAAM,CAAC,IAAI,YAAY,EAAE;oBAC5B,IAAI,MAAM,CAAC,IAAI,CAAC,wBAAU,CAAC,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,EAAE;wBAC/C,eAAe,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;qBAChC;yBAAM;wBACL,MAAM,IAAI,iCAAyB,CACjC,GAAG,CAAC,0DAA0D,MAAM,CAAC,IAAI,CACvE,wBAAU,CACX,GAAG,CACL,CAAC;qBACH;iBACF;aACF;YACD,OAAO,CAAC,GAAG,eAAe,CAAC,CAAC;QAC9B,CAAC;KACF;IACD,gBAAgB,EAAE;QAChB,OAAO,EAAE,KA
AK;QACd,IAAI,EAAE,MAAM;KACb;IACD,MAAM,EAAE;QACN,IAAI,EAAE,QAAQ;KACf;IACD,gBAAgB,EAAE;QAChB,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,SAAS;KAChB;IACD,UAAU,EAAE;QACV,MAAM,EAAE,UAAU;QAClB,OAAO,EAAE,0BAAkB,EAAE;QAC7B,SAAS,CAAC,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;;YACpC,IAAI,CAAC,gBAAQ,CAAC,KAAK,CAAC;gBAAE,MAAM,IAAI,uBAAe,CAAC,8BAA8B,CAAC,CAAC;YAChF,OAAO,0BAAkB,CAAC;gBACxB,UAAU,EAAE,KAAK;gBACjB,OAAO,EAAE,MAAA,MAAA,OAAO,CAAC,QAAQ,0CAAE,WAAW,0CAAE,IAAI;aAC7C,CAAC,CAAC;QACL,CAAC;KACF;IACD,MAAM,EAAE;QACN,SAAS,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YACjC,MAAM,cAAc,GAAG,MAAM,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;YAC3C,IAAI,cAAc,KAAK,CAAC,IAAI,cAAc,KAAK,CAAC,EAAE;gBAChD,OAAO,cAAc,CAAC;aACvB;YACD,MAAM,IAAI,uBAAe,CAAC,sCAAsC,cAAc,GAAG,CAAC,CAAC;QACrF,CAAC;KACF;IACD,WAAW,EAAE;QACX,IAAI,EAAE,QAAQ;KACf;IACD,mBAAmB,EAAE;QACnB,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,SAAS;KAChB;IACD,KAAK,EAAE;QACL,UAAU,EAAE,4BAA4B;QACxC,MAAM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC1C,MAAM,EAAE,GAAG,4BAAY,CAAC,WAAW,CAAC;gBAClC,YAAY,EAAE;oBACZ,GAAG,OAAO,CAAC,YAAY;oBACvB,KAAK,EAAE,UAAU,CAAC,IAAI,EAAE,KAAK,CAAC;iBAC/B;aACF,CAAC,CAAC;YACH,IAAI,CAAC,EAAE;gBAAE,MAAM,IAAI,uBAAe,CAAC,4CAA4C,KAAK,EAAE,CAAC,CAAC;YACxF,OAAO,EAAE,CAAC;QACZ,CAAC;KACkB;IACrB,oBAAoB,EAAE;QACpB,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,MAAM;KACb;IACD,eAAe,EAAE;QACf,IAAI,EAAE,SAAS;KAChB;IACD,CAAC,EAAE;QACD,UAAU,EAAE,4BAA4B;QACxC,MAAM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC1C,MAAM,EAAE,GAAG,4BAAY,CAAC,WAAW,CAAC;gBAClC,YAAY,EAAE;oBACZ,GAAG,OAAO,CAAC,YAAY;oBACvB,OAAO,EAAE,UAAU,CAAC,IAAI,EAAE,KAAK,CAAC;iBACjC;aACF,CAAC,CAAC;YACH,IAAI,CAAC,EAAE;gBAAE,MAAM,IAAI,uBAAe,CAAC,8CAA8C,KAAK,EAAE,CAAC,CAAC;YAC1F,OAAO,EAAE,CAAC;QACZ,CAAC;KACkB;IACrB,OAAO,EAAE;QACP,MAAM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC1C,MAAM,EAAE,GAAG,4BAAY,CAAC,WAAW,CAAC;gBAClC,YAAY,EAAE;oBACZ,GAAG,OAAO,CAAC,YAAY;oBACvB,OAAO,EAAE,UAAU,CAAC,IAAI,EAAE,KAAK,CAAC;iBACjC;aACF,CAAC,CAAC;YACH,IAAI,CAAC,EAAE;gBAAE,MAAM,IAAI,uBAAe,CAAC,8CAA8C,KAAK,EAAE,CAAC,CAAC;YAC1F,OAAO,EAAE,CAAC;QACZ,CAAC;KACF;IACD,SAAS,EAAE;QACT,OAAO,EAAE,IAAI;QACb,IAAI,EAAE,SAAS;KAChB;IACD,qBAAqB,EAAE;QACrB,OAAO,EAAE,MAAM;QACf,IAAI,EAAE,MAAM;KACb;IACD,YAAY,EAAE;QACZ,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,SAAS;KAChB;IACD,gBAAgB,EAAE;QAChB,OAAO,EAAE,EAAE;QACX,IAAI,EAAE,MAAM;KACb;IACD,MAAM,EAAE;QACN,OAAO,EAAE,IAAI,eAAM,CAAC,aAAa,CAAC;QAClC,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,IAAI,KAAK,YAAY,eAAM,EAAE;gBAC3B,OAAO,KAAK,CAAC;aACd;YACD,mBAAW,CAAC,4CAA4C,CAAC,CAAC;YAC1D,4FAA4F;YAC5F,eAAe;QACjB,CAAC;KACF;IACD,WAAW,EAAE;QACX,MAAM,EAAE,QAAQ;QAChB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,IAAI,eAAM,CAAC,aAAa,EAAE,EAAE,WAAW,EAAE,KAAoB,EAAE,CAAC,CAAC;QAC1E,CAAC;KACF;IACD,aAAa,EAAE;QACb,OAAO,EAAE,CAAC;QACV,IAAI,EAAE,MAAM;KACb;IACD,WAAW,EAAE;QACX,OAAO,EAAE,GAAG;QACZ,IAAI,EAAE,MAAM;KACb;IACD,mBAAmB,EAAE;QACnB,MAAM,EAAE,gBAAgB;QACxB,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC1C,MAAM,mBAAmB,GAAG,OAAO,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;YACjD,IAAI,OAAO,CAAC,cAAc,EAAE;gBAC1B,OAAO,gCAAc,CAAC,WAAW,CAAC;oBAChC,cAAc,EAAE,EAAE,GAAG,OAAO,CAAC,cAAc,EAAE,mBAAmB,EAAE;iBACnE,CAAC,CAAC;aACJ;iBAAM;gBACL,OAAO,IAAI,gCAAc,CAAC,WAAW,EAAE,SAAS,EAAE,EAAE,mBAAmB,EAAE,CAAC,CAAC;aAC5E;QACH,CAAC;KACF;IACD,qBAAqB,EAAE;QACrB,IAAI,EAAE,MAAM;KACb;IACD,WAAW,EAAE;QACX,OAAO,EAAE,CAAC;QACV,IAAI,EAAE,MAAM;KACb;IACD,uBAAuB,EAAE;QACvB,OAAO,EAAE,GAAG;QACZ,IAAI,EAAE,MAAM;KACb;IACD,eAAe,EAAE;QACf,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,SAAS;KAChB;IACD,IAAI
,EAAE;QACJ,MAAM,EAAE,YAAY;QACpB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,OAAO,EAAE,GAAG,OAAO,CAAC,UAAU,EAAE,IAAI,EAAE,MAAM,CAAC,KAAK,CAAC,EAAE,CAAC;QACxD,CAAC;KACkB;IACrB,OAAO,EAAE;QACP,OAAO,EAAE,IAAI;QACb,IAAI,EAAE,SAAS;KAChB;IACD,SAAS,EAAE;QACT,OAAO,EAAE,0BAAkB;QAC3B,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,IAAI,gBAAQ,CAAC,KAAK,EAAE,CAAC,UAAU,CAAU,CAAC,IAAI,OAAO,KAAK,CAAC,QAAQ,KAAK,UAAU,EAAE;gBAClF,OAAO,KAAkB,CAAC;aAC3B;YACD,MAAM,IAAI,uBAAe,CACvB,oEAAoE,KAAK,EAAE,CAC5E,CAAC;QACJ,CAAC;KACF;IACD,cAAc,EAAE;QACd,UAAU,EAAE,IAAI;QAChB,IAAI,EAAE,KAAK;KACZ;IACD,cAAc,EAAE;QACd,IAAI,EAAE,SAAS;KAChB;IACD,YAAY,EAAE;QACZ,IAAI,EAAE,SAAS;KAChB;IACD,aAAa,EAAE;QACb,IAAI,EAAE,SAAS;KAChB;IACD,GAAG,EAAE;QACH,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,SAAS;KAChB;IACD,WAAW,EAAE;QACX,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,IAAI,KAAK,YAAY,0BAAW,IAAI,gBAAQ,CAAC,KAAK,EAAE,CAAC,OAAO,CAAU,CAAC,EAAE;gBACvE,OAAO,0BAAW,CAAC,WAAW,CAAC,EAAE,GAAG,OAAO,CAAC,WAAW,EAAE,GAAG,KAAK,EAAS,CAAC,CAAC;aAC7E;YACD,MAAM,IAAI,uBAAe,CAAC,sCAAsC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;QAC3F,CAAC;KACF;IACD,gBAAgB,EAAE;QAChB,MAAM,EAAE,aAAa;QACrB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,OAAO,0BAAW,CAAC,WAAW,CAAC;gBAC7B,GAAG,OAAO,CAAC,WAAW;gBACtB,KAAK,EAAE,KAAyB;aACjC,CAAC,CAAC;QACL,CAAC;KACF;IACD,cAAc,EAAE;QACd,OAAO,EAAE,gCAAc,CAAC,OAAO;QAC/B,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;;YACpC,IAAI,KAAK,YAAY,gCAAc,EAAE;gBACnC,OAAO,gCAAc,CAAC,WAAW,CAAC;oBAChC,cAAc,EAAE,EAAE,GAAG,OAAO,CAAC,cAAc,EAAE,GAAG,KAAK,EAAE;oBACvD,GAAG,KAAK;iBACF,CAAC,CAAC;aACX;YACD,IAAI,gBAAQ,CAAC,KAAK,EAAE,CAAC,MAAM,CAAU,CAAC,EAAE;gBACtC,MAAM,EAAE,GAAG,gCAAc,CAAC,WAAW,CAAC;oBACpC,cAAc,EAAE,EAAE,GAAG,OAAO,CAAC,cAAc,EAAE,GAAG,KAAK,EAAE;oBACvD,GAAG,KAAK;iBACF,CAAC,CAAC;gBACV,IAAI,EAAE;oBAAE,OAAO,EAAE,CAAC;;oBACb,MAAM,IAAI,uBAAe,CAAC,oCAAoC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;aAC7F;YACD,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;gBAC7B,MAAM,MAAM,GAAG;oBACb,KAAK,EAAE,MAAA,OAAO,CAAC,cAAc,0CAAE,KAAK;oBACpC,mBAAmB,EAAE,MAAA,OAAO,CAAC,cAAc,0CAAE,mBAAmB;iBACjE,CAAC;gBACF,OAAO,IAAI,gCAAc,CACvB,KAA2B,EAC3B,MAAA,OAAO,CAAC,cAAc,0CAAE,IAAI,EAC5B,MAAM,CACP,CAAC;aACH;QACH,CAAC;KACF;IACD,kBAAkB,EAAE;QAClB,MAAM,EAAE,gBAAgB;QACxB,SAAS,CAAC,EAAE,MAAM,EAAE,OAAO,EAAE;YAC3B,MAAM,kBAAkB,GAAG,EAAE,CAAC;YAC9B,KAAK,MAAM,GAAG,IAAI,MAAM,EAAE;gBACxB,MAAM,iBAAiB,GAAW,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;gBACtD,IAAI,OAAO,GAAG,KAAK,QAAQ,EAAE;oBAC3B,KAAK,MAAM,CAAC,CAAC,EAAE,CAAC,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,EAAE;wBAClD,iBAAiB,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;qBAC1B;iBACF;gBACD,IAAI,gBAAQ,CAAC,GAAG,CAAC,EAAE;oBACjB,KAAK,MAAM,CAAC,CAAC,EAAE,CAAC,CAAC,IAAI,MAAM,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;wBACxC,iBAAiB,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;qBAC1B;iBACF;gBACD,kBAAkB,CAAC,IAAI,CAAC,iBAAiB,CAAC,CAAC;aAC5C;YACD,OAAO,gCAAc,CAAC,WAAW,CAAC;gBAChC,cAAc,EAAE,OAAO,CAAC,cAAc;gBACtC,kBAAkB;aACnB,CAAC,CAAC;QACL,CAAC;KACF;IACD,UAAU,EAAE;QACV,IAAI,EAAE,QAAQ;KACf;IACD,UAAU,EAAE;QACV,OAAO,EAAE,IAAI;QACb,IAAI,EAAE,SAAS;KAChB;IACD,WAAW,EAAE;QACX,OAAO,EAAE,IAAI;QACb,IAAI,EAAE,SAAS;KAChB;IACD,kBAAkB,EAAE;QAClB,IAAI,EAAE,SAAS;KAChB;IACD,wBAAwB,EAAE;QACxB,OAAO,EAAE,KAAK;QACd,IAAI,EAAE,MAAM;KACb;IACD,UAAU,EAAE;QACV,IAAI,EAAE,QAAQ;KACf;IACD,eAAe,EAAE;QACf,OAAO,EAAE,CAAC;QACV,IAAI,EAAE,MAAM;KACb;IACD,GAAG,EAAE;QACH,MAAM,EAAE,KAAK;QACb,IAAI,EAAE,SAAS;KAChB;IACD,KAAK,EAAE;QACL,MAAM,EAAE,IAAI;QACZ,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD
,MAAM,EAAE;QACN,MAAM,EAAE,KAAK;QACb,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD,OAAO,EAAE;QACP,MAAM,EAAE,MAAM;QACd,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD,MAAM,EAAE;QACN,MAAM,EAAE,KAAK;QACb,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD,OAAO,EAAE;QACP,UAAU,EAAE,IAAI;QAChB,MAAM,EAAE,YAAY;QACpB,IAAI,EAAE,QAAQ;KACf;IACD,WAAW,EAAE;QACX,MAAM,EAAE,oBAAoB;QAC5B,IAAI,EAAE,SAAS;KAChB;IACD,GAAG,EAAE;QACH,IAAI,EAAE,SAAS;KAChB;IACD,2BAA2B,EAAE;QAC3B,MAAM,EAAE,oBAAoB;QAC5B,SAAS,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YACjC,gEAAgE;YAChE,OAAO,CAAC,UAAU,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;QAClC,CAAC;KACF;IACD,wBAAwB,EAAE;QACxB,MAAM,EAAE,qBAAqB;QAC7B,SAAS,CAAC,EAAE,IAAI,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YACjC,oFAAoF;YACpF,OAAO,UAAU,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC,CAAC,CAAC,GAAG,EAAE,CAAC,SAAS,CAAC,CAAC,CAAC,SAAS,CAAC;QAC/D,CAAC;KACF;IACD,SAAS,EAAE;QACT,MAAM,EAAE,IAAI;QACZ,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD,kBAAkB,EAAE;QAClB,MAAM,EAAE,MAAM;QACd,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD,qBAAqB,EAAE;QACrB,MAAM,EAAE,KAAK;QACb,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,EAAE,CAAC,YAAY,CAAC,MAAM,CAAC,KAAK,CAAC,EAAE,EAAE,QAAQ,EAAE,OAAO,EAAE,CAAC,CAAC;QAC/D,CAAC;KACF;IACD,6BAA6B,EAAE;QAC7B,MAAM,EAAE,YAAY;QACpB,IAAI,EAAE,KAAK;KACZ;IACD,WAAW,EAAE;QACX,SAAS,CAAC,EAAE,IAAI,EAAE,OAAO,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE;YAC1C,MAAM,WAAW,GAAG,UAAU,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;YAC5C,IAAI,WAAW,EAAE;gBACf,OAAO,CAAC,mBAAmB,GAAG,GAAG,EAAE,CAAC,SAAS,CAAC;gBAC9C,OAAO,CAAC,kBAAkB,GAAG,KAAK,CAAC;aACpC;iBAAM;gBACL,OAAO,CAAC,mBAAmB,GAAG,OAAO,CAAC,wBAAwB;oBAC5D,CAAC,CAAC,GAAG,EAAE,CAAC,SAAS;oBACjB,CAAC,CAAC,SAAS,CAAC;gBACd,OAAO,CAAC,kBAAkB,GAAG,OAAO,CAAC,2BAA2B,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC;aACjF;YACD,OAAO,WAAW,CAAC;QACrB,CAAC;KACF;IACD,CAAC,EAAE;QACD,MAAM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,OAAO,4BAAY,CAAC,WAAW,CAAC,EAAE,YAAY,EAAE,EAAE,GAAG,OAAO,CAAC,YAAY,EAAE,CAAC,EAAE,KAAU,EAAE,EAAE,CAAC,CAAC;QAChG,CAAC;KACF;IACD,kBAAkB,EAAE;QAClB,OAAO,EAAE,CAAC;QACV,IAAI,EAAE,MAAM;KACb;IACD,YAAY,EAAE;QACZ,MAAM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,IAAI,gBAAQ,CAAC,KAAK,CAAC,IAAI,KAAK,YAAY,4BAAY,EAAE;gBACpD,OAAO,4BAAY,CAAC,WAAW,CAAC;oBAC9B,YAAY,EAAE;wBACZ,GAAG,OAAO,CAAC,YAAY;wBACvB,GAAG,KAAK;qBACT;iBACF,CAAC,CAAC;aACJ;iBAAM,IAAI,KAAK,KAAK,UAAU,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;gBAC5D,OAAO,4BAAY,CAAC,WAAW,CAAC;oBAC9B,YAAY,EAAE;wBACZ,GAAG,OAAO,CAAC,YAAY;wBACvB,CAAC,EAAE,KAAK;qBACT;iBACF,CAAC,CAAC;aACJ;YAED,MAAM,IAAI,uBAAe,CAAC,sCAAsC,IAAI,CAAC,SAAS,CAAC,KAAK,CAAC,EAAE,CAAC,CAAC;QAC3F,CAAC;KACkB;IACrB,QAAQ,EAAE;QACR,UAAU,EAAE,+BAA+B;QAC3C,MAAM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,MAAM,EAAE,GAAG,4BAAY,CAAC,WAAW,CAAC;gBAClC,YAAY,EAAE;oBACZ,GAAG,OAAO,CAAC,YAAY;oBACvB,QAAQ,EAAE,OAAO,CAAC,UAAU,EAAE,KAAK,CAAC;iBACrC;aACF,CAAC,CAAC;YACH,IAAI,EAAE;gBAAE,OAAO,EAAE,CAAC;YAClB,MAAM,IAAI,uBAAe,CAAC,wCAAwC,CAAC,CAAC;QACtE,CAAC;KACkB;IACrB,UAAU,EAAE;QACV,MA
AM,EAAE,cAAc;QACtB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC,KAAK,CAAC,EAAE,OAAO,EAAE;YACpC,MAAM,EAAE,GAAG,4BAAY,CAAC,WAAW,CAAC;gBAClC,YAAY,EAAE;oBACZ,GAAG,OAAO,CAAC,YAAY;oBACvB,UAAU,EAAE,OAAO,CAAC,YAAY,EAAE,KAAK,CAAC;iBACzC;aACF,CAAC,CAAC;YACH,IAAI,EAAE;gBAAE,OAAO,EAAE,CAAC;YAClB,MAAM,IAAI,uBAAe,CAAC,wCAAwC,CAAC,CAAC;QACtE,CAAC;KACF;IACD,oBAAoB,EAAE;QACpB,OAAO,EAAE,CAAC;QACV,IAAI,EAAE,KAAK;KACZ;IACD,2CAA2C;IAC3C,cAAc,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC/B,SAAS,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC1B,0BAA0B;IAC1B,SAAS,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC1B,WAAW,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC5B,aAAa,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC9B,WAAW,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC5B,WAAW,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC5B,kBAAkB,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACnC,mBAAmB,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACpC,aAAa,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC9B,WAAW,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC5B,OAAO,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACxB,WAAW,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC5B,YAAY,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC7B,SAAS,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC1B,KAAK,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACtB,MAAM,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACvB,EAAE,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACnB,IAAI,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACrB,OAAO,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACxB,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACpB,SAAS,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC1B,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACpB,UAAU,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC3B,GAAG,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACpB,cAAc,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IAC/B,KAAK,EAAE,EAAE,IAAI,EAAE,KAAK,EAAE;IACtB,gFAAgF;IAChF,eAAe,EAAE,EAAE,IAAI,EAAE,SAAS,EAAsB;IACxD,kBAAkB,EAAE,EAAE,IAAI,EAAE,SAAS,EAAsB;CACN,CAAC;AAE3C,QAAA,eAAe,GAAG,IAAI,kBAAkB,CACnD,MAAM,CAAC,OAAO,CAAC,eAAO,CAAC;KACpB,MAAM,CAAC,CAAC,CAAC,EAAE,UAAU,CAAC,EAAE,EAAE,CAAC,UAAU,CAAC,OAAO,IAAI,IAAI,CAAC;KACtD,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,OAAO,CAAC,CAAC,CACnC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/constants.js b/node_modules/mongodb/lib/constants.js new file mode 100644 index 000000000..f7ee65383 --- /dev/null +++ b/node_modules/mongodb/lib/constants.js @@ -0,0 +1,10 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.SYSTEM_JS_COLLECTION = exports.SYSTEM_COMMAND_COLLECTION = exports.SYSTEM_USER_COLLECTION = exports.SYSTEM_PROFILE_COLLECTION = exports.SYSTEM_INDEX_COLLECTION = exports.SYSTEM_NAMESPACE_COLLECTION = void 0; +exports.SYSTEM_NAMESPACE_COLLECTION = 'system.namespaces'; +exports.SYSTEM_INDEX_COLLECTION = 'system.indexes'; +exports.SYSTEM_PROFILE_COLLECTION = 'system.profile'; +exports.SYSTEM_USER_COLLECTION = 'system.users'; +exports.SYSTEM_COMMAND_COLLECTION = '$cmd'; +exports.SYSTEM_JS_COLLECTION = 'system.js'; +//# sourceMappingURL=constants.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/constants.js.map b/node_modules/mongodb/lib/constants.js.map new file mode 100644 index 000000000..67df39ce1 --- /dev/null +++ b/node_modules/mongodb/lib/constants.js.map @@ -0,0 +1 @@ +{"version":3,"file":"constants.js","sourceRoot":"","sources":["../src/constants.ts"],"names":[],"mappings":";;;AAAa,QAAA,2BAA2B,GAAG,mBAAmB,CAAC;AAClD,QAAA,uBAAuB,GAAG,gBAAgB,CAAC;AAC3C,QAAA,yBAAyB,GAAG,gBAAgB,CAAC;AAC7C,QAAA,sBAAsB,GAAG,cAAc,CAAC;AACxC,QAAA,yBAAyB,GAAG,MAAM,CAAC;AACnC,QAAA,oBAAoB,GAAG,WAAW,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cursor/abstract_cursor.js b/node_modules/mongodb/lib/cursor/abstract_cursor.js new file mode 100644 index 000000000..a63d17860 --- 
/dev/null +++ b/node_modules/mongodb/lib/cursor/abstract_cursor.js @@ -0,0 +1,629 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.assertUninitialized = exports.AbstractCursor = exports.CURSOR_FLAGS = void 0; +const utils_1 = require("../utils"); +const bson_1 = require("../bson"); +const sessions_1 = require("../sessions"); +const error_1 = require("../error"); +const read_preference_1 = require("../read_preference"); +const stream_1 = require("stream"); +const read_concern_1 = require("../read_concern"); +const mongo_types_1 = require("../mongo_types"); +/** @internal */ +const kId = Symbol('id'); +/** @internal */ +const kDocuments = Symbol('documents'); +/** @internal */ +const kServer = Symbol('server'); +/** @internal */ +const kNamespace = Symbol('namespace'); +/** @internal */ +const kTopology = Symbol('topology'); +/** @internal */ +const kSession = Symbol('session'); +/** @internal */ +const kOptions = Symbol('options'); +/** @internal */ +const kTransform = Symbol('transform'); +/** @internal */ +const kInitialized = Symbol('initialized'); +/** @internal */ +const kClosed = Symbol('closed'); +/** @internal */ +const kKilled = Symbol('killed'); +/** @public */ +exports.CURSOR_FLAGS = [ + 'tailable', + 'oplogReplay', + 'noCursorTimeout', + 'awaitData', + 'exhaust', + 'partial' +]; +/** @public */ +class AbstractCursor extends mongo_types_1.TypedEventEmitter { + /** @internal */ + constructor(topology, namespace, options = {}) { + super(); + this[kTopology] = topology; + this[kNamespace] = namespace; + this[kDocuments] = []; // TODO: https://github.com/microsoft/TypeScript/issues/36230 + this[kInitialized] = false; + this[kClosed] = false; + this[kKilled] = false; + this[kOptions] = { + readPreference: options.readPreference && options.readPreference instanceof read_preference_1.ReadPreference + ? options.readPreference + : read_preference_1.ReadPreference.primary, + ...bson_1.pluckBSONSerializeOptions(options) + }; + const readConcern = read_concern_1.ReadConcern.fromOptions(options); + if (readConcern) { + this[kOptions].readConcern = readConcern; + } + if (typeof options.batchSize === 'number') { + this[kOptions].batchSize = options.batchSize; + } + if (options.comment != null) { + this[kOptions].comment = options.comment; + } + if (typeof options.maxTimeMS === 'number') { + this[kOptions].maxTimeMS = options.maxTimeMS; + } + if (options.session instanceof sessions_1.ClientSession) { + this[kSession] = options.session; + } + } + get id() { + return this[kId]; + } + /** @internal */ + get topology() { + return this[kTopology]; + } + /** @internal */ + get server() { + return this[kServer]; + } + get namespace() { + return this[kNamespace]; + } + get readPreference() { + return this[kOptions].readPreference; + } + get readConcern() { + return this[kOptions].readConcern; + } + /** @internal */ + get session() { + return this[kSession]; + } + set session(clientSession) { + this[kSession] = clientSession; + } + /** @internal */ + get cursorOptions() { + return this[kOptions]; + } + get closed() { + return this[kClosed]; + } + get killed() { + return this[kKilled]; + } + get loadBalanced() { + return this[kTopology].loadBalanced; + } + /** Returns current buffered documents length */ + bufferedCount() { + return this[kDocuments].length; + } + /** Returns current buffered documents */ + readBufferedDocuments(number) { + return this[kDocuments].splice(0, number !== null && number !== void 0 ? 
number : this[kDocuments].length); + } + [Symbol.asyncIterator]() { + return { + next: () => this.next().then(value => value != null ? { value, done: false } : { value: undefined, done: true }) + }; + } + stream(options) { + if (options === null || options === void 0 ? void 0 : options.transform) { + const transform = options.transform; + const readable = makeCursorStream(this); + return readable.pipe(new stream_1.Transform({ + objectMode: true, + highWaterMark: 1, + transform(chunk, _, callback) { + try { + const transformed = transform(chunk); + callback(undefined, transformed); + } + catch (err) { + callback(err); + } + } + })); + } + return makeCursorStream(this); + } + hasNext(callback) { + return utils_1.maybePromise(callback, done => { + if (this[kId] === bson_1.Long.ZERO) { + return done(undefined, false); + } + if (this[kDocuments].length) { + return done(undefined, true); + } + next(this, true, (err, doc) => { + if (err) + return done(err); + if (doc) { + this[kDocuments].unshift(doc); + done(undefined, true); + return; + } + done(undefined, false); + }); + }); + } + next(callback) { + return utils_1.maybePromise(callback, done => { + if (this[kId] === bson_1.Long.ZERO) { + return done(new error_1.MongoCursorExhaustedError()); + } + next(this, true, done); + }); + } + tryNext(callback) { + return utils_1.maybePromise(callback, done => { + if (this[kId] === bson_1.Long.ZERO) { + return done(new error_1.MongoCursorExhaustedError()); + } + next(this, false, done); + }); + } + forEach(iterator, callback) { + if (typeof iterator !== 'function') { + throw new error_1.MongoInvalidArgumentError('Argument "iterator" must be a function'); + } + return utils_1.maybePromise(callback, done => { + const transform = this[kTransform]; + const fetchDocs = () => { + next(this, true, (err, doc) => { + if (err || doc == null) + return done(err); + let result; + // NOTE: no need to transform because `next` will do this automatically + try { + result = iterator(doc); // TODO(NODE-3283): Improve transform typing + } + catch (error) { + return done(error); + } + if (result === false) + return done(); + // these do need to be transformed since they are copying the rest of the batch + const internalDocs = this[kDocuments].splice(0, this[kDocuments].length); + for (let i = 0; i < internalDocs.length; ++i) { + try { + result = iterator((transform ? transform(internalDocs[i]) : internalDocs[i]) // TODO(NODE-3283): Improve transform typing + ); + } + catch (error) { + return done(error); + } + if (result === false) + return done(); + } + fetchDocs(); + }); + }; + fetchDocs(); + }); + } + close(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + const needsToEmitClosed = !this[kClosed]; + this[kClosed] = true; + return utils_1.maybePromise(callback, done => cleanupCursor(this, { needsToEmitClosed }, done)); + } + toArray(callback) { + return utils_1.maybePromise(callback, done => { + const docs = []; + const transform = this[kTransform]; + const fetchDocs = () => { + // NOTE: if we add a `nextBatch` then we should use it here + next(this, true, (err, doc) => { + if (err) + return done(err); + if (doc == null) + return done(undefined, docs); + // NOTE: no need to transform because `next` will do this automatically + docs.push(doc); + // these do need to be transformed since they are copying the rest of the batch + const internalDocs = (transform + ? 
this[kDocuments].splice(0, this[kDocuments].length).map(transform) + : this[kDocuments].splice(0, this[kDocuments].length)); // TODO(NODE-3283): Improve transform typing + if (internalDocs) { + docs.push(...internalDocs); + } + fetchDocs(); + }); + }; + fetchDocs(); + }); + } + /** + * Add a cursor flag to the cursor + * + * @param flag - The flag to set, must be one of following ['tailable', 'oplogReplay', 'noCursorTimeout', 'awaitData', 'partial' -. + * @param value - The flag boolean value. + */ + addCursorFlag(flag, value) { + assertUninitialized(this); + if (!exports.CURSOR_FLAGS.includes(flag)) { + throw new error_1.MongoInvalidArgumentError(`Flag ${flag} is not one of ${exports.CURSOR_FLAGS}`); + } + if (typeof value !== 'boolean') { + throw new error_1.MongoInvalidArgumentError(`Flag ${flag} must be a boolean value`); + } + this[kOptions][flag] = value; + return this; + } + /** + * Map all documents using the provided function + * If there is a transform set on the cursor, that will be called first and the result passed to + * this function's transform. + * + * @remarks + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling map, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor = coll.find(); + * const mappedCursor: FindCursor = cursor.map(doc => Object.keys(doc).length); + * const keyCounts: number[] = await mappedCursor.toArray(); // cursor.toArray() still returns Document[] + * ``` + * @param transform - The mapping transformation method. + */ + map(transform) { + assertUninitialized(this); + const oldTransform = this[kTransform]; // TODO(NODE-3283): Improve transform typing + if (oldTransform) { + this[kTransform] = doc => { + return transform(oldTransform(doc)); + }; + } + else { + this[kTransform] = transform; + } + return this; + } + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadPreference(readPreference) { + assertUninitialized(this); + if (readPreference instanceof read_preference_1.ReadPreference) { + this[kOptions].readPreference = readPreference; + } + else if (typeof readPreference === 'string') { + this[kOptions].readPreference = read_preference_1.ReadPreference.fromString(readPreference); + } + else { + throw new error_1.MongoInvalidArgumentError(`Invalid read preference: ${readPreference}`); + } + return this; + } + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadConcern(readConcern) { + assertUninitialized(this); + const resolvedReadConcern = read_concern_1.ReadConcern.fromOptions({ readConcern }); + if (resolvedReadConcern) { + this[kOptions].readConcern = resolvedReadConcern; + } + return this; + } + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value) { + assertUninitialized(this); + if (typeof value !== 'number') { + throw new error_1.MongoInvalidArgumentError('Argument for maxTimeMS must be a number'); + } + this[kOptions].maxTimeMS = value; + return this; + } + /** + * Set the batch size for the cursor. 
+ * + * @param value - The number of documents to return per batch. See {@link https://docs.mongodb.com/manual/reference/command/find/|find command documentation}. + */ + batchSize(value) { + assertUninitialized(this); + if (this[kOptions].tailable) { + throw new error_1.MongoTailableCursorError('Tailable cursor does not support batchSize'); + } + if (typeof value !== 'number') { + throw new error_1.MongoInvalidArgumentError('Operation "batchSize" requires an integer'); + } + this[kOptions].batchSize = value; + return this; + } + /** + * Rewind this cursor to its uninitialized state. Any options that are present on the cursor will + * remain in effect. Iterating this cursor will cause new queries to be sent to the server, even + * if the resultant data has already been retrieved by this cursor. + */ + rewind() { + if (!this[kInitialized]) { + return; + } + this[kId] = undefined; + this[kDocuments] = []; + this[kClosed] = false; + this[kKilled] = false; + this[kInitialized] = false; + const session = this[kSession]; + if (session) { + // We only want to end this session if we created it, and it hasn't ended yet + if (session.explicit === false && !session.hasEnded) { + session.endSession(); + } + this[kSession] = undefined; + } + } + /** @internal */ + _getMore(batchSize, callback) { + const cursorId = this[kId]; + const cursorNs = this[kNamespace]; + const server = this[kServer]; + if (cursorId == null) { + callback(new error_1.MongoRuntimeError('Unable to iterate cursor with no id')); + return; + } + if (server == null) { + callback(new error_1.MongoRuntimeError('Unable to iterate cursor without selected server')); + return; + } + server.getMore(cursorNs, cursorId, { + ...this[kOptions], + session: this[kSession], + batchSize + }, callback); + } +} +exports.AbstractCursor = AbstractCursor; +/** @event */ +AbstractCursor.CLOSE = 'close'; +function nextDocument(cursor) { + if (cursor[kDocuments] == null || !cursor[kDocuments].length) { + return null; + } + const doc = cursor[kDocuments].shift(); + if (doc) { + const transform = cursor[kTransform]; + if (transform) { + return transform(doc); + } + return doc; + } + return null; +} +function next(cursor, blocking, callback) { + const cursorId = cursor[kId]; + if (cursor.closed) { + return callback(undefined, null); + } + if (cursor[kDocuments] && cursor[kDocuments].length) { + callback(undefined, nextDocument(cursor)); + return; + } + if (cursorId == null) { + // All cursors must operate within a session, one must be made implicitly if not explicitly provided + if (cursor[kSession] == null && cursor[kTopology].hasSessionSupport()) { + cursor[kSession] = cursor[kTopology].startSession({ owner: cursor, explicit: false }); + } + cursor._initialize(cursor[kSession], (err, state) => { + if (state) { + const response = state.response; + cursor[kServer] = state.server; + cursor[kSession] = state.session; + if (response.cursor) { + cursor[kId] = + typeof response.cursor.id === 'number' + ? bson_1.Long.fromNumber(response.cursor.id) + : response.cursor.id; + if (response.cursor.ns) { + cursor[kNamespace] = utils_1.ns(response.cursor.ns); + } + cursor[kDocuments] = response.cursor.firstBatch; + } + else { + // NOTE: This is for support of older servers (<3.2) which do not use commands + cursor[kId] = + typeof response.cursorId === 'number' + ? 
bson_1.Long.fromNumber(response.cursorId) + : response.cursorId; + cursor[kDocuments] = response.documents; + } + // When server responses return without a cursor document, we close this cursor + // and return the raw server response. This is often the case for explain commands + // for example + if (cursor[kId] == null) { + cursor[kId] = bson_1.Long.ZERO; + // TODO(NODE-3286): ExecutionResult needs to accept a generic parameter + cursor[kDocuments] = [state.response]; + } + } + // the cursor is now initialized, even if an error occurred or it is dead + cursor[kInitialized] = true; + if (err || cursorIsDead(cursor)) { + return cleanupCursor(cursor, { error: err }, () => callback(err, nextDocument(cursor))); + } + next(cursor, blocking, callback); + }); + return; + } + if (cursorIsDead(cursor)) { + return cleanupCursor(cursor, undefined, () => callback(undefined, null)); + } + // otherwise need to call getMore + const batchSize = cursor[kOptions].batchSize || 1000; + cursor._getMore(batchSize, (err, response) => { + if (response) { + const cursorId = typeof response.cursor.id === 'number' + ? bson_1.Long.fromNumber(response.cursor.id) + : response.cursor.id; + cursor[kDocuments] = response.cursor.nextBatch; + cursor[kId] = cursorId; + } + if (err || cursorIsDead(cursor)) { + return cleanupCursor(cursor, { error: err }, () => callback(err, nextDocument(cursor))); + } + if (cursor[kDocuments].length === 0 && blocking === false) { + return callback(undefined, null); + } + next(cursor, blocking, callback); + }); +} +function cursorIsDead(cursor) { + const cursorId = cursor[kId]; + return !!cursorId && cursorId.isZero(); +} +function cleanupCursor(cursor, options, callback) { + var _a; + const cursorId = cursor[kId]; + const cursorNs = cursor[kNamespace]; + const server = cursor[kServer]; + const session = cursor[kSession]; + const error = options === null || options === void 0 ? void 0 : options.error; + const needsToEmitClosed = (_a = options === null || options === void 0 ? void 0 : options.needsToEmitClosed) !== null && _a !== void 0 ? 
_a : cursor[kDocuments].length === 0; + if (error) { + if (cursor.loadBalanced && error instanceof error_1.MongoNetworkError) { + return completeCleanup(); + } + } + if (cursorId == null || server == null || cursorId.isZero() || cursorNs == null) { + if (needsToEmitClosed) { + cursor[kClosed] = true; + cursor[kId] = bson_1.Long.ZERO; + cursor.emit(AbstractCursor.CLOSE); + } + if (session) { + if (session.owner === cursor) { + return session.endSession({ error }, callback); + } + if (!session.inTransaction()) { + sessions_1.maybeClearPinnedConnection(session, { error }); + } + } + return callback(); + } + function completeCleanup() { + if (session) { + if (session.owner === cursor) { + return session.endSession({ error }, () => { + cursor.emit(AbstractCursor.CLOSE); + callback(); + }); + } + if (!session.inTransaction()) { + sessions_1.maybeClearPinnedConnection(session, { error }); + } + } + cursor.emit(AbstractCursor.CLOSE); + return callback(); + } + cursor[kKilled] = true; + server.killCursors(cursorNs, [cursorId], { ...bson_1.pluckBSONSerializeOptions(cursor[kOptions]), session }, () => completeCleanup()); +} +/** @internal */ +function assertUninitialized(cursor) { + if (cursor[kInitialized]) { + throw new error_1.MongoCursorInUseError(); + } +} +exports.assertUninitialized = assertUninitialized; +function makeCursorStream(cursor) { + const readable = new stream_1.Readable({ + objectMode: true, + autoDestroy: false, + highWaterMark: 1 + }); + let initialized = false; + let reading = false; + let needToClose = true; // NOTE: we must close the cursor if we never read from it, use `_construct` in future node versions + readable._read = function () { + if (initialized === false) { + needToClose = false; + initialized = true; + } + if (!reading) { + reading = true; + readNext(); + } + }; + readable._destroy = function (error, cb) { + if (needToClose) { + cursor.close(err => process.nextTick(cb, err || error)); + } + else { + cb(error); + } + }; + function readNext() { + needToClose = false; + next(cursor, true, (err, result) => { + needToClose = err ? !cursor.closed : result != null; + if (err) { + // NOTE: This is questionable, but we have a test backing the behavior. It seems the + // desired behavior is that a stream ends cleanly when a user explicitly closes + // a client during iteration. Alternatively, we could do the "right" thing and + // propagate the error message by removing this special case. + if (err.message.match(/server is closed/)) { + cursor.close(); + return readable.push(null); + } + // NOTE: This is also perhaps questionable. The rationale here is that these errors tend + // to be "operation interrupted", where a cursor has been closed but there is an + // active getMore in-flight. This used to check if the cursor was killed but once + // that changed to happen in cleanup legitimate errors would not destroy the + // stream. There are change streams test specifically test these cases. 
+ if (err.message.match(/interrupted/)) { + return readable.push(null); + } + return readable.destroy(err); + } + if (result == null) { + readable.push(null); + } + else if (readable.destroyed) { + cursor.close(); + } + else { + if (readable.push(result)) { + return readNext(); + } + reading = false; + } + }); + } + return readable; +} +//# sourceMappingURL=abstract_cursor.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cursor/abstract_cursor.js.map b/node_modules/mongodb/lib/cursor/abstract_cursor.js.map new file mode 100644 index 000000000..30b3074bb --- /dev/null +++ b/node_modules/mongodb/lib/cursor/abstract_cursor.js.map @@ -0,0 +1 @@ +{"version":3,"file":"abstract_cursor.js","sourceRoot":"","sources":["../../src/cursor/abstract_cursor.ts"],"names":[],"mappings":";;;AAAA,oCAAwE;AACxE,kCAA0F;AAC1F,0CAAwE;AACxE,oCAQkB;AAClB,wDAAwE;AAGxE,mCAA6C;AAE7C,kDAA+D;AAC/D,gDAAmE;AAEnE,gBAAgB;AAChB,MAAM,GAAG,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC;AACzB,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,SAAS,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC;AACrC,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AACnC,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AACnC,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AAEjC,cAAc;AACD,QAAA,YAAY,GAAG;IAC1B,UAAU;IACV,aAAa;IACb,iBAAiB;IACjB,WAAW;IACX,SAAS;IACT,SAAS;CACD,CAAC;AAiDX,cAAc;AACd,MAAsB,cAGpB,SAAQ,+BAA+B;IA2BvC,gBAAgB;IAChB,YACE,QAAkB,EAClB,SAA2B,EAC3B,UAAiC,EAAE;QAEnC,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,SAAS,CAAC,GAAG,QAAQ,CAAC;QAC3B,IAAI,CAAC,UAAU,CAAC,GAAG,SAAS,CAAC;QAC7B,IAAI,CAAC,UAAU,CAAC,GAAG,EAAE,CAAC,CAAC,6DAA6D;QACpF,IAAI,CAAC,YAAY,CAAC,GAAG,KAAK,CAAC;QAC3B,IAAI,CAAC,OAAO,CAAC,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,OAAO,CAAC,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,QAAQ,CAAC,GAAG;YACf,cAAc,EACZ,OAAO,CAAC,cAAc,IAAI,OAAO,CAAC,cAAc,YAAY,gCAAc;gBACxE,CAAC,CAAC,OAAO,CAAC,cAAc;gBACxB,CAAC,CAAC,gCAAc,CAAC,OAAO;YAC5B,GAAG,gCAAyB,CAAC,OAAO,CAAC;SACtC,CAAC;QAEF,MAAM,WAAW,GAAG,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QACrD,IAAI,WAAW,EAAE;YACf,IAAI,CAAC,QAAQ,CAAC,CAAC,WAAW,GAAG,WAAW,CAAC;SAC1C;QAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YACzC,IAAI,CAAC,QAAQ,CAAC,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SAC9C;QAED,IAAI,OAAO,CAAC,OAAO,IAAI,IAAI,EAAE;YAC3B,IAAI,CAAC,QAAQ,CAAC,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;SAC1C;QAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YACzC,IAAI,CAAC,QAAQ,CAAC,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SAC9C;QAED,IAAI,OAAO,CAAC,OAAO,YAAY,wBAAa,EAAE;YAC5C,IAAI,CAAC,QAAQ,CAAC,GAAG,OAAO,CAAC,OAAO,CAAC;SAClC;IACH,CAAC;IAED,IAAI,EAAE;QACJ,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC;IACnB,CAAC;IAED,gBAAgB;IAChB,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,SAAS,CAAC,CAAC;IACzB,CAAC;IAED,gBAAgB;IAChB,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,OAAO,CAAC,CAAC;IACvB,CAAC;IAED,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,UAAU,CAAC,CAAC;IAC1B,CAAC;IAED,IAAI,cAAc;QAChB,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC,cAAc,CAAC;IACvC,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC,WAAW,CAAC;IACpC,CAAC;IAED,gBAAgB;IAChB,IAAI,OAAO;QACT,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC;IACxB,CAAC;IAED,IAAI,OAAO,CAAC,aAAwC;QAClD,IAAI,CAAC,QAAQ,CAAC,GAAG,aAAa,CAAC;IACjC,CAAC;IAED,gBAAgB;IAChB,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC;IACxB,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,OAAO,CAAC,CAAC;IACvB,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,OAAO,
CAAC,CAAC;IACvB,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,SAAS,CAAC,CAAC,YAAY,CAAC;IACtC,CAAC;IAED,gDAAgD;IAChD,aAAa;QACX,OAAO,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC;IACjC,CAAC;IAED,yCAAyC;IACzC,qBAAqB,CAAC,MAAe;QACnC,OAAO,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC,EAAE,MAAM,aAAN,MAAM,cAAN,MAAM,GAAI,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC;IACvE,CAAC;IAED,CAAC,MAAM,CAAC,aAAa,CAAC;QACpB,OAAO;YACL,IAAI,EAAE,GAAG,EAAE,CACT,IAAI,CAAC,IAAI,EAAE,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE,CACvB,KAAK,IAAI,IAAI,CAAC,CAAC,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE,KAAK,EAAE,SAAS,EAAE,IAAI,EAAE,IAAI,EAAE,CAC1E;SACJ,CAAC;IACJ,CAAC;IAED,MAAM,CAAC,OAA6B;QAClC,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,SAAS,EAAE;YACtB,MAAM,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;YACpC,MAAM,QAAQ,GAAG,gBAAgB,CAAC,IAAI,CAAC,CAAC;YAExC,OAAO,QAAQ,CAAC,IAAI,CAClB,IAAI,kBAAS,CAAC;gBACZ,UAAU,EAAE,IAAI;gBAChB,aAAa,EAAE,CAAC;gBAChB,SAAS,CAAC,KAAK,EAAE,CAAC,EAAE,QAAQ;oBAC1B,IAAI;wBACF,MAAM,WAAW,GAAG,SAAS,CAAC,KAAK,CAAC,CAAC;wBACrC,QAAQ,CAAC,SAAS,EAAE,WAAW,CAAC,CAAC;qBAClC;oBAAC,OAAO,GAAG,EAAE;wBACZ,QAAQ,CAAC,GAAG,CAAC,CAAC;qBACf;gBACH,CAAC;aACF,CAAC,CACH,CAAC;SACH;QAED,OAAO,gBAAgB,CAAC,IAAI,CAAC,CAAC;IAChC,CAAC;IAID,OAAO,CAAC,QAA4B;QAClC,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE;YACnC,IAAI,IAAI,CAAC,GAAG,CAAC,KAAK,WAAI,CAAC,IAAI,EAAE;gBAC3B,OAAO,IAAI,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;aAC/B;YAED,IAAI,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;gBAC3B,OAAO,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;aAC9B;YAED,IAAI,CAAU,IAAI,EAAE,IAAI,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;gBACrC,IAAI,GAAG;oBAAE,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC;gBAE1B,IAAI,GAAG,EAAE;oBACP,IAAI,CAAC,UAAU,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;oBAC9B,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;oBACtB,OAAO;iBACR;gBAED,IAAI,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;YACzB,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;IAMD,IAAI,CAAC,QAAmC;QACtC,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE;YACnC,IAAI,IAAI,CAAC,GAAG,CAAC,KAAK,WAAI,CAAC,IAAI,EAAE;gBAC3B,OAAO,IAAI,CAAC,IAAI,iCAAyB,EAAE,CAAC,CAAC;aAC9C;YAED,IAAI,CAAC,IAAI,EAAE,IAAI,EAAE,IAAI,CAAC,CAAC;QACzB,CAAC,CAAC,CAAC;IACL,CAAC;IAOD,OAAO,CAAC,QAAmC;QACzC,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE;YACnC,IAAI,IAAI,CAAC,GAAG,CAAC,KAAK,WAAI,CAAC,IAAI,EAAE;gBAC3B,OAAO,IAAI,CAAC,IAAI,iCAAyB,EAAE,CAAC,CAAC;aAC9C;YAED,IAAI,CAAC,IAAI,EAAE,KAAK,EAAE,IAAI,CAAC,CAAC;QAC1B,CAAC,CAAC,CAAC;IACL,CAAC;IAUD,OAAO,CACL,QAA0C,EAC1C,QAAyB;QAEzB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,MAAM,IAAI,iCAAyB,CAAC,wCAAwC,CAAC,CAAC;SAC/E;QACD,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE;YACnC,MAAM,SAAS,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC;YACnC,MAAM,SAAS,GAAG,GAAG,EAAE;gBACrB,IAAI,CAAU,IAAI,EAAE,IAAI,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;oBACrC,IAAI,GAAG,IAAI,GAAG,IAAI,IAAI;wBAAE,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC;oBACzC,IAAI,MAAM,CAAC;oBACX,uEAAuE;oBACvE,IAAI;wBACF,MAAM,GAAG,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC,4CAA4C;qBACrE;oBAAC,OAAO,KAAK,EAAE;wBACd,OAAO,IAAI,CAAC,KAAK,CAAC,CAAC;qBACpB;oBAED,IAAI,MAAM,KAAK,KAAK;wBAAE,OAAO,IAAI,EAAE,CAAC;oBAEpC,+EAA+E;oBAC/E,MAAM,YAAY,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC,EAAE,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC;oBACzE,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,YAAY,CAAC,MAAM,EAAE,EAAE,CAAC,EAAE;wBAC5C,IAAI;4BACF,MAAM,GAAG,QAAQ,CACf,CAAC,SAAS,CAAC,CAAC,CAAC,SAAS,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,YAAY,CAAC,CAAC,CAAC,CAAY,CAAC,4CAA4C;6BACnH,CAAC;yBACH;wBAAC,OAAO,KAAK,EAAE;4BACd,OAAO,IAAI,CAAC,KAAK,CAAC,CAAC;yBACpB;wBACD,IAAI,MAAM,KAAK,KAAK;4BAAE,OAAO,IAAI,EAAE,CAAC;qBACrC;oBAED,SAAS,EAAE,CAAC;gBACd,CAAC,CAAC,CAAC;YACL,CAAC,CAAC;YAEF,SAAS,EAAE,CAAC;QACd,CAAC,CAAC,CAAC;IACL,CAAC;IAYD,KAAK,CAAC,OAAuC,EAAE,QAAmB;QAChE,IA
AI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,MAAM,iBAAiB,GAAG,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QACzC,IAAI,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC;QAErB,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE,CAAC,aAAa,CAAC,IAAI,EAAE,EAAE,iBAAiB,EAAE,EAAE,IAAI,CAAC,CAAC,CAAC;IAC1F,CAAC;IAYD,OAAO,CAAC,QAA8B;QACpC,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE;YACnC,MAAM,IAAI,GAAc,EAAE,CAAC;YAC3B,MAAM,SAAS,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC;YACnC,MAAM,SAAS,GAAG,GAAG,EAAE;gBACrB,2DAA2D;gBAC3D,IAAI,CAAU,IAAI,EAAE,IAAI,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;oBACrC,IAAI,GAAG;wBAAE,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC;oBAC1B,IAAI,GAAG,IAAI,IAAI;wBAAE,OAAO,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;oBAE9C,uEAAuE;oBACvE,IAAI,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;oBAEf,+EAA+E;oBAC/E,MAAM,YAAY,GAAG,CACnB,SAAS;wBACP,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC,EAAE,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC,GAAG,CAAC,SAAS,CAAC;wBACpE,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAAC,EAAE,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC,CAC3C,CAAC,CAAC,4CAA4C;oBAE5D,IAAI,YAAY,EAAE;wBAChB,IAAI,CAAC,IAAI,CAAC,GAAG,YAAY,CAAC,CAAC;qBAC5B;oBAED,SAAS,EAAE,CAAC;gBACd,CAAC,CAAC,CAAC;YACL,CAAC,CAAC;YAEF,SAAS,EAAE,CAAC;QACd,CAAC,CAAC,CAAC;IACL,CAAC;IAED;;;;;OAKG;IACH,aAAa,CAAC,IAAgB,EAAE,KAAc;QAC5C,mBAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,oBAAY,CAAC,QAAQ,CAAC,IAAI,CAAC,EAAE;YAChC,MAAM,IAAI,iCAAyB,CAAC,QAAQ,IAAI,kBAAkB,oBAAY,EAAE,CAAC,CAAC;SACnF;QAED,IAAI,OAAO,KAAK,KAAK,SAAS,EAAE;YAC9B,MAAM,IAAI,iCAAyB,CAAC,QAAQ,IAAI,0BAA0B,CAAC,CAAC;SAC7E;QAED,IAAI,CAAC,QAAQ,CAAC,CAAC,IAAI,CAAC,GAAG,KAAK,CAAC;QAC7B,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;;;;;;;;;;;;;;OAkBG;IACH,GAAG,CAAU,SAA8B;QACzC,mBAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,MAAM,YAAY,GAAG,IAAI,CAAC,UAAU,CAA8B,CAAC,CAAC,4CAA4C;QAChH,IAAI,YAAY,EAAE;YAChB,IAAI,CAAC,UAAU,CAAC,GAAG,GAAG,CAAC,EAAE;gBACvB,OAAO,SAAS,CAAC,YAAY,CAAC,GAAG,CAAC,CAAC,CAAC;YACtC,CAAC,CAAC;SACH;aAAM;YACL,IAAI,CAAC,UAAU,CAAC,GAAG,SAAS,CAAC;SAC9B;QAED,OAAO,IAAoC,CAAC;IAC9C,CAAC;IAED;;;;OAIG;IACH,kBAAkB,CAAC,cAAkC;QACnD,mBAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,cAAc,YAAY,gCAAc,EAAE;YAC5C,IAAI,CAAC,QAAQ,CAAC,CAAC,cAAc,GAAG,cAAc,CAAC;SAChD;aAAM,IAAI,OAAO,cAAc,KAAK,QAAQ,EAAE;YAC7C,IAAI,CAAC,QAAQ,CAAC,CAAC,cAAc,GAAG,gCAAc,CAAC,UAAU,CAAC,cAAc,CAAC,CAAC;SAC3E;aAAM;YACL,MAAM,IAAI,iCAAyB,CAAC,4BAA4B,cAAc,EAAE,CAAC,CAAC;SACnF;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,eAAe,CAAC,WAA4B;QAC1C,mBAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,MAAM,mBAAmB,GAAG,0BAAW,CAAC,WAAW,CAAC,EAAE,WAAW,EAAE,CAAC,CAAC;QACrE,IAAI,mBAAmB,EAAE;YACvB,IAAI,CAAC,QAAQ,CAAC,CAAC,WAAW,GAAG,mBAAmB,CAAC;SAClD;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,SAAS,CAAC,KAAa;QACrB,mBAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,yCAAyC,CAAC,CAAC;SAChF;QAED,IAAI,CAAC,QAAQ,CAAC,CAAC,SAAS,GAAG,KAAK,CAAC;QACjC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,SAAS,CAAC,KAAa;QACrB,mBAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,IAAI,CAAC,QAAQ,CAAC,CAAC,QAAQ,EAAE;YAC3B,MAAM,IAAI,gCAAwB,CAAC,4CAA4C,CAAC,CAAC;SAClF;QAED,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;SAClF;QAED,IAAI,CAAC,QAAQ,CAAC,CAAC,SAAS,GAAG,KAAK,CAAC;QACjC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,MAAM;QACJ,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,EAAE;YACvB,OAAO;SACR;QAED,IAAI,CAAC,GAAG,CAAC,GAAG,SAAS,CAAC;QACtB,IAAI,CAAC,UAAU,CAAC,GAAG,EAAE,CAAC;QACtB,IAAI,CAAC,OAAO,CAAC,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,OAAO,CAAC,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,YAAY,CAAC,GAAG,KAAK,CAAC;QAE3B,MAAM,OAAO,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC;QAC/B,IAAI,OAAO,EAAE;YA
CX,6EAA6E;YAC7E,IAAI,OAAO,CAAC,QAAQ,KAAK,KAAK,IAAI,CAAC,OAAO,CAAC,QAAQ,EAAE;gBACnD,OAAO,CAAC,UAAU,EAAE,CAAC;aACtB;YAED,IAAI,CAAC,QAAQ,CAAC,GAAG,SAAS,CAAC;SAC5B;IACH,CAAC;IAaD,gBAAgB;IAChB,QAAQ,CAAC,SAAiB,EAAE,QAA4B;QACtD,MAAM,QAAQ,GAAG,IAAI,CAAC,GAAG,CAAC,CAAC;QAC3B,MAAM,QAAQ,GAAG,IAAI,CAAC,UAAU,CAAC,CAAC;QAClC,MAAM,MAAM,GAAG,IAAI,CAAC,OAAO,CAAC,CAAC;QAE7B,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,QAAQ,CAAC,IAAI,yBAAiB,CAAC,qCAAqC,CAAC,CAAC,CAAC;YACvE,OAAO;SACR;QAED,IAAI,MAAM,IAAI,IAAI,EAAE;YAClB,QAAQ,CAAC,IAAI,yBAAiB,CAAC,kDAAkD,CAAC,CAAC,CAAC;YACpF,OAAO;SACR;QAED,MAAM,CAAC,OAAO,CACZ,QAAQ,EACR,QAAQ,EACR;YACE,GAAG,IAAI,CAAC,QAAQ,CAAC;YACjB,OAAO,EAAE,IAAI,CAAC,QAAQ,CAAC;YACvB,SAAS;SACV,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;;AAzgBH,wCA0gBC;AA/eC,aAAa;AACG,oBAAK,GAAG,OAAgB,CAAC;AAgf3C,SAAS,YAAY,CAAI,MAAsB;IAC7C,IAAI,MAAM,CAAC,UAAU,CAAC,IAAI,IAAI,IAAI,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;QAC5D,OAAO,IAAI,CAAC;KACb;IAED,MAAM,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC,KAAK,EAAE,CAAC;IACvC,IAAI,GAAG,EAAE;QACP,MAAM,SAAS,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC;QACrC,IAAI,SAAS,EAAE;YACb,OAAO,SAAS,CAAC,GAAG,CAAM,CAAC;SAC5B;QAED,OAAO,GAAG,CAAC;KACZ;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AAED,SAAS,IAAI,CAAI,MAAsB,EAAE,QAAiB,EAAE,QAA4B;IACtF,MAAM,QAAQ,GAAG,MAAM,CAAC,GAAG,CAAC,CAAC;IAC7B,IAAI,MAAM,CAAC,MAAM,EAAE;QACjB,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;KAClC;IAED,IAAI,MAAM,CAAC,UAAU,CAAC,IAAI,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,EAAE;QACnD,QAAQ,CAAC,SAAS,EAAE,YAAY,CAAC,MAAM,CAAC,CAAC,CAAC;QAC1C,OAAO;KACR;IAED,IAAI,QAAQ,IAAI,IAAI,EAAE;QACpB,oGAAoG;QACpG,IAAI,MAAM,CAAC,QAAQ,CAAC,IAAI,IAAI,IAAI,MAAM,CAAC,SAAS,CAAC,CAAC,iBAAiB,EAAE,EAAE;YACrE,MAAM,CAAC,QAAQ,CAAC,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC,YAAY,CAAC,EAAE,KAAK,EAAE,MAAM,EAAE,QAAQ,EAAE,KAAK,EAAE,CAAC,CAAC;SACvF;QAED,MAAM,CAAC,WAAW,CAAC,MAAM,CAAC,QAAQ,CAAC,EAAE,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;YAClD,IAAI,KAAK,EAAE;gBACT,MAAM,QAAQ,GAAG,KAAK,CAAC,QAAQ,CAAC;gBAChC,MAAM,CAAC,OAAO,CAAC,GAAG,KAAK,CAAC,MAAM,CAAC;gBAC/B,MAAM,CAAC,QAAQ,CAAC,GAAG,KAAK,CAAC,OAAO,CAAC;gBAEjC,IAAI,QAAQ,CAAC,MAAM,EAAE;oBACnB,MAAM,CAAC,GAAG,CAAC;wBACT,OAAO,QAAQ,CAAC,MAAM,CAAC,EAAE,KAAK,QAAQ;4BACpC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;4BACrC,CAAC,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;oBAEzB,IAAI,QAAQ,CAAC,MAAM,CAAC,EAAE,EAAE;wBACtB,MAAM,CAAC,UAAU,CAAC,GAAG,UAAE,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC,CAAC;qBAC7C;oBAED,MAAM,CAAC,UAAU,CAAC,GAAG,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC;iBACjD;qBAAM;oBACL,8EAA8E;oBAC9E,MAAM,CAAC,GAAG,CAAC;wBACT,OAAO,QAAQ,CAAC,QAAQ,KAAK,QAAQ;4BACnC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,QAAQ,CAAC,QAAQ,CAAC;4BACpC,CAAC,CAAC,QAAQ,CAAC,QAAQ,CAAC;oBACxB,MAAM,CAAC,UAAU,CAAC,GAAG,QAAQ,CAAC,SAAS,CAAC;iBACzC;gBAED,+EAA+E;gBAC/E,kFAAkF;gBAClF,cAAc;gBACd,IAAI,MAAM,CAAC,GAAG,CAAC,IAAI,IAAI,EAAE;oBACvB,MAAM,CAAC,GAAG,CAAC,GAAG,WAAI,CAAC,IAAI,CAAC;oBACxB,uEAAuE;oBACvE,MAAM,CAAC,UAAU,CAAC,GAAG,CAAC,KAAK,CAAC,QAA0B,CAAC,CAAC;iBACzD;aACF;YAED,yEAAyE;YACzE,MAAM,CAAC,YAAY,CAAC,GAAG,IAAI,CAAC;YAE5B,IAAI,GAAG,IAAI,YAAY,CAAC,MAAM,CAAC,EAAE;gBAC/B,OAAO,aAAa,CAAC,MAAM,EAAE,EAAE,KAAK,EAAE,GAAG,EAAE,EAAE,GAAG,EAAE,CAAC,QAAQ,CAAC,GAAG,EAAE,YAAY,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;aACzF;YAED,IAAI,CAAC,MAAM,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAC;QACnC,CAAC,CAAC,CAAC;QAEH,OAAO;KACR;IAED,IAAI,YAAY,CAAC,MAAM,CAAC,EAAE;QACxB,OAAO,aAAa,CAAC,MAAM,EAAE,SAAS,EAAE,GAAG,EAAE,CAAC,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC,CAAC;KAC1E;IAED,iCAAiC;IACjC,MAAM,SAAS,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC,SAAS,IAAI,IAAI,CAAC;IACrD,MAAM,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;QAC3C,IAAI,QAAQ,EAAE;YACZ,MAAM,QAAQ,GACZ,OAAO,QAAQ,CAAC,MAAM,CAAC,EAAE,KAAK,QAAQ;gBACpC,CAAC,CAAC,WAAI,C
AAC,UAAU,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;gBACrC,CAAC,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,CAAC;YAEzB,MAAM,CAAC,UAAU,CAAC,GAAG,QAAQ,CAAC,MAAM,CAAC,SAAS,CAAC;YAC/C,MAAM,CAAC,GAAG,CAAC,GAAG,QAAQ,CAAC;SACxB;QAED,IAAI,GAAG,IAAI,YAAY,CAAC,MAAM,CAAC,EAAE;YAC/B,OAAO,aAAa,CAAC,MAAM,EAAE,EAAE,KAAK,EAAE,GAAG,EAAE,EAAE,GAAG,EAAE,CAAC,QAAQ,CAAC,GAAG,EAAE,YAAY,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;SACzF;QAED,IAAI,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,QAAQ,KAAK,KAAK,EAAE;YACzD,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;SAClC;QAED,IAAI,CAAC,MAAM,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAC;IACnC,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,YAAY,CAAC,MAAsB;IAC1C,MAAM,QAAQ,GAAG,MAAM,CAAC,GAAG,CAAC,CAAC;IAC7B,OAAO,CAAC,CAAC,QAAQ,IAAI,QAAQ,CAAC,MAAM,EAAE,CAAC;AACzC,CAAC;AAED,SAAS,aAAa,CACpB,MAAsB,EACtB,OAAkF,EAClF,QAAkB;;IAElB,MAAM,QAAQ,GAAG,MAAM,CAAC,GAAG,CAAC,CAAC;IAC7B,MAAM,QAAQ,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC;IACpC,MAAM,MAAM,GAAG,MAAM,CAAC,OAAO,CAAC,CAAC;IAC/B,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;IACjC,MAAM,KAAK,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,KAAK,CAAC;IAC7B,MAAM,iBAAiB,GAAG,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,iBAAiB,mCAAI,MAAM,CAAC,UAAU,CAAC,CAAC,MAAM,KAAK,CAAC,CAAC;IAExF,IAAI,KAAK,EAAE;QACT,IAAI,MAAM,CAAC,YAAY,IAAI,KAAK,YAAY,yBAAiB,EAAE;YAC7D,OAAO,eAAe,EAAE,CAAC;SAC1B;KACF;IAED,IAAI,QAAQ,IAAI,IAAI,IAAI,MAAM,IAAI,IAAI,IAAI,QAAQ,CAAC,MAAM,EAAE,IAAI,QAAQ,IAAI,IAAI,EAAE;QAC/E,IAAI,iBAAiB,EAAE;YACrB,MAAM,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC;YACvB,MAAM,CAAC,GAAG,CAAC,GAAG,WAAI,CAAC,IAAI,CAAC;YACxB,MAAM,CAAC,IAAI,CAAC,cAAc,CAAC,KAAK,CAAC,CAAC;SACnC;QAED,IAAI,OAAO,EAAE;YACX,IAAI,OAAO,CAAC,KAAK,KAAK,MAAM,EAAE;gBAC5B,OAAO,OAAO,CAAC,UAAU,CAAC,EAAE,KAAK,EAAE,EAAE,QAAQ,CAAC,CAAC;aAChD;YAED,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE;gBAC5B,qCAA0B,CAAC,OAAO,EAAE,EAAE,KAAK,EAAE,CAAC,CAAC;aAChD;SACF;QAED,OAAO,QAAQ,EAAE,CAAC;KACnB;IAED,SAAS,eAAe;QACtB,IAAI,OAAO,EAAE;YACX,IAAI,OAAO,CAAC,KAAK,KAAK,MAAM,EAAE;gBAC5B,OAAO,OAAO,CAAC,UAAU,CAAC,EAAE,KAAK,EAAE,EAAE,GAAG,EAAE;oBACxC,MAAM,CAAC,IAAI,CAAC,cAAc,CAAC,KAAK,CAAC,CAAC;oBAClC,QAAQ,EAAE,CAAC;gBACb,CAAC,CAAC,CAAC;aACJ;YAED,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE;gBAC5B,qCAA0B,CAAC,OAAO,EAAE,EAAE,KAAK,EAAE,CAAC,CAAC;aAChD;SACF;QAED,MAAM,CAAC,IAAI,CAAC,cAAc,CAAC,KAAK,CAAC,CAAC;QAClC,OAAO,QAAQ,EAAE,CAAC;IACpB,CAAC;IAED,MAAM,CAAC,OAAO,CAAC,GAAG,IAAI,CAAC;IACvB,MAAM,CAAC,WAAW,CAChB,QAAQ,EACR,CAAC,QAAQ,CAAC,EACV,EAAE,GAAG,gCAAyB,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,EAAE,OAAO,EAAE,EAC3D,GAAG,EAAE,CAAC,eAAe,EAAE,CACxB,CAAC;AACJ,CAAC;AAED,gBAAgB;AAChB,SAAgB,mBAAmB,CAAC,MAAsB;IACxD,IAAI,MAAM,CAAC,YAAY,CAAC,EAAE;QACxB,MAAM,IAAI,6BAAqB,EAAE,CAAC;KACnC;AACH,CAAC;AAJD,kDAIC;AAED,SAAS,gBAAgB,CAA2B,MAA+B;IACjF,MAAM,QAAQ,GAAG,IAAI,iBAAQ,CAAC;QAC5B,UAAU,EAAE,IAAI;QAChB,WAAW,EAAE,KAAK;QAClB,aAAa,EAAE,CAAC;KACjB,CAAC,CAAC;IAEH,IAAI,WAAW,GAAG,KAAK,CAAC;IACxB,IAAI,OAAO,GAAG,KAAK,CAAC;IACpB,IAAI,WAAW,GAAG,IAAI,CAAC,CAAC,oGAAoG;IAE5H,QAAQ,CAAC,KAAK,GAAG;QACf,IAAI,WAAW,KAAK,KAAK,EAAE;YACzB,WAAW,GAAG,KAAK,CAAC;YACpB,WAAW,GAAG,IAAI,CAAC;SACpB;QAED,IAAI,CAAC,OAAO,EAAE;YACZ,OAAO,GAAG,IAAI,CAAC;YACf,QAAQ,EAAE,CAAC;SACZ;IACH,CAAC,CAAC;IAEF,QAAQ,CAAC,QAAQ,GAAG,UAAU,KAAK,EAAE,EAAE;QACrC,IAAI,WAAW,EAAE;YACf,MAAM,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE,EAAE,GAAG,IAAI,KAAK,CAAC,CAAC,CAAC;SACzD;aAAM;YACL,EAAE,CAAC,KAAK,CAAC,CAAC;SACX;IACH,CAAC,CAAC;IAEF,SAAS,QAAQ;QACf,WAAW,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,MAAM,EAAE,IAAI,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACjC,WAAW,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM,IAAI,IAAI,CAAC;YAEpD,IAAI,GAAG,EAAE;gBACP,oFAAoF;gBACpF,qFAAqF;gBACrF,oFAAoF;gBACpF,mEAAmE;gBACnE,IAAI,GAAG,CAAC,OAAO,CA
AC,KAAK,CAAC,kBAAkB,CAAC,EAAE;oBACzC,MAAM,CAAC,KAAK,EAAE,CAAC;oBACf,OAAO,QAAQ,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;iBAC5B;gBAED,wFAAwF;gBACxF,sFAAsF;gBACtF,uFAAuF;gBACvF,kFAAkF;gBAClF,6EAA6E;gBAC7E,IAAI,GAAG,CAAC,OAAO,CAAC,KAAK,CAAC,aAAa,CAAC,EAAE;oBACpC,OAAO,QAAQ,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;iBAC5B;gBAED,OAAO,QAAQ,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;aAC9B;YAED,IAAI,MAAM,IAAI,IAAI,EAAE;gBAClB,QAAQ,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;aACrB;iBAAM,IAAI,QAAQ,CAAC,SAAS,EAAE;gBAC7B,MAAM,CAAC,KAAK,EAAE,CAAC;aAChB;iBAAM;gBACL,IAAI,QAAQ,CAAC,IAAI,CAAC,MAAM,CAAC,EAAE;oBACzB,OAAO,QAAQ,EAAE,CAAC;iBACnB;gBAED,OAAO,GAAG,KAAK,CAAC;aACjB;QACH,CAAC,CAAC,CAAC;IACL,CAAC;IAED,OAAO,QAAQ,CAAC;AAClB,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cursor/aggregation_cursor.js b/node_modules/mongodb/lib/cursor/aggregation_cursor.js new file mode 100644 index 000000000..21ee2c0ad --- /dev/null +++ b/node_modules/mongodb/lib/cursor/aggregation_cursor.js @@ -0,0 +1,172 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.AggregationCursor = void 0; +const aggregate_1 = require("../operations/aggregate"); +const abstract_cursor_1 = require("./abstract_cursor"); +const execute_operation_1 = require("../operations/execute_operation"); +const utils_1 = require("../utils"); +/** @internal */ +const kPipeline = Symbol('pipeline'); +/** @internal */ +const kOptions = Symbol('options'); +/** + * The **AggregationCursor** class is an internal class that embodies an aggregation cursor on MongoDB + * allowing for iteration over the results returned from the underlying query. It supports + * one by one document iteration, conversion to an array or can be iterated as a Node 4.X + * or higher stream + * @public + */ +class AggregationCursor extends abstract_cursor_1.AbstractCursor { + /** @internal */ + constructor(topology, namespace, pipeline = [], options = {}) { + super(topology, namespace, options); + this[kPipeline] = pipeline; + this[kOptions] = options; + } + get pipeline() { + return this[kPipeline]; + } + clone() { + const clonedOptions = utils_1.mergeOptions({}, this[kOptions]); + delete clonedOptions.session; + return new AggregationCursor(this.topology, this.namespace, this[kPipeline], { + ...clonedOptions + }); + } + map(transform) { + return super.map(transform); + } + /** @internal */ + _initialize(session, callback) { + const aggregateOperation = new aggregate_1.AggregateOperation(this.namespace, this[kPipeline], { + ...this[kOptions], + ...this.cursorOptions, + session + }); + execute_operation_1.executeOperation(this.topology, aggregateOperation, (err, response) => { + if (err || response == null) + return callback(err); + // TODO: NODE-2882 + callback(undefined, { server: aggregateOperation.server, session, response }); + }); + } + explain(verbosity, callback) { + if (typeof verbosity === 'function') + (callback = verbosity), (verbosity = true); + if (verbosity == null) + verbosity = true; + return execute_operation_1.executeOperation(this.topology, new aggregate_1.AggregateOperation(this.namespace, this[kPipeline], { + ...this[kOptions], + ...this.cursorOptions, + explain: verbosity + }), callback); + } + group($group) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $group }); + return this; + } + /** Add a limit stage to the aggregation pipeline */ + limit($limit) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $limit }); + return this; + } + /** Add a match stage to the aggregation pipeline */ + match($match) { + 
abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $match }); + return this; + } + /** Add an out stage to the aggregation pipeline */ + out($out) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $out }); + return this; + } + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: AggregationCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: AggregationCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: AggregationCursor<{ a: number; b: string }> = coll.aggregate([]); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.aggregate().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project($project) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $project }); + return this; + } + /** Add a lookup stage to the aggregation pipeline */ + lookup($lookup) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $lookup }); + return this; + } + /** Add a redact stage to the aggregation pipeline */ + redact($redact) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $redact }); + return this; + } + /** Add a skip stage to the aggregation pipeline */ + skip($skip) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $skip }); + return this; + } + /** Add a sort stage to the aggregation pipeline */ + sort($sort) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $sort }); + return this; + } + /** Add a unwind stage to the aggregation pipeline */ + unwind($unwind) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $unwind }); + return this; + } + // deprecated methods + /** @deprecated Add a geoNear stage to the aggregation pipeline */ + geoNear($geoNear) { + abstract_cursor_1.assertUninitialized(this); + this[kPipeline].push({ $geoNear }); + return this; + } +} +exports.AggregationCursor = AggregationCursor; +//# sourceMappingURL=aggregation_cursor.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cursor/aggregation_cursor.js.map b/node_modules/mongodb/lib/cursor/aggregation_cursor.js.map new file mode 100644 index 000000000..d21e89f22 --- /dev/null +++ 
b/node_modules/mongodb/lib/cursor/aggregation_cursor.js.map @@ -0,0 +1 @@ +{"version":3,"file":"aggregation_cursor.js","sourceRoot":"","sources":["../../src/cursor/aggregation_cursor.ts"],"names":[],"mappings":";;;AAAA,uDAA+E;AAC/E,uDAAwE;AACxE,uEAAoF;AACpF,oCAAwC;AAYxC,gBAAgB;AAChB,MAAM,SAAS,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC;AACrC,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AAEnC;;;;;;GAMG;AACH,MAAa,iBAAsC,SAAQ,gCAAuB;IAMhF,gBAAgB;IAChB,YACE,QAAkB,EAClB,SAA2B,EAC3B,WAAuB,EAAE,EACzB,UAA4B,EAAE;QAE9B,KAAK,CAAC,QAAQ,EAAE,SAAS,EAAE,OAAO,CAAC,CAAC;QAEpC,IAAI,CAAC,SAAS,CAAC,GAAG,QAAQ,CAAC;QAC3B,IAAI,CAAC,QAAQ,CAAC,GAAG,OAAO,CAAC;IAC3B,CAAC;IAED,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,SAAS,CAAC,CAAC;IACzB,CAAC;IAED,KAAK;QACH,MAAM,aAAa,GAAG,oBAAY,CAAC,EAAE,EAAE,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC;QACvD,OAAO,aAAa,CAAC,OAAO,CAAC;QAC7B,OAAO,IAAI,iBAAiB,CAAC,IAAI,CAAC,QAAQ,EAAE,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,SAAS,CAAC,EAAE;YAC3E,GAAG,aAAa;SACjB,CAAC,CAAC;IACL,CAAC;IAED,GAAG,CAAI,SAA8B;QACnC,OAAO,KAAK,CAAC,GAAG,CAAC,SAAS,CAAyB,CAAC;IACtD,CAAC;IAED,gBAAgB;IAChB,WAAW,CAAC,OAAkC,EAAE,QAAmC;QACjF,MAAM,kBAAkB,GAAG,IAAI,8BAAkB,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,SAAS,CAAC,EAAE;YACjF,GAAG,IAAI,CAAC,QAAQ,CAAC;YACjB,GAAG,IAAI,CAAC,aAAa;YACrB,OAAO;SACR,CAAC,CAAC;QAEH,oCAAgB,CAAC,IAAI,CAAC,QAAQ,EAAE,kBAAkB,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YACpE,IAAI,GAAG,IAAI,QAAQ,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAElD,kBAAkB;YAClB,QAAQ,CAAC,SAAS,EAAE,EAAE,MAAM,EAAE,kBAAkB,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,EAAE,CAAC,CAAC;QAChF,CAAC,CAAC,CAAC;IACL,CAAC;IAMD,OAAO,CACL,SAA2C,EAC3C,QAA6B;QAE7B,IAAI,OAAO,SAAS,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,EAAE,CAAC,SAAS,GAAG,IAAI,CAAC,CAAC;QAChF,IAAI,SAAS,IAAI,IAAI;YAAE,SAAS,GAAG,IAAI,CAAC;QAExC,OAAO,oCAAgB,CACrB,IAAI,CAAC,QAAQ,EACb,IAAI,8BAAkB,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,SAAS,CAAC,EAAE;YACtD,GAAG,IAAI,CAAC,QAAQ,CAAC;YACjB,GAAG,IAAI,CAAC,aAAa;YACrB,OAAO,EAAE,SAAS;SACnB,CAAC,EACF,QAAQ,CACT,CAAC;IACJ,CAAC;IAID,KAAK,CAAC,MAAgB;QACpB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,MAAM,EAAE,CAAC,CAAC;QACjC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,oDAAoD;IACpD,KAAK,CAAC,MAAc;QAClB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,MAAM,EAAE,CAAC,CAAC;QACjC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,oDAAoD;IACpD,KAAK,CAAC,MAAgB;QACpB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,MAAM,EAAE,CAAC,CAAC;QACjC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,mDAAmD;IACnD,GAAG,CAAC,IAA2C;QAC7C,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,IAAI,EAAE,CAAC,CAAC;QAC/B,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;OAwCG;IACH,OAAO,CAAgC,QAAkB;QACvD,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,QAAQ,EAAE,CAAC,CAAC;QACnC,OAAO,IAAuC,CAAC;IACjD,CAAC;IAED,qDAAqD;IACrD,MAAM,CAAC,OAAiB;QACtB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,OAAO,EAAE,CAAC,CAAC;QAClC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,qDAAqD;IACrD,MAAM,CAAC,OAAiB;QACtB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,OAAO,EAAE,CAAC,CAAC;QAClC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,mDAAmD;IACnD,IAAI,CAAC,KAAa;QAChB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,KAAK,EAAE,CAAC,CAAC;QAChC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,mDAAmD;IACnD,IAAI,CAAC,KAAW;QACd,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,KAAK,EAAE,CAAC,CAAC;QAChC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,qDAAqD;IACrD,MAAM,CAAC,OAA0B;QAC/B,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,OA
AO,EAAE,CAAC,CAAC;QAClC,OAAO,IAAI,CAAC;IACd,CAAC;IAED,qBAAqB;IACrB,kEAAkE;IAClE,OAAO,CAAC,QAAkB;QACxB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,SAAS,CAAC,CAAC,IAAI,CAAC,EAAE,QAAQ,EAAE,CAAC,CAAC;QACnC,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AA/LD,8CA+LC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/cursor/find_cursor.js b/node_modules/mongodb/lib/cursor/find_cursor.js new file mode 100644 index 000000000..d56493bb7 --- /dev/null +++ b/node_modules/mongodb/lib/cursor/find_cursor.js @@ -0,0 +1,376 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.FindCursor = exports.FLAGS = void 0; +const error_1 = require("../error"); +const count_1 = require("../operations/count"); +const execute_operation_1 = require("../operations/execute_operation"); +const find_1 = require("../operations/find"); +const utils_1 = require("../utils"); +const sort_1 = require("../sort"); +const abstract_cursor_1 = require("./abstract_cursor"); +/** @internal */ +const kFilter = Symbol('filter'); +/** @internal */ +const kNumReturned = Symbol('numReturned'); +/** @internal */ +const kBuiltOptions = Symbol('builtOptions'); +/** @public Flags allowed for cursor */ +exports.FLAGS = [ + 'tailable', + 'oplogReplay', + 'noCursorTimeout', + 'awaitData', + 'exhaust', + 'partial' +]; +/** @public */ +class FindCursor extends abstract_cursor_1.AbstractCursor { + /** @internal */ + constructor(topology, namespace, filter, options = {}) { + super(topology, namespace, options); + this[kFilter] = filter || {}; + this[kBuiltOptions] = options; + if (options.sort != null) { + this[kBuiltOptions].sort = sort_1.formatSort(options.sort); + } + } + clone() { + const clonedOptions = utils_1.mergeOptions({}, this[kBuiltOptions]); + delete clonedOptions.session; + return new FindCursor(this.topology, this.namespace, this[kFilter], { + ...clonedOptions + }); + } + map(transform) { + return super.map(transform); + } + /** @internal */ + _initialize(session, callback) { + const findOperation = new find_1.FindOperation(undefined, this.namespace, this[kFilter], { + ...this[kBuiltOptions], + ...this.cursorOptions, + session + }); + execute_operation_1.executeOperation(this.topology, findOperation, (err, response) => { + if (err || response == null) + return callback(err); + // TODO: We only need this for legacy queries that do not support `limit`, maybe + // the value should only be saved in those cases. + if (response.cursor) { + this[kNumReturned] = response.cursor.firstBatch.length; + } + else { + this[kNumReturned] = response.documents ? response.documents.length : 0; + } + // TODO: NODE-2882 + callback(undefined, { server: findOperation.server, session, response }); + }); + } + /** @internal */ + _getMore(batchSize, callback) { + // NOTE: this is to support client provided limits in pre-command servers + const numReturned = this[kNumReturned]; + if (numReturned) { + const limit = this[kBuiltOptions].limit; + batchSize = + limit && limit > 0 && numReturned + batchSize > limit ? 
limit - numReturned : batchSize; + if (batchSize <= 0) { + return this.close(callback); + } + } + super._getMore(batchSize, (err, response) => { + if (err) + return callback(err); + // TODO: wrap this in some logic to prevent it from happening if we don't need this support + if (response) { + this[kNumReturned] = this[kNumReturned] + response.cursor.nextBatch.length; + } + callback(undefined, response); + }); + } + count(options, callback) { + if (typeof options === 'boolean') { + throw new error_1.MongoInvalidArgumentError('Invalid first parameter to count'); + } + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + return execute_operation_1.executeOperation(this.topology, new count_1.CountOperation(this.namespace, this[kFilter], { + ...this[kBuiltOptions], + ...this.cursorOptions, + ...options + }), callback); + } + explain(verbosity, callback) { + if (typeof verbosity === 'function') + (callback = verbosity), (verbosity = true); + if (verbosity == null) + verbosity = true; + return execute_operation_1.executeOperation(this.topology, new find_1.FindOperation(undefined, this.namespace, this[kFilter], { + ...this[kBuiltOptions], + ...this.cursorOptions, + explain: verbosity + }), callback); + } + /** Set the cursor query */ + filter(filter) { + abstract_cursor_1.assertUninitialized(this); + this[kFilter] = filter; + return this; + } + /** + * Set the cursor hint + * + * @param hint - If specified, then the query system will only consider plans using the hinted index. + */ + hint(hint) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].hint = hint; + return this; + } + /** + * Set the cursor min + * + * @param min - Specify a $min value to specify the inclusive lower bound for a specific index in order to constrain the results of find(). The $min specifies the lower bound for all keys of a specific index in order. + */ + min(min) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].min = min; + return this; + } + /** + * Set the cursor max + * + * @param max - Specify a $max value to specify the exclusive upper bound for a specific index in order to constrain the results of find(). The $max specifies the upper bound for all keys of a specific index in order. + */ + max(max) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].max = max; + return this; + } + /** + * Set the cursor returnKey. + * If set to true, modifies the cursor to only return the index field or fields for the results of the query, rather than documents. + * If set to true and the query does not use an index to perform the read operation, the returned documents will not contain any fields. + * + * @param value - the returnKey value. + */ + returnKey(value) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].returnKey = value; + return this; + } + /** + * Modifies the output of a query by adding a field $recordId to matching documents. $recordId is the internal key which uniquely identifies a document in a collection. + * + * @param value - The $showDiskLoc option has now been deprecated and replaced with the showRecordId field. $showDiskLoc will still be accepted for OP_QUERY stye find. 
+ */ + showRecordId(value) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].showRecordId = value; + return this; + } + /** + * Add a query modifier to the cursor query + * + * @param name - The query modifier (must start with $, such as $orderby etc) + * @param value - The modifier value. + */ + addQueryModifier(name, value) { + abstract_cursor_1.assertUninitialized(this); + if (name[0] !== '$') { + throw new error_1.MongoInvalidArgumentError(`${name} is not a valid query modifier`); + } + // Strip of the $ + const field = name.substr(1); + // NOTE: consider some TS magic for this + switch (field) { + case 'comment': + this[kBuiltOptions].comment = value; + break; + case 'explain': + this[kBuiltOptions].explain = value; + break; + case 'hint': + this[kBuiltOptions].hint = value; + break; + case 'max': + this[kBuiltOptions].max = value; + break; + case 'maxTimeMS': + this[kBuiltOptions].maxTimeMS = value; + break; + case 'min': + this[kBuiltOptions].min = value; + break; + case 'orderby': + this[kBuiltOptions].sort = sort_1.formatSort(value); + break; + case 'query': + this[kFilter] = value; + break; + case 'returnKey': + this[kBuiltOptions].returnKey = value; + break; + case 'showDiskLoc': + this[kBuiltOptions].showRecordId = value; + break; + default: + throw new error_1.MongoInvalidArgumentError(`Invalid query modifier: ${name}`); + } + return this; + } + /** + * Add a comment to the cursor query allowing for tracking the comment in the log. + * + * @param value - The comment attached to this query. + */ + comment(value) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].comment = value; + return this; + } + /** + * Set a maxAwaitTimeMS on a tailing cursor query to allow to customize the timeout value for the option awaitData (Only supported on MongoDB 3.2 or higher, ignored otherwise) + * + * @param value - Number of milliseconds to wait before aborting the tailed query. + */ + maxAwaitTimeMS(value) { + abstract_cursor_1.assertUninitialized(this); + if (typeof value !== 'number') { + throw new error_1.MongoInvalidArgumentError('Argument for maxAwaitTimeMS must be a number'); + } + this[kBuiltOptions].maxAwaitTimeMS = value; + return this; + } + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value) { + abstract_cursor_1.assertUninitialized(this); + if (typeof value !== 'number') { + throw new error_1.MongoInvalidArgumentError('Argument for maxTimeMS must be a number'); + } + this[kBuiltOptions].maxTimeMS = value; + return this; + } + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic + * {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. 
+ * + * @example + * ```typescript + * // Best way + * const docs: FindCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: FindCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor<{ a: number; b: string }> = coll.find(); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.find().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project(value) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].projection = value; + return this; + } + /** + * Sets the sort order of the cursor query. + * + * @param sort - The key or keys set for the sort. + * @param direction - The direction of the sorting (1 or -1). + */ + sort(sort, direction) { + abstract_cursor_1.assertUninitialized(this); + if (this[kBuiltOptions].tailable) { + throw new error_1.MongoTailableCursorError('Tailable cursor does not support sorting'); + } + this[kBuiltOptions].sort = sort_1.formatSort(sort, direction); + return this; + } + /** + * Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) + * + * @remarks + * {@link https://docs.mongodb.com/manual/reference/command/find/#find-cmd-allowdiskuse | find command allowDiskUse documentation} + */ + allowDiskUse() { + abstract_cursor_1.assertUninitialized(this); + if (!this[kBuiltOptions].sort) { + throw new error_1.MongoInvalidArgumentError('Option "allowDiskUse" requires a sort specification'); + } + this[kBuiltOptions].allowDiskUse = true; + return this; + } + /** + * Set the collation options for the cursor. + * + * @param value - The cursor collation options (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). + */ + collation(value) { + abstract_cursor_1.assertUninitialized(this); + this[kBuiltOptions].collation = value; + return this; + } + /** + * Set the limit for the cursor. + * + * @param value - The limit for the cursor query. + */ + limit(value) { + abstract_cursor_1.assertUninitialized(this); + if (this[kBuiltOptions].tailable) { + throw new error_1.MongoTailableCursorError('Tailable cursor does not support limit'); + } + if (typeof value !== 'number') { + throw new error_1.MongoInvalidArgumentError('Operation "limit" requires an integer'); + } + this[kBuiltOptions].limit = value; + return this; + } + /** + * Set the skip for the cursor. + * + * @param value - The skip for the cursor query. 
+ */ + skip(value) { + abstract_cursor_1.assertUninitialized(this); + if (this[kBuiltOptions].tailable) { + throw new error_1.MongoTailableCursorError('Tailable cursor does not support skip'); + } + if (typeof value !== 'number') { + throw new error_1.MongoInvalidArgumentError('Operation "skip" requires an integer'); + } + this[kBuiltOptions].skip = value; + return this; + } +} +exports.FindCursor = FindCursor; +//# sourceMappingURL=find_cursor.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/cursor/find_cursor.js.map b/node_modules/mongodb/lib/cursor/find_cursor.js.map new file mode 100644 index 000000000..66efed812 --- /dev/null +++ b/node_modules/mongodb/lib/cursor/find_cursor.js.map @@ -0,0 +1 @@ +{"version":3,"file":"find_cursor.js","sourceRoot":"","sources":["../../src/cursor/find_cursor.ts"],"names":[],"mappings":";;;AACA,oCAA+E;AAE/E,+CAAmE;AACnE,uEAAoF;AACpF,6CAAgE;AAChE,oCAAwC;AAKxC,kCAA0D;AAE1D,uDAAwE;AAExE,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAC3C,gBAAgB;AAChB,MAAM,aAAa,GAAG,MAAM,CAAC,cAAc,CAAC,CAAC;AAE7C,uCAAuC;AAC1B,QAAA,KAAK,GAAG;IACnB,UAAU;IACV,aAAa;IACb,iBAAiB;IACjB,WAAW;IACX,SAAS;IACT,SAAS;CACD,CAAC;AAEX,cAAc;AACd,MAAa,UAA+B,SAAQ,gCAAuB;IAQzE,gBAAgB;IAChB,YACE,QAAkB,EAClB,SAA2B,EAC3B,MAA4B,EAC5B,UAAuB,EAAE;QAEzB,KAAK,CAAC,QAAQ,EAAE,SAAS,EAAE,OAAO,CAAC,CAAC;QAEpC,IAAI,CAAC,OAAO,CAAC,GAAG,MAAM,IAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,aAAa,CAAC,GAAG,OAAO,CAAC;QAE9B,IAAI,OAAO,CAAC,IAAI,IAAI,IAAI,EAAE;YACxB,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,GAAG,iBAAU,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;SACrD;IACH,CAAC;IAED,KAAK;QACH,MAAM,aAAa,GAAG,oBAAY,CAAC,EAAE,EAAE,IAAI,CAAC,aAAa,CAAC,CAAC,CAAC;QAC5D,OAAO,aAAa,CAAC,OAAO,CAAC;QAC7B,OAAO,IAAI,UAAU,CAAC,IAAI,CAAC,QAAQ,EAAE,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,EAAE;YAClE,GAAG,aAAa;SACjB,CAAC,CAAC;IACL,CAAC;IAED,GAAG,CAAI,SAA8B;QACnC,OAAO,KAAK,CAAC,GAAG,CAAC,SAAS,CAAkB,CAAC;IAC/C,CAAC;IAED,gBAAgB;IAChB,WAAW,CAAC,OAAkC,EAAE,QAAmC;QACjF,MAAM,aAAa,GAAG,IAAI,oBAAa,CAAC,SAAS,EAAE,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,EAAE;YAChF,GAAG,IAAI,CAAC,aAAa,CAAC;YACtB,GAAG,IAAI,CAAC,aAAa;YACrB,OAAO;SACR,CAAC,CAAC;QAEH,oCAAgB,CAAC,IAAI,CAAC,QAAQ,EAAE,aAAa,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAC/D,IAAI,GAAG,IAAI,QAAQ,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAElD,gFAAgF;YAChF,uDAAuD;YACvD,IAAI,QAAQ,CAAC,MAAM,EAAE;gBACnB,IAAI,CAAC,YAAY,CAAC,GAAG,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC,MAAM,CAAC;aACxD;iBAAM;gBACL,IAAI,CAAC,YAAY,CAAC,GAAG,QAAQ,CAAC,SAAS,CAAC,CAAC,CAAC,QAAQ,CAAC,SAAS,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC;aACzE;YAED,kBAAkB;YAClB,QAAQ,CAAC,SAAS,EAAE,EAAE,MAAM,EAAE,aAAa,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,EAAE,CAAC,CAAC;QAC3E,CAAC,CAAC,CAAC;IACL,CAAC;IAED,gBAAgB;IAChB,QAAQ,CAAC,SAAiB,EAAE,QAA4B;QACtD,yEAAyE;QACzE,MAAM,WAAW,GAAG,IAAI,CAAC,YAAY,CAAC,CAAC;QACvC,IAAI,WAAW,EAAE;YACf,MAAM,KAAK,GAAG,IAAI,CAAC,aAAa,CAAC,CAAC,KAAK,CAAC;YACxC,SAAS;gBACP,KAAK,IAAI,KAAK,GAAG,CAAC,IAAI,WAAW,GAAG,SAAS,GAAG,KAAK,CAAC,CAAC,CAAC,KAAK,GAAG,WAAW,CAAC,CAAC,CAAC,SAAS,CAAC;YAE1F,IAAI,SAAS,IAAI,CAAC,EAAE;gBAClB,OAAO,IAAI,CAAC,KAAK,CAAC,QAAQ,CAAC,CAAC;aAC7B;SACF;QAED,KAAK,CAAC,QAAQ,CAAC,SAAS,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAC1C,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAE9B,2FAA2F;YAC3F,IAAI,QAAQ,EAAE;gBACZ,IAAI,CAAC,YAAY,CAAC,GAAG,IAAI,CAAC,YAAY,CAAC,GAAG,QAAQ,CAAC,MAAM,CAAC,SAAS,CAAC,MAAM,CAAC;aAC5E;YAED,QAAQ,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC;QAChC,CAAC,CAAC,CAAC;IACL,CAAC;IAOD,KAAK,CACH,OAAyC,EACzC,QAA2B;QAE3B,IAAI,OAAO,OAAO,KAAK,SAAS,EAAE;YAChC,MAAM,IAAI,iCAAyB,CAAC,kCAAkC,CAAC,CAAC;SACzE;QAED,IAAI
,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,OAAO,oCAAgB,CACrB,IAAI,CAAC,QAAQ,EACb,IAAI,sBAAc,CAAC,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,EAAE;YAChD,GAAG,IAAI,CAAC,aAAa,CAAC;YACtB,GAAG,IAAI,CAAC,aAAa;YACrB,GAAG,OAAO;SACX,CAAC,EACF,QAAQ,CACT,CAAC;IACJ,CAAC;IAMD,OAAO,CACL,SAA2C,EAC3C,QAA6B;QAE7B,IAAI,OAAO,SAAS,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,EAAE,CAAC,SAAS,GAAG,IAAI,CAAC,CAAC;QAChF,IAAI,SAAS,IAAI,IAAI;YAAE,SAAS,GAAG,IAAI,CAAC;QAExC,OAAO,oCAAgB,CACrB,IAAI,CAAC,QAAQ,EACb,IAAI,oBAAa,CAAC,SAAS,EAAE,IAAI,CAAC,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,EAAE;YAC1D,GAAG,IAAI,CAAC,aAAa,CAAC;YACtB,GAAG,IAAI,CAAC,aAAa;YACrB,OAAO,EAAE,SAAS;SACnB,CAAC,EACF,QAAQ,CACT,CAAC;IACJ,CAAC;IAED,2BAA2B;IAC3B,MAAM,CAAC,MAAgB;QACrB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,OAAO,CAAC,GAAG,MAAM,CAAC;QACvB,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,IAAI,CAAC,IAAU;QACb,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,GAAG,IAAI,CAAC;QAChC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,GAAG,CAAC,GAAa;QACf,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,GAAG,GAAG,GAAG,CAAC;QAC9B,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,GAAG,CAAC,GAAa;QACf,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,GAAG,GAAG,GAAG,CAAC;QAC9B,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;;OAMG;IACH,SAAS,CAAC,KAAc;QACtB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,SAAS,GAAG,KAAK,CAAC;QACtC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,YAAY,CAAC,KAAc;QACzB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,YAAY,GAAG,KAAK,CAAC;QACzC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;OAKG;IACH,gBAAgB,CAAC,IAAY,EAAE,KAA2C;QACxE,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;YACnB,MAAM,IAAI,iCAAyB,CAAC,GAAG,IAAI,gCAAgC,CAAC,CAAC;SAC9E;QAED,iBAAiB;QACjB,MAAM,KAAK,GAAG,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;QAE7B,wCAAwC;QACxC,QAAQ,KAAK,EAAE;YACb,KAAK,SAAS;gBACZ,IAAI,CAAC,aAAa,CAAC,CAAC,OAAO,GAAG,KAA0B,CAAC;gBACzD,MAAM;YAER,KAAK,SAAS;gBACZ,IAAI,CAAC,aAAa,CAAC,CAAC,OAAO,GAAG,KAAgB,CAAC;gBAC/C,MAAM;YAER,KAAK,MAAM;gBACT,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,GAAG,KAA0B,CAAC;gBACtD,MAAM;YAER,KAAK,KAAK;gBACR,IAAI,CAAC,aAAa,CAAC,CAAC,GAAG,GAAG,KAAiB,CAAC;gBAC5C,MAAM;YAER,KAAK,WAAW;gBACd,IAAI,CAAC,aAAa,CAAC,CAAC,SAAS,GAAG,KAAe,CAAC;gBAChD,MAAM;YAER,KAAK,KAAK;gBACR,IAAI,CAAC,aAAa,CAAC,CAAC,GAAG,GAAG,KAAiB,CAAC;gBAC5C,MAAM;YAER,KAAK,SAAS;gBACZ,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,GAAG,iBAAU,CAAC,KAA0B,CAAC,CAAC;gBAClE,MAAM;YAER,KAAK,OAAO;gBACV,IAAI,CAAC,OAAO,CAAC,GAAG,KAAiB,CAAC;gBAClC,MAAM;YAER,KAAK,WAAW;gBACd,IAAI,CAAC,aAAa,CAAC,CAAC,SAAS,GAAG,KAAgB,CAAC;gBACjD,MAAM;YAER,KAAK,aAAa;gBAChB,IAAI,CAAC,aAAa,CAAC,CAAC,YAAY,GAAG,KAAgB,CAAC;gBACpD,MAAM;YAER;gBACE,MAAM,IAAI,iCAAyB,CAAC,2BAA2B,IAAI,EAAE,CAAC,CAAC;SAC1E;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,OAAO,CAAC,KAAa;QACnB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,OAAO,GAAG,KAAK,CAAC;QACpC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,cAAc,CAAC,KAAa;QAC1B,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,8CAA8C,CAAC,CAAC;SACrF;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,cAAc,GAAG,KAAK,CAAC;QAC3C,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,SAAS,CAAC,KAAa;QACrB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,yCAAyC,CAAC,CAAC;SAChF;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,SAAS,GAAG,KAAK,CAAC;QACtC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;OAuCG;IACH,OAAO,CAAgC,KAAe;QACpD,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC
1B,IAAI,CAAC,aAAa,CAAC,CAAC,UAAU,GAAG,KAAK,CAAC;QACvC,OAAO,IAAgC,CAAC;IAC1C,CAAC;IAED;;;;;OAKG;IACH,IAAI,CAAC,IAAmB,EAAE,SAAyB;QACjD,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,IAAI,CAAC,aAAa,CAAC,CAAC,QAAQ,EAAE;YAChC,MAAM,IAAI,gCAAwB,CAAC,0CAA0C,CAAC,CAAC;SAChF;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,GAAG,iBAAU,CAAC,IAAI,EAAE,SAAS,CAAC,CAAC;QACvD,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;OAKG;IACH,YAAY;QACV,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,qDAAqD,CAAC,CAAC;SAC5F;QACD,IAAI,CAAC,aAAa,CAAC,CAAC,YAAY,GAAG,IAAI,CAAC;QACxC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,SAAS,CAAC,KAAuB;QAC/B,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,CAAC,aAAa,CAAC,CAAC,SAAS,GAAG,KAAK,CAAC;QACtC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,KAAK,CAAC,KAAa;QACjB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,IAAI,CAAC,aAAa,CAAC,CAAC,QAAQ,EAAE;YAChC,MAAM,IAAI,gCAAwB,CAAC,wCAAwC,CAAC,CAAC;SAC9E;QAED,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,uCAAuC,CAAC,CAAC;SAC9E;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,KAAK,GAAG,KAAK,CAAC;QAClC,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;OAIG;IACH,IAAI,CAAC,KAAa;QAChB,qCAAmB,CAAC,IAAI,CAAC,CAAC;QAC1B,IAAI,IAAI,CAAC,aAAa,CAAC,CAAC,QAAQ,EAAE;YAChC,MAAM,IAAI,gCAAwB,CAAC,uCAAuC,CAAC,CAAC;SAC7E;QAED,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,MAAM,IAAI,iCAAyB,CAAC,sCAAsC,CAAC,CAAC;SAC7E;QAED,IAAI,CAAC,aAAa,CAAC,CAAC,IAAI,GAAG,KAAK,CAAC;QACjC,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AA9aD,gCA8aC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/db.js b/node_modules/mongodb/lib/db.js new file mode 100644 index 000000000..59acda455 --- /dev/null +++ b/node_modules/mongodb/lib/db.js @@ -0,0 +1,317 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Db = void 0; +const utils_1 = require("./utils"); +const aggregation_cursor_1 = require("./cursor/aggregation_cursor"); +const bson_1 = require("./bson"); +const read_preference_1 = require("./read_preference"); +const error_1 = require("./error"); +const collection_1 = require("./collection"); +const change_stream_1 = require("./change_stream"); +const CONSTANTS = require("./constants"); +const write_concern_1 = require("./write_concern"); +const read_concern_1 = require("./read_concern"); +const logger_1 = require("./logger"); +const add_user_1 = require("./operations/add_user"); +const collections_1 = require("./operations/collections"); +const stats_1 = require("./operations/stats"); +const run_command_1 = require("./operations/run_command"); +const create_collection_1 = require("./operations/create_collection"); +const indexes_1 = require("./operations/indexes"); +const drop_1 = require("./operations/drop"); +const list_collections_1 = require("./operations/list_collections"); +const profiling_level_1 = require("./operations/profiling_level"); +const remove_user_1 = require("./operations/remove_user"); +const rename_1 = require("./operations/rename"); +const set_profiling_level_1 = require("./operations/set_profiling_level"); +const execute_operation_1 = require("./operations/execute_operation"); +const admin_1 = require("./admin"); +// Allowed parameters +const DB_OPTIONS_ALLOW_LIST = [ + 'writeConcern', + 'readPreference', + 'readPreferenceTags', + 'native_parser', + 'forceServerObjectId', + 'pkFactory', + 'serializeFunctions', + 'raw', + 'authSource', + 'ignoreUndefined', + 'readConcern', + 'retryMiliSeconds', + 'numberOfRetries', + 'loggerLevel', + 'logger', + 'promoteBuffers', + 'promoteLongs', + 'bsonRegExp', + 'promoteValues', + 'compression', + 'retryWrites' +]; +/** + 
* The **Db** class is a class that represents a MongoDB Database. + * @public + * + * @example + * ```js + * const { MongoClient } = require('mongodb'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Select the database by name + * const testDb = client.db(dbName); + * client.close(); + * }); + * ``` + */ +class Db { + /** + * Creates a new Db instance + * + * @param client - The MongoClient for the database. + * @param databaseName - The name of the database this instance represents. + * @param options - Optional settings for Db construction + */ + constructor(client, databaseName, options) { + var _a; + options = options !== null && options !== void 0 ? options : {}; + // Filter the options + options = utils_1.filterOptions(options, DB_OPTIONS_ALLOW_LIST); + // Ensure we have a valid db name + validateDatabaseName(databaseName); + // Internal state of the db object + this.s = { + // Client + client, + // Options + options, + // Logger instance + logger: new logger_1.Logger('Db', options), + // Unpack read preference + readPreference: read_preference_1.ReadPreference.fromOptions(options), + // Merge bson options + bsonOptions: bson_1.resolveBSONOptions(options, client), + // Set up the primary key factory or fallback to ObjectId + pkFactory: (_a = options === null || options === void 0 ? void 0 : options.pkFactory) !== null && _a !== void 0 ? _a : utils_1.DEFAULT_PK_FACTORY, + // ReadConcern + readConcern: read_concern_1.ReadConcern.fromOptions(options), + writeConcern: write_concern_1.WriteConcern.fromOptions(options), + // Namespace + namespace: new utils_1.MongoDBNamespace(databaseName) + }; + } + get databaseName() { + return this.s.namespace.db; + } + // Options + get options() { + return this.s.options; + } + // slaveOk specified + get slaveOk() { + var _a; + return ((_a = this.s.readPreference) === null || _a === void 0 ? void 0 : _a.preference) !== 'primary' || false; + } + get readConcern() { + return this.s.readConcern; + } + /** + * The current readPreference of the Db. If not explicitly defined for + * this Db, will be inherited from the parent MongoClient + */ + get readPreference() { + if (this.s.readPreference == null) { + return this.s.client.readPreference; + } + return this.s.readPreference; + } + get bsonOptions() { + return this.s.bsonOptions; + } + // get the write Concern + get writeConcern() { + return this.s.writeConcern; + } + get namespace() { + return this.s.namespace.toString(); + } + createCollection(name, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new create_collection_1.CreateCollectionOperation(this, name, utils_1.resolveOptions(this, options)), callback); + } + command(command, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + // Intentionally, we do not inherit options from parent for this operation. + return execute_operation_1.executeOperation(utils_1.getTopology(this), new run_command_1.RunCommandOperation(this, command, options !== null && options !== void 0 ? 
options : {}), callback); + } + /** + * Execute an aggregation framework pipeline against the database, needs MongoDB \>= 3.6 + * + * @param pipeline - An array of aggregation stages to be executed + * @param options - Optional settings for the command + */ + aggregate(pipeline = [], options) { + if (arguments.length > 2) { + throw new error_1.MongoInvalidArgumentError('Method "db.aggregate()" accepts at most two arguments'); + } + if (typeof pipeline === 'function') { + throw new error_1.MongoInvalidArgumentError('Argument "pipeline" must not be function'); + } + if (typeof options === 'function') { + throw new error_1.MongoInvalidArgumentError('Argument "options" must not be function'); + } + return new aggregation_cursor_1.AggregationCursor(utils_1.getTopology(this), this.s.namespace, pipeline, utils_1.resolveOptions(this, options)); + } + /** Return the Admin db instance */ + admin() { + return new admin_1.Admin(this); + } + /** + * Returns a reference to a MongoDB Collection. If it does not exist it will be created implicitly. + * + * @param name - the collection name we wish to access. + * @returns return the new Collection instance + */ + collection(name, options = {}) { + if (typeof options === 'function') { + throw new error_1.MongoInvalidArgumentError('The callback form of this helper has been removed.'); + } + const finalOptions = utils_1.resolveOptions(this, options); + return new collection_1.Collection(this, name, finalOptions); + } + stats(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new stats_1.DbStatsOperation(this, utils_1.resolveOptions(this, options)), callback); + } + listCollections(filter = {}, options = {}) { + return new list_collections_1.ListCollectionsCursor(this, filter, utils_1.resolveOptions(this, options)); + } + renameCollection(fromCollection, toCollection, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + // Intentionally, we do not inherit options from parent for this operation. 
+ options = { ...options, readPreference: read_preference_1.ReadPreference.PRIMARY }; + // Add return new collection + options.new_collection = true; + return execute_operation_1.executeOperation(utils_1.getTopology(this), new rename_1.RenameOperation(this.collection(fromCollection), toCollection, options), callback); + } + dropCollection(name, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new drop_1.DropCollectionOperation(this, name, utils_1.resolveOptions(this, options)), callback); + } + dropDatabase(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new drop_1.DropDatabaseOperation(this, utils_1.resolveOptions(this, options)), callback); + } + collections(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new collections_1.CollectionsOperation(this, utils_1.resolveOptions(this, options)), callback); + } + createIndex(name, indexSpec, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new indexes_1.CreateIndexOperation(this, name, indexSpec, utils_1.resolveOptions(this, options)), callback); + } + addUser(username, password, options, callback) { + if (typeof password === 'function') { + (callback = password), (password = undefined), (options = {}); + } + else if (typeof password !== 'string') { + if (typeof options === 'function') { + (callback = options), (options = password), (password = undefined); + } + else { + (options = password), (callback = undefined), (password = undefined); + } + } + else { + if (typeof options === 'function') + (callback = options), (options = {}); + } + return execute_operation_1.executeOperation(utils_1.getTopology(this), new add_user_1.AddUserOperation(this, username, password, utils_1.resolveOptions(this, options)), callback); + } + removeUser(username, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new remove_user_1.RemoveUserOperation(this, username, utils_1.resolveOptions(this, options)), callback); + } + setProfilingLevel(level, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new set_profiling_level_1.SetProfilingLevelOperation(this, level, utils_1.resolveOptions(this, options)), callback); + } + profilingLevel(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new profiling_level_1.ProfilingLevelOperation(this, utils_1.resolveOptions(this, options)), callback); + } + indexInformation(name, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + return execute_operation_1.executeOperation(utils_1.getTopology(this), new indexes_1.IndexInformationOperation(this, name, utils_1.resolveOptions(this, options)), callback); + } + /** + * Unref all sockets + * @deprecated This function is deprecated and will be removed in the next major version. 
+ */ + unref() { + utils_1.getTopology(this).unref(); + } + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this database. Will ignore all + * changes to system collections. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline = [], options = {}) { + // Allow optionally not specifying a pipeline + if (!Array.isArray(pipeline)) { + options = pipeline; + pipeline = []; + } + return new change_stream_1.ChangeStream(this, pipeline, utils_1.resolveOptions(this, options)); + } + /** Return the db logger */ + getLogger() { + return this.s.logger; + } + get logger() { + return this.s.logger; + } +} +exports.Db = Db; +Db.SYSTEM_NAMESPACE_COLLECTION = CONSTANTS.SYSTEM_NAMESPACE_COLLECTION; +Db.SYSTEM_INDEX_COLLECTION = CONSTANTS.SYSTEM_INDEX_COLLECTION; +Db.SYSTEM_PROFILE_COLLECTION = CONSTANTS.SYSTEM_PROFILE_COLLECTION; +Db.SYSTEM_USER_COLLECTION = CONSTANTS.SYSTEM_USER_COLLECTION; +Db.SYSTEM_COMMAND_COLLECTION = CONSTANTS.SYSTEM_COMMAND_COLLECTION; +Db.SYSTEM_JS_COLLECTION = CONSTANTS.SYSTEM_JS_COLLECTION; +// TODO(NODE-3484): Refactor into MongoDBNamespace +// Validate the database name +function validateDatabaseName(databaseName) { + if (typeof databaseName !== 'string') + throw new error_1.MongoInvalidArgumentError('Database name must be a string'); + if (databaseName.length === 0) + throw new error_1.MongoInvalidArgumentError('Database name cannot be the empty string'); + if (databaseName === '$external') + return; + const invalidChars = [' ', '.', '$', '/', '\\']; + for (let i = 0; i < invalidChars.length; i++) { + if (databaseName.indexOf(invalidChars[i]) !== -1) + throw new error_1.MongoAPIError(`database names cannot contain the character '${invalidChars[i]}'`); + } +} +//# sourceMappingURL=db.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/db.js.map b/node_modules/mongodb/lib/db.js.map new file mode 100644 index 000000000..5e08bdcd7 --- /dev/null +++ b/node_modules/mongodb/lib/db.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"db.js","sourceRoot":"","sources":["../src/db.ts"],"names":[],"mappings":";;;AAAA,mCAOiB;AACjB,oEAAgE;AAChE,iCAA4E;AAC5E,uDAAuE;AACvE,mCAAmE;AACnE,6CAA6D;AAC7D,mDAAoE;AACpE,yCAAyC;AACzC,mDAAoE;AACpE,iDAA6C;AAC7C,qCAAiD;AAEjD,oDAAyE;AACzE,0DAAgE;AAChE,8CAAsE;AACtE,0DAAkF;AAClF,sEAAoG;AACpG,kDAK8B;AAC9B,4CAK2B;AAC3B,oEAIuC;AACvC,kEAA8F;AAC9F,0DAAkF;AAClF,gDAAqE;AACrE,0EAI0C;AAC1C,sEAAkE;AAGlE,mCAAgC;AAGhC,qBAAqB;AACrB,MAAM,qBAAqB,GAAG;IAC5B,cAAc;IACd,gBAAgB;IAChB,oBAAoB;IACpB,eAAe;IACf,qBAAqB;IACrB,WAAW;IACX,oBAAoB;IACpB,KAAK;IACL,YAAY;IACZ,iBAAiB;IACjB,aAAa;IACb,kBAAkB;IAClB,iBAAiB;IACjB,aAAa;IACb,QAAQ;IACR,gBAAgB;IAChB,cAAc;IACd,YAAY;IACZ,eAAe;IACf,aAAa;IACb,aAAa;CACd,CAAC;AA+BF;;;;;;;;;;;;;;;;;;GAkBG;AACH,MAAa,EAAE;IAWb;;;;;;OAMG;IACH,YAAY,MAAmB,EAAE,YAAoB,EAAE,OAAmB;;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,qBAAqB;QACrB,OAAO,GAAG,qBAAa,CAAC,OAAO,EAAE,qBAAqB,CAAC,CAAC;QAExD,iCAAiC;QACjC,oBAAoB,CAAC,YAAY,CAAC,CAAC;QAEnC,kCAAkC;QAClC,IAAI,CAAC,CAAC,GAAG;YACP,SAAS;YACT,MAAM;YACN,UAAU;YACV,OAAO;YACP,kBAAkB;YAClB,MAAM,EAAE,IAAI,eAAM,CAAC,IAAI,EAAE,OAAO,CAAC;YACjC,yBAAyB;YACzB,cAAc,EAAE,gCAAc,CAAC,WAAW,CAAC,OAAO,CAAC;YACnD,qBAAqB;YACrB,WAAW,EAAE,yBAAkB,CAAC,OAAO,EAAE,MAAM,CAAC;YAChD,yDAAyD;YACzD,SAAS,EAAE,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,SAAS,mCAAI,0BAAkB;YACnD,cAAc;YACd,WAAW,EAAE,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC;YAC7C,YAAY,EAAE,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC;YAC/C,YAAY;YACZ,SAAS,EAAE,IAAI,wBAAgB,CAAC,YAAY,CAAC;SAC9C,CAAC;IACJ,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,EAAE,CAAC;IAC7B,CAAC;IAED,UAAU;IACV,IAAI,OAAO;QACT,OAAO,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC;IACxB,CAAC;IAED,oBAAoB;IACpB,IAAI,OAAO;;QACT,OAAO,CAAA,MAAA,IAAI,CAAC,CAAC,CAAC,cAAc,0CAAE,UAAU,MAAK,SAAS,IAAI,KAAK,CAAC;IAClE,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED;;;OAGG;IACH,IAAI,cAAc;QAChB,IAAI,IAAI,CAAC,CAAC,CAAC,cAAc,IAAI,IAAI,EAAE;YACjC,OAAO,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC,cAAc,CAAC;SACrC;QAED,OAAO,IAAI,CAAC,CAAC,CAAC,cAAc,CAAC;IAC/B,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,wBAAwB;IACxB,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;IAC7B,CAAC;IAED,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,QAAQ,EAAE,CAAC;IACrC,CAAC;IAwBD,gBAAgB,CACd,IAAY,EACZ,OAAwD,EACxD,QAA+B;QAE/B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,6CAAyB,CAAC,IAAI,EAAE,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAmB,EAC1F,QAAQ,CACS,CAAC;IACtB,CAAC;IAgBD,OAAO,CACL,OAAiB,EACjB,OAAgD,EAChD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,2EAA2E;QAC3E,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,iCAAmB,CAAC,IAAI,EAAE,OAAO,EAAE,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC,EACrD,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;;;OAKG;IACH,SAAS,CACP,WAAuB,EAAE,EACzB,OAA0B;QAE1B,IAAI,SAAS,CAAC,MAAM,GAAG,CAAC,EAAE;YACxB,MAAM,IAAI,iCAAyB,CAAC,uDAAuD,CAAC,CAAC;SAC9F;QACD,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,MAAM,IAAI,iCAAyB,CAAC,0CAA0C,CAAC,CAAC;SACjF;QACD,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,MAAM,IAAI,iCAAyB,CAAC,yCAAyC,CAAC,CAAC;SAChF;QAED,OAAO,IAAI,sCAAiB,CAC1B,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,CAAC,CAAC,CAAC,SAAS,EAChB,QAAQ,EACR,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAC9B,CAAC;IACJ,CAAC;IAED,mCAAmC;IACnC,KAAK;QACH,OAAO,IAAI,aAAK,CAAC,IAAI,CAAC,CAAC;IACzB,CAAC;IAED;;;;;OAKG;IACH,UAAU,CACR,IAAY,EACZ,UAA6B,EAAE;QAE/B,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,MAAM,IAAI,iCAAyB,CAAC,oDAAoD,CAAC,CAAC;SAC3F;QACD,MAAM,YAAY,GAAG,sB
AAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC;QACnD,OAAO,IAAI,uBAAU,CAAU,IAAI,EAAE,IAAI,EAAE,YAAY,CAAC,CAAC;IAC3D,CAAC;IAYD,KAAK,CACH,OAA6C,EAC7C,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,wBAAgB,CAAC,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACzD,QAAQ,CACT,CAAC;IACJ,CAAC;IAqBD,eAAe,CAIb,SAAmB,EAAE,EAAE,UAAkC,EAAE;QAC3D,OAAO,IAAI,wCAAqB,CAAI,IAAI,EAAE,MAAM,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IACnF,CAAC;IAiCD,gBAAgB,CACd,cAAsB,EACtB,YAAoB,EACpB,OAAuD,EACvD,QAAwC;QAExC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,2EAA2E;QAC3E,OAAO,GAAG,EAAE,GAAG,OAAO,EAAE,cAAc,EAAE,gCAAc,CAAC,OAAO,EAAE,CAAC;QAEjE,4BAA4B;QAC5B,OAAO,CAAC,cAAc,GAAG,IAAI,CAAC;QAE9B,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,wBAAe,CACjB,IAAI,CAAC,UAAU,CAAU,cAAc,CAAmB,EAC1D,YAAY,EACZ,OAAO,CACU,EACnB,QAAQ,CACT,CAAC;IACJ,CAAC;IAaD,cAAc,CACZ,IAAY,EACZ,OAAmD,EACnD,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,8BAAuB,CAAC,IAAI,EAAE,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACtE,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,YAAY,CACV,OAAiD,EACjD,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,4BAAqB,CAAC,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC9D,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,WAAW,CACT,OAAyD,EACzD,QAAiC;QAEjC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,kCAAoB,CAAC,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC7D,QAAQ,CACT,CAAC;IACJ,CAAC;IAuBD,WAAW,CACT,IAAY,EACZ,SAA6B,EAC7B,OAAiD,EACjD,QAA2B;QAE3B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,8BAAoB,CAAC,IAAI,EAAE,IAAI,EAAE,SAAS,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC9E,QAAQ,CACT,CAAC;IACJ,CAAC;IAuBD,OAAO,CACL,QAAgB,EAChB,QAAuD,EACvD,OAA6C,EAC7C,QAA6B;QAE7B,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,CAAC,QAAQ,GAAG,QAAQ,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SAC/D;aAAM,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;YACvC,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;gBACjC,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,QAAQ,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,CAAC;aACpE;iBAAM;gBACL,CAAC,OAAO,GAAG,QAAQ,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,EAAE,CAAC,QAAQ,GAAG,SAAS,CAAC,CAAC;aACtE;SACF;aAAM;YACL,IAAI,OAAO,OAAO,KAAK,UAAU;gBAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;SACzE;QAED,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,2BAAgB,CAAC,IAAI,EAAE,QAAQ,EAAE,QAAQ,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC7E,QAAQ,CACT,CAAC;IACJ,CAAC;IAaD,UAAU,CACR,QAAgB,EAChB,OAA+C,EAC/C,QAA4B;QAE5B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,iCAAmB,CAAC,IAAI,EAAE,QAAQ,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACtE,QAAQ,CACT,CAAC;IACJ,CAAC;IAoBD,iBAAiB,CACf,KAAqB,EACrB,OAA6D,EAC7D,QAAmC;QAEnC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,gDAA0B,CAAC,IAAI,EAAE,KAAK,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAC1E,QAAQ,CACT,CAAC;IACJ,CAAC;IAYD,cAAc,CACZ,OAAkD,EAClD,QAA2B;QAE3B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAA
Q,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,yCAAuB,CAAC,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EAChE,QAAQ,CACT,CAAC;IACJ,CAAC;IAiBD,gBAAgB,CACd,IAAY,EACZ,OAAsD,EACtD,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QAExE,OAAO,oCAAgB,CACrB,mBAAW,CAAC,IAAI,CAAC,EACjB,IAAI,mCAAyB,CAAC,IAAI,EAAE,IAAI,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,EACxE,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;OAGG;IACH,KAAK;QACH,mBAAW,CAAC,IAAI,CAAC,CAAC,KAAK,EAAE,CAAC;IAC5B,CAAC;IAED;;;;;;;OAOG;IACH,KAAK,CACH,WAAuB,EAAE,EACzB,UAA+B,EAAE;QAEjC,6CAA6C;QAC7C,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC5B,OAAO,GAAG,QAAQ,CAAC;YACnB,QAAQ,GAAG,EAAE,CAAC;SACf;QAED,OAAO,IAAI,4BAAY,CAAU,IAAI,EAAE,QAAQ,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IAClF,CAAC;IAED,2BAA2B;IAC3B,SAAS;QACP,OAAO,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC;IACvB,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC;IACvB,CAAC;;AAlmBH,gBAmmBC;AA/lBe,8BAA2B,GAAG,SAAS,CAAC,2BAA2B,CAAC;AACpE,0BAAuB,GAAG,SAAS,CAAC,uBAAuB,CAAC;AAC5D,4BAAyB,GAAG,SAAS,CAAC,yBAAyB,CAAC;AAChE,yBAAsB,GAAG,SAAS,CAAC,sBAAsB,CAAC;AAC1D,4BAAyB,GAAG,SAAS,CAAC,yBAAyB,CAAC;AAChE,uBAAoB,GAAG,SAAS,CAAC,oBAAoB,CAAC;AA4lBtE,kDAAkD;AAClD,6BAA6B;AAC7B,SAAS,oBAAoB,CAAC,YAAoB;IAChD,IAAI,OAAO,YAAY,KAAK,QAAQ;QAClC,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;IACxE,IAAI,YAAY,CAAC,MAAM,KAAK,CAAC;QAC3B,MAAM,IAAI,iCAAyB,CAAC,0CAA0C,CAAC,CAAC;IAClF,IAAI,YAAY,KAAK,WAAW;QAAE,OAAO;IAEzC,MAAM,YAAY,GAAG,CAAC,GAAG,EAAE,GAAG,EAAE,GAAG,EAAE,GAAG,EAAE,IAAI,CAAC,CAAC;IAChD,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,YAAY,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;QAC5C,IAAI,YAAY,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;YAC9C,MAAM,IAAI,qBAAa,CAAC,gDAAgD,YAAY,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;KAC/F;AACH,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/deps.js b/node_modules/mongodb/lib/deps.js new file mode 100644 index 000000000..a98a783fc --- /dev/null +++ b/node_modules/mongodb/lib/deps.js @@ -0,0 +1,59 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.AutoEncryptionLoggerLevel = exports.aws4 = exports.saslprep = exports.Snappy = exports.Kerberos = exports.PKG_VERSION = void 0; +/* eslint-disable @typescript-eslint/no-var-requires */ +const error_1 = require("./error"); +const utils_1 = require("./utils"); +exports.PKG_VERSION = Symbol('kPkgVersion'); +function makeErrorModule(error) { + const props = error ? { kModuleError: error } : {}; + return new Proxy(props, { + get: (_, key) => { + if (key === 'kModuleError') { + return error; + } + throw error; + }, + set: () => { + throw error; + } + }); +} +exports.Kerberos = makeErrorModule(new error_1.MongoMissingDependencyError('Optional module `kerberos` not found. Please install it to enable kerberos authentication')); +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + exports.Kerberos = require('kerberos'); +} +catch { } // eslint-disable-line +exports.Snappy = makeErrorModule(new error_1.MongoMissingDependencyError('Optional module `snappy` not found. 
Please install it to enable snappy compression')); +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + exports.Snappy = require('snappy'); + try { + exports.Snappy[exports.PKG_VERSION] = utils_1.parsePackageVersion(require('snappy/package.json')); + } + catch { } // eslint-disable-line +} +catch { } // eslint-disable-line +exports.saslprep = makeErrorModule(new error_1.MongoMissingDependencyError('Optional module `saslprep` not found.' + + ' Please install it to enable Stringprep Profile for User Names and Passwords')); +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + exports.saslprep = require('saslprep'); +} +catch { } // eslint-disable-line +exports.aws4 = makeErrorModule(new error_1.MongoMissingDependencyError('Optional module `aws4` not found. Please install it to enable AWS authentication')); +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + exports.aws4 = require('aws4'); +} +catch { } // eslint-disable-line +/** @public */ +exports.AutoEncryptionLoggerLevel = Object.freeze({ + FatalError: 0, + Error: 1, + Warning: 2, + Info: 3, + Trace: 4 +}); +//# sourceMappingURL=deps.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/deps.js.map b/node_modules/mongodb/lib/deps.js.map new file mode 100644 index 000000000..6264183c7 --- /dev/null +++ b/node_modules/mongodb/lib/deps.js.map @@ -0,0 +1 @@ +{"version":3,"file":"deps.js","sourceRoot":"","sources":["../src/deps.ts"],"names":[],"mappings":";;;AAAA,uDAAuD;AACvD,mCAAsD;AAGtD,mCAAwD;AAE3C,QAAA,WAAW,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAEjD,SAAS,eAAe,CAAC,KAAU;IACjC,MAAM,KAAK,GAAG,KAAK,CAAC,CAAC,CAAC,EAAE,YAAY,EAAE,KAAK,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC;IACnD,OAAO,IAAI,KAAK,CAAC,KAAK,EAAE;QACtB,GAAG,EAAE,CAAC,CAAM,EAAE,GAAQ,EAAE,EAAE;YACxB,IAAI,GAAG,KAAK,cAAc,EAAE;gBAC1B,OAAO,KAAK,CAAC;aACd;YACD,MAAM,KAAK,CAAC;QACd,CAAC;QACD,GAAG,EAAE,GAAG,EAAE;YACR,MAAM,KAAK,CAAC;QACd,CAAC;KACF,CAAC,CAAC;AACL,CAAC;AAEU,QAAA,QAAQ,GACjB,eAAe,CACb,IAAI,mCAA2B,CAC7B,2FAA2F,CAC5F,CACF,CAAC;AAEJ,IAAI;IACF,wEAAwE;IACxE,gBAAQ,GAAG,OAAO,CAAC,UAAU,CAAC,CAAC;CAChC;AAAC,MAAM,GAAE,CAAC,sBAAsB;AAmDtB,QAAA,MAAM,GAA8D,eAAe,CAC5F,IAAI,mCAA2B,CAC7B,oFAAoF,CACrF,CACF,CAAC;AAEF,IAAI;IACF,wEAAwE;IACxE,cAAM,GAAG,OAAO,CAAC,QAAQ,CAAC,CAAC;IAC3B,IAAI;QACD,cAAc,CAAC,mBAAW,CAAC,GAAG,2BAAmB,CAAC,OAAO,CAAC,qBAAqB,CAAC,CAAC,CAAC;KACpF;IAAC,MAAM,GAAE,CAAC,sBAAsB;CAClC;AAAC,MAAM,GAAE,CAAC,sBAAsB;AAEtB,QAAA,QAAQ,GACjB,eAAe,CACb,IAAI,mCAA2B,CAC7B,uCAAuC;IACrC,8EAA8E,CACjF,CACF,CAAC;AAEJ,IAAI;IACF,wEAAwE;IACxE,gBAAQ,GAAG,OAAO,CAAC,UAAU,CAAC,CAAC;CAChC;AAAC,MAAM,GAAE,CAAC,sBAAsB;AA0CtB,QAAA,IAAI,GAAyD,eAAe,CACrF,IAAI,mCAA2B,CAC7B,kFAAkF,CACnF,CACF,CAAC;AAEF,IAAI;IACF,wEAAwE;IACxE,YAAI,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC;CACxB;AAAC,MAAM,GAAE,CAAC,sBAAsB;AAEjC,cAAc;AACD,QAAA,yBAAyB,GAAG,MAAM,CAAC,MAAM,CAAC;IACrD,UAAU,EAAE,CAAC;IACb,KAAK,EAAE,CAAC;IACR,OAAO,EAAE,CAAC;IACV,IAAI,EAAE,CAAC;IACP,KAAK,EAAE,CAAC;CACA,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/encrypter.js b/node_modules/mongodb/lib/encrypter.js new file mode 100644 index 000000000..c00349cbe --- /dev/null +++ b/node_modules/mongodb/lib/encrypter.js @@ -0,0 +1,93 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Encrypter = void 0; +/* eslint-disable @typescript-eslint/no-var-requires */ +const mongo_client_1 = require("./mongo_client"); +const error_1 = require("./error"); +const bson_1 = require("./bson"); +const connect_1 = 
require("./operations/connect"); +let AutoEncrypterClass; +/** @internal */ +const kInternalClient = Symbol('internalClient'); +/** @internal */ +class Encrypter { + constructor(client, uri, options) { + if (typeof options.autoEncryption !== 'object') { + throw new error_1.MongoInvalidArgumentError('Option "autoEncryption" must be specified'); + } + this.bypassAutoEncryption = !!options.autoEncryption.bypassAutoEncryption; + this.needsConnecting = false; + if (options.maxPoolSize === 0 && options.autoEncryption.keyVaultClient == null) { + options.autoEncryption.keyVaultClient = client; + } + else if (options.autoEncryption.keyVaultClient == null) { + options.autoEncryption.keyVaultClient = this.getInternalClient(client, uri, options); + } + if (this.bypassAutoEncryption) { + options.autoEncryption.metadataClient = undefined; + } + else if (options.maxPoolSize === 0) { + options.autoEncryption.metadataClient = client; + } + else { + options.autoEncryption.metadataClient = this.getInternalClient(client, uri, options); + } + options.autoEncryption.bson = Object.create(null); + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + options.autoEncryption.bson.serialize = bson_1.serialize; + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + options.autoEncryption.bson.deserialize = bson_1.deserialize; + this.autoEncrypter = new AutoEncrypterClass(client, options.autoEncryption); + } + getInternalClient(client, uri, options) { + if (!this[kInternalClient]) { + const clonedOptions = {}; + for (const key of Object.keys(options)) { + if (['autoEncryption', 'minPoolSize', 'servers', 'caseTranslate', 'dbName'].includes(key)) + continue; + Reflect.set(clonedOptions, key, Reflect.get(options, key)); + } + clonedOptions.minPoolSize = 0; + this[kInternalClient] = new mongo_client_1.MongoClient(uri, clonedOptions); + for (const eventName of connect_1.MONGO_CLIENT_EVENTS) { + for (const listener of client.listeners(eventName)) { + this[kInternalClient].on(eventName, listener); + } + } + client.on('newListener', (eventName, listener) => { + this[kInternalClient].on(eventName, listener); + }); + this.needsConnecting = true; + } + return this[kInternalClient]; + } + connectInternalClient(callback) { + if (this.needsConnecting) { + this.needsConnecting = false; + return this[kInternalClient].connect(callback); + } + return callback(); + } + close(client, force, callback) { + this.autoEncrypter.teardown(!!force, e => { + if (this[kInternalClient] && client !== this[kInternalClient]) { + return this[kInternalClient].close(force, callback); + } + callback(e); + }); + } + static checkForMongoCrypt() { + let mongodbClientEncryption = undefined; + try { + // Ensure you always wrap an optional require in the try block NODE-3199 + mongodbClientEncryption = require('mongodb-client-encryption'); + } + catch (err) { + throw new error_1.MongoMissingDependencyError('Auto-encryption requested, but the module is not installed. 
' + + 'Please add `mongodb-client-encryption` as a dependency of your project'); + } + AutoEncrypterClass = mongodbClientEncryption.extension(require('../lib/index')).AutoEncrypter; + } +} +exports.Encrypter = Encrypter; +//# sourceMappingURL=encrypter.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/encrypter.js.map b/node_modules/mongodb/lib/encrypter.js.map new file mode 100644 index 000000000..a0dcc5d84 --- /dev/null +++ b/node_modules/mongodb/lib/encrypter.js.map @@ -0,0 +1 @@ +{"version":3,"file":"encrypter.js","sourceRoot":"","sources":["../src/encrypter.ts"],"names":[],"mappings":";;;AAAA,uDAAuD;AACvD,iDAAiE;AAEjE,mCAAiF;AACjF,iCAAgD;AAEhD,kDAA2D;AAE3D,IAAI,kBAAiC,CAAC;AAEtC,gBAAgB;AAChB,MAAM,eAAe,GAAG,MAAM,CAAC,gBAAgB,CAAC,CAAC;AAQjD,gBAAgB;AAChB,MAAa,SAAS;IAMpB,YAAY,MAAmB,EAAE,GAAW,EAAE,OAA2B;QACvE,IAAI,OAAO,OAAO,CAAC,cAAc,KAAK,QAAQ,EAAE;YAC9C,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;SAClF;QAED,IAAI,CAAC,oBAAoB,GAAG,CAAC,CAAC,OAAO,CAAC,cAAc,CAAC,oBAAoB,CAAC;QAC1E,IAAI,CAAC,eAAe,GAAG,KAAK,CAAC;QAE7B,IAAI,OAAO,CAAC,WAAW,KAAK,CAAC,IAAI,OAAO,CAAC,cAAc,CAAC,cAAc,IAAI,IAAI,EAAE;YAC9E,OAAO,CAAC,cAAc,CAAC,cAAc,GAAG,MAAM,CAAC;SAChD;aAAM,IAAI,OAAO,CAAC,cAAc,CAAC,cAAc,IAAI,IAAI,EAAE;YACxD,OAAO,CAAC,cAAc,CAAC,cAAc,GAAG,IAAI,CAAC,iBAAiB,CAAC,MAAM,EAAE,GAAG,EAAE,OAAO,CAAC,CAAC;SACtF;QAED,IAAI,IAAI,CAAC,oBAAoB,EAAE;YAC7B,OAAO,CAAC,cAAc,CAAC,cAAc,GAAG,SAAS,CAAC;SACnD;aAAM,IAAI,OAAO,CAAC,WAAW,KAAK,CAAC,EAAE;YACpC,OAAO,CAAC,cAAc,CAAC,cAAc,GAAG,MAAM,CAAC;SAChD;aAAM;YACL,OAAO,CAAC,cAAc,CAAC,cAAc,GAAG,IAAI,CAAC,iBAAiB,CAAC,MAAM,EAAE,GAAG,EAAE,OAAO,CAAC,CAAC;SACtF;QAED,OAAO,CAAC,cAAc,CAAC,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;QAClD,oEAAoE;QACpE,OAAO,CAAC,cAAc,CAAC,IAAK,CAAC,SAAS,GAAG,gBAAS,CAAC;QACnD,oEAAoE;QACpE,OAAO,CAAC,cAAc,CAAC,IAAK,CAAC,WAAW,GAAG,kBAAW,CAAC;QAEvD,IAAI,CAAC,aAAa,GAAG,IAAI,kBAAkB,CAAC,MAAM,EAAE,OAAO,CAAC,cAAc,CAAC,CAAC;IAC9E,CAAC;IAED,iBAAiB,CAAC,MAAmB,EAAE,GAAW,EAAE,OAA2B;QAC7E,IAAI,CAAC,IAAI,CAAC,eAAe,CAAC,EAAE;YAC1B,MAAM,aAAa,GAAuB,EAAE,CAAC;YAE7C,KAAK,MAAM,GAAG,IAAI,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,EAAE;gBACtC,IAAI,CAAC,gBAAgB,EAAE,aAAa,EAAE,SAAS,EAAE,eAAe,EAAE,QAAQ,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC;oBACvF,SAAS;gBACX,OAAO,CAAC,GAAG,CAAC,aAAa,EAAE,GAAG,EAAE,OAAO,CAAC,GAAG,CAAC,OAAO,EAAE,GAAG,CAAC,CAAC,CAAC;aAC5D;YAED,aAAa,CAAC,WAAW,GAAG,CAAC,CAAC;YAE9B,IAAI,CAAC,eAAe,CAAC,GAAG,IAAI,0BAAW,CAAC,GAAG,EAAE,aAAa,CAAC,CAAC;YAE5D,KAAK,MAAM,SAAS,IAAI,6BAAmB,EAAE;gBAC3C,KAAK,MAAM,QAAQ,IAAI,MAAM,CAAC,SAAS,CAAC,SAAS,CAAC,EAAE;oBAClD,IAAI,CAAC,eAAe,CAAC,CAAC,EAAE,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC;iBAC/C;aACF;YAED,MAAM,CAAC,EAAE,CAAC,aAAa,EAAE,CAAC,SAAS,EAAE,QAAQ,EAAE,EAAE;gBAC/C,IAAI,CAAC,eAAe,CAAC,CAAC,EAAE,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC;YAChD,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,eAAe,GAAG,IAAI,CAAC;SAC7B;QACD,OAAO,IAAI,CAAC,eAAe,CAAC,CAAC;IAC/B,CAAC;IAED,qBAAqB,CAAC,QAAkB;QACtC,IAAI,IAAI,CAAC,eAAe,EAAE;YACxB,IAAI,CAAC,eAAe,GAAG,KAAK,CAAC;YAC7B,OAAO,IAAI,CAAC,eAAe,CAAC,CAAC,OAAO,CAAC,QAAQ,CAAC,CAAC;SAChD;QAED,OAAO,QAAQ,EAAE,CAAC;IACpB,CAAC;IAED,KAAK,CAAC,MAAmB,EAAE,KAAc,EAAE,QAAkB;QAC3D,IAAI,CAAC,aAAa,CAAC,QAAQ,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC,CAAC,EAAE;YACvC,IAAI,IAAI,CAAC,eAAe,CAAC,IAAI,MAAM,KAAK,IAAI,CAAC,eAAe,CAAC,EAAE;gBAC7D,OAAO,IAAI,CAAC,eAAe,CAAC,CAAC,KAAK,CAAC,KAAK,EAAE,QAAQ,CAAC,CAAC;aACrD;YACD,QAAQ,CAAC,CAAC,CAAC,CAAC;QACd,CAAC,CAAC,CAAC;IACL,CAAC;IAED,MAAM,CAAC,kBAAkB;QACvB,IAAI,uBAAuB,GAAG,SAAS,CAAC;QACxC,IAAI;YACF,wEAAwE;YACxE,uBAAuB,GAAG,OAAO,CAAC,2BAA2B,CAAC,CAAC;SAChE;QAAC,OAAO,GAAG,EAAE;YACZ,MAAM,IAAI,mCAA2B,CACnC,8DAA8D;gBAC5D,wEAAwE,CAC3E,CAAC;SACH;QAED,kBAAkB,GA
AG,uBAAuB,CAAC,SAAS,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC,CAAC,aAAa,CAAC;IAChG,CAAC;CACF;AAlGD,8BAkGC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/error.js b/node_modules/mongodb/lib/error.js new file mode 100644 index 000000000..94d55fd3e --- /dev/null +++ b/node_modules/mongodb/lib/error.js @@ -0,0 +1,712 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.isResumableError = exports.isNetworkTimeoutError = exports.isSDAMUnrecoverableError = exports.isNodeShuttingDownError = exports.isRetryableError = exports.isRetryableWriteError = exports.MongoWriteConcernError = exports.MongoServerSelectionError = exports.MongoSystemError = exports.MongoMissingDependencyError = exports.MongoMissingCredentialsError = exports.MongoCompatibilityError = exports.MongoInvalidArgumentError = exports.MongoParseError = exports.MongoNetworkTimeoutError = exports.MongoNetworkError = exports.isNetworkErrorBeforeHandshake = exports.MongoTopologyClosedError = exports.MongoCursorExhaustedError = exports.MongoServerClosedError = exports.MongoCursorInUseError = exports.MongoGridFSChunkError = exports.MongoGridFSStreamError = exports.MongoTailableCursorError = exports.MongoChangeStreamError = exports.MongoKerberosError = exports.MongoExpiredSessionError = exports.MongoTransactionError = exports.MongoNotConnectedError = exports.MongoDecompressionError = exports.MongoBatchReExecutionError = exports.MongoRuntimeError = exports.MongoAPIError = exports.MongoDriverError = exports.MongoServerError = exports.MongoError = exports.GET_MORE_RESUMABLE_CODES = exports.MONGODB_ERROR_CODES = void 0; +/** @internal */ +const kErrorLabels = Symbol('errorLabels'); +/** @internal MongoDB Error Codes */ +exports.MONGODB_ERROR_CODES = Object.freeze({ + HostUnreachable: 6, + HostNotFound: 7, + NetworkTimeout: 89, + ShutdownInProgress: 91, + PrimarySteppedDown: 189, + ExceededTimeLimit: 262, + SocketException: 9001, + NotMaster: 10107, + InterruptedAtShutdown: 11600, + InterruptedDueToReplStateChange: 11602, + NotMasterNoSlaveOk: 13435, + NotMasterOrSecondary: 13436, + StaleShardVersion: 63, + StaleEpoch: 150, + StaleConfig: 13388, + RetryChangeStream: 234, + FailedToSatisfyReadPreference: 133, + CursorNotFound: 43, + LegacyNotPrimary: 10058, + WriteConcernFailed: 64, + NamespaceNotFound: 26, + IllegalOperation: 20, + MaxTimeMSExpired: 50, + UnknownReplWriteConcern: 79, + UnsatisfiableWriteConcern: 100 +}); +// From spec@https://github.com/mongodb/specifications/blob/f93d78191f3db2898a59013a7ed5650352ef6da8/source/change-streams/change-streams.rst#resumable-error +exports.GET_MORE_RESUMABLE_CODES = new Set([ + exports.MONGODB_ERROR_CODES.HostUnreachable, + exports.MONGODB_ERROR_CODES.HostNotFound, + exports.MONGODB_ERROR_CODES.NetworkTimeout, + exports.MONGODB_ERROR_CODES.ShutdownInProgress, + exports.MONGODB_ERROR_CODES.PrimarySteppedDown, + exports.MONGODB_ERROR_CODES.ExceededTimeLimit, + exports.MONGODB_ERROR_CODES.SocketException, + exports.MONGODB_ERROR_CODES.NotMaster, + exports.MONGODB_ERROR_CODES.InterruptedAtShutdown, + exports.MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + exports.MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + exports.MONGODB_ERROR_CODES.NotMasterOrSecondary, + exports.MONGODB_ERROR_CODES.StaleShardVersion, + exports.MONGODB_ERROR_CODES.StaleEpoch, + exports.MONGODB_ERROR_CODES.StaleConfig, + exports.MONGODB_ERROR_CODES.RetryChangeStream, + exports.MONGODB_ERROR_CODES.FailedToSatisfyReadPreference, + exports.MONGODB_ERROR_CODES.CursorNotFound +]); +/** + * 
@public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error, it uses the constructor with a string argument + */ +class MongoError extends Error { + constructor(message) { + if (message instanceof Error) { + super(message.message); + } + else { + super(message); + } + } + get name() { + return 'MongoError'; + } + /** Legacy name for server error responses */ + get errmsg() { + return this.message; + } + /** + * Checks the error to see if it has an error label + * + * @param label - The error label to check for + * @returns returns true if the error has the provided error label + */ + hasErrorLabel(label) { + if (this[kErrorLabels] == null) { + return false; + } + return this[kErrorLabels].has(label); + } + addErrorLabel(label) { + if (this[kErrorLabels] == null) { + this[kErrorLabels] = new Set(); + } + this[kErrorLabels].add(label); + } + get errorLabels() { + return this[kErrorLabels] ? Array.from(this[kErrorLabels]) : []; + } +} +exports.MongoError = MongoError; +/** + * An error coming from the mongo server + * + * @public + * @category Error + */ +class MongoServerError extends MongoError { + constructor(message) { + super(message.message || message.errmsg || message.$err || 'n/a'); + if (message.errorLabels) { + this[kErrorLabels] = new Set(message.errorLabels); + } + for (const name in message) { + if (name !== 'errorLabels' && name !== 'errmsg' && name !== 'message') + this[name] = message[name]; + } + } + get name() { + return 'MongoServerError'; + } +} +exports.MongoServerError = MongoServerError; +/** + * An error generated by the driver + * + * @public + * @category Error + */ +class MongoDriverError extends MongoError { + constructor(message) { + super(message); + } + get name() { + return 'MongoDriverError'; + } +} +exports.MongoDriverError = MongoDriverError; +/** + * An error generated when the driver API is used incorrectly + * + * @privateRemarks + * Should **never** be directly instantiated + * + * @public + * @category Error + */ +class MongoAPIError extends MongoDriverError { + constructor(message) { + super(message); + } + get name() { + return 'MongoAPIError'; + } +} +exports.MongoAPIError = MongoAPIError; +/** + * An error generated when the driver encounters unexpected input + * or reaches an unexpected/invalid internal state + * + * @privateRemarks + * Should **never** be directly instantiated. + * + * @public + * @category Error + */ +class MongoRuntimeError extends MongoDriverError { + constructor(message) { + super(message); + } + get name() { + return 'MongoRuntimeError'; + } +} +exports.MongoRuntimeError = MongoRuntimeError; +/** + * An error generated when a batch command is reexecuted after one of the commands in the batch + * has failed + * + * @public + * @category Error + */ +class MongoBatchReExecutionError extends MongoAPIError { + constructor(message = 'This batch has already been executed, create new batch to execute') { + super(message); + } + get name() { + return 'MongoBatchReExecutionError'; + } +} +exports.MongoBatchReExecutionError = MongoBatchReExecutionError; +/** + * An error generated when the driver fails to decompress + * data received from the server. 
+ * + * @public + * @category Error + */ +class MongoDecompressionError extends MongoRuntimeError { + constructor(message) { + super(message); + } + get name() { + return 'MongoDecompressionError'; + } +} +exports.MongoDecompressionError = MongoDecompressionError; +/** + * An error thrown when the user attempts to operate on a database or collection through a MongoClient + * that has not yet successfully called the "connect" method + * + * @public + * @category Error + */ +class MongoNotConnectedError extends MongoAPIError { + constructor(message) { + super(message); + } + get name() { + return 'MongoNotConnectedError'; + } +} +exports.MongoNotConnectedError = MongoNotConnectedError; +/** + * An error generated when the user makes a mistake in the usage of transactions. + * (e.g. attempting to commit a transaction with a readPreference other than primary) + * + * @public + * @category Error + */ +class MongoTransactionError extends MongoAPIError { + constructor(message) { + super(message); + } + get name() { + return 'MongoTransactionError'; + } +} +exports.MongoTransactionError = MongoTransactionError; +/** + * An error generated when the user attempts to operate + * on a session that has expired or has been closed. + * + * @public + * @category Error + */ +class MongoExpiredSessionError extends MongoAPIError { + constructor(message = 'Cannot use a session that has ended') { + super(message); + } + get name() { + return 'MongoExpiredSessionError'; + } +} +exports.MongoExpiredSessionError = MongoExpiredSessionError; +/** + * A error generated when the user attempts to authenticate + * via Kerberos, but fails to connect to the Kerberos client. + * + * @public + * @category Error + */ +class MongoKerberosError extends MongoRuntimeError { + constructor(message) { + super(message); + } + get name() { + return 'MongoKerberosError'; + } +} +exports.MongoKerberosError = MongoKerberosError; +/** + * An error generated when a ChangeStream operation fails to execute. + * + * @public + * @category Error + */ +class MongoChangeStreamError extends MongoRuntimeError { + constructor(message) { + super(message); + } + get name() { + return 'MongoChangeStreamError'; + } +} +exports.MongoChangeStreamError = MongoChangeStreamError; +/** + * An error thrown when the user calls a function or method not supported on a tailable cursor + * + * @public + * @category Error + */ +class MongoTailableCursorError extends MongoAPIError { + constructor(message = 'Tailable cursor does not support this operation') { + super(message); + } + get name() { + return 'MongoTailableCursorError'; + } +} +exports.MongoTailableCursorError = MongoTailableCursorError; +/** An error generated when a GridFSStream operation fails to execute. + * + * @public + * @category Error + */ +class MongoGridFSStreamError extends MongoRuntimeError { + constructor(message) { + super(message); + } + get name() { + return 'MongoGridFSStreamError'; + } +} +exports.MongoGridFSStreamError = MongoGridFSStreamError; +/** + * An error generated when a malformed or invalid chunk is + * encountered when reading from a GridFSStream. 
+ * + * @public + * @category Error + */ +class MongoGridFSChunkError extends MongoRuntimeError { + constructor(message) { + super(message); + } + get name() { + return 'MongoGridFSChunkError'; + } +} +exports.MongoGridFSChunkError = MongoGridFSChunkError; +/** + * An error thrown when the user attempts to add options to a cursor that has already been + * initialized + * + * @public + * @category Error + */ +class MongoCursorInUseError extends MongoAPIError { + constructor(message = 'Cursor is already initialized') { + super(message); + } + get name() { + return 'MongoCursorInUseError'; + } +} +exports.MongoCursorInUseError = MongoCursorInUseError; +/** + * An error generated when an attempt is made to operate + * on a closed/closing server. + * + * @public + * @category Error + */ +class MongoServerClosedError extends MongoAPIError { + constructor(message = 'Server is closed') { + super(message); + } + get name() { + return 'MongoServerClosedError'; + } +} +exports.MongoServerClosedError = MongoServerClosedError; +/** + * An error thrown when an attempt is made to read from a cursor that has been exhausted + * + * @public + * @category Error + */ +class MongoCursorExhaustedError extends MongoAPIError { + constructor(message) { + super(message || 'Cursor is exhausted'); + } + get name() { + return 'MongoCursorExhaustedError'; + } +} +exports.MongoCursorExhaustedError = MongoCursorExhaustedError; +/** + * An error generated when an attempt is made to operate on a + * dropped, or otherwise unavailable, database. + * + * @public + * @category Error + */ +class MongoTopologyClosedError extends MongoAPIError { + constructor(message = 'Topology is closed') { + super(message); + } + get name() { + return 'MongoTopologyClosedError'; + } +} +exports.MongoTopologyClosedError = MongoTopologyClosedError; +/** @internal */ +const kBeforeHandshake = Symbol('beforeHandshake'); +function isNetworkErrorBeforeHandshake(err) { + return err[kBeforeHandshake] === true; +} +exports.isNetworkErrorBeforeHandshake = isNetworkErrorBeforeHandshake; +/** + * An error indicating an issue with the network, including TCP errors and timeouts. + * @public + * @category Error + */ +class MongoNetworkError extends MongoError { + constructor(message, options) { + super(message); + if (options && typeof options.beforeHandshake === 'boolean') { + this[kBeforeHandshake] = options.beforeHandshake; + } + } + get name() { + return 'MongoNetworkError'; + } +} +exports.MongoNetworkError = MongoNetworkError; +/** + * An error indicating a network timeout occurred + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error with an instanceof check + */ +class MongoNetworkTimeoutError extends MongoNetworkError { + constructor(message, options) { + super(message, options); + } + get name() { + return 'MongoNetworkTimeoutError'; + } +} +exports.MongoNetworkTimeoutError = MongoNetworkTimeoutError; +/** + * An error used when attempting to parse a value (like a connection string) + * @public + * @category Error + */ +class MongoParseError extends MongoDriverError { + constructor(message) { + super(message); + } + get name() { + return 'MongoParseError'; + } +} +exports.MongoParseError = MongoParseError; +/** + * An error generated when the user supplies malformed or unexpected arguments + * or when a required argument or field is not provided. 
+ * + * + * @public + * @category Error + */ +class MongoInvalidArgumentError extends MongoAPIError { + constructor(message) { + super(message); + } + get name() { + return 'MongoInvalidArgumentError'; + } +} +exports.MongoInvalidArgumentError = MongoInvalidArgumentError; +/** + * An error generated when a feature that is not enabled or allowed for the current server + * configuration is used + * + * + * @public + * @category Error + */ +class MongoCompatibilityError extends MongoAPIError { + constructor(message) { + super(message); + } + get name() { + return 'MongoCompatibilityError'; + } +} +exports.MongoCompatibilityError = MongoCompatibilityError; +/** + * An error generated when the user fails to provide authentication credentials before attempting + * to connect to a mongo server instance. + * + * + * @public + * @category Error + */ +class MongoMissingCredentialsError extends MongoAPIError { + constructor(message) { + super(message); + } + get name() { + return 'MongoMissingCredentialsError'; + } +} +exports.MongoMissingCredentialsError = MongoMissingCredentialsError; +/** + * An error generated when a required module or dependency is not present in the local environment + * + * @public + * @category Error + */ +class MongoMissingDependencyError extends MongoAPIError { + constructor(message) { + super(message); + } + get name() { + return 'MongoMissingDependencyError'; + } +} +exports.MongoMissingDependencyError = MongoMissingDependencyError; +/** + * An error signifying a general system issue + * @public + * @category Error + */ +class MongoSystemError extends MongoError { + constructor(message, reason) { + if (reason && reason.error) { + super(reason.error.message || reason.error); + } + else { + super(message); + } + if (reason) { + this.reason = reason; + } + } + get name() { + return 'MongoSystemError'; + } +} +exports.MongoSystemError = MongoSystemError; +/** + * An error signifying a client-side server selection error + * @public + * @category Error + */ +class MongoServerSelectionError extends MongoSystemError { + constructor(message, reason) { + super(message, reason); + } + get name() { + return 'MongoServerSelectionError'; + } +} +exports.MongoServerSelectionError = MongoServerSelectionError; +function makeWriteConcernResultObject(input) { + const output = Object.assign({}, input); + if (output.ok === 0) { + output.ok = 1; + delete output.errmsg; + delete output.code; + delete output.codeName; + } + return output; +} +/** + * An error thrown when the server reports a writeConcernError + * @public + * @category Error + */ +class MongoWriteConcernError extends MongoServerError { + constructor(message, result) { + if (result && Array.isArray(result.errorLabels)) { + message.errorLabels = result.errorLabels; + } + super(message); + this.errInfo = message.errInfo; + if (result != null) { + this.result = makeWriteConcernResultObject(result); + } + } + get name() { + return 'MongoWriteConcernError'; + } +} +exports.MongoWriteConcernError = MongoWriteConcernError; +// see: https://github.com/mongodb/specifications/blob/master/source/retryable-writes/retryable-writes.rst#terms +const RETRYABLE_ERROR_CODES = new Set([ + exports.MONGODB_ERROR_CODES.HostUnreachable, + exports.MONGODB_ERROR_CODES.HostNotFound, + exports.MONGODB_ERROR_CODES.NetworkTimeout, + exports.MONGODB_ERROR_CODES.ShutdownInProgress, + exports.MONGODB_ERROR_CODES.PrimarySteppedDown, + exports.MONGODB_ERROR_CODES.SocketException, + exports.MONGODB_ERROR_CODES.NotMaster, + 
exports.MONGODB_ERROR_CODES.InterruptedAtShutdown, + exports.MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + exports.MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + exports.MONGODB_ERROR_CODES.NotMasterOrSecondary +]); +const RETRYABLE_WRITE_ERROR_CODES = new Set([ + exports.MONGODB_ERROR_CODES.InterruptedAtShutdown, + exports.MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + exports.MONGODB_ERROR_CODES.NotMaster, + exports.MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + exports.MONGODB_ERROR_CODES.NotMasterOrSecondary, + exports.MONGODB_ERROR_CODES.PrimarySteppedDown, + exports.MONGODB_ERROR_CODES.ShutdownInProgress, + exports.MONGODB_ERROR_CODES.HostNotFound, + exports.MONGODB_ERROR_CODES.HostUnreachable, + exports.MONGODB_ERROR_CODES.NetworkTimeout, + exports.MONGODB_ERROR_CODES.SocketException, + exports.MONGODB_ERROR_CODES.ExceededTimeLimit +]); +function isRetryableWriteError(error) { + var _a, _b, _c; + if (error instanceof MongoWriteConcernError) { + return RETRYABLE_WRITE_ERROR_CODES.has((_c = (_b = (_a = error.result) === null || _a === void 0 ? void 0 : _a.code) !== null && _b !== void 0 ? _b : error.code) !== null && _c !== void 0 ? _c : 0); + } + return typeof error.code === 'number' && RETRYABLE_WRITE_ERROR_CODES.has(error.code); +} +exports.isRetryableWriteError = isRetryableWriteError; +/** Determines whether an error is something the driver should attempt to retry */ +function isRetryableError(error) { + return ( + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + (typeof error.code === 'number' && RETRYABLE_ERROR_CODES.has(error.code)) || + error instanceof MongoNetworkError || + !!error.message.match(/not master/) || + !!error.message.match(/node is recovering/)); +} +exports.isRetryableError = isRetryableError; +const SDAM_RECOVERING_CODES = new Set([ + exports.MONGODB_ERROR_CODES.ShutdownInProgress, + exports.MONGODB_ERROR_CODES.PrimarySteppedDown, + exports.MONGODB_ERROR_CODES.InterruptedAtShutdown, + exports.MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + exports.MONGODB_ERROR_CODES.NotMasterOrSecondary +]); +const SDAM_NOTMASTER_CODES = new Set([ + exports.MONGODB_ERROR_CODES.NotMaster, + exports.MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + exports.MONGODB_ERROR_CODES.LegacyNotPrimary +]); +const SDAM_NODE_SHUTTING_DOWN_ERROR_CODES = new Set([ + exports.MONGODB_ERROR_CODES.InterruptedAtShutdown, + exports.MONGODB_ERROR_CODES.ShutdownInProgress +]); +function isRecoveringError(err) { + if (typeof err.code === 'number') { + // If any error code exists, we ignore the error.message + return SDAM_RECOVERING_CODES.has(err.code); + } + return /not master or secondary/.test(err.message) || /node is recovering/.test(err.message); +} +function isNotMasterError(err) { + if (typeof err.code === 'number') { + // If any error code exists, we ignore the error.message + return SDAM_NOTMASTER_CODES.has(err.code); + } + if (isRecoveringError(err)) { + return false; + } + return /not master/.test(err.message); +} +function isNodeShuttingDownError(err) { + return !!(typeof err.code === 'number' && SDAM_NODE_SHUTTING_DOWN_ERROR_CODES.has(err.code)); +} +exports.isNodeShuttingDownError = isNodeShuttingDownError; +/** + * Determines whether SDAM can recover from a given error. If it cannot + * then the pool will be cleared, and server state will completely reset + * locally. 
+ * + * @see https://github.com/mongodb/specifications/blob/master/source/server-discovery-and-monitoring/server-discovery-and-monitoring.rst#not-master-and-node-is-recovering + */ +function isSDAMUnrecoverableError(error) { + // NOTE: null check is here for a strictly pre-CMAP world, a timeout or + // close event are considered unrecoverable + if (error instanceof MongoParseError || error == null) { + return true; + } + return isRecoveringError(error) || isNotMasterError(error); +} +exports.isSDAMUnrecoverableError = isSDAMUnrecoverableError; +function isNetworkTimeoutError(err) { + return !!(err instanceof MongoNetworkError && err.message.match(/timed out/)); +} +exports.isNetworkTimeoutError = isNetworkTimeoutError; +// From spec@https://github.com/mongodb/specifications/blob/7a2e93d85935ee4b1046a8d2ad3514c657dc74fa/source/change-streams/change-streams.rst#resumable-error: +// +// An error is considered resumable if it meets any of the following criteria: +// - any error encountered which is not a server error (e.g. a timeout error or network error) +// - any server error response from a getMore command excluding those containing the error label +// NonRetryableChangeStreamError and those containing the following error codes: +// - Interrupted: 11601 +// - CappedPositionLost: 136 +// - CursorKilled: 237 +// +// An error on an aggregate command is not a resumable error. Only errors on a getMore command may be considered resumable errors. +function isResumableError(error, wireVersion) { + if (error instanceof MongoNetworkError) { + return true; + } + if (wireVersion != null && wireVersion >= 9) { + // DRIVERS-1308: For 4.4 drivers running against 4.4 servers, drivers will add a special case to treat the CursorNotFound error code as resumable + if (error && error instanceof MongoError && error.code === 43) { + return true; + } + return error instanceof MongoError && error.hasErrorLabel('ResumableChangeStreamError'); + } + if (error && typeof error.code === 'number') { + return exports.GET_MORE_RESUMABLE_CODES.has(error.code); + } + return false; +} +exports.isResumableError = isResumableError; +//# sourceMappingURL=error.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/error.js.map b/node_modules/mongodb/lib/error.js.map new file mode 100644 index 000000000..477d8bc63 --- /dev/null +++ b/node_modules/mongodb/lib/error.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"error.js","sourceRoot":"","sources":["../src/error.ts"],"names":[],"mappings":";;;AAOA,gBAAgB;AAChB,MAAM,YAAY,GAAG,MAAM,CAAC,aAAa,CAAC,CAAC;AAE3C,oCAAoC;AACvB,QAAA,mBAAmB,GAAG,MAAM,CAAC,MAAM,CAAC;IAC/C,eAAe,EAAE,CAAC;IAClB,YAAY,EAAE,CAAC;IACf,cAAc,EAAE,EAAE;IAClB,kBAAkB,EAAE,EAAE;IACtB,kBAAkB,EAAE,GAAG;IACvB,iBAAiB,EAAE,GAAG;IACtB,eAAe,EAAE,IAAI;IACrB,SAAS,EAAE,KAAK;IAChB,qBAAqB,EAAE,KAAK;IAC5B,+BAA+B,EAAE,KAAK;IACtC,kBAAkB,EAAE,KAAK;IACzB,oBAAoB,EAAE,KAAK;IAC3B,iBAAiB,EAAE,EAAE;IACrB,UAAU,EAAE,GAAG;IACf,WAAW,EAAE,KAAK;IAClB,iBAAiB,EAAE,GAAG;IACtB,6BAA6B,EAAE,GAAG;IAClC,cAAc,EAAE,EAAE;IAClB,gBAAgB,EAAE,KAAK;IACvB,kBAAkB,EAAE,EAAE;IACtB,iBAAiB,EAAE,EAAE;IACrB,gBAAgB,EAAE,EAAE;IACpB,gBAAgB,EAAE,EAAE;IACpB,uBAAuB,EAAE,EAAE;IAC3B,yBAAyB,EAAE,GAAG;CACtB,CAAC,CAAC;AAEZ,6JAA6J;AAChJ,QAAA,wBAAwB,GAAG,IAAI,GAAG,CAAS;IACtD,2BAAmB,CAAC,eAAe;IACnC,2BAAmB,CAAC,YAAY;IAChC,2BAAmB,CAAC,cAAc;IAClC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,iBAAiB;IACrC,2BAAmB,CAAC,eAAe;IACnC,2BAAmB,CAAC,SAAS;IAC7B,2BAAmB,CAAC,qBAAqB;IACzC,2BAAmB,CAAC,+BAA+B;IACnD,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,oBAAoB;IACxC,2BAAmB,CAAC,iBAAiB;IACrC,2BAAmB,CAAC,UAAU;IAC9B,2BAAmB,CAAC,WAAW;IAC/B,2BAAmB,CAAC,iBAAiB;IACrC,2BAAmB,CAAC,6BAA6B;IACjD,2BAAmB,CAAC,cAAc;CACnC,CAAC,CAAC;AAWH;;;;;;GAMG;AACH,MAAa,UAAW,SAAQ,KAAK;IAWnC,YAAY,OAAuB;QACjC,IAAI,OAAO,YAAY,KAAK,EAAE;YAC5B,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SACxB;aAAM;YACL,KAAK,CAAC,OAAO,CAAC,CAAC;SAChB;IACH,CAAC;IAED,IAAI,IAAI;QACN,OAAO,YAAY,CAAC;IACtB,CAAC;IAED,6CAA6C;IAC7C,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,OAAO,CAAC;IACtB,CAAC;IAED;;;;;OAKG;IACH,aAAa,CAAC,KAAa;QACzB,IAAI,IAAI,CAAC,YAAY,CAAC,IAAI,IAAI,EAAE;YAC9B,OAAO,KAAK,CAAC;SACd;QAED,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC;IACvC,CAAC;IAED,aAAa,CAAC,KAAa;QACzB,IAAI,IAAI,CAAC,YAAY,CAAC,IAAI,IAAI,EAAE;YAC9B,IAAI,CAAC,YAAY,CAAC,GAAG,IAAI,GAAG,EAAE,CAAC;SAChC;QAED,IAAI,CAAC,YAAY,CAAC,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC;IAChC,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;IAClE,CAAC;CACF;AArDD,gCAqDC;AAED;;;;;GAKG;AACH,MAAa,gBAAiB,SAAQ,UAAU;IAO9C,YAAY,OAAyB;QACnC,KAAK,CAAC,OAAO,CAAC,OAAO,IAAI,OAAO,CAAC,MAAM,IAAI,OAAO,CAAC,IAAI,IAAI,KAAK,CAAC,CAAC;QAClE,IAAI,OAAO,CAAC,WAAW,EAAE;YACvB,IAAI,CAAC,YAAY,CAAC,GAAG,IAAI,GAAG,CAAC,OAAO,CAAC,WAAW,CAAC,CAAC;SACnD;QAED,KAAK,MAAM,IAAI,IAAI,OAAO,EAAE;YAC1B,IAAI,IAAI,KAAK,aAAa,IAAI,IAAI,KAAK,QAAQ,IAAI,IAAI,KAAK,SAAS;gBACnE,IAAI,CAAC,IAAI,CAAC,GAAG,OAAO,CAAC,IAAI,CAAC,CAAC;SAC9B;IACH,CAAC;IAED,IAAI,IAAI;QACN,OAAO,kBAAkB,CAAC;IAC5B,CAAC;CACF;AAtBD,4CAsBC;AAED;;;;;GAKG;AACH,MAAa,gBAAiB,SAAQ,UAAU;IAC9C,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,kBAAkB,CAAC;IAC5B,CAAC;CACF;AARD,4CAQC;AAED;;;;;;;;GAQG;AAEH,MAAa,aAAc,SAAQ,gBAAgB;IACjD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,eAAe,CAAC;IACzB,CAAC;CACF;AARD,sCAQC;AAED;;;;;;;;;GASG;AACH,MAAa,iBAAkB,SAAQ,gBAAgB;IACrD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,mBAAmB,CAAC;IAC7B,CAAC;CACF;AARD,8CAQC;AAED;;;;;;GAMG;AACH,MAAa,0BAA2B,SAAQ,aAAa;IAC3D,YAAY,OAAO,GAAG,mEAAmE;QACvF,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,4BAA4B,CAAC;IACtC,CAAC;CACF;AARD,gEAQC;AAED;;;;;;GAMG;AACH,MAAa,uBAAwB,SAAQ,iBAAiB;IAC5D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,yBAAyB,CAAC;IACnC,CAAC;CACF;AARD,0DAQC;AAED;;;;;;GAMG;AACH,MAAa,sBAAuB,SAAQ,aAAa;IACvD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAE
D,IAAI,IAAI;QACN,OAAO,wBAAwB,CAAC;IAClC,CAAC;CACF;AARD,wDAQC;AAED;;;;;;GAMG;AACH,MAAa,qBAAsB,SAAQ,aAAa;IACtD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,uBAAuB,CAAC;IACjC,CAAC;CACF;AARD,sDAQC;AAED;;;;;;GAMG;AACH,MAAa,wBAAyB,SAAQ,aAAa;IACzD,YAAY,OAAO,GAAG,qCAAqC;QACzD,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,0BAA0B,CAAC;IACpC,CAAC;CACF;AARD,4DAQC;AAED;;;;;;GAMG;AACH,MAAa,kBAAmB,SAAQ,iBAAiB;IACvD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,oBAAoB,CAAC;IAC9B,CAAC;CACF;AARD,gDAQC;AAED;;;;;GAKG;AACH,MAAa,sBAAuB,SAAQ,iBAAiB;IAC3D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,wBAAwB,CAAC;IAClC,CAAC;CACF;AARD,wDAQC;AAED;;;;;GAKG;AACH,MAAa,wBAAyB,SAAQ,aAAa;IACzD,YAAY,OAAO,GAAG,iDAAiD;QACrE,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,0BAA0B,CAAC;IACpC,CAAC;CACF;AARD,4DAQC;AAED;;;;GAIG;AACH,MAAa,sBAAuB,SAAQ,iBAAiB;IAC3D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,wBAAwB,CAAC;IAClC,CAAC;CACF;AARD,wDAQC;AAED;;;;;;GAMG;AACH,MAAa,qBAAsB,SAAQ,iBAAiB;IAC1D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,uBAAuB,CAAC;IACjC,CAAC;CACF;AARD,sDAQC;AAED;;;;;;GAMG;AACH,MAAa,qBAAsB,SAAQ,aAAa;IACtD,YAAY,OAAO,GAAG,+BAA+B;QACnD,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,uBAAuB,CAAC;IACjC,CAAC;CACF;AARD,sDAQC;AAED;;;;;;GAMG;AACH,MAAa,sBAAuB,SAAQ,aAAa;IACvD,YAAY,OAAO,GAAG,kBAAkB;QACtC,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,wBAAwB,CAAC;IAClC,CAAC;CACF;AARD,wDAQC;AAED;;;;;GAKG;AACH,MAAa,yBAA0B,SAAQ,aAAa;IAC1D,YAAY,OAAgB;QAC1B,KAAK,CAAC,OAAO,IAAI,qBAAqB,CAAC,CAAC;IAC1C,CAAC;IAED,IAAI,IAAI;QACN,OAAO,2BAA2B,CAAC;IACrC,CAAC;CACF;AARD,8DAQC;AAED;;;;;;GAMG;AACH,MAAa,wBAAyB,SAAQ,aAAa;IACzD,YAAY,OAAO,GAAG,oBAAoB;QACxC,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,0BAA0B,CAAC;IACpC,CAAC;CACF;AARD,4DAQC;AAED,gBAAgB;AAChB,MAAM,gBAAgB,GAAG,MAAM,CAAC,iBAAiB,CAAC,CAAC;AACnD,SAAgB,6BAA6B,CAAC,GAAsB;IAClE,OAAO,GAAG,CAAC,gBAAgB,CAAC,KAAK,IAAI,CAAC;AACxC,CAAC;AAFD,sEAEC;AAQD;;;;GAIG;AACH,MAAa,iBAAkB,SAAQ,UAAU;IAI/C,YAAY,OAAuB,EAAE,OAAkC;QACrE,KAAK,CAAC,OAAO,CAAC,CAAC;QAEf,IAAI,OAAO,IAAI,OAAO,OAAO,CAAC,eAAe,KAAK,SAAS,EAAE;YAC3D,IAAI,CAAC,gBAAgB,CAAC,GAAG,OAAO,CAAC,eAAe,CAAC;SAClD;IACH,CAAC;IAED,IAAI,IAAI;QACN,OAAO,mBAAmB,CAAC;IAC7B,CAAC;CACF;AAfD,8CAeC;AAED;;;;;;;GAOG;AACH,MAAa,wBAAyB,SAAQ,iBAAiB;IAC7D,YAAY,OAAe,EAAE,OAAkC;QAC7D,KAAK,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;IAC1B,CAAC;IAED,IAAI,IAAI;QACN,OAAO,0BAA0B,CAAC;IACpC,CAAC;CACF;AARD,4DAQC;AAED;;;;GAIG;AACH,MAAa,eAAgB,SAAQ,gBAAgB;IACnD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,iBAAiB,CAAC;IAC3B,CAAC;CACF;AARD,0CAQC;AAED;;;;;;;GAOG;AACH,MAAa,yBAA0B,SAAQ,aAAa;IAC1D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,2BAA2B,CAAC;IACrC,CAAC;CACF;AARD,8DAQC;AAED;;;;;;;GAOG;AACH,MAAa,uBAAwB,SAAQ,aAAa;IACxD,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,yBAAyB,CAAC;IACnC,CAAC;CACF;AARD,0DAQC;AAED;;;;;;;GAOG;AACH,MAAa,4BAA6B,SAAQ,aAAa;IAC7D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,8BAA8B,CAAC;IACxC,CAAC;CACF;AARD,oEAQC;AAED;;;;;GAKG;AACH,MAAa,2BAA4B,SAAQ,aAAa;IAC5D,YAAY,OAAe;QACzB,KAAK,CAAC,OAAO,CAAC,CAAC;IACjB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,6BAA6B,CAAC;IACvC,CAAC;CACF;AARD,kEAQC;AACD;;;;GAIG;AACH,MAAa,gBAAiB,SAAQ,UAAU;IAI9C,YAAY,OAAe,EAAE,MAA2B;QACtD,IAAI,MAAM,IAAI,MAAM,CAAC,KAAK,EAAE;YAC1B,KAAK,CAAC,MAAM,CAAC,KAAK,CAAC,OAAO,IAAI,MA
AM,CAAC,KAAK,CAAC,CAAC;SAC7C;aAAM;YACL,KAAK,CAAC,OAAO,CAAC,CAAC;SAChB;QAED,IAAI,MAAM,EAAE;YACV,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;SACtB;IACH,CAAC;IAED,IAAI,IAAI;QACN,OAAO,kBAAkB,CAAC;IAC5B,CAAC;CACF;AAnBD,4CAmBC;AAED;;;;GAIG;AACH,MAAa,yBAA0B,SAAQ,gBAAgB;IAC7D,YAAY,OAAe,EAAE,MAA2B;QACtD,KAAK,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;IACzB,CAAC;IAED,IAAI,IAAI;QACN,OAAO,2BAA2B,CAAC;IACrC,CAAC;CACF;AARD,8DAQC;AAED,SAAS,4BAA4B,CAAC,KAAU;IAC9C,MAAM,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,KAAK,CAAC,CAAC;IAExC,IAAI,MAAM,CAAC,EAAE,KAAK,CAAC,EAAE;QACnB,MAAM,CAAC,EAAE,GAAG,CAAC,CAAC;QACd,OAAO,MAAM,CAAC,MAAM,CAAC;QACrB,OAAO,MAAM,CAAC,IAAI,CAAC;QACnB,OAAO,MAAM,CAAC,QAAQ,CAAC;KACxB;IAED,OAAO,MAAM,CAAC;AAChB,CAAC;AAED;;;;GAIG;AACH,MAAa,sBAAuB,SAAQ,gBAAgB;IAK1D,YAAY,OAAyB,EAAE,MAAiB;QACtD,IAAI,MAAM,IAAI,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,WAAW,CAAC,EAAE;YAC/C,OAAO,CAAC,WAAW,GAAG,MAAM,CAAC,WAAW,CAAC;SAC1C;QAED,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;QAE/B,IAAI,MAAM,IAAI,IAAI,EAAE;YAClB,IAAI,CAAC,MAAM,GAAG,4BAA4B,CAAC,MAAM,CAAC,CAAC;SACpD;IACH,CAAC;IAED,IAAI,IAAI;QACN,OAAO,wBAAwB,CAAC;IAClC,CAAC;CACF;AArBD,wDAqBC;AAED,gHAAgH;AAChH,MAAM,qBAAqB,GAAG,IAAI,GAAG,CAAS;IAC5C,2BAAmB,CAAC,eAAe;IACnC,2BAAmB,CAAC,YAAY;IAChC,2BAAmB,CAAC,cAAc;IAClC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,eAAe;IACnC,2BAAmB,CAAC,SAAS;IAC7B,2BAAmB,CAAC,qBAAqB;IACzC,2BAAmB,CAAC,+BAA+B;IACnD,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,oBAAoB;CACzC,CAAC,CAAC;AAEH,MAAM,2BAA2B,GAAG,IAAI,GAAG,CAAS;IAClD,2BAAmB,CAAC,qBAAqB;IACzC,2BAAmB,CAAC,+BAA+B;IACnD,2BAAmB,CAAC,SAAS;IAC7B,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,oBAAoB;IACxC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,YAAY;IAChC,2BAAmB,CAAC,eAAe;IACnC,2BAAmB,CAAC,cAAc;IAClC,2BAAmB,CAAC,eAAe;IACnC,2BAAmB,CAAC,iBAAiB;CACtC,CAAC,CAAC;AAEH,SAAgB,qBAAqB,CAAC,KAAiB;;IACrD,IAAI,KAAK,YAAY,sBAAsB,EAAE;QAC3C,OAAO,2BAA2B,CAAC,GAAG,CAAC,MAAA,MAAA,MAAA,KAAK,CAAC,MAAM,0CAAE,IAAI,mCAAI,KAAK,CAAC,IAAI,mCAAI,CAAC,CAAC,CAAC;KAC/E;IACD,OAAO,OAAO,KAAK,CAAC,IAAI,KAAK,QAAQ,IAAI,2BAA2B,CAAC,GAAG,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC;AACvF,CAAC;AALD,sDAKC;AAED,kFAAkF;AAClF,SAAgB,gBAAgB,CAAC,KAAiB;IAChD,OAAO;IACL,oEAAoE;IACpE,CAAC,OAAO,KAAK,CAAC,IAAI,KAAK,QAAQ,IAAI,qBAAqB,CAAC,GAAG,CAAC,KAAK,CAAC,IAAK,CAAC,CAAC;QAC1E,KAAK,YAAY,iBAAiB;QAClC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,YAAY,CAAC;QACnC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,oBAAoB,CAAC,CAC5C,CAAC;AACJ,CAAC;AARD,4CAQC;AAED,MAAM,qBAAqB,GAAG,IAAI,GAAG,CAAS;IAC5C,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,qBAAqB;IACzC,2BAAmB,CAAC,+BAA+B;IACnD,2BAAmB,CAAC,oBAAoB;CACzC,CAAC,CAAC;AAEH,MAAM,oBAAoB,GAAG,IAAI,GAAG,CAAS;IAC3C,2BAAmB,CAAC,SAAS;IAC7B,2BAAmB,CAAC,kBAAkB;IACtC,2BAAmB,CAAC,gBAAgB;CACrC,CAAC,CAAC;AAEH,MAAM,mCAAmC,GAAG,IAAI,GAAG,CAAS;IAC1D,2BAAmB,CAAC,qBAAqB;IACzC,2BAAmB,CAAC,kBAAkB;CACvC,CAAC,CAAC;AAEH,SAAS,iBAAiB,CAAC,GAAe;IACxC,IAAI,OAAO,GAAG,CAAC,IAAI,KAAK,QAAQ,EAAE;QAChC,wDAAwD;QACxD,OAAO,qBAAqB,CAAC,GAAG,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC;KAC5C;IAED,OAAO,yBAAyB,CAAC,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC,IAAI,oBAAoB,CAAC,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;AAC/F,CAAC;AAED,SAAS,gBAAgB,CAAC,GAAe;IACvC,IAAI,OAAO,GAAG,CAAC,IAAI,KAAK,QAAQ,EAAE;QAChC,wDAAwD;QACxD,OAAO,oBAAoB,CAAC,GAAG,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC;KAC3C;IAED,IAAI,iBAAiB,CAAC,GAAG,CAAC,EAAE;QAC1B,OAAO,KAAK,CAAC;KACd;IAED,OAAO,YAAY,CAAC,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;AACxC,CAAC;AAED,SAAgB,uBAAuB,CAAC,GAAe;IACrD,OAAO,CAAC,CAAC,CAAC,OAAO,GAAG,CAAC,IAAI,KAAK,QAAQ,IAAI,mCAAmC,CAAC,GAAG,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC;AAC/F,CAAC;AAFD,0DAEC;AAED;;;;;;GAMG;AACH,S
AAgB,wBAAwB,CAAC,KAAiB;IACxD,uEAAuE;IACvE,iDAAiD;IACjD,IAAI,KAAK,YAAY,eAAe,IAAI,KAAK,IAAI,IAAI,EAAE;QACrD,OAAO,IAAI,CAAC;KACb;IAED,OAAO,iBAAiB,CAAC,KAAK,CAAC,IAAI,gBAAgB,CAAC,KAAK,CAAC,CAAC;AAC7D,CAAC;AARD,4DAQC;AAED,SAAgB,qBAAqB,CAAC,GAAe;IACnD,OAAO,CAAC,CAAC,CAAC,GAAG,YAAY,iBAAiB,IAAI,GAAG,CAAC,OAAO,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC,CAAC;AAChF,CAAC;AAFD,sDAEC;AAED,8JAA8J;AAC9J,EAAE;AACF,8EAA8E;AAC9E,8FAA8F;AAC9F,gGAAgG;AAChG,kFAAkF;AAClF,yBAAyB;AACzB,8BAA8B;AAC9B,wBAAwB;AACxB,EAAE;AACF,kIAAkI;AAElI,SAAgB,gBAAgB,CAAC,KAAkB,EAAE,WAAoB;IACvE,IAAI,KAAK,YAAY,iBAAiB,EAAE;QACtC,OAAO,IAAI,CAAC;KACb;IAED,IAAI,WAAW,IAAI,IAAI,IAAI,WAAW,IAAI,CAAC,EAAE;QAC3C,iJAAiJ;QACjJ,IAAI,KAAK,IAAI,KAAK,YAAY,UAAU,IAAI,KAAK,CAAC,IAAI,KAAK,EAAE,EAAE;YAC7D,OAAO,IAAI,CAAC;SACb;QACD,OAAO,KAAK,YAAY,UAAU,IAAI,KAAK,CAAC,aAAa,CAAC,4BAA4B,CAAC,CAAC;KACzF;IAED,IAAI,KAAK,IAAI,OAAO,KAAK,CAAC,IAAI,KAAK,QAAQ,EAAE;QAC3C,OAAO,gCAAwB,CAAC,GAAG,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC;KACjD;IACD,OAAO,KAAK,CAAC;AACf,CAAC;AAjBD,4CAiBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/explain.js b/node_modules/mongodb/lib/explain.js new file mode 100644 index 000000000..bc55e1fee --- /dev/null +++ b/node_modules/mongodb/lib/explain.js @@ -0,0 +1,35 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Explain = exports.ExplainVerbosity = void 0; +const error_1 = require("./error"); +/** @public */ +exports.ExplainVerbosity = Object.freeze({ + queryPlanner: 'queryPlanner', + queryPlannerExtended: 'queryPlannerExtended', + executionStats: 'executionStats', + allPlansExecution: 'allPlansExecution' +}); +/** @internal */ +class Explain { + constructor(verbosity) { + if (typeof verbosity === 'boolean') { + this.verbosity = verbosity + ? exports.ExplainVerbosity.allPlansExecution + : exports.ExplainVerbosity.queryPlanner; + } + else { + this.verbosity = verbosity; + } + } + static fromOptions(options) { + if ((options === null || options === void 0 ? 
void 0 : options.explain) == null) + return; + const explain = options.explain; + if (typeof explain === 'boolean' || typeof explain === 'string') { + return new Explain(explain); + } + throw new error_1.MongoInvalidArgumentError('Field "explain" must be a string or a boolean'); + } +} +exports.Explain = Explain; +//# sourceMappingURL=explain.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/explain.js.map b/node_modules/mongodb/lib/explain.js.map new file mode 100644 index 000000000..ae503b22b --- /dev/null +++ b/node_modules/mongodb/lib/explain.js.map @@ -0,0 +1 @@ +{"version":3,"file":"explain.js","sourceRoot":"","sources":["../src/explain.ts"],"names":[],"mappings":";;;AAAA,mCAAoD;AAEpD,cAAc;AACD,QAAA,gBAAgB,GAAG,MAAM,CAAC,MAAM,CAAC;IAC5C,YAAY,EAAE,cAAc;IAC5B,oBAAoB,EAAE,sBAAsB;IAC5C,cAAc,EAAE,gBAAgB;IAChC,iBAAiB,EAAE,mBAAmB;CAC9B,CAAC,CAAC;AAmBZ,gBAAgB;AAChB,MAAa,OAAO;IAGlB,YAAY,SAA+B;QACzC,IAAI,OAAO,SAAS,KAAK,SAAS,EAAE;YAClC,IAAI,CAAC,SAAS,GAAG,SAAS;gBACxB,CAAC,CAAC,wBAAgB,CAAC,iBAAiB;gBACpC,CAAC,CAAC,wBAAgB,CAAC,YAAY,CAAC;SACnC;aAAM;YACL,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;SAC5B;IACH,CAAC;IAED,MAAM,CAAC,WAAW,CAAC,OAAwB;QACzC,IAAI,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,OAAO,KAAI,IAAI;YAAE,OAAO;QAErC,MAAM,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;QAChC,IAAI,OAAO,OAAO,KAAK,SAAS,IAAI,OAAO,OAAO,KAAK,QAAQ,EAAE;YAC/D,OAAO,IAAI,OAAO,CAAC,OAAO,CAAC,CAAC;SAC7B;QAED,MAAM,IAAI,iCAAyB,CAAC,+CAA+C,CAAC,CAAC;IACvF,CAAC;CACF;AAvBD,0BAuBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/gridfs/download.js b/node_modules/mongodb/lib/gridfs/download.js new file mode 100644 index 000000000..1e9687bc7 --- /dev/null +++ b/node_modules/mongodb/lib/gridfs/download.js @@ -0,0 +1,315 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.GridFSBucketReadStream = void 0; +const stream_1 = require("stream"); +const error_1 = require("../error"); +/** + * A readable stream that enables you to read buffers from GridFS. + * + * Do not instantiate this class directly. Use `openDownloadStream()` instead. + * @public + */ +class GridFSBucketReadStream extends stream_1.Readable { + /** @internal + * @param chunks - Handle for chunks collection + * @param files - Handle for files collection + * @param readPreference - The read preference to use + * @param filter - The filter to use to find the file document + */ + constructor(chunks, files, readPreference, filter, options) { + super(); + this.s = { + bytesToTrim: 0, + bytesToSkip: 0, + bytesRead: 0, + chunks, + expected: 0, + files, + filter, + init: false, + expectedEnd: 0, + options: { + start: 0, + end: 0, + ...options + }, + readPreference + }; + } + /** + * Reads from the cursor and pushes to the stream. + * Private Impl, do not call directly + * @internal + */ + _read() { + if (this.destroyed) + return; + waitForFile(this, () => doRead(this)); + } + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param start - 0-based offset in bytes to start streaming from + */ + start(start = 0) { + throwIfInitialized(this); + this.s.options.start = start; + return this; + } + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. 
if you've already called `on('data')`) + * + * @param end - Offset in bytes to stop reading at + */ + end(end = 0) { + throwIfInitialized(this); + this.s.options.end = end; + return this; + } + /** + * Marks this stream as aborted (will never push another `data` event) + * and kills the underlying cursor. Will emit the 'end' event, and then + * the 'close' event once the cursor is successfully killed. + * + * @param callback - called when the cursor is successfully closed or an error occurred. + */ + abort(callback) { + this.push(null); + this.destroyed = true; + if (this.s.cursor) { + this.s.cursor.close(error => { + this.emit(GridFSBucketReadStream.CLOSE); + callback && callback(error); + }); + } + else { + if (!this.s.init) { + // If not initialized, fire close event because we will never + // get a cursor + this.emit(GridFSBucketReadStream.CLOSE); + } + callback && callback(); + } + } +} +exports.GridFSBucketReadStream = GridFSBucketReadStream; +/** + * An error occurred + * @event + */ +GridFSBucketReadStream.ERROR = 'error'; +/** + * Fires when the stream loaded the file document corresponding to the provided id. + * @event + */ +GridFSBucketReadStream.FILE = 'file'; +/** + * Emitted when a chunk of data is available to be consumed. + * @event + */ +GridFSBucketReadStream.DATA = 'data'; +/** + * Fired when the stream is exhausted (no more data events). + * @event + */ +GridFSBucketReadStream.END = 'end'; +/** + * Fired when the stream is exhausted and the underlying cursor is killed + * @event + */ +GridFSBucketReadStream.CLOSE = 'close'; +function throwIfInitialized(stream) { + if (stream.s.init) { + throw new error_1.MongoGridFSStreamError('Options cannot be changed after the stream is initialized'); + } +} +function doRead(stream) { + if (stream.destroyed) + return; + if (!stream.s.cursor) + return; + if (!stream.s.file) + return; + stream.s.cursor.next((error, doc) => { + if (stream.destroyed) { + return; + } + if (error) { + stream.emit(GridFSBucketReadStream.ERROR, error); + return; + } + if (!doc) { + stream.push(null); + process.nextTick(() => { + if (!stream.s.cursor) + return; + stream.s.cursor.close(error => { + if (error) { + stream.emit(GridFSBucketReadStream.ERROR, error); + return; + } + stream.emit(GridFSBucketReadStream.CLOSE); + }); + }); + return; + } + if (!stream.s.file) + return; + const bytesRemaining = stream.s.file.length - stream.s.bytesRead; + const expectedN = stream.s.expected++; + const expectedLength = Math.min(stream.s.file.chunkSize, bytesRemaining); + if (doc.n > expectedN) { + return stream.emit(GridFSBucketReadStream.ERROR, new error_1.MongoGridFSChunkError(`ChunkIsMissing: Got unexpected n: ${doc.n}, expected: ${expectedN}`)); + } + if (doc.n < expectedN) { + return stream.emit(GridFSBucketReadStream.ERROR, new error_1.MongoGridFSChunkError(`ExtraChunk: Got unexpected n: ${doc.n}, expected: ${expectedN}`)); + } + let buf = Buffer.isBuffer(doc.data) ? 
doc.data : doc.data.buffer; + if (buf.byteLength !== expectedLength) { + if (bytesRemaining <= 0) { + return stream.emit(GridFSBucketReadStream.ERROR, new error_1.MongoGridFSChunkError(`ExtraChunk: Got unexpected n: ${doc.n}`)); + } + return stream.emit(GridFSBucketReadStream.ERROR, new error_1.MongoGridFSChunkError(`ChunkIsWrongSize: Got unexpected length: ${buf.byteLength}, expected: ${expectedLength}`)); + } + stream.s.bytesRead += buf.byteLength; + if (buf.byteLength === 0) { + return stream.push(null); + } + let sliceStart = null; + let sliceEnd = null; + if (stream.s.bytesToSkip != null) { + sliceStart = stream.s.bytesToSkip; + stream.s.bytesToSkip = 0; + } + const atEndOfStream = expectedN === stream.s.expectedEnd - 1; + const bytesLeftToRead = stream.s.options.end - stream.s.bytesToSkip; + if (atEndOfStream && stream.s.bytesToTrim != null) { + sliceEnd = stream.s.file.chunkSize - stream.s.bytesToTrim; + } + else if (stream.s.options.end && bytesLeftToRead < doc.data.byteLength) { + sliceEnd = bytesLeftToRead; + } + if (sliceStart != null || sliceEnd != null) { + buf = buf.slice(sliceStart || 0, sliceEnd || buf.byteLength); + } + stream.push(buf); + }); +} +function init(stream) { + const findOneOptions = {}; + if (stream.s.readPreference) { + findOneOptions.readPreference = stream.s.readPreference; + } + if (stream.s.options && stream.s.options.sort) { + findOneOptions.sort = stream.s.options.sort; + } + if (stream.s.options && stream.s.options.skip) { + findOneOptions.skip = stream.s.options.skip; + } + stream.s.files.findOne(stream.s.filter, findOneOptions, (error, doc) => { + if (error) { + return stream.emit(GridFSBucketReadStream.ERROR, error); + } + if (!doc) { + const identifier = stream.s.filter._id + ? stream.s.filter._id.toString() + : stream.s.filter.filename; + const errmsg = `FileNotFound: file ${identifier} was not found`; + // TODO(NODE-3483) + const err = new error_1.MongoRuntimeError(errmsg); + err.code = 'ENOENT'; // TODO: NODE-3338 set property as part of constructor + return stream.emit(GridFSBucketReadStream.ERROR, err); + } + // If document is empty, kill the stream immediately and don't + // execute any reads + if (doc.length <= 0) { + stream.push(null); + return; + } + if (stream.destroyed) { + // If user destroys the stream before we have a cursor, wait + // until the query is done to say we're 'closed' because we can't + // cancel a query. + stream.emit(GridFSBucketReadStream.CLOSE); + return; + } + try { + stream.s.bytesToSkip = handleStartOption(stream, doc, stream.s.options); + } + catch (error) { + return stream.emit(GridFSBucketReadStream.ERROR, error); + } + const filter = { files_id: doc._id }; + // Currently (MongoDB 3.4.4) skip function does not support the index, + // it needs to retrieve all the documents first and then skip them. (CS-25811) + // As work around we use $gte on the "n" field. 
+ if (stream.s.options && stream.s.options.start != null) { + const skip = Math.floor(stream.s.options.start / doc.chunkSize); + if (skip > 0) { + filter['n'] = { $gte: skip }; + } + } + stream.s.cursor = stream.s.chunks.find(filter).sort({ n: 1 }); + if (stream.s.readPreference) { + stream.s.cursor.withReadPreference(stream.s.readPreference); + } + stream.s.expectedEnd = Math.ceil(doc.length / doc.chunkSize); + stream.s.file = doc; + try { + stream.s.bytesToTrim = handleEndOption(stream, doc, stream.s.cursor, stream.s.options); + } + catch (error) { + return stream.emit(GridFSBucketReadStream.ERROR, error); + } + stream.emit(GridFSBucketReadStream.FILE, doc); + }); +} +function waitForFile(stream, callback) { + if (stream.s.file) { + return callback(); + } + if (!stream.s.init) { + init(stream); + stream.s.init = true; + } + stream.once('file', () => { + callback(); + }); +} +function handleStartOption(stream, doc, options) { + if (options && options.start != null) { + if (options.start > doc.length) { + throw new error_1.MongoInvalidArgumentError(`Stream start (${options.start}) must not be more than the length of the file (${doc.length})`); + } + if (options.start < 0) { + throw new error_1.MongoInvalidArgumentError(`Stream start (${options.start}) must not be negative`); + } + if (options.end != null && options.end < options.start) { + throw new error_1.MongoInvalidArgumentError(`Stream start (${options.start}) must not be greater than stream end (${options.end})`); + } + stream.s.bytesRead = Math.floor(options.start / doc.chunkSize) * doc.chunkSize; + stream.s.expected = Math.floor(options.start / doc.chunkSize); + return options.start - stream.s.bytesRead; + } + throw new error_1.MongoInvalidArgumentError('Start option must be defined'); +} +function handleEndOption(stream, doc, cursor, options) { + if (options && options.end != null) { + if (options.end > doc.length) { + throw new error_1.MongoInvalidArgumentError(`Stream end (${options.end}) must not be more than the length of the file (${doc.length})`); + } + if (options.start == null || options.start < 0) { + throw new error_1.MongoInvalidArgumentError(`Stream end (${options.end}) must not be negative`); + } + const start = options.start != null ? 
Math.floor(options.start / doc.chunkSize) : 0; + cursor.limit(Math.ceil(options.end / doc.chunkSize) - start); + stream.s.expectedEnd = Math.ceil(options.end / doc.chunkSize); + return Math.ceil(options.end / doc.chunkSize) * doc.chunkSize - options.end; + } + throw new error_1.MongoInvalidArgumentError('End option must be defined'); +} +//# sourceMappingURL=download.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/gridfs/download.js.map b/node_modules/mongodb/lib/gridfs/download.js.map new file mode 100644 index 000000000..9416e3b86 --- /dev/null +++ b/node_modules/mongodb/lib/gridfs/download.js.map @@ -0,0 +1 @@ +{"version":3,"file":"download.js","sourceRoot":"","sources":["../../src/gridfs/download.ts"],"names":[],"mappings":";;;AAAA,mCAAkC;AAClC,oCAKkB;AA+DlB;;;;;GAKG;AACH,MAAa,sBAAuB,SAAQ,iBAAQ;IA8BlD;;;;;OAKG;IACH,YACE,MAA+B,EAC/B,KAA6B,EAC7B,cAA0C,EAC1C,MAAgB,EAChB,OAAuC;QAEvC,KAAK,EAAE,CAAC;QACR,IAAI,CAAC,CAAC,GAAG;YACP,WAAW,EAAE,CAAC;YACd,WAAW,EAAE,CAAC;YACd,SAAS,EAAE,CAAC;YACZ,MAAM;YACN,QAAQ,EAAE,CAAC;YACX,KAAK;YACL,MAAM;YACN,IAAI,EAAE,KAAK;YACX,WAAW,EAAE,CAAC;YACd,OAAO,EAAE;gBACP,KAAK,EAAE,CAAC;gBACR,GAAG,EAAE,CAAC;gBACN,GAAG,OAAO;aACX;YACD,cAAc;SACf,CAAC;IACJ,CAAC;IAED;;;;OAIG;IACH,KAAK;QACH,IAAI,IAAI,CAAC,SAAS;YAAE,OAAO;QAC3B,WAAW,CAAC,IAAI,EAAE,GAAG,EAAE,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC;IACxC,CAAC;IAED;;;;;;OAMG;IACH,KAAK,CAAC,KAAK,GAAG,CAAC;QACb,kBAAkB,CAAC,IAAI,CAAC,CAAC;QACzB,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,GAAG,KAAK,CAAC;QAC7B,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;;OAMG;IACH,GAAG,CAAC,GAAG,GAAG,CAAC;QACT,kBAAkB,CAAC,IAAI,CAAC,CAAC;QACzB,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,GAAG,GAAG,CAAC;QACzB,OAAO,IAAI,CAAC;IACd,CAAC;IAED;;;;;;OAMG;IACH,KAAK,CAAC,QAAyB;QAC7B,IAAI,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;QAChB,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC;QACtB,IAAI,IAAI,CAAC,CAAC,CAAC,MAAM,EAAE;YACjB,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,KAAK,CAAC,EAAE;gBAC1B,IAAI,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,CAAC,CAAC;gBACxC,QAAQ,IAAI,QAAQ,CAAC,KAAK,CAAC,CAAC;YAC9B,CAAC,CAAC,CAAC;SACJ;aAAM;YACL,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,IAAI,EAAE;gBAChB,6DAA6D;gBAC7D,eAAe;gBACf,IAAI,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,CAAC,CAAC;aACzC;YACD,QAAQ,IAAI,QAAQ,EAAE,CAAC;SACxB;IACH,CAAC;;AA1HH,wDA2HC;AAvHC;;;GAGG;AACa,4BAAK,GAAG,OAAgB,CAAC;AACzC;;;GAGG;AACa,2BAAI,GAAG,MAAe,CAAC;AACvC;;;GAGG;AACa,2BAAI,GAAG,MAAe,CAAC;AACvC;;;GAGG;AACa,0BAAG,GAAG,KAAc,CAAC;AACrC;;;GAGG;AACa,4BAAK,GAAG,OAAgB,CAAC;AAiG3C,SAAS,kBAAkB,CAAC,MAA8B;IACxD,IAAI,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE;QACjB,MAAM,IAAI,8BAAsB,CAAC,2DAA2D,CAAC,CAAC;KAC/F;AACH,CAAC;AAED,SAAS,MAAM,CAAC,MAA8B;IAC5C,IAAI,MAAM,CAAC,SAAS;QAAE,OAAO;IAC7B,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM;QAAE,OAAO;IAC7B,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI;QAAE,OAAO;IAE3B,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,KAAK,EAAE,GAAG,EAAE,EAAE;QAClC,IAAI,MAAM,CAAC,SAAS,EAAE;YACpB,OAAO;SACR;QACD,IAAI,KAAK,EAAE;YACT,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;YACjD,OAAO;SACR;QACD,IAAI,CAAC,GAAG,EAAE;YACR,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;YAElB,OAAO,CAAC,QAAQ,CAAC,GAAG,EAAE;gBACpB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM;oBAAE,OAAO;gBAC7B,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,KAAK,CAAC,EAAE;oBAC5B,IAAI,KAAK,EAAE;wBACT,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;wBACjD,OAAO;qBACR;oBAED,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,CAAC,CAAC;gBAC5C,CAAC,CAAC,CAAC;YACL,CAAC,CAAC,CAAC;YAEH,OAAO;SACR;QAED,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI;YAAE,OAAO;QAE3B,MAAM,cAAc,GAAG,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,CAAC,CAAC,SAAS,CAAC;QACjE,MAAM,SAAS,GAAG,MAAM,CAAC,CAAC,CAAC,QAAQ,EAAE,C
AAC;QACtC,MAAM,cAAc,GAAG,IAAI,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,SAAS,EAAE,cAAc,CAAC,CAAC;QACzE,IAAI,GAAG,CAAC,CAAC,GAAG,SAAS,EAAE;YACrB,OAAO,MAAM,CAAC,IAAI,CAChB,sBAAsB,CAAC,KAAK,EAC5B,IAAI,6BAAqB,CACvB,qCAAqC,GAAG,CAAC,CAAC,eAAe,SAAS,EAAE,CACrE,CACF,CAAC;SACH;QAED,IAAI,GAAG,CAAC,CAAC,GAAG,SAAS,EAAE;YACrB,OAAO,MAAM,CAAC,IAAI,CAChB,sBAAsB,CAAC,KAAK,EAC5B,IAAI,6BAAqB,CAAC,iCAAiC,GAAG,CAAC,CAAC,eAAe,SAAS,EAAE,CAAC,CAC5F,CAAC;SACH;QAED,IAAI,GAAG,GAAG,MAAM,CAAC,QAAQ,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,MAAM,CAAC;QAEjE,IAAI,GAAG,CAAC,UAAU,KAAK,cAAc,EAAE;YACrC,IAAI,cAAc,IAAI,CAAC,EAAE;gBACvB,OAAO,MAAM,CAAC,IAAI,CAChB,sBAAsB,CAAC,KAAK,EAC5B,IAAI,6BAAqB,CAAC,iCAAiC,GAAG,CAAC,CAAC,EAAE,CAAC,CACpE,CAAC;aACH;YAED,OAAO,MAAM,CAAC,IAAI,CAChB,sBAAsB,CAAC,KAAK,EAC5B,IAAI,6BAAqB,CACvB,4CAA4C,GAAG,CAAC,UAAU,eAAe,cAAc,EAAE,CAC1F,CACF,CAAC;SACH;QAED,MAAM,CAAC,CAAC,CAAC,SAAS,IAAI,GAAG,CAAC,UAAU,CAAC;QAErC,IAAI,GAAG,CAAC,UAAU,KAAK,CAAC,EAAE;YACxB,OAAO,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;SAC1B;QAED,IAAI,UAAU,GAAG,IAAI,CAAC;QACtB,IAAI,QAAQ,GAAG,IAAI,CAAC;QAEpB,IAAI,MAAM,CAAC,CAAC,CAAC,WAAW,IAAI,IAAI,EAAE;YAChC,UAAU,GAAG,MAAM,CAAC,CAAC,CAAC,WAAW,CAAC;YAClC,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,CAAC,CAAC;SAC1B;QAED,MAAM,aAAa,GAAG,SAAS,KAAK,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,CAAC,CAAC;QAC7D,MAAM,eAAe,GAAG,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,GAAG,MAAM,CAAC,CAAC,CAAC,WAAW,CAAC;QACpE,IAAI,aAAa,IAAI,MAAM,CAAC,CAAC,CAAC,WAAW,IAAI,IAAI,EAAE;YACjD,QAAQ,GAAG,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,SAAS,GAAG,MAAM,CAAC,CAAC,CAAC,WAAW,CAAC;SAC3D;aAAM,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,IAAI,eAAe,GAAG,GAAG,CAAC,IAAI,CAAC,UAAU,EAAE;YACxE,QAAQ,GAAG,eAAe,CAAC;SAC5B;QAED,IAAI,UAAU,IAAI,IAAI,IAAI,QAAQ,IAAI,IAAI,EAAE;YAC1C,GAAG,GAAG,GAAG,CAAC,KAAK,CAAC,UAAU,IAAI,CAAC,EAAE,QAAQ,IAAI,GAAG,CAAC,UAAU,CAAC,CAAC;SAC9D;QAED,MAAM,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;IACnB,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,IAAI,CAAC,MAA8B;IAC1C,MAAM,cAAc,GAAgB,EAAE,CAAC;IACvC,IAAI,MAAM,CAAC,CAAC,CAAC,cAAc,EAAE;QAC3B,cAAc,CAAC,cAAc,GAAG,MAAM,CAAC,CAAC,CAAC,cAAc,CAAC;KACzD;IACD,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,EAAE;QAC7C,cAAc,CAAC,IAAI,GAAG,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC;KAC7C;IACD,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,EAAE;QAC7C,cAAc,CAAC,IAAI,GAAG,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC;KAC7C;IAED,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM,EAAE,cAAc,EAAE,CAAC,KAAK,EAAE,GAAG,EAAE,EAAE;QACrE,IAAI,KAAK,EAAE;YACT,OAAO,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;SACzD;QAED,IAAI,CAAC,GAAG,EAAE;YACR,MAAM,UAAU,GAAG,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,GAAG;gBACpC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,GAAG,CAAC,QAAQ,EAAE;gBAChC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,CAAC;YAC7B,MAAM,MAAM,GAAG,sBAAsB,UAAU,gBAAgB,CAAC;YAChE,kBAAkB;YAClB,MAAM,GAAG,GAAG,IAAI,yBAAiB,CAAC,MAAM,CAAC,CAAC;YAC1C,GAAG,CAAC,IAAI,GAAG,QAAQ,CAAC,CAAC,sDAAsD;YAC3E,OAAO,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC;SACvD;QAED,8DAA8D;QAC9D,oBAAoB;QACpB,IAAI,GAAG,CAAC,MAAM,IAAI,CAAC,EAAE;YACnB,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;YAClB,OAAO;SACR;QAED,IAAI,MAAM,CAAC,SAAS,EAAE;YACpB,4DAA4D;YAC5D,iEAAiE;YACjE,kBAAkB;YAClB,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,CAAC,CAAC;YAC1C,OAAO;SACR;QAED,IAAI;YACF,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,iBAAiB,CAAC,MAAM,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;SACzE;QAAC,OAAO,KAAK,EAAE;YACd,OAAO,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;SACzD;QAED,MAAM,MAAM,GAAa,EAAE,QAAQ,EAAE,GAAG,CAAC,GAAG,EAAE,CAAC;
QAE/C,sEAAsE;QACtE,8EAA8E;QAC9E,+CAA+C;QAC/C,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,IAAI,IAAI,EAAE;YACtD,MAAM,IAAI,GAAG,IAAI,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC;YAChE,IAAI,IAAI,GAAG,CAAC,EAAE;gBACZ,MAAM,CAAC,GAAG,CAAC,GAAG,EAAE,IAAI,EAAE,IAAI,EAAE,CAAC;aAC9B;SACF;QACD,MAAM,CAAC,CAAC,CAAC,MAAM,GAAG,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,CAAC;QAE9D,IAAI,MAAM,CAAC,CAAC,CAAC,cAAc,EAAE;YAC3B,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,kBAAkB,CAAC,MAAM,CAAC,CAAC,CAAC,cAAc,CAAC,CAAC;SAC7D;QAED,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,IAAI,CAAC,IAAI,CAAC,GAAG,CAAC,MAAM,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC;QAC7D,MAAM,CAAC,CAAC,CAAC,IAAI,GAAG,GAAiB,CAAC;QAElC,IAAI;YACF,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,eAAe,CAAC,MAAM,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;SACxF;QAAC,OAAO,KAAK,EAAE;YACd,OAAO,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;SACzD;QAED,MAAM,CAAC,IAAI,CAAC,sBAAsB,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC;IAChD,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,WAAW,CAAC,MAA8B,EAAE,QAAkB;IACrE,IAAI,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE;QACjB,OAAO,QAAQ,EAAE,CAAC;KACnB;IAED,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE;QAClB,IAAI,CAAC,MAAM,CAAC,CAAC;QACb,MAAM,CAAC,CAAC,CAAC,IAAI,GAAG,IAAI,CAAC;KACtB;IAED,MAAM,CAAC,IAAI,CAAC,MAAM,EAAE,GAAG,EAAE;QACvB,QAAQ,EAAE,CAAC;IACb,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,iBAAiB,CACxB,MAA8B,EAC9B,GAAa,EACb,OAAsC;IAEtC,IAAI,OAAO,IAAI,OAAO,CAAC,KAAK,IAAI,IAAI,EAAE;QACpC,IAAI,OAAO,CAAC,KAAK,GAAG,GAAG,CAAC,MAAM,EAAE;YAC9B,MAAM,IAAI,iCAAyB,CACjC,iBAAiB,OAAO,CAAC,KAAK,mDAAmD,GAAG,CAAC,MAAM,GAAG,CAC/F,CAAC;SACH;QACD,IAAI,OAAO,CAAC,KAAK,GAAG,CAAC,EAAE;YACrB,MAAM,IAAI,iCAAyB,CAAC,iBAAiB,OAAO,CAAC,KAAK,wBAAwB,CAAC,CAAC;SAC7F;QACD,IAAI,OAAO,CAAC,GAAG,IAAI,IAAI,IAAI,OAAO,CAAC,GAAG,GAAG,OAAO,CAAC,KAAK,EAAE;YACtD,MAAM,IAAI,iCAAyB,CACjC,iBAAiB,OAAO,CAAC,KAAK,0CAA0C,OAAO,CAAC,GAAG,GAAG,CACvF,CAAC;SACH;QAED,MAAM,CAAC,CAAC,CAAC,SAAS,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,GAAG,GAAG,CAAC,SAAS,CAAC,GAAG,GAAG,CAAC,SAAS,CAAC;QAC/E,MAAM,CAAC,CAAC,CAAC,QAAQ,GAAG,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC;QAE9D,OAAO,OAAO,CAAC,KAAK,GAAG,MAAM,CAAC,CAAC,CAAC,SAAS,CAAC;KAC3C;IACD,MAAM,IAAI,iCAAyB,CAAC,8BAA8B,CAAC,CAAC;AACtE,CAAC;AAED,SAAS,eAAe,CACtB,MAA8B,EAC9B,GAAa,EACb,MAA+B,EAC/B,OAAsC;IAEtC,IAAI,OAAO,IAAI,OAAO,CAAC,GAAG,IAAI,IAAI,EAAE;QAClC,IAAI,OAAO,CAAC,GAAG,GAAG,GAAG,CAAC,MAAM,EAAE;YAC5B,MAAM,IAAI,iCAAyB,CACjC,eAAe,OAAO,CAAC,GAAG,mDAAmD,GAAG,CAAC,MAAM,GAAG,CAC3F,CAAC;SACH;QACD,IAAI,OAAO,CAAC,KAAK,IAAI,IAAI,IAAI,OAAO,CAAC,KAAK,GAAG,CAAC,EAAE;YAC9C,MAAM,IAAI,iCAAyB,CAAC,eAAe,OAAO,CAAC,GAAG,wBAAwB,CAAC,CAAC;SACzF;QAED,MAAM,KAAK,GAAG,OAAO,CAAC,KAAK,IAAI,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;QAEpF,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,GAAG,GAAG,CAAC,SAAS,CAAC,GAAG,KAAK,CAAC,CAAC;QAE7D,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,GAAG,GAAG,CAAC,SAAS,CAAC,CAAC;QAE9D,OAAO,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,GAAG,GAAG,CAAC,SAAS,CAAC,GAAG,GAAG,CAAC,SAAS,GAAG,OAAO,CAAC,GAAG,CAAC;KAC7E;IACD,MAAM,IAAI,iCAAyB,CAAC,4BAA4B,CAAC,CAAC;AACpE,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/gridfs/index.js b/node_modules/mongodb/lib/gridfs/index.js new file mode 100644 index 000000000..4ed123ff0 --- /dev/null +++ b/node_modules/mongodb/lib/gridfs/index.js @@ -0,0 +1,161 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); 
+exports.GridFSBucket = void 0; +const error_1 = require("../error"); +const download_1 = require("./download"); +const upload_1 = require("./upload"); +const utils_1 = require("../utils"); +const write_concern_1 = require("../write_concern"); +const mongo_types_1 = require("../mongo_types"); +const DEFAULT_GRIDFS_BUCKET_OPTIONS = { + bucketName: 'fs', + chunkSizeBytes: 255 * 1024 +}; +/** + * Constructor for a streaming GridFS interface + * @public + */ +class GridFSBucket extends mongo_types_1.TypedEventEmitter { + constructor(db, options) { + super(); + this.setMaxListeners(0); + const privateOptions = { + ...DEFAULT_GRIDFS_BUCKET_OPTIONS, + ...options, + writeConcern: write_concern_1.WriteConcern.fromOptions(options) + }; + this.s = { + db, + options: privateOptions, + _chunksCollection: db.collection(privateOptions.bucketName + '.chunks'), + _filesCollection: db.collection(privateOptions.bucketName + '.files'), + checkedIndexes: false, + calledOpenUploadStream: false + }; + } + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS. The stream's 'id' property contains the resulting + * file's id. + * + * @param filename - The value of the 'filename' key in the files doc + * @param options - Optional settings. + */ + openUploadStream(filename, options) { + return new upload_1.GridFSBucketWriteStream(this, filename, options); + } + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS for a custom file id. The stream's 'id' property contains the resulting + * file's id. + */ + openUploadStreamWithId(id, filename, options) { + return new upload_1.GridFSBucketWriteStream(this, filename, { ...options, id }); + } + /** Returns a readable stream (GridFSBucketReadStream) for streaming file data from GridFS. */ + openDownloadStream(id, options) { + return new download_1.GridFSBucketReadStream(this.s._chunksCollection, this.s._filesCollection, this.s.options.readPreference, { _id: id }, options); + } + delete(id, callback) { + return utils_1.executeLegacyOperation(utils_1.getTopology(this.s.db), _delete, [this, id, callback], { + skipSessions: true + }); + } + /** Convenience wrapper around find on the files collection */ + find(filter, options) { + filter !== null && filter !== void 0 ? filter : (filter = {}); + options = options !== null && options !== void 0 ? options : {}; + return this.s._filesCollection.find(filter, options); + } + /** + * Returns a readable stream (GridFSBucketReadStream) for streaming the + * file with the given name from GridFS. If there are multiple files with + * the same name, this will stream the most recent file with the given name + * (as determined by the `uploadDate` field). You can set the `revision` + * option to change this behavior. 
+ */ + openDownloadStreamByName(filename, options) { + let sort = { uploadDate: -1 }; + let skip = undefined; + if (options && options.revision != null) { + if (options.revision >= 0) { + sort = { uploadDate: 1 }; + skip = options.revision; + } + else { + skip = -options.revision - 1; + } + } + return new download_1.GridFSBucketReadStream(this.s._chunksCollection, this.s._filesCollection, this.s.options.readPreference, { filename }, { ...options, sort, skip }); + } + rename(id, filename, callback) { + return utils_1.executeLegacyOperation(utils_1.getTopology(this.s.db), _rename, [this, id, filename, callback], { + skipSessions: true + }); + } + drop(callback) { + return utils_1.executeLegacyOperation(utils_1.getTopology(this.s.db), _drop, [this, callback], { + skipSessions: true + }); + } + /** Get the Db scoped logger. */ + getLogger() { + return this.s.db.s.logger; + } +} +exports.GridFSBucket = GridFSBucket; +/** + * When the first call to openUploadStream is made, the upload stream will + * check to see if it needs to create the proper indexes on the chunks and + * files collections. This event is fired either when 1) it determines that + * no index creation is necessary, 2) when it successfully creates the + * necessary indexes. + * @event + */ +GridFSBucket.INDEX = 'index'; +function _delete(bucket, id, callback) { + return bucket.s._filesCollection.deleteOne({ _id: id }, (error, res) => { + if (error) { + return callback(error); + } + return bucket.s._chunksCollection.deleteMany({ files_id: id }, error => { + if (error) { + return callback(error); + } + // Delete orphaned chunks before returning FileNotFound + if (!(res === null || res === void 0 ? void 0 : res.deletedCount)) { + // TODO(NODE-3483): Replace with more appropriate error + // Consider creating new error MongoGridFSFileNotFoundError + return callback(new error_1.MongoRuntimeError(`File not found for id ${id}`)); + } + return callback(); + }); + }); +} +function _rename(bucket, id, filename, callback) { + const filter = { _id: id }; + const update = { $set: { filename } }; + return bucket.s._filesCollection.updateOne(filter, update, (error, res) => { + if (error) { + return callback(error); + } + if (!(res === null || res === void 0 ? 
void 0 : res.matchedCount)) { + return callback(new error_1.MongoRuntimeError(`File with id ${id} not found`)); + } + return callback(); + }); +} +function _drop(bucket, callback) { + return bucket.s._filesCollection.drop((error) => { + if (error) { + return callback(error); + } + return bucket.s._chunksCollection.drop((error) => { + if (error) { + return callback(error); + } + return callback(); + }); + }); +} +//# sourceMappingURL=index.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/gridfs/index.js.map b/node_modules/mongodb/lib/gridfs/index.js.map new file mode 100644 index 000000000..c2b2e86d1 --- /dev/null +++ b/node_modules/mongodb/lib/gridfs/index.js.map @@ -0,0 +1 @@ +{"version":3,"file":"index.js","sourceRoot":"","sources":["../../src/gridfs/index.ts"],"names":[],"mappings":";;;AAAA,oCAA6C;AAC7C,yCAKoB;AACpB,qCAAgG;AAChG,oCAAyE;AACzE,oDAAqE;AASrE,gDAA2D;AAE3D,MAAM,6BAA6B,GAG/B;IACF,UAAU,EAAE,IAAI;IAChB,cAAc,EAAE,GAAG,GAAG,IAAI;CAC3B,CAAC;AAgCF;;;GAGG;AACH,MAAa,YAAa,SAAQ,+BAAqC;IAcrE,YAAY,EAAM,EAAE,OAA6B;QAC/C,KAAK,EAAE,CAAC;QACR,IAAI,CAAC,eAAe,CAAC,CAAC,CAAC,CAAC;QACxB,MAAM,cAAc,GAAG;YACrB,GAAG,6BAA6B;YAChC,GAAG,OAAO;YACV,YAAY,EAAE,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC;SAChD,CAAC;QACF,IAAI,CAAC,CAAC,GAAG;YACP,EAAE;YACF,OAAO,EAAE,cAAc;YACvB,iBAAiB,EAAE,EAAE,CAAC,UAAU,CAAc,cAAc,CAAC,UAAU,GAAG,SAAS,CAAC;YACpF,gBAAgB,EAAE,EAAE,CAAC,UAAU,CAAa,cAAc,CAAC,UAAU,GAAG,QAAQ,CAAC;YACjF,cAAc,EAAE,KAAK;YACrB,sBAAsB,EAAE,KAAK;SAC9B,CAAC;IACJ,CAAC;IAED;;;;;;;OAOG;IAEH,gBAAgB,CACd,QAAgB,EAChB,OAAwC;QAExC,OAAO,IAAI,gCAAuB,CAAC,IAAI,EAAE,QAAQ,EAAE,OAAO,CAAC,CAAC;IAC9D,CAAC;IAED;;;;OAIG;IACH,sBAAsB,CACpB,EAAY,EACZ,QAAgB,EAChB,OAAwC;QAExC,OAAO,IAAI,gCAAuB,CAAC,IAAI,EAAE,QAAQ,EAAE,EAAE,GAAG,OAAO,EAAE,EAAE,EAAE,CAAC,CAAC;IACzE,CAAC;IAED,8FAA8F;IAC9F,kBAAkB,CAChB,EAAY,EACZ,OAAuC;QAEvC,OAAO,IAAI,iCAAsB,CAC/B,IAAI,CAAC,CAAC,CAAC,iBAAiB,EACxB,IAAI,CAAC,CAAC,CAAC,gBAAgB,EACvB,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,EAC7B,EAAE,GAAG,EAAE,EAAE,EAAE,EACX,OAAO,CACR,CAAC;IACJ,CAAC;IASD,MAAM,CAAC,EAAY,EAAE,QAAyB;QAC5C,OAAO,8BAAsB,CAAC,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,EAAE,QAAQ,CAAC,EAAE;YACnF,YAAY,EAAE,IAAI;SACnB,CAAC,CAAC;IACL,CAAC;IAED,8DAA8D;IAC9D,IAAI,CAAC,MAA2B,EAAE,OAAqB;QACrD,MAAM,aAAN,MAAM,cAAN,MAAM,IAAN,MAAM,GAAK,EAAE,EAAC;QACd,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,OAAO,IAAI,CAAC,CAAC,CAAC,gBAAgB,CAAC,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;IACvD,CAAC;IAED;;;;;;OAMG;IACH,wBAAwB,CACtB,QAAgB,EAChB,OAAmD;QAEnD,IAAI,IAAI,GAAS,EAAE,UAAU,EAAE,CAAC,CAAC,EAAE,CAAC;QACpC,IAAI,IAAI,GAAG,SAAS,CAAC;QACrB,IAAI,OAAO,IAAI,OAAO,CAAC,QAAQ,IAAI,IAAI,EAAE;YACvC,IAAI,OAAO,CAAC,QAAQ,IAAI,CAAC,EAAE;gBACzB,IAAI,GAAG,EAAE,UAAU,EAAE,CAAC,EAAE,CAAC;gBACzB,IAAI,GAAG,OAAO,CAAC,QAAQ,CAAC;aACzB;iBAAM;gBACL,IAAI,GAAG,CAAC,OAAO,CAAC,QAAQ,GAAG,CAAC,CAAC;aAC9B;SACF;QACD,OAAO,IAAI,iCAAsB,CAC/B,IAAI,CAAC,CAAC,CAAC,iBAAiB,EACxB,IAAI,CAAC,CAAC,CAAC,gBAAgB,EACvB,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,EAC7B,EAAE,QAAQ,EAAE,EACZ,EAAE,GAAG,OAAO,EAAE,IAAI,EAAE,IAAI,EAAE,CAC3B,CAAC;IACJ,CAAC;IAUD,MAAM,CAAC,EAAY,EAAE,QAAgB,EAAE,QAAyB;QAC9D,OAAO,8BAAsB,CAAC,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,OAAO,EAAE,CAAC,IAAI,EAAE,EAAE,EAAE,QAAQ,EAAE,QAAQ,CAAC,EAAE;YAC7F,YAAY,EAAE,IAAI;SACnB,CAAC,CAAC;IACL,CAAC;IAKD,IAAI,CAAC,QAAyB;QAC5B,OAAO,8BAAsB,CAAC,mBAAW,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,KAAK,EAAE,CAAC,IAAI,EAAE,QAAQ,CAAC,EAAE;YAC7E,YAAY,EAAE,IAAI;SACnB,CAAC,CAAC;IACL,CAAC;IAED,gCAAgC;IAChC,SAAS;QACP,OAAO,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,MAAM,CAAC;IAC5B,CAAC;;AAvJH,oCAwJC;AA
pJC;;;;;;;GAOG;AACa,kBAAK,GAAG,OAAgB,CAAC;AA8I3C,SAAS,OAAO,CAAC,MAAoB,EAAE,EAAY,EAAE,QAAwB;IAC3E,OAAO,MAAM,CAAC,CAAC,CAAC,gBAAgB,CAAC,SAAS,CAAC,EAAE,GAAG,EAAE,EAAE,EAAE,EAAE,CAAC,KAAK,EAAE,GAAG,EAAE,EAAE;QACrE,IAAI,KAAK,EAAE;YACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;SACxB;QAED,OAAO,MAAM,CAAC,CAAC,CAAC,iBAAiB,CAAC,UAAU,CAAC,EAAE,QAAQ,EAAE,EAAE,EAAE,EAAE,KAAK,CAAC,EAAE;YACrE,IAAI,KAAK,EAAE;gBACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;aACxB;YAED,uDAAuD;YACvD,IAAI,CAAC,CAAA,GAAG,aAAH,GAAG,uBAAH,GAAG,CAAE,YAAY,CAAA,EAAE;gBACtB,uDAAuD;gBACvD,2DAA2D;gBAC3D,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,yBAAyB,EAAE,EAAE,CAAC,CAAC,CAAC;aACvE;YAED,OAAO,QAAQ,EAAE,CAAC;QACpB,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,OAAO,CACd,MAAoB,EACpB,EAAY,EACZ,QAAgB,EAChB,QAAwB;IAExB,MAAM,MAAM,GAAG,EAAE,GAAG,EAAE,EAAE,EAAE,CAAC;IAC3B,MAAM,MAAM,GAAG,EAAE,IAAI,EAAE,EAAE,QAAQ,EAAE,EAAE,CAAC;IACtC,OAAO,MAAM,CAAC,CAAC,CAAC,gBAAgB,CAAC,SAAS,CAAC,MAAM,EAAE,MAAM,EAAE,CAAC,KAAM,EAAE,GAAI,EAAE,EAAE;QAC1E,IAAI,KAAK,EAAE;YACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;SACxB;QAED,IAAI,CAAC,CAAA,GAAG,aAAH,GAAG,uBAAH,GAAG,CAAE,YAAY,CAAA,EAAE;YACtB,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,gBAAgB,EAAE,YAAY,CAAC,CAAC,CAAC;SACxE;QAED,OAAO,QAAQ,EAAE,CAAC;IACpB,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,KAAK,CAAC,MAAoB,EAAE,QAAwB;IAC3D,OAAO,MAAM,CAAC,CAAC,CAAC,gBAAgB,CAAC,IAAI,CAAC,CAAC,KAAa,EAAE,EAAE;QACtD,IAAI,KAAK,EAAE;YACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;SACxB;QACD,OAAO,MAAM,CAAC,CAAC,CAAC,iBAAiB,CAAC,IAAI,CAAC,CAAC,KAAa,EAAE,EAAE;YACvD,IAAI,KAAK,EAAE;gBACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;aACxB;YAED,OAAO,QAAQ,EAAE,CAAC;QACpB,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/gridfs/upload.js b/node_modules/mongodb/lib/gridfs/upload.js new file mode 100644 index 000000000..19a778120 --- /dev/null +++ b/node_modules/mongodb/lib/gridfs/upload.js @@ -0,0 +1,375 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.GridFSBucketWriteStream = void 0; +const stream_1 = require("stream"); +const bson_1 = require("../bson"); +const error_1 = require("../error"); +const utils_1 = require("../utils"); +const write_concern_1 = require("./../write_concern"); +/** + * A writable stream that enables you to write buffers to GridFS. + * + * Do not instantiate this class directly. Use `openUploadStream()` instead. + * @public + */ +class GridFSBucketWriteStream extends stream_1.Writable { + /** @internal + * @param bucket - Handle for this stream's corresponding bucket + * @param filename - The value of the 'filename' key in the files doc + * @param options - Optional settings. + */ + constructor(bucket, filename, options) { + super(); + options = options !== null && options !== void 0 ? options : {}; + this.bucket = bucket; + this.chunks = bucket.s._chunksCollection; + this.filename = filename; + this.files = bucket.s._filesCollection; + this.options = options; + this.writeConcern = write_concern_1.WriteConcern.fromOptions(options) || bucket.s.options.writeConcern; + // Signals the write is all done + this.done = false; + this.id = options.id ? 
options.id : new bson_1.ObjectId(); + // properly inherit the default chunksize from parent + this.chunkSizeBytes = options.chunkSizeBytes || this.bucket.s.options.chunkSizeBytes; + this.bufToStore = Buffer.alloc(this.chunkSizeBytes); + this.length = 0; + this.n = 0; + this.pos = 0; + this.state = { + streamEnd: false, + outstandingRequests: 0, + errored: false, + aborted: false + }; + if (!this.bucket.s.calledOpenUploadStream) { + this.bucket.s.calledOpenUploadStream = true; + checkIndexes(this, () => { + this.bucket.s.checkedIndexes = true; + this.bucket.emit('index'); + }); + } + } + write(chunk, encodingOrCallback, callback) { + const encoding = typeof encodingOrCallback === 'function' ? undefined : encodingOrCallback; + callback = typeof encodingOrCallback === 'function' ? encodingOrCallback : callback; + return waitForIndexes(this, () => doWrite(this, chunk, encoding, callback)); + } + abort(callback) { + return utils_1.maybePromise(callback, callback => { + if (this.state.streamEnd) { + // TODO(NODE-3485): Replace with MongoGridFSStreamClosed + return callback(new error_1.MongoAPIError('Cannot abort a stream that has already completed')); + } + if (this.state.aborted) { + // TODO(NODE-3485): Replace with MongoGridFSStreamClosed + return callback(new error_1.MongoAPIError('Cannot call abort() on a stream twice')); + } + this.state.aborted = true; + this.chunks.deleteMany({ files_id: this.id }, error => callback(error)); + }); + } + end(chunkOrCallback, encodingOrCallback, callback) { + const chunk = typeof chunkOrCallback === 'function' ? undefined : chunkOrCallback; + const encoding = typeof encodingOrCallback === 'function' ? undefined : encodingOrCallback; + callback = + typeof chunkOrCallback === 'function' + ? chunkOrCallback + : typeof encodingOrCallback === 'function' + ? encodingOrCallback + : callback; + if (checkAborted(this, callback)) + return; + this.state.streamEnd = true; + if (callback) { + this.once(GridFSBucketWriteStream.FINISH, (result) => { + if (callback) + callback(undefined, result); + }); + } + if (!chunk) { + waitForIndexes(this, () => !!writeRemnant(this)); + return; + } + this.write(chunk, encoding, () => { + writeRemnant(this); + }); + } +} +exports.GridFSBucketWriteStream = GridFSBucketWriteStream; +/** @event */ +GridFSBucketWriteStream.CLOSE = 'close'; +/** @event */ +GridFSBucketWriteStream.ERROR = 'error'; +/** + * `end()` was called and the write stream successfully wrote the file metadata and all the chunks to MongoDB. 
+ * @event + */ +GridFSBucketWriteStream.FINISH = 'finish'; +function __handleError(stream, error, callback) { + if (stream.state.errored) { + return; + } + stream.state.errored = true; + if (callback) { + return callback(error); + } + stream.emit(GridFSBucketWriteStream.ERROR, error); +} +function createChunkDoc(filesId, n, data) { + return { + _id: new bson_1.ObjectId(), + files_id: filesId, + n, + data + }; +} +function checkChunksIndex(stream, callback) { + stream.chunks.listIndexes().toArray((error, indexes) => { + let index; + if (error) { + // Collection doesn't exist so create index + if (error instanceof error_1.MongoError && error.code === error_1.MONGODB_ERROR_CODES.NamespaceNotFound) { + index = { files_id: 1, n: 1 }; + stream.chunks.createIndex(index, { background: false, unique: true }, error => { + if (error) { + return callback(error); + } + callback(); + }); + return; + } + return callback(error); + } + let hasChunksIndex = false; + if (indexes) { + indexes.forEach((index) => { + if (index.key) { + const keys = Object.keys(index.key); + if (keys.length === 2 && index.key.files_id === 1 && index.key.n === 1) { + hasChunksIndex = true; + } + } + }); + } + if (hasChunksIndex) { + callback(); + } + else { + index = { files_id: 1, n: 1 }; + const writeConcernOptions = getWriteOptions(stream); + stream.chunks.createIndex(index, { + ...writeConcernOptions, + background: true, + unique: true + }, callback); + } + }); +} +function checkDone(stream, callback) { + if (stream.done) + return true; + if (stream.state.streamEnd && stream.state.outstandingRequests === 0 && !stream.state.errored) { + // Set done so we do not trigger duplicate createFilesDoc + stream.done = true; + // Create a new files doc + const filesDoc = createFilesDoc(stream.id, stream.length, stream.chunkSizeBytes, stream.filename, stream.options.contentType, stream.options.aliases, stream.options.metadata); + if (checkAborted(stream, callback)) { + return false; + } + stream.files.insertOne(filesDoc, getWriteOptions(stream), (error) => { + if (error) { + return __handleError(stream, error, callback); + } + stream.emit(GridFSBucketWriteStream.FINISH, filesDoc); + stream.emit(GridFSBucketWriteStream.CLOSE); + }); + return true; + } + return false; +} +function checkIndexes(stream, callback) { + stream.files.findOne({}, { projection: { _id: 1 } }, (error, doc) => { + if (error) { + return callback(error); + } + if (doc) { + return callback(); + } + stream.files.listIndexes().toArray((error, indexes) => { + let index; + if (error) { + // Collection doesn't exist so create index + if (error instanceof error_1.MongoError && error.code === error_1.MONGODB_ERROR_CODES.NamespaceNotFound) { + index = { filename: 1, uploadDate: 1 }; + stream.files.createIndex(index, { background: false }, (error) => { + if (error) { + return callback(error); + } + checkChunksIndex(stream, callback); + }); + return; + } + return callback(error); + } + let hasFileIndex = false; + if (indexes) { + indexes.forEach((index) => { + const keys = Object.keys(index.key); + if (keys.length === 2 && index.key.filename === 1 && index.key.uploadDate === 1) { + hasFileIndex = true; + } + }); + } + if (hasFileIndex) { + checkChunksIndex(stream, callback); + } + else { + index = { filename: 1, uploadDate: 1 }; + const writeConcernOptions = getWriteOptions(stream); + stream.files.createIndex(index, { + ...writeConcernOptions, + background: false + }, (error) => { + if (error) { + return callback(error); + } + checkChunksIndex(stream, callback); + }); + } + }); 
+ }); +} +function createFilesDoc(_id, length, chunkSize, filename, contentType, aliases, metadata) { + const ret = { + _id, + length, + chunkSize, + uploadDate: new Date(), + filename + }; + if (contentType) { + ret.contentType = contentType; + } + if (aliases) { + ret.aliases = aliases; + } + if (metadata) { + ret.metadata = metadata; + } + return ret; +} +function doWrite(stream, chunk, encoding, callback) { + if (checkAborted(stream, callback)) { + return false; + } + const inputBuf = Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk, encoding); + stream.length += inputBuf.length; + // Input is small enough to fit in our buffer + if (stream.pos + inputBuf.length < stream.chunkSizeBytes) { + inputBuf.copy(stream.bufToStore, stream.pos); + stream.pos += inputBuf.length; + callback && callback(); + // Note that we reverse the typical semantics of write's return value + // to be compatible with node's `.pipe()` function. + // True means client can keep writing. + return true; + } + // Otherwise, buffer is too big for current chunk, so we need to flush + // to MongoDB. + let inputBufRemaining = inputBuf.length; + let spaceRemaining = stream.chunkSizeBytes - stream.pos; + let numToCopy = Math.min(spaceRemaining, inputBuf.length); + let outstandingRequests = 0; + while (inputBufRemaining > 0) { + const inputBufPos = inputBuf.length - inputBufRemaining; + inputBuf.copy(stream.bufToStore, stream.pos, inputBufPos, inputBufPos + numToCopy); + stream.pos += numToCopy; + spaceRemaining -= numToCopy; + let doc; + if (spaceRemaining === 0) { + doc = createChunkDoc(stream.id, stream.n, Buffer.from(stream.bufToStore)); + ++stream.state.outstandingRequests; + ++outstandingRequests; + if (checkAborted(stream, callback)) { + return false; + } + stream.chunks.insertOne(doc, getWriteOptions(stream), (error) => { + if (error) { + return __handleError(stream, error); + } + --stream.state.outstandingRequests; + --outstandingRequests; + if (!outstandingRequests) { + stream.emit('drain', doc); + callback && callback(); + checkDone(stream); + } + }); + spaceRemaining = stream.chunkSizeBytes; + stream.pos = 0; + ++stream.n; + } + inputBufRemaining -= numToCopy; + numToCopy = Math.min(spaceRemaining, inputBufRemaining); + } + // Note that we reverse the typical semantics of write's return value + // to be compatible with node's `.pipe()` function. + // False means the client should wait for the 'drain' event. + return false; +} +function getWriteOptions(stream) { + const obj = {}; + if (stream.writeConcern) { + obj.writeConcern = { + w: stream.writeConcern.w, + wtimeout: stream.writeConcern.wtimeout, + j: stream.writeConcern.j + }; + } + return obj; +} +function waitForIndexes(stream, callback) { + if (stream.bucket.s.checkedIndexes) { + return callback(false); + } + stream.bucket.once('index', () => { + callback(true); + }); + return true; +} +function writeRemnant(stream, callback) { + // Buffer is empty, so don't bother to insert + if (stream.pos === 0) { + return checkDone(stream, callback); + } + ++stream.state.outstandingRequests; + // Create a new buffer to make sure the buffer isn't bigger than it needs + // to be. 
+ const remnant = Buffer.alloc(stream.pos); + stream.bufToStore.copy(remnant, 0, 0, stream.pos); + const doc = createChunkDoc(stream.id, stream.n, remnant); + // If the stream was aborted, do not write remnant + if (checkAborted(stream, callback)) { + return false; + } + stream.chunks.insertOne(doc, getWriteOptions(stream), (error) => { + if (error) { + return __handleError(stream, error); + } + --stream.state.outstandingRequests; + checkDone(stream); + }); + return true; +} +function checkAborted(stream, callback) { + if (stream.state.aborted) { + if (typeof callback === 'function') { + // TODO(NODE-3485): Replace with MongoGridFSStreamClosedError + callback(new error_1.MongoAPIError('Stream has been aborted')); + } + return true; + } + return false; +} +//# sourceMappingURL=upload.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/gridfs/upload.js.map b/node_modules/mongodb/lib/gridfs/upload.js.map new file mode 100644 index 000000000..41091d36e --- /dev/null +++ b/node_modules/mongodb/lib/gridfs/upload.js.map @@ -0,0 +1 @@ +{"version":3,"file":"upload.js","sourceRoot":"","sources":["../../src/gridfs/upload.ts"],"names":[],"mappings":";;;AAAA,mCAAkC;AAElC,kCAAmC;AAEnC,oCAAoF;AACpF,oCAAkD;AAElD,sDAAkD;AA0BlD;;;;;GAKG;AACH,MAAa,uBAAwB,SAAQ,iBAAQ;IA+BnD;;;;OAIG;IACH,YAAY,MAAoB,EAAE,QAAgB,EAAE,OAAwC;QAC1F,KAAK,EAAE,CAAC;QAER,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QACrB,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,CAAC,CAAC,iBAAiB,CAAC;QACzC,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,KAAK,GAAG,MAAM,CAAC,CAAC,CAAC,gBAAgB,CAAC;QACvC,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,IAAI,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC;QACvF,gCAAgC;QAChC,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAElB,IAAI,CAAC,EAAE,GAAG,OAAO,CAAC,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,EAAE,CAAC,CAAC,CAAC,IAAI,eAAQ,EAAE,CAAC;QACnD,qDAAqD;QACrD,IAAI,CAAC,cAAc,GAAG,OAAO,CAAC,cAAc,IAAI,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,OAAO,CAAC,cAAc,CAAC;QACrF,IAAI,CAAC,UAAU,GAAG,MAAM,CAAC,KAAK,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;QACpD,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC;QAChB,IAAI,CAAC,CAAC,GAAG,CAAC,CAAC;QACX,IAAI,CAAC,GAAG,GAAG,CAAC,CAAC;QACb,IAAI,CAAC,KAAK,GAAG;YACX,SAAS,EAAE,KAAK;YAChB,mBAAmB,EAAE,CAAC;YACtB,OAAO,EAAE,KAAK;YACd,OAAO,EAAE,KAAK;SACf,CAAC;QAEF,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,sBAAsB,EAAE;YACzC,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,sBAAsB,GAAG,IAAI,CAAC;YAE5C,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE;gBACtB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,cAAc,GAAG,IAAI,CAAC;gBACpC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;YAC5B,CAAC,CAAC,CAAC;SACJ;IACH,CAAC;IAkBD,KAAK,CACH,KAAsB,EACtB,kBAAoD,EACpD,QAAyB;QAEzB,MAAM,QAAQ,GAAG,OAAO,kBAAkB,KAAK,UAAU,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,kBAAkB,CAAC;QAC3F,QAAQ,GAAG,OAAO,kBAAkB,KAAK,UAAU,CAAC,CAAC,CAAC,kBAAkB,CAAC,CAAC,CAAC,QAAQ,CAAC;QACpF,OAAO,cAAc,CAAC,IAAI,EAAE,GAAG,EAAE,CAAC,OAAO,CAAC,IAAI,EAAE,KAAK,EAAE,QAAQ,EAAE,QAAQ,CAAC,CAAC,CAAC;IAC9E,CAAC;IAWD,KAAK,CAAC,QAAyB;QAC7B,OAAO,oBAAY,CAAC,QAAQ,EAAE,QAAQ,CAAC,EAAE;YACvC,IAAI,IAAI,CAAC,KAAK,CAAC,SAAS,EAAE;gBACxB,wDAAwD;gBACxD,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,kDAAkD,CAAC,CAAC,CAAC;aACxF;YAED,IAAI,IAAI,CAAC,KAAK,CAAC,OAAO,EAAE;gBACtB,wDAAwD;gBACxD,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,uCAAuC,CAAC,CAAC,CAAC;aAC7E;YAED,IAAI,CAAC,KAAK,CAAC,OAAO,GAAG,IAAI,CAAC;YAC1B,IAAI,CAAC,MAAM,CAAC,UAAU,CAAC,EAAE,QAAQ,EAAE,IAAI,CAAC,EAAE,EAAE,EAAE,KAAK,CAAC,EAAE,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC;QAC1E,CAAC,CAAC,CAAC;IACL,CAAC;IAqBD,GAAG,CACD,eAAsD,EACtD,kBAAiE,EACjE,QAAsC;QAEtC,MAAM,KAAK,GAAG,OAAO,eAAe,KAAK,UAAU,CAAC,CAAC,CAAC,SA
AS,CAAC,CAAC,CAAC,eAAe,CAAC;QAClF,MAAM,QAAQ,GAAG,OAAO,kBAAkB,KAAK,UAAU,CAAC,CAAC,CAAC,SAAS,CAAC,CAAC,CAAC,kBAAkB,CAAC;QAC3F,QAAQ;YACN,OAAO,eAAe,KAAK,UAAU;gBACnC,CAAC,CAAC,eAAe;gBACjB,CAAC,CAAC,OAAO,kBAAkB,KAAK,UAAU;oBAC1C,CAAC,CAAC,kBAAkB;oBACpB,CAAC,CAAC,QAAQ,CAAC;QAEf,IAAI,YAAY,CAAC,IAAI,EAAE,QAAQ,CAAC;YAAE,OAAO;QAEzC,IAAI,CAAC,KAAK,CAAC,SAAS,GAAG,IAAI,CAAC;QAE5B,IAAI,QAAQ,EAAE;YACZ,IAAI,CAAC,IAAI,CAAC,uBAAuB,CAAC,MAAM,EAAE,CAAC,MAAkB,EAAE,EAAE;gBAC/D,IAAI,QAAQ;oBAAE,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;YAC5C,CAAC,CAAC,CAAC;SACJ;QAED,IAAI,CAAC,KAAK,EAAE;YACV,cAAc,CAAC,IAAI,EAAE,GAAG,EAAE,CAAC,CAAC,CAAC,YAAY,CAAC,IAAI,CAAC,CAAC,CAAC;YACjD,OAAO;SACR;QAED,IAAI,CAAC,KAAK,CAAC,KAAK,EAAE,QAAQ,EAAE,GAAG,EAAE;YAC/B,YAAY,CAAC,IAAI,CAAC,CAAC;QACrB,CAAC,CAAC,CAAC;IACL,CAAC;;AAhLH,0DAiLC;AA5JC,aAAa;AACG,6BAAK,GAAG,OAAO,CAAC;AAChC,aAAa;AACG,6BAAK,GAAG,OAAO,CAAC;AAChC;;;GAGG;AACa,8BAAM,GAAG,QAAQ,CAAC;AAsJpC,SAAS,aAAa,CACpB,MAA+B,EAC/B,KAAe,EACf,QAAmB;IAEnB,IAAI,MAAM,CAAC,KAAK,CAAC,OAAO,EAAE;QACxB,OAAO;KACR;IACD,MAAM,CAAC,KAAK,CAAC,OAAO,GAAG,IAAI,CAAC;IAC5B,IAAI,QAAQ,EAAE;QACZ,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;KACxB;IACD,MAAM,CAAC,IAAI,CAAC,uBAAuB,CAAC,KAAK,EAAE,KAAK,CAAC,CAAC;AACpD,CAAC;AAED,SAAS,cAAc,CAAC,OAAiB,EAAE,CAAS,EAAE,IAAY;IAChE,OAAO;QACL,GAAG,EAAE,IAAI,eAAQ,EAAE;QACnB,QAAQ,EAAE,OAAO;QACjB,CAAC;QACD,IAAI;KACL,CAAC;AACJ,CAAC;AAED,SAAS,gBAAgB,CAAC,MAA+B,EAAE,QAAkB;IAC3E,MAAM,CAAC,MAAM,CAAC,WAAW,EAAE,CAAC,OAAO,CAAC,CAAC,KAAgB,EAAE,OAAoB,EAAE,EAAE;QAC7E,IAAI,KAAsC,CAAC;QAC3C,IAAI,KAAK,EAAE;YACT,2CAA2C;YAC3C,IAAI,KAAK,YAAY,kBAAU,IAAI,KAAK,CAAC,IAAI,KAAK,2BAAmB,CAAC,iBAAiB,EAAE;gBACvF,KAAK,GAAG,EAAE,QAAQ,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC;gBAC9B,MAAM,CAAC,MAAM,CAAC,WAAW,CAAC,KAAK,EAAE,EAAE,UAAU,EAAE,KAAK,EAAE,MAAM,EAAE,IAAI,EAAE,EAAE,KAAK,CAAC,EAAE;oBAC5E,IAAI,KAAK,EAAE;wBACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;qBACxB;oBAED,QAAQ,EAAE,CAAC;gBACb,CAAC,CAAC,CAAC;gBACH,OAAO;aACR;YACD,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;SACxB;QAED,IAAI,cAAc,GAAG,KAAK,CAAC;QAC3B,IAAI,OAAO,EAAE;YACX,OAAO,CAAC,OAAO,CAAC,CAAC,KAAe,EAAE,EAAE;gBAClC,IAAI,KAAK,CAAC,GAAG,EAAE;oBACb,MAAM,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;oBACpC,IAAI,IAAI,CAAC,MAAM,KAAK,CAAC,IAAI,KAAK,CAAC,GAAG,CAAC,QAAQ,KAAK,CAAC,IAAI,KAAK,CAAC,GAAG,CAAC,CAAC,KAAK,CAAC,EAAE;wBACtE,cAAc,GAAG,IAAI,CAAC;qBACvB;iBACF;YACH,CAAC,CAAC,CAAC;SACJ;QAED,IAAI,cAAc,EAAE;YAClB,QAAQ,EAAE,CAAC;SACZ;aAAM;YACL,KAAK,GAAG,EAAE,QAAQ,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC;YAC9B,MAAM,mBAAmB,GAAG,eAAe,CAAC,MAAM,CAAC,CAAC;YAEpD,MAAM,CAAC,MAAM,CAAC,WAAW,CACvB,KAAK,EACL;gBACE,GAAG,mBAAmB;gBACtB,UAAU,EAAE,IAAI;gBAChB,MAAM,EAAE,IAAI;aACb,EACD,QAAQ,CACT,CAAC;SACH;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,SAAS,CAAC,MAA+B,EAAE,QAAmB;IACrE,IAAI,MAAM,CAAC,IAAI;QAAE,OAAO,IAAI,CAAC;IAC7B,IAAI,MAAM,CAAC,KAAK,CAAC,SAAS,IAAI,MAAM,CAAC,KAAK,CAAC,mBAAmB,KAAK,CAAC,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,OAAO,EAAE;QAC7F,yDAAyD;QACzD,MAAM,CAAC,IAAI,GAAG,IAAI,CAAC;QACnB,yBAAyB;QACzB,MAAM,QAAQ,GAAG,cAAc,CAC7B,MAAM,CAAC,EAAE,EACT,MAAM,CAAC,MAAM,EACb,MAAM,CAAC,cAAc,EACrB,MAAM,CAAC,QAAQ,EACf,MAAM,CAAC,OAAO,CAAC,WAAW,EAC1B,MAAM,CAAC,OAAO,CAAC,OAAO,EACtB,MAAM,CAAC,OAAO,CAAC,QAAQ,CACxB,CAAC;QAEF,IAAI,YAAY,CAAC,MAAM,EAAE,QAAQ,CAAC,EAAE;YAClC,OAAO,KAAK,CAAC;SACd;QAED,MAAM,CAAC,KAAK,CAAC,SAAS,CAAC,QAAQ,EAAE,eAAe,CAAC,MAAM,CAAC,EAAE,CAAC,KAAgB,EAAE,EAAE;YAC7E,IAAI,KAAK,EAAE;gBACT,OAAO,aAAa,CAAC,MAAM,EAAE,KAAK,EAAE,QAAQ,CAAC,CAAC;aAC/C;YACD,MAAM,CAAC,IAAI,CAAC,uBAAuB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;YACtD,MAAM,CAAC,IAAI,CAAC,uBAAuB,CAAC,KAAK,CAAC,CAAC;QAC7C,CAAC,CAAC,CAAC;QAEH,OAAO,IAAI,CAAC;KACb;IAED,OAAO,KAAK,CAAC;AACf,CAA
C;AAED,SAAS,YAAY,CAAC,MAA+B,EAAE,QAAkB;IACvE,MAAM,CAAC,KAAK,CAAC,OAAO,CAAC,EAAE,EAAE,EAAE,UAAU,EAAE,EAAE,GAAG,EAAE,CAAC,EAAE,EAAE,EAAE,CAAC,KAAK,EAAE,GAAG,EAAE,EAAE;QAClE,IAAI,KAAK,EAAE;YACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;SACxB;QACD,IAAI,GAAG,EAAE;YACP,OAAO,QAAQ,EAAE,CAAC;SACnB;QAED,MAAM,CAAC,KAAK,CAAC,WAAW,EAAE,CAAC,OAAO,CAAC,CAAC,KAAgB,EAAE,OAAkB,EAAE,EAAE;YAC1E,IAAI,KAA+C,CAAC;YACpD,IAAI,KAAK,EAAE;gBACT,2CAA2C;gBAC3C,IAAI,KAAK,YAAY,kBAAU,IAAI,KAAK,CAAC,IAAI,KAAK,2BAAmB,CAAC,iBAAiB,EAAE;oBACvF,KAAK,GAAG,EAAE,QAAQ,EAAE,CAAC,EAAE,UAAU,EAAE,CAAC,EAAE,CAAC;oBACvC,MAAM,CAAC,KAAK,CAAC,WAAW,CAAC,KAAK,EAAE,EAAE,UAAU,EAAE,KAAK,EAAE,EAAE,CAAC,KAAgB,EAAE,EAAE;wBAC1E,IAAI,KAAK,EAAE;4BACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;yBACxB;wBAED,gBAAgB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;oBACrC,CAAC,CAAC,CAAC;oBACH,OAAO;iBACR;gBACD,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;aACxB;YAED,IAAI,YAAY,GAAG,KAAK,CAAC;YACzB,IAAI,OAAO,EAAE;gBACX,OAAO,CAAC,OAAO,CAAC,CAAC,KAAe,EAAE,EAAE;oBAClC,MAAM,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;oBACpC,IAAI,IAAI,CAAC,MAAM,KAAK,CAAC,IAAI,KAAK,CAAC,GAAG,CAAC,QAAQ,KAAK,CAAC,IAAI,KAAK,CAAC,GAAG,CAAC,UAAU,KAAK,CAAC,EAAE;wBAC/E,YAAY,GAAG,IAAI,CAAC;qBACrB;gBACH,CAAC,CAAC,CAAC;aACJ;YAED,IAAI,YAAY,EAAE;gBAChB,gBAAgB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;aACpC;iBAAM;gBACL,KAAK,GAAG,EAAE,QAAQ,EAAE,CAAC,EAAE,UAAU,EAAE,CAAC,EAAE,CAAC;gBAEvC,MAAM,mBAAmB,GAAG,eAAe,CAAC,MAAM,CAAC,CAAC;gBAEpD,MAAM,CAAC,KAAK,CAAC,WAAW,CACtB,KAAK,EACL;oBACE,GAAG,mBAAmB;oBACtB,UAAU,EAAE,KAAK;iBAClB,EACD,CAAC,KAAgB,EAAE,EAAE;oBACnB,IAAI,KAAK,EAAE;wBACT,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;qBACxB;oBAED,gBAAgB,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;gBACrC,CAAC,CACF,CAAC;aACH;QACH,CAAC,CAAC,CAAC;IACL,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,cAAc,CACrB,GAAa,EACb,MAAc,EACd,SAAiB,EACjB,QAAgB,EAChB,WAAoB,EACpB,OAAkB,EAClB,QAAmB;IAEnB,MAAM,GAAG,GAAe;QACtB,GAAG;QACH,MAAM;QACN,SAAS;QACT,UAAU,EAAE,IAAI,IAAI,EAAE;QACtB,QAAQ;KACT,CAAC;IAEF,IAAI,WAAW,EAAE;QACf,GAAG,CAAC,WAAW,GAAG,WAAW,CAAC;KAC/B;IAED,IAAI,OAAO,EAAE;QACX,GAAG,CAAC,OAAO,GAAG,OAAO,CAAC;KACvB;IAED,IAAI,QAAQ,EAAE;QACZ,GAAG,CAAC,QAAQ,GAAG,QAAQ,CAAC;KACzB;IAED,OAAO,GAAG,CAAC;AACb,CAAC;AAED,SAAS,OAAO,CACd,MAA+B,EAC/B,KAAsB,EACtB,QAAyB,EACzB,QAAyB;IAEzB,IAAI,YAAY,CAAC,MAAM,EAAE,QAAQ,CAAC,EAAE;QAClC,OAAO,KAAK,CAAC;KACd;IAED,MAAM,QAAQ,GAAG,MAAM,CAAC,QAAQ,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,KAAK,EAAE,QAAQ,CAAC,CAAC;IAE/E,MAAM,CAAC,MAAM,IAAI,QAAQ,CAAC,MAAM,CAAC;IAEjC,6CAA6C;IAC7C,IAAI,MAAM,CAAC,GAAG,GAAG,QAAQ,CAAC,MAAM,GAAG,MAAM,CAAC,cAAc,EAAE;QACxD,QAAQ,CAAC,IAAI,CAAC,MAAM,CAAC,UAAU,EAAE,MAAM,CAAC,GAAG,CAAC,CAAC;QAC7C,MAAM,CAAC,GAAG,IAAI,QAAQ,CAAC,MAAM,CAAC;QAE9B,QAAQ,IAAI,QAAQ,EAAE,CAAC;QAEvB,qEAAqE;QACrE,mDAAmD;QACnD,sCAAsC;QACtC,OAAO,IAAI,CAAC;KACb;IAED,sEAAsE;IACtE,cAAc;IACd,IAAI,iBAAiB,GAAG,QAAQ,CAAC,MAAM,CAAC;IACxC,IAAI,cAAc,GAAW,MAAM,CAAC,cAAc,GAAG,MAAM,CAAC,GAAG,CAAC;IAChE,IAAI,SAAS,GAAG,IAAI,CAAC,GAAG,CAAC,cAAc,EAAE,QAAQ,CAAC,MAAM,CAAC,CAAC;IAC1D,IAAI,mBAAmB,GAAG,CAAC,CAAC;IAC5B,OAAO,iBAAiB,GAAG,CAAC,EAAE;QAC5B,MAAM,WAAW,GAAG,QAAQ,CAAC,MAAM,GAAG,iBAAiB,CAAC;QACxD,QAAQ,CAAC,IAAI,CAAC,MAAM,CAAC,UAAU,EAAE,MAAM,CAAC,GAAG,EAAE,WAAW,EAAE,WAAW,GAAG,SAAS,CAAC,CAAC;QACnF,MAAM,CAAC,GAAG,IAAI,SAAS,CAAC;QACxB,cAAc,IAAI,SAAS,CAAC;QAC5B,IAAI,GAAgB,CAAC;QACrB,IAAI,cAAc,KAAK,CAAC,EAAE;YACxB,GAAG,GAAG,cAAc,CAAC,MAAM,CAAC,EAAE,EAAE,MAAM,CAAC,CAAC,EAAE,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC;YAC1E,EAAE,MAAM,CAAC,KAAK,CAAC,mBAAmB,CAAC;YACnC,EAAE,mBAAmB,CAAC;YAEtB,IAAI,YAAY,CAAC,MAAM,EAAE,QAAQ,CAAC,EAAE;gBAClC,OAAO,KAAK,CAAC;aACd;YAED,MAAM,CAAC,MAAM,CAAC,SAAS,CAAC,GAAG,EAAE,eAAe,CAAC,
MAAM,CAAC,EAAE,CAAC,KAAgB,EAAE,EAAE;gBACzE,IAAI,KAAK,EAAE;oBACT,OAAO,aAAa,CAAC,MAAM,EAAE,KAAK,CAAC,CAAC;iBACrC;gBACD,EAAE,MAAM,CAAC,KAAK,CAAC,mBAAmB,CAAC;gBACnC,EAAE,mBAAmB,CAAC;gBAEtB,IAAI,CAAC,mBAAmB,EAAE;oBACxB,MAAM,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,CAAC,CAAC;oBAC1B,QAAQ,IAAI,QAAQ,EAAE,CAAC;oBACvB,SAAS,CAAC,MAAM,CAAC,CAAC;iBACnB;YACH,CAAC,CAAC,CAAC;YAEH,cAAc,GAAG,MAAM,CAAC,cAAc,CAAC;YACvC,MAAM,CAAC,GAAG,GAAG,CAAC,CAAC;YACf,EAAE,MAAM,CAAC,CAAC,CAAC;SACZ;QACD,iBAAiB,IAAI,SAAS,CAAC;QAC/B,SAAS,GAAG,IAAI,CAAC,GAAG,CAAC,cAAc,EAAE,iBAAiB,CAAC,CAAC;KACzD;IAED,qEAAqE;IACrE,mDAAmD;IACnD,4DAA4D;IAC5D,OAAO,KAAK,CAAC;AACf,CAAC;AAED,SAAS,eAAe,CAAC,MAA+B;IACtD,MAAM,GAAG,GAAwB,EAAE,CAAC;IACpC,IAAI,MAAM,CAAC,YAAY,EAAE;QACvB,GAAG,CAAC,YAAY,GAAG;YACjB,CAAC,EAAE,MAAM,CAAC,YAAY,CAAC,CAAC;YACxB,QAAQ,EAAE,MAAM,CAAC,YAAY,CAAC,QAAQ;YACtC,CAAC,EAAE,MAAM,CAAC,YAAY,CAAC,CAAC;SACzB,CAAC;KACH;IACD,OAAO,GAAG,CAAC;AACb,CAAC;AAED,SAAS,cAAc,CACrB,MAA+B,EAC/B,QAAmC;IAEnC,IAAI,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,cAAc,EAAE;QAClC,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;KACxB;IAED,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,EAAE;QAC/B,QAAQ,CAAC,IAAI,CAAC,CAAC;IACjB,CAAC,CAAC,CAAC;IAEH,OAAO,IAAI,CAAC;AACd,CAAC;AAED,SAAS,YAAY,CAAC,MAA+B,EAAE,QAAmB;IACxE,6CAA6C;IAC7C,IAAI,MAAM,CAAC,GAAG,KAAK,CAAC,EAAE;QACpB,OAAO,SAAS,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;KACpC;IAED,EAAE,MAAM,CAAC,KAAK,CAAC,mBAAmB,CAAC;IAEnC,yEAAyE;IACzE,SAAS;IACT,MAAM,OAAO,GAAG,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;IACzC,MAAM,CAAC,UAAU,CAAC,IAAI,CAAC,OAAO,EAAE,CAAC,EAAE,CAAC,EAAE,MAAM,CAAC,GAAG,CAAC,CAAC;IAClD,MAAM,GAAG,GAAG,cAAc,CAAC,MAAM,CAAC,EAAE,EAAE,MAAM,CAAC,CAAC,EAAE,OAAO,CAAC,CAAC;IAEzD,kDAAkD;IAClD,IAAI,YAAY,CAAC,MAAM,EAAE,QAAQ,CAAC,EAAE;QAClC,OAAO,KAAK,CAAC;KACd;IAED,MAAM,CAAC,MAAM,CAAC,SAAS,CAAC,GAAG,EAAE,eAAe,CAAC,MAAM,CAAC,EAAE,CAAC,KAAgB,EAAE,EAAE;QACzE,IAAI,KAAK,EAAE;YACT,OAAO,aAAa,CAAC,MAAM,EAAE,KAAK,CAAC,CAAC;SACrC;QACD,EAAE,MAAM,CAAC,KAAK,CAAC,mBAAmB,CAAC;QACnC,SAAS,CAAC,MAAM,CAAC,CAAC;IACpB,CAAC,CAAC,CAAC;IACH,OAAO,IAAI,CAAC;AACd,CAAC;AAED,SAAS,YAAY,CAAC,MAA+B,EAAE,QAAyB;IAC9E,IAAI,MAAM,CAAC,KAAK,CAAC,OAAO,EAAE;QACxB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,6DAA6D;YAC7D,QAAQ,CAAC,IAAI,qBAAa,CAAC,yBAAyB,CAAC,CAAC,CAAC;SACxD;QACD,OAAO,IAAI,CAAC;KACb;IACD,OAAO,KAAK,CAAC;AACf,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/index.js b/node_modules/mongodb/lib/index.js new file mode 100644 index 000000000..901b079f1 --- /dev/null +++ b/node_modules/mongodb/lib/index.js @@ -0,0 +1,149 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Logger = exports.Collection = exports.Db = exports.MongoClient = exports.Admin = exports.Promise = exports.MongoBulkWriteError = exports.MongoTopologyClosedError = exports.MongoServerClosedError = exports.MongoKerberosError = exports.MongoTransactionError = exports.MongoExpiredSessionError = exports.MongoNotConnectedError = exports.MongoCursorInUseError = exports.MongoCursorExhaustedError = exports.MongoBatchReExecutionError = exports.MongoDecompressionError = exports.MongoGridFSChunkError = exports.MongoGridFSStreamError = exports.MongoChangeStreamError = exports.MongoRuntimeError = exports.MongoWriteConcernError = exports.MongoParseError = exports.MongoServerSelectionError = exports.MongoSystemError = exports.MongoNetworkTimeoutError = exports.MongoNetworkError = exports.MongoMissingDependencyError = exports.MongoMissingCredentialsError = exports.MongoInvalidArgumentError = exports.MongoCompatibilityError = exports.MongoAPIError = exports.MongoDriverError = 
exports.MongoServerError = exports.MongoError = exports.ObjectID = exports.Map = exports.BSONSymbol = exports.BSONRegExp = exports.Decimal128 = exports.Timestamp = exports.ObjectId = exports.MaxKey = exports.MinKey = exports.Long = exports.Int32 = exports.Double = exports.DBRef = exports.Code = exports.Binary = void 0; +exports.SrvPollingEvent = exports.TopologyOpeningEvent = exports.TopologyDescriptionChangedEvent = exports.TopologyClosedEvent = exports.ServerOpeningEvent = exports.ServerDescriptionChangedEvent = exports.ServerClosedEvent = exports.ServerHeartbeatFailedEvent = exports.ServerHeartbeatSucceededEvent = exports.ServerHeartbeatStartedEvent = exports.ConnectionReadyEvent = exports.ConnectionPoolMonitoringEvent = exports.ConnectionPoolCreatedEvent = exports.ConnectionPoolClosedEvent = exports.ConnectionPoolClearedEvent = exports.ConnectionCreatedEvent = exports.ConnectionClosedEvent = exports.ConnectionCheckedOutEvent = exports.ConnectionCheckedInEvent = exports.ConnectionCheckOutStartedEvent = exports.ConnectionCheckOutFailedEvent = exports.CommandFailedEvent = exports.CommandSucceededEvent = exports.CommandStartedEvent = exports.ReadPreference = exports.ReadConcern = exports.WriteConcern = exports.BSONType = exports.ServerApiVersion = exports.ReadPreferenceMode = exports.ReadConcernLevel = exports.ExplainVerbosity = exports.ReturnDocument = exports.Compressor = exports.CURSOR_FLAGS = exports.AuthMechanism = exports.BatchType = exports.AutoEncryptionLoggerLevel = exports.LoggerLevel = exports.TopologyType = exports.ServerType = exports.ProfilingLevel = exports.CancellationToken = exports.GridFSBucket = exports.ListCollectionsCursor = exports.ListIndexesCursor = exports.FindCursor = exports.AggregationCursor = exports.AbstractCursor = void 0; +const abstract_cursor_1 = require("./cursor/abstract_cursor"); +Object.defineProperty(exports, "AbstractCursor", { enumerable: true, get: function () { return abstract_cursor_1.AbstractCursor; } }); +const aggregation_cursor_1 = require("./cursor/aggregation_cursor"); +Object.defineProperty(exports, "AggregationCursor", { enumerable: true, get: function () { return aggregation_cursor_1.AggregationCursor; } }); +const find_cursor_1 = require("./cursor/find_cursor"); +Object.defineProperty(exports, "FindCursor", { enumerable: true, get: function () { return find_cursor_1.FindCursor; } }); +const indexes_1 = require("./operations/indexes"); +Object.defineProperty(exports, "ListIndexesCursor", { enumerable: true, get: function () { return indexes_1.ListIndexesCursor; } }); +const list_collections_1 = require("./operations/list_collections"); +Object.defineProperty(exports, "ListCollectionsCursor", { enumerable: true, get: function () { return list_collections_1.ListCollectionsCursor; } }); +const promise_provider_1 = require("./promise_provider"); +Object.defineProperty(exports, "Promise", { enumerable: true, get: function () { return promise_provider_1.PromiseProvider; } }); +const admin_1 = require("./admin"); +Object.defineProperty(exports, "Admin", { enumerable: true, get: function () { return admin_1.Admin; } }); +const mongo_client_1 = require("./mongo_client"); +Object.defineProperty(exports, "MongoClient", { enumerable: true, get: function () { return mongo_client_1.MongoClient; } }); +const db_1 = require("./db"); +Object.defineProperty(exports, "Db", { enumerable: true, get: function () { return db_1.Db; } }); +const collection_1 = require("./collection"); +Object.defineProperty(exports, "Collection", { enumerable: true, get: 
function () { return collection_1.Collection; } }); +const logger_1 = require("./logger"); +Object.defineProperty(exports, "Logger", { enumerable: true, get: function () { return logger_1.Logger; } }); +const gridfs_1 = require("./gridfs"); +Object.defineProperty(exports, "GridFSBucket", { enumerable: true, get: function () { return gridfs_1.GridFSBucket; } }); +const mongo_types_1 = require("./mongo_types"); +Object.defineProperty(exports, "CancellationToken", { enumerable: true, get: function () { return mongo_types_1.CancellationToken; } }); +var bson_1 = require("./bson"); +Object.defineProperty(exports, "Binary", { enumerable: true, get: function () { return bson_1.Binary; } }); +Object.defineProperty(exports, "Code", { enumerable: true, get: function () { return bson_1.Code; } }); +Object.defineProperty(exports, "DBRef", { enumerable: true, get: function () { return bson_1.DBRef; } }); +Object.defineProperty(exports, "Double", { enumerable: true, get: function () { return bson_1.Double; } }); +Object.defineProperty(exports, "Int32", { enumerable: true, get: function () { return bson_1.Int32; } }); +Object.defineProperty(exports, "Long", { enumerable: true, get: function () { return bson_1.Long; } }); +Object.defineProperty(exports, "MinKey", { enumerable: true, get: function () { return bson_1.MinKey; } }); +Object.defineProperty(exports, "MaxKey", { enumerable: true, get: function () { return bson_1.MaxKey; } }); +Object.defineProperty(exports, "ObjectId", { enumerable: true, get: function () { return bson_1.ObjectId; } }); +Object.defineProperty(exports, "Timestamp", { enumerable: true, get: function () { return bson_1.Timestamp; } }); +Object.defineProperty(exports, "Decimal128", { enumerable: true, get: function () { return bson_1.Decimal128; } }); +Object.defineProperty(exports, "BSONRegExp", { enumerable: true, get: function () { return bson_1.BSONRegExp; } }); +Object.defineProperty(exports, "BSONSymbol", { enumerable: true, get: function () { return bson_1.BSONSymbol; } }); +Object.defineProperty(exports, "Map", { enumerable: true, get: function () { return bson_1.Map; } }); +const bson_2 = require("bson"); +/** + * @public + * @deprecated Please use `ObjectId` + */ +exports.ObjectID = bson_2.ObjectId; +var error_1 = require("./error"); +Object.defineProperty(exports, "MongoError", { enumerable: true, get: function () { return error_1.MongoError; } }); +Object.defineProperty(exports, "MongoServerError", { enumerable: true, get: function () { return error_1.MongoServerError; } }); +Object.defineProperty(exports, "MongoDriverError", { enumerable: true, get: function () { return error_1.MongoDriverError; } }); +Object.defineProperty(exports, "MongoAPIError", { enumerable: true, get: function () { return error_1.MongoAPIError; } }); +Object.defineProperty(exports, "MongoCompatibilityError", { enumerable: true, get: function () { return error_1.MongoCompatibilityError; } }); +Object.defineProperty(exports, "MongoInvalidArgumentError", { enumerable: true, get: function () { return error_1.MongoInvalidArgumentError; } }); +Object.defineProperty(exports, "MongoMissingCredentialsError", { enumerable: true, get: function () { return error_1.MongoMissingCredentialsError; } }); +Object.defineProperty(exports, "MongoMissingDependencyError", { enumerable: true, get: function () { return error_1.MongoMissingDependencyError; } }); +Object.defineProperty(exports, "MongoNetworkError", { enumerable: true, get: function () { return error_1.MongoNetworkError; } }); +Object.defineProperty(exports, 
"MongoNetworkTimeoutError", { enumerable: true, get: function () { return error_1.MongoNetworkTimeoutError; } }); +Object.defineProperty(exports, "MongoSystemError", { enumerable: true, get: function () { return error_1.MongoSystemError; } }); +Object.defineProperty(exports, "MongoServerSelectionError", { enumerable: true, get: function () { return error_1.MongoServerSelectionError; } }); +Object.defineProperty(exports, "MongoParseError", { enumerable: true, get: function () { return error_1.MongoParseError; } }); +Object.defineProperty(exports, "MongoWriteConcernError", { enumerable: true, get: function () { return error_1.MongoWriteConcernError; } }); +Object.defineProperty(exports, "MongoRuntimeError", { enumerable: true, get: function () { return error_1.MongoRuntimeError; } }); +Object.defineProperty(exports, "MongoChangeStreamError", { enumerable: true, get: function () { return error_1.MongoChangeStreamError; } }); +Object.defineProperty(exports, "MongoGridFSStreamError", { enumerable: true, get: function () { return error_1.MongoGridFSStreamError; } }); +Object.defineProperty(exports, "MongoGridFSChunkError", { enumerable: true, get: function () { return error_1.MongoGridFSChunkError; } }); +Object.defineProperty(exports, "MongoDecompressionError", { enumerable: true, get: function () { return error_1.MongoDecompressionError; } }); +Object.defineProperty(exports, "MongoBatchReExecutionError", { enumerable: true, get: function () { return error_1.MongoBatchReExecutionError; } }); +Object.defineProperty(exports, "MongoCursorExhaustedError", { enumerable: true, get: function () { return error_1.MongoCursorExhaustedError; } }); +Object.defineProperty(exports, "MongoCursorInUseError", { enumerable: true, get: function () { return error_1.MongoCursorInUseError; } }); +Object.defineProperty(exports, "MongoNotConnectedError", { enumerable: true, get: function () { return error_1.MongoNotConnectedError; } }); +Object.defineProperty(exports, "MongoExpiredSessionError", { enumerable: true, get: function () { return error_1.MongoExpiredSessionError; } }); +Object.defineProperty(exports, "MongoTransactionError", { enumerable: true, get: function () { return error_1.MongoTransactionError; } }); +Object.defineProperty(exports, "MongoKerberosError", { enumerable: true, get: function () { return error_1.MongoKerberosError; } }); +Object.defineProperty(exports, "MongoServerClosedError", { enumerable: true, get: function () { return error_1.MongoServerClosedError; } }); +Object.defineProperty(exports, "MongoTopologyClosedError", { enumerable: true, get: function () { return error_1.MongoTopologyClosedError; } }); +var common_1 = require("./bulk/common"); +Object.defineProperty(exports, "MongoBulkWriteError", { enumerable: true, get: function () { return common_1.MongoBulkWriteError; } }); +// enums +var set_profiling_level_1 = require("./operations/set_profiling_level"); +Object.defineProperty(exports, "ProfilingLevel", { enumerable: true, get: function () { return set_profiling_level_1.ProfilingLevel; } }); +var common_2 = require("./sdam/common"); +Object.defineProperty(exports, "ServerType", { enumerable: true, get: function () { return common_2.ServerType; } }); +Object.defineProperty(exports, "TopologyType", { enumerable: true, get: function () { return common_2.TopologyType; } }); +var logger_2 = require("./logger"); +Object.defineProperty(exports, "LoggerLevel", { enumerable: true, get: function () { return logger_2.LoggerLevel; } }); +var deps_1 = require("./deps"); 
+Object.defineProperty(exports, "AutoEncryptionLoggerLevel", { enumerable: true, get: function () { return deps_1.AutoEncryptionLoggerLevel; } }); +var common_3 = require("./bulk/common"); +Object.defineProperty(exports, "BatchType", { enumerable: true, get: function () { return common_3.BatchType; } }); +var defaultAuthProviders_1 = require("./cmap/auth/defaultAuthProviders"); +Object.defineProperty(exports, "AuthMechanism", { enumerable: true, get: function () { return defaultAuthProviders_1.AuthMechanism; } }); +var abstract_cursor_2 = require("./cursor/abstract_cursor"); +Object.defineProperty(exports, "CURSOR_FLAGS", { enumerable: true, get: function () { return abstract_cursor_2.CURSOR_FLAGS; } }); +var compression_1 = require("./cmap/wire_protocol/compression"); +Object.defineProperty(exports, "Compressor", { enumerable: true, get: function () { return compression_1.Compressor; } }); +var find_and_modify_1 = require("./operations/find_and_modify"); +Object.defineProperty(exports, "ReturnDocument", { enumerable: true, get: function () { return find_and_modify_1.ReturnDocument; } }); +var explain_1 = require("./explain"); +Object.defineProperty(exports, "ExplainVerbosity", { enumerable: true, get: function () { return explain_1.ExplainVerbosity; } }); +var read_concern_1 = require("./read_concern"); +Object.defineProperty(exports, "ReadConcernLevel", { enumerable: true, get: function () { return read_concern_1.ReadConcernLevel; } }); +var read_preference_1 = require("./read_preference"); +Object.defineProperty(exports, "ReadPreferenceMode", { enumerable: true, get: function () { return read_preference_1.ReadPreferenceMode; } }); +var mongo_client_2 = require("./mongo_client"); +Object.defineProperty(exports, "ServerApiVersion", { enumerable: true, get: function () { return mongo_client_2.ServerApiVersion; } }); +var mongo_types_2 = require("./mongo_types"); +Object.defineProperty(exports, "BSONType", { enumerable: true, get: function () { return mongo_types_2.BSONType; } }); +// Helper classes +var write_concern_1 = require("./write_concern"); +Object.defineProperty(exports, "WriteConcern", { enumerable: true, get: function () { return write_concern_1.WriteConcern; } }); +var read_concern_2 = require("./read_concern"); +Object.defineProperty(exports, "ReadConcern", { enumerable: true, get: function () { return read_concern_2.ReadConcern; } }); +var read_preference_2 = require("./read_preference"); +Object.defineProperty(exports, "ReadPreference", { enumerable: true, get: function () { return read_preference_2.ReadPreference; } }); +// events +var command_monitoring_events_1 = require("./cmap/command_monitoring_events"); +Object.defineProperty(exports, "CommandStartedEvent", { enumerable: true, get: function () { return command_monitoring_events_1.CommandStartedEvent; } }); +Object.defineProperty(exports, "CommandSucceededEvent", { enumerable: true, get: function () { return command_monitoring_events_1.CommandSucceededEvent; } }); +Object.defineProperty(exports, "CommandFailedEvent", { enumerable: true, get: function () { return command_monitoring_events_1.CommandFailedEvent; } }); +var connection_pool_events_1 = require("./cmap/connection_pool_events"); +Object.defineProperty(exports, "ConnectionCheckOutFailedEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionCheckOutFailedEvent; } }); +Object.defineProperty(exports, "ConnectionCheckOutStartedEvent", { enumerable: true, get: function () { return 
connection_pool_events_1.ConnectionCheckOutStartedEvent; } }); +Object.defineProperty(exports, "ConnectionCheckedInEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionCheckedInEvent; } }); +Object.defineProperty(exports, "ConnectionCheckedOutEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionCheckedOutEvent; } }); +Object.defineProperty(exports, "ConnectionClosedEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionClosedEvent; } }); +Object.defineProperty(exports, "ConnectionCreatedEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionCreatedEvent; } }); +Object.defineProperty(exports, "ConnectionPoolClearedEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionPoolClearedEvent; } }); +Object.defineProperty(exports, "ConnectionPoolClosedEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionPoolClosedEvent; } }); +Object.defineProperty(exports, "ConnectionPoolCreatedEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionPoolCreatedEvent; } }); +Object.defineProperty(exports, "ConnectionPoolMonitoringEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionPoolMonitoringEvent; } }); +Object.defineProperty(exports, "ConnectionReadyEvent", { enumerable: true, get: function () { return connection_pool_events_1.ConnectionReadyEvent; } }); +var events_1 = require("./sdam/events"); +Object.defineProperty(exports, "ServerHeartbeatStartedEvent", { enumerable: true, get: function () { return events_1.ServerHeartbeatStartedEvent; } }); +Object.defineProperty(exports, "ServerHeartbeatSucceededEvent", { enumerable: true, get: function () { return events_1.ServerHeartbeatSucceededEvent; } }); +Object.defineProperty(exports, "ServerHeartbeatFailedEvent", { enumerable: true, get: function () { return events_1.ServerHeartbeatFailedEvent; } }); +Object.defineProperty(exports, "ServerClosedEvent", { enumerable: true, get: function () { return events_1.ServerClosedEvent; } }); +Object.defineProperty(exports, "ServerDescriptionChangedEvent", { enumerable: true, get: function () { return events_1.ServerDescriptionChangedEvent; } }); +Object.defineProperty(exports, "ServerOpeningEvent", { enumerable: true, get: function () { return events_1.ServerOpeningEvent; } }); +Object.defineProperty(exports, "TopologyClosedEvent", { enumerable: true, get: function () { return events_1.TopologyClosedEvent; } }); +Object.defineProperty(exports, "TopologyDescriptionChangedEvent", { enumerable: true, get: function () { return events_1.TopologyDescriptionChangedEvent; } }); +Object.defineProperty(exports, "TopologyOpeningEvent", { enumerable: true, get: function () { return events_1.TopologyOpeningEvent; } }); +var srv_polling_1 = require("./sdam/srv_polling"); +Object.defineProperty(exports, "SrvPollingEvent", { enumerable: true, get: function () { return srv_polling_1.SrvPollingEvent; } }); +//# sourceMappingURL=index.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/index.js.map b/node_modules/mongodb/lib/index.js.map new file mode 100644 index 000000000..67e42a514 --- /dev/null +++ b/node_modules/mongodb/lib/index.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"index.js","sourceRoot":"","sources":["../src/index.ts"],"names":[],"mappings":";;;;AAAA,8DAA0D;AA8ExD,+FA9EO,gCAAc,OA8EP;AA7EhB,oEAAgE;AA8E9D,kGA9EO,sCAAiB,OA8EP;AA7EnB,sDAAkD;AA8EhD,2FA9EO,wBAAU,OA8EP;AA7EZ,kDAAyD;AA8EvD,kGA9EO,2BAAiB,OA8EP;AA7EnB,oEAAsE;AA8EpE,sGA9EO,wCAAqB,OA8EP;AA7EvB,yDAAqD;AAkEhC,wFAlEZ,kCAAe,OAkEI;AAjE5B,mCAAgC;AAmE9B,sFAnEO,aAAK,OAmEP;AAlEP,iDAA6C;AAmE3C,4FAnEO,0BAAW,OAmEP;AAlEb,6BAA0B;AAmExB,mFAnEO,OAAE,OAmEP;AAlEJ,6CAA0C;AAmExC,2FAnEO,uBAAU,OAmEP;AAlEZ,qCAAkC;AAmEhC,uFAnEO,eAAM,OAmEP;AAlER,qCAAwC;AAwEtC,6FAxEO,qBAAY,OAwEP;AAvEd,+CAAkD;AAwEhD,kGAxEO,+BAAiB,OAwEP;AAtEnB,+BAegB;AAdd,8FAAA,MAAM,OAAA;AACN,4FAAA,IAAI,OAAA;AACJ,6FAAA,KAAK,OAAA;AACL,8FAAA,MAAM,OAAA;AACN,6FAAA,KAAK,OAAA;AACL,4FAAA,IAAI,OAAA;AACJ,8FAAA,MAAM,OAAA;AACN,8FAAA,MAAM,OAAA;AACN,gGAAA,QAAQ,OAAA;AACR,iGAAA,SAAS,OAAA;AACT,kGAAA,UAAU,OAAA;AACV,kGAAA,UAAU,OAAA;AACV,kGAAA,UAAU,OAAA;AACV,2FAAA,GAAG,OAAA;AAGL,+BAAgC;AAChC;;;GAGG;AACU,QAAA,QAAQ,GAAG,eAAQ,CAAC;AAEjC,iCA6BiB;AA5Bf,mGAAA,UAAU,OAAA;AACV,yGAAA,gBAAgB,OAAA;AAChB,yGAAA,gBAAgB,OAAA;AAChB,sGAAA,aAAa,OAAA;AACb,gHAAA,uBAAuB,OAAA;AACvB,kHAAA,yBAAyB,OAAA;AACzB,qHAAA,4BAA4B,OAAA;AAC5B,oHAAA,2BAA2B,OAAA;AAC3B,0GAAA,iBAAiB,OAAA;AACjB,iHAAA,wBAAwB,OAAA;AACxB,yGAAA,gBAAgB,OAAA;AAChB,kHAAA,yBAAyB,OAAA;AACzB,wGAAA,eAAe,OAAA;AACf,+GAAA,sBAAsB,OAAA;AACtB,0GAAA,iBAAiB,OAAA;AACjB,+GAAA,sBAAsB,OAAA;AACtB,+GAAA,sBAAsB,OAAA;AACtB,8GAAA,qBAAqB,OAAA;AACrB,gHAAA,uBAAuB,OAAA;AACvB,mHAAA,0BAA0B,OAAA;AAC1B,kHAAA,yBAAyB,OAAA;AACzB,8GAAA,qBAAqB,OAAA;AACrB,+GAAA,sBAAsB,OAAA;AACtB,iHAAA,wBAAwB,OAAA;AACxB,8GAAA,qBAAqB,OAAA;AACrB,2GAAA,kBAAkB,OAAA;AAClB,+GAAA,sBAAsB,OAAA;AACtB,iHAAA,wBAAwB,OAAA;AAE1B,wCAA6F;AAApF,6GAAA,mBAAmB,OAAA;AAmB5B,QAAQ;AACR,wEAAkE;AAAzD,qHAAA,cAAc,OAAA;AACvB,wCAAyD;AAAhD,oGAAA,UAAU,OAAA;AAAE,sGAAA,YAAY,OAAA;AACjC,mCAAuC;AAA9B,qGAAA,WAAW,OAAA;AACpB,+BAAmD;AAA1C,iHAAA,yBAAyB,OAAA;AAClC,wCAA0C;AAAjC,mGAAA,SAAS,OAAA;AAClB,yEAAiE;AAAxD,qHAAA,aAAa,OAAA;AACtB,4DAAwD;AAA/C,+GAAA,YAAY,OAAA;AACrB,gEAA8D;AAArD,yGAAA,UAAU,OAAA;AACnB,gEAA8D;AAArD,iHAAA,cAAc,OAAA;AACvB,qCAA6C;AAApC,2GAAA,gBAAgB,OAAA;AACzB,+CAAkD;AAAzC,gHAAA,gBAAgB,OAAA;AACzB,qDAAuD;AAA9C,qHAAA,kBAAkB,OAAA;AAC3B,+CAAkD;AAAzC,gHAAA,gBAAgB,OAAA;AACzB,6CAAyC;AAAhC,uGAAA,QAAQ,OAAA;AAEjB,iBAAiB;AACjB,iDAA+C;AAAtC,6GAAA,YAAY,OAAA;AACrB,+CAA6C;AAApC,2GAAA,WAAW,OAAA;AACpB,qDAAmD;AAA1C,iHAAA,cAAc,OAAA;AAEvB,SAAS;AACT,8EAI0C;AAHxC,gIAAA,mBAAmB,OAAA;AACnB,kIAAA,qBAAqB,OAAA;AACrB,+HAAA,kBAAkB,OAAA;AAEpB,wEAYuC;AAXrC,uIAAA,6BAA6B,OAAA;AAC7B,wIAAA,8BAA8B,OAAA;AAC9B,kIAAA,wBAAwB,OAAA;AACxB,mIAAA,yBAAyB,OAAA;AACzB,+HAAA,qBAAqB,OAAA;AACrB,gIAAA,sBAAsB,OAAA;AACtB,oIAAA,0BAA0B,OAAA;AAC1B,mIAAA,yBAAyB,OAAA;AACzB,oIAAA,0BAA0B,OAAA;AAC1B,uIAAA,6BAA6B,OAAA;AAC7B,8HAAA,oBAAoB,OAAA;AAEtB,wCAUuB;AATrB,qHAAA,2BAA2B,OAAA;AAC3B,uHAAA,6BAA6B,OAAA;AAC7B,oHAAA,0BAA0B,OAAA;AAC1B,2GAAA,iBAAiB,OAAA;AACjB,uHAAA,6BAA6B,OAAA;AAC7B,4GAAA,kBAAkB,OAAA;AAClB,6GAAA,mBAAmB,OAAA;AACnB,yHAAA,+BAA+B,OAAA;AAC/B,8GAAA,oBAAoB,OAAA;AAEtB,kDAAqD;AAA5C,8GAAA,eAAe,OAAA"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/logger.js b/node_modules/mongodb/lib/logger.js new file mode 100644 index 000000000..94496c4ac --- /dev/null +++ b/node_modules/mongodb/lib/logger.js @@ -0,0 +1,217 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.Logger = exports.LoggerLevel = void 0; +const util_1 = require("util"); +const utils_1 = require("./utils"); +const error_1 = require("./error"); +// Filters for classes +const classFilters = {}; +let filteredClasses = {}; +let level; +// Save the 
process id +const pid = process.pid; +// current logger +// eslint-disable-next-line no-console +let currentLogger = console.warn; +/** @public */ +exports.LoggerLevel = Object.freeze({ + ERROR: 'error', + WARN: 'warn', + INFO: 'info', + DEBUG: 'debug', + error: 'error', + warn: 'warn', + info: 'info', + debug: 'debug' +}); +/** + * @public + */ +class Logger { + /** + * Creates a new Logger instance + * + * @param className - The Class name associated with the logging instance + * @param options - Optional logging settings + */ + constructor(className, options) { + options = options !== null && options !== void 0 ? options : {}; + // Current reference + this.className = className; + // Current logger + if (!(options.logger instanceof Logger) && typeof options.logger === 'function') { + currentLogger = options.logger; + } + // Set level of logging, default is error + if (options.loggerLevel) { + level = options.loggerLevel || exports.LoggerLevel.ERROR; + } + // Add all class names + if (filteredClasses[this.className] == null) { + classFilters[this.className] = true; + } + } + /** + * Log a message at the debug level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + debug(message, object) { + if (this.isDebug() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className]))) { + const dateTime = new Date().getTime(); + const msg = util_1.format('[%s-%s:%s] %s %s', 'DEBUG', this.className, pid, dateTime, message); + const state = { + type: exports.LoggerLevel.DEBUG, + message, + className: this.className, + pid, + date: dateTime + }; + if (object) + state.meta = object; + currentLogger(msg, state); + } + } + /** + * Log a message at the warn level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + warn(message, object) { + if (this.isWarn() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className]))) { + const dateTime = new Date().getTime(); + const msg = util_1.format('[%s-%s:%s] %s %s', 'WARN', this.className, pid, dateTime, message); + const state = { + type: exports.LoggerLevel.WARN, + message, + className: this.className, + pid, + date: dateTime + }; + if (object) + state.meta = object; + currentLogger(msg, state); + } + } + /** + * Log a message at the info level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + info(message, object) { + if (this.isInfo() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className]))) { + const dateTime = new Date().getTime(); + const msg = util_1.format('[%s-%s:%s] %s %s', 'INFO', this.className, pid, dateTime, message); + const state = { + type: exports.LoggerLevel.INFO, + message, + className: this.className, + pid, + date: dateTime + }; + if (object) + state.meta = object; + currentLogger(msg, state); + } + } + /** + * Log a message at the error level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + error(message, object) { + if (this.isError() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className]))) { + const dateTime = new Date().getTime(); + 
const msg = util_1.format('[%s-%s:%s] %s %s', 'ERROR', this.className, pid, dateTime, message); + const state = { + type: exports.LoggerLevel.ERROR, + message, + className: this.className, + pid, + date: dateTime + }; + if (object) + state.meta = object; + currentLogger(msg, state); + } + } + /** Is the logger set at info level */ + isInfo() { + return level === exports.LoggerLevel.INFO || level === exports.LoggerLevel.DEBUG; + } + /** Is the logger set at error level */ + isError() { + return level === exports.LoggerLevel.ERROR || level === exports.LoggerLevel.INFO || level === exports.LoggerLevel.DEBUG; + } + /** Is the logger set at error level */ + isWarn() { + return (level === exports.LoggerLevel.ERROR || + level === exports.LoggerLevel.WARN || + level === exports.LoggerLevel.INFO || + level === exports.LoggerLevel.DEBUG); + } + /** Is the logger set at debug level */ + isDebug() { + return level === exports.LoggerLevel.DEBUG; + } + /** Resets the logger to default settings, error and no filtered classes */ + static reset() { + level = exports.LoggerLevel.ERROR; + filteredClasses = {}; + } + /** Get the current logger function */ + static currentLogger() { + return currentLogger; + } + /** + * Set the current logger function + * + * @param logger - Custom logging function + */ + static setCurrentLogger(logger) { + if (typeof logger !== 'function') { + throw new error_1.MongoInvalidArgumentError('Current logger must be a function'); + } + currentLogger = logger; + } + /** + * Filter log messages for a particular class + * + * @param type - The type of filter (currently only class) + * @param values - The filters to apply + */ + static filter(type, values) { + if (type === 'class' && Array.isArray(values)) { + filteredClasses = {}; + values.forEach(x => (filteredClasses[x] = true)); + } + } + /** + * Set the current log level + * + * @param newLevel - Set current log level (debug, warn, info, error) + */ + static setLevel(newLevel) { + if (newLevel !== exports.LoggerLevel.INFO && + newLevel !== exports.LoggerLevel.ERROR && + newLevel !== exports.LoggerLevel.DEBUG && + newLevel !== exports.LoggerLevel.WARN) { + throw new error_1.MongoInvalidArgumentError(`Argument "newLevel" should be one of ${utils_1.enumToString(exports.LoggerLevel)}`); + } + level = newLevel; + } +} +exports.Logger = Logger; +//# sourceMappingURL=logger.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/logger.js.map b/node_modules/mongodb/lib/logger.js.map new file mode 100644 index 000000000..7aeb630f7 --- /dev/null +++ b/node_modules/mongodb/lib/logger.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"logger.js","sourceRoot":"","sources":["../src/logger.ts"],"names":[],"mappings":";;;AAAA,+BAA8B;AAC9B,mCAAuC;AACvC,mCAAoD;AAEpD,sBAAsB;AACtB,MAAM,YAAY,GAAQ,EAAE,CAAC;AAC7B,IAAI,eAAe,GAAQ,EAAE,CAAC;AAC9B,IAAI,KAAkB,CAAC;AAEvB,sBAAsB;AACtB,MAAM,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;AAExB,iBAAiB;AACjB,sCAAsC;AACtC,IAAI,aAAa,GAAmB,OAAO,CAAC,IAAI,CAAC;AAEjD,cAAc;AACD,QAAA,WAAW,GAAG,MAAM,CAAC,MAAM,CAAC;IACvC,KAAK,EAAE,OAAO;IACd,IAAI,EAAE,MAAM;IACZ,IAAI,EAAE,MAAM;IACZ,KAAK,EAAE,OAAO;IACd,KAAK,EAAE,OAAO;IACd,IAAI,EAAE,MAAM;IACZ,IAAI,EAAE,MAAM;IACZ,KAAK,EAAE,OAAO;CACN,CAAC,CAAC;AAcZ;;GAEG;AACH,MAAa,MAAM;IAGjB;;;;;OAKG;IACH,YAAY,SAAiB,EAAE,OAAuB;QACpD,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,oBAAoB;QACpB,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;QAE3B,iBAAiB;QACjB,IAAI,CAAC,CAAC,OAAO,CAAC,MAAM,YAAY,MAAM,CAAC,IAAI,OAAO,OAAO,CAAC,MAAM,KAAK,UAAU,EAAE;YAC/E,aAAa,GAAG,OAAO,CAAC,MAAM,CAAC;SAChC;QAED,yCAAyC;QACzC,IAAI,OAAO,CAAC,WAAW,EAAE;YACvB,KAAK,GAAG,OAAO,CAAC,WAAW,IAAI,mBAAW,CAAC,KAAK,CAAC;SAClD;QAED,sBAAsB;QACtB,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,IAAI,IAAI,EAAE;YAC3C,YAAY,CAAC,IAAI,CAAC,SAAS,CAAC,GAAG,IAAI,CAAC;SACrC;IACH,CAAC;IAED;;;;;OAKG;IACH,KAAK,CAAC,OAAe,EAAE,MAAgB;QACrC,IACE,IAAI,CAAC,OAAO,EAAE;YACd,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBAC3E,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,YAAY,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,EAC9E;YACA,MAAM,QAAQ,GAAG,IAAI,IAAI,EAAE,CAAC,OAAO,EAAE,CAAC;YACtC,MAAM,GAAG,GAAG,aAAM,CAAC,kBAAkB,EAAE,OAAO,EAAE,IAAI,CAAC,SAAS,EAAE,GAAG,EAAE,QAAQ,EAAE,OAAO,CAAC,CAAC;YACxF,MAAM,KAAK,GAAG;gBACZ,IAAI,EAAE,mBAAW,CAAC,KAAK;gBACvB,OAAO;gBACP,SAAS,EAAE,IAAI,CAAC,SAAS;gBACzB,GAAG;gBACH,IAAI,EAAE,QAAQ;aACR,CAAC;YAET,IAAI,MAAM;gBAAE,KAAK,CAAC,IAAI,GAAG,MAAM,CAAC;YAChC,aAAa,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;SAC3B;IACH,CAAC;IAED;;;;;OAKG;IACH,IAAI,CAAC,OAAe,EAAE,MAAgB;QACpC,IACE,IAAI,CAAC,MAAM,EAAE;YACb,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBAC3E,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,YAAY,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,EAC9E;YACA,MAAM,QAAQ,GAAG,IAAI,IAAI,EAAE,CAAC,OAAO,EAAE,CAAC;YACtC,MAAM,GAAG,GAAG,aAAM,CAAC,kBAAkB,EAAE,MAAM,EAAE,IAAI,CAAC,SAAS,EAAE,GAAG,EAAE,QAAQ,EAAE,OAAO,CAAC,CAAC;YACvF,MAAM,KAAK,GAAG;gBACZ,IAAI,EAAE,mBAAW,CAAC,IAAI;gBACtB,OAAO;gBACP,SAAS,EAAE,IAAI,CAAC,SAAS;gBACzB,GAAG;gBACH,IAAI,EAAE,QAAQ;aACR,CAAC;YAET,IAAI,MAAM;gBAAE,KAAK,CAAC,IAAI,GAAG,MAAM,CAAC;YAChC,aAAa,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;SAC3B;IACH,CAAC;IAED;;;;;OAKG;IACH,IAAI,CAAC,OAAe,EAAE,MAAgB;QACpC,IACE,IAAI,CAAC,MAAM,EAAE;YACb,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBAC3E,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,YAAY,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,EAC9E;YACA,MAAM,QAAQ,GAAG,IAAI,IAAI,EAAE,CAAC,OAAO,EAAE,CAAC;YACtC,MAAM,GAAG,GAAG,aAAM,CAAC,kBAAkB,EAAE,MAAM,EAAE,IAAI,CAAC,SAAS,EAAE,GAAG,EAAE,QAAQ,EAAE,OAAO,CAAC,CAAC;YACvF,MAAM,KAAK,GAAG;gBACZ,IAAI,EAAE,mBAAW,CAAC,IAAI;gBACtB,OAAO;gBACP,SAAS,EAAE,IAAI,CAAC,SAAS;gBACzB,GAAG;gBACH,IAAI,EAAE,QAAQ;aACR,CAAC;YAET,IAAI,MAAM;gBAAE,KAAK,CAAC,IAAI,GAAG,MAAM,CAAC;YAChC,aAAa,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;SAC3B;IACH,CAAC;IAED;;;;;OAKG;IACH,KAAK,CAAC,OAAe,EAAE,MAAgB;QACrC,IACE,IAAI,CAAC,OAAO,EAAE;YACd,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,GAAG,CAAC,IAAI,eAAe,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;gBAC3E,CAAC,MAAM,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC,MAAM,KAAK,CAAC,IAAI,YAAY,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,EAC9E
;YACA,MAAM,QAAQ,GAAG,IAAI,IAAI,EAAE,CAAC,OAAO,EAAE,CAAC;YACtC,MAAM,GAAG,GAAG,aAAM,CAAC,kBAAkB,EAAE,OAAO,EAAE,IAAI,CAAC,SAAS,EAAE,GAAG,EAAE,QAAQ,EAAE,OAAO,CAAC,CAAC;YACxF,MAAM,KAAK,GAAG;gBACZ,IAAI,EAAE,mBAAW,CAAC,KAAK;gBACvB,OAAO;gBACP,SAAS,EAAE,IAAI,CAAC,SAAS;gBACzB,GAAG;gBACH,IAAI,EAAE,QAAQ;aACR,CAAC;YAET,IAAI,MAAM;gBAAE,KAAK,CAAC,IAAI,GAAG,MAAM,CAAC;YAChC,aAAa,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;SAC3B;IACH,CAAC;IAED,sCAAsC;IACtC,MAAM;QACJ,OAAO,KAAK,KAAK,mBAAW,CAAC,IAAI,IAAI,KAAK,KAAK,mBAAW,CAAC,KAAK,CAAC;IACnE,CAAC;IAED,uCAAuC;IACvC,OAAO;QACL,OAAO,KAAK,KAAK,mBAAW,CAAC,KAAK,IAAI,KAAK,KAAK,mBAAW,CAAC,IAAI,IAAI,KAAK,KAAK,mBAAW,CAAC,KAAK,CAAC;IAClG,CAAC;IAED,uCAAuC;IACvC,MAAM;QACJ,OAAO,CACL,KAAK,KAAK,mBAAW,CAAC,KAAK;YAC3B,KAAK,KAAK,mBAAW,CAAC,IAAI;YAC1B,KAAK,KAAK,mBAAW,CAAC,IAAI;YAC1B,KAAK,KAAK,mBAAW,CAAC,KAAK,CAC5B,CAAC;IACJ,CAAC;IAED,uCAAuC;IACvC,OAAO;QACL,OAAO,KAAK,KAAK,mBAAW,CAAC,KAAK,CAAC;IACrC,CAAC;IAED,2EAA2E;IAC3E,MAAM,CAAC,KAAK;QACV,KAAK,GAAG,mBAAW,CAAC,KAAK,CAAC;QAC1B,eAAe,GAAG,EAAE,CAAC;IACvB,CAAC;IAED,sCAAsC;IACtC,MAAM,CAAC,aAAa;QAClB,OAAO,aAAa,CAAC;IACvB,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,gBAAgB,CAAC,MAAsB;QAC5C,IAAI,OAAO,MAAM,KAAK,UAAU,EAAE;YAChC,MAAM,IAAI,iCAAyB,CAAC,mCAAmC,CAAC,CAAC;SAC1E;QAED,aAAa,GAAG,MAAM,CAAC;IACzB,CAAC;IAED;;;;;OAKG;IACH,MAAM,CAAC,MAAM,CAAC,IAAY,EAAE,MAAgB;QAC1C,IAAI,IAAI,KAAK,OAAO,IAAI,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,EAAE;YAC7C,eAAe,GAAG,EAAE,CAAC;YACrB,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,eAAe,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,CAAC,CAAC;SAClD;IACH,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,QAAQ,CAAC,QAAqB;QACnC,IACE,QAAQ,KAAK,mBAAW,CAAC,IAAI;YAC7B,QAAQ,KAAK,mBAAW,CAAC,KAAK;YAC9B,QAAQ,KAAK,mBAAW,CAAC,KAAK;YAC9B,QAAQ,KAAK,mBAAW,CAAC,IAAI,EAC7B;YACA,MAAM,IAAI,iCAAyB,CACjC,wCAAwC,oBAAY,CAAC,mBAAW,CAAC,EAAE,CACpE,CAAC;SACH;QAED,KAAK,GAAG,QAAQ,CAAC;IACnB,CAAC;CACF;AA5ND,wBA4NC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/mongo_client.js b/node_modules/mongodb/lib/mongo_client.js new file mode 100644 index 000000000..e5e8aadda --- /dev/null +++ b/node_modules/mongodb/lib/mongo_client.js @@ -0,0 +1,262 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.MongoClient = exports.ServerApiVersion = void 0; +const db_1 = require("./db"); +const change_stream_1 = require("./change_stream"); +const error_1 = require("./error"); +const utils_1 = require("./utils"); +const connect_1 = require("./operations/connect"); +const promise_provider_1 = require("./promise_provider"); +const bson_1 = require("./bson"); +const connection_string_1 = require("./connection_string"); +const mongo_types_1 = require("./mongo_types"); +/** @public */ +exports.ServerApiVersion = Object.freeze({ + v1: '1' +}); +/** @internal */ +const kOptions = Symbol('options'); +/** + * The **MongoClient** class is a class that allows for making Connections to MongoDB. + * @public + * + * @remarks + * The programmatically provided options take precedent over the URI options. 
+ * + * @example + * ```js + * // Connect using a MongoClient instance + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * const mongoClient = new MongoClient(url); + * mongoClient.connect(function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + * + * @example + * ```js + * // Connect using the MongoClient.connect static method + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + */ +class MongoClient extends mongo_types_1.TypedEventEmitter { + constructor(url, options) { + super(); + this[kOptions] = connection_string_1.parseOptions(url, this, options); + // eslint-disable-next-line @typescript-eslint/no-this-alias + const client = this; + // The internal state + this.s = { + url, + sessions: new Set(), + bsonOptions: bson_1.resolveBSONOptions(this[kOptions]), + namespace: utils_1.ns('admin'), + get options() { + return client[kOptions]; + }, + get readConcern() { + return client[kOptions].readConcern; + }, + get writeConcern() { + return client[kOptions].writeConcern; + }, + get readPreference() { + return client[kOptions].readPreference; + }, + get logger() { + return client[kOptions].logger; + } + }; + } + get options() { + return Object.freeze({ ...this[kOptions] }); + } + get serverApi() { + return this[kOptions].serverApi && Object.freeze({ ...this[kOptions].serverApi }); + } + /** + * Intended for APM use only + * @internal + */ + get monitorCommands() { + return this[kOptions].monitorCommands; + } + set monitorCommands(value) { + this[kOptions].monitorCommands = value; + } + get autoEncrypter() { + return this[kOptions].autoEncrypter; + } + get readConcern() { + return this.s.readConcern; + } + get writeConcern() { + return this.s.writeConcern; + } + get readPreference() { + return this.s.readPreference; + } + get bsonOptions() { + return this.s.bsonOptions; + } + get logger() { + return this.s.logger; + } + connect(callback) { + if (callback && typeof callback !== 'function') { + throw new error_1.MongoInvalidArgumentError('Method `connect` only accepts a callback'); + } + return utils_1.maybePromise(callback, cb => { + connect_1.connect(this, this[kOptions], err => { + if (err) + return cb(err); + cb(undefined, this); + }); + }); + } + close(forceOrCallback, callback) { + if (typeof forceOrCallback === 'function') { + callback = forceOrCallback; + } + const force = typeof forceOrCallback === 'boolean' ? forceOrCallback : false; + return utils_1.maybePromise(callback, callback => { + if (this.topology == null) { + return callback(); + } + // clear out references to old topology + const topology = this.topology; + this.topology = undefined; + topology.close({ force }, error => { + if (error) + return callback(error); + const { encrypter } = this[kOptions]; + if (encrypter) { + return encrypter.close(this, force, error => { + callback(error); + }); + } + callback(); + }); + }); + } + /** + * Create a new Db instance sharing the current socket connections. + * + * @param dbName - The name of the database we want to use. 
If not provided, use database name from connection string. + * @param options - Optional settings for Db construction + */ + db(dbName, options) { + options = options !== null && options !== void 0 ? options : {}; + // Default to db from connection string if not provided + if (!dbName) { + dbName = this.options.dbName; + } + // Copy the options and add out internal override of the not shared flag + const finalOptions = Object.assign({}, this[kOptions], options); + // Return the db object + const db = new db_1.Db(this, dbName, finalOptions); + // Return the database + return db; + } + static connect(url, options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + try { + // Create client + const mongoClient = new MongoClient(url, options); + // Execute the connect method + if (callback) { + return mongoClient.connect(callback); + } + else { + return mongoClient.connect(); + } + } + catch (error) { + if (callback) + return callback(error); + else + return promise_provider_1.PromiseProvider.get().reject(error); + } + } + startSession(options) { + options = Object.assign({ explicit: true }, options); + if (!this.topology) { + throw new error_1.MongoNotConnectedError('MongoClient must be connected to start a session'); + } + return this.topology.startSession(options, this.s.options); + } + withSession(optionsOrOperation, callback) { + let options = optionsOrOperation; + if (typeof optionsOrOperation === 'function') { + callback = optionsOrOperation; + options = { owner: Symbol() }; + } + if (callback == null) { + throw new error_1.MongoInvalidArgumentError('Missing required callback parameter'); + } + const session = this.startSession(options); + const Promise = promise_provider_1.PromiseProvider.get(); + let cleanupHandler = ((err, result, opts) => { + // prevent multiple calls to cleanupHandler + cleanupHandler = () => { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('cleanupHandler was called too many times'); + }; + opts = Object.assign({ throw: true }, opts); + session.endSession(); + if (err) { + if (opts.throw) + throw err; + return Promise.reject(err); + } + }); + try { + const result = callback(session); + return Promise.resolve(result).then(result => cleanupHandler(undefined, result, undefined), err => cleanupHandler(err, null, { throw: true })); + } + catch (err) { + return cleanupHandler(err, null, { throw: false }); + } + } + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this cluster. Will ignore all + * changes to system collections, as well as the local, admin, and config databases. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. 
+ * @param options - Optional settings for the command + */ + watch(pipeline = [], options = {}) { + // Allow optionally not specifying a pipeline + if (!Array.isArray(pipeline)) { + options = pipeline; + pipeline = []; + } + return new change_stream_1.ChangeStream(this, pipeline, utils_1.resolveOptions(this, options)); + } + /** Return the mongo client logger */ + getLogger() { + return this.s.logger; + } +} +exports.MongoClient = MongoClient; +//# sourceMappingURL=mongo_client.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/mongo_client.js.map b/node_modules/mongodb/lib/mongo_client.js.map new file mode 100644 index 000000000..be823c40b --- /dev/null +++ b/node_modules/mongodb/lib/mongo_client.js.map @@ -0,0 +1 @@ +{"version":3,"file":"mongo_client.js","sourceRoot":"","sources":["../src/mongo_client.ts"],"names":[],"mappings":";;;AAAA,6BAAqC;AACrC,mDAAoE;AAEpE,mCAKiB;AAEjB,mCAQiB;AACjB,kDAAoE;AACpE,yDAAqD;AAGrD,iCAA4E;AAO5E,2DAAmD;AAQnD,+CAAkD;AAElD,cAAc;AACD,QAAA,gBAAgB,GAAG,MAAM,CAAC,MAAM,CAAC;IAC5C,EAAE,EAAE,GAAG;CACC,CAAC,CAAC;AAiOZ,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AAEnC;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;GAuCG;AACH,MAAa,WAAY,SAAQ,+BAAoC;IAYnE,YAAY,GAAW,EAAE,OAA4B;QACnD,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,QAAQ,CAAC,GAAG,gCAAY,CAAC,GAAG,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;QAElD,4DAA4D;QAC5D,MAAM,MAAM,GAAG,IAAI,CAAC;QAEpB,qBAAqB;QACrB,IAAI,CAAC,CAAC,GAAG;YACP,GAAG;YACH,QAAQ,EAAE,IAAI,GAAG,EAAE;YACnB,WAAW,EAAE,yBAAkB,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;YAC/C,SAAS,EAAE,UAAE,CAAC,OAAO,CAAC;YAEtB,IAAI,OAAO;gBACT,OAAO,MAAM,CAAC,QAAQ,CAAC,CAAC;YAC1B,CAAC;YACD,IAAI,WAAW;gBACb,OAAO,MAAM,CAAC,QAAQ,CAAC,CAAC,WAAW,CAAC;YACtC,CAAC;YACD,IAAI,YAAY;gBACd,OAAO,MAAM,CAAC,QAAQ,CAAC,CAAC,YAAY,CAAC;YACvC,CAAC;YACD,IAAI,cAAc;gBAChB,OAAO,MAAM,CAAC,QAAQ,CAAC,CAAC,cAAc,CAAC;YACzC,CAAC;YACD,IAAI,MAAM;gBACR,OAAO,MAAM,CAAC,QAAQ,CAAC,CAAC,MAAM,CAAC;YACjC,CAAC;SACF,CAAC;IACJ,CAAC;IAED,IAAI,OAAO;QACT,OAAO,MAAM,CAAC,MAAM,CAAC,EAAE,GAAG,IAAI,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC;IAC9C,CAAC;IAED,IAAI,SAAS;QACX,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC,SAAS,IAAI,MAAM,CAAC,MAAM,CAAC,EAAE,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,SAAS,EAAE,CAAC,CAAC;IACpF,CAAC;IACD;;;OAGG;IACH,IAAI,eAAe;QACjB,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC,eAAe,CAAC;IACxC,CAAC;IACD,IAAI,eAAe,CAAC,KAAc;QAChC,IAAI,CAAC,QAAQ,CAAC,CAAC,eAAe,GAAG,KAAK,CAAC;IACzC,CAAC;IAED,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC,aAAa,CAAC;IACtC,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,CAAC,CAAC,YAAY,CAAC;IAC7B,CAAC;IAED,IAAI,cAAc;QAChB,OAAO,IAAI,CAAC,CAAC,CAAC,cAAc,CAAC;IAC/B,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC;IACvB,CAAC;IASD,OAAO,CAAC,QAAgC;QACtC,IAAI,QAAQ,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAC9C,MAAM,IAAI,iCAAyB,CAAC,0CAA0C,CAAC,CAAC;SACjF;QAED,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE;YACjC,iBAAO,CAAC,IAAI,EAAE,IAAI,CAAC,QAAQ,CAAC,EAAE,GAAG,CAAC,EAAE;gBAClC,IAAI,GAAG;oBAAE,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;gBACxB,EAAE,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;YACtB,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;IAYD,KAAK,CACH,eAA0C,EAC1C,QAAyB;QAEzB,IAAI,OAAO,eAAe,KAAK,UAAU,EAAE;YACzC,QAAQ,GAAG,eAAe,CAAC;SAC5B;QAED,MAAM,KAAK,GAAG,OAAO,eAAe,KAAK,SAAS,CAAC,CAAC,CAAC,eAAe,CAAC,CAAC,CAAC,KAAK,CAAC;QAE7E,OAAO,oBAAY,CAAC,QAAQ,EAAE,QAAQ,CAAC,EAAE;YACvC,IAAI,IAAI,CAAC,QAAQ,IAAI,IAAI,EAAE;gBACzB,OAAO,QAAQ,EAAE,CAAC;aACnB;YAED,uCAAuC;YACvC,MAAM,QAAQ,GAAG,IAAI,CAAC,QAAQ,CAAC;YAC/B,IAAI,CAAC,QAAQ,GAAG,SAAS,CAAC;YAE1B,QAAQ,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,EAA
E,KAAK,CAAC,EAAE;gBAChC,IAAI,KAAK;oBAAE,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;gBAClC,MAAM,EAAE,SAAS,EAAE,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC;gBACrC,IAAI,SAAS,EAAE;oBACb,OAAO,SAAS,CAAC,KAAK,CAAC,IAAI,EAAE,KAAK,EAAE,KAAK,CAAC,EAAE;wBAC1C,QAAQ,CAAC,KAAK,CAAC,CAAC;oBAClB,CAAC,CAAC,CAAC;iBACJ;gBACD,QAAQ,EAAE,CAAC;YACb,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;IAED;;;;;OAKG;IACH,EAAE,CAAC,MAAe,EAAE,OAAmB;QACrC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,uDAAuD;QACvD,IAAI,CAAC,MAAM,EAAE;YACX,MAAM,GAAG,IAAI,CAAC,OAAO,CAAC,MAAM,CAAC;SAC9B;QAED,wEAAwE;QACxE,MAAM,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,IAAI,CAAC,QAAQ,CAAC,EAAE,OAAO,CAAC,CAAC;QAEhE,uBAAuB;QACvB,MAAM,EAAE,GAAG,IAAI,OAAE,CAAC,IAAI,EAAE,MAAM,EAAE,YAAY,CAAC,CAAC;QAE9C,sBAAsB;QACtB,OAAO,EAAE,CAAC;IACZ,CAAC;IAcD,MAAM,CAAC,OAAO,CACZ,GAAW,EACX,OAAoD,EACpD,QAAgC;QAEhC,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,IAAI;YACF,gBAAgB;YAChB,MAAM,WAAW,GAAG,IAAI,WAAW,CAAC,GAAG,EAAE,OAAO,CAAC,CAAC;YAClD,6BAA6B;YAC7B,IAAI,QAAQ,EAAE;gBACZ,OAAO,WAAW,CAAC,OAAO,CAAC,QAAQ,CAAC,CAAC;aACtC;iBAAM;gBACL,OAAO,WAAW,CAAC,OAAO,EAAE,CAAC;aAC9B;SACF;QAAC,OAAO,KAAK,EAAE;YACd,IAAI,QAAQ;gBAAE,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;;gBAChC,OAAO,kCAAe,CAAC,GAAG,EAAE,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;SACjD;IACH,CAAC;IAKD,YAAY,CAAC,OAA8B;QACzC,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,QAAQ,EAAE,IAAI,EAAE,EAAE,OAAO,CAAC,CAAC;QACrD,IAAI,CAAC,IAAI,CAAC,QAAQ,EAAE;YAClB,MAAM,IAAI,8BAAsB,CAAC,kDAAkD,CAAC,CAAC;SACtF;QAED,OAAO,IAAI,CAAC,QAAQ,CAAC,YAAY,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;IAC7D,CAAC;IAaD,WAAW,CACT,kBAA+D,EAC/D,QAA8B;QAE9B,IAAI,OAAO,GAAyB,kBAA0C,CAAC;QAC/E,IAAI,OAAO,kBAAkB,KAAK,UAAU,EAAE;YAC5C,QAAQ,GAAG,kBAAyC,CAAC;YACrD,OAAO,GAAG,EAAE,KAAK,EAAE,MAAM,EAAE,EAAE,CAAC;SAC/B;QAED,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,MAAM,IAAI,iCAAyB,CAAC,qCAAqC,CAAC,CAAC;SAC5E;QAED,MAAM,OAAO,GAAG,IAAI,CAAC,YAAY,CAAC,OAAO,CAAC,CAAC;QAC3C,MAAM,OAAO,GAAG,kCAAe,CAAC,GAAG,EAAE,CAAC;QAEtC,IAAI,cAAc,GAA2B,CAAC,CAAC,GAAG,EAAE,MAAM,EAAE,IAAI,EAAE,EAAE;YAClE,2CAA2C;YAC3C,cAAc,GAAG,GAAG,EAAE;gBACpB,kBAAkB;gBAClB,MAAM,IAAI,yBAAiB,CAAC,0CAA0C,CAAC,CAAC;YAC1E,CAAC,CAAC;YAEF,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,EAAE,IAAI,CAAC,CAAC;YAC5C,OAAO,CAAC,UAAU,EAAE,CAAC;YAErB,IAAI,GAAG,EAAE;gBACP,IAAI,IAAI,CAAC,KAAK;oBAAE,MAAM,GAAG,CAAC;gBAC1B,OAAO,OAAO,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;aAC5B;QACH,CAAC,CAA2B,CAAC;QAE7B,IAAI;YACF,MAAM,MAAM,GAAG,QAAQ,CAAC,OAAO,CAAC,CAAC;YACjC,OAAO,OAAO,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC,IAAI,CACjC,MAAM,CAAC,EAAE,CAAC,cAAc,CAAC,SAAS,EAAE,MAAM,EAAE,SAAS,CAAC,EACtD,GAAG,CAAC,EAAE,CAAC,cAAc,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAClD,CAAC;SACH;QAAC,OAAO,GAAG,EAAE;YACZ,OAAO,cAAc,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE,KAAK,EAAE,KAAK,EAAE,CAAkB,CAAC;SACrE;IACH,CAAC;IAED;;;;;;;OAOG;IACH,KAAK,CACH,WAAuB,EAAE,EACzB,UAA+B,EAAE;QAEjC,6CAA6C;QAC7C,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC5B,OAAO,GAAG,QAAQ,CAAC;YACnB,QAAQ,GAAG,EAAE,CAAC;SACf;QAED,OAAO,IAAI,4BAAY,CAAU,IAAI,EAAE,QAAQ,EAAE,sBAAc,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;IAClF,CAAC;IAED,qCAAqC;IACrC,SAAS;QACP,OAAO,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC;IACvB,CAAC;CACF;AA5SD,kCA4SC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/mongo_types.js b/node_modules/mongodb/lib/mongo_types.js new file mode 100644 index 000000000..7cd40483f --- /dev/null +++ b/node_modules/mongodb/lib/mongo_types.js @@ -0,0 +1,41 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CancellationToken 
= exports.TypedEventEmitter = exports.BSONType = void 0; +const events_1 = require("events"); +/** @public */ +exports.BSONType = Object.freeze({ + double: 1, + string: 2, + object: 3, + array: 4, + binData: 5, + undefined: 6, + objectId: 7, + bool: 8, + date: 9, + null: 10, + regex: 11, + dbPointer: 12, + javascript: 13, + symbol: 14, + javascriptWithScope: 15, + int: 16, + timestamp: 17, + long: 18, + decimal: 19, + minKey: -1, + maxKey: 127 +}); +/** + * Typescript type safe event emitter + * @public + */ +// eslint-disable-next-line @typescript-eslint/no-unused-vars +class TypedEventEmitter extends events_1.EventEmitter { +} +exports.TypedEventEmitter = TypedEventEmitter; +/** @public */ +class CancellationToken extends TypedEventEmitter { +} +exports.CancellationToken = CancellationToken; +//# sourceMappingURL=mongo_types.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/mongo_types.js.map b/node_modules/mongodb/lib/mongo_types.js.map new file mode 100644 index 000000000..605f94e43 --- /dev/null +++ b/node_modules/mongodb/lib/mongo_types.js.map @@ -0,0 +1 @@ +{"version":3,"file":"mongo_types.js","sourceRoot":"","sources":["../src/mongo_types.ts"],"names":[],"mappings":";;;AAWA,mCAAsC;AAkItC,cAAc;AACD,QAAA,QAAQ,GAAG,MAAM,CAAC,MAAM,CAAC;IACpC,MAAM,EAAE,CAAC;IACT,MAAM,EAAE,CAAC;IACT,MAAM,EAAE,CAAC;IACT,KAAK,EAAE,CAAC;IACR,OAAO,EAAE,CAAC;IACV,SAAS,EAAE,CAAC;IACZ,QAAQ,EAAE,CAAC;IACX,IAAI,EAAE,CAAC;IACP,IAAI,EAAE,CAAC;IACP,IAAI,EAAE,EAAE;IACR,KAAK,EAAE,EAAE;IACT,SAAS,EAAE,EAAE;IACb,UAAU,EAAE,EAAE;IACd,MAAM,EAAE,EAAE;IACV,mBAAmB,EAAE,EAAE;IACvB,GAAG,EAAE,EAAE;IACP,SAAS,EAAE,EAAE;IACb,IAAI,EAAE,EAAE;IACR,OAAO,EAAE,EAAE;IACX,MAAM,EAAE,CAAC,CAAC;IACV,MAAM,EAAE,GAAG;CACH,CAAC,CAAC;AA8PZ;;;GAGG;AACH,6DAA6D;AAC7D,MAAa,iBAAoD,SAAQ,qBAAY;CAAG;AAAxF,8CAAwF;AAExF,cAAc;AACd,MAAa,iBAAkB,SAAQ,iBAAqC;CAAG;AAA/E,8CAA+E"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/add_user.js b/node_modules/mongodb/lib/operations/add_user.js new file mode 100644 index 000000000..cdac6adc7 --- /dev/null +++ b/node_modules/mongodb/lib/operations/add_user.js @@ -0,0 +1,65 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.AddUserOperation = void 0; +const crypto = require("crypto"); +const operation_1 = require("./operation"); +const command_1 = require("./command"); +const error_1 = require("../error"); +const utils_1 = require("../utils"); +/** @internal */ +class AddUserOperation extends command_1.CommandOperation { + constructor(db, username, password, options) { + super(db, options); + this.db = db; + this.username = username; + this.password = password; + this.options = options !== null && options !== void 0 ? options : {}; + } + execute(server, session, callback) { + const db = this.db; + const username = this.username; + const password = this.password; + const options = this.options; + // Error out if digestPassword set + if (options.digestPassword != null) { + return callback(new error_1.MongoInvalidArgumentError('Option "digestPassword" not supported via addUser, use db.command(...) instead')); + } + let roles; + if (!options.roles || (Array.isArray(options.roles) && options.roles.length === 0)) { + utils_1.emitWarningOnce('Creating a user without roles is deprecated. Defaults to "root" if db is "admin" or "dbOwner" otherwise'); + if (db.databaseName.toLowerCase() === 'admin') { + roles = ['root']; + } + else { + roles = ['dbOwner']; + } + } + else { + roles = Array.isArray(options.roles) ? 
options.roles : [options.roles]; + } + const digestPassword = utils_1.getTopology(db).lastIsMaster().maxWireVersion >= 7; + let userPassword = password; + if (!digestPassword) { + // Use node md5 generator + const md5 = crypto.createHash('md5'); + // Generate keys used for authentication + md5.update(`${username}:mongo:${password}`); + userPassword = md5.digest('hex'); + } + // Build the command to execute + const command = { + createUser: username, + customData: options.customData || {}, + roles: roles, + digestPassword + }; + // No password + if (typeof password === 'string') { + command.pwd = userPassword; + } + super.executeCommand(server, session, command, callback); + } +} +exports.AddUserOperation = AddUserOperation; +operation_1.defineAspects(AddUserOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=add_user.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/add_user.js.map b/node_modules/mongodb/lib/operations/add_user.js.map new file mode 100644 index 000000000..8d1e5d159 --- /dev/null +++ b/node_modules/mongodb/lib/operations/add_user.js.map @@ -0,0 +1 @@ +{"version":3,"file":"add_user.js","sourceRoot":"","sources":["../../src/operations/add_user.ts"],"names":[],"mappings":";;;AAAA,iCAAiC;AACjC,2CAAoD;AACpD,uCAAsE;AACtE,oCAAqD;AACrD,oCAAkE;AA2BlE,gBAAgB;AAChB,MAAa,gBAAiB,SAAQ,0BAA0B;IAM9D,YAAY,EAAM,EAAE,QAAgB,EAAE,QAA4B,EAAE,OAAwB;QAC1F,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAEnB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IAC/B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,EAAE,GAAG,IAAI,CAAC,EAAE,CAAC;QACnB,MAAM,QAAQ,GAAG,IAAI,CAAC,QAAQ,CAAC;QAC/B,MAAM,QAAQ,GAAG,IAAI,CAAC,QAAQ,CAAC;QAC/B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE7B,kCAAkC;QAClC,IAAI,OAAO,CAAC,cAAc,IAAI,IAAI,EAAE;YAClC,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAC3B,gFAAgF,CACjF,CACF,CAAC;SACH;QAED,IAAI,KAAK,CAAC;QACV,IAAI,CAAC,OAAO,CAAC,KAAK,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,KAAK,CAAC,IAAI,OAAO,CAAC,KAAK,CAAC,MAAM,KAAK,CAAC,CAAC,EAAE;YAClF,uBAAe,CACb,yGAAyG,CAC1G,CAAC;YACF,IAAI,EAAE,CAAC,YAAY,CAAC,WAAW,EAAE,KAAK,OAAO,EAAE;gBAC7C,KAAK,GAAG,CAAC,MAAM,CAAC,CAAC;aAClB;iBAAM;gBACL,KAAK,GAAG,CAAC,SAAS,CAAC,CAAC;aACrB;SACF;aAAM;YACL,KAAK,GAAG,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;SACxE;QAED,MAAM,cAAc,GAAG,mBAAW,CAAC,EAAE,CAAC,CAAC,YAAY,EAAE,CAAC,cAAc,IAAI,CAAC,CAAC;QAE1E,IAAI,YAAY,GAAG,QAAQ,CAAC;QAE5B,IAAI,CAAC,cAAc,EAAE;YACnB,yBAAyB;YACzB,MAAM,GAAG,GAAG,MAAM,CAAC,UAAU,CAAC,KAAK,CAAC,CAAC;YACrC,wCAAwC;YACxC,GAAG,CAAC,MAAM,CAAC,GAAG,QAAQ,UAAU,QAAQ,EAAE,CAAC,CAAC;YAC5C,YAAY,GAAG,GAAG,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;SAClC;QAED,+BAA+B;QAC/B,MAAM,OAAO,GAAa;YACxB,UAAU,EAAE,QAAQ;YACpB,UAAU,EAAE,OAAO,CAAC,UAAU,IAAI,EAAE;YACpC,KAAK,EAAE,KAAK;YACZ,cAAc;SACf,CAAC;QAEF,cAAc;QACd,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;YAChC,OAAO,CAAC,GAAG,GAAG,YAAY,CAAC;SAC5B;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AAvED,4CAuEC;AAED,yBAAa,CAAC,gBAAgB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/aggregate.js b/node_modules/mongodb/lib/operations/aggregate.js new file mode 100644 index 000000000..f66116c3c --- /dev/null +++ b/node_modules/mongodb/lib/operations/aggregate.js @@ -0,0 +1,86 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.AggregateOperation 
= exports.DB_AGGREGATE_COLLECTION = void 0; +const command_1 = require("./command"); +const read_preference_1 = require("../read_preference"); +const error_1 = require("../error"); +const utils_1 = require("../utils"); +const operation_1 = require("./operation"); +/** @internal */ +exports.DB_AGGREGATE_COLLECTION = 1; +const MIN_WIRE_VERSION_$OUT_READ_CONCERN_SUPPORT = 8; +/** @internal */ +class AggregateOperation extends command_1.CommandOperation { + constructor(ns, pipeline, options) { + super(undefined, { ...options, dbName: ns.db }); + this.options = options !== null && options !== void 0 ? options : {}; + // Covers when ns.collection is null, undefined or the empty string, use DB_AGGREGATE_COLLECTION + this.target = ns.collection || exports.DB_AGGREGATE_COLLECTION; + this.pipeline = pipeline; + // determine if we have a write stage, override read preference if so + this.hasWriteStage = false; + if (typeof (options === null || options === void 0 ? void 0 : options.out) === 'string') { + this.pipeline = this.pipeline.concat({ $out: options.out }); + this.hasWriteStage = true; + } + else if (pipeline.length > 0) { + const finalStage = pipeline[pipeline.length - 1]; + if (finalStage.$out || finalStage.$merge) { + this.hasWriteStage = true; + } + } + if (this.hasWriteStage) { + this.readPreference = read_preference_1.ReadPreference.primary; + } + if (this.explain && this.writeConcern) { + throw new error_1.MongoInvalidArgumentError('Option "explain" cannot be used on an aggregate call with writeConcern'); + } + if ((options === null || options === void 0 ? void 0 : options.cursor) != null && typeof options.cursor !== 'object') { + throw new error_1.MongoInvalidArgumentError('Cursor options must be an object'); + } + } + get canRetryRead() { + return !this.hasWriteStage; + } + addToPipeline(stage) { + this.pipeline.push(stage); + } + execute(server, session, callback) { + const options = this.options; + const serverWireVersion = utils_1.maxWireVersion(server); + const command = { aggregate: this.target, pipeline: this.pipeline }; + if (this.hasWriteStage && serverWireVersion < MIN_WIRE_VERSION_$OUT_READ_CONCERN_SUPPORT) { + this.readConcern = undefined; + } + if (serverWireVersion >= 5) { + if (this.hasWriteStage && this.writeConcern) { + Object.assign(command, { writeConcern: this.writeConcern }); + } + } + if (options.bypassDocumentValidation === true) { + command.bypassDocumentValidation = options.bypassDocumentValidation; + } + if (typeof options.allowDiskUse === 'boolean') { + command.allowDiskUse = options.allowDiskUse; + } + if (options.hint) { + command.hint = options.hint; + } + if (options.let) { + command.let = options.let; + } + command.cursor = options.cursor || {}; + if (options.batchSize && !this.hasWriteStage) { + command.cursor.batchSize = options.batchSize; + } + super.executeCommand(server, session, command, callback); + } +} +exports.AggregateOperation = AggregateOperation; +operation_1.defineAspects(AggregateOperation, [ + operation_1.Aspect.READ_OPERATION, + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.EXPLAINABLE, + operation_1.Aspect.CURSOR_CREATING +]); +//# sourceMappingURL=aggregate.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/aggregate.js.map b/node_modules/mongodb/lib/operations/aggregate.js.map new file mode 100644 index 000000000..cc1f24dac --- /dev/null +++ b/node_modules/mongodb/lib/operations/aggregate.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"aggregate.js","sourceRoot":"","sources":["../../src/operations/aggregate.ts"],"names":[],"mappings":";;;AAAA,uCAAwF;AACxF,wDAAoD;AACpD,oCAAqD;AACrD,oCAA4D;AAC5D,2CAA0D;AAM1D,gBAAgB;AACH,QAAA,uBAAuB,GAAG,CAAU,CAAC;AAClD,MAAM,0CAA0C,GAAG,CAAU,CAAC;AAyB9D,gBAAgB;AAChB,MAAa,kBAAiC,SAAQ,0BAAmB;IAMvE,YAAY,EAAoB,EAAE,QAAoB,EAAE,OAA0B;QAChF,KAAK,CAAC,SAAS,EAAE,EAAE,GAAG,OAAO,EAAE,MAAM,EAAE,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC;QAEhD,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAE7B,gGAAgG;QAChG,IAAI,CAAC,MAAM,GAAG,EAAE,CAAC,UAAU,IAAI,+BAAuB,CAAC;QAEvD,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QAEzB,qEAAqE;QACrE,IAAI,CAAC,aAAa,GAAG,KAAK,CAAC;QAC3B,IAAI,OAAO,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,GAAG,CAAA,KAAK,QAAQ,EAAE;YACpC,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE,IAAI,EAAE,OAAO,CAAC,GAAG,EAAE,CAAC,CAAC;YAC5D,IAAI,CAAC,aAAa,GAAG,IAAI,CAAC;SAC3B;aAAM,IAAI,QAAQ,CAAC,MAAM,GAAG,CAAC,EAAE;YAC9B,MAAM,UAAU,GAAG,QAAQ,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;YACjD,IAAI,UAAU,CAAC,IAAI,IAAI,UAAU,CAAC,MAAM,EAAE;gBACxC,IAAI,CAAC,aAAa,GAAG,IAAI,CAAC;aAC3B;SACF;QAED,IAAI,IAAI,CAAC,aAAa,EAAE;YACtB,IAAI,CAAC,cAAc,GAAG,gCAAc,CAAC,OAAO,CAAC;SAC9C;QAED,IAAI,IAAI,CAAC,OAAO,IAAI,IAAI,CAAC,YAAY,EAAE;YACrC,MAAM,IAAI,iCAAyB,CACjC,wEAAwE,CACzE,CAAC;SACH;QAED,IAAI,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,MAAM,KAAI,IAAI,IAAI,OAAO,OAAO,CAAC,MAAM,KAAK,QAAQ,EAAE;YACjE,MAAM,IAAI,iCAAyB,CAAC,kCAAkC,CAAC,CAAC;SACzE;IACH,CAAC;IAED,IAAI,YAAY;QACd,OAAO,CAAC,IAAI,CAAC,aAAa,CAAC;IAC7B,CAAC;IAED,aAAa,CAAC,KAAe;QAC3B,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;IAC5B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAqB;QACnE,MAAM,OAAO,GAAqB,IAAI,CAAC,OAAO,CAAC;QAC/C,MAAM,iBAAiB,GAAG,sBAAc,CAAC,MAAM,CAAC,CAAC;QACjD,MAAM,OAAO,GAAa,EAAE,SAAS,EAAE,IAAI,CAAC,MAAM,EAAE,QAAQ,EAAE,IAAI,CAAC,QAAQ,EAAE,CAAC;QAE9E,IAAI,IAAI,CAAC,aAAa,IAAI,iBAAiB,GAAG,0CAA0C,EAAE;YACxF,IAAI,CAAC,WAAW,GAAG,SAAS,CAAC;SAC9B;QAED,IAAI,iBAAiB,IAAI,CAAC,EAAE;YAC1B,IAAI,IAAI,CAAC,aAAa,IAAI,IAAI,CAAC,YAAY,EAAE;gBAC3C,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,EAAE,YAAY,EAAE,IAAI,CAAC,YAAY,EAAE,CAAC,CAAC;aAC7D;SACF;QAED,IAAI,OAAO,CAAC,wBAAwB,KAAK,IAAI,EAAE;YAC7C,OAAO,CAAC,wBAAwB,GAAG,OAAO,CAAC,wBAAwB,CAAC;SACrE;QAED,IAAI,OAAO,OAAO,CAAC,YAAY,KAAK,SAAS,EAAE;YAC7C,OAAO,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;SAC7C;QAED,IAAI,OAAO,CAAC,IAAI,EAAE;YAChB,OAAO,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;SAC7B;QAED,IAAI,OAAO,CAAC,GAAG,EAAE;YACf,OAAO,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;SAC3B;QAED,OAAO,CAAC,MAAM,GAAG,OAAO,CAAC,MAAM,IAAI,EAAE,CAAC;QACtC,IAAI,OAAO,CAAC,SAAS,IAAI,CAAC,IAAI,CAAC,aAAa,EAAE;YAC5C,OAAO,CAAC,MAAM,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SAC9C;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AAzFD,gDAyFC;AAED,yBAAa,CAAC,kBAAkB,EAAE;IAChC,kBAAM,CAAC,cAAc;IACrB,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,WAAW;IAClB,kBAAM,CAAC,eAAe;CACvB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/bulk_write.js b/node_modules/mongodb/lib/operations/bulk_write.js new file mode 100644 index 000000000..e02029af5 --- /dev/null +++ b/node_modules/mongodb/lib/operations/bulk_write.js @@ -0,0 +1,43 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.BulkWriteOperation = void 0; +const operation_1 = require("./operation"); +/** @internal */ +class BulkWriteOperation extends operation_1.AbstractOperation { + constructor(collection, operations, options) { + super(options); + this.options = options; + this.collection = collection; + this.operations = operations; + } + execute(server, session, 
callback) { + const coll = this.collection; + const operations = this.operations; + const options = { ...this.options, ...this.bsonOptions, readPreference: this.readPreference }; + // Create the bulk operation + const bulk = options.ordered === false + ? coll.initializeUnorderedBulkOp(options) + : coll.initializeOrderedBulkOp(options); + // for each op go through and add to the bulk + try { + for (let i = 0; i < operations.length; i++) { + bulk.raw(operations[i]); + } + } + catch (err) { + return callback(err); + } + // Execute the bulk + bulk.execute({ ...options, session }, (err, r) => { + // We have connection level error + if (!r && err) { + return callback(err); + } + // Return the results + callback(undefined, r); + }); + } +} +exports.BulkWriteOperation = BulkWriteOperation; +operation_1.defineAspects(BulkWriteOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=bulk_write.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/bulk_write.js.map b/node_modules/mongodb/lib/operations/bulk_write.js.map new file mode 100644 index 000000000..cf5a6e406 --- /dev/null +++ b/node_modules/mongodb/lib/operations/bulk_write.js.map @@ -0,0 +1 @@ +{"version":3,"file":"bulk_write.js","sourceRoot":"","sources":["../../src/operations/bulk_write.ts"],"names":[],"mappings":";;;AAAA,2CAAuE;AAYvE,gBAAgB;AAChB,MAAa,kBAAmB,SAAQ,6BAAkC;IAKxE,YACE,UAAsB,EACtB,UAAmC,EACnC,OAAyB;QAEzB,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAmC;QACjF,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,UAAU,GAAG,IAAI,CAAC,UAAU,CAAC;QACnC,MAAM,OAAO,GAAG,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,GAAG,IAAI,CAAC,WAAW,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,CAAC;QAE9F,4BAA4B;QAC5B,MAAM,IAAI,GACR,OAAO,CAAC,OAAO,KAAK,KAAK;YACvB,CAAC,CAAC,IAAI,CAAC,yBAAyB,CAAC,OAAO,CAAC;YACzC,CAAC,CAAC,IAAI,CAAC,uBAAuB,CAAC,OAAO,CAAC,CAAC;QAE5C,6CAA6C;QAC7C,IAAI;YACF,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,UAAU,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;gBAC1C,IAAI,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;aACzB;SACF;QAAC,OAAO,GAAG,EAAE;YACZ,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,mBAAmB;QACnB,IAAI,CAAC,OAAO,CAAC,EAAE,GAAG,OAAO,EAAE,OAAO,EAAE,EAAE,CAAC,GAAG,EAAE,CAAC,EAAE,EAAE;YAC/C,iCAAiC;YACjC,IAAI,CAAC,CAAC,IAAI,GAAG,EAAE;gBACb,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,qBAAqB;YACrB,QAAQ,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC;QACzB,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA/CD,gDA+CC;AAED,yBAAa,CAAC,kBAAkB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/collections.js b/node_modules/mongodb/lib/operations/collections.js new file mode 100644 index 000000000..927b99148 --- /dev/null +++ b/node_modules/mongodb/lib/operations/collections.js @@ -0,0 +1,29 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CollectionsOperation = void 0; +const operation_1 = require("./operation"); +const collection_1 = require("../collection"); +/** @internal */ +class CollectionsOperation extends operation_1.AbstractOperation { + constructor(db, options) { + super(options); + this.options = options; + this.db = db; + } + execute(server, session, callback) { + const db = this.db; + // Let's get the collection names + db.listCollections({}, { ...this.options, nameOnly: true, readPreference: this.readPreference, session }).toArray((err, documents) => { + if (err || !documents) + return callback(err); + // Filter collections 
removing any illegal ones + documents = documents.filter(doc => doc.name.indexOf('$') === -1); + // Return the collection objects + callback(undefined, documents.map(d => { + return new collection_1.Collection(db, d.name, db.s.options); + })); + }); + } +} +exports.CollectionsOperation = CollectionsOperation; +//# sourceMappingURL=collections.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/collections.js.map b/node_modules/mongodb/lib/operations/collections.js.map new file mode 100644 index 000000000..31af51ab3 --- /dev/null +++ b/node_modules/mongodb/lib/operations/collections.js.map @@ -0,0 +1 @@ +{"version":3,"file":"collections.js","sourceRoot":"","sources":["../../src/operations/collections.ts"],"names":[],"mappings":";;;AAAA,2CAAkE;AAClE,8CAA2C;AAU3C,gBAAgB;AAChB,MAAa,oBAAqB,SAAQ,6BAA+B;IAIvE,YAAY,EAAM,EAAE,OAA2B;QAC7C,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;IACf,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAgC;QAC9E,MAAM,EAAE,GAAG,IAAI,CAAC,EAAE,CAAC;QAEnB,iCAAiC;QACjC,EAAE,CAAC,eAAe,CAChB,EAAE,EACF,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,QAAQ,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,CAClF,CAAC,OAAO,CAAC,CAAC,GAAG,EAAE,SAAS,EAAE,EAAE;YAC3B,IAAI,GAAG,IAAI,CAAC,SAAS;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC5C,+CAA+C;YAC/C,SAAS,GAAG,SAAS,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;YAElE,gCAAgC;YAChC,QAAQ,CACN,SAAS,EACT,SAAS,CAAC,GAAG,CAAC,CAAC,CAAC,EAAE;gBAChB,OAAO,IAAI,uBAAU,CAAC,EAAE,EAAE,CAAC,CAAC,IAAI,EAAE,EAAE,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;YAClD,CAAC,CAAC,CACH,CAAC;QACJ,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA/BD,oDA+BC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/command.js b/node_modules/mongodb/lib/operations/command.js new file mode 100644 index 000000000..1c8b704f4 --- /dev/null +++ b/node_modules/mongodb/lib/operations/command.js @@ -0,0 +1,95 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CommandOperation = void 0; +const operation_1 = require("./operation"); +const read_concern_1 = require("../read_concern"); +const write_concern_1 = require("../write_concern"); +const utils_1 = require("../utils"); +const sessions_1 = require("../sessions"); +const error_1 = require("../error"); +const explain_1 = require("../explain"); +const SUPPORTS_WRITE_CONCERN_AND_COLLATION = 5; +/** @internal */ +class CommandOperation extends operation_1.AbstractOperation { + constructor(parent, options) { + super(options); + this.options = options !== null && options !== void 0 ? options : {}; + // NOTE: this was explicitly added for the add/remove user operations, it's likely + // something we'd want to reconsider. Perhaps those commands can use `Admin` + // as a parent? + const dbNameOverride = (options === null || options === void 0 ? void 0 : options.dbName) || (options === null || options === void 0 ? void 0 : options.authdb); + if (dbNameOverride) { + this.ns = new utils_1.MongoDBNamespace(dbNameOverride, '$cmd'); + } + else { + this.ns = parent + ? 
parent.s.namespace.withCollection('$cmd') + : new utils_1.MongoDBNamespace('admin', '$cmd'); + } + this.readConcern = read_concern_1.ReadConcern.fromOptions(options); + this.writeConcern = write_concern_1.WriteConcern.fromOptions(options); + // TODO(NODE-2056): make logger another "inheritable" property + if (parent && parent.logger) { + this.logger = parent.logger; + } + if (this.hasAspect(operation_1.Aspect.EXPLAINABLE)) { + this.explain = explain_1.Explain.fromOptions(options); + } + else if ((options === null || options === void 0 ? void 0 : options.explain) != null) { + throw new error_1.MongoInvalidArgumentError(`Option "explain" is not supported on this command`); + } + } + get canRetryWrite() { + if (this.hasAspect(operation_1.Aspect.EXPLAINABLE)) { + return this.explain == null; + } + return true; + } + executeCommand(server, session, cmd, callback) { + // TODO: consider making this a non-enumerable property + this.server = server; + const options = { + ...this.options, + ...this.bsonOptions, + readPreference: this.readPreference, + session + }; + const serverWireVersion = utils_1.maxWireVersion(server); + const inTransaction = this.session && this.session.inTransaction(); + if (this.readConcern && sessions_1.commandSupportsReadConcern(cmd) && !inTransaction) { + Object.assign(cmd, { readConcern: this.readConcern }); + } + if (options.collation && serverWireVersion < SUPPORTS_WRITE_CONCERN_AND_COLLATION) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name}, which reports wire version ${serverWireVersion}, does not support collation`)); + return; + } + if (this.writeConcern && this.hasAspect(operation_1.Aspect.WRITE_OPERATION) && !inTransaction) { + Object.assign(cmd, { writeConcern: this.writeConcern }); + } + if (serverWireVersion >= SUPPORTS_WRITE_CONCERN_AND_COLLATION) { + if (options.collation && + typeof options.collation === 'object' && + !this.hasAspect(operation_1.Aspect.SKIP_COLLATION)) { + Object.assign(cmd, { collation: options.collation }); + } + } + if (typeof options.maxTimeMS === 'number') { + cmd.maxTimeMS = options.maxTimeMS; + } + if (typeof options.comment === 'string') { + cmd.comment = options.comment; + } + if (this.hasAspect(operation_1.Aspect.EXPLAINABLE) && this.explain) { + if (serverWireVersion < 6 && cmd.aggregate) { + // Prior to 3.6, with aggregate, verbosity is ignored, and we must pass in "explain: true" + cmd.explain = true; + } + else { + cmd = utils_1.decorateWithExplain(cmd, this.explain); + } + } + server.command(this.ns, cmd, options, callback); + } +} +exports.CommandOperation = CommandOperation; +//# sourceMappingURL=command.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/command.js.map b/node_modules/mongodb/lib/operations/command.js.map new file mode 100644 index 000000000..1c74467cf --- /dev/null +++ b/node_modules/mongodb/lib/operations/command.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"command.js","sourceRoot":"","sources":["../../src/operations/command.ts"],"names":[],"mappings":";;;AAAA,2CAA0E;AAC1E,kDAA8C;AAC9C,oDAAqE;AACrE,oCAA2F;AAE3F,0CAAwE;AACxE,oCAA8E;AAK9E,wCAAqD;AAErD,MAAM,oCAAoC,GAAG,CAAC,CAAC;AAgD/C,gBAAgB;AAChB,MAAsB,gBAAoB,SAAQ,6BAAoB;IAQpE,YAAY,MAAwB,EAAE,OAAiC;QACrE,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAE7B,kFAAkF;QAClF,kFAAkF;QAClF,qBAAqB;QACrB,MAAM,cAAc,GAAG,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,MAAM,MAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,MAAM,CAAA,CAAC;QAC1D,IAAI,cAAc,EAAE;YAClB,IAAI,CAAC,EAAE,GAAG,IAAI,wBAAgB,CAAC,cAAc,EAAE,MAAM,CAAC,CAAC;SACxD;aAAM;YACL,IAAI,CAAC,EAAE,GAAG,MAAM;gBACd,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,MAAM,CAAC;gBAC3C,CAAC,CAAC,IAAI,wBAAgB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;SAC3C;QAED,IAAI,CAAC,WAAW,GAAG,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QACpD,IAAI,CAAC,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QAEtD,8DAA8D;QAC9D,IAAI,MAAM,IAAI,MAAM,CAAC,MAAM,EAAE;YAC3B,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC;SAC7B;QAED,IAAI,IAAI,CAAC,SAAS,CAAC,kBAAM,CAAC,WAAW,CAAC,EAAE;YACtC,IAAI,CAAC,OAAO,GAAG,iBAAO,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;SAC7C;aAAM,IAAI,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,OAAO,KAAI,IAAI,EAAE;YACnC,MAAM,IAAI,iCAAyB,CAAC,mDAAmD,CAAC,CAAC;SAC1F;IACH,CAAC;IAED,IAAI,aAAa;QACf,IAAI,IAAI,CAAC,SAAS,CAAC,kBAAM,CAAC,WAAW,CAAC,EAAE;YACtC,OAAO,IAAI,CAAC,OAAO,IAAI,IAAI,CAAC;SAC7B;QACD,OAAO,IAAI,CAAC;IACd,CAAC;IAID,cAAc,CAAC,MAAc,EAAE,OAAsB,EAAE,GAAa,EAAE,QAAkB;QACtF,uDAAuD;QACvD,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QAErB,MAAM,OAAO,GAAG;YACd,GAAG,IAAI,CAAC,OAAO;YACf,GAAG,IAAI,CAAC,WAAW;YACnB,cAAc,EAAE,IAAI,CAAC,cAAc;YACnC,OAAO;SACR,CAAC;QAEF,MAAM,iBAAiB,GAAG,sBAAc,CAAC,MAAM,CAAC,CAAC;QACjD,MAAM,aAAa,GAAG,IAAI,CAAC,OAAO,IAAI,IAAI,CAAC,OAAO,CAAC,aAAa,EAAE,CAAC;QAEnE,IAAI,IAAI,CAAC,WAAW,IAAI,qCAA0B,CAAC,GAAG,CAAC,IAAI,CAAC,aAAa,EAAE;YACzE,MAAM,CAAC,MAAM,CAAC,GAAG,EAAE,EAAE,WAAW,EAAE,IAAI,CAAC,WAAW,EAAE,CAAC,CAAC;SACvD;QAED,IAAI,OAAO,CAAC,SAAS,IAAI,iBAAiB,GAAG,oCAAoC,EAAE;YACjF,QAAQ,CACN,IAAI,+BAAuB,CACzB,UAAU,MAAM,CAAC,IAAI,gCAAgC,iBAAiB,8BAA8B,CACrG,CACF,CAAC;YACF,OAAO;SACR;QAED,IAAI,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC,SAAS,CAAC,kBAAM,CAAC,eAAe,CAAC,IAAI,CAAC,aAAa,EAAE;YACjF,MAAM,CAAC,MAAM,CAAC,GAAG,EAAE,EAAE,YAAY,EAAE,IAAI,CAAC,YAAY,EAAE,CAAC,CAAC;SACzD;QAED,IAAI,iBAAiB,IAAI,oCAAoC,EAAE;YAC7D,IACE,OAAO,CAAC,SAAS;gBACjB,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ;gBACrC,CAAC,IAAI,CAAC,SAAS,CAAC,kBAAM,CAAC,cAAc,CAAC,EACtC;gBACA,MAAM,CAAC,MAAM,CAAC,GAAG,EAAE,EAAE,SAAS,EAAE,OAAO,CAAC,SAAS,EAAE,CAAC,CAAC;aACtD;SACF;QAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YACzC,GAAG,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SACnC;QAED,IAAI,OAAO,OAAO,CAAC,OAAO,KAAK,QAAQ,EAAE;YACvC,GAAG,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;SAC/B;QAED,IAAI,IAAI,CAAC,SAAS,CAAC,kBAAM,CAAC,WAAW,CAAC,IAAI,IAAI,CAAC,OAAO,EAAE;YACtD,IAAI,iBAAiB,GAAG,CAAC,IAAI,GAAG,CAAC,SAAS,EAAE;gBAC1C,0FAA0F;gBAC1F,GAAG,CAAC,OAAO,GAAG,IAAI,CAAC;aACpB;iBAAM;gBACL,GAAG,GAAG,2BAAmB,CAAC,GAAG,EAAE,IAAI,CAAC,OAAO,CAAC,CAAC;aAC9C;SACF;QAED,MAAM,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE,EAAE,GAAG,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAClD,CAAC;CACF;AA5GD,4CA4GC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/common_functions.js b/node_modules/mongodb/lib/operations/common_functions.js new file mode 100644 index 000000000..ac8c6b014 --- /dev/null +++ b/node_modules/mongodb/lib/operations/common_functions.js @@ -0,0 +1,64 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.prepareDocs = exports.indexInformation = void 0; 
+const error_1 = require("../error"); +const utils_1 = require("../utils"); +function indexInformation(db, name, _optionsOrCallback, _callback) { + let options = _optionsOrCallback; + let callback = _callback; + if ('function' === typeof _optionsOrCallback) { + callback = _optionsOrCallback; + options = {}; + } + // If we specified full information + const full = options.full == null ? false : options.full; + // Did the user destroy the topology + if (utils_1.getTopology(db).isDestroyed()) + return callback(new error_1.MongoTopologyClosedError()); + // Process all the results from the index command and collection + function processResults(indexes) { + // Contains all the information + const info = {}; + // Process all the indexes + for (let i = 0; i < indexes.length; i++) { + const index = indexes[i]; + // Let's unpack the object + info[index.name] = []; + for (const name in index.key) { + info[index.name].push([name, index.key[name]]); + } + } + return info; + } + // Get the list of indexes of the specified collection + db.collection(name) + .listIndexes(options) + .toArray((err, indexes) => { + if (err) + return callback(err); + if (!Array.isArray(indexes)) + return callback(undefined, []); + if (full) + return callback(undefined, indexes); + callback(undefined, processResults(indexes)); + }); +} +exports.indexInformation = indexInformation; +function prepareDocs(coll, docs, options) { + var _a; + const forceServerObjectId = typeof options.forceServerObjectId === 'boolean' + ? options.forceServerObjectId + : (_a = coll.s.db.options) === null || _a === void 0 ? void 0 : _a.forceServerObjectId; + // no need to modify the docs if server sets the ObjectId + if (forceServerObjectId === true) { + return docs; + } + return docs.map(doc => { + if (doc._id == null) { + doc._id = coll.s.pkFactory.createPk(); + } + return doc; + }); +} +exports.prepareDocs = prepareDocs; +//# sourceMappingURL=common_functions.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/common_functions.js.map b/node_modules/mongodb/lib/operations/common_functions.js.map new file mode 100644 index 000000000..fa919758c --- /dev/null +++ b/node_modules/mongodb/lib/operations/common_functions.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"common_functions.js","sourceRoot":"","sources":["../../src/operations/common_functions.ts"],"names":[],"mappings":";;;AAAA,oCAAoD;AACpD,oCAAiD;AA0BjD,SAAgB,gBAAgB,CAC9B,EAAM,EACN,IAAY,EACZ,kBAAsD,EACtD,SAAoB;IAEpB,IAAI,OAAO,GAAG,kBAA6C,CAAC;IAC5D,IAAI,QAAQ,GAAG,SAAqB,CAAC;IACrC,IAAI,UAAU,KAAK,OAAO,kBAAkB,EAAE;QAC5C,QAAQ,GAAG,kBAA8B,CAAC;QAC1C,OAAO,GAAG,EAAE,CAAC;KACd;IACD,mCAAmC;IACnC,MAAM,IAAI,GAAG,OAAO,CAAC,IAAI,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC;IAEzD,oCAAoC;IACpC,IAAI,mBAAW,CAAC,EAAE,CAAC,CAAC,WAAW,EAAE;QAAE,OAAO,QAAQ,CAAC,IAAI,gCAAwB,EAAE,CAAC,CAAC;IACnF,gEAAgE;IAChE,SAAS,cAAc,CAAC,OAAY;QAClC,+BAA+B;QAC/B,MAAM,IAAI,GAAQ,EAAE,CAAC;QACrB,0BAA0B;QAC1B,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YACvC,MAAM,KAAK,GAAG,OAAO,CAAC,CAAC,CAAC,CAAC;YACzB,0BAA0B;YAC1B,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,GAAG,EAAE,CAAC;YACtB,KAAK,MAAM,IAAI,IAAI,KAAK,CAAC,GAAG,EAAE;gBAC5B,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,IAAI,EAAE,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;aAChD;SACF;QAED,OAAO,IAAI,CAAC;IACd,CAAC;IAED,sDAAsD;IACtD,EAAE,CAAC,UAAU,CAAC,IAAI,CAAC;SAChB,WAAW,CAAC,OAAO,CAAC;SACpB,OAAO,CAAC,CAAC,GAAG,EAAE,OAAO,EAAE,EAAE;QACxB,IAAI,GAAG;YAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;QAC9B,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC;YAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,EAAE,CAAC,CAAC;QAC5D,IAAI,IAAI;YAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC9C,QAAQ,CAAC,SAAS,EAAE,cAAc,CAAC,OAAO,CAAC,CAAC,CAAC;IAC/C,CAAC,CAAC,CAAC;AACP,CAAC;AA3CD,4CA2CC;AAED,SAAgB,WAAW,CACzB,IAAgB,EAChB,IAAgB,EAChB,OAA0C;;IAE1C,MAAM,mBAAmB,GACvB,OAAO,OAAO,CAAC,mBAAmB,KAAK,SAAS;QAC9C,CAAC,CAAC,OAAO,CAAC,mBAAmB;QAC7B,CAAC,CAAC,MAAA,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,OAAO,0CAAE,mBAAmB,CAAC;IAE7C,yDAAyD;IACzD,IAAI,mBAAmB,KAAK,IAAI,EAAE;QAChC,OAAO,IAAI,CAAC;KACb;IAED,OAAO,IAAI,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE;QACpB,IAAI,GAAG,CAAC,GAAG,IAAI,IAAI,EAAE;YACnB,GAAG,CAAC,GAAG,GAAG,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,QAAQ,EAAE,CAAC;SACvC;QAED,OAAO,GAAG,CAAC;IACb,CAAC,CAAC,CAAC;AACL,CAAC;AAtBD,kCAsBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/connect.js b/node_modules/mongodb/lib/operations/connect.js new file mode 100644 index 000000000..71c69573e --- /dev/null +++ b/node_modules/mongodb/lib/operations/connect.js @@ -0,0 +1,93 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.connect = exports.MONGO_CLIENT_EVENTS = void 0; +const error_1 = require("../error"); +const topology_1 = require("../sdam/topology"); +const connection_string_1 = require("../connection_string"); +const connection_pool_1 = require("../cmap/connection_pool"); +const connection_1 = require("../cmap/connection"); +const server_1 = require("../sdam/server"); +/** @public */ +exports.MONGO_CLIENT_EVENTS = [ + ...connection_pool_1.CMAP_EVENTS, + ...connection_1.APM_EVENTS, + ...topology_1.TOPOLOGY_EVENTS, + ...server_1.HEARTBEAT_EVENTS +]; +function connect(mongoClient, options, callback) { + if (!callback) { + throw new error_1.MongoInvalidArgumentError('Callback function must be provided'); + } + // If a connection already been established, we can terminate early + if (mongoClient.topology && mongoClient.topology.isConnected()) { + return callback(undefined, mongoClient); + } + const logger = mongoClient.logger; + const connectCallback = err => { + const warningMessage = 'seed list contains no mongos proxies, replicaset connections requires ' + + 'the parameter replicaSet to be supplied in the URI or options object, ' + + 
'mongodb://server:port/db?replicaSet=name'; + if (err && err.message === 'no mongos proxies found in seed list') { + if (logger.isWarn()) { + logger.warn(warningMessage); + } + // Return a more specific error message for MongoClient.connect + // TODO(NODE-3483) + return callback(new error_1.MongoRuntimeError(warningMessage)); + } + callback(err, mongoClient); + }; + if (typeof options.srvHost === 'string') { + return connection_string_1.resolveSRVRecord(options, (err, hosts) => { + if (err || !hosts) + return callback(err); + for (const [index, host] of hosts.entries()) { + options.hosts[index] = host; + } + return createTopology(mongoClient, options, connectCallback); + }); + } + return createTopology(mongoClient, options, connectCallback); +} +exports.connect = connect; +function createTopology(mongoClient, options, callback) { + // Create the topology + const topology = new topology_1.Topology(options.hosts, options); + // Events can be emitted before initialization is complete so we have to + // save the reference to the topology on the client ASAP if the event handlers need to access it + mongoClient.topology = topology; + topology.once(topology_1.Topology.OPEN, () => mongoClient.emit('open', mongoClient)); + for (const event of exports.MONGO_CLIENT_EVENTS) { + topology.on(event, (...args) => mongoClient.emit(event, ...args)); + } + // initialize CSFLE if requested + if (mongoClient.autoEncrypter) { + mongoClient.autoEncrypter.init(err => { + if (err) { + return callback(err); + } + topology.connect(options, err => { + if (err) { + topology.close({ force: true }); + return callback(err); + } + options.encrypter.connectInternalClient(error => { + if (error) + return callback(error); + callback(undefined, topology); + }); + }); + }); + return; + } + // otherwise connect normally + topology.connect(options, err => { + if (err) { + topology.close({ force: true }); + return callback(err); + } + callback(undefined, topology); + return; + }); +} +//# sourceMappingURL=connect.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/connect.js.map b/node_modules/mongodb/lib/operations/connect.js.map new file mode 100644 index 000000000..1f127be8b --- /dev/null +++ b/node_modules/mongodb/lib/operations/connect.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"connect.js","sourceRoot":"","sources":["../../src/operations/connect.ts"],"names":[],"mappings":";;;AAAA,oCAAwE;AACxE,+CAA6D;AAC7D,4DAAwD;AAGxD,6DAAsD;AACtD,mDAAgD;AAChD,2CAAkD;AAElD,cAAc;AACD,QAAA,mBAAmB,GAAG;IACjC,GAAG,6BAAW;IACd,GAAG,uBAAU;IACb,GAAG,0BAAe;IAClB,GAAG,yBAAgB;CACX,CAAC;AAEX,SAAgB,OAAO,CACrB,WAAwB,EACxB,OAAqB,EACrB,QAA+B;IAE/B,IAAI,CAAC,QAAQ,EAAE;QACb,MAAM,IAAI,iCAAyB,CAAC,oCAAoC,CAAC,CAAC;KAC3E;IAED,mEAAmE;IACnE,IAAI,WAAW,CAAC,QAAQ,IAAI,WAAW,CAAC,QAAQ,CAAC,WAAW,EAAE,EAAE;QAC9D,OAAO,QAAQ,CAAC,SAAS,EAAE,WAAW,CAAC,CAAC;KACzC;IAED,MAAM,MAAM,GAAG,WAAW,CAAC,MAAM,CAAC;IAClC,MAAM,eAAe,GAAa,GAAG,CAAC,EAAE;QACtC,MAAM,cAAc,GAClB,wEAAwE;YACxE,wEAAwE;YACxE,0CAA0C,CAAC;QAC7C,IAAI,GAAG,IAAI,GAAG,CAAC,OAAO,KAAK,sCAAsC,EAAE;YACjE,IAAI,MAAM,CAAC,MAAM,EAAE,EAAE;gBACnB,MAAM,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC;aAC7B;YAED,+DAA+D;YAC/D,kBAAkB;YAClB,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,cAAc,CAAC,CAAC,CAAC;SACxD;QAED,QAAQ,CAAC,GAAG,EAAE,WAAW,CAAC,CAAC;IAC7B,CAAC,CAAC;IAEF,IAAI,OAAO,OAAO,CAAC,OAAO,KAAK,QAAQ,EAAE;QACvC,OAAO,oCAAgB,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;YAC9C,IAAI,GAAG,IAAI,CAAC,KAAK;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YACxC,KAAK,MAAM,CAAC,KAAK,EAAE,IAAI,CAAC,IAAI,KAAK,CAAC,OAAO,EAAE,EAAE;gBAC3C,OAAO,CAAC,KAAK,CAAC,KAAK,CAAC,GAAG,IAAI,CAAC;aAC7B;YAED,OAAO,cAAc,CAAC,WAAW,EAAE,OAAO,EAAE,eAAe,CAAC,CAAC;QAC/D,CAAC,CAAC,CAAC;KACJ;IAED,OAAO,cAAc,CAAC,WAAW,EAAE,OAAO,EAAE,eAAe,CAAC,CAAC;AAC/D,CAAC;AA7CD,0BA6CC;AAED,SAAS,cAAc,CACrB,WAAwB,EACxB,OAAqB,EACrB,QAA4B;IAE5B,sBAAsB;IACtB,MAAM,QAAQ,GAAG,IAAI,mBAAQ,CAAC,OAAO,CAAC,KAAK,EAAE,OAAO,CAAC,CAAC;IACtD,wEAAwE;IACxE,gGAAgG;IAChG,WAAW,CAAC,QAAQ,GAAG,QAAQ,CAAC;IAEhC,QAAQ,CAAC,IAAI,CAAC,mBAAQ,CAAC,IAAI,EAAE,GAAG,EAAE,CAAC,WAAW,CAAC,IAAI,CAAC,MAAM,EAAE,WAAW,CAAC,CAAC,CAAC;IAE1E,KAAK,MAAM,KAAK,IAAI,2BAAmB,EAAE;QACvC,QAAQ,CAAC,EAAE,CAAC,KAAK,EAAE,CAAC,GAAG,IAAW,EAAE,EAAE,CAAC,WAAW,CAAC,IAAI,CAAC,KAAK,EAAE,GAAI,IAAY,CAAC,CAAC,CAAC;KACnF;IAED,gCAAgC;IAChC,IAAI,WAAW,CAAC,aAAa,EAAE;QAC7B,WAAW,CAAC,aAAa,CAAC,IAAI,CAAC,GAAG,CAAC,EAAE;YACnC,IAAI,GAAG,EAAE;gBACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,QAAQ,CAAC,OAAO,CAAC,OAAO,EAAE,GAAG,CAAC,EAAE;gBAC9B,IAAI,GAAG,EAAE;oBACP,QAAQ,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;oBAChC,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;iBACtB;gBAED,OAAO,CAAC,SAAS,CAAC,qBAAqB,CAAC,KAAK,CAAC,EAAE;oBAC9C,IAAI,KAAK;wBAAE,OAAO,QAAQ,CAAC,KAAK,CAAC,CAAC;oBAElC,QAAQ,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC;gBAChC,CAAC,CAAC,CAAC;YACL,CAAC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;QAEH,OAAO;KACR;IAED,6BAA6B;IAC7B,QAAQ,CAAC,OAAO,CAAC,OAAO,EAAE,GAAG,CAAC,EAAE;QAC9B,IAAI,GAAG,EAAE;YACP,QAAQ,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;YAChC,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,QAAQ,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC;QAC9B,OAAO;IACT,CAAC,CAAC,CAAC;AACL,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/count.js b/node_modules/mongodb/lib/operations/count.js new file mode 100644 index 000000000..ba2a5b486 --- /dev/null +++ b/node_modules/mongodb/lib/operations/count.js @@ -0,0 +1,39 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CountOperation = void 0; +const operation_1 = require("./operation"); +const command_1 = require("./command"); +/** @internal */ +class CountOperation extends command_1.CommandOperation { + constructor(namespace, filter, options) { + super({ s: { namespace: namespace } }, options); + this.options = options; + this.collectionName = namespace.collection; + this.query = filter; + } + execute(server, session, callback) { + const options = 
this.options; + const cmd = { + count: this.collectionName, + query: this.query + }; + if (typeof options.limit === 'number') { + cmd.limit = options.limit; + } + if (typeof options.skip === 'number') { + cmd.skip = options.skip; + } + if (options.hint != null) { + cmd.hint = options.hint; + } + if (typeof options.maxTimeMS === 'number') { + cmd.maxTimeMS = options.maxTimeMS; + } + super.executeCommand(server, session, cmd, (err, result) => { + callback(err, result ? result.n : 0); + }); + } +} +exports.CountOperation = CountOperation; +operation_1.defineAspects(CountOperation, [operation_1.Aspect.READ_OPERATION, operation_1.Aspect.RETRYABLE]); +//# sourceMappingURL=count.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/count.js.map b/node_modules/mongodb/lib/operations/count.js.map new file mode 100644 index 000000000..6da1b437a --- /dev/null +++ b/node_modules/mongodb/lib/operations/count.js.map @@ -0,0 +1 @@ +{"version":3,"file":"count.js","sourceRoot":"","sources":["../../src/operations/count.ts"],"names":[],"mappings":";;;AAAA,2CAAoD;AACpD,uCAAsE;AAmBtE,gBAAgB;AAChB,MAAa,cAAe,SAAQ,0BAAwB;IAK1D,YAAY,SAA2B,EAAE,MAAgB,EAAE,OAAqB;QAC9E,KAAK,CAAC,EAAE,CAAC,EAAE,EAAE,SAAS,EAAE,SAAS,EAAE,EAA2B,EAAE,OAAO,CAAC,CAAC;QAEzE,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC,UAAU,CAAC;QAC3C,IAAI,CAAC,KAAK,GAAG,MAAM,CAAC;IACtB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA0B;QACxE,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAC7B,MAAM,GAAG,GAAa;YACpB,KAAK,EAAE,IAAI,CAAC,cAAc;YAC1B,KAAK,EAAE,IAAI,CAAC,KAAK;SAClB,CAAC;QAEF,IAAI,OAAO,OAAO,CAAC,KAAK,KAAK,QAAQ,EAAE;YACrC,GAAG,CAAC,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC;SAC3B;QAED,IAAI,OAAO,OAAO,CAAC,IAAI,KAAK,QAAQ,EAAE;YACpC,GAAG,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;SACzB;QAED,IAAI,OAAO,CAAC,IAAI,IAAI,IAAI,EAAE;YACxB,GAAG,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;SACzB;QAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YACzC,GAAG,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SACnC;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACzD,QAAQ,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;QACvC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAxCD,wCAwCC;AAED,yBAAa,CAAC,cAAc,EAAE,CAAC,kBAAM,CAAC,cAAc,EAAE,kBAAM,CAAC,SAAS,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/count_documents.js b/node_modules/mongodb/lib/operations/count_documents.js new file mode 100644 index 000000000..562c9c068 --- /dev/null +++ b/node_modules/mongodb/lib/operations/count_documents.js @@ -0,0 +1,37 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CountDocumentsOperation = void 0; +const aggregate_1 = require("./aggregate"); +/** @internal */ +class CountDocumentsOperation extends aggregate_1.AggregateOperation { + constructor(collection, query, options) { + const pipeline = []; + pipeline.push({ $match: query }); + if (typeof options.skip === 'number') { + pipeline.push({ $skip: options.skip }); + } + if (typeof options.limit === 'number') { + pipeline.push({ $limit: options.limit }); + } + pipeline.push({ $group: { _id: 1, n: { $sum: 1 } } }); + super(collection.s.namespace, pipeline, options); + } + execute(server, session, callback) { + super.execute(server, session, (err, result) => { + if (err || !result) { + callback(err); + return; + } + // NOTE: We're avoiding creating a cursor here to reduce the callstack. 
+ const response = result; + if (response.cursor == null || response.cursor.firstBatch == null) { + callback(undefined, 0); + return; + } + const docs = response.cursor.firstBatch; + callback(undefined, docs.length ? docs[0].n : 0); + }); + } +} +exports.CountDocumentsOperation = CountDocumentsOperation; +//# sourceMappingURL=count_documents.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/count_documents.js.map b/node_modules/mongodb/lib/operations/count_documents.js.map new file mode 100644 index 000000000..8faec0eb9 --- /dev/null +++ b/node_modules/mongodb/lib/operations/count_documents.js.map @@ -0,0 +1 @@ +{"version":3,"file":"count_documents.js","sourceRoot":"","sources":["../../src/operations/count_documents.ts"],"names":[],"mappings":";;;AAAA,2CAAmE;AAenE,gBAAgB;AAChB,MAAa,uBAAwB,SAAQ,8BAA0B;IACrE,YAAY,UAAsB,EAAE,KAAe,EAAE,OAA8B;QACjF,MAAM,QAAQ,GAAG,EAAE,CAAC;QACpB,QAAQ,CAAC,IAAI,CAAC,EAAE,MAAM,EAAE,KAAK,EAAE,CAAC,CAAC;QAEjC,IAAI,OAAO,OAAO,CAAC,IAAI,KAAK,QAAQ,EAAE;YACpC,QAAQ,CAAC,IAAI,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,IAAI,EAAE,CAAC,CAAC;SACxC;QAED,IAAI,OAAO,OAAO,CAAC,KAAK,KAAK,QAAQ,EAAE;YACrC,QAAQ,CAAC,IAAI,CAAC,EAAE,MAAM,EAAE,OAAO,CAAC,KAAK,EAAE,CAAC,CAAC;SAC1C;QAED,QAAQ,CAAC,IAAI,CAAC,EAAE,MAAM,EAAE,EAAE,GAAG,EAAE,CAAC,EAAE,CAAC,EAAE,EAAE,IAAI,EAAE,CAAC,EAAE,EAAE,EAAE,CAAC,CAAC;QAEtD,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,SAAS,EAAE,QAAQ,EAAE,OAAO,CAAC,CAAC;IACnD,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA0B;QACxE,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YAC7C,IAAI,GAAG,IAAI,CAAC,MAAM,EAAE;gBAClB,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,uEAAuE;YACvE,MAAM,QAAQ,GAAG,MAA6B,CAAC;YAC/C,IAAI,QAAQ,CAAC,MAAM,IAAI,IAAI,IAAI,QAAQ,CAAC,MAAM,CAAC,UAAU,IAAI,IAAI,EAAE;gBACjE,QAAQ,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC;gBACvB,OAAO;aACR;YAED,MAAM,IAAI,GAAG,QAAQ,CAAC,MAAM,CAAC,UAAU,CAAC;YACxC,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;QACnD,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AApCD,0DAoCC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/create_collection.js b/node_modules/mongodb/lib/operations/create_collection.js new file mode 100644 index 000000000..3841a5af0 --- /dev/null +++ b/node_modules/mongodb/lib/operations/create_collection.js @@ -0,0 +1,60 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.CreateCollectionOperation = void 0; +const command_1 = require("./command"); +const operation_1 = require("./operation"); +const collection_1 = require("../collection"); +const ILLEGAL_COMMAND_FIELDS = new Set([ + 'w', + 'wtimeout', + 'j', + 'fsync', + 'autoIndexId', + 'pkFactory', + 'raw', + 'readPreference', + 'session', + 'readConcern', + 'writeConcern', + 'raw', + 'fieldsAsRaw', + 'promoteLongs', + 'promoteValues', + 'promoteBuffers', + 'bsonRegExp', + 'serializeFunctions', + 'ignoreUndefined' +]); +/** @internal */ +class CreateCollectionOperation extends command_1.CommandOperation { + constructor(db, name, options = {}) { + super(db, options); + this.options = options; + this.db = db; + this.name = name; + } + execute(server, session, callback) { + const db = this.db; + const name = this.name; + const options = this.options; + const done = err => { + if (err) { + return callback(err); + } + callback(undefined, new collection_1.Collection(db, name, options)); + }; + const cmd = { create: name }; + for (const n in options) { + if (options[n] != null && + typeof options[n] !== 'function' && + 
!ILLEGAL_COMMAND_FIELDS.has(n)) { + cmd[n] = options[n]; + } + } + // otherwise just execute the command + super.executeCommand(server, session, cmd, done); + } +} +exports.CreateCollectionOperation = CreateCollectionOperation; +operation_1.defineAspects(CreateCollectionOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=create_collection.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/create_collection.js.map b/node_modules/mongodb/lib/operations/create_collection.js.map new file mode 100644 index 000000000..15f50e08a --- /dev/null +++ b/node_modules/mongodb/lib/operations/create_collection.js.map @@ -0,0 +1 @@ +{"version":3,"file":"create_collection.js","sourceRoot":"","sources":["../../src/operations/create_collection.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AACtE,2CAAoD;AACpD,8CAA2C;AAQ3C,MAAM,sBAAsB,GAAG,IAAI,GAAG,CAAC;IACrC,GAAG;IACH,UAAU;IACV,GAAG;IACH,OAAO;IACP,aAAa;IACb,WAAW;IACX,KAAK;IACL,gBAAgB;IAChB,SAAS;IACT,aAAa;IACb,cAAc;IACd,KAAK;IACL,aAAa;IACb,cAAc;IACd,eAAe;IACf,gBAAgB;IAChB,YAAY;IACZ,oBAAoB;IACpB,iBAAiB;CAClB,CAAC,CAAC;AAgDH,gBAAgB;AAChB,MAAa,yBAA0B,SAAQ,0BAA4B;IAKzE,YAAY,EAAM,EAAE,IAAY,EAAE,UAAmC,EAAE;QACrE,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAEnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;IACnB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA8B;QAC5E,MAAM,EAAE,GAAG,IAAI,CAAC,EAAE,CAAC;QACnB,MAAM,IAAI,GAAG,IAAI,CAAC,IAAI,CAAC;QACvB,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE7B,MAAM,IAAI,GAAa,GAAG,CAAC,EAAE;YAC3B,IAAI,GAAG,EAAE;gBACP,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,QAAQ,CAAC,SAAS,EAAE,IAAI,uBAAU,CAAC,EAAE,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC,CAAC;QACzD,CAAC,CAAC;QAEF,MAAM,GAAG,GAAa,EAAE,MAAM,EAAE,IAAI,EAAE,CAAC;QACvC,KAAK,MAAM,CAAC,IAAI,OAAO,EAAE;YACvB,IACG,OAAe,CAAC,CAAC,CAAC,IAAI,IAAI;gBAC3B,OAAQ,OAAe,CAAC,CAAC,CAAC,KAAK,UAAU;gBACzC,CAAC,sBAAsB,CAAC,GAAG,CAAC,CAAC,CAAC,EAC9B;gBACA,GAAG,CAAC,CAAC,CAAC,GAAI,OAAe,CAAC,CAAC,CAAC,CAAC;aAC9B;SACF;QAED,qCAAqC;QACrC,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,IAAI,CAAC,CAAC;IACnD,CAAC;CACF;AAxCD,8DAwCC;AAED,yBAAa,CAAC,yBAAyB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/delete.js b/node_modules/mongodb/lib/operations/delete.js new file mode 100644 index 000000000..c43ba3f6b --- /dev/null +++ b/node_modules/mongodb/lib/operations/delete.js @@ -0,0 +1,133 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.makeDeleteStatement = exports.DeleteManyOperation = exports.DeleteOneOperation = exports.DeleteOperation = void 0; +const operation_1 = require("./operation"); +const command_1 = require("./command"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +/** @internal */ +class DeleteOperation extends command_1.CommandOperation { + constructor(ns, statements, options) { + super(undefined, options); + this.options = options; + this.ns = ns; + this.statements = statements; + } + get canRetryWrite() { + if (super.canRetryWrite === false) { + return false; + } + return this.statements.every(op => (op.limit != null ? op.limit > 0 : true)); + } + execute(server, session, callback) { + var _a; + const options = (_a = this.options) !== null && _a !== void 0 ? _a : {}; + const ordered = typeof options.ordered === 'boolean' ? 
options.ordered : true; + const command = { + delete: this.ns.collection, + deletes: this.statements, + ordered + }; + if (options.let) { + command.let = options.let; + } + if (options.explain != null && utils_1.maxWireVersion(server) < 3) { + return callback + ? callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support explain on delete`)) + : undefined; + } + const unacknowledgedWrite = this.writeConcern && this.writeConcern.w === 0; + if (unacknowledgedWrite || utils_1.maxWireVersion(server) < 5) { + if (this.statements.find((o) => o.hint)) { + callback(new error_1.MongoCompatibilityError(`Servers < 3.4 do not support hint on delete`)); + return; + } + } + const statementWithCollation = this.statements.find(statement => !!statement.collation); + if (statementWithCollation && utils_1.collationNotSupported(server, statementWithCollation)) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support collation`)); + return; + } + super.executeCommand(server, session, command, callback); + } +} +exports.DeleteOperation = DeleteOperation; +class DeleteOneOperation extends DeleteOperation { + constructor(collection, filter, options) { + super(collection.s.namespace, [makeDeleteStatement(filter, { ...options, limit: 1 })], options); + } + execute(server, session, callback) { + super.execute(server, session, (err, res) => { + var _a, _b; + if (err || res == null) + return callback(err); + if (res.code) + return callback(new error_1.MongoServerError(res)); + if (res.writeErrors) + return callback(new error_1.MongoServerError(res.writeErrors[0])); + if (this.explain) + return callback(undefined, res); + callback(undefined, { + acknowledged: (_b = ((_a = this.writeConcern) === null || _a === void 0 ? void 0 : _a.w) !== 0) !== null && _b !== void 0 ? _b : true, + deletedCount: res.n + }); + }); + } +} +exports.DeleteOneOperation = DeleteOneOperation; +class DeleteManyOperation extends DeleteOperation { + constructor(collection, filter, options) { + super(collection.s.namespace, [makeDeleteStatement(filter, options)], options); + } + execute(server, session, callback) { + super.execute(server, session, (err, res) => { + var _a, _b; + if (err || res == null) + return callback(err); + if (res.code) + return callback(new error_1.MongoServerError(res)); + if (res.writeErrors) + return callback(new error_1.MongoServerError(res.writeErrors[0])); + if (this.explain) + return callback(undefined, res); + callback(undefined, { + acknowledged: (_b = ((_a = this.writeConcern) === null || _a === void 0 ? void 0 : _a.w) !== 0) !== null && _b !== void 0 ? _b : true, + deletedCount: res.n + }); + }); + } +} +exports.DeleteManyOperation = DeleteManyOperation; +function makeDeleteStatement(filter, options) { + const op = { + q: filter, + limit: typeof options.limit === 'number' ? 
options.limit : 0 + }; + if (options.single === true) { + op.limit = 1; + } + if (options.collation) { + op.collation = options.collation; + } + if (options.hint) { + op.hint = options.hint; + } + if (options.comment) { + op.comment = options.comment; + } + return op; +} +exports.makeDeleteStatement = makeDeleteStatement; +operation_1.defineAspects(DeleteOperation, [operation_1.Aspect.RETRYABLE, operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(DeleteOneOperation, [ + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.WRITE_OPERATION, + operation_1.Aspect.EXPLAINABLE, + operation_1.Aspect.SKIP_COLLATION +]); +operation_1.defineAspects(DeleteManyOperation, [ + operation_1.Aspect.WRITE_OPERATION, + operation_1.Aspect.EXPLAINABLE, + operation_1.Aspect.SKIP_COLLATION +]); +//# sourceMappingURL=delete.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/delete.js.map b/node_modules/mongodb/lib/operations/delete.js.map new file mode 100644 index 000000000..ee3b68834 --- /dev/null +++ b/node_modules/mongodb/lib/operations/delete.js.map @@ -0,0 +1 @@ +{"version":3,"file":"delete.js","sourceRoot":"","sources":["../../src/operations/delete.ts"],"names":[],"mappings":";;;AAAA,2CAA0D;AAC1D,uCAAwF;AACxF,oCAA6F;AAK7F,oCAAqE;AA0CrE,gBAAgB;AAChB,MAAa,eAAgB,SAAQ,0BAA0B;IAI7D,YAAY,EAAoB,EAAE,UAA6B,EAAE,OAAsB;QACrF,KAAK,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC1B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,IAAI,aAAa;QACf,IAAI,KAAK,CAAC,aAAa,KAAK,KAAK,EAAE;YACjC,OAAO,KAAK,CAAC;SACd;QAED,OAAO,IAAI,CAAC,UAAU,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE,CAAC,CAAC,EAAE,CAAC,KAAK,IAAI,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,KAAK,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC;IAC/E,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAkB;;QAChE,MAAM,OAAO,GAAG,MAAA,IAAI,CAAC,OAAO,mCAAI,EAAE,CAAC;QACnC,MAAM,OAAO,GAAG,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC;QAC9E,MAAM,OAAO,GAAa;YACxB,MAAM,EAAE,IAAI,CAAC,EAAE,CAAC,UAAU;YAC1B,OAAO,EAAE,IAAI,CAAC,UAAU;YACxB,OAAO;SACR,CAAC;QAEF,IAAI,OAAO,CAAC,GAAG,EAAE;YACf,OAAO,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;SAC3B;QAED,IAAI,OAAO,CAAC,OAAO,IAAI,IAAI,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YACzD,OAAO,QAAQ;gBACb,CAAC,CAAC,QAAQ,CACN,IAAI,+BAAuB,CAAC,UAAU,MAAM,CAAC,IAAI,qCAAqC,CAAC,CACxF;gBACH,CAAC,CAAC,SAAS,CAAC;SACf;QAED,MAAM,mBAAmB,GAAG,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC,YAAY,CAAC,CAAC,KAAK,CAAC,CAAC;QAC3E,IAAI,mBAAmB,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YACrD,IAAI,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC,CAAW,EAAE,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE;gBACjD,QAAQ,CAAC,IAAI,+BAAuB,CAAC,6CAA6C,CAAC,CAAC,CAAC;gBACrF,OAAO;aACR;SACF;QAED,MAAM,sBAAsB,GAAG,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,EAAE,CAAC,CAAC,CAAC,SAAS,CAAC,SAAS,CAAC,CAAC;QACxF,IAAI,sBAAsB,IAAI,6BAAqB,CAAC,MAAM,EAAE,sBAAsB,CAAC,EAAE;YACnF,QAAQ,CAAC,IAAI,+BAAuB,CAAC,UAAU,MAAM,CAAC,IAAI,6BAA6B,CAAC,CAAC,CAAC;YAC1F,OAAO;SACR;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AAxDD,0CAwDC;AAED,MAAa,kBAAmB,SAAQ,eAAe;IACrD,YAAY,UAAsB,EAAE,MAAgB,EAAE,OAAsB;QAC1E,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,SAAS,EAAE,CAAC,mBAAmB,CAAC,MAAM,EAAE,EAAE,GAAG,OAAO,EAAE,KAAK,EAAE,CAAC,EAAE,CAAC,CAAC,EAAE,OAAO,CAAC,CAAC;IAClG,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAgC;QAC9E,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YAC1C,IAAI,GAAG,IAAI,GAAG,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC7C,IAAI,GAAG,CAAC,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;YACzD,IAAI,G
AAG,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAC/E,IAAI,IAAI,CAAC,OAAO;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC;YAElD,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAChD,YAAY,EAAE,GAAG,CAAC,CAAC;aACpB,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAlBD,gDAkBC;AAED,MAAa,mBAAoB,SAAQ,eAAe;IACtD,YAAY,UAAsB,EAAE,MAAgB,EAAE,OAAsB;QAC1E,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,SAAS,EAAE,CAAC,mBAAmB,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC,EAAE,OAAO,CAAC,CAAC;IACjF,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAgC;QAC9E,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YAC1C,IAAI,GAAG,IAAI,GAAG,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC7C,IAAI,GAAG,CAAC,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;YACzD,IAAI,GAAG,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAC/E,IAAI,IAAI,CAAC,OAAO;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC;YAElD,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAChD,YAAY,EAAE,GAAG,CAAC,CAAC;aACpB,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAlBD,kDAkBC;AAED,SAAgB,mBAAmB,CACjC,MAAgB,EAChB,OAA2C;IAE3C,MAAM,EAAE,GAAoB;QAC1B,CAAC,EAAE,MAAM;QACT,KAAK,EAAE,OAAO,OAAO,CAAC,KAAK,KAAK,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;KAC7D,CAAC;IAEF,IAAI,OAAO,CAAC,MAAM,KAAK,IAAI,EAAE;QAC3B,EAAE,CAAC,KAAK,GAAG,CAAC,CAAC;KACd;IAED,IAAI,OAAO,CAAC,SAAS,EAAE;QACrB,EAAE,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;KAClC;IAED,IAAI,OAAO,CAAC,IAAI,EAAE;QAChB,EAAE,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;KACxB;IAED,IAAI,OAAO,CAAC,OAAO,EAAE;QACnB,EAAE,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;KAC9B;IAED,OAAO,EAAE,CAAC;AACZ,CAAC;AA1BD,kDA0BC;AAED,yBAAa,CAAC,eAAe,EAAE,CAAC,kBAAM,CAAC,SAAS,EAAE,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAC3E,yBAAa,CAAC,kBAAkB,EAAE;IAChC,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,eAAe;IACtB,kBAAM,CAAC,WAAW;IAClB,kBAAM,CAAC,cAAc;CACtB,CAAC,CAAC;AACH,yBAAa,CAAC,mBAAmB,EAAE;IACjC,kBAAM,CAAC,eAAe;IACtB,kBAAM,CAAC,WAAW;IAClB,kBAAM,CAAC,cAAc;CACtB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/distinct.js b/node_modules/mongodb/lib/operations/distinct.js new file mode 100644 index 000000000..cfed93842 --- /dev/null +++ b/node_modules/mongodb/lib/operations/distinct.js @@ -0,0 +1,67 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.DistinctOperation = void 0; +const operation_1 = require("./operation"); +const command_1 = require("./command"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +/** + * Return a list of distinct values for the given key across a collection. + * @internal + */ +class DistinctOperation extends command_1.CommandOperation { + /** + * Construct a Distinct operation. + * + * @param collection - Collection instance. + * @param key - Field of the document to find distinct values for. + * @param query - The query for filtering the set of documents to which we apply the distinct filter. + * @param options - Optional settings. See Collection.prototype.distinct for a list of options. + */ + constructor(collection, key, query, options) { + super(collection, options); + this.options = options !== null && options !== void 0 ? 
options : {}; + this.collection = collection; + this.key = key; + this.query = query; + } + execute(server, session, callback) { + const coll = this.collection; + const key = this.key; + const query = this.query; + const options = this.options; + // Distinct command + const cmd = { + distinct: coll.collectionName, + key: key, + query: query + }; + // Add maxTimeMS if defined + if (typeof options.maxTimeMS === 'number') { + cmd.maxTimeMS = options.maxTimeMS; + } + // Do we have a readConcern specified + utils_1.decorateWithReadConcern(cmd, coll, options); + // Have we specified collation + try { + utils_1.decorateWithCollation(cmd, coll, options); + } + catch (err) { + return callback(err); + } + if (this.explain && utils_1.maxWireVersion(server) < 4) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support explain on distinct`)); + return; + } + super.executeCommand(server, session, cmd, (err, result) => { + if (err) { + callback(err); + return; + } + callback(undefined, this.explain ? result : result.values); + }); + } +} +exports.DistinctOperation = DistinctOperation; +operation_1.defineAspects(DistinctOperation, [operation_1.Aspect.READ_OPERATION, operation_1.Aspect.RETRYABLE, operation_1.Aspect.EXPLAINABLE]); +//# sourceMappingURL=distinct.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/distinct.js.map b/node_modules/mongodb/lib/operations/distinct.js.map new file mode 100644 index 000000000..ba4e678d0 --- /dev/null +++ b/node_modules/mongodb/lib/operations/distinct.js.map @@ -0,0 +1 @@ +{"version":3,"file":"distinct.js","sourceRoot":"","sources":["../../src/operations/distinct.ts"],"names":[],"mappings":";;;AAAA,2CAAoD;AACpD,uCAAsE;AACtE,oCAAoG;AAIpG,oCAAmD;AAMnD;;;GAGG;AACH,MAAa,iBAAkB,SAAQ,0BAAuB;IAQ5D;;;;;;;OAOG;IACH,YAAY,UAAsB,EAAE,GAAW,EAAE,KAAe,EAAE,OAAyB;QACzF,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAE3B,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;QACf,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;IACrB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAyB;QACvE,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAC;QACrB,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC;QACzB,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE7B,mBAAmB;QACnB,MAAM,GAAG,GAAa;YACpB,QAAQ,EAAE,IAAI,CAAC,cAAc;YAC7B,GAAG,EAAE,GAAG;YACR,KAAK,EAAE,KAAK;SACb,CAAC;QAEF,2BAA2B;QAC3B,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YACzC,GAAG,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SACnC;QAED,qCAAqC;QACrC,+BAAuB,CAAC,GAAG,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;QAE5C,8BAA8B;QAC9B,IAAI;YACF,6BAAqB,CAAC,GAAG,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;SAC3C;QAAC,OAAO,GAAG,EAAE;YACZ,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,IAAI,IAAI,CAAC,OAAO,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YAC9C,QAAQ,CACN,IAAI,+BAAuB,CAAC,UAAU,MAAM,CAAC,IAAI,uCAAuC,CAAC,CAC1F,CAAC;YACF,OAAO;SACR;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACzD,IAAI,GAAG,EAAE;gBACP,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC;QAC7D,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AArED,8CAqEC;AAED,yBAAa,CAAC,iBAAiB,EAAE,CAAC,kBAAM,CAAC,cAAc,EAAE,kBAAM,CAAC,SAAS,EAAE,kBAAM,CAAC,WAAW,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/drop.js b/node_modules/mongodb/lib/operations/drop.js new file mode 100644 index 000000000..7bda79770 --- /dev/null +++ b/node_modules/mongodb/lib/operations/drop.js @@ -0,0 +1,43 
@@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.DropDatabaseOperation = exports.DropCollectionOperation = void 0; +const operation_1 = require("./operation"); +const command_1 = require("./command"); +/** @internal */ +class DropCollectionOperation extends command_1.CommandOperation { + constructor(db, name, options) { + super(db, options); + this.options = options; + this.name = name; + } + execute(server, session, callback) { + super.executeCommand(server, session, { drop: this.name }, (err, result) => { + if (err) + return callback(err); + if (result.ok) + return callback(undefined, true); + callback(undefined, false); + }); + } +} +exports.DropCollectionOperation = DropCollectionOperation; +/** @internal */ +class DropDatabaseOperation extends command_1.CommandOperation { + constructor(db, options) { + super(db, options); + this.options = options; + } + execute(server, session, callback) { + super.executeCommand(server, session, { dropDatabase: 1 }, (err, result) => { + if (err) + return callback(err); + if (result.ok) + return callback(undefined, true); + callback(undefined, false); + }); + } +} +exports.DropDatabaseOperation = DropDatabaseOperation; +operation_1.defineAspects(DropCollectionOperation, [operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(DropDatabaseOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=drop.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/drop.js.map b/node_modules/mongodb/lib/operations/drop.js.map new file mode 100644 index 000000000..1517cad08 --- /dev/null +++ b/node_modules/mongodb/lib/operations/drop.js.map @@ -0,0 +1 @@ +{"version":3,"file":"drop.js","sourceRoot":"","sources":["../../src/operations/drop.ts"],"names":[],"mappings":";;;AAAA,2CAAoD;AACpD,uCAAsE;AAStE,gBAAgB;AAChB,MAAa,uBAAwB,SAAQ,0BAAyB;IAIpE,YAAY,EAAM,EAAE,IAAY,EAAE,OAA8B;QAC9D,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;IACnB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA2B;QACzE,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,EAAE,IAAI,EAAE,IAAI,CAAC,IAAI,EAAE,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACzE,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,IAAI,MAAM,CAAC,EAAE;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;YAChD,QAAQ,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;QAC7B,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAjBD,0DAiBC;AAKD,gBAAgB;AAChB,MAAa,qBAAsB,SAAQ,0BAAyB;IAGlE,YAAY,EAAM,EAAE,OAA4B;QAC9C,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IACD,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA2B;QACzE,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,EAAE,YAAY,EAAE,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACzE,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,IAAI,MAAM,CAAC,EAAE;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;YAChD,QAAQ,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;QAC7B,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAdD,sDAcC;AAED,yBAAa,CAAC,uBAAuB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AACjE,yBAAa,CAAC,qBAAqB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/estimated_document_count.js b/node_modules/mongodb/lib/operations/estimated_document_count.js new file mode 100644 index 000000000..4366d756c --- /dev/null +++ b/node_modules/mongodb/lib/operations/estimated_document_count.js @@ -0,0 +1,52 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.EstimatedDocumentCountOperation = void 0; +const operation_1 = 
require("./operation"); +const command_1 = require("./command"); +const utils_1 = require("../utils"); +/** @internal */ +class EstimatedDocumentCountOperation extends command_1.CommandOperation { + constructor(collection, options = {}) { + super(collection, options); + this.options = options; + this.collectionName = collection.collectionName; + } + execute(server, session, callback) { + if (utils_1.maxWireVersion(server) < 12) { + return this.executeLegacy(server, session, callback); + } + const pipeline = [{ $collStats: { count: {} } }, { $group: { _id: 1, n: { $sum: '$count' } } }]; + const cmd = { aggregate: this.collectionName, pipeline, cursor: {} }; + if (typeof this.options.maxTimeMS === 'number') { + cmd.maxTimeMS = this.options.maxTimeMS; + } + super.executeCommand(server, session, cmd, (err, response) => { + var _a, _b; + if (err && err.code !== 26) { + callback(err); + return; + } + callback(undefined, ((_b = (_a = response === null || response === void 0 ? void 0 : response.cursor) === null || _a === void 0 ? void 0 : _a.firstBatch[0]) === null || _b === void 0 ? void 0 : _b.n) || 0); + }); + } + executeLegacy(server, session, callback) { + const cmd = { count: this.collectionName }; + if (typeof this.options.maxTimeMS === 'number') { + cmd.maxTimeMS = this.options.maxTimeMS; + } + super.executeCommand(server, session, cmd, (err, response) => { + if (err) { + callback(err); + return; + } + callback(undefined, response.n || 0); + }); + } +} +exports.EstimatedDocumentCountOperation = EstimatedDocumentCountOperation; +operation_1.defineAspects(EstimatedDocumentCountOperation, [ + operation_1.Aspect.READ_OPERATION, + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.CURSOR_CREATING +]); +//# sourceMappingURL=estimated_document_count.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/estimated_document_count.js.map b/node_modules/mongodb/lib/operations/estimated_document_count.js.map new file mode 100644 index 000000000..f5f92a79d --- /dev/null +++ b/node_modules/mongodb/lib/operations/estimated_document_count.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"estimated_document_count.js","sourceRoot":"","sources":["../../src/operations/estimated_document_count.ts"],"names":[],"mappings":";;;AAAA,2CAAoD;AACpD,uCAAsE;AACtE,oCAAoD;AAiBpD,gBAAgB;AAChB,MAAa,+BAAgC,SAAQ,0BAAwB;IAI3E,YAAY,UAAsB,EAAE,UAAyC,EAAE;QAC7E,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAC3B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,cAAc,GAAG,UAAU,CAAC,cAAc,CAAC;IAClD,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA0B;QACxE,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,EAAE,EAAE;YAC/B,OAAO,IAAI,CAAC,aAAa,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;SACtD;QACD,MAAM,QAAQ,GAAG,CAAC,EAAE,UAAU,EAAE,EAAE,KAAK,EAAE,EAAE,EAAE,EAAE,EAAE,EAAE,MAAM,EAAE,EAAE,GAAG,EAAE,CAAC,EAAE,CAAC,EAAE,EAAE,IAAI,EAAE,QAAQ,EAAE,EAAE,EAAE,CAAC,CAAC;QAEhG,MAAM,GAAG,GAAa,EAAE,SAAS,EAAE,IAAI,CAAC,cAAc,EAAE,QAAQ,EAAE,MAAM,EAAE,EAAE,EAAE,CAAC;QAE/E,IAAI,OAAO,IAAI,CAAC,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YAC9C,GAAG,CAAC,SAAS,GAAG,IAAI,CAAC,OAAO,CAAC,SAAS,CAAC;SACxC;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;;YAC3D,IAAI,GAAG,IAAK,GAAwB,CAAC,IAAI,KAAK,EAAE,EAAE;gBAChD,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,QAAQ,CAAC,SAAS,EAAE,CAAA,MAAA,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,MAAM,0CAAE,UAAU,CAAC,CAAC,CAAC,0CAAE,CAAC,KAAI,CAAC,CAAC,CAAC;QAC/D,CAAC,CAAC,CAAC;IACL,CAAC;IAED,aAAa,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA0B;QAC9E,MAAM,GAAG,GAAa,EAAE,KAAK,EAAE,IAAI,CAAC,cAAc,EAAE,CAAC;QAErD,IAAI,OAAO,IAAI,CAAC,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YAC9C,GAAG,CAAC,SAAS,GAAG,IAAI,CAAC,OAAO,CAAC,SAAS,CAAC;SACxC;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YAC3D,IAAI,GAAG,EAAE;gBACP,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,QAAQ,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC;QACvC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAhDD,0EAgDC;AAED,yBAAa,CAAC,+BAA+B,EAAE;IAC7C,kBAAM,CAAC,cAAc;IACrB,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,eAAe;CACvB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/eval.js b/node_modules/mongodb/lib/operations/eval.js new file mode 100644 index 000000000..83a3648b4 --- /dev/null +++ b/node_modules/mongodb/lib/operations/eval.js @@ -0,0 +1,55 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.EvalOperation = void 0; +const command_1 = require("./command"); +const bson_1 = require("../bson"); +const read_preference_1 = require("../read_preference"); +const error_1 = require("../error"); +/** @internal */ +class EvalOperation extends command_1.CommandOperation { + constructor(db, code, parameters, options) { + super(db, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.code = code; + this.parameters = parameters; + // force primary read preference + Object.defineProperty(this, 'readPreference', { + value: read_preference_1.ReadPreference.primary, + configurable: false, + writable: false + }); + } + execute(server, session, callback) { + let finalCode = this.code; + let finalParameters = []; + // If not a code object translate to one + if (!(finalCode && finalCode._bsontype === 'Code')) { + finalCode = new bson_1.Code(finalCode); + } + // Ensure the parameters are correct + if (this.parameters != null && typeof this.parameters !== 'function') { + finalParameters = Array.isArray(this.parameters) ? 
this.parameters : [this.parameters]; + } + // Create execution selector + const cmd = { $eval: finalCode, args: finalParameters }; + // Check if the nolock parameter is passed in + if (this.options.nolock) { + cmd.nolock = this.options.nolock; + } + // Execute the command + super.executeCommand(server, session, cmd, (err, result) => { + if (err) + return callback(err); + if (result && result.ok === 1) { + return callback(undefined, result.retval); + } + if (result) { + callback(new error_1.MongoServerError({ message: `eval failed: ${result.errmsg}` })); + return; + } + callback(err, result); + }); + } +} +exports.EvalOperation = EvalOperation; +//# sourceMappingURL=eval.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/eval.js.map b/node_modules/mongodb/lib/operations/eval.js.map new file mode 100644 index 000000000..bbd955d52 --- /dev/null +++ b/node_modules/mongodb/lib/operations/eval.js.map @@ -0,0 +1 @@ +{"version":3,"file":"eval.js","sourceRoot":"","sources":["../../src/operations/eval.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AACtE,kCAAyC;AACzC,wDAAoD;AACpD,oCAA4C;AAY5C,gBAAgB;AAChB,MAAa,aAAc,SAAQ,0BAA0B;IAK3D,YACE,EAAmB,EACnB,IAAU,EACV,UAAkC,EAClC,OAAqB;QAErB,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAEnB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,gCAAgC;QAChC,MAAM,CAAC,cAAc,CAAC,IAAI,EAAE,gBAAgB,EAAE;YAC5C,KAAK,EAAE,gCAAc,CAAC,OAAO;YAC7B,YAAY,EAAE,KAAK;YACnB,QAAQ,EAAE,KAAK;SAChB,CAAC,CAAC;IACL,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,IAAI,SAAS,GAAG,IAAI,CAAC,IAAI,CAAC;QAC1B,IAAI,eAAe,GAAe,EAAE,CAAC;QAErC,wCAAwC;QACxC,IAAI,CAAC,CAAC,SAAS,IAAK,SAA8C,CAAC,SAAS,KAAK,MAAM,CAAC,EAAE;YACxF,SAAS,GAAG,IAAI,WAAI,CAAC,SAAkB,CAAC,CAAC;SAC1C;QAED,oCAAoC;QACpC,IAAI,IAAI,CAAC,UAAU,IAAI,IAAI,IAAI,OAAO,IAAI,CAAC,UAAU,KAAK,UAAU,EAAE;YACpE,eAAe,GAAG,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC;SACxF;QAED,4BAA4B;QAC5B,MAAM,GAAG,GAAa,EAAE,KAAK,EAAE,SAAS,EAAE,IAAI,EAAE,eAAe,EAAE,CAAC;QAElE,6CAA6C;QAC7C,IAAI,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE;YACvB,GAAG,CAAC,MAAM,GAAG,IAAI,CAAC,OAAO,CAAC,MAAM,CAAC;SAClC;QAED,sBAAsB;QACtB,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACzD,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,IAAI,MAAM,IAAI,MAAM,CAAC,EAAE,KAAK,CAAC,EAAE;gBAC7B,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,MAAM,CAAC,CAAC;aAC3C;YAED,IAAI,MAAM,EAAE;gBACV,QAAQ,CAAC,IAAI,wBAAgB,CAAC,EAAE,OAAO,EAAE,gBAAgB,MAAM,CAAC,MAAM,EAAE,EAAE,CAAC,CAAC,CAAC;gBAC7E,OAAO;aACR;YAED,QAAQ,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;QACxB,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA7DD,sCA6DC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/execute_operation.js b/node_modules/mongodb/lib/operations/execute_operation.js new file mode 100644 index 000000000..b6073e478 --- /dev/null +++ b/node_modules/mongodb/lib/operations/execute_operation.js @@ -0,0 +1,162 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.executeOperation = void 0; +const read_preference_1 = require("../read_preference"); +const error_1 = require("../error"); +const operation_1 = require("./operation"); +const utils_1 = require("../utils"); +const utils_2 = require("../utils"); +const MMAPv1_RETRY_WRITES_ERROR_CODE = error_1.MONGODB_ERROR_CODES.IllegalOperation; +const MMAPv1_RETRY_WRITES_ERROR_MESSAGE = 'This MongoDB deployment does not support retryable writes. 
Please add retryWrites=false to your connection string.'; +function executeOperation(topology, operation, callback) { + if (!(operation instanceof operation_1.AbstractOperation)) { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('This method requires a valid operation instance'); + } + return utils_1.maybePromise(callback, cb => { + if (topology.shouldCheckForSessionSupport()) { + return topology.selectServer(read_preference_1.ReadPreference.primaryPreferred, err => { + if (err) + return cb(err); + executeOperation(topology, operation, cb); + }); + } + // The driver sessions spec mandates that we implicitly create sessions for operations + // that are not explicitly provided with a session. + let session = operation.session; + let owner; + if (topology.hasSessionSupport()) { + if (session == null) { + owner = Symbol(); + session = topology.startSession({ owner, explicit: false }); + } + else if (session.hasEnded) { + return cb(new error_1.MongoExpiredSessionError('Use of expired sessions is not permitted')); + } + else if (session.snapshotEnabled && !topology.capabilities.supportsSnapshotReads) { + return cb(new error_1.MongoCompatibilityError('Snapshot reads require MongoDB 5.0 or later')); + } + } + else if (session) { + // If the user passed an explicit session and we are still, after server selection, + // trying to run against a topology that doesn't support sessions we error out. + return cb(new error_1.MongoCompatibilityError('Current topology does not support sessions')); + } + try { + executeWithServerSelection(topology, session, operation, (err, result) => { + if (session && session.owner && session.owner === owner) { + return session.endSession(err2 => cb(err2 || err, result)); + } + cb(err, result); + }); + } + catch (e) { + if (session && session.owner && session.owner === owner) { + session.endSession(); + } + throw e; + } + }); +} +exports.executeOperation = executeOperation; +function supportsRetryableReads(server) { + return utils_1.maxWireVersion(server) >= 6; +} +function executeWithServerSelection(topology, session, operation, callback) { + const readPreference = operation.readPreference || read_preference_1.ReadPreference.primary; + const inTransaction = session && session.inTransaction(); + if (inTransaction && !readPreference.equals(read_preference_1.ReadPreference.primary)) { + callback(new error_1.MongoTransactionError(`Read preference in a transaction must be primary, not: ${readPreference.mode}`)); + return; + } + if (session && + session.isPinned && + session.transaction.isCommitted && + !operation.bypassPinningCheck) { + session.unpin(); + } + const serverSelectionOptions = { session }; + function callbackWithRetry(err, result) { + if (err == null) { + return callback(undefined, result); + } + const hasReadAspect = operation.hasAspect(operation_1.Aspect.READ_OPERATION); + const hasWriteAspect = operation.hasAspect(operation_1.Aspect.WRITE_OPERATION); + const itShouldRetryWrite = shouldRetryWrite(err); + if ((hasReadAspect && !error_1.isRetryableError(err)) || (hasWriteAspect && !itShouldRetryWrite)) { + return callback(err); + } + if (hasWriteAspect && + itShouldRetryWrite && + err.code === MMAPv1_RETRY_WRITES_ERROR_CODE && + err.errmsg.match(/Transaction numbers/)) { + callback(new error_1.MongoServerError({ + message: MMAPv1_RETRY_WRITES_ERROR_MESSAGE, + errmsg: MMAPv1_RETRY_WRITES_ERROR_MESSAGE, + originalError: err + })); + return; + } + // select a new server, and attempt to retry the operation + topology.selectServer(readPreference, 
serverSelectionOptions, (e, server) => { + if (e || + (operation.hasAspect(operation_1.Aspect.READ_OPERATION) && !supportsRetryableReads(server)) || + (operation.hasAspect(operation_1.Aspect.WRITE_OPERATION) && !utils_2.supportsRetryableWrites(server))) { + callback(e); + return; + } + // If we have a cursor and the initial command fails with a network error, + // we can retry it on another connection. So we need to check it back in, clear the + // pool for the service id, and retry again. + if (err && + err instanceof error_1.MongoNetworkError && + server.loadBalanced && + session && + session.isPinned && + !session.inTransaction() && + operation.hasAspect(operation_1.Aspect.CURSOR_CREATING)) { + session.unpin({ force: true, forceClear: true }); + } + operation.execute(server, session, callback); + }); + } + if (readPreference && + !readPreference.equals(read_preference_1.ReadPreference.primary) && + session && + session.inTransaction()) { + callback(new error_1.MongoTransactionError(`Read preference in a transaction must be primary, not: ${readPreference.mode}`)); + return; + } + // select a server, and execute the operation against it + topology.selectServer(readPreference, serverSelectionOptions, (err, server) => { + if (err) { + callback(err); + return; + } + if (session && operation.hasAspect(operation_1.Aspect.RETRYABLE)) { + const willRetryRead = topology.s.options.retryReads !== false && + !inTransaction && + supportsRetryableReads(server) && + operation.canRetryRead; + const willRetryWrite = topology.s.options.retryWrites === true && + !inTransaction && + utils_2.supportsRetryableWrites(server) && + operation.canRetryWrite; + const hasReadAspect = operation.hasAspect(operation_1.Aspect.READ_OPERATION); + const hasWriteAspect = operation.hasAspect(operation_1.Aspect.WRITE_OPERATION); + if ((hasReadAspect && willRetryRead) || (hasWriteAspect && willRetryWrite)) { + if (hasWriteAspect && willRetryWrite) { + operation.options.willRetryWrite = true; + session.incrementTransactionNumber(); + } + operation.execute(server, session, callbackWithRetry); + return; + } + } + operation.execute(server, session, callback); + }); +} +function shouldRetryWrite(err) { + return err instanceof error_1.MongoError && err.hasErrorLabel('RetryableWriteError'); +} +//# sourceMappingURL=execute_operation.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/execute_operation.js.map b/node_modules/mongodb/lib/operations/execute_operation.js.map new file mode 100644 index 000000000..8ff11bbb9 --- /dev/null +++ b/node_modules/mongodb/lib/operations/execute_operation.js.map @@ -0,0 +1 @@ 
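The retry branch in executeWithServerSelection above keys off the topology's retryReads/retryWrites settings and the operation's RETRYABLE aspect. A small sketch of how those settings are supplied from application code follows; the URI is a placeholder and both options default to true in this driver version.

const { MongoClient } = require('mongodb');

// Shown explicitly only to make the link to the willRetryWrite / willRetryRead
// checks above visible; omitting them leaves the same defaults in place.
const client = new MongoClient('mongodb://localhost:27017', {
  retryWrites: true, // enables the willRetryWrite branch for write operations
  retryReads: true   // enables the willRetryRead branch for read operations
});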
+{"version":3,"file":"execute_operation.js","sourceRoot":"","sources":["../../src/operations/execute_operation.ts"],"names":[],"mappings":";;;AAAA,wDAAoD;AACpD,oCAUkB;AAClB,2CAAwD;AACxD,oCAAkE;AAKlE,oCAAmD;AAEnD,MAAM,8BAA8B,GAAG,2BAAmB,CAAC,gBAAgB,CAAC;AAC5E,MAAM,iCAAiC,GACrC,oHAAoH,CAAC;AA2CvH,SAAgB,gBAAgB,CAG9B,QAAkB,EAAE,SAAY,EAAE,QAA4B;IAC9D,IAAI,CAAC,CAAC,SAAS,YAAY,6BAAiB,CAAC,EAAE;QAC7C,kBAAkB;QAClB,MAAM,IAAI,yBAAiB,CAAC,iDAAiD,CAAC,CAAC;KAChF;IAED,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE;QACjC,IAAI,QAAQ,CAAC,4BAA4B,EAAE,EAAE;YAC3C,OAAO,QAAQ,CAAC,YAAY,CAAC,gCAAc,CAAC,gBAAgB,EAAE,GAAG,CAAC,EAAE;gBAClE,IAAI,GAAG;oBAAE,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;gBAExB,gBAAgB,CAAa,QAAQ,EAAE,SAAS,EAAE,EAAE,CAAC,CAAC;YACxD,CAAC,CAAC,CAAC;SACJ;QAED,sFAAsF;QACtF,mDAAmD;QACnD,IAAI,OAAO,GAA8B,SAAS,CAAC,OAAO,CAAC;QAC3D,IAAI,KAAyB,CAAC;QAC9B,IAAI,QAAQ,CAAC,iBAAiB,EAAE,EAAE;YAChC,IAAI,OAAO,IAAI,IAAI,EAAE;gBACnB,KAAK,GAAG,MAAM,EAAE,CAAC;gBACjB,OAAO,GAAG,QAAQ,CAAC,YAAY,CAAC,EAAE,KAAK,EAAE,QAAQ,EAAE,KAAK,EAAE,CAAC,CAAC;aAC7D;iBAAM,IAAI,OAAO,CAAC,QAAQ,EAAE;gBAC3B,OAAO,EAAE,CAAC,IAAI,gCAAwB,CAAC,0CAA0C,CAAC,CAAC,CAAC;aACrF;iBAAM,IAAI,OAAO,CAAC,eAAe,IAAI,CAAC,QAAQ,CAAC,YAAY,CAAC,qBAAqB,EAAE;gBAClF,OAAO,EAAE,CAAC,IAAI,+BAAuB,CAAC,6CAA6C,CAAC,CAAC,CAAC;aACvF;SACF;aAAM,IAAI,OAAO,EAAE;YAClB,mFAAmF;YACnF,+EAA+E;YAC/E,OAAO,EAAE,CAAC,IAAI,+BAAuB,CAAC,4CAA4C,CAAC,CAAC,CAAC;SACtF;QAED,IAAI;YACF,0BAA0B,CAAC,QAAQ,EAAE,OAAO,EAAE,SAAS,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;gBACvE,IAAI,OAAO,IAAI,OAAO,CAAC,KAAK,IAAI,OAAO,CAAC,KAAK,KAAK,KAAK,EAAE;oBACvD,OAAO,OAAO,CAAC,UAAU,CAAC,IAAI,CAAC,EAAE,CAAC,EAAE,CAAC,IAAI,IAAI,GAAG,EAAE,MAAM,CAAC,CAAC,CAAC;iBAC5D;gBAED,EAAE,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;YAClB,CAAC,CAAC,CAAC;SACJ;QAAC,OAAO,CAAC,EAAE;YACV,IAAI,OAAO,IAAI,OAAO,CAAC,KAAK,IAAI,OAAO,CAAC,KAAK,KAAK,KAAK,EAAE;gBACvD,OAAO,CAAC,UAAU,EAAE,CAAC;aACtB;YAED,MAAM,CAAC,CAAC;SACT;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AArDD,4CAqDC;AAED,SAAS,sBAAsB,CAAC,MAAc;IAC5C,OAAO,sBAAc,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;AACrC,CAAC;AAED,SAAS,0BAA0B,CACjC,QAAkB,EAClB,OAAsB,EACtB,SAA4B,EAC5B,QAAkB;IAElB,MAAM,cAAc,GAAG,SAAS,CAAC,cAAc,IAAI,gCAAc,CAAC,OAAO,CAAC;IAC1E,MAAM,aAAa,GAAG,OAAO,IAAI,OAAO,CAAC,aAAa,EAAE,CAAC;IAEzD,IAAI,aAAa,IAAI,CAAC,cAAc,CAAC,MAAM,CAAC,gCAAc,CAAC,OAAO,CAAC,EAAE;QACnE,QAAQ,CACN,IAAI,6BAAqB,CACvB,0DAA0D,cAAc,CAAC,IAAI,EAAE,CAChF,CACF,CAAC;QAEF,OAAO;KACR;IAED,IACE,OAAO;QACP,OAAO,CAAC,QAAQ;QAChB,OAAO,CAAC,WAAW,CAAC,WAAW;QAC/B,CAAC,SAAS,CAAC,kBAAkB,EAC7B;QACA,OAAO,CAAC,KAAK,EAAE,CAAC;KACjB;IAED,MAAM,sBAAsB,GAAG,EAAE,OAAO,EAAE,CAAC;IAC3C,SAAS,iBAAiB,CAAC,GAAS,EAAE,MAAY;QAChD,IAAI,GAAG,IAAI,IAAI,EAAE;YACf,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;SACpC;QAED,MAAM,aAAa,GAAG,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,cAAc,CAAC,CAAC;QACjE,MAAM,cAAc,GAAG,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC;QACnE,MAAM,kBAAkB,GAAG,gBAAgB,CAAC,GAAG,CAAC,CAAC;QAEjD,IAAI,CAAC,aAAa,IAAI,CAAC,wBAAgB,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,cAAc,IAAI,CAAC,kBAAkB,CAAC,EAAE;YACxF,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,IACE,cAAc;YACd,kBAAkB;YAClB,GAAG,CAAC,IAAI,KAAK,8BAA8B;YAC3C,GAAG,CAAC,MAAM,CAAC,KAAK,CAAC,qBAAqB,CAAC,EACvC;YACA,QAAQ,CACN,IAAI,wBAAgB,CAAC;gBACnB,OAAO,EAAE,iCAAiC;gBAC1C,MAAM,EAAE,iCAAiC;gBACzC,aAAa,EAAE,GAAG;aACnB,CAAC,CACH,CAAC;YAEF,OAAO;SACR;QAED,0DAA0D;QAC1D,QAAQ,CAAC,YAAY,CAAC,cAAc,EAAE,sBAAsB,EAAE,CAAC,CAAO,EAAE,MAAY,EAAE,EAAE;YACtF,IACE,CAAC;gBACD,CAAC,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,cAAc,CAAC,IAAI,CAAC,sBAAsB,CAAC,MAAM,CAAC,CAAC;gBAC/E,CAAC,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,eAAe,CAAC,IAAI,CAAC,+BAAuB,CAAC,MAAM,CAAC,CAAC,EACjF;gBACA,QAAQ,CAAC,CAAC,CAAC,CAAC;gBACZ,OAAO;aACR;YAED,0EAA0E
;YAC1E,mFAAmF;YACnF,4CAA4C;YAC5C,IACE,GAAG;gBACH,GAAG,YAAY,yBAAiB;gBAChC,MAAM,CAAC,YAAY;gBACnB,OAAO;gBACP,OAAO,CAAC,QAAQ;gBAChB,CAAC,OAAO,CAAC,aAAa,EAAE;gBACxB,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,eAAe,CAAC,EAC3C;gBACA,OAAO,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,UAAU,EAAE,IAAI,EAAE,CAAC,CAAC;aAClD;YAED,SAAS,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;QAC/C,CAAC,CAAC,CAAC;IACL,CAAC;IAED,IACE,cAAc;QACd,CAAC,cAAc,CAAC,MAAM,CAAC,gCAAc,CAAC,OAAO,CAAC;QAC9C,OAAO;QACP,OAAO,CAAC,aAAa,EAAE,EACvB;QACA,QAAQ,CACN,IAAI,6BAAqB,CACvB,0DAA0D,cAAc,CAAC,IAAI,EAAE,CAChF,CACF,CAAC;QAEF,OAAO;KACR;IAED,wDAAwD;IACxD,QAAQ,CAAC,YAAY,CAAC,cAAc,EAAE,sBAAsB,EAAE,CAAC,GAAS,EAAE,MAAY,EAAE,EAAE;QACxF,IAAI,GAAG,EAAE;YACP,QAAQ,CAAC,GAAG,CAAC,CAAC;YACd,OAAO;SACR;QAED,IAAI,OAAO,IAAI,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,SAAS,CAAC,EAAE;YACpD,MAAM,aAAa,GACjB,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,KAAK,KAAK;gBACvC,CAAC,aAAa;gBACd,sBAAsB,CAAC,MAAM,CAAC;gBAC9B,SAAS,CAAC,YAAY,CAAC;YAEzB,MAAM,cAAc,GAClB,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,WAAW,KAAK,IAAI;gBACvC,CAAC,aAAa;gBACd,+BAAuB,CAAC,MAAM,CAAC;gBAC/B,SAAS,CAAC,aAAa,CAAC;YAE1B,MAAM,aAAa,GAAG,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,cAAc,CAAC,CAAC;YACjE,MAAM,cAAc,GAAG,SAAS,CAAC,SAAS,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC;YAEnE,IAAI,CAAC,aAAa,IAAI,aAAa,CAAC,IAAI,CAAC,cAAc,IAAI,cAAc,CAAC,EAAE;gBAC1E,IAAI,cAAc,IAAI,cAAc,EAAE;oBACpC,SAAS,CAAC,OAAO,CAAC,cAAc,GAAG,IAAI,CAAC;oBACxC,OAAO,CAAC,0BAA0B,EAAE,CAAC;iBACtC;gBAED,SAAS,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,iBAAiB,CAAC,CAAC;gBACtD,OAAO;aACR;SACF;QAED,SAAS,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC/C,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,gBAAgB,CAAC,GAAQ;IAChC,OAAO,GAAG,YAAY,kBAAU,IAAI,GAAG,CAAC,aAAa,CAAC,qBAAqB,CAAC,CAAC;AAC/E,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/find.js b/node_modules/mongodb/lib/operations/find.js new file mode 100644 index 000000000..bbaebc5e3 --- /dev/null +++ b/node_modules/mongodb/lib/operations/find.js @@ -0,0 +1,211 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.FindOperation = void 0; +const operation_1 = require("./operation"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +const command_1 = require("./command"); +const sort_1 = require("../sort"); +const shared_1 = require("../cmap/wire_protocol/shared"); +const read_concern_1 = require("../read_concern"); +const SUPPORTS_WRITE_CONCERN_AND_COLLATION = 5; +/** @internal */ +class FindOperation extends command_1.CommandOperation { + constructor(collection, ns, filter = {}, options = {}) { + super(collection, options); + this.options = options; + this.ns = ns; + if (typeof filter !== 'object' || Array.isArray(filter)) { + throw new error_1.MongoInvalidArgumentError('Query filter must be a plain object or ObjectId'); + } + // If the filter is a buffer, validate that is a valid BSON document + if (Buffer.isBuffer(filter)) { + const objectSize = filter[0] | (filter[1] << 8) | (filter[2] << 16) | (filter[3] << 24); + if (objectSize !== filter.length) { + throw new error_1.MongoInvalidArgumentError(`Query filter raw message size does not match message header size [${filter.length}] != [${objectSize}]`); + } + } + // special case passing in an ObjectId as a filter + this.filter = filter != null && filter._bsontype === 'ObjectID' ? 
{ _id: filter } : filter; + } + execute(server, session, callback) { + this.server = server; + const serverWireVersion = utils_1.maxWireVersion(server); + const options = this.options; + if (options.allowDiskUse != null && serverWireVersion < 4) { + callback(new error_1.MongoCompatibilityError('Option "allowDiskUse" is not supported on MongoDB < 3.2')); + return; + } + if (options.collation && serverWireVersion < SUPPORTS_WRITE_CONCERN_AND_COLLATION) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name}, which reports wire version ${serverWireVersion}, does not support collation`)); + return; + } + if (serverWireVersion < 4) { + if (this.readConcern && this.readConcern.level !== 'local') { + callback(new error_1.MongoCompatibilityError(`Server find command does not support a readConcern level of ${this.readConcern.level}`)); + return; + } + const findCommand = makeLegacyFindCommand(this.ns, this.filter, options); + if (shared_1.isSharded(server) && this.readPreference) { + findCommand.$readPreference = this.readPreference.toJSON(); + } + server.query(this.ns, findCommand, { + ...this.options, + ...this.bsonOptions, + documentsReturnedIn: 'firstBatch', + readPreference: this.readPreference + }, callback); + return; + } + let findCommand = makeFindCommand(this.ns, this.filter, options); + if (this.explain) { + findCommand = utils_1.decorateWithExplain(findCommand, this.explain); + } + server.command(this.ns, findCommand, { + ...this.options, + ...this.bsonOptions, + documentsReturnedIn: 'firstBatch', + session + }, callback); + } +} +exports.FindOperation = FindOperation; +function makeFindCommand(ns, filter, options) { + const findCommand = { + find: ns.collection, + filter + }; + if (options.sort) { + findCommand.sort = sort_1.formatSort(options.sort); + } + if (options.projection) { + let projection = options.projection; + if (projection && Array.isArray(projection)) { + projection = projection.length + ? 
projection.reduce((result, field) => { + result[field] = 1; + return result; + }, {}) + : { _id: 1 }; + } + findCommand.projection = projection; + } + if (options.hint) { + findCommand.hint = utils_1.normalizeHintField(options.hint); + } + if (typeof options.skip === 'number') { + findCommand.skip = options.skip; + } + if (typeof options.limit === 'number') { + if (options.limit < 0) { + findCommand.limit = -options.limit; + findCommand.singleBatch = true; + } + else { + findCommand.limit = options.limit; + } + } + if (typeof options.batchSize === 'number') { + if (options.batchSize < 0) { + if (options.limit && + options.limit !== 0 && + Math.abs(options.batchSize) < Math.abs(options.limit)) { + findCommand.limit = -options.batchSize; + } + findCommand.singleBatch = true; + } + else { + findCommand.batchSize = options.batchSize; + } + } + if (typeof options.singleBatch === 'boolean') { + findCommand.singleBatch = options.singleBatch; + } + if (options.comment) { + findCommand.comment = options.comment; + } + if (typeof options.maxTimeMS === 'number') { + findCommand.maxTimeMS = options.maxTimeMS; + } + const readConcern = read_concern_1.ReadConcern.fromOptions(options); + if (readConcern) { + findCommand.readConcern = readConcern.toJSON(); + } + if (options.max) { + findCommand.max = options.max; + } + if (options.min) { + findCommand.min = options.min; + } + if (typeof options.returnKey === 'boolean') { + findCommand.returnKey = options.returnKey; + } + if (typeof options.showRecordId === 'boolean') { + findCommand.showRecordId = options.showRecordId; + } + if (typeof options.tailable === 'boolean') { + findCommand.tailable = options.tailable; + } + if (typeof options.timeout === 'boolean') { + findCommand.noCursorTimeout = !options.timeout; + } + else if (typeof options.noCursorTimeout === 'boolean') { + findCommand.noCursorTimeout = options.noCursorTimeout; + } + if (typeof options.awaitData === 'boolean') { + findCommand.awaitData = options.awaitData; + } + if (typeof options.allowPartialResults === 'boolean') { + findCommand.allowPartialResults = options.allowPartialResults; + } + if (options.collation) { + findCommand.collation = options.collation; + } + if (typeof options.allowDiskUse === 'boolean') { + findCommand.allowDiskUse = options.allowDiskUse; + } + if (options.let) { + findCommand.let = options.let; + } + return findCommand; +} +function makeLegacyFindCommand(ns, filter, options) { + const findCommand = { + $query: filter + }; + if (options.sort) { + findCommand.$orderby = sort_1.formatSort(options.sort); + } + if (options.hint) { + findCommand.$hint = utils_1.normalizeHintField(options.hint); + } + if (typeof options.returnKey === 'boolean') { + findCommand.$returnKey = options.returnKey; + } + if (options.max) { + findCommand.$max = options.max; + } + if (options.min) { + findCommand.$min = options.min; + } + if (typeof options.showRecordId === 'boolean') { + findCommand.$showDiskLoc = options.showRecordId; + } + if (options.comment) { + findCommand.$comment = options.comment; + } + if (typeof options.maxTimeMS === 'number') { + findCommand.$maxTimeMS = options.maxTimeMS; + } + if (options.explain != null) { + findCommand.$explain = true; + } + return findCommand; +} +operation_1.defineAspects(FindOperation, [ + operation_1.Aspect.READ_OPERATION, + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.EXPLAINABLE, + operation_1.Aspect.CURSOR_CREATING +]); +//# sourceMappingURL=find.js.map \ No newline at end of file diff --git 
a/node_modules/mongodb/lib/operations/find.js.map b/node_modules/mongodb/lib/operations/find.js.map new file mode 100644 index 000000000..0de5689db --- /dev/null +++ b/node_modules/mongodb/lib/operations/find.js.map @@ -0,0 +1 @@ +{"version":3,"file":"find.js","sourceRoot":"","sources":["../../src/operations/find.ts"],"names":[],"mappings":";;;AAAA,2CAA0D;AAC1D,oCAMkB;AAClB,oCAA8E;AAI9E,uCAAwF;AACxF,kCAA2C;AAC3C,yDAAyD;AACzD,kDAA8C;AAuD9C,MAAM,oCAAoC,GAAG,CAAC,CAAC;AAE/C,gBAAgB;AAChB,MAAa,aAAc,SAAQ,0BAA0B;IAI3D,YACE,UAAkC,EAClC,EAAoB,EACpB,SAAmB,EAAE,EACrB,UAAuB,EAAE;QAEzB,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAE3B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QAEb,IAAI,OAAO,MAAM,KAAK,QAAQ,IAAI,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,EAAE;YACvD,MAAM,IAAI,iCAAyB,CAAC,iDAAiD,CAAC,CAAC;SACxF;QAED,oEAAoE;QACpE,IAAI,MAAM,CAAC,QAAQ,CAAC,MAAM,CAAC,EAAE;YAC3B,MAAM,UAAU,GAAG,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC,CAAC;YACxF,IAAI,UAAU,KAAK,MAAM,CAAC,MAAM,EAAE;gBAChC,MAAM,IAAI,iCAAyB,CACjC,qEAAqE,MAAM,CAAC,MAAM,SAAS,UAAU,GAAG,CACzG,CAAC;aACH;SACF;QAED,kDAAkD;QAClD,IAAI,CAAC,MAAM,GAAG,MAAM,IAAI,IAAI,IAAI,MAAM,CAAC,SAAS,KAAK,UAAU,CAAC,CAAC,CAAC,EAAE,GAAG,EAAE,MAAM,EAAE,CAAC,CAAC,CAAC,MAAM,CAAC;IAC7F,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QAErB,MAAM,iBAAiB,GAAG,sBAAc,CAAC,MAAM,CAAC,CAAC;QACjD,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAC7B,IAAI,OAAO,CAAC,YAAY,IAAI,IAAI,IAAI,iBAAiB,GAAG,CAAC,EAAE;YACzD,QAAQ,CACN,IAAI,+BAAuB,CAAC,yDAAyD,CAAC,CACvF,CAAC;YACF,OAAO;SACR;QAED,IAAI,OAAO,CAAC,SAAS,IAAI,iBAAiB,GAAG,oCAAoC,EAAE;YACjF,QAAQ,CACN,IAAI,+BAAuB,CACzB,UAAU,MAAM,CAAC,IAAI,gCAAgC,iBAAiB,8BAA8B,CACrG,CACF,CAAC;YAEF,OAAO;SACR;QAED,IAAI,iBAAiB,GAAG,CAAC,EAAE;YACzB,IAAI,IAAI,CAAC,WAAW,IAAI,IAAI,CAAC,WAAW,CAAC,KAAK,KAAK,OAAO,EAAE;gBAC1D,QAAQ,CACN,IAAI,+BAAuB,CACzB,+DAA+D,IAAI,CAAC,WAAW,CAAC,KAAK,EAAE,CACxF,CACF,CAAC;gBAEF,OAAO;aACR;YAED,MAAM,WAAW,GAAG,qBAAqB,CAAC,IAAI,CAAC,EAAE,EAAE,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;YACzE,IAAI,kBAAS,CAAC,MAAM,CAAC,IAAI,IAAI,CAAC,cAAc,EAAE;gBAC5C,WAAW,CAAC,eAAe,GAAG,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,CAAC;aAC5D;YAED,MAAM,CAAC,KAAK,CACV,IAAI,CAAC,EAAE,EACP,WAAW,EACX;gBACE,GAAG,IAAI,CAAC,OAAO;gBACf,GAAG,IAAI,CAAC,WAAW;gBACnB,mBAAmB,EAAE,YAAY;gBACjC,cAAc,EAAE,IAAI,CAAC,cAAc;aACpC,EACD,QAAQ,CACT,CAAC;YAEF,OAAO;SACR;QAED,IAAI,WAAW,GAAG,eAAe,CAAC,IAAI,CAAC,EAAE,EAAE,IAAI,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;QACjE,IAAI,IAAI,CAAC,OAAO,EAAE;YAChB,WAAW,GAAG,2BAAmB,CAAC,WAAW,EAAE,IAAI,CAAC,OAAO,CAAC,CAAC;SAC9D;QAED,MAAM,CAAC,OAAO,CACZ,IAAI,CAAC,EAAE,EACP,WAAW,EACX;YACE,GAAG,IAAI,CAAC,OAAO;YACf,GAAG,IAAI,CAAC,WAAW;YACnB,mBAAmB,EAAE,YAAY;YACjC,OAAO;SACR,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;CACF;AAvGD,sCAuGC;AAED,SAAS,eAAe,CAAC,EAAoB,EAAE,MAAgB,EAAE,OAAoB;IACnF,MAAM,WAAW,GAAa;QAC5B,IAAI,EAAE,EAAE,CAAC,UAAU;QACnB,MAAM;KACP,CAAC;IAEF,IAAI,OAAO,CAAC,IAAI,EAAE;QAChB,WAAW,CAAC,IAAI,GAAG,iBAAU,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;KAC7C;IAED,IAAI,OAAO,CAAC,UAAU,EAAE;QACtB,IAAI,UAAU,GAAG,OAAO,CAAC,UAAU,CAAC;QACpC,IAAI,UAAU,IAAI,KAAK,CAAC,OAAO,CAAC,UAAU,CAAC,EAAE;YAC3C,UAAU,GAAG,UAAU,CAAC,MAAM;gBAC5B,CAAC,CAAC,UAAU,CAAC,MAAM,CAAC,CAAC,MAAM,EAAE,KAAK,EAAE,EAAE;oBAClC,MAAM,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;oBAClB,OAAO,MAAM,CAAC;gBAChB,CAAC,EAAE,EAAE,CAAC;gBACR,CAAC,CAAC,EAAE,GAAG,EAAE,CAAC,EAAE,CAAC;SAChB;QAED,WAAW,CAAC,UAAU,GAAG,UAAU,CAAC;KACrC;IAED,IAAI,OAAO,CAAC,IAAI,EAAE;QAChB,WAAW,CAAC,IAAI,GAAG,0BAAkB,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;KACrD;IAED,IAAI,OAAO,OAAO,CAAC,IAAI,K
AAK,QAAQ,EAAE;QACpC,WAAW,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;KACjC;IAED,IAAI,OAAO,OAAO,CAAC,KAAK,KAAK,QAAQ,EAAE;QACrC,IAAI,OAAO,CAAC,KAAK,GAAG,CAAC,EAAE;YACrB,WAAW,CAAC,KAAK,GAAG,CAAC,OAAO,CAAC,KAAK,CAAC;YACnC,WAAW,CAAC,WAAW,GAAG,IAAI,CAAC;SAChC;aAAM;YACL,WAAW,CAAC,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC;SACnC;KACF;IAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;QACzC,IAAI,OAAO,CAAC,SAAS,GAAG,CAAC,EAAE;YACzB,IACE,OAAO,CAAC,KAAK;gBACb,OAAO,CAAC,KAAK,KAAK,CAAC;gBACnB,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC,SAAS,CAAC,GAAG,IAAI,CAAC,GAAG,CAAC,OAAO,CAAC,KAAK,CAAC,EACrD;gBACA,WAAW,CAAC,KAAK,GAAG,CAAC,OAAO,CAAC,SAAS,CAAC;aACxC;YAED,WAAW,CAAC,WAAW,GAAG,IAAI,CAAC;SAChC;aAAM;YACL,WAAW,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SAC3C;KACF;IAED,IAAI,OAAO,OAAO,CAAC,WAAW,KAAK,SAAS,EAAE;QAC5C,WAAW,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,CAAC;KAC/C;IAED,IAAI,OAAO,CAAC,OAAO,EAAE;QACnB,WAAW,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;KACvC;IAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;QACzC,WAAW,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;KAC3C;IAED,MAAM,WAAW,GAAG,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;IACrD,IAAI,WAAW,EAAE;QACf,WAAW,CAAC,WAAW,GAAG,WAAW,CAAC,MAAM,EAAE,CAAC;KAChD;IAED,IAAI,OAAO,CAAC,GAAG,EAAE;QACf,WAAW,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;KAC/B;IAED,IAAI,OAAO,CAAC,GAAG,EAAE;QACf,WAAW,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;KAC/B;IAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,EAAE;QAC1C,WAAW,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;KAC3C;IAED,IAAI,OAAO,OAAO,CAAC,YAAY,KAAK,SAAS,EAAE;QAC7C,WAAW,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;KACjD;IAED,IAAI,OAAO,OAAO,CAAC,QAAQ,KAAK,SAAS,EAAE;QACzC,WAAW,CAAC,QAAQ,GAAG,OAAO,CAAC,QAAQ,CAAC;KACzC;IAED,IAAI,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,EAAE;QACxC,WAAW,CAAC,eAAe,GAAG,CAAC,OAAO,CAAC,OAAO,CAAC;KAChD;SAAM,IAAI,OAAO,OAAO,CAAC,eAAe,KAAK,SAAS,EAAE;QACvD,WAAW,CAAC,eAAe,GAAG,OAAO,CAAC,eAAe,CAAC;KACvD;IAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,EAAE;QAC1C,WAAW,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;KAC3C;IAED,IAAI,OAAO,OAAO,CAAC,mBAAmB,KAAK,SAAS,EAAE;QACpD,WAAW,CAAC,mBAAmB,GAAG,OAAO,CAAC,mBAAmB,CAAC;KAC/D;IAED,IAAI,OAAO,CAAC,SAAS,EAAE;QACrB,WAAW,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;KAC3C;IAED,IAAI,OAAO,OAAO,CAAC,YAAY,KAAK,SAAS,EAAE;QAC7C,WAAW,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;KACjD;IAED,IAAI,OAAO,CAAC,GAAG,EAAE;QACf,WAAW,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;KAC/B;IAED,OAAO,WAAW,CAAC;AACrB,CAAC;AAED,SAAS,qBAAqB,CAC5B,EAAoB,EACpB,MAAgB,EAChB,OAAoB;IAEpB,MAAM,WAAW,GAAa;QAC5B,MAAM,EAAE,MAAM;KACf,CAAC;IAEF,IAAI,OAAO,CAAC,IAAI,EAAE;QAChB,WAAW,CAAC,QAAQ,GAAG,iBAAU,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;KACjD;IAED,IAAI,OAAO,CAAC,IAAI,EAAE;QAChB,WAAW,CAAC,KAAK,GAAG,0BAAkB,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;KACtD;IAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,EAAE;QAC1C,WAAW,CAAC,UAAU,GAAG,OAAO,CAAC,SAAS,CAAC;KAC5C;IAED,IAAI,OAAO,CAAC,GAAG,EAAE;QACf,WAAW,CAAC,IAAI,GAAG,OAAO,CAAC,GAAG,CAAC;KAChC;IAED,IAAI,OAAO,CAAC,GAAG,EAAE;QACf,WAAW,CAAC,IAAI,GAAG,OAAO,CAAC,GAAG,CAAC;KAChC;IAED,IAAI,OAAO,OAAO,CAAC,YAAY,KAAK,SAAS,EAAE;QAC7C,WAAW,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;KACjD;IAED,IAAI,OAAO,CAAC,OAAO,EAAE;QACnB,WAAW,CAAC,QAAQ,GAAG,OAAO,CAAC,OAAO,CAAC;KACxC;IAED,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;QACzC,WAAW,CAAC,UAAU,GAAG,OAAO,CAAC,SAAS,CAAC;KAC5C;IAED,IAAI,OAAO,CAAC,OAAO,IAAI,IAAI,EAAE;QAC3B,WAAW,CAAC,QAAQ,GAAG,IAAI,CAAC;KAC7B;IAED,OAAO,WAAW,CAAC;AACrB,CAAC;AAED,yBAAa,CAAC,aAAa,EAAE;IAC3B,kBAAM,CAAC,cAAc;IACrB,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,WAAW;IAClB,kBAAM,CAAC,eAAe;CACvB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/find_and_modify.js b/node_modules/mongodb/lib/operations/find_and_modify.js new file mode 100644 index 000000000..a88a61d0f --- /dev/null 
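To connect the FindOperation and makeFindCommand code above to everyday usage, here is a sketch of a find call whose options map onto the fields makeFindCommand copies into the wire-level find command. Collection and field names are illustrative only.

const { MongoClient } = require('mongodb');

async function listOpenTasks() {
  // Placeholder URI and names.
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const tasks = client.db('exampleDb').collection('tasks');

  // Each option below becomes a field on the find command built above.
  const docs = await tasks
    .find(
      { completed: false },
      { projection: { title: 1, completed: 1 }, maxTimeMS: 5000 }
    )
    .sort({ createdAt: -1 }) // findCommand.sort
    .limit(20)               // findCommand.limit
    .toArray();

  console.log(docs.length, 'open tasks');
  await client.close();
}

listOpenTasks().catch(console.error);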
+++ b/node_modules/mongodb/lib/operations/find_and_modify.js @@ -0,0 +1,151 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.FindOneAndUpdateOperation = exports.FindOneAndReplaceOperation = exports.FindOneAndDeleteOperation = exports.ReturnDocument = void 0; +const read_preference_1 = require("../read_preference"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +const command_1 = require("./command"); +const operation_1 = require("./operation"); +const sort_1 = require("../sort"); +/** @public */ +exports.ReturnDocument = Object.freeze({ + BEFORE: 'before', + AFTER: 'after' +}); +function configureFindAndModifyCmdBaseUpdateOpts(cmdBase, options) { + cmdBase.new = options.returnDocument === exports.ReturnDocument.AFTER; + cmdBase.upsert = options.upsert === true; + if (options.bypassDocumentValidation === true) { + cmdBase.bypassDocumentValidation = options.bypassDocumentValidation; + } + return cmdBase; +} +/** @internal */ +class FindAndModifyOperation extends command_1.CommandOperation { + constructor(collection, query, options) { + super(collection, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.cmdBase = { + remove: false, + new: false, + upsert: false + }; + const sort = sort_1.formatSort(options.sort); + if (sort) { + this.cmdBase.sort = sort; + } + if (options.projection) { + this.cmdBase.fields = options.projection; + } + if (options.maxTimeMS) { + this.cmdBase.maxTimeMS = options.maxTimeMS; + } + // Decorate the findAndModify command with the write Concern + if (options.writeConcern) { + this.cmdBase.writeConcern = options.writeConcern; + } + if (options.let) { + this.cmdBase.let = options.let; + } + // force primary read preference + this.readPreference = read_preference_1.ReadPreference.primary; + this.collection = collection; + this.query = query; + } + execute(server, session, callback) { + var _a; + const coll = this.collection; + const query = this.query; + const options = { ...this.options, ...this.bsonOptions }; + // Create findAndModify command object + const cmd = { + findAndModify: coll.collectionName, + query: query, + ...this.cmdBase + }; + // Have we specified collation + try { + utils_1.decorateWithCollation(cmd, coll, options); + } + catch (err) { + return callback(err); + } + if (options.hint) { + // TODO: once this method becomes a CommandOperation we will have the server + // in place to check. + const unacknowledgedWrite = ((_a = this.writeConcern) === null || _a === void 0 ? 
void 0 : _a.w) === 0; + if (unacknowledgedWrite || utils_1.maxWireVersion(server) < 8) { + callback(new error_1.MongoCompatibilityError('The current topology does not support a hint on findAndModify commands')); + return; + } + cmd.hint = options.hint; + } + if (this.explain && utils_1.maxWireVersion(server) < 4) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support explain on findAndModify`)); + return; + } + // Execute the command + super.executeCommand(server, session, cmd, (err, result) => { + if (err) + return callback(err); + return callback(undefined, result); + }); + } +} +/** @internal */ +class FindOneAndDeleteOperation extends FindAndModifyOperation { + constructor(collection, filter, options) { + // Basic validation + if (filter == null || typeof filter !== 'object') { + throw new error_1.MongoInvalidArgumentError('Argument "filter" must be an object'); + } + super(collection, filter, options); + this.cmdBase.remove = true; + } +} +exports.FindOneAndDeleteOperation = FindOneAndDeleteOperation; +/** @internal */ +class FindOneAndReplaceOperation extends FindAndModifyOperation { + constructor(collection, filter, replacement, options) { + if (filter == null || typeof filter !== 'object') { + throw new error_1.MongoInvalidArgumentError('Argument "filter" must be an object'); + } + if (replacement == null || typeof replacement !== 'object') { + throw new error_1.MongoInvalidArgumentError('Argument "replacement" must be an object'); + } + if (utils_1.hasAtomicOperators(replacement)) { + throw new error_1.MongoInvalidArgumentError('Replacement document must not contain atomic operators'); + } + super(collection, filter, options); + this.cmdBase.update = replacement; + configureFindAndModifyCmdBaseUpdateOpts(this.cmdBase, options); + } +} +exports.FindOneAndReplaceOperation = FindOneAndReplaceOperation; +/** @internal */ +class FindOneAndUpdateOperation extends FindAndModifyOperation { + constructor(collection, filter, update, options) { + if (filter == null || typeof filter !== 'object') { + throw new error_1.MongoInvalidArgumentError('Argument "filter" must be an object'); + } + if (update == null || typeof update !== 'object') { + throw new error_1.MongoInvalidArgumentError('Argument "update" must be an object'); + } + if (!utils_1.hasAtomicOperators(update)) { + throw new error_1.MongoInvalidArgumentError('Update document requires atomic operators'); + } + super(collection, filter, options); + this.cmdBase.update = update; + configureFindAndModifyCmdBaseUpdateOpts(this.cmdBase, options); + if (options.arrayFilters) { + this.cmdBase.arrayFilters = options.arrayFilters; + } + } +} +exports.FindOneAndUpdateOperation = FindOneAndUpdateOperation; +operation_1.defineAspects(FindAndModifyOperation, [ + operation_1.Aspect.WRITE_OPERATION, + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.EXPLAINABLE +]); +//# sourceMappingURL=find_and_modify.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/find_and_modify.js.map b/node_modules/mongodb/lib/operations/find_and_modify.js.map new file mode 100644 index 000000000..5a3497831 --- /dev/null +++ b/node_modules/mongodb/lib/operations/find_and_modify.js.map @@ -0,0 +1 @@ 
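The ReturnDocument constant and the validation in FindOneAndUpdateOperation above surface in the public API as findOneAndUpdate options. A minimal sketch, with placeholder names, of how the new/upsert flags set by configureFindAndModifyCmdBaseUpdateOpts are driven from application code:

const { MongoClient } = require('mongodb');

async function markTaskComplete(taskId) {
  // Placeholder URI and names; taskId is assumed to be an ObjectId.
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const tasks = client.db('exampleDb').collection('tasks');

  // The update must use atomic operators ($set here), matching the
  // hasAtomicOperators check in FindOneAndUpdateOperation above.
  const result = await tasks.findOneAndUpdate(
    { _id: taskId },
    { $set: { completed: true } },
    {
      returnDocument: 'after', // ReturnDocument.AFTER, i.e. cmdBase.new = true
      upsert: false            // cmdBase.upsert = false
    }
  );

  await client.close();
  return result.value; // the updated document, or null if nothing matched
}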
+{"version":3,"file":"find_and_modify.js","sourceRoot":"","sources":["../../src/operations/find_and_modify.ts"],"names":[],"mappings":";;;AAAA,wDAAoD;AACpD,oCAA+F;AAC/F,oCAA8E;AAC9E,uCAAsE;AACtE,2CAAoD;AAIpD,kCAAuD;AAIvD,cAAc;AACD,QAAA,cAAc,GAAG,MAAM,CAAC,MAAM,CAAC;IAC1C,MAAM,EAAE,QAAQ;IAChB,KAAK,EAAE,OAAO;CACN,CAAC,CAAC;AAsEZ,SAAS,uCAAuC,CAC9C,OAA6B,EAC7B,OAA2D;IAE3D,OAAO,CAAC,GAAG,GAAG,OAAO,CAAC,cAAc,KAAK,sBAAc,CAAC,KAAK,CAAC;IAC9D,OAAO,CAAC,MAAM,GAAG,OAAO,CAAC,MAAM,KAAK,IAAI,CAAC;IAEzC,IAAI,OAAO,CAAC,wBAAwB,KAAK,IAAI,EAAE;QAC7C,OAAO,CAAC,wBAAwB,GAAG,OAAO,CAAC,wBAAwB,CAAC;KACrE;IACD,OAAO,OAAO,CAAC;AACjB,CAAC;AAED,gBAAgB;AAChB,MAAM,sBAAuB,SAAQ,0BAA0B;IAO7D,YACE,UAAsB,EACtB,KAAe,EACf,OAAqF;QAErF,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAC3B,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG;YACb,MAAM,EAAE,KAAK;YACb,GAAG,EAAE,KAAK;YACV,MAAM,EAAE,KAAK;SACd,CAAC;QAEF,MAAM,IAAI,GAAG,iBAAU,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;QACtC,IAAI,IAAI,EAAE;YACR,IAAI,CAAC,OAAO,CAAC,IAAI,GAAG,IAAI,CAAC;SAC1B;QAED,IAAI,OAAO,CAAC,UAAU,EAAE;YACtB,IAAI,CAAC,OAAO,CAAC,MAAM,GAAG,OAAO,CAAC,UAAU,CAAC;SAC1C;QAED,IAAI,OAAO,CAAC,SAAS,EAAE;YACrB,IAAI,CAAC,OAAO,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SAC5C;QAED,4DAA4D;QAC5D,IAAI,OAAO,CAAC,YAAY,EAAE;YACxB,IAAI,CAAC,OAAO,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;SAClD;QAED,IAAI,OAAO,CAAC,GAAG,EAAE;YACf,IAAI,CAAC,OAAO,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;SAChC;QAED,gCAAgC;QAChC,IAAI,CAAC,cAAc,GAAG,gCAAc,CAAC,OAAO,CAAC;QAE7C,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;IACrB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;;QAC1E,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC;QACzB,MAAM,OAAO,GAAG,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,GAAG,IAAI,CAAC,WAAW,EAAE,CAAC;QAEzD,sCAAsC;QACtC,MAAM,GAAG,GAAa;YACpB,aAAa,EAAE,IAAI,CAAC,cAAc;YAClC,KAAK,EAAE,KAAK;YACZ,GAAG,IAAI,CAAC,OAAO;SAChB,CAAC;QAEF,8BAA8B;QAC9B,IAAI;YACF,6BAAqB,CAAC,GAAG,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;SAC3C;QAAC,OAAO,GAAG,EAAE;YACZ,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,IAAI,OAAO,CAAC,IAAI,EAAE;YAChB,4EAA4E;YAC5E,qBAAqB;YACrB,MAAM,mBAAmB,GAAG,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,CAAC;YACvD,IAAI,mBAAmB,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;gBACrD,QAAQ,CACN,IAAI,+BAAuB,CACzB,wEAAwE,CACzE,CACF,CAAC;gBAEF,OAAO;aACR;YAED,GAAG,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;SACzB;QAED,IAAI,IAAI,CAAC,OAAO,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YAC9C,QAAQ,CACN,IAAI,+BAAuB,CACzB,UAAU,MAAM,CAAC,IAAI,4CAA4C,CAClE,CACF,CAAC;YACF,OAAO;SACR;QAED,sBAAsB;QACtB,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACzD,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;QACrC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAED,gBAAgB;AAChB,MAAa,yBAA0B,SAAQ,sBAAsB;IACnE,YAAY,UAAsB,EAAE,MAAgB,EAAE,OAAgC;QACpF,mBAAmB;QACnB,IAAI,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,KAAK,QAAQ,EAAE;YAChD,MAAM,IAAI,iCAAyB,CAAC,qCAAqC,CAAC,CAAC;SAC5E;QAED,KAAK,CAAC,UAAU,EAAE,MAAM,EAAE,OAAO,CAAC,CAAC;QACnC,IAAI,CAAC,OAAO,CAAC,MAAM,GAAG,IAAI,CAAC;IAC7B,CAAC;CACF;AAVD,8DAUC;AAED,gBAAgB;AAChB,MAAa,0BAA2B,SAAQ,sBAAsB;IACpE,YACE,UAAsB,EACtB,MAAgB,EAChB,WAAqB,EACrB,OAAiC;QAEjC,IAAI,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,KAAK,QAAQ,EAAE;YAChD,MAAM,IAAI,iCAAyB,CAAC,qCAAqC,CAAC,CAAC;SAC5E;QAED,IAAI,WAAW,IAAI,IAAI,IAAI,OAAO,WAAW,KAAK,QAAQ,EAAE;YAC1D,MAAM,IAAI,iCAAyB,CAAC,0CAA0C,CAAC,CAAC;SACjF;QAED,IAAI,0BAAkB,CAAC,WAAW,CAAC,EAAE;YACnC,MAAM,IAAI,iCAAyB,CAAC,wDAAwD,CAAC,CAAC;SAC/F;QAED,KAAK,CAAC,UAAU,EAAE,MAAM,EAAE,OAAO,CAAC,CAAC;QACnC,IAAI,CAAC,OAAO,CAAC,MAAM,GAAG,WAAW,CAAC;QAClC,uCAAuC,
CAAC,IAAI,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;IACjE,CAAC;CACF;AAvBD,gEAuBC;AAED,gBAAgB;AAChB,MAAa,yBAA0B,SAAQ,sBAAsB;IACnE,YACE,UAAsB,EACtB,MAAgB,EAChB,MAAgB,EAChB,OAAgC;QAEhC,IAAI,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,KAAK,QAAQ,EAAE;YAChD,MAAM,IAAI,iCAAyB,CAAC,qCAAqC,CAAC,CAAC;SAC5E;QAED,IAAI,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,KAAK,QAAQ,EAAE;YAChD,MAAM,IAAI,iCAAyB,CAAC,qCAAqC,CAAC,CAAC;SAC5E;QAED,IAAI,CAAC,0BAAkB,CAAC,MAAM,CAAC,EAAE;YAC/B,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;SAClF;QAED,KAAK,CAAC,UAAU,EAAE,MAAM,EAAE,OAAO,CAAC,CAAC;QACnC,IAAI,CAAC,OAAO,CAAC,MAAM,GAAG,MAAM,CAAC;QAC7B,uCAAuC,CAAC,IAAI,CAAC,OAAO,EAAE,OAAO,CAAC,CAAC;QAE/D,IAAI,OAAO,CAAC,YAAY,EAAE;YACxB,IAAI,CAAC,OAAO,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;SAClD;IACH,CAAC;CACF;AA3BD,8DA2BC;AAED,yBAAa,CAAC,sBAAsB,EAAE;IACpC,kBAAM,CAAC,eAAe;IACtB,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,WAAW;CACnB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/indexes.js b/node_modules/mongodb/lib/operations/indexes.js new file mode 100644 index 000000000..b3560cb4c --- /dev/null +++ b/node_modules/mongodb/lib/operations/indexes.js @@ -0,0 +1,297 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.IndexInformationOperation = exports.IndexExistsOperation = exports.ListIndexesCursor = exports.ListIndexesOperation = exports.DropIndexesOperation = exports.DropIndexOperation = exports.EnsureIndexOperation = exports.CreateIndexOperation = exports.CreateIndexesOperation = exports.IndexesOperation = void 0; +const common_functions_1 = require("./common_functions"); +const operation_1 = require("./operation"); +const error_1 = require("../error"); +const utils_1 = require("../utils"); +const command_1 = require("./command"); +const read_preference_1 = require("../read_preference"); +const abstract_cursor_1 = require("../cursor/abstract_cursor"); +const execute_operation_1 = require("./execute_operation"); +const LIST_INDEXES_WIRE_VERSION = 3; +const VALID_INDEX_OPTIONS = new Set([ + 'background', + 'unique', + 'name', + 'partialFilterExpression', + 'sparse', + 'hidden', + 'expireAfterSeconds', + 'storageEngine', + 'collation', + 'version', + // text indexes + 'weights', + 'default_language', + 'language_override', + 'textIndexVersion', + // 2d-sphere indexes + '2dsphereIndexVersion', + // 2d indexes + 'bits', + 'min', + 'max', + // geoHaystack Indexes + 'bucketSize', + // wildcard indexes + 'wildcardProjection' +]); +function makeIndexSpec(indexSpec, options) { + const indexParameters = utils_1.parseIndexOptions(indexSpec); + // Generate the index name + const name = typeof options.name === 'string' ? 
options.name : indexParameters.name; + // Set up the index + const finalIndexSpec = { name, key: indexParameters.fieldHash }; + // merge valid index options into the index spec + for (const optionName in options) { + if (VALID_INDEX_OPTIONS.has(optionName)) { + finalIndexSpec[optionName] = options[optionName]; + } + } + return finalIndexSpec; +} +/** @internal */ +class IndexesOperation extends operation_1.AbstractOperation { + constructor(collection, options) { + super(options); + this.options = options; + this.collection = collection; + } + execute(server, session, callback) { + const coll = this.collection; + const options = this.options; + common_functions_1.indexInformation(coll.s.db, coll.collectionName, { full: true, ...options, readPreference: this.readPreference, session }, callback); + } +} +exports.IndexesOperation = IndexesOperation; +/** @internal */ +class CreateIndexesOperation extends command_1.CommandOperation { + constructor(parent, collectionName, indexes, options) { + super(parent, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.collectionName = collectionName; + this.indexes = indexes; + } + execute(server, session, callback) { + const options = this.options; + const indexes = this.indexes; + const serverWireVersion = utils_1.maxWireVersion(server); + // Ensure we generate the correct name if the parameter is not set + for (let i = 0; i < indexes.length; i++) { + // Did the user pass in a collation, check if our write server supports it + if (indexes[i].collation && serverWireVersion < 5) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name}, which reports wire version ${serverWireVersion}, ` + + 'does not support collation')); + return; + } + if (indexes[i].name == null) { + const keys = []; + for (const name in indexes[i].key) { + keys.push(`${name}_${indexes[i].key[name]}`); + } + // Set the name + indexes[i].name = keys.join('_'); + } + } + const cmd = { createIndexes: this.collectionName, indexes }; + if (options.commitQuorum != null) { + if (serverWireVersion < 9) { + callback(new error_1.MongoCompatibilityError('Option `commitQuorum` for `createIndexes` not supported on servers < 4.4')); + return; + } + cmd.commitQuorum = options.commitQuorum; + } + // collation is set on each index, it should not be defined at the root + this.options.collation = undefined; + super.executeCommand(server, session, cmd, err => { + if (err) { + callback(err); + return; + } + const indexNames = indexes.map(index => index.name || ''); + callback(undefined, indexNames); + }); + } +} +exports.CreateIndexesOperation = CreateIndexesOperation; +/** @internal */ +class CreateIndexOperation extends CreateIndexesOperation { + constructor(parent, collectionName, indexSpec, options) { + // createIndex can be called with a variety of styles: + // coll.createIndex('a'); + // coll.createIndex({ a: 1 }); + // coll.createIndex([['a', 1]]); + // createIndexes is always called with an array of index spec objects + super(parent, collectionName, [makeIndexSpec(indexSpec, options)], options); + } + execute(server, session, callback) { + super.execute(server, session, (err, indexNames) => { + if (err || !indexNames) + return callback(err); + return callback(undefined, indexNames[0]); + }); + } +} +exports.CreateIndexOperation = CreateIndexOperation; +/** @internal */ +class EnsureIndexOperation extends CreateIndexOperation { + constructor(db, collectionName, indexSpec, options) { + super(db, collectionName, indexSpec, options); + 
this.readPreference = read_preference_1.ReadPreference.primary; + this.db = db; + this.collectionName = collectionName; + } + execute(server, session, callback) { + const indexName = this.indexes[0].name; + const cursor = this.db.collection(this.collectionName).listIndexes({ session }); + cursor.toArray((err, indexes) => { + /// ignore "NamespaceNotFound" errors + if (err && err.code !== error_1.MONGODB_ERROR_CODES.NamespaceNotFound) { + return callback(err); + } + if (indexes) { + indexes = Array.isArray(indexes) ? indexes : [indexes]; + if (indexes.some(index => index.name === indexName)) { + callback(undefined, indexName); + return; + } + } + super.execute(server, session, callback); + }); + } +} +exports.EnsureIndexOperation = EnsureIndexOperation; +/** @internal */ +class DropIndexOperation extends command_1.CommandOperation { + constructor(collection, indexName, options) { + super(collection, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.collection = collection; + this.indexName = indexName; + } + execute(server, session, callback) { + const cmd = { dropIndexes: this.collection.collectionName, index: this.indexName }; + super.executeCommand(server, session, cmd, callback); + } +} +exports.DropIndexOperation = DropIndexOperation; +/** @internal */ +class DropIndexesOperation extends DropIndexOperation { + constructor(collection, options) { + super(collection, '*', options); + } + execute(server, session, callback) { + super.execute(server, session, err => { + if (err) + return callback(err, false); + callback(undefined, true); + }); + } +} +exports.DropIndexesOperation = DropIndexesOperation; +/** @internal */ +class ListIndexesOperation extends command_1.CommandOperation { + constructor(collection, options) { + super(collection, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.collectionNamespace = collection.s.namespace; + } + execute(server, session, callback) { + const serverWireVersion = utils_1.maxWireVersion(server); + if (serverWireVersion < LIST_INDEXES_WIRE_VERSION) { + const systemIndexesNS = this.collectionNamespace.withCollection('system.indexes'); + const collectionNS = this.collectionNamespace.toString(); + server.query(systemIndexesNS, { query: { ns: collectionNS } }, { ...this.options, readPreference: this.readPreference }, callback); + return; + } + const cursor = this.options.batchSize ? 
{ batchSize: this.options.batchSize } : {}; + super.executeCommand(server, session, { listIndexes: this.collectionNamespace.collection, cursor }, callback); + } +} +exports.ListIndexesOperation = ListIndexesOperation; +/** @public */ +class ListIndexesCursor extends abstract_cursor_1.AbstractCursor { + constructor(collection, options) { + super(utils_1.getTopology(collection), collection.s.namespace, options); + this.parent = collection; + this.options = options; + } + clone() { + return new ListIndexesCursor(this.parent, { + ...this.options, + ...this.cursorOptions + }); + } + /** @internal */ + _initialize(session, callback) { + const operation = new ListIndexesOperation(this.parent, { + ...this.cursorOptions, + ...this.options, + session + }); + execute_operation_1.executeOperation(utils_1.getTopology(this.parent), operation, (err, response) => { + if (err || response == null) + return callback(err); + // TODO: NODE-2882 + callback(undefined, { server: operation.server, session, response }); + }); + } +} +exports.ListIndexesCursor = ListIndexesCursor; +/** @internal */ +class IndexExistsOperation extends operation_1.AbstractOperation { + constructor(collection, indexes, options) { + super(options); + this.options = options; + this.collection = collection; + this.indexes = indexes; + } + execute(server, session, callback) { + const coll = this.collection; + const indexes = this.indexes; + common_functions_1.indexInformation(coll.s.db, coll.collectionName, { ...this.options, readPreference: this.readPreference, session }, (err, indexInformation) => { + // If we have an error return + if (err != null) + return callback(err); + // Let's check for the index names + if (!Array.isArray(indexes)) + return callback(undefined, indexInformation[indexes] != null); + // Check in list of indexes + for (let i = 0; i < indexes.length; i++) { + if (indexInformation[indexes[i]] == null) { + return callback(undefined, false); + } + } + // All keys found return true + return callback(undefined, true); + }); + } +} +exports.IndexExistsOperation = IndexExistsOperation; +/** @internal */ +class IndexInformationOperation extends operation_1.AbstractOperation { + constructor(db, name, options) { + super(options); + this.options = options !== null && options !== void 0 ? 
options : {}; + this.db = db; + this.name = name; + } + execute(server, session, callback) { + const db = this.db; + const name = this.name; + common_functions_1.indexInformation(db, name, { ...this.options, readPreference: this.readPreference, session }, callback); + } +} +exports.IndexInformationOperation = IndexInformationOperation; +operation_1.defineAspects(ListIndexesOperation, [ + operation_1.Aspect.READ_OPERATION, + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.CURSOR_CREATING +]); +operation_1.defineAspects(CreateIndexesOperation, [operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(CreateIndexOperation, [operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(EnsureIndexOperation, [operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(DropIndexOperation, [operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(DropIndexesOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=indexes.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/indexes.js.map b/node_modules/mongodb/lib/operations/indexes.js.map new file mode 100644 index 000000000..12222d5f3 --- /dev/null +++ b/node_modules/mongodb/lib/operations/indexes.js.map @@ -0,0 +1 @@ +{"version":3,"file":"indexes.js","sourceRoot":"","sources":["../../src/operations/indexes.ts"],"names":[],"mappings":";;;AAAA,yDAA+E;AAC/E,2CAAuE;AACvE,oCAA0F;AAC1F,oCAMkB;AAClB,uCAKmB;AACnB,wDAAoD;AAKpD,+DAA2D;AAE3D,2DAAwE;AAGxE,MAAM,yBAAyB,GAAG,CAAC,CAAC;AACpC,MAAM,mBAAmB,GAAG,IAAI,GAAG,CAAC;IAClC,YAAY;IACZ,QAAQ;IACR,MAAM;IACN,yBAAyB;IACzB,QAAQ;IACR,QAAQ;IACR,oBAAoB;IACpB,eAAe;IACf,WAAW;IACX,SAAS;IAET,eAAe;IACf,SAAS;IACT,kBAAkB;IAClB,mBAAmB;IACnB,kBAAkB;IAElB,oBAAoB;IACpB,sBAAsB;IAEtB,aAAa;IACb,MAAM;IACN,KAAK;IACL,KAAK;IAEL,sBAAsB;IACtB,YAAY;IAEZ,mBAAmB;IACnB,oBAAoB;CACrB,CAAC,CAAC;AAmFH,SAAS,aAAa,CAAC,SAA6B,EAAE,OAAY;IAChE,MAAM,eAAe,GAAG,yBAAiB,CAAC,SAAS,CAAC,CAAC;IAErD,0BAA0B;IAC1B,MAAM,IAAI,GAAG,OAAO,OAAO,CAAC,IAAI,KAAK,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,eAAe,CAAC,IAAI,CAAC;IAEpF,mBAAmB;IACnB,MAAM,cAAc,GAAa,EAAE,IAAI,EAAE,GAAG,EAAE,eAAe,CAAC,SAAS,EAAE,CAAC;IAE1E,gDAAgD;IAChD,KAAK,MAAM,UAAU,IAAI,OAAO,EAAE;QAChC,IAAI,mBAAmB,CAAC,GAAG,CAAC,UAAU,CAAC,EAAE;YACvC,cAAc,CAAC,UAAU,CAAC,GAAG,OAAO,CAAC,UAAU,CAAC,CAAC;SAClD;KACF;IAED,OAAO,cAAkC,CAAC;AAC5C,CAAC;AAED,gBAAgB;AAChB,MAAa,gBAAiB,SAAQ,6BAA6B;IAIjE,YAAY,UAAsB,EAAE,OAAgC;QAClE,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA8B;QAC5E,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE7B,mCAAgB,CACd,IAAI,CAAC,CAAC,CAAC,EAAE,EACT,IAAI,CAAC,cAAc,EACnB,EAAE,IAAI,EAAE,IAAI,EAAE,GAAG,OAAO,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,EACxE,QAAQ,CACT,CAAC;IACJ,CAAC;CACF;AArBD,4CAqBC;AAED,gBAAgB;AAChB,MAAa,sBAEX,SAAQ,0BAAmB;IAK3B,YACE,MAAuB,EACvB,cAAsB,EACtB,OAA2B,EAC3B,OAA8B;QAE9B,KAAK,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;QAEvB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,cAAc,GAAG,cAAc,CAAC;QAErC,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAqB;QACnE,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAC7B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE7B,MAAM,iBAAiB,GAAG,sBAAc,CAAC,MAAM,CAAC,CAAC;QAEjD,kEAAkE;QAClE,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YACvC,0EAA0E;YAC1E,IAAI,OAAO,CAAC,CAAC,CAAC,CAAC,SAAS,IAAI,iBAAiB,GAAG,CAAC,EAAE;gBACjD,QAAQ,CACN,IAAI,+BAAuB,CACzB,UAAU,MAAM,CAAC,IAAI,gCAAgC,iBAAiB,IAAI;oBA
CxE,4BAA4B,CAC/B,CACF,CAAC;gBACF,OAAO;aACR;YAED,IAAI,OAAO,CAAC,CAAC,CAAC,CAAC,IAAI,IAAI,IAAI,EAAE;gBAC3B,MAAM,IAAI,GAAG,EAAE,CAAC;gBAEhB,KAAK,MAAM,IAAI,IAAI,OAAO,CAAC,CAAC,CAAC,CAAC,GAAG,EAAE;oBACjC,IAAI,CAAC,IAAI,CAAC,GAAG,IAAI,IAAI,OAAO,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;iBAC9C;gBAED,eAAe;gBACf,OAAO,CAAC,CAAC,CAAC,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;aAClC;SACF;QAED,MAAM,GAAG,GAAa,EAAE,aAAa,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,CAAC;QAEtE,IAAI,OAAO,CAAC,YAAY,IAAI,IAAI,EAAE;YAChC,IAAI,iBAAiB,GAAG,CAAC,EAAE;gBACzB,QAAQ,CACN,IAAI,+BAAuB,CACzB,0EAA0E,CAC3E,CACF,CAAC;gBACF,OAAO;aACR;YACD,GAAG,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;SACzC;QAED,uEAAuE;QACvE,IAAI,CAAC,OAAO,CAAC,SAAS,GAAG,SAAS,CAAC;QAEnC,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,GAAG,CAAC,EAAE;YAC/C,IAAI,GAAG,EAAE;gBACP,QAAQ,CAAC,GAAG,CAAC,CAAC;gBACd,OAAO;aACR;YAED,MAAM,UAAU,GAAG,OAAO,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,CAAC,IAAI,IAAI,EAAE,CAAC,CAAC;YAC1D,QAAQ,CAAC,SAAS,EAAE,UAAe,CAAC,CAAC;QACvC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA/ED,wDA+EC;AAED,gBAAgB;AAChB,MAAa,oBAAqB,SAAQ,sBAA8B;IACtE,YACE,MAAuB,EACvB,cAAsB,EACtB,SAA6B,EAC7B,OAA8B;QAE9B,sDAAsD;QACtD,2BAA2B;QAC3B,gCAAgC;QAChC,kCAAkC;QAClC,qEAAqE;QAErE,KAAK,CAAC,MAAM,EAAE,cAAc,EAAE,CAAC,aAAa,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC,EAAE,OAAO,CAAC,CAAC;IAC9E,CAAC;IACD,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA0B;QACxE,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,UAAU,EAAE,EAAE;YACjD,IAAI,GAAG,IAAI,CAAC,UAAU;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC7C,OAAO,QAAQ,CAAC,SAAS,EAAE,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;QAC5C,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AArBD,oDAqBC;AAED,gBAAgB;AAChB,MAAa,oBAAqB,SAAQ,oBAAoB;IAI5D,YACE,EAAM,EACN,cAAsB,EACtB,SAA6B,EAC7B,OAA8B;QAE9B,KAAK,CAAC,EAAE,EAAE,cAAc,EAAE,SAAS,EAAE,OAAO,CAAC,CAAC;QAE9C,IAAI,CAAC,cAAc,GAAG,gCAAc,CAAC,OAAO,CAAC;QAC7C,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,cAAc,GAAG,cAAc,CAAC;IACvC,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAkB;QAChE,MAAM,SAAS,GAAG,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC;QACvC,MAAM,MAAM,GAAG,IAAI,CAAC,EAAE,CAAC,UAAU,CAAC,IAAI,CAAC,cAAc,CAAC,CAAC,WAAW,CAAC,EAAE,OAAO,EAAE,CAAC,CAAC;QAChF,MAAM,CAAC,OAAO,CAAC,CAAC,GAAG,EAAE,OAAO,EAAE,EAAE;YAC9B,qCAAqC;YACrC,IAAI,GAAG,IAAK,GAAwB,CAAC,IAAI,KAAK,2BAAmB,CAAC,iBAAiB,EAAE;gBACnF,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,IAAI,OAAO,EAAE;gBACX,OAAO,GAAG,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;gBACvD,IAAI,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE,CAAC,KAAK,CAAC,IAAI,KAAK,SAAS,CAAC,EAAE;oBACnD,QAAQ,CAAC,SAAS,EAAE,SAAS,CAAC,CAAC;oBAC/B,OAAO;iBACR;aACF;YAED,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;QAC3C,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AArCD,oDAqCC;AAKD,gBAAgB;AAChB,MAAa,kBAAmB,SAAQ,0BAA0B;IAKhE,YAAY,UAAsB,EAAE,SAAiB,EAAE,OAA4B;QACjF,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAE3B,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;IAC7B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,GAAG,GAAG,EAAE,WAAW,EAAE,IAAI,CAAC,UAAU,CAAC,cAAc,EAAE,KAAK,EAAE,IAAI,CAAC,SAAS,EAAE,CAAC;QACnF,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,QAAQ,CAAC,CAAC;IACvD,CAAC;CACF;AAjBD,gDAiBC;AAED,gBAAgB;AAChB,MAAa,oBAAqB,SAAQ,kBAAkB;IAC1D,YAAY,UAAsB,EAAE,OAA2B;QAC7D,KAAK,CAAC,UAAU,EAAE,GAAG,EAAE,OAAO,CAAC,CAAC;IAClC,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAkB;QAChE,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,CAAC,EAAE;YACnC,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,EAAE,KAAK,CAAC,CAAC;YACrC,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;QAC5B,CAAC,CAAC,CAAC;IACL,CAA
C;CACF;AAXD,oDAWC;AAQD,gBAAgB;AAChB,MAAa,oBAAqB,SAAQ,0BAA0B;IAIlE,YAAY,UAAsB,EAAE,OAA4B;QAC9D,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAE3B,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,mBAAmB,GAAG,UAAU,CAAC,CAAC,CAAC,SAAS,CAAC;IACpD,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,iBAAiB,GAAG,sBAAc,CAAC,MAAM,CAAC,CAAC;QACjD,IAAI,iBAAiB,GAAG,yBAAyB,EAAE;YACjD,MAAM,eAAe,GAAG,IAAI,CAAC,mBAAmB,CAAC,cAAc,CAAC,gBAAgB,CAAC,CAAC;YAClF,MAAM,YAAY,GAAG,IAAI,CAAC,mBAAmB,CAAC,QAAQ,EAAE,CAAC;YAEzD,MAAM,CAAC,KAAK,CACV,eAAe,EACf,EAAE,KAAK,EAAE,EAAE,EAAE,EAAE,YAAY,EAAE,EAAE,EAC/B,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,EACxD,QAAQ,CACT,CAAC;YACF,OAAO;SACR;QAED,MAAM,MAAM,GAAG,IAAI,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,EAAE,SAAS,EAAE,IAAI,CAAC,OAAO,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC,EAAE,CAAC;QACnF,KAAK,CAAC,cAAc,CAClB,MAAM,EACN,OAAO,EACP,EAAE,WAAW,EAAE,IAAI,CAAC,mBAAmB,CAAC,UAAU,EAAE,MAAM,EAAE,EAC5D,QAAQ,CACT,CAAC;IACJ,CAAC;CACF;AAlCD,oDAkCC;AAED,cAAc;AACd,MAAa,iBAAkB,SAAQ,gCAAc;IAInD,YAAY,UAAsB,EAAE,OAA4B;QAC9D,KAAK,CAAC,mBAAW,CAAC,UAAU,CAAC,EAAE,UAAU,CAAC,CAAC,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAChE,IAAI,CAAC,MAAM,GAAG,UAAU,CAAC;QACzB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,KAAK;QACH,OAAO,IAAI,iBAAiB,CAAC,IAAI,CAAC,MAAM,EAAE;YACxC,GAAG,IAAI,CAAC,OAAO;YACf,GAAG,IAAI,CAAC,aAAa;SACtB,CAAC,CAAC;IACL,CAAC;IAED,gBAAgB;IAChB,WAAW,CAAC,OAAkC,EAAE,QAAmC;QACjF,MAAM,SAAS,GAAG,IAAI,oBAAoB,CAAC,IAAI,CAAC,MAAM,EAAE;YACtD,GAAG,IAAI,CAAC,aAAa;YACrB,GAAG,IAAI,CAAC,OAAO;YACf,OAAO;SACR,CAAC,CAAC;QAEH,oCAAgB,CAAC,mBAAW,CAAC,IAAI,CAAC,MAAM,CAAC,EAAE,SAAS,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YACtE,IAAI,GAAG,IAAI,QAAQ,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAElD,kBAAkB;YAClB,QAAQ,CAAC,SAAS,EAAE,EAAE,MAAM,EAAE,SAAS,CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,EAAE,CAAC,CAAC;QACvE,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAhCD,8CAgCC;AAED,gBAAgB;AAChB,MAAa,oBAAqB,SAAQ,6BAA0B;IAKlE,YACE,UAAsB,EACtB,OAA0B,EAC1B,OAAgC;QAEhC,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA2B;QACzE,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE7B,mCAAgB,CACd,IAAI,CAAC,CAAC,CAAC,EAAE,EACT,IAAI,CAAC,cAAc,EACnB,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,EACjE,CAAC,GAAG,EAAE,gBAAgB,EAAE,EAAE;YACxB,6BAA6B;YAC7B,IAAI,GAAG,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YACtC,kCAAkC;YAClC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,gBAAgB,CAAC,OAAO,CAAC,IAAI,IAAI,CAAC,CAAC;YAC3F,2BAA2B;YAC3B,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;gBACvC,IAAI,gBAAgB,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,IAAI,IAAI,EAAE;oBACxC,OAAO,QAAQ,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;iBACnC;aACF;YAED,6BAA6B;YAC7B,OAAO,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;QACnC,CAAC,CACF,CAAC;IACJ,CAAC;CACF;AAzCD,oDAyCC;AAED,gBAAgB;AAChB,MAAa,yBAA0B,SAAQ,6BAA2B;IAKxE,YAAY,EAAM,EAAE,IAAY,EAAE,OAAiC;QACjE,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;IACnB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,EAAE,GAAG,IAAI,CAAC,EAAE,CAAC;QACnB,MAAM,IAAI,GAAG,IAAI,CAAC,IAAI,CAAC;QAEvB,mCAAgB,CACd,EAAE,EACF,IAAI,EACJ,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,EACjE,QAAQ,CACT,CAAC;IACJ,CAAC;CACF;AAvBD,8DAuBC;AAED,yBAAa,CAAC,oBAAoB,EAAE;IAClC,kBAAM,CAAC,cAAc;IACrB,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,eAAe;CACvB,C
AAC,CAAC;AACH,yBAAa,CAAC,sBAAsB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAChE,yBAAa,CAAC,oBAAoB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAC9D,yBAAa,CAAC,oBAAoB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAC9D,yBAAa,CAAC,kBAAkB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAC5D,yBAAa,CAAC,oBAAoB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/insert.js b/node_modules/mongodb/lib/operations/insert.js new file mode 100644 index 000000000..9f877265d --- /dev/null +++ b/node_modules/mongodb/lib/operations/insert.js @@ -0,0 +1,93 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.InsertManyOperation = exports.InsertOneOperation = exports.InsertOperation = void 0; +const error_1 = require("../error"); +const operation_1 = require("./operation"); +const command_1 = require("./command"); +const common_functions_1 = require("./common_functions"); +const write_concern_1 = require("../write_concern"); +const bulk_write_1 = require("./bulk_write"); +/** @internal */ +class InsertOperation extends command_1.CommandOperation { + constructor(ns, documents, options) { + var _a; + super(undefined, options); + this.options = { ...options, checkKeys: (_a = options.checkKeys) !== null && _a !== void 0 ? _a : false }; + this.ns = ns; + this.documents = documents; + } + execute(server, session, callback) { + var _a; + const options = (_a = this.options) !== null && _a !== void 0 ? _a : {}; + const ordered = typeof options.ordered === 'boolean' ? options.ordered : true; + const command = { + insert: this.ns.collection, + documents: this.documents, + ordered + }; + if (typeof options.bypassDocumentValidation === 'boolean') { + command.bypassDocumentValidation = options.bypassDocumentValidation; + } + if (options.comment != null) { + command.comment = options.comment; + } + super.executeCommand(server, session, command, callback); + } +} +exports.InsertOperation = InsertOperation; +class InsertOneOperation extends InsertOperation { + constructor(collection, doc, options) { + super(collection.s.namespace, common_functions_1.prepareDocs(collection, [doc], options), options); + } + execute(server, session, callback) { + super.execute(server, session, (err, res) => { + var _a, _b; + if (err || res == null) + return callback(err); + if (res.code) + return callback(new error_1.MongoServerError(res)); + if (res.writeErrors) { + // This should be a WriteError but we can't change it now because of error hierarchy + return callback(new error_1.MongoServerError(res.writeErrors[0])); + } + callback(undefined, { + acknowledged: (_b = ((_a = this.writeConcern) === null || _a === void 0 ? void 0 : _a.w) !== 0) !== null && _b !== void 0 ? 
_b : true, + insertedId: this.documents[0]._id + }); + }); + } +} +exports.InsertOneOperation = InsertOneOperation; +/** @internal */ +class InsertManyOperation extends operation_1.AbstractOperation { + constructor(collection, docs, options) { + super(options); + if (!Array.isArray(docs)) { + throw new error_1.MongoInvalidArgumentError('Argument "docs" must be an array of documents'); + } + this.options = options; + this.collection = collection; + this.docs = docs; + } + execute(server, session, callback) { + const coll = this.collection; + const options = { ...this.options, ...this.bsonOptions, readPreference: this.readPreference }; + const writeConcern = write_concern_1.WriteConcern.fromOptions(options); + const bulkWriteOperation = new bulk_write_1.BulkWriteOperation(coll, common_functions_1.prepareDocs(coll, this.docs, options).map(document => ({ insertOne: { document } })), options); + bulkWriteOperation.execute(server, session, (err, res) => { + var _a; + if (err || res == null) + return callback(err); + callback(undefined, { + acknowledged: (_a = (writeConcern === null || writeConcern === void 0 ? void 0 : writeConcern.w) !== 0) !== null && _a !== void 0 ? _a : true, + insertedCount: res.insertedCount, + insertedIds: res.insertedIds + }); + }); + } +} +exports.InsertManyOperation = InsertManyOperation; +operation_1.defineAspects(InsertOperation, [operation_1.Aspect.RETRYABLE, operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(InsertOneOperation, [operation_1.Aspect.RETRYABLE, operation_1.Aspect.WRITE_OPERATION]); +operation_1.defineAspects(InsertManyOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=insert.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/insert.js.map b/node_modules/mongodb/lib/operations/insert.js.map new file mode 100644 index 000000000..c8ab1dfb8 --- /dev/null +++ b/node_modules/mongodb/lib/operations/insert.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"insert.js","sourceRoot":"","sources":["../../src/operations/insert.ts"],"names":[],"mappings":";;;AAAA,oCAAuE;AACvE,2CAAuE;AACvE,uCAAsE;AACtE,yDAAiD;AAMjD,oDAAgD;AAEhD,6CAAkD;AAGlD,gBAAgB;AAChB,MAAa,eAAgB,SAAQ,0BAA0B;IAI7D,YAAY,EAAoB,EAAE,SAAqB,EAAE,OAAyB;;QAChF,KAAK,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC1B,IAAI,CAAC,OAAO,GAAG,EAAE,GAAG,OAAO,EAAE,SAAS,EAAE,MAAA,OAAO,CAAC,SAAS,mCAAI,KAAK,EAAE,CAAC;QACrE,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,SAAS,GAAG,SAAS,CAAC;IAC7B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;;QAC1E,MAAM,OAAO,GAAG,MAAA,IAAI,CAAC,OAAO,mCAAI,EAAE,CAAC;QACnC,MAAM,OAAO,GAAG,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC;QAC9E,MAAM,OAAO,GAAa;YACxB,MAAM,EAAE,IAAI,CAAC,EAAE,CAAC,UAAU;YAC1B,SAAS,EAAE,IAAI,CAAC,SAAS;YACzB,OAAO;SACR,CAAC;QAEF,IAAI,OAAO,OAAO,CAAC,wBAAwB,KAAK,SAAS,EAAE;YACzD,OAAO,CAAC,wBAAwB,GAAG,OAAO,CAAC,wBAAwB,CAAC;SACrE;QAED,IAAI,OAAO,CAAC,OAAO,IAAI,IAAI,EAAE;YAC3B,OAAO,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;SACnC;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AA9BD,0CA8BC;AAkBD,MAAa,kBAAmB,SAAQ,eAAe;IACrD,YAAY,UAAsB,EAAE,GAAa,EAAE,OAAyB;QAC1E,KAAK,CAAC,UAAU,CAAC,CAAC,CAAC,SAAS,EAAE,8BAAW,CAAC,UAAU,EAAE,CAAC,GAAG,CAAC,EAAE,OAAO,CAAC,EAAE,OAAO,CAAC,CAAC;IAClF,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAmC;QACjF,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YAC1C,IAAI,GAAG,IAAI,GAAG,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC7C,IAAI,GAAG,CAAC,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;YACzD,IAAI,GAAG,CAAC,WAAW,EAAE;gBACnB,oFAAoF;gBACpF,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;aAC3D;YAED,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAChD,UAAU,EAAE,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,GAAG;aAClC,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AApBD,gDAoBC;AAYD,gBAAgB;AAChB,MAAa,mBAAoB,SAAQ,6BAAmC;IAK1E,YAAY,UAAsB,EAAE,IAAgB,EAAE,OAAyB;QAC7E,KAAK,CAAC,OAAO,CAAC,CAAC;QAEf,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;YACxB,MAAM,IAAI,iCAAyB,CAAC,+CAA+C,CAAC,CAAC;SACtF;QAED,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;IACnB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAoC;QAClF,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,OAAO,GAAG,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,GAAG,IAAI,CAAC,WAAW,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,CAAC;QAC9F,MAAM,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QACvD,MAAM,kBAAkB,GAAG,IAAI,+BAAkB,CAC/C,IAAI,EACJ,8BAAW,CAAC,IAAI,EAAE,IAAI,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,EAAE,CAAC,CAAC,EAAE,SAAS,EAAE,EAAE,QAAQ,EAAE,EAAE,CAAC,CAAC,EACpF,OAAO,CACR,CAAC;QAEF,kBAAkB,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YACvD,IAAI,GAAG,IAAI,GAAG,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC7C,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,YAAY,aAAZ,YAAY,uBAAZ,YAAY,CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAC3C,aAAa,EAAE,GAAG,CAAC,aAAa;gBAChC,WAAW,EAAE,GAAG,CAAC,WAAW;aAC7B,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AApCD,kDAoCC;AAED,yBAAa,CAAC,eAAe,EAAE,CAAC,kBAAM,CAAC,SAAS,EAAE,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAC3E,yBAAa,CAAC,kBAAkB,EAAE,CAAC,kBAAM,CAAC,SAAS,EAAE,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC;AAC9E,yBAAa,CAAC,mBAAmB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/is_capped.js b/node_modules/mongodb/lib/operations/is_capped.js new file mode 100644 index 000000000..16e0a315a --- 
/dev/null +++ b/node_modules/mongodb/lib/operations/is_capped.js @@ -0,0 +1,30 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.IsCappedOperation = void 0; +const operation_1 = require("./operation"); +const error_1 = require("../error"); +/** @internal */ +class IsCappedOperation extends operation_1.AbstractOperation { + constructor(collection, options) { + super(options); + this.options = options; + this.collection = collection; + } + execute(server, session, callback) { + const coll = this.collection; + coll.s.db + .listCollections({ name: coll.collectionName }, { ...this.options, nameOnly: false, readPreference: this.readPreference, session }) + .toArray((err, collections) => { + if (err || !collections) + return callback(err); + if (collections.length === 0) { + // TODO(NODE-3485) + return callback(new error_1.MongoAPIError(`collection ${coll.namespace} not found`)); + } + const collOptions = collections[0].options; + callback(undefined, !!(collOptions && collOptions.capped)); + }); + } +} +exports.IsCappedOperation = IsCappedOperation; +//# sourceMappingURL=is_capped.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/is_capped.js.map b/node_modules/mongodb/lib/operations/is_capped.js.map new file mode 100644 index 000000000..227f1567e --- /dev/null +++ b/node_modules/mongodb/lib/operations/is_capped.js.map @@ -0,0 +1 @@ +{"version":3,"file":"is_capped.js","sourceRoot":"","sources":["../../src/operations/is_capped.ts"],"names":[],"mappings":";;;AAEA,2CAAkE;AAGlE,oCAAyC;AAEzC,gBAAgB;AAChB,MAAa,iBAAkB,SAAQ,6BAA0B;IAI/D,YAAY,UAAsB,EAAE,OAAyB;QAC3D,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA2B;QACzE,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAE7B,IAAI,CAAC,CAAC,CAAC,EAAE;aACN,eAAe,CACd,EAAE,IAAI,EAAE,IAAI,CAAC,cAAc,EAAE,EAC7B,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,QAAQ,EAAE,KAAK,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,CACnF;aACA,OAAO,CAAC,CAAC,GAAG,EAAE,WAAW,EAAE,EAAE;YAC5B,IAAI,GAAG,IAAI,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9C,IAAI,WAAW,CAAC,MAAM,KAAK,CAAC,EAAE;gBAC5B,kBAAkB;gBAClB,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,cAAc,IAAI,CAAC,SAAS,YAAY,CAAC,CAAC,CAAC;aAC9E;YAED,MAAM,WAAW,GAAG,WAAW,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC;YAC3C,QAAQ,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC,WAAW,IAAI,WAAW,CAAC,MAAM,CAAC,CAAC,CAAC;QAC7D,CAAC,CAAC,CAAC;IACP,CAAC;CACF;AA7BD,8CA6BC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/list_collections.js b/node_modules/mongodb/lib/operations/list_collections.js new file mode 100644 index 000000000..7ff7277a7 --- /dev/null +++ b/node_modules/mongodb/lib/operations/list_collections.js @@ -0,0 +1,105 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ListCollectionsCursor = exports.ListCollectionsOperation = void 0; +const command_1 = require("./command"); +const operation_1 = require("./operation"); +const utils_1 = require("../utils"); +const CONSTANTS = require("../constants"); +const abstract_cursor_1 = require("../cursor/abstract_cursor"); +const execute_operation_1 = require("./execute_operation"); +const LIST_COLLECTIONS_WIRE_VERSION = 3; +/** @internal */ +class ListCollectionsOperation extends command_1.CommandOperation { + constructor(db, filter, options) { + super(db, options); + this.options = options !== null && options !== void 0 ? 
options : {}; + this.db = db; + this.filter = filter; + this.nameOnly = !!this.options.nameOnly; + if (typeof this.options.batchSize === 'number') { + this.batchSize = this.options.batchSize; + } + } + execute(server, session, callback) { + if (utils_1.maxWireVersion(server) < LIST_COLLECTIONS_WIRE_VERSION) { + let filter = this.filter; + const databaseName = this.db.s.namespace.db; + // If we have legacy mode and have not provided a full db name filter it + if (typeof filter.name === 'string' && !new RegExp(`^${databaseName}\\.`).test(filter.name)) { + filter = Object.assign({}, filter); + filter.name = this.db.s.namespace.withCollection(filter.name).toString(); + } + // No filter, filter by current database + if (filter == null) { + filter = { name: `/${databaseName}/` }; + } + // Rewrite the filter to use $and to filter out indexes + if (filter.name) { + filter = { $and: [{ name: filter.name }, { name: /^((?!\$).)*$/ }] }; + } + else { + filter = { name: /^((?!\$).)*$/ }; + } + const documentTransform = (doc) => { + const matching = `${databaseName}.`; + const index = doc.name.indexOf(matching); + // Remove database name if available + if (doc.name && index === 0) { + doc.name = doc.name.substr(index + matching.length); + } + return doc; + }; + server.query(new utils_1.MongoDBNamespace(databaseName, CONSTANTS.SYSTEM_NAMESPACE_COLLECTION), { query: filter }, { batchSize: this.batchSize || 1000, readPreference: this.readPreference }, (err, result) => { + if (result && result.documents && Array.isArray(result.documents)) { + result.documents = result.documents.map(documentTransform); + } + callback(err, result); + }); + return; + } + const command = { + listCollections: 1, + filter: this.filter, + cursor: this.batchSize ? { batchSize: this.batchSize } : {}, + nameOnly: this.nameOnly + }; + return super.executeCommand(server, session, command, callback); + } +} +exports.ListCollectionsOperation = ListCollectionsOperation; +/** @public */ +class ListCollectionsCursor extends abstract_cursor_1.AbstractCursor { + constructor(db, filter, options) { + super(utils_1.getTopology(db), db.s.namespace, options); + this.parent = db; + this.filter = filter; + this.options = options; + } + clone() { + return new ListCollectionsCursor(this.parent, this.filter, { + ...this.options, + ...this.cursorOptions + }); + } + /** @internal */ + _initialize(session, callback) { + const operation = new ListCollectionsOperation(this.parent, this.filter, { + ...this.cursorOptions, + ...this.options, + session + }); + execute_operation_1.executeOperation(utils_1.getTopology(this.parent), operation, (err, response) => { + if (err || response == null) + return callback(err); + // TODO: NODE-2882 + callback(undefined, { server: operation.server, session, response }); + }); + } +} +exports.ListCollectionsCursor = ListCollectionsCursor; +operation_1.defineAspects(ListCollectionsOperation, [ + operation_1.Aspect.READ_OPERATION, + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.CURSOR_CREATING +]); +//# sourceMappingURL=list_collections.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/list_collections.js.map b/node_modules/mongodb/lib/operations/list_collections.js.map new file mode 100644 index 000000000..8e9077226 --- /dev/null +++ b/node_modules/mongodb/lib/operations/list_collections.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"list_collections.js","sourceRoot":"","sources":["../../src/operations/list_collections.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AACtE,2CAAoD;AACpD,oCAAmF;AACnF,0CAA0C;AAI1C,+DAA2D;AAE3D,2DAAwE;AAExE,MAAM,6BAA6B,GAAG,CAAC,CAAC;AAUxC,gBAAgB;AAChB,MAAa,wBAAyB,SAAQ,0BAA0B;IAOtE,YAAY,EAAM,EAAE,MAAgB,EAAE,OAAgC;QACpE,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAEnB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QACrB,IAAI,CAAC,QAAQ,GAAG,CAAC,CAAC,IAAI,CAAC,OAAO,CAAC,QAAQ,CAAC;QAExC,IAAI,OAAO,IAAI,CAAC,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;YAC9C,IAAI,CAAC,SAAS,GAAG,IAAI,CAAC,OAAO,CAAC,SAAS,CAAC;SACzC;IACH,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,6BAA6B,EAAE;YAC1D,IAAI,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC;YACzB,MAAM,YAAY,GAAG,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,SAAS,CAAC,EAAE,CAAC;YAE5C,wEAAwE;YACxE,IAAI,OAAO,MAAM,CAAC,IAAI,KAAK,QAAQ,IAAI,CAAC,IAAI,MAAM,CAAC,IAAI,YAAY,KAAK,CAAC,CAAC,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,EAAE;gBAC3F,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,MAAM,CAAC,CAAC;gBACnC,MAAM,CAAC,IAAI,GAAG,IAAI,CAAC,EAAE,CAAC,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC,QAAQ,EAAE,CAAC;aAC1E;YAED,wCAAwC;YACxC,IAAI,MAAM,IAAI,IAAI,EAAE;gBAClB,MAAM,GAAG,EAAE,IAAI,EAAE,IAAI,YAAY,GAAG,EAAE,CAAC;aACxC;YAED,uDAAuD;YACvD,IAAI,MAAM,CAAC,IAAI,EAAE;gBACf,MAAM,GAAG,EAAE,IAAI,EAAE,CAAC,EAAE,IAAI,EAAE,MAAM,CAAC,IAAI,EAAE,EAAE,EAAE,IAAI,EAAE,cAAc,EAAE,CAAC,EAAE,CAAC;aACtE;iBAAM;gBACL,MAAM,GAAG,EAAE,IAAI,EAAE,cAAc,EAAE,CAAC;aACnC;YAED,MAAM,iBAAiB,GAAG,CAAC,GAAa,EAAE,EAAE;gBAC1C,MAAM,QAAQ,GAAG,GAAG,YAAY,GAAG,CAAC;gBACpC,MAAM,KAAK,GAAG,GAAG,CAAC,IAAI,CAAC,OAAO,CAAC,QAAQ,CAAC,CAAC;gBACzC,oCAAoC;gBACpC,IAAI,GAAG,CAAC,IAAI,IAAI,KAAK,KAAK,CAAC,EAAE;oBAC3B,GAAG,CAAC,IAAI,GAAG,GAAG,CAAC,IAAI,CAAC,MAAM,CAAC,KAAK,GAAG,QAAQ,CAAC,MAAM,CAAC,CAAC;iBACrD;gBAED,OAAO,GAAG,CAAC;YACb,CAAC,CAAC;YAEF,MAAM,CAAC,KAAK,CACV,IAAI,wBAAgB,CAAC,YAAY,EAAE,SAAS,CAAC,2BAA2B,CAAC,EACzE,EAAE,KAAK,EAAE,MAAM,EAAE,EACjB,EAAE,SAAS,EAAE,IAAI,CAAC,SAAS,IAAI,IAAI,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,EAC1E,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;gBACd,IAAI,MAAM,IAAI,MAAM,CAAC,SAAS,IAAI,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,SAAS,CAAC,EAAE;oBACjE,MAAM,CAAC,SAAS,GAAG,MAAM,CAAC,SAAS,CAAC,GAAG,CAAC,iBAAiB,CAAC,CAAC;iBAC5D;gBAED,QAAQ,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;YACxB,CAAC,CACF,CAAC;YAEF,OAAO;SACR;QAED,MAAM,OAAO,GAAG;YACd,eAAe,EAAE,CAAC;YAClB,MAAM,EAAE,IAAI,CAAC,MAAM;YACnB,MAAM,EAAE,IAAI,CAAC,SAAS,CAAC,CAAC,CAAC,EAAE,SAAS,EAAE,IAAI,CAAC,SAAS,EAAE,CAAC,CAAC,CAAC,EAAE;YAC3D,QAAQ,EAAE,IAAI,CAAC,QAAQ;SACxB,CAAC;QAEF,OAAO,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAClE,CAAC;CACF;AA/ED,4DA+EC;AAcD,cAAc;AACd,MAAa,qBAIX,SAAQ,gCAAiB;IAKzB,YAAY,EAAM,EAAE,MAAgB,EAAE,OAAgC;QACpE,KAAK,CAAC,mBAAW,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAChD,IAAI,CAAC,MAAM,GAAG,EAAE,CAAC;QACjB,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;QACrB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,KAAK;QACH,OAAO,IAAI,qBAAqB,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,EAAE;YACzD,GAAG,IAAI,CAAC,OAAO;YACf,GAAG,IAAI,CAAC,aAAa;SACtB,CAAC,CAAC;IACL,CAAC;IAED,gBAAgB;IAChB,WAAW,CAAC,OAAkC,EAAE,QAAmC;QACjF,MAAM,SAAS,GAAG,IAAI,wBAAwB,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,MAAM,EAAE;YACvE,GAAG,IAAI,CAAC,aAAa;YACrB,GAAG,IAAI,CAAC,OAAO;YACf,OAAO;SACR,CAAC,CAAC;QAEH,oCAAgB,CAAC,mBAAW,CAAC,IAAI,CAAC,MAAM,CAAC,EAAE,SAAS,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YACtE,IAAI,GAAG,IAAI,QAAQ,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAElD,kBAAkB;YAClB,QAAQ,CAAC,SAAS,EAAE,EAAE,MAAM,EAAE,SAAS,
CAAC,MAAM,EAAE,OAAO,EAAE,QAAQ,EAAE,CAAC,CAAC;QACvE,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAtCD,sDAsCC;AAED,yBAAa,CAAC,wBAAwB,EAAE;IACtC,kBAAM,CAAC,cAAc;IACrB,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,eAAe;CACvB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/list_databases.js b/node_modules/mongodb/lib/operations/list_databases.js new file mode 100644 index 000000000..5133b32a2 --- /dev/null +++ b/node_modules/mongodb/lib/operations/list_databases.js @@ -0,0 +1,30 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ListDatabasesOperation = void 0; +const command_1 = require("./command"); +const operation_1 = require("./operation"); +const utils_1 = require("../utils"); +/** @internal */ +class ListDatabasesOperation extends command_1.CommandOperation { + constructor(db, options) { + super(db, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.ns = new utils_1.MongoDBNamespace('admin', '$cmd'); + } + execute(server, session, callback) { + const cmd = { listDatabases: 1 }; + if (this.options.nameOnly) { + cmd.nameOnly = Number(cmd.nameOnly); + } + if (this.options.filter) { + cmd.filter = this.options.filter; + } + if (typeof this.options.authorizedDatabases === 'boolean') { + cmd.authorizedDatabases = this.options.authorizedDatabases; + } + super.executeCommand(server, session, cmd, callback); + } +} +exports.ListDatabasesOperation = ListDatabasesOperation; +operation_1.defineAspects(ListDatabasesOperation, [operation_1.Aspect.READ_OPERATION, operation_1.Aspect.RETRYABLE]); +//# sourceMappingURL=list_databases.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/list_databases.js.map b/node_modules/mongodb/lib/operations/list_databases.js.map new file mode 100644 index 000000000..36e00572b --- /dev/null +++ b/node_modules/mongodb/lib/operations/list_databases.js.map @@ -0,0 +1 @@ +{"version":3,"file":"list_databases.js","sourceRoot":"","sources":["../../src/operations/list_databases.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AACtE,2CAAoD;AACpD,oCAAsD;AAmBtD,gBAAgB;AAChB,MAAa,sBAAuB,SAAQ,0BAAqC;IAG/E,YAAY,EAAM,EAAE,OAA8B;QAChD,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,EAAE,GAAG,IAAI,wBAAgB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;IAClD,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAuC;QACrF,MAAM,GAAG,GAAa,EAAE,aAAa,EAAE,CAAC,EAAE,CAAC;QAC3C,IAAI,IAAI,CAAC,OAAO,CAAC,QAAQ,EAAE;YACzB,GAAG,CAAC,QAAQ,GAAG,MAAM,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC;SACrC;QAED,IAAI,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE;YACvB,GAAG,CAAC,MAAM,GAAG,IAAI,CAAC,OAAO,CAAC,MAAM,CAAC;SAClC;QAED,IAAI,OAAO,IAAI,CAAC,OAAO,CAAC,mBAAmB,KAAK,SAAS,EAAE;YACzD,GAAG,CAAC,mBAAmB,GAAG,IAAI,CAAC,OAAO,CAAC,mBAAmB,CAAC;SAC5D;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,GAAG,EAAE,QAAQ,CAAC,CAAC;IACvD,CAAC;CACF;AAzBD,wDAyBC;AAED,yBAAa,CAAC,sBAAsB,EAAE,CAAC,kBAAM,CAAC,cAAc,EAAE,kBAAM,CAAC,SAAS,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/map_reduce.js b/node_modules/mongodb/lib/operations/map_reduce.js new file mode 100644 index 000000000..45fd0f2a6 --- /dev/null +++ b/node_modules/mongodb/lib/operations/map_reduce.js @@ -0,0 +1,166 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.MapReduceOperation = void 0; +const bson_1 = require("../bson"); +const utils_1 = require("../utils"); +const read_preference_1 = require("../read_preference"); +const command_1 = require("./command"); 
+const error_1 = require("../error"); +const operation_1 = require("./operation"); +const db_1 = require("../db"); +const exclusionList = [ + 'explain', + 'readPreference', + 'readConcern', + 'session', + 'bypassDocumentValidation', + 'writeConcern', + 'raw', + 'fieldsAsRaw', + 'promoteLongs', + 'promoteValues', + 'promoteBuffers', + 'bsonRegExp', + 'serializeFunctions', + 'ignoreUndefined', + 'scope' // this option is reformatted thus exclude the original +]; +/** + * Run Map Reduce across a collection. Be aware that the inline option for out will return an array of results not a collection. + * @internal + */ +class MapReduceOperation extends command_1.CommandOperation { + /** + * Constructs a MapReduce operation. + * + * @param collection - Collection instance. + * @param map - The mapping function. + * @param reduce - The reduce function. + * @param options - Optional settings. See Collection.prototype.mapReduce for a list of options. + */ + constructor(collection, map, reduce, options) { + super(collection, options); + this.options = options !== null && options !== void 0 ? options : {}; + this.collection = collection; + this.map = map; + this.reduce = reduce; + } + execute(server, session, callback) { + const coll = this.collection; + const map = this.map; + const reduce = this.reduce; + let options = this.options; + const mapCommandHash = { + mapReduce: coll.collectionName, + map: map, + reduce: reduce + }; + if (options.scope) { + mapCommandHash.scope = processScope(options.scope); + } + // Add any other options passed in + for (const n in options) { + // Only include if not in exclusion list + if (exclusionList.indexOf(n) === -1) { + mapCommandHash[n] = options[n]; + } + } + options = Object.assign({}, options); + // If we have a read preference and inline is not set as output fail hard + if (this.readPreference.mode === read_preference_1.ReadPreferenceMode.primary && + options.out && + options.out.inline !== 1 && + options.out !== 'inline') { + // Force readPreference to primary + options.readPreference = read_preference_1.ReadPreference.primary; + // Decorate command with writeConcern if supported + utils_1.applyWriteConcern(mapCommandHash, { db: coll.s.db, collection: coll }, options); + } + else { + utils_1.decorateWithReadConcern(mapCommandHash, coll, options); + } + // Is bypassDocumentValidation specified + if (options.bypassDocumentValidation === true) { + mapCommandHash.bypassDocumentValidation = options.bypassDocumentValidation; + } + // Have we specified collation + try { + utils_1.decorateWithCollation(mapCommandHash, coll, options); + } + catch (err) { + return callback(err); + } + if (this.explain && utils_1.maxWireVersion(server) < 9) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support explain on mapReduce`)); + return; + } + // Execute command + super.executeCommand(server, session, mapCommandHash, (err, result) => { + if (err) + return callback(err); + // Check if we have an error + if (1 !== result.ok || result.err || result.errmsg) { + return callback(new error_1.MongoServerError(result)); + } + // If an explain option was executed, don't process the server results + if (this.explain) + return callback(undefined, result); + // Create statistics value + const stats = {}; + if (result.timeMillis) + stats['processtime'] = result.timeMillis; + if (result.counts) + stats['counts'] = result.counts; + if (result.timing) + stats['timing'] = result.timing; + // invoked with inline? 
+ if (result.results) { + // If we wish for no verbosity + if (options['verbose'] == null || !options['verbose']) { + return callback(undefined, result.results); + } + return callback(undefined, { results: result.results, stats: stats }); + } + // The returned collection + let collection = null; + // If we have an object it's a different db + if (result.result != null && typeof result.result === 'object') { + const doc = result.result; + // Return a collection from another db + collection = new db_1.Db(coll.s.db.s.client, doc.db, coll.s.db.s.options).collection(doc.collection); + } + else { + // Create a collection object that wraps the result collection + collection = coll.s.db.collection(result.result); + } + // If we wish for no verbosity + if (options['verbose'] == null || !options['verbose']) { + return callback(err, collection); + } + // Return stats as third set of values + callback(err, { collection, stats }); + }); + } +} +exports.MapReduceOperation = MapReduceOperation; +/** Functions that are passed as scope args must be converted to Code instances. */ +function processScope(scope) { + if (!utils_1.isObject(scope) || scope._bsontype === 'ObjectID') { + return scope; + } + const newScope = {}; + for (const key of Object.keys(scope)) { + if ('function' === typeof scope[key]) { + newScope[key] = new bson_1.Code(String(scope[key])); + } + else if (scope[key]._bsontype === 'Code') { + newScope[key] = scope[key]; + } + else { + newScope[key] = processScope(scope[key]); + } + } + return newScope; +} +operation_1.defineAspects(MapReduceOperation, [operation_1.Aspect.EXPLAINABLE]); +//# sourceMappingURL=map_reduce.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/map_reduce.js.map b/node_modules/mongodb/lib/operations/map_reduce.js.map new file mode 100644 index 000000000..249bce711 --- /dev/null +++ b/node_modules/mongodb/lib/operations/map_reduce.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"map_reduce.js","sourceRoot":"","sources":["../../src/operations/map_reduce.ts"],"names":[],"mappings":";;;AAAA,kCAAyC;AACzC,oCAOkB;AAClB,wDAAwE;AACxE,uCAAsE;AAItE,oCAAqE;AAErE,2CAAoD;AAEpD,8BAA2B;AAE3B,MAAM,aAAa,GAAG;IACpB,SAAS;IACT,gBAAgB;IAChB,aAAa;IACb,SAAS;IACT,0BAA0B;IAC1B,cAAc;IACd,KAAK;IACL,aAAa;IACb,cAAc;IACd,eAAe;IACf,gBAAgB;IAChB,YAAY;IACZ,oBAAoB;IACpB,iBAAiB;IACjB,OAAO,CAAC,uDAAuD;CAChE,CAAC;AA2CF;;;GAGG;AACH,MAAa,kBAAmB,SAAQ,0BAAuC;IAQ7E;;;;;;;OAOG;IACH,YACE,UAAsB,EACtB,GAAyB,EACzB,MAA+B,EAC/B,OAA0B;QAE1B,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAE3B,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,GAAG,GAAG,GAAG,CAAC;QACf,IAAI,CAAC,MAAM,GAAG,MAAM,CAAC;IACvB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAyC;QACvF,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAC7B,MAAM,GAAG,GAAG,IAAI,CAAC,GAAG,CAAC;QACrB,MAAM,MAAM,GAAG,IAAI,CAAC,MAAM,CAAC;QAC3B,IAAI,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAE3B,MAAM,cAAc,GAAa;YAC/B,SAAS,EAAE,IAAI,CAAC,cAAc;YAC9B,GAAG,EAAE,GAAG;YACR,MAAM,EAAE,MAAM;SACf,CAAC;QAEF,IAAI,OAAO,CAAC,KAAK,EAAE;YACjB,cAAc,CAAC,KAAK,GAAG,YAAY,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;SACpD;QAED,kCAAkC;QAClC,KAAK,MAAM,CAAC,IAAI,OAAO,EAAE;YACvB,wCAAwC;YACxC,IAAI,aAAa,CAAC,OAAO,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,EAAE;gBACnC,cAAc,CAAC,CAAC,CAAC,GAAI,OAAe,CAAC,CAAC,CAAC,CAAC;aACzC;SACF;QAED,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAErC,yEAAyE;QACzE,IACE,IAAI,CAAC,cAAc,CAAC,IAAI,KAAK,oCAAkB,CAAC,OAAO;YACvD,OAAO,CAAC,GAAG;YACV,OAAO,CAAC,GAAW,CAAC,MAAM,KAAK,CAAC;YACjC,OAAO,CAAC,GAAG,KAAK,QAAQ,EACxB;YACA,kCAAkC;YAClC,OAAO,CAAC,cAAc,GAAG,gCAAc,CAAC,OAAO,CAAC;YAChD,kDAAkD;YAClD,yBAAiB,CAAC,cAAc,EAAE,EAAE,EAAE,EAAE,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,UAAU,EAAE,IAAI,EAAE,EAAE,OAAO,CAAC,CAAC;SACjF;aAAM;YACL,+BAAuB,CAAC,cAAc,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;SACxD;QAED,wCAAwC;QACxC,IAAI,OAAO,CAAC,wBAAwB,KAAK,IAAI,EAAE;YAC7C,cAAc,CAAC,wBAAwB,GAAG,OAAO,CAAC,wBAAwB,CAAC;SAC5E;QAED,8BAA8B;QAC9B,IAAI;YACF,6BAAqB,CAAC,cAAc,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;SACtD;QAAC,OAAO,GAAG,EAAE;YACZ,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;SACtB;QAED,IAAI,IAAI,CAAC,OAAO,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YAC9C,QAAQ,CACN,IAAI,+BAAuB,CAAC,UAAU,MAAM,CAAC,IAAI,wCAAwC,CAAC,CAC3F,CAAC;YACF,OAAO;SACR;QAED,kBAAkB;QAClB,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,cAAc,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACpE,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,4BAA4B;YAC5B,IAAI,CAAC,KAAK,MAAM,CAAC,EAAE,IAAI,MAAM,CAAC,GAAG,IAAI,MAAM,CAAC,MAAM,EAAE;gBAClD,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,MAAM,CAAC,CAAC,CAAC;aAC/C;YAED,sEAAsE;YACtE,IAAI,IAAI,CAAC,OAAO;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,CAAC;YAErD,0BAA0B;YAC1B,MAAM,KAAK,GAAmB,EAAE,CAAC;YACjC,IAAI,MAAM,CAAC,UAAU;gBAAE,KAAK,CAAC,aAAa,CAAC,GAAG,MAAM,CAAC,UAAU,CAAC;YAChE,IAAI,MAAM,CAAC,MAAM;gBAAE,KAAK,CAAC,QAAQ,CAAC,GAAG,MAAM,CAAC,MAAM,CAAC;YACnD,IAAI,MAAM,CAAC,MAAM;gBAAE,KAAK,CAAC,QAAQ,CAAC,GAAG,MAAM,CAAC,MAAM,CAAC;YAEnD,uBAAuB;YACvB,IAAI,MAAM,CAAC,OAAO,EAAE;gBAClB,8BAA8B;gBAC9B,IAAI,OAAO,CAAC,SAAS,CAAC,IAAI,IAAI,IAAI,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE;oBACrD,OAAO,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,OAAO,CAAC,CAAC;iBAC5C;gBAED,OAAO,QAAQ,CAAC,SAAS,EAAE,EAAE,OAAO,EAAE,MAAM,CAAC,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CAAC;aACvE;YAED,0BAA0B;YAC1B,IAAI,UAAU,GAAG,IAAI,CAAC;YAEtB,2CAA2C;YAC3C,IAAI,MAAM,CAAC,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,CAAC,MAAM,KAAK,QAAQ,EAAE;gBAC9D,MAAM,GAAG,GAAG,MAAM,CAAC,MAAM,CAAC;gBAC1B,sCAAsC;gBACtC,UAAU,GAAG,IAAI,OAAE,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,MAAM,EAAE,GAAG,CAAC,EAAE,EAAE,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,OA
AO,CAAC,CAAC,UAAU,CAC7E,GAAG,CAAC,UAAU,CACf,CAAC;aACH;iBAAM;gBACL,8DAA8D;gBAC9D,UAAU,GAAG,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,UAAU,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC;aAClD;YAED,8BAA8B;YAC9B,IAAI,OAAO,CAAC,SAAS,CAAC,IAAI,IAAI,IAAI,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE;gBACrD,OAAO,QAAQ,CAAC,GAAG,EAAE,UAAU,CAAC,CAAC;aAClC;YAED,sCAAsC;YACtC,QAAQ,CAAC,GAAG,EAAE,EAAE,UAAU,EAAE,KAAK,EAAE,CAAC,CAAC;QACvC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA7ID,gDA6IC;AAED,mFAAmF;AACnF,SAAS,YAAY,CAAC,KAA0B;IAC9C,IAAI,CAAC,gBAAQ,CAAC,KAAK,CAAC,IAAK,KAAa,CAAC,SAAS,KAAK,UAAU,EAAE;QAC/D,OAAO,KAAK,CAAC;KACd;IAED,MAAM,QAAQ,GAAa,EAAE,CAAC;IAE9B,KAAK,MAAM,GAAG,IAAI,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,EAAE;QACpC,IAAI,UAAU,KAAK,OAAQ,KAAkB,CAAC,GAAG,CAAC,EAAE;YAClD,QAAQ,CAAC,GAAG,CAAC,GAAG,IAAI,WAAI,CAAC,MAAM,CAAE,KAAkB,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC;SAC5D;aAAM,IAAK,KAAkB,CAAC,GAAG,CAAC,CAAC,SAAS,KAAK,MAAM,EAAE;YACxD,QAAQ,CAAC,GAAG,CAAC,GAAI,KAAkB,CAAC,GAAG,CAAC,CAAC;SAC1C;aAAM;YACL,QAAQ,CAAC,GAAG,CAAC,GAAG,YAAY,CAAE,KAAkB,CAAC,GAAG,CAAC,CAAC,CAAC;SACxD;KACF;IAED,OAAO,QAAQ,CAAC;AAClB,CAAC;AAED,yBAAa,CAAC,kBAAkB,EAAE,CAAC,kBAAM,CAAC,WAAW,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/operation.js b/node_modules/mongodb/lib/operations/operation.js new file mode 100644 index 000000000..f7d7c6cac --- /dev/null +++ b/node_modules/mongodb/lib/operations/operation.js @@ -0,0 +1,67 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.defineAspects = exports.AbstractOperation = exports.Aspect = void 0; +const read_preference_1 = require("../read_preference"); +const bson_1 = require("../bson"); +exports.Aspect = { + READ_OPERATION: Symbol('READ_OPERATION'), + WRITE_OPERATION: Symbol('WRITE_OPERATION'), + RETRYABLE: Symbol('RETRYABLE'), + EXPLAINABLE: Symbol('EXPLAINABLE'), + SKIP_COLLATION: Symbol('SKIP_COLLATION'), + CURSOR_CREATING: Symbol('CURSOR_CREATING') +}; +/** @internal */ +const kSession = Symbol('session'); +/** + * This class acts as a parent class for any operation and is responsible for setting this.options, + * as well as setting and getting a session. + * Additionally, this class implements `hasAspect`, which determines whether an operation has + * a specific aspect. + * @internal + */ +class AbstractOperation { + constructor(options = {}) { + var _a; + this.readPreference = this.hasAspect(exports.Aspect.WRITE_OPERATION) + ? read_preference_1.ReadPreference.primary + : (_a = read_preference_1.ReadPreference.fromOptions(options)) !== null && _a !== void 0 ? 
_a : read_preference_1.ReadPreference.primary; + // Pull the BSON serialize options from the already-resolved options + this.bsonOptions = bson_1.resolveBSONOptions(options); + if (options.session) { + this[kSession] = options.session; + } + this.options = options; + this.bypassPinningCheck = !!options.bypassPinningCheck; + } + hasAspect(aspect) { + const ctor = this.constructor; + if (ctor.aspects == null) { + return false; + } + return ctor.aspects.has(aspect); + } + get session() { + return this[kSession]; + } + get canRetryRead() { + return true; + } + get canRetryWrite() { + return true; + } +} +exports.AbstractOperation = AbstractOperation; +function defineAspects(operation, aspects) { + if (!Array.isArray(aspects) && !(aspects instanceof Set)) { + aspects = [aspects]; + } + aspects = new Set(aspects); + Object.defineProperty(operation, 'aspects', { + value: aspects, + writable: false + }); + return aspects; +} +exports.defineAspects = defineAspects; +//# sourceMappingURL=operation.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/operation.js.map b/node_modules/mongodb/lib/operations/operation.js.map new file mode 100644 index 000000000..be2f2cef1 --- /dev/null +++ b/node_modules/mongodb/lib/operations/operation.js.map @@ -0,0 +1 @@ +{"version":3,"file":"operation.js","sourceRoot":"","sources":["../../src/operations/operation.ts"],"names":[],"mappings":";;;AAAA,wDAAwE;AAExE,kCAA6E;AAIhE,QAAA,MAAM,GAAG;IACpB,cAAc,EAAE,MAAM,CAAC,gBAAgB,CAAC;IACxC,eAAe,EAAE,MAAM,CAAC,iBAAiB,CAAC;IAC1C,SAAS,EAAE,MAAM,CAAC,WAAW,CAAC;IAC9B,WAAW,EAAE,MAAM,CAAC,aAAa,CAAC;IAClC,cAAc,EAAE,MAAM,CAAC,gBAAgB,CAAC;IACxC,eAAe,EAAE,MAAM,CAAC,iBAAiB,CAAC;CAClC,CAAC;AAsBX,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AAEnC;;;;;;GAMG;AACH,MAAsB,iBAAiB;IAerC,YAAY,UAA4B,EAAE;;QACxC,IAAI,CAAC,cAAc,GAAG,IAAI,CAAC,SAAS,CAAC,cAAM,CAAC,eAAe,CAAC;YAC1D,CAAC,CAAC,gCAAc,CAAC,OAAO;YACxB,CAAC,CAAC,MAAA,gCAAc,CAAC,WAAW,CAAC,OAAO,CAAC,mCAAI,gCAAc,CAAC,OAAO,CAAC;QAElE,oEAAoE;QACpE,IAAI,CAAC,WAAW,GAAG,yBAAkB,CAAC,OAAO,CAAC,CAAC;QAE/C,IAAI,OAAO,CAAC,OAAO,EAAE;YACnB,IAAI,CAAC,QAAQ,CAAC,GAAG,OAAO,CAAC,OAAO,CAAC;SAClC;QAED,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,kBAAkB,GAAG,CAAC,CAAC,OAAO,CAAC,kBAAkB,CAAC;IACzD,CAAC;IAID,SAAS,CAAC,MAAc;QACtB,MAAM,IAAI,GAAG,IAAI,CAAC,WAAmC,CAAC;QACtD,IAAI,IAAI,CAAC,OAAO,IAAI,IAAI,EAAE;YACxB,OAAO,KAAK,CAAC;SACd;QAED,OAAO,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,OAAO;QACT,OAAO,IAAI,CAAC,QAAQ,CAAC,CAAC;IACxB,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC;IACd,CAAC;IAED,IAAI,aAAa;QACf,OAAO,IAAI,CAAC;IACd,CAAC;CACF;AArDD,8CAqDC;AAED,SAAgB,aAAa,CAC3B,SAA+B,EAC/B,OAAwC;IAExC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC,OAAO,YAAY,GAAG,CAAC,EAAE;QACxD,OAAO,GAAG,CAAC,OAAO,CAAC,CAAC;KACrB;IAED,OAAO,GAAG,IAAI,GAAG,CAAC,OAAO,CAAC,CAAC;IAC3B,MAAM,CAAC,cAAc,CAAC,SAAS,EAAE,SAAS,EAAE;QAC1C,KAAK,EAAE,OAAO;QACd,QAAQ,EAAE,KAAK;KAChB,CAAC,CAAC;IAEH,OAAO,OAAO,CAAC;AACjB,CAAC;AAfD,sCAeC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/options_operation.js b/node_modules/mongodb/lib/operations/options_operation.js new file mode 100644 index 000000000..ab6d4c85d --- /dev/null +++ b/node_modules/mongodb/lib/operations/options_operation.js @@ -0,0 +1,29 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.OptionsOperation = void 0; +const operation_1 = require("./operation"); +const error_1 = require("../error"); +/** @internal */ +class OptionsOperation extends operation_1.AbstractOperation 
{ + constructor(collection, options) { + super(options); + this.options = options; + this.collection = collection; + } + execute(server, session, callback) { + const coll = this.collection; + coll.s.db + .listCollections({ name: coll.collectionName }, { ...this.options, nameOnly: false, readPreference: this.readPreference, session }) + .toArray((err, collections) => { + if (err || !collections) + return callback(err); + if (collections.length === 0) { + // TODO(NODE-3485) + return callback(new error_1.MongoAPIError(`collection ${coll.namespace} not found`)); + } + callback(err, collections[0].options); + }); + } +} +exports.OptionsOperation = OptionsOperation; +//# sourceMappingURL=options_operation.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/options_operation.js.map b/node_modules/mongodb/lib/operations/options_operation.js.map new file mode 100644 index 000000000..d009af4f2 --- /dev/null +++ b/node_modules/mongodb/lib/operations/options_operation.js.map @@ -0,0 +1 @@ +{"version":3,"file":"options_operation.js","sourceRoot":"","sources":["../../src/operations/options_operation.ts"],"names":[],"mappings":";;;AAAA,2CAAkE;AAClE,oCAAyC;AAOzC,gBAAgB;AAChB,MAAa,gBAAiB,SAAQ,6BAA2B;IAI/D,YAAY,UAAsB,EAAE,OAAyB;QAC3D,KAAK,CAAC,OAAO,CAAC,CAAC;QACf,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAE7B,IAAI,CAAC,CAAC,CAAC,EAAE;aACN,eAAe,CACd,EAAE,IAAI,EAAE,IAAI,CAAC,cAAc,EAAE,EAC7B,EAAE,GAAG,IAAI,CAAC,OAAO,EAAE,QAAQ,EAAE,KAAK,EAAE,cAAc,EAAE,IAAI,CAAC,cAAc,EAAE,OAAO,EAAE,CACnF;aACA,OAAO,CAAC,CAAC,GAAG,EAAE,WAAW,EAAE,EAAE;YAC5B,IAAI,GAAG,IAAI,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9C,IAAI,WAAW,CAAC,MAAM,KAAK,CAAC,EAAE;gBAC5B,kBAAkB;gBAClB,OAAO,QAAQ,CAAC,IAAI,qBAAa,CAAC,cAAc,IAAI,CAAC,SAAS,YAAY,CAAC,CAAC,CAAC;aAC9E;YAED,QAAQ,CAAC,GAAG,EAAE,WAAW,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;QACxC,CAAC,CAAC,CAAC;IACP,CAAC;CACF;AA5BD,4CA4BC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/profiling_level.js b/node_modules/mongodb/lib/operations/profiling_level.js new file mode 100644 index 000000000..8d4026845 --- /dev/null +++ b/node_modules/mongodb/lib/operations/profiling_level.js @@ -0,0 +1,33 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ProfilingLevelOperation = void 0; +const command_1 = require("./command"); +const error_1 = require("../error"); +/** @internal */ +class ProfilingLevelOperation extends command_1.CommandOperation { + constructor(db, options) { + super(db, options); + this.options = options; + } + execute(server, session, callback) { + super.executeCommand(server, session, { profile: -1 }, (err, doc) => { + if (err == null && doc.ok === 1) { + const was = doc.was; + if (was === 0) + return callback(undefined, 'off'); + if (was === 1) + return callback(undefined, 'slow_only'); + if (was === 2) + return callback(undefined, 'all'); + // TODO(NODE-3483) + return callback(new error_1.MongoRuntimeError(`Illegal profiling level value ${was}`)); + } + else { + // TODO(NODE-3483): Consider MongoUnexpectedServerResponseError + err != null ? 
callback(err) : callback(new error_1.MongoRuntimeError('Error with profile command')); + } + }); + } +} +exports.ProfilingLevelOperation = ProfilingLevelOperation; +//# sourceMappingURL=profiling_level.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/profiling_level.js.map b/node_modules/mongodb/lib/operations/profiling_level.js.map new file mode 100644 index 000000000..a95d93bc0 --- /dev/null +++ b/node_modules/mongodb/lib/operations/profiling_level.js.map @@ -0,0 +1 @@ +{"version":3,"file":"profiling_level.js","sourceRoot":"","sources":["../../src/operations/profiling_level.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AAKtE,oCAA6C;AAK7C,gBAAgB;AAChB,MAAa,uBAAwB,SAAQ,0BAAwB;IAGnE,YAAY,EAAM,EAAE,OAA8B;QAChD,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA0B;QACxE,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,EAAE,OAAO,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;YAClE,IAAI,GAAG,IAAI,IAAI,IAAI,GAAG,CAAC,EAAE,KAAK,CAAC,EAAE;gBAC/B,MAAM,GAAG,GAAG,GAAG,CAAC,GAAG,CAAC;gBACpB,IAAI,GAAG,KAAK,CAAC;oBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;gBACjD,IAAI,GAAG,KAAK,CAAC;oBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,WAAW,CAAC,CAAC;gBACvD,IAAI,GAAG,KAAK,CAAC;oBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;gBACjD,kBAAkB;gBAClB,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,iCAAiC,GAAG,EAAE,CAAC,CAAC,CAAC;aAChF;iBAAM;gBACL,+DAA+D;gBAC/D,GAAG,IAAI,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,QAAQ,CAAC,IAAI,yBAAiB,CAAC,4BAA4B,CAAC,CAAC,CAAC;aAC7F;QACH,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAvBD,0DAuBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/remove_user.js b/node_modules/mongodb/lib/operations/remove_user.js new file mode 100644 index 000000000..4b3705f2b --- /dev/null +++ b/node_modules/mongodb/lib/operations/remove_user.js @@ -0,0 +1,21 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.RemoveUserOperation = void 0; +const operation_1 = require("./operation"); +const command_1 = require("./command"); +/** @internal */ +class RemoveUserOperation extends command_1.CommandOperation { + constructor(db, username, options) { + super(db, options); + this.options = options; + this.username = username; + } + execute(server, session, callback) { + super.executeCommand(server, session, { dropUser: this.username }, err => { + callback(err, err ? 
false : true); + }); + } +} +exports.RemoveUserOperation = RemoveUserOperation; +operation_1.defineAspects(RemoveUserOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=remove_user.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/remove_user.js.map b/node_modules/mongodb/lib/operations/remove_user.js.map new file mode 100644 index 000000000..a4239abf1 --- /dev/null +++ b/node_modules/mongodb/lib/operations/remove_user.js.map @@ -0,0 +1 @@ +{"version":3,"file":"remove_user.js","sourceRoot":"","sources":["../../src/operations/remove_user.ts"],"names":[],"mappings":";;;AAAA,2CAAoD;AACpD,uCAAsE;AAStE,gBAAgB;AAChB,MAAa,mBAAoB,SAAQ,0BAAyB;IAIhE,YAAY,EAAM,EAAE,QAAgB,EAAE,OAA0B;QAC9D,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;IAC3B,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA2B;QACzE,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,EAAE,QAAQ,EAAE,IAAI,CAAC,QAAQ,EAAE,EAAE,GAAG,CAAC,EAAE;YACvE,QAAQ,CAAC,GAAG,EAAE,GAAG,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC;QACpC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAfD,kDAeC;AAED,yBAAa,CAAC,mBAAmB,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/rename.js b/node_modules/mongodb/lib/operations/rename.js new file mode 100644 index 000000000..498cc6b66 --- /dev/null +++ b/node_modules/mongodb/lib/operations/rename.js @@ -0,0 +1,46 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.RenameOperation = void 0; +const utils_1 = require("../utils"); +const run_command_1 = require("./run_command"); +const operation_1 = require("./operation"); +const collection_1 = require("../collection"); +const error_1 = require("../error"); +/** @internal */ +class RenameOperation extends run_command_1.RunAdminCommandOperation { + constructor(collection, newName, options) { + // Check the collection name + utils_1.checkCollectionName(newName); + // Build the command + const renameCollection = collection.namespace; + const toCollection = collection.s.namespace.withCollection(newName).toString(); + const dropTarget = typeof options.dropTarget === 'boolean' ? 
options.dropTarget : false; + const cmd = { renameCollection: renameCollection, to: toCollection, dropTarget: dropTarget }; + super(collection, cmd, options); + this.options = options; + this.collection = collection; + this.newName = newName; + } + execute(server, session, callback) { + const coll = this.collection; + super.execute(server, session, (err, doc) => { + if (err) + return callback(err); + // We have an error + if (doc.errmsg) { + return callback(new error_1.MongoServerError(doc)); + } + let newColl; + try { + newColl = new collection_1.Collection(coll.s.db, this.newName, coll.s.options); + } + catch (err) { + return callback(err); + } + return callback(undefined, newColl); + }); + } +} +exports.RenameOperation = RenameOperation; +operation_1.defineAspects(RenameOperation, [operation_1.Aspect.WRITE_OPERATION]); +//# sourceMappingURL=rename.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/rename.js.map b/node_modules/mongodb/lib/operations/rename.js.map new file mode 100644 index 000000000..3a47354c6 --- /dev/null +++ b/node_modules/mongodb/lib/operations/rename.js.map @@ -0,0 +1 @@ +{"version":3,"file":"rename.js","sourceRoot":"","sources":["../../src/operations/rename.ts"],"names":[],"mappings":";;;AAAA,oCAAyD;AACzD,+CAAyD;AACzD,2CAAoD;AAEpD,8CAA2C;AAE3C,oCAA4C;AAY5C,gBAAgB;AAChB,MAAa,eAAgB,SAAQ,sCAAwB;IAK3D,YAAY,UAAsB,EAAE,OAAe,EAAE,OAAsB;QACzE,4BAA4B;QAC5B,2BAAmB,CAAC,OAAO,CAAC,CAAC;QAE7B,oBAAoB;QACpB,MAAM,gBAAgB,GAAG,UAAU,CAAC,SAAS,CAAC;QAC9C,MAAM,YAAY,GAAG,UAAU,CAAC,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,OAAO,CAAC,CAAC,QAAQ,EAAE,CAAC;QAC/E,MAAM,UAAU,GAAG,OAAO,OAAO,CAAC,UAAU,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC,CAAC,KAAK,CAAC;QACxF,MAAM,GAAG,GAAG,EAAE,gBAAgB,EAAE,gBAAgB,EAAE,EAAE,EAAE,YAAY,EAAE,UAAU,EAAE,UAAU,EAAE,CAAC;QAE7F,KAAK,CAAC,UAAU,EAAE,GAAG,EAAE,OAAO,CAAC,CAAC;QAChC,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA8B;QAC5E,MAAM,IAAI,GAAG,IAAI,CAAC,UAAU,CAAC;QAE7B,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;YAC1C,IAAI,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAC9B,mBAAmB;YACnB,IAAI,GAAG,CAAC,MAAM,EAAE;gBACd,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;aAC5C;YAED,IAAI,OAA6B,CAAC;YAClC,IAAI;gBACF,OAAO,GAAG,IAAI,uBAAU,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,EAAE,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;aACnE;YAAC,OAAO,GAAG,EAAE;gBACZ,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;aACtB;YAED,OAAO,QAAQ,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QACtC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAzCD,0CAyCC;AAED,yBAAa,CAAC,eAAe,EAAE,CAAC,kBAAM,CAAC,eAAe,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/run_command.js b/node_modules/mongodb/lib/operations/run_command.js new file mode 100644 index 000000000..c3be60106 --- /dev/null +++ b/node_modules/mongodb/lib/operations/run_command.js @@ -0,0 +1,26 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.RunAdminCommandOperation = exports.RunCommandOperation = void 0; +const command_1 = require("./command"); +const utils_1 = require("../utils"); +/** @internal */ +class RunCommandOperation extends command_1.CommandOperation { + constructor(parent, command, options) { + super(parent, options); + this.options = options !== null && options !== void 0 ? 
options : {}; + this.command = command; + } + execute(server, session, callback) { + const command = this.command; + this.executeCommand(server, session, command, callback); + } +} +exports.RunCommandOperation = RunCommandOperation; +class RunAdminCommandOperation extends RunCommandOperation { + constructor(parent, command, options) { + super(parent, command, options); + this.ns = new utils_1.MongoDBNamespace('admin'); + } +} +exports.RunAdminCommandOperation = RunAdminCommandOperation; +//# sourceMappingURL=run_command.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/run_command.js.map b/node_modules/mongodb/lib/operations/run_command.js.map new file mode 100644 index 000000000..272316037 --- /dev/null +++ b/node_modules/mongodb/lib/operations/run_command.js.map @@ -0,0 +1 @@ +{"version":3,"file":"run_command.js","sourceRoot":"","sources":["../../src/operations/run_command.ts"],"names":[],"mappings":";;;AAAA,uCAAuF;AACvF,oCAAsD;AAQtD,gBAAgB;AAChB,MAAa,mBAAkC,SAAQ,0BAAmB;IAIxE,YAAY,MAAmC,EAAE,OAAiB,EAAE,OAA2B;QAC7F,KAAK,CAAC,MAAM,EAAE,OAAO,CAAC,CAAC;QACvB,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAkB;QAChE,MAAM,OAAO,GAAG,IAAI,CAAC,OAAO,CAAC;QAC7B,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC1D,CAAC;CACF;AAdD,kDAcC;AAED,MAAa,wBAAuC,SAAQ,mBAAsB;IAChF,YAAY,MAAmC,EAAE,OAAiB,EAAE,OAA2B;QAC7F,KAAK,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,CAAC,CAAC;QAChC,IAAI,CAAC,EAAE,GAAG,IAAI,wBAAgB,CAAC,OAAO,CAAC,CAAC;IAC1C,CAAC;CACF;AALD,4DAKC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/set_profiling_level.js b/node_modules/mongodb/lib/operations/set_profiling_level.js new file mode 100644 index 000000000..0d6305fce --- /dev/null +++ b/node_modules/mongodb/lib/operations/set_profiling_level.js @@ -0,0 +1,51 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.SetProfilingLevelOperation = exports.ProfilingLevel = void 0; +const command_1 = require("./command"); +const utils_1 = require("../utils"); +const error_1 = require("../error"); +const levelValues = new Set(['off', 'slow_only', 'all']); +/** @public */ +exports.ProfilingLevel = Object.freeze({ + off: 'off', + slowOnly: 'slow_only', + all: 'all' +}); +/** @internal */ +class SetProfilingLevelOperation extends command_1.CommandOperation { + constructor(db, level, options) { + super(db, options); + this.options = options; + switch (level) { + case exports.ProfilingLevel.off: + this.profile = 0; + break; + case exports.ProfilingLevel.slowOnly: + this.profile = 1; + break; + case exports.ProfilingLevel.all: + this.profile = 2; + break; + default: + this.profile = 0; + break; + } + this.level = level; + } + execute(server, session, callback) { + const level = this.level; + if (!levelValues.has(level)) { + return callback(new error_1.MongoInvalidArgumentError(`Profiling level must be one of "${utils_1.enumToString(exports.ProfilingLevel)}"`)); + } + // TODO(NODE-3483): Determine error to put here + super.executeCommand(server, session, { profile: this.profile }, (err, doc) => { + if (err == null && doc.ok === 1) + return callback(undefined, level); + return err != null + ? 
callback(err) + : callback(new error_1.MongoRuntimeError('Error with profile command')); + }); + } +} +exports.SetProfilingLevelOperation = SetProfilingLevelOperation; +//# sourceMappingURL=set_profiling_level.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/set_profiling_level.js.map b/node_modules/mongodb/lib/operations/set_profiling_level.js.map new file mode 100644 index 000000000..699b3e458 --- /dev/null +++ b/node_modules/mongodb/lib/operations/set_profiling_level.js.map @@ -0,0 +1 @@ +{"version":3,"file":"set_profiling_level.js","sourceRoot":"","sources":["../../src/operations/set_profiling_level.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AAEtE,oCAAwC;AAIxC,oCAAwE;AACxE,MAAM,WAAW,GAAG,IAAI,GAAG,CAAC,CAAC,KAAK,EAAE,WAAW,EAAE,KAAK,CAAC,CAAC,CAAC;AAEzD,cAAc;AACD,QAAA,cAAc,GAAG,MAAM,CAAC,MAAM,CAAC;IAC1C,GAAG,EAAE,KAAK;IACV,QAAQ,EAAE,WAAW;IACrB,GAAG,EAAE,KAAK;CACF,CAAC,CAAC;AAQZ,gBAAgB;AAChB,MAAa,0BAA2B,SAAQ,0BAAgC;IAK9E,YAAY,EAAM,EAAE,KAAqB,EAAE,OAAiC;QAC1E,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,QAAQ,KAAK,EAAE;YACb,KAAK,sBAAc,CAAC,GAAG;gBACrB,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC;gBACjB,MAAM;YACR,KAAK,sBAAc,CAAC,QAAQ;gBAC1B,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC;gBACjB,MAAM;YACR,KAAK,sBAAc,CAAC,GAAG;gBACrB,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC;gBACjB,MAAM;YACR;gBACE,IAAI,CAAC,OAAO,GAAG,CAAC,CAAC;gBACjB,MAAM;SACT;QAED,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;IACrB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAAkC;QAChF,MAAM,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC;QAEzB,IAAI,CAAC,WAAW,CAAC,GAAG,CAAC,KAAK,CAAC,EAAE;YAC3B,OAAO,QAAQ,CACb,IAAI,iCAAyB,CAC3B,mCAAmC,oBAAY,CAAC,sBAAc,CAAC,GAAG,CACnE,CACF,CAAC;SACH;QAED,+CAA+C;QAC/C,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,EAAE,OAAO,EAAE,IAAI,CAAC,OAAO,EAAE,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;YAC5E,IAAI,GAAG,IAAI,IAAI,IAAI,GAAG,CAAC,EAAE,KAAK,CAAC;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,KAAK,CAAC,CAAC;YACnE,OAAO,GAAG,IAAI,IAAI;gBAChB,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC;gBACf,CAAC,CAAC,QAAQ,CAAC,IAAI,yBAAiB,CAAC,4BAA4B,CAAC,CAAC,CAAC;QACpE,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AA7CD,gEA6CC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/stats.js b/node_modules/mongodb/lib/operations/stats.js new file mode 100644 index 000000000..5cf92bc3d --- /dev/null +++ b/node_modules/mongodb/lib/operations/stats.js @@ -0,0 +1,48 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.DbStatsOperation = exports.CollStatsOperation = void 0; +const operation_1 = require("./operation"); +const command_1 = require("./command"); +/** + * Get all the collection statistics. + * @internal + */ +class CollStatsOperation extends command_1.CommandOperation { + /** + * Construct a Stats operation. + * + * @param collection - Collection instance + * @param options - Optional settings. See Collection.prototype.stats for a list of options. + */ + constructor(collection, options) { + super(collection, options); + this.options = options !== null && options !== void 0 ? 
options : {}; + this.collectionName = collection.collectionName; + } + execute(server, session, callback) { + const command = { collStats: this.collectionName }; + if (this.options.scale != null) { + command.scale = this.options.scale; + } + super.executeCommand(server, session, command, callback); + } +} +exports.CollStatsOperation = CollStatsOperation; +/** @internal */ +class DbStatsOperation extends command_1.CommandOperation { + constructor(db, options) { + super(db, options); + this.options = options; + } + execute(server, session, callback) { + const command = { dbStats: true }; + if (this.options.scale != null) { + command.scale = this.options.scale; + } + super.executeCommand(server, session, command, callback); + } +} +exports.DbStatsOperation = DbStatsOperation; +operation_1.defineAspects(CollStatsOperation, [operation_1.Aspect.READ_OPERATION]); +operation_1.defineAspects(DbStatsOperation, [operation_1.Aspect.READ_OPERATION]); +//# sourceMappingURL=stats.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/stats.js.map b/node_modules/mongodb/lib/operations/stats.js.map new file mode 100644 index 000000000..bfc1a8b23 --- /dev/null +++ b/node_modules/mongodb/lib/operations/stats.js.map @@ -0,0 +1 @@ +{"version":3,"file":"stats.js","sourceRoot":"","sources":["../../src/operations/stats.ts"],"names":[],"mappings":";;;AAAA,2CAAoD;AACpD,uCAAsE;AActE;;;GAGG;AACH,MAAa,kBAAmB,SAAQ,0BAA0B;IAIhE;;;;;OAKG;IACH,YAAY,UAAsB,EAAE,OAA0B;QAC5D,KAAK,CAAC,UAAU,EAAE,OAAO,CAAC,CAAC;QAC3B,IAAI,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAC7B,IAAI,CAAC,cAAc,GAAG,UAAU,CAAC,cAAc,CAAC;IAClD,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA6B;QAC3E,MAAM,OAAO,GAAa,EAAE,SAAS,EAAE,IAAI,CAAC,cAAc,EAAE,CAAC;QAC7D,IAAI,IAAI,CAAC,OAAO,CAAC,KAAK,IAAI,IAAI,EAAE;YAC9B,OAAO,CAAC,KAAK,GAAG,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC;SACpC;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AAxBD,gDAwBC;AAQD,gBAAgB;AAChB,MAAa,gBAAiB,SAAQ,0BAA0B;IAG9D,YAAY,EAAM,EAAE,OAAuB;QACzC,KAAK,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,OAAO,GAAa,EAAE,OAAO,EAAE,IAAI,EAAE,CAAC;QAC5C,IAAI,IAAI,CAAC,OAAO,CAAC,KAAK,IAAI,IAAI,EAAE;YAC9B,OAAO,CAAC,KAAK,GAAG,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC;SACpC;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AAhBD,4CAgBC;AAiMD,yBAAa,CAAC,kBAAkB,EAAE,CAAC,kBAAM,CAAC,cAAc,CAAC,CAAC,CAAC;AAC3D,yBAAa,CAAC,gBAAgB,EAAE,CAAC,kBAAM,CAAC,cAAc,CAAC,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/update.js b/node_modules/mongodb/lib/operations/update.js new file mode 100644 index 000000000..9c3562962 --- /dev/null +++ b/node_modules/mongodb/lib/operations/update.js @@ -0,0 +1,195 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.makeUpdateStatement = exports.ReplaceOneOperation = exports.UpdateManyOperation = exports.UpdateOneOperation = exports.UpdateOperation = void 0; +const operation_1 = require("./operation"); +const utils_1 = require("../utils"); +const command_1 = require("./command"); +const error_1 = require("../error"); +/** @internal */ +class UpdateOperation extends command_1.CommandOperation { + constructor(ns, statements, options) { + super(undefined, options); + this.options = options; + this.ns = ns; + this.statements = statements; + } + get canRetryWrite() { + if (super.canRetryWrite === false) { + return false; + } + return 
this.statements.every(op => op.multi == null || op.multi === false); + } + execute(server, session, callback) { + var _a; + const options = (_a = this.options) !== null && _a !== void 0 ? _a : {}; + const ordered = typeof options.ordered === 'boolean' ? options.ordered : true; + const command = { + update: this.ns.collection, + updates: this.statements, + ordered + }; + if (typeof options.bypassDocumentValidation === 'boolean') { + command.bypassDocumentValidation = options.bypassDocumentValidation; + } + if (options.let) { + command.let = options.let; + } + const statementWithCollation = this.statements.find(statement => !!statement.collation); + if (utils_1.collationNotSupported(server, options) || + (statementWithCollation && utils_1.collationNotSupported(server, statementWithCollation))) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support collation`)); + return; + } + const unacknowledgedWrite = this.writeConcern && this.writeConcern.w === 0; + if (unacknowledgedWrite || utils_1.maxWireVersion(server) < 5) { + if (this.statements.find((o) => o.hint)) { + callback(new error_1.MongoCompatibilityError(`Servers < 3.4 do not support hint on update`)); + return; + } + } + if (this.explain && utils_1.maxWireVersion(server) < 3) { + callback(new error_1.MongoCompatibilityError(`Server ${server.name} does not support explain on update`)); + return; + } + if (this.statements.some(statement => !!statement.arrayFilters) && utils_1.maxWireVersion(server) < 6) { + callback(new error_1.MongoCompatibilityError('Option "arrayFilters" is only supported on MongoDB 3.6+')); + return; + } + super.executeCommand(server, session, command, callback); + } +} +exports.UpdateOperation = UpdateOperation; +/** @internal */ +class UpdateOneOperation extends UpdateOperation { + constructor(collection, filter, update, options) { + super(collection.s.namespace, [makeUpdateStatement(filter, update, { ...options, multi: false })], options); + if (!utils_1.hasAtomicOperators(update)) { + throw new error_1.MongoInvalidArgumentError('Update document requires atomic operators'); + } + } + execute(server, session, callback) { + super.execute(server, session, (err, res) => { + var _a, _b; + if (err || !res) + return callback(err); + if (this.explain != null) + return callback(undefined, res); + if (res.code) + return callback(new error_1.MongoServerError(res)); + if (res.writeErrors) + return callback(new error_1.MongoServerError(res.writeErrors[0])); + callback(undefined, { + acknowledged: (_b = ((_a = this.writeConcern) === null || _a === void 0 ? void 0 : _a.w) !== 0) !== null && _b !== void 0 ? _b : true, + modifiedCount: res.nModified != null ? res.nModified : res.n, + upsertedId: Array.isArray(res.upserted) && res.upserted.length > 0 ? res.upserted[0]._id : null, + upsertedCount: Array.isArray(res.upserted) && res.upserted.length ? res.upserted.length : 0, + matchedCount: Array.isArray(res.upserted) && res.upserted.length > 0 ? 
0 : res.n + }); + }); + } +} +exports.UpdateOneOperation = UpdateOneOperation; +/** @internal */ +class UpdateManyOperation extends UpdateOperation { + constructor(collection, filter, update, options) { + super(collection.s.namespace, [makeUpdateStatement(filter, update, { ...options, multi: true })], options); + if (!utils_1.hasAtomicOperators(update)) { + throw new error_1.MongoInvalidArgumentError('Update document requires atomic operators'); + } + } + execute(server, session, callback) { + super.execute(server, session, (err, res) => { + var _a, _b; + if (err || !res) + return callback(err); + if (this.explain != null) + return callback(undefined, res); + if (res.code) + return callback(new error_1.MongoServerError(res)); + if (res.writeErrors) + return callback(new error_1.MongoServerError(res.writeErrors[0])); + callback(undefined, { + acknowledged: (_b = ((_a = this.writeConcern) === null || _a === void 0 ? void 0 : _a.w) !== 0) !== null && _b !== void 0 ? _b : true, + modifiedCount: res.nModified != null ? res.nModified : res.n, + upsertedId: Array.isArray(res.upserted) && res.upserted.length > 0 ? res.upserted[0]._id : null, + upsertedCount: Array.isArray(res.upserted) && res.upserted.length ? res.upserted.length : 0, + matchedCount: Array.isArray(res.upserted) && res.upserted.length > 0 ? 0 : res.n + }); + }); + } +} +exports.UpdateManyOperation = UpdateManyOperation; +/** @internal */ +class ReplaceOneOperation extends UpdateOperation { + constructor(collection, filter, replacement, options) { + super(collection.s.namespace, [makeUpdateStatement(filter, replacement, { ...options, multi: false })], options); + if (utils_1.hasAtomicOperators(replacement)) { + throw new error_1.MongoInvalidArgumentError('Replacement document must not contain atomic operators'); + } + } + execute(server, session, callback) { + super.execute(server, session, (err, res) => { + var _a, _b; + if (err || !res) + return callback(err); + if (this.explain != null) + return callback(undefined, res); + if (res.code) + return callback(new error_1.MongoServerError(res)); + if (res.writeErrors) + return callback(new error_1.MongoServerError(res.writeErrors[0])); + callback(undefined, { + acknowledged: (_b = ((_a = this.writeConcern) === null || _a === void 0 ? void 0 : _a.w) !== 0) !== null && _b !== void 0 ? _b : true, + modifiedCount: res.nModified != null ? res.nModified : res.n, + upsertedId: Array.isArray(res.upserted) && res.upserted.length > 0 ? res.upserted[0]._id : null, + upsertedCount: Array.isArray(res.upserted) && res.upserted.length ? res.upserted.length : 0, + matchedCount: Array.isArray(res.upserted) && res.upserted.length > 0 ? 
0 : res.n + }); + }); + } +} +exports.ReplaceOneOperation = ReplaceOneOperation; +function makeUpdateStatement(filter, update, options) { + if (filter == null || typeof filter !== 'object') { + throw new error_1.MongoInvalidArgumentError('Selector must be a valid JavaScript object'); + } + if (update == null || typeof update !== 'object') { + throw new error_1.MongoInvalidArgumentError('Document must be a valid JavaScript object'); + } + const op = { q: filter, u: update }; + if (typeof options.upsert === 'boolean') { + op.upsert = options.upsert; + } + if (options.multi) { + op.multi = options.multi; + } + if (options.hint) { + op.hint = options.hint; + } + if (options.arrayFilters) { + op.arrayFilters = options.arrayFilters; + } + if (options.collation) { + op.collation = options.collation; + } + return op; +} +exports.makeUpdateStatement = makeUpdateStatement; +operation_1.defineAspects(UpdateOperation, [operation_1.Aspect.RETRYABLE, operation_1.Aspect.WRITE_OPERATION, operation_1.Aspect.SKIP_COLLATION]); +operation_1.defineAspects(UpdateOneOperation, [ + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.WRITE_OPERATION, + operation_1.Aspect.EXPLAINABLE, + operation_1.Aspect.SKIP_COLLATION +]); +operation_1.defineAspects(UpdateManyOperation, [ + operation_1.Aspect.WRITE_OPERATION, + operation_1.Aspect.EXPLAINABLE, + operation_1.Aspect.SKIP_COLLATION +]); +operation_1.defineAspects(ReplaceOneOperation, [ + operation_1.Aspect.RETRYABLE, + operation_1.Aspect.WRITE_OPERATION, + operation_1.Aspect.SKIP_COLLATION +]); +//# sourceMappingURL=update.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/update.js.map b/node_modules/mongodb/lib/operations/update.js.map new file mode 100644 index 000000000..306eaff35 --- /dev/null +++ b/node_modules/mongodb/lib/operations/update.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"update.js","sourceRoot":"","sources":["../../src/operations/update.ts"],"names":[],"mappings":";;;AAAA,2CAA0D;AAC1D,oCAMkB;AAClB,uCAAwF;AAKxF,oCAAgG;AAkDhG,gBAAgB;AAChB,MAAa,eAAgB,SAAQ,0BAA0B;IAI7D,YACE,EAAoB,EACpB,UAA6B,EAC7B,OAA8C;QAE9C,KAAK,CAAC,SAAS,EAAE,OAAO,CAAC,CAAC;QAC1B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QAEb,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,IAAI,aAAa;QACf,IAAI,KAAK,CAAC,aAAa,KAAK,KAAK,EAAE;YACjC,OAAO,KAAK,CAAC;SACd;QAED,OAAO,IAAI,CAAC,UAAU,CAAC,KAAK,CAAC,EAAE,CAAC,EAAE,CAAC,EAAE,CAAC,KAAK,IAAI,IAAI,IAAI,EAAE,CAAC,KAAK,KAAK,KAAK,CAAC,CAAC;IAC7E,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;;QAC1E,MAAM,OAAO,GAAG,MAAA,IAAI,CAAC,OAAO,mCAAI,EAAE,CAAC;QACnC,MAAM,OAAO,GAAG,OAAO,OAAO,CAAC,OAAO,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC;QAC9E,MAAM,OAAO,GAAa;YACxB,MAAM,EAAE,IAAI,CAAC,EAAE,CAAC,UAAU;YAC1B,OAAO,EAAE,IAAI,CAAC,UAAU;YACxB,OAAO;SACR,CAAC;QAEF,IAAI,OAAO,OAAO,CAAC,wBAAwB,KAAK,SAAS,EAAE;YACzD,OAAO,CAAC,wBAAwB,GAAG,OAAO,CAAC,wBAAwB,CAAC;SACrE;QAED,IAAI,OAAO,CAAC,GAAG,EAAE;YACf,OAAO,CAAC,GAAG,GAAG,OAAO,CAAC,GAAG,CAAC;SAC3B;QAED,MAAM,sBAAsB,GAAG,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,EAAE,CAAC,CAAC,CAAC,SAAS,CAAC,SAAS,CAAC,CAAC;QACxF,IACE,6BAAqB,CAAC,MAAM,EAAE,OAAO,CAAC;YACtC,CAAC,sBAAsB,IAAI,6BAAqB,CAAC,MAAM,EAAE,sBAAsB,CAAC,CAAC,EACjF;YACA,QAAQ,CAAC,IAAI,+BAAuB,CAAC,UAAU,MAAM,CAAC,IAAI,6BAA6B,CAAC,CAAC,CAAC;YAC1F,OAAO;SACR;QAED,MAAM,mBAAmB,GAAG,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC,YAAY,CAAC,CAAC,KAAK,CAAC,CAAC;QAC3E,IAAI,mBAAmB,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YACrD,IAAI,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC,CAAW,EAAE,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE;gBACjD,QAAQ,CAAC,IAAI,+BAAuB,CAAC,6CAA6C,CAAC,CAAC,CAAC;gBACrF,OAAO;aACR;SACF;QAED,IAAI,IAAI,CAAC,OAAO,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YAC9C,QAAQ,CACN,IAAI,+BAAuB,CAAC,UAAU,MAAM,CAAC,IAAI,qCAAqC,CAAC,CACxF,CAAC;YACF,OAAO;SACR;QAED,IAAI,IAAI,CAAC,UAAU,CAAC,IAAI,CAAC,SAAS,CAAC,EAAE,CAAC,CAAC,CAAC,SAAS,CAAC,YAAY,CAAC,IAAI,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE;YAC7F,QAAQ,CACN,IAAI,+BAAuB,CAAC,yDAAyD,CAAC,CACvF,CAAC;YACF,OAAO;SACR;QAED,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;IAC3D,CAAC;CACF;AA1ED,0CA0EC;AAED,gBAAgB;AAChB,MAAa,kBAAmB,SAAQ,eAAe;IACrD,YAAY,UAAsB,EAAE,MAAgB,EAAE,MAAgB,EAAE,OAAsB;QAC5F,KAAK,CACH,UAAU,CAAC,CAAC,CAAC,SAAS,EACtB,CAAC,mBAAmB,CAAC,MAAM,EAAE,MAAM,EAAE,EAAE,GAAG,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CAAC,EACnE,OAAO,CACR,CAAC;QAEF,IAAI,CAAC,0BAAkB,CAAC,MAAM,CAAC,EAAE;YAC/B,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;SAClF;IACH,CAAC;IAED,OAAO,CACL,MAAc,EACd,OAAsB,EACtB,QAA2C;QAE3C,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YAC1C,IAAI,GAAG,IAAI,CAAC,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YACtC,IAAI,IAAI,CAAC,OAAO,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC;YAC1D,IAAI,GAAG,CAAC,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;YACzD,IAAI,GAAG,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAE/E,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAChD,aAAa,EAAE,GAAG,CAAC,SAAS,IAAI,IAAI,CAAC,CAAC,CAAC,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;gBAC5D,UAAU,EACR,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI;gBACrF,aAAa,EAAE,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;gBAC3F,Y
AAY,EAAE,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;aACjF,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAlCD,gDAkCC;AAED,gBAAgB;AAChB,MAAa,mBAAoB,SAAQ,eAAe;IACtD,YAAY,UAAsB,EAAE,MAAgB,EAAE,MAAgB,EAAE,OAAsB;QAC5F,KAAK,CACH,UAAU,CAAC,CAAC,CAAC,SAAS,EACtB,CAAC,mBAAmB,CAAC,MAAM,EAAE,MAAM,EAAE,EAAE,GAAG,OAAO,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC,EAClE,OAAO,CACR,CAAC;QAEF,IAAI,CAAC,0BAAkB,CAAC,MAAM,CAAC,EAAE;YAC/B,MAAM,IAAI,iCAAyB,CAAC,2CAA2C,CAAC,CAAC;SAClF;IACH,CAAC;IAED,OAAO,CACL,MAAc,EACd,OAAsB,EACtB,QAA2C;QAE3C,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YAC1C,IAAI,GAAG,IAAI,CAAC,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YACtC,IAAI,IAAI,CAAC,OAAO,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC;YAC1D,IAAI,GAAG,CAAC,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;YACzD,IAAI,GAAG,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAE/E,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAChD,aAAa,EAAE,GAAG,CAAC,SAAS,IAAI,IAAI,CAAC,CAAC,CAAC,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;gBAC5D,UAAU,EACR,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI;gBACrF,aAAa,EAAE,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;gBAC3F,YAAY,EAAE,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;aACjF,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAlCD,kDAkCC;AAcD,gBAAgB;AAChB,MAAa,mBAAoB,SAAQ,eAAe;IACtD,YACE,UAAsB,EACtB,MAAgB,EAChB,WAAqB,EACrB,OAAuB;QAEvB,KAAK,CACH,UAAU,CAAC,CAAC,CAAC,SAAS,EACtB,CAAC,mBAAmB,CAAC,MAAM,EAAE,WAAW,EAAE,EAAE,GAAG,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CAAC,EACxE,OAAO,CACR,CAAC;QAEF,IAAI,0BAAkB,CAAC,WAAW,CAAC,EAAE;YACnC,MAAM,IAAI,iCAAyB,CAAC,wDAAwD,CAAC,CAAC;SAC/F;IACH,CAAC;IAED,OAAO,CACL,MAAc,EACd,OAAsB,EACtB,QAA2C;QAE3C,KAAK,CAAC,OAAO,CAAC,MAAM,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;;YAC1C,IAAI,GAAG,IAAI,CAAC,GAAG;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YACtC,IAAI,IAAI,CAAC,OAAO,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC;YAC1D,IAAI,GAAG,CAAC,IAAI;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,CAAC,CAAC;YACzD,IAAI,GAAG,CAAC,WAAW;gBAAE,OAAO,QAAQ,CAAC,IAAI,wBAAgB,CAAC,GAAG,CAAC,WAAW,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YAE/E,QAAQ,CAAC,SAAS,EAAE;gBAClB,YAAY,EAAE,MAAA,CAAA,MAAA,IAAI,CAAC,YAAY,0CAAE,CAAC,MAAK,CAAC,mCAAI,IAAI;gBAChD,aAAa,EAAE,GAAG,CAAC,SAAS,IAAI,IAAI,CAAC,CAAC,CAAC,GAAG,CAAC,SAAS,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;gBAC5D,UAAU,EACR,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,IAAI;gBACrF,aAAa,EAAE,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC;gBAC3F,YAAY,EAAE,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,QAAQ,CAAC,IAAI,GAAG,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;aACjF,CAAC,CAAC;QACL,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAvCD,kDAuCC;AAED,SAAgB,mBAAmB,CACjC,MAAgB,EAChB,MAAgB,EAChB,OAA4C;IAE5C,IAAI,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,KAAK,QAAQ,EAAE;QAChD,MAAM,IAAI,iCAAyB,CAAC,4CAA4C,CAAC,CAAC;KACnF;IAED,IAAI,MAAM,IAAI,IAAI,IAAI,OAAO,MAAM,KAAK,QAAQ,EAAE;QAChD,MAAM,IAAI,iCAAyB,CAAC,4CAA4C,CAAC,CAAC
;KACnF;IAED,MAAM,EAAE,GAAoB,EAAE,CAAC,EAAE,MAAM,EAAE,CAAC,EAAE,MAAM,EAAE,CAAC;IACrD,IAAI,OAAO,OAAO,CAAC,MAAM,KAAK,SAAS,EAAE;QACvC,EAAE,CAAC,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC;KAC5B;IAED,IAAI,OAAO,CAAC,KAAK,EAAE;QACjB,EAAE,CAAC,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC;KAC1B;IAED,IAAI,OAAO,CAAC,IAAI,EAAE;QAChB,EAAE,CAAC,IAAI,GAAG,OAAO,CAAC,IAAI,CAAC;KACxB;IAED,IAAI,OAAO,CAAC,YAAY,EAAE;QACxB,EAAE,CAAC,YAAY,GAAG,OAAO,CAAC,YAAY,CAAC;KACxC;IAED,IAAI,OAAO,CAAC,SAAS,EAAE;QACrB,EAAE,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;KAClC;IAED,OAAO,EAAE,CAAC;AACZ,CAAC;AAnCD,kDAmCC;AAED,yBAAa,CAAC,eAAe,EAAE,CAAC,kBAAM,CAAC,SAAS,EAAE,kBAAM,CAAC,eAAe,EAAE,kBAAM,CAAC,cAAc,CAAC,CAAC,CAAC;AAClG,yBAAa,CAAC,kBAAkB,EAAE;IAChC,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,eAAe;IACtB,kBAAM,CAAC,WAAW;IAClB,kBAAM,CAAC,cAAc;CACtB,CAAC,CAAC;AACH,yBAAa,CAAC,mBAAmB,EAAE;IACjC,kBAAM,CAAC,eAAe;IACtB,kBAAM,CAAC,WAAW;IAClB,kBAAM,CAAC,cAAc;CACtB,CAAC,CAAC;AACH,yBAAa,CAAC,mBAAmB,EAAE;IACjC,kBAAM,CAAC,SAAS;IAChB,kBAAM,CAAC,eAAe;IACtB,kBAAM,CAAC,cAAc;CACtB,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/validate_collection.js b/node_modules/mongodb/lib/operations/validate_collection.js new file mode 100644 index 000000000..c79752486 --- /dev/null +++ b/node_modules/mongodb/lib/operations/validate_collection.js @@ -0,0 +1,41 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ValidateCollectionOperation = void 0; +const command_1 = require("./command"); +const error_1 = require("../error"); +/** @internal */ +class ValidateCollectionOperation extends command_1.CommandOperation { + constructor(admin, collectionName, options) { + // Decorate command with extra options + const command = { validate: collectionName }; + const keys = Object.keys(options); + for (let i = 0; i < keys.length; i++) { + if (Object.prototype.hasOwnProperty.call(options, keys[i]) && keys[i] !== 'session') { + command[keys[i]] = options[keys[i]]; + } + } + super(admin.s.db, options); + this.options = options; + this.command = command; + this.collectionName = collectionName; + } + execute(server, session, callback) { + const collectionName = this.collectionName; + super.executeCommand(server, session, this.command, (err, doc) => { + if (err != null) + return callback(err); + // TODO(NODE-3483): Replace these with MongoUnexpectedServerResponseError + if (doc.ok === 0) + return callback(new error_1.MongoRuntimeError('Error with validate command')); + if (doc.result != null && typeof doc.result !== 'string') + return callback(new error_1.MongoRuntimeError('Error with validation data')); + if (doc.result != null && doc.result.match(/exception|corrupt/) != null) + return callback(new error_1.MongoRuntimeError(`Invalid collection ${collectionName}`)); + if (doc.valid != null && !doc.valid) + return callback(new error_1.MongoRuntimeError(`Invalid collection ${collectionName}`)); + return callback(undefined, doc); + }); + } +} +exports.ValidateCollectionOperation = ValidateCollectionOperation; +//# sourceMappingURL=validate_collection.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/operations/validate_collection.js.map b/node_modules/mongodb/lib/operations/validate_collection.js.map new file mode 100644 index 000000000..517c29ba1 --- /dev/null +++ b/node_modules/mongodb/lib/operations/validate_collection.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"validate_collection.js","sourceRoot":"","sources":["../../src/operations/validate_collection.ts"],"names":[],"mappings":";;;AAAA,uCAAsE;AAMtE,oCAA6C;AAQ7C,gBAAgB;AAChB,MAAa,2BAA4B,SAAQ,0BAA0B;IAKzE,YAAY,KAAY,EAAE,cAAsB,EAAE,OAAkC;QAClF,sCAAsC;QACtC,MAAM,OAAO,GAAa,EAAE,QAAQ,EAAE,cAAc,EAAE,CAAC;QACvD,MAAM,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QAClC,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,MAAM,EAAE,CAAC,EAAE,EAAE;YACpC,IAAI,MAAM,CAAC,SAAS,CAAC,cAAc,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC,CAAC,CAAC,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,SAAS,EAAE;gBACnF,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,GAAI,OAAoB,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC;aACnD;SACF;QAED,KAAK,CAAC,KAAK,CAAC,CAAC,CAAC,EAAE,EAAE,OAAO,CAAC,CAAC;QAC3B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,cAAc,GAAG,cAAc,CAAC;IACvC,CAAC;IAED,OAAO,CAAC,MAAc,EAAE,OAAsB,EAAE,QAA4B;QAC1E,MAAM,cAAc,GAAG,IAAI,CAAC,cAAc,CAAC;QAE3C,KAAK,CAAC,cAAc,CAAC,MAAM,EAAE,OAAO,EAAE,IAAI,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;YAC/D,IAAI,GAAG,IAAI,IAAI;gBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;YAEtC,yEAAyE;YACzE,IAAI,GAAG,CAAC,EAAE,KAAK,CAAC;gBAAE,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,6BAA6B,CAAC,CAAC,CAAC;YACxF,IAAI,GAAG,CAAC,MAAM,IAAI,IAAI,IAAI,OAAO,GAAG,CAAC,MAAM,KAAK,QAAQ;gBACtD,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,4BAA4B,CAAC,CAAC,CAAC;YACvE,IAAI,GAAG,CAAC,MAAM,IAAI,IAAI,IAAI,GAAG,CAAC,MAAM,CAAC,KAAK,CAAC,mBAAmB,CAAC,IAAI,IAAI;gBACrE,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,sBAAsB,cAAc,EAAE,CAAC,CAAC,CAAC;YACjF,IAAI,GAAG,CAAC,KAAK,IAAI,IAAI,IAAI,CAAC,GAAG,CAAC,KAAK;gBACjC,OAAO,QAAQ,CAAC,IAAI,yBAAiB,CAAC,sBAAsB,cAAc,EAAE,CAAC,CAAC,CAAC;YAEjF,OAAO,QAAQ,CAAC,SAAS,EAAE,GAAG,CAAC,CAAC;QAClC,CAAC,CAAC,CAAC;IACL,CAAC;CACF;AAvCD,kEAuCC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/promise_provider.js b/node_modules/mongodb/lib/promise_provider.js new file mode 100644 index 000000000..e76480a29 --- /dev/null +++ b/node_modules/mongodb/lib/promise_provider.js @@ -0,0 +1,36 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.PromiseProvider = void 0; +const error_1 = require("./error"); +/** @internal */ +const kPromise = Symbol('promise'); +const store = { + [kPromise]: undefined +}; +/** + * Global promise store allowing user-provided promises + * @public + */ +class PromiseProvider { + /** Validates the passed in promise library */ + static validate(lib) { + if (typeof lib !== 'function') + throw new error_1.MongoInvalidArgumentError(`Promise must be a function, got ${lib}`); + return !!lib; + } + /** Sets the promise library */ + static set(lib) { + if (!PromiseProvider.validate(lib)) { + // validate + return; + } + store[kPromise] = lib; + } + /** Get the stored promise library, or resolves passed in */ + static get() { + return store[kPromise]; + } +} +exports.PromiseProvider = PromiseProvider; +PromiseProvider.set(global.Promise); +//# sourceMappingURL=promise_provider.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/promise_provider.js.map b/node_modules/mongodb/lib/promise_provider.js.map new file mode 100644 index 000000000..703bee2c9 --- /dev/null +++ b/node_modules/mongodb/lib/promise_provider.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"promise_provider.js","sourceRoot":"","sources":["../src/promise_provider.ts"],"names":[],"mappings":";;;AAAA,mCAAoD;AAEpD,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AAMnC,MAAM,KAAK,GAAiB;IAC1B,CAAC,QAAQ,CAAC,EAAE,SAAS;CACtB,CAAC;AAEF;;;GAGG;AACH,MAAa,eAAe;IAC1B,8CAA8C;IAC9C,MAAM,CAAC,QAAQ,CAAC,GAAY;QAC1B,IAAI,OAAO,GAAG,KAAK,UAAU;YAC3B,MAAM,IAAI,iCAAyB,CAAC,mCAAmC,GAAG,EAAE,CAAC,CAAC;QAChF,OAAO,CAAC,CAAC,GAAG,CAAC;IACf,CAAC;IAED,+BAA+B;IAC/B,MAAM,CAAC,GAAG,CAAC,GAAuB;QAChC,IAAI,CAAC,eAAe,CAAC,QAAQ,CAAC,GAAG,CAAC,EAAE;YAClC,WAAW;YACX,OAAO;SACR;QACD,KAAK,CAAC,QAAQ,CAAC,GAAG,GAAG,CAAC;IACxB,CAAC;IAED,4DAA4D;IAC5D,MAAM,CAAC,GAAG;QACR,OAAO,KAAK,CAAC,QAAQ,CAAuB,CAAC;IAC/C,CAAC;CACF;AArBD,0CAqBC;AAED,eAAe,CAAC,GAAG,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/read_concern.js b/node_modules/mongodb/lib/read_concern.js new file mode 100644 index 000000000..b4ea20179 --- /dev/null +++ b/node_modules/mongodb/lib/read_concern.js @@ -0,0 +1,73 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ReadConcern = exports.ReadConcernLevel = void 0; +/** @public */ +exports.ReadConcernLevel = Object.freeze({ + local: 'local', + majority: 'majority', + linearizable: 'linearizable', + available: 'available', + snapshot: 'snapshot' +}); +/** + * The MongoDB ReadConcern, which allows for control of the consistency and isolation properties + * of the data read from replica sets and replica set shards. + * @public + * + * @see https://docs.mongodb.com/manual/reference/read-concern/index.html + */ +class ReadConcern { + /** Constructs a ReadConcern from the read concern level.*/ + constructor(level) { + var _a; + /** + * A spec test exists that allows level to be any string. + * "invalid readConcern with out stage" + * @see ./test/spec/crud/v2/aggregate-out-readConcern.json + * @see https://github.com/mongodb/specifications/blob/master/source/read-write-concern/read-write-concern.rst#unknown-levels-and-additional-options-for-string-based-readconcerns + */ + this.level = (_a = exports.ReadConcernLevel[level]) !== null && _a !== void 0 ? _a : level; + } + /** + * Construct a ReadConcern given an options object. + * + * @param options - The options object from which to extract the write concern. 
+ */ + static fromOptions(options) { + if (options == null) { + return; + } + if (options.readConcern) { + const { readConcern } = options; + if (readConcern instanceof ReadConcern) { + return readConcern; + } + else if (typeof readConcern === 'string') { + return new ReadConcern(readConcern); + } + else if ('level' in readConcern && readConcern.level) { + return new ReadConcern(readConcern.level); + } + } + if (options.level) { + return new ReadConcern(options.level); + } + } + static get MAJORITY() { + return exports.ReadConcernLevel.majority; + } + static get AVAILABLE() { + return exports.ReadConcernLevel.available; + } + static get LINEARIZABLE() { + return exports.ReadConcernLevel.linearizable; + } + static get SNAPSHOT() { + return exports.ReadConcernLevel.snapshot; + } + toJSON() { + return { level: this.level }; + } +} +exports.ReadConcern = ReadConcern; +//# sourceMappingURL=read_concern.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/read_concern.js.map b/node_modules/mongodb/lib/read_concern.js.map new file mode 100644 index 000000000..d92bca139 --- /dev/null +++ b/node_modules/mongodb/lib/read_concern.js.map @@ -0,0 +1 @@ +{"version":3,"file":"read_concern.js","sourceRoot":"","sources":["../src/read_concern.ts"],"names":[],"mappings":";;;AAEA,cAAc;AACD,QAAA,gBAAgB,GAAG,MAAM,CAAC,MAAM,CAAC;IAC5C,KAAK,EAAE,OAAO;IACd,QAAQ,EAAE,UAAU;IACpB,YAAY,EAAE,cAAc;IAC5B,SAAS,EAAE,WAAW;IACtB,QAAQ,EAAE,UAAU;CACZ,CAAC,CAAC;AAQZ;;;;;;GAMG;AACH,MAAa,WAAW;IAGtB,2DAA2D;IAC3D,YAAY,KAAuB;;QACjC;;;;;WAKG;QACH,IAAI,CAAC,KAAK,GAAG,MAAA,wBAAgB,CAAC,KAAK,CAAC,mCAAI,KAAK,CAAC;IAChD,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,WAAW,CAAC,OAGlB;QACC,IAAI,OAAO,IAAI,IAAI,EAAE;YACnB,OAAO;SACR;QAED,IAAI,OAAO,CAAC,WAAW,EAAE;YACvB,MAAM,EAAE,WAAW,EAAE,GAAG,OAAO,CAAC;YAChC,IAAI,WAAW,YAAY,WAAW,EAAE;gBACtC,OAAO,WAAW,CAAC;aACpB;iBAAM,IAAI,OAAO,WAAW,KAAK,QAAQ,EAAE;gBAC1C,OAAO,IAAI,WAAW,CAAC,WAAW,CAAC,CAAC;aACrC;iBAAM,IAAI,OAAO,IAAI,WAAW,IAAI,WAAW,CAAC,KAAK,EAAE;gBACtD,OAAO,IAAI,WAAW,CAAC,WAAW,CAAC,KAAK,CAAC,CAAC;aAC3C;SACF;QAED,IAAI,OAAO,CAAC,KAAK,EAAE;YACjB,OAAO,IAAI,WAAW,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC;SACvC;IACH,CAAC;IAED,MAAM,KAAK,QAAQ;QACjB,OAAO,wBAAgB,CAAC,QAAQ,CAAC;IACnC,CAAC;IAED,MAAM,KAAK,SAAS;QAClB,OAAO,wBAAgB,CAAC,SAAS,CAAC;IACpC,CAAC;IAED,MAAM,KAAK,YAAY;QACrB,OAAO,wBAAgB,CAAC,YAAY,CAAC;IACvC,CAAC;IAED,MAAM,KAAK,QAAQ;QACjB,OAAO,wBAAgB,CAAC,QAAQ,CAAC;IACnC,CAAC;IAED,MAAM;QACJ,OAAO,EAAE,KAAK,EAAE,IAAI,CAAC,KAAK,EAAE,CAAC;IAC/B,CAAC;CACF;AA9DD,kCA8DC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/read_preference.js b/node_modules/mongodb/lib/read_preference.js new file mode 100644 index 000000000..7eee4650a --- /dev/null +++ b/node_modules/mongodb/lib/read_preference.js @@ -0,0 +1,194 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ReadPreference = exports.ReadPreferenceMode = void 0; +const error_1 = require("./error"); +/** @public */ +exports.ReadPreferenceMode = Object.freeze({ + primary: 'primary', + primaryPreferred: 'primaryPreferred', + secondary: 'secondary', + secondaryPreferred: 'secondaryPreferred', + nearest: 'nearest' +}); +/** + * The **ReadPreference** class is a class that represents a MongoDB ReadPreference and is + * used to construct connections. 
+ * @public + * + * @see https://docs.mongodb.com/manual/core/read-preference/ + */ +class ReadPreference { + /** + * @param mode - A string describing the read preference mode (primary|primaryPreferred|secondary|secondaryPreferred|nearest) + * @param tags - A tag set used to target reads to members with the specified tag(s). tagSet is not available if using read preference mode primary. + * @param options - Additional read preference options + */ + constructor(mode, tags, options) { + if (!ReadPreference.isValid(mode)) { + throw new error_1.MongoInvalidArgumentError(`Invalid read preference mode ${JSON.stringify(mode)}`); + } + if (options == null && typeof tags === 'object' && !Array.isArray(tags)) { + options = tags; + tags = undefined; + } + else if (tags && !Array.isArray(tags)) { + throw new error_1.MongoInvalidArgumentError('ReadPreference tags must be an array'); + } + this.mode = mode; + this.tags = tags; + this.hedge = options === null || options === void 0 ? void 0 : options.hedge; + this.maxStalenessSeconds = undefined; + this.minWireVersion = undefined; + options = options !== null && options !== void 0 ? options : {}; + if (options.maxStalenessSeconds != null) { + if (options.maxStalenessSeconds <= 0) { + throw new error_1.MongoInvalidArgumentError('maxStalenessSeconds must be a positive integer'); + } + this.maxStalenessSeconds = options.maxStalenessSeconds; + // NOTE: The minimum required wire version is 5 for this read preference. If the existing + // topology has a lower value then a MongoError will be thrown during server selection. + this.minWireVersion = 5; + } + if (this.mode === ReadPreference.PRIMARY) { + if (this.tags && Array.isArray(this.tags) && this.tags.length > 0) { + throw new error_1.MongoInvalidArgumentError('Primary read preference cannot be combined with tags'); + } + if (this.maxStalenessSeconds) { + throw new error_1.MongoInvalidArgumentError('Primary read preference cannot be combined with maxStalenessSeconds'); + } + if (this.hedge) { + throw new error_1.MongoInvalidArgumentError('Primary read preference cannot be combined with hedge'); + } + } + } + // Support the deprecated `preference` property introduced in the porcelain layer + get preference() { + return this.mode; + } + static fromString(mode) { + return new ReadPreference(mode); + } + /** + * Construct a ReadPreference given an options object. + * + * @param options - The options object from which to extract the read preference. + */ + static fromOptions(options) { + var _a, _b, _c; + if (!options) + return; + const readPreference = (_a = options.readPreference) !== null && _a !== void 0 ? _a : (_b = options.session) === null || _b === void 0 ? void 0 : _b.transaction.options.readPreference; + const readPreferenceTags = options.readPreferenceTags; + if (readPreference == null) { + return; + } + if (typeof readPreference === 'string') { + return new ReadPreference(readPreference, readPreferenceTags); + } + else if (!(readPreference instanceof ReadPreference) && typeof readPreference === 'object') { + const mode = readPreference.mode || readPreference.preference; + if (mode && typeof mode === 'string') { + return new ReadPreference(mode, (_c = readPreference.tags) !== null && _c !== void 0 ? 
_c : readPreferenceTags, { + maxStalenessSeconds: readPreference.maxStalenessSeconds, + hedge: options.hedge + }); + } + } + if (readPreferenceTags) { + readPreference.tags = readPreferenceTags; + } + return readPreference; + } + /** + * Replaces options.readPreference with a ReadPreference instance + */ + static translate(options) { + if (options.readPreference == null) + return options; + const r = options.readPreference; + if (typeof r === 'string') { + options.readPreference = new ReadPreference(r); + } + else if (r && !(r instanceof ReadPreference) && typeof r === 'object') { + const mode = r.mode || r.preference; + if (mode && typeof mode === 'string') { + options.readPreference = new ReadPreference(mode, r.tags, { + maxStalenessSeconds: r.maxStalenessSeconds + }); + } + } + else if (!(r instanceof ReadPreference)) { + throw new error_1.MongoInvalidArgumentError(`Invalid read preference: ${r}`); + } + return options; + } + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. + */ + static isValid(mode) { + const VALID_MODES = new Set([ + ReadPreference.PRIMARY, + ReadPreference.PRIMARY_PREFERRED, + ReadPreference.SECONDARY, + ReadPreference.SECONDARY_PREFERRED, + ReadPreference.NEAREST, + null + ]); + return VALID_MODES.has(mode); + } + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. + */ + isValid(mode) { + return ReadPreference.isValid(typeof mode === 'string' ? mode : this.mode); + } + /** + * Indicates that this readPreference needs the "slaveOk" bit when sent over the wire + * + * @see https://docs.mongodb.com/manual/reference/mongodb-wire-protocol/#op-query + */ + slaveOk() { + const NEEDS_SLAVEOK = new Set([ + ReadPreference.PRIMARY_PREFERRED, + ReadPreference.SECONDARY, + ReadPreference.SECONDARY_PREFERRED, + ReadPreference.NEAREST + ]); + return NEEDS_SLAVEOK.has(this.mode); + } + /** + * Check if the two ReadPreferences are equivalent + * + * @param readPreference - The read preference with which to check equality + */ + equals(readPreference) { + return readPreference.mode === this.mode; + } + /** Return JSON representation */ + toJSON() { + const readPreference = { mode: this.mode }; + if (Array.isArray(this.tags)) + readPreference.tags = this.tags; + if (this.maxStalenessSeconds) + readPreference.maxStalenessSeconds = this.maxStalenessSeconds; + if (this.hedge) + readPreference.hedge = this.hedge; + return readPreference; + } +} +exports.ReadPreference = ReadPreference; +ReadPreference.PRIMARY = exports.ReadPreferenceMode.primary; +ReadPreference.PRIMARY_PREFERRED = exports.ReadPreferenceMode.primaryPreferred; +ReadPreference.SECONDARY = exports.ReadPreferenceMode.secondary; +ReadPreference.SECONDARY_PREFERRED = exports.ReadPreferenceMode.secondaryPreferred; +ReadPreference.NEAREST = exports.ReadPreferenceMode.nearest; +ReadPreference.primary = new ReadPreference(exports.ReadPreferenceMode.primary); +ReadPreference.primaryPreferred = new ReadPreference(exports.ReadPreferenceMode.primaryPreferred); +ReadPreference.secondary = new ReadPreference(exports.ReadPreferenceMode.secondary); +ReadPreference.secondaryPreferred = new ReadPreference(exports.ReadPreferenceMode.secondaryPreferred); +ReadPreference.nearest = new ReadPreference(exports.ReadPreferenceMode.nearest); +//# sourceMappingURL=read_preference.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/read_preference.js.map b/node_modules/mongodb/lib/read_preference.js.map new file mode 
100644 index 000000000..95e65afd7 --- /dev/null +++ b/node_modules/mongodb/lib/read_preference.js.map @@ -0,0 +1 @@ +{"version":3,"file":"read_preference.js","sourceRoot":"","sources":["../src/read_preference.ts"],"names":[],"mappings":";;;AAGA,mCAAoD;AAKpD,cAAc;AACD,QAAA,kBAAkB,GAAG,MAAM,CAAC,MAAM,CAAC;IAC9C,OAAO,EAAE,SAAS;IAClB,gBAAgB,EAAE,kBAAkB;IACpC,SAAS,EAAE,WAAW;IACtB,kBAAkB,EAAE,oBAAoB;IACxC,OAAO,EAAE,SAAS;CACV,CAAC,CAAC;AAsCZ;;;;;;GAMG;AACH,MAAa,cAAc;IAmBzB;;;;OAIG;IACH,YAAY,IAAwB,EAAE,IAAe,EAAE,OAA+B;QACpF,IAAI,CAAC,cAAc,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;YACjC,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;SAC7F;QACD,IAAI,OAAO,IAAI,IAAI,IAAI,OAAO,IAAI,KAAK,QAAQ,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;YACvE,OAAO,GAAG,IAAI,CAAC;YACf,IAAI,GAAG,SAAS,CAAC;SAClB;aAAM,IAAI,IAAI,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;YACvC,MAAM,IAAI,iCAAyB,CAAC,sCAAsC,CAAC,CAAC;SAC7E;QAED,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,KAAK,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,KAAK,CAAC;QAC5B,IAAI,CAAC,mBAAmB,GAAG,SAAS,CAAC;QACrC,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC;QAEhC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,IAAI,OAAO,CAAC,mBAAmB,IAAI,IAAI,EAAE;YACvC,IAAI,OAAO,CAAC,mBAAmB,IAAI,CAAC,EAAE;gBACpC,MAAM,IAAI,iCAAyB,CAAC,gDAAgD,CAAC,CAAC;aACvF;YAED,IAAI,CAAC,mBAAmB,GAAG,OAAO,CAAC,mBAAmB,CAAC;YAEvD,yFAAyF;YACzF,6FAA6F;YAC7F,IAAI,CAAC,cAAc,GAAG,CAAC,CAAC;SACzB;QAED,IAAI,IAAI,CAAC,IAAI,KAAK,cAAc,CAAC,OAAO,EAAE;YACxC,IAAI,IAAI,CAAC,IAAI,IAAI,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC,IAAI,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,EAAE;gBACjE,MAAM,IAAI,iCAAyB,CAAC,sDAAsD,CAAC,CAAC;aAC7F;YAED,IAAI,IAAI,CAAC,mBAAmB,EAAE;gBAC5B,MAAM,IAAI,iCAAyB,CACjC,qEAAqE,CACtE,CAAC;aACH;YAED,IAAI,IAAI,CAAC,KAAK,EAAE;gBACd,MAAM,IAAI,iCAAyB,CACjC,uDAAuD,CACxD,CAAC;aACH;SACF;IACH,CAAC;IAED,iFAAiF;IACjF,IAAI,UAAU;QACZ,OAAO,IAAI,CAAC,IAAI,CAAC;IACnB,CAAC;IAED,MAAM,CAAC,UAAU,CAAC,IAAY;QAC5B,OAAO,IAAI,cAAc,CAAC,IAA0B,CAAC,CAAC;IACxD,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,WAAW,CAAC,OAAmC;;QACpD,IAAI,CAAC,OAAO;YAAE,OAAO;QACrB,MAAM,cAAc,GAClB,MAAA,OAAO,CAAC,cAAc,mCAAI,MAAA,OAAO,CAAC,OAAO,0CAAE,WAAW,CAAC,OAAO,CAAC,cAAc,CAAC;QAChF,MAAM,kBAAkB,GAAG,OAAO,CAAC,kBAAkB,CAAC;QAEtD,IAAI,cAAc,IAAI,IAAI,EAAE;YAC1B,OAAO;SACR;QAED,IAAI,OAAO,cAAc,KAAK,QAAQ,EAAE;YACtC,OAAO,IAAI,cAAc,CAAC,cAAoC,EAAE,kBAAkB,CAAC,CAAC;SACrF;aAAM,IAAI,CAAC,CAAC,cAAc,YAAY,cAAc,CAAC,IAAI,OAAO,cAAc,KAAK,QAAQ,EAAE;YAC5F,MAAM,IAAI,GAAG,cAAc,CAAC,IAAI,IAAI,cAAc,CAAC,UAAU,CAAC;YAC9D,IAAI,IAAI,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;gBACpC,OAAO,IAAI,cAAc,CACvB,IAA0B,EAC1B,MAAA,cAAc,CAAC,IAAI,mCAAI,kBAAkB,EACzC;oBACE,mBAAmB,EAAE,cAAc,CAAC,mBAAmB;oBACvD,KAAK,EAAE,OAAO,CAAC,KAAK;iBACrB,CACF,CAAC;aACH;SACF;QAED,IAAI,kBAAkB,EAAE;YACtB,cAAc,CAAC,IAAI,GAAG,kBAAkB,CAAC;SAC1C;QAED,OAAO,cAAgC,CAAC;IAC1C,CAAC;IAED;;OAEG;IACH,MAAM,CAAC,SAAS,CAAC,OAAkC;QACjD,IAAI,OAAO,CAAC,cAAc,IAAI,IAAI;YAAE,OAAO,OAAO,CAAC;QACnD,MAAM,CAAC,GAAG,OAAO,CAAC,cAAc,CAAC;QAEjC,IAAI,OAAO,CAAC,KAAK,QAAQ,EAAE;YACzB,OAAO,CAAC,cAAc,GAAG,IAAI,cAAc,CAAC,CAAuB,CAAC,CAAC;SACtE;aAAM,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,YAAY,cAAc,CAAC,IAAI,OAAO,CAAC,KAAK,QAAQ,EAAE;YACvE,MAAM,IAAI,GAAG,CAAC,CAAC,IAAI,IAAI,CAAC,CAAC,UAAU,CAAC;YACpC,IAAI,IAAI,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;gBACpC,OAAO,CAAC,cAAc,GAAG,IAAI,cAAc,CAAC,IAA0B,EAAE,CAAC,CAAC,IAAI,EAAE;oBAC9E,mBAAmB,EAAE,CAAC,CAAC,mBAAmB;iBAC3C,CAAC,CAAC;aACJ;SACF;aAAM,IAAI,CAAC,CAAC,CAAC,YAAY,cAAc,CAAC,EAAE;YACzC,MAAM,IAAI,iCAAyB,CAAC,4BAA4B,CAAC,EAAE,CAAC,CAAC;SACtE;QAED,OAAO,OAAO,CAAC;IACjB,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,OAAO,CAAC,IAAY;QACzB,MA
AM,WAAW,GAAG,IAAI,GAAG,CAAC;YAC1B,cAAc,CAAC,OAAO;YACtB,cAAc,CAAC,iBAAiB;YAChC,cAAc,CAAC,SAAS;YACxB,cAAc,CAAC,mBAAmB;YAClC,cAAc,CAAC,OAAO;YACtB,IAAI;SACL,CAAC,CAAC;QAEH,OAAO,WAAW,CAAC,GAAG,CAAC,IAA0B,CAAC,CAAC;IACrD,CAAC;IAED;;;;OAIG;IACH,OAAO,CAAC,IAAa;QACnB,OAAO,cAAc,CAAC,OAAO,CAAC,OAAO,IAAI,KAAK,QAAQ,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IAC7E,CAAC;IAED;;;;OAIG;IACH,OAAO;QACL,MAAM,aAAa,GAAG,IAAI,GAAG,CAAS;YACpC,cAAc,CAAC,iBAAiB;YAChC,cAAc,CAAC,SAAS;YACxB,cAAc,CAAC,mBAAmB;YAClC,cAAc,CAAC,OAAO;SACvB,CAAC,CAAC;QAEH,OAAO,aAAa,CAAC,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACtC,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,cAA8B;QACnC,OAAO,cAAc,CAAC,IAAI,KAAK,IAAI,CAAC,IAAI,CAAC;IAC3C,CAAC;IAED,iCAAiC;IACjC,MAAM;QACJ,MAAM,cAAc,GAAG,EAAE,IAAI,EAAE,IAAI,CAAC,IAAI,EAAc,CAAC;QACvD,IAAI,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC;YAAE,cAAc,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,CAAC;QAC9D,IAAI,IAAI,CAAC,mBAAmB;YAAE,cAAc,CAAC,mBAAmB,GAAG,IAAI,CAAC,mBAAmB,CAAC;QAC5F,IAAI,IAAI,CAAC,KAAK;YAAE,cAAc,CAAC,KAAK,GAAG,IAAI,CAAC,KAAK,CAAC;QAClD,OAAO,cAAc,CAAC;IACxB,CAAC;;AA1MH,wCA2MC;AApMe,sBAAO,GAAG,0BAAkB,CAAC,OAAO,CAAC;AACrC,gCAAiB,GAAG,0BAAkB,CAAC,gBAAgB,CAAC;AACxD,wBAAS,GAAG,0BAAkB,CAAC,SAAS,CAAC;AACzC,kCAAmB,GAAG,0BAAkB,CAAC,kBAAkB,CAAC;AAC5D,sBAAO,GAAG,0BAAkB,CAAC,OAAO,CAAC;AAErC,sBAAO,GAAG,IAAI,cAAc,CAAC,0BAAkB,CAAC,OAAO,CAAC,CAAC;AACzD,+BAAgB,GAAG,IAAI,cAAc,CAAC,0BAAkB,CAAC,gBAAgB,CAAC,CAAC;AAC3E,wBAAS,GAAG,IAAI,cAAc,CAAC,0BAAkB,CAAC,SAAS,CAAC,CAAC;AAC7D,iCAAkB,GAAG,IAAI,cAAc,CAAC,0BAAkB,CAAC,kBAAkB,CAAC,CAAC;AAC/E,sBAAO,GAAG,IAAI,cAAc,CAAC,0BAAkB,CAAC,OAAO,CAAC,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/common.js b/node_modules/mongodb/lib/sdam/common.js new file mode 100644 index 000000000..f8d382b86 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/common.js @@ -0,0 +1,61 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports._advanceClusterTime = exports.clearAndRemoveTimerFrom = exports.drainTimerQueue = exports.ServerType = exports.TopologyType = exports.STATE_CONNECTED = exports.STATE_CONNECTING = exports.STATE_CLOSED = exports.STATE_CLOSING = void 0; +// shared state names +exports.STATE_CLOSING = 'closing'; +exports.STATE_CLOSED = 'closed'; +exports.STATE_CONNECTING = 'connecting'; +exports.STATE_CONNECTED = 'connected'; +/** + * An enumeration of topology types we know about + * @public + */ +exports.TopologyType = Object.freeze({ + Single: 'Single', + ReplicaSetNoPrimary: 'ReplicaSetNoPrimary', + ReplicaSetWithPrimary: 'ReplicaSetWithPrimary', + Sharded: 'Sharded', + Unknown: 'Unknown', + LoadBalanced: 'LoadBalanced' +}); +/** + * An enumeration of server types we know about + * @public + */ +exports.ServerType = Object.freeze({ + Standalone: 'Standalone', + Mongos: 'Mongos', + PossiblePrimary: 'PossiblePrimary', + RSPrimary: 'RSPrimary', + RSSecondary: 'RSSecondary', + RSArbiter: 'RSArbiter', + RSOther: 'RSOther', + RSGhost: 'RSGhost', + Unknown: 'Unknown', + LoadBalancer: 'LoadBalancer' +}); +/** @internal */ +function drainTimerQueue(queue) { + queue.forEach(clearTimeout); + queue.clear(); +} +exports.drainTimerQueue = drainTimerQueue; +/** @internal */ +function clearAndRemoveTimerFrom(timer, timers) { + clearTimeout(timer); + return timers.delete(timer); +} +exports.clearAndRemoveTimerFrom = clearAndRemoveTimerFrom; +/** Shared function to determine clusterTime for a given topology or session */ +function _advanceClusterTime(entity, $clusterTime) { + if (entity.clusterTime == null) { + entity.clusterTime = $clusterTime; + } 
+ else { + if ($clusterTime.clusterTime.greaterThan(entity.clusterTime.clusterTime)) { + entity.clusterTime = $clusterTime; + } + } +} +exports._advanceClusterTime = _advanceClusterTime; +//# sourceMappingURL=common.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/common.js.map b/node_modules/mongodb/lib/sdam/common.js.map new file mode 100644 index 000000000..b54682fd7 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/common.js.map @@ -0,0 +1 @@ +{"version":3,"file":"common.js","sourceRoot":"","sources":["../../src/sdam/common.ts"],"names":[],"mappings":";;;AAIA,qBAAqB;AACR,QAAA,aAAa,GAAG,SAAS,CAAC;AAC1B,QAAA,YAAY,GAAG,QAAQ,CAAC;AACxB,QAAA,gBAAgB,GAAG,YAAY,CAAC;AAChC,QAAA,eAAe,GAAG,WAAW,CAAC;AAE3C;;;GAGG;AACU,QAAA,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC;IACxC,MAAM,EAAE,QAAQ;IAChB,mBAAmB,EAAE,qBAAqB;IAC1C,qBAAqB,EAAE,uBAAuB;IAC9C,OAAO,EAAE,SAAS;IAClB,OAAO,EAAE,SAAS;IAClB,YAAY,EAAE,cAAc;CACpB,CAAC,CAAC;AAKZ;;;GAGG;AACU,QAAA,UAAU,GAAG,MAAM,CAAC,MAAM,CAAC;IACtC,UAAU,EAAE,YAAY;IACxB,MAAM,EAAE,QAAQ;IAChB,eAAe,EAAE,iBAAiB;IAClC,SAAS,EAAE,WAAW;IACtB,WAAW,EAAE,aAAa;IAC1B,SAAS,EAAE,WAAW;IACtB,OAAO,EAAE,SAAS;IAClB,OAAO,EAAE,SAAS;IAClB,OAAO,EAAE,SAAS;IAClB,YAAY,EAAE,cAAc;CACpB,CAAC,CAAC;AAQZ,gBAAgB;AAChB,SAAgB,eAAe,CAAC,KAAiB;IAC/C,KAAK,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC;IAC5B,KAAK,CAAC,KAAK,EAAE,CAAC;AAChB,CAAC;AAHD,0CAGC;AAED,gBAAgB;AAChB,SAAgB,uBAAuB,CAAC,KAAqB,EAAE,MAAkB;IAC/E,YAAY,CAAC,KAAK,CAAC,CAAC;IACpB,OAAO,MAAM,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC;AAC9B,CAAC;AAHD,0DAGC;AAWD,+EAA+E;AAC/E,SAAgB,mBAAmB,CACjC,MAAgC,EAChC,YAAyB;IAEzB,IAAI,MAAM,CAAC,WAAW,IAAI,IAAI,EAAE;QAC9B,MAAM,CAAC,WAAW,GAAG,YAAY,CAAC;KACnC;SAAM;QACL,IAAI,YAAY,CAAC,WAAW,CAAC,WAAW,CAAC,MAAM,CAAC,WAAW,CAAC,WAAW,CAAC,EAAE;YACxE,MAAM,CAAC,WAAW,GAAG,YAAY,CAAC;SACnC;KACF;AACH,CAAC;AAXD,kDAWC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/events.js b/node_modules/mongodb/lib/sdam/events.js new file mode 100644 index 000000000..6e11b5d7d --- /dev/null +++ b/node_modules/mongodb/lib/sdam/events.js @@ -0,0 +1,125 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.ServerHeartbeatFailedEvent = exports.ServerHeartbeatSucceededEvent = exports.ServerHeartbeatStartedEvent = exports.TopologyClosedEvent = exports.TopologyOpeningEvent = exports.TopologyDescriptionChangedEvent = exports.ServerClosedEvent = exports.ServerOpeningEvent = exports.ServerDescriptionChangedEvent = void 0; +/** + * Emitted when server description changes, but does NOT include changes to the RTT. + * @public + * @category Event + */ +class ServerDescriptionChangedEvent { + /** @internal */ + constructor(topologyId, address, previousDescription, newDescription) { + this.topologyId = topologyId; + this.address = address; + this.previousDescription = previousDescription; + this.newDescription = newDescription; + } +} +exports.ServerDescriptionChangedEvent = ServerDescriptionChangedEvent; +/** + * Emitted when server is initialized. + * @public + * @category Event + */ +class ServerOpeningEvent { + /** @internal */ + constructor(topologyId, address) { + this.topologyId = topologyId; + this.address = address; + } +} +exports.ServerOpeningEvent = ServerOpeningEvent; +/** + * Emitted when server is closed. + * @public + * @category Event + */ +class ServerClosedEvent { + /** @internal */ + constructor(topologyId, address) { + this.topologyId = topologyId; + this.address = address; + } +} +exports.ServerClosedEvent = ServerClosedEvent; +/** + * Emitted when topology description changes. 
+ * @public + * @category Event + */ +class TopologyDescriptionChangedEvent { + /** @internal */ + constructor(topologyId, previousDescription, newDescription) { + this.topologyId = topologyId; + this.previousDescription = previousDescription; + this.newDescription = newDescription; + } +} +exports.TopologyDescriptionChangedEvent = TopologyDescriptionChangedEvent; +/** + * Emitted when topology is initialized. + * @public + * @category Event + */ +class TopologyOpeningEvent { + /** @internal */ + constructor(topologyId) { + this.topologyId = topologyId; + } +} +exports.TopologyOpeningEvent = TopologyOpeningEvent; +/** + * Emitted when topology is closed. + * @public + * @category Event + */ +class TopologyClosedEvent { + /** @internal */ + constructor(topologyId) { + this.topologyId = topologyId; + } +} +exports.TopologyClosedEvent = TopologyClosedEvent; +/** + * Emitted when the server monitor’s ismaster command is started - immediately before + * the ismaster command is serialized into raw BSON and written to the socket. + * + * @public + * @category Event + */ +class ServerHeartbeatStartedEvent { + /** @internal */ + constructor(connectionId) { + this.connectionId = connectionId; + } +} +exports.ServerHeartbeatStartedEvent = ServerHeartbeatStartedEvent; +/** + * Emitted when the server monitor’s ismaster succeeds. + * @public + * @category Event + */ +class ServerHeartbeatSucceededEvent { + /** @internal */ + constructor(connectionId, duration, reply) { + this.connectionId = connectionId; + this.duration = duration; + this.reply = reply; + } +} +exports.ServerHeartbeatSucceededEvent = ServerHeartbeatSucceededEvent; +/** + * Emitted when the server monitor’s ismaster fails, either with an “ok: 0” or a socket exception. + * @public + * @category Event + */ +class ServerHeartbeatFailedEvent { + /** @internal */ + constructor(connectionId, duration, failure) { + this.connectionId = connectionId; + this.duration = duration; + this.failure = failure; + } +} +exports.ServerHeartbeatFailedEvent = ServerHeartbeatFailedEvent; +//# sourceMappingURL=events.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/events.js.map b/node_modules/mongodb/lib/sdam/events.js.map new file mode 100644 index 000000000..ab6297877 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/events.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"events.js","sourceRoot":"","sources":["../../src/sdam/events.ts"],"names":[],"mappings":";;;AAIA;;;;GAIG;AACH,MAAa,6BAA6B;IAUxC,gBAAgB;IAChB,YACE,UAAkB,EAClB,OAAe,EACf,mBAAsC,EACtC,cAAiC;QAEjC,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;QACvB,IAAI,CAAC,mBAAmB,GAAG,mBAAmB,CAAC;QAC/C,IAAI,CAAC,cAAc,GAAG,cAAc,CAAC;IACvC,CAAC;CACF;AAtBD,sEAsBC;AAED;;;;GAIG;AACH,MAAa,kBAAkB;IAM7B,gBAAgB;IAChB,YAAY,UAAkB,EAAE,OAAe;QAC7C,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;CACF;AAXD,gDAWC;AAED;;;;GAIG;AACH,MAAa,iBAAiB;IAM5B,gBAAgB;IAChB,YAAY,UAAkB,EAAE,OAAe;QAC7C,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;CACF;AAXD,8CAWC;AAED;;;;GAIG;AACH,MAAa,+BAA+B;IAQ1C,gBAAgB;IAChB,YACE,UAAkB,EAClB,mBAAwC,EACxC,cAAmC;QAEnC,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;QAC7B,IAAI,CAAC,mBAAmB,GAAG,mBAAmB,CAAC;QAC/C,IAAI,CAAC,cAAc,GAAG,cAAc,CAAC;IACvC,CAAC;CACF;AAlBD,0EAkBC;AAED;;;;GAIG;AACH,MAAa,oBAAoB;IAI/B,gBAAgB;IAChB,YAAY,UAAkB;QAC5B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;CACF;AARD,oDAQC;AAED;;;;GAIG;AACH,MAAa,mBAAmB;IAI9B,gBAAgB;IAChB,YAAY,UAAkB;QAC5B,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;CACF;AARD,kDAQC;AAED;;;;;;GAMG;AACH,MAAa,2BAA2B;IAItC,gBAAgB;IAChB,YAAY,YAAoB;QAC9B,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;IACnC,CAAC;CACF;AARD,kEAQC;AAED;;;;GAIG;AACH,MAAa,6BAA6B;IAQxC,gBAAgB;IAChB,YAAY,YAAoB,EAAE,QAAgB,EAAE,KAAe;QACjE,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;QACjC,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;IACrB,CAAC;CACF;AAdD,sEAcC;AAED;;;;GAIG;AACH,MAAa,0BAA0B;IAQrC,gBAAgB;IAChB,YAAY,YAAoB,EAAE,QAAgB,EAAE,OAAc;QAChE,IAAI,CAAC,YAAY,GAAG,YAAY,CAAC;QACjC,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;IACzB,CAAC;CACF;AAdD,gEAcC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/monitor.js b/node_modules/mongodb/lib/sdam/monitor.js new file mode 100644 index 000000000..c783455e1 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/monitor.js @@ -0,0 +1,319 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.RTTPinger = exports.Monitor = void 0; +const common_1 = require("./common"); +const utils_1 = require("../utils"); +const connect_1 = require("../cmap/connect"); +const connection_1 = require("../cmap/connection"); +const error_1 = require("../error"); +const bson_1 = require("../bson"); +const events_1 = require("./events"); +const server_1 = require("./server"); +const mongo_types_1 = require("../mongo_types"); +/** @internal */ +const kServer = Symbol('server'); +/** @internal */ +const kMonitorId = Symbol('monitorId'); +/** @internal */ +const kConnection = Symbol('connection'); +/** @internal */ +const kCancellationToken = Symbol('cancellationToken'); +/** @internal */ +const kRTTPinger = Symbol('rttPinger'); +/** @internal */ +const kRoundTripTime = Symbol('roundTripTime'); +const STATE_IDLE = 'idle'; +const STATE_MONITORING = 'monitoring'; +const stateTransition = utils_1.makeStateMachine({ + [common_1.STATE_CLOSING]: [common_1.STATE_CLOSING, STATE_IDLE, common_1.STATE_CLOSED], + [common_1.STATE_CLOSED]: [common_1.STATE_CLOSED, STATE_MONITORING], + [STATE_IDLE]: [STATE_IDLE, STATE_MONITORING, common_1.STATE_CLOSING], + [STATE_MONITORING]: [STATE_MONITORING, STATE_IDLE, common_1.STATE_CLOSING] +}); +const INVALID_REQUEST_CHECK_STATES = new Set([common_1.STATE_CLOSING, common_1.STATE_CLOSED, STATE_MONITORING]); +function isInCloseState(monitor) { + return monitor.s.state === common_1.STATE_CLOSED || monitor.s.state === 
common_1.STATE_CLOSING; +} +/** @internal */ +class Monitor extends mongo_types_1.TypedEventEmitter { + constructor(server, options) { + var _a, _b, _c; + super(); + this[kServer] = server; + this[kConnection] = undefined; + this[kCancellationToken] = new mongo_types_1.CancellationToken(); + this[kCancellationToken].setMaxListeners(Infinity); + this[kMonitorId] = undefined; + this.s = { + state: common_1.STATE_CLOSED + }; + this.address = server.description.address; + this.options = Object.freeze({ + connectTimeoutMS: (_a = options.connectTimeoutMS) !== null && _a !== void 0 ? _a : 10000, + heartbeatFrequencyMS: (_b = options.heartbeatFrequencyMS) !== null && _b !== void 0 ? _b : 10000, + minHeartbeatFrequencyMS: (_c = options.minHeartbeatFrequencyMS) !== null && _c !== void 0 ? _c : 500 + }); + const cancellationToken = this[kCancellationToken]; + // TODO: refactor this to pull it directly from the pool, requires new ConnectionPool integration + const connectOptions = Object.assign({ + id: '', + generation: server.s.pool.generation, + connectionType: connection_1.Connection, + cancellationToken, + hostAddress: server.description.hostAddress + }, options, + // force BSON serialization options + { + raw: false, + promoteLongs: true, + promoteValues: true, + promoteBuffers: true + }); + // ensure no authentication is used for monitoring + delete connectOptions.credentials; + if (connectOptions.autoEncrypter) { + delete connectOptions.autoEncrypter; + } + this.connectOptions = Object.freeze(connectOptions); + } + connect() { + if (this.s.state !== common_1.STATE_CLOSED) { + return; + } + // start + const heartbeatFrequencyMS = this.options.heartbeatFrequencyMS; + const minHeartbeatFrequencyMS = this.options.minHeartbeatFrequencyMS; + this[kMonitorId] = utils_1.makeInterruptibleAsyncInterval(monitorServer(this), { + interval: heartbeatFrequencyMS, + minInterval: minHeartbeatFrequencyMS, + immediate: true + }); + } + requestCheck() { + var _a; + if (INVALID_REQUEST_CHECK_STATES.has(this.s.state)) { + return; + } + (_a = this[kMonitorId]) === null || _a === void 0 ? void 0 : _a.wake(); + } + reset() { + const topologyVersion = this[kServer].description.topologyVersion; + if (isInCloseState(this) || topologyVersion == null) { + return; + } + stateTransition(this, common_1.STATE_CLOSING); + resetMonitorState(this); + // restart monitor + stateTransition(this, STATE_IDLE); + // restart monitoring + const heartbeatFrequencyMS = this.options.heartbeatFrequencyMS; + const minHeartbeatFrequencyMS = this.options.minHeartbeatFrequencyMS; + this[kMonitorId] = utils_1.makeInterruptibleAsyncInterval(monitorServer(this), { + interval: heartbeatFrequencyMS, + minInterval: minHeartbeatFrequencyMS + }); + } + close() { + if (isInCloseState(this)) { + return; + } + stateTransition(this, common_1.STATE_CLOSING); + resetMonitorState(this); + // close monitor + this.emit('close'); + stateTransition(this, common_1.STATE_CLOSED); + } +} +exports.Monitor = Monitor; +function resetMonitorState(monitor) { + var _a, _b, _c; + (_a = monitor[kMonitorId]) === null || _a === void 0 ? void 0 : _a.stop(); + monitor[kMonitorId] = undefined; + (_b = monitor[kRTTPinger]) === null || _b === void 0 ? void 0 : _b.close(); + monitor[kRTTPinger] = undefined; + monitor[kCancellationToken].emit('cancel'); + (_c = monitor[kConnection]) === null || _c === void 0 ? 
void 0 : _c.destroy({ force: true }); + monitor[kConnection] = undefined; +} +function checkServer(monitor, callback) { + let start = utils_1.now(); + monitor.emit(server_1.Server.SERVER_HEARTBEAT_STARTED, new events_1.ServerHeartbeatStartedEvent(monitor.address)); + function failureHandler(err) { + var _a; + (_a = monitor[kConnection]) === null || _a === void 0 ? void 0 : _a.destroy({ force: true }); + monitor[kConnection] = undefined; + monitor.emit(server_1.Server.SERVER_HEARTBEAT_FAILED, new events_1.ServerHeartbeatFailedEvent(monitor.address, utils_1.calculateDurationInMs(start), err)); + monitor.emit('resetServer', err); + monitor.emit('resetConnectionPool'); + callback(err); + } + const connection = monitor[kConnection]; + if (connection && !connection.closed) { + const { serverApi, helloOk } = connection; + const connectTimeoutMS = monitor.options.connectTimeoutMS; + const maxAwaitTimeMS = monitor.options.heartbeatFrequencyMS; + const topologyVersion = monitor[kServer].description.topologyVersion; + const isAwaitable = topologyVersion != null; + const cmd = { + [(serverApi === null || serverApi === void 0 ? void 0 : serverApi.version) || helloOk ? 'hello' : 'ismaster']: true, + ...(isAwaitable && topologyVersion + ? { maxAwaitTimeMS, topologyVersion: makeTopologyVersion(topologyVersion) } + : {}) + }; + const options = isAwaitable + ? { + socketTimeoutMS: connectTimeoutMS ? connectTimeoutMS + maxAwaitTimeMS : 0, + exhaustAllowed: true + } + : { socketTimeoutMS: connectTimeoutMS }; + if (isAwaitable && monitor[kRTTPinger] == null) { + monitor[kRTTPinger] = new RTTPinger(monitor[kCancellationToken], Object.assign({ heartbeatFrequencyMS: monitor.options.heartbeatFrequencyMS }, monitor.connectOptions)); + } + connection.command(utils_1.ns('admin.$cmd'), cmd, options, (err, isMaster) => { + var _a; + if (err) { + failureHandler(err); + return; + } + if ('isWritablePrimary' in isMaster) { + // Provide pre-hello-style response document. + isMaster.ismaster = isMaster.isWritablePrimary; + } + const rttPinger = monitor[kRTTPinger]; + const duration = isAwaitable && rttPinger ? rttPinger.roundTripTime : utils_1.calculateDurationInMs(start); + monitor.emit(server_1.Server.SERVER_HEARTBEAT_SUCCEEDED, new events_1.ServerHeartbeatSucceededEvent(monitor.address, duration, isMaster)); + // if we are using the streaming protocol then we immediately issue another `started` + // event, otherwise the "check" is complete and return to the main monitor loop + if (isAwaitable && isMaster.topologyVersion) { + monitor.emit(server_1.Server.SERVER_HEARTBEAT_STARTED, new events_1.ServerHeartbeatStartedEvent(monitor.address)); + start = utils_1.now(); + } + else { + (_a = monitor[kRTTPinger]) === null || _a === void 0 ? 
void 0 : _a.close(); + monitor[kRTTPinger] = undefined; + callback(undefined, isMaster); + } + }); + return; + } + // connecting does an implicit `ismaster` + connect_1.connect(monitor.connectOptions, (err, conn) => { + if (err) { + monitor[kConnection] = undefined; + // we already reset the connection pool on network errors in all cases + if (!(err instanceof error_1.MongoNetworkError)) { + monitor.emit('resetConnectionPool'); + } + failureHandler(err); + return; + } + if (conn) { + if (isInCloseState(monitor)) { + conn.destroy({ force: true }); + return; + } + monitor[kConnection] = conn; + monitor.emit(server_1.Server.SERVER_HEARTBEAT_SUCCEEDED, new events_1.ServerHeartbeatSucceededEvent(monitor.address, utils_1.calculateDurationInMs(start), conn.ismaster)); + callback(undefined, conn.ismaster); + } + }); +} +function monitorServer(monitor) { + return (callback) => { + stateTransition(monitor, STATE_MONITORING); + function done() { + if (!isInCloseState(monitor)) { + stateTransition(monitor, STATE_IDLE); + } + callback(); + } + checkServer(monitor, (err, isMaster) => { + if (err) { + // otherwise an error occurred on initial discovery, also bail + if (monitor[kServer].description.type === common_1.ServerType.Unknown) { + monitor.emit('resetServer', err); + return done(); + } + } + // if the check indicates streaming is supported, immediately reschedule monitoring + if (isMaster && isMaster.topologyVersion) { + setTimeout(() => { + var _a; + if (!isInCloseState(monitor)) { + (_a = monitor[kMonitorId]) === null || _a === void 0 ? void 0 : _a.wake(); + } + }, 0); + } + done(); + }); + }; +} +function makeTopologyVersion(tv) { + return { + processId: tv.processId, + // tests mock counter as just number, but in a real situation counter should always be a Long + counter: bson_1.Long.isLong(tv.counter) ? tv.counter : bson_1.Long.fromNumber(tv.counter) + }; +} +/** @internal */ +class RTTPinger { + constructor(cancellationToken, options) { + this[kConnection] = undefined; + this[kCancellationToken] = cancellationToken; + this[kRoundTripTime] = 0; + this.closed = false; + const heartbeatFrequencyMS = options.heartbeatFrequencyMS; + this[kMonitorId] = setTimeout(() => measureRoundTripTime(this, options), heartbeatFrequencyMS); + } + get roundTripTime() { + return this[kRoundTripTime]; + } + close() { + var _a; + this.closed = true; + clearTimeout(this[kMonitorId]); + (_a = this[kConnection]) === null || _a === void 0 ? void 0 : _a.destroy({ force: true }); + this[kConnection] = undefined; + } +} +exports.RTTPinger = RTTPinger; +function measureRoundTripTime(rttPinger, options) { + const start = utils_1.now(); + options.cancellationToken = rttPinger[kCancellationToken]; + const heartbeatFrequencyMS = options.heartbeatFrequencyMS; + if (rttPinger.closed) { + return; + } + function measureAndReschedule(conn) { + if (rttPinger.closed) { + conn === null || conn === void 0 ? 
void 0 : conn.destroy({ force: true }); + return; + } + if (rttPinger[kConnection] == null) { + rttPinger[kConnection] = conn; + } + rttPinger[kRoundTripTime] = utils_1.calculateDurationInMs(start); + rttPinger[kMonitorId] = setTimeout(() => measureRoundTripTime(rttPinger, options), heartbeatFrequencyMS); + } + const connection = rttPinger[kConnection]; + if (connection == null) { + connect_1.connect(options, (err, conn) => { + if (err) { + rttPinger[kConnection] = undefined; + rttPinger[kRoundTripTime] = 0; + return; + } + measureAndReschedule(conn); + }); + return; + } + connection.command(utils_1.ns('admin.$cmd'), { ismaster: 1 }, undefined, err => { + if (err) { + rttPinger[kConnection] = undefined; + rttPinger[kRoundTripTime] = 0; + return; + } + measureAndReschedule(); + }); +} +//# sourceMappingURL=monitor.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/monitor.js.map b/node_modules/mongodb/lib/sdam/monitor.js.map new file mode 100644 index 000000000..abb35ff9c --- /dev/null +++ b/node_modules/mongodb/lib/sdam/monitor.js.map @@ -0,0 +1 @@ +{"version":3,"file":"monitor.js","sourceRoot":"","sources":["../../src/sdam/monitor.ts"],"names":[],"mappings":";;;AAAA,qCAAmE;AACnE,oCAOkB;AAClB,6CAA0C;AAC1C,mDAAmE;AACnE,oCAAuD;AACvD,kCAAyC;AACzC,qCAIkB;AAElB,qCAAkC;AAGlC,gDAAsE;AAEtE,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AACjC,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,WAAW,GAAG,MAAM,CAAC,YAAY,CAAC,CAAC;AACzC,gBAAgB;AAChB,MAAM,kBAAkB,GAAG,MAAM,CAAC,mBAAmB,CAAC,CAAC;AACvD,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,cAAc,GAAG,MAAM,CAAC,eAAe,CAAC,CAAC;AAE/C,MAAM,UAAU,GAAG,MAAM,CAAC;AAC1B,MAAM,gBAAgB,GAAG,YAAY,CAAC;AACtC,MAAM,eAAe,GAAG,wBAAgB,CAAC;IACvC,CAAC,sBAAa,CAAC,EAAE,CAAC,sBAAa,EAAE,UAAU,EAAE,qBAAY,CAAC;IAC1D,CAAC,qBAAY,CAAC,EAAE,CAAC,qBAAY,EAAE,gBAAgB,CAAC;IAChD,CAAC,UAAU,CAAC,EAAE,CAAC,UAAU,EAAE,gBAAgB,EAAE,sBAAa,CAAC;IAC3D,CAAC,gBAAgB,CAAC,EAAE,CAAC,gBAAgB,EAAE,UAAU,EAAE,sBAAa,CAAC;CAClE,CAAC,CAAC;AAEH,MAAM,4BAA4B,GAAG,IAAI,GAAG,CAAC,CAAC,sBAAa,EAAE,qBAAY,EAAE,gBAAgB,CAAC,CAAC,CAAC;AAC9F,SAAS,cAAc,CAAC,OAAgB;IACtC,OAAO,OAAO,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,IAAI,OAAO,CAAC,CAAC,CAAC,KAAK,KAAK,sBAAa,CAAC;AAC/E,CAAC;AAyBD,gBAAgB;AAChB,MAAa,OAAQ,SAAQ,+BAAgC;IAe3D,YAAY,MAAc,EAAE,OAAuB;;QACjD,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,OAAO,CAAC,GAAG,MAAM,CAAC;QACvB,IAAI,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;QAC9B,IAAI,CAAC,kBAAkB,CAAC,GAAG,IAAI,+BAAiB,EAAE,CAAC;QACnD,IAAI,CAAC,kBAAkB,CAAC,CAAC,eAAe,CAAC,QAAQ,CAAC,CAAC;QACnD,IAAI,CAAC,UAAU,CAAC,GAAG,SAAS,CAAC;QAC7B,IAAI,CAAC,CAAC,GAAG;YACP,KAAK,EAAE,qBAAY;SACpB,CAAC;QAEF,IAAI,CAAC,OAAO,GAAG,MAAM,CAAC,WAAW,CAAC,OAAO,CAAC;QAC1C,IAAI,CAAC,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC;YAC3B,gBAAgB,EAAE,MAAA,OAAO,CAAC,gBAAgB,mCAAI,KAAK;YACnD,oBAAoB,EAAE,MAAA,OAAO,CAAC,oBAAoB,mCAAI,KAAK;YAC3D,uBAAuB,EAAE,MAAA,OAAO,CAAC,uBAAuB,mCAAI,GAAG;SAChE,CAAC,CAAC;QAEH,MAAM,iBAAiB,GAAG,IAAI,CAAC,kBAAkB,CAAC,CAAC;QACnD,iGAAiG;QACjG,MAAM,cAAc,GAAG,MAAM,CAAC,MAAM,CAClC;YACE,EAAE,EAAE,WAAoB;YACxB,UAAU,EAAE,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,UAAU;YACpC,cAAc,EAAE,uBAAU;YAC1B,iBAAiB;YACjB,WAAW,EAAE,MAAM,CAAC,WAAW,CAAC,WAAW;SAC5C,EACD,OAAO;QACP,mCAAmC;QACnC;YACE,GAAG,EAAE,KAAK;YACV,YAAY,EAAE,IAAI;YAClB,aAAa,EAAE,IAAI;YACnB,cAAc,EAAE,IAAI;SACrB,CACF,CAAC;QAEF,kDAAkD;QAClD,OAAO,cAAc,CAAC,WAAW,CAAC;QAClC,IAAI,cAAc,CAAC,aAAa,EAAE;YAChC,OAAO,cAAc,CAAC,aAAa,CAAC;SACrC;QAED,IAAI,CAAC,cAAc,GAAG,MAAM,CAAC,MAAM,CAAC,cAAc,CAAC,CAAC;IACtD,CAAC;IAED,OAAO;QACL,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACjC,OAAO;SAC
R;QAED,QAAQ;QACR,MAAM,oBAAoB,GAAG,IAAI,CAAC,OAAO,CAAC,oBAAoB,CAAC;QAC/D,MAAM,uBAAuB,GAAG,IAAI,CAAC,OAAO,CAAC,uBAAuB,CAAC;QACrE,IAAI,CAAC,UAAU,CAAC,GAAG,sCAA8B,CAAC,aAAa,CAAC,IAAI,CAAC,EAAE;YACrE,QAAQ,EAAE,oBAAoB;YAC9B,WAAW,EAAE,uBAAuB;YACpC,SAAS,EAAE,IAAI;SAChB,CAAC,CAAC;IACL,CAAC;IAED,YAAY;;QACV,IAAI,4BAA4B,CAAC,GAAG,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,CAAC,EAAE;YAClD,OAAO;SACR;QAED,MAAA,IAAI,CAAC,UAAU,CAAC,0CAAE,IAAI,EAAE,CAAC;IAC3B,CAAC;IAED,KAAK;QACH,MAAM,eAAe,GAAG,IAAI,CAAC,OAAO,CAAC,CAAC,WAAW,CAAC,eAAe,CAAC;QAClE,IAAI,cAAc,CAAC,IAAI,CAAC,IAAI,eAAe,IAAI,IAAI,EAAE;YACnD,OAAO;SACR;QAED,eAAe,CAAC,IAAI,EAAE,sBAAa,CAAC,CAAC;QACrC,iBAAiB,CAAC,IAAI,CAAC,CAAC;QAExB,kBAAkB;QAClB,eAAe,CAAC,IAAI,EAAE,UAAU,CAAC,CAAC;QAElC,qBAAqB;QACrB,MAAM,oBAAoB,GAAG,IAAI,CAAC,OAAO,CAAC,oBAAoB,CAAC;QAC/D,MAAM,uBAAuB,GAAG,IAAI,CAAC,OAAO,CAAC,uBAAuB,CAAC;QACrE,IAAI,CAAC,UAAU,CAAC,GAAG,sCAA8B,CAAC,aAAa,CAAC,IAAI,CAAC,EAAE;YACrE,QAAQ,EAAE,oBAAoB;YAC9B,WAAW,EAAE,uBAAuB;SACrC,CAAC,CAAC;IACL,CAAC;IAED,KAAK;QACH,IAAI,cAAc,CAAC,IAAI,CAAC,EAAE;YACxB,OAAO;SACR;QAED,eAAe,CAAC,IAAI,EAAE,sBAAa,CAAC,CAAC;QACrC,iBAAiB,CAAC,IAAI,CAAC,CAAC;QAExB,gBAAgB;QAChB,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QACnB,eAAe,CAAC,IAAI,EAAE,qBAAY,CAAC,CAAC;IACtC,CAAC;CACF;AAvHD,0BAuHC;AAED,SAAS,iBAAiB,CAAC,OAAgB;;IACzC,MAAA,OAAO,CAAC,UAAU,CAAC,0CAAE,IAAI,EAAE,CAAC;IAC5B,OAAO,CAAC,UAAU,CAAC,GAAG,SAAS,CAAC;IAEhC,MAAA,OAAO,CAAC,UAAU,CAAC,0CAAE,KAAK,EAAE,CAAC;IAC7B,OAAO,CAAC,UAAU,CAAC,GAAG,SAAS,CAAC;IAEhC,OAAO,CAAC,kBAAkB,CAAC,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;IAE3C,MAAA,OAAO,CAAC,WAAW,CAAC,0CAAE,OAAO,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;IAC/C,OAAO,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;AACnC,CAAC;AAED,SAAS,WAAW,CAAC,OAAgB,EAAE,QAA4B;IACjE,IAAI,KAAK,GAAG,WAAG,EAAE,CAAC;IAClB,OAAO,CAAC,IAAI,CAAC,eAAM,CAAC,wBAAwB,EAAE,IAAI,oCAA2B,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC,CAAC;IAEhG,SAAS,cAAc,CAAC,GAAa;;QACnC,MAAA,OAAO,CAAC,WAAW,CAAC,0CAAE,OAAO,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;QAC/C,OAAO,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;QAEjC,OAAO,CAAC,IAAI,CACV,eAAM,CAAC,uBAAuB,EAC9B,IAAI,mCAA0B,CAAC,OAAO,CAAC,OAAO,EAAE,6BAAqB,CAAC,KAAK,CAAC,EAAE,GAAG,CAAC,CACnF,CAAC;QAEF,OAAO,CAAC,IAAI,CAAC,aAAa,EAAE,GAAG,CAAC,CAAC;QACjC,OAAO,CAAC,IAAI,CAAC,qBAAqB,CAAC,CAAC;QACpC,QAAQ,CAAC,GAAG,CAAC,CAAC;IAChB,CAAC;IAED,MAAM,UAAU,GAAG,OAAO,CAAC,WAAW,CAAC,CAAC;IACxC,IAAI,UAAU,IAAI,CAAC,UAAU,CAAC,MAAM,EAAE;QACpC,MAAM,EAAE,SAAS,EAAE,OAAO,EAAE,GAAG,UAAU,CAAC;QAC1C,MAAM,gBAAgB,GAAG,OAAO,CAAC,OAAO,CAAC,gBAAgB,CAAC;QAC1D,MAAM,cAAc,GAAG,OAAO,CAAC,OAAO,CAAC,oBAAoB,CAAC;QAC5D,MAAM,eAAe,GAAG,OAAO,CAAC,OAAO,CAAC,CAAC,WAAW,CAAC,eAAe,CAAC;QACrE,MAAM,WAAW,GAAG,eAAe,IAAI,IAAI,CAAC;QAE5C,MAAM,GAAG,GAAG;YACV,CAAC,CAAA,SAAS,aAAT,SAAS,uBAAT,SAAS,CAAE,OAAO,KAAI,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,UAAU,CAAC,EAAE,IAAI;YAC5D,GAAG,CAAC,WAAW,IAAI,eAAe;gBAChC,CAAC,CAAC,EAAE,cAAc,EAAE,eAAe,EAAE,mBAAmB,CAAC,eAAe,CAAC,EAAE;gBAC3E,CAAC,CAAC,EAAE,CAAC;SACR,CAAC;QAEF,MAAM,OAAO,GAAG,WAAW;YACzB,CAAC,CAAC;gBACE,eAAe,EAAE,gBAAgB,CAAC,CAAC,CAAC,gBAAgB,GAAG,cAAc,CAAC,CAAC,CAAC,CAAC;gBACzE,cAAc,EAAE,IAAI;aACrB;YACH,CAAC,CAAC,EAAE,eAAe,EAAE,gBAAgB,EAAE,CAAC;QAE1C,IAAI,WAAW,IAAI,OAAO,CAAC,UAAU,CAAC,IAAI,IAAI,EAAE;YAC9C,OAAO,CAAC,UAAU,CAAC,GAAG,IAAI,SAAS,CACjC,OAAO,CAAC,kBAAkB,CAAC,EAC3B,MAAM,CAAC,MAAM,CACX,EAAE,oBAAoB,EAAE,OAAO,CAAC,OAAO,CAAC,oBAAoB,EAAE,EAC9D,OAAO,CAAC,cAAc,CACvB,CACF,CAAC;SACH;QAED,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,YAAY,CAAC,EAAE,GAAG,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;;YACnE,IAAI,GAAG,EAAE;gBACP,cAAc,CAAC,GAAG,CAAC,CAAC;gBACpB,OAAO;aACR;YACD,IAAI,mBAAmB,IAAI,QAAQ,EAAE;gBACnC,6CAA6C;gBAC7C,QAAQ,CAAC,QAAQ,GAAG,QAAQ,CAAC,iBAAiB,C
AAC;aAChD;YAED,MAAM,SAAS,GAAG,OAAO,CAAC,UAAU,CAAC,CAAC;YACtC,MAAM,QAAQ,GACZ,WAAW,IAAI,SAAS,CAAC,CAAC,CAAC,SAAS,CAAC,aAAa,CAAC,CAAC,CAAC,6BAAqB,CAAC,KAAK,CAAC,CAAC;YAEpF,OAAO,CAAC,IAAI,CACV,eAAM,CAAC,0BAA0B,EACjC,IAAI,sCAA6B,CAAC,OAAO,CAAC,OAAO,EAAE,QAAQ,EAAE,QAAQ,CAAC,CACvE,CAAC;YAEF,qFAAqF;YACrF,+EAA+E;YAC/E,IAAI,WAAW,IAAI,QAAQ,CAAC,eAAe,EAAE;gBAC3C,OAAO,CAAC,IAAI,CACV,eAAM,CAAC,wBAAwB,EAC/B,IAAI,oCAA2B,CAAC,OAAO,CAAC,OAAO,CAAC,CACjD,CAAC;gBACF,KAAK,GAAG,WAAG,EAAE,CAAC;aACf;iBAAM;gBACL,MAAA,OAAO,CAAC,UAAU,CAAC,0CAAE,KAAK,EAAE,CAAC;gBAC7B,OAAO,CAAC,UAAU,CAAC,GAAG,SAAS,CAAC;gBAEhC,QAAQ,CAAC,SAAS,EAAE,QAAQ,CAAC,CAAC;aAC/B;QACH,CAAC,CAAC,CAAC;QAEH,OAAO;KACR;IAED,yCAAyC;IACzC,iBAAO,CAAC,OAAO,CAAC,cAAc,EAAE,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE;QAC5C,IAAI,GAAG,EAAE;YACP,OAAO,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;YAEjC,sEAAsE;YACtE,IAAI,CAAC,CAAC,GAAG,YAAY,yBAAiB,CAAC,EAAE;gBACvC,OAAO,CAAC,IAAI,CAAC,qBAAqB,CAAC,CAAC;aACrC;YAED,cAAc,CAAC,GAAG,CAAC,CAAC;YACpB,OAAO;SACR;QAED,IAAI,IAAI,EAAE;YACR,IAAI,cAAc,CAAC,OAAO,CAAC,EAAE;gBAC3B,IAAI,CAAC,OAAO,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;gBAC9B,OAAO;aACR;YAED,OAAO,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC;YAC5B,OAAO,CAAC,IAAI,CACV,eAAM,CAAC,0BAA0B,EACjC,IAAI,sCAA6B,CAC/B,OAAO,CAAC,OAAO,EACf,6BAAqB,CAAC,KAAK,CAAC,EAC5B,IAAI,CAAC,QAAQ,CACd,CACF,CAAC;YAEF,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,QAAQ,CAAC,CAAC;SACpC;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AAED,SAAS,aAAa,CAAC,OAAgB;IACrC,OAAO,CAAC,QAAkB,EAAE,EAAE;QAC5B,eAAe,CAAC,OAAO,EAAE,gBAAgB,CAAC,CAAC;QAC3C,SAAS,IAAI;YACX,IAAI,CAAC,cAAc,CAAC,OAAO,CAAC,EAAE;gBAC5B,eAAe,CAAC,OAAO,EAAE,UAAU,CAAC,CAAC;aACtC;YAED,QAAQ,EAAE,CAAC;QACb,CAAC;QAED,WAAW,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,QAAQ,EAAE,EAAE;YACrC,IAAI,GAAG,EAAE;gBACP,8DAA8D;gBAC9D,IAAI,OAAO,CAAC,OAAO,CAAC,CAAC,WAAW,CAAC,IAAI,KAAK,mBAAU,CAAC,OAAO,EAAE;oBAC5D,OAAO,CAAC,IAAI,CAAC,aAAa,EAAE,GAAG,CAAC,CAAC;oBACjC,OAAO,IAAI,EAAE,CAAC;iBACf;aACF;YAED,mFAAmF;YACnF,IAAI,QAAQ,IAAI,QAAQ,CAAC,eAAe,EAAE;gBACxC,UAAU,CAAC,GAAG,EAAE;;oBACd,IAAI,CAAC,cAAc,CAAC,OAAO,CAAC,EAAE;wBAC5B,MAAA,OAAO,CAAC,UAAU,CAAC,0CAAE,IAAI,EAAE,CAAC;qBAC7B;gBACH,CAAC,EAAE,CAAC,CAAC,CAAC;aACP;YAED,IAAI,EAAE,CAAC;QACT,CAAC,CAAC,CAAC;IACL,CAAC,CAAC;AACJ,CAAC;AAED,SAAS,mBAAmB,CAAC,EAAmB;IAC9C,OAAO;QACL,SAAS,EAAE,EAAE,CAAC,SAAS;QACvB,6FAA6F;QAC7F,OAAO,EAAE,WAAI,CAAC,MAAM,CAAC,EAAE,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,OAAO,CAAC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,EAAE,CAAC,OAAO,CAAC;KAC5E,CAAC;AACJ,CAAC;AAOD,gBAAgB;AAChB,MAAa,SAAS;IAWpB,YAAY,iBAAoC,EAAE,OAAyB;QACzE,IAAI,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;QAC9B,IAAI,CAAC,kBAAkB,CAAC,GAAG,iBAAiB,CAAC;QAC7C,IAAI,CAAC,cAAc,CAAC,GAAG,CAAC,CAAC;QACzB,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QAEpB,MAAM,oBAAoB,GAAG,OAAO,CAAC,oBAAoB,CAAC;QAC1D,IAAI,CAAC,UAAU,CAAC,GAAG,UAAU,CAAC,GAAG,EAAE,CAAC,oBAAoB,CAAC,IAAI,EAAE,OAAO,CAAC,EAAE,oBAAoB,CAAC,CAAC;IACjG,CAAC;IAED,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,cAAc,CAAC,CAAC;IAC9B,CAAC;IAED,KAAK;;QACH,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;QACnB,YAAY,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC;QAE/B,MAAA,IAAI,CAAC,WAAW,CAAC,0CAAE,OAAO,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;QAC5C,IAAI,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;IAChC,CAAC;CACF;AAhCD,8BAgCC;AAED,SAAS,oBAAoB,CAAC,SAAoB,EAAE,OAAyB;IAC3E,MAAM,KAAK,GAAG,WAAG,EAAE,CAAC;IACpB,OAAO,CAAC,iBAAiB,GAAG,SAAS,CAAC,kBAAkB,CAAC,CAAC;IAC1D,MAAM,oBAAoB,GAAG,OAAO,CAAC,oBAAoB,CAAC;IAE1D,IAAI,SAAS,CAAC,MAAM,EAAE;QACpB,OAAO;KACR;IAED,SAAS,oBAAoB,CAAC,IAAiB;QAC7C,IAAI,SAAS,CAAC,MAAM,EAAE;YACpB,IAAI,aAAJ,IAAI,uBAAJ,IAAI,CAAE,OAAO,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;YAC/B,OAAO;SACR;QAED,IAAI,SAAS,CAAC,WAAW,CAAC,IAAI,IAAI,EAAE;YAClC,SAAS,CAAC,WAAW,CAAC,GAAG,IAAI,CAAC;SAC/B;QAED,SAAS
,CAAC,cAAc,CAAC,GAAG,6BAAqB,CAAC,KAAK,CAAC,CAAC;QACzD,SAAS,CAAC,UAAU,CAAC,GAAG,UAAU,CAChC,GAAG,EAAE,CAAC,oBAAoB,CAAC,SAAS,EAAE,OAAO,CAAC,EAC9C,oBAAoB,CACrB,CAAC;IACJ,CAAC;IAED,MAAM,UAAU,GAAG,SAAS,CAAC,WAAW,CAAC,CAAC;IAC1C,IAAI,UAAU,IAAI,IAAI,EAAE;QACtB,iBAAO,CAAC,OAAO,EAAE,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE;YAC7B,IAAI,GAAG,EAAE;gBACP,SAAS,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;gBACnC,SAAS,CAAC,cAAc,CAAC,GAAG,CAAC,CAAC;gBAC9B,OAAO;aACR;YAED,oBAAoB,CAAC,IAAI,CAAC,CAAC;QAC7B,CAAC,CAAC,CAAC;QAEH,OAAO;KACR;IAED,UAAU,CAAC,OAAO,CAAC,UAAE,CAAC,YAAY,CAAC,EAAE,EAAE,QAAQ,EAAE,CAAC,EAAE,EAAE,SAAS,EAAE,GAAG,CAAC,EAAE;QACrE,IAAI,GAAG,EAAE;YACP,SAAS,CAAC,WAAW,CAAC,GAAG,SAAS,CAAC;YACnC,SAAS,CAAC,cAAc,CAAC,GAAG,CAAC,CAAC;YAC9B,OAAO;SACR;QAED,oBAAoB,EAAE,CAAC;IACzB,CAAC,CAAC,CAAC;AACL,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/server.js b/node_modules/mongodb/lib/sdam/server.js new file mode 100644 index 000000000..0b0273ffb --- /dev/null +++ b/node_modules/mongodb/lib/sdam/server.js @@ -0,0 +1,366 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.HEARTBEAT_EVENTS = exports.Server = void 0; +const logger_1 = require("../logger"); +const connection_pool_1 = require("../cmap/connection_pool"); +const server_description_1 = require("./server_description"); +const monitor_1 = require("./monitor"); +const transactions_1 = require("../transactions"); +const utils_1 = require("../utils"); +const common_1 = require("./common"); +const error_1 = require("../error"); +const connection_1 = require("../cmap/connection"); +const mongo_types_1 = require("../mongo_types"); +const utils_2 = require("../utils"); +const stateTransition = utils_1.makeStateMachine({ + [common_1.STATE_CLOSED]: [common_1.STATE_CLOSED, common_1.STATE_CONNECTING], + [common_1.STATE_CONNECTING]: [common_1.STATE_CONNECTING, common_1.STATE_CLOSING, common_1.STATE_CONNECTED, common_1.STATE_CLOSED], + [common_1.STATE_CONNECTED]: [common_1.STATE_CONNECTED, common_1.STATE_CLOSING, common_1.STATE_CLOSED], + [common_1.STATE_CLOSING]: [common_1.STATE_CLOSING, common_1.STATE_CLOSED] +}); +/** @internal */ +const kMonitor = Symbol('monitor'); +/** @internal */ +class Server extends mongo_types_1.TypedEventEmitter { + /** + * Create a server + */ + constructor(topology, description, options) { + super(); + this.serverApi = options.serverApi; + const poolOptions = { hostAddress: description.hostAddress, ...options }; + this.s = { + description, + options, + logger: new logger_1.Logger('Server'), + state: common_1.STATE_CLOSED, + topology, + pool: new connection_pool_1.ConnectionPool(poolOptions) + }; + for (const event of [...connection_pool_1.CMAP_EVENTS, ...connection_1.APM_EVENTS]) { + this.s.pool.on(event, (e) => this.emit(event, e)); + } + this.s.pool.on(connection_1.Connection.CLUSTER_TIME_RECEIVED, (clusterTime) => { + this.clusterTime = clusterTime; + }); + // monitoring is disabled in load balancing mode + if (this.loadBalanced) + return; + // create the monitor + this[kMonitor] = new monitor_1.Monitor(this, this.s.options); + for (const event of exports.HEARTBEAT_EVENTS) { + this[kMonitor].on(event, (e) => this.emit(event, e)); + } + this[kMonitor].on('resetConnectionPool', () => { + this.s.pool.clear(); + }); + this[kMonitor].on('resetServer', (error) => markServerUnknown(this, error)); + this[kMonitor].on(Server.SERVER_HEARTBEAT_SUCCEEDED, (event) => { + this.emit(Server.DESCRIPTION_RECEIVED, new server_description_1.ServerDescription(this.description.hostAddress, event.reply, { + 
roundTripTime: calculateRoundTripTime(this.description.roundTripTime, event.duration) + })); + if (this.s.state === common_1.STATE_CONNECTING) { + stateTransition(this, common_1.STATE_CONNECTED); + this.emit(Server.CONNECT, this); + } + }); + } + get description() { + return this.s.description; + } + get name() { + return this.s.description.address; + } + get autoEncrypter() { + if (this.s.options && this.s.options.autoEncrypter) { + return this.s.options.autoEncrypter; + } + } + get loadBalanced() { + return this.s.topology.description.type === common_1.TopologyType.LoadBalanced; + } + /** + * Initiate server connect + */ + connect() { + if (this.s.state !== common_1.STATE_CLOSED) { + return; + } + stateTransition(this, common_1.STATE_CONNECTING); + // If in load balancer mode we automatically set the server to + // a load balancer. It never transitions out of this state and + // has no monitor. + if (!this.loadBalanced) { + this[kMonitor].connect(); + } + else { + stateTransition(this, common_1.STATE_CONNECTED); + this.emit(Server.CONNECT, this); + } + } + /** Destroy the server connection */ + destroy(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + options = Object.assign({}, { force: false }, options); + if (this.s.state === common_1.STATE_CLOSED) { + if (typeof callback === 'function') { + callback(); + } + return; + } + stateTransition(this, common_1.STATE_CLOSING); + if (!this.loadBalanced) { + this[kMonitor].close(); + } + this.s.pool.close(options, err => { + stateTransition(this, common_1.STATE_CLOSED); + this.emit('closed'); + if (typeof callback === 'function') { + callback(err); + } + }); + } + /** + * Immediately schedule monitoring of this server. If there already an attempt being made + * this will be a no-op. + */ + requestCheck() { + if (!this.loadBalanced) { + this[kMonitor].requestCheck(); + } + } + command(ns, cmd, options, callback) { + if (typeof options === 'function') { + (callback = options), (options = {}), (options = options !== null && options !== void 0 ? options : {}); + } + if (callback == null) { + throw new error_1.MongoInvalidArgumentError('Callback must be provided'); + } + if (ns.db == null || typeof ns === 'string') { + throw new error_1.MongoInvalidArgumentError('Namespace must not be a string'); + } + if (this.s.state === common_1.STATE_CLOSING || this.s.state === common_1.STATE_CLOSED) { + callback(new error_1.MongoServerClosedError()); + return; + } + // Clone the options + const finalOptions = Object.assign({}, options, { wireProtocolCommand: false }); + // error if collation not supported + if (utils_1.collationNotSupported(this, cmd)) { + callback(new error_1.MongoCompatibilityError(`Server ${this.name} does not support collation`)); + return; + } + const session = finalOptions.session; + const conn = session === null || session === void 0 ? void 0 : session.pinnedConnection; + // NOTE: This is a hack! We can't retrieve the connections used for executing an operation + // (and prevent them from being checked back in) at the point of operation execution. 
+ // This should be considered as part of the work for NODE-2882 + if (this.loadBalanced && session && conn == null && isPinnableCommand(cmd, session)) { + this.s.pool.checkOut((err, checkedOut) => { + if (err || checkedOut == null) { + if (callback) + return callback(err); + return; + } + session.pin(checkedOut); + this.command(ns, cmd, finalOptions, callback); + }); + return; + } + this.s.pool.withConnection(conn, (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + conn.command(ns, cmd, finalOptions, makeOperationHandler(this, conn, cmd, finalOptions, cb)); + }, callback); + } + /** + * Execute a query against the server + * @internal + */ + query(ns, cmd, options, callback) { + if (this.s.state === common_1.STATE_CLOSING || this.s.state === common_1.STATE_CLOSED) { + callback(new error_1.MongoServerClosedError()); + return; + } + this.s.pool.withConnection(undefined, (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + conn.query(ns, cmd, options, makeOperationHandler(this, conn, cmd, options, cb)); + }, callback); + } + /** + * Execute a `getMore` against the server + * @internal + */ + getMore(ns, cursorId, options, callback) { + var _a; + if (this.s.state === common_1.STATE_CLOSING || this.s.state === common_1.STATE_CLOSED) { + callback(new error_1.MongoServerClosedError()); + return; + } + this.s.pool.withConnection((_a = options.session) === null || _a === void 0 ? void 0 : _a.pinnedConnection, (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + conn.getMore(ns, cursorId, options, makeOperationHandler(this, conn, {}, options, cb)); + }, callback); + } + /** + * Execute a `killCursors` command against the server + * @internal + */ + killCursors(ns, cursorIds, options, callback) { + var _a; + if (this.s.state === common_1.STATE_CLOSING || this.s.state === common_1.STATE_CLOSED) { + if (typeof callback === 'function') { + callback(new error_1.MongoServerClosedError()); + } + return; + } + this.s.pool.withConnection((_a = options.session) === null || _a === void 0 ? void 0 : _a.pinnedConnection, (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + conn.killCursors(ns, cursorIds, options, makeOperationHandler(this, conn, {}, undefined, cb)); + }, callback); + } +} +exports.Server = Server; +/** @event */ +Server.SERVER_HEARTBEAT_STARTED = 'serverHeartbeatStarted'; +/** @event */ +Server.SERVER_HEARTBEAT_SUCCEEDED = 'serverHeartbeatSucceeded'; +/** @event */ +Server.SERVER_HEARTBEAT_FAILED = 'serverHeartbeatFailed'; +/** @event */ +Server.CONNECT = 'connect'; +/** @event */ +Server.DESCRIPTION_RECEIVED = 'descriptionReceived'; +/** @event */ +Server.CLOSED = 'closed'; +/** @event */ +Server.ENDED = 'ended'; +exports.HEARTBEAT_EVENTS = [ + Server.SERVER_HEARTBEAT_STARTED, + Server.SERVER_HEARTBEAT_SUCCEEDED, + Server.SERVER_HEARTBEAT_FAILED +]; +Object.defineProperty(Server.prototype, 'clusterTime', { + get() { + return this.s.topology.clusterTime; + }, + set(clusterTime) { + this.s.topology.clusterTime = clusterTime; + } +}); +function calculateRoundTripTime(oldRtt, duration) { + if (oldRtt === -1) { + return duration; + } + const alpha = 0.2; + return alpha * duration + (1 - alpha) * oldRtt; +} +function markServerUnknown(server, error) { + // Load balancer servers can never be marked unknown. 
+ if (server.loadBalanced) { + return; + } + if (error instanceof error_1.MongoNetworkError && !(error instanceof error_1.MongoNetworkTimeoutError)) { + server[kMonitor].reset(); + } + server.emit(Server.DESCRIPTION_RECEIVED, new server_description_1.ServerDescription(server.description.hostAddress, undefined, { + error, + topologyVersion: error && error.topologyVersion ? error.topologyVersion : server.description.topologyVersion + })); +} +function isPinnableCommand(cmd, session) { + if (session) { + return (session.inTransaction() || + 'aggregate' in cmd || + 'find' in cmd || + 'getMore' in cmd || + 'listCollections' in cmd || + 'listIndexes' in cmd); + } + return false; +} +function connectionIsStale(pool, connection) { + if (connection.serviceId) { + return (connection.generation !== pool.serviceGenerations.get(connection.serviceId.toHexString())); + } + return connection.generation !== pool.generation; +} +function shouldHandleStateChangeError(server, err) { + const etv = err.topologyVersion; + const stv = server.description.topologyVersion; + return server_description_1.compareTopologyVersion(stv, etv) < 0; +} +function inActiveTransaction(session, cmd) { + return session && session.inTransaction() && !transactions_1.isTransactionCommand(cmd); +} +/** this checks the retryWrites option passed down from the client options, it + * does not check if the server supports retryable writes */ +function isRetryableWritesEnabled(topology) { + return topology.s.options.retryWrites !== false; +} +function makeOperationHandler(server, connection, cmd, options, callback) { + const session = options === null || options === void 0 ? void 0 : options.session; + return function handleOperationResult(err, result) { + if (err && !connectionIsStale(server.s.pool, connection)) { + if (err instanceof error_1.MongoNetworkError) { + if (session && !session.hasEnded && session.serverSession) { + session.serverSession.isDirty = true; + } + // inActiveTransaction check handles commit and abort. + if (inActiveTransaction(session, cmd) && !err.hasErrorLabel('TransientTransactionError')) { + err.addErrorLabel('TransientTransactionError'); + } + if ((isRetryableWritesEnabled(server.s.topology) || transactions_1.isTransactionCommand(cmd)) && + utils_2.supportsRetryableWrites(server) && + !inActiveTransaction(session, cmd)) { + err.addErrorLabel('RetryableWriteError'); + } + if (!(err instanceof error_1.MongoNetworkTimeoutError) || error_1.isNetworkErrorBeforeHandshake(err)) { + // In load balanced mode we never mark the server as unknown and always + // clear for the specific service id. 
+ server.s.pool.clear(connection.serviceId); + if (!server.loadBalanced) { + markServerUnknown(server, err); + } + } + } + else { + // if pre-4.4 server, then add error label if its a retryable write error + if ((isRetryableWritesEnabled(server.s.topology) || transactions_1.isTransactionCommand(cmd)) && + utils_1.maxWireVersion(server) < 9 && + error_1.isRetryableWriteError(err) && + !inActiveTransaction(session, cmd)) { + err.addErrorLabel('RetryableWriteError'); + } + if (error_1.isSDAMUnrecoverableError(err)) { + if (shouldHandleStateChangeError(server, err)) { + if (utils_1.maxWireVersion(server) <= 7 || error_1.isNodeShuttingDownError(err)) { + server.s.pool.clear(connection.serviceId); + } + if (!server.loadBalanced) { + markServerUnknown(server, err); + process.nextTick(() => server.requestCheck()); + } + } + } + } + if (session && session.isPinned && err.hasErrorLabel('TransientTransactionError')) { + session.unpin({ force: true }); + } + } + callback(err, result); + }; +} +//# sourceMappingURL=server.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/server.js.map b/node_modules/mongodb/lib/sdam/server.js.map new file mode 100644 index 000000000..fbb97969f --- /dev/null +++ b/node_modules/mongodb/lib/sdam/server.js.map @@ -0,0 +1 @@ +{"version":3,"file":"server.js","sourceRoot":"","sources":["../../src/sdam/server.ts"],"names":[],"mappings":";;;AAAA,sCAAmC;AACnC,6DAKiC;AACjC,6DAAiF;AACjF,uCAAoD;AACpD,kDAAuD;AACvD,oCAQkB;AAClB,qCAOkB;AAClB,oCAWkB;AAClB,mDAO4B;AAW5B,gDAAmD;AACnD,oCAAmD;AAEnD,MAAM,eAAe,GAAG,wBAAgB,CAAC;IACvC,CAAC,qBAAY,CAAC,EAAE,CAAC,qBAAY,EAAE,yBAAgB,CAAC;IAChD,CAAC,yBAAgB,CAAC,EAAE,CAAC,yBAAgB,EAAE,sBAAa,EAAE,wBAAe,EAAE,qBAAY,CAAC;IACpF,CAAC,wBAAe,CAAC,EAAE,CAAC,wBAAe,EAAE,sBAAa,EAAE,qBAAY,CAAC;IACjE,CAAC,sBAAa,CAAC,EAAE,CAAC,sBAAa,EAAE,qBAAY,CAAC;CAC/C,CAAC,CAAC;AAEH,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AAqCnC,gBAAgB;AAChB,MAAa,MAAO,SAAQ,+BAA+B;IAuBzD;;OAEG;IACH,YAAY,QAAkB,EAAE,WAA8B,EAAE,OAAsB;QACpF,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;QAEnC,MAAM,WAAW,GAAG,EAAE,WAAW,EAAE,WAAW,CAAC,WAAW,EAAE,GAAG,OAAO,EAAE,CAAC;QAEzE,IAAI,CAAC,CAAC,GAAG;YACP,WAAW;YACX,OAAO;YACP,MAAM,EAAE,IAAI,eAAM,CAAC,QAAQ,CAAC;YAC5B,KAAK,EAAE,qBAAY;YACnB,QAAQ;YACR,IAAI,EAAE,IAAI,gCAAc,CAAC,WAAW,CAAC;SACtC,CAAC;QAEF,KAAK,MAAM,KAAK,IAAI,CAAC,GAAG,6BAAW,EAAE,GAAG,uBAAU,CAAC,EAAE;YACnD,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC,KAAK,EAAE,CAAC,CAAM,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;SACxD;QAED,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC,uBAAU,CAAC,qBAAqB,EAAE,CAAC,WAAwB,EAAE,EAAE;YAC5E,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;QACjC,CAAC,CAAC,CAAC;QAEH,gDAAgD;QAChD,IAAI,IAAI,CAAC,YAAY;YAAE,OAAO;QAE9B,qBAAqB;QACrB,IAAI,CAAC,QAAQ,CAAC,GAAG,IAAI,iBAAO,CAAC,IAAI,EAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;QAEnD,KAAK,MAAM,KAAK,IAAI,wBAAgB,EAAE;YACpC,IAAI,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,KAAK,EAAE,CAAC,CAAM,EAAE,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;SAC3D;QAED,IAAI,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,qBAAqB,EAAE,GAAG,EAAE;YAC5C,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC;QACtB,CAAC,CAAC,CAAC;QAEH,IAAI,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,aAAa,EAAE,CAAC,KAAiB,EAAE,EAAE,CAAC,iBAAiB,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC,CAAC;QACxF,IAAI,CAAC,QAAQ,CAAC,CAAC,EAAE,CAAC,MAAM,CAAC,0BAA0B,EAAE,CAAC,KAAoC,EAAE,EAAE;YAC5F,IAAI,CAAC,IAAI,CACP,MAAM,CAAC,oBAAoB,EAC3B,IAAI,sCAAiB,CAAC,IAAI,CAAC,WAAW,CAAC,WAAW,EAAE,KAAK,CAAC,KAAK,EAAE;gBAC/D,aAAa,EAAE,sBAAsB,CAAC,IAAI,CAAC,WAAW,CAAC,aAAa,EAAE,KAAK,CAAC,QAAQ,CAAC;aACtF,CAAC,CACH,CAAC;YAEF,IAAI,IAAI,CAAC,
CAAC,CAAC,KAAK,KAAK,yBAAgB,EAAE;gBACrC,eAAe,CAAC,IAAI,EAAE,wBAAe,CAAC,CAAC;gBACvC,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;aACjC;QACH,CAAC,CAAC,CAAC;IACL,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,IAAI,IAAI;QACN,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,OAAO,CAAC;IACpC,CAAC;IAED,IAAI,aAAa;QACf,IAAI,IAAI,CAAC,CAAC,CAAC,OAAO,IAAI,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,EAAE;YAClD,OAAO,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,aAAa,CAAC;SACrC;IACH,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,WAAW,CAAC,IAAI,KAAK,qBAAY,CAAC,YAAY,CAAC;IACxE,CAAC;IAED;;OAEG;IACH,OAAO;QACL,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACjC,OAAO;SACR;QAED,eAAe,CAAC,IAAI,EAAE,yBAAgB,CAAC,CAAC;QAExC,8DAA8D;QAC9D,8DAA8D;QAC9D,kBAAkB;QAClB,IAAI,CAAC,IAAI,CAAC,YAAY,EAAE;YACtB,IAAI,CAAC,QAAQ,CAAC,CAAC,OAAO,EAAE,CAAC;SAC1B;aAAM;YACL,eAAe,CAAC,IAAI,EAAE,wBAAe,CAAC,CAAC;YACvC,IAAI,CAAC,IAAI,CAAC,MAAM,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;SACjC;IACH,CAAC;IAED,oCAAoC;IACpC,OAAO,CAAC,OAAwB,EAAE,QAAmB;QACnD,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,EAAE,KAAK,EAAE,KAAK,EAAE,EAAE,OAAO,CAAC,CAAC;QAEvD,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACjC,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,EAAE,CAAC;aACZ;YAED,OAAO;SACR;QAED,eAAe,CAAC,IAAI,EAAE,sBAAa,CAAC,CAAC;QAErC,IAAI,CAAC,IAAI,CAAC,YAAY,EAAE;YACtB,IAAI,CAAC,QAAQ,CAAC,CAAC,KAAK,EAAE,CAAC;SACxB;QAED,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,OAAO,EAAE,GAAG,CAAC,EAAE;YAC/B,eAAe,CAAC,IAAI,EAAE,qBAAY,CAAC,CAAC;YACpC,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;YACpB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,CAAC,GAAG,CAAC,CAAC;aACf;QACH,CAAC,CAAC,CAAC;IACL,CAAC;IAED;;;OAGG;IACH,YAAY;QACV,IAAI,CAAC,IAAI,CAAC,YAAY,EAAE;YACtB,IAAI,CAAC,QAAQ,CAAC,CAAC,YAAY,EAAE,CAAC;SAC/B;IACH,CAAC;IAcD,OAAO,CACL,EAAoB,EACpB,GAAa,EACb,OAA6C,EAC7C,QAA6B;QAE7B,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,EAAE,CAAC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC,CAAC;SACjE;QAED,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,MAAM,IAAI,iCAAyB,CAAC,2BAA2B,CAAC,CAAC;SAClE;QAED,IAAI,EAAE,CAAC,EAAE,IAAI,IAAI,IAAI,OAAO,EAAE,KAAK,QAAQ,EAAE;YAC3C,MAAM,IAAI,iCAAyB,CAAC,gCAAgC,CAAC,CAAC;SACvE;QAED,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,sBAAa,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACnE,QAAQ,CAAC,IAAI,8BAAsB,EAAE,CAAC,CAAC;YACvC,OAAO;SACR;QAED,oBAAoB;QACpB,MAAM,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,EAAE,EAAE,mBAAmB,EAAE,KAAK,EAAE,CAAC,CAAC;QAEhF,mCAAmC;QACnC,IAAI,6BAAqB,CAAC,IAAI,EAAE,GAAG,CAAC,EAAE;YACpC,QAAQ,CAAC,IAAI,+BAAuB,CAAC,UAAU,IAAI,CAAC,IAAI,6BAA6B,CAAC,CAAC,CAAC;YACxF,OAAO;SACR;QAED,MAAM,OAAO,GAAG,YAAY,CAAC,OAAO,CAAC;QACrC,MAAM,IAAI,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,gBAAgB,CAAC;QAEvC,0FAA0F;QAC1F,2FAA2F;QAC3F,oEAAoE;QACpE,IAAI,IAAI,CAAC,YAAY,IAAI,OAAO,IAAI,IAAI,IAAI,IAAI,IAAI,iBAAiB,CAAC,GAAG,EAAE,OAAO,CAAC,EAAE;YACnF,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,EAAE,UAAU,EAAE,EAAE;gBACvC,IAAI,GAAG,IAAI,UAAU,IAAI,IAAI,EAAE;oBAC7B,IAAI,QAAQ;wBAAE,OAAO,QAAQ,CAAC,GAAG,CAAC,CAAC;oBACnC,OAAO;iBACR;gBAED,OAAO,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;gBACxB,IAAI,CAAC,OAAO,CAAC,EAAE,EAAE,GAAG,EAAE,YAAY,EAAE,QAA8B,CAAC,CAAC;YACtE,CAAC,CAAC,CAAC;YAEH,OAAO;SACR;QAED,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,cAAc,CACxB,IAAI,EACJ,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE,EAAE,EAAE;YAChB,IAAI,GAAG,IAAI,CAAC,IAAI,EAAE;gBAChB,iBAAiB,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC;gBAC7B,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;aAChB;YAED,IAAI,CAAC,OAAO,CACV,EAAE,EACF,GAAG,EACH,YAAY,EACZ,oBAAoB,CAAC,IAAI,EAAE,IAAI,EAAE,GAAG,E
AAE,YAAY,EAAE,EAAE,CAAuB,CAC9E,CAAC;QACJ,CAAC,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;OAGG;IACH,KAAK,CAAC,EAAoB,EAAE,GAAa,EAAE,OAAqB,EAAE,QAAkB;QAClF,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,sBAAa,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACnE,QAAQ,CAAC,IAAI,8BAAsB,EAAE,CAAC,CAAC;YACvC,OAAO;SACR;QAED,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,cAAc,CACxB,SAAS,EACT,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE,EAAE,EAAE;YAChB,IAAI,GAAG,IAAI,CAAC,IAAI,EAAE;gBAChB,iBAAiB,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC;gBAC7B,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;aAChB;YAED,IAAI,CAAC,KAAK,CACR,EAAE,EACF,GAAG,EACH,OAAO,EACP,oBAAoB,CAAC,IAAI,EAAE,IAAI,EAAE,GAAG,EAAE,OAAO,EAAE,EAAE,CAAa,CAC/D,CAAC;QACJ,CAAC,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;OAGG;IACH,OAAO,CACL,EAAoB,EACpB,QAAc,EACd,OAAuB,EACvB,QAA4B;;QAE5B,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,sBAAa,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACnE,QAAQ,CAAC,IAAI,8BAAsB,EAAE,CAAC,CAAC;YACvC,OAAO;SACR;QAED,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,cAAc,CACxB,MAAA,OAAO,CAAC,OAAO,0CAAE,gBAAgB,EACjC,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE,EAAE,EAAE;YAChB,IAAI,GAAG,IAAI,CAAC,IAAI,EAAE;gBAChB,iBAAiB,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC;gBAC7B,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;aAChB;YAED,IAAI,CAAC,OAAO,CACV,EAAE,EACF,QAAQ,EACR,OAAO,EACP,oBAAoB,CAAC,IAAI,EAAE,IAAI,EAAE,EAAE,EAAE,OAAO,EAAE,EAAE,CAAa,CAC9D,CAAC;QACJ,CAAC,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;IAED;;;OAGG;IACH,WAAW,CACT,EAAoB,EACpB,SAAiB,EACjB,OAAuB,EACvB,QAAmB;;QAEnB,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,sBAAa,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;YACnE,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,CAAC,IAAI,8BAAsB,EAAE,CAAC,CAAC;aACxC;YAED,OAAO;SACR;QAED,IAAI,CAAC,CAAC,CAAC,IAAI,CAAC,cAAc,CACxB,MAAA,OAAO,CAAC,OAAO,0CAAE,gBAAgB,EACjC,CAAC,GAAG,EAAE,IAAI,EAAE,EAAE,EAAE,EAAE;YAChB,IAAI,GAAG,IAAI,CAAC,IAAI,EAAE;gBAChB,iBAAiB,CAAC,IAAI,EAAE,GAAG,CAAC,CAAC;gBAC7B,OAAO,EAAE,CAAC,GAAG,CAAC,CAAC;aAChB;YAED,IAAI,CAAC,WAAW,CACd,EAAE,EACF,SAAS,EACT,OAAO,EACP,oBAAoB,CAAC,IAAI,EAAE,IAAI,EAAE,EAAE,EAAE,SAAS,EAAE,EAAE,CAAa,CAChE,CAAC;QACJ,CAAC,EACD,QAAQ,CACT,CAAC;IACJ,CAAC;;AAlVH,wBAmVC;AA3UC,aAAa;AACG,+BAAwB,GAAG,wBAAiC,CAAC;AAC7E,aAAa;AACG,iCAA0B,GAAG,0BAAmC,CAAC;AACjF,aAAa;AACG,8BAAuB,GAAG,uBAAgC,CAAC;AAC3E,aAAa;AACG,cAAO,GAAG,SAAkB,CAAC;AAC7C,aAAa;AACG,2BAAoB,GAAG,qBAA8B,CAAC;AACtE,aAAa;AACG,aAAM,GAAG,QAAiB,CAAC;AAC3C,aAAa;AACG,YAAK,GAAG,OAAgB,CAAC;AAgU9B,QAAA,gBAAgB,GAAG;IAC9B,MAAM,CAAC,wBAAwB;IAC/B,MAAM,CAAC,0BAA0B;IACjC,MAAM,CAAC,uBAAuB;CAC/B,CAAC;AAEF,MAAM,CAAC,cAAc,CAAC,MAAM,CAAC,SAAS,EAAE,aAAa,EAAE;IACrD,GAAG;QACD,OAAO,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,WAAW,CAAC;IACrC,CAAC;IACD,GAAG,CAAC,WAAwB;QAC1B,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,WAAW,GAAG,WAAW,CAAC;IAC5C,CAAC;CACF,CAAC,CAAC;AAEH,SAAS,sBAAsB,CAAC,MAAc,EAAE,QAAgB;IAC9D,IAAI,MAAM,KAAK,CAAC,CAAC,EAAE;QACjB,OAAO,QAAQ,CAAC;KACjB;IAED,MAAM,KAAK,GAAG,GAAG,CAAC;IAClB,OAAO,KAAK,GAAG,QAAQ,GAAG,CAAC,CAAC,GAAG,KAAK,CAAC,GAAG,MAAM,CAAC;AACjD,CAAC;AAED,SAAS,iBAAiB,CAAC,MAAc,EAAE,KAAkB;IAC3D,qDAAqD;IACrD,IAAI,MAAM,CAAC,YAAY,EAAE;QACvB,OAAO;KACR;IAED,IAAI,KAAK,YAAY,yBAAiB,IAAI,CAAC,CAAC,KAAK,YAAY,gCAAwB,CAAC,EAAE;QACtF,MAAM,CAAC,QAAQ,CAAC,CAAC,KAAK,EAAE,CAAC;KAC1B;IAED,MAAM,CAAC,IAAI,CACT,MAAM,CAAC,oBAAoB,EAC3B,IAAI,sCAAiB,CAAC,MAAM,CAAC,WAAW,CAAC,WAAW,EAAE,SAAS,EAAE;QAC/D,KAAK;QACL,eAAe,EACb,KAAK,IAAI,KAAK,CAAC,eAAe,CAAC,CAAC,CAAC,KAAK,CAAC,eAAe,CAAC,CAAC,CAAC,MAAM,CAAC,WAAW,CAAC,eAAe;KAC9F,CAAC,CACH,CAAC;AACJ,CAAC;AAED,SAAS,iBAAiB,CAAC,GAAa,EAAE,OAAuB;IAC/D,IAAI,OAAO,EAAE;QACX,OAAO,CACL,OAAO,CAAC,aAAa,EAAE;YACvB,WAAW,IAAI,GAAG;YAClB,MAAM,IAAI,GAAG;YACb,SAAS,IAAI,GAAG;YAChB,iBAAiB,IAAI,GAAG;YACxB,aAAa,IAAI,GAAG,CACrB,CAAC;KACH;IAED,OAAO,KAAK,CAAC;AACf,CAAC;AAED,
SAAS,iBAAiB,CAAC,IAAoB,EAAE,UAAsB;IACrE,IAAI,UAAU,CAAC,SAAS,EAAE;QACxB,OAAO,CACL,UAAU,CAAC,UAAU,KAAK,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,UAAU,CAAC,SAAS,CAAC,WAAW,EAAE,CAAC,CAC1F,CAAC;KACH;IAED,OAAO,UAAU,CAAC,UAAU,KAAK,IAAI,CAAC,UAAU,CAAC;AACnD,CAAC;AAED,SAAS,4BAA4B,CAAC,MAAc,EAAE,GAAe;IACnE,MAAM,GAAG,GAAG,GAAG,CAAC,eAAe,CAAC;IAChC,MAAM,GAAG,GAAG,MAAM,CAAC,WAAW,CAAC,eAAe,CAAC;IAC/C,OAAO,2CAAsB,CAAC,GAAG,EAAE,GAAG,CAAC,GAAG,CAAC,CAAC;AAC9C,CAAC;AAED,SAAS,mBAAmB,CAAC,OAAkC,EAAE,GAAa;IAC5E,OAAO,OAAO,IAAI,OAAO,CAAC,aAAa,EAAE,IAAI,CAAC,mCAAoB,CAAC,GAAG,CAAC,CAAC;AAC1E,CAAC;AAED;4DAC4D;AAC5D,SAAS,wBAAwB,CAAC,QAAkB;IAClD,OAAO,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,WAAW,KAAK,KAAK,CAAC;AAClD,CAAC;AAED,SAAS,oBAAoB,CAC3B,MAAc,EACd,UAAsB,EACtB,GAAa,EACb,OAAoD,EACpD,QAAkB;IAElB,MAAM,OAAO,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,OAAO,CAAC;IACjC,OAAO,SAAS,qBAAqB,CAAC,GAAG,EAAE,MAAM;QAC/C,IAAI,GAAG,IAAI,CAAC,iBAAiB,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE,UAAU,CAAC,EAAE;YACxD,IAAI,GAAG,YAAY,yBAAiB,EAAE;gBACpC,IAAI,OAAO,IAAI,CAAC,OAAO,CAAC,QAAQ,IAAI,OAAO,CAAC,aAAa,EAAE;oBACzD,OAAO,CAAC,aAAa,CAAC,OAAO,GAAG,IAAI,CAAC;iBACtC;gBAED,sDAAsD;gBACtD,IAAI,mBAAmB,CAAC,OAAO,EAAE,GAAG,CAAC,IAAI,CAAC,GAAG,CAAC,aAAa,CAAC,2BAA2B,CAAC,EAAE;oBACxF,GAAG,CAAC,aAAa,CAAC,2BAA2B,CAAC,CAAC;iBAChD;gBAED,IACE,CAAC,wBAAwB,CAAC,MAAM,CAAC,CAAC,CAAC,QAAQ,CAAC,IAAI,mCAAoB,CAAC,GAAG,CAAC,CAAC;oBAC1E,+BAAuB,CAAC,MAAM,CAAC;oBAC/B,CAAC,mBAAmB,CAAC,OAAO,EAAE,GAAG,CAAC,EAClC;oBACA,GAAG,CAAC,aAAa,CAAC,qBAAqB,CAAC,CAAC;iBAC1C;gBAED,IAAI,CAAC,CAAC,GAAG,YAAY,gCAAwB,CAAC,IAAI,qCAA6B,CAAC,GAAG,CAAC,EAAE;oBACpF,uEAAuE;oBACvE,qCAAqC;oBAErC,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,UAAU,CAAC,SAAS,CAAC,CAAC;oBAC1C,IAAI,CAAC,MAAM,CAAC,YAAY,EAAE;wBACxB,iBAAiB,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;qBAChC;iBACF;aACF;iBAAM;gBACL,yEAAyE;gBACzE,IACE,CAAC,wBAAwB,CAAC,MAAM,CAAC,CAAC,CAAC,QAAQ,CAAC,IAAI,mCAAoB,CAAC,GAAG,CAAC,CAAC;oBAC1E,sBAAc,CAAC,MAAM,CAAC,GAAG,CAAC;oBAC1B,6BAAqB,CAAC,GAAG,CAAC;oBAC1B,CAAC,mBAAmB,CAAC,OAAO,EAAE,GAAG,CAAC,EAClC;oBACA,GAAG,CAAC,aAAa,CAAC,qBAAqB,CAAC,CAAC;iBAC1C;gBAED,IAAI,gCAAwB,CAAC,GAAG,CAAC,EAAE;oBACjC,IAAI,4BAA4B,CAAC,MAAM,EAAE,GAAG,CAAC,EAAE;wBAC7C,IAAI,sBAAc,CAAC,MAAM,CAAC,IAAI,CAAC,IAAI,+BAAuB,CAAC,GAAG,CAAC,EAAE;4BAC/D,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,UAAU,CAAC,SAAS,CAAC,CAAC;yBAC3C;wBAED,IAAI,CAAC,MAAM,CAAC,YAAY,EAAE;4BACxB,iBAAiB,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;4BAC/B,OAAO,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC,MAAM,CAAC,YAAY,EAAE,CAAC,CAAC;yBAC/C;qBACF;iBACF;aACF;YAED,IAAI,OAAO,IAAI,OAAO,CAAC,QAAQ,IAAI,GAAG,CAAC,aAAa,CAAC,2BAA2B,CAAC,EAAE;gBACjF,OAAO,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;aAChC;SACF;QAED,QAAQ,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;IACxB,CAAC,CAAC;AACJ,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/server_description.js b/node_modules/mongodb/lib/sdam/server_description.js new file mode 100644 index 000000000..6277ad922 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/server_description.js @@ -0,0 +1,197 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.compareTopologyVersion = exports.parseServerType = exports.ServerDescription = void 0; +const utils_1 = require("../utils"); +const common_1 = require("./common"); +const bson_1 = require("../bson"); +const WRITABLE_SERVER_TYPES = new Set([ + common_1.ServerType.RSPrimary, + common_1.ServerType.Standalone, + common_1.ServerType.Mongos, + common_1.ServerType.LoadBalancer +]); +const DATA_BEARING_SERVER_TYPES = new Set([ + common_1.ServerType.RSPrimary, + common_1.ServerType.RSSecondary, + common_1.ServerType.Mongos, + 
common_1.ServerType.Standalone, + common_1.ServerType.LoadBalancer +]); +/** + * The client's view of a single server, based on the most recent ismaster outcome. + * + * Internal type, not meant to be directly instantiated + * @public + */ +class ServerDescription { + /** + * Create a ServerDescription + * @internal + * + * @param address - The address of the server + * @param ismaster - An optional ismaster response for this server + */ + constructor(address, ismaster, options) { + var _a, _b, _c, _d, _e, _f, _g, _h, _j, _k, _l, _m; + if (typeof address === 'string') { + this._hostAddress = new utils_1.HostAddress(address); + this.address = this._hostAddress.toString(); + } + else { + this._hostAddress = address; + this.address = this._hostAddress.toString(); + } + this.type = parseServerType(ismaster, options); + this.hosts = (_b = (_a = ismaster === null || ismaster === void 0 ? void 0 : ismaster.hosts) === null || _a === void 0 ? void 0 : _a.map((host) => host.toLowerCase())) !== null && _b !== void 0 ? _b : []; + this.passives = (_d = (_c = ismaster === null || ismaster === void 0 ? void 0 : ismaster.passives) === null || _c === void 0 ? void 0 : _c.map((host) => host.toLowerCase())) !== null && _d !== void 0 ? _d : []; + this.arbiters = (_f = (_e = ismaster === null || ismaster === void 0 ? void 0 : ismaster.arbiters) === null || _e === void 0 ? void 0 : _e.map((host) => host.toLowerCase())) !== null && _f !== void 0 ? _f : []; + this.tags = (_g = ismaster === null || ismaster === void 0 ? void 0 : ismaster.tags) !== null && _g !== void 0 ? _g : {}; + this.minWireVersion = (_h = ismaster === null || ismaster === void 0 ? void 0 : ismaster.minWireVersion) !== null && _h !== void 0 ? _h : 0; + this.maxWireVersion = (_j = ismaster === null || ismaster === void 0 ? void 0 : ismaster.maxWireVersion) !== null && _j !== void 0 ? _j : 0; + this.roundTripTime = (_k = options === null || options === void 0 ? void 0 : options.roundTripTime) !== null && _k !== void 0 ? _k : -1; + this.lastUpdateTime = utils_1.now(); + this.lastWriteDate = (_m = (_l = ismaster === null || ismaster === void 0 ? void 0 : ismaster.lastWrite) === null || _l === void 0 ? void 0 : _l.lastWriteDate) !== null && _m !== void 0 ? _m : 0; + if (options === null || options === void 0 ? void 0 : options.topologyVersion) { + this.topologyVersion = options.topologyVersion; + } + else if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.topologyVersion) { + this.topologyVersion = ismaster.topologyVersion; + } + if (options === null || options === void 0 ? void 0 : options.error) { + this.error = options.error; + } + if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.primary) { + this.primary = ismaster.primary; + } + if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.me) { + this.me = ismaster.me.toLowerCase(); + } + if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.setName) { + this.setName = ismaster.setName; + } + if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.setVersion) { + this.setVersion = ismaster.setVersion; + } + if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.electionId) { + this.electionId = ismaster.electionId; + } + if (ismaster === null || ismaster === void 0 ? void 0 : ismaster.logicalSessionTimeoutMinutes) { + this.logicalSessionTimeoutMinutes = ismaster.logicalSessionTimeoutMinutes; + } + if (ismaster === null || ismaster === void 0 ? 
void 0 : ismaster.$clusterTime) { + this.$clusterTime = ismaster.$clusterTime; + } + } + get hostAddress() { + if (this._hostAddress) + return this._hostAddress; + else + return new utils_1.HostAddress(this.address); + } + get allHosts() { + return this.hosts.concat(this.arbiters).concat(this.passives); + } + /** Is this server available for reads*/ + get isReadable() { + return this.type === common_1.ServerType.RSSecondary || this.isWritable; + } + /** Is this server data bearing */ + get isDataBearing() { + return DATA_BEARING_SERVER_TYPES.has(this.type); + } + /** Is this server available for writes */ + get isWritable() { + return WRITABLE_SERVER_TYPES.has(this.type); + } + get host() { + const chopLength = `:${this.port}`.length; + return this.address.slice(0, -chopLength); + } + get port() { + const port = this.address.split(':').pop(); + return port ? Number.parseInt(port, 10) : 27017; + } + /** + * Determines if another `ServerDescription` is equal to this one per the rules defined + * in the {@link https://github.com/mongodb/specifications/blob/master/source/server-discovery-and-monitoring/server-discovery-and-monitoring.rst#serverdescription|SDAM spec} + */ + equals(other) { + const topologyVersionsEqual = this.topologyVersion === other.topologyVersion || + compareTopologyVersion(this.topologyVersion, other.topologyVersion) === 0; + const electionIdsEqual = this.electionId && other.electionId + ? other.electionId && this.electionId.equals(other.electionId) + : this.electionId === other.electionId; + return (other != null && + utils_1.errorStrictEqual(this.error, other.error) && + this.type === other.type && + this.minWireVersion === other.minWireVersion && + utils_1.arrayStrictEqual(this.hosts, other.hosts) && + tagsStrictEqual(this.tags, other.tags) && + this.setName === other.setName && + this.setVersion === other.setVersion && + electionIdsEqual && + this.primary === other.primary && + this.logicalSessionTimeoutMinutes === other.logicalSessionTimeoutMinutes && + topologyVersionsEqual); + } +} +exports.ServerDescription = ServerDescription; +// Parses an `ismaster` message and determines the server type +function parseServerType(ismaster, options) { + if (options === null || options === void 0 ? void 0 : options.loadBalanced) { + return common_1.ServerType.LoadBalancer; + } + if (!ismaster || !ismaster.ok) { + return common_1.ServerType.Unknown; + } + if (ismaster.isreplicaset) { + return common_1.ServerType.RSGhost; + } + if (ismaster.msg && ismaster.msg === 'isdbgrid') { + return common_1.ServerType.Mongos; + } + if (ismaster.setName) { + if (ismaster.hidden) { + return common_1.ServerType.RSOther; + } + else if (ismaster.ismaster || ismaster.isWritablePrimary) { + return common_1.ServerType.RSPrimary; + } + else if (ismaster.secondary) { + return common_1.ServerType.RSSecondary; + } + else if (ismaster.arbiterOnly) { + return common_1.ServerType.RSArbiter; + } + else { + return common_1.ServerType.RSOther; + } + } + return common_1.ServerType.Standalone; +} +exports.parseServerType = parseServerType; +function tagsStrictEqual(tags, tags2) { + const tagsKeys = Object.keys(tags); + const tags2Keys = Object.keys(tags2); + return (tagsKeys.length === tags2Keys.length && + tagsKeys.every((key) => tags2[key] === tags[key])); +} +/** + * Compares two topology versions. + * + * @returns A negative number if `lhs` is older than `rhs`; positive if `lhs` is newer than `rhs`; 0 if they are equivalent. 
+ */ +function compareTopologyVersion(lhs, rhs) { + if (lhs == null || rhs == null) { + return -1; + } + if (lhs.processId.equals(rhs.processId)) { + // tests mock counter as just number, but in a real situation counter should always be a Long + const lhsCounter = bson_1.Long.isLong(lhs.counter) ? lhs.counter : bson_1.Long.fromNumber(lhs.counter); + const rhsCounter = bson_1.Long.isLong(rhs.counter) ? lhs.counter : bson_1.Long.fromNumber(rhs.counter); + return lhsCounter.compare(rhsCounter); + } + return -1; +} +exports.compareTopologyVersion = compareTopologyVersion; +//# sourceMappingURL=server_description.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/server_description.js.map b/node_modules/mongodb/lib/sdam/server_description.js.map new file mode 100644 index 000000000..072d11d9f --- /dev/null +++ b/node_modules/mongodb/lib/sdam/server_description.js.map @@ -0,0 +1 @@ +{"version":3,"file":"server_description.js","sourceRoot":"","sources":["../../src/sdam/server_description.ts"],"names":[],"mappings":";;;AAAA,oCAAgF;AAChF,qCAAsC;AACtC,kCAAmD;AAInD,MAAM,qBAAqB,GAAG,IAAI,GAAG,CAAa;IAChD,mBAAU,CAAC,SAAS;IACpB,mBAAU,CAAC,UAAU;IACrB,mBAAU,CAAC,MAAM;IACjB,mBAAU,CAAC,YAAY;CACxB,CAAC,CAAC;AAEH,MAAM,yBAAyB,GAAG,IAAI,GAAG,CAAa;IACpD,mBAAU,CAAC,SAAS;IACpB,mBAAU,CAAC,WAAW;IACtB,mBAAU,CAAC,MAAM;IACjB,mBAAU,CAAC,UAAU;IACrB,mBAAU,CAAC,YAAY;CACxB,CAAC,CAAC;AA0BH;;;;;GAKG;AACH,MAAa,iBAAiB;IA2B5B;;;;;;OAMG;IACH,YACE,OAA6B,EAC7B,QAAmB,EACnB,OAAkC;;QAElC,IAAI,OAAO,OAAO,KAAK,QAAQ,EAAE;YAC/B,IAAI,CAAC,YAAY,GAAG,IAAI,mBAAW,CAAC,OAAO,CAAC,CAAC;YAC7C,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,YAAY,CAAC,QAAQ,EAAE,CAAC;SAC7C;aAAM;YACL,IAAI,CAAC,YAAY,GAAG,OAAO,CAAC;YAC5B,IAAI,CAAC,OAAO,GAAG,IAAI,CAAC,YAAY,CAAC,QAAQ,EAAE,CAAC;SAC7C;QACD,IAAI,CAAC,IAAI,GAAG,eAAe,CAAC,QAAQ,EAAE,OAAO,CAAC,CAAC;QAC/C,IAAI,CAAC,KAAK,GAAG,MAAA,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,KAAK,0CAAE,GAAG,CAAC,CAAC,IAAY,EAAE,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,CAAC,mCAAI,EAAE,CAAC;QAC9E,IAAI,CAAC,QAAQ,GAAG,MAAA,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,QAAQ,0CAAE,GAAG,CAAC,CAAC,IAAY,EAAE,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,CAAC,mCAAI,EAAE,CAAC;QACpF,IAAI,CAAC,QAAQ,GAAG,MAAA,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,QAAQ,0CAAE,GAAG,CAAC,CAAC,IAAY,EAAE,EAAE,CAAC,IAAI,CAAC,WAAW,EAAE,CAAC,mCAAI,EAAE,CAAC;QACpF,IAAI,CAAC,IAAI,GAAG,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,IAAI,mCAAI,EAAE,CAAC;QACjC,IAAI,CAAC,cAAc,GAAG,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,cAAc,mCAAI,CAAC,CAAC;QACpD,IAAI,CAAC,cAAc,GAAG,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,cAAc,mCAAI,CAAC,CAAC;QACpD,IAAI,CAAC,aAAa,GAAG,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,mCAAI,CAAC,CAAC,CAAC;QAClD,IAAI,CAAC,cAAc,GAAG,WAAG,EAAE,CAAC;QAC5B,IAAI,CAAC,aAAa,GAAG,MAAA,MAAA,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,SAAS,0CAAE,aAAa,mCAAI,CAAC,CAAC;QAE7D,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,eAAe,EAAE;YAC5B,IAAI,CAAC,eAAe,GAAG,OAAO,CAAC,eAAe,CAAC;SAChD;aAAM,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,eAAe,EAAE;YACpC,IAAI,CAAC,eAAe,GAAG,QAAQ,CAAC,eAAe,CAAC;SACjD;QAED,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,KAAK,EAAE;YAClB,IAAI,CAAC,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC;SAC5B;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,OAAO,EAAE;YACrB,IAAI,CAAC,OAAO,GAAG,QAAQ,CAAC,OAAO,CAAC;SACjC;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,EAAE,EAAE;YAChB,IAAI,CAAC,EAAE,GAAG,QAAQ,CAAC,EAAE,CAAC,WAAW,EAAE,CAAC;SACrC;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,OAAO,EAAE;YACrB,IAAI,CAAC,OAAO,GAAG,QAAQ,CAAC,OAAO,CAAC;SACjC;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,UAAU,EAAE;YACxB,IAAI,CAAC,UAAU,GAAG,QAAQ,CAAC,UAAU,CAAC;SACvC;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,UAAU,EAAE;YACxB,IAAI,CAA
C,UAAU,GAAG,QAAQ,CAAC,UAAU,CAAC;SACvC;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,4BAA4B,EAAE;YAC1C,IAAI,CAAC,4BAA4B,GAAG,QAAQ,CAAC,4BAA4B,CAAC;SAC3E;QAED,IAAI,QAAQ,aAAR,QAAQ,uBAAR,QAAQ,CAAE,YAAY,EAAE;YAC1B,IAAI,CAAC,YAAY,GAAG,QAAQ,CAAC,YAAY,CAAC;SAC3C;IACH,CAAC;IAED,IAAI,WAAW;QACb,IAAI,IAAI,CAAC,YAAY;YAAE,OAAO,IAAI,CAAC,YAAY,CAAC;;YAC3C,OAAO,IAAI,mBAAW,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;IAC5C,CAAC;IAED,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,KAAK,CAAC,MAAM,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;IAChE,CAAC;IAED,wCAAwC;IACxC,IAAI,UAAU;QACZ,OAAO,IAAI,CAAC,IAAI,KAAK,mBAAU,CAAC,WAAW,IAAI,IAAI,CAAC,UAAU,CAAC;IACjE,CAAC;IAED,kCAAkC;IAClC,IAAI,aAAa;QACf,OAAO,yBAAyB,CAAC,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IAClD,CAAC;IAED,0CAA0C;IAC1C,IAAI,UAAU;QACZ,OAAO,qBAAqB,CAAC,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IAC9C,CAAC;IAED,IAAI,IAAI;QACN,MAAM,UAAU,GAAG,IAAI,IAAI,CAAC,IAAI,EAAE,CAAC,MAAM,CAAC;QAC1C,OAAO,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,EAAE,CAAC,UAAU,CAAC,CAAC;IAC5C,CAAC;IAED,IAAI,IAAI;QACN,MAAM,IAAI,GAAG,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,GAAG,EAAE,CAAC;QAC3C,OAAO,IAAI,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,CAAC,IAAI,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;IAClD,CAAC;IAED;;;OAGG;IACH,MAAM,CAAC,KAAwB;QAC7B,MAAM,qBAAqB,GACzB,IAAI,CAAC,eAAe,KAAK,KAAK,CAAC,eAAe;YAC9C,sBAAsB,CAAC,IAAI,CAAC,eAAe,EAAE,KAAK,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;QAE5E,MAAM,gBAAgB,GACpB,IAAI,CAAC,UAAU,IAAI,KAAK,CAAC,UAAU;YACjC,CAAC,CAAC,KAAK,CAAC,UAAU,IAAI,IAAI,CAAC,UAAU,CAAC,MAAM,CAAC,KAAK,CAAC,UAAU,CAAC;YAC9D,CAAC,CAAC,IAAI,CAAC,UAAU,KAAK,KAAK,CAAC,UAAU,CAAC;QAE3C,OAAO,CACL,KAAK,IAAI,IAAI;YACb,wBAAgB,CAAC,IAAI,CAAC,KAAK,EAAE,KAAK,CAAC,KAAK,CAAC;YACzC,IAAI,CAAC,IAAI,KAAK,KAAK,CAAC,IAAI;YACxB,IAAI,CAAC,cAAc,KAAK,KAAK,CAAC,cAAc;YAC5C,wBAAgB,CAAC,IAAI,CAAC,KAAK,EAAE,KAAK,CAAC,KAAK,CAAC;YACzC,eAAe,CAAC,IAAI,CAAC,IAAI,EAAE,KAAK,CAAC,IAAI,CAAC;YACtC,IAAI,CAAC,OAAO,KAAK,KAAK,CAAC,OAAO;YAC9B,IAAI,CAAC,UAAU,KAAK,KAAK,CAAC,UAAU;YACpC,gBAAgB;YAChB,IAAI,CAAC,OAAO,KAAK,KAAK,CAAC,OAAO;YAC9B,IAAI,CAAC,4BAA4B,KAAK,KAAK,CAAC,4BAA4B;YACxE,qBAAqB,CACtB,CAAC;IACJ,CAAC;CACF;AA/JD,8CA+JC;AAED,8DAA8D;AAC9D,SAAgB,eAAe,CAC7B,QAAmB,EACnB,OAAkC;IAElC,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,YAAY,EAAE;QACzB,OAAO,mBAAU,CAAC,YAAY,CAAC;KAChC;IAED,IAAI,CAAC,QAAQ,IAAI,CAAC,QAAQ,CAAC,EAAE,EAAE;QAC7B,OAAO,mBAAU,CAAC,OAAO,CAAC;KAC3B;IAED,IAAI,QAAQ,CAAC,YAAY,EAAE;QACzB,OAAO,mBAAU,CAAC,OAAO,CAAC;KAC3B;IAED,IAAI,QAAQ,CAAC,GAAG,IAAI,QAAQ,CAAC,GAAG,KAAK,UAAU,EAAE;QAC/C,OAAO,mBAAU,CAAC,MAAM,CAAC;KAC1B;IAED,IAAI,QAAQ,CAAC,OAAO,EAAE;QACpB,IAAI,QAAQ,CAAC,MAAM,EAAE;YACnB,OAAO,mBAAU,CAAC,OAAO,CAAC;SAC3B;aAAM,IAAI,QAAQ,CAAC,QAAQ,IAAI,QAAQ,CAAC,iBAAiB,EAAE;YAC1D,OAAO,mBAAU,CAAC,SAAS,CAAC;SAC7B;aAAM,IAAI,QAAQ,CAAC,SAAS,EAAE;YAC7B,OAAO,mBAAU,CAAC,WAAW,CAAC;SAC/B;aAAM,IAAI,QAAQ,CAAC,WAAW,EAAE;YAC/B,OAAO,mBAAU,CAAC,SAAS,CAAC;SAC7B;aAAM;YACL,OAAO,mBAAU,CAAC,OAAO,CAAC;SAC3B;KACF;IAED,OAAO,mBAAU,CAAC,UAAU,CAAC;AAC/B,CAAC;AAnCD,0CAmCC;AAED,SAAS,eAAe,CAAC,IAAY,EAAE,KAAa;IAClD,MAAM,QAAQ,GAAG,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;IACnC,MAAM,SAAS,GAAG,MAAM,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;IAErC,OAAO,CACL,QAAQ,CAAC,MAAM,KAAK,SAAS,CAAC,MAAM;QACpC,QAAQ,CAAC,KAAK,CAAC,CAAC,GAAW,EAAE,EAAE,CAAC,KAAK,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,GAAG,CAAC,CAAC,CAC1D,CAAC;AACJ,CAAC;AAED;;;;GAIG;AACH,SAAgB,sBAAsB,CAAC,GAAqB,EAAE,GAAqB;IACjF,IAAI,GAAG,IAAI,IAAI,IAAI,GAAG,IAAI,IAAI,EAAE;QAC9B,OAAO,CAAC,CAAC,CAAC;KACX;IAED,IAAI,GAAG,CAAC,SAAS,CAAC,MAAM,CAAC,GAAG,CAAC,SAAS,CAAC,EAAE;QACvC,6FAA6F;QAC7F,MAAM,UAAU,GAAG,WAAI,CAAC,MAAM,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,GAAG,CAA
C,OAAO,CAAC,CAAC;QACzF,MAAM,UAAU,GAAG,WAAI,CAAC,MAAM,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC,CAAC,WAAI,CAAC,UAAU,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;QACzF,OAAO,UAAU,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC;KACvC;IAED,OAAO,CAAC,CAAC,CAAC;AACZ,CAAC;AAbD,wDAaC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/server_selection.js b/node_modules/mongodb/lib/sdam/server_selection.js new file mode 100644 index 000000000..3c31f59fa --- /dev/null +++ b/node_modules/mongodb/lib/sdam/server_selection.js @@ -0,0 +1,193 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.readPreferenceServerSelector = exports.writableServerSelector = void 0; +const common_1 = require("./common"); +const read_preference_1 = require("../read_preference"); +const error_1 = require("../error"); +// max staleness constants +const IDLE_WRITE_PERIOD = 10000; +const SMALLEST_MAX_STALENESS_SECONDS = 90; +/** + * Returns a server selector that selects for writable servers + */ +function writableServerSelector() { + return (topologyDescription, servers) => latencyWindowReducer(topologyDescription, servers.filter((s) => s.isWritable)); +} +exports.writableServerSelector = writableServerSelector; +/** + * Reduces the passed in array of servers by the rules of the "Max Staleness" specification + * found here: https://github.com/mongodb/specifications/blob/master/source/max-staleness/max-staleness.rst + * + * @param readPreference - The read preference providing max staleness guidance + * @param topologyDescription - The topology description + * @param servers - The list of server descriptions to be reduced + * @returns The list of servers that satisfy the requirements of max staleness + */ +function maxStalenessReducer(readPreference, topologyDescription, servers) { + if (readPreference.maxStalenessSeconds == null || readPreference.maxStalenessSeconds < 0) { + return servers; + } + const maxStaleness = readPreference.maxStalenessSeconds; + const maxStalenessVariance = (topologyDescription.heartbeatFrequencyMS + IDLE_WRITE_PERIOD) / 1000; + if (maxStaleness < maxStalenessVariance) { + throw new error_1.MongoInvalidArgumentError(`Option "maxStalenessSeconds" must be at least ${maxStalenessVariance} seconds`); + } + if (maxStaleness < SMALLEST_MAX_STALENESS_SECONDS) { + throw new error_1.MongoInvalidArgumentError(`Option "maxStalenessSeconds" must be at least ${SMALLEST_MAX_STALENESS_SECONDS} seconds`); + } + if (topologyDescription.type === common_1.TopologyType.ReplicaSetWithPrimary) { + const primary = Array.from(topologyDescription.servers.values()).filter(primaryFilter)[0]; + return servers.reduce((result, server) => { + var _a; + const stalenessMS = server.lastUpdateTime - + server.lastWriteDate - + (primary.lastUpdateTime - primary.lastWriteDate) + + topologyDescription.heartbeatFrequencyMS; + const staleness = stalenessMS / 1000; + const maxStalenessSeconds = (_a = readPreference.maxStalenessSeconds) !== null && _a !== void 0 ? _a : 0; + if (staleness <= maxStalenessSeconds) { + result.push(server); + } + return result; + }, []); + } + if (topologyDescription.type === common_1.TopologyType.ReplicaSetNoPrimary) { + if (servers.length === 0) { + return servers; + } + const sMax = servers.reduce((max, s) => s.lastWriteDate > max.lastWriteDate ? 
s : max); + return servers.reduce((result, server) => { + var _a; + const stalenessMS = sMax.lastWriteDate - server.lastWriteDate + topologyDescription.heartbeatFrequencyMS; + const staleness = stalenessMS / 1000; + const maxStalenessSeconds = (_a = readPreference.maxStalenessSeconds) !== null && _a !== void 0 ? _a : 0; + if (staleness <= maxStalenessSeconds) { + result.push(server); + } + return result; + }, []); + } + return servers; +} +/** + * Determines whether a server's tags match a given set of tags + * + * @param tagSet - The requested tag set to match + * @param serverTags - The server's tags + */ +function tagSetMatch(tagSet, serverTags) { + const keys = Object.keys(tagSet); + const serverTagKeys = Object.keys(serverTags); + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + if (serverTagKeys.indexOf(key) === -1 || serverTags[key] !== tagSet[key]) { + return false; + } + } + return true; +} +/** + * Reduces a set of server descriptions based on tags requested by the read preference + * + * @param readPreference - The read preference providing the requested tags + * @param servers - The list of server descriptions to reduce + * @returns The list of servers matching the requested tags + */ +function tagSetReducer(readPreference, servers) { + if (readPreference.tags == null || + (Array.isArray(readPreference.tags) && readPreference.tags.length === 0)) { + return servers; + } + for (let i = 0; i < readPreference.tags.length; ++i) { + const tagSet = readPreference.tags[i]; + const serversMatchingTagset = servers.reduce((matched, server) => { + if (tagSetMatch(tagSet, server.tags)) + matched.push(server); + return matched; + }, []); + if (serversMatchingTagset.length) { + return serversMatchingTagset; + } + } + return []; +} +/** + * Reduces a list of servers to ensure they fall within an acceptable latency window. This is + * further specified in the "Server Selection" specification, found here: + * https://github.com/mongodb/specifications/blob/master/source/server-selection/server-selection.rst + * + * @param topologyDescription - The topology description + * @param servers - The list of servers to reduce + * @returns The servers which fall within an acceptable latency window + */ +function latencyWindowReducer(topologyDescription, servers) { + const low = servers.reduce((min, server) => min === -1 ? 
server.roundTripTime : Math.min(server.roundTripTime, min), -1); + const high = low + topologyDescription.localThresholdMS; + return servers.reduce((result, server) => { + if (server.roundTripTime <= high && server.roundTripTime >= low) + result.push(server); + return result; + }, []); +} +// filters +function primaryFilter(server) { + return server.type === common_1.ServerType.RSPrimary; +} +function secondaryFilter(server) { + return server.type === common_1.ServerType.RSSecondary; +} +function nearestFilter(server) { + return server.type === common_1.ServerType.RSSecondary || server.type === common_1.ServerType.RSPrimary; +} +function knownFilter(server) { + return server.type !== common_1.ServerType.Unknown; +} +function loadBalancerFilter(server) { + return server.type === common_1.ServerType.LoadBalancer; +} +/** + * Returns a function which selects servers based on a provided read preference + * + * @param readPreference - The read preference to select with + */ +function readPreferenceServerSelector(readPreference) { + if (!readPreference.isValid()) { + throw new error_1.MongoInvalidArgumentError('Invalid read preference specified'); + } + return (topologyDescription, servers) => { + const commonWireVersion = topologyDescription.commonWireVersion; + if (commonWireVersion && + readPreference.minWireVersion && + readPreference.minWireVersion > commonWireVersion) { + throw new error_1.MongoCompatibilityError(`Minimum wire version '${readPreference.minWireVersion}' required, but found '${commonWireVersion}'`); + } + if (topologyDescription.type === common_1.TopologyType.LoadBalanced) { + return servers.filter(loadBalancerFilter); + } + if (topologyDescription.type === common_1.TopologyType.Unknown) { + return []; + } + if (topologyDescription.type === common_1.TopologyType.Single || + topologyDescription.type === common_1.TopologyType.Sharded) { + return latencyWindowReducer(topologyDescription, servers.filter(knownFilter)); + } + const mode = readPreference.mode; + if (mode === read_preference_1.ReadPreference.PRIMARY) { + return servers.filter(primaryFilter); + } + if (mode === read_preference_1.ReadPreference.PRIMARY_PREFERRED) { + const result = servers.filter(primaryFilter); + if (result.length) { + return result; + } + } + const filter = mode === read_preference_1.ReadPreference.NEAREST ? 
nearestFilter : secondaryFilter; + const selectedServers = latencyWindowReducer(topologyDescription, tagSetReducer(readPreference, maxStalenessReducer(readPreference, topologyDescription, servers.filter(filter)))); + if (mode === read_preference_1.ReadPreference.SECONDARY_PREFERRED && selectedServers.length === 0) { + return servers.filter(primaryFilter); + } + return selectedServers; + }; +} +exports.readPreferenceServerSelector = readPreferenceServerSelector; +//# sourceMappingURL=server_selection.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/server_selection.js.map b/node_modules/mongodb/lib/sdam/server_selection.js.map new file mode 100644 index 000000000..5f115f3d0 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/server_selection.js.map @@ -0,0 +1 @@ +{"version":3,"file":"server_selection.js","sourceRoot":"","sources":["../../src/sdam/server_selection.ts"],"names":[],"mappings":";;;AAAA,qCAAoD;AACpD,wDAAoD;AACpD,oCAA8E;AAI9E,0BAA0B;AAC1B,MAAM,iBAAiB,GAAG,KAAK,CAAC;AAChC,MAAM,8BAA8B,GAAG,EAAE,CAAC;AAQ1C;;GAEG;AACH,SAAgB,sBAAsB;IACpC,OAAO,CACL,mBAAwC,EACxC,OAA4B,EACP,EAAE,CACvB,oBAAoB,CAClB,mBAAmB,EACnB,OAAO,CAAC,MAAM,CAAC,CAAC,CAAoB,EAAE,EAAE,CAAC,CAAC,CAAC,UAAU,CAAC,CACvD,CAAC;AACN,CAAC;AATD,wDASC;AAED;;;;;;;;GAQG;AACH,SAAS,mBAAmB,CAC1B,cAA8B,EAC9B,mBAAwC,EACxC,OAA4B;IAE5B,IAAI,cAAc,CAAC,mBAAmB,IAAI,IAAI,IAAI,cAAc,CAAC,mBAAmB,GAAG,CAAC,EAAE;QACxF,OAAO,OAAO,CAAC;KAChB;IAED,MAAM,YAAY,GAAG,cAAc,CAAC,mBAAmB,CAAC;IACxD,MAAM,oBAAoB,GACxB,CAAC,mBAAmB,CAAC,oBAAoB,GAAG,iBAAiB,CAAC,GAAG,IAAI,CAAC;IACxE,IAAI,YAAY,GAAG,oBAAoB,EAAE;QACvC,MAAM,IAAI,iCAAyB,CACjC,iDAAiD,oBAAoB,UAAU,CAChF,CAAC;KACH;IAED,IAAI,YAAY,GAAG,8BAA8B,EAAE;QACjD,MAAM,IAAI,iCAAyB,CACjC,iDAAiD,8BAA8B,UAAU,CAC1F,CAAC;KACH;IAED,IAAI,mBAAmB,CAAC,IAAI,KAAK,qBAAY,CAAC,qBAAqB,EAAE;QACnE,MAAM,OAAO,GAAsB,KAAK,CAAC,IAAI,CAAC,mBAAmB,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC,MAAM,CACxF,aAAa,CACd,CAAC,CAAC,CAAC,CAAC;QAEL,OAAO,OAAO,CAAC,MAAM,CAAC,CAAC,MAA2B,EAAE,MAAyB,EAAE,EAAE;;YAC/E,MAAM,WAAW,GACf,MAAM,CAAC,cAAc;gBACrB,MAAM,CAAC,aAAa;gBACpB,CAAC,OAAO,CAAC,cAAc,GAAG,OAAO,CAAC,aAAa,CAAC;gBAChD,mBAAmB,CAAC,oBAAoB,CAAC;YAE3C,MAAM,SAAS,GAAG,WAAW,GAAG,IAAI,CAAC;YACrC,MAAM,mBAAmB,GAAG,MAAA,cAAc,CAAC,mBAAmB,mCAAI,CAAC,CAAC;YACpE,IAAI,SAAS,IAAI,mBAAmB,EAAE;gBACpC,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;aACrB;YAED,OAAO,MAAM,CAAC;QAChB,CAAC,EAAE,EAAE,CAAC,CAAC;KACR;IAED,IAAI,mBAAmB,CAAC,IAAI,KAAK,qBAAY,CAAC,mBAAmB,EAAE;QACjE,IAAI,OAAO,CAAC,MAAM,KAAK,CAAC,EAAE;YACxB,OAAO,OAAO,CAAC;SAChB;QAED,MAAM,IAAI,GAAG,OAAO,CAAC,MAAM,CAAC,CAAC,GAAsB,EAAE,CAAoB,EAAE,EAAE,CAC3E,CAAC,CAAC,aAAa,GAAG,GAAG,CAAC,aAAa,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAC9C,CAAC;QAEF,OAAO,OAAO,CAAC,MAAM,CAAC,CAAC,MAA2B,EAAE,MAAyB,EAAE,EAAE;;YAC/E,MAAM,WAAW,GACf,IAAI,CAAC,aAAa,GAAG,MAAM,CAAC,aAAa,GAAG,mBAAmB,CAAC,oBAAoB,CAAC;YAEvF,MAAM,SAAS,GAAG,WAAW,GAAG,IAAI,CAAC;YACrC,MAAM,mBAAmB,GAAG,MAAA,cAAc,CAAC,mBAAmB,mCAAI,CAAC,CAAC;YACpE,IAAI,SAAS,IAAI,mBAAmB,EAAE;gBACpC,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;aACrB;YAED,OAAO,MAAM,CAAC;QAChB,CAAC,EAAE,EAAE,CAAC,CAAC;KACR;IAED,OAAO,OAAO,CAAC;AACjB,CAAC;AAED;;;;;GAKG;AACH,SAAS,WAAW,CAAC,MAAc,EAAE,UAAkB;IACrD,MAAM,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;IACjC,MAAM,aAAa,GAAG,MAAM,CAAC,IAAI,CAAC,UAAU,CAAC,CAAC;IAC9C,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,IAAI,CAAC,MAAM,EAAE,EAAE,CAAC,EAAE;QACpC,MAAM,GAAG,GAAG,IAAI,CAAC,CAAC,CAAC,CAAC;QACpB,IAAI,aAAa,CAAC,OAAO,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC,IAAI,UAAU,CAAC,GAAG,CAAC,KAAK,MAAM,CAAC,GAAG,CAAC,EAAE;YACxE,OAAO,KAAK,CAAC;SACd;KACF;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AAED;;;;;;GAMG;AACH,SAA
S,aAAa,CACpB,cAA8B,EAC9B,OAA4B;IAE5B,IACE,cAAc,CAAC,IAAI,IAAI,IAAI;QAC3B,CAAC,KAAK,CAAC,OAAO,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,cAAc,CAAC,IAAI,CAAC,MAAM,KAAK,CAAC,CAAC,EACxE;QACA,OAAO,OAAO,CAAC;KAChB;IAED,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,cAAc,CAAC,IAAI,CAAC,MAAM,EAAE,EAAE,CAAC,EAAE;QACnD,MAAM,MAAM,GAAG,cAAc,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;QACtC,MAAM,qBAAqB,GAAG,OAAO,CAAC,MAAM,CAC1C,CAAC,OAA4B,EAAE,MAAyB,EAAE,EAAE;YAC1D,IAAI,WAAW,CAAC,MAAM,EAAE,MAAM,CAAC,IAAI,CAAC;gBAAE,OAAO,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;YAC3D,OAAO,OAAO,CAAC;QACjB,CAAC,EACD,EAAE,CACH,CAAC;QAEF,IAAI,qBAAqB,CAAC,MAAM,EAAE;YAChC,OAAO,qBAAqB,CAAC;SAC9B;KACF;IAED,OAAO,EAAE,CAAC;AACZ,CAAC;AAED;;;;;;;;GAQG;AACH,SAAS,oBAAoB,CAC3B,mBAAwC,EACxC,OAA4B;IAE5B,MAAM,GAAG,GAAG,OAAO,CAAC,MAAM,CACxB,CAAC,GAAW,EAAE,MAAyB,EAAE,EAAE,CACzC,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,aAAa,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,MAAM,CAAC,aAAa,EAAE,GAAG,CAAC,EACzE,CAAC,CAAC,CACH,CAAC;IAEF,MAAM,IAAI,GAAG,GAAG,GAAG,mBAAmB,CAAC,gBAAgB,CAAC;IACxD,OAAO,OAAO,CAAC,MAAM,CAAC,CAAC,MAA2B,EAAE,MAAyB,EAAE,EAAE;QAC/E,IAAI,MAAM,CAAC,aAAa,IAAI,IAAI,IAAI,MAAM,CAAC,aAAa,IAAI,GAAG;YAAE,MAAM,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;QACrF,OAAO,MAAM,CAAC;IAChB,CAAC,EAAE,EAAE,CAAC,CAAC;AACT,CAAC;AAED,UAAU;AACV,SAAS,aAAa,CAAC,MAAyB;IAC9C,OAAO,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,SAAS,CAAC;AAC9C,CAAC;AAED,SAAS,eAAe,CAAC,MAAyB;IAChD,OAAO,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,WAAW,CAAC;AAChD,CAAC;AAED,SAAS,aAAa,CAAC,MAAyB;IAC9C,OAAO,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,WAAW,IAAI,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,SAAS,CAAC;AACxF,CAAC;AAED,SAAS,WAAW,CAAC,MAAyB;IAC5C,OAAO,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,OAAO,CAAC;AAC5C,CAAC;AAED,SAAS,kBAAkB,CAAC,MAAyB;IACnD,OAAO,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,YAAY,CAAC;AACjD,CAAC;AAED;;;;GAIG;AACH,SAAgB,4BAA4B,CAAC,cAA8B;IACzE,IAAI,CAAC,cAAc,CAAC,OAAO,EAAE,EAAE;QAC7B,MAAM,IAAI,iCAAyB,CAAC,mCAAmC,CAAC,CAAC;KAC1E;IAED,OAAO,CACL,mBAAwC,EACxC,OAA4B,EACP,EAAE;QACvB,MAAM,iBAAiB,GAAG,mBAAmB,CAAC,iBAAiB,CAAC;QAChE,IACE,iBAAiB;YACjB,cAAc,CAAC,cAAc;YAC7B,cAAc,CAAC,cAAc,GAAG,iBAAiB,EACjD;YACA,MAAM,IAAI,+BAAuB,CAC/B,yBAAyB,cAAc,CAAC,cAAc,0BAA0B,iBAAiB,GAAG,CACrG,CAAC;SACH;QAED,IAAI,mBAAmB,CAAC,IAAI,KAAK,qBAAY,CAAC,YAAY,EAAE;YAC1D,OAAO,OAAO,CAAC,MAAM,CAAC,kBAAkB,CAAC,CAAC;SAC3C;QAED,IAAI,mBAAmB,CAAC,IAAI,KAAK,qBAAY,CAAC,OAAO,EAAE;YACrD,OAAO,EAAE,CAAC;SACX;QAED,IACE,mBAAmB,CAAC,IAAI,KAAK,qBAAY,CAAC,MAAM;YAChD,mBAAmB,CAAC,IAAI,KAAK,qBAAY,CAAC,OAAO,EACjD;YACA,OAAO,oBAAoB,CAAC,mBAAmB,EAAE,OAAO,CAAC,MAAM,CAAC,WAAW,CAAC,CAAC,CAAC;SAC/E;QAED,MAAM,IAAI,GAAG,cAAc,CAAC,IAAI,CAAC;QACjC,IAAI,IAAI,KAAK,gCAAc,CAAC,OAAO,EAAE;YACnC,OAAO,OAAO,CAAC,MAAM,CAAC,aAAa,CAAC,CAAC;SACtC;QAED,IAAI,IAAI,KAAK,gCAAc,CAAC,iBAAiB,EAAE;YAC7C,MAAM,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC,aAAa,CAAC,CAAC;YAC7C,IAAI,MAAM,CAAC,MAAM,EAAE;gBACjB,OAAO,MAAM,CAAC;aACf;SACF;QAED,MAAM,MAAM,GAAG,IAAI,KAAK,gCAAc,CAAC,OAAO,CAAC,CAAC,CAAC,aAAa,CAAC,CAAC,CAAC,eAAe,CAAC;QACjF,MAAM,eAAe,GAAG,oBAAoB,CAC1C,mBAAmB,EACnB,aAAa,CACX,cAAc,EACd,mBAAmB,CAAC,cAAc,EAAE,mBAAmB,EAAE,OAAO,CAAC,MAAM,CAAC,MAAM,CAAC,CAAC,CACjF,CACF,CAAC;QAEF,IAAI,IAAI,KAAK,gCAAc,CAAC,mBAAmB,IAAI,eAAe,CAAC,MAAM,KAAK,CAAC,EAAE;YAC/E,OAAO,OAAO,CAAC,MAAM,CAAC,aAAa,CAAC,CAAC;SACtC;QAED,OAAO,eAAe,CAAC;IACzB,CAAC,CAAC;AACJ,CAAC;AA9DD,oEA8DC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/srv_polling.js b/node_modules/mongodb/lib/sdam/srv_polling.js new file mode 100644 index 000000000..a7c80e3fe --- /dev/null +++ b/node_modules/mongodb/lib/sdam/srv_polling.js @@ -0,0 +1,121 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); 
+exports.SrvPoller = exports.SrvPollingEvent = void 0; +const dns = require("dns"); +const logger_1 = require("../logger"); +const utils_1 = require("../utils"); +const mongo_types_1 = require("../mongo_types"); +const error_1 = require("../error"); +/** + * Determines whether a provided address matches the provided parent domain in order + * to avoid certain attack vectors. + * + * @param srvAddress - The address to check against a domain + * @param parentDomain - The domain to check the provided address against + * @returns Whether the provided address matches the parent domain + */ +function matchesParentDomain(srvAddress, parentDomain) { + const regex = /^.*?\./; + const srv = `.${srvAddress.replace(regex, '')}`; + const parent = `.${parentDomain.replace(regex, '')}`; + return srv.endsWith(parent); +} +/** + * @internal + * @category Event + */ +class SrvPollingEvent { + constructor(srvRecords) { + this.srvRecords = srvRecords; + } + addresses() { + return new Map(this.srvRecords.map(record => { + const host = new utils_1.HostAddress(`${record.name}:${record.port}`); + return [host.toString(), host]; + })); + } +} +exports.SrvPollingEvent = SrvPollingEvent; +/** @internal */ +class SrvPoller extends mongo_types_1.TypedEventEmitter { + constructor(options) { + super(); + if (!options || !options.srvHost) { + throw new error_1.MongoRuntimeError('Options for SrvPoller must exist and include srvHost'); + } + this.srvHost = options.srvHost; + this.rescanSrvIntervalMS = 60000; + this.heartbeatFrequencyMS = options.heartbeatFrequencyMS || 10000; + this.logger = new logger_1.Logger('srvPoller', options); + this.haMode = false; + this.generation = 0; + this._timeout = undefined; + } + get srvAddress() { + return `_mongodb._tcp.${this.srvHost}`; + } + get intervalMS() { + return this.haMode ? 
this.heartbeatFrequencyMS : this.rescanSrvIntervalMS; + } + start() { + if (!this._timeout) { + this.schedule(); + } + } + stop() { + if (this._timeout) { + clearTimeout(this._timeout); + this.generation += 1; + this._timeout = undefined; + } + } + schedule() { + if (this._timeout) { + clearTimeout(this._timeout); + } + this._timeout = setTimeout(() => this._poll(), this.intervalMS); + } + success(srvRecords) { + this.haMode = false; + this.schedule(); + this.emit(SrvPoller.SRV_RECORD_DISCOVERY, new SrvPollingEvent(srvRecords)); + } + failure(message, obj) { + this.logger.warn(message, obj); + this.haMode = true; + this.schedule(); + } + parentDomainMismatch(srvRecord) { + this.logger.warn(`parent domain mismatch on SRV record (${srvRecord.name}:${srvRecord.port})`, srvRecord); + } + _poll() { + const generation = this.generation; + dns.resolveSrv(this.srvAddress, (err, srvRecords) => { + if (generation !== this.generation) { + return; + } + if (err) { + this.failure('DNS error', err); + return; + } + const finalAddresses = []; + srvRecords.forEach((record) => { + if (matchesParentDomain(record.name, this.srvHost)) { + finalAddresses.push(record); + } + else { + this.parentDomainMismatch(record); + } + }); + if (!finalAddresses.length) { + this.failure('No valid addresses found at host'); + return; + } + this.success(finalAddresses); + }); + } +} +exports.SrvPoller = SrvPoller; +/** @event */ +SrvPoller.SRV_RECORD_DISCOVERY = 'srvRecordDiscovery'; +//# sourceMappingURL=srv_polling.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/srv_polling.js.map b/node_modules/mongodb/lib/sdam/srv_polling.js.map new file mode 100644 index 000000000..d3d2f3587 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/srv_polling.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"srv_polling.js","sourceRoot":"","sources":["../../src/sdam/srv_polling.ts"],"names":[],"mappings":";;;AAAA,2BAA2B;AAC3B,sCAAkD;AAClD,oCAAuC;AACvC,gDAAmD;AACnD,oCAA6C;AAE7C;;;;;;;GAOG;AACH,SAAS,mBAAmB,CAAC,UAAkB,EAAE,YAAoB;IACnE,MAAM,KAAK,GAAG,QAAQ,CAAC;IACvB,MAAM,GAAG,GAAG,IAAI,UAAU,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,EAAE,CAAC;IAChD,MAAM,MAAM,GAAG,IAAI,YAAY,CAAC,OAAO,CAAC,KAAK,EAAE,EAAE,CAAC,EAAE,CAAC;IACrD,OAAO,GAAG,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC;AAC9B,CAAC;AAED;;;GAGG;AACH,MAAa,eAAe;IAE1B,YAAY,UAA2B;QACrC,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,SAAS;QACP,OAAO,IAAI,GAAG,CACZ,IAAI,CAAC,UAAU,CAAC,GAAG,CAAC,MAAM,CAAC,EAAE;YAC3B,MAAM,IAAI,GAAG,IAAI,mBAAW,CAAC,GAAG,MAAM,CAAC,IAAI,IAAI,MAAM,CAAC,IAAI,EAAE,CAAC,CAAC;YAC9D,OAAO,CAAC,IAAI,CAAC,QAAQ,EAAE,EAAE,IAAI,CAAC,CAAC;QACjC,CAAC,CAAC,CACH,CAAC;IACJ,CAAC;CACF;AAdD,0CAcC;AAaD,gBAAgB;AAChB,MAAa,SAAU,SAAQ,+BAAkC;IAY/D,YAAY,OAAyB;QACnC,KAAK,EAAE,CAAC;QAER,IAAI,CAAC,OAAO,IAAI,CAAC,OAAO,CAAC,OAAO,EAAE;YAChC,MAAM,IAAI,yBAAiB,CAAC,sDAAsD,CAAC,CAAC;SACrF;QAED,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;QAC/B,IAAI,CAAC,mBAAmB,GAAG,KAAK,CAAC;QACjC,IAAI,CAAC,oBAAoB,GAAG,OAAO,CAAC,oBAAoB,IAAI,KAAK,CAAC;QAClE,IAAI,CAAC,MAAM,GAAG,IAAI,eAAM,CAAC,WAAW,EAAE,OAAO,CAAC,CAAC;QAE/C,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,UAAU,GAAG,CAAC,CAAC;QAEpB,IAAI,CAAC,QAAQ,GAAG,SAAS,CAAC;IAC5B,CAAC;IAED,IAAI,UAAU;QACZ,OAAO,iBAAiB,IAAI,CAAC,OAAO,EAAE,CAAC;IACzC,CAAC;IAED,IAAI,UAAU;QACZ,OAAO,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,CAAC,oBAAoB,CAAC,CAAC,CAAC,IAAI,CAAC,mBAAmB,CAAC;IAC5E,CAAC;IAED,KAAK;QACH,IAAI,CAAC,IAAI,CAAC,QAAQ,EAAE;YAClB,IAAI,CAAC,QAAQ,EAAE,CAAC;SACjB;IACH,CAAC;IAED,IAAI;QACF,IAAI,IAAI,CAAC,QAAQ,EAAE;YACjB,YAAY,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;YAC5B,IAAI,CAAC,UAAU,IAAI,CAAC,CAAC;YACrB,IAAI,CAAC,QAAQ,GAAG,SAAS,CAAC;SAC3B;IACH,CAAC;IAED,QAAQ;QACN,IAAI,IAAI,CAAC,QAAQ,EAAE;YACjB,YAAY,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;SAC7B;QAED,IAAI,CAAC,QAAQ,GAAG,UAAU,CAAC,GAAG,EAAE,CAAC,IAAI,CAAC,KAAK,EAAE,EAAE,IAAI,CAAC,UAAU,CAAC,CAAC;IAClE,CAAC;IAED,OAAO,CAAC,UAA2B;QACjC,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;QACpB,IAAI,CAAC,QAAQ,EAAE,CAAC;QAChB,IAAI,CAAC,IAAI,CAAC,SAAS,CAAC,oBAAoB,EAAE,IAAI,eAAe,CAAC,UAAU,CAAC,CAAC,CAAC;IAC7E,CAAC;IAED,OAAO,CAAC,OAAe,EAAE,GAA2B;QAClD,IAAI,CAAC,MAAM,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,CAAC,CAAC;QAC/B,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;QACnB,IAAI,CAAC,QAAQ,EAAE,CAAC;IAClB,CAAC;IAED,oBAAoB,CAAC,SAAwB;QAC3C,IAAI,CAAC,MAAM,CAAC,IAAI,CACd,yCAAyC,SAAS,CAAC,IAAI,IAAI,SAAS,CAAC,IAAI,GAAG,EAC5E,SAAS,CACV,CAAC;IACJ,CAAC;IAED,KAAK;QACH,MAAM,UAAU,GAAG,IAAI,CAAC,UAAU,CAAC;QACnC,GAAG,CAAC,UAAU,CAAC,IAAI,CAAC,UAAU,EAAE,CAAC,GAAG,EAAE,UAAU,EAAE,EAAE;YAClD,IAAI,UAAU,KAAK,IAAI,CAAC,UAAU,EAAE;gBAClC,OAAO;aACR;YAED,IAAI,GAAG,EAAE;gBACP,IAAI,CAAC,OAAO,CAAC,WAAW,EAAE,GAAG,CAAC,CAAC;gBAC/B,OAAO;aACR;YAED,MAAM,cAAc,GAAoB,EAAE,CAAC;YAC3C,UAAU,CAAC,OAAO,CAAC,CAAC,MAAqB,EAAE,EAAE;gBAC3C,IAAI,mBAAmB,CAAC,MAAM,CAAC,IAAI,EAAE,IAAI,CAAC,OAAO,CAAC,EAAE;oBAClD,cAAc,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;iBAC7B;qBAAM;oBACL,IAAI,CAAC,oBAAoB,CAAC,MAAM,CAAC,CAAC;iBACnC;YACH,CAAC,CAAC,CAAC;YAEH,IAAI,CAAC,cAAc,CAAC,MAAM,EAAE;gBAC1B,IAAI,CAAC,OAAO,CAAC,kCAAkC,CAAC,CAAC;gBACjD,OAAO;aACR;YAED,IAAI,CAAC,OAAO,CAAC,cAAc,CAAC,CAAC;QAC/B,CAAC,CAAC,CAAC;IACL,CAAC;;AA3GH,8BA4GC;AAnGC,aAAa;AACG,8BAAoB,GAAG,oBAA6B,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/topology.js b/node_modules/mongodb/lib/sdam/topology.js new file mode 100644 index 000000000..e9fd1f062 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/topology.js @@ -0,0 +1,722 @@ +"use strict"; 
+Object.defineProperty(exports, "__esModule", { value: true }); +exports.ServerCapabilities = exports.TOPOLOGY_EVENTS = exports.Topology = void 0; +const Denque = require("denque"); +const read_preference_1 = require("../read_preference"); +const server_description_1 = require("./server_description"); +const topology_description_1 = require("./topology_description"); +const server_1 = require("./server"); +const sessions_1 = require("../sessions"); +const srv_polling_1 = require("./srv_polling"); +const connection_pool_1 = require("../cmap/connection_pool"); +const error_1 = require("../error"); +const server_selection_1 = require("./server_selection"); +const utils_1 = require("../utils"); +const common_1 = require("./common"); +const events_1 = require("./events"); +const connection_1 = require("../cmap/connection"); +const connection_string_1 = require("../connection_string"); +const bson_1 = require("../bson"); +const mongo_types_1 = require("../mongo_types"); +// Global state +let globalTopologyCounter = 0; +// events that we relay to the `Topology` +const SERVER_RELAY_EVENTS = [ + server_1.Server.SERVER_HEARTBEAT_STARTED, + server_1.Server.SERVER_HEARTBEAT_SUCCEEDED, + server_1.Server.SERVER_HEARTBEAT_FAILED, + connection_1.Connection.COMMAND_STARTED, + connection_1.Connection.COMMAND_SUCCEEDED, + connection_1.Connection.COMMAND_FAILED, + ...connection_pool_1.CMAP_EVENTS +]; +// all events we listen to from `Server` instances +const LOCAL_SERVER_EVENTS = [ + server_1.Server.CONNECT, + server_1.Server.DESCRIPTION_RECEIVED, + server_1.Server.CLOSED, + server_1.Server.ENDED +]; +const stateTransition = utils_1.makeStateMachine({ + [common_1.STATE_CLOSED]: [common_1.STATE_CLOSED, common_1.STATE_CONNECTING], + [common_1.STATE_CONNECTING]: [common_1.STATE_CONNECTING, common_1.STATE_CLOSING, common_1.STATE_CONNECTED, common_1.STATE_CLOSED], + [common_1.STATE_CONNECTED]: [common_1.STATE_CONNECTED, common_1.STATE_CLOSING, common_1.STATE_CLOSED], + [common_1.STATE_CLOSING]: [common_1.STATE_CLOSING, common_1.STATE_CLOSED] +}); +/** @internal */ +const kCancelled = Symbol('cancelled'); +/** @internal */ +const kWaitQueue = Symbol('waitQueue'); +/** + * A container of server instances representing a connection to a MongoDB topology. + * @internal + */ +class Topology extends mongo_types_1.TypedEventEmitter { + /** + * @param seedlist - a list of HostAddress instances to connect to + */ + constructor(seeds, options) { + var _a; + super(); + // Legacy CSFLE support + this.bson = Object.create(null); + this.bson.serialize = bson_1.serialize; + this.bson.deserialize = bson_1.deserialize; + // Options should only be undefined in tests, MongoClient will always have defined options + options = options !== null && options !== void 0 ? 
options : { + hosts: [utils_1.HostAddress.fromString('localhost:27017')], + retryReads: connection_string_1.DEFAULT_OPTIONS.get('retryReads'), + retryWrites: connection_string_1.DEFAULT_OPTIONS.get('retryWrites'), + serverSelectionTimeoutMS: connection_string_1.DEFAULT_OPTIONS.get('serverSelectionTimeoutMS'), + directConnection: connection_string_1.DEFAULT_OPTIONS.get('directConnection'), + loadBalanced: connection_string_1.DEFAULT_OPTIONS.get('loadBalanced'), + metadata: connection_string_1.DEFAULT_OPTIONS.get('metadata'), + monitorCommands: connection_string_1.DEFAULT_OPTIONS.get('monitorCommands'), + tls: connection_string_1.DEFAULT_OPTIONS.get('tls'), + maxPoolSize: connection_string_1.DEFAULT_OPTIONS.get('maxPoolSize'), + minPoolSize: connection_string_1.DEFAULT_OPTIONS.get('minPoolSize'), + waitQueueTimeoutMS: connection_string_1.DEFAULT_OPTIONS.get('waitQueueTimeoutMS'), + connectionType: connection_string_1.DEFAULT_OPTIONS.get('connectionType'), + connectTimeoutMS: connection_string_1.DEFAULT_OPTIONS.get('connectTimeoutMS'), + maxIdleTimeMS: connection_string_1.DEFAULT_OPTIONS.get('maxIdleTimeMS'), + heartbeatFrequencyMS: connection_string_1.DEFAULT_OPTIONS.get('heartbeatFrequencyMS'), + minHeartbeatFrequencyMS: connection_string_1.DEFAULT_OPTIONS.get('minHeartbeatFrequencyMS') + }; + if (typeof seeds === 'string') { + seeds = [utils_1.HostAddress.fromString(seeds)]; + } + else if (!Array.isArray(seeds)) { + seeds = [seeds]; + } + const seedlist = []; + for (const seed of seeds) { + if (typeof seed === 'string') { + seedlist.push(utils_1.HostAddress.fromString(seed)); + } + else if (seed instanceof utils_1.HostAddress) { + seedlist.push(seed); + } + else { + // FIXME(NODE-3483): May need to be a MongoParseError + throw new error_1.MongoRuntimeError(`Topology cannot be constructed from ${JSON.stringify(seed)}`); + } + } + const topologyType = topologyTypeFromOptions(options); + const topologyId = globalTopologyCounter++; + const serverDescriptions = new Map(); + for (const hostAddress of seedlist) { + serverDescriptions.set(hostAddress.toString(), new server_description_1.ServerDescription(hostAddress)); + } + this[kWaitQueue] = new Denque(); + this.s = { + // the id of this topology + id: topologyId, + // passed in options + options, + // initial seedlist of servers to connect to + seedlist, + // initial state + state: common_1.STATE_CLOSED, + // the topology description + description: new topology_description_1.TopologyDescription(topologyType, serverDescriptions, options.replicaSet, undefined, undefined, undefined, options), + serverSelectionTimeoutMS: options.serverSelectionTimeoutMS, + heartbeatFrequencyMS: options.heartbeatFrequencyMS, + minHeartbeatFrequencyMS: options.minHeartbeatFrequencyMS, + // a map of server instances to normalized addresses + servers: new Map(), + // Server Session Pool + sessionPool: new sessions_1.ServerSessionPool(this), + // Active client sessions + sessions: new Set(), + credentials: options === null || options === void 0 ? void 0 : options.credentials, + clusterTime: undefined, + // timer management + connectionTimers: new Set(), + detectShardedTopology: ev => this.detectShardedTopology(ev), + detectSrvRecords: ev => this.detectSrvRecords(ev) + }; + if (options.srvHost && !options.loadBalanced) { + this.s.srvPoller = + (_a = options.srvPoller) !== null && _a !== void 0 ? 
_a : new srv_polling_1.SrvPoller({ + heartbeatFrequencyMS: this.s.heartbeatFrequencyMS, + srvHost: options.srvHost + }); + this.on(Topology.TOPOLOGY_DESCRIPTION_CHANGED, this.s.detectShardedTopology); + } + } + detectShardedTopology(event) { + var _a, _b, _c; + const previousType = event.previousDescription.type; + const newType = event.newDescription.type; + const transitionToSharded = previousType !== common_1.TopologyType.Sharded && newType === common_1.TopologyType.Sharded; + const srvListeners = (_a = this.s.srvPoller) === null || _a === void 0 ? void 0 : _a.listeners(srv_polling_1.SrvPoller.SRV_RECORD_DISCOVERY); + const listeningToSrvPolling = !!(srvListeners === null || srvListeners === void 0 ? void 0 : srvListeners.includes(this.s.detectSrvRecords)); + if (transitionToSharded && !listeningToSrvPolling) { + (_b = this.s.srvPoller) === null || _b === void 0 ? void 0 : _b.on(srv_polling_1.SrvPoller.SRV_RECORD_DISCOVERY, this.s.detectSrvRecords); + (_c = this.s.srvPoller) === null || _c === void 0 ? void 0 : _c.start(); + } + } + detectSrvRecords(ev) { + const previousTopologyDescription = this.s.description; + this.s.description = this.s.description.updateFromSrvPollingEvent(ev); + if (this.s.description === previousTopologyDescription) { + // Nothing changed, so return + return; + } + updateServers(this); + this.emit(Topology.TOPOLOGY_DESCRIPTION_CHANGED, new events_1.TopologyDescriptionChangedEvent(this.s.id, previousTopologyDescription, this.s.description)); + } + /** + * @returns A `TopologyDescription` for this topology + */ + get description() { + return this.s.description; + } + get loadBalanced() { + return this.s.options.loadBalanced; + } + get capabilities() { + return new ServerCapabilities(this.lastIsMaster()); + } + /** Initiate server connect */ + connect(options, callback) { + var _a; + if (typeof options === 'function') + (callback = options), (options = {}); + options = options !== null && options !== void 0 ? options : {}; + if (this.s.state === common_1.STATE_CONNECTED) { + if (typeof callback === 'function') { + callback(); + } + return; + } + stateTransition(this, common_1.STATE_CONNECTING); + // emit SDAM monitoring events + this.emit(Topology.TOPOLOGY_OPENING, new events_1.TopologyOpeningEvent(this.s.id)); + // emit an event for the topology change + this.emit(Topology.TOPOLOGY_DESCRIPTION_CHANGED, new events_1.TopologyDescriptionChangedEvent(this.s.id, new topology_description_1.TopologyDescription(common_1.TopologyType.Unknown), // initial is always Unknown + this.s.description)); + // connect all known servers, then attempt server selection to connect + const serverDescriptions = Array.from(this.s.description.servers.values()); + connectServers(this, serverDescriptions); + // In load balancer mode we need to fake a server description getting + // emitted from the monitor, since the monitor doesn't exist. + if (this.s.options.loadBalanced) { + for (const description of serverDescriptions) { + const newDescription = new server_description_1.ServerDescription(description.hostAddress, undefined, { + loadBalanced: this.s.options.loadBalanced + }); + this.serverUpdateHandler(newDescription); + } + } + const readPreference = (_a = options.readPreference) !== null && _a !== void 0 ? _a : read_preference_1.ReadPreference.primary; + this.selectServer(server_selection_1.readPreferenceServerSelector(readPreference), options, (err, server) => { + if (err) { + this.close(); + typeof callback === 'function' ? 
callback(err) : this.emit(Topology.ERROR, err); + return; + } + // TODO: NODE-2471 + if (server && this.s.credentials) { + server.command(utils_1.ns('admin.$cmd'), { ping: 1 }, err => { + if (err) { + typeof callback === 'function' ? callback(err) : this.emit(Topology.ERROR, err); + return; + } + stateTransition(this, common_1.STATE_CONNECTED); + // TODO(NODE-3273) - remove err + this.emit(Topology.OPEN, err, this); + this.emit(Topology.CONNECT, this); + if (typeof callback === 'function') + callback(undefined, this); + }); + return; + } + stateTransition(this, common_1.STATE_CONNECTED); + // TODO(NODE-3273) - remove err + this.emit(Topology.OPEN, err, this); + this.emit(Topology.CONNECT, this); + if (typeof callback === 'function') + callback(undefined, this); + }); + } + /** Close this topology */ + close(options, callback) { + if (typeof options === 'function') { + callback = options; + options = {}; + } + if (typeof options === 'boolean') { + options = { force: options }; + } + options = options !== null && options !== void 0 ? options : {}; + if (this.s.state === common_1.STATE_CLOSED || this.s.state === common_1.STATE_CLOSING) { + if (typeof callback === 'function') { + callback(); + } + return; + } + stateTransition(this, common_1.STATE_CLOSING); + drainWaitQueue(this[kWaitQueue], new error_1.MongoTopologyClosedError()); + common_1.drainTimerQueue(this.s.connectionTimers); + if (this.s.srvPoller) { + this.s.srvPoller.stop(); + this.s.srvPoller.removeListener(srv_polling_1.SrvPoller.SRV_RECORD_DISCOVERY, this.s.detectSrvRecords); + } + this.removeListener(Topology.TOPOLOGY_DESCRIPTION_CHANGED, this.s.detectShardedTopology); + utils_1.eachAsync(Array.from(this.s.sessions.values()), (session, cb) => session.endSession(cb), () => { + this.s.sessionPool.endAllPooledSessions(() => { + utils_1.eachAsync(Array.from(this.s.servers.values()), (server, cb) => destroyServer(server, this, options, cb), err => { + this.s.servers.clear(); + // emit an event for close + this.emit(Topology.TOPOLOGY_CLOSED, new events_1.TopologyClosedEvent(this.s.id)); + stateTransition(this, common_1.STATE_CLOSED); + if (typeof callback === 'function') { + callback(err); + } + }); + }); + }); + } + selectServer(selector, _options, _callback) { + let options = _options; + const callback = (_callback !== null && _callback !== void 0 ? 
_callback : _options); + if (typeof options === 'function') { + options = {}; + } + let serverSelector; + if (typeof selector !== 'function') { + if (typeof selector === 'string') { + serverSelector = server_selection_1.readPreferenceServerSelector(read_preference_1.ReadPreference.fromString(selector)); + } + else { + let readPreference; + if (selector instanceof read_preference_1.ReadPreference) { + readPreference = selector; + } + else { + read_preference_1.ReadPreference.translate(options); + readPreference = options.readPreference || read_preference_1.ReadPreference.primary; + } + serverSelector = server_selection_1.readPreferenceServerSelector(readPreference); + } + } + else { + serverSelector = selector; + } + options = Object.assign({}, { serverSelectionTimeoutMS: this.s.serverSelectionTimeoutMS }, options); + const isSharded = this.description.type === common_1.TopologyType.Sharded; + const session = options.session; + const transaction = session && session.transaction; + if (isSharded && transaction && transaction.server) { + callback(undefined, transaction.server); + return; + } + const waitQueueMember = { + serverSelector, + transaction, + callback + }; + const serverSelectionTimeoutMS = options.serverSelectionTimeoutMS; + if (serverSelectionTimeoutMS) { + waitQueueMember.timer = setTimeout(() => { + waitQueueMember[kCancelled] = true; + waitQueueMember.timer = undefined; + const timeoutError = new error_1.MongoServerSelectionError(`Server selection timed out after ${serverSelectionTimeoutMS} ms`, this.description); + waitQueueMember.callback(timeoutError); + }, serverSelectionTimeoutMS); + } + this[kWaitQueue].push(waitQueueMember); + processWaitQueue(this); + } + // Sessions related methods + /** + * @returns Whether the topology should initiate selection to determine session support + */ + shouldCheckForSessionSupport() { + if (this.description.type === common_1.TopologyType.Single) { + return !this.description.hasKnownServers; + } + return !this.description.hasDataBearingServers; + } + /** + * @returns Whether sessions are supported on the current topology + */ + hasSessionSupport() { + return this.loadBalanced || this.description.logicalSessionTimeoutMinutes != null; + } + /** Start a logical session */ + startSession(options, clientOptions) { + const session = new sessions_1.ClientSession(this, this.s.sessionPool, options, clientOptions); + session.once('ended', () => { + this.s.sessions.delete(session); + }); + this.s.sessions.add(session); + return session; + } + /** Send endSessions command(s) with the given session ids */ + endSessions(sessions, callback) { + if (!Array.isArray(sessions)) { + sessions = [sessions]; + } + this.selectServer(server_selection_1.readPreferenceServerSelector(read_preference_1.ReadPreference.primaryPreferred), (err, server) => { + if (err || !server) { + if (typeof callback === 'function') + callback(err); + return; + } + server.command(utils_1.ns('admin.$cmd'), { endSessions: sessions }, { noResponse: true }, (err, result) => { + if (typeof callback === 'function') + callback(err, result); + }); + }); + } + /** + * Update the internal TopologyDescription with a ServerDescription + * + * @param serverDescription - The server to update in the internal list of server descriptions + */ + serverUpdateHandler(serverDescription) { + if (!this.s.description.hasServer(serverDescription.address)) { + return; + } + // ignore this server update if its from an outdated topologyVersion + if (isStaleServerDescription(this.s.description, serverDescription)) 
{ + return; + } + // these will be used for monitoring events later + const previousTopologyDescription = this.s.description; + const previousServerDescription = this.s.description.servers.get(serverDescription.address); + if (!previousServerDescription) { + return; + } + // Driver Sessions Spec: "Whenever a driver receives a cluster time from + // a server it MUST compare it to the current highest seen cluster time + // for the deployment. If the new cluster time is higher than the + // highest seen cluster time it MUST become the new highest seen cluster + // time. Two cluster times are compared using only the BsonTimestamp + // value of the clusterTime embedded field." + const clusterTime = serverDescription.$clusterTime; + if (clusterTime) { + common_1._advanceClusterTime(this, clusterTime); + } + // If we already know all the information contained in this updated description, then + // we don't need to emit SDAM events, but still need to update the description, in order + // to keep client-tracked attributes like last update time and round trip time up to date + const equalDescriptions = previousServerDescription && previousServerDescription.equals(serverDescription); + // first update the TopologyDescription + this.s.description = this.s.description.update(serverDescription); + if (this.s.description.compatibilityError) { + this.emit(Topology.ERROR, new error_1.MongoCompatibilityError(this.s.description.compatibilityError)); + return; + } + // emit monitoring events for this change + if (!equalDescriptions) { + const newDescription = this.s.description.servers.get(serverDescription.address); + if (newDescription) { + this.emit(Topology.SERVER_DESCRIPTION_CHANGED, new events_1.ServerDescriptionChangedEvent(this.s.id, serverDescription.address, previousServerDescription, newDescription)); + } + } + // update server list from updated descriptions + updateServers(this, serverDescription); + // attempt to resolve any outstanding server selection attempts + if (this[kWaitQueue].length > 0) { + processWaitQueue(this); + } + if (!equalDescriptions) { + this.emit(Topology.TOPOLOGY_DESCRIPTION_CHANGED, new events_1.TopologyDescriptionChangedEvent(this.s.id, previousTopologyDescription, this.s.description)); + } + } + auth(credentials, callback) { + if (typeof credentials === 'function') + (callback = credentials), (credentials = undefined); + if (typeof callback === 'function') + callback(undefined, true); + } + get clientMetadata() { + return this.s.options.metadata; + } + isConnected() { + return this.s.state === common_1.STATE_CONNECTED; + } + isDestroyed() { + return this.s.state === common_1.STATE_CLOSED; + } + /** + * @deprecated This function is deprecated and will be removed in the next major version. + */ + unref() { + utils_1.emitWarning('`unref` is a noop and will be removed in the next major version'); + } + // NOTE: There are many places in code where we explicitly check the last isMaster + // to do feature support detection. This should be done any other way, but for + // now we will just return the first isMaster seen, which should suffice. 
+ lastIsMaster() { + const serverDescriptions = Array.from(this.description.servers.values()); + if (serverDescriptions.length === 0) + return {}; + const sd = serverDescriptions.filter((sd) => sd.type !== common_1.ServerType.Unknown)[0]; + const result = sd || { maxWireVersion: this.description.commonWireVersion }; + return result; + } + get logicalSessionTimeoutMinutes() { + return this.description.logicalSessionTimeoutMinutes; + } + get clusterTime() { + return this.s.clusterTime; + } + set clusterTime(clusterTime) { + this.s.clusterTime = clusterTime; + } +} +exports.Topology = Topology; +/** @event */ +Topology.SERVER_OPENING = 'serverOpening'; +/** @event */ +Topology.SERVER_CLOSED = 'serverClosed'; +/** @event */ +Topology.SERVER_DESCRIPTION_CHANGED = 'serverDescriptionChanged'; +/** @event */ +Topology.TOPOLOGY_OPENING = 'topologyOpening'; +/** @event */ +Topology.TOPOLOGY_CLOSED = 'topologyClosed'; +/** @event */ +Topology.TOPOLOGY_DESCRIPTION_CHANGED = 'topologyDescriptionChanged'; +/** @event */ +Topology.ERROR = 'error'; +/** @event */ +Topology.OPEN = 'open'; +/** @event */ +Topology.CONNECT = 'connect'; +/** @event */ +Topology.CLOSE = 'close'; +/** @event */ +Topology.TIMEOUT = 'timeout'; +/** @public */ +exports.TOPOLOGY_EVENTS = [ + Topology.SERVER_OPENING, + Topology.SERVER_CLOSED, + Topology.SERVER_DESCRIPTION_CHANGED, + Topology.TOPOLOGY_OPENING, + Topology.TOPOLOGY_CLOSED, + Topology.TOPOLOGY_DESCRIPTION_CHANGED, + Topology.ERROR, + Topology.TIMEOUT, + Topology.CLOSE +]; +/** Destroys a server, and removes all event listeners from the instance */ +function destroyServer(server, topology, options, callback) { + options = options !== null && options !== void 0 ? options : {}; + for (const event of LOCAL_SERVER_EVENTS) { + server.removeAllListeners(event); + } + server.destroy(options, () => { + topology.emit(Topology.SERVER_CLOSED, new events_1.ServerClosedEvent(topology.s.id, server.description.address)); + for (const event of SERVER_RELAY_EVENTS) { + server.removeAllListeners(event); + } + if (typeof callback === 'function') { + callback(); + } + }); +} +/** Predicts the TopologyType from options */ +function topologyTypeFromOptions(options) { + if (options === null || options === void 0 ? void 0 : options.directConnection) { + return common_1.TopologyType.Single; + } + if (options === null || options === void 0 ? void 0 : options.replicaSet) { + return common_1.TopologyType.ReplicaSetNoPrimary; + } + if (options === null || options === void 0 ? 
void 0 : options.loadBalanced) { + return common_1.TopologyType.LoadBalanced; + } + return common_1.TopologyType.Unknown; +} +function randomSelection(array) { + return array[Math.floor(Math.random() * array.length)]; +} +/** + * Creates new server instances and attempts to connect them + * + * @param topology - The topology that this server belongs to + * @param serverDescription - The description for the server to initialize and connect to + * @param connectDelay - Time to wait before attempting initial connection + */ +function createAndConnectServer(topology, serverDescription, connectDelay) { + topology.emit(Topology.SERVER_OPENING, new events_1.ServerOpeningEvent(topology.s.id, serverDescription.address)); + const server = new server_1.Server(topology, serverDescription, topology.s.options); + for (const event of SERVER_RELAY_EVENTS) { + server.on(event, (e) => topology.emit(event, e)); + } + server.on(server_1.Server.DESCRIPTION_RECEIVED, description => topology.serverUpdateHandler(description)); + if (connectDelay) { + const connectTimer = setTimeout(() => { + common_1.clearAndRemoveTimerFrom(connectTimer, topology.s.connectionTimers); + server.connect(); + }, connectDelay); + topology.s.connectionTimers.add(connectTimer); + return server; + } + server.connect(); + return server; +} +/** + * Create `Server` instances for all initially known servers, connect them, and assign + * them to the passed in `Topology`. + * + * @param topology - The topology responsible for the servers + * @param serverDescriptions - A list of server descriptions to connect + */ +function connectServers(topology, serverDescriptions) { + topology.s.servers = serverDescriptions.reduce((servers, serverDescription) => { + const server = createAndConnectServer(topology, serverDescription); + servers.set(serverDescription.address, server); + return servers; + }, new Map()); +} +/** + * @param topology - Topology to update. + * @param incomingServerDescription - New server description. 
+ */ +function updateServers(topology, incomingServerDescription) { + // update the internal server's description + if (incomingServerDescription && topology.s.servers.has(incomingServerDescription.address)) { + const server = topology.s.servers.get(incomingServerDescription.address); + if (server) { + server.s.description = incomingServerDescription; + } + } + // add new servers for all descriptions we currently don't know about locally + for (const serverDescription of topology.description.servers.values()) { + if (!topology.s.servers.has(serverDescription.address)) { + const server = createAndConnectServer(topology, serverDescription); + topology.s.servers.set(serverDescription.address, server); + } + } + // for all servers no longer known, remove their descriptions and destroy their instances + for (const entry of topology.s.servers) { + const serverAddress = entry[0]; + if (topology.description.hasServer(serverAddress)) { + continue; + } + if (!topology.s.servers.has(serverAddress)) { + continue; + } + const server = topology.s.servers.get(serverAddress); + topology.s.servers.delete(serverAddress); + // prepare server for garbage collection + if (server) { + destroyServer(server, topology); + } + } +} +function drainWaitQueue(queue, err) { + while (queue.length) { + const waitQueueMember = queue.shift(); + if (!waitQueueMember) { + continue; + } + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + if (!waitQueueMember[kCancelled]) { + waitQueueMember.callback(err); + } + } +} +function processWaitQueue(topology) { + if (topology.s.state === common_1.STATE_CLOSED) { + drainWaitQueue(topology[kWaitQueue], new error_1.MongoTopologyClosedError()); + return; + } + const isSharded = topology.description.type === common_1.TopologyType.Sharded; + const serverDescriptions = Array.from(topology.description.servers.values()); + const membersToProcess = topology[kWaitQueue].length; + for (let i = 0; i < membersToProcess; ++i) { + const waitQueueMember = topology[kWaitQueue].shift(); + if (!waitQueueMember) { + continue; + } + if (waitQueueMember[kCancelled]) { + continue; + } + let selectedDescriptions; + try { + const serverSelector = waitQueueMember.serverSelector; + selectedDescriptions = serverSelector + ? 
serverSelector(topology.description, serverDescriptions) + : serverDescriptions; + } + catch (e) { + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + waitQueueMember.callback(e); + continue; + } + if (selectedDescriptions.length === 0) { + topology[kWaitQueue].push(waitQueueMember); + continue; + } + const selectedServerDescription = randomSelection(selectedDescriptions); + const selectedServer = topology.s.servers.get(selectedServerDescription.address); + const transaction = waitQueueMember.transaction; + if (isSharded && transaction && transaction.isActive && selectedServer) { + transaction.pinServer(selectedServer); + } + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + waitQueueMember.callback(undefined, selectedServer); + } + if (topology[kWaitQueue].length > 0) { + // ensure all server monitors attempt monitoring soon + for (const [, server] of topology.s.servers) { + process.nextTick(function scheduleServerCheck() { + return server.requestCheck(); + }); + } + } +} +function isStaleServerDescription(topologyDescription, incomingServerDescription) { + const currentServerDescription = topologyDescription.servers.get(incomingServerDescription.address); + const currentTopologyVersion = currentServerDescription === null || currentServerDescription === void 0 ? void 0 : currentServerDescription.topologyVersion; + return (server_description_1.compareTopologyVersion(currentTopologyVersion, incomingServerDescription.topologyVersion) > 0); +} +/** @public */ +class ServerCapabilities { + constructor(ismaster) { + this.minWireVersion = ismaster.minWireVersion || 0; + this.maxWireVersion = ismaster.maxWireVersion || 0; + } + get hasAggregationCursor() { + return this.maxWireVersion >= 1; + } + get hasWriteCommands() { + return this.maxWireVersion >= 2; + } + get hasTextSearch() { + return this.minWireVersion >= 0; + } + get hasAuthCommands() { + return this.maxWireVersion >= 1; + } + get hasListCollectionsCommand() { + return this.maxWireVersion >= 3; + } + get hasListIndexesCommand() { + return this.maxWireVersion >= 3; + } + get supportsSnapshotReads() { + return this.maxWireVersion >= 13; + } + get commandsTakeWriteConcern() { + return this.maxWireVersion >= 5; + } + get commandsTakeCollation() { + return this.maxWireVersion >= 5; + } +} +exports.ServerCapabilities = ServerCapabilities; +//# sourceMappingURL=topology.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/topology.js.map b/node_modules/mongodb/lib/sdam/topology.js.map new file mode 100644 index 000000000..8210e35da --- /dev/null +++ b/node_modules/mongodb/lib/sdam/topology.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"topology.js","sourceRoot":"","sources":["../../src/sdam/topology.ts"],"names":[],"mappings":";;;AAAA,iCAAkC;AAClC,wDAAwE;AACxE,6DAAiF;AACjF,iEAA6D;AAC7D,qCAA+D;AAC/D,0CAKqB;AACrB,+CAA2D;AAC3D,6DAA4E;AAC5E,oCAMkB;AAClB,yDAAkF;AAClF,oCASkB;AAClB,qCAYkB;AAClB,qCAOkB;AAKlB,mDAAkF;AAElF,4DAAuD;AACvD,kCAAiD;AACjD,gDAAmD;AAEnD,eAAe;AACf,IAAI,qBAAqB,GAAG,CAAC,CAAC;AAE9B,yCAAyC;AACzC,MAAM,mBAAmB,GAAG;IAC1B,eAAM,CAAC,wBAAwB;IAC/B,eAAM,CAAC,0BAA0B;IACjC,eAAM,CAAC,uBAAuB;IAC9B,uBAAU,CAAC,eAAe;IAC1B,uBAAU,CAAC,iBAAiB;IAC5B,uBAAU,CAAC,cAAc;IACzB,GAAG,6BAAW;CACf,CAAC;AAEF,kDAAkD;AAClD,MAAM,mBAAmB,GAAG;IAC1B,eAAM,CAAC,OAAO;IACd,eAAM,CAAC,oBAAoB;IAC3B,eAAM,CAAC,MAAM;IACb,eAAM,CAAC,KAAK;CACb,CAAC;AAEF,MAAM,eAAe,GAAG,wBAAgB,CAAC;IACvC,CAAC,qBAAY,CAAC,EAAE,CAAC,qBAAY,EAAE,yBAAgB,CAAC;IAChD,CAAC,yBAAgB,CAAC,EAAE,CAAC,yBAAgB,EAAE,sBAAa,EAAE,wBAAe,EAAE,qBAAY,CAAC;IACpF,CAAC,wBAAe,CAAC,EAAE,CAAC,wBAAe,EAAE,sBAAa,EAAE,qBAAY,CAAC;IACjE,CAAC,sBAAa,CAAC,EAAE,CAAC,sBAAa,EAAE,qBAAY,CAAC;CAC/C,CAAC,CAAC;AAEH,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AACvC,gBAAgB;AAChB,MAAM,UAAU,GAAG,MAAM,CAAC,WAAW,CAAC,CAAC;AAkGvC;;;GAGG;AACH,MAAa,QAAS,SAAQ,+BAAiC;IA2C7D;;OAEG;IACH,YAAY,KAAsD,EAAE,OAAwB;;QAC1F,KAAK,EAAE,CAAC;QAER,uBAAuB;QACvB,IAAI,CAAC,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;QAChC,IAAI,CAAC,IAAI,CAAC,SAAS,GAAG,gBAAS,CAAC;QAChC,IAAI,CAAC,IAAI,CAAC,WAAW,GAAG,kBAAW,CAAC;QAEpC,0FAA0F;QAC1F,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI;YACnB,KAAK,EAAE,CAAC,mBAAW,CAAC,UAAU,CAAC,iBAAiB,CAAC,CAAC;YAClD,UAAU,EAAE,mCAAe,CAAC,GAAG,CAAC,YAAY,CAAC;YAC7C,WAAW,EAAE,mCAAe,CAAC,GAAG,CAAC,aAAa,CAAC;YAC/C,wBAAwB,EAAE,mCAAe,CAAC,GAAG,CAAC,0BAA0B,CAAC;YACzE,gBAAgB,EAAE,mCAAe,CAAC,GAAG,CAAC,kBAAkB,CAAC;YACzD,YAAY,EAAE,mCAAe,CAAC,GAAG,CAAC,cAAc,CAAC;YACjD,QAAQ,EAAE,mCAAe,CAAC,GAAG,CAAC,UAAU,CAAC;YACzC,eAAe,EAAE,mCAAe,CAAC,GAAG,CAAC,iBAAiB,CAAC;YACvD,GAAG,EAAE,mCAAe,CAAC,GAAG,CAAC,KAAK,CAAC;YAC/B,WAAW,EAAE,mCAAe,CAAC,GAAG,CAAC,aAAa,CAAC;YAC/C,WAAW,EAAE,mCAAe,CAAC,GAAG,CAAC,aAAa,CAAC;YAC/C,kBAAkB,EAAE,mCAAe,CAAC,GAAG,CAAC,oBAAoB,CAAC;YAC7D,cAAc,EAAE,mCAAe,CAAC,GAAG,CAAC,gBAAgB,CAAC;YACrD,gBAAgB,EAAE,mCAAe,CAAC,GAAG,CAAC,kBAAkB,CAAC;YACzD,aAAa,EAAE,mCAAe,CAAC,GAAG,CAAC,eAAe,CAAC;YACnD,oBAAoB,EAAE,mCAAe,CAAC,GAAG,CAAC,sBAAsB,CAAC;YACjE,uBAAuB,EAAE,mCAAe,CAAC,GAAG,CAAC,yBAAyB,CAAC;SACxE,CAAC;QAEF,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,KAAK,GAAG,CAAC,mBAAW,CAAC,UAAU,CAAC,KAAK,CAAC,CAAC,CAAC;SACzC;aAAM,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE;YAChC,KAAK,GAAG,CAAC,KAAK,CAAC,CAAC;SACjB;QAED,MAAM,QAAQ,GAAkB,EAAE,CAAC;QACnC,KAAK,MAAM,IAAI,IAAI,KAAK,EAAE;YACxB,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;gBAC5B,QAAQ,CAAC,IAAI,CAAC,mBAAW,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC,CAAC;aAC7C;iBAAM,IAAI,IAAI,YAAY,mBAAW,EAAE;gBACtC,QAAQ,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;aACrB;iBAAM;gBACL,qDAAqD;gBACrD,MAAM,IAAI,yBAAiB,CAAC,uCAAuC,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;aAC5F;SACF;QAED,MAAM,YAAY,GAAG,uBAAuB,CAAC,OAAO,CAAC,CAAC;QACtD,MAAM,UAAU,GAAG,qBAAqB,EAAE,CAAC;QAE3C,MAAM,kBAAkB,GAAG,IAAI,GAAG,EAAE,CAAC;QACrC,KAAK,MAAM,WAAW,IAAI,QAAQ,EAAE;YAClC,kBAAkB,CAAC,GAAG,CAAC,WAAW,CAAC,QAAQ,EAAE,EAAE,IAAI,sCAAiB,CAAC,WAAW,CAAC,CAAC,CAAC;SACpF;QAED,IAAI,CAAC,UAAU,CAAC,GAAG,IAAI,MAAM,EAAE,CAAC;QAChC,IAAI,CAAC,CAAC,GAAG;YACP,0BAA0B;YAC1B,EAAE,EAAE,UAAU;YACd,oBAAoB;YACpB,OAAO;YACP,4CAA4C;YAC5C,QAAQ;YACR,gBAAgB;YAChB,KAAK,EAAE,qBAAY;YACnB,2BAA2B;YAC3B,WAAW,EAAE,IAAI,0CAAmB,CAClC,YAAY,EACZ,kBAAkB,EAClB,OAAO,CAAC,UAAU,EAClB,SAAS,EACT,SAAS,EACT,SAAS,EACT,OAAO,CACR;YACD,wBAAwB,EAAE,OAAO,CAAC,wBAAwB;YAC1D,oBAAoB,EAAE,OAAO,CAAC,oBAAoB;YAClD,uBAAuB,EAAE,OAAO,CAAC,uBAAuB;YA
CxD,oDAAoD;YACpD,OAAO,EAAE,IAAI,GAAG,EAAE;YAClB,sBAAsB;YACtB,WAAW,EAAE,IAAI,4BAAiB,CAAC,IAAI,CAAC;YACxC,yBAAyB;YACzB,QAAQ,EAAE,IAAI,GAAG,EAAE;YACnB,WAAW,EAAE,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,WAAW;YACjC,WAAW,EAAE,SAAS;YAEtB,mBAAmB;YACnB,gBAAgB,EAAE,IAAI,GAAG,EAAkB;YAE3C,qBAAqB,EAAE,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC,qBAAqB,CAAC,EAAE,CAAC;YAC3D,gBAAgB,EAAE,EAAE,CAAC,EAAE,CAAC,IAAI,CAAC,gBAAgB,CAAC,EAAE,CAAC;SAClD,CAAC;QAEF,IAAI,OAAO,CAAC,OAAO,IAAI,CAAC,OAAO,CAAC,YAAY,EAAE;YAC5C,IAAI,CAAC,CAAC,CAAC,SAAS;gBACd,MAAA,OAAO,CAAC,SAAS,mCACjB,IAAI,uBAAS,CAAC;oBACZ,oBAAoB,EAAE,IAAI,CAAC,CAAC,CAAC,oBAAoB;oBACjD,OAAO,EAAE,OAAO,CAAC,OAAO;iBACzB,CAAC,CAAC;YAEL,IAAI,CAAC,EAAE,CAAC,QAAQ,CAAC,4BAA4B,EAAE,IAAI,CAAC,CAAC,CAAC,qBAAqB,CAAC,CAAC;SAC9E;IACH,CAAC;IAEO,qBAAqB,CAAC,KAAsC;;QAClE,MAAM,YAAY,GAAG,KAAK,CAAC,mBAAmB,CAAC,IAAI,CAAC;QACpD,MAAM,OAAO,GAAG,KAAK,CAAC,cAAc,CAAC,IAAI,CAAC;QAE1C,MAAM,mBAAmB,GACvB,YAAY,KAAK,qBAAY,CAAC,OAAO,IAAI,OAAO,KAAK,qBAAY,CAAC,OAAO,CAAC;QAC5E,MAAM,YAAY,GAAG,MAAA,IAAI,CAAC,CAAC,CAAC,SAAS,0CAAE,SAAS,CAAC,uBAAS,CAAC,oBAAoB,CAAC,CAAC;QACjF,MAAM,qBAAqB,GAAG,CAAC,CAAC,CAAA,YAAY,aAAZ,YAAY,uBAAZ,YAAY,CAAE,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,gBAAgB,CAAC,CAAA,CAAC;QAEhF,IAAI,mBAAmB,IAAI,CAAC,qBAAqB,EAAE;YACjD,MAAA,IAAI,CAAC,CAAC,CAAC,SAAS,0CAAE,EAAE,CAAC,uBAAS,CAAC,oBAAoB,EAAE,IAAI,CAAC,CAAC,CAAC,gBAAgB,CAAC,CAAC;YAC9E,MAAA,IAAI,CAAC,CAAC,CAAC,SAAS,0CAAE,KAAK,EAAE,CAAC;SAC3B;IACH,CAAC;IAEO,gBAAgB,CAAC,EAAmB;QAC1C,MAAM,2BAA2B,GAAG,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;QACvD,IAAI,CAAC,CAAC,CAAC,WAAW,GAAG,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,yBAAyB,CAAC,EAAE,CAAC,CAAC;QACtE,IAAI,IAAI,CAAC,CAAC,CAAC,WAAW,KAAK,2BAA2B,EAAE;YACtD,6BAA6B;YAC7B,OAAO;SACR;QAED,aAAa,CAAC,IAAI,CAAC,CAAC;QAEpB,IAAI,CAAC,IAAI,CACP,QAAQ,CAAC,4BAA4B,EACrC,IAAI,wCAA+B,CACjC,IAAI,CAAC,CAAC,CAAC,EAAE,EACT,2BAA2B,EAC3B,IAAI,CAAC,CAAC,CAAC,WAAW,CACnB,CACF,CAAC;IACJ,CAAC;IAED;;OAEG;IACH,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC;IACrC,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,kBAAkB,CAAC,IAAI,CAAC,YAAY,EAAE,CAAC,CAAC;IACrD,CAAC;IAED,8BAA8B;IAC9B,OAAO,CAAC,OAAwB,EAAE,QAAmB;;QACnD,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,wBAAe,EAAE;YACpC,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,EAAE,CAAC;aACZ;YAED,OAAO;SACR;QAED,eAAe,CAAC,IAAI,EAAE,yBAAgB,CAAC,CAAC;QAExC,8BAA8B;QAC9B,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,gBAAgB,EAAE,IAAI,6BAAoB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;QAE1E,wCAAwC;QACxC,IAAI,CAAC,IAAI,CACP,QAAQ,CAAC,4BAA4B,EACrC,IAAI,wCAA+B,CACjC,IAAI,CAAC,CAAC,CAAC,EAAE,EACT,IAAI,0CAAmB,CAAC,qBAAY,CAAC,OAAO,CAAC,EAAE,4BAA4B;QAC3E,IAAI,CAAC,CAAC,CAAC,WAAW,CACnB,CACF,CAAC;QAEF,sEAAsE;QACtE,MAAM,kBAAkB,GAAG,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC;QAC3E,cAAc,CAAC,IAAI,EAAE,kBAAkB,CAAC,CAAC;QAEzC,qEAAqE;QACrE,6DAA6D;QAC7D,IAAI,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY,EAAE;YAC/B,KAAK,MAAM,WAAW,IAAI,kBAAkB,EAAE;gBAC5C,MAAM,cAAc,GAAG,IAAI,sCAAiB,CAAC,WAAW,CAAC,WAAW,EAAE,SAAS,EAAE;oBAC/E,YAAY,EAAE,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY;iBAC1C,CAAC,CAAC;gBACH,IAAI,CAAC,mBAAmB,CAAC,cAAc,CAAC,CAAC;aAC1C;SACF;QAED,MAAM,cAAc,GAAG,MAAA,OAAO,CAAC,cAAc,mCAAI,gCAAc,CAAC,OAAO,CAAC;QACxE,IAAI,CAAC,YAAY,CAAC,+CAA4B,CAAC,cAAc,CAAC,EAAE,OAAO,EAAE,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACvF,IAAI,GAAG,EAAE;gBACP,IAAI,CAAC,KAAK,EAAE,CAAC;gBAEb,OAAO,QAAQ,KAAK,UAAU,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,KA
AK,EAAE,GAAG,CAAC,CAAC;gBAChF,OAAO;aACR;YAED,kBAAkB;YAClB,IAAI,MAAM,IAAI,IAAI,CAAC,CAAC,CAAC,WAAW,EAAE;gBAChC,MAAM,CAAC,OAAO,CAAC,UAAE,CAAC,YAAY,CAAC,EAAE,EAAE,IAAI,EAAE,CAAC,EAAE,EAAE,GAAG,CAAC,EAAE;oBAClD,IAAI,GAAG,EAAE;wBACP,OAAO,QAAQ,KAAK,UAAU,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,KAAK,EAAE,GAAG,CAAC,CAAC;wBAChF,OAAO;qBACR;oBAED,eAAe,CAAC,IAAI,EAAE,wBAAe,CAAC,CAAC;oBACvC,+BAA+B;oBAC/B,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,IAAI,EAAE,GAAG,EAAE,IAAI,CAAC,CAAC;oBACpC,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;oBAElC,IAAI,OAAO,QAAQ,KAAK,UAAU;wBAAE,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;gBAChE,CAAC,CAAC,CAAC;gBAEH,OAAO;aACR;YAED,eAAe,CAAC,IAAI,EAAE,wBAAe,CAAC,CAAC;YACvC,+BAA+B;YAC/B,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,IAAI,EAAE,GAAG,EAAE,IAAI,CAAC,CAAC;YACpC,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;YAElC,IAAI,OAAO,QAAQ,KAAK,UAAU;gBAAE,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;QAChE,CAAC,CAAC,CAAC;IACL,CAAC;IAED,0BAA0B;IAC1B,KAAK,CAAC,OAAsB,EAAE,QAAmB;QAC/C,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,QAAQ,GAAG,OAAO,CAAC;YACnB,OAAO,GAAG,EAAE,CAAC;SACd;QAED,IAAI,OAAO,OAAO,KAAK,SAAS,EAAE;YAChC,OAAO,GAAG,EAAE,KAAK,EAAE,OAAO,EAAE,CAAC;SAC9B;QAED,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,IAAI,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,sBAAa,EAAE;YACnE,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;gBAClC,QAAQ,EAAE,CAAC;aACZ;YAED,OAAO;SACR;QAED,eAAe,CAAC,IAAI,EAAE,sBAAa,CAAC,CAAC;QAErC,cAAc,CAAC,IAAI,CAAC,UAAU,CAAC,EAAE,IAAI,gCAAwB,EAAE,CAAC,CAAC;QACjE,wBAAe,CAAC,IAAI,CAAC,CAAC,CAAC,gBAAgB,CAAC,CAAC;QAEzC,IAAI,IAAI,CAAC,CAAC,CAAC,SAAS,EAAE;YACpB,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,IAAI,EAAE,CAAC;YACxB,IAAI,CAAC,CAAC,CAAC,SAAS,CAAC,cAAc,CAAC,uBAAS,CAAC,oBAAoB,EAAE,IAAI,CAAC,CAAC,CAAC,gBAAgB,CAAC,CAAC;SAC1F;QAED,IAAI,CAAC,cAAc,CAAC,QAAQ,CAAC,4BAA4B,EAAE,IAAI,CAAC,CAAC,CAAC,qBAAqB,CAAC,CAAC;QAEzF,iBAAS,CACP,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,MAAM,EAAE,CAAC,EACpC,CAAC,OAAO,EAAE,EAAE,EAAE,EAAE,CAAC,OAAO,CAAC,UAAU,CAAC,EAAE,CAAC,EACvC,GAAG,EAAE;YACH,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,oBAAoB,CAAC,GAAG,EAAE;gBAC3C,iBAAS,CACP,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,EACnC,CAAC,MAAM,EAAE,EAAE,EAAE,EAAE,CAAC,aAAa,CAAC,MAAM,EAAE,IAAI,EAAE,OAAO,EAAE,EAAE,CAAC,EACxD,GAAG,CAAC,EAAE;oBACJ,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,EAAE,CAAC;oBAEvB,0BAA0B;oBAC1B,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,eAAe,EAAE,IAAI,4BAAmB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;oBAExE,eAAe,CAAC,IAAI,EAAE,qBAAY,CAAC,CAAC;oBAEpC,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;wBAClC,QAAQ,CAAC,GAAG,CAAC,CAAC;qBACf;gBACH,CAAC,CACF,CAAC;YACJ,CAAC,CAAC,CAAC;QACL,CAAC,CACF,CAAC;IACJ,CAAC;IAoBD,YAAY,CACV,QAAwE,EACxE,QAAiD,EACjD,SAA4B;QAE5B,IAAI,OAAO,GAAG,QAA+B,CAAC;QAC9C,MAAM,QAAQ,GAAG,CAAC,SAAS,aAAT,SAAS,cAAT,SAAS,GAAI,QAAQ,CAAqB,CAAC;QAC7D,IAAI,OAAO,OAAO,KAAK,UAAU,EAAE;YACjC,OAAO,GAAG,EAAE,CAAC;SACd;QAED,IAAI,cAAc,CAAC;QACnB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;gBAChC,cAAc,GAAG,+CAA4B,CAAC,gCAAc,CAAC,UAAU,CAAC,QAAQ,CAAC,CAAC,CAAC;aACpF;iBAAM;gBACL,IAAI,cAAc,CAAC;gBACnB,IAAI,QAAQ,YAAY,gCAAc,EAAE;oBACtC,cAAc,GAAG,QAAQ,CAAC;iBAC3B;qBAAM;oBACL,gCAAc,CAAC,SAAS,CAAC,OAAO,CAAC,CAAC;oBAClC,cAAc,GAAG,OAAO,CAAC,cAAc,IAAI,gCAAc,CAAC,OAAO,CAAC;iBACnE;gBAED,cAAc,GAAG,+CAA4B,CAAC,cAAgC,CAAC,CAAC;aACjF;SACF;aAAM;YACL,cAAc,GAAG,QAAQ,CAAC;SAC3B;QAED,OAAO,GAAG,MAAM,CAAC,MAAM,CACrB,EAAE,EACF,EAAE,wBAAwB,EAAE,IAAI,CAAC,CAAC,CAAC,wBAAwB,EAAE,EAC7D,OAAO,CACR,CAAC;QAEF,MAAM,SAAS,GAAG,IAAI,CAAC,WAAW,CAAC,IAAI,KAAK,qBAAY,CAAC,OAAO,CAAC;QACjE,MAAM,OAAO,GAAG,OAAO,CAAC,OAAO,CAAC;QA
ChC,MAAM,WAAW,GAAG,OAAO,IAAI,OAAO,CAAC,WAAW,CAAC;QAEnD,IAAI,SAAS,IAAI,WAAW,IAAI,WAAW,CAAC,MAAM,EAAE;YAClD,QAAQ,CAAC,SAAS,EAAE,WAAW,CAAC,MAAM,CAAC,CAAC;YACxC,OAAO;SACR;QAED,MAAM,eAAe,GAA2B;YAC9C,cAAc;YACd,WAAW;YACX,QAAQ;SACT,CAAC;QAEF,MAAM,wBAAwB,GAAG,OAAO,CAAC,wBAAwB,CAAC;QAClE,IAAI,wBAAwB,EAAE;YAC5B,eAAe,CAAC,KAAK,GAAG,UAAU,CAAC,GAAG,EAAE;gBACtC,eAAe,CAAC,UAAU,CAAC,GAAG,IAAI,CAAC;gBACnC,eAAe,CAAC,KAAK,GAAG,SAAS,CAAC;gBAClC,MAAM,YAAY,GAAG,IAAI,iCAAyB,CAChD,oCAAoC,wBAAwB,KAAK,EACjE,IAAI,CAAC,WAAW,CACjB,CAAC;gBAEF,eAAe,CAAC,QAAQ,CAAC,YAAY,CAAC,CAAC;YACzC,CAAC,EAAE,wBAAwB,CAAC,CAAC;SAC9B;QAED,IAAI,CAAC,UAAU,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;QACvC,gBAAgB,CAAC,IAAI,CAAC,CAAC;IACzB,CAAC;IAED,2BAA2B;IAE3B;;OAEG;IACH,4BAA4B;QAC1B,IAAI,IAAI,CAAC,WAAW,CAAC,IAAI,KAAK,qBAAY,CAAC,MAAM,EAAE;YACjD,OAAO,CAAC,IAAI,CAAC,WAAW,CAAC,eAAe,CAAC;SAC1C;QAED,OAAO,CAAC,IAAI,CAAC,WAAW,CAAC,qBAAqB,CAAC;IACjD,CAAC;IAED;;OAEG;IACH,iBAAiB;QACf,OAAO,IAAI,CAAC,YAAY,IAAI,IAAI,CAAC,WAAW,CAAC,4BAA4B,IAAI,IAAI,CAAC;IACpF,CAAC;IAED,8BAA8B;IAC9B,YAAY,CAAC,OAA6B,EAAE,aAA4B;QACtE,MAAM,OAAO,GAAG,IAAI,wBAAa,CAAC,IAAI,EAAE,IAAI,CAAC,CAAC,CAAC,WAAW,EAAE,OAAO,EAAE,aAAa,CAAC,CAAC;QACpF,OAAO,CAAC,IAAI,CAAC,OAAO,EAAE,GAAG,EAAE;YACzB,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;QAClC,CAAC,CAAC,CAAC;QAEH,IAAI,CAAC,CAAC,CAAC,QAAQ,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;QAC7B,OAAO,OAAO,CAAC;IACjB,CAAC;IAED,6DAA6D;IAC7D,WAAW,CAAC,QAA2B,EAAE,QAA6B;QACpE,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,QAAQ,CAAC,EAAE;YAC5B,QAAQ,GAAG,CAAC,QAAQ,CAAC,CAAC;SACvB;QAED,IAAI,CAAC,YAAY,CACf,+CAA4B,CAAC,gCAAc,CAAC,gBAAgB,CAAC,EAC7D,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;YACd,IAAI,GAAG,IAAI,CAAC,MAAM,EAAE;gBAClB,IAAI,OAAO,QAAQ,KAAK,UAAU;oBAAE,QAAQ,CAAC,GAAG,CAAC,CAAC;gBAClD,OAAO;aACR;YAED,MAAM,CAAC,OAAO,CACZ,UAAE,CAAC,YAAY,CAAC,EAChB,EAAE,WAAW,EAAE,QAAQ,EAAE,EACzB,EAAE,UAAU,EAAE,IAAI,EAAE,EACpB,CAAC,GAAG,EAAE,MAAM,EAAE,EAAE;gBACd,IAAI,OAAO,QAAQ,KAAK,UAAU;oBAAE,QAAQ,CAAC,GAAG,EAAE,MAAM,CAAC,CAAC;YAC5D,CAAC,CACF,CAAC;QACJ,CAAC,CACF,CAAC;IACJ,CAAC;IAED;;;;OAIG;IACH,mBAAmB,CAAC,iBAAoC;QACtD,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,SAAS,CAAC,iBAAiB,CAAC,OAAO,CAAC,EAAE;YAC5D,OAAO;SACR;QAED,oEAAoE;QACpE,IAAI,wBAAwB,CAAC,IAAI,CAAC,CAAC,CAAC,WAAW,EAAE,iBAAiB,CAAC,EAAE;YACnE,OAAO;SACR;QAED,iDAAiD;QACjD,MAAM,2BAA2B,GAAG,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;QACvD,MAAM,yBAAyB,GAAG,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;QAC5F,IAAI,CAAC,yBAAyB,EAAE;YAC9B,OAAO;SACR;QAED,wEAAwE;QACxE,uEAAuE;QACvE,iEAAiE;QACjE,wEAAwE;QACxE,oEAAoE;QACpE,4CAA4C;QAC5C,MAAM,WAAW,GAAG,iBAAiB,CAAC,YAAY,CAAC;QACnD,IAAI,WAAW,EAAE;YACf,4BAAmB,CAAC,IAAI,EAAE,WAAW,CAAC,CAAC;SACxC;QAED,qFAAqF;QACrF,wFAAwF;QACxF,yFAAyF;QACzF,MAAM,iBAAiB,GACrB,yBAAyB,IAAI,yBAAyB,CAAC,MAAM,CAAC,iBAAiB,CAAC,CAAC;QAEnF,uCAAuC;QACvC,IAAI,CAAC,CAAC,CAAC,WAAW,GAAG,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,MAAM,CAAC,iBAAiB,CAAC,CAAC;QAClE,IAAI,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,kBAAkB,EAAE;YACzC,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,KAAK,EAAE,IAAI,+BAAuB,CAAC,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,kBAAkB,CAAC,CAAC,CAAC;YAC9F,OAAO;SACR;QAED,yCAAyC;QACzC,IAAI,CAAC,iBAAiB,EAAE;YACtB,MAAM,cAAc,GAAG,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;YACjF,IAAI,cAAc,EAAE;gBAClB,IAAI,CAAC,IAAI,CACP,QAAQ,CAAC,0BAA0B,EACnC,IAAI,sCAA6B,CAC/B,IAAI,CAAC,CAAC,CAAC,EAAE,EACT,iBAAiB,CAAC,OAAO,EACzB,yBAAyB,EACzB,cAAc,CACf,CACF,CAAC;aACH;SACF;QAED,+CAA+C;QAC/C,aAAa,CAAC,IAAI,EAAE,iBAAiB,CAAC,CAAC;QAEvC,+DAA+D;QAC/D,IAAI,IAAI,CAAC,UAAU,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;YAC/B,gBAAgB,CAAC,IAAI,CAAC,CAAC;SACxB;QAED,IAAI,CAAC,iBAAiB,EAAE;YACtB,IAAI,CAAC,IAAI
,CACP,QAAQ,CAAC,4BAA4B,EACrC,IAAI,wCAA+B,CACjC,IAAI,CAAC,CAAC,CAAC,EAAE,EACT,2BAA2B,EAC3B,IAAI,CAAC,CAAC,CAAC,WAAW,CACnB,CACF,CAAC;SACH;IACH,CAAC;IAED,IAAI,CAAC,WAA8B,EAAE,QAAmB;QACtD,IAAI,OAAO,WAAW,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,WAAW,CAAC,EAAE,CAAC,WAAW,GAAG,SAAS,CAAC,CAAC;QAC3F,IAAI,OAAO,QAAQ,KAAK,UAAU;YAAE,QAAQ,CAAC,SAAS,EAAE,IAAI,CAAC,CAAC;IAChE,CAAC;IAED,IAAI,cAAc;QAChB,OAAO,IAAI,CAAC,CAAC,CAAC,OAAO,CAAC,QAAQ,CAAC;IACjC,CAAC;IAED,WAAW;QACT,OAAO,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,wBAAe,CAAC;IAC1C,CAAC;IAED,WAAW;QACT,OAAO,IAAI,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,CAAC;IACvC,CAAC;IAED;;OAEG;IACH,KAAK;QACH,mBAAW,CAAC,iEAAiE,CAAC,CAAC;IACjF,CAAC;IAED,kFAAkF;IAClF,oFAAoF;IACpF,+EAA+E;IAC/E,YAAY;QACV,MAAM,kBAAkB,GAAG,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,WAAW,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC;QACzE,IAAI,kBAAkB,CAAC,MAAM,KAAK,CAAC;YAAE,OAAO,EAAE,CAAC;QAC/C,MAAM,EAAE,GAAG,kBAAkB,CAAC,MAAM,CAClC,CAAC,EAAqB,EAAE,EAAE,CAAC,EAAE,CAAC,IAAI,KAAK,mBAAU,CAAC,OAAO,CAC1D,CAAC,CAAC,CAAC,CAAC;QAEL,MAAM,MAAM,GAAG,EAAE,IAAI,EAAE,cAAc,EAAE,IAAI,CAAC,WAAW,CAAC,iBAAiB,EAAE,CAAC;QAC5E,OAAO,MAAM,CAAC;IAChB,CAAC;IAED,IAAI,4BAA4B;QAC9B,OAAO,IAAI,CAAC,WAAW,CAAC,4BAA4B,CAAC;IACvD,CAAC;IAED,IAAI,WAAW;QACb,OAAO,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC;IAC5B,CAAC;IAED,IAAI,WAAW,CAAC,WAAoC;QAClD,IAAI,CAAC,CAAC,CAAC,WAAW,GAAG,WAAW,CAAC;IACnC,CAAC;;AArmBH,4BAsmBC;AA5lBC,aAAa;AACG,uBAAc,GAAG,eAAwB,CAAC;AAC1D,aAAa;AACG,sBAAa,GAAG,cAAuB,CAAC;AACxD,aAAa;AACG,mCAA0B,GAAG,0BAAmC,CAAC;AACjF,aAAa;AACG,yBAAgB,GAAG,iBAA0B,CAAC;AAC9D,aAAa;AACG,wBAAe,GAAG,gBAAyB,CAAC;AAC5D,aAAa;AACG,qCAA4B,GAAG,4BAAqC,CAAC;AACrF,aAAa;AACG,cAAK,GAAG,OAAgB,CAAC;AACzC,aAAa;AACG,aAAI,GAAG,MAAe,CAAC;AACvC,aAAa;AACG,gBAAO,GAAG,SAAkB,CAAC;AAC7C,aAAa;AACG,cAAK,GAAG,OAAgB,CAAC;AACzC,aAAa;AACG,gBAAO,GAAG,SAAkB,CAAC;AAykB/C,cAAc;AACD,QAAA,eAAe,GAAG;IAC7B,QAAQ,CAAC,cAAc;IACvB,QAAQ,CAAC,aAAa;IACtB,QAAQ,CAAC,0BAA0B;IACnC,QAAQ,CAAC,gBAAgB;IACzB,QAAQ,CAAC,eAAe;IACxB,QAAQ,CAAC,4BAA4B;IACrC,QAAQ,CAAC,KAAK;IACd,QAAQ,CAAC,OAAO;IAChB,QAAQ,CAAC,KAAK;CACf,CAAC;AAEF,2EAA2E;AAC3E,SAAS,aAAa,CACpB,MAAc,EACd,QAAkB,EAClB,OAAwB,EACxB,QAAmB;IAEnB,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IACxB,KAAK,MAAM,KAAK,IAAI,mBAAmB,EAAE;QACvC,MAAM,CAAC,kBAAkB,CAAC,KAAK,CAAC,CAAC;KAClC;IAED,MAAM,CAAC,OAAO,CAAC,OAAO,EAAE,GAAG,EAAE;QAC3B,QAAQ,CAAC,IAAI,CACX,QAAQ,CAAC,aAAa,EACtB,IAAI,0BAAiB,CAAC,QAAQ,CAAC,CAAC,CAAC,EAAE,EAAE,MAAM,CAAC,WAAW,CAAC,OAAO,CAAC,CACjE,CAAC;QAEF,KAAK,MAAM,KAAK,IAAI,mBAAmB,EAAE;YACvC,MAAM,CAAC,kBAAkB,CAAC,KAAK,CAAC,CAAC;SAClC;QACD,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,QAAQ,EAAE,CAAC;SACZ;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AAED,6CAA6C;AAC7C,SAAS,uBAAuB,CAAC,OAAyB;IACxD,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,gBAAgB,EAAE;QAC7B,OAAO,qBAAY,CAAC,MAAM,CAAC;KAC5B;IAED,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,UAAU,EAAE;QACvB,OAAO,qBAAY,CAAC,mBAAmB,CAAC;KACzC;IAED,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,YAAY,EAAE;QACzB,OAAO,qBAAY,CAAC,YAAY,CAAC;KAClC;IAED,OAAO,qBAAY,CAAC,OAAO,CAAC;AAC9B,CAAC;AAED,SAAS,eAAe,CAAC,KAA0B;IACjD,OAAO,KAAK,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,MAAM,EAAE,GAAG,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC;AACzD,CAAC;AAED;;;;;;GAMG;AACH,SAAS,sBAAsB,CAC7B,QAAkB,EAClB,iBAAoC,EACpC,YAAqB;IAErB,QAAQ,CAAC,IAAI,CACX,QAAQ,CAAC,cAAc,EACvB,IAAI,2BAAkB,CAAC,QAAQ,CAAC,CAAC,CAAC,EAAE,EAAE,iBAAiB,CAAC,OAAO,CAAC,CACjE,CAAC;IAEF,MAAM,MAAM,GAAG,IAAI,eAAM,CAAC,QAAQ,EAAE,iBAAiB,EAAE,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC;IAC3E,KAAK,MAAM,KAAK,IAAI,mBAAmB,EAAE;QACvC,MAAM,CAAC,EAAE,CAAC,KAAK,EAAE,CAAC,CAAM,EAAE,EAAE,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,EAAE,CAAC,CAAC,CAAC,CAAC;KACvD;IAED,MAAM,CAAC,EAAE,CAAC,eAAM,CAAC,oBAAoB,EAAE,WAAW,CAAC,EAAE,CAAC,Q
AAQ,CAAC,mBAAmB,CAAC,WAAW,CAAC,CAAC,CAAC;IAEjG,IAAI,YAAY,EAAE;QAChB,MAAM,YAAY,GAAG,UAAU,CAAC,GAAG,EAAE;YACnC,gCAAuB,CAAC,YAAY,EAAE,QAAQ,CAAC,CAAC,CAAC,gBAAgB,CAAC,CAAC;YACnE,MAAM,CAAC,OAAO,EAAE,CAAC;QACnB,CAAC,EAAE,YAAY,CAAC,CAAC;QAEjB,QAAQ,CAAC,CAAC,CAAC,gBAAgB,CAAC,GAAG,CAAC,YAAY,CAAC,CAAC;QAC9C,OAAO,MAAM,CAAC;KACf;IAED,MAAM,CAAC,OAAO,EAAE,CAAC;IACjB,OAAO,MAAM,CAAC;AAChB,CAAC;AAED;;;;;;GAMG;AACH,SAAS,cAAc,CAAC,QAAkB,EAAE,kBAAuC;IACjF,QAAQ,CAAC,CAAC,CAAC,OAAO,GAAG,kBAAkB,CAAC,MAAM,CAC5C,CAAC,OAA4B,EAAE,iBAAoC,EAAE,EAAE;QACrE,MAAM,MAAM,GAAG,sBAAsB,CAAC,QAAQ,EAAE,iBAAiB,CAAC,CAAC;QACnE,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;QAC/C,OAAO,OAAO,CAAC;IACjB,CAAC,EACD,IAAI,GAAG,EAAkB,CAC1B,CAAC;AACJ,CAAC;AAED;;;GAGG;AACH,SAAS,aAAa,CAAC,QAAkB,EAAE,yBAA6C;IACtF,2CAA2C;IAC3C,IAAI,yBAAyB,IAAI,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,yBAAyB,CAAC,OAAO,CAAC,EAAE;QAC1F,MAAM,MAAM,GAAG,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,yBAAyB,CAAC,OAAO,CAAC,CAAC;QACzE,IAAI,MAAM,EAAE;YACV,MAAM,CAAC,CAAC,CAAC,WAAW,GAAG,yBAAyB,CAAC;SAClD;KACF;IAED,6EAA6E;IAC7E,KAAK,MAAM,iBAAiB,IAAI,QAAQ,CAAC,WAAW,CAAC,OAAO,CAAC,MAAM,EAAE,EAAE;QACrE,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,OAAO,CAAC,EAAE;YACtD,MAAM,MAAM,GAAG,sBAAsB,CAAC,QAAQ,EAAE,iBAAiB,CAAC,CAAC;YACnE,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,iBAAiB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;SAC3D;KACF;IAED,yFAAyF;IACzF,KAAK,MAAM,KAAK,IAAI,QAAQ,CAAC,CAAC,CAAC,OAAO,EAAE;QACtC,MAAM,aAAa,GAAG,KAAK,CAAC,CAAC,CAAC,CAAC;QAC/B,IAAI,QAAQ,CAAC,WAAW,CAAC,SAAS,CAAC,aAAa,CAAC,EAAE;YACjD,SAAS;SACV;QAED,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,EAAE;YAC1C,SAAS;SACV;QAED,MAAM,MAAM,GAAG,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,aAAa,CAAC,CAAC;QACrD,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,MAAM,CAAC,aAAa,CAAC,CAAC;QAEzC,wCAAwC;QACxC,IAAI,MAAM,EAAE;YACV,aAAa,CAAC,MAAM,EAAE,QAAQ,CAAC,CAAC;SACjC;KACF;AACH,CAAC;AAED,SAAS,cAAc,CAAC,KAAqC,EAAE,GAAsB;IACnF,OAAO,KAAK,CAAC,MAAM,EAAE;QACnB,MAAM,eAAe,GAAG,KAAK,CAAC,KAAK,EAAE,CAAC;QACtC,IAAI,CAAC,eAAe,EAAE;YACpB,SAAS;SACV;QAED,IAAI,eAAe,CAAC,KAAK,EAAE;YACzB,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;SACrC;QAED,IAAI,CAAC,eAAe,CAAC,UAAU,CAAC,EAAE;YAChC,eAAe,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC;SAC/B;KACF;AACH,CAAC;AAED,SAAS,gBAAgB,CAAC,QAAkB;IAC1C,IAAI,QAAQ,CAAC,CAAC,CAAC,KAAK,KAAK,qBAAY,EAAE;QACrC,cAAc,CAAC,QAAQ,CAAC,UAAU,CAAC,EAAE,IAAI,gCAAwB,EAAE,CAAC,CAAC;QACrE,OAAO;KACR;IAED,MAAM,SAAS,GAAG,QAAQ,CAAC,WAAW,CAAC,IAAI,KAAK,qBAAY,CAAC,OAAO,CAAC;IACrE,MAAM,kBAAkB,GAAG,KAAK,CAAC,IAAI,CAAC,QAAQ,CAAC,WAAW,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC;IAC7E,MAAM,gBAAgB,GAAG,QAAQ,CAAC,UAAU,CAAC,CAAC,MAAM,CAAC;IACrD,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,CAAC,GAAG,gBAAgB,EAAE,EAAE,CAAC,EAAE;QACzC,MAAM,eAAe,GAAG,QAAQ,CAAC,UAAU,CAAC,CAAC,KAAK,EAAE,CAAC;QACrD,IAAI,CAAC,eAAe,EAAE;YACpB,SAAS;SACV;QAED,IAAI,eAAe,CAAC,UAAU,CAAC,EAAE;YAC/B,SAAS;SACV;QAED,IAAI,oBAAoB,CAAC;QACzB,IAAI;YACF,MAAM,cAAc,GAAG,eAAe,CAAC,cAAc,CAAC;YACtD,oBAAoB,GAAG,cAAc;gBACnC,CAAC,CAAC,cAAc,CAAC,QAAQ,CAAC,WAAW,EAAE,kBAAkB,CAAC;gBAC1D,CAAC,CAAC,kBAAkB,CAAC;SACxB;QAAC,OAAO,CAAC,EAAE;YACV,IAAI,eAAe,CAAC,KAAK,EAAE;gBACzB,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;aACrC;YAED,eAAe,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;YAC5B,SAAS;SACV;QAED,IAAI,oBAAoB,CAAC,MAAM,KAAK,CAAC,EAAE;YACrC,QAAQ,CAAC,UAAU,CAAC,CAAC,IAAI,CAAC,eAAe,CAAC,CAAC;YAC3C,SAAS;SACV;QAED,MAAM,yBAAyB,GAAG,eAAe,CAAC,oBAAoB,CAAC,CAAC;QACxE,MAAM,cAAc,GAAG,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,GAAG,CAAC,yBAAyB,CAAC,OAAO,CAAC,CAAC;QACjF,MAAM,WAAW,GAAG,eAAe,CAAC,WAAW,CAAC;QAChD,IAAI,SAAS,IAAI,WAAW,IAAI,WAAW,CAAC,QAAQ,IAAI,cAAc,EAAE;YACtE,WAAW,CAAC,SAAS,CAAC,cAAc,CAAC,CAAC;S
ACvC;QAED,IAAI,eAAe,CAAC,KAAK,EAAE;YACzB,YAAY,CAAC,eAAe,CAAC,KAAK,CAAC,CAAC;SACrC;QAED,eAAe,CAAC,QAAQ,CAAC,SAAS,EAAE,cAAc,CAAC,CAAC;KACrD;IAED,IAAI,QAAQ,CAAC,UAAU,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;QACnC,qDAAqD;QACrD,KAAK,MAAM,CAAC,EAAE,MAAM,CAAC,IAAI,QAAQ,CAAC,CAAC,CAAC,OAAO,EAAE;YAC3C,OAAO,CAAC,QAAQ,CAAC,SAAS,mBAAmB;gBAC3C,OAAO,MAAM,CAAC,YAAY,EAAE,CAAC;YAC/B,CAAC,CAAC,CAAC;SACJ;KACF;AACH,CAAC;AAED,SAAS,wBAAwB,CAC/B,mBAAwC,EACxC,yBAA4C;IAE5C,MAAM,wBAAwB,GAAG,mBAAmB,CAAC,OAAO,CAAC,GAAG,CAC9D,yBAAyB,CAAC,OAAO,CAClC,CAAC;IACF,MAAM,sBAAsB,GAAG,wBAAwB,aAAxB,wBAAwB,uBAAxB,wBAAwB,CAAE,eAAe,CAAC;IACzE,OAAO,CACL,2CAAsB,CAAC,sBAAsB,EAAE,yBAAyB,CAAC,eAAe,CAAC,GAAG,CAAC,CAC9F,CAAC;AACJ,CAAC;AAED,cAAc;AACd,MAAa,kBAAkB;IAI7B,YAAY,QAAkB;QAC5B,IAAI,CAAC,cAAc,GAAG,QAAQ,CAAC,cAAc,IAAI,CAAC,CAAC;QACnD,IAAI,CAAC,cAAc,GAAG,QAAQ,CAAC,cAAc,IAAI,CAAC,CAAC;IACrD,CAAC;IAED,IAAI,oBAAoB;QACtB,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,gBAAgB;QAClB,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IACD,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,eAAe;QACjB,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,yBAAyB;QAC3B,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,qBAAqB;QACvB,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,qBAAqB;QACvB,OAAO,IAAI,CAAC,cAAc,IAAI,EAAE,CAAC;IACnC,CAAC;IAED,IAAI,wBAAwB;QAC1B,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;IAED,IAAI,qBAAqB;QACvB,OAAO,IAAI,CAAC,cAAc,IAAI,CAAC,CAAC;IAClC,CAAC;CACF;AA3CD,gDA2CC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/topology_description.js b/node_modules/mongodb/lib/sdam/topology_description.js new file mode 100644 index 000000000..37276a96c --- /dev/null +++ b/node_modules/mongodb/lib/sdam/topology_description.js @@ -0,0 +1,337 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.TopologyDescription = void 0; +const server_description_1 = require("./server_description"); +const WIRE_CONSTANTS = require("../cmap/wire_protocol/constants"); +const common_1 = require("./common"); +const error_1 = require("../error"); +// constants related to compatibility checks +const MIN_SUPPORTED_SERVER_VERSION = WIRE_CONSTANTS.MIN_SUPPORTED_SERVER_VERSION; +const MAX_SUPPORTED_SERVER_VERSION = WIRE_CONSTANTS.MAX_SUPPORTED_SERVER_VERSION; +const MIN_SUPPORTED_WIRE_VERSION = WIRE_CONSTANTS.MIN_SUPPORTED_WIRE_VERSION; +const MAX_SUPPORTED_WIRE_VERSION = WIRE_CONSTANTS.MAX_SUPPORTED_WIRE_VERSION; +const MONGOS_OR_UNKNOWN = new Set([common_1.ServerType.Mongos, common_1.ServerType.Unknown]); +const MONGOS_OR_STANDALONE = new Set([common_1.ServerType.Mongos, common_1.ServerType.Standalone]); +const NON_PRIMARY_RS_MEMBERS = new Set([ + common_1.ServerType.RSSecondary, + common_1.ServerType.RSArbiter, + common_1.ServerType.RSOther +]); +/** + * Representation of a deployment of servers + * @public + */ +class TopologyDescription { + /** + * Create a TopologyDescription + */ + constructor(topologyType, serverDescriptions, setName, maxSetVersion, maxElectionId, commonWireVersion, options) { + var _a, _b; + options = options !== null && options !== void 0 ? options : {}; + // TODO: consider assigning all these values to a temporary value `s` which + // we use `Object.freeze` on, ensuring the internal state of this type + // is immutable. + this.type = topologyType !== null && topologyType !== void 0 ? topologyType : common_1.TopologyType.Unknown; + this.servers = serverDescriptions !== null && serverDescriptions !== void 0 ? 
serverDescriptions : new Map(); + this.stale = false; + this.compatible = true; + this.heartbeatFrequencyMS = (_a = options.heartbeatFrequencyMS) !== null && _a !== void 0 ? _a : 0; + this.localThresholdMS = (_b = options.localThresholdMS) !== null && _b !== void 0 ? _b : 0; + if (setName) { + this.setName = setName; + } + if (maxSetVersion) { + this.maxSetVersion = maxSetVersion; + } + if (maxElectionId) { + this.maxElectionId = maxElectionId; + } + if (commonWireVersion) { + this.commonWireVersion = commonWireVersion; + } + // determine server compatibility + for (const serverDescription of this.servers.values()) { + // Load balancer mode is always compatible. + if (serverDescription.type === common_1.ServerType.Unknown || + serverDescription.type === common_1.ServerType.LoadBalancer) { + continue; + } + if (serverDescription.minWireVersion > MAX_SUPPORTED_WIRE_VERSION) { + this.compatible = false; + this.compatibilityError = `Server at ${serverDescription.address} requires wire version ${serverDescription.minWireVersion}, but this version of the driver only supports up to ${MAX_SUPPORTED_WIRE_VERSION} (MongoDB ${MAX_SUPPORTED_SERVER_VERSION})`; + } + if (serverDescription.maxWireVersion < MIN_SUPPORTED_WIRE_VERSION) { + this.compatible = false; + this.compatibilityError = `Server at ${serverDescription.address} reports wire version ${serverDescription.maxWireVersion}, but this version of the driver requires at least ${MIN_SUPPORTED_WIRE_VERSION} (MongoDB ${MIN_SUPPORTED_SERVER_VERSION}).`; + break; + } + } + // Whenever a client updates the TopologyDescription from an ismaster response, it MUST set + // TopologyDescription.logicalSessionTimeoutMinutes to the smallest logicalSessionTimeoutMinutes + // value among ServerDescriptions of all data-bearing server types. If any have a null + // logicalSessionTimeoutMinutes, then TopologyDescription.logicalSessionTimeoutMinutes MUST be + // set to null. 
+ this.logicalSessionTimeoutMinutes = undefined; + for (const [, server] of this.servers) { + if (server.isReadable) { + if (server.logicalSessionTimeoutMinutes == null) { + // If any of the servers have a null logicalSessionsTimeout, then the whole topology does + this.logicalSessionTimeoutMinutes = undefined; + break; + } + if (this.logicalSessionTimeoutMinutes == null) { + // First server with a non null logicalSessionsTimeout + this.logicalSessionTimeoutMinutes = server.logicalSessionTimeoutMinutes; + continue; + } + // Always select the smaller of the: + // current server logicalSessionsTimeout and the topologies logicalSessionsTimeout + this.logicalSessionTimeoutMinutes = Math.min(this.logicalSessionTimeoutMinutes, server.logicalSessionTimeoutMinutes); + } + } + } + /** + * Returns a new TopologyDescription based on the SrvPollingEvent + * @internal + */ + updateFromSrvPollingEvent(ev) { + const newAddresses = ev.addresses(); + const serverDescriptions = new Map(this.servers); + for (const address of this.servers.keys()) { + if (newAddresses.has(address)) { + newAddresses.delete(address); + } + else { + serverDescriptions.delete(address); + } + } + if (serverDescriptions.size === this.servers.size && newAddresses.size === 0) { + return this; + } + for (const [address, host] of newAddresses) { + serverDescriptions.set(address, new server_description_1.ServerDescription(host)); + } + return new TopologyDescription(this.type, serverDescriptions, this.setName, this.maxSetVersion, this.maxElectionId, this.commonWireVersion, { heartbeatFrequencyMS: this.heartbeatFrequencyMS, localThresholdMS: this.localThresholdMS }); + } + /** + * Returns a copy of this description updated with a given ServerDescription + * @internal + */ + update(serverDescription) { + const address = serverDescription.address; + // potentially mutated values + let { type: topologyType, setName, maxSetVersion, maxElectionId, commonWireVersion } = this; + if (serverDescription.setName && setName && serverDescription.setName !== setName) { + serverDescription = new server_description_1.ServerDescription(address, undefined); + } + const serverType = serverDescription.type; + const serverDescriptions = new Map(this.servers); + // update common wire version + if (serverDescription.maxWireVersion !== 0) { + if (commonWireVersion == null) { + commonWireVersion = serverDescription.maxWireVersion; + } + else { + commonWireVersion = Math.min(commonWireVersion, serverDescription.maxWireVersion); + } + } + // update the actual server description + serverDescriptions.set(address, serverDescription); + if (topologyType === common_1.TopologyType.Single) { + // once we are defined as single, that never changes + return new TopologyDescription(common_1.TopologyType.Single, serverDescriptions, setName, maxSetVersion, maxElectionId, commonWireVersion, { heartbeatFrequencyMS: this.heartbeatFrequencyMS, localThresholdMS: this.localThresholdMS }); + } + if (topologyType === common_1.TopologyType.Unknown) { + if (serverType === common_1.ServerType.Standalone && this.servers.size !== 1) { + serverDescriptions.delete(address); + } + else { + topologyType = topologyTypeForServerType(serverType); + } + } + if (topologyType === common_1.TopologyType.Sharded) { + if (!MONGOS_OR_UNKNOWN.has(serverType)) { + serverDescriptions.delete(address); + } + } + if (topologyType === common_1.TopologyType.ReplicaSetNoPrimary) { + if (MONGOS_OR_STANDALONE.has(serverType)) { + serverDescriptions.delete(address); + } + if (serverType === 
common_1.ServerType.RSPrimary) { + const result = updateRsFromPrimary(serverDescriptions, serverDescription, setName, maxSetVersion, maxElectionId); + topologyType = result[0]; + setName = result[1]; + maxSetVersion = result[2]; + maxElectionId = result[3]; + } + else if (NON_PRIMARY_RS_MEMBERS.has(serverType)) { + const result = updateRsNoPrimaryFromMember(serverDescriptions, serverDescription, setName); + topologyType = result[0]; + setName = result[1]; + } + } + if (topologyType === common_1.TopologyType.ReplicaSetWithPrimary) { + if (MONGOS_OR_STANDALONE.has(serverType)) { + serverDescriptions.delete(address); + topologyType = checkHasPrimary(serverDescriptions); + } + else if (serverType === common_1.ServerType.RSPrimary) { + const result = updateRsFromPrimary(serverDescriptions, serverDescription, setName, maxSetVersion, maxElectionId); + topologyType = result[0]; + setName = result[1]; + maxSetVersion = result[2]; + maxElectionId = result[3]; + } + else if (NON_PRIMARY_RS_MEMBERS.has(serverType)) { + topologyType = updateRsWithPrimaryFromMember(serverDescriptions, serverDescription, setName); + } + else { + topologyType = checkHasPrimary(serverDescriptions); + } + } + return new TopologyDescription(topologyType, serverDescriptions, setName, maxSetVersion, maxElectionId, commonWireVersion, { heartbeatFrequencyMS: this.heartbeatFrequencyMS, localThresholdMS: this.localThresholdMS }); + } + get error() { + const descriptionsWithError = Array.from(this.servers.values()).filter((sd) => sd.error); + if (descriptionsWithError.length > 0) { + return descriptionsWithError[0].error; + } + } + /** + * Determines if the topology description has any known servers + */ + get hasKnownServers() { + return Array.from(this.servers.values()).some((sd) => sd.type !== common_1.ServerType.Unknown); + } + /** + * Determines if this topology description has a data-bearing server available. 
+ */ + get hasDataBearingServers() { + return Array.from(this.servers.values()).some((sd) => sd.isDataBearing); + } + /** + * Determines if the topology has a definition for the provided address + * @internal + */ + hasServer(address) { + return this.servers.has(address); + } +} +exports.TopologyDescription = TopologyDescription; +function topologyTypeForServerType(serverType) { + switch (serverType) { + case common_1.ServerType.Standalone: + return common_1.TopologyType.Single; + case common_1.ServerType.Mongos: + return common_1.TopologyType.Sharded; + case common_1.ServerType.RSPrimary: + return common_1.TopologyType.ReplicaSetWithPrimary; + case common_1.ServerType.RSOther: + case common_1.ServerType.RSSecondary: + return common_1.TopologyType.ReplicaSetNoPrimary; + default: + return common_1.TopologyType.Unknown; + } +} +// TODO: improve these docs when ObjectId is properly typed +function compareObjectId(oid1, oid2) { + if (oid1 == null) { + return -1; + } + if (oid2 == null) { + return 1; + } + if (oid1.id instanceof Buffer && oid2.id instanceof Buffer) { + const oid1Buffer = oid1.id; + const oid2Buffer = oid2.id; + return oid1Buffer.compare(oid2Buffer); + } + const oid1String = oid1.toString(); + const oid2String = oid2.toString(); + return oid1String.localeCompare(oid2String); +} +function updateRsFromPrimary(serverDescriptions, serverDescription, setName, maxSetVersion, maxElectionId) { + setName = setName || serverDescription.setName; + if (setName !== serverDescription.setName) { + serverDescriptions.delete(serverDescription.address); + return [checkHasPrimary(serverDescriptions), setName, maxSetVersion, maxElectionId]; + } + const electionId = serverDescription.electionId ? serverDescription.electionId : null; + if (serverDescription.setVersion && electionId) { + if (maxSetVersion && maxElectionId) { + if (maxSetVersion > serverDescription.setVersion || + compareObjectId(maxElectionId, electionId) > 0) { + // this primary is stale, we must remove it + serverDescriptions.set(serverDescription.address, new server_description_1.ServerDescription(serverDescription.address)); + return [checkHasPrimary(serverDescriptions), setName, maxSetVersion, maxElectionId]; + } + } + maxElectionId = serverDescription.electionId; + } + if (serverDescription.setVersion != null && + (maxSetVersion == null || serverDescription.setVersion > maxSetVersion)) { + maxSetVersion = serverDescription.setVersion; + } + // We've heard from the primary. Is it the same primary as before? + for (const [address, server] of serverDescriptions) { + if (server.type === common_1.ServerType.RSPrimary && server.address !== serverDescription.address) { + // Reset old primary's type to Unknown. + serverDescriptions.set(address, new server_description_1.ServerDescription(server.address)); + // There can only be one primary + break; + } + } + // Discover new hosts from this primary's response. + serverDescription.allHosts.forEach((address) => { + if (!serverDescriptions.has(address)) { + serverDescriptions.set(address, new server_description_1.ServerDescription(address)); + } + }); + // Remove hosts not in the response. 
+ const currentAddresses = Array.from(serverDescriptions.keys()); + const responseAddresses = serverDescription.allHosts; + currentAddresses + .filter((addr) => responseAddresses.indexOf(addr) === -1) + .forEach((address) => { + serverDescriptions.delete(address); + }); + return [checkHasPrimary(serverDescriptions), setName, maxSetVersion, maxElectionId]; +} +function updateRsWithPrimaryFromMember(serverDescriptions, serverDescription, setName) { + if (setName == null) { + // TODO(NODE-3483): should be an appropriate runtime error + throw new error_1.MongoRuntimeError('Argument "setName" is required if connected to a replica set'); + } + if (setName !== serverDescription.setName || + (serverDescription.me && serverDescription.address !== serverDescription.me)) { + serverDescriptions.delete(serverDescription.address); + } + return checkHasPrimary(serverDescriptions); +} +function updateRsNoPrimaryFromMember(serverDescriptions, serverDescription, setName) { + const topologyType = common_1.TopologyType.ReplicaSetNoPrimary; + setName = setName || serverDescription.setName; + if (setName !== serverDescription.setName) { + serverDescriptions.delete(serverDescription.address); + return [topologyType, setName]; + } + serverDescription.allHosts.forEach((address) => { + if (!serverDescriptions.has(address)) { + serverDescriptions.set(address, new server_description_1.ServerDescription(address)); + } + }); + if (serverDescription.me && serverDescription.address !== serverDescription.me) { + serverDescriptions.delete(serverDescription.address); + } + return [topologyType, setName]; +} +function checkHasPrimary(serverDescriptions) { + for (const serverDescription of serverDescriptions.values()) { + if (serverDescription.type === common_1.ServerType.RSPrimary) { + return common_1.TopologyType.ReplicaSetWithPrimary; + } + } + return common_1.TopologyType.ReplicaSetNoPrimary; +} +//# sourceMappingURL=topology_description.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sdam/topology_description.js.map b/node_modules/mongodb/lib/sdam/topology_description.js.map new file mode 100644 index 000000000..48c164fb8 --- /dev/null +++ b/node_modules/mongodb/lib/sdam/topology_description.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"topology_description.js","sourceRoot":"","sources":["../../src/sdam/topology_description.ts"],"names":[],"mappings":";;;AAAA,6DAAyD;AACzD,kEAAkE;AAClE,qCAAoD;AAGpD,oCAAyD;AAEzD,4CAA4C;AAC5C,MAAM,4BAA4B,GAAG,cAAc,CAAC,4BAA4B,CAAC;AACjF,MAAM,4BAA4B,GAAG,cAAc,CAAC,4BAA4B,CAAC;AACjF,MAAM,0BAA0B,GAAG,cAAc,CAAC,0BAA0B,CAAC;AAC7E,MAAM,0BAA0B,GAAG,cAAc,CAAC,0BAA0B,CAAC;AAE7E,MAAM,iBAAiB,GAAG,IAAI,GAAG,CAAa,CAAC,mBAAU,CAAC,MAAM,EAAE,mBAAU,CAAC,OAAO,CAAC,CAAC,CAAC;AACvF,MAAM,oBAAoB,GAAG,IAAI,GAAG,CAAa,CAAC,mBAAU,CAAC,MAAM,EAAE,mBAAU,CAAC,UAAU,CAAC,CAAC,CAAC;AAC7F,MAAM,sBAAsB,GAAG,IAAI,GAAG,CAAa;IACjD,mBAAU,CAAC,WAAW;IACtB,mBAAU,CAAC,SAAS;IACpB,mBAAU,CAAC,OAAO;CACnB,CAAC,CAAC;AAQH;;;GAGG;AACH,MAAa,mBAAmB;IAc9B;;OAEG;IACH,YACE,YAA0B,EAC1B,kBAAmD,EACnD,OAAgB,EAChB,aAAsB,EACtB,aAAwB,EACxB,iBAA0B,EAC1B,OAAoC;;QAEpC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,2EAA2E;QAC3E,4EAA4E;QAC5E,sBAAsB;QACtB,IAAI,CAAC,IAAI,GAAG,YAAY,aAAZ,YAAY,cAAZ,YAAY,GAAI,qBAAY,CAAC,OAAO,CAAC;QACjD,IAAI,CAAC,OAAO,GAAG,kBAAkB,aAAlB,kBAAkB,cAAlB,kBAAkB,GAAI,IAAI,GAAG,EAAE,CAAC;QAC/C,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;QACnB,IAAI,CAAC,UAAU,GAAG,IAAI,CAAC;QACvB,IAAI,CAAC,oBAAoB,GAAG,MAAA,OAAO,CAAC,oBAAoB,mCAAI,CAAC,CAAC;QAC9D,IAAI,CAAC,gBAAgB,GAAG,MAAA,OAAO,CAAC,gBAAgB,mCAAI,CAAC,CAAC;QAEtD,IAAI,OAAO,EAAE;YACX,IAAI,CAAC,OAAO,GAAG,OAAO,CAAC;SACxB;QAED,IAAI,aAAa,EAAE;YACjB,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;SACpC;QAED,IAAI,aAAa,EAAE;YACjB,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;SACpC;QAED,IAAI,iBAAiB,EAAE;YACrB,IAAI,CAAC,iBAAiB,GAAG,iBAAiB,CAAC;SAC5C;QAED,iCAAiC;QACjC,KAAK,MAAM,iBAAiB,IAAI,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE,EAAE;YACrD,2CAA2C;YAC3C,IACE,iBAAiB,CAAC,IAAI,KAAK,mBAAU,CAAC,OAAO;gBAC7C,iBAAiB,CAAC,IAAI,KAAK,mBAAU,CAAC,YAAY,EAClD;gBACA,SAAS;aACV;YAED,IAAI,iBAAiB,CAAC,cAAc,GAAG,0BAA0B,EAAE;gBACjE,IAAI,CAAC,UAAU,GAAG,KAAK,CAAC;gBACxB,IAAI,CAAC,kBAAkB,GAAG,aAAa,iBAAiB,CAAC,OAAO,0BAA0B,iBAAiB,CAAC,cAAc,wDAAwD,0BAA0B,aAAa,4BAA4B,GAAG,CAAC;aAC1P;YAED,IAAI,iBAAiB,CAAC,cAAc,GAAG,0BAA0B,EAAE;gBACjE,IAAI,CAAC,UAAU,GAAG,KAAK,CAAC;gBACxB,IAAI,CAAC,kBAAkB,GAAG,aAAa,iBAAiB,CAAC,OAAO,yBAAyB,iBAAiB,CAAC,cAAc,sDAAsD,0BAA0B,aAAa,4BAA4B,IAAI,CAAC;gBACvP,MAAM;aACP;SACF;QAED,2FAA2F;QAC3F,gGAAgG;QAChG,sFAAsF;QACtF,8FAA8F;QAC9F,eAAe;QACf,IAAI,CAAC,4BAA4B,GAAG,SAAS,CAAC;QAC9C,KAAK,MAAM,CAAC,EAAE,MAAM,CAAC,IAAI,IAAI,CAAC,OAAO,EAAE;YACrC,IAAI,MAAM,CAAC,UAAU,EAAE;gBACrB,IAAI,MAAM,CAAC,4BAA4B,IAAI,IAAI,EAAE;oBAC/C,yFAAyF;oBACzF,IAAI,CAAC,4BAA4B,GAAG,SAAS,CAAC;oBAC9C,MAAM;iBACP;gBAED,IAAI,IAAI,CAAC,4BAA4B,IAAI,IAAI,EAAE;oBAC7C,sDAAsD;oBACtD,IAAI,CAAC,4BAA4B,GAAG,MAAM,CAAC,4BAA4B,CAAC;oBACxE,SAAS;iBACV;gBAED,oCAAoC;gBACpC,kFAAkF;gBAClF,IAAI,CAAC,4BAA4B,GAAG,IAAI,CAAC,GAAG,CAC1C,IAAI,CAAC,4BAA4B,EACjC,MAAM,CAAC,4BAA4B,CACpC,CAAC;aACH;SACF;IACH,CAAC;IAED;;;OAGG;IACH,yBAAyB,CAAC,EAAmB;QAC3C,MAAM,YAAY,GAAG,EAAE,CAAC,SAAS,EAAE,CAAC;QACpC,MAAM,kBAAkB,GAAG,IAAI,GAAG,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QACjD,KAAK,MAAM,OAAO,IAAI,IAAI,CAAC,OAAO,CAAC,IAAI,EAAE,EAAE;YACzC,IAAI,YAAY,CAAC,GAAG,CAAC,OAAO,CAAC,EAAE;gBAC7B,YAAY,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aAC9B;iBAAM;gBACL,kBAAkB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aACpC;SACF;QAED,IAAI,kBAAkB,CAAC,IAAI,KAAK,IAAI,CAAC,OAAO,CAAC,IAAI,IAAI,YAAY,CAAC,IAAI,KAAK,CAAC,EAAE;YAC5E,OAAO,IAAI,CAAC;SACb;QAED,KAAK,MAAM,CAAC,OAAO,EAAE,IAAI,CAAC,IAAI,YAAY,EAAE;YAC1C,kBAAkB,CAAC,GAAG,CAAC,OAAO,EAAE,IAAI,sCAAiB,CAAC,IAAI,CAAC,CAAC,CAAC;SAC9D;QAED,OAAO,IAAI,mBAAmB,CAC5B,IAAI,CAAC,IAAI,EACT,kBAAkB,EAClB,IAAI,CAAC,OAAO,EACZ,IAAI,CAAC,aAAa,EAClB,IAAI,CAAC,aAAa,EAClB,IAAI,CAAC,iBAAiB,EACtB,EAAE,oBAAoB,EAAE,IAAI,CAAC,oBAAoB,EAAE,gBAAgB,EAAE,IAAI,CA
AC,gBAAgB,EAAE,CAC7F,CAAC;IACJ,CAAC;IAED;;;OAGG;IACH,MAAM,CAAC,iBAAoC;QACzC,MAAM,OAAO,GAAG,iBAAiB,CAAC,OAAO,CAAC;QAE1C,6BAA6B;QAC7B,IAAI,EAAE,IAAI,EAAE,YAAY,EAAE,OAAO,EAAE,aAAa,EAAE,aAAa,EAAE,iBAAiB,EAAE,GAAG,IAAI,CAAC;QAE5F,IAAI,iBAAiB,CAAC,OAAO,IAAI,OAAO,IAAI,iBAAiB,CAAC,OAAO,KAAK,OAAO,EAAE;YACjF,iBAAiB,GAAG,IAAI,sCAAiB,CAAC,OAAO,EAAE,SAAS,CAAC,CAAC;SAC/D;QAED,MAAM,UAAU,GAAG,iBAAiB,CAAC,IAAI,CAAC;QAC1C,MAAM,kBAAkB,GAAG,IAAI,GAAG,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QAEjD,6BAA6B;QAC7B,IAAI,iBAAiB,CAAC,cAAc,KAAK,CAAC,EAAE;YAC1C,IAAI,iBAAiB,IAAI,IAAI,EAAE;gBAC7B,iBAAiB,GAAG,iBAAiB,CAAC,cAAc,CAAC;aACtD;iBAAM;gBACL,iBAAiB,GAAG,IAAI,CAAC,GAAG,CAAC,iBAAiB,EAAE,iBAAiB,CAAC,cAAc,CAAC,CAAC;aACnF;SACF;QAED,uCAAuC;QACvC,kBAAkB,CAAC,GAAG,CAAC,OAAO,EAAE,iBAAiB,CAAC,CAAC;QAEnD,IAAI,YAAY,KAAK,qBAAY,CAAC,MAAM,EAAE;YACxC,oDAAoD;YACpD,OAAO,IAAI,mBAAmB,CAC5B,qBAAY,CAAC,MAAM,EACnB,kBAAkB,EAClB,OAAO,EACP,aAAa,EACb,aAAa,EACb,iBAAiB,EACjB,EAAE,oBAAoB,EAAE,IAAI,CAAC,oBAAoB,EAAE,gBAAgB,EAAE,IAAI,CAAC,gBAAgB,EAAE,CAC7F,CAAC;SACH;QAED,IAAI,YAAY,KAAK,qBAAY,CAAC,OAAO,EAAE;YACzC,IAAI,UAAU,KAAK,mBAAU,CAAC,UAAU,IAAI,IAAI,CAAC,OAAO,CAAC,IAAI,KAAK,CAAC,EAAE;gBACnE,kBAAkB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aACpC;iBAAM;gBACL,YAAY,GAAG,yBAAyB,CAAC,UAAU,CAAC,CAAC;aACtD;SACF;QAED,IAAI,YAAY,KAAK,qBAAY,CAAC,OAAO,EAAE;YACzC,IAAI,CAAC,iBAAiB,CAAC,GAAG,CAAC,UAAU,CAAC,EAAE;gBACtC,kBAAkB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aACpC;SACF;QAED,IAAI,YAAY,KAAK,qBAAY,CAAC,mBAAmB,EAAE;YACrD,IAAI,oBAAoB,CAAC,GAAG,CAAC,UAAU,CAAC,EAAE;gBACxC,kBAAkB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;aACpC;YAED,IAAI,UAAU,KAAK,mBAAU,CAAC,SAAS,EAAE;gBACvC,MAAM,MAAM,GAAG,mBAAmB,CAChC,kBAAkB,EAClB,iBAAiB,EACjB,OAAO,EACP,aAAa,EACb,aAAa,CACd,CAAC;gBAEF,YAAY,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBACzB,OAAO,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBACpB,aAAa,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBAC1B,aAAa,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;aAC3B;iBAAM,IAAI,sBAAsB,CAAC,GAAG,CAAC,UAAU,CAAC,EAAE;gBACjD,MAAM,MAAM,GAAG,2BAA2B,CAAC,kBAAkB,EAAE,iBAAiB,EAAE,OAAO,CAAC,CAAC;gBAC3F,YAAY,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBACzB,OAAO,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;aACrB;SACF;QAED,IAAI,YAAY,KAAK,qBAAY,CAAC,qBAAqB,EAAE;YACvD,IAAI,oBAAoB,CAAC,GAAG,CAAC,UAAU,CAAC,EAAE;gBACxC,kBAAkB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;gBACnC,YAAY,GAAG,eAAe,CAAC,kBAAkB,CAAC,CAAC;aACpD;iBAAM,IAAI,UAAU,KAAK,mBAAU,CAAC,SAAS,EAAE;gBAC9C,MAAM,MAAM,GAAG,mBAAmB,CAChC,kBAAkB,EAClB,iBAAiB,EACjB,OAAO,EACP,aAAa,EACb,aAAa,CACd,CAAC;gBAEF,YAAY,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBACzB,OAAO,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBACpB,aAAa,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;gBAC1B,aAAa,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;aAC3B;iBAAM,IAAI,sBAAsB,CAAC,GAAG,CAAC,UAAU,CAAC,EAAE;gBACjD,YAAY,GAAG,6BAA6B,CAC1C,kBAAkB,EAClB,iBAAiB,EACjB,OAAO,CACR,CAAC;aACH;iBAAM;gBACL,YAAY,GAAG,eAAe,CAAC,kBAAkB,CAAC,CAAC;aACpD;SACF;QAED,OAAO,IAAI,mBAAmB,CAC5B,YAAY,EACZ,kBAAkB,EAClB,OAAO,EACP,aAAa,EACb,aAAa,EACb,iBAAiB,EACjB,EAAE,oBAAoB,EAAE,IAAI,CAAC,oBAAoB,EAAE,gBAAgB,EAAE,IAAI,CAAC,gBAAgB,EAAE,CAC7F,CAAC;IACJ,CAAC;IAED,IAAI,KAAK;QACP,MAAM,qBAAqB,GAAG,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC,MAAM,CACpE,CAAC,EAAqB,EAAE,EAAE,CAAC,EAAE,CAAC,KAAK,CACpC,CAAC;QAEF,IAAI,qBAAqB,CAAC,MAAM,GAAG,CAAC,EAAE;YACpC,OAAO,qBAAqB,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC;SACvC;IACH,CAAC;IAED;;OAEG;IACH,IAAI,eAAe;QACjB,OAAO,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC,IAAI,CAC3C,CAAC,EAAqB,EAAE,EAAE,CAAC,EAAE,CAAC,IAAI,KAAK,mBAAU,CAAC,OAAO,CAC1D,CAAC;IACJ,CAAC;IAED;;OAEG;IACH,IAAI,qBAAqB;QACvB,OAAO,KAAK,CAAC,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC,IAAI,CAAC,CAAC,EAAqB,EAAE,EAAE,CAAC,EAAE,CAAC,aAAa,CAAC,CAAC;IAC7F,CA
AC;IAED;;;OAGG;IACH,SAAS,CAAC,OAAe;QACvB,OAAO,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;IACnC,CAAC;CACF;AArSD,kDAqSC;AAED,SAAS,yBAAyB,CAAC,UAAsB;IACvD,QAAQ,UAAU,EAAE;QAClB,KAAK,mBAAU,CAAC,UAAU;YACxB,OAAO,qBAAY,CAAC,MAAM,CAAC;QAC7B,KAAK,mBAAU,CAAC,MAAM;YACpB,OAAO,qBAAY,CAAC,OAAO,CAAC;QAC9B,KAAK,mBAAU,CAAC,SAAS;YACvB,OAAO,qBAAY,CAAC,qBAAqB,CAAC;QAC5C,KAAK,mBAAU,CAAC,OAAO,CAAC;QACxB,KAAK,mBAAU,CAAC,WAAW;YACzB,OAAO,qBAAY,CAAC,mBAAmB,CAAC;QAC1C;YACE,OAAO,qBAAY,CAAC,OAAO,CAAC;KAC/B;AACH,CAAC;AAED,2DAA2D;AAC3D,SAAS,eAAe,CAAC,IAAc,EAAE,IAAc;IACrD,IAAI,IAAI,IAAI,IAAI,EAAE;QAChB,OAAO,CAAC,CAAC,CAAC;KACX;IAED,IAAI,IAAI,IAAI,IAAI,EAAE;QAChB,OAAO,CAAC,CAAC;KACV;IAED,IAAI,IAAI,CAAC,EAAE,YAAY,MAAM,IAAI,IAAI,CAAC,EAAE,YAAY,MAAM,EAAE;QAC1D,MAAM,UAAU,GAAG,IAAI,CAAC,EAAE,CAAC;QAC3B,MAAM,UAAU,GAAG,IAAI,CAAC,EAAE,CAAC;QAC3B,OAAO,UAAU,CAAC,OAAO,CAAC,UAAU,CAAC,CAAC;KACvC;IAED,MAAM,UAAU,GAAG,IAAI,CAAC,QAAQ,EAAE,CAAC;IACnC,MAAM,UAAU,GAAG,IAAI,CAAC,QAAQ,EAAE,CAAC;IACnC,OAAO,UAAU,CAAC,aAAa,CAAC,UAAU,CAAC,CAAC;AAC9C,CAAC;AAED,SAAS,mBAAmB,CAC1B,kBAAkD,EAClD,iBAAoC,EACpC,OAAgB,EAChB,aAAsB,EACtB,aAAwB;IAExB,OAAO,GAAG,OAAO,IAAI,iBAAiB,CAAC,OAAO,CAAC;IAC/C,IAAI,OAAO,KAAK,iBAAiB,CAAC,OAAO,EAAE;QACzC,kBAAkB,CAAC,MAAM,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;QACrD,OAAO,CAAC,eAAe,CAAC,kBAAkB,CAAC,EAAE,OAAO,EAAE,aAAa,EAAE,aAAa,CAAC,CAAC;KACrF;IAED,MAAM,UAAU,GAAG,iBAAiB,CAAC,UAAU,CAAC,CAAC,CAAC,iBAAiB,CAAC,UAAU,CAAC,CAAC,CAAC,IAAI,CAAC;IACtF,IAAI,iBAAiB,CAAC,UAAU,IAAI,UAAU,EAAE;QAC9C,IAAI,aAAa,IAAI,aAAa,EAAE;YAClC,IACE,aAAa,GAAG,iBAAiB,CAAC,UAAU;gBAC5C,eAAe,CAAC,aAAa,EAAE,UAAU,CAAC,GAAG,CAAC,EAC9C;gBACA,2CAA2C;gBAC3C,kBAAkB,CAAC,GAAG,CACpB,iBAAiB,CAAC,OAAO,EACzB,IAAI,sCAAiB,CAAC,iBAAiB,CAAC,OAAO,CAAC,CACjD,CAAC;gBAEF,OAAO,CAAC,eAAe,CAAC,kBAAkB,CAAC,EAAE,OAAO,EAAE,aAAa,EAAE,aAAa,CAAC,CAAC;aACrF;SACF;QAED,aAAa,GAAG,iBAAiB,CAAC,UAAU,CAAC;KAC9C;IAED,IACE,iBAAiB,CAAC,UAAU,IAAI,IAAI;QACpC,CAAC,aAAa,IAAI,IAAI,IAAI,iBAAiB,CAAC,UAAU,GAAG,aAAa,CAAC,EACvE;QACA,aAAa,GAAG,iBAAiB,CAAC,UAAU,CAAC;KAC9C;IAED,kEAAkE;IAClE,KAAK,MAAM,CAAC,OAAO,EAAE,MAAM,CAAC,IAAI,kBAAkB,EAAE;QAClD,IAAI,MAAM,CAAC,IAAI,KAAK,mBAAU,CAAC,SAAS,IAAI,MAAM,CAAC,OAAO,KAAK,iBAAiB,CAAC,OAAO,EAAE;YACxF,uCAAuC;YACvC,kBAAkB,CAAC,GAAG,CAAC,OAAO,EAAE,IAAI,sCAAiB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC;YAEvE,gCAAgC;YAChC,MAAM;SACP;KACF;IAED,mDAAmD;IACnD,iBAAiB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,OAAe,EAAE,EAAE;QACrD,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,OAAO,CAAC,EAAE;YACpC,kBAAkB,CAAC,GAAG,CAAC,OAAO,EAAE,IAAI,sCAAiB,CAAC,OAAO,CAAC,CAAC,CAAC;SACjE;IACH,CAAC,CAAC,CAAC;IAEH,oCAAoC;IACpC,MAAM,gBAAgB,GAAG,KAAK,CAAC,IAAI,CAAC,kBAAkB,CAAC,IAAI,EAAE,CAAC,CAAC;IAC/D,MAAM,iBAAiB,GAAG,iBAAiB,CAAC,QAAQ,CAAC;IACrD,gBAAgB;SACb,MAAM,CAAC,CAAC,IAAY,EAAE,EAAE,CAAC,iBAAiB,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC;SAChE,OAAO,CAAC,CAAC,OAAe,EAAE,EAAE;QAC3B,kBAAkB,CAAC,MAAM,CAAC,OAAO,CAAC,CAAC;IACrC,CAAC,CAAC,CAAC;IAEL,OAAO,CAAC,eAAe,CAAC,kBAAkB,CAAC,EAAE,OAAO,EAAE,aAAa,EAAE,aAAa,CAAC,CAAC;AACtF,CAAC;AAED,SAAS,6BAA6B,CACpC,kBAAkD,EAClD,iBAAoC,EACpC,OAAgB;IAEhB,IAAI,OAAO,IAAI,IAAI,EAAE;QACnB,0DAA0D;QAC1D,MAAM,IAAI,yBAAiB,CAAC,8DAA8D,CAAC,CAAC;KAC7F;IAED,IACE,OAAO,KAAK,iBAAiB,CAAC,OAAO;QACrC,CAAC,iBAAiB,CAAC,EAAE,IAAI,iBAAiB,CAAC,OAAO,KAAK,iBAAiB,CAAC,EAAE,CAAC,EAC5E;QACA,kBAAkB,CAAC,MAAM,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;KACtD;IAED,OAAO,eAAe,CAAC,kBAAkB,CAAC,CAAC;AAC7C,CAAC;AAED,SAAS,2BAA2B,CAClC,kBAAkD,EAClD,iBAAoC,EACpC,OAAgB;IAEhB,MAAM,YAAY,GAAG,qBAAY,CAAC,mBAAmB,CAAC;IACtD,OAAO,GAAG,OAAO,IAAI,iBAAiB,CAAC,OAAO,CAAC;IAC/C,IAAI,OAAO,KAAK,iBAAiB,CAAC,OAAO,EAAE;QACzC,kBAAkB,CAAC,MAAM,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;QACrD,OAAO
,CAAC,YAAY,EAAE,OAAO,CAAC,CAAC;KAChC;IAED,iBAAiB,CAAC,QAAQ,CAAC,OAAO,CAAC,CAAC,OAAe,EAAE,EAAE;QACrD,IAAI,CAAC,kBAAkB,CAAC,GAAG,CAAC,OAAO,CAAC,EAAE;YACpC,kBAAkB,CAAC,GAAG,CAAC,OAAO,EAAE,IAAI,sCAAiB,CAAC,OAAO,CAAC,CAAC,CAAC;SACjE;IACH,CAAC,CAAC,CAAC;IAEH,IAAI,iBAAiB,CAAC,EAAE,IAAI,iBAAiB,CAAC,OAAO,KAAK,iBAAiB,CAAC,EAAE,EAAE;QAC9E,kBAAkB,CAAC,MAAM,CAAC,iBAAiB,CAAC,OAAO,CAAC,CAAC;KACtD;IAED,OAAO,CAAC,YAAY,EAAE,OAAO,CAAC,CAAC;AACjC,CAAC;AAED,SAAS,eAAe,CAAC,kBAAkD;IACzE,KAAK,MAAM,iBAAiB,IAAI,kBAAkB,CAAC,MAAM,EAAE,EAAE;QAC3D,IAAI,iBAAiB,CAAC,IAAI,KAAK,mBAAU,CAAC,SAAS,EAAE;YACnD,OAAO,qBAAY,CAAC,qBAAqB,CAAC;SAC3C;KACF;IAED,OAAO,qBAAY,CAAC,mBAAmB,CAAC;AAC1C,CAAC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sessions.js b/node_modules/mongodb/lib/sessions.js new file mode 100644 index 000000000..982b49b1d --- /dev/null +++ b/node_modules/mongodb/lib/sessions.js @@ -0,0 +1,726 @@ +"use strict"; +var _a; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.updateSessionFromResponse = exports.applySession = exports.commandSupportsReadConcern = exports.ServerSessionPool = exports.ServerSession = exports.maybeClearPinnedConnection = exports.ClientSession = void 0; +const promise_provider_1 = require("./promise_provider"); +const bson_1 = require("./bson"); +const read_preference_1 = require("./read_preference"); +const transactions_1 = require("./transactions"); +const common_1 = require("./sdam/common"); +const shared_1 = require("./cmap/wire_protocol/shared"); +const error_1 = require("./error"); +const utils_1 = require("./utils"); +const execute_operation_1 = require("./operations/execute_operation"); +const run_command_1 = require("./operations/run_command"); +const connection_1 = require("./cmap/connection"); +const metrics_1 = require("./cmap/metrics"); +const mongo_types_1 = require("./mongo_types"); +const read_concern_1 = require("./read_concern"); +const minWireVersionForShardedTransactions = 8; +function assertAlive(session, callback) { + if (session.serverSession == null) { + const error = new error_1.MongoExpiredSessionError(); + if (typeof callback === 'function') { + callback(error); + return false; + } + throw error; + } + return true; +} +/** @internal */ +const kServerSession = Symbol('serverSession'); +/** @internal */ +const kSnapshotTime = Symbol('snapshotTime'); +/** @internal */ +const kSnapshotEnabled = Symbol('snapshotEnabled'); +/** @internal */ +const kPinnedConnection = Symbol('pinnedConnection'); +/** + * A class representing a client session on the server + * + * NOTE: not meant to be instantiated directly. + * @public + */ +class ClientSession extends mongo_types_1.TypedEventEmitter { + /** + * Create a client session. + * @internal + * @param topology - The current client's topology (Internal Class) + * @param sessionPool - The server session pool (Internal Class) + * @param options - Optional settings + * @param clientOptions - Optional settings provided when creating a MongoClient + */ + constructor(topology, sessionPool, options, clientOptions) { + super(); + /** @internal */ + this[_a] = false; + if (topology == null) { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('ClientSession requires a topology'); + } + if (sessionPool == null || !(sessionPool instanceof ServerSessionPool)) { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('ClientSession requires a ServerSessionPool'); + } + options = options !== null && options !== void 0 ? 
options : {}; + if (options.snapshot === true) { + this[kSnapshotEnabled] = true; + if (options.causalConsistency === true) { + throw new error_1.MongoInvalidArgumentError('Properties "causalConsistency" and "snapshot" are mutually exclusive'); + } + } + this.topology = topology; + this.sessionPool = sessionPool; + this.hasEnded = false; + this.clientOptions = clientOptions; + this[kServerSession] = undefined; + this.supports = { + causalConsistency: options.snapshot !== true && options.causalConsistency !== false + }; + this.clusterTime = options.initialClusterTime; + this.operationTime = undefined; + this.explicit = !!options.explicit; + this.owner = options.owner; + this.defaultTransactionOptions = Object.assign({}, options.defaultTransactionOptions); + this.transaction = new transactions_1.Transaction(); + } + /** The server id associated with this session */ + get id() { + var _b; + return (_b = this.serverSession) === null || _b === void 0 ? void 0 : _b.id; + } + get serverSession() { + if (this[kServerSession] == null) { + this[kServerSession] = this.sessionPool.acquire(); + } + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + return this[kServerSession]; + } + /** Whether or not this session is configured for snapshot reads */ + get snapshotEnabled() { + return this[kSnapshotEnabled]; + } + get loadBalanced() { + return this.topology.description.type === common_1.TopologyType.LoadBalanced; + } + /** @internal */ + get pinnedConnection() { + return this[kPinnedConnection]; + } + /** @internal */ + pin(conn) { + if (this[kPinnedConnection]) { + throw TypeError('Cannot pin multiple connections to the same session'); + } + this[kPinnedConnection] = conn; + conn.emit(connection_1.Connection.PINNED, this.inTransaction() ? metrics_1.ConnectionPoolMetrics.TXN : metrics_1.ConnectionPoolMetrics.CURSOR); + } + /** @internal */ + unpin(options) { + if (this.loadBalanced) { + return maybeClearPinnedConnection(this, options); + } + this.transaction.unpinServer(); + } + get isPinned() { + return this.loadBalanced ? !!this[kPinnedConnection] : this.transaction.isPinned; + } + endSession(options, callback) { + if (typeof options === 'function') + (callback = options), (options = {}); + const finalOptions = { force: true, ...options }; + return utils_1.maybePromise(callback, done => { + if (this.hasEnded) { + maybeClearPinnedConnection(this, finalOptions); + return done(); + } + const completeEndSession = () => { + maybeClearPinnedConnection(this, finalOptions); + // release the server session back to the pool + this.sessionPool.release(this.serverSession); + this[kServerSession] = undefined; + // mark the session as ended, and emit a signal + this.hasEnded = true; + this.emit('ended', this); + // spec indicates that we should ignore all errors for `endSessions` + done(); + }; + if (this.serverSession && this.inTransaction()) { + this.abortTransaction(err => { + if (err) + return done(err); + completeEndSession(); + }); + return; + } + completeEndSession(); + }); + } + /** + * Advances the operationTime for a ClientSession. 
+ * + * @param operationTime - the `BSON.Timestamp` of the operation type it is desired to advance to + */ + advanceOperationTime(operationTime) { + if (this.operationTime == null) { + this.operationTime = operationTime; + return; + } + if (operationTime.greaterThan(this.operationTime)) { + this.operationTime = operationTime; + } + } + /** + * Advances the clusterTime for a ClientSession to the provided clusterTime of another ClientSession + * + * @param clusterTime - the $clusterTime returned by the server from another session in the form of a document containing the `BSON.Timestamp` clusterTime and signature + */ + advanceClusterTime(clusterTime) { + var _b, _c; + if (!clusterTime || typeof clusterTime !== 'object') { + throw new error_1.MongoInvalidArgumentError('input cluster time must be an object'); + } + if (!clusterTime.clusterTime || clusterTime.clusterTime._bsontype !== 'Timestamp') { + throw new error_1.MongoInvalidArgumentError('input cluster time "clusterTime" property must be a valid BSON Timestamp'); + } + if (!clusterTime.signature || + ((_b = clusterTime.signature.hash) === null || _b === void 0 ? void 0 : _b._bsontype) !== 'Binary' || + (typeof clusterTime.signature.keyId !== 'number' && + ((_c = clusterTime.signature.keyId) === null || _c === void 0 ? void 0 : _c._bsontype) !== 'Long') // apparently we decode the key to number? + ) { + throw new error_1.MongoInvalidArgumentError('input cluster time must have a valid "signature" property with BSON Binary hash and BSON Long keyId'); + } + common_1._advanceClusterTime(this, clusterTime); + } + /** + * Used to determine if this session equals another + * + * @param session - The session to compare to + */ + equals(session) { + if (!(session instanceof ClientSession)) { + return false; + } + if (this.id == null || session.id == null) { + return false; + } + return this.id.id.buffer.equals(session.id.id.buffer); + } + /** Increment the transaction number on the internal ServerSession */ + incrementTransactionNumber() { + if (this.serverSession) { + this.serverSession.txnNumber = + typeof this.serverSession.txnNumber === 'number' ? this.serverSession.txnNumber + 1 : 0; + } + } + /** @returns whether this session is currently in a transaction or not */ + inTransaction() { + return this.transaction.isActive; + } + /** + * Starts a new transaction with the given options. + * + * @param options - Options for the transaction + */ + startTransaction(options) { + var _b, _c, _d, _e, _f, _g, _h, _j, _k, _l; + if (this[kSnapshotEnabled]) { + throw new error_1.MongoCompatibilityError('Transactions are not allowed with snapshot sessions'); + } + assertAlive(this); + if (this.inTransaction()) { + throw new error_1.MongoTransactionError('Transaction already in progress'); + } + if (this.isPinned && this.transaction.isCommitted) { + this.unpin(); + } + const topologyMaxWireVersion = utils_1.maxWireVersion(this.topology); + if (shared_1.isSharded(this.topology) && + topologyMaxWireVersion != null && + topologyMaxWireVersion < minWireVersionForShardedTransactions) { + throw new error_1.MongoCompatibilityError('Transactions are not supported on sharded clusters in MongoDB < 4.2.'); + } + // increment txnNumber + this.incrementTransactionNumber(); + // create transaction state + this.transaction = new transactions_1.Transaction({ + readConcern: (_c = (_b = options === null || options === void 0 ? void 0 : options.readConcern) !== null && _b !== void 0 ? _b : this.defaultTransactionOptions.readConcern) !== null && _c !== void 0 ? 
_c : (_d = this.clientOptions) === null || _d === void 0 ? void 0 : _d.readConcern, + writeConcern: (_f = (_e = options === null || options === void 0 ? void 0 : options.writeConcern) !== null && _e !== void 0 ? _e : this.defaultTransactionOptions.writeConcern) !== null && _f !== void 0 ? _f : (_g = this.clientOptions) === null || _g === void 0 ? void 0 : _g.writeConcern, + readPreference: (_j = (_h = options === null || options === void 0 ? void 0 : options.readPreference) !== null && _h !== void 0 ? _h : this.defaultTransactionOptions.readPreference) !== null && _j !== void 0 ? _j : (_k = this.clientOptions) === null || _k === void 0 ? void 0 : _k.readPreference, + maxCommitTimeMS: (_l = options === null || options === void 0 ? void 0 : options.maxCommitTimeMS) !== null && _l !== void 0 ? _l : this.defaultTransactionOptions.maxCommitTimeMS + }); + this.transaction.transition(transactions_1.TxnState.STARTING_TRANSACTION); + } + commitTransaction(callback) { + return utils_1.maybePromise(callback, cb => endTransaction(this, 'commitTransaction', cb)); + } + abortTransaction(callback) { + return utils_1.maybePromise(callback, cb => endTransaction(this, 'abortTransaction', cb)); + } + /** + * This is here to ensure that ClientSession is never serialized to BSON. + */ + toBSON() { + throw new error_1.MongoRuntimeError('ClientSession cannot be serialized to BSON.'); + } + /** + * Runs a provided lambda within a transaction, retrying either the commit operation + * or entire transaction as needed (and when the error permits) to better ensure that + * the transaction can complete successfully. + * + * IMPORTANT: This method requires the user to return a Promise, all lambdas that do not + * return a Promise will result in undefined behavior. + * + * @param fn - A lambda to run within a transaction + * @param options - Optional settings for the transaction + */ + withTransaction(fn, options) { + const startTime = utils_1.now(); + return attemptTransaction(this, startTime, fn, options); + } +} +exports.ClientSession = ClientSession; +_a = kSnapshotEnabled; +const MAX_WITH_TRANSACTION_TIMEOUT = 120000; +const NON_DETERMINISTIC_WRITE_CONCERN_ERRORS = new Set([ + 'CannotSatisfyWriteConcern', + 'UnknownReplWriteConcern', + 'UnsatisfiableWriteConcern' +]); +function hasNotTimedOut(startTime, max) { + return utils_1.calculateDurationInMs(startTime) < max; +} +function isUnknownTransactionCommitResult(err) { + const isNonDeterministicWriteConcernError = err instanceof error_1.MongoServerError && + err.codeName && + NON_DETERMINISTIC_WRITE_CONCERN_ERRORS.has(err.codeName); + return (isMaxTimeMSExpiredError(err) || + (!isNonDeterministicWriteConcernError && + err.code !== error_1.MONGODB_ERROR_CODES.UnsatisfiableWriteConcern && + err.code !== error_1.MONGODB_ERROR_CODES.UnknownReplWriteConcern)); +} +function maybeClearPinnedConnection(session, options) { + // unpin a connection if it has been pinned + const conn = session[kPinnedConnection]; + const error = options === null || options === void 0 ? void 0 : options.error; + if (session.inTransaction() && + error && + error instanceof error_1.MongoError && + error.hasErrorLabel('TransientTransactionError')) { + return; + } + // NOTE: the spec talks about what to do on a network error only, but the tests seem to + // to validate that we don't unpin on _all_ errors? + if (conn) { + const servers = Array.from(session.topology.s.servers.values()); + const loadBalancer = servers[0]; + if ((options === null || options === void 0 ? 
void 0 : options.error) == null || (options === null || options === void 0 ? void 0 : options.force)) { + loadBalancer.s.pool.checkIn(conn); + conn.emit(connection_1.Connection.UNPINNED, session.transaction.state !== transactions_1.TxnState.NO_TRANSACTION + ? metrics_1.ConnectionPoolMetrics.TXN + : metrics_1.ConnectionPoolMetrics.CURSOR); + if (options === null || options === void 0 ? void 0 : options.forceClear) { + loadBalancer.s.pool.clear(conn.serviceId); + } + } + session[kPinnedConnection] = undefined; + } +} +exports.maybeClearPinnedConnection = maybeClearPinnedConnection; +function isMaxTimeMSExpiredError(err) { + if (err == null || !(err instanceof error_1.MongoServerError)) { + return false; + } + return (err.code === error_1.MONGODB_ERROR_CODES.MaxTimeMSExpired || + (err.writeConcernError && err.writeConcernError.code === error_1.MONGODB_ERROR_CODES.MaxTimeMSExpired)); +} +function attemptTransactionCommit(session, startTime, fn, options) { + return session.commitTransaction().catch((err) => { + if (err instanceof error_1.MongoError && + hasNotTimedOut(startTime, MAX_WITH_TRANSACTION_TIMEOUT) && + !isMaxTimeMSExpiredError(err)) { + if (err.hasErrorLabel('UnknownTransactionCommitResult')) { + return attemptTransactionCommit(session, startTime, fn, options); + } + if (err.hasErrorLabel('TransientTransactionError')) { + return attemptTransaction(session, startTime, fn, options); + } + } + throw err; + }); +} +const USER_EXPLICIT_TXN_END_STATES = new Set([ + transactions_1.TxnState.NO_TRANSACTION, + transactions_1.TxnState.TRANSACTION_COMMITTED, + transactions_1.TxnState.TRANSACTION_ABORTED +]); +function userExplicitlyEndedTransaction(session) { + return USER_EXPLICIT_TXN_END_STATES.has(session.transaction.state); +} +function attemptTransaction(session, startTime, fn, options) { + const Promise = promise_provider_1.PromiseProvider.get(); + session.startTransaction(options); + let promise; + try { + promise = fn(session); + } + catch (err) { + promise = Promise.reject(err); + } + if (!utils_1.isPromiseLike(promise)) { + session.abortTransaction(); + throw new error_1.MongoInvalidArgumentError('Function provided to `withTransaction` must return a Promise'); + } + return promise.then(() => { + if (userExplicitlyEndedTransaction(session)) { + return; + } + return attemptTransactionCommit(session, startTime, fn, options); + }, err => { + function maybeRetryOrThrow(err) { + if (err instanceof error_1.MongoError && + err.hasErrorLabel('TransientTransactionError') && + hasNotTimedOut(startTime, MAX_WITH_TRANSACTION_TIMEOUT)) { + return attemptTransaction(session, startTime, fn, options); + } + if (isMaxTimeMSExpiredError(err)) { + err.addErrorLabel('UnknownTransactionCommitResult'); + } + throw err; + } + if (session.transaction.isActive) { + return session.abortTransaction().then(() => maybeRetryOrThrow(err)); + } + return maybeRetryOrThrow(err); + }); +} +function endTransaction(session, commandName, callback) { + if (!assertAlive(session, callback)) { + // checking result in case callback was called + return; + } + // handle any initial problematic cases + const txnState = session.transaction.state; + if (txnState === transactions_1.TxnState.NO_TRANSACTION) { + callback(new error_1.MongoTransactionError('No transaction started')); + return; + } + if (commandName === 'commitTransaction') { + if (txnState === transactions_1.TxnState.STARTING_TRANSACTION || + txnState === transactions_1.TxnState.TRANSACTION_COMMITTED_EMPTY) { + // the transaction was never started, we can safely exit here 
+ session.transaction.transition(transactions_1.TxnState.TRANSACTION_COMMITTED_EMPTY); + callback(); + return; + } + if (txnState === transactions_1.TxnState.TRANSACTION_ABORTED) { + callback(new error_1.MongoTransactionError('Cannot call commitTransaction after calling abortTransaction')); + return; + } + } + else { + if (txnState === transactions_1.TxnState.STARTING_TRANSACTION) { + // the transaction was never started, we can safely exit here + session.transaction.transition(transactions_1.TxnState.TRANSACTION_ABORTED); + callback(); + return; + } + if (txnState === transactions_1.TxnState.TRANSACTION_ABORTED) { + callback(new error_1.MongoTransactionError('Cannot call abortTransaction twice')); + return; + } + if (txnState === transactions_1.TxnState.TRANSACTION_COMMITTED || + txnState === transactions_1.TxnState.TRANSACTION_COMMITTED_EMPTY) { + callback(new error_1.MongoTransactionError('Cannot call abortTransaction after calling commitTransaction')); + return; + } + } + // construct and send the command + const command = { [commandName]: 1 }; + // apply a writeConcern if specified + let writeConcern; + if (session.transaction.options.writeConcern) { + writeConcern = Object.assign({}, session.transaction.options.writeConcern); + } + else if (session.clientOptions && session.clientOptions.writeConcern) { + writeConcern = { w: session.clientOptions.writeConcern.w }; + } + if (txnState === transactions_1.TxnState.TRANSACTION_COMMITTED) { + writeConcern = Object.assign({ wtimeout: 10000 }, writeConcern, { w: 'majority' }); + } + if (writeConcern) { + Object.assign(command, { writeConcern }); + } + if (commandName === 'commitTransaction' && session.transaction.options.maxTimeMS) { + Object.assign(command, { maxTimeMS: session.transaction.options.maxTimeMS }); + } + function commandHandler(e, r) { + if (commandName !== 'commitTransaction') { + session.transaction.transition(transactions_1.TxnState.TRANSACTION_ABORTED); + if (session.loadBalanced) { + maybeClearPinnedConnection(session, { force: false }); + } + // The spec indicates that we should ignore all errors on `abortTransaction` + return callback(); + } + session.transaction.transition(transactions_1.TxnState.TRANSACTION_COMMITTED); + if (e) { + if (e instanceof error_1.MongoNetworkError || + e instanceof error_1.MongoWriteConcernError || + error_1.isRetryableError(e) || + isMaxTimeMSExpiredError(e)) { + if (isUnknownTransactionCommitResult(e)) { + e.addErrorLabel('UnknownTransactionCommitResult'); + // per txns spec, must unpin session in this case + session.unpin({ error: e }); + } + } + else if (e.hasErrorLabel('TransientTransactionError')) { + session.unpin({ error: e }); + } + } + callback(e, r); + } + // Assumption here that commandName is "commitTransaction" or "abortTransaction" + if (session.transaction.recoveryToken) { + command.recoveryToken = session.transaction.recoveryToken; + } + // send the command + execute_operation_1.executeOperation(session.topology, new run_command_1.RunAdminCommandOperation(undefined, command, { + session, + readPreference: read_preference_1.ReadPreference.primary, + bypassPinningCheck: true + }), (err, reply) => { + if (command.abortTransaction) { + // always unpin on abort regardless of command outcome + session.unpin(); + } + if (err && error_1.isRetryableError(err)) { + // SPEC-1185: apply majority write concern when retrying commitTransaction + if (command.commitTransaction) { + // per txns spec, must unpin session in this case + session.unpin({ force: true }); + command.writeConcern = 
Object.assign({ wtimeout: 10000 }, command.writeConcern, { + w: 'majority' + }); + } + return execute_operation_1.executeOperation(session.topology, new run_command_1.RunAdminCommandOperation(undefined, command, { + session, + readPreference: read_preference_1.ReadPreference.primary, + bypassPinningCheck: true + }), (_err, _reply) => commandHandler(_err, _reply)); + } + commandHandler(err, reply); + }); +} +/** + * Reflects the existence of a session on the server. Can be reused by the session pool. + * WARNING: not meant to be instantiated directly. For internal use only. + * @public + */ +class ServerSession { + /** @internal */ + constructor() { + this.id = { id: new bson_1.Binary(utils_1.uuidV4(), bson_1.Binary.SUBTYPE_UUID) }; + this.lastUse = utils_1.now(); + this.txnNumber = 0; + this.isDirty = false; + } + /** + * Determines if the server session has timed out. + * + * @param sessionTimeoutMinutes - The server's "logicalSessionTimeoutMinutes" + */ + hasTimedOut(sessionTimeoutMinutes) { + // Take the difference of the lastUse timestamp and now, which will result in a value in + // milliseconds, and then convert milliseconds to minutes to compare to `sessionTimeoutMinutes` + const idleTimeMinutes = Math.round(((utils_1.calculateDurationInMs(this.lastUse) % 86400000) % 3600000) / 60000); + return idleTimeMinutes > sessionTimeoutMinutes - 1; + } +} +exports.ServerSession = ServerSession; +/** + * Maintains a pool of Server Sessions. + * For internal use only + * @internal + */ +class ServerSessionPool { + constructor(topology) { + if (topology == null) { + throw new error_1.MongoRuntimeError('ServerSessionPool requires a topology'); + } + this.topology = topology; + this.sessions = []; + } + /** Ends all sessions in the session pool */ + endAllPooledSessions(callback) { + if (this.sessions.length) { + this.topology.endSessions(this.sessions.map((session) => session.id), () => { + this.sessions = []; + if (typeof callback === 'function') { + callback(); + } + }); + return; + } + if (typeof callback === 'function') { + callback(); + } + } + /** + * Acquire a Server Session from the pool. + * Iterates through each session in the pool, removing any stale sessions + * along the way. The first non-stale session found is removed from the + * pool and returned. If no non-stale session is found, a new ServerSession is created. + */ + acquire() { + const sessionTimeoutMinutes = this.topology.logicalSessionTimeoutMinutes || 10; + while (this.sessions.length) { + const session = this.sessions.shift(); + if (session && (this.topology.loadBalanced || !session.hasTimedOut(sessionTimeoutMinutes))) { + return session; + } + } + return new ServerSession(); + } + /** + * Release a session to the session pool + * Adds the session back to the session pool if the session has not timed out yet. + * This method also removes any stale sessions from the pool. 
+ * + * @param session - The session to release to the pool + */ + release(session) { + const sessionTimeoutMinutes = this.topology.logicalSessionTimeoutMinutes; + if (this.topology.loadBalanced && !sessionTimeoutMinutes) { + this.sessions.unshift(session); + } + if (!sessionTimeoutMinutes) { + return; + } + while (this.sessions.length) { + const pooledSession = this.sessions[this.sessions.length - 1]; + if (pooledSession.hasTimedOut(sessionTimeoutMinutes)) { + this.sessions.pop(); + } + else { + break; + } + } + if (!session.hasTimedOut(sessionTimeoutMinutes)) { + if (session.isDirty) { + return; + } + // otherwise, readd this session to the session pool + this.sessions.unshift(session); + } + } +} +exports.ServerSessionPool = ServerSessionPool; +// TODO: this should be codified in command construction +// @see https://github.com/mongodb/specifications/blob/master/source/read-write-concern/read-write-concern.rst#read-concern +function commandSupportsReadConcern(command, options) { + if (command.aggregate || command.count || command.distinct || command.find || command.geoNear) { + return true; + } + if (command.mapReduce && + options && + options.out && + (options.out.inline === 1 || options.out === 'inline')) { + return true; + } + return false; +} +exports.commandSupportsReadConcern = commandSupportsReadConcern; +/** + * Optionally decorate a command with sessions specific keys + * + * @param session - the session tracking transaction state + * @param command - the command to decorate + * @param options - Optional settings passed to calling operation + */ +function applySession(session, command, options) { + var _b; + // TODO: merge this with `assertAlive`, did not want to throw a try/catch here + if (session.hasEnded) { + return new error_1.MongoExpiredSessionError(); + } + const serverSession = session.serverSession; + if (serverSession == null) { + return new error_1.MongoRuntimeError('Unable to acquire server session'); + } + // SPEC-1019: silently ignore explicit session with unacknowledged write for backwards compatibility + // FIXME: NODE-2781, this check for write concern shouldn't be happening here, but instead during command construction + if (options && options.writeConcern && options.writeConcern.w === 0) { + if (session && session.explicit) { + return new error_1.MongoAPIError('Cannot have explicit session with unacknowledged writes'); + } + return; + } + // mark the last use of this session, and apply the `lsid` + serverSession.lastUse = utils_1.now(); + command.lsid = serverSession.id; + // first apply non-transaction-specific sessions data + const inTransaction = session.inTransaction() || transactions_1.isTransactionCommand(command); + const isRetryableWrite = (options === null || options === void 0 ? 
void 0 : options.willRetryWrite) || false; + if (serverSession.txnNumber && (isRetryableWrite || inTransaction)) { + command.txnNumber = bson_1.Long.fromNumber(serverSession.txnNumber); + } + if (!inTransaction) { + if (session.transaction.state !== transactions_1.TxnState.NO_TRANSACTION) { + session.transaction.transition(transactions_1.TxnState.NO_TRANSACTION); + } + if (session.supports.causalConsistency && + session.operationTime && + commandSupportsReadConcern(command, options)) { + command.readConcern = command.readConcern || {}; + Object.assign(command.readConcern, { afterClusterTime: session.operationTime }); + } + else if (session[kSnapshotEnabled]) { + command.readConcern = command.readConcern || { level: read_concern_1.ReadConcernLevel.snapshot }; + if (session[kSnapshotTime] != null) { + Object.assign(command.readConcern, { atClusterTime: session[kSnapshotTime] }); + } + } + return; + } + // now attempt to apply transaction-specific sessions data + // `autocommit` must always be false to differentiate from retryable writes + command.autocommit = false; + if (session.transaction.state === transactions_1.TxnState.STARTING_TRANSACTION) { + session.transaction.transition(transactions_1.TxnState.TRANSACTION_IN_PROGRESS); + command.startTransaction = true; + const readConcern = session.transaction.options.readConcern || ((_b = session === null || session === void 0 ? void 0 : session.clientOptions) === null || _b === void 0 ? void 0 : _b.readConcern); + if (readConcern) { + command.readConcern = readConcern; + } + if (session.supports.causalConsistency && session.operationTime) { + command.readConcern = command.readConcern || {}; + Object.assign(command.readConcern, { afterClusterTime: session.operationTime }); + } + } +} +exports.applySession = applySession; +function updateSessionFromResponse(session, document) { + var _b; + if (document.$clusterTime) { + common_1._advanceClusterTime(session, document.$clusterTime); + } + if (document.operationTime && session && session.supports.causalConsistency) { + session.advanceOperationTime(document.operationTime); + } + if (document.recoveryToken && session && session.inTransaction()) { + session.transaction._recoveryToken = document.recoveryToken; + } + if ((session === null || session === void 0 ? void 0 : session[kSnapshotEnabled]) && session[kSnapshotTime] == null) { + // find and aggregate commands return atClusterTime on the cursor + // distinct includes it in the response body + const atClusterTime = ((_b = document.cursor) === null || _b === void 0 ? 
void 0 : _b.atClusterTime) || document.atClusterTime; + if (atClusterTime) { + session[kSnapshotTime] = atClusterTime; + } + } +} +exports.updateSessionFromResponse = updateSessionFromResponse; +//# sourceMappingURL=sessions.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sessions.js.map b/node_modules/mongodb/lib/sessions.js.map new file mode 100644 index 000000000..fef06faf9 --- /dev/null +++ b/node_modules/mongodb/lib/sessions.js.map @@ -0,0 +1 @@ +{"version":3,"file":"sessions.js","sourceRoot":"","sources":["../src/sessions.ts"],"names":[],"mappings":";;;;AAAA,yDAAqD;AACrD,iCAA2D;AAC3D,uDAAmD;AACnD,iDAAiG;AACjG,0CAA+E;AAC/E,wDAAwD;AACxD,mCAeiB;AACjB,mCAQiB;AAGjB,sEAAkE;AAClE,0DAAoE;AAGpE,kDAA+C;AAC/C,4CAAuD;AAEvD,+CAAkD;AAClD,iDAAkD;AAElD,MAAM,oCAAoC,GAAG,CAAC,CAAC;AAE/C,SAAS,WAAW,CAAC,OAAsB,EAAE,QAAmB;IAC9D,IAAI,OAAO,CAAC,aAAa,IAAI,IAAI,EAAE;QACjC,MAAM,KAAK,GAAG,IAAI,gCAAwB,EAAE,CAAC;QAC7C,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,QAAQ,CAAC,KAAK,CAAC,CAAC;YAChB,OAAO,KAAK,CAAC;SACd;QAED,MAAM,KAAK,CAAC;KACb;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AA2BD,gBAAgB;AAChB,MAAM,cAAc,GAAG,MAAM,CAAC,eAAe,CAAC,CAAC;AAC/C,gBAAgB;AAChB,MAAM,aAAa,GAAG,MAAM,CAAC,cAAc,CAAC,CAAC;AAC7C,gBAAgB;AAChB,MAAM,gBAAgB,GAAG,MAAM,CAAC,iBAAiB,CAAC,CAAC;AACnD,gBAAgB;AAChB,MAAM,iBAAiB,GAAG,MAAM,CAAC,kBAAkB,CAAC,CAAC;AAarD;;;;;GAKG;AACH,MAAa,aAAc,SAAQ,+BAAsC;IAwBvE;;;;;;;OAOG;IACH,YACE,QAAkB,EAClB,WAA8B,EAC9B,OAA6B,EAC7B,aAA4B;QAE5B,KAAK,EAAE,CAAC;QAnBV,gBAAgB;QAChB,QAAkB,GAAG,KAAK,CAAC;QAoBzB,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,kBAAkB;YAClB,MAAM,IAAI,yBAAiB,CAAC,mCAAmC,CAAC,CAAC;SAClE;QAED,IAAI,WAAW,IAAI,IAAI,IAAI,CAAC,CAAC,WAAW,YAAY,iBAAiB,CAAC,EAAE;YACtE,kBAAkB;YAClB,MAAM,IAAI,yBAAiB,CAAC,4CAA4C,CAAC,CAAC;SAC3E;QAED,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QAExB,IAAI,OAAO,CAAC,QAAQ,KAAK,IAAI,EAAE;YAC7B,IAAI,CAAC,gBAAgB,CAAC,GAAG,IAAI,CAAC;YAC9B,IAAI,OAAO,CAAC,iBAAiB,KAAK,IAAI,EAAE;gBACtC,MAAM,IAAI,iCAAyB,CACjC,sEAAsE,CACvE,CAAC;aACH;SACF;QAED,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,WAAW,GAAG,WAAW,CAAC;QAC/B,IAAI,CAAC,QAAQ,GAAG,KAAK,CAAC;QACtB,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;QACnC,IAAI,CAAC,cAAc,CAAC,GAAG,SAAS,CAAC;QAEjC,IAAI,CAAC,QAAQ,GAAG;YACd,iBAAiB,EAAE,OAAO,CAAC,QAAQ,KAAK,IAAI,IAAI,OAAO,CAAC,iBAAiB,KAAK,KAAK;SACpF,CAAC;QAEF,IAAI,CAAC,WAAW,GAAG,OAAO,CAAC,kBAAkB,CAAC;QAE9C,IAAI,CAAC,aAAa,GAAG,SAAS,CAAC;QAC/B,IAAI,CAAC,QAAQ,GAAG,CAAC,CAAC,OAAO,CAAC,QAAQ,CAAC;QACnC,IAAI,CAAC,KAAK,GAAG,OAAO,CAAC,KAAK,CAAC;QAC3B,IAAI,CAAC,yBAAyB,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,yBAAyB,CAAC,CAAC;QACtF,IAAI,CAAC,WAAW,GAAG,IAAI,0BAAW,EAAE,CAAC;IACvC,CAAC;IAED,iDAAiD;IACjD,IAAI,EAAE;;QACJ,OAAO,MAAA,IAAI,CAAC,aAAa,0CAAE,EAAE,CAAC;IAChC,CAAC;IAED,IAAI,aAAa;QACf,IAAI,IAAI,CAAC,cAAc,CAAC,IAAI,IAAI,EAAE;YAChC,IAAI,CAAC,cAAc,CAAC,GAAG,IAAI,CAAC,WAAW,CAAC,OAAO,EAAE,CAAC;SACnD;QAED,oEAAoE;QACpE,OAAO,IAAI,CAAC,cAAc,CAAE,CAAC;IAC/B,CAAC;IAED,mEAAmE;IACnE,IAAI,eAAe;QACjB,OAAO,IAAI,CAAC,gBAAgB,CAAC,CAAC;IAChC,CAAC;IAED,IAAI,YAAY;QACd,OAAO,IAAI,CAAC,QAAQ,CAAC,WAAW,CAAC,IAAI,KAAK,qBAAY,CAAC,YAAY,CAAC;IACtE,CAAC;IAED,gBAAgB;IAChB,IAAI,gBAAgB;QAClB,OAAO,IAAI,CAAC,iBAAiB,CAAC,CAAC;IACjC,CAAC;IAED,gBAAgB;IAChB,GAAG,CAAC,IAAgB;QAClB,IAAI,IAAI,CAAC,iBAAiB,CAAC,EAAE;YAC3B,MAAM,SAAS,CAAC,qDAAqD,CAAC,CAAC;SACxE;QAED,IAAI,CAAC,iBAAiB,CAAC,GAAG,IAAI,CAAC;QAC/B,IAAI,CAAC,IAAI,CACP,uBAAU,CAAC,MAAM,EACjB,IAAI,CAAC,aAAa,EAAE,CAAC,CAAC,CAAC,+BAAqB,CAAC,GAAG,CAAC,CAAC,CAAC,+BAAqB,CAAC,MAAM,CAChF,CAAC;IACJ,CAAC;IAED,gBAAgB;IAChB,KAAK,CAAC,OAAqE;QACzE,IAAI,IAAI,CAAC,YAAY,EAAE;YACrB,OAAO,0BAA0B,CAAC,IAAI,EAAE,OAAO,CAAC,CAAC;SAClD;QAED,IAAI,CAA
C,WAAW,CAAC,WAAW,EAAE,CAAC;IACjC,CAAC;IAED,IAAI,QAAQ;QACV,OAAO,IAAI,CAAC,YAAY,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,iBAAiB,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,WAAW,CAAC,QAAQ,CAAC;IACnF,CAAC;IAYD,UAAU,CACR,OAA4C,EAC5C,QAAyB;QAEzB,IAAI,OAAO,OAAO,KAAK,UAAU;YAAE,CAAC,QAAQ,GAAG,OAAO,CAAC,EAAE,CAAC,OAAO,GAAG,EAAE,CAAC,CAAC;QACxE,MAAM,YAAY,GAAG,EAAE,KAAK,EAAE,IAAI,EAAE,GAAG,OAAO,EAAE,CAAC;QAEjD,OAAO,oBAAY,CAAC,QAAQ,EAAE,IAAI,CAAC,EAAE;YACnC,IAAI,IAAI,CAAC,QAAQ,EAAE;gBACjB,0BAA0B,CAAC,IAAI,EAAE,YAAY,CAAC,CAAC;gBAC/C,OAAO,IAAI,EAAE,CAAC;aACf;YAED,MAAM,kBAAkB,GAAG,GAAG,EAAE;gBAC9B,0BAA0B,CAAC,IAAI,EAAE,YAAY,CAAC,CAAC;gBAE/C,8CAA8C;gBAC9C,IAAI,CAAC,WAAW,CAAC,OAAO,CAAC,IAAI,CAAC,aAAa,CAAC,CAAC;gBAC7C,IAAI,CAAC,cAAc,CAAC,GAAG,SAAS,CAAC;gBAEjC,+CAA+C;gBAC/C,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC;gBACrB,IAAI,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,CAAC;gBAEzB,oEAAoE;gBACpE,IAAI,EAAE,CAAC;YACT,CAAC,CAAC;YAEF,IAAI,IAAI,CAAC,aAAa,IAAI,IAAI,CAAC,aAAa,EAAE,EAAE;gBAC9C,IAAI,CAAC,gBAAgB,CAAC,GAAG,CAAC,EAAE;oBAC1B,IAAI,GAAG;wBAAE,OAAO,IAAI,CAAC,GAAG,CAAC,CAAC;oBAC1B,kBAAkB,EAAE,CAAC;gBACvB,CAAC,CAAC,CAAC;gBAEH,OAAO;aACR;YAED,kBAAkB,EAAE,CAAC;QACvB,CAAC,CAAC,CAAC;IACL,CAAC;IAED;;;;OAIG;IACH,oBAAoB,CAAC,aAAwB;QAC3C,IAAI,IAAI,CAAC,aAAa,IAAI,IAAI,EAAE;YAC9B,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;YACnC,OAAO;SACR;QAED,IAAI,aAAa,CAAC,WAAW,CAAC,IAAI,CAAC,aAAa,CAAC,EAAE;YACjD,IAAI,CAAC,aAAa,GAAG,aAAa,CAAC;SACpC;IACH,CAAC;IAED;;;;OAIG;IACH,kBAAkB,CAAC,WAAwB;;QACzC,IAAI,CAAC,WAAW,IAAI,OAAO,WAAW,KAAK,QAAQ,EAAE;YACnD,MAAM,IAAI,iCAAyB,CAAC,sCAAsC,CAAC,CAAC;SAC7E;QACD,IAAI,CAAC,WAAW,CAAC,WAAW,IAAI,WAAW,CAAC,WAAW,CAAC,SAAS,KAAK,WAAW,EAAE;YACjF,MAAM,IAAI,iCAAyB,CACjC,0EAA0E,CAC3E,CAAC;SACH;QACD,IACE,CAAC,WAAW,CAAC,SAAS;YACtB,CAAA,MAAA,WAAW,CAAC,SAAS,CAAC,IAAI,0CAAE,SAAS,MAAK,QAAQ;YAClD,CAAC,OAAO,WAAW,CAAC,SAAS,CAAC,KAAK,KAAK,QAAQ;gBAC9C,CAAA,MAAA,WAAW,CAAC,SAAS,CAAC,KAAK,0CAAE,SAAS,MAAK,MAAM,CAAC,CAAC,0CAA0C;UAC/F;YACA,MAAM,IAAI,iCAAyB,CACjC,qGAAqG,CACtG,CAAC;SACH;QAED,4BAAmB,CAAC,IAAI,EAAE,WAAW,CAAC,CAAC;IACzC,CAAC;IAED;;;;OAIG;IACH,MAAM,CAAC,OAAsB;QAC3B,IAAI,CAAC,CAAC,OAAO,YAAY,aAAa,CAAC,EAAE;YACvC,OAAO,KAAK,CAAC;SACd;QAED,IAAI,IAAI,CAAC,EAAE,IAAI,IAAI,IAAI,OAAO,CAAC,EAAE,IAAI,IAAI,EAAE;YACzC,OAAO,KAAK,CAAC;SACd;QAED,OAAO,IAAI,CAAC,EAAE,CAAC,EAAE,CAAC,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,EAAE,CAAC,EAAE,CAAC,MAAM,CAAC,CAAC;IACxD,CAAC;IAED,qEAAqE;IACrE,0BAA0B;QACxB,IAAI,IAAI,CAAC,aAAa,EAAE;YACtB,IAAI,CAAC,aAAa,CAAC,SAAS;gBAC1B,OAAO,IAAI,CAAC,aAAa,CAAC,SAAS,KAAK,QAAQ,CAAC,CAAC,CAAC,IAAI,CAAC,aAAa,CAAC,SAAS,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;SAC3F;IACH,CAAC;IAED,yEAAyE;IACzE,aAAa;QACX,OAAO,IAAI,CAAC,WAAW,CAAC,QAAQ,CAAC;IACnC,CAAC;IAED;;;;OAIG;IACH,gBAAgB,CAAC,OAA4B;;QAC3C,IAAI,IAAI,CAAC,gBAAgB,CAAC,EAAE;YAC1B,MAAM,IAAI,+BAAuB,CAAC,qDAAqD,CAAC,CAAC;SAC1F;QAED,WAAW,CAAC,IAAI,CAAC,CAAC;QAClB,IAAI,IAAI,CAAC,aAAa,EAAE,EAAE;YACxB,MAAM,IAAI,6BAAqB,CAAC,iCAAiC,CAAC,CAAC;SACpE;QAED,IAAI,IAAI,CAAC,QAAQ,IAAI,IAAI,CAAC,WAAW,CAAC,WAAW,EAAE;YACjD,IAAI,CAAC,KAAK,EAAE,CAAC;SACd;QAED,MAAM,sBAAsB,GAAG,sBAAc,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC;QAC7D,IACE,kBAAS,CAAC,IAAI,CAAC,QAAQ,CAAC;YACxB,sBAAsB,IAAI,IAAI;YAC9B,sBAAsB,GAAG,oCAAoC,EAC7D;YACA,MAAM,IAAI,+BAAuB,CAC/B,sEAAsE,CACvE,CAAC;SACH;QAED,sBAAsB;QACtB,IAAI,CAAC,0BAA0B,EAAE,CAAC;QAClC,2BAA2B;QAC3B,IAAI,CAAC,WAAW,GAAG,IAAI,0BAAW,CAAC;YACjC,WAAW,EACT,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,WAAW,mCACpB,IAAI,CAAC,yBAAyB,CAAC,WAAW,mCAC1C,MAAA,IAAI,CAAC,aAAa,0CAAE,WAAW;YACjC,YAAY,EACV,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,YAAY,mCACrB,IAAI,CAAC,yBAAyB,CAAC,YAAY,mCAC3C,MAAA,IAAI,CAAC,aAAa,0CAAE,YAAY;YAClC,cAAc,EACZ,MAAA,MAAA,OAAO,aAAP,OAAO,uBAAP
,OAAO,CAAE,cAAc,mCACvB,IAAI,CAAC,yBAAyB,CAAC,cAAc,mCAC7C,MAAA,IAAI,CAAC,aAAa,0CAAE,cAAc;YACpC,eAAe,EAAE,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,eAAe,mCAAI,IAAI,CAAC,yBAAyB,CAAC,eAAe;SAC5F,CAAC,CAAC;QAEH,IAAI,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,oBAAoB,CAAC,CAAC;IAC7D,CAAC;IASD,iBAAiB,CAAC,QAA6B;QAC7C,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE,CAAC,cAAc,CAAC,IAAI,EAAE,mBAAmB,EAAE,EAAE,CAAC,CAAC,CAAC;IACrF,CAAC;IASD,gBAAgB,CAAC,QAA6B;QAC5C,OAAO,oBAAY,CAAC,QAAQ,EAAE,EAAE,CAAC,EAAE,CAAC,cAAc,CAAC,IAAI,EAAE,kBAAkB,EAAE,EAAE,CAAC,CAAC,CAAC;IACpF,CAAC;IAED;;OAEG;IACH,MAAM;QACJ,MAAM,IAAI,yBAAiB,CAAC,6CAA6C,CAAC,CAAC;IAC7E,CAAC;IAED;;;;;;;;;;OAUG;IACH,eAAe,CACb,EAA8B,EAC9B,OAA4B;QAE5B,MAAM,SAAS,GAAG,WAAG,EAAE,CAAC;QACxB,OAAO,kBAAkB,CAAC,IAAI,EAAE,SAAS,EAAE,EAAE,EAAE,OAAO,CAAC,CAAC;IAC1D,CAAC;CACF;AAtWD,sCAsWC;KAlVE,gBAAgB;AAoVnB,MAAM,4BAA4B,GAAG,MAAM,CAAC;AAC5C,MAAM,sCAAsC,GAAG,IAAI,GAAG,CAAC;IACrD,2BAA2B;IAC3B,yBAAyB;IACzB,2BAA2B;CAC5B,CAAC,CAAC;AAEH,SAAS,cAAc,CAAC,SAAiB,EAAE,GAAW;IACpD,OAAO,6BAAqB,CAAC,SAAS,CAAC,GAAG,GAAG,CAAC;AAChD,CAAC;AAED,SAAS,gCAAgC,CAAC,GAAe;IACvD,MAAM,mCAAmC,GACvC,GAAG,YAAY,wBAAgB;QAC/B,GAAG,CAAC,QAAQ;QACZ,sCAAsC,CAAC,GAAG,CAAC,GAAG,CAAC,QAAQ,CAAC,CAAC;IAE3D,OAAO,CACL,uBAAuB,CAAC,GAAG,CAAC;QAC5B,CAAC,CAAC,mCAAmC;YACnC,GAAG,CAAC,IAAI,KAAK,2BAAmB,CAAC,yBAAyB;YAC1D,GAAG,CAAC,IAAI,KAAK,2BAAmB,CAAC,uBAAuB,CAAC,CAC5D,CAAC;AACJ,CAAC;AAED,SAAgB,0BAA0B,CACxC,OAAsB,EACtB,OAA2B;IAE3B,2CAA2C;IAC3C,MAAM,IAAI,GAAG,OAAO,CAAC,iBAAiB,CAAC,CAAC;IACxC,MAAM,KAAK,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,KAAK,CAAC;IAE7B,IACE,OAAO,CAAC,aAAa,EAAE;QACvB,KAAK;QACL,KAAK,YAAY,kBAAU;QAC3B,KAAK,CAAC,aAAa,CAAC,2BAA2B,CAAC,EAChD;QACA,OAAO;KACR;IAED,uFAAuF;IACvF,yDAAyD;IACzD,IAAI,IAAI,EAAE;QACR,MAAM,OAAO,GAAG,KAAK,CAAC,IAAI,CAAC,OAAO,CAAC,QAAQ,CAAC,CAAC,CAAC,OAAO,CAAC,MAAM,EAAE,CAAC,CAAC;QAChE,MAAM,YAAY,GAAG,OAAO,CAAC,CAAC,CAAC,CAAC;QAEhC,IAAI,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,KAAK,KAAI,IAAI,KAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,KAAK,CAAA,EAAE;YAC5C,YAAY,CAAC,CAAC,CAAC,IAAI,CAAC,OAAO,CAAC,IAAI,CAAC,CAAC;YAClC,IAAI,CAAC,IAAI,CACP,uBAAU,CAAC,QAAQ,EACnB,OAAO,CAAC,WAAW,CAAC,KAAK,KAAK,uBAAQ,CAAC,cAAc;gBACnD,CAAC,CAAC,+BAAqB,CAAC,GAAG;gBAC3B,CAAC,CAAC,+BAAqB,CAAC,MAAM,CACjC,CAAC;YAEF,IAAI,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,UAAU,EAAE;gBACvB,YAAY,CAAC,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;aAC3C;SACF;QAED,OAAO,CAAC,iBAAiB,CAAC,GAAG,SAAS,CAAC;KACxC;AACH,CAAC;AAvCD,gEAuCC;AAED,SAAS,uBAAuB,CAAC,GAAe;IAC9C,IAAI,GAAG,IAAI,IAAI,IAAI,CAAC,CAAC,GAAG,YAAY,wBAAgB,CAAC,EAAE;QACrD,OAAO,KAAK,CAAC;KACd;IAED,OAAO,CACL,GAAG,CAAC,IAAI,KAAK,2BAAmB,CAAC,gBAAgB;QACjD,CAAC,GAAG,CAAC,iBAAiB,IAAI,GAAG,CAAC,iBAAiB,CAAC,IAAI,KAAK,2BAAmB,CAAC,gBAAgB,CAAC,CAC/F,CAAC;AACJ,CAAC;AAED,SAAS,wBAAwB,CAC/B,OAAsB,EACtB,SAAiB,EACjB,EAA8B,EAC9B,OAA4B;IAE5B,OAAO,OAAO,CAAC,iBAAiB,EAAE,CAAC,KAAK,CAAC,CAAC,GAAe,EAAE,EAAE;QAC3D,IACE,GAAG,YAAY,kBAAU;YACzB,cAAc,CAAC,SAAS,EAAE,4BAA4B,CAAC;YACvD,CAAC,uBAAuB,CAAC,GAAG,CAAC,EAC7B;YACA,IAAI,GAAG,CAAC,aAAa,CAAC,gCAAgC,CAAC,EAAE;gBACvD,OAAO,wBAAwB,CAAC,OAAO,EAAE,SAAS,EAAE,EAAE,EAAE,OAAO,CAAC,CAAC;aAClE;YAED,IAAI,GAAG,CAAC,aAAa,CAAC,2BAA2B,CAAC,EAAE;gBAClD,OAAO,kBAAkB,CAAC,OAAO,EAAE,SAAS,EAAE,EAAE,EAAE,OAAO,CAAC,CAAC;aAC5D;SACF;QAED,MAAM,GAAG,CAAC;IACZ,CAAC,CAAC,CAAC;AACL,CAAC;AAED,MAAM,4BAA4B,GAAG,IAAI,GAAG,CAAW;IACrD,uBAAQ,CAAC,cAAc;IACvB,uBAAQ,CAAC,qBAAqB;IAC9B,uBAAQ,CAAC,mBAAmB;CAC7B,CAAC,CAAC;AAEH,SAAS,8BAA8B,CAAC,OAAsB;IAC5D,OAAO,4BAA4B,CAAC,GAAG,CAAC,OAAO,CAAC,WAAW,CAAC,KAAK,CAAC,CAAC;AACrE,CAAC;AAED,SAAS,kBAAkB,CACzB,OAAsB,EACtB,SAAiB,EACjB,EAAoC,EACpC,OAA4B;IAE5B,MAAM,OAAO,GAAG,kCAAe,CAAC,GAAG,EAAE,CAAC;IAC
tC,OAAO,CAAC,gBAAgB,CAAC,OAAO,CAAC,CAAC;IAElC,IAAI,OAAO,CAAC;IACZ,IAAI;QACF,OAAO,GAAG,EAAE,CAAC,OAAO,CAAC,CAAC;KACvB;IAAC,OAAO,GAAG,EAAE;QACZ,OAAO,GAAG,OAAO,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;KAC/B;IAED,IAAI,CAAC,qBAAa,CAAC,OAAO,CAAC,EAAE;QAC3B,OAAO,CAAC,gBAAgB,EAAE,CAAC;QAC3B,MAAM,IAAI,iCAAyB,CACjC,8DAA8D,CAC/D,CAAC;KACH;IAED,OAAO,OAAO,CAAC,IAAI,CACjB,GAAG,EAAE;QACH,IAAI,8BAA8B,CAAC,OAAO,CAAC,EAAE;YAC3C,OAAO;SACR;QAED,OAAO,wBAAwB,CAAC,OAAO,EAAE,SAAS,EAAE,EAAE,EAAE,OAAO,CAAC,CAAC;IACnE,CAAC,EACD,GAAG,CAAC,EAAE;QACJ,SAAS,iBAAiB,CAAC,GAAe;YACxC,IACE,GAAG,YAAY,kBAAU;gBACzB,GAAG,CAAC,aAAa,CAAC,2BAA2B,CAAC;gBAC9C,cAAc,CAAC,SAAS,EAAE,4BAA4B,CAAC,EACvD;gBACA,OAAO,kBAAkB,CAAC,OAAO,EAAE,SAAS,EAAE,EAAE,EAAE,OAAO,CAAC,CAAC;aAC5D;YAED,IAAI,uBAAuB,CAAC,GAAG,CAAC,EAAE;gBAChC,GAAG,CAAC,aAAa,CAAC,gCAAgC,CAAC,CAAC;aACrD;YAED,MAAM,GAAG,CAAC;QACZ,CAAC;QAED,IAAI,OAAO,CAAC,WAAW,CAAC,QAAQ,EAAE;YAChC,OAAO,OAAO,CAAC,gBAAgB,EAAE,CAAC,IAAI,CAAC,GAAG,EAAE,CAAC,iBAAiB,CAAC,GAAG,CAAC,CAAC,CAAC;SACtE;QAED,OAAO,iBAAiB,CAAC,GAAG,CAAC,CAAC;IAChC,CAAC,CACF,CAAC;AACJ,CAAC;AAED,SAAS,cAAc,CAAC,OAAsB,EAAE,WAAmB,EAAE,QAA4B;IAC/F,IAAI,CAAC,WAAW,CAAC,OAAO,EAAE,QAAQ,CAAC,EAAE;QACnC,8CAA8C;QAC9C,OAAO;KACR;IAED,uCAAuC;IACvC,MAAM,QAAQ,GAAG,OAAO,CAAC,WAAW,CAAC,KAAK,CAAC;IAE3C,IAAI,QAAQ,KAAK,uBAAQ,CAAC,cAAc,EAAE;QACxC,QAAQ,CAAC,IAAI,6BAAqB,CAAC,wBAAwB,CAAC,CAAC,CAAC;QAC9D,OAAO;KACR;IAED,IAAI,WAAW,KAAK,mBAAmB,EAAE;QACvC,IACE,QAAQ,KAAK,uBAAQ,CAAC,oBAAoB;YAC1C,QAAQ,KAAK,uBAAQ,CAAC,2BAA2B,EACjD;YACA,6DAA6D;YAC7D,OAAO,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,2BAA2B,CAAC,CAAC;YACrE,QAAQ,EAAE,CAAC;YACX,OAAO;SACR;QAED,IAAI,QAAQ,KAAK,uBAAQ,CAAC,mBAAmB,EAAE;YAC7C,QAAQ,CACN,IAAI,6BAAqB,CAAC,8DAA8D,CAAC,CAC1F,CAAC;YACF,OAAO;SACR;KACF;SAAM;QACL,IAAI,QAAQ,KAAK,uBAAQ,CAAC,oBAAoB,EAAE;YAC9C,6DAA6D;YAC7D,OAAO,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,mBAAmB,CAAC,CAAC;YAC7D,QAAQ,EAAE,CAAC;YACX,OAAO;SACR;QAED,IAAI,QAAQ,KAAK,uBAAQ,CAAC,mBAAmB,EAAE;YAC7C,QAAQ,CAAC,IAAI,6BAAqB,CAAC,oCAAoC,CAAC,CAAC,CAAC;YAC1E,OAAO;SACR;QAED,IACE,QAAQ,KAAK,uBAAQ,CAAC,qBAAqB;YAC3C,QAAQ,KAAK,uBAAQ,CAAC,2BAA2B,EACjD;YACA,QAAQ,CACN,IAAI,6BAAqB,CAAC,8DAA8D,CAAC,CAC1F,CAAC;YACF,OAAO;SACR;KACF;IAED,iCAAiC;IACjC,MAAM,OAAO,GAAa,EAAE,CAAC,WAAW,CAAC,EAAE,CAAC,EAAE,CAAC;IAE/C,oCAAoC;IACpC,IAAI,YAAY,CAAC;IACjB,IAAI,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,YAAY,EAAE;QAC5C,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,YAAY,CAAC,CAAC;KAC5E;SAAM,IAAI,OAAO,CAAC,aAAa,IAAI,OAAO,CAAC,aAAa,CAAC,YAAY,EAAE;QACtE,YAAY,GAAG,EAAE,CAAC,EAAE,OAAO,CAAC,aAAa,CAAC,YAAY,CAAC,CAAC,EAAE,CAAC;KAC5D;IAED,IAAI,QAAQ,KAAK,uBAAQ,CAAC,qBAAqB,EAAE;QAC/C,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,QAAQ,EAAE,KAAK,EAAE,EAAE,YAAY,EAAE,EAAE,CAAC,EAAE,UAAU,EAAE,CAAC,CAAC;KACpF;IAED,IAAI,YAAY,EAAE;QAChB,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,EAAE,YAAY,EAAE,CAAC,CAAC;KAC1C;IAED,IAAI,WAAW,KAAK,mBAAmB,IAAI,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,SAAS,EAAE;QAChF,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,EAAE,SAAS,EAAE,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,SAAS,EAAE,CAAC,CAAC;KAC9E;IAED,SAAS,cAAc,CAAC,CAAc,EAAE,CAAY;QAClD,IAAI,WAAW,KAAK,mBAAmB,EAAE;YACvC,OAAO,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,mBAAmB,CAAC,CAAC;YAC7D,IAAI,OAAO,CAAC,YAAY,EAAE;gBACxB,0BAA0B,CAAC,OAAO,EAAE,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC,CAAC;aACvD;YAED,4EAA4E;YAC5E,OAAO,QAAQ,EAAE,CAAC;SACnB;QAED,OAAO,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,qBAAqB,CAAC,CAAC;QAC/D,IAAI,CAAC,EAAE;YACL,IACE,CAAC,YAAY,yBAAiB;gBAC9B,CAAC,YAAY,8BAAsB;gBACnC,wBAAgB,CAAC,CAAC,CAAC;gBACnB,uBAAuB,CAAC,CAAC,CAAC,EAC1B;gBACA,IAAI,gCAAgC,CAAC,CAAC,CAAC,EAAE;oBACvC,CAAC,CAAC,aAAa,CAAC,gCAAgC,CAAC,CAAC;oBAElD,iDAAiD;oBACjD,OAAO,CAAC,KAAK,CAAC,EAAE,KAAK,
EAAE,CAAC,EAAE,CAAC,CAAC;iBAC7B;aACF;iBAAM,IAAI,CAAC,CAAC,aAAa,CAAC,2BAA2B,CAAC,EAAE;gBACvD,OAAO,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,CAAC,EAAE,CAAC,CAAC;aAC7B;SACF;QAED,QAAQ,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;IACjB,CAAC;IAED,gFAAgF;IAChF,IAAI,OAAO,CAAC,WAAW,CAAC,aAAa,EAAE;QACrC,OAAO,CAAC,aAAa,GAAG,OAAO,CAAC,WAAW,CAAC,aAAa,CAAC;KAC3D;IAED,mBAAmB;IACnB,oCAAgB,CACd,OAAO,CAAC,QAAQ,EAChB,IAAI,sCAAwB,CAAC,SAAS,EAAE,OAAO,EAAE;QAC/C,OAAO;QACP,cAAc,EAAE,gCAAc,CAAC,OAAO;QACtC,kBAAkB,EAAE,IAAI;KACzB,CAAC,EACF,CAAC,GAAG,EAAE,KAAK,EAAE,EAAE;QACb,IAAI,OAAO,CAAC,gBAAgB,EAAE;YAC5B,sDAAsD;YACtD,OAAO,CAAC,KAAK,EAAE,CAAC;SACjB;QAED,IAAI,GAAG,IAAI,wBAAgB,CAAC,GAAiB,CAAC,EAAE;YAC9C,0EAA0E;YAC1E,IAAI,OAAO,CAAC,iBAAiB,EAAE;gBAC7B,iDAAiD;gBACjD,OAAO,CAAC,KAAK,CAAC,EAAE,KAAK,EAAE,IAAI,EAAE,CAAC,CAAC;gBAE/B,OAAO,CAAC,YAAY,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,QAAQ,EAAE,KAAK,EAAE,EAAE,OAAO,CAAC,YAAY,EAAE;oBAC9E,CAAC,EAAE,UAAU;iBACd,CAAC,CAAC;aACJ;YAED,OAAO,oCAAgB,CACrB,OAAO,CAAC,QAAQ,EAChB,IAAI,sCAAwB,CAAC,SAAS,EAAE,OAAO,EAAE;gBAC/C,OAAO;gBACP,cAAc,EAAE,gCAAc,CAAC,OAAO;gBACtC,kBAAkB,EAAE,IAAI;aACzB,CAAC,EACF,CAAC,IAAI,EAAE,MAAM,EAAE,EAAE,CAAC,cAAc,CAAC,IAAkB,EAAE,MAAM,CAAC,CAC7D,CAAC;SACH;QAED,cAAc,CAAC,GAAiB,EAAE,KAAK,CAAC,CAAC;IAC3C,CAAC,CACF,CAAC;AACJ,CAAC;AAKD;;;;GAIG;AACH,MAAa,aAAa;IAMxB,gBAAgB;IAChB;QACE,IAAI,CAAC,EAAE,GAAG,EAAE,EAAE,EAAE,IAAI,aAAM,CAAC,cAAM,EAAE,EAAE,aAAM,CAAC,YAAY,CAAC,EAAE,CAAC;QAC5D,IAAI,CAAC,OAAO,GAAG,WAAG,EAAE,CAAC;QACrB,IAAI,CAAC,SAAS,GAAG,CAAC,CAAC;QACnB,IAAI,CAAC,OAAO,GAAG,KAAK,CAAC;IACvB,CAAC;IAED;;;;OAIG;IACH,WAAW,CAAC,qBAA6B;QACvC,wFAAwF;QACxF,+FAA+F;QAC/F,MAAM,eAAe,GAAG,IAAI,CAAC,KAAK,CAChC,CAAC,CAAC,6BAAqB,CAAC,IAAI,CAAC,OAAO,CAAC,GAAG,QAAQ,CAAC,GAAG,OAAO,CAAC,GAAG,KAAK,CACrE,CAAC;QAEF,OAAO,eAAe,GAAG,qBAAqB,GAAG,CAAC,CAAC;IACrD,CAAC;CACF;AA5BD,sCA4BC;AAED;;;;GAIG;AACH,MAAa,iBAAiB;IAI5B,YAAY,QAAkB;QAC5B,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,MAAM,IAAI,yBAAiB,CAAC,uCAAuC,CAAC,CAAC;SACtE;QAED,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;QACzB,IAAI,CAAC,QAAQ,GAAG,EAAE,CAAC;IACrB,CAAC;IAED,4CAA4C;IAC5C,oBAAoB,CAAC,QAAyB;QAC5C,IAAI,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE;YACxB,IAAI,CAAC,QAAQ,CAAC,WAAW,CACvB,IAAI,CAAC,QAAQ,CAAC,GAAG,CAAC,CAAC,OAAsB,EAAE,EAAE,CAAC,OAAO,CAAC,EAAE,CAAC,EACzD,GAAG,EAAE;gBACH,IAAI,CAAC,QAAQ,GAAG,EAAE,CAAC;gBACnB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;oBAClC,QAAQ,EAAE,CAAC;iBACZ;YACH,CAAC,CACF,CAAC;YAEF,OAAO;SACR;QAED,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;YAClC,QAAQ,EAAE,CAAC;SACZ;IACH,CAAC;IAED;;;;;OAKG;IACH,OAAO;QACL,MAAM,qBAAqB,GAAG,IAAI,CAAC,QAAQ,CAAC,4BAA4B,IAAI,EAAE,CAAC;QAE/E,OAAO,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE;YAC3B,MAAM,OAAO,GAAG,IAAI,CAAC,QAAQ,CAAC,KAAK,EAAE,CAAC;YACtC,IAAI,OAAO,IAAI,CAAC,IAAI,CAAC,QAAQ,CAAC,YAAY,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,qBAAqB,CAAC,CAAC,EAAE;gBAC1F,OAAO,OAAO,CAAC;aAChB;SACF;QAED,OAAO,IAAI,aAAa,EAAE,CAAC;IAC7B,CAAC;IAED;;;;;;OAMG;IACH,OAAO,CAAC,OAAsB;QAC5B,MAAM,qBAAqB,GAAG,IAAI,CAAC,QAAQ,CAAC,4BAA4B,CAAC;QAEzE,IAAI,IAAI,CAAC,QAAQ,CAAC,YAAY,IAAI,CAAC,qBAAqB,EAAE;YACxD,IAAI,CAAC,QAAQ,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAChC;QAED,IAAI,CAAC,qBAAqB,EAAE;YAC1B,OAAO;SACR;QAED,OAAO,IAAI,CAAC,QAAQ,CAAC,MAAM,EAAE;YAC3B,MAAM,aAAa,GAAG,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;YAC9D,IAAI,aAAa,CAAC,WAAW,CAAC,qBAAqB,CAAC,EAAE;gBACpD,IAAI,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC;aACrB;iBAAM;gBACL,MAAM;aACP;SACF;QAED,IAAI,CAAC,OAAO,CAAC,WAAW,CAAC,qBAAqB,CAAC,EAAE;YAC/C,IAAI,OAAO,CAAC,OAAO,EAAE;gBACnB,OAAO;aACR;YAED,oDAAoD;YACpD,IAAI,CAAC,QAAQ,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAChC;IACH,CAAC;CACF;AAzFD,8CAyFC;AAED,wDAAwD;AACxD,2HAA2H;AAC3H,SAAgB,0BAA0B,CAAC,OAAiB,EAAE,OAAkB;IAC9E,IAAI,OAAO,CAAC,SAAS,IAAI,OAAO,
CAAC,KAAK,IAAI,OAAO,CAAC,QAAQ,IAAI,OAAO,CAAC,IAAI,IAAI,OAAO,CAAC,OAAO,EAAE;QAC7F,OAAO,IAAI,CAAC;KACb;IAED,IACE,OAAO,CAAC,SAAS;QACjB,OAAO;QACP,OAAO,CAAC,GAAG;QACX,CAAC,OAAO,CAAC,GAAG,CAAC,MAAM,KAAK,CAAC,IAAI,OAAO,CAAC,GAAG,KAAK,QAAQ,CAAC,EACtD;QACA,OAAO,IAAI,CAAC;KACb;IAED,OAAO,KAAK,CAAC;AACf,CAAC;AAfD,gEAeC;AAED;;;;;;GAMG;AACH,SAAgB,YAAY,CAC1B,OAAsB,EACtB,OAAiB,EACjB,OAAwB;;IAExB,8EAA8E;IAC9E,IAAI,OAAO,CAAC,QAAQ,EAAE;QACpB,OAAO,IAAI,gCAAwB,EAAE,CAAC;KACvC;IAED,MAAM,aAAa,GAAG,OAAO,CAAC,aAAa,CAAC;IAC5C,IAAI,aAAa,IAAI,IAAI,EAAE;QACzB,OAAO,IAAI,yBAAiB,CAAC,kCAAkC,CAAC,CAAC;KAClE;IAED,oGAAoG;IACpG,sHAAsH;IACtH,IAAI,OAAO,IAAI,OAAO,CAAC,YAAY,IAAK,OAAO,CAAC,YAA6B,CAAC,CAAC,KAAK,CAAC,EAAE;QACrF,IAAI,OAAO,IAAI,OAAO,CAAC,QAAQ,EAAE;YAC/B,OAAO,IAAI,qBAAa,CAAC,yDAAyD,CAAC,CAAC;SACrF;QACD,OAAO;KACR;IAED,0DAA0D;IAC1D,aAAa,CAAC,OAAO,GAAG,WAAG,EAAE,CAAC;IAC9B,OAAO,CAAC,IAAI,GAAG,aAAa,CAAC,EAAE,CAAC;IAEhC,qDAAqD;IACrD,MAAM,aAAa,GAAG,OAAO,CAAC,aAAa,EAAE,IAAI,mCAAoB,CAAC,OAAO,CAAC,CAAC;IAC/E,MAAM,gBAAgB,GAAG,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,cAAc,KAAI,KAAK,CAAC;IAE1D,IAAI,aAAa,CAAC,SAAS,IAAI,CAAC,gBAAgB,IAAI,aAAa,CAAC,EAAE;QAClE,OAAO,CAAC,SAAS,GAAG,WAAI,CAAC,UAAU,CAAC,aAAa,CAAC,SAAS,CAAC,CAAC;KAC9D;IAED,IAAI,CAAC,aAAa,EAAE;QAClB,IAAI,OAAO,CAAC,WAAW,CAAC,KAAK,KAAK,uBAAQ,CAAC,cAAc,EAAE;YACzD,OAAO,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,cAAc,CAAC,CAAC;SACzD;QAED,IACE,OAAO,CAAC,QAAQ,CAAC,iBAAiB;YAClC,OAAO,CAAC,aAAa;YACrB,0BAA0B,CAAC,OAAO,EAAE,OAAO,CAAC,EAC5C;YACA,OAAO,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,IAAI,EAAE,CAAC;YAChD,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,gBAAgB,EAAE,OAAO,CAAC,aAAa,EAAE,CAAC,CAAC;SACjF;aAAM,IAAI,OAAO,CAAC,gBAAgB,CAAC,EAAE;YACpC,OAAO,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,IAAI,EAAE,KAAK,EAAE,+BAAgB,CAAC,QAAQ,EAAE,CAAC;YAClF,IAAI,OAAO,CAAC,aAAa,CAAC,IAAI,IAAI,EAAE;gBAClC,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,aAAa,EAAE,OAAO,CAAC,aAAa,CAAC,EAAE,CAAC,CAAC;aAC/E;SACF;QAED,OAAO;KACR;IAED,0DAA0D;IAE1D,2EAA2E;IAC3E,OAAO,CAAC,UAAU,GAAG,KAAK,CAAC;IAE3B,IAAI,OAAO,CAAC,WAAW,CAAC,KAAK,KAAK,uBAAQ,CAAC,oBAAoB,EAAE;QAC/D,OAAO,CAAC,WAAW,CAAC,UAAU,CAAC,uBAAQ,CAAC,uBAAuB,CAAC,CAAC;QACjE,OAAO,CAAC,gBAAgB,GAAG,IAAI,CAAC;QAEhC,MAAM,WAAW,GACf,OAAO,CAAC,WAAW,CAAC,OAAO,CAAC,WAAW,KAAI,MAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,0CAAE,WAAW,CAAA,CAAC;QACjF,IAAI,WAAW,EAAE;YACf,OAAO,CAAC,WAAW,GAAG,WAAW,CAAC;SACnC;QAED,IAAI,OAAO,CAAC,QAAQ,CAAC,iBAAiB,IAAI,OAAO,CAAC,aAAa,EAAE;YAC/D,OAAO,CAAC,WAAW,GAAG,OAAO,CAAC,WAAW,IAAI,EAAE,CAAC;YAChD,MAAM,CAAC,MAAM,CAAC,OAAO,CAAC,WAAW,EAAE,EAAE,gBAAgB,EAAE,OAAO,CAAC,aAAa,EAAE,CAAC,CAAC;SACjF;KACF;AACH,CAAC;AA9ED,oCA8EC;AAED,SAAgB,yBAAyB,CAAC,OAAsB,EAAE,QAAkB;;IAClF,IAAI,QAAQ,CAAC,YAAY,EAAE;QACzB,4BAAmB,CAAC,OAAO,EAAE,QAAQ,CAAC,YAAY,CAAC,CAAC;KACrD;IAED,IAAI,QAAQ,CAAC,aAAa,IAAI,OAAO,IAAI,OAAO,CAAC,QAAQ,CAAC,iBAAiB,EAAE;QAC3E,OAAO,CAAC,oBAAoB,CAAC,QAAQ,CAAC,aAAa,CAAC,CAAC;KACtD;IAED,IAAI,QAAQ,CAAC,aAAa,IAAI,OAAO,IAAI,OAAO,CAAC,aAAa,EAAE,EAAE;QAChE,OAAO,CAAC,WAAW,CAAC,cAAc,GAAG,QAAQ,CAAC,aAAa,CAAC;KAC7D;IAED,IAAI,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAG,gBAAgB,CAAC,KAAI,OAAO,CAAC,aAAa,CAAC,IAAI,IAAI,EAAE;QACjE,iEAAiE;QACjE,4CAA4C;QAC5C,MAAM,aAAa,GAAG,CAAA,MAAA,QAAQ,CAAC,MAAM,0CAAE,aAAa,KAAI,QAAQ,CAAC,aAAa,CAAC;QAC/E,IAAI,aAAa,EAAE;YACjB,OAAO,CAAC,aAAa,CAAC,GAAG,aAAa,CAAC;SACxC;KACF;AACH,CAAC;AArBD,8DAqBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/sort.js b/node_modules/mongodb/lib/sort.js new file mode 100644 index 000000000..c04b6b54c --- /dev/null +++ b/node_modules/mongodb/lib/sort.js @@ -0,0 +1,97 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: 
true }); +exports.formatSort = void 0; +const error_1 = require("./error"); +/** @internal */ +function prepareDirection(direction = 1) { + const value = `${direction}`.toLowerCase(); + if (isMeta(direction)) + return direction; + switch (value) { + case 'ascending': + case 'asc': + case '1': + return 1; + case 'descending': + case 'desc': + case '-1': + return -1; + default: + throw new error_1.MongoInvalidArgumentError(`Invalid sort direction: ${JSON.stringify(direction)}`); + } +} +/** @internal */ +function isMeta(t) { + return typeof t === 'object' && t != null && '$meta' in t && typeof t.$meta === 'string'; +} +/** @internal */ +function isPair(t) { + if (Array.isArray(t) && t.length === 2) { + try { + prepareDirection(t[1]); + return true; + } + catch (e) { + return false; + } + } + return false; +} +function isDeep(t) { + return Array.isArray(t) && Array.isArray(t[0]); +} +function isMap(t) { + return t instanceof Map && t.size > 0; +} +/** @internal */ +function pairToMap(v) { + return new Map([[`${v[0]}`, prepareDirection([v[1]])]]); +} +/** @internal */ +function deepToMap(t) { + const sortEntries = t.map(([k, v]) => [`${k}`, prepareDirection(v)]); + return new Map(sortEntries); +} +/** @internal */ +function stringsToMap(t) { + const sortEntries = t.map(key => [`${key}`, 1]); + return new Map(sortEntries); +} +/** @internal */ +function objectToMap(t) { + const sortEntries = Object.entries(t).map(([k, v]) => [ + `${k}`, + prepareDirection(v) + ]); + return new Map(sortEntries); +} +/** @internal */ +function mapToMap(t) { + const sortEntries = Array.from(t).map(([k, v]) => [ + `${k}`, + prepareDirection(v) + ]); + return new Map(sortEntries); +} +/** converts a Sort type into a type that is valid for the server (SortForCmd) */ +function formatSort(sort, direction) { + if (sort == null) + return undefined; + if (typeof sort === 'string') + return new Map([[sort, prepareDirection(direction)]]); + if (typeof sort !== 'object') { + throw new error_1.MongoInvalidArgumentError(`Invalid sort format: ${JSON.stringify(sort)} Sort must be a valid object`); + } + if (!Array.isArray(sort)) { + return isMap(sort) ? mapToMap(sort) : Object.keys(sort).length ? 
objectToMap(sort) : undefined; + } + if (!sort.length) + return undefined; + if (isDeep(sort)) + return deepToMap(sort); + if (isPair(sort)) + return pairToMap(sort); + return stringsToMap(sort); +} +exports.formatSort = formatSort; +//# sourceMappingURL=sort.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/sort.js.map b/node_modules/mongodb/lib/sort.js.map new file mode 100644 index 000000000..4072af4c8 --- /dev/null +++ b/node_modules/mongodb/lib/sort.js.map @@ -0,0 +1 @@ +{"version":3,"file":"sort.js","sourceRoot":"","sources":["../src/sort.ts"],"names":[],"mappings":";;;AAAA,mCAAoD;AAiCpD,gBAAgB;AAChB,SAAS,gBAAgB,CAAC,YAAiB,CAAC;IAC1C,MAAM,KAAK,GAAG,GAAG,SAAS,EAAE,CAAC,WAAW,EAAE,CAAC;IAC3C,IAAI,MAAM,CAAC,SAAS,CAAC;QAAE,OAAO,SAAS,CAAC;IACxC,QAAQ,KAAK,EAAE;QACb,KAAK,WAAW,CAAC;QACjB,KAAK,KAAK,CAAC;QACX,KAAK,GAAG;YACN,OAAO,CAAC,CAAC;QACX,KAAK,YAAY,CAAC;QAClB,KAAK,MAAM,CAAC;QACZ,KAAK,IAAI;YACP,OAAO,CAAC,CAAC,CAAC;QACZ;YACE,MAAM,IAAI,iCAAyB,CAAC,2BAA2B,IAAI,CAAC,SAAS,CAAC,SAAS,CAAC,EAAE,CAAC,CAAC;KAC/F;AACH,CAAC;AAED,gBAAgB;AAChB,SAAS,MAAM,CAAC,CAAgB;IAC9B,OAAO,OAAO,CAAC,KAAK,QAAQ,IAAI,CAAC,IAAI,IAAI,IAAI,OAAO,IAAI,CAAC,IAAI,OAAO,CAAC,CAAC,KAAK,KAAK,QAAQ,CAAC;AAC3F,CAAC;AAED,gBAAgB;AAChB,SAAS,MAAM,CAAC,CAAO;IACrB,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE;QACtC,IAAI;YACF,gBAAgB,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;YACvB,OAAO,IAAI,CAAC;SACb;QAAC,OAAO,CAAC,EAAE;YACV,OAAO,KAAK,CAAC;SACd;KACF;IACD,OAAO,KAAK,CAAC;AACf,CAAC;AAED,SAAS,MAAM,CAAC,CAAO;IACrB,OAAO,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;AACjD,CAAC;AAED,SAAS,KAAK,CAAC,CAAO;IACpB,OAAO,CAAC,YAAY,GAAG,IAAI,CAAC,CAAC,IAAI,GAAG,CAAC,CAAC;AACxC,CAAC;AAED,gBAAgB;AAChB,SAAS,SAAS,CAAC,CAA0B;IAC3C,OAAO,IAAI,GAAG,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,EAAE,gBAAgB,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;AAC1D,CAAC;AAED,gBAAgB;AAChB,SAAS,SAAS,CAAC,CAA4B;IAC7C,MAAM,WAAW,GAAqB,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,GAAG,CAAC,EAAE,EAAE,gBAAgB,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC;IACvF,OAAO,IAAI,GAAG,CAAC,WAAW,CAAC,CAAC;AAC9B,CAAC;AAED,gBAAgB;AAChB,SAAS,YAAY,CAAC,CAAW;IAC/B,MAAM,WAAW,GAAqB,CAAC,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC,GAAG,GAAG,EAAE,EAAE,CAAC,CAAC,CAAC,CAAC;IAClE,OAAO,IAAI,GAAG,CAAC,WAAW,CAAC,CAAC;AAC9B,CAAC;AAED,gBAAgB;AAChB,SAAS,WAAW,CAAC,CAAmC;IACtD,MAAM,WAAW,GAAqB,MAAM,CAAC,OAAO,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC;QACtE,GAAG,CAAC,EAAE;QACN,gBAAgB,CAAC,CAAC,CAAC;KACpB,CAAC,CAAC;IACH,OAAO,IAAI,GAAG,CAAC,WAAW,CAAC,CAAC;AAC9B,CAAC;AAED,gBAAgB;AAChB,SAAS,QAAQ,CAAC,CAA6B;IAC7C,MAAM,WAAW,GAAqB,KAAK,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,EAAE,EAAE,CAAC;QAClE,GAAG,CAAC,EAAE;QACN,gBAAgB,CAAC,CAAC,CAAC;KACpB,CAAC,CAAC;IACH,OAAO,IAAI,GAAG,CAAC,WAAW,CAAC,CAAC;AAC9B,CAAC;AAED,iFAAiF;AACjF,SAAgB,UAAU,CACxB,IAAsB,EACtB,SAAyB;IAEzB,IAAI,IAAI,IAAI,IAAI;QAAE,OAAO,SAAS,CAAC;IACnC,IAAI,OAAO,IAAI,KAAK,QAAQ;QAAE,OAAO,IAAI,GAAG,CAAC,CAAC,CAAC,IAAI,EAAE,gBAAgB,CAAC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC;IACpF,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;QAC5B,MAAM,IAAI,iCAAyB,CACjC,wBAAwB,IAAI,CAAC,SAAS,CAAC,IAAI,CAAC,8BAA8B,CAC3E,CAAC;KACH;IACD,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;QACxB,OAAO,KAAK,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,MAAM,CAAC,CAAC,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,SAAS,CAAC;KAChG;IACD,IAAI,CAAC,IAAI,CAAC,MAAM;QAAE,OAAO,SAAS,CAAC;IACnC,IAAI,MAAM,CAAC,IAAI,CAAC;QAAE,OAAO,SAAS,CAAC,
IAAI,CAAC,CAAC;IACzC,IAAI,MAAM,CAAC,IAAI,CAAC;QAAE,OAAO,SAAS,CAAC,IAAI,CAAC,CAAC;IACzC,OAAO,YAAY,CAAC,IAAI,CAAC,CAAC;AAC5B,CAAC;AAlBD,gCAkBC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/transactions.js b/node_modules/mongodb/lib/transactions.js new file mode 100644 index 000000000..d74b673ee --- /dev/null +++ b/node_modules/mongodb/lib/transactions.js @@ -0,0 +1,138 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.isTransactionCommand = exports.Transaction = exports.TxnState = void 0; +const read_preference_1 = require("./read_preference"); +const error_1 = require("./error"); +const read_concern_1 = require("./read_concern"); +const write_concern_1 = require("./write_concern"); +/** @internal */ +exports.TxnState = Object.freeze({ + NO_TRANSACTION: 'NO_TRANSACTION', + STARTING_TRANSACTION: 'STARTING_TRANSACTION', + TRANSACTION_IN_PROGRESS: 'TRANSACTION_IN_PROGRESS', + TRANSACTION_COMMITTED: 'TRANSACTION_COMMITTED', + TRANSACTION_COMMITTED_EMPTY: 'TRANSACTION_COMMITTED_EMPTY', + TRANSACTION_ABORTED: 'TRANSACTION_ABORTED' +}); +const stateMachine = { + [exports.TxnState.NO_TRANSACTION]: [exports.TxnState.NO_TRANSACTION, exports.TxnState.STARTING_TRANSACTION], + [exports.TxnState.STARTING_TRANSACTION]: [ + exports.TxnState.TRANSACTION_IN_PROGRESS, + exports.TxnState.TRANSACTION_COMMITTED, + exports.TxnState.TRANSACTION_COMMITTED_EMPTY, + exports.TxnState.TRANSACTION_ABORTED + ], + [exports.TxnState.TRANSACTION_IN_PROGRESS]: [ + exports.TxnState.TRANSACTION_IN_PROGRESS, + exports.TxnState.TRANSACTION_COMMITTED, + exports.TxnState.TRANSACTION_ABORTED + ], + [exports.TxnState.TRANSACTION_COMMITTED]: [ + exports.TxnState.TRANSACTION_COMMITTED, + exports.TxnState.TRANSACTION_COMMITTED_EMPTY, + exports.TxnState.STARTING_TRANSACTION, + exports.TxnState.NO_TRANSACTION + ], + [exports.TxnState.TRANSACTION_ABORTED]: [exports.TxnState.STARTING_TRANSACTION, exports.TxnState.NO_TRANSACTION], + [exports.TxnState.TRANSACTION_COMMITTED_EMPTY]: [ + exports.TxnState.TRANSACTION_COMMITTED_EMPTY, + exports.TxnState.NO_TRANSACTION + ] +}; +const ACTIVE_STATES = new Set([ + exports.TxnState.STARTING_TRANSACTION, + exports.TxnState.TRANSACTION_IN_PROGRESS +]); +const COMMITTED_STATES = new Set([ + exports.TxnState.TRANSACTION_COMMITTED, + exports.TxnState.TRANSACTION_COMMITTED_EMPTY, + exports.TxnState.TRANSACTION_ABORTED +]); +/** + * @public + * A class maintaining state related to a server transaction. Internal Only + */ +class Transaction { + /** Create a transaction @internal */ + constructor(options) { + options = options !== null && options !== void 0 ? 
options : {}; + this.state = exports.TxnState.NO_TRANSACTION; + this.options = {}; + const writeConcern = write_concern_1.WriteConcern.fromOptions(options); + if (writeConcern) { + if (writeConcern.w === 0) { + throw new error_1.MongoTransactionError('Transactions do not support unacknowledged write concern'); + } + this.options.writeConcern = writeConcern; + } + if (options.readConcern) { + this.options.readConcern = read_concern_1.ReadConcern.fromOptions(options); + } + if (options.readPreference) { + this.options.readPreference = read_preference_1.ReadPreference.fromOptions(options); + } + if (options.maxCommitTimeMS) { + this.options.maxTimeMS = options.maxCommitTimeMS; + } + // TODO: This isn't technically necessary + this._pinnedServer = undefined; + this._recoveryToken = undefined; + } + /** @internal */ + get server() { + return this._pinnedServer; + } + get recoveryToken() { + return this._recoveryToken; + } + get isPinned() { + return !!this.server; + } + /** @returns Whether the transaction has started */ + get isStarting() { + return this.state === exports.TxnState.STARTING_TRANSACTION; + } + /** + * @returns Whether this session is presently in a transaction + */ + get isActive() { + return ACTIVE_STATES.has(this.state); + } + get isCommitted() { + return COMMITTED_STATES.has(this.state); + } + /** + * Transition the transaction in the state machine + * @internal + * @param nextState - The new state to transition to + */ + transition(nextState) { + const nextStates = stateMachine[this.state]; + if (nextStates && nextStates.includes(nextState)) { + this.state = nextState; + if (this.state === exports.TxnState.NO_TRANSACTION || + this.state === exports.TxnState.STARTING_TRANSACTION || + this.state === exports.TxnState.TRANSACTION_ABORTED) { + this.unpinServer(); + } + return; + } + throw new error_1.MongoRuntimeError(`Attempted illegal state transition from [${this.state}] to [${nextState}]`); + } + /** @internal */ + pinServer(server) { + if (this.isActive) { + this._pinnedServer = server; + } + } + /** @internal */ + unpinServer() { + this._pinnedServer = undefined; + } +} +exports.Transaction = Transaction; +function isTransactionCommand(command) { + return !!(command.commitTransaction || command.abortTransaction); +} +exports.isTransactionCommand = isTransactionCommand; +//# sourceMappingURL=transactions.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/transactions.js.map b/node_modules/mongodb/lib/transactions.js.map new file mode 100644 index 000000000..0f55f61c9 --- /dev/null +++ b/node_modules/mongodb/lib/transactions.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"transactions.js","sourceRoot":"","sources":["../src/transactions.ts"],"names":[],"mappings":";;;AAAA,uDAAmD;AACnD,mCAAmE;AACnE,iDAA6C;AAC7C,mDAA+C;AAK/C,gBAAgB;AACH,QAAA,QAAQ,GAAG,MAAM,CAAC,MAAM,CAAC;IACpC,cAAc,EAAE,gBAAgB;IAChC,oBAAoB,EAAE,sBAAsB;IAC5C,uBAAuB,EAAE,yBAAyB;IAClD,qBAAqB,EAAE,uBAAuB;IAC9C,2BAA2B,EAAE,6BAA6B;IAC1D,mBAAmB,EAAE,qBAAqB;CAClC,CAAC,CAAC;AAKZ,MAAM,YAAY,GAAwC;IACxD,CAAC,gBAAQ,CAAC,cAAc,CAAC,EAAE,CAAC,gBAAQ,CAAC,cAAc,EAAE,gBAAQ,CAAC,oBAAoB,CAAC;IACnF,CAAC,gBAAQ,CAAC,oBAAoB,CAAC,EAAE;QAC/B,gBAAQ,CAAC,uBAAuB;QAChC,gBAAQ,CAAC,qBAAqB;QAC9B,gBAAQ,CAAC,2BAA2B;QACpC,gBAAQ,CAAC,mBAAmB;KAC7B;IACD,CAAC,gBAAQ,CAAC,uBAAuB,CAAC,EAAE;QAClC,gBAAQ,CAAC,uBAAuB;QAChC,gBAAQ,CAAC,qBAAqB;QAC9B,gBAAQ,CAAC,mBAAmB;KAC7B;IACD,CAAC,gBAAQ,CAAC,qBAAqB,CAAC,EAAE;QAChC,gBAAQ,CAAC,qBAAqB;QAC9B,gBAAQ,CAAC,2BAA2B;QACpC,gBAAQ,CAAC,oBAAoB;QAC7B,gBAAQ,CAAC,cAAc;KACxB;IACD,CAAC,gBAAQ,CAAC,mBAAmB,CAAC,EAAE,CAAC,gBAAQ,CAAC,oBAAoB,EAAE,gBAAQ,CAAC,cAAc,CAAC;IACxF,CAAC,gBAAQ,CAAC,2BAA2B,CAAC,EAAE;QACtC,gBAAQ,CAAC,2BAA2B;QACpC,gBAAQ,CAAC,cAAc;KACxB;CACF,CAAC;AAEF,MAAM,aAAa,GAAkB,IAAI,GAAG,CAAC;IAC3C,gBAAQ,CAAC,oBAAoB;IAC7B,gBAAQ,CAAC,uBAAuB;CACjC,CAAC,CAAC;AAEH,MAAM,gBAAgB,GAAkB,IAAI,GAAG,CAAC;IAC9C,gBAAQ,CAAC,qBAAqB;IAC9B,gBAAQ,CAAC,2BAA2B;IACpC,gBAAQ,CAAC,mBAAmB;CAC7B,CAAC,CAAC;AAkBH;;;GAGG;AACH,MAAa,WAAW;IAStB,qCAAqC;IACrC,YAAY,OAA4B;QACtC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,IAAI,CAAC,KAAK,GAAG,gBAAQ,CAAC,cAAc,CAAC;QACrC,IAAI,CAAC,OAAO,GAAG,EAAE,CAAC;QAElB,MAAM,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;QACvD,IAAI,YAAY,EAAE;YAChB,IAAI,YAAY,CAAC,CAAC,KAAK,CAAC,EAAE;gBACxB,MAAM,IAAI,6BAAqB,CAAC,0DAA0D,CAAC,CAAC;aAC7F;YAED,IAAI,CAAC,OAAO,CAAC,YAAY,GAAG,YAAY,CAAC;SAC1C;QAED,IAAI,OAAO,CAAC,WAAW,EAAE;YACvB,IAAI,CAAC,OAAO,CAAC,WAAW,GAAG,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;SAC7D;QAED,IAAI,OAAO,CAAC,cAAc,EAAE;YAC1B,IAAI,CAAC,OAAO,CAAC,cAAc,GAAG,gCAAc,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;SACnE;QAED,IAAI,OAAO,CAAC,eAAe,EAAE;YAC3B,IAAI,CAAC,OAAO,CAAC,SAAS,GAAG,OAAO,CAAC,eAAe,CAAC;SAClD;QAED,yCAAyC;QACzC,IAAI,CAAC,aAAa,GAAG,SAAS,CAAC;QAC/B,IAAI,CAAC,cAAc,GAAG,SAAS,CAAC;IAClC,CAAC;IAED,gBAAgB;IAChB,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,aAAa,CAAC;IAC5B,CAAC;IAED,IAAI,aAAa;QACf,OAAO,IAAI,CAAC,cAAc,CAAC;IAC7B,CAAC;IAED,IAAI,QAAQ;QACV,OAAO,CAAC,CAAC,IAAI,CAAC,MAAM,CAAC;IACvB,CAAC;IAED,mDAAmD;IACnD,IAAI,UAAU;QACZ,OAAO,IAAI,CAAC,KAAK,KAAK,gBAAQ,CAAC,oBAAoB,CAAC;IACtD,CAAC;IAED;;OAEG;IACH,IAAI,QAAQ;QACV,OAAO,aAAa,CAAC,GAAG,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;IACvC,CAAC;IAED,IAAI,WAAW;QACb,OAAO,gBAAgB,CAAC,GAAG,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;IAC1C,CAAC;IACD;;;;OAIG;IACH,UAAU,CAAC,SAAmB;QAC5B,MAAM,UAAU,GAAG,YAAY,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;QAC5C,IAAI,UAAU,IAAI,UAAU,CAAC,QAAQ,CAAC,SAAS,CAAC,EAAE;YAChD,IAAI,CAAC,KAAK,GAAG,SAAS,CAAC;YACvB,IACE,IAAI,CAAC,KAAK,KAAK,gBAAQ,CAAC,cAAc;gBACtC,IAAI,CAAC,KAAK,KAAK,gBAAQ,CAAC,oBAAoB;gBAC5C,IAAI,CAAC,KAAK,KAAK,gBAAQ,CAAC,mBAAmB,EAC3C;gBACA,IAAI,CAAC,WAAW,EAAE,CAAC;aACpB;YACD,OAAO;SACR;QAED,MAAM,IAAI,yBAAiB,CACzB,4CAA4C,IAAI,CAAC,KAAK,SAAS,SAAS,GAAG,CAC5E,CAAC;IACJ,CAAC;IAED,gBAAgB;IAChB,SAAS,CAAC,MAAc;QACtB,IAAI,IAAI,CAAC,QAAQ,EAAE;YACjB,IAAI,CAAC,aAAa,GAAG,MAAM,CAAC;SAC7B;IACH,CAAC;IAED,gBAAgB;IAChB,WAAW;QACT,IAAI,CAAC,aAAa,GAAG,SAAS,CAAC;IACjC,CAAC;CACF;AAxGD,kCAwGC;AAED,SAAgB,oBAAoB,CAAC,OAAiB;IACpD,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,iBAAiB,IAAI,OAAO,CAAC,gBAAgB,CAAC,CAAC;AACnE,CAAC;AAFD,oDAEC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/utils.js b/node_modules/mongodb/lib/utils.js new file mode 100644 index 000000000..00df726d7 --- /dev/null +++ 
b/node_modules/mongodb/lib/utils.js @@ -0,0 +1,1150 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.enumToString = exports.emitWarningOnce = exports.emitWarning = exports.MONGODB_WARNING_CODE = exports.DEFAULT_PK_FACTORY = exports.HostAddress = exports.BufferPool = exports.deepCopy = exports.isRecord = exports.setDifference = exports.isSuperset = exports.resolveOptions = exports.hasAtomicOperators = exports.makeInterruptibleAsyncInterval = exports.calculateDurationInMs = exports.now = exports.makeClientMetadata = exports.makeStateMachine = exports.errorStrictEqual = exports.arrayStrictEqual = exports.eachAsyncSeries = exports.eachAsync = exports.collationNotSupported = exports.maxWireVersion = exports.uuidV4 = exports.collectionNamespace = exports.databaseNamespace = exports.maybePromise = exports.makeCounter = exports.MongoDBNamespace = exports.ns = exports.deprecateOptions = exports.defaultMsgHandler = exports.getTopology = exports.decorateWithExplain = exports.decorateWithReadConcern = exports.decorateWithCollation = exports.isPromiseLike = exports.applyWriteConcern = exports.applyRetryableWrites = exports.executeLegacyOperation = exports.filterOptions = exports.mergeOptions = exports.decorateCommand = exports.isObject = exports.parseIndexOptions = exports.normalizeHintField = exports.checkCollectionName = exports.getSingleProperty = exports.MAX_JS_INT = void 0; +exports.parsePackageVersion = exports.supportsRetryableWrites = void 0; +const os = require("os"); +const crypto = require("crypto"); +const promise_provider_1 = require("./promise_provider"); +const error_1 = require("./error"); +const write_concern_1 = require("./write_concern"); +const common_1 = require("./sdam/common"); +const read_concern_1 = require("./read_concern"); +const bson_1 = require("./bson"); +const read_preference_1 = require("./read_preference"); +const url_1 = require("url"); +const constants_1 = require("./cmap/wire_protocol/constants"); +exports.MAX_JS_INT = Number.MAX_SAFE_INTEGER + 1; +/** + * Add a readonly enumerable property. + * @internal + */ +function getSingleProperty(obj, name, value) { + Object.defineProperty(obj, name, { + enumerable: true, + get() { + return value; + } + }); +} +exports.getSingleProperty = getSingleProperty; +/** + * Throws if collectionName is not a valid mongodb collection namespace. 
+ * @internal + */ +function checkCollectionName(collectionName) { + if ('string' !== typeof collectionName) { + throw new error_1.MongoInvalidArgumentError('Collection name must be a String'); + } + if (!collectionName || collectionName.indexOf('..') !== -1) { + throw new error_1.MongoInvalidArgumentError('Collection names cannot be empty'); + } + if (collectionName.indexOf('$') !== -1 && + collectionName.match(/((^\$cmd)|(oplog\.\$main))/) == null) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new error_1.MongoInvalidArgumentError("Collection names must not contain '$'"); + } + if (collectionName.match(/^\.|\.$/) != null) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new error_1.MongoInvalidArgumentError("Collection names must not start or end with '.'"); + } + // Validate that we are not passing 0x00 in the collection name + if (collectionName.indexOf('\x00') !== -1) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new error_1.MongoInvalidArgumentError('Collection names cannot contain a null character'); + } +} +exports.checkCollectionName = checkCollectionName; +/** + * Ensure Hint field is in a shape we expect: + * - object of index names mapping to 1 or -1 + * - just an index name + * @internal + */ +function normalizeHintField(hint) { + let finalHint = undefined; + if (typeof hint === 'string') { + finalHint = hint; + } + else if (Array.isArray(hint)) { + finalHint = {}; + hint.forEach(param => { + finalHint[param] = 1; + }); + } + else if (hint != null && typeof hint === 'object') { + finalHint = {}; + for (const name in hint) { + finalHint[name] = hint[name]; + } + } + return finalHint; +} +exports.normalizeHintField = normalizeHintField; +/** + * Create an index specifier based on + * @internal + */ +function parseIndexOptions(indexSpec) { + const fieldHash = {}; + const indexes = []; + let keys; + // Get all the fields accordingly + if ('string' === typeof indexSpec) { + // 'type' + indexes.push(indexSpec + '_' + 1); + fieldHash[indexSpec] = 1; + } + else if (Array.isArray(indexSpec)) { + indexSpec.forEach((f) => { + if ('string' === typeof f) { + // [{location:'2d'}, 'type'] + indexes.push(f + '_' + 1); + fieldHash[f] = 1; + } + else if (Array.isArray(f)) { + // [['location', '2d'],['type', 1]] + indexes.push(f[0] + '_' + (f[1] || 1)); + fieldHash[f[0]] = f[1] || 1; + } + else if (isObject(f)) { + // [{location:'2d'}, {type:1}] + keys = Object.keys(f); + keys.forEach(k => { + indexes.push(k + '_' + f[k]); + fieldHash[k] = f[k]; + }); + } + else { + // undefined (ignore) + } + }); + } + else if (isObject(indexSpec)) { + // {location:'2d', type:1} + keys = Object.keys(indexSpec); + Object.entries(indexSpec).forEach(([key, value]) => { + indexes.push(key + '_' + value); + fieldHash[key] = value; + }); + } + return { + name: indexes.join('_'), + keys: keys, + fieldHash: fieldHash + }; +} +exports.parseIndexOptions = parseIndexOptions; +/** + * Checks if arg is an Object: + * - **NOTE**: the check is based on the `[Symbol.toStringTag]() === 'Object'` + * @internal + */ +// eslint-disable-next-line @typescript-eslint/ban-types +function isObject(arg) { + return '[object Object]' === Object.prototype.toString.call(arg); +} +exports.isObject = isObject; +/** @internal */ +function decorateCommand(command, options, exclude) { + for (const name in options) { + if (!exclude.includes(name)) { + command[name] = options[name]; + } + } + return command; +} +exports.decorateCommand = decorateCommand; +/** @internal */ +function 
mergeOptions(target, source) { + return { ...target, ...source }; +} +exports.mergeOptions = mergeOptions; +/** @internal */ +function filterOptions(options, names) { + const filterOptions = {}; + for (const name in options) { + if (names.includes(name)) { + filterOptions[name] = options[name]; + } + } + // Filtered options + return filterOptions; +} +exports.filterOptions = filterOptions; +/** + * Executes the given operation with provided arguments. + * + * @remarks + * This method reduces large amounts of duplication in the entire codebase by providing + * a single point for determining whether callbacks or promises should be used. Additionally + * it allows for a single point of entry to provide features such as implicit sessions, which + * are required by the Driver Sessions specification in the event that a ClientSession is + * not provided + * + * @internal + * + * @param topology - The topology to execute this operation on + * @param operation - The operation to execute + * @param args - Arguments to apply the provided operation + * @param options - Options that modify the behavior of the method + */ +function executeLegacyOperation(topology, operation, args, options) { + const Promise = promise_provider_1.PromiseProvider.get(); + if (!Array.isArray(args)) { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('This method requires an array of arguments to apply'); + } + options = options !== null && options !== void 0 ? options : {}; + let callback = args[args.length - 1]; + // The driver sessions spec mandates that we implicitly create sessions for operations + // that are not explicitly provided with a session. + let session; + let opOptions; + let owner; + if (!options.skipSessions && topology.hasSessionSupport()) { + opOptions = args[args.length - 2]; + if (opOptions == null || opOptions.session == null) { + owner = Symbol(); + session = topology.startSession({ owner }); + const optionsIndex = args.length - 2; + args[optionsIndex] = Object.assign({}, args[optionsIndex], { session: session }); + } + else if (opOptions.session && opOptions.session.hasEnded) { + throw new error_1.MongoExpiredSessionError(); + } + } + function makeExecuteCallback(resolve, reject) { + return function (err, result) { + if (session && session.owner === owner && !(options === null || options === void 0 ? void 0 : options.returnsCursor)) { + session.endSession(() => { + delete opOptions.session; + if (err) + return reject(err); + resolve(result); + }); + } + else { + if (err) + return reject(err); + resolve(result); + } + }; + } + // Execute using callback + if (typeof callback === 'function') { + callback = args.pop(); + const handler = makeExecuteCallback(result => callback(undefined, result), err => callback(err, null)); + args.push(handler); + try { + return operation(...args); + } + catch (e) { + handler(e); + throw e; + } + } + // Return a Promise + if (args[args.length - 1] != null) { + // TODO(NODE-3483) + throw new error_1.MongoRuntimeError('Final argument to `executeLegacyOperation` must be a callback'); + } + return new Promise((resolve, reject) => { + const handler = makeExecuteCallback(resolve, reject); + args[args.length - 1] = handler; + try { + return operation(...args); + } + catch (e) { + handler(e); + } + }); +} +exports.executeLegacyOperation = executeLegacyOperation; +/** + * Applies retryWrites: true to a command if retryWrites is set on the command's database. + * @internal + * + * @param target - The target command to which we will apply retryWrites. 
+ * @param db - The database from which we can inherit a retryWrites value. + */ +function applyRetryableWrites(target, db) { + var _a; + if (db && ((_a = db.s.options) === null || _a === void 0 ? void 0 : _a.retryWrites)) { + target.retryWrites = true; + } + return target; +} +exports.applyRetryableWrites = applyRetryableWrites; +/** + * Applies a write concern to a command based on well defined inheritance rules, optionally + * detecting support for the write concern in the first place. + * @internal + * + * @param target - the target command we will be applying the write concern to + * @param sources - sources where we can inherit default write concerns from + * @param options - optional settings passed into a command for write concern overrides + */ +function applyWriteConcern(target, sources, options) { + options = options !== null && options !== void 0 ? options : {}; + const db = sources.db; + const coll = sources.collection; + if (options.session && options.session.inTransaction()) { + // writeConcern is not allowed within a multi-statement transaction + if (target.writeConcern) { + delete target.writeConcern; + } + return target; + } + const writeConcern = write_concern_1.WriteConcern.fromOptions(options); + if (writeConcern) { + return Object.assign(target, { writeConcern }); + } + if (coll && coll.writeConcern) { + return Object.assign(target, { writeConcern: Object.assign({}, coll.writeConcern) }); + } + if (db && db.writeConcern) { + return Object.assign(target, { writeConcern: Object.assign({}, db.writeConcern) }); + } + return target; +} +exports.applyWriteConcern = applyWriteConcern; +/** + * Checks if a given value is a Promise + * + * @typeParam T - The result type of maybePromise + * @param maybePromise - An object that could be a promise + * @returns true if the provided value is a Promise + */ +function isPromiseLike(maybePromise) { + return !!maybePromise && typeof maybePromise.then === 'function'; +} +exports.isPromiseLike = isPromiseLike; +/** + * Applies collation to a given command. + * @internal + * + * @param command - the command on which to apply collation + * @param target - target of command + * @param options - options containing collation settings + */ +function decorateWithCollation(command, target, options) { + const capabilities = getTopology(target).capabilities; + if (options.collation && typeof options.collation === 'object') { + if (capabilities && capabilities.commandsTakeCollation) { + command.collation = options.collation; + } + else { + throw new error_1.MongoCompatibilityError(`Current topology does not support collation`); + } + } +} +exports.decorateWithCollation = decorateWithCollation; +/** + * Applies a read concern to a given command. + * @internal + * + * @param command - the command on which to apply the read concern + * @param coll - the parent collection of the operation calling this method + */ +function decorateWithReadConcern(command, coll, options) { + if (options && options.session && options.session.inTransaction()) { + return; + } + const readConcern = Object.assign({}, command.readConcern || {}); + if (coll.s.readConcern) { + Object.assign(readConcern, coll.s.readConcern); + } + if (Object.keys(readConcern).length > 0) { + Object.assign(command, { readConcern: readConcern }); + } +} +exports.decorateWithReadConcern = decorateWithReadConcern; +/** + * Applies an explain to a given command. 
+ * @internal + * + * @param command - the command on which to apply the explain + * @param options - the options containing the explain verbosity + */ +function decorateWithExplain(command, explain) { + if (command.explain) { + return command; + } + return { explain: command, verbosity: explain.verbosity }; +} +exports.decorateWithExplain = decorateWithExplain; +/** + * A helper function to get the topology from a given provider. Throws + * if the topology cannot be found. + * @internal + */ +function getTopology(provider) { + if (`topology` in provider && provider.topology) { + return provider.topology; + } + else if ('client' in provider.s && provider.s.client.topology) { + return provider.s.client.topology; + } + else if ('db' in provider.s && provider.s.db.s.client.topology) { + return provider.s.db.s.client.topology; + } + throw new error_1.MongoNotConnectedError('MongoClient must be connected to perform this operation'); +} +exports.getTopology = getTopology; +/** + * Default message handler for generating deprecation warnings. + * @internal + * + * @param name - function name + * @param option - option name + * @returns warning message + */ +function defaultMsgHandler(name, option) { + return `${name} option [${option}] is deprecated and will be removed in a later version.`; +} +exports.defaultMsgHandler = defaultMsgHandler; +/** + * Deprecates a given function's options. + * @internal + * + * @param this - the bound class if this is a method + * @param config - configuration for deprecation + * @param fn - the target function of deprecation + * @returns modified function that warns once per deprecated option, and executes original function + */ +function deprecateOptions(config, fn) { + if (process.noDeprecation === true) { + return fn; + } + const msgHandler = config.msgHandler ? config.msgHandler : defaultMsgHandler; + const optionsWarned = new Set(); + function deprecated(...args) { + const options = args[config.optionsIndex]; + // ensure options is a valid, non-empty object, otherwise short-circuit + if (!isObject(options) || Object.keys(options).length === 0) { + return fn.bind(this)(...args); // call the function, no change + } + // interrupt the function call with a warning + for (const deprecatedOption of config.deprecatedOptions) { + if (deprecatedOption in options && !optionsWarned.has(deprecatedOption)) { + optionsWarned.add(deprecatedOption); + const msg = msgHandler(config.name, deprecatedOption); + emitWarning(msg); + if (this && 'getLogger' in this) { + const logger = this.getLogger(); + if (logger) { + logger.warn(msg); + } + } + } + } + return fn.bind(this)(...args); + } + // These lines copied from https://github.com/nodejs/node/blob/25e5ae41688676a5fd29b2e2e7602168eee4ceb5/lib/internal/util.js#L73-L80 + // The wrapper will keep the same prototype as fn to maintain prototype chain + Object.setPrototypeOf(deprecated, fn); + if (fn.prototype) { + // Setting this (rather than using Object.setPrototype, as above) ensures + // that calling the unwrapped constructor gives an instanceof the wrapped + // constructor. 
+ deprecated.prototype = fn.prototype; + } + return deprecated; +} +exports.deprecateOptions = deprecateOptions; +/** @internal */ +function ns(ns) { + return MongoDBNamespace.fromString(ns); +} +exports.ns = ns; +/** @public */ +class MongoDBNamespace { + /** + * Create a namespace object + * + * @param db - database name + * @param collection - collection name + */ + constructor(db, collection) { + this.db = db; + this.collection = collection; + } + toString() { + return this.collection ? `${this.db}.${this.collection}` : this.db; + } + withCollection(collection) { + return new MongoDBNamespace(this.db, collection); + } + static fromString(namespace) { + if (!namespace) { + // TODO(NODE-3483): Replace with MongoNamespaceError + throw new error_1.MongoRuntimeError(`Cannot parse namespace from "${namespace}"`); + } + const [db, ...collection] = namespace.split('.'); + return new MongoDBNamespace(db, collection.join('.')); + } +} +exports.MongoDBNamespace = MongoDBNamespace; +/** @internal */ +function* makeCounter(seed = 0) { + let count = seed; + while (true) { + const newCount = count; + count += 1; + yield newCount; + } +} +exports.makeCounter = makeCounter; +/** + * Helper function for either accepting a callback, or returning a promise + * @internal + * + * @param callback - The last function argument in exposed method, controls if a Promise is returned + * @param wrapper - A function that wraps the callback + * @returns Returns void if a callback is supplied, else returns a Promise. + */ +function maybePromise(callback, wrapper) { + const Promise = promise_provider_1.PromiseProvider.get(); + let result; + if (typeof callback !== 'function') { + result = new Promise((resolve, reject) => { + callback = (err, res) => { + if (err) + return reject(err); + resolve(res); + }; + }); + } + wrapper((err, res) => { + if (err != null) { + try { + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + callback(err); + } + catch (error) { + process.nextTick(() => { + throw error; + }); + } + return; + } + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + callback(err, res); + }); + return result; +} +exports.maybePromise = maybePromise; +/** @internal */ +function databaseNamespace(ns) { + return ns.split('.')[0]; +} +exports.databaseNamespace = databaseNamespace; +/** @internal */ +function collectionNamespace(ns) { + return ns.split('.').slice(1).join('.'); +} +exports.collectionNamespace = collectionNamespace; +/** + * Synchronously Generate a UUIDv4 + * @internal + */ +function uuidV4() { + const result = crypto.randomBytes(16); + result[6] = (result[6] & 0x0f) | 0x40; + result[8] = (result[8] & 0x3f) | 0x80; + return result; +} +exports.uuidV4 = uuidV4; +/** + * A helper function for determining `maxWireVersion` between legacy and new topology instances + * @internal + */ +function maxWireVersion(topologyOrServer) { + if (topologyOrServer) { + if (topologyOrServer.loadBalanced) { + // Since we do not have a monitor, we assume the load balanced server is always + // pointed at the latest mongodb version. There is a risk that for on-prem + // deployments that don't upgrade immediately that this could alert to the + // application that a feature is avaiable that is actually not. 
+ return constants_1.MAX_SUPPORTED_WIRE_VERSION; + } + if (topologyOrServer.ismaster) { + return topologyOrServer.ismaster.maxWireVersion; + } + if ('lastIsMaster' in topologyOrServer && typeof topologyOrServer.lastIsMaster === 'function') { + const lastIsMaster = topologyOrServer.lastIsMaster(); + if (lastIsMaster) { + return lastIsMaster.maxWireVersion; + } + } + if (topologyOrServer.description && + 'maxWireVersion' in topologyOrServer.description && + topologyOrServer.description.maxWireVersion != null) { + return topologyOrServer.description.maxWireVersion; + } + } + return 0; +} +exports.maxWireVersion = maxWireVersion; +/** + * Checks that collation is supported by server. + * @internal + * + * @param server - to check against + * @param cmd - object where collation may be specified + */ +function collationNotSupported(server, cmd) { + return cmd && cmd.collation && maxWireVersion(server) < 5; +} +exports.collationNotSupported = collationNotSupported; +/** + * Applies the function `eachFn` to each item in `arr`, in parallel. + * @internal + * + * @param arr - An array of items to asynchronously iterate over + * @param eachFn - A function to call on each item of the array. The callback signature is `(item, callback)`, where the callback indicates iteration is complete. + * @param callback - The callback called after every item has been iterated + */ +function eachAsync(arr, eachFn, callback) { + arr = arr || []; + let idx = 0; + let awaiting = 0; + for (idx = 0; idx < arr.length; ++idx) { + awaiting++; + eachFn(arr[idx], eachCallback); + } + if (awaiting === 0) { + callback(); + return; + } + function eachCallback(err) { + awaiting--; + if (err) { + callback(err); + return; + } + if (idx === arr.length && awaiting <= 0) { + callback(); + } + } +} +exports.eachAsync = eachAsync; +/** @internal */ +function eachAsyncSeries(arr, eachFn, callback) { + arr = arr || []; + let idx = 0; + let awaiting = arr.length; + if (awaiting === 0) { + callback(); + return; + } + function eachCallback(err) { + idx++; + awaiting--; + if (err) { + callback(err); + return; + } + if (idx === arr.length && awaiting <= 0) { + callback(); + return; + } + eachFn(arr[idx], eachCallback); + } + eachFn(arr[idx], eachCallback); +} +exports.eachAsyncSeries = eachAsyncSeries; +/** @internal */ +function arrayStrictEqual(arr, arr2) { + if (!Array.isArray(arr) || !Array.isArray(arr2)) { + return false; + } + return arr.length === arr2.length && arr.every((elt, idx) => elt === arr2[idx]); +} +exports.arrayStrictEqual = arrayStrictEqual; +/** @internal */ +function errorStrictEqual(lhs, rhs) { + if (lhs === rhs) { + return true; + } + if (!lhs || !rhs) { + return lhs === rhs; + } + if ((lhs == null && rhs != null) || (lhs != null && rhs == null)) { + return false; + } + if (lhs.constructor.name !== rhs.constructor.name) { + return false; + } + if (lhs.message !== rhs.message) { + return false; + } + return true; +} +exports.errorStrictEqual = errorStrictEqual; +/** @internal */ +function makeStateMachine(stateTable) { + return function stateTransition(target, newState) { + const legalStates = stateTable[target.s.state]; + if (legalStates && legalStates.indexOf(newState) < 0) { + throw new error_1.MongoRuntimeError(`illegal state transition from [${target.s.state}] => [${newState}], allowed: [${legalStates}]`); + } + target.emit('stateChanged', target.s.state, newState); + target.s.state = newState; + }; +} +exports.makeStateMachine = makeStateMachine; +// eslint-disable-next-line @typescript-eslint/no-var-requires +const 
NODE_DRIVER_VERSION = require('../package.json').version; +function makeClientMetadata(options) { + options = options !== null && options !== void 0 ? options : {}; + const metadata = { + driver: { + name: 'nodejs', + version: NODE_DRIVER_VERSION + }, + os: { + type: os.type(), + name: process.platform, + architecture: process.arch, + version: os.release() + }, + platform: `Node.js ${process.version}, ${os.endianness()} (unified)` + }; + // support optionally provided wrapping driver info + if (options.driverInfo) { + if (options.driverInfo.name) { + metadata.driver.name = `${metadata.driver.name}|${options.driverInfo.name}`; + } + if (options.driverInfo.version) { + metadata.version = `${metadata.driver.version}|${options.driverInfo.version}`; + } + if (options.driverInfo.platform) { + metadata.platform = `${metadata.platform}|${options.driverInfo.platform}`; + } + } + if (options.appName) { + // MongoDB requires the appName not exceed a byte length of 128 + const buffer = Buffer.from(options.appName); + metadata.application = { + name: buffer.byteLength > 128 ? buffer.slice(0, 128).toString('utf8') : options.appName + }; + } + return metadata; +} +exports.makeClientMetadata = makeClientMetadata; +/** @internal */ +function now() { + const hrtime = process.hrtime(); + return Math.floor(hrtime[0] * 1000 + hrtime[1] / 1000000); +} +exports.now = now; +/** @internal */ +function calculateDurationInMs(started) { + if (typeof started !== 'number') { + throw new error_1.MongoInvalidArgumentError('Numeric value required to calculate duration'); + } + const elapsed = now() - started; + return elapsed < 0 ? 0 : elapsed; +} +exports.calculateDurationInMs = calculateDurationInMs; +/** + * Creates an interval timer which is able to be woken up sooner than + * the interval. The timer will also debounce multiple calls to wake + * ensuring that the function is only ever called once within a minimum + * interval window. + * @internal + * + * @param fn - An async function to run on an interval, must accept a `callback` as its only parameter + */ +function makeInterruptibleAsyncInterval(fn, options) { + let timerId; + let lastCallTime; + let lastWakeTime; + let stopped = false; + options = options !== null && options !== void 0 ? options : {}; + const interval = options.interval || 1000; + const minInterval = options.minInterval || 500; + const immediate = typeof options.immediate === 'boolean' ? options.immediate : false; + const clock = typeof options.clock === 'function' ? options.clock : now; + function wake() { + const currentTime = clock(); + const timeSinceLastWake = currentTime - lastWakeTime; + const timeSinceLastCall = currentTime - lastCallTime; + const timeUntilNextCall = interval - timeSinceLastCall; + lastWakeTime = currentTime; + // For the streaming protocol: there is nothing obviously stopping this + // interval from being woken up again while we are waiting "infinitely" + // for `fn` to be called again`. Since the function effectively + // never completes, the `timeUntilNextCall` will continue to grow + // negatively unbounded, so it will never trigger a reschedule here. + // debounce multiple calls to wake within the `minInterval` + if (timeSinceLastWake < minInterval) { + return; + } + // reschedule a call as soon as possible, ensuring the call never happens + // faster than the `minInterval` + if (timeUntilNextCall > minInterval) { + reschedule(minInterval); + } + // This is possible in virtualized environments like AWS Lambda where our + // clock is unreliable. 
In these cases the timer is "running" but never + // actually completes, so we want to execute immediately and then attempt + // to reschedule. + if (timeUntilNextCall < 0) { + executeAndReschedule(); + } + } + function stop() { + stopped = true; + if (timerId) { + clearTimeout(timerId); + timerId = undefined; + } + lastCallTime = 0; + lastWakeTime = 0; + } + function reschedule(ms) { + if (stopped) + return; + if (timerId) { + clearTimeout(timerId); + } + timerId = setTimeout(executeAndReschedule, ms || interval); + } + function executeAndReschedule() { + lastWakeTime = 0; + lastCallTime = clock(); + fn(err => { + if (err) + throw err; + reschedule(interval); + }); + } + if (immediate) { + executeAndReschedule(); + } + else { + lastCallTime = clock(); + reschedule(undefined); + } + return { wake, stop }; +} +exports.makeInterruptibleAsyncInterval = makeInterruptibleAsyncInterval; +/** @internal */ +function hasAtomicOperators(doc) { + if (Array.isArray(doc)) { + for (const document of doc) { + if (hasAtomicOperators(document)) { + return true; + } + } + return false; + } + const keys = Object.keys(doc); + return keys.length > 0 && keys[0][0] === '$'; +} +exports.hasAtomicOperators = hasAtomicOperators; +/** + * Merge inherited properties from parent into options, prioritizing values from options, + * then values from parent. + * @internal + */ +function resolveOptions(parent, options) { + var _a, _b, _c; + const result = Object.assign({}, options, bson_1.resolveBSONOptions(options, parent)); + // Users cannot pass a readConcern/writeConcern to operations in a transaction + const session = options === null || options === void 0 ? void 0 : options.session; + if (!(session === null || session === void 0 ? void 0 : session.inTransaction())) { + const readConcern = (_a = read_concern_1.ReadConcern.fromOptions(options)) !== null && _a !== void 0 ? _a : parent === null || parent === void 0 ? void 0 : parent.readConcern; + if (readConcern) { + result.readConcern = readConcern; + } + const writeConcern = (_b = write_concern_1.WriteConcern.fromOptions(options)) !== null && _b !== void 0 ? _b : parent === null || parent === void 0 ? void 0 : parent.writeConcern; + if (writeConcern) { + result.writeConcern = writeConcern; + } + } + const readPreference = (_c = read_preference_1.ReadPreference.fromOptions(options)) !== null && _c !== void 0 ? _c : parent === null || parent === void 0 ? void 0 : parent.readPreference; + if (readPreference) { + result.readPreference = readPreference; + } + return result; +} +exports.resolveOptions = resolveOptions; +function isSuperset(set, subset) { + set = Array.isArray(set) ? new Set(set) : set; + subset = Array.isArray(subset) ? 
new Set(subset) : subset; + for (const elem of subset) { + if (!set.has(elem)) { + return false; + } + } + return true; +} +exports.isSuperset = isSuperset; +function setDifference(setA, setB) { + const difference = new Set(setA); + for (const elem of setB) { + difference.delete(elem); + } + return difference; +} +exports.setDifference = setDifference; +function isRecord(value, requiredKeys = undefined) { + const toString = Object.prototype.toString; + const hasOwnProperty = Object.prototype.hasOwnProperty; + const isObject = (v) => toString.call(v) === '[object Object]'; + if (!isObject(value)) { + return false; + } + const ctor = value.constructor; + if (ctor && ctor.prototype) { + if (!isObject(ctor.prototype)) { + return false; + } + // Check to see if some method exists from the Object exists + if (!hasOwnProperty.call(ctor.prototype, 'isPrototypeOf')) { + return false; + } + } + if (requiredKeys) { + const keys = Object.keys(value); + return isSuperset(keys, requiredKeys); + } + return true; +} +exports.isRecord = isRecord; +/** + * Make a deep copy of an object + * + * NOTE: This is not meant to be the perfect implementation of a deep copy, + * but instead something that is good enough for the purposes of + * command monitoring. + */ +function deepCopy(value) { + if (value == null) { + return value; + } + else if (Array.isArray(value)) { + return value.map(item => deepCopy(item)); + } + else if (isRecord(value)) { + const res = {}; + for (const key in value) { + res[key] = deepCopy(value[key]); + } + return res; + } + const ctor = value.constructor; + if (ctor) { + switch (ctor.name.toLowerCase()) { + case 'date': + return new ctor(Number(value)); + case 'map': + return new Map(value); + case 'set': + return new Set(value); + case 'buffer': + return Buffer.from(value); + } + } + return value; +} +exports.deepCopy = deepCopy; +/** @internal */ +const kBuffers = Symbol('buffers'); +/** @internal */ +const kLength = Symbol('length'); +/** + * A pool of Buffers which allow you to read them as if they were one + * @internal + */ +class BufferPool { + constructor() { + this[kBuffers] = []; + this[kLength] = 0; + } + get length() { + return this[kLength]; + } + /** Adds a buffer to the internal buffer pool list */ + append(buffer) { + this[kBuffers].push(buffer); + this[kLength] += buffer.length; + } + /** Returns the requested number of bytes without consuming them */ + peek(size) { + return this.read(size, false); + } + /** Reads the requested number of bytes, optionally consuming them */ + read(size, consume = true) { + if (typeof size !== 'number' || size < 0) { + throw new error_1.MongoInvalidArgumentError('Argument "size" must be a non-negative number'); + } + if (size > this[kLength]) { + return Buffer.alloc(0); + } + let result; + // read the whole buffer + if (size === this.length) { + result = Buffer.concat(this[kBuffers]); + if (consume) { + this[kBuffers] = []; + this[kLength] = 0; + } + } + // size is within first buffer, no need to concat + else if (size <= this[kBuffers][0].length) { + result = this[kBuffers][0].slice(0, size); + if (consume) { + this[kBuffers][0] = this[kBuffers][0].slice(size); + this[kLength] -= size; + } + } + // size is beyond first buffer, need to track and copy + else { + result = Buffer.allocUnsafe(size); + let idx; + let offset = 0; + let bytesToCopy = size; + for (idx = 0; idx < this[kBuffers].length; ++idx) { + let bytesCopied; + if (bytesToCopy > this[kBuffers][idx].length) { + bytesCopied = this[kBuffers][idx].copy(result, offset, 0); + offset 
+= bytesCopied; + } + else { + bytesCopied = this[kBuffers][idx].copy(result, offset, 0, bytesToCopy); + if (consume) { + this[kBuffers][idx] = this[kBuffers][idx].slice(bytesCopied); + } + offset += bytesCopied; + break; + } + bytesToCopy -= bytesCopied; + } + // compact the internal buffer array + if (consume) { + this[kBuffers] = this[kBuffers].slice(idx); + this[kLength] -= size; + } + } + return result; + } +} +exports.BufferPool = BufferPool; +/** @public */ +class HostAddress { + constructor(hostString) { + const escapedHost = hostString.split(' ').join('%20'); // escape spaces, for socket path hosts + const { hostname, port } = new url_1.URL(`mongodb://${escapedHost}`); + if (hostname.endsWith('.sock')) { + // heuristically determine if we're working with a domain socket + this.socketPath = decodeURIComponent(hostname); + } + else if (typeof hostname === 'string') { + this.isIPv6 = false; + let normalized = decodeURIComponent(hostname).toLowerCase(); + if (normalized.startsWith('[') && normalized.endsWith(']')) { + this.isIPv6 = true; + normalized = normalized.substring(1, hostname.length - 1); + } + this.host = normalized.toLowerCase(); + if (typeof port === 'number') { + this.port = port; + } + else if (typeof port === 'string' && port !== '') { + this.port = Number.parseInt(port, 10); + } + else { + this.port = 27017; + } + if (this.port === 0) { + throw new error_1.MongoParseError('Invalid port (zero) with hostname'); + } + } + else { + throw new error_1.MongoInvalidArgumentError('Either socketPath or host must be defined.'); + } + Object.freeze(this); + } + /** + * @param ipv6Brackets - optionally request ipv6 bracket notation required for connection strings + */ + toString(ipv6Brackets = false) { + if (typeof this.host === 'string') { + if (this.isIPv6 && ipv6Brackets) { + return `[${this.host}]:${this.port}`; + } + return `${this.host}:${this.port}`; + } + return `${this.socketPath}`; + } + static fromString(s) { + return new HostAddress(s); + } +} +exports.HostAddress = HostAddress; +exports.DEFAULT_PK_FACTORY = { + // We prefer not to rely on ObjectId having a createPk method + createPk() { + return new bson_1.ObjectId(); + } +}; +/** + * When the driver used emitWarning the code will be equal to this. + * @public + * + * @example + * ```js + * process.on('warning', (warning) => { + * if (warning.code === MONGODB_WARNING_CODE) console.error('Ah an important warning! :)') + * }) + * ``` + */ +exports.MONGODB_WARNING_CODE = 'MONGODB DRIVER'; +/** @internal */ +function emitWarning(message) { + return process.emitWarning(message, { code: exports.MONGODB_WARNING_CODE }); +} +exports.emitWarning = emitWarning; +const emittedWarnings = new Set(); +/** + * Will emit a warning once for the duration of the application. + * Uses the message to identify if it has already been emitted + * so using string interpolation can cause multiple emits + * @internal + */ +function emitWarningOnce(message) { + if (!emittedWarnings.has(message)) { + emittedWarnings.add(message); + return emitWarning(message); + } +} +exports.emitWarningOnce = emitWarningOnce; +/** + * Takes a JS object and joins the values into a string separated by ', ' + */ +function enumToString(en) { + return Object.values(en).join(', '); +} +exports.enumToString = enumToString; +/** + * Determine if a server supports retryable writes. 
+ * + * @internal + */ +function supportsRetryableWrites(server) { + return (!!server.loadBalanced || + (server.description.maxWireVersion >= 6 && + !!server.description.logicalSessionTimeoutMinutes && + server.description.type !== common_1.ServerType.Standalone)); +} +exports.supportsRetryableWrites = supportsRetryableWrites; +function parsePackageVersion({ version }) { + const [major, minor, patch] = version.split('.').map((n) => Number.parseInt(n, 10)); + return { major, minor, patch }; +} +exports.parsePackageVersion = parsePackageVersion; +//# sourceMappingURL=utils.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/utils.js.map b/node_modules/mongodb/lib/utils.js.map new file mode 100644 index 000000000..4f545bfbe --- /dev/null +++ b/node_modules/mongodb/lib/utils.js.map @@ -0,0 +1 @@ +{"version":3,"file":"utils.js","sourceRoot":"","sources":["../src/utils.ts"],"names":[],"mappings":";;;;AAAA,yBAAyB;AACzB,iCAAiC;AACjC,yDAAqD;AACrD,mCAQiB;AACjB,mDAAuE;AAGvE,0CAA2C;AAK3C,iDAA6C;AAE7C,iCAAgE;AAKhE,uDAAmD;AACnD,6BAA0B;AAC1B,8DAA4E;AAU/D,QAAA,UAAU,GAAG,MAAM,CAAC,gBAAgB,GAAG,CAAC,CAAC;AAItD;;;GAGG;AACH,SAAgB,iBAAiB,CAC/B,GAAe,EACf,IAA8B,EAC9B,KAAc;IAEd,MAAM,CAAC,cAAc,CAAC,GAAG,EAAE,IAAI,EAAE;QAC/B,UAAU,EAAE,IAAI;QAChB,GAAG;YACD,OAAO,KAAK,CAAC;QACf,CAAC;KACF,CAAC,CAAC;AACL,CAAC;AAXD,8CAWC;AAED;;;GAGG;AACH,SAAgB,mBAAmB,CAAC,cAAsB;IACxD,IAAI,QAAQ,KAAK,OAAO,cAAc,EAAE;QACtC,MAAM,IAAI,iCAAyB,CAAC,kCAAkC,CAAC,CAAC;KACzE;IAED,IAAI,CAAC,cAAc,IAAI,cAAc,CAAC,OAAO,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,EAAE;QAC1D,MAAM,IAAI,iCAAyB,CAAC,kCAAkC,CAAC,CAAC;KACzE;IAED,IACE,cAAc,CAAC,OAAO,CAAC,GAAG,CAAC,KAAK,CAAC,CAAC;QAClC,cAAc,CAAC,KAAK,CAAC,4BAA4B,CAAC,IAAI,IAAI,EAC1D;QACA,oDAAoD;QACpD,MAAM,IAAI,iCAAyB,CAAC,uCAAuC,CAAC,CAAC;KAC9E;IAED,IAAI,cAAc,CAAC,KAAK,CAAC,SAAS,CAAC,IAAI,IAAI,EAAE;QAC3C,oDAAoD;QACpD,MAAM,IAAI,iCAAyB,CAAC,iDAAiD,CAAC,CAAC;KACxF;IAED,+DAA+D;IAC/D,IAAI,cAAc,CAAC,OAAO,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,EAAE;QACzC,oDAAoD;QACpD,MAAM,IAAI,iCAAyB,CAAC,kDAAkD,CAAC,CAAC;KACzF;AACH,CAAC;AA3BD,kDA2BC;AAED;;;;;GAKG;AACH,SAAgB,kBAAkB,CAAC,IAAW;IAC5C,IAAI,SAAS,GAAG,SAAS,CAAC;IAE1B,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;QAC5B,SAAS,GAAG,IAAI,CAAC;KAClB;SAAM,IAAI,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;QAC9B,SAAS,GAAG,EAAE,CAAC;QAEf,IAAI,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE;YACnB,SAAS,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;QACvB,CAAC,CAAC,CAAC;KACJ;SAAM,IAAI,IAAI,IAAI,IAAI,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;QACnD,SAAS,GAAG,EAAc,CAAC;QAC3B,KAAK,MAAM,IAAI,IAAI,IAAI,EAAE;YACvB,SAAS,CAAC,IAAI,CAAC,GAAG,IAAI,CAAC,IAAI,CAAC,CAAC;SAC9B;KACF;IAED,OAAO,SAAS,CAAC;AACnB,CAAC;AAnBD,gDAmBC;AAQD;;;GAGG;AACH,SAAgB,iBAAiB,CAAC,SAA6B;IAC7D,MAAM,SAAS,GAAsC,EAAE,CAAC;IACxD,MAAM,OAAO,GAAG,EAAE,CAAC;IACnB,IAAI,IAAI,CAAC;IAET,iCAAiC;IACjC,IAAI,QAAQ,KAAK,OAAO,SAAS,EAAE;QACjC,SAAS;QACT,OAAO,CAAC,IAAI,CAAC,SAAS,GAAG,GAAG,GAAG,CAAC,CAAC,CAAC;QAClC,SAAS,CAAC,SAAS,CAAC,GAAG,CAAC,CAAC;KAC1B;SAAM,IAAI,KAAK,CAAC,OAAO,CAAC,SAAS,CAAC,EAAE;QACnC,SAAS,CAAC,OAAO,CAAC,CAAC,CAAM,EAAE,EAAE;YAC3B,IAAI,QAAQ,KAAK,OAAO,CAAC,EAAE;gBACzB,4BAA4B;gBAC5B,OAAO,CAAC,IAAI,CAAC,CAAC,GAAG,GAAG,GAAG,CAAC,CAAC,CAAC;gBAC1B,SAAS,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC;aAClB;iBAAM,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE;gBAC3B,mCAAmC;gBACnC,OAAO,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,GAAG,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;gBACvC,SAAS,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,CAAC;aAC7B;iBAAM,IAAI,QAAQ,CAAC,CAAC,CAAC,EAAE;gBACtB,8BAA8B;gBAC9B,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;gBACtB,IAAI,CAAC,OAAO,CAAC,CAAC,CAAC,EAAE;oBACf,OAAO,CAAC,IAAI,CAAC,
CAAC,GAAG,GAAG,GAAI,CAAgB,CAAC,CAAC,CAAC,CAAC,CAAC;oBAC7C,SAAS,CAAC,CAAC,CAAC,GAAI,CAAgB,CAAC,CAAC,CAAC,CAAC;gBACtC,CAAC,CAAC,CAAC;aACJ;iBAAM;gBACL,qBAAqB;aACtB;QACH,CAAC,CAAC,CAAC;KACJ;SAAM,IAAI,QAAQ,CAAC,SAAS,CAAC,EAAE;QAC9B,0BAA0B;QAC1B,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,SAAS,CAAC,CAAC;QAC9B,MAAM,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,GAAG,EAAE,KAAK,CAAC,EAAE,EAAE;YACjD,OAAO,CAAC,IAAI,CAAC,GAAG,GAAG,GAAG,GAAG,KAAK,CAAC,CAAC;YAChC,SAAS,CAAC,GAAG,CAAC,GAAG,KAAK,CAAC;QACzB,CAAC,CAAC,CAAC;KACJ;IAED,OAAO;QACL,IAAI,EAAE,OAAO,CAAC,IAAI,CAAC,GAAG,CAAC;QACvB,IAAI,EAAE,IAAI;QACV,SAAS,EAAE,SAAS;KACrB,CAAC;AACJ,CAAC;AA7CD,8CA6CC;AAED;;;;GAIG;AACH,wDAAwD;AACxD,SAAgB,QAAQ,CAAC,GAAY;IACnC,OAAO,iBAAiB,KAAK,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;AACnE,CAAC;AAFD,4BAEC;AAED,gBAAgB;AAChB,SAAgB,eAAe,CAAC,OAAiB,EAAE,OAAiB,EAAE,OAAiB;IACrF,KAAK,MAAM,IAAI,IAAI,OAAO,EAAE;QAC1B,IAAI,CAAC,OAAO,CAAC,QAAQ,CAAC,IAAI,CAAC,EAAE;YAC3B,OAAO,CAAC,IAAI,CAAC,GAAG,OAAO,CAAC,IAAI,CAAC,CAAC;SAC/B;KACF;IAED,OAAO,OAAO,CAAC;AACjB,CAAC;AARD,0CAQC;AAED,gBAAgB;AAChB,SAAgB,YAAY,CAAO,MAAS,EAAE,MAAS;IACrD,OAAO,EAAE,GAAG,MAAM,EAAE,GAAG,MAAM,EAAE,CAAC;AAClC,CAAC;AAFD,oCAEC;AAED,gBAAgB;AAChB,SAAgB,aAAa,CAAC,OAAmB,EAAE,KAAe;IAChE,MAAM,aAAa,GAAe,EAAE,CAAC;IAErC,KAAK,MAAM,IAAI,IAAI,OAAO,EAAE;QAC1B,IAAI,KAAK,CAAC,QAAQ,CAAC,IAAI,CAAC,EAAE;YACxB,aAAa,CAAC,IAAI,CAAC,GAAG,OAAO,CAAC,IAAI,CAAC,CAAC;SACrC;KACF;IAED,mBAAmB;IACnB,OAAO,aAAa,CAAC;AACvB,CAAC;AAXD,sCAWC;AAED;;;;;;;;;;;;;;;;GAgBG;AACH,SAAgB,sBAAsB,CACpC,QAAkB,EAClB,SAAuD,EACvD,IAAW,EACX,OAAoB;IAEpB,MAAM,OAAO,GAAG,kCAAe,CAAC,GAAG,EAAE,CAAC;IAEtC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;QACxB,kBAAkB;QAClB,MAAM,IAAI,yBAAiB,CAAC,qDAAqD,CAAC,CAAC;KACpF;IAED,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IAExB,IAAI,QAAQ,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;IAErC,sFAAsF;IACtF,mDAAmD;IACnD,IAAI,OAAsB,CAAC;IAC3B,IAAI,SAAc,CAAC;IACnB,IAAI,KAAU,CAAC;IACf,IAAI,CAAC,OAAO,CAAC,YAAY,IAAI,QAAQ,CAAC,iBAAiB,EAAE,EAAE;QACzD,SAAS,GAAG,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;QAClC,IAAI,SAAS,IAAI,IAAI,IAAI,SAAS,CAAC,OAAO,IAAI,IAAI,EAAE;YAClD,KAAK,GAAG,MAAM,EAAE,CAAC;YACjB,OAAO,GAAG,QAAQ,CAAC,YAAY,CAAC,EAAE,KAAK,EAAE,CAAC,CAAC;YAC3C,MAAM,YAAY,GAAG,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC;YACrC,IAAI,CAAC,YAAY,CAAC,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,IAAI,CAAC,YAAY,CAAC,EAAE,EAAE,OAAO,EAAE,OAAO,EAAE,CAAC,CAAC;SAClF;aAAM,IAAI,SAAS,CAAC,OAAO,IAAI,SAAS,CAAC,OAAO,CAAC,QAAQ,EAAE;YAC1D,MAAM,IAAI,gCAAwB,EAAE,CAAC;SACtC;KACF;IAED,SAAS,mBAAmB,CAC1B,OAAmC,EACnC,MAAmC;QAEnC,OAAO,UAAU,GAAc,EAAE,MAAY;YAC3C,IAAI,OAAO,IAAI,OAAO,CAAC,KAAK,KAAK,KAAK,IAAI,CAAC,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,CAAA,EAAE;gBACjE,OAAO,CAAC,UAAU,CAAC,GAAG,EAAE;oBACtB,OAAO,SAAS,CAAC,OAAO,CAAC;oBACzB,IAAI,GAAG;wBAAE,OAAO,MAAM,CAAC,GAAG,CAAC,CAAC;oBAC5B,OAAO,CAAC,MAAM,CAAC,CAAC;gBAClB,CAAC,CAAC,CAAC;aACJ;iBAAM;gBACL,IAAI,GAAG;oBAAE,OAAO,MAAM,CAAC,GAAG,CAAC,CAAC;gBAC5B,OAAO,CAAC,MAAM,CAAC,CAAC;aACjB;QACH,CAAC,CAAC;IACJ,CAAC;IAED,yBAAyB;IACzB,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;QAClC,QAAQ,GAAG,IAAI,CAAC,GAAG,EAAE,CAAC;QACtB,MAAM,OAAO,GAAG,mBAAmB,CACjC,MAAM,CAAC,EAAE,CAAC,QAAQ,CAAC,SAAS,EAAE,MAAM,CAAC,EACrC,GAAG,CAAC,EAAE,CAAC,QAAQ,CAAC,GAAG,EAAE,IAAI,CAAC,CAC3B,CAAC;QACF,IAAI,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC;QAEnB,IAAI;YACF,OAAO,SAAS,CAAC,GAAG,IAAI,CAAC,CAAC;SAC3B;QAAC,OAAO,CAAC,EAAE;YACV,OAAO,CAAC,CAAC,CAAC,CAAC;YACX,MAAM,CAAC,CAAC;SACT;KACF;IAED,mBAAmB;IACnB,IAAI,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,IAAI,IAAI,EAAE;QACjC,kBAAkB;QAClB,MAAM,IAAI,yBAAiB,CAAC,+DAA+D,CAAC,CAAC;KAC9F;IAED,OAAO,IAAI,OAAO,CAAM,CAAC,OAAO,EAAE,MAA
M,EAAE,EAAE;QAC1C,MAAM,OAAO,GAAG,mBAAmB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC;QACrD,IAAI,CAAC,IAAI,CAAC,MAAM,GAAG,CAAC,CAAC,GAAG,OAAO,CAAC;QAEhC,IAAI;YACF,OAAO,SAAS,CAAC,GAAG,IAAI,CAAC,CAAC;SAC3B;QAAC,OAAO,CAAC,EAAE;YACV,OAAO,CAAC,CAAC,CAAC,CAAC;SACZ;IACH,CAAC,CAAC,CAAC;AACL,CAAC;AArFD,wDAqFC;AAKD;;;;;;GAMG;AACH,SAAgB,oBAAoB,CAA+B,MAAS,EAAE,EAAO;;IACnF,IAAI,EAAE,KAAI,MAAA,EAAE,CAAC,CAAC,CAAC,OAAO,0CAAE,WAAW,CAAA,EAAE;QACnC,MAAM,CAAC,WAAW,GAAG,IAAI,CAAC;KAC3B;IAED,OAAO,MAAM,CAAC;AAChB,CAAC;AAND,oDAMC;AAKD;;;;;;;;GAQG;AACH,SAAgB,iBAAiB,CAC/B,MAAS,EACT,OAA6C,EAC7C,OAAgD;IAEhD,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IACxB,MAAM,EAAE,GAAG,OAAO,CAAC,EAAE,CAAC;IACtB,MAAM,IAAI,GAAG,OAAO,CAAC,UAAU,CAAC;IAEhC,IAAI,OAAO,CAAC,OAAO,IAAI,OAAO,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE;QACtD,mEAAmE;QACnE,IAAI,MAAM,CAAC,YAAY,EAAE;YACvB,OAAO,MAAM,CAAC,YAAY,CAAC;SAC5B;QAED,OAAO,MAAM,CAAC;KACf;IAED,MAAM,YAAY,GAAG,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,CAAC;IACvD,IAAI,YAAY,EAAE;QAChB,OAAO,MAAM,CAAC,MAAM,CAAC,MAAM,EAAE,EAAE,YAAY,EAAE,CAAC,CAAC;KAChD;IAED,IAAI,IAAI,IAAI,IAAI,CAAC,YAAY,EAAE;QAC7B,OAAO,MAAM,CAAC,MAAM,CAAC,MAAM,EAAE,EAAE,YAAY,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,IAAI,CAAC,YAAY,CAAC,EAAE,CAAC,CAAC;KACtF;IAED,IAAI,EAAE,IAAI,EAAE,CAAC,YAAY,EAAE;QACzB,OAAO,MAAM,CAAC,MAAM,CAAC,MAAM,EAAE,EAAE,YAAY,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,EAAE,CAAC,YAAY,CAAC,EAAE,CAAC,CAAC;KACpF;IAED,OAAO,MAAM,CAAC;AAChB,CAAC;AAhCD,8CAgCC;AAED;;;;;;GAMG;AACH,SAAgB,aAAa,CAC3B,YAAoC;IAEpC,OAAO,CAAC,CAAC,YAAY,IAAI,OAAO,YAAY,CAAC,IAAI,KAAK,UAAU,CAAC;AACnE,CAAC;AAJD,sCAIC;AAED;;;;;;;GAOG;AACH,SAAgB,qBAAqB,CACnC,OAAiB,EACjB,MAAqC,EACrC,OAAmB;IAEnB,MAAM,YAAY,GAAG,WAAW,CAAC,MAAM,CAAC,CAAC,YAAY,CAAC;IACtD,IAAI,OAAO,CAAC,SAAS,IAAI,OAAO,OAAO,CAAC,SAAS,KAAK,QAAQ,EAAE;QAC9D,IAAI,YAAY,IAAI,YAAY,CAAC,qBAAqB,EAAE;YACtD,OAAO,CAAC,SAAS,GAAG,OAAO,CAAC,SAAS,CAAC;SACvC;aAAM;YACL,MAAM,IAAI,+BAAuB,CAAC,6CAA6C,CAAC,CAAC;SAClF;KACF;AACH,CAAC;AAbD,sDAaC;AAED;;;;;;GAMG;AACH,SAAgB,uBAAuB,CACrC,OAAiB,EACjB,IAA0C,EAC1C,OAA0B;IAE1B,IAAI,OAAO,IAAI,OAAO,CAAC,OAAO,IAAI,OAAO,CAAC,OAAO,CAAC,aAAa,EAAE,EAAE;QACjE,OAAO;KACR;IACD,MAAM,WAAW,GAAG,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,CAAC,WAAW,IAAI,EAAE,CAAC,CAAC;IACjE,IAAI,IAAI,CAAC,CAAC,CAAC,WAAW,EAAE;QACtB,MAAM,CAAC,MAAM,CAAC,WAAW,EAAE,IAAI,CAAC,CAAC,CAAC,WAAW,CAAC,CAAC;KAChD;IAED,IAAI,MAAM,CAAC,IAAI,CAAC,WAAW,CAAC,CAAC,MAAM,GAAG,CAAC,EAAE;QACvC,MAAM,CAAC,MAAM,CAAC,OAAO,EAAE,EAAE,WAAW,EAAE,WAAW,EAAE,CAAC,CAAC;KACtD;AACH,CAAC;AAhBD,0DAgBC;AAED;;;;;;GAMG;AACH,SAAgB,mBAAmB,CAAC,OAAiB,EAAE,OAAgB;IACrE,IAAI,OAAO,CAAC,OAAO,EAAE;QACnB,OAAO,OAAO,CAAC;KAChB;IAED,OAAO,EAAE,OAAO,EAAE,OAAO,EAAE,SAAS,EAAE,OAAO,CAAC,SAAS,EAAE,CAAC;AAC5D,CAAC;AAND,kDAMC;AAED;;;;GAIG;AACH,SAAgB,WAAW,CAAI,QAA0C;IACvE,IAAI,UAAU,IAAI,QAAQ,IAAI,QAAQ,CAAC,QAAQ,EAAE;QAC/C,OAAO,QAAQ,CAAC,QAAQ,CAAC;KAC1B;SAAM,IAAI,QAAQ,IAAI,QAAQ,CAAC,CAAC,IAAI,QAAQ,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,EAAE;QAC/D,OAAO,QAAQ,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,CAAC;KACnC;SAAM,IAAI,IAAI,IAAI,QAAQ,CAAC,CAAC,IAAI,QAAQ,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,EAAE;QAChE,OAAO,QAAQ,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,MAAM,CAAC,QAAQ,CAAC;KACxC;IAED,MAAM,IAAI,8BAAsB,CAAC,yDAAyD,CAAC,CAAC;AAC9F,CAAC;AAVD,kCAUC;AAED;;;;;;;GAOG;AACH,SAAgB,iBAAiB,CAAC,IAAY,EAAE,MAAc;IAC5D,OAAO,GAAG,IAAI,YAAY,MAAM,yDAAyD,CAAC;AAC5F,CAAC;AAFD,8CAEC;AAaD;;;;;;;;GAQG;AACH,SAAgB,gBAAgB,CAE9B,MAA8B,EAC9B,EAA2B;IAE3B,IAAK,OAAe,CAAC,aAAa,KAAK,IAAI,EAAE;QAC3C,OAAO,EAAE,CAAC;KACX;IAED,MAAM,UAAU,GAAG,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,CAAC,iBAAiB,CAAC;IAE7E,MAAM,aAAa,GAAG,IAAI,GAAG,EAAE,CAAC;IAChC,SAAS,UAAU,CAAY,GAAG,IAAW;QA
C3C,MAAM,OAAO,GAAG,IAAI,CAAC,MAAM,CAAC,YAAY,CAAe,CAAC;QAExD,uEAAuE;QACvE,IAAI,CAAC,QAAQ,CAAC,OAAO,CAAC,IAAI,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,CAAC,MAAM,KAAK,CAAC,EAAE;YAC3D,OAAO,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC,CAAC,CAAC,+BAA+B;SAC/D;QAED,6CAA6C;QAC7C,KAAK,MAAM,gBAAgB,IAAI,MAAM,CAAC,iBAAiB,EAAE;YACvD,IAAI,gBAAgB,IAAI,OAAO,IAAI,CAAC,aAAa,CAAC,GAAG,CAAC,gBAAgB,CAAC,EAAE;gBACvE,aAAa,CAAC,GAAG,CAAC,gBAAgB,CAAC,CAAC;gBACpC,MAAM,GAAG,GAAG,UAAU,CAAC,MAAM,CAAC,IAAI,EAAE,gBAAgB,CAAC,CAAC;gBACtD,WAAW,CAAC,GAAG,CAAC,CAAC;gBACjB,IAAI,IAAI,IAAI,WAAW,IAAI,IAAI,EAAE;oBAC/B,MAAM,MAAM,GAAG,IAAI,CAAC,SAAS,EAAE,CAAC;oBAChC,IAAI,MAAM,EAAE;wBACV,MAAM,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;qBAClB;iBACF;aACF;SACF;QAED,OAAO,EAAE,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,GAAG,IAAI,CAAC,CAAC;IAChC,CAAC;IAED,oIAAoI;IACpI,6EAA6E;IAC7E,MAAM,CAAC,cAAc,CAAC,UAAU,EAAE,EAAE,CAAC,CAAC;IACtC,IAAI,EAAE,CAAC,SAAS,EAAE;QAChB,yEAAyE;QACzE,yEAAyE;QACzE,eAAe;QACf,UAAU,CAAC,SAAS,GAAG,EAAE,CAAC,SAAS,CAAC;KACrC;IAED,OAAO,UAAU,CAAC;AACpB,CAAC;AAjDD,4CAiDC;AAED,gBAAgB;AAChB,SAAgB,EAAE,CAAC,EAAU;IAC3B,OAAO,gBAAgB,CAAC,UAAU,CAAC,EAAE,CAAC,CAAC;AACzC,CAAC;AAFD,gBAEC;AAED,cAAc;AACd,MAAa,gBAAgB;IAG3B;;;;;OAKG;IACH,YAAY,EAAU,EAAE,UAAmB;QACzC,IAAI,CAAC,EAAE,GAAG,EAAE,CAAC;QACb,IAAI,CAAC,UAAU,GAAG,UAAU,CAAC;IAC/B,CAAC;IAED,QAAQ;QACN,OAAO,IAAI,CAAC,UAAU,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,EAAE,IAAI,IAAI,CAAC,UAAU,EAAE,CAAC,CAAC,CAAC,IAAI,CAAC,EAAE,CAAC;IACrE,CAAC;IAED,cAAc,CAAC,UAAkB;QAC/B,OAAO,IAAI,gBAAgB,CAAC,IAAI,CAAC,EAAE,EAAE,UAAU,CAAC,CAAC;IACnD,CAAC;IAED,MAAM,CAAC,UAAU,CAAC,SAAkB;QAClC,IAAI,CAAC,SAAS,EAAE;YACd,oDAAoD;YACpD,MAAM,IAAI,yBAAiB,CAAC,gCAAgC,SAAS,GAAG,CAAC,CAAC;SAC3E;QAED,MAAM,CAAC,EAAE,EAAE,GAAG,UAAU,CAAC,GAAG,SAAS,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;QACjD,OAAO,IAAI,gBAAgB,CAAC,EAAE,EAAE,UAAU,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC;IACxD,CAAC;CACF;AA/BD,4CA+BC;AAED,gBAAgB;AAChB,QAAe,CAAC,CAAC,WAAW,CAAC,IAAI,GAAG,CAAC;IACnC,IAAI,KAAK,GAAG,IAAI,CAAC;IACjB,OAAO,IAAI,EAAE;QACX,MAAM,QAAQ,GAAG,KAAK,CAAC;QACvB,KAAK,IAAI,CAAC,CAAC;QACX,MAAM,QAAQ,CAAC;KAChB;AACH,CAAC;AAPD,kCAOC;AAED;;;;;;;GAOG;AACH,SAAgB,YAAY,CAC1B,QAAiC,EACjC,OAAkC;IAElC,MAAM,OAAO,GAAG,kCAAe,CAAC,GAAG,EAAE,CAAC;IACtC,IAAI,MAAyB,CAAC;IAC9B,IAAI,OAAO,QAAQ,KAAK,UAAU,EAAE;QAClC,MAAM,GAAG,IAAI,OAAO,CAAM,CAAC,OAAO,EAAE,MAAM,EAAE,EAAE;YAC5C,QAAQ,GAAG,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;gBACtB,IAAI,GAAG;oBAAE,OAAO,MAAM,CAAC,GAAG,CAAC,CAAC;gBAC5B,OAAO,CAAC,GAAG,CAAC,CAAC;YACf,CAAC,CAAC;QACJ,CAAC,CAAC,CAAC;KACJ;IAED,OAAO,CAAC,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE;QACnB,IAAI,GAAG,IAAI,IAAI,EAAE;YACf,IAAI;gBACF,oEAAoE;gBACpE,QAAS,CAAC,GAAG,CAAC,CAAC;aAChB;YAAC,OAAO,KAAK,EAAE;gBACd,OAAO,CAAC,QAAQ,CAAC,GAAG,EAAE;oBACpB,MAAM,KAAK,CAAC;gBACd,CAAC,CAAC,CAAC;aACJ;YAED,OAAO;SACR;QAED,oEAAoE;QACpE,QAAS,CAAC,GAAG,EAAE,GAAG,CAAC,CAAC;IACtB,CAAC,CAAC,CAAC;IAEH,OAAO,MAAM,CAAC;AAChB,CAAC;AAlCD,oCAkCC;AAED,gBAAgB;AAChB,SAAgB,iBAAiB,CAAC,EAAU;IAC1C,OAAO,EAAE,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC;AAC1B,CAAC;AAFD,8CAEC;AAED,gBAAgB;AAChB,SAAgB,mBAAmB,CAAC,EAAU;IAC5C,OAAO,EAAE,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;AAC1C,CAAC;AAFD,kDAEC;AAED;;;GAGG;AACH,SAAgB,MAAM;IACpB,MAAM,MAAM,GAAG,MAAM,CAAC,WAAW,CAAC,EAAE,CAAC,CAAC;IACtC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,GAAG,IAAI,CAAC;IACtC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,GAAG,IAAI,CAAC;IACtC,OAAO,MAAM,CAAC;AAChB,CAAC;AALD,wBAKC;AAED;;;GAGG;AACH,SAAgB,cAAc,CAAC,gBAAiD;IAC9E,IAAI,gBAAgB,EAAE;QACpB,IAAI,gBAAgB,CAAC,YAAY,EAAE;YACjC,+EAA+E;YAC/E,0EAA0E;YAC1E,0EAA0E;YAC1E,+DAA+D;YAC/D,OAAO,sCA
A0B,CAAC;SACnC;QACD,IAAI,gBAAgB,CAAC,QAAQ,EAAE;YAC7B,OAAO,gBAAgB,CAAC,QAAQ,CAAC,cAAc,CAAC;SACjD;QAED,IAAI,cAAc,IAAI,gBAAgB,IAAI,OAAO,gBAAgB,CAAC,YAAY,KAAK,UAAU,EAAE;YAC7F,MAAM,YAAY,GAAG,gBAAgB,CAAC,YAAY,EAAE,CAAC;YACrD,IAAI,YAAY,EAAE;gBAChB,OAAO,YAAY,CAAC,cAAc,CAAC;aACpC;SACF;QAED,IACE,gBAAgB,CAAC,WAAW;YAC5B,gBAAgB,IAAI,gBAAgB,CAAC,WAAW;YAChD,gBAAgB,CAAC,WAAW,CAAC,cAAc,IAAI,IAAI,EACnD;YACA,OAAO,gBAAgB,CAAC,WAAW,CAAC,cAAc,CAAC;SACpD;KACF;IAED,OAAO,CAAC,CAAC;AACX,CAAC;AA9BD,wCA8BC;AAED;;;;;;GAMG;AACH,SAAgB,qBAAqB,CAAC,MAAc,EAAE,GAAa;IACjE,OAAO,GAAG,IAAI,GAAG,CAAC,SAAS,IAAI,cAAc,CAAC,MAAM,CAAC,GAAG,CAAC,CAAC;AAC5D,CAAC;AAFD,sDAEC;AAED;;;;;;;GAOG;AACH,SAAgB,SAAS,CACvB,GAAQ,EACR,MAA6D,EAC7D,QAAkB;IAElB,GAAG,GAAG,GAAG,IAAI,EAAE,CAAC;IAEhB,IAAI,GAAG,GAAG,CAAC,CAAC;IACZ,IAAI,QAAQ,GAAG,CAAC,CAAC;IACjB,KAAK,GAAG,GAAG,CAAC,EAAE,GAAG,GAAG,GAAG,CAAC,MAAM,EAAE,EAAE,GAAG,EAAE;QACrC,QAAQ,EAAE,CAAC;QACX,MAAM,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE,YAAY,CAAC,CAAC;KAChC;IAED,IAAI,QAAQ,KAAK,CAAC,EAAE;QAClB,QAAQ,EAAE,CAAC;QACX,OAAO;KACR;IAED,SAAS,YAAY,CAAC,GAAc;QAClC,QAAQ,EAAE,CAAC;QACX,IAAI,GAAG,EAAE;YACP,QAAQ,CAAC,GAAG,CAAC,CAAC;YACd,OAAO;SACR;QAED,IAAI,GAAG,KAAK,GAAG,CAAC,MAAM,IAAI,QAAQ,IAAI,CAAC,EAAE;YACvC,QAAQ,EAAE,CAAC;SACZ;IACH,CAAC;AACH,CAAC;AA9BD,8BA8BC;AAED,gBAAgB;AAChB,SAAgB,eAAe,CAC7B,GAAQ,EACR,MAA6D,EAC7D,QAAkB;IAElB,GAAG,GAAG,GAAG,IAAI,EAAE,CAAC;IAEhB,IAAI,GAAG,GAAG,CAAC,CAAC;IACZ,IAAI,QAAQ,GAAG,GAAG,CAAC,MAAM,CAAC;IAC1B,IAAI,QAAQ,KAAK,CAAC,EAAE;QAClB,QAAQ,EAAE,CAAC;QACX,OAAO;KACR;IAED,SAAS,YAAY,CAAC,GAAc;QAClC,GAAG,EAAE,CAAC;QACN,QAAQ,EAAE,CAAC;QACX,IAAI,GAAG,EAAE;YACP,QAAQ,CAAC,GAAG,CAAC,CAAC;YACd,OAAO;SACR;QAED,IAAI,GAAG,KAAK,GAAG,CAAC,MAAM,IAAI,QAAQ,IAAI,CAAC,EAAE;YACvC,QAAQ,EAAE,CAAC;YACX,OAAO;SACR;QAED,MAAM,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE,YAAY,CAAC,CAAC;IACjC,CAAC;IAED,MAAM,CAAC,GAAG,CAAC,GAAG,CAAC,EAAE,YAAY,CAAC,CAAC;AACjC,CAAC;AA/BD,0CA+BC;AAED,gBAAgB;AAChB,SAAgB,gBAAgB,CAAC,GAAc,EAAE,IAAe;IAC9D,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,IAAI,CAAC,EAAE;QAC/C,OAAO,KAAK,CAAC;KACd;IAED,OAAO,GAAG,CAAC,MAAM,KAAK,IAAI,CAAC,MAAM,IAAI,GAAG,CAAC,KAAK,CAAC,CAAC,GAAG,EAAE,GAAG,EAAE,EAAE,CAAC,GAAG,KAAK,IAAI,CAAC,GAAG,CAAC,CAAC,CAAC;AAClF,CAAC;AAND,4CAMC;AAED,gBAAgB;AAChB,SAAgB,gBAAgB,CAAC,GAAc,EAAE,GAAc;IAC7D,IAAI,GAAG,KAAK,GAAG,EAAE;QACf,OAAO,IAAI,CAAC;KACb;IAED,IAAI,CAAC,GAAG,IAAI,CAAC,GAAG,EAAE;QAChB,OAAO,GAAG,KAAK,GAAG,CAAC;KACpB;IAED,IAAI,CAAC,GAAG,IAAI,IAAI,IAAI,GAAG,IAAI,IAAI,CAAC,IAAI,CAAC,GAAG,IAAI,IAAI,IAAI,GAAG,IAAI,IAAI,CAAC,EAAE;QAChE,OAAO,KAAK,CAAC;KACd;IAED,IAAI,GAAG,CAAC,WAAW,CAAC,IAAI,KAAK,GAAG,CAAC,WAAW,CAAC,IAAI,EAAE;QACjD,OAAO,KAAK,CAAC;KACd;IAED,IAAI,GAAG,CAAC,OAAO,KAAK,GAAG,CAAC,OAAO,EAAE;QAC/B,OAAO,KAAK,CAAC;KACd;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AAtBD,4CAsBC;AAmBD,gBAAgB;AAChB,SAAgB,gBAAgB,CAAC,UAAsB;IACrD,OAAO,SAAS,eAAe,CAAC,MAAM,EAAE,QAAQ;QAC9C,MAAM,WAAW,GAAG,UAAU,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC;QAC/C,IAAI,WAAW,IAAI,WAAW,CAAC,OAAO,CAAC,QAAQ,CAAC,GAAG,CAAC,EAAE;YACpD,MAAM,IAAI,yBAAiB,CACzB,kCAAkC,MAAM,CAAC,CAAC,CAAC,KAAK,SAAS,QAAQ,gBAAgB,WAAW,GAAG,CAChG,CAAC;SACH;QAED,MAAM,CAAC,IAAI,CAAC,cAAc,EAAE,MAAM,CAAC,CAAC,CAAC,KAAK,EAAE,QAAQ,CAAC,CAAC;QACtD,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,QAAQ,CAAC;IAC5B,CAAC,CAAC;AACJ,CAAC;AAZD,4CAYC;AA+BD,8DAA8D;AAC9D,MAAM,mBAAmB,GAAG,OAAO,CAAC,iBAAiB,CAAC,CAAC,OAAO,CAAC;AAE/D,SAAgB,kBAAkB,CAAC,OAA+B;IAChE,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IAExB,MAAM,QAAQ,GAAmB;QAC/B,MAAM,EAAE;YACN,IAAI,EAAE,QAAQ;YACd,OAAO,EAAE,mBAAmB;SAC7B;QACD,EAAE,EAAE;YACF,IAAI,EAAE,EAAE,CAAC,IAAI,EAAE;YACf,IAAI,EAAE,OAAO,CAAC,QAAQ;YACtB,YAAY,EAAE,
OAAO,CAAC,IAAI;YAC1B,OAAO,EAAE,EAAE,CAAC,OAAO,EAAE;SACtB;QACD,QAAQ,EAAE,WAAW,OAAO,CAAC,OAAO,KAAK,EAAE,CAAC,UAAU,EAAE,YAAY;KACrE,CAAC;IAEF,mDAAmD;IACnD,IAAI,OAAO,CAAC,UAAU,EAAE;QACtB,IAAI,OAAO,CAAC,UAAU,CAAC,IAAI,EAAE;YAC3B,QAAQ,CAAC,MAAM,CAAC,IAAI,GAAG,GAAG,QAAQ,CAAC,MAAM,CAAC,IAAI,IAAI,OAAO,CAAC,UAAU,CAAC,IAAI,EAAE,CAAC;SAC7E;QAED,IAAI,OAAO,CAAC,UAAU,CAAC,OAAO,EAAE;YAC9B,QAAQ,CAAC,OAAO,GAAG,GAAG,QAAQ,CAAC,MAAM,CAAC,OAAO,IAAI,OAAO,CAAC,UAAU,CAAC,OAAO,EAAE,CAAC;SAC/E;QAED,IAAI,OAAO,CAAC,UAAU,CAAC,QAAQ,EAAE;YAC/B,QAAQ,CAAC,QAAQ,GAAG,GAAG,QAAQ,CAAC,QAAQ,IAAI,OAAO,CAAC,UAAU,CAAC,QAAQ,EAAE,CAAC;SAC3E;KACF;IAED,IAAI,OAAO,CAAC,OAAO,EAAE;QACnB,+DAA+D;QAC/D,MAAM,MAAM,GAAG,MAAM,CAAC,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;QAC5C,QAAQ,CAAC,WAAW,GAAG;YACrB,IAAI,EAAE,MAAM,CAAC,UAAU,GAAG,GAAG,CAAC,CAAC,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,EAAE,GAAG,CAAC,CAAC,QAAQ,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC,OAAO;SACxF,CAAC;KACH;IAED,OAAO,QAAQ,CAAC;AAClB,CAAC;AAzCD,gDAyCC;AAED,gBAAgB;AAChB,SAAgB,GAAG;IACjB,MAAM,MAAM,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC;IAChC,OAAO,IAAI,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,IAAI,GAAG,MAAM,CAAC,CAAC,CAAC,GAAG,OAAO,CAAC,CAAC;AAC5D,CAAC;AAHD,kBAGC;AAED,gBAAgB;AAChB,SAAgB,qBAAqB,CAAC,OAAe;IACnD,IAAI,OAAO,OAAO,KAAK,QAAQ,EAAE;QAC/B,MAAM,IAAI,iCAAyB,CAAC,8CAA8C,CAAC,CAAC;KACrF;IAED,MAAM,OAAO,GAAG,GAAG,EAAE,GAAG,OAAO,CAAC;IAChC,OAAO,OAAO,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,OAAO,CAAC;AACnC,CAAC;AAPD,sDAOC;AAuBD;;;;;;;;GAQG;AACH,SAAgB,8BAA8B,CAC5C,EAAgC,EAChC,OAAoD;IAEpD,IAAI,OAAmC,CAAC;IACxC,IAAI,YAAoB,CAAC;IACzB,IAAI,YAAoB,CAAC;IACzB,IAAI,OAAO,GAAG,KAAK,CAAC;IAEpB,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;IACxB,MAAM,QAAQ,GAAG,OAAO,CAAC,QAAQ,IAAI,IAAI,CAAC;IAC1C,MAAM,WAAW,GAAG,OAAO,CAAC,WAAW,IAAI,GAAG,CAAC;IAC/C,MAAM,SAAS,GAAG,OAAO,OAAO,CAAC,SAAS,KAAK,SAAS,CAAC,CAAC,CAAC,OAAO,CAAC,SAAS,CAAC,CAAC,CAAC,KAAK,CAAC;IACrF,MAAM,KAAK,GAAG,OAAO,OAAO,CAAC,KAAK,KAAK,UAAU,CAAC,CAAC,CAAC,OAAO,CAAC,KAAK,CAAC,CAAC,CAAC,GAAG,CAAC;IAExE,SAAS,IAAI;QACX,MAAM,WAAW,GAAG,KAAK,EAAE,CAAC;QAC5B,MAAM,iBAAiB,GAAG,WAAW,GAAG,YAAY,CAAC;QACrD,MAAM,iBAAiB,GAAG,WAAW,GAAG,YAAY,CAAC;QACrD,MAAM,iBAAiB,GAAG,QAAQ,GAAG,iBAAiB,CAAC;QACvD,YAAY,GAAG,WAAW,CAAC;QAE3B,uEAAuE;QACvE,uEAAuE;QACvE,+DAA+D;QAC/D,iEAAiE;QACjE,oEAAoE;QAEpE,2DAA2D;QAC3D,IAAI,iBAAiB,GAAG,WAAW,EAAE;YACnC,OAAO;SACR;QAED,yEAAyE;QACzE,gCAAgC;QAChC,IAAI,iBAAiB,GAAG,WAAW,EAAE;YACnC,UAAU,CAAC,WAAW,CAAC,CAAC;SACzB;QAED,yEAAyE;QACzE,uEAAuE;QACvE,yEAAyE;QACzE,iBAAiB;QACjB,IAAI,iBAAiB,GAAG,CAAC,EAAE;YACzB,oBAAoB,EAAE,CAAC;SACxB;IACH,CAAC;IAED,SAAS,IAAI;QACX,OAAO,GAAG,IAAI,CAAC;QACf,IAAI,OAAO,EAAE;YACX,YAAY,CAAC,OAAO,CAAC,CAAC;YACtB,OAAO,GAAG,SAAS,CAAC;SACrB;QAED,YAAY,GAAG,CAAC,CAAC;QACjB,YAAY,GAAG,CAAC,CAAC;IACnB,CAAC;IAED,SAAS,UAAU,CAAC,EAAW;QAC7B,IAAI,OAAO;YAAE,OAAO;QACpB,IAAI,OAAO,EAAE;YACX,YAAY,CAAC,OAAO,CAAC,CAAC;SACvB;QAED,OAAO,GAAG,UAAU,CAAC,oBAAoB,EAAE,EAAE,IAAI,QAAQ,CAAC,CAAC;IAC7D,CAAC;IAED,SAAS,oBAAoB;QAC3B,YAAY,GAAG,CAAC,CAAC;QACjB,YAAY,GAAG,KAAK,EAAE,CAAC;QAEvB,EAAE,CAAC,GAAG,CAAC,EAAE;YACP,IAAI,GAAG;gBAAE,MAAM,GAAG,CAAC;YACnB,UAAU,CAAC,QAAQ,CAAC,CAAC;QACvB,CAAC,CAAC,CAAC;IACL,CAAC;IAED,IAAI,SAAS,EAAE;QACb,oBAAoB,EAAE,CAAC;KACxB;SAAM;QACL,YAAY,GAAG,KAAK,EAAE,CAAC;QACvB,UAAU,CAAC,SAAS,CAAC,CAAC;KACvB;IAED,OAAO,EAAE,IAAI,EAAE,IAAI,EAAE,CAAC;AACxB,CAAC;AAtFD,wEAsFC;AAED,gBAAgB;AAChB,SAAgB,kBAAkB,CAAC,GAA0B;IAC3D,IAAI,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,EAAE;QACtB,KAAK,MAAM,QAAQ,IAAI,GAAG,EAAE;YAC1B,IAAI,kBAAkB,CAAC,QAAQ,CAAC,EAAE;gBAChC,OAAO,IAAI,CAAC;aACb;SACF;QACD,OAAO,KAAK,CAAC;KACd;IAED,MAAM,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,GAAG,CAAC,CAAC;IAC9B,OAAO,IAAI,CAAC,MAAM,
GAAG,CAAC,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,KAAK,GAAG,CAAC;AAC/C,CAAC;AAZD,gDAYC;AAED;;;;GAIG;AACH,SAAgB,cAAc,CAC5B,MAAmC,EACnC,OAAW;;IAEX,MAAM,MAAM,GAAM,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,OAAO,EAAE,yBAAkB,CAAC,OAAO,EAAE,MAAM,CAAC,CAAC,CAAC;IAElF,8EAA8E;IAC9E,MAAM,OAAO,GAAG,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,OAAO,CAAC;IACjC,IAAI,CAAC,CAAA,OAAO,aAAP,OAAO,uBAAP,OAAO,CAAE,aAAa,EAAE,CAAA,EAAE;QAC7B,MAAM,WAAW,GAAG,MAAA,0BAAW,CAAC,WAAW,CAAC,OAAO,CAAC,mCAAI,MAAM,aAAN,MAAM,uBAAN,MAAM,CAAE,WAAW,CAAC;QAC5E,IAAI,WAAW,EAAE;YACf,MAAM,CAAC,WAAW,GAAG,WAAW,CAAC;SAClC;QAED,MAAM,YAAY,GAAG,MAAA,4BAAY,CAAC,WAAW,CAAC,OAAO,CAAC,mCAAI,MAAM,aAAN,MAAM,uBAAN,MAAM,CAAE,YAAY,CAAC;QAC/E,IAAI,YAAY,EAAE;YAChB,MAAM,CAAC,YAAY,GAAG,YAAY,CAAC;SACpC;KACF;IAED,MAAM,cAAc,GAAG,MAAA,gCAAc,CAAC,WAAW,CAAC,OAAO,CAAC,mCAAI,MAAM,aAAN,MAAM,uBAAN,MAAM,CAAE,cAAc,CAAC;IACrF,IAAI,cAAc,EAAE;QAClB,MAAM,CAAC,cAAc,GAAG,cAAc,CAAC;KACxC;IAED,OAAO,MAAM,CAAC;AAChB,CAAC;AA1BD,wCA0BC;AAED,SAAgB,UAAU,CAAC,GAAqB,EAAE,MAAwB;IACxE,GAAG,GAAG,KAAK,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,IAAI,GAAG,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC;IAC9C,MAAM,GAAG,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,MAAM,CAAC;IAC1D,KAAK,MAAM,IAAI,IAAI,MAAM,EAAE;QACzB,IAAI,CAAC,GAAG,CAAC,GAAG,CAAC,IAAI,CAAC,EAAE;YAClB,OAAO,KAAK,CAAC;SACd;KACF;IACD,OAAO,IAAI,CAAC;AACd,CAAC;AATD,gCASC;AAED,SAAgB,aAAa,CAAC,IAAmB,EAAE,IAAmB;IACpE,MAAM,UAAU,GAAG,IAAI,GAAG,CAAC,IAAI,CAAC,CAAC;IACjC,KAAK,MAAM,IAAI,IAAI,IAAI,EAAE;QACvB,UAAU,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;KACzB;IACD,OAAO,UAAU,CAAC;AACpB,CAAC;AAND,sCAMC;AAOD,SAAgB,QAAQ,CACtB,KAAc,EACd,eAAqC,SAAS;IAE9C,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC;IAC3C,MAAM,cAAc,GAAG,MAAM,CAAC,SAAS,CAAC,cAAc,CAAC;IACvD,MAAM,QAAQ,GAAG,CAAC,CAAU,EAAE,EAAE,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,iBAAiB,CAAC;IACxE,IAAI,CAAC,QAAQ,CAAC,KAAK,CAAC,EAAE;QACpB,OAAO,KAAK,CAAC;KACd;IAED,MAAM,IAAI,GAAI,KAAa,CAAC,WAAW,CAAC;IACxC,IAAI,IAAI,IAAI,IAAI,CAAC,SAAS,EAAE;QAC1B,IAAI,CAAC,QAAQ,CAAC,IAAI,CAAC,SAAS,CAAC,EAAE;YAC7B,OAAO,KAAK,CAAC;SACd;QAED,4DAA4D;QAC5D,IAAI,CAAC,cAAc,CAAC,IAAI,CAAC,IAAI,CAAC,SAAS,EAAE,eAAe,CAAC,EAAE;YACzD,OAAO,KAAK,CAAC;SACd;KACF;IAED,IAAI,YAAY,EAAE;QAChB,MAAM,IAAI,GAAG,MAAM,CAAC,IAAI,CAAC,KAA4B,CAAC,CAAC;QACvD,OAAO,UAAU,CAAC,IAAI,EAAE,YAAY,CAAC,CAAC;KACvC;IAED,OAAO,IAAI,CAAC;AACd,CAAC;AA7BD,4BA6BC;AAED;;;;;;GAMG;AACH,SAAgB,QAAQ,CAAgB,KAAQ;IAC9C,IAAI,KAAK,IAAI,IAAI,EAAE;QACjB,OAAO,KAAK,CAAC;KACd;SAAM,IAAI,KAAK,CAAC,OAAO,CAAC,KAAK,CAAC,EAAE;QAC/B,OAAO,KAAK,CAAC,GAAG,CAAC,IAAI,CAAC,EAAE,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAM,CAAC;KAC/C;SAAM,IAAI,QAAQ,CAAC,KAAK,CAAC,EAAE;QAC1B,MAAM,GAAG,GAAG,EAAS,CAAC;QACtB,KAAK,MAAM,GAAG,IAAI,KAAK,EAAE;YACvB,GAAG,CAAC,GAAG,CAAC,GAAG,QAAQ,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,CAAC;SACjC;QACD,OAAO,GAAG,CAAC;KACZ;IAED,MAAM,IAAI,GAAI,KAAa,CAAC,WAAW,CAAC;IACxC,IAAI,IAAI,EAAE;QACR,QAAQ,IAAI,CAAC,IAAI,CAAC,WAAW,EAAE,EAAE;YAC/B,KAAK,MAAM;gBACT,OAAO,IAAI,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC;YACjC,KAAK,KAAK;gBACR,OAAO,IAAI,GAAG,CAAC,KAAY,CAAM,CAAC;YACpC,KAAK,KAAK;gBACR,OAAO,IAAI,GAAG,CAAC,KAAY,CAAM,CAAC;YACpC,KAAK,QAAQ;gBACX,OAAO,MAAM,CAAC,IAAI,CAAC,KAAe,CAAM,CAAC;SAC5C;KACF;IAED,OAAO,KAAK,CAAC;AACf,CAAC;AA5BD,4BA4BC;AAED,gBAAgB;AAChB,MAAM,QAAQ,GAAG,MAAM,CAAC,SAAS,CAAC,CAAC;AACnC,gBAAgB;AAChB,MAAM,OAAO,GAAG,MAAM,CAAC,QAAQ,CAAC,CAAC;AAEjC;;;GAGG;AACH,MAAa,UAAU;IAIrB;QACE,IAAI,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC;QACpB,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;IACpB,CAAC;IAED,IAAI,MAAM;QACR,OAAO,IAAI,CAAC,OAAO,CAAC,CAAC;IACvB,CAAC;IAED,qDAAqD;IACrD,MAAM,CAAC,MAAc;QACnB,IAAI,CAAC,QAAQ,CAAC,CAAC,IAAI,CAAC,MAAM,CAAC,CAAC;Q
AC5B,IAAI,CAAC,OAAO,CAAC,IAAI,MAAM,CAAC,MAAM,CAAC;IACjC,CAAC;IAED,mEAAmE;IACnE,IAAI,CAAC,IAAY;QACf,OAAO,IAAI,CAAC,IAAI,CAAC,IAAI,EAAE,KAAK,CAAC,CAAC;IAChC,CAAC;IAED,qEAAqE;IACrE,IAAI,CAAC,IAAY,EAAE,OAAO,GAAG,IAAI;QAC/B,IAAI,OAAO,IAAI,KAAK,QAAQ,IAAI,IAAI,GAAG,CAAC,EAAE;YACxC,MAAM,IAAI,iCAAyB,CAAC,+CAA+C,CAAC,CAAC;SACtF;QAED,IAAI,IAAI,GAAG,IAAI,CAAC,OAAO,CAAC,EAAE;YACxB,OAAO,MAAM,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC;SACxB;QAED,IAAI,MAAc,CAAC;QAEnB,wBAAwB;QACxB,IAAI,IAAI,KAAK,IAAI,CAAC,MAAM,EAAE;YACxB,MAAM,GAAG,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC;YAEvC,IAAI,OAAO,EAAE;gBACX,IAAI,CAAC,QAAQ,CAAC,GAAG,EAAE,CAAC;gBACpB,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,CAAC;aACnB;SACF;QAED,iDAAiD;aAC5C,IAAI,IAAI,IAAI,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE;YACzC,MAAM,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,CAAC,EAAE,IAAI,CAAC,CAAC;YAC1C,IAAI,OAAO,EAAE;gBACX,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,CAAC,KAAK,CAAC,IAAI,CAAC,CAAC;gBAClD,IAAI,CAAC,OAAO,CAAC,IAAI,IAAI,CAAC;aACvB;SACF;QAED,sDAAsD;aACjD;YACH,MAAM,GAAG,MAAM,CAAC,WAAW,CAAC,IAAI,CAAC,CAAC;YAElC,IAAI,GAAG,CAAC;YACR,IAAI,MAAM,GAAG,CAAC,CAAC;YACf,IAAI,WAAW,GAAG,IAAI,CAAC;YACvB,KAAK,GAAG,GAAG,CAAC,EAAE,GAAG,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,MAAM,EAAE,EAAE,GAAG,EAAE;gBAChD,IAAI,WAAW,CAAC;gBAChB,IAAI,WAAW,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,CAAC,CAAC,MAAM,EAAE;oBAC5C,WAAW,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,MAAM,EAAE,MAAM,EAAE,CAAC,CAAC,CAAC;oBAC1D,MAAM,IAAI,WAAW,CAAC;iBACvB;qBAAM;oBACL,WAAW,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,MAAM,EAAE,MAAM,EAAE,CAAC,EAAE,WAAW,CAAC,CAAC;oBACvE,IAAI,OAAO,EAAE;wBACX,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,GAAG,CAAC,CAAC,KAAK,CAAC,WAAW,CAAC,CAAC;qBAC9D;oBACD,MAAM,IAAI,WAAW,CAAC;oBACtB,MAAM;iBACP;gBAED,WAAW,IAAI,WAAW,CAAC;aAC5B;YAED,oCAAoC;YACpC,IAAI,OAAO,EAAE;gBACX,IAAI,CAAC,QAAQ,CAAC,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC;gBAC3C,IAAI,CAAC,OAAO,CAAC,IAAI,IAAI,CAAC;aACvB;SACF;QAED,OAAO,MAAM,CAAC;IAChB,CAAC;CACF;AAxFD,gCAwFC;AAED,cAAc;AACd,MAAa,WAAW;IAQtB,YAAY,UAAkB;QAC5B,MAAM,WAAW,GAAG,UAAU,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC,CAAC,uCAAuC;QAC9F,MAAM,EAAE,QAAQ,EAAE,IAAI,EAAE,GAAG,IAAI,SAAG,CAAC,aAAa,WAAW,EAAE,CAAC,CAAC;QAE/D,IAAI,QAAQ,CAAC,QAAQ,CAAC,OAAO,CAAC,EAAE;YAC9B,gEAAgE;YAChE,IAAI,CAAC,UAAU,GAAG,kBAAkB,CAAC,QAAQ,CAAC,CAAC;SAChD;aAAM,IAAI,OAAO,QAAQ,KAAK,QAAQ,EAAE;YACvC,IAAI,CAAC,MAAM,GAAG,KAAK,CAAC;YAEpB,IAAI,UAAU,GAAG,kBAAkB,CAAC,QAAQ,CAAC,CAAC,WAAW,EAAE,CAAC;YAC5D,IAAI,UAAU,CAAC,UAAU,CAAC,GAAG,CAAC,IAAI,UAAU,CAAC,QAAQ,CAAC,GAAG,CAAC,EAAE;gBAC1D,IAAI,CAAC,MAAM,GAAG,IAAI,CAAC;gBACnB,UAAU,GAAG,UAAU,CAAC,SAAS,CAAC,CAAC,EAAE,QAAQ,CAAC,MAAM,GAAG,CAAC,CAAC,CAAC;aAC3D;YAED,IAAI,CAAC,IAAI,GAAG,UAAU,CAAC,WAAW,EAAE,CAAC;YAErC,IAAI,OAAO,IAAI,KAAK,QAAQ,EAAE;gBAC5B,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;iBAAM,IAAI,OAAO,IAAI,KAAK,QAAQ,IAAI,IAAI,KAAK,EAAE,EAAE;gBAClD,IAAI,CAAC,IAAI,GAAG,MAAM,CAAC,QAAQ,CAAC,IAAI,EAAE,EAAE,CAAC,CAAC;aACvC;iBAAM;gBACL,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;aACnB;YAED,IAAI,IAAI,CAAC,IAAI,KAAK,CAAC,EAAE;gBACnB,MAAM,IAAI,uBAAe,CAAC,mCAAmC,CAAC,CAAC;aAChE;SACF;aAAM;YACL,MAAM,IAAI,iCAAyB,CAAC,4CAA4C,CAAC,CAAC;SACnF;QACD,MAAM,CAAC,MAAM,CAAC,IAAI,CAAC,CAAC;IACtB,CAAC;IAED;;OAEG;IACH,QAAQ,CAAC,YAAY,GAAG,KAAK;QAC3B,IAAI,OAAO,IAAI,CAAC,IAAI,KAAK,QAAQ,EAAE;YACjC,IAAI,IAAI,CAAC,MAAM,IAAI,YAAY,EAAE;gBAC/B,OAAO,IAAI,IAAI,CAAC,IAAI,KAAK,IAAI,CAAC,IAAI,EAAE,CAAC;aACtC;YACD,OAAO,GAAG,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,IAAI,EAAE,CAAC;SACpC;QACD,OAAO,GAAG,IAAI,CAAC,UAAU,EAAE,CAAC;IAC9B,CAAC;IAED,
MAAM,CAAC,UAAU,CAAC,CAAS;QACzB,OAAO,IAAI,WAAW,CAAC,CAAC,CAAC,CAAC;IAC5B,CAAC;CACF;AA3DD,kCA2DC;AAEY,QAAA,kBAAkB,GAAG;IAChC,6DAA6D;IAC7D,QAAQ;QACN,OAAO,IAAI,eAAQ,EAAE,CAAC;IACxB,CAAC;CACF,CAAC;AAEF;;;;;;;;;;GAUG;AACU,QAAA,oBAAoB,GAAG,gBAAyB,CAAC;AAE9D,gBAAgB;AAChB,SAAgB,WAAW,CAAC,OAAe;IACzC,OAAO,OAAO,CAAC,WAAW,CAAC,OAAO,EAAE,EAAE,IAAI,EAAE,4BAAoB,EAAS,CAAC,CAAC;AAC7E,CAAC;AAFD,kCAEC;AAED,MAAM,eAAe,GAAG,IAAI,GAAG,EAAE,CAAC;AAClC;;;;;GAKG;AACH,SAAgB,eAAe,CAAC,OAAe;IAC7C,IAAI,CAAC,eAAe,CAAC,GAAG,CAAC,OAAO,CAAC,EAAE;QACjC,eAAe,CAAC,GAAG,CAAC,OAAO,CAAC,CAAC;QAC7B,OAAO,WAAW,CAAC,OAAO,CAAC,CAAC;KAC7B;AACH,CAAC;AALD,0CAKC;AAED;;GAEG;AACH,SAAgB,YAAY,CAAC,EAA2B;IACtD,OAAO,MAAM,CAAC,MAAM,CAAC,EAAE,CAAC,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC;AACtC,CAAC;AAFD,oCAEC;AAED;;;;GAIG;AACH,SAAgB,uBAAuB,CAAC,MAAc;IACpD,OAAO,CACL,CAAC,CAAC,MAAM,CAAC,YAAY;QACrB,CAAC,MAAM,CAAC,WAAW,CAAC,cAAc,IAAI,CAAC;YACrC,CAAC,CAAC,MAAM,CAAC,WAAW,CAAC,4BAA4B;YACjD,MAAM,CAAC,WAAW,CAAC,IAAI,KAAK,mBAAU,CAAC,UAAU,CAAC,CACrD,CAAC;AACJ,CAAC;AAPD,0DAOC;AAED,SAAgB,mBAAmB,CAAC,EAAE,OAAO,EAAuB;IAKlE,MAAM,CAAC,KAAK,EAAE,KAAK,EAAE,KAAK,CAAC,GAAG,OAAO,CAAC,KAAK,CAAC,GAAG,CAAC,CAAC,GAAG,CAAC,CAAC,CAAS,EAAE,EAAE,CAAC,MAAM,CAAC,QAAQ,CAAC,CAAC,EAAE,EAAE,CAAC,CAAC,CAAC;IAC5F,OAAO,EAAE,KAAK,EAAE,KAAK,EAAE,KAAK,EAAE,CAAC;AACjC,CAAC;AAPD,kDAOC"} \ No newline at end of file diff --git a/node_modules/mongodb/lib/write_concern.js b/node_modules/mongodb/lib/write_concern.js new file mode 100644 index 000000000..4816b00eb --- /dev/null +++ b/node_modules/mongodb/lib/write_concern.js @@ -0,0 +1,71 @@ +"use strict"; +Object.defineProperty(exports, "__esModule", { value: true }); +exports.WriteConcern = exports.WRITE_CONCERN_KEYS = void 0; +exports.WRITE_CONCERN_KEYS = ['w', 'wtimeout', 'j', 'journal', 'fsync']; +/** + * A MongoDB WriteConcern, which describes the level of acknowledgement + * requested from MongoDB for write operations. + * @public + * + * @see https://docs.mongodb.com/manual/reference/write-concern/ + */ +class WriteConcern { + /** + * Constructs a WriteConcern from the write concern properties. + * @param w - request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. + * @param wtimeout - specify a time limit to prevent write operations from blocking indefinitely + * @param j - request acknowledgment that the write operation has been written to the on-disk journal + * @param fsync - equivalent to the j option + */ + constructor(w, wtimeout, j, fsync) { + if (w != null) { + if (!Number.isNaN(Number(w))) { + this.w = Number(w); + } + else { + this.w = w; + } + } + if (wtimeout != null) { + this.wtimeout = wtimeout; + } + if (j != null) { + this.j = j; + } + if (fsync != null) { + this.fsync = fsync; + } + } + /** Construct a WriteConcern given an options object. */ + static fromOptions(options, inherit) { + if (options == null) + return undefined; + inherit = inherit !== null && inherit !== void 0 ? inherit : {}; + let opts; + if (typeof options === 'string' || typeof options === 'number') { + opts = { w: options }; + } + else if (options instanceof WriteConcern) { + opts = options; + } + else { + opts = options.writeConcern; + } + const parentOpts = inherit instanceof WriteConcern ? 
inherit : inherit.writeConcern; + const { w = undefined, wtimeout = undefined, j = undefined, fsync = undefined, journal = undefined, wtimeoutMS = undefined } = { + ...parentOpts, + ...opts + }; + if (w != null || + wtimeout != null || + wtimeoutMS != null || + j != null || + journal != null || + fsync != null) { + return new WriteConcern(w, wtimeout !== null && wtimeout !== void 0 ? wtimeout : wtimeoutMS, j !== null && j !== void 0 ? j : journal, fsync); + } + return undefined; + } +} +exports.WriteConcern = WriteConcern; +//# sourceMappingURL=write_concern.js.map \ No newline at end of file diff --git a/node_modules/mongodb/lib/write_concern.js.map b/node_modules/mongodb/lib/write_concern.js.map new file mode 100644 index 000000000..c2e21b7e1 --- /dev/null +++ b/node_modules/mongodb/lib/write_concern.js.map @@ -0,0 +1 @@ +{"version":3,"file":"write_concern.js","sourceRoot":"","sources":["../src/write_concern.ts"],"names":[],"mappings":";;;AA2Ba,QAAA,kBAAkB,GAAG,CAAC,GAAG,EAAE,UAAU,EAAE,GAAG,EAAE,SAAS,EAAE,OAAO,CAAC,CAAC;AAE7E;;;;;;GAMG;AACH,MAAa,YAAY;IAUvB;;;;;;OAMG;IACH,YAAY,CAAK,EAAE,QAAiB,EAAE,CAAW,EAAE,KAAmB;QACpE,IAAI,CAAC,IAAI,IAAI,EAAE;YACb,IAAI,CAAC,MAAM,CAAC,KAAK,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,EAAE;gBAC5B,IAAI,CAAC,CAAC,GAAG,MAAM,CAAC,CAAC,CAAC,CAAC;aACpB;iBAAM;gBACL,IAAI,CAAC,CAAC,GAAG,CAAC,CAAC;aACZ;SACF;QACD,IAAI,QAAQ,IAAI,IAAI,EAAE;YACpB,IAAI,CAAC,QAAQ,GAAG,QAAQ,CAAC;SAC1B;QACD,IAAI,CAAC,IAAI,IAAI,EAAE;YACb,IAAI,CAAC,CAAC,GAAG,CAAC,CAAC;SACZ;QACD,IAAI,KAAK,IAAI,IAAI,EAAE;YACjB,IAAI,CAAC,KAAK,GAAG,KAAK,CAAC;SACpB;IACH,CAAC;IAED,wDAAwD;IACxD,MAAM,CAAC,WAAW,CAChB,OAAgD,EAChD,OAA4C;QAE5C,IAAI,OAAO,IAAI,IAAI;YAAE,OAAO,SAAS,CAAC;QACtC,OAAO,GAAG,OAAO,aAAP,OAAO,cAAP,OAAO,GAAI,EAAE,CAAC;QACxB,IAAI,IAAqD,CAAC;QAC1D,IAAI,OAAO,OAAO,KAAK,QAAQ,IAAI,OAAO,OAAO,KAAK,QAAQ,EAAE;YAC9D,IAAI,GAAG,EAAE,CAAC,EAAE,OAAO,EAAE,CAAC;SACvB;aAAM,IAAI,OAAO,YAAY,YAAY,EAAE;YAC1C,IAAI,GAAG,OAAO,CAAC;SAChB;aAAM;YACL,IAAI,GAAG,OAAO,CAAC,YAAY,CAAC;SAC7B;QACD,MAAM,UAAU,GACd,OAAO,YAAY,YAAY,CAAC,CAAC,CAAC,OAAO,CAAC,CAAC,CAAC,OAAO,CAAC,YAAY,CAAC;QAEnE,MAAM,EACJ,CAAC,GAAG,SAAS,EACb,QAAQ,GAAG,SAAS,EACpB,CAAC,GAAG,SAAS,EACb,KAAK,GAAG,SAAS,EACjB,OAAO,GAAG,SAAS,EACnB,UAAU,GAAG,SAAS,EACvB,GAAG;YACF,GAAG,UAAU;YACb,GAAG,IAAI;SACR,CAAC;QACF,IACE,CAAC,IAAI,IAAI;YACT,QAAQ,IAAI,IAAI;YAChB,UAAU,IAAI,IAAI;YAClB,CAAC,IAAI,IAAI;YACT,OAAO,IAAI,IAAI;YACf,KAAK,IAAI,IAAI,EACb;YACA,OAAO,IAAI,YAAY,CAAC,CAAC,EAAE,QAAQ,aAAR,QAAQ,cAAR,QAAQ,GAAI,UAAU,EAAE,CAAC,aAAD,CAAC,cAAD,CAAC,GAAI,OAAO,EAAE,KAAK,CAAC,CAAC;SACzE;QACD,OAAO,SAAS,CAAC;IACnB,CAAC;CACF;AA7ED,oCA6EC"} \ No newline at end of file diff --git a/node_modules/mongodb/mongodb.d.ts b/node_modules/mongodb/mongodb.d.ts new file mode 100644 index 000000000..d1ae58c20 --- /dev/null +++ b/node_modules/mongodb/mongodb.d.ts @@ -0,0 +1,5944 @@ +/// + +import { Binary } from 'bson'; +import { BSONRegExp } from 'bson'; +import { BSONSymbol } from 'bson'; +import { Code } from 'bson'; +import type { ConnectionOptions as ConnectionOptions_2 } from 'tls'; +import { DBRef } from 'bson'; +import { Decimal128 } from 'bson'; +import Denque = require('denque'); +import type { deserialize as deserialize_2 } from 'bson'; +import type { DeserializeOptions } from 'bson'; +import * as dns from 'dns'; +import { Document } from 'bson'; +import { Double } from 'bson'; +import { Duplex } from 'stream'; +import { DuplexOptions } from 'stream'; +import { EventEmitter } from 'events'; +import { Int32 } from 'bson'; +import { Long } from 'bson'; +import { Map as Map_2 } from 'bson'; +import { MaxKey } from 'bson'; 
+import { MinKey } from 'bson'; +import { ObjectId } from 'bson'; +import { Readable } from 'stream'; +import type { serialize as serialize_2 } from 'bson'; +import type { SerializeOptions } from 'bson'; +import type { Socket } from 'net'; +import type { TcpNetConnectOpts } from 'net'; +import { Timestamp } from 'bson'; +import type { TLSSocket } from 'tls'; +import type { TLSSocketOptions } from 'tls'; +import { Writable } from 'stream'; + +/** @public */ +export declare abstract class AbstractCursor extends TypedEventEmitter { + /* Excluded from this release type: [kId] */ + /* Excluded from this release type: [kSession] */ + /* Excluded from this release type: [kServer] */ + /* Excluded from this release type: [kNamespace] */ + /* Excluded from this release type: [kDocuments] */ + /* Excluded from this release type: [kTopology] */ + /* Excluded from this release type: [kTransform] */ + /* Excluded from this release type: [kInitialized] */ + /* Excluded from this release type: [kClosed] */ + /* Excluded from this release type: [kKilled] */ + /* Excluded from this release type: [kOptions] */ + /** @event */ + static readonly CLOSE: "close"; + /* Excluded from this release type: __constructor */ + get id(): Long | undefined; + /* Excluded from this release type: topology */ + /* Excluded from this release type: server */ + get namespace(): MongoDBNamespace; + get readPreference(): ReadPreference; + get readConcern(): ReadConcern | undefined; + /* Excluded from this release type: session */ + /* Excluded from this release type: session */ + /* Excluded from this release type: cursorOptions */ + get closed(): boolean; + get killed(): boolean; + get loadBalanced(): boolean; + /** Returns current buffered documents length */ + bufferedCount(): number; + /** Returns current buffered documents */ + readBufferedDocuments(number?: number): TSchema[]; + [Symbol.asyncIterator](): AsyncIterator; + stream(options?: CursorStreamOptions): Readable; + hasNext(): Promise; + hasNext(callback: Callback): void; + /** Get the next available document from the cursor, returns null if no more documents are available. */ + next(): Promise; + next(callback: Callback): void; + next(callback?: Callback): Promise | void; + /** + * Try to get the next available document from the cursor or `null` if an empty batch is returned + */ + tryNext(): Promise; + tryNext(callback: Callback): void; + /** + * Iterates over all the documents for this cursor using the iterator, callback pattern. + * + * @param iterator - The iteration callback. + * @param callback - The end callback. + */ + forEach(iterator: (doc: TSchema) => boolean | void): Promise; + forEach(iterator: (doc: TSchema) => boolean | void, callback: Callback): void; + close(): void; + close(callback: Callback): void; + /** + * @deprecated options argument is deprecated + */ + close(options: CursorCloseOptions): Promise; + /** + * @deprecated options argument is deprecated + */ + close(options: CursorCloseOptions, callback: Callback): void; + /** + * Returns an array of documents. The caller is responsible for making sure that there + * is enough memory to store the results. Note that the array only contains partial + * results when this cursor had been previously accessed. In that case, + * cursor.rewind() can be used to reset the cursor. + * + * @param callback - The result callback. 
+ */ + toArray(): Promise; + toArray(callback: Callback): void; + /** + * Add a cursor flag to the cursor + * + * @param flag - The flag to set, must be one of following ['tailable', 'oplogReplay', 'noCursorTimeout', 'awaitData', 'partial' -. + * @param value - The flag boolean value. + */ + addCursorFlag(flag: CursorFlag, value: boolean): this; + /** + * Map all documents using the provided function + * If there is a transform set on the cursor, that will be called first and the result passed to + * this function's transform. + * + * @remarks + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling map, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor = coll.find(); + * const mappedCursor: FindCursor = cursor.map(doc => Object.keys(doc).length); + * const keyCounts: number[] = await mappedCursor.toArray(); // cursor.toArray() still returns Document[] + * ``` + * @param transform - The mapping transformation method. + */ + map(transform: (doc: TSchema) => T): AbstractCursor; + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadPreference(readPreference: ReadPreferenceLike): this; + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadConcern(readConcern: ReadConcernLike): this; + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value: number): this; + /** + * Set the batch size for the cursor. + * + * @param value - The number of documents to return per batch. See {@link https://docs.mongodb.com/manual/reference/command/find/|find command documentation}. + */ + batchSize(value: number): this; + /** + * Rewind this cursor to its uninitialized state. Any options that are present on the cursor will + * remain in effect. Iterating this cursor will cause new queries to be sent to the server, even + * if the resultant data has already been retrieved by this cursor. 
+ */ + rewind(): void; + /** + * Returns a new uninitialized copy of this cursor, with options matching those that have been set on the current instance + */ + abstract clone(): AbstractCursor; + /* Excluded from this release type: _initialize */ + /* Excluded from this release type: _getMore */ +} + +/** @public */ +export declare type AbstractCursorEvents = { + [AbstractCursor.CLOSE](): void; +}; + +/** @public */ +export declare interface AbstractCursorOptions extends BSONSerializeOptions { + session?: ClientSession; + readPreference?: ReadPreferenceLike; + readConcern?: ReadConcernLike; + batchSize?: number; + maxTimeMS?: number; + comment?: Document | string; + tailable?: boolean; + awaitData?: boolean; + noCursorTimeout?: boolean; +} + +/* Excluded from this release type: AbstractOperation */ + +/** @public */ +export declare type AcceptedFields = { + readonly [key in KeysOfAType]?: AssignableType; +}; + +/** @public */ +export declare type AddToSetOperators = { + $each?: Array>; +}; + +/** @public */ +export declare interface AddUserOptions extends CommandOperationOptions { + /** @deprecated Please use db.command('createUser', ...) instead for this option */ + digestPassword?: null; + /** Roles associated with the created user */ + roles?: string | string[] | RoleSpecification | RoleSpecification[]; + /** Custom data associated with the user (only Mongodb 2.6 or higher) */ + customData?: Document; +} + +/** + * The **Admin** class is an internal class that allows convenient access to + * the admin functionality and commands for MongoDB. + * + * **ADMIN Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Use the admin database for the operation + * const adminDb = client.db(dbName).admin(); + * + * // List all the available databases + * adminDb.listDatabases(function(err, dbs) { + * expect(err).to.not.exist; + * test.ok(dbs.databases.length > 0); + * client.close(); + * }); + * }); + * ``` + */ +export declare class Admin { + /* Excluded from this release type: s */ + /* Excluded from this release type: __constructor */ + /** + * Execute a command + * + * @param command - The command to execute + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + command(command: Document): Promise; + command(command: Document, callback: Callback): void; + command(command: Document, options: RunCommandOptions): Promise; + command(command: Document, options: RunCommandOptions, callback: Callback): void; + /** + * Retrieve the server build information + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + buildInfo(): Promise; + buildInfo(callback: Callback): void; + buildInfo(options: CommandOperationOptions): Promise; + buildInfo(options: CommandOperationOptions, callback: Callback): void; + /** + * Retrieve the server build information + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + serverInfo(): Promise; + serverInfo(callback: Callback): void; + serverInfo(options: 
CommandOperationOptions): Promise; + serverInfo(options: CommandOperationOptions, callback: Callback): void; + /** + * Retrieve this db's server status. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + serverStatus(): Promise; + serverStatus(callback: Callback): void; + serverStatus(options: CommandOperationOptions): Promise; + serverStatus(options: CommandOperationOptions, callback: Callback): void; + /** + * Ping the MongoDB server and retrieve results + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + ping(): Promise; + ping(callback: Callback): void; + ping(options: CommandOperationOptions): Promise; + ping(options: CommandOperationOptions, callback: Callback): void; + /** + * Add a user to the database + * + * @param username - The username for the new user + * @param password - An optional password for the new user + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + addUser(username: string): Promise; + addUser(username: string, callback: Callback): void; + addUser(username: string, password: string): Promise; + addUser(username: string, password: string, callback: Callback): void; + addUser(username: string, options: AddUserOptions): Promise; + addUser(username: string, options: AddUserOptions, callback: Callback): void; + addUser(username: string, password: string, options: AddUserOptions): Promise; + addUser(username: string, password: string, options: AddUserOptions, callback: Callback): void; + /** + * Remove a user from a database + * + * @param username - The username to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + removeUser(username: string): Promise; + removeUser(username: string, callback: Callback): void; + removeUser(username: string, options: RemoveUserOptions): Promise; + removeUser(username: string, options: RemoveUserOptions, callback: Callback): void; + /** + * Validate an existing collection + * + * @param collectionName - The name of the collection to validate. 
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + validateCollection(collectionName: string): Promise; + validateCollection(collectionName: string, callback: Callback): void; + validateCollection(collectionName: string, options: ValidateCollectionOptions): Promise; + validateCollection(collectionName: string, options: ValidateCollectionOptions, callback: Callback): void; + /** + * List the available databases + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + listDatabases(): Promise; + listDatabases(callback: Callback): void; + listDatabases(options: ListDatabasesOptions): Promise; + listDatabases(options: ListDatabasesOptions, callback: Callback): void; + /** + * Get ReplicaSet status + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + replSetGetStatus(): Promise; + replSetGetStatus(callback: Callback): void; + replSetGetStatus(options: CommandOperationOptions): Promise; + replSetGetStatus(options: CommandOperationOptions, callback: Callback): void; +} + +/* Excluded from this release type: AdminPrivate */ + +/* Excluded from this release type: AggregateOperation */ + +/** @public */ +export declare interface AggregateOptions extends CommandOperationOptions { + /** allowDiskUse lets the server know if it can use disk to store temporary results for the aggregation (requires mongodb 2.6 \>). */ + allowDiskUse?: boolean; + /** The number of documents to return per batch. See [aggregation documentation](https://docs.mongodb.com/manual/reference/command/aggregate). */ + batchSize?: number; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** Return the query as cursor, on 2.6 \> it returns as a real cursor on pre 2.6 it returns as an emulated cursor. */ + cursor?: Document; + /** specifies a cumulative time limit in milliseconds for processing operations on the cursor. MongoDB interrupts the operation at the earliest following interrupt point. */ + maxTimeMS?: number; + /** The maximum amount of time for the server to wait on new documents to satisfy a tailable cursor query. */ + maxAwaitTimeMS?: number; + /** Specify collation. */ + collation?: CollationOptions; + /** Add an index selection hint to an aggregation command */ + hint?: Hint; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; + out?: string; +} + +/** + * The **AggregationCursor** class is an internal class that embodies an aggregation cursor on MongoDB + * allowing for iteration over the results returned from the underlying query. 
It supports + * one by one document iteration, conversion to an array or can be iterated as a Node 4.X + * or higher stream + * @public + */ +export declare class AggregationCursor extends AbstractCursor { + /* Excluded from this release type: [kPipeline] */ + /* Excluded from this release type: [kOptions] */ + /* Excluded from this release type: __constructor */ + get pipeline(): Document[]; + clone(): AggregationCursor; + map(transform: (doc: TSchema) => T): AggregationCursor; + /* Excluded from this release type: _initialize */ + /** Execute the explain for the cursor */ + explain(): Promise; + explain(callback: Callback): void; + explain(verbosity: ExplainVerbosityLike): Promise; + /** Add a group stage to the aggregation pipeline */ + group($group: Document): AggregationCursor; + /** Add a limit stage to the aggregation pipeline */ + limit($limit: number): this; + /** Add a match stage to the aggregation pipeline */ + match($match: Document): this; + /** Add an out stage to the aggregation pipeline */ + out($out: { + db: string; + coll: string; + } | string): this; + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: AggregationCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: AggregationCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. 
+ * Take note of the following example: + * + * @example + * ```typescript + * const cursor: AggregationCursor<{ a: number; b: string }> = coll.aggregate([]); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.aggregate().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project($project: Document): AggregationCursor; + /** Add a lookup stage to the aggregation pipeline */ + lookup($lookup: Document): this; + /** Add a redact stage to the aggregation pipeline */ + redact($redact: Document): this; + /** Add a skip stage to the aggregation pipeline */ + skip($skip: number): this; + /** Add a sort stage to the aggregation pipeline */ + sort($sort: Sort): this; + /** Add a unwind stage to the aggregation pipeline */ + unwind($unwind: Document | string): this; + /** @deprecated Add a geoNear stage to the aggregation pipeline */ + geoNear($geoNear: Document): this; +} + +/** @public */ +export declare interface AggregationCursorOptions extends AbstractCursorOptions, AggregateOptions { +} + +/** + * It is possible to search using alternative types in mongodb e.g. + * string types can be searched using a regex in mongo + * array types can be searched using their element type + * @public + */ +export declare type AlternativeType = T extends ReadonlyArray ? T | RegExpOrString : RegExpOrString; + +/** @public */ +export declare type AnyBulkWriteOperation = { + insertOne: InsertOneModel; +} | { + replaceOne: ReplaceOneModel; +} | { + updateOne: UpdateOneModel; +} | { + updateMany: UpdateManyModel; +} | { + deleteOne: DeleteOneModel; +} | { + deleteMany: DeleteManyModel; +}; + +/** @public */ +export declare type AnyError = MongoError | Error; + +/** @public */ +export declare type ArrayOperator = { + $each?: Array>; + $slice?: number; + $position?: number; + $sort?: Sort; +}; + +/** @public */ +export declare interface Auth { + /** The username for auth */ + username?: string; + /** The password for auth */ + password?: string; +} + +/** @public */ +export declare const AuthMechanism: Readonly<{ + readonly MONGODB_AWS: "MONGODB-AWS"; + readonly MONGODB_CR: "MONGODB-CR"; + readonly MONGODB_DEFAULT: "DEFAULT"; + readonly MONGODB_GSSAPI: "GSSAPI"; + readonly MONGODB_PLAIN: "PLAIN"; + readonly MONGODB_SCRAM_SHA1: "SCRAM-SHA-1"; + readonly MONGODB_SCRAM_SHA256: "SCRAM-SHA-256"; + readonly MONGODB_X509: "MONGODB-X509"; +}>; + +/** @public */ +export declare type AuthMechanism = typeof AuthMechanism[keyof typeof AuthMechanism]; + +/** @public */ +export declare interface AuthMechanismProperties extends Document { + SERVICE_NAME?: string; + SERVICE_REALM?: string; + CANONICALIZE_HOST_NAME?: boolean; + AWS_SESSION_TOKEN?: string; +} + +/** @public */ +export declare interface AutoEncrypter { + new (client: MongoClient, options: AutoEncryptionOptions): AutoEncrypter; + init(cb: Callback): void; + teardown(force: boolean, callback: Callback): void; + encrypt(ns: string, cmd: Document, options: any, callback: Callback): void; + decrypt(cmd: Document, options: any, callback: Callback): void; +} + +/** @public */ +export declare const AutoEncryptionLoggerLevel: Readonly<{ + readonly FatalError: 0; + readonly Error: 1; + readonly Warning: 2; + readonly Info: 3; + readonly Trace: 4; +}>; + +/** @public */ +export declare type AutoEncryptionLoggerLevel = typeof 
AutoEncryptionLoggerLevel[keyof typeof AutoEncryptionLoggerLevel]; + +/** @public */ +export declare interface AutoEncryptionOptions { + /* Excluded from this release type: bson */ + /* Excluded from this release type: metadataClient */ + /** A `MongoClient` used to fetch keys from a key vault */ + keyVaultClient?: MongoClient; + /** The namespace where keys are stored in the key vault */ + keyVaultNamespace?: string; + /** Configuration options that are used by specific KMS providers during key generation, encryption, and decryption. */ + kmsProviders?: { + /** Configuration options for using 'aws' as your KMS provider */ + aws?: { + /** The access key used for the AWS KMS provider */ + accessKeyId: string; + /** The secret access key used for the AWS KMS provider */ + secretAccessKey: string; + /** + * An optional AWS session token that will be used as the + * X-Amz-Security-Token header for AWS requests. + */ + sessionToken?: string; + }; + /** Configuration options for using 'local' as your KMS provider */ + local?: { + /** + * The master key used to encrypt/decrypt data keys. + * A 96-byte long Buffer or base64 encoded string. + */ + key: Buffer | string; + }; + /** Configuration options for using 'azure' as your KMS provider */ + azure?: { + /** The tenant ID identifies the organization for the account */ + tenantId: string; + /** The client ID to authenticate a registered application */ + clientId: string; + /** The client secret to authenticate a registered application */ + clientSecret: string; + /** + * If present, a host with optional port. E.g. "example.com" or "example.com:443". + * This is optional, and only needed if customer is using a non-commercial Azure instance + * (e.g. a government or China account, which use different URLs). + * Defaults to "login.microsoftonline.com" + */ + identityPlatformEndpoint?: string | undefined; + }; + /** Configuration options for using 'gcp' as your KMS provider */ + gcp?: { + /** The service account email to authenticate */ + email: string; + /** A PKCS#8 encrypted key. This can either be a base64 string or a binary representation */ + privateKey: string | Buffer; + /** + * If present, a host with optional port. E.g. "example.com" or "example.com:443". + * Defaults to "oauth2.googleapis.com" + */ + endpoint?: string | undefined; + }; + }; + /** + * A map of namespaces to a local JSON schema for encryption + * + * **NOTE**: Supplying options.schemaMap provides more security than relying on JSON Schemas obtained from the server. + * It protects against a malicious server advertising a false JSON Schema, which could trick the client into sending decrypted data that should be encrypted. + * Schemas supplied in the schemaMap only apply to configuring automatic encryption for client side encryption. + * Other validation rules in the JSON schema will not be enforced by the driver and will result in an error. + */ + schemaMap?: Document; + /** Allows the user to bypass auto encryption, maintaining implicit decryption */ + bypassAutoEncryption?: boolean; + options?: { + /** An optional hook to catch logging messages from the underlying encryption engine */ + logger?: (level: AutoEncryptionLoggerLevel, message: string) => void; + }; + extraOptions?: { + /** + * A local process the driver communicates with to determine how to encrypt values in a command. 
+ * Defaults to "mongodb://%2Fvar%2Fmongocryptd.sock" if domain sockets are available or "mongodb://localhost:27020" otherwise + */ + mongocryptdURI?: string; + /** If true, autoEncryption will not attempt to spawn a mongocryptd before connecting */ + mongocryptdBypassSpawn?: boolean; + /** The path to the mongocryptd executable on the system */ + mongocryptdSpawnPath?: string; + /** Command line arguments to use when auto-spawning a mongocryptd */ + mongocryptdSpawnArgs?: string[]; + }; +} + +/** + * Keeps the state of a unordered batch so we can rewrite the results + * correctly after command execution + * + * @public + */ +export declare class Batch { + originalZeroIndex: number; + currentIndex: number; + originalIndexes: number[]; + batchType: BatchType; + operations: T[]; + size: number; + sizeBytes: number; + constructor(batchType: BatchType, originalZeroIndex: number); +} + +/** @public */ +export declare const BatchType: Readonly<{ + readonly INSERT: 1; + readonly UPDATE: 2; + readonly DELETE: 3; +}>; + +/** @public */ +export declare type BatchType = typeof BatchType[keyof typeof BatchType]; + +export { Binary } + +/** @public */ +export declare type BitwiseFilter = number /** numeric bit mask */ | Binary /** BinData bit mask */ | ReadonlyArray; + +export { BSONRegExp } + +/** + * BSON Serialization options. + * @public + */ +export declare interface BSONSerializeOptions extends Omit, Omit { + /** Return BSON filled buffers from operations */ + raw?: boolean; +} + +export { BSONSymbol } + +/** @public */ +export declare const BSONType: Readonly<{ + readonly double: 1; + readonly string: 2; + readonly object: 3; + readonly array: 4; + readonly binData: 5; + readonly undefined: 6; + readonly objectId: 7; + readonly bool: 8; + readonly date: 9; + readonly null: 10; + readonly regex: 11; + readonly dbPointer: 12; + readonly javascript: 13; + readonly symbol: 14; + readonly javascriptWithScope: 15; + readonly int: 16; + readonly timestamp: 17; + readonly long: 18; + readonly decimal: 19; + readonly minKey: -1; + readonly maxKey: 127; +}>; + +/** @public */ +export declare type BSONType = typeof BSONType[keyof typeof BSONType]; + +/** @public */ +export declare type BSONTypeAlias = keyof typeof BSONType; + +/* Excluded from this release type: BufferPool */ + +/** @public */ +export declare abstract class BulkOperationBase { + isOrdered: boolean; + /* Excluded from this release type: s */ + operationId?: number; + /* Excluded from this release type: __constructor */ + /** + * Add a single insert document to the bulk operation + * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Adds three inserts to the bulkOp. + * bulkOp + * .insert({ a: 1 }) + * .insert({ b: 2 }) + * .insert({ c: 3 }); + * await bulkOp.execute(); + * ``` + */ + insert(document: Document): BulkOperationBase; + /** + * Builds a find operation for an update/updateOne/delete/deleteOne/replaceOne. + * Returns a builder object used to complete the definition of the operation. 
+ * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Add an updateOne to the bulkOp + * bulkOp.find({ a: 1 }).updateOne({ $set: { b: 2 } }); + * + * // Add an updateMany to the bulkOp + * bulkOp.find({ c: 3 }).update({ $set: { d: 4 } }); + * + * // Add an upsert + * bulkOp.find({ e: 5 }).upsert().updateOne({ $set: { f: 6 } }); + * + * // Add a deletion + * bulkOp.find({ g: 7 }).deleteOne(); + * + * // Add a multi deletion + * bulkOp.find({ h: 8 }).delete(); + * + * // Add a replaceOne + * bulkOp.find({ i: 9 }).replaceOne({writeConcern: { j: 10 }}); + * + * // Update using a pipeline (requires Mongodb 4.2 or higher) + * bulk.find({ k: 11, y: { $exists: true }, z: { $exists: true } }).updateOne([ + * { $set: { total: { $sum: [ '$y', '$z' ] } } } + * ]); + * + * // All of the ops will now be executed + * await bulkOp.execute(); + * ``` + */ + find(selector: Document): FindOperators; + /** Specifies a raw operation to perform in the bulk write. */ + raw(op: AnyBulkWriteOperation): this; + get bsonOptions(): BSONSerializeOptions; + get writeConcern(): WriteConcern | undefined; + get batches(): Batch[]; + /** An internal helper method. Do not invoke directly. Will be going away in the future */ + execute(options?: BulkWriteOptions, callback?: Callback): Promise | void; + /* Excluded from this release type: handleWriteError */ + abstract addToOperationsList(batchType: BatchType, document: Document | UpdateStatement | DeleteStatement): this; +} + +/* Excluded from this release type: BulkOperationPrivate */ + +/** @public */ +export declare interface BulkResult { + ok: number; + writeErrors: WriteError[]; + writeConcernErrors: WriteConcernError[]; + insertedIds: Document[]; + nInserted: number; + nUpserted: number; + nMatched: number; + nModified: number; + nRemoved: number; + upserted: Document[]; + opTime?: Document; +} + +/** @public */ +export declare interface BulkWriteOperationError { + index: number; + code: number; + errmsg: string; + errInfo: Document; + op: Document | UpdateStatement | DeleteStatement; +} + +/** @public */ +export declare interface BulkWriteOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails. */ + ordered?: boolean; + /** @deprecated use `ordered` instead */ + keepGoing?: boolean; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; +} + +/** + * @public + * The result of a bulk write. + */ +export declare class BulkWriteResult { + result: BulkResult; + /* Excluded from this release type: __constructor */ + /** Number of documents inserted. */ + get insertedCount(): number; + /** Number of documents matched for update. */ + get matchedCount(): number; + /** Number of documents modified. */ + get modifiedCount(): number; + /** Number of documents deleted. */ + get deletedCount(): number; + /** Number of documents upserted. 
*/ + get upsertedCount(): number; + /** Upserted document generated Id's, hash key is the index of the originating operation */ + get upsertedIds(): { + [key: number]: any; + }; + /** Inserted document generated Id's, hash key is the index of the originating operation */ + get insertedIds(): { + [key: number]: any; + }; + /** Evaluates to true if the bulk operation correctly executes */ + get ok(): number; + /** The number of inserted documents */ + get nInserted(): number; + /** Number of upserted documents */ + get nUpserted(): number; + /** Number of matched documents */ + get nMatched(): number; + /** Number of documents updated physically on disk */ + get nModified(): number; + /** Number of removed documents */ + get nRemoved(): number; + /** Returns an array of all inserted ids */ + getInsertedIds(): Document[]; + /** Returns an array of all upserted ids */ + getUpsertedIds(): Document[]; + /** Returns the upserted id at the given index */ + getUpsertedIdAt(index: number): Document | undefined; + /** Returns raw internal result */ + getRawResponse(): Document; + /** Returns true if the bulk operation contains a write error */ + hasWriteErrors(): boolean; + /** Returns the number of write errors off the bulk operation */ + getWriteErrorCount(): number; + /** Returns a specific write error object */ + getWriteErrorAt(index: number): WriteError | undefined; + /** Retrieve all write errors */ + getWriteErrors(): WriteError[]; + /** Retrieve lastOp if available */ + getLastOp(): Document | undefined; + /** Retrieve the write concern error if one exists */ + getWriteConcernError(): WriteConcernError | undefined; + toJSON(): BulkResult; + toString(): string; + isOk(): boolean; +} + +/** + * MongoDB Driver style callback + * @public + */ +export declare type Callback = (error?: AnyError, result?: T) => void; + +/** @public */ +export declare class CancellationToken extends TypedEventEmitter<{ + cancel(): void; +}> { +} + +/** + * Creates a new Change Stream instance. Normally created using {@link Collection#watch|Collection.watch()}. + * @public + */ +export declare class ChangeStream extends TypedEventEmitter { + pipeline: Document[]; + options: ChangeStreamOptions; + parent: MongoClient | Db | Collection; + namespace: MongoDBNamespace; + type: symbol; + /* Excluded from this release type: cursor */ + streamOptions?: CursorStreamOptions; + /* Excluded from this release type: [kResumeQueue] */ + /* Excluded from this release type: [kCursorStream] */ + /* Excluded from this release type: [kClosed] */ + /* Excluded from this release type: [kMode] */ + /** @event */ + static readonly RESPONSE: "response"; + /** @event */ + static readonly MORE: "more"; + /** @event */ + static readonly INIT: "init"; + /** @event */ + static readonly CLOSE: "close"; + /** + * Fired for each new matching change in the specified namespace. Attaching a `change` + * event listener to a Change Stream will switch the stream into flowing mode. Data will + * then be passed as soon as it is available. + * @event + */ + static readonly CHANGE: "change"; + /** @event */ + static readonly END: "end"; + /** @event */ + static readonly ERROR: "error"; + /** + * Emitted each time the change stream stores a new resume token. + * @event + */ + static readonly RESUME_TOKEN_CHANGED: "resumeTokenChanged"; + /* Excluded from this release type: __constructor */ + /* Excluded from this release type: cursorStream */ + /** The cached resume token that is used to resume after the most recently returned change. 
*/ + get resumeToken(): ResumeToken; + /** Check if there is any document still available in the Change Stream */ + hasNext(callback?: Callback): Promise | void; + /** Get the next available document from the Change Stream. */ + next(): Promise>; + next(callback: Callback>): void; + /** Is the cursor closed */ + get closed(): boolean; + /** Close the Change Stream */ + close(callback?: Callback): Promise | void; + /** + * Return a modified Readable stream including a possible transform method. + * @throws MongoDriverError if this.cursor is undefined + */ + stream(options?: CursorStreamOptions): Readable; + /** + * Try to get the next available document from the Change Stream's cursor or `null` if an empty batch is returned + */ + tryNext(): Promise; + tryNext(callback: Callback): void; +} + +/* Excluded from this release type: ChangeStreamCursor */ + +/* Excluded from this release type: ChangeStreamCursorOptions */ + +/** @public */ +export declare interface ChangeStreamDocument { + /** + * The id functions as an opaque token for use when resuming an interrupted + * change stream. + */ + _id: InferIdType; + /** + * Describes the type of operation represented in this change notification. + */ + operationType: 'insert' | 'update' | 'replace' | 'delete' | 'invalidate' | 'drop' | 'dropDatabase' | 'rename'; + /** + * Contains two fields: “db” and “coll” containing the database and + * collection name in which the change happened. + */ + ns: { + db: string; + coll: string; + }; + /** + * Only present for ops of type ‘insert’, ‘update’, ‘replace’, and + * ‘delete’. + * + * For unsharded collections this contains a single field, _id, with the + * value of the _id of the document updated. For sharded collections, + * this will contain all the components of the shard key in order, + * followed by the _id if the _id isn’t part of the shard key. + */ + documentKey?: InferIdType; + /** + * Only present for ops of type ‘update’. + * + * Contains a description of updated and removed fields in this + * operation. + */ + updateDescription?: UpdateDescription; + /** + * Always present for operations of type ‘insert’ and ‘replace’. Also + * present for operations of type ‘update’ if the user has specified ‘updateLookup’ + * in the ‘fullDocument’ arguments to the ‘$changeStream’ stage. + * + * For operations of type ‘insert’ and ‘replace’, this key will contain the + * document being inserted, or the new version of the document that is replacing + * the existing document, respectively. + * + * For operations of type ‘update’, this key will contain a copy of the full + * version of the document from some point after the update occurred. If the + * document was deleted since the updated happened, it will be null. + */ + fullDocument?: TSchema; +} + +/** @public */ +export declare type ChangeStreamEvents = { + resumeTokenChanged(token: ResumeToken): void; + init(response: Document): void; + more(response?: Document | undefined): void; + response(): void; + end(): void; + error(error: Error): void; + change(change: ChangeStreamDocument): void; +} & AbstractCursorEvents; + +/** + * Options that can be passed to a ChangeStream. Note that startAfter, resumeAfter, and startAtOperationTime are all mutually exclusive, and the server will error if more than one is specified. + * @public + */ +export declare interface ChangeStreamOptions extends AggregateOptions { + /** Allowed values: ‘default’, ‘updateLookup’. 
When set to ‘updateLookup’, the change stream will include both a delta describing the changes to the document, as well as a copy of the entire document that was changed from some time after the change occurred. */ + fullDocument?: string; + /** The maximum amount of time for the server to wait on new documents to satisfy a change stream query. */ + maxAwaitTimeMS?: number; + /** Allows you to start a changeStream after a specified event. See {@link https://docs.mongodb.com/master/changeStreams/#resumeafter-for-change-streams|ChangeStream documentation}. */ + resumeAfter?: ResumeToken; + /** Similar to resumeAfter, but will allow you to start after an invalidated event. See {@link https://docs.mongodb.com/master/changeStreams/#startafter-for-change-streams|ChangeStream documentation}. */ + startAfter?: ResumeToken; + /** Will start the changeStream after the specified operationTime. */ + startAtOperationTime?: OperationTime; + /** The number of documents to return per batch. See {@link https://docs.mongodb.com/manual/reference/command/aggregate|aggregation documentation}. */ + batchSize?: number; +} + +/** @public */ +export declare interface ClientMetadata { + driver: { + name: string; + version: string; + }; + os: { + type: string; + name: NodeJS.Platform; + architecture: string; + version: string; + }; + platform: string; + version?: string; + application?: { + name: string; + }; +} + +/** @public */ +export declare interface ClientMetadataOptions { + driverInfo?: { + name?: string; + version?: string; + platform?: string; + }; + appName?: string; +} + +/** + * A class representing a client session on the server + * + * NOTE: not meant to be instantiated directly. + * @public + */ +export declare class ClientSession extends TypedEventEmitter { + /* Excluded from this release type: topology */ + /* Excluded from this release type: sessionPool */ + hasEnded: boolean; + clientOptions?: MongoOptions; + supports: { + causalConsistency: boolean; + }; + clusterTime?: ClusterTime; + operationTime?: Timestamp; + explicit: boolean; + /* Excluded from this release type: owner */ + defaultTransactionOptions: TransactionOptions; + transaction: Transaction; + /* Excluded from this release type: [kServerSession] */ + /* Excluded from this release type: [kSnapshotTime] */ + /* Excluded from this release type: [kSnapshotEnabled] */ + /* Excluded from this release type: [kPinnedConnection] */ + /* Excluded from this release type: __constructor */ + /** The server id associated with this session */ + get id(): ServerSessionId | undefined; + get serverSession(): ServerSession; + /** Whether or not this session is configured for snapshot reads */ + get snapshotEnabled(): boolean; + get loadBalanced(): boolean; + /* Excluded from this release type: pinnedConnection */ + /* Excluded from this release type: pin */ + /* Excluded from this release type: unpin */ + get isPinned(): boolean; + /** + * Ends this session on the server + * + * @param options - Optional settings. Currently reserved for future use + * @param callback - Optional callback for completion of this operation + */ + endSession(): Promise; + endSession(callback: Callback): void; + endSession(options: EndSessionOptions): Promise; + endSession(options: EndSessionOptions, callback: Callback): void; + /** + * Advances the operationTime for a ClientSession. 
+ * + * @param operationTime - the `BSON.Timestamp` of the operation type it is desired to advance to + */ + advanceOperationTime(operationTime: Timestamp): void; + /** + * Advances the clusterTime for a ClientSession to the provided clusterTime of another ClientSession + * + * @param clusterTime - the $clusterTime returned by the server from another session in the form of a document containing the `BSON.Timestamp` clusterTime and signature + */ + advanceClusterTime(clusterTime: ClusterTime): void; + /** + * Used to determine if this session equals another + * + * @param session - The session to compare to + */ + equals(session: ClientSession): boolean; + /** Increment the transaction number on the internal ServerSession */ + incrementTransactionNumber(): void; + /** @returns whether this session is currently in a transaction or not */ + inTransaction(): boolean; + /** + * Starts a new transaction with the given options. + * + * @param options - Options for the transaction + */ + startTransaction(options?: TransactionOptions): void; + /** + * Commits the currently active transaction in this session. + * + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + commitTransaction(): Promise; + commitTransaction(callback: Callback): void; + /** + * Aborts the currently active transaction in this session. + * + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + abortTransaction(): Promise; + abortTransaction(callback: Callback): void; + /** + * This is here to ensure that ClientSession is never serialized to BSON. + */ + toBSON(): never; + /** + * Runs a provided lambda within a transaction, retrying either the commit operation + * or entire transaction as needed (and when the error permits) to better ensure that + * the transaction can complete successfully. + * + * IMPORTANT: This method requires the user to return a Promise, all lambdas that do not + * return a Promise will result in undefined behavior. + * + * @param fn - A lambda to run within a transaction + * @param options - Optional settings for the transaction + */ + withTransaction(fn: WithTransactionCallback, options?: TransactionOptions): ReturnType; +} + +/** @public */ +export declare type ClientSessionEvents = { + ended(session: ClientSession): void; +}; + +/** @public */ +export declare interface ClientSessionOptions { + /** Whether causal consistency should be enabled on this session */ + causalConsistency?: boolean; + /** Whether all read operations should be read from the same snapshot for this session (NOTE: not compatible with `causalConsistency=true`) */ + snapshot?: boolean; + /** The default TransactionOptions to use for transactions started on this session. 
*/ + defaultTransactionOptions?: TransactionOptions; + /* Excluded from this release type: owner */ + /* Excluded from this release type: explicit */ + /* Excluded from this release type: initialClusterTime */ +} + +/** @public */ +export declare interface CloseOptions { + force?: boolean; +} + +/** @public */ +export declare interface ClusterTime { + clusterTime: Timestamp; + signature: { + hash: Binary; + keyId: Long; + }; +} + +export { Code } + +/** @public */ +export declare interface CollationOptions { + locale: string; + caseLevel?: boolean; + caseFirst?: string; + strength?: number; + numericOrdering?: boolean; + alternate?: string; + maxVariable?: string; + backwards?: boolean; + normalization?: boolean; +} + +/** + * The **Collection** class is an internal class that embodies a MongoDB collection + * allowing for insert/update/remove/find and other command operation on that MongoDB collection. + * + * **COLLECTION Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Create a collection we want to drop later + * const col = client.db(dbName).collection('createIndexExample1'); + * // Show that duplicate records got dropped + * col.find({}).toArray(function(err, items) { + * expect(err).to.not.exist; + * test.equal(4, items.length); + * client.close(); + * }); + * }); + * ``` + */ +export declare class Collection { + /* Excluded from this release type: s */ + /* Excluded from this release type: __constructor */ + /** + * The name of the database this collection belongs to + */ + get dbName(): string; + /** + * The name of this collection + */ + get collectionName(): string; + /** + * The namespace of this collection, in the format `${this.dbName}.${this.collectionName}` + */ + get namespace(): string; + /** + * The current readConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get readConcern(): ReadConcern | undefined; + /** + * The current readPreference of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get readPreference(): ReadPreference | undefined; + get bsonOptions(): BSONSerializeOptions; + /** + * The current writeConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get writeConcern(): WriteConcern | undefined; + /** The current index hint for the collection */ + get hint(): Hint | undefined; + set hint(v: Hint | undefined); + /** + * Inserts a single document into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. 
+ * + * @param doc - The document to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insertOne(doc: OptionalId): Promise>; + insertOne(doc: OptionalId, callback: Callback>): void; + insertOne(doc: OptionalId, options: InsertOneOptions): Promise>; + insertOne(doc: OptionalId, options: InsertOneOptions, callback: Callback>): void; + /** + * Inserts an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insertMany(docs: OptionalId[]): Promise>; + insertMany(docs: OptionalId[], callback: Callback>): void; + insertMany(docs: OptionalId[], options: BulkWriteOptions): Promise>; + insertMany(docs: OptionalId[], options: BulkWriteOptions, callback: Callback>): void; + /** + * Perform a bulkWrite operation without a fluent API + * + * Legal operation types are + * + * ```js + * { insertOne: { document: { a: 1 } } } + * + * { updateOne: { filter: {a:2}, update: {$set: {a:2}}, upsert:true } } + * + * { updateMany: { filter: {a:2}, update: {$set: {a:2}}, upsert:true } } + * + * { updateMany: { filter: {}, update: {$set: {"a.$[i].x": 5}}, arrayFilters: [{ "i.x": 5 }]} } + * + * { deleteOne: { filter: {c:1} } } + * + * { deleteMany: { filter: {c:1} } } + * + * { replaceOne: { filter: {c:3}, replacement: {c:4}, upsert:true} } + *``` + * Please note that raw operations are no longer accepted as of driver version 4.0. + * + * If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. 
+ * + * @param operations - Bulk operations to perform + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * @throws MongoDriverError if operations is not an array + */ + bulkWrite(operations: AnyBulkWriteOperation[]): Promise; + bulkWrite(operations: AnyBulkWriteOperation[], callback: Callback): void; + bulkWrite(operations: AnyBulkWriteOperation[], options: BulkWriteOptions): Promise; + bulkWrite(operations: AnyBulkWriteOperation[], options: BulkWriteOptions, callback: Callback): void; + /** + * Update a single document in a collection + * + * @param filter - The filter used to select the document to update + * @param update - The update operations to be applied to the document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + updateOne(filter: Filter, update: UpdateFilter | Partial): Promise; + updateOne(filter: Filter, update: UpdateFilter | Partial, callback: Callback): void; + updateOne(filter: Filter, update: UpdateFilter | Partial, options: UpdateOptions): Promise; + updateOne(filter: Filter, update: UpdateFilter | Partial, options: UpdateOptions, callback: Callback): void; + /** + * Replace a document in a collection with another document + * + * @param filter - The filter used to select the document to replace + * @param replacement - The Document that replaces the matching document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + replaceOne(filter: Filter, replacement: TSchema): Promise; + replaceOne(filter: Filter, replacement: TSchema, callback: Callback): void; + replaceOne(filter: Filter, replacement: TSchema, options: ReplaceOptions): Promise; + replaceOne(filter: Filter, replacement: TSchema, options: ReplaceOptions, callback: Callback): void; + /** + * Update multiple documents in a collection + * + * @param filter - The filter used to select the documents to update + * @param update - The update operations to be applied to the documents + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + updateMany(filter: Filter, update: UpdateFilter): Promise; + updateMany(filter: Filter, update: UpdateFilter, callback: Callback): void; + updateMany(filter: Filter, update: UpdateFilter, options: UpdateOptions): Promise; + updateMany(filter: Filter, update: UpdateFilter, options: UpdateOptions, callback: Callback): void; + /** + * Delete a document from a collection + * + * @param filter - The filter used to select the document to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + deleteOne(filter: Filter): Promise; + deleteOne(filter: Filter, callback: Callback): void; + deleteOne(filter: Filter, options: DeleteOptions): Promise; + deleteOne(filter: Filter, options: DeleteOptions, callback?: Callback): void; + /** + * Delete multiple documents from a collection + * + * @param filter - The filter used to select the documents to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + deleteMany(filter: Filter): Promise; + deleteMany(filter: Filter, callback: Callback): void; + 
deleteMany(filter: Filter, options: DeleteOptions): Promise; + deleteMany(filter: Filter, options: DeleteOptions, callback: Callback): void; + /** + * Rename the collection. + * + * @remarks + * This operation does not inherit options from the Db or MongoClient. + * + * @param newName - New name of of the collection. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + rename(newName: string): Promise; + rename(newName: string, callback: Callback): void; + rename(newName: string, options: RenameOptions): Promise; + rename(newName: string, options: RenameOptions, callback: Callback): void; + /** + * Drop the collection from the database, removing it permanently. New accesses will create a new collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + drop(): Promise; + drop(callback: Callback): void; + drop(options: DropCollectionOptions): Promise; + drop(options: DropCollectionOptions, callback: Callback): void; + /** + * Fetches the first document that matches the filter + * + * @param filter - Query for find Operation + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOne(): Promise; + findOne(callback: Callback): void; + findOne(filter: Filter): Promise; + findOne(filter: Filter, callback: Callback): void; + findOne(filter: Filter, options: FindOptions): Promise; + findOne(filter: Filter, options: FindOptions, callback: Callback): void; + findOne(): Promise; + findOne(callback: Callback): void; + findOne(filter: Filter): Promise; + findOne(filter: Filter, options?: FindOptions): Promise; + findOne(filter: Filter, options?: FindOptions, callback?: Callback): void; + /** + * Creates a cursor for a filter that can be used to iterate over results from MongoDB + * + * @param filter - The filter predicate. If unspecified, then all documents in the collection will match the predicate + */ + find(): FindCursor; + find(filter: Filter, options?: FindOptions): FindCursor; + find(filter: Filter, options?: FindOptions): FindCursor; + /** + * Returns the options of the collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + options(): Promise; + options(callback: Callback): void; + options(options: OperationOptions): Promise; + options(options: OperationOptions, callback: Callback): void; + /** + * Returns if the collection is a capped collection + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + isCapped(): Promise; + isCapped(callback: Callback): void; + isCapped(options: OperationOptions): Promise; + isCapped(options: OperationOptions, callback: Callback): void; + /** + * Creates an index on the db and collection collection. 
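The declarations up to this point cover the collection's core write and read helpers. As a rough usage sketch before the index helpers documented next: the connection string, database, collection, and field names here are placeholders chosen for illustration, not part of this vendored declaration file or of the project's own code.

```js
const { MongoClient } = require('mongodb');

async function crudSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const tasks = client.db('todo').collection('tasks');

    // insertOne / insertMany: the driver adds an _id to documents that lack one.
    const { insertedId } = await tasks.insertOne({ title: 'buy milk', done: false });
    await tasks.insertMany([
      { title: 'walk dog', done: false },
      { title: 'pay rent', done: true }
    ]);

    // updateOne with an update-operator document, then read the result back.
    await tasks.updateOne({ _id: insertedId }, { $set: { done: true } });
    console.log(await tasks.findOne({ _id: insertedId }));

    // bulkWrite batches heterogeneous operations in one round trip.
    const bulk = await tasks.bulkWrite([
      { insertOne: { document: { title: 'water plants', done: false } } },
      { updateMany: { filter: { done: true }, update: { $set: { archived: true } } } },
      { deleteOne: { filter: { title: 'pay rent' } } }
    ]);
    console.log(bulk.insertedCount, bulk.modifiedCount, bulk.deletedCount);

    // deleteMany removes everything matching the filter.
    await tasks.deleteMany({ archived: true });
  } finally {
    await client.close();
  }
}

crudSketch().catch(console.error);
```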
+ * + * @param indexSpec - The field name or index specification to create an index for + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @example + * ```js + * const collection = client.db('foo').collection('bar'); + * + * await collection.createIndex({ a: 1, b: -1 }); + * + * // Alternate syntax for { c: 1, d: -1 } that ensures order of indexes + * await collection.createIndex([ [c, 1], [d, -1] ]); + * + * // Equivalent to { e: 1 } + * await collection.createIndex('e'); + * + * // Equivalent to { f: 1, g: 1 } + * await collection.createIndex(['f', 'g']) + * + * // Equivalent to { h: 1, i: -1 } + * await collection.createIndex([ { h: 1 }, { i: -1 } ]); + * + * // Equivalent to { j: 1, k: -1, l: 2d } + * await collection.createIndex(['j', ['k', -1], { l: '2d' }]) + * ``` + */ + createIndex(indexSpec: IndexSpecification): Promise; + createIndex(indexSpec: IndexSpecification, callback: Callback): void; + createIndex(indexSpec: IndexSpecification, options: CreateIndexesOptions): Promise; + createIndex(indexSpec: IndexSpecification, options: CreateIndexesOptions, callback: Callback): void; + /** + * Creates multiple indexes in the collection, this method is only supported for + * MongoDB 2.6 or higher. Earlier version of MongoDB will throw a command not supported + * error. + * + * **Note**: Unlike {@link Collection#createIndex| createIndex}, this function takes in raw index specifications. + * Index specifications are defined {@link http://docs.mongodb.org/manual/reference/command/createIndexes/| here}. + * + * @param indexSpecs - An array of index specifications to be created + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @example + * ```js + * const collection = client.db('foo').collection('bar'); + * await collection.createIndexes([ + * // Simple index on field fizz + * { + * key: { fizz: 1 }, + * } + * // wildcard index + * { + * key: { '$**': 1 } + * }, + * // named index on darmok and jalad + * { + * key: { darmok: 1, jalad: -1 } + * name: 'tanagra' + * } + * ]); + * ``` + */ + createIndexes(indexSpecs: IndexDescription[]): Promise; + createIndexes(indexSpecs: IndexDescription[], callback: Callback): void; + createIndexes(indexSpecs: IndexDescription[], options: CreateIndexesOptions): Promise; + createIndexes(indexSpecs: IndexDescription[], options: CreateIndexesOptions, callback: Callback): void; + /** + * Drops an index from this collection. + * + * @param indexName - Name of the index to drop. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropIndex(indexName: string): Promise; + dropIndex(indexName: string, callback: Callback): void; + dropIndex(indexName: string, options: DropIndexesOptions): Promise; + dropIndex(indexName: string, options: DropIndexesOptions, callback: Callback): void; + /** + * Drops all indexes from this collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropIndexes(): Promise; + dropIndexes(callback: Callback): void; + dropIndexes(options: DropIndexesOptions): Promise; + dropIndexes(options: DropIndexesOptions, callback: Callback): void; + /** + * Get the list of all indexes information for the collection. 
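Before the index-listing helpers that follow, a minimal sketch of the index management methods declared above, assuming a local deployment and a hypothetical `tasks` collection (none of these names come from the declaration file itself).

```js
const { MongoClient } = require('mongodb');

async function indexSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const tasks = client.db('todo').collection('tasks');

    // Single index from a specification object; the driver resolves the index name.
    const titleIndex = await tasks.createIndex({ title: 1 }, { unique: true });

    // Several indexes in one command, using raw index descriptions.
    await tasks.createIndexes([
      { key: { done: 1, createdAt: -1 }, name: 'done_by_date' },
      { key: { '$**': 1 }, name: 'wildcard' }
    ]);

    console.log(await tasks.indexExists(['done_by_date', titleIndex]));

    // Drop one index by name, or every non-_id index at once.
    await tasks.dropIndex('wildcard');
    await tasks.dropIndexes();
  } finally {
    await client.close();
  }
}

indexSketch().catch(console.error);
```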
+ * + * @param options - Optional settings for the command + */ + listIndexes(options?: ListIndexesOptions): ListIndexesCursor; + /** + * Checks if one or more indexes exist on the collection, fails on first non-existing index + * + * @param indexes - One or more index names to check. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexExists(indexes: string | string[]): Promise; + indexExists(indexes: string | string[], callback: Callback): void; + indexExists(indexes: string | string[], options: IndexInformationOptions): Promise; + indexExists(indexes: string | string[], options: IndexInformationOptions, callback: Callback): void; + /** + * Retrieves this collections index info. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexInformation(): Promise; + indexInformation(callback: Callback): void; + indexInformation(options: IndexInformationOptions): Promise; + indexInformation(options: IndexInformationOptions, callback: Callback): void; + /** + * Gets an estimate of the count of documents in a collection using collection metadata. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + estimatedDocumentCount(): Promise; + estimatedDocumentCount(callback: Callback): void; + estimatedDocumentCount(options: EstimatedDocumentCountOptions): Promise; + estimatedDocumentCount(options: EstimatedDocumentCountOptions, callback: Callback): void; + /** + * Gets the number of documents matching the filter. + * For a fast count of the total documents in a collection see {@link Collection#estimatedDocumentCount| estimatedDocumentCount}. 
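The two counting methods differ in cost and accuracy, which the migration notes below spell out. A short sketch of both, with placeholder names and a made-up `title` field:

```js
const { MongoClient } = require('mongodb');

async function countSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const tasks = client.db('todo').collection('tasks');

    // Fast, metadata-based estimate of the whole collection.
    const estimate = await tasks.estimatedDocumentCount();

    // Exact count for a filter; $expr replaces $where when migrating from count().
    const open = await tasks.countDocuments({ done: false });
    const longTitles = await tasks.countDocuments({
      $expr: { $gt: [{ $strLenCP: '$title' }, 20] }
    });

    console.log({ estimate, open, longTitles });
  } finally {
    await client.close();
  }
}

countSketch().catch(console.error);
```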
+ * **Note**: When migrating from {@link Collection#count| count} to {@link Collection#countDocuments| countDocuments} + * the following query operators must be replaced: + * + * | Operator | Replacement | + * | -------- | ----------- | + * | `$where` | [`$expr`][1] | + * | `$near` | [`$geoWithin`][2] with [`$center`][3] | + * | `$nearSphere` | [`$geoWithin`][2] with [`$centerSphere`][4] | + * + * [1]: https://docs.mongodb.com/manual/reference/operator/query/expr/ + * [2]: https://docs.mongodb.com/manual/reference/operator/query/geoWithin/ + * [3]: https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center + * [4]: https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere + * + * @param filter - The filter for the count + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @see https://docs.mongodb.com/manual/reference/operator/query/expr/ + * @see https://docs.mongodb.com/manual/reference/operator/query/geoWithin/ + * @see https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center + * @see https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere + */ + countDocuments(): Promise; + countDocuments(callback: Callback): void; + countDocuments(filter: Filter): Promise; + countDocuments(callback: Callback): void; + countDocuments(filter: Filter, options: CountDocumentsOptions): Promise; + countDocuments(filter: Filter, options: CountDocumentsOptions, callback: Callback): void; + countDocuments(filter: Filter, callback: Callback): void; + /** + * The distinct command returns a list of distinct values for the given key across a collection. + * + * @param key - Field of the document to find distinct values for + * @param filter - The filter for filtering the set of documents to which we apply the distinct filter. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + distinct>(key: Key): Promise[Key]>>>; + distinct>(key: Key, callback: Callback[Key]>>>): void; + distinct>(key: Key, filter: Filter): Promise[Key]>>>; + distinct>(key: Key, filter: Filter, callback: Callback[Key]>>>): void; + distinct>(key: Key, filter: Filter, options: DistinctOptions): Promise[Key]>>>; + distinct>(key: Key, filter: Filter, options: DistinctOptions, callback: Callback[Key]>>>): void; + distinct(key: string): Promise; + distinct(key: string, callback: Callback): void; + distinct(key: string, filter: Filter): Promise; + distinct(key: string, filter: Filter, callback: Callback): void; + distinct(key: string, filter: Filter, options: DistinctOptions): Promise; + distinct(key: string, filter: Filter, options: DistinctOptions, callback: Callback): void; + /** + * Retrieve all the indexes on the collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexes(): Promise; + indexes(callback: Callback): void; + indexes(options: IndexInformationOptions): Promise; + indexes(options: IndexInformationOptions, callback: Callback): void; + /** + * Get all the collection statistics. 
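A quick sketch of `distinct` and `indexes`, declared just above, before the statistics helpers documented next. The `tag` field and collection names are illustrative assumptions only.

```js
const { MongoClient } = require('mongodb');

async function distinctSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const tasks = client.db('todo').collection('tasks');

    // All distinct values of a field, optionally restricted by a filter.
    const allTags = await tasks.distinct('tag');
    const openTags = await tasks.distinct('tag', { done: false });
    console.log(allTags, openTags);

    // Index metadata for the same collection.
    console.log(await tasks.indexes());
  } finally {
    await client.close();
  }
}

distinctSketch().catch(console.error);
```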
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + stats(): Promise; + stats(callback: Callback): void; + stats(options: CollStatsOptions): Promise; + stats(options: CollStatsOptions, callback: Callback): void; + /** + * Find a document and delete it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndDelete(filter: Filter): Promise>; + findOneAndDelete(filter: Filter, options: FindOneAndDeleteOptions): Promise>; + findOneAndDelete(filter: Filter, callback: Callback>): void; + findOneAndDelete(filter: Filter, options: FindOneAndDeleteOptions, callback: Callback>): void; + /** + * Find a document and replace it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to replace + * @param replacement - The Document that replaces the matching document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndReplace(filter: Filter, replacement: Document): Promise>; + findOneAndReplace(filter: Filter, replacement: Document, callback: Callback>): void; + findOneAndReplace(filter: Filter, replacement: Document, options: FindOneAndReplaceOptions): Promise>; + findOneAndReplace(filter: Filter, replacement: Document, options: FindOneAndReplaceOptions, callback: Callback>): void; + /** + * Find a document and update it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to update + * @param update - Update operations to be performed on the document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndUpdate(filter: Filter, update: UpdateFilter): Promise>; + findOneAndUpdate(filter: Filter, update: UpdateFilter, callback: Callback>): void; + findOneAndUpdate(filter: Filter, update: UpdateFilter, options: FindOneAndUpdateOptions): Promise>; + findOneAndUpdate(filter: Filter, update: UpdateFilter, options: FindOneAndUpdateOptions, callback: Callback>): void; + /** + * Execute an aggregation framework pipeline against the collection, needs MongoDB \>= 2.2 + * + * @param pipeline - An array of aggregation pipelines to execute + * @param options - Optional settings for the command + */ + aggregate(pipeline?: Document[], options?: AggregateOptions): AggregationCursor; + /** + * Create a new Change Stream, watching for new changes (insertions, updates, replacements, deletions, and invalidations) in this collection. + * + * @since 3.0.0 + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline?: Document[], options?: ChangeStreamOptions): ChangeStream; + /** + * Run Map Reduce across a collection. 
Be aware that the inline option for out will return an array of results not a collection. + * + * @param map - The mapping function. + * @param reduce - The reduce function. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction): Promise; + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction, callback: Callback): void; + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction, options: MapReduceOptions): Promise; + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction, options: MapReduceOptions, callback: Callback): void; + /** Initiate an Out of order batch write operation. All operations will be buffered into insert/update/remove commands executed out of order. */ + initializeUnorderedBulkOp(options?: BulkWriteOptions): UnorderedBulkOperation; + /** Initiate an In order bulk write operation. Operations will be serially executed in the order they are added, creating a new operation for each switch in types. */ + initializeOrderedBulkOp(options?: BulkWriteOptions): OrderedBulkOperation; + /** Get the db scoped logger */ + getLogger(): Logger; + get logger(): Logger; + /** + * Inserts a single document or a an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @deprecated Use insertOne, insertMany or bulkWrite instead. + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insert(docs: OptionalId[], options: BulkWriteOptions, callback: Callback>): Promise> | void; + /** + * Updates documents. + * + * @deprecated use updateOne, updateMany or bulkWrite + * @param selector - The selector for the update operation. + * @param update - The update operations to be applied to the documents + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + update(selector: Filter, update: UpdateFilter, options: UpdateOptions, callback: Callback): Promise | void; + /** + * Remove documents. + * + * @deprecated use deleteOne, deleteMany or bulkWrite + * @param selector - The selector for the update operation. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + remove(selector: Filter, options: DeleteOptions, callback: Callback): Promise | void; + /** + * An estimated count of matching documents in the db to a filter. + * + * **NOTE:** This method has been deprecated, since it does not provide an accurate count of the documents + * in a collection. To obtain an accurate count of documents in the collection, use {@link Collection#countDocuments| countDocuments}. + * To obtain an estimated count of all documents in the collection, use {@link Collection#estimatedDocumentCount| estimatedDocumentCount}. + * + * @deprecated use {@link Collection#countDocuments| countDocuments} or {@link Collection#estimatedDocumentCount| estimatedDocumentCount} instead + * + * @param filter - The filter for the count. 
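The atomic find-and-modify helpers, the aggregation cursor, and change streams declared above tend to be used together in practice. A hedged sketch follows; note that `watch` needs a replica set or sharded cluster, and every name and pipeline here is illustrative rather than taken from the declaration file.

```js
const { MongoClient } = require('mongodb');

async function atomicAndAggregateSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const tasks = client.db('todo').collection('tasks');

    // Atomically update one document and get the post-update version back.
    const { value } = await tasks.findOneAndUpdate(
      { title: 'buy milk' },
      { $set: { done: true } },
      { returnDocument: 'after', upsert: true }
    );
    console.log(value);

    // Aggregation pipeline: open tasks grouped per tag.
    const perTag = await tasks.aggregate([
      { $match: { done: false } },
      { $group: { _id: '$tag', open: { $sum: 1 } } },
      { $sort: { open: -1 } }
    ]).toArray();
    console.log(perTag);

    // Change streams require a replica set or sharded cluster.
    const stream = tasks.watch([{ $match: { operationType: 'update' } }]);
    stream.on('change', change => console.log('changed:', change.documentKey));
    await new Promise(resolve => setTimeout(resolve, 5000)); // listen briefly
    await stream.close();
  } finally {
    await client.close();
  }
}

atomicAndAggregateSketch().catch(console.error);
```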
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + count(): Promise; + count(callback: Callback): void; + count(filter: Filter): Promise; + count(filter: Filter, callback: Callback): void; + count(filter: Filter, options: CountOptions): Promise; + count(filter: Filter, options: CountOptions, callback: Callback): Promise | void; +} + +/** @public */ +export declare interface CollectionInfo extends Document { + name: string; + type?: string; + options?: Document; + info?: { + readOnly?: false; + uuid?: Binary; + }; + idIndex?: Document; +} + +/** @public */ +export declare interface CollectionOptions extends BSONSerializeOptions, WriteConcernOptions, LoggerOptions { + slaveOk?: boolean; + /** Specify a read concern for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST). */ + readPreference?: ReadPreferenceLike; +} + +/* Excluded from this release type: CollectionPrivate */ + +/** + * @public + * @see https://docs.mongodb.org/manual/reference/command/collStats/ + */ +export declare interface CollStats extends Document { + /** Namespace */ + ns: string; + /** Number of documents */ + count: number; + /** Collection size in bytes */ + size: number; + /** Average object size in bytes */ + avgObjSize: number; + /** (Pre)allocated space for the collection in bytes */ + storageSize: number; + /** Number of extents (contiguously allocated chunks of datafile space) */ + numExtents: number; + /** Number of indexes */ + nindexes: number; + /** Size of the most recently created extent in bytes */ + lastExtentSize: number; + /** Padding can speed up updates if documents grow */ + paddingFactor: number; + /** A number that indicates the user-set flags on the collection. userFlags only appears when using the mmapv1 storage engine */ + userFlags?: number; + /** Total index size in bytes */ + totalIndexSize: number; + /** Size of specific indexes in bytes */ + indexSizes: { + _id_: number; + [index: string]: number; + }; + /** `true` if the collection is capped */ + capped: boolean; + /** The maximum number of documents that may be present in a capped collection */ + max: number; + /** The maximum size of a capped collection */ + maxSize: number; + /** This document contains data reported directly by the WiredTiger engine and other data for internal diagnostic use */ + wiredTiger?: WiredTigerData; + /** The fields in this document are the names of the indexes, while the values themselves are documents that contain statistics for the index provided by the storage engine */ + indexDetails?: any; + ok: number; + /** The amount of storage available for reuse. The scale argument affects this value. */ + freeStorageSize?: number; + /** An array that contains the names of the indexes that are currently being built on the collection */ + indexBuilds?: number; + /** The sum of the storageSize and totalIndexSize. The scale argument affects this value */ + totalSize: number; + /** The scale value used by the command. */ + scaleFactor: number; +} + +/** @public */ +export declare interface CollStatsOptions extends CommandOperationOptions { + /** Divide the returned sizes by scale value. 
*/ + scale?: number; +} + +/** + * An event indicating the failure of a given command + * @public + * @category Event + */ +export declare class CommandFailedEvent { + address: string; + connectionId?: string | number; + requestId: number; + duration: number; + commandName: string; + failure: Error; + serviceId?: ObjectId; + /* Excluded from this release type: __constructor */ + get hasServiceId(): boolean; +} + +/* Excluded from this release type: CommandOperation */ + +/** @public */ +export declare interface CommandOperationOptions extends OperationOptions, WriteConcernOptions, ExplainOptions { + /** @deprecated This option does nothing */ + fullResponse?: boolean; + /** Specify a read concern and level for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** Collation */ + collation?: CollationOptions; + maxTimeMS?: number; + /** A user-provided comment to attach to this command */ + comment?: string | Document; + /** Should retry failed writes */ + retryWrites?: boolean; + dbName?: string; + authdb?: string; + noResponse?: boolean; +} + +/* Excluded from this release type: CommandOptions */ + +/** + * An event indicating the start of a given + * @public + * @category Event + */ +export declare class CommandStartedEvent { + commandObj?: Document; + requestId: number; + databaseName: string; + commandName: string; + command: Document; + address: string; + connectionId?: string | number; + serviceId?: ObjectId; + /* Excluded from this release type: __constructor */ + get hasServiceId(): boolean; +} + +/** + * An event indicating the success of a given command + * @public + * @category Event + */ +export declare class CommandSucceededEvent { + address: string; + connectionId?: string | number; + requestId: number; + duration: number; + commandName: string; + reply: unknown; + serviceId?: ObjectId; + /* Excluded from this release type: __constructor */ + get hasServiceId(): boolean; +} + +/** @public */ +export declare type CommonEvents = 'newListener' | 'removeListener'; + +/** @public */ +export declare const Compressor: Readonly<{ + readonly none: 0; + readonly snappy: 1; + readonly zlib: 2; +}>; + +/** @public */ +export declare type Compressor = typeof Compressor[CompressorName]; + +/** @public */ +export declare type CompressorName = keyof typeof Compressor; + +/** @public */ +export declare type Condition = AlternativeType | FilterOperators>; + +/* Excluded from this release type: Connection */ + +/** + * An event published when a connection is checked into the connection pool + * @public + * @category Event + */ +export declare class ConnectionCheckedInEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a connection is checked out of the connection pool + * @public + * @category Event + */ +export declare class ConnectionCheckedOutEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a request to check a connection out fails + * @public + * @category Event + */ +export declare class ConnectionCheckOutFailedEvent extends ConnectionPoolMonitoringEvent { + /** The reason the attempt to check out failed */ + reason: AnyError | string; + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a request to check a 
connection out begins + * @public + * @category Event + */ +export declare class ConnectionCheckOutStartedEvent extends ConnectionPoolMonitoringEvent { + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a connection is closed + * @public + * @category Event + */ +export declare class ConnectionClosedEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + /** The reason the connection was closed */ + reason: string; + serviceId?: ObjectId; + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a connection pool creates a new connection + * @public + * @category Event + */ +export declare class ConnectionCreatedEvent extends ConnectionPoolMonitoringEvent { + /** A monotonically increasing, per-pool id for the newly created connection */ + connectionId: number | ''; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare type ConnectionEvents = { + commandStarted(event: CommandStartedEvent): void; + commandSucceeded(event: CommandSucceededEvent): void; + commandFailed(event: CommandFailedEvent): void; + clusterTimeReceived(clusterTime: Document): void; + close(): void; + message(message: any): void; + pinned(pinType: string): void; + unpinned(pinType: string): void; +}; + +/** @public */ +export declare interface ConnectionOptions extends SupportedNodeConnectionOptions, StreamDescriptionOptions { + id: number | ''; + generation: number; + hostAddress: HostAddress; + autoEncrypter?: AutoEncrypter; + serverApi?: ServerApi; + monitorCommands: boolean; + /* Excluded from this release type: connectionType */ + credentials?: MongoCredentials; + connectTimeoutMS?: number; + tls: boolean; + keepAlive?: boolean; + keepAliveInitialDelay?: number; + noDelay?: boolean; + socketTimeoutMS?: number; + cancellationToken?: CancellationToken; + metadata: ClientMetadata; +} + +/* Excluded from this release type: ConnectionPool */ + +/** + * An event published when a connection pool is cleared + * @public + * @category Event + */ +export declare class ConnectionPoolClearedEvent extends ConnectionPoolMonitoringEvent { + /* Excluded from this release type: serviceId */ + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a connection pool is closed + * @public + * @category Event + */ +export declare class ConnectionPoolClosedEvent extends ConnectionPoolMonitoringEvent { + /* Excluded from this release type: __constructor */ +} + +/** + * An event published when a connection pool is created + * @public + * @category Event + */ +export declare class ConnectionPoolCreatedEvent extends ConnectionPoolMonitoringEvent { + /** The options used to create this connection pool */ + options?: ConnectionPoolOptions; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare type ConnectionPoolEvents = { + connectionPoolCreated(event: ConnectionPoolCreatedEvent): void; + connectionPoolClosed(event: ConnectionPoolClosedEvent): void; + connectionPoolCleared(event: ConnectionPoolClearedEvent): void; + connectionCreated(event: ConnectionCreatedEvent): void; + connectionReady(event: ConnectionReadyEvent): void; + connectionClosed(event: ConnectionClosedEvent): void; + connectionCheckOutStarted(event: ConnectionCheckOutStartedEvent): void; + connectionCheckOutFailed(event: ConnectionCheckOutFailedEvent): void; + connectionCheckedOut(event: ConnectionCheckedOutEvent): void; + 
connectionCheckedIn(event: ConnectionCheckedInEvent): void; +} & Omit; + +/* Excluded from this release type: ConnectionPoolMetrics */ + +/** + * The base export class for all monitoring events published from the connection pool + * @public + * @category Event + */ +export declare class ConnectionPoolMonitoringEvent { + /** A timestamp when the event was created */ + time: Date; + /** The address (host/port pair) of the pool */ + address: string; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare interface ConnectionPoolOptions extends Omit { + /** The maximum number of connections that may be associated with a pool at a given time. This includes in use and available connections. */ + maxPoolSize: number; + /** The minimum number of connections that MUST exist at any moment in a single connection pool. */ + minPoolSize: number; + /** The maximum amount of time a connection should remain idle in the connection pool before being marked idle. */ + maxIdleTimeMS: number; + /** The maximum amount of time operation execution should wait for a connection to become available. The default is 0 which means there is no limit. */ + waitQueueTimeoutMS: number; + /** If we are in load balancer mode. */ + loadBalanced: boolean; +} + +/** + * An event published when a connection is ready for use + * @public + * @category Event + */ +export declare class ConnectionReadyEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare interface ConnectOptions { + readPreference?: ReadPreference; +} + +/** @public */ +export declare interface CountDocumentsOptions extends AggregateOptions { + /** The number of documents to skip. */ + skip?: number; + /** The maximum amounts to count before aborting. */ + limit?: number; +} + +/** @public */ +export declare interface CountOptions extends CommandOperationOptions { + /** The number of documents to skip. */ + skip?: number; + /** The maximum amounts to count before aborting. */ + limit?: number; + /** Number of milliseconds to wait before aborting the query. */ + maxTimeMS?: number; + /** An index name hint for the query. */ + hint?: string | Document; +} + +/** @public */ +export declare interface CreateCollectionOptions extends CommandOperationOptions { + /** Returns an error if the collection does not exist */ + strict?: boolean; + /** Create a capped collection */ + capped?: boolean; + /** @deprecated Create an index on the _id field of the document, True by default on MongoDB 2.6 - 3.0 */ + autoIndexId?: boolean; + /** The size of the capped collection in bytes */ + size?: number; + /** The maximum number of documents in the capped collection */ + max?: number; + /** Available for the MMAPv1 storage engine only to set the usePowerOf2Sizes and the noPadding flag */ + flags?: number; + /** Allows users to specify configuration to the storage engine on a per-collection basis when creating a collection on MongoDB 3.0 or higher */ + storageEngine?: Document; + /** Allows users to specify validation rules or expressions for the collection. 
For more information, see Document Validation on MongoDB 3.2 or higher */ + validator?: Document; + /** Determines how strictly MongoDB applies the validation rules to existing documents during an update on MongoDB 3.2 or higher */ + validationLevel?: string; + /** Determines whether to error on invalid documents or just warn about the violations but allow invalid documents to be inserted on MongoDB 3.2 or higher */ + validationAction?: string; + /** Allows users to specify a default configuration for indexes when creating a collection on MongoDB 3.2 or higher */ + indexOptionDefaults?: Document; + /** The name of the source collection or view from which to create the view. The name is not the full namespace of the collection or view; i.e. does not include the database name and implies the same database as the view to create on MongoDB 3.4 or higher */ + viewOn?: string; + /** An array that consists of the aggregation pipeline stage. Creates the view by applying the specified pipeline to the viewOn collection or view on MongoDB 3.4 or higher */ + pipeline?: Document[]; + /** A primary key factory function for generation of custom _id keys. */ + pkFactory?: PkFactory; + /** A document specifying configuration options for timeseries collections. */ + timeseries?: TimeSeriesCollectionOptions; + /** The number of seconds after which a document in a timeseries collection expires. */ + expireAfterSeconds?: number; +} + +/** @public */ +export declare interface CreateIndexesOptions extends CommandOperationOptions { + /** Creates the index in the background, yielding whenever possible. */ + background?: boolean; + /** Creates an unique index. */ + unique?: boolean; + /** Override the autogenerated index name (useful if the resulting name is larger than 128 bytes) */ + name?: string; + /** Creates a partial index based on the given filter object (MongoDB 3.2 or higher) */ + partialFilterExpression?: Document; + /** Creates a sparse index. */ + sparse?: boolean; + /** Allows you to expire data on indexes applied to a data (MongoDB 2.2 or higher) */ + expireAfterSeconds?: number; + /** Allows users to configure the storage engine on a per-index basis when creating an index. (MongoDB 3.0 or higher) */ + storageEngine?: Document; + /** (MongoDB 4.4. or higher) Specifies how many data-bearing members of a replica set, including the primary, must complete the index builds successfully before the primary marks the indexes as ready. This option accepts the same values for the "w" field in a write concern plus "votingMembers", which indicates all voting data-bearing nodes. */ + commitQuorum?: number | string; + /** Specifies the index version number, either 0 or 1. */ + version?: number; + weights?: Document; + default_language?: string; + language_override?: string; + textIndexVersion?: number; + '2dsphereIndexVersion'?: number; + bits?: number; + /** For geospatial indexes set the lower bound for the co-ordinates. */ + min?: number; + /** For geospatial indexes set the high bound for the co-ordinates. */ + max?: number; + bucketSize?: number; + wildcardProjection?: Document; + /** Specifies that the index should exist on the target collection but should not be used by the query planner when executing operations. 
(MongoDB 4.4 or higher) */ + hidden?: boolean; +} + +/** @public */ +export declare const CURSOR_FLAGS: readonly ["tailable", "oplogReplay", "noCursorTimeout", "awaitData", "exhaust", "partial"]; + +/** @public + * @deprecated This interface is deprecated */ +export declare interface CursorCloseOptions { + /** Bypass calling killCursors when closing the cursor. */ + /** @deprecated the skipKillCursors option is deprecated */ + skipKillCursors?: boolean; +} + +/** @public */ +export declare type CursorFlag = typeof CURSOR_FLAGS[number]; + +/** @public */ +export declare interface CursorStreamOptions { + /** A transformation method applied to each document emitted by the stream */ + transform?(doc: Document): Document; +} + +/** + * The **Db** class is a class that represents a MongoDB Database. + * @public + * + * @example + * ```js + * const { MongoClient } = require('mongodb'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Select the database by name + * const testDb = client.db(dbName); + * client.close(); + * }); + * ``` + */ +export declare class Db { + /* Excluded from this release type: s */ + static SYSTEM_NAMESPACE_COLLECTION: string; + static SYSTEM_INDEX_COLLECTION: string; + static SYSTEM_PROFILE_COLLECTION: string; + static SYSTEM_USER_COLLECTION: string; + static SYSTEM_COMMAND_COLLECTION: string; + static SYSTEM_JS_COLLECTION: string; + /** + * Creates a new Db instance + * + * @param client - The MongoClient for the database. + * @param databaseName - The name of the database this instance represents. + * @param options - Optional settings for Db construction + */ + constructor(client: MongoClient, databaseName: string, options?: DbOptions); + get databaseName(): string; + get options(): DbOptions | undefined; + get slaveOk(): boolean; + get readConcern(): ReadConcern | undefined; + /** + * The current readPreference of the Db. If not explicitly defined for + * this Db, will be inherited from the parent MongoClient + */ + get readPreference(): ReadPreference; + get bsonOptions(): BSONSerializeOptions; + get writeConcern(): WriteConcern | undefined; + get namespace(): string; + /** + * Create a new collection on a server with the specified options. Use this to create capped collections. + * More information about command options available at https://docs.mongodb.com/manual/reference/command/create/ + * + * @param name - The name of the collection to create + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + createCollection(name: string): Promise>; + createCollection(name: string, callback: Callback>): void; + createCollection(name: string, options: CreateCollectionOptions): Promise>; + createCollection(name: string, options: CreateCollectionOptions, callback: Callback>): void; + /** + * Execute a command + * + * @remarks + * This command does not inherit options from the MongoClient. 
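Since `Db#createCollection` accepts the `CreateCollectionOptions` documented above and `Db#command` runs arbitrary server commands, a small sketch may help; the database name, collection names, and validation schema are invented for illustration.

```js
const { MongoClient } = require('mongodb');

async function dbSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const db = client.db('todo');

    // Capped collection for a small rolling log.
    await db.createCollection('activity_log', { capped: true, size: 1024 * 1024, max: 1000 });

    // Schema validation rules applied on insert and update.
    await db.createCollection('users', {
      validator: {
        $jsonSchema: {
          required: ['username'],
          properties: { username: { bsonType: 'string', minLength: 3 } }
        }
      }
    });

    // Arbitrary database command; ping is a cheap connectivity check.
    console.log(await db.command({ ping: 1 }));
  } finally {
    await client.close();
  }
}

dbSketch().catch(console.error);
```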
+ * + * @param command - The command to run + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + command(command: Document): Promise; + command(command: Document, callback: Callback): void; + command(command: Document, options: RunCommandOptions): Promise; + command(command: Document, options: RunCommandOptions, callback: Callback): void; + /** + * Execute an aggregation framework pipeline against the database, needs MongoDB \>= 3.6 + * + * @param pipeline - An array of aggregation stages to be executed + * @param options - Optional settings for the command + */ + aggregate(pipeline?: Document[], options?: AggregateOptions): AggregationCursor; + /** Return the Admin db instance */ + admin(): Admin; + /** + * Returns a reference to a MongoDB Collection. If it does not exist it will be created implicitly. + * + * @param name - the collection name we wish to access. + * @returns return the new Collection instance + */ + collection(name: string, options?: CollectionOptions): Collection; + /** + * Get all the db statistics. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + stats(): Promise; + stats(callback: Callback): void; + stats(options: DbStatsOptions): Promise; + stats(options: DbStatsOptions, callback: Callback): void; + /** + * List all collections of this database with optional filter + * + * @param filter - Query to filter collections by + * @param options - Optional settings for the command + */ + listCollections(filter: Document, options: Exclude & { + nameOnly: true; + }): ListCollectionsCursor>; + listCollections(filter: Document, options: Exclude & { + nameOnly: false; + }): ListCollectionsCursor; + listCollections | CollectionInfo = Pick | CollectionInfo>(filter?: Document, options?: ListCollectionsOptions): ListCollectionsCursor; + /** + * Rename a collection. + * + * @remarks + * This operation does not inherit options from the MongoClient. + * + * @param fromCollection - Name of current collection to rename + * @param toCollection - New name of of the collection + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + renameCollection(fromCollection: string, toCollection: string): Promise>; + renameCollection(fromCollection: string, toCollection: string, callback: Callback>): void; + renameCollection(fromCollection: string, toCollection: string, options: RenameOptions): Promise>; + renameCollection(fromCollection: string, toCollection: string, options: RenameOptions, callback: Callback>): void; + /** + * Drop a collection from the database, removing it permanently. New accesses will create a new collection. + * + * @param name - Name of collection to drop + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropCollection(name: string): Promise; + dropCollection(name: string, callback: Callback): void; + dropCollection(name: string, options: DropCollectionOptions): Promise; + dropCollection(name: string, options: DropCollectionOptions, callback: Callback): void; + /** + * Drop a database, removing it permanently from the server. 
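A brief sketch of the collection-administration methods declared above (`listCollections`, `renameCollection`, `dropCollection`), using placeholder database and collection names; both rename and drop are permanent server-side operations.

```js
const { MongoClient } = require('mongodb');

async function collectionAdminSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const db = client.db('todo');

    // nameOnly is cheaper than fetching full collection info documents.
    const names = await db.listCollections({}, { nameOnly: true }).toArray();
    console.log(names.map(c => c.name));

    // Rename, then drop, a collection.
    await db.renameCollection('tasks', 'tasks_archive');
    await db.dropCollection('tasks_archive');
  } finally {
    await client.close();
  }
}

collectionAdminSketch().catch(console.error);
```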
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropDatabase(): Promise; + dropDatabase(callback: Callback): void; + dropDatabase(options: DropDatabaseOptions): Promise; + dropDatabase(options: DropDatabaseOptions, callback: Callback): void; + /** + * Fetch all collections for the current db. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + collections(): Promise; + collections(callback: Callback): void; + collections(options: ListCollectionsOptions): Promise; + collections(options: ListCollectionsOptions, callback: Callback): void; + /** + * Creates an index on the db and collection. + * + * @param name - Name of the collection to create the index on. + * @param indexSpec - Specify the field to index, or an index specification + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + createIndex(name: string, indexSpec: IndexSpecification): Promise; + createIndex(name: string, indexSpec: IndexSpecification, callback?: Callback): void; + createIndex(name: string, indexSpec: IndexSpecification, options: CreateIndexesOptions): Promise; + createIndex(name: string, indexSpec: IndexSpecification, options: CreateIndexesOptions, callback: Callback): void; + /** + * Add a user to the database + * + * @param username - The username for the new user + * @param password - An optional password for the new user + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + addUser(username: string): Promise; + addUser(username: string, callback: Callback): void; + addUser(username: string, password: string): Promise; + addUser(username: string, password: string, callback: Callback): void; + addUser(username: string, options: AddUserOptions): Promise; + addUser(username: string, options: AddUserOptions, callback: Callback): void; + addUser(username: string, password: string, options: AddUserOptions): Promise; + addUser(username: string, password: string, options: AddUserOptions, callback: Callback): void; + /** + * Remove a user from a database + * + * @param username - The username to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + removeUser(username: string): Promise; + removeUser(username: string, callback: Callback): void; + removeUser(username: string, options: RemoveUserOptions): Promise; + removeUser(username: string, options: RemoveUserOptions, callback: Callback): void; + /** + * Set the current profiling level of MongoDB + * + * @param level - The new profiling level (off, slow_only, all). 
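For the profiling helpers and the `Db`-level `createIndex` declared around here, a rough sketch under the assumption of a self-managed deployment (profiling commands are typically unavailable on shared or serverless tiers); all names are placeholders.

```js
const { MongoClient } = require('mongodb');

async function profilingSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const db = client.db('todo');

    // Profile only slow operations, then read the level back.
    await db.setProfilingLevel('slow_only');
    console.log(await db.profilingLevel()); // 'slow_only'

    // Create an index through the Db facade rather than the Collection one.
    await db.createIndex('tasks', { createdAt: -1 }, { name: 'newest_first' });
  } finally {
    await client.close();
  }
}

profilingSketch().catch(console.error);
```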
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + setProfilingLevel(level: ProfilingLevel): Promise; + setProfilingLevel(level: ProfilingLevel, callback: Callback): void; + setProfilingLevel(level: ProfilingLevel, options: SetProfilingLevelOptions): Promise; + setProfilingLevel(level: ProfilingLevel, options: SetProfilingLevelOptions, callback: Callback): void; + /** + * Retrieve the current profiling Level for MongoDB + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + profilingLevel(): Promise; + profilingLevel(callback: Callback): void; + profilingLevel(options: ProfilingLevelOptions): Promise; + profilingLevel(options: ProfilingLevelOptions, callback: Callback): void; + /** + * Retrieves this collections index info. + * + * @param name - The name of the collection. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexInformation(name: string): Promise; + indexInformation(name: string, callback: Callback): void; + indexInformation(name: string, options: IndexInformationOptions): Promise; + indexInformation(name: string, options: IndexInformationOptions, callback: Callback): void; + /** + * Unref all sockets + * @deprecated This function is deprecated and will be removed in the next major version. + */ + unref(): void; + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this database. Will ignore all + * changes to system collections. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline?: Document[], options?: ChangeStreamOptions): ChangeStream; + /** Return the db logger */ + getLogger(): Logger; + get logger(): Logger; +} + +/* Excluded from this release type: DB_AGGREGATE_COLLECTION */ + +/** @public */ +export declare interface DbOptions extends BSONSerializeOptions, WriteConcernOptions, LoggerOptions { + /** If the database authentication is dependent on another databaseName. */ + authSource?: string; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; + /** The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST). */ + readPreference?: ReadPreferenceLike; + /** A primary key factory object for generation of custom _id keys. */ + pkFactory?: PkFactory; + /** Specify a read concern for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcern; + /** Should retry failed writes */ + retryWrites?: boolean; +} + +/* Excluded from this release type: DbPrivate */ +export { DBRef } + +/** @public */ +export declare interface DbStatsOptions extends CommandOperationOptions { + /** Divide the returned sizes by scale value. */ + scale?: number; +} + +export { Decimal128 } + +/** @public */ +export declare interface DeleteManyModel { + /** The filter to limit the deleted documents. 
*/ + filter: Filter; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; +} + +/** @public */ +export declare interface DeleteOneModel { + /** The filter to limit the deleted documents. */ + filter: Filter; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; +} + +/** @public */ +export declare interface DeleteOptions extends CommandOperationOptions, WriteConcernOptions { + /** If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails. */ + ordered?: boolean; + /** A user-provided comment to attach to this command */ + comment?: string | Document; + /** Specifies the collation to use for the operation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; + /** @deprecated use `removeOne` or `removeMany` to implicitly specify the limit */ + single?: boolean; +} + +/** @public */ +export declare interface DeleteResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined. */ + acknowledged: boolean; + /** The number of documents that were deleted */ + deletedCount: number; +} + +/** @public */ +export declare interface DeleteStatement { + /** The query that matches documents to delete. */ + q: Document; + /** The number of matching documents to delete. */ + limit: number; + /** Specifies the collation to use for the operation. */ + collation?: CollationOptions; + /** A document or string that specifies the index to use to support the query predicate. */ + hint?: Hint; + /** A user-provided comment to attach to this command */ + comment?: string | Document; +} + +/* Excluded from this release type: deserialize */ + +/** @public */ +export declare interface DestroyOptions { + /** Force the destruction. */ + force?: boolean; +} + +/** @public */ +export declare type DistinctOptions = CommandOperationOptions; + +export { Document } + +export { Double } + +/** @public */ +export declare interface DriverInfo { + name?: string; + version?: string; + platform?: string; +} + +/** @public */ +export declare type DropCollectionOptions = CommandOperationOptions; + +/** @public */ +export declare type DropDatabaseOptions = CommandOperationOptions; + +/** @public */ +export declare type DropIndexesOptions = CommandOperationOptions; + +/* Excluded from this release type: Encrypter */ + +/* Excluded from this release type: EncrypterOptions */ + +/** @public */ +export declare interface EndSessionOptions { + /* Excluded from this release type: error */ + force?: boolean; + forceClear?: boolean; +} + +/** TypeScript Omit (Exclude to be specific) does not work for objects with an "any" indexed type, and breaks discriminated unions @public */ +export declare type EnhancedOmit = string extends keyof TRecordOrUnion ? TRecordOrUnion : TRecordOrUnion extends any ? 
Pick> : never; + +/** @public */ +export declare interface ErrorDescription extends Document { + message?: string; + errmsg?: string; + $err?: string; + errorLabels?: string[]; + errInfo?: Document; +} + +/** @public */ +export declare interface EstimatedDocumentCountOptions extends CommandOperationOptions { + /** + * The maximum amount of time to allow the operation to run. + * + * This option is sent only if the caller explicitly provides a value. The default is to not send a value. + */ + maxTimeMS?: number; +} + +/** @public */ +export declare interface EvalOptions extends CommandOperationOptions { + nolock?: boolean; +} + +/** @public */ +export declare type EventEmitterWithState = { + /* Excluded from this release type: stateChanged */ +}; + +/** + * Event description type + * @public + */ +export declare type EventsDescription = Record; + +/* Excluded from this release type: ExecutionResult */ + +/* Excluded from this release type: Explain */ + +/** @public */ +export declare interface ExplainOptions { + /** Specifies the verbosity mode for the explain output. */ + explain?: ExplainVerbosityLike; +} + +/** @public */ +export declare const ExplainVerbosity: Readonly<{ + readonly queryPlanner: "queryPlanner"; + readonly queryPlannerExtended: "queryPlannerExtended"; + readonly executionStats: "executionStats"; + readonly allPlansExecution: "allPlansExecution"; +}>; + +/** @public */ +export declare type ExplainVerbosity = string; + +/** + * For backwards compatibility, true is interpreted as "allPlansExecution" + * and false as "queryPlanner". Prior to server version 3.6, aggregate() + * ignores the verbosity parameter and executes in "queryPlanner". + * @public + */ +export declare type ExplainVerbosityLike = ExplainVerbosity | boolean; + +/** A MongoDB filter can be some portion of the schema or a set of operators @public */ +export declare type Filter = { + [P in keyof TSchema]?: Condition; +} & RootFilterOperators; + +/** @public */ +export declare type FilterOperations = T extends Record ? { + [key in keyof T]?: FilterOperators; +} : FilterOperators; + +/** @public */ +export declare interface FilterOperators extends Document { + $eq?: TValue; + $gt?: TValue; + $gte?: TValue; + $in?: ReadonlyArray; + $lt?: TValue; + $lte?: TValue; + $ne?: TValue; + $nin?: ReadonlyArray; + $not?: TValue extends string ? FilterOperators | RegExp : FilterOperators; + /** + * When `true`, `$exists` matches the documents that contain the field, + * including documents where the field value is null. + */ + $exists?: boolean; + $type?: BSONType | BSONTypeAlias; + $expr?: Record; + $jsonSchema?: Record; + $mod?: TValue extends number ? [number, number] : never; + $regex?: TValue extends string ? RegExp | BSONRegExp | string : never; + $options?: TValue extends string ? string : never; + $geoIntersects?: { + $geometry: Document; + }; + $geoWithin?: Document; + $near?: Document; + $nearSphere?: Document; + $maxDistance?: number; + $all?: ReadonlyArray; + $elemMatch?: Document; + $size?: TValue extends ReadonlyArray ? 
number : never; + $bitsAllClear?: BitwiseFilter; + $bitsAllSet?: BitwiseFilter; + $bitsAnyClear?: BitwiseFilter; + $bitsAnySet?: BitwiseFilter; + $rand?: Record; +} + +/** @public */ +export declare type FinalizeFunction = (key: TKey, reducedValue: TValue) => TValue; + +/** @public */ +export declare class FindCursor extends AbstractCursor { + /* Excluded from this release type: [kFilter] */ + /* Excluded from this release type: [kNumReturned] */ + /* Excluded from this release type: [kBuiltOptions] */ + /* Excluded from this release type: __constructor */ + clone(): FindCursor; + map(transform: (doc: TSchema) => T): FindCursor; + /* Excluded from this release type: _initialize */ + /* Excluded from this release type: _getMore */ + /** Get the count of documents for this cursor */ + count(): Promise; + count(callback: Callback): void; + count(options: CountOptions): Promise; + count(options: CountOptions, callback: Callback): void; + /** Execute the explain for the cursor */ + explain(): Promise; + explain(callback: Callback): void; + explain(verbosity?: ExplainVerbosityLike): Promise; + /** Set the cursor query */ + filter(filter: Document): this; + /** + * Set the cursor hint + * + * @param hint - If specified, then the query system will only consider plans using the hinted index. + */ + hint(hint: Hint): this; + /** + * Set the cursor min + * + * @param min - Specify a $min value to specify the inclusive lower bound for a specific index in order to constrain the results of find(). The $min specifies the lower bound for all keys of a specific index in order. + */ + min(min: Document): this; + /** + * Set the cursor max + * + * @param max - Specify a $max value to specify the exclusive upper bound for a specific index in order to constrain the results of find(). The $max specifies the upper bound for all keys of a specific index in order. + */ + max(max: Document): this; + /** + * Set the cursor returnKey. + * If set to true, modifies the cursor to only return the index field or fields for the results of the query, rather than documents. + * If set to true and the query does not use an index to perform the read operation, the returned documents will not contain any fields. + * + * @param value - the returnKey value. + */ + returnKey(value: boolean): this; + /** + * Modifies the output of a query by adding a field $recordId to matching documents. $recordId is the internal key which uniquely identifies a document in a collection. + * + * @param value - The $showDiskLoc option has now been deprecated and replaced with the showRecordId field. $showDiskLoc will still be accepted for OP_QUERY stye find. + */ + showRecordId(value: boolean): this; + /** + * Add a query modifier to the cursor query + * + * @param name - The query modifier (must start with $, such as $orderby etc) + * @param value - The modifier value. + */ + addQueryModifier(name: string, value: string | boolean | number | Document): this; + /** + * Add a comment to the cursor query allowing for tracking the comment in the log. + * + * @param value - The comment attached to this query. + */ + comment(value: string): this; + /** + * Set a maxAwaitTimeMS on a tailing cursor query to allow to customize the timeout value for the option awaitData (Only supported on MongoDB 3.2 or higher, ignored otherwise) + * + * @param value - Number of milliseconds to wait before aborting the tailed query. 
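The `Filter` operators and the chainable `FindCursor` methods declared in this stretch combine naturally. A hedged sketch of a filtered, projected, paginated query follows; the field names, page size, and time limit are illustrative assumptions.

```js
const { MongoClient } = require('mongodb');

async function cursorSketch() {
  const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI
  try {
    await client.connect();
    const tasks = client.db('todo').collection('tasks');

    // Filter operators ($in, $exists, $gte) combined with cursor chaining.
    const page = await tasks
      .find({
        done: false,
        priority: { $in: ['high', 'medium'] },
        dueDate: { $exists: true, $gte: new Date('2021-10-01') }
      })
      .project({ _id: 0, title: 1, dueDate: 1 })
      .sort({ dueDate: 1 })
      .skip(20)
      .limit(10)
      .maxTimeMS(2000)
      .toArray();

    console.log(page);
  } finally {
    await client.close();
  }
}

cursorSketch().catch(console.error);
```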
+ */ + maxAwaitTimeMS(value: number): this; + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value: number): this; + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic + * {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: FindCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: FindCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor<{ a: number; b: string }> = coll.find(); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.find().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project(value: Document): FindCursor; + /** + * Sets the sort order of the cursor query. + * + * @param sort - The key or keys set for the sort. + * @param direction - The direction of the sorting (1 or -1). + */ + sort(sort: Sort | string, direction?: SortDirection): this; + /** + * Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) + * + * @remarks + * {@link https://docs.mongodb.com/manual/reference/command/find/#find-cmd-allowdiskuse | find command allowDiskUse documentation} + */ + allowDiskUse(): this; + /** + * Set the collation options for the cursor. + * + * @param value - The cursor collation options (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). + */ + collation(value: CollationOptions): this; + /** + * Set the limit for the cursor. + * + * @param value - The limit for the cursor query. + */ + limit(value: number): this; + /** + * Set the skip for the cursor. + * + * @param value - The skip for the cursor query. + */ + skip(value: number): this; +} + +/** @public */ +export declare interface FindOneAndDeleteOptions extends CommandOperationOptions { + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). 
*/ + let?: Document; +} + +/** @public */ +export declare interface FindOneAndReplaceOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** When set to 'after', returns the updated document rather than the original. The default is 'before'. */ + returnDocument?: ReturnDocument; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Upsert the document if it does not exist. */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +/** @public */ +export declare interface FindOneAndUpdateOptions extends CommandOperationOptions { + /** Optional list of array filters referenced in filtered positional operators */ + arrayFilters?: Document[]; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** When set to 'after', returns the updated document rather than the original. The default is 'before'. */ + returnDocument?: ReturnDocument; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Upsert the document if it does not exist. */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +/** + * A builder object that is returned from {@link BulkOperationBase#find}. + * Is used to build a write operation that involves a query filter. + * + * @public + */ +export declare class FindOperators { + bulkOperation: BulkOperationBase; + /* Excluded from this release type: __constructor */ + /** Add a multiple update operation to the bulk operation */ + update(updateDocument: Document): BulkOperationBase; + /** Add a single update operation to the bulk operation */ + updateOne(updateDocument: Document): BulkOperationBase; + /** Add a replace one operation to the bulk operation */ + replaceOne(replacement: Document): BulkOperationBase; + /** Add a delete one operation to the bulk operation */ + deleteOne(): BulkOperationBase; + /** Add a delete many operation to the bulk operation */ + delete(): BulkOperationBase; + /** Upsert modifier for update bulk operation, noting that this operation is an upsert. */ + upsert(): this; + /** Specifies the collation for the query condition. */ + collation(collation: CollationOptions): this; + /** Specifies arrayFilters for UpdateOne or UpdateMany bulk operations. */ + arrayFilters(arrayFilters: Document[]): this; +} + +/** + * @public + * @typeParam TSchema - Unused schema definition, deprecated usage, only specify `FindOptions` with no generic + */ +export declare interface FindOptions extends CommandOperationOptions { + /** Sets the limit of documents returned in the query. 
*/ + limit?: number; + /** Set to sort the documents coming back from the query. Array of indexes, `[['a', 1]]` etc. */ + sort?: Sort; + /** The fields to return in the query. Object of fields to either include or exclude (one of, not both), `{'a':1, 'b': 1}` **or** `{'a': 0, 'b': 0}` */ + projection?: Document; + /** Set to skip N documents ahead in your query (useful for pagination). */ + skip?: number; + /** Tell the query to use specific indexes in the query. Object of indexes to use, `{'_id':1}` */ + hint?: Hint; + /** Specify if the cursor can timeout. */ + timeout?: boolean; + /** Specify if the cursor is tailable. */ + tailable?: boolean; + /** Specify if the cursor is a a tailable-await cursor. Requires `tailable` to be true */ + awaitData?: boolean; + /** Set the batchSize for the getMoreCommand when iterating over the query results. */ + batchSize?: number; + /** If true, returns only the index keys in the resulting documents. */ + returnKey?: boolean; + /** The inclusive lower bound for a specific index */ + min?: Document; + /** The exclusive upper bound for a specific index */ + max?: Document; + /** You can put a $comment field on a query to make looking in the profiler logs simpler. */ + comment?: string | Document; + /** Number of milliseconds to wait before aborting the query. */ + maxTimeMS?: number; + /** The maximum amount of time for the server to wait on new documents to satisfy a tailable cursor query. Requires `tailable` and `awaitData` to be true */ + maxAwaitTimeMS?: number; + /** The server normally times out idle cursors after an inactivity period (10 minutes) to prevent excess memory use. Set this option to prevent that. */ + noCursorTimeout?: boolean; + /** Specify collation (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). */ + collation?: CollationOptions; + /** Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) */ + allowDiskUse?: boolean; + /** Determines whether to close the cursor after the first batch. Defaults to false. */ + singleBatch?: boolean; + /** For queries against a sharded collection, allows the command (or subsequent getMore commands) to return partial results, rather than an error, if one or more queried shards are unavailable. */ + allowPartialResults?: boolean; + /** Determines whether to return the record identifier for each document. If true, adds a field $recordId to the returned documents. */ + showRecordId?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +/** @public */ +export declare type Flatten = Type extends ReadonlyArray ? Item : Type; + +/** @public */ +export declare type GenericListener = (...args: any[]) => void; + +/* Excluded from this release type: GetMore */ + +/* Excluded from this release type: GetMoreOptions */ + +/** + * Constructor for a streaming GridFS interface + * @public + */ +export declare class GridFSBucket extends TypedEventEmitter { + /* Excluded from this release type: s */ + /** + * When the first call to openUploadStream is made, the upload stream will + * check to see if it needs to create the proper indexes on the chunks and + * files collections. This event is fired either when 1) it determines that + * no index creation is necessary, 2) when it successfully creates the + * necessary indexes. 
+ * @event + */ + static readonly INDEX: "index"; + constructor(db: Db, options?: GridFSBucketOptions); + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS. The stream's 'id' property contains the resulting + * file's id. + * + * @param filename - The value of the 'filename' key in the files doc + * @param options - Optional settings. + */ + openUploadStream(filename: string, options?: GridFSBucketWriteStreamOptions): GridFSBucketWriteStream; + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS for a custom file id. The stream's 'id' property contains the resulting + * file's id. + */ + openUploadStreamWithId(id: ObjectId, filename: string, options?: GridFSBucketWriteStreamOptions): GridFSBucketWriteStream; + /** Returns a readable stream (GridFSBucketReadStream) for streaming file data from GridFS. */ + openDownloadStream(id: ObjectId, options?: GridFSBucketReadStreamOptions): GridFSBucketReadStream; + /** + * Deletes a file with the given id + * + * @param id - The id of the file doc + */ + delete(id: ObjectId): Promise; + delete(id: ObjectId, callback: Callback): void; + /** Convenience wrapper around find on the files collection */ + find(filter?: Filter, options?: FindOptions): FindCursor; + /** + * Returns a readable stream (GridFSBucketReadStream) for streaming the + * file with the given name from GridFS. If there are multiple files with + * the same name, this will stream the most recent file with the given name + * (as determined by the `uploadDate` field). You can set the `revision` + * option to change this behavior. + */ + openDownloadStreamByName(filename: string, options?: GridFSBucketReadStreamOptionsWithRevision): GridFSBucketReadStream; + /** + * Renames the file with the given _id to the given string + * + * @param id - the id of the file to rename + * @param filename - new name for the file + */ + rename(id: ObjectId, filename: string): Promise; + rename(id: ObjectId, filename: string, callback: Callback): void; + /** Removes this bucket's files collection, followed by its chunks collection. */ + drop(): Promise; + drop(callback: Callback): void; + /** Get the Db scoped logger. */ + getLogger(): Logger; +} + +/** @public */ +export declare type GridFSBucketEvents = { + index(): void; +}; + +/** @public */ +export declare interface GridFSBucketOptions extends WriteConcernOptions { + /** The 'files' and 'chunks' collections will be prefixed with the bucket name followed by a dot. */ + bucketName?: string; + /** Number of bytes stored in each chunk. Defaults to 255KB */ + chunkSizeBytes?: number; + /** Read preference to be passed to read operations */ + readPreference?: ReadPreference; +} + +/* Excluded from this release type: GridFSBucketPrivate */ + +/** + * A readable stream that enables you to read buffers from GridFS. + * + * Do not instantiate this class directly. Use `openDownloadStream()` instead. + * @public + */ +export declare class GridFSBucketReadStream extends Readable implements NodeJS.ReadableStream { + /* Excluded from this release type: s */ + /** + * An error occurred + * @event + */ + static readonly ERROR: "error"; + /** + * Fires when the stream loaded the file document corresponding to the provided id. + * @event + */ + static readonly FILE: "file"; + /** + * Emitted when a chunk of data is available to be consumed. + * @event + */ + static readonly DATA: "data"; + /** + * Fired when the stream is exhausted (no more data events). 
+ * @event + */ + static readonly END: "end"; + /** + * Fired when the stream is exhausted and the underlying cursor is killed + * @event + */ + static readonly CLOSE: "close"; + /* Excluded from this release type: __constructor */ + /* Excluded from this release type: _read */ + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param start - 0-based offset in bytes to start streaming from + */ + start(start?: number): this; + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param end - Offset in bytes to stop reading at + */ + end(end?: number): this; + /** + * Marks this stream as aborted (will never push another `data` event) + * and kills the underlying cursor. Will emit the 'end' event, and then + * the 'close' event once the cursor is successfully killed. + * + * @param callback - called when the cursor is successfully closed or an error occurred. + */ + abort(callback?: Callback): void; +} + +/** @public */ +export declare interface GridFSBucketReadStreamOptions { + sort?: Sort; + skip?: number; + /** 0-based offset in bytes to start streaming from */ + start?: number; + /** 0-based offset in bytes to stop streaming before */ + end?: number; +} + +/** @public */ +export declare interface GridFSBucketReadStreamOptionsWithRevision extends GridFSBucketReadStreamOptions { + /** The revision number relative to the oldest file with the given filename. 0 + * gets you the oldest file, 1 gets you the 2nd oldest, -1 gets you the + * newest. */ + revision?: number; +} + +/* Excluded from this release type: GridFSBucketReadStreamPrivate */ + +/** + * A writable stream that enables you to write buffers to GridFS. + * + * Do not instantiate this class directly. Use `openUploadStream()` instead. + * @public + */ +export declare class GridFSBucketWriteStream extends Writable implements NodeJS.WritableStream { + bucket: GridFSBucket; + chunks: Collection; + filename: string; + files: Collection; + options: GridFSBucketWriteStreamOptions; + done: boolean; + id: ObjectId; + chunkSizeBytes: number; + bufToStore: Buffer; + length: number; + n: number; + pos: number; + state: { + streamEnd: boolean; + outstandingRequests: number; + errored: boolean; + aborted: boolean; + }; + writeConcern?: WriteConcern; + /** @event */ + static readonly CLOSE = "close"; + /** @event */ + static readonly ERROR = "error"; + /** + * `end()` was called and the write stream successfully wrote the file metadata and all the chunks to MongoDB. + * @event + */ + static readonly FINISH = "finish"; + /* Excluded from this release type: __constructor */ + /** + * Write a buffer to the stream. + * + * @param chunk - Buffer to write + * @param encodingOrCallback - Optional encoding for the buffer + * @param callback - Function to call when the chunk was added to the buffer, or if the entire chunk was persisted to MongoDB if this chunk caused a flush. + * @returns False if this write required flushing a chunk to MongoDB. True otherwise. 
+ */ + write(chunk: Buffer | string): boolean; + write(chunk: Buffer | string, callback: Callback): boolean; + write(chunk: Buffer | string, encoding: BufferEncoding | undefined): boolean; + write(chunk: Buffer | string, encoding: BufferEncoding | undefined, callback: Callback): boolean; + /** + * Places this write stream into an aborted state (all future writes fail) + * and deletes all chunks that have already been written. + * + * @param callback - called when chunks are successfully removed or error occurred + */ + abort(): Promise; + abort(callback: Callback): void; + /** + * Tells the stream that no more data will be coming in. The stream will + * persist the remaining data to MongoDB, write the files document, and + * then emit a 'finish' event. + * + * @param chunk - Buffer to write + * @param encoding - Optional encoding for the buffer + * @param callback - Function to call when all files and chunks have been persisted to MongoDB + */ + end(): void; + end(chunk: Buffer): void; + end(callback: Callback): void; + end(chunk: Buffer, callback: Callback): void; + end(chunk: Buffer, encoding: BufferEncoding): void; + end(chunk: Buffer, encoding: BufferEncoding | undefined, callback: Callback): void; +} + +/** @public */ +export declare interface GridFSBucketWriteStreamOptions extends WriteConcernOptions { + /** Overwrite this bucket's chunkSizeBytes for this file */ + chunkSizeBytes?: number; + /** Custom file id for the GridFS file. */ + id?: ObjectId; + /** Object to store in the file document's `metadata` field */ + metadata?: Document; + /** String to store in the file document's `contentType` field */ + contentType?: string; + /** Array of strings to store in the file document's `aliases` field */ + aliases?: string[]; +} + +/** @public */ +export declare interface GridFSChunk { + _id: ObjectId; + files_id: ObjectId; + n: number; + data: Buffer | Uint8Array; +} + +/** @public */ +export declare interface GridFSFile { + _id: ObjectId; + length: number; + chunkSize: number; + filename: string; + contentType?: string; + aliases?: string[]; + metadata?: Document; + uploadDate: Date; +} + +/** @public */ +export declare interface HedgeOptions { + /** Explicitly enable or disable hedged reads. */ + enabled?: boolean; +} + +/** @public */ +export declare type Hint = string | Document; + +/** @public */ +export declare class HostAddress { + host: string | undefined; + port: number | undefined; + socketPath: string | undefined; + isIPv6: boolean | undefined; + constructor(hostString: string); + /** + * @param ipv6Brackets - optionally request ipv6 bracket notation required for connection strings + */ + toString(ipv6Brackets?: boolean): string; + static fromString(s: string): HostAddress; +} + +/** @public */ +export declare interface IndexDescription extends Pick { + collation?: CollationOptions; + name?: string; + key: Document; +} + +/** @public */ +export declare type IndexDirection = -1 | 1 | '2d' | '2dsphere' | 'text' | 'geoHaystack' | number; + +/** @public */ +export declare interface IndexInformationOptions { + full?: boolean; + readPreference?: ReadPreference; + session?: ClientSession; +} + +/** @public */ +export declare type IndexSpecification = OneOrMore; + +/** Given an object shaped type, return the type of the _id field or default to ObjectId @public */ +export declare type InferIdType = TSchema extends { + _id: infer IdType; +} ? {} extends IdType ? Exclude : unknown extends IdType ? 
ObjectId : IdType : ObjectId; + +/** @public */ +export declare interface InsertManyResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The number of inserted documents for this operations */ + insertedCount: number; + /** Map of the index of the inserted document to the id of the inserted document */ + insertedIds: { + [key: number]: InferIdType; + }; +} + +/** @public */ +export declare interface InsertOneModel { + /** The document to insert. */ + document: OptionalId; +} + +/** @public */ +export declare interface InsertOneOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; +} + +/** @public */ +export declare interface InsertOneResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The identifier that was inserted. If the server generated the identifier, this value will be null as the driver does not have access to that data */ + insertedId: InferIdType; +} + +export { Int32 } + +/** @public */ +export declare type IntegerType = number | Int32 | Long; + +/* Excluded from this release type: InternalAbstractCursorOptions */ + +/* Excluded from this release type: InterruptibleAsyncInterval */ + +/** @public */ +export declare type IsAny = true extends false & Type ? ResultIfAny : ResultIfNotAny; + +/* Excluded from this release type: kBeforeHandshake */ + +/* Excluded from this release type: kBuffer */ + +/* Excluded from this release type: kBuffers */ + +/* Excluded from this release type: kBuiltOptions */ + +/* Excluded from this release type: kCancellationToken */ + +/* Excluded from this release type: kCancellationToken_2 */ + +/* Excluded from this release type: kCancelled */ + +/* Excluded from this release type: kCancelled_2 */ + +/* Excluded from this release type: kCheckedOut */ + +/* Excluded from this release type: kClosed */ + +/* Excluded from this release type: kClosed_2 */ + +/* Excluded from this release type: kClusterTime */ + +/* Excluded from this release type: kConnection */ + +/* Excluded from this release type: kConnectionCounter */ + +/* Excluded from this release type: kConnections */ + +/* Excluded from this release type: kCursorStream */ + +/* Excluded from this release type: kDescription */ + +/* Excluded from this release type: kDocuments */ + +/* Excluded from this release type: kErrorLabels */ + +/** @public */ +export declare type KeysOfAType = { + [key in keyof TSchema]: NonNullable extends Type ? key : never; +}[keyof TSchema]; + +/** @public */ +export declare type KeysOfOtherType = { + [key in keyof TSchema]: NonNullable extends Type ? 
never : key; +}[keyof TSchema]; + +/* Excluded from this release type: kFilter */ + +/* Excluded from this release type: kFullResult */ + +/* Excluded from this release type: kGeneration */ + +/* Excluded from this release type: kGeneration_2 */ + +/* Excluded from this release type: kId */ + +/* Excluded from this release type: KillCursor */ + +/* Excluded from this release type: kInitialized */ + +/* Excluded from this release type: kInternalClient */ + +/* Excluded from this release type: kIsMaster */ + +/* Excluded from this release type: kKilled */ + +/* Excluded from this release type: kLastUseTime */ + +/* Excluded from this release type: kLength */ + +/* Excluded from this release type: kLogger */ + +/* Excluded from this release type: kMessageStream */ + +/* Excluded from this release type: kMetrics */ + +/* Excluded from this release type: kMinPoolSizeTimer */ + +/* Excluded from this release type: kMode */ + +/* Excluded from this release type: kMonitor */ + +/* Excluded from this release type: kMonitorId */ + +/* Excluded from this release type: kNamespace */ + +/* Excluded from this release type: kNumReturned */ + +/* Excluded from this release type: kOptions */ + +/* Excluded from this release type: kOptions_2 */ + +/* Excluded from this release type: kOptions_3 */ + +/* Excluded from this release type: kPermits */ + +/* Excluded from this release type: kPinnedConnection */ + +/* Excluded from this release type: kPipeline */ + +/* Excluded from this release type: kProcessingWaitQueue */ + +/* Excluded from this release type: kQueue */ + +/* Excluded from this release type: kResumeQueue */ + +/* Excluded from this release type: kRoundTripTime */ + +/* Excluded from this release type: kRTTPinger */ + +/* Excluded from this release type: kServer */ + +/* Excluded from this release type: kServer_2 */ + +/* Excluded from this release type: kServerError */ + +/* Excluded from this release type: kServerSession */ + +/* Excluded from this release type: kServiceGenerations */ + +/* Excluded from this release type: kSession */ + +/* Excluded from this release type: kSession_2 */ + +/* Excluded from this release type: kSnapshotEnabled */ + +/* Excluded from this release type: kSnapshotTime */ + +/* Excluded from this release type: kStream */ + +/* Excluded from this release type: kTopology */ + +/* Excluded from this release type: kTransform */ + +/* Excluded from this release type: kWaitQueue */ + +/* Excluded from this release type: kWaitQueue_2 */ + +/** @public */ +export declare const LEGAL_TCP_SOCKET_OPTIONS: readonly ["family", "hints", "localAddress", "localPort", "lookup"]; + +/** @public */ +export declare const LEGAL_TLS_SOCKET_OPTIONS: readonly ["ALPNProtocols", "ca", "cert", "checkServerIdentity", "ciphers", "crl", "ecdhCurve", "key", "minDHSize", "passphrase", "pfx", "rejectUnauthorized", "secureContext", "secureProtocol", "servername", "session"]; + +/** @public */ +export declare class ListCollectionsCursor | CollectionInfo = Pick | CollectionInfo> extends AbstractCursor { + parent: Db; + filter: Document; + options?: ListCollectionsOptions; + constructor(db: Db, filter: Document, options?: ListCollectionsOptions); + clone(): ListCollectionsCursor; + /* Excluded from this release type: _initialize */ +} + +/** @public */ +export declare interface ListCollectionsOptions extends CommandOperationOptions { + /** Since 4.0: If true, will only return the collection name in the response, and will omit additional info */ + nameOnly?: boolean; + /** The batchSize for the returned 
command cursor or if pre 2.8 the systems batch collection */ + batchSize?: number; +} + +/** @public */ +export declare interface ListDatabasesOptions extends CommandOperationOptions { + /** A query predicate that determines which databases are listed */ + filter?: Document; + /** A flag to indicate whether the command should return just the database names, or return both database names and size information */ + nameOnly?: boolean; + /** A flag that determines which databases are returned based on the user privileges when access control is enabled */ + authorizedDatabases?: boolean; +} + +/** @public */ +export declare type ListDatabasesResult = string[] | Document[]; + +/** @public */ +export declare class ListIndexesCursor extends AbstractCursor { + parent: Collection; + options?: ListIndexesOptions; + constructor(collection: Collection, options?: ListIndexesOptions); + clone(): ListIndexesCursor; + /* Excluded from this release type: _initialize */ +} + +/** @public */ +export declare interface ListIndexesOptions extends CommandOperationOptions { + /** The batchSize for the returned command cursor or if pre 2.8 the systems batch collection */ + batchSize?: number; +} + +/** + * @public + */ +export declare class Logger { + className: string; + /** + * Creates a new Logger instance + * + * @param className - The Class name associated with the logging instance + * @param options - Optional logging settings + */ + constructor(className: string, options?: LoggerOptions); + /** + * Log a message at the debug level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + debug(message: string, object?: unknown): void; + /** + * Log a message at the warn level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + warn(message: string, object?: unknown): void; + /** + * Log a message at the info level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + info(message: string, object?: unknown): void; + /** + * Log a message at the error level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + error(message: string, object?: unknown): void; + /** Is the logger set at info level */ + isInfo(): boolean; + /** Is the logger set at error level */ + isError(): boolean; + /** Is the logger set at error level */ + isWarn(): boolean; + /** Is the logger set at debug level */ + isDebug(): boolean; + /** Resets the logger to default settings, error and no filtered classes */ + static reset(): void; + /** Get the current logger function */ + static currentLogger(): LoggerFunction; + /** + * Set the current logger function + * + * @param logger - Custom logging function + */ + static setCurrentLogger(logger: LoggerFunction): void; + /** + * Filter log messages for a particular class + * + * @param type - The type of filter (currently only class) + * @param values - The filters to apply + */ + static filter(type: string, values: string[]): void; + /** + * Set the current log level + * + * @param newLevel - Set current log level (debug, warn, info, error) + */ + static setLevel(newLevel: LoggerLevel): void; +} + +/** @public */ +export declare type LoggerFunction = (message?: any, ...optionalParams: any[]) => void; + +/** @public */ +export declare const LoggerLevel: Readonly<{ + readonly ERROR: "error"; + readonly WARN: "warn"; + readonly INFO: "info"; + readonly DEBUG: "debug"; + readonly error: "error"; + readonly warn: "warn"; + 
readonly info: "info"; + readonly debug: "debug"; +}>; + +/** @public */ +export declare type LoggerLevel = typeof LoggerLevel[keyof typeof LoggerLevel]; + +/** @public */ +export declare interface LoggerOptions { + logger?: LoggerFunction; + loggerLevel?: LoggerLevel; +} + +export { Long } + +export { Map_2 as Map } + +/** @public */ +export declare type MapFunction = (this: TSchema) => void; + +/** @public */ +export declare interface MapReduceOptions extends CommandOperationOptions { + /** Sets the output target for the map reduce job. */ + out?: 'inline' | { + inline: 1; + } | { + replace: string; + } | { + merge: string; + } | { + reduce: string; + }; + /** Query filter object. */ + query?: Document; + /** Sorts the input objects using this key. Useful for optimization, like sorting by the emit key for fewer reduces. */ + sort?: Sort; + /** Number of objects to return from collection. */ + limit?: number; + /** Keep temporary data. */ + keeptemp?: boolean; + /** Finalize function. */ + finalize?: FinalizeFunction | string; + /** Can pass in variables that can be access from map/reduce/finalize. */ + scope?: Document; + /** It is possible to make the execution stay in JS. Provided in MongoDB \> 2.0.X. */ + jsMode?: boolean; + /** Provide statistics on job execution time. */ + verbose?: boolean; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; +} + +/** @public */ +export declare type MatchKeysAndValues = Readonly> & Record; + +export { MaxKey } + +/* Excluded from this release type: MessageStream */ + +/* Excluded from this release type: MessageStreamOptions */ +export { MinKey } + +/** @public */ +export declare interface ModifyResult { + value: TSchema | null; + lastErrorObject?: Document; + ok: 0 | 1; +} + +/** @public */ +export declare const MONGO_CLIENT_EVENTS: readonly ["connectionPoolCreated", "connectionPoolClosed", "connectionCreated", "connectionReady", "connectionClosed", "connectionCheckOutStarted", "connectionCheckOutFailed", "connectionCheckedOut", "connectionCheckedIn", "connectionPoolCleared", ...("error" | "close" | "commandStarted" | "commandSucceeded" | "commandFailed" | "serverHeartbeatStarted" | "serverHeartbeatSucceeded" | "serverHeartbeatFailed" | "timeout" | "serverOpening" | "serverClosed" | "serverDescriptionChanged" | "topologyOpening" | "topologyClosed" | "topologyDescriptionChanged")[]]; + +/** + * An error generated when the driver API is used incorrectly + * + * @privateRemarks + * Should **never** be directly instantiated + * + * @public + * @category Error + */ +export declare class MongoAPIError extends MongoDriverError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated when a batch command is reexecuted after one of the commands in the batch + * has failed + * + * @public + * @category Error + */ +export declare class MongoBatchReExecutionError extends MongoAPIError { + constructor(message?: string); + get name(): string; +} + +/** + * An error indicating an unsuccessful Bulk Write + * @public + * @category Error + */ +export declare class MongoBulkWriteError extends MongoServerError { + result: BulkWriteResult; + writeErrors: OneOrMore; + err?: WriteConcernError; + /** Creates a new MongoBulkWriteError */ + constructor(error: { + message: string; + code: number; + writeErrors?: WriteError[]; + } | WriteConcernError | AnyError, result: BulkWriteResult); + get name(): string; + /** Number of documents inserted. 
*/ + get insertedCount(): number; + /** Number of documents matched for update. */ + get matchedCount(): number; + /** Number of documents modified. */ + get modifiedCount(): number; + /** Number of documents deleted. */ + get deletedCount(): number; + /** Number of documents upserted. */ + get upsertedCount(): number; + /** Inserted document generated Id's, hash key is the index of the originating operation */ + get insertedIds(): { + [key: number]: any; + }; + /** Upserted document generated Id's, hash key is the index of the originating operation */ + get upsertedIds(): { + [key: number]: any; + }; +} + +/** + * An error generated when a ChangeStream operation fails to execute. + * + * @public + * @category Error + */ +export declare class MongoChangeStreamError extends MongoRuntimeError { + constructor(message: string); + get name(): string; +} + +/** + * The **MongoClient** class is a class that allows for making Connections to MongoDB. + * @public + * + * @remarks + * The programmatically provided options take precedent over the URI options. + * + * @example + * ```js + * // Connect using a MongoClient instance + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * const mongoClient = new MongoClient(url); + * mongoClient.connect(function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + * + * @example + * ```js + * // Connect using the MongoClient.connect static method + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + */ +export declare class MongoClient extends TypedEventEmitter { + /* Excluded from this release type: s */ + /* Excluded from this release type: topology */ + /* Excluded from this release type: [kOptions] */ + constructor(url: string, options?: MongoClientOptions); + get options(): Readonly; + get serverApi(): Readonly; + /* Excluded from this release type: monitorCommands */ + /* Excluded from this release type: monitorCommands */ + get autoEncrypter(): AutoEncrypter | undefined; + get readConcern(): ReadConcern | undefined; + get writeConcern(): WriteConcern | undefined; + get readPreference(): ReadPreference; + get bsonOptions(): BSONSerializeOptions; + get logger(): Logger; + /** + * Connect to MongoDB using a url + * + * @see docs.mongodb.org/manual/reference/connection-string/ + */ + connect(): Promise; + connect(callback: Callback): void; + /** + * Close the db and its underlying connections + * + * @param force - Force close, emitting no events + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + close(): Promise; + close(callback: Callback): void; + close(force: boolean): Promise; + close(force: boolean, callback: Callback): void; + /** + * Create a new Db instance sharing the current socket connections. + * + * @param dbName - The name of the database we want to use. If not provided, use database name from connection string. 
+ * @param options - Optional settings for Db construction + */ + db(dbName?: string, options?: DbOptions): Db; + /** + * Connect to MongoDB using a url + * + * @remarks + * The programmatically provided options take precedent over the URI options. + * + * @see https://docs.mongodb.org/manual/reference/connection-string/ + */ + static connect(url: string): Promise; + static connect(url: string, callback: Callback): void; + static connect(url: string, options: MongoClientOptions): Promise; + static connect(url: string, options: MongoClientOptions, callback: Callback): void; + /** Starts a new session on the server */ + startSession(): ClientSession; + startSession(options: ClientSessionOptions): ClientSession; + /** + * Runs a given operation with an implicitly created session. The lifetime of the session + * will be handled without the need for user interaction. + * + * NOTE: presently the operation MUST return a Promise (either explicit or implicitly as an async function) + * + * @param options - Optional settings for the command + * @param callback - An callback to execute with an implicitly created session + */ + withSession(callback: WithSessionCallback): Promise; + withSession(options: ClientSessionOptions, callback: WithSessionCallback): Promise; + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this cluster. Will ignore all + * changes to system collections, as well as the local, admin, and config databases. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline?: Document[], options?: ChangeStreamOptions): ChangeStream; + /** Return the mongo client logger */ + getLogger(): Logger; +} + +/** @public */ +export declare type MongoClientEvents = Pick & { + open(mongoClient: MongoClient): void; +}; + +/** + * Describes all possible URI query options for the mongo client + * @public + * @see https://docs.mongodb.com/manual/reference/connection-string + */ +export declare interface MongoClientOptions extends BSONSerializeOptions, SupportedNodeConnectionOptions { + /** Specifies the name of the replica set, if the mongod is a member of a replica set. */ + replicaSet?: string; + /** Enables or disables TLS/SSL for the connection. */ + tls?: boolean; + /** A boolean to enable or disables TLS/SSL for the connection. (The ssl option is equivalent to the tls option.) */ + ssl?: boolean; + /** Specifies the location of a local TLS Certificate */ + tlsCertificateFile?: string; + /** Specifies the location of a local .pem file that contains either the client's TLS/SSL certificate and key or only the client's TLS/SSL key when tlsCertificateFile is used to provide the certificate. */ + tlsCertificateKeyFile?: string; + /** Specifies the password to de-crypt the tlsCertificateKeyFile. */ + tlsCertificateKeyFilePassword?: string; + /** Specifies the location of a local .pem file that contains the root certificate chain from the Certificate Authority. This file is used to validate the certificate presented by the mongod/mongos instance. 
*/ + tlsCAFile?: string; + /** Bypasses validation of the certificates presented by the mongod/mongos instance */ + tlsAllowInvalidCertificates?: boolean; + /** Disables hostname validation of the certificate presented by the mongod/mongos instance. */ + tlsAllowInvalidHostnames?: boolean; + /** Disables various certificate validations. */ + tlsInsecure?: boolean; + /** The time in milliseconds to attempt a connection before timing out. */ + connectTimeoutMS?: number; + /** The time in milliseconds to attempt a send or receive on a socket before the attempt times out. */ + socketTimeoutMS?: number; + /** An array or comma-delimited string of compressors to enable network compression for communication between this client and a mongod/mongos instance. */ + compressors?: CompressorName[] | string; + /** An integer that specifies the compression level if using zlib for network compression. */ + zlibCompressionLevel?: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | undefined; + /** The maximum number of connections in the connection pool. */ + maxPoolSize?: number; + /** The minimum number of connections in the connection pool. */ + minPoolSize?: number; + /** The maximum number of milliseconds that a connection can remain idle in the pool before being removed and closed. */ + maxIdleTimeMS?: number; + /** The maximum time in milliseconds that a thread can wait for a connection to become available. */ + waitQueueTimeoutMS?: number; + /** Specify a read concern for the collection (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** The level of isolation */ + readConcernLevel?: ReadConcernLevel; + /** Specifies the read preferences for this connection */ + readPreference?: ReadPreferenceMode | ReadPreference; + /** Specifies, in seconds, how stale a secondary can be before the client stops using it for read operations. */ + maxStalenessSeconds?: number; + /** Specifies the tags document as a comma-separated list of colon-separated key-value pairs. */ + readPreferenceTags?: TagSet[]; + /** The auth settings for when connection to server. */ + auth?: Auth; + /** Specify the database name associated with the user’s credentials. */ + authSource?: string; + /** Specify the authentication mechanism that MongoDB will use to authenticate the connection. */ + authMechanism?: AuthMechanism; + /** Specify properties for the specified authMechanism as a comma-separated list of colon-separated key-value pairs. */ + authMechanismProperties?: AuthMechanismProperties; + /** The size (in milliseconds) of the latency window for selecting among multiple suitable MongoDB instances. */ + localThresholdMS?: number; + /** Specifies how long (in milliseconds) to block for server selection before throwing an exception. */ + serverSelectionTimeoutMS?: number; + /** heartbeatFrequencyMS controls when the driver checks the state of the MongoDB deployment. Specify the interval (in milliseconds) between checks, counted from the end of the previous check until the beginning of the next one. */ + heartbeatFrequencyMS?: number; + /** Sets the minimum heartbeat frequency. In the event that the driver has to frequently re-check a server's availability, it will wait at least this long since the previous check to avoid wasted effort. */ + minHeartbeatFrequencyMS?: number; + /** The name of the application that created this MongoClient instance. MongoDB 3.4 and newer will print this value in the server log upon establishing each connection. 
It is also recorded in the slow query log and profile collections */ + appName?: string; + /** Enables retryable reads. */ + retryReads?: boolean; + /** Enable retryable writes. */ + retryWrites?: boolean; + /** Allow a driver to force a Single topology type with a connection string containing one host */ + directConnection?: boolean; + /** Instruct the driver it is connecting to a load balancer fronting a mongos like service */ + loadBalanced?: boolean; + /** The write concern w value */ + w?: W; + /** The write concern timeout */ + wtimeoutMS?: number; + /** The journal write concern */ + journal?: boolean; + /** Validate mongod server certificate against Certificate Authority */ + sslValidate?: boolean; + /** SSL Certificate file path. */ + sslCA?: string; + /** SSL Certificate file path. */ + sslCert?: string; + /** SSL Key file file path. */ + sslKey?: string; + /** SSL Certificate pass phrase. */ + sslPass?: string; + /** SSL Certificate revocation list file path. */ + sslCRL?: string; + /** TCP Connection no delay */ + noDelay?: boolean; + /** TCP Connection keep alive enabled */ + keepAlive?: boolean; + /** The number of milliseconds to wait before initiating keepAlive on the TCP socket */ + keepAliveInitialDelay?: number; + /** Force server to assign `_id` values instead of driver */ + forceServerObjectId?: boolean; + /** Return document results as raw BSON buffers */ + raw?: boolean; + /** A primary key factory function for generation of custom `_id` keys */ + pkFactory?: PkFactory; + /** A Promise library class the application wishes to use such as Bluebird, must be ES6 compatible */ + promiseLibrary?: any; + /** The logging level */ + loggerLevel?: LoggerLevel; + /** Custom logger object */ + logger?: Logger; + /** Enable command monitoring for this client */ + monitorCommands?: boolean; + /** Server API version */ + serverApi?: ServerApi | ServerApiVersion; + /** + * Optionally enable client side auto encryption + * + * @remarks + * Automatic encryption is an enterprise only feature that only applies to operations on a collection. Automatic encryption is not supported for operations on a database or view, and operations that are not bypassed will result in error + * (see [libmongocrypt: Auto Encryption Allow-List](https://github.com/mongodb/specifications/blob/master/source/client-side-encryption/client-side-encryption.rst#libmongocrypt-auto-encryption-allow-list)). To bypass automatic encryption for all operations, set bypassAutoEncryption=true in AutoEncryptionOpts. + * + * Automatic encryption requires the authenticated user to have the [listCollections privilege action](https://docs.mongodb.com/manual/reference/command/listCollections/#dbcmd.listCollections). + * + * If a MongoClient with a limited connection pool size (i.e a non-zero maxPoolSize) is configured with AutoEncryptionOptions, a separate internal MongoClient is created if any of the following are true: + * - AutoEncryptionOptions.keyVaultClient is not passed. + * - AutoEncryptionOptions.bypassAutomaticEncryption is false. + * + * If an internal MongoClient is created, it is configured with the same options as the parent MongoClient except minPoolSize is set to 0 and AutoEncryptionOptions is omitted. 
+ */ + autoEncryption?: AutoEncryptionOptions; + /** Allows a wrapping driver to amend the client metadata generated by the driver to include information about the wrapping driver */ + driverInfo?: DriverInfo; + /* Excluded from this release type: srvPoller */ + /* Excluded from this release type: connectionType */ +} + +/* Excluded from this release type: MongoClientPrivate */ + +/** + * An error generated when a feature that is not enabled or allowed for the current server + * configuration is used + * + * + * @public + * @category Error + */ +export declare class MongoCompatibilityError extends MongoAPIError { + constructor(message: string); + get name(): string; +} + +/** + * A representation of the credentials used by MongoDB + * @public + */ +export declare class MongoCredentials { + /** The username used for authentication */ + readonly username: string; + /** The password used for authentication */ + readonly password: string; + /** The database that the user should authenticate against */ + readonly source: string; + /** The method used to authenticate */ + readonly mechanism: AuthMechanism; + /** Special properties used by some types of auth mechanisms */ + readonly mechanismProperties: AuthMechanismProperties; + constructor(options: MongoCredentialsOptions); + /** Determines if two MongoCredentials objects are equivalent */ + equals(other: MongoCredentials): boolean; + /** + * If the authentication mechanism is set to "default", resolves the authMechanism + * based on the server version and server supported sasl mechanisms. + * + * @param ismaster - An ismaster response from the server + */ + resolveAuthMechanism(ismaster?: Document): MongoCredentials; + validate(): void; + static merge(creds: MongoCredentials | undefined, options: Partial): MongoCredentials; +} + +/** @public */ +export declare interface MongoCredentialsOptions { + username: string; + password: string; + source: string; + db?: string; + mechanism?: AuthMechanism; + mechanismProperties: AuthMechanismProperties; +} + +/** + * An error thrown when an attempt is made to read from a cursor that has been exhausted + * + * @public + * @category Error + */ +export declare class MongoCursorExhaustedError extends MongoAPIError { + constructor(message?: string); + get name(): string; +} + +/** + * An error thrown when the user attempts to add options to a cursor that has already been + * initialized + * + * @public + * @category Error + */ +export declare class MongoCursorInUseError extends MongoAPIError { + constructor(message?: string); + get name(): string; +} + +/** @public */ +export declare class MongoDBNamespace { + db: string; + collection?: string; + /** + * Create a namespace object + * + * @param db - database name + * @param collection - collection name + */ + constructor(db: string, collection?: string); + toString(): string; + withCollection(collection: string): MongoDBNamespace; + static fromString(namespace?: string): MongoDBNamespace; +} + +/** + * An error generated when the driver fails to decompress + * data received from the server. 
+ * + * @public + * @category Error + */ +export declare class MongoDecompressionError extends MongoRuntimeError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated by the driver + * + * @public + * @category Error + */ +export declare class MongoDriverError extends MongoError { + constructor(message: string); + get name(): string; +} + +/** + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error, it uses the constructor with a string argument + */ +export declare class MongoError extends Error { + /* Excluded from this release type: [kErrorLabels] */ + /** + * This is a number in MongoServerError and a string in MongoDriverError + * @privateRemarks + * Define the type override on the subclasses when we can use the override keyword + */ + code?: number | string; + topologyVersion?: TopologyVersion; + constructor(message: string | Error); + get name(): string; + /** Legacy name for server error responses */ + get errmsg(): string; + /** + * Checks the error to see if it has an error label + * + * @param label - The error label to check for + * @returns returns true if the error has the provided error label + */ + hasErrorLabel(label: string): boolean; + addErrorLabel(label: string): void; + get errorLabels(): string[]; +} + +/** + * An error generated when the user attempts to operate + * on a session that has expired or has been closed. + * + * @public + * @category Error + */ +export declare class MongoExpiredSessionError extends MongoAPIError { + constructor(message?: string); + get name(): string; +} + +/** + * An error generated when a malformed or invalid chunk is + * encountered when reading from a GridFSStream. + * + * @public + * @category Error + */ +export declare class MongoGridFSChunkError extends MongoRuntimeError { + constructor(message: string); + get name(): string; +} + +/** An error generated when a GridFSStream operation fails to execute. + * + * @public + * @category Error + */ +export declare class MongoGridFSStreamError extends MongoRuntimeError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated when the user supplies malformed or unexpected arguments + * or when a required argument or field is not provided. + * + * + * @public + * @category Error + */ +export declare class MongoInvalidArgumentError extends MongoAPIError { + constructor(message: string); + get name(): string; +} + +/** + * A error generated when the user attempts to authenticate + * via Kerberos, but fails to connect to the Kerberos client. + * + * @public + * @category Error + */ +export declare class MongoKerberosError extends MongoRuntimeError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated when the user fails to provide authentication credentials before attempting + * to connect to a mongo server instance. + * + * + * @public + * @category Error + */ +export declare class MongoMissingCredentialsError extends MongoAPIError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated when a required module or dependency is not present in the local environment + * + * @public + * @category Error + */ +export declare class MongoMissingDependencyError extends MongoAPIError { + constructor(message: string); + get name(): string; +} + +/** + * An error indicating an issue with the network, including TCP errors and timeouts. 
+ * @public + * @category Error + */ +export declare class MongoNetworkError extends MongoError { + /* Excluded from this release type: [kBeforeHandshake] */ + constructor(message: string | Error, options?: MongoNetworkErrorOptions); + get name(): string; +} + +/** @public */ +export declare interface MongoNetworkErrorOptions { + /** Indicates the timeout happened before a connection handshake completed */ + beforeHandshake: boolean; +} + +/** + * An error indicating a network timeout occurred + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error with an instanceof check + */ +export declare class MongoNetworkTimeoutError extends MongoNetworkError { + constructor(message: string, options?: MongoNetworkErrorOptions); + get name(): string; +} + +/** + * An error thrown when the user attempts to operate on a database or collection through a MongoClient + * that has not yet successfully called the "connect" method + * + * @public + * @category Error + */ +export declare class MongoNotConnectedError extends MongoAPIError { + constructor(message: string); + get name(): string; +} + +/** + * Mongo Client Options + * @public + */ +export declare interface MongoOptions extends Required>, SupportedNodeConnectionOptions { + hosts: HostAddress[]; + srvHost?: string; + credentials?: MongoCredentials; + readPreference: ReadPreference; + readConcern: ReadConcern; + loadBalanced: boolean; + serverApi: ServerApi; + compressors: CompressorName[]; + writeConcern: WriteConcern; + dbName: string; + metadata: ClientMetadata; + autoEncrypter?: AutoEncrypter; + /* Excluded from this release type: connectionType */ + /* Excluded from this release type: encrypter */ + /* Excluded from this release type: userSpecifiedAuthSource */ + /* Excluded from this release type: userSpecifiedReplicaSet */ + /** + * # NOTE ABOUT TLS Options + * + * If set TLS enabled, equivalent to setting the ssl option. + * + * ### Additional options: + * + * | nodejs option | MongoDB equivalent | type | + * |:---------------------|--------------------------------------------------------- |:---------------------------------------| + * | `ca` | `sslCA`, `tlsCAFile` | `string \| Buffer \| Buffer[]` | + * | `crl` | `sslCRL` | `string \| Buffer \| Buffer[]` | + * | `cert` | `sslCert`, `tlsCertificateFile`, `tlsCertificateKeyFile` | `string \| Buffer \| Buffer[]` | + * | `key` | `sslKey`, `tlsCertificateKeyFile` | `string \| Buffer \| KeyObject[]` | + * | `passphrase` | `sslPass`, `tlsCertificateKeyFilePassword` | `string` | + * | `rejectUnauthorized` | `sslValidate` | `boolean` | + * + */ + tls: boolean; + /** + * Turn these options into a reusable connection URI + */ + toURI(): string; +} + +/** + * An error used when attempting to parse a value (like a connection string) + * @public + * @category Error + */ +export declare class MongoParseError extends MongoDriverError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated when the driver encounters unexpected input + * or reaches an unexpected/invalid internal state + * + * @privateRemarks + * Should **never** be directly instantiated. + * + * @public + * @category Error + */ +export declare class MongoRuntimeError extends MongoDriverError { + constructor(message: string); + get name(): string; +} + +/** + * An error generated when an attempt is made to operate + * on a closed/closing server. 
+ * + * @public + * @category Error + */ +export declare class MongoServerClosedError extends MongoAPIError { + constructor(message?: string); + get name(): string; +} + +/** + * An error coming from the mongo server + * + * @public + * @category Error + */ +export declare class MongoServerError extends MongoError { + codeName?: string; + writeConcernError?: Document; + errInfo?: Document; + ok?: number; + [key: string]: any; + constructor(message: ErrorDescription); + get name(): string; +} + +/** + * An error signifying a client-side server selection error + * @public + * @category Error + */ +export declare class MongoServerSelectionError extends MongoSystemError { + constructor(message: string, reason: TopologyDescription); + get name(): string; +} + +/** + * An error signifying a general system issue + * @public + * @category Error + */ +export declare class MongoSystemError extends MongoError { + /** An optional reason context, such as an error saved during flow of monitoring and selecting servers */ + reason?: TopologyDescription; + constructor(message: string, reason: TopologyDescription); + get name(): string; +} + +/** + * An error generated when an attempt is made to operate on a + * dropped, or otherwise unavailable, database. + * + * @public + * @category Error + */ +export declare class MongoTopologyClosedError extends MongoAPIError { + constructor(message?: string); + get name(): string; +} + +/** + * An error generated when the user makes a mistake in the usage of transactions. + * (e.g. attempting to commit a transaction with a readPreference other than primary) + * + * @public + * @category Error + */ +export declare class MongoTransactionError extends MongoAPIError { + constructor(message: string); + get name(): string; +} + +/** + * An error thrown when the server reports a writeConcernError + * @public + * @category Error + */ +export declare class MongoWriteConcernError extends MongoServerError { + /** The result document (provided if ok: 1) */ + result?: Document; + errInfo?: Document; + constructor(message: ErrorDescription, result?: Document); + get name(): string; +} + +/* Excluded from this release type: Monitor */ + +/** @public */ +export declare type MonitorEvents = { + serverHeartbeatStarted(event: ServerHeartbeatStartedEvent): void; + serverHeartbeatSucceeded(event: ServerHeartbeatSucceededEvent): void; + serverHeartbeatFailed(event: ServerHeartbeatFailedEvent): void; + resetServer(error?: Error): void; + resetConnectionPool(): void; + close(): void; +} & EventEmitterWithState; + +/** @public */ +export declare interface MonitorOptions extends Omit { + connectTimeoutMS: number; + heartbeatFrequencyMS: number; + minHeartbeatFrequencyMS: number; +} + +/* Excluded from this release type: MonitorPrivate */ + +/* Excluded from this release type: Msg */ + +/** It avoids using fields with not acceptable types @public */ +export declare type NotAcceptedFields = { + readonly [key in KeysOfOtherType]?: never; +}; + +/** @public */ +export declare type NumericType = IntegerType | Decimal128 | Double; + +/** + * @public + * @deprecated Please use `ObjectId` + */ +export declare const ObjectID: typeof ObjectId; + +export { ObjectId } + +/** @public */ +export declare type OneOrMore = T | ReadonlyArray; + +/** @public */ +export declare type OnlyFieldsOfType = IsAny, AcceptedFields & NotAcceptedFields & Record>; + +/* Excluded from this release type: OperationDescription */ + +/** @public */ +export declare interface OperationOptions extends BSONSerializeOptions { + /** 
Specify ClientSession for this command */ + session?: ClientSession; + willRetryWrites?: boolean; + /** The preferred read preference (ReadPreference.primary, ReadPreference.primary_preferred, ReadPreference.secondary, ReadPreference.secondary_preferred, ReadPreference.nearest). */ + readPreference?: ReadPreferenceLike; + /* Excluded from this release type: bypassPinningCheck */ +} + +/* Excluded from this release type: OperationParent */ + +/** + * Represents a specific point in time on a server. Can be retrieved by using {@link Db#command} + * @public + * @remarks + * See {@link https://docs.mongodb.com/manual/reference/method/db.runCommand/#response| Run Command Response} + */ +export declare type OperationTime = Timestamp; + +/* Excluded from this release type: OpGetMoreOptions */ + +/* Excluded from this release type: OpQueryOptions */ + +/** + * Add an optional _id field to an object shaped type + * @public + * + * @privateRemarks + * `ObjectId extends TSchema['_id']` is a confusing ordering at first glance. Rather than ask + * `TSchema['_id'] extends ObjectId` which translated to "Is the _id property ObjectId?" + * we instead ask "Does ObjectId look like (have the same shape) as the _id?" + */ +export declare type OptionalId = ObjectId extends TSchema['_id'] ? EnhancedOmit & { + _id?: InferIdType; +} : WithId; + +/** @public */ +export declare class OrderedBulkOperation extends BulkOperationBase { + constructor(collection: Collection, options: BulkWriteOptions); + addToOperationsList(batchType: BatchType, document: Document | UpdateStatement | DeleteStatement): this; +} + +/** @public */ +export declare interface PipeOptions { + end?: boolean; +} + +/** @public */ +export declare interface PkFactory { + createPk(): any; +} + +/** @public */ +export declare const ProfilingLevel: Readonly<{ + readonly off: "off"; + readonly slowOnly: "slow_only"; + readonly all: "all"; +}>; + +/** @public */ +export declare type ProfilingLevel = typeof ProfilingLevel[keyof typeof ProfilingLevel]; + +/** @public */ +export declare type ProfilingLevelOptions = CommandOperationOptions; + +/** + * @public + * Projection is flexible to permit the wide array of aggregation operators + * @deprecated since v4.1.0: Since projections support all aggregation operations we have no plans to narrow this type further + */ +export declare type Projection = Document; + +/** + * @public + * @deprecated since v4.1.0: Since projections support all aggregation operations we have no plans to narrow this type further + */ +export declare type ProjectionOperators = Document; + +/** + * Global promise store allowing user-provided promises + * @public + */ +declare class Promise_2 { + /** Validates the passed in promise library */ + static validate(lib: unknown): lib is PromiseConstructor; + /** Sets the promise library */ + static set(lib: PromiseConstructor): void; + /** Get the stored promise library, or resolves passed in */ + static get(): PromiseConstructor; +} +export { Promise_2 as Promise } + +/** @public */ +export declare type PullAllOperator = ({ + readonly [key in KeysOfAType>]?: TSchema[key]; +} & NotAcceptedFields>) & { + readonly [key: string]: ReadonlyArray; +}; + +/** @public */ +export declare type PullOperator = ({ + readonly [key in KeysOfAType>]?: Partial> | FilterOperations>; +} & NotAcceptedFields>) & { + readonly [key: string]: FilterOperators | any; +}; + +/** @public */ +export declare type PushOperator = ({ + readonly [key in KeysOfAType>]?: Flatten | ArrayOperator>>; +} & NotAcceptedFields>) & { + 
readonly [key: string]: ArrayOperator | any; +}; + +/* Excluded from this release type: Query */ + +/* Excluded from this release type: QueryOptions */ + +/** + * The MongoDB ReadConcern, which allows for control of the consistency and isolation properties + * of the data read from replica sets and replica set shards. + * @public + * + * @see https://docs.mongodb.com/manual/reference/read-concern/index.html + */ +export declare class ReadConcern { + level: ReadConcernLevel | string; + /** Constructs a ReadConcern from the read concern level.*/ + constructor(level: ReadConcernLevel); + /** + * Construct a ReadConcern given an options object. + * + * @param options - The options object from which to extract the write concern. + */ + static fromOptions(options?: { + readConcern?: ReadConcernLike; + level?: ReadConcernLevel; + }): ReadConcern | undefined; + static get MAJORITY(): 'majority'; + static get AVAILABLE(): 'available'; + static get LINEARIZABLE(): 'linearizable'; + static get SNAPSHOT(): 'snapshot'; + toJSON(): Document; +} + +/** @public */ +export declare const ReadConcernLevel: Readonly<{ + readonly local: "local"; + readonly majority: "majority"; + readonly linearizable: "linearizable"; + readonly available: "available"; + readonly snapshot: "snapshot"; +}>; + +/** @public */ +export declare type ReadConcernLevel = typeof ReadConcernLevel[keyof typeof ReadConcernLevel]; + +/** @public */ +export declare type ReadConcernLike = ReadConcern | { + level: ReadConcernLevel; +} | ReadConcernLevel; + +/** + * The **ReadPreference** class is a class that represents a MongoDB ReadPreference and is + * used to construct connections. + * @public + * + * @see https://docs.mongodb.com/manual/core/read-preference/ + */ +export declare class ReadPreference { + mode: ReadPreferenceMode; + tags?: TagSet[]; + hedge?: HedgeOptions; + maxStalenessSeconds?: number; + minWireVersion?: number; + static PRIMARY: "primary"; + static PRIMARY_PREFERRED: "primaryPreferred"; + static SECONDARY: "secondary"; + static SECONDARY_PREFERRED: "secondaryPreferred"; + static NEAREST: "nearest"; + static primary: ReadPreference; + static primaryPreferred: ReadPreference; + static secondary: ReadPreference; + static secondaryPreferred: ReadPreference; + static nearest: ReadPreference; + /** + * @param mode - A string describing the read preference mode (primary|primaryPreferred|secondary|secondaryPreferred|nearest) + * @param tags - A tag set used to target reads to members with the specified tag(s). tagSet is not available if using read preference mode primary. + * @param options - Additional read preference options + */ + constructor(mode: ReadPreferenceMode, tags?: TagSet[], options?: ReadPreferenceOptions); + get preference(): ReadPreferenceMode; + static fromString(mode: string): ReadPreference; + /** + * Construct a ReadPreference given an options object. + * + * @param options - The options object from which to extract the read preference. + */ + static fromOptions(options?: ReadPreferenceFromOptions): ReadPreference | undefined; + /** + * Replaces options.readPreference with a ReadPreference instance + */ + static translate(options: ReadPreferenceLikeOptions): ReadPreferenceLikeOptions; + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. + */ + static isValid(mode: string): boolean; + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. 
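+ *
+ * @example
+ * A brief sketch of mode validation (the mode values are illustrative):
+ * ```js
+ * const pref = new ReadPreference('secondaryPreferred');
+ * pref.isValid();          // true: validates this instance's own mode
+ * pref.isValid('nearest'); // true: validates the supplied mode string
+ * ReadPreference.isValid('closestNode'); // false: not a known mode
+ * ```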
+ */ + isValid(mode?: string): boolean; + /** + * Indicates that this readPreference needs the "slaveOk" bit when sent over the wire + * + * @see https://docs.mongodb.com/manual/reference/mongodb-wire-protocol/#op-query + */ + slaveOk(): boolean; + /** + * Check if the two ReadPreferences are equivalent + * + * @param readPreference - The read preference with which to check equality + */ + equals(readPreference: ReadPreference): boolean; + /** Return JSON representation */ + toJSON(): Document; +} + +/** @public */ +export declare interface ReadPreferenceFromOptions extends ReadPreferenceLikeOptions { + session?: ClientSession; + readPreferenceTags?: TagSet[]; + hedge?: HedgeOptions; +} + +/** @public */ +export declare type ReadPreferenceLike = ReadPreference | ReadPreferenceMode; + +/** @public */ +export declare interface ReadPreferenceLikeOptions extends ReadPreferenceOptions { + readPreference?: ReadPreferenceLike | { + mode?: ReadPreferenceMode; + preference?: ReadPreferenceMode; + tags?: TagSet[]; + maxStalenessSeconds?: number; + }; +} + +/** @public */ +export declare const ReadPreferenceMode: Readonly<{ + readonly primary: "primary"; + readonly primaryPreferred: "primaryPreferred"; + readonly secondary: "secondary"; + readonly secondaryPreferred: "secondaryPreferred"; + readonly nearest: "nearest"; +}>; + +/** @public */ +export declare type ReadPreferenceMode = typeof ReadPreferenceMode[keyof typeof ReadPreferenceMode]; + +/** @public */ +export declare interface ReadPreferenceOptions { + /** Max secondary read staleness in seconds, Minimum value is 90 seconds.*/ + maxStalenessSeconds?: number; + /** Server mode in which the same query is dispatched in parallel to multiple replica set members. */ + hedge?: HedgeOptions; +} + +/** @public */ +export declare type ReduceFunction = (key: TKey, values: TValue[]) => TValue; + +/** @public */ +export declare type RegExpOrString = T extends string ? BSONRegExp | RegExp | T : T; + +/** @public */ +export declare type RemoveUserOptions = CommandOperationOptions; + +/** @public */ +export declare interface RenameOptions extends CommandOperationOptions { + /** Drop the target name collection if it previously exists. */ + dropTarget?: boolean; + /** Unclear */ + new_collection?: boolean; +} + +/** @public */ +export declare interface ReplaceOneModel { + /** The filter to limit the replaced document. */ + filter: Filter; + /** The document with which to replace the matched document. */ + replacement: TSchema; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. 
*/ + upsert?: boolean; +} + +/** @public */ +export declare interface ReplaceOptions extends CommandOperationOptions { + /** If true, allows the write to opt-out of document level validation */ + bypassDocumentValidation?: boolean; + /** Specifies a collation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** When true, creates a new document if no document matches the query */ + upsert?: boolean; +} + +/** @public */ +export declare interface ResumeOptions { + startAtOperationTime?: Timestamp; + batchSize?: number; + maxAwaitTimeMS?: number; + collation?: CollationOptions; + readPreference?: ReadPreference; +} + +/** + * Represents the logical starting point for a new or resuming {@link https://docs.mongodb.com/master/changeStreams/#change-stream-resume-token| Change Stream} on the server. + * @public + */ +export declare type ResumeToken = unknown; + +/** @public */ +export declare const ReturnDocument: Readonly<{ + readonly BEFORE: "before"; + readonly AFTER: "after"; +}>; + +/** @public */ +export declare type ReturnDocument = typeof ReturnDocument[keyof typeof ReturnDocument]; + +/** @public */ +export declare interface RoleSpecification { + /** + * A role grants privileges to perform sets of actions on defined resources. + * A given role applies to the database on which it is defined and can grant access down to a collection level of granularity. + */ + role: string; + /** The database this user's role should effect. */ + db: string; +} + +/** @public */ +export declare interface RootFilterOperators extends Document { + $and?: Filter[]; + $nor?: Filter[]; + $or?: Filter[]; + $text?: { + $search: string; + $language?: string; + $caseSensitive?: boolean; + $diacriticSensitive?: boolean; + }; + $where?: string | ((this: TSchema) => boolean); + $comment?: string | Document; +} + +/* Excluded from this release type: RTTPinger */ + +/* Excluded from this release type: RTTPingerOptions */ + +/** @public */ +export declare type RunCommandOptions = CommandOperationOptions; + +/** @public */ +export declare type SchemaMember = { + [P in keyof T]?: V; +} | { + [key: string]: V; +}; + +/** @public */ +export declare interface SelectServerOptions { + readPreference?: ReadPreferenceLike; + /** How long to block for server selection before throwing an error */ + serverSelectionTimeoutMS?: number; + session?: ClientSession; +} + +/* Excluded from this release type: serialize */ + +/* Excluded from this release type: Server */ + +/** @public */ +export declare interface ServerApi { + version: ServerApiVersion; + strict?: boolean; + deprecationErrors?: boolean; +} + +/** @public */ +export declare const ServerApiVersion: Readonly<{ + readonly v1: "1"; +}>; + +/** @public */ +export declare type ServerApiVersion = typeof ServerApiVersion[keyof typeof ServerApiVersion]; + +/** @public */ +export declare class ServerCapabilities { + maxWireVersion: number; + minWireVersion: number; + constructor(ismaster: Document); + get hasAggregationCursor(): boolean; + get hasWriteCommands(): boolean; + get hasTextSearch(): boolean; + get hasAuthCommands(): boolean; + get hasListCollectionsCommand(): boolean; + get hasListIndexesCommand(): boolean; + get supportsSnapshotReads(): boolean; + get commandsTakeWriteConcern(): boolean; + get commandsTakeCollation(): boolean; +} + +/** + * Emitted when server is closed. 
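+ *
+ * @example
+ * A minimal sketch of observing this event, assuming a `MongoClient` instance
+ * named `client` (the driver re-emits SDAM events, keyed as in TopologyEvents):
+ * ```js
+ * client.on('serverClosed', event => {
+ *   console.log(`server ${event.address} closed (topology ${event.topologyId})`);
+ * });
+ * ```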
+ * @public + * @category Event + */ +export declare class ServerClosedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + /* Excluded from this release type: __constructor */ +} + +/** + * The client's view of a single server, based on the most recent ismaster outcome. + * + * Internal type, not meant to be directly instantiated + * @public + */ +export declare class ServerDescription { + private _hostAddress; + address: string; + type: ServerType; + hosts: string[]; + passives: string[]; + arbiters: string[]; + tags: TagSet; + error?: MongoError; + topologyVersion?: TopologyVersion; + minWireVersion: number; + maxWireVersion: number; + roundTripTime: number; + lastUpdateTime: number; + lastWriteDate: number; + me?: string; + primary?: string; + setName?: string; + setVersion?: number; + electionId?: ObjectId; + logicalSessionTimeoutMinutes?: number; + $clusterTime?: ClusterTime; + /* Excluded from this release type: __constructor */ + get hostAddress(): HostAddress; + get allHosts(): string[]; + /** Is this server available for reads*/ + get isReadable(): boolean; + /** Is this server data bearing */ + get isDataBearing(): boolean; + /** Is this server available for writes */ + get isWritable(): boolean; + get host(): string; + get port(): number; + /** + * Determines if another `ServerDescription` is equal to this one per the rules defined + * in the {@link https://github.com/mongodb/specifications/blob/master/source/server-discovery-and-monitoring/server-discovery-and-monitoring.rst#serverdescription|SDAM spec} + */ + equals(other: ServerDescription): boolean; +} + +/** + * Emitted when server description changes, but does NOT include changes to the RTT. + * @public + * @category Event + */ +export declare class ServerDescriptionChangedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + /** The previous server description */ + previousDescription: ServerDescription; + /** The new server description */ + newDescription: ServerDescription; + /* Excluded from this release type: __constructor */ +} + +/* Excluded from this release type: ServerDescriptionOptions */ + +/** @public */ +export declare type ServerEvents = { + serverHeartbeatStarted(event: ServerHeartbeatStartedEvent): void; + serverHeartbeatSucceeded(event: ServerHeartbeatSucceededEvent): void; + serverHeartbeatFailed(event: ServerHeartbeatFailedEvent): void; + /* Excluded from this release type: connect */ + descriptionReceived(description: ServerDescription): void; + closed(): void; + ended(): void; +} & ConnectionPoolEvents & EventEmitterWithState; + +/** + * Emitted when the server monitor’s ismaster fails, either with an “ok: 0” or a socket exception. + * @public + * @category Event + */ +export declare class ServerHeartbeatFailedEvent { + /** The connection id for the command */ + connectionId: string; + /** The execution time of the event in ms */ + duration: number; + /** The command failure */ + failure: Error; + /* Excluded from this release type: __constructor */ +} + +/** + * Emitted when the server monitor’s ismaster command is started - immediately before + * the ismaster command is serialized into raw BSON and written to the socket. 
+ * + * @public + * @category Event + */ +export declare class ServerHeartbeatStartedEvent { + /** The connection id for the command */ + connectionId: string; + /* Excluded from this release type: __constructor */ +} + +/** + * Emitted when the server monitor’s ismaster succeeds. + * @public + * @category Event + */ +export declare class ServerHeartbeatSucceededEvent { + /** The connection id for the command */ + connectionId: string; + /** The execution time of the event in ms */ + duration: number; + /** The command reply */ + reply: Document; + /* Excluded from this release type: __constructor */ +} + +/** + * Emitted when server is initialized. + * @public + * @category Event + */ +export declare class ServerOpeningEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare type ServerOptions = Omit & MonitorOptions; + +/* Excluded from this release type: ServerPrivate */ + +/* Excluded from this release type: ServerSelectionCallback */ + +/* Excluded from this release type: ServerSelectionRequest */ + +/** @public */ +export declare type ServerSelector = (topologyDescription: TopologyDescription, servers: ServerDescription[]) => ServerDescription[]; + +/** + * Reflects the existence of a session on the server. Can be reused by the session pool. + * WARNING: not meant to be instantiated directly. For internal use only. + * @public + */ +export declare class ServerSession { + id: ServerSessionId; + lastUse: number; + txnNumber: number; + isDirty: boolean; + /* Excluded from this release type: __constructor */ + /** + * Determines if the server session has timed out. + * + * @param sessionTimeoutMinutes - The server's "logicalSessionTimeoutMinutes" + */ + hasTimedOut(sessionTimeoutMinutes: number): boolean; +} + +/** @public */ +export declare type ServerSessionId = { + id: Binary; +}; + +/* Excluded from this release type: ServerSessionPool */ + +/** + * An enumeration of server types we know about + * @public + */ +export declare const ServerType: Readonly<{ + readonly Standalone: "Standalone"; + readonly Mongos: "Mongos"; + readonly PossiblePrimary: "PossiblePrimary"; + readonly RSPrimary: "RSPrimary"; + readonly RSSecondary: "RSSecondary"; + readonly RSArbiter: "RSArbiter"; + readonly RSOther: "RSOther"; + readonly RSGhost: "RSGhost"; + readonly Unknown: "Unknown"; + readonly LoadBalancer: "LoadBalancer"; +}>; + +/** @public */ +export declare type ServerType = typeof ServerType[keyof typeof ServerType]; + +/** @public */ +export declare type SetFields = ({ + readonly [key in KeysOfAType | undefined>]?: OptionalId> | AddToSetOperators>>>; +} & NotAcceptedFields | undefined>) & { + readonly [key: string]: AddToSetOperators | any; +}; + +/** @public */ +export declare type SetProfilingLevelOptions = CommandOperationOptions; + +/** @public */ +export declare type Sort = string | Exclude | string[] | { + [key: string]: SortDirection; +} | Map | [string, SortDirection][] | [string, SortDirection]; + +/** @public */ +export declare type SortDirection = 1 | -1 | 'asc' | 'desc' | 'ascending' | 'descending' | { + $meta: string; +}; + +/* Excluded from this release type: SortDirectionForCmd */ + +/* Excluded from this release type: SortForCmd */ + +/* Excluded from this release type: SrvPoller */ + +/* Excluded from this release type: SrvPollerEvents */ + +/* Excluded from this release type: SrvPollerOptions */ + +/* 
Excluded from this release type: SrvPollingEvent */ + +/** @public */ +export declare type Stream = Socket | TLSSocket; + +/** @public */ +export declare class StreamDescription { + address: string; + type: string; + minWireVersion?: number; + maxWireVersion?: number; + maxBsonObjectSize: number; + maxMessageSizeBytes: number; + maxWriteBatchSize: number; + compressors: CompressorName[]; + compressor?: CompressorName; + logicalSessionTimeoutMinutes?: number; + loadBalanced: boolean; + __nodejs_mock_server__?: boolean; + zlibCompressionLevel?: number; + constructor(address: string, options?: StreamDescriptionOptions); + receiveResponse(response: Document): void; +} + +/** @public */ +export declare interface StreamDescriptionOptions { + compressors?: CompressorName[]; + logicalSessionTimeoutMinutes?: number; + loadBalanced: boolean; +} + +/** @public */ +export declare type SupportedNodeConnectionOptions = SupportedTLSConnectionOptions & SupportedTLSSocketOptions & SupportedSocketOptions; + +/** @public */ +export declare type SupportedSocketOptions = Pick; + +/** @public */ +export declare type SupportedTLSConnectionOptions = Pick>; + +/** @public */ +export declare type SupportedTLSSocketOptions = Pick>; + +/** @public */ +export declare type TagSet = { + [key: string]: string; +}; + +/* Excluded from this release type: TimerQueue */ + +/** @public + * Configuration options for timeseries collections + * @see https://docs.mongodb.com/manual/core/timeseries-collections/ + */ +export declare interface TimeSeriesCollectionOptions extends Document { + timeField: string; + metaField?: string; + granularity?: string; +} + +export { Timestamp } + +/* Excluded from this release type: Topology */ + +/** + * Emitted when topology is closed. + * @public + * @category Event + */ +export declare class TopologyClosedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /* Excluded from this release type: __constructor */ +} + +/** + * Representation of a deployment of servers + * @public + */ +export declare class TopologyDescription { + type: TopologyType; + setName?: string; + maxSetVersion?: number; + maxElectionId?: ObjectId; + servers: Map; + stale: boolean; + compatible: boolean; + compatibilityError?: string; + logicalSessionTimeoutMinutes?: number; + heartbeatFrequencyMS: number; + localThresholdMS: number; + commonWireVersion?: number; + /** + * Create a TopologyDescription + */ + constructor(topologyType: TopologyType, serverDescriptions?: Map, setName?: string, maxSetVersion?: number, maxElectionId?: ObjectId, commonWireVersion?: number, options?: TopologyDescriptionOptions); + /* Excluded from this release type: updateFromSrvPollingEvent */ + /* Excluded from this release type: update */ + get error(): MongoError | undefined; + /** + * Determines if the topology description has any known servers + */ + get hasKnownServers(): boolean; + /** + * Determines if this topology description has a data-bearing server available. + */ + get hasDataBearingServers(): boolean; + /* Excluded from this release type: hasServer */ +} + +/** + * Emitted when topology description changes. 
+ * @public + * @category Event + */ +export declare class TopologyDescriptionChangedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The old topology description */ + previousDescription: TopologyDescription; + /** The new topology description */ + newDescription: TopologyDescription; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare interface TopologyDescriptionOptions { + heartbeatFrequencyMS?: number; + localThresholdMS?: number; +} + +/** @public */ +export declare type TopologyEvents = { + /* Excluded from this release type: connect */ + serverOpening(event: ServerOpeningEvent): void; + serverClosed(event: ServerClosedEvent): void; + serverDescriptionChanged(event: ServerDescriptionChangedEvent): void; + topologyClosed(event: TopologyClosedEvent): void; + topologyOpening(event: TopologyOpeningEvent): void; + topologyDescriptionChanged(event: TopologyDescriptionChangedEvent): void; + error(error: Error): void; + /* Excluded from this release type: open */ + close(): void; + timeout(): void; +} & Omit & ConnectionPoolEvents & ConnectionEvents & EventEmitterWithState; + +/** + * Emitted when topology is initialized. + * @public + * @category Event + */ +export declare class TopologyOpeningEvent { + /** A unique identifier for the topology */ + topologyId: number; + /* Excluded from this release type: __constructor */ +} + +/** @public */ +export declare interface TopologyOptions extends BSONSerializeOptions, ServerOptions { + hosts: HostAddress[]; + retryWrites: boolean; + retryReads: boolean; + /** How long to block for server selection before throwing an error */ + serverSelectionTimeoutMS: number; + /** The name of the replica set to connect to */ + replicaSet?: string; + srvHost?: string; + /* Excluded from this release type: srvPoller */ + /** Indicates that a client should directly connect to a node without attempting to discover its topology type */ + directConnection: boolean; + loadBalanced: boolean; + metadata: ClientMetadata; + /** MongoDB server API version */ + serverApi?: ServerApi; +} + +/* Excluded from this release type: TopologyPrivate */ + +/** + * An enumeration of topology types we know about + * @public + */ +export declare const TopologyType: Readonly<{ + readonly Single: "Single"; + readonly ReplicaSetNoPrimary: "ReplicaSetNoPrimary"; + readonly ReplicaSetWithPrimary: "ReplicaSetWithPrimary"; + readonly Sharded: "Sharded"; + readonly Unknown: "Unknown"; + readonly LoadBalanced: "LoadBalanced"; +}>; + +/** @public */ +export declare type TopologyType = typeof TopologyType[keyof typeof TopologyType]; + +/** @public */ +export declare interface TopologyVersion { + processId: ObjectId; + counter: Long; +} + +/** + * @public + * A class maintaining state related to a server transaction. 
Internal Only + */ +export declare class Transaction { + /* Excluded from this release type: state */ + options: TransactionOptions; + /* Excluded from this release type: _pinnedServer */ + /* Excluded from this release type: _recoveryToken */ + /* Excluded from this release type: __constructor */ + /* Excluded from this release type: server */ + get recoveryToken(): Document | undefined; + get isPinned(): boolean; + /** @returns Whether the transaction has started */ + get isStarting(): boolean; + /** + * @returns Whether this session is presently in a transaction + */ + get isActive(): boolean; + get isCommitted(): boolean; + /* Excluded from this release type: transition */ + /* Excluded from this release type: pinServer */ + /* Excluded from this release type: unpinServer */ +} + +/** + * Configuration options for a transaction. + * @public + */ +export declare interface TransactionOptions extends CommandOperationOptions { + /** A default read concern for commands in this transaction */ + readConcern?: ReadConcern; + /** A default writeConcern for commands in this transaction */ + writeConcern?: WriteConcern; + /** A default read preference for commands in this transaction */ + readPreference?: ReadPreference; + maxCommitTimeMS?: number; +} + +/* Excluded from this release type: TxnState */ + +/** + * Typescript type safe event emitter + * @public + */ +export declare interface TypedEventEmitter extends EventEmitter { + addListener(event: EventKey, listener: Events[EventKey]): this; + addListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + addListener(event: string | symbol, listener: GenericListener): this; + on(event: EventKey, listener: Events[EventKey]): this; + on(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + on(event: string | symbol, listener: GenericListener): this; + once(event: EventKey, listener: Events[EventKey]): this; + once(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + once(event: string | symbol, listener: GenericListener): this; + removeListener(event: EventKey, listener: Events[EventKey]): this; + removeListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + removeListener(event: string | symbol, listener: GenericListener): this; + off(event: EventKey, listener: Events[EventKey]): this; + off(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + off(event: string | symbol, listener: GenericListener): this; + removeAllListeners(event?: EventKey | CommonEvents | symbol | string): this; + listeners(event: EventKey | CommonEvents | symbol | string): Events[EventKey][]; + rawListeners(event: EventKey | CommonEvents | symbol | string): Events[EventKey][]; + emit(event: EventKey | symbol, ...args: Parameters): boolean; + listenerCount(type: EventKey | CommonEvents | symbol | string): number; + prependListener(event: EventKey, listener: Events[EventKey]): this; + prependListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + prependListener(event: string | symbol, listener: GenericListener): this; + prependOnceListener(event: EventKey, listener: Events[EventKey]): this; + prependOnceListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + prependOnceListener(event: string | 
symbol, listener: GenericListener): this; + eventNames(): string[]; + getMaxListeners(): number; + setMaxListeners(n: number): this; +} + +/** + * Typescript type safe event emitter + * @public + */ +export declare class TypedEventEmitter extends EventEmitter { +} + +/** @public */ +export declare class UnorderedBulkOperation extends BulkOperationBase { + constructor(collection: Collection, options: BulkWriteOptions); + handleWriteError(callback: Callback, writeResult: BulkWriteResult): boolean | undefined; + addToOperationsList(batchType: BatchType, document: Document | UpdateStatement | DeleteStatement): this; +} + +/** @public */ +export declare interface UpdateDescription { + /** + * A document containing key:value pairs of names of the fields that were + * changed, and the new value for those fields. + */ + updatedFields: Partial; + /** + * An array of field names that were removed from the document. + */ + removedFields: string[]; +} + +/** @public */ +export declare type UpdateFilter = { + $currentDate?: OnlyFieldsOfType; + $inc?: OnlyFieldsOfType; + $min?: MatchKeysAndValues; + $max?: MatchKeysAndValues; + $mul?: OnlyFieldsOfType; + $rename?: Record; + $set?: MatchKeysAndValues; + $setOnInsert?: MatchKeysAndValues; + $unset?: OnlyFieldsOfType; + $addToSet?: SetFields; + $pop?: OnlyFieldsOfType, 1 | -1>; + $pull?: PullOperator; + $push?: PushOperator; + $pullAll?: PullAllOperator; + $bit?: OnlyFieldsOfType; +} & Document; + +/** @public */ +export declare interface UpdateManyModel { + /** The filter to limit the updated documents. */ + filter: Filter; + /** A document or pipeline containing update operators. */ + update: UpdateFilter | UpdateFilter[]; + /** A set of filters specifying to which array elements an update should apply. */ + arrayFilters?: Document[]; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. */ + upsert?: boolean; +} + +/** @public */ +export declare interface UpdateOneModel { + /** The filter to limit the updated documents. */ + filter: Filter; + /** A document or pipeline containing update operators. */ + update: UpdateFilter | UpdateFilter[]; + /** A set of filters specifying to which array elements an update should apply. */ + arrayFilters?: Document[]; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. */ + upsert?: boolean; +} + +/** @public */ +export declare interface UpdateOptions extends CommandOperationOptions { + /** A set of filters specifying to which array elements an update should apply */ + arrayFilters?: Document[]; + /** If true, allows the write to opt-out of document level validation */ + bypassDocumentValidation?: boolean; + /** Specifies a collation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** When true, creates a new document if no document matches the query */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). 
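+ *
+ * A short sketch (field and variable names are illustrative; requires a 5.0+ server
+ * and an existing `collection`):
+ * ```js
+ * await collection.updateMany(
+ *   { $expr: { $lt: ['$price', '$$maxPrice'] } }, // reference the variable via $expr
+ *   { $set: { onSale: true } },
+ *   { let: { maxPrice: 20 } }
+ * );
+ * ```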
*/ + let?: Document; +} + +/** @public */ +export declare interface UpdateResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The number of documents that matched the filter */ + matchedCount: number; + /** The number of documents that were modified */ + modifiedCount: number; + /** The number of documents that were upserted */ + upsertedCount: number; + /** The identifier of the inserted document if an upsert took place */ + upsertedId: ObjectId; +} + +/** @public */ +export declare interface UpdateStatement { + /** The query that matches documents to update. */ + q: Document; + /** The modifications to apply. */ + u: Document | Document[]; + /** If true, perform an insert if no documents match the query. */ + upsert?: boolean; + /** If true, updates all documents that meet the query criteria. */ + multi?: boolean; + /** Specifies the collation to use for the operation. */ + collation?: CollationOptions; + /** An array of filter documents that determines which array elements to modify for an update operation on an array field. */ + arrayFilters?: Document[]; + /** A document or string that specifies the index to use to support the query predicate. */ + hint?: Hint; +} + +/** @public */ +export declare interface ValidateCollectionOptions extends CommandOperationOptions { + /** Validates a collection in the background, without interrupting read or write traffic (only in MongoDB 4.4+) */ + background?: boolean; +} + +/** @public */ +export declare type W = number | 'majority'; + +/* Excluded from this release type: WaitQueueMember */ + +/** @public */ +export declare interface WiredTigerData extends Document { + LSM: { + 'bloom filter false positives': number; + 'bloom filter hits': number; + 'bloom filter misses': number; + 'bloom filter pages evicted from cache': number; + 'bloom filter pages read into cache': number; + 'bloom filters in the LSM tree': number; + 'chunks in the LSM tree': number; + 'highest merge generation in the LSM tree': number; + 'queries that could have benefited from a Bloom filter that did not exist': number; + 'sleep for LSM checkpoint throttle': number; + 'sleep for LSM merge throttle': number; + 'total size of bloom filters': number; + } & Document; + 'block-manager': { + 'allocations requiring file extension': number; + 'blocks allocated': number; + 'blocks freed': number; + 'checkpoint size': number; + 'file allocation unit size': number; + 'file bytes available for reuse': number; + 'file magic number': number; + 'file major version number': number; + 'file size in bytes': number; + 'minor version number': number; + }; + btree: { + 'btree checkpoint generation': number; + 'column-store fixed-size leaf pages': number; + 'column-store internal pages': number; + 'column-store variable-size RLE encoded values': number; + 'column-store variable-size deleted values': number; + 'column-store variable-size leaf pages': number; + 'fixed-record size': number; + 'maximum internal page key size': number; + 'maximum internal page size': number; + 'maximum leaf page key size': number; + 'maximum leaf page size': number; + 'maximum leaf page value size': number; + 'maximum tree depth': number; + 'number of key/value pairs': number; + 'overflow pages': number; + 'pages rewritten by compaction': number; + 'row-store internal pages': number; + 'row-store leaf pages': number; + } & Document; + cache: { + 'bytes currently in the cache': number; + 'bytes read into 
cache': number; + 'bytes written from cache': number; + 'checkpoint blocked page eviction': number; + 'data source pages selected for eviction unable to be evicted': number; + 'hazard pointer blocked page eviction': number; + 'in-memory page passed criteria to be split': number; + 'in-memory page splits': number; + 'internal pages evicted': number; + 'internal pages split during eviction': number; + 'leaf pages split during eviction': number; + 'modified pages evicted': number; + 'overflow pages read into cache': number; + 'overflow values cached in memory': number; + 'page split during eviction deepened the tree': number; + 'page written requiring lookaside records': number; + 'pages read into cache': number; + 'pages read into cache requiring lookaside entries': number; + 'pages requested from the cache': number; + 'pages written from cache': number; + 'pages written requiring in-memory restoration': number; + 'tracked dirty bytes in the cache': number; + 'unmodified pages evicted': number; + } & Document; + cache_walk: { + 'Average difference between current eviction generation when the page was last considered': number; + 'Average on-disk page image size seen': number; + 'Clean pages currently in cache': number; + 'Current eviction generation': number; + 'Dirty pages currently in cache': number; + 'Entries in the root page': number; + 'Internal pages currently in cache': number; + 'Leaf pages currently in cache': number; + 'Maximum difference between current eviction generation when the page was last considered': number; + 'Maximum page size seen': number; + 'Minimum on-disk page image size seen': number; + 'On-disk page image sizes smaller than a single allocation unit': number; + 'Pages created in memory and never written': number; + 'Pages currently queued for eviction': number; + 'Pages that could not be queued for eviction': number; + 'Refs skipped during cache traversal': number; + 'Size of the root page': number; + 'Total number of pages currently in cache': number; + } & Document; + compression: { + 'compressed pages read': number; + 'compressed pages written': number; + 'page written failed to compress': number; + 'page written was too small to compress': number; + 'raw compression call failed, additional data available': number; + 'raw compression call failed, no additional data available': number; + 'raw compression call succeeded': number; + } & Document; + cursor: { + 'bulk-loaded cursor-insert calls': number; + 'create calls': number; + 'cursor-insert key and value bytes inserted': number; + 'cursor-remove key bytes removed': number; + 'cursor-update value bytes updated': number; + 'insert calls': number; + 'next calls': number; + 'prev calls': number; + 'remove calls': number; + 'reset calls': number; + 'restarted searches': number; + 'search calls': number; + 'search near calls': number; + 'truncate calls': number; + 'update calls': number; + }; + reconciliation: { + 'dictionary matches': number; + 'fast-path pages deleted': number; + 'internal page key bytes discarded using suffix compression': number; + 'internal page multi-block writes': number; + 'internal-page overflow keys': number; + 'leaf page key bytes discarded using prefix compression': number; + 'leaf page multi-block writes': number; + 'leaf-page overflow keys': number; + 'maximum blocks required for a page': number; + 'overflow values written': number; + 'page checksum matches': number; + 'page reconciliation calls': number; + 'page reconciliation calls for eviction': number; + 'pages deleted': number; + } 
& Document; +} + +/* Excluded from this release type: WithConnectionCallback */ + +/** Add an _id field to an object shaped type @public */ +export declare type WithId = EnhancedOmit & { + _id: InferIdType; +}; + +/** Remove the _id field from an object shaped type @public */ +export declare type WithoutId = Omit; + +/** @public */ +export declare type WithSessionCallback = (session: ClientSession) => Promise | void; + +/** @public */ +export declare type WithTransactionCallback = (session: ClientSession) => Promise; + +/** + * A MongoDB WriteConcern, which describes the level of acknowledgement + * requested from MongoDB for write operations. + * @public + * + * @see https://docs.mongodb.com/manual/reference/write-concern/ + */ +export declare class WriteConcern { + /** request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. */ + w?: W; + /** specify a time limit to prevent write operations from blocking indefinitely */ + wtimeout?: number; + /** request acknowledgment that the write operation has been written to the on-disk journal */ + j?: boolean; + /** equivalent to the j option */ + fsync?: boolean | 1; + /** + * Constructs a WriteConcern from the write concern properties. + * @param w - request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. + * @param wtimeout - specify a time limit to prevent write operations from blocking indefinitely + * @param j - request acknowledgment that the write operation has been written to the on-disk journal + * @param fsync - equivalent to the j option + */ + constructor(w?: W, wtimeout?: number, j?: boolean, fsync?: boolean | 1); + /** Construct a WriteConcern given an options object. */ + static fromOptions(options?: WriteConcernOptions | WriteConcern | W, inherit?: WriteConcernOptions | WriteConcern): WriteConcern | undefined; +} + +/** + * An error representing a failure by the server to apply the requested write concern to the bulk operation. + * @public + * @category Error + */ +export declare class WriteConcernError { + /* Excluded from this release type: [kServerError] */ + constructor(error: WriteConcernErrorData); + /** Write concern error code. */ + get code(): number | undefined; + /** Write concern error message. */ + get errmsg(): string | undefined; + /** Write concern error info. */ + get errInfo(): Document | undefined; + /** @deprecated The `err` prop that contained a MongoServerError has been deprecated. */ + get err(): WriteConcernErrorData; + toJSON(): WriteConcernErrorData; + toString(): string; +} + +/** @public */ +export declare interface WriteConcernErrorData { + code: number; + errmsg: string; + errInfo?: Document; +} + +/** @public */ +export declare interface WriteConcernOptions { + /** Write Concern as an object */ + writeConcern?: WriteConcern | WriteConcernSettings; +} + +/** @public */ +export declare interface WriteConcernSettings { + /** The write concern */ + w?: W; + /** The write concern timeout */ + wtimeoutMS?: number; + /** The journal write concern */ + journal?: boolean; + /** The journal write concern */ + j?: boolean; + /** The write concern timeout */ + wtimeout?: number; + /** The file sync write concern */ + fsync?: boolean | 1; +} + +/** + * An error that occurred during a BulkWrite on the server. 
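+ *
+ * @example
+ * A rough sketch of inspecting write errors after an unordered bulk insert; it assumes
+ * the thrown bulk-write error exposes a `writeErrors` field holding WriteError values:
+ * ```js
+ * try {
+ *   await collection.insertMany(docs, { ordered: false });
+ * } catch (err) {
+ *   const writeErrors = [].concat(err.writeErrors ?? []);
+ *   for (const we of writeErrors) {
+ *     console.log(we.index, we.code, we.errmsg);
+ *   }
+ * }
+ * ```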
+ * @public + * @category Error + */ +export declare class WriteError { + err: BulkWriteOperationError; + constructor(err: BulkWriteOperationError); + /** WriteError code. */ + get code(): number; + /** WriteError original bulk operation index. */ + get index(): number; + /** WriteError message. */ + get errmsg(): string | undefined; + /** WriteError details. */ + get errInfo(): Document | undefined; + /** Returns the underlying operation that caused the error */ + getOperation(): Document; + toJSON(): { + code: number; + index: number; + errmsg?: string; + op: Document; + }; + toString(): string; +} + +/* Excluded from this release type: WriteProtocolMessageType */ + +export { } diff --git a/node_modules/mongodb/mongodb.ts34.d.ts b/node_modules/mongodb/mongodb.ts34.d.ts new file mode 100644 index 000000000..85d83c27d --- /dev/null +++ b/node_modules/mongodb/mongodb.ts34.d.ts @@ -0,0 +1,5504 @@ +/// +import { Binary } from 'bson'; +import { BSONRegExp } from 'bson'; +import { BSONSymbol } from 'bson'; +import { Code } from 'bson'; +import { ConnectionOptions as ConnectionOptions_2 } from 'tls'; +import { DBRef } from 'bson'; +import { Decimal128 } from 'bson'; +import Denque = require('denque'); +import { deserialize as deserialize_2 } from 'bson'; +import { DeserializeOptions } from 'bson'; +import * as dns from 'dns'; +import { Document } from 'bson'; +import { Double } from 'bson'; +import { Duplex } from 'stream'; +import { DuplexOptions } from 'stream'; +import { EventEmitter } from 'events'; +import { Int32 } from 'bson'; +import { Long } from 'bson'; +import { Map as Map_2 } from 'bson'; +import { MaxKey } from 'bson'; +import { MinKey } from 'bson'; +import { ObjectId } from 'bson'; +import { Readable } from 'stream'; +import { serialize as serialize_2 } from 'bson'; +import { SerializeOptions } from 'bson'; +import { Socket } from 'net'; +import { TcpNetConnectOpts } from 'net'; +import { Timestamp } from 'bson'; +import { TLSSocket } from 'tls'; +import { TLSSocketOptions } from 'tls'; +import { Writable } from 'stream'; +/** @public */ +export declare abstract class AbstractCursor extends TypedEventEmitter { + /* Excluded from this release type: [kId] */ + /* Excluded from this release type: [kSession] */ + /* Excluded from this release type: [kServer] */ + /* Excluded from this release type: [kNamespace] */ + /* Excluded from this release type: [kDocuments] */ + /* Excluded from this release type: [kTopology] */ + /* Excluded from this release type: [kTransform] */ + /* Excluded from this release type: [kInitialized] */ + /* Excluded from this release type: [kClosed] */ + /* Excluded from this release type: [kKilled] */ + /* Excluded from this release type: [kOptions] */ + /** @event */ + static readonly CLOSE: "close"; + /*Excluded from this release type: __constructor */ + readonly id: Long | undefined; + /*Excluded from this release type: topology + Excluded from this release type: server */ + readonly namespace: MongoDBNamespace; + readonly readPreference: ReadPreference; + readonly readConcern: ReadConcern | undefined; + /*Excluded from this release type: session + Excluded from this release type: session + Excluded from this release type: cursorOptions */ + readonly closed: boolean; + readonly killed: boolean; + readonly loadBalanced: boolean; + /** Returns current buffered documents length */ + bufferedCount(): number; + /** Returns current buffered documents */ + readBufferedDocuments(number?: number): TSchema[]; + [Symbol.asyncIterator](): AsyncIterator; + 
stream(options?: CursorStreamOptions): Readable; + hasNext(): Promise; + hasNext(callback: Callback): void; + /** Get the next available document from the cursor, returns null if no more documents are available. */ + next(): Promise; + next(callback: Callback): void; + next(callback?: Callback): Promise | void; + /** + * Try to get the next available document from the cursor or `null` if an empty batch is returned + */ + tryNext(): Promise; + tryNext(callback: Callback): void; + /** + * Iterates over all the documents for this cursor using the iterator, callback pattern. + * + * @param iterator - The iteration callback. + * @param callback - The end callback. + */ + forEach(iterator: (doc: TSchema) => boolean | void): Promise; + forEach(iterator: (doc: TSchema) => boolean | void, callback: Callback): void; + close(): void; + close(callback: Callback): void; + /** + * @deprecated options argument is deprecated + */ + close(options: CursorCloseOptions): Promise; + /** + * @deprecated options argument is deprecated + */ + close(options: CursorCloseOptions, callback: Callback): void; + /** + * Returns an array of documents. The caller is responsible for making sure that there + * is enough memory to store the results. Note that the array only contains partial + * results when this cursor had been previously accessed. In that case, + * cursor.rewind() can be used to reset the cursor. + * + * @param callback - The result callback. + */ + toArray(): Promise; + toArray(callback: Callback): void; + /** + * Add a cursor flag to the cursor + * + * @param flag - The flag to set, must be one of following ['tailable', 'oplogReplay', 'noCursorTimeout', 'awaitData', 'partial' -. + * @param value - The flag boolean value. + */ + addCursorFlag(flag: CursorFlag, value: boolean): this; + /** + * Map all documents using the provided function + * If there is a transform set on the cursor, that will be called first and the result passed to + * this function's transform. + * + * @remarks + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling map, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor = coll.find(); + * const mappedCursor: FindCursor = cursor.map(doc => Object.keys(doc).length); + * const keyCounts: number[] = await mappedCursor.toArray(); // cursor.toArray() still returns Document[] + * ``` + * @param transform - The mapping transformation method. + */ + map(transform: (doc: TSchema) => T): AbstractCursor; + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadPreference(readPreference: ReadPreferenceLike): this; + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadConcern(readConcern: ReadConcernLike): this; + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value: number): this; + /** + * Set the batch size for the cursor. + * + * @param value - The number of documents to return per batch. 
See {@link https://docs.mongodb.com/manual/reference/command/find/|find command documentation}. + */ + batchSize(value: number): this; + /** + * Rewind this cursor to its uninitialized state. Any options that are present on the cursor will + * remain in effect. Iterating this cursor will cause new queries to be sent to the server, even + * if the resultant data has already been retrieved by this cursor. + */ + rewind(): void; + /** + * Returns a new uninitialized copy of this cursor, with options matching those that have been set on the current instance + */ + abstract clone(): AbstractCursor; +} +/** @public */ +export declare type AbstractCursorEvents = { + [AbstractCursor.CLOSE](): void; +}; +/** @public */ +export declare interface AbstractCursorOptions extends BSONSerializeOptions { + session?: ClientSession; + readPreference?: ReadPreferenceLike; + readConcern?: ReadConcernLike; + batchSize?: number; + maxTimeMS?: number; + comment?: Document | string; + tailable?: boolean; + awaitData?: boolean; + noCursorTimeout?: boolean; +} +/* Excluded from this release type: AbstractOperation */ +/** @public */ +export declare type AcceptedFields = { + readonly [key in KeysOfAType]?: AssignableType; +}; +/** @public */ +export declare type AddToSetOperators = { + $each?: Array>; +}; +/** @public */ +export declare interface AddUserOptions extends CommandOperationOptions { + /** @deprecated Please use db.command('createUser', ...) instead for this option */ + digestPassword?: null; + /** Roles associated with the created user */ + roles?: string | string[] | RoleSpecification | RoleSpecification[]; + /** Custom data associated with the user (only Mongodb 2.6 or higher) */ + customData?: Document; +} +/** + * The **Admin** class is an internal class that allows convenient access to + * the admin functionality and commands for MongoDB. 
+ * + * **ADMIN Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Use the admin database for the operation + * const adminDb = client.db(dbName).admin(); + * + * // List all the available databases + * adminDb.listDatabases(function(err, dbs) { + * expect(err).to.not.exist; + * test.ok(dbs.databases.length > 0); + * client.close(); + * }); + * }); + * ``` + */ +export declare class Admin { + /* Excluded from this release type: s */ + /* Excluded from this release type: __constructor */ + /** + * Execute a command + * + * @param command - The command to execute + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + command(command: Document): Promise; + command(command: Document, callback: Callback): void; + command(command: Document, options: RunCommandOptions): Promise; + command(command: Document, options: RunCommandOptions, callback: Callback): void; + /** + * Retrieve the server build information + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + buildInfo(): Promise; + buildInfo(callback: Callback): void; + buildInfo(options: CommandOperationOptions): Promise; + buildInfo(options: CommandOperationOptions, callback: Callback): void; + /** + * Retrieve the server build information + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + serverInfo(): Promise; + serverInfo(callback: Callback): void; + serverInfo(options: CommandOperationOptions): Promise; + serverInfo(options: CommandOperationOptions, callback: Callback): void; + /** + * Retrieve this db's server status. 
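+ *
+ * @example
+ * A short sketch, assuming an already-connected `client`:
+ * ```js
+ * const status = await client.db('admin').admin().serverStatus();
+ * console.log(status.uptime, status.connections);
+ * ```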
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + serverStatus(): Promise; + serverStatus(callback: Callback): void; + serverStatus(options: CommandOperationOptions): Promise; + serverStatus(options: CommandOperationOptions, callback: Callback): void; + /** + * Ping the MongoDB server and retrieve results + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + ping(): Promise; + ping(callback: Callback): void; + ping(options: CommandOperationOptions): Promise; + ping(options: CommandOperationOptions, callback: Callback): void; + /** + * Add a user to the database + * + * @param username - The username for the new user + * @param password - An optional password for the new user + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + addUser(username: string): Promise; + addUser(username: string, callback: Callback): void; + addUser(username: string, password: string): Promise; + addUser(username: string, password: string, callback: Callback): void; + addUser(username: string, options: AddUserOptions): Promise; + addUser(username: string, options: AddUserOptions, callback: Callback): void; + addUser(username: string, password: string, options: AddUserOptions): Promise; + addUser(username: string, password: string, options: AddUserOptions, callback: Callback): void; + /** + * Remove a user from a database + * + * @param username - The username to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + removeUser(username: string): Promise; + removeUser(username: string, callback: Callback): void; + removeUser(username: string, options: RemoveUserOptions): Promise; + removeUser(username: string, options: RemoveUserOptions, callback: Callback): void; + /** + * Validate an existing collection + * + * @param collectionName - The name of the collection to validate. 
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + validateCollection(collectionName: string): Promise; + validateCollection(collectionName: string, callback: Callback): void; + validateCollection(collectionName: string, options: ValidateCollectionOptions): Promise; + validateCollection(collectionName: string, options: ValidateCollectionOptions, callback: Callback): void; + /** + * List the available databases + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + listDatabases(): Promise; + listDatabases(callback: Callback): void; + listDatabases(options: ListDatabasesOptions): Promise; + listDatabases(options: ListDatabasesOptions, callback: Callback): void; + /** + * Get ReplicaSet status + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + replSetGetStatus(): Promise; + replSetGetStatus(callback: Callback): void; + replSetGetStatus(options: CommandOperationOptions): Promise; + replSetGetStatus(options: CommandOperationOptions, callback: Callback): void; +} +/* Excluded from this release type: AdminPrivate */ +/* Excluded from this release type: AggregateOperation */ +/** @public */ +export declare interface AggregateOptions extends CommandOperationOptions { + /** allowDiskUse lets the server know if it can use disk to store temporary results for the aggregation (requires mongodb 2.6 \>). */ + allowDiskUse?: boolean; + /** The number of documents to return per batch. See [aggregation documentation](https://docs.mongodb.com/manual/reference/command/aggregate). */ + batchSize?: number; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** Return the query as cursor, on 2.6 \> it returns as a real cursor on pre 2.6 it returns as an emulated cursor. */ + cursor?: Document; + /** specifies a cumulative time limit in milliseconds for processing operations on the cursor. MongoDB interrupts the operation at the earliest following interrupt point. */ + maxTimeMS?: number; + /** The maximum amount of time for the server to wait on new documents to satisfy a tailable cursor query. */ + maxAwaitTimeMS?: number; + /** Specify collation. */ + collation?: CollationOptions; + /** Add an index selection hint to an aggregation command */ + hint?: Hint; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; + out?: string; +} +/** + * The **AggregationCursor** class is an internal class that embodies an aggregation cursor on MongoDB + * allowing for iteration over the results returned from the underlying query. 
It supports + * one by one document iteration, conversion to an array or can be iterated as a Node 4.X + * or higher stream + * @public + */ +export declare class AggregationCursor extends AbstractCursor { + /*Excluded from this release type: [kPipeline] + Excluded from this release type: [kOptions] + Excluded from this release type: __constructor */ + readonly pipeline: Document[]; + clone(): AggregationCursor; + map(transform: (doc: TSchema) => T): AggregationCursor; + /* Excluded from this release type: _initialize */ + /** Execute the explain for the cursor */ + explain(): Promise; + explain(callback: Callback): void; + explain(verbosity: ExplainVerbosityLike): Promise; + /** Add a group stage to the aggregation pipeline */ + group($group: Document): AggregationCursor; + /** Add a limit stage to the aggregation pipeline */ + limit($limit: number): this; + /** Add a match stage to the aggregation pipeline */ + match($match: Document): this; + /** Add an out stage to the aggregation pipeline */ + out($out: { + db: string; + coll: string; + } | string): this; + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: AggregationCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: AggregationCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. 
+ * Take note of the following example: + * + * @example + * ```typescript + * const cursor: AggregationCursor<{ a: number; b: string }> = coll.aggregate([]); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.aggregate().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project($project: Document): AggregationCursor; + /** Add a lookup stage to the aggregation pipeline */ + lookup($lookup: Document): this; + /** Add a redact stage to the aggregation pipeline */ + redact($redact: Document): this; + /** Add a skip stage to the aggregation pipeline */ + skip($skip: number): this; + /** Add a sort stage to the aggregation pipeline */ + sort($sort: Sort): this; + /** Add a unwind stage to the aggregation pipeline */ + unwind($unwind: Document | string): this; + /** @deprecated Add a geoNear stage to the aggregation pipeline */ + geoNear($geoNear: Document): this; +} +/** @public */ +export declare interface AggregationCursorOptions extends AbstractCursorOptions, AggregateOptions { +} +/** + * It is possible to search using alternative types in mongodb e.g. + * string types can be searched using a regex in mongo + * array types can be searched using their element type + * @public + */ +export declare type AlternativeType = T extends ReadonlyArray ? T | RegExpOrString : RegExpOrString; +/** @public */ +export declare type AnyBulkWriteOperation = { + insertOne: InsertOneModel; +} | { + replaceOne: ReplaceOneModel; +} | { + updateOne: UpdateOneModel; +} | { + updateMany: UpdateManyModel; +} | { + deleteOne: DeleteOneModel; +} | { + deleteMany: DeleteManyModel; +}; +/** @public */ +export declare type AnyError = MongoError | Error; +/** @public */ +export declare type ArrayOperator = { + $each?: Array>; + $slice?: number; + $position?: number; + $sort?: Sort; +}; +/** @public */ +export declare interface Auth { + /** The username for auth */ + username?: string; + /** The password for auth */ + password?: string; +} +/** @public */ +export declare const AuthMechanism: Readonly<{ + readonly MONGODB_AWS: "MONGODB-AWS"; + readonly MONGODB_CR: "MONGODB-CR"; + readonly MONGODB_DEFAULT: "DEFAULT"; + readonly MONGODB_GSSAPI: "GSSAPI"; + readonly MONGODB_PLAIN: "PLAIN"; + readonly MONGODB_SCRAM_SHA1: "SCRAM-SHA-1"; + readonly MONGODB_SCRAM_SHA256: "SCRAM-SHA-256"; + readonly MONGODB_X509: "MONGODB-X509"; +}>; +/** @public */ +export declare type AuthMechanism = typeof AuthMechanism[keyof typeof AuthMechanism]; +/** @public */ +export declare interface AuthMechanismProperties extends Document { + SERVICE_NAME?: string; + SERVICE_REALM?: string; + CANONICALIZE_HOST_NAME?: boolean; + AWS_SESSION_TOKEN?: string; +} +/** @public */ +export declare interface AutoEncrypter { + new (client: MongoClient, options: AutoEncryptionOptions): AutoEncrypter; + init(cb: Callback): void; + teardown(force: boolean, callback: Callback): void; + encrypt(ns: string, cmd: Document, options: any, callback: Callback): void; + decrypt(cmd: Document, options: any, callback: Callback): void; +} +/** @public */ +export declare const AutoEncryptionLoggerLevel: Readonly<{ + readonly FatalError: 0; + readonly Error: 1; + readonly Warning: 2; + readonly Info: 3; + readonly Trace: 4; +}>; +/** @public */ +export declare type AutoEncryptionLoggerLevel = typeof AutoEncryptionLoggerLevel[keyof 
typeof AutoEncryptionLoggerLevel]; +/** @public */ +export declare interface AutoEncryptionOptions { + /* Excluded from this release type: bson */ + /* Excluded from this release type: metadataClient */ + /** A `MongoClient` used to fetch keys from a key vault */ + keyVaultClient?: MongoClient; + /** The namespace where keys are stored in the key vault */ + keyVaultNamespace?: string; + /** Configuration options that are used by specific KMS providers during key generation, encryption, and decryption. */ + kmsProviders?: { + /** Configuration options for using 'aws' as your KMS provider */ + aws?: { + /** The access key used for the AWS KMS provider */ + accessKeyId: string; + /** The secret access key used for the AWS KMS provider */ + secretAccessKey: string; + /** + * An optional AWS session token that will be used as the + * X-Amz-Security-Token header for AWS requests. + */ + sessionToken?: string; + }; + /** Configuration options for using 'local' as your KMS provider */ + local?: { + /** + * The master key used to encrypt/decrypt data keys. + * A 96-byte long Buffer or base64 encoded string. + */ + key: Buffer | string; + }; + /** Configuration options for using 'azure' as your KMS provider */ + azure?: { + /** The tenant ID identifies the organization for the account */ + tenantId: string; + /** The client ID to authenticate a registered application */ + clientId: string; + /** The client secret to authenticate a registered application */ + clientSecret: string; + /** + * If present, a host with optional port. E.g. "example.com" or "example.com:443". + * This is optional, and only needed if customer is using a non-commercial Azure instance + * (e.g. a government or China account, which use different URLs). + * Defaults to "login.microsoftonline.com" + */ + identityPlatformEndpoint?: string | undefined; + }; + /** Configuration options for using 'gcp' as your KMS provider */ + gcp?: { + /** The service account email to authenticate */ + email: string; + /** A PKCS#8 encrypted key. This can either be a base64 string or a binary representation */ + privateKey: string | Buffer; + /** + * If present, a host with optional port. E.g. "example.com" or "example.com:443". + * Defaults to "oauth2.googleapis.com" + */ + endpoint?: string | undefined; + }; + }; + /** + * A map of namespaces to a local JSON schema for encryption + * + * **NOTE**: Supplying options.schemaMap provides more security than relying on JSON Schemas obtained from the server. + * It protects against a malicious server advertising a false JSON Schema, which could trick the client into sending decrypted data that should be encrypted. + * Schemas supplied in the schemaMap only apply to configuring automatic encryption for client side encryption. + * Other validation rules in the JSON schema will not be enforced by the driver and will result in an error. + */ + schemaMap?: Document; + /** Allows the user to bypass auto encryption, maintaining implicit decryption */ + bypassAutoEncryption?: boolean; + options?: { + /** An optional hook to catch logging messages from the underlying encryption engine */ + logger?: (level: AutoEncryptionLoggerLevel, message: string) => void; + }; + extraOptions?: { + /** + * A local process the driver communicates with to determine how to encrypt values in a command. 
+ * Defaults to "mongodb://%2Fvar%2Fmongocryptd.sock" if domain sockets are available or "mongodb://localhost:27020" otherwise + */ + mongocryptdURI?: string; + /** If true, autoEncryption will not attempt to spawn a mongocryptd before connecting */ + mongocryptdBypassSpawn?: boolean; + /** The path to the mongocryptd executable on the system */ + mongocryptdSpawnPath?: string; + /** Command line arguments to use when auto-spawning a mongocryptd */ + mongocryptdSpawnArgs?: string[]; + }; +} +/** + * Keeps the state of a unordered batch so we can rewrite the results + * correctly after command execution + * + * @public + */ +export declare class Batch { + originalZeroIndex: number; + currentIndex: number; + originalIndexes: number[]; + batchType: BatchType; + operations: T[]; + size: number; + sizeBytes: number; + constructor(batchType: BatchType, originalZeroIndex: number); +} +/** @public */ +export declare const BatchType: Readonly<{ + readonly INSERT: 1; + readonly UPDATE: 2; + readonly DELETE: 3; +}>; +/** @public */ +export declare type BatchType = typeof BatchType[keyof typeof BatchType]; +export { Binary }; +/** @public */ +export declare type BitwiseFilter = number /** numeric bit mask */ | Binary /** BinData bit mask */ | ReadonlyArray; +export { BSONRegExp }; +/** + * BSON Serialization options. + * @public + */ +export declare interface BSONSerializeOptions extends Pick>, Pick> { + /** Return BSON filled buffers from operations */ + raw?: boolean; +} +export { BSONSymbol }; +/** @public */ +export declare const BSONType: Readonly<{ + readonly double: 1; + readonly string: 2; + readonly object: 3; + readonly array: 4; + readonly binData: 5; + readonly undefined: 6; + readonly objectId: 7; + readonly bool: 8; + readonly date: 9; + readonly null: 10; + readonly regex: 11; + readonly dbPointer: 12; + readonly javascript: 13; + readonly symbol: 14; + readonly javascriptWithScope: 15; + readonly int: 16; + readonly timestamp: 17; + readonly long: 18; + readonly decimal: 19; + readonly minKey: -1; + readonly maxKey: 127; +}>; +/** @public */ +export declare type BSONType = typeof BSONType[keyof typeof BSONType]; +/** @public */ +export declare type BSONTypeAlias = keyof typeof BSONType; +/* Excluded from this release type: BufferPool */ +/** @public */ +export declare abstract class BulkOperationBase { + isOrdered: boolean; + /* Excluded from this release type: s */ + operationId?: number; + /* Excluded from this release type: __constructor */ + /** + * Add a single insert document to the bulk operation + * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Adds three inserts to the bulkOp. + * bulkOp + * .insert({ a: 1 }) + * .insert({ b: 2 }) + * .insert({ c: 3 }); + * await bulkOp.execute(); + * ``` + */ + insert(document: Document): BulkOperationBase; + /** + * Builds a find operation for an update/updateOne/delete/deleteOne/replaceOne. + * Returns a builder object used to complete the definition of the operation. 
+ * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Add an updateOne to the bulkOp + * bulkOp.find({ a: 1 }).updateOne({ $set: { b: 2 } }); + * + * // Add an updateMany to the bulkOp + * bulkOp.find({ c: 3 }).update({ $set: { d: 4 } }); + * + * // Add an upsert + * bulkOp.find({ e: 5 }).upsert().updateOne({ $set: { f: 6 } }); + * + * // Add a deletion + * bulkOp.find({ g: 7 }).deleteOne(); + * + * // Add a multi deletion + * bulkOp.find({ h: 8 }).delete(); + * + * // Add a replaceOne + * bulkOp.find({ i: 9 }).replaceOne({writeConcern: { j: 10 }}); + * + * // Update using a pipeline (requires Mongodb 4.2 or higher) + * bulk.find({ k: 11, y: { $exists: true }, z: { $exists: true } }).updateOne([ + * { $set: { total: { $sum: [ '$y', '$z' ] } } } + * ]); + * + * // All of the ops will now be executed + * await bulkOp.execute(); + * ``` + */ + find(selector: Document): FindOperators; + /** Specifies a raw operation to perform in the bulk write. */ + raw(op: AnyBulkWriteOperation): this; + readonly bsonOptions: BSONSerializeOptions; + readonly writeConcern: WriteConcern | undefined; + readonly batches: Batch[]; + /** An internal helper method. Do not invoke directly. Will be going away in the future */ + execute(options?: BulkWriteOptions, callback?: Callback): Promise | void; + /* Excluded from this release type: handleWriteError */ + abstract addToOperationsList(batchType: BatchType, document: Document | UpdateStatement | DeleteStatement): this; +} +/* Excluded from this release type: BulkOperationPrivate */ +/** @public */ +export declare interface BulkResult { + ok: number; + writeErrors: WriteError[]; + writeConcernErrors: WriteConcernError[]; + insertedIds: Document[]; + nInserted: number; + nUpserted: number; + nMatched: number; + nModified: number; + nRemoved: number; + upserted: Document[]; + opTime?: Document; +} +/** @public */ +export declare interface BulkWriteOperationError { + index: number; + code: number; + errmsg: string; + errInfo: Document; + op: Document | UpdateStatement | DeleteStatement; +} +/** @public */ +export declare interface BulkWriteOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails. */ + ordered?: boolean; + /** @deprecated use `ordered` instead */ + keepGoing?: boolean; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; +} +/** + * @public + * The result of a bulk write. + */ +export declare class BulkWriteResult { + result: BulkResult; + /*Excluded from this release type: __constructor + Number of documents inserted. */ + readonly insertedCount: number; + /*Number of documents matched for update. */ + readonly matchedCount: number; + /*Number of documents modified. */ + readonly modifiedCount: number; + /*Number of documents deleted. */ + readonly deletedCount: number; + /*Number of documents upserted. 
*/ + readonly upsertedCount: number; + /*Upserted document generated Id's, hash key is the index of the originating operation */ + readonly upsertedIds: { + [key: number]: any; + }; + /*Inserted document generated Id's, hash key is the index of the originating operation */ + readonly insertedIds: { + [key: number]: any; + }; + /*Evaluates to true if the bulk operation correctly executes */ + readonly ok: number; + /*The number of inserted documents */ + readonly nInserted: number; + /*Number of upserted documents */ + readonly nUpserted: number; + /*Number of matched documents */ + readonly nMatched: number; + /*Number of documents updated physically on disk */ + readonly nModified: number; + /*Number of removed documents */ + readonly nRemoved: number; + /** Returns an array of all inserted ids */ + getInsertedIds(): Document[]; + /** Returns an array of all upserted ids */ + getUpsertedIds(): Document[]; + /** Returns the upserted id at the given index */ + getUpsertedIdAt(index: number): Document | undefined; + /** Returns raw internal result */ + getRawResponse(): Document; + /** Returns true if the bulk operation contains a write error */ + hasWriteErrors(): boolean; + /** Returns the number of write errors off the bulk operation */ + getWriteErrorCount(): number; + /** Returns a specific write error object */ + getWriteErrorAt(index: number): WriteError | undefined; + /** Retrieve all write errors */ + getWriteErrors(): WriteError[]; + /** Retrieve lastOp if available */ + getLastOp(): Document | undefined; + /** Retrieve the write concern error if one exists */ + getWriteConcernError(): WriteConcernError | undefined; + toJSON(): BulkResult; + toString(): string; + isOk(): boolean; +} +/** + * MongoDB Driver style callback + * @public + */ +export declare type Callback = (error?: AnyError, result?: T) => void; +/** @public */ +export declare class CancellationToken extends TypedEventEmitter<{ + cancel(): void; +}> { +} +/** + * Creates a new Change Stream instance. Normally created using {@link Collection#watch|Collection.watch()}. + * @public + */ +export declare class ChangeStream extends TypedEventEmitter { + pipeline: Document[]; + options: ChangeStreamOptions; + parent: MongoClient | Db | Collection; + namespace: MongoDBNamespace; + type: symbol; + /* Excluded from this release type: cursor */ + streamOptions?: CursorStreamOptions; + /* Excluded from this release type: [kResumeQueue] */ + /* Excluded from this release type: [kCursorStream] */ + /* Excluded from this release type: [kClosed] */ + /* Excluded from this release type: [kMode] */ + /** @event */ + static readonly RESPONSE: "response"; + /** @event */ + static readonly MORE: "more"; + /** @event */ + static readonly INIT: "init"; + /** @event */ + static readonly CLOSE: "close"; + /** + * Fired for each new matching change in the specified namespace. Attaching a `change` + * event listener to a Change Stream will switch the stream into flowing mode. Data will + * then be passed as soon as it is available. + * @event + */ + static readonly CHANGE: "change"; + /** @event */ + static readonly END: "end"; + /** @event */ + static readonly ERROR: "error"; + /** + * Emitted each time the change stream stores a new resume token. + * @event + */ + static readonly RESUME_TOKEN_CHANGED: "resumeTokenChanged"; + /*Excluded from this release type: __constructor + Excluded from this release type: cursorStream + The cached resume token that is used to resume after the most recently returned change. 
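+ A minimal sketch of saving this token and resuming the stream from it later (an
+ illustrative addition; `collection` and `lastToken` are assumed names):
+ ```js
+ let lastToken;
+ const changeStream = collection.watch();
+ changeStream.on('change', change => {
+   // Persist the most recent resume token so the stream can be restarted later
+   lastToken = changeStream.resumeToken;
+ });
+ // Later: pick up where the previous stream left off
+ const resumed = collection.watch([], { resumeAfter: lastToken });
+ ```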
*/ + readonly resumeToken: ResumeToken; + /** Check if there is any document still available in the Change Stream */ + hasNext(callback?: Callback): Promise | void; + /** Get the next available document from the Change Stream. */ + next(): Promise>; + next(callback: Callback>): void; + /*Is the cursor closed */ + readonly closed: boolean; + /** Close the Change Stream */ + close(callback?: Callback): Promise | void; + /** + * Return a modified Readable stream including a possible transform method. + * @throws MongoDriverError if this.cursor is undefined + */ + stream(options?: CursorStreamOptions): Readable; + /** + * Try to get the next available document from the Change Stream's cursor or `null` if an empty batch is returned + */ + tryNext(): Promise; + tryNext(callback: Callback): void; +} +/* Excluded from this release type: ChangeStreamCursor */ +/* Excluded from this release type: ChangeStreamCursorOptions */ +/** @public */ +export declare interface ChangeStreamDocument { + /** + * The id functions as an opaque token for use when resuming an interrupted + * change stream. + */ + _id: InferIdType; + /** + * Describes the type of operation represented in this change notification. + */ + operationType: 'insert' | 'update' | 'replace' | 'delete' | 'invalidate' | 'drop' | 'dropDatabase' | 'rename'; + /** + * Contains two fields: “db” and “coll” containing the database and + * collection name in which the change happened. + */ + ns: { + db: string; + coll: string; + }; + /** + * Only present for ops of type ‘insert’, ‘update’, ‘replace’, and + * ‘delete’. + * + * For unsharded collections this contains a single field, _id, with the + * value of the _id of the document updated. For sharded collections, + * this will contain all the components of the shard key in order, + * followed by the _id if the _id isn’t part of the shard key. + */ + documentKey?: InferIdType; + /** + * Only present for ops of type ‘update’. + * + * Contains a description of updated and removed fields in this + * operation. + */ + updateDescription?: UpdateDescription; + /** + * Always present for operations of type ‘insert’ and ‘replace’. Also + * present for operations of type ‘update’ if the user has specified ‘updateLookup’ + * in the ‘fullDocument’ arguments to the ‘$changeStream’ stage. + * + * For operations of type ‘insert’ and ‘replace’, this key will contain the + * document being inserted, or the new version of the document that is replacing + * the existing document, respectively. + * + * For operations of type ‘update’, this key will contain a copy of the full + * version of the document from some point after the update occurred. If the + * document was deleted since the updated happened, it will be null. + */ + fullDocument?: TSchema; +} +/** @public */ +export declare type ChangeStreamEvents = { + resumeTokenChanged(token: ResumeToken): void; + init(response: Document): void; + more(response?: Document | undefined): void; + response(): void; + end(): void; + error(error: Error): void; + change(change: ChangeStreamDocument): void; +} & AbstractCursorEvents; +/** + * Options that can be passed to a ChangeStream. Note that startAfter, resumeAfter, and startAtOperationTime are all mutually exclusive, and the server will error if more than one is specified. + * @public + */ +export declare interface ChangeStreamOptions extends AggregateOptions { + /** Allowed values: ‘default’, ‘updateLookup’. 
When set to ‘updateLookup’, the change stream will include both a delta describing the changes to the document, as well as a copy of the entire document that was changed from some time after the change occurred. */ + fullDocument?: string; + /** The maximum amount of time for the server to wait on new documents to satisfy a change stream query. */ + maxAwaitTimeMS?: number; + /** Allows you to start a changeStream after a specified event. See {@link https://docs.mongodb.com/master/changeStreams/#resumeafter-for-change-streams|ChangeStream documentation}. */ + resumeAfter?: ResumeToken; + /** Similar to resumeAfter, but will allow you to start after an invalidated event. See {@link https://docs.mongodb.com/master/changeStreams/#startafter-for-change-streams|ChangeStream documentation}. */ + startAfter?: ResumeToken; + /** Will start the changeStream after the specified operationTime. */ + startAtOperationTime?: OperationTime; + /** The number of documents to return per batch. See {@link https://docs.mongodb.com/manual/reference/command/aggregate|aggregation documentation}. */ + batchSize?: number; +} +/** @public */ +export declare interface ClientMetadata { + driver: { + name: string; + version: string; + }; + os: { + type: string; + name: NodeJS.Platform; + architecture: string; + version: string; + }; + platform: string; + version?: string; + application?: { + name: string; + }; +} +/** @public */ +export declare interface ClientMetadataOptions { + driverInfo?: { + name?: string; + version?: string; + platform?: string; + }; + appName?: string; +} +/** + * A class representing a client session on the server + * + * NOTE: not meant to be instantiated directly. + * @public + */ +export declare class ClientSession extends TypedEventEmitter { + /* Excluded from this release type: topology */ + /* Excluded from this release type: sessionPool */ + hasEnded: boolean; + clientOptions?: MongoOptions; + supports: { + causalConsistency: boolean; + }; + clusterTime?: ClusterTime; + operationTime?: Timestamp; + explicit: boolean; + /* Excluded from this release type: owner */ + defaultTransactionOptions: TransactionOptions; + transaction: Transaction; + /*Excluded from this release type: [kServerSession] + Excluded from this release type: [kSnapshotTime] + Excluded from this release type: [kSnapshotEnabled] + Excluded from this release type: [kPinnedConnection] + Excluded from this release type: __constructor + The server id associated with this session */ + readonly id: ServerSessionId | undefined; + readonly serverSession: ServerSession; + /*Whether or not this session is configured for snapshot reads */ + readonly snapshotEnabled: boolean; + readonly loadBalanced: boolean; + /*Excluded from this release type: pinnedConnection + Excluded from this release type: pin + Excluded from this release type: unpin */ + readonly isPinned: boolean; + /** + * Ends this session on the server + * + * @param options - Optional settings. Currently reserved for future use + * @param callback - Optional callback for completion of this operation + */ + endSession(): Promise; + endSession(callback: Callback): void; + endSession(options: EndSessionOptions): Promise; + endSession(options: EndSessionOptions, callback: Callback): void; + /** + * Advances the operationTime for a ClientSession. 
+ * + * @param operationTime - the `BSON.Timestamp` of the operation type it is desired to advance to + */ + advanceOperationTime(operationTime: Timestamp): void; + /** + * Advances the clusterTime for a ClientSession to the provided clusterTime of another ClientSession + * + * @param clusterTime - the $clusterTime returned by the server from another session in the form of a document containing the `BSON.Timestamp` clusterTime and signature + */ + advanceClusterTime(clusterTime: ClusterTime): void; + /** + * Used to determine if this session equals another + * + * @param session - The session to compare to + */ + equals(session: ClientSession): boolean; + /** Increment the transaction number on the internal ServerSession */ + incrementTransactionNumber(): void; + /** @returns whether this session is currently in a transaction or not */ + inTransaction(): boolean; + /** + * Starts a new transaction with the given options. + * + * @param options - Options for the transaction + */ + startTransaction(options?: TransactionOptions): void; + /** + * Commits the currently active transaction in this session. + * + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + commitTransaction(): Promise; + commitTransaction(callback: Callback): void; + /** + * Aborts the currently active transaction in this session. + * + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + abortTransaction(): Promise; + abortTransaction(callback: Callback): void; + /** + * This is here to ensure that ClientSession is never serialized to BSON. + */ + toBSON(): never; + /** + * Runs a provided lambda within a transaction, retrying either the commit operation + * or entire transaction as needed (and when the error permits) to better ensure that + * the transaction can complete successfully. + * + * IMPORTANT: This method requires the user to return a Promise, all lambdas that do not + * return a Promise will result in undefined behavior. + * + * @param fn - A lambda to run within a transaction + * @param options - Optional settings for the transaction + */ + withTransaction(fn: WithTransactionCallback, options?: TransactionOptions): ReturnType; +} +/** @public */ +export declare type ClientSessionEvents = { + ended(session: ClientSession): void; +}; +/** @public */ +export declare interface ClientSessionOptions { + /** Whether causal consistency should be enabled on this session */ + causalConsistency?: boolean; + /** Whether all read operations should be read from the same snapshot for this session (NOTE: not compatible with `causalConsistency=true`) */ + snapshot?: boolean; + /** The default TransactionOptions to use for transactions started on this session. */ + defaultTransactionOptions?: TransactionOptions; +} +/** @public */ +export declare interface CloseOptions { + force?: boolean; +} +/** @public */ +export declare interface ClusterTime { + clusterTime: Timestamp; + signature: { + hash: Binary; + keyId: Long; + }; +} +export { Code }; +/** @public */ +export declare interface CollationOptions { + locale: string; + caseLevel?: boolean; + caseFirst?: string; + strength?: number; + numericOrdering?: boolean; + alternate?: string; + maxVariable?: string; + backwards?: boolean; + normalization?: boolean; +} +/** + * The **Collection** class is an internal class that embodies a MongoDB collection + * allowing for insert/update/remove/find and other command operation on that MongoDB collection. 
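+ *
+ * A minimal async/await sketch of typical collection usage (an illustrative addition,
+ * not part of the original driver docs; the `tasks` collection name is only an assumption):
+ * ```js
+ * const { MongoClient } = require('mongodb');
+ *
+ * async function run() {
+ *   const client = await MongoClient.connect('mongodb://localhost:27017');
+ *   const tasks = client.db('test').collection('tasks');
+ *   const { insertedId } = await tasks.insertOne({ title: 'write docs', done: false });
+ *   await tasks.updateOne({ _id: insertedId }, { $set: { done: true } });
+ *   const stillOpen = await tasks.find({ done: false }).toArray();
+ *   console.log(stillOpen.length);
+ *   await client.close();
+ * }
+ * ```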
+ * + * **COLLECTION Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Create a collection we want to drop later + * const col = client.db(dbName).collection('createIndexExample1'); + * // Show that duplicate records got dropped + * col.find({}).toArray(function(err, items) { + * expect(err).to.not.exist; + * test.equal(4, items.length); + * client.close(); + * }); + * }); + * ``` + */ +export declare class Collection { + /*Excluded from this release type: s + Excluded from this release type: __constructor + + * The name of the database this collection belongs to + */ + readonly dbName: string; + /* + * The name of this collection + */ + readonly collectionName: string; + /* + * The namespace of this collection, in the format `${this.dbName}.${this.collectionName}` + */ + readonly namespace: string; + /* + * The current readConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + readonly readConcern: ReadConcern | undefined; + /* + * The current readPreference of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + readonly readPreference: ReadPreference | undefined; + readonly bsonOptions: BSONSerializeOptions; + /* + * The current writeConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + readonly writeConcern: WriteConcern | undefined; + /*The current index hint for the collection */ + hint: Hint | undefined; + /** + * Inserts a single document into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @param doc - The document to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insertOne(doc: OptionalId): Promise>; + insertOne(doc: OptionalId, callback: Callback>): void; + insertOne(doc: OptionalId, options: InsertOneOptions): Promise>; + insertOne(doc: OptionalId, options: InsertOneOptions, callback: Callback>): void; + /** + * Inserts an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. 
+ * + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insertMany(docs: OptionalId[]): Promise>; + insertMany(docs: OptionalId[], callback: Callback>): void; + insertMany(docs: OptionalId[], options: BulkWriteOptions): Promise>; + insertMany(docs: OptionalId[], options: BulkWriteOptions, callback: Callback>): void; + /** + * Perform a bulkWrite operation without a fluent API + * + * Legal operation types are + * + * ```js + * { insertOne: { document: { a: 1 } } } + * + * { updateOne: { filter: {a:2}, update: {$set: {a:2}}, upsert:true } } + * + * { updateMany: { filter: {a:2}, update: {$set: {a:2}}, upsert:true } } + * + * { updateMany: { filter: {}, update: {$set: {"a.$[i].x": 5}}, arrayFilters: [{ "i.x": 5 }]} } + * + * { deleteOne: { filter: {c:1} } } + * + * { deleteMany: { filter: {c:1} } } + * + * { replaceOne: { filter: {c:3}, replacement: {c:4}, upsert:true} } + *``` + * Please note that raw operations are no longer accepted as of driver version 4.0. + * + * If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @param operations - Bulk operations to perform + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * @throws MongoDriverError if operations is not an array + */ + bulkWrite(operations: AnyBulkWriteOperation[]): Promise; + bulkWrite(operations: AnyBulkWriteOperation[], callback: Callback): void; + bulkWrite(operations: AnyBulkWriteOperation[], options: BulkWriteOptions): Promise; + bulkWrite(operations: AnyBulkWriteOperation[], options: BulkWriteOptions, callback: Callback): void; + /** + * Update a single document in a collection + * + * @param filter - The filter used to select the document to update + * @param update - The update operations to be applied to the document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + updateOne(filter: Filter, update: UpdateFilter | Partial): Promise; + updateOne(filter: Filter, update: UpdateFilter | Partial, callback: Callback): void; + updateOne(filter: Filter, update: UpdateFilter | Partial, options: UpdateOptions): Promise; + updateOne(filter: Filter, update: UpdateFilter | Partial, options: UpdateOptions, callback: Callback): void; + /** + * Replace a document in a collection with another document + * + * @param filter - The filter used to select the document to replace + * @param replacement - The Document that replaces the matching document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + replaceOne(filter: Filter, replacement: TSchema): Promise; + replaceOne(filter: Filter, replacement: TSchema, callback: Callback): void; + replaceOne(filter: Filter, replacement: TSchema, options: ReplaceOptions): Promise; + replaceOne(filter: Filter, replacement: TSchema, options: ReplaceOptions, callback: Callback): void; + /** + * Update multiple documents in a collection + * + * @param filter - The filter used to select the documents to update + * @param update - The update operations to be applied to the documents + * 
@param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + updateMany(filter: Filter, update: UpdateFilter): Promise; + updateMany(filter: Filter, update: UpdateFilter, callback: Callback): void; + updateMany(filter: Filter, update: UpdateFilter, options: UpdateOptions): Promise; + updateMany(filter: Filter, update: UpdateFilter, options: UpdateOptions, callback: Callback): void; + /** + * Delete a document from a collection + * + * @param filter - The filter used to select the document to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + deleteOne(filter: Filter): Promise; + deleteOne(filter: Filter, callback: Callback): void; + deleteOne(filter: Filter, options: DeleteOptions): Promise; + deleteOne(filter: Filter, options: DeleteOptions, callback?: Callback): void; + /** + * Delete multiple documents from a collection + * + * @param filter - The filter used to select the documents to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + deleteMany(filter: Filter): Promise; + deleteMany(filter: Filter, callback: Callback): void; + deleteMany(filter: Filter, options: DeleteOptions): Promise; + deleteMany(filter: Filter, options: DeleteOptions, callback: Callback): void; + /** + * Rename the collection. + * + * @remarks + * This operation does not inherit options from the Db or MongoClient. + * + * @param newName - New name of of the collection. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + rename(newName: string): Promise; + rename(newName: string, callback: Callback): void; + rename(newName: string, options: RenameOptions): Promise; + rename(newName: string, options: RenameOptions, callback: Callback): void; + /** + * Drop the collection from the database, removing it permanently. New accesses will create a new collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + drop(): Promise; + drop(callback: Callback): void; + drop(options: DropCollectionOptions): Promise; + drop(options: DropCollectionOptions, callback: Callback): void; + /** + * Fetches the first document that matches the filter + * + * @param filter - Query for find Operation + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOne(): Promise; + findOne(callback: Callback): void; + findOne(filter: Filter): Promise; + findOne(filter: Filter, callback: Callback): void; + findOne(filter: Filter, options: FindOptions): Promise; + findOne(filter: Filter, options: FindOptions, callback: Callback): void; + findOne(): Promise; + findOne(callback: Callback): void; + findOne(filter: Filter): Promise; + findOne(filter: Filter, options?: FindOptions): Promise; + findOne(filter: Filter, options?: FindOptions, callback?: Callback): void; + /** + * Creates a cursor for a filter that can be used to iterate over results from MongoDB + * + * @param filter - The filter predicate. 
If unspecified, then all documents in the collection will match the predicate + */ + find(): FindCursor; + find(filter: Filter, options?: FindOptions): FindCursor; + find(filter: Filter, options?: FindOptions): FindCursor; + /** + * Returns the options of the collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + options(): Promise; + options(callback: Callback): void; + options(options: OperationOptions): Promise; + options(options: OperationOptions, callback: Callback): void; + /** + * Returns if the collection is a capped collection + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + isCapped(): Promise; + isCapped(callback: Callback): void; + isCapped(options: OperationOptions): Promise; + isCapped(options: OperationOptions, callback: Callback): void; + /** + * Creates an index on the db and collection collection. + * + * @param indexSpec - The field name or index specification to create an index for + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @example + * ```js + * const collection = client.db('foo').collection('bar'); + * + * await collection.createIndex({ a: 1, b: -1 }); + * + * // Alternate syntax for { c: 1, d: -1 } that ensures order of indexes + * await collection.createIndex([ [c, 1], [d, -1] ]); + * + * // Equivalent to { e: 1 } + * await collection.createIndex('e'); + * + * // Equivalent to { f: 1, g: 1 } + * await collection.createIndex(['f', 'g']) + * + * // Equivalent to { h: 1, i: -1 } + * await collection.createIndex([ { h: 1 }, { i: -1 } ]); + * + * // Equivalent to { j: 1, k: -1, l: 2d } + * await collection.createIndex(['j', ['k', -1], { l: '2d' }]) + * ``` + */ + createIndex(indexSpec: IndexSpecification): Promise; + createIndex(indexSpec: IndexSpecification, callback: Callback): void; + createIndex(indexSpec: IndexSpecification, options: CreateIndexesOptions): Promise; + createIndex(indexSpec: IndexSpecification, options: CreateIndexesOptions, callback: Callback): void; + /** + * Creates multiple indexes in the collection, this method is only supported for + * MongoDB 2.6 or higher. Earlier version of MongoDB will throw a command not supported + * error. + * + * **Note**: Unlike {@link Collection#createIndex| createIndex}, this function takes in raw index specifications. + * Index specifications are defined {@link http://docs.mongodb.org/manual/reference/command/createIndexes/| here}. 
+ * + * @param indexSpecs - An array of index specifications to be created + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @example + * ```js + * const collection = client.db('foo').collection('bar'); + * await collection.createIndexes([ + * // Simple index on field fizz + * { + * key: { fizz: 1 }, + * } + * // wildcard index + * { + * key: { '$**': 1 } + * }, + * // named index on darmok and jalad + * { + * key: { darmok: 1, jalad: -1 } + * name: 'tanagra' + * } + * ]); + * ``` + */ + createIndexes(indexSpecs: IndexDescription[]): Promise; + createIndexes(indexSpecs: IndexDescription[], callback: Callback): void; + createIndexes(indexSpecs: IndexDescription[], options: CreateIndexesOptions): Promise; + createIndexes(indexSpecs: IndexDescription[], options: CreateIndexesOptions, callback: Callback): void; + /** + * Drops an index from this collection. + * + * @param indexName - Name of the index to drop. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropIndex(indexName: string): Promise; + dropIndex(indexName: string, callback: Callback): void; + dropIndex(indexName: string, options: DropIndexesOptions): Promise; + dropIndex(indexName: string, options: DropIndexesOptions, callback: Callback): void; + /** + * Drops all indexes from this collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropIndexes(): Promise; + dropIndexes(callback: Callback): void; + dropIndexes(options: DropIndexesOptions): Promise; + dropIndexes(options: DropIndexesOptions, callback: Callback): void; + /** + * Get the list of all indexes information for the collection. + * + * @param options - Optional settings for the command + */ + listIndexes(options?: ListIndexesOptions): ListIndexesCursor; + /** + * Checks if one or more indexes exist on the collection, fails on first non-existing index + * + * @param indexes - One or more index names to check. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexExists(indexes: string | string[]): Promise; + indexExists(indexes: string | string[], callback: Callback): void; + indexExists(indexes: string | string[], options: IndexInformationOptions): Promise; + indexExists(indexes: string | string[], options: IndexInformationOptions, callback: Callback): void; + /** + * Retrieves this collections index info. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexInformation(): Promise; + indexInformation(callback: Callback): void; + indexInformation(options: IndexInformationOptions): Promise; + indexInformation(options: IndexInformationOptions, callback: Callback): void; + /** + * Gets an estimate of the count of documents in a collection using collection metadata. 
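+ *
+ * A brief, illustrative contrast with {@link Collection#countDocuments| countDocuments}
+ * (an added sketch; the `tasks` collection name and a connected `client` are assumptions):
+ * ```js
+ * const tasks = client.db('foo').collection('tasks');
+ * // Fast, metadata-based estimate; any filter is ignored
+ * const approxTotal = await tasks.estimatedDocumentCount();
+ * // Exact count for a filter; executed as an aggregation on the server
+ * const openCount = await tasks.countDocuments({ done: false });
+ * ```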
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + estimatedDocumentCount(): Promise; + estimatedDocumentCount(callback: Callback): void; + estimatedDocumentCount(options: EstimatedDocumentCountOptions): Promise; + estimatedDocumentCount(options: EstimatedDocumentCountOptions, callback: Callback): void; + /** + * Gets the number of documents matching the filter. + * For a fast count of the total documents in a collection see {@link Collection#estimatedDocumentCount| estimatedDocumentCount}. + * **Note**: When migrating from {@link Collection#count| count} to {@link Collection#countDocuments| countDocuments} + * the following query operators must be replaced: + * + * | Operator | Replacement | + * | -------- | ----------- | + * | `$where` | [`$expr`][1] | + * | `$near` | [`$geoWithin`][2] with [`$center`][3] | + * | `$nearSphere` | [`$geoWithin`][2] with [`$centerSphere`][4] | + * + * [1]: https://docs.mongodb.com/manual/reference/operator/query/expr/ + * [2]: https://docs.mongodb.com/manual/reference/operator/query/geoWithin/ + * [3]: https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center + * [4]: https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere + * + * @param filter - The filter for the count + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @see https://docs.mongodb.com/manual/reference/operator/query/expr/ + * @see https://docs.mongodb.com/manual/reference/operator/query/geoWithin/ + * @see https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center + * @see https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere + */ + countDocuments(): Promise; + countDocuments(callback: Callback): void; + countDocuments(filter: Filter): Promise; + countDocuments(callback: Callback): void; + countDocuments(filter: Filter, options: CountDocumentsOptions): Promise; + countDocuments(filter: Filter, options: CountDocumentsOptions, callback: Callback): void; + countDocuments(filter: Filter, callback: Callback): void; + /** + * The distinct command returns a list of distinct values for the given key across a collection. + * + * @param key - Field of the document to find distinct values for + * @param filter - The filter for filtering the set of documents to which we apply the distinct filter. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + distinct>(key: Key): Promise[Key]>>>; + distinct>(key: Key, callback: Callback[Key]>>>): void; + distinct>(key: Key, filter: Filter): Promise[Key]>>>; + distinct>(key: Key, filter: Filter, callback: Callback[Key]>>>): void; + distinct>(key: Key, filter: Filter, options: DistinctOptions): Promise[Key]>>>; + distinct>(key: Key, filter: Filter, options: DistinctOptions, callback: Callback[Key]>>>): void; + distinct(key: string): Promise; + distinct(key: string, callback: Callback): void; + distinct(key: string, filter: Filter): Promise; + distinct(key: string, filter: Filter, callback: Callback): void; + distinct(key: string, filter: Filter, options: DistinctOptions): Promise; + distinct(key: string, filter: Filter, options: DistinctOptions, callback: Callback): void; + /** + * Retrieve all the indexes on the collection. 
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexes(): Promise; + indexes(callback: Callback): void; + indexes(options: IndexInformationOptions): Promise; + indexes(options: IndexInformationOptions, callback: Callback): void; + /** + * Get all the collection statistics. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + stats(): Promise; + stats(callback: Callback): void; + stats(options: CollStatsOptions): Promise; + stats(options: CollStatsOptions, callback: Callback): void; + /** + * Find a document and delete it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndDelete(filter: Filter): Promise>; + findOneAndDelete(filter: Filter, options: FindOneAndDeleteOptions): Promise>; + findOneAndDelete(filter: Filter, callback: Callback>): void; + findOneAndDelete(filter: Filter, options: FindOneAndDeleteOptions, callback: Callback>): void; + /** + * Find a document and replace it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to replace + * @param replacement - The Document that replaces the matching document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndReplace(filter: Filter, replacement: Document): Promise>; + findOneAndReplace(filter: Filter, replacement: Document, callback: Callback>): void; + findOneAndReplace(filter: Filter, replacement: Document, options: FindOneAndReplaceOptions): Promise>; + findOneAndReplace(filter: Filter, replacement: Document, options: FindOneAndReplaceOptions, callback: Callback>): void; + /** + * Find a document and update it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to update + * @param update - Update operations to be performed on the document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndUpdate(filter: Filter, update: UpdateFilter): Promise>; + findOneAndUpdate(filter: Filter, update: UpdateFilter, callback: Callback>): void; + findOneAndUpdate(filter: Filter, update: UpdateFilter, options: FindOneAndUpdateOptions): Promise>; + findOneAndUpdate(filter: Filter, update: UpdateFilter, options: FindOneAndUpdateOptions, callback: Callback>): void; + /** + * Execute an aggregation framework pipeline against the collection, needs MongoDB \>= 2.2 + * + * @param pipeline - An array of aggregation pipelines to execute + * @param options - Optional settings for the command + */ + aggregate(pipeline?: Document[], options?: AggregateOptions): AggregationCursor; + /** + * Create a new Change Stream, watching for new changes (insertions, updates, replacements, deletions, and invalidations) in this collection. 
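+ *
+ * A minimal, illustrative sketch (an added example; the `tasks` collection and the
+ * `$match` stage are assumptions, not part of the original driver docs):
+ * ```js
+ * const changeStream = tasks.watch(
+ *   [{ $match: { operationType: 'update' } }],
+ *   { fullDocument: 'updateLookup' }
+ * );
+ * changeStream.on('change', change => {
+ *   // With 'updateLookup', update events also carry the full post-update document
+ *   console.log(change.operationType, change.fullDocument);
+ * });
+ * ```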
+ * + * @since 3.0.0 + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline?: Document[], options?: ChangeStreamOptions): ChangeStream; + /** + * Run Map Reduce across a collection. Be aware that the inline option for out will return an array of results not a collection. + * + * @param map - The mapping function. + * @param reduce - The reduce function. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction): Promise; + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction, callback: Callback): void; + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction, options: MapReduceOptions): Promise; + mapReduce(map: string | MapFunction, reduce: string | ReduceFunction, options: MapReduceOptions, callback: Callback): void; + /** Initiate an Out of order batch write operation. All operations will be buffered into insert/update/remove commands executed out of order. */ + initializeUnorderedBulkOp(options?: BulkWriteOptions): UnorderedBulkOperation; + /** Initiate an In order bulk write operation. Operations will be serially executed in the order they are added, creating a new operation for each switch in types. */ + initializeOrderedBulkOp(options?: BulkWriteOptions): OrderedBulkOperation; + /** Get the db scoped logger */ + getLogger(): Logger; + readonly logger: Logger; + /** + * Inserts a single document or a an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @deprecated Use insertOne, insertMany or bulkWrite instead. + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insert(docs: OptionalId[], options: BulkWriteOptions, callback: Callback>): Promise> | void; + /** + * Updates documents. + * + * @deprecated use updateOne, updateMany or bulkWrite + * @param selector - The selector for the update operation. + * @param update - The update operations to be applied to the documents + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + update(selector: Filter, update: UpdateFilter, options: UpdateOptions, callback: Callback): Promise | void; + /** + * Remove documents. + * + * @deprecated use deleteOne, deleteMany or bulkWrite + * @param selector - The selector for the update operation. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + remove(selector: Filter, options: DeleteOptions, callback: Callback): Promise | void; + /** + * An estimated count of matching documents in the db to a filter. + * + * **NOTE:** This method has been deprecated, since it does not provide an accurate count of the documents + * in a collection. 
To obtain an accurate count of documents in the collection, use {@link Collection#countDocuments| countDocuments}. + * To obtain an estimated count of all documents in the collection, use {@link Collection#estimatedDocumentCount| estimatedDocumentCount}. + * + * @deprecated use {@link Collection#countDocuments| countDocuments} or {@link Collection#estimatedDocumentCount| estimatedDocumentCount} instead + * + * @param filter - The filter for the count. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + count(): Promise; + count(callback: Callback): void; + count(filter: Filter): Promise; + count(filter: Filter, callback: Callback): void; + count(filter: Filter, options: CountOptions): Promise; + count(filter: Filter, options: CountOptions, callback: Callback): Promise | void; +} +/** @public */ +export declare interface CollectionInfo extends Document { + name: string; + type?: string; + options?: Document; + info?: { + readOnly?: false; + uuid?: Binary; + }; + idIndex?: Document; +} +/** @public */ +export declare interface CollectionOptions extends BSONSerializeOptions, WriteConcernOptions, LoggerOptions { + slaveOk?: boolean; + /** Specify a read concern for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST). */ + readPreference?: ReadPreferenceLike; +} +/* Excluded from this release type: CollectionPrivate */ +/** + * @public + * @see https://docs.mongodb.org/manual/reference/command/collStats/ + */ +export declare interface CollStats extends Document { + /** Namespace */ + ns: string; + /** Number of documents */ + count: number; + /** Collection size in bytes */ + size: number; + /** Average object size in bytes */ + avgObjSize: number; + /** (Pre)allocated space for the collection in bytes */ + storageSize: number; + /** Number of extents (contiguously allocated chunks of datafile space) */ + numExtents: number; + /** Number of indexes */ + nindexes: number; + /** Size of the most recently created extent in bytes */ + lastExtentSize: number; + /** Padding can speed up updates if documents grow */ + paddingFactor: number; + /** A number that indicates the user-set flags on the collection. userFlags only appears when using the mmapv1 storage engine */ + userFlags?: number; + /** Total index size in bytes */ + totalIndexSize: number; + /** Size of specific indexes in bytes */ + indexSizes: { + _id_: number; + [index: string]: number; + }; + /** `true` if the collection is capped */ + capped: boolean; + /** The maximum number of documents that may be present in a capped collection */ + max: number; + /** The maximum size of a capped collection */ + maxSize: number; + /** This document contains data reported directly by the WiredTiger engine and other data for internal diagnostic use */ + wiredTiger?: WiredTigerData; + /** The fields in this document are the names of the indexes, while the values themselves are documents that contain statistics for the index provided by the storage engine */ + indexDetails?: any; + ok: number; + /** The amount of storage available for reuse. The scale argument affects this value. 
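/**
 * Sketch of reading `CollStats` via `Collection#stats` (illustrative; `collection`
 * is an assumed handle). The `scale` option divides all reported sizes.
 * @example
 * ```typescript
 * const stats = await collection.stats({ scale: 1024 }); // sizes reported in KiB
 * console.log(stats.count, stats.size, stats.totalIndexSize);
 * ```
 */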
*/ + freeStorageSize?: number; + /** An array that contains the names of the indexes that are currently being built on the collection */ + indexBuilds?: number; + /** The sum of the storageSize and totalIndexSize. The scale argument affects this value */ + totalSize: number; + /** The scale value used by the command. */ + scaleFactor: number; +} +/** @public */ +export declare interface CollStatsOptions extends CommandOperationOptions { + /** Divide the returned sizes by scale value. */ + scale?: number; +} +/** + * An event indicating the failure of a given command + * @public + * @category Event + */ +export declare class CommandFailedEvent { + address: string; + connectionId?: string | number; + requestId: number; + duration: number; + commandName: string; + failure: Error; + serviceId?: ObjectId; + /*Excluded from this release type: __constructor */ + readonly hasServiceId: boolean; +} +/* Excluded from this release type: CommandOperation */ +/** @public */ +export declare interface CommandOperationOptions extends OperationOptions, WriteConcernOptions, ExplainOptions { + /** @deprecated This option does nothing */ + fullResponse?: boolean; + /** Specify a read concern and level for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** Collation */ + collation?: CollationOptions; + maxTimeMS?: number; + /** A user-provided comment to attach to this command */ + comment?: string | Document; + /** Should retry failed writes */ + retryWrites?: boolean; + dbName?: string; + authdb?: string; + noResponse?: boolean; +} +/* Excluded from this release type: CommandOptions */ +/** + * An event indicating the start of a given + * @public + * @category Event + */ +export declare class CommandStartedEvent { + commandObj?: Document; + requestId: number; + databaseName: string; + commandName: string; + command: Document; + address: string; + connectionId?: string | number; + serviceId?: ObjectId; + /*Excluded from this release type: __constructor */ + readonly hasServiceId: boolean; +} +/** + * An event indicating the success of a given command + * @public + * @category Event + */ +export declare class CommandSucceededEvent { + address: string; + connectionId?: string | number; + requestId: number; + duration: number; + commandName: string; + reply: unknown; + serviceId?: ObjectId; + /*Excluded from this release type: __constructor */ + readonly hasServiceId: boolean; +} +/** @public */ +export declare type CommonEvents = 'newListener' | 'removeListener'; +/** @public */ +export declare const Compressor: Readonly<{ + readonly none: 0; + readonly snappy: 1; + readonly zlib: 2; +}>; +/** @public */ +export declare type Compressor = typeof Compressor[CompressorName]; +/** @public */ +export declare type CompressorName = keyof typeof Compressor; +/** @public */ +export declare type Condition = AlternativeType | FilterOperators>; +/* Excluded from this release type: Connection */ +/** + * An event published when a connection is checked into the connection pool + * @public + * @category Event + */ +export declare class ConnectionCheckedInEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; +} +/** + * An event published when a connection is checked out of the connection pool + * @public + * @category Event + */ +export declare class ConnectionCheckedOutEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; +} +/** + * An event published when a request to 
check a connection out fails + * @public + * @category Event + */ +export declare class ConnectionCheckOutFailedEvent extends ConnectionPoolMonitoringEvent { + /** The reason the attempt to check out failed */ + reason: AnyError | string; +} +/** + * An event published when a request to check a connection out begins + * @public + * @category Event + */ +export declare class ConnectionCheckOutStartedEvent extends ConnectionPoolMonitoringEvent { +} +/** + * An event published when a connection is closed + * @public + * @category Event + */ +export declare class ConnectionClosedEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + /** The reason the connection was closed */ + reason: string; + serviceId?: ObjectId; +} +/** + * An event published when a connection pool creates a new connection + * @public + * @category Event + */ +export declare class ConnectionCreatedEvent extends ConnectionPoolMonitoringEvent { + /** A monotonically increasing, per-pool id for the newly created connection */ + connectionId: number | ''; +} +/** @public */ +export declare type ConnectionEvents = { + commandStarted(event: CommandStartedEvent): void; + commandSucceeded(event: CommandSucceededEvent): void; + commandFailed(event: CommandFailedEvent): void; + clusterTimeReceived(clusterTime: Document): void; + close(): void; + message(message: any): void; + pinned(pinType: string): void; + unpinned(pinType: string): void; +}; +/** @public */ +export declare interface ConnectionOptions extends SupportedNodeConnectionOptions, StreamDescriptionOptions { + id: number | ''; + generation: number; + hostAddress: HostAddress; + autoEncrypter?: AutoEncrypter; + serverApi?: ServerApi; + monitorCommands: boolean; + /* Excluded from this release type: connectionType */ + credentials?: MongoCredentials; + connectTimeoutMS?: number; + tls: boolean; + keepAlive?: boolean; + keepAliveInitialDelay?: number; + noDelay?: boolean; + socketTimeoutMS?: number; + cancellationToken?: CancellationToken; + metadata: ClientMetadata; +} +/* Excluded from this release type: ConnectionPool */ +/** + * An event published when a connection pool is cleared + * @public + * @category Event + */ +export declare class ConnectionPoolClearedEvent extends ConnectionPoolMonitoringEvent { +} +/** + * An event published when a connection pool is closed + * @public + * @category Event + */ +export declare class ConnectionPoolClosedEvent extends ConnectionPoolMonitoringEvent { +} +/** + * An event published when a connection pool is created + * @public + * @category Event + */ +export declare class ConnectionPoolCreatedEvent extends ConnectionPoolMonitoringEvent { + /** The options used to create this connection pool */ + options?: ConnectionPoolOptions; +} +/** @public */ +export declare type ConnectionPoolEvents = { + connectionPoolCreated(event: ConnectionPoolCreatedEvent): void; + connectionPoolClosed(event: ConnectionPoolClosedEvent): void; + connectionPoolCleared(event: ConnectionPoolClearedEvent): void; + connectionCreated(event: ConnectionCreatedEvent): void; + connectionReady(event: ConnectionReadyEvent): void; + connectionClosed(event: ConnectionClosedEvent): void; + connectionCheckOutStarted(event: ConnectionCheckOutStartedEvent): void; + connectionCheckOutFailed(event: ConnectionCheckOutFailedEvent): void; + connectionCheckedOut(event: ConnectionCheckedOutEvent): void; + connectionCheckedIn(event: ConnectionCheckedInEvent): void; +} & Pick>; +/* Excluded from this release type: 
ConnectionPoolMetrics */ +/** + * The base export class for all monitoring events published from the connection pool + * @public + * @category Event + */ +export declare class ConnectionPoolMonitoringEvent { + /** A timestamp when the event was created */ + time: Date; + /** The address (host/port pair) of the pool */ + address: string; +} +/** @public */ +export declare interface ConnectionPoolOptions extends Pick> { + /** The maximum number of connections that may be associated with a pool at a given time. This includes in use and available connections. */ + maxPoolSize: number; + /** The minimum number of connections that MUST exist at any moment in a single connection pool. */ + minPoolSize: number; + /** The maximum amount of time a connection should remain idle in the connection pool before being marked idle. */ + maxIdleTimeMS: number; + /** The maximum amount of time operation execution should wait for a connection to become available. The default is 0 which means there is no limit. */ + waitQueueTimeoutMS: number; + /** If we are in load balancer mode. */ + loadBalanced: boolean; +} +/** + * An event published when a connection is ready for use + * @public + * @category Event + */ +export declare class ConnectionReadyEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; +} +/** @public */ +export declare interface ConnectOptions { + readPreference?: ReadPreference; +} +/** @public */ +export declare interface CountDocumentsOptions extends AggregateOptions { + /** The number of documents to skip. */ + skip?: number; + /** The maximum amounts to count before aborting. */ + limit?: number; +} +/** @public */ +export declare interface CountOptions extends CommandOperationOptions { + /** The number of documents to skip. */ + skip?: number; + /** The maximum amounts to count before aborting. */ + limit?: number; + /** Number of milliseconds to wait before aborting the query. */ + maxTimeMS?: number; + /** An index name hint for the query. */ + hint?: string | Document; +} +/** @public */ +export declare interface CreateCollectionOptions extends CommandOperationOptions { + /** Returns an error if the collection does not exist */ + strict?: boolean; + /** Create a capped collection */ + capped?: boolean; + /** @deprecated Create an index on the _id field of the document, True by default on MongoDB 2.6 - 3.0 */ + autoIndexId?: boolean; + /** The size of the capped collection in bytes */ + size?: number; + /** The maximum number of documents in the capped collection */ + max?: number; + /** Available for the MMAPv1 storage engine only to set the usePowerOf2Sizes and the noPadding flag */ + flags?: number; + /** Allows users to specify configuration to the storage engine on a per-collection basis when creating a collection on MongoDB 3.0 or higher */ + storageEngine?: Document; + /** Allows users to specify validation rules or expressions for the collection. 
For more information, see Document Validation on MongoDB 3.2 or higher */ + validator?: Document; + /** Determines how strictly MongoDB applies the validation rules to existing documents during an update on MongoDB 3.2 or higher */ + validationLevel?: string; + /** Determines whether to error on invalid documents or just warn about the violations but allow invalid documents to be inserted on MongoDB 3.2 or higher */ + validationAction?: string; + /** Allows users to specify a default configuration for indexes when creating a collection on MongoDB 3.2 or higher */ + indexOptionDefaults?: Document; + /** The name of the source collection or view from which to create the view. The name is not the full namespace of the collection or view; i.e. does not include the database name and implies the same database as the view to create on MongoDB 3.4 or higher */ + viewOn?: string; + /** An array that consists of the aggregation pipeline stage. Creates the view by applying the specified pipeline to the viewOn collection or view on MongoDB 3.4 or higher */ + pipeline?: Document[]; + /** A primary key factory function for generation of custom _id keys. */ + pkFactory?: PkFactory; + /** A document specifying configuration options for timeseries collections. */ + timeseries?: TimeSeriesCollectionOptions; + /** The number of seconds after which a document in a timeseries collection expires. */ + expireAfterSeconds?: number; +} +/** @public */ +export declare interface CreateIndexesOptions extends CommandOperationOptions { + /** Creates the index in the background, yielding whenever possible. */ + background?: boolean; + /** Creates an unique index. */ + unique?: boolean; + /** Override the autogenerated index name (useful if the resulting name is larger than 128 bytes) */ + name?: string; + /** Creates a partial index based on the given filter object (MongoDB 3.2 or higher) */ + partialFilterExpression?: Document; + /** Creates a sparse index. */ + sparse?: boolean; + /** Allows you to expire data on indexes applied to a data (MongoDB 2.2 or higher) */ + expireAfterSeconds?: number; + /** Allows users to configure the storage engine on a per-index basis when creating an index. (MongoDB 3.0 or higher) */ + storageEngine?: Document; + /** (MongoDB 4.4. or higher) Specifies how many data-bearing members of a replica set, including the primary, must complete the index builds successfully before the primary marks the indexes as ready. This option accepts the same values for the "w" field in a write concern plus "votingMembers", which indicates all voting data-bearing nodes. */ + commitQuorum?: number | string; + /** Specifies the index version number, either 0 or 1. */ + version?: number; + weights?: Document; + default_language?: string; + language_override?: string; + textIndexVersion?: number; + '2dsphereIndexVersion'?: number; + bits?: number; + /** For geospatial indexes set the lower bound for the co-ordinates. */ + min?: number; + /** For geospatial indexes set the high bound for the co-ordinates. */ + max?: number; + bucketSize?: number; + wildcardProjection?: Document; + /** Specifies that the index should exist on the target collection but should not be used by the query planner when executing operations. 
(MongoDB 4.4 or higher) */ + hidden?: boolean; +} +/** @public */ +export declare const CURSOR_FLAGS: readonly [ + "tailable", + "oplogReplay", + "noCursorTimeout", + "awaitData", + "exhaust", + "partial" +]; +/** @public + * @deprecated This interface is deprecated */ +export declare interface CursorCloseOptions { + /** Bypass calling killCursors when closing the cursor. */ + /** @deprecated the skipKillCursors option is deprecated */ + skipKillCursors?: boolean; +} +/** @public */ +export declare type CursorFlag = typeof CURSOR_FLAGS[number]; +/** @public */ +export declare interface CursorStreamOptions { + /** A transformation method applied to each document emitted by the stream */ + transform?(doc: Document): Document; +} +/** + * The **Db** class is a class that represents a MongoDB Database. + * @public + * + * @example + * ```js + * const { MongoClient } = require('mongodb'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Select the database by name + * const testDb = client.db(dbName); + * client.close(); + * }); + * ``` + */ +export declare class Db { + /* Excluded from this release type: s */ + static SYSTEM_NAMESPACE_COLLECTION: string; + static SYSTEM_INDEX_COLLECTION: string; + static SYSTEM_PROFILE_COLLECTION: string; + static SYSTEM_USER_COLLECTION: string; + static SYSTEM_COMMAND_COLLECTION: string; + static SYSTEM_JS_COLLECTION: string; + /** + * Creates a new Db instance + * + * @param client - The MongoClient for the database. + * @param databaseName - The name of the database this instance represents. + * @param options - Optional settings for Db construction + */ + constructor(client: MongoClient, databaseName: string, options?: DbOptions); + readonly databaseName: string; + readonly options: DbOptions | undefined; + readonly slaveOk: boolean; + readonly readConcern: ReadConcern | undefined; + /* + * The current readPreference of the Db. If not explicitly defined for + * this Db, will be inherited from the parent MongoClient + */ + readonly readPreference: ReadPreference; + readonly bsonOptions: BSONSerializeOptions; + readonly writeConcern: WriteConcern | undefined; + readonly namespace: string; + /** + * Create a new collection on a server with the specified options. Use this to create capped collections. + * More information about command options available at https://docs.mongodb.com/manual/reference/command/create/ + * + * @param name - The name of the collection to create + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + createCollection(name: string): Promise>; + createCollection(name: string, callback: Callback>): void; + createCollection(name: string, options: CreateCollectionOptions): Promise>; + createCollection(name: string, options: CreateCollectionOptions, callback: Callback>): void; + /** + * Execute a command + * + * @remarks + * This command does not inherit options from the MongoClient. 
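/**
 * Sketch of `Db#createCollection` with a JSON Schema validator (illustrative;
 * the collection name and schema shown are assumptions).
 * @example
 * ```typescript
 * await db.createCollection('tasks', {
 *   validator: {
 *     $jsonSchema: { bsonType: 'object', required: ['title'] }
 *   },
 *   validationAction: 'error'   // reject documents that fail validation
 * });
 * ```
 */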
+ * + * @param command - The command to run + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + command(command: Document): Promise; + command(command: Document, callback: Callback): void; + command(command: Document, options: RunCommandOptions): Promise; + command(command: Document, options: RunCommandOptions, callback: Callback): void; + /** + * Execute an aggregation framework pipeline against the database, needs MongoDB \>= 3.6 + * + * @param pipeline - An array of aggregation stages to be executed + * @param options - Optional settings for the command + */ + aggregate(pipeline?: Document[], options?: AggregateOptions): AggregationCursor; + /** Return the Admin db instance */ + admin(): Admin; + /** + * Returns a reference to a MongoDB Collection. If it does not exist it will be created implicitly. + * + * @param name - the collection name we wish to access. + * @returns return the new Collection instance + */ + collection(name: string, options?: CollectionOptions): Collection; + /** + * Get all the db statistics. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + stats(): Promise; + stats(callback: Callback): void; + stats(options: DbStatsOptions): Promise; + stats(options: DbStatsOptions, callback: Callback): void; + /** + * List all collections of this database with optional filter + * + * @param filter - Query to filter collections by + * @param options - Optional settings for the command + */ + listCollections(filter: Document, options: Exclude & { + nameOnly: true; + }): ListCollectionsCursor>; + listCollections(filter: Document, options: Exclude & { + nameOnly: false; + }): ListCollectionsCursor; + listCollections | CollectionInfo = Pick | CollectionInfo>(filter?: Document, options?: ListCollectionsOptions): ListCollectionsCursor; + /** + * Rename a collection. + * + * @remarks + * This operation does not inherit options from the MongoClient. + * + * @param fromCollection - Name of current collection to rename + * @param toCollection - New name of of the collection + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + renameCollection(fromCollection: string, toCollection: string): Promise>; + renameCollection(fromCollection: string, toCollection: string, callback: Callback>): void; + renameCollection(fromCollection: string, toCollection: string, options: RenameOptions): Promise>; + renameCollection(fromCollection: string, toCollection: string, options: RenameOptions, callback: Callback>): void; + /** + * Drop a collection from the database, removing it permanently. New accesses will create a new collection. + * + * @param name - Name of collection to drop + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropCollection(name: string): Promise; + dropCollection(name: string, callback: Callback): void; + dropCollection(name: string, options: DropCollectionOptions): Promise; + dropCollection(name: string, options: DropCollectionOptions, callback: Callback): void; + /** + * Drop a database, removing it permanently from the server. 
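/**
 * Sketch of `listCollections` with `nameOnly` and of `renameCollection`
 * (illustrative; collection names are assumptions).
 * @example
 * ```typescript
 * const names = await db.listCollections({}, { nameOnly: true }).toArray();
 * // each entry is limited to { name, type? } when nameOnly is true
 * await db.renameCollection('tasks_old', 'tasks');
 * ```
 */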
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropDatabase(): Promise; + dropDatabase(callback: Callback): void; + dropDatabase(options: DropDatabaseOptions): Promise; + dropDatabase(options: DropDatabaseOptions, callback: Callback): void; + /** + * Fetch all collections for the current db. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + collections(): Promise; + collections(callback: Callback): void; + collections(options: ListCollectionsOptions): Promise; + collections(options: ListCollectionsOptions, callback: Callback): void; + /** + * Creates an index on the db and collection. + * + * @param name - Name of the collection to create the index on. + * @param indexSpec - Specify the field to index, or an index specification + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + createIndex(name: string, indexSpec: IndexSpecification): Promise; + createIndex(name: string, indexSpec: IndexSpecification, callback?: Callback): void; + createIndex(name: string, indexSpec: IndexSpecification, options: CreateIndexesOptions): Promise; + createIndex(name: string, indexSpec: IndexSpecification, options: CreateIndexesOptions, callback: Callback): void; + /** + * Add a user to the database + * + * @param username - The username for the new user + * @param password - An optional password for the new user + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + addUser(username: string): Promise; + addUser(username: string, callback: Callback): void; + addUser(username: string, password: string): Promise; + addUser(username: string, password: string, callback: Callback): void; + addUser(username: string, options: AddUserOptions): Promise; + addUser(username: string, options: AddUserOptions, callback: Callback): void; + addUser(username: string, password: string, options: AddUserOptions): Promise; + addUser(username: string, password: string, options: AddUserOptions, callback: Callback): void; + /** + * Remove a user from a database + * + * @param username - The username to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + removeUser(username: string): Promise; + removeUser(username: string, callback: Callback): void; + removeUser(username: string, options: RemoveUserOptions): Promise; + removeUser(username: string, options: RemoveUserOptions, callback: Callback): void; + /** + * Set the current profiling level of MongoDB + * + * @param level - The new profiling level (off, slow_only, all). 
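/**
 * Sketch of `Db#createIndex` (illustrative; the collection, key names and index
 * name are assumptions).
 * @example
 * ```typescript
 * const indexName = await db.createIndex(
 *   'tasks',
 *   { userId: 1, createdAt: -1 },              // compound index specification
 *   { name: 'user_created', background: true } // CreateIndexesOptions
 * );
 * ```
 */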
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + setProfilingLevel(level: ProfilingLevel): Promise; + setProfilingLevel(level: ProfilingLevel, callback: Callback): void; + setProfilingLevel(level: ProfilingLevel, options: SetProfilingLevelOptions): Promise; + setProfilingLevel(level: ProfilingLevel, options: SetProfilingLevelOptions, callback: Callback): void; + /** + * Retrieve the current profiling Level for MongoDB + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + profilingLevel(): Promise; + profilingLevel(callback: Callback): void; + profilingLevel(options: ProfilingLevelOptions): Promise; + profilingLevel(options: ProfilingLevelOptions, callback: Callback): void; + /** + * Retrieves this collections index info. + * + * @param name - The name of the collection. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexInformation(name: string): Promise; + indexInformation(name: string, callback: Callback): void; + indexInformation(name: string, options: IndexInformationOptions): Promise; + indexInformation(name: string, options: IndexInformationOptions, callback: Callback): void; + /** + * Unref all sockets + * @deprecated This function is deprecated and will be removed in the next major version. + */ + unref(): void; + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this database. Will ignore all + * changes to system collections. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline?: Document[], options?: ChangeStreamOptions): ChangeStream; + /** Return the db logger */ + getLogger(): Logger; + readonly logger: Logger; +} +/* Excluded from this release type: DB_AGGREGATE_COLLECTION */ +/** @public */ +export declare interface DbOptions extends BSONSerializeOptions, WriteConcernOptions, LoggerOptions { + /** If the database authentication is dependent on another databaseName. */ + authSource?: string; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; + /** The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST). */ + readPreference?: ReadPreferenceLike; + /** A primary key factory object for generation of custom _id keys. */ + pkFactory?: PkFactory; + /** Specify a read concern for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcern; + /** Should retry failed writes */ + retryWrites?: boolean; +} +/* Excluded from this release type: DbPrivate */ +export { DBRef }; +/** @public */ +export declare interface DbStatsOptions extends CommandOperationOptions { + /** Divide the returned sizes by scale value. */ + scale?: number; +} +export { Decimal128 }; +/** @public */ +export declare interface DeleteManyModel { + /** The filter to limit the deleted documents. 
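/**
 * Sketch of the delete write models (`DeleteOneModel` / `DeleteManyModel`) as
 * passed to `Collection#bulkWrite` (illustrative; `collection`, `taskId` and the
 * index hint are assumptions).
 * @example
 * ```typescript
 * await collection.bulkWrite([
 *   { deleteOne:  { filter: { _id: taskId } } },
 *   { deleteMany: { filter: { complete: true }, hint: { complete: 1 } } }
 * ]);
 * ```
 */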
*/ + filter: Filter; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; +} +/** @public */ +export declare interface DeleteOneModel { + /** The filter to limit the deleted documents. */ + filter: Filter; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; +} +/** @public */ +export declare interface DeleteOptions extends CommandOperationOptions, WriteConcernOptions { + /** If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails. */ + ordered?: boolean; + /** A user-provided comment to attach to this command */ + comment?: string | Document; + /** Specifies the collation to use for the operation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; + /** @deprecated use `removeOne` or `removeMany` to implicitly specify the limit */ + single?: boolean; +} +/** @public */ +export declare interface DeleteResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined. */ + acknowledged: boolean; + /** The number of documents that were deleted */ + deletedCount: number; +} +/** @public */ +export declare interface DeleteStatement { + /** The query that matches documents to delete. */ + q: Document; + /** The number of matching documents to delete. */ + limit: number; + /** Specifies the collation to use for the operation. */ + collation?: CollationOptions; + /** A document or string that specifies the index to use to support the query predicate. */ + hint?: Hint; + /** A user-provided comment to attach to this command */ + comment?: string | Document; +} +/* Excluded from this release type: deserialize */ +/** @public */ +export declare interface DestroyOptions { + /** Force the destruction. */ + force?: boolean; +} +/** @public */ +export declare type DistinctOptions = CommandOperationOptions; +export { Document }; +export { Double }; +/** @public */ +export declare interface DriverInfo { + name?: string; + version?: string; + platform?: string; +} +/** @public */ +export declare type DropCollectionOptions = CommandOperationOptions; +/** @public */ +export declare type DropDatabaseOptions = CommandOperationOptions; +/** @public */ +export declare type DropIndexesOptions = CommandOperationOptions; +/* Excluded from this release type: Encrypter */ +/* Excluded from this release type: EncrypterOptions */ +/** @public */ +export declare interface EndSessionOptions { + /* Excluded from this release type: error */ + force?: boolean; + forceClear?: boolean; +} +/** TypeScript Omit (Exclude to be specific) does not work for objects with an "any" indexed type, and breaks discriminated unions @public */ +export declare type EnhancedOmit = string extends keyof TRecordOrUnion ? TRecordOrUnion : TRecordOrUnion extends any ? 
Pick> : never; +/** @public */ +export declare interface ErrorDescription extends Document { + message?: string; + errmsg?: string; + $err?: string; + errorLabels?: string[]; + errInfo?: Document; +} +/** @public */ +export declare interface EstimatedDocumentCountOptions extends CommandOperationOptions { + /** + * The maximum amount of time to allow the operation to run. + * + * This option is sent only if the caller explicitly provides a value. The default is to not send a value. + */ + maxTimeMS?: number; +} +/** @public */ +export declare interface EvalOptions extends CommandOperationOptions { + nolock?: boolean; +} +/** @public */ +export declare type EventEmitterWithState = {}; +/** + * Event description type + * @public + */ +export declare type EventsDescription = Record; +/* Excluded from this release type: ExecutionResult */ +/* Excluded from this release type: Explain */ +/** @public */ +export declare interface ExplainOptions { + /** Specifies the verbosity mode for the explain output. */ + explain?: ExplainVerbosityLike; +} +/** @public */ +export declare const ExplainVerbosity: Readonly<{ + readonly queryPlanner: "queryPlanner"; + readonly queryPlannerExtended: "queryPlannerExtended"; + readonly executionStats: "executionStats"; + readonly allPlansExecution: "allPlansExecution"; +}>; +/** @public */ +export declare type ExplainVerbosity = string; +/** + * For backwards compatibility, true is interpreted as "allPlansExecution" + * and false as "queryPlanner". Prior to server version 3.6, aggregate() + * ignores the verbosity parameter and executes in "queryPlanner". + * @public + */ +export declare type ExplainVerbosityLike = ExplainVerbosity | boolean; +/** A MongoDB filter can be some portion of the schema or a set of operators @public */ +export declare type Filter = { + [P in keyof TSchema]?: Condition; +} & RootFilterOperators; +/** @public */ +export declare type FilterOperations = T extends Record ? { + [key in keyof T]?: FilterOperators; +} : FilterOperators; +/** @public */ +export declare interface FilterOperators extends Document { + $eq?: TValue; + $gt?: TValue; + $gte?: TValue; + $in?: ReadonlyArray; + $lt?: TValue; + $lte?: TValue; + $ne?: TValue; + $nin?: ReadonlyArray; + $not?: TValue extends string ? FilterOperators | RegExp : FilterOperators; + /** + * When `true`, `$exists` matches the documents that contain the field, + * including documents where the field value is null. + */ + $exists?: boolean; + $type?: BSONType | BSONTypeAlias; + $expr?: Record; + $jsonSchema?: Record; + $mod?: TValue extends number ? [ + number, + number + ] : never; + $regex?: TValue extends string ? RegExp | BSONRegExp | string : never; + $options?: TValue extends string ? string : never; + $geoIntersects?: { + $geometry: Document; + }; + $geoWithin?: Document; + $near?: Document; + $nearSphere?: Document; + $maxDistance?: number; + $all?: ReadonlyArray; + $elemMatch?: Document; + $size?: TValue extends ReadonlyArray ? 
number : never; + $bitsAllClear?: BitwiseFilter; + $bitsAllSet?: BitwiseFilter; + $bitsAnyClear?: BitwiseFilter; + $bitsAnySet?: BitwiseFilter; + $rand?: Record; +} +/** @public */ +export declare type FinalizeFunction = (key: TKey, reducedValue: TValue) => TValue; +/** @public */ +export declare class FindCursor extends AbstractCursor { + /* Excluded from this release type: [kFilter] */ + /* Excluded from this release type: [kNumReturned] */ + /* Excluded from this release type: [kBuiltOptions] */ + /* Excluded from this release type: __constructor */ + clone(): FindCursor; + map(transform: (doc: TSchema) => T): FindCursor; + /* Excluded from this release type: _initialize */ + /* Excluded from this release type: _getMore */ + /** Get the count of documents for this cursor */ + count(): Promise; + count(callback: Callback): void; + count(options: CountOptions): Promise; + count(options: CountOptions, callback: Callback): void; + /** Execute the explain for the cursor */ + explain(): Promise; + explain(callback: Callback): void; + explain(verbosity?: ExplainVerbosityLike): Promise; + /** Set the cursor query */ + filter(filter: Document): this; + /** + * Set the cursor hint + * + * @param hint - If specified, then the query system will only consider plans using the hinted index. + */ + hint(hint: Hint): this; + /** + * Set the cursor min + * + * @param min - Specify a $min value to specify the inclusive lower bound for a specific index in order to constrain the results of find(). The $min specifies the lower bound for all keys of a specific index in order. + */ + min(min: Document): this; + /** + * Set the cursor max + * + * @param max - Specify a $max value to specify the exclusive upper bound for a specific index in order to constrain the results of find(). The $max specifies the upper bound for all keys of a specific index in order. + */ + max(max: Document): this; + /** + * Set the cursor returnKey. + * If set to true, modifies the cursor to only return the index field or fields for the results of the query, rather than documents. + * If set to true and the query does not use an index to perform the read operation, the returned documents will not contain any fields. + * + * @param value - the returnKey value. + */ + returnKey(value: boolean): this; + /** + * Modifies the output of a query by adding a field $recordId to matching documents. $recordId is the internal key which uniquely identifies a document in a collection. + * + * @param value - The $showDiskLoc option has now been deprecated and replaced with the showRecordId field. $showDiskLoc will still be accepted for OP_QUERY stye find. + */ + showRecordId(value: boolean): this; + /** + * Add a query modifier to the cursor query + * + * @param name - The query modifier (must start with $, such as $orderby etc) + * @param value - The modifier value. + */ + addQueryModifier(name: string, value: string | boolean | number | Document): this; + /** + * Add a comment to the cursor query allowing for tracking the comment in the log. + * + * @param value - The comment attached to this query. + */ + comment(value: string): this; + /** + * Set a maxAwaitTimeMS on a tailing cursor query to allow to customize the timeout value for the option awaitData (Only supported on MongoDB 3.2 or higher, ignored otherwise) + * + * @param value - Number of milliseconds to wait before aborting the tailed query. 
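/**
 * Sketch combining `FilterOperators` with `FindCursor` chaining (illustrative;
 * field names are assumptions).
 * @example
 * ```typescript
 * const docs = await collection
 *   .find({ age: { $gte: 18 }, status: { $in: ['active', 'new'] } })
 *   .sort({ createdAt: -1 })
 *   .limit(20)
 *   .maxTimeMS(5000)
 *   .toArray();
 * ```
 */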
+ */ + maxAwaitTimeMS(value: number): this; + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value: number): this; + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic + * {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: FindCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: FindCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor<{ a: number; b: string }> = coll.find(); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.find().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project(value: Document): FindCursor; + /** + * Sets the sort order of the cursor query. + * + * @param sort - The key or keys set for the sort. + * @param direction - The direction of the sorting (1 or -1). + */ + sort(sort: Sort | string, direction?: SortDirection): this; + /** + * Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) + * + * @remarks + * {@link https://docs.mongodb.com/manual/reference/command/find/#find-cmd-allowdiskuse | find command allowDiskUse documentation} + */ + allowDiskUse(): this; + /** + * Set the collation options for the cursor. + * + * @param value - The cursor collation options (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). + */ + collation(value: CollationOptions): this; + /** + * Set the limit for the cursor. + * + * @param value - The limit for the cursor query. + */ + limit(value: number): this; + /** + * Set the skip for the cursor. + * + * @param value - The skip for the cursor query. + */ + skip(value: number): this; +} +/** @public */ +export declare interface FindOneAndDeleteOptions extends CommandOperationOptions { + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). 
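/**
 * Sketch of `findOneAndDelete` using the `sort` and `projection` options
 * (illustrative; field names are assumptions).
 * @example
 * ```typescript
 * // Remove the oldest finished document and return a trimmed copy of it
 * const { value: removed } = await collection.findOneAndDelete(
 *   { status: 'done' },
 *   { sort: { finishedAt: 1 }, projection: { _id: 1, title: 1 } }
 * );
 * ```
 */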
*/ + let?: Document; +} +/** @public */ +export declare interface FindOneAndReplaceOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** When set to 'after', returns the updated document rather than the original. The default is 'before'. */ + returnDocument?: ReturnDocument; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Upsert the document if it does not exist. */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} +/** @public */ +export declare interface FindOneAndUpdateOptions extends CommandOperationOptions { + /** Optional list of array filters referenced in filtered positional operators */ + arrayFilters?: Document[]; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** When set to 'after', returns the updated document rather than the original. The default is 'before'. */ + returnDocument?: ReturnDocument; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Upsert the document if it does not exist. */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} +/** + * A builder object that is returned from {@link BulkOperationBase#find}. + * Is used to build a write operation that involves a query filter. + * + * @public + */ +export declare class FindOperators { + bulkOperation: BulkOperationBase; + /* Excluded from this release type: __constructor */ + /** Add a multiple update operation to the bulk operation */ + update(updateDocument: Document): BulkOperationBase; + /** Add a single update operation to the bulk operation */ + updateOne(updateDocument: Document): BulkOperationBase; + /** Add a replace one operation to the bulk operation */ + replaceOne(replacement: Document): BulkOperationBase; + /** Add a delete one operation to the bulk operation */ + deleteOne(): BulkOperationBase; + /** Add a delete many operation to the bulk operation */ + delete(): BulkOperationBase; + /** Upsert modifier for update bulk operation, noting that this operation is an upsert. */ + upsert(): this; + /** Specifies the collation for the query condition. */ + collation(collation: CollationOptions): this; + /** Specifies arrayFilters for UpdateOne or UpdateMany bulk operations. */ + arrayFilters(arrayFilters: Document[]): this; +} +/** + * @public + * @typeParam TSchema - Unused schema definition, deprecated usage, only specify `FindOptions` with no generic + */ +export declare interface FindOptions extends CommandOperationOptions { + /** Sets the limit of documents returned in the query. 
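/**
 * Sketch of `FindOperators` as returned by a bulk operation's `find()`
 * (illustrative; `collection` and `taskId` are assumptions).
 * @example
 * ```typescript
 * const bulk = collection.initializeOrderedBulkOp();
 * bulk.find({ _id: taskId })               // returns FindOperators
 *     .upsert()                            // flag the next update as an upsert
 *     .updateOne({ $set: { complete: true } });
 * bulk.find({ archived: true }).delete();  // delete all matching documents
 * await bulk.execute();
 * ```
 */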
*/ + limit?: number; + /** Set to sort the documents coming back from the query. Array of indexes, `[['a', 1]]` etc. */ + sort?: Sort; + /** The fields to return in the query. Object of fields to either include or exclude (one of, not both), `{'a':1, 'b': 1}` **or** `{'a': 0, 'b': 0}` */ + projection?: Document; + /** Set to skip N documents ahead in your query (useful for pagination). */ + skip?: number; + /** Tell the query to use specific indexes in the query. Object of indexes to use, `{'_id':1}` */ + hint?: Hint; + /** Specify if the cursor can timeout. */ + timeout?: boolean; + /** Specify if the cursor is tailable. */ + tailable?: boolean; + /** Specify if the cursor is a a tailable-await cursor. Requires `tailable` to be true */ + awaitData?: boolean; + /** Set the batchSize for the getMoreCommand when iterating over the query results. */ + batchSize?: number; + /** If true, returns only the index keys in the resulting documents. */ + returnKey?: boolean; + /** The inclusive lower bound for a specific index */ + min?: Document; + /** The exclusive upper bound for a specific index */ + max?: Document; + /** You can put a $comment field on a query to make looking in the profiler logs simpler. */ + comment?: string | Document; + /** Number of milliseconds to wait before aborting the query. */ + maxTimeMS?: number; + /** The maximum amount of time for the server to wait on new documents to satisfy a tailable cursor query. Requires `tailable` and `awaitData` to be true */ + maxAwaitTimeMS?: number; + /** The server normally times out idle cursors after an inactivity period (10 minutes) to prevent excess memory use. Set this option to prevent that. */ + noCursorTimeout?: boolean; + /** Specify collation (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). */ + collation?: CollationOptions; + /** Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) */ + allowDiskUse?: boolean; + /** Determines whether to close the cursor after the first batch. Defaults to false. */ + singleBatch?: boolean; + /** For queries against a sharded collection, allows the command (or subsequent getMore commands) to return partial results, rather than an error, if one or more queried shards are unavailable. */ + allowPartialResults?: boolean; + /** Determines whether to return the record identifier for each document. If true, adds a field $recordId to the returned documents. */ + showRecordId?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} +/** @public */ +export declare type Flatten = Type extends ReadonlyArray ? Item : Type; +/** @public */ +export declare type GenericListener = (...args: any[]) => void; +/* Excluded from this release type: GetMore */ +/* Excluded from this release type: GetMoreOptions */ +/** + * Constructor for a streaming GridFS interface + * @public + */ +export declare class GridFSBucket extends TypedEventEmitter { + /* Excluded from this release type: s */ + /** + * When the first call to openUploadStream is made, the upload stream will + * check to see if it needs to create the proper indexes on the chunks and + * files collections. This event is fired either when 1) it determines that + * no index creation is necessary, 2) when it successfully creates the + * necessary indexes. 
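/**
 * Sketch of `FindOptions` passed directly to `find(filter, options)` rather than
 * chaining cursor methods (illustrative; field names are assumptions).
 * @example
 * ```typescript
 * const recent = await collection.find(
 *   { userId },
 *   { projection: { _id: 0, title: 1 }, sort: { createdAt: -1 }, limit: 10 }
 * ).toArray();
 * ```
 */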
+ * @event + */ + static readonly INDEX: "index"; + constructor(db: Db, options?: GridFSBucketOptions); + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS. The stream's 'id' property contains the resulting + * file's id. + * + * @param filename - The value of the 'filename' key in the files doc + * @param options - Optional settings. + */ + openUploadStream(filename: string, options?: GridFSBucketWriteStreamOptions): GridFSBucketWriteStream; + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS for a custom file id. The stream's 'id' property contains the resulting + * file's id. + */ + openUploadStreamWithId(id: ObjectId, filename: string, options?: GridFSBucketWriteStreamOptions): GridFSBucketWriteStream; + /** Returns a readable stream (GridFSBucketReadStream) for streaming file data from GridFS. */ + openDownloadStream(id: ObjectId, options?: GridFSBucketReadStreamOptions): GridFSBucketReadStream; + /** + * Deletes a file with the given id + * + * @param id - The id of the file doc + */ + delete(id: ObjectId): Promise; + delete(id: ObjectId, callback: Callback): void; + /** Convenience wrapper around find on the files collection */ + find(filter?: Filter, options?: FindOptions): FindCursor; + /** + * Returns a readable stream (GridFSBucketReadStream) for streaming the + * file with the given name from GridFS. If there are multiple files with + * the same name, this will stream the most recent file with the given name + * (as determined by the `uploadDate` field). You can set the `revision` + * option to change this behavior. + */ + openDownloadStreamByName(filename: string, options?: GridFSBucketReadStreamOptionsWithRevision): GridFSBucketReadStream; + /** + * Renames the file with the given _id to the given string + * + * @param id - the id of the file to rename + * @param filename - new name for the file + */ + rename(id: ObjectId, filename: string): Promise; + rename(id: ObjectId, filename: string, callback: Callback): void; + /** Removes this bucket's files collection, followed by its chunks collection. */ + drop(): Promise; + drop(callback: Callback): void; + /** Get the Db scoped logger. */ + getLogger(): Logger; +} +/** @public */ +export declare type GridFSBucketEvents = { + index(): void; +}; +/** @public */ +export declare interface GridFSBucketOptions extends WriteConcernOptions { + /** The 'files' and 'chunks' collections will be prefixed with the bucket name followed by a dot. */ + bucketName?: string; + /** Number of bytes stored in each chunk. Defaults to 255KB */ + chunkSizeBytes?: number; + /** Read preference to be passed to read operations */ + readPreference?: ReadPreference; +} +/* Excluded from this release type: GridFSBucketPrivate */ +/** + * A readable stream that enables you to read buffers from GridFS. + * + * Do not instantiate this class directly. Use `openDownloadStream()` instead. + * @public + */ +export declare class GridFSBucketReadStream extends Readable implements NodeJS.ReadableStream { + /* Excluded from this release type: s */ + /** + * An error occurred + * @event + */ + static readonly ERROR: "error"; + /** + * Fires when the stream loaded the file document corresponding to the provided id. + * @event + */ + static readonly FILE: "file"; + /** + * Emitted when a chunk of data is available to be consumed. + * @event + */ + static readonly DATA: "data"; + /** + * Fired when the stream is exhausted (no more data events). 
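/**
 * Sketch of uploading a file through `GridFSBucket#openUploadStream`
 * (illustrative; the bucket name, file path and metadata are assumptions).
 * @example
 * ```typescript
 * import { createReadStream } from 'fs';
 * const bucket = new GridFSBucket(db, { bucketName: 'uploads' });
 * createReadStream('./report.pdf')
 *   .pipe(bucket.openUploadStream('report.pdf', { metadata: { owner: userId } }))
 *   .on('finish', () => console.log('stored in GridFS'));
 * ```
 */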
+ * @event + */ + static readonly END: "end"; + /** + * Fired when the stream is exhausted and the underlying cursor is killed + * @event + */ + static readonly CLOSE: "close"; + /* Excluded from this release type: __constructor */ + /* Excluded from this release type: _read */ + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param start - 0-based offset in bytes to start streaming from + */ + start(start?: number): this; + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param end - Offset in bytes to stop reading at + */ + end(end?: number): this; + /** + * Marks this stream as aborted (will never push another `data` event) + * and kills the underlying cursor. Will emit the 'end' event, and then + * the 'close' event once the cursor is successfully killed. + * + * @param callback - called when the cursor is successfully closed or an error occurred. + */ + abort(callback?: Callback): void; +} +/** @public */ +export declare interface GridFSBucketReadStreamOptions { + sort?: Sort; + skip?: number; + /** 0-based offset in bytes to start streaming from */ + start?: number; + /** 0-based offset in bytes to stop streaming before */ + end?: number; +} +/** @public */ +export declare interface GridFSBucketReadStreamOptionsWithRevision extends GridFSBucketReadStreamOptions { + /** The revision number relative to the oldest file with the given filename. 0 + * gets you the oldest file, 1 gets you the 2nd oldest, -1 gets you the + * newest. */ + revision?: number; +} +/* Excluded from this release type: GridFSBucketReadStreamPrivate */ +/** + * A writable stream that enables you to write buffers to GridFS. + * + * Do not instantiate this class directly. Use `openUploadStream()` instead. + * @public + */ +export declare class GridFSBucketWriteStream extends Writable implements NodeJS.WritableStream { + bucket: GridFSBucket; + chunks: Collection; + filename: string; + files: Collection; + options: GridFSBucketWriteStreamOptions; + done: boolean; + id: ObjectId; + chunkSizeBytes: number; + bufToStore: Buffer; + length: number; + n: number; + pos: number; + state: { + streamEnd: boolean; + outstandingRequests: number; + errored: boolean; + aborted: boolean; + }; + writeConcern?: WriteConcern; + /** @event */ + static readonly CLOSE = "close"; + /** @event */ + static readonly ERROR = "error"; + /** + * `end()` was called and the write stream successfully wrote the file metadata and all the chunks to MongoDB. + * @event + */ + static readonly FINISH = "finish"; + /* Excluded from this release type: __constructor */ + /** + * Write a buffer to the stream. + * + * @param chunk - Buffer to write + * @param encodingOrCallback - Optional encoding for the buffer + * @param callback - Function to call when the chunk was added to the buffer, or if the entire chunk was persisted to MongoDB if this chunk caused a flush. + * @returns False if this write required flushing a chunk to MongoDB. True otherwise. 
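/**
 * Sketch of downloading the newest revision of a file with
 * `openDownloadStreamByName` (illustrative; file names and paths are assumptions).
 * @example
 * ```typescript
 * import { createWriteStream } from 'fs';
 * bucket
 *   .openDownloadStreamByName('report.pdf', { revision: -1 }) // -1 = newest file
 *   .pipe(createWriteStream('./report-copy.pdf'))
 *   .on('error', err => console.error(err));
 * ```
 */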
+ */ + write(chunk: Buffer | string): boolean; + write(chunk: Buffer | string, callback: Callback): boolean; + write(chunk: Buffer | string, encoding: BufferEncoding | undefined): boolean; + write(chunk: Buffer | string, encoding: BufferEncoding | undefined, callback: Callback): boolean; + /** + * Places this write stream into an aborted state (all future writes fail) + * and deletes all chunks that have already been written. + * + * @param callback - called when chunks are successfully removed or error occurred + */ + abort(): Promise; + abort(callback: Callback): void; + /** + * Tells the stream that no more data will be coming in. The stream will + * persist the remaining data to MongoDB, write the files document, and + * then emit a 'finish' event. + * + * @param chunk - Buffer to write + * @param encoding - Optional encoding for the buffer + * @param callback - Function to call when all files and chunks have been persisted to MongoDB + */ + end(): void; + end(chunk: Buffer): void; + end(callback: Callback): void; + end(chunk: Buffer, callback: Callback): void; + end(chunk: Buffer, encoding: BufferEncoding): void; + end(chunk: Buffer, encoding: BufferEncoding | undefined, callback: Callback): void; +} +/** @public */ +export declare interface GridFSBucketWriteStreamOptions extends WriteConcernOptions { + /** Overwrite this bucket's chunkSizeBytes for this file */ + chunkSizeBytes?: number; + /** Custom file id for the GridFS file. */ + id?: ObjectId; + /** Object to store in the file document's `metadata` field */ + metadata?: Document; + /** String to store in the file document's `contentType` field */ + contentType?: string; + /** Array of strings to store in the file document's `aliases` field */ + aliases?: string[]; +} +/** @public */ +export declare interface GridFSChunk { + _id: ObjectId; + files_id: ObjectId; + n: number; + data: Buffer | Uint8Array; +} +/** @public */ +export declare interface GridFSFile { + _id: ObjectId; + length: number; + chunkSize: number; + filename: string; + contentType?: string; + aliases?: string[]; + metadata?: Document; + uploadDate: Date; +} +/** @public */ +export declare interface HedgeOptions { + /** Explicitly enable or disable hedged reads. */ + enabled?: boolean; +} +/** @public */ +export declare type Hint = string | Document; +/** @public */ +export declare class HostAddress { + host: string | undefined; + port: number | undefined; + socketPath: string | undefined; + isIPv6: boolean | undefined; + constructor(hostString: string); + /** + * @param ipv6Brackets - optionally request ipv6 bracket notation required for connection strings + */ + toString(ipv6Brackets?: boolean): string; + static fromString(s: string): HostAddress; +} +/** @public */ +export declare interface IndexDescription extends Pick { + collation?: CollationOptions; + name?: string; + key: Document; +} +/** @public */ +export declare type IndexDirection = -1 | 1 | '2d' | '2dsphere' | 'text' | 'geoHaystack' | number; +/** @public */ +export declare interface IndexInformationOptions { + full?: boolean; + readPreference?: ReadPreference; + session?: ClientSession; +} +/** @public */ +export declare type IndexSpecification = OneOrMore; +/** Given an object shaped type, return the type of the _id field or default to ObjectId @public */ +export declare type InferIdType = TSchema extends { + _id: infer IdType; +} ? {} extends IdType ? Exclude : unknown extends IdType ? 
ObjectId : IdType : ObjectId; +/** @public */ +export declare interface InsertManyResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The number of inserted documents for this operations */ + insertedCount: number; + /** Map of the index of the inserted document to the id of the inserted document */ + insertedIds: { + [key: number]: InferIdType; + }; +} +/** @public */ +export declare interface InsertOneModel { + /** The document to insert. */ + document: OptionalId; +} +/** @public */ +export declare interface InsertOneOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; +} +/** @public */ +export declare interface InsertOneResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The identifier that was inserted. If the server generated the identifier, this value will be null as the driver does not have access to that data */ + insertedId: InferIdType; +} +export { Int32 }; +/** @public */ +export declare type IntegerType = number | Int32 | Long; +/* Excluded from this release type: InternalAbstractCursorOptions */ +/* Excluded from this release type: InterruptibleAsyncInterval */ +/** @public */ +export declare type IsAny = true extends false & Type ? ResultIfAny : ResultIfNotAny; +/* Excluded from this release type: kBeforeHandshake */ +/* Excluded from this release type: kBuffer */ +/* Excluded from this release type: kBuffers */ +/* Excluded from this release type: kBuiltOptions */ +/* Excluded from this release type: kCancellationToken */ +/* Excluded from this release type: kCancellationToken_2 */ +/* Excluded from this release type: kCancelled */ +/* Excluded from this release type: kCancelled_2 */ +/* Excluded from this release type: kCheckedOut */ +/* Excluded from this release type: kClosed */ +/* Excluded from this release type: kClosed_2 */ +/* Excluded from this release type: kClusterTime */ +/* Excluded from this release type: kConnection */ +/* Excluded from this release type: kConnectionCounter */ +/* Excluded from this release type: kConnections */ +/* Excluded from this release type: kCursorStream */ +/* Excluded from this release type: kDescription */ +/* Excluded from this release type: kDocuments */ +/* Excluded from this release type: kErrorLabels */ +/** @public */ +export declare type KeysOfAType = { + [key in keyof TSchema]: NonNullable extends Type ? key : never; +}[keyof TSchema]; +/** @public */ +export declare type KeysOfOtherType = { + [key in keyof TSchema]: NonNullable extends Type ? 
never : key; +}[keyof TSchema]; +/* Excluded from this release type: kFilter */ +/* Excluded from this release type: kFullResult */ +/* Excluded from this release type: kGeneration */ +/* Excluded from this release type: kGeneration_2 */ +/* Excluded from this release type: kId */ +/* Excluded from this release type: KillCursor */ +/* Excluded from this release type: kInitialized */ +/* Excluded from this release type: kInternalClient */ +/* Excluded from this release type: kIsMaster */ +/* Excluded from this release type: kKilled */ +/* Excluded from this release type: kLastUseTime */ +/* Excluded from this release type: kLength */ +/* Excluded from this release type: kLogger */ +/* Excluded from this release type: kMessageStream */ +/* Excluded from this release type: kMetrics */ +/* Excluded from this release type: kMinPoolSizeTimer */ +/* Excluded from this release type: kMode */ +/* Excluded from this release type: kMonitor */ +/* Excluded from this release type: kMonitorId */ +/* Excluded from this release type: kNamespace */ +/* Excluded from this release type: kNumReturned */ +/* Excluded from this release type: kOptions */ +/* Excluded from this release type: kOptions_2 */ +/* Excluded from this release type: kOptions_3 */ +/* Excluded from this release type: kPermits */ +/* Excluded from this release type: kPinnedConnection */ +/* Excluded from this release type: kPipeline */ +/* Excluded from this release type: kProcessingWaitQueue */ +/* Excluded from this release type: kQueue */ +/* Excluded from this release type: kResumeQueue */ +/* Excluded from this release type: kRoundTripTime */ +/* Excluded from this release type: kRTTPinger */ +/* Excluded from this release type: kServer */ +/* Excluded from this release type: kServer_2 */ +/* Excluded from this release type: kServerError */ +/* Excluded from this release type: kServerSession */ +/* Excluded from this release type: kServiceGenerations */ +/* Excluded from this release type: kSession */ +/* Excluded from this release type: kSession_2 */ +/* Excluded from this release type: kSnapshotEnabled */ +/* Excluded from this release type: kSnapshotTime */ +/* Excluded from this release type: kStream */ +/* Excluded from this release type: kTopology */ +/* Excluded from this release type: kTransform */ +/* Excluded from this release type: kWaitQueue */ +/* Excluded from this release type: kWaitQueue_2 */ +/** @public */ +export declare const LEGAL_TCP_SOCKET_OPTIONS: readonly [ + "family", + "hints", + "localAddress", + "localPort", + "lookup" +]; +/** @public */ +export declare const LEGAL_TLS_SOCKET_OPTIONS: readonly [ + "ALPNProtocols", + "ca", + "cert", + "checkServerIdentity", + "ciphers", + "crl", + "ecdhCurve", + "key", + "minDHSize", + "passphrase", + "pfx", + "rejectUnauthorized", + "secureContext", + "secureProtocol", + "servername", + "session" +]; +/** @public */ +export declare class ListCollectionsCursor | CollectionInfo = Pick | CollectionInfo> extends AbstractCursor { + parent: Db; + filter: Document; + options?: ListCollectionsOptions; + constructor(db: Db, filter: Document, options?: ListCollectionsOptions); + clone(): ListCollectionsCursor; +} +/** @public */ +export declare interface ListCollectionsOptions extends CommandOperationOptions { + /** Since 4.0: If true, will only return the collection name in the response, and will omit additional info */ + nameOnly?: boolean; + /** The batchSize for the returned command cursor or if pre 2.8 the systems batch collection */ + batchSize?: number; +} +/** @public */ 
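/*
 * Editor's note, not part of the driver file: the ListCollections / ListDatabases options in
 * this span are consumed by `Db.listCollections()` and `Admin.listDatabases()` (declared
 * elsewhere in the driver). A hedged sketch, assuming a connected MongoClient `client` and an
 * enclosing async function:
 *
 * ```js
 * // collection names only, as permitted by ListCollectionsOptions.nameOnly
 * const collections = await client.db('test').listCollections({}, { nameOnly: true }).toArray();
 * // database listing via the admin helper
 * const databases = await client.db().admin().listDatabases({ nameOnly: true });
 * console.log(collections, databases);
 * ```
 */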
+export declare interface ListDatabasesOptions extends CommandOperationOptions { + /** A query predicate that determines which databases are listed */ + filter?: Document; + /** A flag to indicate whether the command should return just the database names, or return both database names and size information */ + nameOnly?: boolean; + /** A flag that determines which databases are returned based on the user privileges when access control is enabled */ + authorizedDatabases?: boolean; +} +/** @public */ +export declare type ListDatabasesResult = string[] | Document[]; +/** @public */ +export declare class ListIndexesCursor extends AbstractCursor { + parent: Collection; + options?: ListIndexesOptions; + constructor(collection: Collection, options?: ListIndexesOptions); + clone(): ListIndexesCursor; +} +/** @public */ +export declare interface ListIndexesOptions extends CommandOperationOptions { + /** The batchSize for the returned command cursor or if pre 2.8 the systems batch collection */ + batchSize?: number; +} +/** + * @public + */ +export declare class Logger { + className: string; + /** + * Creates a new Logger instance + * + * @param className - The Class name associated with the logging instance + * @param options - Optional logging settings + */ + constructor(className: string, options?: LoggerOptions); + /** + * Log a message at the debug level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + debug(message: string, object?: unknown): void; + /** + * Log a message at the warn level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + warn(message: string, object?: unknown): void; + /** + * Log a message at the info level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + info(message: string, object?: unknown): void; + /** + * Log a message at the error level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + error(message: string, object?: unknown): void; + /** Is the logger set at info level */ + isInfo(): boolean; + /** Is the logger set at error level */ + isError(): boolean; + /** Is the logger set at error level */ + isWarn(): boolean; + /** Is the logger set at debug level */ + isDebug(): boolean; + /** Resets the logger to default settings, error and no filtered classes */ + static reset(): void; + /** Get the current logger function */ + static currentLogger(): LoggerFunction; + /** + * Set the current logger function + * + * @param logger - Custom logging function + */ + static setCurrentLogger(logger: LoggerFunction): void; + /** + * Filter log messages for a particular class + * + * @param type - The type of filter (currently only class) + * @param values - The filters to apply + */ + static filter(type: string, values: string[]): void; + /** + * Set the current log level + * + * @param newLevel - Set current log level (debug, warn, info, error) + */ + static setLevel(newLevel: LoggerLevel): void; +} +/** @public */ +export declare type LoggerFunction = (message?: any, ...optionalParams: any[]) => void; +/** @public */ +export declare const LoggerLevel: Readonly<{ + readonly ERROR: "error"; + readonly WARN: "warn"; + readonly INFO: "info"; + readonly DEBUG: "debug"; + readonly error: "error"; + readonly warn: "warn"; + readonly info: "info"; + readonly debug: "debug"; +}>; +/** @public */ +export declare type LoggerLevel = typeof LoggerLevel[keyof typeof LoggerLevel]; +/** @public */ 
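/*
 * Editor's note, not part of the driver file: a hedged sketch of configuring the legacy Logger
 * declared above; the 'Cursor' filter value is purely illustrative:
 *
 * ```js
 * const { Logger } = require('mongodb');
 *
 * Logger.setLevel('debug');                    // LoggerLevel.DEBUG
 * Logger.filter('class', ['Cursor']);          // only emit messages from this class
 * Logger.setCurrentLogger((msg, context) => console.log(msg, context));
 * ```
 */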
+export declare interface LoggerOptions { + logger?: LoggerFunction; + loggerLevel?: LoggerLevel; +} +export { Long }; +export { Map_2 as Map }; +/** @public */ +export declare type MapFunction = (this: TSchema) => void; +/** @public */ +export declare interface MapReduceOptions extends CommandOperationOptions { + /** Sets the output target for the map reduce job. */ + out?: 'inline' | { + inline: 1; + } | { + replace: string; + } | { + merge: string; + } | { + reduce: string; + }; + /** Query filter object. */ + query?: Document; + /** Sorts the input objects using this key. Useful for optimization, like sorting by the emit key for fewer reduces. */ + sort?: Sort; + /** Number of objects to return from collection. */ + limit?: number; + /** Keep temporary data. */ + keeptemp?: boolean; + /** Finalize function. */ + finalize?: FinalizeFunction | string; + /** Can pass in variables that can be access from map/reduce/finalize. */ + scope?: Document; + /** It is possible to make the execution stay in JS. Provided in MongoDB \> 2.0.X. */ + jsMode?: boolean; + /** Provide statistics on job execution time. */ + verbose?: boolean; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; +} +/** @public */ +export declare type MatchKeysAndValues = Readonly> & Record; +export { MaxKey }; +/* Excluded from this release type: MessageStream */ +/* Excluded from this release type: MessageStreamOptions */ +export { MinKey }; +/** @public */ +export declare interface ModifyResult { + value: TSchema | null; + lastErrorObject?: Document; + ok: 0 | 1; +} +/** @public */ +export declare const MONGO_CLIENT_EVENTS: readonly [ + "connectionPoolCreated", + "connectionPoolClosed", + "connectionCreated", + "connectionReady", + "connectionClosed", + "connectionCheckOutStarted", + "connectionCheckOutFailed", + "connectionCheckedOut", + "connectionCheckedIn", + "connectionPoolCleared", + ...("error" | "close" | "commandStarted" | "commandSucceeded" | "commandFailed" | "serverHeartbeatStarted" | "serverHeartbeatSucceeded" | "serverHeartbeatFailed" | "timeout" | "serverOpening" | "serverClosed" | "serverDescriptionChanged" | "topologyOpening" | "topologyClosed" | "topologyDescriptionChanged")[] +]; +/** + * An error generated when the driver API is used incorrectly + * + * @privateRemarks + * Should **never** be directly instantiated + * + * @public + * @category Error + */ +export declare class MongoAPIError extends MongoDriverError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated when a batch command is reexecuted after one of the commands in the batch + * has failed + * + * @public + * @category Error + */ +export declare class MongoBatchReExecutionError extends MongoAPIError { + constructor(message?: string); + readonly name: string; +} +/** + * An error indicating an unsuccessful Bulk Write + * @public + * @category Error + */ +export declare class MongoBulkWriteError extends MongoServerError { + result: BulkWriteResult; + writeErrors: OneOrMore; + err?: WriteConcernError; + /** Creates a new MongoBulkWriteError */ + constructor(error: { + message: string; + code: number; + writeErrors?: WriteError[]; + } | WriteConcernError | AnyError, result: BulkWriteResult); + readonly name: string; + /*Number of documents inserted. */ + readonly insertedCount: number; + /*Number of documents matched for update. */ + readonly matchedCount: number; + /*Number of documents modified. 
*/ + readonly modifiedCount: number; + /*Number of documents deleted. */ + readonly deletedCount: number; + /*Number of documents upserted. */ + readonly upsertedCount: number; + /*Inserted document generated Id's, hash key is the index of the originating operation */ + readonly insertedIds: { + [key: number]: any; + }; + /*Upserted document generated Id's, hash key is the index of the originating operation */ + readonly upsertedIds: { + [key: number]: any; + }; +} +/** + * An error generated when a ChangeStream operation fails to execute. + * + * @public + * @category Error + */ +export declare class MongoChangeStreamError extends MongoRuntimeError { + constructor(message: string); + readonly name: string; +} +/** + * The **MongoClient** class is a class that allows for making Connections to MongoDB. + * @public + * + * @remarks + * The programmatically provided options take precedent over the URI options. + * + * @example + * ```js + * // Connect using a MongoClient instance + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * const mongoClient = new MongoClient(url); + * mongoClient.connect(function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + * + * @example + * ```js + * // Connect using the MongoClient.connect static method + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + */ +export declare class MongoClient extends TypedEventEmitter { + /* Excluded from this release type: s */ + /* Excluded from this release type: topology */ + /* Excluded from this release type: [kOptions] */ + constructor(url: string, options?: MongoClientOptions); + readonly options: Readonly; + readonly serverApi: Readonly; + /*Excluded from this release type: monitorCommands + Excluded from this release type: monitorCommands */ + readonly autoEncrypter: AutoEncrypter | undefined; + readonly readConcern: ReadConcern | undefined; + readonly writeConcern: WriteConcern | undefined; + readonly readPreference: ReadPreference; + readonly bsonOptions: BSONSerializeOptions; + readonly logger: Logger; + /** + * Connect to MongoDB using a url + * + * @see docs.mongodb.org/manual/reference/connection-string/ + */ + connect(): Promise; + connect(callback: Callback): void; + /** + * Close the db and its underlying connections + * + * @param force - Force close, emitting no events + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + close(): Promise; + close(callback: Callback): void; + close(force: boolean): Promise; + close(force: boolean, callback: Callback): void; + /** + * Create a new Db instance sharing the current socket connections. + * + * @param dbName - The name of the database we want to use. If not provided, use database name from connection string. + * @param options - Optional settings for Db construction + */ + db(dbName?: string, options?: DbOptions): Db; + /** + * Connect to MongoDB using a url + * + * @remarks + * The programmatically provided options take precedent over the URI options. 
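 *
 * Editor's illustration, not from the driver docs: the same call in promise / async-await form,
 * which the overloads below also support:
 *
 * ```js
 * const { MongoClient } = require('mongodb');
 *
 * async function main() {
 *   const client = await MongoClient.connect('mongodb://localhost:27017');
 *   const db = client.db('test');
 *   // ... use db ...
 *   await client.close();
 * }
 * ```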
+ * + * @see https://docs.mongodb.org/manual/reference/connection-string/ + */ + static connect(url: string): Promise; + static connect(url: string, callback: Callback): void; + static connect(url: string, options: MongoClientOptions): Promise; + static connect(url: string, options: MongoClientOptions, callback: Callback): void; + /** Starts a new session on the server */ + startSession(): ClientSession; + startSession(options: ClientSessionOptions): ClientSession; + /** + * Runs a given operation with an implicitly created session. The lifetime of the session + * will be handled without the need for user interaction. + * + * NOTE: presently the operation MUST return a Promise (either explicit or implicitly as an async function) + * + * @param options - Optional settings for the command + * @param callback - An callback to execute with an implicitly created session + */ + withSession(callback: WithSessionCallback): Promise; + withSession(options: ClientSessionOptions, callback: WithSessionCallback): Promise; + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this cluster. Will ignore all + * changes to system collections, as well as the local, admin, and config databases. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch(pipeline?: Document[], options?: ChangeStreamOptions): ChangeStream; + /** Return the mongo client logger */ + getLogger(): Logger; +} +/** @public */ +export declare type MongoClientEvents = Pick & { + open(mongoClient: MongoClient): void; +}; +/** + * Describes all possible URI query options for the mongo client + * @public + * @see https://docs.mongodb.com/manual/reference/connection-string + */ +export declare interface MongoClientOptions extends BSONSerializeOptions, SupportedNodeConnectionOptions { + /** Specifies the name of the replica set, if the mongod is a member of a replica set. */ + replicaSet?: string; + /** Enables or disables TLS/SSL for the connection. */ + tls?: boolean; + /** A boolean to enable or disables TLS/SSL for the connection. (The ssl option is equivalent to the tls option.) */ + ssl?: boolean; + /** Specifies the location of a local TLS Certificate */ + tlsCertificateFile?: string; + /** Specifies the location of a local .pem file that contains either the client's TLS/SSL certificate and key or only the client's TLS/SSL key when tlsCertificateFile is used to provide the certificate. */ + tlsCertificateKeyFile?: string; + /** Specifies the password to de-crypt the tlsCertificateKeyFile. */ + tlsCertificateKeyFilePassword?: string; + /** Specifies the location of a local .pem file that contains the root certificate chain from the Certificate Authority. This file is used to validate the certificate presented by the mongod/mongos instance. */ + tlsCAFile?: string; + /** Bypasses validation of the certificates presented by the mongod/mongos instance */ + tlsAllowInvalidCertificates?: boolean; + /** Disables hostname validation of the certificate presented by the mongod/mongos instance. */ + tlsAllowInvalidHostnames?: boolean; + /** Disables various certificate validations. 
*/ + tlsInsecure?: boolean; + /** The time in milliseconds to attempt a connection before timing out. */ + connectTimeoutMS?: number; + /** The time in milliseconds to attempt a send or receive on a socket before the attempt times out. */ + socketTimeoutMS?: number; + /** An array or comma-delimited string of compressors to enable network compression for communication between this client and a mongod/mongos instance. */ + compressors?: CompressorName[] | string; + /** An integer that specifies the compression level if using zlib for network compression. */ + zlibCompressionLevel?: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | undefined; + /** The maximum number of connections in the connection pool. */ + maxPoolSize?: number; + /** The minimum number of connections in the connection pool. */ + minPoolSize?: number; + /** The maximum number of milliseconds that a connection can remain idle in the pool before being removed and closed. */ + maxIdleTimeMS?: number; + /** The maximum time in milliseconds that a thread can wait for a connection to become available. */ + waitQueueTimeoutMS?: number; + /** Specify a read concern for the collection (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** The level of isolation */ + readConcernLevel?: ReadConcernLevel; + /** Specifies the read preferences for this connection */ + readPreference?: ReadPreferenceMode | ReadPreference; + /** Specifies, in seconds, how stale a secondary can be before the client stops using it for read operations. */ + maxStalenessSeconds?: number; + /** Specifies the tags document as a comma-separated list of colon-separated key-value pairs. */ + readPreferenceTags?: TagSet[]; + /** The auth settings for when connection to server. */ + auth?: Auth; + /** Specify the database name associated with the user’s credentials. */ + authSource?: string; + /** Specify the authentication mechanism that MongoDB will use to authenticate the connection. */ + authMechanism?: AuthMechanism; + /** Specify properties for the specified authMechanism as a comma-separated list of colon-separated key-value pairs. */ + authMechanismProperties?: AuthMechanismProperties; + /** The size (in milliseconds) of the latency window for selecting among multiple suitable MongoDB instances. */ + localThresholdMS?: number; + /** Specifies how long (in milliseconds) to block for server selection before throwing an exception. */ + serverSelectionTimeoutMS?: number; + /** heartbeatFrequencyMS controls when the driver checks the state of the MongoDB deployment. Specify the interval (in milliseconds) between checks, counted from the end of the previous check until the beginning of the next one. */ + heartbeatFrequencyMS?: number; + /** Sets the minimum heartbeat frequency. In the event that the driver has to frequently re-check a server's availability, it will wait at least this long since the previous check to avoid wasted effort. */ + minHeartbeatFrequencyMS?: number; + /** The name of the application that created this MongoClient instance. MongoDB 3.4 and newer will print this value in the server log upon establishing each connection. It is also recorded in the slow query log and profile collections */ + appName?: string; + /** Enables retryable reads. */ + retryReads?: boolean; + /** Enable retryable writes. 
*/ + retryWrites?: boolean; + /** Allow a driver to force a Single topology type with a connection string containing one host */ + directConnection?: boolean; + /** Instruct the driver it is connecting to a load balancer fronting a mongos like service */ + loadBalanced?: boolean; + /** The write concern w value */ + w?: W; + /** The write concern timeout */ + wtimeoutMS?: number; + /** The journal write concern */ + journal?: boolean; + /** Validate mongod server certificate against Certificate Authority */ + sslValidate?: boolean; + /** SSL Certificate file path. */ + sslCA?: string; + /** SSL Certificate file path. */ + sslCert?: string; + /** SSL Key file file path. */ + sslKey?: string; + /** SSL Certificate pass phrase. */ + sslPass?: string; + /** SSL Certificate revocation list file path. */ + sslCRL?: string; + /** TCP Connection no delay */ + noDelay?: boolean; + /** TCP Connection keep alive enabled */ + keepAlive?: boolean; + /** The number of milliseconds to wait before initiating keepAlive on the TCP socket */ + keepAliveInitialDelay?: number; + /** Force server to assign `_id` values instead of driver */ + forceServerObjectId?: boolean; + /** Return document results as raw BSON buffers */ + raw?: boolean; + /** A primary key factory function for generation of custom `_id` keys */ + pkFactory?: PkFactory; + /** A Promise library class the application wishes to use such as Bluebird, must be ES6 compatible */ + promiseLibrary?: any; + /** The logging level */ + loggerLevel?: LoggerLevel; + /** Custom logger object */ + logger?: Logger; + /** Enable command monitoring for this client */ + monitorCommands?: boolean; + /** Server API version */ + serverApi?: ServerApi | ServerApiVersion; + /** + * Optionally enable client side auto encryption + * + * @remarks + * Automatic encryption is an enterprise only feature that only applies to operations on a collection. Automatic encryption is not supported for operations on a database or view, and operations that are not bypassed will result in error + * (see [libmongocrypt: Auto Encryption Allow-List](https://github.com/mongodb/specifications/blob/master/source/client-side-encryption/client-side-encryption.rst#libmongocrypt-auto-encryption-allow-list)). To bypass automatic encryption for all operations, set bypassAutoEncryption=true in AutoEncryptionOpts. + * + * Automatic encryption requires the authenticated user to have the [listCollections privilege action](https://docs.mongodb.com/manual/reference/command/listCollections/#dbcmd.listCollections). + * + * If a MongoClient with a limited connection pool size (i.e a non-zero maxPoolSize) is configured with AutoEncryptionOptions, a separate internal MongoClient is created if any of the following are true: + * - AutoEncryptionOptions.keyVaultClient is not passed. + * - AutoEncryptionOptions.bypassAutomaticEncryption is false. + * + * If an internal MongoClient is created, it is configured with the same options as the parent MongoClient except minPoolSize is set to 0 and AutoEncryptionOptions is omitted. 
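 *
 * Editor's illustration covering MongoClientOptions as a whole (not specific to auto
 * encryption, and not from the driver docs): options from this interface are passed as the
 * second constructor argument, for example:
 *
 * ```js
 * const { MongoClient, ServerApiVersion } = require('mongodb');
 *
 * const client = new MongoClient('mongodb://localhost:27017', {
 *   maxPoolSize: 10,
 *   retryWrites: true,
 *   serverApi: ServerApiVersion.v1,
 * });
 * // later: await client.connect();
 * ```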
+ */ + autoEncryption?: AutoEncryptionOptions; + /** Allows a wrapping driver to amend the client metadata generated by the driver to include information about the wrapping driver */ + driverInfo?: DriverInfo; +} +/* Excluded from this release type: MongoClientPrivate */ +/** + * An error generated when a feature that is not enabled or allowed for the current server + * configuration is used + * + * + * @public + * @category Error + */ +export declare class MongoCompatibilityError extends MongoAPIError { + constructor(message: string); + readonly name: string; +} +/** + * A representation of the credentials used by MongoDB + * @public + */ +export declare class MongoCredentials { + /** The username used for authentication */ + readonly username: string; + /** The password used for authentication */ + readonly password: string; + /** The database that the user should authenticate against */ + readonly source: string; + /** The method used to authenticate */ + readonly mechanism: AuthMechanism; + /** Special properties used by some types of auth mechanisms */ + readonly mechanismProperties: AuthMechanismProperties; + constructor(options: MongoCredentialsOptions); + /** Determines if two MongoCredentials objects are equivalent */ + equals(other: MongoCredentials): boolean; + /** + * If the authentication mechanism is set to "default", resolves the authMechanism + * based on the server version and server supported sasl mechanisms. + * + * @param ismaster - An ismaster response from the server + */ + resolveAuthMechanism(ismaster?: Document): MongoCredentials; + validate(): void; + static merge(creds: MongoCredentials | undefined, options: Partial): MongoCredentials; +} +/** @public */ +export declare interface MongoCredentialsOptions { + username: string; + password: string; + source: string; + db?: string; + mechanism?: AuthMechanism; + mechanismProperties: AuthMechanismProperties; +} +/** + * An error thrown when an attempt is made to read from a cursor that has been exhausted + * + * @public + * @category Error + */ +export declare class MongoCursorExhaustedError extends MongoAPIError { + constructor(message?: string); + readonly name: string; +} +/** + * An error thrown when the user attempts to add options to a cursor that has already been + * initialized + * + * @public + * @category Error + */ +export declare class MongoCursorInUseError extends MongoAPIError { + constructor(message?: string); + readonly name: string; +} +/** @public */ +export declare class MongoDBNamespace { + db: string; + collection?: string; + /** + * Create a namespace object + * + * @param db - database name + * @param collection - collection name + */ + constructor(db: string, collection?: string); + toString(): string; + withCollection(collection: string): MongoDBNamespace; + static fromString(namespace?: string): MongoDBNamespace; +} +/** + * An error generated when the driver fails to decompress + * data received from the server. 
+ * + * @public + * @category Error + */ +export declare class MongoDecompressionError extends MongoRuntimeError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated by the driver + * + * @public + * @category Error + */ +export declare class MongoDriverError extends MongoError { + constructor(message: string); + readonly name: string; +} +/** + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error, it uses the constructor with a string argument + */ +export declare class MongoError extends Error { + /* Excluded from this release type: [kErrorLabels] */ + /** + * This is a number in MongoServerError and a string in MongoDriverError + * @privateRemarks + * Define the type override on the subclasses when we can use the override keyword + */ + code?: number | string; + topologyVersion?: TopologyVersion; + constructor(message: string | Error); + readonly name: string; + /*Legacy name for server error responses */ + readonly errmsg: string; + /** + * Checks the error to see if it has an error label + * + * @param label - The error label to check for + * @returns returns true if the error has the provided error label + */ + hasErrorLabel(label: string): boolean; + addErrorLabel(label: string): void; + readonly errorLabels: string[]; +} +/** + * An error generated when the user attempts to operate + * on a session that has expired or has been closed. + * + * @public + * @category Error + */ +export declare class MongoExpiredSessionError extends MongoAPIError { + constructor(message?: string); + readonly name: string; +} +/** + * An error generated when a malformed or invalid chunk is + * encountered when reading from a GridFSStream. + * + * @public + * @category Error + */ +export declare class MongoGridFSChunkError extends MongoRuntimeError { + constructor(message: string); + readonly name: string; +} +/** An error generated when a GridFSStream operation fails to execute. + * + * @public + * @category Error + */ +export declare class MongoGridFSStreamError extends MongoRuntimeError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated when the user supplies malformed or unexpected arguments + * or when a required argument or field is not provided. + * + * + * @public + * @category Error + */ +export declare class MongoInvalidArgumentError extends MongoAPIError { + constructor(message: string); + readonly name: string; +} +/** + * A error generated when the user attempts to authenticate + * via Kerberos, but fails to connect to the Kerberos client. + * + * @public + * @category Error + */ +export declare class MongoKerberosError extends MongoRuntimeError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated when the user fails to provide authentication credentials before attempting + * to connect to a mongo server instance. + * + * + * @public + * @category Error + */ +export declare class MongoMissingCredentialsError extends MongoAPIError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated when a required module or dependency is not present in the local environment + * + * @public + * @category Error + */ +export declare class MongoMissingDependencyError extends MongoAPIError { + constructor(message: string); + readonly name: string; +} +/** + * An error indicating an issue with the network, including TCP errors and timeouts. 
+ * @public + * @category Error + */ +export declare class MongoNetworkError extends MongoError { + /* Excluded from this release type: [kBeforeHandshake] */ + constructor(message: string | Error, options?: MongoNetworkErrorOptions); + readonly name: string; +} +/** @public */ +export declare interface MongoNetworkErrorOptions { + /** Indicates the timeout happened before a connection handshake completed */ + beforeHandshake: boolean; +} +/** + * An error indicating a network timeout occurred + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error with an instanceof check + */ +export declare class MongoNetworkTimeoutError extends MongoNetworkError { + constructor(message: string, options?: MongoNetworkErrorOptions); + readonly name: string; +} +/** + * An error thrown when the user attempts to operate on a database or collection through a MongoClient + * that has not yet successfully called the "connect" method + * + * @public + * @category Error + */ +export declare class MongoNotConnectedError extends MongoAPIError { + constructor(message: string); + readonly name: string; +} +/** + * Mongo Client Options + * @public + */ +export declare interface MongoOptions extends Required>, SupportedNodeConnectionOptions { + hosts: HostAddress[]; + srvHost?: string; + credentials?: MongoCredentials; + readPreference: ReadPreference; + readConcern: ReadConcern; + loadBalanced: boolean; + serverApi: ServerApi; + compressors: CompressorName[]; + writeConcern: WriteConcern; + dbName: string; + metadata: ClientMetadata; + autoEncrypter?: AutoEncrypter; + /* Excluded from this release type: connectionType */ + /* Excluded from this release type: encrypter */ + /* Excluded from this release type: userSpecifiedAuthSource */ + /* Excluded from this release type: userSpecifiedReplicaSet */ + /** + * # NOTE ABOUT TLS Options + * + * If set TLS enabled, equivalent to setting the ssl option. + * + * ### Additional options: + * + * | nodejs option | MongoDB equivalent | type | + * |:---------------------|--------------------------------------------------------- |:---------------------------------------| + * | `ca` | `sslCA`, `tlsCAFile` | `string \| Buffer \| Buffer[]` | + * | `crl` | `sslCRL` | `string \| Buffer \| Buffer[]` | + * | `cert` | `sslCert`, `tlsCertificateFile`, `tlsCertificateKeyFile` | `string \| Buffer \| Buffer[]` | + * | `key` | `sslKey`, `tlsCertificateKeyFile` | `string \| Buffer \| KeyObject[]` | + * | `passphrase` | `sslPass`, `tlsCertificateKeyFilePassword` | `string` | + * | `rejectUnauthorized` | `sslValidate` | `boolean` | + * + */ + tls: boolean; + /** + * Turn these options into a reusable connection URI + */ + toURI(): string; +} +/** + * An error used when attempting to parse a value (like a connection string) + * @public + * @category Error + */ +export declare class MongoParseError extends MongoDriverError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated when the driver encounters unexpected input + * or reaches an unexpected/invalid internal state + * + * @privateRemarks + * Should **never** be directly instantiated. + * + * @public + * @category Error + */ +export declare class MongoRuntimeError extends MongoDriverError { + constructor(message: string); + readonly name: string; +} +/** + * An error generated when an attempt is made to operate + * on a closed/closing server. 
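 *
 * Editor's illustration for the error hierarchy in this span, not from the driver docs: most
 * driver errors can be told apart with `instanceof`. A hedged sketch, assuming a collection
 * `coll` obtained from a connected client:
 *
 * ```js
 * const { MongoServerError, MongoNetworkError } = require('mongodb');
 *
 * async function insertTwice(coll) {
 *   try {
 *     await coll.insertOne({ _id: 1 });
 *     await coll.insertOne({ _id: 1 }); // duplicate key
 *   } catch (err) {
 *     if (err instanceof MongoServerError) {
 *       console.log('server error', err.code, err.codeName); // e.g. 11000
 *     } else if (err instanceof MongoNetworkError) {
 *       console.log('network issue', err.message);
 *     }
 *   }
 * }
 * ```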
+ * + * @public + * @category Error + */ +export declare class MongoServerClosedError extends MongoAPIError { + constructor(message?: string); + readonly name: string; +} +/** + * An error coming from the mongo server + * + * @public + * @category Error + */ +export declare class MongoServerError extends MongoError { + codeName?: string; + writeConcernError?: Document; + errInfo?: Document; + ok?: number; + [key: string]: any; + constructor(message: ErrorDescription); + readonly name: string; +} +/** + * An error signifying a client-side server selection error + * @public + * @category Error + */ +export declare class MongoServerSelectionError extends MongoSystemError { + constructor(message: string, reason: TopologyDescription); + readonly name: string; +} +/** + * An error signifying a general system issue + * @public + * @category Error + */ +export declare class MongoSystemError extends MongoError { + /** An optional reason context, such as an error saved during flow of monitoring and selecting servers */ + reason?: TopologyDescription; + constructor(message: string, reason: TopologyDescription); + readonly name: string; +} +/** + * An error generated when an attempt is made to operate on a + * dropped, or otherwise unavailable, database. + * + * @public + * @category Error + */ +export declare class MongoTopologyClosedError extends MongoAPIError { + constructor(message?: string); + readonly name: string; +} +/** + * An error generated when the user makes a mistake in the usage of transactions. + * (e.g. attempting to commit a transaction with a readPreference other than primary) + * + * @public + * @category Error + */ +export declare class MongoTransactionError extends MongoAPIError { + constructor(message: string); + readonly name: string; +} +/** + * An error thrown when the server reports a writeConcernError + * @public + * @category Error + */ +export declare class MongoWriteConcernError extends MongoServerError { + /** The result document (provided if ok: 1) */ + result?: Document; + errInfo?: Document; + constructor(message: ErrorDescription, result?: Document); + readonly name: string; +} +/* Excluded from this release type: Monitor */ +/** @public */ +export declare type MonitorEvents = { + serverHeartbeatStarted(event: ServerHeartbeatStartedEvent): void; + serverHeartbeatSucceeded(event: ServerHeartbeatSucceededEvent): void; + serverHeartbeatFailed(event: ServerHeartbeatFailedEvent): void; + resetServer(error?: Error): void; + resetConnectionPool(): void; + close(): void; +} & EventEmitterWithState; +/** @public */ +export declare interface MonitorOptions extends Pick> { + connectTimeoutMS: number; + heartbeatFrequencyMS: number; + minHeartbeatFrequencyMS: number; +} +/* Excluded from this release type: MonitorPrivate */ +/* Excluded from this release type: Msg */ +/** It avoids using fields with not acceptable types @public */ +export declare type NotAcceptedFields = { + readonly [key in KeysOfOtherType]?: never; +}; +/** @public */ +export declare type NumericType = IntegerType | Decimal128 | Double; +/** + * @public + * @deprecated Please use `ObjectId` + */ +export declare const ObjectID: typeof ObjectId; +export { ObjectId }; +/** @public */ +export declare type OneOrMore = T | ReadonlyArray; +/** @public */ +export declare type OnlyFieldsOfType = IsAny, AcceptedFields & NotAcceptedFields & Record>; +/* Excluded from this release type: OperationDescription */ +/** @public */ +export declare interface OperationOptions extends BSONSerializeOptions { + /** Specify 
ClientSession for this command */ + session?: ClientSession; + willRetryWrites?: boolean; + /** The preferred read preference (ReadPreference.primary, ReadPreference.primary_preferred, ReadPreference.secondary, ReadPreference.secondary_preferred, ReadPreference.nearest). */ + readPreference?: ReadPreferenceLike; +} +/* Excluded from this release type: OperationParent */ +/** + * Represents a specific point in time on a server. Can be retrieved by using {@link Db#command} + * @public + * @remarks + * See {@link https://docs.mongodb.com/manual/reference/method/db.runCommand/#response| Run Command Response} + */ +export declare type OperationTime = Timestamp; +/* Excluded from this release type: OpGetMoreOptions */ +/* Excluded from this release type: OpQueryOptions */ +/** + * Add an optional _id field to an object shaped type + * @public + * + * @privateRemarks + * `ObjectId extends TSchema['_id']` is a confusing ordering at first glance. Rather than ask + * `TSchema['_id'] extends ObjectId` which translated to "Is the _id property ObjectId?" + * we instead ask "Does ObjectId look like (have the same shape) as the _id?" + */ +export declare type OptionalId = ObjectId extends TSchema['_id'] ? EnhancedOmit & { + _id?: InferIdType; +} : WithId; +/** @public */ +export declare class OrderedBulkOperation extends BulkOperationBase { + constructor(collection: Collection, options: BulkWriteOptions); + addToOperationsList(batchType: BatchType, document: Document | UpdateStatement | DeleteStatement): this; +} +/** @public */ +export declare interface PipeOptions { + end?: boolean; +} +/** @public */ +export declare interface PkFactory { + createPk(): any; +} +/** @public */ +export declare const ProfilingLevel: Readonly<{ + readonly off: "off"; + readonly slowOnly: "slow_only"; + readonly all: "all"; +}>; +/** @public */ +export declare type ProfilingLevel = typeof ProfilingLevel[keyof typeof ProfilingLevel]; +/** @public */ +export declare type ProfilingLevelOptions = CommandOperationOptions; +/** + * @public + * Projection is flexible to permit the wide array of aggregation operators + * @deprecated since v4.1.0: Since projections support all aggregation operations we have no plans to narrow this type further + */ +export declare type Projection = Document; +/** + * @public + * @deprecated since v4.1.0: Since projections support all aggregation operations we have no plans to narrow this type further + */ +export declare type ProjectionOperators = Document; +/** + * Global promise store allowing user-provided promises + * @public + */ +declare class Promise_2 { + /** Validates the passed in promise library */ + static validate(lib: unknown): lib is PromiseConstructor; + /** Sets the promise library */ + static set(lib: PromiseConstructor): void; + /** Get the stored promise library, or resolves passed in */ + static get(): PromiseConstructor; +} +export { Promise_2 as Promise }; +/** @public */ +export declare type PullAllOperator = ({ + readonly [key in KeysOfAType>]?: TSchema[key]; +} & NotAcceptedFields>) & { + readonly [key: string]: ReadonlyArray; +}; +/** @public */ +export declare type PullOperator = ({ + readonly [key in KeysOfAType>]?: Partial> | FilterOperations>; +} & NotAcceptedFields>) & { + readonly [key: string]: FilterOperators | any; +}; +/** @public */ +export declare type PushOperator = ({ + readonly [key in KeysOfAType>]?: Flatten | ArrayOperator>>; +} & NotAcceptedFields>) & { + readonly [key: string]: ArrayOperator | any; +}; +/* Excluded from this release type: Query */ +/* 
Excluded from this release type: QueryOptions */ +/** + * The MongoDB ReadConcern, which allows for control of the consistency and isolation properties + * of the data read from replica sets and replica set shards. + * @public + * + * @see https://docs.mongodb.com/manual/reference/read-concern/index.html + */ +export declare class ReadConcern { + level: ReadConcernLevel | string; + /** Constructs a ReadConcern from the read concern level.*/ + constructor(level: ReadConcernLevel); + /** + * Construct a ReadConcern given an options object. + * + * @param options - The options object from which to extract the write concern. + */ + static fromOptions(options?: { + readConcern?: ReadConcernLike; + level?: ReadConcernLevel; + }): ReadConcern | undefined; + static readonly MAJORITY: 'majority'; + static readonly AVAILABLE: 'available'; + static readonly LINEARIZABLE: 'linearizable'; + static readonly SNAPSHOT: 'snapshot'; + toJSON(): Document; +} +/** @public */ +export declare const ReadConcernLevel: Readonly<{ + readonly local: "local"; + readonly majority: "majority"; + readonly linearizable: "linearizable"; + readonly available: "available"; + readonly snapshot: "snapshot"; +}>; +/** @public */ +export declare type ReadConcernLevel = typeof ReadConcernLevel[keyof typeof ReadConcernLevel]; +/** @public */ +export declare type ReadConcernLike = ReadConcern | { + level: ReadConcernLevel; +} | ReadConcernLevel; +/** + * The **ReadPreference** class is a class that represents a MongoDB ReadPreference and is + * used to construct connections. + * @public + * + * @see https://docs.mongodb.com/manual/core/read-preference/ + */ +export declare class ReadPreference { + mode: ReadPreferenceMode; + tags?: TagSet[]; + hedge?: HedgeOptions; + maxStalenessSeconds?: number; + minWireVersion?: number; + static PRIMARY: "primary"; + static PRIMARY_PREFERRED: "primaryPreferred"; + static SECONDARY: "secondary"; + static SECONDARY_PREFERRED: "secondaryPreferred"; + static NEAREST: "nearest"; + static primary: ReadPreference; + static primaryPreferred: ReadPreference; + static secondary: ReadPreference; + static secondaryPreferred: ReadPreference; + static nearest: ReadPreference; + /** + * @param mode - A string describing the read preference mode (primary|primaryPreferred|secondary|secondaryPreferred|nearest) + * @param tags - A tag set used to target reads to members with the specified tag(s). tagSet is not available if using read preference mode primary. + * @param options - Additional read preference options + */ + constructor(mode: ReadPreferenceMode, tags?: TagSet[], options?: ReadPreferenceOptions); + readonly preference: ReadPreferenceMode; + static fromString(mode: string): ReadPreference; + /** + * Construct a ReadPreference given an options object. + * + * @param options - The options object from which to extract the read preference. + */ + static fromOptions(options?: ReadPreferenceFromOptions): ReadPreference | undefined; + /** + * Replaces options.readPreference with a ReadPreference instance + */ + static translate(options: ReadPreferenceLikeOptions): ReadPreferenceLikeOptions; + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. + */ + static isValid(mode: string): boolean; + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. 
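 *
 * Editor's illustration for the ReadPreference / ReadConcern classes above, not from the
 * driver docs: both can be supplied per operation through the options object, assuming a
 * collection `coll`:
 *
 * ```js
 * const { ReadPreference } = require('mongodb');
 *
 * async function openTasks(coll) {
 *   return coll
 *     .find({ done: false }, {
 *       readPreference: ReadPreference.secondaryPreferred,
 *       readConcern: { level: 'majority' },
 *     })
 *     .toArray();
 * }
 * ```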
+ */ + isValid(mode?: string): boolean; + /** + * Indicates that this readPreference needs the "slaveOk" bit when sent over the wire + * + * @see https://docs.mongodb.com/manual/reference/mongodb-wire-protocol/#op-query + */ + slaveOk(): boolean; + /** + * Check if the two ReadPreferences are equivalent + * + * @param readPreference - The read preference with which to check equality + */ + equals(readPreference: ReadPreference): boolean; + /** Return JSON representation */ + toJSON(): Document; +} +/** @public */ +export declare interface ReadPreferenceFromOptions extends ReadPreferenceLikeOptions { + session?: ClientSession; + readPreferenceTags?: TagSet[]; + hedge?: HedgeOptions; +} +/** @public */ +export declare type ReadPreferenceLike = ReadPreference | ReadPreferenceMode; +/** @public */ +export declare interface ReadPreferenceLikeOptions extends ReadPreferenceOptions { + readPreference?: ReadPreferenceLike | { + mode?: ReadPreferenceMode; + preference?: ReadPreferenceMode; + tags?: TagSet[]; + maxStalenessSeconds?: number; + }; +} +/** @public */ +export declare const ReadPreferenceMode: Readonly<{ + readonly primary: "primary"; + readonly primaryPreferred: "primaryPreferred"; + readonly secondary: "secondary"; + readonly secondaryPreferred: "secondaryPreferred"; + readonly nearest: "nearest"; +}>; +/** @public */ +export declare type ReadPreferenceMode = typeof ReadPreferenceMode[keyof typeof ReadPreferenceMode]; +/** @public */ +export declare interface ReadPreferenceOptions { + /** Max secondary read staleness in seconds, Minimum value is 90 seconds.*/ + maxStalenessSeconds?: number; + /** Server mode in which the same query is dispatched in parallel to multiple replica set members. */ + hedge?: HedgeOptions; +} +/** @public */ +export declare type ReduceFunction = (key: TKey, values: TValue[]) => TValue; +/** @public */ +export declare type RegExpOrString = T extends string ? BSONRegExp | RegExp | T : T; +/** @public */ +export declare type RemoveUserOptions = CommandOperationOptions; +/** @public */ +export declare interface RenameOptions extends CommandOperationOptions { + /** Drop the target name collection if it previously exists. */ + dropTarget?: boolean; + /** Unclear */ + new_collection?: boolean; +} +/** @public */ +export declare interface ReplaceOneModel { + /** The filter to limit the replaced document. */ + filter: Filter; + /** The document with which to replace the matched document. */ + replacement: TSchema; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. 
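 *
 * Editor's illustration, not from the driver docs: ReplaceOneModel is one of the entry shapes
 * accepted by `Collection.bulkWrite()` (declared elsewhere in the driver), e.g.:
 *
 * ```js
 * async function replaceOrInsert(coll) {
 *   await coll.bulkWrite([
 *     {
 *       replaceOne: {
 *         filter: { _id: 1 },
 *         replacement: { _id: 1, title: 'rewritten' },
 *         upsert: true,
 *       },
 *     },
 *   ]);
 * }
 * ```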
*/ + upsert?: boolean; +} +/** @public */ +export declare interface ReplaceOptions extends CommandOperationOptions { + /** If true, allows the write to opt-out of document level validation */ + bypassDocumentValidation?: boolean; + /** Specifies a collation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** When true, creates a new document if no document matches the query */ + upsert?: boolean; +} +/** @public */ +export declare interface ResumeOptions { + startAtOperationTime?: Timestamp; + batchSize?: number; + maxAwaitTimeMS?: number; + collation?: CollationOptions; + readPreference?: ReadPreference; +} +/** + * Represents the logical starting point for a new or resuming {@link https://docs.mongodb.com/master/changeStreams/#change-stream-resume-token| Change Stream} on the server. + * @public + */ +export declare type ResumeToken = unknown; +/** @public */ +export declare const ReturnDocument: Readonly<{ + readonly BEFORE: "before"; + readonly AFTER: "after"; +}>; +/** @public */ +export declare type ReturnDocument = typeof ReturnDocument[keyof typeof ReturnDocument]; +/** @public */ +export declare interface RoleSpecification { + /** + * A role grants privileges to perform sets of actions on defined resources. + * A given role applies to the database on which it is defined and can grant access down to a collection level of granularity. + */ + role: string; + /** The database this user's role should effect. */ + db: string; +} +/** @public */ +export declare interface RootFilterOperators extends Document { + $and?: Filter[]; + $nor?: Filter[]; + $or?: Filter[]; + $text?: { + $search: string; + $language?: string; + $caseSensitive?: boolean; + $diacriticSensitive?: boolean; + }; + $where?: string | ((this: TSchema) => boolean); + $comment?: string | Document; +} +/* Excluded from this release type: RTTPinger */ +/* Excluded from this release type: RTTPingerOptions */ +/** @public */ +export declare type RunCommandOptions = CommandOperationOptions; +/** @public */ +export declare type SchemaMember = { + [P in keyof T]?: V; +} | { + [key: string]: V; +}; +/** @public */ +export declare interface SelectServerOptions { + readPreference?: ReadPreferenceLike; + /** How long to block for server selection before throwing an error */ + serverSelectionTimeoutMS?: number; + session?: ClientSession; +} +/* Excluded from this release type: serialize */ +/* Excluded from this release type: Server */ +/** @public */ +export declare interface ServerApi { + version: ServerApiVersion; + strict?: boolean; + deprecationErrors?: boolean; +} +/** @public */ +export declare const ServerApiVersion: Readonly<{ + readonly v1: "1"; +}>; +/** @public */ +export declare type ServerApiVersion = typeof ServerApiVersion[keyof typeof ServerApiVersion]; +/** @public */ +export declare class ServerCapabilities { + maxWireVersion: number; + minWireVersion: number; + constructor(ismaster: Document); + readonly hasAggregationCursor: boolean; + readonly hasWriteCommands: boolean; + readonly hasTextSearch: boolean; + readonly hasAuthCommands: boolean; + readonly hasListCollectionsCommand: boolean; + readonly hasListIndexesCommand: boolean; + readonly supportsSnapshotReads: boolean; + readonly commandsTakeWriteConcern: boolean; + readonly commandsTakeCollation: boolean; +} +/** + * Emitted when server is closed. 
+ * @public + * @category Event + */ +export declare class ServerClosedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; +} +/** + * The client's view of a single server, based on the most recent ismaster outcome. + * + * Internal type, not meant to be directly instantiated + * @public + */ +export declare class ServerDescription { + private _hostAddress; + address: string; + type: ServerType; + hosts: string[]; + passives: string[]; + arbiters: string[]; + tags: TagSet; + error?: MongoError; + topologyVersion?: TopologyVersion; + minWireVersion: number; + maxWireVersion: number; + roundTripTime: number; + lastUpdateTime: number; + lastWriteDate: number; + me?: string; + primary?: string; + setName?: string; + setVersion?: number; + electionId?: ObjectId; + logicalSessionTimeoutMinutes?: number; + $clusterTime?: ClusterTime; + /*Excluded from this release type: __constructor */ + readonly hostAddress: HostAddress; + readonly allHosts: string[]; + /*Is this server available for reads*/ + readonly isReadable: boolean; + /*Is this server data bearing */ + readonly isDataBearing: boolean; + /*Is this server available for writes */ + readonly isWritable: boolean; + readonly host: string; + readonly port: number; + /** + * Determines if another `ServerDescription` is equal to this one per the rules defined + * in the {@link https://github.com/mongodb/specifications/blob/master/source/server-discovery-and-monitoring/server-discovery-and-monitoring.rst#serverdescription|SDAM spec} + */ + equals(other: ServerDescription): boolean; +} +/** + * Emitted when server description changes, but does NOT include changes to the RTT. + * @public + * @category Event + */ +export declare class ServerDescriptionChangedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + /** The previous server description */ + previousDescription: ServerDescription; + /** The new server description */ + newDescription: ServerDescription; +} +/* Excluded from this release type: ServerDescriptionOptions */ +/** @public */ +export declare type ServerEvents = { + serverHeartbeatStarted(event: ServerHeartbeatStartedEvent): void; + serverHeartbeatSucceeded(event: ServerHeartbeatSucceededEvent): void; + serverHeartbeatFailed(event: ServerHeartbeatFailedEvent): void; + /* Excluded from this release type: connect */ + descriptionReceived(description: ServerDescription): void; + closed(): void; + ended(): void; +} & ConnectionPoolEvents & EventEmitterWithState; +/** + * Emitted when the server monitor’s ismaster fails, either with an “ok: 0” or a socket exception. + * @public + * @category Event + */ +export declare class ServerHeartbeatFailedEvent { + /** The connection id for the command */ + connectionId: string; + /** The execution time of the event in ms */ + duration: number; + /** The command failure */ + failure: Error; +} +/** + * Emitted when the server monitor’s ismaster command is started - immediately before + * the ismaster command is serialized into raw BSON and written to the socket. + * + * @public + * @category Event + */ +export declare class ServerHeartbeatStartedEvent { + /** The connection id for the command */ + connectionId: string; +} +/** + * Emitted when the server monitor’s ismaster succeeds. 
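 *
 * Editor's illustration, not from the driver docs: these monitoring events are included in
 * MONGO_CLIENT_EVENTS above, so they can be observed directly on a MongoClient instance
 * `client`:
 *
 * ```js
 * client.on('serverHeartbeatSucceeded', (ev) => {
 *   console.log(`heartbeat ${ev.connectionId} took ${ev.duration} ms`);
 * });
 * client.on('serverHeartbeatFailed', (ev) => {
 *   console.error('heartbeat failed:', ev.failure);
 * });
 * ```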
+ * @public + * @category Event + */ +export declare class ServerHeartbeatSucceededEvent { + /** The connection id for the command */ + connectionId: string; + /** The execution time of the event in ms */ + duration: number; + /** The command reply */ + reply: Document; +} +/** + * Emitted when server is initialized. + * @public + * @category Event + */ +export declare class ServerOpeningEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; +} +/** @public */ +export declare type ServerOptions = Pick> & MonitorOptions; +/* Excluded from this release type: ServerPrivate */ +/* Excluded from this release type: ServerSelectionCallback */ +/* Excluded from this release type: ServerSelectionRequest */ +/** @public */ +export declare type ServerSelector = (topologyDescription: TopologyDescription, servers: ServerDescription[]) => ServerDescription[]; +/** + * Reflects the existence of a session on the server. Can be reused by the session pool. + * WARNING: not meant to be instantiated directly. For internal use only. + * @public + */ +export declare class ServerSession { + id: ServerSessionId; + lastUse: number; + txnNumber: number; + isDirty: boolean; + /* Excluded from this release type: __constructor */ + /** + * Determines if the server session has timed out. + * + * @param sessionTimeoutMinutes - The server's "logicalSessionTimeoutMinutes" + */ + hasTimedOut(sessionTimeoutMinutes: number): boolean; +} +/** @public */ +export declare type ServerSessionId = { + id: Binary; +}; +/* Excluded from this release type: ServerSessionPool */ +/** + * An enumeration of server types we know about + * @public + */ +export declare const ServerType: Readonly<{ + readonly Standalone: "Standalone"; + readonly Mongos: "Mongos"; + readonly PossiblePrimary: "PossiblePrimary"; + readonly RSPrimary: "RSPrimary"; + readonly RSSecondary: "RSSecondary"; + readonly RSArbiter: "RSArbiter"; + readonly RSOther: "RSOther"; + readonly RSGhost: "RSGhost"; + readonly Unknown: "Unknown"; + readonly LoadBalancer: "LoadBalancer"; +}>; +/** @public */ +export declare type ServerType = typeof ServerType[keyof typeof ServerType]; +/** @public */ +export declare type SetFields = ({ + readonly [key in KeysOfAType | undefined>]?: OptionalId> | AddToSetOperators>>>; +} & NotAcceptedFields | undefined>) & { + readonly [key: string]: AddToSetOperators | any; +}; +/** @public */ +export declare type SetProfilingLevelOptions = CommandOperationOptions; +/** @public */ +export declare type Sort = string | Exclude | string[] | { + [key: string]: SortDirection; +} | Map | [ + string, + SortDirection +][] | [ + string, + SortDirection +]; +/** @public */ +export declare type SortDirection = 1 | -1 | 'asc' | 'desc' | 'ascending' | 'descending' | { + $meta: string; +}; +/* Excluded from this release type: SortDirectionForCmd */ +/* Excluded from this release type: SortForCmd */ +/* Excluded from this release type: SrvPoller */ +/* Excluded from this release type: SrvPollerEvents */ +/* Excluded from this release type: SrvPollerOptions */ +/* Excluded from this release type: SrvPollingEvent */ +/** @public */ +export declare type Stream = Socket | TLSSocket; +/** @public */ +export declare class StreamDescription { + address: string; + type: string; + minWireVersion?: number; + maxWireVersion?: number; + maxBsonObjectSize: number; + maxMessageSizeBytes: number; + maxWriteBatchSize: number; + compressors: CompressorName[]; + compressor?: CompressorName; 
+ logicalSessionTimeoutMinutes?: number; + loadBalanced: boolean; + __nodejs_mock_server__?: boolean; + zlibCompressionLevel?: number; + constructor(address: string, options?: StreamDescriptionOptions); + receiveResponse(response: Document): void; +} +/** @public */ +export declare interface StreamDescriptionOptions { + compressors?: CompressorName[]; + logicalSessionTimeoutMinutes?: number; + loadBalanced: boolean; +} +/** @public */ +export declare type SupportedNodeConnectionOptions = SupportedTLSConnectionOptions & SupportedTLSSocketOptions & SupportedSocketOptions; +/** @public */ +export declare type SupportedSocketOptions = Pick; +/** @public */ +export declare type SupportedTLSConnectionOptions = Pick>; +/** @public */ +export declare type SupportedTLSSocketOptions = Pick>; +/** @public */ +export declare type TagSet = { + [key: string]: string; +}; +/* Excluded from this release type: TimerQueue */ +/** @public + * Configuration options for timeseries collections + * @see https://docs.mongodb.com/manual/core/timeseries-collections/ + */ +export declare interface TimeSeriesCollectionOptions extends Document { + timeField: string; + metaField?: string; + granularity?: string; +} +export { Timestamp }; +/* Excluded from this release type: Topology */ +/** + * Emitted when topology is closed. + * @public + * @category Event + */ +export declare class TopologyClosedEvent { + /** A unique identifier for the topology */ + topologyId: number; +} +/** + * Representation of a deployment of servers + * @public + */ +export declare class TopologyDescription { + type: TopologyType; + setName?: string; + maxSetVersion?: number; + maxElectionId?: ObjectId; + servers: Map; + stale: boolean; + compatible: boolean; + compatibilityError?: string; + logicalSessionTimeoutMinutes?: number; + heartbeatFrequencyMS: number; + localThresholdMS: number; + commonWireVersion?: number; + /** + * Create a TopologyDescription + */ + constructor(topologyType: TopologyType, serverDescriptions?: Map, setName?: string, maxSetVersion?: number, maxElectionId?: ObjectId, commonWireVersion?: number, options?: TopologyDescriptionOptions); + /*Excluded from this release type: updateFromSrvPollingEvent + Excluded from this release type: update */ + readonly error: MongoError | undefined; + /* + * Determines if the topology description has any known servers + */ + readonly hasKnownServers: boolean; + /* + * Determines if this topology description has a data-bearing server available. + */ + readonly hasDataBearingServers: boolean; +} +/** + * Emitted when topology description changes. 
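The `TimeSeriesCollectionOptions` interface above corresponds to the `timeseries` option of `createCollection`. A hedged sketch of creating such a collection follows; it assumes a MongoDB 5.0+ server, and the database, collection, and field names (`telemetry`, `readings`, `ts`, `sensor`) are placeholders.

```ts
import { MongoClient } from 'mongodb';

// Assumes a MongoDB 5.0+ server; all names below are illustrative.
async function createReadings(client: MongoClient) {
  await client.db('telemetry').createCollection('readings', {
    timeseries: {
      timeField: 'ts',       // required: the BSON date stored on every measurement
      metaField: 'sensor',   // optional: identifies the series a measurement belongs to
      granularity: 'minutes' // optional: 'seconds' | 'minutes' | 'hours'
    },
    expireAfterSeconds: 60 * 60 * 24 * 30 // optional TTL for old measurements
  });
}
```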
+ * @public + * @category Event + */ +export declare class TopologyDescriptionChangedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The old topology description */ + previousDescription: TopologyDescription; + /** The new topology description */ + newDescription: TopologyDescription; +} +/** @public */ +export declare interface TopologyDescriptionOptions { + heartbeatFrequencyMS?: number; + localThresholdMS?: number; +} +/** @public */ +export declare type TopologyEvents = { + /* Excluded from this release type: connect */ + serverOpening(event: ServerOpeningEvent): void; + serverClosed(event: ServerClosedEvent): void; + serverDescriptionChanged(event: ServerDescriptionChangedEvent): void; + topologyClosed(event: TopologyClosedEvent): void; + topologyOpening(event: TopologyOpeningEvent): void; + topologyDescriptionChanged(event: TopologyDescriptionChangedEvent): void; + error(error: Error): void; + /* Excluded from this release type: open */ + close(): void; + timeout(): void; +} & Pick> & ConnectionPoolEvents & ConnectionEvents & EventEmitterWithState; +/** + * Emitted when topology is initialized. + * @public + * @category Event + */ +export declare class TopologyOpeningEvent { + /** A unique identifier for the topology */ + topologyId: number; +} +/** @public */ +export declare interface TopologyOptions extends BSONSerializeOptions, ServerOptions { + hosts: HostAddress[]; + retryWrites: boolean; + retryReads: boolean; + /** How long to block for server selection before throwing an error */ + serverSelectionTimeoutMS: number; + /** The name of the replica set to connect to */ + replicaSet?: string; + srvHost?: string; + /* Excluded from this release type: srvPoller */ + /** Indicates that a client should directly connect to a node without attempting to discover its topology type */ + directConnection: boolean; + loadBalanced: boolean; + metadata: ClientMetadata; + /** MongoDB server API version */ + serverApi?: ServerApi; +} +/* Excluded from this release type: TopologyPrivate */ +/** + * An enumeration of topology types we know about + * @public + */ +export declare const TopologyType: Readonly<{ + readonly Single: "Single"; + readonly ReplicaSetNoPrimary: "ReplicaSetNoPrimary"; + readonly ReplicaSetWithPrimary: "ReplicaSetWithPrimary"; + readonly Sharded: "Sharded"; + readonly Unknown: "Unknown"; + readonly LoadBalanced: "LoadBalanced"; +}>; +/** @public */ +export declare type TopologyType = typeof TopologyType[keyof typeof TopologyType]; +/** @public */ +export declare interface TopologyVersion { + processId: ObjectId; + counter: Long; +} +/** + * @public + * A class maintaining state related to a server transaction. Internal Only + */ +export declare class Transaction { + /* Excluded from this release type: state */ + options: TransactionOptions; + /*Excluded from this release type: _pinnedServer + Excluded from this release type: _recoveryToken + Excluded from this release type: __constructor + Excluded from this release type: server */ + readonly recoveryToken: Document | undefined; + readonly isPinned: boolean; + /*@returns Whether the transaction has started */ + readonly isStarting: boolean; + /* + * @returns Whether this session is presently in a transaction + */ + readonly isActive: boolean; + readonly isCommitted: boolean; +} +/** + * Configuration options for a transaction. 
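The event classes and the `TopologyEvents` map above surface as SDAM events on a connected `MongoClient`. A small sketch of subscribing to two of them; the connection string is a placeholder.

```ts
import { MongoClient } from 'mongodb';

// SDAM monitoring sketch; the URI is a placeholder.
async function monitor() {
  const client = new MongoClient('mongodb://localhost:27017');

  client.on('serverHeartbeatSucceeded', ev => {
    console.log(`heartbeat to ${ev.connectionId} took ${ev.duration}ms`);
  });
  client.on('topologyDescriptionChanged', ev => {
    console.log(`topology type is now ${ev.newDescription.type}`);
  });

  await client.connect();
}
```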
+ * @public + */ +export declare interface TransactionOptions extends CommandOperationOptions { + /** A default read concern for commands in this transaction */ + readConcern?: ReadConcern; + /** A default writeConcern for commands in this transaction */ + writeConcern?: WriteConcern; + /** A default read preference for commands in this transaction */ + readPreference?: ReadPreference; + maxCommitTimeMS?: number; +} +/* Excluded from this release type: TxnState */ +/** + * Typescript type safe event emitter + * @public + */ +export declare interface TypedEventEmitter extends EventEmitter { + addListener(event: EventKey, listener: Events[EventKey]): this; + addListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + addListener(event: string | symbol, listener: GenericListener): this; + on(event: EventKey, listener: Events[EventKey]): this; + on(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + on(event: string | symbol, listener: GenericListener): this; + once(event: EventKey, listener: Events[EventKey]): this; + once(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + once(event: string | symbol, listener: GenericListener): this; + removeListener(event: EventKey, listener: Events[EventKey]): this; + removeListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + removeListener(event: string | symbol, listener: GenericListener): this; + off(event: EventKey, listener: Events[EventKey]): this; + off(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + off(event: string | symbol, listener: GenericListener): this; + removeAllListeners(event?: EventKey | CommonEvents | symbol | string): this; + listeners(event: EventKey | CommonEvents | symbol | string): Events[EventKey][]; + rawListeners(event: EventKey | CommonEvents | symbol | string): Events[EventKey][]; + emit(event: EventKey | symbol, ...args: Parameters): boolean; + listenerCount(type: EventKey | CommonEvents | symbol | string): number; + prependListener(event: EventKey, listener: Events[EventKey]): this; + prependListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + prependListener(event: string | symbol, listener: GenericListener): this; + prependOnceListener(event: EventKey, listener: Events[EventKey]): this; + prependOnceListener(event: CommonEvents, listener: (eventName: string | symbol, listener: GenericListener) => void): this; + prependOnceListener(event: string | symbol, listener: GenericListener): this; + eventNames(): string[]; + getMaxListeners(): number; + setMaxListeners(n: number): this; +} +/** + * Typescript type safe event emitter + * @public + */ +export declare class TypedEventEmitter extends EventEmitter { +} +/** @public */ +export declare class UnorderedBulkOperation extends BulkOperationBase { + constructor(collection: Collection, options: BulkWriteOptions); + handleWriteError(callback: Callback, writeResult: BulkWriteResult): boolean | undefined; + addToOperationsList(batchType: BatchType, document: Document | UpdateStatement | DeleteStatement): this; +} +/** @public */ +export declare interface UpdateDescription { + /** + * A document containing key:value pairs of names of the fields that were + * changed, and the new value for those fields. 
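`TransactionOptions` is the second argument accepted by `ClientSession.withTransaction`. The following is a sketch only, assuming a replica set or sharded deployment (transactions are not available on a standalone server); the collection and field names are illustrative.

```ts
import { MongoClient, ObjectId, ReadConcern, ReadPreference, WriteConcern } from 'mongodb';

// Moves a subtask between parent tasks inside one transaction; names are illustrative.
async function moveSubtask(client: MongoClient, subtaskId: string, toTaskId: string) {
  const session = client.startSession();
  try {
    await session.withTransaction(
      async () => {
        const db = client.db('myFirstDatabase');
        await db.collection('subtasks').updateOne(
          { _id: new ObjectId(subtaskId) },
          { $set: { parent: new ObjectId(toTaskId) } },
          { session }
        );
        await db.collection('tasks').updateOne(
          { _id: new ObjectId(toTaskId) },
          { $inc: { subtaskCount: 1 } },
          { session }
        );
      },
      {
        // TransactionOptions from the declaration above
        readConcern: new ReadConcern('snapshot'),
        writeConcern: new WriteConcern('majority'),
        readPreference: ReadPreference.primary,
        maxCommitTimeMS: 10_000
      }
    );
  } finally {
    await session.endSession();
  }
}
```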
+ */ + updatedFields: Partial; + /** + * An array of field names that were removed from the document. + */ + removedFields: string[]; +} +/** @public */ +export declare type UpdateFilter = { + $currentDate?: OnlyFieldsOfType; + $inc?: OnlyFieldsOfType; + $min?: MatchKeysAndValues; + $max?: MatchKeysAndValues; + $mul?: OnlyFieldsOfType; + $rename?: Record; + $set?: MatchKeysAndValues; + $setOnInsert?: MatchKeysAndValues; + $unset?: OnlyFieldsOfType; + $addToSet?: SetFields; + $pop?: OnlyFieldsOfType, 1 | -1>; + $pull?: PullOperator; + $push?: PushOperator; + $pullAll?: PullAllOperator; + $bit?: OnlyFieldsOfType; +} & Document; +/** @public */ +export declare interface UpdateManyModel { + /** The filter to limit the updated documents. */ + filter: Filter; + /** A document or pipeline containing update operators. */ + update: UpdateFilter | UpdateFilter[]; + /** A set of filters specifying to which array elements an update should apply. */ + arrayFilters?: Document[]; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. */ + upsert?: boolean; +} +/** @public */ +export declare interface UpdateOneModel { + /** The filter to limit the updated documents. */ + filter: Filter; + /** A document or pipeline containing update operators. */ + update: UpdateFilter | UpdateFilter[]; + /** A set of filters specifying to which array elements an update should apply. */ + arrayFilters?: Document[]; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. */ + upsert?: boolean; +} +/** @public */ +export declare interface UpdateOptions extends CommandOperationOptions { + /** A set of filters specifying to which array elements an update should apply */ + arrayFilters?: Document[]; + /** If true, allows the write to opt-out of document level validation */ + bypassDocumentValidation?: boolean; + /** Specifies a collation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** When true, creates a new document if no document matches the query */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} +/** @public */ +export declare interface UpdateResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The number of documents that matched the filter */ + matchedCount: number; + /** The number of documents that were modified */ + modifiedCount: number; + /** The number of documents that were upserted */ + upsertedCount: number; + /** The identifier of the inserted document if an upsert took place */ + upsertedId: ObjectId; +} +/** @public */ +export declare interface UpdateStatement { + /** The query that matches documents to update. */ + q: Document; + /** The modifications to apply. */ + u: Document | Document[]; + /** If true, perform an insert if no documents match the query. */ + upsert?: boolean; + /** If true, updates all documents that meet the query criteria. 
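`UpdateFilter`, `UpdateOptions`, and `UpdateResult` come together in the ordinary `updateOne`/`updateMany` helpers. A brief sketch with an illustrative `tasks` collection:

```ts
import { Collection, Document, UpdateFilter } from 'mongodb';

// Collection and field names are illustrative.
async function completeTask(tasks: Collection<Document>, title: string) {
  const update: UpdateFilter<Document> = {
    $set: { completed: true },
    $currentDate: { completedAt: true },
    $inc: { revision: 1 }
  };
  const result = await tasks.updateOne({ title }, update, { upsert: false });
  // result is an UpdateResult: acknowledged, matchedCount, modifiedCount, upsertedCount, upsertedId
  console.log(result.matchedCount, result.modifiedCount);
  return result;
}
```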
*/ + multi?: boolean; + /** Specifies the collation to use for the operation. */ + collation?: CollationOptions; + /** An array of filter documents that determines which array elements to modify for an update operation on an array field. */ + arrayFilters?: Document[]; + /** A document or string that specifies the index to use to support the query predicate. */ + hint?: Hint; +} +/** @public */ +export declare interface ValidateCollectionOptions extends CommandOperationOptions { + /** Validates a collection in the background, without interrupting read or write traffic (only in MongoDB 4.4+) */ + background?: boolean; +} +/** @public */ +export declare type W = number | 'majority'; +/* Excluded from this release type: WaitQueueMember */ +/** @public */ +export declare interface WiredTigerData extends Document { + LSM: { + 'bloom filter false positives': number; + 'bloom filter hits': number; + 'bloom filter misses': number; + 'bloom filter pages evicted from cache': number; + 'bloom filter pages read into cache': number; + 'bloom filters in the LSM tree': number; + 'chunks in the LSM tree': number; + 'highest merge generation in the LSM tree': number; + 'queries that could have benefited from a Bloom filter that did not exist': number; + 'sleep for LSM checkpoint throttle': number; + 'sleep for LSM merge throttle': number; + 'total size of bloom filters': number; + } & Document; + 'block-manager': { + 'allocations requiring file extension': number; + 'blocks allocated': number; + 'blocks freed': number; + 'checkpoint size': number; + 'file allocation unit size': number; + 'file bytes available for reuse': number; + 'file magic number': number; + 'file major version number': number; + 'file size in bytes': number; + 'minor version number': number; + }; + btree: { + 'btree checkpoint generation': number; + 'column-store fixed-size leaf pages': number; + 'column-store internal pages': number; + 'column-store variable-size RLE encoded values': number; + 'column-store variable-size deleted values': number; + 'column-store variable-size leaf pages': number; + 'fixed-record size': number; + 'maximum internal page key size': number; + 'maximum internal page size': number; + 'maximum leaf page key size': number; + 'maximum leaf page size': number; + 'maximum leaf page value size': number; + 'maximum tree depth': number; + 'number of key/value pairs': number; + 'overflow pages': number; + 'pages rewritten by compaction': number; + 'row-store internal pages': number; + 'row-store leaf pages': number; + } & Document; + cache: { + 'bytes currently in the cache': number; + 'bytes read into cache': number; + 'bytes written from cache': number; + 'checkpoint blocked page eviction': number; + 'data source pages selected for eviction unable to be evicted': number; + 'hazard pointer blocked page eviction': number; + 'in-memory page passed criteria to be split': number; + 'in-memory page splits': number; + 'internal pages evicted': number; + 'internal pages split during eviction': number; + 'leaf pages split during eviction': number; + 'modified pages evicted': number; + 'overflow pages read into cache': number; + 'overflow values cached in memory': number; + 'page split during eviction deepened the tree': number; + 'page written requiring lookaside records': number; + 'pages read into cache': number; + 'pages read into cache requiring lookaside entries': number; + 'pages requested from the cache': number; + 'pages written from cache': number; + 'pages written requiring in-memory restoration': number; + 
'tracked dirty bytes in the cache': number; + 'unmodified pages evicted': number; + } & Document; + cache_walk: { + 'Average difference between current eviction generation when the page was last considered': number; + 'Average on-disk page image size seen': number; + 'Clean pages currently in cache': number; + 'Current eviction generation': number; + 'Dirty pages currently in cache': number; + 'Entries in the root page': number; + 'Internal pages currently in cache': number; + 'Leaf pages currently in cache': number; + 'Maximum difference between current eviction generation when the page was last considered': number; + 'Maximum page size seen': number; + 'Minimum on-disk page image size seen': number; + 'On-disk page image sizes smaller than a single allocation unit': number; + 'Pages created in memory and never written': number; + 'Pages currently queued for eviction': number; + 'Pages that could not be queued for eviction': number; + 'Refs skipped during cache traversal': number; + 'Size of the root page': number; + 'Total number of pages currently in cache': number; + } & Document; + compression: { + 'compressed pages read': number; + 'compressed pages written': number; + 'page written failed to compress': number; + 'page written was too small to compress': number; + 'raw compression call failed, additional data available': number; + 'raw compression call failed, no additional data available': number; + 'raw compression call succeeded': number; + } & Document; + cursor: { + 'bulk-loaded cursor-insert calls': number; + 'create calls': number; + 'cursor-insert key and value bytes inserted': number; + 'cursor-remove key bytes removed': number; + 'cursor-update value bytes updated': number; + 'insert calls': number; + 'next calls': number; + 'prev calls': number; + 'remove calls': number; + 'reset calls': number; + 'restarted searches': number; + 'search calls': number; + 'search near calls': number; + 'truncate calls': number; + 'update calls': number; + }; + reconciliation: { + 'dictionary matches': number; + 'fast-path pages deleted': number; + 'internal page key bytes discarded using suffix compression': number; + 'internal page multi-block writes': number; + 'internal-page overflow keys': number; + 'leaf page key bytes discarded using prefix compression': number; + 'leaf page multi-block writes': number; + 'leaf-page overflow keys': number; + 'maximum blocks required for a page': number; + 'overflow values written': number; + 'page checksum matches': number; + 'page reconciliation calls': number; + 'page reconciliation calls for eviction': number; + 'pages deleted': number; + } & Document; +} +/* Excluded from this release type: WithConnectionCallback */ +/** Add an _id field to an object shaped type @public */ +export declare type WithId = EnhancedOmit & { + _id: InferIdType; +}; +/** Remove the _id field from an object shaped type @public */ +export declare type WithoutId = Pick>; +/** @public */ +export declare type WithSessionCallback = (session: ClientSession) => Promise | void; +/** @public */ +export declare type WithTransactionCallback = (session: ClientSession) => Promise; +/** + * A MongoDB WriteConcern, which describes the level of acknowledgement + * requested from MongoDB for write operations. + * @public + * + * @see https://docs.mongodb.com/manual/reference/write-concern/ + */ +export declare class WriteConcern { + /** request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. 
*/ + w?: W; + /** specify a time limit to prevent write operations from blocking indefinitely */ + wtimeout?: number; + /** request acknowledgment that the write operation has been written to the on-disk journal */ + j?: boolean; + /** equivalent to the j option */ + fsync?: boolean | 1; + /** + * Constructs a WriteConcern from the write concern properties. + * @param w - request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. + * @param wtimeout - specify a time limit to prevent write operations from blocking indefinitely + * @param j - request acknowledgment that the write operation has been written to the on-disk journal + * @param fsync - equivalent to the j option + */ + constructor(w?: W, wtimeout?: number, j?: boolean, fsync?: boolean | 1); + /** Construct a WriteConcern given an options object. */ + static fromOptions(options?: WriteConcernOptions | WriteConcern | W, inherit?: WriteConcernOptions | WriteConcern): WriteConcern | undefined; +} +/** + * An error representing a failure by the server to apply the requested write concern to the bulk operation. + * @public + * @category Error + */ +export declare class WriteConcernError { + /* Excluded from this release type: [kServerError] */ + constructor(error: WriteConcernErrorData); + /*Write concern error code. */ + readonly code: number | undefined; + /*Write concern error message. */ + readonly errmsg: string | undefined; + /*Write concern error info. */ + readonly errInfo: Document | undefined; + /*@deprecated The `err` prop that contained a MongoServerError has been deprecated. */ + readonly err: WriteConcernErrorData; + toJSON(): WriteConcernErrorData; + toString(): string; +} +/** @public */ +export declare interface WriteConcernErrorData { + code: number; + errmsg: string; + errInfo?: Document; +} +/** @public */ +export declare interface WriteConcernOptions { + /** Write Concern as an object */ + writeConcern?: WriteConcern | WriteConcernSettings; +} +/** @public */ +export declare interface WriteConcernSettings { + /** The write concern */ + w?: W; + /** The write concern timeout */ + wtimeoutMS?: number; + /** The journal write concern */ + journal?: boolean; + /** The journal write concern */ + j?: boolean; + /** The write concern timeout */ + wtimeout?: number; + /** The file sync write concern */ + fsync?: boolean | 1; +} +/** + * An error that occurred during a BulkWrite on the server. + * @public + * @category Error + */ +export declare class WriteError { + err: BulkWriteOperationError; + constructor(err: BulkWriteOperationError); + /*WriteError code. */ + readonly code: number; + /*WriteError original bulk operation index. */ + readonly index: number; + /*WriteError message. */ + readonly errmsg: string | undefined; + /*WriteError details. 
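The `WriteConcern` class and the looser `WriteConcernSettings` shape can both be supplied wherever `WriteConcernOptions` is accepted, for example on a `Db` or on a single operation. A sketch with placeholder database and collection names:

```ts
import { MongoClient, WriteConcern } from 'mongodb';

// Two common ways to apply a write concern; names are placeholders.
async function save(client: MongoClient) {
  // 1. As plain WriteConcernSettings on a Db:
  const db = client.db('myFirstDatabase', { writeConcern: { w: 'majority', wtimeoutMS: 5000 } });

  // 2. As an explicit WriteConcern instance (w, wtimeout, j) on one operation:
  const wc = new WriteConcern('majority', 5000, true);
  await db.collection('tasks').insertOne({ title: 'demo' }, { writeConcern: wc });
}
```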
*/ + readonly errInfo: Document | undefined; + /** Returns the underlying operation that caused the error */ + getOperation(): Document; + toJSON(): { + code: number; + index: number; + errmsg?: string; + op: Document; + }; + toString(): string; +} +/* Excluded from this release type: WriteProtocolMessageType */ +export {}; diff --git a/node_modules/mongodb/package.json b/node_modules/mongodb/package.json new file mode 100644 index 000000000..4647cf286 --- /dev/null +++ b/node_modules/mongodb/package.json @@ -0,0 +1,156 @@ +{ + "_from": "mongodb@4.1.2", + "_id": "mongodb@4.1.2", + "_inBundle": false, + "_integrity": "sha512-pHCKDoOy1h6mVurziJmXmTMPatYWOx8pbnyFgSgshja9Y36Q+caHUzTDY6rrIy9HCSrjnbXmx3pCtvNZHmR8xg==", + "_location": "/mongodb", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "mongodb@4.1.2", + "name": "mongodb", + "escapedName": "mongodb", + "rawSpec": "4.1.2", + "saveSpec": null, + "fetchSpec": "4.1.2" + }, + "_requiredBy": [ + "/mongoose" + ], + "_resolved": "https://registry.npmjs.org/mongodb/-/mongodb-4.1.2.tgz", + "_shasum": "36ab494db3a9a827df41ccb0d9b36a94bfeae8d7", + "_spec": "mongodb@4.1.2", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "The MongoDB NodeJS Team", + "email": "dbx-node@mongodb.com" + }, + "bugs": { + "url": "https://jira.mongodb.org/projects/NODE/issues/" + }, + "bundleDependencies": false, + "dependencies": { + "bson": "^4.5.2", + "denque": "^2.0.1", + "mongodb-connection-string-url": "^2.0.0", + "saslprep": "^1.0.3" + }, + "deprecated": false, + "description": "The official MongoDB driver for Node.js", + "devDependencies": { + "@istanbuljs/nyc-config-typescript": "^1.0.1", + "@microsoft/api-extractor": "^7.18.7", + "@microsoft/tsdoc-config": "^0.15.2", + "@types/chai": "^4.2.21", + "@types/chai-subset": "^1.3.3", + "@types/kerberos": "^1.1.1", + "@types/mocha": "^9.0.0", + "@types/node": "^16.7.13", + "@types/saslprep": "^1.0.1", + "@types/semver": "^7.3.8", + "@types/whatwg-url": "^8.2.1", + "@typescript-eslint/eslint-plugin": "^4.31.0", + "@typescript-eslint/parser": "^4.31.0", + "bluebird": "^3.7.2", + "chai": "^4.3.4", + "chai-subset": "^1.6.0", + "chalk": "^4.1.2", + "co": "4.6.0", + "downlevel-dts": "^0.7.0", + "eslint": "^7.32.0", + "eslint-config-prettier": "^8.3.0", + "eslint-plugin-jsdoc": "^36.1.0", + "eslint-plugin-prettier": "^4.0.0", + "eslint-plugin-tsdoc": "^0.2.14", + "js-yaml": "^4.1.0", + "jsdoc": "^3.6.7", + "lodash.camelcase": "^4.3.0", + "mocha": "^9.1.1", + "mocha-sinon": "^2.1.2", + "mongodb-mock-server": "^2.0.1", + "nyc": "^15.1.0", + "prettier": "^2.3.2", + "rimraf": "^3.0.2", + "semver": "^7.3.5", + "sinon": "^11.1.2", + "sinon-chai": "^3.7.0", + "source-map-support": "^0.5.19", + "standard-version": "^9.3.1", + "ts-node": "^10.2.1", + "tsd": "^0.17.0", + "typedoc": "^0.21.9", + "typescript": "4.3.5", + "typescript-cached-transpile": "^0.0.6", + "xml2js": "^0.4.23", + "yargs": "^17.1.1" + }, + "engines": { + "node": ">=12.9.0" + }, + "files": [ + "lib", + "src", + "etc/prepare.js", + "mongodb.d.ts", + "mongodb.ts34.d.ts" + ], + "homepage": "https://github.com/mongodb/node-mongodb-native", + "keywords": [ + "mongodb", + "driver", + "official" + ], + "license": "Apache-2.0", + "main": "lib/index.js", + "name": "mongodb", + "optionalDependencies": { + "saslprep": "^1.0.3" + }, + "repository": { + "type": "git", + "url": "git+ssh://git@github.com/mongodb/node-mongodb-native.git" + }, + "scripts": { + 
"build:docs": "typedoc", + "build:dts": "npm run build:ts && api-extractor run && rimraf 'lib/**/*.d.ts*' && downlevel-dts mongodb.d.ts mongodb.ts34.d.ts", + "build:evergreen": "node .evergreen/generate_evergreen_tasks.js", + "build:ts": "rimraf lib && tsc", + "check:adl": "mocha test/manual/data_lake.test.js", + "check:atlas": "mocha --config \"test/manual/mocharc.json\" test/manual/atlas_connectivity.test.js", + "check:bench": "node test/benchmarks/driverBench", + "check:coverage": "nyc npm run check:test", + "check:csfle": "mocha test/functional/client_side_encryption", + "check:dts": "tsc --noEmit mongodb.d.ts && tsd", + "check:eslint": "eslint -v && eslint --max-warnings=0 --ext '.js,.ts' src test", + "check:kerberos": "mocha --config \"test/manual/mocharc.json\" test/manual/kerberos.test.js", + "check:ldap": "mocha --config \"test/manual/mocharc.json\" test/manual/ldap.test.js", + "check:lint": "npm run build:dts && npm run check:dts && npm run check:eslint && npm run check:tsd", + "check:ocsp": "mocha --config \"test/manual/mocharc.json\" test/manual/ocsp_support.test.js", + "check:test": "mocha --recursive test/functional test/unit", + "check:tls": "mocha --config \"test/manual/mocharc.json\" test/manual/tls_support.test.js", + "check:ts": "tsc -v && tsc --noEmit", + "check:tsd": "tsd --version && tsd", + "prepare": "node etc/prepare.js", + "release": "standard-version -i HISTORY.md", + "test": "npm run check:lint && npm run check:test" + }, + "tsd": { + "directory": "test/types", + "compilerOptions": { + "strict": true, + "target": "esnext", + "module": "commonjs", + "moduleResolution": "node" + } + }, + "types": "mongodb.d.ts", + "typesVersions": { + "<=4.0.2": { + "mongodb.d.ts": [ + "mongodb.ts34.d.ts" + ] + } + }, + "version": "4.1.2" +} diff --git a/node_modules/mongodb/src/admin.ts b/node_modules/mongodb/src/admin.ts new file mode 100644 index 000000000..30093cca4 --- /dev/null +++ b/node_modules/mongodb/src/admin.ts @@ -0,0 +1,314 @@ +import { AddUserOperation, AddUserOptions } from './operations/add_user'; +import { RemoveUserOperation, RemoveUserOptions } from './operations/remove_user'; +import { + ValidateCollectionOperation, + ValidateCollectionOptions +} from './operations/validate_collection'; +import { + ListDatabasesOperation, + ListDatabasesOptions, + ListDatabasesResult +} from './operations/list_databases'; +import { executeOperation } from './operations/execute_operation'; +import { RunCommandOperation, RunCommandOptions } from './operations/run_command'; +import { Callback, getTopology } from './utils'; +import type { Document } from './bson'; +import type { CommandOperationOptions } from './operations/command'; +import type { Db } from './db'; + +/** @internal */ +export interface AdminPrivate { + db: Db; +} + +/** + * The **Admin** class is an internal class that allows convenient access to + * the admin functionality and commands for MongoDB. 
+ * + * **ADMIN Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Use the admin database for the operation + * const adminDb = client.db(dbName).admin(); + * + * // List all the available databases + * adminDb.listDatabases(function(err, dbs) { + * expect(err).to.not.exist; + * test.ok(dbs.databases.length > 0); + * client.close(); + * }); + * }); + * ``` + */ +export class Admin { + /** @internal */ + s: AdminPrivate; + + /** + * Create a new Admin instance + * @internal + */ + constructor(db: Db) { + this.s = { db }; + } + + /** + * Execute a command + * + * @param command - The command to execute + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + command(command: Document): Promise; + command(command: Document, callback: Callback): void; + command(command: Document, options: RunCommandOptions): Promise; + command(command: Document, options: RunCommandOptions, callback: Callback): void; + command( + command: Document, + options?: RunCommandOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = Object.assign({ dbName: 'admin' }, options); + + return executeOperation( + getTopology(this.s.db), + new RunCommandOperation(this.s.db, command, options), + callback + ); + } + + /** + * Retrieve the server build information + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + buildInfo(): Promise; + buildInfo(callback: Callback): void; + buildInfo(options: CommandOperationOptions): Promise; + buildInfo(options: CommandOperationOptions, callback: Callback): void; + buildInfo( + options?: CommandOperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + return this.command({ buildinfo: 1 }, options, callback as Callback); + } + + /** + * Retrieve the server build information + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + serverInfo(): Promise; + serverInfo(callback: Callback): void; + serverInfo(options: CommandOperationOptions): Promise; + serverInfo(options: CommandOperationOptions, callback: Callback): void; + serverInfo( + options?: CommandOperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + return this.command({ buildinfo: 1 }, options, callback as Callback); + } + + /** + * Retrieve this db's server status. 
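The `Admin` helpers are reached through `Db#admin()` and, like the rest of the driver, return promises when no callback is passed. A short sketch of a few of them; exactly which fields `serverStatus()` returns depends on the server version.

```ts
import { MongoClient } from 'mongodb';

// Promise-based usage of a few Admin helpers.
async function checkServer(client: MongoClient) {
  const admin = client.db().admin();
  await admin.ping();
  const { databases } = await admin.listDatabases({ nameOnly: true });
  console.log(databases.map(d => d.name));
  const status = await admin.serverStatus();
  console.log(status.version, status.uptime);
}
```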
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + serverStatus(): Promise; + serverStatus(callback: Callback): void; + serverStatus(options: CommandOperationOptions): Promise; + serverStatus(options: CommandOperationOptions, callback: Callback): void; + serverStatus( + options?: CommandOperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + return this.command({ serverStatus: 1 }, options, callback as Callback); + } + + /** + * Ping the MongoDB server and retrieve results + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + ping(): Promise; + ping(callback: Callback): void; + ping(options: CommandOperationOptions): Promise; + ping(options: CommandOperationOptions, callback: Callback): void; + ping( + options?: CommandOperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + return this.command({ ping: 1 }, options, callback as Callback); + } + + /** + * Add a user to the database + * + * @param username - The username for the new user + * @param password - An optional password for the new user + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + addUser(username: string): Promise; + addUser(username: string, callback: Callback): void; + addUser(username: string, password: string): Promise; + addUser(username: string, password: string, callback: Callback): void; + addUser(username: string, options: AddUserOptions): Promise; + addUser(username: string, options: AddUserOptions, callback: Callback): void; + addUser(username: string, password: string, options: AddUserOptions): Promise; + addUser( + username: string, + password: string, + options: AddUserOptions, + callback: Callback + ): void; + addUser( + username: string, + password?: string | AddUserOptions | Callback, + options?: AddUserOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof password === 'function') { + (callback = password), (password = undefined), (options = {}); + } else if (typeof password !== 'string') { + if (typeof options === 'function') { + (callback = options), (options = password), (password = undefined); + } else { + (options = password), (callback = undefined), (password = undefined); + } + } else { + if (typeof options === 'function') (callback = options), (options = {}); + } + + options = Object.assign({ dbName: 'admin' }, options); + + return executeOperation( + getTopology(this.s.db), + new AddUserOperation(this.s.db, username, password, options), + callback + ); + } + + /** + * Remove a user from a database + * + * @param username - The username to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + removeUser(username: string): Promise; + removeUser(username: string, callback: Callback): void; + removeUser(username: string, options: RemoveUserOptions): Promise; + removeUser(username: string, options: RemoveUserOptions, callback: Callback): void; + removeUser( + username: string, + options?: RemoveUserOptions | Callback, + callback?: 
Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = Object.assign({ dbName: 'admin' }, options); + + return executeOperation( + getTopology(this.s.db), + new RemoveUserOperation(this.s.db, username, options), + callback + ); + } + + /** + * Validate an existing collection + * + * @param collectionName - The name of the collection to validate. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + validateCollection(collectionName: string): Promise; + validateCollection(collectionName: string, callback: Callback): void; + validateCollection(collectionName: string, options: ValidateCollectionOptions): Promise; + validateCollection( + collectionName: string, + options: ValidateCollectionOptions, + callback: Callback + ): void; + validateCollection( + collectionName: string, + options?: ValidateCollectionOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + return executeOperation( + getTopology(this.s.db), + new ValidateCollectionOperation(this, collectionName, options), + callback + ); + } + + /** + * List the available databases + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + listDatabases(): Promise; + listDatabases(callback: Callback): void; + listDatabases(options: ListDatabasesOptions): Promise; + listDatabases(options: ListDatabasesOptions, callback: Callback): void; + listDatabases( + options?: ListDatabasesOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + return executeOperation( + getTopology(this.s.db), + new ListDatabasesOperation(this.s.db, options), + callback + ); + } + + /** + * Get ReplicaSet status + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + replSetGetStatus(): Promise; + replSetGetStatus(callback: Callback): void; + replSetGetStatus(options: CommandOperationOptions): Promise; + replSetGetStatus(options: CommandOperationOptions, callback: Callback): void; + replSetGetStatus( + options?: CommandOperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? 
{}; + return this.command({ replSetGetStatus: 1 }, options, callback as Callback); + } +} diff --git a/node_modules/mongodb/src/bson.ts b/node_modules/mongodb/src/bson.ts new file mode 100644 index 000000000..3b445109d --- /dev/null +++ b/node_modules/mongodb/src/bson.ts @@ -0,0 +1,103 @@ +import type { + serialize as serializeFn, + deserialize as deserializeFn, + calculateObjectSize as calculateObjectSizeFn +} from 'bson'; + +// eslint-disable-next-line @typescript-eslint/no-var-requires +let BSON = require('bson'); +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + BSON = require('bson-ext'); +} catch {} // eslint-disable-line + +/** @internal */ +export const deserialize = BSON.deserialize as typeof deserializeFn; +/** @internal */ +export const serialize = BSON.serialize as typeof serializeFn; +/** @internal */ +export const calculateObjectSize = BSON.calculateObjectSize as typeof calculateObjectSizeFn; + +export { + Long, + Binary, + ObjectId, + Timestamp, + Code, + MinKey, + MaxKey, + Decimal128, + Int32, + Double, + DBRef, + BSONRegExp, + BSONSymbol, + Map, + Document +} from 'bson'; + +import type { DeserializeOptions, SerializeOptions } from 'bson'; + +/** + * BSON Serialization options. + * @public + */ +export interface BSONSerializeOptions + extends Omit, + Omit< + DeserializeOptions, + | 'evalFunctions' + | 'cacheFunctions' + | 'cacheFunctionsCrc32' + | 'allowObjectSmallerThanBufferSize' + | 'index' + > { + /** Return BSON filled buffers from operations */ + raw?: boolean; +} + +export function pluckBSONSerializeOptions(options: BSONSerializeOptions): BSONSerializeOptions { + const { + fieldsAsRaw, + promoteValues, + promoteBuffers, + promoteLongs, + serializeFunctions, + ignoreUndefined, + bsonRegExp, + raw + } = options; + return { + fieldsAsRaw, + promoteValues, + promoteBuffers, + promoteLongs, + serializeFunctions, + ignoreUndefined, + bsonRegExp, + raw + }; +} + +/** + * Merge the given BSONSerializeOptions, preferring options over the parent's options, and + * substituting defaults for values not set. + * + * @internal + */ +export function resolveBSONOptions( + options?: BSONSerializeOptions, + parent?: { bsonOptions?: BSONSerializeOptions } +): BSONSerializeOptions { + const parentOptions = parent?.bsonOptions; + return { + raw: options?.raw ?? parentOptions?.raw ?? false, + promoteLongs: options?.promoteLongs ?? parentOptions?.promoteLongs ?? true, + promoteValues: options?.promoteValues ?? parentOptions?.promoteValues ?? true, + promoteBuffers: options?.promoteBuffers ?? parentOptions?.promoteBuffers ?? false, + ignoreUndefined: options?.ignoreUndefined ?? parentOptions?.ignoreUndefined ?? false, + bsonRegExp: options?.bsonRegExp ?? parentOptions?.bsonRegExp ?? false, + serializeFunctions: options?.serializeFunctions ?? parentOptions?.serializeFunctions ?? false, + fieldsAsRaw: options?.fieldsAsRaw ?? parentOptions?.fieldsAsRaw ?? 
{} + }; +} diff --git a/node_modules/mongodb/src/bulk/common.ts b/node_modules/mongodb/src/bulk/common.ts new file mode 100644 index 000000000..4afb06662 --- /dev/null +++ b/node_modules/mongodb/src/bulk/common.ts @@ -0,0 +1,1340 @@ +import { PromiseProvider } from '../promise_provider'; +import { Long, ObjectId, Document, BSONSerializeOptions, resolveBSONOptions } from '../bson'; +import { + MongoWriteConcernError, + AnyError, + MONGODB_ERROR_CODES, + MongoServerError, + MongoInvalidArgumentError, + MongoBatchReExecutionError +} from '../error'; +import { + applyRetryableWrites, + executeLegacyOperation, + hasAtomicOperators, + Callback, + MongoDBNamespace, + getTopology, + resolveOptions +} from '../utils'; +import { executeOperation } from '../operations/execute_operation'; +import { InsertOperation } from '../operations/insert'; +import { UpdateOperation, UpdateStatement, makeUpdateStatement } from '../operations/update'; +import { DeleteOperation, DeleteStatement, makeDeleteStatement } from '../operations/delete'; +import { WriteConcern } from '../write_concern'; +import type { Collection } from '../collection'; +import type { Topology } from '../sdam/topology'; +import type { CommandOperationOptions, CollationOptions } from '../operations/command'; +import type { Hint } from '../operations/operation'; +import type { Filter, OneOrMore, OptionalId, UpdateFilter } from '../mongo_types'; + +/** @internal */ +const kServerError = Symbol('serverError'); + +/** @public */ +export const BatchType = Object.freeze({ + INSERT: 1, + UPDATE: 2, + DELETE: 3 +} as const); + +/** @public */ +export type BatchType = typeof BatchType[keyof typeof BatchType]; + +/** @public */ +export interface InsertOneModel { + /** The document to insert. */ + document: OptionalId; +} + +/** @public */ +export interface DeleteOneModel { + /** The filter to limit the deleted documents. */ + filter: Filter; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; +} + +/** @public */ +export interface DeleteManyModel { + /** The filter to limit the deleted documents. */ + filter: Filter; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; +} + +/** @public */ +export interface ReplaceOneModel { + /** The filter to limit the replaced document. */ + filter: Filter; + /** The document with which to replace the matched document. */ + replacement: TSchema; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. */ + upsert?: boolean; +} + +/** @public */ +export interface UpdateOneModel { + /** The filter to limit the updated documents. */ + filter: Filter; + /** A document or pipeline containing update operators. */ + update: UpdateFilter | UpdateFilter[]; + /** A set of filters specifying to which array elements an update should apply. */ + arrayFilters?: Document[]; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. 
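The `*Model` interfaces above are the building blocks of `Collection#bulkWrite`, whose input is an array of `AnyBulkWriteOperation`. A sketch with an illustrative `tasks` collection:

```ts
import { AnyBulkWriteOperation, Collection, Document } from 'mongodb';

// Collection and field names are illustrative.
async function syncTasks(tasks: Collection<Document>) {
  const ops: AnyBulkWriteOperation<Document>[] = [
    { insertOne: { document: { title: 'write report', completed: false } } },
    { updateOne: { filter: { title: 'buy milk' }, update: { $set: { completed: true } }, upsert: true } },
    { deleteMany: { filter: { completed: true, archived: true } } }
  ];
  const result = await tasks.bulkWrite(ops, { ordered: false });
  console.log(result.insertedCount, result.modifiedCount, result.deletedCount, result.upsertedIds);
}
```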
*/ + upsert?: boolean; +} + +/** @public */ +export interface UpdateManyModel { + /** The filter to limit the updated documents. */ + filter: Filter; + /** A document or pipeline containing update operators. */ + update: UpdateFilter | UpdateFilter[]; + /** A set of filters specifying to which array elements an update should apply. */ + arrayFilters?: Document[]; + /** Specifies a collation. */ + collation?: CollationOptions; + /** The index to use. If specified, then the query system will only consider plans using the hinted index. */ + hint?: Hint; + /** When true, creates a new document if no document matches the query. */ + upsert?: boolean; +} + +/** @public */ +export type AnyBulkWriteOperation = + | { insertOne: InsertOneModel } + | { replaceOne: ReplaceOneModel } + | { updateOne: UpdateOneModel } + | { updateMany: UpdateManyModel } + | { deleteOne: DeleteOneModel } + | { deleteMany: DeleteManyModel }; + +/** @public */ +export interface BulkResult { + ok: number; + writeErrors: WriteError[]; + writeConcernErrors: WriteConcernError[]; + insertedIds: Document[]; + nInserted: number; + nUpserted: number; + nMatched: number; + nModified: number; + nRemoved: number; + upserted: Document[]; + opTime?: Document; +} + +/** + * Keeps the state of a unordered batch so we can rewrite the results + * correctly after command execution + * + * @public + */ +export class Batch { + originalZeroIndex: number; + currentIndex: number; + originalIndexes: number[]; + batchType: BatchType; + operations: T[]; + size: number; + sizeBytes: number; + + constructor(batchType: BatchType, originalZeroIndex: number) { + this.originalZeroIndex = originalZeroIndex; + this.currentIndex = 0; + this.originalIndexes = []; + this.batchType = batchType; + this.operations = []; + this.size = 0; + this.sizeBytes = 0; + } +} + +/** + * @public + * The result of a bulk write. + */ +export class BulkWriteResult { + result: BulkResult; + + /** + * Create a new BulkWriteResult instance + * @internal + */ + constructor(bulkResult: BulkResult) { + this.result = bulkResult; + } + + /** Number of documents inserted. */ + get insertedCount(): number { + return this.result.nInserted ?? 0; + } + /** Number of documents matched for update. */ + get matchedCount(): number { + return this.result.nMatched ?? 0; + } + /** Number of documents modified. */ + get modifiedCount(): number { + return this.result.nModified ?? 0; + } + /** Number of documents deleted. */ + get deletedCount(): number { + return this.result.nRemoved ?? 0; + } + /** Number of documents upserted. */ + get upsertedCount(): number { + return this.result.upserted.length ?? 0; + } + + /** Upserted document generated Id's, hash key is the index of the originating operation */ + get upsertedIds(): { [key: number]: any } { + const upserted: { [index: number]: any } = {}; + for (const doc of this.result.upserted ?? []) { + upserted[doc.index] = doc._id; + } + return upserted; + } + + /** Inserted document generated Id's, hash key is the index of the originating operation */ + get insertedIds(): { [key: number]: any } { + const inserted: { [index: number]: any } = {}; + for (const doc of this.result.insertedIds ?? 
[]) { + inserted[doc.index] = doc._id; + } + return inserted; + } + + /** Evaluates to true if the bulk operation correctly executes */ + get ok(): number { + return this.result.ok; + } + + /** The number of inserted documents */ + get nInserted(): number { + return this.result.nInserted; + } + + /** Number of upserted documents */ + get nUpserted(): number { + return this.result.nUpserted; + } + + /** Number of matched documents */ + get nMatched(): number { + return this.result.nMatched; + } + + /** Number of documents updated physically on disk */ + get nModified(): number { + return this.result.nModified; + } + + /** Number of removed documents */ + get nRemoved(): number { + return this.result.nRemoved; + } + + /** Returns an array of all inserted ids */ + getInsertedIds(): Document[] { + return this.result.insertedIds; + } + + /** Returns an array of all upserted ids */ + getUpsertedIds(): Document[] { + return this.result.upserted; + } + + /** Returns the upserted id at the given index */ + getUpsertedIdAt(index: number): Document | undefined { + return this.result.upserted[index]; + } + + /** Returns raw internal result */ + getRawResponse(): Document { + return this.result; + } + + /** Returns true if the bulk operation contains a write error */ + hasWriteErrors(): boolean { + return this.result.writeErrors.length > 0; + } + + /** Returns the number of write errors off the bulk operation */ + getWriteErrorCount(): number { + return this.result.writeErrors.length; + } + + /** Returns a specific write error object */ + getWriteErrorAt(index: number): WriteError | undefined { + if (index < this.result.writeErrors.length) { + return this.result.writeErrors[index]; + } + } + + /** Retrieve all write errors */ + getWriteErrors(): WriteError[] { + return this.result.writeErrors; + } + + /** Retrieve lastOp if available */ + getLastOp(): Document | undefined { + return this.result.opTime; + } + + /** Retrieve the write concern error if one exists */ + getWriteConcernError(): WriteConcernError | undefined { + if (this.result.writeConcernErrors.length === 0) { + return; + } else if (this.result.writeConcernErrors.length === 1) { + // Return the error + return this.result.writeConcernErrors[0]; + } else { + // Combine the errors + let errmsg = ''; + for (let i = 0; i < this.result.writeConcernErrors.length; i++) { + const err = this.result.writeConcernErrors[i]; + errmsg = errmsg + err.errmsg; + + // TODO: Something better + if (i === 0) errmsg = errmsg + ' and '; + } + + return new WriteConcernError({ errmsg, code: MONGODB_ERROR_CODES.WriteConcernFailed }); + } + } + + toJSON(): BulkResult { + return this.result; + } + + toString(): string { + return `BulkWriteResult(${this.toJSON()})`; + } + + isOk(): boolean { + return this.result.ok === 1; + } +} + +/** @public */ +export interface WriteConcernErrorData { + code: number; + errmsg: string; + errInfo?: Document; +} + +/** + * An error representing a failure by the server to apply the requested write concern to the bulk operation. + * @public + * @category Error + */ +export class WriteConcernError { + /** @internal */ + [kServerError]: WriteConcernErrorData; + + constructor(error: WriteConcernErrorData) { + this[kServerError] = error; + } + + /** Write concern error code. */ + get code(): number | undefined { + return this[kServerError].code; + } + + /** Write concern error message. */ + get errmsg(): string | undefined { + return this[kServerError].errmsg; + } + + /** Write concern error info. 
*/ + get errInfo(): Document | undefined { + return this[kServerError].errInfo; + } + + /** @deprecated The `err` prop that contained a MongoServerError has been deprecated. */ + get err(): WriteConcernErrorData { + return this[kServerError]; + } + + toJSON(): WriteConcernErrorData { + return this[kServerError]; + } + + toString(): string { + return `WriteConcernError(${this.errmsg})`; + } +} + +/** @public */ +export interface BulkWriteOperationError { + index: number; + code: number; + errmsg: string; + errInfo: Document; + op: Document | UpdateStatement | DeleteStatement; +} + +/** + * An error that occurred during a BulkWrite on the server. + * @public + * @category Error + */ +export class WriteError { + err: BulkWriteOperationError; + + constructor(err: BulkWriteOperationError) { + this.err = err; + } + + /** WriteError code. */ + get code(): number { + return this.err.code; + } + + /** WriteError original bulk operation index. */ + get index(): number { + return this.err.index; + } + + /** WriteError message. */ + get errmsg(): string | undefined { + return this.err.errmsg; + } + + /** WriteError details. */ + get errInfo(): Document | undefined { + return this.err.errInfo; + } + + /** Returns the underlying operation that caused the error */ + getOperation(): Document { + return this.err.op; + } + + toJSON(): { code: number; index: number; errmsg?: string; op: Document } { + return { code: this.err.code, index: this.err.index, errmsg: this.err.errmsg, op: this.err.op }; + } + + toString(): string { + return `WriteError(${JSON.stringify(this.toJSON())})`; + } +} + +/** Merges results into shared data structure */ +function mergeBatchResults( + batch: Batch, + bulkResult: BulkResult, + err?: AnyError, + result?: Document +): void { + // If we have an error set the result to be the err object + if (err) { + result = err; + } else if (result && result.result) { + result = result.result; + } + + if (result == null) { + return; + } + + // Do we have a top level error stop processing and return + if (result.ok === 0 && bulkResult.ok === 1) { + bulkResult.ok = 0; + + const writeError = { + index: 0, + code: result.code || 0, + errmsg: result.message, + errInfo: result.errInfo, + op: batch.operations[0] + }; + + bulkResult.writeErrors.push(new WriteError(writeError)); + return; + } else if (result.ok === 0 && bulkResult.ok === 0) { + return; + } + + // Deal with opTime if available + if (result.opTime || result.lastOp) { + const opTime = result.lastOp || result.opTime; + let lastOpTS = null; + let lastOpT = null; + + // We have a time stamp + if (opTime && opTime._bsontype === 'Timestamp') { + if (bulkResult.opTime == null) { + bulkResult.opTime = opTime; + } else if (opTime.greaterThan(bulkResult.opTime)) { + bulkResult.opTime = opTime; + } + } else { + // Existing TS + if (bulkResult.opTime) { + lastOpTS = + typeof bulkResult.opTime.ts === 'number' + ? Long.fromNumber(bulkResult.opTime.ts) + : bulkResult.opTime.ts; + lastOpT = + typeof bulkResult.opTime.t === 'number' + ? Long.fromNumber(bulkResult.opTime.t) + : bulkResult.opTime.t; + } + + // Current OpTime TS + const opTimeTS = typeof opTime.ts === 'number' ? Long.fromNumber(opTime.ts) : opTime.ts; + const opTimeT = typeof opTime.t === 'number' ? 
Long.fromNumber(opTime.t) : opTime.t; + + // Compare the opTime's + if (bulkResult.opTime == null) { + bulkResult.opTime = opTime; + } else if (opTimeTS.greaterThan(lastOpTS)) { + bulkResult.opTime = opTime; + } else if (opTimeTS.equals(lastOpTS)) { + if (opTimeT.greaterThan(lastOpT)) { + bulkResult.opTime = opTime; + } + } + } + } + + // If we have an insert Batch type + if (isInsertBatch(batch) && result.n) { + bulkResult.nInserted = bulkResult.nInserted + result.n; + } + + // If we have an insert Batch type + if (isDeleteBatch(batch) && result.n) { + bulkResult.nRemoved = bulkResult.nRemoved + result.n; + } + + let nUpserted = 0; + + // We have an array of upserted values, we need to rewrite the indexes + if (Array.isArray(result.upserted)) { + nUpserted = result.upserted.length; + + for (let i = 0; i < result.upserted.length; i++) { + bulkResult.upserted.push({ + index: result.upserted[i].index + batch.originalZeroIndex, + _id: result.upserted[i]._id + }); + } + } else if (result.upserted) { + nUpserted = 1; + + bulkResult.upserted.push({ + index: batch.originalZeroIndex, + _id: result.upserted + }); + } + + // If we have an update Batch type + if (isUpdateBatch(batch) && result.n) { + const nModified = result.nModified; + bulkResult.nUpserted = bulkResult.nUpserted + nUpserted; + bulkResult.nMatched = bulkResult.nMatched + (result.n - nUpserted); + + if (typeof nModified === 'number') { + bulkResult.nModified = bulkResult.nModified + nModified; + } else { + bulkResult.nModified = 0; + } + } + + if (Array.isArray(result.writeErrors)) { + for (let i = 0; i < result.writeErrors.length; i++) { + const writeError = { + index: batch.originalIndexes[result.writeErrors[i].index], + code: result.writeErrors[i].code, + errmsg: result.writeErrors[i].errmsg, + errInfo: result.writeErrors[i].errInfo, + op: batch.operations[result.writeErrors[i].index] + }; + + bulkResult.writeErrors.push(new WriteError(writeError)); + } + } + + if (result.writeConcernError) { + bulkResult.writeConcernErrors.push(new WriteConcernError(result.writeConcernError)); + } +} + +function executeCommands( + bulkOperation: BulkOperationBase, + options: BulkWriteOptions, + callback: Callback +) { + if (bulkOperation.s.batches.length === 0) { + return callback(undefined, new BulkWriteResult(bulkOperation.s.bulkResult)); + } + + const batch = bulkOperation.s.batches.shift() as Batch; + + function resultHandler(err?: AnyError, result?: Document) { + // Error is a driver related error not a bulk op error, return early + if (err && 'message' in err && !(err instanceof MongoWriteConcernError)) { + return callback( + new MongoBulkWriteError(err, new BulkWriteResult(bulkOperation.s.bulkResult)) + ); + } + + if (err instanceof MongoWriteConcernError) { + return handleMongoWriteConcernError(batch, bulkOperation.s.bulkResult, err, callback); + } + + // Merge the results together + const writeResult = new BulkWriteResult(bulkOperation.s.bulkResult); + const mergeResult = mergeBatchResults(batch, bulkOperation.s.bulkResult, err, result); + if (mergeResult != null) { + return callback(undefined, writeResult); + } + + if (bulkOperation.handleWriteError(callback, writeResult)) return; + + // Execute the next command in line + executeCommands(bulkOperation, options, callback); + } + + const finalOptions = resolveOptions(bulkOperation, { + ...options, + ordered: bulkOperation.isOrdered + }); + + if (finalOptions.bypassDocumentValidation !== true) { + delete finalOptions.bypassDocumentValidation; + } + + // Set an operationIf if provided + if 
(bulkOperation.operationId) { + resultHandler.operationId = bulkOperation.operationId; + } + + // Is the bypassDocumentValidation options specific + if (bulkOperation.s.bypassDocumentValidation === true) { + finalOptions.bypassDocumentValidation = true; + } + + // Is the checkKeys option disabled + if (bulkOperation.s.checkKeys === false) { + finalOptions.checkKeys = false; + } + + if (finalOptions.retryWrites) { + if (isUpdateBatch(batch)) { + finalOptions.retryWrites = finalOptions.retryWrites && !batch.operations.some(op => op.multi); + } + + if (isDeleteBatch(batch)) { + finalOptions.retryWrites = + finalOptions.retryWrites && !batch.operations.some(op => op.limit === 0); + } + } + + try { + if (isInsertBatch(batch)) { + executeOperation( + bulkOperation.s.topology, + new InsertOperation(bulkOperation.s.namespace, batch.operations, finalOptions), + resultHandler + ); + } else if (isUpdateBatch(batch)) { + executeOperation( + bulkOperation.s.topology, + new UpdateOperation(bulkOperation.s.namespace, batch.operations, finalOptions), + resultHandler + ); + } else if (isDeleteBatch(batch)) { + executeOperation( + bulkOperation.s.topology, + new DeleteOperation(bulkOperation.s.namespace, batch.operations, finalOptions), + resultHandler + ); + } + } catch (err) { + // Force top level error + err.ok = 0; + // Merge top level error and return + mergeBatchResults(batch, bulkOperation.s.bulkResult, err, undefined); + callback(); + } +} + +function handleMongoWriteConcernError( + batch: Batch, + bulkResult: BulkResult, + err: MongoWriteConcernError, + callback: Callback +) { + mergeBatchResults(batch, bulkResult, undefined, err.result); + + callback( + new MongoBulkWriteError( + { + message: err.result?.writeConcernError.errmsg, + code: err.result?.writeConcernError.result + }, + new BulkWriteResult(bulkResult) + ) + ); +} + +/** + * An error indicating an unsuccessful Bulk Write + * @public + * @category Error + */ +export class MongoBulkWriteError extends MongoServerError { + result: BulkWriteResult; + writeErrors: OneOrMore = []; + err?: WriteConcernError; + + /** Creates a new MongoBulkWriteError */ + constructor( + error: + | { message: string; code: number; writeErrors?: WriteError[] } + | WriteConcernError + | AnyError, + result: BulkWriteResult + ) { + super(error); + + if (error instanceof WriteConcernError) this.err = error; + else if (!(error instanceof Error)) { + this.message = error.message; + this.code = error.code; + this.writeErrors = error.writeErrors ?? []; + } + + this.result = result; + Object.assign(this, error); + } + + get name(): string { + return 'MongoBulkWriteError'; + } + + /** Number of documents inserted. */ + get insertedCount(): number { + return this.result.insertedCount; + } + /** Number of documents matched for update. */ + get matchedCount(): number { + return this.result.matchedCount; + } + /** Number of documents modified. */ + get modifiedCount(): number { + return this.result.modifiedCount; + } + /** Number of documents deleted. */ + get deletedCount(): number { + return this.result.deletedCount; + } + /** Number of documents upserted. 
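The `MongoBulkWriteError` getters defined in this class expose partial results when a bulk write fails. A minimal sketch of inspecting them from application code, assuming an already-connected `Collection` handle; the collection and document values are placeholders, not part of the driver:

```ts
import { MongoBulkWriteError } from 'mongodb';
import type { Collection } from 'mongodb';

async function insertBatch(collection: Collection) {
  try {
    await collection.bulkWrite(
      [
        { insertOne: { document: { _id: 1, task: 'first' } } },
        { insertOne: { document: { _id: 1, task: 'duplicate key' } } } // fails on _id
      ],
      { ordered: false }
    );
  } catch (error) {
    if (error instanceof MongoBulkWriteError) {
      // Writes that succeeded around the failure are still reported via the getters above.
      console.log(error.insertedCount, error.result.insertedCount);
      console.log(error.writeErrors); // per-op details merged by mergeBatchResults()
    } else {
      throw error;
    }
  }
}
```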
*/ + get upsertedCount(): number { + return this.result.upsertedCount; + } + /** Inserted document generated Id's, hash key is the index of the originating operation */ + get insertedIds(): { [key: number]: any } { + return this.result.insertedIds; + } + /** Upserted document generated Id's, hash key is the index of the originating operation */ + get upsertedIds(): { [key: number]: any } { + return this.result.upsertedIds; + } +} + +/** + * A builder object that is returned from {@link BulkOperationBase#find}. + * Is used to build a write operation that involves a query filter. + * + * @public + */ +export class FindOperators { + bulkOperation: BulkOperationBase; + + /** + * Creates a new FindOperators object. + * @internal + */ + constructor(bulkOperation: BulkOperationBase) { + this.bulkOperation = bulkOperation; + } + + /** Add a multiple update operation to the bulk operation */ + update(updateDocument: Document): BulkOperationBase { + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList( + BatchType.UPDATE, + makeUpdateStatement(currentOp.selector, updateDocument, { + ...currentOp, + multi: true + }) + ); + } + + /** Add a single update operation to the bulk operation */ + updateOne(updateDocument: Document): BulkOperationBase { + if (!hasAtomicOperators(updateDocument)) { + throw new MongoInvalidArgumentError('Update document requires atomic operators'); + } + + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList( + BatchType.UPDATE, + makeUpdateStatement(currentOp.selector, updateDocument, { ...currentOp, multi: false }) + ); + } + + /** Add a replace one operation to the bulk operation */ + replaceOne(replacement: Document): BulkOperationBase { + if (hasAtomicOperators(replacement)) { + throw new MongoInvalidArgumentError('Replacement document must not use atomic operators'); + } + + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList( + BatchType.UPDATE, + makeUpdateStatement(currentOp.selector, replacement, { ...currentOp, multi: false }) + ); + } + + /** Add a delete one operation to the bulk operation */ + deleteOne(): BulkOperationBase { + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList( + BatchType.DELETE, + makeDeleteStatement(currentOp.selector, { ...currentOp, limit: 1 }) + ); + } + + /** Add a delete many operation to the bulk operation */ + delete(): BulkOperationBase { + const currentOp = buildCurrentOp(this.bulkOperation); + return this.bulkOperation.addToOperationsList( + BatchType.DELETE, + makeDeleteStatement(currentOp.selector, { ...currentOp, limit: 0 }) + ); + } + + /** Upsert modifier for update bulk operation, noting that this operation is an upsert. */ + upsert(): this { + if (!this.bulkOperation.s.currentOp) { + this.bulkOperation.s.currentOp = {}; + } + + this.bulkOperation.s.currentOp.upsert = true; + return this; + } + + /** Specifies the collation for the query condition. */ + collation(collation: CollationOptions): this { + if (!this.bulkOperation.s.currentOp) { + this.bulkOperation.s.currentOp = {}; + } + + this.bulkOperation.s.currentOp.collation = collation; + return this; + } + + /** Specifies arrayFilters for UpdateOne or UpdateMany bulk operations. 
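Beyond the basic chain shown in the `find()` documentation below, the `collation()` modifier defined here (and the `arrayFilters()` modifier that immediately follows) also attach to the pending operation before it is added to the batch. A hedged sketch, assuming a connected `students` collection whose documents carry a `grades` array; the field names are illustrative:

```ts
import type { Collection } from 'mongodb';

async function flagPassingGrades(students: Collection) {
  const bulk = students.initializeUnorderedBulkOp();

  // arrayFilters() stores the filters on the pending op; updateOne() then builds the statement.
  bulk
    .find({ semester: 1 })
    .arrayFilters([{ 'elem.grade': { $gte: 85 } }])
    .updateOne({ $set: { 'grades.$[elem].passed': true } });

  // collation() applies to the query filter of the delete that follows it (case-insensitive here).
  bulk.find({ name: 'cafe' }).collation({ locale: 'en', strength: 2 }).delete();

  return bulk.execute();
}
```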
*/ + arrayFilters(arrayFilters: Document[]): this { + if (!this.bulkOperation.s.currentOp) { + this.bulkOperation.s.currentOp = {}; + } + + this.bulkOperation.s.currentOp.arrayFilters = arrayFilters; + return this; + } +} + +/** @internal */ +export interface BulkOperationPrivate { + bulkResult: BulkResult; + currentBatch?: Batch; + currentIndex: number; + // ordered specific + currentBatchSize: number; + currentBatchSizeBytes: number; + // unordered specific + currentInsertBatch?: Batch; + currentUpdateBatch?: Batch; + currentRemoveBatch?: Batch; + batches: Batch[]; + // Write concern + writeConcern?: WriteConcern; + // Max batch size options + maxBsonObjectSize: number; + maxBatchSizeBytes: number; + maxWriteBatchSize: number; + maxKeySize: number; + // Namespace + namespace: MongoDBNamespace; + // Topology + topology: Topology; + // Options + options: BulkWriteOptions; + // BSON options + bsonOptions: BSONSerializeOptions; + // Document used to build a bulk operation + currentOp?: Document; + // Executed + executed: boolean; + // Collection + collection: Collection; + // Fundamental error + err?: AnyError; + // check keys + checkKeys: boolean; + bypassDocumentValidation?: boolean; +} + +/** @public */ +export interface BulkWriteOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails. */ + ordered?: boolean; + /** @deprecated use `ordered` instead */ + keepGoing?: boolean; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; +} + +/** @public */ +export abstract class BulkOperationBase { + isOrdered: boolean; + /** @internal */ + s: BulkOperationPrivate; + operationId?: number; + + /** + * Create a new OrderedBulkOperation or UnorderedBulkOperation instance + * @internal + */ + constructor(collection: Collection, options: BulkWriteOptions, isOrdered: boolean) { + // determine whether bulkOperation is ordered or unordered + this.isOrdered = isOrdered; + + const topology = getTopology(collection); + options = options == null ? {} : options; + // TODO Bring from driver information in isMaster + // Get the namespace for the write operations + const namespace = collection.s.namespace; + // Used to mark operation as executed + const executed = false; + + // Current item + const currentOp = undefined; + + // Set max byte size + const isMaster = topology.lastIsMaster(); + + // If we have autoEncryption on, batch-splitting must be done on 2mb chunks, but single documents + // over 2mb are still allowed + const usingAutoEncryption = !!(topology.s.options && topology.s.options.autoEncrypter); + const maxBsonObjectSize = + isMaster && isMaster.maxBsonObjectSize ? isMaster.maxBsonObjectSize : 1024 * 1024 * 16; + const maxBatchSizeBytes = usingAutoEncryption ? 1024 * 1024 * 2 : maxBsonObjectSize; + const maxWriteBatchSize = + isMaster && isMaster.maxWriteBatchSize ? isMaster.maxWriteBatchSize : 1000; + + // Calculates the largest possible size of an Array key, represented as a BSON string + // element. 
This calculation: + // 1 byte for BSON type + // # of bytes = length of (string representation of (maxWriteBatchSize - 1)) + // + 1 bytes for null terminator + const maxKeySize = (maxWriteBatchSize - 1).toString(10).length + 2; + + // Final options for retryable writes + let finalOptions = Object.assign({}, options); + finalOptions = applyRetryableWrites(finalOptions, collection.s.db); + + // Final results + const bulkResult: BulkResult = { + ok: 1, + writeErrors: [], + writeConcernErrors: [], + insertedIds: [], + nInserted: 0, + nUpserted: 0, + nMatched: 0, + nModified: 0, + nRemoved: 0, + upserted: [] + }; + + // Internal state + this.s = { + // Final result + bulkResult, + // Current batch state + currentBatch: undefined, + currentIndex: 0, + // ordered specific + currentBatchSize: 0, + currentBatchSizeBytes: 0, + // unordered specific + currentInsertBatch: undefined, + currentUpdateBatch: undefined, + currentRemoveBatch: undefined, + batches: [], + // Write concern + writeConcern: WriteConcern.fromOptions(options), + // Max batch size options + maxBsonObjectSize, + maxBatchSizeBytes, + maxWriteBatchSize, + maxKeySize, + // Namespace + namespace, + // Topology + topology, + // Options + options: finalOptions, + // BSON options + bsonOptions: resolveBSONOptions(options), + // Current operation + currentOp, + // Executed + executed, + // Collection + collection, + // Fundamental error + err: undefined, + // check keys + checkKeys: typeof options.checkKeys === 'boolean' ? options.checkKeys : false + }; + + // bypass Validation + if (options.bypassDocumentValidation === true) { + this.s.bypassDocumentValidation = true; + } + } + + /** + * Add a single insert document to the bulk operation + * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Adds three inserts to the bulkOp. + * bulkOp + * .insert({ a: 1 }) + * .insert({ b: 2 }) + * .insert({ c: 3 }); + * await bulkOp.execute(); + * ``` + */ + insert(document: Document): BulkOperationBase { + if (document._id == null && !shouldForceServerObjectId(this)) { + document._id = new ObjectId(); + } + + return this.addToOperationsList(BatchType.INSERT, document); + } + + /** + * Builds a find operation for an update/updateOne/delete/deleteOne/replaceOne. + * Returns a builder object used to complete the definition of the operation. 
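Referring back to the `maxKeySize` computation in the constructor above, a short worked example of that arithmetic; the batch-size value is assumed for illustration rather than read from a real server:

```ts
// Assume the server reports maxWriteBatchSize = 100000 (a common modern value).
// The largest array index a batch can hold is 99999, whose decimal string has 5
// characters, so each operation's BSON array-key overhead is budgeted as:
//   1 (BSON type byte) + 5 (digits of "99999") + 1 (null terminator) = 7 bytes
const maxWriteBatchSize = 100000;
const maxKeySize = (maxWriteBatchSize - 1).toString(10).length + 2; // 7
```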
+ * + * @example + * ```js + * const bulkOp = collection.initializeOrderedBulkOp(); + * + * // Add an updateOne to the bulkOp + * bulkOp.find({ a: 1 }).updateOne({ $set: { b: 2 } }); + * + * // Add an updateMany to the bulkOp + * bulkOp.find({ c: 3 }).update({ $set: { d: 4 } }); + * + * // Add an upsert + * bulkOp.find({ e: 5 }).upsert().updateOne({ $set: { f: 6 } }); + * + * // Add a deletion + * bulkOp.find({ g: 7 }).deleteOne(); + * + * // Add a multi deletion + * bulkOp.find({ h: 8 }).delete(); + * + * // Add a replaceOne + * bulkOp.find({ i: 9 }).replaceOne({writeConcern: { j: 10 }}); + * + * // Update using a pipeline (requires Mongodb 4.2 or higher) + * bulk.find({ k: 11, y: { $exists: true }, z: { $exists: true } }).updateOne([ + * { $set: { total: { $sum: [ '$y', '$z' ] } } } + * ]); + * + * // All of the ops will now be executed + * await bulkOp.execute(); + * ``` + */ + find(selector: Document): FindOperators { + if (!selector) { + throw new MongoInvalidArgumentError('Bulk find operation must specify a selector'); + } + + // Save a current selector + this.s.currentOp = { + selector: selector + }; + + return new FindOperators(this); + } + + /** Specifies a raw operation to perform in the bulk write. */ + raw(op: AnyBulkWriteOperation): this { + if ('insertOne' in op) { + const forceServerObjectId = shouldForceServerObjectId(this); + if (op.insertOne && op.insertOne.document == null) { + // NOTE: provided for legacy support, but this is a malformed operation + if (forceServerObjectId !== true && (op.insertOne as Document)._id == null) { + (op.insertOne as Document)._id = new ObjectId(); + } + + return this.addToOperationsList(BatchType.INSERT, op.insertOne); + } + + if (forceServerObjectId !== true && op.insertOne.document._id == null) { + op.insertOne.document._id = new ObjectId(); + } + + return this.addToOperationsList(BatchType.INSERT, op.insertOne.document); + } + + if ('replaceOne' in op || 'updateOne' in op || 'updateMany' in op) { + if ('replaceOne' in op) { + if ('q' in op.replaceOne) { + throw new MongoInvalidArgumentError('Raw operations are not allowed'); + } + const updateStatement = makeUpdateStatement( + op.replaceOne.filter, + op.replaceOne.replacement, + { ...op.replaceOne, multi: false } + ); + if (hasAtomicOperators(updateStatement.u)) { + throw new MongoInvalidArgumentError('Replacement document must not use atomic operators'); + } + return this.addToOperationsList(BatchType.UPDATE, updateStatement); + } + + if ('updateOne' in op) { + if ('q' in op.updateOne) { + throw new MongoInvalidArgumentError('Raw operations are not allowed'); + } + const updateStatement = makeUpdateStatement(op.updateOne.filter, op.updateOne.update, { + ...op.updateOne, + multi: false + }); + if (!hasAtomicOperators(updateStatement.u)) { + throw new MongoInvalidArgumentError('Update document requires atomic operators'); + } + return this.addToOperationsList(BatchType.UPDATE, updateStatement); + } + + if ('updateMany' in op) { + if ('q' in op.updateMany) { + throw new MongoInvalidArgumentError('Raw operations are not allowed'); + } + const updateStatement = makeUpdateStatement(op.updateMany.filter, op.updateMany.update, { + ...op.updateMany, + multi: true + }); + if (!hasAtomicOperators(updateStatement.u)) { + throw new MongoInvalidArgumentError('Update document requires atomic operators'); + } + return this.addToOperationsList(BatchType.UPDATE, updateStatement); + } + } + + if ('deleteOne' in op) { + if ('q' in op.deleteOne) { + throw new MongoInvalidArgumentError('Raw operations are not 
allowed'); + } + return this.addToOperationsList( + BatchType.DELETE, + makeDeleteStatement(op.deleteOne.filter, { ...op.deleteOne, limit: 1 }) + ); + } + + if ('deleteMany' in op) { + if ('q' in op.deleteMany) { + throw new MongoInvalidArgumentError('Raw operations are not allowed'); + } + return this.addToOperationsList( + BatchType.DELETE, + makeDeleteStatement(op.deleteMany.filter, { ...op.deleteMany, limit: 0 }) + ); + } + + // otherwise an unknown operation was provided + throw new MongoInvalidArgumentError( + 'bulkWrite only supports insertOne, updateOne, updateMany, deleteOne, deleteMany' + ); + } + + get bsonOptions(): BSONSerializeOptions { + return this.s.bsonOptions; + } + + get writeConcern(): WriteConcern | undefined { + return this.s.writeConcern; + } + + get batches(): Batch[] { + const batches = [...this.s.batches]; + if (this.isOrdered) { + if (this.s.currentBatch) batches.push(this.s.currentBatch); + } else { + if (this.s.currentInsertBatch) batches.push(this.s.currentInsertBatch); + if (this.s.currentUpdateBatch) batches.push(this.s.currentUpdateBatch); + if (this.s.currentRemoveBatch) batches.push(this.s.currentRemoveBatch); + } + return batches; + } + + /** An internal helper method. Do not invoke directly. Will be going away in the future */ + execute( + options?: BulkWriteOptions, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + if (this.s.executed) { + return handleEarlyError(new MongoBatchReExecutionError(), callback); + } + + const writeConcern = WriteConcern.fromOptions(options); + if (writeConcern) { + this.s.writeConcern = writeConcern; + } + + // If we have current batch + if (this.isOrdered) { + if (this.s.currentBatch) this.s.batches.push(this.s.currentBatch); + } else { + if (this.s.currentInsertBatch) this.s.batches.push(this.s.currentInsertBatch); + if (this.s.currentUpdateBatch) this.s.batches.push(this.s.currentUpdateBatch); + if (this.s.currentRemoveBatch) this.s.batches.push(this.s.currentRemoveBatch); + } + // If we have no operations in the bulk raise an error + if (this.s.batches.length === 0) { + const emptyBatchError = new MongoInvalidArgumentError( + 'Invalid BulkOperation, Batch cannot be empty' + ); + return handleEarlyError(emptyBatchError, callback); + } + + this.s.executed = true; + const finalOptions = { ...this.s.options, ...options }; + return executeLegacyOperation(this.s.topology, executeCommands, [this, finalOptions, callback]); + } + + /** + * Handles the write error before executing commands + * @internal + */ + handleWriteError( + callback: Callback, + writeResult: BulkWriteResult + ): boolean | undefined { + if (this.s.bulkResult.writeErrors.length > 0) { + const msg = this.s.bulkResult.writeErrors[0].errmsg + ? 
this.s.bulkResult.writeErrors[0].errmsg + : 'write operation failed'; + + callback( + new MongoBulkWriteError( + { + message: msg, + code: this.s.bulkResult.writeErrors[0].code, + writeErrors: this.s.bulkResult.writeErrors + }, + writeResult + ) + ); + + return true; + } + + const writeConcernError = writeResult.getWriteConcernError(); + if (writeConcernError) { + callback(new MongoBulkWriteError(writeConcernError, writeResult)); + return true; + } + } + + abstract addToOperationsList( + batchType: BatchType, + document: Document | UpdateStatement | DeleteStatement + ): this; +} + +Object.defineProperty(BulkOperationBase.prototype, 'length', { + enumerable: true, + get() { + return this.s.currentIndex; + } +}); + +/** helper function to assist with promiseOrCallback behavior */ +function handleEarlyError( + err?: AnyError, + callback?: Callback +): Promise | void { + const Promise = PromiseProvider.get(); + if (typeof callback === 'function') { + callback(err); + return; + } + + return Promise.reject(err); +} + +function shouldForceServerObjectId(bulkOperation: BulkOperationBase): boolean { + if (typeof bulkOperation.s.options.forceServerObjectId === 'boolean') { + return bulkOperation.s.options.forceServerObjectId; + } + + if (typeof bulkOperation.s.collection.s.db.options?.forceServerObjectId === 'boolean') { + return bulkOperation.s.collection.s.db.options?.forceServerObjectId; + } + + return false; +} + +function isInsertBatch(batch: Batch): boolean { + return batch.batchType === BatchType.INSERT; +} + +function isUpdateBatch(batch: Batch): batch is Batch { + return batch.batchType === BatchType.UPDATE; +} + +function isDeleteBatch(batch: Batch): batch is Batch { + return batch.batchType === BatchType.DELETE; +} + +function buildCurrentOp(bulkOp: BulkOperationBase): Document { + let { currentOp } = bulkOp.s; + bulkOp.s.currentOp = undefined; + if (!currentOp) currentOp = {}; + return currentOp; +} diff --git a/node_modules/mongodb/src/bulk/ordered.ts b/node_modules/mongodb/src/bulk/ordered.ts new file mode 100644 index 000000000..ce99650c2 --- /dev/null +++ b/node_modules/mongodb/src/bulk/ordered.ts @@ -0,0 +1,82 @@ +import * as BSON from '../bson'; +import { BulkOperationBase, Batch, BatchType, BulkWriteOptions } from './common'; +import type { Document } from '../bson'; +import type { Collection } from '../collection'; +import type { UpdateStatement } from '../operations/update'; +import type { DeleteStatement } from '../operations/delete'; +import { MongoInvalidArgumentError } from '../error'; + +/** @public */ +export class OrderedBulkOperation extends BulkOperationBase { + constructor(collection: Collection, options: BulkWriteOptions) { + super(collection, options, true); + } + + addToOperationsList( + batchType: BatchType, + document: Document | UpdateStatement | DeleteStatement + ): this { + // Get the bsonSize + const bsonSize = BSON.calculateObjectSize(document, { + checkKeys: false, + // Since we don't know what the user selected for BSON options here, + // err on the safe side, and check the size with ignoreUndefined: false. 
+ ignoreUndefined: false + } as any); + + // Throw error if the doc is bigger than the max BSON size + if (bsonSize >= this.s.maxBsonObjectSize) + // TODO(NODE-3483): Change this to MongoBSONError + throw new MongoInvalidArgumentError( + `Document is larger than the maximum size ${this.s.maxBsonObjectSize}` + ); + + // Create a new batch object if we don't have a current one + if (this.s.currentBatch == null) { + this.s.currentBatch = new Batch(batchType, this.s.currentIndex); + } + + const maxKeySize = this.s.maxKeySize; + + // Check if we need to create a new batch + if ( + // New batch if we exceed the max batch op size + this.s.currentBatchSize + 1 >= this.s.maxWriteBatchSize || + // New batch if we exceed the maxBatchSizeBytes. Only matters if batch already has a doc, + // since we can't sent an empty batch + (this.s.currentBatchSize > 0 && + this.s.currentBatchSizeBytes + maxKeySize + bsonSize >= this.s.maxBatchSizeBytes) || + // New batch if the new op does not have the same op type as the current batch + this.s.currentBatch.batchType !== batchType + ) { + // Save the batch to the execution stack + this.s.batches.push(this.s.currentBatch); + + // Create a new batch + this.s.currentBatch = new Batch(batchType, this.s.currentIndex); + + // Reset the current size trackers + this.s.currentBatchSize = 0; + this.s.currentBatchSizeBytes = 0; + } + + if (batchType === BatchType.INSERT) { + this.s.bulkResult.insertedIds.push({ + index: this.s.currentIndex, + _id: (document as Document)._id + }); + } + + // We have an array of documents + if (Array.isArray(document)) { + throw new MongoInvalidArgumentError('Operation passed in cannot be an Array'); + } + + this.s.currentBatch.originalIndexes.push(this.s.currentIndex); + this.s.currentBatch.operations.push(document); + this.s.currentBatchSize += 1; + this.s.currentBatchSizeBytes += maxKeySize + bsonSize; + this.s.currentIndex += 1; + return this; + } +} diff --git a/node_modules/mongodb/src/bulk/unordered.ts b/node_modules/mongodb/src/bulk/unordered.ts new file mode 100644 index 000000000..e241e50be --- /dev/null +++ b/node_modules/mongodb/src/bulk/unordered.ts @@ -0,0 +1,109 @@ +import * as BSON from '../bson'; +import { BulkOperationBase, Batch, BatchType, BulkWriteOptions, BulkWriteResult } from './common'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Collection } from '../collection'; +import type { UpdateStatement } from '../operations/update'; +import type { DeleteStatement } from '../operations/delete'; +import { MongoInvalidArgumentError } from '../error'; + +/** @public */ +export class UnorderedBulkOperation extends BulkOperationBase { + constructor(collection: Collection, options: BulkWriteOptions) { + super(collection, options, false); + } + + handleWriteError(callback: Callback, writeResult: BulkWriteResult): boolean | undefined { + if (this.s.batches.length) { + return false; + } + + return super.handleWriteError(callback, writeResult); + } + + addToOperationsList( + batchType: BatchType, + document: Document | UpdateStatement | DeleteStatement + ): this { + // Get the bsonSize + const bsonSize = BSON.calculateObjectSize(document, { + checkKeys: false, + + // Since we don't know what the user selected for BSON options here, + // err on the safe side, and check the size with ignoreUndefined: false. 
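To make the splitting rules in the ordered `addToOperationsList` above concrete, here is a comment-only trace under assumed limits (`maxWriteBatchSize = 100000`, documents small enough that the byte limit never triggers); the numbers follow directly from the conditions shown:

```ts
// const bulk = collection.initializeOrderedBulkOp();
//
// 1) 150000 small inserts:
//    ops 1..99999     -> batch 1 (INSERT)   // a split is forced once currentBatchSize + 1
//    ops 100000..     -> batch 2 (INSERT)   // reaches maxWriteBatchSize, so batch 1 holds 99999 ops
//
// 2) insert, then find().updateOne(), then insert:
//    op 1 -> batch 1 (INSERT)
//    op 2 -> batch 2 (UPDATE)   // op type changed, so a new batch starts
//    op 3 -> batch 3 (INSERT)   // ordered bulks preserve order, so they cannot regroup by type
//
// The unordered implementation defined below keeps one current batch per operation
// type instead, so case 2 executes as at most one INSERT batch plus one UPDATE batch.
```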
+ ignoreUndefined: false + } as any); + + // Throw error if the doc is bigger than the max BSON size + if (bsonSize >= this.s.maxBsonObjectSize) { + // TODO(NODE-3483): Change this to MongoBSONError + throw new MongoInvalidArgumentError( + `Document is larger than the maximum size ${this.s.maxBsonObjectSize}` + ); + } + + // Holds the current batch + this.s.currentBatch = undefined; + // Get the right type of batch + if (batchType === BatchType.INSERT) { + this.s.currentBatch = this.s.currentInsertBatch; + } else if (batchType === BatchType.UPDATE) { + this.s.currentBatch = this.s.currentUpdateBatch; + } else if (batchType === BatchType.DELETE) { + this.s.currentBatch = this.s.currentRemoveBatch; + } + + const maxKeySize = this.s.maxKeySize; + + // Create a new batch object if we don't have a current one + if (this.s.currentBatch == null) { + this.s.currentBatch = new Batch(batchType, this.s.currentIndex); + } + + // Check if we need to create a new batch + if ( + // New batch if we exceed the max batch op size + this.s.currentBatch.size + 1 >= this.s.maxWriteBatchSize || + // New batch if we exceed the maxBatchSizeBytes. Only matters if batch already has a doc, + // since we can't sent an empty batch + (this.s.currentBatch.size > 0 && + this.s.currentBatch.sizeBytes + maxKeySize + bsonSize >= this.s.maxBatchSizeBytes) || + // New batch if the new op does not have the same op type as the current batch + this.s.currentBatch.batchType !== batchType + ) { + // Save the batch to the execution stack + this.s.batches.push(this.s.currentBatch); + + // Create a new batch + this.s.currentBatch = new Batch(batchType, this.s.currentIndex); + } + + // We have an array of documents + if (Array.isArray(document)) { + throw new MongoInvalidArgumentError('Operation passed in cannot be an Array'); + } + + this.s.currentBatch.operations.push(document); + this.s.currentBatch.originalIndexes.push(this.s.currentIndex); + this.s.currentIndex = this.s.currentIndex + 1; + + // Save back the current Batch to the right type + if (batchType === BatchType.INSERT) { + this.s.currentInsertBatch = this.s.currentBatch; + this.s.bulkResult.insertedIds.push({ + index: this.s.bulkResult.insertedIds.length, + _id: (document as Document)._id + }); + } else if (batchType === BatchType.UPDATE) { + this.s.currentUpdateBatch = this.s.currentBatch; + } else if (batchType === BatchType.DELETE) { + this.s.currentRemoveBatch = this.s.currentBatch; + } + + // Update current batch size + this.s.currentBatch.size += 1; + this.s.currentBatch.sizeBytes += maxKeySize + bsonSize; + + return this; + } +} diff --git a/node_modules/mongodb/src/change_stream.ts b/node_modules/mongodb/src/change_stream.ts new file mode 100644 index 000000000..60b7cf750 --- /dev/null +++ b/node_modules/mongodb/src/change_stream.ts @@ -0,0 +1,827 @@ +import Denque = require('denque'); +import { + MongoError, + AnyError, + isResumableError, + MongoRuntimeError, + MongoAPIError, + MongoChangeStreamError +} from './error'; +import { AggregateOperation, AggregateOptions } from './operations/aggregate'; +import { + maxWireVersion, + calculateDurationInMs, + now, + maybePromise, + MongoDBNamespace, + Callback, + getTopology +} from './utils'; +import type { ReadPreference } from './read_preference'; +import type { Timestamp, Document } from './bson'; +import type { Topology } from './sdam/topology'; +import type { OperationParent, CollationOptions } from './operations/command'; +import { MongoClient } from './mongo_client'; +import { Db } from './db'; +import { 
Collection } from './collection'; +import type { Readable } from 'stream'; +import { + AbstractCursor, + AbstractCursorEvents, + AbstractCursorOptions, + CursorStreamOptions +} from './cursor/abstract_cursor'; +import type { ClientSession } from './sessions'; +import { executeOperation, ExecutionResult } from './operations/execute_operation'; +import { InferIdType, Nullable, TypedEventEmitter } from './mongo_types'; + +/** @internal */ +const kResumeQueue = Symbol('resumeQueue'); +/** @internal */ +const kCursorStream = Symbol('cursorStream'); +/** @internal */ +const kClosed = Symbol('closed'); +/** @internal */ +const kMode = Symbol('mode'); + +const CHANGE_STREAM_OPTIONS = ['resumeAfter', 'startAfter', 'startAtOperationTime', 'fullDocument']; +const CURSOR_OPTIONS = ['batchSize', 'maxAwaitTimeMS', 'collation', 'readPreference'].concat( + CHANGE_STREAM_OPTIONS +); + +const CHANGE_DOMAIN_TYPES = { + COLLECTION: Symbol('Collection'), + DATABASE: Symbol('Database'), + CLUSTER: Symbol('Cluster') +}; + +const NO_RESUME_TOKEN_ERROR = + 'A change stream document has been received that lacks a resume token (_id).'; +const NO_CURSOR_ERROR = 'ChangeStream has no cursor'; +const CHANGESTREAM_CLOSED_ERROR = 'ChangeStream is closed'; + +/** @public */ +export interface ResumeOptions { + startAtOperationTime?: Timestamp; + batchSize?: number; + maxAwaitTimeMS?: number; + collation?: CollationOptions; + readPreference?: ReadPreference; +} + +/** + * Represents the logical starting point for a new or resuming {@link https://docs.mongodb.com/master/changeStreams/#change-stream-resume-token| Change Stream} on the server. + * @public + */ +export type ResumeToken = unknown; + +/** + * Represents a specific point in time on a server. Can be retrieved by using {@link Db#command} + * @public + * @remarks + * See {@link https://docs.mongodb.com/manual/reference/method/db.runCommand/#response| Run Command Response} + */ +export type OperationTime = Timestamp; + +/** @public */ +export interface PipeOptions { + end?: boolean; +} + +/** + * Options that can be passed to a ChangeStream. Note that startAfter, resumeAfter, and startAtOperationTime are all mutually exclusive, and the server will error if more than one is specified. + * @public + */ +export interface ChangeStreamOptions extends AggregateOptions { + /** Allowed values: ‘default’, ‘updateLookup’. When set to ‘updateLookup’, the change stream will include both a delta describing the changes to the document, as well as a copy of the entire document that was changed from some time after the change occurred. */ + fullDocument?: string; + /** The maximum amount of time for the server to wait on new documents to satisfy a change stream query. */ + maxAwaitTimeMS?: number; + /** Allows you to start a changeStream after a specified event. See {@link https://docs.mongodb.com/master/changeStreams/#resumeafter-for-change-streams|ChangeStream documentation}. */ + resumeAfter?: ResumeToken; + /** Similar to resumeAfter, but will allow you to start after an invalidated event. See {@link https://docs.mongodb.com/master/changeStreams/#startafter-for-change-streams|ChangeStream documentation}. */ + startAfter?: ResumeToken; + /** Will start the changeStream after the specified operationTime. */ + startAtOperationTime?: OperationTime; + /** The number of documents to return per batch. See {@link https://docs.mongodb.com/manual/reference/command/aggregate|aggregation documentation}. 
*/ + batchSize?: number; +} + +/** @public */ +export interface ChangeStreamDocument { + /** + * The id functions as an opaque token for use when resuming an interrupted + * change stream. + */ + _id: InferIdType; + + /** + * Describes the type of operation represented in this change notification. + */ + operationType: + | 'insert' + | 'update' + | 'replace' + | 'delete' + | 'invalidate' + | 'drop' + | 'dropDatabase' + | 'rename'; + + /** + * Contains two fields: “db” and “coll” containing the database and + * collection name in which the change happened. + */ + ns: { db: string; coll: string }; + + /** + * Only present for ops of type ‘insert’, ‘update’, ‘replace’, and + * ‘delete’. + * + * For unsharded collections this contains a single field, _id, with the + * value of the _id of the document updated. For sharded collections, + * this will contain all the components of the shard key in order, + * followed by the _id if the _id isn’t part of the shard key. + */ + documentKey?: InferIdType; + + /** + * Only present for ops of type ‘update’. + * + * Contains a description of updated and removed fields in this + * operation. + */ + updateDescription?: UpdateDescription; + + /** + * Always present for operations of type ‘insert’ and ‘replace’. Also + * present for operations of type ‘update’ if the user has specified ‘updateLookup’ + * in the ‘fullDocument’ arguments to the ‘$changeStream’ stage. + * + * For operations of type ‘insert’ and ‘replace’, this key will contain the + * document being inserted, or the new version of the document that is replacing + * the existing document, respectively. + * + * For operations of type ‘update’, this key will contain a copy of the full + * version of the document from some point after the update occurred. If the + * document was deleted since the updated happened, it will be null. + */ + fullDocument?: TSchema; +} + +/** @public */ +export interface UpdateDescription { + /** + * A document containing key:value pairs of names of the fields that were + * changed, and the new value for those fields. + */ + updatedFields: Partial; + + /** + * An array of field names that were removed from the document. + */ + removedFields: string[]; +} + +/** @public */ +export type ChangeStreamEvents = { + resumeTokenChanged(token: ResumeToken): void; + init(response: Document): void; + more(response?: Document | undefined): void; + response(): void; + end(): void; + error(error: Error): void; + change(change: ChangeStreamDocument): void; +} & AbstractCursorEvents; + +/** + * Creates a new Change Stream instance. Normally created using {@link Collection#watch|Collection.watch()}. + * @public + */ +export class ChangeStream< + TSchema extends Document = Document +> extends TypedEventEmitter { + pipeline: Document[]; + options: ChangeStreamOptions; + parent: MongoClient | Db | Collection; + namespace: MongoDBNamespace; + type: symbol; + /** @internal */ + cursor?: ChangeStreamCursor; + streamOptions?: CursorStreamOptions; + /** @internal */ + [kResumeQueue]: Denque>>; + /** @internal */ + [kCursorStream]?: Readable; + /** @internal */ + [kClosed]: boolean; + /** @internal */ + [kMode]: false | 'iterator' | 'emitter'; + + /** @event */ + static readonly RESPONSE = 'response' as const; + /** @event */ + static readonly MORE = 'more' as const; + /** @event */ + static readonly INIT = 'init' as const; + /** @event */ + static readonly CLOSE = 'close' as const; + /** + * Fired for each new matching change in the specified namespace. 
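A minimal sketch of the emitter mode this `change` event documentation describes; the connection string and namespace are placeholders, and change streams require a replica set or sharded cluster:

```ts
import { MongoClient } from 'mongodb';

async function watchTaskInserts() {
  const client = await MongoClient.connect('mongodb://localhost:27017/?replicaSet=rs0');
  const changeStream = client
    .db('test')
    .collection('tasks')
    .watch([{ $match: { operationType: 'insert' } }]);

  // Adding a 'change' listener switches the stream into emitter (flowing) mode;
  // calling next()/hasNext() afterwards would throw a MongoAPIError.
  changeStream.on('change', change => console.log('inserted:', change.fullDocument));
  changeStream.on('error', err => console.error('change stream error:', err));

  // later: await changeStream.close(); await client.close();
}
```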
Attaching a `change` + * event listener to a Change Stream will switch the stream into flowing mode. Data will + * then be passed as soon as it is available. + * @event + */ + static readonly CHANGE = 'change' as const; + /** @event */ + static readonly END = 'end' as const; + /** @event */ + static readonly ERROR = 'error' as const; + /** + * Emitted each time the change stream stores a new resume token. + * @event + */ + static readonly RESUME_TOKEN_CHANGED = 'resumeTokenChanged' as const; + + /** + * @internal + * + * @param parent - The parent object that created this change stream + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents + */ + constructor( + parent: OperationParent, + pipeline: Document[] = [], + options: ChangeStreamOptions = {} + ) { + super(); + + this.pipeline = pipeline; + this.options = options; + + if (parent instanceof Collection) { + this.type = CHANGE_DOMAIN_TYPES.COLLECTION; + } else if (parent instanceof Db) { + this.type = CHANGE_DOMAIN_TYPES.DATABASE; + } else if (parent instanceof MongoClient) { + this.type = CHANGE_DOMAIN_TYPES.CLUSTER; + } else { + throw new MongoChangeStreamError( + 'Parent provided to ChangeStream constructor must be an instance of Collection, Db, or MongoClient' + ); + } + + this.parent = parent; + this.namespace = parent.s.namespace; + if (!this.options.readPreference && parent.readPreference) { + this.options.readPreference = parent.readPreference; + } + + this[kResumeQueue] = new Denque(); + + // Create contained Change Stream cursor + this.cursor = createChangeStreamCursor(this, options); + + this[kClosed] = false; + this[kMode] = false; + + // Listen for any `change` listeners being added to ChangeStream + this.on('newListener', eventName => { + if (eventName === 'change' && this.cursor && this.listenerCount('change') === 0) { + streamEvents(this, this.cursor); + } + }); + + this.on('removeListener', eventName => { + if (eventName === 'change' && this.listenerCount('change') === 0 && this.cursor) { + this[kCursorStream]?.removeAllListeners('data'); + } + }); + } + + /** @internal */ + get cursorStream(): Readable | undefined { + return this[kCursorStream]; + } + + /** The cached resume token that is used to resume after the most recently returned change. */ + get resumeToken(): ResumeToken { + return this.cursor?.resumeToken; + } + + /** Check if there is any document still available in the Change Stream */ + hasNext(callback?: Callback): Promise | void { + setIsIterator(this); + return maybePromise(callback, cb => { + getCursor(this, (err, cursor) => { + if (err || !cursor) return cb(err); // failed to resume, raise an error + cursor.hasNext(cb); + }); + }); + } + + /** Get the next available document from the Change Stream. */ + next(): Promise>; + next(callback: Callback>): void; + next( + callback?: Callback> + ): Promise> | void { + setIsIterator(this); + return maybePromise(callback, cb => { + getCursor(this, (err, cursor) => { + if (err || !cursor) return cb(err); // failed to resume, raise an error + cursor.next((error, change) => { + if (error) { + this[kResumeQueue].push(() => this.next(cb)); + processError(this, error, cb); + return; + } + processNewChange(this, change, cb); + }); + }); + }); + } + + /** Is the cursor closed */ + get closed(): boolean { + return this[kClosed] || (this.cursor?.closed ?? 
false); + } + + /** Close the Change Stream */ + close(callback?: Callback): Promise | void { + this[kClosed] = true; + + return maybePromise(callback, cb => { + if (!this.cursor) { + return cb(); + } + + const cursor = this.cursor; + return cursor.close(err => { + endStream(this); + this.cursor = undefined; + return cb(err); + }); + }); + } + + /** + * Return a modified Readable stream including a possible transform method. + * @throws MongoDriverError if this.cursor is undefined + */ + stream(options?: CursorStreamOptions): Readable { + this.streamOptions = options; + if (!this.cursor) throw new MongoChangeStreamError(NO_CURSOR_ERROR); + return this.cursor.stream(options); + } + + /** + * Try to get the next available document from the Change Stream's cursor or `null` if an empty batch is returned + */ + tryNext(): Promise; + tryNext(callback: Callback): void; + tryNext(callback?: Callback): Promise | void { + setIsIterator(this); + return maybePromise(callback, cb => { + getCursor(this, (err, cursor) => { + if (err || !cursor) return cb(err); // failed to resume, raise an error + return cursor.tryNext(cb); + }); + }); + } +} + +/** @internal */ +export interface ChangeStreamCursorOptions extends AbstractCursorOptions { + startAtOperationTime?: OperationTime; + resumeAfter?: ResumeToken; + startAfter?: boolean; +} + +/** @internal */ +export class ChangeStreamCursor extends AbstractCursor< + ChangeStreamDocument, + ChangeStreamEvents +> { + _resumeToken: ResumeToken; + startAtOperationTime?: OperationTime; + hasReceived?: boolean; + resumeAfter: ResumeToken; + startAfter: ResumeToken; + options: ChangeStreamCursorOptions; + + postBatchResumeToken?: ResumeToken; + pipeline: Document[]; + + constructor( + topology: Topology, + namespace: MongoDBNamespace, + pipeline: Document[] = [], + options: ChangeStreamCursorOptions = {} + ) { + super(topology, namespace, options); + + this.pipeline = pipeline; + this.options = options; + this._resumeToken = null; + this.startAtOperationTime = options.startAtOperationTime; + + if (options.startAfter) { + this.resumeToken = options.startAfter; + } else if (options.resumeAfter) { + this.resumeToken = options.resumeAfter; + } + } + + set resumeToken(token: ResumeToken) { + this._resumeToken = token; + this.emit(ChangeStream.RESUME_TOKEN_CHANGED, token); + } + + get resumeToken(): ResumeToken { + return this._resumeToken; + } + + get resumeOptions(): ResumeOptions { + const result = {} as ResumeOptions; + for (const optionName of CURSOR_OPTIONS) { + if (Reflect.has(this.options, optionName)) { + Reflect.set(result, optionName, Reflect.get(this.options, optionName)); + } + } + + if (this.resumeToken || this.startAtOperationTime) { + ['resumeAfter', 'startAfter', 'startAtOperationTime'].forEach(key => + Reflect.deleteProperty(result, key) + ); + + if (this.resumeToken) { + const resumeKey = + this.options.startAfter && !this.hasReceived ? 
'startAfter' : 'resumeAfter'; + Reflect.set(result, resumeKey, this.resumeToken); + } else if (this.startAtOperationTime && maxWireVersion(this.server) >= 7) { + result.startAtOperationTime = this.startAtOperationTime; + } + } + + return result; + } + + cacheResumeToken(resumeToken: ResumeToken): void { + if (this.bufferedCount() === 0 && this.postBatchResumeToken) { + this.resumeToken = this.postBatchResumeToken; + } else { + this.resumeToken = resumeToken; + } + this.hasReceived = true; + } + + _processBatch(batchName: string, response?: Document): void { + const cursor = response?.cursor || {}; + if (cursor.postBatchResumeToken) { + this.postBatchResumeToken = cursor.postBatchResumeToken; + + if (cursor[batchName].length === 0) { + this.resumeToken = cursor.postBatchResumeToken; + } + } + } + + clone(): AbstractCursor> { + return new ChangeStreamCursor(this.topology, this.namespace, this.pipeline, { + ...this.cursorOptions + }); + } + + _initialize(session: ClientSession, callback: Callback): void { + const aggregateOperation = new AggregateOperation(this.namespace, this.pipeline, { + ...this.cursorOptions, + ...this.options, + session + }); + + executeOperation(this.topology, aggregateOperation, (err, response) => { + if (err || response == null) { + return callback(err); + } + + const server = aggregateOperation.server; + if ( + this.startAtOperationTime == null && + this.resumeAfter == null && + this.startAfter == null && + maxWireVersion(server) >= 7 + ) { + this.startAtOperationTime = response.operationTime; + } + + this._processBatch('firstBatch', response); + + this.emit(ChangeStream.INIT, response); + this.emit(ChangeStream.RESPONSE); + + // TODO: NODE-2882 + callback(undefined, { server, session, response }); + }); + } + + _getMore(batchSize: number, callback: Callback): void { + super._getMore(batchSize, (err, response) => { + if (err) { + return callback(err); + } + + this._processBatch('nextBatch', response); + + this.emit(ChangeStream.MORE, response); + this.emit(ChangeStream.RESPONSE); + callback(err, response); + }); + } +} + +const CHANGE_STREAM_EVENTS = [ + ChangeStream.RESUME_TOKEN_CHANGED, + ChangeStream.END, + ChangeStream.CLOSE +]; + +function setIsEmitter(changeStream: ChangeStream): void { + if (changeStream[kMode] === 'iterator') { + // TODO(NODE-3485): Replace with MongoChangeStreamModeError + throw new MongoAPIError( + 'ChangeStream cannot be used as an EventEmitter after being used as an iterator' + ); + } + changeStream[kMode] = 'emitter'; +} + +function setIsIterator(changeStream: ChangeStream): void { + if (changeStream[kMode] === 'emitter') { + // TODO(NODE-3485): Replace with MongoChangeStreamModeError + throw new MongoAPIError( + 'ChangeStream cannot be used as an iterator after being used as an EventEmitter' + ); + } + changeStream[kMode] = 'iterator'; +} +/** + * Create a new change stream cursor based on self's configuration + * @internal + */ +function createChangeStreamCursor( + changeStream: ChangeStream, + options: ChangeStreamOptions +): ChangeStreamCursor { + const changeStreamStageOptions: Document = { fullDocument: options.fullDocument || 'default' }; + applyKnownOptions(changeStreamStageOptions, options, CHANGE_STREAM_OPTIONS); + if (changeStream.type === CHANGE_DOMAIN_TYPES.CLUSTER) { + changeStreamStageOptions.allChangesForCluster = true; + } + + const pipeline = [{ $changeStream: changeStreamStageOptions } as Document].concat( + changeStream.pipeline + ); + + const cursorOptions = applyKnownOptions({}, options, CURSOR_OPTIONS); + const 
changeStreamCursor = new ChangeStreamCursor( + getTopology(changeStream.parent), + changeStream.namespace, + pipeline, + cursorOptions + ); + + for (const event of CHANGE_STREAM_EVENTS) { + changeStreamCursor.on(event, e => changeStream.emit(event, e)); + } + + if (changeStream.listenerCount(ChangeStream.CHANGE) > 0) { + streamEvents(changeStream, changeStreamCursor); + } + + return changeStreamCursor; +} + +function applyKnownOptions(target: Document, source: Document, optionNames: string[]) { + optionNames.forEach(name => { + if (source[name]) { + target[name] = source[name]; + } + }); + + return target; +} + +interface TopologyWaitOptions { + start?: number; + timeout?: number; + readPreference?: ReadPreference; +} +// This method performs a basic server selection loop, satisfying the requirements of +// ChangeStream resumability until the new SDAM layer can be used. +const SELECTION_TIMEOUT = 30000; +function waitForTopologyConnected( + topology: Topology, + options: TopologyWaitOptions, + callback: Callback +) { + setTimeout(() => { + if (options && options.start == null) { + options.start = now(); + } + + const start = options.start || now(); + const timeout = options.timeout || SELECTION_TIMEOUT; + if (topology.isConnected()) { + return callback(); + } + + if (calculateDurationInMs(start) > timeout) { + // TODO(NODE-3497): Replace with MongoNetworkTimeoutError + return callback(new MongoRuntimeError('Timed out waiting for connection')); + } + + waitForTopologyConnected(topology, options, callback); + }, 500); // this is an arbitrary wait time to allow SDAM to transition +} + +function closeWithError( + changeStream: ChangeStream, + error: AnyError, + callback?: Callback +): void { + if (!callback) { + changeStream.emit(ChangeStream.ERROR, error); + } + + changeStream.close(() => callback && callback(error)); +} + +function streamEvents( + changeStream: ChangeStream, + cursor: ChangeStreamCursor +): void { + setIsEmitter(changeStream); + const stream = changeStream[kCursorStream] || cursor.stream(); + changeStream[kCursorStream] = stream; + stream.on('data', change => processNewChange(changeStream, change)); + stream.on('error', error => processError(changeStream, error)); +} + +function endStream(changeStream: ChangeStream): void { + const cursorStream = changeStream[kCursorStream]; + if (cursorStream) { + ['data', 'close', 'end', 'error'].forEach(event => cursorStream.removeAllListeners(event)); + cursorStream.destroy(); + } + + changeStream[kCursorStream] = undefined; +} + +function processNewChange( + changeStream: ChangeStream, + change: Nullable>, + callback?: Callback> +) { + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + if (callback) callback(new MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + + // a null change means the cursor has been notified, implicitly closing the change stream + if (change == null) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + return closeWithError(changeStream, new MongoRuntimeError(CHANGESTREAM_CLOSED_ERROR), callback); + } + + if (change && !change._id) { + return closeWithError( + changeStream, + new MongoChangeStreamError(NO_RESUME_TOKEN_ERROR), + callback + ); + } + + // cache the resume token + changeStream.cursor?.cacheResumeToken(change._id); + + // wipe the startAtOperationTime if there was one so that there won't be a conflict + // between resumeToken and startAtOperationTime if we need to reconnect the cursor + changeStream.options.startAtOperationTime = 
undefined; + + // Return the change + if (!callback) return changeStream.emit(ChangeStream.CHANGE, change); + return callback(undefined, change); +} + +function processError( + changeStream: ChangeStream, + error: AnyError, + callback?: Callback +) { + const cursor = changeStream.cursor; + + // If the change stream has been closed explicitly, do not process error. + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + if (callback) callback(new MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + + // if the resume succeeds, continue with the new cursor + function resumeWithCursor(newCursor: ChangeStreamCursor) { + changeStream.cursor = newCursor; + processResumeQueue(changeStream); + } + + // otherwise, raise an error and close the change stream + function unresumableError(err: AnyError) { + if (!callback) { + changeStream.emit(ChangeStream.ERROR, err); + } + + changeStream.close(() => processResumeQueue(changeStream, err)); + } + + if (cursor && isResumableError(error as MongoError, maxWireVersion(cursor.server))) { + changeStream.cursor = undefined; + + // stop listening to all events from old cursor + endStream(changeStream); + + // close internal cursor, ignore errors + cursor.close(); + + const topology = getTopology(changeStream.parent); + waitForTopologyConnected(topology, { readPreference: cursor.readPreference }, err => { + // if the topology can't reconnect, close the stream + if (err) return unresumableError(err); + + // create a new cursor, preserving the old cursor's options + const newCursor = createChangeStreamCursor(changeStream, cursor.resumeOptions); + + // attempt to continue in emitter mode + if (!callback) return resumeWithCursor(newCursor); + + // attempt to continue in iterator mode + newCursor.hasNext(err => { + // if there's an error immediately after resuming, close the stream + if (err) return unresumableError(err); + resumeWithCursor(newCursor); + }); + }); + return; + } + + // if initial error wasn't resumable, raise an error and close the change stream + return closeWithError(changeStream, error, callback); +} + +/** + * Safely provides a cursor across resume attempts + * + * @param changeStream - the parent ChangeStream + */ +function getCursor(changeStream: ChangeStream, callback: Callback>) { + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + callback(new MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + + // if a cursor exists and it is open, return it + if (changeStream.cursor) { + callback(undefined, changeStream.cursor); + return; + } + + // no cursor, queue callback until topology reconnects + changeStream[kResumeQueue].push(callback); +} + +/** + * Drain the resume queue when a new has become available + * + * @param changeStream - the parent ChangeStream + * @param err - error getting a new cursor + */ +function processResumeQueue(changeStream: ChangeStream, err?: Error) { + while (changeStream[kResumeQueue].length) { + const request = changeStream[kResumeQueue].pop(); + if (!request) break; // Should never occur but TS can't use the length check in the while condition + + if (!err) { + if (changeStream[kClosed]) { + // TODO(NODE-3485): Replace with MongoChangeStreamClosedError + request(new MongoAPIError(CHANGESTREAM_CLOSED_ERROR)); + return; + } + if (!changeStream.cursor) { + request(new MongoChangeStreamError(NO_CURSOR_ERROR)); + return; + } + } + request(err, changeStream.cursor); + } +} diff --git 
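Since `processError` above only resumes automatically for resumable errors within a single process, callers that need to survive restarts typically persist the cached resume token themselves. A sketch under that assumption; `loadToken` and `saveToken` are hypothetical persistence helpers supplied by the application, not driver APIs:

```ts
import type { Collection } from 'mongodb';

async function tailTasks(
  collection: Collection,
  loadToken: () => unknown,          // hypothetical: read the last persisted token, if any
  saveToken: (token: unknown) => void // hypothetical: persist the token durably
) {
  const resumeAfter = loadToken();
  const changeStream = collection.watch([], resumeAfter ? { resumeAfter } : {});

  // Iterator mode: hasNext()/next() block until the next change arrives.
  while (await changeStream.hasNext()) {
    const change = await changeStream.next();
    console.log(change.operationType, change.documentKey);
    saveToken(changeStream.resumeToken); // token cached by cacheResumeToken() above
  }
}
```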
a/node_modules/mongodb/src/cmap/auth/auth_provider.ts b/node_modules/mongodb/src/cmap/auth/auth_provider.ts new file mode 100644 index 000000000..306ebd1ac --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/auth_provider.ts @@ -0,0 +1,60 @@ +import type { Document } from '../../bson'; +import type { Connection, ConnectionOptions } from '../connection'; +import type { MongoCredentials } from './mongo_credentials'; +import type { HandshakeDocument } from '../connect'; +import type { ClientMetadataOptions, Callback } from '../../utils'; +import { MongoRuntimeError } from '../../error'; + +export type AuthContextOptions = ConnectionOptions & ClientMetadataOptions; + +/** Context used during authentication */ +export class AuthContext { + /** The connection to authenticate */ + connection: Connection; + /** The credentials to use for authentication */ + credentials?: MongoCredentials; + /** The options passed to the `connect` method */ + options: AuthContextOptions; + + /** A response from an initial auth attempt, only some mechanisms use this (e.g, SCRAM) */ + response?: Document; + /** A random nonce generated for use in an authentication conversation */ + nonce?: Buffer; + + constructor( + connection: Connection, + credentials: MongoCredentials | undefined, + options: AuthContextOptions + ) { + this.connection = connection; + this.credentials = credentials; + this.options = options; + } +} + +export class AuthProvider { + /** + * Prepare the handshake document before the initial handshake. + * + * @param handshakeDoc - The document used for the initial handshake on a connection + * @param authContext - Context for authentication flow + */ + prepare( + handshakeDoc: HandshakeDocument, + authContext: AuthContext, + callback: Callback + ): void { + callback(undefined, handshakeDoc); + } + + /** + * Authenticate + * + * @param context - A shared context for authentication flow + * @param callback - The callback to return the result from the authentication + */ + auth(context: AuthContext, callback: Callback): void { + // TODO(NODE-3483): Replace this with MongoMethodOverrideError + callback(new MongoRuntimeError('`auth` method must be overridden by subclass')); + } +} diff --git a/node_modules/mongodb/src/cmap/auth/defaultAuthProviders.ts b/node_modules/mongodb/src/cmap/auth/defaultAuthProviders.ts new file mode 100644 index 000000000..df937d4a0 --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/defaultAuthProviders.ts @@ -0,0 +1,32 @@ +import { MongoCR } from './mongocr'; +import { X509 } from './x509'; +import { Plain } from './plain'; +import { GSSAPI } from './gssapi'; +import { ScramSHA1, ScramSHA256 } from './scram'; +import { MongoDBAWS } from './mongodb_aws'; +import type { AuthProvider } from './auth_provider'; + +/** @public */ +export const AuthMechanism = Object.freeze({ + MONGODB_AWS: 'MONGODB-AWS', + MONGODB_CR: 'MONGODB-CR', + MONGODB_DEFAULT: 'DEFAULT', + MONGODB_GSSAPI: 'GSSAPI', + MONGODB_PLAIN: 'PLAIN', + MONGODB_SCRAM_SHA1: 'SCRAM-SHA-1', + MONGODB_SCRAM_SHA256: 'SCRAM-SHA-256', + MONGODB_X509: 'MONGODB-X509' +} as const); + +/** @public */ +export type AuthMechanism = typeof AuthMechanism[keyof typeof AuthMechanism]; + +export const AUTH_PROVIDERS = new Map([ + [AuthMechanism.MONGODB_AWS, new MongoDBAWS()], + [AuthMechanism.MONGODB_CR, new MongoCR()], + [AuthMechanism.MONGODB_GSSAPI, new GSSAPI()], + [AuthMechanism.MONGODB_PLAIN, new Plain()], + [AuthMechanism.MONGODB_SCRAM_SHA1, new ScramSHA1()], + [AuthMechanism.MONGODB_SCRAM_SHA256, new ScramSHA256()], + 
[AuthMechanism.MONGODB_X509, new X509()] +]); diff --git a/node_modules/mongodb/src/cmap/auth/gssapi.ts b/node_modules/mongodb/src/cmap/auth/gssapi.ts new file mode 100644 index 000000000..e12d96039 --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/gssapi.ts @@ -0,0 +1,190 @@ +import { AuthProvider, AuthContext } from './auth_provider'; +import { + MongoRuntimeError, + MongoInvalidArgumentError, + MongoMissingCredentialsError, + MongoError, + MongoMissingDependencyError +} from '../../error'; +import { Kerberos, KerberosClient } from '../../deps'; +import { Callback, ns } from '../../utils'; +import type { Document } from '../../bson'; + +type MechanismProperties = { + gssapiCanonicalizeHostName?: boolean; + SERVICE_NAME?: string; + SERVICE_REALM?: string; +}; + +import * as dns from 'dns'; + +export class GSSAPI extends AuthProvider { + auth(authContext: AuthContext, callback: Callback): void { + const { connection, credentials } = authContext; + if (credentials == null) + return callback( + new MongoMissingCredentialsError('Credentials required for GSSAPI authentication') + ); + const { username } = credentials; + function externalCommand( + command: Document, + cb: Callback<{ payload: string; conversationId: any }> + ) { + return connection.command(ns('$external.$cmd'), command, undefined, cb); + } + makeKerberosClient(authContext, (err, client) => { + if (err) return callback(err); + if (client == null) return callback(new MongoMissingDependencyError('GSSAPI client missing')); + client.step('', (err, payload) => { + if (err) return callback(err); + + externalCommand(saslStart(payload), (err, result) => { + if (err) return callback(err); + if (result == null) return callback(); + negotiate(client, 10, result.payload, (err, payload) => { + if (err) return callback(err); + + externalCommand(saslContinue(payload, result.conversationId), (err, result) => { + if (err) return callback(err); + if (result == null) return callback(); + finalize(client, username, result.payload, (err, payload) => { + if (err) return callback(err); + + externalCommand( + { + saslContinue: 1, + conversationId: result.conversationId, + payload + }, + (err, result) => { + if (err) return callback(err); + + callback(undefined, result); + } + ); + }); + }); + }); + }); + }); + }); + } +} +function makeKerberosClient(authContext: AuthContext, callback: Callback): void { + const { hostAddress } = authContext.options; + const { credentials } = authContext; + if (!hostAddress || typeof hostAddress.host !== 'string' || !credentials) { + return callback( + new MongoInvalidArgumentError('Connection must have host and port and credentials defined.') + ); + } + + if ('kModuleError' in Kerberos) { + return callback(Kerberos['kModuleError']); + } + const { initializeClient } = Kerberos; + + const { username, password } = credentials; + const mechanismProperties = credentials.mechanismProperties as MechanismProperties; + + const serviceName = mechanismProperties.SERVICE_NAME ?? 'mongodb'; + + performGssapiCanonicalizeHostName( + hostAddress.host, + mechanismProperties, + (err?: Error | MongoError, host?: string) => { + if (err) return callback(err); + + const initOptions = {}; + if (password != null) { + Object.assign(initOptions, { user: username, password: password }); + } + + let spn = `${serviceName}${process.platform === 'win32' ? 
'/' : '@'}${host}`; + if ('SERVICE_REALM' in mechanismProperties) { + spn = `${spn}@${mechanismProperties.SERVICE_REALM}`; + } + + initializeClient(spn, initOptions, (err: string, client: KerberosClient): void => { + // TODO(NODE-3483) + if (err) return callback(new MongoRuntimeError(err)); + callback(undefined, client); + }); + } + ); +} + +function saslStart(payload?: string): Document { + return { + saslStart: 1, + mechanism: 'GSSAPI', + payload, + autoAuthorize: 1 + }; +} + +function saslContinue(payload?: string, conversationId?: number): Document { + return { + saslContinue: 1, + conversationId, + payload + }; +} + +function negotiate( + client: KerberosClient, + retries: number, + payload: string, + callback: Callback +): void { + client.step(payload, (err, response) => { + // Retries exhausted, raise error + if (err && retries === 0) return callback(err); + + // Adjust number of retries and call step again + if (err) return negotiate(client, retries - 1, payload, callback); + + // Return the payload + callback(undefined, response || ''); + }); +} + +function finalize( + client: KerberosClient, + user: string, + payload: string, + callback: Callback +): void { + // GSS Client Unwrap + client.unwrap(payload, (err, response) => { + if (err) return callback(err); + + // Wrap the response + client.wrap(response || '', { user }, (err, wrapped) => { + if (err) return callback(err); + + // Return the payload + callback(undefined, wrapped); + }); + }); +} + +function performGssapiCanonicalizeHostName( + host: string, + mechanismProperties: MechanismProperties, + callback: Callback +): void { + if (!mechanismProperties.gssapiCanonicalizeHostName) return callback(undefined, host); + + // Attempt to resolve the host name + dns.resolveCname(host, (err, r) => { + if (err) return callback(err); + + // Get the first resolve host id + if (Array.isArray(r) && r.length > 0) { + return callback(undefined, r[0]); + } + + callback(undefined, host); + }); +} diff --git a/node_modules/mongodb/src/cmap/auth/mongo_credentials.ts b/node_modules/mongodb/src/cmap/auth/mongo_credentials.ts new file mode 100644 index 000000000..cbc4007ac --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/mongo_credentials.ts @@ -0,0 +1,179 @@ +// Resolves the default auth mechanism according to + +import type { Document } from '../../bson'; +import { MongoAPIError, MongoMissingCredentialsError } from '../../error'; +import { AuthMechanism } from './defaultAuthProviders'; + +// https://github.com/mongodb/specifications/blob/master/source/auth/auth.rst +function getDefaultAuthMechanism(ismaster?: Document): AuthMechanism { + if (ismaster) { + // If ismaster contains saslSupportedMechs, use scram-sha-256 + // if it is available, else scram-sha-1 + if (Array.isArray(ismaster.saslSupportedMechs)) { + return ismaster.saslSupportedMechs.includes(AuthMechanism.MONGODB_SCRAM_SHA256) + ? AuthMechanism.MONGODB_SCRAM_SHA256 + : AuthMechanism.MONGODB_SCRAM_SHA1; + } + + // Fallback to legacy selection method. 
If wire version >= 3, use scram-sha-1 + if (ismaster.maxWireVersion >= 3) { + return AuthMechanism.MONGODB_SCRAM_SHA1; + } + } + + // Default for wireprotocol < 3 + return AuthMechanism.MONGODB_CR; +} + +/** @public */ +export interface AuthMechanismProperties extends Document { + SERVICE_NAME?: string; + SERVICE_REALM?: string; + CANONICALIZE_HOST_NAME?: boolean; + AWS_SESSION_TOKEN?: string; +} + +/** @public */ +export interface MongoCredentialsOptions { + username: string; + password: string; + source: string; + db?: string; + mechanism?: AuthMechanism; + mechanismProperties: AuthMechanismProperties; +} + +/** + * A representation of the credentials used by MongoDB + * @public + */ +export class MongoCredentials { + /** The username used for authentication */ + readonly username: string; + /** The password used for authentication */ + readonly password: string; + /** The database that the user should authenticate against */ + readonly source: string; + /** The method used to authenticate */ + readonly mechanism: AuthMechanism; + /** Special properties used by some types of auth mechanisms */ + readonly mechanismProperties: AuthMechanismProperties; + + constructor(options: MongoCredentialsOptions) { + this.username = options.username; + this.password = options.password; + this.source = options.source; + if (!this.source && options.db) { + this.source = options.db; + } + this.mechanism = options.mechanism || AuthMechanism.MONGODB_DEFAULT; + this.mechanismProperties = options.mechanismProperties || {}; + + if (this.mechanism.match(/MONGODB-AWS/i)) { + if (!this.username && process.env.AWS_ACCESS_KEY_ID) { + this.username = process.env.AWS_ACCESS_KEY_ID; + } + + if (!this.password && process.env.AWS_SECRET_ACCESS_KEY) { + this.password = process.env.AWS_SECRET_ACCESS_KEY; + } + + if ( + this.mechanismProperties.AWS_SESSION_TOKEN == null && + process.env.AWS_SESSION_TOKEN != null + ) { + this.mechanismProperties = { + ...this.mechanismProperties, + AWS_SESSION_TOKEN: process.env.AWS_SESSION_TOKEN + }; + } + } + + Object.freeze(this.mechanismProperties); + Object.freeze(this); + } + + /** Determines if two MongoCredentials objects are equivalent */ + equals(other: MongoCredentials): boolean { + return ( + this.mechanism === other.mechanism && + this.username === other.username && + this.password === other.password && + this.source === other.source + ); + } + + /** + * If the authentication mechanism is set to "default", resolves the authMechanism + * based on the server version and server supported sasl mechanisms. 
+ * + * @param ismaster - An ismaster response from the server + */ + resolveAuthMechanism(ismaster?: Document): MongoCredentials { + // If the mechanism is not "default", then it does not need to be resolved + if (this.mechanism.match(/DEFAULT/i)) { + return new MongoCredentials({ + username: this.username, + password: this.password, + source: this.source, + mechanism: getDefaultAuthMechanism(ismaster), + mechanismProperties: this.mechanismProperties + }); + } + + return this; + } + + validate(): void { + if ( + (this.mechanism === AuthMechanism.MONGODB_GSSAPI || + this.mechanism === AuthMechanism.MONGODB_CR || + this.mechanism === AuthMechanism.MONGODB_PLAIN || + this.mechanism === AuthMechanism.MONGODB_SCRAM_SHA1 || + this.mechanism === AuthMechanism.MONGODB_SCRAM_SHA256) && + !this.username + ) { + throw new MongoMissingCredentialsError(`Username required for mechanism '${this.mechanism}'`); + } + + if ( + this.mechanism === AuthMechanism.MONGODB_GSSAPI || + this.mechanism === AuthMechanism.MONGODB_AWS || + this.mechanism === AuthMechanism.MONGODB_X509 + ) { + if (this.source != null && this.source !== '$external') { + // TODO(NODE-3485): Replace this with a MongoAuthValidationError + throw new MongoAPIError( + `Invalid source '${this.source}' for mechanism '${this.mechanism}' specified.` + ); + } + } + + if (this.mechanism === AuthMechanism.MONGODB_PLAIN && this.source == null) { + // TODO(NODE-3485): Replace this with a MongoAuthValidationError + throw new MongoAPIError('PLAIN Authentication Mechanism needs an auth source'); + } + + if (this.mechanism === AuthMechanism.MONGODB_X509 && this.password != null) { + if (this.password === '') { + Reflect.set(this, 'password', undefined); + return; + } + // TODO(NODE-3485): Replace this with a MongoAuthValidationError + throw new MongoAPIError(`Password not allowed for mechanism MONGODB-X509`); + } + } + + static merge( + creds: MongoCredentials | undefined, + options: Partial + ): MongoCredentials { + return new MongoCredentials({ + username: options.username ?? creds?.username ?? '', + password: options.password ?? creds?.password ?? '', + mechanism: options.mechanism ?? creds?.mechanism ?? AuthMechanism.MONGODB_DEFAULT, + mechanismProperties: options.mechanismProperties ?? creds?.mechanismProperties ?? {}, + source: options.source ?? options.db ?? creds?.source ?? 
'admin' + }); + } +} diff --git a/node_modules/mongodb/src/cmap/auth/mongocr.ts b/node_modules/mongodb/src/cmap/auth/mongocr.ts new file mode 100644 index 000000000..fa801caf4 --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/mongocr.ts @@ -0,0 +1,46 @@ +import * as crypto from 'crypto'; +import { AuthProvider, AuthContext } from './auth_provider'; +import { Callback, ns } from '../../utils'; +import { MongoMissingCredentialsError } from '../../error'; + +export class MongoCR extends AuthProvider { + auth(authContext: AuthContext, callback: Callback): void { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + const username = credentials.username; + const password = credentials.password; + const source = credentials.source; + connection.command(ns(`${source}.$cmd`), { getnonce: 1 }, undefined, (err, r) => { + let nonce = null; + let key = null; + + // Get nonce + if (err == null) { + nonce = r.nonce; + + // Use node md5 generator + let md5 = crypto.createHash('md5'); + + // Generate keys used for authentication + md5.update(`${username}:mongo:${password}`, 'utf8'); + const hash_password = md5.digest('hex'); + + // Final key + md5 = crypto.createHash('md5'); + md5.update(nonce + username + hash_password, 'utf8'); + key = md5.digest('hex'); + } + + const authenticateCommand = { + authenticate: 1, + user: username, + nonce, + key + }; + + connection.command(ns(`${source}.$cmd`), authenticateCommand, undefined, callback); + }); + } +} diff --git a/node_modules/mongodb/src/cmap/auth/mongodb_aws.ts b/node_modules/mongodb/src/cmap/auth/mongodb_aws.ts new file mode 100644 index 000000000..a6cb48ab9 --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/mongodb_aws.ts @@ -0,0 +1,289 @@ +import * as http from 'http'; +import * as crypto from 'crypto'; +import * as url from 'url'; +import * as BSON from '../../bson'; +import { AuthProvider, AuthContext } from './auth_provider'; +import { MongoCredentials } from './mongo_credentials'; +import { + MongoRuntimeError, + MongoMissingCredentialsError, + MongoCompatibilityError +} from '../../error'; +import { maxWireVersion, Callback, ns } from '../../utils'; +import type { BSONSerializeOptions } from '../../bson'; + +import { aws4 } from '../../deps'; +import { AuthMechanism } from './defaultAuthProviders'; +import type { Binary } from 'bson'; + +const ASCII_N = 110; +const AWS_RELATIVE_URI = 'http://169.254.170.2'; +const AWS_EC2_URI = 'http://169.254.169.254'; +const AWS_EC2_PATH = '/latest/meta-data/iam/security-credentials'; +const bsonOptions: BSONSerializeOptions = { + promoteLongs: true, + promoteValues: true, + promoteBuffers: false, + bsonRegExp: false +}; + +interface AWSSaslContinuePayload { + a: string; + d: string; + t?: string; +} + +export class MongoDBAWS extends AuthProvider { + auth(authContext: AuthContext, callback: Callback): void { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + + if ('kModuleError' in aws4) { + return callback(aws4['kModuleError']); + } + const { sign } = aws4; + + if (maxWireVersion(connection) < 9) { + callback( + new MongoCompatibilityError( + 'MONGODB-AWS authentication requires MongoDB version 4.4 or later' + ) + ); + return; + } + + if (!credentials.username) { + makeTempCredentials(credentials, (err, tempCredentials) => { + if (err || !tempCredentials) return 
callback(err); + + authContext.credentials = tempCredentials; + this.auth(authContext, callback); + }); + + return; + } + + const accessKeyId = credentials.username; + const secretAccessKey = credentials.password; + const sessionToken = credentials.mechanismProperties.AWS_SESSION_TOKEN; + + // If all three defined, include sessionToken, else include username and pass, else no credentials + const awsCredentials = + accessKeyId && secretAccessKey && sessionToken + ? { accessKeyId, secretAccessKey, sessionToken } + : accessKeyId && secretAccessKey + ? { accessKeyId, secretAccessKey } + : undefined; + + const db = credentials.source; + crypto.randomBytes(32, (err, nonce) => { + if (err) { + callback(err); + return; + } + + const saslStart = { + saslStart: 1, + mechanism: 'MONGODB-AWS', + payload: BSON.serialize({ r: nonce, p: ASCII_N }, bsonOptions) + }; + + connection.command(ns(`${db}.$cmd`), saslStart, undefined, (err, res) => { + if (err) return callback(err); + + const serverResponse = BSON.deserialize(res.payload.buffer, bsonOptions) as { + s: Binary; + h: string; + }; + const host = serverResponse.h; + const serverNonce = serverResponse.s.buffer; + if (serverNonce.length !== 64) { + callback( + // TODO(NODE-3483) + new MongoRuntimeError(`Invalid server nonce length ${serverNonce.length}, expected 64`) + ); + + return; + } + + if (serverNonce.compare(nonce, 0, nonce.length, 0, nonce.length) !== 0) { + // TODO(NODE-3483) + callback(new MongoRuntimeError('Server nonce does not begin with client nonce')); + return; + } + + if (host.length < 1 || host.length > 255 || host.indexOf('..') !== -1) { + // TODO(NODE-3483) + callback(new MongoRuntimeError(`Server returned an invalid host: "${host}"`)); + return; + } + + const body = 'Action=GetCallerIdentity&Version=2011-06-15'; + const options = sign( + { + method: 'POST', + host, + region: deriveRegion(serverResponse.h), + service: 'sts', + headers: { + 'Content-Type': 'application/x-www-form-urlencoded', + 'Content-Length': body.length, + 'X-MongoDB-Server-Nonce': serverNonce.toString('base64'), + 'X-MongoDB-GS2-CB-Flag': 'n' + }, + path: '/', + body + }, + awsCredentials + ); + + const payload: AWSSaslContinuePayload = { + a: options.headers.Authorization, + d: options.headers['X-Amz-Date'] + }; + if (sessionToken) { + payload.t = sessionToken; + } + + const saslContinue = { + saslContinue: 1, + conversationId: 1, + payload: BSON.serialize(payload, bsonOptions) + }; + + connection.command(ns(`${db}.$cmd`), saslContinue, undefined, callback); + }); + }); + } +} + +interface AWSTempCredentials { + AccessKeyId?: string; + SecretAccessKey?: string; + Token?: string; + RoleArn?: string; + Expiration?: Date; +} + +function makeTempCredentials(credentials: MongoCredentials, callback: Callback) { + function done(creds: AWSTempCredentials) { + if (!creds.AccessKeyId || !creds.SecretAccessKey || !creds.Token) { + callback( + new MongoMissingCredentialsError('Could not obtain temporary MONGODB-AWS credentials') + ); + return; + } + + callback( + undefined, + new MongoCredentials({ + username: creds.AccessKeyId, + password: creds.SecretAccessKey, + source: credentials.source, + mechanism: AuthMechanism.MONGODB_AWS, + mechanismProperties: { + AWS_SESSION_TOKEN: creds.Token + } + }) + ); + } + + // If the environment variable AWS_CONTAINER_CREDENTIALS_RELATIVE_URI + // is set then drivers MUST assume that it was set by an AWS ECS agent + if (process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI) { + request( + 
`${AWS_RELATIVE_URI}${process.env.AWS_CONTAINER_CREDENTIALS_RELATIVE_URI}`, + undefined, + (err, res) => { + if (err) return callback(err); + done(res); + } + ); + + return; + } + + // Otherwise assume we are on an EC2 instance + + // get a token + request( + `${AWS_EC2_URI}/latest/api/token`, + { method: 'PUT', json: false, headers: { 'X-aws-ec2-metadata-token-ttl-seconds': 30 } }, + (err, token) => { + if (err) return callback(err); + + // get role name + request( + `${AWS_EC2_URI}/${AWS_EC2_PATH}`, + { json: false, headers: { 'X-aws-ec2-metadata-token': token } }, + (err, roleName) => { + if (err) return callback(err); + + // get temp credentials + request( + `${AWS_EC2_URI}/${AWS_EC2_PATH}/${roleName}`, + { headers: { 'X-aws-ec2-metadata-token': token } }, + (err, creds) => { + if (err) return callback(err); + done(creds); + } + ); + } + ); + } + ); +} + +function deriveRegion(host: string) { + const parts = host.split('.'); + if (parts.length === 1 || parts[1] === 'amazonaws') { + return 'us-east-1'; + } + + return parts[1]; +} + +interface RequestOptions { + json?: boolean; + method?: string; + timeout?: number; + headers?: http.OutgoingHttpHeaders; +} + +function request(uri: string, _options: RequestOptions | undefined, callback: Callback) { + const options = Object.assign( + { + method: 'GET', + timeout: 10000, + json: true + }, + url.parse(uri), + _options + ); + + const req = http.request(options, res => { + res.setEncoding('utf8'); + + let data = ''; + res.on('data', d => (data += d)); + res.on('end', () => { + if (options.json === false) { + callback(undefined, data); + return; + } + + try { + const parsed = JSON.parse(data); + callback(undefined, parsed); + } catch (err) { + // TODO(NODE-3483) + callback(new MongoRuntimeError(`Invalid JSON response: "${data}"`)); + } + }); + }); + + req.on('error', err => callback(err)); + req.end(); +} diff --git a/node_modules/mongodb/src/cmap/auth/plain.ts b/node_modules/mongodb/src/cmap/auth/plain.ts new file mode 100644 index 000000000..afbca63d2 --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/plain.ts @@ -0,0 +1,25 @@ +import { Binary } from '../../bson'; +import { AuthProvider, AuthContext } from './auth_provider'; +import { MongoMissingCredentialsError } from '../../error'; +import { Callback, ns } from '../../utils'; + +export class Plain extends AuthProvider { + auth(authContext: AuthContext, callback: Callback): void { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + const username = credentials.username; + const password = credentials.password; + + const payload = new Binary(Buffer.from(`\x00${username}\x00${password}`)); + const command = { + saslStart: 1, + mechanism: 'PLAIN', + payload: payload, + autoAuthorize: 1 + }; + + connection.command(ns('$external.$cmd'), command, undefined, callback); + } +} diff --git a/node_modules/mongodb/src/cmap/auth/scram.ts b/node_modules/mongodb/src/cmap/auth/scram.ts new file mode 100644 index 000000000..7c58eac7d --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/scram.ts @@ -0,0 +1,373 @@ +import * as crypto from 'crypto'; +import { Binary, Document } from '../../bson'; +import { + AnyError, + MongoRuntimeError, + MongoServerError, + MongoInvalidArgumentError, + MongoMissingCredentialsError +} from '../../error'; +import { AuthProvider, AuthContext } from './auth_provider'; +import { Callback, ns, emitWarning } from '../../utils'; +import type { 
MongoCredentials } from './mongo_credentials'; +import type { HandshakeDocument } from '../connect'; + +import { saslprep } from '../../deps'; +import { AuthMechanism } from './defaultAuthProviders'; + +type CryptoMethod = 'sha1' | 'sha256'; + +class ScramSHA extends AuthProvider { + cryptoMethod: CryptoMethod; + constructor(cryptoMethod: CryptoMethod) { + super(); + this.cryptoMethod = cryptoMethod || 'sha1'; + } + + prepare(handshakeDoc: HandshakeDocument, authContext: AuthContext, callback: Callback) { + const cryptoMethod = this.cryptoMethod; + const credentials = authContext.credentials; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if (cryptoMethod === 'sha256' && saslprep == null) { + emitWarning('Warning: no saslprep library specified. Passwords will not be sanitized'); + } + + crypto.randomBytes(24, (err, nonce) => { + if (err) { + return callback(err); + } + + // store the nonce for later use + Object.assign(authContext, { nonce }); + + const request = Object.assign({}, handshakeDoc, { + speculativeAuthenticate: Object.assign(makeFirstMessage(cryptoMethod, credentials, nonce), { + db: credentials.source + }) + }); + + callback(undefined, request); + }); + } + + auth(authContext: AuthContext, callback: Callback) { + const response = authContext.response; + if (response && response.speculativeAuthenticate) { + continueScramConversation( + this.cryptoMethod, + response.speculativeAuthenticate, + authContext, + callback + ); + + return; + } + + executeScram(this.cryptoMethod, authContext, callback); + } +} + +function cleanUsername(username: string) { + return username.replace('=', '=3D').replace(',', '=2C'); +} + +function clientFirstMessageBare(username: string, nonce: Buffer) { + // NOTE: This is done b/c Javascript uses UTF-16, but the server is hashing in UTF-8. + // Since the username is not sasl-prep-d, we need to do this here. + return Buffer.concat([ + Buffer.from('n=', 'utf8'), + Buffer.from(username, 'utf8'), + Buffer.from(',r=', 'utf8'), + Buffer.from(nonce.toString('base64'), 'utf8') + ]); +} + +function makeFirstMessage( + cryptoMethod: CryptoMethod, + credentials: MongoCredentials, + nonce: Buffer +) { + const username = cleanUsername(credentials.username); + const mechanism = + cryptoMethod === 'sha1' ? AuthMechanism.MONGODB_SCRAM_SHA1 : AuthMechanism.MONGODB_SCRAM_SHA256; + + // NOTE: This is done b/c Javascript uses UTF-16, but the server is hashing in UTF-8. + // Since the username is not sasl-prep-d, we need to do this here. 
+ return { + saslStart: 1, + mechanism, + payload: new Binary( + Buffer.concat([Buffer.from('n,,', 'utf8'), clientFirstMessageBare(username, nonce)]) + ), + autoAuthorize: 1, + options: { skipEmptyExchange: true } + }; +} + +function executeScram(cryptoMethod: CryptoMethod, authContext: AuthContext, callback: Callback) { + const { connection, credentials } = authContext; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if (!authContext.nonce) { + return callback( + new MongoInvalidArgumentError('AuthContext must contain a valid nonce property') + ); + } + const nonce = authContext.nonce; + const db = credentials.source; + + const saslStartCmd = makeFirstMessage(cryptoMethod, credentials, nonce); + connection.command(ns(`${db}.$cmd`), saslStartCmd, undefined, (_err, result) => { + const err = resolveError(_err, result); + if (err) { + return callback(err); + } + + continueScramConversation(cryptoMethod, result, authContext, callback); + }); +} + +function continueScramConversation( + cryptoMethod: CryptoMethod, + response: Document, + authContext: AuthContext, + callback: Callback +) { + const connection = authContext.connection; + const credentials = authContext.credentials; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + if (!authContext.nonce) { + return callback(new MongoInvalidArgumentError('Unable to continue SCRAM without valid nonce')); + } + const nonce = authContext.nonce; + + const db = credentials.source; + const username = cleanUsername(credentials.username); + const password = credentials.password; + + let processedPassword; + if (cryptoMethod === 'sha256') { + processedPassword = 'kModuleError' in saslprep ? password : saslprep(password); + } else { + try { + processedPassword = passwordDigest(username, password); + } catch (e) { + return callback(e); + } + } + + const payload = Buffer.isBuffer(response.payload) + ? 
new Binary(response.payload) + : response.payload; + const dict = parsePayload(payload.value()); + + const iterations = parseInt(dict.i, 10); + if (iterations && iterations < 4096) { + callback( + // TODO(NODE-3483) + new MongoRuntimeError(`Server returned an invalid iteration count ${iterations}`), + false + ); + return; + } + + const salt = dict.s; + const rnonce = dict.r; + if (rnonce.startsWith('nonce')) { + // TODO(NODE-3483) + callback(new MongoRuntimeError(`Server returned an invalid nonce: ${rnonce}`), false); + return; + } + + // Set up start of proof + const withoutProof = `c=biws,r=${rnonce}`; + const saltedPassword = HI( + processedPassword, + Buffer.from(salt, 'base64'), + iterations, + cryptoMethod + ); + + const clientKey = HMAC(cryptoMethod, saltedPassword, 'Client Key'); + const serverKey = HMAC(cryptoMethod, saltedPassword, 'Server Key'); + const storedKey = H(cryptoMethod, clientKey); + const authMessage = [clientFirstMessageBare(username, nonce), payload.value(), withoutProof].join( + ',' + ); + + const clientSignature = HMAC(cryptoMethod, storedKey, authMessage); + const clientProof = `p=${xor(clientKey, clientSignature)}`; + const clientFinal = [withoutProof, clientProof].join(','); + + const serverSignature = HMAC(cryptoMethod, serverKey, authMessage); + const saslContinueCmd = { + saslContinue: 1, + conversationId: response.conversationId, + payload: new Binary(Buffer.from(clientFinal)) + }; + + connection.command(ns(`${db}.$cmd`), saslContinueCmd, undefined, (_err, r) => { + const err = resolveError(_err, r); + if (err) { + return callback(err); + } + + const parsedResponse = parsePayload(r.payload.value()); + if (!compareDigest(Buffer.from(parsedResponse.v, 'base64'), serverSignature)) { + callback(new MongoRuntimeError('Server returned an invalid signature')); + return; + } + + if (!r || r.done !== false) { + return callback(err, r); + } + + const retrySaslContinueCmd = { + saslContinue: 1, + conversationId: r.conversationId, + payload: Buffer.alloc(0) + }; + + connection.command(ns(`${db}.$cmd`), retrySaslContinueCmd, undefined, callback); + }); +} + +function parsePayload(payload: string) { + const dict: Document = {}; + const parts = payload.split(','); + for (let i = 0; i < parts.length; i++) { + const valueParts = parts[i].split('='); + dict[valueParts[0]] = valueParts[1]; + } + + return dict; +} + +function passwordDigest(username: string, password: string) { + if (typeof username !== 'string') { + throw new MongoInvalidArgumentError('Username must be a string'); + } + + if (typeof password !== 'string') { + throw new MongoInvalidArgumentError('Password must be a string'); + } + + if (password.length === 0) { + throw new MongoInvalidArgumentError('Password cannot be empty'); + } + + const md5 = crypto.createHash('md5'); + md5.update(`${username}:mongo:${password}`, 'utf8'); + return md5.digest('hex'); +} + +// XOR two buffers +function xor(a: Buffer, b: Buffer) { + if (!Buffer.isBuffer(a)) { + a = Buffer.from(a); + } + + if (!Buffer.isBuffer(b)) { + b = Buffer.from(b); + } + + const length = Math.max(a.length, b.length); + const res = []; + + for (let i = 0; i < length; i += 1) { + res.push(a[i] ^ b[i]); + } + + return Buffer.from(res).toString('base64'); +} + +function H(method: CryptoMethod, text: Buffer) { + return crypto.createHash(method).update(text).digest(); +} + +function HMAC(method: CryptoMethod, key: Buffer, text: Buffer | string) { + return crypto.createHmac(method, key).update(text).digest(); +} + +interface HICache { + [key: string]: Buffer; 
+} + +let _hiCache: HICache = {}; +let _hiCacheCount = 0; +function _hiCachePurge() { + _hiCache = {}; + _hiCacheCount = 0; +} + +const hiLengthMap = { + sha256: 32, + sha1: 20 +}; + +function HI(data: string, salt: Buffer, iterations: number, cryptoMethod: CryptoMethod) { + // omit the work if already generated + const key = [data, salt.toString('base64'), iterations].join('_'); + if (_hiCache[key] != null) { + return _hiCache[key]; + } + + // generate the salt + const saltedData = crypto.pbkdf2Sync( + data, + salt, + iterations, + hiLengthMap[cryptoMethod], + cryptoMethod + ); + + // cache a copy to speed up the next lookup, but prevent unbounded cache growth + if (_hiCacheCount >= 200) { + _hiCachePurge(); + } + + _hiCache[key] = saltedData; + _hiCacheCount += 1; + return saltedData; +} + +function compareDigest(lhs: Buffer, rhs: Uint8Array) { + if (lhs.length !== rhs.length) { + return false; + } + + if (typeof crypto.timingSafeEqual === 'function') { + return crypto.timingSafeEqual(lhs, rhs); + } + + let result = 0; + for (let i = 0; i < lhs.length; i++) { + result |= lhs[i] ^ rhs[i]; + } + + return result === 0; +} + +function resolveError(err?: AnyError, result?: Document) { + if (err) return err; + if (result) { + if (result.$err || result.errmsg) return new MongoServerError(result); + } +} + +export class ScramSHA1 extends ScramSHA { + constructor() { + super('sha1'); + } +} + +export class ScramSHA256 extends ScramSHA { + constructor() { + super('sha256'); + } +} diff --git a/node_modules/mongodb/src/cmap/auth/x509.ts b/node_modules/mongodb/src/cmap/auth/x509.ts new file mode 100644 index 000000000..73f768803 --- /dev/null +++ b/node_modules/mongodb/src/cmap/auth/x509.ts @@ -0,0 +1,49 @@ +import { AuthProvider, AuthContext } from './auth_provider'; +import { MongoMissingCredentialsError } from '../../error'; +import type { Document } from '../../bson'; +import { Callback, ns } from '../../utils'; +import type { MongoCredentials } from './mongo_credentials'; +import type { HandshakeDocument } from '../connect'; + +export class X509 extends AuthProvider { + prepare(handshakeDoc: HandshakeDocument, authContext: AuthContext, callback: Callback): void { + const { credentials } = authContext; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + Object.assign(handshakeDoc, { + speculativeAuthenticate: x509AuthenticateCommand(credentials) + }); + + callback(undefined, handshakeDoc); + } + + auth(authContext: AuthContext, callback: Callback): void { + const connection = authContext.connection; + const credentials = authContext.credentials; + if (!credentials) { + return callback(new MongoMissingCredentialsError('AuthContext must provide credentials.')); + } + const response = authContext.response; + + if (response && response.speculativeAuthenticate) { + return callback(); + } + + connection.command( + ns('$external.$cmd'), + x509AuthenticateCommand(credentials), + undefined, + callback + ); + } +} + +function x509AuthenticateCommand(credentials: MongoCredentials) { + const command: Document = { authenticate: 1, mechanism: 'MONGODB-X509' }; + if (credentials.username) { + command.user = credentials.username; + } + + return command; +} diff --git a/node_modules/mongodb/src/cmap/command_monitoring_events.ts b/node_modules/mongodb/src/cmap/command_monitoring_events.ts new file mode 100644 index 000000000..5be637240 --- /dev/null +++ b/node_modules/mongodb/src/cmap/command_monitoring_events.ts @@ -0,0 +1,334 @@ +import { 
GetMore, KillCursor, Msg, WriteProtocolMessageType } from './commands'; +import { calculateDurationInMs, deepCopy } from '../utils'; +import type { Connection } from './connection'; +import type { Document, ObjectId } from '../bson'; + +/** + * An event indicating the start of a given + * @public + * @category Event + */ +export class CommandStartedEvent { + commandObj?: Document; + requestId: number; + databaseName: string; + commandName: string; + command: Document; + address: string; + connectionId?: string | number; + serviceId?: ObjectId; + + /** + * Create a started event + * + * @internal + * @param pool - the pool that originated the command + * @param command - the command + */ + constructor(connection: Connection, command: WriteProtocolMessageType) { + const cmd = extractCommand(command); + const commandName = extractCommandName(cmd); + const { address, connectionId, serviceId } = extractConnectionDetails(connection); + + // TODO: remove in major revision, this is not spec behavior + if (SENSITIVE_COMMANDS.has(commandName)) { + this.commandObj = {}; + this.commandObj[commandName] = true; + } + + this.address = address; + this.connectionId = connectionId; + this.serviceId = serviceId; + this.requestId = command.requestId; + this.databaseName = databaseName(command); + this.commandName = commandName; + this.command = maybeRedact(commandName, cmd, cmd); + } + + /* @internal */ + get hasServiceId(): boolean { + return !!this.serviceId; + } +} + +/** + * An event indicating the success of a given command + * @public + * @category Event + */ +export class CommandSucceededEvent { + address: string; + connectionId?: string | number; + requestId: number; + duration: number; + commandName: string; + reply: unknown; + serviceId?: ObjectId; + + /** + * Create a succeeded event + * + * @internal + * @param pool - the pool that originated the command + * @param command - the command + * @param reply - the reply for this command from the server + * @param started - a high resolution tuple timestamp of when the command was first sent, to calculate duration + */ + constructor( + connection: Connection, + command: WriteProtocolMessageType, + reply: Document | undefined, + started: number + ) { + const cmd = extractCommand(command); + const commandName = extractCommandName(cmd); + const { address, connectionId, serviceId } = extractConnectionDetails(connection); + + this.address = address; + this.connectionId = connectionId; + this.serviceId = serviceId; + this.requestId = command.requestId; + this.commandName = commandName; + this.duration = calculateDurationInMs(started); + this.reply = maybeRedact(commandName, cmd, extractReply(command, reply)); + } + + /* @internal */ + get hasServiceId(): boolean { + return !!this.serviceId; + } +} + +/** + * An event indicating the failure of a given command + * @public + * @category Event + */ +export class CommandFailedEvent { + address: string; + connectionId?: string | number; + requestId: number; + duration: number; + commandName: string; + failure: Error; + serviceId?: ObjectId; + + /** + * Create a failure event + * + * @internal + * @param pool - the pool that originated the command + * @param command - the command + * @param error - the generated error or a server error response + * @param started - a high resolution tuple timestamp of when the command was first sent, to calculate duration + */ + constructor( + connection: Connection, + command: WriteProtocolMessageType, + error: Error | Document, + started: number + ) { + const cmd = 
extractCommand(command); + const commandName = extractCommandName(cmd); + const { address, connectionId, serviceId } = extractConnectionDetails(connection); + + this.address = address; + this.connectionId = connectionId; + this.serviceId = serviceId; + + this.requestId = command.requestId; + this.commandName = commandName; + this.duration = calculateDurationInMs(started); + this.failure = maybeRedact(commandName, cmd, error) as Error; + } + + /* @internal */ + get hasServiceId(): boolean { + return !!this.serviceId; + } +} + +/** Commands that we want to redact because of the sensitive nature of their contents */ +const SENSITIVE_COMMANDS = new Set([ + 'authenticate', + 'saslStart', + 'saslContinue', + 'getnonce', + 'createUser', + 'updateUser', + 'copydbgetnonce', + 'copydbsaslstart', + 'copydb' +]); + +const HELLO_COMMANDS = new Set(['hello', 'ismaster', 'isMaster']); + +// helper methods +const extractCommandName = (commandDoc: Document) => Object.keys(commandDoc)[0]; +const namespace = (command: WriteProtocolMessageType) => command.ns; +const databaseName = (command: WriteProtocolMessageType) => command.ns.split('.')[0]; +const collectionName = (command: WriteProtocolMessageType) => command.ns.split('.')[1]; +const maybeRedact = (commandName: string, commandDoc: Document, result: Error | Document) => + SENSITIVE_COMMANDS.has(commandName) || + (HELLO_COMMANDS.has(commandName) && commandDoc.speculativeAuthenticate) + ? {} + : result; + +const LEGACY_FIND_QUERY_MAP: { [key: string]: string } = { + $query: 'filter', + $orderby: 'sort', + $hint: 'hint', + $comment: 'comment', + $maxScan: 'maxScan', + $max: 'max', + $min: 'min', + $returnKey: 'returnKey', + $showDiskLoc: 'showRecordId', + $maxTimeMS: 'maxTimeMS', + $snapshot: 'snapshot' +}; + +const LEGACY_FIND_OPTIONS_MAP = { + numberToSkip: 'skip', + numberToReturn: 'batchSize', + returnFieldSelector: 'projection' +} as const; + +const OP_QUERY_KEYS = [ + 'tailable', + 'oplogReplay', + 'noCursorTimeout', + 'awaitData', + 'partial', + 'exhaust' +] as const; + +/** Extract the actual command from the query, possibly up-converting if it's a legacy format */ +function extractCommand(command: WriteProtocolMessageType): Document { + if (command instanceof GetMore) { + return { + getMore: deepCopy(command.cursorId), + collection: collectionName(command), + batchSize: command.numberToReturn + }; + } + + if (command instanceof KillCursor) { + return { + killCursors: collectionName(command), + cursors: deepCopy(command.cursorIds) + }; + } + + if (command instanceof Msg) { + return deepCopy(command.command); + } + + if (command.query?.$query) { + let result: Document; + if (command.ns === 'admin.$cmd') { + // up-convert legacy command + result = Object.assign({}, command.query.$query); + } else { + // up-convert legacy find command + result = { find: collectionName(command) }; + Object.keys(LEGACY_FIND_QUERY_MAP).forEach(key => { + if (command.query[key] != null) { + result[LEGACY_FIND_QUERY_MAP[key]] = deepCopy(command.query[key]); + } + }); + } + + Object.keys(LEGACY_FIND_OPTIONS_MAP).forEach(key => { + const legacyKey = key as keyof typeof LEGACY_FIND_OPTIONS_MAP; + if (command[legacyKey] != null) { + result[LEGACY_FIND_OPTIONS_MAP[legacyKey]] = deepCopy(command[legacyKey]); + } + }); + + OP_QUERY_KEYS.forEach(key => { + const opKey = key as typeof OP_QUERY_KEYS[number]; + if (command[opKey]) { + result[opKey] = command[opKey]; + } + }); + + if (command.pre32Limit != null) { + result.limit = command.pre32Limit; + } + + if (command.query.$explain) 
{ + return { explain: result }; + } + return result; + } + + const clonedQuery: Record = {}; + const clonedCommand: Record = {}; + if (command.query) { + for (const k in command.query) { + clonedQuery[k] = deepCopy(command.query[k]); + } + clonedCommand.query = clonedQuery; + } + + for (const k in command) { + if (k === 'query') continue; + clonedCommand[k] = deepCopy((command as unknown as Record)[k]); + } + return command.query ? clonedQuery : clonedCommand; +} + +function extractReply(command: WriteProtocolMessageType, reply?: Document) { + if (command instanceof KillCursor) { + return { + ok: 1, + cursorsUnknown: command.cursorIds + }; + } + + if (!reply) { + return reply; + } + + if (command instanceof GetMore) { + return { + ok: 1, + cursor: { + id: deepCopy(reply.cursorId), + ns: namespace(command), + nextBatch: deepCopy(reply.documents) + } + }; + } + + if (command instanceof Msg) { + return deepCopy(reply.result ? reply.result : reply); + } + + // is this a legacy find command? + if (command.query && command.query.$query != null) { + return { + ok: 1, + cursor: { + id: deepCopy(reply.cursorId), + ns: namespace(command), + firstBatch: deepCopy(reply.documents) + } + }; + } + + return deepCopy(reply.result ? reply.result : reply); +} + +function extractConnectionDetails(connection: Connection) { + let connectionId; + if ('id' in connection) { + connectionId = connection.id; + } + return { + address: connection.address, + serviceId: connection.serviceId, + connectionId + }; +} diff --git a/node_modules/mongodb/src/cmap/commands.ts b/node_modules/mongodb/src/cmap/commands.ts new file mode 100644 index 000000000..3fa3a7141 --- /dev/null +++ b/node_modules/mongodb/src/cmap/commands.ts @@ -0,0 +1,876 @@ +import { ReadPreference } from '../read_preference'; +import * as BSON from '../bson'; +import { databaseNamespace } from '../utils'; +import { OP_QUERY, OP_GETMORE, OP_KILL_CURSORS, OP_MSG } from './wire_protocol/constants'; +import type { Long, Document, BSONSerializeOptions } from '../bson'; +import type { ClientSession } from '../sessions'; +import type { CommandOptions } from './connection'; +import { MongoRuntimeError, MongoInvalidArgumentError } from '../error'; + +// Incrementing request id +let _requestId = 0; + +// Query flags +const OPTS_TAILABLE_CURSOR = 2; +const OPTS_SLAVE = 4; +const OPTS_OPLOG_REPLAY = 8; +const OPTS_NO_CURSOR_TIMEOUT = 16; +const OPTS_AWAIT_DATA = 32; +const OPTS_EXHAUST = 64; +const OPTS_PARTIAL = 128; + +// Response flags +const CURSOR_NOT_FOUND = 1; +const QUERY_FAILURE = 2; +const SHARD_CONFIG_STALE = 4; +const AWAIT_CAPABLE = 8; + +/** @internal */ +export type WriteProtocolMessageType = Query | Msg | GetMore | KillCursor; + +/** @internal */ +export interface OpQueryOptions extends CommandOptions { + socketTimeoutMS?: number; + session?: ClientSession; + documentsReturnedIn?: string; + numberToSkip?: number; + numberToReturn?: number; + returnFieldSelector?: Document; + pre32Limit?: number; + serializeFunctions?: boolean; + ignoreUndefined?: boolean; + maxBsonSize?: number; + checkKeys?: boolean; + slaveOk?: boolean; + + requestId?: number; + moreToCome?: boolean; + exhaustAllowed?: boolean; + readPreference?: ReadPreference; +} + +/************************************************************** + * QUERY + **************************************************************/ +/** @internal */ +export class Query { + ns: string; + query: Document; + numberToSkip: number; + numberToReturn: number; + returnFieldSelector?: Document; + requestId: number; + 
pre32Limit?: number; + serializeFunctions: boolean; + ignoreUndefined: boolean; + maxBsonSize: number; + checkKeys: boolean; + batchSize: number; + tailable: boolean; + slaveOk: boolean; + oplogReplay: boolean; + noCursorTimeout: boolean; + awaitData: boolean; + exhaust: boolean; + partial: boolean; + documentsReturnedIn?: string; + + constructor(ns: string, query: Document, options: OpQueryOptions) { + // Basic options needed to be passed in + // TODO(NODE-3483): Replace with MongoCommandError + if (ns == null) throw new MongoRuntimeError('Namespace must be specified for query'); + // TODO(NODE-3483): Replace with MongoCommandError + if (query == null) throw new MongoRuntimeError('A query document must be specified for query'); + + // Validate that we are not passing 0x00 in the collection name + if (ns.indexOf('\x00') !== -1) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new MongoRuntimeError('Namespace cannot contain a null character'); + } + + // Basic options + this.ns = ns; + this.query = query; + + // Additional options + this.numberToSkip = options.numberToSkip || 0; + this.numberToReturn = options.numberToReturn || 0; + this.returnFieldSelector = options.returnFieldSelector || undefined; + this.requestId = Query.getRequestId(); + + // special case for pre-3.2 find commands, delete ASAP + this.pre32Limit = options.pre32Limit; + + // Serialization option + this.serializeFunctions = + typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false; + this.ignoreUndefined = + typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : false; + this.maxBsonSize = options.maxBsonSize || 1024 * 1024 * 16; + this.checkKeys = typeof options.checkKeys === 'boolean' ? options.checkKeys : false; + this.batchSize = this.numberToReturn; + + // Flags + this.tailable = false; + this.slaveOk = typeof options.slaveOk === 'boolean' ? options.slaveOk : false; + this.oplogReplay = false; + this.noCursorTimeout = false; + this.awaitData = false; + this.exhaust = false; + this.partial = false; + } + + /** Assign next request Id. */ + incRequestId(): void { + this.requestId = _requestId++; + } + + /** Peek next request Id. */ + nextRequestId(): number { + return _requestId + 1; + } + + /** Increment then return next request Id. 
*/ + static getRequestId(): number { + return ++_requestId; + } + + // Uses a single allocated buffer for the process, avoiding multiple memory allocations + toBin(): Buffer[] { + const buffers = []; + let projection = null; + + // Set up the flags + let flags = 0; + if (this.tailable) { + flags |= OPTS_TAILABLE_CURSOR; + } + + if (this.slaveOk) { + flags |= OPTS_SLAVE; + } + + if (this.oplogReplay) { + flags |= OPTS_OPLOG_REPLAY; + } + + if (this.noCursorTimeout) { + flags |= OPTS_NO_CURSOR_TIMEOUT; + } + + if (this.awaitData) { + flags |= OPTS_AWAIT_DATA; + } + + if (this.exhaust) { + flags |= OPTS_EXHAUST; + } + + if (this.partial) { + flags |= OPTS_PARTIAL; + } + + // If batchSize is different to this.numberToReturn + if (this.batchSize !== this.numberToReturn) this.numberToReturn = this.batchSize; + + // Allocate write protocol header buffer + const header = Buffer.alloc( + 4 * 4 + // Header + 4 + // Flags + Buffer.byteLength(this.ns) + + 1 + // namespace + 4 + // numberToSkip + 4 // numberToReturn + ); + + // Add header to buffers + buffers.push(header); + + // Serialize the query + const query = BSON.serialize(this.query, { + checkKeys: this.checkKeys, + serializeFunctions: this.serializeFunctions, + ignoreUndefined: this.ignoreUndefined + }); + + // Add query document + buffers.push(query); + + if (this.returnFieldSelector && Object.keys(this.returnFieldSelector).length > 0) { + // Serialize the projection document + projection = BSON.serialize(this.returnFieldSelector, { + checkKeys: this.checkKeys, + serializeFunctions: this.serializeFunctions, + ignoreUndefined: this.ignoreUndefined + }); + // Add projection document + buffers.push(projection); + } + + // Total message size + const totalLength = header.length + query.length + (projection ? 
projection.length : 0); + + // Set up the index + let index = 4; + + // Write total document length + header[3] = (totalLength >> 24) & 0xff; + header[2] = (totalLength >> 16) & 0xff; + header[1] = (totalLength >> 8) & 0xff; + header[0] = totalLength & 0xff; + + // Write header information requestId + header[index + 3] = (this.requestId >> 24) & 0xff; + header[index + 2] = (this.requestId >> 16) & 0xff; + header[index + 1] = (this.requestId >> 8) & 0xff; + header[index] = this.requestId & 0xff; + index = index + 4; + + // Write header information responseTo + header[index + 3] = (0 >> 24) & 0xff; + header[index + 2] = (0 >> 16) & 0xff; + header[index + 1] = (0 >> 8) & 0xff; + header[index] = 0 & 0xff; + index = index + 4; + + // Write header information OP_QUERY + header[index + 3] = (OP_QUERY >> 24) & 0xff; + header[index + 2] = (OP_QUERY >> 16) & 0xff; + header[index + 1] = (OP_QUERY >> 8) & 0xff; + header[index] = OP_QUERY & 0xff; + index = index + 4; + + // Write header information flags + header[index + 3] = (flags >> 24) & 0xff; + header[index + 2] = (flags >> 16) & 0xff; + header[index + 1] = (flags >> 8) & 0xff; + header[index] = flags & 0xff; + index = index + 4; + + // Write collection name + index = index + header.write(this.ns, index, 'utf8') + 1; + header[index - 1] = 0; + + // Write header information flags numberToSkip + header[index + 3] = (this.numberToSkip >> 24) & 0xff; + header[index + 2] = (this.numberToSkip >> 16) & 0xff; + header[index + 1] = (this.numberToSkip >> 8) & 0xff; + header[index] = this.numberToSkip & 0xff; + index = index + 4; + + // Write header information flags numberToReturn + header[index + 3] = (this.numberToReturn >> 24) & 0xff; + header[index + 2] = (this.numberToReturn >> 16) & 0xff; + header[index + 1] = (this.numberToReturn >> 8) & 0xff; + header[index] = this.numberToReturn & 0xff; + index = index + 4; + + // Return the buffers + return buffers; + } +} + +/** @internal */ +export interface OpGetMoreOptions { + numberToReturn?: number; +} + +/************************************************************** + * GETMORE + **************************************************************/ +/** @internal */ +export class GetMore { + numberToReturn: number; + requestId: number; + ns: string; + cursorId: Long; + + constructor(ns: string, cursorId: Long, opts: OpGetMoreOptions = {}) { + this.numberToReturn = opts.numberToReturn || 0; + this.requestId = _requestId++; + this.ns = ns; + this.cursorId = cursorId; + } + + // Uses a single allocated buffer for the process, avoiding multiple memory allocations + toBin(): Buffer[] { + const length = 4 + Buffer.byteLength(this.ns) + 1 + 4 + 8 + 4 * 4; + // Create command buffer + let index = 0; + // Allocate buffer + const _buffer = Buffer.alloc(length); + + // Write header information + // index = write32bit(index, _buffer, length); + _buffer[index + 3] = (length >> 24) & 0xff; + _buffer[index + 2] = (length >> 16) & 0xff; + _buffer[index + 1] = (length >> 8) & 0xff; + _buffer[index] = length & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, requestId); + _buffer[index + 3] = (this.requestId >> 24) & 0xff; + _buffer[index + 2] = (this.requestId >> 16) & 0xff; + _buffer[index + 1] = (this.requestId >> 8) & 0xff; + _buffer[index] = this.requestId & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + + // 
index = write32bit(index, _buffer, OP_GETMORE); + _buffer[index + 3] = (OP_GETMORE >> 24) & 0xff; + _buffer[index + 2] = (OP_GETMORE >> 16) & 0xff; + _buffer[index + 1] = (OP_GETMORE >> 8) & 0xff; + _buffer[index] = OP_GETMORE & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + + // Write collection name + index = index + _buffer.write(this.ns, index, 'utf8') + 1; + _buffer[index - 1] = 0; + + // Write batch size + // index = write32bit(index, _buffer, numberToReturn); + _buffer[index + 3] = (this.numberToReturn >> 24) & 0xff; + _buffer[index + 2] = (this.numberToReturn >> 16) & 0xff; + _buffer[index + 1] = (this.numberToReturn >> 8) & 0xff; + _buffer[index] = this.numberToReturn & 0xff; + index = index + 4; + + // Write cursor id + // index = write32bit(index, _buffer, cursorId.getLowBits()); + _buffer[index + 3] = (this.cursorId.getLowBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorId.getLowBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorId.getLowBits() >> 8) & 0xff; + _buffer[index] = this.cursorId.getLowBits() & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, cursorId.getHighBits()); + _buffer[index + 3] = (this.cursorId.getHighBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorId.getHighBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorId.getHighBits() >> 8) & 0xff; + _buffer[index] = this.cursorId.getHighBits() & 0xff; + index = index + 4; + + // Return buffer + return [_buffer]; + } +} + +/************************************************************** + * KILLCURSOR + **************************************************************/ +/** @internal */ +export class KillCursor { + ns: string; + requestId: number; + cursorIds: Long[]; + + constructor(ns: string, cursorIds: Long[]) { + this.ns = ns; + this.requestId = _requestId++; + this.cursorIds = cursorIds; + } + + // Uses a single allocated buffer for the process, avoiding multiple memory allocations + toBin(): Buffer[] { + const length = 4 + 4 + 4 * 4 + this.cursorIds.length * 8; + + // Create command buffer + let index = 0; + const _buffer = Buffer.alloc(length); + + // Write header information + // index = write32bit(index, _buffer, length); + _buffer[index + 3] = (length >> 24) & 0xff; + _buffer[index + 2] = (length >> 16) & 0xff; + _buffer[index + 1] = (length >> 8) & 0xff; + _buffer[index] = length & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, requestId); + _buffer[index + 3] = (this.requestId >> 24) & 0xff; + _buffer[index + 2] = (this.requestId >> 16) & 0xff; + _buffer[index + 1] = (this.requestId >> 8) & 0xff; + _buffer[index] = this.requestId & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, OP_KILL_CURSORS); + _buffer[index + 3] = (OP_KILL_CURSORS >> 24) & 0xff; + _buffer[index + 2] = (OP_KILL_CURSORS >> 16) & 0xff; + _buffer[index + 1] = (OP_KILL_CURSORS >> 8) & 0xff; + _buffer[index] = OP_KILL_CURSORS & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, 0); + _buffer[index + 3] = (0 >> 24) & 0xff; + _buffer[index + 2] = (0 >> 16) & 0xff; + _buffer[index + 1] = (0 >> 8) & 0xff; + _buffer[index] = 0 & 
0xff; + index = index + 4; + + // Write batch size + // index = write32bit(index, _buffer, this.cursorIds.length); + _buffer[index + 3] = (this.cursorIds.length >> 24) & 0xff; + _buffer[index + 2] = (this.cursorIds.length >> 16) & 0xff; + _buffer[index + 1] = (this.cursorIds.length >> 8) & 0xff; + _buffer[index] = this.cursorIds.length & 0xff; + index = index + 4; + + // Write all the cursor ids into the array + for (let i = 0; i < this.cursorIds.length; i++) { + // Write cursor id + // index = write32bit(index, _buffer, cursorIds[i].getLowBits()); + _buffer[index + 3] = (this.cursorIds[i].getLowBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorIds[i].getLowBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorIds[i].getLowBits() >> 8) & 0xff; + _buffer[index] = this.cursorIds[i].getLowBits() & 0xff; + index = index + 4; + + // index = write32bit(index, _buffer, cursorIds[i].getHighBits()); + _buffer[index + 3] = (this.cursorIds[i].getHighBits() >> 24) & 0xff; + _buffer[index + 2] = (this.cursorIds[i].getHighBits() >> 16) & 0xff; + _buffer[index + 1] = (this.cursorIds[i].getHighBits() >> 8) & 0xff; + _buffer[index] = this.cursorIds[i].getHighBits() & 0xff; + index = index + 4; + } + + // Return buffer + return [_buffer]; + } +} + +export interface MessageHeader { + length: number; + requestId: number; + responseTo: number; + opCode: number; + fromCompressed?: boolean; +} + +export interface OpResponseOptions extends BSONSerializeOptions { + raw?: boolean; + documentsReturnedIn?: string | null; +} + +/** @internal */ +export class Response { + parsed: boolean; + raw: Buffer; + data: Buffer; + opts: OpResponseOptions; + length: number; + requestId: number; + responseTo: number; + opCode: number; + fromCompressed?: boolean; + responseFlags: number; + cursorId: Long; + startingFrom: number; + numberReturned: number; + documents: (Document | Buffer)[]; + cursorNotFound: boolean; + queryFailure: boolean; + shardConfigStale: boolean; + awaitCapable: boolean; + promoteLongs: boolean; + promoteValues: boolean; + promoteBuffers: boolean; + bsonRegExp?: boolean; + index?: number; + + constructor( + message: Buffer, + msgHeader: MessageHeader, + msgBody: Buffer, + opts?: OpResponseOptions + ) { + this.parsed = false; + this.raw = message; + this.data = msgBody; + this.opts = opts ?? { + promoteLongs: true, + promoteValues: true, + promoteBuffers: false, + bsonRegExp: false + }; + + // Read the message header + this.length = msgHeader.length; + this.requestId = msgHeader.requestId; + this.responseTo = msgHeader.responseTo; + this.opCode = msgHeader.opCode; + this.fromCompressed = msgHeader.fromCompressed; + + // Read the message body + this.responseFlags = msgBody.readInt32LE(0); + this.cursorId = new BSON.Long(msgBody.readInt32LE(4), msgBody.readInt32LE(8)); + this.startingFrom = msgBody.readInt32LE(12); + this.numberReturned = msgBody.readInt32LE(16); + + // Preallocate document array + this.documents = new Array(this.numberReturned); + + // Flag values + this.cursorNotFound = (this.responseFlags & CURSOR_NOT_FOUND) !== 0; + this.queryFailure = (this.responseFlags & QUERY_FAILURE) !== 0; + this.shardConfigStale = (this.responseFlags & SHARD_CONFIG_STALE) !== 0; + this.awaitCapable = (this.responseFlags & AWAIT_CAPABLE) !== 0; + this.promoteLongs = typeof this.opts.promoteLongs === 'boolean' ? this.opts.promoteLongs : true; + this.promoteValues = + typeof this.opts.promoteValues === 'boolean' ? 
this.opts.promoteValues : true; + this.promoteBuffers = + typeof this.opts.promoteBuffers === 'boolean' ? this.opts.promoteBuffers : false; + this.bsonRegExp = typeof this.opts.bsonRegExp === 'boolean' ? this.opts.bsonRegExp : false; + } + + isParsed(): boolean { + return this.parsed; + } + + parse(options: OpResponseOptions): void { + // Don't parse again if not needed + if (this.parsed) return; + options = options ?? {}; + + // Allow the return of raw documents instead of parsing + const raw = options.raw || false; + const documentsReturnedIn = options.documentsReturnedIn || null; + const promoteLongs = options.promoteLongs ?? this.opts.promoteLongs; + const promoteValues = options.promoteValues ?? this.opts.promoteValues; + const promoteBuffers = options.promoteBuffers ?? this.opts.promoteBuffers; + const bsonRegExp = options.bsonRegExp ?? this.opts.bsonRegExp; + let bsonSize; + + // Set up the options + const _options: BSONSerializeOptions = { + promoteLongs, + promoteValues, + promoteBuffers, + bsonRegExp + }; + + // Position within OP_REPLY at which documents start + // (See https://docs.mongodb.com/manual/reference/mongodb-wire-protocol/#wire-op-reply) + this.index = 20; + + // Parse Body + for (let i = 0; i < this.numberReturned; i++) { + bsonSize = + this.data[this.index] | + (this.data[this.index + 1] << 8) | + (this.data[this.index + 2] << 16) | + (this.data[this.index + 3] << 24); + + // If we have raw results specified slice the return document + if (raw) { + this.documents[i] = this.data.slice(this.index, this.index + bsonSize); + } else { + this.documents[i] = BSON.deserialize( + this.data.slice(this.index, this.index + bsonSize), + _options + ); + } + + // Adjust the index + this.index = this.index + bsonSize; + } + + if (this.documents.length === 1 && documentsReturnedIn != null && raw) { + const fieldsAsRaw: Document = {}; + fieldsAsRaw[documentsReturnedIn] = true; + _options.fieldsAsRaw = fieldsAsRaw; + + const doc = BSON.deserialize(this.documents[0] as Buffer, _options); + this.documents = [doc]; + } + + // Set parsed + this.parsed = true; + } +} + +// Implementation of OP_MSG spec: +// https://github.com/mongodb/specifications/blob/master/source/message/OP_MSG.rst +// +// struct Section { +// uint8 payloadType; +// union payload { +// document document; // payloadType == 0 +// struct sequence { // payloadType == 1 +// int32 size; +// cstring identifier; +// document* documents; +// }; +// }; +// }; + +// struct OP_MSG { +// struct MsgHeader { +// int32 messageLength; +// int32 requestID; +// int32 responseTo; +// int32 opCode = 2013; +// }; +// uint32 flagBits; +// Section+ sections; +// [uint32 checksum;] +// }; + +// Msg Flags +const OPTS_CHECKSUM_PRESENT = 1; +const OPTS_MORE_TO_COME = 2; +const OPTS_EXHAUST_ALLOWED = 1 << 16; + +export interface OpMsgOptions { + requestId: number; + serializeFunctions: boolean; + ignoreUndefined: boolean; + checkKeys: boolean; + maxBsonSize: number; + moreToCome: boolean; + exhaustAllowed: boolean; + readPreference: ReadPreference; +} + +/** @internal */ +export class Msg { + ns: string; + command: Document; + options: OpQueryOptions; + requestId: number; + serializeFunctions: boolean; + ignoreUndefined: boolean; + checkKeys: boolean; + maxBsonSize: number; + checksumPresent: boolean; + moreToCome: boolean; + exhaustAllowed: boolean; + + constructor(ns: string, command: Document, options: OpQueryOptions) { + // Basic options needed to be passed in + if (command == null) + throw new MongoInvalidArgumentError('Query document must be 
specified for query'); + + // Basic options + this.ns = ns; + this.command = command; + this.command.$db = databaseNamespace(ns); + + if (options.readPreference && options.readPreference.mode !== ReadPreference.PRIMARY) { + this.command.$readPreference = options.readPreference.toJSON(); + } + + // Ensure empty options + this.options = options ?? {}; + + // Additional options + this.requestId = options.requestId ? options.requestId : Msg.getRequestId(); + + // Serialization option + this.serializeFunctions = + typeof options.serializeFunctions === 'boolean' ? options.serializeFunctions : false; + this.ignoreUndefined = + typeof options.ignoreUndefined === 'boolean' ? options.ignoreUndefined : false; + this.checkKeys = typeof options.checkKeys === 'boolean' ? options.checkKeys : false; + this.maxBsonSize = options.maxBsonSize || 1024 * 1024 * 16; + + // flags + this.checksumPresent = false; + this.moreToCome = options.moreToCome || false; + this.exhaustAllowed = + typeof options.exhaustAllowed === 'boolean' ? options.exhaustAllowed : false; + } + + toBin(): Buffer[] { + const buffers: Buffer[] = []; + let flags = 0; + + if (this.checksumPresent) { + flags |= OPTS_CHECKSUM_PRESENT; + } + + if (this.moreToCome) { + flags |= OPTS_MORE_TO_COME; + } + + if (this.exhaustAllowed) { + flags |= OPTS_EXHAUST_ALLOWED; + } + + const header = Buffer.alloc( + 4 * 4 + // Header + 4 // Flags + ); + + buffers.push(header); + + let totalLength = header.length; + const command = this.command; + totalLength += this.makeDocumentSegment(buffers, command); + + header.writeInt32LE(totalLength, 0); // messageLength + header.writeInt32LE(this.requestId, 4); // requestID + header.writeInt32LE(0, 8); // responseTo + header.writeInt32LE(OP_MSG, 12); // opCode + header.writeUInt32LE(flags, 16); // flags + return buffers; + } + + makeDocumentSegment(buffers: Buffer[], document: Document): number { + const payloadTypeBuffer = Buffer.alloc(1); + payloadTypeBuffer[0] = 0; + + const documentBuffer = this.serializeBson(document); + buffers.push(payloadTypeBuffer); + buffers.push(documentBuffer); + + return payloadTypeBuffer.length + documentBuffer.length; + } + + serializeBson(document: Document): Buffer { + return BSON.serialize(document, { + checkKeys: this.checkKeys, + serializeFunctions: this.serializeFunctions, + ignoreUndefined: this.ignoreUndefined + }); + } + + static getRequestId(): number { + _requestId = (_requestId + 1) & 0x7fffffff; + return _requestId; + } +} + +/** @internal */ +export class BinMsg { + parsed: boolean; + raw: Buffer; + data: Buffer; + opts: OpResponseOptions; + length: number; + requestId: number; + responseTo: number; + opCode: number; + fromCompressed?: boolean; + responseFlags: number; + checksumPresent: boolean; + moreToCome: boolean; + exhaustAllowed: boolean; + promoteLongs: boolean; + promoteValues: boolean; + promoteBuffers: boolean; + bsonRegExp: boolean; + documents: (Document | Buffer)[]; + index?: number; + + constructor( + message: Buffer, + msgHeader: MessageHeader, + msgBody: Buffer, + opts?: OpResponseOptions + ) { + this.parsed = false; + this.raw = message; + this.data = msgBody; + this.opts = opts ?? 
{ + promoteLongs: true, + promoteValues: true, + promoteBuffers: false, + bsonRegExp: false + }; + + // Read the message header + this.length = msgHeader.length; + this.requestId = msgHeader.requestId; + this.responseTo = msgHeader.responseTo; + this.opCode = msgHeader.opCode; + this.fromCompressed = msgHeader.fromCompressed; + + // Read response flags + this.responseFlags = msgBody.readInt32LE(0); + this.checksumPresent = (this.responseFlags & OPTS_CHECKSUM_PRESENT) !== 0; + this.moreToCome = (this.responseFlags & OPTS_MORE_TO_COME) !== 0; + this.exhaustAllowed = (this.responseFlags & OPTS_EXHAUST_ALLOWED) !== 0; + this.promoteLongs = typeof this.opts.promoteLongs === 'boolean' ? this.opts.promoteLongs : true; + this.promoteValues = + typeof this.opts.promoteValues === 'boolean' ? this.opts.promoteValues : true; + this.promoteBuffers = + typeof this.opts.promoteBuffers === 'boolean' ? this.opts.promoteBuffers : false; + this.bsonRegExp = typeof this.opts.bsonRegExp === 'boolean' ? this.opts.bsonRegExp : false; + + this.documents = []; + } + + isParsed(): boolean { + return this.parsed; + } + + parse(options: OpResponseOptions): void { + // Don't parse again if not needed + if (this.parsed) return; + options = options ?? {}; + + this.index = 4; + // Allow the return of raw documents instead of parsing + const raw = options.raw || false; + const documentsReturnedIn = options.documentsReturnedIn || null; + const promoteLongs = options.promoteLongs ?? this.opts.promoteLongs; + const promoteValues = options.promoteValues ?? this.opts.promoteValues; + const promoteBuffers = options.promoteBuffers ?? this.opts.promoteBuffers; + const bsonRegExp = options.bsonRegExp ?? this.opts.bsonRegExp; + + // Set up the options + const _options: BSONSerializeOptions = { + promoteLongs, + promoteValues, + promoteBuffers, + bsonRegExp + }; + + while (this.index < this.data.length) { + const payloadType = this.data.readUInt8(this.index++); + if (payloadType === 0) { + const bsonSize = this.data.readUInt32LE(this.index); + const bin = this.data.slice(this.index, this.index + bsonSize); + this.documents.push(raw ? 
bin : BSON.deserialize(bin, _options)); + + this.index += bsonSize; + } else if (payloadType === 1) { + // It was decided that no driver makes use of payload type 1 + + // TODO(NODE-3483): Replace with MongoDeprecationError + throw new MongoRuntimeError('OP_MSG Payload Type 1 detected unsupported protocol'); + } + } + + if (this.documents.length === 1 && documentsReturnedIn != null && raw) { + const fieldsAsRaw: Document = {}; + fieldsAsRaw[documentsReturnedIn] = true; + _options.fieldsAsRaw = fieldsAsRaw; + + const doc = BSON.deserialize(this.documents[0] as Buffer, _options); + this.documents = [doc]; + } + + this.parsed = true; + } +} diff --git a/node_modules/mongodb/src/cmap/connect.ts b/node_modules/mongodb/src/cmap/connect.ts new file mode 100644 index 000000000..07917c80e --- /dev/null +++ b/node_modules/mongodb/src/cmap/connect.ts @@ -0,0 +1,400 @@ +import * as net from 'net'; +import * as tls from 'tls'; +import { Connection, ConnectionOptions, CryptoConnection } from './connection'; +import { + MongoNetworkError, + MongoNetworkTimeoutError, + AnyError, + MongoCompatibilityError, + MongoInvalidArgumentError, + MongoServerError, + MongoRuntimeError +} from '../error'; +import { AUTH_PROVIDERS, AuthMechanism } from './auth/defaultAuthProviders'; +import { AuthContext } from './auth/auth_provider'; +import { makeClientMetadata, ClientMetadata, Callback, CallbackWithType, ns } from '../utils'; +import { + MAX_SUPPORTED_WIRE_VERSION, + MAX_SUPPORTED_SERVER_VERSION, + MIN_SUPPORTED_WIRE_VERSION, + MIN_SUPPORTED_SERVER_VERSION +} from './wire_protocol/constants'; +import type { Document } from '../bson'; +import { Int32 } from '../bson'; + +import type { Socket, SocketConnectOpts } from 'net'; +import type { TLSSocket, ConnectionOptions as TLSConnectionOpts } from 'tls'; + +const FAKE_MONGODB_SERVICE_ID = + typeof process.env.FAKE_MONGODB_SERVICE_ID === 'string' && + process.env.FAKE_MONGODB_SERVICE_ID.toLowerCase() === 'true'; + +/** @public */ +export type Stream = Socket | TLSSocket; + +export function connect(options: ConnectionOptions, callback: Callback): void { + makeConnection(options, (err, socket) => { + if (err || !socket) { + return callback(err); + } + + let ConnectionType = options.connectionType ?? Connection; + if (options.autoEncrypter) { + ConnectionType = CryptoConnection; + } + performInitialHandshake(new ConnectionType(socket, options), options, callback); + }); +} + +function checkSupportedServer(ismaster: Document, options: ConnectionOptions) { + const serverVersionHighEnough = + ismaster && + (typeof ismaster.maxWireVersion === 'number' || ismaster.maxWireVersion instanceof Int32) && + ismaster.maxWireVersion >= MIN_SUPPORTED_WIRE_VERSION; + const serverVersionLowEnough = + ismaster && + (typeof ismaster.minWireVersion === 'number' || ismaster.minWireVersion instanceof Int32) && + ismaster.minWireVersion <= MAX_SUPPORTED_WIRE_VERSION; + + if (serverVersionHighEnough) { + if (serverVersionLowEnough) { + return null; + } + + const message = `Server at ${options.hostAddress} reports minimum wire version ${JSON.stringify( + ismaster.minWireVersion + )}, but this version of the Node.js Driver requires at most ${MAX_SUPPORTED_WIRE_VERSION} (MongoDB ${MAX_SUPPORTED_SERVER_VERSION})`; + return new MongoCompatibilityError(message); + } + + const message = `Server at ${options.hostAddress} reports maximum wire version ${ + JSON.stringify(ismaster.maxWireVersion) ?? 
0 + }, but this version of the Node.js Driver requires at least ${MIN_SUPPORTED_WIRE_VERSION} (MongoDB ${MIN_SUPPORTED_SERVER_VERSION})`; + return new MongoCompatibilityError(message); +} + +function performInitialHandshake( + conn: Connection, + options: ConnectionOptions, + _callback: Callback +) { + const callback: Callback = function (err, ret) { + if (err && conn) { + conn.destroy(); + } + _callback(err, ret); + }; + + const credentials = options.credentials; + if (credentials) { + if ( + !(credentials.mechanism === AuthMechanism.MONGODB_DEFAULT) && + !AUTH_PROVIDERS.get(credentials.mechanism) + ) { + callback( + new MongoInvalidArgumentError(`AuthMechanism '${credentials.mechanism}' not supported`) + ); + return; + } + } + + const authContext = new AuthContext(conn, credentials, options); + prepareHandshakeDocument(authContext, (err, handshakeDoc) => { + if (err || !handshakeDoc) { + return callback(err); + } + + const handshakeOptions: Document = Object.assign({}, options); + if (typeof options.connectTimeoutMS === 'number') { + // The handshake technically is a monitoring check, so its socket timeout should be connectTimeoutMS + handshakeOptions.socketTimeoutMS = options.connectTimeoutMS; + } + + const start = new Date().getTime(); + conn.command(ns('admin.$cmd'), handshakeDoc, handshakeOptions, (err, response) => { + if (err) { + callback(err); + return; + } + + if (response?.ok === 0) { + callback(new MongoServerError(response)); + return; + } + + if ('isWritablePrimary' in response) { + // Provide pre-hello-style response document. + response.ismaster = response.isWritablePrimary; + } + + if (response.helloOk) { + conn.helloOk = true; + } + + const supportedServerErr = checkSupportedServer(response, options); + if (supportedServerErr) { + callback(supportedServerErr); + return; + } + + if (options.loadBalanced) { + // TODO: Durran: Remove when server support exists. (NODE-3431) + if (FAKE_MONGODB_SERVICE_ID) { + response.serviceId = response.topologyVersion.processId; + } + if (!response.serviceId) { + return callback( + new MongoCompatibilityError( + 'Driver attempted to initialize in load balancing mode, ' + + 'but the server does not support this mode.' + ) + ); + } + } + + // NOTE: This is metadata attached to the connection while porting away from + // handshake being done in the `Server` class. Likely, it should be + // relocated, or at very least restructured. + conn.ismaster = response; + conn.lastIsMasterMS = new Date().getTime() - start; + + if (!response.arbiterOnly && credentials) { + // store the response on auth context + authContext.response = response; + + const resolvedCredentials = credentials.resolveAuthMechanism(response); + const provider = AUTH_PROVIDERS.get(resolvedCredentials.mechanism); + if (!provider) { + return callback( + new MongoInvalidArgumentError( + `No AuthProvider for ${resolvedCredentials.mechanism} defined.` + ) + ); + } + provider.auth(authContext, err => { + if (err) return callback(err); + callback(undefined, conn); + }); + + return; + } + + callback(undefined, conn); + }); + }); +} + +export interface HandshakeDocument extends Document { + ismaster?: boolean; + hello?: boolean; + helloOk?: boolean; + client: ClientMetadata; + compression: string[]; + saslSupportedMechs?: string; + loadBalanced: boolean; +} + +function prepareHandshakeDocument(authContext: AuthContext, callback: Callback) { + const options = authContext.options; + const compressors = options.compressors ? 
options.compressors : []; + const { serverApi } = authContext.connection; + + const handshakeDoc: HandshakeDocument = { + [serverApi?.version ? 'hello' : 'ismaster']: true, + helloOk: true, + client: options.metadata || makeClientMetadata(options), + compression: compressors, + loadBalanced: options.loadBalanced + }; + + const credentials = authContext.credentials; + if (credentials) { + if (credentials.mechanism === AuthMechanism.MONGODB_DEFAULT && credentials.username) { + handshakeDoc.saslSupportedMechs = `${credentials.source}.${credentials.username}`; + + const provider = AUTH_PROVIDERS.get(AuthMechanism.MONGODB_SCRAM_SHA256); + if (!provider) { + // This auth mechanism is always present. + return callback( + new MongoInvalidArgumentError( + `No AuthProvider for ${AuthMechanism.MONGODB_SCRAM_SHA256} defined.` + ) + ); + } + return provider.prepare(handshakeDoc, authContext, callback); + } + const provider = AUTH_PROVIDERS.get(credentials.mechanism); + if (!provider) { + return callback( + new MongoInvalidArgumentError(`No AuthProvider for ${credentials.mechanism} defined.`) + ); + } + return provider.prepare(handshakeDoc, authContext, callback); + } + callback(undefined, handshakeDoc); +} + +/** @public */ +export const LEGAL_TLS_SOCKET_OPTIONS = [ + 'ALPNProtocols', + 'ca', + 'cert', + 'checkServerIdentity', + 'ciphers', + 'crl', + 'ecdhCurve', + 'key', + 'minDHSize', + 'passphrase', + 'pfx', + 'rejectUnauthorized', + 'secureContext', + 'secureProtocol', + 'servername', + 'session' +] as const; + +/** @public */ +export const LEGAL_TCP_SOCKET_OPTIONS = [ + 'family', + 'hints', + 'localAddress', + 'localPort', + 'lookup' +] as const; + +function parseConnectOptions(options: ConnectionOptions): SocketConnectOpts { + const hostAddress = options.hostAddress; + if (!hostAddress) throw new MongoInvalidArgumentError('Option "hostAddress" is required'); + + const result: Partial = {}; + for (const name of LEGAL_TCP_SOCKET_OPTIONS) { + if (options[name] != null) { + (result as Document)[name] = options[name]; + } + } + + if (typeof hostAddress.socketPath === 'string') { + result.path = hostAddress.socketPath; + return result as net.IpcNetConnectOpts; + } else if (typeof hostAddress.host === 'string') { + result.host = hostAddress.host; + result.port = hostAddress.port; + return result as net.TcpNetConnectOpts; + } else { + // This should never happen since we set up HostAddresses + // But if we don't throw here the socket could hang until timeout + // TODO(NODE-3483) + throw new MongoRuntimeError(`Unexpected HostAddress ${JSON.stringify(hostAddress)}`); + } +} + +function parseSslOptions(options: ConnectionOptions): TLSConnectionOpts { + const result: TLSConnectionOpts = parseConnectOptions(options); + // Merge in valid SSL options + for (const name of LEGAL_TLS_SOCKET_OPTIONS) { + if (options[name] != null) { + (result as Document)[name] = options[name]; + } + } + + // Set default sni servername to be the same as host + if (result.servername == null && result.host && !net.isIP(result.host)) { + result.servername = result.host; + } + + return result; +} + +const SOCKET_ERROR_EVENT_LIST = ['error', 'close', 'timeout', 'parseError'] as const; +type ErrorHandlerEventName = typeof SOCKET_ERROR_EVENT_LIST[number] | 'cancel'; +const SOCKET_ERROR_EVENTS = new Set(SOCKET_ERROR_EVENT_LIST); + +function makeConnection(options: ConnectionOptions, _callback: CallbackWithType) { + const useTLS = options.tls ?? false; + const keepAlive = options.keepAlive ?? 
true; + const socketTimeoutMS = options.socketTimeoutMS ?? Reflect.get(options, 'socketTimeout') ?? 0; + const noDelay = options.noDelay ?? true; + const connectionTimeout = options.connectTimeoutMS ?? 30000; + const rejectUnauthorized = options.rejectUnauthorized ?? true; + const keepAliveInitialDelay = + ((options.keepAliveInitialDelay ?? 120000) > socketTimeoutMS + ? Math.round(socketTimeoutMS / 2) + : options.keepAliveInitialDelay) ?? 120000; + + let socket: Stream; + const callback: Callback = function (err, ret) { + if (err && socket) { + socket.destroy(); + } + + _callback(err, ret); + }; + + if (useTLS) { + const tlsSocket = tls.connect(parseSslOptions(options)); + if (typeof tlsSocket.disableRenegotiation === 'function') { + tlsSocket.disableRenegotiation(); + } + socket = tlsSocket; + } else { + socket = net.createConnection(parseConnectOptions(options)); + } + + socket.setKeepAlive(keepAlive, keepAliveInitialDelay); + socket.setTimeout(connectionTimeout); + socket.setNoDelay(noDelay); + + const connectEvent = useTLS ? 'secureConnect' : 'connect'; + let cancellationHandler: (err: Error) => void; + function errorHandler(eventName: ErrorHandlerEventName) { + return (err: Error) => { + SOCKET_ERROR_EVENTS.forEach(event => socket.removeAllListeners(event)); + if (cancellationHandler && options.cancellationToken) { + options.cancellationToken.removeListener('cancel', cancellationHandler); + } + + socket.removeListener(connectEvent, connectHandler); + callback(connectionFailureError(eventName, err)); + }; + } + + function connectHandler() { + SOCKET_ERROR_EVENTS.forEach(event => socket.removeAllListeners(event)); + if (cancellationHandler && options.cancellationToken) { + options.cancellationToken.removeListener('cancel', cancellationHandler); + } + + if ('authorizationError' in socket) { + if (socket.authorizationError && rejectUnauthorized) { + return callback(socket.authorizationError); + } + } + + socket.setTimeout(socketTimeoutMS); + callback(undefined, socket); + } + + SOCKET_ERROR_EVENTS.forEach(event => socket.once(event, errorHandler(event))); + if (options.cancellationToken) { + cancellationHandler = errorHandler('cancel'); + options.cancellationToken.once('cancel', cancellationHandler); + } + + socket.once(connectEvent, connectHandler); +} + +function connectionFailureError(type: string, err: Error) { + switch (type) { + case 'error': + return new MongoNetworkError(err); + case 'timeout': + return new MongoNetworkTimeoutError('connection timed out'); + case 'close': + return new MongoNetworkError('connection closed'); + case 'cancel': + return new MongoNetworkError('connection establishment was cancelled'); + default: + return new MongoNetworkError('unknown network error'); + } +} diff --git a/node_modules/mongodb/src/cmap/connection.ts b/node_modules/mongodb/src/cmap/connection.ts new file mode 100644 index 000000000..e7b6e52d2 --- /dev/null +++ b/node_modules/mongodb/src/cmap/connection.ts @@ -0,0 +1,854 @@ +import { MessageStream, OperationDescription } from './message_stream'; +import { StreamDescription, StreamDescriptionOptions } from './stream_description'; +import { + CommandStartedEvent, + CommandFailedEvent, + CommandSucceededEvent +} from './command_monitoring_events'; +import { applySession, ClientSession, updateSessionFromResponse } from '../sessions'; +import { + uuidV4, + ClientMetadata, + now, + calculateDurationInMs, + Callback, + MongoDBNamespace, + maxWireVersion, + HostAddress +} from '../utils'; +import { + MongoRuntimeError, + 
MongoMissingDependencyError, + MongoCompatibilityError, + MongoNetworkError, + MongoNetworkTimeoutError, + MongoServerError, + MongoWriteConcernError +} from '../error'; +import { + BinMsg, + WriteProtocolMessageType, + Response, + KillCursor, + GetMore, + Query, + OpQueryOptions, + Msg +} from './commands'; +import { BSONSerializeOptions, Document, Long, pluckBSONSerializeOptions, ObjectId } from '../bson'; +import type { AutoEncrypter } from '../deps'; +import type { MongoCredentials } from './auth/mongo_credentials'; +import type { Stream } from './connect'; +import { applyCommonQueryOptions, getReadPreference, isSharded } from './wire_protocol/shared'; +import { ReadPreference, ReadPreferenceLike } from '../read_preference'; +import type { W, WriteConcern, WriteConcernOptions } from '../write_concern'; +import type { ServerApi, SupportedNodeConnectionOptions } from '../mongo_client'; +import { CancellationToken, TypedEventEmitter } from '../mongo_types'; + +/** @internal */ +const kStream = Symbol('stream'); +/** @internal */ +const kQueue = Symbol('queue'); +/** @internal */ +const kMessageStream = Symbol('messageStream'); +/** @internal */ +const kGeneration = Symbol('generation'); +/** @internal */ +const kLastUseTime = Symbol('lastUseTime'); +/** @internal */ +const kClusterTime = Symbol('clusterTime'); +/** @internal */ +const kDescription = Symbol('description'); +/** @internal */ +const kIsMaster = Symbol('ismaster'); +/** @internal */ +const kAutoEncrypter = Symbol('autoEncrypter'); +/** @internal */ +const kFullResult = Symbol('fullResult'); + +/** @internal */ +export interface QueryOptions extends BSONSerializeOptions { + readPreference: ReadPreference; + documentsReturnedIn?: string; + batchSize?: number; + limit?: number; + skip?: number; + projection?: Document; + tailable?: boolean; + awaitData?: boolean; + noCursorTimeout?: boolean; + /** @deprecated use `noCursorTimeout` instead */ + timeout?: boolean; + partial?: boolean; + oplogReplay?: boolean; +} + +/** @internal */ +export interface CommandOptions extends BSONSerializeOptions { + command?: boolean; + slaveOk?: boolean; + /** Specify read preference if command supports it */ + readPreference?: ReadPreferenceLike; + raw?: boolean; + monitoring?: boolean; + [kFullResult]?: boolean; + socketTimeoutMS?: number; + /** Session to use for the operation */ + session?: ClientSession; + documentsReturnedIn?: string; + noResponse?: boolean; + + // FIXME: NODE-2802 + willRetryWrite?: boolean; + + // FIXME: NODE-2781 + writeConcern?: WriteConcernOptions | WriteConcern | W; +} + +/** @internal */ +export interface GetMoreOptions extends CommandOptions { + batchSize?: number; + maxTimeMS?: number; + maxAwaitTimeMS?: number; + comment?: Document | string; +} + +/** @public */ +export interface ConnectionOptions + extends SupportedNodeConnectionOptions, + StreamDescriptionOptions { + // Internal creation info + id: number | ''; + generation: number; + hostAddress: HostAddress; + // Settings + autoEncrypter?: AutoEncrypter; + serverApi?: ServerApi; + monitorCommands: boolean; + /** @internal */ + connectionType?: typeof Connection; + credentials?: MongoCredentials; + connectTimeoutMS?: number; + tls: boolean; + keepAlive?: boolean; + keepAliveInitialDelay?: number; + noDelay?: boolean; + socketTimeoutMS?: number; + cancellationToken?: CancellationToken; + + metadata: ClientMetadata; +} + +/** @public */ +export interface DestroyOptions { + /** Force the destruction. 
*/ + force?: boolean; +} + +/** @public */ +export type ConnectionEvents = { + commandStarted(event: CommandStartedEvent): void; + commandSucceeded(event: CommandSucceededEvent): void; + commandFailed(event: CommandFailedEvent): void; + clusterTimeReceived(clusterTime: Document): void; + close(): void; + message(message: any): void; + pinned(pinType: string): void; + unpinned(pinType: string): void; +}; + +/** @internal */ +export class Connection extends TypedEventEmitter { + id: number | ''; + address: string; + socketTimeoutMS: number; + monitorCommands: boolean; + closed: boolean; + destroyed: boolean; + lastIsMasterMS?: number; + serverApi?: ServerApi; + helloOk?: boolean; + /** @internal */ + [kDescription]: StreamDescription; + /** @internal */ + [kGeneration]: number; + /** @internal */ + [kLastUseTime]: number; + /** @internal */ + [kQueue]: Map; + /** @internal */ + [kMessageStream]: MessageStream; + /** @internal */ + [kStream]: Stream; + /** @internal */ + [kIsMaster]: Document; + /** @internal */ + [kClusterTime]: Document; + + /** @event */ + static readonly COMMAND_STARTED = 'commandStarted' as const; + /** @event */ + static readonly COMMAND_SUCCEEDED = 'commandSucceeded' as const; + /** @event */ + static readonly COMMAND_FAILED = 'commandFailed' as const; + /** @event */ + static readonly CLUSTER_TIME_RECEIVED = 'clusterTimeReceived' as const; + /** @event */ + static readonly CLOSE = 'close' as const; + /** @event */ + static readonly MESSAGE = 'message' as const; + /** @event */ + static readonly PINNED = 'pinned' as const; + /** @event */ + static readonly UNPINNED = 'unpinned' as const; + + constructor(stream: Stream, options: ConnectionOptions) { + super(); + this.id = options.id; + this.address = streamIdentifier(stream); + this.socketTimeoutMS = options.socketTimeoutMS ?? 
0; + this.monitorCommands = options.monitorCommands; + this.serverApi = options.serverApi; + this.closed = false; + this.destroyed = false; + + this[kDescription] = new StreamDescription(this.address, options); + this[kGeneration] = options.generation; + this[kLastUseTime] = now(); + + // setup parser stream and message handling + this[kQueue] = new Map(); + this[kMessageStream] = new MessageStream({ + ...options, + maxBsonMessageSize: this.ismaster?.maxBsonMessageSize + }); + this[kMessageStream].on('message', messageHandler(this)); + this[kStream] = stream; + stream.on('error', () => { + /* ignore errors, listen to `close` instead */ + }); + + this[kMessageStream].on('error', error => this.handleIssue({ destroy: error })); + stream.on('close', () => this.handleIssue({ isClose: true })); + stream.on('timeout', () => this.handleIssue({ isTimeout: true, destroy: true })); + + // hook the message stream up to the passed in stream + stream.pipe(this[kMessageStream]); + this[kMessageStream].pipe(stream); + } + + get description(): StreamDescription { + return this[kDescription]; + } + + get ismaster(): Document { + return this[kIsMaster]; + } + + // the `connect` method stores the result of the handshake ismaster on the connection + set ismaster(response: Document) { + this[kDescription].receiveResponse(response); + this[kDescription] = Object.freeze(this[kDescription]); + + // TODO: remove this, and only use the `StreamDescription` in the future + this[kIsMaster] = response; + } + + get serviceId(): ObjectId | undefined { + return this.ismaster?.serviceId; + } + + get loadBalanced(): boolean { + return this.description.loadBalanced; + } + + get generation(): number { + return this[kGeneration] || 0; + } + + set generation(generation: number) { + this[kGeneration] = generation; + } + + get idleTime(): number { + return calculateDurationInMs(this[kLastUseTime]); + } + + get clusterTime(): Document { + return this[kClusterTime]; + } + + get stream(): Stream { + return this[kStream]; + } + + markAvailable(): void { + this[kLastUseTime] = now(); + } + + handleIssue(issue: { isTimeout?: boolean; isClose?: boolean; destroy?: boolean | Error }): void { + if (this.closed) { + return; + } + + if (issue.destroy) { + this[kStream].destroy(typeof issue.destroy === 'boolean' ? undefined : issue.destroy); + } + + this.closed = true; + + for (const [, op] of this[kQueue]) { + if (issue.isTimeout) { + op.cb( + new MongoNetworkTimeoutError(`connection ${this.id} to ${this.address} timed out`, { + beforeHandshake: this.ismaster == null + }) + ); + } else if (issue.isClose) { + op.cb(new MongoNetworkError(`connection ${this.id} to ${this.address} closed`)); + } else { + op.cb(typeof issue.destroy === 'boolean' ? 
undefined : issue.destroy); + } + } + + this[kQueue].clear(); + this.emit(Connection.CLOSE); + } + + destroy(): void; + destroy(callback: Callback): void; + destroy(options: DestroyOptions): void; + destroy(options: DestroyOptions, callback: Callback): void; + destroy(options?: DestroyOptions | Callback, callback?: Callback): void { + if (typeof options === 'function') { + callback = options; + options = { force: false }; + } + + this.removeAllListeners(Connection.PINNED); + this.removeAllListeners(Connection.UNPINNED); + + options = Object.assign({ force: false }, options); + if (this[kStream] == null || this.destroyed) { + this.destroyed = true; + if (typeof callback === 'function') { + callback(); + } + + return; + } + + if (options.force) { + this[kStream].destroy(); + this.destroyed = true; + if (typeof callback === 'function') { + callback(); + } + + return; + } + + this[kStream].end(() => { + this.destroyed = true; + if (typeof callback === 'function') { + callback(); + } + }); + } + + /** @internal */ + command( + ns: MongoDBNamespace, + cmd: Document, + options: CommandOptions | undefined, + callback: Callback + ): void { + if (!(ns instanceof MongoDBNamespace)) { + // TODO(NODE-3483): Replace this with a MongoCommandError + throw new MongoRuntimeError('Must provide a MongoDBNamespace instance'); + } + + const readPreference = getReadPreference(cmd, options); + const shouldUseOpMsg = supportsOpMsg(this); + const session = options?.session; + + let clusterTime = this.clusterTime; + let finalCmd = Object.assign({}, cmd); + + if (this.serverApi) { + const { version, strict, deprecationErrors } = this.serverApi; + finalCmd.apiVersion = version; + if (strict != null) finalCmd.apiStrict = strict; + if (deprecationErrors != null) finalCmd.apiDeprecationErrors = deprecationErrors; + } + + if (hasSessionSupport(this) && session) { + if ( + session.clusterTime && + clusterTime && + session.clusterTime.clusterTime.greaterThan(clusterTime.clusterTime) + ) { + clusterTime = session.clusterTime; + } + + const err = applySession(session, finalCmd, options as CommandOptions); + if (err) { + return callback(err); + } + } + + // if we have a known cluster time, gossip it + if (clusterTime) { + finalCmd.$clusterTime = clusterTime; + } + + if (isSharded(this) && !shouldUseOpMsg && readPreference && readPreference.mode !== 'primary') { + finalCmd = { + $query: finalCmd, + $readPreference: readPreference.toJSON() + }; + } + + const commandOptions: Document = Object.assign( + { + command: true, + numberToSkip: 0, + numberToReturn: -1, + checkKeys: false, + // This value is not overridable + slaveOk: readPreference.slaveOk() + }, + options + ); + + const cmdNs = `${ns.db}.$cmd`; + const message = shouldUseOpMsg + ? new Msg(cmdNs, finalCmd, commandOptions) + : new Query(cmdNs, finalCmd, commandOptions); + + try { + write(this, message, commandOptions, callback); + } catch (err) { + callback(err); + } + } + + /** @internal */ + query(ns: MongoDBNamespace, cmd: Document, options: QueryOptions, callback: Callback): void { + const isExplain = cmd.$explain != null; + const readPreference = options.readPreference ?? 
ReadPreference.primary; + const batchSize = options.batchSize || 0; + const limit = options.limit; + const numberToSkip = options.skip || 0; + let numberToReturn = 0; + if ( + limit && + (limit < 0 || (limit !== 0 && limit < batchSize) || (limit > 0 && batchSize === 0)) + ) { + numberToReturn = limit; + } else { + numberToReturn = batchSize; + } + + if (isExplain) { + // nToReturn must be 0 (match all) or negative (match N and close cursor) + // nToReturn > 0 will give explain results equivalent to limit(0) + numberToReturn = -Math.abs(limit || 0); + } + + const queryOptions: OpQueryOptions = { + numberToSkip, + numberToReturn, + pre32Limit: typeof limit === 'number' ? limit : undefined, + checkKeys: false, + slaveOk: readPreference.slaveOk() + }; + + if (options.projection) { + queryOptions.returnFieldSelector = options.projection; + } + + const query = new Query(ns.toString(), cmd, queryOptions); + if (typeof options.tailable === 'boolean') { + query.tailable = options.tailable; + } + + if (typeof options.oplogReplay === 'boolean') { + query.oplogReplay = options.oplogReplay; + } + + if (typeof options.timeout === 'boolean') { + query.noCursorTimeout = !options.timeout; + } else if (typeof options.noCursorTimeout === 'boolean') { + query.noCursorTimeout = options.noCursorTimeout; + } + + if (typeof options.awaitData === 'boolean') { + query.awaitData = options.awaitData; + } + + if (typeof options.partial === 'boolean') { + query.partial = options.partial; + } + + write( + this, + query, + { [kFullResult]: true, ...pluckBSONSerializeOptions(options) }, + (err, result) => { + if (err || !result) return callback(err, result); + if (isExplain && result.documents && result.documents[0]) { + return callback(undefined, result.documents[0]); + } + + callback(undefined, result); + } + ); + } + + /** @internal */ + getMore( + ns: MongoDBNamespace, + cursorId: Long, + options: GetMoreOptions, + callback: Callback + ): void { + const fullResult = !!options[kFullResult]; + const wireVersion = maxWireVersion(this); + if (!cursorId) { + // TODO(NODE-3483): Replace this with a MongoCommandError + callback(new MongoRuntimeError('Invalid internal cursor state, no known cursor id')); + return; + } + + if (wireVersion < 4) { + const getMoreOp = new GetMore(ns.toString(), cursorId, { numberToReturn: options.batchSize }); + const queryOptions = applyCommonQueryOptions( + {}, + Object.assign(options, { ...pluckBSONSerializeOptions(options) }) + ); + + queryOptions[kFullResult] = true; + queryOptions.command = true; + write(this, getMoreOp, queryOptions, (err, response) => { + if (fullResult) return callback(err, response); + if (err) return callback(err); + callback(undefined, { cursor: { id: response.cursorId, nextBatch: response.documents } }); + }); + + return; + } + + const getMoreCmd: Document = { + getMore: cursorId, + collection: ns.collection + }; + + if (typeof options.batchSize === 'number') { + getMoreCmd.batchSize = Math.abs(options.batchSize); + } + + if (typeof options.maxAwaitTimeMS === 'number') { + getMoreCmd.maxTimeMS = options.maxAwaitTimeMS; + } + + const commandOptions = Object.assign( + { + returnFieldSelector: null, + documentsReturnedIn: 'nextBatch' + }, + options + ); + + this.command(ns, getMoreCmd, commandOptions, callback); + } + + /** @internal */ + killCursors( + ns: MongoDBNamespace, + cursorIds: Long[], + options: CommandOptions, + callback: Callback + ): void { + if (!cursorIds || !Array.isArray(cursorIds)) { + // TODO(NODE-3483): Replace this with a MongoCommandError + throw 
new MongoRuntimeError(`Invalid list of cursor ids provided: ${cursorIds}`); + } + + if (maxWireVersion(this) < 4) { + try { + write( + this, + new KillCursor(ns.toString(), cursorIds), + { noResponse: true, ...options }, + callback + ); + } catch (err) { + callback(err); + } + + return; + } + + this.command( + ns, + { killCursors: ns.collection, cursors: cursorIds }, + { [kFullResult]: true, ...options }, + (err, response) => { + if (err || !response) return callback(err); + if (response.cursorNotFound) { + return callback(new MongoNetworkError('cursor killed or timed out'), null); + } + + if (!Array.isArray(response.documents) || response.documents.length === 0) { + return callback( + // TODO(NODE-3483) + new MongoRuntimeError( + `invalid killCursors result returned for cursor id ${cursorIds[0]}` + ) + ); + } + + callback(undefined, response.documents[0]); + } + ); + } +} + +/** @public */ +export const APM_EVENTS = [ + Connection.COMMAND_STARTED, + Connection.COMMAND_SUCCEEDED, + Connection.COMMAND_FAILED +]; + +/** @internal */ +export class CryptoConnection extends Connection { + /** @internal */ + [kAutoEncrypter]?: AutoEncrypter; + + constructor(stream: Stream, options: ConnectionOptions) { + super(stream, options); + this[kAutoEncrypter] = options.autoEncrypter; + } + + /** @internal @override */ + command(ns: MongoDBNamespace, cmd: Document, options: CommandOptions, callback: Callback): void { + const autoEncrypter = this[kAutoEncrypter]; + if (!autoEncrypter) { + return callback(new MongoMissingDependencyError('No AutoEncrypter available for encryption')); + } + + const serverWireVersion = maxWireVersion(this); + if (serverWireVersion === 0) { + // This means the initial handshake hasn't happened yet + return super.command(ns, cmd, options, callback); + } + + if (serverWireVersion < 8) { + callback( + new MongoCompatibilityError('Auto-encryption requires a minimum MongoDB version of 4.2') + ); + return; + } + + autoEncrypter.encrypt(ns.toString(), cmd, options, (err, encrypted) => { + if (err || encrypted == null) { + callback(err, null); + return; + } + + super.command(ns, encrypted, options, (err, response) => { + if (err || response == null) { + callback(err, response); + return; + } + + autoEncrypter.decrypt(response, options, callback); + }); + }); + } +} + +/** @internal */ +export function hasSessionSupport(conn: Connection): boolean { + const description = conn.description; + return description.logicalSessionTimeoutMinutes != null || !!description.loadBalanced; +} + +function supportsOpMsg(conn: Connection) { + const description = conn.description; + if (description == null) { + return false; + } + + return maxWireVersion(conn) >= 6 && !description.__nodejs_mock_server__; +} + +function messageHandler(conn: Connection) { + return function messageHandler(message: BinMsg | Response) { + // always emit the message, in case we are streaming + conn.emit('message', message); + const operationDescription = conn[kQueue].get(message.responseTo); + if (!operationDescription) { + return; + } + + const callback = operationDescription.cb; + + // SERVER-45775: For exhaust responses we should be able to use the same requestId to + // track response, however the server currently synthetically produces remote requests + // making the `responseTo` change on each response + conn[kQueue].delete(message.responseTo); + if ('moreToCome' in message && message.moreToCome) { + // requeue the callback for next synthetic request + conn[kQueue].set(message.requestId, operationDescription); + } else if 
(operationDescription.socketTimeoutOverride) { + conn[kStream].setTimeout(conn.socketTimeoutMS); + } + + try { + // Pass in the entire description because it has BSON parsing options + message.parse(operationDescription); + } catch (err) { + // If this error is generated by our own code, it will already have the correct class applied + // if it is not, then it is coming from a catastrophic data parse failure or the BSON library + // in either case, it should not be wrapped + callback(err); + return; + } + + if (message.documents[0]) { + const document: Document = message.documents[0]; + const session = operationDescription.session; + if (session) { + updateSessionFromResponse(session, document); + } + + if (document.$clusterTime) { + conn[kClusterTime] = document.$clusterTime; + conn.emit(Connection.CLUSTER_TIME_RECEIVED, document.$clusterTime); + } + + if (operationDescription.command) { + if (document.writeConcernError) { + callback(new MongoWriteConcernError(document.writeConcernError, document)); + return; + } + + if (document.ok === 0 || document.$err || document.errmsg || document.code) { + callback(new MongoServerError(document)); + return; + } + } else { + // Pre 3.2 support + if (document.ok === 0 || document.$err || document.errmsg) { + callback(new MongoServerError(document)); + return; + } + } + } + + callback(undefined, operationDescription.fullResult ? message : message.documents[0]); + }; +} + +function streamIdentifier(stream: Stream) { + if (typeof stream.address === 'function') { + return `${stream.remoteAddress}:${stream.remotePort}`; + } + + return uuidV4().toString('hex'); +} + +function write( + conn: Connection, + command: WriteProtocolMessageType, + options: CommandOptions, + callback: Callback +) { + if (typeof options === 'function') { + callback = options; + } + + options = options ?? {}; + const operationDescription: OperationDescription = { + requestId: command.requestId, + cb: callback, + session: options.session, + fullResult: !!options[kFullResult], + noResponse: typeof options.noResponse === 'boolean' ? options.noResponse : false, + documentsReturnedIn: options.documentsReturnedIn, + command: !!options.command, + + // for BSON parsing + promoteLongs: typeof options.promoteLongs === 'boolean' ? options.promoteLongs : true, + promoteValues: typeof options.promoteValues === 'boolean' ? options.promoteValues : true, + promoteBuffers: typeof options.promoteBuffers === 'boolean' ? options.promoteBuffers : false, + bsonRegExp: typeof options.bsonRegExp === 'boolean' ? options.bsonRegExp : false, + raw: typeof options.raw === 'boolean' ? 
options.raw : false, + started: 0 + }; + + if (conn[kDescription] && conn[kDescription].compressor) { + operationDescription.agreedCompressor = conn[kDescription].compressor; + + if (conn[kDescription].zlibCompressionLevel) { + operationDescription.zlibCompressionLevel = conn[kDescription].zlibCompressionLevel; + } + } + + if (typeof options.socketTimeoutMS === 'number') { + operationDescription.socketTimeoutOverride = true; + conn[kStream].setTimeout(options.socketTimeoutMS); + } + + // if command monitoring is enabled we need to modify the callback here + if (conn.monitorCommands) { + conn.emit(Connection.COMMAND_STARTED, new CommandStartedEvent(conn, command)); + + operationDescription.started = now(); + operationDescription.cb = (err, reply) => { + if (err) { + conn.emit( + Connection.COMMAND_FAILED, + new CommandFailedEvent(conn, command, err, operationDescription.started) + ); + } else { + if (reply && (reply.ok === 0 || reply.$err)) { + conn.emit( + Connection.COMMAND_FAILED, + new CommandFailedEvent(conn, command, reply, operationDescription.started) + ); + } else { + conn.emit( + Connection.COMMAND_SUCCEEDED, + new CommandSucceededEvent(conn, command, reply, operationDescription.started) + ); + } + } + + if (typeof callback === 'function') { + callback(err, reply); + } + }; + } + + if (!operationDescription.noResponse) { + conn[kQueue].set(operationDescription.requestId, operationDescription); + } + + try { + conn[kMessageStream].writeCommand(command, operationDescription); + } catch (e) { + if (!operationDescription.noResponse) { + conn[kQueue].delete(operationDescription.requestId); + operationDescription.cb(e); + return; + } + } + + if (operationDescription.noResponse) { + operationDescription.cb(); + } +} diff --git a/node_modules/mongodb/src/cmap/connection_pool.ts b/node_modules/mongodb/src/cmap/connection_pool.ts new file mode 100644 index 000000000..83c74d740 --- /dev/null +++ b/node_modules/mongodb/src/cmap/connection_pool.ts @@ -0,0 +1,692 @@ +import Denque = require('denque'); +import { APM_EVENTS, Connection, ConnectionEvents, ConnectionOptions } from './connection'; +import type { ObjectId } from 'bson'; +import { Logger } from '../logger'; +import { ConnectionPoolMetrics } from './metrics'; +import { connect } from './connect'; +import { eachAsync, makeCounter, Callback } from '../utils'; +import { MongoError, MongoInvalidArgumentError, MongoRuntimeError } from '../error'; +import { PoolClosedError, WaitQueueTimeoutError } from './errors'; +import { + ConnectionPoolCreatedEvent, + ConnectionPoolClosedEvent, + ConnectionCreatedEvent, + ConnectionReadyEvent, + ConnectionClosedEvent, + ConnectionCheckOutStartedEvent, + ConnectionCheckOutFailedEvent, + ConnectionCheckedOutEvent, + ConnectionCheckedInEvent, + ConnectionPoolClearedEvent +} from './connection_pool_events'; +import { CancellationToken, TypedEventEmitter } from '../mongo_types'; + +/** @internal */ +const kLogger = Symbol('logger'); +/** @internal */ +const kConnections = Symbol('connections'); +/** @internal */ +const kPermits = Symbol('permits'); +/** @internal */ +const kMinPoolSizeTimer = Symbol('minPoolSizeTimer'); +/** @internal */ +const kGeneration = Symbol('generation'); +/** @internal */ +const kServiceGenerations = Symbol('serviceGenerations'); +/** @internal */ +const kConnectionCounter = Symbol('connectionCounter'); +/** @internal */ +const kCancellationToken = Symbol('cancellationToken'); +/** @internal */ +const kWaitQueue = Symbol('waitQueue'); +/** @internal */ +const kCancelled = 
Symbol('cancelled'); +/** @internal */ +const kMetrics = Symbol('metrics'); +/** @internal */ +const kCheckedOut = Symbol('checkedOut'); +/** @internal */ +const kProcessingWaitQueue = Symbol('processingWaitQueue'); + +/** @public */ +export interface ConnectionPoolOptions extends Omit { + /** The maximum number of connections that may be associated with a pool at a given time. This includes in use and available connections. */ + maxPoolSize: number; + /** The minimum number of connections that MUST exist at any moment in a single connection pool. */ + minPoolSize: number; + /** The maximum amount of time a connection should remain idle in the connection pool before being marked idle. */ + maxIdleTimeMS: number; + /** The maximum amount of time operation execution should wait for a connection to become available. The default is 0 which means there is no limit. */ + waitQueueTimeoutMS: number; + /** If we are in load balancer mode. */ + loadBalanced: boolean; +} + +/** @internal */ +export interface WaitQueueMember { + callback: Callback; + timer?: NodeJS.Timeout; + [kCancelled]?: boolean; +} + +/** @public */ +export interface CloseOptions { + force?: boolean; +} + +/** @public */ +export type ConnectionPoolEvents = { + connectionPoolCreated(event: ConnectionPoolCreatedEvent): void; + connectionPoolClosed(event: ConnectionPoolClosedEvent): void; + connectionPoolCleared(event: ConnectionPoolClearedEvent): void; + connectionCreated(event: ConnectionCreatedEvent): void; + connectionReady(event: ConnectionReadyEvent): void; + connectionClosed(event: ConnectionClosedEvent): void; + connectionCheckOutStarted(event: ConnectionCheckOutStartedEvent): void; + connectionCheckOutFailed(event: ConnectionCheckOutFailedEvent): void; + connectionCheckedOut(event: ConnectionCheckedOutEvent): void; + connectionCheckedIn(event: ConnectionCheckedInEvent): void; +} & Omit; + +/** + * A pool of connections which dynamically resizes, and emit events related to pool activity + * @internal + */ +export class ConnectionPool extends TypedEventEmitter { + closed: boolean; + options: Readonly; + /** @internal */ + [kLogger]: Logger; + /** @internal */ + [kConnections]: Denque; + /** + * An integer expressing how many total connections are permitted + * @internal + */ + [kPermits]: number; + /** @internal */ + [kMinPoolSizeTimer]?: NodeJS.Timeout; + /** + * An integer representing the SDAM generation of the pool + * @internal + */ + [kGeneration]: number; + /** A map of generations to service ids + * @internal + */ + [kServiceGenerations]: Map; + /** @internal */ + [kConnectionCounter]: Generator; + /** @internal */ + [kCancellationToken]: CancellationToken; + /** @internal */ + [kWaitQueue]: Denque; + /** @internal */ + [kMetrics]: ConnectionPoolMetrics; + /** @internal */ + [kCheckedOut]: number; + /** @internal */ + [kProcessingWaitQueue]: boolean; + + /** + * Emitted when the connection pool is created. + * @event + */ + static readonly CONNECTION_POOL_CREATED = 'connectionPoolCreated' as const; + /** + * Emitted once when the connection pool is closed + * @event + */ + static readonly CONNECTION_POOL_CLOSED = 'connectionPoolClosed' as const; + /** + * Emitted each time the connection pool is cleared and it's generation incremented + * @event + */ + static readonly CONNECTION_POOL_CLEARED = 'connectionPoolCleared' as const; + /** + * Emitted when a connection is created. 
+ * @event + */ + static readonly CONNECTION_CREATED = 'connectionCreated' as const; + /** + * Emitted when a connection becomes established, and is ready to use + * @event + */ + static readonly CONNECTION_READY = 'connectionReady' as const; + /** + * Emitted when a connection is closed + * @event + */ + static readonly CONNECTION_CLOSED = 'connectionClosed' as const; + /** + * Emitted when an attempt to check out a connection begins + * @event + */ + static readonly CONNECTION_CHECK_OUT_STARTED = 'connectionCheckOutStarted' as const; + /** + * Emitted when an attempt to check out a connection fails + * @event + */ + static readonly CONNECTION_CHECK_OUT_FAILED = 'connectionCheckOutFailed' as const; + /** + * Emitted each time a connection is successfully checked out of the connection pool + * @event + */ + static readonly CONNECTION_CHECKED_OUT = 'connectionCheckedOut' as const; + /** + * Emitted each time a connection is successfully checked into the connection pool + * @event + */ + static readonly CONNECTION_CHECKED_IN = 'connectionCheckedIn' as const; + + /** @internal */ + constructor(options: ConnectionPoolOptions) { + super(); + + this.closed = false; + this.options = Object.freeze({ + ...options, + connectionType: Connection, + maxPoolSize: options.maxPoolSize ?? 100, + minPoolSize: options.minPoolSize ?? 0, + maxIdleTimeMS: options.maxIdleTimeMS ?? 0, + waitQueueTimeoutMS: options.waitQueueTimeoutMS ?? 0, + autoEncrypter: options.autoEncrypter, + metadata: options.metadata + }); + + if (this.options.minPoolSize > this.options.maxPoolSize) { + throw new MongoInvalidArgumentError( + 'Connection pool minimum size must not be greater than maximum pool size' + ); + } + + this[kLogger] = new Logger('ConnectionPool'); + this[kConnections] = new Denque(); + this[kPermits] = this.options.maxPoolSize; + this[kMinPoolSizeTimer] = undefined; + this[kGeneration] = 0; + this[kServiceGenerations] = new Map(); + this[kConnectionCounter] = makeCounter(1); + this[kCancellationToken] = new CancellationToken(); + this[kCancellationToken].setMaxListeners(Infinity); + this[kWaitQueue] = new Denque(); + this[kMetrics] = new ConnectionPoolMetrics(); + this[kCheckedOut] = 0; + this[kProcessingWaitQueue] = false; + + process.nextTick(() => { + this.emit(ConnectionPool.CONNECTION_POOL_CREATED, new ConnectionPoolCreatedEvent(this)); + ensureMinPoolSize(this); + }); + } + + /** The address of the endpoint the pool is connected to */ + get address(): string { + return this.options.hostAddress.toString(); + } + + /** An integer representing the SDAM generation of the pool */ + get generation(): number { + return this[kGeneration]; + } + + /** An integer expressing how many total connections (active + in use) the pool currently has */ + get totalConnectionCount(): number { + return this[kConnections].length + (this.options.maxPoolSize - this[kPermits]); + } + + /** An integer expressing how many connections are currently available in the pool. */ + get availableConnectionCount(): number { + return this[kConnections].length; + } + + get waitQueueSize(): number { + return this[kWaitQueue].length; + } + + get loadBalanced(): boolean { + return this.options.loadBalanced; + } + + get serviceGenerations(): Map { + return this[kServiceGenerations]; + } + + get currentCheckedOutCount(): number { + return this[kCheckedOut]; + } + + /** + * Get the metrics information for the pool when a wait queue timeout occurs. 
+ */ + private waitQueueErrorMetrics(): string { + return this[kMetrics].info(this.options.maxPoolSize); + } + + /** + * Check a connection out of this pool. The connection will continue to be tracked, but no reference to it + * will be held by the pool. This means that if a connection is checked out it MUST be checked back in or + * explicitly destroyed by the new owner. + */ + checkOut(callback: Callback): void { + this.emit( + ConnectionPool.CONNECTION_CHECK_OUT_STARTED, + new ConnectionCheckOutStartedEvent(this) + ); + + if (this.closed) { + this.emit( + ConnectionPool.CONNECTION_CHECK_OUT_FAILED, + new ConnectionCheckOutFailedEvent(this, 'poolClosed') + ); + callback(new PoolClosedError(this)); + return; + } + + const waitQueueMember: WaitQueueMember = { callback }; + const waitQueueTimeoutMS = this.options.waitQueueTimeoutMS; + if (waitQueueTimeoutMS) { + waitQueueMember.timer = setTimeout(() => { + waitQueueMember[kCancelled] = true; + waitQueueMember.timer = undefined; + + this.emit( + ConnectionPool.CONNECTION_CHECK_OUT_FAILED, + new ConnectionCheckOutFailedEvent(this, 'timeout') + ); + waitQueueMember.callback( + new WaitQueueTimeoutError( + this.loadBalanced + ? this.waitQueueErrorMetrics() + : 'Timed out while checking out a connection from connection pool', + this.address + ) + ); + }, waitQueueTimeoutMS); + } + + this[kCheckedOut] = this[kCheckedOut] + 1; + this[kWaitQueue].push(waitQueueMember); + process.nextTick(processWaitQueue, this); + } + + /** + * Check a connection into the pool. + * + * @param connection - The connection to check in + */ + checkIn(connection: Connection): void { + const poolClosed = this.closed; + const stale = connectionIsStale(this, connection); + const willDestroy = !!(poolClosed || stale || connection.closed); + + if (!willDestroy) { + connection.markAvailable(); + this[kConnections].unshift(connection); + } + + this[kCheckedOut] = this[kCheckedOut] - 1; + this.emit(ConnectionPool.CONNECTION_CHECKED_IN, new ConnectionCheckedInEvent(this, connection)); + + if (willDestroy) { + const reason = connection.closed ? 'error' : poolClosed ? 'poolClosed' : 'stale'; + destroyConnection(this, connection, reason); + } + + process.nextTick(processWaitQueue, this); + } + + /** + * Clear the pool + * + * Pool reset is handled by incrementing the pool's generation count. Any existing connection of a + * previous generation will eventually be pruned during subsequent checkouts. + */ + clear(serviceId?: ObjectId): void { + if (this.loadBalanced && serviceId) { + const sid = serviceId.toHexString(); + const generation = this.serviceGenerations.get(sid); + // Only need to worry if the generation exists, since it should + // always be there but typescript needs the check. + if (generation == null) { + // TODO(NODE-3483) + throw new MongoRuntimeError('Service generations are required in load balancer mode.'); + } else { + // Increment the generation for the service id. + this.serviceGenerations.set(sid, generation + 1); + } + } else { + this[kGeneration] += 1; + } + + this.emit('connectionPoolCleared', new ConnectionPoolClearedEvent(this, serviceId)); + } + + /** Close the pool */ + close(callback: Callback): void; + close(options: CloseOptions, callback: Callback): void; + close(_options?: CloseOptions | Callback, _cb?: Callback): void { + let options = _options as CloseOptions; + const callback = (_cb ?? 
_options) as Callback; + if (typeof options === 'function') { + options = {}; + } + + options = Object.assign({ force: false }, options); + if (this.closed) { + return callback(); + } + + // immediately cancel any in-flight connections + this[kCancellationToken].emit('cancel'); + + // drain the wait queue + while (this.waitQueueSize) { + const waitQueueMember = this[kWaitQueue].pop(); + if (waitQueueMember) { + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + if (!waitQueueMember[kCancelled]) { + // TODO(NODE-3483): Replace with MongoConnectionPoolClosedError + waitQueueMember.callback(new MongoRuntimeError('Connection pool closed')); + } + } + } + + // clear the min pool size timer + const minPoolSizeTimer = this[kMinPoolSizeTimer]; + if (minPoolSizeTimer) { + clearTimeout(minPoolSizeTimer); + } + + // end the connection counter + if (typeof this[kConnectionCounter].return === 'function') { + this[kConnectionCounter].return(undefined); + } + + // mark the pool as closed immediately + this.closed = true; + eachAsync( + this[kConnections].toArray(), + (conn, cb) => { + this.emit( + ConnectionPool.CONNECTION_CLOSED, + new ConnectionClosedEvent(this, conn, 'poolClosed') + ); + conn.destroy(options, cb); + }, + err => { + this[kConnections].clear(); + this.emit(ConnectionPool.CONNECTION_POOL_CLOSED, new ConnectionPoolClosedEvent(this)); + callback(err); + } + ); + } + + /** + * Runs a lambda with an implicitly checked out connection, checking that connection back in when the lambda + * has completed by calling back. + * + * NOTE: please note the required signature of `fn` + * + * @remarks When in load balancer mode, connections can be pinned to cursors or transactions. + * In these cases we pass the connection in to this method to ensure it is used and a new + * connection is not checked out. + * + * @param conn - A pinned connection for use in load balancing mode. 
+ * @param fn - A function which operates on a managed connection + * @param callback - The original callback + */ + withConnection( + conn: Connection | undefined, + fn: WithConnectionCallback, + callback?: Callback + ): void { + if (conn) { + // use the provided connection, and do _not_ check it in after execution + fn(undefined, conn, (fnErr, result) => { + if (typeof callback === 'function') { + if (fnErr) { + callback(fnErr); + } else { + callback(undefined, result); + } + } + }); + + return; + } + + this.checkOut((err, conn) => { + // don't callback with `err` here, we might want to act upon it inside `fn` + fn(err as MongoError, conn, (fnErr, result) => { + if (typeof callback === 'function') { + if (fnErr) { + callback(fnErr); + } else { + callback(undefined, result); + } + } + + if (conn) { + this.checkIn(conn); + } + }); + }); + } +} + +function ensureMinPoolSize(pool: ConnectionPool) { + if (pool.closed || pool.options.minPoolSize === 0) { + return; + } + + const minPoolSize = pool.options.minPoolSize; + for (let i = pool.totalConnectionCount; i < minPoolSize; ++i) { + createConnection(pool); + } + + pool[kMinPoolSizeTimer] = setTimeout(() => ensureMinPoolSize(pool), 10); +} + +function connectionIsStale(pool: ConnectionPool, connection: Connection) { + const serviceId = connection.serviceId; + if (pool.loadBalanced && serviceId) { + const sid = serviceId.toHexString(); + const generation = pool.serviceGenerations.get(sid); + return connection.generation !== generation; + } + + return connection.generation !== pool[kGeneration]; +} + +function connectionIsIdle(pool: ConnectionPool, connection: Connection) { + return !!(pool.options.maxIdleTimeMS && connection.idleTime > pool.options.maxIdleTimeMS); +} + +function createConnection(pool: ConnectionPool, callback?: Callback) { + const connectOptions: ConnectionOptions = { + ...pool.options, + id: pool[kConnectionCounter].next().value, + generation: pool[kGeneration], + cancellationToken: pool[kCancellationToken] + }; + + pool[kPermits]--; + connect(connectOptions, (err, connection) => { + if (err || !connection) { + pool[kPermits]++; + pool[kLogger].debug(`connection attempt failed with error [${JSON.stringify(err)}]`); + if (typeof callback === 'function') { + callback(err); + } + + return; + } + + // The pool might have closed since we started trying to create a connection + if (pool.closed) { + connection.destroy({ force: true }); + return; + } + + // forward all events from the connection to the pool + for (const event of [...APM_EVENTS, Connection.CLUSTER_TIME_RECEIVED]) { + connection.on(event, (e: any) => pool.emit(event, e)); + } + + pool.emit(ConnectionPool.CONNECTION_CREATED, new ConnectionCreatedEvent(pool, connection)); + + if (pool.loadBalanced) { + connection.on(Connection.PINNED, pinType => pool[kMetrics].markPinned(pinType)); + connection.on(Connection.UNPINNED, pinType => pool[kMetrics].markUnpinned(pinType)); + + const serviceId = connection.serviceId; + if (serviceId) { + let generation; + const sid = serviceId.toHexString(); + if ((generation = pool.serviceGenerations.get(sid))) { + connection.generation = generation; + } else { + pool.serviceGenerations.set(sid, 0); + connection.generation = 0; + } + } + } + + connection.markAvailable(); + pool.emit(ConnectionPool.CONNECTION_READY, new ConnectionReadyEvent(pool, connection)); + + // if a callback has been provided, check out the connection immediately + if (typeof callback === 'function') { + callback(undefined, connection); + return; + } + + // otherwise add 
it to the pool for later acquisition, and try to process the wait queue + pool[kConnections].push(connection); + process.nextTick(processWaitQueue, pool); + }); +} + +function destroyConnection(pool: ConnectionPool, connection: Connection, reason: string) { + pool.emit(ConnectionPool.CONNECTION_CLOSED, new ConnectionClosedEvent(pool, connection, reason)); + + // allow more connections to be created + pool[kPermits]++; + + // destroy the connection + process.nextTick(() => connection.destroy()); +} + +function processWaitQueue(pool: ConnectionPool) { + if (pool.closed || pool[kProcessingWaitQueue]) { + return; + } + + pool[kProcessingWaitQueue] = true; + while (pool.waitQueueSize) { + const waitQueueMember = pool[kWaitQueue].peekFront(); + if (!waitQueueMember) { + pool[kWaitQueue].shift(); + continue; + } + + if (waitQueueMember[kCancelled]) { + pool[kWaitQueue].shift(); + continue; + } + + if (!pool.availableConnectionCount) { + break; + } + + const connection = pool[kConnections].shift(); + if (!connection) { + break; + } + + const isStale = connectionIsStale(pool, connection); + const isIdle = connectionIsIdle(pool, connection); + if (!isStale && !isIdle && !connection.closed) { + pool.emit( + ConnectionPool.CONNECTION_CHECKED_OUT, + new ConnectionCheckedOutEvent(pool, connection) + ); + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + + pool[kWaitQueue].shift(); + waitQueueMember.callback(undefined, connection); + } else { + const reason = connection.closed ? 'error' : isStale ? 'stale' : 'idle'; + destroyConnection(pool, connection, reason); + } + } + + const maxPoolSize = pool.options.maxPoolSize; + if (pool.waitQueueSize && (maxPoolSize <= 0 || pool.totalConnectionCount < maxPoolSize)) { + createConnection(pool, (err, connection) => { + const waitQueueMember = pool[kWaitQueue].shift(); + if (!waitQueueMember || waitQueueMember[kCancelled]) { + if (!err && connection) { + pool[kConnections].push(connection); + } + + pool[kProcessingWaitQueue] = false; + return; + } + + if (err) { + pool.emit( + ConnectionPool.CONNECTION_CHECK_OUT_FAILED, + new ConnectionCheckOutFailedEvent(pool, err) + ); + } else if (connection) { + pool.emit( + ConnectionPool.CONNECTION_CHECKED_OUT, + new ConnectionCheckedOutEvent(pool, connection) + ); + } + + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + waitQueueMember.callback(err, connection); + pool[kProcessingWaitQueue] = false; + process.nextTick(() => processWaitQueue(pool)); + }); + } else { + pool[kProcessingWaitQueue] = false; + } +} + +export const CMAP_EVENTS = [ + ConnectionPool.CONNECTION_POOL_CREATED, + ConnectionPool.CONNECTION_POOL_CLOSED, + ConnectionPool.CONNECTION_CREATED, + ConnectionPool.CONNECTION_READY, + ConnectionPool.CONNECTION_CLOSED, + ConnectionPool.CONNECTION_CHECK_OUT_STARTED, + ConnectionPool.CONNECTION_CHECK_OUT_FAILED, + ConnectionPool.CONNECTION_CHECKED_OUT, + ConnectionPool.CONNECTION_CHECKED_IN, + ConnectionPool.CONNECTION_POOL_CLEARED +] as const; + +/** + * A callback provided to `withConnection` + * @internal + * + * @param error - An error instance representing the error during the execution. + * @param connection - The managed connection which was checked out of the pool. 
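The events enumerated in `CMAP_EVENTS` are forwarded from the pool to the `MongoClient`, so pool activity can be observed with ordinary listeners. A short sketch, assuming a placeholder connection string:

```ts
import { MongoClient } from 'mongodb';

const client = new MongoClient('mongodb://localhost:27017'); // placeholder URI

// Each listener receives one of the event classes defined in
// connection_pool_events.ts (address on every event, connectionId on
// the per-connection ones).
client.on('connectionPoolCreated', event => console.log('pool created at', event.address));
client.on('connectionCheckedOut', event => console.log('checked out', event.connectionId));
client.on('connectionCheckedIn', event => console.log('checked in', event.connectionId));
client.on('connectionPoolCleared', event => console.log('pool cleared at', event.address));
```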
+ * @param callback - A function to call back after connection management is complete + */ +export type WithConnectionCallback = ( + error: MongoError | undefined, + connection: Connection | undefined, + callback: Callback +) => void; diff --git a/node_modules/mongodb/src/cmap/connection_pool_events.ts b/node_modules/mongodb/src/cmap/connection_pool_events.ts new file mode 100644 index 000000000..8083f6d2c --- /dev/null +++ b/node_modules/mongodb/src/cmap/connection_pool_events.ts @@ -0,0 +1,179 @@ +import type { ObjectId } from '../bson'; +import type { Connection } from './connection'; +import type { ConnectionPool, ConnectionPoolOptions } from './connection_pool'; +import type { AnyError } from '../error'; + +/** + * The base export class for all monitoring events published from the connection pool + * @public + * @category Event + */ +export class ConnectionPoolMonitoringEvent { + /** A timestamp when the event was created */ + time: Date; + /** The address (host/port pair) of the pool */ + address: string; + + /** @internal */ + constructor(pool: ConnectionPool) { + this.time = new Date(); + this.address = pool.address; + } +} + +/** + * An event published when a connection pool is created + * @public + * @category Event + */ +export class ConnectionPoolCreatedEvent extends ConnectionPoolMonitoringEvent { + /** The options used to create this connection pool */ + options?: ConnectionPoolOptions; + + /** @internal */ + constructor(pool: ConnectionPool) { + super(pool); + this.options = pool.options; + } +} + +/** + * An event published when a connection pool is closed + * @public + * @category Event + */ +export class ConnectionPoolClosedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool: ConnectionPool) { + super(pool); + } +} + +/** + * An event published when a connection pool creates a new connection + * @public + * @category Event + */ +export class ConnectionCreatedEvent extends ConnectionPoolMonitoringEvent { + /** A monotonically increasing, per-pool id for the newly created connection */ + connectionId: number | ''; + + /** @internal */ + constructor(pool: ConnectionPool, connection: Connection) { + super(pool); + this.connectionId = connection.id; + } +} + +/** + * An event published when a connection is ready for use + * @public + * @category Event + */ +export class ConnectionReadyEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + + /** @internal */ + constructor(pool: ConnectionPool, connection: Connection) { + super(pool); + this.connectionId = connection.id; + } +} + +/** + * An event published when a connection is closed + * @public + * @category Event + */ +export class ConnectionClosedEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + /** The reason the connection was closed */ + reason: string; + serviceId?: ObjectId; + + /** @internal */ + constructor(pool: ConnectionPool, connection: Connection, reason: string) { + super(pool); + this.connectionId = connection.id; + this.reason = reason || 'unknown'; + this.serviceId = connection.serviceId; + } +} + +/** + * An event published when a request to check a connection out begins + * @public + * @category Event + */ +export class ConnectionCheckOutStartedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + constructor(pool: ConnectionPool) { + super(pool); + } +} + +/** + * An event published when a request to check a connection out fails + * @public + * 
@category Event + */ +export class ConnectionCheckOutFailedEvent extends ConnectionPoolMonitoringEvent { + /** The reason the attempt to check out failed */ + reason: AnyError | string; + + /** @internal */ + constructor(pool: ConnectionPool, reason: AnyError | string) { + super(pool); + this.reason = reason; + } +} + +/** + * An event published when a connection is checked out of the connection pool + * @public + * @category Event + */ +export class ConnectionCheckedOutEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + + /** @internal */ + constructor(pool: ConnectionPool, connection: Connection) { + super(pool); + this.connectionId = connection.id; + } +} + +/** + * An event published when a connection is checked into the connection pool + * @public + * @category Event + */ +export class ConnectionCheckedInEvent extends ConnectionPoolMonitoringEvent { + /** The id of the connection */ + connectionId: number | ''; + + /** @internal */ + constructor(pool: ConnectionPool, connection: Connection) { + super(pool); + this.connectionId = connection.id; + } +} + +/** + * An event published when a connection pool is cleared + * @public + * @category Event + */ +export class ConnectionPoolClearedEvent extends ConnectionPoolMonitoringEvent { + /** @internal */ + serviceId?: ObjectId; + + /** @internal */ + constructor(pool: ConnectionPool, serviceId?: ObjectId) { + super(pool); + this.serviceId = serviceId; + } +} diff --git a/node_modules/mongodb/src/cmap/errors.ts b/node_modules/mongodb/src/cmap/errors.ts new file mode 100644 index 000000000..7d0319829 --- /dev/null +++ b/node_modules/mongodb/src/cmap/errors.ts @@ -0,0 +1,38 @@ +import { MongoDriverError } from '../error'; +import type { ConnectionPool } from './connection_pool'; + +/** + * An error indicating a connection pool is closed + * @category Error + */ +export class PoolClosedError extends MongoDriverError { + /** The address of the connection pool */ + address: string; + + constructor(pool: ConnectionPool) { + super('Attempted to check out a connection from closed connection pool'); + this.address = pool.address; + } + + get name(): string { + return 'MongoPoolClosedError'; + } +} + +/** + * An error thrown when a request to check out a connection times out + * @category Error + */ +export class WaitQueueTimeoutError extends MongoDriverError { + /** The address of the connection pool */ + address: string; + + constructor(message: string, address: string) { + super(message); + this.address = address; + } + + get name(): string { + return 'MongoWaitQueueTimeoutError'; + } +} diff --git a/node_modules/mongodb/src/cmap/message_stream.ts b/node_modules/mongodb/src/cmap/message_stream.ts new file mode 100644 index 000000000..8ab3290e3 --- /dev/null +++ b/node_modules/mongodb/src/cmap/message_stream.ts @@ -0,0 +1,208 @@ +import { Duplex, DuplexOptions } from 'stream'; +import { Response, Msg, BinMsg, Query, WriteProtocolMessageType, MessageHeader } from './commands'; +import { MongoDecompressionError, MongoParseError } from '../error'; +import { OP_COMPRESSED, OP_MSG } from './wire_protocol/constants'; +import { + compress, + decompress, + uncompressibleCommands, + Compressor, + CompressorName +} from './wire_protocol/compression'; +import type { Document, BSONSerializeOptions } from '../bson'; +import { BufferPool, Callback } from '../utils'; +import type { ClientSession } from '../sessions'; + +const MESSAGE_HEADER_SIZE = 16; +const COMPRESSION_DETAILS_SIZE = 9; // originalOpcode + 
uncompressedSize, compressorID + +const kDefaultMaxBsonMessageSize = 1024 * 1024 * 16 * 4; +/** @internal */ +const kBuffer = Symbol('buffer'); + +/** @internal */ +export interface MessageStreamOptions extends DuplexOptions { + maxBsonMessageSize?: number; +} + +/** @internal */ +export interface OperationDescription extends BSONSerializeOptions { + started: number; + cb: Callback; + command: boolean; + documentsReturnedIn?: string; + fullResult: boolean; + noResponse: boolean; + raw: boolean; + requestId: number; + session?: ClientSession; + socketTimeoutOverride?: boolean; + agreedCompressor?: CompressorName; + zlibCompressionLevel?: number; + $clusterTime?: Document; +} + +/** + * A duplex stream that is capable of reading and writing raw wire protocol messages, with + * support for optional compression + * @internal + */ +export class MessageStream extends Duplex { + /** @internal */ + maxBsonMessageSize: number; + /** @internal */ + [kBuffer]: BufferPool; + + constructor(options: MessageStreamOptions = {}) { + super(options); + this.maxBsonMessageSize = options.maxBsonMessageSize || kDefaultMaxBsonMessageSize; + this[kBuffer] = new BufferPool(); + } + + _write(chunk: Buffer, _: unknown, callback: Callback): void { + this[kBuffer].append(chunk); + processIncomingData(this, callback); + } + + _read(/* size */): void { + // NOTE: This implementation is empty because we explicitly push data to be read + // when `writeMessage` is called. + return; + } + + writeCommand( + command: WriteProtocolMessageType, + operationDescription: OperationDescription + ): void { + // TODO: agreed compressor should live in `StreamDescription` + const compressorName: CompressorName = + operationDescription && operationDescription.agreedCompressor + ? operationDescription.agreedCompressor + : 'none'; + if (compressorName === 'none' || !canCompress(command)) { + const data = command.toBin(); + this.push(Array.isArray(data) ? 
Buffer.concat(data) : data); + return; + } + // otherwise, compress the message + const concatenatedOriginalCommandBuffer = Buffer.concat(command.toBin()); + const messageToBeCompressed = concatenatedOriginalCommandBuffer.slice(MESSAGE_HEADER_SIZE); + + // Extract information needed for OP_COMPRESSED from the uncompressed message + const originalCommandOpCode = concatenatedOriginalCommandBuffer.readInt32LE(12); + + // Compress the message body + compress({ options: operationDescription }, messageToBeCompressed, (err, compressedMessage) => { + if (err || !compressedMessage) { + operationDescription.cb(err); + return; + } + + // Create the msgHeader of OP_COMPRESSED + const msgHeader = Buffer.alloc(MESSAGE_HEADER_SIZE); + msgHeader.writeInt32LE( + MESSAGE_HEADER_SIZE + COMPRESSION_DETAILS_SIZE + compressedMessage.length, + 0 + ); // messageLength + msgHeader.writeInt32LE(command.requestId, 4); // requestID + msgHeader.writeInt32LE(0, 8); // responseTo (zero) + msgHeader.writeInt32LE(OP_COMPRESSED, 12); // opCode + + // Create the compression details of OP_COMPRESSED + const compressionDetails = Buffer.alloc(COMPRESSION_DETAILS_SIZE); + compressionDetails.writeInt32LE(originalCommandOpCode, 0); // originalOpcode + compressionDetails.writeInt32LE(messageToBeCompressed.length, 4); // Size of the uncompressed compressedMessage, excluding the MsgHeader + compressionDetails.writeUInt8(Compressor[compressorName], 8); // compressorID + this.push(Buffer.concat([msgHeader, compressionDetails, compressedMessage])); + }); + } +} + +// Return whether a command contains an uncompressible command term +// Will return true if command contains no uncompressible command terms +function canCompress(command: WriteProtocolMessageType) { + const commandDoc = command instanceof Msg ? command.command : (command as Query).query; + const commandName = Object.keys(commandDoc)[0]; + return !uncompressibleCommands.has(commandName); +} + +function processIncomingData(stream: MessageStream, callback: Callback) { + const buffer = stream[kBuffer]; + if (buffer.length < 4) { + callback(); + return; + } + + const sizeOfMessage = buffer.peek(4).readInt32LE(); + if (sizeOfMessage < 0) { + callback(new MongoParseError(`Invalid message size: ${sizeOfMessage}`)); + return; + } + + if (sizeOfMessage > stream.maxBsonMessageSize) { + callback( + new MongoParseError( + `Invalid message size: ${sizeOfMessage}, max allowed: ${stream.maxBsonMessageSize}` + ) + ); + return; + } + + if (sizeOfMessage > buffer.length) { + callback(); + return; + } + + const message = buffer.read(sizeOfMessage); + const messageHeader: MessageHeader = { + length: message.readInt32LE(0), + requestId: message.readInt32LE(4), + responseTo: message.readInt32LE(8), + opCode: message.readInt32LE(12) + }; + + let ResponseType = messageHeader.opCode === OP_MSG ? 
BinMsg : Response; + if (messageHeader.opCode !== OP_COMPRESSED) { + const messageBody = message.slice(MESSAGE_HEADER_SIZE); + stream.emit('message', new ResponseType(message, messageHeader, messageBody)); + + if (buffer.length >= 4) { + processIncomingData(stream, callback); + } else { + callback(); + } + + return; + } + + messageHeader.fromCompressed = true; + messageHeader.opCode = message.readInt32LE(MESSAGE_HEADER_SIZE); + messageHeader.length = message.readInt32LE(MESSAGE_HEADER_SIZE + 4); + const compressorID: Compressor = message[MESSAGE_HEADER_SIZE + 8] as Compressor; + const compressedBuffer = message.slice(MESSAGE_HEADER_SIZE + 9); + + // recalculate based on wrapped opcode + ResponseType = messageHeader.opCode === OP_MSG ? BinMsg : Response; + decompress(compressorID, compressedBuffer, (err, messageBody) => { + if (err || !messageBody) { + callback(err); + return; + } + + if (messageBody.length !== messageHeader.length) { + callback( + new MongoDecompressionError('Message body and message header must be the same length') + ); + + return; + } + + stream.emit('message', new ResponseType(message, messageHeader, messageBody)); + + if (buffer.length >= 4) { + processIncomingData(stream, callback); + } else { + callback(); + } + }); +} diff --git a/node_modules/mongodb/src/cmap/metrics.ts b/node_modules/mongodb/src/cmap/metrics.ts new file mode 100644 index 000000000..b825b5393 --- /dev/null +++ b/node_modules/mongodb/src/cmap/metrics.ts @@ -0,0 +1,58 @@ +/** @internal */ +export class ConnectionPoolMetrics { + static readonly TXN = 'txn' as const; + static readonly CURSOR = 'cursor' as const; + static readonly OTHER = 'other' as const; + + txnConnections = 0; + cursorConnections = 0; + otherConnections = 0; + + /** + * Mark a connection as pinned for a specific operation. + */ + markPinned(pinType: string): void { + if (pinType === ConnectionPoolMetrics.TXN) { + this.txnConnections += 1; + } else if (pinType === ConnectionPoolMetrics.CURSOR) { + this.cursorConnections += 1; + } else { + this.otherConnections += 1; + } + } + + /** + * Unmark a connection as pinned for an operation. + */ + markUnpinned(pinType: string): void { + if (pinType === ConnectionPoolMetrics.TXN) { + this.txnConnections -= 1; + } else if (pinType === ConnectionPoolMetrics.CURSOR) { + this.cursorConnections -= 1; + } else { + this.otherConnections -= 1; + } + } + + /** + * Return information about the cmap metrics as a string. + */ + info(maxPoolSize: number): string { + return ( + 'Timed out while checking out a connection from connection pool: ' + + `maxPoolSize: ${maxPoolSize}, ` + + `connections in use by cursors: ${this.cursorConnections}, ` + + `connections in use by transactions: ${this.txnConnections}, ` + + `connections in use by other operations: ${this.otherConnections}` + ); + } + + /** + * Reset the metrics to the initial values. 
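When a checkout cannot be served before `waitQueueTimeoutMS` elapses, the `WaitQueueTimeoutError` defined earlier is handed to the waiting operation; in load-balanced mode its message is built from `ConnectionPoolMetrics.info()`. A hedged sketch of handling that failure in application code; the exact propagation path to the caller is assumed, not shown in this excerpt:

```ts
import { MongoClient, MongoError } from 'mongodb';

const client = new MongoClient('mongodb://localhost:27017', {
  maxPoolSize: 5,
  waitQueueTimeoutMS: 1000 // fail checkOut quickly when the pool is saturated
});

async function readWithTimeoutHandling() {
  try {
    return await client.db('test').collection('items').findOne({});
  } catch (err) {
    // WaitQueueTimeoutError reports the name 'MongoWaitQueueTimeoutError'
    // (see errors.ts above); treat it as back-pressure rather than a fatal error.
    if (err instanceof MongoError && err.name === 'MongoWaitQueueTimeoutError') {
      console.warn('connection pool saturated:', err.message);
      return null;
    }
    throw err;
  }
}
```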
+ */ + reset(): void { + this.txnConnections = 0; + this.cursorConnections = 0; + this.otherConnections = 0; + } +} diff --git a/node_modules/mongodb/src/cmap/stream_description.ts b/node_modules/mongodb/src/cmap/stream_description.ts new file mode 100644 index 000000000..fd0ce6102 --- /dev/null +++ b/node_modules/mongodb/src/cmap/stream_description.ts @@ -0,0 +1,73 @@ +import { parseServerType } from '../sdam/server_description'; +import type { Document } from '../bson'; +import type { CompressorName } from './wire_protocol/compression'; +import { ServerType } from '../sdam/common'; + +const RESPONSE_FIELDS = [ + 'minWireVersion', + 'maxWireVersion', + 'maxBsonObjectSize', + 'maxMessageSizeBytes', + 'maxWriteBatchSize', + 'logicalSessionTimeoutMinutes' +] as const; + +/** @public */ +export interface StreamDescriptionOptions { + compressors?: CompressorName[]; + logicalSessionTimeoutMinutes?: number; + loadBalanced: boolean; +} + +/** @public */ +export class StreamDescription { + address: string; + type: string; + minWireVersion?: number; + maxWireVersion?: number; + maxBsonObjectSize: number; + maxMessageSizeBytes: number; + maxWriteBatchSize: number; + compressors: CompressorName[]; + compressor?: CompressorName; + logicalSessionTimeoutMinutes?: number; + loadBalanced: boolean; + + __nodejs_mock_server__?: boolean; + + zlibCompressionLevel?: number; + + constructor(address: string, options?: StreamDescriptionOptions) { + this.address = address; + this.type = ServerType.Unknown; + this.minWireVersion = undefined; + this.maxWireVersion = undefined; + this.maxBsonObjectSize = 16777216; + this.maxMessageSizeBytes = 48000000; + this.maxWriteBatchSize = 100000; + this.logicalSessionTimeoutMinutes = options?.logicalSessionTimeoutMinutes; + this.loadBalanced = !!options?.loadBalanced; + this.compressors = + options && options.compressors && Array.isArray(options.compressors) + ? 
options.compressors + : []; + } + + receiveResponse(response: Document): void { + this.type = parseServerType(response); + RESPONSE_FIELDS.forEach(field => { + if (typeof response[field] != null) { + this[field] = response[field]; + } + + // testing case + if ('__nodejs_mock_server__' in response) { + this.__nodejs_mock_server__ = response['__nodejs_mock_server__']; + } + }); + + if (response.compression) { + this.compressor = this.compressors.filter(c => response.compression?.includes(c))[0]; + } + } +} diff --git a/node_modules/mongodb/src/cmap/wire_protocol/compression.ts b/node_modules/mongodb/src/cmap/wire_protocol/compression.ts new file mode 100644 index 000000000..957fb45a2 --- /dev/null +++ b/node_modules/mongodb/src/cmap/wire_protocol/compression.ts @@ -0,0 +1,103 @@ +import * as zlib from 'zlib'; +import type { Callback } from '../../utils'; +import type { OperationDescription } from '../message_stream'; + +import { PKG_VERSION, Snappy } from '../../deps'; +import { MongoDecompressionError, MongoInvalidArgumentError } from '../../error'; + +/** @public */ +export const Compressor = Object.freeze({ + none: 0, + snappy: 1, + zlib: 2 +} as const); + +/** @public */ +export type Compressor = typeof Compressor[CompressorName]; + +/** @public */ +export type CompressorName = keyof typeof Compressor; + +export const uncompressibleCommands = new Set([ + 'ismaster', + 'saslStart', + 'saslContinue', + 'getnonce', + 'authenticate', + 'createUser', + 'updateUser', + 'copydbSaslStart', + 'copydbgetnonce', + 'copydb' +]); + +// Facilitate compressing a message using an agreed compressor +export function compress( + self: { options: OperationDescription & zlib.ZlibOptions }, + dataToBeCompressed: Buffer, + callback: Callback +): void { + const zlibOptions = {} as zlib.ZlibOptions; + switch (self.options.agreedCompressor) { + case 'snappy': { + if ('kModuleError' in Snappy) { + return callback(Snappy['kModuleError']); + } + + if (Snappy[PKG_VERSION].major <= 6) { + Snappy.compress(dataToBeCompressed, callback); + } else { + Snappy.compress(dataToBeCompressed) + .then(buffer => callback(undefined, buffer)) + .catch(error => callback(error)); + } + break; + } + case 'zlib': + // Determine zlibCompressionLevel + if (self.options.zlibCompressionLevel) { + zlibOptions.level = self.options.zlibCompressionLevel; + } + zlib.deflate(dataToBeCompressed, zlibOptions, callback as zlib.CompressCallback); + break; + default: + throw new MongoInvalidArgumentError( + `Unknown compressor ${self.options.agreedCompressor} failed to compress` + ); + } +} + +// Decompress a message using the given compressor +export function decompress( + compressorID: Compressor, + compressedData: Buffer, + callback: Callback +): void { + if (compressorID < 0 || compressorID > Math.max(2)) { + throw new MongoDecompressionError( + `Server sent message compressed using an unsupported compressor. 
(Received compressor ID ${compressorID})` + ); + } + + switch (compressorID) { + case Compressor.snappy: { + if ('kModuleError' in Snappy) { + return callback(Snappy['kModuleError']); + } + + if (Snappy[PKG_VERSION].major <= 6) { + Snappy.uncompress(compressedData, { asBuffer: true }, callback); + } else { + Snappy.uncompress(compressedData, { asBuffer: true }) + .then(buffer => callback(undefined, buffer)) + .catch(error => callback(error)); + } + break; + } + case Compressor.zlib: + zlib.inflate(compressedData, callback as zlib.CompressCallback); + break; + default: + callback(undefined, compressedData); + } +} diff --git a/node_modules/mongodb/src/cmap/wire_protocol/constants.ts b/node_modules/mongodb/src/cmap/wire_protocol/constants.ts new file mode 100644 index 000000000..f37be8266 --- /dev/null +++ b/node_modules/mongodb/src/cmap/wire_protocol/constants.ts @@ -0,0 +1,13 @@ +export const MIN_SUPPORTED_SERVER_VERSION = '2.6'; +export const MAX_SUPPORTED_SERVER_VERSION = '5.0'; +export const MIN_SUPPORTED_WIRE_VERSION = 2; +export const MAX_SUPPORTED_WIRE_VERSION = 13; +export const OP_REPLY = 1; +export const OP_UPDATE = 2001; +export const OP_INSERT = 2002; +export const OP_QUERY = 2004; +export const OP_GETMORE = 2005; +export const OP_DELETE = 2006; +export const OP_KILL_CURSORS = 2007; +export const OP_COMPRESSED = 2012; +export const OP_MSG = 2013; diff --git a/node_modules/mongodb/src/cmap/wire_protocol/shared.ts b/node_modules/mongodb/src/cmap/wire_protocol/shared.ts new file mode 100644 index 000000000..0fd624df8 --- /dev/null +++ b/node_modules/mongodb/src/cmap/wire_protocol/shared.ts @@ -0,0 +1,71 @@ +import { ServerType } from '../../sdam/common'; +import { TopologyDescription } from '../../sdam/topology_description'; +import { MongoInvalidArgumentError } from '../../error'; +import { ReadPreference } from '../../read_preference'; +import type { Document } from '../../bson'; +import type { OpQueryOptions } from '../commands'; +import type { Topology } from '../../sdam/topology'; +import type { Server } from '../../sdam/server'; +import type { ServerDescription } from '../../sdam/server_description'; +import type { ReadPreferenceLike } from '../../read_preference'; +import type { CommandOptions } from '../connection'; +import type { Connection } from '../connection'; + +export interface ReadPreferenceOption { + readPreference?: ReadPreferenceLike; +} + +export function getReadPreference(cmd: Document, options?: ReadPreferenceOption): ReadPreference { + // Default to command version of the readPreference + let readPreference = cmd.readPreference || ReadPreference.primary; + // If we have an option readPreference override the command one + if (options?.readPreference) { + readPreference = options.readPreference; + } + + if (typeof readPreference === 'string') { + readPreference = ReadPreference.fromString(readPreference); + } + + if (!(readPreference instanceof ReadPreference)) { + throw new MongoInvalidArgumentError( + 'Option "readPreference" must be a ReadPreference instance' + ); + } + + return readPreference; +} + +export function applyCommonQueryOptions( + queryOptions: OpQueryOptions, + options: CommandOptions +): CommandOptions { + Object.assign(queryOptions, { + raw: typeof options.raw === 'boolean' ? options.raw : false, + promoteLongs: typeof options.promoteLongs === 'boolean' ? options.promoteLongs : true, + promoteValues: typeof options.promoteValues === 'boolean' ? options.promoteValues : true, + promoteBuffers: typeof options.promoteBuffers === 'boolean' ? 
options.promoteBuffers : false, + bsonRegExp: typeof options.bsonRegExp === 'boolean' ? options.bsonRegExp : false + }); + + if (options.session) { + queryOptions.session = options.session; + } + + return queryOptions; +} + +export function isSharded(topologyOrServer: Topology | Server | Connection): boolean { + if (topologyOrServer.description && topologyOrServer.description.type === ServerType.Mongos) { + return true; + } + + // NOTE: This is incredibly inefficient, and should be removed once command construction + // happens based on `Server` not `Topology`. + if (topologyOrServer.description && topologyOrServer.description instanceof TopologyDescription) { + const servers: ServerDescription[] = Array.from(topologyOrServer.description.servers.values()); + return servers.some((server: ServerDescription) => server.type === ServerType.Mongos); + } + + return false; +} diff --git a/node_modules/mongodb/src/collection.ts b/node_modules/mongodb/src/collection.ts new file mode 100644 index 000000000..fd0850268 --- /dev/null +++ b/node_modules/mongodb/src/collection.ts @@ -0,0 +1,1617 @@ +import { DEFAULT_PK_FACTORY, emitWarningOnce, resolveOptions } from './utils'; +import { ReadPreference, ReadPreferenceLike } from './read_preference'; +import { + normalizeHintField, + checkCollectionName, + MongoDBNamespace, + Callback, + getTopology +} from './utils'; +import { Document, BSONSerializeOptions, resolveBSONOptions } from './bson'; +import { MongoInvalidArgumentError } from './error'; +import { UnorderedBulkOperation } from './bulk/unordered'; +import { OrderedBulkOperation } from './bulk/ordered'; +import { ChangeStream, ChangeStreamOptions } from './change_stream'; +import { WriteConcern, WriteConcernOptions } from './write_concern'; +import { ReadConcern, ReadConcernLike } from './read_concern'; +import { AggregationCursor } from './cursor/aggregation_cursor'; +import type { AggregateOptions } from './operations/aggregate'; +import { BulkWriteOperation } from './operations/bulk_write'; +import { CountDocumentsOperation, CountDocumentsOptions } from './operations/count_documents'; +import { + CreateIndexesOperation, + CreateIndexOperation, + DropIndexOperation, + DropIndexesOperation, + IndexesOperation, + IndexExistsOperation, + IndexInformationOperation, + CreateIndexesOptions, + DropIndexesOptions, + ListIndexesOptions, + IndexSpecification, + IndexDescription, + ListIndexesCursor +} from './operations/indexes'; +import { DistinctOperation, DistinctOptions } from './operations/distinct'; +import { DropCollectionOperation, DropCollectionOptions } from './operations/drop'; +import { + EstimatedDocumentCountOperation, + EstimatedDocumentCountOptions +} from './operations/estimated_document_count'; +import type { FindOptions } from './operations/find'; +import { + FindOneAndDeleteOperation, + FindOneAndReplaceOperation, + FindOneAndUpdateOperation, + FindOneAndDeleteOptions, + FindOneAndReplaceOptions, + FindOneAndUpdateOptions +} from './operations/find_and_modify'; +import { + InsertOneOperation, + InsertOneOptions, + InsertOneResult, + InsertManyOperation, + InsertManyResult +} from './operations/insert'; +import { + UpdateOneOperation, + UpdateManyOperation, + UpdateOptions, + UpdateResult, + ReplaceOneOperation, + ReplaceOptions +} from './operations/update'; +import { + DeleteOneOperation, + DeleteManyOperation, + DeleteOptions, + DeleteResult +} from './operations/delete'; +import { IsCappedOperation } from './operations/is_capped'; +import { + MapReduceOperation, + MapFunction, + 
ReduceFunction, + MapReduceOptions +} from './operations/map_reduce'; +import { OptionsOperation } from './operations/options_operation'; +import { RenameOperation, RenameOptions } from './operations/rename'; +import { CollStats, CollStatsOperation, CollStatsOptions } from './operations/stats'; +import { executeOperation } from './operations/execute_operation'; +import type { Db } from './db'; +import type { OperationOptions, Hint } from './operations/operation'; +import type { IndexInformationOptions } from './operations/common_functions'; +import type { BulkWriteResult, BulkWriteOptions, AnyBulkWriteOperation } from './bulk/common'; +import type { PkFactory } from './mongo_client'; +import type { Logger, LoggerOptions } from './logger'; +import { FindCursor } from './cursor/find_cursor'; +import type { CountOptions } from './operations/count'; +import type { + Filter, + TODO_NODE_3286, + UpdateFilter, + WithId, + OptionalId, + Flatten +} from './mongo_types'; + +/** @public */ +export interface ModifyResult { + value: TSchema | null; + lastErrorObject?: Document; + ok: 0 | 1; +} + +/** @public */ +export interface CollectionOptions + extends BSONSerializeOptions, + WriteConcernOptions, + LoggerOptions { + slaveOk?: boolean; + /** Specify a read concern for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST). */ + readPreference?: ReadPreferenceLike; +} + +/** @internal */ +export interface CollectionPrivate { + pkFactory: PkFactory; + db: Db; + options: any; + namespace: MongoDBNamespace; + readPreference?: ReadPreference; + bsonOptions: BSONSerializeOptions; + slaveOk?: boolean; + collectionHint?: Hint; + readConcern?: ReadConcern; + writeConcern?: WriteConcern; +} + +/** + * The **Collection** class is an internal class that embodies a MongoDB collection + * allowing for insert/update/remove/find and other command operation on that MongoDB collection. + * + * **COLLECTION Cannot directly be instantiated** + * @public + * + * @example + * ```js + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Create a collection we want to drop later + * const col = client.db(dbName).collection('createIndexExample1'); + * // Show that duplicate records got dropped + * col.find({}).toArray(function(err, items) { + * expect(err).to.not.exist; + * test.equal(4, items.length); + * client.close(); + * }); + * }); + * ``` + */ +export class Collection { + /** @internal */ + s: CollectionPrivate; + + /** + * Create a new Collection instance + * @internal + */ + constructor(db: Db, name: string, options?: CollectionOptions) { + checkCollectionName(name); + + // Internal state + this.s = { + db, + options, + namespace: new MongoDBNamespace(db.databaseName, name), + pkFactory: db.options?.pkFactory ?? DEFAULT_PK_FACTORY, + readPreference: ReadPreference.fromOptions(options), + bsonOptions: resolveBSONOptions(options, db), + readConcern: ReadConcern.fromOptions(options), + writeConcern: WriteConcern.fromOptions(options), + slaveOk: options == null || options.slaveOk == null ? 
db.slaveOk : options.slaveOk + }; + } + + /** + * The name of the database this collection belongs to + */ + get dbName(): string { + return this.s.namespace.db; + } + + /** + * The name of this collection + */ + get collectionName(): string { + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + return this.s.namespace.collection!; + } + + /** + * The namespace of this collection, in the format `${this.dbName}.${this.collectionName}` + */ + get namespace(): string { + return this.s.namespace.toString(); + } + + /** + * The current readConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get readConcern(): ReadConcern | undefined { + if (this.s.readConcern == null) { + return this.s.db.readConcern; + } + return this.s.readConcern; + } + + /** + * The current readPreference of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get readPreference(): ReadPreference | undefined { + if (this.s.readPreference == null) { + return this.s.db.readPreference; + } + + return this.s.readPreference; + } + + get bsonOptions(): BSONSerializeOptions { + return this.s.bsonOptions; + } + + /** + * The current writeConcern of the collection. If not explicitly defined for + * this collection, will be inherited from the parent DB + */ + get writeConcern(): WriteConcern | undefined { + if (this.s.writeConcern == null) { + return this.s.db.writeConcern; + } + return this.s.writeConcern; + } + + /** The current index hint for the collection */ + get hint(): Hint | undefined { + return this.s.collectionHint; + } + + set hint(v: Hint | undefined) { + this.s.collectionHint = normalizeHintField(v); + } + + /** + * Inserts a single document into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @param doc - The document to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insertOne(doc: OptionalId): Promise>; + insertOne(doc: OptionalId, callback: Callback>): void; + insertOne(doc: OptionalId, options: InsertOneOptions): Promise>; + insertOne( + doc: OptionalId, + options: InsertOneOptions, + callback: Callback> + ): void; + insertOne( + doc: OptionalId, + options?: InsertOneOptions | Callback>, + callback?: Callback> + ): Promise> | void { + if (typeof options === 'function') { + callback = options; + options = {}; + } + + // CSFLE passes in { w: 'majority' } to ensure the lib works in both 3.x and 4.x + // we support that option style here only + if (options && Reflect.get(options, 'w')) { + options.writeConcern = WriteConcern.fromOptions(Reflect.get(options, 'w')); + } + + return executeOperation( + getTopology(this), + new InsertOneOperation( + this as TODO_NODE_3286, + doc, + resolveOptions(this, options) + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Inserts an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. 
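A minimal usage sketch of the insert helpers documented here, assuming a connected client and a hypothetical `users` collection:

```ts
import type { MongoClient } from 'mongodb';

interface User {
  name: string;
  age: number;
}

async function insertExamples(client: MongoClient): Promise<void> {
  const users = client.db('app').collection<User>('users'); // hypothetical names

  // insertOne: the driver adds an _id when the document has none (see note above).
  const one = await users.insertOne({ name: 'Ada', age: 36 });
  console.log('inserted id:', one.insertedId);

  // insertMany: ordered by default; { ordered: false } continues past individual errors.
  const many = await users.insertMany(
    [
      { name: 'Grace', age: 45 },
      { name: 'Alan', age: 41 }
    ],
    { ordered: false }
  );
  console.log('inserted count:', many.insertedCount);
}
```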
+ * + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insertMany(docs: OptionalId[]): Promise>; + insertMany(docs: OptionalId[], callback: Callback>): void; + insertMany( + docs: OptionalId[], + options: BulkWriteOptions + ): Promise>; + insertMany( + docs: OptionalId[], + options: BulkWriteOptions, + callback: Callback> + ): void; + insertMany( + docs: OptionalId[], + options?: BulkWriteOptions | Callback>, + callback?: Callback> + ): Promise> | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ? Object.assign({}, options) : { ordered: true }; + + return executeOperation( + getTopology(this), + new InsertManyOperation( + this as TODO_NODE_3286, + docs, + resolveOptions(this, options) + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Perform a bulkWrite operation without a fluent API + * + * Legal operation types are + * + * ```js + * { insertOne: { document: { a: 1 } } } + * + * { updateOne: { filter: {a:2}, update: {$set: {a:2}}, upsert:true } } + * + * { updateMany: { filter: {a:2}, update: {$set: {a:2}}, upsert:true } } + * + * { updateMany: { filter: {}, update: {$set: {"a.$[i].x": 5}}, arrayFilters: [{ "i.x": 5 }]} } + * + * { deleteOne: { filter: {c:1} } } + * + * { deleteMany: { filter: {c:1} } } + * + * { replaceOne: { filter: {c:3}, replacement: {c:4}, upsert:true} } + *``` + * Please note that raw operations are no longer accepted as of driver version 4.0. + * + * If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. 
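A short sketch of invoking `bulkWrite` with the operation shapes listed above, assuming an existing collection handle:

```ts
import type { Collection, Document } from 'mongodb';

async function bulkExample(coll: Collection<Document>): Promise<void> {
  const result = await coll.bulkWrite(
    [
      { insertOne: { document: { a: 1 } } },
      { updateOne: { filter: { a: 2 }, update: { $set: { a: 2 } }, upsert: true } },
      { deleteMany: { filter: { c: 1 } } },
      { replaceOne: { filter: { c: 3 }, replacement: { c: 4 }, upsert: true } }
    ],
    { ordered: true } // stop at the first error; matches the default shown above
  );

  console.log(
    result.insertedCount,
    result.modifiedCount,
    result.deletedCount,
    result.upsertedCount
  );
}
```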
+ * + * @param operations - Bulk operations to perform + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * @throws MongoDriverError if operations is not an array + */ + bulkWrite(operations: AnyBulkWriteOperation[]): Promise; + bulkWrite( + operations: AnyBulkWriteOperation[], + callback: Callback + ): void; + bulkWrite( + operations: AnyBulkWriteOperation[], + options: BulkWriteOptions + ): Promise; + bulkWrite( + operations: AnyBulkWriteOperation[], + options: BulkWriteOptions, + callback: Callback + ): void; + bulkWrite( + operations: AnyBulkWriteOperation[], + options?: BulkWriteOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options || { ordered: true }; + + if (!Array.isArray(operations)) { + throw new MongoInvalidArgumentError('Argument "operations" must be an array of documents'); + } + + return executeOperation( + getTopology(this), + new BulkWriteOperation( + this as TODO_NODE_3286, + operations as TODO_NODE_3286, + resolveOptions(this, options) + ), + callback + ); + } + + /** + * Update a single document in a collection + * + * @param filter - The filter used to select the document to update + * @param update - The update operations to be applied to the document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + updateOne( + filter: Filter, + update: UpdateFilter | Partial + ): Promise; + updateOne( + filter: Filter, + update: UpdateFilter | Partial, + callback: Callback + ): void; + updateOne( + filter: Filter, + update: UpdateFilter | Partial, + options: UpdateOptions + ): Promise; + updateOne( + filter: Filter, + update: UpdateFilter | Partial, + options: UpdateOptions, + callback: Callback + ): void; + updateOne( + filter: Filter, + update: UpdateFilter | Partial, + options?: UpdateOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new UpdateOneOperation( + this as TODO_NODE_3286, + filter, + update, + resolveOptions(this, options) + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Replace a document in a collection with another document + * + * @param filter - The filter used to select the document to replace + * @param replacement - The Document that replaces the matching document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + replaceOne(filter: Filter, replacement: TSchema): Promise; + replaceOne( + filter: Filter, + replacement: TSchema, + callback: Callback + ): void; + replaceOne( + filter: Filter, + replacement: TSchema, + options: ReplaceOptions + ): Promise; + replaceOne( + filter: Filter, + replacement: TSchema, + options: ReplaceOptions, + callback: Callback + ): void; + replaceOne( + filter: Filter, + replacement: TSchema, + options?: ReplaceOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new ReplaceOneOperation( + this as TODO_NODE_3286, + filter, + replacement, + resolveOptions(this, options) + ), + callback + ); + } + + /** + * Update multiple documents in a collection + * + * 
@param filter - The filter used to select the documents to update + * @param update - The update operations to be applied to the documents + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + updateMany( + filter: Filter, + update: UpdateFilter + ): Promise; + updateMany( + filter: Filter, + update: UpdateFilter, + callback: Callback + ): void; + updateMany( + filter: Filter, + update: UpdateFilter, + options: UpdateOptions + ): Promise; + updateMany( + filter: Filter, + update: UpdateFilter, + options: UpdateOptions, + callback: Callback + ): void; + updateMany( + filter: Filter, + update: UpdateFilter, + options?: UpdateOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new UpdateManyOperation( + this as TODO_NODE_3286, + filter, + update, + resolveOptions(this, options) + ), + callback + ); + } + + /** + * Delete a document from a collection + * + * @param filter - The filter used to select the document to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + deleteOne(filter: Filter): Promise; + deleteOne(filter: Filter, callback: Callback): void; + deleteOne(filter: Filter, options: DeleteOptions): Promise; + deleteOne( + filter: Filter, + options: DeleteOptions, + callback?: Callback + ): void; + deleteOne( + filter: Filter, + options?: DeleteOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new DeleteOneOperation(this as TODO_NODE_3286, filter, resolveOptions(this, options)), + callback + ); + } + + /** + * Delete multiple documents from a collection + * + * @param filter - The filter used to select the documents to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + deleteMany(filter: Filter): Promise; + deleteMany(filter: Filter, callback: Callback): void; + deleteMany(filter: Filter, options: DeleteOptions): Promise; + deleteMany( + filter: Filter, + options: DeleteOptions, + callback: Callback + ): void; + deleteMany( + filter: Filter, + options?: DeleteOptions | Callback, + callback?: Callback + ): Promise | void { + if (filter == null) { + filter = {}; + options = {}; + callback = undefined; + } else if (typeof filter === 'function') { + callback = filter as Callback; + filter = {}; + options = {}; + } else if (typeof options === 'function') { + callback = options; + options = {}; + } + + return executeOperation( + getTopology(this), + new DeleteManyOperation(this as TODO_NODE_3286, filter, resolveOptions(this, options)), + callback + ); + } + + /** + * Rename the collection. + * + * @remarks + * This operation does not inherit options from the Db or MongoClient. + * + * @param newName - New name of of the collection. 
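A sketch of the update, delete and rename helpers covered in this block, assuming a collection handle and hypothetical field and collection names:

```ts
import type { Collection, Document } from 'mongodb';

async function maintenance(coll: Collection<Document>): Promise<void> {
  // updateMany applies the update document to every matching document.
  const upd = await coll.updateMany({ status: 'pending' }, { $set: { status: 'queued' } });
  console.log('matched', upd.matchedCount, 'modified', upd.modifiedCount);

  // deleteOne removes at most one match; deleteMany removes them all.
  await coll.deleteOne({ status: 'orphaned' });
  const del = await coll.deleteMany({ status: 'done' });
  console.log('deleted', del.deletedCount);

  // rename runs against the primary and does not inherit Db/MongoClient options.
  const renamed = await coll.rename('tasks_archive'); // hypothetical new name
  console.log('renamed to', renamed.collectionName);
}
```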
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + rename(newName: string): Promise; + rename(newName: string, callback: Callback): void; + rename(newName: string, options: RenameOptions): Promise; + rename(newName: string, options: RenameOptions, callback: Callback): void; + rename( + newName: string, + options?: RenameOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + // Intentionally, we do not inherit options from parent for this operation. + return executeOperation( + getTopology(this), + new RenameOperation(this as TODO_NODE_3286, newName, { + ...options, + readPreference: ReadPreference.PRIMARY + }), + callback + ); + } + + /** + * Drop the collection from the database, removing it permanently. New accesses will create a new collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + drop(): Promise; + drop(callback: Callback): void; + drop(options: DropCollectionOptions): Promise; + drop(options: DropCollectionOptions, callback: Callback): void; + drop( + options?: DropCollectionOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + return executeOperation( + getTopology(this), + new DropCollectionOperation(this.s.db, this.collectionName, options), + callback + ); + } + + /** + * Fetches the first document that matches the filter + * + * @param filter - Query for find Operation + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOne(): Promise; + findOne(callback: Callback): void; + findOne(filter: Filter): Promise; + findOne(filter: Filter, callback: Callback): void; + findOne(filter: Filter, options: FindOptions): Promise; + findOne(filter: Filter, options: FindOptions, callback: Callback): void; + + // allow an override of the schema. + findOne(): Promise; + findOne(callback: Callback): void; + findOne(filter: Filter): Promise; + findOne(filter: Filter, options?: FindOptions): Promise; + findOne( + filter: Filter, + options?: FindOptions, + callback?: Callback + ): void; + + findOne( + filter?: Filter | Callback, + options?: FindOptions | Callback, + callback?: Callback + ): Promise | void { + if (callback != null && typeof callback !== 'function') { + throw new MongoInvalidArgumentError( + 'Third parameter to `findOne()` must be a callback or undefined' + ); + } + + if (typeof filter === 'function') { + callback = filter as Callback; + filter = {}; + options = {}; + } + if (typeof options === 'function') { + callback = options; + options = {}; + } + + const finalFilter = filter ?? {}; + const finalOptions = options ?? {}; + return this.find(finalFilter, finalOptions).limit(-1).batchSize(1).next(callback); + } + + /** + * Creates a cursor for a filter that can be used to iterate over results from MongoDB + * + * @param filter - The filter predicate. 
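A brief sketch of `findOne` and the `find` cursor documented here, assuming a collection handle and hypothetical field names:

```ts
import type { Collection, Document } from 'mongodb';

async function queries(coll: Collection<Document>): Promise<void> {
  // findOne resolves with the first matching document, or null when nothing matches.
  const doc = await coll.findOne({ status: 'active' });

  // find returns a FindCursor; projection, sort and limit are applied lazily.
  const recent = await coll
    .find({ status: 'active' }, { projection: { _id: 0, name: 1 } })
    .sort({ createdAt: -1 })
    .limit(10)
    .toArray();

  console.log(doc, recent.length);
}
```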
If unspecified, then all documents in the collection will match the predicate + */ + find(): FindCursor; + find(filter: Filter, options?: FindOptions): FindCursor; + find(filter: Filter, options?: FindOptions): FindCursor; + find(filter?: Filter, options?: FindOptions): FindCursor { + if (arguments.length > 2) { + throw new MongoInvalidArgumentError( + 'Method "collection.find()" accepts at most two arguments' + ); + } + if (typeof options === 'function') { + throw new MongoInvalidArgumentError('Argument "options" must not be function'); + } + + return new FindCursor( + getTopology(this), + this.s.namespace, + filter, + resolveOptions(this as TODO_NODE_3286, options) + ); + } + + /** + * Returns the options of the collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + options(): Promise; + options(callback: Callback): void; + options(options: OperationOptions): Promise; + options(options: OperationOptions, callback: Callback): void; + options( + options?: OperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new OptionsOperation(this as TODO_NODE_3286, resolveOptions(this, options)), + callback + ); + } + + /** + * Returns if the collection is a capped collection + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + isCapped(): Promise; + isCapped(callback: Callback): void; + isCapped(options: OperationOptions): Promise; + isCapped(options: OperationOptions, callback: Callback): void; + isCapped( + options?: OperationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new IsCappedOperation(this as TODO_NODE_3286, resolveOptions(this, options)), + callback + ); + } + + /** + * Creates an index on the db and collection collection. 
+ * + * @param indexSpec - The field name or index specification to create an index for + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @example + * ```js + * const collection = client.db('foo').collection('bar'); + * + * await collection.createIndex({ a: 1, b: -1 }); + * + * // Alternate syntax for { c: 1, d: -1 } that ensures order of indexes + * await collection.createIndex([ [c, 1], [d, -1] ]); + * + * // Equivalent to { e: 1 } + * await collection.createIndex('e'); + * + * // Equivalent to { f: 1, g: 1 } + * await collection.createIndex(['f', 'g']) + * + * // Equivalent to { h: 1, i: -1 } + * await collection.createIndex([ { h: 1 }, { i: -1 } ]); + * + * // Equivalent to { j: 1, k: -1, l: 2d } + * await collection.createIndex(['j', ['k', -1], { l: '2d' }]) + * ``` + */ + createIndex(indexSpec: IndexSpecification): Promise; + createIndex(indexSpec: IndexSpecification, callback: Callback): void; + createIndex(indexSpec: IndexSpecification, options: CreateIndexesOptions): Promise; + createIndex( + indexSpec: IndexSpecification, + options: CreateIndexesOptions, + callback: Callback + ): void; + createIndex( + indexSpec: IndexSpecification, + options?: CreateIndexesOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new CreateIndexOperation( + this as TODO_NODE_3286, + this.collectionName, + indexSpec, + resolveOptions(this, options) + ), + callback + ); + } + + /** + * Creates multiple indexes in the collection, this method is only supported for + * MongoDB 2.6 or higher. Earlier version of MongoDB will throw a command not supported + * error. + * + * **Note**: Unlike {@link Collection#createIndex| createIndex}, this function takes in raw index specifications. + * Index specifications are defined {@link http://docs.mongodb.org/manual/reference/command/createIndexes/| here}. + * + * @param indexSpecs - An array of index specifications to be created + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @example + * ```js + * const collection = client.db('foo').collection('bar'); + * await collection.createIndexes([ + * // Simple index on field fizz + * { + * key: { fizz: 1 }, + * } + * // wildcard index + * { + * key: { '$**': 1 } + * }, + * // named index on darmok and jalad + * { + * key: { darmok: 1, jalad: -1 } + * name: 'tanagra' + * } + * ]); + * ``` + */ + createIndexes(indexSpecs: IndexDescription[]): Promise; + createIndexes(indexSpecs: IndexDescription[], callback: Callback): void; + createIndexes(indexSpecs: IndexDescription[], options: CreateIndexesOptions): Promise; + createIndexes( + indexSpecs: IndexDescription[], + options: CreateIndexesOptions, + callback: Callback + ): void; + createIndexes( + indexSpecs: IndexDescription[], + options?: CreateIndexesOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ? 
Object.assign({}, options) : {}; + if (typeof options.maxTimeMS !== 'number') delete options.maxTimeMS; + + return executeOperation( + getTopology(this), + new CreateIndexesOperation( + this as TODO_NODE_3286, + this.collectionName, + indexSpecs, + resolveOptions(this, options) + ), + callback + ); + } + + /** + * Drops an index from this collection. + * + * @param indexName - Name of the index to drop. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropIndex(indexName: string): Promise; + dropIndex(indexName: string, callback: Callback): void; + dropIndex(indexName: string, options: DropIndexesOptions): Promise; + dropIndex(indexName: string, options: DropIndexesOptions, callback: Callback): void; + dropIndex( + indexName: string, + options?: DropIndexesOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = resolveOptions(this, options); + + // Run only against primary + options.readPreference = ReadPreference.primary; + + return executeOperation( + getTopology(this), + new DropIndexOperation(this as TODO_NODE_3286, indexName, options), + callback + ); + } + + /** + * Drops all indexes from this collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropIndexes(): Promise; + dropIndexes(callback: Callback): void; + dropIndexes(options: DropIndexesOptions): Promise; + dropIndexes(options: DropIndexesOptions, callback: Callback): void; + dropIndexes( + options?: DropIndexesOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new DropIndexesOperation(this as TODO_NODE_3286, resolveOptions(this, options)), + callback + ); + } + + /** + * Get the list of all indexes information for the collection. + * + * @param options - Optional settings for the command + */ + listIndexes(options?: ListIndexesOptions): ListIndexesCursor { + return new ListIndexesCursor(this as TODO_NODE_3286, resolveOptions(this, options)); + } + + /** + * Checks if one or more indexes exist on the collection, fails on first non-existing index + * + * @param indexes - One or more index names to check. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexExists(indexes: string | string[]): Promise; + indexExists(indexes: string | string[], callback: Callback): void; + indexExists(indexes: string | string[], options: IndexInformationOptions): Promise; + indexExists( + indexes: string | string[], + options: IndexInformationOptions, + callback: Callback + ): void; + indexExists( + indexes: string | string[], + options?: IndexInformationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new IndexExistsOperation(this as TODO_NODE_3286, indexes, resolveOptions(this, options)), + callback + ); + } + + /** + * Retrieves this collections index info. 
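A sketch of the index housekeeping helpers in this block, assuming a collection handle and a hypothetical index name:

```ts
import type { Collection, Document } from 'mongodb';

async function indexHousekeeping(coll: Collection<Document>): Promise<void> {
  // listIndexes returns a cursor over the collection's index descriptions.
  const indexes = await coll.listIndexes().toArray();
  console.log(indexes.map(ix => ix.name));

  // indexExists resolves true only when every named index exists.
  if (await coll.indexExists('ttl_createdAt')) { // hypothetical index name
    // dropIndex is always routed to the primary, as enforced above.
    await coll.dropIndex('ttl_createdAt');
  }
}
```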
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexInformation(): Promise; + indexInformation(callback: Callback): void; + indexInformation(options: IndexInformationOptions): Promise; + indexInformation(options: IndexInformationOptions, callback: Callback): void; + indexInformation( + options?: IndexInformationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new IndexInformationOperation(this.s.db, this.collectionName, resolveOptions(this, options)), + callback + ); + } + + /** + * Gets an estimate of the count of documents in a collection using collection metadata. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + estimatedDocumentCount(): Promise; + estimatedDocumentCount(callback: Callback): void; + estimatedDocumentCount(options: EstimatedDocumentCountOptions): Promise; + estimatedDocumentCount(options: EstimatedDocumentCountOptions, callback: Callback): void; + estimatedDocumentCount( + options?: EstimatedDocumentCountOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + return executeOperation( + getTopology(this), + new EstimatedDocumentCountOperation(this as TODO_NODE_3286, resolveOptions(this, options)), + callback + ); + } + + /** + * Gets the number of documents matching the filter. + * For a fast count of the total documents in a collection see {@link Collection#estimatedDocumentCount| estimatedDocumentCount}. 
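+   *
+   * For example (a hedged usage sketch, assuming an already connected `client`; the
+   * `status` field is hypothetical):
+   * ```js
+   * const collection = client.db('foo').collection('bar');
+   * // Pass a filter to count only matching documents, or {} to count everything
+   * const openCount = await collection.countDocuments({ status: 'open' });
+   * ```
+   *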
+ * **Note**: When migrating from {@link Collection#count| count} to {@link Collection#countDocuments| countDocuments} + * the following query operators must be replaced: + * + * | Operator | Replacement | + * | -------- | ----------- | + * | `$where` | [`$expr`][1] | + * | `$near` | [`$geoWithin`][2] with [`$center`][3] | + * | `$nearSphere` | [`$geoWithin`][2] with [`$centerSphere`][4] | + * + * [1]: https://docs.mongodb.com/manual/reference/operator/query/expr/ + * [2]: https://docs.mongodb.com/manual/reference/operator/query/geoWithin/ + * [3]: https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center + * [4]: https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere + * + * @param filter - The filter for the count + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + * + * @see https://docs.mongodb.com/manual/reference/operator/query/expr/ + * @see https://docs.mongodb.com/manual/reference/operator/query/geoWithin/ + * @see https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center + * @see https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere + */ + countDocuments(): Promise; + countDocuments(callback: Callback): void; + countDocuments(filter: Filter): Promise; + countDocuments(callback: Callback): void; + countDocuments(filter: Filter, options: CountDocumentsOptions): Promise; + countDocuments( + filter: Filter, + options: CountDocumentsOptions, + callback: Callback + ): void; + countDocuments(filter: Filter, callback: Callback): void; + countDocuments( + filter?: Document | CountDocumentsOptions | Callback, + options?: CountDocumentsOptions | Callback, + callback?: Callback + ): Promise | void { + if (filter == null) { + (filter = {}), (options = {}), (callback = undefined); + } else if (typeof filter === 'function') { + (callback = filter as Callback), (filter = {}), (options = {}); + } else { + if (arguments.length === 2) { + if (typeof options === 'function') (callback = options), (options = {}); + } + } + + filter ??= {}; + return executeOperation( + getTopology(this), + new CountDocumentsOperation( + this as TODO_NODE_3286, + filter as Document, + resolveOptions(this, options as CountDocumentsOptions) + ), + callback + ); + } + + /** + * The distinct command returns a list of distinct values for the given key across a collection. + * + * @param key - Field of the document to find distinct values for + * @param filter - The filter for filtering the set of documents to which we apply the distinct filter. 
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + distinct>( + key: Key + ): Promise[Key]>>>; + distinct>( + key: Key, + callback: Callback[Key]>>> + ): void; + distinct>( + key: Key, + filter: Filter + ): Promise[Key]>>>; + distinct>( + key: Key, + filter: Filter, + callback: Callback[Key]>>> + ): void; + distinct>( + key: Key, + filter: Filter, + options: DistinctOptions + ): Promise[Key]>>>; + distinct>( + key: Key, + filter: Filter, + options: DistinctOptions, + callback: Callback[Key]>>> + ): void; + + // Embedded documents overload + distinct(key: string): Promise; + distinct(key: string, callback: Callback): void; + distinct(key: string, filter: Filter): Promise; + distinct(key: string, filter: Filter, callback: Callback): void; + distinct(key: string, filter: Filter, options: DistinctOptions): Promise; + distinct( + key: string, + filter: Filter, + options: DistinctOptions, + callback: Callback + ): void; + // Implementation + distinct>( + key: Key, + filter?: Filter | DistinctOptions | Callback, + options?: DistinctOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof filter === 'function') { + (callback = filter as Callback), (filter = {}), (options = {}); + } else { + if (arguments.length === 3 && typeof options === 'function') { + (callback = options), (options = {}); + } + } + + filter ??= {}; + return executeOperation( + getTopology(this), + new DistinctOperation( + this as TODO_NODE_3286, + key as TODO_NODE_3286, + filter, + resolveOptions(this, options as DistinctOptions) + ), + callback + ); + } + + /** + * Retrieve all the indexes on the collection. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexes(): Promise; + indexes(callback: Callback): void; + indexes(options: IndexInformationOptions): Promise; + indexes(options: IndexInformationOptions, callback: Callback): void; + indexes( + options?: IndexInformationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new IndexesOperation(this as TODO_NODE_3286, resolveOptions(this, options)), + callback + ); + } + + /** + * Get all the collection statistics. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + stats(): Promise; + stats(callback: Callback): void; + stats(options: CollStatsOptions): Promise; + stats(options: CollStatsOptions, callback: Callback): void; + stats( + options?: CollStatsOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + return executeOperation( + getTopology(this), + new CollStatsOperation(this as TODO_NODE_3286, options), + callback + ); + } + + /** + * Find a document and delete it in one atomic operation. Requires a write lock for the duration of the operation. 
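+   *
+   * For example (a hedged usage sketch, assuming an already connected `client`; `someId`
+   * is a placeholder for an existing document's _id):
+   * ```js
+   * const collection = client.db('foo').collection('bar');
+   * const result = await collection.findOneAndDelete({ _id: someId });
+   * // result.value is expected to hold the deleted document, or null if nothing matched
+   * ```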
+ * + * @param filter - The filter used to select the document to remove + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndDelete(filter: Filter): Promise>; + findOneAndDelete( + filter: Filter, + options: FindOneAndDeleteOptions + ): Promise>; + findOneAndDelete(filter: Filter, callback: Callback>): void; + findOneAndDelete( + filter: Filter, + options: FindOneAndDeleteOptions, + callback: Callback> + ): void; + findOneAndDelete( + filter: Filter, + options?: FindOneAndDeleteOptions | Callback>, + callback?: Callback> + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new FindOneAndDeleteOperation( + this as TODO_NODE_3286, + filter, + resolveOptions(this, options) + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Find a document and replace it in one atomic operation. Requires a write lock for the duration of the operation. + * + * @param filter - The filter used to select the document to replace + * @param replacement - The Document that replaces the matching document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndReplace(filter: Filter, replacement: Document): Promise>; + findOneAndReplace( + filter: Filter, + replacement: Document, + callback: Callback> + ): void; + findOneAndReplace( + filter: Filter, + replacement: Document, + options: FindOneAndReplaceOptions + ): Promise>; + findOneAndReplace( + filter: Filter, + replacement: Document, + options: FindOneAndReplaceOptions, + callback: Callback> + ): void; + findOneAndReplace( + filter: Filter, + replacement: Document, + options?: FindOneAndReplaceOptions | Callback>, + callback?: Callback> + ): Promise> | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new FindOneAndReplaceOperation( + this as TODO_NODE_3286, + filter, + replacement, + resolveOptions(this, options) + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Find a document and update it in one atomic operation. Requires a write lock for the duration of the operation. 
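+   *
+   * For example (a hedged usage sketch, assuming an already connected `client`; field
+   * names are hypothetical):
+   * ```js
+   * const collection = client.db('foo').collection('bar');
+   * const result = await collection.findOneAndUpdate(
+   *   { username: 'alice' },              // filter used to select the document
+   *   { $set: { lastLogin: new Date() } } // update operations to apply
+   * );
+   * // result.value is expected to hold the matched document, or null if none matched
+   * ```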
+ * + * @param filter - The filter used to select the document to update + * @param update - Update operations to be performed on the document + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + findOneAndUpdate( + filter: Filter, + update: UpdateFilter + ): Promise>; + findOneAndUpdate( + filter: Filter, + update: UpdateFilter, + callback: Callback> + ): void; + findOneAndUpdate( + filter: Filter, + update: UpdateFilter, + options: FindOneAndUpdateOptions + ): Promise>; + findOneAndUpdate( + filter: Filter, + update: UpdateFilter, + options: FindOneAndUpdateOptions, + callback: Callback> + ): void; + findOneAndUpdate( + filter: Filter, + update: UpdateFilter, + options?: FindOneAndUpdateOptions | Callback>, + callback?: Callback> + ): Promise> | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new FindOneAndUpdateOperation( + this as TODO_NODE_3286, + filter, + update, + resolveOptions(this, options) + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Execute an aggregation framework pipeline against the collection, needs MongoDB \>= 2.2 + * + * @param pipeline - An array of aggregation pipelines to execute + * @param options - Optional settings for the command + */ + aggregate( + pipeline: Document[] = [], + options?: AggregateOptions + ): AggregationCursor { + if (arguments.length > 2) { + throw new MongoInvalidArgumentError( + 'Method "collection.aggregate()" accepts at most two arguments' + ); + } + if (!Array.isArray(pipeline)) { + throw new MongoInvalidArgumentError( + 'Argument "pipeline" must be an array of aggregation stages' + ); + } + if (typeof options === 'function') { + throw new MongoInvalidArgumentError('Argument "options" must not be function'); + } + + return new AggregationCursor( + getTopology(this), + this.s.namespace, + pipeline, + resolveOptions(this, options) + ); + } + + /** + * Create a new Change Stream, watching for new changes (insertions, updates, replacements, deletions, and invalidations) in this collection. + * + * @since 3.0.0 + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch( + pipeline: Document[] = [], + options: ChangeStreamOptions = {} + ): ChangeStream { + // Allow optionally not specifying a pipeline + if (!Array.isArray(pipeline)) { + options = pipeline; + pipeline = []; + } + + return new ChangeStream(this, pipeline, resolveOptions(this, options)); + } + + /** + * Run Map Reduce across a collection. Be aware that the inline option for out will return an array of results not a collection. + * + * @param map - The mapping function. + * @param reduce - The reduce function. 
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + mapReduce( + map: string | MapFunction, + reduce: string | ReduceFunction + ): Promise; + mapReduce( + map: string | MapFunction, + reduce: string | ReduceFunction, + callback: Callback + ): void; + mapReduce( + map: string | MapFunction, + reduce: string | ReduceFunction, + options: MapReduceOptions + ): Promise; + mapReduce( + map: string | MapFunction, + reduce: string | ReduceFunction, + options: MapReduceOptions, + callback: Callback + ): void; + mapReduce( + map: string | MapFunction, + reduce: string | ReduceFunction, + options?: MapReduceOptions | Callback, + callback?: Callback + ): Promise | void { + if ('function' === typeof options) (callback = options), (options = {}); + // Out must always be defined (make sure we don't break weirdly on pre 1.8+ servers) + // TODO NODE-3339: Figure out if this is still necessary given we no longer officially support pre-1.8 + if (options?.out == null) { + throw new MongoInvalidArgumentError( + 'Option "out" must be defined, see mongodb docs for possible values' + ); + } + + if ('function' === typeof map) { + map = map.toString(); + } + + if ('function' === typeof reduce) { + reduce = reduce.toString(); + } + + if ('function' === typeof options.finalize) { + options.finalize = options.finalize.toString(); + } + + return executeOperation( + getTopology(this), + new MapReduceOperation( + this as TODO_NODE_3286, + map, + reduce, + resolveOptions(this, options) as TODO_NODE_3286 + ), + callback + ); + } + + /** Initiate an Out of order batch write operation. All operations will be buffered into insert/update/remove commands executed out of order. */ + initializeUnorderedBulkOp(options?: BulkWriteOptions): UnorderedBulkOperation { + return new UnorderedBulkOperation(this as TODO_NODE_3286, resolveOptions(this, options)); + } + + /** Initiate an In order bulk write operation. Operations will be serially executed in the order they are added, creating a new operation for each switch in types. */ + initializeOrderedBulkOp(options?: BulkWriteOptions): OrderedBulkOperation { + return new OrderedBulkOperation(this as TODO_NODE_3286, resolveOptions(this, options)); + } + + /** Get the db scoped logger */ + getLogger(): Logger { + return this.s.db.s.logger; + } + + get logger(): Logger { + return this.s.db.s.logger; + } + + /** + * Inserts a single document or a an array of documents into MongoDB. If documents passed in do not contain the **_id** field, + * one will be added to each of the documents missing it by the driver, mutating the document. This behavior + * can be overridden by setting the **forceServerObjectId** flag. + * + * @deprecated Use insertOne, insertMany or bulkWrite instead. + * @param docs - The documents to insert + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + insert( + docs: OptionalId[], + options: BulkWriteOptions, + callback: Callback> + ): Promise> | void { + emitWarningOnce( + 'collection.insert is deprecated. Use insertOne, insertMany or bulkWrite instead.' + ); + if (typeof options === 'function') (callback = options), (options = {}); + options = options || { ordered: false }; + docs = !Array.isArray(docs) ? 
[docs] : docs; + + if (options.keepGoing === true) { + options.ordered = false; + } + + return this.insertMany(docs, options, callback); + } + + /** + * Updates documents. + * + * @deprecated use updateOne, updateMany or bulkWrite + * @param selector - The selector for the update operation. + * @param update - The update operations to be applied to the documents + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + update( + selector: Filter, + update: UpdateFilter, + options: UpdateOptions, + callback: Callback + ): Promise | void { + emitWarningOnce( + 'collection.update is deprecated. Use updateOne, updateMany, or bulkWrite instead.' + ); + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + return this.updateMany(selector, update, options, callback); + } + + /** + * Remove documents. + * + * @deprecated use deleteOne, deleteMany or bulkWrite + * @param selector - The selector for the update operation. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + remove( + selector: Filter, + options: DeleteOptions, + callback: Callback + ): Promise | void { + emitWarningOnce( + 'collection.remove is deprecated. Use deleteOne, deleteMany, or bulkWrite instead.' + ); + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + return this.deleteMany(selector, options, callback); + } + + /** + * An estimated count of matching documents in the db to a filter. + * + * **NOTE:** This method has been deprecated, since it does not provide an accurate count of the documents + * in a collection. To obtain an accurate count of documents in the collection, use {@link Collection#countDocuments| countDocuments}. + * To obtain an estimated count of all documents in the collection, use {@link Collection#estimatedDocumentCount| estimatedDocumentCount}. + * + * @deprecated use {@link Collection#countDocuments| countDocuments} or {@link Collection#estimatedDocumentCount| estimatedDocumentCount} instead + * + * @param filter - The filter for the count. 
+ * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + count(): Promise; + count(callback: Callback): void; + count(filter: Filter): Promise; + count(filter: Filter, callback: Callback): void; + count(filter: Filter, options: CountOptions): Promise; + count( + filter: Filter, + options: CountOptions, + callback: Callback + ): Promise | void; + count( + filter?: Filter | CountOptions | Callback, + options?: CountOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof filter === 'function') { + (callback = filter as Callback), (filter = {}), (options = {}); + } else { + if (typeof options === 'function') (callback = options), (options = {}); + } + + filter ??= {}; + return executeOperation( + getTopology(this), + new CountDocumentsOperation(this as TODO_NODE_3286, filter, resolveOptions(this, options)), + callback + ); + } +} diff --git a/node_modules/mongodb/src/connection_string.ts b/node_modules/mongodb/src/connection_string.ts new file mode 100644 index 000000000..c188e2df8 --- /dev/null +++ b/node_modules/mongodb/src/connection_string.ts @@ -0,0 +1,1119 @@ +import * as dns from 'dns'; +import * as fs from 'fs'; +import ConnectionString from 'mongodb-connection-string-url'; +import { URLSearchParams } from 'url'; +import { AuthMechanism } from './cmap/auth/defaultAuthProviders'; +import { ReadPreference, ReadPreferenceMode } from './read_preference'; +import { ReadConcern, ReadConcernLevel } from './read_concern'; +import { W, WriteConcern } from './write_concern'; +import { MongoAPIError, MongoInvalidArgumentError, MongoParseError } from './error'; +import { + AnyOptions, + Callback, + DEFAULT_PK_FACTORY, + isRecord, + makeClientMetadata, + setDifference, + HostAddress, + emitWarning +} from './utils'; +import type { Document } from './bson'; +import { + DriverInfo, + MongoClient, + MongoClientOptions, + MongoOptions, + PkFactory, + ServerApi, + ServerApiVersion +} from './mongo_client'; +import { MongoCredentials } from './cmap/auth/mongo_credentials'; +import type { TagSet } from './sdam/server_description'; +import { Logger, LoggerLevel } from './logger'; +import { PromiseProvider } from './promise_provider'; +import { Encrypter } from './encrypter'; +import { Compressor, CompressorName } from './cmap/wire_protocol/compression'; + +const VALID_TXT_RECORDS = ['authSource', 'replicaSet', 'loadBalanced']; + +const LB_SINGLE_HOST_ERROR = 'loadBalanced option only supported with a single host in the URI'; +const LB_REPLICA_SET_ERROR = 'loadBalanced option not supported with a replicaSet option'; +const LB_DIRECT_CONNECTION_ERROR = + 'loadBalanced option not supported when directConnection is provided'; + +/** + * Determines whether a provided address matches the provided parent domain in order + * to avoid certain attack vectors. + * + * @param srvAddress - The address to check against a domain + * @param parentDomain - The domain to check the provided address against + * @returns Whether the provided address matches the parent domain + */ +function matchesParentDomain(srvAddress: string, parentDomain: string): boolean { + const regex = /^.*?\./; + const srv = `.${srvAddress.replace(regex, '')}`; + const parent = `.${parentDomain.replace(regex, '')}`; + return srv.endsWith(parent); +} + +/** + * Lookup a `mongodb+srv` connection string, combine the parts and reparse it as a normal + * connection string. 
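+ *
+ * For example, a hypothetical `mongodb+srv://cluster0.example.com/db` URI is resolved by
+ * querying the SRV records at `_mongodb._tcp.cluster0.example.com` to obtain the host list,
+ * and the TXT records at `cluster0.example.com` for additional URI options (only
+ * `authSource`, `replicaSet` and `loadBalanced` are accepted from TXT records).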
+ * + * @param uri - The connection string to parse + * @param options - Optional user provided connection string options + */ +export function resolveSRVRecord(options: MongoOptions, callback: Callback): void { + if (typeof options.srvHost !== 'string') { + return callback(new MongoAPIError('Option "srvHost" must not be empty')); + } + + if (options.srvHost.split('.').length < 3) { + // TODO(NODE-3484): Replace with MongoConnectionStringError + return callback(new MongoAPIError('URI must include hostname, domain name, and tld')); + } + + // Resolve the SRV record and use the result as the list of hosts to connect to. + const lookupAddress = options.srvHost; + dns.resolveSrv(`_mongodb._tcp.${lookupAddress}`, (err, addresses) => { + if (err) return callback(err); + + if (addresses.length === 0) { + return callback(new MongoAPIError('No addresses found at host')); + } + + for (const { name } of addresses) { + if (!matchesParentDomain(name, lookupAddress)) { + return callback(new MongoAPIError('Server record does not share hostname with parent URI')); + } + } + + const hostAddresses = addresses.map(r => + HostAddress.fromString(`${r.name}:${r.port ?? 27017}`) + ); + + const lbError = validateLoadBalancedOptions(hostAddresses, options); + if (lbError) { + return callback(lbError); + } + + // Resolve TXT record and add options from there if they exist. + dns.resolveTxt(lookupAddress, (err, record) => { + if (err) { + if (err.code !== 'ENODATA' && err.code !== 'ENOTFOUND') { + return callback(err); + } + } else { + if (record.length > 1) { + return callback(new MongoParseError('Multiple text records not allowed')); + } + + const txtRecordOptions = new URLSearchParams(record[0].join('')); + const txtRecordOptionKeys = [...txtRecordOptions.keys()]; + if (txtRecordOptionKeys.some(key => !VALID_TXT_RECORDS.includes(key))) { + return callback( + new MongoParseError(`Text record may only set any of: ${VALID_TXT_RECORDS.join(', ')}`) + ); + } + + const source = txtRecordOptions.get('authSource') ?? undefined; + const replicaSet = txtRecordOptions.get('replicaSet') ?? undefined; + const loadBalanced = txtRecordOptions.get('loadBalanced') ?? 
undefined; + + if (source === '' || replicaSet === '') { + return callback(new MongoParseError('Cannot have empty URI params in DNS TXT Record')); + } + + if (!options.userSpecifiedAuthSource && source) { + options.credentials = MongoCredentials.merge(options.credentials, { source }); + } + + if (!options.userSpecifiedReplicaSet && replicaSet) { + options.replicaSet = replicaSet; + } + + if (loadBalanced === 'true') { + options.loadBalanced = true; + } + + const lbError = validateLoadBalancedOptions(hostAddresses, options); + if (lbError) { + return callback(lbError); + } + } + + callback(undefined, hostAddresses); + }); + }); +} + +/** + * Checks if TLS options are valid + * + * @param options - The options used for options parsing + * @throws MongoParseError if TLS options are invalid + */ +export function checkTLSOptions(options: AnyOptions): void { + if (!options) return; + const check = (a: string, b: string) => { + if (Reflect.has(options, a) && Reflect.has(options, b)) { + throw new MongoParseError(`The '${a}' option cannot be used with '${b}'`); + } + }; + check('tlsInsecure', 'tlsAllowInvalidCertificates'); + check('tlsInsecure', 'tlsAllowInvalidHostnames'); + check('tlsInsecure', 'tlsDisableCertificateRevocationCheck'); + check('tlsInsecure', 'tlsDisableOCSPEndpointCheck'); + check('tlsAllowInvalidCertificates', 'tlsDisableCertificateRevocationCheck'); + check('tlsAllowInvalidCertificates', 'tlsDisableOCSPEndpointCheck'); + check('tlsDisableCertificateRevocationCheck', 'tlsDisableOCSPEndpointCheck'); +} + +const TRUTHS = new Set(['true', 't', '1', 'y', 'yes']); +const FALSEHOODS = new Set(['false', 'f', '0', 'n', 'no', '-1']); +function getBoolean(name: string, value: unknown): boolean { + if (typeof value === 'boolean') return value; + const valueString = String(value).toLowerCase(); + if (TRUTHS.has(valueString)) return true; + if (FALSEHOODS.has(valueString)) return false; + throw new MongoParseError(`Expected ${name} to be stringified boolean value, got: ${value}`); +} + +function getInt(name: string, value: unknown): number { + if (typeof value === 'number') return Math.trunc(value); + const parsedValue = Number.parseInt(String(value), 10); + if (!Number.isNaN(parsedValue)) return parsedValue; + throw new MongoParseError(`Expected ${name} to be stringified int value, got: ${value}`); +} + +function getUint(name: string, value: unknown): number { + const parsedValue = getInt(name, value); + if (parsedValue < 0) { + throw new MongoParseError(`${name} can only be a positive int value, got: ${value}`); + } + return parsedValue; +} + +function toRecord(value: string): Record { + const record = Object.create(null); + const keyValuePairs = value.split(','); + for (const keyValue of keyValuePairs) { + const [key, value] = keyValue.split(':'); + if (value == null) { + throw new MongoParseError('Cannot have undefined values in key value pairs'); + } + try { + // try to get a boolean + record[key] = getBoolean('', value); + } catch { + try { + // try to get a number + record[key] = getInt('', value); + } catch { + // keep value as a string + record[key] = value; + } + } + } + return record; +} + +class CaseInsensitiveMap extends Map { + constructor(entries: Array<[string, any]> = []) { + super(entries.map(([k, v]) => [k.toLowerCase(), v])); + } + has(k: string) { + return super.has(k.toLowerCase()); + } + get(k: string) { + return super.get(k.toLowerCase()); + } + set(k: string, v: any) { + return super.set(k.toLowerCase(), v); + } + delete(k: string): boolean { + return 
super.delete(k.toLowerCase()); + } +} + +export function parseOptions( + uri: string, + mongoClient: MongoClient | MongoClientOptions | undefined = undefined, + options: MongoClientOptions = {} +): MongoOptions { + if (mongoClient != null && !(mongoClient instanceof MongoClient)) { + options = mongoClient; + mongoClient = undefined; + } + + const url = new ConnectionString(uri); + const { hosts, isSRV } = url; + + const mongoOptions = Object.create(null); + mongoOptions.hosts = isSRV ? [] : hosts.map(HostAddress.fromString); + if (isSRV) { + // SRV Record is resolved upon connecting + mongoOptions.srvHost = hosts[0]; + if (!url.searchParams.has('tls') && !url.searchParams.has('ssl')) { + options.tls = true; + } + } + + const urlOptions = new CaseInsensitiveMap(); + + if (url.pathname !== '/' && url.pathname !== '') { + const dbName = decodeURIComponent( + url.pathname[0] === '/' ? url.pathname.slice(1) : url.pathname + ); + if (dbName) { + urlOptions.set('dbName', [dbName]); + } + } + + if (url.username !== '') { + const auth: Document = { + username: decodeURIComponent(url.username) + }; + + if (typeof url.password === 'string') { + auth.password = decodeURIComponent(url.password); + } + + urlOptions.set('auth', [auth]); + } + + for (const key of url.searchParams.keys()) { + const values = [...url.searchParams.getAll(key)]; + + if (values.includes('')) { + throw new MongoAPIError('URI cannot contain options with no value'); + } + + if (key.toLowerCase() === 'serverapi') { + throw new MongoParseError( + 'URI cannot contain `serverApi`, it can only be passed to the client' + ); + } + + if (key.toLowerCase() === 'authsource' && urlOptions.has('authSource')) { + // If authSource is an explicit key in the urlOptions we need to remove the implicit dbName + urlOptions.delete('authSource'); + } + + if (!urlOptions.has(key)) { + urlOptions.set(key, values); + } + } + + const objectOptions = new CaseInsensitiveMap( + Object.entries(options).filter(([, v]) => v != null) + ); + + if (objectOptions.has('loadBalanced')) { + throw new MongoParseError('loadBalanced is only a valid option in the URI'); + } + + const allOptions = new CaseInsensitiveMap(); + + const allKeys = new Set([ + ...urlOptions.keys(), + ...objectOptions.keys(), + ...DEFAULT_OPTIONS.keys() + ]); + + for (const key of allKeys) { + const values = []; + if (objectOptions.has(key)) { + values.push(objectOptions.get(key)); + } + if (urlOptions.has(key)) { + values.push(...urlOptions.get(key)); + } + if (DEFAULT_OPTIONS.has(key)) { + values.push(DEFAULT_OPTIONS.get(key)); + } + allOptions.set(key, values); + } + + if (allOptions.has('tlsCertificateKeyFile') && !allOptions.has('tlsCertificateFile')) { + allOptions.set('tlsCertificateFile', allOptions.get('tlsCertificateKeyFile')); + } + + if (allOptions.has('tls') || allOptions.has('ssl')) { + const tlsAndSslOpts = (allOptions.get('tls') || []) + .concat(allOptions.get('ssl') || []) + .map(getBoolean.bind(null, 'tls/ssl')); + if (new Set(tlsAndSslOpts).size !== 1) { + throw new MongoParseError('All values of tls/ssl must be the same.'); + } + } + + const unsupportedOptions = setDifference( + allKeys, + Array.from(Object.keys(OPTIONS)).map(s => s.toLowerCase()) + ); + if (unsupportedOptions.size !== 0) { + const optionWord = unsupportedOptions.size > 1 ? 'options' : 'option'; + const isOrAre = unsupportedOptions.size > 1 ? 
'are' : 'is'; + throw new MongoParseError( + `${optionWord} ${Array.from(unsupportedOptions).join(', ')} ${isOrAre} not supported` + ); + } + + for (const [key, descriptor] of Object.entries(OPTIONS)) { + const values = allOptions.get(key); + if (!values || values.length === 0) continue; + setOption(mongoOptions, key, descriptor, values); + } + + if (mongoOptions.credentials) { + const isGssapi = mongoOptions.credentials.mechanism === AuthMechanism.MONGODB_GSSAPI; + const isX509 = mongoOptions.credentials.mechanism === AuthMechanism.MONGODB_X509; + const isAws = mongoOptions.credentials.mechanism === AuthMechanism.MONGODB_AWS; + if ( + (isGssapi || isX509) && + allOptions.has('authSource') && + mongoOptions.credentials.source !== '$external' + ) { + // If authSource was explicitly given and its incorrect, we error + throw new MongoParseError( + `${mongoOptions.credentials} can only have authSource set to '$external'` + ); + } + + if (!(isGssapi || isX509 || isAws) && mongoOptions.dbName && !allOptions.has('authSource')) { + // inherit the dbName unless GSSAPI or X509, then silently ignore dbName + // and there was no specific authSource given + mongoOptions.credentials = MongoCredentials.merge(mongoOptions.credentials, { + source: mongoOptions.dbName + }); + } + + mongoOptions.credentials.validate(); + } + + if (!mongoOptions.dbName) { + // dbName default is applied here because of the credential validation above + mongoOptions.dbName = 'test'; + } + + checkTLSOptions(mongoOptions); + + if (options.promiseLibrary) PromiseProvider.set(options.promiseLibrary); + + if (mongoOptions.directConnection && typeof mongoOptions.srvHost === 'string') { + throw new MongoAPIError('SRV URI does not support directConnection'); + } + + const lbError = validateLoadBalancedOptions(hosts, mongoOptions); + if (lbError) { + throw lbError; + } + + // Potential SRV Overrides + mongoOptions.userSpecifiedAuthSource = + objectOptions.has('authSource') || urlOptions.has('authSource'); + mongoOptions.userSpecifiedReplicaSet = + objectOptions.has('replicaSet') || urlOptions.has('replicaSet'); + + if (mongoClient && mongoOptions.autoEncryption) { + Encrypter.checkForMongoCrypt(); + mongoOptions.encrypter = new Encrypter(mongoClient, uri, options); + mongoOptions.autoEncrypter = mongoOptions.encrypter.autoEncrypter; + } + + return mongoOptions; +} + +function validateLoadBalancedOptions( + hosts: HostAddress[] | string[], + mongoOptions: MongoOptions +): MongoParseError | undefined { + if (mongoOptions.loadBalanced) { + if (hosts.length > 1) { + return new MongoParseError(LB_SINGLE_HOST_ERROR); + } + if (mongoOptions.replicaSet) { + return new MongoParseError(LB_REPLICA_SET_ERROR); + } + if (mongoOptions.directConnection) { + return new MongoParseError(LB_DIRECT_CONNECTION_ERROR); + } + } +} + +function setOption( + mongoOptions: any, + key: string, + descriptor: OptionDescriptor, + values: unknown[] +) { + const { target, type, transform, deprecated } = descriptor; + const name = target ?? key; + + if (deprecated) { + const deprecatedMsg = typeof deprecated === 'string' ? 
`: ${deprecated}` : ''; + emitWarning(`${key} is a deprecated option${deprecatedMsg}`); + } + + switch (type) { + case 'boolean': + mongoOptions[name] = getBoolean(name, values[0]); + break; + case 'int': + mongoOptions[name] = getInt(name, values[0]); + break; + case 'uint': + mongoOptions[name] = getUint(name, values[0]); + break; + case 'string': + if (values[0] == null) { + break; + } + mongoOptions[name] = String(values[0]); + break; + case 'record': + if (!isRecord(values[0])) { + throw new MongoParseError(`${name} must be an object`); + } + mongoOptions[name] = values[0]; + break; + case 'any': + mongoOptions[name] = values[0]; + break; + default: { + if (!transform) { + throw new MongoParseError('Descriptors missing a type must define a transform'); + } + const transformValue = transform({ name, options: mongoOptions, values }); + mongoOptions[name] = transformValue; + break; + } + } +} + +interface OptionDescriptor { + target?: string; + type?: 'boolean' | 'int' | 'uint' | 'record' | 'string' | 'any'; + default?: any; + + deprecated?: boolean | string; + /** + * @param name - the original option name + * @param options - the options so far for resolution + * @param values - the possible values in precedence order + */ + transform?: (args: { name: string; options: MongoOptions; values: unknown[] }) => unknown; +} + +export const OPTIONS = { + appName: { + target: 'metadata', + transform({ options, values: [value] }): DriverInfo { + return makeClientMetadata({ ...options.driverInfo, appName: String(value) }); + } + }, + auth: { + target: 'credentials', + transform({ name, options, values: [value] }): MongoCredentials { + if (!isRecord(value, ['username', 'password'] as const)) { + throw new MongoParseError( + `${name} must be an object with 'username' and 'password' properties` + ); + } + return MongoCredentials.merge(options.credentials, { + username: value.username, + password: value.password + }); + } + }, + authMechanism: { + target: 'credentials', + transform({ options, values: [value] }): MongoCredentials { + const mechanisms = Object.values(AuthMechanism); + const [mechanism] = mechanisms.filter(m => m.match(RegExp(String.raw`\b${value}\b`, 'i'))); + if (!mechanism) { + throw new MongoParseError(`authMechanism one of ${mechanisms}, got ${value}`); + } + let source = options.credentials?.source; + if ( + mechanism === AuthMechanism.MONGODB_PLAIN || + mechanism === AuthMechanism.MONGODB_GSSAPI || + mechanism === AuthMechanism.MONGODB_AWS || + mechanism === AuthMechanism.MONGODB_X509 + ) { + // some mechanisms have '$external' as the Auth Source + source = '$external'; + } + + let password = options.credentials?.password; + if (mechanism === AuthMechanism.MONGODB_X509 && password === '') { + password = undefined; + } + return MongoCredentials.merge(options.credentials, { + mechanism, + source, + password + }); + } + }, + authMechanismProperties: { + target: 'credentials', + transform({ options, values: [value] }): MongoCredentials { + if (typeof value === 'string') { + value = toRecord(value); + } + if (!isRecord(value)) { + throw new MongoParseError('AuthMechanismProperties must be an object'); + } + return MongoCredentials.merge(options.credentials, { mechanismProperties: value }); + } + }, + authSource: { + target: 'credentials', + transform({ options, values: [value] }): MongoCredentials { + const source = String(value); + return MongoCredentials.merge(options.credentials, { source }); + } + }, + autoEncryption: { + type: 'record' + }, + bsonRegExp: { + type: 'boolean' + }, 
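+  // Each entry below is an OptionDescriptor consumed by setOption above: a plain `type`
+  // is coerced via getBoolean/getInt/getUint, `target` stores the parsed value under a
+  // different key on MongoOptions, and `transform` handles anything more involved.
+  // For instance, `connectTimeoutMS: { default: 30000, type: 'uint' }` coerces to a
+  // non-negative integer, while `ssl: { target: 'tls', type: 'boolean' }` writes its
+  // value to the `tls` option.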
+ serverApi: { + target: 'serverApi', + transform({ values: [version] }): ServerApi { + const serverApiToValidate = + typeof version === 'string' ? ({ version } as ServerApi) : (version as ServerApi); + const versionToValidate = serverApiToValidate && serverApiToValidate.version; + if (!versionToValidate) { + throw new MongoParseError( + `Invalid \`serverApi\` property; must specify a version from the following enum: ["${Object.values( + ServerApiVersion + ).join('", "')}"]` + ); + } + if (!Object.values(ServerApiVersion).some(v => v === versionToValidate)) { + throw new MongoParseError( + `Invalid server API version=${versionToValidate}; must be in the following enum: ["${Object.values( + ServerApiVersion + ).join('", "')}"]` + ); + } + return serverApiToValidate; + } + }, + checkKeys: { + type: 'boolean' + }, + compressors: { + default: 'none', + target: 'compressors', + transform({ values }) { + const compressionList = new Set(); + for (const compVal of values as (CompressorName[] | string)[]) { + const compValArray = typeof compVal === 'string' ? compVal.split(',') : compVal; + if (!Array.isArray(compValArray)) { + throw new MongoInvalidArgumentError( + 'compressors must be an array or a comma-delimited list of strings' + ); + } + for (const c of compValArray) { + if (Object.keys(Compressor).includes(String(c))) { + compressionList.add(String(c)); + } else { + throw new MongoInvalidArgumentError( + `${c} is not a valid compression mechanism. Must be one of: ${Object.keys( + Compressor + )}.` + ); + } + } + } + return [...compressionList]; + } + }, + connectTimeoutMS: { + default: 30000, + type: 'uint' + }, + dbName: { + type: 'string' + }, + directConnection: { + default: false, + type: 'boolean' + }, + driverInfo: { + target: 'metadata', + default: makeClientMetadata(), + transform({ options, values: [value] }) { + if (!isRecord(value)) throw new MongoParseError('DriverInfo must be an object'); + return makeClientMetadata({ + driverInfo: value, + appName: options.metadata?.application?.name + }); + } + }, + family: { + transform({ name, values: [value] }): 4 | 6 { + const transformValue = getInt(name, value); + if (transformValue === 4 || transformValue === 6) { + return transformValue; + } + throw new MongoParseError(`Option 'family' must be 4 or 6 got ${transformValue}.`); + } + }, + fieldsAsRaw: { + type: 'record' + }, + forceServerObjectId: { + default: false, + type: 'boolean' + }, + fsync: { + deprecated: 'Please use journal instead', + target: 'writeConcern', + transform({ name, options, values: [value] }): WriteConcern { + const wc = WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + fsync: getBoolean(name, value) + } + }); + if (!wc) throw new MongoParseError(`Unable to make a writeConcern from fsync=${value}`); + return wc; + } + } as OptionDescriptor, + heartbeatFrequencyMS: { + default: 10000, + type: 'uint' + }, + ignoreUndefined: { + type: 'boolean' + }, + j: { + deprecated: 'Please use journal instead', + target: 'writeConcern', + transform({ name, options, values: [value] }): WriteConcern { + const wc = WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + journal: getBoolean(name, value) + } + }); + if (!wc) throw new MongoParseError(`Unable to make a writeConcern from journal=${value}`); + return wc; + } + } as OptionDescriptor, + journal: { + target: 'writeConcern', + transform({ name, options, values: [value] }): WriteConcern { + const wc = WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + journal: 
getBoolean(name, value) + } + }); + if (!wc) throw new MongoParseError(`Unable to make a writeConcern from journal=${value}`); + return wc; + } + }, + keepAlive: { + default: true, + type: 'boolean' + }, + keepAliveInitialDelay: { + default: 120000, + type: 'uint' + }, + loadBalanced: { + default: false, + type: 'boolean' + }, + localThresholdMS: { + default: 15, + type: 'uint' + }, + logger: { + default: new Logger('MongoClient'), + transform({ values: [value] }) { + if (value instanceof Logger) { + return value; + } + emitWarning('Alternative loggers might not be supported'); + // TODO: make Logger an interface that others can implement, make usage consistent in driver + // DRIVERS-1204 + } + }, + loggerLevel: { + target: 'logger', + transform({ values: [value] }) { + return new Logger('MongoClient', { loggerLevel: value as LoggerLevel }); + } + }, + maxIdleTimeMS: { + default: 0, + type: 'uint' + }, + maxPoolSize: { + default: 100, + type: 'uint' + }, + maxStalenessSeconds: { + target: 'readPreference', + transform({ name, options, values: [value] }) { + const maxStalenessSeconds = getUint(name, value); + if (options.readPreference) { + return ReadPreference.fromOptions({ + readPreference: { ...options.readPreference, maxStalenessSeconds } + }); + } else { + return new ReadPreference('secondary', undefined, { maxStalenessSeconds }); + } + } + }, + minInternalBufferSize: { + type: 'uint' + }, + minPoolSize: { + default: 0, + type: 'uint' + }, + minHeartbeatFrequencyMS: { + default: 500, + type: 'uint' + }, + monitorCommands: { + default: false, + type: 'boolean' + }, + name: { + target: 'driverInfo', + transform({ values: [value], options }) { + return { ...options.driverInfo, name: String(value) }; + } + } as OptionDescriptor, + noDelay: { + default: true, + type: 'boolean' + }, + pkFactory: { + default: DEFAULT_PK_FACTORY, + transform({ values: [value] }): PkFactory { + if (isRecord(value, ['createPk'] as const) && typeof value.createPk === 'function') { + return value as PkFactory; + } + throw new MongoParseError( + `Option pkFactory must be an object with a createPk function, got ${value}` + ); + } + }, + promiseLibrary: { + deprecated: true, + type: 'any' + }, + promoteBuffers: { + type: 'boolean' + }, + promoteLongs: { + type: 'boolean' + }, + promoteValues: { + type: 'boolean' + }, + raw: { + default: false, + type: 'boolean' + }, + readConcern: { + transform({ values: [value], options }) { + if (value instanceof ReadConcern || isRecord(value, ['level'] as const)) { + return ReadConcern.fromOptions({ ...options.readConcern, ...value } as any); + } + throw new MongoParseError(`ReadConcern must be an object, got ${JSON.stringify(value)}`); + } + }, + readConcernLevel: { + target: 'readConcern', + transform({ values: [level], options }) { + return ReadConcern.fromOptions({ + ...options.readConcern, + level: level as ReadConcernLevel + }); + } + }, + readPreference: { + default: ReadPreference.primary, + transform({ values: [value], options }) { + if (value instanceof ReadPreference) { + return ReadPreference.fromOptions({ + readPreference: { ...options.readPreference, ...value }, + ...value + } as any); + } + if (isRecord(value, ['mode'] as const)) { + const rp = ReadPreference.fromOptions({ + readPreference: { ...options.readPreference, ...value }, + ...value + } as any); + if (rp) return rp; + else throw new MongoParseError(`Cannot make read preference from ${JSON.stringify(value)}`); + } + if (typeof value === 'string') { + const rpOpts = { + hedge: options.readPreference?.hedge, 
+ maxStalenessSeconds: options.readPreference?.maxStalenessSeconds + }; + return new ReadPreference( + value as ReadPreferenceMode, + options.readPreference?.tags, + rpOpts + ); + } + } + }, + readPreferenceTags: { + target: 'readPreference', + transform({ values, options }) { + const readPreferenceTags = []; + for (const tag of values) { + const readPreferenceTag: TagSet = Object.create(null); + if (typeof tag === 'string') { + for (const [k, v] of Object.entries(toRecord(tag))) { + readPreferenceTag[k] = v; + } + } + if (isRecord(tag)) { + for (const [k, v] of Object.entries(tag)) { + readPreferenceTag[k] = v; + } + } + readPreferenceTags.push(readPreferenceTag); + } + return ReadPreference.fromOptions({ + readPreference: options.readPreference, + readPreferenceTags + }); + } + }, + replicaSet: { + type: 'string' + }, + retryReads: { + default: true, + type: 'boolean' + }, + retryWrites: { + default: true, + type: 'boolean' + }, + serializeFunctions: { + type: 'boolean' + }, + serverSelectionTimeoutMS: { + default: 30000, + type: 'uint' + }, + servername: { + type: 'string' + }, + socketTimeoutMS: { + default: 0, + type: 'uint' + }, + ssl: { + target: 'tls', + type: 'boolean' + }, + sslCA: { + target: 'ca', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslCRL: { + target: 'crl', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslCert: { + target: 'cert', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslKey: { + target: 'key', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + sslPass: { + deprecated: true, + target: 'passphrase', + type: 'string' + }, + sslValidate: { + target: 'rejectUnauthorized', + type: 'boolean' + }, + tls: { + type: 'boolean' + }, + tlsAllowInvalidCertificates: { + target: 'rejectUnauthorized', + transform({ name, values: [value] }) { + // allowInvalidCertificates is the inverse of rejectUnauthorized + return !getBoolean(name, value); + } + }, + tlsAllowInvalidHostnames: { + target: 'checkServerIdentity', + transform({ name, values: [value] }) { + // tlsAllowInvalidHostnames means setting the checkServerIdentity function to a noop + return getBoolean(name, value) ? () => undefined : undefined; + } + }, + tlsCAFile: { + target: 'ca', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + tlsCertificateFile: { + target: 'cert', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + tlsCertificateKeyFile: { + target: 'key', + transform({ values: [value] }) { + return fs.readFileSync(String(value), { encoding: 'ascii' }); + } + }, + tlsCertificateKeyFilePassword: { + target: 'passphrase', + type: 'any' + }, + tlsInsecure: { + transform({ name, options, values: [value] }) { + const tlsInsecure = getBoolean(name, value); + if (tlsInsecure) { + options.checkServerIdentity = () => undefined; + options.rejectUnauthorized = false; + } else { + options.checkServerIdentity = options.tlsAllowInvalidHostnames + ? () => undefined + : undefined; + options.rejectUnauthorized = options.tlsAllowInvalidCertificates ? 
false : true; + } + return tlsInsecure; + } + }, + w: { + target: 'writeConcern', + transform({ values: [value], options }) { + return WriteConcern.fromOptions({ writeConcern: { ...options.writeConcern, w: value as W } }); + } + }, + waitQueueTimeoutMS: { + default: 0, + type: 'uint' + }, + writeConcern: { + target: 'writeConcern', + transform({ values: [value], options }) { + if (isRecord(value) || value instanceof WriteConcern) { + return WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + ...value + } + }); + } else if (value === 'majority' || typeof value === 'number') { + return WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + w: value + } + }); + } + + throw new MongoParseError(`Invalid WriteConcern cannot parse: ${JSON.stringify(value)}`); + } + } as OptionDescriptor, + wtimeout: { + deprecated: 'Please use wtimeoutMS instead', + target: 'writeConcern', + transform({ values: [value], options }) { + const wc = WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + wtimeout: getUint('wtimeout', value) + } + }); + if (wc) return wc; + throw new MongoParseError(`Cannot make WriteConcern from wtimeout`); + } + } as OptionDescriptor, + wtimeoutMS: { + target: 'writeConcern', + transform({ values: [value], options }) { + const wc = WriteConcern.fromOptions({ + writeConcern: { + ...options.writeConcern, + wtimeoutMS: getUint('wtimeoutMS', value) + } + }); + if (wc) return wc; + throw new MongoParseError(`Cannot make WriteConcern from wtimeout`); + } + }, + zlibCompressionLevel: { + default: 0, + type: 'int' + }, + // Custom types for modifying core behavior + connectionType: { type: 'any' }, + srvPoller: { type: 'any' }, + // Accepted NodeJS Options + minDHSize: { type: 'any' }, + pskCallback: { type: 'any' }, + secureContext: { type: 'any' }, + enableTrace: { type: 'any' }, + requestCert: { type: 'any' }, + rejectUnauthorized: { type: 'any' }, + checkServerIdentity: { type: 'any' }, + ALPNProtocols: { type: 'any' }, + SNICallback: { type: 'any' }, + session: { type: 'any' }, + requestOCSP: { type: 'any' }, + localAddress: { type: 'any' }, + localPort: { type: 'any' }, + hints: { type: 'any' }, + lookup: { type: 'any' }, + ca: { type: 'any' }, + cert: { type: 'any' }, + ciphers: { type: 'any' }, + crl: { type: 'any' }, + ecdhCurve: { type: 'any' }, + key: { type: 'any' }, + passphrase: { type: 'any' }, + pfx: { type: 'any' }, + secureProtocol: { type: 'any' }, + index: { type: 'any' }, + // Legacy Options, these are unused but left here to avoid errors with CSFLE lib + useNewUrlParser: { type: 'boolean' } as OptionDescriptor, + useUnifiedTopology: { type: 'boolean' } as OptionDescriptor +} as Record; + +export const DEFAULT_OPTIONS = new CaseInsensitiveMap( + Object.entries(OPTIONS) + .filter(([, descriptor]) => descriptor.default != null) + .map(([k, d]) => [k, d.default]) +); diff --git a/node_modules/mongodb/src/constants.ts b/node_modules/mongodb/src/constants.ts new file mode 100644 index 000000000..4a6b53826 --- /dev/null +++ b/node_modules/mongodb/src/constants.ts @@ -0,0 +1,6 @@ +export const SYSTEM_NAMESPACE_COLLECTION = 'system.namespaces'; +export const SYSTEM_INDEX_COLLECTION = 'system.indexes'; +export const SYSTEM_PROFILE_COLLECTION = 'system.profile'; +export const SYSTEM_USER_COLLECTION = 'system.users'; +export const SYSTEM_COMMAND_COLLECTION = '$cmd'; +export const SYSTEM_JS_COLLECTION = 'system.js'; diff --git a/node_modules/mongodb/src/cursor/abstract_cursor.ts 
b/node_modules/mongodb/src/cursor/abstract_cursor.ts new file mode 100644 index 000000000..b4be34bc2 --- /dev/null +++ b/node_modules/mongodb/src/cursor/abstract_cursor.ts @@ -0,0 +1,889 @@ +import { Callback, maybePromise, MongoDBNamespace, ns } from '../utils'; +import { Long, Document, BSONSerializeOptions, pluckBSONSerializeOptions } from '../bson'; +import { ClientSession, maybeClearPinnedConnection } from '../sessions'; +import { + AnyError, + MongoRuntimeError, + MongoNetworkError, + MongoInvalidArgumentError, + MongoCursorExhaustedError, + MongoTailableCursorError, + MongoCursorInUseError +} from '../error'; +import { ReadPreference, ReadPreferenceLike } from '../read_preference'; +import type { Server } from '../sdam/server'; +import type { Topology } from '../sdam/topology'; +import { Readable, Transform } from 'stream'; +import type { ExecutionResult } from '../operations/execute_operation'; +import { ReadConcern, ReadConcernLike } from '../read_concern'; +import { TODO_NODE_3286, TypedEventEmitter } from '../mongo_types'; + +/** @internal */ +const kId = Symbol('id'); +/** @internal */ +const kDocuments = Symbol('documents'); +/** @internal */ +const kServer = Symbol('server'); +/** @internal */ +const kNamespace = Symbol('namespace'); +/** @internal */ +const kTopology = Symbol('topology'); +/** @internal */ +const kSession = Symbol('session'); +/** @internal */ +const kOptions = Symbol('options'); +/** @internal */ +const kTransform = Symbol('transform'); +/** @internal */ +const kInitialized = Symbol('initialized'); +/** @internal */ +const kClosed = Symbol('closed'); +/** @internal */ +const kKilled = Symbol('killed'); + +/** @public */ +export const CURSOR_FLAGS = [ + 'tailable', + 'oplogReplay', + 'noCursorTimeout', + 'awaitData', + 'exhaust', + 'partial' +] as const; + +/** @public + * @deprecated This interface is deprecated */ +export interface CursorCloseOptions { + /** Bypass calling killCursors when closing the cursor. 
*/ + /** @deprecated the skipKillCursors option is deprecated */ + skipKillCursors?: boolean; +} + +/** @public */ +export interface CursorStreamOptions { + /** A transformation method applied to each document emitted by the stream */ + transform?(doc: Document): Document; +} + +/** @public */ +export type CursorFlag = typeof CURSOR_FLAGS[number]; + +/** @public */ +export interface AbstractCursorOptions extends BSONSerializeOptions { + session?: ClientSession; + readPreference?: ReadPreferenceLike; + readConcern?: ReadConcernLike; + batchSize?: number; + maxTimeMS?: number; + comment?: Document | string; + tailable?: boolean; + awaitData?: boolean; + noCursorTimeout?: boolean; +} + +/** @internal */ +export type InternalAbstractCursorOptions = Omit & { + // resolved + readPreference: ReadPreference; + readConcern?: ReadConcern; + + // cursor flags, some are deprecated + oplogReplay?: boolean; + exhaust?: boolean; + partial?: boolean; +}; + +/** @public */ +export type AbstractCursorEvents = { + [AbstractCursor.CLOSE](): void; +}; + +/** @public */ +export abstract class AbstractCursor< + TSchema = any, + CursorEvents extends AbstractCursorEvents = AbstractCursorEvents +> extends TypedEventEmitter { + /** @internal */ + [kId]?: Long; + /** @internal */ + [kSession]?: ClientSession; + /** @internal */ + [kServer]?: Server; + /** @internal */ + [kNamespace]: MongoDBNamespace; + /** @internal */ + [kDocuments]: TSchema[]; + /** @internal */ + [kTopology]: Topology; + /** @internal */ + [kTransform]?: (doc: TSchema) => Document; + /** @internal */ + [kInitialized]: boolean; + /** @internal */ + [kClosed]: boolean; + /** @internal */ + [kKilled]: boolean; + /** @internal */ + [kOptions]: InternalAbstractCursorOptions; + + /** @event */ + static readonly CLOSE = 'close' as const; + + /** @internal */ + constructor( + topology: Topology, + namespace: MongoDBNamespace, + options: AbstractCursorOptions = {} + ) { + super(); + + this[kTopology] = topology; + this[kNamespace] = namespace; + this[kDocuments] = []; // TODO: https://github.com/microsoft/TypeScript/issues/36230 + this[kInitialized] = false; + this[kClosed] = false; + this[kKilled] = false; + this[kOptions] = { + readPreference: + options.readPreference && options.readPreference instanceof ReadPreference + ? 
options.readPreference + : ReadPreference.primary, + ...pluckBSONSerializeOptions(options) + }; + + const readConcern = ReadConcern.fromOptions(options); + if (readConcern) { + this[kOptions].readConcern = readConcern; + } + + if (typeof options.batchSize === 'number') { + this[kOptions].batchSize = options.batchSize; + } + + if (options.comment != null) { + this[kOptions].comment = options.comment; + } + + if (typeof options.maxTimeMS === 'number') { + this[kOptions].maxTimeMS = options.maxTimeMS; + } + + if (options.session instanceof ClientSession) { + this[kSession] = options.session; + } + } + + get id(): Long | undefined { + return this[kId]; + } + + /** @internal */ + get topology(): Topology { + return this[kTopology]; + } + + /** @internal */ + get server(): Server | undefined { + return this[kServer]; + } + + get namespace(): MongoDBNamespace { + return this[kNamespace]; + } + + get readPreference(): ReadPreference { + return this[kOptions].readPreference; + } + + get readConcern(): ReadConcern | undefined { + return this[kOptions].readConcern; + } + + /** @internal */ + get session(): ClientSession | undefined { + return this[kSession]; + } + + set session(clientSession: ClientSession | undefined) { + this[kSession] = clientSession; + } + + /** @internal */ + get cursorOptions(): InternalAbstractCursorOptions { + return this[kOptions]; + } + + get closed(): boolean { + return this[kClosed]; + } + + get killed(): boolean { + return this[kKilled]; + } + + get loadBalanced(): boolean { + return this[kTopology].loadBalanced; + } + + /** Returns current buffered documents length */ + bufferedCount(): number { + return this[kDocuments].length; + } + + /** Returns current buffered documents */ + readBufferedDocuments(number?: number): TSchema[] { + return this[kDocuments].splice(0, number ?? this[kDocuments].length); + } + + [Symbol.asyncIterator](): AsyncIterator { + return { + next: () => + this.next().then(value => + value != null ? { value, done: false } : { value: undefined, done: true } + ) + }; + } + + stream(options?: CursorStreamOptions): Readable { + if (options?.transform) { + const transform = options.transform; + const readable = makeCursorStream(this); + + return readable.pipe( + new Transform({ + objectMode: true, + highWaterMark: 1, + transform(chunk, _, callback) { + try { + const transformed = transform(chunk); + callback(undefined, transformed); + } catch (err) { + callback(err); + } + } + }) + ); + } + + return makeCursorStream(this); + } + + hasNext(): Promise; + hasNext(callback: Callback): void; + hasNext(callback?: Callback): Promise | void { + return maybePromise(callback, done => { + if (this[kId] === Long.ZERO) { + return done(undefined, false); + } + + if (this[kDocuments].length) { + return done(undefined, true); + } + + next(this, true, (err, doc) => { + if (err) return done(err); + + if (doc) { + this[kDocuments].unshift(doc); + done(undefined, true); + return; + } + + done(undefined, false); + }); + }); + } + + /** Get the next available document from the cursor, returns null if no more documents are available. 
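+   *
+   * For example (a hedged usage sketch, assuming a cursor obtained from `collection.find()`;
+   * `qty` is a hypothetical field):
+   * ```js
+   * const cursor = collection.find({ qty: { $gt: 5 } });
+   * while (await cursor.hasNext()) {
+   *   const doc = await cursor.next();
+   *   console.log(doc);
+   * }
+   * ```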
*/ + next(): Promise; + next(callback: Callback): void; + next(callback?: Callback): Promise | void; + next(callback?: Callback): Promise | void { + return maybePromise(callback, done => { + if (this[kId] === Long.ZERO) { + return done(new MongoCursorExhaustedError()); + } + + next(this, true, done); + }); + } + + /** + * Try to get the next available document from the cursor or `null` if an empty batch is returned + */ + tryNext(): Promise; + tryNext(callback: Callback): void; + tryNext(callback?: Callback): Promise | void { + return maybePromise(callback, done => { + if (this[kId] === Long.ZERO) { + return done(new MongoCursorExhaustedError()); + } + + next(this, false, done); + }); + } + + /** + * Iterates over all the documents for this cursor using the iterator, callback pattern. + * + * @param iterator - The iteration callback. + * @param callback - The end callback. + */ + forEach(iterator: (doc: TSchema) => boolean | void): Promise; + forEach(iterator: (doc: TSchema) => boolean | void, callback: Callback): void; + forEach( + iterator: (doc: TSchema) => boolean | void, + callback?: Callback + ): Promise | void { + if (typeof iterator !== 'function') { + throw new MongoInvalidArgumentError('Argument "iterator" must be a function'); + } + return maybePromise(callback, done => { + const transform = this[kTransform]; + const fetchDocs = () => { + next(this, true, (err, doc) => { + if (err || doc == null) return done(err); + let result; + // NOTE: no need to transform because `next` will do this automatically + try { + result = iterator(doc); // TODO(NODE-3283): Improve transform typing + } catch (error) { + return done(error); + } + + if (result === false) return done(); + + // these do need to be transformed since they are copying the rest of the batch + const internalDocs = this[kDocuments].splice(0, this[kDocuments].length); + for (let i = 0; i < internalDocs.length; ++i) { + try { + result = iterator( + (transform ? transform(internalDocs[i]) : internalDocs[i]) as TSchema // TODO(NODE-3283): Improve transform typing + ); + } catch (error) { + return done(error); + } + if (result === false) return done(); + } + + fetchDocs(); + }); + }; + + fetchDocs(); + }); + } + + close(): void; + close(callback: Callback): void; + /** + * @deprecated options argument is deprecated + */ + close(options: CursorCloseOptions): Promise; + /** + * @deprecated options argument is deprecated + */ + close(options: CursorCloseOptions, callback: Callback): void; + close(options?: CursorCloseOptions | Callback, callback?: Callback): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + const needsToEmitClosed = !this[kClosed]; + this[kClosed] = true; + + return maybePromise(callback, done => cleanupCursor(this, { needsToEmitClosed }, done)); + } + + /** + * Returns an array of documents. The caller is responsible for making sure that there + * is enough memory to store the results. Note that the array only contains partial + * results when this cursor had been previously accessed. In that case, + * cursor.rewind() can be used to reset the cursor. + * + * @param callback - The result callback. 
+ */ + toArray(): Promise; + toArray(callback: Callback): void; + toArray(callback?: Callback): Promise | void { + return maybePromise(callback, done => { + const docs: TSchema[] = []; + const transform = this[kTransform]; + const fetchDocs = () => { + // NOTE: if we add a `nextBatch` then we should use it here + next(this, true, (err, doc) => { + if (err) return done(err); + if (doc == null) return done(undefined, docs); + + // NOTE: no need to transform because `next` will do this automatically + docs.push(doc); + + // these do need to be transformed since they are copying the rest of the batch + const internalDocs = ( + transform + ? this[kDocuments].splice(0, this[kDocuments].length).map(transform) + : this[kDocuments].splice(0, this[kDocuments].length) + ) as TSchema[]; // TODO(NODE-3283): Improve transform typing + + if (internalDocs) { + docs.push(...internalDocs); + } + + fetchDocs(); + }); + }; + + fetchDocs(); + }); + } + + /** + * Add a cursor flag to the cursor + * + * @param flag - The flag to set, must be one of following ['tailable', 'oplogReplay', 'noCursorTimeout', 'awaitData', 'partial' -. + * @param value - The flag boolean value. + */ + addCursorFlag(flag: CursorFlag, value: boolean): this { + assertUninitialized(this); + if (!CURSOR_FLAGS.includes(flag)) { + throw new MongoInvalidArgumentError(`Flag ${flag} is not one of ${CURSOR_FLAGS}`); + } + + if (typeof value !== 'boolean') { + throw new MongoInvalidArgumentError(`Flag ${flag} must be a boolean value`); + } + + this[kOptions][flag] = value; + return this; + } + + /** + * Map all documents using the provided function + * If there is a transform set on the cursor, that will be called first and the result passed to + * this function's transform. + * + * @remarks + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling map, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor = coll.find(); + * const mappedCursor: FindCursor = cursor.map(doc => Object.keys(doc).length); + * const keyCounts: number[] = await mappedCursor.toArray(); // cursor.toArray() still returns Document[] + * ``` + * @param transform - The mapping transformation method. + */ + map(transform: (doc: TSchema) => T): AbstractCursor { + assertUninitialized(this); + const oldTransform = this[kTransform] as (doc: TSchema) => TSchema; // TODO(NODE-3283): Improve transform typing + if (oldTransform) { + this[kTransform] = doc => { + return transform(oldTransform(doc)); + }; + } else { + this[kTransform] = transform; + } + + return this as unknown as AbstractCursor; + } + + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. + */ + withReadPreference(readPreference: ReadPreferenceLike): this { + assertUninitialized(this); + if (readPreference instanceof ReadPreference) { + this[kOptions].readPreference = readPreference; + } else if (typeof readPreference === 'string') { + this[kOptions].readPreference = ReadPreference.fromString(readPreference); + } else { + throw new MongoInvalidArgumentError(`Invalid read preference: ${readPreference}`); + } + + return this; + } + + /** + * Set the ReadPreference for the cursor. + * + * @param readPreference - The new read preference for the cursor. 
+ */ + withReadConcern(readConcern: ReadConcernLike): this { + assertUninitialized(this); + const resolvedReadConcern = ReadConcern.fromOptions({ readConcern }); + if (resolvedReadConcern) { + this[kOptions].readConcern = resolvedReadConcern; + } + + return this; + } + + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value: number): this { + assertUninitialized(this); + if (typeof value !== 'number') { + throw new MongoInvalidArgumentError('Argument for maxTimeMS must be a number'); + } + + this[kOptions].maxTimeMS = value; + return this; + } + + /** + * Set the batch size for the cursor. + * + * @param value - The number of documents to return per batch. See {@link https://docs.mongodb.com/manual/reference/command/find/|find command documentation}. + */ + batchSize(value: number): this { + assertUninitialized(this); + if (this[kOptions].tailable) { + throw new MongoTailableCursorError('Tailable cursor does not support batchSize'); + } + + if (typeof value !== 'number') { + throw new MongoInvalidArgumentError('Operation "batchSize" requires an integer'); + } + + this[kOptions].batchSize = value; + return this; + } + + /** + * Rewind this cursor to its uninitialized state. Any options that are present on the cursor will + * remain in effect. Iterating this cursor will cause new queries to be sent to the server, even + * if the resultant data has already been retrieved by this cursor. + */ + rewind(): void { + if (!this[kInitialized]) { + return; + } + + this[kId] = undefined; + this[kDocuments] = []; + this[kClosed] = false; + this[kKilled] = false; + this[kInitialized] = false; + + const session = this[kSession]; + if (session) { + // We only want to end this session if we created it, and it hasn't ended yet + if (session.explicit === false && !session.hasEnded) { + session.endSession(); + } + + this[kSession] = undefined; + } + } + + /** + * Returns a new uninitialized copy of this cursor, with options matching those that have been set on the current instance + */ + abstract clone(): AbstractCursor; + + /** @internal */ + abstract _initialize( + session: ClientSession | undefined, + callback: Callback + ): void; + + /** @internal */ + _getMore(batchSize: number, callback: Callback): void { + const cursorId = this[kId]; + const cursorNs = this[kNamespace]; + const server = this[kServer]; + + if (cursorId == null) { + callback(new MongoRuntimeError('Unable to iterate cursor with no id')); + return; + } + + if (server == null) { + callback(new MongoRuntimeError('Unable to iterate cursor without selected server')); + return; + } + + server.getMore( + cursorNs, + cursorId, + { + ...this[kOptions], + session: this[kSession], + batchSize + }, + callback + ); + } +} + +function nextDocument(cursor: AbstractCursor): T | null | undefined { + if (cursor[kDocuments] == null || !cursor[kDocuments].length) { + return null; + } + + const doc = cursor[kDocuments].shift(); + if (doc) { + const transform = cursor[kTransform]; + if (transform) { + return transform(doc) as T; + } + + return doc; + } + + return null; +} + +function next(cursor: AbstractCursor, blocking: boolean, callback: Callback): void { + const cursorId = cursor[kId]; + if (cursor.closed) { + return callback(undefined, null); + } + + if (cursor[kDocuments] && cursor[kDocuments].length) { + callback(undefined, nextDocument(cursor)); + return; + } + + if 
(cursorId == null) { + // All cursors must operate within a session, one must be made implicitly if not explicitly provided + if (cursor[kSession] == null && cursor[kTopology].hasSessionSupport()) { + cursor[kSession] = cursor[kTopology].startSession({ owner: cursor, explicit: false }); + } + + cursor._initialize(cursor[kSession], (err, state) => { + if (state) { + const response = state.response; + cursor[kServer] = state.server; + cursor[kSession] = state.session; + + if (response.cursor) { + cursor[kId] = + typeof response.cursor.id === 'number' + ? Long.fromNumber(response.cursor.id) + : response.cursor.id; + + if (response.cursor.ns) { + cursor[kNamespace] = ns(response.cursor.ns); + } + + cursor[kDocuments] = response.cursor.firstBatch; + } else { + // NOTE: This is for support of older servers (<3.2) which do not use commands + cursor[kId] = + typeof response.cursorId === 'number' + ? Long.fromNumber(response.cursorId) + : response.cursorId; + cursor[kDocuments] = response.documents; + } + + // When server responses return without a cursor document, we close this cursor + // and return the raw server response. This is often the case for explain commands + // for example + if (cursor[kId] == null) { + cursor[kId] = Long.ZERO; + // TODO(NODE-3286): ExecutionResult needs to accept a generic parameter + cursor[kDocuments] = [state.response as TODO_NODE_3286]; + } + } + + // the cursor is now initialized, even if an error occurred or it is dead + cursor[kInitialized] = true; + + if (err || cursorIsDead(cursor)) { + return cleanupCursor(cursor, { error: err }, () => callback(err, nextDocument(cursor))); + } + + next(cursor, blocking, callback); + }); + + return; + } + + if (cursorIsDead(cursor)) { + return cleanupCursor(cursor, undefined, () => callback(undefined, null)); + } + + // otherwise need to call getMore + const batchSize = cursor[kOptions].batchSize || 1000; + cursor._getMore(batchSize, (err, response) => { + if (response) { + const cursorId = + typeof response.cursor.id === 'number' + ? Long.fromNumber(response.cursor.id) + : response.cursor.id; + + cursor[kDocuments] = response.cursor.nextBatch; + cursor[kId] = cursorId; + } + + if (err || cursorIsDead(cursor)) { + return cleanupCursor(cursor, { error: err }, () => callback(err, nextDocument(cursor))); + } + + if (cursor[kDocuments].length === 0 && blocking === false) { + return callback(undefined, null); + } + + next(cursor, blocking, callback); + }); +} + +function cursorIsDead(cursor: AbstractCursor): boolean { + const cursorId = cursor[kId]; + return !!cursorId && cursorId.isZero(); +} + +function cleanupCursor( + cursor: AbstractCursor, + options: { error?: AnyError | undefined; needsToEmitClosed?: boolean } | undefined, + callback: Callback +): void { + const cursorId = cursor[kId]; + const cursorNs = cursor[kNamespace]; + const server = cursor[kServer]; + const session = cursor[kSession]; + const error = options?.error; + const needsToEmitClosed = options?.needsToEmitClosed ?? 
cursor[kDocuments].length === 0; + + if (error) { + if (cursor.loadBalanced && error instanceof MongoNetworkError) { + return completeCleanup(); + } + } + + if (cursorId == null || server == null || cursorId.isZero() || cursorNs == null) { + if (needsToEmitClosed) { + cursor[kClosed] = true; + cursor[kId] = Long.ZERO; + cursor.emit(AbstractCursor.CLOSE); + } + + if (session) { + if (session.owner === cursor) { + return session.endSession({ error }, callback); + } + + if (!session.inTransaction()) { + maybeClearPinnedConnection(session, { error }); + } + } + + return callback(); + } + + function completeCleanup() { + if (session) { + if (session.owner === cursor) { + return session.endSession({ error }, () => { + cursor.emit(AbstractCursor.CLOSE); + callback(); + }); + } + + if (!session.inTransaction()) { + maybeClearPinnedConnection(session, { error }); + } + } + + cursor.emit(AbstractCursor.CLOSE); + return callback(); + } + + cursor[kKilled] = true; + server.killCursors( + cursorNs, + [cursorId], + { ...pluckBSONSerializeOptions(cursor[kOptions]), session }, + () => completeCleanup() + ); +} + +/** @internal */ +export function assertUninitialized(cursor: AbstractCursor): void { + if (cursor[kInitialized]) { + throw new MongoCursorInUseError(); + } +} + +function makeCursorStream(cursor: AbstractCursor) { + const readable = new Readable({ + objectMode: true, + autoDestroy: false, + highWaterMark: 1 + }); + + let initialized = false; + let reading = false; + let needToClose = true; // NOTE: we must close the cursor if we never read from it, use `_construct` in future node versions + + readable._read = function () { + if (initialized === false) { + needToClose = false; + initialized = true; + } + + if (!reading) { + reading = true; + readNext(); + } + }; + + readable._destroy = function (error, cb) { + if (needToClose) { + cursor.close(err => process.nextTick(cb, err || error)); + } else { + cb(error); + } + }; + + function readNext() { + needToClose = false; + next(cursor, true, (err, result) => { + needToClose = err ? !cursor.closed : result != null; + + if (err) { + // NOTE: This is questionable, but we have a test backing the behavior. It seems the + // desired behavior is that a stream ends cleanly when a user explicitly closes + // a client during iteration. Alternatively, we could do the "right" thing and + // propagate the error message by removing this special case. + if (err.message.match(/server is closed/)) { + cursor.close(); + return readable.push(null); + } + + // NOTE: This is also perhaps questionable. The rationale here is that these errors tend + // to be "operation interrupted", where a cursor has been closed but there is an + // active getMore in-flight. This used to check if the cursor was killed but once + // that changed to happen in cleanup legitimate errors would not destroy the + // stream. There are change streams test specifically test these cases. 
+ if (err.message.match(/interrupted/)) { + return readable.push(null); + } + + return readable.destroy(err); + } + + if (result == null) { + readable.push(null); + } else if (readable.destroyed) { + cursor.close(); + } else { + if (readable.push(result)) { + return readNext(); + } + + reading = false; + } + }); + } + + return readable; +} diff --git a/node_modules/mongodb/src/cursor/aggregation_cursor.ts b/node_modules/mongodb/src/cursor/aggregation_cursor.ts new file mode 100644 index 000000000..ffe658e34 --- /dev/null +++ b/node_modules/mongodb/src/cursor/aggregation_cursor.ts @@ -0,0 +1,219 @@ +import { AggregateOperation, AggregateOptions } from '../operations/aggregate'; +import { AbstractCursor, assertUninitialized } from './abstract_cursor'; +import { executeOperation, ExecutionResult } from '../operations/execute_operation'; +import { mergeOptions } from '../utils'; +import type { Document } from '../bson'; +import type { Sort } from '../sort'; +import type { Topology } from '../sdam/topology'; +import type { Callback, MongoDBNamespace } from '../utils'; +import type { ClientSession } from '../sessions'; +import type { AbstractCursorOptions } from './abstract_cursor'; +import type { ExplainVerbosityLike } from '../explain'; + +/** @public */ +export interface AggregationCursorOptions extends AbstractCursorOptions, AggregateOptions {} + +/** @internal */ +const kPipeline = Symbol('pipeline'); +/** @internal */ +const kOptions = Symbol('options'); + +/** + * The **AggregationCursor** class is an internal class that embodies an aggregation cursor on MongoDB + * allowing for iteration over the results returned from the underlying query. It supports + * one by one document iteration, conversion to an array or can be iterated as a Node 4.X + * or higher stream + * @public + */ +export class AggregationCursor extends AbstractCursor { + /** @internal */ + [kPipeline]: Document[]; + /** @internal */ + [kOptions]: AggregateOptions; + + /** @internal */ + constructor( + topology: Topology, + namespace: MongoDBNamespace, + pipeline: Document[] = [], + options: AggregateOptions = {} + ) { + super(topology, namespace, options); + + this[kPipeline] = pipeline; + this[kOptions] = options; + } + + get pipeline(): Document[] { + return this[kPipeline]; + } + + clone(): AggregationCursor { + const clonedOptions = mergeOptions({}, this[kOptions]); + delete clonedOptions.session; + return new AggregationCursor(this.topology, this.namespace, this[kPipeline], { + ...clonedOptions + }); + } + + map(transform: (doc: TSchema) => T): AggregationCursor { + return super.map(transform) as AggregationCursor; + } + + /** @internal */ + _initialize(session: ClientSession | undefined, callback: Callback): void { + const aggregateOperation = new AggregateOperation(this.namespace, this[kPipeline], { + ...this[kOptions], + ...this.cursorOptions, + session + }); + + executeOperation(this.topology, aggregateOperation, (err, response) => { + if (err || response == null) return callback(err); + + // TODO: NODE-2882 + callback(undefined, { server: aggregateOperation.server, session, response }); + }); + } + + /** Execute the explain for the cursor */ + explain(): Promise; + explain(callback: Callback): void; + explain(verbosity: ExplainVerbosityLike): Promise; + explain( + verbosity?: ExplainVerbosityLike | Callback, + callback?: Callback + ): Promise | void { + if (typeof verbosity === 'function') (callback = verbosity), (verbosity = true); + if (verbosity == null) verbosity = true; + + return executeOperation( + 
this.topology, + new AggregateOperation(this.namespace, this[kPipeline], { + ...this[kOptions], // NOTE: order matters here, we may need to refine this + ...this.cursorOptions, + explain: verbosity + }), + callback + ); + } + + /** Add a group stage to the aggregation pipeline */ + group($group: Document): AggregationCursor; + group($group: Document): this { + assertUninitialized(this); + this[kPipeline].push({ $group }); + return this; + } + + /** Add a limit stage to the aggregation pipeline */ + limit($limit: number): this { + assertUninitialized(this); + this[kPipeline].push({ $limit }); + return this; + } + + /** Add a match stage to the aggregation pipeline */ + match($match: Document): this { + assertUninitialized(this); + this[kPipeline].push({ $match }); + return this; + } + + /** Add an out stage to the aggregation pipeline */ + out($out: { db: string; coll: string } | string): this { + assertUninitialized(this); + this[kPipeline].push({ $out }); + return this; + } + + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: AggregationCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: AggregationCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. 
+ * Take note of the following example: + * + * @example + * ```typescript + * const cursor: AggregationCursor<{ a: number; b: string }> = coll.aggregate([]); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.aggregate().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project($project: Document): AggregationCursor { + assertUninitialized(this); + this[kPipeline].push({ $project }); + return this as unknown as AggregationCursor; + } + + /** Add a lookup stage to the aggregation pipeline */ + lookup($lookup: Document): this { + assertUninitialized(this); + this[kPipeline].push({ $lookup }); + return this; + } + + /** Add a redact stage to the aggregation pipeline */ + redact($redact: Document): this { + assertUninitialized(this); + this[kPipeline].push({ $redact }); + return this; + } + + /** Add a skip stage to the aggregation pipeline */ + skip($skip: number): this { + assertUninitialized(this); + this[kPipeline].push({ $skip }); + return this; + } + + /** Add a sort stage to the aggregation pipeline */ + sort($sort: Sort): this { + assertUninitialized(this); + this[kPipeline].push({ $sort }); + return this; + } + + /** Add a unwind stage to the aggregation pipeline */ + unwind($unwind: Document | string): this { + assertUninitialized(this); + this[kPipeline].push({ $unwind }); + return this; + } + + // deprecated methods + /** @deprecated Add a geoNear stage to the aggregation pipeline */ + geoNear($geoNear: Document): this { + assertUninitialized(this); + this[kPipeline].push({ $geoNear }); + return this; + } +} diff --git a/node_modules/mongodb/src/cursor/find_cursor.ts b/node_modules/mongodb/src/cursor/find_cursor.ts new file mode 100644 index 000000000..2c9a04d3b --- /dev/null +++ b/node_modules/mongodb/src/cursor/find_cursor.ts @@ -0,0 +1,464 @@ +import type { Document } from '../bson'; +import { MongoInvalidArgumentError, MongoTailableCursorError } from '../error'; +import type { ExplainVerbosityLike } from '../explain'; +import { CountOperation, CountOptions } from '../operations/count'; +import { executeOperation, ExecutionResult } from '../operations/execute_operation'; +import { FindOperation, FindOptions } from '../operations/find'; +import { mergeOptions } from '../utils'; +import type { Hint } from '../operations/operation'; +import type { CollationOptions } from '../operations/command'; +import type { Topology } from '../sdam/topology'; +import type { ClientSession } from '../sessions'; +import { formatSort, Sort, SortDirection } from '../sort'; +import type { Callback, MongoDBNamespace } from '../utils'; +import { AbstractCursor, assertUninitialized } from './abstract_cursor'; + +/** @internal */ +const kFilter = Symbol('filter'); +/** @internal */ +const kNumReturned = Symbol('numReturned'); +/** @internal */ +const kBuiltOptions = Symbol('builtOptions'); + +/** @public Flags allowed for cursor */ +export const FLAGS = [ + 'tailable', + 'oplogReplay', + 'noCursorTimeout', + 'awaitData', + 'exhaust', + 'partial' +] as const; + +/** @public */ +export class FindCursor extends AbstractCursor { + /** @internal */ + [kFilter]: Document; + /** @internal */ + [kNumReturned]?: number; + /** @internal */ + [kBuiltOptions]: FindOptions; + + /** @internal */ + constructor( + topology: Topology, + namespace: MongoDBNamespace, + filter: 
Document | undefined, + options: FindOptions = {} + ) { + super(topology, namespace, options); + + this[kFilter] = filter || {}; + this[kBuiltOptions] = options; + + if (options.sort != null) { + this[kBuiltOptions].sort = formatSort(options.sort); + } + } + + clone(): FindCursor { + const clonedOptions = mergeOptions({}, this[kBuiltOptions]); + delete clonedOptions.session; + return new FindCursor(this.topology, this.namespace, this[kFilter], { + ...clonedOptions + }); + } + + map(transform: (doc: TSchema) => T): FindCursor { + return super.map(transform) as FindCursor; + } + + /** @internal */ + _initialize(session: ClientSession | undefined, callback: Callback): void { + const findOperation = new FindOperation(undefined, this.namespace, this[kFilter], { + ...this[kBuiltOptions], // NOTE: order matters here, we may need to refine this + ...this.cursorOptions, + session + }); + + executeOperation(this.topology, findOperation, (err, response) => { + if (err || response == null) return callback(err); + + // TODO: We only need this for legacy queries that do not support `limit`, maybe + // the value should only be saved in those cases. + if (response.cursor) { + this[kNumReturned] = response.cursor.firstBatch.length; + } else { + this[kNumReturned] = response.documents ? response.documents.length : 0; + } + + // TODO: NODE-2882 + callback(undefined, { server: findOperation.server, session, response }); + }); + } + + /** @internal */ + _getMore(batchSize: number, callback: Callback): void { + // NOTE: this is to support client provided limits in pre-command servers + const numReturned = this[kNumReturned]; + if (numReturned) { + const limit = this[kBuiltOptions].limit; + batchSize = + limit && limit > 0 && numReturned + batchSize > limit ? limit - numReturned : batchSize; + + if (batchSize <= 0) { + return this.close(callback); + } + } + + super._getMore(batchSize, (err, response) => { + if (err) return callback(err); + + // TODO: wrap this in some logic to prevent it from happening if we don't need this support + if (response) { + this[kNumReturned] = this[kNumReturned] + response.cursor.nextBatch.length; + } + + callback(undefined, response); + }); + } + + /** Get the count of documents for this cursor */ + count(): Promise; + count(callback: Callback): void; + count(options: CountOptions): Promise; + count(options: CountOptions, callback: Callback): void; + count( + options?: CountOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'boolean') { + throw new MongoInvalidArgumentError('Invalid first parameter to count'); + } + + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? 
{}; + + return executeOperation( + this.topology, + new CountOperation(this.namespace, this[kFilter], { + ...this[kBuiltOptions], // NOTE: order matters here, we may need to refine this + ...this.cursorOptions, + ...options + }), + callback + ); + } + + /** Execute the explain for the cursor */ + explain(): Promise; + explain(callback: Callback): void; + explain(verbosity?: ExplainVerbosityLike): Promise; + explain( + verbosity?: ExplainVerbosityLike | Callback, + callback?: Callback + ): Promise | void { + if (typeof verbosity === 'function') (callback = verbosity), (verbosity = true); + if (verbosity == null) verbosity = true; + + return executeOperation( + this.topology, + new FindOperation(undefined, this.namespace, this[kFilter], { + ...this[kBuiltOptions], // NOTE: order matters here, we may need to refine this + ...this.cursorOptions, + explain: verbosity + }), + callback + ); + } + + /** Set the cursor query */ + filter(filter: Document): this { + assertUninitialized(this); + this[kFilter] = filter; + return this; + } + + /** + * Set the cursor hint + * + * @param hint - If specified, then the query system will only consider plans using the hinted index. + */ + hint(hint: Hint): this { + assertUninitialized(this); + this[kBuiltOptions].hint = hint; + return this; + } + + /** + * Set the cursor min + * + * @param min - Specify a $min value to specify the inclusive lower bound for a specific index in order to constrain the results of find(). The $min specifies the lower bound for all keys of a specific index in order. + */ + min(min: Document): this { + assertUninitialized(this); + this[kBuiltOptions].min = min; + return this; + } + + /** + * Set the cursor max + * + * @param max - Specify a $max value to specify the exclusive upper bound for a specific index in order to constrain the results of find(). The $max specifies the upper bound for all keys of a specific index in order. + */ + max(max: Document): this { + assertUninitialized(this); + this[kBuiltOptions].max = max; + return this; + } + + /** + * Set the cursor returnKey. + * If set to true, modifies the cursor to only return the index field or fields for the results of the query, rather than documents. + * If set to true and the query does not use an index to perform the read operation, the returned documents will not contain any fields. + * + * @param value - the returnKey value. + */ + returnKey(value: boolean): this { + assertUninitialized(this); + this[kBuiltOptions].returnKey = value; + return this; + } + + /** + * Modifies the output of a query by adding a field $recordId to matching documents. $recordId is the internal key which uniquely identifies a document in a collection. + * + * @param value - The $showDiskLoc option has now been deprecated and replaced with the showRecordId field. $showDiskLoc will still be accepted for OP_QUERY stye find. + */ + showRecordId(value: boolean): this { + assertUninitialized(this); + this[kBuiltOptions].showRecordId = value; + return this; + } + + /** + * Add a query modifier to the cursor query + * + * @param name - The query modifier (must start with $, such as $orderby etc) + * @param value - The modifier value. 
+ */ + addQueryModifier(name: string, value: string | boolean | number | Document): this { + assertUninitialized(this); + if (name[0] !== '$') { + throw new MongoInvalidArgumentError(`${name} is not a valid query modifier`); + } + + // Strip of the $ + const field = name.substr(1); + + // NOTE: consider some TS magic for this + switch (field) { + case 'comment': + this[kBuiltOptions].comment = value as string | Document; + break; + + case 'explain': + this[kBuiltOptions].explain = value as boolean; + break; + + case 'hint': + this[kBuiltOptions].hint = value as string | Document; + break; + + case 'max': + this[kBuiltOptions].max = value as Document; + break; + + case 'maxTimeMS': + this[kBuiltOptions].maxTimeMS = value as number; + break; + + case 'min': + this[kBuiltOptions].min = value as Document; + break; + + case 'orderby': + this[kBuiltOptions].sort = formatSort(value as string | Document); + break; + + case 'query': + this[kFilter] = value as Document; + break; + + case 'returnKey': + this[kBuiltOptions].returnKey = value as boolean; + break; + + case 'showDiskLoc': + this[kBuiltOptions].showRecordId = value as boolean; + break; + + default: + throw new MongoInvalidArgumentError(`Invalid query modifier: ${name}`); + } + + return this; + } + + /** + * Add a comment to the cursor query allowing for tracking the comment in the log. + * + * @param value - The comment attached to this query. + */ + comment(value: string): this { + assertUninitialized(this); + this[kBuiltOptions].comment = value; + return this; + } + + /** + * Set a maxAwaitTimeMS on a tailing cursor query to allow to customize the timeout value for the option awaitData (Only supported on MongoDB 3.2 or higher, ignored otherwise) + * + * @param value - Number of milliseconds to wait before aborting the tailed query. + */ + maxAwaitTimeMS(value: number): this { + assertUninitialized(this); + if (typeof value !== 'number') { + throw new MongoInvalidArgumentError('Argument for maxAwaitTimeMS must be a number'); + } + + this[kBuiltOptions].maxAwaitTimeMS = value; + return this; + } + + /** + * Set a maxTimeMS on the cursor query, allowing for hard timeout limits on queries (Only supported on MongoDB 2.6 or higher) + * + * @param value - Number of milliseconds to wait before aborting the query. + */ + maxTimeMS(value: number): this { + assertUninitialized(this); + if (typeof value !== 'number') { + throw new MongoInvalidArgumentError('Argument for maxTimeMS must be a number'); + } + + this[kBuiltOptions].maxTimeMS = value; + return this; + } + + /** + * Add a project stage to the aggregation pipeline + * + * @remarks + * In order to strictly type this function you must provide an interface + * that represents the effect of your projection on the result documents. + * + * By default chaining a projection to your cursor changes the returned type to the generic + * {@link Document} type. + * You should specify a parameterized type to have assertions on your final results. + * + * @example + * ```typescript + * // Best way + * const docs: FindCursor<{ a: number }> = cursor.project<{ a: number }>({ _id: 0, a: true }); + * // Flexible way + * const docs: FindCursor = cursor.project({ _id: 0, a: true }); + * ``` + * + * @remarks + * + * **Note for Typescript Users:** adding a transform changes the return type of the iteration of this cursor, + * it **does not** return a new instance of a cursor. 
This means when calling project, + * you should always assign the result to a new variable in order to get a correctly typed cursor variable. + * Take note of the following example: + * + * @example + * ```typescript + * const cursor: FindCursor<{ a: number; b: string }> = coll.find(); + * const projectCursor = cursor.project<{ a: number }>({ _id: 0, a: true }); + * const aPropOnlyArray: {a: number}[] = await projectCursor.toArray(); + * + * // or always use chaining and save the final cursor + * + * const cursor = coll.find().project<{ a: string }>({ + * _id: 0, + * a: { $convert: { input: '$a', to: 'string' } + * }}); + * ``` + */ + project(value: Document): FindCursor { + assertUninitialized(this); + this[kBuiltOptions].projection = value; + return this as unknown as FindCursor; + } + + /** + * Sets the sort order of the cursor query. + * + * @param sort - The key or keys set for the sort. + * @param direction - The direction of the sorting (1 or -1). + */ + sort(sort: Sort | string, direction?: SortDirection): this { + assertUninitialized(this); + if (this[kBuiltOptions].tailable) { + throw new MongoTailableCursorError('Tailable cursor does not support sorting'); + } + + this[kBuiltOptions].sort = formatSort(sort, direction); + return this; + } + + /** + * Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) + * + * @remarks + * {@link https://docs.mongodb.com/manual/reference/command/find/#find-cmd-allowdiskuse | find command allowDiskUse documentation} + */ + allowDiskUse(): this { + assertUninitialized(this); + if (!this[kBuiltOptions].sort) { + throw new MongoInvalidArgumentError('Option "allowDiskUse" requires a sort specification'); + } + this[kBuiltOptions].allowDiskUse = true; + return this; + } + + /** + * Set the collation options for the cursor. + * + * @param value - The cursor collation options (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). + */ + collation(value: CollationOptions): this { + assertUninitialized(this); + this[kBuiltOptions].collation = value; + return this; + } + + /** + * Set the limit for the cursor. + * + * @param value - The limit for the cursor query. + */ + limit(value: number): this { + assertUninitialized(this); + if (this[kBuiltOptions].tailable) { + throw new MongoTailableCursorError('Tailable cursor does not support limit'); + } + + if (typeof value !== 'number') { + throw new MongoInvalidArgumentError('Operation "limit" requires an integer'); + } + + this[kBuiltOptions].limit = value; + return this; + } + + /** + * Set the skip for the cursor. + * + * @param value - The skip for the cursor query. 
+ */ + skip(value: number): this { + assertUninitialized(this); + if (this[kBuiltOptions].tailable) { + throw new MongoTailableCursorError('Tailable cursor does not support skip'); + } + + if (typeof value !== 'number') { + throw new MongoInvalidArgumentError('Operation "skip" requires an integer'); + } + + this[kBuiltOptions].skip = value; + return this; + } +} diff --git a/node_modules/mongodb/src/db.ts b/node_modules/mongodb/src/db.ts new file mode 100644 index 000000000..2629cffe3 --- /dev/null +++ b/node_modules/mongodb/src/db.ts @@ -0,0 +1,756 @@ +import { + Callback, + resolveOptions, + filterOptions, + MongoDBNamespace, + getTopology, + DEFAULT_PK_FACTORY +} from './utils'; +import { AggregationCursor } from './cursor/aggregation_cursor'; +import { Document, BSONSerializeOptions, resolveBSONOptions } from './bson'; +import { ReadPreference, ReadPreferenceLike } from './read_preference'; +import { MongoAPIError, MongoInvalidArgumentError } from './error'; +import { Collection, CollectionOptions } from './collection'; +import { ChangeStream, ChangeStreamOptions } from './change_stream'; +import * as CONSTANTS from './constants'; +import { WriteConcern, WriteConcernOptions } from './write_concern'; +import { ReadConcern } from './read_concern'; +import { Logger, LoggerOptions } from './logger'; +import type { AggregateOptions } from './operations/aggregate'; +import { AddUserOperation, AddUserOptions } from './operations/add_user'; +import { CollectionsOperation } from './operations/collections'; +import { DbStatsOperation, DbStatsOptions } from './operations/stats'; +import { RunCommandOperation, RunCommandOptions } from './operations/run_command'; +import { CreateCollectionOperation, CreateCollectionOptions } from './operations/create_collection'; +import { + CreateIndexOperation, + IndexInformationOperation, + CreateIndexesOptions, + IndexSpecification +} from './operations/indexes'; +import { + DropCollectionOperation, + DropDatabaseOperation, + DropDatabaseOptions, + DropCollectionOptions +} from './operations/drop'; +import { + CollectionInfo, + ListCollectionsCursor, + ListCollectionsOptions +} from './operations/list_collections'; +import { ProfilingLevelOperation, ProfilingLevelOptions } from './operations/profiling_level'; +import { RemoveUserOperation, RemoveUserOptions } from './operations/remove_user'; +import { RenameOperation, RenameOptions } from './operations/rename'; +import { + SetProfilingLevelOperation, + SetProfilingLevelOptions, + ProfilingLevel +} from './operations/set_profiling_level'; +import { executeOperation } from './operations/execute_operation'; +import type { IndexInformationOptions } from './operations/common_functions'; +import type { MongoClient, PkFactory } from './mongo_client'; +import { Admin } from './admin'; +import type { TODO_NODE_3286 } from './mongo_types'; + +// Allowed parameters +const DB_OPTIONS_ALLOW_LIST = [ + 'writeConcern', + 'readPreference', + 'readPreferenceTags', + 'native_parser', + 'forceServerObjectId', + 'pkFactory', + 'serializeFunctions', + 'raw', + 'authSource', + 'ignoreUndefined', + 'readConcern', + 'retryMiliSeconds', + 'numberOfRetries', + 'loggerLevel', + 'logger', + 'promoteBuffers', + 'promoteLongs', + 'bsonRegExp', + 'promoteValues', + 'compression', + 'retryWrites' +]; + +/** @internal */ +export interface DbPrivate { + client: MongoClient; + options?: DbOptions; + logger: Logger; + readPreference?: ReadPreference; + pkFactory: PkFactory; + readConcern?: ReadConcern; + bsonOptions: BSONSerializeOptions; + 
writeConcern?: WriteConcern; + namespace: MongoDBNamespace; +} + +/** @public */ +export interface DbOptions extends BSONSerializeOptions, WriteConcernOptions, LoggerOptions { + /** If the database authentication is dependent on another databaseName. */ + authSource?: string; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; + /** The preferred read preference (ReadPreference.PRIMARY, ReadPreference.PRIMARY_PREFERRED, ReadPreference.SECONDARY, ReadPreference.SECONDARY_PREFERRED, ReadPreference.NEAREST). */ + readPreference?: ReadPreferenceLike; + /** A primary key factory object for generation of custom _id keys. */ + pkFactory?: PkFactory; + /** Specify a read concern for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcern; + /** Should retry failed writes */ + retryWrites?: boolean; +} + +/** + * The **Db** class is a class that represents a MongoDB Database. + * @public + * + * @example + * ```js + * const { MongoClient } = require('mongodb'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * // Select the database by name + * const testDb = client.db(dbName); + * client.close(); + * }); + * ``` + */ +export class Db { + /** @internal */ + s: DbPrivate; + + public static SYSTEM_NAMESPACE_COLLECTION = CONSTANTS.SYSTEM_NAMESPACE_COLLECTION; + public static SYSTEM_INDEX_COLLECTION = CONSTANTS.SYSTEM_INDEX_COLLECTION; + public static SYSTEM_PROFILE_COLLECTION = CONSTANTS.SYSTEM_PROFILE_COLLECTION; + public static SYSTEM_USER_COLLECTION = CONSTANTS.SYSTEM_USER_COLLECTION; + public static SYSTEM_COMMAND_COLLECTION = CONSTANTS.SYSTEM_COMMAND_COLLECTION; + public static SYSTEM_JS_COLLECTION = CONSTANTS.SYSTEM_JS_COLLECTION; + + /** + * Creates a new Db instance + * + * @param client - The MongoClient for the database. + * @param databaseName - The name of the database this instance represents. + * @param options - Optional settings for Db construction + */ + constructor(client: MongoClient, databaseName: string, options?: DbOptions) { + options = options ?? {}; + + // Filter the options + options = filterOptions(options, DB_OPTIONS_ALLOW_LIST); + + // Ensure we have a valid db name + validateDatabaseName(databaseName); + + // Internal state of the db object + this.s = { + // Client + client, + // Options + options, + // Logger instance + logger: new Logger('Db', options), + // Unpack read preference + readPreference: ReadPreference.fromOptions(options), + // Merge bson options + bsonOptions: resolveBSONOptions(options, client), + // Set up the primary key factory or fallback to ObjectId + pkFactory: options?.pkFactory ?? DEFAULT_PK_FACTORY, + // ReadConcern + readConcern: ReadConcern.fromOptions(options), + writeConcern: WriteConcern.fromOptions(options), + // Namespace + namespace: new MongoDBNamespace(databaseName) + }; + } + + get databaseName(): string { + return this.s.namespace.db; + } + + // Options + get options(): DbOptions | undefined { + return this.s.options; + } + + // slaveOk specified + get slaveOk(): boolean { + return this.s.readPreference?.preference !== 'primary' || false; + } + + get readConcern(): ReadConcern | undefined { + return this.s.readConcern; + } + + /** + * The current readPreference of the Db. 
If not explicitly defined for + * this Db, will be inherited from the parent MongoClient + */ + get readPreference(): ReadPreference { + if (this.s.readPreference == null) { + return this.s.client.readPreference; + } + + return this.s.readPreference; + } + + get bsonOptions(): BSONSerializeOptions { + return this.s.bsonOptions; + } + + // get the write Concern + get writeConcern(): WriteConcern | undefined { + return this.s.writeConcern; + } + + get namespace(): string { + return this.s.namespace.toString(); + } + + /** + * Create a new collection on a server with the specified options. Use this to create capped collections. + * More information about command options available at https://docs.mongodb.com/manual/reference/command/create/ + * + * @param name - The name of the collection to create + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + createCollection(name: string): Promise>; + createCollection( + name: string, + callback: Callback> + ): void; + createCollection( + name: string, + options: CreateCollectionOptions + ): Promise>; + createCollection( + name: string, + options: CreateCollectionOptions, + callback: Callback> + ): void; + createCollection( + name: string, + options?: CreateCollectionOptions | Callback, + callback?: Callback + ): Promise> | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new CreateCollectionOperation(this, name, resolveOptions(this, options)) as TODO_NODE_3286, + callback + ) as TODO_NODE_3286; + } + + /** + * Execute a command + * + * @remarks + * This command does not inherit options from the MongoClient. + * + * @param command - The command to run + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + command(command: Document): Promise; + command(command: Document, callback: Callback): void; + command(command: Document, options: RunCommandOptions): Promise; + command(command: Document, options: RunCommandOptions, callback: Callback): void; + command( + command: Document, + options?: RunCommandOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + // Intentionally, we do not inherit options from parent for this operation. + return executeOperation( + getTopology(this), + new RunCommandOperation(this, command, options ?? 
{}), + callback + ); + } + + /** + * Execute an aggregation framework pipeline against the database, needs MongoDB \>= 3.6 + * + * @param pipeline - An array of aggregation stages to be executed + * @param options - Optional settings for the command + */ + aggregate( + pipeline: Document[] = [], + options?: AggregateOptions + ): AggregationCursor { + if (arguments.length > 2) { + throw new MongoInvalidArgumentError('Method "db.aggregate()" accepts at most two arguments'); + } + if (typeof pipeline === 'function') { + throw new MongoInvalidArgumentError('Argument "pipeline" must not be function'); + } + if (typeof options === 'function') { + throw new MongoInvalidArgumentError('Argument "options" must not be function'); + } + + return new AggregationCursor( + getTopology(this), + this.s.namespace, + pipeline, + resolveOptions(this, options) + ); + } + + /** Return the Admin db instance */ + admin(): Admin { + return new Admin(this); + } + + /** + * Returns a reference to a MongoDB Collection. If it does not exist it will be created implicitly. + * + * @param name - the collection name we wish to access. + * @returns return the new Collection instance + */ + collection( + name: string, + options: CollectionOptions = {} + ): Collection { + if (typeof options === 'function') { + throw new MongoInvalidArgumentError('The callback form of this helper has been removed.'); + } + const finalOptions = resolveOptions(this, options); + return new Collection(this, name, finalOptions); + } + + /** + * Get all the db statistics. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + stats(): Promise; + stats(callback: Callback): void; + stats(options: DbStatsOptions): Promise; + stats(options: DbStatsOptions, callback: Callback): void; + stats( + options?: DbStatsOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + return executeOperation( + getTopology(this), + new DbStatsOperation(this, resolveOptions(this, options)), + callback + ); + } + + /** + * List all collections of this database with optional filter + * + * @param filter - Query to filter collections by + * @param options - Optional settings for the command + */ + listCollections( + filter: Document, + options: Exclude & { nameOnly: true } + ): ListCollectionsCursor>; + listCollections( + filter: Document, + options: Exclude & { nameOnly: false } + ): ListCollectionsCursor; + listCollections< + T extends Pick | CollectionInfo = + | Pick + | CollectionInfo + >(filter?: Document, options?: ListCollectionsOptions): ListCollectionsCursor; + listCollections< + T extends Pick | CollectionInfo = + | Pick + | CollectionInfo + >(filter: Document = {}, options: ListCollectionsOptions = {}): ListCollectionsCursor { + return new ListCollectionsCursor(this, filter, resolveOptions(this, options)); + } + + /** + * Rename a collection. + * + * @remarks + * This operation does not inherit options from the MongoClient. 
+ * + * @param fromCollection - Name of current collection to rename + * @param toCollection - New name of of the collection + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + renameCollection( + fromCollection: string, + toCollection: string + ): Promise>; + renameCollection( + fromCollection: string, + toCollection: string, + callback: Callback> + ): void; + renameCollection( + fromCollection: string, + toCollection: string, + options: RenameOptions + ): Promise>; + renameCollection( + fromCollection: string, + toCollection: string, + options: RenameOptions, + callback: Callback> + ): void; + renameCollection( + fromCollection: string, + toCollection: string, + options?: RenameOptions | Callback>, + callback?: Callback> + ): Promise> | void { + if (typeof options === 'function') (callback = options), (options = {}); + + // Intentionally, we do not inherit options from parent for this operation. + options = { ...options, readPreference: ReadPreference.PRIMARY }; + + // Add return new collection + options.new_collection = true; + + return executeOperation( + getTopology(this), + new RenameOperation( + this.collection(fromCollection) as TODO_NODE_3286, + toCollection, + options + ) as TODO_NODE_3286, + callback + ); + } + + /** + * Drop a collection from the database, removing it permanently. New accesses will create a new collection. + * + * @param name - Name of collection to drop + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropCollection(name: string): Promise; + dropCollection(name: string, callback: Callback): void; + dropCollection(name: string, options: DropCollectionOptions): Promise; + dropCollection(name: string, options: DropCollectionOptions, callback: Callback): void; + dropCollection( + name: string, + options?: DropCollectionOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new DropCollectionOperation(this, name, resolveOptions(this, options)), + callback + ); + } + + /** + * Drop a database, removing it permanently from the server. + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + dropDatabase(): Promise; + dropDatabase(callback: Callback): void; + dropDatabase(options: DropDatabaseOptions): Promise; + dropDatabase(options: DropDatabaseOptions, callback: Callback): void; + dropDatabase( + options?: DropDatabaseOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new DropDatabaseOperation(this, resolveOptions(this, options)), + callback + ); + } + + /** + * Fetch all collections for the current db. 
+ * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + collections(): Promise; + collections(callback: Callback): void; + collections(options: ListCollectionsOptions): Promise; + collections(options: ListCollectionsOptions, callback: Callback): void; + collections( + options?: ListCollectionsOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new CollectionsOperation(this, resolveOptions(this, options)), + callback + ); + } + + /** + * Creates an index on the db and collection. + * + * @param name - Name of the collection to create the index on. + * @param indexSpec - Specify the field to index, or an index specification + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + createIndex(name: string, indexSpec: IndexSpecification): Promise; + createIndex(name: string, indexSpec: IndexSpecification, callback?: Callback): void; + createIndex( + name: string, + indexSpec: IndexSpecification, + options: CreateIndexesOptions + ): Promise; + createIndex( + name: string, + indexSpec: IndexSpecification, + options: CreateIndexesOptions, + callback: Callback + ): void; + createIndex( + name: string, + indexSpec: IndexSpecification, + options?: CreateIndexesOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new CreateIndexOperation(this, name, indexSpec, resolveOptions(this, options)), + callback + ); + } + + /** + * Add a user to the database + * + * @param username - The username for the new user + * @param password - An optional password for the new user + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + addUser(username: string): Promise; + addUser(username: string, callback: Callback): void; + addUser(username: string, password: string): Promise; + addUser(username: string, password: string, callback: Callback): void; + addUser(username: string, options: AddUserOptions): Promise; + addUser(username: string, options: AddUserOptions, callback: Callback): void; + addUser(username: string, password: string, options: AddUserOptions): Promise; + addUser( + username: string, + password: string, + options: AddUserOptions, + callback: Callback + ): void; + addUser( + username: string, + password?: string | AddUserOptions | Callback, + options?: AddUserOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof password === 'function') { + (callback = password), (password = undefined), (options = {}); + } else if (typeof password !== 'string') { + if (typeof options === 'function') { + (callback = options), (options = password), (password = undefined); + } else { + (options = password), (callback = undefined), (password = undefined); + } + } else { + if (typeof options === 'function') (callback = options), (options = {}); + } + + return executeOperation( + getTopology(this), + new AddUserOperation(this, username, password, resolveOptions(this, options)), + callback + ); + } + + /** + * Remove a user from a database + * + * @param username - The username to remove + * @param options - Optional settings for the command 
+ * @param callback - An optional callback, a Promise will be returned if none is provided + */ + removeUser(username: string): Promise; + removeUser(username: string, callback: Callback): void; + removeUser(username: string, options: RemoveUserOptions): Promise; + removeUser(username: string, options: RemoveUserOptions, callback: Callback): void; + removeUser( + username: string, + options?: RemoveUserOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new RemoveUserOperation(this, username, resolveOptions(this, options)), + callback + ); + } + + /** + * Set the current profiling level of MongoDB + * + * @param level - The new profiling level (off, slow_only, all). + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + setProfilingLevel(level: ProfilingLevel): Promise; + setProfilingLevel(level: ProfilingLevel, callback: Callback): void; + setProfilingLevel( + level: ProfilingLevel, + options: SetProfilingLevelOptions + ): Promise; + setProfilingLevel( + level: ProfilingLevel, + options: SetProfilingLevelOptions, + callback: Callback + ): void; + setProfilingLevel( + level: ProfilingLevel, + options?: SetProfilingLevelOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new SetProfilingLevelOperation(this, level, resolveOptions(this, options)), + callback + ); + } + + /** + * Retrieve the current profiling Level for MongoDB + * + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + profilingLevel(): Promise; + profilingLevel(callback: Callback): void; + profilingLevel(options: ProfilingLevelOptions): Promise; + profilingLevel(options: ProfilingLevelOptions, callback: Callback): void; + profilingLevel( + options?: ProfilingLevelOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new ProfilingLevelOperation(this, resolveOptions(this, options)), + callback + ); + } + + /** + * Retrieves this collections index info. + * + * @param name - The name of the collection. + * @param options - Optional settings for the command + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + indexInformation(name: string): Promise; + indexInformation(name: string, callback: Callback): void; + indexInformation(name: string, options: IndexInformationOptions): Promise; + indexInformation( + name: string, + options: IndexInformationOptions, + callback: Callback + ): void; + indexInformation( + name: string, + options?: IndexInformationOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + + return executeOperation( + getTopology(this), + new IndexInformationOperation(this, name, resolveOptions(this, options)), + callback + ); + } + + /** + * Unref all sockets + * @deprecated This function is deprecated and will be removed in the next major version. 
+ */ + unref(): void { + getTopology(this).unref(); + } + + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this database. Will ignore all + * changes to system collections. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. + * @param options - Optional settings for the command + */ + watch( + pipeline: Document[] = [], + options: ChangeStreamOptions = {} + ): ChangeStream { + // Allow optionally not specifying a pipeline + if (!Array.isArray(pipeline)) { + options = pipeline; + pipeline = []; + } + + return new ChangeStream(this, pipeline, resolveOptions(this, options)); + } + + /** Return the db logger */ + getLogger(): Logger { + return this.s.logger; + } + + get logger(): Logger { + return this.s.logger; + } +} + +// TODO(NODE-3484): Refactor into MongoDBNamespace +// Validate the database name +function validateDatabaseName(databaseName: string) { + if (typeof databaseName !== 'string') + throw new MongoInvalidArgumentError('Database name must be a string'); + if (databaseName.length === 0) + throw new MongoInvalidArgumentError('Database name cannot be the empty string'); + if (databaseName === '$external') return; + + const invalidChars = [' ', '.', '$', '/', '\\']; + for (let i = 0; i < invalidChars.length; i++) { + if (databaseName.indexOf(invalidChars[i]) !== -1) + throw new MongoAPIError(`database names cannot contain the character '${invalidChars[i]}'`); + } +} diff --git a/node_modules/mongodb/src/deps.ts b/node_modules/mongodb/src/deps.ts new file mode 100644 index 000000000..35bd4ebf4 --- /dev/null +++ b/node_modules/mongodb/src/deps.ts @@ -0,0 +1,275 @@ +/* eslint-disable @typescript-eslint/no-var-requires */ +import { MongoMissingDependencyError } from './error'; +import type { MongoClient } from './mongo_client'; +import type { deserialize, Document, serialize } from './bson'; +import { Callback, parsePackageVersion } from './utils'; + +export const PKG_VERSION = Symbol('kPkgVersion'); + +function makeErrorModule(error: any) { + const props = error ? { kModuleError: error } : {}; + return new Proxy(props, { + get: (_: any, key: any) => { + if (key === 'kModuleError') { + return error; + } + throw error; + }, + set: () => { + throw error; + } + }); +} + +export let Kerberos: typeof import('kerberos') | { kModuleError: MongoMissingDependencyError } = + makeErrorModule( + new MongoMissingDependencyError( + 'Optional module `kerberos` not found. 
Please install it to enable kerberos authentication' + ) + ); + +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + Kerberos = require('kerberos'); +} catch {} // eslint-disable-line + +export interface KerberosClient { + step: (challenge: string, callback?: Callback) => Promise | void; + wrap: ( + challenge: string, + options?: { user: string }, + callback?: Callback + ) => Promise | void; + unwrap: (challenge: string, callback?: Callback) => Promise | void; +} + +type SnappyLib = { + [PKG_VERSION]: { major: number; minor: number; patch: number }; + + /** + * - Snappy 6.x takes a callback and returns void + * - Snappy 7.x returns a promise + * + * In order to support both we must check the return value of the function + * @param buf - Buffer to be compressed + * @param callback - ONLY USED IN SNAPPY 6.x + */ + compress(buf: Buffer): Promise; + compress(buf: Buffer, callback: (error?: Error, buffer?: Buffer) => void): Promise | void; + compress( + buf: Buffer, + callback?: (error?: Error, buffer?: Buffer) => void + ): Promise | void; + + /** + * - Snappy 6.x takes a callback and returns void + * - Snappy 7.x returns a promise + * + * In order to support both we must check the return value of the function + * @param buf - Buffer to be compressed + * @param callback - ONLY USED IN SNAPPY 6.x + */ + uncompress(buf: Buffer, opt: { asBuffer: true }): Promise; + uncompress( + buf: Buffer, + opt: { asBuffer: true }, + callback: (error?: Error, buffer?: Buffer) => void + ): Promise | void; + uncompress( + buf: Buffer, + opt: { asBuffer: true }, + callback?: (error?: Error, buffer?: Buffer) => void + ): Promise | void; +}; + +export let Snappy: SnappyLib | { kModuleError: MongoMissingDependencyError } = makeErrorModule( + new MongoMissingDependencyError( + 'Optional module `snappy` not found. Please install it to enable snappy compression' + ) +); + +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + Snappy = require('snappy'); + try { + (Snappy as any)[PKG_VERSION] = parsePackageVersion(require('snappy/package.json')); + } catch {} // eslint-disable-line +} catch {} // eslint-disable-line + +export let saslprep: typeof import('saslprep') | { kModuleError: MongoMissingDependencyError } = + makeErrorModule( + new MongoMissingDependencyError( + 'Optional module `saslprep` not found.' 
+ + ' Please install it to enable Stringprep Profile for User Names and Passwords' + ) + ); + +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + saslprep = require('saslprep'); +} catch {} // eslint-disable-line + +interface AWS4 { + /** + * Created these inline types to better assert future usage of this API + * @param options - options for request + * @param credentials - AWS credential details, sessionToken should be omitted entirely if its false-y + */ + sign( + options: { + path: '/'; + body: string; + host: string; + method: 'POST'; + headers: { + 'Content-Type': 'application/x-www-form-urlencoded'; + 'Content-Length': number; + 'X-MongoDB-Server-Nonce': string; + 'X-MongoDB-GS2-CB-Flag': 'n'; + }; + service: string; + region: string; + }, + credentials: + | { + accessKeyId: string; + secretAccessKey: string; + sessionToken: string; + } + | { + accessKeyId: string; + secretAccessKey: string; + } + | undefined + ): { + headers: { + Authorization: string; + 'X-Amz-Date': string; + }; + }; +} + +export let aws4: AWS4 | { kModuleError: MongoMissingDependencyError } = makeErrorModule( + new MongoMissingDependencyError( + 'Optional module `aws4` not found. Please install it to enable AWS authentication' + ) +); + +try { + // Ensure you always wrap an optional require in the try block NODE-3199 + aws4 = require('aws4'); +} catch {} // eslint-disable-line + +/** @public */ +export const AutoEncryptionLoggerLevel = Object.freeze({ + FatalError: 0, + Error: 1, + Warning: 2, + Info: 3, + Trace: 4 +} as const); + +/** @public */ +export type AutoEncryptionLoggerLevel = + typeof AutoEncryptionLoggerLevel[keyof typeof AutoEncryptionLoggerLevel]; + +/** @public */ +export interface AutoEncryptionOptions { + /** @internal */ + bson?: { serialize: typeof serialize; deserialize: typeof deserialize }; + /** @internal client for metadata lookups */ + metadataClient?: MongoClient; + /** A `MongoClient` used to fetch keys from a key vault */ + keyVaultClient?: MongoClient; + /** The namespace where keys are stored in the key vault */ + keyVaultNamespace?: string; + /** Configuration options that are used by specific KMS providers during key generation, encryption, and decryption. */ + kmsProviders?: { + /** Configuration options for using 'aws' as your KMS provider */ + aws?: { + /** The access key used for the AWS KMS provider */ + accessKeyId: string; + /** The secret access key used for the AWS KMS provider */ + secretAccessKey: string; + /** + * An optional AWS session token that will be used as the + * X-Amz-Security-Token header for AWS requests. + */ + sessionToken?: string; + }; + /** Configuration options for using 'local' as your KMS provider */ + local?: { + /** + * The master key used to encrypt/decrypt data keys. + * A 96-byte long Buffer or base64 encoded string. + */ + key: Buffer | string; + }; + /** Configuration options for using 'azure' as your KMS provider */ + azure?: { + /** The tenant ID identifies the organization for the account */ + tenantId: string; + /** The client ID to authenticate a registered application */ + clientId: string; + /** The client secret to authenticate a registered application */ + clientSecret: string; + /** + * If present, a host with optional port. E.g. "example.com" or "example.com:443". + * This is optional, and only needed if customer is using a non-commercial Azure instance + * (e.g. a government or China account, which use different URLs). 
+ * Defaults to "login.microsoftonline.com" + */ + identityPlatformEndpoint?: string | undefined; + }; + /** Configuration options for using 'gcp' as your KMS provider */ + gcp?: { + /** The service account email to authenticate */ + email: string; + /** A PKCS#8 encrypted key. This can either be a base64 string or a binary representation */ + privateKey: string | Buffer; + /** + * If present, a host with optional port. E.g. "example.com" or "example.com:443". + * Defaults to "oauth2.googleapis.com" + */ + endpoint?: string | undefined; + }; + }; + /** + * A map of namespaces to a local JSON schema for encryption + * + * **NOTE**: Supplying options.schemaMap provides more security than relying on JSON Schemas obtained from the server. + * It protects against a malicious server advertising a false JSON Schema, which could trick the client into sending decrypted data that should be encrypted. + * Schemas supplied in the schemaMap only apply to configuring automatic encryption for client side encryption. + * Other validation rules in the JSON schema will not be enforced by the driver and will result in an error. + */ + schemaMap?: Document; + /** Allows the user to bypass auto encryption, maintaining implicit decryption */ + bypassAutoEncryption?: boolean; + options?: { + /** An optional hook to catch logging messages from the underlying encryption engine */ + logger?: (level: AutoEncryptionLoggerLevel, message: string) => void; + }; + extraOptions?: { + /** + * A local process the driver communicates with to determine how to encrypt values in a command. + * Defaults to "mongodb://%2Fvar%2Fmongocryptd.sock" if domain sockets are available or "mongodb://localhost:27020" otherwise + */ + mongocryptdURI?: string; + /** If true, autoEncryption will not attempt to spawn a mongocryptd before connecting */ + mongocryptdBypassSpawn?: boolean; + /** The path to the mongocryptd executable on the system */ + mongocryptdSpawnPath?: string; + /** Command line arguments to use when auto-spawning a mongocryptd */ + mongocryptdSpawnArgs?: string[]; + }; +} + +/** @public */ +export interface AutoEncrypter { + // eslint-disable-next-line @typescript-eslint/no-misused-new + new (client: MongoClient, options: AutoEncryptionOptions): AutoEncrypter; + init(cb: Callback): void; + teardown(force: boolean, callback: Callback): void; + encrypt(ns: string, cmd: Document, options: any, callback: Callback): void; + decrypt(cmd: Document, options: any, callback: Callback): void; +} diff --git a/node_modules/mongodb/src/encrypter.ts b/node_modules/mongodb/src/encrypter.ts new file mode 100644 index 000000000..18cd7a127 --- /dev/null +++ b/node_modules/mongodb/src/encrypter.ts @@ -0,0 +1,119 @@ +/* eslint-disable @typescript-eslint/no-var-requires */ +import { MongoClient, MongoClientOptions } from './mongo_client'; +import type { AutoEncrypter, AutoEncryptionOptions } from './deps'; +import { MongoInvalidArgumentError, MongoMissingDependencyError } from './error'; +import { deserialize, serialize } from './bson'; +import type { Callback } from './utils'; +import { MONGO_CLIENT_EVENTS } from './operations/connect'; + +let AutoEncrypterClass: AutoEncrypter; + +/** @internal */ +const kInternalClient = Symbol('internalClient'); + +/** @internal */ +export interface EncrypterOptions { + autoEncryption: AutoEncryptionOptions; + maxPoolSize?: number; +} + +/** @internal */ +export class Encrypter { + [kInternalClient]: MongoClient; + bypassAutoEncryption: boolean; + needsConnecting: boolean; + autoEncrypter: AutoEncrypter; + + 
constructor(client: MongoClient, uri: string, options: MongoClientOptions) { + if (typeof options.autoEncryption !== 'object') { + throw new MongoInvalidArgumentError('Option "autoEncryption" must be specified'); + } + + this.bypassAutoEncryption = !!options.autoEncryption.bypassAutoEncryption; + this.needsConnecting = false; + + if (options.maxPoolSize === 0 && options.autoEncryption.keyVaultClient == null) { + options.autoEncryption.keyVaultClient = client; + } else if (options.autoEncryption.keyVaultClient == null) { + options.autoEncryption.keyVaultClient = this.getInternalClient(client, uri, options); + } + + if (this.bypassAutoEncryption) { + options.autoEncryption.metadataClient = undefined; + } else if (options.maxPoolSize === 0) { + options.autoEncryption.metadataClient = client; + } else { + options.autoEncryption.metadataClient = this.getInternalClient(client, uri, options); + } + + options.autoEncryption.bson = Object.create(null); + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + options.autoEncryption.bson!.serialize = serialize; + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + options.autoEncryption.bson!.deserialize = deserialize; + + this.autoEncrypter = new AutoEncrypterClass(client, options.autoEncryption); + } + + getInternalClient(client: MongoClient, uri: string, options: MongoClientOptions): MongoClient { + if (!this[kInternalClient]) { + const clonedOptions: MongoClientOptions = {}; + + for (const key of Object.keys(options)) { + if (['autoEncryption', 'minPoolSize', 'servers', 'caseTranslate', 'dbName'].includes(key)) + continue; + Reflect.set(clonedOptions, key, Reflect.get(options, key)); + } + + clonedOptions.minPoolSize = 0; + + this[kInternalClient] = new MongoClient(uri, clonedOptions); + + for (const eventName of MONGO_CLIENT_EVENTS) { + for (const listener of client.listeners(eventName)) { + this[kInternalClient].on(eventName, listener); + } + } + + client.on('newListener', (eventName, listener) => { + this[kInternalClient].on(eventName, listener); + }); + + this.needsConnecting = true; + } + return this[kInternalClient]; + } + + connectInternalClient(callback: Callback): void { + if (this.needsConnecting) { + this.needsConnecting = false; + return this[kInternalClient].connect(callback); + } + + return callback(); + } + + close(client: MongoClient, force: boolean, callback: Callback): void { + this.autoEncrypter.teardown(!!force, e => { + if (this[kInternalClient] && client !== this[kInternalClient]) { + return this[kInternalClient].close(force, callback); + } + callback(e); + }); + } + + static checkForMongoCrypt(): void { + let mongodbClientEncryption = undefined; + try { + // Ensure you always wrap an optional require in the try block NODE-3199 + mongodbClientEncryption = require('mongodb-client-encryption'); + } catch (err) { + throw new MongoMissingDependencyError( + 'Auto-encryption requested, but the module is not installed. 
' + + 'Please add `mongodb-client-encryption` as a dependency of your project' + ); + } + + AutoEncrypterClass = mongodbClientEncryption.extension(require('../lib/index')).AutoEncrypter; + } +} diff --git a/node_modules/mongodb/src/error.ts b/node_modules/mongodb/src/error.ts new file mode 100644 index 000000000..fb64d5c87 --- /dev/null +++ b/node_modules/mongodb/src/error.ts @@ -0,0 +1,813 @@ +import type { TopologyVersion } from './sdam/server_description'; +import type { Document } from './bson'; +import type { TopologyDescription } from './sdam/topology_description'; + +/** @public */ +export type AnyError = MongoError | Error; + +/** @internal */ +const kErrorLabels = Symbol('errorLabels'); + +/** @internal MongoDB Error Codes */ +export const MONGODB_ERROR_CODES = Object.freeze({ + HostUnreachable: 6, + HostNotFound: 7, + NetworkTimeout: 89, + ShutdownInProgress: 91, + PrimarySteppedDown: 189, + ExceededTimeLimit: 262, + SocketException: 9001, + NotMaster: 10107, + InterruptedAtShutdown: 11600, + InterruptedDueToReplStateChange: 11602, + NotMasterNoSlaveOk: 13435, + NotMasterOrSecondary: 13436, + StaleShardVersion: 63, + StaleEpoch: 150, + StaleConfig: 13388, + RetryChangeStream: 234, + FailedToSatisfyReadPreference: 133, + CursorNotFound: 43, + LegacyNotPrimary: 10058, + WriteConcernFailed: 64, + NamespaceNotFound: 26, + IllegalOperation: 20, + MaxTimeMSExpired: 50, + UnknownReplWriteConcern: 79, + UnsatisfiableWriteConcern: 100 +} as const); + +// From spec@https://github.com/mongodb/specifications/blob/f93d78191f3db2898a59013a7ed5650352ef6da8/source/change-streams/change-streams.rst#resumable-error +export const GET_MORE_RESUMABLE_CODES = new Set([ + MONGODB_ERROR_CODES.HostUnreachable, + MONGODB_ERROR_CODES.HostNotFound, + MONGODB_ERROR_CODES.NetworkTimeout, + MONGODB_ERROR_CODES.ShutdownInProgress, + MONGODB_ERROR_CODES.PrimarySteppedDown, + MONGODB_ERROR_CODES.ExceededTimeLimit, + MONGODB_ERROR_CODES.SocketException, + MONGODB_ERROR_CODES.NotMaster, + MONGODB_ERROR_CODES.InterruptedAtShutdown, + MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + MONGODB_ERROR_CODES.NotMasterOrSecondary, + MONGODB_ERROR_CODES.StaleShardVersion, + MONGODB_ERROR_CODES.StaleEpoch, + MONGODB_ERROR_CODES.StaleConfig, + MONGODB_ERROR_CODES.RetryChangeStream, + MONGODB_ERROR_CODES.FailedToSatisfyReadPreference, + MONGODB_ERROR_CODES.CursorNotFound +]); + +/** @public */ +export interface ErrorDescription extends Document { + message?: string; + errmsg?: string; + $err?: string; + errorLabels?: string[]; + errInfo?: Document; +} + +/** + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error, it uses the constructor with a string argument + */ +export class MongoError extends Error { + /** @internal */ + [kErrorLabels]: Set; + /** + * This is a number in MongoServerError and a string in MongoDriverError + * @privateRemarks + * Define the type override on the subclasses when we can use the override keyword + */ + code?: number | string; + topologyVersion?: TopologyVersion; + + constructor(message: string | Error) { + if (message instanceof Error) { + super(message.message); + } else { + super(message); + } + } + + get name(): string { + return 'MongoError'; + } + + /** Legacy name for server error responses */ + get errmsg(): string { + return this.message; + } + + /** + * Checks the error to see if it has an error label + * + * @param label - The error label to check for + * @returns returns true if the error has the 
provided error label + */ + hasErrorLabel(label: string): boolean { + if (this[kErrorLabels] == null) { + return false; + } + + return this[kErrorLabels].has(label); + } + + addErrorLabel(label: string): void { + if (this[kErrorLabels] == null) { + this[kErrorLabels] = new Set(); + } + + this[kErrorLabels].add(label); + } + + get errorLabels(): string[] { + return this[kErrorLabels] ? Array.from(this[kErrorLabels]) : []; + } +} + +/** + * An error coming from the mongo server + * + * @public + * @category Error + */ +export class MongoServerError extends MongoError { + codeName?: string; + writeConcernError?: Document; + errInfo?: Document; + ok?: number; + [key: string]: any; + + constructor(message: ErrorDescription) { + super(message.message || message.errmsg || message.$err || 'n/a'); + if (message.errorLabels) { + this[kErrorLabels] = new Set(message.errorLabels); + } + + for (const name in message) { + if (name !== 'errorLabels' && name !== 'errmsg' && name !== 'message') + this[name] = message[name]; + } + } + + get name(): string { + return 'MongoServerError'; + } +} + +/** + * An error generated by the driver + * + * @public + * @category Error + */ +export class MongoDriverError extends MongoError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoDriverError'; + } +} + +/** + * An error generated when the driver API is used incorrectly + * + * @privateRemarks + * Should **never** be directly instantiated + * + * @public + * @category Error + */ + +export class MongoAPIError extends MongoDriverError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoAPIError'; + } +} + +/** + * An error generated when the driver encounters unexpected input + * or reaches an unexpected/invalid internal state + * + * @privateRemarks + * Should **never** be directly instantiated. + * + * @public + * @category Error + */ +export class MongoRuntimeError extends MongoDriverError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoRuntimeError'; + } +} + +/** + * An error generated when a batch command is reexecuted after one of the commands in the batch + * has failed + * + * @public + * @category Error + */ +export class MongoBatchReExecutionError extends MongoAPIError { + constructor(message = 'This batch has already been executed, create new batch to execute') { + super(message); + } + + get name(): string { + return 'MongoBatchReExecutionError'; + } +} + +/** + * An error generated when the driver fails to decompress + * data received from the server. + * + * @public + * @category Error + */ +export class MongoDecompressionError extends MongoRuntimeError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoDecompressionError'; + } +} + +/** + * An error thrown when the user attempts to operate on a database or collection through a MongoClient + * that has not yet successfully called the "connect" method + * + * @public + * @category Error + */ +export class MongoNotConnectedError extends MongoAPIError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoNotConnectedError'; + } +} + +/** + * An error generated when the user makes a mistake in the usage of transactions. + * (e.g. 
attempting to commit a transaction with a readPreference other than primary) + * + * @public + * @category Error + */ +export class MongoTransactionError extends MongoAPIError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoTransactionError'; + } +} + +/** + * An error generated when the user attempts to operate + * on a session that has expired or has been closed. + * + * @public + * @category Error + */ +export class MongoExpiredSessionError extends MongoAPIError { + constructor(message = 'Cannot use a session that has ended') { + super(message); + } + + get name(): string { + return 'MongoExpiredSessionError'; + } +} + +/** + * A error generated when the user attempts to authenticate + * via Kerberos, but fails to connect to the Kerberos client. + * + * @public + * @category Error + */ +export class MongoKerberosError extends MongoRuntimeError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoKerberosError'; + } +} + +/** + * An error generated when a ChangeStream operation fails to execute. + * + * @public + * @category Error + */ +export class MongoChangeStreamError extends MongoRuntimeError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoChangeStreamError'; + } +} + +/** + * An error thrown when the user calls a function or method not supported on a tailable cursor + * + * @public + * @category Error + */ +export class MongoTailableCursorError extends MongoAPIError { + constructor(message = 'Tailable cursor does not support this operation') { + super(message); + } + + get name(): string { + return 'MongoTailableCursorError'; + } +} + +/** An error generated when a GridFSStream operation fails to execute. + * + * @public + * @category Error + */ +export class MongoGridFSStreamError extends MongoRuntimeError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoGridFSStreamError'; + } +} + +/** + * An error generated when a malformed or invalid chunk is + * encountered when reading from a GridFSStream. + * + * @public + * @category Error + */ +export class MongoGridFSChunkError extends MongoRuntimeError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoGridFSChunkError'; + } +} + +/** + * An error thrown when the user attempts to add options to a cursor that has already been + * initialized + * + * @public + * @category Error + */ +export class MongoCursorInUseError extends MongoAPIError { + constructor(message = 'Cursor is already initialized') { + super(message); + } + + get name(): string { + return 'MongoCursorInUseError'; + } +} + +/** + * An error generated when an attempt is made to operate + * on a closed/closing server. + * + * @public + * @category Error + */ +export class MongoServerClosedError extends MongoAPIError { + constructor(message = 'Server is closed') { + super(message); + } + + get name(): string { + return 'MongoServerClosedError'; + } +} + +/** + * An error thrown when an attempt is made to read from a cursor that has been exhausted + * + * @public + * @category Error + */ +export class MongoCursorExhaustedError extends MongoAPIError { + constructor(message?: string) { + super(message || 'Cursor is exhausted'); + } + + get name(): string { + return 'MongoCursorExhaustedError'; + } +} + +/** + * An error generated when an attempt is made to operate on a + * dropped, or otherwise unavailable, database. 
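The classes above split failures into server-reported errors (MongoServerError and its subclasses) and client-side driver errors (the MongoAPIError and MongoRuntimeError families). A hedged sketch of how calling code might branch on them; the collection handling and the TransientTransactionError label check are illustrative only:

import { MongoError, MongoServerError, MongoDriverError } from 'mongodb';
import type { Collection, Document } from 'mongodb';

async function insertWithDiagnostics(collection: Collection, doc: Document): Promise<void> {
  try {
    await collection.insertOne(doc);
  } catch (error) {
    if (error instanceof MongoServerError) {
      // Came back in a server response; code/codeName mirror the server's reply
      console.error('server rejected the write', error.code, error.codeName);
    } else if (error instanceof MongoDriverError) {
      // Raised locally by the driver (bad arguments, closed topology, and so on)
      console.error('driver-side failure', error.name, error.message);
    }
    if (error instanceof MongoError && error.hasErrorLabel('TransientTransactionError')) {
      // Error labels (hasErrorLabel/errorLabels above) hint whether a retry is reasonable
      console.error('labelled transient, so the operation could be retried');
    }
    throw error;
  }
}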
+ * + * @public + * @category Error + */ +export class MongoTopologyClosedError extends MongoAPIError { + constructor(message = 'Topology is closed') { + super(message); + } + + get name(): string { + return 'MongoTopologyClosedError'; + } +} + +/** @internal */ +const kBeforeHandshake = Symbol('beforeHandshake'); +export function isNetworkErrorBeforeHandshake(err: MongoNetworkError): boolean { + return err[kBeforeHandshake] === true; +} + +/** @public */ +export interface MongoNetworkErrorOptions { + /** Indicates the timeout happened before a connection handshake completed */ + beforeHandshake: boolean; +} + +/** + * An error indicating an issue with the network, including TCP errors and timeouts. + * @public + * @category Error + */ +export class MongoNetworkError extends MongoError { + /** @internal */ + [kBeforeHandshake]?: boolean; + + constructor(message: string | Error, options?: MongoNetworkErrorOptions) { + super(message); + + if (options && typeof options.beforeHandshake === 'boolean') { + this[kBeforeHandshake] = options.beforeHandshake; + } + } + + get name(): string { + return 'MongoNetworkError'; + } +} + +/** + * An error indicating a network timeout occurred + * @public + * @category Error + * + * @privateRemarks + * CSFLE has a dependency on this error with an instanceof check + */ +export class MongoNetworkTimeoutError extends MongoNetworkError { + constructor(message: string, options?: MongoNetworkErrorOptions) { + super(message, options); + } + + get name(): string { + return 'MongoNetworkTimeoutError'; + } +} + +/** + * An error used when attempting to parse a value (like a connection string) + * @public + * @category Error + */ +export class MongoParseError extends MongoDriverError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoParseError'; + } +} + +/** + * An error generated when the user supplies malformed or unexpected arguments + * or when a required argument or field is not provided. + * + * + * @public + * @category Error + */ +export class MongoInvalidArgumentError extends MongoAPIError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoInvalidArgumentError'; + } +} + +/** + * An error generated when a feature that is not enabled or allowed for the current server + * configuration is used + * + * + * @public + * @category Error + */ +export class MongoCompatibilityError extends MongoAPIError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoCompatibilityError'; + } +} + +/** + * An error generated when the user fails to provide authentication credentials before attempting + * to connect to a mongo server instance. 
+ * + * + * @public + * @category Error + */ +export class MongoMissingCredentialsError extends MongoAPIError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoMissingCredentialsError'; + } +} + +/** + * An error generated when a required module or dependency is not present in the local environment + * + * @public + * @category Error + */ +export class MongoMissingDependencyError extends MongoAPIError { + constructor(message: string) { + super(message); + } + + get name(): string { + return 'MongoMissingDependencyError'; + } +} +/** + * An error signifying a general system issue + * @public + * @category Error + */ +export class MongoSystemError extends MongoError { + /** An optional reason context, such as an error saved during flow of monitoring and selecting servers */ + reason?: TopologyDescription; + + constructor(message: string, reason: TopologyDescription) { + if (reason && reason.error) { + super(reason.error.message || reason.error); + } else { + super(message); + } + + if (reason) { + this.reason = reason; + } + } + + get name(): string { + return 'MongoSystemError'; + } +} + +/** + * An error signifying a client-side server selection error + * @public + * @category Error + */ +export class MongoServerSelectionError extends MongoSystemError { + constructor(message: string, reason: TopologyDescription) { + super(message, reason); + } + + get name(): string { + return 'MongoServerSelectionError'; + } +} + +function makeWriteConcernResultObject(input: any) { + const output = Object.assign({}, input); + + if (output.ok === 0) { + output.ok = 1; + delete output.errmsg; + delete output.code; + delete output.codeName; + } + + return output; +} + +/** + * An error thrown when the server reports a writeConcernError + * @public + * @category Error + */ +export class MongoWriteConcernError extends MongoServerError { + /** The result document (provided if ok: 1) */ + result?: Document; + errInfo?: Document; + + constructor(message: ErrorDescription, result?: Document) { + if (result && Array.isArray(result.errorLabels)) { + message.errorLabels = result.errorLabels; + } + + super(message); + this.errInfo = message.errInfo; + + if (result != null) { + this.result = makeWriteConcernResultObject(result); + } + } + + get name(): string { + return 'MongoWriteConcernError'; + } +} + +// see: https://github.com/mongodb/specifications/blob/master/source/retryable-writes/retryable-writes.rst#terms +const RETRYABLE_ERROR_CODES = new Set([ + MONGODB_ERROR_CODES.HostUnreachable, + MONGODB_ERROR_CODES.HostNotFound, + MONGODB_ERROR_CODES.NetworkTimeout, + MONGODB_ERROR_CODES.ShutdownInProgress, + MONGODB_ERROR_CODES.PrimarySteppedDown, + MONGODB_ERROR_CODES.SocketException, + MONGODB_ERROR_CODES.NotMaster, + MONGODB_ERROR_CODES.InterruptedAtShutdown, + MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + MONGODB_ERROR_CODES.NotMasterOrSecondary +]); + +const RETRYABLE_WRITE_ERROR_CODES = new Set([ + MONGODB_ERROR_CODES.InterruptedAtShutdown, + MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + MONGODB_ERROR_CODES.NotMaster, + MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + MONGODB_ERROR_CODES.NotMasterOrSecondary, + MONGODB_ERROR_CODES.PrimarySteppedDown, + MONGODB_ERROR_CODES.ShutdownInProgress, + MONGODB_ERROR_CODES.HostNotFound, + MONGODB_ERROR_CODES.HostUnreachable, + MONGODB_ERROR_CODES.NetworkTimeout, + MONGODB_ERROR_CODES.SocketException, + MONGODB_ERROR_CODES.ExceededTimeLimit +]); + +export function 
isRetryableWriteError(error: MongoError): boolean { + if (error instanceof MongoWriteConcernError) { + return RETRYABLE_WRITE_ERROR_CODES.has(error.result?.code ?? error.code ?? 0); + } + return typeof error.code === 'number' && RETRYABLE_WRITE_ERROR_CODES.has(error.code); +} + +/** Determines whether an error is something the driver should attempt to retry */ +export function isRetryableError(error: MongoError): boolean { + return ( + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + (typeof error.code === 'number' && RETRYABLE_ERROR_CODES.has(error.code!)) || + error instanceof MongoNetworkError || + !!error.message.match(/not master/) || + !!error.message.match(/node is recovering/) + ); +} + +const SDAM_RECOVERING_CODES = new Set([ + MONGODB_ERROR_CODES.ShutdownInProgress, + MONGODB_ERROR_CODES.PrimarySteppedDown, + MONGODB_ERROR_CODES.InterruptedAtShutdown, + MONGODB_ERROR_CODES.InterruptedDueToReplStateChange, + MONGODB_ERROR_CODES.NotMasterOrSecondary +]); + +const SDAM_NOTMASTER_CODES = new Set([ + MONGODB_ERROR_CODES.NotMaster, + MONGODB_ERROR_CODES.NotMasterNoSlaveOk, + MONGODB_ERROR_CODES.LegacyNotPrimary +]); + +const SDAM_NODE_SHUTTING_DOWN_ERROR_CODES = new Set([ + MONGODB_ERROR_CODES.InterruptedAtShutdown, + MONGODB_ERROR_CODES.ShutdownInProgress +]); + +function isRecoveringError(err: MongoError) { + if (typeof err.code === 'number') { + // If any error code exists, we ignore the error.message + return SDAM_RECOVERING_CODES.has(err.code); + } + + return /not master or secondary/.test(err.message) || /node is recovering/.test(err.message); +} + +function isNotMasterError(err: MongoError) { + if (typeof err.code === 'number') { + // If any error code exists, we ignore the error.message + return SDAM_NOTMASTER_CODES.has(err.code); + } + + if (isRecoveringError(err)) { + return false; + } + + return /not master/.test(err.message); +} + +export function isNodeShuttingDownError(err: MongoError): boolean { + return !!(typeof err.code === 'number' && SDAM_NODE_SHUTTING_DOWN_ERROR_CODES.has(err.code)); +} + +/** + * Determines whether SDAM can recover from a given error. If it cannot + * then the pool will be cleared, and server state will completely reset + * locally. + * + * @see https://github.com/mongodb/specifications/blob/master/source/server-discovery-and-monitoring/server-discovery-and-monitoring.rst#not-master-and-node-is-recovering + */ +export function isSDAMUnrecoverableError(error: MongoError): boolean { + // NOTE: null check is here for a strictly pre-CMAP world, a timeout or + // close event are considered unrecoverable + if (error instanceof MongoParseError || error == null) { + return true; + } + + return isRecoveringError(error) || isNotMasterError(error); +} + +export function isNetworkTimeoutError(err: MongoError): err is MongoNetworkError { + return !!(err instanceof MongoNetworkError && err.message.match(/timed out/)); +} + +// From spec@https://github.com/mongodb/specifications/blob/7a2e93d85935ee4b1046a8d2ad3514c657dc74fa/source/change-streams/change-streams.rst#resumable-error: +// +// An error is considered resumable if it meets any of the following criteria: +// - any error encountered which is not a server error (e.g. 
a timeout error or network error) +// - any server error response from a getMore command excluding those containing the error label +// NonRetryableChangeStreamError and those containing the following error codes: +// - Interrupted: 11601 +// - CappedPositionLost: 136 +// - CursorKilled: 237 +// +// An error on an aggregate command is not a resumable error. Only errors on a getMore command may be considered resumable errors. + +export function isResumableError(error?: MongoError, wireVersion?: number): boolean { + if (error instanceof MongoNetworkError) { + return true; + } + + if (wireVersion != null && wireVersion >= 9) { + // DRIVERS-1308: For 4.4 drivers running against 4.4 servers, drivers will add a special case to treat the CursorNotFound error code as resumable + if (error && error instanceof MongoError && error.code === 43) { + return true; + } + return error instanceof MongoError && error.hasErrorLabel('ResumableChangeStreamError'); + } + + if (error && typeof error.code === 'number') { + return GET_MORE_RESUMABLE_CODES.has(error.code); + } + return false; +} diff --git a/node_modules/mongodb/src/explain.ts b/node_modules/mongodb/src/explain.ts new file mode 100644 index 000000000..0d08e694a --- /dev/null +++ b/node_modules/mongodb/src/explain.ts @@ -0,0 +1,52 @@ +import { MongoInvalidArgumentError } from './error'; + +/** @public */ +export const ExplainVerbosity = Object.freeze({ + queryPlanner: 'queryPlanner', + queryPlannerExtended: 'queryPlannerExtended', + executionStats: 'executionStats', + allPlansExecution: 'allPlansExecution' +} as const); + +/** @public */ +export type ExplainVerbosity = string; + +/** + * For backwards compatibility, true is interpreted as "allPlansExecution" + * and false as "queryPlanner". Prior to server version 3.6, aggregate() + * ignores the verbosity parameter and executes in "queryPlanner". + * @public + */ +export type ExplainVerbosityLike = ExplainVerbosity | boolean; + +/** @public */ +export interface ExplainOptions { + /** Specifies the verbosity mode for the explain output. */ + explain?: ExplainVerbosityLike; +} + +/** @internal */ +export class Explain { + verbosity: ExplainVerbosity; + + constructor(verbosity: ExplainVerbosityLike) { + if (typeof verbosity === 'boolean') { + this.verbosity = verbosity + ? 
ExplainVerbosity.allPlansExecution + : ExplainVerbosity.queryPlanner; + } else { + this.verbosity = verbosity; + } + } + + static fromOptions(options?: ExplainOptions): Explain | undefined { + if (options?.explain == null) return; + + const explain = options.explain; + if (typeof explain === 'boolean' || typeof explain === 'string') { + return new Explain(explain); + } + + throw new MongoInvalidArgumentError('Field "explain" must be a string or a boolean'); + } +} diff --git a/node_modules/mongodb/src/gridfs/download.ts b/node_modules/mongodb/src/gridfs/download.ts new file mode 100644 index 000000000..7e2925bb2 --- /dev/null +++ b/node_modules/mongodb/src/gridfs/download.ts @@ -0,0 +1,454 @@ +import { Readable } from 'stream'; +import { + MongoRuntimeError, + MongoGridFSChunkError, + MongoGridFSStreamError, + MongoInvalidArgumentError +} from '../error'; +import type { Document } from '../bson'; +import type { FindOptions } from '../operations/find'; +import type { Sort } from '../sort'; +import type { Callback } from '../utils'; +import type { Collection } from '../collection'; +import type { ReadPreference } from '../read_preference'; +import type { GridFSChunk } from './upload'; +import type { FindCursor } from '../cursor/find_cursor'; +import type { ObjectId } from 'bson'; + +/** @public */ +export interface GridFSBucketReadStreamOptions { + sort?: Sort; + skip?: number; + /** 0-based offset in bytes to start streaming from */ + start?: number; + /** 0-based offset in bytes to stop streaming before */ + end?: number; +} + +/** @public */ +export interface GridFSBucketReadStreamOptionsWithRevision extends GridFSBucketReadStreamOptions { + /** The revision number relative to the oldest file with the given filename. 0 + * gets you the oldest file, 1 gets you the 2nd oldest, -1 gets you the + * newest. */ + revision?: number; +} + +/** @public */ +export interface GridFSFile { + _id: ObjectId; + length: number; + chunkSize: number; + filename: string; + contentType?: string; + aliases?: string[]; + metadata?: Document; + uploadDate: Date; +} + +/** @internal */ +export interface GridFSBucketReadStreamPrivate { + bytesRead: number; + bytesToTrim: number; + bytesToSkip: number; + chunks: Collection; + cursor?: FindCursor; + expected: number; + files: Collection; + filter: Document; + init: boolean; + expectedEnd: number; + file?: GridFSFile; + options: { + sort?: Sort; + skip?: number; + start: number; + end: number; + }; + readPreference?: ReadPreference; +} + +/** + * A readable stream that enables you to read buffers from GridFS. + * + * Do not instantiate this class directly. Use `openDownloadStream()` instead. + * @public + */ +export class GridFSBucketReadStream extends Readable implements NodeJS.ReadableStream { + /** @internal */ + s: GridFSBucketReadStreamPrivate; + + /** + * An error occurred + * @event + */ + static readonly ERROR = 'error' as const; + /** + * Fires when the stream loaded the file document corresponding to the provided id. + * @event + */ + static readonly FILE = 'file' as const; + /** + * Emitted when a chunk of data is available to be consumed. + * @event + */ + static readonly DATA = 'data' as const; + /** + * Fired when the stream is exhausted (no more data events). 
+ * @event + */ + static readonly END = 'end' as const; + /** + * Fired when the stream is exhausted and the underlying cursor is killed + * @event + */ + static readonly CLOSE = 'close' as const; + + /** @internal + * @param chunks - Handle for chunks collection + * @param files - Handle for files collection + * @param readPreference - The read preference to use + * @param filter - The filter to use to find the file document + */ + constructor( + chunks: Collection, + files: Collection, + readPreference: ReadPreference | undefined, + filter: Document, + options?: GridFSBucketReadStreamOptions + ) { + super(); + this.s = { + bytesToTrim: 0, + bytesToSkip: 0, + bytesRead: 0, + chunks, + expected: 0, + files, + filter, + init: false, + expectedEnd: 0, + options: { + start: 0, + end: 0, + ...options + }, + readPreference + }; + } + + /** + * Reads from the cursor and pushes to the stream. + * Private Impl, do not call directly + * @internal + */ + _read(): void { + if (this.destroyed) return; + waitForFile(this, () => doRead(this)); + } + + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param start - 0-based offset in bytes to start streaming from + */ + start(start = 0): this { + throwIfInitialized(this); + this.s.options.start = start; + return this; + } + + /** + * Sets the 0-based offset in bytes to start streaming from. Throws + * an error if this stream has entered flowing mode + * (e.g. if you've already called `on('data')`) + * + * @param end - Offset in bytes to stop reading at + */ + end(end = 0): this { + throwIfInitialized(this); + this.s.options.end = end; + return this; + } + + /** + * Marks this stream as aborted (will never push another `data` event) + * and kills the underlying cursor. Will emit the 'end' event, and then + * the 'close' event once the cursor is successfully killed. + * + * @param callback - called when the cursor is successfully closed or an error occurred. 
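A brief sketch of reading a byte range with the start()/end() setters defined above; both must be called before the stream enters flowing mode. The bucket name, file id, byte offsets, and output path are placeholders:

import { GridFSBucket, ObjectId } from 'mongodb';
import type { Db } from 'mongodb';
import { createWriteStream } from 'fs';

function downloadSlice(db: Db, fileId: ObjectId): void {
  const bucket = new GridFSBucket(db, { bucketName: 'fs' });
  const source = bucket
    .openDownloadStream(fileId)
    .start(0)    // first byte to include
    .end(1024);  // stop before this 0-based offset
  source.on('error', err => console.error('range read failed', err));
  source
    .pipe(createWriteStream('/tmp/first-kilobyte.bin'))
    .on('finish', () => console.log('range written'));
}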
+ */ + abort(callback?: Callback): void { + this.push(null); + this.destroyed = true; + if (this.s.cursor) { + this.s.cursor.close(error => { + this.emit(GridFSBucketReadStream.CLOSE); + callback && callback(error); + }); + } else { + if (!this.s.init) { + // If not initialized, fire close event because we will never + // get a cursor + this.emit(GridFSBucketReadStream.CLOSE); + } + callback && callback(); + } + } +} + +function throwIfInitialized(stream: GridFSBucketReadStream): void { + if (stream.s.init) { + throw new MongoGridFSStreamError('Options cannot be changed after the stream is initialized'); + } +} + +function doRead(stream: GridFSBucketReadStream): void { + if (stream.destroyed) return; + if (!stream.s.cursor) return; + if (!stream.s.file) return; + + stream.s.cursor.next((error, doc) => { + if (stream.destroyed) { + return; + } + if (error) { + stream.emit(GridFSBucketReadStream.ERROR, error); + return; + } + if (!doc) { + stream.push(null); + + process.nextTick(() => { + if (!stream.s.cursor) return; + stream.s.cursor.close(error => { + if (error) { + stream.emit(GridFSBucketReadStream.ERROR, error); + return; + } + + stream.emit(GridFSBucketReadStream.CLOSE); + }); + }); + + return; + } + + if (!stream.s.file) return; + + const bytesRemaining = stream.s.file.length - stream.s.bytesRead; + const expectedN = stream.s.expected++; + const expectedLength = Math.min(stream.s.file.chunkSize, bytesRemaining); + if (doc.n > expectedN) { + return stream.emit( + GridFSBucketReadStream.ERROR, + new MongoGridFSChunkError( + `ChunkIsMissing: Got unexpected n: ${doc.n}, expected: ${expectedN}` + ) + ); + } + + if (doc.n < expectedN) { + return stream.emit( + GridFSBucketReadStream.ERROR, + new MongoGridFSChunkError(`ExtraChunk: Got unexpected n: ${doc.n}, expected: ${expectedN}`) + ); + } + + let buf = Buffer.isBuffer(doc.data) ? 
doc.data : doc.data.buffer; + + if (buf.byteLength !== expectedLength) { + if (bytesRemaining <= 0) { + return stream.emit( + GridFSBucketReadStream.ERROR, + new MongoGridFSChunkError(`ExtraChunk: Got unexpected n: ${doc.n}`) + ); + } + + return stream.emit( + GridFSBucketReadStream.ERROR, + new MongoGridFSChunkError( + `ChunkIsWrongSize: Got unexpected length: ${buf.byteLength}, expected: ${expectedLength}` + ) + ); + } + + stream.s.bytesRead += buf.byteLength; + + if (buf.byteLength === 0) { + return stream.push(null); + } + + let sliceStart = null; + let sliceEnd = null; + + if (stream.s.bytesToSkip != null) { + sliceStart = stream.s.bytesToSkip; + stream.s.bytesToSkip = 0; + } + + const atEndOfStream = expectedN === stream.s.expectedEnd - 1; + const bytesLeftToRead = stream.s.options.end - stream.s.bytesToSkip; + if (atEndOfStream && stream.s.bytesToTrim != null) { + sliceEnd = stream.s.file.chunkSize - stream.s.bytesToTrim; + } else if (stream.s.options.end && bytesLeftToRead < doc.data.byteLength) { + sliceEnd = bytesLeftToRead; + } + + if (sliceStart != null || sliceEnd != null) { + buf = buf.slice(sliceStart || 0, sliceEnd || buf.byteLength); + } + + stream.push(buf); + }); +} + +function init(stream: GridFSBucketReadStream): void { + const findOneOptions: FindOptions = {}; + if (stream.s.readPreference) { + findOneOptions.readPreference = stream.s.readPreference; + } + if (stream.s.options && stream.s.options.sort) { + findOneOptions.sort = stream.s.options.sort; + } + if (stream.s.options && stream.s.options.skip) { + findOneOptions.skip = stream.s.options.skip; + } + + stream.s.files.findOne(stream.s.filter, findOneOptions, (error, doc) => { + if (error) { + return stream.emit(GridFSBucketReadStream.ERROR, error); + } + + if (!doc) { + const identifier = stream.s.filter._id + ? stream.s.filter._id.toString() + : stream.s.filter.filename; + const errmsg = `FileNotFound: file ${identifier} was not found`; + // TODO(NODE-3483) + const err = new MongoRuntimeError(errmsg); + err.code = 'ENOENT'; // TODO: NODE-3338 set property as part of constructor + return stream.emit(GridFSBucketReadStream.ERROR, err); + } + + // If document is empty, kill the stream immediately and don't + // execute any reads + if (doc.length <= 0) { + stream.push(null); + return; + } + + if (stream.destroyed) { + // If user destroys the stream before we have a cursor, wait + // until the query is done to say we're 'closed' because we can't + // cancel a query. + stream.emit(GridFSBucketReadStream.CLOSE); + return; + } + + try { + stream.s.bytesToSkip = handleStartOption(stream, doc, stream.s.options); + } catch (error) { + return stream.emit(GridFSBucketReadStream.ERROR, error); + } + + const filter: Document = { files_id: doc._id }; + + // Currently (MongoDB 3.4.4) skip function does not support the index, + // it needs to retrieve all the documents first and then skip them. (CS-25811) + // As work around we use $gte on the "n" field. 
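    // For example (illustrative numbers only): with the default 255KB chunkSize of
    // 261120 bytes and options.start = 600000, skip = Math.floor(600000 / 261120) = 2,
    // so the chunk query below becomes { files_id, n: { $gte: 2 } } while the
    // sort({ n: 1 }) applied when the cursor is created keeps the chunks in order.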
+ if (stream.s.options && stream.s.options.start != null) { + const skip = Math.floor(stream.s.options.start / doc.chunkSize); + if (skip > 0) { + filter['n'] = { $gte: skip }; + } + } + stream.s.cursor = stream.s.chunks.find(filter).sort({ n: 1 }); + + if (stream.s.readPreference) { + stream.s.cursor.withReadPreference(stream.s.readPreference); + } + + stream.s.expectedEnd = Math.ceil(doc.length / doc.chunkSize); + stream.s.file = doc as GridFSFile; + + try { + stream.s.bytesToTrim = handleEndOption(stream, doc, stream.s.cursor, stream.s.options); + } catch (error) { + return stream.emit(GridFSBucketReadStream.ERROR, error); + } + + stream.emit(GridFSBucketReadStream.FILE, doc); + }); +} + +function waitForFile(stream: GridFSBucketReadStream, callback: Callback): void { + if (stream.s.file) { + return callback(); + } + + if (!stream.s.init) { + init(stream); + stream.s.init = true; + } + + stream.once('file', () => { + callback(); + }); +} + +function handleStartOption( + stream: GridFSBucketReadStream, + doc: Document, + options: GridFSBucketReadStreamOptions +): number { + if (options && options.start != null) { + if (options.start > doc.length) { + throw new MongoInvalidArgumentError( + `Stream start (${options.start}) must not be more than the length of the file (${doc.length})` + ); + } + if (options.start < 0) { + throw new MongoInvalidArgumentError(`Stream start (${options.start}) must not be negative`); + } + if (options.end != null && options.end < options.start) { + throw new MongoInvalidArgumentError( + `Stream start (${options.start}) must not be greater than stream end (${options.end})` + ); + } + + stream.s.bytesRead = Math.floor(options.start / doc.chunkSize) * doc.chunkSize; + stream.s.expected = Math.floor(options.start / doc.chunkSize); + + return options.start - stream.s.bytesRead; + } + throw new MongoInvalidArgumentError('Start option must be defined'); +} + +function handleEndOption( + stream: GridFSBucketReadStream, + doc: Document, + cursor: FindCursor, + options: GridFSBucketReadStreamOptions +) { + if (options && options.end != null) { + if (options.end > doc.length) { + throw new MongoInvalidArgumentError( + `Stream end (${options.end}) must not be more than the length of the file (${doc.length})` + ); + } + if (options.start == null || options.start < 0) { + throw new MongoInvalidArgumentError(`Stream end (${options.end}) must not be negative`); + } + + const start = options.start != null ? 
Math.floor(options.start / doc.chunkSize) : 0; + + cursor.limit(Math.ceil(options.end / doc.chunkSize) - start); + + stream.s.expectedEnd = Math.ceil(options.end / doc.chunkSize); + + return Math.ceil(options.end / doc.chunkSize) * doc.chunkSize - options.end; + } + throw new MongoInvalidArgumentError('End option must be defined'); +} diff --git a/node_modules/mongodb/src/gridfs/index.ts b/node_modules/mongodb/src/gridfs/index.ts new file mode 100644 index 000000000..f2e6fb53f --- /dev/null +++ b/node_modules/mongodb/src/gridfs/index.ts @@ -0,0 +1,274 @@ +import { MongoRuntimeError } from '../error'; +import { + GridFSBucketReadStream, + GridFSBucketReadStreamOptions, + GridFSBucketReadStreamOptionsWithRevision, + GridFSFile +} from './download'; +import { GridFSBucketWriteStream, GridFSBucketWriteStreamOptions, GridFSChunk } from './upload'; +import { executeLegacyOperation, Callback, getTopology } from '../utils'; +import { WriteConcernOptions, WriteConcern } from '../write_concern'; +import type { ObjectId } from '../bson'; +import type { Db } from '../db'; +import type { ReadPreference } from '../read_preference'; +import type { Collection } from '../collection'; +import type { FindOptions } from './../operations/find'; +import type { Sort } from '../sort'; +import type { Logger } from '../logger'; +import type { FindCursor } from '../cursor/find_cursor'; +import { Filter, TypedEventEmitter } from '../mongo_types'; + +const DEFAULT_GRIDFS_BUCKET_OPTIONS: { + bucketName: string; + chunkSizeBytes: number; +} = { + bucketName: 'fs', + chunkSizeBytes: 255 * 1024 +}; + +/** @public */ +export interface GridFSBucketOptions extends WriteConcernOptions { + /** The 'files' and 'chunks' collections will be prefixed with the bucket name followed by a dot. */ + bucketName?: string; + /** Number of bytes stored in each chunk. Defaults to 255KB */ + chunkSizeBytes?: number; + /** Read preference to be passed to read operations */ + readPreference?: ReadPreference; +} + +/** @internal */ +export interface GridFSBucketPrivate { + db: Db; + options: { + bucketName: string; + chunkSizeBytes: number; + readPreference?: ReadPreference; + writeConcern: WriteConcern | undefined; + }; + _chunksCollection: Collection; + _filesCollection: Collection; + checkedIndexes: boolean; + calledOpenUploadStream: boolean; +} + +/** @public */ +export type GridFSBucketEvents = { + index(): void; +}; + +/** + * Constructor for a streaming GridFS interface + * @public + */ +export class GridFSBucket extends TypedEventEmitter { + /** @internal */ + s: GridFSBucketPrivate; + + /** + * When the first call to openUploadStream is made, the upload stream will + * check to see if it needs to create the proper indexes on the chunks and + * files collections. This event is fired either when 1) it determines that + * no index creation is necessary, 2) when it successfully creates the + * necessary indexes. 
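A hedged sketch of streaming a local file into a bucket with openUploadStream (declared below); the first upload triggers the index check described here. The bucket name, file paths, and metadata are placeholders:

import { GridFSBucket } from 'mongodb';
import type { Db } from 'mongodb';
import { createReadStream } from 'fs';

function uploadReport(db: Db): void {
  const bucket = new GridFSBucket(db, { bucketName: 'reports' });
  const upload = bucket.openUploadStream('summary.pdf', {
    metadata: { owner: 'group20' } // stored in the files document's metadata field
  });

  createReadStream('/tmp/summary.pdf')
    .pipe(upload)
    .on('error', err => console.error('upload failed', err))
    .on('finish', () => console.log('stored with id', upload.id));
}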
+ * @event + */ + static readonly INDEX = 'index' as const; + + constructor(db: Db, options?: GridFSBucketOptions) { + super(); + this.setMaxListeners(0); + const privateOptions = { + ...DEFAULT_GRIDFS_BUCKET_OPTIONS, + ...options, + writeConcern: WriteConcern.fromOptions(options) + }; + this.s = { + db, + options: privateOptions, + _chunksCollection: db.collection(privateOptions.bucketName + '.chunks'), + _filesCollection: db.collection(privateOptions.bucketName + '.files'), + checkedIndexes: false, + calledOpenUploadStream: false + }; + } + + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS. The stream's 'id' property contains the resulting + * file's id. + * + * @param filename - The value of the 'filename' key in the files doc + * @param options - Optional settings. + */ + + openUploadStream( + filename: string, + options?: GridFSBucketWriteStreamOptions + ): GridFSBucketWriteStream { + return new GridFSBucketWriteStream(this, filename, options); + } + + /** + * Returns a writable stream (GridFSBucketWriteStream) for writing + * buffers to GridFS for a custom file id. The stream's 'id' property contains the resulting + * file's id. + */ + openUploadStreamWithId( + id: ObjectId, + filename: string, + options?: GridFSBucketWriteStreamOptions + ): GridFSBucketWriteStream { + return new GridFSBucketWriteStream(this, filename, { ...options, id }); + } + + /** Returns a readable stream (GridFSBucketReadStream) for streaming file data from GridFS. */ + openDownloadStream( + id: ObjectId, + options?: GridFSBucketReadStreamOptions + ): GridFSBucketReadStream { + return new GridFSBucketReadStream( + this.s._chunksCollection, + this.s._filesCollection, + this.s.options.readPreference, + { _id: id }, + options + ); + } + + /** + * Deletes a file with the given id + * + * @param id - The id of the file doc + */ + delete(id: ObjectId): Promise; + delete(id: ObjectId, callback: Callback): void; + delete(id: ObjectId, callback?: Callback): Promise | void { + return executeLegacyOperation(getTopology(this.s.db), _delete, [this, id, callback], { + skipSessions: true + }); + } + + /** Convenience wrapper around find on the files collection */ + find(filter?: Filter, options?: FindOptions): FindCursor { + filter ??= {}; + options = options ?? {}; + return this.s._filesCollection.find(filter, options); + } + + /** + * Returns a readable stream (GridFSBucketReadStream) for streaming the + * file with the given name from GridFS. If there are multiple files with + * the same name, this will stream the most recent file with the given name + * (as determined by the `uploadDate` field). You can set the `revision` + * option to change this behavior. 
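For instance (file and path names are placeholders), revision: -1 below selects the newest file with the given name, while revision: 0 would select the oldest:

import { GridFSBucket } from 'mongodb';
import type { Db } from 'mongodb';
import { createWriteStream } from 'fs';

function downloadLatest(db: Db): void {
  const bucket = new GridFSBucket(db);
  const source = bucket.openDownloadStreamByName('summary.pdf', { revision: -1 });
  source.on('error', err => console.error('download failed', err));
  source
    .pipe(createWriteStream('/tmp/summary-latest.pdf'))
    .on('finish', () => console.log('latest revision written'));
}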
+ */ + openDownloadStreamByName( + filename: string, + options?: GridFSBucketReadStreamOptionsWithRevision + ): GridFSBucketReadStream { + let sort: Sort = { uploadDate: -1 }; + let skip = undefined; + if (options && options.revision != null) { + if (options.revision >= 0) { + sort = { uploadDate: 1 }; + skip = options.revision; + } else { + skip = -options.revision - 1; + } + } + return new GridFSBucketReadStream( + this.s._chunksCollection, + this.s._filesCollection, + this.s.options.readPreference, + { filename }, + { ...options, sort, skip } + ); + } + + /** + * Renames the file with the given _id to the given string + * + * @param id - the id of the file to rename + * @param filename - new name for the file + */ + rename(id: ObjectId, filename: string): Promise; + rename(id: ObjectId, filename: string, callback: Callback): void; + rename(id: ObjectId, filename: string, callback?: Callback): Promise | void { + return executeLegacyOperation(getTopology(this.s.db), _rename, [this, id, filename, callback], { + skipSessions: true + }); + } + + /** Removes this bucket's files collection, followed by its chunks collection. */ + drop(): Promise; + drop(callback: Callback): void; + drop(callback?: Callback): Promise | void { + return executeLegacyOperation(getTopology(this.s.db), _drop, [this, callback], { + skipSessions: true + }); + } + + /** Get the Db scoped logger. */ + getLogger(): Logger { + return this.s.db.s.logger; + } +} + +function _delete(bucket: GridFSBucket, id: ObjectId, callback: Callback): void { + return bucket.s._filesCollection.deleteOne({ _id: id }, (error, res) => { + if (error) { + return callback(error); + } + + return bucket.s._chunksCollection.deleteMany({ files_id: id }, error => { + if (error) { + return callback(error); + } + + // Delete orphaned chunks before returning FileNotFound + if (!res?.deletedCount) { + // TODO(NODE-3483): Replace with more appropriate error + // Consider creating new error MongoGridFSFileNotFoundError + return callback(new MongoRuntimeError(`File not found for id ${id}`)); + } + + return callback(); + }); + }); +} + +function _rename( + bucket: GridFSBucket, + id: ObjectId, + filename: string, + callback: Callback +): void { + const filter = { _id: id }; + const update = { $set: { filename } }; + return bucket.s._filesCollection.updateOne(filter, update, (error?, res?) 
=> { + if (error) { + return callback(error); + } + + if (!res?.matchedCount) { + return callback(new MongoRuntimeError(`File with id ${id} not found`)); + } + + return callback(); + }); +} + +function _drop(bucket: GridFSBucket, callback: Callback): void { + return bucket.s._filesCollection.drop((error?: Error) => { + if (error) { + return callback(error); + } + return bucket.s._chunksCollection.drop((error?: Error) => { + if (error) { + return callback(error); + } + + return callback(); + }); + }); +} diff --git a/node_modules/mongodb/src/gridfs/upload.ts b/node_modules/mongodb/src/gridfs/upload.ts new file mode 100644 index 000000000..da812994b --- /dev/null +++ b/node_modules/mongodb/src/gridfs/upload.ts @@ -0,0 +1,563 @@ +import { Writable } from 'stream'; +import type { Document } from '../bson'; +import { ObjectId } from '../bson'; +import type { Collection } from '../collection'; +import { AnyError, MONGODB_ERROR_CODES, MongoError, MongoAPIError } from '../error'; +import { Callback, maybePromise } from '../utils'; +import type { WriteConcernOptions } from '../write_concern'; +import { WriteConcern } from './../write_concern'; +import type { GridFSFile } from './download'; +import type { GridFSBucket } from './index'; + +/** @public */ +export interface GridFSChunk { + _id: ObjectId; + files_id: ObjectId; + n: number; + data: Buffer | Uint8Array; +} + +/** @public */ +export interface GridFSBucketWriteStreamOptions extends WriteConcernOptions { + /** Overwrite this bucket's chunkSizeBytes for this file */ + chunkSizeBytes?: number; + /** Custom file id for the GridFS file. */ + id?: ObjectId; + /** Object to store in the file document's `metadata` field */ + metadata?: Document; + /** String to store in the file document's `contentType` field */ + contentType?: string; + /** Array of strings to store in the file document's `aliases` field */ + aliases?: string[]; +} + +/** + * A writable stream that enables you to write buffers to GridFS. + * + * Do not instantiate this class directly. Use `openUploadStream()` instead. + * @public + */ +export class GridFSBucketWriteStream extends Writable implements NodeJS.WritableStream { + bucket: GridFSBucket; + chunks: Collection; + filename: string; + files: Collection; + options: GridFSBucketWriteStreamOptions; + done: boolean; + id: ObjectId; + chunkSizeBytes: number; + bufToStore: Buffer; + length: number; + n: number; + pos: number; + state: { + streamEnd: boolean; + outstandingRequests: number; + errored: boolean; + aborted: boolean; + }; + writeConcern?: WriteConcern; + + /** @event */ + static readonly CLOSE = 'close'; + /** @event */ + static readonly ERROR = 'error'; + /** + * `end()` was called and the write stream successfully wrote the file metadata and all the chunks to MongoDB. + * @event + */ + static readonly FINISH = 'finish'; + + /** @internal + * @param bucket - Handle for this stream's corresponding bucket + * @param filename - The value of the 'filename' key in the files doc + * @param options - Optional settings. + */ + constructor(bucket: GridFSBucket, filename: string, options?: GridFSBucketWriteStreamOptions) { + super(); + + options = options ?? {}; + this.bucket = bucket; + this.chunks = bucket.s._chunksCollection; + this.filename = filename; + this.files = bucket.s._filesCollection; + this.options = options; + this.writeConcern = WriteConcern.fromOptions(options) || bucket.s.options.writeConcern; + // Signals the write is all done + this.done = false; + + this.id = options.id ? 
options.id : new ObjectId(); + // properly inherit the default chunksize from parent + this.chunkSizeBytes = options.chunkSizeBytes || this.bucket.s.options.chunkSizeBytes; + this.bufToStore = Buffer.alloc(this.chunkSizeBytes); + this.length = 0; + this.n = 0; + this.pos = 0; + this.state = { + streamEnd: false, + outstandingRequests: 0, + errored: false, + aborted: false + }; + + if (!this.bucket.s.calledOpenUploadStream) { + this.bucket.s.calledOpenUploadStream = true; + + checkIndexes(this, () => { + this.bucket.s.checkedIndexes = true; + this.bucket.emit('index'); + }); + } + } + + /** + * Write a buffer to the stream. + * + * @param chunk - Buffer to write + * @param encodingOrCallback - Optional encoding for the buffer + * @param callback - Function to call when the chunk was added to the buffer, or if the entire chunk was persisted to MongoDB if this chunk caused a flush. + * @returns False if this write required flushing a chunk to MongoDB. True otherwise. + */ + write(chunk: Buffer | string): boolean; + write(chunk: Buffer | string, callback: Callback): boolean; + write(chunk: Buffer | string, encoding: BufferEncoding | undefined): boolean; + write( + chunk: Buffer | string, + encoding: BufferEncoding | undefined, + callback: Callback + ): boolean; + write( + chunk: Buffer | string, + encodingOrCallback?: Callback | BufferEncoding, + callback?: Callback + ): boolean { + const encoding = typeof encodingOrCallback === 'function' ? undefined : encodingOrCallback; + callback = typeof encodingOrCallback === 'function' ? encodingOrCallback : callback; + return waitForIndexes(this, () => doWrite(this, chunk, encoding, callback)); + } + + // TODO(NODE-3405): Refactor this with maybePromise and MongoStreamClosedError + /** + * Places this write stream into an aborted state (all future writes fail) + * and deletes all chunks that have already been written. + * + * @param callback - called when chunks are successfully removed or error occurred + */ + abort(): Promise; + abort(callback: Callback): void; + abort(callback?: Callback): Promise | void { + return maybePromise(callback, callback => { + if (this.state.streamEnd) { + // TODO(NODE-3485): Replace with MongoGridFSStreamClosed + return callback(new MongoAPIError('Cannot abort a stream that has already completed')); + } + + if (this.state.aborted) { + // TODO(NODE-3485): Replace with MongoGridFSStreamClosed + return callback(new MongoAPIError('Cannot call abort() on a stream twice')); + } + + this.state.aborted = true; + this.chunks.deleteMany({ files_id: this.id }, error => callback(error)); + }); + } + + /** + * Tells the stream that no more data will be coming in. The stream will + * persist the remaining data to MongoDB, write the files document, and + * then emit a 'finish' event. + * + * @param chunk - Buffer to write + * @param encoding - Optional encoding for the buffer + * @param callback - Function to call when all files and chunks have been persisted to MongoDB + */ + end(): void; + end(chunk: Buffer): void; + end(callback: Callback): void; + end(chunk: Buffer, callback: Callback): void; + end(chunk: Buffer, encoding: BufferEncoding): void; + end( + chunk: Buffer, + encoding: BufferEncoding | undefined, + callback: Callback + ): void; + end( + chunkOrCallback?: Buffer | Callback, + encodingOrCallback?: BufferEncoding | Callback, + callback?: Callback + ): void { + const chunk = typeof chunkOrCallback === 'function' ? undefined : chunkOrCallback; + const encoding = typeof encodingOrCallback === 'function' ? 
undefined : encodingOrCallback; + callback = + typeof chunkOrCallback === 'function' + ? chunkOrCallback + : typeof encodingOrCallback === 'function' + ? encodingOrCallback + : callback; + + if (checkAborted(this, callback)) return; + + this.state.streamEnd = true; + + if (callback) { + this.once(GridFSBucketWriteStream.FINISH, (result: GridFSFile) => { + if (callback) callback(undefined, result); + }); + } + + if (!chunk) { + waitForIndexes(this, () => !!writeRemnant(this)); + return; + } + + this.write(chunk, encoding, () => { + writeRemnant(this); + }); + } +} + +function __handleError( + stream: GridFSBucketWriteStream, + error: AnyError, + callback?: Callback +): void { + if (stream.state.errored) { + return; + } + stream.state.errored = true; + if (callback) { + return callback(error); + } + stream.emit(GridFSBucketWriteStream.ERROR, error); +} + +function createChunkDoc(filesId: ObjectId, n: number, data: Buffer): GridFSChunk { + return { + _id: new ObjectId(), + files_id: filesId, + n, + data + }; +} + +function checkChunksIndex(stream: GridFSBucketWriteStream, callback: Callback): void { + stream.chunks.listIndexes().toArray((error?: AnyError, indexes?: Document[]) => { + let index: { files_id: number; n: number }; + if (error) { + // Collection doesn't exist so create index + if (error instanceof MongoError && error.code === MONGODB_ERROR_CODES.NamespaceNotFound) { + index = { files_id: 1, n: 1 }; + stream.chunks.createIndex(index, { background: false, unique: true }, error => { + if (error) { + return callback(error); + } + + callback(); + }); + return; + } + return callback(error); + } + + let hasChunksIndex = false; + if (indexes) { + indexes.forEach((index: Document) => { + if (index.key) { + const keys = Object.keys(index.key); + if (keys.length === 2 && index.key.files_id === 1 && index.key.n === 1) { + hasChunksIndex = true; + } + } + }); + } + + if (hasChunksIndex) { + callback(); + } else { + index = { files_id: 1, n: 1 }; + const writeConcernOptions = getWriteOptions(stream); + + stream.chunks.createIndex( + index, + { + ...writeConcernOptions, + background: true, + unique: true + }, + callback + ); + } + }); +} + +function checkDone(stream: GridFSBucketWriteStream, callback?: Callback): boolean { + if (stream.done) return true; + if (stream.state.streamEnd && stream.state.outstandingRequests === 0 && !stream.state.errored) { + // Set done so we do not trigger duplicate createFilesDoc + stream.done = true; + // Create a new files doc + const filesDoc = createFilesDoc( + stream.id, + stream.length, + stream.chunkSizeBytes, + stream.filename, + stream.options.contentType, + stream.options.aliases, + stream.options.metadata + ); + + if (checkAborted(stream, callback)) { + return false; + } + + stream.files.insertOne(filesDoc, getWriteOptions(stream), (error?: AnyError) => { + if (error) { + return __handleError(stream, error, callback); + } + stream.emit(GridFSBucketWriteStream.FINISH, filesDoc); + stream.emit(GridFSBucketWriteStream.CLOSE); + }); + + return true; + } + + return false; +} + +function checkIndexes(stream: GridFSBucketWriteStream, callback: Callback): void { + stream.files.findOne({}, { projection: { _id: 1 } }, (error, doc) => { + if (error) { + return callback(error); + } + if (doc) { + return callback(); + } + + stream.files.listIndexes().toArray((error?: AnyError, indexes?: Document) => { + let index: { filename: number; uploadDate: number }; + if (error) { + // Collection doesn't exist so create index + if (error instanceof MongoError && error.code === 
MONGODB_ERROR_CODES.NamespaceNotFound) { + index = { filename: 1, uploadDate: 1 }; + stream.files.createIndex(index, { background: false }, (error?: AnyError) => { + if (error) { + return callback(error); + } + + checkChunksIndex(stream, callback); + }); + return; + } + return callback(error); + } + + let hasFileIndex = false; + if (indexes) { + indexes.forEach((index: Document) => { + const keys = Object.keys(index.key); + if (keys.length === 2 && index.key.filename === 1 && index.key.uploadDate === 1) { + hasFileIndex = true; + } + }); + } + + if (hasFileIndex) { + checkChunksIndex(stream, callback); + } else { + index = { filename: 1, uploadDate: 1 }; + + const writeConcernOptions = getWriteOptions(stream); + + stream.files.createIndex( + index, + { + ...writeConcernOptions, + background: false + }, + (error?: AnyError) => { + if (error) { + return callback(error); + } + + checkChunksIndex(stream, callback); + } + ); + } + }); + }); +} + +function createFilesDoc( + _id: ObjectId, + length: number, + chunkSize: number, + filename: string, + contentType?: string, + aliases?: string[], + metadata?: Document +): GridFSFile { + const ret: GridFSFile = { + _id, + length, + chunkSize, + uploadDate: new Date(), + filename + }; + + if (contentType) { + ret.contentType = contentType; + } + + if (aliases) { + ret.aliases = aliases; + } + + if (metadata) { + ret.metadata = metadata; + } + + return ret; +} + +function doWrite( + stream: GridFSBucketWriteStream, + chunk: Buffer | string, + encoding?: BufferEncoding, + callback?: Callback +): boolean { + if (checkAborted(stream, callback)) { + return false; + } + + const inputBuf = Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk, encoding); + + stream.length += inputBuf.length; + + // Input is small enough to fit in our buffer + if (stream.pos + inputBuf.length < stream.chunkSizeBytes) { + inputBuf.copy(stream.bufToStore, stream.pos); + stream.pos += inputBuf.length; + + callback && callback(); + + // Note that we reverse the typical semantics of write's return value + // to be compatible with node's `.pipe()` function. + // True means client can keep writing. + return true; + } + + // Otherwise, buffer is too big for current chunk, so we need to flush + // to MongoDB. 
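+ //
+ // Illustrative walk-through (added for clarity, not from the upstream source;
+ // 255 KiB is assumed here as the driver's default chunkSizeBytes): a single
+ // 600 KiB write starting at pos 0 fills and flushes two full 255 KiB chunks
+ // (n = 0 and n = 1), then leaves the remaining ~90 KiB buffered in bufToStore
+ // until a later write() or end() call flushes it via writeRemnant().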
+ let inputBufRemaining = inputBuf.length; + let spaceRemaining: number = stream.chunkSizeBytes - stream.pos; + let numToCopy = Math.min(spaceRemaining, inputBuf.length); + let outstandingRequests = 0; + while (inputBufRemaining > 0) { + const inputBufPos = inputBuf.length - inputBufRemaining; + inputBuf.copy(stream.bufToStore, stream.pos, inputBufPos, inputBufPos + numToCopy); + stream.pos += numToCopy; + spaceRemaining -= numToCopy; + let doc: GridFSChunk; + if (spaceRemaining === 0) { + doc = createChunkDoc(stream.id, stream.n, Buffer.from(stream.bufToStore)); + ++stream.state.outstandingRequests; + ++outstandingRequests; + + if (checkAborted(stream, callback)) { + return false; + } + + stream.chunks.insertOne(doc, getWriteOptions(stream), (error?: AnyError) => { + if (error) { + return __handleError(stream, error); + } + --stream.state.outstandingRequests; + --outstandingRequests; + + if (!outstandingRequests) { + stream.emit('drain', doc); + callback && callback(); + checkDone(stream); + } + }); + + spaceRemaining = stream.chunkSizeBytes; + stream.pos = 0; + ++stream.n; + } + inputBufRemaining -= numToCopy; + numToCopy = Math.min(spaceRemaining, inputBufRemaining); + } + + // Note that we reverse the typical semantics of write's return value + // to be compatible with node's `.pipe()` function. + // False means the client should wait for the 'drain' event. + return false; +} + +function getWriteOptions(stream: GridFSBucketWriteStream): WriteConcernOptions { + const obj: WriteConcernOptions = {}; + if (stream.writeConcern) { + obj.writeConcern = { + w: stream.writeConcern.w, + wtimeout: stream.writeConcern.wtimeout, + j: stream.writeConcern.j + }; + } + return obj; +} + +function waitForIndexes( + stream: GridFSBucketWriteStream, + callback: (res: boolean) => boolean +): boolean { + if (stream.bucket.s.checkedIndexes) { + return callback(false); + } + + stream.bucket.once('index', () => { + callback(true); + }); + + return true; +} + +function writeRemnant(stream: GridFSBucketWriteStream, callback?: Callback): boolean { + // Buffer is empty, so don't bother to insert + if (stream.pos === 0) { + return checkDone(stream, callback); + } + + ++stream.state.outstandingRequests; + + // Create a new buffer to make sure the buffer isn't bigger than it needs + // to be. 
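+ // bufToStore is a fixed-size buffer that is reused for every chunk, so only
+ // its first `pos` bytes are copied into a right-sized `remnant` buffer before
+ // being written out as the final, shorter chunk.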
+ const remnant = Buffer.alloc(stream.pos); + stream.bufToStore.copy(remnant, 0, 0, stream.pos); + const doc = createChunkDoc(stream.id, stream.n, remnant); + + // If the stream was aborted, do not write remnant + if (checkAborted(stream, callback)) { + return false; + } + + stream.chunks.insertOne(doc, getWriteOptions(stream), (error?: AnyError) => { + if (error) { + return __handleError(stream, error); + } + --stream.state.outstandingRequests; + checkDone(stream); + }); + return true; +} + +function checkAborted(stream: GridFSBucketWriteStream, callback?: Callback): boolean { + if (stream.state.aborted) { + if (typeof callback === 'function') { + // TODO(NODE-3485): Replace with MongoGridFSStreamClosedError + callback(new MongoAPIError('Stream has been aborted')); + } + return true; + } + return false; +} diff --git a/node_modules/mongodb/src/index.ts b/node_modules/mongodb/src/index.ts new file mode 100644 index 000000000..4a80f643b --- /dev/null +++ b/node_modules/mongodb/src/index.ts @@ -0,0 +1,435 @@ +import { AbstractCursor } from './cursor/abstract_cursor'; +import { AggregationCursor } from './cursor/aggregation_cursor'; +import { FindCursor } from './cursor/find_cursor'; +import { ListIndexesCursor } from './operations/indexes'; +import { ListCollectionsCursor } from './operations/list_collections'; +import { PromiseProvider } from './promise_provider'; +import { Admin } from './admin'; +import { MongoClient } from './mongo_client'; +import { Db } from './db'; +import { Collection } from './collection'; +import { Logger } from './logger'; +import { GridFSBucket } from './gridfs'; +import { CancellationToken } from './mongo_types'; + +export { + Binary, + Code, + DBRef, + Double, + Int32, + Long, + MinKey, + MaxKey, + ObjectId, + Timestamp, + Decimal128, + BSONRegExp, + BSONSymbol, + Map +} from './bson'; + +import { ObjectId } from 'bson'; +/** + * @public + * @deprecated Please use `ObjectId` + */ +export const ObjectID = ObjectId; + +export { + MongoError, + MongoServerError, + MongoDriverError, + MongoAPIError, + MongoCompatibilityError, + MongoInvalidArgumentError, + MongoMissingCredentialsError, + MongoMissingDependencyError, + MongoNetworkError, + MongoNetworkTimeoutError, + MongoSystemError, + MongoServerSelectionError, + MongoParseError, + MongoWriteConcernError, + MongoRuntimeError, + MongoChangeStreamError, + MongoGridFSStreamError, + MongoGridFSChunkError, + MongoDecompressionError, + MongoBatchReExecutionError, + MongoCursorExhaustedError, + MongoCursorInUseError, + MongoNotConnectedError, + MongoExpiredSessionError, + MongoTransactionError, + MongoKerberosError, + MongoServerClosedError, + MongoTopologyClosedError +} from './error'; +export { MongoBulkWriteError, BulkWriteOptions, AnyBulkWriteOperation } from './bulk/common'; +export { + // Utils + PromiseProvider as Promise, + // Actual driver classes exported + Admin, + MongoClient, + Db, + Collection, + Logger, + AbstractCursor, + AggregationCursor, + FindCursor, + ListIndexesCursor, + ListCollectionsCursor, + GridFSBucket, + CancellationToken +}; + +// enums +export { ProfilingLevel } from './operations/set_profiling_level'; +export { ServerType, TopologyType } from './sdam/common'; +export { LoggerLevel } from './logger'; +export { AutoEncryptionLoggerLevel } from './deps'; +export { BatchType } from './bulk/common'; +export { AuthMechanism } from './cmap/auth/defaultAuthProviders'; +export { CURSOR_FLAGS } from './cursor/abstract_cursor'; +export { Compressor } from './cmap/wire_protocol/compression'; +export { 
ReturnDocument } from './operations/find_and_modify'; +export { ExplainVerbosity } from './explain'; +export { ReadConcernLevel } from './read_concern'; +export { ReadPreferenceMode } from './read_preference'; +export { ServerApiVersion } from './mongo_client'; +export { BSONType } from './mongo_types'; + +// Helper classes +export { WriteConcern } from './write_concern'; +export { ReadConcern } from './read_concern'; +export { ReadPreference } from './read_preference'; + +// events +export { + CommandStartedEvent, + CommandSucceededEvent, + CommandFailedEvent +} from './cmap/command_monitoring_events'; +export { + ConnectionCheckOutFailedEvent, + ConnectionCheckOutStartedEvent, + ConnectionCheckedInEvent, + ConnectionCheckedOutEvent, + ConnectionClosedEvent, + ConnectionCreatedEvent, + ConnectionPoolClearedEvent, + ConnectionPoolClosedEvent, + ConnectionPoolCreatedEvent, + ConnectionPoolMonitoringEvent, + ConnectionReadyEvent +} from './cmap/connection_pool_events'; +export { + ServerHeartbeatStartedEvent, + ServerHeartbeatSucceededEvent, + ServerHeartbeatFailedEvent, + ServerClosedEvent, + ServerDescriptionChangedEvent, + ServerOpeningEvent, + TopologyClosedEvent, + TopologyDescriptionChangedEvent, + TopologyOpeningEvent +} from './sdam/events'; +export { SrvPollingEvent } from './sdam/srv_polling'; + +// type only exports below, these are removed from emitted JS +export type { AdminPrivate } from './admin'; +export type { Document, BSONSerializeOptions } from './bson'; +export type { + InsertOneModel, + ReplaceOneModel, + UpdateOneModel, + UpdateManyModel, + DeleteOneModel, + DeleteManyModel, + BulkResult, + BulkWriteResult, + WriteError, + WriteConcernError, + BulkWriteOperationError +} from './bulk/common'; +export type { + ChangeStream, + ChangeStreamDocument, + UpdateDescription, + ChangeStreamEvents, + ChangeStreamOptions, + ChangeStreamCursor, + ResumeToken, + PipeOptions, + ChangeStreamCursorOptions, + OperationTime, + ResumeOptions +} from './change_stream'; +export type { + MongoCredentials, + AuthMechanismProperties, + MongoCredentialsOptions +} from './cmap/auth/mongo_credentials'; +export type { + WriteProtocolMessageType, + Query, + GetMore, + Msg, + KillCursor, + OpGetMoreOptions, + OpQueryOptions +} from './cmap/commands'; +export type { Stream, LEGAL_TLS_SOCKET_OPTIONS, LEGAL_TCP_SOCKET_OPTIONS } from './cmap/connect'; +export type { + Connection, + ConnectionOptions, + DestroyOptions, + CommandOptions, + QueryOptions, + GetMoreOptions, + ConnectionEvents +} from './cmap/connection'; +export type { ConnectionPoolMetrics } from './cmap/metrics'; +export type { + CloseOptions, + ConnectionPoolOptions, + WaitQueueMember, + WithConnectionCallback, + ConnectionPool, + ConnectionPoolEvents +} from './cmap/connection_pool'; +export type { + OperationDescription, + MessageStream, + MessageStreamOptions +} from './cmap/message_stream'; +export type { StreamDescription, StreamDescriptionOptions } from './cmap/stream_description'; +export type { CompressorName } from './cmap/wire_protocol/compression'; +export type { CollectionPrivate, CollectionOptions, ModifyResult } from './collection'; +export type { AggregationCursorOptions } from './cursor/aggregation_cursor'; +export type { + CursorCloseOptions, + CursorStreamOptions, + AbstractCursorOptions, + AbstractCursorEvents, + CursorFlag +} from './cursor/abstract_cursor'; +export type { DbPrivate, DbOptions } from './db'; +export type { AutoEncryptionOptions, AutoEncrypter } from './deps'; +export type { AnyError, ErrorDescription, 
MongoNetworkErrorOptions } from './error'; +export type { Explain, ExplainOptions, ExplainVerbosityLike } from './explain'; +export type { + GridFSBucketReadStream, + GridFSBucketReadStreamOptions, + GridFSBucketReadStreamOptionsWithRevision, + GridFSBucketReadStreamPrivate, + GridFSFile +} from './gridfs/download'; +export type { GridFSBucketOptions, GridFSBucketPrivate, GridFSBucketEvents } from './gridfs/index'; +export type { + GridFSBucketWriteStreamOptions, + GridFSBucketWriteStream, + GridFSChunk +} from './gridfs/upload'; +export type { LoggerOptions, LoggerFunction } from './logger'; +export type { + MongoClientEvents, + MongoClientPrivate, + MongoClientOptions, + WithSessionCallback, + PkFactory, + Auth, + DriverInfo, + MongoOptions, + ServerApi, + SupportedNodeConnectionOptions, + SupportedTLSConnectionOptions, + SupportedTLSSocketOptions, + SupportedSocketOptions +} from './mongo_client'; +export type { + TypedEventEmitter, + EventsDescription, + CommonEvents, + GenericListener +} from './mongo_types'; +export type { AddUserOptions, RoleSpecification } from './operations/add_user'; +export type { + AggregateOptions, + AggregateOperation, + DB_AGGREGATE_COLLECTION +} from './operations/aggregate'; +export type { MONGO_CLIENT_EVENTS } from './operations/connect'; +export type { + CommandOperationOptions, + OperationParent, + CommandOperation, + CollationOptions +} from './operations/command'; +export type { IndexInformationOptions } from './operations/common_functions'; +export type { CountOptions } from './operations/count'; +export type { CountDocumentsOptions } from './operations/count_documents'; +export type { + CreateCollectionOptions, + TimeSeriesCollectionOptions +} from './operations/create_collection'; +export type { DeleteOptions, DeleteResult, DeleteStatement } from './operations/delete'; +export type { DistinctOptions } from './operations/distinct'; +export type { DropCollectionOptions, DropDatabaseOptions } from './operations/drop'; +export type { EstimatedDocumentCountOptions } from './operations/estimated_document_count'; +export type { EvalOptions } from './operations/eval'; +export type { FindOptions } from './operations/find'; +export type { Sort, SortDirection, SortDirectionForCmd, SortForCmd } from './sort'; +export type { + FindOneAndDeleteOptions, + FindOneAndReplaceOptions, + FindOneAndUpdateOptions +} from './operations/find_and_modify'; +export type { + IndexSpecification, + CreateIndexesOptions, + IndexDescription, + DropIndexesOptions, + ListIndexesOptions, + IndexDirection +} from './operations/indexes'; +export type { InsertOneResult, InsertOneOptions, InsertManyResult } from './operations/insert'; +export type { ListCollectionsOptions, CollectionInfo } from './operations/list_collections'; +export type { ListDatabasesResult, ListDatabasesOptions } from './operations/list_databases'; +export type { + MapFunction, + ReduceFunction, + MapReduceOptions, + FinalizeFunction +} from './operations/map_reduce'; +export type { Hint, OperationOptions, AbstractOperation } from './operations/operation'; +export type { ProfilingLevelOptions } from './operations/profiling_level'; +export type { RemoveUserOptions } from './operations/remove_user'; +export type { RenameOptions } from './operations/rename'; +export type { RunCommandOptions } from './operations/run_command'; +export type { SetProfilingLevelOptions } from './operations/set_profiling_level'; +export type { + CollStatsOptions, + DbStatsOptions, + CollStats, + WiredTigerData +} from './operations/stats'; 
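The entry point above re-exports the GridFS and client classes added earlier in this diff. As a rough sketch of how application code might consume that public API (the connection URI, the `taskAttachments` bucket name, and the file names are hypothetical, and error propagation is simplified):

```ts
import { createReadStream, createWriteStream } from 'fs';
import { MongoClient, GridFSBucket } from 'mongodb';

async function storeAndFetch(uri: string): Promise<void> {
  const client = new MongoClient(uri);
  await client.connect();
  try {
    const db = client.db('myFirstDatabase');
    // bucketName is illustrative; omitting it falls back to the driver default
    const bucket = new GridFSBucket(db, { bucketName: 'taskAttachments' });

    // Upload: pipe a local file into GridFS and wait for the write stream's
    // 'finish' event, which fires once all chunks and the files doc are stored
    await new Promise<void>((resolve, reject) => {
      createReadStream('./notes.txt')
        .pipe(bucket.openUploadStream('notes.txt', { metadata: { owner: 'demo' } }))
        .on('error', reject)
        .on('finish', () => resolve());
    });

    // Download: stream the most recent revision of the file back to disk
    await new Promise<void>((resolve, reject) => {
      bucket.openDownloadStreamByName('notes.txt')
        .pipe(createWriteStream('./notes-copy.txt'))
        .on('error', reject)
        .on('finish', () => resolve());
    });
  } finally {
    await client.close();
  }
}
```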
+export type { + UpdateResult, + UpdateOptions, + ReplaceOptions, + UpdateStatement +} from './operations/update'; +export type { ValidateCollectionOptions } from './operations/validate_collection'; +export type { ReadConcernLike } from './read_concern'; +export type { + ReadPreferenceLike, + ReadPreferenceOptions, + ReadPreferenceLikeOptions, + ReadPreferenceFromOptions, + HedgeOptions +} from './read_preference'; +export type { ClusterTime, TimerQueue } from './sdam/common'; +export type { + Monitor, + MonitorEvents, + MonitorPrivate, + MonitorOptions, + RTTPinger, + RTTPingerOptions +} from './sdam/monitor'; +export type { Server, ServerEvents, ServerPrivate, ServerOptions } from './sdam/server'; +export type { + TopologyVersion, + TagSet, + ServerDescription, + ServerDescriptionOptions +} from './sdam/server_description'; +export type { ServerSelector } from './sdam/server_selection'; +export type { SrvPoller, SrvPollerEvents, SrvPollerOptions } from './sdam/srv_polling'; +export type { + Topology, + TopologyEvents, + TopologyPrivate, + ServerSelectionRequest, + TopologyOptions, + ServerCapabilities, + ConnectOptions, + SelectServerOptions, + ServerSelectionCallback +} from './sdam/topology'; +export type { TopologyDescription, TopologyDescriptionOptions } from './sdam/topology_description'; +export type { + ClientSession, + ClientSessionEvents, + ClientSessionOptions, + EndSessionOptions, + ServerSessionPool, + ServerSession, + ServerSessionId, + WithTransactionCallback +} from './sessions'; +export type { TransactionOptions, Transaction, TxnState } from './transactions'; +export type { + Callback, + ClientMetadata, + ClientMetadataOptions, + MongoDBNamespace, + InterruptibleAsyncInterval, + BufferPool, + HostAddress, + EventEmitterWithState +} from './utils'; +export type { W, WriteConcernOptions, WriteConcernSettings } from './write_concern'; +export type { ExecutionResult } from './operations/execute_operation'; +export type { InternalAbstractCursorOptions } from './cursor/abstract_cursor'; +export type { + BulkOperationBase, + BulkOperationPrivate, + FindOperators, + Batch, + WriteConcernErrorData +} from './bulk/common'; +export type { OrderedBulkOperation } from './bulk/ordered'; +export type { UnorderedBulkOperation } from './bulk/unordered'; +export type { Encrypter, EncrypterOptions } from './encrypter'; +export type { + EnhancedOmit, + WithId, + OptionalId, + WithoutId, + UpdateFilter, + Filter, + Projection, + InferIdType, + ProjectionOperators, + Flatten, + SchemaMember, + Condition, + RootFilterOperators, + AlternativeType, + FilterOperators, + BSONTypeAlias, + BitwiseFilter, + RegExpOrString, + OnlyFieldsOfType, + NumericType, + IntegerType, + MatchKeysAndValues, + SetFields, + PullOperator, + PushOperator, + PullAllOperator, + AcceptedFields, + NotAcceptedFields, + AddToSetOperators, + ArrayOperator, + FilterOperations, + KeysOfAType, + KeysOfOtherType, + IsAny, + OneOrMore +} from './mongo_types'; +export type { serialize, deserialize } from './bson'; diff --git a/node_modules/mongodb/src/logger.ts b/node_modules/mongodb/src/logger.ts new file mode 100644 index 000000000..e51540215 --- /dev/null +++ b/node_modules/mongodb/src/logger.ts @@ -0,0 +1,264 @@ +import { format } from 'util'; +import { enumToString } from './utils'; +import { MongoInvalidArgumentError } from './error'; + +// Filters for classes +const classFilters: any = {}; +let filteredClasses: any = {}; +let level: LoggerLevel; + +// Save the process id +const pid = process.pid; + +// current logger +// 
eslint-disable-next-line no-console +let currentLogger: LoggerFunction = console.warn; + +/** @public */ +export const LoggerLevel = Object.freeze({ + ERROR: 'error', + WARN: 'warn', + INFO: 'info', + DEBUG: 'debug', + error: 'error', + warn: 'warn', + info: 'info', + debug: 'debug' +} as const); + +/** @public */ +export type LoggerLevel = typeof LoggerLevel[keyof typeof LoggerLevel]; + +/** @public */ +export type LoggerFunction = (message?: any, ...optionalParams: any[]) => void; + +/** @public */ +export interface LoggerOptions { + logger?: LoggerFunction; + loggerLevel?: LoggerLevel; +} + +/** + * @public + */ +export class Logger { + className: string; + + /** + * Creates a new Logger instance + * + * @param className - The Class name associated with the logging instance + * @param options - Optional logging settings + */ + constructor(className: string, options?: LoggerOptions) { + options = options ?? {}; + + // Current reference + this.className = className; + + // Current logger + if (!(options.logger instanceof Logger) && typeof options.logger === 'function') { + currentLogger = options.logger; + } + + // Set level of logging, default is error + if (options.loggerLevel) { + level = options.loggerLevel || LoggerLevel.ERROR; + } + + // Add all class names + if (filteredClasses[this.className] == null) { + classFilters[this.className] = true; + } + } + + /** + * Log a message at the debug level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + debug(message: string, object?: unknown): void { + if ( + this.isDebug() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className])) + ) { + const dateTime = new Date().getTime(); + const msg = format('[%s-%s:%s] %s %s', 'DEBUG', this.className, pid, dateTime, message); + const state = { + type: LoggerLevel.DEBUG, + message, + className: this.className, + pid, + date: dateTime + } as any; + + if (object) state.meta = object; + currentLogger(msg, state); + } + } + + /** + * Log a message at the warn level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + warn(message: string, object?: unknown): void { + if ( + this.isWarn() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className])) + ) { + const dateTime = new Date().getTime(); + const msg = format('[%s-%s:%s] %s %s', 'WARN', this.className, pid, dateTime, message); + const state = { + type: LoggerLevel.WARN, + message, + className: this.className, + pid, + date: dateTime + } as any; + + if (object) state.meta = object; + currentLogger(msg, state); + } + } + + /** + * Log a message at the info level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + info(message: string, object?: unknown): void { + if ( + this.isInfo() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className])) + ) { + const dateTime = new Date().getTime(); + const msg = format('[%s-%s:%s] %s %s', 'INFO', this.className, pid, dateTime, message); + const state = { + type: LoggerLevel.INFO, + message, + className: this.className, + pid, + date: dateTime + } as any; + + if (object) state.meta = object; + currentLogger(msg, state); + } + } + + /** + * Log a message at the error 
level + * + * @param message - The message to log + * @param object - Additional meta data to log + */ + error(message: string, object?: unknown): void { + if ( + this.isError() && + ((Object.keys(filteredClasses).length > 0 && filteredClasses[this.className]) || + (Object.keys(filteredClasses).length === 0 && classFilters[this.className])) + ) { + const dateTime = new Date().getTime(); + const msg = format('[%s-%s:%s] %s %s', 'ERROR', this.className, pid, dateTime, message); + const state = { + type: LoggerLevel.ERROR, + message, + className: this.className, + pid, + date: dateTime + } as any; + + if (object) state.meta = object; + currentLogger(msg, state); + } + } + + /** Is the logger set at info level */ + isInfo(): boolean { + return level === LoggerLevel.INFO || level === LoggerLevel.DEBUG; + } + + /** Is the logger set at error level */ + isError(): boolean { + return level === LoggerLevel.ERROR || level === LoggerLevel.INFO || level === LoggerLevel.DEBUG; + } + + /** Is the logger set at error level */ + isWarn(): boolean { + return ( + level === LoggerLevel.ERROR || + level === LoggerLevel.WARN || + level === LoggerLevel.INFO || + level === LoggerLevel.DEBUG + ); + } + + /** Is the logger set at debug level */ + isDebug(): boolean { + return level === LoggerLevel.DEBUG; + } + + /** Resets the logger to default settings, error and no filtered classes */ + static reset(): void { + level = LoggerLevel.ERROR; + filteredClasses = {}; + } + + /** Get the current logger function */ + static currentLogger(): LoggerFunction { + return currentLogger; + } + + /** + * Set the current logger function + * + * @param logger - Custom logging function + */ + static setCurrentLogger(logger: LoggerFunction): void { + if (typeof logger !== 'function') { + throw new MongoInvalidArgumentError('Current logger must be a function'); + } + + currentLogger = logger; + } + + /** + * Filter log messages for a particular class + * + * @param type - The type of filter (currently only class) + * @param values - The filters to apply + */ + static filter(type: string, values: string[]): void { + if (type === 'class' && Array.isArray(values)) { + filteredClasses = {}; + values.forEach(x => (filteredClasses[x] = true)); + } + } + + /** + * Set the current log level + * + * @param newLevel - Set current log level (debug, warn, info, error) + */ + static setLevel(newLevel: LoggerLevel): void { + if ( + newLevel !== LoggerLevel.INFO && + newLevel !== LoggerLevel.ERROR && + newLevel !== LoggerLevel.DEBUG && + newLevel !== LoggerLevel.WARN + ) { + throw new MongoInvalidArgumentError( + `Argument "newLevel" should be one of ${enumToString(LoggerLevel)}` + ); + } + + level = newLevel; + } +} diff --git a/node_modules/mongodb/src/mongo_client.ts b/node_modules/mongodb/src/mongo_client.ts new file mode 100644 index 000000000..b57563f51 --- /dev/null +++ b/node_modules/mongodb/src/mongo_client.ts @@ -0,0 +1,699 @@ +import { Db, DbOptions } from './db'; +import { ChangeStream, ChangeStreamOptions } from './change_stream'; +import type { ReadPreference, ReadPreferenceMode } from './read_preference'; +import { + AnyError, + MongoRuntimeError, + MongoInvalidArgumentError, + MongoNotConnectedError +} from './error'; +import type { W, WriteConcern } from './write_concern'; +import { + maybePromise, + MongoDBNamespace, + Callback, + resolveOptions, + ClientMetadata, + ns, + HostAddress +} from './utils'; +import { connect, MONGO_CLIENT_EVENTS } from './operations/connect'; +import { PromiseProvider } from './promise_provider'; 
+import type { Logger, LoggerLevel } from './logger'; +import type { ReadConcern, ReadConcernLevel, ReadConcernLike } from './read_concern'; +import { BSONSerializeOptions, Document, resolveBSONOptions } from './bson'; +import type { AutoEncrypter, AutoEncryptionOptions } from './deps'; +import type { AuthMechanism } from './cmap/auth/defaultAuthProviders'; +import type { Topology, TopologyEvents } from './sdam/topology'; +import type { ClientSession, ClientSessionOptions } from './sessions'; +import type { TagSet } from './sdam/server_description'; +import type { AuthMechanismProperties, MongoCredentials } from './cmap/auth/mongo_credentials'; +import { parseOptions } from './connection_string'; +import type { CompressorName } from './cmap/wire_protocol/compression'; +import type { TLSSocketOptions, ConnectionOptions as TLSConnectionOptions } from 'tls'; +import type { TcpNetConnectOpts } from 'net'; +import type { SrvPoller } from './sdam/srv_polling'; +import type { Connection } from './cmap/connection'; +import type { LEGAL_TLS_SOCKET_OPTIONS, LEGAL_TCP_SOCKET_OPTIONS } from './cmap/connect'; +import type { Encrypter } from './encrypter'; +import { TypedEventEmitter } from './mongo_types'; + +/** @public */ +export const ServerApiVersion = Object.freeze({ + v1: '1' +} as const); + +/** @public */ +export type ServerApiVersion = typeof ServerApiVersion[keyof typeof ServerApiVersion]; + +/** @public */ +export interface ServerApi { + version: ServerApiVersion; + strict?: boolean; + deprecationErrors?: boolean; +} + +/** @public */ +export interface DriverInfo { + name?: string; + version?: string; + platform?: string; +} + +/** @public */ +export interface Auth { + /** The username for auth */ + username?: string; + /** The password for auth */ + password?: string; +} + +/** @public */ +export interface PkFactory { + createPk(): any; // TODO: when js-bson is typed, function should return some BSON type +} + +type CleanUpHandlerFunction = (err?: AnyError, result?: any, opts?: any) => Promise; + +/** @public */ +export type SupportedTLSConnectionOptions = Pick< + TLSConnectionOptions, + Extract +>; + +/** @public */ +export type SupportedTLSSocketOptions = Pick< + TLSSocketOptions, + Extract +>; + +/** @public */ +export type SupportedSocketOptions = Pick< + TcpNetConnectOpts, + typeof LEGAL_TCP_SOCKET_OPTIONS[number] +>; + +/** @public */ +export type SupportedNodeConnectionOptions = SupportedTLSConnectionOptions & + SupportedTLSSocketOptions & + SupportedSocketOptions; + +/** + * Describes all possible URI query options for the mongo client + * @public + * @see https://docs.mongodb.com/manual/reference/connection-string + */ +export interface MongoClientOptions extends BSONSerializeOptions, SupportedNodeConnectionOptions { + /** Specifies the name of the replica set, if the mongod is a member of a replica set. */ + replicaSet?: string; + /** Enables or disables TLS/SSL for the connection. */ + tls?: boolean; + /** A boolean to enable or disables TLS/SSL for the connection. (The ssl option is equivalent to the tls option.) */ + ssl?: boolean; + /** Specifies the location of a local TLS Certificate */ + tlsCertificateFile?: string; + /** Specifies the location of a local .pem file that contains either the client's TLS/SSL certificate and key or only the client's TLS/SSL key when tlsCertificateFile is used to provide the certificate. */ + tlsCertificateKeyFile?: string; + /** Specifies the password to de-crypt the tlsCertificateKeyFile. 
*/ + tlsCertificateKeyFilePassword?: string; + /** Specifies the location of a local .pem file that contains the root certificate chain from the Certificate Authority. This file is used to validate the certificate presented by the mongod/mongos instance. */ + tlsCAFile?: string; + /** Bypasses validation of the certificates presented by the mongod/mongos instance */ + tlsAllowInvalidCertificates?: boolean; + /** Disables hostname validation of the certificate presented by the mongod/mongos instance. */ + tlsAllowInvalidHostnames?: boolean; + /** Disables various certificate validations. */ + tlsInsecure?: boolean; + /** The time in milliseconds to attempt a connection before timing out. */ + connectTimeoutMS?: number; + /** The time in milliseconds to attempt a send or receive on a socket before the attempt times out. */ + socketTimeoutMS?: number; + /** An array or comma-delimited string of compressors to enable network compression for communication between this client and a mongod/mongos instance. */ + compressors?: CompressorName[] | string; + /** An integer that specifies the compression level if using zlib for network compression. */ + zlibCompressionLevel?: 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | undefined; + /** The maximum number of connections in the connection pool. */ + maxPoolSize?: number; + /** The minimum number of connections in the connection pool. */ + minPoolSize?: number; + /** The maximum number of milliseconds that a connection can remain idle in the pool before being removed and closed. */ + maxIdleTimeMS?: number; + /** The maximum time in milliseconds that a thread can wait for a connection to become available. */ + waitQueueTimeoutMS?: number; + /** Specify a read concern for the collection (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** The level of isolation */ + readConcernLevel?: ReadConcernLevel; + /** Specifies the read preferences for this connection */ + readPreference?: ReadPreferenceMode | ReadPreference; + /** Specifies, in seconds, how stale a secondary can be before the client stops using it for read operations. */ + maxStalenessSeconds?: number; + /** Specifies the tags document as a comma-separated list of colon-separated key-value pairs. */ + readPreferenceTags?: TagSet[]; + /** The auth settings for when connection to server. */ + auth?: Auth; + /** Specify the database name associated with the user’s credentials. */ + authSource?: string; + /** Specify the authentication mechanism that MongoDB will use to authenticate the connection. */ + authMechanism?: AuthMechanism; + /** Specify properties for the specified authMechanism as a comma-separated list of colon-separated key-value pairs. */ + authMechanismProperties?: AuthMechanismProperties; + /** The size (in milliseconds) of the latency window for selecting among multiple suitable MongoDB instances. */ + localThresholdMS?: number; + /** Specifies how long (in milliseconds) to block for server selection before throwing an exception. */ + serverSelectionTimeoutMS?: number; + /** heartbeatFrequencyMS controls when the driver checks the state of the MongoDB deployment. Specify the interval (in milliseconds) between checks, counted from the end of the previous check until the beginning of the next one. */ + heartbeatFrequencyMS?: number; + /** Sets the minimum heartbeat frequency. In the event that the driver has to frequently re-check a server's availability, it will wait at least this long since the previous check to avoid wasted effort. 
*/ + minHeartbeatFrequencyMS?: number; + /** The name of the application that created this MongoClient instance. MongoDB 3.4 and newer will print this value in the server log upon establishing each connection. It is also recorded in the slow query log and profile collections */ + appName?: string; + /** Enables retryable reads. */ + retryReads?: boolean; + /** Enable retryable writes. */ + retryWrites?: boolean; + /** Allow a driver to force a Single topology type with a connection string containing one host */ + directConnection?: boolean; + /** Instruct the driver it is connecting to a load balancer fronting a mongos like service */ + loadBalanced?: boolean; + + /** The write concern w value */ + w?: W; + /** The write concern timeout */ + wtimeoutMS?: number; + /** The journal write concern */ + journal?: boolean; + + /** Validate mongod server certificate against Certificate Authority */ + sslValidate?: boolean; + /** SSL Certificate file path. */ + sslCA?: string; + /** SSL Certificate file path. */ + sslCert?: string; + /** SSL Key file file path. */ + sslKey?: string; + /** SSL Certificate pass phrase. */ + sslPass?: string; + /** SSL Certificate revocation list file path. */ + sslCRL?: string; + /** TCP Connection no delay */ + noDelay?: boolean; + /** TCP Connection keep alive enabled */ + keepAlive?: boolean; + /** The number of milliseconds to wait before initiating keepAlive on the TCP socket */ + keepAliveInitialDelay?: number; + /** Force server to assign `_id` values instead of driver */ + forceServerObjectId?: boolean; + /** Return document results as raw BSON buffers */ + raw?: boolean; + /** A primary key factory function for generation of custom `_id` keys */ + pkFactory?: PkFactory; + /** A Promise library class the application wishes to use such as Bluebird, must be ES6 compatible */ + promiseLibrary?: any; + /** The logging level */ + loggerLevel?: LoggerLevel; + /** Custom logger object */ + logger?: Logger; + /** Enable command monitoring for this client */ + monitorCommands?: boolean; + /** Server API version */ + serverApi?: ServerApi | ServerApiVersion; + /** + * Optionally enable client side auto encryption + * + * @remarks + * Automatic encryption is an enterprise only feature that only applies to operations on a collection. Automatic encryption is not supported for operations on a database or view, and operations that are not bypassed will result in error + * (see [libmongocrypt: Auto Encryption Allow-List](https://github.com/mongodb/specifications/blob/master/source/client-side-encryption/client-side-encryption.rst#libmongocrypt-auto-encryption-allow-list)). To bypass automatic encryption for all operations, set bypassAutoEncryption=true in AutoEncryptionOpts. + * + * Automatic encryption requires the authenticated user to have the [listCollections privilege action](https://docs.mongodb.com/manual/reference/command/listCollections/#dbcmd.listCollections). + * + * If a MongoClient with a limited connection pool size (i.e a non-zero maxPoolSize) is configured with AutoEncryptionOptions, a separate internal MongoClient is created if any of the following are true: + * - AutoEncryptionOptions.keyVaultClient is not passed. + * - AutoEncryptionOptions.bypassAutomaticEncryption is false. + * + * If an internal MongoClient is created, it is configured with the same options as the parent MongoClient except minPoolSize is set to 0 and AutoEncryptionOptions is omitted. 
+ */ + autoEncryption?: AutoEncryptionOptions; + /** Allows a wrapping driver to amend the client metadata generated by the driver to include information about the wrapping driver */ + driverInfo?: DriverInfo; + + /** @internal */ + srvPoller?: SrvPoller; + /** @internal */ + connectionType?: typeof Connection; +} + +/** @public */ +export type WithSessionCallback = (session: ClientSession) => Promise | void; + +/** @internal */ +export interface MongoClientPrivate { + url: string; + sessions: Set; + bsonOptions: BSONSerializeOptions; + namespace: MongoDBNamespace; + readonly options?: MongoOptions; + readonly readConcern?: ReadConcern; + readonly writeConcern?: WriteConcern; + readonly readPreference: ReadPreference; + readonly logger: Logger; +} + +/** @public */ +export type MongoClientEvents = Pick & { + // In previous versions the open event emitted a topology, in an effort to no longer + // expose internals but continue to expose this useful event API, it now emits a mongoClient + open(mongoClient: MongoClient): void; +}; + +/** @internal */ +const kOptions = Symbol('options'); + +/** + * The **MongoClient** class is a class that allows for making Connections to MongoDB. + * @public + * + * @remarks + * The programmatically provided options take precedent over the URI options. + * + * @example + * ```js + * // Connect using a MongoClient instance + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * const mongoClient = new MongoClient(url); + * mongoClient.connect(function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + * + * @example + * ```js + * // Connect using the MongoClient.connect static method + * const MongoClient = require('mongodb').MongoClient; + * const test = require('assert'); + * // Connection url + * const url = 'mongodb://localhost:27017'; + * // Database Name + * const dbName = 'test'; + * // Connect using MongoClient + * MongoClient.connect(url, function(err, client) { + * const db = client.db(dbName); + * client.close(); + * }); + * ``` + */ +export class MongoClient extends TypedEventEmitter { + /** @internal */ + s: MongoClientPrivate; + /** @internal */ + topology?: Topology; + + /** + * The consolidate, parsed, transformed and merged options. 
+ * @internal + */ + [kOptions]: MongoOptions; + + constructor(url: string, options?: MongoClientOptions) { + super(); + + this[kOptions] = parseOptions(url, this, options); + + // eslint-disable-next-line @typescript-eslint/no-this-alias + const client = this; + + // The internal state + this.s = { + url, + sessions: new Set(), + bsonOptions: resolveBSONOptions(this[kOptions]), + namespace: ns('admin'), + + get options() { + return client[kOptions]; + }, + get readConcern() { + return client[kOptions].readConcern; + }, + get writeConcern() { + return client[kOptions].writeConcern; + }, + get readPreference() { + return client[kOptions].readPreference; + }, + get logger() { + return client[kOptions].logger; + } + }; + } + + get options(): Readonly { + return Object.freeze({ ...this[kOptions] }); + } + + get serverApi(): Readonly { + return this[kOptions].serverApi && Object.freeze({ ...this[kOptions].serverApi }); + } + /** + * Intended for APM use only + * @internal + */ + get monitorCommands(): boolean { + return this[kOptions].monitorCommands; + } + set monitorCommands(value: boolean) { + this[kOptions].monitorCommands = value; + } + + get autoEncrypter(): AutoEncrypter | undefined { + return this[kOptions].autoEncrypter; + } + + get readConcern(): ReadConcern | undefined { + return this.s.readConcern; + } + + get writeConcern(): WriteConcern | undefined { + return this.s.writeConcern; + } + + get readPreference(): ReadPreference { + return this.s.readPreference; + } + + get bsonOptions(): BSONSerializeOptions { + return this.s.bsonOptions; + } + + get logger(): Logger { + return this.s.logger; + } + + /** + * Connect to MongoDB using a url + * + * @see docs.mongodb.org/manual/reference/connection-string/ + */ + connect(): Promise; + connect(callback: Callback): void; + connect(callback?: Callback): Promise | void { + if (callback && typeof callback !== 'function') { + throw new MongoInvalidArgumentError('Method `connect` only accepts a callback'); + } + + return maybePromise(callback, cb => { + connect(this, this[kOptions], err => { + if (err) return cb(err); + cb(undefined, this); + }); + }); + } + + /** + * Close the db and its underlying connections + * + * @param force - Force close, emitting no events + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + close(): Promise; + close(callback: Callback): void; + close(force: boolean): Promise; + close(force: boolean, callback: Callback): void; + close( + forceOrCallback?: boolean | Callback, + callback?: Callback + ): Promise | void { + if (typeof forceOrCallback === 'function') { + callback = forceOrCallback; + } + + const force = typeof forceOrCallback === 'boolean' ? forceOrCallback : false; + + return maybePromise(callback, callback => { + if (this.topology == null) { + return callback(); + } + + // clear out references to old topology + const topology = this.topology; + this.topology = undefined; + + topology.close({ force }, error => { + if (error) return callback(error); + const { encrypter } = this[kOptions]; + if (encrypter) { + return encrypter.close(this, force, error => { + callback(error); + }); + } + callback(); + }); + }); + } + + /** + * Create a new Db instance sharing the current socket connections. + * + * @param dbName - The name of the database we want to use. If not provided, use database name from connection string. + * @param options - Optional settings for Db construction + */ + db(dbName?: string, options?: DbOptions): Db { + options = options ?? 
{}; + + // Default to db from connection string if not provided + if (!dbName) { + dbName = this.options.dbName; + } + + // Copy the options and add out internal override of the not shared flag + const finalOptions = Object.assign({}, this[kOptions], options); + + // Return the db object + const db = new Db(this, dbName, finalOptions); + + // Return the database + return db; + } + + /** + * Connect to MongoDB using a url + * + * @remarks + * The programmatically provided options take precedent over the URI options. + * + * @see https://docs.mongodb.org/manual/reference/connection-string/ + */ + static connect(url: string): Promise; + static connect(url: string, callback: Callback): void; + static connect(url: string, options: MongoClientOptions): Promise; + static connect(url: string, options: MongoClientOptions, callback: Callback): void; + static connect( + url: string, + options?: MongoClientOptions | Callback, + callback?: Callback + ): Promise | void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + + try { + // Create client + const mongoClient = new MongoClient(url, options); + // Execute the connect method + if (callback) { + return mongoClient.connect(callback); + } else { + return mongoClient.connect(); + } + } catch (error) { + if (callback) return callback(error); + else return PromiseProvider.get().reject(error); + } + } + + /** Starts a new session on the server */ + startSession(): ClientSession; + startSession(options: ClientSessionOptions): ClientSession; + startSession(options?: ClientSessionOptions): ClientSession { + options = Object.assign({ explicit: true }, options); + if (!this.topology) { + throw new MongoNotConnectedError('MongoClient must be connected to start a session'); + } + + return this.topology.startSession(options, this.s.options); + } + + /** + * Runs a given operation with an implicitly created session. The lifetime of the session + * will be handled without the need for user interaction. 
+ * + * NOTE: presently the operation MUST return a Promise (either explicit or implicitly as an async function) + * + * @param options - Optional settings for the command + * @param callback - An callback to execute with an implicitly created session + */ + withSession(callback: WithSessionCallback): Promise; + withSession(options: ClientSessionOptions, callback: WithSessionCallback): Promise; + withSession( + optionsOrOperation?: ClientSessionOptions | WithSessionCallback, + callback?: WithSessionCallback + ): Promise { + let options: ClientSessionOptions = optionsOrOperation as ClientSessionOptions; + if (typeof optionsOrOperation === 'function') { + callback = optionsOrOperation as WithSessionCallback; + options = { owner: Symbol() }; + } + + if (callback == null) { + throw new MongoInvalidArgumentError('Missing required callback parameter'); + } + + const session = this.startSession(options); + const Promise = PromiseProvider.get(); + + let cleanupHandler: CleanUpHandlerFunction = ((err, result, opts) => { + // prevent multiple calls to cleanupHandler + cleanupHandler = () => { + // TODO(NODE-3483) + throw new MongoRuntimeError('cleanupHandler was called too many times'); + }; + + opts = Object.assign({ throw: true }, opts); + session.endSession(); + + if (err) { + if (opts.throw) throw err; + return Promise.reject(err); + } + }) as CleanUpHandlerFunction; + + try { + const result = callback(session); + return Promise.resolve(result).then( + result => cleanupHandler(undefined, result, undefined), + err => cleanupHandler(err, null, { throw: true }) + ); + } catch (err) { + return cleanupHandler(err, null, { throw: false }) as Promise; + } + } + + /** + * Create a new Change Stream, watching for new changes (insertions, updates, + * replacements, deletions, and invalidations) in this cluster. Will ignore all + * changes to system collections, as well as the local, admin, and config databases. + * + * @param pipeline - An array of {@link https://docs.mongodb.com/manual/reference/operator/aggregation-pipeline/|aggregation pipeline stages} through which to pass change stream documents. This allows for filtering (using $match) and manipulating the change stream documents. 
+ * @param options - Optional settings for the command + */ + watch( + pipeline: Document[] = [], + options: ChangeStreamOptions = {} + ): ChangeStream { + // Allow optionally not specifying a pipeline + if (!Array.isArray(pipeline)) { + options = pipeline; + pipeline = []; + } + + return new ChangeStream(this, pipeline, resolveOptions(this, options)); + } + + /** Return the mongo client logger */ + getLogger(): Logger { + return this.s.logger; + } +} + +/** + * Mongo Client Options + * @public + */ +export interface MongoOptions + extends Required< + Pick< + MongoClientOptions, + | 'autoEncryption' + | 'connectTimeoutMS' + | 'directConnection' + | 'driverInfo' + | 'forceServerObjectId' + | 'minHeartbeatFrequencyMS' + | 'heartbeatFrequencyMS' + | 'keepAlive' + | 'keepAliveInitialDelay' + | 'localThresholdMS' + | 'logger' + | 'maxIdleTimeMS' + | 'maxPoolSize' + | 'minPoolSize' + | 'monitorCommands' + | 'noDelay' + | 'pkFactory' + | 'promiseLibrary' + | 'raw' + | 'replicaSet' + | 'retryReads' + | 'retryWrites' + | 'serverSelectionTimeoutMS' + | 'socketTimeoutMS' + | 'tlsAllowInvalidCertificates' + | 'tlsAllowInvalidHostnames' + | 'tlsInsecure' + | 'waitQueueTimeoutMS' + | 'zlibCompressionLevel' + > + >, + SupportedNodeConnectionOptions { + hosts: HostAddress[]; + srvHost?: string; + credentials?: MongoCredentials; + readPreference: ReadPreference; + readConcern: ReadConcern; + loadBalanced: boolean; + serverApi: ServerApi; + compressors: CompressorName[]; + writeConcern: WriteConcern; + dbName: string; + metadata: ClientMetadata; + autoEncrypter?: AutoEncrypter; + /** @internal */ + connectionType?: typeof Connection; + + /** @internal */ + encrypter: Encrypter; + /** @internal */ + userSpecifiedAuthSource: boolean; + /** @internal */ + userSpecifiedReplicaSet: boolean; + + /** + * # NOTE ABOUT TLS Options + * + * If set TLS enabled, equivalent to setting the ssl option. + * + * ### Additional options: + * + * | nodejs option | MongoDB equivalent | type | + * |:---------------------|--------------------------------------------------------- |:---------------------------------------| + * | `ca` | `sslCA`, `tlsCAFile` | `string \| Buffer \| Buffer[]` | + * | `crl` | `sslCRL` | `string \| Buffer \| Buffer[]` | + * | `cert` | `sslCert`, `tlsCertificateFile`, `tlsCertificateKeyFile` | `string \| Buffer \| Buffer[]` | + * | `key` | `sslKey`, `tlsCertificateKeyFile` | `string \| Buffer \| KeyObject[]` | + * | `passphrase` | `sslPass`, `tlsCertificateKeyFilePassword` | `string` | + * | `rejectUnauthorized` | `sslValidate` | `boolean` | + * + */ + tls: boolean; + + /** + * Turn these options into a reusable connection URI + */ + toURI(): string; +} diff --git a/node_modules/mongodb/src/mongo_types.ts b/node_modules/mongodb/src/mongo_types.ts new file mode 100644 index 000000000..d4107e141 --- /dev/null +++ b/node_modules/mongodb/src/mongo_types.ts @@ -0,0 +1,427 @@ +import type { + Binary, + Document, + ObjectId, + BSONRegExp, + Timestamp, + Decimal128, + Double, + Int32, + Long +} from './bson'; +import { EventEmitter } from 'events'; +import type { Sort } from './sort'; + +/** @internal */ +export type TODO_NODE_3286 = any; + +/** Given an object shaped type, return the type of the _id field or default to ObjectId @public */ +export type InferIdType = TSchema extends { _id: infer IdType } // user has defined a type for _id + ? // eslint-disable-next-line @typescript-eslint/ban-types + {} extends IdType // TODO(NODE-3285): Improve type readability + ? 
// eslint-disable-next-line @typescript-eslint/ban-types + Exclude + : unknown extends IdType + ? ObjectId + : IdType + : ObjectId; // user has not defined _id on schema + +/** Add an _id field to an object shaped type @public */ +export type WithId = EnhancedOmit & { _id: InferIdType }; + +/** + * Add an optional _id field to an object shaped type + * @public + * + * @privateRemarks + * `ObjectId extends TSchema['_id']` is a confusing ordering at first glance. Rather than ask + * `TSchema['_id'] extends ObjectId` which translated to "Is the _id property ObjectId?" + * we instead ask "Does ObjectId look like (have the same shape) as the _id?" + */ +export type OptionalId = ObjectId extends TSchema['_id'] // a Schema with ObjectId _id type or "any" or "indexed type" provided + ? EnhancedOmit & { _id?: InferIdType } // a Schema provided but _id type is not ObjectId + : WithId; // TODO(NODE-3285): Improve type readability + +/** TypeScript Omit (Exclude to be specific) does not work for objects with an "any" indexed type, and breaks discriminated unions @public */ +export type EnhancedOmit = string extends keyof TRecordOrUnion + ? TRecordOrUnion // TRecordOrUnion has indexed type e.g. { _id: string; [k: string]: any; } or it is "any" + : TRecordOrUnion extends any + ? Pick> // discriminated unions + : never; + +/** Remove the _id field from an object shaped type @public */ +export type WithoutId = Omit; + +/** A MongoDB filter can be some portion of the schema or a set of operators @public */ +export type Filter = { + [P in keyof TSchema]?: Condition; +} & + RootFilterOperators; + +/** @public */ +export type Condition = AlternativeType | FilterOperators>; + +/** + * It is possible to search using alternative types in mongodb e.g. + * string types can be searched using a regex in mongo + * array types can be searched using their element type + * @public + */ +export type AlternativeType = T extends ReadonlyArray + ? T | RegExpOrString + : RegExpOrString; + +/** @public */ +export type RegExpOrString = T extends string ? BSONRegExp | RegExp | T : T; + +/** @public */ +export interface RootFilterOperators extends Document { + $and?: Filter[]; + $nor?: Filter[]; + $or?: Filter[]; + $text?: { + $search: string; + $language?: string; + $caseSensitive?: boolean; + $diacriticSensitive?: boolean; + }; + $where?: string | ((this: TSchema) => boolean); + $comment?: string | Document; +} + +/** @public */ +export interface FilterOperators extends Document { + // Comparison + $eq?: TValue; + $gt?: TValue; + $gte?: TValue; + $in?: ReadonlyArray; + $lt?: TValue; + $lte?: TValue; + $ne?: TValue; + $nin?: ReadonlyArray; + // Logical + $not?: TValue extends string ? FilterOperators | RegExp : FilterOperators; + // Element + /** + * When `true`, `$exists` matches the documents that contain the field, + * including documents where the field value is null. + */ + $exists?: boolean; + $type?: BSONType | BSONTypeAlias; + // Evaluation + $expr?: Record; + $jsonSchema?: Record; + $mod?: TValue extends number ? [number, number] : never; + $regex?: TValue extends string ? RegExp | BSONRegExp | string : never; + $options?: TValue extends string ? string : never; + // Geospatial + $geoIntersects?: { $geometry: Document }; + $geoWithin?: Document; + $near?: Document; + $nearSphere?: Document; + $maxDistance?: number; + // Array + $all?: ReadonlyArray; + $elemMatch?: Document; + $size?: TValue extends ReadonlyArray ? 
number : never; + // Bitwise + $bitsAllClear?: BitwiseFilter; + $bitsAllSet?: BitwiseFilter; + $bitsAnyClear?: BitwiseFilter; + $bitsAnySet?: BitwiseFilter; + $rand?: Record; +} + +/** @public */ +export type BitwiseFilter = + | number /** numeric bit mask */ + | Binary /** BinData bit mask */ + | ReadonlyArray; /** `[ , , ... ]` */ + +/** @public */ +export const BSONType = Object.freeze({ + double: 1, + string: 2, + object: 3, + array: 4, + binData: 5, + undefined: 6, + objectId: 7, + bool: 8, + date: 9, + null: 10, + regex: 11, + dbPointer: 12, + javascript: 13, + symbol: 14, + javascriptWithScope: 15, + int: 16, + timestamp: 17, + long: 18, + decimal: 19, + minKey: -1, + maxKey: 127 +} as const); + +/** @public */ +export type BSONType = typeof BSONType[keyof typeof BSONType]; +/** @public */ +export type BSONTypeAlias = keyof typeof BSONType; + +/** + * @public + * Projection is flexible to permit the wide array of aggregation operators + * @deprecated since v4.1.0: Since projections support all aggregation operations we have no plans to narrow this type further + */ +// eslint-disable-next-line @typescript-eslint/no-unused-vars +export type Projection = Document; + +/** + * @public + * @deprecated since v4.1.0: Since projections support all aggregation operations we have no plans to narrow this type further + */ +export type ProjectionOperators = Document; + +/** @public */ +export type IsAny = true extends false & Type + ? ResultIfAny + : ResultIfNotAny; + +/** @public */ +export type Flatten = Type extends ReadonlyArray ? Item : Type; + +/** @public */ +export type SchemaMember = { [P in keyof T]?: V } | { [key: string]: V }; + +/** @public */ +export type IntegerType = number | Int32 | Long; + +/** @public */ +export type NumericType = IntegerType | Decimal128 | Double; + +/** @public */ +export type FilterOperations = T extends Record + ? { [key in keyof T]?: FilterOperators } + : FilterOperators; + +/** @public */ +export type KeysOfAType = { + [key in keyof TSchema]: NonNullable extends Type ? key : never; +}[keyof TSchema]; + +/** @public */ +export type KeysOfOtherType = { + [key in keyof TSchema]: NonNullable extends Type ? 
never : key; +}[keyof TSchema]; + +/** @public */ +export type AcceptedFields = { + readonly [key in KeysOfAType]?: AssignableType; +}; + +/** It avoids using fields with not acceptable types @public */ +export type NotAcceptedFields = { + readonly [key in KeysOfOtherType]?: never; +}; + +/** @public */ +export type OnlyFieldsOfType = IsAny< + TSchema[keyof TSchema], + Record, + AcceptedFields & + NotAcceptedFields & + Record +>; + +/** @public */ +export type MatchKeysAndValues = Readonly> & Record; + +/** @public */ +export type AddToSetOperators = { + $each?: Array>; +}; + +/** @public */ +export type ArrayOperator = { + $each?: Array>; + $slice?: number; + $position?: number; + $sort?: Sort; +}; + +/** @public */ +export type SetFields = ({ + readonly [key in KeysOfAType | undefined>]?: + | OptionalId> + | AddToSetOperators>>>; +} & + NotAcceptedFields | undefined>) & { + readonly [key: string]: AddToSetOperators | any; +}; + +/** @public */ +export type PushOperator = ({ + readonly [key in KeysOfAType>]?: + | Flatten + | ArrayOperator>>; +} & + NotAcceptedFields>) & { + readonly [key: string]: ArrayOperator | any; +}; + +/** @public */ +export type PullOperator = ({ + readonly [key in KeysOfAType>]?: + | Partial> + | FilterOperations>; +} & + NotAcceptedFields>) & { + readonly [key: string]: FilterOperators | any; +}; + +/** @public */ +export type PullAllOperator = ({ + readonly [key in KeysOfAType>]?: TSchema[key]; +} & + NotAcceptedFields>) & { + readonly [key: string]: ReadonlyArray; +}; + +/** @public */ +export type UpdateFilter = { + $currentDate?: OnlyFieldsOfType< + TSchema, + Date | Timestamp, + true | { $type: 'date' | 'timestamp' } + >; + $inc?: OnlyFieldsOfType; + $min?: MatchKeysAndValues; + $max?: MatchKeysAndValues; + $mul?: OnlyFieldsOfType; + $rename?: Record; + $set?: MatchKeysAndValues; + $setOnInsert?: MatchKeysAndValues; + $unset?: OnlyFieldsOfType; + $addToSet?: SetFields; + $pop?: OnlyFieldsOfType, 1 | -1>; + $pull?: PullOperator; + $push?: PushOperator; + $pullAll?: PullAllOperator; + $bit?: OnlyFieldsOfType< + TSchema, + NumericType | undefined, + { and: IntegerType } | { or: IntegerType } | { xor: IntegerType } + >; +} & Document; + +/** @public */ +export type Nullable = AnyType | null | undefined; + +/** @public */ +export type OneOrMore = T | ReadonlyArray; + +/** @public */ +export type GenericListener = (...args: any[]) => void; + +/** + * Event description type + * @public + */ +export type EventsDescription = Record; + +/** @public */ +export type CommonEvents = 'newListener' | 'removeListener'; + +/** + * Typescript type safe event emitter + * @public + */ +export declare interface TypedEventEmitter extends EventEmitter { + addListener(event: EventKey, listener: Events[EventKey]): this; + addListener( + event: CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + addListener(event: string | symbol, listener: GenericListener): this; + + on(event: EventKey, listener: Events[EventKey]): this; + on( + event: CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + on(event: string | symbol, listener: GenericListener): this; + + once(event: EventKey, listener: Events[EventKey]): this; + once( + event: CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + once(event: string | symbol, listener: GenericListener): this; + + removeListener(event: EventKey, listener: Events[EventKey]): this; + removeListener( + event: 
CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + removeListener(event: string | symbol, listener: GenericListener): this; + + off(event: EventKey, listener: Events[EventKey]): this; + off( + event: CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + off(event: string | symbol, listener: GenericListener): this; + + removeAllListeners( + event?: EventKey | CommonEvents | symbol | string + ): this; + + listeners( + event: EventKey | CommonEvents | symbol | string + ): Events[EventKey][]; + + rawListeners( + event: EventKey | CommonEvents | symbol | string + ): Events[EventKey][]; + + emit( + event: EventKey | symbol, + ...args: Parameters + ): boolean; + + listenerCount( + type: EventKey | CommonEvents | symbol | string + ): number; + + prependListener(event: EventKey, listener: Events[EventKey]): this; + prependListener( + event: CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + prependListener(event: string | symbol, listener: GenericListener): this; + + prependOnceListener( + event: EventKey, + listener: Events[EventKey] + ): this; + prependOnceListener( + event: CommonEvents, + listener: (eventName: string | symbol, listener: GenericListener) => void + ): this; + prependOnceListener(event: string | symbol, listener: GenericListener): this; + + eventNames(): string[]; + getMaxListeners(): number; + setMaxListeners(n: number): this; +} + +/** + * Typescript type safe event emitter + * @public + */ +// eslint-disable-next-line @typescript-eslint/no-unused-vars +export class TypedEventEmitter extends EventEmitter {} + +/** @public */ +export class CancellationToken extends TypedEventEmitter<{ cancel(): void }> {} diff --git a/node_modules/mongodb/src/operations/add_user.ts b/node_modules/mongodb/src/operations/add_user.ts new file mode 100644 index 000000000..4dd56acc5 --- /dev/null +++ b/node_modules/mongodb/src/operations/add_user.ts @@ -0,0 +1,106 @@ +import * as crypto from 'crypto'; +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import { MongoInvalidArgumentError } from '../error'; +import { Callback, emitWarningOnce, getTopology } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export interface RoleSpecification { + /** + * A role grants privileges to perform sets of actions on defined resources. + * A given role applies to the database on which it is defined and can grant access down to a collection level of granularity. + */ + role: string; + /** The database this user's role should effect. */ + db: string; +} + +/** @public */ +export interface AddUserOptions extends CommandOperationOptions { + /** @deprecated Please use db.command('createUser', ...) 
instead for this option */ + digestPassword?: null; + /** Roles associated with the created user */ + roles?: string | string[] | RoleSpecification | RoleSpecification[]; + /** Custom data associated with the user (only Mongodb 2.6 or higher) */ + customData?: Document; +} + +/** @internal */ +export class AddUserOperation extends CommandOperation { + options: AddUserOptions; + db: Db; + username: string; + password?: string; + + constructor(db: Db, username: string, password: string | undefined, options?: AddUserOptions) { + super(db, options); + + this.db = db; + this.username = username; + this.password = password; + this.options = options ?? {}; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const db = this.db; + const username = this.username; + const password = this.password; + const options = this.options; + + // Error out if digestPassword set + if (options.digestPassword != null) { + return callback( + new MongoInvalidArgumentError( + 'Option "digestPassword" not supported via addUser, use db.command(...) instead' + ) + ); + } + + let roles; + if (!options.roles || (Array.isArray(options.roles) && options.roles.length === 0)) { + emitWarningOnce( + 'Creating a user without roles is deprecated. Defaults to "root" if db is "admin" or "dbOwner" otherwise' + ); + if (db.databaseName.toLowerCase() === 'admin') { + roles = ['root']; + } else { + roles = ['dbOwner']; + } + } else { + roles = Array.isArray(options.roles) ? options.roles : [options.roles]; + } + + const digestPassword = getTopology(db).lastIsMaster().maxWireVersion >= 7; + + let userPassword = password; + + if (!digestPassword) { + // Use node md5 generator + const md5 = crypto.createHash('md5'); + // Generate keys used for authentication + md5.update(`${username}:mongo:${password}`); + userPassword = md5.digest('hex'); + } + + // Build the command to execute + const command: Document = { + createUser: username, + customData: options.customData || {}, + roles: roles, + digestPassword + }; + + // No password + if (typeof password === 'string') { + command.pwd = userPassword; + } + + super.executeCommand(server, session, command, callback); + } +} + +defineAspects(AddUserOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/aggregate.ts b/node_modules/mongodb/src/operations/aggregate.ts new file mode 100644 index 000000000..068540067 --- /dev/null +++ b/node_modules/mongodb/src/operations/aggregate.ts @@ -0,0 +1,135 @@ +import { CommandOperation, CommandOperationOptions, CollationOptions } from './command'; +import { ReadPreference } from '../read_preference'; +import { MongoInvalidArgumentError } from '../error'; +import { maxWireVersion, MongoDBNamespace } from '../utils'; +import { Aspect, defineAspects, Hint } from './operation'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { ClientSession } from '../sessions'; + +/** @internal */ +export const DB_AGGREGATE_COLLECTION = 1 as const; +const MIN_WIRE_VERSION_$OUT_READ_CONCERN_SUPPORT = 8 as const; + +/** @public */ +export interface AggregateOptions extends CommandOperationOptions { + /** allowDiskUse lets the server know if it can use disk to store temporary results for the aggregation (requires mongodb 2.6 \>). */ + allowDiskUse?: boolean; + /** The number of documents to return per batch. See [aggregation documentation](https://docs.mongodb.com/manual/reference/command/aggregate). 
*/ + batchSize?: number; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** Return the query as cursor, on 2.6 \> it returns as a real cursor on pre 2.6 it returns as an emulated cursor. */ + cursor?: Document; + /** specifies a cumulative time limit in milliseconds for processing operations on the cursor. MongoDB interrupts the operation at the earliest following interrupt point. */ + maxTimeMS?: number; + /** The maximum amount of time for the server to wait on new documents to satisfy a tailable cursor query. */ + maxAwaitTimeMS?: number; + /** Specify collation. */ + collation?: CollationOptions; + /** Add an index selection hint to an aggregation command */ + hint?: Hint; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; + out?: string; +} + +/** @internal */ +export class AggregateOperation extends CommandOperation { + options: AggregateOptions; + target: string | typeof DB_AGGREGATE_COLLECTION; + pipeline: Document[]; + hasWriteStage: boolean; + + constructor(ns: MongoDBNamespace, pipeline: Document[], options?: AggregateOptions) { + super(undefined, { ...options, dbName: ns.db }); + + this.options = options ?? {}; + + // Covers when ns.collection is null, undefined or the empty string, use DB_AGGREGATE_COLLECTION + this.target = ns.collection || DB_AGGREGATE_COLLECTION; + + this.pipeline = pipeline; + + // determine if we have a write stage, override read preference if so + this.hasWriteStage = false; + if (typeof options?.out === 'string') { + this.pipeline = this.pipeline.concat({ $out: options.out }); + this.hasWriteStage = true; + } else if (pipeline.length > 0) { + const finalStage = pipeline[pipeline.length - 1]; + if (finalStage.$out || finalStage.$merge) { + this.hasWriteStage = true; + } + } + + if (this.hasWriteStage) { + this.readPreference = ReadPreference.primary; + } + + if (this.explain && this.writeConcern) { + throw new MongoInvalidArgumentError( + 'Option "explain" cannot be used on an aggregate call with writeConcern' + ); + } + + if (options?.cursor != null && typeof options.cursor !== 'object') { + throw new MongoInvalidArgumentError('Cursor options must be an object'); + } + } + + get canRetryRead(): boolean { + return !this.hasWriteStage; + } + + addToPipeline(stage: Document): void { + this.pipeline.push(stage); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const options: AggregateOptions = this.options; + const serverWireVersion = maxWireVersion(server); + const command: Document = { aggregate: this.target, pipeline: this.pipeline }; + + if (this.hasWriteStage && serverWireVersion < MIN_WIRE_VERSION_$OUT_READ_CONCERN_SUPPORT) { + this.readConcern = undefined; + } + + if (serverWireVersion >= 5) { + if (this.hasWriteStage && this.writeConcern) { + Object.assign(command, { writeConcern: this.writeConcern }); + } + } + + if (options.bypassDocumentValidation === true) { + command.bypassDocumentValidation = options.bypassDocumentValidation; + } + + if (typeof options.allowDiskUse === 'boolean') { + command.allowDiskUse = options.allowDiskUse; + } + + if (options.hint) { + command.hint = options.hint; + } + + if (options.let) { + command.let = options.let; + } + + command.cursor = options.cursor || {}; + if (options.batchSize && !this.hasWriteStage) { + command.cursor.batchSize = options.batchSize; + } + + super.executeCommand(server, session, command, callback); + } +} + 
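A sketch of how the options handled by `AggregateOperation` surface through the public `collection.aggregate()` API; the collection name, pipeline, and option values below are illustrative only:

```ts
const orders = client.db('todoApp').collection('orders');

// A read-only pipeline: remains retryable because there is no $out/$merge stage.
const totals = await orders
  .aggregate(
    [
      { $match: { status: 'complete' } },
      { $group: { _id: '$userId', total: { $sum: '$amount' } } }
    ],
    { allowDiskUse: true, batchSize: 100, maxTimeMS: 5000 }
  )
  .toArray();

// A final $out (or $merge) stage marks the operation as a write, which
// forces a primary read preference and disables read retries.
await orders
  .aggregate([{ $match: { status: 'complete' } }, { $out: 'completeOrders' }])
  .toArray();
```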
+defineAspects(AggregateOperation, [ + Aspect.READ_OPERATION, + Aspect.RETRYABLE, + Aspect.EXPLAINABLE, + Aspect.CURSOR_CREATING +]); diff --git a/node_modules/mongodb/src/operations/bulk_write.ts b/node_modules/mongodb/src/operations/bulk_write.ts new file mode 100644 index 000000000..8ab1f913e --- /dev/null +++ b/node_modules/mongodb/src/operations/bulk_write.ts @@ -0,0 +1,63 @@ +import { Aspect, defineAspects, AbstractOperation } from './operation'; +import type { Callback } from '../utils'; +import type { Collection } from '../collection'; +import type { + BulkOperationBase, + BulkWriteResult, + BulkWriteOptions, + AnyBulkWriteOperation +} from '../bulk/common'; +import type { Server } from '../sdam/server'; +import type { ClientSession } from '../sessions'; + +/** @internal */ +export class BulkWriteOperation extends AbstractOperation { + options: BulkWriteOptions; + collection: Collection; + operations: AnyBulkWriteOperation[]; + + constructor( + collection: Collection, + operations: AnyBulkWriteOperation[], + options: BulkWriteOptions + ) { + super(options); + this.options = options; + this.collection = collection; + this.operations = operations; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const operations = this.operations; + const options = { ...this.options, ...this.bsonOptions, readPreference: this.readPreference }; + + // Create the bulk operation + const bulk: BulkOperationBase = + options.ordered === false + ? coll.initializeUnorderedBulkOp(options) + : coll.initializeOrderedBulkOp(options); + + // for each op go through and add to the bulk + try { + for (let i = 0; i < operations.length; i++) { + bulk.raw(operations[i]); + } + } catch (err) { + return callback(err); + } + + // Execute the bulk + bulk.execute({ ...options, session }, (err, r) => { + // We have connection level error + if (!r && err) { + return callback(err); + } + + // Return the results + callback(undefined, r); + }); + } +} + +defineAspects(BulkWriteOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/collections.ts b/node_modules/mongodb/src/operations/collections.ts new file mode 100644 index 000000000..fbdf41751 --- /dev/null +++ b/node_modules/mongodb/src/operations/collections.ts @@ -0,0 +1,44 @@ +import { AbstractOperation, OperationOptions } from './operation'; +import { Collection } from '../collection'; +import type { Callback } from '../utils'; +import type { Db } from '../db'; +import type { Server } from '../sdam/server'; +import type { ClientSession } from '../sessions'; + +export interface CollectionsOptions extends OperationOptions { + nameOnly?: boolean; +} + +/** @internal */ +export class CollectionsOperation extends AbstractOperation { + options: CollectionsOptions; + db: Db; + + constructor(db: Db, options: CollectionsOptions) { + super(options); + this.options = options; + this.db = db; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const db = this.db; + + // Let's get the collection names + db.listCollections( + {}, + { ...this.options, nameOnly: true, readPreference: this.readPreference, session } + ).toArray((err, documents) => { + if (err || !documents) return callback(err); + // Filter collections removing any illegal ones + documents = documents.filter(doc => doc.name.indexOf('$') === -1); + + // Return the collection objects + callback( + undefined, + documents.map(d => { + return new Collection(db, d.name, db.s.options); + }) + ); + 
}); + } +} diff --git a/node_modules/mongodb/src/operations/command.ts b/node_modules/mongodb/src/operations/command.ts new file mode 100644 index 000000000..70ae413d6 --- /dev/null +++ b/node_modules/mongodb/src/operations/command.ts @@ -0,0 +1,171 @@ +import { Aspect, AbstractOperation, OperationOptions } from './operation'; +import { ReadConcern } from '../read_concern'; +import { WriteConcern, WriteConcernOptions } from '../write_concern'; +import { maxWireVersion, MongoDBNamespace, Callback, decorateWithExplain } from '../utils'; +import type { ReadPreference } from '../read_preference'; +import { ClientSession, commandSupportsReadConcern } from '../sessions'; +import { MongoInvalidArgumentError, MongoCompatibilityError } from '../error'; +import type { Logger } from '../logger'; +import type { Server } from '../sdam/server'; +import type { BSONSerializeOptions, Document } from '../bson'; +import type { ReadConcernLike } from './../read_concern'; +import { Explain, ExplainOptions } from '../explain'; + +const SUPPORTS_WRITE_CONCERN_AND_COLLATION = 5; + +/** @public */ +export interface CollationOptions { + locale: string; + caseLevel?: boolean; + caseFirst?: string; + strength?: number; + numericOrdering?: boolean; + alternate?: string; + maxVariable?: string; + backwards?: boolean; + normalization?: boolean; +} + +/** @public */ +export interface CommandOperationOptions + extends OperationOptions, + WriteConcernOptions, + ExplainOptions { + /** @deprecated This option does nothing */ + fullResponse?: boolean; + /** Specify a read concern and level for the collection. (only MongoDB 3.2 or higher supported) */ + readConcern?: ReadConcernLike; + /** Collation */ + collation?: CollationOptions; + maxTimeMS?: number; + /** A user-provided comment to attach to this command */ + comment?: string | Document; + /** Should retry failed writes */ + retryWrites?: boolean; + + // Admin command overrides. + dbName?: string; + authdb?: string; + noResponse?: boolean; +} + +/** @internal */ +export interface OperationParent { + s: { namespace: MongoDBNamespace }; + readConcern?: ReadConcern; + writeConcern?: WriteConcern; + readPreference?: ReadPreference; + logger?: Logger; + bsonOptions?: BSONSerializeOptions; +} + +/** @internal */ +export abstract class CommandOperation extends AbstractOperation { + options: CommandOperationOptions; + ns: MongoDBNamespace; + readConcern?: ReadConcern; + writeConcern?: WriteConcern; + explain?: Explain; + logger?: Logger; + + constructor(parent?: OperationParent, options?: CommandOperationOptions) { + super(options); + this.options = options ?? {}; + + // NOTE: this was explicitly added for the add/remove user operations, it's likely + // something we'd want to reconsider. Perhaps those commands can use `Admin` + // as a parent? + const dbNameOverride = options?.dbName || options?.authdb; + if (dbNameOverride) { + this.ns = new MongoDBNamespace(dbNameOverride, '$cmd'); + } else { + this.ns = parent + ? 
parent.s.namespace.withCollection('$cmd') + : new MongoDBNamespace('admin', '$cmd'); + } + + this.readConcern = ReadConcern.fromOptions(options); + this.writeConcern = WriteConcern.fromOptions(options); + + // TODO(NODE-2056): make logger another "inheritable" property + if (parent && parent.logger) { + this.logger = parent.logger; + } + + if (this.hasAspect(Aspect.EXPLAINABLE)) { + this.explain = Explain.fromOptions(options); + } else if (options?.explain != null) { + throw new MongoInvalidArgumentError(`Option "explain" is not supported on this command`); + } + } + + get canRetryWrite(): boolean { + if (this.hasAspect(Aspect.EXPLAINABLE)) { + return this.explain == null; + } + return true; + } + + abstract execute(server: Server, session: ClientSession, callback: Callback): void; + + executeCommand(server: Server, session: ClientSession, cmd: Document, callback: Callback): void { + // TODO: consider making this a non-enumerable property + this.server = server; + + const options = { + ...this.options, + ...this.bsonOptions, + readPreference: this.readPreference, + session + }; + + const serverWireVersion = maxWireVersion(server); + const inTransaction = this.session && this.session.inTransaction(); + + if (this.readConcern && commandSupportsReadConcern(cmd) && !inTransaction) { + Object.assign(cmd, { readConcern: this.readConcern }); + } + + if (options.collation && serverWireVersion < SUPPORTS_WRITE_CONCERN_AND_COLLATION) { + callback( + new MongoCompatibilityError( + `Server ${server.name}, which reports wire version ${serverWireVersion}, does not support collation` + ) + ); + return; + } + + if (this.writeConcern && this.hasAspect(Aspect.WRITE_OPERATION) && !inTransaction) { + Object.assign(cmd, { writeConcern: this.writeConcern }); + } + + if (serverWireVersion >= SUPPORTS_WRITE_CONCERN_AND_COLLATION) { + if ( + options.collation && + typeof options.collation === 'object' && + !this.hasAspect(Aspect.SKIP_COLLATION) + ) { + Object.assign(cmd, { collation: options.collation }); + } + } + + if (typeof options.maxTimeMS === 'number') { + cmd.maxTimeMS = options.maxTimeMS; + } + + if (typeof options.comment === 'string') { + cmd.comment = options.comment; + } + + if (this.hasAspect(Aspect.EXPLAINABLE) && this.explain) { + if (serverWireVersion < 6 && cmd.aggregate) { + // Prior to 3.6, with aggregate, verbosity is ignored, and we must pass in "explain: true" + cmd.explain = true; + } else { + cmd = decorateWithExplain(cmd, this.explain); + } + } + + server.command(this.ns, cmd, options, callback); + } +} diff --git a/node_modules/mongodb/src/operations/common_functions.ts b/node_modules/mongodb/src/operations/common_functions.ts new file mode 100644 index 000000000..f056f32a6 --- /dev/null +++ b/node_modules/mongodb/src/operations/common_functions.ts @@ -0,0 +1,95 @@ +import { MongoTopologyClosedError } from '../error'; +import { Callback, getTopology } from '../utils'; +import type { Document } from '../bson'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; +import type { ReadPreference } from '../read_preference'; +import type { Collection } from '../collection'; + +/** @public */ +export interface IndexInformationOptions { + full?: boolean; + readPreference?: ReadPreference; + session?: ClientSession; +} +/** + * Retrieves this collections index info. + * + * @param db - The Db instance on which to retrieve the index info. + * @param name - The name of the collection. 
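The `indexInformation` helper above backs the driver's public index-inspection methods; a small sketch with placeholder names (hedged, since the exact return shape depends on the `full` option):

```ts
const db = client.db('todoApp');

// Default form: a map of index name -> [[field, direction], ...]
const info = await db.indexInformation('tasks');
// e.g. { _id_: [['_id', 1]], title_1: [['title', 1]] }

// full: true returns the raw index documents from listIndexes instead.
const rawIndexes = await db.indexInformation('tasks', { full: true });

// The same data is available directly from the collection's cursor helper.
const viaCursor = await db.collection('tasks').listIndexes().toArray();
```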
+ */ +export function indexInformation(db: Db, name: string, callback: Callback): void; +export function indexInformation( + db: Db, + name: string, + options: IndexInformationOptions, + callback?: Callback +): void; +export function indexInformation( + db: Db, + name: string, + _optionsOrCallback: IndexInformationOptions | Callback, + _callback?: Callback +): void { + let options = _optionsOrCallback as IndexInformationOptions; + let callback = _callback as Callback; + if ('function' === typeof _optionsOrCallback) { + callback = _optionsOrCallback as Callback; + options = {}; + } + // If we specified full information + const full = options.full == null ? false : options.full; + + // Did the user destroy the topology + if (getTopology(db).isDestroyed()) return callback(new MongoTopologyClosedError()); + // Process all the results from the index command and collection + function processResults(indexes: any) { + // Contains all the information + const info: any = {}; + // Process all the indexes + for (let i = 0; i < indexes.length; i++) { + const index = indexes[i]; + // Let's unpack the object + info[index.name] = []; + for (const name in index.key) { + info[index.name].push([name, index.key[name]]); + } + } + + return info; + } + + // Get the list of indexes of the specified collection + db.collection(name) + .listIndexes(options) + .toArray((err, indexes) => { + if (err) return callback(err); + if (!Array.isArray(indexes)) return callback(undefined, []); + if (full) return callback(undefined, indexes); + callback(undefined, processResults(indexes)); + }); +} + +export function prepareDocs( + coll: Collection, + docs: Document[], + options: { forceServerObjectId?: boolean } +): Document[] { + const forceServerObjectId = + typeof options.forceServerObjectId === 'boolean' + ? 
options.forceServerObjectId + : coll.s.db.options?.forceServerObjectId; + + // no need to modify the docs if server sets the ObjectId + if (forceServerObjectId === true) { + return docs; + } + + return docs.map(doc => { + if (doc._id == null) { + doc._id = coll.s.pkFactory.createPk(); + } + + return doc; + }); +} diff --git a/node_modules/mongodb/src/operations/connect.ts b/node_modules/mongodb/src/operations/connect.ts new file mode 100644 index 000000000..7a1241130 --- /dev/null +++ b/node_modules/mongodb/src/operations/connect.ts @@ -0,0 +1,116 @@ +import { MongoRuntimeError, MongoInvalidArgumentError } from '../error'; +import { Topology, TOPOLOGY_EVENTS } from '../sdam/topology'; +import { resolveSRVRecord } from '../connection_string'; +import type { Callback } from '../utils'; +import type { MongoClient, MongoOptions } from '../mongo_client'; +import { CMAP_EVENTS } from '../cmap/connection_pool'; +import { APM_EVENTS } from '../cmap/connection'; +import { HEARTBEAT_EVENTS } from '../sdam/server'; + +/** @public */ +export const MONGO_CLIENT_EVENTS = [ + ...CMAP_EVENTS, + ...APM_EVENTS, + ...TOPOLOGY_EVENTS, + ...HEARTBEAT_EVENTS +] as const; + +export function connect( + mongoClient: MongoClient, + options: MongoOptions, + callback: Callback +): void { + if (!callback) { + throw new MongoInvalidArgumentError('Callback function must be provided'); + } + + // If a connection already been established, we can terminate early + if (mongoClient.topology && mongoClient.topology.isConnected()) { + return callback(undefined, mongoClient); + } + + const logger = mongoClient.logger; + const connectCallback: Callback = err => { + const warningMessage = + 'seed list contains no mongos proxies, replicaset connections requires ' + + 'the parameter replicaSet to be supplied in the URI or options object, ' + + 'mongodb://server:port/db?replicaSet=name'; + if (err && err.message === 'no mongos proxies found in seed list') { + if (logger.isWarn()) { + logger.warn(warningMessage); + } + + // Return a more specific error message for MongoClient.connect + // TODO(NODE-3483) + return callback(new MongoRuntimeError(warningMessage)); + } + + callback(err, mongoClient); + }; + + if (typeof options.srvHost === 'string') { + return resolveSRVRecord(options, (err, hosts) => { + if (err || !hosts) return callback(err); + for (const [index, host] of hosts.entries()) { + options.hosts[index] = host; + } + + return createTopology(mongoClient, options, connectCallback); + }); + } + + return createTopology(mongoClient, options, connectCallback); +} + +function createTopology( + mongoClient: MongoClient, + options: MongoOptions, + callback: Callback +) { + // Create the topology + const topology = new Topology(options.hosts, options); + // Events can be emitted before initialization is complete so we have to + // save the reference to the topology on the client ASAP if the event handlers need to access it + mongoClient.topology = topology; + + topology.once(Topology.OPEN, () => mongoClient.emit('open', mongoClient)); + + for (const event of MONGO_CLIENT_EVENTS) { + topology.on(event, (...args: any[]) => mongoClient.emit(event, ...(args as any))); + } + + // initialize CSFLE if requested + if (mongoClient.autoEncrypter) { + mongoClient.autoEncrypter.init(err => { + if (err) { + return callback(err); + } + + topology.connect(options, err => { + if (err) { + topology.close({ force: true }); + return callback(err); + } + + options.encrypter.connectInternalClient(error => { + if (error) return callback(error); + + 
callback(undefined, topology); + }); + }); + }); + + return; + } + + // otherwise connect normally + topology.connect(options, err => { + if (err) { + topology.close({ force: true }); + return callback(err); + } + + callback(undefined, topology); + return; + }); +} diff --git a/node_modules/mongodb/src/operations/count.ts b/node_modules/mongodb/src/operations/count.ts new file mode 100644 index 000000000..34f4156a8 --- /dev/null +++ b/node_modules/mongodb/src/operations/count.ts @@ -0,0 +1,64 @@ +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback, MongoDBNamespace } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export interface CountOptions extends CommandOperationOptions { + /** The number of documents to skip. */ + skip?: number; + /** The maximum amounts to count before aborting. */ + limit?: number; + /** Number of milliseconds to wait before aborting the query. */ + maxTimeMS?: number; + /** An index name hint for the query. */ + hint?: string | Document; +} + +/** @internal */ +export class CountOperation extends CommandOperation { + options: CountOptions; + collectionName?: string; + query: Document; + + constructor(namespace: MongoDBNamespace, filter: Document, options: CountOptions) { + super({ s: { namespace: namespace } } as unknown as Collection, options); + + this.options = options; + this.collectionName = namespace.collection; + this.query = filter; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const options = this.options; + const cmd: Document = { + count: this.collectionName, + query: this.query + }; + + if (typeof options.limit === 'number') { + cmd.limit = options.limit; + } + + if (typeof options.skip === 'number') { + cmd.skip = options.skip; + } + + if (options.hint != null) { + cmd.hint = options.hint; + } + + if (typeof options.maxTimeMS === 'number') { + cmd.maxTimeMS = options.maxTimeMS; + } + + super.executeCommand(server, session, cmd, (err, result) => { + callback(err, result ? result.n : 0); + }); + } +} + +defineAspects(CountOperation, [Aspect.READ_OPERATION, Aspect.RETRYABLE]); diff --git a/node_modules/mongodb/src/operations/count_documents.ts b/node_modules/mongodb/src/operations/count_documents.ts new file mode 100644 index 000000000..d7e8c3e09 --- /dev/null +++ b/node_modules/mongodb/src/operations/count_documents.ts @@ -0,0 +1,53 @@ +import { AggregateOperation, AggregateOptions } from './aggregate'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export interface CountDocumentsOptions extends AggregateOptions { + /** The number of documents to skip. */ + skip?: number; + /** The maximum amounts to count before aborting. 
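For context, the count operations above surface through `collection.countDocuments()` (the legacy `count` command is used by other paths); an illustrative call with placeholder filter and option values:

```ts
const tasks = client.db('todoApp').collection('tasks');

// countDocuments builds the $match/$skip/$limit/$group pipeline shown in
// the operation below rather than issuing the legacy count command.
const openCount = await tasks.countDocuments(
  { done: false },
  { skip: 0, limit: 100, maxTimeMS: 2000 }
);
console.log(`open tasks (capped at 100): ${openCount}`);
```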
*/ + limit?: number; +} + +/** @internal */ +export class CountDocumentsOperation extends AggregateOperation { + constructor(collection: Collection, query: Document, options: CountDocumentsOptions) { + const pipeline = []; + pipeline.push({ $match: query }); + + if (typeof options.skip === 'number') { + pipeline.push({ $skip: options.skip }); + } + + if (typeof options.limit === 'number') { + pipeline.push({ $limit: options.limit }); + } + + pipeline.push({ $group: { _id: 1, n: { $sum: 1 } } }); + + super(collection.s.namespace, pipeline, options); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.execute(server, session, (err, result) => { + if (err || !result) { + callback(err); + return; + } + + // NOTE: We're avoiding creating a cursor here to reduce the callstack. + const response = result as unknown as Document; + if (response.cursor == null || response.cursor.firstBatch == null) { + callback(undefined, 0); + return; + } + + const docs = response.cursor.firstBatch; + callback(undefined, docs.length ? docs[0].n : 0); + }); + } +} diff --git a/node_modules/mongodb/src/operations/create_collection.ts b/node_modules/mongodb/src/operations/create_collection.ts new file mode 100644 index 000000000..86291c646 --- /dev/null +++ b/node_modules/mongodb/src/operations/create_collection.ts @@ -0,0 +1,122 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import { Aspect, defineAspects } from './operation'; +import { Collection } from '../collection'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { PkFactory } from '../mongo_client'; +import type { ClientSession } from '../sessions'; + +const ILLEGAL_COMMAND_FIELDS = new Set([ + 'w', + 'wtimeout', + 'j', + 'fsync', + 'autoIndexId', + 'pkFactory', + 'raw', + 'readPreference', + 'session', + 'readConcern', + 'writeConcern', + 'raw', + 'fieldsAsRaw', + 'promoteLongs', + 'promoteValues', + 'promoteBuffers', + 'bsonRegExp', + 'serializeFunctions', + 'ignoreUndefined' +]); + +/** @public + * Configuration options for timeseries collections + * @see https://docs.mongodb.com/manual/core/timeseries-collections/ + */ +export interface TimeSeriesCollectionOptions extends Document { + timeField: string; + metaField?: string; + granularity?: string; +} + +/** @public */ +export interface CreateCollectionOptions extends CommandOperationOptions { + /** Returns an error if the collection does not exist */ + strict?: boolean; + /** Create a capped collection */ + capped?: boolean; + /** @deprecated Create an index on the _id field of the document, True by default on MongoDB 2.6 - 3.0 */ + autoIndexId?: boolean; + /** The size of the capped collection in bytes */ + size?: number; + /** The maximum number of documents in the capped collection */ + max?: number; + /** Available for the MMAPv1 storage engine only to set the usePowerOf2Sizes and the noPadding flag */ + flags?: number; + /** Allows users to specify configuration to the storage engine on a per-collection basis when creating a collection on MongoDB 3.0 or higher */ + storageEngine?: Document; + /** Allows users to specify validation rules or expressions for the collection. 
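A hedged sketch of passing these creation options through `db.createCollection()`; the collection names, sizes, and schema below are arbitrary examples, not values from this project:

```ts
const db = client.db('todoApp');

// Capped collection: fixed size in bytes with an optional document cap.
await db.createCollection('auditLog', {
  capped: true,
  size: 1024 * 1024,
  max: 5000
});

// Schema validation: the validator document is forwarded to the create
// command as-is by the operation below.
await db.createCollection('tasks', {
  validator: {
    $jsonSchema: {
      bsonType: 'object',
      required: ['title'],
      properties: { title: { bsonType: 'string' } }
    }
  },
  validationAction: 'error'
});
```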
For more information, see Document Validation on MongoDB 3.2 or higher */ + validator?: Document; + /** Determines how strictly MongoDB applies the validation rules to existing documents during an update on MongoDB 3.2 or higher */ + validationLevel?: string; + /** Determines whether to error on invalid documents or just warn about the violations but allow invalid documents to be inserted on MongoDB 3.2 or higher */ + validationAction?: string; + /** Allows users to specify a default configuration for indexes when creating a collection on MongoDB 3.2 or higher */ + indexOptionDefaults?: Document; + /** The name of the source collection or view from which to create the view. The name is not the full namespace of the collection or view; i.e. does not include the database name and implies the same database as the view to create on MongoDB 3.4 or higher */ + viewOn?: string; + /** An array that consists of the aggregation pipeline stage. Creates the view by applying the specified pipeline to the viewOn collection or view on MongoDB 3.4 or higher */ + pipeline?: Document[]; + /** A primary key factory function for generation of custom _id keys. */ + pkFactory?: PkFactory; + /** A document specifying configuration options for timeseries collections. */ + timeseries?: TimeSeriesCollectionOptions; + /** The number of seconds after which a document in a timeseries collection expires. */ + expireAfterSeconds?: number; +} + +/** @internal */ +export class CreateCollectionOperation extends CommandOperation { + options: CreateCollectionOptions; + db: Db; + name: string; + + constructor(db: Db, name: string, options: CreateCollectionOptions = {}) { + super(db, options); + + this.options = options; + this.db = db; + this.name = name; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const db = this.db; + const name = this.name; + const options = this.options; + + const done: Callback = err => { + if (err) { + return callback(err); + } + + callback(undefined, new Collection(db, name, options)); + }; + + const cmd: Document = { create: name }; + for (const n in options) { + if ( + (options as any)[n] != null && + typeof (options as any)[n] !== 'function' && + !ILLEGAL_COMMAND_FIELDS.has(n) + ) { + cmd[n] = (options as any)[n]; + } + } + + // otherwise just execute the command + super.executeCommand(server, session, cmd, done); + } +} + +defineAspects(CreateCollectionOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/delete.ts b/node_modules/mongodb/src/operations/delete.ts new file mode 100644 index 000000000..c8c15b74b --- /dev/null +++ b/node_modules/mongodb/src/operations/delete.ts @@ -0,0 +1,188 @@ +import { defineAspects, Aspect, Hint } from './operation'; +import { CommandOperation, CommandOperationOptions, CollationOptions } from './command'; +import { Callback, maxWireVersion, MongoDBNamespace, collationNotSupported } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { ClientSession } from '../sessions'; +import { MongoServerError, MongoCompatibilityError } from '../error'; +import type { WriteConcernOptions } from '../write_concern'; + +/** @public */ +export interface DeleteOptions extends CommandOperationOptions, WriteConcernOptions { + /** If true, when an insert fails, don't execute the remaining writes. If false, continue with remaining inserts when one fails. 
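The delete options and result shape defined here (together with the delete operation classes that follow) are what `deleteOne` and `deleteMany` return; a small illustrative call with placeholder filters:

```ts
const tasks = client.db('todoApp').collection('tasks');

// deleteOne builds a single delete statement with limit: 1.
const one = await tasks.deleteOne({ title: 'obsolete task' });

// deleteMany uses limit: 0, i.e. remove every matching document.
const many = await tasks.deleteMany({ done: true });

console.log(one.acknowledged, one.deletedCount, many.deletedCount);
```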
*/ + ordered?: boolean; + /** A user-provided comment to attach to this command */ + comment?: string | Document; + /** Specifies the collation to use for the operation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; + + /** @deprecated use `removeOne` or `removeMany` to implicitly specify the limit */ + single?: boolean; +} + +/** @public */ +export interface DeleteResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined. */ + acknowledged: boolean; + /** The number of documents that were deleted */ + deletedCount: number; +} + +/** @public */ +export interface DeleteStatement { + /** The query that matches documents to delete. */ + q: Document; + /** The number of matching documents to delete. */ + limit: number; + /** Specifies the collation to use for the operation. */ + collation?: CollationOptions; + /** A document or string that specifies the index to use to support the query predicate. */ + hint?: Hint; + /** A user-provided comment to attach to this command */ + comment?: string | Document; +} + +/** @internal */ +export class DeleteOperation extends CommandOperation { + options: DeleteOptions; + statements: DeleteStatement[]; + + constructor(ns: MongoDBNamespace, statements: DeleteStatement[], options: DeleteOptions) { + super(undefined, options); + this.options = options; + this.ns = ns; + this.statements = statements; + } + + get canRetryWrite(): boolean { + if (super.canRetryWrite === false) { + return false; + } + + return this.statements.every(op => (op.limit != null ? op.limit > 0 : true)); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const options = this.options ?? {}; + const ordered = typeof options.ordered === 'boolean' ? options.ordered : true; + const command: Document = { + delete: this.ns.collection, + deletes: this.statements, + ordered + }; + + if (options.let) { + command.let = options.let; + } + + if (options.explain != null && maxWireVersion(server) < 3) { + return callback + ? 
callback( + new MongoCompatibilityError(`Server ${server.name} does not support explain on delete`) + ) + : undefined; + } + + const unacknowledgedWrite = this.writeConcern && this.writeConcern.w === 0; + if (unacknowledgedWrite || maxWireVersion(server) < 5) { + if (this.statements.find((o: Document) => o.hint)) { + callback(new MongoCompatibilityError(`Servers < 3.4 do not support hint on delete`)); + return; + } + } + + const statementWithCollation = this.statements.find(statement => !!statement.collation); + if (statementWithCollation && collationNotSupported(server, statementWithCollation)) { + callback(new MongoCompatibilityError(`Server ${server.name} does not support collation`)); + return; + } + + super.executeCommand(server, session, command, callback); + } +} + +export class DeleteOneOperation extends DeleteOperation { + constructor(collection: Collection, filter: Document, options: DeleteOptions) { + super(collection.s.namespace, [makeDeleteStatement(filter, { ...options, limit: 1 })], options); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.execute(server, session, (err, res) => { + if (err || res == null) return callback(err); + if (res.code) return callback(new MongoServerError(res)); + if (res.writeErrors) return callback(new MongoServerError(res.writeErrors[0])); + if (this.explain) return callback(undefined, res); + + callback(undefined, { + acknowledged: this.writeConcern?.w !== 0 ?? true, + deletedCount: res.n + }); + }); + } +} + +export class DeleteManyOperation extends DeleteOperation { + constructor(collection: Collection, filter: Document, options: DeleteOptions) { + super(collection.s.namespace, [makeDeleteStatement(filter, options)], options); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.execute(server, session, (err, res) => { + if (err || res == null) return callback(err); + if (res.code) return callback(new MongoServerError(res)); + if (res.writeErrors) return callback(new MongoServerError(res.writeErrors[0])); + if (this.explain) return callback(undefined, res); + + callback(undefined, { + acknowledged: this.writeConcern?.w !== 0 ?? true, + deletedCount: res.n + }); + }); + } +} + +export function makeDeleteStatement( + filter: Document, + options: DeleteOptions & { limit?: number } +): DeleteStatement { + const op: DeleteStatement = { + q: filter, + limit: typeof options.limit === 'number' ? 
options.limit : 0 + }; + + if (options.single === true) { + op.limit = 1; + } + + if (options.collation) { + op.collation = options.collation; + } + + if (options.hint) { + op.hint = options.hint; + } + + if (options.comment) { + op.comment = options.comment; + } + + return op; +} + +defineAspects(DeleteOperation, [Aspect.RETRYABLE, Aspect.WRITE_OPERATION]); +defineAspects(DeleteOneOperation, [ + Aspect.RETRYABLE, + Aspect.WRITE_OPERATION, + Aspect.EXPLAINABLE, + Aspect.SKIP_COLLATION +]); +defineAspects(DeleteManyOperation, [ + Aspect.WRITE_OPERATION, + Aspect.EXPLAINABLE, + Aspect.SKIP_COLLATION +]); diff --git a/node_modules/mongodb/src/operations/distinct.ts b/node_modules/mongodb/src/operations/distinct.ts new file mode 100644 index 000000000..5524b2b12 --- /dev/null +++ b/node_modules/mongodb/src/operations/distinct.ts @@ -0,0 +1,88 @@ +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import { decorateWithCollation, decorateWithReadConcern, Callback, maxWireVersion } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import { MongoCompatibilityError } from '../error'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export type DistinctOptions = CommandOperationOptions; + +/** + * Return a list of distinct values for the given key across a collection. + * @internal + */ +export class DistinctOperation extends CommandOperation { + options: DistinctOptions; + collection: Collection; + /** Field of the document to find distinct values for. */ + key: string; + /** The query for filtering the set of documents to which we apply the distinct filter. */ + query: Document; + + /** + * Construct a Distinct operation. + * + * @param collection - Collection instance. + * @param key - Field of the document to find distinct values for. + * @param query - The query for filtering the set of documents to which we apply the distinct filter. + * @param options - Optional settings. See Collection.prototype.distinct for a list of options. + */ + constructor(collection: Collection, key: string, query: Document, options?: DistinctOptions) { + super(collection, options); + + this.options = options ?? {}; + this.collection = collection; + this.key = key; + this.query = query; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const key = this.key; + const query = this.query; + const options = this.options; + + // Distinct command + const cmd: Document = { + distinct: coll.collectionName, + key: key, + query: query + }; + + // Add maxTimeMS if defined + if (typeof options.maxTimeMS === 'number') { + cmd.maxTimeMS = options.maxTimeMS; + } + + // Do we have a readConcern specified + decorateWithReadConcern(cmd, coll, options); + + // Have we specified collation + try { + decorateWithCollation(cmd, coll, options); + } catch (err) { + return callback(err); + } + + if (this.explain && maxWireVersion(server) < 4) { + callback( + new MongoCompatibilityError(`Server ${server.name} does not support explain on distinct`) + ); + return; + } + + super.executeCommand(server, session, cmd, (err, result) => { + if (err) { + callback(err); + return; + } + + callback(undefined, this.explain ? 
result : result.values); + }); + } +} + +defineAspects(DistinctOperation, [Aspect.READ_OPERATION, Aspect.RETRYABLE, Aspect.EXPLAINABLE]); diff --git a/node_modules/mongodb/src/operations/drop.ts b/node_modules/mongodb/src/operations/drop.ts new file mode 100644 index 000000000..f8a375746 --- /dev/null +++ b/node_modules/mongodb/src/operations/drop.ts @@ -0,0 +1,52 @@ +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export type DropCollectionOptions = CommandOperationOptions; + +/** @internal */ +export class DropCollectionOperation extends CommandOperation { + options: DropCollectionOptions; + name: string; + + constructor(db: Db, name: string, options: DropCollectionOptions) { + super(db, options); + this.options = options; + this.name = name; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.executeCommand(server, session, { drop: this.name }, (err, result) => { + if (err) return callback(err); + if (result.ok) return callback(undefined, true); + callback(undefined, false); + }); + } +} + +/** @public */ +export type DropDatabaseOptions = CommandOperationOptions; + +/** @internal */ +export class DropDatabaseOperation extends CommandOperation { + options: DropDatabaseOptions; + + constructor(db: Db, options: DropDatabaseOptions) { + super(db, options); + this.options = options; + } + execute(server: Server, session: ClientSession, callback: Callback): void { + super.executeCommand(server, session, { dropDatabase: 1 }, (err, result) => { + if (err) return callback(err); + if (result.ok) return callback(undefined, true); + callback(undefined, false); + }); + } +} + +defineAspects(DropCollectionOperation, [Aspect.WRITE_OPERATION]); +defineAspects(DropDatabaseOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/estimated_document_count.ts b/node_modules/mongodb/src/operations/estimated_document_count.ts new file mode 100644 index 000000000..90261cb30 --- /dev/null +++ b/node_modules/mongodb/src/operations/estimated_document_count.ts @@ -0,0 +1,75 @@ +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import { Callback, maxWireVersion } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { ClientSession } from '../sessions'; +import type { MongoServerError } from '../error'; + +/** @public */ +export interface EstimatedDocumentCountOptions extends CommandOperationOptions { + /** + * The maximum amount of time to allow the operation to run. + * + * This option is sent only if the caller explicitly provides a value. The default is to not send a value. 
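The drop operations above map to `db.dropCollection()` and `db.dropDatabase()`, both of which resolve to a boolean derived from the command's `ok` field; an illustrative sketch (the error handling assumes the usual "ns not found" server error for a missing collection):

```ts
const db = client.db('todoApp');

// Resolves to true when the server acknowledged the drop; a missing
// collection typically raises a server error rather than resolving false.
const dropped = await db.dropCollection('auditLog').catch(err => {
  if (err.codeName === 'NamespaceNotFound') return false;
  throw err;
});

// Dropping the whole database follows the same pattern.
const dbDropped = await db.dropDatabase();
```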
+ */ + maxTimeMS?: number; +} + +/** @internal */ +export class EstimatedDocumentCountOperation extends CommandOperation { + options: EstimatedDocumentCountOptions; + collectionName: string; + + constructor(collection: Collection, options: EstimatedDocumentCountOptions = {}) { + super(collection, options); + this.options = options; + this.collectionName = collection.collectionName; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + if (maxWireVersion(server) < 12) { + return this.executeLegacy(server, session, callback); + } + const pipeline = [{ $collStats: { count: {} } }, { $group: { _id: 1, n: { $sum: '$count' } } }]; + + const cmd: Document = { aggregate: this.collectionName, pipeline, cursor: {} }; + + if (typeof this.options.maxTimeMS === 'number') { + cmd.maxTimeMS = this.options.maxTimeMS; + } + + super.executeCommand(server, session, cmd, (err, response) => { + if (err && (err as MongoServerError).code !== 26) { + callback(err); + return; + } + + callback(undefined, response?.cursor?.firstBatch[0]?.n || 0); + }); + } + + executeLegacy(server: Server, session: ClientSession, callback: Callback): void { + const cmd: Document = { count: this.collectionName }; + + if (typeof this.options.maxTimeMS === 'number') { + cmd.maxTimeMS = this.options.maxTimeMS; + } + + super.executeCommand(server, session, cmd, (err, response) => { + if (err) { + callback(err); + return; + } + + callback(undefined, response.n || 0); + }); + } +} + +defineAspects(EstimatedDocumentCountOperation, [ + Aspect.READ_OPERATION, + Aspect.RETRYABLE, + Aspect.CURSOR_CREATING +]); diff --git a/node_modules/mongodb/src/operations/eval.ts b/node_modules/mongodb/src/operations/eval.ts new file mode 100644 index 000000000..39749124f --- /dev/null +++ b/node_modules/mongodb/src/operations/eval.ts @@ -0,0 +1,78 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import { Code, Document } from '../bson'; +import { ReadPreference } from '../read_preference'; +import { MongoServerError } from '../error'; +import type { Callback } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { Collection } from '../collection'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export interface EvalOptions extends CommandOperationOptions { + nolock?: boolean; +} + +/** @internal */ +export class EvalOperation extends CommandOperation { + options: EvalOptions; + code: Code; + parameters?: Document | Document[]; + + constructor( + db: Db | Collection, + code: Code, + parameters?: Document | Document[], + options?: EvalOptions + ) { + super(db, options); + + this.options = options ?? {}; + this.code = code; + this.parameters = parameters; + // force primary read preference + Object.defineProperty(this, 'readPreference', { + value: ReadPreference.primary, + configurable: false, + writable: false + }); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + let finalCode = this.code; + let finalParameters: Document[] = []; + + // If not a code object translate to one + if (!(finalCode && (finalCode as unknown as { _bsontype: string })._bsontype === 'Code')) { + finalCode = new Code(finalCode as never); + } + + // Ensure the parameters are correct + if (this.parameters != null && typeof this.parameters !== 'function') { + finalParameters = Array.isArray(this.parameters) ? 
this.parameters : [this.parameters]; + } + + // Create execution selector + const cmd: Document = { $eval: finalCode, args: finalParameters }; + + // Check if the nolock parameter is passed in + if (this.options.nolock) { + cmd.nolock = this.options.nolock; + } + + // Execute the command + super.executeCommand(server, session, cmd, (err, result) => { + if (err) return callback(err); + if (result && result.ok === 1) { + return callback(undefined, result.retval); + } + + if (result) { + callback(new MongoServerError({ message: `eval failed: ${result.errmsg}` })); + return; + } + + callback(err, result); + }); + } +} diff --git a/node_modules/mongodb/src/operations/execute_operation.ts b/node_modules/mongodb/src/operations/execute_operation.ts new file mode 100644 index 000000000..780978324 --- /dev/null +++ b/node_modules/mongodb/src/operations/execute_operation.ts @@ -0,0 +1,269 @@ +import { ReadPreference } from '../read_preference'; +import { + MongoError, + isRetryableError, + MONGODB_ERROR_CODES, + MongoRuntimeError, + MongoNetworkError, + MongoCompatibilityError, + MongoServerError, + MongoExpiredSessionError, + MongoTransactionError +} from '../error'; +import { Aspect, AbstractOperation } from './operation'; +import { maxWireVersion, maybePromise, Callback } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Topology } from '../sdam/topology'; +import type { ClientSession } from '../sessions'; +import type { Document } from '../bson'; +import { supportsRetryableWrites } from '../utils'; + +const MMAPv1_RETRY_WRITES_ERROR_CODE = MONGODB_ERROR_CODES.IllegalOperation; +const MMAPv1_RETRY_WRITES_ERROR_MESSAGE = + 'This MongoDB deployment does not support retryable writes. Please add retryWrites=false to your connection string.'; + +type ResultTypeFromOperation = TOperation extends AbstractOperation + ? K + : never; + +/** @internal */ +export interface ExecutionResult { + /** The server selected for the operation */ + server: Server; + /** The session used for this operation, may be implicitly created */ + session?: ClientSession; + /** The raw server response for the operation */ + response: Document; +} + +/** + * Executes the given operation with provided arguments. + * @internal + * + * @remarks + * This method reduces large amounts of duplication in the entire codebase by providing + * a single point for determining whether callbacks or promises should be used. 
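+ *
+ * For illustration, the two call styles look like this (`topology` and `operation` stand in
+ * for values the caller already holds; a sketch, not documented public API):
+ *
+ *   executeOperation(topology, operation, (err, result) => console.log(err ?? result)); // callback
+ *   executeOperation(topology, operation).then(result => console.log(result));          // promise
+ *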
Additionally + * it allows for a single point of entry to provide features such as implicit sessions, which + * are required by the Driver Sessions specification in the event that a ClientSession is + * not provided + * + * @param topology - The topology to execute this operation on + * @param operation - The operation to execute + * @param callback - The command result callback + */ +export function executeOperation< + T extends AbstractOperation, + TResult = ResultTypeFromOperation +>(topology: Topology, operation: T): Promise; +export function executeOperation< + T extends AbstractOperation, + TResult = ResultTypeFromOperation +>(topology: Topology, operation: T, callback: Callback): void; +export function executeOperation< + T extends AbstractOperation, + TResult = ResultTypeFromOperation +>(topology: Topology, operation: T, callback?: Callback): Promise | void; +export function executeOperation< + T extends AbstractOperation, + TResult = ResultTypeFromOperation +>(topology: Topology, operation: T, callback?: Callback): Promise | void { + if (!(operation instanceof AbstractOperation)) { + // TODO(NODE-3483) + throw new MongoRuntimeError('This method requires a valid operation instance'); + } + + return maybePromise(callback, cb => { + if (topology.shouldCheckForSessionSupport()) { + return topology.selectServer(ReadPreference.primaryPreferred, err => { + if (err) return cb(err); + + executeOperation(topology, operation, cb); + }); + } + + // The driver sessions spec mandates that we implicitly create sessions for operations + // that are not explicitly provided with a session. + let session: ClientSession | undefined = operation.session; + let owner: symbol | undefined; + if (topology.hasSessionSupport()) { + if (session == null) { + owner = Symbol(); + session = topology.startSession({ owner, explicit: false }); + } else if (session.hasEnded) { + return cb(new MongoExpiredSessionError('Use of expired sessions is not permitted')); + } else if (session.snapshotEnabled && !topology.capabilities.supportsSnapshotReads) { + return cb(new MongoCompatibilityError('Snapshot reads require MongoDB 5.0 or later')); + } + } else if (session) { + // If the user passed an explicit session and we are still, after server selection, + // trying to run against a topology that doesn't support sessions we error out. 
+ return cb(new MongoCompatibilityError('Current topology does not support sessions')); + } + + try { + executeWithServerSelection(topology, session, operation, (err, result) => { + if (session && session.owner && session.owner === owner) { + return session.endSession(err2 => cb(err2 || err, result)); + } + + cb(err, result); + }); + } catch (e) { + if (session && session.owner && session.owner === owner) { + session.endSession(); + } + + throw e; + } + }); +} + +function supportsRetryableReads(server: Server) { + return maxWireVersion(server) >= 6; +} + +function executeWithServerSelection( + topology: Topology, + session: ClientSession, + operation: AbstractOperation, + callback: Callback +) { + const readPreference = operation.readPreference || ReadPreference.primary; + const inTransaction = session && session.inTransaction(); + + if (inTransaction && !readPreference.equals(ReadPreference.primary)) { + callback( + new MongoTransactionError( + `Read preference in a transaction must be primary, not: ${readPreference.mode}` + ) + ); + + return; + } + + if ( + session && + session.isPinned && + session.transaction.isCommitted && + !operation.bypassPinningCheck + ) { + session.unpin(); + } + + const serverSelectionOptions = { session }; + function callbackWithRetry(err?: any, result?: any) { + if (err == null) { + return callback(undefined, result); + } + + const hasReadAspect = operation.hasAspect(Aspect.READ_OPERATION); + const hasWriteAspect = operation.hasAspect(Aspect.WRITE_OPERATION); + const itShouldRetryWrite = shouldRetryWrite(err); + + if ((hasReadAspect && !isRetryableError(err)) || (hasWriteAspect && !itShouldRetryWrite)) { + return callback(err); + } + + if ( + hasWriteAspect && + itShouldRetryWrite && + err.code === MMAPv1_RETRY_WRITES_ERROR_CODE && + err.errmsg.match(/Transaction numbers/) + ) { + callback( + new MongoServerError({ + message: MMAPv1_RETRY_WRITES_ERROR_MESSAGE, + errmsg: MMAPv1_RETRY_WRITES_ERROR_MESSAGE, + originalError: err + }) + ); + + return; + } + + // select a new server, and attempt to retry the operation + topology.selectServer(readPreference, serverSelectionOptions, (e?: any, server?: any) => { + if ( + e || + (operation.hasAspect(Aspect.READ_OPERATION) && !supportsRetryableReads(server)) || + (operation.hasAspect(Aspect.WRITE_OPERATION) && !supportsRetryableWrites(server)) + ) { + callback(e); + return; + } + + // If we have a cursor and the initial command fails with a network error, + // we can retry it on another connection. So we need to check it back in, clear the + // pool for the service id, and retry again. 
+ if ( + err && + err instanceof MongoNetworkError && + server.loadBalanced && + session && + session.isPinned && + !session.inTransaction() && + operation.hasAspect(Aspect.CURSOR_CREATING) + ) { + session.unpin({ force: true, forceClear: true }); + } + + operation.execute(server, session, callback); + }); + } + + if ( + readPreference && + !readPreference.equals(ReadPreference.primary) && + session && + session.inTransaction() + ) { + callback( + new MongoTransactionError( + `Read preference in a transaction must be primary, not: ${readPreference.mode}` + ) + ); + + return; + } + + // select a server, and execute the operation against it + topology.selectServer(readPreference, serverSelectionOptions, (err?: any, server?: any) => { + if (err) { + callback(err); + return; + } + + if (session && operation.hasAspect(Aspect.RETRYABLE)) { + const willRetryRead = + topology.s.options.retryReads !== false && + !inTransaction && + supportsRetryableReads(server) && + operation.canRetryRead; + + const willRetryWrite = + topology.s.options.retryWrites === true && + !inTransaction && + supportsRetryableWrites(server) && + operation.canRetryWrite; + + const hasReadAspect = operation.hasAspect(Aspect.READ_OPERATION); + const hasWriteAspect = operation.hasAspect(Aspect.WRITE_OPERATION); + + if ((hasReadAspect && willRetryRead) || (hasWriteAspect && willRetryWrite)) { + if (hasWriteAspect && willRetryWrite) { + operation.options.willRetryWrite = true; + session.incrementTransactionNumber(); + } + + operation.execute(server, session, callbackWithRetry); + return; + } + } + + operation.execute(server, session, callback); + }); +} + +function shouldRetryWrite(err: any) { + return err instanceof MongoError && err.hasErrorLabel('RetryableWriteError'); +} diff --git a/node_modules/mongodb/src/operations/find.ts b/node_modules/mongodb/src/operations/find.ts new file mode 100644 index 000000000..f832eeafc --- /dev/null +++ b/node_modules/mongodb/src/operations/find.ts @@ -0,0 +1,355 @@ +import { Aspect, defineAspects, Hint } from './operation'; +import { + maxWireVersion, + MongoDBNamespace, + Callback, + normalizeHintField, + decorateWithExplain +} from '../utils'; +import { MongoInvalidArgumentError, MongoCompatibilityError } from '../error'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import { CommandOperation, CommandOperationOptions, CollationOptions } from './command'; +import { Sort, formatSort } from '../sort'; +import { isSharded } from '../cmap/wire_protocol/shared'; +import { ReadConcern } from '../read_concern'; +import type { ClientSession } from '../sessions'; + +/** + * @public + * @typeParam TSchema - Unused schema definition, deprecated usage, only specify `FindOptions` with no generic + */ +// eslint-disable-next-line @typescript-eslint/no-unused-vars +export interface FindOptions extends CommandOperationOptions { + /** Sets the limit of documents returned in the query. */ + limit?: number; + /** Set to sort the documents coming back from the query. Array of indexes, `[['a', 1]]` etc. */ + sort?: Sort; + /** The fields to return in the query. Object of fields to either include or exclude (one of, not both), `{'a':1, 'b': 1}` **or** `{'a': 0, 'b': 0}` */ + projection?: Document; + /** Set to skip N documents ahead in your query (useful for pagination). */ + skip?: number; + /** Tell the query to use specific indexes in the query. 
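+   For example (a sketch; the collection, filter, and index names are placeholders):
+   `collection.find({ done: false }, { hint: { _id: 1 } })`, or pass an index name such as
+   `{ hint: 'status_1' }`.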
Object of indexes to use, `{'_id':1}` */ + hint?: Hint; + /** Specify if the cursor can timeout. */ + timeout?: boolean; + /** Specify if the cursor is tailable. */ + tailable?: boolean; + /** Specify if the cursor is a a tailable-await cursor. Requires `tailable` to be true */ + awaitData?: boolean; + /** Set the batchSize for the getMoreCommand when iterating over the query results. */ + batchSize?: number; + /** If true, returns only the index keys in the resulting documents. */ + returnKey?: boolean; + /** The inclusive lower bound for a specific index */ + min?: Document; + /** The exclusive upper bound for a specific index */ + max?: Document; + /** You can put a $comment field on a query to make looking in the profiler logs simpler. */ + comment?: string | Document; + /** Number of milliseconds to wait before aborting the query. */ + maxTimeMS?: number; + /** The maximum amount of time for the server to wait on new documents to satisfy a tailable cursor query. Requires `tailable` and `awaitData` to be true */ + maxAwaitTimeMS?: number; + /** The server normally times out idle cursors after an inactivity period (10 minutes) to prevent excess memory use. Set this option to prevent that. */ + noCursorTimeout?: boolean; + /** Specify collation (MongoDB 3.4 or higher) settings for update operation (see 3.4 documentation for available fields). */ + collation?: CollationOptions; + /** Allows disk use for blocking sort operations exceeding 100MB memory. (MongoDB 3.2 or higher) */ + allowDiskUse?: boolean; + /** Determines whether to close the cursor after the first batch. Defaults to false. */ + singleBatch?: boolean; + /** For queries against a sharded collection, allows the command (or subsequent getMore commands) to return partial results, rather than an error, if one or more queried shards are unavailable. */ + allowPartialResults?: boolean; + /** Determines whether to return the record identifier for each document. If true, adds a field $recordId to the returned documents. */ + showRecordId?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +const SUPPORTS_WRITE_CONCERN_AND_COLLATION = 5; + +/** @internal */ +export class FindOperation extends CommandOperation { + options: FindOptions; + filter: Document; + + constructor( + collection: Collection | undefined, + ns: MongoDBNamespace, + filter: Document = {}, + options: FindOptions = {} + ) { + super(collection, options); + + this.options = options; + this.ns = ns; + + if (typeof filter !== 'object' || Array.isArray(filter)) { + throw new MongoInvalidArgumentError('Query filter must be a plain object or ObjectId'); + } + + // If the filter is a buffer, validate that is a valid BSON document + if (Buffer.isBuffer(filter)) { + const objectSize = filter[0] | (filter[1] << 8) | (filter[2] << 16) | (filter[3] << 24); + if (objectSize !== filter.length) { + throw new MongoInvalidArgumentError( + `Query filter raw message size does not match message header size [${filter.length}] != [${objectSize}]` + ); + } + } + + // special case passing in an ObjectId as a filter + this.filter = filter != null && filter._bsontype === 'ObjectID' ? 
{ _id: filter } : filter; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + this.server = server; + + const serverWireVersion = maxWireVersion(server); + const options = this.options; + if (options.allowDiskUse != null && serverWireVersion < 4) { + callback( + new MongoCompatibilityError('Option "allowDiskUse" is not supported on MongoDB < 3.2') + ); + return; + } + + if (options.collation && serverWireVersion < SUPPORTS_WRITE_CONCERN_AND_COLLATION) { + callback( + new MongoCompatibilityError( + `Server ${server.name}, which reports wire version ${serverWireVersion}, does not support collation` + ) + ); + + return; + } + + if (serverWireVersion < 4) { + if (this.readConcern && this.readConcern.level !== 'local') { + callback( + new MongoCompatibilityError( + `Server find command does not support a readConcern level of ${this.readConcern.level}` + ) + ); + + return; + } + + const findCommand = makeLegacyFindCommand(this.ns, this.filter, options); + if (isSharded(server) && this.readPreference) { + findCommand.$readPreference = this.readPreference.toJSON(); + } + + server.query( + this.ns, + findCommand, + { + ...this.options, + ...this.bsonOptions, + documentsReturnedIn: 'firstBatch', + readPreference: this.readPreference + }, + callback + ); + + return; + } + + let findCommand = makeFindCommand(this.ns, this.filter, options); + if (this.explain) { + findCommand = decorateWithExplain(findCommand, this.explain); + } + + server.command( + this.ns, + findCommand, + { + ...this.options, + ...this.bsonOptions, + documentsReturnedIn: 'firstBatch', + session + }, + callback + ); + } +} + +function makeFindCommand(ns: MongoDBNamespace, filter: Document, options: FindOptions): Document { + const findCommand: Document = { + find: ns.collection, + filter + }; + + if (options.sort) { + findCommand.sort = formatSort(options.sort); + } + + if (options.projection) { + let projection = options.projection; + if (projection && Array.isArray(projection)) { + projection = projection.length + ? 
projection.reduce((result, field) => { + result[field] = 1; + return result; + }, {}) + : { _id: 1 }; + } + + findCommand.projection = projection; + } + + if (options.hint) { + findCommand.hint = normalizeHintField(options.hint); + } + + if (typeof options.skip === 'number') { + findCommand.skip = options.skip; + } + + if (typeof options.limit === 'number') { + if (options.limit < 0) { + findCommand.limit = -options.limit; + findCommand.singleBatch = true; + } else { + findCommand.limit = options.limit; + } + } + + if (typeof options.batchSize === 'number') { + if (options.batchSize < 0) { + if ( + options.limit && + options.limit !== 0 && + Math.abs(options.batchSize) < Math.abs(options.limit) + ) { + findCommand.limit = -options.batchSize; + } + + findCommand.singleBatch = true; + } else { + findCommand.batchSize = options.batchSize; + } + } + + if (typeof options.singleBatch === 'boolean') { + findCommand.singleBatch = options.singleBatch; + } + + if (options.comment) { + findCommand.comment = options.comment; + } + + if (typeof options.maxTimeMS === 'number') { + findCommand.maxTimeMS = options.maxTimeMS; + } + + const readConcern = ReadConcern.fromOptions(options); + if (readConcern) { + findCommand.readConcern = readConcern.toJSON(); + } + + if (options.max) { + findCommand.max = options.max; + } + + if (options.min) { + findCommand.min = options.min; + } + + if (typeof options.returnKey === 'boolean') { + findCommand.returnKey = options.returnKey; + } + + if (typeof options.showRecordId === 'boolean') { + findCommand.showRecordId = options.showRecordId; + } + + if (typeof options.tailable === 'boolean') { + findCommand.tailable = options.tailable; + } + + if (typeof options.timeout === 'boolean') { + findCommand.noCursorTimeout = !options.timeout; + } else if (typeof options.noCursorTimeout === 'boolean') { + findCommand.noCursorTimeout = options.noCursorTimeout; + } + + if (typeof options.awaitData === 'boolean') { + findCommand.awaitData = options.awaitData; + } + + if (typeof options.allowPartialResults === 'boolean') { + findCommand.allowPartialResults = options.allowPartialResults; + } + + if (options.collation) { + findCommand.collation = options.collation; + } + + if (typeof options.allowDiskUse === 'boolean') { + findCommand.allowDiskUse = options.allowDiskUse; + } + + if (options.let) { + findCommand.let = options.let; + } + + return findCommand; +} + +function makeLegacyFindCommand( + ns: MongoDBNamespace, + filter: Document, + options: FindOptions +): Document { + const findCommand: Document = { + $query: filter + }; + + if (options.sort) { + findCommand.$orderby = formatSort(options.sort); + } + + if (options.hint) { + findCommand.$hint = normalizeHintField(options.hint); + } + + if (typeof options.returnKey === 'boolean') { + findCommand.$returnKey = options.returnKey; + } + + if (options.max) { + findCommand.$max = options.max; + } + + if (options.min) { + findCommand.$min = options.min; + } + + if (typeof options.showRecordId === 'boolean') { + findCommand.$showDiskLoc = options.showRecordId; + } + + if (options.comment) { + findCommand.$comment = options.comment; + } + + if (typeof options.maxTimeMS === 'number') { + findCommand.$maxTimeMS = options.maxTimeMS; + } + + if (options.explain != null) { + findCommand.$explain = true; + } + + return findCommand; +} + +defineAspects(FindOperation, [ + Aspect.READ_OPERATION, + Aspect.RETRYABLE, + Aspect.EXPLAINABLE, + Aspect.CURSOR_CREATING +]); diff --git a/node_modules/mongodb/src/operations/find_and_modify.ts 
b/node_modules/mongodb/src/operations/find_and_modify.ts new file mode 100644 index 000000000..eb04588c8 --- /dev/null +++ b/node_modules/mongodb/src/operations/find_and_modify.ts @@ -0,0 +1,276 @@ +import { ReadPreference } from '../read_preference'; +import { maxWireVersion, decorateWithCollation, hasAtomicOperators, Callback } from '../utils'; +import { MongoInvalidArgumentError, MongoCompatibilityError } from '../error'; +import { CommandOperation, CommandOperationOptions } from './command'; +import { defineAspects, Aspect } from './operation'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import { Sort, SortForCmd, formatSort } from '../sort'; +import type { ClientSession } from '../sessions'; +import type { WriteConcern, WriteConcernSettings } from '../write_concern'; + +/** @public */ +export const ReturnDocument = Object.freeze({ + BEFORE: 'before', + AFTER: 'after' +} as const); + +/** @public */ +export type ReturnDocument = typeof ReturnDocument[keyof typeof ReturnDocument]; + +/** @public */ +export interface FindOneAndDeleteOptions extends CommandOperationOptions { + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +/** @public */ +export interface FindOneAndReplaceOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** When set to 'after', returns the updated document rather than the original. The default is 'before'. */ + returnDocument?: ReturnDocument; + /** Determines which document the operation modifies if the query selects multiple documents. */ + sort?: Sort; + /** Upsert the document if it does not exist. */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +/** @public */ +export interface FindOneAndUpdateOptions extends CommandOperationOptions { + /** Optional list of array filters referenced in filtered positional operators */ + arrayFilters?: Document[]; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** An optional hint for query optimization. See the {@link https://docs.mongodb.com/manual/reference/command/update/#update-command-hint|update command} reference for more information.*/ + hint?: Document; + /** Limits the fields to return for all matching documents. */ + projection?: Document; + /** When set to 'after', returns the updated document rather than the original. The default is 'before'. */ + returnDocument?: ReturnDocument; + /** Determines which document the operation modifies if the query selects multiple documents. 
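+   A rough usage sketch of the helper these options feed (collection and field names are
+   placeholders):
+
+     const { value } = await collection.findOneAndUpdate(
+       { done: false },
+       { $set: { done: true } },
+       { sort: { createdAt: 1 }, returnDocument: 'after' }
+     );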
*/ + sort?: Sort; + /** Upsert the document if it does not exist. */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). */ + let?: Document; +} + +/** @internal */ +interface FindAndModifyCmdBase { + remove: boolean; + new: boolean; + upsert: boolean; + update?: Document; + sort?: SortForCmd; + fields?: Document; + bypassDocumentValidation?: boolean; + arrayFilters?: Document[]; + maxTimeMS?: number; + let?: Document; + writeConcern?: WriteConcern | WriteConcernSettings; +} + +function configureFindAndModifyCmdBaseUpdateOpts( + cmdBase: FindAndModifyCmdBase, + options: FindOneAndReplaceOptions | FindOneAndUpdateOptions +): FindAndModifyCmdBase { + cmdBase.new = options.returnDocument === ReturnDocument.AFTER; + cmdBase.upsert = options.upsert === true; + + if (options.bypassDocumentValidation === true) { + cmdBase.bypassDocumentValidation = options.bypassDocumentValidation; + } + return cmdBase; +} + +/** @internal */ +class FindAndModifyOperation extends CommandOperation { + options: FindOneAndReplaceOptions | FindOneAndUpdateOptions | FindOneAndDeleteOptions; + cmdBase: FindAndModifyCmdBase; + collection: Collection; + query: Document; + doc?: Document; + + constructor( + collection: Collection, + query: Document, + options: FindOneAndReplaceOptions | FindOneAndUpdateOptions | FindOneAndDeleteOptions + ) { + super(collection, options); + this.options = options ?? {}; + this.cmdBase = { + remove: false, + new: false, + upsert: false + }; + + const sort = formatSort(options.sort); + if (sort) { + this.cmdBase.sort = sort; + } + + if (options.projection) { + this.cmdBase.fields = options.projection; + } + + if (options.maxTimeMS) { + this.cmdBase.maxTimeMS = options.maxTimeMS; + } + + // Decorate the findAndModify command with the write Concern + if (options.writeConcern) { + this.cmdBase.writeConcern = options.writeConcern; + } + + if (options.let) { + this.cmdBase.let = options.let; + } + + // force primary read preference + this.readPreference = ReadPreference.primary; + + this.collection = collection; + this.query = query; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const query = this.query; + const options = { ...this.options, ...this.bsonOptions }; + + // Create findAndModify command object + const cmd: Document = { + findAndModify: coll.collectionName, + query: query, + ...this.cmdBase + }; + + // Have we specified collation + try { + decorateWithCollation(cmd, coll, options); + } catch (err) { + return callback(err); + } + + if (options.hint) { + // TODO: once this method becomes a CommandOperation we will have the server + // in place to check. 
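+      // A hint is only forwarded for acknowledged writes against servers that understand it
+      // (wire version 8 or newer, per the check below); otherwise the operation fails fast.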
+ const unacknowledgedWrite = this.writeConcern?.w === 0; + if (unacknowledgedWrite || maxWireVersion(server) < 8) { + callback( + new MongoCompatibilityError( + 'The current topology does not support a hint on findAndModify commands' + ) + ); + + return; + } + + cmd.hint = options.hint; + } + + if (this.explain && maxWireVersion(server) < 4) { + callback( + new MongoCompatibilityError( + `Server ${server.name} does not support explain on findAndModify` + ) + ); + return; + } + + // Execute the command + super.executeCommand(server, session, cmd, (err, result) => { + if (err) return callback(err); + return callback(undefined, result); + }); + } +} + +/** @internal */ +export class FindOneAndDeleteOperation extends FindAndModifyOperation { + constructor(collection: Collection, filter: Document, options: FindOneAndDeleteOptions) { + // Basic validation + if (filter == null || typeof filter !== 'object') { + throw new MongoInvalidArgumentError('Argument "filter" must be an object'); + } + + super(collection, filter, options); + this.cmdBase.remove = true; + } +} + +/** @internal */ +export class FindOneAndReplaceOperation extends FindAndModifyOperation { + constructor( + collection: Collection, + filter: Document, + replacement: Document, + options: FindOneAndReplaceOptions + ) { + if (filter == null || typeof filter !== 'object') { + throw new MongoInvalidArgumentError('Argument "filter" must be an object'); + } + + if (replacement == null || typeof replacement !== 'object') { + throw new MongoInvalidArgumentError('Argument "replacement" must be an object'); + } + + if (hasAtomicOperators(replacement)) { + throw new MongoInvalidArgumentError('Replacement document must not contain atomic operators'); + } + + super(collection, filter, options); + this.cmdBase.update = replacement; + configureFindAndModifyCmdBaseUpdateOpts(this.cmdBase, options); + } +} + +/** @internal */ +export class FindOneAndUpdateOperation extends FindAndModifyOperation { + constructor( + collection: Collection, + filter: Document, + update: Document, + options: FindOneAndUpdateOptions + ) { + if (filter == null || typeof filter !== 'object') { + throw new MongoInvalidArgumentError('Argument "filter" must be an object'); + } + + if (update == null || typeof update !== 'object') { + throw new MongoInvalidArgumentError('Argument "update" must be an object'); + } + + if (!hasAtomicOperators(update)) { + throw new MongoInvalidArgumentError('Update document requires atomic operators'); + } + + super(collection, filter, options); + this.cmdBase.update = update; + configureFindAndModifyCmdBaseUpdateOpts(this.cmdBase, options); + + if (options.arrayFilters) { + this.cmdBase.arrayFilters = options.arrayFilters; + } + } +} + +defineAspects(FindAndModifyOperation, [ + Aspect.WRITE_OPERATION, + Aspect.RETRYABLE, + Aspect.EXPLAINABLE +]); diff --git a/node_modules/mongodb/src/operations/indexes.ts b/node_modules/mongodb/src/operations/indexes.ts new file mode 100644 index 000000000..91b479f1b --- /dev/null +++ b/node_modules/mongodb/src/operations/indexes.ts @@ -0,0 +1,525 @@ +import { indexInformation, IndexInformationOptions } from './common_functions'; +import { AbstractOperation, Aspect, defineAspects } from './operation'; +import { MONGODB_ERROR_CODES, MongoServerError, MongoCompatibilityError } from '../error'; +import { + maxWireVersion, + parseIndexOptions, + MongoDBNamespace, + Callback, + getTopology +} from '../utils'; +import { + CommandOperation, + CommandOperationOptions, + OperationParent, + CollationOptions +} from 
'./command'; +import { ReadPreference } from '../read_preference'; +import type { Server } from '../sdam/server'; +import type { Document } from '../bson'; +import type { Collection } from '../collection'; +import type { Db } from '../db'; +import { AbstractCursor } from '../cursor/abstract_cursor'; +import type { ClientSession } from '../sessions'; +import { executeOperation, ExecutionResult } from './execute_operation'; +import type { OneOrMore } from '../mongo_types'; + +const LIST_INDEXES_WIRE_VERSION = 3; +const VALID_INDEX_OPTIONS = new Set([ + 'background', + 'unique', + 'name', + 'partialFilterExpression', + 'sparse', + 'hidden', + 'expireAfterSeconds', + 'storageEngine', + 'collation', + 'version', + + // text indexes + 'weights', + 'default_language', + 'language_override', + 'textIndexVersion', + + // 2d-sphere indexes + '2dsphereIndexVersion', + + // 2d indexes + 'bits', + 'min', + 'max', + + // geoHaystack Indexes + 'bucketSize', + + // wildcard indexes + 'wildcardProjection' +]); + +/** @public */ +export type IndexDirection = -1 | 1 | '2d' | '2dsphere' | 'text' | 'geoHaystack' | number; + +/** @public */ +export type IndexSpecification = OneOrMore< + | string + | [string, IndexDirection] + | { [key: string]: IndexDirection } + | [string, IndexDirection][] + | { [key: string]: IndexDirection }[] +>; + +/** @public */ +export interface IndexDescription + extends Pick< + CreateIndexesOptions, + | 'background' + | 'unique' + | 'partialFilterExpression' + | 'sparse' + | 'hidden' + | 'expireAfterSeconds' + | 'storageEngine' + | 'version' + | 'weights' + | 'default_language' + | 'language_override' + | 'textIndexVersion' + | '2dsphereIndexVersion' + | 'bits' + | 'min' + | 'max' + | 'bucketSize' + | 'wildcardProjection' + > { + collation?: CollationOptions; + name?: string; + key: Document; +} + +/** @public */ +export interface CreateIndexesOptions extends CommandOperationOptions { + /** Creates the index in the background, yielding whenever possible. */ + background?: boolean; + /** Creates an unique index. */ + unique?: boolean; + /** Override the autogenerated index name (useful if the resulting name is larger than 128 bytes) */ + name?: string; + /** Creates a partial index based on the given filter object (MongoDB 3.2 or higher) */ + partialFilterExpression?: Document; + /** Creates a sparse index. */ + sparse?: boolean; + /** Allows you to expire data on indexes applied to a data (MongoDB 2.2 or higher) */ + expireAfterSeconds?: number; + /** Allows users to configure the storage engine on a per-index basis when creating an index. (MongoDB 3.0 or higher) */ + storageEngine?: Document; + /** (MongoDB 4.4. or higher) Specifies how many data-bearing members of a replica set, including the primary, must complete the index builds successfully before the primary marks the indexes as ready. This option accepts the same values for the "w" field in a write concern plus "votingMembers", which indicates all voting data-bearing nodes. */ + commitQuorum?: number | string; + /** Specifies the index version number, either 0 or 1. */ + version?: number; + // text indexes + weights?: Document; + default_language?: string; + language_override?: string; + textIndexVersion?: number; + // 2d-sphere indexes + '2dsphereIndexVersion'?: number; + // 2d indexes + bits?: number; + /** For geospatial indexes set the lower bound for the co-ordinates. */ + min?: number; + /** For geospatial indexes set the high bound for the co-ordinates. 
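+   For example, a 2d index with custom bounds (a sketch; the collection and field names are
+   placeholders): `collection.createIndex({ location: '2d' }, { min: -180, max: 180 })`.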
*/ + max?: number; + // geoHaystack Indexes + bucketSize?: number; + // wildcard indexes + wildcardProjection?: Document; + /** Specifies that the index should exist on the target collection but should not be used by the query planner when executing operations. (MongoDB 4.4 or higher) */ + hidden?: boolean; +} + +function makeIndexSpec(indexSpec: IndexSpecification, options: any): IndexDescription { + const indexParameters = parseIndexOptions(indexSpec); + + // Generate the index name + const name = typeof options.name === 'string' ? options.name : indexParameters.name; + + // Set up the index + const finalIndexSpec: Document = { name, key: indexParameters.fieldHash }; + + // merge valid index options into the index spec + for (const optionName in options) { + if (VALID_INDEX_OPTIONS.has(optionName)) { + finalIndexSpec[optionName] = options[optionName]; + } + } + + return finalIndexSpec as IndexDescription; +} + +/** @internal */ +export class IndexesOperation extends AbstractOperation { + options: IndexInformationOptions; + collection: Collection; + + constructor(collection: Collection, options: IndexInformationOptions) { + super(options); + this.options = options; + this.collection = collection; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const options = this.options; + + indexInformation( + coll.s.db, + coll.collectionName, + { full: true, ...options, readPreference: this.readPreference, session }, + callback + ); + } +} + +/** @internal */ +export class CreateIndexesOperation< + T extends string | string[] = string[] +> extends CommandOperation { + options: CreateIndexesOptions; + collectionName: string; + indexes: IndexDescription[]; + + constructor( + parent: OperationParent, + collectionName: string, + indexes: IndexDescription[], + options?: CreateIndexesOptions + ) { + super(parent, options); + + this.options = options ?? 
{}; + this.collectionName = collectionName; + + this.indexes = indexes; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const options = this.options; + const indexes = this.indexes; + + const serverWireVersion = maxWireVersion(server); + + // Ensure we generate the correct name if the parameter is not set + for (let i = 0; i < indexes.length; i++) { + // Did the user pass in a collation, check if our write server supports it + if (indexes[i].collation && serverWireVersion < 5) { + callback( + new MongoCompatibilityError( + `Server ${server.name}, which reports wire version ${serverWireVersion}, ` + + 'does not support collation' + ) + ); + return; + } + + if (indexes[i].name == null) { + const keys = []; + + for (const name in indexes[i].key) { + keys.push(`${name}_${indexes[i].key[name]}`); + } + + // Set the name + indexes[i].name = keys.join('_'); + } + } + + const cmd: Document = { createIndexes: this.collectionName, indexes }; + + if (options.commitQuorum != null) { + if (serverWireVersion < 9) { + callback( + new MongoCompatibilityError( + 'Option `commitQuorum` for `createIndexes` not supported on servers < 4.4' + ) + ); + return; + } + cmd.commitQuorum = options.commitQuorum; + } + + // collation is set on each index, it should not be defined at the root + this.options.collation = undefined; + + super.executeCommand(server, session, cmd, err => { + if (err) { + callback(err); + return; + } + + const indexNames = indexes.map(index => index.name || ''); + callback(undefined, indexNames as T); + }); + } +} + +/** @internal */ +export class CreateIndexOperation extends CreateIndexesOperation { + constructor( + parent: OperationParent, + collectionName: string, + indexSpec: IndexSpecification, + options?: CreateIndexesOptions + ) { + // createIndex can be called with a variety of styles: + // coll.createIndex('a'); + // coll.createIndex({ a: 1 }); + // coll.createIndex([['a', 1]]); + // createIndexes is always called with an array of index spec objects + + super(parent, collectionName, [makeIndexSpec(indexSpec, options)], options); + } + execute(server: Server, session: ClientSession, callback: Callback): void { + super.execute(server, session, (err, indexNames) => { + if (err || !indexNames) return callback(err); + return callback(undefined, indexNames[0]); + }); + } +} + +/** @internal */ +export class EnsureIndexOperation extends CreateIndexOperation { + db: Db; + collectionName: string; + + constructor( + db: Db, + collectionName: string, + indexSpec: IndexSpecification, + options?: CreateIndexesOptions + ) { + super(db, collectionName, indexSpec, options); + + this.readPreference = ReadPreference.primary; + this.db = db; + this.collectionName = collectionName; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const indexName = this.indexes[0].name; + const cursor = this.db.collection(this.collectionName).listIndexes({ session }); + cursor.toArray((err, indexes) => { + /// ignore "NamespaceNotFound" errors + if (err && (err as MongoServerError).code !== MONGODB_ERROR_CODES.NamespaceNotFound) { + return callback(err); + } + + if (indexes) { + indexes = Array.isArray(indexes) ? 
indexes : [indexes]; + if (indexes.some(index => index.name === indexName)) { + callback(undefined, indexName); + return; + } + } + + super.execute(server, session, callback); + }); + } +} + +/** @public */ +export type DropIndexesOptions = CommandOperationOptions; + +/** @internal */ +export class DropIndexOperation extends CommandOperation { + options: DropIndexesOptions; + collection: Collection; + indexName: string; + + constructor(collection: Collection, indexName: string, options?: DropIndexesOptions) { + super(collection, options); + + this.options = options ?? {}; + this.collection = collection; + this.indexName = indexName; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const cmd = { dropIndexes: this.collection.collectionName, index: this.indexName }; + super.executeCommand(server, session, cmd, callback); + } +} + +/** @internal */ +export class DropIndexesOperation extends DropIndexOperation { + constructor(collection: Collection, options: DropIndexesOptions) { + super(collection, '*', options); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.execute(server, session, err => { + if (err) return callback(err, false); + callback(undefined, true); + }); + } +} + +/** @public */ +export interface ListIndexesOptions extends CommandOperationOptions { + /** The batchSize for the returned command cursor or if pre 2.8 the systems batch collection */ + batchSize?: number; +} + +/** @internal */ +export class ListIndexesOperation extends CommandOperation { + options: ListIndexesOptions; + collectionNamespace: MongoDBNamespace; + + constructor(collection: Collection, options?: ListIndexesOptions) { + super(collection, options); + + this.options = options ?? {}; + this.collectionNamespace = collection.s.namespace; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const serverWireVersion = maxWireVersion(server); + if (serverWireVersion < LIST_INDEXES_WIRE_VERSION) { + const systemIndexesNS = this.collectionNamespace.withCollection('system.indexes'); + const collectionNS = this.collectionNamespace.toString(); + + server.query( + systemIndexesNS, + { query: { ns: collectionNS } }, + { ...this.options, readPreference: this.readPreference }, + callback + ); + return; + } + + const cursor = this.options.batchSize ? 
{ batchSize: this.options.batchSize } : {}; + super.executeCommand( + server, + session, + { listIndexes: this.collectionNamespace.collection, cursor }, + callback + ); + } +} + +/** @public */ +export class ListIndexesCursor extends AbstractCursor { + parent: Collection; + options?: ListIndexesOptions; + + constructor(collection: Collection, options?: ListIndexesOptions) { + super(getTopology(collection), collection.s.namespace, options); + this.parent = collection; + this.options = options; + } + + clone(): ListIndexesCursor { + return new ListIndexesCursor(this.parent, { + ...this.options, + ...this.cursorOptions + }); + } + + /** @internal */ + _initialize(session: ClientSession | undefined, callback: Callback): void { + const operation = new ListIndexesOperation(this.parent, { + ...this.cursorOptions, + ...this.options, + session + }); + + executeOperation(getTopology(this.parent), operation, (err, response) => { + if (err || response == null) return callback(err); + + // TODO: NODE-2882 + callback(undefined, { server: operation.server, session, response }); + }); + } +} + +/** @internal */ +export class IndexExistsOperation extends AbstractOperation { + options: IndexInformationOptions; + collection: Collection; + indexes: string | string[]; + + constructor( + collection: Collection, + indexes: string | string[], + options: IndexInformationOptions + ) { + super(options); + this.options = options; + this.collection = collection; + this.indexes = indexes; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const indexes = this.indexes; + + indexInformation( + coll.s.db, + coll.collectionName, + { ...this.options, readPreference: this.readPreference, session }, + (err, indexInformation) => { + // If we have an error return + if (err != null) return callback(err); + // Let's check for the index names + if (!Array.isArray(indexes)) return callback(undefined, indexInformation[indexes] != null); + // Check in list of indexes + for (let i = 0; i < indexes.length; i++) { + if (indexInformation[indexes[i]] == null) { + return callback(undefined, false); + } + } + + // All keys found return true + return callback(undefined, true); + } + ); + } +} + +/** @internal */ +export class IndexInformationOperation extends AbstractOperation { + options: IndexInformationOptions; + db: Db; + name: string; + + constructor(db: Db, name: string, options?: IndexInformationOptions) { + super(options); + this.options = options ?? 
{}; + this.db = db; + this.name = name; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const db = this.db; + const name = this.name; + + indexInformation( + db, + name, + { ...this.options, readPreference: this.readPreference, session }, + callback + ); + } +} + +defineAspects(ListIndexesOperation, [ + Aspect.READ_OPERATION, + Aspect.RETRYABLE, + Aspect.CURSOR_CREATING +]); +defineAspects(CreateIndexesOperation, [Aspect.WRITE_OPERATION]); +defineAspects(CreateIndexOperation, [Aspect.WRITE_OPERATION]); +defineAspects(EnsureIndexOperation, [Aspect.WRITE_OPERATION]); +defineAspects(DropIndexOperation, [Aspect.WRITE_OPERATION]); +defineAspects(DropIndexesOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/insert.ts b/node_modules/mongodb/src/operations/insert.ts new file mode 100644 index 000000000..10af5cee8 --- /dev/null +++ b/node_modules/mongodb/src/operations/insert.ts @@ -0,0 +1,137 @@ +import { MongoServerError, MongoInvalidArgumentError } from '../error'; +import { defineAspects, Aspect, AbstractOperation } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import { prepareDocs } from './common_functions'; +import type { Callback, MongoDBNamespace } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { Document } from '../bson'; +import type { BulkWriteOptions } from '../bulk/common'; +import { WriteConcern } from '../write_concern'; +import type { ClientSession } from '../sessions'; +import { BulkWriteOperation } from './bulk_write'; +import type { InferIdType } from '../mongo_types'; + +/** @internal */ +export class InsertOperation extends CommandOperation { + options: BulkWriteOptions; + documents: Document[]; + + constructor(ns: MongoDBNamespace, documents: Document[], options: BulkWriteOptions) { + super(undefined, options); + this.options = { ...options, checkKeys: options.checkKeys ?? false }; + this.ns = ns; + this.documents = documents; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const options = this.options ?? {}; + const ordered = typeof options.ordered === 'boolean' ? options.ordered : true; + const command: Document = { + insert: this.ns.collection, + documents: this.documents, + ordered + }; + + if (typeof options.bypassDocumentValidation === 'boolean') { + command.bypassDocumentValidation = options.bypassDocumentValidation; + } + + if (options.comment != null) { + command.comment = options.comment; + } + + super.executeCommand(server, session, command, callback); + } +} + +/** @public */ +export interface InsertOneOptions extends CommandOperationOptions { + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; + /** Force server to assign _id values instead of driver. */ + forceServerObjectId?: boolean; +} + +/** @public */ +export interface InsertOneResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The identifier that was inserted. 
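+   A rough usage sketch (assuming a connected MongoClient `client`; database, collection, and
+   field names are placeholders):
+
+     const { acknowledged, insertedId } = await client.db('app').collection('tasks')
+       .insertOne({ title: 'write README', done: false });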
If the server generated the identifier, this value will be null as the driver does not have access to that data */ + insertedId: InferIdType; +} + +export class InsertOneOperation extends InsertOperation { + constructor(collection: Collection, doc: Document, options: InsertOneOptions) { + super(collection.s.namespace, prepareDocs(collection, [doc], options), options); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.execute(server, session, (err, res) => { + if (err || res == null) return callback(err); + if (res.code) return callback(new MongoServerError(res)); + if (res.writeErrors) { + // This should be a WriteError but we can't change it now because of error hierarchy + return callback(new MongoServerError(res.writeErrors[0])); + } + + callback(undefined, { + acknowledged: this.writeConcern?.w !== 0 ?? true, + insertedId: this.documents[0]._id + }); + }); + } +} + +/** @public */ +export interface InsertManyResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The number of inserted documents for this operations */ + insertedCount: number; + /** Map of the index of the inserted document to the id of the inserted document */ + insertedIds: { [key: number]: InferIdType }; +} + +/** @internal */ +export class InsertManyOperation extends AbstractOperation { + options: BulkWriteOptions; + collection: Collection; + docs: Document[]; + + constructor(collection: Collection, docs: Document[], options: BulkWriteOptions) { + super(options); + + if (!Array.isArray(docs)) { + throw new MongoInvalidArgumentError('Argument "docs" must be an array of documents'); + } + + this.options = options; + this.collection = collection; + this.docs = docs; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const options = { ...this.options, ...this.bsonOptions, readPreference: this.readPreference }; + const writeConcern = WriteConcern.fromOptions(options); + const bulkWriteOperation = new BulkWriteOperation( + coll, + prepareDocs(coll, this.docs, options).map(document => ({ insertOne: { document } })), + options + ); + + bulkWriteOperation.execute(server, session, (err, res) => { + if (err || res == null) return callback(err); + callback(undefined, { + acknowledged: writeConcern?.w !== 0 ?? 
true, + insertedCount: res.insertedCount, + insertedIds: res.insertedIds + }); + }); + } +} + +defineAspects(InsertOperation, [Aspect.RETRYABLE, Aspect.WRITE_OPERATION]); +defineAspects(InsertOneOperation, [Aspect.RETRYABLE, Aspect.WRITE_OPERATION]); +defineAspects(InsertManyOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/is_capped.ts b/node_modules/mongodb/src/operations/is_capped.ts new file mode 100644 index 000000000..4adc5f58e --- /dev/null +++ b/node_modules/mongodb/src/operations/is_capped.ts @@ -0,0 +1,38 @@ +import type { Callback } from '../utils'; +import type { Collection } from '../collection'; +import { OperationOptions, AbstractOperation } from './operation'; +import type { Server } from '../sdam/server'; +import type { ClientSession } from '../sessions'; +import { MongoAPIError } from '../error'; + +/** @internal */ +export class IsCappedOperation extends AbstractOperation { + options: OperationOptions; + collection: Collection; + + constructor(collection: Collection, options: OperationOptions) { + super(options); + this.options = options; + this.collection = collection; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + + coll.s.db + .listCollections( + { name: coll.collectionName }, + { ...this.options, nameOnly: false, readPreference: this.readPreference, session } + ) + .toArray((err, collections) => { + if (err || !collections) return callback(err); + if (collections.length === 0) { + // TODO(NODE-3485) + return callback(new MongoAPIError(`collection ${coll.namespace} not found`)); + } + + const collOptions = collections[0].options; + callback(undefined, !!(collOptions && collOptions.capped)); + }); + } +} diff --git a/node_modules/mongodb/src/operations/list_collections.ts b/node_modules/mongodb/src/operations/list_collections.ts new file mode 100644 index 000000000..4ba93ea82 --- /dev/null +++ b/node_modules/mongodb/src/operations/list_collections.ts @@ -0,0 +1,161 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import { Aspect, defineAspects } from './operation'; +import { maxWireVersion, Callback, getTopology, MongoDBNamespace } from '../utils'; +import * as CONSTANTS from '../constants'; +import type { Binary, Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import { AbstractCursor } from '../cursor/abstract_cursor'; +import type { ClientSession } from '../sessions'; +import { executeOperation, ExecutionResult } from './execute_operation'; + +const LIST_COLLECTIONS_WIRE_VERSION = 3; + +/** @public */ +export interface ListCollectionsOptions extends CommandOperationOptions { + /** Since 4.0: If true, will only return the collection name in the response, and will omit additional info */ + nameOnly?: boolean; + /** The batchSize for the returned command cursor or if pre 2.8 the systems batch collection */ + batchSize?: number; +} + +/** @internal */ +export class ListCollectionsOperation extends CommandOperation { + options: ListCollectionsOptions; + db: Db; + filter: Document; + nameOnly: boolean; + batchSize?: number; + + constructor(db: Db, filter: Document, options?: ListCollectionsOptions) { + super(db, options); + + this.options = options ?? 
{}; + this.db = db; + this.filter = filter; + this.nameOnly = !!this.options.nameOnly; + + if (typeof this.options.batchSize === 'number') { + this.batchSize = this.options.batchSize; + } + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + if (maxWireVersion(server) < LIST_COLLECTIONS_WIRE_VERSION) { + let filter = this.filter; + const databaseName = this.db.s.namespace.db; + + // If we have legacy mode and have not provided a full db name filter it + if (typeof filter.name === 'string' && !new RegExp(`^${databaseName}\\.`).test(filter.name)) { + filter = Object.assign({}, filter); + filter.name = this.db.s.namespace.withCollection(filter.name).toString(); + } + + // No filter, filter by current database + if (filter == null) { + filter = { name: `/${databaseName}/` }; + } + + // Rewrite the filter to use $and to filter out indexes + if (filter.name) { + filter = { $and: [{ name: filter.name }, { name: /^((?!\$).)*$/ }] }; + } else { + filter = { name: /^((?!\$).)*$/ }; + } + + const documentTransform = (doc: Document) => { + const matching = `${databaseName}.`; + const index = doc.name.indexOf(matching); + // Remove database name if available + if (doc.name && index === 0) { + doc.name = doc.name.substr(index + matching.length); + } + + return doc; + }; + + server.query( + new MongoDBNamespace(databaseName, CONSTANTS.SYSTEM_NAMESPACE_COLLECTION), + { query: filter }, + { batchSize: this.batchSize || 1000, readPreference: this.readPreference }, + (err, result) => { + if (result && result.documents && Array.isArray(result.documents)) { + result.documents = result.documents.map(documentTransform); + } + + callback(err, result); + } + ); + + return; + } + + const command = { + listCollections: 1, + filter: this.filter, + cursor: this.batchSize ? 
{ batchSize: this.batchSize } : {}, + nameOnly: this.nameOnly + }; + + return super.executeCommand(server, session, command, callback); + } +} + +/** @public */ +export interface CollectionInfo extends Document { + name: string; + type?: string; + options?: Document; + info?: { + readOnly?: false; + uuid?: Binary; + }; + idIndex?: Document; +} + +/** @public */ +export class ListCollectionsCursor< + T extends Pick | CollectionInfo = + | Pick + | CollectionInfo +> extends AbstractCursor { + parent: Db; + filter: Document; + options?: ListCollectionsOptions; + + constructor(db: Db, filter: Document, options?: ListCollectionsOptions) { + super(getTopology(db), db.s.namespace, options); + this.parent = db; + this.filter = filter; + this.options = options; + } + + clone(): ListCollectionsCursor { + return new ListCollectionsCursor(this.parent, this.filter, { + ...this.options, + ...this.cursorOptions + }); + } + + /** @internal */ + _initialize(session: ClientSession | undefined, callback: Callback): void { + const operation = new ListCollectionsOperation(this.parent, this.filter, { + ...this.cursorOptions, + ...this.options, + session + }); + + executeOperation(getTopology(this.parent), operation, (err, response) => { + if (err || response == null) return callback(err); + + // TODO: NODE-2882 + callback(undefined, { server: operation.server, session, response }); + }); + } +} + +defineAspects(ListCollectionsOperation, [ + Aspect.READ_OPERATION, + Aspect.RETRYABLE, + Aspect.CURSOR_CREATING +]); diff --git a/node_modules/mongodb/src/operations/list_databases.ts b/node_modules/mongodb/src/operations/list_databases.ts new file mode 100644 index 000000000..d138e3591 --- /dev/null +++ b/node_modules/mongodb/src/operations/list_databases.ts @@ -0,0 +1,50 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import { Aspect, defineAspects } from './operation'; +import { MongoDBNamespace, Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export type ListDatabasesResult = string[] | Document[]; + +/** @public */ +export interface ListDatabasesOptions extends CommandOperationOptions { + /** A query predicate that determines which databases are listed */ + filter?: Document; + /** A flag to indicate whether the command should return just the database names, or return both database names and size information */ + nameOnly?: boolean; + /** A flag that determines which databases are returned based on the user privileges when access control is enabled */ + authorizedDatabases?: boolean; +} + +/** @internal */ +export class ListDatabasesOperation extends CommandOperation { + options: ListDatabasesOptions; + + constructor(db: Db, options?: ListDatabasesOptions) { + super(db, options); + this.options = options ?? 
{}; + this.ns = new MongoDBNamespace('admin', '$cmd'); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const cmd: Document = { listDatabases: 1 }; + if (this.options.nameOnly) { + cmd.nameOnly = Number(cmd.nameOnly); + } + + if (this.options.filter) { + cmd.filter = this.options.filter; + } + + if (typeof this.options.authorizedDatabases === 'boolean') { + cmd.authorizedDatabases = this.options.authorizedDatabases; + } + + super.executeCommand(server, session, cmd, callback); + } +} + +defineAspects(ListDatabasesOperation, [Aspect.READ_OPERATION, Aspect.RETRYABLE]); diff --git a/node_modules/mongodb/src/operations/map_reduce.ts b/node_modules/mongodb/src/operations/map_reduce.ts new file mode 100644 index 000000000..3e84aeb0e --- /dev/null +++ b/node_modules/mongodb/src/operations/map_reduce.ts @@ -0,0 +1,248 @@ +import { Code, Document } from '../bson'; +import { + applyWriteConcern, + decorateWithCollation, + decorateWithReadConcern, + isObject, + Callback, + maxWireVersion +} from '../utils'; +import { ReadPreference, ReadPreferenceMode } from '../read_preference'; +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { Sort } from '../sort'; +import { MongoServerError, MongoCompatibilityError } from '../error'; +import type { ObjectId } from '../bson'; +import { Aspect, defineAspects } from './operation'; +import type { ClientSession } from '../sessions'; +import { Db } from '../db'; + +const exclusionList = [ + 'explain', + 'readPreference', + 'readConcern', + 'session', + 'bypassDocumentValidation', + 'writeConcern', + 'raw', + 'fieldsAsRaw', + 'promoteLongs', + 'promoteValues', + 'promoteBuffers', + 'bsonRegExp', + 'serializeFunctions', + 'ignoreUndefined', + 'scope' // this option is reformatted thus exclude the original +]; + +/** @public */ +export type MapFunction = (this: TSchema) => void; +/** @public */ +export type ReduceFunction = (key: TKey, values: TValue[]) => TValue; +/** @public */ +export type FinalizeFunction = ( + key: TKey, + reducedValue: TValue +) => TValue; + +/** @public */ +export interface MapReduceOptions + extends CommandOperationOptions { + /** Sets the output target for the map reduce job. */ + out?: 'inline' | { inline: 1 } | { replace: string } | { merge: string } | { reduce: string }; + /** Query filter object. */ + query?: Document; + /** Sorts the input objects using this key. Useful for optimization, like sorting by the emit key for fewer reduces. */ + sort?: Sort; + /** Number of objects to return from collection. */ + limit?: number; + /** Keep temporary data. */ + keeptemp?: boolean; + /** Finalize function. */ + finalize?: FinalizeFunction | string; + /** Can pass in variables that can be access from map/reduce/finalize. */ + scope?: Document; + /** It is possible to make the execution stay in JS. Provided in MongoDB \> 2.0.X. */ + jsMode?: boolean; + /** Provide statistics on job execution time. */ + verbose?: boolean; + /** Allow driver to bypass schema validation in MongoDB 3.2 or higher. */ + bypassDocumentValidation?: boolean; +} + +interface MapReduceStats { + processtime?: number; + counts?: number; + timing?: number; +} + +/** + * Run Map Reduce across a collection. Be aware that the inline option for out will return an array of results not a collection. 
+ * @internal + */ +export class MapReduceOperation extends CommandOperation { + options: MapReduceOptions; + collection: Collection; + /** The mapping function. */ + map: MapFunction | string; + /** The reduce function. */ + reduce: ReduceFunction | string; + + /** + * Constructs a MapReduce operation. + * + * @param collection - Collection instance. + * @param map - The mapping function. + * @param reduce - The reduce function. + * @param options - Optional settings. See Collection.prototype.mapReduce for a list of options. + */ + constructor( + collection: Collection, + map: MapFunction | string, + reduce: ReduceFunction | string, + options?: MapReduceOptions + ) { + super(collection, options); + + this.options = options ?? {}; + this.collection = collection; + this.map = map; + this.reduce = reduce; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + const map = this.map; + const reduce = this.reduce; + let options = this.options; + + const mapCommandHash: Document = { + mapReduce: coll.collectionName, + map: map, + reduce: reduce + }; + + if (options.scope) { + mapCommandHash.scope = processScope(options.scope); + } + + // Add any other options passed in + for (const n in options) { + // Only include if not in exclusion list + if (exclusionList.indexOf(n) === -1) { + mapCommandHash[n] = (options as any)[n]; + } + } + + options = Object.assign({}, options); + + // If we have a read preference and inline is not set as output fail hard + if ( + this.readPreference.mode === ReadPreferenceMode.primary && + options.out && + (options.out as any).inline !== 1 && + options.out !== 'inline' + ) { + // Force readPreference to primary + options.readPreference = ReadPreference.primary; + // Decorate command with writeConcern if supported + applyWriteConcern(mapCommandHash, { db: coll.s.db, collection: coll }, options); + } else { + decorateWithReadConcern(mapCommandHash, coll, options); + } + + // Is bypassDocumentValidation specified + if (options.bypassDocumentValidation === true) { + mapCommandHash.bypassDocumentValidation = options.bypassDocumentValidation; + } + + // Have we specified collation + try { + decorateWithCollation(mapCommandHash, coll, options); + } catch (err) { + return callback(err); + } + + if (this.explain && maxWireVersion(server) < 9) { + callback( + new MongoCompatibilityError(`Server ${server.name} does not support explain on mapReduce`) + ); + return; + } + + // Execute command + super.executeCommand(server, session, mapCommandHash, (err, result) => { + if (err) return callback(err); + // Check if we have an error + if (1 !== result.ok || result.err || result.errmsg) { + return callback(new MongoServerError(result)); + } + + // If an explain option was executed, don't process the server results + if (this.explain) return callback(undefined, result); + + // Create statistics value + const stats: MapReduceStats = {}; + if (result.timeMillis) stats['processtime'] = result.timeMillis; + if (result.counts) stats['counts'] = result.counts; + if (result.timing) stats['timing'] = result.timing; + + // invoked with inline? 
+ if (result.results) { + // If we wish for no verbosity + if (options['verbose'] == null || !options['verbose']) { + return callback(undefined, result.results); + } + + return callback(undefined, { results: result.results, stats: stats }); + } + + // The returned collection + let collection = null; + + // If we have an object it's a different db + if (result.result != null && typeof result.result === 'object') { + const doc = result.result; + // Return a collection from another db + collection = new Db(coll.s.db.s.client, doc.db, coll.s.db.s.options).collection( + doc.collection + ); + } else { + // Create a collection object that wraps the result collection + collection = coll.s.db.collection(result.result); + } + + // If we wish for no verbosity + if (options['verbose'] == null || !options['verbose']) { + return callback(err, collection); + } + + // Return stats as third set of values + callback(err, { collection, stats }); + }); + } +} + +/** Functions that are passed as scope args must be converted to Code instances. */ +function processScope(scope: Document | ObjectId) { + if (!isObject(scope) || (scope as any)._bsontype === 'ObjectID') { + return scope; + } + + const newScope: Document = {}; + + for (const key of Object.keys(scope)) { + if ('function' === typeof (scope as Document)[key]) { + newScope[key] = new Code(String((scope as Document)[key])); + } else if ((scope as Document)[key]._bsontype === 'Code') { + newScope[key] = (scope as Document)[key]; + } else { + newScope[key] = processScope((scope as Document)[key]); + } + } + + return newScope; +} + +defineAspects(MapReduceOperation, [Aspect.EXPLAINABLE]); diff --git a/node_modules/mongodb/src/operations/operation.ts b/node_modules/mongodb/src/operations/operation.ts new file mode 100644 index 000000000..c503b8180 --- /dev/null +++ b/node_modules/mongodb/src/operations/operation.ts @@ -0,0 +1,116 @@ +import { ReadPreference, ReadPreferenceLike } from '../read_preference'; +import type { ClientSession } from '../sessions'; +import { Document, BSONSerializeOptions, resolveBSONOptions } from '../bson'; +import type { MongoDBNamespace, Callback } from '../utils'; +import type { Server } from '../sdam/server'; + +export const Aspect = { + READ_OPERATION: Symbol('READ_OPERATION'), + WRITE_OPERATION: Symbol('WRITE_OPERATION'), + RETRYABLE: Symbol('RETRYABLE'), + EXPLAINABLE: Symbol('EXPLAINABLE'), + SKIP_COLLATION: Symbol('SKIP_COLLATION'), + CURSOR_CREATING: Symbol('CURSOR_CREATING') +} as const; + +/** @public */ +export type Hint = string | Document; + +export interface OperationConstructor extends Function { + aspects?: Set; +} + +/** @public */ +export interface OperationOptions extends BSONSerializeOptions { + /** Specify ClientSession for this command */ + session?: ClientSession; + willRetryWrites?: boolean; + + /** The preferred read preference (ReadPreference.primary, ReadPreference.primary_preferred, ReadPreference.secondary, ReadPreference.secondary_preferred, ReadPreference.nearest). */ + readPreference?: ReadPreferenceLike; + + /** @internal Hints to `executeOperation` that this operation should not unpin on an ended transaction */ + bypassPinningCheck?: boolean; +} + +/** @internal */ +const kSession = Symbol('session'); + +/** + * This class acts as a parent class for any operation and is responsible for setting this.options, + * as well as setting and getting a session. + * Additionally, this class implements `hasAspect`, which determines whether an operation has + * a specific aspect. 
+ * @internal + */ +export abstract class AbstractOperation { + ns!: MongoDBNamespace; + cmd!: Document; + readPreference: ReadPreference; + server!: Server; + bypassPinningCheck: boolean; + + // BSON serialization options + bsonOptions?: BSONSerializeOptions; + + // TODO: Each operation defines its own options, there should be better typing here + options: Document; + + [kSession]: ClientSession; + + constructor(options: OperationOptions = {}) { + this.readPreference = this.hasAspect(Aspect.WRITE_OPERATION) + ? ReadPreference.primary + : ReadPreference.fromOptions(options) ?? ReadPreference.primary; + + // Pull the BSON serialize options from the already-resolved options + this.bsonOptions = resolveBSONOptions(options); + + if (options.session) { + this[kSession] = options.session; + } + + this.options = options; + this.bypassPinningCheck = !!options.bypassPinningCheck; + } + + abstract execute(server: Server, session: ClientSession, callback: Callback): void; + + hasAspect(aspect: symbol): boolean { + const ctor = this.constructor as OperationConstructor; + if (ctor.aspects == null) { + return false; + } + + return ctor.aspects.has(aspect); + } + + get session(): ClientSession { + return this[kSession]; + } + + get canRetryRead(): boolean { + return true; + } + + get canRetryWrite(): boolean { + return true; + } +} + +export function defineAspects( + operation: OperationConstructor, + aspects: symbol | symbol[] | Set +): Set { + if (!Array.isArray(aspects) && !(aspects instanceof Set)) { + aspects = [aspects]; + } + + aspects = new Set(aspects); + Object.defineProperty(operation, 'aspects', { + value: aspects, + writable: false + }); + + return aspects; +} diff --git a/node_modules/mongodb/src/operations/options_operation.ts b/node_modules/mongodb/src/operations/options_operation.ts new file mode 100644 index 000000000..0d1dee1ec --- /dev/null +++ b/node_modules/mongodb/src/operations/options_operation.ts @@ -0,0 +1,38 @@ +import { AbstractOperation, OperationOptions } from './operation'; +import { MongoAPIError } from '../error'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Collection } from '../collection'; +import type { Server } from '../sdam/server'; +import type { ClientSession } from '../sessions'; + +/** @internal */ +export class OptionsOperation extends AbstractOperation { + options: OperationOptions; + collection: Collection; + + constructor(collection: Collection, options: OperationOptions) { + super(options); + this.options = options; + this.collection = collection; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + + coll.s.db + .listCollections( + { name: coll.collectionName }, + { ...this.options, nameOnly: false, readPreference: this.readPreference, session } + ) + .toArray((err, collections) => { + if (err || !collections) return callback(err); + if (collections.length === 0) { + // TODO(NODE-3485) + return callback(new MongoAPIError(`collection ${coll.namespace} not found`)); + } + + callback(err, collections[0].options); + }); + } +} diff --git a/node_modules/mongodb/src/operations/profiling_level.ts b/node_modules/mongodb/src/operations/profiling_level.ts new file mode 100644 index 000000000..2c28da5f0 --- /dev/null +++ b/node_modules/mongodb/src/operations/profiling_level.ts @@ -0,0 +1,35 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback } from '../utils'; +import type { Server } from 
'../sdam/server'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; +import { MongoRuntimeError } from '../error'; + +/** @public */ +export type ProfilingLevelOptions = CommandOperationOptions; + +/** @internal */ +export class ProfilingLevelOperation extends CommandOperation { + options: ProfilingLevelOptions; + + constructor(db: Db, options: ProfilingLevelOptions) { + super(db, options); + this.options = options; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.executeCommand(server, session, { profile: -1 }, (err, doc) => { + if (err == null && doc.ok === 1) { + const was = doc.was; + if (was === 0) return callback(undefined, 'off'); + if (was === 1) return callback(undefined, 'slow_only'); + if (was === 2) return callback(undefined, 'all'); + // TODO(NODE-3483) + return callback(new MongoRuntimeError(`Illegal profiling level value ${was}`)); + } else { + // TODO(NODE-3483): Consider MongoUnexpectedServerResponseError + err != null ? callback(err) : callback(new MongoRuntimeError('Error with profile command')); + } + }); + } +} diff --git a/node_modules/mongodb/src/operations/remove_user.ts b/node_modules/mongodb/src/operations/remove_user.ts new file mode 100644 index 000000000..073b6eb99 --- /dev/null +++ b/node_modules/mongodb/src/operations/remove_user.ts @@ -0,0 +1,29 @@ +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export type RemoveUserOptions = CommandOperationOptions; + +/** @internal */ +export class RemoveUserOperation extends CommandOperation { + options: RemoveUserOptions; + username: string; + + constructor(db: Db, username: string, options: RemoveUserOptions) { + super(db, options); + this.options = options; + this.username = username; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + super.executeCommand(server, session, { dropUser: this.username }, err => { + callback(err, err ? false : true); + }); + } +} + +defineAspects(RemoveUserOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/rename.ts b/node_modules/mongodb/src/operations/rename.ts new file mode 100644 index 000000000..47d03a2e2 --- /dev/null +++ b/node_modules/mongodb/src/operations/rename.ts @@ -0,0 +1,63 @@ +import { checkCollectionName, Callback } from '../utils'; +import { RunAdminCommandOperation } from './run_command'; +import { defineAspects, Aspect } from './operation'; +import type { Server } from '../sdam/server'; +import { Collection } from '../collection'; +import type { CommandOperationOptions } from './command'; +import { MongoServerError } from '../error'; +import type { ClientSession } from '../sessions'; +import type { Document } from 'bson'; + +/** @public */ +export interface RenameOptions extends CommandOperationOptions { + /** Drop the target name collection if it previously exists. 
*/ + dropTarget?: boolean; + /** Unclear */ + new_collection?: boolean; +} + +/** @internal */ +export class RenameOperation extends RunAdminCommandOperation { + options: RenameOptions; + collection: Collection; + newName: string; + + constructor(collection: Collection, newName: string, options: RenameOptions) { + // Check the collection name + checkCollectionName(newName); + + // Build the command + const renameCollection = collection.namespace; + const toCollection = collection.s.namespace.withCollection(newName).toString(); + const dropTarget = typeof options.dropTarget === 'boolean' ? options.dropTarget : false; + const cmd = { renameCollection: renameCollection, to: toCollection, dropTarget: dropTarget }; + + super(collection, cmd, options); + this.options = options; + this.collection = collection; + this.newName = newName; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const coll = this.collection; + + super.execute(server, session, (err, doc) => { + if (err) return callback(err); + // We have an error + if (doc.errmsg) { + return callback(new MongoServerError(doc)); + } + + let newColl: Collection; + try { + newColl = new Collection(coll.s.db, this.newName, coll.s.options); + } catch (err) { + return callback(err); + } + + return callback(undefined, newColl); + }); + } +} + +defineAspects(RenameOperation, [Aspect.WRITE_OPERATION]); diff --git a/node_modules/mongodb/src/operations/run_command.ts b/node_modules/mongodb/src/operations/run_command.ts new file mode 100644 index 000000000..e9cabb90d --- /dev/null +++ b/node_modules/mongodb/src/operations/run_command.ts @@ -0,0 +1,32 @@ +import { CommandOperation, CommandOperationOptions, OperationParent } from './command'; +import { MongoDBNamespace, Callback } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Document } from '../bson'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export type RunCommandOptions = CommandOperationOptions; + +/** @internal */ +export class RunCommandOperation extends CommandOperation { + options: RunCommandOptions; + command: Document; + + constructor(parent: OperationParent | undefined, command: Document, options?: RunCommandOptions) { + super(parent, options); + this.options = options ?? 
{}; + this.command = command; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const command = this.command; + this.executeCommand(server, session, command, callback); + } +} + +export class RunAdminCommandOperation extends RunCommandOperation { + constructor(parent: OperationParent | undefined, command: Document, options?: RunCommandOptions) { + super(parent, command, options); + this.ns = new MongoDBNamespace('admin'); + } +} diff --git a/node_modules/mongodb/src/operations/set_profiling_level.ts b/node_modules/mongodb/src/operations/set_profiling_level.ts new file mode 100644 index 000000000..db3a2e0c7 --- /dev/null +++ b/node_modules/mongodb/src/operations/set_profiling_level.ts @@ -0,0 +1,69 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback } from '../utils'; +import { enumToString } from '../utils'; +import type { Server } from '../sdam/server'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; +import { MongoRuntimeError, MongoInvalidArgumentError } from '../error'; +const levelValues = new Set(['off', 'slow_only', 'all']); + +/** @public */ +export const ProfilingLevel = Object.freeze({ + off: 'off', + slowOnly: 'slow_only', + all: 'all' +} as const); + +/** @public */ +export type ProfilingLevel = typeof ProfilingLevel[keyof typeof ProfilingLevel]; + +/** @public */ +export type SetProfilingLevelOptions = CommandOperationOptions; + +/** @internal */ +export class SetProfilingLevelOperation extends CommandOperation { + options: SetProfilingLevelOptions; + level: ProfilingLevel; + profile: 0 | 1 | 2; + + constructor(db: Db, level: ProfilingLevel, options: SetProfilingLevelOptions) { + super(db, options); + this.options = options; + switch (level) { + case ProfilingLevel.off: + this.profile = 0; + break; + case ProfilingLevel.slowOnly: + this.profile = 1; + break; + case ProfilingLevel.all: + this.profile = 2; + break; + default: + this.profile = 0; + break; + } + + this.level = level; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const level = this.level; + + if (!levelValues.has(level)) { + return callback( + new MongoInvalidArgumentError( + `Profiling level must be one of "${enumToString(ProfilingLevel)}"` + ) + ); + } + + // TODO(NODE-3483): Determine error to put here + super.executeCommand(server, session, { profile: this.profile }, (err, doc) => { + if (err == null && doc.ok === 1) return callback(undefined, level); + return err != null + ? callback(err) + : callback(new MongoRuntimeError('Error with profile command')); + }); + } +} diff --git a/node_modules/mongodb/src/operations/stats.ts b/node_modules/mongodb/src/operations/stats.ts new file mode 100644 index 000000000..98de40765 --- /dev/null +++ b/node_modules/mongodb/src/operations/stats.ts @@ -0,0 +1,263 @@ +import { Aspect, defineAspects } from './operation'; +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { Db } from '../db'; +import type { ClientSession } from '../sessions'; + +/** @public */ +export interface CollStatsOptions extends CommandOperationOptions { + /** Divide the returned sizes by scale value. */ + scale?: number; +} + +/** + * Get all the collection statistics. 
+ * @internal + */ +export class CollStatsOperation extends CommandOperation { + options: CollStatsOptions; + collectionName: string; + + /** + * Construct a Stats operation. + * + * @param collection - Collection instance + * @param options - Optional settings. See Collection.prototype.stats for a list of options. + */ + constructor(collection: Collection, options?: CollStatsOptions) { + super(collection, options); + this.options = options ?? {}; + this.collectionName = collection.collectionName; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const command: Document = { collStats: this.collectionName }; + if (this.options.scale != null) { + command.scale = this.options.scale; + } + + super.executeCommand(server, session, command, callback); + } +} + +/** @public */ +export interface DbStatsOptions extends CommandOperationOptions { + /** Divide the returned sizes by scale value. */ + scale?: number; +} + +/** @internal */ +export class DbStatsOperation extends CommandOperation { + options: DbStatsOptions; + + constructor(db: Db, options: DbStatsOptions) { + super(db, options); + this.options = options; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const command: Document = { dbStats: true }; + if (this.options.scale != null) { + command.scale = this.options.scale; + } + + super.executeCommand(server, session, command, callback); + } +} + +/** + * @public + * @see https://docs.mongodb.org/manual/reference/command/collStats/ + */ +export interface CollStats extends Document { + /** Namespace */ + ns: string; + /** Number of documents */ + count: number; + /** Collection size in bytes */ + size: number; + /** Average object size in bytes */ + avgObjSize: number; + /** (Pre)allocated space for the collection in bytes */ + storageSize: number; + /** Number of extents (contiguously allocated chunks of datafile space) */ + numExtents: number; + /** Number of indexes */ + nindexes: number; + /** Size of the most recently created extent in bytes */ + lastExtentSize: number; + /** Padding can speed up updates if documents grow */ + paddingFactor: number; + /** A number that indicates the user-set flags on the collection. userFlags only appears when using the mmapv1 storage engine */ + userFlags?: number; + /** Total index size in bytes */ + totalIndexSize: number; + /** Size of specific indexes in bytes */ + indexSizes: { + _id_: number; + [index: string]: number; + }; + /** `true` if the collection is capped */ + capped: boolean; + /** The maximum number of documents that may be present in a capped collection */ + max: number; + /** The maximum size of a capped collection */ + maxSize: number; + /** This document contains data reported directly by the WiredTiger engine and other data for internal diagnostic use */ + wiredTiger?: WiredTigerData; + /** The fields in this document are the names of the indexes, while the values themselves are documents that contain statistics for the index provided by the storage engine */ + indexDetails?: any; + ok: number; + + /** The amount of storage available for reuse. The scale argument affects this value. */ + freeStorageSize?: number; + /** An array that contains the names of the indexes that are currently being built on the collection */ + indexBuilds?: number; + /** The sum of the storageSize and totalIndexSize. The scale argument affects this value */ + totalSize: number; + /** The scale value used by the command. 
*/ + scaleFactor: number; +} + +/** @public */ +export interface WiredTigerData extends Document { + LSM: { + 'bloom filter false positives': number; + 'bloom filter hits': number; + 'bloom filter misses': number; + 'bloom filter pages evicted from cache': number; + 'bloom filter pages read into cache': number; + 'bloom filters in the LSM tree': number; + 'chunks in the LSM tree': number; + 'highest merge generation in the LSM tree': number; + 'queries that could have benefited from a Bloom filter that did not exist': number; + 'sleep for LSM checkpoint throttle': number; + 'sleep for LSM merge throttle': number; + 'total size of bloom filters': number; + } & Document; + 'block-manager': { + 'allocations requiring file extension': number; + 'blocks allocated': number; + 'blocks freed': number; + 'checkpoint size': number; + 'file allocation unit size': number; + 'file bytes available for reuse': number; + 'file magic number': number; + 'file major version number': number; + 'file size in bytes': number; + 'minor version number': number; + }; + btree: { + 'btree checkpoint generation': number; + 'column-store fixed-size leaf pages': number; + 'column-store internal pages': number; + 'column-store variable-size RLE encoded values': number; + 'column-store variable-size deleted values': number; + 'column-store variable-size leaf pages': number; + 'fixed-record size': number; + 'maximum internal page key size': number; + 'maximum internal page size': number; + 'maximum leaf page key size': number; + 'maximum leaf page size': number; + 'maximum leaf page value size': number; + 'maximum tree depth': number; + 'number of key/value pairs': number; + 'overflow pages': number; + 'pages rewritten by compaction': number; + 'row-store internal pages': number; + 'row-store leaf pages': number; + } & Document; + cache: { + 'bytes currently in the cache': number; + 'bytes read into cache': number; + 'bytes written from cache': number; + 'checkpoint blocked page eviction': number; + 'data source pages selected for eviction unable to be evicted': number; + 'hazard pointer blocked page eviction': number; + 'in-memory page passed criteria to be split': number; + 'in-memory page splits': number; + 'internal pages evicted': number; + 'internal pages split during eviction': number; + 'leaf pages split during eviction': number; + 'modified pages evicted': number; + 'overflow pages read into cache': number; + 'overflow values cached in memory': number; + 'page split during eviction deepened the tree': number; + 'page written requiring lookaside records': number; + 'pages read into cache': number; + 'pages read into cache requiring lookaside entries': number; + 'pages requested from the cache': number; + 'pages written from cache': number; + 'pages written requiring in-memory restoration': number; + 'tracked dirty bytes in the cache': number; + 'unmodified pages evicted': number; + } & Document; + cache_walk: { + 'Average difference between current eviction generation when the page was last considered': number; + 'Average on-disk page image size seen': number; + 'Clean pages currently in cache': number; + 'Current eviction generation': number; + 'Dirty pages currently in cache': number; + 'Entries in the root page': number; + 'Internal pages currently in cache': number; + 'Leaf pages currently in cache': number; + 'Maximum difference between current eviction generation when the page was last considered': number; + 'Maximum page size seen': number; + 'Minimum on-disk page image size seen': number; + 'On-disk page 
image sizes smaller than a single allocation unit': number; + 'Pages created in memory and never written': number; + 'Pages currently queued for eviction': number; + 'Pages that could not be queued for eviction': number; + 'Refs skipped during cache traversal': number; + 'Size of the root page': number; + 'Total number of pages currently in cache': number; + } & Document; + compression: { + 'compressed pages read': number; + 'compressed pages written': number; + 'page written failed to compress': number; + 'page written was too small to compress': number; + 'raw compression call failed, additional data available': number; + 'raw compression call failed, no additional data available': number; + 'raw compression call succeeded': number; + } & Document; + cursor: { + 'bulk-loaded cursor-insert calls': number; + 'create calls': number; + 'cursor-insert key and value bytes inserted': number; + 'cursor-remove key bytes removed': number; + 'cursor-update value bytes updated': number; + 'insert calls': number; + 'next calls': number; + 'prev calls': number; + 'remove calls': number; + 'reset calls': number; + 'restarted searches': number; + 'search calls': number; + 'search near calls': number; + 'truncate calls': number; + 'update calls': number; + }; + reconciliation: { + 'dictionary matches': number; + 'fast-path pages deleted': number; + 'internal page key bytes discarded using suffix compression': number; + 'internal page multi-block writes': number; + 'internal-page overflow keys': number; + 'leaf page key bytes discarded using prefix compression': number; + 'leaf page multi-block writes': number; + 'leaf-page overflow keys': number; + 'maximum blocks required for a page': number; + 'overflow values written': number; + 'page checksum matches': number; + 'page reconciliation calls': number; + 'page reconciliation calls for eviction': number; + 'pages deleted': number; + } & Document; +} + +defineAspects(CollStatsOperation, [Aspect.READ_OPERATION]); +defineAspects(DbStatsOperation, [Aspect.READ_OPERATION]); diff --git a/node_modules/mongodb/src/operations/update.ts b/node_modules/mongodb/src/operations/update.ts new file mode 100644 index 000000000..fa6997341 --- /dev/null +++ b/node_modules/mongodb/src/operations/update.ts @@ -0,0 +1,322 @@ +import { defineAspects, Aspect, Hint } from './operation'; +import { + hasAtomicOperators, + MongoDBNamespace, + Callback, + collationNotSupported, + maxWireVersion +} from '../utils'; +import { CommandOperation, CommandOperationOptions, CollationOptions } from './command'; +import type { Server } from '../sdam/server'; +import type { Collection } from '../collection'; +import type { ObjectId, Document } from '../bson'; +import type { ClientSession } from '../sessions'; +import { MongoServerError, MongoInvalidArgumentError, MongoCompatibilityError } from '../error'; + +/** @public */ +export interface UpdateOptions extends CommandOperationOptions { + /** A set of filters specifying to which array elements an update should apply */ + arrayFilters?: Document[]; + /** If true, allows the write to opt-out of document level validation */ + bypassDocumentValidation?: boolean; + /** Specifies a collation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** When true, creates a new document if no document matches the query */ + upsert?: boolean; + /** Map of parameter names and values that can be accessed using $$var (requires MongoDB 5.0). 
*/ + let?: Document; +} + +/** @public */ +export interface UpdateResult { + /** Indicates whether this write result was acknowledged. If not, then all other members of this result will be undefined */ + acknowledged: boolean; + /** The number of documents that matched the filter */ + matchedCount: number; + /** The number of documents that were modified */ + modifiedCount: number; + /** The number of documents that were upserted */ + upsertedCount: number; + /** The identifier of the inserted document if an upsert took place */ + upsertedId: ObjectId; +} + +/** @public */ +export interface UpdateStatement { + /** The query that matches documents to update. */ + q: Document; + /** The modifications to apply. */ + u: Document | Document[]; + /** If true, perform an insert if no documents match the query. */ + upsert?: boolean; + /** If true, updates all documents that meet the query criteria. */ + multi?: boolean; + /** Specifies the collation to use for the operation. */ + collation?: CollationOptions; + /** An array of filter documents that determines which array elements to modify for an update operation on an array field. */ + arrayFilters?: Document[]; + /** A document or string that specifies the index to use to support the query predicate. */ + hint?: Hint; +} + +/** @internal */ +export class UpdateOperation extends CommandOperation { + options: UpdateOptions & { ordered?: boolean }; + statements: UpdateStatement[]; + + constructor( + ns: MongoDBNamespace, + statements: UpdateStatement[], + options: UpdateOptions & { ordered?: boolean } + ) { + super(undefined, options); + this.options = options; + this.ns = ns; + + this.statements = statements; + } + + get canRetryWrite(): boolean { + if (super.canRetryWrite === false) { + return false; + } + + return this.statements.every(op => op.multi == null || op.multi === false); + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const options = this.options ?? {}; + const ordered = typeof options.ordered === 'boolean' ? 
options.ordered : true; + const command: Document = { + update: this.ns.collection, + updates: this.statements, + ordered + }; + + if (typeof options.bypassDocumentValidation === 'boolean') { + command.bypassDocumentValidation = options.bypassDocumentValidation; + } + + if (options.let) { + command.let = options.let; + } + + const statementWithCollation = this.statements.find(statement => !!statement.collation); + if ( + collationNotSupported(server, options) || + (statementWithCollation && collationNotSupported(server, statementWithCollation)) + ) { + callback(new MongoCompatibilityError(`Server ${server.name} does not support collation`)); + return; + } + + const unacknowledgedWrite = this.writeConcern && this.writeConcern.w === 0; + if (unacknowledgedWrite || maxWireVersion(server) < 5) { + if (this.statements.find((o: Document) => o.hint)) { + callback(new MongoCompatibilityError(`Servers < 3.4 do not support hint on update`)); + return; + } + } + + if (this.explain && maxWireVersion(server) < 3) { + callback( + new MongoCompatibilityError(`Server ${server.name} does not support explain on update`) + ); + return; + } + + if (this.statements.some(statement => !!statement.arrayFilters) && maxWireVersion(server) < 6) { + callback( + new MongoCompatibilityError('Option "arrayFilters" is only supported on MongoDB 3.6+') + ); + return; + } + + super.executeCommand(server, session, command, callback); + } +} + +/** @internal */ +export class UpdateOneOperation extends UpdateOperation { + constructor(collection: Collection, filter: Document, update: Document, options: UpdateOptions) { + super( + collection.s.namespace, + [makeUpdateStatement(filter, update, { ...options, multi: false })], + options + ); + + if (!hasAtomicOperators(update)) { + throw new MongoInvalidArgumentError('Update document requires atomic operators'); + } + } + + execute( + server: Server, + session: ClientSession, + callback: Callback + ): void { + super.execute(server, session, (err, res) => { + if (err || !res) return callback(err); + if (this.explain != null) return callback(undefined, res); + if (res.code) return callback(new MongoServerError(res)); + if (res.writeErrors) return callback(new MongoServerError(res.writeErrors[0])); + + callback(undefined, { + acknowledged: this.writeConcern?.w !== 0 ?? true, + modifiedCount: res.nModified != null ? res.nModified : res.n, + upsertedId: + Array.isArray(res.upserted) && res.upserted.length > 0 ? res.upserted[0]._id : null, + upsertedCount: Array.isArray(res.upserted) && res.upserted.length ? res.upserted.length : 0, + matchedCount: Array.isArray(res.upserted) && res.upserted.length > 0 ? 
0 : res.n + }); + }); + } +} + +/** @internal */ +export class UpdateManyOperation extends UpdateOperation { + constructor(collection: Collection, filter: Document, update: Document, options: UpdateOptions) { + super( + collection.s.namespace, + [makeUpdateStatement(filter, update, { ...options, multi: true })], + options + ); + + if (!hasAtomicOperators(update)) { + throw new MongoInvalidArgumentError('Update document requires atomic operators'); + } + } + + execute( + server: Server, + session: ClientSession, + callback: Callback + ): void { + super.execute(server, session, (err, res) => { + if (err || !res) return callback(err); + if (this.explain != null) return callback(undefined, res); + if (res.code) return callback(new MongoServerError(res)); + if (res.writeErrors) return callback(new MongoServerError(res.writeErrors[0])); + + callback(undefined, { + acknowledged: this.writeConcern?.w !== 0 ?? true, + modifiedCount: res.nModified != null ? res.nModified : res.n, + upsertedId: + Array.isArray(res.upserted) && res.upserted.length > 0 ? res.upserted[0]._id : null, + upsertedCount: Array.isArray(res.upserted) && res.upserted.length ? res.upserted.length : 0, + matchedCount: Array.isArray(res.upserted) && res.upserted.length > 0 ? 0 : res.n + }); + }); + } +} + +/** @public */ +export interface ReplaceOptions extends CommandOperationOptions { + /** If true, allows the write to opt-out of document level validation */ + bypassDocumentValidation?: boolean; + /** Specifies a collation */ + collation?: CollationOptions; + /** Specify that the update query should only consider plans using the hinted index */ + hint?: string | Document; + /** When true, creates a new document if no document matches the query */ + upsert?: boolean; +} + +/** @internal */ +export class ReplaceOneOperation extends UpdateOperation { + constructor( + collection: Collection, + filter: Document, + replacement: Document, + options: ReplaceOptions + ) { + super( + collection.s.namespace, + [makeUpdateStatement(filter, replacement, { ...options, multi: false })], + options + ); + + if (hasAtomicOperators(replacement)) { + throw new MongoInvalidArgumentError('Replacement document must not contain atomic operators'); + } + } + + execute( + server: Server, + session: ClientSession, + callback: Callback + ): void { + super.execute(server, session, (err, res) => { + if (err || !res) return callback(err); + if (this.explain != null) return callback(undefined, res); + if (res.code) return callback(new MongoServerError(res)); + if (res.writeErrors) return callback(new MongoServerError(res.writeErrors[0])); + + callback(undefined, { + acknowledged: this.writeConcern?.w !== 0 ?? true, + modifiedCount: res.nModified != null ? res.nModified : res.n, + upsertedId: + Array.isArray(res.upserted) && res.upserted.length > 0 ? res.upserted[0]._id : null, + upsertedCount: Array.isArray(res.upserted) && res.upserted.length ? res.upserted.length : 0, + matchedCount: Array.isArray(res.upserted) && res.upserted.length > 0 ? 
0 : res.n + }); + }); + } +} + +export function makeUpdateStatement( + filter: Document, + update: Document, + options: UpdateOptions & { multi?: boolean } +): UpdateStatement { + if (filter == null || typeof filter !== 'object') { + throw new MongoInvalidArgumentError('Selector must be a valid JavaScript object'); + } + + if (update == null || typeof update !== 'object') { + throw new MongoInvalidArgumentError('Document must be a valid JavaScript object'); + } + + const op: UpdateStatement = { q: filter, u: update }; + if (typeof options.upsert === 'boolean') { + op.upsert = options.upsert; + } + + if (options.multi) { + op.multi = options.multi; + } + + if (options.hint) { + op.hint = options.hint; + } + + if (options.arrayFilters) { + op.arrayFilters = options.arrayFilters; + } + + if (options.collation) { + op.collation = options.collation; + } + + return op; +} + +defineAspects(UpdateOperation, [Aspect.RETRYABLE, Aspect.WRITE_OPERATION, Aspect.SKIP_COLLATION]); +defineAspects(UpdateOneOperation, [ + Aspect.RETRYABLE, + Aspect.WRITE_OPERATION, + Aspect.EXPLAINABLE, + Aspect.SKIP_COLLATION +]); +defineAspects(UpdateManyOperation, [ + Aspect.WRITE_OPERATION, + Aspect.EXPLAINABLE, + Aspect.SKIP_COLLATION +]); +defineAspects(ReplaceOneOperation, [ + Aspect.RETRYABLE, + Aspect.WRITE_OPERATION, + Aspect.SKIP_COLLATION +]); diff --git a/node_modules/mongodb/src/operations/validate_collection.ts b/node_modules/mongodb/src/operations/validate_collection.ts new file mode 100644 index 000000000..ce3a681ee --- /dev/null +++ b/node_modules/mongodb/src/operations/validate_collection.ts @@ -0,0 +1,55 @@ +import { CommandOperation, CommandOperationOptions } from './command'; +import type { Callback } from '../utils'; +import type { Document } from '../bson'; +import type { Server } from '../sdam/server'; +import type { Admin } from '../admin'; +import type { ClientSession } from '../sessions'; +import { MongoRuntimeError } from '../error'; + +/** @public */ +export interface ValidateCollectionOptions extends CommandOperationOptions { + /** Validates a collection in the background, without interrupting read or write traffic (only in MongoDB 4.4+) */ + background?: boolean; +} + +/** @internal */ +export class ValidateCollectionOperation extends CommandOperation { + options: ValidateCollectionOptions; + collectionName: string; + command: Document; + + constructor(admin: Admin, collectionName: string, options: ValidateCollectionOptions) { + // Decorate command with extra options + const command: Document = { validate: collectionName }; + const keys = Object.keys(options); + for (let i = 0; i < keys.length; i++) { + if (Object.prototype.hasOwnProperty.call(options, keys[i]) && keys[i] !== 'session') { + command[keys[i]] = (options as Document)[keys[i]]; + } + } + + super(admin.s.db, options); + this.options = options; + this.command = command; + this.collectionName = collectionName; + } + + execute(server: Server, session: ClientSession, callback: Callback): void { + const collectionName = this.collectionName; + + super.executeCommand(server, session, this.command, (err, doc) => { + if (err != null) return callback(err); + + // TODO(NODE-3483): Replace these with MongoUnexpectedServerResponseError + if (doc.ok === 0) return callback(new MongoRuntimeError('Error with validate command')); + if (doc.result != null && typeof doc.result !== 'string') + return callback(new MongoRuntimeError('Error with validation data')); + if (doc.result != null && doc.result.match(/exception|corrupt/) != null) + return 
callback(new MongoRuntimeError(`Invalid collection ${collectionName}`)); + if (doc.valid != null && !doc.valid) + return callback(new MongoRuntimeError(`Invalid collection ${collectionName}`)); + + return callback(undefined, doc); + }); + } +} diff --git a/node_modules/mongodb/src/promise_provider.ts b/node_modules/mongodb/src/promise_provider.ts new file mode 100644 index 000000000..a6aeb548e --- /dev/null +++ b/node_modules/mongodb/src/promise_provider.ts @@ -0,0 +1,41 @@ +import { MongoInvalidArgumentError } from './error'; + +/** @internal */ +const kPromise = Symbol('promise'); + +interface PromiseStore { + [kPromise]?: PromiseConstructor; +} + +const store: PromiseStore = { + [kPromise]: undefined +}; + +/** + * Global promise store allowing user-provided promises + * @public + */ +export class PromiseProvider { + /** Validates the passed in promise library */ + static validate(lib: unknown): lib is PromiseConstructor { + if (typeof lib !== 'function') + throw new MongoInvalidArgumentError(`Promise must be a function, got ${lib}`); + return !!lib; + } + + /** Sets the promise library */ + static set(lib: PromiseConstructor): void { + if (!PromiseProvider.validate(lib)) { + // validate + return; + } + store[kPromise] = lib; + } + + /** Get the stored promise library, or resolves passed in */ + static get(): PromiseConstructor { + return store[kPromise] as PromiseConstructor; + } +} + +PromiseProvider.set(global.Promise); diff --git a/node_modules/mongodb/src/read_concern.ts b/node_modules/mongodb/src/read_concern.ts new file mode 100644 index 000000000..195d6c866 --- /dev/null +++ b/node_modules/mongodb/src/read_concern.ts @@ -0,0 +1,87 @@ +import type { Document } from './bson'; + +/** @public */ +export const ReadConcernLevel = Object.freeze({ + local: 'local', + majority: 'majority', + linearizable: 'linearizable', + available: 'available', + snapshot: 'snapshot' +} as const); + +/** @public */ +export type ReadConcernLevel = typeof ReadConcernLevel[keyof typeof ReadConcernLevel]; + +/** @public */ +export type ReadConcernLike = ReadConcern | { level: ReadConcernLevel } | ReadConcernLevel; + +/** + * The MongoDB ReadConcern, which allows for control of the consistency and isolation properties + * of the data read from replica sets and replica set shards. + * @public + * + * @see https://docs.mongodb.com/manual/reference/read-concern/index.html + */ +export class ReadConcern { + level: ReadConcernLevel | string; + + /** Constructs a ReadConcern from the read concern level.*/ + constructor(level: ReadConcernLevel) { + /** + * A spec test exists that allows level to be any string. + * "invalid readConcern with out stage" + * @see ./test/spec/crud/v2/aggregate-out-readConcern.json + * @see https://github.com/mongodb/specifications/blob/master/source/read-write-concern/read-write-concern.rst#unknown-levels-and-additional-options-for-string-based-readconcerns + */ + this.level = ReadConcernLevel[level] ?? level; + } + + /** + * Construct a ReadConcern given an options object. + * + * @param options - The options object from which to extract the write concern. 
+ */ + static fromOptions(options?: { + readConcern?: ReadConcernLike; + level?: ReadConcernLevel; + }): ReadConcern | undefined { + if (options == null) { + return; + } + + if (options.readConcern) { + const { readConcern } = options; + if (readConcern instanceof ReadConcern) { + return readConcern; + } else if (typeof readConcern === 'string') { + return new ReadConcern(readConcern); + } else if ('level' in readConcern && readConcern.level) { + return new ReadConcern(readConcern.level); + } + } + + if (options.level) { + return new ReadConcern(options.level); + } + } + + static get MAJORITY(): 'majority' { + return ReadConcernLevel.majority; + } + + static get AVAILABLE(): 'available' { + return ReadConcernLevel.available; + } + + static get LINEARIZABLE(): 'linearizable' { + return ReadConcernLevel.linearizable; + } + + static get SNAPSHOT(): 'snapshot' { + return ReadConcernLevel.snapshot; + } + + toJSON(): Document { + return { level: this.level }; + } +} diff --git a/node_modules/mongodb/src/read_preference.ts b/node_modules/mongodb/src/read_preference.ts new file mode 100644 index 000000000..d0ffd6e3c --- /dev/null +++ b/node_modules/mongodb/src/read_preference.ts @@ -0,0 +1,264 @@ +import type { TagSet } from './sdam/server_description'; +import type { Document } from './bson'; +import type { ClientSession } from './sessions'; +import { MongoInvalidArgumentError } from './error'; + +/** @public */ +export type ReadPreferenceLike = ReadPreference | ReadPreferenceMode; + +/** @public */ +export const ReadPreferenceMode = Object.freeze({ + primary: 'primary', + primaryPreferred: 'primaryPreferred', + secondary: 'secondary', + secondaryPreferred: 'secondaryPreferred', + nearest: 'nearest' +} as const); + +/** @public */ +export type ReadPreferenceMode = typeof ReadPreferenceMode[keyof typeof ReadPreferenceMode]; + +/** @public */ +export interface HedgeOptions { + /** Explicitly enable or disable hedged reads. */ + enabled?: boolean; +} + +/** @public */ +export interface ReadPreferenceOptions { + /** Max secondary read staleness in seconds, Minimum value is 90 seconds.*/ + maxStalenessSeconds?: number; + /** Server mode in which the same query is dispatched in parallel to multiple replica set members. */ + hedge?: HedgeOptions; +} + +/** @public */ +export interface ReadPreferenceLikeOptions extends ReadPreferenceOptions { + readPreference?: + | ReadPreferenceLike + | { + mode?: ReadPreferenceMode; + preference?: ReadPreferenceMode; + tags?: TagSet[]; + maxStalenessSeconds?: number; + }; +} + +/** @public */ +export interface ReadPreferenceFromOptions extends ReadPreferenceLikeOptions { + session?: ClientSession; + readPreferenceTags?: TagSet[]; + hedge?: HedgeOptions; +} + +/** + * The **ReadPreference** class is a class that represents a MongoDB ReadPreference and is + * used to construct connections. 
+ * @public + * + * @see https://docs.mongodb.com/manual/core/read-preference/ + */ +export class ReadPreference { + mode: ReadPreferenceMode; + tags?: TagSet[]; + hedge?: HedgeOptions; + maxStalenessSeconds?: number; + minWireVersion?: number; + + public static PRIMARY = ReadPreferenceMode.primary; + public static PRIMARY_PREFERRED = ReadPreferenceMode.primaryPreferred; + public static SECONDARY = ReadPreferenceMode.secondary; + public static SECONDARY_PREFERRED = ReadPreferenceMode.secondaryPreferred; + public static NEAREST = ReadPreferenceMode.nearest; + + public static primary = new ReadPreference(ReadPreferenceMode.primary); + public static primaryPreferred = new ReadPreference(ReadPreferenceMode.primaryPreferred); + public static secondary = new ReadPreference(ReadPreferenceMode.secondary); + public static secondaryPreferred = new ReadPreference(ReadPreferenceMode.secondaryPreferred); + public static nearest = new ReadPreference(ReadPreferenceMode.nearest); + + /** + * @param mode - A string describing the read preference mode (primary|primaryPreferred|secondary|secondaryPreferred|nearest) + * @param tags - A tag set used to target reads to members with the specified tag(s). tagSet is not available if using read preference mode primary. + * @param options - Additional read preference options + */ + constructor(mode: ReadPreferenceMode, tags?: TagSet[], options?: ReadPreferenceOptions) { + if (!ReadPreference.isValid(mode)) { + throw new MongoInvalidArgumentError(`Invalid read preference mode ${JSON.stringify(mode)}`); + } + if (options == null && typeof tags === 'object' && !Array.isArray(tags)) { + options = tags; + tags = undefined; + } else if (tags && !Array.isArray(tags)) { + throw new MongoInvalidArgumentError('ReadPreference tags must be an array'); + } + + this.mode = mode; + this.tags = tags; + this.hedge = options?.hedge; + this.maxStalenessSeconds = undefined; + this.minWireVersion = undefined; + + options = options ?? {}; + if (options.maxStalenessSeconds != null) { + if (options.maxStalenessSeconds <= 0) { + throw new MongoInvalidArgumentError('maxStalenessSeconds must be a positive integer'); + } + + this.maxStalenessSeconds = options.maxStalenessSeconds; + + // NOTE: The minimum required wire version is 5 for this read preference. If the existing + // topology has a lower value then a MongoError will be thrown during server selection. + this.minWireVersion = 5; + } + + if (this.mode === ReadPreference.PRIMARY) { + if (this.tags && Array.isArray(this.tags) && this.tags.length > 0) { + throw new MongoInvalidArgumentError('Primary read preference cannot be combined with tags'); + } + + if (this.maxStalenessSeconds) { + throw new MongoInvalidArgumentError( + 'Primary read preference cannot be combined with maxStalenessSeconds' + ); + } + + if (this.hedge) { + throw new MongoInvalidArgumentError( + 'Primary read preference cannot be combined with hedge' + ); + } + } + } + + // Support the deprecated `preference` property introduced in the porcelain layer + get preference(): ReadPreferenceMode { + return this.mode; + } + + static fromString(mode: string): ReadPreference { + return new ReadPreference(mode as ReadPreferenceMode); + } + + /** + * Construct a ReadPreference given an options object. + * + * @param options - The options object from which to extract the read preference. + */ + static fromOptions(options?: ReadPreferenceFromOptions): ReadPreference | undefined { + if (!options) return; + const readPreference = + options.readPreference ?? 
options.session?.transaction.options.readPreference; + const readPreferenceTags = options.readPreferenceTags; + + if (readPreference == null) { + return; + } + + if (typeof readPreference === 'string') { + return new ReadPreference(readPreference as ReadPreferenceMode, readPreferenceTags); + } else if (!(readPreference instanceof ReadPreference) && typeof readPreference === 'object') { + const mode = readPreference.mode || readPreference.preference; + if (mode && typeof mode === 'string') { + return new ReadPreference( + mode as ReadPreferenceMode, + readPreference.tags ?? readPreferenceTags, + { + maxStalenessSeconds: readPreference.maxStalenessSeconds, + hedge: options.hedge + } + ); + } + } + + if (readPreferenceTags) { + readPreference.tags = readPreferenceTags; + } + + return readPreference as ReadPreference; + } + + /** + * Replaces options.readPreference with a ReadPreference instance + */ + static translate(options: ReadPreferenceLikeOptions): ReadPreferenceLikeOptions { + if (options.readPreference == null) return options; + const r = options.readPreference; + + if (typeof r === 'string') { + options.readPreference = new ReadPreference(r as ReadPreferenceMode); + } else if (r && !(r instanceof ReadPreference) && typeof r === 'object') { + const mode = r.mode || r.preference; + if (mode && typeof mode === 'string') { + options.readPreference = new ReadPreference(mode as ReadPreferenceMode, r.tags, { + maxStalenessSeconds: r.maxStalenessSeconds + }); + } + } else if (!(r instanceof ReadPreference)) { + throw new MongoInvalidArgumentError(`Invalid read preference: ${r}`); + } + + return options; + } + + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. + */ + static isValid(mode: string): boolean { + const VALID_MODES = new Set([ + ReadPreference.PRIMARY, + ReadPreference.PRIMARY_PREFERRED, + ReadPreference.SECONDARY, + ReadPreference.SECONDARY_PREFERRED, + ReadPreference.NEAREST, + null + ]); + + return VALID_MODES.has(mode as ReadPreferenceMode); + } + + /** + * Validate if a mode is legal + * + * @param mode - The string representing the read preference mode. + */ + isValid(mode?: string): boolean { + return ReadPreference.isValid(typeof mode === 'string' ? 
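+ // validate the explicitly supplied mode when one is given, otherwise validate this instance's own mode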
mode : this.mode); + } + + /** + * Indicates that this readPreference needs the "slaveOk" bit when sent over the wire + * + * @see https://docs.mongodb.com/manual/reference/mongodb-wire-protocol/#op-query + */ + slaveOk(): boolean { + const NEEDS_SLAVEOK = new Set([ + ReadPreference.PRIMARY_PREFERRED, + ReadPreference.SECONDARY, + ReadPreference.SECONDARY_PREFERRED, + ReadPreference.NEAREST + ]); + + return NEEDS_SLAVEOK.has(this.mode); + } + + /** + * Check if the two ReadPreferences are equivalent + * + * @param readPreference - The read preference with which to check equality + */ + equals(readPreference: ReadPreference): boolean { + return readPreference.mode === this.mode; + } + + /** Return JSON representation */ + toJSON(): Document { + const readPreference = { mode: this.mode } as Document; + if (Array.isArray(this.tags)) readPreference.tags = this.tags; + if (this.maxStalenessSeconds) readPreference.maxStalenessSeconds = this.maxStalenessSeconds; + if (this.hedge) readPreference.hedge = this.hedge; + return readPreference; + } +} diff --git a/node_modules/mongodb/src/sdam/common.ts b/node_modules/mongodb/src/sdam/common.ts new file mode 100644 index 000000000..673573b25 --- /dev/null +++ b/node_modules/mongodb/src/sdam/common.ts @@ -0,0 +1,83 @@ +import type { Timestamp, Binary, Long } from '../bson'; +import type { Topology } from './topology'; +import type { ClientSession } from '../sessions'; + +// shared state names +export const STATE_CLOSING = 'closing'; +export const STATE_CLOSED = 'closed'; +export const STATE_CONNECTING = 'connecting'; +export const STATE_CONNECTED = 'connected'; + +/** + * An enumeration of topology types we know about + * @public + */ +export const TopologyType = Object.freeze({ + Single: 'Single', + ReplicaSetNoPrimary: 'ReplicaSetNoPrimary', + ReplicaSetWithPrimary: 'ReplicaSetWithPrimary', + Sharded: 'Sharded', + Unknown: 'Unknown', + LoadBalanced: 'LoadBalanced' +} as const); + +/** @public */ +export type TopologyType = typeof TopologyType[keyof typeof TopologyType]; + +/** + * An enumeration of server types we know about + * @public + */ +export const ServerType = Object.freeze({ + Standalone: 'Standalone', + Mongos: 'Mongos', + PossiblePrimary: 'PossiblePrimary', + RSPrimary: 'RSPrimary', + RSSecondary: 'RSSecondary', + RSArbiter: 'RSArbiter', + RSOther: 'RSOther', + RSGhost: 'RSGhost', + Unknown: 'Unknown', + LoadBalancer: 'LoadBalancer' +} as const); + +/** @public */ +export type ServerType = typeof ServerType[keyof typeof ServerType]; + +/** @internal */ +export type TimerQueue = Set; + +/** @internal */ +export function drainTimerQueue(queue: TimerQueue): void { + queue.forEach(clearTimeout); + queue.clear(); +} + +/** @internal */ +export function clearAndRemoveTimerFrom(timer: NodeJS.Timeout, timers: TimerQueue): boolean { + clearTimeout(timer); + return timers.delete(timer); +} + +/** @public */ +export interface ClusterTime { + clusterTime: Timestamp; + signature: { + hash: Binary; + keyId: Long; + }; +} + +/** Shared function to determine clusterTime for a given topology or session */ +export function _advanceClusterTime( + entity: Topology | ClientSession, + $clusterTime: ClusterTime +): void { + if (entity.clusterTime == null) { + entity.clusterTime = $clusterTime; + } else { + if ($clusterTime.clusterTime.greaterThan(entity.clusterTime.clusterTime)) { + entity.clusterTime = $clusterTime; + } + } +} diff --git a/node_modules/mongodb/src/sdam/events.ts b/node_modules/mongodb/src/sdam/events.ts new file mode 100644 index 
000000000..ff8c4600a --- /dev/null +++ b/node_modules/mongodb/src/sdam/events.ts @@ -0,0 +1,182 @@ +import type { ServerDescription } from './server_description'; +import type { TopologyDescription } from './topology_description'; +import type { Document } from '../bson'; + +/** + * Emitted when server description changes, but does NOT include changes to the RTT. + * @public + * @category Event + */ +export class ServerDescriptionChangedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + /** The previous server description */ + previousDescription: ServerDescription; + /** The new server description */ + newDescription: ServerDescription; + + /** @internal */ + constructor( + topologyId: number, + address: string, + previousDescription: ServerDescription, + newDescription: ServerDescription + ) { + this.topologyId = topologyId; + this.address = address; + this.previousDescription = previousDescription; + this.newDescription = newDescription; + } +} + +/** + * Emitted when server is initialized. + * @public + * @category Event + */ +export class ServerOpeningEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + + /** @internal */ + constructor(topologyId: number, address: string) { + this.topologyId = topologyId; + this.address = address; + } +} + +/** + * Emitted when server is closed. + * @public + * @category Event + */ +export class ServerClosedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The address (host/port pair) of the server */ + address: string; + + /** @internal */ + constructor(topologyId: number, address: string) { + this.topologyId = topologyId; + this.address = address; + } +} + +/** + * Emitted when topology description changes. + * @public + * @category Event + */ +export class TopologyDescriptionChangedEvent { + /** A unique identifier for the topology */ + topologyId: number; + /** The old topology description */ + previousDescription: TopologyDescription; + /** The new topology description */ + newDescription: TopologyDescription; + + /** @internal */ + constructor( + topologyId: number, + previousDescription: TopologyDescription, + newDescription: TopologyDescription + ) { + this.topologyId = topologyId; + this.previousDescription = previousDescription; + this.newDescription = newDescription; + } +} + +/** + * Emitted when topology is initialized. + * @public + * @category Event + */ +export class TopologyOpeningEvent { + /** A unique identifier for the topology */ + topologyId: number; + + /** @internal */ + constructor(topologyId: number) { + this.topologyId = topologyId; + } +} + +/** + * Emitted when topology is closed. + * @public + * @category Event + */ +export class TopologyClosedEvent { + /** A unique identifier for the topology */ + topologyId: number; + + /** @internal */ + constructor(topologyId: number) { + this.topologyId = topologyId; + } +} + +/** + * Emitted when the server monitor’s ismaster command is started - immediately before + * the ismaster command is serialized into raw BSON and written to the socket. + * + * @public + * @category Event + */ +export class ServerHeartbeatStartedEvent { + /** The connection id for the command */ + connectionId: string; + + /** @internal */ + constructor(connectionId: string) { + this.connectionId = connectionId; + } +} + +/** + * Emitted when the server monitor’s ismaster succeeds. 
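+ * The `reply` field carries the raw `hello`/`ismaster` response received by the monitor
+ * (see `checkServer` in `sdam/monitor.ts` later in this diff).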
+ * @public + * @category Event + */ +export class ServerHeartbeatSucceededEvent { + /** The connection id for the command */ + connectionId: string; + /** The execution time of the event in ms */ + duration: number; + /** The command reply */ + reply: Document; + + /** @internal */ + constructor(connectionId: string, duration: number, reply: Document) { + this.connectionId = connectionId; + this.duration = duration; + this.reply = reply; + } +} + +/** + * Emitted when the server monitor’s ismaster fails, either with an “ok: 0” or a socket exception. + * @public + * @category Event + */ +export class ServerHeartbeatFailedEvent { + /** The connection id for the command */ + connectionId: string; + /** The execution time of the event in ms */ + duration: number; + /** The command failure */ + failure: Error; + + /** @internal */ + constructor(connectionId: string, duration: number, failure: Error) { + this.connectionId = connectionId; + this.duration = duration; + this.failure = failure; + } +} diff --git a/node_modules/mongodb/src/sdam/monitor.ts b/node_modules/mongodb/src/sdam/monitor.ts new file mode 100644 index 000000000..aad980713 --- /dev/null +++ b/node_modules/mongodb/src/sdam/monitor.ts @@ -0,0 +1,465 @@ +import { ServerType, STATE_CLOSED, STATE_CLOSING } from './common'; +import { + now, + makeStateMachine, + calculateDurationInMs, + makeInterruptibleAsyncInterval, + ns, + EventEmitterWithState +} from '../utils'; +import { connect } from '../cmap/connect'; +import { Connection, ConnectionOptions } from '../cmap/connection'; +import { MongoNetworkError, AnyError } from '../error'; +import { Long, Document } from '../bson'; +import { + ServerHeartbeatStartedEvent, + ServerHeartbeatSucceededEvent, + ServerHeartbeatFailedEvent +} from './events'; + +import { Server } from './server'; +import type { InterruptibleAsyncInterval, Callback } from '../utils'; +import type { TopologyVersion } from './server_description'; +import { CancellationToken, TypedEventEmitter } from '../mongo_types'; + +/** @internal */ +const kServer = Symbol('server'); +/** @internal */ +const kMonitorId = Symbol('monitorId'); +/** @internal */ +const kConnection = Symbol('connection'); +/** @internal */ +const kCancellationToken = Symbol('cancellationToken'); +/** @internal */ +const kRTTPinger = Symbol('rttPinger'); +/** @internal */ +const kRoundTripTime = Symbol('roundTripTime'); + +const STATE_IDLE = 'idle'; +const STATE_MONITORING = 'monitoring'; +const stateTransition = makeStateMachine({ + [STATE_CLOSING]: [STATE_CLOSING, STATE_IDLE, STATE_CLOSED], + [STATE_CLOSED]: [STATE_CLOSED, STATE_MONITORING], + [STATE_IDLE]: [STATE_IDLE, STATE_MONITORING, STATE_CLOSING], + [STATE_MONITORING]: [STATE_MONITORING, STATE_IDLE, STATE_CLOSING] +}); + +const INVALID_REQUEST_CHECK_STATES = new Set([STATE_CLOSING, STATE_CLOSED, STATE_MONITORING]); +function isInCloseState(monitor: Monitor) { + return monitor.s.state === STATE_CLOSED || monitor.s.state === STATE_CLOSING; +} + +/** @internal */ +export interface MonitorPrivate { + state: string; +} + +/** @public */ +export interface MonitorOptions + extends Omit { + connectTimeoutMS: number; + heartbeatFrequencyMS: number; + minHeartbeatFrequencyMS: number; +} + +/** @public */ +export type MonitorEvents = { + serverHeartbeatStarted(event: ServerHeartbeatStartedEvent): void; + serverHeartbeatSucceeded(event: ServerHeartbeatSucceededEvent): void; + serverHeartbeatFailed(event: ServerHeartbeatFailedEvent): void; + resetServer(error?: Error): void; + resetConnectionPool(): void; 
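+ /** Emitted when the monitor is closed (see `Monitor.close` below). */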
+ close(): void; +} & EventEmitterWithState; + +/** @internal */ +export class Monitor extends TypedEventEmitter { + /** @internal */ + s: MonitorPrivate; + address: string; + options: Readonly< + Pick + >; + connectOptions: ConnectionOptions; + [kServer]: Server; + [kConnection]?: Connection; + [kCancellationToken]: CancellationToken; + /** @internal */ + [kMonitorId]?: InterruptibleAsyncInterval; + [kRTTPinger]?: RTTPinger; + + constructor(server: Server, options: MonitorOptions) { + super(); + + this[kServer] = server; + this[kConnection] = undefined; + this[kCancellationToken] = new CancellationToken(); + this[kCancellationToken].setMaxListeners(Infinity); + this[kMonitorId] = undefined; + this.s = { + state: STATE_CLOSED + }; + + this.address = server.description.address; + this.options = Object.freeze({ + connectTimeoutMS: options.connectTimeoutMS ?? 10000, + heartbeatFrequencyMS: options.heartbeatFrequencyMS ?? 10000, + minHeartbeatFrequencyMS: options.minHeartbeatFrequencyMS ?? 500 + }); + + const cancellationToken = this[kCancellationToken]; + // TODO: refactor this to pull it directly from the pool, requires new ConnectionPool integration + const connectOptions = Object.assign( + { + id: '' as const, + generation: server.s.pool.generation, + connectionType: Connection, + cancellationToken, + hostAddress: server.description.hostAddress + }, + options, + // force BSON serialization options + { + raw: false, + promoteLongs: true, + promoteValues: true, + promoteBuffers: true + } + ); + + // ensure no authentication is used for monitoring + delete connectOptions.credentials; + if (connectOptions.autoEncrypter) { + delete connectOptions.autoEncrypter; + } + + this.connectOptions = Object.freeze(connectOptions); + } + + connect(): void { + if (this.s.state !== STATE_CLOSED) { + return; + } + + // start + const heartbeatFrequencyMS = this.options.heartbeatFrequencyMS; + const minHeartbeatFrequencyMS = this.options.minHeartbeatFrequencyMS; + this[kMonitorId] = makeInterruptibleAsyncInterval(monitorServer(this), { + interval: heartbeatFrequencyMS, + minInterval: minHeartbeatFrequencyMS, + immediate: true + }); + } + + requestCheck(): void { + if (INVALID_REQUEST_CHECK_STATES.has(this.s.state)) { + return; + } + + this[kMonitorId]?.wake(); + } + + reset(): void { + const topologyVersion = this[kServer].description.topologyVersion; + if (isInCloseState(this) || topologyVersion == null) { + return; + } + + stateTransition(this, STATE_CLOSING); + resetMonitorState(this); + + // restart monitor + stateTransition(this, STATE_IDLE); + + // restart monitoring + const heartbeatFrequencyMS = this.options.heartbeatFrequencyMS; + const minHeartbeatFrequencyMS = this.options.minHeartbeatFrequencyMS; + this[kMonitorId] = makeInterruptibleAsyncInterval(monitorServer(this), { + interval: heartbeatFrequencyMS, + minInterval: minHeartbeatFrequencyMS + }); + } + + close(): void { + if (isInCloseState(this)) { + return; + } + + stateTransition(this, STATE_CLOSING); + resetMonitorState(this); + + // close monitor + this.emit('close'); + stateTransition(this, STATE_CLOSED); + } +} + +function resetMonitorState(monitor: Monitor) { + monitor[kMonitorId]?.stop(); + monitor[kMonitorId] = undefined; + + monitor[kRTTPinger]?.close(); + monitor[kRTTPinger] = undefined; + + monitor[kCancellationToken].emit('cancel'); + + monitor[kConnection]?.destroy({ force: true }); + monitor[kConnection] = undefined; +} + +function checkServer(monitor: Monitor, callback: Callback) { + let start = now(); + 
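+ // the started event is emitted before the check runs; `start` is used to measure the duration reported on success or failure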
monitor.emit(Server.SERVER_HEARTBEAT_STARTED, new ServerHeartbeatStartedEvent(monitor.address)); + + function failureHandler(err: AnyError) { + monitor[kConnection]?.destroy({ force: true }); + monitor[kConnection] = undefined; + + monitor.emit( + Server.SERVER_HEARTBEAT_FAILED, + new ServerHeartbeatFailedEvent(monitor.address, calculateDurationInMs(start), err) + ); + + monitor.emit('resetServer', err); + monitor.emit('resetConnectionPool'); + callback(err); + } + + const connection = monitor[kConnection]; + if (connection && !connection.closed) { + const { serverApi, helloOk } = connection; + const connectTimeoutMS = monitor.options.connectTimeoutMS; + const maxAwaitTimeMS = monitor.options.heartbeatFrequencyMS; + const topologyVersion = monitor[kServer].description.topologyVersion; + const isAwaitable = topologyVersion != null; + + const cmd = { + [serverApi?.version || helloOk ? 'hello' : 'ismaster']: true, + ...(isAwaitable && topologyVersion + ? { maxAwaitTimeMS, topologyVersion: makeTopologyVersion(topologyVersion) } + : {}) + }; + + const options = isAwaitable + ? { + socketTimeoutMS: connectTimeoutMS ? connectTimeoutMS + maxAwaitTimeMS : 0, + exhaustAllowed: true + } + : { socketTimeoutMS: connectTimeoutMS }; + + if (isAwaitable && monitor[kRTTPinger] == null) { + monitor[kRTTPinger] = new RTTPinger( + monitor[kCancellationToken], + Object.assign( + { heartbeatFrequencyMS: monitor.options.heartbeatFrequencyMS }, + monitor.connectOptions + ) + ); + } + + connection.command(ns('admin.$cmd'), cmd, options, (err, isMaster) => { + if (err) { + failureHandler(err); + return; + } + if ('isWritablePrimary' in isMaster) { + // Provide pre-hello-style response document. + isMaster.ismaster = isMaster.isWritablePrimary; + } + + const rttPinger = monitor[kRTTPinger]; + const duration = + isAwaitable && rttPinger ? 
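+ // when using the streaming (awaitable) protocol the RTT pinger supplies the round trip time,
+ // since the duration of an awaited hello response is not a meaningful RTT measurement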
rttPinger.roundTripTime : calculateDurationInMs(start); + + monitor.emit( + Server.SERVER_HEARTBEAT_SUCCEEDED, + new ServerHeartbeatSucceededEvent(monitor.address, duration, isMaster) + ); + + // if we are using the streaming protocol then we immediately issue another `started` + // event, otherwise the "check" is complete and return to the main monitor loop + if (isAwaitable && isMaster.topologyVersion) { + monitor.emit( + Server.SERVER_HEARTBEAT_STARTED, + new ServerHeartbeatStartedEvent(monitor.address) + ); + start = now(); + } else { + monitor[kRTTPinger]?.close(); + monitor[kRTTPinger] = undefined; + + callback(undefined, isMaster); + } + }); + + return; + } + + // connecting does an implicit `ismaster` + connect(monitor.connectOptions, (err, conn) => { + if (err) { + monitor[kConnection] = undefined; + + // we already reset the connection pool on network errors in all cases + if (!(err instanceof MongoNetworkError)) { + monitor.emit('resetConnectionPool'); + } + + failureHandler(err); + return; + } + + if (conn) { + if (isInCloseState(monitor)) { + conn.destroy({ force: true }); + return; + } + + monitor[kConnection] = conn; + monitor.emit( + Server.SERVER_HEARTBEAT_SUCCEEDED, + new ServerHeartbeatSucceededEvent( + monitor.address, + calculateDurationInMs(start), + conn.ismaster + ) + ); + + callback(undefined, conn.ismaster); + } + }); +} + +function monitorServer(monitor: Monitor) { + return (callback: Callback) => { + stateTransition(monitor, STATE_MONITORING); + function done() { + if (!isInCloseState(monitor)) { + stateTransition(monitor, STATE_IDLE); + } + + callback(); + } + + checkServer(monitor, (err, isMaster) => { + if (err) { + // otherwise an error occurred on initial discovery, also bail + if (monitor[kServer].description.type === ServerType.Unknown) { + monitor.emit('resetServer', err); + return done(); + } + } + + // if the check indicates streaming is supported, immediately reschedule monitoring + if (isMaster && isMaster.topologyVersion) { + setTimeout(() => { + if (!isInCloseState(monitor)) { + monitor[kMonitorId]?.wake(); + } + }, 0); + } + + done(); + }); + }; +} + +function makeTopologyVersion(tv: TopologyVersion) { + return { + processId: tv.processId, + // tests mock counter as just number, but in a real situation counter should always be a Long + counter: Long.isLong(tv.counter) ? 
tv.counter : Long.fromNumber(tv.counter) + }; +} + +/** @internal */ +export interface RTTPingerOptions extends ConnectionOptions { + heartbeatFrequencyMS: number; +} + +/** @internal */ +export class RTTPinger { + /** @internal */ + [kConnection]?: Connection; + /** @internal */ + [kCancellationToken]: CancellationToken; + /** @internal */ + [kRoundTripTime]: number; + /** @internal */ + [kMonitorId]: NodeJS.Timeout; + closed: boolean; + + constructor(cancellationToken: CancellationToken, options: RTTPingerOptions) { + this[kConnection] = undefined; + this[kCancellationToken] = cancellationToken; + this[kRoundTripTime] = 0; + this.closed = false; + + const heartbeatFrequencyMS = options.heartbeatFrequencyMS; + this[kMonitorId] = setTimeout(() => measureRoundTripTime(this, options), heartbeatFrequencyMS); + } + + get roundTripTime(): number { + return this[kRoundTripTime]; + } + + close(): void { + this.closed = true; + clearTimeout(this[kMonitorId]); + + this[kConnection]?.destroy({ force: true }); + this[kConnection] = undefined; + } +} + +function measureRoundTripTime(rttPinger: RTTPinger, options: RTTPingerOptions) { + const start = now(); + options.cancellationToken = rttPinger[kCancellationToken]; + const heartbeatFrequencyMS = options.heartbeatFrequencyMS; + + if (rttPinger.closed) { + return; + } + + function measureAndReschedule(conn?: Connection) { + if (rttPinger.closed) { + conn?.destroy({ force: true }); + return; + } + + if (rttPinger[kConnection] == null) { + rttPinger[kConnection] = conn; + } + + rttPinger[kRoundTripTime] = calculateDurationInMs(start); + rttPinger[kMonitorId] = setTimeout( + () => measureRoundTripTime(rttPinger, options), + heartbeatFrequencyMS + ); + } + + const connection = rttPinger[kConnection]; + if (connection == null) { + connect(options, (err, conn) => { + if (err) { + rttPinger[kConnection] = undefined; + rttPinger[kRoundTripTime] = 0; + return; + } + + measureAndReschedule(conn); + }); + + return; + } + + connection.command(ns('admin.$cmd'), { ismaster: 1 }, undefined, err => { + if (err) { + rttPinger[kConnection] = undefined; + rttPinger[kRoundTripTime] = 0; + return; + } + + measureAndReschedule(); + }); +} diff --git a/node_modules/mongodb/src/sdam/server.ts b/node_modules/mongodb/src/sdam/server.ts new file mode 100644 index 000000000..e3f1fb409 --- /dev/null +++ b/node_modules/mongodb/src/sdam/server.ts @@ -0,0 +1,602 @@ +import { Logger } from '../logger'; +import { + ConnectionPool, + ConnectionPoolOptions, + CMAP_EVENTS, + ConnectionPoolEvents +} from '../cmap/connection_pool'; +import { ServerDescription, compareTopologyVersion } from './server_description'; +import { Monitor, MonitorOptions } from './monitor'; +import { isTransactionCommand } from '../transactions'; +import { + collationNotSupported, + makeStateMachine, + maxWireVersion, + Callback, + CallbackWithType, + MongoDBNamespace, + EventEmitterWithState +} from '../utils'; +import { + STATE_CLOSED, + STATE_CLOSING, + STATE_CONNECTING, + STATE_CONNECTED, + ClusterTime, + TopologyType +} from './common'; +import { + MongoError, + MongoNetworkError, + MongoNetworkTimeoutError, + isSDAMUnrecoverableError, + isRetryableWriteError, + isNodeShuttingDownError, + isNetworkErrorBeforeHandshake, + MongoCompatibilityError, + MongoInvalidArgumentError, + MongoServerClosedError +} from '../error'; +import { + Connection, + DestroyOptions, + QueryOptions, + GetMoreOptions, + CommandOptions, + APM_EVENTS +} from '../cmap/connection'; +import type { Topology } from './topology'; +import type { 
+ ServerHeartbeatFailedEvent, + ServerHeartbeatStartedEvent, + ServerHeartbeatSucceededEvent +} from './events'; +import type { ClientSession } from '../sessions'; +import type { Document, Long } from '../bson'; +import type { AutoEncrypter } from '../deps'; +import type { ServerApi } from '../mongo_client'; +import { TypedEventEmitter } from '../mongo_types'; +import { supportsRetryableWrites } from '../utils'; + +const stateTransition = makeStateMachine({ + [STATE_CLOSED]: [STATE_CLOSED, STATE_CONNECTING], + [STATE_CONNECTING]: [STATE_CONNECTING, STATE_CLOSING, STATE_CONNECTED, STATE_CLOSED], + [STATE_CONNECTED]: [STATE_CONNECTED, STATE_CLOSING, STATE_CLOSED], + [STATE_CLOSING]: [STATE_CLOSING, STATE_CLOSED] +}); + +/** @internal */ +const kMonitor = Symbol('monitor'); + +/** @public */ +export type ServerOptions = Omit & + MonitorOptions; + +/** @internal */ +export interface ServerPrivate { + /** The server description for this server */ + description: ServerDescription; + /** A copy of the options used to construct this instance */ + options: ServerOptions; + /** A logger instance */ + logger: Logger; + /** The current state of the Server */ + state: string; + /** The topology this server is a part of */ + topology: Topology; + /** A connection pool for this server */ + pool: ConnectionPool; + /** MongoDB server API version */ + serverApi?: ServerApi; +} + +/** @public */ +export type ServerEvents = { + serverHeartbeatStarted(event: ServerHeartbeatStartedEvent): void; + serverHeartbeatSucceeded(event: ServerHeartbeatSucceededEvent): void; + serverHeartbeatFailed(event: ServerHeartbeatFailedEvent): void; + /** Top level MongoClient doesn't emit this so it is marked: @internal */ + connect(server: Server): void; + descriptionReceived(description: ServerDescription): void; + closed(): void; + ended(): void; +} & ConnectionPoolEvents & + EventEmitterWithState; + +/** @internal */ +export class Server extends TypedEventEmitter { + /** @internal */ + s: ServerPrivate; + serverApi?: ServerApi; + clusterTime?: ClusterTime; + ismaster?: Document; + [kMonitor]: Monitor; + + /** @event */ + static readonly SERVER_HEARTBEAT_STARTED = 'serverHeartbeatStarted' as const; + /** @event */ + static readonly SERVER_HEARTBEAT_SUCCEEDED = 'serverHeartbeatSucceeded' as const; + /** @event */ + static readonly SERVER_HEARTBEAT_FAILED = 'serverHeartbeatFailed' as const; + /** @event */ + static readonly CONNECT = 'connect' as const; + /** @event */ + static readonly DESCRIPTION_RECEIVED = 'descriptionReceived' as const; + /** @event */ + static readonly CLOSED = 'closed' as const; + /** @event */ + static readonly ENDED = 'ended' as const; + + /** + * Create a server + */ + constructor(topology: Topology, description: ServerDescription, options: ServerOptions) { + super(); + + this.serverApi = options.serverApi; + + const poolOptions = { hostAddress: description.hostAddress, ...options }; + + this.s = { + description, + options, + logger: new Logger('Server'), + state: STATE_CLOSED, + topology, + pool: new ConnectionPool(poolOptions) + }; + + for (const event of [...CMAP_EVENTS, ...APM_EVENTS]) { + this.s.pool.on(event, (e: any) => this.emit(event, e)); + } + + this.s.pool.on(Connection.CLUSTER_TIME_RECEIVED, (clusterTime: ClusterTime) => { + this.clusterTime = clusterTime; + }); + + // monitoring is disabled in load balancing mode + if (this.loadBalanced) return; + + // create the monitor + this[kMonitor] = new Monitor(this, this.s.options); + + for (const event of HEARTBEAT_EVENTS) { + 
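+ // re-emit the monitor's heartbeat events from the Server itself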
this[kMonitor].on(event, (e: any) => this.emit(event, e)); + } + + this[kMonitor].on('resetConnectionPool', () => { + this.s.pool.clear(); + }); + + this[kMonitor].on('resetServer', (error: MongoError) => markServerUnknown(this, error)); + this[kMonitor].on(Server.SERVER_HEARTBEAT_SUCCEEDED, (event: ServerHeartbeatSucceededEvent) => { + this.emit( + Server.DESCRIPTION_RECEIVED, + new ServerDescription(this.description.hostAddress, event.reply, { + roundTripTime: calculateRoundTripTime(this.description.roundTripTime, event.duration) + }) + ); + + if (this.s.state === STATE_CONNECTING) { + stateTransition(this, STATE_CONNECTED); + this.emit(Server.CONNECT, this); + } + }); + } + + get description(): ServerDescription { + return this.s.description; + } + + get name(): string { + return this.s.description.address; + } + + get autoEncrypter(): AutoEncrypter | undefined { + if (this.s.options && this.s.options.autoEncrypter) { + return this.s.options.autoEncrypter; + } + } + + get loadBalanced(): boolean { + return this.s.topology.description.type === TopologyType.LoadBalanced; + } + + /** + * Initiate server connect + */ + connect(): void { + if (this.s.state !== STATE_CLOSED) { + return; + } + + stateTransition(this, STATE_CONNECTING); + + // If in load balancer mode we automatically set the server to + // a load balancer. It never transitions out of this state and + // has no monitor. + if (!this.loadBalanced) { + this[kMonitor].connect(); + } else { + stateTransition(this, STATE_CONNECTED); + this.emit(Server.CONNECT, this); + } + } + + /** Destroy the server connection */ + destroy(options?: DestroyOptions, callback?: Callback): void { + if (typeof options === 'function') (callback = options), (options = {}); + options = Object.assign({}, { force: false }, options); + + if (this.s.state === STATE_CLOSED) { + if (typeof callback === 'function') { + callback(); + } + + return; + } + + stateTransition(this, STATE_CLOSING); + + if (!this.loadBalanced) { + this[kMonitor].close(); + } + + this.s.pool.close(options, err => { + stateTransition(this, STATE_CLOSED); + this.emit('closed'); + if (typeof callback === 'function') { + callback(err); + } + }); + } + + /** + * Immediately schedule monitoring of this server. If there already an attempt being made + * this will be a no-op. + */ + requestCheck(): void { + if (!this.loadBalanced) { + this[kMonitor].requestCheck(); + } + } + + /** + * Execute a command + * @internal + */ + command(ns: MongoDBNamespace, cmd: Document, callback: Callback): void; + /** @internal */ + command( + ns: MongoDBNamespace, + cmd: Document, + options: CommandOptions, + callback: Callback + ): void; + command( + ns: MongoDBNamespace, + cmd: Document, + options?: CommandOptions | Callback, + callback?: Callback + ): void { + if (typeof options === 'function') { + (callback = options), (options = {}), (options = options ?? 
{}); + } + + if (callback == null) { + throw new MongoInvalidArgumentError('Callback must be provided'); + } + + if (ns.db == null || typeof ns === 'string') { + throw new MongoInvalidArgumentError('Namespace must not be a string'); + } + + if (this.s.state === STATE_CLOSING || this.s.state === STATE_CLOSED) { + callback(new MongoServerClosedError()); + return; + } + + // Clone the options + const finalOptions = Object.assign({}, options, { wireProtocolCommand: false }); + + // error if collation not supported + if (collationNotSupported(this, cmd)) { + callback(new MongoCompatibilityError(`Server ${this.name} does not support collation`)); + return; + } + + const session = finalOptions.session; + const conn = session?.pinnedConnection; + + // NOTE: This is a hack! We can't retrieve the connections used for executing an operation + // (and prevent them from being checked back in) at the point of operation execution. + // This should be considered as part of the work for NODE-2882 + if (this.loadBalanced && session && conn == null && isPinnableCommand(cmd, session)) { + this.s.pool.checkOut((err, checkedOut) => { + if (err || checkedOut == null) { + if (callback) return callback(err); + return; + } + + session.pin(checkedOut); + this.command(ns, cmd, finalOptions, callback as Callback); + }); + + return; + } + + this.s.pool.withConnection( + conn, + (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + + conn.command( + ns, + cmd, + finalOptions, + makeOperationHandler(this, conn, cmd, finalOptions, cb) as Callback + ); + }, + callback + ); + } + + /** + * Execute a query against the server + * @internal + */ + query(ns: MongoDBNamespace, cmd: Document, options: QueryOptions, callback: Callback): void { + if (this.s.state === STATE_CLOSING || this.s.state === STATE_CLOSED) { + callback(new MongoServerClosedError()); + return; + } + + this.s.pool.withConnection( + undefined, + (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + + conn.query( + ns, + cmd, + options, + makeOperationHandler(this, conn, cmd, options, cb) as Callback + ); + }, + callback + ); + } + + /** + * Execute a `getMore` against the server + * @internal + */ + getMore( + ns: MongoDBNamespace, + cursorId: Long, + options: GetMoreOptions, + callback: Callback + ): void { + if (this.s.state === STATE_CLOSING || this.s.state === STATE_CLOSED) { + callback(new MongoServerClosedError()); + return; + } + + this.s.pool.withConnection( + options.session?.pinnedConnection, + (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + + conn.getMore( + ns, + cursorId, + options, + makeOperationHandler(this, conn, {}, options, cb) as Callback + ); + }, + callback + ); + } + + /** + * Execute a `killCursors` command against the server + * @internal + */ + killCursors( + ns: MongoDBNamespace, + cursorIds: Long[], + options: CommandOptions, + callback?: Callback + ): void { + if (this.s.state === STATE_CLOSING || this.s.state === STATE_CLOSED) { + if (typeof callback === 'function') { + callback(new MongoServerClosedError()); + } + + return; + } + + this.s.pool.withConnection( + options.session?.pinnedConnection, + (err, conn, cb) => { + if (err || !conn) { + markServerUnknown(this, err); + return cb(err); + } + + conn.killCursors( + ns, + cursorIds, + options, + makeOperationHandler(this, conn, {}, undefined, cb) as Callback + ); + }, + callback + ); + } +} + +export const HEARTBEAT_EVENTS = [ + 
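+ // heartbeat event names forwarded from the Monitor (see the Server constructor above)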
Server.SERVER_HEARTBEAT_STARTED, + Server.SERVER_HEARTBEAT_SUCCEEDED, + Server.SERVER_HEARTBEAT_FAILED +]; + +Object.defineProperty(Server.prototype, 'clusterTime', { + get() { + return this.s.topology.clusterTime; + }, + set(clusterTime: ClusterTime) { + this.s.topology.clusterTime = clusterTime; + } +}); + +function calculateRoundTripTime(oldRtt: number, duration: number): number { + if (oldRtt === -1) { + return duration; + } + + const alpha = 0.2; + return alpha * duration + (1 - alpha) * oldRtt; +} + +function markServerUnknown(server: Server, error?: MongoError) { + // Load balancer servers can never be marked unknown. + if (server.loadBalanced) { + return; + } + + if (error instanceof MongoNetworkError && !(error instanceof MongoNetworkTimeoutError)) { + server[kMonitor].reset(); + } + + server.emit( + Server.DESCRIPTION_RECEIVED, + new ServerDescription(server.description.hostAddress, undefined, { + error, + topologyVersion: + error && error.topologyVersion ? error.topologyVersion : server.description.topologyVersion + }) + ); +} + +function isPinnableCommand(cmd: Document, session?: ClientSession): boolean { + if (session) { + return ( + session.inTransaction() || + 'aggregate' in cmd || + 'find' in cmd || + 'getMore' in cmd || + 'listCollections' in cmd || + 'listIndexes' in cmd + ); + } + + return false; +} + +function connectionIsStale(pool: ConnectionPool, connection: Connection) { + if (connection.serviceId) { + return ( + connection.generation !== pool.serviceGenerations.get(connection.serviceId.toHexString()) + ); + } + + return connection.generation !== pool.generation; +} + +function shouldHandleStateChangeError(server: Server, err: MongoError) { + const etv = err.topologyVersion; + const stv = server.description.topologyVersion; + return compareTopologyVersion(stv, etv) < 0; +} + +function inActiveTransaction(session: ClientSession | undefined, cmd: Document) { + return session && session.inTransaction() && !isTransactionCommand(cmd); +} + +/** this checks the retryWrites option passed down from the client options, it + * does not check if the server supports retryable writes */ +function isRetryableWritesEnabled(topology: Topology) { + return topology.s.options.retryWrites !== false; +} + +function makeOperationHandler( + server: Server, + connection: Connection, + cmd: Document, + options: CommandOptions | GetMoreOptions | undefined, + callback: Callback +): CallbackWithType { + const session = options?.session; + return function handleOperationResult(err, result) { + if (err && !connectionIsStale(server.s.pool, connection)) { + if (err instanceof MongoNetworkError) { + if (session && !session.hasEnded && session.serverSession) { + session.serverSession.isDirty = true; + } + + // inActiveTransaction check handles commit and abort. + if (inActiveTransaction(session, cmd) && !err.hasErrorLabel('TransientTransactionError')) { + err.addErrorLabel('TransientTransactionError'); + } + + if ( + (isRetryableWritesEnabled(server.s.topology) || isTransactionCommand(cmd)) && + supportsRetryableWrites(server) && + !inActiveTransaction(session, cmd) + ) { + err.addErrorLabel('RetryableWriteError'); + } + + if (!(err instanceof MongoNetworkTimeoutError) || isNetworkErrorBeforeHandshake(err)) { + // In load balanced mode we never mark the server as unknown and always + // clear for the specific service id. 
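+ // (when the client is not load balanced, `connection.serviceId` is undefined, so the whole
+ // pool generation is bumped rather than a per-service one)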
+ + server.s.pool.clear(connection.serviceId); + if (!server.loadBalanced) { + markServerUnknown(server, err); + } + } + } else { + // if pre-4.4 server, then add error label if its a retryable write error + if ( + (isRetryableWritesEnabled(server.s.topology) || isTransactionCommand(cmd)) && + maxWireVersion(server) < 9 && + isRetryableWriteError(err) && + !inActiveTransaction(session, cmd) + ) { + err.addErrorLabel('RetryableWriteError'); + } + + if (isSDAMUnrecoverableError(err)) { + if (shouldHandleStateChangeError(server, err)) { + if (maxWireVersion(server) <= 7 || isNodeShuttingDownError(err)) { + server.s.pool.clear(connection.serviceId); + } + + if (!server.loadBalanced) { + markServerUnknown(server, err); + process.nextTick(() => server.requestCheck()); + } + } + } + } + + if (session && session.isPinned && err.hasErrorLabel('TransientTransactionError')) { + session.unpin({ force: true }); + } + } + + callback(err, result); + }; +} diff --git a/node_modules/mongodb/src/sdam/server_description.ts b/node_modules/mongodb/src/sdam/server_description.ts new file mode 100644 index 000000000..bd907b2c4 --- /dev/null +++ b/node_modules/mongodb/src/sdam/server_description.ts @@ -0,0 +1,279 @@ +import { arrayStrictEqual, errorStrictEqual, now, HostAddress } from '../utils'; +import { ServerType } from './common'; +import { ObjectId, Long, Document } from '../bson'; +import type { ClusterTime } from './common'; +import type { MongoError } from '../error'; + +const WRITABLE_SERVER_TYPES = new Set([ + ServerType.RSPrimary, + ServerType.Standalone, + ServerType.Mongos, + ServerType.LoadBalancer +]); + +const DATA_BEARING_SERVER_TYPES = new Set([ + ServerType.RSPrimary, + ServerType.RSSecondary, + ServerType.Mongos, + ServerType.Standalone, + ServerType.LoadBalancer +]); + +/** @public */ +export interface TopologyVersion { + processId: ObjectId; + counter: Long; +} + +/** @public */ +export type TagSet = { [key: string]: string }; + +/** @internal */ +export interface ServerDescriptionOptions { + /** An Error used for better reporting debugging */ + error?: MongoError; + + /** The round trip time to ping this server (in ms) */ + roundTripTime?: number; + + /** The topologyVersion */ + topologyVersion?: TopologyVersion; + + /** If the client is in load balancing mode. */ + loadBalanced?: boolean; +} + +/** + * The client's view of a single server, based on the most recent ismaster outcome. + * + * Internal type, not meant to be directly instantiated + * @public + */ +export class ServerDescription { + private _hostAddress: HostAddress; + address: string; + type: ServerType; + hosts: string[]; + passives: string[]; + arbiters: string[]; + tags: TagSet; + + error?: MongoError; + topologyVersion?: TopologyVersion; + minWireVersion: number; + maxWireVersion: number; + roundTripTime: number; + lastUpdateTime: number; + lastWriteDate: number; + + me?: string; + primary?: string; + setName?: string; + setVersion?: number; + electionId?: ObjectId; + logicalSessionTimeoutMinutes?: number; + + // NOTE: does this belong here? 
It seems we should gossip the cluster time at the CMAP level + $clusterTime?: ClusterTime; + + /** + * Create a ServerDescription + * @internal + * + * @param address - The address of the server + * @param ismaster - An optional ismaster response for this server + */ + constructor( + address: HostAddress | string, + ismaster?: Document, + options?: ServerDescriptionOptions + ) { + if (typeof address === 'string') { + this._hostAddress = new HostAddress(address); + this.address = this._hostAddress.toString(); + } else { + this._hostAddress = address; + this.address = this._hostAddress.toString(); + } + this.type = parseServerType(ismaster, options); + this.hosts = ismaster?.hosts?.map((host: string) => host.toLowerCase()) ?? []; + this.passives = ismaster?.passives?.map((host: string) => host.toLowerCase()) ?? []; + this.arbiters = ismaster?.arbiters?.map((host: string) => host.toLowerCase()) ?? []; + this.tags = ismaster?.tags ?? {}; + this.minWireVersion = ismaster?.minWireVersion ?? 0; + this.maxWireVersion = ismaster?.maxWireVersion ?? 0; + this.roundTripTime = options?.roundTripTime ?? -1; + this.lastUpdateTime = now(); + this.lastWriteDate = ismaster?.lastWrite?.lastWriteDate ?? 0; + + if (options?.topologyVersion) { + this.topologyVersion = options.topologyVersion; + } else if (ismaster?.topologyVersion) { + this.topologyVersion = ismaster.topologyVersion; + } + + if (options?.error) { + this.error = options.error; + } + + if (ismaster?.primary) { + this.primary = ismaster.primary; + } + + if (ismaster?.me) { + this.me = ismaster.me.toLowerCase(); + } + + if (ismaster?.setName) { + this.setName = ismaster.setName; + } + + if (ismaster?.setVersion) { + this.setVersion = ismaster.setVersion; + } + + if (ismaster?.electionId) { + this.electionId = ismaster.electionId; + } + + if (ismaster?.logicalSessionTimeoutMinutes) { + this.logicalSessionTimeoutMinutes = ismaster.logicalSessionTimeoutMinutes; + } + + if (ismaster?.$clusterTime) { + this.$clusterTime = ismaster.$clusterTime; + } + } + + get hostAddress(): HostAddress { + if (this._hostAddress) return this._hostAddress; + else return new HostAddress(this.address); + } + + get allHosts(): string[] { + return this.hosts.concat(this.arbiters).concat(this.passives); + } + + /** Is this server available for reads*/ + get isReadable(): boolean { + return this.type === ServerType.RSSecondary || this.isWritable; + } + + /** Is this server data bearing */ + get isDataBearing(): boolean { + return DATA_BEARING_SERVER_TYPES.has(this.type); + } + + /** Is this server available for writes */ + get isWritable(): boolean { + return WRITABLE_SERVER_TYPES.has(this.type); + } + + get host(): string { + const chopLength = `:${this.port}`.length; + return this.address.slice(0, -chopLength); + } + + get port(): number { + const port = this.address.split(':').pop(); + return port ? Number.parseInt(port, 10) : 27017; + } + + /** + * Determines if another `ServerDescription` is equal to this one per the rules defined + * in the {@link https://github.com/mongodb/specifications/blob/master/source/server-discovery-and-monitoring/server-discovery-and-monitoring.rst#serverdescription|SDAM spec} + */ + equals(other: ServerDescription): boolean { + const topologyVersionsEqual = + this.topologyVersion === other.topologyVersion || + compareTopologyVersion(this.topologyVersion, other.topologyVersion) === 0; + + const electionIdsEqual: boolean = + this.electionId && other.electionId + ? 
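+ // when both descriptions carry an electionId, compare by value; otherwise they only match if both are absent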
other.electionId && this.electionId.equals(other.electionId) + : this.electionId === other.electionId; + + return ( + other != null && + errorStrictEqual(this.error, other.error) && + this.type === other.type && + this.minWireVersion === other.minWireVersion && + arrayStrictEqual(this.hosts, other.hosts) && + tagsStrictEqual(this.tags, other.tags) && + this.setName === other.setName && + this.setVersion === other.setVersion && + electionIdsEqual && + this.primary === other.primary && + this.logicalSessionTimeoutMinutes === other.logicalSessionTimeoutMinutes && + topologyVersionsEqual + ); + } +} + +// Parses an `ismaster` message and determines the server type +export function parseServerType( + ismaster?: Document, + options?: ServerDescriptionOptions +): ServerType { + if (options?.loadBalanced) { + return ServerType.LoadBalancer; + } + + if (!ismaster || !ismaster.ok) { + return ServerType.Unknown; + } + + if (ismaster.isreplicaset) { + return ServerType.RSGhost; + } + + if (ismaster.msg && ismaster.msg === 'isdbgrid') { + return ServerType.Mongos; + } + + if (ismaster.setName) { + if (ismaster.hidden) { + return ServerType.RSOther; + } else if (ismaster.ismaster || ismaster.isWritablePrimary) { + return ServerType.RSPrimary; + } else if (ismaster.secondary) { + return ServerType.RSSecondary; + } else if (ismaster.arbiterOnly) { + return ServerType.RSArbiter; + } else { + return ServerType.RSOther; + } + } + + return ServerType.Standalone; +} + +function tagsStrictEqual(tags: TagSet, tags2: TagSet): boolean { + const tagsKeys = Object.keys(tags); + const tags2Keys = Object.keys(tags2); + + return ( + tagsKeys.length === tags2Keys.length && + tagsKeys.every((key: string) => tags2[key] === tags[key]) + ); +} + +/** + * Compares two topology versions. + * + * @returns A negative number if `lhs` is older than `rhs`; positive if `lhs` is newer than `rhs`; 0 if they are equivalent. + */ +export function compareTopologyVersion(lhs?: TopologyVersion, rhs?: TopologyVersion): number { + if (lhs == null || rhs == null) { + return -1; + } + + if (lhs.processId.equals(rhs.processId)) { + // tests mock counter as just number, but in a real situation counter should always be a Long + const lhsCounter = Long.isLong(lhs.counter) ? lhs.counter : Long.fromNumber(lhs.counter); + const rhsCounter = Long.isLong(rhs.counter) ? 
lhs.counter : Long.fromNumber(rhs.counter); + return lhsCounter.compare(rhsCounter); + } + + return -1; +} diff --git a/node_modules/mongodb/src/sdam/server_selection.ts b/node_modules/mongodb/src/sdam/server_selection.ts new file mode 100644 index 000000000..390da9693 --- /dev/null +++ b/node_modules/mongodb/src/sdam/server_selection.ts @@ -0,0 +1,281 @@ +import { ServerType, TopologyType } from './common'; +import { ReadPreference } from '../read_preference'; +import { MongoCompatibilityError, MongoInvalidArgumentError } from '../error'; +import type { TopologyDescription } from './topology_description'; +import type { ServerDescription, TagSet } from './server_description'; + +// max staleness constants +const IDLE_WRITE_PERIOD = 10000; +const SMALLEST_MAX_STALENESS_SECONDS = 90; + +/** @public */ +export type ServerSelector = ( + topologyDescription: TopologyDescription, + servers: ServerDescription[] +) => ServerDescription[]; + +/** + * Returns a server selector that selects for writable servers + */ +export function writableServerSelector(): ServerSelector { + return ( + topologyDescription: TopologyDescription, + servers: ServerDescription[] + ): ServerDescription[] => + latencyWindowReducer( + topologyDescription, + servers.filter((s: ServerDescription) => s.isWritable) + ); +} + +/** + * Reduces the passed in array of servers by the rules of the "Max Staleness" specification + * found here: https://github.com/mongodb/specifications/blob/master/source/max-staleness/max-staleness.rst + * + * @param readPreference - The read preference providing max staleness guidance + * @param topologyDescription - The topology description + * @param servers - The list of server descriptions to be reduced + * @returns The list of servers that satisfy the requirements of max staleness + */ +function maxStalenessReducer( + readPreference: ReadPreference, + topologyDescription: TopologyDescription, + servers: ServerDescription[] +): ServerDescription[] { + if (readPreference.maxStalenessSeconds == null || readPreference.maxStalenessSeconds < 0) { + return servers; + } + + const maxStaleness = readPreference.maxStalenessSeconds; + const maxStalenessVariance = + (topologyDescription.heartbeatFrequencyMS + IDLE_WRITE_PERIOD) / 1000; + if (maxStaleness < maxStalenessVariance) { + throw new MongoInvalidArgumentError( + `Option "maxStalenessSeconds" must be at least ${maxStalenessVariance} seconds` + ); + } + + if (maxStaleness < SMALLEST_MAX_STALENESS_SECONDS) { + throw new MongoInvalidArgumentError( + `Option "maxStalenessSeconds" must be at least ${SMALLEST_MAX_STALENESS_SECONDS} seconds` + ); + } + + if (topologyDescription.type === TopologyType.ReplicaSetWithPrimary) { + const primary: ServerDescription = Array.from(topologyDescription.servers.values()).filter( + primaryFilter + )[0]; + + return servers.reduce((result: ServerDescription[], server: ServerDescription) => { + const stalenessMS = + server.lastUpdateTime - + server.lastWriteDate - + (primary.lastUpdateTime - primary.lastWriteDate) + + topologyDescription.heartbeatFrequencyMS; + + const staleness = stalenessMS / 1000; + const maxStalenessSeconds = readPreference.maxStalenessSeconds ?? 0; + if (staleness <= maxStalenessSeconds) { + result.push(server); + } + + return result; + }, []); + } + + if (topologyDescription.type === TopologyType.ReplicaSetNoPrimary) { + if (servers.length === 0) { + return servers; + } + + const sMax = servers.reduce((max: ServerDescription, s: ServerDescription) => + s.lastWriteDate > max.lastWriteDate ? 
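+ // with no primary, the secondary with the most recent lastWriteDate becomes the baseline for computing staleness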
s : max + ); + + return servers.reduce((result: ServerDescription[], server: ServerDescription) => { + const stalenessMS = + sMax.lastWriteDate - server.lastWriteDate + topologyDescription.heartbeatFrequencyMS; + + const staleness = stalenessMS / 1000; + const maxStalenessSeconds = readPreference.maxStalenessSeconds ?? 0; + if (staleness <= maxStalenessSeconds) { + result.push(server); + } + + return result; + }, []); + } + + return servers; +} + +/** + * Determines whether a server's tags match a given set of tags + * + * @param tagSet - The requested tag set to match + * @param serverTags - The server's tags + */ +function tagSetMatch(tagSet: TagSet, serverTags: TagSet) { + const keys = Object.keys(tagSet); + const serverTagKeys = Object.keys(serverTags); + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + if (serverTagKeys.indexOf(key) === -1 || serverTags[key] !== tagSet[key]) { + return false; + } + } + + return true; +} + +/** + * Reduces a set of server descriptions based on tags requested by the read preference + * + * @param readPreference - The read preference providing the requested tags + * @param servers - The list of server descriptions to reduce + * @returns The list of servers matching the requested tags + */ +function tagSetReducer( + readPreference: ReadPreference, + servers: ServerDescription[] +): ServerDescription[] { + if ( + readPreference.tags == null || + (Array.isArray(readPreference.tags) && readPreference.tags.length === 0) + ) { + return servers; + } + + for (let i = 0; i < readPreference.tags.length; ++i) { + const tagSet = readPreference.tags[i]; + const serversMatchingTagset = servers.reduce( + (matched: ServerDescription[], server: ServerDescription) => { + if (tagSetMatch(tagSet, server.tags)) matched.push(server); + return matched; + }, + [] + ); + + if (serversMatchingTagset.length) { + return serversMatchingTagset; + } + } + + return []; +} + +/** + * Reduces a list of servers to ensure they fall within an acceptable latency window. This is + * further specified in the "Server Selection" specification, found here: + * https://github.com/mongodb/specifications/blob/master/source/server-selection/server-selection.rst + * + * @param topologyDescription - The topology description + * @param servers - The list of servers to reduce + * @returns The servers which fall within an acceptable latency window + */ +function latencyWindowReducer( + topologyDescription: TopologyDescription, + servers: ServerDescription[] +): ServerDescription[] { + const low = servers.reduce( + (min: number, server: ServerDescription) => + min === -1 ? 
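+ // the accumulator starts at -1 ("no RTT measured yet"), so the first server's round trip time seeds the minimum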
server.roundTripTime : Math.min(server.roundTripTime, min), + -1 + ); + + const high = low + topologyDescription.localThresholdMS; + return servers.reduce((result: ServerDescription[], server: ServerDescription) => { + if (server.roundTripTime <= high && server.roundTripTime >= low) result.push(server); + return result; + }, []); +} + +// filters +function primaryFilter(server: ServerDescription): boolean { + return server.type === ServerType.RSPrimary; +} + +function secondaryFilter(server: ServerDescription): boolean { + return server.type === ServerType.RSSecondary; +} + +function nearestFilter(server: ServerDescription): boolean { + return server.type === ServerType.RSSecondary || server.type === ServerType.RSPrimary; +} + +function knownFilter(server: ServerDescription): boolean { + return server.type !== ServerType.Unknown; +} + +function loadBalancerFilter(server: ServerDescription): boolean { + return server.type === ServerType.LoadBalancer; +} + +/** + * Returns a function which selects servers based on a provided read preference + * + * @param readPreference - The read preference to select with + */ +export function readPreferenceServerSelector(readPreference: ReadPreference): ServerSelector { + if (!readPreference.isValid()) { + throw new MongoInvalidArgumentError('Invalid read preference specified'); + } + + return ( + topologyDescription: TopologyDescription, + servers: ServerDescription[] + ): ServerDescription[] => { + const commonWireVersion = topologyDescription.commonWireVersion; + if ( + commonWireVersion && + readPreference.minWireVersion && + readPreference.minWireVersion > commonWireVersion + ) { + throw new MongoCompatibilityError( + `Minimum wire version '${readPreference.minWireVersion}' required, but found '${commonWireVersion}'` + ); + } + + if (topologyDescription.type === TopologyType.LoadBalanced) { + return servers.filter(loadBalancerFilter); + } + + if (topologyDescription.type === TopologyType.Unknown) { + return []; + } + + if ( + topologyDescription.type === TopologyType.Single || + topologyDescription.type === TopologyType.Sharded + ) { + return latencyWindowReducer(topologyDescription, servers.filter(knownFilter)); + } + + const mode = readPreference.mode; + if (mode === ReadPreference.PRIMARY) { + return servers.filter(primaryFilter); + } + + if (mode === ReadPreference.PRIMARY_PREFERRED) { + const result = servers.filter(primaryFilter); + if (result.length) { + return result; + } + } + + const filter = mode === ReadPreference.NEAREST ? nearestFilter : secondaryFilter; + const selectedServers = latencyWindowReducer( + topologyDescription, + tagSetReducer( + readPreference, + maxStalenessReducer(readPreference, topologyDescription, servers.filter(filter)) + ) + ); + + if (mode === ReadPreference.SECONDARY_PREFERRED && selectedServers.length === 0) { + return servers.filter(primaryFilter); + } + + return selectedServers; + }; +} diff --git a/node_modules/mongodb/src/sdam/srv_polling.ts b/node_modules/mongodb/src/sdam/srv_polling.ts new file mode 100644 index 000000000..119bcc597 --- /dev/null +++ b/node_modules/mongodb/src/sdam/srv_polling.ts @@ -0,0 +1,162 @@ +import * as dns from 'dns'; +import { Logger, LoggerOptions } from '../logger'; +import { HostAddress } from '../utils'; +import { TypedEventEmitter } from '../mongo_types'; +import { MongoRuntimeError } from '../error'; + +/** + * Determines whether a provided address matches the provided parent domain in order + * to avoid certain attack vectors. 
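+ * For example (illustrative hostnames): `matchesParentDomain('s1.example.com', 'cluster0.example.com')`
+ * is true because both reduce to `.example.com`, while
+ * `matchesParentDomain('s1.attacker.net', 'cluster0.example.com')` is false.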
+ * + * @param srvAddress - The address to check against a domain + * @param parentDomain - The domain to check the provided address against + * @returns Whether the provided address matches the parent domain + */ +function matchesParentDomain(srvAddress: string, parentDomain: string): boolean { + const regex = /^.*?\./; + const srv = `.${srvAddress.replace(regex, '')}`; + const parent = `.${parentDomain.replace(regex, '')}`; + return srv.endsWith(parent); +} + +/** + * @internal + * @category Event + */ +export class SrvPollingEvent { + srvRecords: dns.SrvRecord[]; + constructor(srvRecords: dns.SrvRecord[]) { + this.srvRecords = srvRecords; + } + + addresses(): Map { + return new Map( + this.srvRecords.map(record => { + const host = new HostAddress(`${record.name}:${record.port}`); + return [host.toString(), host]; + }) + ); + } +} + +/** @internal */ +export interface SrvPollerOptions extends LoggerOptions { + srvHost: string; + heartbeatFrequencyMS: number; +} + +/** @internal */ +export type SrvPollerEvents = { + srvRecordDiscovery(event: SrvPollingEvent): void; +}; + +/** @internal */ +export class SrvPoller extends TypedEventEmitter { + srvHost: string; + rescanSrvIntervalMS: number; + heartbeatFrequencyMS: number; + logger: Logger; + haMode: boolean; + generation: number; + _timeout?: NodeJS.Timeout; + + /** @event */ + static readonly SRV_RECORD_DISCOVERY = 'srvRecordDiscovery' as const; + + constructor(options: SrvPollerOptions) { + super(); + + if (!options || !options.srvHost) { + throw new MongoRuntimeError('Options for SrvPoller must exist and include srvHost'); + } + + this.srvHost = options.srvHost; + this.rescanSrvIntervalMS = 60000; + this.heartbeatFrequencyMS = options.heartbeatFrequencyMS || 10000; + this.logger = new Logger('srvPoller', options); + + this.haMode = false; + this.generation = 0; + + this._timeout = undefined; + } + + get srvAddress(): string { + return `_mongodb._tcp.${this.srvHost}`; + } + + get intervalMS(): number { + return this.haMode ? 
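+ // while in haMode (after a failed SRV poll) retry at the heartbeat frequency rather than the 60s rescan interval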
this.heartbeatFrequencyMS : this.rescanSrvIntervalMS; + } + + start(): void { + if (!this._timeout) { + this.schedule(); + } + } + + stop(): void { + if (this._timeout) { + clearTimeout(this._timeout); + this.generation += 1; + this._timeout = undefined; + } + } + + schedule(): void { + if (this._timeout) { + clearTimeout(this._timeout); + } + + this._timeout = setTimeout(() => this._poll(), this.intervalMS); + } + + success(srvRecords: dns.SrvRecord[]): void { + this.haMode = false; + this.schedule(); + this.emit(SrvPoller.SRV_RECORD_DISCOVERY, new SrvPollingEvent(srvRecords)); + } + + failure(message: string, obj?: NodeJS.ErrnoException): void { + this.logger.warn(message, obj); + this.haMode = true; + this.schedule(); + } + + parentDomainMismatch(srvRecord: dns.SrvRecord): void { + this.logger.warn( + `parent domain mismatch on SRV record (${srvRecord.name}:${srvRecord.port})`, + srvRecord + ); + } + + _poll(): void { + const generation = this.generation; + dns.resolveSrv(this.srvAddress, (err, srvRecords) => { + if (generation !== this.generation) { + return; + } + + if (err) { + this.failure('DNS error', err); + return; + } + + const finalAddresses: dns.SrvRecord[] = []; + srvRecords.forEach((record: dns.SrvRecord) => { + if (matchesParentDomain(record.name, this.srvHost)) { + finalAddresses.push(record); + } else { + this.parentDomainMismatch(record); + } + }); + + if (!finalAddresses.length) { + this.failure('No valid addresses found at host'); + return; + } + + this.success(finalAddresses); + }); + } +} diff --git a/node_modules/mongodb/src/sdam/topology.ts b/node_modules/mongodb/src/sdam/topology.ts new file mode 100644 index 000000000..67f81db60 --- /dev/null +++ b/node_modules/mongodb/src/sdam/topology.ts @@ -0,0 +1,1109 @@ +import Denque = require('denque'); +import { ReadPreference, ReadPreferenceLike } from '../read_preference'; +import { compareTopologyVersion, ServerDescription } from './server_description'; +import { TopologyDescription } from './topology_description'; +import { Server, ServerEvents, ServerOptions } from './server'; +import { + ClientSession, + ServerSessionPool, + ServerSessionId, + ClientSessionOptions +} from '../sessions'; +import { SrvPoller, SrvPollingEvent } from './srv_polling'; +import { CMAP_EVENTS, ConnectionPoolEvents } from '../cmap/connection_pool'; +import { + MongoServerSelectionError, + MongoCompatibilityError, + MongoDriverError, + MongoTopologyClosedError, + MongoRuntimeError +} from '../error'; +import { readPreferenceServerSelector, ServerSelector } from './server_selection'; +import { + makeStateMachine, + eachAsync, + ClientMetadata, + Callback, + HostAddress, + ns, + emitWarning, + EventEmitterWithState +} from '../utils'; +import { + TopologyType, + ServerType, + ClusterTime, + TimerQueue, + _advanceClusterTime, + drainTimerQueue, + clearAndRemoveTimerFrom, + STATE_CLOSED, + STATE_CLOSING, + STATE_CONNECTING, + STATE_CONNECTED +} from './common'; +import { + ServerOpeningEvent, + ServerClosedEvent, + ServerDescriptionChangedEvent, + TopologyOpeningEvent, + TopologyClosedEvent, + TopologyDescriptionChangedEvent +} from './events'; +import type { Document, BSONSerializeOptions } from '../bson'; +import type { MongoCredentials } from '../cmap/auth/mongo_credentials'; +import type { Transaction } from '../transactions'; +import type { CloseOptions } from '../cmap/connection_pool'; +import { DestroyOptions, Connection, ConnectionEvents } from '../cmap/connection'; +import type { MongoOptions, ServerApi } from '../mongo_client'; +import 
{ DEFAULT_OPTIONS } from '../connection_string'; +import { serialize, deserialize } from '../bson'; +import { TypedEventEmitter } from '../mongo_types'; + +// Global state +let globalTopologyCounter = 0; + +// events that we relay to the `Topology` +const SERVER_RELAY_EVENTS = [ + Server.SERVER_HEARTBEAT_STARTED, + Server.SERVER_HEARTBEAT_SUCCEEDED, + Server.SERVER_HEARTBEAT_FAILED, + Connection.COMMAND_STARTED, + Connection.COMMAND_SUCCEEDED, + Connection.COMMAND_FAILED, + ...CMAP_EVENTS +]; + +// all events we listen to from `Server` instances +const LOCAL_SERVER_EVENTS = [ + Server.CONNECT, + Server.DESCRIPTION_RECEIVED, + Server.CLOSED, + Server.ENDED +]; + +const stateTransition = makeStateMachine({ + [STATE_CLOSED]: [STATE_CLOSED, STATE_CONNECTING], + [STATE_CONNECTING]: [STATE_CONNECTING, STATE_CLOSING, STATE_CONNECTED, STATE_CLOSED], + [STATE_CONNECTED]: [STATE_CONNECTED, STATE_CLOSING, STATE_CLOSED], + [STATE_CLOSING]: [STATE_CLOSING, STATE_CLOSED] +}); + +/** @internal */ +const kCancelled = Symbol('cancelled'); +/** @internal */ +const kWaitQueue = Symbol('waitQueue'); + +/** @internal */ +export type ServerSelectionCallback = Callback; + +/** @internal */ +export interface ServerSelectionRequest { + serverSelector: ServerSelector; + transaction?: Transaction; + callback: ServerSelectionCallback; + timer?: NodeJS.Timeout; + [kCancelled]?: boolean; +} + +/** @internal */ +export interface TopologyPrivate { + /** the id of this topology */ + id: number; + /** passed in options */ + options: TopologyOptions; + /** initial seedlist of servers to connect to */ + seedlist: HostAddress[]; + /** initial state */ + state: string; + /** the topology description */ + description: TopologyDescription; + serverSelectionTimeoutMS: number; + heartbeatFrequencyMS: number; + minHeartbeatFrequencyMS: number; + /** A map of server instances to normalized addresses */ + servers: Map; + /** Server Session Pool */ + sessionPool: ServerSessionPool; + /** Active client sessions */ + sessions: Set; + credentials?: MongoCredentials; + clusterTime?: ClusterTime; + /** timers created for the initial connect to a server */ + connectionTimers: TimerQueue; + + /** related to srv polling */ + srvPoller?: SrvPoller; + detectShardedTopology: (event: TopologyDescriptionChangedEvent) => void; + detectSrvRecords: (event: SrvPollingEvent) => void; +} + +/** @public */ +export interface TopologyOptions extends BSONSerializeOptions, ServerOptions { + hosts: HostAddress[]; + retryWrites: boolean; + retryReads: boolean; + /** How long to block for server selection before throwing an error */ + serverSelectionTimeoutMS: number; + /** The name of the replica set to connect to */ + replicaSet?: string; + srvHost?: string; + /** @internal */ + srvPoller?: SrvPoller; + /** Indicates that a client should directly connect to a node without attempting to discover its topology type */ + directConnection: boolean; + loadBalanced: boolean; + metadata: ClientMetadata; + /** MongoDB server API version */ + serverApi?: ServerApi; +} + +/** @public */ +export interface ConnectOptions { + readPreference?: ReadPreference; +} + +/** @public */ +export interface SelectServerOptions { + readPreference?: ReadPreferenceLike; + /** How long to block for server selection before throwing an error */ + serverSelectionTimeoutMS?: number; + session?: ClientSession; +} + +/** @public */ +export type TopologyEvents = { + /** Top level MongoClient doesn't emit this so it is marked: @internal */ + connect(topology: Topology): void; + 
serverOpening(event: ServerOpeningEvent): void; + serverClosed(event: ServerClosedEvent): void; + serverDescriptionChanged(event: ServerDescriptionChangedEvent): void; + topologyClosed(event: TopologyClosedEvent): void; + topologyOpening(event: TopologyOpeningEvent): void; + topologyDescriptionChanged(event: TopologyDescriptionChangedEvent): void; + error(error: Error): void; + /** TODO(NODE-3273) - remove error @internal */ + open(error: undefined, topology: Topology): void; + close(): void; + timeout(): void; +} & Omit & + ConnectionPoolEvents & + ConnectionEvents & + EventEmitterWithState; +/** + * A container of server instances representing a connection to a MongoDB topology. + * @internal + */ +export class Topology extends TypedEventEmitter { + /** @internal */ + s: TopologyPrivate; + /** @internal */ + [kWaitQueue]: Denque; + /** @internal */ + ismaster?: Document; + /** @internal */ + _type?: string; + + /** @event */ + static readonly SERVER_OPENING = 'serverOpening' as const; + /** @event */ + static readonly SERVER_CLOSED = 'serverClosed' as const; + /** @event */ + static readonly SERVER_DESCRIPTION_CHANGED = 'serverDescriptionChanged' as const; + /** @event */ + static readonly TOPOLOGY_OPENING = 'topologyOpening' as const; + /** @event */ + static readonly TOPOLOGY_CLOSED = 'topologyClosed' as const; + /** @event */ + static readonly TOPOLOGY_DESCRIPTION_CHANGED = 'topologyDescriptionChanged' as const; + /** @event */ + static readonly ERROR = 'error' as const; + /** @event */ + static readonly OPEN = 'open' as const; + /** @event */ + static readonly CONNECT = 'connect' as const; + /** @event */ + static readonly CLOSE = 'close' as const; + /** @event */ + static readonly TIMEOUT = 'timeout' as const; + + /** + * @internal + * + * @privateRemarks + * mongodb-client-encryption's class ClientEncryption falls back to finding the bson lib + * defined on client.topology.bson, in order to maintain compatibility with any version + * of mongodb-client-encryption we keep a reference to serialize and deserialize here. + */ + bson: { serialize: typeof serialize; deserialize: typeof deserialize }; + + /** + * @param seedlist - a list of HostAddress instances to connect to + */ + constructor(seeds: string | string[] | HostAddress | HostAddress[], options: TopologyOptions) { + super(); + + // Legacy CSFLE support + this.bson = Object.create(null); + this.bson.serialize = serialize; + this.bson.deserialize = deserialize; + + // Options should only be undefined in tests, MongoClient will always have defined options + options = options ?? 
{ + hosts: [HostAddress.fromString('localhost:27017')], + retryReads: DEFAULT_OPTIONS.get('retryReads'), + retryWrites: DEFAULT_OPTIONS.get('retryWrites'), + serverSelectionTimeoutMS: DEFAULT_OPTIONS.get('serverSelectionTimeoutMS'), + directConnection: DEFAULT_OPTIONS.get('directConnection'), + loadBalanced: DEFAULT_OPTIONS.get('loadBalanced'), + metadata: DEFAULT_OPTIONS.get('metadata'), + monitorCommands: DEFAULT_OPTIONS.get('monitorCommands'), + tls: DEFAULT_OPTIONS.get('tls'), + maxPoolSize: DEFAULT_OPTIONS.get('maxPoolSize'), + minPoolSize: DEFAULT_OPTIONS.get('minPoolSize'), + waitQueueTimeoutMS: DEFAULT_OPTIONS.get('waitQueueTimeoutMS'), + connectionType: DEFAULT_OPTIONS.get('connectionType'), + connectTimeoutMS: DEFAULT_OPTIONS.get('connectTimeoutMS'), + maxIdleTimeMS: DEFAULT_OPTIONS.get('maxIdleTimeMS'), + heartbeatFrequencyMS: DEFAULT_OPTIONS.get('heartbeatFrequencyMS'), + minHeartbeatFrequencyMS: DEFAULT_OPTIONS.get('minHeartbeatFrequencyMS') + }; + + if (typeof seeds === 'string') { + seeds = [HostAddress.fromString(seeds)]; + } else if (!Array.isArray(seeds)) { + seeds = [seeds]; + } + + const seedlist: HostAddress[] = []; + for (const seed of seeds) { + if (typeof seed === 'string') { + seedlist.push(HostAddress.fromString(seed)); + } else if (seed instanceof HostAddress) { + seedlist.push(seed); + } else { + // FIXME(NODE-3483): May need to be a MongoParseError + throw new MongoRuntimeError(`Topology cannot be constructed from ${JSON.stringify(seed)}`); + } + } + + const topologyType = topologyTypeFromOptions(options); + const topologyId = globalTopologyCounter++; + + const serverDescriptions = new Map(); + for (const hostAddress of seedlist) { + serverDescriptions.set(hostAddress.toString(), new ServerDescription(hostAddress)); + } + + this[kWaitQueue] = new Denque(); + this.s = { + // the id of this topology + id: topologyId, + // passed in options + options, + // initial seedlist of servers to connect to + seedlist, + // initial state + state: STATE_CLOSED, + // the topology description + description: new TopologyDescription( + topologyType, + serverDescriptions, + options.replicaSet, + undefined, + undefined, + undefined, + options + ), + serverSelectionTimeoutMS: options.serverSelectionTimeoutMS, + heartbeatFrequencyMS: options.heartbeatFrequencyMS, + minHeartbeatFrequencyMS: options.minHeartbeatFrequencyMS, + // a map of server instances to normalized addresses + servers: new Map(), + // Server Session Pool + sessionPool: new ServerSessionPool(this), + // Active client sessions + sessions: new Set(), + credentials: options?.credentials, + clusterTime: undefined, + + // timer management + connectionTimers: new Set(), + + detectShardedTopology: ev => this.detectShardedTopology(ev), + detectSrvRecords: ev => this.detectSrvRecords(ev) + }; + + if (options.srvHost && !options.loadBalanced) { + this.s.srvPoller = + options.srvPoller ?? 
+ new SrvPoller({ + heartbeatFrequencyMS: this.s.heartbeatFrequencyMS, + srvHost: options.srvHost + }); + + this.on(Topology.TOPOLOGY_DESCRIPTION_CHANGED, this.s.detectShardedTopology); + } + } + + private detectShardedTopology(event: TopologyDescriptionChangedEvent) { + const previousType = event.previousDescription.type; + const newType = event.newDescription.type; + + const transitionToSharded = + previousType !== TopologyType.Sharded && newType === TopologyType.Sharded; + const srvListeners = this.s.srvPoller?.listeners(SrvPoller.SRV_RECORD_DISCOVERY); + const listeningToSrvPolling = !!srvListeners?.includes(this.s.detectSrvRecords); + + if (transitionToSharded && !listeningToSrvPolling) { + this.s.srvPoller?.on(SrvPoller.SRV_RECORD_DISCOVERY, this.s.detectSrvRecords); + this.s.srvPoller?.start(); + } + } + + private detectSrvRecords(ev: SrvPollingEvent) { + const previousTopologyDescription = this.s.description; + this.s.description = this.s.description.updateFromSrvPollingEvent(ev); + if (this.s.description === previousTopologyDescription) { + // Nothing changed, so return + return; + } + + updateServers(this); + + this.emit( + Topology.TOPOLOGY_DESCRIPTION_CHANGED, + new TopologyDescriptionChangedEvent( + this.s.id, + previousTopologyDescription, + this.s.description + ) + ); + } + + /** + * @returns A `TopologyDescription` for this topology + */ + get description(): TopologyDescription { + return this.s.description; + } + + get loadBalanced(): boolean { + return this.s.options.loadBalanced; + } + + get capabilities(): ServerCapabilities { + return new ServerCapabilities(this.lastIsMaster()); + } + + /** Initiate server connect */ + connect(options?: ConnectOptions, callback?: Callback): void { + if (typeof options === 'function') (callback = options), (options = {}); + options = options ?? {}; + if (this.s.state === STATE_CONNECTED) { + if (typeof callback === 'function') { + callback(); + } + + return; + } + + stateTransition(this, STATE_CONNECTING); + + // emit SDAM monitoring events + this.emit(Topology.TOPOLOGY_OPENING, new TopologyOpeningEvent(this.s.id)); + + // emit an event for the topology change + this.emit( + Topology.TOPOLOGY_DESCRIPTION_CHANGED, + new TopologyDescriptionChangedEvent( + this.s.id, + new TopologyDescription(TopologyType.Unknown), // initial is always Unknown + this.s.description + ) + ); + + // connect all known servers, then attempt server selection to connect + const serverDescriptions = Array.from(this.s.description.servers.values()); + connectServers(this, serverDescriptions); + + // In load balancer mode we need to fake a server description getting + // emitted from the monitor, since the monitor doesn't exist. + if (this.s.options.loadBalanced) { + for (const description of serverDescriptions) { + const newDescription = new ServerDescription(description.hostAddress, undefined, { + loadBalanced: this.s.options.loadBalanced + }); + this.serverUpdateHandler(newDescription); + } + } + + const readPreference = options.readPreference ?? ReadPreference.primary; + this.selectServer(readPreferenceServerSelector(readPreference), options, (err, server) => { + if (err) { + this.close(); + + typeof callback === 'function' ? callback(err) : this.emit(Topology.ERROR, err); + return; + } + + // TODO: NODE-2471 + if (server && this.s.credentials) { + server.command(ns('admin.$cmd'), { ping: 1 }, err => { + if (err) { + typeof callback === 'function' ? 
callback(err) : this.emit(Topology.ERROR, err); + return; + } + + stateTransition(this, STATE_CONNECTED); + // TODO(NODE-3273) - remove err + this.emit(Topology.OPEN, err, this); + this.emit(Topology.CONNECT, this); + + if (typeof callback === 'function') callback(undefined, this); + }); + + return; + } + + stateTransition(this, STATE_CONNECTED); + // TODO(NODE-3273) - remove err + this.emit(Topology.OPEN, err, this); + this.emit(Topology.CONNECT, this); + + if (typeof callback === 'function') callback(undefined, this); + }); + } + + /** Close this topology */ + close(options?: CloseOptions, callback?: Callback): void { + if (typeof options === 'function') { + callback = options; + options = {}; + } + + if (typeof options === 'boolean') { + options = { force: options }; + } + + options = options ?? {}; + if (this.s.state === STATE_CLOSED || this.s.state === STATE_CLOSING) { + if (typeof callback === 'function') { + callback(); + } + + return; + } + + stateTransition(this, STATE_CLOSING); + + drainWaitQueue(this[kWaitQueue], new MongoTopologyClosedError()); + drainTimerQueue(this.s.connectionTimers); + + if (this.s.srvPoller) { + this.s.srvPoller.stop(); + this.s.srvPoller.removeListener(SrvPoller.SRV_RECORD_DISCOVERY, this.s.detectSrvRecords); + } + + this.removeListener(Topology.TOPOLOGY_DESCRIPTION_CHANGED, this.s.detectShardedTopology); + + eachAsync( + Array.from(this.s.sessions.values()), + (session, cb) => session.endSession(cb), + () => { + this.s.sessionPool.endAllPooledSessions(() => { + eachAsync( + Array.from(this.s.servers.values()), + (server, cb) => destroyServer(server, this, options, cb), + err => { + this.s.servers.clear(); + + // emit an event for close + this.emit(Topology.TOPOLOGY_CLOSED, new TopologyClosedEvent(this.s.id)); + + stateTransition(this, STATE_CLOSED); + + if (typeof callback === 'function') { + callback(err); + } + } + ); + }); + } + ); + } + + /** + * Selects a server according to the selection predicate provided + * + * @param selector - An optional selector to select servers by, defaults to a random selection within a latency window + * @param options - Optional settings related to server selection + * @param callback - The callback used to indicate success or failure + * @returns An instance of a `Server` meeting the criteria of the predicate provided + */ + selectServer(options: SelectServerOptions, callback: Callback): void; + selectServer( + selector: string | ReadPreference | ServerSelector, + callback: Callback + ): void; + selectServer( + selector: string | ReadPreference | ServerSelector, + options: SelectServerOptions, + callback: Callback + ): void; + selectServer( + selector: string | ReadPreference | ServerSelector | SelectServerOptions, + _options?: SelectServerOptions | Callback, + _callback?: Callback + ): void { + let options = _options as SelectServerOptions; + const callback = (_callback ?? 
_options) as Callback; + if (typeof options === 'function') { + options = {}; + } + + let serverSelector; + if (typeof selector !== 'function') { + if (typeof selector === 'string') { + serverSelector = readPreferenceServerSelector(ReadPreference.fromString(selector)); + } else { + let readPreference; + if (selector instanceof ReadPreference) { + readPreference = selector; + } else { + ReadPreference.translate(options); + readPreference = options.readPreference || ReadPreference.primary; + } + + serverSelector = readPreferenceServerSelector(readPreference as ReadPreference); + } + } else { + serverSelector = selector; + } + + options = Object.assign( + {}, + { serverSelectionTimeoutMS: this.s.serverSelectionTimeoutMS }, + options + ); + + const isSharded = this.description.type === TopologyType.Sharded; + const session = options.session; + const transaction = session && session.transaction; + + if (isSharded && transaction && transaction.server) { + callback(undefined, transaction.server); + return; + } + + const waitQueueMember: ServerSelectionRequest = { + serverSelector, + transaction, + callback + }; + + const serverSelectionTimeoutMS = options.serverSelectionTimeoutMS; + if (serverSelectionTimeoutMS) { + waitQueueMember.timer = setTimeout(() => { + waitQueueMember[kCancelled] = true; + waitQueueMember.timer = undefined; + const timeoutError = new MongoServerSelectionError( + `Server selection timed out after ${serverSelectionTimeoutMS} ms`, + this.description + ); + + waitQueueMember.callback(timeoutError); + }, serverSelectionTimeoutMS); + } + + this[kWaitQueue].push(waitQueueMember); + processWaitQueue(this); + } + + // Sessions related methods + + /** + * @returns Whether the topology should initiate selection to determine session support + */ + shouldCheckForSessionSupport(): boolean { + if (this.description.type === TopologyType.Single) { + return !this.description.hasKnownServers; + } + + return !this.description.hasDataBearingServers; + } + + /** + * @returns Whether sessions are supported on the current topology + */ + hasSessionSupport(): boolean { + return this.loadBalanced || this.description.logicalSessionTimeoutMinutes != null; + } + + /** Start a logical session */ + startSession(options: ClientSessionOptions, clientOptions?: MongoOptions): ClientSession { + const session = new ClientSession(this, this.s.sessionPool, options, clientOptions); + session.once('ended', () => { + this.s.sessions.delete(session); + }); + + this.s.sessions.add(session); + return session; + } + + /** Send endSessions command(s) with the given session ids */ + endSessions(sessions: ServerSessionId[], callback?: Callback): void { + if (!Array.isArray(sessions)) { + sessions = [sessions]; + } + + this.selectServer( + readPreferenceServerSelector(ReadPreference.primaryPreferred), + (err, server) => { + if (err || !server) { + if (typeof callback === 'function') callback(err); + return; + } + + server.command( + ns('admin.$cmd'), + { endSessions: sessions }, + { noResponse: true }, + (err, result) => { + if (typeof callback === 'function') callback(err, result); + } + ); + } + ); + } + + /** + * Update the internal TopologyDescription with a ServerDescription + * + * @param serverDescription - The server to update in the internal list of server descriptions + */ + serverUpdateHandler(serverDescription: ServerDescription): void { + if (!this.s.description.hasServer(serverDescription.address)) { + return; + } + + // ignore this server update if its from an outdated topologyVersion + if 
(isStaleServerDescription(this.s.description, serverDescription)) { + return; + } + + // these will be used for monitoring events later + const previousTopologyDescription = this.s.description; + const previousServerDescription = this.s.description.servers.get(serverDescription.address); + if (!previousServerDescription) { + return; + } + + // Driver Sessions Spec: "Whenever a driver receives a cluster time from + // a server it MUST compare it to the current highest seen cluster time + // for the deployment. If the new cluster time is higher than the + // highest seen cluster time it MUST become the new highest seen cluster + // time. Two cluster times are compared using only the BsonTimestamp + // value of the clusterTime embedded field." + const clusterTime = serverDescription.$clusterTime; + if (clusterTime) { + _advanceClusterTime(this, clusterTime); + } + + // If we already know all the information contained in this updated description, then + // we don't need to emit SDAM events, but still need to update the description, in order + // to keep client-tracked attributes like last update time and round trip time up to date + const equalDescriptions = + previousServerDescription && previousServerDescription.equals(serverDescription); + + // first update the TopologyDescription + this.s.description = this.s.description.update(serverDescription); + if (this.s.description.compatibilityError) { + this.emit(Topology.ERROR, new MongoCompatibilityError(this.s.description.compatibilityError)); + return; + } + + // emit monitoring events for this change + if (!equalDescriptions) { + const newDescription = this.s.description.servers.get(serverDescription.address); + if (newDescription) { + this.emit( + Topology.SERVER_DESCRIPTION_CHANGED, + new ServerDescriptionChangedEvent( + this.s.id, + serverDescription.address, + previousServerDescription, + newDescription + ) + ); + } + } + + // update server list from updated descriptions + updateServers(this, serverDescription); + + // attempt to resolve any outstanding server selection attempts + if (this[kWaitQueue].length > 0) { + processWaitQueue(this); + } + + if (!equalDescriptions) { + this.emit( + Topology.TOPOLOGY_DESCRIPTION_CHANGED, + new TopologyDescriptionChangedEvent( + this.s.id, + previousTopologyDescription, + this.s.description + ) + ); + } + } + + auth(credentials?: MongoCredentials, callback?: Callback): void { + if (typeof credentials === 'function') (callback = credentials), (credentials = undefined); + if (typeof callback === 'function') callback(undefined, true); + } + + get clientMetadata(): ClientMetadata { + return this.s.options.metadata; + } + + isConnected(): boolean { + return this.s.state === STATE_CONNECTED; + } + + isDestroyed(): boolean { + return this.s.state === STATE_CLOSED; + } + + /** + * @deprecated This function is deprecated and will be removed in the next major version. + */ + unref(): void { + emitWarning('`unref` is a noop and will be removed in the next major version'); + } + + // NOTE: There are many places in code where we explicitly check the last isMaster + // to do feature support detection. This should be done any other way, but for + // now we will just return the first isMaster seen, which should suffice. 
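+  // lastIsMaster() below returns the first known (non-Unknown) server description,
+  // falling back to a stub carrying only the topology's commonWireVersion, or an
+  // empty document when no server descriptions are available.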
+ lastIsMaster(): Document { + const serverDescriptions = Array.from(this.description.servers.values()); + if (serverDescriptions.length === 0) return {}; + const sd = serverDescriptions.filter( + (sd: ServerDescription) => sd.type !== ServerType.Unknown + )[0]; + + const result = sd || { maxWireVersion: this.description.commonWireVersion }; + return result; + } + + get logicalSessionTimeoutMinutes(): number | undefined { + return this.description.logicalSessionTimeoutMinutes; + } + + get clusterTime(): ClusterTime | undefined { + return this.s.clusterTime; + } + + set clusterTime(clusterTime: ClusterTime | undefined) { + this.s.clusterTime = clusterTime; + } +} + +/** @public */ +export const TOPOLOGY_EVENTS = [ + Topology.SERVER_OPENING, + Topology.SERVER_CLOSED, + Topology.SERVER_DESCRIPTION_CHANGED, + Topology.TOPOLOGY_OPENING, + Topology.TOPOLOGY_CLOSED, + Topology.TOPOLOGY_DESCRIPTION_CHANGED, + Topology.ERROR, + Topology.TIMEOUT, + Topology.CLOSE +]; + +/** Destroys a server, and removes all event listeners from the instance */ +function destroyServer( + server: Server, + topology: Topology, + options?: DestroyOptions, + callback?: Callback +) { + options = options ?? {}; + for (const event of LOCAL_SERVER_EVENTS) { + server.removeAllListeners(event); + } + + server.destroy(options, () => { + topology.emit( + Topology.SERVER_CLOSED, + new ServerClosedEvent(topology.s.id, server.description.address) + ); + + for (const event of SERVER_RELAY_EVENTS) { + server.removeAllListeners(event); + } + if (typeof callback === 'function') { + callback(); + } + }); +} + +/** Predicts the TopologyType from options */ +function topologyTypeFromOptions(options?: TopologyOptions) { + if (options?.directConnection) { + return TopologyType.Single; + } + + if (options?.replicaSet) { + return TopologyType.ReplicaSetNoPrimary; + } + + if (options?.loadBalanced) { + return TopologyType.LoadBalanced; + } + + return TopologyType.Unknown; +} + +function randomSelection(array: ServerDescription[]): ServerDescription { + return array[Math.floor(Math.random() * array.length)]; +} + +/** + * Creates new server instances and attempts to connect them + * + * @param topology - The topology that this server belongs to + * @param serverDescription - The description for the server to initialize and connect to + * @param connectDelay - Time to wait before attempting initial connection + */ +function createAndConnectServer( + topology: Topology, + serverDescription: ServerDescription, + connectDelay?: number +) { + topology.emit( + Topology.SERVER_OPENING, + new ServerOpeningEvent(topology.s.id, serverDescription.address) + ); + + const server = new Server(topology, serverDescription, topology.s.options); + for (const event of SERVER_RELAY_EVENTS) { + server.on(event, (e: any) => topology.emit(event, e)); + } + + server.on(Server.DESCRIPTION_RECEIVED, description => topology.serverUpdateHandler(description)); + + if (connectDelay) { + const connectTimer = setTimeout(() => { + clearAndRemoveTimerFrom(connectTimer, topology.s.connectionTimers); + server.connect(); + }, connectDelay); + + topology.s.connectionTimers.add(connectTimer); + return server; + } + + server.connect(); + return server; +} + +/** + * Create `Server` instances for all initially known servers, connect them, and assign + * them to the passed in `Topology`. 
+ * + * @param topology - The topology responsible for the servers + * @param serverDescriptions - A list of server descriptions to connect + */ +function connectServers(topology: Topology, serverDescriptions: ServerDescription[]) { + topology.s.servers = serverDescriptions.reduce( + (servers: Map, serverDescription: ServerDescription) => { + const server = createAndConnectServer(topology, serverDescription); + servers.set(serverDescription.address, server); + return servers; + }, + new Map() + ); +} + +/** + * @param topology - Topology to update. + * @param incomingServerDescription - New server description. + */ +function updateServers(topology: Topology, incomingServerDescription?: ServerDescription) { + // update the internal server's description + if (incomingServerDescription && topology.s.servers.has(incomingServerDescription.address)) { + const server = topology.s.servers.get(incomingServerDescription.address); + if (server) { + server.s.description = incomingServerDescription; + } + } + + // add new servers for all descriptions we currently don't know about locally + for (const serverDescription of topology.description.servers.values()) { + if (!topology.s.servers.has(serverDescription.address)) { + const server = createAndConnectServer(topology, serverDescription); + topology.s.servers.set(serverDescription.address, server); + } + } + + // for all servers no longer known, remove their descriptions and destroy their instances + for (const entry of topology.s.servers) { + const serverAddress = entry[0]; + if (topology.description.hasServer(serverAddress)) { + continue; + } + + if (!topology.s.servers.has(serverAddress)) { + continue; + } + + const server = topology.s.servers.get(serverAddress); + topology.s.servers.delete(serverAddress); + + // prepare server for garbage collection + if (server) { + destroyServer(server, topology); + } + } +} + +function drainWaitQueue(queue: Denque, err?: MongoDriverError) { + while (queue.length) { + const waitQueueMember = queue.shift(); + if (!waitQueueMember) { + continue; + } + + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + + if (!waitQueueMember[kCancelled]) { + waitQueueMember.callback(err); + } + } +} + +function processWaitQueue(topology: Topology) { + if (topology.s.state === STATE_CLOSED) { + drainWaitQueue(topology[kWaitQueue], new MongoTopologyClosedError()); + return; + } + + const isSharded = topology.description.type === TopologyType.Sharded; + const serverDescriptions = Array.from(topology.description.servers.values()); + const membersToProcess = topology[kWaitQueue].length; + for (let i = 0; i < membersToProcess; ++i) { + const waitQueueMember = topology[kWaitQueue].shift(); + if (!waitQueueMember) { + continue; + } + + if (waitQueueMember[kCancelled]) { + continue; + } + + let selectedDescriptions; + try { + const serverSelector = waitQueueMember.serverSelector; + selectedDescriptions = serverSelector + ? 
serverSelector(topology.description, serverDescriptions) + : serverDescriptions; + } catch (e) { + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + + waitQueueMember.callback(e); + continue; + } + + if (selectedDescriptions.length === 0) { + topology[kWaitQueue].push(waitQueueMember); + continue; + } + + const selectedServerDescription = randomSelection(selectedDescriptions); + const selectedServer = topology.s.servers.get(selectedServerDescription.address); + const transaction = waitQueueMember.transaction; + if (isSharded && transaction && transaction.isActive && selectedServer) { + transaction.pinServer(selectedServer); + } + + if (waitQueueMember.timer) { + clearTimeout(waitQueueMember.timer); + } + + waitQueueMember.callback(undefined, selectedServer); + } + + if (topology[kWaitQueue].length > 0) { + // ensure all server monitors attempt monitoring soon + for (const [, server] of topology.s.servers) { + process.nextTick(function scheduleServerCheck() { + return server.requestCheck(); + }); + } + } +} + +function isStaleServerDescription( + topologyDescription: TopologyDescription, + incomingServerDescription: ServerDescription +) { + const currentServerDescription = topologyDescription.servers.get( + incomingServerDescription.address + ); + const currentTopologyVersion = currentServerDescription?.topologyVersion; + return ( + compareTopologyVersion(currentTopologyVersion, incomingServerDescription.topologyVersion) > 0 + ); +} + +/** @public */ +export class ServerCapabilities { + maxWireVersion: number; + minWireVersion: number; + + constructor(ismaster: Document) { + this.minWireVersion = ismaster.minWireVersion || 0; + this.maxWireVersion = ismaster.maxWireVersion || 0; + } + + get hasAggregationCursor(): boolean { + return this.maxWireVersion >= 1; + } + + get hasWriteCommands(): boolean { + return this.maxWireVersion >= 2; + } + get hasTextSearch(): boolean { + return this.minWireVersion >= 0; + } + + get hasAuthCommands(): boolean { + return this.maxWireVersion >= 1; + } + + get hasListCollectionsCommand(): boolean { + return this.maxWireVersion >= 3; + } + + get hasListIndexesCommand(): boolean { + return this.maxWireVersion >= 3; + } + + get supportsSnapshotReads(): boolean { + return this.maxWireVersion >= 13; + } + + get commandsTakeWriteConcern(): boolean { + return this.maxWireVersion >= 5; + } + + get commandsTakeCollation(): boolean { + return this.maxWireVersion >= 5; + } +} diff --git a/node_modules/mongodb/src/sdam/topology_description.ts b/node_modules/mongodb/src/sdam/topology_description.ts new file mode 100644 index 000000000..60ad5e689 --- /dev/null +++ b/node_modules/mongodb/src/sdam/topology_description.ts @@ -0,0 +1,487 @@ +import { ServerDescription } from './server_description'; +import * as WIRE_CONSTANTS from '../cmap/wire_protocol/constants'; +import { TopologyType, ServerType } from './common'; +import type { ObjectId, Document } from '../bson'; +import type { SrvPollingEvent } from './srv_polling'; +import { MongoError, MongoRuntimeError } from '../error'; + +// constants related to compatibility checks +const MIN_SUPPORTED_SERVER_VERSION = WIRE_CONSTANTS.MIN_SUPPORTED_SERVER_VERSION; +const MAX_SUPPORTED_SERVER_VERSION = WIRE_CONSTANTS.MAX_SUPPORTED_SERVER_VERSION; +const MIN_SUPPORTED_WIRE_VERSION = WIRE_CONSTANTS.MIN_SUPPORTED_WIRE_VERSION; +const MAX_SUPPORTED_WIRE_VERSION = WIRE_CONSTANTS.MAX_SUPPORTED_WIRE_VERSION; + +const MONGOS_OR_UNKNOWN = new Set([ServerType.Mongos, ServerType.Unknown]); +const MONGOS_OR_STANDALONE = 
new Set([ServerType.Mongos, ServerType.Standalone]); +const NON_PRIMARY_RS_MEMBERS = new Set([ + ServerType.RSSecondary, + ServerType.RSArbiter, + ServerType.RSOther +]); + +/** @public */ +export interface TopologyDescriptionOptions { + heartbeatFrequencyMS?: number; + localThresholdMS?: number; +} + +/** + * Representation of a deployment of servers + * @public + */ +export class TopologyDescription { + type: TopologyType; + setName?: string; + maxSetVersion?: number; + maxElectionId?: ObjectId; + servers: Map; + stale: boolean; + compatible: boolean; + compatibilityError?: string; + logicalSessionTimeoutMinutes?: number; + heartbeatFrequencyMS: number; + localThresholdMS: number; + commonWireVersion?: number; + + /** + * Create a TopologyDescription + */ + constructor( + topologyType: TopologyType, + serverDescriptions?: Map, + setName?: string, + maxSetVersion?: number, + maxElectionId?: ObjectId, + commonWireVersion?: number, + options?: TopologyDescriptionOptions + ) { + options = options ?? {}; + + // TODO: consider assigning all these values to a temporary value `s` which + // we use `Object.freeze` on, ensuring the internal state of this type + // is immutable. + this.type = topologyType ?? TopologyType.Unknown; + this.servers = serverDescriptions ?? new Map(); + this.stale = false; + this.compatible = true; + this.heartbeatFrequencyMS = options.heartbeatFrequencyMS ?? 0; + this.localThresholdMS = options.localThresholdMS ?? 0; + + if (setName) { + this.setName = setName; + } + + if (maxSetVersion) { + this.maxSetVersion = maxSetVersion; + } + + if (maxElectionId) { + this.maxElectionId = maxElectionId; + } + + if (commonWireVersion) { + this.commonWireVersion = commonWireVersion; + } + + // determine server compatibility + for (const serverDescription of this.servers.values()) { + // Load balancer mode is always compatible. + if ( + serverDescription.type === ServerType.Unknown || + serverDescription.type === ServerType.LoadBalancer + ) { + continue; + } + + if (serverDescription.minWireVersion > MAX_SUPPORTED_WIRE_VERSION) { + this.compatible = false; + this.compatibilityError = `Server at ${serverDescription.address} requires wire version ${serverDescription.minWireVersion}, but this version of the driver only supports up to ${MAX_SUPPORTED_WIRE_VERSION} (MongoDB ${MAX_SUPPORTED_SERVER_VERSION})`; + } + + if (serverDescription.maxWireVersion < MIN_SUPPORTED_WIRE_VERSION) { + this.compatible = false; + this.compatibilityError = `Server at ${serverDescription.address} reports wire version ${serverDescription.maxWireVersion}, but this version of the driver requires at least ${MIN_SUPPORTED_WIRE_VERSION} (MongoDB ${MIN_SUPPORTED_SERVER_VERSION}).`; + break; + } + } + + // Whenever a client updates the TopologyDescription from an ismaster response, it MUST set + // TopologyDescription.logicalSessionTimeoutMinutes to the smallest logicalSessionTimeoutMinutes + // value among ServerDescriptions of all data-bearing server types. If any have a null + // logicalSessionTimeoutMinutes, then TopologyDescription.logicalSessionTimeoutMinutes MUST be + // set to null. 
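+    // The loop below applies that rule: start from undefined, reset to undefined as
+    // soon as any readable server reports a null timeout, and otherwise track the
+    // minimum timeout seen across readable servers.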
+ this.logicalSessionTimeoutMinutes = undefined; + for (const [, server] of this.servers) { + if (server.isReadable) { + if (server.logicalSessionTimeoutMinutes == null) { + // If any of the servers have a null logicalSessionsTimeout, then the whole topology does + this.logicalSessionTimeoutMinutes = undefined; + break; + } + + if (this.logicalSessionTimeoutMinutes == null) { + // First server with a non null logicalSessionsTimeout + this.logicalSessionTimeoutMinutes = server.logicalSessionTimeoutMinutes; + continue; + } + + // Always select the smaller of the: + // current server logicalSessionsTimeout and the topologies logicalSessionsTimeout + this.logicalSessionTimeoutMinutes = Math.min( + this.logicalSessionTimeoutMinutes, + server.logicalSessionTimeoutMinutes + ); + } + } + } + + /** + * Returns a new TopologyDescription based on the SrvPollingEvent + * @internal + */ + updateFromSrvPollingEvent(ev: SrvPollingEvent): TopologyDescription { + const newAddresses = ev.addresses(); + const serverDescriptions = new Map(this.servers); + for (const address of this.servers.keys()) { + if (newAddresses.has(address)) { + newAddresses.delete(address); + } else { + serverDescriptions.delete(address); + } + } + + if (serverDescriptions.size === this.servers.size && newAddresses.size === 0) { + return this; + } + + for (const [address, host] of newAddresses) { + serverDescriptions.set(address, new ServerDescription(host)); + } + + return new TopologyDescription( + this.type, + serverDescriptions, + this.setName, + this.maxSetVersion, + this.maxElectionId, + this.commonWireVersion, + { heartbeatFrequencyMS: this.heartbeatFrequencyMS, localThresholdMS: this.localThresholdMS } + ); + } + + /** + * Returns a copy of this description updated with a given ServerDescription + * @internal + */ + update(serverDescription: ServerDescription): TopologyDescription { + const address = serverDescription.address; + + // potentially mutated values + let { type: topologyType, setName, maxSetVersion, maxElectionId, commonWireVersion } = this; + + if (serverDescription.setName && setName && serverDescription.setName !== setName) { + serverDescription = new ServerDescription(address, undefined); + } + + const serverType = serverDescription.type; + const serverDescriptions = new Map(this.servers); + + // update common wire version + if (serverDescription.maxWireVersion !== 0) { + if (commonWireVersion == null) { + commonWireVersion = serverDescription.maxWireVersion; + } else { + commonWireVersion = Math.min(commonWireVersion, serverDescription.maxWireVersion); + } + } + + // update the actual server description + serverDescriptions.set(address, serverDescription); + + if (topologyType === TopologyType.Single) { + // once we are defined as single, that never changes + return new TopologyDescription( + TopologyType.Single, + serverDescriptions, + setName, + maxSetVersion, + maxElectionId, + commonWireVersion, + { heartbeatFrequencyMS: this.heartbeatFrequencyMS, localThresholdMS: this.localThresholdMS } + ); + } + + if (topologyType === TopologyType.Unknown) { + if (serverType === ServerType.Standalone && this.servers.size !== 1) { + serverDescriptions.delete(address); + } else { + topologyType = topologyTypeForServerType(serverType); + } + } + + if (topologyType === TopologyType.Sharded) { + if (!MONGOS_OR_UNKNOWN.has(serverType)) { + serverDescriptions.delete(address); + } + } + + if (topologyType === TopologyType.ReplicaSetNoPrimary) { + if (MONGOS_OR_STANDALONE.has(serverType)) { + serverDescriptions.delete(address); 
+ } + + if (serverType === ServerType.RSPrimary) { + const result = updateRsFromPrimary( + serverDescriptions, + serverDescription, + setName, + maxSetVersion, + maxElectionId + ); + + topologyType = result[0]; + setName = result[1]; + maxSetVersion = result[2]; + maxElectionId = result[3]; + } else if (NON_PRIMARY_RS_MEMBERS.has(serverType)) { + const result = updateRsNoPrimaryFromMember(serverDescriptions, serverDescription, setName); + topologyType = result[0]; + setName = result[1]; + } + } + + if (topologyType === TopologyType.ReplicaSetWithPrimary) { + if (MONGOS_OR_STANDALONE.has(serverType)) { + serverDescriptions.delete(address); + topologyType = checkHasPrimary(serverDescriptions); + } else if (serverType === ServerType.RSPrimary) { + const result = updateRsFromPrimary( + serverDescriptions, + serverDescription, + setName, + maxSetVersion, + maxElectionId + ); + + topologyType = result[0]; + setName = result[1]; + maxSetVersion = result[2]; + maxElectionId = result[3]; + } else if (NON_PRIMARY_RS_MEMBERS.has(serverType)) { + topologyType = updateRsWithPrimaryFromMember( + serverDescriptions, + serverDescription, + setName + ); + } else { + topologyType = checkHasPrimary(serverDescriptions); + } + } + + return new TopologyDescription( + topologyType, + serverDescriptions, + setName, + maxSetVersion, + maxElectionId, + commonWireVersion, + { heartbeatFrequencyMS: this.heartbeatFrequencyMS, localThresholdMS: this.localThresholdMS } + ); + } + + get error(): MongoError | undefined { + const descriptionsWithError = Array.from(this.servers.values()).filter( + (sd: ServerDescription) => sd.error + ); + + if (descriptionsWithError.length > 0) { + return descriptionsWithError[0].error; + } + } + + /** + * Determines if the topology description has any known servers + */ + get hasKnownServers(): boolean { + return Array.from(this.servers.values()).some( + (sd: ServerDescription) => sd.type !== ServerType.Unknown + ); + } + + /** + * Determines if this topology description has a data-bearing server available. + */ + get hasDataBearingServers(): boolean { + return Array.from(this.servers.values()).some((sd: ServerDescription) => sd.isDataBearing); + } + + /** + * Determines if the topology has a definition for the provided address + * @internal + */ + hasServer(address: string): boolean { + return this.servers.has(address); + } +} + +function topologyTypeForServerType(serverType: ServerType): TopologyType { + switch (serverType) { + case ServerType.Standalone: + return TopologyType.Single; + case ServerType.Mongos: + return TopologyType.Sharded; + case ServerType.RSPrimary: + return TopologyType.ReplicaSetWithPrimary; + case ServerType.RSOther: + case ServerType.RSSecondary: + return TopologyType.ReplicaSetNoPrimary; + default: + return TopologyType.Unknown; + } +} + +// TODO: improve these docs when ObjectId is properly typed +function compareObjectId(oid1: Document, oid2: Document): number { + if (oid1 == null) { + return -1; + } + + if (oid2 == null) { + return 1; + } + + if (oid1.id instanceof Buffer && oid2.id instanceof Buffer) { + const oid1Buffer = oid1.id; + const oid2Buffer = oid2.id; + return oid1Buffer.compare(oid2Buffer); + } + + const oid1String = oid1.toString(); + const oid2String = oid2.toString(); + return oid1String.localeCompare(oid2String); +} + +function updateRsFromPrimary( + serverDescriptions: Map, + serverDescription: ServerDescription, + setName?: string, + maxSetVersion?: number, + maxElectionId?: ObjectId +): [TopologyType, string?, number?, ObjectId?] 
{ + setName = setName || serverDescription.setName; + if (setName !== serverDescription.setName) { + serverDescriptions.delete(serverDescription.address); + return [checkHasPrimary(serverDescriptions), setName, maxSetVersion, maxElectionId]; + } + + const electionId = serverDescription.electionId ? serverDescription.electionId : null; + if (serverDescription.setVersion && electionId) { + if (maxSetVersion && maxElectionId) { + if ( + maxSetVersion > serverDescription.setVersion || + compareObjectId(maxElectionId, electionId) > 0 + ) { + // this primary is stale, we must remove it + serverDescriptions.set( + serverDescription.address, + new ServerDescription(serverDescription.address) + ); + + return [checkHasPrimary(serverDescriptions), setName, maxSetVersion, maxElectionId]; + } + } + + maxElectionId = serverDescription.electionId; + } + + if ( + serverDescription.setVersion != null && + (maxSetVersion == null || serverDescription.setVersion > maxSetVersion) + ) { + maxSetVersion = serverDescription.setVersion; + } + + // We've heard from the primary. Is it the same primary as before? + for (const [address, server] of serverDescriptions) { + if (server.type === ServerType.RSPrimary && server.address !== serverDescription.address) { + // Reset old primary's type to Unknown. + serverDescriptions.set(address, new ServerDescription(server.address)); + + // There can only be one primary + break; + } + } + + // Discover new hosts from this primary's response. + serverDescription.allHosts.forEach((address: string) => { + if (!serverDescriptions.has(address)) { + serverDescriptions.set(address, new ServerDescription(address)); + } + }); + + // Remove hosts not in the response. + const currentAddresses = Array.from(serverDescriptions.keys()); + const responseAddresses = serverDescription.allHosts; + currentAddresses + .filter((addr: string) => responseAddresses.indexOf(addr) === -1) + .forEach((address: string) => { + serverDescriptions.delete(address); + }); + + return [checkHasPrimary(serverDescriptions), setName, maxSetVersion, maxElectionId]; +} + +function updateRsWithPrimaryFromMember( + serverDescriptions: Map, + serverDescription: ServerDescription, + setName?: string +): TopologyType { + if (setName == null) { + // TODO(NODE-3483): should be an appropriate runtime error + throw new MongoRuntimeError('Argument "setName" is required if connected to a replica set'); + } + + if ( + setName !== serverDescription.setName || + (serverDescription.me && serverDescription.address !== serverDescription.me) + ) { + serverDescriptions.delete(serverDescription.address); + } + + return checkHasPrimary(serverDescriptions); +} + +function updateRsNoPrimaryFromMember( + serverDescriptions: Map, + serverDescription: ServerDescription, + setName?: string +): [TopologyType, string?] 
{ + const topologyType = TopologyType.ReplicaSetNoPrimary; + setName = setName || serverDescription.setName; + if (setName !== serverDescription.setName) { + serverDescriptions.delete(serverDescription.address); + return [topologyType, setName]; + } + + serverDescription.allHosts.forEach((address: string) => { + if (!serverDescriptions.has(address)) { + serverDescriptions.set(address, new ServerDescription(address)); + } + }); + + if (serverDescription.me && serverDescription.address !== serverDescription.me) { + serverDescriptions.delete(serverDescription.address); + } + + return [topologyType, setName]; +} + +function checkHasPrimary(serverDescriptions: Map): TopologyType { + for (const serverDescription of serverDescriptions.values()) { + if (serverDescription.type === ServerType.RSPrimary) { + return TopologyType.ReplicaSetWithPrimary; + } + } + + return TopologyType.ReplicaSetNoPrimary; +} diff --git a/node_modules/mongodb/src/sessions.ts b/node_modules/mongodb/src/sessions.ts new file mode 100644 index 000000000..ec865f1ea --- /dev/null +++ b/node_modules/mongodb/src/sessions.ts @@ -0,0 +1,1058 @@ +import { PromiseProvider } from './promise_provider'; +import { Binary, Long, Timestamp, Document } from './bson'; +import { ReadPreference } from './read_preference'; +import { isTransactionCommand, TxnState, Transaction, TransactionOptions } from './transactions'; +import { _advanceClusterTime, ClusterTime, TopologyType } from './sdam/common'; +import { isSharded } from './cmap/wire_protocol/shared'; +import { + MongoError, + MongoInvalidArgumentError, + isRetryableError, + MongoCompatibilityError, + MongoNetworkError, + MongoWriteConcernError, + MONGODB_ERROR_CODES, + MongoServerError, + MongoDriverError, + MongoAPIError, + AnyError, + MongoExpiredSessionError, + MongoTransactionError, + MongoRuntimeError +} from './error'; +import { + now, + calculateDurationInMs, + Callback, + isPromiseLike, + uuidV4, + maxWireVersion, + maybePromise +} from './utils'; +import type { Topology } from './sdam/topology'; +import type { MongoOptions } from './mongo_client'; +import { executeOperation } from './operations/execute_operation'; +import { RunAdminCommandOperation } from './operations/run_command'; +import type { AbstractCursor } from './cursor/abstract_cursor'; +import type { CommandOptions } from './cmap/connection'; +import { Connection } from './cmap/connection'; +import { ConnectionPoolMetrics } from './cmap/metrics'; +import type { WriteConcern } from './write_concern'; +import { TypedEventEmitter } from './mongo_types'; +import { ReadConcernLevel } from './read_concern'; + +const minWireVersionForShardedTransactions = 8; + +function assertAlive(session: ClientSession, callback?: Callback): boolean { + if (session.serverSession == null) { + const error = new MongoExpiredSessionError(); + if (typeof callback === 'function') { + callback(error); + return false; + } + + throw error; + } + + return true; +} + +/** @public */ +export interface ClientSessionOptions { + /** Whether causal consistency should be enabled on this session */ + causalConsistency?: boolean; + /** Whether all read operations should be read from the same snapshot for this session (NOTE: not compatible with `causalConsistency=true`) */ + snapshot?: boolean; + /** The default TransactionOptions to use for transactions started on this session. 
*/ + defaultTransactionOptions?: TransactionOptions; + + /** @internal */ + owner?: symbol | AbstractCursor; + /** @internal */ + explicit?: boolean; + /** @internal */ + initialClusterTime?: ClusterTime; +} + +/** @public */ +export type WithTransactionCallback = (session: ClientSession) => Promise; + +/** @public */ +export type ClientSessionEvents = { + ended(session: ClientSession): void; +}; + +/** @internal */ +const kServerSession = Symbol('serverSession'); +/** @internal */ +const kSnapshotTime = Symbol('snapshotTime'); +/** @internal */ +const kSnapshotEnabled = Symbol('snapshotEnabled'); +/** @internal */ +const kPinnedConnection = Symbol('pinnedConnection'); + +/** @public */ +export interface EndSessionOptions { + /** + * An optional error which caused the call to end this session + * @internal + */ + error?: AnyError; + force?: boolean; + forceClear?: boolean; +} + +/** + * A class representing a client session on the server + * + * NOTE: not meant to be instantiated directly. + * @public + */ +export class ClientSession extends TypedEventEmitter { + /** @internal */ + topology: Topology; + /** @internal */ + sessionPool: ServerSessionPool; + hasEnded: boolean; + clientOptions?: MongoOptions; + supports: { causalConsistency: boolean }; + clusterTime?: ClusterTime; + operationTime?: Timestamp; + explicit: boolean; + /** @internal */ + owner?: symbol | AbstractCursor; + defaultTransactionOptions: TransactionOptions; + transaction: Transaction; + /** @internal */ + [kServerSession]?: ServerSession; + /** @internal */ + [kSnapshotTime]?: Timestamp; + /** @internal */ + [kSnapshotEnabled] = false; + /** @internal */ + [kPinnedConnection]?: Connection; + + /** + * Create a client session. + * @internal + * @param topology - The current client's topology (Internal Class) + * @param sessionPool - The server session pool (Internal Class) + * @param options - Optional settings + * @param clientOptions - Optional settings provided when creating a MongoClient + */ + constructor( + topology: Topology, + sessionPool: ServerSessionPool, + options: ClientSessionOptions, + clientOptions?: MongoOptions + ) { + super(); + + if (topology == null) { + // TODO(NODE-3483) + throw new MongoRuntimeError('ClientSession requires a topology'); + } + + if (sessionPool == null || !(sessionPool instanceof ServerSessionPool)) { + // TODO(NODE-3483) + throw new MongoRuntimeError('ClientSession requires a ServerSessionPool'); + } + + options = options ?? 
{}; + + if (options.snapshot === true) { + this[kSnapshotEnabled] = true; + if (options.causalConsistency === true) { + throw new MongoInvalidArgumentError( + 'Properties "causalConsistency" and "snapshot" are mutually exclusive' + ); + } + } + + this.topology = topology; + this.sessionPool = sessionPool; + this.hasEnded = false; + this.clientOptions = clientOptions; + this[kServerSession] = undefined; + + this.supports = { + causalConsistency: options.snapshot !== true && options.causalConsistency !== false + }; + + this.clusterTime = options.initialClusterTime; + + this.operationTime = undefined; + this.explicit = !!options.explicit; + this.owner = options.owner; + this.defaultTransactionOptions = Object.assign({}, options.defaultTransactionOptions); + this.transaction = new Transaction(); + } + + /** The server id associated with this session */ + get id(): ServerSessionId | undefined { + return this.serverSession?.id; + } + + get serverSession(): ServerSession { + if (this[kServerSession] == null) { + this[kServerSession] = this.sessionPool.acquire(); + } + + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + return this[kServerSession]!; + } + + /** Whether or not this session is configured for snapshot reads */ + get snapshotEnabled(): boolean { + return this[kSnapshotEnabled]; + } + + get loadBalanced(): boolean { + return this.topology.description.type === TopologyType.LoadBalanced; + } + + /** @internal */ + get pinnedConnection(): Connection | undefined { + return this[kPinnedConnection]; + } + + /** @internal */ + pin(conn: Connection): void { + if (this[kPinnedConnection]) { + throw TypeError('Cannot pin multiple connections to the same session'); + } + + this[kPinnedConnection] = conn; + conn.emit( + Connection.PINNED, + this.inTransaction() ? ConnectionPoolMetrics.TXN : ConnectionPoolMetrics.CURSOR + ); + } + + /** @internal */ + unpin(options?: { force?: boolean; forceClear?: boolean; error?: AnyError }): void { + if (this.loadBalanced) { + return maybeClearPinnedConnection(this, options); + } + + this.transaction.unpinServer(); + } + + get isPinned(): boolean { + return this.loadBalanced ? !!this[kPinnedConnection] : this.transaction.isPinned; + } + + /** + * Ends this session on the server + * + * @param options - Optional settings. 
Currently reserved for future use + * @param callback - Optional callback for completion of this operation + */ + endSession(): Promise; + endSession(callback: Callback): void; + endSession(options: EndSessionOptions): Promise; + endSession(options: EndSessionOptions, callback: Callback): void; + endSession( + options?: EndSessionOptions | Callback, + callback?: Callback + ): void | Promise { + if (typeof options === 'function') (callback = options), (options = {}); + const finalOptions = { force: true, ...options }; + + return maybePromise(callback, done => { + if (this.hasEnded) { + maybeClearPinnedConnection(this, finalOptions); + return done(); + } + + const completeEndSession = () => { + maybeClearPinnedConnection(this, finalOptions); + + // release the server session back to the pool + this.sessionPool.release(this.serverSession); + this[kServerSession] = undefined; + + // mark the session as ended, and emit a signal + this.hasEnded = true; + this.emit('ended', this); + + // spec indicates that we should ignore all errors for `endSessions` + done(); + }; + + if (this.serverSession && this.inTransaction()) { + this.abortTransaction(err => { + if (err) return done(err); + completeEndSession(); + }); + + return; + } + + completeEndSession(); + }); + } + + /** + * Advances the operationTime for a ClientSession. + * + * @param operationTime - the `BSON.Timestamp` of the operation type it is desired to advance to + */ + advanceOperationTime(operationTime: Timestamp): void { + if (this.operationTime == null) { + this.operationTime = operationTime; + return; + } + + if (operationTime.greaterThan(this.operationTime)) { + this.operationTime = operationTime; + } + } + + /** + * Advances the clusterTime for a ClientSession to the provided clusterTime of another ClientSession + * + * @param clusterTime - the $clusterTime returned by the server from another session in the form of a document containing the `BSON.Timestamp` clusterTime and signature + */ + advanceClusterTime(clusterTime: ClusterTime): void { + if (!clusterTime || typeof clusterTime !== 'object') { + throw new MongoInvalidArgumentError('input cluster time must be an object'); + } + if (!clusterTime.clusterTime || clusterTime.clusterTime._bsontype !== 'Timestamp') { + throw new MongoInvalidArgumentError( + 'input cluster time "clusterTime" property must be a valid BSON Timestamp' + ); + } + if ( + !clusterTime.signature || + clusterTime.signature.hash?._bsontype !== 'Binary' || + (typeof clusterTime.signature.keyId !== 'number' && + clusterTime.signature.keyId?._bsontype !== 'Long') // apparently we decode the key to number? + ) { + throw new MongoInvalidArgumentError( + 'input cluster time must have a valid "signature" property with BSON Binary hash and BSON Long keyId' + ); + } + + _advanceClusterTime(this, clusterTime); + } + + /** + * Used to determine if this session equals another + * + * @param session - The session to compare to + */ + equals(session: ClientSession): boolean { + if (!(session instanceof ClientSession)) { + return false; + } + + if (this.id == null || session.id == null) { + return false; + } + + return this.id.id.buffer.equals(session.id.id.buffer); + } + + /** Increment the transaction number on the internal ServerSession */ + incrementTransactionNumber(): void { + if (this.serverSession) { + this.serverSession.txnNumber = + typeof this.serverSession.txnNumber === 'number' ? 
this.serverSession.txnNumber + 1 : 0; + } + } + + /** @returns whether this session is currently in a transaction or not */ + inTransaction(): boolean { + return this.transaction.isActive; + } + + /** + * Starts a new transaction with the given options. + * + * @param options - Options for the transaction + */ + startTransaction(options?: TransactionOptions): void { + if (this[kSnapshotEnabled]) { + throw new MongoCompatibilityError('Transactions are not allowed with snapshot sessions'); + } + + assertAlive(this); + if (this.inTransaction()) { + throw new MongoTransactionError('Transaction already in progress'); + } + + if (this.isPinned && this.transaction.isCommitted) { + this.unpin(); + } + + const topologyMaxWireVersion = maxWireVersion(this.topology); + if ( + isSharded(this.topology) && + topologyMaxWireVersion != null && + topologyMaxWireVersion < minWireVersionForShardedTransactions + ) { + throw new MongoCompatibilityError( + 'Transactions are not supported on sharded clusters in MongoDB < 4.2.' + ); + } + + // increment txnNumber + this.incrementTransactionNumber(); + // create transaction state + this.transaction = new Transaction({ + readConcern: + options?.readConcern ?? + this.defaultTransactionOptions.readConcern ?? + this.clientOptions?.readConcern, + writeConcern: + options?.writeConcern ?? + this.defaultTransactionOptions.writeConcern ?? + this.clientOptions?.writeConcern, + readPreference: + options?.readPreference ?? + this.defaultTransactionOptions.readPreference ?? + this.clientOptions?.readPreference, + maxCommitTimeMS: options?.maxCommitTimeMS ?? this.defaultTransactionOptions.maxCommitTimeMS + }); + + this.transaction.transition(TxnState.STARTING_TRANSACTION); + } + + /** + * Commits the currently active transaction in this session. + * + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + commitTransaction(): Promise; + commitTransaction(callback: Callback): void; + commitTransaction(callback?: Callback): Promise | void { + return maybePromise(callback, cb => endTransaction(this, 'commitTransaction', cb)); + } + + /** + * Aborts the currently active transaction in this session. + * + * @param callback - An optional callback, a Promise will be returned if none is provided + */ + abortTransaction(): Promise; + abortTransaction(callback: Callback): void; + abortTransaction(callback?: Callback): Promise | void { + return maybePromise(callback, cb => endTransaction(this, 'abortTransaction', cb)); + } + + /** + * This is here to ensure that ClientSession is never serialized to BSON. + */ + toBSON(): never { + throw new MongoRuntimeError('ClientSession cannot be serialized to BSON.'); + } + + /** + * Runs a provided lambda within a transaction, retrying either the commit operation + * or entire transaction as needed (and when the error permits) to better ensure that + * the transaction can complete successfully. + * + * IMPORTANT: This method requires the user to return a Promise, all lambdas that do not + * return a Promise will result in undefined behavior. 
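 *
 * Illustrative usage sketch (assumes a connected `client`; the database and collection
 * names are placeholders, and the callback must return a Promise, as noted above):
 *
 *   const session = client.startSession();
 *   try {
 *     await session.withTransaction(async () => {
 *       await client.db('app').collection('orders').insertOne({ ok: true }, { session });
 *     });
 *   } finally {
 *     await session.endSession();
 *   }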
+ * + * @param fn - A lambda to run within a transaction + * @param options - Optional settings for the transaction + */ + withTransaction( + fn: WithTransactionCallback, + options?: TransactionOptions + ): ReturnType { + const startTime = now(); + return attemptTransaction(this, startTime, fn, options); + } +} + +const MAX_WITH_TRANSACTION_TIMEOUT = 120000; +const NON_DETERMINISTIC_WRITE_CONCERN_ERRORS = new Set([ + 'CannotSatisfyWriteConcern', + 'UnknownReplWriteConcern', + 'UnsatisfiableWriteConcern' +]); + +function hasNotTimedOut(startTime: number, max: number) { + return calculateDurationInMs(startTime) < max; +} + +function isUnknownTransactionCommitResult(err: MongoError) { + const isNonDeterministicWriteConcernError = + err instanceof MongoServerError && + err.codeName && + NON_DETERMINISTIC_WRITE_CONCERN_ERRORS.has(err.codeName); + + return ( + isMaxTimeMSExpiredError(err) || + (!isNonDeterministicWriteConcernError && + err.code !== MONGODB_ERROR_CODES.UnsatisfiableWriteConcern && + err.code !== MONGODB_ERROR_CODES.UnknownReplWriteConcern) + ); +} + +export function maybeClearPinnedConnection( + session: ClientSession, + options?: EndSessionOptions +): void { + // unpin a connection if it has been pinned + const conn = session[kPinnedConnection]; + const error = options?.error; + + if ( + session.inTransaction() && + error && + error instanceof MongoError && + error.hasErrorLabel('TransientTransactionError') + ) { + return; + } + + // NOTE: the spec talks about what to do on a network error only, but the tests seem to + // to validate that we don't unpin on _all_ errors? + if (conn) { + const servers = Array.from(session.topology.s.servers.values()); + const loadBalancer = servers[0]; + + if (options?.error == null || options?.force) { + loadBalancer.s.pool.checkIn(conn); + conn.emit( + Connection.UNPINNED, + session.transaction.state !== TxnState.NO_TRANSACTION + ? 
ConnectionPoolMetrics.TXN + : ConnectionPoolMetrics.CURSOR + ); + + if (options?.forceClear) { + loadBalancer.s.pool.clear(conn.serviceId); + } + } + + session[kPinnedConnection] = undefined; + } +} + +function isMaxTimeMSExpiredError(err: MongoError) { + if (err == null || !(err instanceof MongoServerError)) { + return false; + } + + return ( + err.code === MONGODB_ERROR_CODES.MaxTimeMSExpired || + (err.writeConcernError && err.writeConcernError.code === MONGODB_ERROR_CODES.MaxTimeMSExpired) + ); +} + +function attemptTransactionCommit( + session: ClientSession, + startTime: number, + fn: WithTransactionCallback, + options?: TransactionOptions +): Promise { + return session.commitTransaction().catch((err: MongoError) => { + if ( + err instanceof MongoError && + hasNotTimedOut(startTime, MAX_WITH_TRANSACTION_TIMEOUT) && + !isMaxTimeMSExpiredError(err) + ) { + if (err.hasErrorLabel('UnknownTransactionCommitResult')) { + return attemptTransactionCommit(session, startTime, fn, options); + } + + if (err.hasErrorLabel('TransientTransactionError')) { + return attemptTransaction(session, startTime, fn, options); + } + } + + throw err; + }); +} + +const USER_EXPLICIT_TXN_END_STATES = new Set([ + TxnState.NO_TRANSACTION, + TxnState.TRANSACTION_COMMITTED, + TxnState.TRANSACTION_ABORTED +]); + +function userExplicitlyEndedTransaction(session: ClientSession) { + return USER_EXPLICIT_TXN_END_STATES.has(session.transaction.state); +} + +function attemptTransaction( + session: ClientSession, + startTime: number, + fn: WithTransactionCallback, + options?: TransactionOptions +): Promise { + const Promise = PromiseProvider.get(); + session.startTransaction(options); + + let promise; + try { + promise = fn(session); + } catch (err) { + promise = Promise.reject(err); + } + + if (!isPromiseLike(promise)) { + session.abortTransaction(); + throw new MongoInvalidArgumentError( + 'Function provided to `withTransaction` must return a Promise' + ); + } + + return promise.then( + () => { + if (userExplicitlyEndedTransaction(session)) { + return; + } + + return attemptTransactionCommit(session, startTime, fn, options); + }, + err => { + function maybeRetryOrThrow(err: MongoError): Promise { + if ( + err instanceof MongoError && + err.hasErrorLabel('TransientTransactionError') && + hasNotTimedOut(startTime, MAX_WITH_TRANSACTION_TIMEOUT) + ) { + return attemptTransaction(session, startTime, fn, options); + } + + if (isMaxTimeMSExpiredError(err)) { + err.addErrorLabel('UnknownTransactionCommitResult'); + } + + throw err; + } + + if (session.transaction.isActive) { + return session.abortTransaction().then(() => maybeRetryOrThrow(err)); + } + + return maybeRetryOrThrow(err); + } + ); +} + +function endTransaction(session: ClientSession, commandName: string, callback: Callback) { + if (!assertAlive(session, callback)) { + // checking result in case callback was called + return; + } + + // handle any initial problematic cases + const txnState = session.transaction.state; + + if (txnState === TxnState.NO_TRANSACTION) { + callback(new MongoTransactionError('No transaction started')); + return; + } + + if (commandName === 'commitTransaction') { + if ( + txnState === TxnState.STARTING_TRANSACTION || + txnState === TxnState.TRANSACTION_COMMITTED_EMPTY + ) { + // the transaction was never started, we can safely exit here + session.transaction.transition(TxnState.TRANSACTION_COMMITTED_EMPTY); + callback(); + return; + } + + if (txnState === TxnState.TRANSACTION_ABORTED) { + callback( + new MongoTransactionError('Cannot call 
commitTransaction after calling abortTransaction') + ); + return; + } + } else { + if (txnState === TxnState.STARTING_TRANSACTION) { + // the transaction was never started, we can safely exit here + session.transaction.transition(TxnState.TRANSACTION_ABORTED); + callback(); + return; + } + + if (txnState === TxnState.TRANSACTION_ABORTED) { + callback(new MongoTransactionError('Cannot call abortTransaction twice')); + return; + } + + if ( + txnState === TxnState.TRANSACTION_COMMITTED || + txnState === TxnState.TRANSACTION_COMMITTED_EMPTY + ) { + callback( + new MongoTransactionError('Cannot call abortTransaction after calling commitTransaction') + ); + return; + } + } + + // construct and send the command + const command: Document = { [commandName]: 1 }; + + // apply a writeConcern if specified + let writeConcern; + if (session.transaction.options.writeConcern) { + writeConcern = Object.assign({}, session.transaction.options.writeConcern); + } else if (session.clientOptions && session.clientOptions.writeConcern) { + writeConcern = { w: session.clientOptions.writeConcern.w }; + } + + if (txnState === TxnState.TRANSACTION_COMMITTED) { + writeConcern = Object.assign({ wtimeout: 10000 }, writeConcern, { w: 'majority' }); + } + + if (writeConcern) { + Object.assign(command, { writeConcern }); + } + + if (commandName === 'commitTransaction' && session.transaction.options.maxTimeMS) { + Object.assign(command, { maxTimeMS: session.transaction.options.maxTimeMS }); + } + + function commandHandler(e?: MongoError, r?: Document) { + if (commandName !== 'commitTransaction') { + session.transaction.transition(TxnState.TRANSACTION_ABORTED); + if (session.loadBalanced) { + maybeClearPinnedConnection(session, { force: false }); + } + + // The spec indicates that we should ignore all errors on `abortTransaction` + return callback(); + } + + session.transaction.transition(TxnState.TRANSACTION_COMMITTED); + if (e) { + if ( + e instanceof MongoNetworkError || + e instanceof MongoWriteConcernError || + isRetryableError(e) || + isMaxTimeMSExpiredError(e) + ) { + if (isUnknownTransactionCommitResult(e)) { + e.addErrorLabel('UnknownTransactionCommitResult'); + + // per txns spec, must unpin session in this case + session.unpin({ error: e }); + } + } else if (e.hasErrorLabel('TransientTransactionError')) { + session.unpin({ error: e }); + } + } + + callback(e, r); + } + + // Assumption here that commandName is "commitTransaction" or "abortTransaction" + if (session.transaction.recoveryToken) { + command.recoveryToken = session.transaction.recoveryToken; + } + + // send the command + executeOperation( + session.topology, + new RunAdminCommandOperation(undefined, command, { + session, + readPreference: ReadPreference.primary, + bypassPinningCheck: true + }), + (err, reply) => { + if (command.abortTransaction) { + // always unpin on abort regardless of command outcome + session.unpin(); + } + + if (err && isRetryableError(err as MongoError)) { + // SPEC-1185: apply majority write concern when retrying commitTransaction + if (command.commitTransaction) { + // per txns spec, must unpin session in this case + session.unpin({ force: true }); + + command.writeConcern = Object.assign({ wtimeout: 10000 }, command.writeConcern, { + w: 'majority' + }); + } + + return executeOperation( + session.topology, + new RunAdminCommandOperation(undefined, command, { + session, + readPreference: ReadPreference.primary, + bypassPinningCheck: true + }), + (_err, _reply) => commandHandler(_err as MongoError, _reply) + ); + } + + 
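// No retryable error was seen, so hand the first attempt's outcome to commandHandler below.
// For orientation (editor's sketch), the command built above looks roughly like
//   { commitTransaction: 1, writeConcern: { ... }, maxTimeMS: ..., recoveryToken: { ... } }
// (or { abortTransaction: 1, ... }); writeConcern is upgraded to { w: 'majority', wtimeout: 10000 }
// only when a commit is being retried, and lsid/txnNumber/autocommit are appended by applySession.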
commandHandler(err as MongoError, reply); + } + ); +} + +/** @public */ +export type ServerSessionId = { id: Binary }; + +/** + * Reflects the existence of a session on the server. Can be reused by the session pool. + * WARNING: not meant to be instantiated directly. For internal use only. + * @public + */ +export class ServerSession { + id: ServerSessionId; + lastUse: number; + txnNumber: number; + isDirty: boolean; + + /** @internal */ + constructor() { + this.id = { id: new Binary(uuidV4(), Binary.SUBTYPE_UUID) }; + this.lastUse = now(); + this.txnNumber = 0; + this.isDirty = false; + } + + /** + * Determines if the server session has timed out. + * + * @param sessionTimeoutMinutes - The server's "logicalSessionTimeoutMinutes" + */ + hasTimedOut(sessionTimeoutMinutes: number): boolean { + // Take the difference of the lastUse timestamp and now, which will result in a value in + // milliseconds, and then convert milliseconds to minutes to compare to `sessionTimeoutMinutes` + const idleTimeMinutes = Math.round( + ((calculateDurationInMs(this.lastUse) % 86400000) % 3600000) / 60000 + ); + + return idleTimeMinutes > sessionTimeoutMinutes - 1; + } +} + +/** + * Maintains a pool of Server Sessions. + * For internal use only + * @internal + */ +export class ServerSessionPool { + topology: Topology; + sessions: ServerSession[]; + + constructor(topology: Topology) { + if (topology == null) { + throw new MongoRuntimeError('ServerSessionPool requires a topology'); + } + + this.topology = topology; + this.sessions = []; + } + + /** Ends all sessions in the session pool */ + endAllPooledSessions(callback?: Callback): void { + if (this.sessions.length) { + this.topology.endSessions( + this.sessions.map((session: ServerSession) => session.id), + () => { + this.sessions = []; + if (typeof callback === 'function') { + callback(); + } + } + ); + + return; + } + + if (typeof callback === 'function') { + callback(); + } + } + + /** + * Acquire a Server Session from the pool. + * Iterates through each session in the pool, removing any stale sessions + * along the way. The first non-stale session found is removed from the + * pool and returned. If no non-stale session is found, a new ServerSession is created. + */ + acquire(): ServerSession { + const sessionTimeoutMinutes = this.topology.logicalSessionTimeoutMinutes || 10; + + while (this.sessions.length) { + const session = this.sessions.shift(); + if (session && (this.topology.loadBalanced || !session.hasTimedOut(sessionTimeoutMinutes))) { + return session; + } + } + + return new ServerSession(); + } + + /** + * Release a session to the session pool + * Adds the session back to the session pool if the session has not timed out yet. + * This method also removes any stale sessions from the pool. 
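 *
 * Typical pairing with acquire() (editor's sketch; `pool` is an illustrative variable):
 *
 *   const serverSession = pool.acquire();   // reused from the pool, or newly created
 *   try {
 *     // ... use serverSession.id as the command's lsid ...
 *   } finally {
 *     pool.release(serverSession);          // returned to the pool unless stale or dirty
 *   }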
+ * + * @param session - The session to release to the pool + */ + release(session: ServerSession): void { + const sessionTimeoutMinutes = this.topology.logicalSessionTimeoutMinutes; + + if (this.topology.loadBalanced && !sessionTimeoutMinutes) { + this.sessions.unshift(session); + } + + if (!sessionTimeoutMinutes) { + return; + } + + while (this.sessions.length) { + const pooledSession = this.sessions[this.sessions.length - 1]; + if (pooledSession.hasTimedOut(sessionTimeoutMinutes)) { + this.sessions.pop(); + } else { + break; + } + } + + if (!session.hasTimedOut(sessionTimeoutMinutes)) { + if (session.isDirty) { + return; + } + + // otherwise, readd this session to the session pool + this.sessions.unshift(session); + } + } +} + +// TODO: this should be codified in command construction +// @see https://github.com/mongodb/specifications/blob/master/source/read-write-concern/read-write-concern.rst#read-concern +export function commandSupportsReadConcern(command: Document, options?: Document): boolean { + if (command.aggregate || command.count || command.distinct || command.find || command.geoNear) { + return true; + } + + if ( + command.mapReduce && + options && + options.out && + (options.out.inline === 1 || options.out === 'inline') + ) { + return true; + } + + return false; +} + +/** + * Optionally decorate a command with sessions specific keys + * + * @param session - the session tracking transaction state + * @param command - the command to decorate + * @param options - Optional settings passed to calling operation + */ +export function applySession( + session: ClientSession, + command: Document, + options?: CommandOptions +): MongoDriverError | undefined { + // TODO: merge this with `assertAlive`, did not want to throw a try/catch here + if (session.hasEnded) { + return new MongoExpiredSessionError(); + } + + const serverSession = session.serverSession; + if (serverSession == null) { + return new MongoRuntimeError('Unable to acquire server session'); + } + + // SPEC-1019: silently ignore explicit session with unacknowledged write for backwards compatibility + // FIXME: NODE-2781, this check for write concern shouldn't be happening here, but instead during command construction + if (options && options.writeConcern && (options.writeConcern as WriteConcern).w === 0) { + if (session && session.explicit) { + return new MongoAPIError('Cannot have explicit session with unacknowledged writes'); + } + return; + } + + // mark the last use of this session, and apply the `lsid` + serverSession.lastUse = now(); + command.lsid = serverSession.id; + + // first apply non-transaction-specific sessions data + const inTransaction = session.inTransaction() || isTransactionCommand(command); + const isRetryableWrite = options?.willRetryWrite || false; + + if (serverSession.txnNumber && (isRetryableWrite || inTransaction)) { + command.txnNumber = Long.fromNumber(serverSession.txnNumber); + } + + if (!inTransaction) { + if (session.transaction.state !== TxnState.NO_TRANSACTION) { + session.transaction.transition(TxnState.NO_TRANSACTION); + } + + if ( + session.supports.causalConsistency && + session.operationTime && + commandSupportsReadConcern(command, options) + ) { + command.readConcern = command.readConcern || {}; + Object.assign(command.readConcern, { afterClusterTime: session.operationTime }); + } else if (session[kSnapshotEnabled]) { + command.readConcern = command.readConcern || { level: ReadConcernLevel.snapshot }; + if (session[kSnapshotTime] != null) { + Object.assign(command.readConcern, { 
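// Editor's note (sketch): kSnapshotTime holds the atClusterTime captured from the first
// snapshot-capable response (see updateSessionFromResponse below); re-sending it here pins
// later reads to the same point in time, yielding a readConcern roughly of the form
// { level: 'snapshot', atClusterTime: <Timestamp> }.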
atClusterTime: session[kSnapshotTime] }); + } + } + + return; + } + + // now attempt to apply transaction-specific sessions data + + // `autocommit` must always be false to differentiate from retryable writes + command.autocommit = false; + + if (session.transaction.state === TxnState.STARTING_TRANSACTION) { + session.transaction.transition(TxnState.TRANSACTION_IN_PROGRESS); + command.startTransaction = true; + + const readConcern = + session.transaction.options.readConcern || session?.clientOptions?.readConcern; + if (readConcern) { + command.readConcern = readConcern; + } + + if (session.supports.causalConsistency && session.operationTime) { + command.readConcern = command.readConcern || {}; + Object.assign(command.readConcern, { afterClusterTime: session.operationTime }); + } + } +} + +export function updateSessionFromResponse(session: ClientSession, document: Document): void { + if (document.$clusterTime) { + _advanceClusterTime(session, document.$clusterTime); + } + + if (document.operationTime && session && session.supports.causalConsistency) { + session.advanceOperationTime(document.operationTime); + } + + if (document.recoveryToken && session && session.inTransaction()) { + session.transaction._recoveryToken = document.recoveryToken; + } + + if (session?.[kSnapshotEnabled] && session[kSnapshotTime] == null) { + // find and aggregate commands return atClusterTime on the cursor + // distinct includes it in the response body + const atClusterTime = document.cursor?.atClusterTime || document.atClusterTime; + if (atClusterTime) { + session[kSnapshotTime] = atClusterTime; + } + } +} diff --git a/node_modules/mongodb/src/sort.ts b/node_modules/mongodb/src/sort.ts new file mode 100644 index 000000000..6b766b545 --- /dev/null +++ b/node_modules/mongodb/src/sort.ts @@ -0,0 +1,132 @@ +import { MongoInvalidArgumentError } from './error'; + +/** @public */ +export type SortDirection = + | 1 + | -1 + | 'asc' + | 'desc' + | 'ascending' + | 'descending' + | { $meta: string }; + +/** @public */ +export type Sort = + | string + | Exclude + | string[] + | { [key: string]: SortDirection } + | Map + | [string, SortDirection][] + | [string, SortDirection]; + +/** Below stricter types were created for sort that correspond with type that the cmd takes */ + +/** @internal */ +export type SortDirectionForCmd = 1 | -1 | { $meta: string }; + +/** @internal */ +export type SortForCmd = Map; + +/** @internal */ +type SortPairForCmd = [string, SortDirectionForCmd]; + +/** @internal */ +function prepareDirection(direction: any = 1): SortDirectionForCmd { + const value = `${direction}`.toLowerCase(); + if (isMeta(direction)) return direction; + switch (value) { + case 'ascending': + case 'asc': + case '1': + return 1; + case 'descending': + case 'desc': + case '-1': + return -1; + default: + throw new MongoInvalidArgumentError(`Invalid sort direction: ${JSON.stringify(direction)}`); + } +} + +/** @internal */ +function isMeta(t: SortDirection): t is { $meta: string } { + return typeof t === 'object' && t != null && '$meta' in t && typeof t.$meta === 'string'; +} + +/** @internal */ +function isPair(t: Sort): t is [string, SortDirection] { + if (Array.isArray(t) && t.length === 2) { + try { + prepareDirection(t[1]); + return true; + } catch (e) { + return false; + } + } + return false; +} + +function isDeep(t: Sort): t is [string, SortDirection][] { + return Array.isArray(t) && Array.isArray(t[0]); +} + +function isMap(t: Sort): t is Map { + return t instanceof Map && t.size > 0; +} + +/** @internal */ +function 
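// Editor's sketch of inputs that formatSort (defined at the bottom of this file)
// normalizes to the same Map for the server, here sorting ascending on "a":
//   formatSort('a')            -> Map { 'a' => 1 }
//   formatSort(['a'])          -> Map { 'a' => 1 }
//   formatSort({ a: 1 })       -> Map { 'a' => 1 }
//   formatSort(['a', 'asc'])   -> Map { 'a' => 1 }
//   formatSort([['a', 'asc']]) -> Map { 'a' => 1 }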
pairToMap(v: [string, SortDirection]): SortForCmd { + return new Map([[`${v[0]}`, prepareDirection([v[1]])]]); +} + +/** @internal */ +function deepToMap(t: [string, SortDirection][]): SortForCmd { + const sortEntries: SortPairForCmd[] = t.map(([k, v]) => [`${k}`, prepareDirection(v)]); + return new Map(sortEntries); +} + +/** @internal */ +function stringsToMap(t: string[]): SortForCmd { + const sortEntries: SortPairForCmd[] = t.map(key => [`${key}`, 1]); + return new Map(sortEntries); +} + +/** @internal */ +function objectToMap(t: { [key: string]: SortDirection }): SortForCmd { + const sortEntries: SortPairForCmd[] = Object.entries(t).map(([k, v]) => [ + `${k}`, + prepareDirection(v) + ]); + return new Map(sortEntries); +} + +/** @internal */ +function mapToMap(t: Map): SortForCmd { + const sortEntries: SortPairForCmd[] = Array.from(t).map(([k, v]) => [ + `${k}`, + prepareDirection(v) + ]); + return new Map(sortEntries); +} + +/** converts a Sort type into a type that is valid for the server (SortForCmd) */ +export function formatSort( + sort: Sort | undefined, + direction?: SortDirection +): SortForCmd | undefined { + if (sort == null) return undefined; + if (typeof sort === 'string') return new Map([[sort, prepareDirection(direction)]]); + if (typeof sort !== 'object') { + throw new MongoInvalidArgumentError( + `Invalid sort format: ${JSON.stringify(sort)} Sort must be a valid object` + ); + } + if (!Array.isArray(sort)) { + return isMap(sort) ? mapToMap(sort) : Object.keys(sort).length ? objectToMap(sort) : undefined; + } + if (!sort.length) return undefined; + if (isDeep(sort)) return deepToMap(sort); + if (isPair(sort)) return pairToMap(sort); + return stringsToMap(sort); +} diff --git a/node_modules/mongodb/src/transactions.ts b/node_modules/mongodb/src/transactions.ts new file mode 100644 index 000000000..1b73ae368 --- /dev/null +++ b/node_modules/mongodb/src/transactions.ts @@ -0,0 +1,187 @@ +import { ReadPreference } from './read_preference'; +import { MongoRuntimeError, MongoTransactionError } from './error'; +import { ReadConcern } from './read_concern'; +import { WriteConcern } from './write_concern'; +import type { Server } from './sdam/server'; +import type { CommandOperationOptions } from './operations/command'; +import type { Document } from './bson'; + +/** @internal */ +export const TxnState = Object.freeze({ + NO_TRANSACTION: 'NO_TRANSACTION', + STARTING_TRANSACTION: 'STARTING_TRANSACTION', + TRANSACTION_IN_PROGRESS: 'TRANSACTION_IN_PROGRESS', + TRANSACTION_COMMITTED: 'TRANSACTION_COMMITTED', + TRANSACTION_COMMITTED_EMPTY: 'TRANSACTION_COMMITTED_EMPTY', + TRANSACTION_ABORTED: 'TRANSACTION_ABORTED' +} as const); + +/** @internal */ +export type TxnState = typeof TxnState[keyof typeof TxnState]; + +const stateMachine: { [state in TxnState]: TxnState[] } = { + [TxnState.NO_TRANSACTION]: [TxnState.NO_TRANSACTION, TxnState.STARTING_TRANSACTION], + [TxnState.STARTING_TRANSACTION]: [ + TxnState.TRANSACTION_IN_PROGRESS, + TxnState.TRANSACTION_COMMITTED, + TxnState.TRANSACTION_COMMITTED_EMPTY, + TxnState.TRANSACTION_ABORTED + ], + [TxnState.TRANSACTION_IN_PROGRESS]: [ + TxnState.TRANSACTION_IN_PROGRESS, + TxnState.TRANSACTION_COMMITTED, + TxnState.TRANSACTION_ABORTED + ], + [TxnState.TRANSACTION_COMMITTED]: [ + TxnState.TRANSACTION_COMMITTED, + TxnState.TRANSACTION_COMMITTED_EMPTY, + TxnState.STARTING_TRANSACTION, + TxnState.NO_TRANSACTION + ], + [TxnState.TRANSACTION_ABORTED]: [TxnState.STARTING_TRANSACTION, TxnState.NO_TRANSACTION], + [TxnState.TRANSACTION_COMMITTED_EMPTY]: 
[ + TxnState.TRANSACTION_COMMITTED_EMPTY, + TxnState.NO_TRANSACTION + ] +}; + +const ACTIVE_STATES: Set = new Set([ + TxnState.STARTING_TRANSACTION, + TxnState.TRANSACTION_IN_PROGRESS +]); + +const COMMITTED_STATES: Set = new Set([ + TxnState.TRANSACTION_COMMITTED, + TxnState.TRANSACTION_COMMITTED_EMPTY, + TxnState.TRANSACTION_ABORTED +]); + +/** + * Configuration options for a transaction. + * @public + */ +export interface TransactionOptions extends CommandOperationOptions { + // TODO(NODE-3344): These options use the proper class forms of these settings, it should accept the basic enum values too + /** A default read concern for commands in this transaction */ + readConcern?: ReadConcern; + /** A default writeConcern for commands in this transaction */ + writeConcern?: WriteConcern; + /** A default read preference for commands in this transaction */ + readPreference?: ReadPreference; + + maxCommitTimeMS?: number; +} + +/** + * @public + * A class maintaining state related to a server transaction. Internal Only + */ +export class Transaction { + /** @internal */ + state: TxnState; + options: TransactionOptions; + /** @internal */ + _pinnedServer?: Server; + /** @internal */ + _recoveryToken?: Document; + + /** Create a transaction @internal */ + constructor(options?: TransactionOptions) { + options = options ?? {}; + this.state = TxnState.NO_TRANSACTION; + this.options = {}; + + const writeConcern = WriteConcern.fromOptions(options); + if (writeConcern) { + if (writeConcern.w === 0) { + throw new MongoTransactionError('Transactions do not support unacknowledged write concern'); + } + + this.options.writeConcern = writeConcern; + } + + if (options.readConcern) { + this.options.readConcern = ReadConcern.fromOptions(options); + } + + if (options.readPreference) { + this.options.readPreference = ReadPreference.fromOptions(options); + } + + if (options.maxCommitTimeMS) { + this.options.maxTimeMS = options.maxCommitTimeMS; + } + + // TODO: This isn't technically necessary + this._pinnedServer = undefined; + this._recoveryToken = undefined; + } + + /** @internal */ + get server(): Server | undefined { + return this._pinnedServer; + } + + get recoveryToken(): Document | undefined { + return this._recoveryToken; + } + + get isPinned(): boolean { + return !!this.server; + } + + /** @returns Whether the transaction has started */ + get isStarting(): boolean { + return this.state === TxnState.STARTING_TRANSACTION; + } + + /** + * @returns Whether this session is presently in a transaction + */ + get isActive(): boolean { + return ACTIVE_STATES.has(this.state); + } + + get isCommitted(): boolean { + return COMMITTED_STATES.has(this.state); + } + /** + * Transition the transaction in the state machine + * @internal + * @param nextState - The new state to transition to + */ + transition(nextState: TxnState): void { + const nextStates = stateMachine[this.state]; + if (nextStates && nextStates.includes(nextState)) { + this.state = nextState; + if ( + this.state === TxnState.NO_TRANSACTION || + this.state === TxnState.STARTING_TRANSACTION || + this.state === TxnState.TRANSACTION_ABORTED + ) { + this.unpinServer(); + } + return; + } + + throw new MongoRuntimeError( + `Attempted illegal state transition from [${this.state}] to [${nextState}]` + ); + } + + /** @internal */ + pinServer(server: Server): void { + if (this.isActive) { + this._pinnedServer = server; + } + } + + /** @internal */ + unpinServer(): void { + this._pinnedServer = undefined; + } +} + +export function isTransactionCommand(command: 
Document): boolean { + return !!(command.commitTransaction || command.abortTransaction); +} diff --git a/node_modules/mongodb/src/utils.ts b/node_modules/mongodb/src/utils.ts new file mode 100644 index 000000000..db681d16a --- /dev/null +++ b/node_modules/mongodb/src/utils.ts @@ -0,0 +1,1440 @@ +import * as os from 'os'; +import * as crypto from 'crypto'; +import { PromiseProvider } from './promise_provider'; +import { + AnyError, + MongoParseError, + MongoRuntimeError, + MongoCompatibilityError, + MongoNotConnectedError, + MongoInvalidArgumentError, + MongoExpiredSessionError +} from './error'; +import { WriteConcern, WriteConcernOptions, W } from './write_concern'; +import type { Server } from './sdam/server'; +import type { Topology } from './sdam/topology'; +import { ServerType } from './sdam/common'; +import type { Db } from './db'; +import type { Collection } from './collection'; +import type { OperationOptions, Hint } from './operations/operation'; +import type { ClientSession } from './sessions'; +import { ReadConcern } from './read_concern'; +import type { Connection } from './cmap/connection'; +import { Document, ObjectId, resolveBSONOptions } from './bson'; +import type { IndexSpecification, IndexDirection } from './operations/indexes'; +import type { Explain } from './explain'; +import type { MongoClient } from './mongo_client'; +import type { CommandOperationOptions, OperationParent } from './operations/command'; +import { ReadPreference } from './read_preference'; +import { URL } from 'url'; +import { MAX_SUPPORTED_WIRE_VERSION } from './cmap/wire_protocol/constants'; + +/** + * MongoDB Driver style callback + * @public + */ +export type Callback = (error?: AnyError, result?: T) => void; +/** @public */ +export type CallbackWithType = (error?: E, result?: T0) => void; + +export const MAX_JS_INT = Number.MAX_SAFE_INTEGER + 1; + +export type AnyOptions = Document; + +/** + * Add a readonly enumerable property. + * @internal + */ +export function getSingleProperty( + obj: AnyOptions, + name: string | number | symbol, + value: unknown +): void { + Object.defineProperty(obj, name, { + enumerable: true, + get() { + return value; + } + }); +} + +/** + * Throws if collectionName is not a valid mongodb collection namespace. 
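 * For example (editor's sketch), each of the following would throw a
 * MongoInvalidArgumentError: '' (empty), 'a..b', 'foo$bar', '.hidden', 'name.',
 * and any name containing a null ('\x00') character, while a plain name such as
 * 'users' is accepted.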
+ * @internal + */ +export function checkCollectionName(collectionName: string): void { + if ('string' !== typeof collectionName) { + throw new MongoInvalidArgumentError('Collection name must be a String'); + } + + if (!collectionName || collectionName.indexOf('..') !== -1) { + throw new MongoInvalidArgumentError('Collection names cannot be empty'); + } + + if ( + collectionName.indexOf('$') !== -1 && + collectionName.match(/((^\$cmd)|(oplog\.\$main))/) == null + ) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new MongoInvalidArgumentError("Collection names must not contain '$'"); + } + + if (collectionName.match(/^\.|\.$/) != null) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new MongoInvalidArgumentError("Collection names must not start or end with '.'"); + } + + // Validate that we are not passing 0x00 in the collection name + if (collectionName.indexOf('\x00') !== -1) { + // TODO(NODE-3483): Use MongoNamespace static method + throw new MongoInvalidArgumentError('Collection names cannot contain a null character'); + } +} + +/** + * Ensure Hint field is in a shape we expect: + * - object of index names mapping to 1 or -1 + * - just an index name + * @internal + */ +export function normalizeHintField(hint?: Hint): Hint | undefined { + let finalHint = undefined; + + if (typeof hint === 'string') { + finalHint = hint; + } else if (Array.isArray(hint)) { + finalHint = {}; + + hint.forEach(param => { + finalHint[param] = 1; + }); + } else if (hint != null && typeof hint === 'object') { + finalHint = {} as Document; + for (const name in hint) { + finalHint[name] = hint[name]; + } + } + + return finalHint; +} + +interface IndexOptions { + name: string; + keys?: string[]; + fieldHash: Document; +} + +/** + * Create an index specifier based on + * @internal + */ +export function parseIndexOptions(indexSpec: IndexSpecification): IndexOptions { + const fieldHash: { [key: string]: IndexDirection } = {}; + const indexes = []; + let keys; + + // Get all the fields accordingly + if ('string' === typeof indexSpec) { + // 'type' + indexes.push(indexSpec + '_' + 1); + fieldHash[indexSpec] = 1; + } else if (Array.isArray(indexSpec)) { + indexSpec.forEach((f: any) => { + if ('string' === typeof f) { + // [{location:'2d'}, 'type'] + indexes.push(f + '_' + 1); + fieldHash[f] = 1; + } else if (Array.isArray(f)) { + // [['location', '2d'],['type', 1]] + indexes.push(f[0] + '_' + (f[1] || 1)); + fieldHash[f[0]] = f[1] || 1; + } else if (isObject(f)) { + // [{location:'2d'}, {type:1}] + keys = Object.keys(f); + keys.forEach(k => { + indexes.push(k + '_' + (f as AnyOptions)[k]); + fieldHash[k] = (f as AnyOptions)[k]; + }); + } else { + // undefined (ignore) + } + }); + } else if (isObject(indexSpec)) { + // {location:'2d', type:1} + keys = Object.keys(indexSpec); + Object.entries(indexSpec).forEach(([key, value]) => { + indexes.push(key + '_' + value); + fieldHash[key] = value; + }); + } + + return { + name: indexes.join('_'), + keys: keys, + fieldHash: fieldHash + }; +} + +/** + * Checks if arg is an Object: + * - **NOTE**: the check is based on the `[Symbol.toStringTag]() === 'Object'` + * @internal + */ +// eslint-disable-next-line @typescript-eslint/ban-types +export function isObject(arg: unknown): arg is object { + return '[object Object]' === Object.prototype.toString.call(arg); +} + +/** @internal */ +export function decorateCommand(command: Document, options: Document, exclude: string[]): Document { + for (const name in options) { + if (!exclude.includes(name)) { + 
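// Copy every non-excluded option onto the command. Editor's sketch:
//   decorateCommand({ find: 'users' }, { limit: 5, session }, ['session'])
//   -> { find: 'users', limit: 5 }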
command[name] = options[name]; + } + } + + return command; +} + +/** @internal */ +export function mergeOptions(target: T, source: S): T & S { + return { ...target, ...source }; +} + +/** @internal */ +export function filterOptions(options: AnyOptions, names: string[]): AnyOptions { + const filterOptions: AnyOptions = {}; + + for (const name in options) { + if (names.includes(name)) { + filterOptions[name] = options[name]; + } + } + + // Filtered options + return filterOptions; +} + +/** + * Executes the given operation with provided arguments. + * + * @remarks + * This method reduces large amounts of duplication in the entire codebase by providing + * a single point for determining whether callbacks or promises should be used. Additionally + * it allows for a single point of entry to provide features such as implicit sessions, which + * are required by the Driver Sessions specification in the event that a ClientSession is + * not provided + * + * @internal + * + * @param topology - The topology to execute this operation on + * @param operation - The operation to execute + * @param args - Arguments to apply the provided operation + * @param options - Options that modify the behavior of the method + */ +export function executeLegacyOperation( + topology: Topology, + operation: (...args: any[]) => void | Promise, + args: any[], + options?: AnyOptions +): void | Promise { + const Promise = PromiseProvider.get(); + + if (!Array.isArray(args)) { + // TODO(NODE-3483) + throw new MongoRuntimeError('This method requires an array of arguments to apply'); + } + + options = options ?? {}; + + let callback = args[args.length - 1]; + + // The driver sessions spec mandates that we implicitly create sessions for operations + // that are not explicitly provided with a session. 
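// Editor's note (sketch): when a legacy operation is invoked without options.session,
// a temporary session owned by the Symbol created below is injected into the options;
// makeExecuteCallback ends that implicit session once the operation finishes, unless
// the call site asked for a cursor via options.returnsCursor.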
+ let session: ClientSession; + let opOptions: any; + let owner: any; + if (!options.skipSessions && topology.hasSessionSupport()) { + opOptions = args[args.length - 2]; + if (opOptions == null || opOptions.session == null) { + owner = Symbol(); + session = topology.startSession({ owner }); + const optionsIndex = args.length - 2; + args[optionsIndex] = Object.assign({}, args[optionsIndex], { session: session }); + } else if (opOptions.session && opOptions.session.hasEnded) { + throw new MongoExpiredSessionError(); + } + } + + function makeExecuteCallback( + resolve: (value?: Document) => void, + reject: (reason?: AnyError) => void + ) { + return function (err?: AnyError, result?: any) { + if (session && session.owner === owner && !options?.returnsCursor) { + session.endSession(() => { + delete opOptions.session; + if (err) return reject(err); + resolve(result); + }); + } else { + if (err) return reject(err); + resolve(result); + } + }; + } + + // Execute using callback + if (typeof callback === 'function') { + callback = args.pop(); + const handler = makeExecuteCallback( + result => callback(undefined, result), + err => callback(err, null) + ); + args.push(handler); + + try { + return operation(...args); + } catch (e) { + handler(e); + throw e; + } + } + + // Return a Promise + if (args[args.length - 1] != null) { + // TODO(NODE-3483) + throw new MongoRuntimeError('Final argument to `executeLegacyOperation` must be a callback'); + } + + return new Promise((resolve, reject) => { + const handler = makeExecuteCallback(resolve, reject); + args[args.length - 1] = handler; + + try { + return operation(...args); + } catch (e) { + handler(e); + } + }); +} + +interface HasRetryableWrites { + retryWrites?: boolean; +} +/** + * Applies retryWrites: true to a command if retryWrites is set on the command's database. + * @internal + * + * @param target - The target command to which we will apply retryWrites. + * @param db - The database from which we can inherit a retryWrites value. + */ +export function applyRetryableWrites(target: T, db?: Db): T { + if (db && db.s.options?.retryWrites) { + target.retryWrites = true; + } + + return target; +} + +interface HasWriteConcern { + writeConcern?: WriteConcernOptions | WriteConcern | W; +} +/** + * Applies a write concern to a command based on well defined inheritance rules, optionally + * detecting support for the write concern in the first place. + * @internal + * + * @param target - the target command we will be applying the write concern to + * @param sources - sources where we can inherit default write concerns from + * @param options - optional settings passed into a command for write concern overrides + */ +export function applyWriteConcern( + target: T, + sources: { db?: Db; collection?: Collection }, + options?: OperationOptions & WriteConcernOptions +): T { + options = options ?? 
{}; + const db = sources.db; + const coll = sources.collection; + + if (options.session && options.session.inTransaction()) { + // writeConcern is not allowed within a multi-statement transaction + if (target.writeConcern) { + delete target.writeConcern; + } + + return target; + } + + const writeConcern = WriteConcern.fromOptions(options); + if (writeConcern) { + return Object.assign(target, { writeConcern }); + } + + if (coll && coll.writeConcern) { + return Object.assign(target, { writeConcern: Object.assign({}, coll.writeConcern) }); + } + + if (db && db.writeConcern) { + return Object.assign(target, { writeConcern: Object.assign({}, db.writeConcern) }); + } + + return target; +} + +/** + * Checks if a given value is a Promise + * + * @typeParam T - The result type of maybePromise + * @param maybePromise - An object that could be a promise + * @returns true if the provided value is a Promise + */ +export function isPromiseLike( + maybePromise?: PromiseLike | void +): maybePromise is Promise { + return !!maybePromise && typeof maybePromise.then === 'function'; +} + +/** + * Applies collation to a given command. + * @internal + * + * @param command - the command on which to apply collation + * @param target - target of command + * @param options - options containing collation settings + */ +export function decorateWithCollation( + command: Document, + target: MongoClient | Db | Collection, + options: AnyOptions +): void { + const capabilities = getTopology(target).capabilities; + if (options.collation && typeof options.collation === 'object') { + if (capabilities && capabilities.commandsTakeCollation) { + command.collation = options.collation; + } else { + throw new MongoCompatibilityError(`Current topology does not support collation`); + } + } +} + +/** + * Applies a read concern to a given command. + * @internal + * + * @param command - the command on which to apply the read concern + * @param coll - the parent collection of the operation calling this method + */ +export function decorateWithReadConcern( + command: Document, + coll: { s: { readConcern?: ReadConcern } }, + options?: OperationOptions +): void { + if (options && options.session && options.session.inTransaction()) { + return; + } + const readConcern = Object.assign({}, command.readConcern || {}); + if (coll.s.readConcern) { + Object.assign(readConcern, coll.s.readConcern); + } + + if (Object.keys(readConcern).length > 0) { + Object.assign(command, { readConcern: readConcern }); + } +} + +/** + * Applies an explain to a given command. + * @internal + * + * @param command - the command on which to apply the explain + * @param options - the options containing the explain verbosity + */ +export function decorateWithExplain(command: Document, explain: Explain): Document { + if (command.explain) { + return command; + } + + return { explain: command, verbosity: explain.verbosity }; +} + +/** + * A helper function to get the topology from a given provider. Throws + * if the topology cannot be found. 
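 * Editor's sketch: for a connected client the following all yield the same Topology
 * instance, since a Db or Collection walks back to its owning client:
 *
 *   getTopology(client)
 *   getTopology(client.db('app'))
 *   getTopology(client.db('app').collection('c'))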
+ * @internal + */ +export function getTopology(provider: MongoClient | Db | Collection): Topology { + if (`topology` in provider && provider.topology) { + return provider.topology; + } else if ('client' in provider.s && provider.s.client.topology) { + return provider.s.client.topology; + } else if ('db' in provider.s && provider.s.db.s.client.topology) { + return provider.s.db.s.client.topology; + } + + throw new MongoNotConnectedError('MongoClient must be connected to perform this operation'); +} + +/** + * Default message handler for generating deprecation warnings. + * @internal + * + * @param name - function name + * @param option - option name + * @returns warning message + */ +export function defaultMsgHandler(name: string, option: string): string { + return `${name} option [${option}] is deprecated and will be removed in a later version.`; +} + +export interface DeprecateOptionsConfig { + /** function name */ + name: string; + /** options to deprecate */ + deprecatedOptions: string[]; + /** index of options object in function arguments array */ + optionsIndex: number; + /** optional custom message handler to generate warnings */ + msgHandler?(name: string, option: string): string; +} + +/** + * Deprecates a given function's options. + * @internal + * + * @param this - the bound class if this is a method + * @param config - configuration for deprecation + * @param fn - the target function of deprecation + * @returns modified function that warns once per deprecated option, and executes original function + */ +export function deprecateOptions( + this: unknown, + config: DeprecateOptionsConfig, + fn: (...args: any[]) => any +): any { + if ((process as any).noDeprecation === true) { + return fn; + } + + const msgHandler = config.msgHandler ? config.msgHandler : defaultMsgHandler; + + const optionsWarned = new Set(); + function deprecated(this: any, ...args: any[]) { + const options = args[config.optionsIndex] as AnyOptions; + + // ensure options is a valid, non-empty object, otherwise short-circuit + if (!isObject(options) || Object.keys(options).length === 0) { + return fn.bind(this)(...args); // call the function, no change + } + + // interrupt the function call with a warning + for (const deprecatedOption of config.deprecatedOptions) { + if (deprecatedOption in options && !optionsWarned.has(deprecatedOption)) { + optionsWarned.add(deprecatedOption); + const msg = msgHandler(config.name, deprecatedOption); + emitWarning(msg); + if (this && 'getLogger' in this) { + const logger = this.getLogger(); + if (logger) { + logger.warn(msg); + } + } + } + } + + return fn.bind(this)(...args); + } + + // These lines copied from https://github.com/nodejs/node/blob/25e5ae41688676a5fd29b2e2e7602168eee4ceb5/lib/internal/util.js#L73-L80 + // The wrapper will keep the same prototype as fn to maintain prototype chain + Object.setPrototypeOf(deprecated, fn); + if (fn.prototype) { + // Setting this (rather than using Object.setPrototype, as above) ensures + // that calling the unwrapped constructor gives an instanceof the wrapped + // constructor. 
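// Editor's note (sketch): if `const Wrapped = deprecateOptions(config, Original)` wraps an
// ES5-style constructor, instances built via the wrapper still satisfy
// `x instanceof Original`, because both functions share the same prototype object.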
+ deprecated.prototype = fn.prototype; + } + + return deprecated; +} + +/** @internal */ +export function ns(ns: string): MongoDBNamespace { + return MongoDBNamespace.fromString(ns); +} + +/** @public */ +export class MongoDBNamespace { + db: string; + collection?: string; + /** + * Create a namespace object + * + * @param db - database name + * @param collection - collection name + */ + constructor(db: string, collection?: string) { + this.db = db; + this.collection = collection; + } + + toString(): string { + return this.collection ? `${this.db}.${this.collection}` : this.db; + } + + withCollection(collection: string): MongoDBNamespace { + return new MongoDBNamespace(this.db, collection); + } + + static fromString(namespace?: string): MongoDBNamespace { + if (!namespace) { + // TODO(NODE-3483): Replace with MongoNamespaceError + throw new MongoRuntimeError(`Cannot parse namespace from "${namespace}"`); + } + + const [db, ...collection] = namespace.split('.'); + return new MongoDBNamespace(db, collection.join('.')); + } +} + +/** @internal */ +export function* makeCounter(seed = 0): Generator { + let count = seed; + while (true) { + const newCount = count; + count += 1; + yield newCount; + } +} + +/** + * Helper function for either accepting a callback, or returning a promise + * @internal + * + * @param callback - The last function argument in exposed method, controls if a Promise is returned + * @param wrapper - A function that wraps the callback + * @returns Returns void if a callback is supplied, else returns a Promise. + */ +export function maybePromise( + callback: Callback | undefined, + wrapper: (fn: Callback) => void +): Promise | void { + const Promise = PromiseProvider.get(); + let result: Promise | void; + if (typeof callback !== 'function') { + result = new Promise((resolve, reject) => { + callback = (err, res) => { + if (err) return reject(err); + resolve(res); + }; + }); + } + + wrapper((err, res) => { + if (err != null) { + try { + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + callback!(err); + } catch (error) { + process.nextTick(() => { + throw error; + }); + } + + return; + } + + // eslint-disable-next-line @typescript-eslint/no-non-null-assertion + callback!(err, res); + }); + + return result; +} + +/** @internal */ +export function databaseNamespace(ns: string): string { + return ns.split('.')[0]; +} + +/** @internal */ +export function collectionNamespace(ns: string): string { + return ns.split('.').slice(1).join('.'); +} + +/** + * Synchronously Generate a UUIDv4 + * @internal + */ +export function uuidV4(): Buffer { + const result = crypto.randomBytes(16); + result[6] = (result[6] & 0x0f) | 0x40; + result[8] = (result[8] & 0x3f) | 0x80; + return result; +} + +/** + * A helper function for determining `maxWireVersion` between legacy and new topology instances + * @internal + */ +export function maxWireVersion(topologyOrServer?: Connection | Topology | Server): number { + if (topologyOrServer) { + if (topologyOrServer.loadBalanced) { + // Since we do not have a monitor, we assume the load balanced server is always + // pointed at the latest mongodb version. There is a risk that for on-prem + // deployments that don't upgrade immediately that this could alert to the + // application that a feature is avaiable that is actually not. 
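// Editor's note (sketch): in practice this means a load-balanced deployment fronting an
// older server still reports the newest wire version here, so wire-version feature gates
// elsewhere in the driver behave optimistically rather than exactly.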
+ return MAX_SUPPORTED_WIRE_VERSION; + } + if (topologyOrServer.ismaster) { + return topologyOrServer.ismaster.maxWireVersion; + } + + if ('lastIsMaster' in topologyOrServer && typeof topologyOrServer.lastIsMaster === 'function') { + const lastIsMaster = topologyOrServer.lastIsMaster(); + if (lastIsMaster) { + return lastIsMaster.maxWireVersion; + } + } + + if ( + topologyOrServer.description && + 'maxWireVersion' in topologyOrServer.description && + topologyOrServer.description.maxWireVersion != null + ) { + return topologyOrServer.description.maxWireVersion; + } + } + + return 0; +} + +/** + * Checks that collation is supported by server. + * @internal + * + * @param server - to check against + * @param cmd - object where collation may be specified + */ +export function collationNotSupported(server: Server, cmd: Document): boolean { + return cmd && cmd.collation && maxWireVersion(server) < 5; +} + +/** + * Applies the function `eachFn` to each item in `arr`, in parallel. + * @internal + * + * @param arr - An array of items to asynchronously iterate over + * @param eachFn - A function to call on each item of the array. The callback signature is `(item, callback)`, where the callback indicates iteration is complete. + * @param callback - The callback called after every item has been iterated + */ +export function eachAsync( + arr: T[], + eachFn: (item: T, callback: (err?: AnyError) => void) => void, + callback: Callback +): void { + arr = arr || []; + + let idx = 0; + let awaiting = 0; + for (idx = 0; idx < arr.length; ++idx) { + awaiting++; + eachFn(arr[idx], eachCallback); + } + + if (awaiting === 0) { + callback(); + return; + } + + function eachCallback(err?: AnyError) { + awaiting--; + if (err) { + callback(err); + return; + } + + if (idx === arr.length && awaiting <= 0) { + callback(); + } + } +} + +/** @internal */ +export function eachAsyncSeries( + arr: T[], + eachFn: (item: T, callback: (err?: AnyError) => void) => void, + callback: Callback +): void { + arr = arr || []; + + let idx = 0; + let awaiting = arr.length; + if (awaiting === 0) { + callback(); + return; + } + + function eachCallback(err?: AnyError) { + idx++; + awaiting--; + if (err) { + callback(err); + return; + } + + if (idx === arr.length && awaiting <= 0) { + callback(); + return; + } + + eachFn(arr[idx], eachCallback); + } + + eachFn(arr[idx], eachCallback); +} + +/** @internal */ +export function arrayStrictEqual(arr: unknown[], arr2: unknown[]): boolean { + if (!Array.isArray(arr) || !Array.isArray(arr2)) { + return false; + } + + return arr.length === arr2.length && arr.every((elt, idx) => elt === arr2[idx]); +} + +/** @internal */ +export function errorStrictEqual(lhs?: AnyError, rhs?: AnyError): boolean { + if (lhs === rhs) { + return true; + } + + if (!lhs || !rhs) { + return lhs === rhs; + } + + if ((lhs == null && rhs != null) || (lhs != null && rhs == null)) { + return false; + } + + if (lhs.constructor.name !== rhs.constructor.name) { + return false; + } + + if (lhs.message !== rhs.message) { + return false; + } + + return true; +} + +interface StateTable { + [key: string]: string[]; +} +interface ObjectWithState { + s: { state: string }; + emit(event: 'stateChanged', state: string, newState: string): void; +} +interface StateTransitionFunction { + (target: ObjectWithState, newState: string): void; +} + +/** @public */ +export type EventEmitterWithState = { + /** @internal */ + stateChanged(previous: string, current: string): void; +}; + +/** @internal */ +export function makeStateMachine(stateTable: 
StateTable): StateTransitionFunction { + return function stateTransition(target, newState) { + const legalStates = stateTable[target.s.state]; + if (legalStates && legalStates.indexOf(newState) < 0) { + throw new MongoRuntimeError( + `illegal state transition from [${target.s.state}] => [${newState}], allowed: [${legalStates}]` + ); + } + + target.emit('stateChanged', target.s.state, newState); + target.s.state = newState; + }; +} + +/** @public */ +export interface ClientMetadata { + driver: { + name: string; + version: string; + }; + os: { + type: string; + name: NodeJS.Platform; + architecture: string; + version: string; + }; + platform: string; + version?: string; + application?: { + name: string; + }; +} + +/** @public */ +export interface ClientMetadataOptions { + driverInfo?: { + name?: string; + version?: string; + platform?: string; + }; + appName?: string; +} + +// eslint-disable-next-line @typescript-eslint/no-var-requires +const NODE_DRIVER_VERSION = require('../package.json').version; + +export function makeClientMetadata(options?: ClientMetadataOptions): ClientMetadata { + options = options ?? {}; + + const metadata: ClientMetadata = { + driver: { + name: 'nodejs', + version: NODE_DRIVER_VERSION + }, + os: { + type: os.type(), + name: process.platform, + architecture: process.arch, + version: os.release() + }, + platform: `Node.js ${process.version}, ${os.endianness()} (unified)` + }; + + // support optionally provided wrapping driver info + if (options.driverInfo) { + if (options.driverInfo.name) { + metadata.driver.name = `${metadata.driver.name}|${options.driverInfo.name}`; + } + + if (options.driverInfo.version) { + metadata.version = `${metadata.driver.version}|${options.driverInfo.version}`; + } + + if (options.driverInfo.platform) { + metadata.platform = `${metadata.platform}|${options.driverInfo.platform}`; + } + } + + if (options.appName) { + // MongoDB requires the appName not exceed a byte length of 128 + const buffer = Buffer.from(options.appName); + metadata.application = { + name: buffer.byteLength > 128 ? buffer.slice(0, 128).toString('utf8') : options.appName + }; + } + + return metadata; +} + +/** @internal */ +export function now(): number { + const hrtime = process.hrtime(); + return Math.floor(hrtime[0] * 1000 + hrtime[1] / 1000000); +} + +/** @internal */ +export function calculateDurationInMs(started: number): number { + if (typeof started !== 'number') { + throw new MongoInvalidArgumentError('Numeric value required to calculate duration'); + } + + const elapsed = now() - started; + return elapsed < 0 ? 0 : elapsed; +} + +export interface InterruptibleAsyncIntervalOptions { + /** The interval to execute a method on */ + interval: number; + /** A minimum interval that must elapse before the method is called */ + minInterval: number; + /** Whether the method should be called immediately when the interval is started */ + immediate: boolean; + + /** + * Only used for testing unreliable timer environments + * @internal + */ + clock: () => number; +} + +/** @internal */ +export interface InterruptibleAsyncInterval { + wake(): void; + stop(): void; +} + +/** + * Creates an interval timer which is able to be woken up sooner than + * the interval. The timer will also debounce multiple calls to wake + * ensuring that the function is only ever called once within a minimum + * interval window. 
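 * Illustrative usage (editor's sketch; `checkServer` is a stand-in for the real work):
 *
 *   const heartbeat = makeInterruptibleAsyncInterval(
 *     done => { checkServer(); done(); },
 *     { interval: 10000, minInterval: 500 }
 *   );
 *   heartbeat.wake(); // request an earlier run; debounced to at most once per minInterval
 *   heartbeat.stop();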
+ * @internal + * + * @param fn - An async function to run on an interval, must accept a `callback` as its only parameter + */ +export function makeInterruptibleAsyncInterval( + fn: (callback: Callback) => void, + options?: Partial +): InterruptibleAsyncInterval { + let timerId: NodeJS.Timeout | undefined; + let lastCallTime: number; + let lastWakeTime: number; + let stopped = false; + + options = options ?? {}; + const interval = options.interval || 1000; + const minInterval = options.minInterval || 500; + const immediate = typeof options.immediate === 'boolean' ? options.immediate : false; + const clock = typeof options.clock === 'function' ? options.clock : now; + + function wake() { + const currentTime = clock(); + const timeSinceLastWake = currentTime - lastWakeTime; + const timeSinceLastCall = currentTime - lastCallTime; + const timeUntilNextCall = interval - timeSinceLastCall; + lastWakeTime = currentTime; + + // For the streaming protocol: there is nothing obviously stopping this + // interval from being woken up again while we are waiting "infinitely" + // for `fn` to be called again`. Since the function effectively + // never completes, the `timeUntilNextCall` will continue to grow + // negatively unbounded, so it will never trigger a reschedule here. + + // debounce multiple calls to wake within the `minInterval` + if (timeSinceLastWake < minInterval) { + return; + } + + // reschedule a call as soon as possible, ensuring the call never happens + // faster than the `minInterval` + if (timeUntilNextCall > minInterval) { + reschedule(minInterval); + } + + // This is possible in virtualized environments like AWS Lambda where our + // clock is unreliable. In these cases the timer is "running" but never + // actually completes, so we want to execute immediately and then attempt + // to reschedule. + if (timeUntilNextCall < 0) { + executeAndReschedule(); + } + } + + function stop() { + stopped = true; + if (timerId) { + clearTimeout(timerId); + timerId = undefined; + } + + lastCallTime = 0; + lastWakeTime = 0; + } + + function reschedule(ms?: number) { + if (stopped) return; + if (timerId) { + clearTimeout(timerId); + } + + timerId = setTimeout(executeAndReschedule, ms || interval); + } + + function executeAndReschedule() { + lastWakeTime = 0; + lastCallTime = clock(); + + fn(err => { + if (err) throw err; + reschedule(interval); + }); + } + + if (immediate) { + executeAndReschedule(); + } else { + lastCallTime = clock(); + reschedule(undefined); + } + + return { wake, stop }; +} + +/** @internal */ +export function hasAtomicOperators(doc: Document | Document[]): boolean { + if (Array.isArray(doc)) { + for (const document of doc) { + if (hasAtomicOperators(document)) { + return true; + } + } + return false; + } + + const keys = Object.keys(doc); + return keys.length > 0 && keys[0][0] === '$'; +} + +/** + * Merge inherited properties from parent into options, prioritizing values from options, + * then values from parent. + * @internal + */ +export function resolveOptions( + parent: OperationParent | undefined, + options?: T +): T { + const result: T = Object.assign({}, options, resolveBSONOptions(options, parent)); + + // Users cannot pass a readConcern/writeConcern to operations in a transaction + const session = options?.session; + if (!session?.inTransaction()) { + const readConcern = ReadConcern.fromOptions(options) ?? parent?.readConcern; + if (readConcern) { + result.readConcern = readConcern; + } + + const writeConcern = WriteConcern.fromOptions(options) ?? 
parent?.writeConcern;
+    if (writeConcern) {
+      result.writeConcern = writeConcern;
+    }
+  }
+
+  const readPreference = ReadPreference.fromOptions(options) ?? parent?.readPreference;
+  if (readPreference) {
+    result.readPreference = readPreference;
+  }
+
+  return result;
+}
+
+export function isSuperset(set: Set<any> | any[], subset: Set<any> | any[]): boolean {
+  set = Array.isArray(set) ? new Set(set) : set;
+  subset = Array.isArray(subset) ? new Set(subset) : subset;
+  for (const elem of subset) {
+    if (!set.has(elem)) {
+      return false;
+    }
+  }
+  return true;
+}
+
+export function setDifference<T>(setA: Iterable<T>, setB: Iterable<T>): Set<T> {
+  const difference = new Set(setA);
+  for (const elem of setB) {
+    difference.delete(elem);
+  }
+  return difference;
+}
+
+export function isRecord<T extends readonly string[]>(
+  value: unknown,
+  requiredKeys: T
+): value is Record<T[number], any>;
+export function isRecord(value: unknown): value is Record<string, any>;
+export function isRecord(
+  value: unknown,
+  requiredKeys: string[] | undefined = undefined
+): value is Record<string, any> {
+  const toString = Object.prototype.toString;
+  const hasOwnProperty = Object.prototype.hasOwnProperty;
+  const isObject = (v: unknown) => toString.call(v) === '[object Object]';
+  if (!isObject(value)) {
+    return false;
+  }
+
+  const ctor = (value as any).constructor;
+  if (ctor && ctor.prototype) {
+    if (!isObject(ctor.prototype)) {
+      return false;
+    }
+
+    // Check to see if some method exists from the Object exists
+    if (!hasOwnProperty.call(ctor.prototype, 'isPrototypeOf')) {
+      return false;
+    }
+  }
+
+  if (requiredKeys) {
+    const keys = Object.keys(value as Record<string, any>);
+    return isSuperset(keys, requiredKeys);
+  }
+
+  return true;
+}
+
+/**
+ * Make a deep copy of an object
+ *
+ * NOTE: This is not meant to be the perfect implementation of a deep copy,
+ * but instead something that is good enough for the purposes of
+ * command monitoring.
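+ *
+ * A minimal usage sketch (assuming the plain-object/Date/Map/Set/Buffer cases
+ * handled below; other object types fall through and are returned as-is):
+ *
+ * ```js
+ * const original = { when: new Date(), tags: new Set(['a']), nested: { n: 1 } };
+ * const copy = deepCopy(original);
+ * copy.nested.n = 2;   // original.nested.n is still 1
+ * copy.tags.add('b');  // original.tags still only contains 'a'
+ * ```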
+ */ +export function deepCopy(value: T): T { + if (value == null) { + return value; + } else if (Array.isArray(value)) { + return value.map(item => deepCopy(item)) as T; + } else if (isRecord(value)) { + const res = {} as any; + for (const key in value) { + res[key] = deepCopy(value[key]); + } + return res; + } + + const ctor = (value as any).constructor; + if (ctor) { + switch (ctor.name.toLowerCase()) { + case 'date': + return new ctor(Number(value)); + case 'map': + return new Map(value as any) as T; + case 'set': + return new Set(value as any) as T; + case 'buffer': + return Buffer.from(value as Buffer) as T; + } + } + + return value; +} + +/** @internal */ +const kBuffers = Symbol('buffers'); +/** @internal */ +const kLength = Symbol('length'); + +/** + * A pool of Buffers which allow you to read them as if they were one + * @internal + */ +export class BufferPool { + [kBuffers]: Buffer[]; + [kLength]: number; + + constructor() { + this[kBuffers] = []; + this[kLength] = 0; + } + + get length(): number { + return this[kLength]; + } + + /** Adds a buffer to the internal buffer pool list */ + append(buffer: Buffer): void { + this[kBuffers].push(buffer); + this[kLength] += buffer.length; + } + + /** Returns the requested number of bytes without consuming them */ + peek(size: number): Buffer { + return this.read(size, false); + } + + /** Reads the requested number of bytes, optionally consuming them */ + read(size: number, consume = true): Buffer { + if (typeof size !== 'number' || size < 0) { + throw new MongoInvalidArgumentError('Argument "size" must be a non-negative number'); + } + + if (size > this[kLength]) { + return Buffer.alloc(0); + } + + let result: Buffer; + + // read the whole buffer + if (size === this.length) { + result = Buffer.concat(this[kBuffers]); + + if (consume) { + this[kBuffers] = []; + this[kLength] = 0; + } + } + + // size is within first buffer, no need to concat + else if (size <= this[kBuffers][0].length) { + result = this[kBuffers][0].slice(0, size); + if (consume) { + this[kBuffers][0] = this[kBuffers][0].slice(size); + this[kLength] -= size; + } + } + + // size is beyond first buffer, need to track and copy + else { + result = Buffer.allocUnsafe(size); + + let idx; + let offset = 0; + let bytesToCopy = size; + for (idx = 0; idx < this[kBuffers].length; ++idx) { + let bytesCopied; + if (bytesToCopy > this[kBuffers][idx].length) { + bytesCopied = this[kBuffers][idx].copy(result, offset, 0); + offset += bytesCopied; + } else { + bytesCopied = this[kBuffers][idx].copy(result, offset, 0, bytesToCopy); + if (consume) { + this[kBuffers][idx] = this[kBuffers][idx].slice(bytesCopied); + } + offset += bytesCopied; + break; + } + + bytesToCopy -= bytesCopied; + } + + // compact the internal buffer array + if (consume) { + this[kBuffers] = this[kBuffers].slice(idx); + this[kLength] -= size; + } + } + + return result; + } +} + +/** @public */ +export class HostAddress { + host; + port; + // Driver only works with unix socket path to connect + // SDAM operates only on tcp addresses + socketPath; + isIPv6; + + constructor(hostString: string) { + const escapedHost = hostString.split(' ').join('%20'); // escape spaces, for socket path hosts + const { hostname, port } = new URL(`mongodb://${escapedHost}`); + + if (hostname.endsWith('.sock')) { + // heuristically determine if we're working with a domain socket + this.socketPath = decodeURIComponent(hostname); + } else if (typeof hostname === 'string') { + this.isIPv6 = false; + + let normalized = 
decodeURIComponent(hostname).toLowerCase(); + if (normalized.startsWith('[') && normalized.endsWith(']')) { + this.isIPv6 = true; + normalized = normalized.substring(1, hostname.length - 1); + } + + this.host = normalized.toLowerCase(); + + if (typeof port === 'number') { + this.port = port; + } else if (typeof port === 'string' && port !== '') { + this.port = Number.parseInt(port, 10); + } else { + this.port = 27017; + } + + if (this.port === 0) { + throw new MongoParseError('Invalid port (zero) with hostname'); + } + } else { + throw new MongoInvalidArgumentError('Either socketPath or host must be defined.'); + } + Object.freeze(this); + } + + /** + * @param ipv6Brackets - optionally request ipv6 bracket notation required for connection strings + */ + toString(ipv6Brackets = false): string { + if (typeof this.host === 'string') { + if (this.isIPv6 && ipv6Brackets) { + return `[${this.host}]:${this.port}`; + } + return `${this.host}:${this.port}`; + } + return `${this.socketPath}`; + } + + static fromString(s: string): HostAddress { + return new HostAddress(s); + } +} + +export const DEFAULT_PK_FACTORY = { + // We prefer not to rely on ObjectId having a createPk method + createPk(): ObjectId { + return new ObjectId(); + } +}; + +/** + * When the driver used emitWarning the code will be equal to this. + * @public + * + * @example + * ```js + * process.on('warning', (warning) => { + * if (warning.code === MONGODB_WARNING_CODE) console.error('Ah an important warning! :)') + * }) + * ``` + */ +export const MONGODB_WARNING_CODE = 'MONGODB DRIVER' as const; + +/** @internal */ +export function emitWarning(message: string): void { + return process.emitWarning(message, { code: MONGODB_WARNING_CODE } as any); +} + +const emittedWarnings = new Set(); +/** + * Will emit a warning once for the duration of the application. + * Uses the message to identify if it has already been emitted + * so using string interpolation can cause multiple emits + * @internal + */ +export function emitWarningOnce(message: string): void { + if (!emittedWarnings.has(message)) { + emittedWarnings.add(message); + return emitWarning(message); + } +} + +/** + * Takes a JS object and joins the values into a string separated by ', ' + */ +export function enumToString(en: Record): string { + return Object.values(en).join(', '); +} + +/** + * Determine if a server supports retryable writes. 
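+ *
+ * A shape-only sketch of what this check reads (a real `Server` comes from the
+ * driver's topology; the literal object below merely mimics the fields used):
+ *
+ * ```js
+ * const serverLike = {
+ *   loadBalanced: false,
+ *   description: {
+ *     maxWireVersion: 9,
+ *     logicalSessionTimeoutMinutes: 30,
+ *     type: ServerType.RSPrimary // any type other than ServerType.Standalone
+ *   }
+ * };
+ * supportsRetryableWrites(serverLike); // => true under these assumptions
+ * ```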
+ * + * @internal + */ +export function supportsRetryableWrites(server: Server): boolean { + return ( + !!server.loadBalanced || + (server.description.maxWireVersion >= 6 && + !!server.description.logicalSessionTimeoutMinutes && + server.description.type !== ServerType.Standalone) + ); +} + +export function parsePackageVersion({ version }: { version: string }): { + major: number; + minor: number; + patch: number; +} { + const [major, minor, patch] = version.split('.').map((n: string) => Number.parseInt(n, 10)); + return { major, minor, patch }; +} diff --git a/node_modules/mongodb/src/write_concern.ts b/node_modules/mongodb/src/write_concern.ts new file mode 100644 index 000000000..ebc77ebf6 --- /dev/null +++ b/node_modules/mongodb/src/write_concern.ts @@ -0,0 +1,114 @@ +/** @public */ +export type W = number | 'majority'; + +/** @public */ +export interface WriteConcernOptions { + /** Write Concern as an object */ + writeConcern?: WriteConcern | WriteConcernSettings; +} + +/** @public */ +export interface WriteConcernSettings { + /** The write concern */ + w?: W; + /** The write concern timeout */ + wtimeoutMS?: number; + /** The journal write concern */ + journal?: boolean; + + // legacy options + /** The journal write concern */ + j?: boolean; + /** The write concern timeout */ + wtimeout?: number; + /** The file sync write concern */ + fsync?: boolean | 1; +} + +export const WRITE_CONCERN_KEYS = ['w', 'wtimeout', 'j', 'journal', 'fsync']; + +/** + * A MongoDB WriteConcern, which describes the level of acknowledgement + * requested from MongoDB for write operations. + * @public + * + * @see https://docs.mongodb.com/manual/reference/write-concern/ + */ +export class WriteConcern { + /** request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. */ + w?: W; + /** specify a time limit to prevent write operations from blocking indefinitely */ + wtimeout?: number; + /** request acknowledgment that the write operation has been written to the on-disk journal */ + j?: boolean; + /** equivalent to the j option */ + fsync?: boolean | 1; + + /** + * Constructs a WriteConcern from the write concern properties. + * @param w - request acknowledgment that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags. + * @param wtimeout - specify a time limit to prevent write operations from blocking indefinitely + * @param j - request acknowledgment that the write operation has been written to the on-disk journal + * @param fsync - equivalent to the j option + */ + constructor(w?: W, wtimeout?: number, j?: boolean, fsync?: boolean | 1) { + if (w != null) { + if (!Number.isNaN(Number(w))) { + this.w = Number(w); + } else { + this.w = w; + } + } + if (wtimeout != null) { + this.wtimeout = wtimeout; + } + if (j != null) { + this.j = j; + } + if (fsync != null) { + this.fsync = fsync; + } + } + + /** Construct a WriteConcern given an options object. */ + static fromOptions( + options?: WriteConcernOptions | WriteConcern | W, + inherit?: WriteConcernOptions | WriteConcern + ): WriteConcern | undefined { + if (options == null) return undefined; + inherit = inherit ?? 
{}; + let opts: WriteConcernSettings | WriteConcern | undefined; + if (typeof options === 'string' || typeof options === 'number') { + opts = { w: options }; + } else if (options instanceof WriteConcern) { + opts = options; + } else { + opts = options.writeConcern; + } + const parentOpts: WriteConcern | WriteConcernSettings | undefined = + inherit instanceof WriteConcern ? inherit : inherit.writeConcern; + + const { + w = undefined, + wtimeout = undefined, + j = undefined, + fsync = undefined, + journal = undefined, + wtimeoutMS = undefined + } = { + ...parentOpts, + ...opts + }; + if ( + w != null || + wtimeout != null || + wtimeoutMS != null || + j != null || + journal != null || + fsync != null + ) { + return new WriteConcern(w, wtimeout ?? wtimeoutMS, j ?? journal, fsync); + } + return undefined; + } +} diff --git a/node_modules/mongoose/CHANGELOG.md b/node_modules/mongoose/CHANGELOG.md new file mode 100644 index 000000000..94c852180 --- /dev/null +++ b/node_modules/mongoose/CHANGELOG.md @@ -0,0 +1,6858 @@ +6.0.10 / 2021-10-08 +=================== + * fix(query): add back strictQuery option to avoid empty filter issues, tie it to `strict` by default for compatibility #10781 #10763 + * fix(model): avoid unnecessarily dropping text indexes in `syncIndexes()` #10851 #10850 [IslandRhythms](https://github.com/IslandRhythms) + * fix(query): avoid trying to call toArray() on cursor if find() error occurred #10845 + * fix: accepts uppercase values in mongoose.isValidObjectId #10846 [foxadb](https://github.com/foxadb) + * perf(document): further reduce unnecessary objects and keys to minimize document memory overhead #10400 + * fix(index.d.ts): restore unpacked type and avoid distributive conditional types #10859 [dbellavista](https://github.com/dbellavista) + * fix(index.d.ts): add correct null typings for `findOneAndUpdate()` and `findByIdAndUpdate()` #10820 + * fix(index.d.ts): make insertMany() correctly return Promise if passing single document to `insertMany()` #10802 + * fix(index.d.ts): avoid weird issue where TypeScript 4.3.x and 4.4.x makes string extend Function #10746 + * fix(index.d.ts): allow type: `SchemaTypeOptions[]` when defining schema #10789 + * fix(index.d.ts): allow using `$in` with enum fields #10757 #10734 + * fix(index.d.ts): add missing fields and options params to Model constructor #10817 + * fix(index.d.ts): support extending type for mongoose.models #10806 [MunifTanjim](https://github.com/MunifTanjim) + * docs: enhance docs section linking #10779 [saveman71](https://github.com/saveman71) + * docs(middleware): add missing query middleware #10721 + * docs: fix typo #10853 [mdatif796](https://github.com/mdatif796) + * docs: add missing to #10848 [digidub](https://github.com/digidub) + +5.13.10 / 2021-10-05 +==================== + * fix(index.d.ts): allow using type: SchemaDefinitionProperty in schema definitions #10674 + * fix(index.d.ts): allow AnyObject as param to findOneAndReplace() #10714 + +6.0.9 / 2021-10-04 +================== + * fix(document): init non-schema values if strict is set to false #10828 + * fix(document): correctly track saved state for deeply nested objects #10773 + * fix(array): avoid mutating arrays passed into Model() constructor #10766 + * fix(cursor): allow using find().cursor() before connecting, report errors in pre('find') hooks when using `.cursor()` #10785 + * fix(populate): support ref: Model with virtual populate #10695 + * fix(schema): support type: { subpath: String } in document array definitions and improve schema 
`interpretAsType` error messages if type name is undefined #10750 + * fix: upgrade to mongodb driver 4.1.2 #10810 [orgads](https://github.com/orgads) + * fix(subdocument): add extra precaution to throw an error if a subdocument is a parent of itself in `ownerDocument()` #9259 + * perf(index.d.ts): make `model()` call more strict to improve VS Code autocomplete perf #10801 [traverse1984](https://github.com/traverse1984) + * fix(index.d.ts): allow calling depopulate with 0 args #10793 + * fix(index.d.ts): Add type definitions for allowDiskUse #10791 [coyotte508](https://github.com/coyotte508) + * docs(populate): expand virtual populate docs with info on principle of least cardinality and other info #10558 + * docs: add migration guide to side bar #10769 + * docs(connections+api): clarify that maxPoolSize is now 100 by default #10809 + * docs(schema): add Schema#virtuals to docs as a public property #10829 + * docs: remove array indexes section from FAQ #10784 [Duchynko](https://github.com/Duchynko) + * docs(model): fix broken example #10831 [Okekeprince1](https://github.com/Okekeprince1) + * docs: fix markdown issue with schemas.md #10839 [aseerkt](https://github.com/aseerkt) + +6.0.8 / 2021-09-27 +================== + * fix: support $set on elements of map of subdocuments #10720 + * fix(schematype): handle schema type definition where unique: false and `index` not set #10738 + * fix(timestamps): handle `createdAt` with custom `typeKey` #10768 #10761 [jclaudan](https://github.com/jclaudan) + * fix(model): amend Model.translateAliases to observe non-aliased sub schemas #10772 [frisbee09](https://github.com/frisbee09) + * fix: allow ObjectId#valueOf() to override built-in `Object#valueOf()`, clarify using `==` with ObjectIds in migration guide #10741 + * fix: use process.emitWarning() instead of console.warn() for warnings #10687 + * fix(index.d.ts): allow array of schema type options for string[], `number[]` property Schema definitions #10731 + * fix(index.d.ts): make built-in subdocument properties not required in UpdateQuery #10597 + * docs(ssl): correct sslCA option and clarify that sslCA should be the path to the CA file #10705 + +6.0.7 / 2021-09-20 +================== + * fix(populate): wrap populate filters in trusted() so they work with `sanitizeFilter` #10740 + * fix(aggregate): handle calling aggregate() before initial connection succeeds #10722 + * fix(query): avoid throwing error when using `$not` with `$size` #10716 [IslandRhythms](https://github.com/IslandRhythms) + * fix(discriminator): handle setting nested discriminator paths #10702 + * fix(documentarray): don't throw TypeError on DocumentArray#create() when top-level doc has populated paths #10749 + * fix(update): avoid setting single nested subdoc defaults if subdoc isn't set #10660 + * fix: delay creating id virtual until right before model compilation to allow plugins to disable the `id` option #10701 + * fix(connection): correct `auth` object when using `user` option to `connect()` #10727 #10726 [saveman71](https://github.com/saveman71) + * fix(timestamps): avoid calling getters when checking whether `createdAt` is set #10742 [kaishu16](https://github.com/kaishu16) + * fix(index.d.ts): allow using strings for ObjectIds with $in #10735 + * fix(index.d.ts): add TVirtuals generic to Model to make it easier to separate virtuals from DocType #10689 + * fix(index.d.ts): add Model.bulkSave() definition #10745 + * fix(index.d.ts): allow RegExp for `match` in schema definition #10744 [easen-amp](https://github.com/easen-amp) + * 
fix(index.d.ts): allow arbitrary additional keys in QueryOptions #10688 + * docs: correct TypeScript schema generic params in docs #10739 [minifjurt123](https://github.com/minifjurt123) + * docs: fix h2 header links #10682 [IslandRhythms](https://github.com/IslandRhythms) + +6.0.6 / 2021-09-15 +================== + * perf(index.d.ts): streamline SchemaDefinitionType and SchemaTypeOptions to reduce number of instantiations and speed up lib checking #10349 + * perf(document): make $locals a getter/setter, avoid creating unnecessary `undefined` properties in Document constructor, remove unnecessary event listeners #10400 + * fix(connection): use username parameter for MongoDB driver instead of user #10727 [saveman71](https://github.com/saveman71) + * fix(update): handle casting $or and $and in array filters #10696 + * fix(connection): allow calling connection helpers before calling `mongoose.connect()` #10706 + * fix(document): correctly handle subpaths of arrays that contain non-alphanumeric chars like `-` #10709 + * fix(index.d.ts): correct return value for findOneAndUpdate(), `findByIdAndUpdate()` to support query helpers #10658 + * fix(index.d.ts): add missing methods to ValidationError & ValidatorError classes #10725 [medfreeman](https://github.com/medfreeman) + * perf(subdocument): make internal $isSingleNested and `$isDocumentArrayElement` properties constants on the prototype rather than setting them on every instance #10400 + * docs: improve Document#populate documentation, tests #10728 [saveman71](https://github.com/saveman71) + +6.0.5 / 2021-09-06 +================== + * fix(model): allow calling Model.validate() static with POJO array #10669 + * fix(cast): let $expr through in query filters if strict mode enabled #10662 + * fix(map): propagate flattenMaps option down to nested maps #10653 + * fix(setDefaultsOnInsert): avoid adding unnecessary auto _id to $setOnInsert #10646 + * fix(schema): support object with values and message syntax for Number enums #10648 + * fix(index.d.ts): fix Document#populate() type #10651 [thiagokisaki](https://github.com/thiagokisaki) + * fix(index.d.ts): allow using $in and $nin on array paths #10605 + * fix(index.d.ts): make _id required in query results and return value from `create()` #10657 + * docs: update deprecations.md to reflect version 6 #10673 [multiwebinc](https://github.com/multiwebinc) + * docs: fix typo in queries.md #10681 [yogabonito](https://github.com/yogabonito) + * docs: fix typo in models.md #10680 [yogabonito](https://github.com/yogabonito) + * ci: add test for ubuntu-20.04 #10679 [YC](https://github.com/YC) + +5.13.9 / 2021-09-06 +=================== + * fix(populate): avoid setting empty array on lean document when populate result is undefined #10599 + * fix(document): make depopulate() handle populated paths underneath document arrays #10592 + * fix: peg @types/bson version to 1.x || 4.0.x to avoid stubbed 4.2.x release #10678 + * fix(index.d.ts): simplify UpdateQuery to avoid "excessively deep and possibly infinite" errors with `extends Document` and `any` #10647 + * fix(index.d.ts): allow specifying weights as an IndexOption #10586 + * fix: upgrade to mpath v0.8.4 re: security issue #10683 + +6.0.4 / 2021-09-01 +================== + * fix(schema): handle maps of maps #10644 + * fix: avoid setting defaults on insert on a path whose subpath is referenced in the update #10624 + * fix(index.d.ts): simplify UpdateQuery to avoid "excessively deep and possibly infinite" errors with `extends Document` and `any` #10617 + * 
fix(index.d.ts): allow using type: [documentDefinition] when defining a doc array in a schema #10605 + * docs: remove useNewUrlParser, useUnifiedTopology, some other legacy options from docs #10631 #10632 + * docs(defaults): clarify that setDefaultsOnInsert is true by default in 6.x #10643 + * test: use async/await instead of co #10633 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +6.0.3 / 2021-08-30 +================== + * fix: handle buffering with find() now that find() no longer accepts a callback #10639 #10634 #10610 + +6.0.2 / 2021-08-26 +================== + * fix(query): handle find() when buffering on initial connection #10610 + * fix(populate): get doc schema using $__schema to avoid paths named `schema` #10619 + * docs: use async/await in the quickstart #10610 + * docs: fix links to guide, schematypes, connections in v5.x docs #10607 + * docs: add link to 6.x migration guide to schemas guide #10616 + * docs: add migration to 6.x in Migration Guides section #10618 [HunterKohler](https://github.com/HunterKohler) + * docs: fix missing URL part on layout.pug #10622 [ItsLhun](https://github.com/ItsLhun) + +6.0.1 / 2021-08-25 +================== + * fix(aggregate): allow calling Model.aggregate() with options #10604 [amitbeck](https://github.com/amitbeck) + * fix(index.d.ts): add instance, options, schema properties to SchemaType class #10609 + * fix(index.d.ts): allow querying array of strings by string #10605 + * fix(index.d.ts): allow using type: SchemaDefinitionProperty in schema definitions #10605 + * fix(index.d.ts): remove strictQuery option #10598 [thiagokisaki](https://github.com/thiagokisaki) + * docs: add link to migration guide in changelog #10600 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: fix docs build for v5.x docs #10607 + * docs(query): add note about strict option to setOptions() #10602 + +6.0.0 / 2021-08-24 +================== + * Follow the [migration guide](https://mongoosejs.com/docs/migrating_to_6.html) to get a list of all breaking changes in v6.0. + * BREAKING CHANGE: remove the deprecated safe option in favor of write concerns + * fix: upgrade to mongodb driver 4.1.1 + * fix: consistently use $__parent to store subdoc parent as a property, and `$parent()` as a getter function #10584 #10414 + * fix: allow calling `countDocuments()` with options + +6.0.0-rc2 / 2021-08-23 +====================== + * BREAKING CHANGE: make document set() set keys in the order they're defined in the schema, not in the user specified object #4665 + * BREAKING CHANGE: remove most schema reserved words - can now opt in to using `save`, `isNew`, etc. 
as schema paths #9010 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * BREAKING CHANGE: upgrade mquery -> 4.x #10579 + * BREAKING CHANGE: remove some legacy type aliases for TypeScript + * fix(document): pass depopulated value to array element validators for consistency #10585 #8042 + +5.13.8 / 2021-08-23 +=================== + * fix(populate): handle populating subdoc array virtual with sort #10552 + * fix(model): check for code instead of codeName when checking for existing collections for backwards compat with MongoDB 3.2 #10420 + * fix(index.d.ts): correct value of this for custom query helper methods #10545 + * fix(index.d.ts): allow strings for ObjectIds in nested properties #10573 + * fix(index.d.ts): add match to VirtualTypeOptions.options #8749 + * fix(index.d.ts): allow QueryOptions populate parameter type PopulateOptions #10587 [osmanakol](https://github.com/osmanakol) + * docs(api): add Document#$where to API docs #10583 + +6.0.0-rc1 / 2021-08-12 +====================== + * feat: upgrade to MongoDB driver 4.1.0 #10567 #10560 + * feat: add `valueOf()` to ObjectId prototype #7299 + * BREAKING CHANGE: rename History.md -> CHANGELOG.md #10542 + +5.13.7 / 2021-08-11 +=================== + * perf(index.d.ts): loosen up restrictions on ModelType generic for Schema for a ~50% perf improvement when compiling TypeScript and using intellisense #10536 #10515 #10349 + * fix(index.d.ts): fix broken `Schema#index()` types #10562 [JaredReisinger](https://github.com/JaredReisinger) + * fix(index.d.ts): allow using SchemaTypeOptions with array of raw document interfaces #10537 + * fix(index.d.ts): define IndexOptions in terms of mongodb.IndexOptions #10563 [JaredReisinger](https://github.com/JaredReisinger) + * fix(index.d.ts): improve intellisense for DocumentArray `push()` #10546 + * fix(index.d.ts): correct type for expires #10529 + * fix(index.d.ts): add Query#model property to ts bindings #10531 + * refactor(index.d.ts): make callbacks use the new Callback and CallbackWithoutResult types #10550 [thiagokisaki](https://github.com/thiagokisaki) + +5.13.6 / 2021-08-09 +=================== + * fix: upgrade mongodb driver -> 3.6.11 #10543 [maon-fp](https://github.com/maon-fp) + * fix(schema): throw more helpful error when defining a document array using a schema from a different copy of the Mongoose module #10453 + * fix: add explicit check on constructor property to avoid throwing an error when checking objects with null prototypes #10512 + * fix(cursor): make sure to clear stack every 1000 docs when calling `next()` to avoid stack overflow with large batch size #10449 + * fix(index.d.ts): allow calling new Model(...) with generic Model param #10526 + * fix(index.d.ts): update type declarations of Schema.index method #10538 #10530 [Raader](https://github.com/Raader) + * fix(index.d.ts): add useNewUrlParser and useUnifiedTopology to ConnectOptions #10500 + * fix(index.d.ts): add missing type for diffIndexes #10547 [bvgusak](https://github.com/bvgusak) + * fix(index.d.ts): fixed incorrect type definition for Query's .map function #10544 [GCastilho](https://github.com/GCastilho) + * docs(schema): add more info and examples to Schema#indexes() docs #10446 + * chore: add types property to package.json #10557 [thiagokisaki](https://github.com/thiagokisaki) + +6.0.0-rc0 / 2021-08-03 +====================== + * BREAKING CHANGE: upgrade to MongoDB Node.js driver 4.x. 
This adds support for MongoDB 5.0, drops support for Node.js < 12.0.0 #10338 #9840 #8759 + * BREAKING CHANGE(connection): make connections not thenable, add `Connection#asPromise()` to use a connection as a promise #8810 + * BREAKING CHANGE: throw an error if the same query object is executed multiple times #7398 + * BREAKING CHANGE: Mongoose arrays are now proxies, which means directly setting an array index `doc.arr[0] = 'test'` triggers change tracking #8884 + * BREAKING CHANGE: make Document#populate() return a promise if a callback isn't passed, remove `execPopulate()` #3834 + * BREAKING CHANGE: virtual getters and setters are now executed in forward order, rather than reverse order. This means you can add getters/setters to populated virtuals #8897 [ggurkal](https://github.com/ggurkal) + * BREAKING CHANGE: make setDefaultsOnInsert true by default. Set to `false` to disable it. #8410 + * BREAKING CHANGE: throw error by default if populating a path that isn't in the schema #5124 + * BREAKING CHANGE: make schema paths declared with `type: { name: String }` create a single nested subdoc, remove `typePojoToMixed` because it is now always false #7181 + * BREAKING CHANGE: remove context option for queries, always use `context: 'query'` #8395 + * BREAKING CHANGE: array subdocument class now inherits from single subdocument class #8554 + * BREAKING CHANGE: if model is registered on a non-default connection, don't register it on mongoose global #5758 + * BREAKING CHANGE: `Aggregate#cursor()` now returns aggregation cursor, rather than aggregate instance #10410 [IslandRhythms](https://github.com/IslandRhythms) + * BREAKING CHANGE: `useStrictQuery` is removed, `strict` applies for both #9015 + * BREAKING CHANGE: overwrite instead of merging when setting nested paths #9121 + * BREAKING CHANGE: call ref and refPath functions with subdoc being populated, not top-level doc #8469 + * BREAKING CHANGE: remove useFindAndModify option, always use MongoDB's native `findOneAndReplace()` rather than legacy `findAndModify()` #8737 + * BREAKING CHANGE: always pass unpopulated value to validators #8042 [IslandRhythms](https://github.com/IslandRhythms) + * BREAKING CHANGE: rename Embedded/SingleNestedPath -> Subdocument/SubdocumentPath #10419 + * BREAKING CHANGE: call setters with priorVal as 2nd parameter, and with new subdocument as context if creating new document #8629 + * BREAKING CHANGE: pass document as first parameter to `default` functions #9633 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * BREAKING CHANGE: make filter, flat, flatMap, map, and slice return vanilla JS arrays #8356 + * BREAKING CHANGE: make autoCreate true by default #8814 + * BREAKING CHANGE: clone schema passed to discriminator() #8552 + * BREAKING CHANGE: make autoIndex and autoCreate default to false if connection's read preference is 'secondary' or 'secondaryPreferred' #9374 [chumager](https://github.com/chumager) + * BREAKING CHANGE: make deepEqual treat objects with different order of keys as different #9571 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * BREAKING CHANGE: Make Model.exists() return a query rather than a promise, and resolve to doc or null, rather than true/false #8097 + * BREAKING CHANGE: `Model.create([])` now returns an empty array rather than undefined #8792 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * BREAKING CHANGE: `createdAt` schema path is now `immutable` by default #10139 [IslandRhythms](https://github.com/IslandRhythms) + * BREAKING CHANGE: dont start buffering 
when database is disconnected #8702 + * BREAKING CHANGE: emit 'disconnected' when losing connectivity to replica set primary #9262 + * BREAKING CHANGE: Make Aggregate#model() always return a model rather than an aggregate instance #7702 + * BREAKING CHANGE: remove omitUndefined option for updates - Mongoose now always removes `undefined` keys in updates #7680 + * BREAKING CHANGE: remove Aggregate#addCursorFlag(), use `Aggregate#options()` instead #8701 + * BREAKING CHANGE: rename Query#map() -> Query#transform() to avoid conflating with Array#map() and chaining #7951 + * BREAKING CHANGE: remove storeSubdocumentValidationError schema option, never store subdocument ValidationErrors #5226 + * BREAKING CHANGE(query): make find() with explain() return object instead of an array #8551 + * BREAKING CHANGE: allow undefined in number arrays unlles `required` #8738 + * BREAKING CHANGE: remove isAsync option for validators in favor of checking if a function is an async function #10502 + * BREAKING CHANGE: drop support for Model#model #8958 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * BREAKING CHANGE: remove useNestedStrict, use nested schema strict mode when casting updates unless `strict` is explicitly specified in query options #8961 + * BREAKING CHANGE: remove `safe-buffer`, make MongooseBuffer extend from native Buffer #9199 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * BREAKING CHANGE(schema): skip adding id virtual if schema doesn't have an `_id` path #3936 + * BREAKING CHANGE: remove skipInit parameter to mongoose.model() #4625 + * BREAKING CHANGE: make pluralize convert "goose" to "geese" #9278 + * BREAKING CHANGE: the `Schema#noId` option is now removed, please use `_id` instead #9398 + * BREAKING CHANGE: remove MONGOOSE_DRIVER_PATH, use a setDriver() function on Mongoose instance instead #9604 + * feat(mongoose+query): add `sanitizeFilter` option and `mongoose.trusted()` to defend against query selector injection attacks #3944 + * feat(query): add Query#clone() so you can more easily re-execute a query #8708 + * feat(virtual+populate): allow `foreignField` to be a function #8568 [BJvdA](https://github.com/BJvdA) + +5.13.5 / 2021-07-30 +=================== + * perf(index.d.ts): improve typescript type checking performance #10515 [andreialecu](https://github.com/andreialecu) + * fix(index.d.ts): fix debug type in MongooseOptions #10510 [thiagokisaki](https://github.com/thiagokisaki) + * docs(api): clarify that `depopulate()` with no args depopulates all #10501 [gfrancz](https://github.com/gfrancz) + +5.13.4 / 2021-07-28 +=================== + * fix: avoid pulling non-schema paths from documents into nested paths #10449 + * fix(update): support overwriting nested map paths #10485 + * fix(update): apply timestamps to subdocs that would be newly created by `$setOnInsert` #10460 + * fix(map): correctly clone subdocs when calling toObject() on a map #10486 + * fix(cursor): cap parallel batchSize for populate at 5000 #10449 + * fix(index.d.ts): improve autocomplete for new Model() by making `doc` an object with correct keys #10475 + * fix(index.d.ts): add MongooseOptions interface #10471 [thiagokisaki](https://github.com/thiagokisaki) + * fix(index.d.ts): make LeanDocument work with PopulatedDoc #10494 + * docs(mongoose+connection): correct default value for bufferTimeoutMS #10476 + * chore: remove unnecessary 'eslint-disable' comments #10466 [thiagokisaki](https://github.com/thiagokisaki) + +5.13.3 / 2021-07-16 +=================== + * fix(model): avoid throwing error 
when bulkSave() called on a document with no changes #10437 + * fix(timestamps): apply timestamps when creating new subdocs with `$addToSet` and with positional operator #10447 + * fix(schema): allow calling Schema#loadClass() with class that has a static getter with no setter #10436 + * fix(model): handle re-applying object defaults after explicitly unsetting #10442 [semirturgay](https://github.com/semirturgay) + * fix: bump mongodb driver -> 3.6.10 #10440 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(index.d.ts): consistently use NativeDate instead of Date for Date validators and timestamps functions #10426 + * fix(index.d.ts): allow calling `discriminator()` with non-document #10452 #10421 [DouglasGabr](https://github.com/DouglasGabr) + * fix(index.d.ts): allow passing ResultType generic to Schema#path() #10435 + +5.13.2 / 2021-07-03 +=================== + * fix: hardcode @types/node version for now to avoid breaking changes from DefinitelyTyped/DefinitelyTyped#53669 #10415 + * fix(index.d.ts): allow using type: Date with Date paths in SchemaDefinitionType #10409 + * fix(index.d.ts): allow extra VirtualTypeOptions for better plugin support #10412 + * docs(api): add SchemaArray to docs #10397 + * docs(schema+validation): fix broken links #10396 + * docs(transactions): add note about creating a connection to transactions docs #10406 + +5.13.1 / 2021-07-02 +=================== + * fix(discriminator): allow using array as discriminator key in schema and as tied value #10303 + * fix(index.d.ts): allow using & Document in schema definition for required subdocument arrays #10370 + * fix(index.d.ts): if using DocType that doesn't extends Document, default to returning that DocType from `toObject()` and `toJSON()` #10345 + * fix(index.d.ts): use raw DocType instead of LeanDocument when using `lean()` with queries if raw DocType doesn't `extends Document` #10345 + * fix(index.d.ts): remove err: any in callbacks, use `err: CallbackError` instead #10340 + * fix(index.d.ts): allow defining map of schemas in TypeScript #10389 + * fix(index.d.ts): correct return type for Model.createCollection() #10359 + * docs(promises+discriminators): correctly escape () in regexp to pull in examples correctly #10364 + * docs(update): fix outdated URL about unindexed upsert #10406 [grimmer0125](https://github.com/grimmer0125) + * docs(index.d.ts): proper placement of mongoose.Date JSDoc [thiagokisaki](https://github.com/thiagokisaki) + +5.13.0 / 2021-06-28 +=================== + * feat(query): add sanitizeProjection option to opt in to automatically sanitizing untrusted query projections #10243 + * feat(model): add `bulkSave()` function that saves multiple docs in 1 `bulkWrite()` #9727 #9673 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(document): allow passing a list of virtuals or `pathsToSkip` to apply in `toObject()` and `toJSON()` #10120 + * fix(model): make Model.validate use object under validation as context by default #10360 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(document): add support for pathsToSkip in validate and validateSync #10375 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(model): add `diffIndexes()` function that calculates what indexes `syncIndexes()` will create/drop without actually executing any changes #10362 [IslandRhythms](https://github.com/IslandRhythms) + * feat(document): avoid using sessions that have ended, so you can use documents that were loaded in the session after calling `endSession()` #10306 + 
+5.12.15 / 2021-06-25 +==================== + * fix(index.d.ts): add extra TInstanceMethods generic param to `Schema` for cases when we can't infer from Model #10358 + * fix(index.d.ts): added typings for near() in model aggregation #10373 [tbhaxor](https://github.com/tbhaxor) + * fix(index.d.ts): correct function signature for `Query#cast()` #10388 [lkho](https://github.com/lkho) + * docs(transactions): add import statement #10365 [JimLynchCodes](https://github.com/JimLynchCodes) + * docs(schema): add missing `discriminatorKey` schema option #10386 #10376 [IslandRhythms](https://github.com/IslandRhythms) + * docs(index.d.ts): fix typo #10363 [houssemchebeb](https://github.com/houssemchebeb) + +5.12.14 / 2021-06-15 +==================== + * fix(schema): check that schema type is an object when setting isUnderneathDocArray #10361 [vmo-khanus](https://github.com/vmo-khanus) + * fix(document): avoid infinite recursion when setting single nested subdoc to array #10351 + * fix(populate): allow populating nested path in schema using `Model.populate()` #10335 + * fix(drivers): emit operation-start/operation-end events to allow inspecting when operations start and end + * fix(index.d.ts): improve typings for virtuals #10350 [thiagokisaki](https://github.com/thiagokisaki) + * fix(index.d.ts): correct constructor type for Document #10328 + * fix(index.d.ts): add `ValidationError` as a possible type for `ValidationError#errors` #10320 [IslandRhythms](https://github.com/IslandRhythms) + * fix: remove unnecessary async devDependency that's causing npm audit warnings #10281 + * docs(typescript): add schemas guide #10308 + * docs(model): add options parameter description to `Model.exists()` #10336 [Aminoiz](https://github.com/Aminoiz) + +5.12.13 / 2021-06-04 +==================== + * perf(document): avoid creating nested paths when running `$getAllSubdocs()` #10275 + * fix: make returnDocument option work with `findOneAndUpdate()` #10232 #10231 [cnwangjie](https://github.com/cnwangjie) + * fix(document): correctly reset subdocument when resetting a map subdocument underneath a single nested subdoc after save #10295 + * perf(query): avoid setting non-null sessions to avoid overhead from $getAllSubdocs() #10275 + * perf(document): pre split schematype paths when compiling schema to avoid extra overhead of splitting when hydrating documents #10275 + * perf(schema): pre-calculate mapPaths to avoid looping over every path for each path when initing doc #10275 + * fix(index.d.ts): drill down into nested arrays when creating LeanDocument type #10293 + +5.12.12 / 2021-05-28 +==================== + * fix(documentarray): retain atomics when setting to a new array #10272 + * fix(query+model): fix deprecation warning for `returnOriginal` with `findOneAndUpdate()` #10298 #10297 #10292 #10285 [IslandRhythms](https://github.com/IslandRhythms) + * fix(index.d.ts): make `map()` result an array if used over an array #10288 [quantumsheep](https://github.com/quantumsheep) + +5.12.11 / 2021-05-24 +==================== + * fix(populate): skip applying setters when casting arrays for populate() to avoid issues with arrays of immutable elements #10264 + * perf(schematype): avoid cloning setters every time we run setters #9588 + * perf(get): add benchmarks and extra cases to speed up get() #9588 + * perf(array): improve array constructor performance on small arrays to improve nested array perf #9588 + * fix(index.d.ts): allow using type: [String] with string[] when using SchemaDefinition with generic #10261 + * fix(index.d.ts): 
support ReadonlyArray as well as regular array where possible in schema definitions #10260 + * docs(connection): document noListener option to useDb #10278 [stuartpb](https://github.com/stuartpb) + * docs: migrate raw tutorial content from pug / JS to markdown #10271 + * docs: fix typo #10269 [sanjib](https://github.com/sanjib) + +5.12.10 / 2021-05-18 +==================== + * fix(query): allow setting `defaults` option on result documents from query options #7287 [IslandRhythms](https://github.com/IslandRhythms) + * fix(populate): handle populating embedded discriminator with custom tiedValue #10231 + * fix(document): allow passing space-delimited string of `pathsToValidate` to `validate()` and `validateSync()` #10258 + * fix(model+schema): support `loadClass()` on classes that have `collection` as a static property #10257 #10254 [IslandRhythms](https://github.com/IslandRhythms) + * fix(SchemaArrayOptions): correct property name #10236 + * fix(index.d.ts): add any to all query operators to minimize likelihood of "type instantiation is excessively deep" when querying docs with 4-level deep subdocs #10189 + * fix(index.d.ts): add $parent() in addition to parent() in TS definitions + * fix(index.d.ts): correct async iterator return type for QueryCursor #10253 #10252 #10251 [borfig](https://github.com/borfig) + * fix(index.d.ts): add `virtualsOnly` parameter to `loadClass()` function signature [IslandRhythms](https://github.com/IslandRhythms) + * docs(typescript): add typescript populate docs #10212 + * docs: switch from AWS to Azure Functions for search #10244 + +5.12.9 / 2021-05-13 +=================== + * fix(schema): ensure add() overwrites existing schema paths by default #10208 #10203 + * fix(schema): support creating nested paths underneath document arrays #10193 + * fix(update): convert nested dotted paths in update to nested paths to avoid ending up with dotted properties in update #10200 + * fix(document): allow calling validate() and validateSync() with `options` as first parameter #10216 + * fix(schema): apply static properties to model when using loadClass() #10206 + * fix(index.d.ts): allow returning Promise from middleware functions #10229 + * fix(index.d.ts): add pre('distinct') hooks to TypeScript #10192 + +5.12.8 / 2021-05-10 +=================== + * fix(populate): handle populating immutable array paths #10159 + * fix(CastError): add `toJSON()` function to ensure `name` property always ends up in `JSON.stringify()` output #10166 [IslandRhythms](https://github.com/IslandRhythms) + * fix(query): add allowDiskUse() method to improve setting MongoDB 4.4's new `allowDiskUse` option #10177 + * fix(populate): allow populating paths under mixed schematypes where some documents have non-object properties #10191 + * chore: remove unnecessary driver dynamic imports so Mongoose can work with Parcel #9603 + * fix(index.d.ts): allow any object as parameter to create() and `insertMany()` #10144 + * fix(index.d.ts): allow creating Model class with raw interface, no `extends Document` #10144 + * fix(index.d.ts): separate UpdateQuery from `UpdateWithAggregationPipeline` for cases when `UpdateQuery` is used as a function param #10186 + * fix(index.d.ts): don't require error value in pre/post hooks #10213 [michaln-q](https://github.com/michaln-q) + * docs(typescript): add a typescript intro tutorial and statics tutorial #10021 + * docs(typescript): add query helpers tutorial #10021 + * docs(deprecations): add note that you can safely ignore `useFindAndModify` and `useCreateIndex` deprecation 
warnings #10155 + * chore(workflows): add node 16 to github actions #10201 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.12.7 / 2021-04-29 +=================== + * fix(document): make $getPopulatedDocs() return populated virtuals #10148 + * fix(discriminator): take discriminator schema's single nested paths over base schema's #10157 + * fix(discriminator): allow numbers and ObjectIds as tied values for discriminators #10130 + * fix(document): avoid double validating paths underneath mixed objects in save() #10141 + * fix(schema): allow path() to return single nested paths within document arrays #10164 + * fix(model+query): consistently wrap query callbacks in `process.nextTick()` to avoid clean stack traces causing memory leak when using synchronous recursion like `async.whilst()` #9864 + * fix(cursor): correctly report CastError when using noCursorTimeout flag #10150 + * fix(index.d.ts): add CastError constructor #10176 + * fix(index.d.ts): allow setting mongoose.pluralize(null) in TypeScript #10185 + * docs: add link to transactions guide from nav bar #10143 + * docs(validation): add section about custom error messages #10140 + * docs: make headers linkable via clicking #10156 + * docs: broken link in document.js #10190 [joostdecock](https://github.com/joostdecock) + * docs: make navbar responsive on legacy 2.x docs #10171 [ad99526](https://github.com/ad99526) + +5.12.6 / 2021-04-27 +=================== + * fix(query): allow setting `writeConcern` schema option to work around MongoDB driver's `writeConcern` deprecation warning #10083 #10009 [IslandRhythms](https://github.com/IslandRhythms) + * fix(populate): dedupe when virtual populate foreignField is an array to avoid duplicate docs in result #10117 + * fix(populate): add `localField` filter to `$elemMatch` on virtual populate when custom `match` has a `$elemMatch` and `foreignField` is an array #10117 + * fix(query): convert projection string values to numbers as a workaround for #10142 + * fix(document): set version key filter on `save()` when using `optimisticConcurrency` if no changes in document #10128 [IslandRhythms](https://github.com/IslandRhythms) + * fix(model): use `obj` as `context` in `Model.validate()` if `obj` is a document #10132 + * fix(connection): avoid db events deprecation warning when using `useDb()` with `useUnifiedTopology` #8267 + * fix: upgrade to sift@13.5.2 to work around transitive dev dependency security warning #10121 + * fix(index.d.ts): allow any object as parameter to `create()` and `insertMany()` #10144 + * fix(index.d.ts): clarify that `eachAsync()` callback receives a single doc rather than array of docs unless `batchSize` is set #10135 + * fix(index.d.ts): clarify that return value from `validateSync()` is a ValidationError #10147 [michaln-q](https://github.com/michaln-q) + * fix(index.d.ts): add generic type for Model constructor #10074 [Duchynko](https://github.com/Duchynko) + * fix(index.d.ts): add parameter type in merge #10168 [yoonhoGo](https://github.com/yoonhoGo) + +5.12.5 / 2021-04-19 +=================== + * fix(populate): handle populating underneath document array when document array property doesn't exist in db #10003 + * fix(populate): clear out dangling pointers to populated docs so query cursor with populate() can garbage collect populated subdocs #9864 + * fix(connection): pull correct `autoCreate` value from Mongoose global when creating new model before calling `connect()` #10091 + * fix(populate): handle populating paths on documents with discriminator keys that 
point to non-existent discriminators #10082 + * fix(index.d.ts): allow numbers as discriminator names #10115 + * fix(index.d.ts): allow `type: Boolean` in Schema definitions #10085 + * fix(index.d.ts): allow passing array of aggregation pipeline stages to `updateOne()` and `updateMany()` #10095 + * fix(index.d.ts): support legacy 2nd param callback syntax for `deleteOne()`, `deleteMany()` #10122 + * docs(mongoose): make `useCreateIndex` always `false` in docs #10033 + * docs(schema): fix incorrect links from schema API docs #10111 + +5.12.4 / 2021-04-15 +=================== + * fix: upgrade mongodb driver -> 3.6.6 #10079 + * fix: store fields set with select:false at schema-level when saving a new document #10101 [ptantiku](https://github.com/ptantiku) + * fix(populate): avoid turning already populated field to null when populating an existing lean document #10068 [IslandRhythms](https://github.com/IslandRhythms) + * fix(populate): correctly populate lean subdocs with `_id` property #10069 + * fix(model): insertedDocs may contain docs that weren't inserted #10098 [olnazx](https://github.com/olnazx) + * fix(schemaType): make type Mixed cast error objects to pojos #10131 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(populate): support populating embedded discriminators in nested arrays #9984 + * fix(populate): handle populating map paths using trailing `.$*` #10123 + * fix(populate): allow returning primitive from `transform()` function for single conventional populate #10064 + * fix(index.d.ts): allow generic classes of `T` to use `T & Document` internally #10046 + * fix(index.d.ts): allow `$pull` with `$` paths #10075 + * fix(index.d.ts): use correct `Date` type for `$currentDate` #10058 + * fix(index.d.ts): add missing asyncInterator to Query type def #10094 [borfig](https://github.com/borfig) + * fix(index.d.ts): allow RHS of `$unset` properties to be any value #10066 + * fix(index.d.ts): allow setting SchemaType `index` property to a string #10077 + * refactor(index.d.ts): move discriminator() to common interface #10109 [LoneRifle](https://github.com/LoneRifle) + +5.12.3 / 2021-03-31 +=================== + * fix: avoid setting schema-level collation on text indexes #10044 [IslandRhythms](https://github.com/IslandRhythms) + * fix(query): add `writeConcern()` method to avoid writeConcern deprecation warning #10009 + * fix(connection): use queueing instead of event emitter for `createCollection()` and other helpers to avoid event emitter warning #9778 + * fix(connection): scope `Connection#id` to Mongoose instance so id always lines up with `mongoose.connections` index #10025 [IslandRhythms](https://github.com/IslandRhythms) + * fix: avoid throwing in `promiseOrCallback()` if 3rd param isn't an EventEmitter #10055 [emrebass](https://github.com/emrebass) + * fix(index.d.ts): add Model as 2nd generic param to `Model.discriminator()` #10054 [coro101](https://github.com/coro101) + * fix(index.d.ts): add docs to `next()` callback for `pre('insertMany')` hooks #10078 #10072 [pezzu](https://github.com/pezzu) + * fix(index.d.ts): add `transform` to PopulateOptions interface #10061 + * fix(index.d.ts): add DocumentQuery type for backwards compatibility #10036 + +5.12.2 / 2021-03-22 +=================== + * fix(QueryCursor): consistently execute `post('find')` hooks with an array of docs #10015 #9982 [IslandRhythms](https://github.com/IslandRhythms) + * fix(schema): support setting `ref` as an option on an array SchemaType #10029 + * fix(query): apply schema-level `select` option 
from array schematypes #10029 + * fix(schema): avoid possible prototype pollution with `Schema()` constructor #10035 [zpbrent](https://github.com/zpbrent) + * fix(model): make bulkWrite skip timestamps with timestamps: false #10050 [SoftwareSing](https://github.com/SoftwareSing) + * fix(index.d.ts): make query methods return `QueryWithHelpers` so query helpers pass through chaining #10040 + * fix(index.d.ts): add `upserted` array to `updateOne()`, `updateMany()`, `update()` result #10042 + * fix(index.d.ts): add back `Aggregate#project()` types that were mistakenly removed in 5.12.0 #10043 + * fix(index.d.ts): always allow setting `type` in Schema to a SchemaType class or a Schema instance #10030 + * docs(transactions): introduce `session.withTransaction()` before `session.startTransaction()` because `withTransaction()` is the recommended approach #10008 + * docs(mongoose+browser): fix broken links to info about `mongoose.Types` #10016 + +5.12.1 / 2021-03-18 +=================== + * fix: update mongodb -> 3.6.5 to fix circular dependency warning #9900 + * fix(document): make `toObject()` use child schema `flattenMaps` option by default #9995 + * fix(ObjectId): make `isValidObjectId()` check that length 24 strings are hex chars only #10010 #9996 [IslandRhythms](https://github.com/IslandRhythms) + * fix(query): correctly cast embedded discriminator paths when discriminator key is specified in array filter #9977 + * fix(schema): skip `populated()` check when calling `applyGetters()` with a POJO for mongoose-lean-getters support #9986 + * fix(populate): support populating dotted subpath of a populated doc that has the same id as a populated doc #10005 + * fix(index.d.ts): correct `this` for query helpers #10028 [francescov1](https://github.com/francescov1) + * fix(index.d.ts): avoid omitting function property keys in LeanDocuments, because TS can't accurately infer what's a function if using generic functions #9989 + * fix(index.d.ts): correct type definition for `SchemaType#cast()` #10039 #9980 + * fix(index.d.ts): make SchemaTypeOptions a class, add missing `SchemaType#OptionsConstructor` #10001 + * fix(index.d.ts): support calling `findByIdAndUpdate()` with filter, update, callback params #9981 + +5.12.0 / 2021-03-11 +=================== + * feat(populate): add `transform` option that Mongoose will call on every populated doc #3775 + * feat(query): make `Query#pre()` and `Query#post()` public #9784 + * feat(document): add `Document#getPopulatedDocs()` to return an array of all populated documents in a document #9702 [IslandRhythms](https://github.com/IslandRhythms) + * feat(document): add `Document#getAllSubdocs()` to return an array of all single nested and array subdocuments #9764 [IslandRhythms](https://github.com/IslandRhythms) + * feat(schema): allow `schema` as a schema path name #8798 [IslandRhythms](https://github.com/IslandRhythms) + * feat(QueryCursor): Add batch processing for eachAsync #9902 [khaledosama999](https://github.com/khaledosama999) + * feat(connection): add `noListener` option to help with use cases where you're using `useDb()` on every request #9961 + * feat(index): emit 'createConnection' event when user calls `mongoose.createConnection()` #9985 + * feat(connection+index): emit 'model' and 'deleteModel' events on connections when creating and deleting models #9983 + * feat(query): allow passing `explain` option to `Model.exists()` #8098 [IslandRhythms](https://github.com/IslandRhythms) + +5.11.20 / 2021-03-11 +==================== + * fix(query+populate): avoid 
unnecessarily projecting in subpath when populating a path that uses an elemMatch projection #9973 + * fix(connection): avoid `db` events deprecation warning with 'close' events #10004 #9930 + * fix(index.d.ts): make `$pull` more permissive to allow dotted paths #9993 + +5.11.19 / 2021-03-05 +==================== + * fix(document): skip validating array elements that aren't modified when `validateModifiedOnly` is set #9963 + * fix(timestamps): apply timestamps on `findOneAndReplace()` #9951 + * fix(schema): correctly handle trailing array filters when looking up schema paths #9977 + * fix(schema): load child class getter for virtuals instead of base class when using `loadClass()` #9975 + * fix(index.d.ts): allow creating statics without passing generics to `Schema` constructor #9969 + * fix(index.d.ts): add QueryHelpers generic to schema and model, make all query methods instead return QueryWithHelpers #9850 + * fix(index.d.ts): support setting `type` to an array of schemas when using SchemaDefinitionType #9962 + * fix(index.d.ts): add generic to plugin schema definition #9968 [emiljanitzek](https://github.com/emiljanitzek) + * docs: small typo fix #9964 [KrishnaMoorthy12](https://github.com/KrishnaMoorthy12) + +5.11.18 / 2021-02-23 +==================== + * fix(connection): set connection state to `disconnected` if connecting string failed to parse #9921 + * fix(connection): remove `db` events deprecation warning if `useUnifiedTopology = true` #9930 + * fix(connection): fix promise chaining for openUri #9960 [lantw44](https://github.com/lantw44) + * fix(index.d.ts): add `PopulatedDoc` type to make it easier to define populated docs in interfaces #9818 + * fix(index.d.ts): allow explicitly overwriting `toObject()` return type for backwards compatibility #9944 + * fix(index.d.ts): correctly throw error when interface path type doesn't line up with schema path type #9958 [ShadiestGoat](https://github.com/ShadiestGoat) + * fix(index.d.ts): remove `any` from `deleteX()` and `updateX()` query params and return values #9959 [btd](https://github.com/btd) + * fix(index.d.ts): add non-generic versions of `Model.create()` for better autocomplete #9928 + * docs: correctly handle multiple `>` in API descriptions #9940 + +5.11.17 / 2021-02-17 +==================== + * fix(populate): handle `perDocumentLimit` when multiple documents reference the same populated doc #9906 + * fix(document): handle directly setting embedded document array element with projection #9909 + * fix(map): cast ObjectId to string inside of MongooseMap #9938 [HunterKohler](https://github.com/HunterKohler) + * fix(model): use schema-level default collation for indexes if index doesn't have collation #9912 + * fix(index.d.ts): make `SchemaTypeOptions#type` optional again to allow alternative typeKeys #9927 + * fix(index.d.ts): support `{ type: String }` in schema definition when using SchemaDefinitionType generic #9911 + * docs(populate+schematypes): document the `$*` syntax for populating every entry in a map #9907 + * docs(connection): clarify that `Connection#transaction()` promise resolves to a command result #9919 + +5.11.16 / 2021-02-12 +==================== + * fix(document): skip applying array element setters when init-ing an array #9889 + * fix: upgrade to mongodb driver 3.6.4 #9893 [jooeycheng](https://github.com/jooeycheng) + * fix: avoid copying Object.prototype properties when cloning #9876 + * fix(aggregate): automatically convert functions to strings when using `$function` operator #9897 + * fix: call pre-remove 
hooks for subdocuments #9895 #9885 [IslandRhythms](https://github.com/IslandRhythms) + * docs: fix confusing sentence in Schema docs #9914 [namenyi](https://github.com/namenyi) + +5.11.15 / 2021-02-03 +==================== + * fix(document): fix issues with `isSelected` as a path in a nested schema #9884 #9873 [IslandRhythms](https://github.com/IslandRhythms) + * fix(index.d.ts): better support for `SchemaDefinition` generics when creating schema #9863 #9862 #9789 + * fix(index.d.ts): allow required function in array definition #9888 [Ugzuzg](https://github.com/Ugzuzg) + * fix(index.d.ts): reorder create typings to allow array destructuring #9890 [Ugzuzg](https://github.com/Ugzuzg) + * fix(index.d.ts): add missing overload to Model.validate #9878 #9877 [jonamat](https://github.com/jonamat) + * fix(index.d.ts): throw compiler error if schema says path is a String, but interface says path is a number #9857 + * fix(index.d.ts): make `Query` a class, allow calling `Query#where()` with object argument and with no arguments #9856 + * fix(index.d.ts): support calling `Schema#pre()` and `Schema#post()` with options and array of hooked function names #9844 + * docs(faq): mention other debug options than console #9887 [dandv](https://github.com/dandv) + * docs(connections): clarify that Mongoose can emit 'error' both during initial connection and after initial connection #9853 + +5.11.14 / 2021-01-28 +==================== + * fix(populate): avoid inferring `justOne` from parent when populating a POJO with a manually populated path #9833 [IslandRhythms](https://github.com/IslandRhythms) + * fix(document): apply setters on each element of the array when setting a populated array #9838 + * fix(map): handle change tracking on maps of subdocs #9811 [IslandRhythms](https://github.com/IslandRhythms) + * fix(document): remove dependency on `documentIsSelected` symbol #9841 [IslandRhythms](https://github.com/IslandRhythms) + * fix(error): make ValidationError.toJSON include the error name correctly #9849 [hanzki](https://github.com/hanzki) + * fix(index.d.ts): indicate that `Document#remove()` returns a promise, not a query #9826 + * fix(index.d.ts): allow setting `SchemaType#enum` to TypeScript enum with `required: true` #9546 + +5.11.13 / 2021-01-20 +==================== + * fix(map): handle change tracking on map of arrays #9813 + * fix(connection): allow passing options to `Connection#transaction()` #9834 [pnutmath](https://github.com/pnutmath) + * fix(index.d.ts): make `Query#options#rawResult` take precedence over `new`+`upsert` #9816 + * fix(index.d.ts): changed setOptions's 'overwrite' argument to optional #9824 [pierissimo](https://github.com/pierissimo) + * fix(index.d.ts): allow setting `mongoose.Promise` #9820 + * fix(index.d.ts): add `Aggregate#replaceRoot()` #9814 + * fix(index.d.ts): make `Model.create()` with a spread return a promise of array rather than single doc #9817 + * fix(index.d.ts): use SchemaDefinitionProperty generic for SchemaTypeOptions if specified #9815 + * docs(populate): add note about setting `toObject` for populate virtuals #9822 + +5.11.12 / 2021-01-14 +==================== + * fix(document): handle using `db` as a document path #9798 + * fix(collection): make sure to call `onOpen()` if `autoCreate === false` #9807 + * fix(index.d.ts): correct query type for `findOneAndUpdate()` and `findByIdAndUpdate()` with `rawResult = true` #9803 + * fix(index.d.ts): require setting `new: true` or `returnOriginal: false` to skip null check with `findOneAndUpdate()` #9654 + *
fix(index.d.ts): make methods and statics optional on schema #9801 + * fix(index.d.ts): remove non backwards compatible methods restriction #9801 + * docs: removed the extra word on comment doc #9794 [HenriqueLBorges](https://github.com/HenriqueLBorges) + +5.11.11 / 2021-01-08 +==================== + * fix(model): support calling `create()` with `undefined` as first argument and no callback #9765 + * fix(index.d.ts): ensure TypeScript knows that `this` refers to `DocType` in schema methods with strict mode #9755 + * fix(index.d.ts): make SchemaDefinition accept a model generic #9761 [mroohian](https://github.com/mroohian) + * fix(index.d.ts): add `Aggregate#addFields()` #9774 + * fix(index.d.ts): allow setting `min` and `max` to [number, string] and [Date, string] #9762 + * fix(index.d.ts): improve context and type bindings for `Schema#methods` and `Schema#statics` #9717 + * docs: add recommended connection option #9768 [Fernando-Lozano](https://github.com/Fernando-Lozano) + * chore: correct improper date in History.md #9783 [botv](https://github.com/botv) + +5.11.10 / 2021-01-04 +==================== + * fix(model): support `populate` option for `insertMany()` as a workaround for mongoose-autopopulate #9720 + * perf(schema): avoid creating extra array when initializing array of arrays #9588 + * perf(schema): avoid setting `arrayPath` when casting to a non-array, avoid unnecessarily setting atomics #9588 + * perf(schema): avoid expensive `String#slice()` call when creating a new array #9588 + * fix(queryhelpers): avoid modifying `lean.virtuals` in place #9754 + * fix: fall back to legacy treatment for square brackets if square brackets contents aren't a number #9640 + * fix(document): make fix for #9396 handle null values more gracefully #9709 + * fix(index.d.ts): add missing overloaded function for Document#populate() #9744 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): allow Model.create param1 overwrite #9753 [hasezoey](https://github.com/hasezoey) + * fix(index.d.ts): improve autocomplete for query middleware #9752 [3Aahmednaser94](https://github.com/3Aahmednaser94) + * fix(index.d.ts): add missing function for Aggregate#group() #9750 [coro101](https://github.com/coro101) + * fix(index.d.ts): add missing `Aggregate#project()` #9763 [vorticalbox](https://github.com/vorticalbox) + * fix(index.d.ts): allow `null` as an enum value for schematypes #9746 + * docs(guide+schema): make schema API docs and guide docs' list of Schema options line up #9749 + * docs(documents): add some more details about what the `save()` promise resolves to #9689 + * docs(subdocs): add section about subdocument defaults #7291 + * chore: run GitHub CI on PRs and update badge #9760 [YC](https://github.com/YC) + +5.11.9 / 2020-12-28 +=================== + * fix(document): keeps atomics when assigning array to filtered array #9651 + * fix(document): apply `defaults` option to subdocument arrays #9736 + * fix(index.d.ts): allow passing generic parameter to overwrite `lean()` result type #9728 + * fix(index.d.ts): add missing pre hook for findOneAndUpdate #9743 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): schema methods & statics types #9725 + * fix(index.d.ts): allow `id` paths with non-string values in TypeScript #9723 + * fix(index.d.ts): support calling `createIndexes()` and `ensureIndexes()` with just callback #9706 + * fix(index.d.ts): include `__v` in LeanDocuments #9687 + * fix(index.d.ts): add missing `Aggregate#append()` #9714 + * chore: add eslint typescript support and lint 
index.d.ts file #9729 [simllll](https://github.com/simllll) + * chore: add Github Actions #9688 [YC](https://github.com/YC) + +5.11.8 / 2020-12-14 +=================== + * fix(index.d.ts): add missing single document populate #9696 [YC](https://github.com/YC) + * fix(index.d.ts): make options optional for `toObject` #9700 + * fix(index.d.ts): added missing match and model methods in Aggregate class #9710 [manekshms](https://github.com/manekshms) + * fix(index.d.ts): make options optional for `createIndexes()` and `ensureIndexes()` #9706 + * fix(index.d.ts): support passing a function to `ValidateOpts.message` #9697 + * docs: add media query for ::before on headings #9705 #9704 [YC](https://github.com/YC) + +5.11.7 / 2020-12-10 +=================== + * fix(document): ensure calling `get()` with empty string returns undefined for mongoose-plugin-autoinc #9681 + * fix(model): set `isNew` to false for documents that were successfully inserted by `insertMany` with `ordered = false` when an error occurred #9677 + * fix(index.d.ts): add missing Aggregate#skip() & Aggregate#limit() #9692 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): make `Document#id` optional so types that use `id` can use `Model` #9684 + +5.11.6 / 2020-12-09 +=================== + * fix(middleware): ensure sync errors in pre hooks always bubble up to the calling code #9659 + * fix(index.d.ts): allow passing ObjectId properties as strings to `create()` and `findOneAndReplace()` #9676 + * fix(index.d.ts): allow calling `mongoose.model()` and `Connection#model()` with model as generic param #9685 #9678 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): Fix return type of Model#aggregate() #9680 [orgads](https://github.com/orgads) + * fix(index.d.ts): optional next() parameter for post middleware #9683 [isengartz](https://github.com/isengartz) + * fix(index.d.ts): allow array of validators in SchemaTypeOptions #9686 [cjroebuck](https://github.com/cjroebuck) + +5.11.5 / 2020-12-07 +=================== + * fix(map): support `null` in maps of subdocs #9628 + * fix(index.d.ts): support object syntax for `validate` #9667 + * fix(index.d.ts): Allow number for Schema expires #9670 [alecgibson](https://github.com/alecgibson) + * fix(index.d.ts): allow definining arbitrary properties on SchemaTypeOpts for plugins like mongoose-autopopulate #9669 + * fix(index.d.ts): add mongoose.models #9661 #9660 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(index.d.ts): allow the next() argument to be optional #9665 #9664 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): add missing `VirtualType#applyGetters()` and `applySetters()`, `Schema#virtuals`, `Schema#childSchemas`, `Query#_mongooseOptions` #9658 + * fix(index.d.ts): add `id` to LeanDocuments in case it is defined in the user's schema #9657 + * fix(index.d.ts): add missing types for hook functions #9653 + * fix(index.d.ts): improve support for strict null checks with `upsert` and `orFail()` #9654 + * fix(index.d.ts): make return values for `insertMany()` more consistent #9662 + * fix(index.d.ts): Change options in Connection#collection() to be optional #9663 [orgads](https://github.com/orgads) + * fix(index.d.ts): add the missing generic declaration for Schema #9655 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): add missing `SchemaTypeOpts` and `ConnectionOptions` aliases for backwards compat + * docs(populate): remove `sort()` from `limit` example to avoid potential confusion #9584 + * docs(compatibility): add MongoDB 
server 4.4 version compatibility #9641 + +5.11.4 / 2020-12-04 +=================== + * fix(index.d.ts): add `Document#__v` so documents have a Version by default #9652 [sahasayan](https://github.com/sahasayan) + * fix(index.d.ts): add missing `session` option to `SaveOptions` #9642 + * fix(index.d.ts): add `Schema#paths`, `Schema#static(obj)`, `Embedded#schema`, `DocumentArray#schema`, make Schema inherit from EventEmitter #9650 + * fix(index.d.ts): order when cb is optional in method #9647 [CatsMiaow](https://github.com/CatsMiaow) + * fix(index.d.ts): use DocumentDefinition for `FilterQuery` #9649 + * fix(index.d.ts): correct callback result types for `find()`, `findOne()`, `findById()` #9648 + * fix(index.d.ts): remove `Document#parent()` method because it conflicts with existing user code #9645 + * fix(index.d.ts): add missing `Connection#db` property #9643 + * test(typescript): add `tsconfig.json` file for intellisense #9611 [alecgibson](https://github.com/alecgibson) + +5.11.3 / 2020-12-03 +=================== + * fix(index.d.ts): make Mongoose collection inherit MongoDB collection #9637 #9630 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(index.d.ts): add `Document#_id` so documents have an id by default #9632 + * fix(index.d.ts): allow inline schema definitions for nested properties #9639 [Green-Cat](https://github.com/Green-Cat) + * fix(index.d.ts): add support for missing error message definitions #9638 [SaifAlsabe](https://github.com/SaifAlsabe) + * fix(schema+discriminator): support defining recursive embedded discriminators by passing document array schematype to discriminator #9600 + * fix(index.d.ts): make it possible to use `LeanDocument` with arrays #9620 + * fix(index.d.ts): add `ModelUpdateOptions` as alias for `QueryOptions` for backwards compat #9637 + +5.11.2 / 2020-12-02 +=================== + * fix(index.d.ts): add missing query options and model `findById()` function #9626 #9620 + * fix(index.d.ts): support defining schema paths as arrays of functions #9617 + * fix(index.d.ts): add automatic `_id` for Document, support creating Mongoose globals and accessing collection name #9618 + * fix(index.d.ts): add missing global `get()` and `set()` #9616 + * fix(index.d.ts): add missing `new` and `returnOriginal` options to QueryOptions, add missing model static properties #9627 #9616 #9615 + * fix(index.d.ts): allow `useCreateIndex` in connection options #9621 + +5.11.1 / 2020-12-01 +=================== + * fix(index.d.ts): add missing SchemaOptions #9606 + * fix(index.d.ts): allow using `$set` in updates #9609 + * fix(index.d.ts): add support for using return value of `createConnection()` as a connection as well as a promise #9612 #9610 [alecgibson](https://github.com/alecgibson) + * fix(index.d.ts): allow using `Types.ObjectId()` without `new` in TypeScript #9608 + +5.11.0 / 2020-11-30 +=================== + * feat: add official TypeScript definitions `index.d.ts` file #8108 + * feat(connection): add bufferTimeoutMS option that configures how long Mongoose will allow commands to buffer #9469 + * feat(populate): support populate virtuals with `localField` and `foreignField` as arrays #6608 + * feat(populate+virtual): feat: support getters on populate virtuals, including `get` option for `Schema#virtual()` #9343 + * feat(populate+schema): add support for `populate` schematype option that sets default populate options #6029 + * feat(QueryCursor): execute post find hooks for each doc in query cursor #9345 + * feat(schema): support overwriting cast logic for 
individual schematype instances #8407 + * feat(QueryCursor): make cursor `populate()` in batch when using `batchSize` #9366 [biomorgoth](https://github.com/biomorgoth) + * chore: remove changelog from published bundle #9404 + * feat(model+mongoose): add `overwriteModels` option to bypass `OverwriteModelError` globally #9406 + * feat(model+query): allow defining middleware for all query methods or all document methods, but not other middleware types #9190 + * feat(document+model): make change tracking skip saving if new value matches last saved value #9396 + * perf(utils): major speedup for `deepEqual()` on documents and arrays #9396 + * feat(schema): support passing a TypeScript enum to `enum` validator in schema #9547 #9546 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(debug): #8963 `shell` option for date format (ISODate) #9532 [FlameFractal](https://github.com/FlameFractal) + * feat(document): support square bracket indexing for `get()`, `set()` #9375 + * feat(document): support array and space-delimited syntax for `Document#$isValid()`, `isDirectSelected()`, `isSelected()`, `$isDefault()` #9474 + * feat(string): make `minLength` and `maxLength` behave the same as `minlength` and `maxlength` #8777 [m-weeks](https://github.com/m-weeks) + * feat(document): add `$parent()` as an alias for `parent()` for documents and subdocuments to avoid path name conflicts #9455 + +5.10.19 / 2020-11-30 +==================== + * fix(query): support passing an array to `$type` in query filters #9577 + * perf(schema): avoid creating unnecessary objects when casting to array #9588 + * docs: make example gender neutral #9601 [rehatkathuria](https://github.com/rehatkathuria) + +5.10.18 / 2020-11-29 +==================== + * fix(connection): connect and disconnect can be used destructured #9598 #9597 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.10.17 / 2020-11-27 +==================== + * fix(document): allow setting fields after an undefined field #9587 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.10.16 / 2020-11-25 +==================== + * fix(connection): copy config options from connection rather than base connection when calling `useDb()` #9569 + * fix(schema): support `of` for array type definitions to be consistent with maps #9564 + * docs(dates): fix broken example reference #9557 [kertof](https://github.com/kertof) + * docs(virtualtype): remove unintentional h2 tag re: tj/dox#60 #9568 + +5.10.15 / 2020-11-16 +==================== + * fix(array): make sure `Array#toObject()` returns a vanilla JavaScript array in Node.js 6+ #9540 + * fix(connection): make `disconnect()` stop Mongoose if it is trying to reconnect #9531 + * fix: ensure `Document#overwrite()` correctly overwrites maps #9549 + * fix(document): make transform work with nested paths #9544 #9543 [jonathan-wilkinson](https://github.com/jonathan-wilkinson) + * fix(query): maxTimeMS in count, countDocuments, distinct #9552 [FlameFractal](https://github.com/FlameFractal) + * fix(schema): remove warning re: `increment` as a schema path name #9538 + * fix(model): automatically set `partialFilterExpression` for indexes in discriminator schemas #9542 + +5.10.14 / 2020-11-12 +==================== + * fix(update): handle casting immutable object properties with `$setOnInsert` #9537 + * fix(discriminator): overwrite instead of merge if discriminator schema specifies a path is single nested but base schema has path as doc array #9534 + * docs(middleware): clarify that you need to set both `document` and 
`query` on `remove` hooks to get just document middleware #9530 [mustafaKamal-fe](https://github.com/mustafaKamal-fe) + * docs(CONTRIBUTING): remove mmapv1 recommendation and clean up a few other details #9529 + * refactor: remove duplicate function definition #9527 [ksullivan](https://github.com/ksullivan) + +5.10.13 / 2020-11-06 +==================== + * fix: upgrade mongodb driver -> 3.6.3 for Lambda cold start fixes #9521 #9179 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(document): correctly handle setting props to other nested props #9519 + +5.10.12 / 2020-11-04 +==================== + * fix(connection): catch and report sync errors in connection wrappers like `startSession()` #9515 + * fix(document): ignore getters when diffing values for change tracking #9501 + * fix(connection): avoid executing promise handler unless it's a function #9507 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(error): throw more helpful error when connecting to a non-SSL MongoDB server with SSL enabled #9511 + * docs(model+query): clarify that `deleteOne` and `deleteMany` trigger middleware #9504 + * docs(ssl): add note about `ssl` defaulting to `true` for srv connection strings #9511 + +5.10.11 / 2020-10-26 +==================== + * fix(connection): when calling `mongoose.connect()` multiple times in parallel, make 2nd call wait for connection before resolving #9476 + * fix(map): make `save()` persist `Map#clear()` #9493 + * fix(document): avoid overwriting array subdocument when setting dotted path that isn't selected #9427 + * fix(connection): don't throw Atlas error if server discovery doesn't find any servers #9470 + * docs: update options for Model.findOneAndUpdate #9499 [radamson](https://github.com/radamson) + +5.10.10 / 2020-10-23 +==================== + * fix(schema): handle merging schemas from separate Mongoose module instances when schema has a virtual #9471 + * fix(connection): make connection.then(...) 
resolve to a connection instance #9497 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(aggregate): when using $search with discriminators, add `$match` as the 2nd stage in pipeline rather than 1st #9487 + * fix(query): cast $nor within $elemMatch #9479 + * docs(connection): add note about 'error' event versus 'disconnected' event #9488 [tareqdayya](https://github.com/tareqdayya) + +5.10.9 / 2020-10-09 +=================== + * fix(update): strip out unused array filters to avoid "filter was not used in the update" error #9468 + * fix(mongoose): allow setting `autoCreate` as a global option to be consistent with `autoIndex` #9466 + +5.10.8 / 2020-10-05 +=================== + * fix(schema): handle setting nested paths underneath single nested subdocs #9459 + * fix(schema+index): allow calling `mongoose.model()` with schema from a different Mongoose module instance #9449 + * fix(transaction): fix saving new documents w/ arrays in transactions #9457 [PenguinToast](https://github.com/PenguinToast) + * fix(document): track `reason` on cast errors that occur while init-ing a document #9448 + * fix(model): make `createCollection()` not throw error when collection already exists to be consistent with v5.9 #9447 + * docs(connections): add SSL connections docs #9443 + * docs(query_casting): fix typo #9458 [craig-davis](https://github.com/craig-davis) + +5.10.7 / 2020-09-24 +=================== + * fix(schema): set correct path and schema on nested primitive arrays #9429 + * fix(document): pass document to required validator so `required` can use arrow functions #9435 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(document): handle required when schema has property named `isSelected` #9438 + * fix(timestamps): allow using timestamps when schema has a property named 'set' #9428 + * fix(schema): make `Schema#clone()` use parent Mongoose instance's Schema constructor #9426 + +5.10.6 / 2020-09-18 +=================== + * fix(populate): handle `options.perDocumentLimit` option same as `perDocumentLimit` when calling `populate()` #9418 + * fix(document): invalidate path if default function throws an error #9408 + * fix: ensure subdocument defaults run after initial values are set when initing #9408 + * docs(faq+queries): add more detail about duplicate queries, including an faq entry #9386 + * docs: replace var with let and const in docs and test files #9414 [jmadankumar](https://github.com/jmadankumar) + * docs(model+query): document using array of strings as projection #9413 + * docs(middleware): add missing backtick #9425 [tphobe9312](https://github.com/tphobe9312) + +5.10.5 / 2020-09-11 +=================== + * fix: bump mongodb -> 3.6.2 #9411 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(query+aggregate+cursor): support async iteration over a cursor instance as opposed to a Query or Aggregate instance #9403 + * fix(document): respect child schema `minimize` if `toObject()` is called without an explicit `minimize` #9405 + * docs(guide): use const instead of var #9394 [nainardev](https://github.com/nainardev) + * docs(query): link to lean, findOneAndUpdate, query casting tutorials from query docs #9410 + +5.10.4 / 2020-09-09 +=================== + * fix(document): allow setting nested path to instance of model #9392 + * fix: handle `findOneAndRemove()` with `orFail()` #9381 + * fix(schema): support setting `_id` option to `false` after instantiating schema #9390 + * docs(document): fix formatting on `getChanges()` #9376 + +5.10.3 / 2020-09-03 
+=================== + * fix: upgrade mongodb -> 3.6.1 #9380 [lamhieu-vk](https://github.com/lamhieu-vk) + * fix(populate): allow populating paths underneath subdocument maps #9359 + * fix(update): handle casting map paths when map is underneath a single nested subdoc #9298 + * fix(discriminator): avoid removing nested path if both base and discriminator schema have the same nested path #9362 + * fix(schema): support `Schema#add()` with schematype instances with different paths #9370 + * docs(api): fix typo in `Query#get()` example #9372 [elainewlin](https://github.com/elainewlin) + +5.10.2 / 2020-08-28 +=================== + * fix(model): avoid uncaught error if `insertMany()` fails due to server selection error #9355 + * fix(aggregate): automatically convert accumulator function options to strings #9364 + * fix(document): handle `pull()` on a document array when `_id` is an alias #9319 + * fix(queryhelpers): avoid path collision error when projecting in discriminator key with `.$` #9361 + * fix: fix typo in error message thrown by unimplemented createIndex #9367 [timhaley94](https://github.com/timhaley94) + * docs(plugins): note that plugins should be applied before you call `mongoose.model()` #7723 + +5.10.1 / 2020-08-26 +=================== + * fix(mongoose): fix `.then()` is not a function error when calling `mongoose.connect()` multiple times #9358 #9335 #9331 + * fix: allow calling `create()` after `bulkWrite()` by clearing internal casting context #9350 + * fix(model): dont wipe out changes made while `save()` is in-flight #9327 + * fix(populate): skip checking `refPath` if the path to populate is undefined #9340 + * fix(document): allow accessing document values from function `default` on array #9351 + * fix(model): skip applying init hook if called with `schema.pre(..., { document: false })` #9316 + * fix(populate): support `retainNullValues` when setting `_id` to `false` for subdocument #9337 #9336 [FelixRe0](https://github.com/FelixRe0) + * docs: update connect example to avoid deprecation warnings #9332 [moander](https://github.com/moander) + +5.10.0 / 2020-08-14 +=================== + * feat: upgrade to MongoDB driver 3.6 for full MongoDB 4.4 support + * feat(connection): add `Connection#transaction()` helper that handles resetting Mongoose document state if the transaction fails #8380 + * feat(connection): make transaction() helper reset array atomics after failed transaction + * feat(schema+model): add `optimisticConcurrency` option to use OCC for `save()` #9001 #5424 + * feat(aggregate): add `Aggregate#search()` for Atlas Text Search #9115 + * feat(mongoose): add support for setting `setDefaultsOnInsert` as a global option #9036 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(mongoose): add support for setting `returnOriginal` as a global option #9189 #9183 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(mongoose): allow global option mongoose.set('strictQuery', true) #9016 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(document): add Document#getChanges #9097 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(document): support `defaults` option to disable adding defaults to a single document #8271 + * feat(SingleNestedPath+DocumentArray): add static `set()` function for global options, support setting `_id` globally #8883 + * feat(query): handle casting `$or` when each clause contains a different discriminator key #9018 + * feat(query): add overwriteDiscriminatorKey option that allows changing the 
discriminator key in `findOneAndUpdate()`, `updateOne()`, etc. #6087 + * fix(connection): make calling `mongoose.connect()` while already connected a no-op #9203 + * feat(connection): add `getClient()` and `setClient()` function for interacting with a connection's underlying MongoClient instance #9164 + * feat(document+populate): add `parent()` function that allows you to get the parent document for populated docs #8092 + * feat(document): add `useProjection` option to `toObject()` and `toJSON()` for hiding deselected fields on newly created documents #9118 + +5.9.29 / 2020-08-13 +=================== + * fix(document): support setting nested path to itself when it has nested subpaths #9313 + * fix(model): make `syncIndexes()` report error if it can't create an index #9303 + * fix: handle auth error when Atlas username is incorrect #9300 + +5.9.28 / 2020-08-07 +=================== + * fix(connection): consistently stop buffering when "reconnected" is emitted #9295 + * fix(error): ensure `name` and `message` show up on individual ValidatorErrors when calling JSON.stringify() on a ValidationError #9296 + * fix(document): keeps manually populated paths when setting a nested path to itself #9293 + * fix(document): allow saving after setting document array to itself #9266 + * fix(schema): handle `match` schema validator with `/g` flag #9287 + * docs(guide): refactor transactions examples to async/await #9204 + +5.9.27 / 2020-07-31 +=================== + * fix: upgrade mongodb driver -> 3.5.10 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(transactions): make transactions docs use async/await for readability #9204 + +5.9.26 / 2020-07-27 +=================== + * fix(document): allow unsetting boolean field by setting the field to `undefined` #9275 + * fix(document): throw error when overwriting a single nested subdoc changes an immutable path within the subdoc #9281 + * fix(timestamps): apply timestamps to `bulkWrite()` updates when not using `$set` #9268 + * fix(browser): upgrade babel to v7 to work around an issue with `extends Error` #9273 + * fix: make subdocument's `invalidate()` methods have the same return value as top-level document #9271 + * docs(model): make `create()` docs use async/await, and add another warning about how `create()` with options requires array syntax #9280 + * docs(connections): clarify that Mongoose can emit 'connected' when reconnecting after losing connectivity #9240 + * docs(populate): clarify that you can't filter based on foreign document properties when populating #9279 + * docs(document+model): clarify how `validateModifiedOnly` option works #9263 + * docs: remove extra poolSize option in comment #9270 [shahvicky](https://github.com/shahvicky) + * docs: point bulkWrite() link to mongoose docs instead of localhost #9284 + +5.9.25 / 2020-07-17 +=================== + * fix(discriminator): allow passing a compiled model's schema as a parameter to `discriminator()` #9238 + * fix(connection): throw more readable error when querying db before initial connection when `bufferCommands = false` #9239 + * fix(indexes): don't unnecessarily drop text indexes when running `syncIndexes()` #9225 + * fix: make Boolean _castNullish respect omitUndefined #9242 [ehpc](https://github.com/ehpc) + * fix(populate): populate single nested discriminator underneath doc array when populated docs have different model but same id #9244 + * docs(mongoose): correct formatting typo #9247 [JNa0](https://github.com/JNa0) + +5.9.24 / 2020-07-13 +=================== + * 
fix(connection): respect connection-level `bufferCommands` option if `mongoose.connect()` is called after `mongoose.model()` #9179 + * fix(document): clear out `priorDoc` after overwriting single nested subdoc so changes after overwrite get persisted correctly #9208 + * fix(connection): dont overwrite user-specified `bufferMaxEntries` when setting `bufferCommands` #9218 + * fix(model): allow passing projection to `Model.hydrate()` #9209 + * fix(schema+document): support adding `null` to schema boolean's `convertToFalse` set #9223 + * docs(model): make `find` and `findOne()` examples use async/await and clarify `find({})` is find all #9210 + +4.13.21 / 2020-07-12 +==================== + * fix(query): delete top-level `_bsontype` property in queries to prevent silent empty queries #8222 + +5.9.23 / 2020-07-10 +=================== + * fix(model): fix `syncIndexes()` error when db index has a collation but Mongoose index does not #9224 [clhuang](https://github.com/clhuang) + * fix(array): only cast array to proper depth if it contains an non-array value #9217 #9215 [cyrilgandon](https://github.com/cyrilgandon) + * docs(schematype): document the `transform` option #9211 + * docs(mongoose): fix typo #9212 [JNa0](https://github.com/JNa0) + +5.9.22 / 2020-07-06 +=================== + * fix(schema): treat `{ type: mongoose.Schema.Types.Array }` as equivalent to `{ type: Array }` #9194 + * fix: revert fix for #9107 to avoid issues when calling `connect()` multiple times #9167 + * fix(update): respect storeSubdocValidationError option with update validators #9172 + * fix: upgrade to safe-buffer 5.2 #9198 + * docs: add a note about SSL validation to migration guide #9147 + * docs(schemas): fix inconsistent header #9196 [samtsai15](https://github.com/samtsai15) + +5.9.21 / 2020-07-01 +=================== + * fix: propagate `typeKey` option to implicitly created schemas from `typePojoToMixed` #9185 [joaoritter](https://github.com/joaoritter) + * fix(populate): handle embedded discriminator `refPath` with multiple documents #9153 + * fix(populate): handle deselected foreign field with `perDocumentLimit` and multiple documents #9175 + * fix(document): disallow `transform` functions that return promises #9176 #9163 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(document): use strict equality when checking mixed paths for modifications #9165 + * docs: add target="_blank" to all edit links #9058 + +5.9.20 / 2020-06-22 +=================== + * fix(populate): handle populating primitive array under document array discriminator #9148 + * fix(connection): make sure to close previous connection when calling `openUri()` on an already open connection #9107 + * fix(model): fix conflicting $setOnInsert default values with `update` values in bulkWrite #9160 #9157 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(validation): add note about validateBeforeSave and invalidate #9144 [dandv](https://github.com/dandv) + * docs: specify the array field syntax for invalidate #9137 [dandv](https://github.com/dandv) + * docs: fix several typos and broken references #9024 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: fix minor typo #9143 [dandv](https://github.com/dandv) + +5.9.19 / 2020-06-15 +=================== + * fix: upgrade mongodb driver -> 3.5.9 #9124 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix: copy `required` validator on single nested subdoc correctly when calling `Schema#clone()` #8819 + * fix(discriminator): handle `tiedValue` when casting 
update on nested paths #9108 + * fix(model): allow empty arrays for bulkWrite #9132 #9131 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(schema): correctly set partialFilterExpression for nested schema indexes #9091 + * fix(castArrayFilters): handle casting on all fields of array filter #9122 [lafeuil](https://github.com/lafeuil) + * fix(update): handle nested path createdAt when overwriting parent path #9105 + * docs(subdocs): add some notes on the difference between single nested subdocs and nested paths #9085 + * docs(subdocs): improve docs on `typePojoToMixed` #9085 + * docs: add note about connections in `globalSetup` with Jest #9063 + * docs: add schema and how to set default sub-schema to schematype options #9111 [dfle](https://github.com/dfle) + * docs(index): use `const` instead of `var` in examples #9125 [dmcgrouther](https://github.com/dmcgrouther) + * docs: corrected markdown typo #9117 + +5.9.18 / 2020-06-05 +=================== + * fix: improve atlas error in the event of incorrect password #9095 + * docs: add edit link for all docs pages #9058 + * fix(document): allow accessing `$locals` when initializing document #9099 #9098 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(query): make `setDefaultsOnInsert` a mongoose option so it doesn't end up in debug output #9086 + * docs(connection+index): add serverSelectionTimeoutMS and heartbeatFrequencyMS to `connect()` and `openUri()` options #9071 + * docs(geojson): add notes about geojson 2dsphere indexes #9044 + * docs: make active page bold in navbar #9062 + * docs: correct a typo in a code snippet #9089 [Elvis-Sarfo](https://github.com/Elvis-Sarfo) + +5.9.17 / 2020-06-02 +=================== + * fix(document): avoid tracking changes like `splice()` on slice()-ed arrays #9011 + * fix(populate): make populating a nested path a no-op #9073 + * fix(document): clear nested cast errors when overwriting an array path #9080 + * fix: upgrade mongodb to v3.5.8 #9069 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(document): add validateModifiedOnly to Document#save(), Document#validateSync() and Document#validate() #9078 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(faq): fix typo #9075 [tigransimonyan](https://github.com/tigransimonyan) + * docs: document all parameters to .debug #9029 [dandv](https://github.com/dandv) + * docs: fix property value in Getters example #9061 [ismet](https://github.com/ismet) + +5.9.16 / 2020-05-25 +=================== + * perf(error): convert errors to classes extending Error for lower CPU overhead #9021 [zbjornson](https://github.com/zbjornson) + * fix(query): throw CastError if filter `$and`, `$or`, `$nor` contain non-object values #8948 + * fix(bulkwrite): cast filter & update to schema after applying timestamps #9030 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(document): don't overwrite defaults with undefined keys in nested documents #9039 [vitorhnn](https://github.com/vitorhnn) + * fix(discriminator): remove discriminator schema nested paths pulled from base schema underneath a mixed path in discriminator schema #9042 + * fix(model): make syncIndexes() not drop index if all user-specified collation options are the same #8994 + * fix(document): make internal `$__.scope` property a symbol instead to work around a bug with fast-safe-stringify #8955 + * docs: model.findByIdAndUpdate() 'new' param fix #9026 [dandv](https://github.com/dandv) + +5.9.15 / 2020-05-18 +=================== + * fix(schema): treat creating 
dotted path with no parent as creating a nested path #9020 + * fix(documentarray): make sure you can call `unshift()` after `map()` #9012 [philippejer](https://github.com/philippejer) + * fix(model): cast bulkWrite according to discriminator schema if discriminator key is present #8982 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(schema): remove `db` from reserved keywords #8940 + * fix(populate): treat populating a doc array that doesn't have a `ref` as a no-op #8946 + * fix(timestamps): set createdAt and updatedAt on doubly nested subdocs when upserting #8894 + * fix(model): allow POJOs as schemas for model.discriminator(...) #8991 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(model): report `insertedDocs` on `insertMany()` errors #8938 + * fix(model): ensure consistent `writeErrors` property on insertMany error with `ordered: false`, even if only one op failed #8938 + * docs: add anchor tag to strictQuery and strict #9014 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(faq): remove faq ipv6 #9004 + * docs: add note about throwing error only after validation and fix broken reference to api/CastError #8993 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: fix typos in documents.pug #9005 [dandv](https://github.com/dandv) + +5.9.14 / 2020-05-13 +=================== + * fix(cursor): add index as second parameter to eachAsync callback #8972 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(query): cast filter according to discriminator schema if discriminator key in filter #8881 + * fix(model): fix throwing error when populating virtual path defined on child discriminator #8924 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(errors): handle case when user has made `Error.prototype.toJSON` read only #8986 [osher](https://github.com/osher) + * fix(model): add `kind` to cast errors thrown by query execution #8953 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(update): use child schema strict on single nested updates if useNestedStrict not set #8922 + * docs(model): improve `save()` docs #8956 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: add immutable type to Schema Types #8987 [Andrew5569](https://github.com/Andrew5569) + * docs: sort schema reserved keys in documentation #8966 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.9.13 / 2020-05-08 +=================== + * fix(schema): mark correct path as modified when setting a path underneath a nested array of documents #8926 + * fix(query): Query#select({ field: false }) should not overwrite schema selection options #8929 #8923 + * fix(update): ensure immutable properties are ignored in bulk upserts #8952 [philippejer](https://github.com/philippejer) + * docs(browser): add back sample webpack config #8890 + * docs(faq): fix broken reference in limit vs perDocumentLimit #8937 + +5.9.12 / 2020-05-04 +=================== + * fix(document): report cast error on array elements with array index instead of just being a cast error for the whole array #8888 + * fix(connection): throw more helpful error in case of IP whitelisting issue with Atlas #8846 + * fix(schema): throw error on schema with reserved key with type of object #8869 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(connection): inherit config for useDB from default connection #8267 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(query): set mongodb options for `distinct()` #8906
[clhuang](https://github.com/clhuang) + * fix(schema): allow adding descending indexes on schema #8895 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(document): set defaults if setting nested path to empty object with `minimize: false` #8829 + * fix(populate): check discriminator existence before accessing schema in getModelsMapForPopulate #8837 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: fix broken references to Mongoose#Document API, and prefer mongoose.model(...) over Document#model(...) #8914 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(model): adds options.limit to Model.insertMany(...) #8864 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: add flattenMaps and aliases to Document#toObject() #8901 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(model): add options.overwrite to findOneAndUpdate #8865 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(populate+faq): separate limit-vs-perDocumentLimit into its own section, add FAQ for populate and limit #8917 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.9.11 / 2020-04-30 +=================== + * fix: upgrade mongodb driver -> 3.5.7 #8842 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix: validate nested paths on Model.validate(...) #8848 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(populate): make doc.execPopulate(options) a shorthand for doc.populate(options).execPopulate() #8840 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(model): return validation errors when all docs are invalid & rawResult set #8853 [tusharf5](https://github.com/tusharf5) + * fix(schemaType): treat select: null or select: undefined as not specified #8850 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix: fix stream close event listener being called multiple times in Node 14 #8835 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(populate): handle `clone` with `lean` when setting a path to `null` #8807 + * docs(faq): clarify setting paths under document arrays with `markModified()` #8854 + * docs: fix race condition in creating connection for lambda #8845 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: add options.path for Model.populate(...) 
#8833 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: use ES6 classes for custom schema type example #8802 + +5.9.10 / 2020-04-20 +=================== + * fix: upgrade mongodb -> 3.5.6, bson -> 1.1.4 #8719 + * fix(document): avoid calling `$set()` on object keys if object path isn't in schema #8751 + * fix(timestamps): handle timestamps on doubly nested subdocuments #8799 + * fix(schematype): throw error if default is set to a schema instance #8751 + * fix: handle $elemMatch projection with `select: false` in schema #8818 #8806 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs: make FAQ questions more linkable #8825 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(validation): use `init()` as opposed to `once('index')` in `unique` example #8816 + * docs: clarify `insertMany()` return value #8820 [dandv](https://github.com/dandv) + * docs(populate+query): fix typos #8793 #8794 [dandv](https://github.com/dandv) + * docs(model): document skipId parameter #8791 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.9.9 / 2020-04-13 +================== + * fix(model): make Model.bulkWrite accept `strict` option #8782 #8788 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(virtual): make populated virtual getter return value when it is passed in #8775 #8774 [makinde](https://github.com/makinde) + * fix(document): handle validating document array whose docs contain maps and nested paths #8767 + * fix(document): skip discriminator key when overwriting a document #8765 + * fix(populate): support `clone` option with `lean` #8761 #8760 + * docs(transactions): use `endSession()` in all transactions examples #8741 + * docs(queries): expand streaming section to include async iterators, cursor timeouts, and sesssion idle timeouts #8720 + * docs(model+query+findoneandupdate): add docs for `returnOriginal` option #8766 + * docs(model): fix punctuation #8788 [dandv](https://github.com/dandv) + * docs: fix typos #8780 #8799 [dandv](https://github.com/dandv) + +5.9.8 / 2020-04-06 +================== + * fix(map): run getters when calling `Map#get()` #8730 + * fix(populate): handle `refPath` function in embedded discriminator #8731 + * fix(model): allow setting timestamps to false for bulkWrite #8758 #8745 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(model): pass custom options to `exists()` when no changes to save #8764 #8739 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(update): respect `useNestedStrict: false` when updating a single nested path #8735 + * fix(schema): allow `modelName` as a schema path, since `modelName` is a static property on models #7967 + * docs(promises): add section about using `exec()` with queries and `await` #8747 + * docs(connections): clarify that `connectTimeoutMS` doesn't do anything with `useUnifiedTopology`, should use `serverSelectionTimeoutMS` #8721 + * chore: upgrade mpath -> 0.7.0 #8762 [roja548](https://github.com/roja548) + +5.9.7 / 2020-03-30 +================== + * fix(map): avoid infinite loop when setting a map of documents to a document copied using spread operator #8722 + * fix(query): clean stack trace for filter cast errors so they include the calling file #8691 + * fix(model): make bulkWrite updates error if `strict` and `upsert` are set and `filter` contains a non-schema path #8698 + * fix(cast): make internal `castToNumber()` allow undefined #8725 [p3x-robot](https://github.com/p3x-robot) + +5.9.6 / 2020-03-23 +================== + * fix(document): allow 
saving document with nested document array after setting `nestedArr.0` #8689 + * docs(connections): expand section about multiple connections to describe patterns for exporting schemas #8679 + * docs(populate): add note about `execPopulate()` to "populate an existing document" section #8671 #8275 + * docs: fix broken links #8690 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(guide): fix typos #8704 [MateRyze](https://github.com/MateRyze) + * docs(guide): fix minor typo #8683 [pkellz](https://github.com/pkellz) + +5.9.5 / 2020-03-16 +================== + * fix: upgrade mongodb driver -> 3.5.5 #8667 #8664 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(connection): emit "disconnected" after losing connectivity to every member of a replica set with `useUnifiedTopology: true` #8643 + * fix(array): allow calling `slice()` after `push()` #8668 #8655 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(map): avoid marking map as modified if setting `key` to the same value #8652 + * fix(updateValidators): don't run `Mixed` update validator on dotted path underneath mixed type #8659 + * fix(populate): ensure top-level `limit` applies if one document being populated has more than `limit` results #8657 + * fix(populate): throw error if both `limit` and `perDocumentLimit` are set #8661 #8658 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(findOneAndUpdate): add a section about the `rawResult` option #8662 + * docs(guide): add section about `loadClass()` #8623 + * docs(query): improve `Query#populate()` example to clarify that `sort` doesn't affect the original result's order #8647 + +5.9.4 / 2020-03-09 +================== + * fix(document): allow `new Model(doc)` to set immutable properties when doc is a mongoose document #8642 + * fix(array): make sure you can call `unshift()` after `slice()` #8482 + * fix(schema): propagate `typePojoToMixed` to implicitly created arrays #8627 + * fix(schema): also propagate `typePojoToMixed` option to schemas implicitly created because of `typePojoToMixed` #8627 + * fix(model): support passing `background` option to `syncIndexes()` #8645 + * docs(schema): add a section about the `_id` path in schemas #8625 + * docs(virtualtype+populate): document using `match` with virtual populate #8616 + * docs(guide): fix typo #8648 [sauzy34](https://github.com/sauzy34) + +5.9.3 / 2020-03-02 +================== + * fix: upgrade mongodb driver -> 3.5.4 #8620 + * fix(document): set subpath defaults when overwriting single nested subdoc #8603 + * fix(document): make calling `validate()` with single nested subpath only validate that single nested subpath #8626 + * fix(browser): make `mongoose.model()` return a class in the browser to allow hydrating populated data in the browser #8605 + * fix(model): make `syncIndexes()` and `cleanIndexes()` drop compound indexes with `_id` that aren't in the schema #8559 + * docs(connection+index): add warnings to explain that bufferMaxEntries does nothing with `useUnifiedTopology` #8604 + * docs(document+model+query): add `options.timestamps` parameter docs to `findOneAndUpdate()` and `findByIdAndUpdate()` #8619 + * docs: fix out of date links to tumblr #8599 + +5.9.2 / 2020-02-21 +================== + * fix(model): add discriminator key to bulkWrite filters #8590 + * fix(document): when setting nested array path to non-nested array, wrap values top-down rather than bottom up when possible #8544 + * fix(document): dont leave nested key as undefined when setting nested key to empty object 
with minimize #8565 + * fix(document): avoid throwing error if setting path to Mongoose document with nullish `_doc` #8565 + * fix(update): handle Binary type correctly with `runValidators` #8580 + * fix(query): run `deleteOne` hooks only on `Document#deleteOne()` when setting `options.document = true` for `Schema#pre()` #8555 + * fix(document): allow calling `validate()` in post validate hook without causing parallel validation error #8597 + * fix(virtualtype): correctly copy options when cloning #8587 + * fix(collection): skip creating capped collection if `autoCreate` set to `false` #8566 + * docs(middleware): clarify that updateOne and deleteOne hooks are query middleware by default, not document middleware #8581 + * docs(aggregate): clarify that `Aggregate#unwind()` can take object parameters as well as strings #8594 + +5.9.1 / 2020-02-14 +================== + * fix(model): set session when calling `save()` with no changes #8571 + * fix(schema): return correct pathType when single nested path is embedded under a nested path with a numeric name #8583 + * fix(queryhelpers): remove `Object.values()` for Node.js 4.x-6.x support #8596 + * fix(cursor): respect sort order when using `eachAsync()` with `parallel` and a sync callback #8577 + * docs: update documentation of custom _id overriding in discriminators #8591 [sam-mfb](https://github.com/sam-mfb) + +5.9.0 / 2020-02-13 +================== + * fix: upgrade to MongoDB driver 3.5 #8520 #8563 + * feat(schematype): support setting default options for schema type (`trim` on all strings, etc.) #8487 + * feat(populate): add `perDocumentLimit` option that limits per document in `find()` result, rather than across all documents #7318 + * feat(schematype): enable setting `transform` option on individual schematypes #8403 + * feat(timestamps): allow setting `currentTime` option for setting custom function to get the current time #3957 + * feat(connection): add `Connection#watch()` to watch for changes on an entire database #8425 + * feat(document): add `Document#$op` property to make it easier to tell what operation is running in middleware #8439 + * feat(populate): support `limit` as top-level populate option #8445 + +5.8.13 / 2020-02-13 +=================== + * fix(populate): use safe get to avoid crash if schematype doesn't have options #8586 + +5.8.12 / 2020-02-12 +=================== + * fix(query): correctly cast dbref `$id` with `$elemMatch` #8577 + * fix(populate): handle populating when some embedded discriminator schemas have `refPath` but none of the subdocs have `refPath` #8553 + * docs: add useUnifiedTopology to homepage example #8558 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * refactor(utils): moving promiseOrCallback to helpers/promiseOrCallback #8573 [hugosenari](https://github.com/hugosenari) + +5.8.11 / 2020-01-31 +=================== + * fix(document): allow calling `validate()` multiple times in parallel on subdocs to avoid errors if Mongoose double-validates [taxilian](https://github.com/taxilian) #8548 #8539 + * fix(connection): allow calling initial `mongoose.connect()` after connection helpers on the same tick #8534 + * fix(connection): throw helpful error when callback param to `mongoose.connect()` or `mongoose.createConnection()` is not a function #8556 + * fix(drivers): avoid unnecessary caught error when importing #8528 + * fix(discriminator): remove unnecessary `utils.merge()` [samgladstone](https://github.com/samgladstone) #8542 + * docs: add "built with mongoose" page #8540 + +5.8.10 / 2020-01-27 
+=================== + * perf(document): improve performance of document creation by skipping unnecessary split() calls #8533 [igrunert-atlassian](https://github.com/igrunert-atlassian) + * fix(document): only call validate once for deeply nested subdocuments #8532 #8531 [taxilian](https://github.com/taxilian) + * fix(document): create document array defaults in forward order, not reverse #8514 + * fix(document): allow function as message for date min/max validator #8512 + * fix(populate): don't try to populate embedded discriminator that has populated path but no `refPath` #8527 + * fix(document): apply plugins from base schema when creating a discriminator #8536 [samgladstone](https://github.com/samgladstone) + * fix(document): ensure parent and ownerDocument are set for subdocs in document array defaults #8509 + * fix(document): don't set undefined keys to null if minimize is false #8504 + * fix(update): bump timestamps when using update aggregation pipelines #8524 + * fix(model): ensure `cleanIndexes()` drops indexes with different collations #8521 + * docs(model): document `insertMany` `lean` option #8522 + * docs(connections): document `authSource` option #8517 + +5.8.9 / 2020-01-17 +================== + * fix(populate): skip populating embedded discriminator array values that don't have a `refPath` #8499 + * docs(queries): clarify when to use queries versus aggregations #8494 + +5.8.8 / 2020-01-14 +================== + * fix(model): allow using `lean` with `insertMany()` #8507 #8234 [ntsekouras](https://github.com/ntsekouras) + * fix(document): don't throw parallel validate error when validating subdoc underneath modified nested path #8486 + * fix: allow `typePojoToMixed` as top-level option #8501 #8500 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(populate+schematypes): make note of `_id` getter for ObjectIds in populate docs #8483 + +5.8.7 / 2020-01-10 +================== + * fix(documentarray): modify ownerDocument when setting doc array to a doc array that's part of another document #8479 + * fix(document): ensure that you can call `splice()` after `slice()` on an array #8482 + * docs(populate): improve cross-db populate docs to include model refs #8497 + +5.8.6 / 2020-01-07 +==================== + * chore: merge changes from 4.13.20 and override mistaken publish to latest tag + +4.13.20 / 2020-01-07 +==================== + * fix(schema): make aliases handle mongoose-lean-virtuals #6069 + +5.8.5 / 2020-01-06 +================== + * fix(document): throw error when running `validate()` multiple times on the same document #8468 + * fix(model): ensure deleteOne() and deleteMany() set discriminator filter even if no conditions passed #8471 + * fix(document): allow pre('validate') hooks to throw errors with `name = 'ValidationError'` #8466 + * fix(update): move top level $set of immutable properties to $setOnInsert so upserting with immutable properties actually sets the property #8467 + * fix(document): avoid double-running validators on single nested subdocs within single nested subdocs #8468 + * fix(populate): support top-level match option for virtual populate #8475 + * fix(model): avoid applying skip when populating virtual with count #8476 + +5.8.4 / 2020-01-02 +================== + * fix(populate): ensure populate virtual gets set to empty array if `localField` is undefined in the database #8455 + * fix(connection): wrap `mongoose.connect()` server selection timeouts in MongooseTimeoutError for more readable stack traces #8451 + * fix(populate): allow deselecting
`foreignField` from projection by prefixing with `-` #8460 + * fix(populate): support embedded discriminators with `refPath` when not all discriminator schemas have `refPath` #8452 + * fix(array): allow defining `enum` on array if an array of numbers #8449 + +5.8.3 / 2019-12-23 +================== + * fix: upgrade mongodb -> 3.4.1 #8430 [jaschaio](https://github.com/jaschaio) + * fix(populate): don't add empty subdocument to array when populating path underneath a non-existent document array #8432 + * fix(schema): handle `_id` option for document array schematypes #8450 + * fix(update): call setters when updating mixed type #8444 + * docs(connections): add note about MongoTimeoutError.reason #8402 + +5.8.2 / 2019-12-20 +================== + * fix(schema): copy `.add()`-ed paths when calling `.add()` with schema argument #8429 + * fix(cursor): pull schema-level readPreference when using `Query#cursor()` #8421 + * fix(cursor): wait for all promises to resolve if `parallel` is greater than number of documents #8422 + * fix(document): depopulate entire array when setting array path to a partially populated array #8443 + * fix: handle setDefaultsOnInsert with deeply nested subdocs #8392 + * fix(document): report `DocumentNotFoundError` if underlying document deleted but no changes made #8428 #8371 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(populate): clarify limitations of `limit` option for populate and suggest workaround #8409 + * docs(deprecations): explain which connection options are no longer relevant with useUnifiedTopology #8411 + * chore: allow browser build to be published #8435 #8427 [captaincaius](https://github.com/captaincaius) + +5.8.1 / 2019-12-12 +================== + * fix(documentarray): dont attempt to cast when modifying array returned from map() #8399 + * fix(document): update single nested subdoc parent when setting to existing single nested doc #8400 + * fix(schema): add `$embeddedSchemaType` property to arrays for consistency with document arrays #8389 + +5.8.0 / 2019-12-09 +================== + * feat: wrap server selection timeout errors in `MongooseTimeoutError` to retain original stack trace #8259 + * feat(model): add `Model.validate()` function that validates a POJO against the model's schema #7587 + * feat(schema): add `Schema#pick()` function to create a new schema with a picked subset of the original schema's paths #8207 + * feat(schema): add ability to change CastError message using `cast` option to SchemaType #8300 + * feat(schema): group indexes defined in schema path with the same name #6499 + * fix(model): build all indexes even if one index fails #8185 [unusualbob](https://github.com/unusualbob) + * feat(browser): pre-compile mongoose/browser #8350 [captaincaius](https://github.com/captaincaius) + * fix(connection): throw error when setting unsupported option #8335 #6899 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * feat(schema): support `enum` validator for number type #8139 + * feat(update): allow using MongoDB 4.2 update aggregation pipelines, with no Mongoose casting #8225 + * fix(update): make update validators run on all subpaths when setting a nested path, even omitted subpaths #3587 + * feat(schema): support setting `_id` as an option to single nested schema paths #8137 + * feat(query): add Query#mongooseOptions() function #8296 + * feat(array): make `MongooseArray#push()` support using `$position` #4322 + * feat(schema): make pojo paths optionally become subdoc instead of Mixed #8228 
[captaincaius](https://github.com/captaincaius) + * feat(model): add Model.cleanIndexes() to drop non-schema indexes #6676 + * feat(document): make `updateOne()` document middleware pass `this` to post hooks #8262 + * feat(aggregate): run pre/post aggregate hooks on `explain()` #5887 + * docs(model+query): add `session` option to docs for findOneAndX() methods #8396 + +5.7.14 / 2019-12-06 +=================== + * fix(cursor): wait until all `eachAsync()` functions finish before resolving the promise #8352 + * fix(update): handle embedded discriminator paths when discriminator key is defined in the update #8378 + * fix(schematype): handle passing `message` function to `SchemaType#validate()` as positional arg #8360 + * fix(map): handle cloning a schema that has a map of subdocuments #8357 + * docs(schema): clarify that `uppercase`, `lowercase`, and `trim` options for SchemaString don't affect RegExp queries #8333 + +5.7.13 / 2019-11-29 +=================== + * fix: upgrade mongodb driver -> 3.3.5 #8383 + * fix(model): catch the error when insertMany fails to initialize the document #8365 #8363 [Fonger](https://github.com/Fonger) + * fix(schema): add array.$, array.$.$ subpaths for nested arrays #6405 + * docs(error): add more detail about the ValidatorError class, including properties #8346 + * docs(connection): document `Connection#models` property #8314 + +5.7.12 / 2019-11-19 +=================== + * fix: avoid throwing error if calling `push()` on a doc array with no parent #8351 #8317 #8312 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(connection): only buffer for "open" events when calling connection helper while connecting #8319 + * fix(connection): pull default database from connection string if specified #8355 #8354 [zachazar](https://github.com/zachazar) + * fix(populate+discriminator): handle populating document whose discriminator value is different from discriminator model name #8324 + * fix: add `mongoose.isValidObjectId()` function to test whether Mongoose can cast a value to an objectid #3823 + * fix(model): support setting `excludeIndexes` as schema option for subdocs #8343 + * fix: add SchemaMapOptions class for options to map schematype #8318 + * docs(query): remove duplicate omitUndefined options #8349 [mdumandag](https://github.com/mdumandag) + * docs(schema): add Schema#paths docs to public API docs #8340 + +5.7.11 / 2019-11-14 +=================== + * fix: update mongodb driver -> 3.3.4 #8276 + * fix(model): throw readable error when casting bulkWrite update without a 'filter' or 'update' #8332 #8331 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(connection): bubble up connected/disconnected events with unified topology #8338 #8337 + * fix(model): delete $versionError after saving #8326 #8048 [Fonger](https://github.com/Fonger) + * test(model): add test for issue #8040 #8341 [Fonger](https://github.com/Fonger) + +5.7.10 / 2019-11-11 +=================== + * perf(cursor): remove unnecessary `setTimeout()` in `eachAsync()`, 4x speedup in basic benchmarks #8310 + * docs(README): re-order sections for better readability #8321 [dandv](https://github.com/dandv) + * chore: make npm test not hard-code file paths #8322 [stieg](https://github.com/stieg) + +5.7.9 / 2019-11-08 +================== + * fix(schema): support setting schema path to an instance of SchemaTypeOptions to fix integration with mongoose-i18n-localize #8297 #8292 + * fix(populate): make `retainNullValues` set array element to `null` if foreign doc with that id was not found 
#8293 + * fix(document): support getter setting virtual on manually populated doc when calling toJSON() #8295 + * fix(model): allow objects with `toBSON()` to make it to `save()` #8299 + +5.7.8 / 2019-11-04 +================== + * fix(document): allow manually populating path within document array #8273 + * fix(populate): update top-level `populated()` when updating document array with populated subpaths #8265 + * fix(cursor): throw error when using aggregation cursor as async iterator #8280 + * fix(schema): retain `_id: false` in schema after nesting in another schema #8274 + * fix(document): make Document class an event emitter to support defining documents without models in node #8272 + * docs: document return types for `.discriminator()` #8287 + * docs(connection): add note about exporting schemas, not models, in multi connection paradigm #8275 + * docs: clarify that transforms defined in `toObject()` options are applied to subdocs #8260 + +5.7.7 / 2019-10-24 +================== + * fix(populate): make populate virtual consistently an empty array if local field is only empty arrays #8230 + * fix(query): allow findOne(objectid) and find(objectid) #8268 + +5.7.6 / 2019-10-21 +================== + * fix: upgrade mongodb driver -> 3.3.3 to fix issue with failing to connect to a replica set if one member is down #8209 + * fix(document): fix TypeError when setting a single nested subdoc with timestamps #8251 + * fix(cursor): fix issue with long-running `eachAsync()` cursor #8249 #8235 + * fix(connection): ensure repeated `close` events from useUnifiedTopology don't disconnect Mongoose from replica set #8224 + * fix(document): support calling `Document` constructor directly in Node.js #8237 + * fix(populate): add document array subpaths to parent doc `populated()` when calling `DocumentArray#push()` #8247 + * fix(options): add missing minlength and maxlength to SchemaStringOptions #8256 + * docs: add documentarraypath to API docs, including DocumentArrayPath#discriminator() #8164 + * docs(schematypes): add a section about the `type` property #8227 + * docs(api): fix Connection.close return param #8258 [gosuhiman](https://github.com/gosuhiman) + * docs: update link to broken image on home page #8253 [krosenk729](https://github.com/krosenk729) + +5.7.5 / 2019-10-14 +================== + * fix(query): delete top-level `_bsontype` property in queries to prevent silent empty queries #8222 + * fix(update): handle subdocument pre('validate') errors in update validation #7187 + * fix(subdocument): make subdocument#isModified use parent document's isModified #8223 + * docs(index): add favicon to home page #8226 + * docs: add schema options to API docs #8012 + * docs(middleware): add note about accessing the document being updated in pre('findOneAndUpdate') #8218 + * refactor: remove redundant code in ValidationError #8244 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + +5.7.4 / 2019-10-09 +================== + * fix(schema): handle `required: null` and `required: undefined` as `required: false` #8219 + * fix(update): support updating array embedded discriminator props if discriminator key in $elemMatch #8063 + * fix(populate): allow accessing populate virtual prop underneath array when virtual defined on top level #8198 + * fix(model): support passing `options` to `Model.remove()` #8211 + * fix(document): handle `Document#set()` merge option when setting underneath single nested schema #8201 + * fix: use options constructor class for all schematypes #8012 + +5.7.3 / 2019-09-30 
+================== + * fix: make CoreMongooseArray#includes() handle `fromIndex` parameter #8203 + * fix(update): cast right hand side of `$pull` as a query instead of an update for document arrays #8166 + * fix(populate): handle virtual populate of an embedded discriminator nested path #8173 + * docs(validation): remove deprecated `isAsync` from validation docs in favor of emphasizing promises #8184 + * docs(documents): add overwriting section #8178 + * docs(promises): add note about queries being thenable #8110 + * perf: avoid update validators going into Mixed types #8192 [birdofpreyru](https://github.com/birdofpreyru) + * refactor: remove async as a prod dependency #8073 + +5.7.2 / 2019-09-23 +================== + * fix(mongoose): support `mongoose.set('autoIndex', false)` #8158 + * fix(discriminator): support `tiedValue` parameter for embedded discriminators analagous to top-level discriminators #8164 + * fix(query): handle `toConstructor()` with entries-style sort syntax #8159 + * fix(populate): avoid converting mixed paths into arrays if populating an object path under `Mixed` #8157 + * fix: use $wrapCallback when using promises for mongoose-async-hooks + * fix: handle queries with setter that converts value to Number instance #8150 + * docs: add mongoosejs-cli to readme #8142 + * docs: fix example typo for Schema.prototype.plugin() #8175 [anaethoss](https://github.com/anaethoss) + +5.7.1 / 2019-09-13 +================== + * fix(query): fix TypeError when calling `findOneAndUpdate()` with `runValidators` #8151 [fernandolguevara](https://github.com/fernandolguevara) + * fix(document): throw strict mode error if setting an immutable path with strict mode: false #8149 + * fix(mongoose): support passing options object to Mongoose constructor #8144 + * fix(model): make syncIndexes() handle changes in index key order #8135 + * fix(error): export StrictModeError as a static property of MongooseError #8148 [ouyuran](https://github.com/ouyuran) + * docs(connection+mongoose): add `useUnifiedTopology` option to `connect()` and `openUri()` docs #8146 + +5.7.0 / 2019-09-09 +================== + * feat(document+query): support conditionally immutable schema paths #8001 + * perf(documentarray): refactor to use ES6 classes instead of mixins, ~30% speedup #7895 + * feat: use MongoDB driver 3.3.x for MongoDB 4.2 support #8083 #8078 + * feat(schema+query): add pre('validate') and post('validate') hooks for update validation #7984 + * fix(timestamps): ensure updatedAt gets incremented consistently using update with and without $set #4768 + * feat(query): add `Query#get()` to make writing custom setters that handle both queries and documents easier #7312 + * feat(document): run setters on defaults #8012 + * feat(document): add `aliases: false` option to `Document#toObject()` #7548 + * feat(timestamps): support skipping updatedAt and createdAt for individual save() and update() #3934 + * docs: fix index creation link in guide #8138 [joebowbeer](https://github.com/joebowbeer) + +5.6.13 / 2019-09-04 +=================== + * fix(parallel): fix parallelLimit when fns is empty #8130 #8128 [sibelius](https://github.com/sibelius) + * fix(document): ensure nested mixed validator gets called exactly once #8117 + * fix(populate): handle `justOne = undefined` #8125 [taxilian](https://github.com/taxilian) + +5.6.12 / 2019-09-03 +=================== + * fix(schema): handle required validator correctly with `clone()` #8111 + * fix(schema): copy schematype getters and setters when cloning #8124 
[StphnDamon](https://github.com/StphnDamon) + * fix(discriminator): avoid unnecessarily cloning schema to avoid leaking memory on repeated `discriminator()` calls #2874 + * docs(schematypes): clarify when Mongoose uses `toString()` to convert an object to a string #8112 [TheTrueRandom](https://github.com/TheTrueRandom) + * docs(plugins): fix out of date link to npm docs #8100 + * docs(deprecations): fix typo #8109 [jgcmarins](https://github.com/jgcmarins) + * refactor(model): remove dependency on `async.parallelLimit()` for `insertMany()` #8073 + +5.6.11 / 2019-08-25 +=================== + * fix(model): allow passing options to `exists()` #8075 + * fix(document): make `validateUpdatedOnly` option handle pre-existing errors #8091 + * fix: throw readable error if middleware callback isnt a function #8087 + * fix: don't throw error if calling `find()` on a nested array #8089 + * docs(middleware): clarify that you must add middleware before compiling your model #5087 + * docs(query): add missing options to `setOptions()` #8099 + +5.6.10 / 2019-08-20 +=================== + * fix(schema): fix require() path to work around yet another bug in Jest #8053 + * fix(document): skip casting when initing a populated path #8062 + * fix(document): prevent double-calling validators on mixed objects with nested properties #8067 + * fix(query): handle schematype with `null` options when checking immutability #8070 [rich-earth](https://github.com/rich-earth) + * fix(schema): support `Schema#path()` to get schema path underneath doc array #8057 + * docs(faq): add disable color instruction #8066 + +5.6.9 / 2019-08-07 +================== + * fix(model): delete versionError after saving to prevent memory leak #8048 + * fix(cursor): correctly handle batchSize option with query cursor #8039 + * fix(populate): handle virtual populate with count = 0 if virtual embedded in doc array #7573 + * fix(schema): allow declaring ObjectId array with `{ type: 'ObjectID' }`, last 'D' case insensitive #8034 + +5.6.8 / 2019-08-02 +================== + * fix(aggregate): allow modifying pipeline in pre('aggregate') hooks #8017 + * fix(query): make `findOneAndReplace()` work with `orFail()` #8030 + * fix(document): allow saving an unchanged document if required populated path is null #8018 + * fix(debug): support disabling colors in debug mode #8033 [Mangosteen-Yang](https://github.com/Mangosteen-Yang) + * docs: add async-await guide #8028 [Rossh87](https://github.com/Rossh87) + * docs(plugins): rewrite plugins docs to be more modern and not use strange `= exports` syntax #8026 + * docs(transactions): clarify relationship between `session` in docs and MongoDB driver ClientSession class, link to driver docs #8009 + +5.6.7 / 2019-07-26 +================== + * fix(document): support validators on nested arrays #7926 + * fix(timestamps): handle `timestamps: false` in child schema #8007 + * fix(query): consistently support `new` option to `findOneAndX()` as an alternative to `returnOriginal` #7846 + * fix(document): make `inspect()` never return `null`, because a document or nested path is never `== null` #7942 + * docs(query+lean): add links to mongoose-lean-virtuals, mongoose-lean-getters, mongoose-lean-defaults #5606 + * docs: add example for `Schema#pre(Array)` #8022 [Mangosteen-Yang](https://github.com/Mangosteen-Yang) + * docs(schematype): updated comment from Schema.path to proper s.path #8013 [chrisweilacker](https://github.com/chrisweilacker) + * chore: upgrade nyc #8015 [kolya182](https://github.com/kolya182) + +5.6.6 / 
2019-07-22 +================== + * fix(populate): handle refPath returning a virtual with `Query#populate()` #7341 + * fix(populate): handle `refPath` in discriminator when populating top-level model #5109 + * fix(mongoose): ensure destucturing and named imports work for Mongoose singleton methods like `set()`, etc. #6039 + * fix(query): add missing options for deleteOne and deleteMany in Query #8004 [Fonger](https://github.com/Fonger) + * fix(schema): make embedded discriminators `instanceof` their parent types #5005 + * fix(array): make `validators` a private property that doesn't show up in for/in #6572 + * docs(api): fix array API docs that vanished because of #7798 #7979 + * docs(discriminators+api): add single nested discriminator to discriminator docs and API docs #7983 + * docs(connection+mongoose): make option lists consistent between `mongoose.connect()`, `mongoose.createConnection()`, and `conn.openUri()` #7976 + * docs(validation): clarify resolve(false) vs reject() for promise-based async custom validators #7761 + * docs(guide): use correct `mongoose.set()` instead of `mongoose.use()` #7998 + * docs: add redis cache example #7997 [usama-asfar](https://github.com/usama-asfar) + +5.6.5 / 2019-07-17 +================== + * fix(document): handle setting non-schema path to ObjectId or Decimal128 if strict: false #7973 + * fix(connection): remove backwards-breaking multiple mongoose.connect() call for now #7977 + * fix(schema): print invalid value in error message when a schema path is set to undefined or null #7956 + * fix(model): throw readable error if calling `new Model.discriminator()` #7957 + * fix(mongoose): export `cast()` function #7975 [perfectstorm88](https://github.com/perfectstorm88) + * docs(model): fix link to Model.inspect() and add example #7990 + * docs: fix broken anchor links on validation tutorial #7966 + * docs(api): fix broken links to split API pages #7978 + * chore: create LICENSE.md #7989 [Fonger](https://github.com/Fonger) + +5.6.4 / 2019-07-08 +================== + * fix(schema): support pre(Array, Function) and post(Array, Function) #7803 + * fix(document): load docs with a `once` property successfully #7958 + * fix(queryhelpers): ensure parent `select` overwrites child path `select` if parent is nested #7945 + * fix(schema): make `clone()` correctly copy array embedded discriminators #7954 + * fix(update): fix error when update property gets casted to null #7949 + * fix(connection): bubble up attemptReconnect event for now #7872 + * docs(tutorials): add virtuals tutorial #7965 + * docs(connection): add section on connection handling #6997 + +5.6.3 / 2019-07-03 +================== + * fix(document): respect projection when running getters #7940 + * fix(model): call createCollection() in syncIndexes() to ensure the collection exists #7931 + * fix(document): consistently use post-order traversal for gathering subdocs for hooks #7929 + * fix(schema): ensure `Schema#pathType()` returns correct path type given non-existent positional path #7935 + * fix(ChangeStream): set `closed` if emitting close event #7930 + * fix(connection): bubble up 'attemptReconnect' event from MongoDB connection #7872 + * docs: fix broken .jade links on search page #7932 + * docs: correct link to `Query#select()` #7953 [rayhatfield](https://github.com/rayhatfield) + * docs(README): add list of related projects #7773 + +4.13.19 / 2019-07-02 +==================== + * fix(aggregate): make `setOptions()` work as advertised #7950 #6011 [cdimitroulas](https://github.com/cdimitroulas) + 
+5.6.2 / 2019-06-28 +================== + * fix(update): allow using `update()` with immutable `createdAt` #7917 + * fix(model): pass `doc` parameter to save() error handling middleware #7832 + * fix(mongoose): add applyPluginsToChildSchemas option to allow opting out of global plugins for child schemas #7916 + * docs(connection): document `useCache` option for `useDb()` #7923 + * docs: fix broken link in FAQ #7925 [christophergeiger3](https://github.com/christophergeiger3) + +5.6.1 / 2019-06-24 +================== + * fix(update): skip setting defaults for single embedded subdocs underneath maps #7909 + * fix(document): copy date objects correctly when strict = false #7907 + * feat(mongoose): throw an error if calling `mongoose.connect()` multiple times while connected #7905 [Fonger](https://github.com/Fonger) + * fix(document): copies virtuals from array subdocs when casting array of docs with same schema #7898 + * fix(schema): ensure clone() copies single embedded discriminators correctly #7894 + * fix(discriminator): merge instead of overwriting conflicting nested schemas in discriminator schema #7884 + * fix(populate): ignore nullish arguments when calling `populate()` #7913 [rayhatfield](https://github.com/rayhatfield) + * docs: add getters/setters tutorial #7919 + * docs: clean up error docs so they refer to `Error` rather than `MongooseError` #7867 + * docs: fix a couple broken links #7921 [kizmo04](https://github.com/kizmo04) + * refactor: remove unnecessary if #7911 [rayhatfield](https://github.com/rayhatfield) + +5.6.0 / 2019-06-14 +================== + * feat(schematype): add `immutable` option to disallow changing a given field #7671 + * docs: split API docs into separate pages to make API documentation more Google-able #7812 + * perf(array): remove all mixins in favor of ES6 classes, ~20% faster in basic benchmarks #7798 + * feat(document): use promise rejection error message when async custom validator throws an error #4913 + * feat(virtual): pass document as 3rd parameter to virtual getters and setters to enable using arrow functions #4143 + * feat(model): add `Model.exists()` function to quickly check whether a document matching `filter` exists #6872 + * feat(index+connection): support setting global and connection-level `maxTimeMS` + * feat(populate): support setting `ref` to a function for conventional populate #7669 + * feat(document): add overwrite() function that overwrites all values in a document #7830 + * feat(populate): support `PopulateOptions#connection` option to allow cross-db populate with refPath #6520 + * feat(populate): add skipInvalidIds option to silently skip population if id is invalid, instead of throwing #7706 + * feat(array): skip empty array default if there's a 2dsphere index on a geojson path #3233 + * feat(query): add `getFilter()` as an alias of `getQuery()` to be more in line with API docs #7839 + * feat(model): add Model.inspect() to make models not clutter `util.inspect()` #7836 + * perf(discriminator): skip calling `createIndex()` on indexes that are defined in the base schema #7379 + * docs: upgrade from Jade to latest Pug #7812 + * docs(README): update reference to example schema.js #7899 [sharils](https://github.com/sharils) + * docs(README): improve variable name #7900 [sharils](https://github.com/sharils) + * chore: replace charAt(0) with startsWith #7897 [Fonger](https://github.com/Fonger) + * chore: replace indexOf with includes, startsWith and endsWith for String #7897 [Fonger](https://github.com/Fonger) + +5.5.15 / 2019-06-12 
+=================== + * fix(connection): reject initial connect promise even if there is an on('error') listener #7850 + * fix(map): make `of` automatically convert POJOs to schemas unless typeKey is set #7859 + * fix(update): use discriminator schema to cast update if discriminator key specified in filter #7843 + * fix(array): copy atomics from source array #7891 #7889 [jyrkive](https://github.com/jyrkive) + * fix(schema): return this when Schema.prototype.add is called with Schema #7887 [Mickael-van-der-Beek](https://github.com/Mickael-van-der-Beek) + * fix(document): add `numAffected` and `result` to DocumentNotFoundError for better debugging #7892 #7844 + +5.5.14 / 2019-06-08 +=================== + * fix(query): correct this scope of setters in update query #7876 [Fonger](https://github.com/Fonger) + * fix(model): reset modifiedPaths after successful insertMany #7852 #7873 [Fonger](https://github.com/Fonger) + * fix(populate): allow using `refPath` with virtual populate #7848 + * fix(document): prepend private methods getValue and setValue with $ #7870 [Fonger](https://github.com/Fonger) + * fix: update mongodb driver -> 3.2.7 #7871 [Fonger](https://github.com/Fonger) + * docs(tutorials): add tutorial about custom casting functions #7045 + * docs(connection): fix outdated events document #7874 [Fonger](https://github.com/Fonger) + * docs: fix typo in lean docs #7875 [tannakartikey](https://github.com/tannakartikey) + * docs: move off of KeenIO for tracking and use self-hosted analytics instead + +5.5.13 / 2019-06-05 +=================== + * fix(model): support passing deleteOne options #7860 #7857 [Fonger](https://github.com/Fonger) + * fix(update): run setters on array elements when doing $addToSet, $push, etc #4185 + * fix(model): support getting discriminator by value when creating a new model #7851 + * docs(transactions): add section about the `withTransaction()` helper #7598 + * docs(schema): clarify relationship between Schema#static() and Schema#statics #7827 + * docs(model): fix typo `projetion` to `projection` #7868 [dfdeagle47](https://github.com/dfdeagle47) + * docs(schema): correct schema options lists #7828 + +5.5.12 / 2019-05-31 +=================== + * fix(document): fix unexpected error when loading a document with a nested property named `schema` #7831 + * fix(model): skip applying static hooks by default if static name conflicts with query middleware (re: mongoose-delete plugin) #7790 + * fix(query): apply schema-level projections to the result of `findOneAndReplace()` #7654 + * fix: upgrade mongodb driver -> 3.2.6 + * docs(tutorials): add findOneAndUpdate() tutorial #7847 + * docs(validation): add `updateOne()` and `updateMany()` to list of update validator operations #7845 + * docs(model): make sure options lists in `update()` API line up #7842 + +5.5.11 / 2019-05-23 +=================== + * fix(discriminator): allow numeric discriminator keys for embedded discriminators #7808 + * chore: add Node.js 12 to travis build matrix #7784 + +5.5.10 / 2019-05-20 +=================== + * fix(discriminator): allow user-defined discriminator path in schema #7807 + * fix(query): ensure `findOneAndReplace()` sends `replacement` to server #7654 + * fix(cast): allow `[]` as a value when casting `$nin` #7806 + * docs(model): clarify that setters do run on `update()` by default #7801 + * docs: fix typo in FAQ #7821 [jaona](https://github.com/jaona) + +5.5.9 / 2019-05-16 +================== + * fix(query): skip schema setters when casting $regexp $options #7802 
[Fonger](https://github.com/Fonger) + * fix(populate): don't skip populating doc array properties whose name conflicts with an array method #7782 + * fix(populate): make populated virtual return undefined if not populated #7795 + * fix(schema): handle custom setters in arrays of document arrays #7804 [Fonger](https://github.com/Fonger) + * docs(tutorials): add query casting tutorial #7789 + +5.5.8 / 2019-05-13 +================== + * fix(document): run pre save hooks on nested child schemas #7792 + * fix(model): set $session() before validation middleware for bulkWrite/insertMany #7785 #7769 [Fonger](https://github.com/Fonger) + * fix(query): make `getPopulatedPaths()` return deeply populated paths #7757 + * fix(query): suppress findAndModify deprecation warning when using `Model.findOneAndUpdate()` #7794 + * fix: upgrade mongodb -> 3.2.4 #7794 + * fix(document): handle a couple edge cases with atomics that happen when schema defines an array property named 'undefined' #7756 + * docs(discriminator): correct function parameters #7786 [gfpacheco](https://github.com/gfpacheco) + +5.5.7 / 2019-05-09 +================== + * fix(model): set $session() before pre save middleware runs when calling save() with options #7742 + * fix(model): set $session before pre remove hooks run when calling remove() with options #7742 + * fix(schema): support `remove()` on nested path #2398 + * fix(map): handle setting populated map element to doc #7745 + * fix(query): return rawResult when inserting with options `{new:false,upsert:true,rawResult:true}` #7774 #7770 [LiaanM](https://github.com/LiaanM) + * fix(schematype): remove internal `validators` option because it conflicts with Backbone #7720 + +5.5.6 / 2019-05-06 +================== + * fix(document): stop converting arrays to objects when setting non-schema path to array with strict: false #7733 + * fix(array): make two Mongoose arrays `assert.deepEqual()` each other if they have the same values #7700 + * fix(populate): support populating a path in a document array embedded in an array #7647 + * fix(populate): set populate virtual count to 0 if local field is empty #7731 + * fix(update): avoid throwing cast error if casting array filter that isn't in schema with strictQuery = false #7728 + * docs: fix typo in `distinct()` description #7767 [phil-r](https://github.com/phil-r) + +5.5.5 / 2019-04-30 +================== + * fix(document): ensure nested properties within single nested subdocs get set correctly when overwriting single nested subdoc #7748 + * fix(document): skip non-object `validators` in schema types #7720 + * fix: bump mongodb driver -> 3.2.3 #7752 + * fix(map): disallow setting map key with special properties #7750 [Fonger](https://github.com/Fonger) + +5.5.4 / 2019-04-25 +================== + * fix(document): avoid calling custom getters when saving #7719 + * fix(timestamps): handle child schema timestamps correctly when reusing child schemas #7712 + * fix(query): pass correct callback for _legacyFindAndModify #7736 [Fonger](https://github.com/Fonger) + * fix(model+query): allow setting `replacement` parameter for `findOneAndReplace()` #7654 + * fix(map): make `delete()` unset the key in the database #7746 [Fonger](https://github.com/Fonger) + * fix(array): use symbol for `_schema` property to avoid confusing deep equality checks #7700 + * fix(document): prevent `depopulate()` from removing fields with empty array #7741 #7740 [Fonger](https://github.com/Fonger) + * fix: make `MongooseArray#includes` support ObjectIds #7732 #6354 
[hansemannn](https://github.com/hansemannn) + * fix(document): report correct validation error index when pushing onto doc array #7744 [Fonger](https://github.com/Fonger) + +5.5.3 / 2019-04-22 +================== + * fix: add findAndModify deprecation warning that references the useFindAndModify option #7644 + * fix(document): handle pushing a doc onto a discriminator that contains a doc array #7704 + * fix(update): run setters on array elements when doing $set #7679 + * fix: correct usage of arguments while buffering commands #7718 [rzymek](https://github.com/rzymek) + * fix(document): avoid error clearing modified subpaths if doc not defined #7715 [bitflower](https://github.com/bitflower) + * refactor(array): move `_parent` property behind a symbol #7726 #7700 + * docs(model): list out all operations and options for `bulkWrite()` #7055 + * docs(aggregate): use `eachAsync()` instead of nonexistent `each()` #7699 + * docs(validation): add CastError validation example #7514 + * docs(query+model): list out all options and callback details for Model.updateX() and Query#updateX() #7646 + +5.5.2 / 2019-04-16 +================== + * fix(document): support setting nested path to non-POJO object #7639 + * perf(connection): remove leaked event handler in `Model.init()` so `deleteModel()` frees all memory #7682 + * fix(timestamps): handle custom statics that conflict with built-in functions (like mongoose-delete plugin) #7698 + * fix(populate): make `Document#populated()` work for populated subdocs #7685 + * fix(document): support `.set()` on document array underneath embedded discriminator path #7656 + +5.5.1 / 2019-04-11 +================== + * fix(document): correctly overwrite all properties when setting a single nested subdoc #7660 #7681 + * fix(array): allow customization of array required validator #7696 [freewil](https://github.com/freewil) + * fix(discriminator): handle embedded discriminators when casting array defaults #7687 + * fix(collection): ensure collection functions return a promise even if disconnected #7676 + * fix(schematype): avoid indexing properties with `{ unique: false, index: false }` #7620 + * fix(aggregate): make `Aggregate#model()` with no arguments return the aggregation's model #7608 + +5.5.0 / 2019-04-08 +================== + * feat(model): support applying hooks to custom static functions #5982 + * feat(populate): support specifying a function as `match` #7397 + * perf(buffer): avoid calling `defineProperties()` in Buffer constructor #7331 + * feat(connection): add `plugin()` for connection-scoped plugins #7378 + * feat(model): add Model#deleteOne() and corresponding hooks #7538 + * feat(query): support hooks for `Query#distinct()` #5938 + * feat(model): print warning when calling create() incorrectly with a session #7535 + * feat(document): add Document#isEmpty() and corresponding helpers for nested paths #5369 + * feat(document): add `getters` option to Document#get() #7233 + * feat(query): add Query#projection() to get or overwrite the current projection #7384 + * fix(document): set full validator path on validatorProperties if `propsParameter` set on validator #7447 + * feat(document): add Document#directModifiedPaths() #7373 + * feat(document): add $locals property #7691 + * feat(document): add validateUpdatedOnly option that only validates modified paths in `save()` #7492 [captaincaius](https://github.com/captaincaius) + * chore: upgrade MongoDB driver to v3.2.0 #7641 + * fix(schematype): deprecate `isAsync` option for custom validators #6700 + * 
chore(mongoose): deprecate global.MONGOOSE_DRIVER_PATH so we can be webpack-warning-free in 6.0 #7501 + +5.4.23 / 2019-04-08 +=================== + * fix(document): report cast error when string path in schema is an array in MongoDB #7619 + * fix(query): set deletedCount on result of remove() #7629 + * docs(subdocs): add note about parent() and ownerDocument() to subdocument docs #7576 + +5.4.22 / 2019-04-04 +=================== + * fix(aggregate): allow modifying options in pre('aggregate') hook #7606 + * fix(map): correctly init maps of maps when loading from MongoDB #7630 + * docs(model+query): add `omitUndefined` option to docs for updateX() and findOneAndX() #3486 + * docs: removed duplicate Query.prototype.merge() reference from doc #7684 [shihabmridha](https://github.com/shihabmridha) + * docs(schema): fix shardKey type to object instead of bool #7668 [kyletsang](https://github.com/kyletsang) + * docs(api): fix `Model.prototypedelete` link #7665 [pixcai](https://github.com/pixcai) + +5.4.21 / 2019-04-02 +=================== + * fix(updateValidators): run update validators correctly on Decimal128 paths #7561 + * fix(update): cast array filters in nested doc arrays correctly #7603 + * fix(document): allow .get() + .set() with aliased paths #7592 + * fix(document): ensure custom getters on single nested subdocs don't get persisted if toObject.getters = true #7601 + * fix(document): support setting subdoc path to subdoc copied using object rest `{...doc}` #7645 + * docs(schema): correct out-of-date list of reserved words #7593 + * docs(model+query): add link to update results docs and examples of using results of updateOne(), etc. #7582 + * docs: use atomic as opposed to $atomic consistently #7649 [720degreeLotus](https://github.com/720degreeLotus) + +5.4.20 / 2019-03-25 +=================== + * docs(tutorials): add tutorial about `lean()` #7640 + * fix(discriminator): fix wrong modelName being used as value to partialFilterExpression index #7635 #7634 [egorovli](https://github.com/egorovli) + * fix(document): allow setters to modify `this` when overwriting single nested subdoc #7585 + * fix(populate): handle count option correctly with multiple docs #7573 + * fix(date): support declaring min/max validators as functions #7600 [ChienDevIT](https://github.com/ChienDevIT) + * fix(discriminator): avoid projecting in embedded discriminator if only auto-selected path is discriminator key #7574 + * fix(discriminator): use discriminator model when using `new BaseModel()` with discriminator key #7586 + * fix(timestamps): avoid throwing if doc array has timestamps and array is undefined #7625 [serg33v](https://github.com/serg33v) + * docs(document): explain DocumentNotFoundError in save() docs #7580 + * docs(query): fix .all() param type and add example #7612 [720degreeLotus](https://github.com/720degreeLotus) + * docs: add useNewUrlParser to mongoose.connect for some pages #7615 [YC](https://github.com/YC) + +5.4.19 / 2019-03-11 +=================== + * fix(mongoose): ensure virtuals set on subdocs in global plugins get applied #7572 + * docs(tutorials): add "Working With Dates" tutorial #7597 + * docs(guide): clarify that versioning only affects array fields #7555 + * docs(model): list out all bulkWrite() options #7550 + +5.4.18 / 2019-03-08 +=================== + * fix(document): handle nested virtuals in populated docs when parent path is projected out #7491 + * fix(model): make subclassed models handle discriminators correctly #7547 + * fix(model): remove $versionError from save options for 
better debug output #7570 + +5.4.17 / 2019-03-03 +=================== + * fix(update): handle all positional operator when casting array filters #7540 + * fix(populate): handle populating nested path where top-level path is a primitive in the db #7545 + * fix(update): run update validators on array filters #7536 + * fix(document): clean modified subpaths when sorting an array #7556 + * fix(model): cast $setOnInsert correctly with nested docs #7534 + * docs: remove extra curly brace from example #7569 [kolya182](https://github.com/kolya182) + +5.4.16 / 2019-02-26 +=================== + * fix(schema): handle nested objects with `_id: false` #7524 + * fix(schema): don't throw error if declaring a virtual that starts with a map path name #7464 + * fix(browser): add stubbed `model()` function so code that uses model doesn't throw #7541 [caub](https://github.com/caub) + * fix(schema): merge virtuals correctly #7563 [yoursdearboy](https://github.com/yoursdearboy) + * docs(connections): add reconnectFailed to connection docs #7477 + * docs(index): fix typo #7553 [DenrizSusam](https://github.com/DenrizSusam) + * refactor(schema): iterate over paths instead of depending on childSchemas #7554 + +5.4.15 / 2019-02-22 +=================== + * fix(update): don't call schematype validators on array if using $pull with runValidators #6971 + * fix(schema): clone all schema types when cloning an array #7537 + * docs(connections): improve connectTimeoutMS docs and socketTimeoutMS docs to link to Node.js net.setTimeout() #5169 + * docs: fix setters example in migration guide #7546 [freewil](https://github.com/freewil) + +5.4.14 / 2019-02-19 +=================== + * fix(populate): make `getters` option handle nested paths #7521 + * fix(documentarray): report validation errors that occur in an array subdoc created using `create()` and then `set()` #7504 + * docs(schema): add examples for schema functions that didn't have any #7525 + * docs: add MongooseError to API docs and add list of error names + * docs(CONTRIBUTING): fix link #7530 [sarpik](https://github.com/sarpik) + +5.4.13 / 2019-02-15 +=================== + * fix(query): throw handy error when using updateOne() with overwrite: true and no dollar keys #7475 + * fix(schema): support inheriting existing schema types using Node.js `util.inherits` like mongoose-float #7486 + * docs(connections): add list of connection events #7477 + +5.4.12 / 2019-02-13 +=================== + * fix(connection): dont emit reconnected due to socketTimeoutMS #7452 + * fix(schema): revert check for `false` schema paths #7516 #7512 + * fix(model): don't delete unaliased keys in translateAliases #7510 [chrischen](https://github.com/chrischen) + * fix(document): run single nested schematype validator if nested path has a default and subpath is modified #7493 + * fix(query): copy mongoose options when using `Query#merge()` #1790 + * fix(timestamps): don't call createdAt getters when setting updatedAt on new doc #7496 + * docs: improve description of ValidationError #7515 [JulioJu](https://github.com/JulioJu) + * docs: add an asterisk before comment, otherwise the comment line is not generated #7513 [JulioJu](https://github.com/JulioJu) + +5.4.11 / 2019-02-09 +=================== + * fix(schema): handle `_id: false` in schema paths as a shortcut for setting the `_id` option to `false` #7480 + * fix(update): handle $addToSet and $push with ObjectIds and castNonArrays=false #7479 + * docs(model): document `session` option to `save()` #7484 + * chore: fix gitignore syntax #7498 
[JulioJu](https://github.com/JulioJu) + * docs: document that Document#validateSync returns ValidationError #7499 + * refactor: use consolidated `isPOJO()` function instead of constructor checks #7500 + +5.4.10 / 2019-02-05 +=================== + * docs: add search bar and /search page #6706 + * fix: support dotted aliases #7478 [chrischen](https://github.com/chrischen) + * fix(document): copy atomics when setting document array to an existing document array #7472 + * chore: upgrade to mongodb driver 3.1.13 #7488 + * docs: remove confusing references to executing a query "immediately" #7461 + * docs(guides+schematypes): link to custom schematypes docs #7407 + +5.4.9 / 2019-02-01 +================== + * fix(document): make `remove()`, `updateOne()`, and `update()` use the document's associated session #7455 + * fix(document): support passing args to hooked custom methods #7456 + * fix(document): avoid double calling single nested getters on `toObject()` #7442 + * fix(discriminator): handle global plugins modifying top-level discriminator options with applyPluginsToDiscriminators: true #7458 + * docs(documents): improve explanation of documents and use more modern syntax #7463 + * docs(middleware+api): fix a couple typos in examples #7474 [arniu](https://github.com/arniu) + +5.4.8 / 2019-01-30 +================== + * fix(query): fix unhandled error when casting object in array filters #7431 + * fix(query): cast query $elemMatch to discriminator schema if discriminator key set #7449 + * docs: add table of contents to all guides #7430 + +5.4.7 / 2019-01-26 +================== + * fix(populate): set `populated()` when using virtual populate #7440 + * fix(discriminator): defer applying plugins to embedded discriminators until model compilation so global plugins work #7435 + * fix(schema): report correct pathtype underneath map so setting dotted paths underneath maps works #7448 + * fix: get debug from options using the get helper #7451 #7446 [LucGranato](https://github.com/LucGranato) + * fix: use correct variable name #7443 [esben-semmle](https://github.com/esben-semmle) + * docs: fix broken QueryCursor link #7438 [shihabmridha](https://github.com/shihabmridha) + +5.4.6 / 2019-01-22 +================== + * fix(utils): make minimize leave empty objects in arrays instead of setting the array element to undefined #7322 + * fix(document): support passing `{document, query}` options to Schema#pre(regex) and Schema#post(regex) #7423 + * docs: add migrating to 5 guide to docs #7434 + * docs(deprecations): add instructions for fixing `count()` deprecation #7419 + * docs(middleware): add description and example for aggregate hooks #7402 + +4.13.18 / 2019-01-21 +==================== + * fix(model): handle setting populated path set via `Document#populate()` #7302 + * fix(cast): backport fix from #7290 to 4.x + +5.4.5 / 2019-01-18 +================== + * fix(populate): handle nested array `foreignField` with virtual populate #7374 + * fix(query): support not passing any arguments to `orFail()` #7409 + * docs(query): document what the resolved value for `deleteOne()`, `deleteMany()`, and `remove()` contains #7324 + * fix(array): allow opting out of converting non-arrays into arrays with `castNonArrays` option #7371 + * fix(query): ensure updateOne() doesnt unintentionally double call Schema#post(regexp) #7418 + +5.4.4 / 2019-01-14 +================== + * fix(query): run casting on arrayFilters option #7079 + * fix(document): support skipping timestamps on save() with `save({ timestamps: false })` #7357 + * 
fix(model): apply custom where on `Document#remove()` so we attach the shardKey #7393 + * docs(mongoose): document `mongoose.connections` #7338 + +5.4.3 / 2019-01-09 +================== + * fix(populate): handle `count` option when using `Document#populate()` on a virtual #7380 + * fix(connection): set connection state to DISCONNECTED if replica set has no primary #7330 + * fix(mongoose): apply global plugins to schemas nested underneath embedded discriminators #7370 + * fix(document): make modifiedPaths() return nested paths 1 level down on initial set #7313 + * fix(plugins): ensure sharding plugin works even if ObjectId has a `valueOf()` #7353 + +5.4.2 / 2019-01-03 +================== + * fix(document): ensure Document#updateOne() returns a query but still calls hooks #7366 + * fix(query): allow explicitly projecting out populated paths that are automatically projected in #7383 + * fix(document): support setting `flattenMaps` option for `toObject()` and `toJSON()` at schema level #7274 + * fix(query): handle merging objectids with `.where()` #7360 + * fix(schema): copy `.base` when cloning #7377 + * docs: remove links to plugins.mongoosejs.com in favor of plugins.mongoosejs.io #7364 + +5.4.1 / 2018-12-26 +================== + * fix(document): ensure doc array defaults get casted #7337 + * fix(document): make `save()` not crash if nested doc has a property 'get' #7316 + * fix(schema): allow using Schema.Types.Map as well as Map to declare a map type #7305 + * fix(map): make set after init mark correct path as modified #7321 + * fix(mongoose): don't recompile model if same collection and schema passed in to `mongoose.model()` #5767 + * fix(schema): improve error message when type is invalid #7303 + * fix(schema): add `populated` to reserved property names #7317 + * fix(model): don't run built-in middleware on custom methods and ensure timestamp hooks don't run if children don't have timestamps set #7342 + * docs(schematypes): clarify that you can add arbitrary options to a SchemaType #7340 + * docs(mongoose): clarify that passing same name+schema to `mongoose.model()` returns the model #5767 + * docs(index): add useNewUrlParser to example #7368 [JIBIN-P](https://github.com/JIBIN-P) + * docs(connection): add useNewUrlParser to examples #7362 [JIBIN-P](https://github.com/JIBIN-P) + * docs(discriminators): add back missing example from 'recursive embedded discriminators section' #7349 + * docs(schema): improve docs for string and boolean cast() #7351 + +5.4.0 / 2018-12-14 +================== + * feat(schematype): add `SchemaType.get()`, custom getters across all instances of a schematype #6912 + * feat(schematype): add `SchemaType.cast()`, configure casting for individual schematypes #7045 + * feat(schematype): add `SchemaType.checkRequired()`, configure what values pass `required` check for a schematype #7186 #7150 + * feat(model): add `Model.findOneAndReplace()` #7162 + * feat(model): add `Model.events` emitter that emits all `error`'s that occur with a given model #7125 + * feat(populate): add `count` option to populate virtuals, support returning # of populated docs instead of docs themselves #4469 + * feat(aggregate): add `.catch()` helper to make aggregations full thenables #7267 + * feat(query): add hooks for `deleteOne()` and `deleteMany()` #7195 + * feat(document): add hooks for `updateOne()` #7133 + * feat(query): add `Query#map()` for synchronously transforming results before post middleware runs #7142 + * feat(schema): support passing an array of objects or schemas to `Schema` 
constructor #7218 + * feat(populate): add `clone` option to ensure multiple docs don't share the same populated doc #3258 + * feat(query): add `Query#maxTimeMS()` helper #7254 + * fix(query): deprecate broken `Aggregate#addCursorFlag()` #7120 + * docs(populate): fix incorrect example #7335 [zcfan](https://github.com/zcfan) + * docs(middleware): add `findOneAndDelete` to middleware list #7327 [danielkesselberg](https://github.com/danielkesselberg) + +5.3.16 / 2018-12-11 +=================== + * fix(document): handle `__proto__` in queries #7290 + * fix(document): use Array.isArray() instead of checking constructor name for arrays #7290 + * docs(populate): add section about what happens when no document matches #7279 + * fix(mongoose): avoid crash on `import mongoose, {Schema} from 'mongoose'` #5648 + +5.3.15 / 2018-12-05 +=================== + * fix(query): handle `orFail()` with `findOneAndUpdate()` and `findOneAndDelete()` #7297 #7280 + * fix(document): make `save()` succeed if strict: false with a `collection` property #7276 + * fix(document): add `flattenMaps` option for toObject() #7274 + * docs(document): document flattenMaps option #7274 + * fix(populate): support populating individual subdoc path in document array #7273 + * fix(populate): ensure `model` option overrides `refPath` #7273 + * fix(map): don't call subdoc setters on init #7272 + * fix(document): use internal get() helper instead of lodash.get to support `null` projection param #7271 + * fix(document): continue running validateSync() for all elements in doc array after first error #6746 + +5.3.14 / 2018-11-27 +=================== + * docs(api): use `openUri()` instead of legacy `open()` #7277 [artemjackson](https://github.com/artemjackson) + * fix(document): don't mark date underneath single nested as modified if setting to string #7264 + * fix(update): set timestamps on subdocs if not using $set with no overwrite #7261 + * fix(document): use symbol instead of `__parent` so user code doesn't conflict #7230 + * fix(mongoose): allow using `mongoose.model()` without context, like `import {model} from 'mongoose'` #3768 + +5.3.13 / 2018-11-20 +=================== + * fix: bump mongodb driver -> 3.1.10 #7266 + * fix(populate): support setting a model as a `ref` #7253 + * docs(schematype): add ref() function to document what is a valid `ref` path in a schematype #7253 + * fix(array): clean modified subpaths when calling `splice()` #7249 + * docs(compatibility): don't show Mongoose 4.11 as compatible with MongoDB 3.6 re: MongoDB's official compatibility table #7248 [a-harrison](https://github.com/a-harrison) + * fix(document): report correct validation error if doc array set to primitive #7242 + * fix(mongoose): print warning when including server-side lib with jest jsdom environment #7240 + +5.3.12 / 2018-11-13 +=================== + * docs(compatibility): don't show Mongoose 4.11 as compatible with MongoDB 3.6 re: MongoDB's official compatibility table #7238 [a-harrison](https://github.com/a-harrison) + * fix(populate): use `instanceof` rather than class name for comparison #7237 [ivanseidel](https://github.com/ivanseidel) + * docs(api): make options show up as a nested list #7232 + * fix(document): don't mark array as modified on init if doc array has default #7227 + * docs(api): link to bulk write result object in `bulkWrite()` docs #7225 + +5.3.11 / 2018-11-09 +=================== + * fix(model): make parent pointers non-enumerable so they don't crash JSON.stringify() #7220 + * fix(document): allow saving docs with nested 
props with '.' using `checkKeys: false` #7144 + * docs(lambda): use async/await with lambda example #7019 + +5.3.10 / 2018-11-06 +=================== + * fix(discriminator): support reusing a schema for multiple discriminators #7200 + * fix(cursor): handle `lean(false)` correctly with query cursors #7197 + * fix(document): avoid manual populate if `ref` not set #7193 + * fix(schema): handle schema without `.base` for browser build #7170 + * docs: add further reading section + +5.3.9 / 2018-11-02 +================== + * fix: upgrade bson dep -> 1.1.0 to match mongodb-core #7213 [NewEraCracker](https://github.com/NewEraCracker) + * docs(api): fix broken anchor link #7210 [gfranco93](https://github.com/gfranco93) + * fix: don't set parent timestamps because a child has timestamps set to false #7203 #7202 [lineus](https://github.com/lineus) + * fix(document): run setter only once when doing `.set()` underneath a single nested subdoc #7196 + * fix(document): surface errors in subdoc pre validate #7187 + * fix(query): run default functions after hydrating the loaded document #7182 + * fix(query): handle strictQuery: 'throw' with nested path correctly #7178 + * fix(update): update timestamps on replaceOne() #7152 + * docs(transactions): add example of aborting a transaction #7113 + +5.3.8 / 2018-10-30 +================== + * fix: bump mongodb driver -> 3.1.8 to fix connecting to +srv uri with no credentials #7191 #6881 [lineus](https://github.com/lineus) + * fix(document): sets defaults correctly in child docs with projection #7159 + * fix(mongoose): handle setting custom type on a separate mongoose global #7158 + * fix: add unnecessary files to npmignore #7157 + * fix(model): set session when creating new subdoc #7104 + +5.3.7 / 2018-10-26 +================== + * fix(browser): fix buffer usage in browser build #7184 #7173 [lineus](https://github.com/lineus) + * fix(document): make depopulate() work on populate virtuals and unpopulated docs #7180 #6075 [lineus](https://github.com/lineus) + * fix(document): only pass properties as 2nd arg to custom validator if `propsParameter` set #7145 + * docs(schematypes): add note about nested paths with `type` getting converted to mixed #7143 + * fix(update): run update validators on nested doc when $set on an array #7135 + * fix(update): copy exact errors from array subdoc validation into top-level update validator error #7135 + +5.3.6 / 2018-10-23 +================== + * fix(cursor): fix undefined transforms error + +5.3.5 / 2018-10-22 +================== + * fix(model): make sure versionKey on `replaceOne()` always gets set at top level to prevent cast errors #7138 + * fix(cursor): handle non-boolean lean option in `eachAsync()` #7137 + * fix(update): correct cast update that overwrites a map #7111 + * fix(schema): handle arrays of mixed correctly #7109 + * fix(query): use correct path when getting schema for child timestamp update #7106 + * fix(document): make `$session()` propagate sessions to child docs #7104 + * fix(document): handle user setting `schema.options.strict = 'throw'` #7103 + * fix(types): use core Node.js buffer prototype instead of safe-buffer because safe-buffer is broken for Node.js 4.x #7102 + * fix(document): handle setting single doc with refPath to document #7070 + * fix(model): handle array filters when updating timestamps for subdocs #7032 + +5.3.4 / 2018-10-15 +================== + * fix(schema): make `add()` and `remove()` return the schema instance #7131 [lineus](https://github.com/lineus) + * fix(query): don't require passing 
model to `cast()` #7118 + * fix: support `useFindAndModify` as a connection-level option #7110 [lineus](https://github.com/lineus) + * fix(populate): handle plus path projection with virtual populate #7050 + * fix(schema): allow using ObjectId type as schema path type #7049 + * docs(schematypes): elaborate on how schematypes relate to types #7049 + * docs(deprecations): add note about gridstore deprecation #6922 + * docs(guide): add storeSubdocValidationError option to guide #6802 + +5.3.3 / 2018-10-12 +================== + * fix(document): enable storing mongoose validation error in MongoDB by removing `$isValidatorError` property #7127 + * docs(api): clarify that aggregate triggers aggregate middleware #7126 [lineus](https://github.com/lineus) + * fix(connection): handle Model.init() when an index exists on schema + autoCreate == true #7122 [jesstelford](https://github.com/jesstelford) + * docs(middleware): explain how to switch between document and query hooks for `remove()` #7093 + * docs(api): clean up encoding issues in SchemaType.prototype.validate docs #7091 + * docs(schema): add schema types to api docs and update links on schematypes page #7080 #7076 [lineus](https://github.com/lineus) + * docs(model): expand model constructor docs with examples and `fields` param #7077 + * docs(aggregate): remove incorrect description of noCursorTimeout and add description of aggregate options #7056 + * docs: re-add array type to API docs #7027 + * docs(connections): add note about `members.host` errors due to bad host names in replica set #7006 + +5.3.2 / 2018-10-07 +================== + * fix(query): make sure to return correct result from `orFail()` #7101 #7099 [gsandorx](https://github.com/gsandorx) + * fix(schema): handle `{ timestamps: false }` correctly #7088 #7074 [lineus](https://github.com/lineus) + * docs: fix markdown in options.useCreateIndex documentation #7085 [Cyral](https://github.com/Cyral) + * docs(schema): correct field name in timestamps example #7082 [kizmo04](https://github.com/kizmo04) + * docs(migrating_to_5): correct markdown syntax #7078 [gwuah](https://github.com/gwuah) + * fix(connection): add useFindAndModify option in connect #7059 [NormanPerrin](https://github.com/NormanPerrin) + * fix(document): dont mark single nested path as modified if setting to the same value #7048 + * fix(populate): use WeakMap to track lean populate models rather than leanPopulateSymbol #7026 + * fix(mongoose): avoid unhandled rejection when `mongoose.connect()` errors with a callback #6997 + * fix(mongoose): isolate Schema.Types between custom Mongoose instances #6933 + +5.3.1 / 2018-10-02 +================== + * fix(ChangeStream): expose driver's `close()` function #7068 #7022 [lineus](https://github.com/lineus) + * fix(model): avoid printing warning if `_id` index is set to "hashed" #7053 + * fix(populate): handle nested populate underneath lean array correctly #7052 + * fix(update): make timestamps not crash on a null or undefined update #7041 + * docs(schematypes+validation): clean up links from validation docs to schematypes docs #7040 + * fix(model): apply timestamps to nested docs in bulkWrite() #7032 + * docs(populate): rewrite refPath docs to be simpler and more direct #7013 + * docs(faq): explain why destructuring imports are not supported in FAQ #7009 + +5.3.0 / 2018-09-28 +================== + * feat(mongoose): support `mongoose.set('debug', WritableStream)` so you can pipe debug to stderr, file, or network #7018 + * feat(query): add useNestedStrict option #6973 #5144 
[lineus](https://github.com/lineus) + * feat(query): add getPopulatedPaths helper to Query.prototype #6970 #6677 [lineus](https://github.com/lineus) + * feat(model): add `createCollection()` helper to make transactions easier #6948 #6711 [Fonger](https://github.com/Fonger) + * feat(schema): add ability to do `schema.add(otherSchema)` to merge hooks, virtuals, etc. #6897 + * feat(query): add `orFail()` helper that throws an error if no documents match `filter` #6841 + * feat(mongoose): support global toObject and toJSON #6815 + * feat(connection): add deleteModel() to remove a model from a connection #6813 + * feat(mongoose): add top-level mongoose.ObjectId, mongoose.Decimal128 for easier schema declarations #6760 + * feat(aggregate+query): support for/await/of (async iterators) #6737 + * feat(mongoose): add global `now()` function that you can stub for testing timestamps #6728 + * feat(schema): support `schema.pre(RegExp, fn)` and `schema.post(RegExp, fn)` #6680 + * docs(query): add better docs for the `mongooseOptions()` function #6677 + * feat(mongoose): add support for global strict object #6858 + * feat(schema+mongoose): add autoCreate option to automatically create collections #6489 + * feat(update): update timestamps on nested subdocs when using `$set` #4412 + * feat(query+schema): add query `remove` hook and ability to switch between query `remove` and document `remove` middleware #3054 + +5.2.18 / 2018-09-27 +=================== + * docs(migrating_to_5): add note about overwriting filter properties #7030 + * fix(query): correctly handle `select('+c')` if c is not in schema #7017 + * fix(document): check path exists before checking for required #6974 + * fix(document): retain user-defined key order on initial set with nested docs #6944 + * fix(populate): handle multiple localFields + foreignFields using `localField: function() {}` syntax #5704 + +5.2.17 / 2018-09-21 +=================== + * docs(guide): clarify that Mongoose only increments versionKey on `save()` and add a workaround for `findOneAndUpdate()` #7038 + * fix(model): correctly handle `createIndex` option to `ensureIndexes()` #7036 #6922 [lineus](https://github.com/lineus) + * docs(migrating_to_5): add a note about changing debug output from stderr to stdout #7034 #7018 [lineus](https://github.com/lineus) + * fix(query): add `setUpdate()` to allow overwriting update without changing op #7024 #7012 [lineus](https://github.com/lineus) + * fix(update): find top-level version key even if there are `$` operators in the update #7003 + * docs(model+query): explain which operators `count()` supports that `countDocuments()` doesn't #6911 + +5.2.16 / 2018-09-19 +=================== + * fix(index): use dynamic require only when needed for better webpack support #7014 #7010 [jaydp17](https://github.com/jaydp17) + * fix(map): handle arrays of mixed maps #6995 + * fix(populate): leave justOne as null if populating underneath a Mixed type #6985 + * fix(populate): add justOne option to allow overriding any bugs with justOne #6985 + * fix(query): add option to skip adding timestamps to an update #6980 + * docs(model+schematype): improve docs about background indexes and init() #6966 + * fix: bump mongodb -> 3.1.6 to allow connecting to srv url without credentials #6955 #6881 [lineus](https://github.com/lineus) + * fix(connection): allow specifying `useCreateIndex` at the connection level, overrides global-level #6922 + * fix(schema): throw a helpful error if setting `ref` to an invalid value #6915 + +5.2.15 / 2018-09-15 
+=================== + * fix(populate): handle virtual justOne correctly if it isn't set #6988 + * fix(populate): consistently use lowercase `model` instead of `Model` so double-populating works with existing docs #6978 + * fix(model): allow calling `Model.init()` again after calling `dropDatabase()` #6967 + * fix(populate): find correct justOne when double-populating underneath an array #6798 + * docs(webpack): make webpack docs use es2015 preset for correct libs and use acorn to test output is valid ES5 #6740 + * fix(populate): add selectPopulatedPaths option to opt out of auto-adding `populate()`-ed fields to `select()` #6546 + * fix(model): set timestamps on bulkWrite `insertOne` and `replaceOne` #5708 + +5.2.14 / 2018-09-09 +=================== + * docs: fix wording on promise docs to not imply queries only return promises #6983 #6982 [lineus](https://github.com/lineus) + * fix(map): throw TypeError if keys are not string #6956 + * fix(document): ensure you can `validate()` a child doc #6931 + * fix(populate): avoid cast error if refPath points to localFields with 2 different types #6870 + * fix(populate): handle populating already-populated paths #6839 + * fix(schematype): make ObjectIds handle refPaths when checking required #6714 + * fix(model): set timestamps on bulkWrite() updates #5708 + +5.2.13 / 2018-09-04 +=================== + * fix(map): throw TypeError if keys are not string #6968 [Fonger](https://github.com/Fonger) + * fix(update): make array op casting work with strict:false and {} #6962 #6952 [lineus](https://github.com/lineus) + * fix(document): add doc.deleteOne(), doc.updateOne(), doc.replaceOne() re: deprecation warnings #6959 #6940 [lineus](https://github.com/lineus) + * docs(faq+schematypes): add note about map keys needing to be strings #6957 #6956 [lineus](https://github.com/lineus) + * fix(schematype): remove unused if statement #6950 #6949 [cacothi](https://github.com/cacothi) + * docs: add /docs/deprecations.html for dealing with MongoDB driver deprecation warnings #6922 + * fix(populate): handle refPath where first element in array has no refPath #6913 + * fix(mongoose): allow setting useCreateIndex option after creating a model but before initial connection succeeds #6890 + * fix(updateValidators): ensure $pull validators always get an array #6889 + +5.2.12 / 2018-08-30 +=================== + * fix(document): disallow setting `constructor` and `prototype` if strict mode false + +4.13.17 / 2018-08-30 +==================== + * fix(document): disallow setting `constructor` and `prototype` if strict mode false + +5.2.11 / 2018-08-30 +=================== + * fix(document): disallow setting __proto__ if strict mode false + * fix(document): run document middleware on docs embedded in maps #6945 #6938 [Fonger](https://github.com/Fonger) + * fix(query): make castForQuery return a CastError #6943 #6927 [lineus](https://github.com/lineus) + * fix(query): use correct `this` scope when casting query with legacy 2dsphere pairs defined in schema #6939 #6937 [Fonger](https://github.com/Fonger) + * fix(document): avoid crash when calling `get()` on deeply nested subdocs #6929 #6925 [jakemccloskey](https://github.com/jakemccloskey) + * fix(plugins): make saveSubdocs execute child post save hooks _after_ the actual save #6926 + * docs: add dbName to api docs for .connect() #6923 [p722](https://github.com/p722) + * fix(populate): convert to array when schema specifies array, even if doc doesn't have an array #6908 + * fix(populate): handle `justOne` virtual populate underneath 
array #6867 + * fix(model): dont set versionKey on upsert if it is already `$set` #5973 + +4.13.16 / 2018-08-30 +==================== + * fix(document): disallow setting __proto__ if strict mode false + * feat(error): backport adding modified paths to VersionError #6928 [freewil](https://github.com/freewil) + +5.2.10 / 2018-08-27 +=================== + * fix: bump mongodb driver -> 3.1.4 #6920 #6903 #6884 #6799 #6741 [Fonger](https://github.com/Fonger) + * fix(model): track `session` option for `save()` as the document's `$session()` #6909 + * fix(query): add Query.getOptions() helper #6907 [Fonger](https://github.com/Fonger) + * fix(document): ensure array atomics get cleared after save() #6900 + * fix(aggregate): add missing redact and readConcern helpers #6895 [Fonger](https://github.com/Fonger) + * fix: add global option `mongoose.set('useCreateIndex', true)` to avoid ensureIndex deprecation warning #6890 + * fix(query): use `projection` option to avoid deprecation warnings #6888 #6880 [Fonger](https://github.com/Fonger) + * fix(query): use `findOneAndReplace()` internally if using `overwrite: true` with `findOneAndUpdate()` #6888 [Fonger](https://github.com/Fonger) + * fix(document): ensure required cache gets cleared correctly between subsequent saves #6892 + * fix(aggregate): support session chaining correctly #6886 #6885 [Fonger](https://github.com/Fonger) + * fix(query): use `projection` instead of `fields` internally for `find()` and `findOne()` to avoid deprecation warning #6880 + * fix(populate): add `getters` option to opt in to calling getters on populate #6844 + +5.2.9 / 2018-08-17 +================== + * fix(document): correctly propagate write concern options in save() #6877 [Fonger](https://github.com/Fonger) + * fix: upgrade mongodb driver -> 3.1.3 for numerous fixes #6869 #6843 #6692 #6670 [simllll](https://github.com/simllll) + * fix: correct `this` scope of default functions for DocumentArray and Array #6868 #6840 [Fonger](https://github.com/Fonger) + * fix(types): support casting JSON form of buffers #6866 #6863 [Fonger](https://github.com/Fonger) + * fix(query): get global runValidators option correctly #6865 #6578 + * fix(query): add Query.prototype.setQuery() analogous to `getQuery()` #6855 #6854 + * docs(connections): add note about the `family` option for IPv4 vs IPv6 and add port to example URIs #6784 + * fix(query): get global runValidators option correctly #6578 + +4.13.15 / 2018-08-14 +==================== + * fix(mongoose): add global `usePushEach` option for easier Mongoose 4.x + MongoDB 3.6 #6858 + * chore: fix flakey tests for 4.x #6853 [Fonger](https://github.com/Fonger) + * feat(error): add version number to VersionError #6852 [freewil](https://github.com/freewil) + +5.2.8 / 2018-08-13 +================== + * docs: update `execPopulate()` code example #6851 [WJakub](https://github.com/WJakub) + * fix(document): allow passing callback to `execPopulate()` #6851 + * fix(populate): populate with undefined fields without error #6848 #6845 [Fonger](https://github.com/Fonger) + * docs(migrating_to_5): Add `objectIdGetter` option docs #6842 [jwalton](https://github.com/jwalton) + * chore: run lint in parallel and only on Node.js v10 #6836 [Fonger](https://github.com/Fonger) + * fix(populate): throw helpful error if refPath excluded in query #6834 + * docs(migrating_to_5): add note about removing runSettersOnQuery #6832 + * fix: use safe-buffer to avoid buffer deprecation errors in Node.js 10 #6829 [Fonger](https://github.com/Fonger) + * docs(query): fix broken 
links #6828 [yaynick](https://github.com/yaynick) + * docs(defaults): clarify that defaults only run on undefined #6827 + * chore: fix flakey tests #6824 [Fonger](https://github.com/Fonger) + * docs: fix custom inspect function deprecation warning in Node.js 10 #6821 [yelworc](https://github.com/yelworc) + * fix(document): ensure subdocs get set to init state after save() so validators can run again #6818 + * fix(query): make sure embedded query casting always throws a CastError #6803 + * fix(document): ensure `required` function only gets called once when validating #6801 + * docs(connections): note that you must specify port if using `useNewUrlParser: true` #6789 + * fix(populate): support `options.match` in virtual populate schema definition #6787 + * fix(update): strip out virtuals from updates if strict: 'throw' rather than returning an error #6731 + +5.2.7 / 2018-08-06 +================== + * fix(model): check `expireAfterSeconds` option when diffing indexes in syncIndexes() #6820 #6819 [christopherhex](https://github.com/christopherhex) + * chore: fix some common test flakes in travis #6816 [Fonger](https://github.com/Fonger) + * chore: bump eslint and webpack to avoid bad versions of eslint-scope #6814 + * test(model): add delay to session tests to improve pass rate #6811 [Fonger](https://github.com/Fonger) + * fix(model): support options in `deleteMany` #6810 [Fonger](https://github.com/Fonger) + * fix(query): don't use $each when pushing an array into an array #6809 [lineus](https://github.com/lineus) + * chore: bump mquery so eslint isn't a prod dependency #6800 + * fix(populate): correctly get schema type when calling `populate()` on already populated path #6798 + * fix(populate): propagate readConcern options in populate from parent query #6792 #6785 [Fonger](https://github.com/Fonger) + * docs(connection): add description of useNewUrlParser option #6789 + * fix(query): make select('+path') a no-op if no select prop in schema #6785 + * docs(schematype+validation): document using function syntax for custom validator message #6772 + * fix(update): throw CastError if updating with `$inc: null` #6770 + * fix(connection): throw helpful error when calling `createConnection(undefined)` #6763 + +5.2.6 / 2018-07-30 +================== + * fix(document): don't double-call deeply nested custom getters when using `get()` #6779 #6637 + * fix(query): upgrade mquery for readConcern() helper #6777 + * docs(schematypes): clean up typos #6773 [sajadtorkamani](https://github.com/sajadtorkamani) + * refactor(browser): fix webpack warnings #6771 #6705 + * fix(populate): make error reported when no `localField` specified catchable #6767 + * docs(connection): use correct form in createConnection example #6766 [lineus](https://github.com/lineus) + * fix(connection): throw helpful error when using legacy `mongoose.connect()` syntax #6756 + * fix(document): handle overwriting `$session` in `execPopulate()` #6754 + * fix(query): propagate top-level session down to `populate()` #6754 + * fix(aggregate): add `session()` helper for consistency with query api #6752 + * fix(map): avoid infinite recursion when update overwrites a map #6750 + * fix(model): be consistent about passing noop callback to mongoose.model() `init()` as well as db.model() #6707 + +5.2.5 / 2018-07-23 +================== + * fix(boolean): expose `convertToTrue` and `convertToFalse` for custom boolean casting #6758 + * docs(schematypes): add note about what values are converted to booleans #6758 + * fix(document): report 
castError when setting single nested doc to array #6753 + * docs: prefix mongoose.Schema call with new operator #6751 [sajadtorkamani](https://github.com/sajadtorkamani) + * docs(query): add examples and links to schema writeConcern option for writeConcern helpers #6748 + * docs(middleware): clarify that init middleware is sync #6747 + * perf(model): create error rather than modifying stack for source map perf #6735 + * fix(model): throw helpful error when passing object to aggregate() #6732 + * fix(model): pass Model instance as context to applyGetters when calling getters for virtual populate #6726 [lineus](https://github.com/lineus) + * fix(documentarray): remove `isNew` and `save` listeners on CastError because otherwise they never get removed #6723 + * docs(model+query): clarify when to use `countDocuments()` vs `estimatedDocumentCount()` #6713 + * fix(populate): correctly set virtual nestedSchemaPath when calling populate() multiple times #6644 + * docs(connections): add note about the `family` option for IPv4 vs IPv6 and add port to example URIs #6566 + +5.2.4 / 2018-07-16 +================== + * docs: Model.insertMany rawResult option in api docs #6724 [lineus](https://github.com/lineus) + * docs: fix typo on migrating to 5 guide #6722 [iagowp](https://github.com/iagowp) + * docs: update doc about keepalive #6719 #6718 [simllll](https://github.com/simllll) + * fix: ensure debug mode doesn't crash with sessions #6712 + * fix(document): report castError when setting single nested doc to primitive value #6710 + * fix(connection): throw helpful error if using `new db.model(foo)(bar)` #6698 + * fix(model): throw readable error with better stack trace when non-cb passed to $wrapCallback() #6640 + +5.2.3 / 2018-07-11 +================== + * fix(populate): if a getter is defined on the localField, use it when populating #6702 #6618 [lineus](https://github.com/lineus) + * docs(schema): add example of nested aliases #6671 + * fix(query): add `session()` function to queries to avoid positional argument mistakes #6663 + * docs(transactions): use new session() helper to make positional args less confusing #6663 + * fix(query+model+schema): add support for `writeConcern` option and writeConcern helpers #6620 + * docs(guide): add `writeConcern` option and re-add description for `safe` option #6620 + * docs(schema): fix broken API links #6619 + * docs(connections): add information re: socketTimeoutMS and connectTimeoutMS #4789 + +5.2.2 / 2018-07-08 +================== + * fix(model+query): add missing estimatedDocumentCount() function #6697 + * docs(faq): improve array-defaults section #6695 [lineus](https://github.com/lineus) + * docs(model): fix countDocuments() docs, bad copy/paste from count() docs #6694 #6643 + * fix(connection): add `startSession()` helper to connection and mongoose global #6689 + * fix: upgrade mongodb driver -> 3.1.1 for countDocuments() fix #6688 #6666 + * docs(compatibility): add MongoDB 4 range #6685 + * fix(populate): add ability to define refPath as a function #6683 [lineus](https://github.com/lineus) + * docs: add rudimentary transactions guide #6672 + * fix(update): make setDefaultsOnInsert handle nested subdoc updates with deeply nested defaults #6665 + * docs: use latest acquit-ignore to handle examples that start with acquit:ignore:start #6657 + * fix(validation): format `properties.message` as well as `message` #6621 + +5.2.1 / 2018-07-03 +================== + * fix(connection): allow setting the mongodb driver's useNewUrlParser option, default to false #6656 
#6648 #6647 + * fix(model): only warn on custom _id index if index only has _id key #6650 + +5.2.0 / 2018-07-02 +================== + * feat(model): add `countDocuments()` #6643 + * feat(model): make ensureIndexes() fail if specifying an index on _id #6605 + * feat(mongoose): add `objectIdGetter` option to remove ObjectId.prototype._id #6588 + * feat: upgrade mongodb -> 3.1.0 for full MongoDB 4.0 support #6579 + * feat(query): support `runValidators` as a global option #6578 + * perf(schema): use WeakMap instead of array for schema stack #6503 + * feat(model): decorate unique discriminator indexes with partialFilterExpressions #6347 + * feat(model): add `syncIndexes()`, drops indexes that aren't in schema #6281 + * feat(document): add default getter/setter if virtual doesn't have one #6262 + * feat(discriminator): support discriminators on nested doc arrays #6202 + * feat(update): add `Query.prototype.set()` #5770 + +5.1.8 / 2018-07-02 +================== + * fix: don't throw TypeError if calling save() after original save() failed with push() #6638 [evanhenke](https://github.com/evanhenke) + * fix(query): add explain() helper and don't hydrate explain output #6625 + * docs(query): fix `setOptions()` lists #6624 + * docs: add geojson docs #6607 + * fix: bump mongodb -> 3.0.11 to avoid cyclic dependency error with retryWrites #6109 + +5.1.7 / 2018-06-26 +================== + * docs: add npm badge to readme #6623 [VFedyk](https://github.com/VFedyk) + * fix(document): don't throw parallel save error if post save hooks in parallel #6614 #6611 [lineus](https://github.com/lineus) + * fix(populate): allow dynamic ref to handle >1 model getModelsMapForPopulate #6613 #6612 [jimmytsao](https://github.com/jimmytsao) + * fix(document): handle `push()` on triple nested document array #6602 + * docs(validation): improve update validator doc headers #6577 [joeytwiddle](https://github.com/joeytwiddle) + * fix(document): handle document arrays in `modifiedPaths()` with includeChildren option #5904 + +5.1.6 / 2018-06-19 +================== + * fix: upgrade mongodb -> 3.0.10 + * docs(model+document): clarify that `save()` returns `undefined` if passed a callback #6604 [lineus](https://github.com/lineus) + * fix(schema): apply alias when adding fields with .add() #6593 + * docs: add full list of guides and streamline nav #6592 + * docs(model): add `projection` option to `findOneAndUpdate()` #6590 [lineus](https://github.com/lineus) + * docs: support @static JSDoc declaration #6584 + * fix(query): use boolean casting logic for $exists #6581 + * fix(query): cast all $text options to correct values #6581 + * fix(model): add support synchronous pre hooks for `createModel` #6552 [profbiss](https://github.com/profbiss) + * docs: add note about the `applyPluginsToDiscriminators` option #4965 + +5.1.5 / 2018-06-11 +================== + * docs(guide): rework query helper example #6575 [lineus](https://github.com/lineus) + * fix(populate): handle virtual populate with embedded discriminator under single nested subdoc #6571 + * docs: add string option to projections that call query select #6563 [lineus](https://github.com/lineus) + * style: use ES6 in collection.js #6560 [l33ds](https://github.com/l33ds) + * fix(populate): add virtual ref function ability getModelsMapForPopulate #6559 #6554 [lineus](https://github.com/lineus) + * docs(queries): fix link #6557 [sun1x](https://github.com/sun1x) + * fix(schema): rename indexes -> getIndexes to avoid webpack duplicate declaration #6547 + * fix(document): support 
`toString()` as custom method #6538 + * docs: add @instance for instance methods to be more compliant with JSDoc #6516 [treble-snake](https://github.com/treble-snake) + * fix(populate): avoid converting to map when using mongoose-deep-populate #6460 + * docs(browser): create new browser docs page #6061 + +5.1.4 / 2018-06-04 +================== + * docs(faq): add hr tags for parallel save error #6550 [lineus](https://github.com/lineus) + * docs(connection): fix broken link #6545 [iblamefish](https://github.com/iblamefish) + * fix(populate): honor subpopulate options #6539 #6528 [lineus](https://github.com/lineus) + * fix(populate): allow population of refpath under array #6537 #6509 [lineus](https://github.com/lineus) + * fix(query): dont treat $set the same as the other ops in update casting #6535 [lineus](https://github.com/lineus) + * fix: bump async -> 2.6.1 #6534 #6505 [lineus](https://github.com/lineus) + * fix: support using a function as validation error message #6530 [lucandrade](https://github.com/lucandrade) + * fix(populate): propagate `lean()` down to subpopulate #6498 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * docs(lambda): add info on what happens if database does down between lambda function calls #6409 + * fix(update): allow updating embedded discriminator path if discriminator key is in filter #5841 + +5.1.3 / 2018-05-28 +================== + * fix(document): support set() on path underneath array embedded discriminator #6526 + * chore: update lodash and nsp dev dependencies #6514 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix(document): throw readable error when saving the same doc instance more than once in parallel #6511 #6456 #4064 [lineus](https://github.com/lineus) + * fix(populate): set correct nestedSchemaPath for virtual underneath embedded discriminator #6501 #6487 [lineus](https://github.com/lineus) + * docs(query): add section on promises and warning about not mixing promises and callbacks #6495 + * docs(connection): add concrete example of connecting to multiple hosts #6492 [lineus](https://github.com/lineus) + * fix(populate): handle virtual populate under single nested doc under embedded discriminator #6488 + * fix(schema): collect indexes from embedded discriminators for autoIndex build #6485 + * fix(document): handle `doc.set()` underneath embedded discriminator #6482 + * fix(document): handle set() on path under embedded discriminator with object syntax #6482 + * fix(document): handle setting nested property to object with only non-schema properties #6436 + +4.13.14 / 2018-05-25 +==================== + * fix(model): handle retainKeyOrder option in findOneAndUpdate() #6484 + +5.1.2 / 2018-05-21 +================== + * docs(guide): add missing SchemaTypes #6490 [distancesprinter](https://github.com/distancesprinter) + * fix(map): make MongooseMap.toJSON return a serialized object #6486 #6478 [lineus](https://github.com/lineus) + * fix(query): make CustomQuery inherit from model.Query for hooks #6483 #6455 [lineus](https://github.com/lineus) + * fix(document): prevent default falses from being skipped by $__dirty #6481 #6477 [lineus](https://github.com/lineus) + * docs(connection): document `useDb()` #6480 + * fix(model): skip redundant clone in insertMany #6479 [d1manson](https://github.com/d1manson) + * fix(aggregate): let replaceRoot accept objects as well as strings #6475 #6474 [lineus](https://github.com/lineus) + * docs(model): clarify `emit()` in mapReduce and how map/reduce are run #6465 + * fix(populate): flatten 
array to handle multi-level nested `refPath` #6457 + * fix(date): cast small numeric strings as years #6444 [AbdelrahmanHafez](https://github.com/AbdelrahmanHafez) + * fix(populate): remove unmatched ids when using virtual populate on already hydrated document #6435 + * fix(array): use custom array class to avoid clobbered property names #6431 + * fix(model): handle hooks for custom methods that return promises #6385 + +4.13.13 / 2018-05-17 +==================== + * fix(update): stop clobbering $in when casting update #6441 #6339 + * fix: upgrade async -> 2.6.0 re: security warning + +5.1.1 / 2018-05-14 +================== + * docs(schema): add notes in api and guide about schema.methods object #6470 #6440 [lineus](https://github.com/lineus) + * fix(error): add modified paths to VersionError #6464 #6433 [paglias](https://github.com/paglias) + * fix(populate): only call populate with full param signature when match is not present #6458 #6451 [lineus](https://github.com/lineus) + * docs: fix geoNear link in migration guide #6450 [kawache](https://github.com/kawache) + * fix(discriminator): throw readable error when `create()` with a non-existent discriminator key #6434 + * fix(populate): add `retainNullValues` option to avoid stripping out null keys #6432 + * fix(populate): handle populate in embedded discriminators underneath nested paths #6411 + * docs(model): add change streams and ToC, make terminology more consistent #5888 + +5.1.0 / 2018-05-10 +================== + * feat(ObjectId): add `_id` getter so you can get a usable id whether or not the path is populated #6415 #6115 + * feat(model): add Model.startSession() #6362 + * feat(document): add doc.$session() and set session on doc after query #6362 + * feat: add Map type that supports arbitrary keys #6287 #681 + * feat: add `cloneSchemas` option to mongoose global to opt in to always cloning schemas before use #6274 + * feat(model): add `findOneAndDelete()` and `findByIdAndDelete()` #6164 + * feat(document): support `$ignore()` on single nested and array subdocs #6152 + * feat(document): add warning about calling `save()` on subdocs #6152 + * fix(model): make `save()` use `updateOne()` instead of `update()` #6031 + * feat(error): add version number to VersionError #5966 + * fix(query): allow `[]` as a value for `$in` when casting #5913 + * fix(document): avoid running validators on single nested paths if only a child path is modified #5885 + * feat(schema): print warning if method conflicts with mongoose internals #5860 + +5.0.18 / 2018-05-09 +=================== + * fix(update): stop clobbering $in when casting update #6441 #6339 [lineus](https://github.com/lineus) + * fix: upgrade mongodb driver -> 3.0.8 to fix session issue #6437 #6357 [simllll](https://github.com/simllll) + * fix: upgrade bson -> 1.0.5 re: https://snyk.io/vuln/npm:bson:20180225 #6423 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix: look for `valueOf()` when casting to Decimal128 #6419 #6418 [lineus](https://github.com/lineus) + * fix: populate array of objects with space separated paths #6414 [lineus](https://github.com/lineus) + * test: add coverage for `mongoose.pluralize()` #6412 [FastDeath](https://github.com/FastDeath) + * fix(document): avoid running default functions on init() if path has value #6410 + * fix(document): allow saving document with `null` id #6406 + * fix: prevent casting of populated docs in document.init #6390 [lineus](https://github.com/lineus) + * fix: remove `toHexString()` helper that was added in 5.0.15 #6359 + +5.0.17 / 
2018-04-30 +=================== + * docs(migration): certain chars in passwords may cause connection failures #6401 [markstos](https://github.com/markstos) + * fix(document): don't throw when `push()` on a nested doc array #6398 + * fix(model): apply hooks to custom methods if specified #6385 + * fix(schema): support opting out of one timestamp field but not the other for `insertMany()` #6381 + * fix(documentarray): handle `required: true` within documentarray definition #6364 + * fix(document): ensure `isNew` is set before default functions run on init #3793 + +5.0.16 / 2018-04-23 +=================== + * docs(api): sort api methods based on their string property #6374 [lineus](https://github.com/lineus) + * docs(connection): fix typo in `createCollection()` #6370 [mattc41190](https://github.com/mattc41190) + * docs(document): remove vestigial reference to `numAffected` #6367 [ekulabuhov](https://github.com/ekulabuhov) + * docs(schema): fix typo #6366 [dhritzkiv](https://github.com/dhritzkiv) + * docs(schematypes): add missing `minlength` and `maxlength` docs #6365 [treble-snake](https://github.com/treble-snake) + * docs(queries): fix formatting #6360 [treble-snake](https://github.com/treble-snake) + * docs(api): add cursors to API docs #6353 #6344 [lineus](https://github.com/lineus) + * docs(aggregate): remove reference to non-existent `.select()` method #6346 + * fix(update): handle `required` array with update validators and $pull #6341 + * fix(update): avoid setting __v in findOneAndUpdate if it is `$set` #5973 + +5.0.15 / 2018-04-16 +=================== + * fix: add ability for casting from number to decimal128 #6336 #6331 [lineus](https://github.com/lineus) + * docs(middleware): enumerate the ways to error out in a hook #6315 + * fix(document): respect schema-level depopulate option for toObject() #6313 + * fix: bump mongodb driver -> 3.0.6 #6310 + * fix(number): check for `valueOf()` function to support Decimal.js #6306 #6299 [lineus](https://github.com/lineus) + * fix(query): run array setters on query if input value is an array #6277 + * fix(versioning): don't require matching version when using array.pull() #6190 + * fix(document): add `toHexString()` function so you don't need to check whether a path is populated to get an id #6115 + +5.0.14 / 2018-04-09 +=================== + * fix(schema): clone aliases and alternative option syntax correctly + * fix(query): call utils.toObject in query.count like in query.find #6325 [lineus](https://github.com/lineus) + * docs(populate): add middleware examples #6320 [BorntraegerMarc](https://github.com/BorntraegerMarc) + * docs(compatibility): fix dead link #6319 [lacivert](https://github.com/lacivert) + * docs(api): fix markdown parsing for parameters #6318 #6314 [lineus](https://github.com/lineus) + * fix(populate): handle space-delimited paths in array populate #6296 #6284 [lineus](https://github.com/lineus) + * fix(populate): support basic virtual populate underneath embedded discriminators #6273 + +5.0.13 / 2018-04-05 +=================== + * docs(faq): add middleware to faq arrow function warning #6309 [lineus](https://github.com/lineus) + * docs(schema): add example to loadClass() docs #6308 + * docs: clean up misc typos #6304 [sfrieson](https://github.com/sfrieson) + * fix(document): apply virtuals when calling `toJSON()` on a nested path #6294 + * refactor(connection): use `client.db()` syntax rather than double-parsing the URI #6292 #6286 + * docs: document new behavior of required validator for arrays #6288 
[daltones](https://github.com/daltones) + * fix(schema): treat set() options as user-provided options #6274 + * fix(schema): clone discriminators correctly #6274 + * fix(update): make setDefaultsOnInsert not create subdoc if only default is id #6269 + * docs(discriminator): clarify 3rd argument to Model.discriminator() #2596 + +5.0.12 / 2018-03-27 +=================== + * docs(query): updating model name in query API docs #6280 [lineus](https://github.com/lineus) + * docs: fix typo in tests #6275 [styler](https://github.com/styler) + * fix: add missing `.hint()` to aggregate #6272 #6251 [lineus](https://github.com/lineus) + * docs(api): add headers to each API docs section for easier nav #6261 + * fix(query): ensure hooked query functions always run on next tick for chaining #6250 + * fix(populate): ensure populated array not set to null if it isn't set #6245 + * fix(connection): set readyState to disconnected if initial connection fails #6244 #6131 + * docs(model): make `create()` params show up correctly in docs #6242 + * fix(model): make error handlers work with MongoDB server errors and `insertMany()` #6228 + * fix(browser): ensure browser document builds defaults for embedded arrays correctly #6175 + * fix(timestamps): set timestamps when using `updateOne()` and `updateMany()` #6282 [gualopezb](https://github.com/gualopezb) + +5.0.11 / 2018-03-19 +=================== + * fix(update): handle $pull with $in in update validators #6240 + * fix(query): don't convert undefined to null when casting so driver `ignoreUndefined` option works #6236 + * docs(middleware): add example of using async/await with middleware #6235 + * fix(populate): apply justOne option before `completeMany()` so it works with lean() #6234 + * fix(query): ensure errors in user callbacks aren't caught in init #6195 #6178 + * docs(connections): document dbName option for Atlas connections #6179 + * fix(discriminator): make child schema nested paths overwrite parent schema paths #6076 + +4.13.12 / 2018-03-13 +==================== + * fix(document): make virtual get() return undefined instead of null if no getters #6223 + * docs: fix url in useMongoClient error message #6219 #6217 [lineus](https://github.com/lineus) + * fix(discriminator): don't copy `discriminators` property from base schema #6122 #6064 + +5.0.10 / 2018-03-12 +=================== + * docs(schematype): add notes re: running setters on queries #6209 + * docs: fix typo #6208 [kamagatos](https://github.com/kamagatos) + * fix(query): only call setters once on query filter props for findOneAndUpdate and findOneAndRemove #6203 + * docs: elaborate on connection string changes in migration guide #6193 + * fix(document): skip applyDefaults if subdoc is null #6187 + * docs: fix schematypes docs and link to them #6176 + * docs(faq): add FAQs re: array defaults and casting aggregation pipelines #6184 #6176 #6170 [lineus](https://github.com/lineus) + * fix(document): ensure primitive defaults are set and built-in default functions run before setters #6155 + * fix(query): handle single embedded embedded discriminators in castForQuery #6027 + +5.0.9 / 2018-03-05 +================== + * perf: bump mongodb -> 3.0.4 to fix SSL perf issue #6065 + +5.0.8 / 2018-03-03 +================== + * docs: remove obsolete references to `emitIndexErrors` #6186 [isaackwan](https://github.com/isaackwan) + * fix(query): don't cast findOne() until exec() so setters don't run twice #6157 + * fix: remove document_provider.web.js file #6186 + * fix(discriminator): support custom discriminator 
model names #6100 [wentout](https://github.com/wentout) + * fix: support caching calls to `useDb()` #6036 [rocketspacer](https://github.com/rocketspacer) + * fix(query): add omitUndefined option so setDefaultsOnInsert can kick in on undefined #6034 + * fix: upgrade mongodb -> 3.0.3 for reconnectTries: 0 blocking process exit fix #6028 + +5.0.7 / 2018-02-23 +================== + * fix: support eachAsync options with aggregation cursor #6169 #6168 [vichle](https://github.com/vichle) + * docs: fix link to MongoDB compound indexes docs #6162 [br0p0p](https://github.com/br0p0p) + * docs(aggregate): use eachAsync instead of incorrect `each()` #6160 [simllll](https://github.com/simllll) + * chore: fix benchmarks #6158 [pradel](https://github.com/pradel) + * docs: remove dead link to old blog post #6154 [markstos](https://github.com/markstos) + * fix: don't convert dates to numbers when updating mixed path #6146 #6145 [s4rbagamble](https://github.com/s4rbagamble) + * feat(aggregate): add replaceRoot, count, sortByCount helpers #6142 [jakesjews](https://github.com/jakesjews) + * fix(document): add includedChildren flag to modifiedPaths() #6134 + * perf: don't create wrapper function if no hooks specified #6126 + * fix(schema): allow indexes on single nested subdocs for geoJSON #6113 + * fix(document): allow depopulating all fields #6073 + * feat(mongoose): add support for `useFindAndModify` option on singleton #5616 + +5.0.6 / 2018-02-15 +================== + * refactor(query.castUpdate): avoid creating error until necessary #6137 + * docs(api): fix missing api docs #6136 [lineus](https://github.com/lineus) + * fix(schema): copy virtuals when using `clone()` #6133 + * fix(update): avoid digging into buffers with upsert and replaceOne #6124 + * fix(schema): support `enum` on arrays of strings #6102 + * fix(update): cast `$addToSet: [1, 2]` -> `$addToSet: { $each: [1, 2] }` #6086 + +5.0.5 / 2018-02-13 +================== + * docs: make > show up correctly in API docs #6114 + * fix(query): support `where()` overwriting primitive with object #6097 + * fix(schematype): don't run internal `resetId` setter on queries with _id #6093 + * fix(discriminator): don't copy `discriminators` property from base schema #6064 + * fix(utils): respect `valueOf()` when merging object for update #6059 + * docs(validation): fix typo 'maxLength' #4720 + * fix(document): apply defaults after setting initial value so default functions don't see empty doc #3781 + +5.0.4 / 2018-02-08 +================== + * docs: add lambda guide #6107 + * fix(connection): add `dbName` option to work around `mongodb+srv` not supporting db name in URI #6106 + * fix(schematype): fix regexp typo in ObjectId #6098 [JoshuaWise](https://github.com/JoshuaWise) + * perf(document): re-use the modifiedPaths list #6092 [tarun1793](https://github.com/tarun1793) + * fix: use console.info() instead of console.error() for debug output #6088 [yuristsepaniuk](https://github.com/yuristsepaniuk) + * docs(validation): clean up runValidators and isAsync options docs for 5.x #6083 + * docs(model): use array instead of spread consistently for aggregate() #6070 + * fix(schema): make aliases handle mongoose-lean-virtuals #6069 + * docs(layout): add link to subdocs guide #6056 + * fix(query): make strictQuery: true strip out fields that aren't in the schema #6032 + * docs(guide): add notes for `strictQuery` option #6032 + +4.13.11 / 2018-02-07 +==================== + * docs: fix links in 4.x docs #6081 + * chore: add release script that uses --tag for npm publish for 
4.x releases #6063 + +5.0.3 / 2018-01-31 +================== + * fix: consistently use process.nextTick() to avoid sinon.useFakeTimers() causing ops to hang #6074 + * docs(aggregate): fix typo #6072 [adursun](https://github.com/adursun) + * chore: add return type to `mongoose.model()` docs [bryant1410](https://github.com/bryant1410) + * fix(document): depopulate push()-ed docs when saving #6048 + * fix: upgrade mongodb -> 3.0.2 #6019 + +5.0.2 / 2018-01-28 +================== + * fix(schema): do not overwrite default values in schema when nested timestamps are provided #6024 [cdeveas](https://github.com/cdeveas) + * docs: fix syntax highlighting in models.jade, schematypes.jade, subdocs.jade #6058 [lineus](https://github.com/lineus) + * fix: use lazy loading so we can build mongoose with webpack #5993 #5842 + * docs(connections): clarify multi-mongos with useMongoClient for 4.x docs #5984 + * fix(populate): handle populating embedded discriminator paths #5970 + +4.13.10 / 2018-01-28 +==================== + * docs(model+query): add lean() option to Model helpers #5996 [aguyinmontreal](https://github.com/aguyinmontreal) + * fix: use lazy loading so we can build mongoose with webpack #5993 #5842 + * docs(connections): clarify multi-mongos with useMongoClient for 4.x docs #5984 + * fix(populate): handle populating embedded discriminator paths #5970 + * docs(query+aggregate): add more detail re: maxTimeMS #4066 + +5.0.1 / 2018-01-19 +================== + * fix(document): make validate() not resolve to document #6014 + * fix(model): make save() not return DocumentNotFoundError if using fire-and-forget writes #6012 + * fix(aggregate): make options() work as advertised #6011 [spederiva](https://github.com/spederiva) + * docs(queries): fix code samples #6008 + +5.0.0 / 2018-01-17 +================== + * test: refactor tests to use start fewer connections #5985 [fenanquin](https://github.com/fenanquin) + * feat: add global bufferCommands option #5879 + * docs: new docs site and build system #5976 + * test: increase timeout on slow test cases #5968 [fenanquin](https://github.com/fenanquin) + * fix: avoid casting out array filter elements #5965 + * feat: add Model.watch() wrapper #5964 + * chore: replace istanbul with nyc #5962 [ChristianMurphy](https://github.com/ChristianMurphy) + +4.13.9 / 2018-01-07 +=================== + * chore: update marked (dev dependency) re: security vulnerability #5951 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix: upgrade mongodb -> 2.2.34 for ipv6 and autoReconnect fixes #5794 #5760 + * docs: use useMongooseAggCursor for aggregate docs #2955 + +5.0.0-rc2 / 2018-01-04 +====================== + * fix: add cleaner warning about no longer needing `useMongoClient` in 5.x #5961 + * chore: update acquit -> 0.5.1 for minor security patch #5961 [ChristianMurphy](https://github.com/ChristianMurphy) + * docs: add docs for mongoose 4.x at http://mongoosejs.com/docs/4.x #5959 + * docs: add link to migration guide #5957 + * chore: update eslint to version 4.14.0 #5955 [ChristianMurphy](https://github.com/ChristianMurphy) + * chore: update mocha to version 4.1.0 [ChristianMurphy](https://github.com/ChristianMurphy) + +5.0.0-rc1 / 2018-01-02 +====================== + * fix(index): use pluralize correctly for `mongoose.model()` #5958 + * fix: make mquery use native promises by default #5945 + * fix(connection): ensure 'joined' and 'left' events get bubbled up #5944 + +5.0.0-rc0 / 2017-12-28 +====================== + * BREAKING CHANGE: always use mongoose aggregation cursor 
when using `.aggregate().cursor()` #5941 + * BREAKING CHANGE: attach query middleware when compiling model #5939 + * BREAKING CHANGE: `emitIndexErrors` is on by default, failing index build will throw uncaught error if not handled #5910 + * BREAKING CHANGE: remove precompiled browser bundle #5895 + * feat: add `mongoose.pluralize()` function #5877 + * BREAKING CHANGE: remove `passRawResult` option for `findOneAndUpdate`, use `rawResult` #5869 + * BREAKING CHANGE: implicit async validators (based on number of function args) are removed, return a promise instead #5824 + * BREAKING CHANGE: fail fast if user sets a unique index on `_id` #5820 [varunjayaraman](https://github.com/varunjayaraman) + * BREAKING CHANGE: mapReduce resolves to an object with 2 keys rather than 2 separate args #5816 + * BREAKING CHANGE: `mongoose.connect()` returns a promise, removed MongooseThenable #5796 + * BREAKING CHANGE: query stream removed, use `cursor()` instead #5795 + * BREAKING CHANGE: connection `open()` and `openSet()` removed, use `openUri()` instead #5795 + * BREAKING CHANGE: use MongoDB driver 3.0.0, drop support for MongoDB server < 3.0.0 #5791 #4740 + * BREAKING CHANGE: remove support for `$pushAll`, remove `usePushEach` option #5670 + * BREAKING CHANGE: make date casting use native Date #5395 [varunjayaraman](https://github.com/varunjayaraman) + * BREAKING CHANGE: remove `runSettersOnQuery`, always run setters on query #5340 + * BREAKING CHANGE: array of length 0 now satisfies `required: true` for arrays #5139 [wlingke](https://github.com/wlingke) + * BREAKING CHANGE: remove `saveErrorIfNotFound`, always error out if `save()` did not update a document #4973 + * BREAKING CHANGE: don't execute getters in reverse order #4835 + * BREAKING CHANGE: make boolean casting more strict #4245 + * BREAKING CHANGE: `toObject()` and `toJSON()` option parameter merges with defaults rather than overwriting #4131 + * feat: allow setting `default` on `_id` #4069 + * BREAKING CHANGE: `deleteX()` and `remove()` promise resolves to the write object result #4013 + * feat: support returning a promise from middleware functions #3779 + * BREAKING CHANGE: don't return a promise if callback specified #3670 + * BREAKING CHANGE: only cast `update()`, `updateX()`, `replaceOne()`, `remove()`, `deleteX()` in exec #3529 + * BREAKING CHANGE: sync errors in middleware functions are now handled #3483 + * BREAKING CHANGE: post hooks get flow control #3232 + * BREAKING CHANGE: deduplicate hooks when merging discriminator schema #2945 + * BREAKING CHANGE: use native promises by default, remove support for mpromise #2917 + * BREAKING CHANGE: remove `retainKeyOrder`, always use forward order when iterating through objects #2749 + * BREAKING CHANGE: `aggregate()` no longer accepts a spread #2716 + +4.13.8 / 2017-12-27 +=================== + * docs(guide): use more up-to-date syntax for autoIndex example #5933 + * docs: fix grammar #5927 [abagh0703](https://github.com/abagh0703) + * fix: propagate lean options to child schemas #5914 + * fix(populate): use correct model with discriminators + nested populate #5858 + +4.13.7 / 2017-12-11 +=================== + * docs(schematypes): fix typo #5889 [gokaygurcan](https://github.com/gokaygurcan) + * fix(cursor): handle `reject(null)` with eachAsync callback #5875 #5874 [ZacharyRSmith](https://github.com/ZacharyRSmith) + * fix: disallow setting `mongoose.connection` to invalid values #5871 [jinasonlin](https://github.com/jinasonlin) + * docs(middleware): suggest using `return next()` to stop middleware 
execution #5866 + * docs(connection): improve connection string query param docs #5864 + * fix(document): run validate hooks on array subdocs even if not directly modified #5861 + * fix(discriminator): don't treat $meta as defining projection when querying #5859 + * fix(types): handle Decimal128 when using bson-ext on server side #5850 + * fix(document): ensure projection with only $slice isn't treated as inclusive for discriminators #4991 + * fix(model): throw error when passing non-object to create() #2037 + +4.13.6 / 2017-12-02 +=================== + * fix(schema): support strictBool option in schema #5856 [ekulabuhov](https://github.com/ekulabuhov) + * fix(update): make upsert option consistently handle truthy values, not just booleans, for updateOne() #5839 + * refactor: remove unnecessary constructor check #2057 + * docs(query): correct function signature for .mod() helper #1806 + * fix(query): report ObjectParameterError when passing non-object as filter to find() and findOne() #1698 + +4.13.5 / 2017-11-24 +=================== + * fix(model): handle update cast errors correctly with bulkWrite #5845 [Michael77](https://github.com/Michael77) + * docs: add link to bufferCommands option #5844 [ralphite](https://github.com/ralphite) + * fix(model): allow virtual ref function to return arrays #5834 [brunohcastro](https://github.com/brunohcastro) + * fix(query): don't throw uncaught error if query filter too big #5812 + * fix(document): if setting unselected nested path, don't overwrite nested path #5800 + * fix(document): support calling `populate()` on nested document props #5703 + * fix: add `strictBool` option for schema type boolean #5344 #5211 #4245 + * docs(faq): add faq re: typeKey #1886 + * docs(query): add more detailed docs re: options #1702 + +4.13.4 / 2017-11-17 +=================== + * fix(aggregate): add chainable .option() helper for setting arbitrary options #5829 + * fix(aggregate): add `.pipeline()` helper to get the current pipeline #5825 + * docs: grammar fixes for `unique` FAQ #5823 [mfluehr](https://github.com/mfluehr) + * chore: add node 9 to travis #5822 [superheri](https://github.com/superheri) + * fix(model): fix infinite recursion with recursive embedded discriminators #5821 [Faibk](https://github.com/Faibk) + +4.13.3 / 2017-11-15 +=================== + * chore: add node 8 to travis #5818 [superheri](https://github.com/superheri) + * fix(document): don't apply transforms to nested docs when updating already saved doc #5807 + +4.13.2 / 2017-11-11 +=================== + * feat(buffer): add support for subtype prop #5530 + +4.13.1 / 2017-11-08 +=================== + * fix: accept multiple paths or array of paths to depopulate #5798 #5797 [adamreisnz](https://github.com/adamreisnz) + * fix(document): pass default array as actual array rather than taking first element #5780 + * fix(model): increment version when $set-ing it in a save() that requires a version bump #5779 + * fix(query): don't explicitly project in discriminator key if user projected in parent path #5775 #5754 + * fix(model): cast query option to geoNear() #5765 + * fix(query): don't treat projection with just $slice as inclusive #5737 + * fix(discriminator): defer applying embedded discriminator hooks until top-level model is compiled #5706 + * docs(discriminator): add warning to always attach hooks before calling discriminator() #5706 + +4.13.0 / 2017-11-02 +=================== + * feat(aggregate): add $addFields helper #5740 [AyushG3112](https://github.com/AyushG3112) + * feat(connection): add 
connection-level bufferCommands #5720 + * feat(connection): add createCollection() helper #5712 + * feat(populate): support setting localField and foreignField to functions #5704 #5602 + * feat(query): add multipleCastError option for aggregating cast errors when casting update #5609 + * feat(populate): allow passing a function to virtual ref #5602 + * feat(schema): add excludeIndexes option to optionally prevent collecting indexes from nested schemas #5575 + * feat(model): report validation errors from `insertMany()` if using `ordered: false` and `rawResult: true` #5337 + * feat(aggregate): add pre/post aggregate middleware #5251 + * feat(schema): allow using `set` as a schema path #1939 + +4.12.6 / 2017-11-01 +=================== + * fix(schema): make clone() copy query helpers correctly #5752 + * fix: undeprecate `ensureIndex()` and use it by default #3280 + +4.12.5 / 2017-10-29 +=================== + * fix(query): correctly handle `$in` and required for $pull and update validators #5744 + * feat(aggregate): add $addFields pipeline operator #5740 [AyushG3112](https://github.com/AyushG3112) + * fix(document): catch sync errors in document pre hooks and report as error #5738 + * fix(populate): handle slice projections correctly when automatically selecting populated fields #5737 + * fix(discriminator): fix hooks for embedded discriminators #5706 [wlingke](https://github.com/wlingke) + * fix(model): throw sane error when a user calls `mongoose.Model()` over `mongoose.model()` #2005 + +4.12.4 / 2017-10-21 +=================== + * test(plugins): add coverage for idGetter with id as a schema property #5713 [wlingke](https://github.com/wlingke) + * fix(model): avoid copying recursive $$context object when creating discriminator after querying #5721 + * fix(connection): ensure connection promise helpers are removed before emitting 'connected' #5714 + * docs(schema): add notes about runSettersOnQuery to schema setters #5705 + * fix(collection): ensure queued operations run on the next tick #5562 + +4.12.3 / 2017-10-16 +=================== + * fix(connection): emit 'reconnect' event as well as 'reconnected' for consistency with driver #5719 + * fix: correctly bubble up left/joined events for replica set #5718 + * fix(connection): allow passing in `autoIndex` as top-level option rather than requiring `config.autoIndex` #5711 + * docs(connection): improve docs regarding reconnectTries, autoReconnect, and bufferMaxEntries #5711 + * fix(query): handle null with addToSet/push/pull/pullAll update validators #5710 + * fix(model): handle setDefaultsOnInsert option for bulkWrite updateOne and updateMany #5708 + * fix(query): avoid infinite recursion edge case when cloning a buffer #5702 + +4.12.2 / 2017-10-14 +=================== + * docs(faq): add FAQ about using arrow functions for getters/setters, virtuals, and methods #5700 + * docs(schema): document the childSchemas property and add to public API #5695 + * fix(query): don't project in populated field if parent field is already projected in #5669 + * fix: bump mongodb -> 2.2.33 for issue with autoReconnect #4513 + +4.12.1 / 2017-10-08 +=================== + * fix(document): create new doc when setting single nested, no more set() on copy of priorVal #5693 + * fix(model): recursively call applyMethods on child schemas for global plugins #5690 + * docs: fix bad promise lib example on home page #5686 + * fix(query): handle false when checking for inclusive/exclusive projection #5685 + * fix(discriminator): allow reusing child schema #5684 + * fix: make 
addToSet() on empty array with subdoc trigger manual population #5504 + +4.12.0 / 2017-10-02 +=================== + * docs(validation): add docs coverage for ValidatorError.reason #5681 + * feat(discriminator): always add discriminatorKey to base schema to allow updating #5613 + * fix(document): make nested docs no longer inherit parent doc's schema props #5586 #5546 #5470 + * feat(query): run update validators on $pull and $pullAll #5555 + * feat(query): add .error() helper to query to error out in pre hooks #5520 + * feat(connection): add dropCollection() helper #5393 + * feat(schema): add schema-level collation option #5295 + * feat(types): add `discriminator()` function for single nested subdocs #5244 + * feat(document): add $isDeleted() getter/setter for better support for soft deletes #4428 + * feat(connection): bubble up reconnectFailed event when driver gives up reconnecting #4027 + * fix(query): report error if passing array or other non-object as filter to update query #3677 + * fix(collection): use createIndex() instead of deprecated ensureIndex() #3280 + +4.11.14 / 2017-09-30 +==================== + * chore: add nsp check to the CI build #5679 [hairyhenderson](https://github.com/hairyhenderson) + * fix: bump mquery because of security issue with debug package #5677 #5675 [jonathanprl](https://github.com/jonathanprl) + * fix(populate): automatically select() populated()-ed fields #5669 + * fix(connection): make force close work as expected #5664 + * fix(document): treat $elemMatch as inclusive projection #5661 + * docs(model/query): clarify which functions fire which middleware #5654 + * fix(model): make `init()` public and return a promise that resolves when indexes are done building #5563 + +4.11.13 / 2017-09-24 +==================== + * fix(query): correctly run replaceOne with update validators #5665 [sime1](https://github.com/sime1) + * fix(schema): replace mistype in setupTimestamp method #5656 [zipp3r](https://github.com/zipp3r) + * fix(query): avoid throwing cast error for strict: throw with nested id in query #5640 + * fix(model): ensure class gets combined schema when using class syntax with discriminators #5635 + * fix(document): handle setting doc array to array of top-level docs #5632 + * fix(model): handle casting findOneAndUpdate() with overwrite and upsert #5631 + * fix(update): correctly handle $ in updates #5628 + * fix(types): handle manual population consistently for unshift() and splice() #5504 + +4.11.12 / 2017-09-18 +==================== + * docs(model): asterisk should not render as markdown bullet #5644 [timkinnane](https://github.com/timkinnane) + * docs: use useMongoClient in connection example #5627 [GabrielNicolasAvellaneda](https://github.com/GabrielNicolasAvellaneda) + * fix(connection): call callback when initial connection failed #5626 + * fix(query): apply select correctly if a given nested schema is used for 2 different paths #5603 + * fix(document): add graceful fallback for setting a doc array value and `pull()`-ing a doc #3511 + +4.11.11 / 2017-09-10 +==================== + * fix(connection): properly set readyState in response to driver 'close' and 'reconnect' events #5604 + * fix(document): ensure single embedded doc setters only get called once, with correct value #5601 + * fix(timestamps): allow enabling updatedAt without createdAt #5598 + * test: improve unique validator test by making create run before ensureIndex #5595 #5562 + * fix(query): ensure find callback only gets called once when post init hook throws error #5592 + +4.11.10 / 
2017-09-03 +==================== + * docs: add KeenIO tracking #5612 + * fix(schema): ensure validators declared with `.validate()` get copied with clone() #5607 + * fix: remove unnecessary jest warning #5480 + * fix(discriminator): prevent implicit discriminator schema id from clobbering base schema custom id #5591 + * fix(schema): hide schema objectid warning for non-hex strings of length 24 #5587 + * docs(populate): use story schema defined key author instead of creator #5578 [dmric](https://github.com/dmric) + * docs(document): describe usage of `.set()` #5576 + * fix(document): ensure correct scope in single nested validators #5569 + * fix(populate): don't mark path as populated until populate() is done #5564 + * fix(document): make push()-ing a doc onto an empty array act as manual population #5504 + * fix(connection): emit timeout event on socket timeout #4513 + +4.11.9 / 2017-08-27 +=================== + * fix(error): avoid using arguments.callee because that breaks strict mode #5572 + * docs(schematypes): fix spacing #5567 + * fix(query): enforce binary subtype always propagates to mongodb #5551 + * fix(query): only skip castForQuery for mongoose arrays #5536 + * fix(browser): rely on browser entrypoint to decide whether to use BrowserDocument or NodeDocument #5480 + +4.11.8 / 2017-08-23 +=================== + * feat: add warning about using schema ObjectId as type ObjectId #5571 [efkan](https://github.com/efkan) + * fix(schema): allow setting `id` property after schema was created #5570 #5548 + * docs(populate): remove confusing _ from populate docs #5560 + * fix(connection): expose parsed uri fields (host, port, dbname) when using openUri() #5556 + * docs: added type boolean to options documentation #5547 [ndabAP](https://github.com/ndabAP) + * test: add test coverage for stopping/starting server #5524 + * fix(aggregate): pull read preference from schema by default #5522 + +4.11.7 / 2017-08-14 +=================== + * fix: correct properties when calling toJSON() on populated virtual #5544 #5442 [davidwu226](https://github.com/davidwu226) + * docs: fix spelling #5535 [et](https://github.com/et) + * fix(error): always set name before stack #5533 + * fix: add warning about running jest in jsdom environment #5532 #5513 #4943 + * fix(document): ensure overwriting a doc array cleans out individual docs #5523 + * fix(schema): handle creating arrays of single nested using type key #5521 + * fix: upgrade mongodb -> 2.2.31 to support user/pass options #5419 + +4.11.6 / 2017-08-07 +=================== + * fix: limit the number of async operations running at a time in insertMany #5529 [andresattler](https://github.com/andresattler) + * fix: upgrade mongodb -> 2.2.30 #5517 + * fix(browserDocument): prevent stack overflow caused by double-wrapping embedded doc save() in jest #5513 + * fix(document): clear single nested doc when setting to empty object #5506 + * fix(connection): emit reconnected and disconnected events correctly with useMongoClient #5498 + * fix(populate): ensure nested virtual populate gets set even if top-level property is null #5431 + +4.11.5 / 2017-07-30 +=================== + * docs: fix link to $lookup #5516 [TalhaAwan](https://github.com/TalhaAwan) + * fix: better parallelization for eachAsync #5502 [lchenay](https://github.com/lchenay) + * docs(document): copy docs for save from model to doc #5493 + * fix(document): handle dotted virtuals in toJSON output #5473 + * fix(populate): restore user-provided limit after mutating so cursor() works with populate limit #5468 + * 
fix(query): don't throw StrictModeError if geo query with upsert #5467 + * fix(populate): propagate readPreference from query to populate queries by default #5460 + * docs: warn not to use arrow functions for statics and methods #5458 + * fix(query): iterate over all condition keys for setDefaultsOnInsert #5455 + * docs(connection): clarify server/replset/mongos option deprecation with useMongoClient #5442 + +4.11.4 / 2017-07-23 +=================== + * fix: handle next() errors in `eachAsync()` #5486 [lchenay](https://github.com/lchenay) + * fix(schema): propagate runSettersOnQuery option to implicitly created schemas #5479 [https://github.com/ValYouW] + * fix(query): run castConditions() correctly in update ops #5477 + * fix(query): ensure castConditions called for findOne and findOneAnd* #5477 + * docs: clarify relationship between $lookup and populate #5475 [TalhaAwan](https://github.com/TalhaAwan) + * test: add coverage for arrays of arrays [zbjornson](https://github.com/zbjornson) + * fix(middleware): ensure that error handlers for save get doc as 2nd param #5466 + * fix: handle strict: false correctly #5454 #5453 [wookieb](https://github.com/wookieb) + * fix(query): apply schema excluded paths if only projection is a $slice #5450 + * fix(query): correct discriminator handling for schema `select: false` fields in schema #5448 + * fix(cursor): call next() in series when parallel option used #5446 + * chore: load bundled driver first to avoid packaging problem #5443 [prototypeme](https://github.com/prototypeme) + * fix(query): defer condition casting until final exec #5434 + * fix(aggregate): don't rely on mongodb aggregate to put a cursor in the callback #5394 + * docs(aggregate): add useMongooseAggCursor docs #5394 + * docs(middleware): clarify context for document, query, and model middleware #5381 + +4.11.3 / 2017-07-14 +=================== + * fix(connection): remove .then() before resolving to prevent infinite recursion #5471 + +4.11.2 / 2017-07-13 +=================== + * docs: fix comment typo in connect example #5435 [ConnorMcF](https://github.com/ConnorMcF) + * fix(update): correctly cast document array in update validators with exec() #5430 + * fix(connection): handle autoIndex with useMongoClient #5423 + * fix(schema): handle `type: [Array]` in schemas #5416 + * fix(timestamps): if overwrite is set and there's a $set, use $set instead of top-level update #5413 + * fix(document): don't double-validate deeply nested doc array elements #5411 + * fix(schematype): clone default objects so default not shared across object instances unless `shared` specified #5407 + * fix(document): reset down the nested subdocs when resetting parent doc #5406 + * fix: don't pass error arg twice to error handlers #5405 + * fix(connection): make openUri() return connection decorated with then() and catch() #5404 + * fix: enforce $set on an array must be an array #5403 + * fix(document): don't crash if calling `validateSync()` after overwriting doc array index #5389 + * fix(discriminator): ensure discriminator key doesn't count as user-selected field for projection #4629 + +4.11.1 / 2017-07-02 +=================== +* docs: populate virtuals fix justOne description #5427 [fredericosilva](https://github.com/fredericosilva) + * fix(connection): make sure to call onOpen in openUri() #5404 + * docs(query): justOne is actually single, and it default to false #5402 [zbjornson](https://github.com/zbjornson) + * docs: fix small typo in lib/schema.js #5398 #5396 [pjo336](https://github.com/pjo336) + * fix: 
emit remove on single nested subdocs when removing parent #5388 + * fix(update): handle update with defaults and overwrite but no update validators #5384 + * fix(populate): handle undefined refPath values in middle of array #5377 + * fix(document): ensure consistent setter context for single nested #5363 + * fix(query): support runSettersOnQuery as query option #5350 + +4.11.0 / 2017-06-25 +=================== + * feat(query): execute setters with query as context for `runSettersOnQuery` #5339 + * feat(model): add translateAliases function #5338 [rocketspacer](https://github.com/rocketspacer) + * feat(connection): add `useMongoClient` and `openUri` functions, deprecate current connect logic #5304 + * refactor(schema): make id virtual not access doc internals #5279 + * refactor: handle non-boolean lean #5279 + * feat(cursor): add addCursorFlag() support to query and agg cursors #4814 + * feat(cursor): add parallel option to eachAsync #4244 + * feat(schema): allow setting custom error constructor for custom validators #4009 + +4.10.8 / 2017-06-21 +=================== + * docs: fix small formatting typo on schematypes #5374 [gianpaj](https://github.com/gianpaj) + * fix(model): allow null as an _id #5370 + * fix(populate): don't throw async uncaught exception if model not found in populate #5364 + * fix: correctly cast decimals in update #5361 + * fix(error): don't use custom getter for ValidationError message #5359 + * fix(query): handle runSettersOnQuery in built-in _id setter #5351 + * fix(document): ensure consistent context for nested doc custom validators #5347 + +4.10.7 / 2017-06-18 +=================== + * docs(validation): show overriding custom validator error with 2nd cb arg #5358 + * fix: `parseOption` mutates user passed option map #5357 [igwejk](https://github.com/igwejk) + * docs: fix guide.jade typo #5356 [CalebAnderson2014](https://github.com/CalebAnderson2014) + * fix(populate): don't set populate virtual to ids when match fails #5336 + * fix(query): callback with cast error if remove and delete* args have a cast error #5323 + +4.10.6 / 2017-06-12 +=================== + * fix(cursor): handle custom model option for populate #5334 + * fix(populate): handle empty virtual populate with Model.populate #5331 + * fix(model): make ensureIndexes() run with autoIndex: false unless called internally #5328 #5324 #5317 + * fix: wait for all connections to close before resolving disconnect() promise #5316 + * fix(document): handle setting populated path with custom typeKey in schema #5313 + * fix(error): add toJSON helper to ValidationError so `message` shows up with JSON.stringify #5309 + * feat: add `getPromiseConstructor()` to prevent need for `mongoose.Promise.ES6` #5305 + * fix(document): handle conditional required with undefined props #5296 + * fix(model): clone options before inserting in save() #5294 + * docs(populate): clarify that multiple populate() calls on same path overwrite #5274 + +4.10.5 / 2017-06-06 +=================== + * chore: improve contrib guide for building docs #5312 + * fix(populate): handle init-ing nested virtuals properly #5311 + * fix(update): report update validator error if required path under single nested doc not set + * fix(schema): remove default validate pre hook that was causing issues with jest #4943 + +4.10.4 / 2017-05-29 +=================== + * chore: dont store test data in same directory #5303 + * chore: add data dirs to npmignore #5301 [Starfox64](https://github.com/Starfox64) + * docs(query): add docs about runSettersOnQuery #5300 + +4.10.3 
/ 2017-05-27 +=================== + * docs: correct inconsistent references to updateOne and replaceOne #5297 [dhritzkiv](https://github.com/dhritzkiv) + * docs: fix dropdowns in docs #5292 [nathanallen](https://github.com/nathanallen) + * docs: add description of alias option #5287 + * fix(document): prevent infinite loop if validating nested array #5282 + * fix(schema): correctly handle ref ObjectIds from different mongoose libs #5259 + * fix(schema): load child class methods after base class methods to allow override #5227 + +4.10.2 / 2017-05-22 +=================== + * fix: bump ms -> 2.0.0 and mquery -> 2.3.1 for minor security vulnerability #5275 + +4.10.1 / 2017-05-21 +=================== + * fix(aggregate): handle sorting by text score correctly #5258 + * fix(populate): handle doc.populate() with virtuals #5240 + * fix(schema): enforce that `_id` is never null #5236 + +4.10.0 / 2017-05-18 +=================== + * fix(schema): update clone method to include indexes #5268 [clozanosanchez](https://github.com/clozanosanchez) + * feat(schema): support aliases #5184 [rocketspacer](https://github.com/rocketspacer) + * feat(aggregate): add mongoose-specific aggregation cursor option #5145 + * refactor(model): make sharding into a plugin instead of core #5105 + * fix(document): make nested doc mongoose internals not enumerable again #5078 + * feat(model): pass params to pre hooks #5064 + * feat(timestamps): support already defined timestamp paths in schema #4868 + * feat(query): add runSettersOnQuery option #4569 + * fix(query): add strictQuery option that throws when not querying on field not in schema #4136 + * fix(update): more complete handling for overwrite option with update validators #3556 + * feat: support `unique: true` in arrays via the mongoose-unique-array plugin #3347 + * fix(model): always emit 'index', even if no indexes #3347 + * fix(schema): set unique indexes on primitive arrays #3347 + * feat(validation): include failed paths in error message and inspect output #3064 #2135 + * fix(model): return saved docs when create() fails #2190 + +4.9.10 / 2017-05-17 +=================== + * fix(connection): ensure callback arg to openSet() is handled properly #5249 + * docs: remove dead plugins repo and add content links #5247 + * fix(model): skip index build if connecting after model init and autoIndex false #5176 + +4.9.9 / 2017-05-13 +================== + * docs: correct value for Query#regex() #5230 + * fix(connection): don't throw if .catch() on open() promise #5229 + * fix(schema): allow update with $currentDate for updatedAt to succeed #5222 + * fix(model): versioning doesn't fail if version key undefined #5221 [basileos](https://github.com/basileos) + * fix(document): don't emit model error if callback specified for consistency with docs #5216 + * fix(document): handle errors in subdoc pre validate #5215 + +4.9.8 / 2017-05-07 +================== + * docs(subdocs): rewrite subdocs guide #5217 + * fix(document): avoid circular JSON if error in doc array under single nested subdoc #5208 + * fix(document): set intermediate empty objects for deeply nested undefined paths before path itself #5206 + * fix(schema): throw error if first param to schema.plugin() is not a function #5201 + * perf(document): major speedup in validating subdocs (50x in some cases) #5191 + +4.9.7 / 2017-04-30 +================== + * docs: fix typo #5204 [phutchins](https://github.com/phutchins) + * fix(schema): ensure correct path for deeply nested schema indexes #5199 + * fix(schema): make remove a 
reserved name #5197 + * fix(model): handle Decimal type in insertMany correctly #5190 + * fix: upgrade kareem to handle async pre hooks correctly #5188 + * docs: add details about unique not being a validator #5179 + * fix(validation): handle returning a promise with isAsync: true #5171 + +4.9.6 / 2017-04-23 +================== + * fix: update `parentArray` references when directly assigning document arrays #5192 [jhob](https://github.com/jhob) + * docs: improve schematype validator docs #5178 [milesbarr](https://github.com/milesbarr) + * fix(model): modify discriminator() class in place #5175 + * fix(model): handle bulkWrite updateMany casting #5172 [tzellman](https://github.com/tzellman) + * docs(model): fix replaceOne example for bulkWrite #5168 + * fix(document): don't create a new array subdoc when creating schema array #5162 + * fix(model): merge query hooks from discriminators #5147 + * fix(document): add parent() function to subdocument to match array subdoc #5134 + +4.9.5 / 2017-04-16 +================== + * fix(query): correct $pullAll casting of null #5164 [Sebmaster](https://github.com/Sebmaster) + * docs: add advanced schemas docs for loadClass #5157 + * fix(document): handle null/undefined gracefully in applyGetters() #5143 + * fix(model): add resolveToObject option for mapReduce with ES6 promises #4945 + +4.9.4 / 2017-04-09 +================== + * fix(schema): clone query middleware correctly #5153 #5141 [clozanosanchez](https://github.com/clozanosanchez) + * docs(aggregate): fix typo #5142 + * fix(query): cast .$ update to underlying array type #5130 + * fix(populate): don't mutate populate result in place #5128 + * fix(query): handle $setOnInsert consistent with $set #5126 + * docs(query): add strict mode option for findOneAndUpdate #5108 + +4.9.3 / 2017-04-02 +================== + * docs: document.js fixes for functions prepended with `$` #5131 [krmannix](https://github.com/krmannix) + * fix: Avoid exception on constructor check #5129 [monkbroc](https://github.com/monkbroc) + * docs(schematype): explain how to use `isAsync` with validate() #5125 + * docs(schematype): explain custom message with required function #5123 + * fix(populate): only apply refPath duplicate id optimization if not array #5114 + * fix(document): copy non-objects to doc when init() #5111 + * perf(populate): dont clone whole options every time #5103 + * feat(document): add isDirectSelected() to minimize isSelected() changes #5063 + * docs(schematypes): explain some subtleties with arrays #5059 + +4.9.2 / 2017-03-26 +================== + * fix(discriminator): handle class names consistently #5104 + * fix(schema): make clone() work with reusing discriminator schemas #5098 + * fix(querycursor): run pre find hooks with .cursor() #5096 + * fix(connection): throw error if username:password includes @ or : #5091 + * fix(timestamps): handle overwriting createdAt+updatedAt consistently #5088 + * fix(document): ensure subdoc post save runs after parent save #5085 + * docs(model): improve update docs #5076 [bertolo1988](https://github.com/bertolo1988) + +4.9.1 / 2017-03-19 +================== + * fix(query): handle $type for arrays #5080 #5079 [zoellner](https://github.com/zoellner) + * fix(model): handle ordered param for `insertMany` validation errors #5072 [sjorssnoeren](https://github.com/sjorssnoeren) + * fix(populate): avoid duplicate ids in dynref queries #5054 + * fix(timestamps): dont set timestamps in update if user set it #5045 + * fix(update): dont double-call setters on arrays #5041 + * fix: upgrade 
driver -> 2.2.25 for jest fix #5033 + * fix(model): get promise each time save() is called rather than once #5030 + * fix(connection): make connect return value consistent #5006 + +4.9.0 / 2017-03-13 +================== + * feat(document): return this from `depopulate()` #5027 + * fix(drivers): stop emitting timeouts as errors #5026 + * feat(schema): add a clone() function for schemas #4983 + * feat(query): add rawResult option to replace passRawResult, deprecate passRawResult #4977 #4925 + * feat(schematype): support isAsync validator option and handle returning promises from validators, deprecate implicit async validators #4290 + * feat(query): add `replaceOne()`, `deleteOne()`, `deleteMany()` #3998 + * feat(model): add `bulkWrite()` #3998 +
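As a quick illustration of the `bulkWrite()` helper that the 4.9.0 entries above mention, here is a minimal, hypothetical sketch; the `Task` model and its fields are invented for the example, and an already-open connection is assumed:

```js
// Hypothetical sketch of Model.bulkWrite() (listed under mongoose 4.9.0):
// several write operations are sent to MongoDB in a single batch.
const mongoose = require('mongoose');

const Task = mongoose.model('Task', new mongoose.Schema({
  title: String,
  done: Boolean
}));

Task.bulkWrite([
  { insertOne: { document: { title: 'write report', done: false } } },
  { updateOne: { filter: { title: 'buy milk' }, update: { $set: { done: true } } } },
  { deleteMany: { filter: { done: true } } }
]).then(function (result) {
  // result is the MongoDB driver's bulk write result
  console.log(result.insertedCount, result.modifiedCount, result.deletedCount);
});
```

Batching the operations avoids one round trip per write, which is the main reason to reach for `bulkWrite()` instead of issuing separate `create()` and `update()` calls.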
+4.8.7 / 2017-03-12 +================== + * fix(model): if last arg in spread is falsy, treat it as a callback #5061 + * fix(document): use $hook instead of hook to enable 'hook' as a path name #5047 + * fix(populate): dont select foreign field if parent field is selected #5037 + * fix(populate): handle passing no args to query.populate #5036 + * fix(update): use correct method for casting nested arrays #5032 + * fix(discriminator): handle array discriminators when casting $push #5009 + +4.8.6 / 2017-03-05 +================== + * docs(document): remove text that implies that transform is false by default #5023 + * fix(applyHooks): dont wrap a function if it is already wrapped #5019 + * fix(document): ensure nested docs' toObject() clones #5008 + +4.8.5 / 2017-02-25 +================== + * fix: check for empty schemaPath before accessing property $isMongooseDocumentArray #5017 [randyhoulahan](https://github.com/randyhoulahan) + * fix(discriminators): handle create() and push() for embedded discriminators #5001 + * fix(querycursor): ensure close emitted after last data event #4998 + * fix(discriminators): remove fields not selected in child when querying by base model #4991 + +4.8.4 / 2017-02-19 +================== + * docs(discriminators): explain embedded discriminators #4997 + * fix(query): fix TypeError when findOneAndUpdate errors #4990 + * fix(update): handle nested single embedded in update validators correctly #4989 + * fix(browser): make browser doc constructor not crash #4987 + +4.8.3 / 2017-02-15 +================== + * chore: upgrade mongodb driver -> 2.2.24 + * docs(connections): add some details about callbacks #4986 + * fix: ensure class is created with new keyword #4972 #4947 [benhjames](https://github.com/benhjames) + * fix(discriminator): add applyPluginsToDiscriminators option #4965 + * fix(update): properly cast array subdocs when casting update #4960 + * fix(populate): ensure foreign field is selected for virtual populate #4959 + * docs(query): document some query callback params #4949 + * fix(document): ensure errors in validators get caught #2185 + +4.8.2 / 2017-02-10 +================== + * fix(update): actually run validators on addToSet #4953 + * fix(update): improve buffer error handling #4944 [ValYouW](https://github.com/ValYouW) + * fix(discriminator): handle subclassing with loadClass correctly #4942 + * fix(query): allow passing Map to sort() #4941 + * fix(document): handle setting discriminator doc #4935 + * fix(schema): return correct value from pre init hook #4928 + * fix(query): ensure consistent params in error handlers if pre hook errors #4927 + +4.8.1 / 2017-01-30 +================== + * fix(query): handle $exists for arrays and embedded docs #4937 + * fix(query): handle passing string to hint() #4931 + +4.8.0 / 2017-01-28 +================== + * feat(schema): add saveErrorIfNotFound option and $where property #4924 #4004 + * feat(query): add $in implicitly if passed an array #4912 [QuotableWater7](https://github.com/QuotableWater7) + * feat(aggregate): helper for $facet #4904 [varunjayaraman](https://github.com/varunjayaraman) + * feat(query): add collation method #4839 + * feat(schema): propagate strict option to implicit array subschemas #4831 [dkrosso](https://github.com/dkrosso) + * feat(aggregate): add helper for graphLookup #4819 [varunjayaraman](https://github.com/varunjayaraman) + * feat(types): support Decimal128 #4759 + * feat(aggregate): add eachAsync() to aggregate cursor #4300 + * feat(query): add updateOne and updateMany #3997 + * feat(model): support options for insertMany #3893 + * fix(document): run validation on single nested docs if not directly modified #3884 + * feat(model): use discriminator constructor based on discriminatorKey in create() #3624 + * feat: pass collection as context to debug function #3261 + * feat(query): support push and addToSet for update validators #2933 + * perf(document): refactor registerHooksFromSchema so hooks are defined on doc prototype #2754 + * feat(types): add discriminator() function to doc arrays #2723 #1856 + * fix(populate): return an error if sorting underneath a doc array #2202 +
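Two of the 4.8.0 additions above, the `Decimal128` type and the `collation()` query helper, can be sketched roughly as follows. The `Invoice` model is hypothetical, an open connection is assumed, and collations require MongoDB 3.4 or newer:

```js
// Hypothetical sketch of the Decimal128 schema type and the collation()
// query helper, both listed under 4.8.0.
const mongoose = require('mongoose');

const Invoice = mongoose.model('Invoice', new mongoose.Schema({
  customer: String,
  total: mongoose.Schema.Types.Decimal128 // exact decimal, useful for money
}));

// Case-insensitive lookup using a collation instead of a regex.
Invoice.find({ customer: 'acme corp' })
  .collation({ locale: 'en', strength: 2 })
  .exec(function (err, invoices) {
    if (err) return console.error(err);
    console.log(invoices.length);
  });
```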
+4.7.9 / 2017-01-27 +================== + * fix(query): handle casting $exists under $not #4933 + * chore: upgrade mongodb -> 2.2.22 re: #4931 + +4.7.8 / 2017-01-23 +================== + * fix(populate): better handling for virtual populate under arrays #4923 + * docs: upgrade contributors count #4918 [AdamZaczek](https://github.com/AdamZaczek) + * fix(query): don't set nested path default if setting parent path #4911 + * docs(promise): add missing bracket #4907 + * fix(connection): ensure error handling is consistently async #4905 + * fix: handle authMechanism in query string #4900 + * fix(document): ensure error handlers run for validate #4885 + +4.7.7 / 2017-01-15 +================== + * fix(utils): don't crash if to[key] is null #4881 + * fix: upgrade mongodb -> 2.2.21 #4867 + * fix: add a toBSON to documents for easier querying #4866 + * fix: suppress bluebird warning #4854 [davidwu226](https://github.com/davidwu226) + * fix(populate): handle nested virtuals in virtual populate #4851 + +4.7.6 / 2017-01-02 +================== + * fix(model): allow passing non-array to insertMany #4846 + * fix(populate): use base model name if no discriminator for backwards compat #4843 + * fix: allow internal validate callback to be optional #4842 [arciisine](https://github.com/arciisine) + * fix(document): don't skip pointCut if save not defined (like in browser doc) #4841 + * chore: improve benchmarks #4838 [billouboq](https://github.com/billouboq) + * perf: remove some unused parameters #4837 [billouboq](https://github.com/billouboq) + * fix(query): don't call error handler if passRawResult is true and no error occurred #4836 + +4.7.5 / 2016-12-26 +================== + * docs(model): fix spelling mistake #4828 [paulinoj](https://github.com/paulinoj) + * fix(aggregate): remove unhandled rejection when using aggregate.then() #4824 + * perf: remove try/catch that kills optimizer #4821 + * fix(model): handles populating with discriminators that may not have a ref #4817 + * fix(document): handle setting array of discriminators #3575 + +4.7.4 / 2016-12-21 +================== + * docs: fix typo #4810 [GEEKIAM](https://github.com/GEEKIAM) + * fix(query): timestamps with $push + $each #4805 + * fix(document): handle buffers correctly in minimize #4800 + * fix: don't disallow overwriting default and cast fns #4795 [pdspicer](https://github.com/pdspicer) + * fix(document): don't convert single nested docs to POJOs #4793 + * fix(connection): handle reconnect to replica set correctly #4972 [gfzabarino](https://github.com/gfzabarino) + +4.7.3 / 2016-12-16 +================== + * fix: upgrade mongodb driver -> 2.2.16 for several bug fixes and 3.4 support #4799 + * fix(model): ensure discriminator key is correct for child schema on discriminator #4790 + * fix(document): handle mark valid in subdocs correctly #4778 + * fix(query): check for objects consistently #4775 + +4.7.2 / 2016-12-07 +================== + * test(populate): fix justOne test #4772 [cblanc](https://github.com/cblanc) + * chore: fix benchmarks #4769 [billouboq](https://github.com/billouboq) + * fix(document): handle setting subdoc to null after setting parent doc #4766 + * fix(query): support passRawResult with lean #4762 #4761 [mhfrantz](https://github.com/mhfrantz) + * fix(query): throw StrictModeError if upsert with nonexisting field in condition #4757 + * test: fix a couple of sort tests #4756 [japod](https://github.com/japod) + * chore: upgrade mongodb driver -> 2.2.12 #4753 [mdlavin](https://github.com/mdlavin) + * fix(query): handle update with upsert and overwrite correctly #4749 + +4.7.1 / 2016-11-30 +================== + * fix(schema): throw error if you use prototype as a schema path #4746 + * fix(schema): throw helpful error if you define a virtual with the same path as a real path #4744 + * fix(connection): make createConnection not throw rejected promises #4742 + * fix(populate): allow specifying options in model schema #4741 + * fix(document): handle selected nested elements with defaults #4739 + * fix(query): add model to cast error if possible #4729 + * fix(query): handle timestamps with overwrite #4054 + +4.7.0 / 2016-11-23 +================== + * docs: clean up schematypes #4732 [kidlj](https://github.com/kidlj) + * perf: only get stack when necessary with VersionError #4726 [Sebmaster](https://github.com/Sebmaster) + * fix(query): ensure correct casting when setting array element #4724 + * fix(connection): ensure db name gets set when you pass 4 params #4721 + * fix: prevent TypeError in node v7 #4719 #4706 + * feat(document): support .set() on virtual subpaths #4716 + * feat(populate): support populate virtuals on nested schemas #4715 + * feat(querycursor): support transform option and .map() #4714 #4705 [cblanc](https://github.com/cblanc) + * fix(document): dont set defaults on not-selected nested paths #4707 + * fix(populate): don't throw if empty string passed to populate #4702 + * feat(model): add `loadClass()` function for importing schema from ES6 class #4668 [rockmacaca](https://github.com/rockmacaca) +
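The `loadClass()` feature listed under 4.7.0 above copies methods, statics, and getters/setters from an ES6 class onto a schema. A minimal, hypothetical sketch, with the class and fields invented for the example:

```js
// Hypothetical sketch of schema.loadClass() (listed under mongoose 4.7.0).
const mongoose = require('mongoose');

const taskSchema = new mongoose.Schema({ title: String, done: Boolean });

class TaskClass {
  markDone() {            // becomes an instance method on documents
    this.done = true;
    return this.save();
  }
  static findOpen() {     // becomes a static on the model
    return this.find({ done: false });
  }
  get label() {           // becomes a virtual getter
    return (this.done ? '[x] ' : '[ ] ') + this.title;
  }
}

taskSchema.loadClass(TaskClass);
const Task = mongoose.model('Task', taskSchema);
```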
+4.6.8 / 2016-11-14 +================== + * fix(querycursor): clear stack when iterating onto next doc #4697 + * fix: handle null keys in validation error #4693 #4689 [arciisine](https://github.com/arciisine) + * fix(populate): handle pre init middleware correctly with populate virtuals #4683 + * fix(connection): ensure consistent return value for open and openSet #4659 + * fix(schema): handle falsy defaults for arrays #4620 + +4.6.7 / 2016-11-10 +================== + * fix(document): only invalidate in subdoc if using update validators #4681 + * fix(document): don't create subdocs when excluded in projection #4669 + * fix(document): ensure single embedded schema validator runs with correct context #4663 + * fix(document): make sure to depopulate top level for sharding #4658 + * fix(connection): throw more helpful error when .model() called incorrectly #4652 + * fix(populate): throw more descriptive error when trying to populate a virtual that doesn't have proper options #4602 + * fix(document): ensure subtype gets set properly when saving with a buffer id #4506 + * fix(query): handle setDefaultsOnInsert with defaults on doc arrays #4456 + * fix(drivers): make debug output better by calling toBSON() #4356 + +4.6.6 / 2016-11-03 +================== + * chore: upgrade deps #4674 [TrejGun](https://github.com/TrejGun) + * chore: run tests on node v7 #4673 [TrejGun](https://github.com/TrejGun) + * perf: make setDefaultsOnInsert more efficient if upsert is off #4672 [CamHenlin](https://github.com/CamHenlin) + * fix(populate): ensure document array is returned #4656 + * fix(query): cast doc arrays with positionals correctly for update #4655 + * fix(document): ensure single nested doc validators run with correct context #4654 + * fix: handle reconnect failed error in new version of driver #4653 [loris](https://github.com/loris) + * fix(populate): if setting a populated doc, take its id #4632 + * fix(populate): handle populated virtuals in init #4618 + +4.6.5 / 2016-10-23 +================== + * docs: fix grammar issues #4642 #4640 #4639 [silvermanj7](https://github.com/silvermanj7) + * fix(populate): filter out nonexistent values for dynref #4637 + * fix(query): handle $type as a schematype operator #4632 + * fix(schema): better handling for uppercase: false and lowercase: false #4622 + * fix(query): don't run transforms on updateForExec() #4621 + * fix(query): handle id = 0 in findById #4610 + * fix(query): handle buffers in mergeClone #4609 + * fix(document): handle undefined with conditional validator for validateSync #4607 + * fix: upgrade to mongodb driver 2.2.11 #4581 + * docs(schematypes): clarify schema.path() #4518 + * fix(query): ensure path is defined before checking in timestamps #4514 + * fix(model): set version key in upsert #4505 + * fix(document): never depopulate top-level doc #3057 + * refactor: ensure sync for setting non-capped collections #2690 + +4.6.4 / 2016-10-16 +================== + * fix(query): cast $not correctly #4616 #4592 [prssn](https://github.com/prssn) + * fix: address issue with caching global plugins #4608 #4601 [TrejGun](https://github.com/TrejGun) + * fix(model): make sure to depopulate in insertMany #4590 + * fix(model): buffer autoIndex if bufferCommands disabled #4589 + * fix(populate): copy ids array before modifying #4585 + * feat(schema): add retainKeyOrder prop #4542 + * fix(document): return isModified true for children of direct modified paths #4528 + * fix(connection): add dropDatabase() helper #4490 + * fix(model): add usePushEach option for schemas #4455 + * docs(connections): add some warnings about buffering #4413 + * fix: add ability to set promise implementation in browser #4395 + +4.6.3 / 2016-10-05 +================== + * fix(document): ensure single nested docs get initialized correctly when setting nested paths #4578 + * fix: turn off transforms when writing nested docs to db #4574 + * fix(document): don't set single nested subdocs to null when removing parent doc #4566 + * fix(model): ensure versionKey gets set in insertMany #4561 + * fix(schema): handle typeKey in arrays #4548 + * feat(schema): set $implicitlyCreated on schema if created by interpretAsType #4443 +
+4.6.2 / 2016-09-30 +================== + * chore: upgrade to async 2.0.1 internally #4579 [billouboq](https://github.com/billouboq) + * fix(types): ensure nested single doc schema errors reach update validators #4557 #4519 + * fix(connection): handle rs names with leading numbers (muri 1.1.1) #4556 + * fix(model): don't throw if method name conflicts with Object.prototype prop #4551 + * docs: fix broken link #4544 [VFedyk](https://github.com/VFedyk) + * fix: allow overwriting model on mongoose singleton #4541 [Nainterceptor](https://github.com/Nainterceptor) + * fix(document): don't use init: true when building doc defaults #4540 + * fix(connection): use replSet option if replset not specified #4535 + * fix(query): cast $not objects #4495 + +4.6.1 / 2016-09-20 +================== + * fix(query): improve handling of $not with $elemMatch #4531 #3719 [timbowhite](https://github.com/timbowhite) + * fix: upgrade mongodb -> 2.2.10 #4517 + * chore: fix webpack build issue #4512 [saiichihashimoto](https://github.com/saiichihashimoto) + * fix(query): emit error on next tick when exec callback errors #4500 + * test: improve test case #4496 [isayme](https://github.com/isayme) + * fix(schema): use same check for array types and top-level types #4493 + * style: fix indentation in docs #4489 [dhurlburtusa](https://github.com/dhurlburtusa) + * fix(schema): expose original object passed to constructor #4486 + * fix(query): handle findOneAndUpdate with array of arrays #4484 #4470 [fedotov](https://github.com/fedotov) + * feat(document): add $ignore to make a path ignored #4480 + * fix(query): properly handle setting single embedded in update #4475 #4466 #4465 + * fix(updateValidators): handle single nested schema subpaths correctly #4479 + * fix(model): throw handy error when method name conflicts with property name #4475 + * fix(schema): handle .set() with array field #4472 + * fix(query): check nested path when avoiding double-validating Mixed #4441 + * fix(schema): handle calling path.trim() with no args correctly #4042 + +4.6.0 / 2016-09-02 +================== + * docs(document): clarify the findById and findByIdAndUpdate examples #4471 [mdcanham](https://github.com/mdcanham) + * docs(schematypes): add details re: options #4452 + * docs(middleware): add docs for insertMany hooks #4451 + * fix(schema): create new array when copying from existing object to preserve change tracking #4449 + * docs: fix typo in index.jade #4448 + * fix(query): allow array for populate options #4446 + * fix(model): create should not cause unhandled promise rejection #4439 + * fix: upgrade to mongodb driver 2.2.9 #4363 #4341 #4311 (see [comments here](https://github.com/mongodb/js-bson/commit/aa0b54597a0af28cce3530d2144af708e4b66bf0#commitcomment-18850498) if you use node 0.10) + +4.5.10 / 2016-08-23 +=================== + * docs: fix typo on documents.jade #4444 [Gabri3l](https://github.com/Gabri3l) + * chore: upgrade mocha to 3.0.2 #4437 [TrejGun](https://github.com/TrejGun) + * fix: subdocuments causing error with parent timestamp on update #4434 [dyang108](https://github.com/dyang108) + * fix(query): don't crash if timestamps on and update doesn't have a path #4425 #4424 #4418 + * fix(query): ensure single nested subdoc is hydrated when running update validators #4420 + * fix(query): cast non-$geometry operators for $geoWithin #4419 + * docs: update contributor count #4415 [AdamZaczek](https://github.com/AdamZaczek) + * docs: add more clarification re: the index event #4410 + * fix(document): only skip modifying subdoc path if parent is direct modified #4405 + * 
fix(schema): throw cast error if provided date invalid #4404 + * feat(error): use util.inspect() so CastError never prints "[object Object]" #4398 + * fix(model): dont error if the discriminator key is unchanged #4387 + * fix(query): don't throw unhandled rejection with bluebird when using cbs #4379 + +4.5.9 / 2016-08-14 +================== + * docs: add mixed schema doc for Object literal #4400 [Kikobeats](https://github.com/Kikobeats) + * fix(query): cast $geoWithin and convert mongoose objects to POJOs before casting #4392 + * fix(schematype): dont cast defaults without parent doc #4390 + * fix(query): disallow passing empty string to findOne() #4378 + * fix(document): set single nested doc isNew correctly #4369 + * fix(types): checks field name correctly with nested arrays and populate #4365 + * fix(drivers): make debug output copy-pastable into mongodb shell #4352 + * fix(services): run update validators on nested paths #4332 + * fix(model): handle typeKey with discriminators #4339 + * fix(query): apply timestamps to child schemas when explicitly specified in update #4049 + * fix(schema): set prefix as nested path with add() #1730 + +4.5.8 / 2016-08-01 +================== + * fix(model): make changing the discriminator key cause a cast error #4374 + * fix(query): pass projection fields to cursor #4371 #4342 [Corei13](https://github.com/Corei13) + * fix(document): support multiple paths for isModified #4370 [adambuczynski](https://github.com/adambuczynski) + * fix(querycursor): always cast fields before returning cursor #4355 + * fix(query): support projection as alias for fields in findOneAndUpdate #4315 + * fix(schema): treat index false + unique false as no index #4304 + * fix(types): dont mark single nested subpath as modified if whole doc already modified #4224 + +4.5.7 / 2016-07-25 +================== + * fix(document): ensure no unhandled rejections if callback specified for save #4364 + +4.5.6 / 2016-07-23 +================== + * fix(schema): don't overwrite createdAt if it isn't selected #4351 [tusbar](https://github.com/tusbar) + * docs(api): fix link to populate() and add a new one from depopulate() #4345 [Delapouite](https://github.com/Delapouite) + * fix(types): ownerDocument() works properly with single nested docs #4344 [vichle](https://github.com/vichle) + * fix(populate): dont use findOne when justOne option set #4329 + * fix(document): dont trigger .then() deprecated warning when calling doc.remove() #4291 + * docs(connection): add promiseLibrary option #4280 + * fix(plugins): apply global plugins to subschemas #4271 + * fix(model): ensure `ensureIndex()` never calls back in the same tick #4246 + * docs(schema): improve post hook docs on schema #4238 + +4.5.5 / 2016-07-18 +================== + * fix(document): handle setting root to empty obj if minimize false #4337 + * fix: downgrade to mongodb 2.1.18 #4335 #4334 #4328 #4323 + * perf(types): remove defineProperty usage in documentarray #4333 + * fix(query): correctly pass model in .toConstructor() #4318 + * fix(services): avoid double-validating mixed types with update validators #4305 + * docs(middleware): add docs describing error handling middleware #4229 + * fix(types): throw correct error when invalidating doc array #3602 + +4.5.4 / 2016-07-11 +================== + * fix(types): fix removing embedded documents #4309 [RoCat](https://github.com/RoCat) + * docs: various docs improvements #4302 #4294 [simonxca](https://github.com/simonxca) + * fix: upgrade mongodb -> 2.1.21 #4295 #4202 
[RoCat](https://github.com/RoCat) + * fix(populate): convert single result to array for virtual populate because of lean #4288 + * fix(populate): handle empty results for populate virtuals properly #4285 #4284 + * fix(query): dont cast $inc to number if type is long #4283 + * fix(types): allow setting single nested doc to null #4281 + * fix(populate): handle deeply nested virtual populate #4278 + * fix(document): allow setting empty obj if strict mode is false #4274 + * fix(aggregate): allow passing obj to .unwind() #4239 + * docs(document): add return statements to transform examples #1963 + +4.5.3 / 2016-06-30 +================== + * fix(query): pass correct options to QueryCursor #4277 #4266 + * fix(querycursor): handle lean option correctly #4276 [gchudnov](https://github.com/gchudnov) + * fix(document): fix error handling when no error occurred #4275 + * fix(error): use strict mode for version error #4272 + * docs(populate): fix crashing compilation for populate.jade #4267 + * fix(populate): support `justOne` option for populate virtuals #4263 + * fix(populate): ensure model param gets used for populate virtuals #4261 #4243 + * fix(querycursor): add ability to properly close the cursor #4258 + * docs(model): correct link to Document #4250 + * docs(populate): correct path for refPath populate #4240 + * fix(document): support validator.isEmail as validator #4064 + +4.5.2 / 2016-06-24 +================== + * fix(connection): add checks for collection presence for `onOpen` and `onClose` #4259 [nodkz](https://github.com/nodkz) + * fix(cast): allow strings for $type operator #4256 + * fix(querycursor): support lean() #4255 [pyramation](https://github.com/pyramation) + * fix(aggregate): allow setting noCursorTimeout option #4241 + * fix(document): handle undefined for Array.pull #4222 [Sebmaster](https://github.com/Sebmaster) + * fix(connection): ensure promise.catch() catches initial connection error #4135 + * fix(document): show additional context for VersionError #2633 + +4.5.1 / 2016-06-18 +================== + * fix(model): ensure wrapped insertMany() returns a promise #4237 + * fix(populate): dont overwrite populateVirtuals when populating multiple paths #4234 + * docs(model): clarify relationship between create() and save() #4233 + * fix(types): handle option param in subdoc remove() #4231 [tdebarochez](https://github.com/tdebarochez) + * fix(document): dedupe modified paths #4226 #4223 [adambuczynski](https://github.com/adambuczynski) + * fix(model): don't modify user-provided options object #4221 + * fix(document): handle setting nested path to empty object #4218 #4182 + * fix(document): clean subpaths when removing single nested #4216 + * fix(document): don't force transform on subdocs with inspect #4213 + * fix(error): allow setting .messages object #4207 + +4.5.0 / 2016-06-13 +================== + * feat(query): added Query.prototype.catch() #4215 #4173 [adambuczynski](https://github.com/adambuczynski) + * feat(query): add Query.prototype.cursor() as a .stream() alternative #4117 #3637 #1907 + * feat(document): add markUnmodified() function #4092 [vincentcr](https://github.com/vincentcr) + * feat(aggregate): convert aggregate object to a thenable #3995 #3946 [megagon](https://github.com/megagon) + * perf(types): remove defineProperties call for array (**Note:** Because of this, a mongoose array will no longer `assert.deepEqual()` a plain old JS array) #3886 + * feat(model): add hooks for insertMany() #3846 + * feat(schema): add support for custom query methods #3740 #2372 + * 
feat(drivers): emit error on 'serverClosed' because that means that reconnect failed #3615 + * feat(model): emit error event when callback throws exception #3499 + * feat(model): inherit options from discriminator base schema #3414 #1818 + * feat(populate): expose mongoose-populate-virtuals inspired populate API #2562 + * feat(document): trigger remove hooks on subdocs when removing parent #2348 + * feat(schema): add support for express-style error handling middleware #2284 + * fix(model): disallow setting discriminator key #2041 + * feat(schema): add support for nested arrays #1361 + +4.4.20 / 2016-06-05 +=================== + * docs: clarify command buffering when using driver directly #4195 + * fix(promise): correct broken mpromise .catch() #4189 + * fix(document): clean modified subpaths when set path to empty obj #4182 + * fix(query): support minDistance with query casting and `.near()` #4179 + * fix(model): remove unnecessary .save() promise #4177 + * fix(schema): cast all valid ObjectId strings to object ids #3365 + * docs: remove unclear "unsafe" term in query docs #3282 + +4.4.19 / 2016-05-21 +=================== + * fix(model): handle insertMany if timestamps not set #4171 + +4.4.18 / 2016-05-21 +=================== + * docs: add missing period #4170 [gitname](https://github.com/gitname) + * docs: change build badge to svg #4158 [a0viedo](https://github.com/a0viedo) + * fix(model): update timestamps when setting `createdAt` #4155 + * fix(utils): make sure to require in document properly #4152 + * fix(model): throw overwrite error when discriminator name conflicts #4148 + +4.4.17 / 2016-05-13 +=================== + * docs: remove repetition in QueryStream docs #4147 [hugoabonizio](https://github.com/hugoabonizio) + * fix(document): dont double-validate doc array elements #4145 + * fix(document): call required function with correct scope #4142 [JedWatson](https://github.com/JedWatson) + +4.4.16 / 2016-05-09 +=================== + * refactor(document): use function reference #4133 [dciccale](https://github.com/dciccale) + * docs(querystream): clarify `destroy()` and close event #4126 [AnthonyCC](https://github.com/AnthonyCC) + * test: make before hook fail fast if it can't connect #4121 + * docs: add description of CastError constructor params #4120 + * fix(schematype): ensure single embedded defaults have $parent #4115 + * fix(document): mark nested paths for validation #4111 + * fix(schema): make sure element is always a subdoc in doc array validation #3816 + +4.4.15 / 2016-05-06 +=================== + * fix(schema): support overwriting array default #4109 + * fix(populate): assign values when resolving each populate #4104 + * fix(aggregate): dont send async option to server #4101 + * fix(model): ensure isNew set to false after insertMany #4099 + * fix(connection): emit on error if listeners and no callback #4098 + * fix(document): treat required fn that returns false as required: false #4094 + +4.4.14 / 2016-04-27 +=================== + * fix: upgrade mongodb -> 2.1.18 #4102 + * feat(connection): allow setting mongos as a uri query param #4093 #4035 [burtonjc](https://github.com/burtonjc) + * fix(populate): make sure to use correct assignment order for each model #4073 + * fix(schema): add complete set of geospatial operators for single embedded subdocs #4014 + +3.8.40 / 2016-04-24 +=================== + * upgraded; mquery -> 1.10.0 #3989 + +4.4.13 / 2016-04-21 +=================== + * docs: add docs favicons #4082 [robertjustjones](https://github.com/robertjustjones) + * 
docs(model): correct Model.remove() return value #4075 [Jokero](https://github.com/Jokero) + * fix(query): add $geoWithin query casting for single embedded docs #4044 + * fix(schema): handle setting trim option to falsy #4042 + * fix(query): handle setDefaultsOnInsert with empty update #3835 + +4.4.12 / 2016-04-08 +=================== + * docs(query): document context option for update and findOneAndUpdate #4055 + * docs(query): correct link to $geoWithin docs #4050 + * fix(project): upgrade to mongodb driver 2.1.16 #4048 [schmalliso](https://github.com/schmalliso) + * docs(validation): fix validation docs #4028 + * fix(types): improve .id() check for document arrays #4011 + * fix(query): remove premature return when using $rename #3171 + * docs(connection): clarify relationship between models and connections #2157 + +4.4.11 / 2016-04-03 +=================== + * fix: upgrade to mongodb driver 2.1.14 #4036 #4030 #3945 + * fix(connection): allow connecting with { mongos: true } to handle query params #4032 [burtonjc](https://github.com/burtonjc) + * docs(connection): add autoIndex example #4026 [tilleps](https://github.com/tilleps) + * fix(query): handle passRawResult option when zero results #4023 + * fix(populate): clone options before modifying #4022 + * docs: add missing whitespace #4019 [chenxsan](https://github.com/chenxsan) + * chore: upgrade to ESLint 2.4.0 #4015 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix(types): single nested subdocs get ids by default #4008 + * chore(project): add dependency status badge #4007 [Maheshkumar-Kakade](http://github.com/Maheshkumar-Kakade) + * fix: make sure timestamps don't trigger unnecessary updates #4005 #3991 [tommarien](https://github.com/tommarien) + * fix(document): inspect inherits schema options #4001 + * fix(populate): don't mark populated path as modified if setting to object w/ same id #3992 + * fix(document): support kind argument to invalidate #3965 + +4.4.10 / 2016-03-24 +=================== + * fix(document): copy isNew when copying a document #3982 + * fix(document): don't override defaults with undefined keys #3981 + * fix(populate): merge multiple deep populate options for the same path #3974 + +4.4.9 / 2016-03-22 +================== + * fix: upgrade mongodb -> 2.1.10 re https://jira.mongodb.org/browse/NODE-679 #4010 + * docs: add syntax highlighting for acquit examples #3975 + +4.4.8 / 2016-03-18 +================== + * docs(aggregate): clarify promises #3990 [megagon](https://github.com/megagon) + * fix: upgrade mquery -> 1.10 #3988 [matskiv](https://github.com/matskiv) + * feat(connection): 'all' event for repl sets #3986 [xizhibei](https://github.com/xizhibei) + * docs(types): clarify Array.pull #3985 [seriousManual](https://github.com/seriousManual) + * feat(query): support array syntax for .sort() via mquery 1.9 #3980 + * fix(populate): support > 3 level nested populate #3973 + * fix: MongooseThenable exposes connection correctly #3972 + * docs(connection): add note about reconnectTries and reconnectInterval #3969 + * feat(document): invalidate returns the new validationError #3964 + * fix(query): .eq() as shorthand for .equals #3953 [Fonger](https://github.com/Fonger) + * docs(connection): clarify connection string vs passed options #3941 + * docs(query): select option for findOneAndUpdate #3933 + * fix(error): ValidationError.properties no longer enumerable #3925 + * docs(validation): clarify how required validators work with nested schemas #3915 + * fix: upgrade mongodb driver -> 2.1.8 to make partial 
index errors more sane #3864 + +4.4.7 / 2016-03-11 +================== + * fix(query): stop infinite recursion caused by merging a mongoose buffer #3961 + * fix(populate): handle deep populate array -> array #3954 + * fix(schema): allow setting timestamps with .set() #3952 #3951 #3907 [Fonger](https://github.com/Fonger) + * fix: MongooseThenable doesn't overwrite constructors #3940 + * fix(schema): don't cast boolean to date #3935 + * fix(drivers): support sslValidate in connection string #3929 + * fix(types): correct markModified() for single nested subdocs #3910 + * fix(drivers): catch and report any errors that occur in driver methods #3906 + * fix(populate): get subpopulate model correctly when array under nested #3904 + * fix(document): allow fields named 'pre' and 'post' #3902 + * docs(query): clarify runValidators and setDefaultsOnInsert options #3892 + * docs(validation): show how to use custom required messages in schema #2616 + +4.4.6 / 2016-03-02 +================== + * fix: upgrade mongodb driver to 2.1.7 #3938 + * docs: fix plugins link #3917 #3909 [fbertone](https://github.com/fbertone) + * fix(query): sort+select with count works #3914 + * fix(query): improve mergeUpdate's ability to handle nested docs #3890 + +4.4.5 / 2016-02-24 +================== + * fix(query): ability to select a length field (upgrade to mquery 1.7.0) #3903 + * fix: include nested CastError as reason for array CastError #3897 [kotarou3](https://github.com/kotarou3) + * fix(schema): check for doc existence before taking fields #3889 + * feat(schema): useNestedStrict option to take nested strict mode for update #3883 + * docs(validation): clarify relationship between required and checkRequired #3822 + * docs(populate): dynamic reference docs #3809 + * docs: expand dropdown when clicking on file name #3807 + * docs: plugins.mongoosejs.io is up #3127 + * fix(schema): ability to add a virtual with same name as removed path #2398 + +4.4.4 / 2016-02-17 +================== + * fix(schema): handle field selection when casting single nested subdocs #3880 + * fix(populate): populating using base model with multiple child models in result #3878 + * fix: ability to properly use return value of `mongoose.connect()` #3874 + * fix(populate): dont hydrate populated subdoc if lean option set #3873 + * fix(connection): dont re-auth if already connected with useDb #3871 + * docs: cover how to set underlying driver's promise lib #3869 + * fix(document): handle conflicting names in validation errors with subdocs #3867 + * fix(populate): set undefined instead of null consistently when populate couldn't find results #3859 + * docs: link to `execPopulate()` in `doc.populate()` docs #3836 + * docs(plugin): link to the `mongoose.plugin()` function #3732 + +4.4.3 / 2016-02-09 +================== + * fix: upgrade to mongodb 2.1.6 to remove kerberos log output #3861 #3860 [cartuchogl](https://github.com/cartuchogl) + * fix: require('mongoose') is no longer a pseudo-promise #3856 + * fix(query): update casting for single nested docs #3820 + * fix(populate): deep populating multiple paths with same options #3808 + * docs(middleware): clarify save/validate hook order #1149 + +4.4.2 / 2016-02-05 +================== + * fix(aggregate): handle calling .cursor() with no options #3855 + * fix: upgrade mongodb driver to 2.1.5 for GridFS memory leak fix #3854 + * docs: fix schematype.html conflict #3853 #3850 #3843 + * fix(model): bluebird unhandled rejection with ensureIndexes() on init #3837 + * docs: autoIndex option for createConnection 
#3805 + +4.4.1 / 2016-02-03 +================== + * fix: linting broke some cases where we use `== null` as shorthand #3852 + * docs: fix up schematype.html conflict #3848 #3843 [mynameiscoffey](https://github.com/mynameiscoffey) + * fix: backwards breaking change with `.connect()` return value #3847 + * docs: downgrade dox and highlight.js to fix docs build #3845 + * docs: clean up typo #3842 [Flash-](https://github.com/Flash-) + * fix(document): storeShard handles undefined values #3841 + * chore: more linting #3838 [TrejGun](https://github.com/TrejGun) + * fix(schema): handle `text: true` as a way to declare a text index #3824 + +4.4.0 / 2016-02-02 +================== + * docs: fix expireAfterSeconds index option name #3831 [Flash-](https://github.com/Flash-) + * chore: run lint after test #3829 [ChristianMurphy](https://github.com/ChristianMurphy) + * chore: use power-assert instead of assert #3828 [TrejGun](https://github.com/TrejGun) + * chore: stricter lint #3827 [TrejGun](https://github.com/TrejGun) + * feat(types): casting moment to date #3813 [TrejGun](https://github.com/TrejGun) + * chore: comma-last lint for test folder #3810 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix: upgrade async mpath, mpromise, muri, and sliced #3801 [TrejGun](https://github.com/TrejGun) + * fix(query): geo queries now return proper ES2015 promises #3800 [TrejGun](https://github.com/TrejGun) + * perf(types): use `Object.defineProperties()` for array #3799 [TrejGun](https://github.com/TrejGun) + * fix(model): mapReduce, ensureIndexes, remove, and save properly return ES2015 promises #3795 #3628 #3595 [TrejGun](https://github.com/TrejGun) + * docs: fixed dates in History.md #3791 [Jokero](https://github.com/Jokero) + * feat: connect, open, openSet, and disconnect return ES2015 promises #3790 #3622 [TrejGun](https://github.com/TrejGun) + * feat: custom type for int32 via mongoose-int32 npm package #3652 #3102 + * feat: basic custom schema type API #995 + * feat(model): `insertMany()` for more performant bulk inserts #723 + +4.3.7 / 2016-01-23 +================== + * docs: grammar fix in timestamps docs #3786 [zclancy](https://github.com/zclancy) + * fix(document): setting nested populated docs #3783 [slamuu](https://github.com/slamuu) + * fix(document): don't call post save hooks twice for pushed docs #3780 + * fix(model): handle `_id=0` correctly #3776 + * docs(middleware): async post hooks #3770 + * docs: remove confusing sentence #3765 [marcusmellis89](https://github.com/marcusmellis89) + +3.8.39 / 2016-01-15 +=================== + * fixed; casting a number to a buffer #3764 + * fixed; enumerating virtual property with nested objects #3743 [kusold](https://github.com/kusold) + +4.3.6 / 2016-01-15 +================== + * fix(types): casting a number to a buffer #3764 + * fix: add "listener" to reserved keywords #3759 + * chore: upgrade uglify #3757 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix: broken execPopulate() in 4.3.5 #3755 #3753 + * fix: ability to remove() a single embedded doc #3754 + * style: comma-last in test folder #3751 [ChristianMurphy](https://github.com/ChristianMurphy) + * docs: clarify versionKey option #3747 + * fix: improve colorization for arrays #3744 [TrejGun](https://github.com/TrejGun) + * fix: webpack build #3713 + +4.3.5 / 2016-01-09 +================== + * fix(query): throw when 4th parameter to update not a function #3741 [kasselTrankos](https://github.com/kasselTrankos) + * fix(document): separate error type for setting an object to a 
primitive #3735 + * fix(populate): Model.populate returns ES6 promise #3734 + * fix(drivers): re-register event handlers after manual reconnect #3729 + * docs: broken links #3727 + * fix(validation): update validators run array validation #3724 + * docs: clarify the need to use markModified with in-place date ops #3722 + * fix(document): mark correct path as populated when manually populating array #3721 + * fix(aggregate): support for array pipeline argument to append #3718 [dbkup](https://github.com/dbkup) + * docs: clarify `.connect()` callback #3705 + * fix(schema): properly validate nested single nested docs #3702 + * fix(types): handle setting documentarray of wrong type #3701 + * docs: broken links #3700 + * fix(drivers): debug output properly displays '0' #3689 + +3.8.38 / 2016-01-07 +=================== + * fixed; aggregate.append an array #3730 [dbkup](https://github.com/dbkup) + +4.3.4 / 2015-12-23 +================== + * fix: upgrade mongodb driver to 2.1.2 for repl set error #3712 [sansmischevia](https://github.com/sansmischevia) + * docs: validation docs typo #3709 [ivanmaeder](https://github.com/ivanmaeder) + * style: remove unused variables #3708 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix(schema): duck-typing for schemas #3703 [mgcrea](https://github.com/mgcrea) + * docs: connection sample code issue #3697 + * fix(schema): duck-typing for schemas #3693 [mgcrea](https://github.com/mgcrea) + * docs: clarify id schema option #3638 + +4.3.3 / 2015-12-18 +================== + * fix(connection): properly support 'replSet' as well as 'replset' #3688 [taxilian](https://github.com/taxilian) + * fix(document): single nested doc pre hooks called before nested doc array #3687 [aliatsis](https://github.com/aliatsis) + +4.3.2 / 2015-12-17 +================== + * fix(document): .set() into single nested schemas #3686 + * fix(connection): support 'replSet' as well as 'replset' option #3685 + * fix(document): bluebird unhandled rejection when validating doc arrays #3681 + * fix(document): hooks for doc arrays in single nested schemas #3680 + * fix(document): post hooks for single nested schemas #3679 + * fix: remove unused npm module #3674 [sybarite](https://github.com/sybarite) + * fix(model): don't swallow exceptions in nested doc save callback #3671 + * docs: update keepAlive info #3667 [ChrisZieba](https://github.com/ChrisZieba) + * fix(document): strict 'throw' throws a specific mongoose error #3662 + * fix: flakey test #3332 + * fix(query): more robust check for RegExp #2969 + +4.3.1 / 2015-12-11 +================== + * feat(aggregate): `.sample()` helper #3665 + * fix(query): bitwise query operators with buffers #3663 + * docs(migration): clarify `new` option and findByIdAndUpdate #3661 + +4.3.0 / 2015-12-09 +================== + * feat(query): support for mongodb 3.2 bitwise query operators #3660 + * style: use comma-last style consistently #3657 [ChristianMurphy](https://github.com/ChristianMurphy) + * feat: upgrade mongodb driver to 2.1.0 for full MongoDB 3.2 support #3656 + * feat(aggregate): `.lookup()` helper #3532 + +4.2.10 / 2015-12-08 +=================== + * fixed; upgraded marked #3653 [ChristianMurphy](https://github.com/ChristianMurphy) + * docs; cross-db populate #3648 + * docs; update mocha URL #3646 [ojhaujjwal](https://github.com/ojhaujjwal) + * fixed; call close callback asynchronously #3645 + * docs; virtuals.html issue #3644 [Psarna94](https://github.com/Psarna94) + * fixed; single embedded doc casting on init #3642 + * docs; validation docs 
improvements #3640 + +4.2.9 / 2015-12-02 +================== + * docs; defaults docs #3625 + * fix; nested numeric keys causing an embedded document crash #3623 + * fix; apply path getters before virtual getters #3618 + * fix; casting for arrays in single nested schemas #3616 + +4.2.8 / 2015-11-25 +================== + * docs; clean up README links #3612 [ReadmeCritic](https://github.com/ReadmeCritic) + * fix; ESLint improvements #3605 [ChristianMurphy](https://github.com/ChristianMurphy) + * fix; assigning single nested subdocs #3601 + * docs; describe custom logging functions in `mongoose.set()` docs #3557 + +4.2.7 / 2015-11-20 +================== + * fixed; readPreference connection string option #3600 + * fixed; pulling from manually populated arrays #3598 #3579 + * docs; FAQ about OverwriteModelError #3597 [stcruy](https://github.com/stcruy) + * fixed; setting single embedded schemas to null #3596 + * fixed; indexes for single embedded schemas #3594 + * docs; clarify projection for `findOne()` #3593 [gunar](https://github.com/gunar) + * fixed; .ownerDocument() method on single embedded schemas #3589 + * fixed; properly throw casterror for query on single embedded schema #3580 + * upgraded; mongodb driver -> 2.0.49 for reconnect issue fix #3481 + +4.2.6 / 2015-11-16 +================== + * fixed; ability to manually populate an array #3575 + * docs; clarify `isAsync` parameter to hooks #3573 + * fixed; use captureStackTrace if possible instead #3571 + * fixed; crash with buffer and update validators #3565 [johnpeb](https://github.com/johnpeb) + * fixed; update casting with operators overwrite: true #3564 + * fixed; validation with single embedded docs #3562 + * fixed; inline docs inherit parents $type key #3560 + * docs; bad grammar in populate docs #3559 [amaurymedeiros](https://github.com/amaurymedeiros) + * fixed; properly handle populate option for find() #2321 + +3.8.37 / 2015-11-16 +=================== + * fixed; use retainKeyOrder for cloning update op #3572 + +4.2.5 / 2015-11-09 +================== + * fixed; handle setting fields in pre update hooks with exec #3549 + * upgraded; ESLint #3547 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; bluebird unhandled rejections with cast errors and .exec #3543 + * fixed; min/max validators handling undefined #3539 + * fixed; standalone mongos connections #3537 + * fixed; call `.toObject()` when setting a single nested doc #3535 + * fixed; single nested docs now have methods #3534 + * fixed; single nested docs with .create() #3533 #3521 [tusbar](https://github.com/tusbar) + * docs; deep populate docs #3528 + * fixed; deep populate schema ref handling #3507 + * upgraded; mongodb driver -> 2.0.48 for sort overflow issue #3493 + * docs; clarify default ids for discriminators #3482 + * fixed; properly support .update(doc) #3221 + +4.2.4 / 2015-11-02 +================== + * fixed; upgraded `ms` package for security vulnerability #3524 [fhemberger](https://github.com/fhemberger) + * fixed; ESlint rules #3517 [ChristianMurphy](https://github.com/ChristianMurphy) + * docs; typo in aggregation docs #3513 [rafakato](https://github.com/rafakato) + * fixed; add `dontThrowCastError` option to .update() for promises #3512 + * fixed; don't double-cast buffers in node 4.x #3510 #3496 + * fixed; population with single embedded schemas #3501 + * fixed; pre('set') hooks work properly #3479 + * docs; promises guide #3441 + +4.2.3 / 2015-10-26 +================== + * docs; remove unreferenced function in middleware.jade #3506 + * fixed; 
handling auth with no username/password #3500 #3498 #3484 [mleanos](https://github.com/mleanos) + * fixed; more ESlint rules #3491 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; swallowing exceptions in save callback #3478 + * docs; fixed broken links in subdocs guide #3477 + * fixed; casting booleans to numbers #3475 + * fixed; report CastError for subdoc arrays in findOneAndUpdate #3468 + * fixed; geoNear returns ES6 promise #3458 + +4.2.2 / 2015-10-22 +================== + * fixed; go back to old pluralization code #3490 + +4.2.1 / 2015-10-22 +================== + * fixed; pluralization issues #3492 [ChristianMurphy](https://github.com/ChristianMurphy) + +4.2.0 / 2015-10-22 +================== + * added; support for skipVersioning for document arrays #3467 [chazmo03](https://github.com/chazmo03) + * added; ability to customize schema 'type' key #3459 #3245 + * fixed; writeConcern for index builds #3455 + * added; emit event when individual index build starts #3440 [objectiveSee](https://github.com/objectiveSee) + * added; 'context' option for update validators #3430 + * refactor; pluralization now in separate pluralize-mongoose npm module #3415 [ChristianMurphy](https://github.com/ChristianMurphy) + * added; customizable error validation messages #3406 [geronime](https://github.com/geronime) + * added; support for passing 'minimize' option to update #3381 + * added; ability to customize debug logging format #3261 + * added; baseModelName property for discriminator models #3202 + * added; 'emitIndexErrors' option #3174 + * added; 'async' option for aggregation cursor to support buffering #3160 + * added; ability to skip validation for individual save() calls #2981 + * added; single embedded schema support #2689 #585 + * added; depopulate function #2509 + +4.1.12 / 2015-10-19 +=================== + * docs; use readPreference instead of slaveOk for Query.setOptions docs #3471 [buunguyen](https://github.com/buunguyen) + * fixed; more helpful error when regexp contains null bytes #3456 + * fixed; x509 auth issue #3454 [NoxHarmonium](https://github.com/NoxHarmonium) + +3.8.36 / 2015-10-18 +=================== + * fixed; Make array props non-enumerable #3461 [boblauer](https://github.com/boblauer) + +4.1.11 / 2015-10-12 +=================== + * fixed; update timestamps for update() if they're enabled #3450 [isayme](https://github.com/isayme) + * fixed; unit test error on node 0.10 #3449 [isayme](https://github.com/isayme) + * docs; timestamp option docs #3448 [isayme](https://github.com/isayme) + * docs; fix unexpected indent #3443 [isayme](https://github.com/isayme) + * fixed; use ES6 promises for Model.prototype.remove() #3442 + * fixed; don't use unused 'safe' option for index builds #3439 + * fixed; elemMatch casting bug #3437 #3435 [DefinitelyCarter](https://github.com/DefinitelyCarter) + * docs; schema.index docs #3434 + * fixed; exceptions in save() callback getting swallowed on mongodb 2.4 #3371 + +4.1.10 / 2015-10-05 +=================== + * docs; improve virtuals docs to explain virtuals schema option #3433 [zoyaH](https://github.com/zoyaH) + * docs; MongoDB server version compatibility guide #3427 + * docs; clarify that findById and findByIdAndUpdate fire hooks #3422 + * docs; clean up Model.save() docs #3420 + * fixed; properly handle projection with just id #3407 #3412 + * fixed; infinite loop when database document is corrupted #3405 + * docs; clarify remove middleware #3388 + +4.1.9 / 2015-09-28 +================== + * docs; minlength and maxlength string 
validation docs #3368 #3413 [cosmosgenius](https://github.com/cosmosgenius) + * fixed; linting for infix operators #3397 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; proper casting for $all #3394 + * fixed; unhandled rejection warnings with .create() #3391 + * docs; clarify update validators on paths that aren't explicitly set #3386 + * docs; custom validator examples #2778 + +4.1.8 / 2015-09-21 +================== + * docs; fixed typo in example #3390 [kmctown](https://github.com/kmctown) + * fixed; error in toObject() #3387 [guumaster](https://github.com/guumaster) + * fixed; handling for casting null dates #3383 [alexmingoia](https://github.com/alexmingoia) + * fixed; passing composite ids to `findByIdAndUpdate` #3380 + * fixed; linting #3376 #3375 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; added NodeJS v4 to Travis #3374 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; casting $elemMatch inside of $not #3373 [gaguirre](https://github.com/gaguirre) + * fixed; handle case where $slice is 0 #3369 + * fixed; avoid running getters if path is populated #3357 + * fixed; cast documents to objects when setting to a nested path #3346 + +4.1.7 / 2015-09-14 +================== + * docs; typos in SchemaType documentation #3367 [jasson15](https://github.com/jasson15) + * fixed; MONGOOSE_DRIVER_PATH env variable again #3360 + * docs; added validateSync docs #3353 + * fixed; set findOne op synchronously in query #3344 + * fixed; handling for `.pull()` on a documentarray without an id #3341 + * fixed; use natural order for cloning update conditions #3338 + * fixed; issue with strict mode casting for mixed type updates #3337 + +4.1.6 / 2015-09-08 +================== + * fixed; MONGOOSE_DRIVER_PATH env variable #3345 [g13013](https://github.com/g13013) + * docs; global autoIndex option #3335 [albertorestifo](https://github.com/albertorestifo) + * docs; model documentation typos #3330 + * fixed; report reason for CastError #3320 + * fixed; .populate() no longer returns true after re-assigning #3308 + * fixed; discriminators with aggregation geoNear #3304 + * docs; discriminator docs #2743 + +4.1.5 / 2015-09-01 +================== + * fixed; document.remove() removing all docs #3326 #3325 + * fixed; connect() checks for rs_name in options #3299 + * docs; examples for schema.set() #3288 + * fixed; checkKeys issue with bluebird #3286 [gregthegeek](https://github.com/gregthegeek) + +4.1.4 / 2015-08-31 +================== + * fixed; ability to set strict: false for update #3305 + * fixed; .create() properly uses ES6 promises #3297 + * fixed; pre hooks on nested subdocs #3291 #3284 [aliatsis](https://github.com/aliatsis) + * docs; remove unclear text in .remove() docs #3282 + * fixed; pre hooks called twice for 3rd-level nested doc #3281 + * fixed; nested transforms #3279 + * upgraded; mquery -> 1.6.3 #3278 #3272 + * fixed; don't swallow callback errors by default #3273 #3222 + * fixed; properly get nested paths from nested schemas #3265 + * fixed; remove() with id undefined deleting all docs #3260 [thanpolas](https://github.com/thanpolas) + * fixed; handling for non-numeric projections #3256 + * fixed; findById with id undefined returning first doc #3255 + * fixed; use retainKeyOrder for update #3215 + * added; passRawResult option to findOneAndUpdate for backwards compat #3173 + +4.1.3 / 2015-08-16 +================== + * fixed; getUpdate() in pre update hooks #3520 [gregthegeek](https://github.com/gregthegeek) + * fixed; handleArray() ensures arg is 
an array #3238 [jloveridge](https://github.com/jloveridge) + * fixed; refresh required path cache when recreating docs #3199 + * fixed; $ operator on unwind aggregation helper #3197 + * fixed; findOneAndUpdate() properly returns raw result as third arg to callback #3173 + * fixed; querystream with dynamic refs #3108 + +3.8.35 / 2015-08-14 +=================== + * fixed; handling for minimize on nested objects #2930 + * fixed; don't crash when schema.path.options undefined #1824 + +4.1.2 / 2015-08-10 +================== + * fixed; better handling for Jade templates #3241 [kbadk](https://github.com/kbadk) + * added; ESlint trailing spaces #3234 [ChristianMurphy](https://github.com/ChristianMurphy) + * added; ESlint #3191 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; properly emit event on disconnect #3183 + * fixed; copy options properly using Query.toConstructor() #3176 + * fixed; setMaxListeners() issue in browser build #3170 + * fixed; node driver -> 2.0.40 to not store undefined keys as null #3169 + * fixed; update validators handle positional operator #3167 + * fixed; handle $all + $elemMatch query casting #3163 + * fixed; post save hooks don't swallow extra args #3155 + * docs; spelling mistake in index.jade #3154 + * fixed; don't crash when toObject() has no fields #3130 + * fixed; apply toObject() recursively for find and update queries #3086 [naoina](https://github.com/naoina) + +4.1.1 / 2015-08-03 +================== + * fixed; aggregate exec() crash with no callback #3212 #3198 [jpgarcia](https://github.com/jpgarcia) + * fixed; pre init hooks now properly synchronous #3207 [burtonjc](https://github.com/burtonjc) + * fixed; updateValidators doesn't flatten dates #3206 #3194 [victorkohl](https://github.com/victorkohl) + * fixed; default fields don't make document dirty between saves #3205 [burtonjc](https://github.com/burtonjc) + * fixed; save passes 0 as numAffected rather than undefined when no change #3195 [burtonjc](https://github.com/burtonjc) + * fixed; better handling for positional operator in update #3185 + * fixed; use Travis containers #3181 [ChristianMurphy](https://github.com/ChristianMurphy) + * fixed; leaked variable #3180 [ChristianMurphy](https://github.com/ChristianMurphy) + +4.1.0 / 2015-07-24 +================== + * added; `schema.queue()` now public #3193 + * added; raw result as third parameter to findOneAndX callback #3173 + * added; ability to run validateSync() on only certain fields #3153 + * added; subPopulate #3103 [timbur](https://github.com/timbur) + * added; $isDefault function on documents #3077 + * added; additional properties for built-in validator messages #3063 [KLicheR](https://github.com/KLicheR) + * added; getQuery() and getUpdate() functions for Query #3013 + * added; export DocumentProvider #2996 + * added; ability to remove path from schema #2993 [JohnnyEstilles](https://github.com/JohnnyEstilles) + * added; .explain() helper for aggregate #2714 + * added; ability to specify which ES6-compatible promises library mongoose uses #2688 + * added; export Aggregate #1910 + +4.0.8 / 2015-07-20 +================== + * fixed; assignment with document arrays #3178 [rosston](https://github.com/rosston) + * docs; remove duplicate paragraph #3164 [rhmeeuwisse](https://github.com/rhmeeuwisse) + * docs; improve findOneAndXYZ parameter descriptions #3159 [rhmeeuwisse](https://github.com/rhmeeuwisse) + * docs; add findOneAndRemove to list of supported middleware #3158 + * docs; clarify ensureIndex #3156 + * fixed; refuse to save/remove 
document without id #3118 + * fixed; hooks next() no longer accidentally returns promise #3104 + * fixed; strict mode for findOneAndUpdate #2947 + * added; .min.js.gz file for browser component #2806 + +3.8.34 / 2015-07-20 +=================== + * fixed; allow using $rename #3171 + * fixed; no longer modifies update arguments #3008 + +4.0.7 / 2015-07-11 +================== + * fixed; documentarray id method when using object id #3157 [siboulet](https://github.com/siboulet) + * docs; improve findById docs #3147 + * fixed; update validators handle null properly #3136 [odeke-em](https://github.com/odeke-em) + * docs; jsdoc syntax errors #3128 [rhmeeuwisse](https://github.com/rhmeeuwisse) + * docs; fix typo #3126 [rhmeeuwisse](https://github.com/rhmeeuwisse) + * docs; proper formatting in queries.jade #3121 [rhmeeuwisse](https://github.com/rhmeeuwisse) + * docs; correct example for string maxlength validator #3111 [rhmeeuwisse](https://github.com/rhmeeuwisse) + * fixed; setDefaultsOnInsert with arrays #3107 + * docs; LearnBoost -> Automattic in package.json #3099 + * docs; pre update hook example #3094 [danpe](https://github.com/danpe) + * docs; clarify query middleware example #3051 + * fixed; ValidationErrors in strict mode #3046 + * fixed; set findOneAndUpdate properties before hooks run #3024 + +3.8.33 / 2015-07-10 +=================== + * upgraded; node driver -> 1.4.38 + * fixed; dont crash when `match` validator undefined + +4.0.6 / 2015-06-21 +================== + * upgraded; node driver -> 2.0.34 #3087 + * fixed; apply setters on addToSet, etc #3067 [victorkohl](https://github.com/victorkohl) + * fixed; missing semicolons #3065 [sokolikp](https://github.com/sokolikp) + * fixed; proper handling for async doc hooks #3062 [gregthegeek](https://github.com/gregthegeek) + * fixed; dont set failed populate field to null if other docs are successfully populated #3055 [eloytoro](https://github.com/eloytoro) + * fixed; setDefaultsOnInsert with document arrays #3034 [taxilian](https://github.com/taxilian) + * fixed; setters fired on array items #3032 + * fixed; stop validateSync() on first error #3025 [victorkohl](https://github.com/victorkohl) + * docs; improve query docs #3016 + * fixed; always exclude _id when its deselected #3010 + * fixed; enum validator kind property #3009 + * fixed; mquery collection names #3005 + * docs; clarify mongos option #3000 + * docs; clarify that query builder has a .then() #2995 + * fixed; race condition in dynamic ref #2992 + +3.8.31 / 2015-06-20 +=================== + * fixed; properly handle text search with discriminators and $meta #2166 + +4.0.5 / 2015-06-05 +================== + * fixed; ObjectIds and buffers when mongodb driver is a sibling dependency #3050 #3048 #3040 #3031 #3020 #2988 #2951 + * fixed; warn user when 'increment' is used in schema #3039 + * fixed; setDefaultsOnInsert with array in schema #3035 + * fixed; dont use default Object toString to cast to string #3030 + * added; npm badge #3020 [odeke-em](https://github.com/odeke-em) + * fixed; proper handling for calling .set() with a subdoc #2782 + * fixed; dont throw cast error when using $rename on non-string path #1845 + +3.8.30 / 2015-06-05 +=================== + * fixed; enable users to set all options with tailable() #2883 + +4.0.4 / 2015-05-28 +================== + * docs; findAndModify new parameter correct default value #3012 [JonForest](https://github.com/JonForest) + * docs; clarify pluralization rules #2999 [anonmily](https://github.com/anonmily) + * fix; discriminators with schema 
methods #2978 + * fix; make `isModified` a schema reserved keyword #2975 + * fix; properly fire setters when initializing path with object #2943 + * fix; can use `setDefaultsOnInsert` without specifying `runValidators` #2938 + * fix; always set validation errors `kind` property #2885 + * upgraded; node driver -> 2.0.33 #2865 + +3.8.29 / 2015-05-27 +=================== + * fixed; Handle JSON.stringify properly for nested docs #2990 + +4.0.3 / 2015-05-13 +================== + * upgraded; mquery -> 1.5.1 #2983 + * docs; clarify context for query middleware #2974 + * docs; fix missing type -> kind rename in History.md #2961 + * fixed; broken ReadPreference include on Heroku #2957 + * docs; correct form for cursor aggregate option #2955 + * fixed; sync post hooks now properly called after function #2949 #2925 + * fixed; fix sub-doc validate() function #2929 + * upgraded; node driver -> 2.0.30 #2926 + * docs; retainKeyOrder for save() #2924 + * docs; fix broken class names #2913 + * fixed; error when using node-clone on a doc #2909 + * fixed; no more hard references to bson #2908 #2906 + * fixed; dont overwrite array values #2907 [naoina](https://github.com/naoina) + * fixed; use readPreference=primary for findOneAndUpdate #2899 #2823 + * docs; clarify that update validators only run on $set and $unset #2889 + * fixed; set kind consistently for built-in validators #2885 + * docs; single field populated documents #2884 + * fixed; nested objects are now enumerable #2880 [toblerpwn](https://github.com/toblerpwn) + * fixed; properly populate field when ref, lean, stream used together #2841 + * docs; fixed migration guide jade error #2807 + +3.8.28 / 2015-05-12 +=================== + * fixed; proper handling for toJSON options #2910 + * fixed; dont attach virtuals to embedded docs in update() #2046 + +4.0.2 / 2015-04-23 +================== + * fixed; error thrown when calling .validate() on subdoc not in an array #2902 + * fixed; rename define() to play nice with webpack #2900 [jspears](https://github.com/jspears) + * fixed; pre validate called twice with discriminators #2892 + * fixed; .inspect() on mongoose.Types #2875 + * docs; correct callback params for Model.update #2872 + * fixed; setDefaultsOnInsert now works when runValidators not specified #2870 + * fixed; Document now wraps EventEmitter.addListener #2867 + * fixed; call non-hook functions in schema queue #2856 + * fixed; statics can be mocked out for tests #2848 [ninelb](https://github.com/ninelb) + * upgraded; mquery 1.4.0 for bluebird bug fix #2846 + * fixed; required validators run first #2843 + * docs; improved docs for new option to findAndMody #2838 + * docs; populate example now uses correct field #2837 [swilliams](https://github.com/swilliams) + * fixed; pre validate changes causing VersionError #2835 + * fixed; get path from correct place when setting CastError #2832 + * docs; improve docs for Model.update() function signature #2827 [irnc](https://github.com/irnc) + * fixed; populating discriminators #2825 [chetverikov](https://github.com/chetverikov) + * fixed; discriminators with nested schemas #2821 + * fixed; CastErrors with embedded docs #2819 + * fixed; post save hook context #2816 + * docs; 3.8.x -> 4.x migration guide #2807 + * fixed; proper _distinct copying for query #2765 [cdelauder](https://github.com/cdelauder) + +3.8.27 / 2015-04-22 +=================== + * fixed; dont duplicate db calls on Q.ninvoke() #2864 + * fixed; Model.find arguments naming in docs #2828 + * fixed; Support ipv6 in connection strings #2298 + 
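Several entries above and below (#3012, #2793, #2262) track the 4.0 change that the `new` option to the findAndModify family now defaults to `false`, and #2938/#860 cover the `runValidators` and `setDefaultsOnInsert` update options. A minimal sketch, assuming a hypothetical `Task` model and id that are not part of this changelog, of how calling code opts back into receiving the updated document:

```js
// Minimal sketch (mongoose 4.x era, callback style). The Task model, its
// fields, and taskId are illustrative assumptions, not part of this changelog.
var mongoose = require('mongoose');
mongoose.connect('mongodb://localhost/test');

var Task = mongoose.model('Task', new mongoose.Schema({
  title: { type: String, required: true },
  done: { type: Boolean, default: false }
}));

var taskId = new mongoose.Types.ObjectId();

Task.findByIdAndUpdate(
  taskId,
  { $set: { done: true } },
  // `new` defaults to false in 4.x, so opt in to get the post-update doc;
  // `runValidators` opts this update into schema validation.
  { new: true, runValidators: true },
  function (err, updatedTask) {
    if (err) return console.error(err);
    console.log(updatedTask); // null here (random id), or the updated document
  }
);
```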
+3.8.26 / 2015-04-07 +=================== + * fixed; TypeError when setting date to undefined #2833 + * fixed; handle CastError properly in distinct() with no callback #2786 + * fixed; broken links in queries docs #2779 + * fixed; dont mark buffer as modified when setting type initially #2738 + * fixed; dont crash when using slice with populate #1934 + +4.0.1 / 2015-03-28 +================== + * fixed; properly handle empty cast doc in update() with promises #2796 + * fixed; unstable warning #2794 + * fixed; findAndModify docs now show new option is false by default #2793 + +4.0.0 / 2015-03-25 +================== + * fixed; on-the-fly schema docs typo #2783 [artiifix](https://github.com/artiifix) + * fixed; cast error validation handling #2775 #2766 #2678 + * fixed; discriminators with populate() #2773 #2719 [chetverikov](https://github.com/chetverikov) + * fixed; increment now a reserved path #2709 + * fixed; avoid sending duplicate object ids in populate() #2683 + * upgraded; mongodb to 2.0.24 to properly emit reconnect event multiple times #2656 + +4.0.0-rc4 / 2015-03-14 +====================== + * fixed; toObject virtuals schema option handled properly #2751 + * fixed; update validators work on document arrays #2733 + * fixed; check for cast errors on $set #2729 + * fixed; instance field set for all schema types #2727 [csdco](https://github.com/csdco) + * fixed; dont run other validators if required fails #2725 + * fixed; custom getters execute on ref paths #2610 + * fixed; save defaults if they were set when doc was loaded from db #2558 + * fixed; pre validate now runs before pre save #2462 + * fixed; no longer throws errors with --use_strict #2281 + +3.8.25 / 2015-03-13 +=================== + * fixed; debug output reverses order of aggregation keys #2759 + * fixed; $eq is a valid query selector in 3.0 #2752 + * fixed; upgraded node driver to 1.4.32 for handling non-numeric poolSize #2682 + * fixed; update() with overwrite sets _id for nested docs #2658 + * fixed; casting for operators in $elemMatch #2199 + +4.0.0-rc3 / 2015-02-28 +====================== + * fixed; update() pre hooks run before validators #2706 + * fixed; setters not called on arrays of refs #2698 [brandom](https://github.com/brandom) + * fixed; use node driver 2.0.18 for nodejs 0.12 support #2685 + * fixed; comments reference file that no longer exists #2681 + * fixed; populated() returns _id of manually populated doc #2678 + * added; ability to exclude version key in toObject() #2675 + * fixed; dont allow setting nested path to a string #2592 + * fixed; can cast objects with _id field to ObjectIds #2581 + * fixed; on-the-fly schema getters #2360 + * added; strict option for findOneAndUpdate() #1967 + +3.8.24 / 2015-02-25 +=================== + * fixed; properly apply child schema transforms #2691 + * fixed; make copy of findOneAndUpdate options before modifying #2687 + * fixed; apply defaults when parent path is selected #2670 #2629 + * fixed; properly get ref property for nested paths #2665 + * fixed; node driver makes copy of authenticate options before modifying them #2619 + * fixed; dont block process exit when auth fails #2599 + * fixed; remove redundant clone in update() #2537 + +4.0.0-rc2 / 2015-02-10 +====================== + * added; io.js to travis build + * removed; browser build dependencies not installed by default + * added; dynamic refpaths #2640 [chetverikov](https://github.com/chetverikov) + * fixed; dont call child schema transforms on parent #2639 [chetverikov](https://github.com/chetverikov) + * 
fixed; get rid of remove option if new is set in findAndModify #2598 + * fixed; aggregate all document array validation errors #2589 + * fixed; custom setters called when setting value to undefined #1892 + +3.8.23 / 2015-02-06 +=================== + * fixed; unset opts.remove when upsert is true #2519 + * fixed; array saved as object when path is object in array #2442 + * fixed; inline transforms #2440 + * fixed; check for callback in count() #2204 + * fixed; documentation for selecting fields #1534 + +4.0.0-rc1 / 2015-02-01 +====================== + * fixed; use driver 2.0.14 + * changed; use transform: true by default #2245 + +4.0.0-rc0 / 2015-01-31 +=================== + * fixed; wrong order for distinct() params #2628 + * fixed; handling no query argument to remove() #2627 + * fixed; createModel and discriminators #2623 [ashaffer](https://github.com/ashaffer) + * added; pre('count') middleware #2621 + * fixed; double validation calls on document arrays #2618 + * added; validate() catches cast errors #2611 + * fixed; respect replicaSet parameter in connection string #2609 + * added; can explicitly exclude paths from versioning #2576 [csabapalfi](https://github.com/csabapalfi) + * upgraded; driver to 2.0.15 #2552 + * fixed; save() handles errors more gracefully in ES6 #2371 + * fixed; undefined is now a valid argument to findOneAndUpdate #2272 + * changed; `new` option to findAndModify ops is false by default #2262 + +3.8.22 / 2015-01-24 +=================== + * upgraded; node-mongodb-native to 1.4.28 #2587 [Climax777](https://github.com/Climax777) + * added; additional documentation for validators #2449 + * fixed; stack overflow when creating massive arrays #2423 + * fixed; undefined is a valid id for queries #2411 + * fixed; properly create nested schema index when same schema used twice #2322 + * added; link to plugin generator in docs #2085 [huei90](https://github.com/huei90) + * fixed; optional arguments documentation for findOne() #1971 [nachinius](https://github.com/nachinius) + +3.9.7 / 2014-12-19 +=================== + * added; proper cursors for aggregate #2539 [changyy](https://github.com/changyy) + * added; min/max built-in validators for dates #2531 [bshamblen](https://github.com/bshamblen) + * fixed; save and validate are now reserved keywords #2380 + * added; basic documentation for browser component #2256 + * added; find and findOne hooks (query middleware) #2138 + * fixed; throw a DivergentArrayError when saving positional operator queries #2031 + * added; ability to use options as a document property #1416 + * fixed; document no longer inherits from event emitter and so domain and _events are no longer reserved #1351 + * removed; setProfiling #1349 + +3.8.21 / 2014-12-18 +=================== + * fixed; syntax in index.jade #2517 [elderbas](https://github.com/elderbas) + * fixed; writable statics #2510 #2528 + * fixed; overwrite and explicit $set casting #2515 + +3.9.6 / 2014-12-05 +=================== + * added; correctly run validators on each element of array when entire array is modified #661 #1227 + * added; castErrors in validation #1013 [jondavidjohn](https://github.com/jondavidjohn) + * added; specify text indexes in schema fields #1401 [sr527](https://github.com/sr527) + * added; ability to set field with validators to undefined #1594 [alabid](https://github.com/alabid) + * added; .create() returns an array when passed an array #1746 [alabid](https://github.com/alabid) + * added; test suite and docs for use with co and yield #2177 #2474 + * fixed; subdocument 
toObject() transforms #2447 [chmanie](https://github.com/chmanie) + * fixed; Model.create() with save errors #2484 + * added; pass options to .save() and .remove() #2494 [jondavidjohn](https://github.com/jondavidjohn) + +3.8.20 / 2014-12-01 +=================== + * fixed; recursive readPref #2490 [kjvalencik](https://github.com/kjvalencik) + * fixed; make sure to copy parameters to update() before modifying #2406 [alabid](https://github.com/alabid) + * fixed; unclear documentation about query callbacks #2319 + * fixed; setting a schema-less field to an empty object #2314 [alabid](https://github.com/alabid) + * fixed; registering statics and methods for discriminators #2167 [alabid](https://github.com/alabid) + +3.9.5 / 2014-11-10 +=================== + * added; ability to disable autoIndex on a per-connection basis #1875 [sr527](https://github.com/sr527) + * fixed; `geoNear()` no longer enforces legacy coordinate pairs - supports GeoJSON #1987 [alabid](https://github.com/alabid) + * fixed; browser component works when minified with mangled variable names #2302 + * fixed; `doc.errors` now cleared before `validate()` called #2302 + * added; `execPopulate()` function to make `doc.populate()` compatible with promises #2317 + * fixed; `count()` no longer throws an error when used with `sort()` #2374 + * fixed; `save()` no longer recursively calls `save()` on populated fields #2418 + +3.8.19 / 2014-11-09 +=================== + * fixed; make sure to not override subdoc _ids on find #2276 [alabid](https://github.com/alabid) + * fixed; exception when comparing two documents when one lacks _id #2333 [slawo](https://github.com/slawo) + * fixed; getters for properties with non-strict schemas #2439 [alabid](https://github.com/alabid) + * fixed; inconsistent URI format in docs #2414 [sr527](https://github.com/sr527) + +3.9.4 / 2014-10-25 +================== + * fixed; statics no longer can be overwritten #2343 [nkcmr](https://github.com/chetverikov) + * added; ability to set single populated paths to documents #1530 + * added; setDefaultsOnInsert and runValidator options for findOneAndUpdate() #860 + +3.8.18 / 2014-10-22 +================== + * fixed; Dont use all toObject options in save #2340 [chetverikov](https://github.com/chetverikov) + +3.9.3 / 2014-10-01 +================= + * added; support for virtuals that return objects #2294 + * added; ability to manually hydrate POJOs into mongoose objects #2292 + * added; setDefaultsOnInsert and runValidator options for update() #860 + +3.8.17 / 2014-09-29 +================== + * fixed; use schema options retainKeyOrder in save() #2274 + * fixed; fix skip in populate when limit is set #2252 + * fixed; fix stack overflow when passing MongooseArray to findAndModify #2214 + * fixed; optimize .length usage in populate #2289 + +3.9.2 / 2014-09-08 +================== + * added; test coverage for browser component #2255 + * added; in-order execution of validators #2243 + * added; custom fields for validators #2132 + * removed; exception thrown when find() used with count() #1950 + +3.8.16 / 2014-09-08 +================== + * fixed; properly remove modified array paths if array has been overwritten #1638 + * fixed; key check errors #1884 + * fixed; make sure populate on an array always returns a Mongoose array #2214 + * fixed; SSL connections with node 0.11 #2234 + * fixed; return sensible strings for promise errors #2239 + +3.9.1 / 2014-08-17 +================== + * added; alpha version of browser-side schema validation #2254 + * added; support passing a function 
to schemas `required` field #2247 + * added; support for setting updatedAt and createdAt timestamps #2227 + * added; document.validate() returns a promise #2131 + +3.8.15 / 2014-08-17 +================== + * fixed; Replica set connection string example in docs #2246 + * fixed; bubble up parseError event #2229 + * fixed; removed buggy populate cache #2176 + * fixed; dont $inc versionKey if its being $set #1933 + * fixed; cast $or and $and in $pull #1932 + * fixed; properly cast to schema in stream() #1862 + * fixed; memory leak in nested objects #1565 #2211 [devongovett](https://github.com/devongovett) + +3.8.14 / 2014-07-26 +================== + * fixed; stringifying MongooseArray shows nested arrays #2002 + * fixed; use populated doc schema in toObject and toJSON by default #2035 + * fixed; dont crash on arrays containing null #2140 + * fixed; model.update w/ upsert has same return values on .exec and promise #2143 + * fixed; better handling for populate limit with multiple documents #2151 + * fixed; dont prevent users from adding weights to text index #2183 + * fixed; helper for aggregation cursor #2187 + * updated; node-mongodb-native to 1.4.7 + +3.8.13 / 2014-07-15 +================== + * fixed; memory leak with isNew events #2159 + * fixed; docs for overwrite option for update() #2144 + * fixed; storeShard() handles dates properly #2127 + * fixed; sub-doc changes not getting persisted to db after save #2082 + * fixed; populate with _id: 0 actually removes _id instead of setting to undefined #2123 + * fixed; save versionKey on findOneAndUpdate w/ upsert #2122 + * fixed; fix typo in 2.8 docs #2120 [shakirullahi](https://github.com/shakirullahi) + * fixed; support maxTimeMs #2102 [yuukinajima](https://github.com/yuukinajima) + * fixed; support $currentDate #2019 + * fixed; $addToSet handles objects without _ids properly #1973 + * fixed; dont crash on invalid nearSphere query #1874 + +3.8.12 / 2014-05-30 +================== + * fixed; single-server reconnect event fires #1672 + * fixed; sub-docs not saved when pushed into populated array #1794 + * fixed; .set() sometimes converts embedded docs to pojos #1954 [archangel-irk](https://github.com/archangel-irk) + * fixed; sub-doc changes not getting persisted to db after save #2082 + * fixed; custom getter might cause mongoose to mistakenly think a path is dirty #2100 [pgherveou](https://github.com/pgherveou) + * fixed; chainable helper for allowDiskUse option in aggregation #2114 + +3.9.0 (unstable) / 2014-05-22 +================== + * changed; added `domain` to reserved keywords #1338 #2052 [antoinepairet](https://github.com/antoinepairet) + * added; asynchronous post hooks #1977 #2081 [chopachom](https://github.com/chopachom) [JasonGhent](https://github.com/JasonGhent) + * added; using model for population, cross-db populate [mihai-chiorean](https://github.com/mihai-chiorean) + * added; can define a type for schema validators + * added; `doc.remove()` returns a promise #1619 [refack](https://github.com/refack) + * added; internal promises for hooks, pre-save hooks run in parallel #1732 [refack](https://github.com/refack) + * fixed; geoSearch hanging when no results returned #1846 [ghartnett](https://github.com/ghartnett) + * fixed; do not set .type property on ValidationError, use .kind instead #1323 + +3.8.11 / 2014-05-22 +================== + * updated; node-mongodb-native to 1.4.5 + * reverted; #2052, fixes #2097 + +3.8.10 / 2014-05-20 +================== + + * updated; node-mongodb-native to 1.4.4 + * fixed; _.isEqual false negatives 
bug in js-bson #2070 + * fixed; missing check for schema.options #2014 + * fixed; missing support for $position #2024 + * fixed; options object corruption #2049 + * fixed; improvements to virtuals docs #2055 + * fixed; added `domain` to reserved keywords #2052 #1338 + +3.8.9 / 2014-05-08 +================== + + * updated; mquery to 0.7.0 + * updated; node-mongodb-native to 1.4.3 + * fixed; $near failing against MongoDB 2.6 + * fixed; relying on .options() to determine if collection exists + * fixed; $out aggregate helper + * fixed; all test failures against MongoDB 2.6.1, with caveat #2065 + +3.8.8 / 2014-02-22 +================== + + * fixed; saving Buffers #1914 + * updated; expose connection states for user-land #1926 [yorkie](https://github.com/yorkie) + * updated; mquery to 0.5.3 + * updated; added get / set to reserved path list #1903 [tstrimple](https://github.com/tstrimple) + * docs; README code highlighting, syntax fixes #1930 [IonicaBizau](https://github.com/IonicaBizau) + * docs; fixes link in the doc at #1925 [kapeels](https://github.com/kapeels) + * docs; add a missed word 'hook' for the description of the post-hook api #1924 [ipoval](https://github.com/ipoval) + +3.8.7 / 2014-02-09 +================== + + * fixed; sending safe/read options in Query#exec #1895 + * fixed; findOneAnd..() with sort #1887 + +3.8.6 / 2014-01-30 +================== + + * fixed; setting readPreferences #1895 + +3.8.5 / 2014-01-23 +================== + + * fixed; ssl setting when using URI #1882 + * fixed; findByIdAndUpdate now respects the overwrite option #1809 [owenallenaz](https://github.com/owenallenaz) + +3.8.4 / 2014-01-07 +================== + + * updated; mongodb driver to 1.3.23 + * updated; mquery to 0.4.1 + * updated; mpromise to 0.4.3 + * fixed; discriminators now work when selecting fields #1820 [daemon1981](https://github.com/daemon1981) + * fixed; geoSearch with no results timeout #1846 [ghartnett](https://github.com/ghartnett) + * fixed; infitite recursion in ValidationError #1834 [chetverikov](https://github.com/chetverikov) + +3.8.3 / 2013-12-17 +================== + + * fixed; setting empty array with model.update #1838 + * docs; fix url + +3.8.2 / 2013-12-14 +================== + + * fixed; enum validation of multiple values #1778 [heroicyang](https://github.com/heroicyang) + * fixed; global var leak #1803 + * fixed; post remove now fires on subdocs #1810 + * fixed; no longer set default empty array for geospatial-indexed fields #1668 [shirish87](https://github.com/shirish87) + * fixed; model.stream() not hydrating discriminators correctly #1792 [j](https://github.com/j) + * docs: Stablility -> Stability [nikmartin](https://github.com/nikmartin) + * tests; improve shard error handling + +3.8.1 / 2013-11-19 +================== + + * fixed; mishandling of Dates with minimize/getters #1764 + * fixed; Normalize bugs.email, so `npm` will shut up #1769 [refack](https://github.com/refack) + * docs; Improve the grammar where "lets us" was used #1777 [alexyoung](https://github.com/alexyoung) + * docs; Fix some grammatical issues in the documentation #1777 [alexyoung](https://github.com/alexyoung) + * docs; fix Query api exposure + * docs; fix return description + * docs; Added Notes on findAndUpdate() #1750 [sstadelman](https://github.com/sstadelman) + * docs; Update version number in README #1762 [Fodi69](https://github.com/Fodi69) + +3.8.0 / 2013-10-31 +================== + + * updated; warn when using an unstable version + * updated; error message returned in doc.save() #1595 + * 
updated; mongodb driver to 1.3.19 (fix error swallowing behavior) + * updated; mquery to 0.3.2 + * updated; mocha to 1.12.0 + * updated; mpromise 0.3.0 + * updated; sliced 0.0.5 + * removed; mongoose.Error.DocumentError (never used) + * removed; namedscope (undocumented and broken) #679 #642 #455 #379 + * changed; no longer offically supporting node 0.6.x + * changed; query.within getter is now a method -> query.within() + * changed; query.intersects getter is now a method -> query.intersects() + * added; custom error msgs for built-in validators #747 + * added; discriminator support #1647 #1003 [j](https://github.com/j) + * added; support disabled collection name pluralization #1350 #1707 [refack](https://github.com/refack) + * added; support for GeoJSON to Query#near [ebensing](https://github.com/ebensing) + * added; stand-alone base query support - query.toConstructor() [ebensing](https://github.com/ebensing) + * added; promise support to geoSearch #1614 [ebensing](https://github.com/ebensing) + * added; promise support for geoNear #1614 [ebensing](https://github.com/ebensing) + * added; connection.useDb() #1124 [ebensing](https://github.com/ebensing) + * added; promise support to model.mapReduce() + * added; promise support to model.ensureIndexes() + * added; promise support to model.populate() + * added; benchmarks [ebensing](https://github.com/ebensing) + * added; publicly exposed connection states #1585 + * added; $geoWithin support #1529 $1455 [ebensing](https://github.com/ebensing) + * added; query method chain validation + * added; model.update `overwrite` option + * added; model.geoNear() support #1563 [ebensing](https://github.com/ebensing) + * added; model.geoSearch() support #1560 [ebensing](https://github.com/ebensing) + * added; MongooseBuffer#subtype() + * added; model.create() now returns a promise #1340 + * added; support for `awaitdata` query option + * added; pass the doc to doc.remove() callback #1419 [JoeWagner](https://github.com/JoeWagner) + * added; aggregation query builder #1404 [njoyard](https://github.com/njoyard) + * fixed; document.toObject when using `minimize` and `getters` options #1607 [JedWatson](https://github.com/JedWatson) + * fixed; Mixed types can now be required #1722 [Reggino](https://github.com/Reggino) + * fixed; do not pluralize model names not ending with letters #1703 [refack](https://github.com/refack) + * fixed; repopulating modified populated paths #1697 + * fixed; doc.equals() when _id option is set to false #1687 + * fixed; strict mode warnings #1686 + * fixed; $near GeoJSON casting #1683 + * fixed; nearSphere GeoJSON query builder + * fixed; population field selection w/ strings #1669 + * fixed; setters not firing on null values #1445 [ebensing](https://github.com/ebensing) + * fixed; handle another versioning edge case #1520 + * fixed; excluding subdocument fields #1280 [ebensing](https://github.com/ebensing) + * fixed; allow array properties to be set to null with findOneAndUpdate [aheuermann](https://github.com/aheuermann) + * fixed; subdocuments now use own toJSON opts #1376 [ebensing](https://github.com/ebensing) + * fixed; model#geoNear fulfills promise when results empty #1658 [ebensing](https://github.com/ebensing) + * fixed; utils.merge no longer overrides props and methods #1655 [j](https://github.com/j) + * fixed; subdocuments now use their own transform #1412 [ebensing](https://github.com/ebensing) + * fixed; model.remove() removes only what is necessary #1649 + * fixed; update() now only runs with cb or explicit true #1644 
+ * fixed; casting ref docs on creation #1606 [ebensing](https://github.com/ebensing) + * fixed; model.update "overwrite" option works as documented + * fixed; query#remove() works as documented + * fixed; "limit" correctly applies to individual items on population #1490 [ebensing](https://github.com/ebensing) + * fixed; issue with positional operator on ref docs #1572 [ebensing](https://github.com/ebensing) + * fixed; benchmarks to actually output valid json + * deprecated; promise#addBack (use promise#onResolve) + * deprecated; promise#complete (use promise#fulfill) + * deprecated; promise#addCallback (use promise#onFulFill) + * deprecated; promise#addErrback (use promise#onReject) + * deprecated; query.nearSphere() (use query.near) + * deprecated; query.center() (use query.circle) + * deprecated; query.centerSphere() (use query.circle) + * deprecated; query#slaveOk (use query#read) + * docs; custom validator messages + * docs; 10gen -> MongoDB + * docs; add Date method caveats #1598 + * docs; more validation details + * docs; state which branch is stable/unstable + * docs; mention that middleware does not run on Models + * docs; promise.fulfill() + * docs; fix readme spelling #1483 [yorchopolis](https://github.com/yorchopolis) + * docs; fixed up the README and examples [ebensing](https://github.com/ebensing) + * website; add "show code" for properties + * website; move "show code" links down + * website; update guide + * website; add unstable docs + * website; many improvements + * website; fix copyright #1439 + * website; server.js -> static.js #1546 [nikmartin](https://github.com/nikmartin) + * tests; refactor 1703 + * tests; add test generator + * tests; validate formatMessage() throws + * tests; add script for continuously running tests + * tests; fixed versioning tests + * tests; race conditions in tests + * tests; added for nested and/or queries + * tests; close some test connections + * tests; validate db contents + * tests; remove .only + * tests; close some test connections + * tests; validate db contents + * tests; remove .only + * tests; replace deprecated method names + * tests; convert id to string + * tests; fix sharding tests for MongoDB 2.4.5 + * tests; now 4-5 seconds faster + * tests; fix race condition + * make; suppress warning msg in test + * benchmarks; updated for pull requests + * examples; improved and expanded [ebensing](https://github.com/ebensing) + +3.7.4 (unstable) / 2013-10-01 +============================= + + * updated; mquery to 0.3.2 + * removed; mongoose.Error.DocumentError (never used) + * added; custom error msgs for built-in validators #747 + * added; discriminator support #1647 #1003 [j](https://github.com/j) + * added; support disabled collection name pluralization #1350 #1707 [refack](https://github.com/refack) + * fixed; do not pluralize model names not ending with letters #1703 [refack](https://github.com/refack) + * fixed; repopulating modified populated paths #1697 + * fixed; doc.equals() when _id option is set to false #1687 + * fixed; strict mode warnings #1686 + * fixed; $near GeoJSON casting #1683 + * fixed; nearSphere GeoJSON query builder + * fixed; population field selection w/ strings #1669 + * docs; custom validator messages + * docs; 10gen -> MongoDB + * docs; add Date method caveats #1598 + * docs; more validation details + * website; add "show code" for properties + * website; move "show code" links down + * tests; refactor 1703 + * tests; add test generator + * tests; validate formatMessage() throws + +3.7.3 (unstable) / 
2013-08-22 +============================= + + * updated; warn when using an unstable version + * updated; mquery to 0.3.1 + * updated; mocha to 1.12.0 + * updated; mongodb driver to 1.3.19 (fix error swallowing behavior) + * changed; no longer offically supporting node 0.6.x + * added; support for GeoJSON to Query#near [ebensing](https://github.com/ebensing) + * added; stand-alone base query support - query.toConstructor() [ebensing](https://github.com/ebensing) + * added; promise support to geoSearch #1614 [ebensing](https://github.com/ebensing) + * added; promise support for geoNear #1614 [ebensing](https://github.com/ebensing) + * fixed; setters not firing on null values #1445 [ebensing](https://github.com/ebensing) + * fixed; handle another versioning edge case #1520 + * fixed; excluding subdocument fields #1280 [ebensing](https://github.com/ebensing) + * fixed; allow array properties to be set to null with findOneAndUpdate [aheuermann](https://github.com/aheuermann) + * fixed; subdocuments now use own toJSON opts #1376 [ebensing](https://github.com/ebensing) + * fixed; model#geoNear fulfills promise when results empty #1658 [ebensing](https://github.com/ebensing) + * fixed; utils.merge no longer overrides props and methods #1655 [j](https://github.com/j) + * fixed; subdocuments now use their own transform #1412 [ebensing](https://github.com/ebensing) + * make; suppress warning msg in test + * docs; state which branch is stable/unstable + * docs; mention that middleware does not run on Models + * tests; add script for continuously running tests + * tests; fixed versioning tests + * benchmarks; updated for pull requests + +3.7.2 (unstable) / 2013-08-15 +================== + + * fixed; model.remove() removes only what is necessary #1649 + * fixed; update() now only runs with cb or explicit true #1644 + * tests; race conditions in tests + * website; update guide + +3.7.1 (unstable) / 2013-08-13 +============================= + + * updated; driver to 1.3.18 (fixes memory leak) + * added; connection.useDb() #1124 [ebensing](https://github.com/ebensing) + * added; promise support to model.mapReduce() + * added; promise support to model.ensureIndexes() + * added; promise support to model.populate() + * fixed; casting ref docs on creation #1606 [ebensing](https://github.com/ebensing) + * fixed; model.update "overwrite" option works as documented + * fixed; query#remove() works as documented + * fixed; "limit" correctly applies to individual items on population #1490 [ebensing](https://github.com/ebensing) + * fixed; issue with positional operator on ref docs #1572 [ebensing](https://github.com/ebensing) + * fixed; benchmarks to actually output valid json + * tests; added for nested and/or queries + * tests; close some test connections + * tests; validate db contents + * tests; remove .only + * tests; close some test connections + * tests; validate db contents + * tests; remove .only + * tests; replace deprecated method names + * tests; convert id to string + * docs; promise.fulfill() + +3.7.0 (unstable) / 2013-08-05 +=================== + + * changed; query.within getter is now a method -> query.within() + * changed; query.intersects getter is now a method -> query.intersects() + * deprecated; promise#addBack (use promise#onResolve) + * deprecated; promise#complete (use promise#fulfill) + * deprecated; promise#addCallback (use promise#onFulFill) + * deprecated; promise#addErrback (use promise#onReject) + * deprecated; query.nearSphere() (use query.near) + * deprecated; query.center() (use 
query.circle) + * deprecated; query.centerSphere() (use query.circle) + * deprecated; query#slaveOk (use query#read) + * removed; namedscope (undocumented and broken) #679 #642 #455 #379 + * added; benchmarks [ebensing](https://github.com/ebensing) + * added; publicly exposed connection states #1585 + * added; $geoWithin support #1529 $1455 [ebensing](https://github.com/ebensing) + * added; query method chain validation + * added; model.update `overwrite` option + * added; model.geoNear() support #1563 [ebensing](https://github.com/ebensing) + * added; model.geoSearch() support #1560 [ebensing](https://github.com/ebensing) + * added; MongooseBuffer#subtype() + * added; model.create() now returns a promise #1340 + * added; support for `awaitdata` query option + * added; pass the doc to doc.remove() callback #1419 [JoeWagner](https://github.com/JoeWagner) + * added; aggregation query builder #1404 [njoyard](https://github.com/njoyard) + * updated; integrate mquery #1562 [ebensing](https://github.com/ebensing) + * updated; error msg in doc.save() #1595 + * updated; bump driver to 1.3.15 + * updated; mpromise 0.3.0 + * updated; sliced 0.0.5 + * tests; fix sharding tests for MongoDB 2.4.5 + * tests; now 4-5 seconds faster + * tests; fix race condition + * docs; fix readme spelling #1483 [yorchopolis](https://github.com/yorchopolis) + * docs; fixed up the README and examples [ebensing](https://github.com/ebensing) + * website; add unstable docs + * website; many improvements + * website; fix copyright #1439 + * website; server.js -> static.js #1546 [nikmartin](https://github.com/nikmartin) + * examples; improved and expanded [ebensing](https://github.com/ebensing) + +3.6.20 (stable) / 2013-09-23 +=================== + + * fixed; repopulating modified populated paths #1697 + * fixed; doc.equals w/ _id false #1687 + * fixed; strict mode warning #1686 + * docs; near/nearSphere + +3.6.19 (stable) / 2013-09-04 +================== + + * fixed; population field selection w/ strings #1669 + * docs; Date method caveats #1598 + +3.6.18 (stable) / 2013-08-22 +=================== + + * updated; warn when using an unstable version of mongoose + * updated; mocha to 1.12.0 + * updated; mongodb driver to 1.3.19 (fix error swallowing behavior) + * fixed; setters not firing on null values #1445 [ebensing](https://github.com/ebensing) + * fixed; properly exclude subdocument fields #1280 [ebensing](https://github.com/ebensing) + * fixed; cast error in findAndModify #1643 [aheuermann](https://github.com/aheuermann) + * website; update guide + * website; added documentation for safe:false and versioning interaction + * docs; mention that middleware dont run on Models + * docs; fix indexes link + * make; suppress warning msg in test + * tests; moar + +3.6.17 / 2013-08-13 +=================== + + * updated; driver to 1.3.18 (fixes memory leak) + * fixed; casting ref docs on creation #1606 + * docs; query options + +3.6.16 / 2013-08-08 +=================== + + * added; publicly expose connection states #1585 + * fixed; limit applies to individual items on population #1490 [ebensing](https://github.com/ebensing) + * fixed; positional operator casting in updates #1572 [ebensing](https://github.com/ebensing) + * updated; MongoDB driver to 1.3.17 + * updated; sliced to 0.0.5 + * website; tweak homepage + * tests; fixed + added + * docs; fix some examples + * docs; multi-mongos support details + * docs; auto open browser after starting static server + +3.6.15 / 2013-07-16 +================== + + * added; mongos failover 
support #1037 + * updated; make schematype return vals return self #1580 + * docs; add note to model.update #571 + * docs; document third param to document.save callback #1536 + * tests; tweek mongos test timeout + +3.6.14 / 2013-07-05 +=================== + + * updated; driver to 1.3.11 + * fixed; issue with findOneAndUpdate not returning null on upserts #1533 [ebensing](https://github.com/ebensing) + * fixed; missing return statement in SchemaArray#$geoIntersects() #1498 [bsrykt](https://github.com/bsrykt) + * fixed; wrong isSelected() behavior #1521 [kyano](https://github.com/kyano) + * docs; note about toObject behavior during save() + * docs; add callbacks details #1547 [nikmartin](https://github.com/nikmartin) + +3.6.13 / 2013-06-27 +=================== + + * fixed; calling model.distinct without conditions #1541 + * fixed; regression in Query#count() #1542 + * now working on 3.6.13 + +3.6.12 / 2013-06-25 +=================== + + * updated; driver to 1.3.10 + * updated; clearer capped collection error message #1509 [bitmage](https://github.com/bitmage) + * fixed; MongooseBuffer subtype loss during casting #1517 [zedgu](https://github.com/zedgu) + * fixed; docArray#id when doc.id is disabled #1492 + * fixed; docArray#id now supports matches on populated arrays #1492 [pgherveou](https://github.com/pgherveou) + * website; fix example + * website; improve _id disabling example + * website; fix typo #1494 [dejj](https://github.com/dejj) + * docs; added a 'Requesting new features' section #1504 [shovon](https://github.com/shovon) + * docs; improve subtypes description + * docs; clarify _id disabling + * docs: display by alphabetical order the methods list #1508 [nicolasleger](https://github.com/nicolasleger) + * tests; refactor isSelected checks + * tests; remove pointless test + * tests; fixed timeouts + +3.6.11 / 2013-05-15 +=================== + + * updated; driver to 1.3.5 + * fixed; compat w/ Object.create(null) #1484 #1485 + * fixed; cloning objects w/ missing constructors + * fixed; prevent multiple min number validators #1481 [nrako](https://github.com/nrako) + * docs; add doc.increment() example + * docs; add $size example + * docs; add "distinct" example + +3.6.10 / 2013-05-09 +================== + + * update driver to 1.3.3 + * fixed; increment() works without other changes #1475 + * website; fix links to posterous + * docs; fix link #1472 + +3.6.9 / 2013-05-02 +================== + + * fixed; depopulation of mixed documents #1471 + * fixed; use of $options in array #1462 + * tests; fix race condition + * docs; fix default example + +3.6.8 / 2013-04-25 +================== + + * updated; driver to 1.3.0 + * fixed; connection.model should retain options #1458 [vedmalex](https://github.com/vedmalex) + * tests; 4-5 seconds faster + +3.6.7 / 2013-04-19 +================== + + * fixed; population regression in 3.6.6 #1444 + +3.6.6 / 2013-04-18 +================== + + * fixed; saving populated new documents #1442 + * fixed; population regession in 3.6.5 #1441 + * website; fix copyright #1439 + +3.6.5 / 2013-04-15 +================== + + * fixed; strict:throw edge case using .set(path, val) + * fixed; schema.pathType() on some numbericAlpha paths + * fixed; numbericAlpha path versioning + * fixed; setting nested mixed paths #1418 + * fixed; setting nested objects with null prop #1326 + * fixed; regression in v3.6 population performance #1426 [vedmalex](https://github.com/vedmalex) + * fixed; read pref typos #1422 [kyano](https://github.com/kyano) + * docs; fix method example + * 
website; update faq + * website; add more deep links + * website; update poolSize docs + * website; add 3.6 release notes + * website; note about keepAlive + +3.6.4 / 2013-04-03 +================== + + * fixed; +field conflict with $slice #1370 + * fixed; nested deselection conflict #1333 + * fixed; RangeError in ValidationError.toString() #1296 + * fixed; do not save user defined transforms #1415 + * tests; fix race condition + +3.6.3 / 2013-04-02 +================== + + * fixed; setting subdocuments deeply nested fields #1394 + * fixed; regression: populated streams #1411 + * docs; mention hooks/validation with findAndModify + * docs; mention auth + * docs; add more links + * examples; add document methods example + * website; display "see" links for properties + * website; clean up homepage + +3.6.2 / 2013-03-29 +================== + + * fixed; corrupted sub-doc array #1408 + * fixed; document#update returns a Query #1397 + * docs; readpref strategy + +3.6.1 / 2013-03-27 +================== + + * added; populate support to findAndModify varients #1395 + * added; text index type to schematypes + * expose allowed index types as Schema.indexTypes + * fixed; use of `setMaxListeners` as path + * fixed; regression in node 0.6 on docs with > 10 arrays + * fixed; do not alter schema arguments #1364 + * fixed; subdoc#ownerDocument() #1385 + * website; change search id + * website; add search from google [jackdbernier](https://github.com/jackdbernier) + * website; fix link + * website; add 3.5.x docs release + * website; fix link + * docs; fix geometry + * docs; hide internal constructor + * docs; aggregation does not cast arguments #1399 + * docs; querystream options + * examples; added for population + +3.6.0 / 2013-03-18 +================== + + * changed; cast 'true'/'false' to boolean #1282 [mgrach](https://github.com/mgrach) + * changed; Buffer arrays can now contain nulls + * added; QueryStream transform option + * added; support for authSource driver option + * added; {mongoose,db}.modelNames() + * added; $push w/ $slice,$sort support (MongoDB 2.4) + * added; hashed index type (MongoDB 2.4) + * added; support for mongodb 2.4 geojson (MongoDB 2.4) + * added; value at time of validation error + * added; support for object literal schemas + * added; bufferCommands schema option + * added; allow auth option in connections #1360 [geoah](https://github.com/geoah) + * added; performance improvements to populate() [263ece9](https://github.com/LearnBoost/mongoose/commit/263ece9) + * added; allow adding uncasted docs to populated arrays and properties #570 + * added; doc#populated(path) stores original populated _ids + * added; lean population #1260 + * added; query.populate() now accepts an options object + * added; document#populate(opts, callback) + * added; Model.populate(docs, opts, callback) + * added; support for rich nested path population + * added; doc.array.remove(value) subdoc with _id value support #1278 + * added; optionally allow non-strict sets and updates + * added; promises/A+ comformancy with [mpromise](https://github.com/aheckmann/mpromise) + * added; promise#then + * added; promise#end + * fixed; use of `model` as doc property + * fixed; lean population #1382 + * fixed; empty object mixed defaults #1380 + * fixed; populate w/ deselected _id using string syntax + * fixed; attempted save of divergent populated arrays #1334 related + * fixed; better error msg when attempting toObject as property name + * fixed; non population buffer casting from doc + * fixed; setting populated 
paths #570 + * fixed; casting when added docs to populated arrays #570 + * fixed; prohibit updating arrays selected with $elemMatch #1334 + * fixed; pull / set subdoc combination #1303 + * fixed; multiple bg index creation #1365 + * fixed; manual reconnection to single mongod + * fixed; Constructor / version exposure #1124 + * fixed; CastError race condition + * fixed; no longer swallowing misuse of subdoc#invalidate() + * fixed; utils.clone retains RegExp opts + * fixed; population of non-schema property + * fixed; allow updating versionKey #1265 + * fixed; add EventEmitter props to reserved paths #1338 + * fixed; can now deselect populated doc _ids #1331 + * fixed; properly pass subtype to Binary in MongooseBuffer + * fixed; casting _id from document with non-ObjectId _id + * fixed; specifying schema type edge case { path: [{type: "String" }] } + * fixed; typo in schemdate #1329 [jplock](https://github.com/jplock) + * updated; driver to 1.2.14 + * updated; muri to 0.3.1 + * updated; mpromise to 0.2.1 + * updated; mocha 1.8.1 + * updated; mpath to 0.1.1 + * deprecated; pluralization will die in 4.x + * refactor; rename private methods to something unusable as doc properties + * refactor MongooseArray#remove + * refactor; move expires index to SchemaDate #1328 + * refactor; internal document properties #1171 #1184 + * tests; added + * docs; indexes + * docs; validation + * docs; populate + * docs; populate + * docs; add note about stream compatibility with node 0.8 + * docs; fix for private names + * docs; Buffer -> mongodb.Binary #1363 + * docs; auth options + * docs; improved + * website; update FAQ + * website; add more api links + * website; add 3.5.x docs to prior releases + * website; Change mongoose-types to an active repo [jackdbernier](https://github.com/jackdbernier) + * website; compat with node 0.10 + * website; add news section + * website; use T for generic type + * benchmark; make adjustable + +3.6.0rc1 / 2013-03-12 +====================== + + * refactor; rename private methods to something unusable as doc properties + * added; {mongoose,db}.modelNames() + * added; $push w/ $slice,$sort support (MongoDB 2.4) + * added; hashed index type (MongoDB 2.4) + * added; support for mongodb 2.4 geojson (MongoDB 2.4) + * added; value at time of validation error + * added; support for object literal schemas + * added; bufferCommands schema option + * added; allow auth option in connections #1360 [geoah](https://github.com/geoah) + * fixed; lean population #1382 + * fixed; empty object mixed defaults #1380 + * fixed; populate w/ deselected _id using string syntax + * fixed; attempted save of divergent populated arrays #1334 related + * fixed; better error msg when attempting toObject as property name + * fixed; non population buffer casting from doc + * fixed; setting populated paths #570 + * fixed; casting when added docs to populated arrays #570 + * fixed; prohibit updating arrays selected with $elemMatch #1334 + * fixed; pull / set subdoc combination #1303 + * fixed; multiple bg index creation #1365 + * fixed; manual reconnection to single mongod + * fixed; Constructor / version exposure #1124 + * fixed; CastError race condition + * fixed; no longer swallowing misuse of subdoc#invalidate() + * fixed; utils.clone retains RegExp opts + * fixed; population of non-schema property + * fixed; allow updating versionKey #1265 + * fixed; add EventEmitter props to reserved paths #1338 + * fixed; can now deselect populated doc _ids #1331 + * updated; muri to 0.3.1 + * updated; driver to 1.2.12 + 
* updated; mpromise to 0.2.1 + * deprecated; pluralization will die in 4.x + * docs; Buffer -> mongodb.Binary #1363 + * docs; auth options + * docs; improved + * website; add news section + * benchmark; make adjustable + +3.6.0rc0 / 2013-02-03 +====================== + + * changed; cast 'true'/'false' to boolean #1282 [mgrach](https://github.com/mgrach) + * changed; Buffer arrays can now contain nulls + * fixed; properly pass subtype to Binary in MongooseBuffer + * fixed; casting _id from document with non-ObjectId _id + * fixed; specifying schema type edge case { path: [{type: "String" }] } + * fixed; typo in schemdate #1329 [jplock](https://github.com/jplock) + * refactor; move expires index to SchemaDate #1328 + * refactor; internal document properties #1171 #1184 + * added; performance improvements to populate() [263ece9](https://github.com/LearnBoost/mongoose/commit/263ece9) + * added; allow adding uncasted docs to populated arrays and properties #570 + * added; doc#populated(path) stores original populated _ids + * added; lean population #1260 + * added; query.populate() now accepts an options object + * added; document#populate(opts, callback) + * added; Model.populate(docs, opts, callback) + * added; support for rich nested path population + * added; doc.array.remove(value) subdoc with _id value support #1278 + * added; optionally allow non-strict sets and updates + * added; promises/A+ comformancy with [mpromise](https://github.com/aheckmann/mpromise) + * added; promise#then + * added; promise#end + * updated; mocha 1.8.1 + * updated; muri to 0.3.0 + * updated; mpath to 0.1.1 + * updated; docs + +3.5.16 / 2013-08-13 +=================== + + * updated; driver to 1.3.18 + +3.5.15 / 2013-07-26 +================== + + * updated; sliced to 0.0.5 + * updated; driver to 1.3.12 + * fixed; regression in Query#count() due to driver change + * tests; fixed timeouts + * tests; handle differing test uris + +3.5.14 / 2013-05-15 +=================== + + * updated; driver to 1.3.5 + * fixed; compat w/ Object.create(null) #1484 #1485 + * fixed; cloning objects missing constructors + * fixed; prevent multiple min number validators #1481 [nrako](https://github.com/nrako) + +3.5.13 / 2013-05-09 +================== + + * update driver to 1.3.3 + * fixed; use of $options in array #1462 + +3.5.12 / 2013-04-25 +=================== + + * updated; driver to 1.3.0 + * fixed; connection.model should retain options #1458 [vedmalex](https://github.com/vedmalex) + * fixed; read pref typos #1422 [kyano](https://github.com/kyano) + +3.5.11 / 2013-04-03 +================== + + * fixed; +field conflict with $slice #1370 + * fixed; RangeError in ValidationError.toString() #1296 + * fixed; nested deselection conflict #1333 + * remove time from Makefile + +3.5.10 / 2013-04-02 +================== + + * fixed; setting subdocuments deeply nested fields #1394 + * fixed; do not alter schema arguments #1364 + +3.5.9 / 2013-03-15 +================== + + * updated; driver to 1.2.14 + * added; support for authSource driver option (mongodb 2.4) + * added; QueryStream transform option (node 0.10 helper) + * fixed; backport for saving required populated buffers + * fixed; pull / set subdoc combination #1303 + * fixed; multiple bg index creation #1365 + * test; added for saveable required populated buffers + * test; added for #1365 + * test; add authSource test + +3.5.8 / 2013-03-12 +================== + + * added; auth option in connection [geoah](https://github.com/geoah) + * fixed; CastError race condition + * docs; add note 
about stream compatibility with node 0.8 + +3.5.7 / 2013-02-22 +================== + + * updated; driver to 1.2.13 + * updated; muri to 0.3.1 #1347 + * fixed; utils.clone retains RegExp opts #1355 + * fixed; deepEquals RegExp support + * tests; fix a connection test + * website; clean up docs [afshinm](https://github.com/afshinm) + * website; update homepage + * website; migragtion: emphasize impact of strict docs #1264 + +3.5.6 / 2013-02-14 +================== + + * updated; driver to 1.2.12 + * fixed; properly pass Binary subtype + * fixed; add EventEmitter props to reserved paths #1338 + * fixed; use correct node engine version + * fixed; display empty docs as {} in log output #953 follow up + * improved; "bad $within $box argument" error message + * populate; add unscientific benchmark + * website; add stack overflow to help section + * website; use better code font #1336 [risseraka](https://github.com/risseraka) + * website; clarify where help is available + * website; fix source code links #1272 [floatingLomas](https://github.com/floatingLomas) + * docs; be specific about _id schema option #1103 + * docs; add ensureIndex error handling example + * docs; README + * docs; CONTRIBUTING.md + +3.5.5 / 2013-01-29 +================== + + * updated; driver to 1.2.11 + * removed; old node < 0.6x shims + * fixed; documents with Buffer _ids equality + * fixed; MongooseBuffer properly casts numbers + * fixed; reopening closed connection on alt host/port #1287 + * docs; fixed typo in Readme #1298 [rened](https://github.com/rened) + * docs; fixed typo in migration docs [Prinzhorn](https://github.com/Prinzhorn) + * docs; fixed incorrect annotation in SchemaNumber#min [bilalq](https://github.com/bilalq) + * docs; updated + +3.5.4 / 2013-01-07 +================== + + * changed; "_pres" & "_posts" are now reserved pathnames #1261 + * updated; driver to 1.2.8 + * fixed; exception when reopening a replica set. 
#1263 [ethankan](https://github.com/ethankan) + * website; updated + +3.5.3 / 2012-12-26 +================== + + * added; support for geo object notation #1257 + * fixed; $within query casting with arrays + * fixed; unix domain socket support #1254 + * updated; driver to 1.2.7 + * updated; muri to 0.0.5 + +3.5.2 / 2012-12-17 +================== + + * fixed; using auth with replica sets #1253 + +3.5.1 / 2012-12-12 +================== + + * fixed; regression when using subdoc with `path` as pathname #1245 [daeq](https://github.com/daeq) + * fixed; safer db option checks + * updated; driver to 1.2.5 + * website; add more examples + * website; clean up old docs + * website; fix prev release urls + * docs; clarify streaming with HTTP responses + +3.5.0 / 2012-12-10 +================== + + * added; paths to CastErrors #1239 + * added; support for mongodb connection string spec #1187 + * added; post validate event + * added; Schema#get (to retrieve schema options) + * added; VersionError #1071 + * added; npmignore [hidekiy](https://github.com/hidekiy) + * update; driver to 1.2.3 + * fixed; stackoverflow in setter #1234 + * fixed; utils.isObject() + * fixed; do not clobber user specified driver writeConcern #1227 + * fixed; always pass current document to post hooks + * fixed; throw error when user attempts to overwrite a model + * fixed; connection.model only caches on connection #1209 + * fixed; respect conn.model() creation when matching global model exists #1209 + * fixed; passing model name + collection name now always honors collection name + * fixed; setting virtual field to an empty object #1154 + * fixed; subclassed MongooseErrors exposure, now available in mongoose.Error.xxxx + * fixed; model.remove() ignoring callback when executed twice [daeq](https://github.com/daeq) #1210 + * docs; add collection option to schema api docs #1222 + * docs; NOTE about db safe options + * docs; add post hooks docs + * docs; connection string options + * docs; middleware is not executed with Model.remove #1241 + * docs; {g,s}etter introspection #777 + * docs; update validation docs + * docs; add link to plugins page + * docs; clarify error returned by unique indexes #1225 + * docs; more detail about disabling autoIndex behavior + * docs; add homepage section to package (npm docs mongoose) + * docs; more detail around collection name pluralization #1193 + * website; add .important css + * website; update models page + * website; update getting started + * website; update quick start + +3.4.0 / 2012-11-10 +================== + + * added; support for generic toJSON/toObject transforms #1160 #1020 #1197 + * added; doc.set() merge support #1148 [NuORDER](https://github.com/NuORDER) + * added; query#add support #1188 [aleclofabbro](https://github.com/aleclofabbro) + * changed; adding invalid nested paths to non-objects throws 4216f14 + * changed; fixed; stop invalid function cloning (internal fix) + * fixed; add query $and casting support #1180 [anotheri](https://github.com/anotheri) + * fixed; overwriting of query arguments #1176 + * docs; fix expires examples + * docs; transforms + * docs; schema `collection` option docs [hermanjunge](https://github.com/hermanjunge) + * website; updated + * tests; added + +3.3.1 / 2012-10-11 +================== + + * fixed; allow goose.connect(uris, dbname, opts) #1144 + * docs; persist API private checked state across page loads + +3.3.0 / 2012-10-10 +================== + + * fixed; passing options as 2nd arg to connect() #1144 + * fixed; race condition after no-op save 
#1139 + * fixed; schema field selection application in findAndModify #1150 + * fixed; directly setting arrays #1126 + * updated; driver to 1.1.11 + * updated; collection pluralization rules [mrickard](https://github.com/mrickard) + * tests; added + * docs; updated + +3.2.2 / 2012-10-08 +================== + + * updated; driver to 1.1.10 #1143 + * updated; use sliced 0.0.3 + * fixed; do not recast embedded docs unnecessarily + * fixed; expires schema option helper #1132 + * fixed; built in string setters #1131 + * fixed; debug output for Dates/ObjectId properties #1129 + * docs; fixed Javascript syntax error in example [olalonde](https://github.com/olalonde) + * docs; fix toJSON example #1137 + * docs; add ensureIndex production notes + * docs; fix spelling + * docs; add blogposts about v3 + * website; updated + * removed; undocumented inGroupsOf util + * tests; added + +3.2.1 / 2012-09-28 +================== + + * fixed; remove query batchSize option default of 1000 https://github.com/learnboost/mongoose/commit/3edaa8651 + * docs; updated + * website; updated + +3.2.0 / 2012-09-27 +================== + + * added; direct array index assignment with casting support `doc.array.set(index, value)` + * fixed; QueryStream#resume within same tick as pause() #1116 + * fixed; default value validatation #1109 + * fixed; array splice() not casting #1123 + * fixed; default array construction edge case #1108 + * fixed; query casting for inequalities in arrays #1101 [dpatti](https://github.com/dpatti) + * tests; added + * website; more documentation + * website; fixed layout issue #1111 [SlashmanX](https://github.com/SlashmanX) + * website; refactored [guille](https://github.com/guille) + +3.1.2 / 2012-09-10 +================== + + * added; ReadPreferrence schema option #1097 + * updated; driver to 1.1.7 + * updated; default query batchSize to 1000 + * fixed; we now cast the mapReduce query option #1095 + * fixed; $elemMatch+$in with field selection #1091 + * fixed; properly cast $elemMatch+$in conditions #1100 + * fixed; default field application of subdocs #1027 + * fixed; querystream prematurely dying #1092 + * fixed; querystream never resumes when paused at getMore boundries #1092 + * fixed; querystream occasionally emits data events after destroy #1092 + * fixed; remove unnecessary ObjectId creation in querystream + * fixed; allow ne(boolean) again #1093 + * docs; add populate/field selection syntax notes + * docs; add toObject/toJSON options detail + * docs; `read` schema option + +3.1.1 / 2012-08-31 +================== + + * updated; driver to 1.1.6 + +3.1.0 / 2012-08-29 +================== + + * changed; fixed; directly setting nested objects now overwrites entire object (previously incorrectly merged them) + * added; read pref support (mongodb 2.2) 205a709c + * added; aggregate support (mongodb 2.2) f3a5bd3d + * added; virtual {g,s}etter introspection (#1070) + * updated; docs [brettz9](https://github.com/brettz9) + * updated; driver to 1.1.5 + * fixed; retain virtual setter return values (#1069) + +3.0.3 / 2012-08-23 +================== + + * fixed; use of nested paths beginning w/ numbers #1062 + * fixed; query population edge case #1053 #1055 [jfremy](https://github.com/jfremy) + * fixed; simultaneous top and sub level array modifications #1073 + * added; id and _id schema option aliases + tests + * improve debug formatting to allow copy/paste logged queries into mongo shell [eknkc](https://github.com/eknkc) + * docs + +3.0.2 / 2012-08-17 +================== + + * added; missing support for 
v3 sort/select syntax to findAndModify helpers (#1058) + * fixed; replset fullsetup event emission + * fixed; reconnected event for replsets + * fixed; server reconnection setting discovery + * fixed; compat with non-schema path props using positional notation (#1048) + * fixed; setter/casting order (#665) + * docs; updated + +3.0.1 / 2012-08-11 +================== + + * fixed; throw Error on bad validators (1044) + * fixed; typo in EmbeddedDocument#parentArray [lackac] + * fixed; repair mongoose.SchemaTypes alias + * updated; docs + +3.0.0 / 2012-08-07 +================== + + * removed; old subdocument#commit method + * fixed; setting arrays of matching docs [6924cbc2] + * fixed; doc!remove event now emits in save order as save for consistency + * fixed; pre-save hooks no longer fire on subdocuments when validation fails + * added; subdoc#parent() and subdoc#parentArray() to access subdocument parent objects + * added; query#lean() helper + +3.0.0rc0 / 2012-08-01 +===================== + + * fixed; allow subdoc literal declarations containing "type" pathname (#993) + * fixed; unsetting a default array (#758) + * fixed; boolean $in queries (#998) + * fixed; allow use of `options` as a pathname (#529) + * fixed; `model` is again a permitted schema path name + * fixed; field selection option on subdocs (#1022) + * fixed; handle another edge case with subdoc saving (#975) + * added; emit save err on model if listening + * added; MongoDB TTL collection support (#1006) + * added; $center options support + * added; $nearSphere and $polygon support + * updated; driver version to 1.1.2 + +3.0.0alpha2 / 2012-07-18 +========================= + + * changed; index errors are now emitted on their model and passed to an optional callback (#984) + * fixed; specifying index along with sparse/unique option no longer overwrites (#1004) + * fixed; never swallow connection errors (#618) + * fixed; creating object from model with emded object no longer overwrites defaults [achurkin] (#859) + * fixed; stop needless validation of unchanged/unselected fields (#891) + * fixed; document#equals behavior of objectids (#974) + * fixed; honor the minimize schema option (#978) + * fixed; provide helpful error msgs when reserved schema path is used (#928) + * fixed; callback to conn#disconnect is optional (#875) + * fixed; handle missing protocols in connection urls (#987) + * fixed; validate args to query#where (#969) + * fixed; saving modified/removed subdocs (#975) + * fixed; update with $pull from Mixed array (#735) + * fixed; error with null shard key value + * fixed; allow unsetting enums (#967) + * added; support for manual index creation (#984) + * added; support for disabled auto-indexing (#984) + * added; support for preserving MongooseArray#sort changes (#752) + * added; emit state change events on connection + * added; support for specifying BSON subtype in MongooseBuffer#toObject [jcrugzz] + * added; support for disabled versioning (#977) + * added; implicit "new" support for models and Schemas + +3.0.0alpha1 / 2012-06-15 +========================= + + * removed; doc#commit (use doc#markModified) + * removed; doc.modified getter (#950) + * removed; mongoose{connectSet,createSetConnection}. 
use connect,createConnection instead + * removed; query alias methods 1149804c + * removed; MongooseNumber + * changed; now creating indexes in background by default + * changed; strict mode now enabled by default (#952) + * changed; doc#modifiedPaths is now a method (#950) + * changed; getters no longer cast (#820); casting happens during set + * fixed; no need to pass updateArg to findOneAndUpdate (#931) + * fixed: utils.merge bug when merging nested non-objects. [treygriffith] + * fixed; strict:throw should produce errors in findAndModify (#963) + * fixed; findAndUpdate no longer overwrites document (#962) + * fixed; setting default DocumentArrays (#953) + * fixed; selection of _id with schema deselection (#954) + * fixed; ensure promise#error emits instanceof Error + * fixed; CursorStream: No stack overflow on any size result (#929) + * fixed; doc#remove now passes safe options + * fixed; invalid use of $set during $pop + * fixed; array#{$pop,$shift} mirror MongoDB behavior + * fixed; no longer test non-required vals in string match (#934) + * fixed; edge case with doc#inspect + * fixed; setter order (#665) + * fixed; setting invalid paths in strict mode (#916) + * fixed; handle docs without id in DocumentArray#id method (#897) + * fixed; do not save virtuals during model.update (#894) + * fixed; sub doc toObject virtuals application (#889) + * fixed; MongooseArray#pull of ObjectId (#881) + * fixed; handle passing db name with any repl set string + * fixed; default application of selected fields (#870) + * fixed; subdoc paths reported in validation errors (#725) + * fixed; incorrect reported num of affected docs in update ops (#862) + * fixed; connection assignment in Model#model (#853) + * fixed; stringifying arrays of docs (#852) + * fixed; modifying subdoc and parent array works (#842) + * fixed; passing undefined to next hook (#785) + * fixed; Query#{update,remove}() works without callbacks (#788) + * fixed; set/updating nested objects by parent pathname (#843) + * fixed; allow null in number arrays (#840) + * fixed; isNew on sub doc after insertion error (#837) + * fixed; if an insert fails, set isNew back to false [boutell] + * fixed; isSelected when only _id is selected (#730) + * fixed; setting an unset default value (#742) + * fixed; query#sort error messaging (#671) + * fixed; support for passing $options with $regex + * added; array of object literal notation in schema creates DocumentArrays + * added; gt,gte,lt,lte query support for arrays (#902) + * added; capped collection support (#938) + * added; document versioning support + * added; inclusion of deselected schema path (#786) + * added; non-atomic array#pop + * added; EmbeddedDocument constructor is now exposed in DocArray#create 7cf8beec + * added; mapReduce support (#678) + * added; support for a configurable minimize option #to{Object,JSON}(option) (#848) + * added; support for strict: `throws` [regality] + * added; support for named schema types (#795) + * added; to{Object,JSON} schema options (#805) + * added; findByIdAnd{Update,Remove}() + * added; findOneAnd{Update,Remove}() + * added; query.setOptions() + * added; instance.update() (#794) + * added; support specifying model in populate() [DanielBaulig] + * added; `lean` query option [gitfy] + * added; multi-atomic support to MongooseArray#nonAtomicPush + * added; support for $set + other $atomic ops on single array + * added; tests + * updated; driver to 1.0.2 + * updated; query.sort() syntax to mirror query.select() + * updated; clearer cast error msg for 
array numbers + * updated; docs + * updated; doc.clone 3x faster (#950) + * updated; only create _id if necessary (#950) + +2.7.3 / 2012-08-01 +================== + + * fixed; boolean $in queries (#998) + * fixed field selection option on subdocs (#1022) + +2.7.2 / 2012-07-18 +================== + + * fixed; callback to conn#disconnect is optional (#875) + * fixed; handle missing protocols in connection urls (#987) + * fixed; saving modified/removed subdocs (#975) + * updated; tests + +2.7.1 / 2012-06-26 +=================== + + * fixed; sharding: when a document holds a null as a value of the shard key + * fixed; update() using $pull on an array of Mixed (gh-735) + * deprecated; MongooseNumber#{inc, increment, decrement} methods + * tests; now using mocha + +2.7.0 / 2012-06-14 +=================== + + * added; deprecation warnings to methods being removed in 3.x + +2.6.8 / 2012-06-14 +=================== + + * fixed; edge case when using 'options' as a path name (#961) + +2.6.7 / 2012-06-08 +=================== + + * fixed; ensure promise#error always emits instanceof Error + * fixed; selection of _id w/ another excluded path (#954) + * fixed; setting default DocumentArrays (#953) + +2.6.6 / 2012-06-06 +=================== + + * fixed; stack overflow in query stream with large result sets (#929) + * added; $gt, $gte, $lt, $lte support to arrays (#902) + * fixed; pass option `safe` along to doc#remove() calls + +2.6.5 / 2012-05-24 +=================== + + * fixed; do not save virtuals in Model.update (#894) + * added; missing $ prefixed query aliases (going away in 3.x) (#884) [timoxley] + * fixed; setting invalid paths in strict mode (#916) + * fixed; resetting isNew after insert failure (#837) [boutell] + +2.6.4 / 2012-05-15 +=================== + + * updated; backport string regex $options to 2.x + * updated; use driver 1.0.2 (performance improvements) (#914) + * fixed; calling MongooseDocumentArray#id when the doc has no _id (#897) + +2.6.3 / 2012-05-03 +=================== + + * fixed; repl-set connectivity issues during failover on MongoDB 2.0.1 + * updated; driver to 1.0.0 + * fixed; virtuals application of subdocs when using toObject({ virtuals: true }) (#889) + * fixed; MongooseArray#pull of ObjectId correctly updates the array itself (#881) + +2.6.2 / 2012-04-30 +=================== + + * fixed; default field application of selected fields (#870) + +2.6.1 / 2012-04-30 +=================== + + * fixed; connection assignment in mongoose#model (#853, #877) + * fixed; incorrect reported num of affected docs in update ops (#862) + +2.6.0 / 2012-04-19 +=================== + + * updated; hooks.js to 0.2.1 + * fixed; issue with passing undefined to a hook callback. thanks to [chrisleishman] for reporting. 
+ * fixed; updating/setting nested objects in strict schemas (#843) as reported by [kof] + * fixed; Query#{update,remove}() work without callbacks again (#788) + * fixed; modifying subdoc along with parent array $atomic op (#842) + +2.5.14 / 2012-04-13 +=================== + + * fixed; setting an unset default value (#742) + * fixed; doc.isSelected(otherpath) when only _id is selected (#730) + * updated; docs + +2.5.13 / 2012-03-22 +=================== + + * fixed; failing validation of unselected required paths (#730,#713) + * fixed; emitting connection error when only one listener (#759) + * fixed; MongooseArray#splice was not returning values (#784) [chrisleishman] + +2.5.12 / 2012-03-21 +=================== + + * fixed; honor the `safe` option in all ensureIndex calls + * updated; node-mongodb-native driver to 0.9.9-7 + +2.5.11 / 2012-03-15 +=================== + + * added; introspection for getters/setters (#745) + * updated; node-mongodb-driver to 0.9.9-5 + * added; tailable method to Query (#769) [holic] + * fixed; Number min/max validation of null (#764) [btamas] + * added; more flexible user/password connection options (#738) [KarneAsada] + +2.5.10 / 2012-03-06 +=================== + + * updated; node-mongodb-native driver to 0.9.9-4 + * added; Query#comment() + * fixed; allow unsetting arrays + * fixed; hooking the set method of subdocuments (#746) + * fixed; edge case in hooks + * fixed; allow $id and $ref in queries (fixes compatibility with mongoose-dbref) (#749) [richtera] + * added; default path selection to SchemaTypes + +2.5.9 / 2012-02-22 +=================== + + * fixed; properly cast nested atomic update operators for sub-documents + +2.5.8 / 2012-02-21 +=================== + + * added; post 'remove' middleware includes model that was removed (#729) [timoxley] + +2.5.7 / 2012-02-09 +=================== + + * fixed; RegExp validators on node >= v0.6.x + +2.5.6 / 2012-02-09 +=================== + + * fixed; emit errors returned from db.collection() on the connection (were being swallowed) + * added; can add multiple validators in your schema at once (#718) [diogogmt] + * fixed; strict embedded documents (#717) + * updated; docs [niemyjski] + * added; pass number of affected docs back in model.update/save + +2.5.5 / 2012-02-03 +=================== + + * fixed; RangeError: maximum call stack exceed error when removing docs with Number _id (#714) + +2.5.4 / 2012-02-03 +=================== + + * fixed; RangeError: maximum call stack exceed error (#714) + +2.5.3 / 2012-02-02 +=================== + + * added; doc#isSelected(path) + * added; query#equals() + * added; beta sharding support + * added; more descript error msgs (#700) [obeleh] + * added; document.modifiedPaths (#709) [ljharb] + * fixed; only functions can be added as getters/setters (#707,704) [ljharb] + +2.5.2 / 2012-01-30 +=================== + + * fixed; rollback -native driver to 0.9.7-3-5 (was causing timeouts and other replica set weirdness) + * deprecated; MongooseNumber (will be moved to a separate repo for 3.x) + * added; init event is emitted on schemas + +2.5.1 / 2012-01-27 +=================== + + * fixed; honor strict schemas in Model.update (#699) + +2.5.0 / 2012-01-26 +=================== + + * added; doc.toJSON calls toJSON on embedded docs when exists [jerem] + * added; populate support for refs of type Buffer (#686) [jerem] + * added; $all support for ObjectIds and Dates (#690) + * fixed; virtual setter calling on instantiation when strict: true (#682) [hunterloftis] + * fixed; doc construction 
triggering getters (#685) + * fixed; MongooseBuffer check in deepEquals (#688) + * fixed; range error when using Number _ids with `instance.save()` (#691) + * fixed; isNew on embedded docs edge case (#680) + * updated; driver to 0.9.8-3 + * updated; expose `model()` method within static methods + +2.4.10 / 2012-01-10 +=================== + + * added; optional getter application in .toObject()/.toJSON() (#412) + * fixed; nested $operators in $all queries (#670) + * added; $nor support (#674) + * fixed; bug when adding nested schema (#662) [paulwe] + +2.4.9 / 2012-01-04 +=================== + + * updated; driver to 0.9.7-3-5 to fix Linux performance degradation on some boxes + +2.4.8 / 2011-12-22 +=================== + + * updated; bump -native to 0.9.7.2-5 + * fixed; compatibility with date.js (#646) [chrisleishman] + * changed; undocumented schema "lax" option to "strict" + * fixed; default value population for strict schemas + * updated; the nextTick helper for small performance gain. 1bee2a2 + +2.4.7 / 2011-12-16 +=================== + + * fixed; bug in 2.4.6 with path setting + * updated; bump -native to 0.9.7.2-1 + * added; strict schema option [nw] + +2.4.6 / 2011-12-16 +=================== + + * fixed; conflicting mods on update bug [sirlantis] + * improved; doc.id getter performance + +2.4.5 / 2011-12-14 +=================== + + * fixed; bad MongooseArray behavior in 2.4.2 - 2.4.4 + +2.4.4 / 2011-12-14 +=================== + + * fixed; MongooseArray#doAtomics throwing after sliced + +2.4.3 / 2011-12-14 +=================== + + * updated; system.profile schema for MongoDB 2x + +2.4.2 / 2011-12-12 +=================== + + * fixed; partially populating multiple children of subdocs (#639) [kenpratt] + * fixed; allow Update of numbers to null (#640) [jerem] + +2.4.1 / 2011-12-02 +=================== + + * added; options support for populate() queries + * updated; -native driver to 0.9.7-1.4 + +2.4.0 / 2011-11-29 +=================== + + * added; QueryStreams (#614) + * added; debug print mode for development + * added; $within support to Array queries (#586) [ggoodale] + * added; $centerSphere query support + * fixed; $within support + * added; $unset is now used when setting a path to undefined (#519) + * added; query#batchSize support + * updated; docs + * updated; -native driver to 0.9.7-1.3 (provides Windows support) + +2.3.13 / 2011-11-15 +=================== + + * fixed; required validation for Refs (#612) [ded] + * added; $nearSphere support for Arrays (#610) + +2.3.12 / 2011-11-09 +=================== + + * fixed; regression, objects passed to Model.update should not be changed (#605) + * fixed; regression, empty Model.update should not be executed + +2.3.11 / 2011-11-08 +=================== + + * fixed; using $elemMatch on arrays of Mixed types (#591) + * fixed; allow using $regex when querying Arrays (#599) + * fixed; calling Model.update with no atomic keys (#602) + +2.3.10 / 2011-11-05 +=================== + + * fixed; model.update casting for nested paths works (#542) + +2.3.9 / 2011-11-04 +================== + + * fixed; deepEquals check for MongooseArray returned false + * fixed; reset modified flags of embedded docs after save [gitfy] + * fixed; setting embedded doc with identical values no longer marks modified [gitfy] + * updated; -native driver to 0.9.6.23 [mlazarov] + * fixed; Model.update casting (#542, #545, #479) + * fixed; populated refs no longer fail required validators (#577) + * fixed; populating refs of objects with custom ids works + * fixed; $pop & $unset 
work with Model.update (#574) + * added; more helpful debugging message for Schema#add (#578) + * fixed; accessing .id when no _id exists now returns null (#590) + +2.3.8 / 2011-10-26 +================== + + * added; callback to query#findOne is now optional (#581) + +2.3.7 / 2011-10-24 +================== + + * fixed; wrapped save/remove callbacks in nextTick to mitigate -native swallowing thrown errors + +2.3.6 / 2011-10-21 +================== + + * fixed; exclusion of embedded doc _id from query results (#541) + +2.3.5 / 2011-10-19 +================== + + * fixed; calling queries without passing a callback works (#569) + * fixed; populate() works with String and Number _ids too (#568) + +2.3.4 / 2011-10-18 +================== + + * added; Model.create now accepts an array as a first arg + * fixed; calling toObject on a DocumentArray with nulls no longer throws + * fixed; calling inspect on a DocumentArray with nulls no longer throws + * added; MongooseArray#unshift support + * fixed; save hooks now fire on embedded documents [gitfy] (#456) + * updated; -native driver to 0.9.6-22 + * fixed; correctly pass $addToSet op instead of $push + * fixed; $addToSet properly detects dates + * fixed; $addToSet with multiple items works + * updated; better node 0.6 Buffer support + +2.3.3 / 2011-10-12 +================== + + * fixed; population conditions in multi-query settings [vedmalex] (#563) + * fixed; now compatible with Node v0.5.x + +2.3.2 / 2011-10-11 +================== + + * fixed; population of null subdoc properties no longer hangs (#561) + +2.3.1 / 2011-10-10 +================== + + * added; support for Query filters to populate() [eneko] + * fixed; querying with number no longer crashes mongodb (#555) [jlbyrey] + * updated; version of -native driver to 0.9.6-21 + * fixed; prevent query callbacks that throw errors from corrupting -native connection state + +2.3.0 / 2011-10-04 +================== + + * fixed; nulls as default values for Boolean now works as expected + * updated; version of -native driver to 0.9.6-20 + +2.2.4 / 2011-10-03 +================== + + * fixed; populate() works when returned array contains undefined/nulls + +2.2.3 / 2011-09-29 +================== + + * updated; version of -native driver to 0.9.6-19 + +2.2.2 / 2011-09-28 +================== + + * added; $regex support to String [davidandrewcope] + * added; support for other contexts like repl etc (#535) + * fixed; clear modified state properly after saving + * added; $addToSet support to Array + +2.2.1 / 2011-09-22 +================== + + * more descript error when casting undefined to string + * updated; version of -native driver to 0.9.6-18 + +2.2.0 / 2011-09-22 +================== + + * fixed; maxListeners warning on schemas with many arrays (#530) + * changed; return / apply defaults based on fields selected in query (#423) + * fixed; correctly detect Mixed types within schema arrays (#532) + +2.1.4 / 2011-09-20 +================== + + * fixed; new private methods that stomped on users code + * changed; finished removing old "compat" support which did nothing + +2.1.3 / 2011-09-16 +================== + + * updated; version of -native driver to 0.9.6-15 + * added; emit `error` on connection when open fails [edwardhotchkiss] + * added; index support to Buffers (thanks justmoon for helping track this down) + * fixed; passing collection name via schema in conn.model() now works (thanks vedmalex for reporting) + +2.1.2 / 2011-09-07 +================== + + * fixed; Query#find with no args no longer throws + +2.1.1 
/ 2011-09-07 +================== + + * added; support Model.count(fn) + * fixed; compatibility with node >=0.4.0 < 0.4.3 + * added; pass model.options.safe through with .save() so w:2, wtimeout:5000 options work [andrewjstone] + * added; support for $type queries + * added; support for Query#or + * added; more tests + * optimized populate queries + +2.1.0 / 2011-09-01 +================== + + * changed; document#validate is a public method + * fixed; setting number to same value no longer marks modified (#476) [gitfy] + * fixed; Buffers shouldn't have default vals + * added; allow specifying collection name in schema (#470) [ixti] + * fixed; reset modified paths and atomics after saved (#459) + * fixed; set isNew on embedded docs to false after save + * fixed; use self to ensure proper scope of options in doOpenSet (#483) [andrewjstone] + +2.0.4 / 2011-08-29 +================== + + * Fixed; Only send the depopulated ObjectId instead of the entire doc on save (DBRefs) + * Fixed; Properly cast nested array values in Model.update (the data was stored in Mongo incorrectly but recast on document fetch was "fixing" it) + +2.0.3 / 2011-08-28 +================== + + * Fixed; manipulating a populated array no longer causes infinite loop in BSON serializer during save (#477) + * Fixed; populating an empty array no longer hangs foreeeeeeeever (#481) + +2.0.2 / 2011-08-25 +================== + + * Fixed; Maintain query option key order (fixes 'bad hint' error from compound query hints) + +2.0.1 / 2011-08-25 +================== + + * Fixed; do not over-write the doc when no valide props exist in Model.update (#473) + +2.0.0 / 2011-08-24 +=================== + + * Added; support for Buffers [justmoon] + * Changed; improved error handling [maelstrom] + * Removed: unused utils.erase + * Fixed; support for passing other context object into Schemas (#234) [Sija] + * Fixed; getters are no longer circular refs to themselves (#366) + * Removed; unused compat.js + * Fixed; getter/setter scopes are set properly + * Changed; made several private properties more obvious by prefixing _ + * Added; DBRef support [guille] + * Changed; removed support for multiple collection names per model + * Fixed; no longer applying setters when document returned from db + * Changed; default auto_reconnect to true + * Changed; Query#bind no longer clones the query + * Fixed; Model.update now accepts $pull, $inc and friends (#404) + * Added; virtual type option support [nw] + +1.8.4 / 2011-08-21 +=================== + + * Fixed; validation bug when instantiated with non-schema properties (#464) [jmreidy] + +1.8.3 / 2011-08-19 +=================== + + * Fixed; regression in connection#open [jshaw86] + +1.8.2 / 2011-08-17 +=================== + + * fixed; reset connection.readyState after failure [tomseago] + * fixed; can now query positionally for non-embedded docs (arrays of numbers/strings etc) + * fixed; embedded document query casting + * added; support for passing options to node-mongo-native db, server, and replsetserver [tomseago] + +1.8.1 / 2011-08-10 +=================== + + * fixed; ObjectIds were always marked modified + * fixed; can now query using document instances + * fixed; can now query/update using documents with subdocs + +1.8.0 / 2011-08-04 +=================== + + * fixed; can now use $all with String and Number + * fixed; can query subdoc array with $ne: null + * fixed; instance.subdocs#id now works with custom _ids + * fixed; do not apply setters when doc returned from db (change in bad behavior) + +1.7.4 / 
2011-07-25 +=================== + + * fixed; sparse now a valid seperate schema option + * fixed; now catching cast errors in queries + * fixed; calling new Schema with object created in vm.runInNewContext now works (#384) [Sija] + * fixed; String enum was disallowing null + * fixed; Find by nested document _id now works (#389) + +1.7.3 / 2011-07-16 +=================== + + * fixed; MongooseArray#indexOf now works with ObjectIds + * fixed; validation scope now set properly (#418) + * fixed; added missing colors dependency (#398) + +1.7.2 / 2011-07-13 +=================== + + * changed; node-mongodb-native driver to v0.9.6.7 + +1.7.1 / 2011-07-12 +=================== + + * changed; roll back node-mongodb-native driver to v0.9.6.4 + +1.7.0 / 2011-07-12 +=================== + + * fixed; collection name misspelling [mathrawka] + * fixed; 2nd param is required for ReplSetServers [kevinmarvin] + * fixed; MongooseArray behaves properly with Object.keys + * changed; node-mongodb-native driver to v0.9.6.6 + * fixed/changed; Mongodb segfault when passed invalid ObjectId (#407) + - This means invalid data passed to the ObjectId constructor will now error + +1.6.0 / 2011-07-07 +=================== + + * changed; .save() errors are now emitted on the instances db instead of the instance 9782463fc + * fixed; errors occurring when creating indexes now properly emit on db + * added; $maxDistance support to MongooseArrays + * fixed; RegExps now work with $all + * changed; node-mongodb-native driver to v0.9.6.4 + * fixed; model names are now accessible via .modelName + * added; Query#slaveOk support + +1.5.0 / 2011-06-27 +=================== + + * changed; saving without a callback no longer ignores the error (@bnoguchi) + * changed; hook-js version bump to 0.1.9 + * changed; node-mongodb-native version bumped to 0.9.6.1 - When .remove() doesn't + return an error, null is no longer passed. 
+ * fixed; two memory leaks (@justmoon) + * added; sparse index support + * added; more ObjectId conditionals (gt, lt, gte, lte) (@phillyqueso) + * added; options are now passed in model#remote (@JerryLuke) + +1.4.0 / 2011-06-10 +=================== + + * bumped hooks-js dependency (fixes issue passing null as first arg to next()) + * fixed; document#inspect now works properly with nested docs + * fixed; 'set' now works as a schema attribute (GH-365) + * fixed; _id is now set properly within pre-init hooks (GH-289) + * added; Query#distinct / Model#distinct support (GH-155) + * fixed; embedded docs now can use instance methods (GH-249) + * fixed; can now overwrite strings conflicting with schema type + +1.3.7 / 2011-06-03 +=================== + + * added MongooseArray#splice support + * fixed; 'path' is now a valid Schema pathname + * improved hooks (utilizing https://github.com/bnoguchi/hooks-js) + * fixed; MongooseArray#$shift now works (never did) + * fixed; Document.modified no longer throws + * fixed; modifying subdoc property sets modified paths for subdoc and parent doc + * fixed; marking subdoc path as modified properly persists the value to the db + * fixed; RexExps can again be saved ( #357 ) + +1.3.6 / 2011-05-18 +=================== + + * fixed; corrected casting for queries against array types + * added; Document#set now accepts Document instances + +1.3.5 / 2011-05-17 +=================== + + * fixed; $ne queries work properly with single vals + * added; #inspect() methods to improve console.log output + +1.3.4 / 2011-05-17 +=================== + + * fixed; find by Date works as expected (#336) + * added; geospatial 2d index support + * added; support for $near (#309) + * updated; node-mongodb-native driver + * fixed; updating numbers work (#342) + * added; better error msg when try to remove an embedded doc without an _id (#307) + * added; support for 'on-the-fly' schemas (#227) + * changed; virtual id getters can now be skipped + * fixed; .index() called on subdoc schema now works as expected + * fixed; db.setProfile() now buffers until the db is open (#340) + +1.3.3 / 2011-04-27 +=================== + + * fixed; corrected query casting on nested mixed types + +1.3.2 / 2011-04-27 +=================== + + * fixed; query hints now retain key order + +1.3.1 / 2011-04-27 +=================== + + * fixed; setting a property on an embedded array no longer overwrites entire array (GH-310) + * fixed; setting nested properties works when sibling prop is named "type" + * fixed; isModified is now much finer grained when .set() is used (GH-323) + * fixed; mongoose.model() and connection.model() now return the Model (GH-308, GH-305) + * fixed; can now use $gt, $lt, $gte, $lte with String schema types (GH-317) + * fixed; .lowercase() -> .toLowerCase() in pluralize() + * fixed; updating an embedded document by index works (GH-334) + * changed; .save() now passes the instance to the callback (GH-294, GH-264) + * added; can now query system.profile and system.indexes collections + * added; db.model('system.profile') is now included as a default Schema + * added; db.setProfiling(level, ms, callback) + * added; Query#hint() support + * added; more tests + * updated node-mongodb-native to 0.9.3 + +1.3.0 / 2011-04-19 +=================== + + * changed; save() callbacks now fire only once on failed validation + * changed; Errors returned from save() callbacks now instances of ValidationError + * fixed; MongooseArray#indexOf now works properly + +1.2.0 / 2011-04-11 +=================== + + * 
changed; MongooseNumber now casts empty string to null + +1.1.25 / 2011-04-08 +=================== + + * fixed; post init now fires at proper time + +1.1.24 / 2011-04-03 +=================== + + * fixed; pushing an array onto an Array works on existing docs + +1.1.23 / 2011-04-01 +=================== + + * Added Model#model + +1.1.22 / 2011-03-31 +=================== + + * Fixed; $in queries on mixed types now work + +1.1.21 / 2011-03-31 +=================== + + * Fixed; setting object root to null/undefined works + +1.1.20 / 2011-03-31 +=================== + + * Fixed; setting multiple props on null field works + +1.1.19 / 2011-03-31 +=================== + + * Fixed; no longer using $set on paths to an unexisting fields + +1.1.18 / 2011-03-30 +=================== + + * Fixed; non-mixed type object setters work after initd from null + +1.1.17 / 2011-03-30 +=================== + + * Fixed; nested object property access works when root initd with null value + +1.1.16 / 2011-03-28 +=================== + + * Fixed; empty arrays are now saved + +1.1.15 / 2011-03-28 +=================== + + * Fixed; `null` and `undefined` are set atomically. + +1.1.14 / 2011-03-28 +=================== + + * Changed; more forgiving date casting, accepting '' as null. + +1.1.13 / 2011-03-26 +=================== + + * Fixed setting values as `undefined`. + +1.1.12 / 2011-03-26 +=================== + + * Fixed; nested objects now convert to JSON properly + * Fixed; setting nested objects directly now works + * Update node-mongodb-native + +1.1.11 / 2011-03-25 +=================== + + * Fixed for use of `type` as a key. + +1.1.10 / 2011-03-23 +=================== + + * Changed; Make sure to only ensure indexes while connected + +1.1.9 / 2011-03-2 +================== + + * Fixed; Mixed can now default to empty arrays + * Fixed; keys by the name 'type' are now valid + * Fixed; null values retrieved from the database are hydrated as null values. + * Fixed repeated atomic operations when saving a same document twice. + +1.1.8 / 2011-03-23 +================== + + * Fixed 'id' overriding. [bnoguchi] + +1.1.7 / 2011-03-22 +================== + + * Fixed RegExp query casting when querying against an Array of Strings [bnoguchi] + * Fixed getters/setters for nested virtualsl. [bnoguchi] + +1.1.6 / 2011-03-22 +================== + + * Only doValidate when path exists in Schema [aheckmann] + * Allow function defaults for Array types [aheckmann] + * Fix validation hang [aheckmann] + * Fix setting of isRequired of SchemaType [aheckmann] + * Fix SchemaType#required(false) filter [aheckmann] + * More backwards compatibility [aheckmann] + * More tests [aheckmann] + +1.1.5 / 2011-03-14 +================== + + * Added support for `uri, db, fn` and `uri, fn` signatures for replica sets. + * Improved/extended replica set tests. + +1.1.4 / 2011-03-09 +================== + + * Fixed; running an empty Query doesn't throw. [aheckmann] + * Changed; Promise#addBack returns promise. [aheckmann] + * Added streaming cursor support. [aheckmann] + * Changed; Query#update defaults to use$SetOnSave now. [brian] + * Added more docs. 
+ +1.1.3 / 2011-03-04 +================== + + * Added Promise#resolve [aheckmann] + * Fixed backward compatibility with nulls [aheckmann] + * Changed; Query#{run,exec} return promises [aheckmann] + +1.1.2 / 2011-03-03 +================== + + * Restored Query#exec and added notion of default operation [brian] + * Fixed ValidatorError messages [brian] + +1.1.1 / 2011-03-01 +================== + + * Added SchemaType String `lowercase`, `uppercase`, `trim`. + * Public exports (`Model`, `Document`) and tests. + * Added ObjectId casting support for `Document`s. + +1.1.0 / 2011-02-25 +================== + + * Added support for replica sets. + +1.0.16 / 2011-02-18 +=================== + + * Added $nin as another whitelisted $conditional for SchemaArray [brian] + * Changed #with to #where [brian] + * Added ability to use $in conditional with Array types [brian] + +1.0.15 / 2011-02-18 +=================== + + * Added `id` virtual getter for documents to easily access the hexString of + the `_id`. + +1.0.14 / 2011-02-17 +=================== + + * Fix for arrays within subdocuments [brian] + +1.0.13 / 2011-02-16 +=================== + + * Fixed embedded documents saving. + +1.0.12 / 2011-02-14 +=================== + + * Minor refactorings [brian] + +1.0.11 / 2011-02-14 +=================== + + * Query refactor and $ne, $slice, $or, $size, $elemMatch, $nin, $exists support [brian] + * Named scopes sugar [brian] + +1.0.10 / 2011-02-11 +=================== + + * Updated node-mongodb-native driver [thanks John Allen] + +1.0.9 / 2011-02-09 +================== + + * Fixed single member arrays as defaults [brian] + +1.0.8 / 2011-02-09 +================== + + * Fixed for collection-level buffering of commands [gitfy] + * Fixed `Document#toJSON` [dalejefferson] + * Fixed `Connection` authentication [robrighter] + * Fixed clash of accessors in getters/setters [eirikurn] + * Improved `Model#save` promise handling + +1.0.7 / 2011-02-05 +================== + + * Fixed memory leak warnings for test suite on 0.3 + * Fixed querying documents that have an array that contain at least one + specified member. [brian] + * Fixed default value for Array types (fixes GH-210). [brian] + * Fixed example code. + +1.0.6 / 2011-02-03 +================== + + * Fixed `post` middleware + * Fixed; it's now possible to instantiate a model even when one of the paths maps + to an undefined value [brian] + +1.0.5 / 2011-02-02 +================== + + * Fixed; combo $push and $pushAll auto-converts into a $pushAll [brian] + * Fixed; combo $pull and $pullAll auto-converts to a single $pullAll [brian] + * Fixed; $pullAll now removes said members from array before save (so it acts just + like pushAll) [brian] + * Fixed; multiple $pulls and $pushes become a single $pullAll and $pushAll. + Moreover, $pull now modifies the array before save to reflect the immediate + change [brian] + * Added tests for nested shortcut getters [brian] + * Added tests that show that Schemas with nested Arrays don't apply defaults + [brian] + +1.0.4 / 2011-02-02 +================== + + * Added MongooseNumber#toString + * Added MongooseNumber unit tests + +1.0.3 / 2011-02-02 +================== + + * Make sure safe mode works with Model#save + * Changed Schema options: safe mode is now the default + * Updated node-mongodb-native to HEAD + +1.0.2 / 2011-02-02 +================== + + * Added a Model.create shortcut for creating documents. [brian] + * Fixed; we can now instantiate models with hashes that map to at least one + null value. 
[brian] + * Fixed Schema with more than 2 nested levels. [brian] + +1.0.1 / 2011-02-02 +================== + + * Improved `MongooseNumber`, works almost like the native except for `typeof` + not being `'number'`. diff --git a/node_modules/mongoose/History.md b/node_modules/mongoose/History.md new file mode 100644 index 000000000..66cd4ffa3 --- /dev/null +++ b/node_modules/mongoose/History.md @@ -0,0 +1 @@ +This file has been renamed [CHANGELOG.md](./CHANGELOG.md) diff --git a/node_modules/mongoose/LICENSE.md b/node_modules/mongoose/LICENSE.md new file mode 100644 index 000000000..54b4a4c6c --- /dev/null +++ b/node_modules/mongoose/LICENSE.md @@ -0,0 +1,22 @@ +# MIT License + +Copyright (c) 2010-2013 LearnBoost +Copyright (c) 2013-2021 Automattic + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/mongoose/README.md b/node_modules/mongoose/README.md new file mode 100644 index 000000000..90a788d7a --- /dev/null +++ b/node_modules/mongoose/README.md @@ -0,0 +1,366 @@ +# Mongoose + +Mongoose is a [MongoDB](https://www.mongodb.org/) object modeling tool designed to work in an asynchronous environment. Mongoose supports both promises and callbacks. + +[![Slack Status](http://slack.mongoosejs.io/badge.svg)](http://slack.mongoosejs.io) +[![Build Status](https://github.com/Automattic/mongoose/workflows/Test/badge.svg)](https://github.com/Automattic/mongoose) +[![NPM version](https://badge.fury.io/js/mongoose.svg)](http://badge.fury.io/js/mongoose) + +[![npm](https://nodei.co/npm/mongoose.png)](https://www.npmjs.com/package/mongoose) + +## Documentation + +The official documentation website is [mongoosejs.com](http://mongoosejs.com/). + +Mongoose 6.0.0 was released on August 24, 2021. You can find more details on [backwards breaking changes in 6.0.0 on our docs site](https://mongoosejs.com/docs/migrating_to_6.html). + +## Support + + - [Stack Overflow](http://stackoverflow.com/questions/tagged/mongoose) + - [Bug Reports](https://github.com/Automattic/mongoose/issues/) + - [Mongoose Slack Channel](http://slack.mongoosejs.io/) + - [Help Forum](http://groups.google.com/group/mongoose-orm) + - [MongoDB Support](https://docs.mongodb.org/manual/support/) + +## Plugins + +Check out the [plugins search site](http://plugins.mongoosejs.io/) to see hundreds of related modules from the community. Next, learn how to write your own plugin from the [docs](http://mongoosejs.com/docs/plugins.html) or [this blog post](http://thecodebarbarian.com/2015/03/06/guide-to-mongoose-plugins). 
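+
+As a taste of what writing one involves, here is a minimal sketch of a plugin (the `lastModifiedPlugin` name and the `lastModified` field are illustrative placeholders, not a published plugin): a plugin is just a function that receives a `schema` and an optional `options` object and augments that schema.
+
+```js
+const mongoose = require('mongoose');
+
+// Illustrative plugin: adds a `lastModified` path to any schema it is applied to
+// and refreshes that path before every save. The `options` argument is unused
+// in this sketch but is where per-schema configuration would normally go.
+function lastModifiedPlugin(schema, options) {
+  schema.add({ lastModified: Date });
+
+  schema.pre('save', function (next) {
+    this.lastModified = new Date();
+    next();
+  });
+}
+
+// Apply it to a single schema...
+const gameSchema = new mongoose.Schema({ name: String });
+gameSchema.plugin(lastModifiedPlugin);
+
+// ...or register it globally so every schema created afterwards gets it.
+mongoose.plugin(lastModifiedPlugin);
+```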
+ +## Contributors + +Pull requests are always welcome! Please base pull requests against the `master` +branch and follow the [contributing guide](https://github.com/Automattic/mongoose/blob/master/CONTRIBUTING.md). + +If your pull request makes documentation changes, please do **not** +modify any `.html` files. The `.html` files are compiled code, so please make +your changes in `docs/*.pug`, `lib/*.js`, or `test/docs/*.js`. + +View all 400+ [contributors](https://github.com/Automattic/mongoose/graphs/contributors). + +## Installation + +First install [Node.js](http://nodejs.org/) and [MongoDB](https://www.mongodb.org/downloads). Then: + +```sh +$ npm install mongoose +``` + +## Importing + +```javascript +// Using Node.js `require()` +const mongoose = require('mongoose'); + +// Using ES6 imports +import mongoose from 'mongoose'; +``` + +## Mongoose for Enterprise + +Available as part of the Tidelift Subscription. + +The maintainers of mongoose and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. [Learn more.](https://tidelift.com/subscription/pkg/npm-mongoose?utm_source=npm-mongoose&utm_medium=referral&utm_campaign=enterprise&utm_term=repo) + +## Overview + +### Connecting to MongoDB + +First, we need to define a connection. If your app uses only one database, you should use `mongoose.connect`. If you need to create additional connections, use `mongoose.createConnection`. + +Both `connect` and `createConnection` take a `mongodb://` URI, or the parameters `host, database, port, options`. + +```js +await mongoose.connect('mongodb://localhost/my_database'); +``` + +Once connected, the `open` event is fired on the `Connection` instance. If you're using `mongoose.connect`, the `Connection` is `mongoose.connection`. Otherwise, the return value of `mongoose.createConnection` is a `Connection`. + +**Note:** _If the local connection fails, try using 127.0.0.1 instead of localhost. Sometimes issues may arise when the local hostname has been changed._ + +**Important!** Mongoose buffers all the commands until it's connected to the database. This means that you don't have to wait until it connects to MongoDB in order to define models, run queries, etc. + +### Defining a Model + +Models are defined through the `Schema` interface. 
+ +```js +const Schema = mongoose.Schema; +const ObjectId = Schema.ObjectId; + +const BlogPost = new Schema({ + author: ObjectId, + title: String, + body: String, + date: Date +}); +``` + +Aside from defining the structure of your documents and the types of data you're storing, a Schema handles the definition of: + +* [Validators](http://mongoosejs.com/docs/validation.html) (async and sync) +* [Defaults](http://mongoosejs.com/docs/api.html#schematype_SchemaType-default) +* [Getters](http://mongoosejs.com/docs/api.html#schematype_SchemaType-get) +* [Setters](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set) +* [Indexes](http://mongoosejs.com/docs/guide.html#indexes) +* [Middleware](http://mongoosejs.com/docs/middleware.html) +* [Methods](http://mongoosejs.com/docs/guide.html#methods) definition +* [Statics](http://mongoosejs.com/docs/guide.html#statics) definition +* [Plugins](http://mongoosejs.com/docs/plugins.html) +* [pseudo-JOINs](http://mongoosejs.com/docs/populate.html) + +The following example shows some of these features: + +```js +const Comment = new Schema({ + name: { type: String, default: 'hahaha' }, + age: { type: Number, min: 18, index: true }, + bio: { type: String, match: /[a-z]/ }, + date: { type: Date, default: Date.now }, + buff: Buffer +}); + +// a setter +Comment.path('name').set(function (v) { + return capitalize(v); +}); + +// middleware +Comment.pre('save', function (next) { + notify(this.get('email')); + next(); +}); +``` + +Take a look at the example in [`examples/schema/schema.js`](https://github.com/Automattic/mongoose/blob/master/examples/schema/schema.js) for an end-to-end example of a typical setup. + +### Accessing a Model + +Once we define a model through `mongoose.model('ModelName', mySchema)`, we can access it through the same function: + +```js +const MyModel = mongoose.model('ModelName'); +``` + +Or just do it all at once: + +```js +const MyModel = mongoose.model('ModelName', mySchema); +``` + +The first argument is the _singular_ name of the collection your model is for. **Mongoose automatically looks for the _plural_ version of your model name.** For example, if you use + +```js +const MyModel = mongoose.model('Ticket', mySchema); +``` + +then Mongoose will create the model for your __tickets__ collection, not your __ticket__ collection. + +Once we have our model, we can then instantiate it and save it: + +```js +const instance = new MyModel(); +instance.my.key = 'hello'; +instance.save(function (err) { + // +}); +``` + +Or we can find documents from the same collection: + +```js +MyModel.find({}, function (err, docs) { + // docs.forEach +}); +``` + +You can also `findOne`, `findById`, `update`, etc. + +```js +const instance = await MyModel.findOne({ ... }); +console.log(instance.my.key); // 'hello' +``` + +For more details, check out [the docs](http://mongoosejs.com/docs/queries.html). + +**Important!** If you opened a separate connection using `mongoose.createConnection()` but attempt to access the model through `mongoose.model('ModelName')`, it will not work as expected since it is not hooked up to an active db connection. 
+**Important!** If you opened a separate connection using `mongoose.createConnection()` but attempt to access the model through `mongoose.model('ModelName')`, it will not work as expected, since it is not hooked up to an active db connection. In this case, access your model through the connection you created:
+
+```js
+const conn = mongoose.createConnection('your connection string');
+const MyModel = conn.model('ModelName', schema);
+const m = new MyModel;
+m.save(); // works
+```
+
+vs
+
+```js
+const conn = mongoose.createConnection('your connection string');
+const MyModel = mongoose.model('ModelName', schema);
+const m = new MyModel;
+m.save(); // does not work b/c the default connection object was never connected
+```
+
+### Embedded Documents
+
+Suppose the `BlogPost` schema from the first example also defined a key that looks like:
+
+```
+comments: [Comment]
+```
+
+Where `Comment` is a `Schema` we created. This means that creating embedded documents is as simple as:
+
+```js
+// retrieve my model
+const BlogPost = mongoose.model('BlogPost');
+
+// create a blog post
+const post = new BlogPost();
+
+// create a comment
+post.comments.push({ title: 'My comment' });
+
+post.save(function (err) {
+  if (!err) console.log('Success!');
+});
+```
+
+The same goes for removing them:
+
+```js
+BlogPost.findById(myId, function (err, post) {
+  if (!err) {
+    post.comments[0].remove();
+    post.save(function (err) {
+      // do something
+    });
+  }
+});
+```
+
+Embedded documents enjoy all the same features as your models: defaults, validators, middleware. Whenever an error occurs, it's bubbled to the `save()` error callback, so error handling is a snap!
+
+### Middleware
+
+See the [docs](http://mongoosejs.com/docs/middleware.html) page.
+
+#### Intercepting and mutating method arguments
+
+You can intercept method arguments via middleware.
+
+For example, this would allow you to broadcast changes about your Documents every time someone `set`s a path in your Document to a new value:
+
+```js
+schema.pre('set', function (next, path, val, typel) {
+  // `this` is the current Document
+  this.emit('set', path, val);
+
+  // Pass control to the next pre
+  next();
+});
+```
+
+Moreover, you can mutate the incoming `method` arguments so that subsequent middleware see different values for those arguments. To do so, just pass the new values to `next`:
+
+```js
+.pre(method, function firstPre (next, methodArg1, methodArg2) {
+  // Mutate methodArg1
+  next("altered-" + methodArg1.toString(), methodArg2);
+});
+
+// pre declaration is chainable
+.pre(method, function secondPre (next, methodArg1, methodArg2) {
+  console.log(methodArg1);
+  // => 'altered-originalValOfMethodArg1'
+
+  console.log(methodArg2);
+  // => 'originalValOfMethodArg2'
+
+  // Passing no arguments to `next` automatically passes along the current argument values
+  // i.e., the following `next()` is equivalent to `next(methodArg1, methodArg2)`
+  // and also equivalent to, with the example method arg
+  // values, `next('altered-originalValOfMethodArg1', 'originalValOfMethodArg2')`
+  next();
+});
+```
+
+#### Schema gotcha
+
+`type`, when used in a schema, has special meaning within Mongoose. If your schema requires using `type` as a nested property, you must use object notation:
+
+```js
+new Schema({
+  broken: { type: Boolean },
+  asset: {
+    name: String,
+    type: String // uh oh, it broke. asset will be interpreted as String
+  }
+});
+
+new Schema({
+  works: { type: Boolean },
+  asset: {
+    name: String,
+    type: { type: String } // works. asset is an object with a type property
+  }
+});
+```
+
+### Driver Access
+
+Mongoose is built on top of the [official MongoDB Node.js driver](https://github.com/mongodb/node-mongodb-native). Each mongoose model keeps a reference to a [native MongoDB driver collection](http://mongodb.github.io/node-mongodb-native/2.1/api/Collection.html). The collection object can be accessed using `YourModel.collection`. However, using the collection object directly bypasses all mongoose features, including hooks, validation, etc. The one notable exception is that `YourModel.collection` still buffers commands. As such, `YourModel.collection.find()` will **not** return a cursor.
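For concreteness, here is a short, editor-added sketch (not from the upstream README) of mixing model access with direct driver access. The `User` model, field values, and connection string are placeholders; note that the raw write below skips casting, validation, defaults, and middleware.

```js
const mongoose = require('mongoose');

// Hypothetical model used only for illustration.
const User = mongoose.model('User', new mongoose.Schema({ name: String }));

async function rawAccess() {
  await mongoose.connect('mongodb://127.0.0.1/my_database');

  // Talk to the underlying driver collection directly: no casting,
  // validation, defaults, or middleware are applied to this write.
  await User.collection.insertOne({ name: 'raw insert' });

  // Reading through the model goes through mongoose as usual.
  const doc = await User.findOne({ name: 'raw insert' });
  console.log(doc.name);

  await mongoose.disconnect();
}

rawAccess().catch(console.error);
```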
+
+## API Docs
+
+Find the API docs [here](http://mongoosejs.com/docs/api.html), generated using [dox](https://github.com/tj/dox)
+and [acquit](https://github.com/vkarpov15/acquit).
+
+## Related Projects
+
+#### MongoDB Runners
+
+- [run-rs](https://www.npmjs.com/package/run-rs)
+- [mongodb-memory-server](https://www.npmjs.com/package/mongodb-memory-server)
+- [mongodb-topology-manager](https://www.npmjs.com/package/mongodb-topology-manager)
+
+#### Unofficial CLIs
+
+- [mongoosejs-cli](https://www.npmjs.com/package/mongoosejs-cli)
+
+#### Data Seeding
+
+- [dookie](https://www.npmjs.com/package/dookie)
+- [seedgoose](https://www.npmjs.com/package/seedgoose)
+- [mongoose-data-seed](https://www.npmjs.com/package/mongoose-data-seed)
+
+#### Express Session Stores
+
+- [connect-mongodb-session](https://www.npmjs.com/package/connect-mongodb-session)
+- [connect-mongo](https://www.npmjs.com/package/connect-mongo)
+
+## License
+
+Copyright (c) 2010 LearnBoost <dev@learnboost.com>
+
+Permission is hereby granted, free of charge, to any person obtaining
+a copy of this software and associated documentation files (the
+'Software'), to deal in the Software without restriction, including
+without limitation the rights to use, copy, modify, merge, publish,
+distribute, sublicense, and/or sell copies of the Software, and to
+permit persons to whom the Software is furnished to do so, subject to
+the following conditions:
+
+The above copyright notice and this permission notice shall be
+included in all copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND,
+EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF
+MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.
+IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY
+CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,
+TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE
+SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
diff --git a/node_modules/mongoose/SECURITY.md b/node_modules/mongoose/SECURITY.md
new file mode 100644
index 000000000..41b89d834
--- /dev/null
+++ b/node_modules/mongoose/SECURITY.md
@@ -0,0 +1 @@
+Please follow the instructions on [Tidelift's security page](https://tidelift.com/docs/security) to report a security issue.
diff --git a/node_modules/mongoose/browser.js b/node_modules/mongoose/browser.js new file mode 100644 index 000000000..4cf822804 --- /dev/null +++ b/node_modules/mongoose/browser.js @@ -0,0 +1,8 @@ +/** + * Export lib/mongoose + * + */ + +'use strict'; + +module.exports = require('./lib/browser'); diff --git a/node_modules/mongoose/build-browser.js b/node_modules/mongoose/build-browser.js new file mode 100644 index 000000000..6f4aa1681 --- /dev/null +++ b/node_modules/mongoose/build-browser.js @@ -0,0 +1,18 @@ +'use strict'; + +const config = require('./webpack.config.js'); +const webpack = require('webpack'); + +const compiler = webpack(config); + +console.log('Starting browser build...'); +compiler.run((err, stats) => { + if (err) { + console.err(stats.toString()); + console.err('Browser build unsuccessful.'); + process.exit(1); + } + console.log(stats.toString()); + console.log('Browser build successful.'); + process.exit(0); +}); diff --git a/node_modules/mongoose/dist/browser.umd.js b/node_modules/mongoose/dist/browser.umd.js new file mode 100644 index 000000000..f1baefca8 --- /dev/null +++ b/node_modules/mongoose/dist/browser.umd.js @@ -0,0 +1,1663 @@ +!function(t,e){"object"==typeof exports&&"object"==typeof module?module.exports=e():"function"==typeof define&&define.amd?define([],e):"object"==typeof exports?exports.mongoose=e():t.mongoose=e()}("undefined"!=typeof self?self:this,(function(){return function(t){var e={};function r(n){if(e[n])return e[n].exports;var i=e[n]={i:n,l:!1,exports:{}};return t[n].call(i.exports,i,i.exports,r),i.l=!0,i.exports}return r.m=t,r.c=e,r.d=function(t,e,n){r.o(t,e)||Object.defineProperty(t,e,{enumerable:!0,get:n})},r.r=function(t){"undefined"!=typeof Symbol&&Symbol.toStringTag&&Object.defineProperty(t,Symbol.toStringTag,{value:"Module"}),Object.defineProperty(t,"__esModule",{value:!0})},r.t=function(t,e){if(1&e&&(t=r(t)),8&e)return t;if(4&e&&"object"==typeof t&&t&&t.__esModule)return t;var n=Object.create(null);if(r.r(n),Object.defineProperty(n,"default",{enumerable:!0,value:t}),2&e&&"string"!=typeof t)for(var i in t)r.d(n,i,function(e){return t[e]}.bind(null,i));return n},r.n=function(t){var e=t&&t.__esModule?function(){return t.default}:function(){return t};return r.d(e,"a",e),e},r.o=function(t,e){return Object.prototype.hasOwnProperty.call(t,e)},r.p="",r(r.s=176)}([function(t,e){"function"==typeof Object.create?t.exports=function(t,e){e&&(t.super_=e,t.prototype=Object.create(e.prototype,{constructor:{value:t,enumerable:!1,writable:!0,configurable:!0}}))}:t.exports=function(t,e){if(e){t.super_=e;var r=function(){};r.prototype=e.prototype,t.prototype=new r,t.prototype.constructor=t}}},function(t,e,r){"use 
strict";e.arrayAtomicsBackupSymbol=Symbol("mongoose#Array#atomicsBackup"),e.arrayAtomicsSymbol=Symbol("mongoose#Array#_atomics"),e.arrayParentSymbol=Symbol("mongoose#Array#_parent"),e.arrayPathSymbol=Symbol("mongoose#Array#_path"),e.arraySchemaSymbol=Symbol("mongoose#Array#_schema"),e.documentArrayParent=Symbol("mongoose:documentArrayParent"),e.documentIsSelected=Symbol("mongoose#Document#isSelected"),e.documentIsModified=Symbol("mongoose#Document#isModified"),e.documentModifiedPaths=Symbol("mongoose#Document#modifiedPaths"),e.documentSchemaSymbol=Symbol("mongoose#Document#schema"),e.getSymbol=Symbol("mongoose#Document#get"),e.modelSymbol=Symbol("mongoose#Model"),e.objectIdSymbol=Symbol("mongoose#ObjectId"),e.populateModelSymbol=Symbol("mongoose.PopulateOptions#Model"),e.schemaTypeSymbol=Symbol("mongoose#schemaType"),e.sessionNewDocuments=Symbol("mongoose:ClientSession#newDocuments"),e.scopeSymbol=Symbol("mongoose#Document#scope"),e.validatorErrorSymbol=Symbol("mongoose:validatorError")},function(t,e,r){var n=r(3),i=n.Buffer;function o(t,e){for(var r in t)e[r]=t[r]}function s(t,e,r){return i(t,e,r)}i.from&&i.alloc&&i.allocUnsafe&&i.allocUnsafeSlow?t.exports=n:(o(n,e),e.Buffer=s),o(i,s),s.from=function(t,e,r){if("number"==typeof t)throw new TypeError("Argument must not be a number");return i(t,e,r)},s.alloc=function(t,e,r){if("number"!=typeof t)throw new TypeError("Argument must be a number");var n=i(t);return void 0!==e?"string"==typeof r?n.fill(e,r):n.fill(e):n.fill(0),n},s.allocUnsafe=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return i(t)},s.allocUnsafeSlow=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return n.SlowBuffer(t)}},function(t,e,r){"use strict";(function(t){ +/*! + * The buffer module from node.js, for the browser. 
+ * + * @author Feross Aboukhadijeh + * @license MIT + */ +var n=r(178),i=r(179),o=r(93);function s(){return u.TYPED_ARRAY_SUPPORT?2147483647:1073741823}function a(t,e){if(s()=s())throw new RangeError("Attempt to allocate Buffer larger than maximum size: 0x"+s().toString(16)+" bytes");return 0|t}function p(t,e){if(u.isBuffer(t))return t.length;if("undefined"!=typeof ArrayBuffer&&"function"==typeof ArrayBuffer.isView&&(ArrayBuffer.isView(t)||t instanceof ArrayBuffer))return t.byteLength;"string"!=typeof t&&(t=""+t);var r=t.length;if(0===r)return 0;for(var n=!1;;)switch(e){case"ascii":case"latin1":case"binary":return r;case"utf8":case"utf-8":case void 0:return q(t).length;case"ucs2":case"ucs-2":case"utf16le":case"utf-16le":return 2*r;case"hex":return r>>>1;case"base64":return U(t).length;default:if(n)return q(t).length;e=(""+e).toLowerCase(),n=!0}}function m(t,e,r){var n=!1;if((void 0===e||e<0)&&(e=0),e>this.length)return"";if((void 0===r||r>this.length)&&(r=this.length),r<=0)return"";if((r>>>=0)<=(e>>>=0))return"";for(t||(t="utf8");;)switch(t){case"hex":return j(this,e,r);case"utf8":case"utf-8":return E(this,e,r);case"ascii":return x(this,e,r);case"latin1":case"binary":return k(this,e,r);case"base64":return A(this,e,r);case"ucs2":case"ucs-2":case"utf16le":case"utf-16le":return $(this,e,r);default:if(n)throw new TypeError("Unknown encoding: "+t);t=(t+"").toLowerCase(),n=!0}}function y(t,e,r){var n=t[e];t[e]=t[r],t[r]=n}function v(t,e,r,n,i){if(0===t.length)return-1;if("string"==typeof r?(n=r,r=0):r>2147483647?r=2147483647:r<-2147483648&&(r=-2147483648),r=+r,isNaN(r)&&(r=i?0:t.length-1),r<0&&(r=t.length+r),r>=t.length){if(i)return-1;r=t.length-1}else if(r<0){if(!i)return-1;r=0}if("string"==typeof e&&(e=u.from(e,n)),u.isBuffer(e))return 0===e.length?-1:b(t,e,r,n,i);if("number"==typeof e)return e&=255,u.TYPED_ARRAY_SUPPORT&&"function"==typeof Uint8Array.prototype.indexOf?i?Uint8Array.prototype.indexOf.call(t,e,r):Uint8Array.prototype.lastIndexOf.call(t,e,r):b(t,[e],r,n,i);throw new TypeError("val must be string, number or Buffer")}function b(t,e,r,n,i){var o,s=1,a=t.length,u=e.length;if(void 0!==n&&("ucs2"===(n=String(n).toLowerCase())||"ucs-2"===n||"utf16le"===n||"utf-16le"===n)){if(t.length<2||e.length<2)return-1;s=2,a/=2,u/=2,r/=2}function h(t,e){return 1===s?t[e]:t.readUInt16BE(e*s)}if(i){var f=-1;for(o=r;oa&&(r=a-u),o=r;o>=0;o--){for(var c=!0,l=0;li&&(n=i):n=i;var o=e.length;if(o%2!=0)throw new TypeError("Invalid hex string");n>o/2&&(n=o/2);for(var s=0;s>8,i=r%256,o.push(i),o.push(n);return o}(e,t.length-r),t,r,n)}function A(t,e,r){return 0===e&&r===t.length?n.fromByteArray(t):n.fromByteArray(t.slice(e,r))}function E(t,e,r){r=Math.min(t.length,r);for(var n=[],i=e;i239?4:h>223?3:h>191?2:1;if(i+c<=r)switch(c){case 1:h<128&&(f=h);break;case 2:128==(192&(o=t[i+1]))&&(u=(31&h)<<6|63&o)>127&&(f=u);break;case 3:o=t[i+1],s=t[i+2],128==(192&o)&&128==(192&s)&&(u=(15&h)<<12|(63&o)<<6|63&s)>2047&&(u<55296||u>57343)&&(f=u);break;case 4:o=t[i+1],s=t[i+2],a=t[i+3],128==(192&o)&&128==(192&s)&&128==(192&a)&&(u=(15&h)<<18|(63&o)<<12|(63&s)<<6|63&a)>65535&&u<1114112&&(f=u)}null===f?(f=65533,c=1):f>65535&&(f-=65536,n.push(f>>>10&1023|55296),f=56320|1023&f),n.push(f),i+=c}return function(t){var e=t.length;if(e<=4096)return String.fromCharCode.apply(String,t);var r="",n=0;for(;n0&&(t=this.toString("hex",0,r).match(/.{2}/g).join(" "),this.length>r&&(t+=" ... 
")),""},u.prototype.compare=function(t,e,r,n,i){if(!u.isBuffer(t))throw new TypeError("Argument must be a Buffer");if(void 0===e&&(e=0),void 0===r&&(r=t?t.length:0),void 0===n&&(n=0),void 0===i&&(i=this.length),e<0||r>t.length||n<0||i>this.length)throw new RangeError("out of range index");if(n>=i&&e>=r)return 0;if(n>=i)return-1;if(e>=r)return 1;if(this===t)return 0;for(var o=(i>>>=0)-(n>>>=0),s=(r>>>=0)-(e>>>=0),a=Math.min(o,s),h=this.slice(n,i),f=t.slice(e,r),c=0;ci)&&(r=i),t.length>0&&(r<0||e<0)||e>this.length)throw new RangeError("Attempt to write outside buffer bounds");n||(n="utf8");for(var o=!1;;)switch(n){case"hex":return g(this,t,e,r);case"utf8":case"utf-8":return w(this,t,e,r);case"ascii":return _(this,t,e,r);case"latin1":case"binary":return M(this,t,e,r);case"base64":return S(this,t,e,r);case"ucs2":case"ucs-2":case"utf16le":case"utf-16le":return O(this,t,e,r);default:if(o)throw new TypeError("Unknown encoding: "+n);n=(""+n).toLowerCase(),o=!0}},u.prototype.toJSON=function(){return{type:"Buffer",data:Array.prototype.slice.call(this._arr||this,0)}};function x(t,e,r){var n="";r=Math.min(t.length,r);for(var i=e;in)&&(r=n);for(var i="",o=e;or)throw new RangeError("Trying to access beyond buffer length")}function R(t,e,r,n,i,o){if(!u.isBuffer(t))throw new TypeError('"buffer" argument must be a Buffer instance');if(e>i||et.length)throw new RangeError("Index out of range")}function B(t,e,r,n){e<0&&(e=65535+e+1);for(var i=0,o=Math.min(t.length-r,2);i>>8*(n?i:1-i)}function T(t,e,r,n){e<0&&(e=4294967295+e+1);for(var i=0,o=Math.min(t.length-r,4);i>>8*(n?i:3-i)&255}function I(t,e,r,n,i,o){if(r+n>t.length)throw new RangeError("Index out of range");if(r<0)throw new RangeError("Index out of range")}function N(t,e,r,n,o){return o||I(t,0,r,4),i.write(t,e,r,n,23,4),r+4}function D(t,e,r,n,o){return o||I(t,0,r,8),i.write(t,e,r,n,52,8),r+8}u.prototype.slice=function(t,e){var r,n=this.length;if((t=~~t)<0?(t+=n)<0&&(t=0):t>n&&(t=n),(e=void 0===e?n:~~e)<0?(e+=n)<0&&(e=0):e>n&&(e=n),e0&&(i*=256);)n+=this[t+--e]*i;return n},u.prototype.readUInt8=function(t,e){return e||P(t,1,this.length),this[t]},u.prototype.readUInt16LE=function(t,e){return e||P(t,2,this.length),this[t]|this[t+1]<<8},u.prototype.readUInt16BE=function(t,e){return e||P(t,2,this.length),this[t]<<8|this[t+1]},u.prototype.readUInt32LE=function(t,e){return e||P(t,4,this.length),(this[t]|this[t+1]<<8|this[t+2]<<16)+16777216*this[t+3]},u.prototype.readUInt32BE=function(t,e){return e||P(t,4,this.length),16777216*this[t]+(this[t+1]<<16|this[t+2]<<8|this[t+3])},u.prototype.readIntLE=function(t,e,r){t|=0,e|=0,r||P(t,e,this.length);for(var n=this[t],i=1,o=0;++o=(i*=128)&&(n-=Math.pow(2,8*e)),n},u.prototype.readIntBE=function(t,e,r){t|=0,e|=0,r||P(t,e,this.length);for(var n=e,i=1,o=this[t+--n];n>0&&(i*=256);)o+=this[t+--n]*i;return o>=(i*=128)&&(o-=Math.pow(2,8*e)),o},u.prototype.readInt8=function(t,e){return e||P(t,1,this.length),128&this[t]?-1*(255-this[t]+1):this[t]},u.prototype.readInt16LE=function(t,e){e||P(t,2,this.length);var r=this[t]|this[t+1]<<8;return 32768&r?4294901760|r:r},u.prototype.readInt16BE=function(t,e){e||P(t,2,this.length);var r=this[t+1]|this[t]<<8;return 32768&r?4294901760|r:r},u.prototype.readInt32LE=function(t,e){return e||P(t,4,this.length),this[t]|this[t+1]<<8|this[t+2]<<16|this[t+3]<<24},u.prototype.readInt32BE=function(t,e){return e||P(t,4,this.length),this[t]<<24|this[t+1]<<16|this[t+2]<<8|this[t+3]},u.prototype.readFloatLE=function(t,e){return 
e||P(t,4,this.length),i.read(this,t,!0,23,4)},u.prototype.readFloatBE=function(t,e){return e||P(t,4,this.length),i.read(this,t,!1,23,4)},u.prototype.readDoubleLE=function(t,e){return e||P(t,8,this.length),i.read(this,t,!0,52,8)},u.prototype.readDoubleBE=function(t,e){return e||P(t,8,this.length),i.read(this,t,!1,52,8)},u.prototype.writeUIntLE=function(t,e,r,n){(t=+t,e|=0,r|=0,n)||R(this,t,e,r,Math.pow(2,8*r)-1,0);var i=1,o=0;for(this[e]=255&t;++o=0&&(o*=256);)this[e+i]=t/o&255;return e+r},u.prototype.writeUInt8=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,1,255,0),u.TYPED_ARRAY_SUPPORT||(t=Math.floor(t)),this[e]=255&t,e+1},u.prototype.writeUInt16LE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,2,65535,0),u.TYPED_ARRAY_SUPPORT?(this[e]=255&t,this[e+1]=t>>>8):B(this,t,e,!0),e+2},u.prototype.writeUInt16BE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,2,65535,0),u.TYPED_ARRAY_SUPPORT?(this[e]=t>>>8,this[e+1]=255&t):B(this,t,e,!1),e+2},u.prototype.writeUInt32LE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,4,4294967295,0),u.TYPED_ARRAY_SUPPORT?(this[e+3]=t>>>24,this[e+2]=t>>>16,this[e+1]=t>>>8,this[e]=255&t):T(this,t,e,!0),e+4},u.prototype.writeUInt32BE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,4,4294967295,0),u.TYPED_ARRAY_SUPPORT?(this[e]=t>>>24,this[e+1]=t>>>16,this[e+2]=t>>>8,this[e+3]=255&t):T(this,t,e,!1),e+4},u.prototype.writeIntLE=function(t,e,r,n){if(t=+t,e|=0,!n){var i=Math.pow(2,8*r-1);R(this,t,e,r,i-1,-i)}var o=0,s=1,a=0;for(this[e]=255&t;++o>0)-a&255;return e+r},u.prototype.writeIntBE=function(t,e,r,n){if(t=+t,e|=0,!n){var i=Math.pow(2,8*r-1);R(this,t,e,r,i-1,-i)}var o=r-1,s=1,a=0;for(this[e+o]=255&t;--o>=0&&(s*=256);)t<0&&0===a&&0!==this[e+o+1]&&(a=1),this[e+o]=(t/s>>0)-a&255;return e+r},u.prototype.writeInt8=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,1,127,-128),u.TYPED_ARRAY_SUPPORT||(t=Math.floor(t)),t<0&&(t=255+t+1),this[e]=255&t,e+1},u.prototype.writeInt16LE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,2,32767,-32768),u.TYPED_ARRAY_SUPPORT?(this[e]=255&t,this[e+1]=t>>>8):B(this,t,e,!0),e+2},u.prototype.writeInt16BE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,2,32767,-32768),u.TYPED_ARRAY_SUPPORT?(this[e]=t>>>8,this[e+1]=255&t):B(this,t,e,!1),e+2},u.prototype.writeInt32LE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,4,2147483647,-2147483648),u.TYPED_ARRAY_SUPPORT?(this[e]=255&t,this[e+1]=t>>>8,this[e+2]=t>>>16,this[e+3]=t>>>24):T(this,t,e,!0),e+4},u.prototype.writeInt32BE=function(t,e,r){return t=+t,e|=0,r||R(this,t,e,4,2147483647,-2147483648),t<0&&(t=4294967295+t+1),u.TYPED_ARRAY_SUPPORT?(this[e]=t>>>24,this[e+1]=t>>>16,this[e+2]=t>>>8,this[e+3]=255&t):T(this,t,e,!1),e+4},u.prototype.writeFloatLE=function(t,e,r){return N(this,t,e,!0,r)},u.prototype.writeFloatBE=function(t,e,r){return N(this,t,e,!1,r)},u.prototype.writeDoubleLE=function(t,e,r){return D(this,t,e,!0,r)},u.prototype.writeDoubleBE=function(t,e,r){return D(this,t,e,!1,r)},u.prototype.copy=function(t,e,r,n){if(r||(r=0),n||0===n||(n=this.length),e>=t.length&&(e=t.length),e||(e=0),n>0&&n=this.length)throw new RangeError("sourceStart out of bounds");if(n<0)throw new RangeError("sourceEnd out of bounds");n>this.length&&(n=this.length),t.length-e=0;--i)t[i+e]=this[i+r];else if(o<1e3||!u.TYPED_ARRAY_SUPPORT)for(i=0;i>>=0,r=void 0===r?this.length:r>>>0,t||(t=0),"number"==typeof 
t)for(o=e;o55295&&r<57344){if(!i){if(r>56319){(e-=3)>-1&&o.push(239,191,189);continue}if(s+1===n){(e-=3)>-1&&o.push(239,191,189);continue}i=r;continue}if(r<56320){(e-=3)>-1&&o.push(239,191,189),i=r;continue}r=65536+(i-55296<<10|r-56320)}else i&&(e-=3)>-1&&o.push(239,191,189);if(i=null,r<128){if((e-=1)<0)break;o.push(r)}else if(r<2048){if((e-=2)<0)break;o.push(r>>6|192,63&r|128)}else if(r<65536){if((e-=3)<0)break;o.push(r>>12|224,r>>6&63|128,63&r|128)}else{if(!(r<1114112))throw new Error("Invalid code point");if((e-=4)<0)break;o.push(r>>18|240,r>>12&63|128,r>>6&63|128,63&r|128)}}return o}function U(t){return n.toByteArray(function(t){if((t=function(t){return t.trim?t.trim():t.replace(/^\s+|\s+$/g,"")}(t).replace(C,"")).length<2)return"";for(;t.length%4!=0;)t+="=";return t}(t))}function F(t,e,r,n){for(var i=0;i=e.length||i>=t.length);++i)e[i+r]=t[i];return i}}).call(this,r(7))},function(t,e,r){"use strict";(function(t,n){ +/*! + * Module dependencies. + */ +function i(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return o(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return o(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,i=function(){};return{s:i,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:i}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function o(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r=0;c--)if(u[c]!==h[c])return!1;for(var l=0,d=u;l0)return t[t.length-1]},e.clone=p, +/*! + * ignore + */ +e.promiseOrCallback=w, +/*! + * ignore + */ +e.omit=function(t,e){if(null==e)return Object.assign({},t);Array.isArray(e)||(e=[e]);var r,n=Object.assign({},t),o=i(e);try{for(o.s();!(r=o.n()).done;){delete n[r.value]}}catch(t){o.e(t)}finally{o.f()}return n}, +/*! + * Shallow copies defaults into options. + * + * @param {Object} defaults + * @param {Object} options + * @return {Object} the merged object + * @api private + */ +e.options=function(t,e){var r,n=Object.keys(t),i=n.length;for(e=e||{};i--;)(r=n[i])in e||(e[r]=t[r]);return e}, +/*! + * Generates a random string + * + * @api private + */ +e.random=function(){return Math.random().toString().substr(3)}, +/*! + * Merges `from` into `to` without overwriting existing properties. + * + * @param {Object} to + * @param {Object} from + * @api private + */ +e.merge=function t(r,n,i,o){i=i||{};var s,a=Object.keys(n),u=0,h=a.length;n[S]&&(r[S]=n[S]),o=o||"";for(var f=i.omitNested||{};u [1,2,3,4] + * + * @param {Array} arr + * @param {Function} [filter] If passed, will be invoked with each item in the array. If `filter` returns a falsy value, the item will not be included in the results. + * @return {Array} + * @private + */ +e.array.flatten=function t(e,r,n){return n||(n=[]),e.forEach((function(e){Array.isArray(e)?t(e,r,n):r&&!r(e)||n.push(e)})),n}; +/*! 
+ * ignore + */ +var E=Object.prototype.hasOwnProperty;e.hasUserDefinedProperty=function(t,r){if(null==t)return!1;if(Array.isArray(r)){var n,o=i(r);try{for(o.s();!(n=o.n()).done;){var a=n.value;if(e.hasUserDefinedProperty(t,a))return!0}}catch(t){o.e(t)}finally{o.f()}return!1}if(E.call(t,r))return!0;if("object"===s(t)&&r in t){var u=t[r];return u!==Object.prototype[r]&&u!==Array.prototype[r]}return!1}; +/*! + * ignore + */ +var x=Math.pow(2,32)-1;e.isArrayIndex=function(t){return"number"==typeof t?t>=0&&t<=x:"string"==typeof t&&(!!/^\d+$/.test(t)&&((t=+t)>=0&&t<=x))}, +/*! + * Removes duplicate values from an array + * + * [1, 2, 3, 3, 5] => [1, 2, 3, 5] + * [ ObjectId("550988ba0c19d57f697dc45e"), ObjectId("550988ba0c19d57f697dc45e") ] + * => [ObjectId("550988ba0c19d57f697dc45e")] + * + * @param {Array} arr + * @return {Array} + * @private + */ +e.array.unique=function(t){var e,r=new Set,n=new Set,o=[],s=i(t);try{for(s.s();!(e=s.n()).done;){var a=e.value;if("number"==typeof a||"string"==typeof a||null==a){if(r.has(a))continue;o.push(a),r.add(a)}else if(a instanceof l){if(n.has(a.toString()))continue;o.push(a),n.add(a.toString())}else o.push(a)}}catch(t){s.e(t)}finally{s.f()}return o}, +/*! + * Determines if two buffers are equal. + * + * @param {Buffer} a + * @param {Object} b + */ +e.buffer={},e.buffer.areEqual=function(e,r){if(!t.isBuffer(e))return!1;if(!t.isBuffer(r))return!1;if(e.length!==r.length)return!1;for(var n=0,i=e.length;n1)for(var r=1;r=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r1&&(this.defaultValue=p.args(arguments)),this.defaultValue},w.prototype.index=function(t){return this._index=t,p.expires(this._index),this},w.prototype.unique=function(t){if(!1===this._index){if(!t)return;throw new Error('Path "'+this.path+'" may not have `index` set to false and `unique` set to true')}return this.options.hasOwnProperty("index")||!1!==t?(null==this._index||!0===this._index?this._index={}:"string"==typeof this._index&&(this._index={type:this._index}),this._index.unique=t,this):this},w.prototype.text=function(t){if(!1===this._index){if(!t)return;throw new Error('Path "'+this.path+'" may not have `index` set to false and `text` set to true')}return this.options.hasOwnProperty("index")||!1!==t?(null===this._index||void 0===this._index||"boolean"==typeof this._index?this._index={}:"string"==typeof this._index&&(this._index={type:this._index}),this._index.text=t,this):this},w.prototype.sparse=function(t){if(!1===this._index){if(!t)return;throw new Error('Path "'+this.path+'" may not have `index` set to false and `sparse` set to true')}return this.options.hasOwnProperty("index")||!1!==t?(null==this._index||"boolean"==typeof this._index?this._index={}:"string"==typeof this._index&&(this._index={type:this._index}),this._index.sparse=t,this):this},w.prototype.immutable=function(t){return this.$immutable=t,f(this),this},w.prototype.transform=function(t){return this.options.transform=t,this},w.prototype.set=function(t){if("function"!=typeof t)throw new TypeError("A setter must be a function.");return 
this.setters.push(t),this},w.prototype.get=function(t){if("function"!=typeof t)throw new TypeError("A getter must be a function.");return this.getters.push(t),this},w.prototype.validate=function(t,e,r){var n,s,a,u;if("function"==typeof t||t&&"RegExp"===p.getFunctionName(t.constructor))return"function"==typeof e?(n={validator:t,message:e}).type=r||"user defined":e instanceof Object&&!r?((n=p.clone(e)).message||(n.message=n.msg),n.validator=t,n.type=n.type||"user defined"):(null==e&&(e=o.messages.general.default),r||(r="user defined"),n={message:e,type:r,validator:t}),this.validators.push(n),this;for(s=0,a=arguments.length;s0&&null==t)return this.validators=this.validators.filter((function(t){return t.validator!==this.requiredValidator}),this),this.isRequired=!1,delete this.originalRequiredValue,this;if("object"===i(t)&&(e=(r=t).message||e,t=t.isRequired),!1===t)return this.validators=this.validators.filter((function(t){return t.validator!==this.requiredValidator}),this),this.isRequired=!1,delete this.originalRequiredValue,this;var n=this;this.isRequired=!0,this.requiredValidator=function(e){var r=h(this,"$__.cachedRequired");if(null!=r&&!this.$__isSelected(n.path)&&!this[y](n.path))return!0;if(null!=r&&n.path in r){var i=!r[n.path]||n.checkRequired(e,this);return delete r[n.path],i}return"function"==typeof t&&!t.apply(this)||n.checkRequired(e,this)},this.originalRequiredValue=t,"string"==typeof t&&(e=t,t=void 0);var s=e||o.messages.general.required;return this.validators.unshift(Object.assign({},r,{validator:this.requiredValidator,message:s,type:"required"})),this},w.prototype.ref=function(t){return this.options.ref=t,this},w.prototype.getDefault=function(t,e){var r;if(null!=(r="function"==typeof this.defaultValue?this.defaultValue===Date.now||this.defaultValue===Array||"objectid"===this.defaultValue.name.toLowerCase()?this.defaultValue.call(t):this.defaultValue.call(t,t):this.defaultValue)){"object"!==i(r)||this.options&&this.options.shared||(r=p.clone(r));var n=this.applySetters(r,t,e);return n&&n.$isSingleNested&&(n.$__parent=t),n}return r}, +/*! + * Applies setters without casting + * + * @api private + */ +w.prototype._applySetters=function(t,e,r,n){var i=t;if(r)return i;for(var o=this.setters,s=o.length-1;s>=0;s--)i=o[s].call(e,i,n,this);return i}, +/*! + * ignore + */ +w.prototype._castNullish=function(t){return t},w.prototype.applySetters=function(t,e,r,n,i){var o=this._applySetters(t,e,r,n,i);return null==o?this._castNullish(o):o=this.cast(o,e,r,n,i)},w.prototype.applyGetters=function(t,e){var r=t,n=this.getters,i=n.length;if(0===i)return r;for(var o=0;o0&&"required"===this.validators[0].type))return null;o=[this.validators[0]]}var s=null;return o.forEach((function(o){if(!s&&null!=o&&"object"===i(o)){var u,h=o.validator,f=p.clone(o);if(f.path=r&&r.path?r.path:n,f.value=t,!c(h))if(h instanceof RegExp)a(h.test(t),f);else if("function"==typeof h){try{u=f.propsParameter?h.call(e,t,f):h.call(e,t)}catch(t){u=!1,f.reason=t}null!=u&&"function"==typeof u.then||a(u,f)}}})),s;function a(t,e){if(!s&&void 0!==t&&!t){var r=e.ErrorConstructor||g;(s=new r(e))[m]=!0}}},w._isRef=function(t,e,r,i){var o=i&&t.options&&(t.options.ref||t.options.refPath);if(!o&&r&&null!=r.$__){var s=r.$__fullPath(t.path,!0),a=r.ownerDocument?r.ownerDocument():r;o=null!=s&&a.$populated(s)||r.$populated(t.path)}return!!o&&(null==e||(!(n.isBuffer(e)||"Binary"===e._bsontype||!p.isObject(e))||i))}, +/*! 
+ * ignore + */ +w.prototype._castRef=function(t,e,r){if(null==t)return t;if(null!=t.$__)return t.$__.wasPopulated=!0,t;if(n.isBuffer(t)||!p.isObject(t)){if(r)return t;throw new b(this.instance,t,this.path,null,this)}var i=e.$__fullPath(this.path,!0),o=(e.ownerDocument?e.ownerDocument():e).$populated(i,!0),s=t;return e.$__.populated&&e.$__.populated[i]&&e.$__.populated[i].options&&e.$__.populated[i].options.options&&e.$__.populated[i].options.options.lean||((s=new o.options[v](t)).$__.wasPopulated=!0),s},w.prototype.$conditionalHandlers={$all:function(t){var e=this;return Array.isArray(t)?t.map((function(t){return e.castForQuery(t)})):[this.castForQuery(t)]},$eq:_,$in:M,$ne:_,$nin:M,$exists:a,$type:u}, +/*! + * Wraps `castForQuery` to handle context + */ +w.prototype.castForQueryWrapper=function(t){if(this.$$context=t.context,"$conditional"in t){var e=this.castForQuery(t.$conditional,t.val);return this.$$context=null,e}if(t.$skipQueryCastForUpdate||t.$applySetters){var r=this._castForQuery(t.val);return this.$$context=null,r}var n=this.castForQuery(t.val);return this.$$context=null,n},w.prototype.castForQuery=function(t,e){var r;if(2===arguments.length){if(!(r=this.$conditionalHandlers[t]))throw new Error("Can't use "+t);return r.call(this,e)}return e=t,this._castForQuery(e)}, +/*! + * Internal switch for runSetters + * + * @api private + */ +w.prototype._castForQuery=function(t){return this.applySetters(t,this.$$context)},w.checkRequired=function(t){return arguments.length>0&&(this._checkRequired=t),this._checkRequired},w.prototype.checkRequired=function(t){return null!=t}, +/*! + * ignore + */ +w.prototype.clone=function(){var t=Object.assign({},this.options),e=new this.constructor(this.path,t,this.instance);return e.validators=this.validators.slice(),void 0!==this.requiredValidator&&(e.requiredValidator=this.requiredValidator),void 0!==this.defaultValue&&(e.defaultValue=this.defaultValue),void 0!==this.$immutable&&void 0===this.options.immutable&&(e.$immutable=this.$immutable,f(e)),void 0!==this._index&&(e._index=this._index),void 0!==this.selected&&(e.selected=this.selected),void 0!==this.isRequired&&(e.isRequired=this.isRequired),void 0!==this.originalRequiredValue&&(e.originalRequiredValue=this.originalRequiredValue),e.getters=this.getters.slice(),e.setters=this.setters.slice(),e}, +/*! + * Module exports. 
+ */ +t.exports=e=w,e.CastError=b,e.ValidatorError=g}).call(this,r(3).Buffer)},function(t,e,r){(function(t){function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i=Object.getOwnPropertyDescriptors||function(t){for(var e=Object.keys(t),r={},n=0;n=i)return t;switch(t){case"%s":return String(n[r++]);case"%d":return Number(n[r++]);case"%j":try{return JSON.stringify(n[r++])}catch(t){return"[Circular]"}default:return t}})),a=n[r];r=3&&(n.depth=arguments[2]),arguments.length>=4&&(n.colors=arguments[3]),m(r)?n.showHidden=r:r&&e._extend(n,r),g(n.showHidden)&&(n.showHidden=!1),g(n.depth)&&(n.depth=2),g(n.colors)&&(n.colors=!1),g(n.customInspect)&&(n.customInspect=!0),n.colors&&(n.stylize=h),c(n,t,n.depth)}function h(t,e){var r=u.styles[e];return r?"["+u.colors[r][0]+"m"+t+"["+u.colors[r][1]+"m":t}function f(t,e){return t}function c(t,r,n){if(t.customInspect&&r&&O(r.inspect)&&r.inspect!==e.inspect&&(!r.constructor||r.constructor.prototype!==r)){var i=r.inspect(n,t);return b(i)||(i=c(t,i,n)),i}var o=function(t,e){if(g(e))return t.stylize("undefined","undefined");if(b(e)){var r="'"+JSON.stringify(e).replace(/^"|"$/g,"").replace(/'/g,"\\'").replace(/\\"/g,'"')+"'";return t.stylize(r,"string")}if(v(e))return t.stylize(""+e,"number");if(m(e))return t.stylize(""+e,"boolean");if(y(e))return t.stylize("null","null")}(t,r);if(o)return o;var s=Object.keys(r),a=function(t){var e={};return t.forEach((function(t,r){e[t]=!0})),e}(s);if(t.showHidden&&(s=Object.getOwnPropertyNames(r)),S(r)&&(s.indexOf("message")>=0||s.indexOf("description")>=0))return l(r);if(0===s.length){if(O(r)){var u=r.name?": "+r.name:"";return t.stylize("[Function"+u+"]","special")}if(w(r))return t.stylize(RegExp.prototype.toString.call(r),"regexp");if(M(r))return t.stylize(Date.prototype.toString.call(r),"date");if(S(r))return l(r)}var h,f="",_=!1,A=["{","}"];(p(r)&&(_=!0,A=["[","]"]),O(r))&&(f=" [Function"+(r.name?": "+r.name:"")+"]");return w(r)&&(f=" "+RegExp.prototype.toString.call(r)),M(r)&&(f=" "+Date.prototype.toUTCString.call(r)),S(r)&&(f=" "+l(r)),0!==s.length||_&&0!=r.length?n<0?w(r)?t.stylize(RegExp.prototype.toString.call(r),"regexp"):t.stylize("[Object]","special"):(t.seen.push(r),h=_?function(t,e,r,n,i){for(var o=[],s=0,a=e.length;s=0&&0,t+e.replace(/\u001b\[\d\d?m/g,"").length+1}),0)>60)return r[0]+(""===e?"":e+"\n ")+" "+t.join(",\n ")+" "+r[1];return r[0]+e+" "+t.join(", ")+" "+r[1]}(h,f,A)):A[0]+f+A[1]}function l(t){return"["+Error.prototype.toString.call(t)+"]"}function d(t,e,r,n,i,o){var s,a,u;if((u=Object.getOwnPropertyDescriptor(e,i)||{value:e[i]}).get?a=u.set?t.stylize("[Getter/Setter]","special"):t.stylize("[Getter]","special"):u.set&&(a=t.stylize("[Setter]","special")),j(n,i)||(s="["+i+"]"),a||(t.seen.indexOf(u.value)<0?(a=y(r)?c(t,u.value,null):c(t,u.value,r-1)).indexOf("\n")>-1&&(a=o?a.split("\n").map((function(t){return" "+t})).join("\n").substr(2):"\n"+a.split("\n").map((function(t){return" "+t})).join("\n")):a=t.stylize("[Circular]","special")),g(s)){if(o&&i.match(/^\d+$/))return a;(s=JSON.stringify(""+i)).match(/^"([a-zA-Z_][a-zA-Z_0-9]*)"$/)?(s=s.substr(1,s.length-2),s=t.stylize(s,"name")):(s=s.replace(/'/g,"\\'").replace(/\\"/g,'"').replace(/(^"|"$)/g,"'"),s=t.stylize(s,"string"))}return s+": "+a}function p(t){return Array.isArray(t)}function m(t){return"boolean"==typeof t}function y(t){return null===t}function 
v(t){return"number"==typeof t}function b(t){return"string"==typeof t}function g(t){return void 0===t}function w(t){return _(t)&&"[object RegExp]"===A(t)}function _(t){return"object"===n(t)&&null!==t}function M(t){return _(t)&&"[object Date]"===A(t)}function S(t){return _(t)&&("[object Error]"===A(t)||t instanceof Error)}function O(t){return"function"==typeof t}function A(t){return Object.prototype.toString.call(t)}function E(t){return t<10?"0"+t.toString(10):t.toString(10)}e.debuglog=function(r){if(g(s)&&(s=t.env.NODE_DEBUG||""),r=r.toUpperCase(),!a[r])if(new RegExp("\\b"+r+"\\b","i").test(s)){var n=t.pid;a[r]=function(){var t=e.format.apply(e,arguments);console.error("%s %d: %s",r,n,t)}}else a[r]=function(){};return a[r]},e.inspect=u,u.colors={bold:[1,22],italic:[3,23],underline:[4,24],inverse:[7,27],white:[37,39],grey:[90,39],black:[30,39],blue:[34,39],cyan:[36,39],green:[32,39],magenta:[35,39],red:[31,39],yellow:[33,39]},u.styles={special:"cyan",number:"yellow",boolean:"yellow",undefined:"grey",null:"bold",string:"green",date:"magenta",regexp:"red"},e.isArray=p,e.isBoolean=m,e.isNull=y,e.isNullOrUndefined=function(t){return null==t},e.isNumber=v,e.isString=b,e.isSymbol=function(t){return"symbol"===n(t)},e.isUndefined=g,e.isRegExp=w,e.isObject=_,e.isDate=M,e.isError=S,e.isFunction=O,e.isPrimitive=function(t){return null===t||"boolean"==typeof t||"number"==typeof t||"string"==typeof t||"symbol"===n(t)||void 0===t},e.isBuffer=r(282);var x=["Jan","Feb","Mar","Apr","May","Jun","Jul","Aug","Sep","Oct","Nov","Dec"];function k(){var t=new Date,e=[E(t.getHours()),E(t.getMinutes()),E(t.getSeconds())].join(":");return[t.getDate(),x[t.getMonth()],e].join(" ")}function j(t,e){return Object.prototype.hasOwnProperty.call(t,e)}e.log=function(){console.log("%s - %s",k(),e.format.apply(e,arguments))},e.inherits=r(283),e._extend=function(t,e){if(!e||!_(e))return t;for(var r=Object.keys(e),n=r.length;n--;)t[r[n]]=e[r[n]];return t};var $="undefined"!=typeof Symbol?Symbol("util.promisify.custom"):void 0;function P(t,e){if(!t){var r=new Error("Promise was rejected with a falsy value");r.reason=t,t=r}return e(t)}e.promisify=function(t){if("function"!=typeof t)throw new TypeError('The "original" argument must be of type Function');if($&&t[$]){var e;if("function"!=typeof(e=t[$]))throw new TypeError('The "util.promisify.custom" argument must be of type Function');return Object.defineProperty(e,$,{value:e,enumerable:!1,writable:!1,configurable:!0}),e}function e(){for(var e,r,n=new Promise((function(t,n){e=t,r=n})),i=[],o=0;o(i>>1)-1?(i>>1)-u:u,o.isubn(a)):a=0,n[s]=a,o.iushrn(1)}return n},n.getJSF=function(t,e){var r=[[],[]];t=t.clone(),e=e.clone();for(var n,i=0,o=0;t.cmpn(-i)>0||e.cmpn(-o)>0;){var s,a,u=t.andln(3)+i&3,h=e.andln(3)+o&3;3===u&&(u=-1),3===h&&(h=-1),s=0==(1&u)?0:3!==(n=t.andln(7)+i&7)&&5!==n||2!==h?u:-u,r[0].push(s),a=0==(1&h)?0:3!==(n=e.andln(7)+o&7)&&5!==n||2!==u?h:-h,r[1].push(a),2*i===s+1&&(i=1-i),2*o===a+1&&(o=1-o),t.iushrn(1),e.iushrn(1)}return r},n.cachedProperty=function(t,e,r){var n="_"+e;t.prototype[e]=function(){return void 0!==this[n]?this[n]:this[n]=r.call(this)}},n.parseBytes=function(t){return"string"==typeof t?n.toArray(t,"hex"):t},n.intFromLE=function(t){return new i(t,"hex","le")}},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i,o="object"===("undefined"==typeof 
Reflect?"undefined":n(Reflect))?Reflect:null,s=o&&"function"==typeof o.apply?o.apply:function(t,e,r){return Function.prototype.apply.call(t,e,r)};i=o&&"function"==typeof o.ownKeys?o.ownKeys:Object.getOwnPropertySymbols?function(t){return Object.getOwnPropertyNames(t).concat(Object.getOwnPropertySymbols(t))}:function(t){return Object.getOwnPropertyNames(t)};var a=Number.isNaN||function(t){return t!=t};function u(){u.init.call(this)}t.exports=u,t.exports.once=function(t,e){return new Promise((function(r,n){function i(r){t.removeListener(e,o),n(r)}function o(){"function"==typeof t.removeListener&&t.removeListener("error",i),r([].slice.call(arguments))}b(t,e,o,{once:!0}),"error"!==e&&function(t,e,r){"function"==typeof t.on&&b(t,"error",e,r)}(t,i,{once:!0})}))},u.EventEmitter=u,u.prototype._events=void 0,u.prototype._eventsCount=0,u.prototype._maxListeners=void 0;var h=10;function f(t){if("function"!=typeof t)throw new TypeError('The "listener" argument must be of type Function. Received type '+n(t))}function c(t){return void 0===t._maxListeners?u.defaultMaxListeners:t._maxListeners}function l(t,e,r,n){var i,o,s,a;if(f(r),void 0===(o=t._events)?(o=t._events=Object.create(null),t._eventsCount=0):(void 0!==o.newListener&&(t.emit("newListener",e,r.listener?r.listener:r),o=t._events),s=o[e]),void 0===s)s=o[e]=r,++t._eventsCount;else if("function"==typeof s?s=o[e]=n?[r,s]:[s,r]:n?s.unshift(r):s.push(r),(i=c(t))>0&&s.length>i&&!s.warned){s.warned=!0;var u=new Error("Possible EventEmitter memory leak detected. "+s.length+" "+String(e)+" listeners added. Use emitter.setMaxListeners() to increase limit");u.name="MaxListenersExceededWarning",u.emitter=t,u.type=e,u.count=s.length,a=u,console&&console.warn&&console.warn(a)}return t}function d(){if(!this.fired)return this.target.removeListener(this.type,this.wrapFn),this.fired=!0,0===arguments.length?this.listener.call(this.target):this.listener.apply(this.target,arguments)}function p(t,e,r){var n={fired:!1,wrapFn:void 0,target:t,type:e,listener:r},i=d.bind(n);return i.listener=r,n.wrapFn=i,i}function m(t,e,r){var n=t._events;if(void 0===n)return[];var i=n[e];return void 0===i?[]:"function"==typeof i?r?[i.listener||i]:[i]:r?function(t){for(var e=new Array(t.length),r=0;r0&&(o=e[0]),o instanceof Error)throw o;var a=new Error("Unhandled error."+(o?" ("+o.message+")":""));throw a.context=o,a}var u=i[t];if(void 0===u)return!1;if("function"==typeof u)s(u,this,e);else{var h=u.length,f=v(u,h);for(r=0;r=0;o--)if(r[o]===e||r[o].listener===e){s=r[o].listener,i=o;break}if(i<0)return this;0===i?r.shift():function(t,e){for(;e+1=0;n--)this.removeListener(t,e[n]);return this},u.prototype.listeners=function(t){return m(this,t,!0)},u.prototype.rawListeners=function(t){return m(this,t,!1)},u.listenerCount=function(t,e){return"function"==typeof t.listenerCount?t.listenerCount(e):y.call(t,e)},u.prototype.listenerCount=y,u.prototype.eventNames=function(){return this._eventsCount>0?i(this._events):[]}},function(t,e,r){"use strict"; +/*! + * Module dependencies. 
+ */function n(t,e,r){return e in t?Object.defineProperty(t,e,{value:r,enumerable:!0,configurable:!0,writable:!0}):t[e]=r,t}function i(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return o(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return o(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,i=function(){};return{s:i,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:i}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function o(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0?".":"")+tt[w],ot||(this.$set(st,{}),this.$__isSelected(st)||this.unmarkModified(st),ot=this.$__getValue(st));if(tt.length<=1)it=t;else{for(w=0;w0&&null!=e[0]&&null!=e[0].$__&&null!=e[0].$__.populated){for(var bt=Object.keys(e[0].$__.populated),gt=function(){var r=_t[wt];a.$populated(t+"."+r,e.map((function(t){return t.$populated(r)})),e[0].$__.populated[r].options)},wt=0,_t=bt;wt<_t.length;wt++)gt();mt=!0}if(!mt&&this.$__.populated){if(Array.isArray(e)&&this.$__.populated[t])for(var Mt=0;Mt0){var r=this.$get(t);return null==r||"object"===s(r)&&(q.isPOJO(r)?ut(r):0===Object.keys(r.toObject(e)).length)}return 0===Object.keys(this.toObject(e)).length},nt.prototype.modifiedPaths=function(t){t=t||{};var e=Object.keys(this.$__.activePaths.states.modify),r=this;return e.reduce((function(e,n){var o=n.split(".");if(e=e.concat(o.reduce((function(t,e,r){return t.concat(o.slice(0,r).concat(e).join("."))}),[]).filter((function(t){return-1===e.indexOf(t)}))),!t.includeChildren)return e;var a=r.$get(n);if(null!=a&&"object"===s(a))if(a._doc&&(a=a._doc),Array.isArray(a)){for(var u=a.length,h=0;h0){"function"==typeof r[r.length-1]&&(t=r.pop());var n,o=q.populate.apply(null,r),s=i(o);try{for(s.s();!(n=s.n()).done;){var a=n.value;e[a.path]=a}}catch(t){s.e(t)}finally{s.f()}}var u=q.object.vals(e),h=this.constructor;if(this.$__isNested){h=this.$__[X].constructor;var f=this.$__.nestedPath;u.forEach((function(t){t.path=f+"."+t.path}))}if(null!=this.$session()){var c=this.$session();u.forEach((function(t){null!=t.options?"session"in t.options||(t.options.session=c):t.options={session:c}}))}return u.forEach((function(t){t._localModel=h})),h.populate(this,u,t)},nt.prototype.$getPopulatedDocs=function(){var t=[];null!=this.$__.populated&&(t=t.concat(Object.keys(this.$__.populated)));var e,r=[],n=i(t);try{for(n.s();!(e=n.n()).done;){var o=e.value,s=this.$get(o);Array.isArray(s)?r=r.concat(s):s instanceof nt&&r.push(s)}}catch(t){n.e(t)}finally{n.f()}return r},nt.prototype.populated=function(t,e,r){if(null==e||!0===e){if(!this.$__.populated)return;if("string"!=typeof t)return;var n=t.endsWith(".$*")?t.replace(/\.\$\*$/,""):t,i=this.$__.populated[n];return i?!0===e?i:i.value:void 0}this.$__.populated||(this.$__.populated={}),this.$__.populated[t]={value:e,options:r};for(var o=t.split("."),s=0;s=t.length)&&56320==(64512&t.charCodeAt(e+1)))}function 
s(t){return(t>>>24|t>>>8&65280|t<<8&16711680|(255&t)<<24)>>>0}function a(t){return 1===t.length?"0"+t:t}function u(t){return 7===t.length?"0"+t:6===t.length?"00"+t:5===t.length?"000"+t:4===t.length?"0000"+t:3===t.length?"00000"+t:2===t.length?"000000"+t:1===t.length?"0000000"+t:t}e.inherits=i,e.toArray=function(t,e){if(Array.isArray(t))return t.slice();if(!t)return[];var r=[];if("string"==typeof t)if(e){if("hex"===e)for((t=t.replace(/[^a-z0-9]+/gi,"")).length%2!=0&&(t="0"+t),i=0;i>6|192,r[n++]=63&s|128):o(t,i)?(s=65536+((1023&s)<<10)+(1023&t.charCodeAt(++i)),r[n++]=s>>18|240,r[n++]=s>>12&63|128,r[n++]=s>>6&63|128,r[n++]=63&s|128):(r[n++]=s>>12|224,r[n++]=s>>6&63|128,r[n++]=63&s|128)}else for(i=0;i>>0}return s},e.split32=function(t,e){for(var r=new Array(4*t.length),n=0,i=0;n>>24,r[i+1]=o>>>16&255,r[i+2]=o>>>8&255,r[i+3]=255&o):(r[i+3]=o>>>24,r[i+2]=o>>>16&255,r[i+1]=o>>>8&255,r[i]=255&o)}return r},e.rotr32=function(t,e){return t>>>e|t<<32-e},e.rotl32=function(t,e){return t<>>32-e},e.sum32=function(t,e){return t+e>>>0},e.sum32_3=function(t,e,r){return t+e+r>>>0},e.sum32_4=function(t,e,r,n){return t+e+r+n>>>0},e.sum32_5=function(t,e,r,n,i){return t+e+r+n+i>>>0},e.sum64=function(t,e,r,n){var i=t[e],o=n+t[e+1]>>>0,s=(o>>0,t[e+1]=o},e.sum64_hi=function(t,e,r,n){return(e+n>>>0>>0},e.sum64_lo=function(t,e,r,n){return e+n>>>0},e.sum64_4_hi=function(t,e,r,n,i,o,s,a){var u=0,h=e;return u+=(h=h+n>>>0)>>0)>>0)>>0},e.sum64_4_lo=function(t,e,r,n,i,o,s,a){return e+n+o+a>>>0},e.sum64_5_hi=function(t,e,r,n,i,o,s,a,u,h){var f=0,c=e;return f+=(c=c+n>>>0)>>0)>>0)>>0)>>0},e.sum64_5_lo=function(t,e,r,n,i,o,s,a,u,h){return e+n+o+a+h>>>0},e.rotr64_hi=function(t,e,r){return(e<<32-r|t>>>r)>>>0},e.rotr64_lo=function(t,e,r){return(t<<32-r|e>>>r)>>>0},e.shr64_hi=function(t,e,r){return t>>>r},e.shr64_lo=function(t,e,r){return(t<<32-r|e>>>r)>>>0}},function(t,e,r){"use strict";var n=r(24).get().ObjectId,i=r(1).objectIdSymbol; +/*! + * Getter for convenience with populate, see gh-6115 + */ +Object.defineProperty(n.prototype,"_id",{enumerable:!1,configurable:!0,get:function(){return this}}), +/*! 
+ * Convenience `valueOf()` to allow comparing ObjectIds using double equals re: gh-7299 + */ +n.prototype.hasOwnProperty("valueOf")||(n.prototype.valueOf=function(){return this.toString()}),n.prototype[i]=!0,t.exports=n},function(t,e,r){(function(t){function e(t){return(e="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}!function(t,n){"use strict";function i(t,e){if(!t)throw new Error(e||"Assertion failed")}function o(t,e){t.super_=e;var r=function(){};r.prototype=e.prototype,t.prototype=new r,t.prototype.constructor=t}function s(t,e,r){if(s.isBN(t))return t;this.negative=0,this.words=null,this.length=0,this.red=null,null!==t&&("le"!==e&&"be"!==e||(r=e,e=10),this._init(t||0,e||10,r||"be"))}var a;"object"===e(t)?t.exports=s:n.BN=s,s.BN=s,s.wordSize=26;try{a="undefined"!=typeof window&&void 0!==window.Buffer?window.Buffer:r(235).Buffer}catch(t){}function u(t,e){var r=t.charCodeAt(e);return r>=65&&r<=70?r-55:r>=97&&r<=102?r-87:r-48&15}function h(t,e,r){var n=u(t,r);return r-1>=e&&(n|=u(t,r-1)<<4),n}function f(t,e,r,n){for(var i=0,o=Math.min(t.length,r),s=e;s=49?a-49+10:a>=17?a-17+10:a}return i}s.isBN=function(t){return t instanceof s||null!==t&&"object"===e(t)&&t.constructor.wordSize===s.wordSize&&Array.isArray(t.words)},s.max=function(t,e){return t.cmp(e)>0?t:e},s.min=function(t,e){return t.cmp(e)<0?t:e},s.prototype._init=function(t,r,n){if("number"==typeof t)return this._initNumber(t,r,n);if("object"===e(t))return this._initArray(t,r,n);"hex"===r&&(r=16),i(r===(0|r)&&r>=2&&r<=36);var o=0;"-"===(t=t.toString().replace(/\s+/g,""))[0]&&(o++,this.negative=1),o=0;n-=3)s=t[n]|t[n-1]<<8|t[n-2]<<16,this.words[o]|=s<>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);else if("le"===r)for(n=0,o=0;n>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);return this.strip()},s.prototype._parseHex=function(t,e,r){this.length=Math.ceil((t.length-e)/6),this.words=new Array(this.length);for(var n=0;n=e;n-=2)i=h(t,e,n)<=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;else for(n=(t.length-e)%2==0?e+1:e;n=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;this.strip()},s.prototype._parseBase=function(t,e,r){this.words=[0],this.length=1;for(var n=0,i=1;i<=67108863;i*=e)n++;n--,i=i/e|0;for(var o=t.length-r,s=o%n,a=Math.min(o,o-s)+r,u=0,h=r;h1&&0===this.words[this.length-1];)this.length--;return this._normSign()},s.prototype._normSign=function(){return 1===this.length&&0===this.words[0]&&(this.negative=0),this},s.prototype.inspect=function(){return(this.red?""};var c=["","0","00","000","0000","00000","000000","0000000","00000000","000000000","0000000000","00000000000","000000000000","0000000000000","00000000000000","000000000000000","0000000000000000","00000000000000000","000000000000000000","0000000000000000000","00000000000000000000","000000000000000000000","0000000000000000000000","00000000000000000000000","000000000000000000000000","0000000000000000000000000"],l=[0,0,25,16,12,11,10,9,8,8,7,7,7,7,6,6,6,6,6,6,6,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5],d=[0,0,33554432,43046721,16777216,48828125,60466176,40353607,16777216,43046721,1e7,19487171,35831808,62748517,7529536,11390625,16777216,24137569,34012224,47045881,64e6,4084101,5153632,6436343,7962624,9765625,11881376,14348907,17210368,20511149,243e5,28629151,33554432,39135393,45435424,52521875,60466176];function p(t,e,r){r.negative=e.negative^t.negative;var n=t.length+e.length|0;r.length=n,n=n-1|0;var 
[Minified webpack bundle output (client-side build): third-party browser code including bn.js big-integer arithmetic, Mongoose error classes (CastError, MongooseError), crypto-browserify primitives (hashes, AES, elliptic-curve helpers), Node stream/StringDecoder polyfills, and the Buffer polyfill with base64-js and ieee754 (Feross Aboukhadijeh, MIT / BSD-3-Clause), plus the tslib license notice (Microsoft Corporation).]
+***************************************************************************** */ +var w=function(t,e){return(w=Object.setPrototypeOf||{__proto__:[]}instanceof Array&&function(t,e){t.__proto__=e}||function(t,e){for(var r in e)e.hasOwnProperty(r)&&(t[r]=e[r])})(t,e)};function _(t,e){function r(){this.constructor=t}w(t,e),t.prototype=null===e?Object.create(e):(r.prototype=e.prototype,new r)}var M=function(t){function e(){return null!==t&&t.apply(this,arguments)||this}return _(e,t),Object.defineProperty(e.prototype,"name",{get:function(){return"BSONError"},enumerable:!1,configurable:!0}),e}(Error),S=function(t){function e(){return null!==t&&t.apply(this,arguments)||this}return _(e,t),Object.defineProperty(e.prototype,"name",{get:function(){return"BSONTypeError"},enumerable:!1,configurable:!0}),e}(TypeError);function O(t){return t&&t.Math==Math&&t}function A(){return O("object"===("undefined"==typeof globalThis?"undefined":n(globalThis))&&globalThis)||O("object"===("undefined"==typeof window?"undefined":n(window))&&window)||O("object"===("undefined"==typeof self?"undefined":n(self))&&self)||O("object"===(void 0===t?"undefined":n(t))&&t)||Function("return this")()}function E(t){return t.toString().replace("function(","function (")}var x=function(t){var e,r="object"===n((e=A()).navigator)&&"ReactNative"===e.navigator.product?"BSON: For React Native please polyfill crypto.getRandomValues, e.g. using: https://www.npmjs.com/package/react-native-get-random-values.":"BSON: No cryptographic implementation for random bytes present, falling back to a less secure implementation.";console.warn(r);for(var i=g.alloc(t),o=0;o");this.sub_type=null!=r?r:t.BSON_BINARY_SUBTYPE_DEFAULT,null==e?(this.buffer=g.alloc(t.BUFFER_SIZE),this.position=0):("string"==typeof e?this.buffer=g.from(e,"binary"):Array.isArray(e)?this.buffer=g.from(e):this.buffer=D(e),this.position=this.buffer.byteLength)}return t.prototype.put=function(e){if("string"==typeof e&&1!==e.length)throw new S("only accepts single character String");if("number"!=typeof e&&1!==e.length)throw new S("only accepts single character Uint8Array or Array");var r;if((r="string"==typeof e?e.charCodeAt(0):"number"==typeof e?e:e[0])<0||r>255)throw new S("only accepts number in a valid unsigned byte range 0-255");if(this.buffer.length>this.position)this.buffer[this.position++]=r;else{var n=g.alloc(t.BUFFER_SIZE+this.buffer.length);this.buffer.copy(n,0,0,this.buffer.length),this.buffer=n,this.buffer[this.position++]=r}},t.prototype.write=function(t,e){if(e="number"==typeof e?e:this.position,this.buffer.lengththis.position?e+t.length:this.position):"string"==typeof t&&(this.buffer.write(t,e,t.length,"binary"),this.position=e+t.length>this.position?e+t.length:this.position)},t.prototype.read=function(t,e){return e=e&&e>0?e:this.position,this.buffer.slice(t,t+e)},t.prototype.value=function(t){return(t=!!t)&&this.buffer.length===this.position?this.buffer:t?this.buffer.slice(0,this.position):this.buffer.toString("binary",0,this.position)},t.prototype.length=function(){return this.position},t.prototype.toJSON=function(){return this.buffer.toString("base64")},t.prototype.toString=function(t){return this.buffer.toString(t)},t.prototype.toExtendedJSON=function(t){t=t||{};var e=this.buffer.toString("base64"),r=Number(this.sub_type).toString(16);return t.legacy?{$binary:e,$type:1===r.length?"0"+r:r}:{$binary:{base64:e,subType:1===r.length?"0"+r:r}}},t.prototype.toUUID=function(){if(this.sub_type===t.SUBTYPE_UUID)return new z(this.buffer.slice(0,this.position));throw new 
M('Binary sub_type "'+this.sub_type+'" is not supported for converting to UUID. Only "'+t.SUBTYPE_UUID+'" is currently supported.')},t.fromExtendedJSON=function(e,r){var n,i;if(r=r||{},"$binary"in e?r.legacy&&"string"==typeof e.$binary&&"$type"in e?(i=e.$type?parseInt(e.$type,16):0,n=g.from(e.$binary,"base64")):"string"!=typeof e.$binary&&(i=e.$binary.subType?parseInt(e.$binary.subType,16):0,n=g.from(e.$binary.base64,"base64")):"$uuid"in e&&(i=4,n=q(e.$uuid)),!n)throw new S("Unexpected Binary Extended JSON format "+JSON.stringify(e));return new t(n,i)},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){return'new Binary(Buffer.from("'+this.value(!0).toString("hex")+'", "hex"), '+this.sub_type+")"},t.BSON_BINARY_SUBTYPE_DEFAULT=0,t.BUFFER_SIZE=256,t.SUBTYPE_DEFAULT=0,t.SUBTYPE_FUNCTION=1,t.SUBTYPE_BYTE_ARRAY=2,t.SUBTYPE_UUID_OLD=3,t.SUBTYPE_UUID=4,t.SUBTYPE_MD5=5,t.SUBTYPE_ENCRYPTED=6,t.SUBTYPE_COLUMN=7,t.SUBTYPE_USER_DEFINED=128,t}();Object.defineProperty(V.prototype,"_bsontype",{value:"Binary"});var K=function(){function t(e,r){if(!(this instanceof t))return new t(e,r);this.code=e,this.scope=r}return t.prototype.toJSON=function(){return{code:this.code,scope:this.scope}},t.prototype.toExtendedJSON=function(){return this.scope?{$code:this.code,$scope:this.scope}:{$code:this.code}},t.fromExtendedJSON=function(e){return new t(e.$code,e.$scope)},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){var t=this.toJSON();return'new Code("'+t.code+'"'+(t.scope?", "+JSON.stringify(t.scope):"")+")"},t}();function H(t){return I(t)&&null!=t.$id&&"string"==typeof t.$ref&&(null==t.$db||"string"==typeof t.$db)}Object.defineProperty(K.prototype,"_bsontype",{value:"Code"});var Z=function(){function t(e,r,n,i){if(!(this instanceof t))return new t(e,r,n,i);var o=e.split(".");2===o.length&&(n=o.shift(),e=o.shift()),this.collection=e,this.oid=r,this.db=n,this.fields=i||{}}return Object.defineProperty(t.prototype,"namespace",{get:function(){return this.collection},set:function(t){this.collection=t},enumerable:!1,configurable:!0}),t.prototype.toJSON=function(){var t=Object.assign({$ref:this.collection,$id:this.oid},this.fields);return null!=this.db&&(t.$db=this.db),t},t.prototype.toExtendedJSON=function(t){t=t||{};var e={$ref:this.collection,$id:this.oid};return t.legacy?e:(this.db&&(e.$db=this.db),e=Object.assign(e,this.fields))},t.fromExtendedJSON=function(e){var r=Object.assign({},e);return delete r.$ref,delete r.$id,delete r.$db,new t(e.$ref,e.$id,e.$db,r)},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){var t=void 0===this.oid||void 0===this.oid.toString?this.oid:this.oid.toString();return'new DBRef("'+this.namespace+'", new ObjectId("'+t+'")'+(this.db?', "'+this.db+'"':"")+")"},t}();Object.defineProperty(Z.prototype,"_bsontype",{value:"DBRef"});var W=void 0;try{W=new WebAssembly.Instance(new WebAssembly.Module(new 
Uint8Array([0,97,115,109,1,0,0,0,1,13,2,96,0,1,127,96,4,127,127,127,127,1,127,3,7,6,0,1,1,1,1,1,6,6,1,127,1,65,0,11,7,50,6,3,109,117,108,0,1,5,100,105,118,95,115,0,2,5,100,105,118,95,117,0,3,5,114,101,109,95,115,0,4,5,114,101,109,95,117,0,5,8,103,101,116,95,104,105,103,104,0,0,10,191,1,6,4,0,35,0,11,36,1,1,126,32,0,173,32,1,173,66,32,134,132,32,2,173,32,3,173,66,32,134,132,126,34,4,66,32,135,167,36,0,32,4,167,11,36,1,1,126,32,0,173,32,1,173,66,32,134,132,32,2,173,32,3,173,66,32,134,132,127,34,4,66,32,135,167,36,0,32,4,167,11,36,1,1,126,32,0,173,32,1,173,66,32,134,132,32,2,173,32,3,173,66,32,134,132,128,34,4,66,32,135,167,36,0,32,4,167,11,36,1,1,126,32,0,173,32,1,173,66,32,134,132,32,2,173,32,3,173,66,32,134,132,129,34,4,66,32,135,167,36,0,32,4,167,11,36,1,1,126,32,0,173,32,1,173,66,32,134,132,32,2,173,32,3,173,66,32,134,132,130,34,4,66,32,135,167,36,0,32,4,167,11])),{}).exports}catch(t){}var J={},Y={},Q=function(){function t(e,r,n){if(void 0===e&&(e=0),!(this instanceof t))return new t(e,r,n);"bigint"==typeof e?Object.assign(this,t.fromBigInt(e,!!r)):"string"==typeof e?Object.assign(this,t.fromString(e,!!r)):(this.low=0|e,this.high=0|r,this.unsigned=!!n),Object.defineProperty(this,"__isLong__",{value:!0,configurable:!1,writable:!1,enumerable:!1})}return t.fromBits=function(e,r,n){return new t(e,r,n)},t.fromInt=function(e,r){var n,i,o;return r?(o=0<=(e>>>=0)&&e<256)&&(i=Y[e])?i:(n=t.fromBits(e,(0|e)<0?-1:0,!0),o&&(Y[e]=n),n):(o=-128<=(e|=0)&&e<128)&&(i=J[e])?i:(n=t.fromBits(e,e<0?-1:0,!1),o&&(J[e]=n),n)},t.fromNumber=function(e,r){if(isNaN(e))return r?t.UZERO:t.ZERO;if(r){if(e<0)return t.UZERO;if(e>=0x10000000000000000)return t.MAX_UNSIGNED_VALUE}else{if(e<=-0x8000000000000000)return t.MIN_VALUE;if(e+1>=0x8000000000000000)return t.MAX_VALUE}return e<0?t.fromNumber(-e,r).neg():t.fromBits(e%4294967296|0,e/4294967296|0,r)},t.fromBigInt=function(e,r){return t.fromString(e.toString(),r)},t.fromString=function(e,r,n){if(0===e.length)throw Error("empty string");if("NaN"===e||"Infinity"===e||"+Infinity"===e||"-Infinity"===e)return t.ZERO;if("number"==typeof r?(n=r,r=!1):r=!!r,(n=n||10)<2||360)throw Error("interior hyphen");if(0===i)return t.fromString(e.substring(1),r,n).neg();for(var o=t.fromNumber(Math.pow(n,8)),s=t.ZERO,a=0;a>>16,n=65535&this.high,i=this.low>>>16,o=65535&this.low,s=e.high>>>16,a=65535&e.high,u=e.low>>>16,h=0,f=0,c=0,l=0;return c+=(l+=o+(65535&e.low))>>>16,l&=65535,f+=(c+=i+u)>>>16,c&=65535,h+=(f+=n+a)>>>16,f&=65535,h+=r+s,h&=65535,t.fromBits(c<<16|l,h<<16|f,this.unsigned)},t.prototype.and=function(e){return t.isLong(e)||(e=t.fromValue(e)),t.fromBits(this.low&e.low,this.high&e.high,this.unsigned)},t.prototype.compare=function(e){if(t.isLong(e)||(e=t.fromValue(e)),this.eq(e))return 0;var r=this.isNegative(),n=e.isNegative();return r&&!n?-1:!r&&n?1:this.unsigned?e.high>>>0>this.high>>>0||e.high===this.high&&e.low>>>0>this.low>>>0?-1:1:this.sub(e).isNegative()?-1:1},t.prototype.comp=function(t){return this.compare(t)},t.prototype.divide=function(e){if(t.isLong(e)||(e=t.fromValue(e)),e.isZero())throw Error("division by zero");if(W){if(!this.unsigned&&-2147483648===this.high&&-1===e.low&&-1===e.high)return this;var r=(this.unsigned?W.div_u:W.div_s)(this.low,this.high,e.low,e.high);return t.fromBits(r,W.get_high(),this.unsigned)}if(this.isZero())return this.unsigned?t.UZERO:t.ZERO;var n,i,o;if(this.unsigned){if(e.unsigned||(e=e.toUnsigned()),e.gt(this))return t.UZERO;if(e.gt(this.shru(1)))return t.UONE;o=t.UZERO}else{if(this.eq(t.MIN_VALUE))return 
e.eq(t.ONE)||e.eq(t.NEG_ONE)?t.MIN_VALUE:e.eq(t.MIN_VALUE)?t.ONE:(n=this.shr(1).div(e).shl(1)).eq(t.ZERO)?e.isNegative()?t.ONE:t.NEG_ONE:(i=this.sub(e.mul(n)),o=n.add(i.div(e)));if(e.eq(t.MIN_VALUE))return this.unsigned?t.UZERO:t.ZERO;if(this.isNegative())return e.isNegative()?this.neg().div(e.neg()):this.neg().div(e).neg();if(e.isNegative())return this.div(e.neg()).neg();o=t.ZERO}for(i=this;i.gte(e);){n=Math.max(1,Math.floor(i.toNumber()/e.toNumber()));for(var s=Math.ceil(Math.log(n)/Math.LN2),a=s<=48?1:Math.pow(2,s-48),u=t.fromNumber(n),h=u.mul(e);h.isNegative()||h.gt(i);)n-=a,h=(u=t.fromNumber(n,this.unsigned)).mul(e);u.isZero()&&(u=t.ONE),o=o.add(u),i=i.sub(h)}return o},t.prototype.div=function(t){return this.divide(t)},t.prototype.equals=function(e){return t.isLong(e)||(e=t.fromValue(e)),(this.unsigned===e.unsigned||this.high>>>31!=1||e.high>>>31!=1)&&(this.high===e.high&&this.low===e.low)},t.prototype.eq=function(t){return this.equals(t)},t.prototype.getHighBits=function(){return this.high},t.prototype.getHighBitsUnsigned=function(){return this.high>>>0},t.prototype.getLowBits=function(){return this.low},t.prototype.getLowBitsUnsigned=function(){return this.low>>>0},t.prototype.getNumBitsAbs=function(){if(this.isNegative())return this.eq(t.MIN_VALUE)?64:this.neg().getNumBitsAbs();var e,r=0!==this.high?this.high:this.low;for(e=31;e>0&&0==(r&1<0},t.prototype.gt=function(t){return this.greaterThan(t)},t.prototype.greaterThanOrEqual=function(t){return this.comp(t)>=0},t.prototype.gte=function(t){return this.greaterThanOrEqual(t)},t.prototype.ge=function(t){return this.greaterThanOrEqual(t)},t.prototype.isEven=function(){return 0==(1&this.low)},t.prototype.isNegative=function(){return!this.unsigned&&this.high<0},t.prototype.isOdd=function(){return 1==(1&this.low)},t.prototype.isPositive=function(){return this.unsigned||this.high>=0},t.prototype.isZero=function(){return 0===this.high&&0===this.low},t.prototype.lessThan=function(t){return this.comp(t)<0},t.prototype.lt=function(t){return this.lessThan(t)},t.prototype.lessThanOrEqual=function(t){return this.comp(t)<=0},t.prototype.lte=function(t){return this.lessThanOrEqual(t)},t.prototype.modulo=function(e){if(t.isLong(e)||(e=t.fromValue(e)),W){var r=(this.unsigned?W.rem_u:W.rem_s)(this.low,this.high,e.low,e.high);return t.fromBits(r,W.get_high(),this.unsigned)}return this.sub(this.div(e).mul(e))},t.prototype.mod=function(t){return this.modulo(t)},t.prototype.rem=function(t){return this.modulo(t)},t.prototype.multiply=function(e){if(this.isZero())return t.ZERO;if(t.isLong(e)||(e=t.fromValue(e)),W){var r=W.mul(this.low,this.high,e.low,e.high);return t.fromBits(r,W.get_high(),this.unsigned)}if(e.isZero())return t.ZERO;if(this.eq(t.MIN_VALUE))return e.isOdd()?t.MIN_VALUE:t.ZERO;if(e.eq(t.MIN_VALUE))return this.isOdd()?t.MIN_VALUE:t.ZERO;if(this.isNegative())return e.isNegative()?this.neg().mul(e.neg()):this.neg().mul(e).neg();if(e.isNegative())return this.mul(e.neg()).neg();if(this.lt(t.TWO_PWR_24)&&e.lt(t.TWO_PWR_24))return t.fromNumber(this.toNumber()*e.toNumber(),this.unsigned);var n=this.high>>>16,i=65535&this.high,o=this.low>>>16,s=65535&this.low,a=e.high>>>16,u=65535&e.high,h=e.low>>>16,f=65535&e.low,c=0,l=0,d=0,p=0;return d+=(p+=s*f)>>>16,p&=65535,l+=(d+=o*f)>>>16,d&=65535,l+=(d+=s*h)>>>16,d&=65535,c+=(l+=i*f)>>>16,l&=65535,c+=(l+=o*h)>>>16,l&=65535,c+=(l+=s*u)>>>16,l&=65535,c+=n*f+i*h+o*u+s*a,c&=65535,t.fromBits(d<<16|p,c<<16|l,this.unsigned)},t.prototype.mul=function(t){return 
this.multiply(t)},t.prototype.negate=function(){return!this.unsigned&&this.eq(t.MIN_VALUE)?t.MIN_VALUE:this.not().add(t.ONE)},t.prototype.neg=function(){return this.negate()},t.prototype.not=function(){return t.fromBits(~this.low,~this.high,this.unsigned)},t.prototype.notEquals=function(t){return!this.equals(t)},t.prototype.neq=function(t){return this.notEquals(t)},t.prototype.ne=function(t){return this.notEquals(t)},t.prototype.or=function(e){return t.isLong(e)||(e=t.fromValue(e)),t.fromBits(this.low|e.low,this.high|e.high,this.unsigned)},t.prototype.shiftLeft=function(e){return t.isLong(e)&&(e=e.toInt()),0==(e&=63)?this:e<32?t.fromBits(this.low<>>32-e,this.unsigned):t.fromBits(0,this.low<>>e|this.high<<32-e,this.high>>e,this.unsigned):t.fromBits(this.high>>e-32,this.high>=0?0:-1,this.unsigned)},t.prototype.shr=function(t){return this.shiftRight(t)},t.prototype.shiftRightUnsigned=function(e){if(t.isLong(e)&&(e=e.toInt()),0===(e&=63))return this;var r=this.high;if(e<32){var n=this.low;return t.fromBits(n>>>e|r<<32-e,r>>>e,this.unsigned)}return 32===e?t.fromBits(r,0,this.unsigned):t.fromBits(r>>>e-32,0,this.unsigned)},t.prototype.shr_u=function(t){return this.shiftRightUnsigned(t)},t.prototype.shru=function(t){return this.shiftRightUnsigned(t)},t.prototype.subtract=function(e){return t.isLong(e)||(e=t.fromValue(e)),this.add(e.neg())},t.prototype.sub=function(t){return this.subtract(t)},t.prototype.toInt=function(){return this.unsigned?this.low>>>0:this.low},t.prototype.toNumber=function(){return this.unsigned?4294967296*(this.high>>>0)+(this.low>>>0):4294967296*this.high+(this.low>>>0)},t.prototype.toBigInt=function(){return BigInt(this.toString())},t.prototype.toBytes=function(t){return t?this.toBytesLE():this.toBytesBE()},t.prototype.toBytesLE=function(){var t=this.high,e=this.low;return[255&e,e>>>8&255,e>>>16&255,e>>>24,255&t,t>>>8&255,t>>>16&255,t>>>24]},t.prototype.toBytesBE=function(){var t=this.high,e=this.low;return[t>>>24,t>>>16&255,t>>>8&255,255&t,e>>>24,e>>>16&255,e>>>8&255,255&e]},t.prototype.toSigned=function(){return this.unsigned?t.fromBits(this.low,this.high,!1):this},t.prototype.toString=function(e){if((e=e||10)<2||36>>0).toString(e);if((s=u).isZero())return h+a;for(;h.length<6;)h="0"+h;a=""+h+a}},t.prototype.toUnsigned=function(){return this.unsigned?this:t.fromBits(this.low,this.high,!0)},t.prototype.xor=function(e){return t.isLong(e)||(e=t.fromValue(e)),t.fromBits(this.low^e.low,this.high^e.high,this.unsigned)},t.prototype.eqz=function(){return this.isZero()},t.prototype.le=function(t){return this.lessThanOrEqual(t)},t.prototype.toExtendedJSON=function(t){return t&&t.relaxed?this.toNumber():{$numberLong:this.toString()}},t.fromExtendedJSON=function(e,r){var n=t.fromString(e.$numberLong);return r&&r.relaxed?n.toNumber():n},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){return'new Long("'+this.toString()+'"'+(this.unsigned?", true":"")+")"},t.TWO_PWR_24=t.fromInt(1<<24),t.MAX_UNSIGNED_VALUE=t.fromBits(-1,-1,!0),t.ZERO=t.fromInt(0),t.UZERO=t.fromInt(0,!0),t.ONE=t.fromInt(1),t.UONE=t.fromInt(1,!0),t.NEG_ONE=t.fromInt(-1),t.MAX_VALUE=t.fromBits(-1,2147483647,!1),t.MIN_VALUE=t.fromBits(0,-2147483648,!1),t}();Object.defineProperty(Q.prototype,"__isLong__",{value:!0}),Object.defineProperty(Q.prototype,"_bsontype",{value:"Long"});var 
G=/^(\+|-)?(\d+|(\d*\.\d*))?(E|e)?([-+])?(\d+)?$/,X=/^(\+|-)?(Infinity|inf)$/i,tt=/^(\+|-)?NaN$/i,et=[124,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0].reverse(),rt=[248,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0].reverse(),nt=[120,0,0,0,0,0,0,0,0,0,0,0,0,0,0,0].reverse(),it=/^([-+])?(\d+)?$/;function ot(t){return!isNaN(parseInt(t,10))}function st(t){var e=Q.fromNumber(1e9),r=Q.fromNumber(0);if(!(t.parts[0]||t.parts[1]||t.parts[2]||t.parts[3]))return{quotient:t,rem:r};for(var n=0;n<=3;n++)r=(r=r.shiftLeft(32)).add(new Q(t.parts[n],0)),t.parts[n]=r.div(e).low,r=r.modulo(e);return{quotient:t,rem:r}}function at(t,e){throw new S('"'+t+'" is not a valid Decimal128 string - '+e)}var ut=function(){function t(e){if(!(this instanceof t))return new t(e);this.bytes="string"==typeof e?t.fromString(e).bytes:e}return t.fromString=function(e){var r,n=!1,i=!1,o=!1,s=0,a=0,u=0,h=0,f=0,c=[0],l=0,d=0,p=0,m=0,y=0,v=0,b=new Q(0,0),w=new Q(0,0),_=0;if(e.length>=7e3)throw new S(e+" not a valid Decimal128 string");var M=e.match(G),O=e.match(X),A=e.match(tt);if(!M&&!O&&!A||0===e.length)throw new S(e+" not a valid Decimal128 string");if(M){var E=M[2],x=M[4],k=M[5],j=M[6];x&&void 0===j&&at(e,"missing exponent power"),x&&void 0===E&&at(e,"missing exponent base"),void 0===x&&(k||j)&&at(e,"missing e before exponent")}if("+"!==e[_]&&"-"!==e[_]||(n="-"===e[_++]),!ot(e[_])&&"."!==e[_]){if("i"===e[_]||"I"===e[_])return new t(g.from(n?rt:nt));if("N"===e[_])return new t(g.from(et))}for(;ot(e[_])||"."===e[_];)"."!==e[_]?(l<34&&("0"!==e[_]||o)&&(o||(f=a),o=!0,c[d++]=parseInt(e[_],10),l+=1),o&&(u+=1),i&&(h+=1),a+=1,_+=1):(i&&at(e,"contains multiple periods"),i=!0,_+=1);if(i&&!a)throw new S(e+" not a valid Decimal128 string");if("e"===e[_]||"E"===e[_]){var $=e.substr(++_).match(it);if(!$||!$[2])return new t(g.from(et));y=parseInt($[0],10),_+=$[0].length}if(e[_])return new t(g.from(et));if(p=0,l){if(m=l-1,1!==(s=u))for(;0===c[f+s-1];)s-=1}else p=0,m=0,c[0]=0,u=1,l=1,s=0;for(y<=h&&h-y>16384?y=-6176:y-=h;y>6111;){if((m+=1)-p>34){if(c.join("").match(/^0+$/)){y=6111;break}at(e,"overflow")}y-=1}for(;y<-6176||l=5&&(B=1,5===R))for(B=c[m]%2==1?1:0,v=f+m+2;v=0;T--)if(++c[T]>9&&(c[T]=0,0===T)){if(!(y<6111))return new t(g.from(n?rt:nt));y+=1,c[T]=1}}if(b=Q.fromNumber(0),w=Q.fromNumber(0),0===s)b=Q.fromNumber(0),w=Q.fromNumber(0);else if(m-p<17){T=p;for(w=Q.fromNumber(c[T++]),b=new Q(0,0);T<=m;T++)w=(w=w.multiply(Q.fromNumber(10))).add(Q.fromNumber(c[T]))}else{T=p;for(b=Q.fromNumber(c[T++]);T<=m-17;T++)b=(b=b.multiply(Q.fromNumber(10))).add(Q.fromNumber(c[T]));for(w=Q.fromNumber(c[T++]);T<=m;T++)w=(w=w.multiply(Q.fromNumber(10))).add(Q.fromNumber(c[T]))}var I,N,D,C,L=function(t,e){if(!t&&!e)return{high:Q.fromNumber(0),low:Q.fromNumber(0)};var r=t.shiftRightUnsigned(32),n=new Q(t.getLowBits(),0),i=e.shiftRightUnsigned(32),o=new Q(e.getLowBits(),0),s=r.multiply(i),a=r.multiply(o),u=n.multiply(i),h=n.multiply(o);return s=s.add(a.shiftRightUnsigned(32)),a=new Q(a.getLowBits(),0).add(u).add(h.shiftRightUnsigned(32)),{high:s=s.add(a.shiftRightUnsigned(32)),low:h=a.shiftLeft(32).add(new Q(h.getLowBits(),0))}}(b,Q.fromString("100000000000000000"));L.low=L.low.add(w),I=L.low,N=w,D=I.high>>>0,C=N.high>>>0,(D>>0>>0)&&(L.high=L.high.add(Q.fromNumber(1))),r=y+6176;var 
q={low:Q.fromNumber(0),high:Q.fromNumber(0)};L.high.shiftRightUnsigned(49).and(Q.fromNumber(1)).equals(Q.fromNumber(1))?(q.high=q.high.or(Q.fromNumber(3).shiftLeft(61)),q.high=q.high.or(Q.fromNumber(r).and(Q.fromNumber(16383).shiftLeft(47))),q.high=q.high.or(L.high.and(Q.fromNumber(0x7fffffffffff)))):(q.high=q.high.or(Q.fromNumber(16383&r).shiftLeft(49)),q.high=q.high.or(L.high.and(Q.fromNumber(562949953421311)))),q.low=L.low,n&&(q.high=q.high.or(Q.fromString("9223372036854775808")));var U=g.alloc(16);return _=0,U[_++]=255&q.low.low,U[_++]=q.low.low>>8&255,U[_++]=q.low.low>>16&255,U[_++]=q.low.low>>24&255,U[_++]=255&q.low.high,U[_++]=q.low.high>>8&255,U[_++]=q.low.high>>16&255,U[_++]=q.low.high>>24&255,U[_++]=255&q.high.low,U[_++]=q.high.low>>8&255,U[_++]=q.high.low>>16&255,U[_++]=q.high.low>>24&255,U[_++]=255&q.high.high,U[_++]=q.high.high>>8&255,U[_++]=q.high.high>>16&255,U[_++]=q.high.high>>24&255,new t(U)},t.prototype.toString=function(){for(var t,e=0,r=new Array(36),n=0;n>26&31;if(y>>3==3){if(30===y)return f.join("")+"Infinity";if(31===y)return"NaN";t=m>>15&16383,i=8+(m>>14&1)}else i=m>>14&7,t=m>>17&16383;var v=t-6176;if(h.parts[0]=(16383&m)+((15&i)<<14),h.parts[1]=p,h.parts[2]=d,h.parts[3]=l,0===h.parts[0]&&0===h.parts[1]&&0===h.parts[2]&&0===h.parts[3])u=!0;else for(s=3;s>=0;s--){var b=0,g=st(h);if(h=g.quotient,b=g.rem.low)for(o=8;o>=0;o--)r[9*s+o]=b%10,b=Math.floor(b/10)}if(u)e=1,r[a]=0;else for(e=36;!r[a];)e-=1,a+=1;var w=e-1+v;if(w>=34||w<=-7||v>0){if(e>34)return f.push("0"),v>0?f.push("E+"+v):v<0&&f.push("E"+v),f.join("");f.push(""+r[a++]),(e-=1)&&f.push(".");for(n=0;n0?f.push("+"+w):f.push(""+w)}else if(v>=0)for(n=0;n0)for(n=0;n<_;n++)f.push(""+r[a++]);else f.push("0");for(f.push(".");_++<0;)f.push("0");for(n=0;n=13&&(e=this.value.toExponential(13).toUpperCase()):e=this.value.toString(),{$numberDouble:e});var e},t.fromExtendedJSON=function(e,r){var n=parseFloat(e.$numberDouble);return r&&r.relaxed?n:new t(n)},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){return"new Double("+this.toExtendedJSON().$numberDouble+")"},t}();Object.defineProperty(ht.prototype,"_bsontype",{value:"Double"});var ft=function(){function t(e){if(!(this instanceof t))return new t(e);e instanceof Number&&(e=e.valueOf()),this.value=+e}return t.prototype.valueOf=function(){return this.value},t.prototype.toString=function(t){return this.value.toString(t)},t.prototype.toJSON=function(){return this.value},t.prototype.toExtendedJSON=function(t){return t&&(t.relaxed||t.legacy)?this.value:{$numberInt:this.value.toString()}},t.fromExtendedJSON=function(e,r){return r&&r.relaxed?parseInt(e.$numberInt,10):new t(e.$numberInt)},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){return"new Int32("+this.valueOf()+")"},t}();Object.defineProperty(ft.prototype,"_bsontype",{value:"Int32"});var ct=function(){function t(){if(!(this instanceof t))return new t}return t.prototype.toExtendedJSON=function(){return{$maxKey:1}},t.fromExtendedJSON=function(){return new t},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){return"new MaxKey()"},t}();Object.defineProperty(ct.prototype,"_bsontype",{value:"MaxKey"});var lt=function(){function t(){if(!(this instanceof t))return new t}return t.prototype.toExtendedJSON=function(){return{$minKey:1}},t.fromExtendedJSON=function(){return new 
t},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},t.prototype.inspect=function(){return"new MinKey()"},t}();Object.defineProperty(lt.prototype,"_bsontype",{value:"MinKey"});var dt=new RegExp("^[0-9a-fA-F]{24}$"),pt=null,mt=Symbol("id"),yt=function(){function t(e){if(!(this instanceof t))return new t(e);if(e instanceof t&&(this[mt]=e.id,this.__id=e.__id),"object"===n(e)&&e&&"id"in e&&("toHexString"in e&&"function"==typeof e.toHexString?this[mt]=g.from(e.toHexString(),"hex"):this[mt]="string"==typeof e.id?g.from(e.id):e.id),null!=e&&"number"!=typeof e||(this[mt]=t.generate("number"==typeof e?e:void 0),t.cacheHexString&&(this.__id=this.id.toString("hex"))),ArrayBuffer.isView(e)&&12===e.byteLength&&(this[mt]=D(e)),"string"==typeof e)if(12===e.length){var r=g.from(e);12===r.byteLength&&(this[mt]=r)}else{if(24!==e.length||!dt.test(e))throw new S("Argument passed in must be a Buffer or string of 12 bytes or a string of 24 hex characters");this[mt]=g.from(e,"hex")}t.cacheHexString&&(this.__id=this.id.toString("hex"))}return Object.defineProperty(t.prototype,"id",{get:function(){return this[mt]},set:function(e){this[mt]=e,t.cacheHexString&&(this.__id=e.toString("hex"))},enumerable:!1,configurable:!0}),Object.defineProperty(t.prototype,"generationTime",{get:function(){return this.id.readInt32BE(0)},set:function(t){this.id.writeUInt32BE(t,0)},enumerable:!1,configurable:!0}),t.prototype.toHexString=function(){if(t.cacheHexString&&this.__id)return this.__id;var e=this.id.toString("hex");return t.cacheHexString&&!this.__id&&(this.__id=e),e},t.getInc=function(){return t.index=(t.index+1)%16777215},t.generate=function(e){"number"!=typeof e&&(e=~~(Date.now()/1e3));var r=t.getInc(),n=g.alloc(12);return n.writeUInt32BE(e,0),null===pt&&(pt=k(5)),n[4]=pt[0],n[5]=pt[1],n[6]=pt[2],n[7]=pt[3],n[8]=pt[4],n[11]=255&r,n[10]=r>>8&255,n[9]=r>>16&255,n},t.prototype.toString=function(t){return t?this.id.toString(t):this.toHexString()},t.prototype.toJSON=function(){return this.toHexString()},t.prototype.equals=function(e){return null!=e&&(e instanceof t?this.toString()===e.toString():"string"==typeof e&&t.isValid(e)&&12===e.length&&$(this.id)?e===g.prototype.toString.call(this.id,"latin1"):"string"==typeof e&&t.isValid(e)&&24===e.length?e.toLowerCase()===this.toHexString():"string"==typeof e&&t.isValid(e)&&12===e.length?g.from(e).equals(this.id):"object"===n(e)&&"toHexString"in e&&"function"==typeof e.toHexString&&e.toHexString()===this.toHexString())},t.prototype.getTimestamp=function(){var t=new Date,e=this.id.readUInt32BE(0);return t.setTime(1e3*Math.floor(e)),t},t.createPk=function(){return new t},t.createFromTime=function(e){var r=g.from([0,0,0,0,0,0,0,0,0,0,0,0]);return r.writeUInt32BE(e,0),new t(r)},t.createFromHexString=function(e){if(void 0===e||null!=e&&24!==e.length)throw new S("Argument passed in must be a single String of 12 bytes or a string of 24 hex characters");return new t(g.from(e,"hex"))},t.isValid=function(e){return null!=e&&("number"==typeof e||("string"==typeof e?12===e.length||24===e.length&&dt.test(e):e instanceof t||(!(!$(e)||12!==e.length)||"object"===n(e)&&"toHexString"in e&&"function"==typeof e.toHexString&&("string"==typeof e.id?12===e.id.length:24===e.toHexString().length&&dt.test(e.id.toString("hex"))))))},t.prototype.toExtendedJSON=function(){return this.toHexString?{$oid:this.toHexString()}:{$oid:this.toString("hex")}},t.fromExtendedJSON=function(e){return new t(e.$oid)},t.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return 
this.inspect()},t.prototype.inspect=function(){return'new ObjectId("'+this.toHexString()+'")'},t.index=~~(16777215*Math.random()),t}();Object.defineProperty(yt.prototype,"generate",{value:N((function(t){return yt.generate(t)}),"Please use the static `ObjectId.generate(time)` instead")}),Object.defineProperty(yt.prototype,"getInc",{value:N((function(){return yt.getInc()}),"Please use the static `ObjectId.getInc()` instead")}),Object.defineProperty(yt.prototype,"get_inc",{value:N((function(){return yt.getInc()}),"Please use the static `ObjectId.getInc()` instead")}),Object.defineProperty(yt,"get_inc",{value:N((function(){return yt.getInc()}),"Please use the static `ObjectId.getInc()` instead")}),Object.defineProperty(yt.prototype,"_bsontype",{value:"ObjectID"});var vt=function(){function t(e,r){if(!(this instanceof t))return new t(e,r);if(this.pattern=e,this.options=(null!=r?r:"").split("").sort().join(""),-1!==this.pattern.indexOf("\0"))throw new M("BSON Regex patterns cannot contain null bytes, found: "+JSON.stringify(this.pattern));if(-1!==this.options.indexOf("\0"))throw new M("BSON Regex options cannot contain null bytes, found: "+JSON.stringify(this.options));for(var n=0;n>>0,i:this.low>>>0}}},e.fromExtendedJSON=function(t){return new e(t.$timestamp)},e.prototype[Symbol.for("nodejs.util.inspect.custom")]=function(){return this.inspect()},e.prototype.inspect=function(){return"new Timestamp({ t: "+this.getHighBits()+", i: "+this.getLowBits()+" })"},e.MAX_VALUE=Q.MAX_UNSIGNED_VALUE,e}(gt);function _t(t){return I(t)&&Reflect.has(t,"_bsontype")&&"string"==typeof t._bsontype}var Mt={$oid:yt,$binary:V,$uuid:V,$symbol:bt,$numberInt:ft,$numberDecimal:ut,$numberDouble:ht,$numberLong:Q,$minKey:lt,$maxKey:ct,$regex:vt,$regularExpression:vt,$timestamp:wt};function St(t){var e=t.toISOString();return 0!==t.getUTCMilliseconds()?e:e.slice(0,-5)+"Z"}function Ot(t,e){if(("object"===n(t)||"function"==typeof t)&&null!==t){var r=e.seenObjects.findIndex((function(e){return e.obj===t}));if(-1!==r){var i=e.seenObjects.map((function(t){return t.propertyName})),o=i.slice(0,r).map((function(t){return t+" -> "})).join(""),s=i[r],a=" -> "+i.slice(r+1,i.length-1).map((function(t){return t+" -> "})).join(""),u=i[i.length-1],h=" ".repeat(o.length+s.length/2),f="-".repeat(a.length+(s.length+u.length)/2-1);throw new S("Converting circular structure to EJSON:\n "+o+s+a+u+"\n "+h+"\\"+f+"/")}e.seenObjects[e.seenObjects.length-1].obj=t}if(Array.isArray(t))return function(t,e){return t.map((function(t,r){e.seenObjects.push({propertyName:"index "+r,obj:null});try{return Ot(t,e)}finally{e.seenObjects.pop()}}))}(t,e);if(void 0===t)return null;if(t instanceof Date||T(t)){var c=t.getTime(),l=c>-1&&c<2534023188e5;return e.legacy?e.relaxed&&l?{$date:t.getTime()}:{$date:St(t)}:e.relaxed&&l?{$date:St(t)}:{$date:{$numberLong:t.getTime().toString()}}}if(!("number"!=typeof t||e.relaxed&&isFinite(t))){if(Math.floor(t)===t){var d=t>=-0x8000000000000000&&t<=0x8000000000000000;if(t>=-2147483648&&t<=2147483647)return{$numberInt:t.toString()};if(d)return{$numberLong:t.toString()}}return{$numberDouble:t.toString()}}if(t instanceof RegExp||B(t)){var p=t.flags;if(void 0===p){var m=t.toString().match(/[gimuy]*$/);m&&(p=m[0])}return new vt(t.source,p).toExtendedJSON(e)}return null!=t&&"object"===n(t)?function(t,e){if(null==t||"object"!==n(t))throw new M("not an object instance");var r=t._bsontype;if(void 0===r){var i={};for(var o in t){e.seenObjects.push({propertyName:o,obj:null});try{i[o]=Ot(t[o],e)}finally{e.seenObjects.pop()}}return 
i}if(_t(t)){var s=t;if("function"!=typeof s.toExtendedJSON){var a=xt[t._bsontype];if(!a)throw new S("Unrecognized or invalid _bsontype: "+t._bsontype);s=a(s)}return"Code"===r&&s.scope?s=new K(s.code,Ot(s.scope,e)):"DBRef"===r&&s.oid&&(s=new Z(Ot(s.collection,e),Ot(s.oid,e),Ot(s.db,e),Ot(s.fields,e))),s.toExtendedJSON(e)}throw new M("_bsontype must be a string, but was: "+n(r))}(t,e):t}var At,Et,xt={Binary:function(t){return new V(t.value(),t.sub_type)},Code:function(t){return new K(t.code,t.scope)},DBRef:function(t){return new Z(t.collection||t.namespace,t.oid,t.db,t.fields)},Decimal128:function(t){return new ut(t.bytes)},Double:function(t){return new ht(t.value)},Int32:function(t){return new ft(t.value)},Long:function(t){return Q.fromBits(null!=t.low?t.low:t.low_,null!=t.low?t.high:t.high_,null!=t.low?t.unsigned:t.unsigned_)},MaxKey:function(){return new ct},MinKey:function(){return new lt},ObjectID:function(t){return new yt(t)},ObjectId:function(t){return new yt(t)},BSONRegExp:function(t){return new vt(t.pattern,t.options)},Symbol:function(t){return new bt(t.value)},Timestamp:function(t){return wt.fromBits(t.low,t.high)}};!function(t){function e(t,e){var r=Object.assign({},{relaxed:!0,legacy:!1},e);return"boolean"==typeof r.relaxed&&(r.strict=!r.relaxed),"boolean"==typeof r.strict&&(r.relaxed=!r.strict),JSON.parse(t,(function(t,e){if(-1!==t.indexOf("\0"))throw new M("BSON Document field names cannot contain null bytes, found: "+JSON.stringify(t));return function t(e,r){if(void 0===r&&(r={}),"number"==typeof e){if(r.relaxed||r.legacy)return e;if(Math.floor(e)===e){if(e>=-2147483648&&e<=2147483647)return new ft(e);if(e>=-0x8000000000000000&&e<=0x8000000000000000)return Q.fromNumber(e)}return new ht(e)}if(null==e||"object"!==n(e))return e;if(e.$undefined)return null;for(var i=Object.keys(e).filter((function(t){return t.startsWith("$")&&null!=e[t]})),o=0;o=Tt&&e<=Bt&&e>=$t&&e<=jt?(null!=t?g.byteLength(t,"utf8")+1:0)+5:(null!=t?g.byteLength(t,"utf8")+1:0)+9;case"undefined":return i||!o?(null!=t?g.byteLength(t,"utf8")+1:0)+1:0;case"boolean":return(null!=t?g.byteLength(t,"utf8")+1:0)+2;case"object":if(null==e||"MinKey"===e._bsontype||"MaxKey"===e._bsontype)return(null!=t?g.byteLength(t,"utf8")+1:0)+1;if("ObjectId"===e._bsontype||"ObjectID"===e._bsontype)return(null!=t?g.byteLength(t,"utf8")+1:0)+13;if(e instanceof Date||T(e))return(null!=t?g.byteLength(t,"utf8")+1:0)+9;if(ArrayBuffer.isView(e)||e instanceof ArrayBuffer||j(e))return(null!=t?g.byteLength(t,"utf8")+1:0)+6+e.byteLength;if("Long"===e._bsontype||"Double"===e._bsontype||"Timestamp"===e._bsontype)return(null!=t?g.byteLength(t,"utf8")+1:0)+9;if("Decimal128"===e._bsontype)return(null!=t?g.byteLength(t,"utf8")+1:0)+17;if("Code"===e._bsontype)return null!=e.scope&&Object.keys(e.scope).length>0?(null!=t?g.byteLength(t,"utf8")+1:0)+1+4+4+g.byteLength(e.code.toString(),"utf8")+1+ce(e.scope,r,o):(null!=t?g.byteLength(t,"utf8")+1:0)+1+4+g.byteLength(e.code.toString(),"utf8")+1;if("Binary"===e._bsontype)return e.sub_type===V.SUBTYPE_BYTE_ARRAY?(null!=t?g.byteLength(t,"utf8")+1:0)+(e.position+1+4+1+4):(null!=t?g.byteLength(t,"utf8")+1:0)+(e.position+1+4+1);if("Symbol"===e._bsontype)return(null!=t?g.byteLength(t,"utf8")+1:0)+g.byteLength(e.value,"utf8")+4+1+1;if("DBRef"===e._bsontype){var s=Object.assign({$ref:e.collection,$id:e.oid},e.fields);return null!=e.db&&(s.$db=e.db),(null!=t?g.byteLength(t,"utf8")+1:0)+1+ce(s,r,o)}return e instanceof 
RegExp||B(e)?(null!=t?g.byteLength(t,"utf8")+1:0)+1+g.byteLength(e.source,"utf8")+1+(e.global?1:0)+(e.ignoreCase?1:0)+(e.multiline?1:0)+1:"BSONRegExp"===e._bsontype?(null!=t?g.byteLength(t,"utf8")+1:0)+1+g.byteLength(e.pattern,"utf8")+1+g.byteLength(e.options,"utf8")+1:(null!=t?g.byteLength(t,"utf8")+1:0)+ce(e,r,o)+1;case"function":if(e instanceof RegExp||B(e)||"[object RegExp]"===String.call(e))return(null!=t?g.byteLength(t,"utf8")+1:0)+1+g.byteLength(e.source,"utf8")+1+(e.global?1:0)+(e.ignoreCase?1:0)+(e.multiline?1:0)+1;if(r&&null!=e.scope&&Object.keys(e.scope).length>0)return(null!=t?g.byteLength(t,"utf8")+1:0)+1+4+4+g.byteLength(E(e),"utf8")+1+ce(e.scope,r,o);if(r)return(null!=t?g.byteLength(t,"utf8")+1:0)+1+4+g.byteLength(E(e),"utf8")+1}return 0}function de(t,e,r){for(var n=0,i=e;i= 5, is "+i);if(e.allowObjectSmallerThanBufferSize&&t.length= bson size "+i);if(!e.allowObjectSmallerThanBufferSize&&t.length!==i)throw new M("buffer length "+t.length+" must === bson size "+i);if(i+n>t.byteLength)throw new M("(bson size "+i+" + options.index "+n+" must be <= buffer length "+t.byteLength+")");if(0!==t[n+i-1])throw new M("One object, sized correctly, with a spot for an EOO, but the EOO isn't 0x00");return function t(e,r,n,i){void 0===i&&(i=!1);var o=null!=n.evalFunctions&&n.evalFunctions,s=null!=n.cacheFunctions&&n.cacheFunctions,a=null==n.fieldsAsRaw?null:n.fieldsAsRaw,u=null!=n.raw&&n.raw,h="boolean"==typeof n.bsonRegExp&&n.bsonRegExp,f=null!=n.promoteBuffers&&n.promoteBuffers,c=null==n.promoteLongs||n.promoteLongs,l=null==n.promoteValues||n.promoteValues,d=r;if(e.length<5)throw new M("corrupt bson message < 5 bytes long");var p=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24;if(p<5||p>e.length)throw new M("corrupt bson message");var m=i?[]:{},y=0,v=!i&&null;for(;;){var b=e[r++];if(0===b)break;for(var w=r;0!==e[w]&&w=e.byteLength)throw new M("Bad BSON Document: illegal CString");var _=i?y++:e.toString("utf8",r,w);!1!==v&&"$"===_[0]&&(v=be.test(_));var S=void 0;if(r=w+1,b===Nt){if((Y=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24)<=0||Y>e.length-r||0!==e[r+Y-1])throw new M("bad string length in bson");S=we(e,r,r+Y-1),r+=Y}else if(b===Ut){var O=g.alloc(12);e.copy(O,0,r,r+12),S=new yt(O),r+=12}else if(b===Yt&&!1===l)S=new ft(e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24);else if(b===Yt)S=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24;else if(b===It&&!1===l)S=new ht(e.readDoubleLE(r)),r+=8;else if(b===It)S=e.readDoubleLE(r),r+=8;else if(b===zt){var A=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24,E=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24;S=new Date(new Q(A,E).toNumber())}else if(b===Ft){if(0!==e[r]&&1!==e[r])throw new M("illegal boolean type value");S=1===e[r++]}else if(b===Dt){var x=r;if((k=e[r]|e[r+1]<<8|e[r+2]<<16|e[r+3]<<24)<=0||k>e.length-r)throw new M("bad embedded document length in bson");S=u?e.slice(r,r+k):t(e,x,n,!1),r+=k}else if(b===Ct){x=r;var k=e[r]|e[r+1]<<8|e[r+2]<<16|e[r+3]<<24,j=n,$=r+k;if(a&&a[_]){for(var P in j={},n)j[P]=n[P];j.raw=!0}if(S=t(e,x,j,!0),0!==e[(r+=k)-1])throw new M("invalid array terminator byte");if(r!==$)throw new M("corrupted array bson")}else if(b===qt)S=void 0;else if(b===Vt)S=null;else if(b===Gt){A=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24,E=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24;var R=new Q(A,E);S=c&&!0===l&&R.lessThanOrEqual(pe)&&R.greaterThanOrEqual(me)?R.toNumber():R}else if(b===Xt){var B=g.alloc(16);e.copy(B,0,r,r+16),r+=16;var T=new ut(B);S="toObject"in T&&"function"==typeof T.toObject?T.toObject():T}else if(b===Lt){var I=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24,N=I,D=e[r++];if(I<0)throw new 
M("Negative binary type element size found");if(I>e.byteLength)throw new M("Binary type size larger than document size");if(null!=e.slice){if(D===V.SUBTYPE_BYTE_ARRAY){if((I=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24)<0)throw new M("Negative binary type element size found for subtype 0x02");if(I>N-4)throw new M("Binary type with subtype 0x02 contains too long binary size");if(IN-4)throw new M("Binary type with subtype 0x02 contains too long binary size");if(I=e.length)throw new M("Bad BSON Document: illegal CString");var L=e.toString("utf8",r,w);for(w=r=w+1;0!==e[w]&&w=e.length)throw new M("Bad BSON Document: illegal CString");var q=e.toString("utf8",r,w);r=w+1;var U=new Array(q.length);for(w=0;w=e.length)throw new M("Bad BSON Document: illegal CString");L=e.toString("utf8",r,w);for(w=r=w+1;0!==e[w]&&w=e.length)throw new M("Bad BSON Document: illegal CString");q=e.toString("utf8",r,w);r=w+1,S=new vt(L,q)}else if(b===Wt){if((Y=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24)<=0||Y>e.length-r||0!==e[r+Y-1])throw new M("bad string length in bson");var F=we(e,r,r+Y-1);S=l?F:new bt(F),r+=Y}else if(b===Qt){A=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24,E=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24;S=new wt(A,E)}else if(b===te)S=new lt;else if(b===ee)S=new ct;else if(b===Zt){if((Y=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24)<=0||Y>e.length-r||0!==e[r+Y-1])throw new M("bad string length in bson");var z=we(e,r,r+Y-1);S=o?s?ge(z,ye,m):ge(z):new K(z),r+=Y}else if(b===Jt){var W=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24;if(W<13)throw new M("code_w_scope total size shorter minimum expected length");if((Y=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24)<=0||Y>e.length-r||0!==e[r+Y-1])throw new M("bad string length in bson");z=we(e,r,r+Y-1),x=r+=Y,k=e[r]|e[r+1]<<8|e[r+2]<<16|e[r+3]<<24;var J=t(e,x,n,!1);if(r+=k,W<8+k+Y)throw new M("code_w_scope total size is too short, truncating scope");if(W>8+k+Y)throw new M("code_w_scope total size is too long, clips outer document");o?(S=s?ge(z,ye,m):ge(z)).scope=J:S=new K(z,J)}else{if(b!==Ht)throw new M("Detected unknown BSON type "+b.toString(16)+' for fieldname "'+_+'"');var Y;if((Y=e[r++]|e[r++]<<8|e[r++]<<16|e[r++]<<24)<=0||Y>e.length-r||0!==e[r+Y-1])throw new M("bad string length in bson");if(!de(e,r,r+Y-1))throw new M("Invalid UTF-8 string in BSON document");var G=e.toString("utf8",r,r+Y-1);r+=Y;var X=g.alloc(12);e.copy(X,0,r,r+12);O=new yt(X);r+=12,S=new Z(G,O)}"__proto__"===_?Object.defineProperty(m,_,{value:S,writable:!0,enumerable:!0,configurable:!0}):m[_]=S}if(p!==r-d){if(i)throw new M("corrupt array bson");throw new M("corrupt object bson")}if(!v)return m;if(H(m)){var tt=Object.assign({},m);return delete tt.$ref,delete tt.$id,delete tt.$db,new Z(m.$ref,m.$id,m.$db,tt)}return m}(t,n,e,r)}var be=/^\$ref$|^\$id$|^\$db$/;function ge(t,e,r){return e?(null==e[t]&&(e[t]=new Function(t)),e[t].bind(r)):new Function(t)}function we(t,e,r){for(var n=t.toString("utf8",e,r),i=0;i>1,d=23===i?Math.pow(2,-24)-Math.pow(2,-77):0,p=h?o-1:0,m=h?-1:1,y=e<0||0===e&&1/e<0?1:0;for(e=Math.abs(e),isNaN(e)||e===1/0?(a=isNaN(e)?1:0,s=c):(s=Math.floor(Math.log(e)/Math.LN2),e*(u=Math.pow(2,-s))<1&&(s--,u*=2),(e+=s+l>=1?d/u:d*Math.pow(2,1-l))*u>=2&&(s++,u/=2),s+l>=c?(a=0,s=c):s+l>=1?(a=(e*u-1)*Math.pow(2,i),s+=l):(a=e*Math.pow(2,l-1)*Math.pow(2,i),s=0)),isNaN(e)&&(a=0);i>=8;)t[r+p]=255&a,p+=m,a/=256,i-=8;for(s=s<0;)t[r+p]=255&s,p+=m,s/=256,f-=8;t[r+p-m]|=128*y}var Me=/\x00/,Se=new Set(["$db","$ref","$id","$clusterTime"]);function Oe(t,e,r,n,i){t[n++]=Nt;var o=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 
0,"utf8");t[(n=n+o+1)-1]=0;var s=t.write(r,n+4,void 0,"utf8");return t[n+3]=s+1>>24&255,t[n+2]=s+1>>16&255,t[n+1]=s+1>>8&255,t[n]=s+1&255,n=n+4+s,t[n++]=0,n}function Ae(t,e,r,n,i){Number.isInteger(r)&&r>=$t&&r<=jt?(t[n++]=Yt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,t[n++]=255&r,t[n++]=r>>8&255,t[n++]=r>>16&255,t[n++]=r>>24&255):(t[n++]=It,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,_e(t,r,n,"little",52,8),n+=8);return n}function Ee(t,e,r,n,i){return t[n++]=Vt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,n}function xe(t,e,r,n,i){return t[n++]=Ft,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,t[n++]=r?1:0,n}function ke(t,e,r,n,i){t[n++]=zt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0;var o=Q.fromNumber(r.getTime()),s=o.getLowBits(),a=o.getHighBits();return t[n++]=255&s,t[n++]=s>>8&255,t[n++]=s>>16&255,t[n++]=s>>24&255,t[n++]=255&a,t[n++]=a>>8&255,t[n++]=a>>16&255,t[n++]=a>>24&255,n}function je(t,e,r,n,i){if(t[n++]=Kt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,r.source&&null!=r.source.match(Me))throw Error("value "+r.source+" must not contain null bytes");return n+=t.write(r.source,n,void 0,"utf8"),t[n++]=0,r.ignoreCase&&(t[n++]=105),r.global&&(t[n++]=115),r.multiline&&(t[n++]=109),t[n++]=0,n}function $e(t,e,r,n,i){if(t[n++]=Kt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,null!=r.pattern.match(Me))throw Error("pattern "+r.pattern+" must not contain null bytes");return n+=t.write(r.pattern,n,void 0,"utf8"),t[n++]=0,n+=t.write(r.options.split("").sort().join(""),n,void 0,"utf8"),t[n++]=0,n}function Pe(t,e,r,n,i){return null===r?t[n++]=Vt:"MinKey"===r._bsontype?t[n++]=te:t[n++]=ee,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,n}function Re(t,e,r,n,i){if(t[n++]=Ut,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,"string"==typeof r.id)t.write(r.id,n,void 0,"binary");else{if(!$(r.id))throw new S("object ["+JSON.stringify(r)+"] is not a valid ObjectId");t.set(r.id.subarray(0,12),n)}return n+12}function Be(t,e,r,n,i){t[n++]=Lt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0;var o=r.length;return t[n++]=255&o,t[n++]=o>>8&255,t[n++]=o>>16&255,t[n++]=o>>24&255,t[n++]=re,t.set(D(r),n),n+=o}function Te(t,e,r,n,i,o,s,a,u,h){void 0===i&&(i=!1),void 0===o&&(o=0),void 0===s&&(s=!1),void 0===a&&(a=!0),void 0===u&&(u=!1),void 0===h&&(h=[]);for(var f=0;f>8&255,t[n++]=o>>16&255,t[n++]=o>>24&255,t[n++]=255&s,t[n++]=s>>8&255,t[n++]=s>>16&255,t[n++]=s>>24&255,n}function De(t,e,r,n,i){return r=r.valueOf(),t[n++]=Yt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,t[n++]=255&r,t[n++]=r>>8&255,t[n++]=r>>16&255,t[n++]=r>>24&255,n}function Ce(t,e,r,n,i){return t[n++]=It,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0,_e(t,r.value,n,"little",52,8),n+=8}function Le(t,e,r,n,i,o,s){t[n++]=Zt,n+=s?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0;var a=E(r),u=t.write(a,n+4,void 0,"utf8")+1;return t[n]=255&u,t[n+1]=u>>8&255,t[n+2]=u>>16&255,t[n+3]=u>>24&255,n=n+4+u-1,t[n++]=0,n}function qe(t,e,r,i,o,s,a,u,h){if(void 0===o&&(o=!1),void 0===s&&(s=0),void 0===a&&(a=!1),void 0===u&&(u=!0),void 0===h&&(h=!1),r.scope&&"object"===n(r.scope)){t[i++]=Jt,i+=h?t.write(e,i,void 0,"ascii"):t.write(e,i,void 0,"utf8"),t[i++]=0;var f=i,c="string"==typeof r.code?r.code:r.code.toString();i+=4;var l=t.write(c,i+4,void 
0,"utf8")+1;t[i]=255&l,t[i+1]=l>>8&255,t[i+2]=l>>16&255,t[i+3]=l>>24&255,t[i+4+l-1]=0,i=i+l+4;var d=Ve(t,r.scope,o,i,s+1,a,u);i=d-1;var p=d-f;t[f++]=255&p,t[f++]=p>>8&255,t[f++]=p>>16&255,t[f++]=p>>24&255,t[i++]=0}else{t[i++]=Zt,i+=h?t.write(e,i,void 0,"ascii"):t.write(e,i,void 0,"utf8"),t[i++]=0;c=r.code.toString();var m=t.write(c,i+4,void 0,"utf8")+1;t[i]=255&m,t[i+1]=m>>8&255,t[i+2]=m>>16&255,t[i+3]=m>>24&255,i=i+4+m-1,t[i++]=0}return i}function Ue(t,e,r,n,i){t[n++]=Lt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0;var o=r.value(!0),s=r.position;return r.sub_type===V.SUBTYPE_BYTE_ARRAY&&(s+=4),t[n++]=255&s,t[n++]=s>>8&255,t[n++]=s>>16&255,t[n++]=s>>24&255,t[n++]=r.sub_type,r.sub_type===V.SUBTYPE_BYTE_ARRAY&&(s-=4,t[n++]=255&s,t[n++]=s>>8&255,t[n++]=s>>16&255,t[n++]=s>>24&255),t.set(o,n),n+=r.position}function Fe(t,e,r,n,i){t[n++]=Wt,n+=i?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0;var o=t.write(r.value,n+4,void 0,"utf8")+1;return t[n]=255&o,t[n+1]=o>>8&255,t[n+2]=o>>16&255,t[n+3]=o>>24&255,n=n+4+o-1,t[n++]=0,n}function ze(t,e,r,n,i,o,s){t[n++]=Dt,n+=s?t.write(e,n,void 0,"ascii"):t.write(e,n,void 0,"utf8"),t[n++]=0;var a=n,u={$ref:r.collection||r.namespace,$id:r.oid};null!=r.db&&(u.$db=r.db);var h=Ve(t,u=Object.assign(u,r.fields),!1,n,i+1,o),f=h-a;return t[a++]=255&f,t[a++]=f>>8&255,t[a++]=f>>16&255,t[a++]=f>>24&255,h}function Ve(t,e,r,i,o,s,a,u){void 0===r&&(r=!1),void 0===i&&(i=0),void 0===o&&(o=0),void 0===s&&(s=!1),void 0===a&&(a=!0),void 0===u&&(u=[]),i=i||0,(u=u||[]).push(e);var h,f=i+4;if(Array.isArray(e))for(var c=0;c>8&255,t[i++]=b>>16&255,t[i++]=b>>24&255,f}var Ke=g.alloc(17825792);function He(t){Ke.length>>32-e}function h(t,e,r,n,i,o,s){return u(t+(e&r|~e&n)+i+o|0,s)+e|0}function f(t,e,r,n,i,o,s){return u(t+(e&n|r&~n)+i+o|0,s)+e|0}function c(t,e,r,n,i,o,s){return u(t+(e^r^n)+i+o|0,s)+e|0}function l(t,e,r,n,i,o,s){return u(t+(r^(e|~n))+i+o|0,s)+e|0}n(a,i),a.prototype._update=function(){for(var t=s,e=0;e<16;++e)t[e]=this._block.readInt32LE(4*e);var 
r=this._a,n=this._b,i=this._c,o=this._d;r=h(r,n,i,o,t[0],3614090360,7),o=h(o,r,n,i,t[1],3905402710,12),i=h(i,o,r,n,t[2],606105819,17),n=h(n,i,o,r,t[3],3250441966,22),r=h(r,n,i,o,t[4],4118548399,7),o=h(o,r,n,i,t[5],1200080426,12),i=h(i,o,r,n,t[6],2821735955,17),n=h(n,i,o,r,t[7],4249261313,22),r=h(r,n,i,o,t[8],1770035416,7),o=h(o,r,n,i,t[9],2336552879,12),i=h(i,o,r,n,t[10],4294925233,17),n=h(n,i,o,r,t[11],2304563134,22),r=h(r,n,i,o,t[12],1804603682,7),o=h(o,r,n,i,t[13],4254626195,12),i=h(i,o,r,n,t[14],2792965006,17),r=f(r,n=h(n,i,o,r,t[15],1236535329,22),i,o,t[1],4129170786,5),o=f(o,r,n,i,t[6],3225465664,9),i=f(i,o,r,n,t[11],643717713,14),n=f(n,i,o,r,t[0],3921069994,20),r=f(r,n,i,o,t[5],3593408605,5),o=f(o,r,n,i,t[10],38016083,9),i=f(i,o,r,n,t[15],3634488961,14),n=f(n,i,o,r,t[4],3889429448,20),r=f(r,n,i,o,t[9],568446438,5),o=f(o,r,n,i,t[14],3275163606,9),i=f(i,o,r,n,t[3],4107603335,14),n=f(n,i,o,r,t[8],1163531501,20),r=f(r,n,i,o,t[13],2850285829,5),o=f(o,r,n,i,t[2],4243563512,9),i=f(i,o,r,n,t[7],1735328473,14),r=c(r,n=f(n,i,o,r,t[12],2368359562,20),i,o,t[5],4294588738,4),o=c(o,r,n,i,t[8],2272392833,11),i=c(i,o,r,n,t[11],1839030562,16),n=c(n,i,o,r,t[14],4259657740,23),r=c(r,n,i,o,t[1],2763975236,4),o=c(o,r,n,i,t[4],1272893353,11),i=c(i,o,r,n,t[7],4139469664,16),n=c(n,i,o,r,t[10],3200236656,23),r=c(r,n,i,o,t[13],681279174,4),o=c(o,r,n,i,t[0],3936430074,11),i=c(i,o,r,n,t[3],3572445317,16),n=c(n,i,o,r,t[6],76029189,23),r=c(r,n,i,o,t[9],3654602809,4),o=c(o,r,n,i,t[12],3873151461,11),i=c(i,o,r,n,t[15],530742520,16),r=l(r,n=c(n,i,o,r,t[2],3299628645,23),i,o,t[0],4096336452,6),o=l(o,r,n,i,t[7],1126891415,10),i=l(i,o,r,n,t[14],2878612391,15),n=l(n,i,o,r,t[5],4237533241,21),r=l(r,n,i,o,t[12],1700485571,6),o=l(o,r,n,i,t[3],2399980690,10),i=l(i,o,r,n,t[10],4293915773,15),n=l(n,i,o,r,t[1],2240044497,21),r=l(r,n,i,o,t[8],1873313359,6),o=l(o,r,n,i,t[15],4264355552,10),i=l(i,o,r,n,t[6],2734768916,15),n=l(n,i,o,r,t[13],1309151649,21),r=l(r,n,i,o,t[4],4149444226,6),o=l(o,r,n,i,t[11],3174756917,10),i=l(i,o,r,n,t[2],718787259,15),n=l(n,i,o,r,t[9],3951481745,21),this._a=this._a+r|0,this._b=this._b+n|0,this._c=this._c+i|0,this._d=this._d+o|0},a.prototype._digest=function(){this._block[this._blockOffset++]=128,this._blockOffset>56&&(this._block.fill(0,this._blockOffset,64),this._update(),this._blockOffset=0),this._block.fill(0,this._blockOffset,56),this._block.writeUInt32LE(this._length[0],56),this._block.writeUInt32LE(this._length[1],60),this._update();var t=o.allocUnsafe(16);return t.writeInt32LE(this._a,0),t.writeInt32LE(this._b,4),t.writeInt32LE(this._c,8),t.writeInt32LE(this._d,12),t},t.exports=a},function(t,e,r){"use strict";var n=r(28).codes.ERR_STREAM_PREMATURE_CLOSE;function i(){}t.exports=function t(e,r,o){if("function"==typeof r)return t(e,null,r);r||(r={}),o=function(t){var e=!1;return function(){if(!e){e=!0;for(var r=arguments.length,n=new Array(r),i=0;i>>32-e}function m(t,e,r,n,i,o,s,a){return p(t+(e^r^n)+o+s|0,a)+i|0}function y(t,e,r,n,i,o,s,a){return p(t+(e&r|~e&n)+o+s|0,a)+i|0}function v(t,e,r,n,i,o,s,a){return p(t+((e|~r)^n)+o+s|0,a)+i|0}function b(t,e,r,n,i,o,s,a){return p(t+(e&n|r&~n)+o+s|0,a)+i|0}function g(t,e,r,n,i,o,s,a){return p(t+(e^(r|~n))+o+s|0,a)+i|0}i(d,o),d.prototype._update=function(){for(var t=s,e=0;e<16;++e)t[e]=this._block.readInt32LE(4*e);for(var r=0|this._a,n=0|this._b,i=0|this._c,o=0|this._d,d=0|this._e,w=0|this._a,_=0|this._b,M=0|this._c,S=0|this._d,O=0|this._e,A=0;A<80;A+=1){var 
E,x;A<16?(E=m(r,n,i,o,d,t[a[A]],c[0],h[A]),x=g(w,_,M,S,O,t[u[A]],l[0],f[A])):A<32?(E=y(r,n,i,o,d,t[a[A]],c[1],h[A]),x=b(w,_,M,S,O,t[u[A]],l[1],f[A])):A<48?(E=v(r,n,i,o,d,t[a[A]],c[2],h[A]),x=v(w,_,M,S,O,t[u[A]],l[2],f[A])):A<64?(E=b(r,n,i,o,d,t[a[A]],c[3],h[A]),x=y(w,_,M,S,O,t[u[A]],l[3],f[A])):(E=g(r,n,i,o,d,t[a[A]],c[4],h[A]),x=m(w,_,M,S,O,t[u[A]],l[4],f[A])),r=d,d=o,o=p(i,10),i=n,n=E,w=O,O=S,S=p(M,10),M=_,_=x}var k=this._b+i+S|0;this._b=this._c+o+O|0,this._c=this._d+d+w|0,this._d=this._e+r+_|0,this._e=this._a+n+M|0,this._a=k},d.prototype._digest=function(){this._block[this._blockOffset++]=128,this._blockOffset>56&&(this._block.fill(0,this._blockOffset,64),this._update(),this._blockOffset=0),this._block.fill(0,this._blockOffset,56),this._block.writeUInt32LE(this._length[0],56),this._block.writeUInt32LE(this._length[1],60),this._update();var t=n.alloc?n.alloc(20):new n(20);return t.writeInt32LE(this._a,0),t.writeInt32LE(this._b,4),t.writeInt32LE(this._c,8),t.writeInt32LE(this._d,12),t.writeInt32LE(this._e,16),t},t.exports=d},function(t,e,r){(e=t.exports=function(t){t=t.toLowerCase();var r=e[t];if(!r)throw new Error(t+" is not supported (we accept pull requests)");return new r}).sha=r(192),e.sha1=r(193),e.sha224=r(194),e.sha256=r(103),e.sha384=r(195),e.sha512=r(104)},function(t,e,r){(e=t.exports=r(105)).Stream=e,e.Readable=e,e.Writable=r(65),e.Duplex=r(25),e.Transform=r(109),e.PassThrough=r(201)},function(t,e,r){"use strict";(function(e,n,i){var o=r(46);function s(t){var e=this;this.next=null,this.entry=null,this.finish=function(){!function(t,e,r){var n=t.entry;t.entry=null;for(;n;){var i=n.callback;e.pendingcb--,i(r),n=n.next}e.corkedRequestsFree?e.corkedRequestsFree.next=t:e.corkedRequestsFree=t}(e,t)}}t.exports=b;var a,u=!e.browser&&["v0.10","v0.9."].indexOf(e.version.slice(0,5))>-1?n:o.nextTick;b.WritableState=v;var h=Object.create(r(38));h.inherits=r(0);var f={deprecate:r(101)},c=r(106),l=r(2).Buffer,d=i.Uint8Array||function(){};var p,m=r(107);function y(){}function v(t,e){a=a||r(25),t=t||{};var n=e instanceof a;this.objectMode=!!t.objectMode,n&&(this.objectMode=this.objectMode||!!t.writableObjectMode);var i=t.highWaterMark,h=t.writableHighWaterMark,f=this.objectMode?16:16384;this.highWaterMark=i||0===i?i:n&&(h||0===h)?h:f,this.highWaterMark=Math.floor(this.highWaterMark),this.finalCalled=!1,this.needDrain=!1,this.ending=!1,this.ended=!1,this.finished=!1,this.destroyed=!1;var c=!1===t.decodeStrings;this.decodeStrings=!c,this.defaultEncoding=t.defaultEncoding||"utf8",this.length=0,this.writing=!1,this.corked=0,this.sync=!0,this.bufferProcessing=!1,this.onwrite=function(t){!function(t,e){var r=t._writableState,n=r.sync,i=r.writecb;if(function(t){t.writing=!1,t.writecb=null,t.length-=t.writelen,t.writelen=0}(r),e)!function(t,e,r,n,i){--e.pendingcb,r?(o.nextTick(i,n),o.nextTick(O,t,e),t._writableState.errorEmitted=!0,t.emit("error",n)):(i(n),t._writableState.errorEmitted=!0,t.emit("error",n),O(t,e))}(t,r,n,e,i);else{var s=M(r);s||r.corked||r.bufferProcessing||!r.bufferedRequest||_(t,r),n?u(w,t,r,s,i):w(t,r,s,i)}}(e,t)},this.writecb=null,this.writelen=0,this.bufferedRequest=null,this.lastBufferedRequest=null,this.pendingcb=0,this.prefinished=!1,this.errorEmitted=!1,this.bufferedRequestCount=0,this.corkedRequestsFree=new s(this)}function b(t){if(a=a||r(25),!(p.call(b,this)||this instanceof a))return new b(t);this._writableState=new v(t,this),this.writable=!0,t&&("function"==typeof t.write&&(this._write=t.write),"function"==typeof t.writev&&(this._writev=t.writev),"function"==typeof 
t.destroy&&(this._destroy=t.destroy),"function"==typeof t.final&&(this._final=t.final)),c.call(this)}function g(t,e,r,n,i,o,s){e.writelen=n,e.writecb=s,e.writing=!0,e.sync=!0,r?t._writev(i,e.onwrite):t._write(i,o,e.onwrite),e.sync=!1}function w(t,e,r,n){r||function(t,e){0===e.length&&e.needDrain&&(e.needDrain=!1,t.emit("drain"))}(t,e),e.pendingcb--,n(),O(t,e)}function _(t,e){e.bufferProcessing=!0;var r=e.bufferedRequest;if(t._writev&&r&&r.next){var n=e.bufferedRequestCount,i=new Array(n),o=e.corkedRequestsFree;o.entry=r;for(var a=0,u=!0;r;)i[a]=r,r.isBuf||(u=!1),r=r.next,a+=1;i.allBuffers=u,g(t,e,!0,e.length,i,"",o.finish),e.pendingcb++,e.lastBufferedRequest=null,o.next?(e.corkedRequestsFree=o.next,o.next=null):e.corkedRequestsFree=new s(e),e.bufferedRequestCount=0}else{for(;r;){var h=r.chunk,f=r.encoding,c=r.callback;if(g(t,e,!1,e.objectMode?1:h.length,h,f,c),r=r.next,e.bufferedRequestCount--,e.writing)break}null===r&&(e.lastBufferedRequest=null)}e.bufferedRequest=r,e.bufferProcessing=!1}function M(t){return t.ending&&0===t.length&&null===t.bufferedRequest&&!t.finished&&!t.writing}function S(t,e){t._final((function(r){e.pendingcb--,r&&t.emit("error",r),e.prefinished=!0,t.emit("prefinish"),O(t,e)}))}function O(t,e){var r=M(e);return r&&(!function(t,e){e.prefinished||e.finalCalled||("function"==typeof t._final?(e.pendingcb++,e.finalCalled=!0,o.nextTick(S,t,e)):(e.prefinished=!0,t.emit("prefinish")))}(t,e),0===e.pendingcb&&(e.finished=!0,t.emit("finish"))),r}h.inherits(b,c),v.prototype.getBuffer=function(){for(var t=this.bufferedRequest,e=[];t;)e.push(t),t=t.next;return e},function(){try{Object.defineProperty(v.prototype,"buffer",{get:f.deprecate((function(){return this.getBuffer()}),"_writableState.buffer is deprecated. Use _writableState.getBuffer instead.","DEP0003")})}catch(t){}}(),"function"==typeof Symbol&&Symbol.hasInstance&&"function"==typeof Function.prototype[Symbol.hasInstance]?(p=Function.prototype[Symbol.hasInstance],Object.defineProperty(b,Symbol.hasInstance,{value:function(t){return!!p.call(this,t)||this===b&&(t&&t._writableState instanceof v)}})):p=function(t){return t instanceof this},b.prototype.pipe=function(){this.emit("error",new Error("Cannot pipe, not readable"))},b.prototype.write=function(t,e,r){var n,i=this._writableState,s=!1,a=!i.objectMode&&(n=t,l.isBuffer(n)||n instanceof d);return a&&!l.isBuffer(t)&&(t=function(t){return l.from(t)}(t)),"function"==typeof e&&(r=e,e=null),a?e="buffer":e||(e=i.defaultEncoding),"function"!=typeof r&&(r=y),i.ended?function(t,e){var r=new Error("write after end");t.emit("error",r),o.nextTick(e,r)}(this,r):(a||function(t,e,r,n){var i=!0,s=!1;return null===r?s=new TypeError("May not write null values to stream"):"string"==typeof r||void 0===r||e.objectMode||(s=new TypeError("Invalid non-string/buffer chunk")),s&&(t.emit("error",s),o.nextTick(n,s),i=!1),i}(this,i,t,r))&&(i.pendingcb++,s=function(t,e,r,n,i,o){if(!r){var s=function(t,e,r){t.objectMode||!1===t.decodeStrings||"string"!=typeof e||(e=l.from(e,r));return e}(e,n,i);n!==s&&(r=!0,i="buffer",n=s)}var a=e.objectMode?1:n.length;e.length+=a;var u=e.length-1))throw new TypeError("Unknown encoding: "+t);return this._writableState.defaultEncoding=t,this},Object.defineProperty(b.prototype,"writableHighWaterMark",{enumerable:!1,get:function(){return this._writableState.highWaterMark}}),b.prototype._write=function(t,e,r){r(new Error("_write() is not implemented"))},b.prototype._writev=null,b.prototype.end=function(t,e,r){var n=this._writableState;"function"==typeof 
t?(r=t,t=null,e=null):"function"==typeof e&&(r=e,e=null),null!=t&&this.write(t,e),n.corked&&(n.corked=1,this.uncork()),n.ending||n.finished||function(t,e,r){e.ending=!0,O(t,e),r&&(e.finished?o.nextTick(r):t.once("finish",r));e.ended=!0,t.writable=!1}(this,n,r)},Object.defineProperty(b.prototype,"destroyed",{get:function(){return void 0!==this._writableState&&this._writableState.destroyed},set:function(t){this._writableState&&(this._writableState.destroyed=t)}}),b.prototype.destroy=m.destroy,b.prototype._undestroy=m.undestroy,b.prototype._destroy=function(t,e){this.end(),e(t)}}).call(this,r(5),r(108).setImmediate,r(7))},function(t,e,r){"use strict";var n=r(11);function i(t){this.options=t,this.type=this.options.type,this.blockSize=8,this._init(),this.buffer=new Array(this.blockSize),this.bufferOff=0}t.exports=i,i.prototype._init=function(){},i.prototype.update=function(t){return 0===t.length?[]:"decrypt"===this.type?this._updateDecrypt(t):this._updateEncrypt(t)},i.prototype._buffer=function(t,e){for(var r=Math.min(this.buffer.length-this.bufferOff,t.length-e),n=0;n0;n--)e+=this._buffer(t,e),r+=this._flushBuffer(i,r);return e+=this._buffer(t,e),i},i.prototype.final=function(t){var e,r;return t&&(e=this.update(t)),r="encrypt"===this.type?this._finalEncrypt():this._finalDecrypt(),e?e.concat(r):r},i.prototype._pad=function(t,e){if(0===e)return!1;for(;e */ +var n=r(3),i=n.Buffer;function o(t,e){for(var r in t)e[r]=t[r]}function s(t,e,r){return i(t,e,r)}i.from&&i.alloc&&i.allocUnsafe&&i.allocUnsafeSlow?t.exports=n:(o(n,e),e.Buffer=s),s.prototype=Object.create(i.prototype),o(i,s),s.from=function(t,e,r){if("number"==typeof t)throw new TypeError("Argument must not be a number");return i(t,e,r)},s.alloc=function(t,e,r){if("number"!=typeof t)throw new TypeError("Argument must be a number");var n=i(t);return void 0!==e?"string"==typeof r?n.fill(e,r):n.fill(e):n.fill(0),n},s.allocUnsafe=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return i(t)},s.allocUnsafeSlow=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return n.SlowBuffer(t)}},function(t,e,r){(function(e){var n=r(72),i=r(27);function o(t){var e,r=t.modulus.byteLength();do{e=new n(i(r))}while(e.cmp(t.modulus)>=0||!e.umod(t.prime1)||!e.umod(t.prime2));return e}function s(t,r){var i=function(t){var e=o(t);return{blinder:e.toRed(n.mont(t.modulus)).redPow(new n(t.publicExponent)).fromRed(),unblinder:e.invm(t.modulus)}}(r),s=r.modulus.byteLength(),a=new n(t).mul(i.blinder).umod(r.modulus),u=a.toRed(n.mont(r.prime1)),h=a.toRed(n.mont(r.prime2)),f=r.coefficient,c=r.prime1,l=r.prime2,d=u.redPow(r.exponent1).fromRed(),p=h.redPow(r.exponent2).fromRed(),m=d.isub(p).imul(f).umod(c).imul(l);return p.iadd(m).imul(i.unblinder).umod(r.modulus).toArrayLike(e,"be",s)}s.getr=o,t.exports=s}).call(this,r(3).Buffer)},function(t,e,r){(function(t){function e(t){return(e="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}!function(t,n){"use strict";function i(t,e){if(!t)throw new Error(e||"Assertion failed")}function o(t,e){t.super_=e;var r=function(){};r.prototype=e.prototype,t.prototype=new r,t.prototype.constructor=t}function s(t,e,r){if(s.isBN(t))return t;this.negative=0,this.words=null,this.length=0,this.red=null,null!==t&&("le"!==e&&"be"!==e||(r=e,e=10),this._init(t||0,e||10,r||"be"))}var 
a;"object"===e(t)?t.exports=s:n.BN=s,s.BN=s,s.wordSize=26;try{a="undefined"!=typeof window&&void 0!==window.Buffer?window.Buffer:r(233).Buffer}catch(t){}function u(t,e){var r=t.charCodeAt(e);return r>=48&&r<=57?r-48:r>=65&&r<=70?r-55:r>=97&&r<=102?r-87:void i(!1,"Invalid character in "+t)}function h(t,e,r){var n=u(t,r);return r-1>=e&&(n|=u(t,r-1)<<4),n}function f(t,e,r,n){for(var o=0,s=0,a=Math.min(t.length,r),u=e;u=49?h-49+10:h>=17?h-17+10:h,i(h>=0&&s0?t:e},s.min=function(t,e){return t.cmp(e)<0?t:e},s.prototype._init=function(t,r,n){if("number"==typeof t)return this._initNumber(t,r,n);if("object"===e(t))return this._initArray(t,r,n);"hex"===r&&(r=16),i(r===(0|r)&&r>=2&&r<=36);var o=0;"-"===(t=t.toString().replace(/\s+/g,""))[0]&&(o++,this.negative=1),o=0;n-=3)s=t[n]|t[n-1]<<8|t[n-2]<<16,this.words[o]|=s<>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);else if("le"===r)for(n=0,o=0;n>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);return this._strip()},s.prototype._parseHex=function(t,e,r){this.length=Math.ceil((t.length-e)/6),this.words=new Array(this.length);for(var n=0;n=e;n-=2)i=h(t,e,n)<=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;else for(n=(t.length-e)%2==0?e+1:e;n=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;this._strip()},s.prototype._parseBase=function(t,e,r){this.words=[0],this.length=1;for(var n=0,i=1;i<=67108863;i*=e)n++;n--,i=i/e|0;for(var o=t.length-r,s=o%n,a=Math.min(o,o-s)+r,u=0,h=r;h1&&0===this.words[this.length-1];)this.length--;return this._normSign()},s.prototype._normSign=function(){return 1===this.length&&0===this.words[0]&&(this.negative=0),this},"undefined"!=typeof Symbol&&"function"==typeof Symbol.for)try{s.prototype[Symbol.for("nodejs.util.inspect.custom")]=l}catch(t){s.prototype.inspect=l}else s.prototype.inspect=l;function l(){return(this.red?""}var d=["","0","00","000","0000","00000","000000","0000000","00000000","000000000","0000000000","00000000000","000000000000","0000000000000","00000000000000","000000000000000","0000000000000000","00000000000000000","000000000000000000","0000000000000000000","00000000000000000000","000000000000000000000","0000000000000000000000","00000000000000000000000","000000000000000000000000","0000000000000000000000000"],p=[0,0,25,16,12,11,10,9,8,8,7,7,7,7,6,6,6,6,6,6,6,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5],m=[0,0,33554432,43046721,16777216,48828125,60466176,40353607,16777216,43046721,1e7,19487171,35831808,62748517,7529536,11390625,16777216,24137569,34012224,47045881,64e6,4084101,5153632,6436343,7962624,9765625,11881376,14348907,17210368,20511149,243e5,28629151,33554432,39135393,45435424,52521875,60466176];s.prototype.toString=function(t,e){var r;if(e=0|e||1,16===(t=t||10)||"hex"===t){r="";for(var n=0,o=0,s=0;s>>24-n&16777215)||s!==this.length-1?d[6-u.length]+u+r:u+r,(n+=2)>=26&&(n-=26,s--)}for(0!==o&&(r=o.toString(16)+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}if(t===(0|t)&&t>=2&&t<=36){var h=p[t],f=m[t];r="";var c=this.clone();for(c.negative=0;!c.isZero();){var l=c.modrn(f).toString(t);r=(c=c.idivn(f)).isZero()?l+r:d[h-l.length]+l+r}for(this.isZero()&&(r="0"+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}i(!1,"Base should be between 2 and 36")},s.prototype.toNumber=function(){var t=this.words[0];return 2===this.length?t+=67108864*this.words[1]:3===this.length&&1===this.words[2]?t+=4503599627370496+67108864*this.words[1]:this.length>2&&i(!1,"Number can only safely store up to 53 bits"),0!==this.negative?-t:t},s.prototype.toJSON=function(){return this.toString(16,2)},a&&(s.prototype.toBuffer=function(t,e){return 
this.toArrayLike(a,t,e)}),s.prototype.toArray=function(t,e){return this.toArrayLike(Array,t,e)};function y(t,e,r){r.negative=e.negative^t.negative;var n=t.length+e.length|0;r.length=n,n=n-1|0;var i=0|t.words[0],o=0|e.words[0],s=i*o,a=67108863&s,u=s/67108864|0;r.words[0]=a;for(var h=1;h>>26,c=67108863&u,l=Math.min(h,e.length-1),d=Math.max(0,h-t.length+1);d<=l;d++){var p=h-d|0;f+=(s=(i=0|t.words[p])*(o=0|e.words[d])+c)/67108864|0,c=67108863&s}r.words[h]=0|c,u=0|f}return 0!==u?r.words[h]=0|u:r.length--,r._strip()}s.prototype.toArrayLike=function(t,e,r){this._strip();var n=this.byteLength(),o=r||Math.max(1,n);i(n<=o,"byte array longer than desired length"),i(o>0,"Requested array length <= 0");var s=function(t,e){return t.allocUnsafe?t.allocUnsafe(e):new t(e)}(t,o);return this["_toArrayLike"+("le"===e?"LE":"BE")](s,n),s},s.prototype._toArrayLikeLE=function(t,e){for(var r=0,n=0,i=0,o=0;i>8&255),r>16&255),6===o?(r>24&255),n=0,o=0):(n=s>>>24,o+=2)}if(r=0&&(t[r--]=s>>8&255),r>=0&&(t[r--]=s>>16&255),6===o?(r>=0&&(t[r--]=s>>24&255),n=0,o=0):(n=s>>>24,o+=2)}if(r>=0)for(t[r--]=n;r>=0;)t[r--]=0},Math.clz32?s.prototype._countBits=function(t){return 32-Math.clz32(t)}:s.prototype._countBits=function(t){var e=t,r=0;return e>=4096&&(r+=13,e>>>=13),e>=64&&(r+=7,e>>>=7),e>=8&&(r+=4,e>>>=4),e>=2&&(r+=2,e>>>=2),r+e},s.prototype._zeroBits=function(t){if(0===t)return 26;var e=t,r=0;return 0==(8191&e)&&(r+=13,e>>>=13),0==(127&e)&&(r+=7,e>>>=7),0==(15&e)&&(r+=4,e>>>=4),0==(3&e)&&(r+=2,e>>>=2),0==(1&e)&&r++,r},s.prototype.bitLength=function(){var t=this.words[this.length-1],e=this._countBits(t);return 26*(this.length-1)+e},s.prototype.zeroBits=function(){if(this.isZero())return 0;for(var t=0,e=0;et.length?this.clone().ior(t):t.clone().ior(this)},s.prototype.uor=function(t){return this.length>t.length?this.clone().iuor(t):t.clone().iuor(this)},s.prototype.iuand=function(t){var e;e=this.length>t.length?t:this;for(var r=0;rt.length?this.clone().iand(t):t.clone().iand(this)},s.prototype.uand=function(t){return this.length>t.length?this.clone().iuand(t):t.clone().iuand(this)},s.prototype.iuxor=function(t){var e,r;this.length>t.length?(e=this,r=t):(e=t,r=this);for(var n=0;nt.length?this.clone().ixor(t):t.clone().ixor(this)},s.prototype.uxor=function(t){return this.length>t.length?this.clone().iuxor(t):t.clone().iuxor(this)},s.prototype.inotn=function(t){i("number"==typeof t&&t>=0);var e=0|Math.ceil(t/26),r=t%26;this._expand(e),r>0&&e--;for(var n=0;n0&&(this.words[n]=~this.words[n]&67108863>>26-r),this._strip()},s.prototype.notn=function(t){return this.clone().inotn(t)},s.prototype.setn=function(t,e){i("number"==typeof t&&t>=0);var r=t/26|0,n=t%26;return this._expand(r+1),this.words[r]=e?this.words[r]|1<t.length?(r=this,n=t):(r=t,n=this);for(var i=0,o=0;o>>26;for(;0!==i&&o>>26;if(this.length=r.length,0!==i)this.words[this.length]=i,this.length++;else if(r!==this)for(;ot.length?this.clone().iadd(t):t.clone().iadd(this)},s.prototype.isub=function(t){if(0!==t.negative){t.negative=0;var e=this.iadd(t);return t.negative=1,e._normSign()}if(0!==this.negative)return this.negative=0,this.iadd(t),this.negative=1,this._normSign();var r,n,i=this.cmp(t);if(0===i)return this.negative=0,this.length=1,this.words[0]=0,this;i>0?(r=this,n=t):(r=t,n=this);for(var 
o=0,s=0;s>26,this.words[s]=67108863&e;for(;0!==o&&s>26,this.words[s]=67108863&e;if(0===o&&s>>13,d=0|s[1],p=8191&d,m=d>>>13,y=0|s[2],v=8191&y,b=y>>>13,g=0|s[3],w=8191&g,_=g>>>13,M=0|s[4],S=8191&M,O=M>>>13,A=0|s[5],E=8191&A,x=A>>>13,k=0|s[6],j=8191&k,$=k>>>13,P=0|s[7],R=8191&P,B=P>>>13,T=0|s[8],I=8191&T,N=T>>>13,D=0|s[9],C=8191&D,L=D>>>13,q=0|a[0],U=8191&q,F=q>>>13,z=0|a[1],V=8191&z,K=z>>>13,H=0|a[2],Z=8191&H,W=H>>>13,J=0|a[3],Y=8191&J,Q=J>>>13,G=0|a[4],X=8191&G,tt=G>>>13,et=0|a[5],rt=8191&et,nt=et>>>13,it=0|a[6],ot=8191&it,st=it>>>13,at=0|a[7],ut=8191&at,ht=at>>>13,ft=0|a[8],ct=8191&ft,lt=ft>>>13,dt=0|a[9],pt=8191&dt,mt=dt>>>13;r.negative=t.negative^e.negative,r.length=19;var yt=(h+(n=Math.imul(c,U))|0)+((8191&(i=(i=Math.imul(c,F))+Math.imul(l,U)|0))<<13)|0;h=((o=Math.imul(l,F))+(i>>>13)|0)+(yt>>>26)|0,yt&=67108863,n=Math.imul(p,U),i=(i=Math.imul(p,F))+Math.imul(m,U)|0,o=Math.imul(m,F);var vt=(h+(n=n+Math.imul(c,V)|0)|0)+((8191&(i=(i=i+Math.imul(c,K)|0)+Math.imul(l,V)|0))<<13)|0;h=((o=o+Math.imul(l,K)|0)+(i>>>13)|0)+(vt>>>26)|0,vt&=67108863,n=Math.imul(v,U),i=(i=Math.imul(v,F))+Math.imul(b,U)|0,o=Math.imul(b,F),n=n+Math.imul(p,V)|0,i=(i=i+Math.imul(p,K)|0)+Math.imul(m,V)|0,o=o+Math.imul(m,K)|0;var bt=(h+(n=n+Math.imul(c,Z)|0)|0)+((8191&(i=(i=i+Math.imul(c,W)|0)+Math.imul(l,Z)|0))<<13)|0;h=((o=o+Math.imul(l,W)|0)+(i>>>13)|0)+(bt>>>26)|0,bt&=67108863,n=Math.imul(w,U),i=(i=Math.imul(w,F))+Math.imul(_,U)|0,o=Math.imul(_,F),n=n+Math.imul(v,V)|0,i=(i=i+Math.imul(v,K)|0)+Math.imul(b,V)|0,o=o+Math.imul(b,K)|0,n=n+Math.imul(p,Z)|0,i=(i=i+Math.imul(p,W)|0)+Math.imul(m,Z)|0,o=o+Math.imul(m,W)|0;var gt=(h+(n=n+Math.imul(c,Y)|0)|0)+((8191&(i=(i=i+Math.imul(c,Q)|0)+Math.imul(l,Y)|0))<<13)|0;h=((o=o+Math.imul(l,Q)|0)+(i>>>13)|0)+(gt>>>26)|0,gt&=67108863,n=Math.imul(S,U),i=(i=Math.imul(S,F))+Math.imul(O,U)|0,o=Math.imul(O,F),n=n+Math.imul(w,V)|0,i=(i=i+Math.imul(w,K)|0)+Math.imul(_,V)|0,o=o+Math.imul(_,K)|0,n=n+Math.imul(v,Z)|0,i=(i=i+Math.imul(v,W)|0)+Math.imul(b,Z)|0,o=o+Math.imul(b,W)|0,n=n+Math.imul(p,Y)|0,i=(i=i+Math.imul(p,Q)|0)+Math.imul(m,Y)|0,o=o+Math.imul(m,Q)|0;var wt=(h+(n=n+Math.imul(c,X)|0)|0)+((8191&(i=(i=i+Math.imul(c,tt)|0)+Math.imul(l,X)|0))<<13)|0;h=((o=o+Math.imul(l,tt)|0)+(i>>>13)|0)+(wt>>>26)|0,wt&=67108863,n=Math.imul(E,U),i=(i=Math.imul(E,F))+Math.imul(x,U)|0,o=Math.imul(x,F),n=n+Math.imul(S,V)|0,i=(i=i+Math.imul(S,K)|0)+Math.imul(O,V)|0,o=o+Math.imul(O,K)|0,n=n+Math.imul(w,Z)|0,i=(i=i+Math.imul(w,W)|0)+Math.imul(_,Z)|0,o=o+Math.imul(_,W)|0,n=n+Math.imul(v,Y)|0,i=(i=i+Math.imul(v,Q)|0)+Math.imul(b,Y)|0,o=o+Math.imul(b,Q)|0,n=n+Math.imul(p,X)|0,i=(i=i+Math.imul(p,tt)|0)+Math.imul(m,X)|0,o=o+Math.imul(m,tt)|0;var _t=(h+(n=n+Math.imul(c,rt)|0)|0)+((8191&(i=(i=i+Math.imul(c,nt)|0)+Math.imul(l,rt)|0))<<13)|0;h=((o=o+Math.imul(l,nt)|0)+(i>>>13)|0)+(_t>>>26)|0,_t&=67108863,n=Math.imul(j,U),i=(i=Math.imul(j,F))+Math.imul($,U)|0,o=Math.imul($,F),n=n+Math.imul(E,V)|0,i=(i=i+Math.imul(E,K)|0)+Math.imul(x,V)|0,o=o+Math.imul(x,K)|0,n=n+Math.imul(S,Z)|0,i=(i=i+Math.imul(S,W)|0)+Math.imul(O,Z)|0,o=o+Math.imul(O,W)|0,n=n+Math.imul(w,Y)|0,i=(i=i+Math.imul(w,Q)|0)+Math.imul(_,Y)|0,o=o+Math.imul(_,Q)|0,n=n+Math.imul(v,X)|0,i=(i=i+Math.imul(v,tt)|0)+Math.imul(b,X)|0,o=o+Math.imul(b,tt)|0,n=n+Math.imul(p,rt)|0,i=(i=i+Math.imul(p,nt)|0)+Math.imul(m,rt)|0,o=o+Math.imul(m,nt)|0;var 
Mt=(h+(n=n+Math.imul(c,ot)|0)|0)+((8191&(i=(i=i+Math.imul(c,st)|0)+Math.imul(l,ot)|0))<<13)|0;h=((o=o+Math.imul(l,st)|0)+(i>>>13)|0)+(Mt>>>26)|0,Mt&=67108863,n=Math.imul(R,U),i=(i=Math.imul(R,F))+Math.imul(B,U)|0,o=Math.imul(B,F),n=n+Math.imul(j,V)|0,i=(i=i+Math.imul(j,K)|0)+Math.imul($,V)|0,o=o+Math.imul($,K)|0,n=n+Math.imul(E,Z)|0,i=(i=i+Math.imul(E,W)|0)+Math.imul(x,Z)|0,o=o+Math.imul(x,W)|0,n=n+Math.imul(S,Y)|0,i=(i=i+Math.imul(S,Q)|0)+Math.imul(O,Y)|0,o=o+Math.imul(O,Q)|0,n=n+Math.imul(w,X)|0,i=(i=i+Math.imul(w,tt)|0)+Math.imul(_,X)|0,o=o+Math.imul(_,tt)|0,n=n+Math.imul(v,rt)|0,i=(i=i+Math.imul(v,nt)|0)+Math.imul(b,rt)|0,o=o+Math.imul(b,nt)|0,n=n+Math.imul(p,ot)|0,i=(i=i+Math.imul(p,st)|0)+Math.imul(m,ot)|0,o=o+Math.imul(m,st)|0;var St=(h+(n=n+Math.imul(c,ut)|0)|0)+((8191&(i=(i=i+Math.imul(c,ht)|0)+Math.imul(l,ut)|0))<<13)|0;h=((o=o+Math.imul(l,ht)|0)+(i>>>13)|0)+(St>>>26)|0,St&=67108863,n=Math.imul(I,U),i=(i=Math.imul(I,F))+Math.imul(N,U)|0,o=Math.imul(N,F),n=n+Math.imul(R,V)|0,i=(i=i+Math.imul(R,K)|0)+Math.imul(B,V)|0,o=o+Math.imul(B,K)|0,n=n+Math.imul(j,Z)|0,i=(i=i+Math.imul(j,W)|0)+Math.imul($,Z)|0,o=o+Math.imul($,W)|0,n=n+Math.imul(E,Y)|0,i=(i=i+Math.imul(E,Q)|0)+Math.imul(x,Y)|0,o=o+Math.imul(x,Q)|0,n=n+Math.imul(S,X)|0,i=(i=i+Math.imul(S,tt)|0)+Math.imul(O,X)|0,o=o+Math.imul(O,tt)|0,n=n+Math.imul(w,rt)|0,i=(i=i+Math.imul(w,nt)|0)+Math.imul(_,rt)|0,o=o+Math.imul(_,nt)|0,n=n+Math.imul(v,ot)|0,i=(i=i+Math.imul(v,st)|0)+Math.imul(b,ot)|0,o=o+Math.imul(b,st)|0,n=n+Math.imul(p,ut)|0,i=(i=i+Math.imul(p,ht)|0)+Math.imul(m,ut)|0,o=o+Math.imul(m,ht)|0;var Ot=(h+(n=n+Math.imul(c,ct)|0)|0)+((8191&(i=(i=i+Math.imul(c,lt)|0)+Math.imul(l,ct)|0))<<13)|0;h=((o=o+Math.imul(l,lt)|0)+(i>>>13)|0)+(Ot>>>26)|0,Ot&=67108863,n=Math.imul(C,U),i=(i=Math.imul(C,F))+Math.imul(L,U)|0,o=Math.imul(L,F),n=n+Math.imul(I,V)|0,i=(i=i+Math.imul(I,K)|0)+Math.imul(N,V)|0,o=o+Math.imul(N,K)|0,n=n+Math.imul(R,Z)|0,i=(i=i+Math.imul(R,W)|0)+Math.imul(B,Z)|0,o=o+Math.imul(B,W)|0,n=n+Math.imul(j,Y)|0,i=(i=i+Math.imul(j,Q)|0)+Math.imul($,Y)|0,o=o+Math.imul($,Q)|0,n=n+Math.imul(E,X)|0,i=(i=i+Math.imul(E,tt)|0)+Math.imul(x,X)|0,o=o+Math.imul(x,tt)|0,n=n+Math.imul(S,rt)|0,i=(i=i+Math.imul(S,nt)|0)+Math.imul(O,rt)|0,o=o+Math.imul(O,nt)|0,n=n+Math.imul(w,ot)|0,i=(i=i+Math.imul(w,st)|0)+Math.imul(_,ot)|0,o=o+Math.imul(_,st)|0,n=n+Math.imul(v,ut)|0,i=(i=i+Math.imul(v,ht)|0)+Math.imul(b,ut)|0,o=o+Math.imul(b,ht)|0,n=n+Math.imul(p,ct)|0,i=(i=i+Math.imul(p,lt)|0)+Math.imul(m,ct)|0,o=o+Math.imul(m,lt)|0;var At=(h+(n=n+Math.imul(c,pt)|0)|0)+((8191&(i=(i=i+Math.imul(c,mt)|0)+Math.imul(l,pt)|0))<<13)|0;h=((o=o+Math.imul(l,mt)|0)+(i>>>13)|0)+(At>>>26)|0,At&=67108863,n=Math.imul(C,V),i=(i=Math.imul(C,K))+Math.imul(L,V)|0,o=Math.imul(L,K),n=n+Math.imul(I,Z)|0,i=(i=i+Math.imul(I,W)|0)+Math.imul(N,Z)|0,o=o+Math.imul(N,W)|0,n=n+Math.imul(R,Y)|0,i=(i=i+Math.imul(R,Q)|0)+Math.imul(B,Y)|0,o=o+Math.imul(B,Q)|0,n=n+Math.imul(j,X)|0,i=(i=i+Math.imul(j,tt)|0)+Math.imul($,X)|0,o=o+Math.imul($,tt)|0,n=n+Math.imul(E,rt)|0,i=(i=i+Math.imul(E,nt)|0)+Math.imul(x,rt)|0,o=o+Math.imul(x,nt)|0,n=n+Math.imul(S,ot)|0,i=(i=i+Math.imul(S,st)|0)+Math.imul(O,ot)|0,o=o+Math.imul(O,st)|0,n=n+Math.imul(w,ut)|0,i=(i=i+Math.imul(w,ht)|0)+Math.imul(_,ut)|0,o=o+Math.imul(_,ht)|0,n=n+Math.imul(v,ct)|0,i=(i=i+Math.imul(v,lt)|0)+Math.imul(b,ct)|0,o=o+Math.imul(b,lt)|0;var 
Et=(h+(n=n+Math.imul(p,pt)|0)|0)+((8191&(i=(i=i+Math.imul(p,mt)|0)+Math.imul(m,pt)|0))<<13)|0;h=((o=o+Math.imul(m,mt)|0)+(i>>>13)|0)+(Et>>>26)|0,Et&=67108863,n=Math.imul(C,Z),i=(i=Math.imul(C,W))+Math.imul(L,Z)|0,o=Math.imul(L,W),n=n+Math.imul(I,Y)|0,i=(i=i+Math.imul(I,Q)|0)+Math.imul(N,Y)|0,o=o+Math.imul(N,Q)|0,n=n+Math.imul(R,X)|0,i=(i=i+Math.imul(R,tt)|0)+Math.imul(B,X)|0,o=o+Math.imul(B,tt)|0,n=n+Math.imul(j,rt)|0,i=(i=i+Math.imul(j,nt)|0)+Math.imul($,rt)|0,o=o+Math.imul($,nt)|0,n=n+Math.imul(E,ot)|0,i=(i=i+Math.imul(E,st)|0)+Math.imul(x,ot)|0,o=o+Math.imul(x,st)|0,n=n+Math.imul(S,ut)|0,i=(i=i+Math.imul(S,ht)|0)+Math.imul(O,ut)|0,o=o+Math.imul(O,ht)|0,n=n+Math.imul(w,ct)|0,i=(i=i+Math.imul(w,lt)|0)+Math.imul(_,ct)|0,o=o+Math.imul(_,lt)|0;var xt=(h+(n=n+Math.imul(v,pt)|0)|0)+((8191&(i=(i=i+Math.imul(v,mt)|0)+Math.imul(b,pt)|0))<<13)|0;h=((o=o+Math.imul(b,mt)|0)+(i>>>13)|0)+(xt>>>26)|0,xt&=67108863,n=Math.imul(C,Y),i=(i=Math.imul(C,Q))+Math.imul(L,Y)|0,o=Math.imul(L,Q),n=n+Math.imul(I,X)|0,i=(i=i+Math.imul(I,tt)|0)+Math.imul(N,X)|0,o=o+Math.imul(N,tt)|0,n=n+Math.imul(R,rt)|0,i=(i=i+Math.imul(R,nt)|0)+Math.imul(B,rt)|0,o=o+Math.imul(B,nt)|0,n=n+Math.imul(j,ot)|0,i=(i=i+Math.imul(j,st)|0)+Math.imul($,ot)|0,o=o+Math.imul($,st)|0,n=n+Math.imul(E,ut)|0,i=(i=i+Math.imul(E,ht)|0)+Math.imul(x,ut)|0,o=o+Math.imul(x,ht)|0,n=n+Math.imul(S,ct)|0,i=(i=i+Math.imul(S,lt)|0)+Math.imul(O,ct)|0,o=o+Math.imul(O,lt)|0;var kt=(h+(n=n+Math.imul(w,pt)|0)|0)+((8191&(i=(i=i+Math.imul(w,mt)|0)+Math.imul(_,pt)|0))<<13)|0;h=((o=o+Math.imul(_,mt)|0)+(i>>>13)|0)+(kt>>>26)|0,kt&=67108863,n=Math.imul(C,X),i=(i=Math.imul(C,tt))+Math.imul(L,X)|0,o=Math.imul(L,tt),n=n+Math.imul(I,rt)|0,i=(i=i+Math.imul(I,nt)|0)+Math.imul(N,rt)|0,o=o+Math.imul(N,nt)|0,n=n+Math.imul(R,ot)|0,i=(i=i+Math.imul(R,st)|0)+Math.imul(B,ot)|0,o=o+Math.imul(B,st)|0,n=n+Math.imul(j,ut)|0,i=(i=i+Math.imul(j,ht)|0)+Math.imul($,ut)|0,o=o+Math.imul($,ht)|0,n=n+Math.imul(E,ct)|0,i=(i=i+Math.imul(E,lt)|0)+Math.imul(x,ct)|0,o=o+Math.imul(x,lt)|0;var jt=(h+(n=n+Math.imul(S,pt)|0)|0)+((8191&(i=(i=i+Math.imul(S,mt)|0)+Math.imul(O,pt)|0))<<13)|0;h=((o=o+Math.imul(O,mt)|0)+(i>>>13)|0)+(jt>>>26)|0,jt&=67108863,n=Math.imul(C,rt),i=(i=Math.imul(C,nt))+Math.imul(L,rt)|0,o=Math.imul(L,nt),n=n+Math.imul(I,ot)|0,i=(i=i+Math.imul(I,st)|0)+Math.imul(N,ot)|0,o=o+Math.imul(N,st)|0,n=n+Math.imul(R,ut)|0,i=(i=i+Math.imul(R,ht)|0)+Math.imul(B,ut)|0,o=o+Math.imul(B,ht)|0,n=n+Math.imul(j,ct)|0,i=(i=i+Math.imul(j,lt)|0)+Math.imul($,ct)|0,o=o+Math.imul($,lt)|0;var $t=(h+(n=n+Math.imul(E,pt)|0)|0)+((8191&(i=(i=i+Math.imul(E,mt)|0)+Math.imul(x,pt)|0))<<13)|0;h=((o=o+Math.imul(x,mt)|0)+(i>>>13)|0)+($t>>>26)|0,$t&=67108863,n=Math.imul(C,ot),i=(i=Math.imul(C,st))+Math.imul(L,ot)|0,o=Math.imul(L,st),n=n+Math.imul(I,ut)|0,i=(i=i+Math.imul(I,ht)|0)+Math.imul(N,ut)|0,o=o+Math.imul(N,ht)|0,n=n+Math.imul(R,ct)|0,i=(i=i+Math.imul(R,lt)|0)+Math.imul(B,ct)|0,o=o+Math.imul(B,lt)|0;var Pt=(h+(n=n+Math.imul(j,pt)|0)|0)+((8191&(i=(i=i+Math.imul(j,mt)|0)+Math.imul($,pt)|0))<<13)|0;h=((o=o+Math.imul($,mt)|0)+(i>>>13)|0)+(Pt>>>26)|0,Pt&=67108863,n=Math.imul(C,ut),i=(i=Math.imul(C,ht))+Math.imul(L,ut)|0,o=Math.imul(L,ht),n=n+Math.imul(I,ct)|0,i=(i=i+Math.imul(I,lt)|0)+Math.imul(N,ct)|0,o=o+Math.imul(N,lt)|0;var Rt=(h+(n=n+Math.imul(R,pt)|0)|0)+((8191&(i=(i=i+Math.imul(R,mt)|0)+Math.imul(B,pt)|0))<<13)|0;h=((o=o+Math.imul(B,mt)|0)+(i>>>13)|0)+(Rt>>>26)|0,Rt&=67108863,n=Math.imul(C,ct),i=(i=Math.imul(C,lt))+Math.imul(L,ct)|0,o=Math.imul(L,lt);var 
Bt=(h+(n=n+Math.imul(I,pt)|0)|0)+((8191&(i=(i=i+Math.imul(I,mt)|0)+Math.imul(N,pt)|0))<<13)|0;h=((o=o+Math.imul(N,mt)|0)+(i>>>13)|0)+(Bt>>>26)|0,Bt&=67108863;var Tt=(h+(n=Math.imul(C,pt))|0)+((8191&(i=(i=Math.imul(C,mt))+Math.imul(L,pt)|0))<<13)|0;return h=((o=Math.imul(L,mt))+(i>>>13)|0)+(Tt>>>26)|0,Tt&=67108863,u[0]=yt,u[1]=vt,u[2]=bt,u[3]=gt,u[4]=wt,u[5]=_t,u[6]=Mt,u[7]=St,u[8]=Ot,u[9]=At,u[10]=Et,u[11]=xt,u[12]=kt,u[13]=jt,u[14]=$t,u[15]=Pt,u[16]=Rt,u[17]=Bt,u[18]=Tt,0!==h&&(u[19]=h,r.length++),r};function b(t,e,r){r.negative=e.negative^t.negative,r.length=t.length+e.length;for(var n=0,i=0,o=0;o>>26)|0)>>>26,s&=67108863}r.words[o]=a,n=s,s=i}return 0!==n?r.words[o]=n:r.length--,r._strip()}function g(t,e,r){return b(t,e,r)}function w(t,e){this.x=t,this.y=e}Math.imul||(v=y),s.prototype.mulTo=function(t,e){var r=this.length+t.length;return 10===this.length&&10===t.length?v(this,t,e):r<63?y(this,t,e):r<1024?b(this,t,e):g(this,t,e)},w.prototype.makeRBT=function(t){for(var e=new Array(t),r=s.prototype._countBits(t)-1,n=0;n>=1;return n},w.prototype.permute=function(t,e,r,n,i,o){for(var s=0;s>>=1)i++;return 1<>>=13,r[2*s+1]=8191&o,o>>>=13;for(s=2*e;s>=26,r+=o/67108864|0,r+=s>>>26,this.words[n]=67108863&s}return 0!==r&&(this.words[n]=r,this.length++),e?this.ineg():this},s.prototype.muln=function(t){return this.clone().imuln(t)},s.prototype.sqr=function(){return this.mul(this)},s.prototype.isqr=function(){return this.imul(this.clone())},s.prototype.pow=function(t){var e=function(t){for(var e=new Array(t.bitLength()),r=0;r>>i&1}return e}(t);if(0===e.length)return new s(1);for(var r=this,n=0;n=0);var e,r=t%26,n=(t-r)/26,o=67108863>>>26-r<<26-r;if(0!==r){var s=0;for(e=0;e>>26-r}s&&(this.words[e]=s,this.length++)}if(0!==n){for(e=this.length-1;e>=0;e--)this.words[e+n]=this.words[e];for(e=0;e=0),n=e?(e-e%26)/26:0;var o=t%26,s=Math.min((t-o)/26,this.length),a=67108863^67108863>>>o<s)for(this.length-=s,h=0;h=0&&(0!==f||h>=n);h--){var c=0|this.words[h];this.words[h]=f<<26-o|c>>>o,f=c&a}return u&&0!==f&&(u.words[u.length++]=f),0===this.length&&(this.words[0]=0,this.length=1),this._strip()},s.prototype.ishrn=function(t,e,r){return i(0===this.negative),this.iushrn(t,e,r)},s.prototype.shln=function(t){return this.clone().ishln(t)},s.prototype.ushln=function(t){return this.clone().iushln(t)},s.prototype.shrn=function(t){return this.clone().ishrn(t)},s.prototype.ushrn=function(t){return this.clone().iushrn(t)},s.prototype.testn=function(t){i("number"==typeof t&&t>=0);var e=t%26,r=(t-e)/26,n=1<=0);var e=t%26,r=(t-e)/26;if(i(0===this.negative,"imaskn works only with positive numbers"),this.length<=r)return this;if(0!==e&&r++,this.length=Math.min(r,this.length),0!==e){var n=67108863^67108863>>>e<=67108864;e++)this.words[e]-=67108864,e===this.length-1?this.words[e+1]=1:this.words[e+1]++;return this.length=Math.max(this.length,e+1),this},s.prototype.isubn=function(t){if(i("number"==typeof t),i(t<67108864),t<0)return this.iaddn(-t);if(0!==this.negative)return this.negative=0,this.iaddn(t),this.negative=1,this;if(this.words[0]-=t,1===this.length&&this.words[0]<0)this.words[0]=-this.words[0],this.negative=1;else for(var e=0;e>26)-(u/67108864|0),this.words[n+r]=67108863&o}for(;n>26,this.words[n+r]=67108863&o;if(0===a)return this._strip();for(i(-1===a),a=0,n=0;n>26,this.words[n]=67108863&o;return this.negative=1,this._strip()},s.prototype._wordDiv=function(t,e){var r=(this.length,t.length),n=this.clone(),i=t,o=0|i.words[i.length-1];0!==(r=26-this._countBits(o))&&(i=i.ushln(r),n.iushln(r),o=0|i.words[i.length-1]);var 
a,u=n.length-i.length;if("mod"!==e){(a=new s(null)).length=u+1,a.words=new Array(a.length);for(var h=0;h=0;c--){var l=67108864*(0|n.words[i.length+c])+(0|n.words[i.length+c-1]);for(l=Math.min(l/o|0,67108863),n._ishlnsubmul(i,l,c);0!==n.negative;)l--,n.negative=0,n._ishlnsubmul(i,1,c),n.isZero()||(n.negative^=1);a&&(a.words[c]=l)}return a&&a._strip(),n._strip(),"div"!==e&&0!==r&&n.iushrn(r),{div:a||null,mod:n}},s.prototype.divmod=function(t,e,r){return i(!t.isZero()),this.isZero()?{div:new s(0),mod:new s(0)}:0!==this.negative&&0===t.negative?(a=this.neg().divmod(t,e),"mod"!==e&&(n=a.div.neg()),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.iadd(t)),{div:n,mod:o}):0===this.negative&&0!==t.negative?(a=this.divmod(t.neg(),e),"mod"!==e&&(n=a.div.neg()),{div:n,mod:a.mod}):0!=(this.negative&t.negative)?(a=this.neg().divmod(t.neg(),e),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.isub(t)),{div:a.div,mod:o}):t.length>this.length||this.cmp(t)<0?{div:new s(0),mod:this}:1===t.length?"div"===e?{div:this.divn(t.words[0]),mod:null}:"mod"===e?{div:null,mod:new s(this.modrn(t.words[0]))}:{div:this.divn(t.words[0]),mod:new s(this.modrn(t.words[0]))}:this._wordDiv(t,e);var n,o,a},s.prototype.div=function(t){return this.divmod(t,"div",!1).div},s.prototype.mod=function(t){return this.divmod(t,"mod",!1).mod},s.prototype.umod=function(t){return this.divmod(t,"mod",!0).mod},s.prototype.divRound=function(t){var e=this.divmod(t);if(e.mod.isZero())return e.div;var r=0!==e.div.negative?e.mod.isub(t):e.mod,n=t.ushrn(1),i=t.andln(1),o=r.cmp(n);return o<0||1===i&&0===o?e.div:0!==e.div.negative?e.div.isubn(1):e.div.iaddn(1)},s.prototype.modrn=function(t){var e=t<0;e&&(t=-t),i(t<=67108863);for(var r=(1<<26)%t,n=0,o=this.length-1;o>=0;o--)n=(r*n+(0|this.words[o]))%t;return e?-n:n},s.prototype.modn=function(t){return this.modrn(t)},s.prototype.idivn=function(t){var e=t<0;e&&(t=-t),i(t<=67108863);for(var r=0,n=this.length-1;n>=0;n--){var o=(0|this.words[n])+67108864*r;this.words[n]=o/t|0,r=o%t}return this._strip(),e?this.ineg():this},s.prototype.divn=function(t){return this.clone().idivn(t)},s.prototype.egcd=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n=new s(1),o=new s(0),a=new s(0),u=new s(1),h=0;e.isEven()&&r.isEven();)e.iushrn(1),r.iushrn(1),++h;for(var f=r.clone(),c=e.clone();!e.isZero();){for(var l=0,d=1;0==(e.words[0]&d)&&l<26;++l,d<<=1);if(l>0)for(e.iushrn(l);l-- >0;)(n.isOdd()||o.isOdd())&&(n.iadd(f),o.isub(c)),n.iushrn(1),o.iushrn(1);for(var p=0,m=1;0==(r.words[0]&m)&&p<26;++p,m<<=1);if(p>0)for(r.iushrn(p);p-- >0;)(a.isOdd()||u.isOdd())&&(a.iadd(f),u.isub(c)),a.iushrn(1),u.iushrn(1);e.cmp(r)>=0?(e.isub(r),n.isub(a),o.isub(u)):(r.isub(e),a.isub(n),u.isub(o))}return{a:a,b:u,gcd:r.iushln(h)}},s.prototype._invmp=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n,o=new s(1),a=new s(0),u=r.clone();e.cmpn(1)>0&&r.cmpn(1)>0;){for(var h=0,f=1;0==(e.words[0]&f)&&h<26;++h,f<<=1);if(h>0)for(e.iushrn(h);h-- >0;)o.isOdd()&&o.iadd(u),o.iushrn(1);for(var c=0,l=1;0==(r.words[0]&l)&&c<26;++c,l<<=1);if(c>0)for(r.iushrn(c);c-- >0;)a.isOdd()&&a.iadd(u),a.iushrn(1);e.cmp(r)>=0?(e.isub(r),o.isub(a)):(r.isub(e),a.isub(o))}return(n=0===e.cmpn(1)?o:a).cmpn(0)<0&&n.iadd(t),n},s.prototype.gcd=function(t){if(this.isZero())return t.abs();if(t.isZero())return this.abs();var e=this.clone(),r=t.clone();e.negative=0,r.negative=0;for(var 
n=0;e.isEven()&&r.isEven();n++)e.iushrn(1),r.iushrn(1);for(;;){for(;e.isEven();)e.iushrn(1);for(;r.isEven();)r.iushrn(1);var i=e.cmp(r);if(i<0){var o=e;e=r,r=o}else if(0===i||0===r.cmpn(1))break;e.isub(r)}return r.iushln(n)},s.prototype.invm=function(t){return this.egcd(t).a.umod(t)},s.prototype.isEven=function(){return 0==(1&this.words[0])},s.prototype.isOdd=function(){return 1==(1&this.words[0])},s.prototype.andln=function(t){return this.words[0]&t},s.prototype.bincn=function(t){i("number"==typeof t);var e=t%26,r=(t-e)/26,n=1<>>26,a&=67108863,this.words[s]=a}return 0!==o&&(this.words[s]=o,this.length++),this},s.prototype.isZero=function(){return 1===this.length&&0===this.words[0]},s.prototype.cmpn=function(t){var e,r=t<0;if(0!==this.negative&&!r)return-1;if(0===this.negative&&r)return 1;if(this._strip(),this.length>1)e=1;else{r&&(t=-t),i(t<=67108863,"Number is too big");var n=0|this.words[0];e=n===t?0:nt.length)return 1;if(this.length=0;r--){var n=0|this.words[r],i=0|t.words[r];if(n!==i){ni&&(e=1);break}}return e},s.prototype.gtn=function(t){return 1===this.cmpn(t)},s.prototype.gt=function(t){return 1===this.cmp(t)},s.prototype.gten=function(t){return this.cmpn(t)>=0},s.prototype.gte=function(t){return this.cmp(t)>=0},s.prototype.ltn=function(t){return-1===this.cmpn(t)},s.prototype.lt=function(t){return-1===this.cmp(t)},s.prototype.lten=function(t){return this.cmpn(t)<=0},s.prototype.lte=function(t){return this.cmp(t)<=0},s.prototype.eqn=function(t){return 0===this.cmpn(t)},s.prototype.eq=function(t){return 0===this.cmp(t)},s.red=function(t){return new x(t)},s.prototype.toRed=function(t){return i(!this.red,"Already a number in reduction context"),i(0===this.negative,"red works only with positives"),t.convertTo(this)._forceRed(t)},s.prototype.fromRed=function(){return i(this.red,"fromRed works only with numbers in reduction context"),this.red.convertFrom(this)},s.prototype._forceRed=function(t){return this.red=t,this},s.prototype.forceRed=function(t){return i(!this.red,"Already a number in reduction context"),this._forceRed(t)},s.prototype.redAdd=function(t){return i(this.red,"redAdd works only with red numbers"),this.red.add(this,t)},s.prototype.redIAdd=function(t){return i(this.red,"redIAdd works only with red numbers"),this.red.iadd(this,t)},s.prototype.redSub=function(t){return i(this.red,"redSub works only with red numbers"),this.red.sub(this,t)},s.prototype.redISub=function(t){return i(this.red,"redISub works only with red numbers"),this.red.isub(this,t)},s.prototype.redShl=function(t){return i(this.red,"redShl works only with red numbers"),this.red.shl(this,t)},s.prototype.redMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.mul(this,t)},s.prototype.redIMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.imul(this,t)},s.prototype.redSqr=function(){return i(this.red,"redSqr works only with red numbers"),this.red._verify1(this),this.red.sqr(this)},s.prototype.redISqr=function(){return i(this.red,"redISqr works only with red numbers"),this.red._verify1(this),this.red.isqr(this)},s.prototype.redSqrt=function(){return i(this.red,"redSqrt works only with red numbers"),this.red._verify1(this),this.red.sqrt(this)},s.prototype.redInvm=function(){return i(this.red,"redInvm works only with red numbers"),this.red._verify1(this),this.red.invm(this)},s.prototype.redNeg=function(){return i(this.red,"redNeg works only with red 
numbers"),this.red._verify1(this),this.red.neg(this)},s.prototype.redPow=function(t){return i(this.red&&!t.red,"redPow(normalNum)"),this.red._verify1(this),this.red.pow(this,t)};var _={k256:null,p224:null,p192:null,p25519:null};function M(t,e){this.name=t,this.p=new s(e,16),this.n=this.p.bitLength(),this.k=new s(1).iushln(this.n).isub(this.p),this.tmp=this._tmp()}function S(){M.call(this,"k256","ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe fffffc2f")}function O(){M.call(this,"p224","ffffffff ffffffff ffffffff ffffffff 00000000 00000000 00000001")}function A(){M.call(this,"p192","ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff")}function E(){M.call(this,"25519","7fffffffffffffff ffffffffffffffff ffffffffffffffff ffffffffffffffed")}function x(t){if("string"==typeof t){var e=s._prime(t);this.m=e.p,this.prime=e}else i(t.gtn(1),"modulus must be greater than 1"),this.m=t,this.prime=null}function k(t){x.call(this,t),this.shift=this.m.bitLength(),this.shift%26!=0&&(this.shift+=26-this.shift%26),this.r=new s(1).iushln(this.shift),this.r2=this.imod(this.r.sqr()),this.rinv=this.r._invmp(this.m),this.minv=this.rinv.mul(this.r).isubn(1).div(this.m),this.minv=this.minv.umod(this.r),this.minv=this.r.sub(this.minv)}M.prototype._tmp=function(){var t=new s(null);return t.words=new Array(Math.ceil(this.n/13)),t},M.prototype.ireduce=function(t){var e,r=t;do{this.split(r,this.tmp),e=(r=(r=this.imulK(r)).iadd(this.tmp)).bitLength()}while(e>this.n);var n=e0?r.isub(this.p):void 0!==r.strip?r.strip():r._strip(),r},M.prototype.split=function(t,e){t.iushrn(this.n,0,e)},M.prototype.imulK=function(t){return t.imul(this.k)},o(S,M),S.prototype.split=function(t,e){for(var r=Math.min(t.length,9),n=0;n>>22,i=o}i>>>=22,t.words[n-10]=i,0===i&&t.length>10?t.length-=10:t.length-=9},S.prototype.imulK=function(t){t.words[t.length]=0,t.words[t.length+1]=0,t.length+=2;for(var e=0,r=0;r>>=26,t.words[r]=i,e=n}return 0!==e&&(t.words[t.length++]=e),t},s._prime=function(t){if(_[t])return _[t];var e;if("k256"===t)e=new S;else if("p224"===t)e=new O;else if("p192"===t)e=new A;else{if("p25519"!==t)throw new Error("Unknown prime "+t);e=new E}return _[t]=e,e},x.prototype._verify1=function(t){i(0===t.negative,"red works only with positives"),i(t.red,"red works only with red numbers")},x.prototype._verify2=function(t,e){i(0==(t.negative|e.negative),"red works only with positives"),i(t.red&&t.red===e.red,"red works only with red numbers")},x.prototype.imod=function(t){return this.prime?this.prime.ireduce(t)._forceRed(this):(c(t,t.umod(this.m)._forceRed(this)),t)},x.prototype.neg=function(t){return t.isZero()?t.clone():this.m.sub(t)._forceRed(this)},x.prototype.add=function(t,e){this._verify2(t,e);var r=t.add(e);return r.cmp(this.m)>=0&&r.isub(this.m),r._forceRed(this)},x.prototype.iadd=function(t,e){this._verify2(t,e);var r=t.iadd(e);return r.cmp(this.m)>=0&&r.isub(this.m),r},x.prototype.sub=function(t,e){this._verify2(t,e);var r=t.sub(e);return r.cmpn(0)<0&&r.iadd(this.m),r._forceRed(this)},x.prototype.isub=function(t,e){this._verify2(t,e);var r=t.isub(e);return r.cmpn(0)<0&&r.iadd(this.m),r},x.prototype.shl=function(t,e){return this._verify1(t),this.imod(t.ushln(e))},x.prototype.imul=function(t,e){return this._verify2(t,e),this.imod(t.imul(e))},x.prototype.mul=function(t,e){return this._verify2(t,e),this.imod(t.mul(e))},x.prototype.isqr=function(t){return this.imul(t,t.clone())},x.prototype.sqr=function(t){return this.mul(t,t)},x.prototype.sqrt=function(t){if(t.isZero())return t.clone();var 
e=this.m.andln(3);if(i(e%2==1),3===e){var r=this.m.add(new s(1)).iushrn(2);return this.pow(t,r)}for(var n=this.m.subn(1),o=0;!n.isZero()&&0===n.andln(1);)o++,n.iushrn(1);i(!n.isZero());var a=new s(1).toRed(this),u=a.redNeg(),h=this.m.subn(1).iushrn(1),f=this.m.bitLength();for(f=new s(2*f*f).toRed(this);0!==this.pow(f,h).cmp(u);)f.redIAdd(u);for(var c=this.pow(f,n),l=this.pow(t,n.addn(1).iushrn(1)),d=this.pow(t,n),p=o;0!==d.cmp(a);){for(var m=d,y=0;0!==m.cmp(a);y++)m=m.redSqr();i(y=0;n--){for(var h=e.words[n],f=u-1;f>=0;f--){var c=h>>f&1;i!==r[0]&&(i=this.sqr(i)),0!==c||0!==o?(o<<=1,o|=c,(4===++a||0===n&&0===f)&&(i=this.mul(i,r[o]),a=0,o=0)):a=0}u=26}return i},x.prototype.convertTo=function(t){var e=t.umod(this.m);return e===t?e.clone():e},x.prototype.convertFrom=function(t){var e=t.clone();return e.red=null,e},s.mont=function(t){return new k(t)},o(k,x),k.prototype.convertTo=function(t){return this.imod(t.ushln(this.shift))},k.prototype.convertFrom=function(t){var e=this.imod(t.mul(this.rinv));return e.red=null,e},k.prototype.imul=function(t,e){if(t.isZero()||e.isZero())return t.words[0]=0,t.length=1,t;var r=t.imul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},k.prototype.mul=function(t,e){if(t.isZero()||e.isZero())return new s(0)._forceRed(this);var r=t.mul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},k.prototype.invm=function(t){return this.imod(t._invmp(this.m).mul(this.r2))._forceRed(this)}}(t,this)}).call(this,r(22)(t))},function(t,e,r){"use strict";var n=e;n.version=r(234).version,n.utils=r(12),n.rand=r(69),n.curve=r(129),n.curves=r(74),n.ec=r(246),n.eddsa=r(250)},function(t,e,r){"use strict";var n,i=e,o=r(75),s=r(129),a=r(12).assert;function u(t){"short"===t.type?this.curve=new s.short(t):"edwards"===t.type?this.curve=new s.edwards(t):this.curve=new s.mont(t),this.g=this.curve.g,this.n=this.curve.n,this.hash=t.hash,a(this.g.validate(),"Invalid curve"),a(this.g.mul(this.n).isInfinity(),"Invalid curve, G*N != O")}function h(t,e){Object.defineProperty(i,t,{configurable:!0,enumerable:!0,get:function(){var r=new u(e);return Object.defineProperty(i,t,{configurable:!0,enumerable:!0,value:r}),r}})}i.PresetCurve=u,h("p192",{type:"short",prime:"p192",p:"ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff",a:"ffffffff ffffffff ffffffff fffffffe ffffffff fffffffc",b:"64210519 e59c80e7 0fa7e9ab 72243049 feb8deec c146b9b1",n:"ffffffff ffffffff ffffffff 99def836 146bc9b1 b4d22831",hash:o.sha256,gRed:!1,g:["188da80e b03090f6 7cbf20eb 43a18800 f4ff0afd 82ff1012","07192b95 ffc8da78 631011ed 6b24cdd5 73f977a1 1e794811"]}),h("p224",{type:"short",prime:"p224",p:"ffffffff ffffffff ffffffff ffffffff 00000000 00000000 00000001",a:"ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff fffffffe",b:"b4050a85 0c04b3ab f5413256 5044b0b7 d7bfd8ba 270b3943 2355ffb4",n:"ffffffff ffffffff ffffffff ffff16a2 e0b8f03e 13dd2945 5c5c2a3d",hash:o.sha256,gRed:!1,g:["b70e0cbd 6bb4bf7f 321390b9 4a03c1d3 56c21122 343280d6 115c1d21","bd376388 b5f723fb 4c22dfe6 cd4375a0 5a074764 44d58199 85007e34"]}),h("p256",{type:"short",prime:null,p:"ffffffff 00000001 00000000 00000000 00000000 ffffffff ffffffff ffffffff",a:"ffffffff 00000001 00000000 00000000 00000000 ffffffff ffffffff fffffffc",b:"5ac635d8 aa3a93e7 b3ebbd55 769886bc 
651d06b0 cc53b0f6 3bce3c3e 27d2604b",n:"ffffffff 00000000 ffffffff ffffffff bce6faad a7179e84 f3b9cac2 fc632551",hash:o.sha256,gRed:!1,g:["6b17d1f2 e12c4247 f8bce6e5 63a440f2 77037d81 2deb33a0 f4a13945 d898c296","4fe342e2 fe1a7f9b 8ee7eb4a 7c0f9e16 2bce3357 6b315ece cbb64068 37bf51f5"]}),h("p384",{type:"short",prime:null,p:"ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe ffffffff 00000000 00000000 ffffffff",a:"ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe ffffffff 00000000 00000000 fffffffc",b:"b3312fa7 e23ee7e4 988e056b e3f82d19 181d9c6e fe814112 0314088f 5013875a c656398d 8a2ed19d 2a85c8ed d3ec2aef",n:"ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff c7634d81 f4372ddf 581a0db2 48b0a77a ecec196a ccc52973",hash:o.sha384,gRed:!1,g:["aa87ca22 be8b0537 8eb1c71e f320ad74 6e1d3b62 8ba79b98 59f741e0 82542a38 5502f25d bf55296c 3a545e38 72760ab7","3617de4a 96262c6f 5d9e98bf 9292dc29 f8f41dbd 289a147c e9da3113 b5f0b8c0 0a60b1ce 1d7e819d 7a431d7c 90ea0e5f"]}),h("p521",{type:"short",prime:null,p:"000001ff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff",a:"000001ff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffc",b:"00000051 953eb961 8e1c9a1f 929a21a0 b68540ee a2da725b 99b315f3 b8b48991 8ef109e1 56193951 ec7e937b 1652c0bd 3bb1bf07 3573df88 3d2c34f1 ef451fd4 6b503f00",n:"000001ff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffa 51868783 bf2f966b 7fcc0148 f709a5d0 3bb5c9b8 899c47ae bb6fb71e 91386409",hash:o.sha512,gRed:!1,g:["000000c6 858e06b7 0404e9cd 9e3ecb66 2395b442 9c648139 053fb521 f828af60 6b4d3dba a14b5e77 efe75928 fe1dc127 a2ffa8de 3348b3c1 856a429b f97e7e31 c2e5bd66","00000118 39296a78 9a3bc004 5c8a5fb4 2c7d1bd9 98f54449 579b4468 17afbd17 273e662c 97ee7299 5ef42640 c550b901 3fad0761 353c7086 a272c240 88be9476 9fd16650"]}),h("curve25519",{type:"mont",prime:"p25519",p:"7fffffffffffffff ffffffffffffffff ffffffffffffffff ffffffffffffffed",a:"76d06",b:"1",n:"1000000000000000 0000000000000000 14def9dea2f79cd6 5812631a5cf5d3ed",hash:o.sha256,gRed:!1,g:["9"]}),h("ed25519",{type:"edwards",prime:"p25519",p:"7fffffffffffffff ffffffffffffffff ffffffffffffffff ffffffffffffffed",a:"-1",c:"1",d:"52036cee2b6ffe73 8cc740797779e898 00700a4d4141d8ab 75eb4dca135978a3",n:"1000000000000000 0000000000000000 14def9dea2f79cd6 5812631a5cf5d3ed",hash:o.sha256,gRed:!1,g:["216936d3cd6e53fec0a4e231fdd6dc5c692cc7609525a7b2c9562d608f25d51a","6666666666666666666666666666666666666666666666666666666666666658"]});try{n=r(245)}catch(t){n=void 0}h("secp256k1",{type:"short",prime:"k256",p:"ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe fffffc2f",a:"0",b:"7",n:"ffffffff ffffffff ffffffff fffffffe baaedce6 af48a03b bfd25e8c d0364141",h:"1",hash:o.sha256,beta:"7ae96a2b657c07106e64479eac3434e99cf0497512f58995c1396c28719501ee",lambda:"5363ad4cc05c30e0a5261c028812645a122e22ea20816678df02967c1b23bd72",basis:[{a:"3086d221a7d46bcde86c90e49284eb15",b:"-e4437ed6010e88286f547fa90abfe4c3"},{a:"114ca50f7a8e2f3f657c1108d9d44cfd8",b:"3086d221a7d46bcde86c90e49284eb15"}],gRed:!1,g:["79be667ef9dcbbac55a06295ce870b07029bfcdb2dce28d959f2815b16f81798","483ada7726a3c4655da4fbfc0e1108a8fd17b448a68554199c47d08ffb10d4b8",n]})},function(t,e,r){var 
n=e;n.utils=r(17),n.common=r(40),n.sha=r(239),n.ripemd=r(243),n.hmac=r(244),n.sha1=n.sha.sha1,n.sha256=n.sha.sha256,n.sha224=n.sha.sha224,n.sha384=n.sha.sha384,n.sha512=n.sha.sha512,n.ripemd160=n.ripemd.ripemd160},function(t,e,r){"use strict";(function(e){function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i,o=r(3),s=o.Buffer,a={};for(i in o)o.hasOwnProperty(i)&&"SlowBuffer"!==i&&"Buffer"!==i&&(a[i]=o[i]);var u=a.Buffer={};for(i in s)s.hasOwnProperty(i)&&"allocUnsafe"!==i&&"allocUnsafeSlow"!==i&&(u[i]=s[i]);if(a.Buffer.prototype=s.prototype,u.from&&u.from!==Uint8Array.from||(u.from=function(t,e,r){if("number"==typeof t)throw new TypeError('The "value" argument must not be of type number. Received type '+n(t));if(t&&void 0===t.length)throw new TypeError("The first argument must be one of type string, Buffer, ArrayBuffer, Array, or Array-like Object. Received type "+n(t));return s(t,e,r)}),u.alloc||(u.alloc=function(t,e,r){if("number"!=typeof t)throw new TypeError('The "size" argument must be of type number. Received type '+n(t));if(t<0||t>=2*(1<<30))throw new RangeError('The value "'+t+'" is invalid for option "size"');var i=s(t);return e&&0!==e.length?"string"==typeof r?i.fill(e,r):i.fill(e):i.fill(0),i}),!a.kStringMaxLength)try{a.kStringMaxLength=e.binding("buffer").kStringMaxLength}catch(t){}a.constants||(a.constants={MAX_LENGTH:a.kMaxLength},a.kStringMaxLength&&(a.constants.MAX_STRING_LENGTH=a.kStringMaxLength)),t.exports=a}).call(this,r(5))},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i=r(78).Reporter,o=r(41).EncoderBuffer,s=r(41).DecoderBuffer,a=r(11),u=["seq","seqof","set","setof","objid","bool","gentime","utctime","null_","enum","int","objDesc","bitstr","bmpstr","charstr","genstr","graphstr","ia5str","iso646str","numstr","octstr","printstr","t61str","unistr","utf8str","videostr"],h=["key","obj","use","optional","explicit","implicit","def","choice","any","contains"].concat(u);function f(t,e,r){var n={};this._baseState=n,n.name=r,n.enc=t,n.parent=e||null,n.children=null,n.tag=null,n.args=null,n.reverseArgs=null,n.choice=null,n.optional=!1,n.any=!1,n.obj=!1,n.use=null,n.useDecoder=null,n.key=null,n.default=null,n.explicit=null,n.implicit=null,n.contains=null,n.parent||(n.children=[],this._wrap())}t.exports=f;var c=["enc","parent","children","tag","args","reverseArgs","choice","optional","any","obj","use","alteredUse","key","default","explicit","implicit","contains"];f.prototype.clone=function(){var t=this._baseState,e={};c.forEach((function(r){e[r]=t[r]}));var r=new this.constructor(e.parent);return r._baseState=e,r},f.prototype._wrap=function(){var t=this._baseState;h.forEach((function(e){this[e]=function(){var r=new this.constructor(this);return t.children.push(r),r[e].apply(r,arguments)}}),this)},f.prototype._init=function(t){var e=this._baseState;a(null===e.parent),t.call(this),e.children=e.children.filter((function(t){return t._baseState.parent===this}),this),a.equal(e.children.length,1,"Root node can have only one child")},f.prototype._useArgs=function(t){var e=this._baseState,r=t.filter((function(t){return t instanceof this.constructor}),this);t=t.filter((function(t){return!(t instanceof 
this.constructor)}),this),0!==r.length&&(a(null===e.children),e.children=r,r.forEach((function(t){t._baseState.parent=this}),this)),0!==t.length&&(a(null===e.args),e.args=t,e.reverseArgs=t.map((function(t){if("object"!==n(t)||t.constructor!==Object)return t;var e={};return Object.keys(t).forEach((function(r){r==(0|r)&&(r|=0);var n=t[r];e[n]=r})),e})))},["_peekTag","_decodeTag","_use","_decodeStr","_decodeObjid","_decodeTime","_decodeNull","_decodeInt","_decodeBool","_decodeList","_encodeComposite","_encodeStr","_encodeObjid","_encodeTime","_encodeNull","_encodeInt","_encodeBool"].forEach((function(t){f.prototype[t]=function(){var e=this._baseState;throw new Error(t+" not implemented for encoding: "+e.enc)}})),u.forEach((function(t){f.prototype[t]=function(){var e=this._baseState,r=Array.prototype.slice.call(arguments);return a(null===e.tag),e.tag=t,this._useArgs(r),this}})),f.prototype.use=function(t){a(t);var e=this._baseState;return a(null===e.use),e.use=t,this},f.prototype.optional=function(){return this._baseState.optional=!0,this},f.prototype.def=function(t){var e=this._baseState;return a(null===e.default),e.default=t,e.optional=!0,this},f.prototype.explicit=function(t){var e=this._baseState;return a(null===e.explicit&&null===e.implicit),e.explicit=t,this},f.prototype.implicit=function(t){var e=this._baseState;return a(null===e.explicit&&null===e.implicit),e.implicit=t,this},f.prototype.obj=function(){var t=this._baseState,e=Array.prototype.slice.call(arguments);return t.obj=!0,0!==e.length&&this._useArgs(e),this},f.prototype.key=function(t){var e=this._baseState;return a(null===e.key),e.key=t,this},f.prototype.any=function(){return this._baseState.any=!0,this},f.prototype.choice=function(t){var e=this._baseState;return a(null===e.choice),e.choice=t,this._useArgs(Object.keys(t).map((function(e){return t[e]}))),this},f.prototype.contains=function(t){var e=this._baseState;return a(null===e.use),e.contains=t,this},f.prototype._decode=function(t,e){var r=this._baseState;if(null===r.parent)return t.wrapResult(r.children[0]._decode(t,e));var n,i=r.default,o=!0,a=null;if(null!==r.key&&(a=t.enterKey(r.key)),r.optional){var u=null;if(null!==r.explicit?u=r.explicit:null!==r.implicit?u=r.implicit:null!==r.tag&&(u=r.tag),null!==u||r.any){if(o=this._peekTag(t,u,r.any),t.isError(o))return o}else{var h=t.save();try{null===r.choice?this._decodeGeneric(r.tag,t,e):this._decodeChoice(t,e),o=!0}catch(t){o=!1}t.restore(h)}}if(r.obj&&o&&(n=t.enterObject()),o){if(null!==r.explicit){var f=this._decodeTag(t,r.explicit);if(t.isError(f))return f;t=f}var c=t.offset;if(null===r.use&&null===r.choice){var l;r.any&&(l=t.save());var d=this._decodeTag(t,null!==r.implicit?r.implicit:r.tag,r.any);if(t.isError(d))return d;r.any?i=t.raw(l):t=d}if(e&&e.track&&null!==r.tag&&e.track(t.path(),c,t.length,"tagged"),e&&e.track&&null!==r.tag&&e.track(t.path(),t.offset,t.length,"content"),r.any||(i=null===r.choice?this._decodeGeneric(r.tag,t,e):this._decodeChoice(t,e)),t.isError(i))return i;if(r.any||null!==r.choice||null===r.children||r.children.forEach((function(r){r._decode(t,e)})),r.contains&&("octstr"===r.tag||"bitstr"===r.tag)){var p=new s(i);i=this._getUse(r.contains,t._reporterState.obj)._decode(p,e)}}return r.obj&&o&&(i=t.leaveObject(n)),null===r.key||null===i&&!0!==o?null!==a&&t.exitKey(a):t.leaveKey(a,r.key,i),i},f.prototype._decodeGeneric=function(t,e,r){var 
n=this._baseState;return"seq"===t||"set"===t?null:"seqof"===t||"setof"===t?this._decodeList(e,t,n.args[0],r):/str$/.test(t)?this._decodeStr(e,t,r):"objid"===t&&n.args?this._decodeObjid(e,n.args[0],n.args[1],r):"objid"===t?this._decodeObjid(e,null,null,r):"gentime"===t||"utctime"===t?this._decodeTime(e,t,r):"null_"===t?this._decodeNull(e,r):"bool"===t?this._decodeBool(e,r):"objDesc"===t?this._decodeStr(e,t,r):"int"===t||"enum"===t?this._decodeInt(e,n.args&&n.args[0],r):null!==n.use?this._getUse(n.use,e._reporterState.obj)._decode(e,r):e.error("unknown tag: "+t)},f.prototype._getUse=function(t,e){var r=this._baseState;return r.useDecoder=this._use(t,e),a(null===r.useDecoder._baseState.parent),r.useDecoder=r.useDecoder._baseState.children[0],r.implicit!==r.useDecoder._baseState.implicit&&(r.useDecoder=r.useDecoder.clone(),r.useDecoder._baseState.implicit=r.implicit),r.useDecoder},f.prototype._decodeChoice=function(t,e){var r=this._baseState,n=null,i=!1;return Object.keys(r.choice).some((function(o){var s=t.save(),a=r.choice[o];try{var u=a._decode(t,e);if(t.isError(u))return!1;n={type:o,value:u},i=!0}catch(e){return t.restore(s),!1}return!0}),this),i?n:t.error("Choice not matched")},f.prototype._createEncoderBuffer=function(t){return new o(t,this.reporter)},f.prototype._encode=function(t,e,r){var n=this._baseState;if(null===n.default||n.default!==t){var i=this._encodeValue(t,e,r);if(void 0!==i&&!this._skipDefault(i,e,r))return i}},f.prototype._encodeValue=function(t,e,r){var o=this._baseState;if(null===o.parent)return o.children[0]._encode(t,e||new i);var s=null;if(this.reporter=e,o.optional&&void 0===t){if(null===o.default)return;t=o.default}var a=null,u=!1;if(o.any)s=this._createEncoderBuffer(t);else if(o.choice)s=this._encodeChoice(t,e);else if(o.contains)a=this._getUse(o.contains,r)._encode(t,e),u=!0;else if(o.children)a=o.children.map((function(r){if("null_"===r._baseState.tag)return r._encode(null,e,t);if(null===r._baseState.key)return e.error("Child should have a key");var i=e.enterKey(r._baseState.key);if("object"!==n(t))return e.error("Child expected, but input is not object");var o=r._encode(t[r._baseState.key],e,t);return e.leaveKey(i),o}),this).filter((function(t){return t})),a=this._createEncoderBuffer(a);else if("seqof"===o.tag||"setof"===o.tag){if(!o.args||1!==o.args.length)return e.error("Too many args for : "+o.tag);if(!Array.isArray(t))return e.error("seqof/setof, but data is not Array");var h=this.clone();h._baseState.implicit=null,a=this._createEncoderBuffer(t.map((function(r){var n=this._baseState;return this._getUse(n.args[0],t)._encode(r,e)}),h))}else null!==o.use?s=this._getUse(o.use,r)._encode(t,e):(a=this._encodePrimitive(o.tag,t),u=!0);if(!o.any&&null===o.choice){var f=null!==o.implicit?o.implicit:o.tag,c=null===o.implicit?"universal":"context";null===f?null===o.use&&e.error("Tag could be omitted only for .use()"):null===o.use&&(s=this._encodeComposite(f,u,c,a))}return null!==o.explicit&&(s=this._encodeComposite(o.explicit,!1,"context",s)),s},f.prototype._encodeChoice=function(t,e){var r=this._baseState,n=r.choice[t.type];return n||a(!1,t.type+" not found in "+JSON.stringify(Object.keys(r.choice))),n._encode(t.value,e)},f.prototype._encodePrimitive=function(t,e){var r=this._baseState;if(/str$/.test(t))return this._encodeStr(e,t);if("objid"===t&&r.args)return this._encodeObjid(e,r.reverseArgs[0],r.args[1]);if("objid"===t)return this._encodeObjid(e,null,null);if("gentime"===t||"utctime"===t)return this._encodeTime(e,t);if("null_"===t)return 
this._encodeNull();if("int"===t||"enum"===t)return this._encodeInt(e,r.args&&r.reverseArgs[0]);if("bool"===t)return this._encodeBool(e);if("objDesc"===t)return this._encodeStr(e,t);throw new Error("Unsupported tag: "+t)},f.prototype._isNumstr=function(t){return/^[0-9 ]*$/.test(t)},f.prototype._isPrintstr=function(t){return/^[A-Za-z0-9 '()+,-./:=?]*$/.test(t)}},function(t,e,r){"use strict";var n=r(0);function i(t){this._reporterState={obj:null,path:[],options:t||{},errors:[]}}function o(t,e){this.path=t,this.rethrow(e)}e.Reporter=i,i.prototype.isError=function(t){return t instanceof o},i.prototype.save=function(){var t=this._reporterState;return{obj:t.obj,pathLen:t.path.length}},i.prototype.restore=function(t){var e=this._reporterState;e.obj=t.obj,e.path=e.path.slice(0,t.pathLen)},i.prototype.enterKey=function(t){return this._reporterState.path.push(t)},i.prototype.exitKey=function(t){var e=this._reporterState;e.path=e.path.slice(0,t-1)},i.prototype.leaveKey=function(t,e,r){var n=this._reporterState;this.exitKey(t),null!==n.obj&&(n.obj[e]=r)},i.prototype.path=function(){return this._reporterState.path.join("/")},i.prototype.enterObject=function(){var t=this._reporterState,e=t.obj;return t.obj={},e},i.prototype.leaveObject=function(t){var e=this._reporterState,r=e.obj;return e.obj=t,r},i.prototype.error=function(t){var e,r=this._reporterState,n=t instanceof o;if(e=n?t:new o(r.path.map((function(t){return"["+JSON.stringify(t)+"]"})).join(""),t.message||t,t.stack),!r.options.partial)throw e;return n||r.errors.push(e),e},i.prototype.wrapResult=function(t){var e=this._reporterState;return e.options.partial?{result:this.isError(t)?null:t,errors:e.errors}:t},n(o,Error),o.prototype.rethrow=function(t){if(this.message=t+" at: "+(this.path||"(shallow)"),Error.captureStackTrace&&Error.captureStackTrace(this,o),!this.stack)try{throw new Error(this.message)}catch(t){this.stack=t.stack}return this}},function(t,e,r){"use strict";function n(t){var e={};return Object.keys(t).forEach((function(r){(0|r)==r&&(r|=0);var n=t[r];e[n]=r})),e}e.tagClass={0:"universal",1:"application",2:"context",3:"private"},e.tagClassByName=n(e.tagClass),e.tag={0:"end",1:"bool",2:"int",3:"bitstr",4:"octstr",5:"null_",6:"objid",7:"objDesc",8:"external",9:"real",10:"enum",11:"embed",12:"utf8str",13:"relativeOid",16:"seq",17:"set",18:"numstr",19:"printstr",20:"t61str",21:"videostr",22:"ia5str",23:"utctime",24:"gentime",25:"graphstr",26:"iso646str",27:"genstr",28:"unistr",29:"charstr",30:"bmpstr"},e.tagByName=n(e.tag)},function(t,e,r){(function(t){function e(t){return(e="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}!function(t,n){"use strict";function i(t,e){if(!t)throw new Error(e||"Assertion failed")}function o(t,e){t.super_=e;var r=function(){};r.prototype=e.prototype,t.prototype=new r,t.prototype.constructor=t}function s(t,e,r){if(s.isBN(t))return t;this.negative=0,this.words=null,this.length=0,this.red=null,null!==t&&("le"!==e&&"be"!==e||(r=e,e=10),this._init(t||0,e||10,r||"be"))}var a;"object"===e(t)?t.exports=s:n.BN=s,s.BN=s,s.wordSize=26;try{a="undefined"!=typeof window&&void 0!==window.Buffer?window.Buffer:r(269).Buffer}catch(t){}function u(t,e){var r=t.charCodeAt(e);return r>=65&&r<=70?r-55:r>=97&&r<=102?r-87:r-48&15}function h(t,e,r){var n=u(t,r);return r-1>=e&&(n|=u(t,r-1)<<4),n}function f(t,e,r,n){for(var 
i=0,o=Math.min(t.length,r),s=e;s=49?a-49+10:a>=17?a-17+10:a}return i}s.isBN=function(t){return t instanceof s||null!==t&&"object"===e(t)&&t.constructor.wordSize===s.wordSize&&Array.isArray(t.words)},s.max=function(t,e){return t.cmp(e)>0?t:e},s.min=function(t,e){return t.cmp(e)<0?t:e},s.prototype._init=function(t,r,n){if("number"==typeof t)return this._initNumber(t,r,n);if("object"===e(t))return this._initArray(t,r,n);"hex"===r&&(r=16),i(r===(0|r)&&r>=2&&r<=36);var o=0;"-"===(t=t.toString().replace(/\s+/g,""))[0]&&(o++,this.negative=1),o=0;n-=3)s=t[n]|t[n-1]<<8|t[n-2]<<16,this.words[o]|=s<>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);else if("le"===r)for(n=0,o=0;n>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);return this.strip()},s.prototype._parseHex=function(t,e,r){this.length=Math.ceil((t.length-e)/6),this.words=new Array(this.length);for(var n=0;n=e;n-=2)i=h(t,e,n)<=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;else for(n=(t.length-e)%2==0?e+1:e;n=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;this.strip()},s.prototype._parseBase=function(t,e,r){this.words=[0],this.length=1;for(var n=0,i=1;i<=67108863;i*=e)n++;n--,i=i/e|0;for(var o=t.length-r,s=o%n,a=Math.min(o,o-s)+r,u=0,h=r;h1&&0===this.words[this.length-1];)this.length--;return this._normSign()},s.prototype._normSign=function(){return 1===this.length&&0===this.words[0]&&(this.negative=0),this},s.prototype.inspect=function(){return(this.red?""};var c=["","0","00","000","0000","00000","000000","0000000","00000000","000000000","0000000000","00000000000","000000000000","0000000000000","00000000000000","000000000000000","0000000000000000","00000000000000000","000000000000000000","0000000000000000000","00000000000000000000","000000000000000000000","0000000000000000000000","00000000000000000000000","000000000000000000000000","0000000000000000000000000"],l=[0,0,25,16,12,11,10,9,8,8,7,7,7,7,6,6,6,6,6,6,6,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5],d=[0,0,33554432,43046721,16777216,48828125,60466176,40353607,16777216,43046721,1e7,19487171,35831808,62748517,7529536,11390625,16777216,24137569,34012224,47045881,64e6,4084101,5153632,6436343,7962624,9765625,11881376,14348907,17210368,20511149,243e5,28629151,33554432,39135393,45435424,52521875,60466176];function p(t,e,r){r.negative=e.negative^t.negative;var n=t.length+e.length|0;r.length=n,n=n-1|0;var i=0|t.words[0],o=0|e.words[0],s=i*o,a=67108863&s,u=s/67108864|0;r.words[0]=a;for(var h=1;h>>26,c=67108863&u,l=Math.min(h,e.length-1),d=Math.max(0,h-t.length+1);d<=l;d++){var p=h-d|0;f+=(s=(i=0|t.words[p])*(o=0|e.words[d])+c)/67108864|0,c=67108863&s}r.words[h]=0|c,u=0|f}return 0!==u?r.words[h]=0|u:r.length--,r.strip()}s.prototype.toString=function(t,e){var r;if(e=0|e||1,16===(t=t||10)||"hex"===t){r="";for(var n=0,o=0,s=0;s>>24-n&16777215)||s!==this.length-1?c[6-u.length]+u+r:u+r,(n+=2)>=26&&(n-=26,s--)}for(0!==o&&(r=o.toString(16)+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}if(t===(0|t)&&t>=2&&t<=36){var h=l[t],f=d[t];r="";var p=this.clone();for(p.negative=0;!p.isZero();){var m=p.modn(f).toString(t);r=(p=p.idivn(f)).isZero()?m+r:c[h-m.length]+m+r}for(this.isZero()&&(r="0"+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}i(!1,"Base should be between 2 and 36")},s.prototype.toNumber=function(){var t=this.words[0];return 2===this.length?t+=67108864*this.words[1]:3===this.length&&1===this.words[2]?t+=4503599627370496+67108864*this.words[1]:this.length>2&&i(!1,"Number can only safely store up to 53 bits"),0!==this.negative?-t:t},s.prototype.toJSON=function(){return 
this.toString(16)},s.prototype.toBuffer=function(t,e){return i(void 0!==a),this.toArrayLike(a,t,e)},s.prototype.toArray=function(t,e){return this.toArrayLike(Array,t,e)},s.prototype.toArrayLike=function(t,e,r){var n=this.byteLength(),o=r||Math.max(1,n);i(n<=o,"byte array longer than desired length"),i(o>0,"Requested array length <= 0"),this.strip();var s,a,u="le"===e,h=new t(o),f=this.clone();if(u){for(a=0;!f.isZero();a++)s=f.andln(255),f.iushrn(8),h[a]=s;for(;a=4096&&(r+=13,e>>>=13),e>=64&&(r+=7,e>>>=7),e>=8&&(r+=4,e>>>=4),e>=2&&(r+=2,e>>>=2),r+e},s.prototype._zeroBits=function(t){if(0===t)return 26;var e=t,r=0;return 0==(8191&e)&&(r+=13,e>>>=13),0==(127&e)&&(r+=7,e>>>=7),0==(15&e)&&(r+=4,e>>>=4),0==(3&e)&&(r+=2,e>>>=2),0==(1&e)&&r++,r},s.prototype.bitLength=function(){var t=this.words[this.length-1],e=this._countBits(t);return 26*(this.length-1)+e},s.prototype.zeroBits=function(){if(this.isZero())return 0;for(var t=0,e=0;et.length?this.clone().ior(t):t.clone().ior(this)},s.prototype.uor=function(t){return this.length>t.length?this.clone().iuor(t):t.clone().iuor(this)},s.prototype.iuand=function(t){var e;e=this.length>t.length?t:this;for(var r=0;rt.length?this.clone().iand(t):t.clone().iand(this)},s.prototype.uand=function(t){return this.length>t.length?this.clone().iuand(t):t.clone().iuand(this)},s.prototype.iuxor=function(t){var e,r;this.length>t.length?(e=this,r=t):(e=t,r=this);for(var n=0;nt.length?this.clone().ixor(t):t.clone().ixor(this)},s.prototype.uxor=function(t){return this.length>t.length?this.clone().iuxor(t):t.clone().iuxor(this)},s.prototype.inotn=function(t){i("number"==typeof t&&t>=0);var e=0|Math.ceil(t/26),r=t%26;this._expand(e),r>0&&e--;for(var n=0;n0&&(this.words[n]=~this.words[n]&67108863>>26-r),this.strip()},s.prototype.notn=function(t){return this.clone().inotn(t)},s.prototype.setn=function(t,e){i("number"==typeof t&&t>=0);var r=t/26|0,n=t%26;return this._expand(r+1),this.words[r]=e?this.words[r]|1<t.length?(r=this,n=t):(r=t,n=this);for(var i=0,o=0;o>>26;for(;0!==i&&o>>26;if(this.length=r.length,0!==i)this.words[this.length]=i,this.length++;else if(r!==this)for(;ot.length?this.clone().iadd(t):t.clone().iadd(this)},s.prototype.isub=function(t){if(0!==t.negative){t.negative=0;var e=this.iadd(t);return t.negative=1,e._normSign()}if(0!==this.negative)return this.negative=0,this.iadd(t),this.negative=1,this._normSign();var r,n,i=this.cmp(t);if(0===i)return this.negative=0,this.length=1,this.words[0]=0,this;i>0?(r=this,n=t):(r=t,n=this);for(var o=0,s=0;s>26,this.words[s]=67108863&e;for(;0!==o&&s>26,this.words[s]=67108863&e;if(0===o&&s>>13,d=0|s[1],p=8191&d,m=d>>>13,y=0|s[2],v=8191&y,b=y>>>13,g=0|s[3],w=8191&g,_=g>>>13,M=0|s[4],S=8191&M,O=M>>>13,A=0|s[5],E=8191&A,x=A>>>13,k=0|s[6],j=8191&k,$=k>>>13,P=0|s[7],R=8191&P,B=P>>>13,T=0|s[8],I=8191&T,N=T>>>13,D=0|s[9],C=8191&D,L=D>>>13,q=0|a[0],U=8191&q,F=q>>>13,z=0|a[1],V=8191&z,K=z>>>13,H=0|a[2],Z=8191&H,W=H>>>13,J=0|a[3],Y=8191&J,Q=J>>>13,G=0|a[4],X=8191&G,tt=G>>>13,et=0|a[5],rt=8191&et,nt=et>>>13,it=0|a[6],ot=8191&it,st=it>>>13,at=0|a[7],ut=8191&at,ht=at>>>13,ft=0|a[8],ct=8191&ft,lt=ft>>>13,dt=0|a[9],pt=8191&dt,mt=dt>>>13;r.negative=t.negative^e.negative,r.length=19;var yt=(h+(n=Math.imul(c,U))|0)+((8191&(i=(i=Math.imul(c,F))+Math.imul(l,U)|0))<<13)|0;h=((o=Math.imul(l,F))+(i>>>13)|0)+(yt>>>26)|0,yt&=67108863,n=Math.imul(p,U),i=(i=Math.imul(p,F))+Math.imul(m,U)|0,o=Math.imul(m,F);var 
vt=(h+(n=n+Math.imul(c,V)|0)|0)+((8191&(i=(i=i+Math.imul(c,K)|0)+Math.imul(l,V)|0))<<13)|0;h=((o=o+Math.imul(l,K)|0)+(i>>>13)|0)+(vt>>>26)|0,vt&=67108863,n=Math.imul(v,U),i=(i=Math.imul(v,F))+Math.imul(b,U)|0,o=Math.imul(b,F),n=n+Math.imul(p,V)|0,i=(i=i+Math.imul(p,K)|0)+Math.imul(m,V)|0,o=o+Math.imul(m,K)|0;var bt=(h+(n=n+Math.imul(c,Z)|0)|0)+((8191&(i=(i=i+Math.imul(c,W)|0)+Math.imul(l,Z)|0))<<13)|0;h=((o=o+Math.imul(l,W)|0)+(i>>>13)|0)+(bt>>>26)|0,bt&=67108863,n=Math.imul(w,U),i=(i=Math.imul(w,F))+Math.imul(_,U)|0,o=Math.imul(_,F),n=n+Math.imul(v,V)|0,i=(i=i+Math.imul(v,K)|0)+Math.imul(b,V)|0,o=o+Math.imul(b,K)|0,n=n+Math.imul(p,Z)|0,i=(i=i+Math.imul(p,W)|0)+Math.imul(m,Z)|0,o=o+Math.imul(m,W)|0;var gt=(h+(n=n+Math.imul(c,Y)|0)|0)+((8191&(i=(i=i+Math.imul(c,Q)|0)+Math.imul(l,Y)|0))<<13)|0;h=((o=o+Math.imul(l,Q)|0)+(i>>>13)|0)+(gt>>>26)|0,gt&=67108863,n=Math.imul(S,U),i=(i=Math.imul(S,F))+Math.imul(O,U)|0,o=Math.imul(O,F),n=n+Math.imul(w,V)|0,i=(i=i+Math.imul(w,K)|0)+Math.imul(_,V)|0,o=o+Math.imul(_,K)|0,n=n+Math.imul(v,Z)|0,i=(i=i+Math.imul(v,W)|0)+Math.imul(b,Z)|0,o=o+Math.imul(b,W)|0,n=n+Math.imul(p,Y)|0,i=(i=i+Math.imul(p,Q)|0)+Math.imul(m,Y)|0,o=o+Math.imul(m,Q)|0;var wt=(h+(n=n+Math.imul(c,X)|0)|0)+((8191&(i=(i=i+Math.imul(c,tt)|0)+Math.imul(l,X)|0))<<13)|0;h=((o=o+Math.imul(l,tt)|0)+(i>>>13)|0)+(wt>>>26)|0,wt&=67108863,n=Math.imul(E,U),i=(i=Math.imul(E,F))+Math.imul(x,U)|0,o=Math.imul(x,F),n=n+Math.imul(S,V)|0,i=(i=i+Math.imul(S,K)|0)+Math.imul(O,V)|0,o=o+Math.imul(O,K)|0,n=n+Math.imul(w,Z)|0,i=(i=i+Math.imul(w,W)|0)+Math.imul(_,Z)|0,o=o+Math.imul(_,W)|0,n=n+Math.imul(v,Y)|0,i=(i=i+Math.imul(v,Q)|0)+Math.imul(b,Y)|0,o=o+Math.imul(b,Q)|0,n=n+Math.imul(p,X)|0,i=(i=i+Math.imul(p,tt)|0)+Math.imul(m,X)|0,o=o+Math.imul(m,tt)|0;var _t=(h+(n=n+Math.imul(c,rt)|0)|0)+((8191&(i=(i=i+Math.imul(c,nt)|0)+Math.imul(l,rt)|0))<<13)|0;h=((o=o+Math.imul(l,nt)|0)+(i>>>13)|0)+(_t>>>26)|0,_t&=67108863,n=Math.imul(j,U),i=(i=Math.imul(j,F))+Math.imul($,U)|0,o=Math.imul($,F),n=n+Math.imul(E,V)|0,i=(i=i+Math.imul(E,K)|0)+Math.imul(x,V)|0,o=o+Math.imul(x,K)|0,n=n+Math.imul(S,Z)|0,i=(i=i+Math.imul(S,W)|0)+Math.imul(O,Z)|0,o=o+Math.imul(O,W)|0,n=n+Math.imul(w,Y)|0,i=(i=i+Math.imul(w,Q)|0)+Math.imul(_,Y)|0,o=o+Math.imul(_,Q)|0,n=n+Math.imul(v,X)|0,i=(i=i+Math.imul(v,tt)|0)+Math.imul(b,X)|0,o=o+Math.imul(b,tt)|0,n=n+Math.imul(p,rt)|0,i=(i=i+Math.imul(p,nt)|0)+Math.imul(m,rt)|0,o=o+Math.imul(m,nt)|0;var Mt=(h+(n=n+Math.imul(c,ot)|0)|0)+((8191&(i=(i=i+Math.imul(c,st)|0)+Math.imul(l,ot)|0))<<13)|0;h=((o=o+Math.imul(l,st)|0)+(i>>>13)|0)+(Mt>>>26)|0,Mt&=67108863,n=Math.imul(R,U),i=(i=Math.imul(R,F))+Math.imul(B,U)|0,o=Math.imul(B,F),n=n+Math.imul(j,V)|0,i=(i=i+Math.imul(j,K)|0)+Math.imul($,V)|0,o=o+Math.imul($,K)|0,n=n+Math.imul(E,Z)|0,i=(i=i+Math.imul(E,W)|0)+Math.imul(x,Z)|0,o=o+Math.imul(x,W)|0,n=n+Math.imul(S,Y)|0,i=(i=i+Math.imul(S,Q)|0)+Math.imul(O,Y)|0,o=o+Math.imul(O,Q)|0,n=n+Math.imul(w,X)|0,i=(i=i+Math.imul(w,tt)|0)+Math.imul(_,X)|0,o=o+Math.imul(_,tt)|0,n=n+Math.imul(v,rt)|0,i=(i=i+Math.imul(v,nt)|0)+Math.imul(b,rt)|0,o=o+Math.imul(b,nt)|0,n=n+Math.imul(p,ot)|0,i=(i=i+Math.imul(p,st)|0)+Math.imul(m,ot)|0,o=o+Math.imul(m,st)|0;var 
St=(h+(n=n+Math.imul(c,ut)|0)|0)+((8191&(i=(i=i+Math.imul(c,ht)|0)+Math.imul(l,ut)|0))<<13)|0;h=((o=o+Math.imul(l,ht)|0)+(i>>>13)|0)+(St>>>26)|0,St&=67108863,n=Math.imul(I,U),i=(i=Math.imul(I,F))+Math.imul(N,U)|0,o=Math.imul(N,F),n=n+Math.imul(R,V)|0,i=(i=i+Math.imul(R,K)|0)+Math.imul(B,V)|0,o=o+Math.imul(B,K)|0,n=n+Math.imul(j,Z)|0,i=(i=i+Math.imul(j,W)|0)+Math.imul($,Z)|0,o=o+Math.imul($,W)|0,n=n+Math.imul(E,Y)|0,i=(i=i+Math.imul(E,Q)|0)+Math.imul(x,Y)|0,o=o+Math.imul(x,Q)|0,n=n+Math.imul(S,X)|0,i=(i=i+Math.imul(S,tt)|0)+Math.imul(O,X)|0,o=o+Math.imul(O,tt)|0,n=n+Math.imul(w,rt)|0,i=(i=i+Math.imul(w,nt)|0)+Math.imul(_,rt)|0,o=o+Math.imul(_,nt)|0,n=n+Math.imul(v,ot)|0,i=(i=i+Math.imul(v,st)|0)+Math.imul(b,ot)|0,o=o+Math.imul(b,st)|0,n=n+Math.imul(p,ut)|0,i=(i=i+Math.imul(p,ht)|0)+Math.imul(m,ut)|0,o=o+Math.imul(m,ht)|0;var Ot=(h+(n=n+Math.imul(c,ct)|0)|0)+((8191&(i=(i=i+Math.imul(c,lt)|0)+Math.imul(l,ct)|0))<<13)|0;h=((o=o+Math.imul(l,lt)|0)+(i>>>13)|0)+(Ot>>>26)|0,Ot&=67108863,n=Math.imul(C,U),i=(i=Math.imul(C,F))+Math.imul(L,U)|0,o=Math.imul(L,F),n=n+Math.imul(I,V)|0,i=(i=i+Math.imul(I,K)|0)+Math.imul(N,V)|0,o=o+Math.imul(N,K)|0,n=n+Math.imul(R,Z)|0,i=(i=i+Math.imul(R,W)|0)+Math.imul(B,Z)|0,o=o+Math.imul(B,W)|0,n=n+Math.imul(j,Y)|0,i=(i=i+Math.imul(j,Q)|0)+Math.imul($,Y)|0,o=o+Math.imul($,Q)|0,n=n+Math.imul(E,X)|0,i=(i=i+Math.imul(E,tt)|0)+Math.imul(x,X)|0,o=o+Math.imul(x,tt)|0,n=n+Math.imul(S,rt)|0,i=(i=i+Math.imul(S,nt)|0)+Math.imul(O,rt)|0,o=o+Math.imul(O,nt)|0,n=n+Math.imul(w,ot)|0,i=(i=i+Math.imul(w,st)|0)+Math.imul(_,ot)|0,o=o+Math.imul(_,st)|0,n=n+Math.imul(v,ut)|0,i=(i=i+Math.imul(v,ht)|0)+Math.imul(b,ut)|0,o=o+Math.imul(b,ht)|0,n=n+Math.imul(p,ct)|0,i=(i=i+Math.imul(p,lt)|0)+Math.imul(m,ct)|0,o=o+Math.imul(m,lt)|0;var At=(h+(n=n+Math.imul(c,pt)|0)|0)+((8191&(i=(i=i+Math.imul(c,mt)|0)+Math.imul(l,pt)|0))<<13)|0;h=((o=o+Math.imul(l,mt)|0)+(i>>>13)|0)+(At>>>26)|0,At&=67108863,n=Math.imul(C,V),i=(i=Math.imul(C,K))+Math.imul(L,V)|0,o=Math.imul(L,K),n=n+Math.imul(I,Z)|0,i=(i=i+Math.imul(I,W)|0)+Math.imul(N,Z)|0,o=o+Math.imul(N,W)|0,n=n+Math.imul(R,Y)|0,i=(i=i+Math.imul(R,Q)|0)+Math.imul(B,Y)|0,o=o+Math.imul(B,Q)|0,n=n+Math.imul(j,X)|0,i=(i=i+Math.imul(j,tt)|0)+Math.imul($,X)|0,o=o+Math.imul($,tt)|0,n=n+Math.imul(E,rt)|0,i=(i=i+Math.imul(E,nt)|0)+Math.imul(x,rt)|0,o=o+Math.imul(x,nt)|0,n=n+Math.imul(S,ot)|0,i=(i=i+Math.imul(S,st)|0)+Math.imul(O,ot)|0,o=o+Math.imul(O,st)|0,n=n+Math.imul(w,ut)|0,i=(i=i+Math.imul(w,ht)|0)+Math.imul(_,ut)|0,o=o+Math.imul(_,ht)|0,n=n+Math.imul(v,ct)|0,i=(i=i+Math.imul(v,lt)|0)+Math.imul(b,ct)|0,o=o+Math.imul(b,lt)|0;var Et=(h+(n=n+Math.imul(p,pt)|0)|0)+((8191&(i=(i=i+Math.imul(p,mt)|0)+Math.imul(m,pt)|0))<<13)|0;h=((o=o+Math.imul(m,mt)|0)+(i>>>13)|0)+(Et>>>26)|0,Et&=67108863,n=Math.imul(C,Z),i=(i=Math.imul(C,W))+Math.imul(L,Z)|0,o=Math.imul(L,W),n=n+Math.imul(I,Y)|0,i=(i=i+Math.imul(I,Q)|0)+Math.imul(N,Y)|0,o=o+Math.imul(N,Q)|0,n=n+Math.imul(R,X)|0,i=(i=i+Math.imul(R,tt)|0)+Math.imul(B,X)|0,o=o+Math.imul(B,tt)|0,n=n+Math.imul(j,rt)|0,i=(i=i+Math.imul(j,nt)|0)+Math.imul($,rt)|0,o=o+Math.imul($,nt)|0,n=n+Math.imul(E,ot)|0,i=(i=i+Math.imul(E,st)|0)+Math.imul(x,ot)|0,o=o+Math.imul(x,st)|0,n=n+Math.imul(S,ut)|0,i=(i=i+Math.imul(S,ht)|0)+Math.imul(O,ut)|0,o=o+Math.imul(O,ht)|0,n=n+Math.imul(w,ct)|0,i=(i=i+Math.imul(w,lt)|0)+Math.imul(_,ct)|0,o=o+Math.imul(_,lt)|0;var 
xt=(h+(n=n+Math.imul(v,pt)|0)|0)+((8191&(i=(i=i+Math.imul(v,mt)|0)+Math.imul(b,pt)|0))<<13)|0;h=((o=o+Math.imul(b,mt)|0)+(i>>>13)|0)+(xt>>>26)|0,xt&=67108863,n=Math.imul(C,Y),i=(i=Math.imul(C,Q))+Math.imul(L,Y)|0,o=Math.imul(L,Q),n=n+Math.imul(I,X)|0,i=(i=i+Math.imul(I,tt)|0)+Math.imul(N,X)|0,o=o+Math.imul(N,tt)|0,n=n+Math.imul(R,rt)|0,i=(i=i+Math.imul(R,nt)|0)+Math.imul(B,rt)|0,o=o+Math.imul(B,nt)|0,n=n+Math.imul(j,ot)|0,i=(i=i+Math.imul(j,st)|0)+Math.imul($,ot)|0,o=o+Math.imul($,st)|0,n=n+Math.imul(E,ut)|0,i=(i=i+Math.imul(E,ht)|0)+Math.imul(x,ut)|0,o=o+Math.imul(x,ht)|0,n=n+Math.imul(S,ct)|0,i=(i=i+Math.imul(S,lt)|0)+Math.imul(O,ct)|0,o=o+Math.imul(O,lt)|0;var kt=(h+(n=n+Math.imul(w,pt)|0)|0)+((8191&(i=(i=i+Math.imul(w,mt)|0)+Math.imul(_,pt)|0))<<13)|0;h=((o=o+Math.imul(_,mt)|0)+(i>>>13)|0)+(kt>>>26)|0,kt&=67108863,n=Math.imul(C,X),i=(i=Math.imul(C,tt))+Math.imul(L,X)|0,o=Math.imul(L,tt),n=n+Math.imul(I,rt)|0,i=(i=i+Math.imul(I,nt)|0)+Math.imul(N,rt)|0,o=o+Math.imul(N,nt)|0,n=n+Math.imul(R,ot)|0,i=(i=i+Math.imul(R,st)|0)+Math.imul(B,ot)|0,o=o+Math.imul(B,st)|0,n=n+Math.imul(j,ut)|0,i=(i=i+Math.imul(j,ht)|0)+Math.imul($,ut)|0,o=o+Math.imul($,ht)|0,n=n+Math.imul(E,ct)|0,i=(i=i+Math.imul(E,lt)|0)+Math.imul(x,ct)|0,o=o+Math.imul(x,lt)|0;var jt=(h+(n=n+Math.imul(S,pt)|0)|0)+((8191&(i=(i=i+Math.imul(S,mt)|0)+Math.imul(O,pt)|0))<<13)|0;h=((o=o+Math.imul(O,mt)|0)+(i>>>13)|0)+(jt>>>26)|0,jt&=67108863,n=Math.imul(C,rt),i=(i=Math.imul(C,nt))+Math.imul(L,rt)|0,o=Math.imul(L,nt),n=n+Math.imul(I,ot)|0,i=(i=i+Math.imul(I,st)|0)+Math.imul(N,ot)|0,o=o+Math.imul(N,st)|0,n=n+Math.imul(R,ut)|0,i=(i=i+Math.imul(R,ht)|0)+Math.imul(B,ut)|0,o=o+Math.imul(B,ht)|0,n=n+Math.imul(j,ct)|0,i=(i=i+Math.imul(j,lt)|0)+Math.imul($,ct)|0,o=o+Math.imul($,lt)|0;var $t=(h+(n=n+Math.imul(E,pt)|0)|0)+((8191&(i=(i=i+Math.imul(E,mt)|0)+Math.imul(x,pt)|0))<<13)|0;h=((o=o+Math.imul(x,mt)|0)+(i>>>13)|0)+($t>>>26)|0,$t&=67108863,n=Math.imul(C,ot),i=(i=Math.imul(C,st))+Math.imul(L,ot)|0,o=Math.imul(L,st),n=n+Math.imul(I,ut)|0,i=(i=i+Math.imul(I,ht)|0)+Math.imul(N,ut)|0,o=o+Math.imul(N,ht)|0,n=n+Math.imul(R,ct)|0,i=(i=i+Math.imul(R,lt)|0)+Math.imul(B,ct)|0,o=o+Math.imul(B,lt)|0;var Pt=(h+(n=n+Math.imul(j,pt)|0)|0)+((8191&(i=(i=i+Math.imul(j,mt)|0)+Math.imul($,pt)|0))<<13)|0;h=((o=o+Math.imul($,mt)|0)+(i>>>13)|0)+(Pt>>>26)|0,Pt&=67108863,n=Math.imul(C,ut),i=(i=Math.imul(C,ht))+Math.imul(L,ut)|0,o=Math.imul(L,ht),n=n+Math.imul(I,ct)|0,i=(i=i+Math.imul(I,lt)|0)+Math.imul(N,ct)|0,o=o+Math.imul(N,lt)|0;var Rt=(h+(n=n+Math.imul(R,pt)|0)|0)+((8191&(i=(i=i+Math.imul(R,mt)|0)+Math.imul(B,pt)|0))<<13)|0;h=((o=o+Math.imul(B,mt)|0)+(i>>>13)|0)+(Rt>>>26)|0,Rt&=67108863,n=Math.imul(C,ct),i=(i=Math.imul(C,lt))+Math.imul(L,ct)|0,o=Math.imul(L,lt);var Bt=(h+(n=n+Math.imul(I,pt)|0)|0)+((8191&(i=(i=i+Math.imul(I,mt)|0)+Math.imul(N,pt)|0))<<13)|0;h=((o=o+Math.imul(N,mt)|0)+(i>>>13)|0)+(Bt>>>26)|0,Bt&=67108863;var Tt=(h+(n=Math.imul(C,pt))|0)+((8191&(i=(i=Math.imul(C,mt))+Math.imul(L,pt)|0))<<13)|0;return h=((o=Math.imul(L,mt))+(i>>>13)|0)+(Tt>>>26)|0,Tt&=67108863,u[0]=yt,u[1]=vt,u[2]=bt,u[3]=gt,u[4]=wt,u[5]=_t,u[6]=Mt,u[7]=St,u[8]=Ot,u[9]=At,u[10]=Et,u[11]=xt,u[12]=kt,u[13]=jt,u[14]=$t,u[15]=Pt,u[16]=Rt,u[17]=Bt,u[18]=Tt,0!==h&&(u[19]=h,r.length++),r};function y(t,e,r){return(new v).mulp(t,e,r)}function v(t,e){this.x=t,this.y=e}Math.imul||(m=p),s.prototype.mulTo=function(t,e){var r=this.length+t.length;return 
10===this.length&&10===t.length?m(this,t,e):r<63?p(this,t,e):r<1024?function(t,e,r){r.negative=e.negative^t.negative,r.length=t.length+e.length;for(var n=0,i=0,o=0;o>>26)|0)>>>26,s&=67108863}r.words[o]=a,n=s,s=i}return 0!==n?r.words[o]=n:r.length--,r.strip()}(this,t,e):y(this,t,e)},v.prototype.makeRBT=function(t){for(var e=new Array(t),r=s.prototype._countBits(t)-1,n=0;n>=1;return n},v.prototype.permute=function(t,e,r,n,i,o){for(var s=0;s>>=1)i++;return 1<>>=13,r[2*s+1]=8191&o,o>>>=13;for(s=2*e;s>=26,e+=n/67108864|0,e+=o>>>26,this.words[r]=67108863&o}return 0!==e&&(this.words[r]=e,this.length++),this},s.prototype.muln=function(t){return this.clone().imuln(t)},s.prototype.sqr=function(){return this.mul(this)},s.prototype.isqr=function(){return this.imul(this.clone())},s.prototype.pow=function(t){var e=function(t){for(var e=new Array(t.bitLength()),r=0;r>>i}return e}(t);if(0===e.length)return new s(1);for(var r=this,n=0;n=0);var e,r=t%26,n=(t-r)/26,o=67108863>>>26-r<<26-r;if(0!==r){var s=0;for(e=0;e>>26-r}s&&(this.words[e]=s,this.length++)}if(0!==n){for(e=this.length-1;e>=0;e--)this.words[e+n]=this.words[e];for(e=0;e=0),n=e?(e-e%26)/26:0;var o=t%26,s=Math.min((t-o)/26,this.length),a=67108863^67108863>>>o<s)for(this.length-=s,h=0;h=0&&(0!==f||h>=n);h--){var c=0|this.words[h];this.words[h]=f<<26-o|c>>>o,f=c&a}return u&&0!==f&&(u.words[u.length++]=f),0===this.length&&(this.words[0]=0,this.length=1),this.strip()},s.prototype.ishrn=function(t,e,r){return i(0===this.negative),this.iushrn(t,e,r)},s.prototype.shln=function(t){return this.clone().ishln(t)},s.prototype.ushln=function(t){return this.clone().iushln(t)},s.prototype.shrn=function(t){return this.clone().ishrn(t)},s.prototype.ushrn=function(t){return this.clone().iushrn(t)},s.prototype.testn=function(t){i("number"==typeof t&&t>=0);var e=t%26,r=(t-e)/26,n=1<=0);var e=t%26,r=(t-e)/26;if(i(0===this.negative,"imaskn works only with positive numbers"),this.length<=r)return this;if(0!==e&&r++,this.length=Math.min(r,this.length),0!==e){var n=67108863^67108863>>>e<=67108864;e++)this.words[e]-=67108864,e===this.length-1?this.words[e+1]=1:this.words[e+1]++;return this.length=Math.max(this.length,e+1),this},s.prototype.isubn=function(t){if(i("number"==typeof t),i(t<67108864),t<0)return this.iaddn(-t);if(0!==this.negative)return this.negative=0,this.iaddn(t),this.negative=1,this;if(this.words[0]-=t,1===this.length&&this.words[0]<0)this.words[0]=-this.words[0],this.negative=1;else for(var e=0;e>26)-(u/67108864|0),this.words[n+r]=67108863&o}for(;n>26,this.words[n+r]=67108863&o;if(0===a)return this.strip();for(i(-1===a),a=0,n=0;n>26,this.words[n]=67108863&o;return this.negative=1,this.strip()},s.prototype._wordDiv=function(t,e){var r=(this.length,t.length),n=this.clone(),i=t,o=0|i.words[i.length-1];0!==(r=26-this._countBits(o))&&(i=i.ushln(r),n.iushln(r),o=0|i.words[i.length-1]);var a,u=n.length-i.length;if("mod"!==e){(a=new s(null)).length=u+1,a.words=new Array(a.length);for(var h=0;h=0;c--){var l=67108864*(0|n.words[i.length+c])+(0|n.words[i.length+c-1]);for(l=Math.min(l/o|0,67108863),n._ishlnsubmul(i,l,c);0!==n.negative;)l--,n.negative=0,n._ishlnsubmul(i,1,c),n.isZero()||(n.negative^=1);a&&(a.words[c]=l)}return a&&a.strip(),n.strip(),"div"!==e&&0!==r&&n.iushrn(r),{div:a||null,mod:n}},s.prototype.divmod=function(t,e,r){return i(!t.isZero()),this.isZero()?{div:new s(0),mod:new 
s(0)}:0!==this.negative&&0===t.negative?(a=this.neg().divmod(t,e),"mod"!==e&&(n=a.div.neg()),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.iadd(t)),{div:n,mod:o}):0===this.negative&&0!==t.negative?(a=this.divmod(t.neg(),e),"mod"!==e&&(n=a.div.neg()),{div:n,mod:a.mod}):0!=(this.negative&t.negative)?(a=this.neg().divmod(t.neg(),e),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.isub(t)),{div:a.div,mod:o}):t.length>this.length||this.cmp(t)<0?{div:new s(0),mod:this}:1===t.length?"div"===e?{div:this.divn(t.words[0]),mod:null}:"mod"===e?{div:null,mod:new s(this.modn(t.words[0]))}:{div:this.divn(t.words[0]),mod:new s(this.modn(t.words[0]))}:this._wordDiv(t,e);var n,o,a},s.prototype.div=function(t){return this.divmod(t,"div",!1).div},s.prototype.mod=function(t){return this.divmod(t,"mod",!1).mod},s.prototype.umod=function(t){return this.divmod(t,"mod",!0).mod},s.prototype.divRound=function(t){var e=this.divmod(t);if(e.mod.isZero())return e.div;var r=0!==e.div.negative?e.mod.isub(t):e.mod,n=t.ushrn(1),i=t.andln(1),o=r.cmp(n);return o<0||1===i&&0===o?e.div:0!==e.div.negative?e.div.isubn(1):e.div.iaddn(1)},s.prototype.modn=function(t){i(t<=67108863);for(var e=(1<<26)%t,r=0,n=this.length-1;n>=0;n--)r=(e*r+(0|this.words[n]))%t;return r},s.prototype.idivn=function(t){i(t<=67108863);for(var e=0,r=this.length-1;r>=0;r--){var n=(0|this.words[r])+67108864*e;this.words[r]=n/t|0,e=n%t}return this.strip()},s.prototype.divn=function(t){return this.clone().idivn(t)},s.prototype.egcd=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n=new s(1),o=new s(0),a=new s(0),u=new s(1),h=0;e.isEven()&&r.isEven();)e.iushrn(1),r.iushrn(1),++h;for(var f=r.clone(),c=e.clone();!e.isZero();){for(var l=0,d=1;0==(e.words[0]&d)&&l<26;++l,d<<=1);if(l>0)for(e.iushrn(l);l-- >0;)(n.isOdd()||o.isOdd())&&(n.iadd(f),o.isub(c)),n.iushrn(1),o.iushrn(1);for(var p=0,m=1;0==(r.words[0]&m)&&p<26;++p,m<<=1);if(p>0)for(r.iushrn(p);p-- >0;)(a.isOdd()||u.isOdd())&&(a.iadd(f),u.isub(c)),a.iushrn(1),u.iushrn(1);e.cmp(r)>=0?(e.isub(r),n.isub(a),o.isub(u)):(r.isub(e),a.isub(n),u.isub(o))}return{a:a,b:u,gcd:r.iushln(h)}},s.prototype._invmp=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n,o=new s(1),a=new s(0),u=r.clone();e.cmpn(1)>0&&r.cmpn(1)>0;){for(var h=0,f=1;0==(e.words[0]&f)&&h<26;++h,f<<=1);if(h>0)for(e.iushrn(h);h-- >0;)o.isOdd()&&o.iadd(u),o.iushrn(1);for(var c=0,l=1;0==(r.words[0]&l)&&c<26;++c,l<<=1);if(c>0)for(r.iushrn(c);c-- >0;)a.isOdd()&&a.iadd(u),a.iushrn(1);e.cmp(r)>=0?(e.isub(r),o.isub(a)):(r.isub(e),a.isub(o))}return(n=0===e.cmpn(1)?o:a).cmpn(0)<0&&n.iadd(t),n},s.prototype.gcd=function(t){if(this.isZero())return t.abs();if(t.isZero())return this.abs();var e=this.clone(),r=t.clone();e.negative=0,r.negative=0;for(var n=0;e.isEven()&&r.isEven();n++)e.iushrn(1),r.iushrn(1);for(;;){for(;e.isEven();)e.iushrn(1);for(;r.isEven();)r.iushrn(1);var i=e.cmp(r);if(i<0){var o=e;e=r,r=o}else if(0===i||0===r.cmpn(1))break;e.isub(r)}return r.iushln(n)},s.prototype.invm=function(t){return this.egcd(t).a.umod(t)},s.prototype.isEven=function(){return 0==(1&this.words[0])},s.prototype.isOdd=function(){return 1==(1&this.words[0])},s.prototype.andln=function(t){return this.words[0]&t},s.prototype.bincn=function(t){i("number"==typeof t);var e=t%26,r=(t-e)/26,n=1<>>26,a&=67108863,this.words[s]=a}return 0!==o&&(this.words[s]=o,this.length++),this},s.prototype.isZero=function(){return 
1===this.length&&0===this.words[0]},s.prototype.cmpn=function(t){var e,r=t<0;if(0!==this.negative&&!r)return-1;if(0===this.negative&&r)return 1;if(this.strip(),this.length>1)e=1;else{r&&(t=-t),i(t<=67108863,"Number is too big");var n=0|this.words[0];e=n===t?0:nt.length)return 1;if(this.length=0;r--){var n=0|this.words[r],i=0|t.words[r];if(n!==i){ni&&(e=1);break}}return e},s.prototype.gtn=function(t){return 1===this.cmpn(t)},s.prototype.gt=function(t){return 1===this.cmp(t)},s.prototype.gten=function(t){return this.cmpn(t)>=0},s.prototype.gte=function(t){return this.cmp(t)>=0},s.prototype.ltn=function(t){return-1===this.cmpn(t)},s.prototype.lt=function(t){return-1===this.cmp(t)},s.prototype.lten=function(t){return this.cmpn(t)<=0},s.prototype.lte=function(t){return this.cmp(t)<=0},s.prototype.eqn=function(t){return 0===this.cmpn(t)},s.prototype.eq=function(t){return 0===this.cmp(t)},s.red=function(t){return new O(t)},s.prototype.toRed=function(t){return i(!this.red,"Already a number in reduction context"),i(0===this.negative,"red works only with positives"),t.convertTo(this)._forceRed(t)},s.prototype.fromRed=function(){return i(this.red,"fromRed works only with numbers in reduction context"),this.red.convertFrom(this)},s.prototype._forceRed=function(t){return this.red=t,this},s.prototype.forceRed=function(t){return i(!this.red,"Already a number in reduction context"),this._forceRed(t)},s.prototype.redAdd=function(t){return i(this.red,"redAdd works only with red numbers"),this.red.add(this,t)},s.prototype.redIAdd=function(t){return i(this.red,"redIAdd works only with red numbers"),this.red.iadd(this,t)},s.prototype.redSub=function(t){return i(this.red,"redSub works only with red numbers"),this.red.sub(this,t)},s.prototype.redISub=function(t){return i(this.red,"redISub works only with red numbers"),this.red.isub(this,t)},s.prototype.redShl=function(t){return i(this.red,"redShl works only with red numbers"),this.red.shl(this,t)},s.prototype.redMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.mul(this,t)},s.prototype.redIMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.imul(this,t)},s.prototype.redSqr=function(){return i(this.red,"redSqr works only with red numbers"),this.red._verify1(this),this.red.sqr(this)},s.prototype.redISqr=function(){return i(this.red,"redISqr works only with red numbers"),this.red._verify1(this),this.red.isqr(this)},s.prototype.redSqrt=function(){return i(this.red,"redSqrt works only with red numbers"),this.red._verify1(this),this.red.sqrt(this)},s.prototype.redInvm=function(){return i(this.red,"redInvm works only with red numbers"),this.red._verify1(this),this.red.invm(this)},s.prototype.redNeg=function(){return i(this.red,"redNeg works only with red numbers"),this.red._verify1(this),this.red.neg(this)},s.prototype.redPow=function(t){return i(this.red&&!t.red,"redPow(normalNum)"),this.red._verify1(this),this.red.pow(this,t)};var b={k256:null,p224:null,p192:null,p25519:null};function g(t,e){this.name=t,this.p=new s(e,16),this.n=this.p.bitLength(),this.k=new s(1).iushln(this.n).isub(this.p),this.tmp=this._tmp()}function w(){g.call(this,"k256","ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe fffffc2f")}function _(){g.call(this,"p224","ffffffff ffffffff ffffffff ffffffff 00000000 00000000 00000001")}function M(){g.call(this,"p192","ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff")}function S(){g.call(this,"25519","7fffffffffffffff 
ffffffffffffffff ffffffffffffffff ffffffffffffffed")}function O(t){if("string"==typeof t){var e=s._prime(t);this.m=e.p,this.prime=e}else i(t.gtn(1),"modulus must be greater than 1"),this.m=t,this.prime=null}function A(t){O.call(this,t),this.shift=this.m.bitLength(),this.shift%26!=0&&(this.shift+=26-this.shift%26),this.r=new s(1).iushln(this.shift),this.r2=this.imod(this.r.sqr()),this.rinv=this.r._invmp(this.m),this.minv=this.rinv.mul(this.r).isubn(1).div(this.m),this.minv=this.minv.umod(this.r),this.minv=this.r.sub(this.minv)}g.prototype._tmp=function(){var t=new s(null);return t.words=new Array(Math.ceil(this.n/13)),t},g.prototype.ireduce=function(t){var e,r=t;do{this.split(r,this.tmp),e=(r=(r=this.imulK(r)).iadd(this.tmp)).bitLength()}while(e>this.n);var n=e0?r.isub(this.p):void 0!==r.strip?r.strip():r._strip(),r},g.prototype.split=function(t,e){t.iushrn(this.n,0,e)},g.prototype.imulK=function(t){return t.imul(this.k)},o(w,g),w.prototype.split=function(t,e){for(var r=Math.min(t.length,9),n=0;n>>22,i=o}i>>>=22,t.words[n-10]=i,0===i&&t.length>10?t.length-=10:t.length-=9},w.prototype.imulK=function(t){t.words[t.length]=0,t.words[t.length+1]=0,t.length+=2;for(var e=0,r=0;r>>=26,t.words[r]=i,e=n}return 0!==e&&(t.words[t.length++]=e),t},s._prime=function(t){if(b[t])return b[t];var e;if("k256"===t)e=new w;else if("p224"===t)e=new _;else if("p192"===t)e=new M;else{if("p25519"!==t)throw new Error("Unknown prime "+t);e=new S}return b[t]=e,e},O.prototype._verify1=function(t){i(0===t.negative,"red works only with positives"),i(t.red,"red works only with red numbers")},O.prototype._verify2=function(t,e){i(0==(t.negative|e.negative),"red works only with positives"),i(t.red&&t.red===e.red,"red works only with red numbers")},O.prototype.imod=function(t){return this.prime?this.prime.ireduce(t)._forceRed(this):t.umod(this.m)._forceRed(this)},O.prototype.neg=function(t){return t.isZero()?t.clone():this.m.sub(t)._forceRed(this)},O.prototype.add=function(t,e){this._verify2(t,e);var r=t.add(e);return r.cmp(this.m)>=0&&r.isub(this.m),r._forceRed(this)},O.prototype.iadd=function(t,e){this._verify2(t,e);var r=t.iadd(e);return r.cmp(this.m)>=0&&r.isub(this.m),r},O.prototype.sub=function(t,e){this._verify2(t,e);var r=t.sub(e);return r.cmpn(0)<0&&r.iadd(this.m),r._forceRed(this)},O.prototype.isub=function(t,e){this._verify2(t,e);var r=t.isub(e);return r.cmpn(0)<0&&r.iadd(this.m),r},O.prototype.shl=function(t,e){return this._verify1(t),this.imod(t.ushln(e))},O.prototype.imul=function(t,e){return this._verify2(t,e),this.imod(t.imul(e))},O.prototype.mul=function(t,e){return this._verify2(t,e),this.imod(t.mul(e))},O.prototype.isqr=function(t){return this.imul(t,t.clone())},O.prototype.sqr=function(t){return this.mul(t,t)},O.prototype.sqrt=function(t){if(t.isZero())return t.clone();var e=this.m.andln(3);if(i(e%2==1),3===e){var r=this.m.add(new s(1)).iushrn(2);return this.pow(t,r)}for(var n=this.m.subn(1),o=0;!n.isZero()&&0===n.andln(1);)o++,n.iushrn(1);i(!n.isZero());var a=new s(1).toRed(this),u=a.redNeg(),h=this.m.subn(1).iushrn(1),f=this.m.bitLength();for(f=new s(2*f*f).toRed(this);0!==this.pow(f,h).cmp(u);)f.redIAdd(u);for(var c=this.pow(f,n),l=this.pow(t,n.addn(1).iushrn(1)),d=this.pow(t,n),p=o;0!==d.cmp(a);){for(var m=d,y=0;0!==m.cmp(a);y++)m=m.redSqr();i(y=0;n--){for(var h=e.words[n],f=u-1;f>=0;f--){var c=h>>f&1;i!==r[0]&&(i=this.sqr(i)),0!==c||0!==o?(o<<=1,o|=c,(4===++a||0===n&&0===f)&&(i=this.mul(i,r[o]),a=0,o=0)):a=0}u=26}return i},O.prototype.convertTo=function(t){var e=t.umod(this.m);return 
e===t?e.clone():e},O.prototype.convertFrom=function(t){var e=t.clone();return e.red=null,e},s.mont=function(t){return new A(t)},o(A,O),A.prototype.convertTo=function(t){return this.imod(t.ushln(this.shift))},A.prototype.convertFrom=function(t){var e=this.imod(t.mul(this.rinv));return e.red=null,e},A.prototype.imul=function(t,e){if(t.isZero()||e.isZero())return t.words[0]=0,t.length=1,t;var r=t.imul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.mul=function(t,e){if(t.isZero()||e.isZero())return new s(0)._forceRed(this);var r=t.mul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.invm=function(t){return this.imod(t._invmp(this.m).mul(this.r2))._forceRed(this)}}(t,this)}).call(this,r(22)(t))},function(t,e,r){"use strict";t.exports=r(278)},function(t,e,r){"use strict";t.exports=new Set(["__proto__","constructor","prototype"])},function(t,e,r){"use strict";var n=r(84); +/*! + * ignore + */t.exports=function(t){var e=null!=this?this.path:null;return n(t,e)}},function(t,e,r){"use strict";var n=r(20); +/*! + * Given a value, cast it to a boolean, or throw a `CastError` if the value + * cannot be casted. `null` and `undefined` are considered valid. + * + * @param {Any} value + * @param {String} [path] optional the path to set on the CastError + * @return {Boolean|null|undefined} + * @throws {CastError} if `value` is not one of the allowed values + * @api private + */t.exports=function(e,r){if(t.exports.convertToTrue.has(e))return!0;if(t.exports.convertToFalse.has(e))return!1;if(null==e)return e;throw new n("boolean",e,r)},t.exports.convertToTrue=new Set([!0,"true",1,"1","yes"]),t.exports.convertToFalse=new Set([!1,"false",0,"0","no"])},function(t,e,r){"use strict";(function(n){ +/*! + * Module dependencies. 
+ */ +function i(t,e,r){return e in t?Object.defineProperty(t,e,{value:r,enumerable:!0,configurable:!0,writable:!0}):t[e]=r,t}function o(t){return(o="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function s(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return a(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return a(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,i=function(){};return{s:i,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:i}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var o,s=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return s=t.done,t},e:function(t){u=!0,o=t},f:function(){try{s||null==r.return||r.return()}finally{if(u)throw o}}}}function a(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0){e&&(this.nested[e.substr(0,e.length-1)]=!0);var u=new P(a),h=Object.assign({},t[i],{type:u});this.path(e+i,h)}else e&&(this.nested[e.substr(0,e.length-1)]=!0),this.path(e+i,t[i])}else e&&(this.nested[e.substr(0,e.length-1)]=!0),this.path(e+i,t[i])}}return function(t,e){var r,n=s(e=e||Object.keys(t.paths));try{for(n.s();!(r=n.n()).done;){var i=r.value,o=v(t.paths[i],"options");if(null!=o){var a=t.paths[i].path,u=o.alias;if(u){if("string"!=typeof u)throw new Error("Invalid value for alias option on "+a+", got "+u);t.aliases[u]=a,t.virtual(u).get(function(t){return function(){return"function"==typeof this.get?this.get(t):this[t]}}(a)).set(function(t){return function(e){return this.$set(t,e)}}(a))}}}}catch(t){n.e(t)}finally{n.f()}}(this,Object.keys(t).map((function(t){return e?e+t:t}))),this},P.reserved=Object.create(null),P.prototype.reserved=P.reserved;var R=P.reserved; +/*! + * ignore + */ +function B(t){return/\.\d+/.test(t)?t.replace(/\.\d+\./g,".$.").replace(/\.\d+$/,".$"):t} +/*! + * ignore + */function T(t,e){if(0===t.mapPaths.length)return null;var r,n=s(t.mapPaths);try{for(n.s();!(r=n.n()).done;){var i=r.value.path;if(new RegExp("^"+i.replace(/\.\$\*/g,"\\.[^.]+")+"$").test(e))return t.paths[i]}}catch(t){n.e(t)}finally{n.f()}return null} +/*! + * ignore. Deprecated re: #6405 + */ +function I(t,e){var r=e.split(/\.(\d+)\.|\.(\d+)$/).filter(Boolean);if(r.length<2)return t.paths.hasOwnProperty(r[0])?t.paths[r[0]]:"adhocOrUndefined";var n=t.path(r[0]),i=!1;if(!n)return"adhocOrUndefined";for(var o=r.length-1,s=1;s0?".":"")+y,d[y]||(this.nested[p]=!0,d[y]={}),"object"!==o(d[y])){var v="Cannot set nested path `"+t+"`. 
Parent path `"+p+"` already set to type "+d[y].name+".";throw new Error(v)}d=d[y]}}catch(t){m.e(t)}finally{m.f()}d[c]=A.clone(e),this.paths[t]=this.interpretAsType(t,e,this.options);var b=this.paths[t];if(b.$isSchemaMap){var g=t+".$*";this.paths[g]=b.$__schemaType,this.mapPaths.push(this.paths[g])}if(b.$isSingleNested){for(var w=0,_=Object.keys(b.schema.paths);w<_.length;w++){var M=_[w];this.singleNestedPaths[t+"."+M]=b.schema.paths[M]}for(var S=0,O=Object.keys(b.schema.singleNestedPaths);S0&&!A.hasUserDefinedProperty(n.of,t.options.typeKey);a=u?i({},t.options.typeKey,new P(n.of)):A.isPOJO(n.of)?Object.assign({},n.of):i({},t.options.typeKey,n.of),A.hasUserDefinedProperty(n,"ref")&&(a.ref=n.ref)}e.$__schemaType=t.interpretAsType(s,a,o)}(this,v,t,e,r),v},P.prototype.eachPath=function(t){for(var e=Object.keys(this.paths),r=e.length,n=0;n0?t+"."+e[r]:e[r],this.paths.hasOwnProperty(t)&&this.paths[t]instanceof u.Mixed)return this.paths[t];return null},P.prototype.setupTimestamp=function(t){return O(this,t)},P.prototype.queue=function(t,e){return this.callQueue.push([t,e]),this},P.prototype.pre=function(t){if(t instanceof RegExp){var e,r=Array.prototype.slice.call(arguments,1),n=s(j);try{for(n.s();!(e=n.n()).done;){var i=e.value;t.test(i)&&this.pre.apply(this,[i].concat(r))}}catch(t){n.e(t)}finally{n.f()}return this}if(Array.isArray(t)){var o,a=Array.prototype.slice.call(arguments,1),u=s(t);try{for(u.s();!(o=u.n()).done;){var h=o.value;this.pre.apply(this,[h].concat(a))}}catch(t){u.e(t)}finally{u.f()}return this}return this.s.hooks.pre.apply(this.s.hooks,arguments),this},P.prototype.post=function(t){if(t instanceof RegExp){var e,r=Array.prototype.slice.call(arguments,1),n=s(j);try{for(n.s();!(e=n.n()).done;){var i=e.value;t.test(i)&&this.post.apply(this,[i].concat(r))}}catch(t){n.e(t)}finally{n.f()}return this}if(Array.isArray(t)){var o,a=Array.prototype.slice.call(arguments,1),u=s(t);try{for(u.s();!(o=u.n()).done;){var h=o.value;this.post.apply(this,[h].concat(a))}}catch(t){u.e(t)}finally{u.f()}return this}return this.s.hooks.post.apply(this.s.hooks,arguments),this},P.prototype.plugin=function(t,e){if("function"!=typeof t)throw new Error('First param to `schema.plugin()` must be a function, got "'+o(t)+'"');if(e&&e.deduplicate){var r,n=s(this.plugins);try{for(n.s();!(r=n.n()).done;){if(r.value.fn===t)return this}}catch(t){n.e(t)}finally{n.f()}}return this.plugins.push({fn:t,opts:e}),t(this,e),this},P.prototype.method=function(t,e,r){if("string"!=typeof t)for(var n in t)this.methods[n]=t[n],this.methodOptions[n]=A.clone(r);else this.methods[t]=e,this.methodOptions[t]=A.clone(r);return this},P.prototype.static=function(t,e){if("string"!=typeof t)for(var r in t)this.statics[r]=t[r];else this.statics[t]=e;return this},P.prototype.index=function(t,e){return t||(t={}),e||(e={}),e.expires&&A.expires(e),this._indexes.push([t,e]),this},P.prototype.set=function(t,e,r){if(1===arguments.length)return this.options[t];switch(t){case"read":this.options[t]=S(e,r),this._userProvidedOptions[t]=this.options[t];break;case"timestamps":this.setupTimestamp(e),this.options[t]=e,this._userProvidedOptions[t]=this.options[t];break;case"_id":this.options[t]=e,this._userProvidedOptions[t]=this.options[t],e&&!this.paths._id?y(this):!e&&null!=this.paths._id&&this.paths._id.auto&&this.remove("_id");break;default:this.options[t]=e,this._userProvidedOptions[t]=this.options[t]}return this},P.prototype.get=function(t){return this.options[t]};var N="2d 2dsphere hashed text".split(" "); +/*! 
+ * ignore + */ +function D(t,e){var r,n=e.split("."),i=n.pop(),o=t.tree,a=s(n);try{for(a.s();!(r=a.n()).done;){o=o[r.value]}}catch(t){a.e(t)}finally{a.f()}delete o[i]} +/*! + * ignore + */ +function C(t){return t.startsWith("$[")&&t.endsWith("]")} +/*! + * Called by `compile()` _right before_ compiling. Good for making any changes to + * the schema that should respect options set by plugins, like `id` + */Object.defineProperty(P,"indexTypes",{get:function(){return N},set:function(){throw new Error("Cannot overwrite Schema.indexTypes")}}),P.prototype.indexes=function(){return g(this)},P.prototype.virtual=function(t,e){if(t instanceof m||"VirtualType"===b(t))return this.virtual(t.path,t.options);if(e=new p(e),A.hasUserDefinedProperty(e,["ref","refPath"])){if(null==e.localField)throw new Error("Reference virtuals require `localField` option");if(null==e.foreignField)throw new Error("Reference virtuals require `foreignField` option");this.pre("init",(function(r){if(M.has(t,r)){var n=M.get(t,r);this.$$populatedVirtuals||(this.$$populatedVirtuals={}),e.justOne||e.count?this.$$populatedVirtuals[t]=Array.isArray(n)?n[0]:n:this.$$populatedVirtuals[t]=Array.isArray(n)?n:null==n?[]:[n],M.unset(t,r)}}));var r=this.virtual(t);r.options=e,r.set((function(r){this.$$populatedVirtuals||(this.$$populatedVirtuals={}),e.justOne||e.count?(this.$$populatedVirtuals[t]=Array.isArray(r)?r[0]:r,"object"!==o(this.$$populatedVirtuals[t])&&(this.$$populatedVirtuals[t]=e.count?r:null)):(this.$$populatedVirtuals[t]=Array.isArray(r)?r:null==r?[]:[r],this.$$populatedVirtuals[t]=this.$$populatedVirtuals[t].filter((function(t){return t&&"object"===o(t)})))})),"function"==typeof e.get&&r.get(e.get);for(var n=t.split("."),i=n[0],s=0;s=e.length?i:s+1>=e.length?i.$__schemaType:t(e.slice(s+1),i.$__schemaType.schema)}return i.$fullPath=r.join("."),i}}(n,this)}, +/*! + * ignore + */ +P.prototype._getPathType=function(t){if(this.path(t))return"real";return function t(e,r){for(var n,i,o=e.length+1;o--;){if(i=e.slice(0,o).join("."),n=r.path(i))return n.caster?n.caster instanceof u.Mixed?{schema:n,pathType:"mixed"}:o!==e.length&&n.schema?"$"===e[o]||C(e[o])?o===e.length-1?{schema:n,pathType:"nested"}:t(e.slice(o+1),n.schema):t(e.slice(o),n.schema):{schema:n,pathType:n.$isSingleNested?"nested":"array"}:{schema:n,pathType:"real"};if(o===e.length&&r.nested[i])return{schema:r,pathType:"nested"}}return{schema:n||r,pathType:"undefined"}}(t.split("."),this)},P.prototype._preCompile=function(){w(this)}, +/*! + * Module exports. + */ +t.exports=e=P,P.Types=u=r(87), +/*! 
+ * ignore + */ +e.ObjectId=u.ObjectId}).call(this,r(3).Buffer)},function(t,e,r){"use strict";function n(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return i(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return i(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,o=function(){};return{s:o,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0||this.setters.length>0)){var t="$"+this.path;this.getters.push((function(){return this[t]})),this.setters.push((function(e){this[t]=e}))}}, +/*! + * ignore + */ +s.prototype.clone=function(){var t=new s(this.options,this.path);return t.getters=[].concat(this.getters),t.setters=[].concat(this.setters),t},s.prototype.get=function(t){return this.getters.push(t),this},s.prototype.set=function(t){return this.setters.push(t),this},s.prototype.applyGetters=function(t,e){o.hasUserDefinedProperty(this.options,["ref","refPath"])&&e.$$populatedVirtuals&&e.$$populatedVirtuals.hasOwnProperty(this.path)&&(t=e.$$populatedVirtuals[this.path]);var r,i=t,s=n(this.getters);try{for(s.s();!(r=s.n()).done;){i=r.value.call(e,i,this,e)}}catch(t){s.e(t)}finally{s.f()}return i},s.prototype.applySetters=function(t,e){var r,i=t,o=n(this.setters);try{for(o.s();!(r=o.n()).done;){i=r.value.call(e,i,this,e)}}catch(t){o.e(t)}finally{o.f()}return i}, +/*! + * exports + */ +t.exports=s},function(t,e,r){"use strict"; +/*! + * Module exports. + */e.String=r(312),e.Number=r(161),e.Boolean=r(316),e.DocumentArray=r(317),e.Subdocument=r(324),e.Array=r(88),e.Buffer=r(326),e.Date=r(328),e.ObjectId=r(331),e.Mixed=r(44),e.Decimal128=e.Decimal=r(333),e.Map=r(335),e.Oid=e.ObjectId,e.Object=e.Mixed,e.Bool=e.Boolean,e.ObjectID=e.ObjectId},function(t,e,r){"use strict"; +/*! + * Module dependencies. 
+ */function n(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return i(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return i(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,o=function(){};return{s:o,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0){var s=p(t);if(s.min===s.max&&s.max=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0?t:r)}return this},_registerAtomic:function(t,e){if(!this[b]){if("$set"===t)return this[d]={$set:e},h(this[p],this[m]),this._markModified(),this;var r,n=this[d];if("$pop"===t&&!("$pop"in n)){var i=this;this[p].once("save",(function(){i._popped=i._shifted=null}))}if(n.$set||Object.keys(n).length&&!(t in n))return this[d]={$set:this},this;if("$pullAll"===t||"$addToSet"===t)n[t]||(n[t]=[]),n[t]=n[t].concat(e);else if("$pullDocs"===t){var o=n.$pull||(n.$pull={});e[0]instanceof s?(r=o.$or||(o.$or=[]),Array.prototype.push.apply(r,e.map((function(t){return t.toObject({transform:!1,virtuals:!1})})))):(r=o._id||(o._id={$in:[]})).$in=r.$in.concat(e)}else"$push"===t?(n.$push=n.$push||{$each:[]},null!=e&&l.hasUserDefinedProperty(e,"$each")?n.$push=e:n.$push.$each=n.$push.$each.concat(e)):n[t]=e;return this}},addToSet:function(){_(this,arguments);var t=[].map.call(arguments,this._mapCast,this);t=this[y].applySetters(t,this[p]);var e=[],r="";t[0]instanceof s?r="doc":t[0]instanceof Date&&(r="date");var n=t.isMongooseArrayProxy?t.__array:this,i=this.isMongooseArrayProxy?this.__array:this;return n.forEach((function(t){var n,o=+t;switch(r){case"doc":n=this.some((function(e){return e.equals(t)}));break;case"date":n=this.some((function(t){return+t===o}));break;default:n=~this.indexOf(t)}n||(i.push(t),this._registerAtomic("$addToSet",t),this._markModified(),[].push.call(e,t))}),this),e},hasAtomics:function(){return l.isPOJO(this[d])?Object.keys(this[d]).length:0},includes:function(t,e){return-1!==this.indexOf(t,e)},indexOf:function(t,e){t instanceof u&&(t=t.toString()),e=null==e?0:e;for(var r=this.length,n=e;n0&&this._registerAtomic("$set",this),this},push:function(){var t=arguments,e=t,r=null!=t[0]&&l.hasUserDefinedProperty(t[0],"$each"),n=this.isMongooseArrayProxy?this.__array:this;if(r&&(e=t[0],t=t[0].$each),null==this[y])return g.apply(this,t);_(this,t);var i,o=this[p];t=[].map.call(t,this._mapCast,this),t=this[y].applySetters(t,o,void 0,void 0,{skipDocumentArrayCast:!0});var 
s=this[d];if(r){if(e.$each=t,f(s,"$push.$each.length",0)>0&&s.$push.$position!=e.$position)throw new a("Cannot call `Array#push()` multiple times with different `$position`");null!=e.$position?([].splice.apply(n,[e.$position,0].concat(t)),i=this.length):i=[].push.apply(n,t)}else{if(f(s,"$push.$each.length",0)>0&&null!=s.$push.$position)throw new a("Cannot call `Array#push()` multiple times with different `$position`");e=t,i=[].push.apply(n,t)}return this._registerAtomic("$push",e),this._markModified(),i},remove:function(){return this.pull.apply(this,arguments)},set:function(t,e,r){var n=this.__array;if(r)return n[t]=e,this;var i=w._cast.call(this,e,t);return n[t]=i,w._markModified.call(this,t),this},shift:function(){var t=this.isMongooseArrayProxy?this.__array:this,e=[].shift.call(t);return this._registerAtomic("$set",this),this._markModified(),e},sort:function(){var t=this.isMongooseArrayProxy?this.__array:this,e=[].sort.apply(t,arguments);return this._registerAtomic("$set",this),e},splice:function(){var t,e=this.isMongooseArrayProxy?this.__array:this;if(_(this,Array.prototype.slice.call(arguments,2)),arguments.length){var r;if(null==this[y])r=arguments;else{r=[];for(var n=0;n0&& +/*! + * ignore + */ +function(t,e){if(!e)return!1;var r,i=n(t);try{for(i.s();!(r=i.n()).done;){var s=r.value;if(null==s)return!1;var a=s.constructor;if(!(s instanceof o)||a.modelName!==e&&a.baseModelName!==e)return!1}}catch(t){i.e(t)}finally{i.f()}return!0}(e,a)&&t[p].$populated(t[m],[],(r={},i=v,s=e[0].constructor,i in r?Object.defineProperty(r,i,{value:s,enumerable:!0,configurable:!0,writable:!0}):r[i]=s,r))}for(var M=function(){var t=O[S];if(null==Array.prototype[t])return"continue";w[t]=function(){var e=this.isMongooseArrayProxy?this.__array:this,r=[].concat(e);return r[t].apply(r,arguments)}},S=0,O=["filter","flat","flatMap","map","slice"];S=this._blockSize;){for(var o=this._blockOffset;o0;++s)this._length[s]+=a,(a=this._length[s]/4294967296|0)>0&&(this._length[s]-=4294967296*a);return this},o.prototype._update=function(){throw new Error("_update is not implemented")},o.prototype.digest=function(t){if(this._finalized)throw new Error("Digest already called");this._finalized=!0;var e=this._digest();void 0!==t&&(e=e.toString(t)),this._block.fill(0),this._blockOffset=0;for(var r=0;r<4;++r)this._length[r]=0;return e},o.prototype._digest=function(){throw new Error("_digest is not implemented")},t.exports=o},function(t,e,r){(e=t.exports=r(96)).Stream=e,e.Readable=e,e.Writable=r(100),e.Duplex=r(29),e.Transform=r(102),e.PassThrough=r(190),e.finished=r(61),e.pipeline=r(191)},function(t,e,r){"use strict";(function(e,n){var i;t.exports=A,A.ReadableState=O;r(13).EventEmitter;var o=function(t,e){return t.listeners(e).length},s=r(97),a=r(3).Buffer,u=e.Uint8Array||function(){};var h,f=r(184);h=f&&f.debuglog?f.debuglog("stream"):function(){};var c,l,d,p=r(185),m=r(98),y=r(99).getHighWaterMark,v=r(28).codes,b=v.ERR_INVALID_ARG_TYPE,g=v.ERR_STREAM_PUSH_AFTER_EOF,w=v.ERR_METHOD_NOT_IMPLEMENTED,_=v.ERR_STREAM_UNSHIFT_AFTER_END_EVENT;r(0)(A,s);var M=m.errorOrDestroy,S=["error","close","destroy","pause","resume"];function O(t,e,n){i=i||r(29),t=t||{},"boolean"!=typeof n&&(n=e instanceof i),this.objectMode=!!t.objectMode,n&&(this.objectMode=this.objectMode||!!t.readableObjectMode),this.highWaterMark=y(this,t,"readableHighWaterMark",n),this.buffer=new 
[Minified webpack bundle output omitted. This span is machine-generated build output rather than hand-written project code; it is not human-readable, and portions were further garbled in extraction (dropped comparison operators and line breaks inside string literals). What can still be identified are vendored browser ports of common Node modules: the readable-stream Readable/Writable/Transform internals, sha.js SHA-256 and SHA-512 implementations, HMAC/PBKDF2 helpers, des.js block-cipher utilities, Diffie-Hellman prime generation with a Miller-Rabin primality test, hash.js and elliptic-curve helpers, and two bundled copies of the BN.js arbitrary-precision integer library (including Montgomery reduction and the k256/p224/p192/p25519 prime fields) that back the browser crypto shim.]
_t=(h+(n=n+Math.imul(c,rt)|0)|0)+((8191&(i=(i=i+Math.imul(c,nt)|0)+Math.imul(l,rt)|0))<<13)|0;h=((o=o+Math.imul(l,nt)|0)+(i>>>13)|0)+(_t>>>26)|0,_t&=67108863,n=Math.imul(j,U),i=(i=Math.imul(j,F))+Math.imul($,U)|0,o=Math.imul($,F),n=n+Math.imul(E,V)|0,i=(i=i+Math.imul(E,K)|0)+Math.imul(x,V)|0,o=o+Math.imul(x,K)|0,n=n+Math.imul(S,Z)|0,i=(i=i+Math.imul(S,W)|0)+Math.imul(O,Z)|0,o=o+Math.imul(O,W)|0,n=n+Math.imul(w,Y)|0,i=(i=i+Math.imul(w,Q)|0)+Math.imul(_,Y)|0,o=o+Math.imul(_,Q)|0,n=n+Math.imul(v,X)|0,i=(i=i+Math.imul(v,tt)|0)+Math.imul(b,X)|0,o=o+Math.imul(b,tt)|0,n=n+Math.imul(p,rt)|0,i=(i=i+Math.imul(p,nt)|0)+Math.imul(m,rt)|0,o=o+Math.imul(m,nt)|0;var Mt=(h+(n=n+Math.imul(c,ot)|0)|0)+((8191&(i=(i=i+Math.imul(c,st)|0)+Math.imul(l,ot)|0))<<13)|0;h=((o=o+Math.imul(l,st)|0)+(i>>>13)|0)+(Mt>>>26)|0,Mt&=67108863,n=Math.imul(R,U),i=(i=Math.imul(R,F))+Math.imul(B,U)|0,o=Math.imul(B,F),n=n+Math.imul(j,V)|0,i=(i=i+Math.imul(j,K)|0)+Math.imul($,V)|0,o=o+Math.imul($,K)|0,n=n+Math.imul(E,Z)|0,i=(i=i+Math.imul(E,W)|0)+Math.imul(x,Z)|0,o=o+Math.imul(x,W)|0,n=n+Math.imul(S,Y)|0,i=(i=i+Math.imul(S,Q)|0)+Math.imul(O,Y)|0,o=o+Math.imul(O,Q)|0,n=n+Math.imul(w,X)|0,i=(i=i+Math.imul(w,tt)|0)+Math.imul(_,X)|0,o=o+Math.imul(_,tt)|0,n=n+Math.imul(v,rt)|0,i=(i=i+Math.imul(v,nt)|0)+Math.imul(b,rt)|0,o=o+Math.imul(b,nt)|0,n=n+Math.imul(p,ot)|0,i=(i=i+Math.imul(p,st)|0)+Math.imul(m,ot)|0,o=o+Math.imul(m,st)|0;var St=(h+(n=n+Math.imul(c,ut)|0)|0)+((8191&(i=(i=i+Math.imul(c,ht)|0)+Math.imul(l,ut)|0))<<13)|0;h=((o=o+Math.imul(l,ht)|0)+(i>>>13)|0)+(St>>>26)|0,St&=67108863,n=Math.imul(I,U),i=(i=Math.imul(I,F))+Math.imul(N,U)|0,o=Math.imul(N,F),n=n+Math.imul(R,V)|0,i=(i=i+Math.imul(R,K)|0)+Math.imul(B,V)|0,o=o+Math.imul(B,K)|0,n=n+Math.imul(j,Z)|0,i=(i=i+Math.imul(j,W)|0)+Math.imul($,Z)|0,o=o+Math.imul($,W)|0,n=n+Math.imul(E,Y)|0,i=(i=i+Math.imul(E,Q)|0)+Math.imul(x,Y)|0,o=o+Math.imul(x,Q)|0,n=n+Math.imul(S,X)|0,i=(i=i+Math.imul(S,tt)|0)+Math.imul(O,X)|0,o=o+Math.imul(O,tt)|0,n=n+Math.imul(w,rt)|0,i=(i=i+Math.imul(w,nt)|0)+Math.imul(_,rt)|0,o=o+Math.imul(_,nt)|0,n=n+Math.imul(v,ot)|0,i=(i=i+Math.imul(v,st)|0)+Math.imul(b,ot)|0,o=o+Math.imul(b,st)|0,n=n+Math.imul(p,ut)|0,i=(i=i+Math.imul(p,ht)|0)+Math.imul(m,ut)|0,o=o+Math.imul(m,ht)|0;var Ot=(h+(n=n+Math.imul(c,ct)|0)|0)+((8191&(i=(i=i+Math.imul(c,lt)|0)+Math.imul(l,ct)|0))<<13)|0;h=((o=o+Math.imul(l,lt)|0)+(i>>>13)|0)+(Ot>>>26)|0,Ot&=67108863,n=Math.imul(C,U),i=(i=Math.imul(C,F))+Math.imul(L,U)|0,o=Math.imul(L,F),n=n+Math.imul(I,V)|0,i=(i=i+Math.imul(I,K)|0)+Math.imul(N,V)|0,o=o+Math.imul(N,K)|0,n=n+Math.imul(R,Z)|0,i=(i=i+Math.imul(R,W)|0)+Math.imul(B,Z)|0,o=o+Math.imul(B,W)|0,n=n+Math.imul(j,Y)|0,i=(i=i+Math.imul(j,Q)|0)+Math.imul($,Y)|0,o=o+Math.imul($,Q)|0,n=n+Math.imul(E,X)|0,i=(i=i+Math.imul(E,tt)|0)+Math.imul(x,X)|0,o=o+Math.imul(x,tt)|0,n=n+Math.imul(S,rt)|0,i=(i=i+Math.imul(S,nt)|0)+Math.imul(O,rt)|0,o=o+Math.imul(O,nt)|0,n=n+Math.imul(w,ot)|0,i=(i=i+Math.imul(w,st)|0)+Math.imul(_,ot)|0,o=o+Math.imul(_,st)|0,n=n+Math.imul(v,ut)|0,i=(i=i+Math.imul(v,ht)|0)+Math.imul(b,ut)|0,o=o+Math.imul(b,ht)|0,n=n+Math.imul(p,ct)|0,i=(i=i+Math.imul(p,lt)|0)+Math.imul(m,ct)|0,o=o+Math.imul(m,lt)|0;var 
At=(h+(n=n+Math.imul(c,pt)|0)|0)+((8191&(i=(i=i+Math.imul(c,mt)|0)+Math.imul(l,pt)|0))<<13)|0;h=((o=o+Math.imul(l,mt)|0)+(i>>>13)|0)+(At>>>26)|0,At&=67108863,n=Math.imul(C,V),i=(i=Math.imul(C,K))+Math.imul(L,V)|0,o=Math.imul(L,K),n=n+Math.imul(I,Z)|0,i=(i=i+Math.imul(I,W)|0)+Math.imul(N,Z)|0,o=o+Math.imul(N,W)|0,n=n+Math.imul(R,Y)|0,i=(i=i+Math.imul(R,Q)|0)+Math.imul(B,Y)|0,o=o+Math.imul(B,Q)|0,n=n+Math.imul(j,X)|0,i=(i=i+Math.imul(j,tt)|0)+Math.imul($,X)|0,o=o+Math.imul($,tt)|0,n=n+Math.imul(E,rt)|0,i=(i=i+Math.imul(E,nt)|0)+Math.imul(x,rt)|0,o=o+Math.imul(x,nt)|0,n=n+Math.imul(S,ot)|0,i=(i=i+Math.imul(S,st)|0)+Math.imul(O,ot)|0,o=o+Math.imul(O,st)|0,n=n+Math.imul(w,ut)|0,i=(i=i+Math.imul(w,ht)|0)+Math.imul(_,ut)|0,o=o+Math.imul(_,ht)|0,n=n+Math.imul(v,ct)|0,i=(i=i+Math.imul(v,lt)|0)+Math.imul(b,ct)|0,o=o+Math.imul(b,lt)|0;var Et=(h+(n=n+Math.imul(p,pt)|0)|0)+((8191&(i=(i=i+Math.imul(p,mt)|0)+Math.imul(m,pt)|0))<<13)|0;h=((o=o+Math.imul(m,mt)|0)+(i>>>13)|0)+(Et>>>26)|0,Et&=67108863,n=Math.imul(C,Z),i=(i=Math.imul(C,W))+Math.imul(L,Z)|0,o=Math.imul(L,W),n=n+Math.imul(I,Y)|0,i=(i=i+Math.imul(I,Q)|0)+Math.imul(N,Y)|0,o=o+Math.imul(N,Q)|0,n=n+Math.imul(R,X)|0,i=(i=i+Math.imul(R,tt)|0)+Math.imul(B,X)|0,o=o+Math.imul(B,tt)|0,n=n+Math.imul(j,rt)|0,i=(i=i+Math.imul(j,nt)|0)+Math.imul($,rt)|0,o=o+Math.imul($,nt)|0,n=n+Math.imul(E,ot)|0,i=(i=i+Math.imul(E,st)|0)+Math.imul(x,ot)|0,o=o+Math.imul(x,st)|0,n=n+Math.imul(S,ut)|0,i=(i=i+Math.imul(S,ht)|0)+Math.imul(O,ut)|0,o=o+Math.imul(O,ht)|0,n=n+Math.imul(w,ct)|0,i=(i=i+Math.imul(w,lt)|0)+Math.imul(_,ct)|0,o=o+Math.imul(_,lt)|0;var xt=(h+(n=n+Math.imul(v,pt)|0)|0)+((8191&(i=(i=i+Math.imul(v,mt)|0)+Math.imul(b,pt)|0))<<13)|0;h=((o=o+Math.imul(b,mt)|0)+(i>>>13)|0)+(xt>>>26)|0,xt&=67108863,n=Math.imul(C,Y),i=(i=Math.imul(C,Q))+Math.imul(L,Y)|0,o=Math.imul(L,Q),n=n+Math.imul(I,X)|0,i=(i=i+Math.imul(I,tt)|0)+Math.imul(N,X)|0,o=o+Math.imul(N,tt)|0,n=n+Math.imul(R,rt)|0,i=(i=i+Math.imul(R,nt)|0)+Math.imul(B,rt)|0,o=o+Math.imul(B,nt)|0,n=n+Math.imul(j,ot)|0,i=(i=i+Math.imul(j,st)|0)+Math.imul($,ot)|0,o=o+Math.imul($,st)|0,n=n+Math.imul(E,ut)|0,i=(i=i+Math.imul(E,ht)|0)+Math.imul(x,ut)|0,o=o+Math.imul(x,ht)|0,n=n+Math.imul(S,ct)|0,i=(i=i+Math.imul(S,lt)|0)+Math.imul(O,ct)|0,o=o+Math.imul(O,lt)|0;var kt=(h+(n=n+Math.imul(w,pt)|0)|0)+((8191&(i=(i=i+Math.imul(w,mt)|0)+Math.imul(_,pt)|0))<<13)|0;h=((o=o+Math.imul(_,mt)|0)+(i>>>13)|0)+(kt>>>26)|0,kt&=67108863,n=Math.imul(C,X),i=(i=Math.imul(C,tt))+Math.imul(L,X)|0,o=Math.imul(L,tt),n=n+Math.imul(I,rt)|0,i=(i=i+Math.imul(I,nt)|0)+Math.imul(N,rt)|0,o=o+Math.imul(N,nt)|0,n=n+Math.imul(R,ot)|0,i=(i=i+Math.imul(R,st)|0)+Math.imul(B,ot)|0,o=o+Math.imul(B,st)|0,n=n+Math.imul(j,ut)|0,i=(i=i+Math.imul(j,ht)|0)+Math.imul($,ut)|0,o=o+Math.imul($,ht)|0,n=n+Math.imul(E,ct)|0,i=(i=i+Math.imul(E,lt)|0)+Math.imul(x,ct)|0,o=o+Math.imul(x,lt)|0;var jt=(h+(n=n+Math.imul(S,pt)|0)|0)+((8191&(i=(i=i+Math.imul(S,mt)|0)+Math.imul(O,pt)|0))<<13)|0;h=((o=o+Math.imul(O,mt)|0)+(i>>>13)|0)+(jt>>>26)|0,jt&=67108863,n=Math.imul(C,rt),i=(i=Math.imul(C,nt))+Math.imul(L,rt)|0,o=Math.imul(L,nt),n=n+Math.imul(I,ot)|0,i=(i=i+Math.imul(I,st)|0)+Math.imul(N,ot)|0,o=o+Math.imul(N,st)|0,n=n+Math.imul(R,ut)|0,i=(i=i+Math.imul(R,ht)|0)+Math.imul(B,ut)|0,o=o+Math.imul(B,ht)|0,n=n+Math.imul(j,ct)|0,i=(i=i+Math.imul(j,lt)|0)+Math.imul($,ct)|0,o=o+Math.imul($,lt)|0;var 
$t=(h+(n=n+Math.imul(E,pt)|0)|0)+((8191&(i=(i=i+Math.imul(E,mt)|0)+Math.imul(x,pt)|0))<<13)|0;h=((o=o+Math.imul(x,mt)|0)+(i>>>13)|0)+($t>>>26)|0,$t&=67108863,n=Math.imul(C,ot),i=(i=Math.imul(C,st))+Math.imul(L,ot)|0,o=Math.imul(L,st),n=n+Math.imul(I,ut)|0,i=(i=i+Math.imul(I,ht)|0)+Math.imul(N,ut)|0,o=o+Math.imul(N,ht)|0,n=n+Math.imul(R,ct)|0,i=(i=i+Math.imul(R,lt)|0)+Math.imul(B,ct)|0,o=o+Math.imul(B,lt)|0;var Pt=(h+(n=n+Math.imul(j,pt)|0)|0)+((8191&(i=(i=i+Math.imul(j,mt)|0)+Math.imul($,pt)|0))<<13)|0;h=((o=o+Math.imul($,mt)|0)+(i>>>13)|0)+(Pt>>>26)|0,Pt&=67108863,n=Math.imul(C,ut),i=(i=Math.imul(C,ht))+Math.imul(L,ut)|0,o=Math.imul(L,ht),n=n+Math.imul(I,ct)|0,i=(i=i+Math.imul(I,lt)|0)+Math.imul(N,ct)|0,o=o+Math.imul(N,lt)|0;var Rt=(h+(n=n+Math.imul(R,pt)|0)|0)+((8191&(i=(i=i+Math.imul(R,mt)|0)+Math.imul(B,pt)|0))<<13)|0;h=((o=o+Math.imul(B,mt)|0)+(i>>>13)|0)+(Rt>>>26)|0,Rt&=67108863,n=Math.imul(C,ct),i=(i=Math.imul(C,lt))+Math.imul(L,ct)|0,o=Math.imul(L,lt);var Bt=(h+(n=n+Math.imul(I,pt)|0)|0)+((8191&(i=(i=i+Math.imul(I,mt)|0)+Math.imul(N,pt)|0))<<13)|0;h=((o=o+Math.imul(N,mt)|0)+(i>>>13)|0)+(Bt>>>26)|0,Bt&=67108863;var Tt=(h+(n=Math.imul(C,pt))|0)+((8191&(i=(i=Math.imul(C,mt))+Math.imul(L,pt)|0))<<13)|0;return h=((o=Math.imul(L,mt))+(i>>>13)|0)+(Tt>>>26)|0,Tt&=67108863,u[0]=yt,u[1]=vt,u[2]=bt,u[3]=gt,u[4]=wt,u[5]=_t,u[6]=Mt,u[7]=St,u[8]=Ot,u[9]=At,u[10]=Et,u[11]=xt,u[12]=kt,u[13]=jt,u[14]=$t,u[15]=Pt,u[16]=Rt,u[17]=Bt,u[18]=Tt,0!==h&&(u[19]=h,r.length++),r};function y(t,e,r){return(new v).mulp(t,e,r)}function v(t,e){this.x=t,this.y=e}Math.imul||(m=p),s.prototype.mulTo=function(t,e){var r=this.length+t.length;return 10===this.length&&10===t.length?m(this,t,e):r<63?p(this,t,e):r<1024?function(t,e,r){r.negative=e.negative^t.negative,r.length=t.length+e.length;for(var n=0,i=0,o=0;o>>26)|0)>>>26,s&=67108863}r.words[o]=a,n=s,s=i}return 0!==n?r.words[o]=n:r.length--,r.strip()}(this,t,e):y(this,t,e)},v.prototype.makeRBT=function(t){for(var e=new Array(t),r=s.prototype._countBits(t)-1,n=0;n>=1;return n},v.prototype.permute=function(t,e,r,n,i,o){for(var s=0;s>>=1)i++;return 1<>>=13,r[2*s+1]=8191&o,o>>>=13;for(s=2*e;s>=26,e+=n/67108864|0,e+=o>>>26,this.words[r]=67108863&o}return 0!==e&&(this.words[r]=e,this.length++),this},s.prototype.muln=function(t){return this.clone().imuln(t)},s.prototype.sqr=function(){return this.mul(this)},s.prototype.isqr=function(){return this.imul(this.clone())},s.prototype.pow=function(t){var e=function(t){for(var e=new Array(t.bitLength()),r=0;r>>i}return e}(t);if(0===e.length)return new s(1);for(var r=this,n=0;n=0);var e,r=t%26,n=(t-r)/26,o=67108863>>>26-r<<26-r;if(0!==r){var s=0;for(e=0;e>>26-r}s&&(this.words[e]=s,this.length++)}if(0!==n){for(e=this.length-1;e>=0;e--)this.words[e+n]=this.words[e];for(e=0;e=0),n=e?(e-e%26)/26:0;var o=t%26,s=Math.min((t-o)/26,this.length),a=67108863^67108863>>>o<s)for(this.length-=s,h=0;h=0&&(0!==f||h>=n);h--){var c=0|this.words[h];this.words[h]=f<<26-o|c>>>o,f=c&a}return u&&0!==f&&(u.words[u.length++]=f),0===this.length&&(this.words[0]=0,this.length=1),this.strip()},s.prototype.ishrn=function(t,e,r){return i(0===this.negative),this.iushrn(t,e,r)},s.prototype.shln=function(t){return this.clone().ishln(t)},s.prototype.ushln=function(t){return this.clone().iushln(t)},s.prototype.shrn=function(t){return this.clone().ishrn(t)},s.prototype.ushrn=function(t){return this.clone().iushrn(t)},s.prototype.testn=function(t){i("number"==typeof t&&t>=0);var e=t%26,r=(t-e)/26,n=1<=0);var e=t%26,r=(t-e)/26;if(i(0===this.negative,"imaskn works only 
with positive numbers"),this.length<=r)return this;if(0!==e&&r++,this.length=Math.min(r,this.length),0!==e){var n=67108863^67108863>>>e<=67108864;e++)this.words[e]-=67108864,e===this.length-1?this.words[e+1]=1:this.words[e+1]++;return this.length=Math.max(this.length,e+1),this},s.prototype.isubn=function(t){if(i("number"==typeof t),i(t<67108864),t<0)return this.iaddn(-t);if(0!==this.negative)return this.negative=0,this.iaddn(t),this.negative=1,this;if(this.words[0]-=t,1===this.length&&this.words[0]<0)this.words[0]=-this.words[0],this.negative=1;else for(var e=0;e>26)-(u/67108864|0),this.words[n+r]=67108863&o}for(;n>26,this.words[n+r]=67108863&o;if(0===a)return this.strip();for(i(-1===a),a=0,n=0;n>26,this.words[n]=67108863&o;return this.negative=1,this.strip()},s.prototype._wordDiv=function(t,e){var r=(this.length,t.length),n=this.clone(),i=t,o=0|i.words[i.length-1];0!==(r=26-this._countBits(o))&&(i=i.ushln(r),n.iushln(r),o=0|i.words[i.length-1]);var a,u=n.length-i.length;if("mod"!==e){(a=new s(null)).length=u+1,a.words=new Array(a.length);for(var h=0;h=0;c--){var l=67108864*(0|n.words[i.length+c])+(0|n.words[i.length+c-1]);for(l=Math.min(l/o|0,67108863),n._ishlnsubmul(i,l,c);0!==n.negative;)l--,n.negative=0,n._ishlnsubmul(i,1,c),n.isZero()||(n.negative^=1);a&&(a.words[c]=l)}return a&&a.strip(),n.strip(),"div"!==e&&0!==r&&n.iushrn(r),{div:a||null,mod:n}},s.prototype.divmod=function(t,e,r){return i(!t.isZero()),this.isZero()?{div:new s(0),mod:new s(0)}:0!==this.negative&&0===t.negative?(a=this.neg().divmod(t,e),"mod"!==e&&(n=a.div.neg()),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.iadd(t)),{div:n,mod:o}):0===this.negative&&0!==t.negative?(a=this.divmod(t.neg(),e),"mod"!==e&&(n=a.div.neg()),{div:n,mod:a.mod}):0!=(this.negative&t.negative)?(a=this.neg().divmod(t.neg(),e),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.isub(t)),{div:a.div,mod:o}):t.length>this.length||this.cmp(t)<0?{div:new s(0),mod:this}:1===t.length?"div"===e?{div:this.divn(t.words[0]),mod:null}:"mod"===e?{div:null,mod:new s(this.modn(t.words[0]))}:{div:this.divn(t.words[0]),mod:new s(this.modn(t.words[0]))}:this._wordDiv(t,e);var n,o,a},s.prototype.div=function(t){return this.divmod(t,"div",!1).div},s.prototype.mod=function(t){return this.divmod(t,"mod",!1).mod},s.prototype.umod=function(t){return this.divmod(t,"mod",!0).mod},s.prototype.divRound=function(t){var e=this.divmod(t);if(e.mod.isZero())return e.div;var r=0!==e.div.negative?e.mod.isub(t):e.mod,n=t.ushrn(1),i=t.andln(1),o=r.cmp(n);return o<0||1===i&&0===o?e.div:0!==e.div.negative?e.div.isubn(1):e.div.iaddn(1)},s.prototype.modn=function(t){i(t<=67108863);for(var e=(1<<26)%t,r=0,n=this.length-1;n>=0;n--)r=(e*r+(0|this.words[n]))%t;return r},s.prototype.idivn=function(t){i(t<=67108863);for(var e=0,r=this.length-1;r>=0;r--){var n=(0|this.words[r])+67108864*e;this.words[r]=n/t|0,e=n%t}return this.strip()},s.prototype.divn=function(t){return this.clone().idivn(t)},s.prototype.egcd=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n=new s(1),o=new s(0),a=new s(0),u=new s(1),h=0;e.isEven()&&r.isEven();)e.iushrn(1),r.iushrn(1),++h;for(var f=r.clone(),c=e.clone();!e.isZero();){for(var l=0,d=1;0==(e.words[0]&d)&&l<26;++l,d<<=1);if(l>0)for(e.iushrn(l);l-- >0;)(n.isOdd()||o.isOdd())&&(n.iadd(f),o.isub(c)),n.iushrn(1),o.iushrn(1);for(var p=0,m=1;0==(r.words[0]&m)&&p<26;++p,m<<=1);if(p>0)for(r.iushrn(p);p-- 
>0;)(a.isOdd()||u.isOdd())&&(a.iadd(f),u.isub(c)),a.iushrn(1),u.iushrn(1);e.cmp(r)>=0?(e.isub(r),n.isub(a),o.isub(u)):(r.isub(e),a.isub(n),u.isub(o))}return{a:a,b:u,gcd:r.iushln(h)}},s.prototype._invmp=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n,o=new s(1),a=new s(0),u=r.clone();e.cmpn(1)>0&&r.cmpn(1)>0;){for(var h=0,f=1;0==(e.words[0]&f)&&h<26;++h,f<<=1);if(h>0)for(e.iushrn(h);h-- >0;)o.isOdd()&&o.iadd(u),o.iushrn(1);for(var c=0,l=1;0==(r.words[0]&l)&&c<26;++c,l<<=1);if(c>0)for(r.iushrn(c);c-- >0;)a.isOdd()&&a.iadd(u),a.iushrn(1);e.cmp(r)>=0?(e.isub(r),o.isub(a)):(r.isub(e),a.isub(o))}return(n=0===e.cmpn(1)?o:a).cmpn(0)<0&&n.iadd(t),n},s.prototype.gcd=function(t){if(this.isZero())return t.abs();if(t.isZero())return this.abs();var e=this.clone(),r=t.clone();e.negative=0,r.negative=0;for(var n=0;e.isEven()&&r.isEven();n++)e.iushrn(1),r.iushrn(1);for(;;){for(;e.isEven();)e.iushrn(1);for(;r.isEven();)r.iushrn(1);var i=e.cmp(r);if(i<0){var o=e;e=r,r=o}else if(0===i||0===r.cmpn(1))break;e.isub(r)}return r.iushln(n)},s.prototype.invm=function(t){return this.egcd(t).a.umod(t)},s.prototype.isEven=function(){return 0==(1&this.words[0])},s.prototype.isOdd=function(){return 1==(1&this.words[0])},s.prototype.andln=function(t){return this.words[0]&t},s.prototype.bincn=function(t){i("number"==typeof t);var e=t%26,r=(t-e)/26,n=1<>>26,a&=67108863,this.words[s]=a}return 0!==o&&(this.words[s]=o,this.length++),this},s.prototype.isZero=function(){return 1===this.length&&0===this.words[0]},s.prototype.cmpn=function(t){var e,r=t<0;if(0!==this.negative&&!r)return-1;if(0===this.negative&&r)return 1;if(this.strip(),this.length>1)e=1;else{r&&(t=-t),i(t<=67108863,"Number is too big");var n=0|this.words[0];e=n===t?0:nt.length)return 1;if(this.length=0;r--){var n=0|this.words[r],i=0|t.words[r];if(n!==i){ni&&(e=1);break}}return e},s.prototype.gtn=function(t){return 1===this.cmpn(t)},s.prototype.gt=function(t){return 1===this.cmp(t)},s.prototype.gten=function(t){return this.cmpn(t)>=0},s.prototype.gte=function(t){return this.cmp(t)>=0},s.prototype.ltn=function(t){return-1===this.cmpn(t)},s.prototype.lt=function(t){return-1===this.cmp(t)},s.prototype.lten=function(t){return this.cmpn(t)<=0},s.prototype.lte=function(t){return this.cmp(t)<=0},s.prototype.eqn=function(t){return 0===this.cmpn(t)},s.prototype.eq=function(t){return 0===this.cmp(t)},s.red=function(t){return new O(t)},s.prototype.toRed=function(t){return i(!this.red,"Already a number in reduction context"),i(0===this.negative,"red works only with positives"),t.convertTo(this)._forceRed(t)},s.prototype.fromRed=function(){return i(this.red,"fromRed works only with numbers in reduction context"),this.red.convertFrom(this)},s.prototype._forceRed=function(t){return this.red=t,this},s.prototype.forceRed=function(t){return i(!this.red,"Already a number in reduction context"),this._forceRed(t)},s.prototype.redAdd=function(t){return i(this.red,"redAdd works only with red numbers"),this.red.add(this,t)},s.prototype.redIAdd=function(t){return i(this.red,"redIAdd works only with red numbers"),this.red.iadd(this,t)},s.prototype.redSub=function(t){return i(this.red,"redSub works only with red numbers"),this.red.sub(this,t)},s.prototype.redISub=function(t){return i(this.red,"redISub works only with red numbers"),this.red.isub(this,t)},s.prototype.redShl=function(t){return i(this.red,"redShl works only with red numbers"),this.red.shl(this,t)},s.prototype.redMul=function(t){return i(this.red,"redMul works 
only with red numbers"),this.red._verify2(this,t),this.red.mul(this,t)},s.prototype.redIMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.imul(this,t)},s.prototype.redSqr=function(){return i(this.red,"redSqr works only with red numbers"),this.red._verify1(this),this.red.sqr(this)},s.prototype.redISqr=function(){return i(this.red,"redISqr works only with red numbers"),this.red._verify1(this),this.red.isqr(this)},s.prototype.redSqrt=function(){return i(this.red,"redSqrt works only with red numbers"),this.red._verify1(this),this.red.sqrt(this)},s.prototype.redInvm=function(){return i(this.red,"redInvm works only with red numbers"),this.red._verify1(this),this.red.invm(this)},s.prototype.redNeg=function(){return i(this.red,"redNeg works only with red numbers"),this.red._verify1(this),this.red.neg(this)},s.prototype.redPow=function(t){return i(this.red&&!t.red,"redPow(normalNum)"),this.red._verify1(this),this.red.pow(this,t)};var b={k256:null,p224:null,p192:null,p25519:null};function g(t,e){this.name=t,this.p=new s(e,16),this.n=this.p.bitLength(),this.k=new s(1).iushln(this.n).isub(this.p),this.tmp=this._tmp()}function w(){g.call(this,"k256","ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe fffffc2f")}function _(){g.call(this,"p224","ffffffff ffffffff ffffffff ffffffff 00000000 00000000 00000001")}function M(){g.call(this,"p192","ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff")}function S(){g.call(this,"25519","7fffffffffffffff ffffffffffffffff ffffffffffffffff ffffffffffffffed")}function O(t){if("string"==typeof t){var e=s._prime(t);this.m=e.p,this.prime=e}else i(t.gtn(1),"modulus must be greater than 1"),this.m=t,this.prime=null}function A(t){O.call(this,t),this.shift=this.m.bitLength(),this.shift%26!=0&&(this.shift+=26-this.shift%26),this.r=new s(1).iushln(this.shift),this.r2=this.imod(this.r.sqr()),this.rinv=this.r._invmp(this.m),this.minv=this.rinv.mul(this.r).isubn(1).div(this.m),this.minv=this.minv.umod(this.r),this.minv=this.r.sub(this.minv)}g.prototype._tmp=function(){var t=new s(null);return t.words=new Array(Math.ceil(this.n/13)),t},g.prototype.ireduce=function(t){var e,r=t;do{this.split(r,this.tmp),e=(r=(r=this.imulK(r)).iadd(this.tmp)).bitLength()}while(e>this.n);var n=e0?r.isub(this.p):void 0!==r.strip?r.strip():r._strip(),r},g.prototype.split=function(t,e){t.iushrn(this.n,0,e)},g.prototype.imulK=function(t){return t.imul(this.k)},o(w,g),w.prototype.split=function(t,e){for(var r=Math.min(t.length,9),n=0;n>>22,i=o}i>>>=22,t.words[n-10]=i,0===i&&t.length>10?t.length-=10:t.length-=9},w.prototype.imulK=function(t){t.words[t.length]=0,t.words[t.length+1]=0,t.length+=2;for(var e=0,r=0;r>>=26,t.words[r]=i,e=n}return 0!==e&&(t.words[t.length++]=e),t},s._prime=function(t){if(b[t])return b[t];var e;if("k256"===t)e=new w;else if("p224"===t)e=new _;else if("p192"===t)e=new M;else{if("p25519"!==t)throw new Error("Unknown prime "+t);e=new S}return b[t]=e,e},O.prototype._verify1=function(t){i(0===t.negative,"red works only with positives"),i(t.red,"red works only with red numbers")},O.prototype._verify2=function(t,e){i(0==(t.negative|e.negative),"red works only with positives"),i(t.red&&t.red===e.red,"red works only with red numbers")},O.prototype.imod=function(t){return this.prime?this.prime.ireduce(t)._forceRed(this):t.umod(this.m)._forceRed(this)},O.prototype.neg=function(t){return t.isZero()?t.clone():this.m.sub(t)._forceRed(this)},O.prototype.add=function(t,e){this._verify2(t,e);var r=t.add(e);return 
r.cmp(this.m)>=0&&r.isub(this.m),r._forceRed(this)},O.prototype.iadd=function(t,e){this._verify2(t,e);var r=t.iadd(e);return r.cmp(this.m)>=0&&r.isub(this.m),r},O.prototype.sub=function(t,e){this._verify2(t,e);var r=t.sub(e);return r.cmpn(0)<0&&r.iadd(this.m),r._forceRed(this)},O.prototype.isub=function(t,e){this._verify2(t,e);var r=t.isub(e);return r.cmpn(0)<0&&r.iadd(this.m),r},O.prototype.shl=function(t,e){return this._verify1(t),this.imod(t.ushln(e))},O.prototype.imul=function(t,e){return this._verify2(t,e),this.imod(t.imul(e))},O.prototype.mul=function(t,e){return this._verify2(t,e),this.imod(t.mul(e))},O.prototype.isqr=function(t){return this.imul(t,t.clone())},O.prototype.sqr=function(t){return this.mul(t,t)},O.prototype.sqrt=function(t){if(t.isZero())return t.clone();var e=this.m.andln(3);if(i(e%2==1),3===e){var r=this.m.add(new s(1)).iushrn(2);return this.pow(t,r)}for(var n=this.m.subn(1),o=0;!n.isZero()&&0===n.andln(1);)o++,n.iushrn(1);i(!n.isZero());var a=new s(1).toRed(this),u=a.redNeg(),h=this.m.subn(1).iushrn(1),f=this.m.bitLength();for(f=new s(2*f*f).toRed(this);0!==this.pow(f,h).cmp(u);)f.redIAdd(u);for(var c=this.pow(f,n),l=this.pow(t,n.addn(1).iushrn(1)),d=this.pow(t,n),p=o;0!==d.cmp(a);){for(var m=d,y=0;0!==m.cmp(a);y++)m=m.redSqr();i(y=0;n--){for(var h=e.words[n],f=u-1;f>=0;f--){var c=h>>f&1;i!==r[0]&&(i=this.sqr(i)),0!==c||0!==o?(o<<=1,o|=c,(4===++a||0===n&&0===f)&&(i=this.mul(i,r[o]),a=0,o=0)):a=0}u=26}return i},O.prototype.convertTo=function(t){var e=t.umod(this.m);return e===t?e.clone():e},O.prototype.convertFrom=function(t){var e=t.clone();return e.red=null,e},s.mont=function(t){return new A(t)},o(A,O),A.prototype.convertTo=function(t){return this.imod(t.ushln(this.shift))},A.prototype.convertFrom=function(t){var e=this.imod(t.mul(this.rinv));return e.red=null,e},A.prototype.imul=function(t,e){if(t.isZero()||e.isZero())return t.words[0]=0,t.length=1,t;var r=t.imul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.mul=function(t,e){if(t.isZero()||e.isZero())return new s(0)._forceRed(this);var r=t.mul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.invm=function(t){return this.imod(t._invmp(this.m).mul(this.r2))._forceRed(this)}}(t,this)}).call(this,r(22)(t))},function(t,e,r){"use strict";var n=e;n.der=r(136),n.pem=r(256)},function(t,e,r){"use strict";var n=r(0),i=r(76).Buffer,o=r(77),s=r(79);function a(t){this.enc="der",this.name=t.name,this.entity=t,this.tree=new u,this.tree._init(t.body)}function u(t){o.call(this,"der",t)}function h(t){return t<10?"0"+t:t}t.exports=a,a.prototype.encode=function(t,e){return this.tree._encode(t,e).join()},n(u,o),u.prototype._encodeComposite=function(t,e,r,n){var o=function(t,e,r,n){var i;"seqof"===t?t="seq":"setof"===t&&(t="set");if(s.tagByName.hasOwnProperty(t))i=s.tagByName[t];else{if("number"!=typeof t||(0|t)!==t)return n.error("Unknown tag: "+t);i=t}if(i>=31)return n.error("Multi-octet tag encoding unsupported");e||(i|=32);return i|=s.tagClassByName[r||"universal"]<<6}(t,e,r,this.reporter);if(n.length<128){var a=i.alloc(2);return a[0]=o,a[1]=n.length,this._createEncoderBuffer([a,n])}for(var u=1,h=n.length;h>=256;h>>=8)u++;var f=i.alloc(2+u);f[0]=o,f[1]=128|u;for(var 
c=1+u,l=n.length;l>0;c--,l>>=8)f[c]=255&l;return this._createEncoderBuffer([f,n])},u.prototype._encodeStr=function(t,e){if("bitstr"===e)return this._createEncoderBuffer([0|t.unused,t.data]);if("bmpstr"===e){for(var r=i.alloc(2*t.length),n=0;n=40)return this.reporter.error("Second objid identifier OOB");t.splice(0,2,40*t[0]+t[1])}for(var s=0,a=0;a=128;u>>=7)s++}for(var h=i.alloc(s),f=h.length-1,c=t.length-1;c>=0;c--){var l=t[c];for(h[f--]=127&l;(l>>=7)>0;)h[f--]=128|127&l}return this._createEncoderBuffer(h)},u.prototype._encodeTime=function(t,e){var r,n=new Date(t);return"gentime"===e?r=[h(n.getUTCFullYear()),h(n.getUTCMonth()+1),h(n.getUTCDate()),h(n.getUTCHours()),h(n.getUTCMinutes()),h(n.getUTCSeconds()),"Z"].join(""):"utctime"===e?r=[h(n.getUTCFullYear()%100),h(n.getUTCMonth()+1),h(n.getUTCDate()),h(n.getUTCHours()),h(n.getUTCMinutes()),h(n.getUTCSeconds()),"Z"].join(""):this.reporter.error("Encoding "+e+" time is not supported yet"),this._encodeStr(r,"octstr")},u.prototype._encodeNull=function(){return this._createEncoderBuffer("")},u.prototype._encodeInt=function(t,e){if("string"==typeof t){if(!e)return this.reporter.error("String int or enum given, but no values map");if(!e.hasOwnProperty(t))return this.reporter.error("Values map doesn't contain: "+JSON.stringify(t));t=e[t]}if("number"!=typeof t&&!i.isBuffer(t)){var r=t.toArray();!t.sign&&128&r[0]&&r.unshift(0),t=i.from(r)}if(i.isBuffer(t)){var n=t.length;0===t.length&&n++;var o=i.alloc(n);return t.copy(o),0===t.length&&(o[0]=0),this._createEncoderBuffer(o)}if(t<128)return this._createEncoderBuffer(t);if(t<256)return this._createEncoderBuffer([0,t]);for(var s=1,a=t;a>=256;a>>=8)s++;for(var u=new Array(s),h=u.length-1;h>=0;h--)u[h]=255&t,t>>=8;return 128&u[0]&&u.unshift(0),this._createEncoderBuffer(i.from(u))},u.prototype._encodeBool=function(t){return this._createEncoderBuffer(t?255:0)},u.prototype._use=function(t,e){return"function"==typeof t&&(t=t(e)),t._getEncoder("der").tree},u.prototype._skipDefault=function(t,e,r){var n,i=this._baseState;if(null===i.default)return!1;var o=t.join();if(void 0===i.defaultBuffer&&(i.defaultBuffer=this._encodeValue(i.default,e,r).join()),o.length!==i.defaultBuffer.length)return!1;for(n=0;n>6],i=0==(32&r);if(31==(31&r)){var o=r;for(r=0;128==(128&o);){if(o=t.readUInt8(e),t.isError(o))return o;r<<=7,r|=127&o}}else r&=31;return{cls:n,primitive:i,tag:r,tagStr:a.tag[r]}}function c(t,e,r){var n=t.readUInt8(r);if(t.isError(n))return n;if(!e&&128===n)return null;if(0==(128&n))return n;var i=127&n;if(i>4)return t.error("length octect is too long");n=0;for(var o=0;o=1.5*r;return Math.round(t/r)+" "+n+(i?"s":"")}t.exports=function(t,e){e=e||{};var u=r(t);if("string"===u&&t.length>0)return function(t){if((t=String(t)).length>100)return;var e=/^(-?(?:\d+)?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|weeks?|w|years?|yrs?|y)?$/i.exec(t);if(!e)return;var r=parseFloat(e[1]);switch((e[2]||"ms").toLowerCase()){case"years":case"year":case"yrs":case"yr":case"y":return 315576e5*r;case"weeks":case"week":case"w":return 6048e5*r;case"days":case"day":case"d":return r*s;case"hours":case"hour":case"hrs":case"hr":case"h":return r*o;case"minutes":case"minute":case"mins":case"min":case"m":return r*i;case"seconds":case"second":case"secs":case"sec":case"s":return r*n;case"milliseconds":case"millisecond":case"msecs":case"msec":case"ms":return r;default:return}}(t);if("number"===u&&isFinite(t))return e.long?function(t){var e=Math.abs(t);if(e>=s)return a(t,e,s,"day");if(e>=o)return 
a(t,e,o,"hour");if(e>=i)return a(t,e,i,"minute");if(e>=n)return a(t,e,n,"second");return t+" ms"}(t):function(t){var e=Math.abs(t);if(e>=s)return Math.round(t/s)+"d";if(e>=o)return Math.round(t/o)+"h";if(e>=i)return Math.round(t/i)+"m";if(e>=n)return Math.round(t/n)+"s";return t+"ms"}(t);throw new Error("val is not a non-empty string or a valid number. val="+JSON.stringify(t))}},function(t,e){t.exports=function(t,e,r){var n=[],i=t.length;if(0===i)return n;var o=e<0?Math.max(0,e+i):e||0;for(void 0!==r&&(i=r<0?r+i:r);i-- >o;)n[i-o]=t[i];return n}},function(t,e){function r(t){return(r="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var n=Object.prototype.toString;t.exports=function(t){if("object"!=r(e=t)||"[object RegExp]"!=n.call(e))throw new TypeError("Not a RegExp");var e,i=[];t.global&&i.push("g"),t.multiline&&i.push("m"),t.ignoreCase&&i.push("i"),t.dotAll&&i.push("s"),t.unicode&&i.push("u"),t.sticky&&i.push("y");var o=new RegExp(t.source,i.join(""));return"number"==typeof t.lastIndex&&(o.lastIndex=t.lastIndex),o}},function(t,e,r){"use strict";t.exports=function(t){return t.name?t.name:(t.toString().trim().match(/^function\s*([^\s(]+)/)||[])[1]}},function(t,e,r){"use strict";var n=r(6); +/*! + * Get the bson type, if it exists + */t.exports=function(t,e){return n(t,"_bsontype",void 0)===e}},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i=Symbol("mongoose#trustedSymbol");e.trustedSymbol=i,e.trusted=function(t){return null==t||"object"!==n(t)||(t[i]=!0),t}},function(t,e,r){"use strict";(function(e){ +/*! + * ignore + */ +var n=r(33),i=r(284),o={_promise:null,get:function(){return o._promise},set:function(t){n.ok("function"==typeof t,"mongoose.Promise must be a function, got ".concat(t)),o._promise=t,i.Promise=t}}; +/*! + * Use native promises by default + */ +o.set(e.Promise),t.exports=o}).call(this,r(7))},function(t,e,r){"use strict";(function(t,n,i){ +/*! + * Module dependencies. + */ +function o(t){return(o="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var s=r(145),a=["__proto__","constructor","prototype"],u=e.clone=function t(r,n){if(null==r)return r;if(Array.isArray(r))return e.cloneArray(r,n);if(r.constructor){if(/ObjectI[dD]$/.test(r.constructor.name))return"function"==typeof r.clone?r.clone():new r.constructor(r.id);if("ReadPreference"===r.constructor.name)return new r.constructor(r.mode,t(r.tags,n));if("Binary"==r._bsontype&&r.buffer&&r.value)return"function"==typeof r.clone?r.clone():new r.constructor(r.value(!0),r.sub_type);if("Date"===r.constructor.name||"Function"===r.constructor.name)return new r.constructor(+r);if("RegExp"===r.constructor.name)return s(r);if("Buffer"===r.constructor.name)return e.cloneBuffer(r)}return f(r)?e.cloneObject(r,n):r.valueOf?r.valueOf():void 0}; +/*! 
+ * ignore + */ +e.cloneObject=function(t,e){for(var r,n,i,o=e&&e.minimize,s={},h=Object.keys(t),f=0;f1)throw new Error("Adding properties is not supported");function e(){}return e.prototype=t,new e},e.inherits=function(t,r){t.prototype=e.create(r.prototype),t.prototype.constructor=t};var c=e.soon="function"==typeof t?t:n.nextTick;e.cloneBuffer=function(t){var e=i.alloc(t.length);return t.copy(e,0,0,t.length),e},e.isArgumentsObject=function(t){return"[object Arguments]"===Object.prototype.toString.call(t)}}).call(this,r(108).setImmediate,r(5),r(3).Buffer)},function(t,e,r){"use strict";(function(t,r,n,i){function o(t){return(o="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}e.isNode=void 0!==t&&"object"==o(r)&&"object"==(void 0===n?"undefined":o(n))&&"function"==typeof i&&t.argv,e.isMongo=!e.isNode&&"function"==typeof printjson&&"function"==typeof ObjectId&&"function"==typeof rs&&"function"==typeof sh,e.isBrowser=!e.isNode&&!e.isMongo&&"undefined"!=typeof window,e.type=e.isNode?"node":e.isMongo?"mongo":e.isBrowser?"browser":"unknown"}).call(this,r(5),r(22)(t),r(7),r(3).Buffer)},function(t,e,r){"use strict";t.exports=function(t,e,r){for(var n={},i=0,o=Object.keys(e.tree);i=t},message:r,type:"min",min:t})}return this},f.prototype.max=function(t,e){if(this.maxValidator&&(this.validators=this.validators.filter((function(t){return t.validator!==this.maxValidator}),this)),null!=t){var r=e||n.messages.Number.max;r=r.replace(/{MAX}/,t),this.validators.push({validator:this.maxValidator=function(e){return null==e||e<=t},message:r,type:"max",max:t})}return this},f.prototype.enum=function(t,e){if(this.enumValidator&&(this.validators=this.validators.filter((function(t){return t.validator!==this.enumValidator}),this)),!Array.isArray(t)){var r=u.isPOJO(t)&&null!=t.values;r?(e=t.message,t=t.values):"number"==typeof t&&(t=Array.prototype.slice.call(arguments),e=null),u.isPOJO(t)&&(t=Object.values(t)),e=e||n.messages.Number.enum}return e=null==e?n.messages.Number.enum:e,this.enumValidator=function(e){return null==e||-1!==t.indexOf(e)},this.validators.push({validator:this.enumValidator,message:e,type:"enum",enumValues:t}),this},f.prototype.cast=function(t,e,r){if(o._isRef(this,t,e,r))return"number"==typeof t?t:this._castRef(t,e,r);var n,i=t&&void 0!==t._id?t._id:t;n="function"==typeof this._castFunction?this._castFunction:"function"==typeof this.constructor.cast?this.constructor.cast():f.cast();try{return n(i)}catch(t){throw new h("Number",i,this.path,t,this)}},f.prototype.$conditionalHandlers=u.options(o.prototype.$conditionalHandlers,{$bitsAllClear:a,$bitsAnyClear:a,$bitsAllSet:a,$bitsAnySet:a,$gt:c,$gte:c,$lt:c,$lte:c,$mod:function(t){var e=this;return Array.isArray(t)?t.map((function(t){return e.cast(t)})):[this.cast(t)]}}),f.prototype.castForQuery=function(t,e){var r;if(2===arguments.length){if(!(r=this.$conditionalHandlers[t]))throw new h("number",e,this.path,null,this);return r.call(this,e)}return e=this._castForQuery(t)}, +/*! + * Module exports. + */ +t.exports=f},function(t,e,r){"use strict";(function(e){ +/*! + * Module requirements. + */ +var n=r(20); +/*! + * ignore + */ +/*! 
+ * ignore + */ +function i(t,e){var r=Number(e);if(isNaN(r))throw new n("number",e,t);return r}t.exports=function(t){var r=this;return Array.isArray(t)?t.map((function(t){return i(r.path,t)})):e.isBuffer(t)?t:i(r.path,t)}}).call(this,r(3).Buffer)},function(t,e,r){"use strict";var n=r(164); +/*! +* returns discriminator by discriminatorMapping.value +* +* @param {Schema} schema +* @param {string} value +*/t.exports=function(t,e){if(null==t||null==t.discriminators)return null;for(var r=0,i=Object.keys(t.discriminators);r0&&this._markModified(),t},copy:function(t){var r=e.prototype.copy.apply(this,arguments);return t&&t.isMongooseBuffer&&t._markModified(),r}}, +/*! + * Compile other Buffer methods marking this buffer as modified. + */ +"writeUInt8 writeUInt16 writeUInt32 writeInt8 writeInt16 writeInt32 writeFloat writeDouble fill utf8Write binaryWrite asciiWrite set writeUInt16LE writeUInt16BE writeUInt32LE writeUInt32BE writeInt16LE writeInt16BE writeInt32LE writeInt32BE writeFloatLE writeFloatBE writeDoubleLE writeDoubleBE".split(" ").forEach((function(t){e.prototype[t]&&(o.mixin[t]=function(){var r=e.prototype[t].apply(this,arguments);return this._markModified(),r})})),o.mixin.toObject=function(t){var r="number"==typeof t?t:this._subtype||0;return new n(e.from(this),r)},o.mixin.$toObject=o.mixin.toObject,o.mixin.toBSON=function(){return new n(this,this._subtype||0)},o.mixin.equals=function(t){if(!e.isBuffer(t))return!1;if(this.length!==t.length)return!1;for(var r=0;r=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:i}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function o(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0?s-4:s;for(r=0;r>16&255,u[f++]=e>>8&255,u[f++]=255&e;2===a&&(e=i[t.charCodeAt(r)]<<2|i[t.charCodeAt(r+1)]>>4,u[f++]=255&e);1===a&&(e=i[t.charCodeAt(r)]<<10|i[t.charCodeAt(r+1)]<<4|i[t.charCodeAt(r+2)]>>2,u[f++]=e>>8&255,u[f++]=255&e);return u},e.fromByteArray=function(t){for(var e,r=t.length,i=r%3,o=[],s=0,a=r-i;sa?a:s+16383));1===i?(e=t[r-1],o.push(n[e>>2]+n[e<<4&63]+"==")):2===i&&(e=(t[r-2]<<8)+t[r-1],o.push(n[e>>10]+n[e>>4&63]+n[e<<2&63]+"="));return o.join("")};for(var n=[],i=[],o="undefined"!=typeof Uint8Array?Uint8Array:Array,s="ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/",a=0,u=s.length;a0)throw new Error("Invalid string. Length must be a multiple of 4");var r=t.indexOf("=");return-1===r&&(r=e),[r,r===e?0:4-r%4]}function f(t,e,r){for(var i,o,s=[],a=e;a>18&63]+n[o>>12&63]+n[o>>6&63]+n[63&o]);return s.join("")}i["-".charCodeAt(0)]=62,i["_".charCodeAt(0)]=63},function(t,e){ +/*! ieee754. BSD-3-Clause License. 
Feross Aboukhadijeh */ +e.read=function(t,e,r,n,i){var o,s,a=8*i-n-1,u=(1<>1,f=-7,c=r?i-1:0,l=r?-1:1,d=t[e+c];for(c+=l,o=d&(1<<-f)-1,d>>=-f,f+=a;f>0;o=256*o+t[e+c],c+=l,f-=8);for(s=o&(1<<-f)-1,o>>=-f,f+=n;f>0;s=256*s+t[e+c],c+=l,f-=8);if(0===o)o=1-h;else{if(o===u)return s?NaN:1/0*(d?-1:1);s+=Math.pow(2,n),o-=h}return(d?-1:1)*s*Math.pow(2,o-n)},e.write=function(t,e,r,n,i,o){var s,a,u,h=8*o-i-1,f=(1<>1,l=23===i?Math.pow(2,-24)-Math.pow(2,-77):0,d=n?0:o-1,p=n?1:-1,m=e<0||0===e&&1/e<0?1:0;for(e=Math.abs(e),isNaN(e)||e===1/0?(a=isNaN(e)?1:0,s=f):(s=Math.floor(Math.log(e)/Math.LN2),e*(u=Math.pow(2,-s))<1&&(s--,u*=2),(e+=s+c>=1?l/u:l*Math.pow(2,1-c))*u>=2&&(s++,u/=2),s+c>=f?(a=0,s=f):s+c>=1?(a=(e*u-1)*Math.pow(2,i),s+=c):(a=e*Math.pow(2,c-1)*Math.pow(2,i),s=0));i>=8;t[r+d]=255&a,d+=p,a/=256,i-=8);for(s=s<0;t[r+d]=255&s,d+=p,s/=256,h-=8);t[r+d-p]|=128*m}},function(t,e,r){"use strict"; +/*! + * Module exports. + */e.Binary=r(181),e.Collection=function(){throw new Error("Cannot create a collection from browser library")},e.getConnection=function(){return function(){throw new Error("Cannot create a connection from browser library")}},e.Decimal128=r(272),e.ObjectId=r(273),e.ReadPreference=r(274)},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n=r(59).Binary; +/*! + * Module exports. + */t.exports=n},function(t,e,r){"use strict";e.randomBytes=e.rng=e.pseudoRandomBytes=e.prng=r(27),e.createHash=e.Hash=r(36),e.createHmac=e.Hmac=r(110);var n=r(207),i=Object.keys(n),o=["sha1","sha224","sha256","sha384","sha512","md5","rmd160"].concat(i);e.getHashes=function(){return o};var s=r(113);e.pbkdf2=s.pbkdf2,e.pbkdf2Sync=s.pbkdf2Sync;var a=r(209);e.Cipher=a.Cipher,e.createCipher=a.createCipher,e.Cipheriv=a.Cipheriv,e.createCipheriv=a.createCipheriv,e.Decipher=a.Decipher,e.createDecipher=a.createDecipher,e.Decipheriv=a.Decipheriv,e.createDecipheriv=a.createDecipheriv,e.getCiphers=a.getCiphers,e.listCiphers=a.listCiphers;var u=r(224);e.DiffieHellmanGroup=u.DiffieHellmanGroup,e.createDiffieHellmanGroup=u.createDiffieHellmanGroup,e.getDiffieHellman=u.getDiffieHellman,e.createDiffieHellman=u.createDiffieHellman,e.DiffieHellman=u.DiffieHellman;var h=r(231);e.createSign=h.createSign,e.Sign=h.Sign,e.createVerify=h.createVerify,e.Verify=h.Verify,e.createECDH=r(264);var f=r(267);e.publicEncrypt=f.publicEncrypt,e.privateEncrypt=f.privateEncrypt,e.publicDecrypt=f.publicDecrypt,e.privateDecrypt=f.privateDecrypt;var c=r(271);e.randomFill=c.randomFill,e.randomFillSync=c.randomFillSync,e.createCredentials=function(){throw new Error(["sorry, createCredentials is not implemented yet","we accept pull requests","https://github.com/crypto-browserify/crypto-browserify"].join("\n"))},e.constants={DH_CHECK_P_NOT_SAFE_PRIME:2,DH_CHECK_P_NOT_PRIME:1,DH_UNABLE_TO_CHECK_GENERATOR:4,DH_NOT_SUITABLE_GENERATOR:8,NPN_ENABLED:1,ALPN_ENABLED:1,RSA_PKCS1_PADDING:1,RSA_SSLV23_PADDING:2,RSA_NO_PADDING:3,RSA_PKCS1_OAEP_PADDING:4,RSA_X931_PADDING:5,RSA_PKCS1_PSS_PADDING:6,POINT_CONVERSION_COMPRESSED:2,POINT_CONVERSION_UNCOMPRESSED:4,POINT_CONVERSION_HYBRID:6}},function(t,e,r){ +/*! safe-buffer. MIT License. 
Feross Aboukhadijeh */ +var n=r(3),i=n.Buffer;function o(t,e){for(var r in t)e[r]=t[r]}function s(t,e,r){return i(t,e,r)}i.from&&i.alloc&&i.allocUnsafe&&i.allocUnsafeSlow?t.exports=n:(o(n,e),e.Buffer=s),s.prototype=Object.create(i.prototype),o(i,s),s.from=function(t,e,r){if("number"==typeof t)throw new TypeError("Argument must not be a number");return i(t,e,r)},s.alloc=function(t,e,r){if("number"!=typeof t)throw new TypeError("Argument must be a number");var n=i(t);return void 0!==e?"string"==typeof r?n.fill(e,r):n.fill(e):n.fill(0),n},s.allocUnsafe=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return i(t)},s.allocUnsafeSlow=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return n.SlowBuffer(t)}},function(t,e){},function(t,e,r){"use strict";function n(t,e){var r=Object.keys(t);if(Object.getOwnPropertySymbols){var n=Object.getOwnPropertySymbols(t);e&&(n=n.filter((function(e){return Object.getOwnPropertyDescriptor(t,e).enumerable}))),r.push.apply(r,n)}return r}function i(t,e,r){return e in t?Object.defineProperty(t,e,{value:r,enumerable:!0,configurable:!0,writable:!0}):t[e]=r,t}function o(t,e){for(var r=0;r0?this.tail.next=e:this.head=e,this.tail=e,++this.length}},{key:"unshift",value:function(t){var e={data:t,next:this.head};0===this.length&&(this.tail=e),this.head=e,++this.length}},{key:"shift",value:function(){if(0!==this.length){var t=this.head.data;return 1===this.length?this.head=this.tail=null:this.head=this.head.next,--this.length,t}}},{key:"clear",value:function(){this.head=this.tail=null,this.length=0}},{key:"join",value:function(t){if(0===this.length)return"";for(var e=this.head,r=""+e.data;e=e.next;)r+=t+e.data;return r}},{key:"concat",value:function(t){if(0===this.length)return s.alloc(0);for(var e,r,n,i=s.allocUnsafe(t>>>0),o=this.head,a=0;o;)e=o.data,r=i,n=a,s.prototype.copy.call(e,r,n),a+=o.data.length,o=o.next;return i}},{key:"consume",value:function(t,e){var r;return ti.length?i.length:t;if(o===i.length?n+=i:n+=i.slice(0,t),0==(t-=o)){o===i.length?(++r,e.next?this.head=e.next:this.head=this.tail=null):(this.head=e,e.data=i.slice(o));break}++r}return this.length-=r,n}},{key:"_getBuffer",value:function(t){var e=s.allocUnsafe(t),r=this.head,n=1;for(r.data.copy(e),t-=r.data.length;r=r.next;){var i=r.data,o=t>i.length?i.length:t;if(i.copy(e,e.length-t,0,o),0==(t-=o)){o===i.length?(++n,r.next?this.head=r.next:this.head=this.tail=null):(this.head=r,r.data=i.slice(o));break}++n}return this.length-=n,e}},{key:u,value:function(t,e){return a(this,function(t){for(var e=1;e */ +var n=r(3),i=n.Buffer;function o(t,e){for(var r in t)e[r]=t[r]}function s(t,e,r){return i(t,e,r)}i.from&&i.alloc&&i.allocUnsafe&&i.allocUnsafeSlow?t.exports=n:(o(n,e),e.Buffer=s),s.prototype=Object.create(i.prototype),o(i,s),s.from=function(t,e,r){if("number"==typeof t)throw new TypeError("Argument must not be a number");return i(t,e,r)},s.alloc=function(t,e,r){if("number"!=typeof t)throw new TypeError("Argument must be a number");var n=i(t);return void 0!==e?"string"==typeof r?n.fill(e,r):n.fill(e):n.fill(0),n},s.allocUnsafe=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return i(t)},s.allocUnsafeSlow=function(t){if("number"!=typeof t)throw new TypeError("Argument must be a number");return n.SlowBuffer(t)}},function(t,e,r){"use strict";(function(e){var n;function i(t,e,r){return e in t?Object.defineProperty(t,e,{value:r,enumerable:!0,configurable:!0,writable:!0}):t[e]=r,t}var 
o=r(61),s=Symbol("lastResolve"),a=Symbol("lastReject"),u=Symbol("error"),h=Symbol("ended"),f=Symbol("lastPromise"),c=Symbol("handlePromise"),l=Symbol("stream");function d(t,e){return{value:t,done:e}}function p(t){var e=t[s];if(null!==e){var r=t[l].read();null!==r&&(t[f]=null,t[s]=null,t[a]=null,e(d(r,!1)))}}function m(t){e.nextTick(p,t)}var y=Object.getPrototypeOf((function(){})),v=Object.setPrototypeOf((i(n={get stream(){return this[l]},next:function(){var t=this,r=this[u];if(null!==r)return Promise.reject(r);if(this[h])return Promise.resolve(d(void 0,!0));if(this[l].destroyed)return new Promise((function(r,n){e.nextTick((function(){t[u]?n(t[u]):r(d(void 0,!0))}))}));var n,i=this[f];if(i)n=new Promise(function(t,e){return function(r,n){t.then((function(){e[h]?r(d(void 0,!0)):e[c](r,n)}),n)}}(i,this));else{var o=this[l].read();if(null!==o)return Promise.resolve(d(o,!1));n=new Promise(this[c])}return this[f]=n,n}},Symbol.asyncIterator,(function(){return this})),i(n,"return",(function(){var t=this;return new Promise((function(e,r){t[l].destroy(null,(function(t){t?r(t):e(d(void 0,!0))}))}))})),n),y);t.exports=function(t){var e,r=Object.create(v,(i(e={},l,{value:t,writable:!0}),i(e,s,{value:null,writable:!0}),i(e,a,{value:null,writable:!0}),i(e,u,{value:null,writable:!0}),i(e,h,{value:t._readableState.endEmitted,writable:!0}),i(e,c,{value:function(t,e){var n=r[l].read();n?(r[f]=null,r[s]=null,r[a]=null,t(d(n,!1))):(r[s]=t,r[a]=e)},writable:!0}),e));return r[f]=null,o(t,(function(t){if(t&&"ERR_STREAM_PREMATURE_CLOSE"!==t.code){var e=r[a];return null!==e&&(r[f]=null,r[s]=null,r[a]=null,e(t)),void(r[u]=t)}var n=r[s];null!==n&&(r[f]=null,r[s]=null,r[a]=null,n(d(void 0,!0))),r[h]=!0})),t.on("readable",m.bind(null,r)),r}}).call(this,r(5))},function(t,e){t.exports=function(){throw new Error("Readable.from is not available in the browser")}},function(t,e,r){"use strict";t.exports=i;var n=r(102);function i(t){if(!(this instanceof i))return new i(t);n.call(this,t)}r(0)(i,n),i.prototype._transform=function(t,e,r){r(null,t)}},function(t,e,r){"use strict";var n;var i=r(28).codes,o=i.ERR_MISSING_ARGS,s=i.ERR_STREAM_DESTROYED;function a(t){if(t)throw t}function u(t,e,i,o){o=function(t){var e=!1;return function(){e||(e=!0,t.apply(void 0,arguments))}}(o);var a=!1;t.on("close",(function(){a=!0})),void 0===n&&(n=r(61)),n(t,{readable:e,writable:i},(function(t){if(t)return o(t);a=!0,o()}));var u=!1;return function(e){if(!a&&!u)return u=!0,function(t){return t.setHeader&&"function"==typeof t.abort}(t)?t.abort():"function"==typeof t.destroy?t.destroy():void o(e||new s("pipe"))}}function h(t){t()}function f(t,e){return t.pipe(e)}function c(t){return t.length?"function"!=typeof t[t.length-1]?a:t.pop():a}t.exports=function(){for(var t=arguments.length,e=new Array(t),r=0;r0,(function(t){n||(n=t),t&&s.forEach(h),o||(s.forEach(h),i(n))}))}));return e.reduce(f)}},function(t,e,r){var n=r(0),i=r(30),o=r(2).Buffer,s=[1518500249,1859775393,-1894007588,-899497514],a=new Array(80);function u(){this.init(),this._w=a,i.call(this,64,56)}function h(t){return t<<30|t>>>2}function f(t,e,r,n){return 0===t?e&r|~e&n:2===t?e&r|e&n|r&n:e^r^n}n(u,i),u.prototype.init=function(){return this._a=1732584193,this._b=4023233417,this._c=2562383102,this._d=271733878,this._e=3285377520,this},u.prototype._update=function(t){for(var e,r=this._w,n=0|this._a,i=0|this._b,o=0|this._c,a=0|this._d,u=0|this._e,c=0;c<16;++c)r[c]=t.readInt32BE(4*c);for(;c<80;++c)r[c]=r[c-3]^r[c-8]^r[c-14]^r[c-16];for(var l=0;l<80;++l){var 
d=~~(l/20),p=0|((e=n)<<5|e>>>27)+f(d,i,o,a)+u+r[l]+s[d];u=a,a=o,o=h(i),i=n,n=p}this._a=n+this._a|0,this._b=i+this._b|0,this._c=o+this._c|0,this._d=a+this._d|0,this._e=u+this._e|0},u.prototype._hash=function(){var t=o.allocUnsafe(20);return t.writeInt32BE(0|this._a,0),t.writeInt32BE(0|this._b,4),t.writeInt32BE(0|this._c,8),t.writeInt32BE(0|this._d,12),t.writeInt32BE(0|this._e,16),t},t.exports=u},function(t,e,r){var n=r(0),i=r(30),o=r(2).Buffer,s=[1518500249,1859775393,-1894007588,-899497514],a=new Array(80);function u(){this.init(),this._w=a,i.call(this,64,56)}function h(t){return t<<5|t>>>27}function f(t){return t<<30|t>>>2}function c(t,e,r,n){return 0===t?e&r|~e&n:2===t?e&r|e&n|r&n:e^r^n}n(u,i),u.prototype.init=function(){return this._a=1732584193,this._b=4023233417,this._c=2562383102,this._d=271733878,this._e=3285377520,this},u.prototype._update=function(t){for(var e,r=this._w,n=0|this._a,i=0|this._b,o=0|this._c,a=0|this._d,u=0|this._e,l=0;l<16;++l)r[l]=t.readInt32BE(4*l);for(;l<80;++l)r[l]=(e=r[l-3]^r[l-8]^r[l-14]^r[l-16])<<1|e>>>31;for(var d=0;d<80;++d){var p=~~(d/20),m=h(n)+c(p,i,o,a)+u+r[d]+s[p]|0;u=a,a=o,o=f(i),i=n,n=m}this._a=n+this._a|0,this._b=i+this._b|0,this._c=o+this._c|0,this._d=a+this._d|0,this._e=u+this._e|0},u.prototype._hash=function(){var t=o.allocUnsafe(20);return t.writeInt32BE(0|this._a,0),t.writeInt32BE(0|this._b,4),t.writeInt32BE(0|this._c,8),t.writeInt32BE(0|this._d,12),t.writeInt32BE(0|this._e,16),t},t.exports=u},function(t,e,r){var n=r(0),i=r(103),o=r(30),s=r(2).Buffer,a=new Array(64);function u(){this.init(),this._w=a,o.call(this,64,56)}n(u,i),u.prototype.init=function(){return this._a=3238371032,this._b=914150663,this._c=812702999,this._d=4144912697,this._e=4290775857,this._f=1750603025,this._g=1694076839,this._h=3204075428,this},u.prototype._hash=function(){var t=s.allocUnsafe(28);return t.writeInt32BE(this._a,0),t.writeInt32BE(this._b,4),t.writeInt32BE(this._c,8),t.writeInt32BE(this._d,12),t.writeInt32BE(this._e,16),t.writeInt32BE(this._f,20),t.writeInt32BE(this._g,24),t},t.exports=u},function(t,e,r){var n=r(0),i=r(104),o=r(30),s=r(2).Buffer,a=new Array(160);function u(){this.init(),this._w=a,o.call(this,128,112)}n(u,i),u.prototype.init=function(){return this._ah=3418070365,this._bh=1654270250,this._ch=2438529370,this._dh=355462360,this._eh=1731405415,this._fh=2394180231,this._gh=3675008525,this._hh=1203062813,this._al=3238371032,this._bl=914150663,this._cl=812702999,this._dl=4144912697,this._el=4290775857,this._fl=1750603025,this._gl=1694076839,this._hl=3204075428,this},u.prototype._hash=function(){var t=s.allocUnsafe(48);function e(e,r,n){t.writeInt32BE(e,n),t.writeInt32BE(r,n+4)}return e(this._ah,this._al,0),e(this._bh,this._bl,8),e(this._ch,this._cl,16),e(this._dh,this._dl,24),e(this._eh,this._el,32),e(this._fh,this._fl,40),t},t.exports=u},function(t,e,r){t.exports=i;var n=r(13).EventEmitter;function i(){n.call(this)}r(0)(i,n),i.Readable=r(64),i.Writable=r(202),i.Duplex=r(203),i.Transform=r(204),i.PassThrough=r(205),i.Stream=i,i.prototype.pipe=function(t,e){var r=this;function i(e){t.writable&&!1===t.write(e)&&r.pause&&r.pause()}function o(){r.readable&&r.resume&&r.resume()}r.on("data",i),t.on("drain",o),t._isStdio||e&&!1===e.end||(r.on("end",a),r.on("close",u));var s=!1;function a(){s||(s=!0,t.end())}function u(){s||(s=!0,"function"==typeof t.destroy&&t.destroy())}function h(t){if(f(),0===n.listenerCount(this,"error"))throw t}function 
f(){r.removeListener("data",i),t.removeListener("drain",o),r.removeListener("end",a),r.removeListener("close",u),r.removeListener("error",h),t.removeListener("error",h),r.removeListener("end",f),r.removeListener("close",f),t.removeListener("close",f)}return r.on("error",h),t.on("error",h),r.on("end",f),r.on("close",f),t.on("close",f),t.emit("pipe",r),t}},function(t,e){},function(t,e,r){"use strict";var n=r(2).Buffer,i=r(199);t.exports=function(){function t(){!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,t),this.head=null,this.tail=null,this.length=0}return t.prototype.push=function(t){var e={data:t,next:null};this.length>0?this.tail.next=e:this.head=e,this.tail=e,++this.length},t.prototype.unshift=function(t){var e={data:t,next:this.head};0===this.length&&(this.tail=e),this.head=e,++this.length},t.prototype.shift=function(){if(0!==this.length){var t=this.head.data;return 1===this.length?this.head=this.tail=null:this.head=this.head.next,--this.length,t}},t.prototype.clear=function(){this.head=this.tail=null,this.length=0},t.prototype.join=function(t){if(0===this.length)return"";for(var e=this.head,r=""+e.data;e=e.next;)r+=t+e.data;return r},t.prototype.concat=function(t){if(0===this.length)return n.alloc(0);if(1===this.length)return this.head.data;for(var e,r,i,o=n.allocUnsafe(t>>>0),s=this.head,a=0;s;)e=s.data,r=o,i=a,e.copy(r,i),a+=s.data.length,s=s.next;return o},t}(),i&&i.inspect&&i.inspect.custom&&(t.exports.prototype[i.inspect.custom]=function(){var t=i.inspect({length:this.length});return this.constructor.name+" "+t})},function(t,e){},function(t,e,r){(function(t,e){!function(t,r){"use strict";if(!t.setImmediate){var n,i,o,s,a,u=1,h={},f=!1,c=t.document,l=Object.getPrototypeOf&&Object.getPrototypeOf(t);l=l&&l.setTimeout?l:t,"[object process]"==={}.toString.call(t.process)?n=function(t){e.nextTick((function(){p(t)}))}:!function(){if(t.postMessage&&!t.importScripts){var e=!0,r=t.onmessage;return t.onmessage=function(){e=!1},t.postMessage("","*"),t.onmessage=r,e}}()?t.MessageChannel?((o=new MessageChannel).port1.onmessage=function(t){p(t.data)},n=function(t){o.port2.postMessage(t)}):c&&"onreadystatechange"in c.createElement("script")?(i=c.documentElement,n=function(t){var e=c.createElement("script");e.onreadystatechange=function(){p(t),e.onreadystatechange=null,i.removeChild(e),e=null},i.appendChild(e)}):n=function(t){setTimeout(p,0,t)}:(s="setImmediate$"+Math.random()+"$",a=function(e){e.source===t&&"string"==typeof e.data&&0===e.data.indexOf(s)&&p(+e.data.slice(s.length))},t.addEventListener?t.addEventListener("message",a,!1):t.attachEvent("onmessage",a),n=function(e){t.postMessage(s+e,"*")}),l.setImmediate=function(t){"function"!=typeof t&&(t=new Function(""+t));for(var e=new Array(arguments.length-1),r=0;r64?e=t(e):e.length<64&&(e=i.concat([e,s],64));for(var r=this._ipad=i.allocUnsafe(64),n=this._opad=i.allocUnsafe(64),a=0;a<64;a++)r[a]=54^e[a],n[a]=92^e[a];this._hash=[r]}n(a,o),a.prototype._update=function(t){this._hash.push(t)},a.prototype._final=function(){var t=this._alg(i.concat(this._hash));return this._alg(i.concat([this._opad,t]))},t.exports=a},function(t,e,r){t.exports=r(112)},function(t,e,r){(function(e){var n,i,o=r(2).Buffer,s=r(114),a=r(115),u=r(116),h=r(117),f=e.crypto&&e.crypto.subtle,c={sha:"SHA-1","sha-1":"SHA-1",sha1:"SHA-1",sha256:"SHA-256","sha-256":"SHA-256",sha384:"SHA-384","sha-384":"SHA-384","sha-512":"SHA-512",sha512:"SHA-512"},l=[];function d(){return 
i||(i=e.process&&e.process.nextTick?e.process.nextTick:e.queueMicrotask?e.queueMicrotask:e.setImmediate?e.setImmediate:e.setTimeout)}function p(t,e,r,n,i){return f.importKey("raw",t,{name:"PBKDF2"},!1,["deriveBits"]).then((function(t){return f.deriveBits({name:"PBKDF2",salt:e,iterations:r,hash:{name:i}},t,n<<3)})).then((function(t){return o.from(t)}))}t.exports=function(t,r,i,m,y,v){"function"==typeof y&&(v=y,y=void 0);var b=c[(y=y||"sha1").toLowerCase()];if(b&&"function"==typeof e.Promise){if(s(i,m),t=h(t,a,"Password"),r=h(r,a,"Salt"),"function"!=typeof v)throw new Error("No callback provided to pbkdf2");!function(t,e){t.then((function(t){d()((function(){e(null,t)}))}),(function(t){d()((function(){e(t)}))}))}(function(t){if(e.process&&!e.process.browser)return Promise.resolve(!1);if(!f||!f.importKey||!f.deriveBits)return Promise.resolve(!1);if(void 0!==l[t])return l[t];var r=p(n=n||o.alloc(8),n,10,128,t).then((function(){return!0})).catch((function(){return!1}));return l[t]=r,r}(b).then((function(e){return e?p(t,r,i,m,b):u(t,r,i,m,y)})),v)}else d()((function(){var e;try{e=u(t,r,i,m,y)}catch(t){return v(t)}v(null,e)}))}}).call(this,r(7))},function(t,e,r){var n=r(210),i=r(67),o=r(68),s=r(223),a=r(48);function u(t,e,r){if(t=t.toLowerCase(),o[t])return i.createCipheriv(t,e,r);if(s[t])return new n({key:e,iv:r,mode:t});throw new TypeError("invalid suite type")}function h(t,e,r){if(t=t.toLowerCase(),o[t])return i.createDecipheriv(t,e,r);if(s[t])return new n({key:e,iv:r,mode:t,decrypt:!0});throw new TypeError("invalid suite type")}e.createCipher=e.Cipher=function(t,e){var r,n;if(t=t.toLowerCase(),o[t])r=o[t].key,n=o[t].iv;else{if(!s[t])throw new TypeError("invalid suite type");r=8*s[t].key,n=s[t].iv}var i=a(e,!1,r,n);return u(t,i.key,i.iv)},e.createCipheriv=e.Cipheriv=u,e.createDecipher=e.Decipher=function(t,e){var r,n;if(t=t.toLowerCase(),o[t])r=o[t].key,n=o[t].iv;else{if(!s[t])throw new TypeError("invalid suite type");r=8*s[t].key,n=s[t].iv}var i=a(e,!1,r,n);return h(t,i.key,i.iv)},e.createDecipheriv=e.Decipheriv=h,e.listCiphers=e.getCiphers=function(){return Object.keys(s).concat(i.getCiphers())}},function(t,e,r){var n=r(21),i=r(211),o=r(0),s=r(2).Buffer,a={"des-ede3-cbc":i.CBC.instantiate(i.EDE),"des-ede3":i.EDE,"des-ede-cbc":i.CBC.instantiate(i.EDE),"des-ede":i.EDE,"des-cbc":i.CBC.instantiate(i.DES),"des-ecb":i.DES};function u(t){n.call(this);var e,r=t.mode.toLowerCase(),i=a[r];e=t.decrypt?"decrypt":"encrypt";var o=t.key;s.isBuffer(o)||(o=s.from(o)),"des-ede"!==r&&"des-ede-cbc"!==r||(o=s.concat([o,o.slice(0,8)]));var u=t.iv;s.isBuffer(u)||(u=s.from(u)),this._des=i.create({key:o,iv:u,type:e})}a.des=a["des-cbc"],a.des3=a["des-ede3-cbc"],t.exports=u,o(u,n),u.prototype._update=function(t){return s.from(this._des.update(t))},u.prototype._final=function(){return s.from(this._des.final())}},function(t,e,r){"use strict";e.utils=r(118),e.Cipher=r(66),e.DES=r(119),e.CBC=r(212),e.EDE=r(213)},function(t,e,r){"use strict";var n=r(11),i=r(0),o={};function s(t){n.equal(t.length,8,"Invalid IV length"),this.iv=new Array(8);for(var e=0;e15){var t=this.cache.slice(0,16);return this.cache=this.cache.slice(16),t}return null},l.prototype.flush=function(){for(var t=16-this.cache.length,e=o.allocUnsafe(t),r=-1;++r>s%8,t._prev=o(t._prev,r?n:i);return a}function o(t,e){var r=t.length,i=-1,o=n.allocUnsafe(t.length);for(t=n.concat([t,n.from([e])]);++i>7;return o}e.encrypt=function(t,e,r){for(var 
o=e.length,s=n.allocUnsafe(o),a=-1;++a>>0,0),e.writeUInt32BE(t[1]>>>0,4),e.writeUInt32BE(t[2]>>>0,8),e.writeUInt32BE(t[3]>>>0,12),e}function s(t){this.h=t,this.state=n.alloc(16,0),this.cache=n.allocUnsafe(0)}s.prototype.ghash=function(t){for(var e=-1;++e0;e--)n[e]=n[e]>>>1|(1&n[e-1])<<31;n[0]=n[0]>>>1,r&&(n[0]=n[0]^225<<24)}this.state=o(i)},s.prototype.update=function(t){var e;for(this.cache=n.concat([this.cache,t]);this.cache.length>=16;)e=this.cache.slice(0,16),this.cache=this.cache.slice(16),this.ghash(e)},s.prototype.final=function(t,e){return this.cache.length&&this.ghash(n.concat([this.cache,i],16)),this.ghash(o([0,t,0,e])),this.state},t.exports=s},function(t,e,r){var n=r(123),i=r(2).Buffer,o=r(68),s=r(124),a=r(21),u=r(47),h=r(48);function f(t,e,r){a.call(this),this._cache=new c,this._last=void 0,this._cipher=new u.AES(e),this._prev=i.from(r),this._mode=t,this._autopadding=!0}function c(){this.cache=i.allocUnsafe(0)}function l(t,e,r){var a=o[t.toLowerCase()];if(!a)throw new TypeError("invalid suite type");if("string"==typeof r&&(r=i.from(r)),"GCM"!==a.mode&&r.length!==a.iv)throw new TypeError("invalid iv length "+r.length);if("string"==typeof e&&(e=i.from(e)),e.length!==a.key/8)throw new TypeError("invalid key length "+e.length);return"stream"===a.type?new s(a.module,e,r,!0):"auth"===a.type?new n(a.module,e,r,!0):new f(a.module,e,r)}r(0)(f,a),f.prototype._update=function(t){var e,r;this._cache.add(t);for(var n=[];e=this._cache.get(this._autopadding);)r=this._mode.decrypt(this,e),n.push(r);return i.concat(n)},f.prototype._final=function(){var t=this._cache.flush();if(this._autopadding)return function(t){var e=t[15];if(e<1||e>16)throw new Error("unable to decrypt data");var r=-1;for(;++r16)return e=this.cache.slice(0,16),this.cache=this.cache.slice(16),e}else if(this.cache.length>=16)return e=this.cache.slice(0,16),this.cache=this.cache.slice(16),e;return null},c.prototype.flush=function(){if(this.cache.length)return this.cache},e.createDecipher=function(t,e){var r=o[t.toLowerCase()];if(!r)throw new TypeError("invalid suite type");var n=h(e,!1,r.key,r.iv);return l(t,n.key,n.iv)},e.createDecipheriv=l},function(t,e){e["des-ecb"]={key:8,iv:0},e["des-cbc"]=e.des={key:8,iv:8},e["des-ede3-cbc"]=e.des3={key:24,iv:8},e["des-ede3"]={key:24,iv:0},e["des-ede-cbc"]={key:16,iv:8},e["des-ede"]={key:16,iv:0}},function(t,e,r){(function(t){var n=r(125),i=r(229),o=r(230);var s={binary:!0,hex:!0,base64:!0};e.DiffieHellmanGroup=e.createDiffieHellmanGroup=e.getDiffieHellman=function(e){var r=new t(i[e].prime,"hex"),n=new t(i[e].gen,"hex");return new o(r,n)},e.createDiffieHellman=e.DiffieHellman=function e(r,i,a,u){return t.isBuffer(i)||void 0===s[i]?e(r,"binary",i,a):(i=i||"binary",u=u||"binary",a=a||new t([2]),t.isBuffer(a)||(a=new t(a,u)),"number"==typeof r?new o(n(r,a),a,!0):(t.isBuffer(r)||(r=new t(r,i)),new o(r,a,!0)))}}).call(this,r(3).Buffer)},function(t,e){},function(t,e,r){(function(t){function e(t){return(e="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}!function(t,n){"use strict";function i(t,e){if(!t)throw new Error(e||"Assertion failed")}function o(t,e){t.super_=e;var r=function(){};r.prototype=e.prototype,t.prototype=new r,t.prototype.constructor=t}function s(t,e,r){if(s.isBN(t))return t;this.negative=0,this.words=null,this.length=0,this.red=null,null!==t&&("le"!==e&&"be"!==e||(r=e,e=10),this._init(t||0,e||10,r||"be"))}var 
a;"object"===e(t)?t.exports=s:n.BN=s,s.BN=s,s.wordSize=26;try{a="undefined"!=typeof window&&void 0!==window.Buffer?window.Buffer:r(227).Buffer}catch(t){}function u(t,e){var r=t.charCodeAt(e);return r>=65&&r<=70?r-55:r>=97&&r<=102?r-87:r-48&15}function h(t,e,r){var n=u(t,r);return r-1>=e&&(n|=u(t,r-1)<<4),n}function f(t,e,r,n){for(var i=0,o=Math.min(t.length,r),s=e;s=49?a-49+10:a>=17?a-17+10:a}return i}s.isBN=function(t){return t instanceof s||null!==t&&"object"===e(t)&&t.constructor.wordSize===s.wordSize&&Array.isArray(t.words)},s.max=function(t,e){return t.cmp(e)>0?t:e},s.min=function(t,e){return t.cmp(e)<0?t:e},s.prototype._init=function(t,r,n){if("number"==typeof t)return this._initNumber(t,r,n);if("object"===e(t))return this._initArray(t,r,n);"hex"===r&&(r=16),i(r===(0|r)&&r>=2&&r<=36);var o=0;"-"===(t=t.toString().replace(/\s+/g,""))[0]&&(o++,this.negative=1),o=0;n-=3)s=t[n]|t[n-1]<<8|t[n-2]<<16,this.words[o]|=s<>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);else if("le"===r)for(n=0,o=0;n>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);return this.strip()},s.prototype._parseHex=function(t,e,r){this.length=Math.ceil((t.length-e)/6),this.words=new Array(this.length);for(var n=0;n=e;n-=2)i=h(t,e,n)<=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;else for(n=(t.length-e)%2==0?e+1:e;n=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;this.strip()},s.prototype._parseBase=function(t,e,r){this.words=[0],this.length=1;for(var n=0,i=1;i<=67108863;i*=e)n++;n--,i=i/e|0;for(var o=t.length-r,s=o%n,a=Math.min(o,o-s)+r,u=0,h=r;h1&&0===this.words[this.length-1];)this.length--;return this._normSign()},s.prototype._normSign=function(){return 1===this.length&&0===this.words[0]&&(this.negative=0),this},s.prototype.inspect=function(){return(this.red?""};var c=["","0","00","000","0000","00000","000000","0000000","00000000","000000000","0000000000","00000000000","000000000000","0000000000000","00000000000000","000000000000000","0000000000000000","00000000000000000","000000000000000000","0000000000000000000","00000000000000000000","000000000000000000000","0000000000000000000000","00000000000000000000000","000000000000000000000000","0000000000000000000000000"],l=[0,0,25,16,12,11,10,9,8,8,7,7,7,7,6,6,6,6,6,6,6,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5],d=[0,0,33554432,43046721,16777216,48828125,60466176,40353607,16777216,43046721,1e7,19487171,35831808,62748517,7529536,11390625,16777216,24137569,34012224,47045881,64e6,4084101,5153632,6436343,7962624,9765625,11881376,14348907,17210368,20511149,243e5,28629151,33554432,39135393,45435424,52521875,60466176];function p(t,e,r){r.negative=e.negative^t.negative;var n=t.length+e.length|0;r.length=n,n=n-1|0;var i=0|t.words[0],o=0|e.words[0],s=i*o,a=67108863&s,u=s/67108864|0;r.words[0]=a;for(var h=1;h>>26,c=67108863&u,l=Math.min(h,e.length-1),d=Math.max(0,h-t.length+1);d<=l;d++){var p=h-d|0;f+=(s=(i=0|t.words[p])*(o=0|e.words[d])+c)/67108864|0,c=67108863&s}r.words[h]=0|c,u=0|f}return 0!==u?r.words[h]=0|u:r.length--,r.strip()}s.prototype.toString=function(t,e){var r;if(e=0|e||1,16===(t=t||10)||"hex"===t){r="";for(var n=0,o=0,s=0;s>>24-n&16777215)||s!==this.length-1?c[6-u.length]+u+r:u+r,(n+=2)>=26&&(n-=26,s--)}for(0!==o&&(r=o.toString(16)+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}if(t===(0|t)&&t>=2&&t<=36){var h=l[t],f=d[t];r="";var p=this.clone();for(p.negative=0;!p.isZero();){var m=p.modn(f).toString(t);r=(p=p.idivn(f)).isZero()?m+r:c[h-m.length]+m+r}for(this.isZero()&&(r="0"+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}i(!1,"Base should be between 2 and 
36")},s.prototype.toNumber=function(){var t=this.words[0];return 2===this.length?t+=67108864*this.words[1]:3===this.length&&1===this.words[2]?t+=4503599627370496+67108864*this.words[1]:this.length>2&&i(!1,"Number can only safely store up to 53 bits"),0!==this.negative?-t:t},s.prototype.toJSON=function(){return this.toString(16)},s.prototype.toBuffer=function(t,e){return i(void 0!==a),this.toArrayLike(a,t,e)},s.prototype.toArray=function(t,e){return this.toArrayLike(Array,t,e)},s.prototype.toArrayLike=function(t,e,r){var n=this.byteLength(),o=r||Math.max(1,n);i(n<=o,"byte array longer than desired length"),i(o>0,"Requested array length <= 0"),this.strip();var s,a,u="le"===e,h=new t(o),f=this.clone();if(u){for(a=0;!f.isZero();a++)s=f.andln(255),f.iushrn(8),h[a]=s;for(;a=4096&&(r+=13,e>>>=13),e>=64&&(r+=7,e>>>=7),e>=8&&(r+=4,e>>>=4),e>=2&&(r+=2,e>>>=2),r+e},s.prototype._zeroBits=function(t){if(0===t)return 26;var e=t,r=0;return 0==(8191&e)&&(r+=13,e>>>=13),0==(127&e)&&(r+=7,e>>>=7),0==(15&e)&&(r+=4,e>>>=4),0==(3&e)&&(r+=2,e>>>=2),0==(1&e)&&r++,r},s.prototype.bitLength=function(){var t=this.words[this.length-1],e=this._countBits(t);return 26*(this.length-1)+e},s.prototype.zeroBits=function(){if(this.isZero())return 0;for(var t=0,e=0;et.length?this.clone().ior(t):t.clone().ior(this)},s.prototype.uor=function(t){return this.length>t.length?this.clone().iuor(t):t.clone().iuor(this)},s.prototype.iuand=function(t){var e;e=this.length>t.length?t:this;for(var r=0;rt.length?this.clone().iand(t):t.clone().iand(this)},s.prototype.uand=function(t){return this.length>t.length?this.clone().iuand(t):t.clone().iuand(this)},s.prototype.iuxor=function(t){var e,r;this.length>t.length?(e=this,r=t):(e=t,r=this);for(var n=0;nt.length?this.clone().ixor(t):t.clone().ixor(this)},s.prototype.uxor=function(t){return this.length>t.length?this.clone().iuxor(t):t.clone().iuxor(this)},s.prototype.inotn=function(t){i("number"==typeof t&&t>=0);var e=0|Math.ceil(t/26),r=t%26;this._expand(e),r>0&&e--;for(var n=0;n0&&(this.words[n]=~this.words[n]&67108863>>26-r),this.strip()},s.prototype.notn=function(t){return this.clone().inotn(t)},s.prototype.setn=function(t,e){i("number"==typeof t&&t>=0);var r=t/26|0,n=t%26;return this._expand(r+1),this.words[r]=e?this.words[r]|1<t.length?(r=this,n=t):(r=t,n=this);for(var i=0,o=0;o>>26;for(;0!==i&&o>>26;if(this.length=r.length,0!==i)this.words[this.length]=i,this.length++;else if(r!==this)for(;ot.length?this.clone().iadd(t):t.clone().iadd(this)},s.prototype.isub=function(t){if(0!==t.negative){t.negative=0;var e=this.iadd(t);return t.negative=1,e._normSign()}if(0!==this.negative)return this.negative=0,this.iadd(t),this.negative=1,this._normSign();var r,n,i=this.cmp(t);if(0===i)return this.negative=0,this.length=1,this.words[0]=0,this;i>0?(r=this,n=t):(r=t,n=this);for(var o=0,s=0;s>26,this.words[s]=67108863&e;for(;0!==o&&s>26,this.words[s]=67108863&e;if(0===o&&s>>13,d=0|s[1],p=8191&d,m=d>>>13,y=0|s[2],v=8191&y,b=y>>>13,g=0|s[3],w=8191&g,_=g>>>13,M=0|s[4],S=8191&M,O=M>>>13,A=0|s[5],E=8191&A,x=A>>>13,k=0|s[6],j=8191&k,$=k>>>13,P=0|s[7],R=8191&P,B=P>>>13,T=0|s[8],I=8191&T,N=T>>>13,D=0|s[9],C=8191&D,L=D>>>13,q=0|a[0],U=8191&q,F=q>>>13,z=0|a[1],V=8191&z,K=z>>>13,H=0|a[2],Z=8191&H,W=H>>>13,J=0|a[3],Y=8191&J,Q=J>>>13,G=0|a[4],X=8191&G,tt=G>>>13,et=0|a[5],rt=8191&et,nt=et>>>13,it=0|a[6],ot=8191&it,st=it>>>13,at=0|a[7],ut=8191&at,ht=at>>>13,ft=0|a[8],ct=8191&ft,lt=ft>>>13,dt=0|a[9],pt=8191&dt,mt=dt>>>13;r.negative=t.negative^e.negative,r.length=19;var 
yt=(h+(n=Math.imul(c,U))|0)+((8191&(i=(i=Math.imul(c,F))+Math.imul(l,U)|0))<<13)|0;h=((o=Math.imul(l,F))+(i>>>13)|0)+(yt>>>26)|0,yt&=67108863,n=Math.imul(p,U),i=(i=Math.imul(p,F))+Math.imul(m,U)|0,o=Math.imul(m,F);var vt=(h+(n=n+Math.imul(c,V)|0)|0)+((8191&(i=(i=i+Math.imul(c,K)|0)+Math.imul(l,V)|0))<<13)|0;h=((o=o+Math.imul(l,K)|0)+(i>>>13)|0)+(vt>>>26)|0,vt&=67108863,n=Math.imul(v,U),i=(i=Math.imul(v,F))+Math.imul(b,U)|0,o=Math.imul(b,F),n=n+Math.imul(p,V)|0,i=(i=i+Math.imul(p,K)|0)+Math.imul(m,V)|0,o=o+Math.imul(m,K)|0;var bt=(h+(n=n+Math.imul(c,Z)|0)|0)+((8191&(i=(i=i+Math.imul(c,W)|0)+Math.imul(l,Z)|0))<<13)|0;h=((o=o+Math.imul(l,W)|0)+(i>>>13)|0)+(bt>>>26)|0,bt&=67108863,n=Math.imul(w,U),i=(i=Math.imul(w,F))+Math.imul(_,U)|0,o=Math.imul(_,F),n=n+Math.imul(v,V)|0,i=(i=i+Math.imul(v,K)|0)+Math.imul(b,V)|0,o=o+Math.imul(b,K)|0,n=n+Math.imul(p,Z)|0,i=(i=i+Math.imul(p,W)|0)+Math.imul(m,Z)|0,o=o+Math.imul(m,W)|0;var gt=(h+(n=n+Math.imul(c,Y)|0)|0)+((8191&(i=(i=i+Math.imul(c,Q)|0)+Math.imul(l,Y)|0))<<13)|0;h=((o=o+Math.imul(l,Q)|0)+(i>>>13)|0)+(gt>>>26)|0,gt&=67108863,n=Math.imul(S,U),i=(i=Math.imul(S,F))+Math.imul(O,U)|0,o=Math.imul(O,F),n=n+Math.imul(w,V)|0,i=(i=i+Math.imul(w,K)|0)+Math.imul(_,V)|0,o=o+Math.imul(_,K)|0,n=n+Math.imul(v,Z)|0,i=(i=i+Math.imul(v,W)|0)+Math.imul(b,Z)|0,o=o+Math.imul(b,W)|0,n=n+Math.imul(p,Y)|0,i=(i=i+Math.imul(p,Q)|0)+Math.imul(m,Y)|0,o=o+Math.imul(m,Q)|0;var wt=(h+(n=n+Math.imul(c,X)|0)|0)+((8191&(i=(i=i+Math.imul(c,tt)|0)+Math.imul(l,X)|0))<<13)|0;h=((o=o+Math.imul(l,tt)|0)+(i>>>13)|0)+(wt>>>26)|0,wt&=67108863,n=Math.imul(E,U),i=(i=Math.imul(E,F))+Math.imul(x,U)|0,o=Math.imul(x,F),n=n+Math.imul(S,V)|0,i=(i=i+Math.imul(S,K)|0)+Math.imul(O,V)|0,o=o+Math.imul(O,K)|0,n=n+Math.imul(w,Z)|0,i=(i=i+Math.imul(w,W)|0)+Math.imul(_,Z)|0,o=o+Math.imul(_,W)|0,n=n+Math.imul(v,Y)|0,i=(i=i+Math.imul(v,Q)|0)+Math.imul(b,Y)|0,o=o+Math.imul(b,Q)|0,n=n+Math.imul(p,X)|0,i=(i=i+Math.imul(p,tt)|0)+Math.imul(m,X)|0,o=o+Math.imul(m,tt)|0;var _t=(h+(n=n+Math.imul(c,rt)|0)|0)+((8191&(i=(i=i+Math.imul(c,nt)|0)+Math.imul(l,rt)|0))<<13)|0;h=((o=o+Math.imul(l,nt)|0)+(i>>>13)|0)+(_t>>>26)|0,_t&=67108863,n=Math.imul(j,U),i=(i=Math.imul(j,F))+Math.imul($,U)|0,o=Math.imul($,F),n=n+Math.imul(E,V)|0,i=(i=i+Math.imul(E,K)|0)+Math.imul(x,V)|0,o=o+Math.imul(x,K)|0,n=n+Math.imul(S,Z)|0,i=(i=i+Math.imul(S,W)|0)+Math.imul(O,Z)|0,o=o+Math.imul(O,W)|0,n=n+Math.imul(w,Y)|0,i=(i=i+Math.imul(w,Q)|0)+Math.imul(_,Y)|0,o=o+Math.imul(_,Q)|0,n=n+Math.imul(v,X)|0,i=(i=i+Math.imul(v,tt)|0)+Math.imul(b,X)|0,o=o+Math.imul(b,tt)|0,n=n+Math.imul(p,rt)|0,i=(i=i+Math.imul(p,nt)|0)+Math.imul(m,rt)|0,o=o+Math.imul(m,nt)|0;var Mt=(h+(n=n+Math.imul(c,ot)|0)|0)+((8191&(i=(i=i+Math.imul(c,st)|0)+Math.imul(l,ot)|0))<<13)|0;h=((o=o+Math.imul(l,st)|0)+(i>>>13)|0)+(Mt>>>26)|0,Mt&=67108863,n=Math.imul(R,U),i=(i=Math.imul(R,F))+Math.imul(B,U)|0,o=Math.imul(B,F),n=n+Math.imul(j,V)|0,i=(i=i+Math.imul(j,K)|0)+Math.imul($,V)|0,o=o+Math.imul($,K)|0,n=n+Math.imul(E,Z)|0,i=(i=i+Math.imul(E,W)|0)+Math.imul(x,Z)|0,o=o+Math.imul(x,W)|0,n=n+Math.imul(S,Y)|0,i=(i=i+Math.imul(S,Q)|0)+Math.imul(O,Y)|0,o=o+Math.imul(O,Q)|0,n=n+Math.imul(w,X)|0,i=(i=i+Math.imul(w,tt)|0)+Math.imul(_,X)|0,o=o+Math.imul(_,tt)|0,n=n+Math.imul(v,rt)|0,i=(i=i+Math.imul(v,nt)|0)+Math.imul(b,rt)|0,o=o+Math.imul(b,nt)|0,n=n+Math.imul(p,ot)|0,i=(i=i+Math.imul(p,st)|0)+Math.imul(m,ot)|0,o=o+Math.imul(m,st)|0;var 
St=(h+(n=n+Math.imul(c,ut)|0)|0)+((8191&(i=(i=i+Math.imul(c,ht)|0)+Math.imul(l,ut)|0))<<13)|0;h=((o=o+Math.imul(l,ht)|0)+(i>>>13)|0)+(St>>>26)|0,St&=67108863,n=Math.imul(I,U),i=(i=Math.imul(I,F))+Math.imul(N,U)|0,o=Math.imul(N,F),n=n+Math.imul(R,V)|0,i=(i=i+Math.imul(R,K)|0)+Math.imul(B,V)|0,o=o+Math.imul(B,K)|0,n=n+Math.imul(j,Z)|0,i=(i=i+Math.imul(j,W)|0)+Math.imul($,Z)|0,o=o+Math.imul($,W)|0,n=n+Math.imul(E,Y)|0,i=(i=i+Math.imul(E,Q)|0)+Math.imul(x,Y)|0,o=o+Math.imul(x,Q)|0,n=n+Math.imul(S,X)|0,i=(i=i+Math.imul(S,tt)|0)+Math.imul(O,X)|0,o=o+Math.imul(O,tt)|0,n=n+Math.imul(w,rt)|0,i=(i=i+Math.imul(w,nt)|0)+Math.imul(_,rt)|0,o=o+Math.imul(_,nt)|0,n=n+Math.imul(v,ot)|0,i=(i=i+Math.imul(v,st)|0)+Math.imul(b,ot)|0,o=o+Math.imul(b,st)|0,n=n+Math.imul(p,ut)|0,i=(i=i+Math.imul(p,ht)|0)+Math.imul(m,ut)|0,o=o+Math.imul(m,ht)|0;var Ot=(h+(n=n+Math.imul(c,ct)|0)|0)+((8191&(i=(i=i+Math.imul(c,lt)|0)+Math.imul(l,ct)|0))<<13)|0;h=((o=o+Math.imul(l,lt)|0)+(i>>>13)|0)+(Ot>>>26)|0,Ot&=67108863,n=Math.imul(C,U),i=(i=Math.imul(C,F))+Math.imul(L,U)|0,o=Math.imul(L,F),n=n+Math.imul(I,V)|0,i=(i=i+Math.imul(I,K)|0)+Math.imul(N,V)|0,o=o+Math.imul(N,K)|0,n=n+Math.imul(R,Z)|0,i=(i=i+Math.imul(R,W)|0)+Math.imul(B,Z)|0,o=o+Math.imul(B,W)|0,n=n+Math.imul(j,Y)|0,i=(i=i+Math.imul(j,Q)|0)+Math.imul($,Y)|0,o=o+Math.imul($,Q)|0,n=n+Math.imul(E,X)|0,i=(i=i+Math.imul(E,tt)|0)+Math.imul(x,X)|0,o=o+Math.imul(x,tt)|0,n=n+Math.imul(S,rt)|0,i=(i=i+Math.imul(S,nt)|0)+Math.imul(O,rt)|0,o=o+Math.imul(O,nt)|0,n=n+Math.imul(w,ot)|0,i=(i=i+Math.imul(w,st)|0)+Math.imul(_,ot)|0,o=o+Math.imul(_,st)|0,n=n+Math.imul(v,ut)|0,i=(i=i+Math.imul(v,ht)|0)+Math.imul(b,ut)|0,o=o+Math.imul(b,ht)|0,n=n+Math.imul(p,ct)|0,i=(i=i+Math.imul(p,lt)|0)+Math.imul(m,ct)|0,o=o+Math.imul(m,lt)|0;var At=(h+(n=n+Math.imul(c,pt)|0)|0)+((8191&(i=(i=i+Math.imul(c,mt)|0)+Math.imul(l,pt)|0))<<13)|0;h=((o=o+Math.imul(l,mt)|0)+(i>>>13)|0)+(At>>>26)|0,At&=67108863,n=Math.imul(C,V),i=(i=Math.imul(C,K))+Math.imul(L,V)|0,o=Math.imul(L,K),n=n+Math.imul(I,Z)|0,i=(i=i+Math.imul(I,W)|0)+Math.imul(N,Z)|0,o=o+Math.imul(N,W)|0,n=n+Math.imul(R,Y)|0,i=(i=i+Math.imul(R,Q)|0)+Math.imul(B,Y)|0,o=o+Math.imul(B,Q)|0,n=n+Math.imul(j,X)|0,i=(i=i+Math.imul(j,tt)|0)+Math.imul($,X)|0,o=o+Math.imul($,tt)|0,n=n+Math.imul(E,rt)|0,i=(i=i+Math.imul(E,nt)|0)+Math.imul(x,rt)|0,o=o+Math.imul(x,nt)|0,n=n+Math.imul(S,ot)|0,i=(i=i+Math.imul(S,st)|0)+Math.imul(O,ot)|0,o=o+Math.imul(O,st)|0,n=n+Math.imul(w,ut)|0,i=(i=i+Math.imul(w,ht)|0)+Math.imul(_,ut)|0,o=o+Math.imul(_,ht)|0,n=n+Math.imul(v,ct)|0,i=(i=i+Math.imul(v,lt)|0)+Math.imul(b,ct)|0,o=o+Math.imul(b,lt)|0;var Et=(h+(n=n+Math.imul(p,pt)|0)|0)+((8191&(i=(i=i+Math.imul(p,mt)|0)+Math.imul(m,pt)|0))<<13)|0;h=((o=o+Math.imul(m,mt)|0)+(i>>>13)|0)+(Et>>>26)|0,Et&=67108863,n=Math.imul(C,Z),i=(i=Math.imul(C,W))+Math.imul(L,Z)|0,o=Math.imul(L,W),n=n+Math.imul(I,Y)|0,i=(i=i+Math.imul(I,Q)|0)+Math.imul(N,Y)|0,o=o+Math.imul(N,Q)|0,n=n+Math.imul(R,X)|0,i=(i=i+Math.imul(R,tt)|0)+Math.imul(B,X)|0,o=o+Math.imul(B,tt)|0,n=n+Math.imul(j,rt)|0,i=(i=i+Math.imul(j,nt)|0)+Math.imul($,rt)|0,o=o+Math.imul($,nt)|0,n=n+Math.imul(E,ot)|0,i=(i=i+Math.imul(E,st)|0)+Math.imul(x,ot)|0,o=o+Math.imul(x,st)|0,n=n+Math.imul(S,ut)|0,i=(i=i+Math.imul(S,ht)|0)+Math.imul(O,ut)|0,o=o+Math.imul(O,ht)|0,n=n+Math.imul(w,ct)|0,i=(i=i+Math.imul(w,lt)|0)+Math.imul(_,ct)|0,o=o+Math.imul(_,lt)|0;var 
xt=(h+(n=n+Math.imul(v,pt)|0)|0)+((8191&(i=(i=i+Math.imul(v,mt)|0)+Math.imul(b,pt)|0))<<13)|0;h=((o=o+Math.imul(b,mt)|0)+(i>>>13)|0)+(xt>>>26)|0,xt&=67108863,n=Math.imul(C,Y),i=(i=Math.imul(C,Q))+Math.imul(L,Y)|0,o=Math.imul(L,Q),n=n+Math.imul(I,X)|0,i=(i=i+Math.imul(I,tt)|0)+Math.imul(N,X)|0,o=o+Math.imul(N,tt)|0,n=n+Math.imul(R,rt)|0,i=(i=i+Math.imul(R,nt)|0)+Math.imul(B,rt)|0,o=o+Math.imul(B,nt)|0,n=n+Math.imul(j,ot)|0,i=(i=i+Math.imul(j,st)|0)+Math.imul($,ot)|0,o=o+Math.imul($,st)|0,n=n+Math.imul(E,ut)|0,i=(i=i+Math.imul(E,ht)|0)+Math.imul(x,ut)|0,o=o+Math.imul(x,ht)|0,n=n+Math.imul(S,ct)|0,i=(i=i+Math.imul(S,lt)|0)+Math.imul(O,ct)|0,o=o+Math.imul(O,lt)|0;var kt=(h+(n=n+Math.imul(w,pt)|0)|0)+((8191&(i=(i=i+Math.imul(w,mt)|0)+Math.imul(_,pt)|0))<<13)|0;h=((o=o+Math.imul(_,mt)|0)+(i>>>13)|0)+(kt>>>26)|0,kt&=67108863,n=Math.imul(C,X),i=(i=Math.imul(C,tt))+Math.imul(L,X)|0,o=Math.imul(L,tt),n=n+Math.imul(I,rt)|0,i=(i=i+Math.imul(I,nt)|0)+Math.imul(N,rt)|0,o=o+Math.imul(N,nt)|0,n=n+Math.imul(R,ot)|0,i=(i=i+Math.imul(R,st)|0)+Math.imul(B,ot)|0,o=o+Math.imul(B,st)|0,n=n+Math.imul(j,ut)|0,i=(i=i+Math.imul(j,ht)|0)+Math.imul($,ut)|0,o=o+Math.imul($,ht)|0,n=n+Math.imul(E,ct)|0,i=(i=i+Math.imul(E,lt)|0)+Math.imul(x,ct)|0,o=o+Math.imul(x,lt)|0;var jt=(h+(n=n+Math.imul(S,pt)|0)|0)+((8191&(i=(i=i+Math.imul(S,mt)|0)+Math.imul(O,pt)|0))<<13)|0;h=((o=o+Math.imul(O,mt)|0)+(i>>>13)|0)+(jt>>>26)|0,jt&=67108863,n=Math.imul(C,rt),i=(i=Math.imul(C,nt))+Math.imul(L,rt)|0,o=Math.imul(L,nt),n=n+Math.imul(I,ot)|0,i=(i=i+Math.imul(I,st)|0)+Math.imul(N,ot)|0,o=o+Math.imul(N,st)|0,n=n+Math.imul(R,ut)|0,i=(i=i+Math.imul(R,ht)|0)+Math.imul(B,ut)|0,o=o+Math.imul(B,ht)|0,n=n+Math.imul(j,ct)|0,i=(i=i+Math.imul(j,lt)|0)+Math.imul($,ct)|0,o=o+Math.imul($,lt)|0;var $t=(h+(n=n+Math.imul(E,pt)|0)|0)+((8191&(i=(i=i+Math.imul(E,mt)|0)+Math.imul(x,pt)|0))<<13)|0;h=((o=o+Math.imul(x,mt)|0)+(i>>>13)|0)+($t>>>26)|0,$t&=67108863,n=Math.imul(C,ot),i=(i=Math.imul(C,st))+Math.imul(L,ot)|0,o=Math.imul(L,st),n=n+Math.imul(I,ut)|0,i=(i=i+Math.imul(I,ht)|0)+Math.imul(N,ut)|0,o=o+Math.imul(N,ht)|0,n=n+Math.imul(R,ct)|0,i=(i=i+Math.imul(R,lt)|0)+Math.imul(B,ct)|0,o=o+Math.imul(B,lt)|0;var Pt=(h+(n=n+Math.imul(j,pt)|0)|0)+((8191&(i=(i=i+Math.imul(j,mt)|0)+Math.imul($,pt)|0))<<13)|0;h=((o=o+Math.imul($,mt)|0)+(i>>>13)|0)+(Pt>>>26)|0,Pt&=67108863,n=Math.imul(C,ut),i=(i=Math.imul(C,ht))+Math.imul(L,ut)|0,o=Math.imul(L,ht),n=n+Math.imul(I,ct)|0,i=(i=i+Math.imul(I,lt)|0)+Math.imul(N,ct)|0,o=o+Math.imul(N,lt)|0;var Rt=(h+(n=n+Math.imul(R,pt)|0)|0)+((8191&(i=(i=i+Math.imul(R,mt)|0)+Math.imul(B,pt)|0))<<13)|0;h=((o=o+Math.imul(B,mt)|0)+(i>>>13)|0)+(Rt>>>26)|0,Rt&=67108863,n=Math.imul(C,ct),i=(i=Math.imul(C,lt))+Math.imul(L,ct)|0,o=Math.imul(L,lt);var Bt=(h+(n=n+Math.imul(I,pt)|0)|0)+((8191&(i=(i=i+Math.imul(I,mt)|0)+Math.imul(N,pt)|0))<<13)|0;h=((o=o+Math.imul(N,mt)|0)+(i>>>13)|0)+(Bt>>>26)|0,Bt&=67108863;var Tt=(h+(n=Math.imul(C,pt))|0)+((8191&(i=(i=Math.imul(C,mt))+Math.imul(L,pt)|0))<<13)|0;return h=((o=Math.imul(L,mt))+(i>>>13)|0)+(Tt>>>26)|0,Tt&=67108863,u[0]=yt,u[1]=vt,u[2]=bt,u[3]=gt,u[4]=wt,u[5]=_t,u[6]=Mt,u[7]=St,u[8]=Ot,u[9]=At,u[10]=Et,u[11]=xt,u[12]=kt,u[13]=jt,u[14]=$t,u[15]=Pt,u[16]=Rt,u[17]=Bt,u[18]=Tt,0!==h&&(u[19]=h,r.length++),r};function y(t,e,r){return(new v).mulp(t,e,r)}function v(t,e){this.x=t,this.y=e}Math.imul||(m=p),s.prototype.mulTo=function(t,e){var r=this.length+t.length;return 
10===this.length&&10===t.length?m(this,t,e):r<63?p(this,t,e):r<1024?function(t,e,r){r.negative=e.negative^t.negative,r.length=t.length+e.length;for(var n=0,i=0,o=0;o>>26)|0)>>>26,s&=67108863}r.words[o]=a,n=s,s=i}return 0!==n?r.words[o]=n:r.length--,r.strip()}(this,t,e):y(this,t,e)},v.prototype.makeRBT=function(t){for(var e=new Array(t),r=s.prototype._countBits(t)-1,n=0;n>=1;return n},v.prototype.permute=function(t,e,r,n,i,o){for(var s=0;s>>=1)i++;return 1<>>=13,r[2*s+1]=8191&o,o>>>=13;for(s=2*e;s>=26,e+=n/67108864|0,e+=o>>>26,this.words[r]=67108863&o}return 0!==e&&(this.words[r]=e,this.length++),this},s.prototype.muln=function(t){return this.clone().imuln(t)},s.prototype.sqr=function(){return this.mul(this)},s.prototype.isqr=function(){return this.imul(this.clone())},s.prototype.pow=function(t){var e=function(t){for(var e=new Array(t.bitLength()),r=0;r>>i}return e}(t);if(0===e.length)return new s(1);for(var r=this,n=0;n=0);var e,r=t%26,n=(t-r)/26,o=67108863>>>26-r<<26-r;if(0!==r){var s=0;for(e=0;e>>26-r}s&&(this.words[e]=s,this.length++)}if(0!==n){for(e=this.length-1;e>=0;e--)this.words[e+n]=this.words[e];for(e=0;e=0),n=e?(e-e%26)/26:0;var o=t%26,s=Math.min((t-o)/26,this.length),a=67108863^67108863>>>o<s)for(this.length-=s,h=0;h=0&&(0!==f||h>=n);h--){var c=0|this.words[h];this.words[h]=f<<26-o|c>>>o,f=c&a}return u&&0!==f&&(u.words[u.length++]=f),0===this.length&&(this.words[0]=0,this.length=1),this.strip()},s.prototype.ishrn=function(t,e,r){return i(0===this.negative),this.iushrn(t,e,r)},s.prototype.shln=function(t){return this.clone().ishln(t)},s.prototype.ushln=function(t){return this.clone().iushln(t)},s.prototype.shrn=function(t){return this.clone().ishrn(t)},s.prototype.ushrn=function(t){return this.clone().iushrn(t)},s.prototype.testn=function(t){i("number"==typeof t&&t>=0);var e=t%26,r=(t-e)/26,n=1<=0);var e=t%26,r=(t-e)/26;if(i(0===this.negative,"imaskn works only with positive numbers"),this.length<=r)return this;if(0!==e&&r++,this.length=Math.min(r,this.length),0!==e){var n=67108863^67108863>>>e<=67108864;e++)this.words[e]-=67108864,e===this.length-1?this.words[e+1]=1:this.words[e+1]++;return this.length=Math.max(this.length,e+1),this},s.prototype.isubn=function(t){if(i("number"==typeof t),i(t<67108864),t<0)return this.iaddn(-t);if(0!==this.negative)return this.negative=0,this.iaddn(t),this.negative=1,this;if(this.words[0]-=t,1===this.length&&this.words[0]<0)this.words[0]=-this.words[0],this.negative=1;else for(var e=0;e>26)-(u/67108864|0),this.words[n+r]=67108863&o}for(;n>26,this.words[n+r]=67108863&o;if(0===a)return this.strip();for(i(-1===a),a=0,n=0;n>26,this.words[n]=67108863&o;return this.negative=1,this.strip()},s.prototype._wordDiv=function(t,e){var r=(this.length,t.length),n=this.clone(),i=t,o=0|i.words[i.length-1];0!==(r=26-this._countBits(o))&&(i=i.ushln(r),n.iushln(r),o=0|i.words[i.length-1]);var a,u=n.length-i.length;if("mod"!==e){(a=new s(null)).length=u+1,a.words=new Array(a.length);for(var h=0;h=0;c--){var l=67108864*(0|n.words[i.length+c])+(0|n.words[i.length+c-1]);for(l=Math.min(l/o|0,67108863),n._ishlnsubmul(i,l,c);0!==n.negative;)l--,n.negative=0,n._ishlnsubmul(i,1,c),n.isZero()||(n.negative^=1);a&&(a.words[c]=l)}return a&&a.strip(),n.strip(),"div"!==e&&0!==r&&n.iushrn(r),{div:a||null,mod:n}},s.prototype.divmod=function(t,e,r){return i(!t.isZero()),this.isZero()?{div:new s(0),mod:new 
s(0)}:0!==this.negative&&0===t.negative?(a=this.neg().divmod(t,e),"mod"!==e&&(n=a.div.neg()),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.iadd(t)),{div:n,mod:o}):0===this.negative&&0!==t.negative?(a=this.divmod(t.neg(),e),"mod"!==e&&(n=a.div.neg()),{div:n,mod:a.mod}):0!=(this.negative&t.negative)?(a=this.neg().divmod(t.neg(),e),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.isub(t)),{div:a.div,mod:o}):t.length>this.length||this.cmp(t)<0?{div:new s(0),mod:this}:1===t.length?"div"===e?{div:this.divn(t.words[0]),mod:null}:"mod"===e?{div:null,mod:new s(this.modn(t.words[0]))}:{div:this.divn(t.words[0]),mod:new s(this.modn(t.words[0]))}:this._wordDiv(t,e);var n,o,a},s.prototype.div=function(t){return this.divmod(t,"div",!1).div},s.prototype.mod=function(t){return this.divmod(t,"mod",!1).mod},s.prototype.umod=function(t){return this.divmod(t,"mod",!0).mod},s.prototype.divRound=function(t){var e=this.divmod(t);if(e.mod.isZero())return e.div;var r=0!==e.div.negative?e.mod.isub(t):e.mod,n=t.ushrn(1),i=t.andln(1),o=r.cmp(n);return o<0||1===i&&0===o?e.div:0!==e.div.negative?e.div.isubn(1):e.div.iaddn(1)},s.prototype.modn=function(t){i(t<=67108863);for(var e=(1<<26)%t,r=0,n=this.length-1;n>=0;n--)r=(e*r+(0|this.words[n]))%t;return r},s.prototype.idivn=function(t){i(t<=67108863);for(var e=0,r=this.length-1;r>=0;r--){var n=(0|this.words[r])+67108864*e;this.words[r]=n/t|0,e=n%t}return this.strip()},s.prototype.divn=function(t){return this.clone().idivn(t)},s.prototype.egcd=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n=new s(1),o=new s(0),a=new s(0),u=new s(1),h=0;e.isEven()&&r.isEven();)e.iushrn(1),r.iushrn(1),++h;for(var f=r.clone(),c=e.clone();!e.isZero();){for(var l=0,d=1;0==(e.words[0]&d)&&l<26;++l,d<<=1);if(l>0)for(e.iushrn(l);l-- >0;)(n.isOdd()||o.isOdd())&&(n.iadd(f),o.isub(c)),n.iushrn(1),o.iushrn(1);for(var p=0,m=1;0==(r.words[0]&m)&&p<26;++p,m<<=1);if(p>0)for(r.iushrn(p);p-- >0;)(a.isOdd()||u.isOdd())&&(a.iadd(f),u.isub(c)),a.iushrn(1),u.iushrn(1);e.cmp(r)>=0?(e.isub(r),n.isub(a),o.isub(u)):(r.isub(e),a.isub(n),u.isub(o))}return{a:a,b:u,gcd:r.iushln(h)}},s.prototype._invmp=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n,o=new s(1),a=new s(0),u=r.clone();e.cmpn(1)>0&&r.cmpn(1)>0;){for(var h=0,f=1;0==(e.words[0]&f)&&h<26;++h,f<<=1);if(h>0)for(e.iushrn(h);h-- >0;)o.isOdd()&&o.iadd(u),o.iushrn(1);for(var c=0,l=1;0==(r.words[0]&l)&&c<26;++c,l<<=1);if(c>0)for(r.iushrn(c);c-- >0;)a.isOdd()&&a.iadd(u),a.iushrn(1);e.cmp(r)>=0?(e.isub(r),o.isub(a)):(r.isub(e),a.isub(o))}return(n=0===e.cmpn(1)?o:a).cmpn(0)<0&&n.iadd(t),n},s.prototype.gcd=function(t){if(this.isZero())return t.abs();if(t.isZero())return this.abs();var e=this.clone(),r=t.clone();e.negative=0,r.negative=0;for(var n=0;e.isEven()&&r.isEven();n++)e.iushrn(1),r.iushrn(1);for(;;){for(;e.isEven();)e.iushrn(1);for(;r.isEven();)r.iushrn(1);var i=e.cmp(r);if(i<0){var o=e;e=r,r=o}else if(0===i||0===r.cmpn(1))break;e.isub(r)}return r.iushln(n)},s.prototype.invm=function(t){return this.egcd(t).a.umod(t)},s.prototype.isEven=function(){return 0==(1&this.words[0])},s.prototype.isOdd=function(){return 1==(1&this.words[0])},s.prototype.andln=function(t){return this.words[0]&t},s.prototype.bincn=function(t){i("number"==typeof t);var e=t%26,r=(t-e)/26,n=1<>>26,a&=67108863,this.words[s]=a}return 0!==o&&(this.words[s]=o,this.length++),this},s.prototype.isZero=function(){return 
1===this.length&&0===this.words[0]},s.prototype.cmpn=function(t){var e,r=t<0;if(0!==this.negative&&!r)return-1;if(0===this.negative&&r)return 1;if(this.strip(),this.length>1)e=1;else{r&&(t=-t),i(t<=67108863,"Number is too big");var n=0|this.words[0];e=n===t?0:nt.length)return 1;if(this.length=0;r--){var n=0|this.words[r],i=0|t.words[r];if(n!==i){ni&&(e=1);break}}return e},s.prototype.gtn=function(t){return 1===this.cmpn(t)},s.prototype.gt=function(t){return 1===this.cmp(t)},s.prototype.gten=function(t){return this.cmpn(t)>=0},s.prototype.gte=function(t){return this.cmp(t)>=0},s.prototype.ltn=function(t){return-1===this.cmpn(t)},s.prototype.lt=function(t){return-1===this.cmp(t)},s.prototype.lten=function(t){return this.cmpn(t)<=0},s.prototype.lte=function(t){return this.cmp(t)<=0},s.prototype.eqn=function(t){return 0===this.cmpn(t)},s.prototype.eq=function(t){return 0===this.cmp(t)},s.red=function(t){return new O(t)},s.prototype.toRed=function(t){return i(!this.red,"Already a number in reduction context"),i(0===this.negative,"red works only with positives"),t.convertTo(this)._forceRed(t)},s.prototype.fromRed=function(){return i(this.red,"fromRed works only with numbers in reduction context"),this.red.convertFrom(this)},s.prototype._forceRed=function(t){return this.red=t,this},s.prototype.forceRed=function(t){return i(!this.red,"Already a number in reduction context"),this._forceRed(t)},s.prototype.redAdd=function(t){return i(this.red,"redAdd works only with red numbers"),this.red.add(this,t)},s.prototype.redIAdd=function(t){return i(this.red,"redIAdd works only with red numbers"),this.red.iadd(this,t)},s.prototype.redSub=function(t){return i(this.red,"redSub works only with red numbers"),this.red.sub(this,t)},s.prototype.redISub=function(t){return i(this.red,"redISub works only with red numbers"),this.red.isub(this,t)},s.prototype.redShl=function(t){return i(this.red,"redShl works only with red numbers"),this.red.shl(this,t)},s.prototype.redMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.mul(this,t)},s.prototype.redIMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.imul(this,t)},s.prototype.redSqr=function(){return i(this.red,"redSqr works only with red numbers"),this.red._verify1(this),this.red.sqr(this)},s.prototype.redISqr=function(){return i(this.red,"redISqr works only with red numbers"),this.red._verify1(this),this.red.isqr(this)},s.prototype.redSqrt=function(){return i(this.red,"redSqrt works only with red numbers"),this.red._verify1(this),this.red.sqrt(this)},s.prototype.redInvm=function(){return i(this.red,"redInvm works only with red numbers"),this.red._verify1(this),this.red.invm(this)},s.prototype.redNeg=function(){return i(this.red,"redNeg works only with red numbers"),this.red._verify1(this),this.red.neg(this)},s.prototype.redPow=function(t){return i(this.red&&!t.red,"redPow(normalNum)"),this.red._verify1(this),this.red.pow(this,t)};var b={k256:null,p224:null,p192:null,p25519:null};function g(t,e){this.name=t,this.p=new s(e,16),this.n=this.p.bitLength(),this.k=new s(1).iushln(this.n).isub(this.p),this.tmp=this._tmp()}function w(){g.call(this,"k256","ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe fffffc2f")}function _(){g.call(this,"p224","ffffffff ffffffff ffffffff ffffffff 00000000 00000000 00000001")}function M(){g.call(this,"p192","ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff")}function S(){g.call(this,"25519","7fffffffffffffff 
ffffffffffffffff ffffffffffffffff ffffffffffffffed")}function O(t){if("string"==typeof t){var e=s._prime(t);this.m=e.p,this.prime=e}else i(t.gtn(1),"modulus must be greater than 1"),this.m=t,this.prime=null}function A(t){O.call(this,t),this.shift=this.m.bitLength(),this.shift%26!=0&&(this.shift+=26-this.shift%26),this.r=new s(1).iushln(this.shift),this.r2=this.imod(this.r.sqr()),this.rinv=this.r._invmp(this.m),this.minv=this.rinv.mul(this.r).isubn(1).div(this.m),this.minv=this.minv.umod(this.r),this.minv=this.r.sub(this.minv)}g.prototype._tmp=function(){var t=new s(null);return t.words=new Array(Math.ceil(this.n/13)),t},g.prototype.ireduce=function(t){var e,r=t;do{this.split(r,this.tmp),e=(r=(r=this.imulK(r)).iadd(this.tmp)).bitLength()}while(e>this.n);var n=e0?r.isub(this.p):void 0!==r.strip?r.strip():r._strip(),r},g.prototype.split=function(t,e){t.iushrn(this.n,0,e)},g.prototype.imulK=function(t){return t.imul(this.k)},o(w,g),w.prototype.split=function(t,e){for(var r=Math.min(t.length,9),n=0;n>>22,i=o}i>>>=22,t.words[n-10]=i,0===i&&t.length>10?t.length-=10:t.length-=9},w.prototype.imulK=function(t){t.words[t.length]=0,t.words[t.length+1]=0,t.length+=2;for(var e=0,r=0;r>>=26,t.words[r]=i,e=n}return 0!==e&&(t.words[t.length++]=e),t},s._prime=function(t){if(b[t])return b[t];var e;if("k256"===t)e=new w;else if("p224"===t)e=new _;else if("p192"===t)e=new M;else{if("p25519"!==t)throw new Error("Unknown prime "+t);e=new S}return b[t]=e,e},O.prototype._verify1=function(t){i(0===t.negative,"red works only with positives"),i(t.red,"red works only with red numbers")},O.prototype._verify2=function(t,e){i(0==(t.negative|e.negative),"red works only with positives"),i(t.red&&t.red===e.red,"red works only with red numbers")},O.prototype.imod=function(t){return this.prime?this.prime.ireduce(t)._forceRed(this):t.umod(this.m)._forceRed(this)},O.prototype.neg=function(t){return t.isZero()?t.clone():this.m.sub(t)._forceRed(this)},O.prototype.add=function(t,e){this._verify2(t,e);var r=t.add(e);return r.cmp(this.m)>=0&&r.isub(this.m),r._forceRed(this)},O.prototype.iadd=function(t,e){this._verify2(t,e);var r=t.iadd(e);return r.cmp(this.m)>=0&&r.isub(this.m),r},O.prototype.sub=function(t,e){this._verify2(t,e);var r=t.sub(e);return r.cmpn(0)<0&&r.iadd(this.m),r._forceRed(this)},O.prototype.isub=function(t,e){this._verify2(t,e);var r=t.isub(e);return r.cmpn(0)<0&&r.iadd(this.m),r},O.prototype.shl=function(t,e){return this._verify1(t),this.imod(t.ushln(e))},O.prototype.imul=function(t,e){return this._verify2(t,e),this.imod(t.imul(e))},O.prototype.mul=function(t,e){return this._verify2(t,e),this.imod(t.mul(e))},O.prototype.isqr=function(t){return this.imul(t,t.clone())},O.prototype.sqr=function(t){return this.mul(t,t)},O.prototype.sqrt=function(t){if(t.isZero())return t.clone();var e=this.m.andln(3);if(i(e%2==1),3===e){var r=this.m.add(new s(1)).iushrn(2);return this.pow(t,r)}for(var n=this.m.subn(1),o=0;!n.isZero()&&0===n.andln(1);)o++,n.iushrn(1);i(!n.isZero());var a=new s(1).toRed(this),u=a.redNeg(),h=this.m.subn(1).iushrn(1),f=this.m.bitLength();for(f=new s(2*f*f).toRed(this);0!==this.pow(f,h).cmp(u);)f.redIAdd(u);for(var c=this.pow(f,n),l=this.pow(t,n.addn(1).iushrn(1)),d=this.pow(t,n),p=o;0!==d.cmp(a);){for(var m=d,y=0;0!==m.cmp(a);y++)m=m.redSqr();i(y=0;n--){for(var h=e.words[n],f=u-1;f>=0;f--){var c=h>>f&1;i!==r[0]&&(i=this.sqr(i)),0!==c||0!==o?(o<<=1,o|=c,(4===++a||0===n&&0===f)&&(i=this.mul(i,r[o]),a=0,o=0)):a=0}u=26}return i},O.prototype.convertTo=function(t){var e=t.umod(this.m);return 
e===t?e.clone():e},O.prototype.convertFrom=function(t){var e=t.clone();return e.red=null,e},s.mont=function(t){return new A(t)},o(A,O),A.prototype.convertTo=function(t){return this.imod(t.ushln(this.shift))},A.prototype.convertFrom=function(t){var e=this.imod(t.mul(this.rinv));return e.red=null,e},A.prototype.imul=function(t,e){if(t.isZero()||e.isZero())return t.words[0]=0,t.length=1,t;var r=t.imul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.mul=function(t,e){if(t.isZero()||e.isZero())return new s(0)._forceRed(this);var r=t.mul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.invm=function(t){return this.imod(t._invmp(this.m).mul(this.r2))._forceRed(this)}}(t,this)}).call(this,r(22)(t))},function(t,e){},function(t,e){},function(t){t.exports=JSON.parse('{"modp1":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a63a3620ffffffffffffffff"},"modp2":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece65381ffffffffffffffff"},"modp5":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca237327ffffffffffffffff"},"modp14":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aacaa68ffffffffffffffff"},"modp15":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aaac42dad33170d04507a33a85521abdf1cba64ecfb850458dbef0a8aea71575d060c7db3970f85a6e1e4c7abf5ae8cdb0933d71e8c94e04a25619dcee3d2261ad2ee6bf12ffa06d98a0864d87602733ec86a64521f2b18177b200cbbe117577a615d6c770988c0bad946e208e24fa074e5ab3143db5bfce0fd108e4b82d120a93ad2caffffffffffffffff"},"modp16":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6
f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aaac42dad33170d04507a33a85521abdf1cba64ecfb850458dbef0a8aea71575d060c7db3970f85a6e1e4c7abf5ae8cdb0933d71e8c94e04a25619dcee3d2261ad2ee6bf12ffa06d98a0864d87602733ec86a64521f2b18177b200cbbe117577a615d6c770988c0bad946e208e24fa074e5ab3143db5bfce0fd108e4b82d120a92108011a723c12a787e6d788719a10bdba5b2699c327186af4e23c1a946834b6150bda2583e9ca2ad44ce8dbbbc2db04de8ef92e8efc141fbecaa6287c59474e6bc05d99b2964fa090c3a2233ba186515be7ed1f612970cee2d7afb81bdd762170481cd0069127d5b05aa993b4ea988d8fddc186ffb7dc90a6c08f4df435c934063199ffffffffffffffff"},"modp17":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aaac42dad33170d04507a33a85521abdf1cba64ecfb850458dbef0a8aea71575d060c7db3970f85a6e1e4c7abf5ae8cdb0933d71e8c94e04a25619dcee3d2261ad2ee6bf12ffa06d98a0864d87602733ec86a64521f2b18177b200cbbe117577a615d6c770988c0bad946e208e24fa074e5ab3143db5bfce0fd108e4b82d120a92108011a723c12a787e6d788719a10bdba5b2699c327186af4e23c1a946834b6150bda2583e9ca2ad44ce8dbbbc2db04de8ef92e8efc141fbecaa6287c59474e6bc05d99b2964fa090c3a2233ba186515be7ed1f612970cee2d7afb81bdd762170481cd0069127d5b05aa993b4ea988d8fddc186ffb7dc90a6c08f4df435c93402849236c3fab4d27c7026c1d4dcb2602646dec9751e763dba37bdf8ff9406ad9e530ee5db382f413001aeb06a53ed9027d831179727b0865a8918da3edbebcf9b14ed44ce6cbaced4bb1bdb7f1447e6cc254b332051512bd7af426fb8f401378cd2bf5983ca01c64b92ecf032ea15d1721d03f482d7ce6e74fef6d55e702f46980c82b5a84031900b1c9e59e7c97fbec7e8f323a97a7e36cc88be0f1d45b7ff585ac54bd407b22b4154aacc8f6d7ebf48e1d814cc5ed20f8037e0a79715eef29be32806a1d58bb7c5da76f550aa3d8a1fbff0eb19ccb1a313d55cda56c9ec2ef29632387fe8d76e3c0468043e8f663f4860ee12bf2d5b0b7474d6e694f91e6dcc4024ffffffffffffffff"},"modp18":{"gen":"02","prime":"ffffffffffffffffc90fdaa22168c234c4c6628b80dc1cd129024e088a67cc74020bbea63b139b22514a08798e3404ddef9519b3cd3a431b302b0a6df25f14374fe1356d6d51c245e485b576625e7ec6f44c42e9a637ed6b0bff5cb6f406b7edee386bfb5a899fa5ae9f24117c4b1fe649286651ece45b3dc2007cb8a163bf0598da48361c55d39a69163fa8fd24cf5f83655d23dca3ad961c62f356208552bb9ed529077096966d670c354e4abc9804f1746c08ca18217c32905e462e36ce3be39e772c180e86039b2783a2ec07a28fb5c55df06f4c52c9de2bcbf6955817183995497cea956ae515d2261898fa051015728e5a8aaac42dad33170d04507a33a85521abdf1cba64ecfb850458dbef0a8aea71575d060c7db3970f85a6e1e4c7abf5ae8cdb0933d71e8c94e04a25619dcee3d2261ad2ee6bf12ffa06d98a0864d87602733ec86a64521f2b18177b200cbbe117577a615d6c770988c0bad946e208e24fa074e5ab3143db5bfce0fd108e4b82d120a92108011a723c12a787e6d788719a10bdba5b2699c327186af4e23c1a946834b6150bda2583e9ca2ad44ce8dbbbc2db04de8ef92e8efc141fbecaa6287c59474e6bc05d99b2964fa090c3a2233ba186515be7ed1f612970cee2d7afb81bdd762170481cd0069127d5b05aa993b4ea988d8fddc186ffb7dc90a6c08f4df435c93402849236c3fab4d27c7026c1d4dcb2602646dec9751e763dba37bdf8ff9406ad9e530ee5db382f413001aeb
06a53ed9027d831179727b0865a8918da3edbebcf9b14ed44ce6cbaced4bb1bdb7f1447e6cc254b332051512bd7af426fb8f401378cd2bf5983ca01c64b92ecf032ea15d1721d03f482d7ce6e74fef6d55e702f46980c82b5a84031900b1c9e59e7c97fbec7e8f323a97a7e36cc88be0f1d45b7ff585ac54bd407b22b4154aacc8f6d7ebf48e1d814cc5ed20f8037e0a79715eef29be32806a1d58bb7c5da76f550aa3d8a1fbff0eb19ccb1a313d55cda56c9ec2ef29632387fe8d76e3c0468043e8f663f4860ee12bf2d5b0b7474d6e694f91e6dbe115974a3926f12fee5e438777cb6a932df8cd8bec4d073b931ba3bc832b68d9dd300741fa7bf8afc47ed2576f6936ba424663aab639c5ae4f5683423b4742bf1c978238f16cbe39d652de3fdb8befc848ad922222e04a4037c0713eb57a81a23f0c73473fc646cea306b4bcbc8862f8385ddfa9d4b7fa2c087e879683303ed5bdd3a062b3cf5b3a278a66d2a13f83f44f82ddf310ee074ab6a364597e899a0255dc164f31cc50846851df9ab48195ded7ea1b1d510bd7ee74d73faf36bc31ecfa268359046f4eb879f924009438b481c6cd7889a002ed5ee382bc9190da6fc026e479558e4475677e9aa9e3050e2765694dfc81f56e880b96e7160c980dd98edd3dfffffffffffffffff"}}')},function(t,e,r){(function(e){var n=r(126),i=new(r(127)),o=new n(24),s=new n(11),a=new n(10),u=new n(3),h=new n(7),f=r(125),c=r(27);function l(t,r){return r=r||"utf8",e.isBuffer(t)||(t=new e(t,r)),this._pub=new n(t),this}function d(t,r){return r=r||"utf8",e.isBuffer(t)||(t=new e(t,r)),this._priv=new n(t),this}t.exports=m;var p={};function m(t,e,r){this.setGenerator(e),this.__prime=new n(t),this._prime=n.mont(this.__prime),this._primeLen=t.length,this._pub=void 0,this._priv=void 0,this._primeCode=void 0,r?(this.setPublicKey=l,this.setPrivateKey=d):this._primeCode=8}function y(t,r){var n=new e(t.toArray());return r?n.toString(r):n}Object.defineProperty(m.prototype,"verifyError",{enumerable:!0,get:function(){return"number"!=typeof this._primeCode&&(this._primeCode=function(t,e){var r=e.toString("hex"),n=[r,t.toString(16)].join("_");if(n in p)return p[n];var c,l=0;if(t.isEven()||!f.simpleSieve||!f.fermatTest(t)||!i.test(t))return l+=1,l+="02"===r||"05"===r?8:4,p[n]=l,l;switch(i.test(t.shrn(1))||(l+=2),r){case"02":t.mod(o).cmp(s)&&(l+=8);break;case"05":(c=t.mod(a)).cmp(u)&&c.cmp(h)&&(l+=8);break;default:l+=4}return p[n]=l,l}(this.__prime,this.__gen)),this._primeCode}}),m.prototype.generateKeys=function(){return this._priv||(this._priv=new n(c(this._primeLen))),this._pub=this._gen.toRed(this._prime).redPow(this._priv).fromRed(),this.getPublicKey()},m.prototype.computeSecret=function(t){var r=(t=(t=new n(t)).toRed(this._prime)).redPow(this._priv).fromRed(),i=new e(r.toArray()),o=this.getPrime();if(i.length0&&r.ishrn(n),r}function l(t,e,r){var o,s;do{for(o=n.alloc(0);8*o.length=0&&(s=e,a=r),n.negative&&(n=n.neg(),o=o.neg()),s.negative&&(s=s.neg(),a=a.neg()),[{a:n,b:o},{a:s,b:a}]},u.prototype._endoSplit=function(t){var e=this.endo.basis,r=e[0],n=e[1],i=n.b.mul(t).divRound(this.n),o=r.b.neg().mul(t).divRound(this.n),s=i.mul(r.a),a=o.mul(n.a),u=i.mul(r.b),h=o.mul(n.b);return{k1:t.sub(s).sub(a),k2:u.add(h).neg()}},u.prototype.pointFromX=function(t,e){(t=new i(t,16)).red||(t=t.toRed(this.red));var r=t.redSqr().redMul(t).redIAdd(t.redMul(this.a)).redIAdd(this.b),n=r.redSqrt();if(0!==n.redSqr().redSub(r).cmp(this.zero))throw new Error("invalid point");var o=n.fromRed().isOdd();return(e&&!o||!e&&o)&&(n=n.redNeg()),this.point(t,n)},u.prototype.validate=function(t){if(t.inf)return!0;var e=t.x,r=t.y,n=this.a.redMul(e),i=e.redSqr().redMul(e).redIAdd(n).redIAdd(this.b);return 0===r.redSqr().redISub(i).cmpn(0)},u.prototype._endoWnafMulAdd=function(t,e,r){for(var n=this._endoWnafT1,i=this._endoWnafT2,o=0;o":""},h.prototype.isInfinity=function(){return 
this.inf},h.prototype.add=function(t){if(this.inf)return t;if(t.inf)return this;if(this.eq(t))return this.dbl();if(this.neg().eq(t))return this.curve.point(null,null);if(0===this.x.cmp(t.x))return this.curve.point(null,null);var e=this.y.redSub(t.y);0!==e.cmpn(0)&&(e=e.redMul(this.x.redSub(t.x).redInvm()));var r=e.redSqr().redISub(this.x).redISub(t.x),n=e.redMul(this.x.redSub(r)).redISub(this.y);return this.curve.point(r,n)},h.prototype.dbl=function(){if(this.inf)return this;var t=this.y.redAdd(this.y);if(0===t.cmpn(0))return this.curve.point(null,null);var e=this.curve.a,r=this.x.redSqr(),n=t.redInvm(),i=r.redAdd(r).redIAdd(r).redIAdd(e).redMul(n),o=i.redSqr().redISub(this.x.redAdd(this.x)),s=i.redMul(this.x.redSub(o)).redISub(this.y);return this.curve.point(o,s)},h.prototype.getX=function(){return this.x.fromRed()},h.prototype.getY=function(){return this.y.fromRed()},h.prototype.mul=function(t){return t=new i(t,16),this.isInfinity()?this:this._hasDoubles(t)?this.curve._fixedNafMul(this,t):this.curve.endo?this.curve._endoWnafMulAdd([this],[t]):this.curve._wnafMul(this,t)},h.prototype.mulAdd=function(t,e,r){var n=[this,e],i=[t,r];return this.curve.endo?this.curve._endoWnafMulAdd(n,i):this.curve._wnafMulAdd(1,n,i,2)},h.prototype.jmulAdd=function(t,e,r){var n=[this,e],i=[t,r];return this.curve.endo?this.curve._endoWnafMulAdd(n,i,!0):this.curve._wnafMulAdd(1,n,i,2,!0)},h.prototype.eq=function(t){return this===t||this.inf===t.inf&&(this.inf||0===this.x.cmp(t.x)&&0===this.y.cmp(t.y))},h.prototype.neg=function(t){if(this.inf)return this;var e=this.curve.point(this.x,this.y.redNeg());if(t&&this.precomputed){var r=this.precomputed,n=function(t){return t.neg()};e.precomputed={naf:r.naf&&{wnd:r.naf.wnd,points:r.naf.points.map(n)},doubles:r.doubles&&{step:r.doubles.step,points:r.doubles.points.map(n)}}}return e},h.prototype.toJ=function(){return this.inf?this.curve.jpoint(null,null,null):this.curve.jpoint(this.x,this.y,this.curve.one)},o(f,s.BasePoint),u.prototype.jpoint=function(t,e,r){return new f(this,t,e,r)},f.prototype.toP=function(){if(this.isInfinity())return this.curve.point(null,null);var t=this.z.redInvm(),e=t.redSqr(),r=this.x.redMul(e),n=this.y.redMul(e).redMul(t);return this.curve.point(r,n)},f.prototype.neg=function(){return this.curve.jpoint(this.x,this.y.redNeg(),this.z)},f.prototype.add=function(t){if(this.isInfinity())return t;if(t.isInfinity())return this;var e=t.z.redSqr(),r=this.z.redSqr(),n=this.x.redMul(e),i=t.x.redMul(r),o=this.y.redMul(e.redMul(t.z)),s=t.y.redMul(r.redMul(this.z)),a=n.redSub(i),u=o.redSub(s);if(0===a.cmpn(0))return 0!==u.cmpn(0)?this.curve.jpoint(null,null,null):this.dbl();var h=a.redSqr(),f=h.redMul(a),c=n.redMul(h),l=u.redSqr().redIAdd(f).redISub(c).redISub(c),d=u.redMul(c.redISub(l)).redISub(o.redMul(f)),p=this.z.redMul(t.z).redMul(a);return this.curve.jpoint(l,d,p)},f.prototype.mixedAdd=function(t){if(this.isInfinity())return t.toJ();if(t.isInfinity())return this;var e=this.z.redSqr(),r=this.x,n=t.x.redMul(e),i=this.y,o=t.y.redMul(e).redMul(this.z),s=r.redSub(n),a=i.redSub(o);if(0===s.cmpn(0))return 0!==a.cmpn(0)?this.curve.jpoint(null,null,null):this.dbl();var u=s.redSqr(),h=u.redMul(s),f=r.redMul(u),c=a.redSqr().redIAdd(h).redISub(f).redISub(f),l=a.redMul(f.redISub(c)).redISub(i.redMul(h)),d=this.z.redMul(s);return this.curve.jpoint(c,l,d)},f.prototype.dblp=function(t){if(0===t)return this;if(this.isInfinity())return this;if(!t)return this.dbl();var e;if(this.curve.zeroA||this.curve.threeA){var 
r=this;for(e=0;e=0)return!1;if(r.redIAdd(i),0===this.x.cmp(r))return!0}},f.prototype.inspect=function(){return this.isInfinity()?"":""},f.prototype.isInfinity=function(){return 0===this.z.cmpn(0)}},function(t,e,r){"use strict";var n=r(19),i=r(0),o=r(49),s=r(12);function a(t){o.call(this,"mont",t),this.a=new n(t.a,16).toRed(this.red),this.b=new n(t.b,16).toRed(this.red),this.i4=new n(4).toRed(this.red).redInvm(),this.two=new n(2).toRed(this.red),this.a24=this.i4.redMul(this.a.redAdd(this.two))}function u(t,e,r){o.BasePoint.call(this,t,"projective"),null===e&&null===r?(this.x=this.curve.one,this.z=this.curve.zero):(this.x=new n(e,16),this.z=new n(r,16),this.x.red||(this.x=this.x.toRed(this.curve.red)),this.z.red||(this.z=this.z.toRed(this.curve.red)))}i(a,o),t.exports=a,a.prototype.validate=function(t){var e=t.normalize().x,r=e.redSqr(),n=r.redMul(e).redAdd(r.redMul(this.a)).redAdd(e);return 0===n.redSqrt().redSqr().cmp(n)},i(u,o.BasePoint),a.prototype.decodePoint=function(t,e){return this.point(s.toArray(t,e),1)},a.prototype.point=function(t,e){return new u(this,t,e)},a.prototype.pointFromJSON=function(t){return u.fromJSON(this,t)},u.prototype.precompute=function(){},u.prototype._encode=function(){return this.getX().toArray("be",this.curve.p.byteLength())},u.fromJSON=function(t,e){return new u(t,e[0],e[1]||t.one)},u.prototype.inspect=function(){return this.isInfinity()?"":""},u.prototype.isInfinity=function(){return 0===this.z.cmpn(0)},u.prototype.dbl=function(){var t=this.x.redAdd(this.z).redSqr(),e=this.x.redSub(this.z).redSqr(),r=t.redSub(e),n=t.redMul(e),i=r.redMul(e.redAdd(this.curve.a24.redMul(r)));return this.curve.point(n,i)},u.prototype.add=function(){throw new Error("Not supported on Montgomery curve")},u.prototype.diffAdd=function(t,e){var r=this.x.redAdd(this.z),n=this.x.redSub(this.z),i=t.x.redAdd(t.z),o=t.x.redSub(t.z).redMul(r),s=i.redMul(n),a=e.z.redMul(o.redAdd(s).redSqr()),u=e.x.redMul(o.redISub(s).redSqr());return this.curve.point(a,u)},u.prototype.mul=function(t){for(var e=t.clone(),r=this,n=this.curve.point(null,null),i=[];0!==e.cmpn(0);e.iushrn(1))i.push(e.andln(1));for(var o=i.length-1;o>=0;o--)0===i[o]?(r=r.diffAdd(n,this),n=n.dbl()):(n=r.diffAdd(n,this),r=r.dbl());return n},u.prototype.mulAdd=function(){throw new Error("Not supported on Montgomery curve")},u.prototype.jumlAdd=function(){throw new Error("Not supported on Montgomery curve")},u.prototype.eq=function(t){return 0===this.getX().cmp(t.getX())},u.prototype.normalize=function(){return this.x=this.x.redMul(this.z.redInvm()),this.z=this.curve.one,this},u.prototype.getX=function(){return this.normalize(),this.x.fromRed()}},function(t,e,r){"use strict";var n=r(12),i=r(19),o=r(0),s=r(49),a=n.assert;function u(t){this.twisted=1!=(0|t.a),this.mOneA=this.twisted&&-1==(0|t.a),this.extended=this.mOneA,s.call(this,"edwards",t),this.a=new i(t.a,16).umod(this.red.m),this.a=this.a.toRed(this.red),this.c=new i(t.c,16).toRed(this.red),this.c2=this.c.redSqr(),this.d=new i(t.d,16).toRed(this.red),this.dd=this.d.redAdd(this.d),a(!this.twisted||0===this.c.fromRed().cmpn(1)),this.oneC=1==(0|t.c)}function h(t,e,r,n,o){s.BasePoint.call(this,t,"projective"),null===e&&null===r&&null===n?(this.x=this.curve.zero,this.y=this.curve.one,this.z=this.curve.one,this.t=this.curve.zero,this.zOne=!0):(this.x=new i(e,16),this.y=new i(r,16),this.z=n?new i(n,16):this.curve.one,this.t=o&&new 
i(o,16),this.x.red||(this.x=this.x.toRed(this.curve.red)),this.y.red||(this.y=this.y.toRed(this.curve.red)),this.z.red||(this.z=this.z.toRed(this.curve.red)),this.t&&!this.t.red&&(this.t=this.t.toRed(this.curve.red)),this.zOne=this.z===this.curve.one,this.curve.extended&&!this.t&&(this.t=this.x.redMul(this.y),this.zOne||(this.t=this.t.redMul(this.z.redInvm()))))}o(u,s),t.exports=u,u.prototype._mulA=function(t){return this.mOneA?t.redNeg():this.a.redMul(t)},u.prototype._mulC=function(t){return this.oneC?t:this.c.redMul(t)},u.prototype.jpoint=function(t,e,r,n){return this.point(t,e,r,n)},u.prototype.pointFromX=function(t,e){(t=new i(t,16)).red||(t=t.toRed(this.red));var r=t.redSqr(),n=this.c2.redSub(this.a.redMul(r)),o=this.one.redSub(this.c2.redMul(this.d).redMul(r)),s=n.redMul(o.redInvm()),a=s.redSqrt();if(0!==a.redSqr().redSub(s).cmp(this.zero))throw new Error("invalid point");var u=a.fromRed().isOdd();return(e&&!u||!e&&u)&&(a=a.redNeg()),this.point(t,a)},u.prototype.pointFromY=function(t,e){(t=new i(t,16)).red||(t=t.toRed(this.red));var r=t.redSqr(),n=r.redSub(this.c2),o=r.redMul(this.d).redMul(this.c2).redSub(this.a),s=n.redMul(o.redInvm());if(0===s.cmp(this.zero)){if(e)throw new Error("invalid point");return this.point(this.zero,t)}var a=s.redSqrt();if(0!==a.redSqr().redSub(s).cmp(this.zero))throw new Error("invalid point");return a.fromRed().isOdd()!==e&&(a=a.redNeg()),this.point(a,t)},u.prototype.validate=function(t){if(t.isInfinity())return!0;t.normalize();var e=t.x.redSqr(),r=t.y.redSqr(),n=e.redMul(this.a).redAdd(r),i=this.c2.redMul(this.one.redAdd(this.d.redMul(e).redMul(r)));return 0===n.cmp(i)},o(h,s.BasePoint),u.prototype.pointFromJSON=function(t){return h.fromJSON(this,t)},u.prototype.point=function(t,e,r,n){return new h(this,t,e,r,n)},h.fromJSON=function(t,e){return new h(t,e[0],e[1],e[2])},h.prototype.inspect=function(){return this.isInfinity()?"":""},h.prototype.isInfinity=function(){return 0===this.x.cmpn(0)&&(0===this.y.cmp(this.z)||this.zOne&&0===this.y.cmp(this.curve.c))},h.prototype._extDbl=function(){var t=this.x.redSqr(),e=this.y.redSqr(),r=this.z.redSqr();r=r.redIAdd(r);var n=this.curve._mulA(t),i=this.x.redAdd(this.y).redSqr().redISub(t).redISub(e),o=n.redAdd(e),s=o.redSub(r),a=n.redSub(e),u=i.redMul(s),h=o.redMul(a),f=i.redMul(a),c=s.redMul(o);return this.curve.point(u,h,c,f)},h.prototype._projDbl=function(){var t,e,r,n,i,o,s=this.x.redAdd(this.y).redSqr(),a=this.x.redSqr(),u=this.y.redSqr();if(this.curve.twisted){var h=(n=this.curve._mulA(a)).redAdd(u);this.zOne?(t=s.redSub(a).redSub(u).redMul(h.redSub(this.curve.two)),e=h.redMul(n.redSub(u)),r=h.redSqr().redSub(h).redSub(h)):(i=this.z.redSqr(),o=h.redSub(i).redISub(i),t=s.redSub(a).redISub(u).redMul(o),e=h.redMul(n.redSub(u)),r=h.redMul(o))}else n=a.redAdd(u),i=this.curve._mulC(this.z).redSqr(),o=n.redSub(i).redSub(i),t=this.curve._mulC(s.redISub(n)).redMul(o),e=this.curve._mulC(n).redMul(a.redISub(u)),r=n.redMul(o);return this.curve.point(t,e,r)},h.prototype.dbl=function(){return this.isInfinity()?this:this.curve.extended?this._extDbl():this._projDbl()},h.prototype._extAdd=function(t){var e=this.y.redSub(this.x).redMul(t.y.redSub(t.x)),r=this.y.redAdd(this.x).redMul(t.y.redAdd(t.x)),n=this.t.redMul(this.curve.dd).redMul(t.t),i=this.z.redMul(t.z.redAdd(t.z)),o=r.redSub(e),s=i.redSub(n),a=i.redAdd(n),u=r.redAdd(e),h=o.redMul(s),f=a.redMul(u),c=o.redMul(u),l=s.redMul(a);return this.curve.point(h,f,l,c)},h.prototype._projAdd=function(t){var 
e,r,n=this.z.redMul(t.z),i=n.redSqr(),o=this.x.redMul(t.x),s=this.y.redMul(t.y),a=this.curve.d.redMul(o).redMul(s),u=i.redSub(a),h=i.redAdd(a),f=this.x.redAdd(this.y).redMul(t.x.redAdd(t.y)).redISub(o).redISub(s),c=n.redMul(u).redMul(f);return this.curve.twisted?(e=n.redMul(h).redMul(s.redSub(this.curve._mulA(o))),r=u.redMul(h)):(e=n.redMul(h).redMul(s.redSub(o)),r=this.curve._mulC(u).redMul(h)),this.curve.point(c,e,r)},h.prototype.add=function(t){return this.isInfinity()?t:t.isInfinity()?this:this.curve.extended?this._extAdd(t):this._projAdd(t)},h.prototype.mul=function(t){return this._hasDoubles(t)?this.curve._fixedNafMul(this,t):this.curve._wnafMul(this,t)},h.prototype.mulAdd=function(t,e,r){return this.curve._wnafMulAdd(1,[this,e],[t,r],2,!1)},h.prototype.jmulAdd=function(t,e,r){return this.curve._wnafMulAdd(1,[this,e],[t,r],2,!0)},h.prototype.normalize=function(){if(this.zOne)return this;var t=this.z.redInvm();return this.x=this.x.redMul(t),this.y=this.y.redMul(t),this.t&&(this.t=this.t.redMul(t)),this.z=this.curve.one,this.zOne=!0,this},h.prototype.neg=function(){return this.curve.point(this.x.redNeg(),this.y,this.z,this.t&&this.t.redNeg())},h.prototype.getX=function(){return this.normalize(),this.x.fromRed()},h.prototype.getY=function(){return this.normalize(),this.y.fromRed()},h.prototype.eq=function(t){return this===t||0===this.getX().cmp(t.getX())&&0===this.getY().cmp(t.getY())},h.prototype.eqXToP=function(t){var e=t.toRed(this.curve.red).redMul(this.z);if(0===this.x.cmp(e))return!0;for(var r=t.clone(),n=this.curve.redN.redMul(this.z);;){if(r.iadd(this.curve.n),r.cmp(this.curve.p)>=0)return!1;if(e.redIAdd(n),0===this.x.cmp(e))return!0}},h.prototype.toP=h.prototype.normalize,h.prototype.mixedAdd=h.prototype.add},function(t,e,r){"use strict";e.sha1=r(240),e.sha224=r(241),e.sha256=r(131),e.sha384=r(242),e.sha512=r(132)},function(t,e,r){"use strict";var n=r(17),i=r(40),o=r(130),s=n.rotl32,a=n.sum32,u=n.sum32_5,h=o.ft_1,f=i.BlockHash,c=[1518500249,1859775393,2400959708,3395469782];function l(){if(!(this instanceof l))return new l;f.call(this),this.h=[1732584193,4023233417,2562383102,271733878,3285377520],this.W=new Array(80)}n.inherits(l,f),t.exports=l,l.blockSize=512,l.outSize=160,l.hmacStrength=80,l.padLength=64,l.prototype._update=function(t,e){for(var r=this.W,n=0;n<16;n++)r[n]=t[e+n];for(;nthis.blockSize&&(t=(new this.Hash).update(t).digest()),i(t.length<=this.blockSize);for(var e=t.length;e0))return s.iaddn(1),this.keyFromPrivate(s)}},l.prototype._truncateToN=function(t,e){var r=8*t.byteLength()-this.n.bitLength();return r>0&&(t=t.ushrn(r)),!e&&t.cmp(this.n)>=0?t.sub(this.n):t},l.prototype.sign=function(t,e,r,s){"object"===n(r)&&(s=r,r=null),s||(s={}),e=this.keyFromPrivate(e,r),t=this._truncateToN(new i(t,16));for(var a=this.n.byteLength(),u=e.getPrivate().toArray("be",a),h=t.toArray("be",a),f=new o({hash:this.hash,entropy:u,nonce:h,pers:s.pers,persEnc:s.persEnc||"utf8"}),l=this.n.sub(new i(1)),d=0;;d++){var p=s.k?s.k(d):new i(f.generate(this.n.byteLength()));if(!((p=this._truncateToN(p,!0)).cmpn(1)<=0||p.cmp(l)>=0)){var m=this.g.mul(p);if(!m.isInfinity()){var y=m.getX(),v=y.umod(this.n);if(0!==v.cmpn(0)){var b=p.invm(this.n).mul(v.mul(e.getPrivate()).iadd(t));if(0!==(b=b.umod(this.n)).cmpn(0)){var g=(m.getY().isOdd()?1:0)|(0!==y.cmp(v)?2:0);return s.canonical&&b.cmp(this.nh)>0&&(b=this.n.sub(b),g^=1),new c({r:v,s:b,recoveryParam:g})}}}}}},l.prototype.verify=function(t,e,r,n){t=this._truncateToN(new i(t,16)),r=this.keyFromPublic(r,n);var o=(e=new 
c(e,"hex")).r,s=e.s;if(o.cmpn(1)<0||o.cmp(this.n)>=0)return!1;if(s.cmpn(1)<0||s.cmp(this.n)>=0)return!1;var a,u=s.invm(this.n),h=u.mul(t).umod(this.n),f=u.mul(o).umod(this.n);return this.curve._maxwellTrick?!(a=this.g.jmulAdd(h,r.getPublic(),f)).isInfinity()&&a.eqXToP(o):!(a=this.g.mulAdd(h,r.getPublic(),f)).isInfinity()&&0===a.getX().umod(this.n).cmp(o)},l.prototype.recoverPubKey=function(t,e,r,n){h((3&r)===r,"The recovery param is more than two bits"),e=new c(e,n);var o=this.n,s=new i(t),a=e.r,u=e.s,f=1&r,l=r>>1;if(a.cmp(this.curve.p.umod(this.curve.n))>=0&&l)throw new Error("Unable to find sencond key candinate");a=l?this.curve.pointFromX(a.add(this.curve.n),f):this.curve.pointFromX(a,f);var d=e.r.invm(o),p=o.sub(s).mul(d).umod(o),m=u.mul(d).umod(o);return this.g.mulAdd(p,a,m)},l.prototype.getKeyRecoveryParam=function(t,e,r,n){if(null!==(e=new c(e,n)).recoveryParam)return e.recoveryParam;for(var i=0;i<4;i++){var o;try{o=this.recoverPubKey(t,e,i)}catch(t){continue}if(o.eq(r))return i}throw new Error("Unable to find valid recovery factor")}},function(t,e,r){"use strict";var n=r(75),i=r(128),o=r(11);function s(t){if(!(this instanceof s))return new s(t);this.hash=t.hash,this.predResist=!!t.predResist,this.outLen=this.hash.outSize,this.minEntropy=t.minEntropy||this.hash.hmacStrength,this._reseed=null,this.reseedInterval=null,this.K=null,this.V=null;var e=i.toArray(t.entropy,t.entropyEnc||"hex"),r=i.toArray(t.nonce,t.nonceEnc||"hex"),n=i.toArray(t.pers,t.persEnc||"hex");o(e.length>=this.minEntropy/8,"Not enough entropy. Minimum is: "+this.minEntropy+" bits"),this._init(e,r,n)}t.exports=s,s.prototype._init=function(t,e,r){var n=t.concat(e).concat(r);this.K=new Array(this.outLen/8),this.V=new Array(this.outLen/8);for(var i=0;i=this.minEntropy/8,"Not enough entropy. 
Minimum is: "+this.minEntropy+" bits"),this._update(t.concat(r||[])),this._reseed=1},s.prototype.generate=function(t,e,r,n){if(this._reseed>this.reseedInterval)throw new Error("Reseed is required");"string"!=typeof e&&(n=r,r=e,e=null),r&&(r=i.toArray(r,n||"hex"),this._update(r));for(var o=[];o.length"}},function(t,e,r){"use strict";var n=r(19),i=r(12),o=i.assert;function s(t,e){if(t instanceof s)return t;this._importDER(t,e)||(o(t.r&&t.s,"Signature without r or s"),this.r=new n(t.r,16),this.s=new n(t.s,16),void 0===t.recoveryParam?this.recoveryParam=null:this.recoveryParam=t.recoveryParam)}function a(){this.place=0}function u(t,e){var r=t[e.place++];if(!(128&r))return r;var n=15&r;if(0===n||n>4)return!1;for(var i=0,o=0,s=e.place;o>>=0;return!(i<=127)&&(e.place=s,i)}function h(t){for(var e=0,r=t.length-1;!t[e]&&!(128&t[e+1])&&e>>3);for(t.push(128|r);--r;)t.push(e>>>(r<<3)&255);t.push(e)}}t.exports=s,s.prototype._importDER=function(t,e){t=i.toArray(t,e);var r=new a;if(48!==t[r.place++])return!1;var o=u(t,r);if(!1===o)return!1;if(o+r.place!==t.length)return!1;if(2!==t[r.place++])return!1;var s=u(t,r);if(!1===s)return!1;var h=t.slice(r.place,s+r.place);if(r.place+=s,2!==t[r.place++])return!1;var f=u(t,r);if(!1===f)return!1;if(t.length!==f+r.place)return!1;var c=t.slice(r.place,f+r.place);if(0===h[0]){if(!(128&h[1]))return!1;h=h.slice(1)}if(0===c[0]){if(!(128&c[1]))return!1;c=c.slice(1)}return this.r=new n(h),this.s=new n(c),this.recoveryParam=null,!0},s.prototype.toDER=function(t){var e=this.r.toArray(),r=this.s.toArray();for(128&e[0]&&(e=[0].concat(e)),128&r[0]&&(r=[0].concat(r)),e=h(e),r=h(r);!(r[0]||128&r[1]);)r=r.slice(1);var n=[2];f(n,e.length),(n=n.concat(e)).push(2),f(n,r.length);var o=n.concat(r),s=[48];return f(s,o.length),s=s.concat(o),i.encode(s,t)}},function(t,e,r){"use strict";var n=r(75),i=r(74),o=r(12),s=o.assert,a=o.parseBytes,u=r(251),h=r(252);function f(t){if(s("ed25519"===t,"only tested with ed25519 so far"),!(this instanceof f))return new f(t);t=i[t].curve,this.curve=t,this.g=t.g,this.g.precompute(t.n.bitLength()+1),this.pointClass=t.point().constructor,this.encodingLength=Math.ceil(t.n.bitLength()/8),this.hash=n.sha512}t.exports=f,f.prototype.sign=function(t,e){t=a(t);var r=this.keyFromSecret(e),n=this.hashInt(r.messagePrefix(),t),i=this.g.mul(n),o=this.encodePoint(i),s=this.hashInt(o,r.pubBytes(),t).mul(r.priv()),u=n.add(s).umod(this.curve.n);return this.makeSignature({R:i,S:u,Rencoded:o})},f.prototype.verify=function(t,e,r){t=a(t),e=this.makeSignature(e);var n=this.keyFromPublic(r),i=this.hashInt(e.Rencoded(),n.pubBytes(),t),o=this.g.mul(e.S());return e.R().add(n.pub().mul(i)).eq(o)},f.prototype.hashInt=function(){for(var t=this.hash(),e=0;e=e)throw new Error("invalid sig")}t.exports=function(t,e,r,h,f){var c=s(r);if("ec"===c.type){if("ecdsa"!==h&&"ecdsa/rsa"!==h)throw new Error("wrong public key type");return function(t,e,r){var n=a[r.data.algorithm.curve.join(".")];if(!n)throw new Error("unknown curve "+r.data.algorithm.curve.join("."));var i=new o(n),s=r.data.subjectPrivateKey.data;return i.verify(e,t,s)}(t,e,c)}if("dsa"===c.type){if("dsa"!==h)throw new Error("wrong public key type");return function(t,e,r){var n=r.data.p,o=r.data.q,a=r.data.g,h=r.data.pub_key,f=s.signature.decode(t,"der"),c=f.s,l=f.r;u(c,o),u(l,o);var d=i.mont(n),p=c.invm(o);return 0===a.toRed(d).redPow(new i(e).mul(p).mod(o)).fromRed().mul(h.toRed(d).redPow(l.mul(p).mod(o)).fromRed()).mod(n).mod(o).cmp(l)}(t,e,c)}if("rsa"!==h&&"ecdsa/rsa"!==h)throw new Error("wrong public key 
type");e=n.concat([f,e]);for(var l=c.modulus.byteLength(),d=[1],p=0;e.length+d.length+2=65&&r<=70?r-55:r>=97&&r<=102?r-87:r-48&15}function h(t,e,r){var n=u(t,r);return r-1>=e&&(n|=u(t,r-1)<<4),n}function f(t,e,r,n){for(var i=0,o=Math.min(t.length,r),s=e;s=49?a-49+10:a>=17?a-17+10:a}return i}s.isBN=function(t){return t instanceof s||null!==t&&"object"===e(t)&&t.constructor.wordSize===s.wordSize&&Array.isArray(t.words)},s.max=function(t,e){return t.cmp(e)>0?t:e},s.min=function(t,e){return t.cmp(e)<0?t:e},s.prototype._init=function(t,r,n){if("number"==typeof t)return this._initNumber(t,r,n);if("object"===e(t))return this._initArray(t,r,n);"hex"===r&&(r=16),i(r===(0|r)&&r>=2&&r<=36);var o=0;"-"===(t=t.toString().replace(/\s+/g,""))[0]&&(o++,this.negative=1),o=0;n-=3)s=t[n]|t[n-1]<<8|t[n-2]<<16,this.words[o]|=s<>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);else if("le"===r)for(n=0,o=0;n>>26-a&67108863,(a+=24)>=26&&(a-=26,o++);return this.strip()},s.prototype._parseHex=function(t,e,r){this.length=Math.ceil((t.length-e)/6),this.words=new Array(this.length);for(var n=0;n=e;n-=2)i=h(t,e,n)<=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;else for(n=(t.length-e)%2==0?e+1:e;n=18?(o-=18,s+=1,this.words[s]|=i>>>26):o+=8;this.strip()},s.prototype._parseBase=function(t,e,r){this.words=[0],this.length=1;for(var n=0,i=1;i<=67108863;i*=e)n++;n--,i=i/e|0;for(var o=t.length-r,s=o%n,a=Math.min(o,o-s)+r,u=0,h=r;h1&&0===this.words[this.length-1];)this.length--;return this._normSign()},s.prototype._normSign=function(){return 1===this.length&&0===this.words[0]&&(this.negative=0),this},s.prototype.inspect=function(){return(this.red?""};var c=["","0","00","000","0000","00000","000000","0000000","00000000","000000000","0000000000","00000000000","000000000000","0000000000000","00000000000000","000000000000000","0000000000000000","00000000000000000","000000000000000000","0000000000000000000","00000000000000000000","000000000000000000000","0000000000000000000000","00000000000000000000000","000000000000000000000000","0000000000000000000000000"],l=[0,0,25,16,12,11,10,9,8,8,7,7,7,7,6,6,6,6,6,6,6,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5],d=[0,0,33554432,43046721,16777216,48828125,60466176,40353607,16777216,43046721,1e7,19487171,35831808,62748517,7529536,11390625,16777216,24137569,34012224,47045881,64e6,4084101,5153632,6436343,7962624,9765625,11881376,14348907,17210368,20511149,243e5,28629151,33554432,39135393,45435424,52521875,60466176];function p(t,e,r){r.negative=e.negative^t.negative;var n=t.length+e.length|0;r.length=n,n=n-1|0;var i=0|t.words[0],o=0|e.words[0],s=i*o,a=67108863&s,u=s/67108864|0;r.words[0]=a;for(var h=1;h>>26,c=67108863&u,l=Math.min(h,e.length-1),d=Math.max(0,h-t.length+1);d<=l;d++){var p=h-d|0;f+=(s=(i=0|t.words[p])*(o=0|e.words[d])+c)/67108864|0,c=67108863&s}r.words[h]=0|c,u=0|f}return 0!==u?r.words[h]=0|u:r.length--,r.strip()}s.prototype.toString=function(t,e){var r;if(e=0|e||1,16===(t=t||10)||"hex"===t){r="";for(var n=0,o=0,s=0;s>>24-n&16777215)||s!==this.length-1?c[6-u.length]+u+r:u+r,(n+=2)>=26&&(n-=26,s--)}for(0!==o&&(r=o.toString(16)+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}if(t===(0|t)&&t>=2&&t<=36){var h=l[t],f=d[t];r="";var p=this.clone();for(p.negative=0;!p.isZero();){var m=p.modn(f).toString(t);r=(p=p.idivn(f)).isZero()?m+r:c[h-m.length]+m+r}for(this.isZero()&&(r="0"+r);r.length%e!=0;)r="0"+r;return 0!==this.negative&&(r="-"+r),r}i(!1,"Base should be between 2 and 36")},s.prototype.toNumber=function(){var t=this.words[0];return 
2===this.length?t+=67108864*this.words[1]:3===this.length&&1===this.words[2]?t+=4503599627370496+67108864*this.words[1]:this.length>2&&i(!1,"Number can only safely store up to 53 bits"),0!==this.negative?-t:t},s.prototype.toJSON=function(){return this.toString(16)},s.prototype.toBuffer=function(t,e){return i(void 0!==a),this.toArrayLike(a,t,e)},s.prototype.toArray=function(t,e){return this.toArrayLike(Array,t,e)},s.prototype.toArrayLike=function(t,e,r){var n=this.byteLength(),o=r||Math.max(1,n);i(n<=o,"byte array longer than desired length"),i(o>0,"Requested array length <= 0"),this.strip();var s,a,u="le"===e,h=new t(o),f=this.clone();if(u){for(a=0;!f.isZero();a++)s=f.andln(255),f.iushrn(8),h[a]=s;for(;a=4096&&(r+=13,e>>>=13),e>=64&&(r+=7,e>>>=7),e>=8&&(r+=4,e>>>=4),e>=2&&(r+=2,e>>>=2),r+e},s.prototype._zeroBits=function(t){if(0===t)return 26;var e=t,r=0;return 0==(8191&e)&&(r+=13,e>>>=13),0==(127&e)&&(r+=7,e>>>=7),0==(15&e)&&(r+=4,e>>>=4),0==(3&e)&&(r+=2,e>>>=2),0==(1&e)&&r++,r},s.prototype.bitLength=function(){var t=this.words[this.length-1],e=this._countBits(t);return 26*(this.length-1)+e},s.prototype.zeroBits=function(){if(this.isZero())return 0;for(var t=0,e=0;et.length?this.clone().ior(t):t.clone().ior(this)},s.prototype.uor=function(t){return this.length>t.length?this.clone().iuor(t):t.clone().iuor(this)},s.prototype.iuand=function(t){var e;e=this.length>t.length?t:this;for(var r=0;rt.length?this.clone().iand(t):t.clone().iand(this)},s.prototype.uand=function(t){return this.length>t.length?this.clone().iuand(t):t.clone().iuand(this)},s.prototype.iuxor=function(t){var e,r;this.length>t.length?(e=this,r=t):(e=t,r=this);for(var n=0;nt.length?this.clone().ixor(t):t.clone().ixor(this)},s.prototype.uxor=function(t){return this.length>t.length?this.clone().iuxor(t):t.clone().iuxor(this)},s.prototype.inotn=function(t){i("number"==typeof t&&t>=0);var e=0|Math.ceil(t/26),r=t%26;this._expand(e),r>0&&e--;for(var n=0;n0&&(this.words[n]=~this.words[n]&67108863>>26-r),this.strip()},s.prototype.notn=function(t){return this.clone().inotn(t)},s.prototype.setn=function(t,e){i("number"==typeof t&&t>=0);var r=t/26|0,n=t%26;return this._expand(r+1),this.words[r]=e?this.words[r]|1<t.length?(r=this,n=t):(r=t,n=this);for(var i=0,o=0;o>>26;for(;0!==i&&o>>26;if(this.length=r.length,0!==i)this.words[this.length]=i,this.length++;else if(r!==this)for(;ot.length?this.clone().iadd(t):t.clone().iadd(this)},s.prototype.isub=function(t){if(0!==t.negative){t.negative=0;var e=this.iadd(t);return t.negative=1,e._normSign()}if(0!==this.negative)return this.negative=0,this.iadd(t),this.negative=1,this._normSign();var r,n,i=this.cmp(t);if(0===i)return this.negative=0,this.length=1,this.words[0]=0,this;i>0?(r=this,n=t):(r=t,n=this);for(var o=0,s=0;s>26,this.words[s]=67108863&e;for(;0!==o&&s>26,this.words[s]=67108863&e;if(0===o&&s>>13,d=0|s[1],p=8191&d,m=d>>>13,y=0|s[2],v=8191&y,b=y>>>13,g=0|s[3],w=8191&g,_=g>>>13,M=0|s[4],S=8191&M,O=M>>>13,A=0|s[5],E=8191&A,x=A>>>13,k=0|s[6],j=8191&k,$=k>>>13,P=0|s[7],R=8191&P,B=P>>>13,T=0|s[8],I=8191&T,N=T>>>13,D=0|s[9],C=8191&D,L=D>>>13,q=0|a[0],U=8191&q,F=q>>>13,z=0|a[1],V=8191&z,K=z>>>13,H=0|a[2],Z=8191&H,W=H>>>13,J=0|a[3],Y=8191&J,Q=J>>>13,G=0|a[4],X=8191&G,tt=G>>>13,et=0|a[5],rt=8191&et,nt=et>>>13,it=0|a[6],ot=8191&it,st=it>>>13,at=0|a[7],ut=8191&at,ht=at>>>13,ft=0|a[8],ct=8191&ft,lt=ft>>>13,dt=0|a[9],pt=8191&dt,mt=dt>>>13;r.negative=t.negative^e.negative,r.length=19;var 
yt=(h+(n=Math.imul(c,U))|0)+((8191&(i=(i=Math.imul(c,F))+Math.imul(l,U)|0))<<13)|0;h=((o=Math.imul(l,F))+(i>>>13)|0)+(yt>>>26)|0,yt&=67108863,n=Math.imul(p,U),i=(i=Math.imul(p,F))+Math.imul(m,U)|0,o=Math.imul(m,F);var vt=(h+(n=n+Math.imul(c,V)|0)|0)+((8191&(i=(i=i+Math.imul(c,K)|0)+Math.imul(l,V)|0))<<13)|0;h=((o=o+Math.imul(l,K)|0)+(i>>>13)|0)+(vt>>>26)|0,vt&=67108863,n=Math.imul(v,U),i=(i=Math.imul(v,F))+Math.imul(b,U)|0,o=Math.imul(b,F),n=n+Math.imul(p,V)|0,i=(i=i+Math.imul(p,K)|0)+Math.imul(m,V)|0,o=o+Math.imul(m,K)|0;var bt=(h+(n=n+Math.imul(c,Z)|0)|0)+((8191&(i=(i=i+Math.imul(c,W)|0)+Math.imul(l,Z)|0))<<13)|0;h=((o=o+Math.imul(l,W)|0)+(i>>>13)|0)+(bt>>>26)|0,bt&=67108863,n=Math.imul(w,U),i=(i=Math.imul(w,F))+Math.imul(_,U)|0,o=Math.imul(_,F),n=n+Math.imul(v,V)|0,i=(i=i+Math.imul(v,K)|0)+Math.imul(b,V)|0,o=o+Math.imul(b,K)|0,n=n+Math.imul(p,Z)|0,i=(i=i+Math.imul(p,W)|0)+Math.imul(m,Z)|0,o=o+Math.imul(m,W)|0;var gt=(h+(n=n+Math.imul(c,Y)|0)|0)+((8191&(i=(i=i+Math.imul(c,Q)|0)+Math.imul(l,Y)|0))<<13)|0;h=((o=o+Math.imul(l,Q)|0)+(i>>>13)|0)+(gt>>>26)|0,gt&=67108863,n=Math.imul(S,U),i=(i=Math.imul(S,F))+Math.imul(O,U)|0,o=Math.imul(O,F),n=n+Math.imul(w,V)|0,i=(i=i+Math.imul(w,K)|0)+Math.imul(_,V)|0,o=o+Math.imul(_,K)|0,n=n+Math.imul(v,Z)|0,i=(i=i+Math.imul(v,W)|0)+Math.imul(b,Z)|0,o=o+Math.imul(b,W)|0,n=n+Math.imul(p,Y)|0,i=(i=i+Math.imul(p,Q)|0)+Math.imul(m,Y)|0,o=o+Math.imul(m,Q)|0;var wt=(h+(n=n+Math.imul(c,X)|0)|0)+((8191&(i=(i=i+Math.imul(c,tt)|0)+Math.imul(l,X)|0))<<13)|0;h=((o=o+Math.imul(l,tt)|0)+(i>>>13)|0)+(wt>>>26)|0,wt&=67108863,n=Math.imul(E,U),i=(i=Math.imul(E,F))+Math.imul(x,U)|0,o=Math.imul(x,F),n=n+Math.imul(S,V)|0,i=(i=i+Math.imul(S,K)|0)+Math.imul(O,V)|0,o=o+Math.imul(O,K)|0,n=n+Math.imul(w,Z)|0,i=(i=i+Math.imul(w,W)|0)+Math.imul(_,Z)|0,o=o+Math.imul(_,W)|0,n=n+Math.imul(v,Y)|0,i=(i=i+Math.imul(v,Q)|0)+Math.imul(b,Y)|0,o=o+Math.imul(b,Q)|0,n=n+Math.imul(p,X)|0,i=(i=i+Math.imul(p,tt)|0)+Math.imul(m,X)|0,o=o+Math.imul(m,tt)|0;var _t=(h+(n=n+Math.imul(c,rt)|0)|0)+((8191&(i=(i=i+Math.imul(c,nt)|0)+Math.imul(l,rt)|0))<<13)|0;h=((o=o+Math.imul(l,nt)|0)+(i>>>13)|0)+(_t>>>26)|0,_t&=67108863,n=Math.imul(j,U),i=(i=Math.imul(j,F))+Math.imul($,U)|0,o=Math.imul($,F),n=n+Math.imul(E,V)|0,i=(i=i+Math.imul(E,K)|0)+Math.imul(x,V)|0,o=o+Math.imul(x,K)|0,n=n+Math.imul(S,Z)|0,i=(i=i+Math.imul(S,W)|0)+Math.imul(O,Z)|0,o=o+Math.imul(O,W)|0,n=n+Math.imul(w,Y)|0,i=(i=i+Math.imul(w,Q)|0)+Math.imul(_,Y)|0,o=o+Math.imul(_,Q)|0,n=n+Math.imul(v,X)|0,i=(i=i+Math.imul(v,tt)|0)+Math.imul(b,X)|0,o=o+Math.imul(b,tt)|0,n=n+Math.imul(p,rt)|0,i=(i=i+Math.imul(p,nt)|0)+Math.imul(m,rt)|0,o=o+Math.imul(m,nt)|0;var Mt=(h+(n=n+Math.imul(c,ot)|0)|0)+((8191&(i=(i=i+Math.imul(c,st)|0)+Math.imul(l,ot)|0))<<13)|0;h=((o=o+Math.imul(l,st)|0)+(i>>>13)|0)+(Mt>>>26)|0,Mt&=67108863,n=Math.imul(R,U),i=(i=Math.imul(R,F))+Math.imul(B,U)|0,o=Math.imul(B,F),n=n+Math.imul(j,V)|0,i=(i=i+Math.imul(j,K)|0)+Math.imul($,V)|0,o=o+Math.imul($,K)|0,n=n+Math.imul(E,Z)|0,i=(i=i+Math.imul(E,W)|0)+Math.imul(x,Z)|0,o=o+Math.imul(x,W)|0,n=n+Math.imul(S,Y)|0,i=(i=i+Math.imul(S,Q)|0)+Math.imul(O,Y)|0,o=o+Math.imul(O,Q)|0,n=n+Math.imul(w,X)|0,i=(i=i+Math.imul(w,tt)|0)+Math.imul(_,X)|0,o=o+Math.imul(_,tt)|0,n=n+Math.imul(v,rt)|0,i=(i=i+Math.imul(v,nt)|0)+Math.imul(b,rt)|0,o=o+Math.imul(b,nt)|0,n=n+Math.imul(p,ot)|0,i=(i=i+Math.imul(p,st)|0)+Math.imul(m,ot)|0,o=o+Math.imul(m,st)|0;var 
St=(h+(n=n+Math.imul(c,ut)|0)|0)+((8191&(i=(i=i+Math.imul(c,ht)|0)+Math.imul(l,ut)|0))<<13)|0;h=((o=o+Math.imul(l,ht)|0)+(i>>>13)|0)+(St>>>26)|0,St&=67108863,n=Math.imul(I,U),i=(i=Math.imul(I,F))+Math.imul(N,U)|0,o=Math.imul(N,F),n=n+Math.imul(R,V)|0,i=(i=i+Math.imul(R,K)|0)+Math.imul(B,V)|0,o=o+Math.imul(B,K)|0,n=n+Math.imul(j,Z)|0,i=(i=i+Math.imul(j,W)|0)+Math.imul($,Z)|0,o=o+Math.imul($,W)|0,n=n+Math.imul(E,Y)|0,i=(i=i+Math.imul(E,Q)|0)+Math.imul(x,Y)|0,o=o+Math.imul(x,Q)|0,n=n+Math.imul(S,X)|0,i=(i=i+Math.imul(S,tt)|0)+Math.imul(O,X)|0,o=o+Math.imul(O,tt)|0,n=n+Math.imul(w,rt)|0,i=(i=i+Math.imul(w,nt)|0)+Math.imul(_,rt)|0,o=o+Math.imul(_,nt)|0,n=n+Math.imul(v,ot)|0,i=(i=i+Math.imul(v,st)|0)+Math.imul(b,ot)|0,o=o+Math.imul(b,st)|0,n=n+Math.imul(p,ut)|0,i=(i=i+Math.imul(p,ht)|0)+Math.imul(m,ut)|0,o=o+Math.imul(m,ht)|0;var Ot=(h+(n=n+Math.imul(c,ct)|0)|0)+((8191&(i=(i=i+Math.imul(c,lt)|0)+Math.imul(l,ct)|0))<<13)|0;h=((o=o+Math.imul(l,lt)|0)+(i>>>13)|0)+(Ot>>>26)|0,Ot&=67108863,n=Math.imul(C,U),i=(i=Math.imul(C,F))+Math.imul(L,U)|0,o=Math.imul(L,F),n=n+Math.imul(I,V)|0,i=(i=i+Math.imul(I,K)|0)+Math.imul(N,V)|0,o=o+Math.imul(N,K)|0,n=n+Math.imul(R,Z)|0,i=(i=i+Math.imul(R,W)|0)+Math.imul(B,Z)|0,o=o+Math.imul(B,W)|0,n=n+Math.imul(j,Y)|0,i=(i=i+Math.imul(j,Q)|0)+Math.imul($,Y)|0,o=o+Math.imul($,Q)|0,n=n+Math.imul(E,X)|0,i=(i=i+Math.imul(E,tt)|0)+Math.imul(x,X)|0,o=o+Math.imul(x,tt)|0,n=n+Math.imul(S,rt)|0,i=(i=i+Math.imul(S,nt)|0)+Math.imul(O,rt)|0,o=o+Math.imul(O,nt)|0,n=n+Math.imul(w,ot)|0,i=(i=i+Math.imul(w,st)|0)+Math.imul(_,ot)|0,o=o+Math.imul(_,st)|0,n=n+Math.imul(v,ut)|0,i=(i=i+Math.imul(v,ht)|0)+Math.imul(b,ut)|0,o=o+Math.imul(b,ht)|0,n=n+Math.imul(p,ct)|0,i=(i=i+Math.imul(p,lt)|0)+Math.imul(m,ct)|0,o=o+Math.imul(m,lt)|0;var At=(h+(n=n+Math.imul(c,pt)|0)|0)+((8191&(i=(i=i+Math.imul(c,mt)|0)+Math.imul(l,pt)|0))<<13)|0;h=((o=o+Math.imul(l,mt)|0)+(i>>>13)|0)+(At>>>26)|0,At&=67108863,n=Math.imul(C,V),i=(i=Math.imul(C,K))+Math.imul(L,V)|0,o=Math.imul(L,K),n=n+Math.imul(I,Z)|0,i=(i=i+Math.imul(I,W)|0)+Math.imul(N,Z)|0,o=o+Math.imul(N,W)|0,n=n+Math.imul(R,Y)|0,i=(i=i+Math.imul(R,Q)|0)+Math.imul(B,Y)|0,o=o+Math.imul(B,Q)|0,n=n+Math.imul(j,X)|0,i=(i=i+Math.imul(j,tt)|0)+Math.imul($,X)|0,o=o+Math.imul($,tt)|0,n=n+Math.imul(E,rt)|0,i=(i=i+Math.imul(E,nt)|0)+Math.imul(x,rt)|0,o=o+Math.imul(x,nt)|0,n=n+Math.imul(S,ot)|0,i=(i=i+Math.imul(S,st)|0)+Math.imul(O,ot)|0,o=o+Math.imul(O,st)|0,n=n+Math.imul(w,ut)|0,i=(i=i+Math.imul(w,ht)|0)+Math.imul(_,ut)|0,o=o+Math.imul(_,ht)|0,n=n+Math.imul(v,ct)|0,i=(i=i+Math.imul(v,lt)|0)+Math.imul(b,ct)|0,o=o+Math.imul(b,lt)|0;var Et=(h+(n=n+Math.imul(p,pt)|0)|0)+((8191&(i=(i=i+Math.imul(p,mt)|0)+Math.imul(m,pt)|0))<<13)|0;h=((o=o+Math.imul(m,mt)|0)+(i>>>13)|0)+(Et>>>26)|0,Et&=67108863,n=Math.imul(C,Z),i=(i=Math.imul(C,W))+Math.imul(L,Z)|0,o=Math.imul(L,W),n=n+Math.imul(I,Y)|0,i=(i=i+Math.imul(I,Q)|0)+Math.imul(N,Y)|0,o=o+Math.imul(N,Q)|0,n=n+Math.imul(R,X)|0,i=(i=i+Math.imul(R,tt)|0)+Math.imul(B,X)|0,o=o+Math.imul(B,tt)|0,n=n+Math.imul(j,rt)|0,i=(i=i+Math.imul(j,nt)|0)+Math.imul($,rt)|0,o=o+Math.imul($,nt)|0,n=n+Math.imul(E,ot)|0,i=(i=i+Math.imul(E,st)|0)+Math.imul(x,ot)|0,o=o+Math.imul(x,st)|0,n=n+Math.imul(S,ut)|0,i=(i=i+Math.imul(S,ht)|0)+Math.imul(O,ut)|0,o=o+Math.imul(O,ht)|0,n=n+Math.imul(w,ct)|0,i=(i=i+Math.imul(w,lt)|0)+Math.imul(_,ct)|0,o=o+Math.imul(_,lt)|0;var 
xt=(h+(n=n+Math.imul(v,pt)|0)|0)+((8191&(i=(i=i+Math.imul(v,mt)|0)+Math.imul(b,pt)|0))<<13)|0;h=((o=o+Math.imul(b,mt)|0)+(i>>>13)|0)+(xt>>>26)|0,xt&=67108863,n=Math.imul(C,Y),i=(i=Math.imul(C,Q))+Math.imul(L,Y)|0,o=Math.imul(L,Q),n=n+Math.imul(I,X)|0,i=(i=i+Math.imul(I,tt)|0)+Math.imul(N,X)|0,o=o+Math.imul(N,tt)|0,n=n+Math.imul(R,rt)|0,i=(i=i+Math.imul(R,nt)|0)+Math.imul(B,rt)|0,o=o+Math.imul(B,nt)|0,n=n+Math.imul(j,ot)|0,i=(i=i+Math.imul(j,st)|0)+Math.imul($,ot)|0,o=o+Math.imul($,st)|0,n=n+Math.imul(E,ut)|0,i=(i=i+Math.imul(E,ht)|0)+Math.imul(x,ut)|0,o=o+Math.imul(x,ht)|0,n=n+Math.imul(S,ct)|0,i=(i=i+Math.imul(S,lt)|0)+Math.imul(O,ct)|0,o=o+Math.imul(O,lt)|0;var kt=(h+(n=n+Math.imul(w,pt)|0)|0)+((8191&(i=(i=i+Math.imul(w,mt)|0)+Math.imul(_,pt)|0))<<13)|0;h=((o=o+Math.imul(_,mt)|0)+(i>>>13)|0)+(kt>>>26)|0,kt&=67108863,n=Math.imul(C,X),i=(i=Math.imul(C,tt))+Math.imul(L,X)|0,o=Math.imul(L,tt),n=n+Math.imul(I,rt)|0,i=(i=i+Math.imul(I,nt)|0)+Math.imul(N,rt)|0,o=o+Math.imul(N,nt)|0,n=n+Math.imul(R,ot)|0,i=(i=i+Math.imul(R,st)|0)+Math.imul(B,ot)|0,o=o+Math.imul(B,st)|0,n=n+Math.imul(j,ut)|0,i=(i=i+Math.imul(j,ht)|0)+Math.imul($,ut)|0,o=o+Math.imul($,ht)|0,n=n+Math.imul(E,ct)|0,i=(i=i+Math.imul(E,lt)|0)+Math.imul(x,ct)|0,o=o+Math.imul(x,lt)|0;var jt=(h+(n=n+Math.imul(S,pt)|0)|0)+((8191&(i=(i=i+Math.imul(S,mt)|0)+Math.imul(O,pt)|0))<<13)|0;h=((o=o+Math.imul(O,mt)|0)+(i>>>13)|0)+(jt>>>26)|0,jt&=67108863,n=Math.imul(C,rt),i=(i=Math.imul(C,nt))+Math.imul(L,rt)|0,o=Math.imul(L,nt),n=n+Math.imul(I,ot)|0,i=(i=i+Math.imul(I,st)|0)+Math.imul(N,ot)|0,o=o+Math.imul(N,st)|0,n=n+Math.imul(R,ut)|0,i=(i=i+Math.imul(R,ht)|0)+Math.imul(B,ut)|0,o=o+Math.imul(B,ht)|0,n=n+Math.imul(j,ct)|0,i=(i=i+Math.imul(j,lt)|0)+Math.imul($,ct)|0,o=o+Math.imul($,lt)|0;var $t=(h+(n=n+Math.imul(E,pt)|0)|0)+((8191&(i=(i=i+Math.imul(E,mt)|0)+Math.imul(x,pt)|0))<<13)|0;h=((o=o+Math.imul(x,mt)|0)+(i>>>13)|0)+($t>>>26)|0,$t&=67108863,n=Math.imul(C,ot),i=(i=Math.imul(C,st))+Math.imul(L,ot)|0,o=Math.imul(L,st),n=n+Math.imul(I,ut)|0,i=(i=i+Math.imul(I,ht)|0)+Math.imul(N,ut)|0,o=o+Math.imul(N,ht)|0,n=n+Math.imul(R,ct)|0,i=(i=i+Math.imul(R,lt)|0)+Math.imul(B,ct)|0,o=o+Math.imul(B,lt)|0;var Pt=(h+(n=n+Math.imul(j,pt)|0)|0)+((8191&(i=(i=i+Math.imul(j,mt)|0)+Math.imul($,pt)|0))<<13)|0;h=((o=o+Math.imul($,mt)|0)+(i>>>13)|0)+(Pt>>>26)|0,Pt&=67108863,n=Math.imul(C,ut),i=(i=Math.imul(C,ht))+Math.imul(L,ut)|0,o=Math.imul(L,ht),n=n+Math.imul(I,ct)|0,i=(i=i+Math.imul(I,lt)|0)+Math.imul(N,ct)|0,o=o+Math.imul(N,lt)|0;var Rt=(h+(n=n+Math.imul(R,pt)|0)|0)+((8191&(i=(i=i+Math.imul(R,mt)|0)+Math.imul(B,pt)|0))<<13)|0;h=((o=o+Math.imul(B,mt)|0)+(i>>>13)|0)+(Rt>>>26)|0,Rt&=67108863,n=Math.imul(C,ct),i=(i=Math.imul(C,lt))+Math.imul(L,ct)|0,o=Math.imul(L,lt);var Bt=(h+(n=n+Math.imul(I,pt)|0)|0)+((8191&(i=(i=i+Math.imul(I,mt)|0)+Math.imul(N,pt)|0))<<13)|0;h=((o=o+Math.imul(N,mt)|0)+(i>>>13)|0)+(Bt>>>26)|0,Bt&=67108863;var Tt=(h+(n=Math.imul(C,pt))|0)+((8191&(i=(i=Math.imul(C,mt))+Math.imul(L,pt)|0))<<13)|0;return h=((o=Math.imul(L,mt))+(i>>>13)|0)+(Tt>>>26)|0,Tt&=67108863,u[0]=yt,u[1]=vt,u[2]=bt,u[3]=gt,u[4]=wt,u[5]=_t,u[6]=Mt,u[7]=St,u[8]=Ot,u[9]=At,u[10]=Et,u[11]=xt,u[12]=kt,u[13]=jt,u[14]=$t,u[15]=Pt,u[16]=Rt,u[17]=Bt,u[18]=Tt,0!==h&&(u[19]=h,r.length++),r};function y(t,e,r){return(new v).mulp(t,e,r)}function v(t,e){this.x=t,this.y=e}Math.imul||(m=p),s.prototype.mulTo=function(t,e){var r=this.length+t.length;return 
10===this.length&&10===t.length?m(this,t,e):r<63?p(this,t,e):r<1024?function(t,e,r){r.negative=e.negative^t.negative,r.length=t.length+e.length;for(var n=0,i=0,o=0;o>>26)|0)>>>26,s&=67108863}r.words[o]=a,n=s,s=i}return 0!==n?r.words[o]=n:r.length--,r.strip()}(this,t,e):y(this,t,e)},v.prototype.makeRBT=function(t){for(var e=new Array(t),r=s.prototype._countBits(t)-1,n=0;n>=1;return n},v.prototype.permute=function(t,e,r,n,i,o){for(var s=0;s>>=1)i++;return 1<>>=13,r[2*s+1]=8191&o,o>>>=13;for(s=2*e;s>=26,e+=n/67108864|0,e+=o>>>26,this.words[r]=67108863&o}return 0!==e&&(this.words[r]=e,this.length++),this},s.prototype.muln=function(t){return this.clone().imuln(t)},s.prototype.sqr=function(){return this.mul(this)},s.prototype.isqr=function(){return this.imul(this.clone())},s.prototype.pow=function(t){var e=function(t){for(var e=new Array(t.bitLength()),r=0;r>>i}return e}(t);if(0===e.length)return new s(1);for(var r=this,n=0;n=0);var e,r=t%26,n=(t-r)/26,o=67108863>>>26-r<<26-r;if(0!==r){var s=0;for(e=0;e>>26-r}s&&(this.words[e]=s,this.length++)}if(0!==n){for(e=this.length-1;e>=0;e--)this.words[e+n]=this.words[e];for(e=0;e=0),n=e?(e-e%26)/26:0;var o=t%26,s=Math.min((t-o)/26,this.length),a=67108863^67108863>>>o<s)for(this.length-=s,h=0;h=0&&(0!==f||h>=n);h--){var c=0|this.words[h];this.words[h]=f<<26-o|c>>>o,f=c&a}return u&&0!==f&&(u.words[u.length++]=f),0===this.length&&(this.words[0]=0,this.length=1),this.strip()},s.prototype.ishrn=function(t,e,r){return i(0===this.negative),this.iushrn(t,e,r)},s.prototype.shln=function(t){return this.clone().ishln(t)},s.prototype.ushln=function(t){return this.clone().iushln(t)},s.prototype.shrn=function(t){return this.clone().ishrn(t)},s.prototype.ushrn=function(t){return this.clone().iushrn(t)},s.prototype.testn=function(t){i("number"==typeof t&&t>=0);var e=t%26,r=(t-e)/26,n=1<=0);var e=t%26,r=(t-e)/26;if(i(0===this.negative,"imaskn works only with positive numbers"),this.length<=r)return this;if(0!==e&&r++,this.length=Math.min(r,this.length),0!==e){var n=67108863^67108863>>>e<=67108864;e++)this.words[e]-=67108864,e===this.length-1?this.words[e+1]=1:this.words[e+1]++;return this.length=Math.max(this.length,e+1),this},s.prototype.isubn=function(t){if(i("number"==typeof t),i(t<67108864),t<0)return this.iaddn(-t);if(0!==this.negative)return this.negative=0,this.iaddn(t),this.negative=1,this;if(this.words[0]-=t,1===this.length&&this.words[0]<0)this.words[0]=-this.words[0],this.negative=1;else for(var e=0;e>26)-(u/67108864|0),this.words[n+r]=67108863&o}for(;n>26,this.words[n+r]=67108863&o;if(0===a)return this.strip();for(i(-1===a),a=0,n=0;n>26,this.words[n]=67108863&o;return this.negative=1,this.strip()},s.prototype._wordDiv=function(t,e){var r=(this.length,t.length),n=this.clone(),i=t,o=0|i.words[i.length-1];0!==(r=26-this._countBits(o))&&(i=i.ushln(r),n.iushln(r),o=0|i.words[i.length-1]);var a,u=n.length-i.length;if("mod"!==e){(a=new s(null)).length=u+1,a.words=new Array(a.length);for(var h=0;h=0;c--){var l=67108864*(0|n.words[i.length+c])+(0|n.words[i.length+c-1]);for(l=Math.min(l/o|0,67108863),n._ishlnsubmul(i,l,c);0!==n.negative;)l--,n.negative=0,n._ishlnsubmul(i,1,c),n.isZero()||(n.negative^=1);a&&(a.words[c]=l)}return a&&a.strip(),n.strip(),"div"!==e&&0!==r&&n.iushrn(r),{div:a||null,mod:n}},s.prototype.divmod=function(t,e,r){return i(!t.isZero()),this.isZero()?{div:new s(0),mod:new 
s(0)}:0!==this.negative&&0===t.negative?(a=this.neg().divmod(t,e),"mod"!==e&&(n=a.div.neg()),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.iadd(t)),{div:n,mod:o}):0===this.negative&&0!==t.negative?(a=this.divmod(t.neg(),e),"mod"!==e&&(n=a.div.neg()),{div:n,mod:a.mod}):0!=(this.negative&t.negative)?(a=this.neg().divmod(t.neg(),e),"div"!==e&&(o=a.mod.neg(),r&&0!==o.negative&&o.isub(t)),{div:a.div,mod:o}):t.length>this.length||this.cmp(t)<0?{div:new s(0),mod:this}:1===t.length?"div"===e?{div:this.divn(t.words[0]),mod:null}:"mod"===e?{div:null,mod:new s(this.modn(t.words[0]))}:{div:this.divn(t.words[0]),mod:new s(this.modn(t.words[0]))}:this._wordDiv(t,e);var n,o,a},s.prototype.div=function(t){return this.divmod(t,"div",!1).div},s.prototype.mod=function(t){return this.divmod(t,"mod",!1).mod},s.prototype.umod=function(t){return this.divmod(t,"mod",!0).mod},s.prototype.divRound=function(t){var e=this.divmod(t);if(e.mod.isZero())return e.div;var r=0!==e.div.negative?e.mod.isub(t):e.mod,n=t.ushrn(1),i=t.andln(1),o=r.cmp(n);return o<0||1===i&&0===o?e.div:0!==e.div.negative?e.div.isubn(1):e.div.iaddn(1)},s.prototype.modn=function(t){i(t<=67108863);for(var e=(1<<26)%t,r=0,n=this.length-1;n>=0;n--)r=(e*r+(0|this.words[n]))%t;return r},s.prototype.idivn=function(t){i(t<=67108863);for(var e=0,r=this.length-1;r>=0;r--){var n=(0|this.words[r])+67108864*e;this.words[r]=n/t|0,e=n%t}return this.strip()},s.prototype.divn=function(t){return this.clone().idivn(t)},s.prototype.egcd=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n=new s(1),o=new s(0),a=new s(0),u=new s(1),h=0;e.isEven()&&r.isEven();)e.iushrn(1),r.iushrn(1),++h;for(var f=r.clone(),c=e.clone();!e.isZero();){for(var l=0,d=1;0==(e.words[0]&d)&&l<26;++l,d<<=1);if(l>0)for(e.iushrn(l);l-- >0;)(n.isOdd()||o.isOdd())&&(n.iadd(f),o.isub(c)),n.iushrn(1),o.iushrn(1);for(var p=0,m=1;0==(r.words[0]&m)&&p<26;++p,m<<=1);if(p>0)for(r.iushrn(p);p-- >0;)(a.isOdd()||u.isOdd())&&(a.iadd(f),u.isub(c)),a.iushrn(1),u.iushrn(1);e.cmp(r)>=0?(e.isub(r),n.isub(a),o.isub(u)):(r.isub(e),a.isub(n),u.isub(o))}return{a:a,b:u,gcd:r.iushln(h)}},s.prototype._invmp=function(t){i(0===t.negative),i(!t.isZero());var e=this,r=t.clone();e=0!==e.negative?e.umod(t):e.clone();for(var n,o=new s(1),a=new s(0),u=r.clone();e.cmpn(1)>0&&r.cmpn(1)>0;){for(var h=0,f=1;0==(e.words[0]&f)&&h<26;++h,f<<=1);if(h>0)for(e.iushrn(h);h-- >0;)o.isOdd()&&o.iadd(u),o.iushrn(1);for(var c=0,l=1;0==(r.words[0]&l)&&c<26;++c,l<<=1);if(c>0)for(r.iushrn(c);c-- >0;)a.isOdd()&&a.iadd(u),a.iushrn(1);e.cmp(r)>=0?(e.isub(r),o.isub(a)):(r.isub(e),a.isub(o))}return(n=0===e.cmpn(1)?o:a).cmpn(0)<0&&n.iadd(t),n},s.prototype.gcd=function(t){if(this.isZero())return t.abs();if(t.isZero())return this.abs();var e=this.clone(),r=t.clone();e.negative=0,r.negative=0;for(var n=0;e.isEven()&&r.isEven();n++)e.iushrn(1),r.iushrn(1);for(;;){for(;e.isEven();)e.iushrn(1);for(;r.isEven();)r.iushrn(1);var i=e.cmp(r);if(i<0){var o=e;e=r,r=o}else if(0===i||0===r.cmpn(1))break;e.isub(r)}return r.iushln(n)},s.prototype.invm=function(t){return this.egcd(t).a.umod(t)},s.prototype.isEven=function(){return 0==(1&this.words[0])},s.prototype.isOdd=function(){return 1==(1&this.words[0])},s.prototype.andln=function(t){return this.words[0]&t},s.prototype.bincn=function(t){i("number"==typeof t);var e=t%26,r=(t-e)/26,n=1<>>26,a&=67108863,this.words[s]=a}return 0!==o&&(this.words[s]=o,this.length++),this},s.prototype.isZero=function(){return 
1===this.length&&0===this.words[0]},s.prototype.cmpn=function(t){var e,r=t<0;if(0!==this.negative&&!r)return-1;if(0===this.negative&&r)return 1;if(this.strip(),this.length>1)e=1;else{r&&(t=-t),i(t<=67108863,"Number is too big");var n=0|this.words[0];e=n===t?0:nt.length)return 1;if(this.length=0;r--){var n=0|this.words[r],i=0|t.words[r];if(n!==i){ni&&(e=1);break}}return e},s.prototype.gtn=function(t){return 1===this.cmpn(t)},s.prototype.gt=function(t){return 1===this.cmp(t)},s.prototype.gten=function(t){return this.cmpn(t)>=0},s.prototype.gte=function(t){return this.cmp(t)>=0},s.prototype.ltn=function(t){return-1===this.cmpn(t)},s.prototype.lt=function(t){return-1===this.cmp(t)},s.prototype.lten=function(t){return this.cmpn(t)<=0},s.prototype.lte=function(t){return this.cmp(t)<=0},s.prototype.eqn=function(t){return 0===this.cmpn(t)},s.prototype.eq=function(t){return 0===this.cmp(t)},s.red=function(t){return new O(t)},s.prototype.toRed=function(t){return i(!this.red,"Already a number in reduction context"),i(0===this.negative,"red works only with positives"),t.convertTo(this)._forceRed(t)},s.prototype.fromRed=function(){return i(this.red,"fromRed works only with numbers in reduction context"),this.red.convertFrom(this)},s.prototype._forceRed=function(t){return this.red=t,this},s.prototype.forceRed=function(t){return i(!this.red,"Already a number in reduction context"),this._forceRed(t)},s.prototype.redAdd=function(t){return i(this.red,"redAdd works only with red numbers"),this.red.add(this,t)},s.prototype.redIAdd=function(t){return i(this.red,"redIAdd works only with red numbers"),this.red.iadd(this,t)},s.prototype.redSub=function(t){return i(this.red,"redSub works only with red numbers"),this.red.sub(this,t)},s.prototype.redISub=function(t){return i(this.red,"redISub works only with red numbers"),this.red.isub(this,t)},s.prototype.redShl=function(t){return i(this.red,"redShl works only with red numbers"),this.red.shl(this,t)},s.prototype.redMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.mul(this,t)},s.prototype.redIMul=function(t){return i(this.red,"redMul works only with red numbers"),this.red._verify2(this,t),this.red.imul(this,t)},s.prototype.redSqr=function(){return i(this.red,"redSqr works only with red numbers"),this.red._verify1(this),this.red.sqr(this)},s.prototype.redISqr=function(){return i(this.red,"redISqr works only with red numbers"),this.red._verify1(this),this.red.isqr(this)},s.prototype.redSqrt=function(){return i(this.red,"redSqrt works only with red numbers"),this.red._verify1(this),this.red.sqrt(this)},s.prototype.redInvm=function(){return i(this.red,"redInvm works only with red numbers"),this.red._verify1(this),this.red.invm(this)},s.prototype.redNeg=function(){return i(this.red,"redNeg works only with red numbers"),this.red._verify1(this),this.red.neg(this)},s.prototype.redPow=function(t){return i(this.red&&!t.red,"redPow(normalNum)"),this.red._verify1(this),this.red.pow(this,t)};var b={k256:null,p224:null,p192:null,p25519:null};function g(t,e){this.name=t,this.p=new s(e,16),this.n=this.p.bitLength(),this.k=new s(1).iushln(this.n).isub(this.p),this.tmp=this._tmp()}function w(){g.call(this,"k256","ffffffff ffffffff ffffffff ffffffff ffffffff ffffffff fffffffe fffffc2f")}function _(){g.call(this,"p224","ffffffff ffffffff ffffffff ffffffff 00000000 00000000 00000001")}function M(){g.call(this,"p192","ffffffff ffffffff ffffffff fffffffe ffffffff ffffffff")}function S(){g.call(this,"25519","7fffffffffffffff 
ffffffffffffffff ffffffffffffffff ffffffffffffffed")}function O(t){if("string"==typeof t){var e=s._prime(t);this.m=e.p,this.prime=e}else i(t.gtn(1),"modulus must be greater than 1"),this.m=t,this.prime=null}function A(t){O.call(this,t),this.shift=this.m.bitLength(),this.shift%26!=0&&(this.shift+=26-this.shift%26),this.r=new s(1).iushln(this.shift),this.r2=this.imod(this.r.sqr()),this.rinv=this.r._invmp(this.m),this.minv=this.rinv.mul(this.r).isubn(1).div(this.m),this.minv=this.minv.umod(this.r),this.minv=this.r.sub(this.minv)}g.prototype._tmp=function(){var t=new s(null);return t.words=new Array(Math.ceil(this.n/13)),t},g.prototype.ireduce=function(t){var e,r=t;do{this.split(r,this.tmp),e=(r=(r=this.imulK(r)).iadd(this.tmp)).bitLength()}while(e>this.n);var n=e0?r.isub(this.p):void 0!==r.strip?r.strip():r._strip(),r},g.prototype.split=function(t,e){t.iushrn(this.n,0,e)},g.prototype.imulK=function(t){return t.imul(this.k)},o(w,g),w.prototype.split=function(t,e){for(var r=Math.min(t.length,9),n=0;n>>22,i=o}i>>>=22,t.words[n-10]=i,0===i&&t.length>10?t.length-=10:t.length-=9},w.prototype.imulK=function(t){t.words[t.length]=0,t.words[t.length+1]=0,t.length+=2;for(var e=0,r=0;r>>=26,t.words[r]=i,e=n}return 0!==e&&(t.words[t.length++]=e),t},s._prime=function(t){if(b[t])return b[t];var e;if("k256"===t)e=new w;else if("p224"===t)e=new _;else if("p192"===t)e=new M;else{if("p25519"!==t)throw new Error("Unknown prime "+t);e=new S}return b[t]=e,e},O.prototype._verify1=function(t){i(0===t.negative,"red works only with positives"),i(t.red,"red works only with red numbers")},O.prototype._verify2=function(t,e){i(0==(t.negative|e.negative),"red works only with positives"),i(t.red&&t.red===e.red,"red works only with red numbers")},O.prototype.imod=function(t){return this.prime?this.prime.ireduce(t)._forceRed(this):t.umod(this.m)._forceRed(this)},O.prototype.neg=function(t){return t.isZero()?t.clone():this.m.sub(t)._forceRed(this)},O.prototype.add=function(t,e){this._verify2(t,e);var r=t.add(e);return r.cmp(this.m)>=0&&r.isub(this.m),r._forceRed(this)},O.prototype.iadd=function(t,e){this._verify2(t,e);var r=t.iadd(e);return r.cmp(this.m)>=0&&r.isub(this.m),r},O.prototype.sub=function(t,e){this._verify2(t,e);var r=t.sub(e);return r.cmpn(0)<0&&r.iadd(this.m),r._forceRed(this)},O.prototype.isub=function(t,e){this._verify2(t,e);var r=t.isub(e);return r.cmpn(0)<0&&r.iadd(this.m),r},O.prototype.shl=function(t,e){return this._verify1(t),this.imod(t.ushln(e))},O.prototype.imul=function(t,e){return this._verify2(t,e),this.imod(t.imul(e))},O.prototype.mul=function(t,e){return this._verify2(t,e),this.imod(t.mul(e))},O.prototype.isqr=function(t){return this.imul(t,t.clone())},O.prototype.sqr=function(t){return this.mul(t,t)},O.prototype.sqrt=function(t){if(t.isZero())return t.clone();var e=this.m.andln(3);if(i(e%2==1),3===e){var r=this.m.add(new s(1)).iushrn(2);return this.pow(t,r)}for(var n=this.m.subn(1),o=0;!n.isZero()&&0===n.andln(1);)o++,n.iushrn(1);i(!n.isZero());var a=new s(1).toRed(this),u=a.redNeg(),h=this.m.subn(1).iushrn(1),f=this.m.bitLength();for(f=new s(2*f*f).toRed(this);0!==this.pow(f,h).cmp(u);)f.redIAdd(u);for(var c=this.pow(f,n),l=this.pow(t,n.addn(1).iushrn(1)),d=this.pow(t,n),p=o;0!==d.cmp(a);){for(var m=d,y=0;0!==m.cmp(a);y++)m=m.redSqr();i(y=0;n--){for(var h=e.words[n],f=u-1;f>=0;f--){var c=h>>f&1;i!==r[0]&&(i=this.sqr(i)),0!==c||0!==o?(o<<=1,o|=c,(4===++a||0===n&&0===f)&&(i=this.mul(i,r[o]),a=0,o=0)):a=0}u=26}return i},O.prototype.convertTo=function(t){var e=t.umod(this.m);return 
e===t?e.clone():e},O.prototype.convertFrom=function(t){var e=t.clone();return e.red=null,e},s.mont=function(t){return new A(t)},o(A,O),A.prototype.convertTo=function(t){return this.imod(t.ushln(this.shift))},A.prototype.convertFrom=function(t){var e=this.imod(t.mul(this.rinv));return e.red=null,e},A.prototype.imul=function(t,e){if(t.isZero()||e.isZero())return t.words[0]=0,t.length=1,t;var r=t.imul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.mul=function(t,e){if(t.isZero()||e.isZero())return new s(0)._forceRed(this);var r=t.mul(e),n=r.maskn(this.shift).mul(this.minv).imaskn(this.shift).mul(this.m),i=r.isub(n).iushrn(this.shift),o=i;return i.cmp(this.m)>=0?o=i.isub(this.m):i.cmpn(0)<0&&(o=i.iadd(this.m)),o._forceRed(this)},A.prototype.invm=function(t){return this.imod(t._invmp(this.m).mul(this.r2))._forceRed(this)}}(t,this)}).call(this,r(22)(t))},function(t,e){},function(t,e,r){e.publicEncrypt=r(268),e.privateDecrypt=r(270),e.privateEncrypt=function(t,r){return e.publicEncrypt(t,r,!0)},e.publicDecrypt=function(t,r){return e.privateDecrypt(t,r,!0)}},function(t,e,r){var n=r(50),i=r(27),o=r(36),s=r(140),a=r(141),u=r(80),h=r(142),f=r(71),c=r(2).Buffer;t.exports=function(t,e,r){var l;l=t.padding?t.padding:r?1:4;var d,p=n(t);if(4===l)d=function(t,e){var r=t.modulus.byteLength(),n=e.length,h=o("sha1").update(c.alloc(0)).digest(),f=h.length,l=2*f;if(n>r-l-2)throw new Error("message too long");var d=c.alloc(r-n-l-2),p=r-f-1,m=i(f),y=a(c.concat([h,d,c.alloc(1,1),e],p),s(m,p)),v=a(m,s(y,f));return new u(c.concat([c.alloc(1),v,y],r))}(p,e);else if(1===l)d=function(t,e,r){var n,o=e.length,s=t.modulus.byteLength();if(o>s-11)throw new Error("message too long");n=r?c.alloc(s-o-3,255):function(t){var e,r=c.allocUnsafe(t),n=0,o=i(2*t),s=0;for(;n=0)throw new Error("data too long for modulus")}return r?f(d,p):h(d,p)}},function(t,e){},function(t,e,r){var n=r(50),i=r(140),o=r(141),s=r(80),a=r(71),u=r(36),h=r(142),f=r(2).Buffer;t.exports=function(t,e,r){var c;c=t.padding?t.padding:r?1:4;var l,d=n(t),p=d.modulus.byteLength();if(e.length>p||new s(e).cmp(d.modulus)>=0)throw new Error("decryption error");l=r?h(new s(e),d):a(e,d);var m=f.alloc(p-l.length);if(l=f.concat([m,l],p),4===c)return function(t,e){var r=t.modulus.byteLength(),n=u("sha1").update(f.alloc(0)).digest(),s=n.length;if(0!==e[0])throw new Error("decryption error");var a=e.slice(1,s+1),h=e.slice(s+1),c=o(a,i(h,s)),l=o(h,i(c,r-s-1));if(function(t,e){t=f.from(t),e=f.from(e);var r=0,n=t.length;t.length!==e.length&&(r++,n=Math.min(t.length,e.length));var i=-1;for(;++i=e.length){o++;break}var s=e.slice(2,i-1);("0002"!==n.toString("hex")&&!r||"0001"!==n.toString("hex")&&r)&&o++;s.length<8&&o++;if(o)throw new Error("decryption error");return e.slice(i)}(0,l,r);if(3===c)return l;throw new Error("unknown padding")}},function(t,e,r){"use strict";(function(t,n){function i(){throw new Error("secure random number generation not supported by this browser\nuse chrome, FireFox or Internet Explorer 11")}var o=r(2),s=r(27),a=o.Buffer,u=o.kMaxLength,h=t.crypto||t.msCrypto,f=Math.pow(2,32)-1;function c(t,e){if("number"!=typeof t||t!=t)throw new TypeError("offset must be a number");if(t>f||t<0)throw new TypeError("offset must be a uint32");if(t>u||t>e)throw new RangeError("offset out of range")}function l(t,e,r){if("number"!=typeof t||t!=t)throw new TypeError("size must be a number");if(t>f||t<0)throw new 
TypeError("size must be a uint32");if(t+e>r||t>u)throw new RangeError("buffer too small")}function d(t,e,r,i){if(n.browser){var o=t.buffer,a=new Uint8Array(o,e,r);return h.getRandomValues(a),i?void n.nextTick((function(){i(null,t)})):t}if(!i)return s(r).copy(t,e),t;s(r,(function(r,n){if(r)return i(r);n.copy(t,e),i(null,t)}))}h&&h.getRandomValues||!n.browser?(e.randomFill=function(e,r,n,i){if(!(a.isBuffer(e)||e instanceof t.Uint8Array))throw new TypeError('"buf" argument must be a Buffer or Uint8Array');if("function"==typeof r)i=r,r=0,n=e.length;else if("function"==typeof n)i=n,n=e.length-r;else if("function"!=typeof i)throw new TypeError('"cb" argument must be a function');return c(r,e.length),l(n,r,e.length),d(e,r,n,i)},e.randomFillSync=function(e,r,n){void 0===r&&(r=0);if(!(a.isBuffer(e)||e instanceof t.Uint8Array))throw new TypeError('"buf" argument must be a Buffer or Uint8Array');c(r,e.length),void 0===n&&(n=e.length-r);return l(n,r,e.length),d(e,r,n)}):(e.randomFill=i,e.randomFillSync=i)}).call(this,r(7),r(5))},function(t,e,r){"use strict"; +/*! + * ignore + */t.exports=r(59).Decimal128},function(t,e,r){"use strict"; +/*! + * [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) ObjectId + * @constructor NodeMongoDbObjectId + * @see ObjectId + */var n=r(59).ObjectID; +/*! + * Getter for convenience with populate, see gh-6115 + */Object.defineProperty(n.prototype,"_id",{enumerable:!1,configurable:!0,get:function(){return this}}), +/*! + * ignore + */ +t.exports=n},function(t,e,r){"use strict"; +/*! + * ignore + */t.exports=function(){}},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n=r(14),i=r(348),o=!1;t.exports=function(){return o?i:n}, +/*! + * ignore + */ +t.exports.setBrowser=function(t){o=t}},function(t,e,r){"use strict"; +/*! + * Dependencies + */var n=r(277).ctor("require","modify","init","default","ignore");function i(){this.activePaths=new n,this.ownerDocument=void 0,this.fullPath=void 0}t.exports=i,i.prototype.strictMode=void 0,i.prototype.selected=void 0,i.prototype.shardval=void 0,i.prototype.saveError=void 0,i.prototype.validationError=void 0,i.prototype.adhocPaths=void 0,i.prototype.removing=void 0,i.prototype.inserting=void 0,i.prototype.saving=void 0,i.prototype.version=void 0,i.prototype._id=void 0,i.prototype.populate=void 0,i.prototype.populated=void 0,i.prototype.wasPopulated=!1,i.prototype.scope=void 0,i.prototype.session=null,i.prototype.pathsToScopes=null,i.prototype.cachedRequired=null},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n=r(4),i=t.exports=function(){}; +/*! + * StateMachine represents a minimal `interface` for the + * constructors it builds via StateMachine.ctor(...). + * + * @api private + */ +/*! + * StateMachine.ctor('state1', 'state2', ...) + * A factory method for subclassing StateMachine. + * The arguments are a list of states. For each state, + * the constructor's prototype gets state transition + * methods named after each state. These transition methods + * place their path argument into the given state. + * + * @param {String} state + * @param {String} [state] + * @return {Function} subclass constructor + * @private + */ +i.ctor=function(){var t=n.args(arguments),e=function(){i.apply(this,arguments),this.paths={},this.states={},this.stateNames=t;for(var e,r=t.length;r--;)e=t[r],this.states[e]={}};return e.prototype=new i,t.forEach((function(t){e.prototype[t]=function(e){this._changeState(e,t)}})),e}, +/*! 
+ * This function is wrapped by the state change functions: + * + * - `require(path)` + * - `modify(path)` + * - `init(path)` + * + * @api private + */ +i.prototype._changeState=function(t,e){var r=this.states[this.paths[t]];r&&delete r[t],this.paths[t]=e,this.states[e][t]=!0}, +/*! + * ignore + */ +i.prototype.clear=function(t){for(var e,r=Object.keys(this.states[t]),n=r.length;n--;)e=r[n],delete this.states[t][e],delete this.paths[e]}, +/*! + * Checks to see if at least one path is in the states passed in via `arguments` + * e.g., this.some('required', 'inited') + * + * @param {String} state that we want to check for. + * @private + */ +i.prototype.some=function(){var t=this,e=arguments.length?arguments:this.stateNames;return Array.prototype.some.call(e,(function(e){return Object.keys(t.states[e]).length}))}, +/*! + * This function builds the functions that get assigned to `forEach` and `map`, + * since both of those methods share a lot of the same logic. + * + * @param {String} iterMethod is either 'forEach' or 'map' + * @return {Function} + * @api private + */ +i.prototype._iter=function(t){return function(){var e=arguments.length,r=n.args(arguments,0,e-1),i=arguments[e-1];r.length||(r=this.stateNames);var o=this,s=r.reduce((function(t,e){return t.concat(Object.keys(o.states[e]))}),[]);return s[t]((function(t,e,r){return i(t,e,r)}))}}, +/*! + * Iterates over the paths that belong to one of the parameter states. + * + * The function profile can look like: + * this.forEach(state1, fn); // iterates over all paths in state1 + * this.forEach(state1, state2, fn); // iterates over all paths in state1 or state2 + * this.forEach(fn); // iterates over all paths in all states + * + * @param {String} [state] + * @param {String} [state] + * @param {Function} callback + * @private + */ +i.prototype.forEach=function(){return this.forEach=this._iter("forEach"),this.forEach.apply(this,arguments)}, +/*! + * Maps over the paths that belong to one of the parameter states. + * + * The function profile can look like: + * this.forEach(state1, fn); // iterates over all paths in state1 + * this.forEach(state1, state2, fn); // iterates over all paths in state1 or state2 + * this.forEach(fn); // iterates over all paths in all states + * + * @param {String} [state] + * @param {String} [state] + * @param {Function} callback + * @return {Array} + * @private + */ +i.prototype.map=function(){return this.map=this._iter("map"),this.map.apply(this,arguments)}},function(t,e,r){function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i=r(279),o=["__proto__","constructor","prototype"]; +/*! + * Returns the value passed to it. + */ +function s(t){return t}e.get=function(t,r,o,a){var u;"function"==typeof o&&(o.length<2?(a=o,o=void 0):(u=o,o=void 0)),a||(a=s);var h="string"==typeof t?i(t):t;if(!Array.isArray(h))throw new TypeError("Invalid `path`. 
Must be either string or array");for(var f,c=r,l=0;l=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r");t.sort.set(r,n)}))} +/*! + * limit, skip, maxScan, batchSize, comment + * + * Sets these associated options. + * + * query.comment('feed query'); + */ +/*! + * Internal helper for update, updateMany, updateOne + */ +function v(t,e,r,n,i,o,s){return t.op=e,c.canMerge(r)&&t.merge(r),n&&t._mergeUpdate(n),h.isObject(i)&&t.setOptions(i),o||s?!t._update||!t.options.overwrite&&0===h.keys(t._update).length?(s&&h.soon(s.bind(null,null,0)),t):(i=t._optionsForExec(),s||(i.safe=!1),r=t._conditions,n=t._updateForExec(),f("update",t._collection.collectionName,r,n,i),s=t._wrapCallback(e,s,{conditions:r,doc:n,options:i}),t._collection[e](r,n,i,h.tick(s)),t):t}["limit","skip","maxScan","batchSize","comment"].forEach((function(t){c.prototype[t]=function(e){return this._validate(t),this.options[t]=e,this}})),c.prototype.maxTime=c.prototype.maxTimeMS=function(t){return this._validate("maxTime"),this.options.maxTimeMS=t,this},c.prototype.snapshot=function(){return this._validate("snapshot"),this.options.snapshot=!arguments.length||!!arguments[0],this},c.prototype.hint=function(){if(0===arguments.length)return this;this._validate("hint");var t=arguments[0];if(h.isObject(t)){var e=this.options.hint||(this.options.hint={});for(var r in t)e[r]=t[r];return this}if("string"==typeof t)return this.options.hint=t,this;throw new TypeError("Invalid hint. "+t)},c.prototype.j=function(t){return this.options.j=t,this},c.prototype.slaveOk=function(t){return this.options.slaveOk=!arguments.length||!!t,this},c.prototype.read=c.prototype.setReadPreference=function(t){return arguments.length>1&&!c.prototype.read.deprecationWarningIssued&&(console.error("Deprecation warning: 'tags' argument is not supported anymore in Query.read() method. Please use mongodb.ReadPreference object instead."),c.prototype.read.deprecationWarningIssued=!0),this.options.readPreference=h.readPref(t),this},c.prototype.readConcern=c.prototype.r=function(t){return this.options.readConcern=h.readConcern(t),this},c.prototype.tailable=function(){return this._validate("tailable"),this.options.tailable=!arguments.length||!!arguments[0],this},c.prototype.writeConcern=c.prototype.w=function(t){return"object"===o(t)?(void 0!==t.j&&(this.options.j=t.j),void 0!==t.w&&(this.options.w=t.w),void 0!==t.wtimeout&&(this.options.wtimeout=t.wtimeout)):this.options.w="m"===t?"majority":t,this},c.prototype.wtimeout=c.prototype.wTimeout=function(t){return this.options.wtimeout=t,this},c.prototype.merge=function(t){if(!t)return this;if(!c.canMerge(t))throw new TypeError("Invalid argument. 
Expected instanceof mquery or plain object");return t instanceof c?(t._conditions&&h.merge(this._conditions,t._conditions),t._fields&&(this._fields||(this._fields={}),h.merge(this._fields,t._fields)),t.options&&(this.options||(this.options={}),h.merge(this.options,t.options)),t._update&&(this._update||(this._update={}),h.mergeClone(this._update,t._update)),t._distinct&&(this._distinct=t._distinct),this):(h.merge(this._conditions,t),this)},c.prototype.find=function(t,e){if(this.op="find","function"==typeof t?(e=t,t=void 0):c.canMerge(t)&&this.merge(t),!e)return this;var r=this._conditions,n=this._optionsForExec();return this.$useProjection?n.projection=this._fieldsForExec():n.fields=this._fieldsForExec(),f("find",this._collection.collectionName,r,n),e=this._wrapCallback("find",e,{conditions:r,options:n}),this._collection.find(r,n,h.tick(e)),this},c.prototype.cursor=function(t){if(this.op){if("find"!==this.op)throw new TypeError(".cursor only support .find method")}else this.find(t);var e=this._conditions,r=this._optionsForExec();return this.$useProjection?r.projection=this._fieldsForExec():r.fields=this._fieldsForExec(),f("findCursor",this._collection.collectionName,e,r),this._collection.findCursor(e,r)},c.prototype.findOne=function(t,e){if(this.op="findOne","function"==typeof t?(e=t,t=void 0):c.canMerge(t)&&this.merge(t),!e)return this;var r=this._conditions,n=this._optionsForExec();return this.$useProjection?n.projection=this._fieldsForExec():n.fields=this._fieldsForExec(),f("findOne",this._collection.collectionName,r,n),e=this._wrapCallback("findOne",e,{conditions:r,options:n}),this._collection.findOne(r,n,h.tick(e)),this},c.prototype.count=function(t,e){if(this.op="count",this._validate(),"function"==typeof t?(e=t,t=void 0):c.canMerge(t)&&this.merge(t),!e)return this;var r=this._conditions,n=this._optionsForExec();return f("count",this._collection.collectionName,r,n),e=this._wrapCallback("count",e,{conditions:r,options:n}),this._collection.count(r,n,h.tick(e)),this},c.prototype.distinct=function(t,e,r){if(this.op="distinct",this._validate(),!r){switch(o(e)){case"function":r=e,"string"==typeof t&&(e=t,t=void 0);break;case"undefined":case"string":break;default:throw new TypeError("Invalid `field` argument. 
Must be string or function")}switch(o(t)){case"function":r=t,t=e=void 0;break;case"string":e=t,t=void 0}}if("string"==typeof e&&(this._distinct=e),c.canMerge(t)&&this.merge(t),!r)return this;if(!this._distinct)throw new Error("No value for `distinct` has been declared");var n=this._conditions,i=this._optionsForExec();return f("distinct",this._collection.collectionName,n,i),r=this._wrapCallback("distinct",r,{conditions:n,options:i}),this._collection.distinct(this._distinct,n,i,h.tick(r)),this},c.prototype.update=function(t,e,r,n){var i;switch(arguments.length){case 3:"function"==typeof r&&(n=r,r=void 0);break;case 2:"function"==typeof e&&(n=e,e=t,t=void 0);break;case 1:switch(o(t)){case"function":n=t,t=r=e=void 0;break;case"boolean":i=t,t=void 0;break;default:e=t,t=r=void 0}}return v(this,"update",t,e,r,i,n)},c.prototype.updateMany=function(t,e,r,n){var i;switch(arguments.length){case 3:"function"==typeof r&&(n=r,r=void 0);break;case 2:"function"==typeof e&&(n=e,e=t,t=void 0);break;case 1:switch(o(t)){case"function":n=t,t=r=e=void 0;break;case"boolean":i=t,t=void 0;break;default:e=t,t=r=void 0}}return v(this,"updateMany",t,e,r,i,n)},c.prototype.updateOne=function(t,e,r,n){var i;switch(arguments.length){case 3:"function"==typeof r&&(n=r,r=void 0);break;case 2:"function"==typeof e&&(n=e,e=t,t=void 0);break;case 1:switch(o(t)){case"function":n=t,t=r=e=void 0;break;case"boolean":i=t,t=void 0;break;default:e=t,t=r=void 0}}return v(this,"updateOne",t,e,r,i,n)},c.prototype.replaceOne=function(t,e,r,n){var i;switch(arguments.length){case 3:"function"==typeof r&&(n=r,r=void 0);break;case 2:"function"==typeof e&&(n=e,e=t,t=void 0);break;case 1:switch(o(t)){case"function":n=t,t=r=e=void 0;break;case"boolean":i=t,t=void 0;break;default:e=t,t=r=void 0}}return this.setOptions({overwrite:!0}),v(this,"replaceOne",t,e,r,i,n)},c.prototype.remove=function(t,e){var r;if(this.op="remove","function"==typeof t?(e=t,t=void 0):c.canMerge(t)?this.merge(t):!0===t&&(r=t,t=void 0),!r&&!e)return this;var n=this._optionsForExec();e||(n.safe=!1);var i=this._conditions;return f("remove",this._collection.collectionName,i,n),e=this._wrapCallback("remove",e,{conditions:i,options:n}),this._collection.remove(i,n,h.tick(e)),this},c.prototype.deleteOne=function(t,e){var r;if(this.op="deleteOne","function"==typeof t?(e=t,t=void 0):c.canMerge(t)?this.merge(t):!0===t&&(r=t,t=void 0),!r&&!e)return this;var n=this._optionsForExec();e||(n.safe=!1),delete n.justOne;var i=this._conditions;return f("deleteOne",this._collection.collectionName,i,n),e=this._wrapCallback("deleteOne",e,{conditions:i,options:n}),this._collection.deleteOne(i,n,h.tick(e)),this},c.prototype.deleteMany=function(t,e){var r;if(this.op="deleteMany","function"==typeof t?(e=t,t=void 0):c.canMerge(t)?this.merge(t):!0===t&&(r=t,t=void 0),!r&&!e)return this;var n=this._optionsForExec();e||(n.safe=!1),delete n.justOne;var i=this._conditions;return f("deleteOne",this._collection.collectionName,i,n),e=this._wrapCallback("deleteOne",e,{conditions:i,options:n}),this._collection.deleteMany(i,n,h.tick(e)),this},c.prototype.findOneAndUpdate=function(t,e,r,n){switch(this.op="findOneAndUpdate",this._validate(),arguments.length){case 3:"function"==typeof r&&(n=r,r={});break;case 2:"function"==typeof e&&(n=e,e=t,t=void 0),r=void 0;break;case 1:"function"==typeof t?(n=t,t=r=e=void 0):(e=t,t=r=void 0)}if(c.canMerge(t)&&this.merge(t),e&&this._mergeUpdate(e),r&&this.setOptions(r),!n)return this;var i=this._conditions,o=this._updateForExec();return 
r=this._optionsForExec(),this._collection.findOneAndUpdate(i,o,r,h.tick(n))},c.prototype.findOneAndRemove=c.prototype.findOneAndDelete=function(t,e,r){if(this.op="findOneAndRemove",this._validate(),"function"==typeof e?(r=e,e=void 0):"function"==typeof t&&(r=t,t=void 0),c.canMerge(t)&&this.merge(t),e&&this.setOptions(e),!r)return this;e=this._optionsForExec();var n=this._conditions;return this._collection.findOneAndDelete(n,e,h.tick(r))},c.prototype._wrapCallback=function(t,e,r){var n=this._traceFunction||c.traceFunction;if(n){r.collectionName=this._collection.collectionName;var i=n&&n.call(null,t,r,this),o=(new Date).getTime();return function(t,r){if(i){var n=(new Date).getTime()-o;i.call(null,t,r,n)}e&&e.apply(null,arguments)}}return e},c.prototype.setTraceFunction=function(t){return this._traceFunction=t,this},c.prototype.exec=function(t,e){switch(o(t)){case"function":e=t,t=null;break;case"string":this.op=t}a.ok(this.op,"Missing query type: (find, update, etc)"),"update"!=this.op&&"remove"!=this.op||e||(e=!0);var r=this;if("function"!=typeof e)return new c.Promise((function(t,e){r[r.op]((function(r,n){r?e(r):t(n),t=e=null}))}));this[this.op](e)},c.prototype.thunk=function(){var t=this;return function(e){t.exec(e)}},c.prototype.then=function(t,e){var r=this;return new c.Promise((function(t,e){r.exec((function(r,n){r?e(r):t(n),t=e=null}))})).then(t,e)},c.prototype.cursor=function(){if("find"!=this.op)throw new Error("cursor() is only available for find");var t=this._conditions,e=this._optionsForExec();return this.$useProjection?e.projection=this._fieldsForExec():e.fields=this._fieldsForExec(),f("cursor",this._collection.collectionName,t,e),this._collection.findCursor(t,e)},c.prototype.selected=function(){return!!(this._fields&&Object.keys(this._fields).length>0)},c.prototype.selectedInclusively=function(){if(!this._fields)return!1;var t=Object.keys(this._fields);if(0===t.length)return!1;for(var e=0;e=31||"undefined"!=typeof navigator&&navigator.userAgent&&navigator.userAgent.toLowerCase().match(/applewebkit\/(\d+)/)},e.storage=function(){try{return localStorage}catch(t){}}(),e.destroy=(i=!1,function(){i||(i=!0,console.warn("Instance method `debug.destroy()` is deprecated and no longer does anything. 
It will be removed in the next major version of `debug`."))}),e.colors=["#0000CC","#0000FF","#0033CC","#0033FF","#0066CC","#0066FF","#0099CC","#0099FF","#00CC00","#00CC33","#00CC66","#00CC99","#00CCCC","#00CCFF","#3300CC","#3300FF","#3333CC","#3333FF","#3366CC","#3366FF","#3399CC","#3399FF","#33CC00","#33CC33","#33CC66","#33CC99","#33CCCC","#33CCFF","#6600CC","#6600FF","#6633CC","#6633FF","#66CC00","#66CC33","#9900CC","#9900FF","#9933CC","#9933FF","#99CC00","#99CC33","#CC0000","#CC0033","#CC0066","#CC0099","#CC00CC","#CC00FF","#CC3300","#CC3333","#CC3366","#CC3399","#CC33CC","#CC33FF","#CC6600","#CC6633","#CC9900","#CC9933","#CCCC00","#CCCC33","#FF0000","#FF0033","#FF0066","#FF0099","#FF00CC","#FF00FF","#FF3300","#FF3333","#FF3366","#FF3399","#FF33CC","#FF33FF","#FF6600","#FF6633","#FF9900","#FF9933","#FFCC00","#FFCC33"],e.log=console.debug||console.log||function(){},t.exports=r(286)(e),t.exports.formatters.j=function(t){try{return JSON.stringify(t)}catch(t){return"[UnexpectedJSONParseError]: "+t.message}}}).call(this,r(5))},function(t,e,r){function n(t){return function(t){if(Array.isArray(t))return i(t)}(t)||function(t){if("undefined"!=typeof Symbol&&null!=t[Symbol.iterator]||null!=t["@@iterator"])return Array.from(t)}(t)||function(t,e){if(!t)return;if("string"==typeof t)return i(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return i(t,e)}(t)||function(){throw new TypeError("Invalid attempt to spread non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}()}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0?"field selection and slice":(Object.keys(n.distinct).every((function(r){return!t.options[r]||(e=r,!1)})),e);var e},n.distinct.select=n.distinct.slice=n.distinct.sort=n.distinct.limit=n.distinct.skip=n.distinct.batchSize=n.distinct.comment=n.distinct.maxScan=n.distinct.snapshot=n.distinct.hint=n.distinct.tailable=!0,n.findOneAndUpdate=n.findOneAndRemove=function(t){var e;return Object.keys(n.findOneAndUpdate).every((function(r){return!t.options[r]||(e=r,!1)})),e},n.findOneAndUpdate.limit=n.findOneAndUpdate.skip=n.findOneAndUpdate.batchSize=n.findOneAndUpdate.maxScan=n.findOneAndUpdate.snapshot=n.findOneAndUpdate.hint=n.findOneAndUpdate.tailable=n.findOneAndUpdate.comment=!0,n.count=function(t){return t._fields&&Object.keys(t._fields).length>0?"field selection and slice":(Object.keys(n.count).every((function(r){return!t.options[r]||(e=r,!1)})),e);var e},n.count.slice=n.count.batchSize=n.count.comment=n.count.maxScan=n.count.snapshot=n.count.tailable=!0},function(t,e,r){"use strict";var n=r(151);if("unknown"==n.type)throw new Error("Unknown environment");t.exports=n.isNode?r(289):(n.isMongo,r(53))},function(t,e,r){"use strict";var n=r(53);function i(t){this.collection=t,this.collectionName=t.collectionName}r(150).inherits(i,n),i.prototype.find=function(t,e,r){var 
n=this.collection.find(t,e);try{n.toArray(r)}catch(t){r(t)}},i.prototype.findOne=function(t,e,r){this.collection.findOne(t,e,r)},i.prototype.count=function(t,e,r){this.collection.count(t,e,r)},i.prototype.distinct=function(t,e,r,n){this.collection.distinct(t,e,r,n)},i.prototype.update=function(t,e,r,n){this.collection.update(t,e,r,n)},i.prototype.updateMany=function(t,e,r,n){this.collection.updateMany(t,e,r,n)},i.prototype.updateOne=function(t,e,r,n){this.collection.updateOne(t,e,r,n)},i.prototype.replaceOne=function(t,e,r,n){this.collection.replaceOne(t,e,r,n)},i.prototype.deleteOne=function(t,e,r){this.collection.deleteOne(t,e,r)},i.prototype.deleteMany=function(t,e,r){this.collection.deleteMany(t,e,r)},i.prototype.remove=function(t,e,r){this.collection.remove(t,e,r)},i.prototype.findOneAndDelete=function(t,e,r){this.collection.findOneAndDelete(t,e,r)},i.prototype.findOneAndUpdate=function(t,e,r,n){this.collection.findOneAndUpdate(t,e,r,n)},i.prototype.findCursor=function(t,e){return this.collection.find(t,e)},t.exports=i},function(t,e,r){"use strict";var n=t.exports={};n.DocumentNotFoundError=null,n.general={},n.general.default="Validator failed for path `{PATH}` with value `{VALUE}`",n.general.required="Path `{PATH}` is required.",n.Number={},n.Number.min="Path `{PATH}` ({VALUE}) is less than minimum allowed value ({MIN}).",n.Number.max="Path `{PATH}` ({VALUE}) is more than maximum allowed value ({MAX}).",n.Number.enum="`{VALUE}` is not a valid enum value for path `{PATH}`.",n.Date={},n.Date.min="Path `{PATH}` ({VALUE}) is before minimum allowed value ({MIN}).",n.Date.max="Path `{PATH}` ({VALUE}) is after maximum allowed value ({MAX}).",n.String={},n.String.enum="`{VALUE}` is not a valid enum value for path `{PATH}`.",n.String.match="Path `{PATH}` is invalid ({VALUE}).",n.String.minlength="Path `{PATH}` (`{VALUE}`) is shorter than the minimum allowed length ({MINLENGTH}).",n.String.maxlength="Path `{PATH}` (`{VALUE}`) is longer than the maximum allowed length ({MAXLENGTH})."},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=r(8),h=r(10),f=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r); +/*! 
+ * OverwriteModel Error constructor. + */function r(t,n,i,o){var s,a;!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r);var f=u.messages;return a=null!=f.DocumentNotFoundError?"function"==typeof f.DocumentNotFoundError?f.DocumentNotFoundError(t,n):f.DocumentNotFoundError:'No document found for query "'+h.inspect(t)+'" on model "'+n+'"',(s=e.call(this,a)).result=o,s.numAffected=i,s.filter=t,s.query=t,s}return r}(u);Object.defineProperty(f.prototype,"name",{value:"DocumentNotFoundError"}), +/*! + * exports + */ +t.exports=f},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r);function r(t,n,i){var o;!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r);var s=i.join(", ");return(o=e.call(this,'No matching document found for id "'+t._id+'" version '+n+' modifiedPaths "'+s+'"')).version=n,o.modifiedPaths=i,o}return r}(r(8));Object.defineProperty(u.prototype,"name",{value:"VersionError"}), +/*! + * exports + */ +t.exports=u},function(t,e,r){"use strict"; +/*! + * Module dependencies. 
+ */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r);function r(t){!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r);return e.call(this,"Can't save() the same doc multiple times in parallel. Document: "+t._id)}return r}(r(8));Object.defineProperty(u.prototype,"name",{value:"ParallelSaveError"}), +/*! + * exports + */ +t.exports=u},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r); +/*! + * OverwriteModel Error constructor. 
+ * @param {String} name + */function r(t){return function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r),e.call(this,"Cannot overwrite `"+t+"` model once compiled.")}return r}(r(8));Object.defineProperty(u.prototype,"name",{value:"OverwriteModelError"}), +/*! + * exports + */ +t.exports=u},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r); +/*! + * MissingSchema Error constructor. + * @param {String} name + */function r(t){!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r);var n="Schema hasn't been registered for model \""+t+'".\nUse mongoose.model(name, schema)';return e.call(this,n)}return r}(r(8));Object.defineProperty(u.prototype,"name",{value:"MissingSchemaError"}), +/*! + * exports + */ +t.exports=u},function(t,e,r){"use strict"; +/*! + * Module dependencies. 
+ */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r); +/*! + * DivergentArrayError constructor. + * @param {Array} paths + */function r(t){!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r);var n="For your own good, using `document.save()` to update an array which was selected using an $elemMatch projection OR populated using skip, limit, query conditions, or exclusion of the _id field when the operation results in a $pop or $set of the entire array is not supported. The following path(s) would have been modified unsafely:\n "+t.join("\n ")+"\nUse Model.update() to update these arrays instead.";return e.call(this,n)}return r}(r(8));Object.defineProperty(u.prototype,"name",{value:"DivergentArrayError"}), +/*! + * exports + */ +t.exports=u},function(t,e,r){"use strict";var n=r(54); +/*! + * ignore + */t.exports=function(t){var e,r;t.$immutable?(t.$immutableSetter=(e=t.path,r=t.options.immutable,function(t){if(null==this||null==this.$__)return t;if(this.isNew)return t;if(!("function"==typeof r?r.call(this,this):r))return t;var i=null!=this.$__.priorDoc?this.$__.priorDoc.$__getValue(e):this.$__getValue(e);if("throw"===this.$__.strictMode&&t!==i)throw new n(e,"Path `"+e+"` is immutable and strict mode is set to throw.",!0);return i}),t.set(t.$immutableSetter)):t.$immutableSetter&&(t.setters=t.setters.filter((function(e){return e!==t.$immutableSetter})),delete t.$immutableSetter)}},function(t,e,r){"use strict";var n=r(10).inspect;t.exports=function(t){if("function"==typeof t)return n(t).startsWith("[AsyncFunction:")}},function(t,e,r){"use strict"; +/*! + * Module dependencies. 
+ */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r);function r(t,n,i){return function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r),e.call(this,'Parameter "'+n+'" to '+i+"() must be an object, got "+t.toString())}return r}(r(8));Object.defineProperty(u.prototype,"name",{value:"ObjectParameterError"}),t.exports=u},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){return(i=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function o(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=a(t);if(e){var i=a(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return s(this,r)}}function s(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function a(t){return(a=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var u=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&i(t,e)}(r,t);var e=o(r);function r(t){!function(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}(this,r);return e.call(this,"Can't validate() the same doc multiple times in parallel. Document: "+t._id)}return r}(r(23));Object.defineProperty(u.prototype,"name",{value:"ParallelValidateError"}), +/*! 
+ * exports + */ +t.exports=u},function(t,e,r){"use strict";(function(e){function r(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return n(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return n(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var i=0,o=function(){};return{s:o,n:function(){return i>=t.length?{done:!0}:{done:!1,value:t[i++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function n(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r=s)){var n=o[l];if(n.isAsync){var a=[f(v),f((function(t){if(t){if(p)return;return p=!0,i(t)}if(0==--d&&l>=s)return i(null)}))];u(n.fn,r,a,a[0])}else if(n.fn.length>0){a=[f(v)];for(var c=arguments.length>=2?arguments:[null].concat(m),y=1;y=s)return d>0?void 0:e.nextTick((function(){i(null)}));t()}}}};function v(t){if(t){if(p)return;return p=!0,i(t)}if(++l>=s)return d>0?void 0:i(null);y.apply(r,arguments)}y.apply(null,[null].concat(n))},o.prototype.execPreSync=function(t,e,r){for(var n=a(this._pres,t,[]),i=n.length,o=0;o=c)return o.call(null,d);t()}));u(e,r,[d].concat(p).concat([y]),y)}else{if(++l>=c)return o.call(null,d);t()}else{var v=f((function(e){return e?(d=e,t()):++l>=c?o.apply(null,[null].concat(n)):void t()}));if(e.length===i+2)return++l>=c?o.apply(null,[null].concat(n)):t();if(e.length===i+1)u(e,r,p.concat([v]),v);else{var b,g;try{g=e.apply(r,p)}catch(t){b=t,d=t}if(h(g))return g.then((function(){return v()}),(function(t){return v(t)}));if(++l>=c)return o.apply(null,[b].concat(n));t()}}};p()},o.prototype.execPostSync=function(t,e,r){for(var n=a(this._posts,t,[]),i=n.length,o=0;o0?n[n.length-1]:null,a=("function"==typeof o&&n.slice(0,n.length-1),this),u=(i=i||{}).checkForPromise;this.execPre(t,r,n,(function(h){if(h){for(var f=i.numCallbackParams||0,c=i.contextParameter?[r]:[],l=c.length;l=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0;--l){var d=t.path(c.slice(0,l).join("."));null!=d&&(d.$isMongooseDocumentArray||d.$isSingleNested)&&f.push({parentPath:e.split(".").slice(0,l).join("."),parentSchemaType:d})}if(Array.isArray(r[e])&&h.$isMongooseDocumentArray)!function(t,e,r){var n=e.schema.options.timestamps;if(n)for(var i=t.length,o=s(n,"createdAt"),u=s(n,"updatedAt"),h=0;h0){var p,m=n(f);try{for(m.s();!(p=m.n()).done;){var y=p.value,v=y.parentPath,b=y.parentSchemaType,g=b.schema.options.timestamps,w=s(g,"updatedAt");if(g&&null!=w)if(b.$isSingleNested)r[v+"."+w]=i;else if(b.$isMongooseDocumentArray){var 
_=e.substr(v.length+1);if(/^\d+$/.test(_)){r[v+"."+_][w]=i;continue}var M=_.indexOf(".");r[v+"."+(_=-1!==M?_.substr(0,M):_)+"."+w]=i}}}catch(t){m.e(t)}finally{m.f()}}else if(null!=h.schema&&h.schema!=t&&r[e]){var S=h.schema.options.timestamps,O=s(S,"createdAt"),A=s(S,"updatedAt");if(!S)return;null!=A&&(r[e][A]=i),null!=O&&(r[e][O]=i)}}}t.exports=a},function(t,e,r){"use strict";t.exports=function(t){return t.replace(/\.\$(\[[^\]]*\])?(?=\.)/g,".0").replace(/\.\$(\[[^\]]*\])?$/g,".0")}},function(t,e,r){"use strict"; +/*! + * ignore + */var n=r(6);t.exports= +/*! + * ignore + */ +function(t,e,r,i,o){var s=i,a=s,u=n(o,"overwrite",!1),h=n(o,"timestamps",!0);if(!h||null==s)return i;var f=null!=h&&!1===h.createdAt,c=null!=h&&!1===h.updatedAt;if(u)return i&&i.$set&&(i=i.$set,s.$set={},a=s.$set),c||!r||i[r]||(a[r]=t),f||!e||i[e]||(a[e]=t),s;if(i=i||{},Array.isArray(s))return s.push({$set:{updatedAt:t}}),s;if(s.$set=s.$set||{},!c&&r&&(!i.$currentDate||!i.$currentDate[r])){var l=!1;if(-1!==r.indexOf("."))for(var d=r.split("."),p=1;p=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0&&!t?this:this.set((function(t){return"string"!=typeof t&&(t=e.cast(t)),t?t.toLowerCase():t}))},c.prototype.uppercase=function(t){var e=this;return arguments.length>0&&!t?this:this.set((function(t){return"string"!=typeof t&&(t=e.cast(t)),t?t.toUpperCase():t}))},c.prototype.trim=function(t){var e=this;return arguments.length>0&&!t?this:this.set((function(t){return"string"!=typeof t&&(t=e.cast(t)),t?t.trim():t}))},c.prototype.minlength=function(t,e){if(this.minlengthValidator&&(this.validators=this.validators.filter((function(t){return t.validator!==this.minlengthValidator}),this)),null!=t){var r=e||s.messages.String.minlength;r=r.replace(/{MINLENGTH}/,t),this.validators.push({validator:this.minlengthValidator=function(e){return null===e||e.length>=t},message:r,type:"minlength",minlength:t})}return this},c.prototype.minLength=c.prototype.minlength,c.prototype.maxlength=function(t,e){if(this.maxlengthValidator&&(this.validators=this.validators.filter((function(t){return t.validator!==this.maxlengthValidator}),this)),null!=t){var r=e||s.messages.String.maxlength;r=r.replace(/{MAXLENGTH}/,t),this.validators.push({validator:this.maxlengthValidator=function(e){return null===e||e.length<=t},message:r,type:"maxlength",maxlength:t})}return this},c.prototype.maxLength=c.prototype.maxlength,c.prototype.match=function(t,e){var r=e||s.messages.String.match;return this.validators.push({validator:function(e){return!!t&&(t.lastIndex=0,null==e||""===e||t.test(e))},message:r,type:"regexp",regexp:t}),this},c.prototype.checkRequired=function(t,e){return o._isRef(this,t,e,!0)?!!t:("function"==typeof this.constructor.checkRequired?this.constructor.checkRequired():c.checkRequired())(t)},c.prototype.cast=function(t,e,r){if(o._isRef(this,t,e,r))return"string"==typeof t?t:this._castRef(t,e,r);var n;n="function"==typeof this._castFunction?this._castFunction:"function"==typeof this.constructor.cast?this.constructor.cast():c.cast();try{return n(t)}catch(e){throw new f("string",t,this.path,null,this)}};var 
d=h.options(o.prototype.$conditionalHandlers,{$all:function(t){var e=this;return Array.isArray(t)?t.map((function(t){return e.castForQuery(t)})):[this.castForQuery(t)]},$gt:l,$gte:l,$lt:l,$lte:l,$options:String,$regex:l,$not:l});Object.defineProperty(c.prototype,"$conditionalHandlers",{configurable:!1,enumerable:!1,writable:!1,value:Object.freeze(d)}),c.prototype.castForQuery=function(t,e){var r;if(2===arguments.length){if(!(r=this.$conditionalHandlers[t]))throw new Error("Can't use "+t+" with String.");return r.call(this,e)}return e=t,"[object RegExp]"===Object.prototype.toString.call(e)?e:this._castForQuery(e)}, +/*! + * Module exports. + */ +t.exports=c},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"enum",f),Object.defineProperty(h.prototype,"match",f),Object.defineProperty(h.prototype,"lowercase",f),Object.defineProperty(h.prototype,"trim",f),Object.defineProperty(h.prototype,"uppercase",f),Object.defineProperty(h.prototype,"minLength",f),Object.defineProperty(h.prototype,"minlength",f),Object.defineProperty(h.prototype,"maxLength",f),Object.defineProperty(h.prototype,"maxlength",f),Object.defineProperty(h.prototype,"populate",f), +/*! 
+ * ignore + */ +t.exports=h},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"min",f),Object.defineProperty(h.prototype,"max",f),Object.defineProperty(h.prototype,"enum",f),Object.defineProperty(h.prototype,"populate",f), +/*! + * ignore + */ +t.exports=h},function(t,e,r){"use strict";var n=r(33); +/*! + * Given a value, cast it to a number, or throw a `CastError` if the value + * cannot be casted. `null` and `undefined` are considered valid. + * + * @param {Any} value + * @param {String} [path] optional the path to set on the CastError + * @return {Boolean|null|undefined} + * @throws {Error} if `value` is not one of the allowed values + * @api private + */t.exports=function(t){return null==t?t:""===t?null:("string"!=typeof t&&"boolean"!=typeof t||(t=Number(t)),n.ok(!isNaN(t)),t instanceof Number?t.valueOf():"number"==typeof t?t:Array.isArray(t)||"function"!=typeof t.valueOf?t.toString&&!Array.isArray(t)&&t.toString()==Number(t)?Number(t):void n.ok(!1):Number(t.valueOf()))}},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n=r(20),i=r(9),o=r(84),s=r(4);function a(t,e){i.call(this,t,e,"Boolean")}a.schemaName="Boolean",a.defaultOptions={}, +/*! + * Inherits from SchemaType. + */ +a.prototype=Object.create(i.prototype),a.prototype.constructor=a, +/*! + * ignore + */ +a._cast=o,a.set=i.set,a.cast=function(t){return 0===arguments.length||(!1===t&&(t=this._defaultCaster),this._cast=t),this._cast}, +/*! + * ignore + */ +a._defaultCaster=function(t){if(null!=t&&"boolean"!=typeof t)throw new Error;return t}, +/*! 
+ * ignore + */ +a._checkRequired=function(t){return!0===t||!1===t},a.checkRequired=i.checkRequired,a.prototype.checkRequired=function(t){return this.constructor._checkRequired(t)},Object.defineProperty(a,"convertToTrue",{get:function(){return o.convertToTrue},set:function(t){o.convertToTrue=t}}),Object.defineProperty(a,"convertToFalse",{get:function(){return o.convertToFalse},set:function(t){o.convertToFalse=t}}),a.prototype.cast=function(t){var e;e="function"==typeof this._castFunction?this._castFunction:"function"==typeof this.constructor.cast?this.constructor.cast():a.cast();try{return e(t)}catch(e){throw new n("Boolean",t,this.path,e,this)}},a.$conditionalHandlers=s.options(i.prototype.$conditionalHandlers,{}),a.prototype.castForQuery=function(t,e){var r;return 2===arguments.length?(r=a.$conditionalHandlers[t])?r.call(this,e):this._castForQuery(e):this._castForQuery(t)},a.prototype._castNullish=function(t){if(void 0===t&&null!=this.$$context&&null!=this.$$context._mongooseOptions&&this.$$context._mongooseOptions.omitUndefined)return t;var e="function"==typeof this.constructor.cast?this.constructor.cast():a.cast();return null==e?t:!(e.convertToFalse instanceof Set&&e.convertToFalse.has(t))&&(!!(e.convertToTrue instanceof Set&&e.convertToTrue.has(t))||t)}, +/*! + * Module exports. + */ +t.exports=a},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n,i,o=r(88),s=r(20),a=r(13).EventEmitter,u=r(323),h=r(9),f=r(173),c=r(6),l=r(174),d=r(10),p=r(4),m=r(175),y=r(1).arrayAtomicsSymbol,v=r(1).arrayPathSymbol,b=r(1).documentArrayParent;function g(t,e,r,n){null!=n&&null!=n._id?e=l(e,n):null!=r&&null!=r._id&&(e=l(e,r));var i=w(e,r);i.prototype.$basePath=t,o.call(this,t,i,r),this.schema=e,this.schemaOptions=n||{},this.$isMongooseDocumentArray=!0,this.Constructor=i,i.base=e.base;var s=this.defaultValue;"defaultValue"in this&&void 0===s||this.default((function(){var t=s.call(this);return Array.isArray(t)||(t=[t]),t}));var a=this;this.$embeddedSchemaType=new h(t+".$",{required:c(this,"schemaOptions.required",!1)}),this.$embeddedSchemaType.cast=function(t,e,r){return a.cast(t,e,r)[0]},this.$embeddedSchemaType.$isMongooseDocumentArrayElement=!0,this.$embeddedSchemaType.caster=this.Constructor,this.$embeddedSchemaType.schema=this.schema} +/*! + * Ignore + */ +function w(t,e,n){function o(){i.apply(this,arguments),this.$session(this.ownerDocument().$session())}i||(i=r(45)),t._preCompile();var s=null!=n?n.prototype:i.prototype;for(var u in o.prototype=Object.create(s),o.prototype.$__setSchema(t),o.schema=t,o.prototype.constructor=o,o.$isArraySubdocument=!0,o.events=new a,t.methods)o.prototype[u]=t.methods[u];for(var h in t.statics)o[h]=t.statics[h];for(var f in a.prototype)o[f]=a.prototype[f];return o.options=e,o} +/*! + * Scopes paths selected in a query to this array. + * Necessary for proper default application of subdocument values. + * + * @param {DocumentArrayPath} array - the array to scope `fields` paths + * @param {Object|undefined} fields - the root fields selected in the query + * @param {Boolean|undefined} init - if we are being created part of a query result + */ +function _(t,e,r){if(r&&e){for(var n,i,o,s=t.path+".",a=Object.keys(e),u=a.length,h={};u--;)if((i=a[u]).startsWith(s)){if("$"===(o=i.substring(s.length)))continue;o.startsWith("$.")&&(o=o.substr(2)),n||(n=!0),h[o]=e[i]}return n&&h||void 0}}g.schemaName="DocumentArray",g.options={castNonArrays:!0}, +/*! + * Inherits from ArrayType. 
+ */ +g.prototype=Object.create(o.prototype),g.prototype.constructor=g,g.prototype.OptionsConstructor=u,g.prototype.discriminator=function(t,e,r){"function"==typeof t&&(t=p.getFunctionName(t)),r=r||{};var n=p.isPOJO(r)?r.value:r,i=c(r,"clone",!0);e.instanceOfSchema&&i&&(e=e.clone());var o=w(e=f(this.casterConstructor,t,e,n),null,this.casterConstructor);o.baseCasterConstructor=this.casterConstructor;try{Object.defineProperty(o,"name",{value:t})}catch(t){}return this.casterConstructor.discriminators[t]=o,this.casterConstructor.discriminators[t]},g.prototype.doValidate=function(t,e,o,s){n||(n=r(26));var a=this;try{h.prototype.doValidate.call(this,t,(function(r){if(r)return e(r);var u,h=t&&t.length;if(!h)return e();if(s&&s.updateValidator)return e();t.isMongooseDocumentArray||(t=new n(t,a.path,o));function f(t){null!=t&&(u=t),--h||e(u)}for(var c=0,l=h;cr.max&&(r.max=i.max),r.containsNonArrayItem=r.containsNonArrayItem||i.containsNonArrayItem}return r.min=r.min+1,r.max=r.max+1,r}},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return i(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return i(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,o=function(){};return{s:o,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0?t:n)}return this}}; +/*! + * If this is a document array, each element may contain single + * populated paths, so we need to modify the top-level document's + * populated cache. See gh-8247, gh-8265. 
+ */ +function v(t){var e=t[l];if(e&&null!=e.$__.populated){var r,i=n(Object.keys(e.$__.populated).filter((function(e){return e.startsWith(t[d]+".")})));try{var o=function(){var n=r.value,i=n.slice((t[d]+".").length);if(!Array.isArray(e.$__.populated[n].value))return"continue";e.$__.populated[n].value=t.map((function(t){return t.$populated(i)}))};for(i.s();!(r=i.n()).done;)o()}catch(t){i.e(t)}finally{i.f()}}}t.exports=y}).call(this,r(3).Buffer)},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"excludeIndexes",f),Object.defineProperty(h.prototype,"_id",f), +/*! + * ignore + */ +t.exports=h},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i,o=r(20),s=r(13).EventEmitter,a=r(156),u=r(325),h=r(9),f=r(83),c=r(56).castToNumber,l=r(173),d=r(166),p=r(6),m=r(175),y=r(174),v=r(35).internalToObjectOptions,b=r(4);function g(t,e,r){t=y(t,r),this.caster=w(t),this.caster.path=e,this.caster.prototype.$basePath=e,this.schema=t,this.$isSingleNested=!0,h.call(this,e,r,"Embedded")} +/*! + * ignore + */ +/*! 
+ * ignore + */ +function w(t,e){i||(i=r(90));var n=function(t,e,r){var n=this;this.$__parent=r,i.apply(this,arguments),this.$session(this.ownerDocument().$session()),r&&(r.$on("save",(function(){n.emit("save",n),n.constructor.emit("save",n)})),r.$on("isNew",(function(t){n.isNew=t,n.emit("isNew",t),n.constructor.emit("isNew",t)})))};t._preCompile();var o=null!=e?e.prototype:i.prototype;for(var a in(n.prototype=Object.create(o)).$__setSchema(t),n.prototype.constructor=n,n.schema=t,n.$isSingleNested=!0,n.events=new s,n.prototype.toBSON=function(){return this.toObject(v)},t.methods)n.prototype[a]=t.methods[a];for(var u in t.statics)n[u]=t.statics[u];for(var h in s.prototype)n[h]=s.prototype[h];return n} +/*! + * Special case for when users use a common location schema to represent + * locations for use with $geoWithin. + * https://docs.mongodb.org/manual/reference/operator/query/geoWithin/ + * + * @param {Object} val + * @api private + */t.exports=g,g.prototype=Object.create(h.prototype),g.prototype.constructor=g,g.prototype.OptionsConstructor=u,g.prototype.$conditionalHandlers.$geoWithin=function(t){return{$geometry:this.castForQuery(t.$geometry)}}, +/*! + * ignore + */ +g.prototype.$conditionalHandlers.$near=g.prototype.$conditionalHandlers.$nearSphere=d.cast$near,g.prototype.$conditionalHandlers.$within=g.prototype.$conditionalHandlers.$geoWithin=d.cast$within,g.prototype.$conditionalHandlers.$geoIntersects=d.cast$geoIntersects,g.prototype.$conditionalHandlers.$minDistance=c,g.prototype.$conditionalHandlers.$maxDistance=c,g.prototype.$conditionalHandlers.$exists=f,g.prototype.cast=function(t,e,r,i,o){if(t&&t.$isSingleNested&&t.parent===e)return t;if(null!=t&&("object"!==n(t)||Array.isArray(t)))throw new a(this.path,t);var s,u=m(this.caster,t),h=p(e,"$__.selected",{}),f=this.path,c=Object.keys(h).reduce((function(t,e){return e.startsWith(f+".")&&(t[e.substr(f.length+1)]=h[e]),t}),{});return o=Object.assign({},o,{priorDoc:i}),r?((s=new u(void 0,c,e)).$init(t),s):0===Object.keys(t).length?new u({},c,e,void 0,o):new u(t,c,e,void 0,o)},g.prototype.castForQuery=function(t,e,r){var n;if(2===arguments.length){if(!(n=this.$conditionalHandlers[t]))throw new Error("Can't use "+t);return n.call(this,e)}if(null==(e=t))return e;this.options.runSetters&&(e=this._applySetters(e));var i=m(this.caster,e),s=null!=r&&null!=r.strict?r.strict:void 0;try{e=new i(e,s)}catch(t){if(!(t instanceof o))throw new o("Embedded",e,this.path,t,this);throw t}return e},g.prototype.doValidate=function(t,e,r,n){var i=m(this.caster,t);if(n&&n.skipSchemaValidators)return t instanceof i||(t=new i(t,null,r)),t.validate(e);h.prototype.doValidate.call(this,t,(function(r){return r?e(r):t?void t.validate(e):e(null)}),r,n)},g.prototype.doValidateSync=function(t,e,r){if(!r||!r.skipSchemaValidators){var n=h.prototype.doValidateSync.call(this,t,e);if(n)return n}if(t)return t.validateSync()},g.prototype.discriminator=function(t,e,r){r=r||{};var n=b.isPOJO(r)?r.value:r,i=p(r,"clone",!0);return e.instanceOfSchema&&i&&(e=e.clone()),e=l(this.caster,t,e,n),this.caster.discriminators[t]=w(e,this.caster),this.caster.discriminators[t]},g.defaultOptions={},g.set=h.set, +/*! 
+ * ignore + */ +g.prototype.clone=function(){var t=Object.assign({},this.options),e=new this.constructor(this.schema,this.path,t);return e.validators=this.validators.slice(),void 0!==this.requiredValidator&&(e.requiredValidator=this.requiredValidator),e.caster.discriminators=Object.assign({},this.caster.discriminators),e}},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"_id",f),t.exports=h},function(t,e,r){"use strict";(function(e){ +/*! + * Module dependencies. + */ +function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i=r(169),o=r(327),s=r(9),a=r(162),u=r(4),h=i.Binary,f=s.CastError;function c(t,e){s.call(this,t,e,"Buffer")} +/*! + * ignore + */ +function l(t){return this.castForQuery(t)}c.schemaName="Buffer",c.defaultOptions={}, +/*! + * Inherits from SchemaType. + */ +c.prototype=Object.create(s.prototype),c.prototype.constructor=c,c.prototype.OptionsConstructor=o, +/*! 
+ * ignore + */ +c._checkRequired=function(t){return!(!t||!t.length)},c.set=s.set,c.checkRequired=s.checkRequired,c.prototype.checkRequired=function(t,e){return s._isRef(this,t,e,!0)?!!t:this.constructor._checkRequired(t)},c.prototype.cast=function(t,r,o){var a;if(s._isRef(this,t,r,o)){if(t&&t.isMongooseBuffer)return t;if(e.isBuffer(t))return t&&t.isMongooseBuffer||(t=new i(t,[this.path,r]),null!=this.options.subtype&&(t._subtype=this.options.subtype)),t;if(t instanceof h){if(a=new i(t.value(!0),[this.path,r]),"number"!=typeof t.sub_type)throw new f("Buffer",t,this.path,null,this);return a._subtype=t.sub_type,a}return this._castRef(t,r,o)}if(t&&t._id&&(t=t._id),t&&t.isMongooseBuffer)return t;if(e.isBuffer(t))return t&&t.isMongooseBuffer||(t=new i(t,[this.path,r]),null!=this.options.subtype&&(t._subtype=this.options.subtype)),t;if(t instanceof h){if(a=new i(t.value(!0),[this.path,r]),"number"!=typeof t.sub_type)throw new f("Buffer",t,this.path,null,this);return a._subtype=t.sub_type,a}if(null===t)return t;var u=n(t);if("string"===u||"number"===u||Array.isArray(t)||"object"===u&&"Buffer"===t.type&&Array.isArray(t.data))return"number"===u&&(t=[t]),a=new i(t,[this.path,r]),null!=this.options.subtype&&(a._subtype=this.options.subtype),a;throw new f("Buffer",t,this.path,null,this)},c.prototype.subtype=function(t){return this.options.subtype=t,this},c.prototype.$conditionalHandlers=u.options(s.prototype.$conditionalHandlers,{$bitsAllClear:a,$bitsAnyClear:a,$bitsAllSet:a,$bitsAnySet:a,$gt:l,$gte:l,$lt:l,$lte:l}),c.prototype.castForQuery=function(t,e){var r;if(2===arguments.length){if(!(r=this.$conditionalHandlers[t]))throw new Error("Can't use "+t+" with Buffer.");return r.call(this,e)}e=t;var n=this._castForQuery(e);return n?n.toObject({transform:!1,virtuals:!1}):n}, +/*! + * Module exports. + */ +t.exports=c}).call(this,r(3).Buffer)},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"subtype",f), +/*! 
+ * ignore + */ +t.exports=h},function(t,e,r){"use strict"; +/*! + * Module requirements. + */var n=r(8),i=r(329),o=r(9),s=r(330),a=r(34),u=r(4),h=o.CastError;function f(t,e){o.call(this,t,e,"Date")} +/*! + * Date Query casting. + * + * @api private + */ +function c(t){return this.cast(t)}f.schemaName="Date",f.defaultOptions={}, +/*! + * Inherits from SchemaType. + */ +f.prototype=Object.create(o.prototype),f.prototype.constructor=f,f.prototype.OptionsConstructor=i, +/*! + * ignore + */ +f._cast=s,f.set=o.set,f.cast=function(t){return 0===arguments.length||(!1===t&&(t=this._defaultCaster),this._cast=t),this._cast}, +/*! + * ignore + */ +f._defaultCaster=function(t){if(null!=t&&!(t instanceof Date))throw new Error;return t},f.prototype.expires=function(t){return"Object"!==a(this._index)&&(this._index={}),this._index.expires=t,u.expires(this._index),this}, +/*! + * ignore + */ +f._checkRequired=function(t){return t instanceof Date},f.checkRequired=o.checkRequired,f.prototype.checkRequired=function(t,e){return o._isRef(this,t,e,!0)?!!t:("function"==typeof this.constructor.checkRequired?this.constructor.checkRequired():f.checkRequired())(t)},f.prototype.min=function(t,e){if(this.minValidator&&(this.validators=this.validators.filter((function(t){return t.validator!==this.minValidator}),this)),t){var r=e||n.messages.Date.min;"string"==typeof r&&(r=r.replace(/{MIN}/,t===Date.now?"Date.now()":t.toString()));var i=this;this.validators.push({validator:this.minValidator=function(e){var r=t;"function"==typeof t&&t!==Date.now&&(r=r.call(this));var n=r===Date.now?r():i.cast(r);return null===e||e.valueOf()>=n.valueOf()},message:r,type:"min",min:t})}return this},f.prototype.max=function(t,e){if(this.maxValidator&&(this.validators=this.validators.filter((function(t){return t.validator!==this.maxValidator}),this)),t){var r=e||n.messages.Date.max;"string"==typeof r&&(r=r.replace(/{MAX}/,t===Date.now?"Date.now()":t.toString()));var i=this;this.validators.push({validator:this.maxValidator=function(e){var r=t;"function"==typeof r&&r!==Date.now&&(r=r.call(this));var n=r===Date.now?r():i.cast(r);return null===e||e.valueOf()<=n.valueOf()},message:r,type:"max",max:t})}return this},f.prototype.cast=function(t){var e;e="function"==typeof this._castFunction?this._castFunction:"function"==typeof this.constructor.cast?this.constructor.cast():f.cast();try{return e(t)}catch(e){throw new h("date",t,this.path,e,this)}},f.prototype.$conditionalHandlers=u.options(o.prototype.$conditionalHandlers,{$gt:c,$gte:c,$lt:c,$lte:c}),f.prototype.castForQuery=function(t,e){if(2!==arguments.length)return this._castForQuery(t);var r=this.$conditionalHandlers[t];if(!r)throw new Error("Can't use "+t+" with Date.");return r.call(this,e)}, +/*! + * Module exports. 
+ */ +t.exports=f},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"min",f),Object.defineProperty(h.prototype,"max",f),Object.defineProperty(h.prototype,"expires",f), +/*! + * ignore + */ +t.exports=h},function(t,e,r){"use strict";var n=r(33);t.exports=function(t){return null==t||""===t?null:t instanceof Date?(n.ok(!isNaN(t.valueOf())),t):(n.ok("boolean"!=typeof t),e=t instanceof Number||"number"==typeof t?new Date(t):"string"==typeof t&&!isNaN(Number(t))&&(Number(t)>=275761||Number(t)<-271820)?new Date(Number(t)):"function"==typeof t.valueOf?new Date(t.valueOf()):new Date(t),isNaN(e.valueOf())?void n.ok(!1):e);var e}},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n,i=r(332),o=r(9),s=r(170),a=r(34),u=r(18),h=r(4),f=o.CastError;function c(t,e){var r="string"==typeof t&&24===t.length&&/^[a-f0-9]+$/i.test(t),n=e&&e.suppressWarning;!r&&void 0!==t||n||h.warn("mongoose: To create a new ObjectId please try `Mongoose.Types.ObjectId` instead of using `Mongoose.Schema.ObjectId`. Set the `suppressWarning` option if you're trying to create a hex char path in your schema."),o.call(this,t,e,"ObjectID")} +/*! + * ignore + */ +function l(t){return this.cast(t)} +/*! + * ignore + */ +function d(){return new u}function p(t){if(n||(n=r(14)),this instanceof n){if(void 0===t){var e=new u;return this.$__._id=e,e}this.$__._id=t}return t} +/*! + * Module exports. + */c.schemaName="ObjectId",c.defaultOptions={}, +/*! + * Inherits from SchemaType. + */ +c.prototype=Object.create(o.prototype),c.prototype.constructor=c,c.prototype.OptionsConstructor=i,c.get=o.get,c.set=o.set,c.prototype.auto=function(t){return t&&(this.default(d),this.set(p)),this}, +/*! + * ignore + */ +c._checkRequired=function(t){return t instanceof u}, +/*! + * ignore + */ +c._cast=s,c.cast=function(t){return 0===arguments.length||(!1===t&&(t=this._defaultCaster),this._cast=t),this._cast}, +/*! 
+ * ignore + */ +c._defaultCaster=function(t){if(!(t instanceof u))throw new Error(t+" is not an instance of ObjectId");return t},c.checkRequired=o.checkRequired,c.prototype.checkRequired=function(t,e){return o._isRef(this,t,e,!0)?!!t:("function"==typeof this.constructor.checkRequired?this.constructor.checkRequired():c.checkRequired())(t)},c.prototype.cast=function(t,e,r){if(o._isRef(this,t,e,r))return t instanceof u?t:"objectid"===(a(t)||"").toLowerCase()?new u(t.toHexString()):this._castRef(t,e,r);var n;n="function"==typeof this._castFunction?this._castFunction:"function"==typeof this.constructor.cast?this.constructor.cast():c.cast();try{return n(t)}catch(e){throw new f("ObjectId",t,this.path,e,this)}},c.prototype.$conditionalHandlers=h.options(o.prototype.$conditionalHandlers,{$gt:l,$gte:l,$lt:l,$lte:l}),d.$runBeforeSetters=!0,t.exports=c},function(t,e,r){"use strict";function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){if(!(t instanceof e))throw new TypeError("Cannot call a class as a function")}function o(t,e){return(o=Object.setPrototypeOf||function(t,e){return t.__proto__=e,t})(t,e)}function s(t){var e=function(){if("undefined"==typeof Reflect||!Reflect.construct)return!1;if(Reflect.construct.sham)return!1;if("function"==typeof Proxy)return!0;try{return Boolean.prototype.valueOf.call(Reflect.construct(Boolean,[],(function(){}))),!0}catch(t){return!1}}();return function(){var r,n=u(t);if(e){var i=u(this).constructor;r=Reflect.construct(n,arguments,i)}else r=n.apply(this,arguments);return a(this,r)}}function a(t,e){if(e&&("object"===n(e)||"function"==typeof e))return e;if(void 0!==e)throw new TypeError("Derived constructors may only return object or undefined");return function(t){if(void 0===t)throw new ReferenceError("this hasn't been initialised - super() hasn't been called");return t}(t)}function u(t){return(u=Object.setPrototypeOf?Object.getPrototypeOf:function(t){return t.__proto__||Object.getPrototypeOf(t)})(t)}var h=function(t){!function(t,e){if("function"!=typeof e&&null!==e)throw new TypeError("Super expression must either be null or a function");t.prototype=Object.create(e&&e.prototype,{constructor:{value:t,writable:!0,configurable:!0}}),e&&o(t,e)}(r,t);var e=s(r);function r(){return i(this,r),e.apply(this,arguments)}return r}(r(15)),f=r(16);Object.defineProperty(h.prototype,"auto",f),Object.defineProperty(h.prototype,"populate",f), +/*! + * ignore + */ +t.exports=h},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n=r(9),i=n.CastError,o=r(31),s=r(334),a=r(4);function u(t,e){n.call(this,t,e,"Decimal128")} +/*! + * ignore + */ +function h(t){return this.cast(t)}u.schemaName="Decimal128",u.defaultOptions={}, +/*! + * Inherits from SchemaType. + */ +u.prototype=Object.create(n.prototype),u.prototype.constructor=u, +/*! + * ignore + */ +u._cast=s,u.set=n.set,u.cast=function(t){return 0===arguments.length||(!1===t&&(t=this._defaultCaster),this._cast=t),this._cast}, +/*! + * ignore + */ +u._defaultCaster=function(t){if(null!=t&&!(t instanceof o))throw new Error;return t}, +/*! 
+ * ignore + */ +u._checkRequired=function(t){return t instanceof o},u.checkRequired=n.checkRequired,u.prototype.checkRequired=function(t,e){return n._isRef(this,t,e,!0)?!!t:("function"==typeof this.constructor.checkRequired?this.constructor.checkRequired():u.checkRequired())(t)},u.prototype.cast=function(t,e,r){if(n._isRef(this,t,e,r))return t instanceof o?t:this._castRef(t,e,r);var s;s="function"==typeof this._castFunction?this._castFunction:"function"==typeof this.constructor.cast?this.constructor.cast():u.cast();try{return s(t)}catch(e){throw new i("Decimal128",t,this.path,e,this)}},u.prototype.$conditionalHandlers=a.options(n.prototype.$conditionalHandlers,{$gt:h,$gte:h,$lt:h,$lte:h}), +/*! + * Module exports. + */ +t.exports=u},function(t,e,r){"use strict";(function(e){function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}var i=r(31),o=r(33);t.exports=function(t){return null==t?t:"object"===n(t)&&"string"==typeof t.$numberDecimal?i.fromString(t.$numberDecimal):t instanceof i?t:"string"==typeof t?i.fromString(t):e.isBuffer(t)?new i(t):"number"==typeof t?i.fromString(String(t)):"function"==typeof t.valueOf&&"string"==typeof t.valueOf()?i.fromString(t.valueOf()):void o.ok(!1)}}).call(this,r(3).Buffer)},function(t,e,r){"use strict";(function(e){ +/*! + * ignore + */ +function n(t){return(n="function"==typeof Symbol&&"symbol"==typeof Symbol.iterator?function(t){return typeof t}:function(t){return t&&"function"==typeof Symbol&&t.constructor===Symbol&&t!==Symbol.prototype?"symbol":typeof t})(t)}function i(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return o(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return o(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,i=function(){};return{s:i,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:i}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function o(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r0)&&!(e instanceof t)&&!(e instanceof o)&&!(e instanceof i)}e.flatten= +/*! 
+ * ignore + */ +function e(r,n,i,o){var s;s=r&&a(r)&&!t.isBuffer(r)?Object.keys(r.toObject({transform:!1,virtuals:!1})):Object.keys(r||{});var h=s.length,f={};n=n?n+".":"";for(var c=0;c=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r1){i=new Set;var u,h=n(s);try{for(h.s();!(u=h.n()).done;){var f=u.value;a.has(f)&&i.add(f)}}catch(t){h.e(t)}finally{h.f()}var c,l=n(a);try{for(l.s();!(c=l.n()).done;){var d=c.value;i.has(d)||i.add(d)}}catch(t){l.e(t)}finally{l.f()}i=Array.from(i)}else i=Array.from(a);return i}},function(t,e,r){"use strict";var n=r(92); +/*! + * ignore + */t.exports=function(t){if(null==t)return null;var e=Object.keys(t),r=e.length,i=null;if(1===r&&"_id"===e[0])i=!t._id;else for(;r--;)if("_id"!==e[r]&&n(t[e[r]])){i=!t[e[r]];break}return i}},function(t,e,r){"use strict"; +/*! + * If populating a path within a document array, make sure each + * subdoc within the array knows its subpaths are populated. 
+ * + * ####Example: + * const doc = await Article.findOne().populate('comments.author'); + * doc.comments[0].populated('author'); // Should be set + */function n(t,e){var r="undefined"!=typeof Symbol&&t[Symbol.iterator]||t["@@iterator"];if(!r){if(Array.isArray(t)||(r=function(t,e){if(!t)return;if("string"==typeof t)return i(t,e);var r=Object.prototype.toString.call(t).slice(8,-1);"Object"===r&&t.constructor&&(r=t.constructor.name);if("Map"===r||"Set"===r)return Array.from(t);if("Arguments"===r||/^(?:Ui|I)nt(?:8|16|32)(?:Clamped)?Array$/.test(r))return i(t,e)}(t))||e&&t&&"number"==typeof t.length){r&&(t=r);var n=0,o=function(){};return{s:o,n:function(){return n>=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:o}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function i(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r=t.length?{done:!0}:{done:!1,value:t[n++]}},e:function(t){throw t},f:i}}throw new TypeError("Invalid attempt to iterate non-iterable instance.\nIn order to be iterable, non-array objects must have a [Symbol.iterator]() method.")}var s,a=!0,u=!1;return{s:function(){r=r.call(t)},n:function(){var t=r.next();return a=t.done,t},e:function(t){u=!0,s=t},f:function(){try{a||null==r.return||r.return()}finally{if(u)throw s}}}}function o(t,e){(null==e||e>t.length)&&(e=t.length);for(var r=0,n=new Array(e);r1&&!~o.indexOf(e)&&(t[e]=1));for(var h=e.split("."),f="",d=0;d0?".":"")+e[i],n.push(r);return n}},function(t,e,r){"use strict"; +/*! + * Module dependencies. + */var n=r(14),i=r(13).EventEmitter,o=r(8),s=r(85),a=r(18),u=o.ValidationError,h=r(159),f=r(32);function c(t,e,r,i,u){if(!(this instanceof c))return new c(t,e,r,i,u);if(f(e)&&!e.instanceOfSchema&&(e=new s(e)),e=this.schema||e,!this.schema&&e.options._id&&void 0===(t=t||{})._id&&(t._id=new a),!e)throw new o.MissingSchemaError;for(var l in this.$__setSchema(e),n.call(this,t,r,i,u),h(this,e,{decorateDoc:!0}),e.methods)this[l]=e.methods[l];for(var d in e.statics)this[d]=e.statics[d]} +/*! + * Inherit from the NodeJS document + */c.prototype=Object.create(n.prototype),c.prototype.constructor=c, +/*! + * ignore + */ +c.events=new i, +/*! + * Browser doc exposes the event emitter API + */ +c.$emitter=new i,["on","once","emit","listeners","removeListener","setMaxListeners","removeAllListeners","addListener"].forEach((function(t){c[t]=function(){return c.$emitter[t].apply(c.$emitter,arguments)}})), +/*! + * Module exports. + */ +c.ValidationError=u,t.exports=c}])})); \ No newline at end of file diff --git a/node_modules/mongoose/index.d.ts b/node_modules/mongoose/index.d.ts new file mode 100644 index 000000000..2462a9a3f --- /dev/null +++ b/node_modules/mongoose/index.d.ts @@ -0,0 +1,3115 @@ +declare module 'mongoose' { + import events = require('events'); + import mongodb = require('mongodb'); + import mongoose = require('mongoose'); + import stream = require('stream'); + + export enum ConnectionStates { + disconnected = 0, + connected = 1, + connecting = 2, + disconnecting = 3, + uninitialized = 99, + } + + class NativeDate extends global.Date {} + + /** The Mongoose Date [SchemaType](/docs/schematypes.html). 
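+ *
+ * ####Example (illustrative sketch only; the schema and model names are assumptions):
+ *
+ *     import { Schema, model } from 'mongoose';
+ *     // `due` is declared with the Date SchemaType and will be cast to a JS Date
+ *     const taskSchema = new Schema({ title: String, due: Date });
+ *     const Task = model('Task', taskSchema);
+ *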
*/ + export type Date = Schema.Types.Date; + + /** + * The Mongoose Decimal128 [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that should be + * [128-bit decimal floating points](http://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-34-decimal.html). + * Do not use this to create a new Decimal128 instance, use `mongoose.Types.Decimal128` + * instead. + */ + export type Decimal128 = Schema.Types.Decimal128; + + /** + * The Mongoose Mixed [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that Mongoose's change tracking, casting, + * and validation should ignore. + */ + export type Mixed = Schema.Types.Mixed; + + /** + * Mongoose constructor. The exports object of the `mongoose` module is an instance of this + * class. Most apps will only use this one instance. + */ + export const Mongoose: new (options?: MongooseOptions | null) => typeof mongoose; + + /** + * The Mongoose Number [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that Mongoose should cast to numbers. + */ + export type Number = Schema.Types.Number; + + /** + * The Mongoose ObjectId [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that should be + * [MongoDB ObjectIds](https://docs.mongodb.com/manual/reference/method/ObjectId/). + * Do not use this to create a new ObjectId instance, use `mongoose.Types.ObjectId` + * instead. + */ + export type ObjectId = Schema.Types.ObjectId; + + export let Promise: any; + export const PromiseProvider: any; + + /** The various Mongoose SchemaTypes. */ + export const SchemaTypes: typeof Schema.Types; + + /** Expose connection states for user-land */ + export const STATES: typeof ConnectionStates; + + /** Opens Mongoose's default connection to MongoDB, see [connections docs](https://mongoosejs.com/docs/connections.html) */ + export function connect(uri: string, options: ConnectOptions, callback: CallbackWithoutResult): void; + export function connect(uri: string, callback: CallbackWithoutResult): void; + export function connect(uri: string, options?: ConnectOptions): Promise; + + /** The Mongoose module's default connection. Equivalent to `mongoose.connections[0]`, see [`connections`](#mongoose_Mongoose-connections). */ + export const connection: Connection; + + /** An array containing all connections associated with this Mongoose instance. */ + export const connections: Connection[]; + + /** + * Can be extended to explicitly type specific models. + */ + interface Models { + [modelName: string]: Model + } + + /** An array containing all models associated with this Mongoose instance. */ + export const models: Models; + /** Creates a Connection instance. */ + export function createConnection(uri: string, options?: ConnectOptions): Connection; + export function createConnection(): Connection; + export function createConnection(uri: string, options: ConnectOptions, callback: Callback): void; + + /** + * Removes the model named `name` from the default connection, if it exists. + * You can use this function to clean up any models you created in your tests to + * prevent OverwriteModelErrors. + */ + export function deleteModel(name: string | RegExp): typeof mongoose; + + export function disconnect(): Promise; + export function disconnect(cb: CallbackWithoutResult): void; + + /** Gets mongoose options */ + export function get(key: K): MongooseOptions[K]; + + /** + * Returns true if Mongoose can cast the given value to an ObjectId, or + * false otherwise. 
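+ *
+ * ####Example (illustrative sketch only; note that any value Mongoose can cast,
+ * including 12-character strings, also returns true):
+ *
+ *     import { isValidObjectId } from 'mongoose';
+ *     isValidObjectId('507f191e810c19729de860ea'); // true: 24-char hex string
+ *     isValidObjectId('not an object id');         // false
+ *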
+ */ + export function isValidObjectId(v: any): boolean; + + export function model(name: string, schema?: Schema | Schema, collection?: string, skipInit?: boolean): Model; + export function model( + name: string, + schema?: Schema, + collection?: string, + skipInit?: boolean + ): U; + + /** Returns an array of model names created on this instance of Mongoose. */ + export function modelNames(): Array; + + /** The node-mongodb-native driver Mongoose uses. */ + export const mongo: typeof mongodb; + + /** + * Mongoose uses this function to get the current time when setting + * [timestamps](/docs/guide.html#timestamps). You may stub out this function + * using a tool like [Sinon](https://www.npmjs.com/package/sinon) for testing. + */ + export function now(): NativeDate; + + /** Declares a global plugin executed on all Schemas. */ + export function plugin(fn: (schema: Schema, opts?: any) => void, opts?: any): typeof mongoose; + + /** Getter/setter around function for pluralizing collection names. */ + export function pluralize(fn?: ((str: string) => string) | null): ((str: string) => string) | null; + + /** Sets mongoose options */ + export function set(key: K, value: MongooseOptions[K]): typeof mongoose; + + /** + * _Requires MongoDB >= 3.6.0._ Starts a [MongoDB session](https://docs.mongodb.com/manual/release-notes/3.6/#client-sessions) + * for benefits like causal consistency, [retryable writes](https://docs.mongodb.com/manual/core/retryable-writes/), + * and [transactions](http://thecodebarbarian.com/a-node-js-perspective-on-mongodb-4-transactions.html). + */ + export function startSession(options?: mongodb.ClientSessionOptions): Promise; + export function startSession(options: mongodb.ClientSessionOptions, cb: Callback): void; + + /** The Mongoose version */ + export const version: string; + + export type CastError = Error.CastError; + + type Mongoose = typeof mongoose; + + interface MongooseOptions { + /** true by default. Set to false to skip applying global plugins to child schemas */ + applyPluginsToChildSchemas?: boolean; + + /** + * false by default. Set to true to apply global plugins to discriminator schemas. + * This typically isn't necessary because plugins are applied to the base schema and + * discriminators copy all middleware, methods, statics, and properties from the base schema. + */ + applyPluginsToDiscriminators?: boolean; + + /** + * Set to `true` to make Mongoose call` Model.createCollection()` automatically when you + * create a model with `mongoose.model()` or `conn.model()`. This is useful for testing + * transactions, change streams, and other features that require the collection to exist. + */ + autoCreate?: boolean; + + /** + * true by default. Set to false to disable automatic index creation + * for all models associated with this Mongoose instance. + */ + autoIndex?: boolean; + + /** enable/disable mongoose's buffering mechanism for all connections and models */ + bufferCommands?: boolean; + + bufferTimeoutMS?: number; + + /** false by default. Set to `true` to `clone()` all schemas before compiling into a model. */ + cloneSchemas?: boolean; + + /** + * If `true`, prints the operations mongoose sends to MongoDB to the console. + * If a writable stream is passed, it will log to that stream, without colorization. + * If a callback function is passed, it will receive the collection name, the method + * name, then all arguments passed to the method. 
For example, if you wanted to + * replicate the default logging, you could output from the callback + * `Mongoose: ${collectionName}.${methodName}(${methodArgs.join(', ')})`. + */ + debug?: + | boolean + | { color?: boolean; shell?: boolean } + | stream.Writable + | ((collectionName: string, methodName: string, ...methodArgs: any[]) => void); + + /** If set, attaches [maxTimeMS](https://docs.mongodb.com/manual/reference/operator/meta/maxTimeMS/) to every query */ + maxTimeMS?: number; + + /** + * true by default. Mongoose adds a getter to MongoDB ObjectId's called `_id` that + * returns `this` for convenience with populate. Set this to false to remove the getter. + */ + objectIdGetter?: boolean; + + /** + * Set to `true` to default to overwriting models with the same name when calling + * `mongoose.model()`, as opposed to throwing an `OverwriteModelError`. + */ + overwriteModels?: boolean; + + /** + * If `false`, changes the default `returnOriginal` option to `findOneAndUpdate()`, + * `findByIdAndUpdate`, and `findOneAndReplace()` to false. This is equivalent to + * setting the `new` option to `true` for `findOneAndX()` calls by default. Read our + * `findOneAndUpdate()` [tutorial](https://mongoosejs.com/docs/tutorials/findoneandupdate.html) + * for more information. + */ + returnOriginal?: boolean; + + /** + * false by default. Set to true to enable [update validators]( + * https://mongoosejs.com/docs/validation.html#update-validators + * ) for all validators by default. + */ + runValidators?: boolean; + + sanitizeFilter?: boolean; + + sanitizeProjection?: boolean; + + /** + * true by default. Set to false to opt out of Mongoose adding all fields that you `populate()` + * to your `select()`. The schema-level option `selectPopulatedPaths` overwrites this one. + */ + selectPopulatedPaths?: boolean; + + setDefaultsOnInsert?: boolean; + + /** true by default, may be `false`, `true`, or `'throw'`. Sets the default strict mode for schemas. */ + strict?: boolean | 'throw'; + + /** + * `{ transform: true, flattenDecimals: true }` by default. Overwrites default objects to + * `toJSON()`, for determining how Mongoose documents get serialized by `JSON.stringify()` + */ + toJSON?: ToObjectOptions; + + /** `{ transform: true, flattenDecimals: true }` by default. Overwrites default objects to `toObject()` */ + toObject?: ToObjectOptions; + } + + // eslint-disable-next-line @typescript-eslint/no-empty-interface + interface ClientSession extends mongodb.ClientSession { } + + interface ConnectOptions extends mongodb.MongoClientOptions { + /** Set to false to [disable buffering](http://mongoosejs.com/docs/faq.html#callback_never_executes) on all models associated with this connection. */ + bufferCommands?: boolean; + /** The name of the database you want to use. If not provided, Mongoose uses the database name from connection string. */ + dbName?: string; + /** username for authentication, equivalent to `options.auth.user`. Maintained for backwards compatibility. */ + user?: string; + /** password for authentication, equivalent to `options.auth.password`. Maintained for backwards compatibility. */ + pass?: string; + /** Set to false to disable automatic index creation for all models associated with this connection. */ + autoIndex?: boolean; + /** Set to `true` to make Mongoose automatically call `createCollection()` on every model created on this connection. 
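+ *
+ * ####Example (illustrative sketch only; the URI and database name are assumptions):
+ *
+ *     import mongoose from 'mongoose';
+ *     await mongoose.connect('mongodb://localhost:27017', {
+ *       dbName: 'test',
+ *       autoCreate: true, // create collections up front, e.g. for transactions
+ *     });
+ *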
*/ + autoCreate?: boolean; + } + + class Connection extends events.EventEmitter { + /** Returns a promise that resolves when this connection successfully connects to MongoDB */ + asPromise(): Promise; + + /** Closes the connection */ + close(callback: CallbackWithoutResult): void; + close(force: boolean, callback: CallbackWithoutResult): void; + close(force?: boolean): Promise; + + /** Retrieves a collection, creating it if not cached. */ + collection(name: string, options?: mongodb.CreateCollectionOptions): Collection; + + /** A hash of the collections associated with this connection */ + collections: { [index: string]: Collection }; + + /** A hash of the global options that are associated with this connection */ + config: any; + + /** The mongodb.Db instance, set when the connection is opened */ + db: mongodb.Db; + + /** + * Helper for `createCollection()`. Will explicitly create the given collection + * with specified options. Used to create [capped collections](https://docs.mongodb.com/manual/core/capped-collections/) + * and [views](https://docs.mongodb.com/manual/core/views/) from mongoose. + */ + createCollection(name: string, options?: mongodb.CreateCollectionOptions): Promise; + createCollection(name: string, cb: Callback): void; + createCollection(name: string, options: mongodb.CreateCollectionOptions, cb?: Callback): Promise; + + /** + * Removes the model named `name` from this connection, if it exists. You can + * use this function to clean up any models you created in your tests to + * prevent OverwriteModelErrors. + */ + deleteModel(name: string): this; + + /** + * Helper for `dropCollection()`. Will delete the given collection, including + * all documents and indexes. + */ + dropCollection(collection: string): Promise; + dropCollection(collection: string, cb: CallbackWithoutResult): void; + + /** + * Helper for `dropDatabase()`. Deletes the given database, including all + * collections, documents, and indexes. + */ + dropDatabase(): Promise; + dropDatabase(cb: CallbackWithoutResult): void; + + /** Gets the value of the option `key`. Equivalent to `conn.options[key]` */ + get(key: string): any; + + /** + * Returns the [MongoDB driver `MongoClient`](http://mongodb.github.io/node-mongodb-native/3.5/api/MongoClient.html) instance + * that this connection uses to talk to MongoDB. + */ + getClient(): mongodb.MongoClient; + + /** + * The host name portion of the URI. If multiple hosts, such as a replica set, + * this will contain the first host name in the URI + */ + host: string; + + /** + * A number identifier for this connection. Used for debugging when + * you have [multiple connections](/docs/connections.html#multiple_connections). + */ + id: number; + + /** + * A [POJO](https://masteringjs.io/tutorials/fundamentals/pojo) containing + * a map from model names to models. Contains all models that have been + * added to this connection using [`Connection#model()`](/docs/api/connection.html#connection_Connection-model). + */ + models: { [index: string]: Model }; + + /** Defines or retrieves a model. */ + model(name: string, schema?: Schema, collection?: string): Model; + model( + name: string, + schema?: Schema, + collection?: string, + skipInit?: boolean + ): U; + + /** Returns an array of model names created on this connection. */ + modelNames(): Array; + + /** The name of the database this connection points to. */ + name: string; + + /** Opens the connection with a URI using `MongoClient.connect()`. 
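+ *
+ * ####Example (illustrative sketch only; the URI is an assumption):
+ *
+ *     import { createConnection } from 'mongoose';
+ *     const conn = createConnection();                      // no URI yet
+ *     await conn.openUri('mongodb://localhost:27017/test'); // connect later
+ *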
*/ + openUri(uri: string, options?: ConnectOptions): Promise; + openUri(uri: string, callback: (err: CallbackError, conn?: Connection) => void): Connection; + openUri(uri: string, options: ConnectOptions, callback: (err: CallbackError, conn?: Connection) => void): Connection; + + /** The password specified in the URI */ + pass: string; + + /** + * The port portion of the URI. If multiple hosts, such as a replica set, + * this will contain the port from the first host name in the URI. + */ + port: number; + + /** Declares a plugin executed on all schemas you pass to `conn.model()` */ + plugin(fn: (schema: Schema, opts?: any) => void, opts?: any): Connection; + + /** The plugins that will be applied to all models created on this connection. */ + plugins: Array; + + /** + * Connection ready state + * + * - 0 = disconnected + * - 1 = connected + * - 2 = connecting + * - 3 = disconnecting + */ + readyState: number; + + /** Sets the value of the option `key`. Equivalent to `conn.options[key] = val` */ + set(key: string, value: any): any; + + /** + * Set the [MongoDB driver `MongoClient`](http://mongodb.github.io/node-mongodb-native/3.5/api/MongoClient.html) instance + * that this connection uses to talk to MongoDB. This is useful if you already have a MongoClient instance, and want to + * reuse it. + */ + setClient(client: mongodb.MongoClient): this; + + /** + * _Requires MongoDB >= 3.6.0._ Starts a [MongoDB session](https://docs.mongodb.com/manual/release-notes/3.6/#client-sessions) + * for benefits like causal consistency, [retryable writes](https://docs.mongodb.com/manual/core/retryable-writes/), + * and [transactions](http://thecodebarbarian.com/a-node-js-perspective-on-mongodb-4-transactions.html). + */ + startSession(options?: mongodb.ClientSessionOptions): Promise; + startSession(options: mongodb.ClientSessionOptions, cb: Callback): void; + + /** + * _Requires MongoDB >= 3.6.0._ Executes the wrapped async function + * in a transaction. Mongoose will commit the transaction if the + * async function executes successfully and attempt to retry if + * there was a retriable error. + */ + transaction(fn: (session: mongodb.ClientSession) => Promise): Promise; + + /** Switches to a different database using the same connection pool. */ + useDb(name: string, options?: { useCache?: boolean, noListener?: boolean }): Connection; + + /** The username specified in the URI */ + user: string; + + /** Watches the entire underlying database for changes. Similar to [`Model.watch()`](/docs/api/model.html#model_Model.watch). */ + watch(pipeline?: Array, options?: mongodb.ChangeStreamOptions): mongodb.ChangeStream; + } + + /* + * section collection.js + * http://mongoosejs.com/docs/api.html#collection-js + */ + interface CollectionBase extends mongodb.Collection { + /* + * Abstract methods. Some of these are already defined on the + * mongodb.Collection interface so they've been commented out. 
+ */ + ensureIndex(...args: any[]): any; + findAndModify(...args: any[]): any; + getIndexes(...args: any[]): any; + + /** The collection name */ + collectionName: string; + /** The Connection instance */ + conn: Connection; + /** The collection name */ + name: string; + } + + /* + * section drivers/node-mongodb-native/collection.js + * http://mongoosejs.com/docs/api.html#drivers-node-mongodb-native-collection-js + */ + let Collection: Collection; + interface Collection extends CollectionBase { + /** + * Collection constructor + * @param name name of the collection + * @param conn A MongooseConnection instance + * @param opts optional collection options + */ + // eslint-disable-next-line @typescript-eslint/no-misused-new + new(name: string, conn: Connection, opts?: any): Collection; + /** Formatter for debug print args */ + $format(arg: any): string; + /** Debug print helper */ + $print(name: any, i: any, args: any[]): void; + /** Retrieves information about this collections indexes. */ + getIndexes(): any; + } + + class Document { + constructor(doc?: any); + + /** This documents _id. */ + _id?: T; + + /** This documents __v. */ + __v?: any; + + /* Get all subdocs (by bfs) */ + $getAllSubdocs(): Document[]; + + /** Don't run validation on this path or persist changes to this path. */ + $ignore(path: string): void; + + /** Checks if a path is set to its default. */ + $isDefault(path: string): boolean; + + /** Getter/setter, determines whether the document was removed or not. */ + $isDeleted(val?: boolean): boolean; + + /** Returns an array of all populated documents associated with the query */ + $getPopulatedDocs(): Document[]; + + /** + * Returns true if the given path is nullish or only contains empty objects. + * Useful for determining whether this subdoc will get stripped out by the + * [minimize option](/docs/guide.html#minimize). + */ + $isEmpty(path: string): boolean; + + /** Checks if a path is invalid */ + $isValid(path: string): boolean; + + /** + * Empty object that you can use for storing properties on the document. This + * is handy for passing data to middleware without conflicting with Mongoose + * internals. + */ + $locals: Record; + + /** Marks a path as valid, removing existing validation errors. */ + $markValid(path: string): void; + + /** + * A string containing the current operation that Mongoose is executing + * on this document. May be `null`, `'save'`, `'validate'`, or `'remove'`. + */ + $op: string | null; + + /** + * Getter/setter around the session associated with this document. Used to + * automatically set `session` if you `save()` a doc that you got from a + * query with an associated session. + */ + $session(session?: mongodb.ClientSession | null): mongodb.ClientSession; + + /** Alias for `set()`, used internally to avoid conflicts */ + $set(path: string, val: any, options?: any): this; + $set(path: string, val: any, type: any, options?: any): this; + $set(value: any): this; + + /** Set this property to add additional query filters when Mongoose saves this document and `isNew` is false. */ + $where: Record; + + /** If this is a discriminator model, `baseModelName` is the name of the base model. */ + baseModelName?: string; + + /** Collection the model uses. */ + collection: Collection; + + /** Connection the model uses. */ + db: Connection; + + /** Removes this document from the db. 
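+ *
+ * ####Example (illustrative sketch only; `Task` and `id` are assumptions):
+ *
+ *     const doc = await Task.findById(id);
+ *     if (doc != null) await doc.delete(); // issues a deleteOne for this _id
+ *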
*/ + delete(options?: QueryOptions): QueryWithHelpers; + delete(options: QueryOptions, cb?: Callback): void; + delete(cb: Callback): void; + + /** Removes this document from the db. */ + deleteOne(options?: QueryOptions): QueryWithHelpers; + deleteOne(options: QueryOptions, cb?: Callback): void; + deleteOne(cb: Callback): void; + + /** + * Takes a populated field and returns it to its unpopulated state. If called with + * no arguments, then all populated fields are returned to their unpopulated state. + */ + depopulate(path?: string | string[]): this; + + /** + * Returns the list of paths that have been directly modified. A direct + * modified path is a path that you explicitly set, whether via `doc.foo = 'bar'`, + * `Object.assign(doc, { foo: 'bar' })`, or `doc.set('foo', 'bar')`. + */ + directModifiedPaths(): Array; + + /** + * Returns true if this document is equal to another document. + * + * Documents are considered equal when they have matching `_id`s, unless neither + * document has an `_id`, in which case this function falls back to using + * `deepEqual()`. + */ + equals(doc: Document): boolean; + + /** Hash containing current validation errors. */ + errors?: Error.ValidationError; + + /** Returns the value of a path. */ + get(path: string, type?: any, options?: any): any; + + /** + * Returns the changes that happened to the document + * in the format that will be sent to MongoDB. + */ + getChanges(): UpdateQuery; + + /** The string version of this documents _id. */ + id?: any; + + /** Signal that we desire an increment of this documents version. */ + increment(): this; + + /** + * Initializes the document without setters or marking anything modified. + * Called internally after a document is returned from mongodb. Normally, + * you do **not** need to call this function on your own. + */ + init(obj: any, opts?: any, cb?: Callback): this; + + /** Marks a path as invalid, causing validation to fail. */ + invalidate(path: string, errorMsg: string | NativeError, value?: any, kind?: string): NativeError | null; + + /** Returns true if `path` was directly set and modified, else false. */ + isDirectModified(path: string): boolean; + + /** Checks if `path` was explicitly selected. If no projection, always returns true. */ + isDirectSelected(path: string): boolean; + + /** Checks if `path` is in the `init` state, that is, it was set by `Document#init()` and not modified since. */ + isInit(path: string): boolean; + + /** + * Returns true if any of the given paths is modified, else false. If no arguments, returns `true` if any path + * in this document is modified. + */ + isModified(path?: string | Array): boolean; + + /** Boolean flag specifying if the document is new. */ + isNew: boolean; + + /** Checks if `path` was selected in the source query which initialized this document. */ + isSelected(path: string): boolean; + + /** Marks the path as having pending changes to write to the db. */ + markModified(path: string, scope?: any): void; + + /** Returns the list of paths that have been modified. */ + modifiedPaths(options?: { includeChildren?: boolean }): Array; + + /** The name of the model */ + modelName: string; + + /** + * Overwrite all values in this document with the values of `obj`, except + * for immutable properties. Behaves similarly to `set()`, except for it + * unsets all properties that aren't in `obj`. + */ + overwrite(obj: DocumentDefinition): this; + + /** + * If this document is a subdocument or populated document, returns the + * document's parent. 
Returns undefined otherwise. + */ + $parent(): Document | undefined; + + /** Populates document references. */ + populate(path: string | PopulateOptions | (string | PopulateOptions)[]): Promise; + populate(path: string | PopulateOptions | (string | PopulateOptions)[], callback: Callback): void; + populate(path: string, names: string): Promise; + populate(path: string, names: string, callback: Callback): void; + + /** Gets _id(s) used during population of the given `path`. If the path was not populated, returns `undefined`. */ + populated(path: string): any; + + /** Removes this document from the db. */ + remove(options?: QueryOptions): Promise; + remove(options?: QueryOptions, cb?: Callback): void; + + /** Sends a replaceOne command with this document `_id` as the query selector. */ + replaceOne(replacement?: DocumentDefinition, options?: QueryOptions | null, callback?: Callback): Query; + replaceOne(replacement?: Object, options?: QueryOptions | null, callback?: Callback): Query; + + /** Saves this document by inserting a new document into the database if [document.isNew](/docs/api.html#document_Document-isNew) is `true`, or sends an [updateOne](/docs/api.html#document_Document-updateOne) operation with just the modified paths if `isNew` is `false`. */ + save(options?: SaveOptions): Promise; + save(options?: SaveOptions, fn?: Callback): void; + save(fn?: Callback): void; + + /** The document's schema. */ + schema: Schema; + + /** Sets the value of a path, or many paths. */ + set(path: string, val: any, options?: any): this; + set(path: string, val: any, type: any, options?: any): this; + set(value: any): this; + + /** The return value of this method is used in calls to JSON.stringify(doc). */ + toJSON(options?: ToObjectOptions): LeanDocument; + toJSON(options?: ToObjectOptions): T; + + /** Converts this document into a plain-old JavaScript object ([POJO](https://masteringjs.io/tutorials/fundamentals/pojo)). */ + toObject(options?: ToObjectOptions): LeanDocument; + toObject(options?: ToObjectOptions): T; + + /** Clears the modified state on the specified path. */ + unmarkModified(path: string): void; + + /** Sends an update command with this document `_id` as the query selector. */ + update(update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): Query; + + /** Sends an updateOne command with this document `_id` as the query selector. */ + updateOne(update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): Query; + + /** Executes registered validation rules for this document. */ + validate(options:{ pathsToSkip?: pathsToSkip }): Promise; + validate(pathsToValidate?: pathsToValidate, options?: any): Promise; + validate(callback: CallbackWithoutResult): void; + validate(pathsToValidate: pathsToValidate, callback: CallbackWithoutResult): void; + validate(pathsToValidate: pathsToValidate, options: any, callback: CallbackWithoutResult): void; + + /** Executes registered validation rules (skipping asynchronous validators) for this document. */ + validateSync(options:{pathsToSkip?: pathsToSkip, [k:string]: any }): Error.ValidationError | null; + validateSync(pathsToValidate?: Array, options?: any): Error.ValidationError | null; + } + + /** A list of paths to validate. If set, Mongoose will validate only the modified paths that are in the given list. */ + type pathsToValidate = string[] | string; + /** A list of paths to skip. 
If set, Mongoose will validate every modified path that is not in this list. */ + type pathsToSkip = string[] | string; + + interface AcceptsDiscriminator { + /** Adds a discriminator type. */ + discriminator(name: string | number, schema: Schema, value?: string | number | ObjectId): Model; + discriminator(name: string | number, schema: Schema, value?: string | number | ObjectId): U; + } + + type AnyKeys = { [P in keyof T]?: T[P] | any }; + interface AnyObject { [k: string]: any } + + type Require_id = T extends { _id?: any } ? (T & { _id: T['_id'] }) : (T & { _id: Types.ObjectId }); + type EnforceDocument = T extends Document ? Require_id : (Document & Require_id & TVirtuals & TMethods); + + interface IndexesDiff { + /** Indexes that would be created in mongodb. */ + toCreate: Array + /** Indexes that would be dropped in mongodb. */ + toDrop: Array + } + + export const Model: Model; + interface Model extends NodeJS.EventEmitter, AcceptsDiscriminator { + new(doc?: AnyKeys & AnyObject, fields?: any | null, options?: boolean | AnyObject): EnforceDocument; + + aggregate(pipeline?: any[], options?: Record): Aggregate>; + aggregate(pipeline: any[], cb: Function): Promise>; + aggregate(pipeline: any[], options: Record, cb: Function): Promise>; + + /** Base Mongoose instance the model uses. */ + base: typeof mongoose; + + /** + * If this is a discriminator model, `baseModelName` is the name of + * the base model. + */ + baseModelName: string | undefined; + + /** + * Sends multiple `insertOne`, `updateOne`, `updateMany`, `replaceOne`, + * `deleteOne`, and/or `deleteMany` operations to the MongoDB server in one + * command. This is faster than sending multiple independent operations (e.g. + * if you use `create()`) because with `bulkWrite()` there is only one network + * round trip to the MongoDB server. + */ + bulkWrite(writes: Array, options?: mongodb.BulkWriteOptions): Promise; + bulkWrite(writes: Array, options?: mongodb.BulkWriteOptions, cb?: Callback): void; + + /** + * Sends multiple `save()` calls in a single `bulkWrite()`. This is faster than + * sending multiple `save()` calls because with `bulkSave()` there is only one + * network round trip to the MongoDB server. + */ + bulkSave(documents: Array): Promise; + + /** Collection the model uses. */ + collection: Collection; + + /** Creates a `count` query: counts the number of documents that match `filter`. */ + count(callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + count(filter: FilterQuery, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** Creates a `countDocuments` query: counts the number of documents that match `filter`. */ + countDocuments(callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + countDocuments(filter: FilterQuery, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** Creates a new document or documents */ + create(docs: (T | DocumentDefinition | AnyObject)[], options?: SaveOptions): Promise[]>; + create(docs: (T | DocumentDefinition | AnyObject)[], callback: Callback[]>): void; + create(doc: T | DocumentDefinition | AnyObject): Promise>; + create(doc: T | DocumentDefinition | AnyObject, callback: Callback>): void; + create>(docs: DocContents[], options?: SaveOptions): Promise[]>; + create>(docs: DocContents[], callback: Callback[]>): void; + create>(doc: DocContents): Promise>; + create>(...docs: DocContents[]): Promise[]>; + create>(doc: DocContents, callback: Callback>): void; + + /** + * Create the collection for this model. 
By default, if no indexes are specified, + * mongoose will not create the collection for the model until any documents are + * created. Use this method to create the collection explicitly. + */ + createCollection(options?: mongodb.CreateCollectionOptions): Promise; + createCollection(options: mongodb.CreateCollectionOptions | null, callback: Callback): void; + + /** + * Similar to `ensureIndexes()`, except for it uses the [`createIndex`](http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#createIndex) + * function. + */ + createIndexes(callback?: CallbackWithoutResult): Promise; + createIndexes(options?: any, callback?: CallbackWithoutResult): Promise; + + /** Connection the model uses. */ + db: Connection; + + /** + * Deletes all of the documents that match `conditions` from the collection. + * Behaves like `remove()`, but deletes all documents that match `conditions` + * regardless of the `single` option. + */ + deleteMany(filter?: FilterQuery, options?: QueryOptions, callback?: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + deleteMany(filter: FilterQuery, callback: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + deleteMany(callback: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + + /** + * Deletes the first document that matches `conditions` from the collection. + * Behaves like `remove()`, but deletes at most one document regardless of the + * `single` option. + */ + deleteOne(filter?: FilterQuery, options?: QueryOptions, callback?: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + deleteOne(filter: FilterQuery, callback: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + deleteOne(callback: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + + /** + * Sends `createIndex` commands to mongo for each index declared in the schema. + * The `createIndex` commands are sent in series. + */ + ensureIndexes(callback?: CallbackWithoutResult): Promise; + ensureIndexes(options?: any, callback?: CallbackWithoutResult): Promise; + + /** + * Event emitter that reports any errors that occurred. Useful for global error + * handling. + */ + events: NodeJS.EventEmitter; + + /** + * Finds a single document by its _id field. `findById(id)` is almost* + * equivalent to `findOne({ _id: id })`. If you want to query by a document's + * `_id`, use `findById()` instead of `findOne()`. + */ + findById(id: any, projection?: any | null, options?: QueryOptions | null, callback?: Callback | null>): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Finds one document. */ + findOne(filter?: FilterQuery, projection?: any | null, options?: QueryOptions | null, callback?: Callback | null>): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** + * Shortcut for creating a new Document from existing raw data, pre-saved in the DB. + * The document returned has no paths marked as modified initially. + */ + hydrate(obj: any): EnforceDocument; + + /** + * This function is responsible for building [indexes](https://docs.mongodb.com/manual/indexes/), + * unless [`autoIndex`](http://mongoosejs.com/docs/guide.html#autoIndex) is turned off. + * Mongoose calls this function automatically when a model is created using + * [`mongoose.model()`](/docs/api.html#mongoose_Mongoose-model) or + * [`connection.model()`](/docs/api.html#connection_Connection-model), so you + * don't need to call it. 
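+ *
+ * ####Example (illustrative sketch only; `User` is an assumed model whose schema
+ * declares a unique index):
+ *
+ *     await User.init(); // wait for index builds before relying on uniqueness
+ *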
+ */ + init(callback?: CallbackWithoutResult): Promise>; + + /** Inserts one or more new documents as a single `insertMany` call to the MongoDB server. */ + insertMany(docs: Array | AnyObject>, options: InsertManyOptions & { rawResult: true }): Promise; + insertMany(docs: Array | AnyObject>, options?: InsertManyOptions): Promise>>; + insertMany(doc: T | DocumentDefinition | AnyObject, options: InsertManyOptions & { rawResult: true }): Promise; + insertMany(doc: T | DocumentDefinition | AnyObject, options?: InsertManyOptions): Promise[]>; + insertMany(doc: T | DocumentDefinition | AnyObject, options?: InsertManyOptions, callback?: Callback[] | InsertManyResult>): void; + insertMany(docs: Array | AnyObject>, options?: InsertManyOptions, callback?: Callback> | InsertManyResult>): void; + + /** + * Lists the indexes currently defined in MongoDB. This may or may not be + * the same as the indexes defined in your schema depending on whether you + * use the [`autoIndex` option](/docs/guide.html#autoIndex) and if you + * build indexes manually. + */ + listIndexes(callback: Callback>): void; + listIndexes(): Promise>; + + /** The name of the model */ + modelName: string; + + /** Populates document references. */ + populate(docs: Array, options: PopulateOptions | Array | string, + callback?: Callback<(EnforceDocument)[]>): Promise>>; + populate(doc: any, options: PopulateOptions | Array | string, + callback?: Callback>): Promise>; + + /** + * Makes the indexes in MongoDB match the indexes defined in this model's + * schema. This function will drop any indexes that are not defined in + * the model's schema except the `_id` index, and build any indexes that + * are in your schema but not in MongoDB. + */ + syncIndexes(options?: Record): Promise>; + syncIndexes(options: Record | null, callback: Callback>): void; + + /** + * Does a dry-run of Model.syncIndexes(), meaning that + * the result of this function would be the result of + * Model.syncIndexes(). + */ + diffIndexes(options?: Record): Promise + diffIndexes(options: Record | null, callback: (err: CallbackError, diff: IndexesDiff) => void): void + + /** + * Starts a [MongoDB session](https://docs.mongodb.com/manual/release-notes/3.6/#client-sessions) + * for benefits like causal consistency, [retryable writes](https://docs.mongodb.com/manual/core/retryable-writes/), + * and [transactions](http://thecodebarbarian.com/a-node-js-perspective-on-mongodb-4-transactions.html). + * */ + startSession(options?: mongodb.ClientSessionOptions, cb?: Callback): Promise; + + /** Casts and validates the given object against this model's schema, passing the given `context` to custom validators. */ + validate(callback?: CallbackWithoutResult): Promise; + validate(optional: any, callback?: CallbackWithoutResult): Promise; + validate(optional: any, pathsToValidate: string[], callback?: CallbackWithoutResult): Promise; + + /** Watches the underlying collection for changes using [MongoDB change streams](https://docs.mongodb.com/manual/changeStreams/). */ + watch(pipeline?: Array>, options?: mongodb.ChangeStreamOptions): mongodb.ChangeStream; + + /** Adds a `$where` clause to this query */ + $where(argument: string | Function): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + + /** Registered discriminators for this model. 
*/ + discriminators: { [name: string]: Model } | undefined; + + /** Translate any aliases fields/conditions so the final query or document object is pure */ + translateAliases(raw: any): any; + + /** Creates a `distinct` query: returns the distinct values of the given `field` that match `filter`. */ + distinct(field: string, filter?: FilterQuery, callback?: Callback): QueryWithHelpers, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `estimatedDocumentCount` query: counts the number of documents in the collection. */ + estimatedDocumentCount(options?: QueryOptions, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** + * Returns true if at least one document exists in the database that matches + * the given `filter`, and false otherwise. + */ + exists(filter: FilterQuery): Promise; + exists(filter: FilterQuery, callback: Callback): void; + + /** Creates a `find` query: gets a list of documents that match `filter`. */ + find(callback?: Callback[]>): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + find(filter: FilterQuery, callback?: Callback): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + find(filter: FilterQuery, projection?: any | null, options?: QueryOptions | null, callback?: Callback[]>): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findByIdAndDelete` query, filtering by the given `_id`. */ + findByIdAndDelete(id?: mongodb.ObjectId | any, options?: QueryOptions | null, callback?: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findByIdAndRemove` query, filtering by the given `_id`. */ + findByIdAndRemove(id?: mongodb.ObjectId | any, options?: QueryOptions | null, callback?: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findOneAndUpdate` query, filtering by the given `_id`. */ + findByIdAndUpdate(id: mongodb.ObjectId | any, update: UpdateQuery, options: QueryOptions & { rawResult: true }, callback?: (err: CallbackError, doc: any, res: any) => void): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + findByIdAndUpdate(id: mongodb.ObjectId | any, update: UpdateQuery, options: QueryOptions & { upsert: true } & ReturnsNewDoc, callback?: (err: CallbackError, doc: EnforceDocument, res: any) => void): QueryWithHelpers, EnforceDocument, TQueryHelpers, T>; + findByIdAndUpdate(id?: mongodb.ObjectId | any, update?: UpdateQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + findByIdAndUpdate(id: mongodb.ObjectId | any, update: UpdateQuery, callback: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findOneAndDelete` query: atomically finds the given document, deletes it, and returns the document as it was before deletion. */ + findOneAndDelete(filter?: FilterQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findOneAndRemove` query: atomically finds the given document and deletes it. 
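+ *
+ * ####Example (illustrative sketch only; `Task` and `id` are assumptions):
+ *
+ *     const removed = await Task.findOneAndRemove({ _id: id });
+ *     // `removed` is the deleted document, or null if nothing matched
+ *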
*/ + findOneAndRemove(filter?: FilterQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findOneAndReplace` query: atomically finds the given document and replaces it with `replacement`. */ + findOneAndReplace(filter: FilterQuery, replacement: DocumentDefinition | AnyObject, options: QueryOptions & { upsert: true } & ReturnsNewDoc, callback?: (err: CallbackError, doc: EnforceDocument, res: any) => void): QueryWithHelpers, EnforceDocument, TQueryHelpers, T>; + findOneAndReplace(filter?: FilterQuery, replacement?: DocumentDefinition | AnyObject, options?: QueryOptions | null, callback?: (err: CallbackError, doc: EnforceDocument | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + /** Creates a `findOneAndUpdate` query: atomically find the first document that matches `filter` and apply `update`. */ + findOneAndUpdate(filter: FilterQuery, update: UpdateQuery, options: QueryOptions & { rawResult: true }, callback?: (err: CallbackError, doc: any, res: any) => void): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + findOneAndUpdate(filter: FilterQuery, update: UpdateQuery, options: QueryOptions & { upsert: true } & ReturnsNewDoc, callback?: (err: CallbackError, doc: EnforceDocument, res: any) => void): QueryWithHelpers, EnforceDocument, TQueryHelpers, T>; + findOneAndUpdate(filter?: FilterQuery, update?: UpdateQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: T | null, res: any) => void): QueryWithHelpers | null, EnforceDocument, TQueryHelpers, T>; + + geoSearch(filter?: FilterQuery, options?: GeoSearchOptions, callback?: Callback>>): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + + /** Executes a mapReduce command. */ + mapReduce( + o: MapReduceOptions, + callback?: Callback + ): Promise; + + remove(filter?: any, callback?: CallbackWithoutResult): QueryWithHelpers, TQueryHelpers, T>; + + /** Creates a `replaceOne` query: finds the first document that matches `filter` and replaces it with `replacement`. */ + replaceOne(filter?: FilterQuery, replacement?: DocumentDefinition, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + replaceOne(filter?: FilterQuery, replacement?: Object, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** Schema the model uses. */ + schema: Schema; + + /** + * @deprecated use `updateOne` or `updateMany` instead. + * Creates a `update` query: updates one or many documents that match `filter` with `update`, based on the `multi` option. + */ + update(filter?: FilterQuery, update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** Creates a `updateMany` query: updates all documents that match `filter` with `update`. */ + updateMany(filter?: FilterQuery, update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** Creates a `updateOne` query: updates the first document that matches `filter` with `update`. */ + updateOne(filter?: FilterQuery, update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers, TQueryHelpers, T>; + + /** Creates a Query, applies the passed conditions, and returns the Query. 
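+ *
+ * Minimal sketch (hypothetical `Task` model), chaining conditions onto the
+ * returned query:
+ *
+ * @example
+ * ```js
+ * // roughly equivalent to Task.find({ done: false })
+ * const open = await Task.where('done').equals(false).exec();
+ * ```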
*/ + where(path: string, val?: any): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + where(obj: object): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + where(): QueryWithHelpers>, EnforceDocument, TQueryHelpers, T>; + } + + type UpdateWriteOpResult = mongodb.UpdateResult; + + interface QueryOptions { + arrayFilters?: { [key: string]: any }[]; + batchSize?: number; + collation?: mongodb.CollationOptions; + comment?: any; + context?: string; + explain?: any; + fields?: any | string; + hint?: any; + /** + * If truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. + */ + lean?: boolean | any; + limit?: number; + maxTimeMS?: number; + maxscan?: number; + multi?: boolean; + multipleCastError?: boolean; + /** + * By default, `findOneAndUpdate()` returns the document as it was **before** + * `update` was applied. If you set `new: true`, `findOneAndUpdate()` will + * instead give you the object after `update` was applied. + */ + new?: boolean; + overwrite?: boolean; + overwriteDiscriminatorKey?: boolean; + populate?: string | string[] | PopulateOptions | PopulateOptions[]; + projection?: any; + /** + * if true, returns the raw result from the MongoDB driver + */ + rawResult?: boolean; + readPreference?: string | mongodb.ReadPreferenceMode; + /** + * An alias for the `new` option. `returnOriginal: false` is equivalent to `new: true`. + */ + returnOriginal?: boolean; + /** + * Another alias for the `new` option. `returnOriginal` is deprecated so this should be used. + */ + returnDocument?: string; + runValidators?: boolean; + /* Set to `true` to automatically sanitize potentially unsafe user-generated query projections */ + sanitizeProjection?: boolean; + /** + * Set to `true` to automatically sanitize potentially unsafe query filters by stripping out query selectors that + * aren't explicitly allowed using `mongoose.trusted()`. + */ + sanitizeFilter?: boolean; + /** The session associated with this query. */ + session?: mongodb.ClientSession; + setDefaultsOnInsert?: boolean; + skip?: number; + snapshot?: any; + sort?: any; + /** overwrites the schema's strict mode option */ + strict?: boolean | string; + /** + * equal to `strict` by default, may be `false`, `true`, or `'throw'`. Sets the default + * [strictQuery](https://mongoosejs.com/docs/guide.html#strictQuery) mode for schemas. + */ + strictQuery?: boolean | 'throw'; + tailable?: number; + /** + * If set to `false` and schema-level timestamps are enabled, + * skip timestamps for this update. Note that this allows you to overwrite + * timestamps. Does nothing if schema-level timestamps are not set. 
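+ *
+ * Illustrative only, assuming a hypothetical `Task` schema created with
+ * `timestamps: true`:
+ *
+ * @example
+ * ```js
+ * // rename without bumping the schema-managed `updatedAt` field
+ * await Task.updateOne({ _id: taskId }, { title: 'renamed' }, { timestamps: false });
+ * ```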
+ */ + timestamps?: boolean; + upsert?: boolean; + writeConcern?: any; + + [other: string]: any; + } + + type MongooseQueryOptions = Pick; + + interface SaveOptions { + checkKeys?: boolean; + j?: boolean; + safe?: boolean | WriteConcern; + session?: ClientSession | null; + timestamps?: boolean; + validateBeforeSave?: boolean; + validateModifiedOnly?: boolean; + w?: number | string; + wtimeout?: number; + } + + interface WriteConcern { + j?: boolean; + w?: number | 'majority' | TagSet; + wtimeout?: number; + } + + interface TagSet { + [k: string]: string; + } + + interface InsertManyOptions { + limit?: number; + rawResult?: boolean; + ordered?: boolean; + lean?: boolean; + session?: mongodb.ClientSession; + populate?: string | string[] | PopulateOptions | PopulateOptions[]; + } + + interface InsertManyResult extends mongodb.InsertManyResult { + mongoose?: { validationErrors?: Array } + } + + interface MapReduceOptions { + map: Function | string; + reduce: (key: Key, vals: T[]) => Val; + /** query filter object. */ + query?: any; + /** sort input objects using this key */ + sort?: any; + /** max number of documents */ + limit?: number; + /** keep temporary data default: false */ + keeptemp?: boolean; + /** finalize function */ + finalize?: (key: Key, val: Val) => Val; + /** scope variables exposed to map/reduce/finalize during execution */ + scope?: any; + /** it is possible to make the execution stay in JS. Provided in MongoDB > 2.0.X default: false */ + jsMode?: boolean; + /** provide statistics on job execution time. default: false */ + verbose?: boolean; + readPreference?: string; + /** sets the output target for the map reduce job. default: {inline: 1} */ + out?: { + /** the results are returned in an array */ + inline?: number; + /** + * {replace: 'collectionName'} add the results to collectionName: the + * results replace the collection + */ + replace?: string; + /** + * {reduce: 'collectionName'} add the results to collectionName: if + * dups are detected, uses the reducer / finalize functions + */ + reduce?: string; + /** + * {merge: 'collectionName'} add the results to collectionName: if + * dups exist the new docs overwrite the old + */ + merge?: string; + }; + } + + interface GeoSearchOptions { + /** x,y point to search for */ + near: number[]; + /** the maximum distance from the point near that a result can be */ + maxDistance: number; + /** The maximum number of results to return */ + limit?: number; + /** return the raw object instead of the Mongoose Model */ + lean?: boolean; + } + + interface PopulateOptions { + /** space delimited path(s) to populate */ + path: string; + /** fields to select */ + select?: any; + /** query conditions to match */ + match?: any; + /** optional model to use for population */ + model?: string | Model; + /** optional query options like sort, limit, etc */ + options?: any; + /** deep populate */ + populate?: PopulateOptions | Array; + /** + * If true Mongoose will always set `path` to an array, if false Mongoose will + * always set `path` to a document. Inferred from schema by default. + */ + justOne?: boolean; + /** transform function to call on every populated doc */ + transform?: (doc: any, id: any) => any; + } + + interface ToObjectOptions { + /** apply all getters (path and virtual getters) */ + getters?: boolean; + /** apply virtual getters (can override getters option) */ + virtuals?: boolean | string[]; + /** if `options.virtuals = true`, you can set `options.aliases = false` to skip applying aliases. 
This option is a no-op if `options.virtuals = false`. */ + aliases?: boolean; + /** remove empty objects (defaults to true) */ + minimize?: boolean; + /** if set, mongoose will call this function to allow you to transform the returned object */ + transform?: (doc: any, ret: any, options: any) => any; + /** if true, replace any conventionally populated paths with the original id in the output. Has no affect on virtual populated paths. */ + depopulate?: boolean; + /** if false, exclude the version key (`__v` by default) from the output */ + versionKey?: boolean; + /** if true, convert Maps to POJOs. Useful if you want to `JSON.stringify()` the result of `toObject()`. */ + flattenMaps?: boolean; + /** If true, omits fields that are excluded in this document's projection. Unless you specified a projection, this will omit any field that has `select: false` in the schema. */ + useProjection?: boolean; + } + + type MongooseDocumentMiddleware = 'validate' | 'save' | 'remove' | 'updateOne' | 'deleteOne' | 'init'; + type MongooseQueryMiddleware = 'count' | 'deleteMany' | 'deleteOne' | 'distinct' | 'find' | 'findOne' | 'findOneAndDelete' | 'findOneAndRemove' | 'findOneAndUpdate' | 'remove' | 'update' | 'updateOne' | 'updateMany'; + + type SchemaPreOptions = { document?: boolean, query?: boolean }; + type SchemaPostOptions = { document?: boolean, query?: boolean }; + + type ExtractQueryHelpers = M extends Model ? TQueryHelpers : {}; + type ExtractVirtuals = M extends Model ? TVirtuals : {}; + + type IndexDirection = 1 | -1 | '2d' | '2dsphere' | 'geoHaystack' | 'hashed' | 'text'; + type IndexDefinition = Record; + + type PreMiddlewareFunction = (this: T, next: (err?: CallbackError) => void) => void | Promise; + type PreSaveMiddlewareFunction = (this: T, next: (err?: CallbackError) => void, opts: SaveOptions) => void | Promise; + type PostMiddlewareFunction = (this: ThisType, res: ResType, next: (err?: CallbackError) => void) => void | Promise; + type ErrorHandlingMiddlewareFunction = (this: ThisType, err: NativeError, res: ResType, next: (err?: CallbackError) => void) => void; + + class Schema, TInstanceMethods = {}> extends events.EventEmitter { + /** + * Create a new schema + */ + constructor(definition?: SchemaDefinition>, options?: SchemaOptions); + + /** Adds key path / schema type pairs to this schema. */ + add(obj: SchemaDefinition> | Schema, prefix?: string): this; + + /** + * Array of child schemas (from document arrays and single nested subdocs) + * and their corresponding compiled models. Each element of the array is + * an object with 2 properties: `schema` and `model`. + */ + childSchemas: { schema: Schema, model: any }[]; + + /** Returns a copy of this schema */ + clone(): Schema; + + /** Object containing discriminators defined on this schema */ + discriminators?: { [name: string]: Schema }; + + /** Iterates the schemas paths similar to Array#forEach. */ + eachPath(fn: (path: string, type: SchemaType) => void): this; + + /** Defines an index (most likely compound) for this schema. */ + index(fields: IndexDefinition, options?: IndexOptions): this; + + /** + * Returns a list of indexes that this schema declares, via `schema.index()` + * or by `index: true` in a path's options. + */ + indexes(): Array; + + /** Gets a schema option. */ + get(key: K): SchemaOptions[K]; + + /** + * Loads an ES6 class into a schema. 
Maps [setters](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/set) + [getters](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/get), [static methods](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes/static), + * and [instance methods](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes#Class_body_and_method_definitions) + * to schema [virtuals](http://mongoosejs.com/docs/guide.html#virtuals), + * [statics](http://mongoosejs.com/docs/guide.html#statics), and + * [methods](http://mongoosejs.com/docs/guide.html#methods). + */ + loadClass(model: Function, onlyVirtuals?: boolean): this; + + /** Adds an instance method to documents constructed from Models compiled from this schema. */ + method(name: string, fn: (this: EnforceDocument>, ...args: any[]) => any, opts?: any): this; + method(obj: { [name: string]: (this: EnforceDocument>, ...args: any[]) => any }): this; + + /** Object of currently defined methods on this schema. */ + methods: { [name: string]: (this: EnforceDocument>, ...args: any[]) => any }; + + /** The original object passed to the schema constructor */ + obj: any; + + /** Gets/sets schema paths. */ + path(path: string): ResultType; + path(path: string, constructor: any): this; + + /** Lists all paths and their type in the schema. */ + paths: { + [key: string]: SchemaType; + } + + /** Returns the pathType of `path` for this schema. */ + pathType(path: string): string; + + /** Registers a plugin for this schema. */ + plugin(fn: (schema: Schema, opts?: any) => void, opts?: any): this; + + /** Defines a post hook for the model. */ + post>>(method: MongooseDocumentMiddleware | MongooseDocumentMiddleware[] | RegExp, fn: PostMiddlewareFunction): this; + post>>(method: MongooseDocumentMiddleware | MongooseDocumentMiddleware[] | RegExp, options: SchemaPostOptions, fn: PostMiddlewareFunction): this; + post = Query>(method: MongooseQueryMiddleware | MongooseQueryMiddleware[] | string | RegExp, fn: PostMiddlewareFunction): this; + post = Query>(method: MongooseQueryMiddleware | MongooseQueryMiddleware[] | string | RegExp, options: SchemaPostOptions, fn: PostMiddlewareFunction): this; + post = Aggregate>(method: 'aggregate' | RegExp, fn: PostMiddlewareFunction>): this; + post = Aggregate>(method: 'aggregate' | RegExp, options: SchemaPostOptions, fn: PostMiddlewareFunction>): this; + post(method: 'insertMany' | RegExp, fn: PostMiddlewareFunction): this; + post(method: 'insertMany' | RegExp, options: SchemaPostOptions, fn: PostMiddlewareFunction): this; + + post>>(method: MongooseDocumentMiddleware | MongooseDocumentMiddleware[] | RegExp, fn: ErrorHandlingMiddlewareFunction): this; + post>>(method: MongooseDocumentMiddleware | MongooseDocumentMiddleware[] | RegExp, options: SchemaPostOptions, fn: ErrorHandlingMiddlewareFunction): this; + post = Query>(method: MongooseQueryMiddleware | MongooseQueryMiddleware[] | string | RegExp, fn: ErrorHandlingMiddlewareFunction): this; + post = Query>(method: MongooseQueryMiddleware | MongooseQueryMiddleware[] | string | RegExp, options: SchemaPostOptions, fn: ErrorHandlingMiddlewareFunction): this; + post = Aggregate>(method: 'aggregate' | RegExp, fn: ErrorHandlingMiddlewareFunction>): this; + post = Aggregate>(method: 'aggregate' | RegExp, options: SchemaPostOptions, fn: ErrorHandlingMiddlewareFunction>): this; + post(method: 'insertMany' | RegExp, fn: ErrorHandlingMiddlewareFunction): this; + post(method: 'insertMany' | RegExp, options: 
SchemaPostOptions, fn: ErrorHandlingMiddlewareFunction): this; + + /** Defines a pre hook for the model. */ + pre>>(method: 'save', fn: PreSaveMiddlewareFunction): this; + pre>>(method: MongooseDocumentMiddleware | MongooseDocumentMiddleware[] | RegExp, fn: PreMiddlewareFunction): this; + pre>>(method: MongooseDocumentMiddleware | MongooseDocumentMiddleware[] | RegExp, options: SchemaPreOptions, fn: PreMiddlewareFunction): this; + pre = Query>(method: MongooseQueryMiddleware | MongooseQueryMiddleware[] | string | RegExp, fn: PreMiddlewareFunction): this; + pre = Query>(method: MongooseQueryMiddleware | MongooseQueryMiddleware[] | string | RegExp, options: SchemaPreOptions, fn: PreMiddlewareFunction): this; + pre = Aggregate>(method: 'aggregate' | RegExp, fn: PreMiddlewareFunction): this; + pre = Aggregate>(method: 'aggregate' | RegExp, options: SchemaPreOptions, fn: PreMiddlewareFunction): this; + pre(method: 'insertMany' | RegExp, fn: (this: T, next: (err?: CallbackError) => void, docs: any | Array) => void | Promise): this; + pre(method: 'insertMany' | RegExp, options: SchemaPreOptions, fn: (this: T, next: (err?: CallbackError) => void, docs: any | Array) => void | Promise): this; + + /** Object of currently defined query helpers on this schema. */ + query: { [name: string]: (this: QueryWithHelpers>, ...args: any[]) => any }; + + /** Adds a method call to the queue. */ + queue(name: string, args: any[]): this; + + /** Removes the given `path` (or [`paths`]). */ + remove(paths: string | Array): this; + + /** Returns an Array of path strings that are required by this schema. */ + requiredPaths(invalidate?: boolean): string[]; + + /** Sets a schema option. */ + set(key: K, value: SchemaOptions[K], _tags?: any): this; + + /** Adds static "class" methods to Models compiled from this schema. */ + static(name: string, fn: (this: M, ...args: any[]) => any): this; + static(obj: { [name: string]: (this: M, ...args: any[]) => any }): this; + + /** Object of currently defined statics on this schema. */ + statics: { [name: string]: (this: M, ...args: any[]) => any }; + + /** Creates a virtual type with the given name. */ + virtual(name: string, options?: VirtualTypeOptions): VirtualType; + + /** Object of currently defined virtuals on this schema */ + virtuals: any; + + /** Returns the virtual type with the given `name`. */ + virtualpath(name: string): VirtualType | null; + } + + type SchemaDefinitionWithBuiltInClass = T extends number + ? (typeof Number | 'number' | 'Number' | typeof Schema.Types.Number) + : T extends string + ? (typeof String | 'string' | 'String' | typeof Schema.Types.String) + : T extends boolean + ? (typeof Boolean | 'boolean' | 'Boolean' | typeof Schema.Types.Boolean) + : T extends NativeDate + ? (typeof NativeDate | 'date' | 'Date' | typeof Schema.Types.Date) + : (Function | string); + + type SchemaDefinitionProperty = SchemaDefinitionWithBuiltInClass | + SchemaTypeOptions | + typeof SchemaType | + Schema | + Schema[] | + SchemaTypeOptions[] | + Function[] | + SchemaDefinition | + SchemaDefinition[]; + + type SchemaDefinition = T extends undefined + ? { [path: string]: SchemaDefinitionProperty; } + : { [path in keyof T]?: SchemaDefinitionProperty; }; + + interface SchemaOptions { + /** + * By default, Mongoose's init() function creates all the indexes defined in your model's schema by + * calling Model.createIndexes() after you successfully connect to MongoDB. If you want to disable + * automatic index builds, you can set autoIndex to false. 
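+ *
+ * Hedged sketch (hypothetical `taskSchema` / `Task` names):
+ *
+ * @example
+ * ```js
+ * const taskSchema = new mongoose.Schema({ title: String }, { autoIndex: false });
+ * const Task = mongoose.model('Task', taskSchema);
+ * // build indexes explicitly instead of during init()
+ * await Task.createIndexes();
+ * ```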
+ */ + autoIndex?: boolean; + /** + * If set to `true`, Mongoose will call Model.createCollection() to create the underlying collection + * in MongoDB if autoCreate is set to true. Calling createCollection() sets the collection's default + * collation based on the collation option and establishes the collection as a capped collection if + * you set the capped schema option. + */ + autoCreate?: boolean; + /** + * By default, mongoose buffers commands when the connection goes down until the driver manages to reconnect. + * To disable buffering, set bufferCommands to false. + */ + bufferCommands?: boolean; + /** + * If bufferCommands is on, this option sets the maximum amount of time Mongoose buffering will wait before + * throwing an error. If not specified, Mongoose will use 10000 (10 seconds). + */ + bufferTimeoutMS?: number; + /** + * Mongoose supports MongoDBs capped collections. To specify the underlying MongoDB collection be capped, set + * the capped option to the maximum size of the collection in bytes. + */ + capped?: boolean | number | { size?: number; max?: number; autoIndexId?: boolean; }; + /** Sets a default collation for every query and aggregation. */ + collation?: mongodb.CollationOptions; + /** + * Mongoose by default produces a collection name by passing the model name to the utils.toCollectionName + * method. This method pluralizes the name. Set this option if you need a different name for your collection. + */ + collection?: string; + /** + * When you define a [discriminator](/docs/discriminators.html), Mongoose adds a path to your + * schema that stores which discriminator a document is an instance of. By default, Mongoose + * adds an `__t` path, but you can set `discriminatorKey` to overwrite this default. + */ + discriminatorKey?: string; + /** defaults to false. */ + emitIndexErrors?: boolean; + excludeIndexes?: any; + /** + * Mongoose assigns each of your schemas an id virtual getter by default which returns the document's _id field + * cast to a string, or in the case of ObjectIds, its hexString. + */ + id?: boolean; + /** + * Mongoose assigns each of your schemas an _id field by default if one is not passed into the Schema + * constructor. The type assigned is an ObjectId to coincide with MongoDB's default behavior. If you + * don't want an _id added to your schema at all, you may disable it using this option. + */ + _id?: boolean; + /** + * Mongoose will, by default, "minimize" schemas by removing empty objects. This behavior can be + * overridden by setting minimize option to false. It will then store empty objects. + */ + minimize?: boolean; + /** + * Optimistic concurrency is a strategy to ensure the document you're updating didn't change between when you + * loaded it using find() or findOne(), and when you update it using save(). Set to `true` to enable + * optimistic concurrency. + */ + optimisticConcurrency?: boolean; + /** + * Allows setting query#read options at the schema level, providing us a way to apply default ReadPreferences + * to all queries derived from a model. + */ + read?: string; + /** Allows setting write concern at the schema level. */ + writeConcern?: WriteConcern; + /** defaults to true. */ + safe?: boolean | { w?: number | string; wtimeout?: number; j?: boolean }; + /** + * The shardKey option is used when we have a sharded MongoDB architecture. Each sharded collection is + * given a shard key which must be present in all insert/update operations. We just need to set this + * schema option to the same shard key and we'll be all set. 
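+ *
+ * Illustrative sketch only; assumes the collection is already sharded on a
+ * hypothetical `tenantId` field:
+ *
+ * @example
+ * ```js
+ * const eventSchema = new mongoose.Schema(
+ *   { tenantId: String, name: String },
+ *   { shardKey: { tenantId: 1 } } // must match the collection's shard key
+ * );
+ * ```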
+ */ + shardKey?: Record; + /** + * The strict option, (enabled by default), ensures that values passed to our model constructor that were not + * specified in our schema do not get saved to the db. + */ + strict?: boolean | 'throw'; + /** + * equal to `strict` by default, may be `false`, `true`, or `'throw'`. Sets the default + * [strictQuery](https://mongoosejs.com/docs/guide.html#strictQuery) mode for schemas. + */ + strictQuery?: boolean | 'throw'; + /** Exactly the same as the toObject option but only applies when the document's toJSON method is called. */ + toJSON?: ToObjectOptions; + /** + * Documents have a toObject method which converts the mongoose document into a plain JavaScript object. + * This method accepts a few options. Instead of applying these options on a per-document basis, we may + * declare the options at the schema level and have them applied to all of the schema's documents by + * default. + */ + toObject?: ToObjectOptions; + /** + * By default, if you have an object with key 'type' in your schema, mongoose will interpret it as a + * type declaration. However, for applications like geoJSON, the 'type' property is important. If you want to + * control which key mongoose uses to find type declarations, set the 'typeKey' schema option. + */ + typeKey?: string; + + /** + * By default, documents are automatically validated before they are saved to the database. This is to + * prevent saving an invalid document. If you want to handle validation manually, and be able to save + * objects which don't pass validation, you can set validateBeforeSave to false. + */ + validateBeforeSave?: boolean; + /** + * The versionKey is a property set on each document when first created by Mongoose. This keys value + * contains the internal revision of the document. The versionKey option is a string that represents + * the path to use for versioning. The default is '__v'. + */ + versionKey?: string | boolean; + /** + * By default, Mongoose will automatically select() any populated paths for you, unless you explicitly exclude them. + */ + selectPopulatedPaths?: boolean; + /** + * skipVersioning allows excluding paths from versioning (i.e., the internal revision will not be + * incremented even if these paths are updated). DO NOT do this unless you know what you're doing. + * For subdocuments, include this on the parent document using the fully qualified path. + */ + skipVersioning?: any; + /** + * Validation errors in a single nested schema are reported + * both on the child and on the parent schema. + * Set storeSubdocValidationError to false on the child schema + * to make Mongoose only report the parent error. + */ + storeSubdocValidationError?: boolean; + /** + * The timestamps option tells mongoose to assign createdAt and updatedAt fields to your schema. The type + * assigned is Date. By default, the names of the fields are createdAt and updatedAt. Customize the + * field names by setting timestamps.createdAt and timestamps.updatedAt. + */ + timestamps?: boolean | SchemaTimestampsConfig; + } + + interface SchemaTimestampsConfig { + createdAt?: boolean | string; + updatedAt?: boolean | string; + currentTime?: () => (NativeDate | number); + } + + type Unpacked = T extends (infer U)[] ? + U : + T extends ReadonlyArray ? U : T; + + type AnyArray = T[] | ReadonlyArray; + + export class SchemaTypeOptions { + type?: + T extends string | number | boolean | NativeDate | Function ? SchemaDefinitionWithBuiltInClass : + T extends Schema ? T : + T extends object[] ? 
(AnyArray> | AnyArray>> | AnyArray>>) : + T extends string[] ? AnyArray> | AnyArray> : + T extends number[] ? AnyArray> | AnyArray> : + T extends boolean[] ? AnyArray> | AnyArray> : + T extends Function[] ? AnyArray> | AnyArray>> : + T | typeof SchemaType | Schema | SchemaDefinition; + + /** Defines a virtual with the given name that gets/sets this path. */ + alias?: string; + + /** Function or object describing how to validate this schematype. See [validation docs](https://mongoosejs.com/docs/validation.html). */ + validate?: RegExp | [RegExp, string] | Function | [Function, string] | ValidateOpts | ValidateOpts[]; + + /** Allows overriding casting logic for this individual path. If a string, the given string overwrites Mongoose's default cast error message. */ + cast?: string; + + /** + * If true, attach a required validator to this path, which ensures this path + * path cannot be set to a nullish value. If a function, Mongoose calls the + * function and only checks for nullish values if the function returns a truthy value. + */ + required?: boolean | (() => boolean) | [boolean, string] | [() => boolean, string]; + + /** + * The default value for this path. If a function, Mongoose executes the function + * and uses the return value as the default. + */ + default?: T | ((this: any, doc: any) => T); + + /** + * The model that `populate()` should use if populating this path. + */ + ref?: string | Model | ((this: any, doc: any) => string | Model); + + /** + * Whether to include or exclude this path by default when loading documents + * using `find()`, `findOne()`, etc. + */ + select?: boolean | number; + + /** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose will + * build an index on this path when the model is compiled. + */ + index?: boolean | number | IndexOptions | '2d' | '2dsphere' | 'hashed' | 'text'; + + /** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose + * will build a unique index on this path when the + * model is compiled. [The `unique` option is **not** a validator](/docs/validation.html#the-unique-option-is-not-a-validator). + */ + unique?: boolean | number; + + /** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose will + * disallow changes to this path once the document is saved to the database for the first time. Read more + * about [immutability in Mongoose here](http://thecodebarbarian.com/whats-new-in-mongoose-5-6-immutable-properties.html). + */ + immutable?: boolean | ((this: any, doc: any) => boolean); + + /** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose will + * build a sparse index on this path. + */ + sparse?: boolean | number; + + /** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose + * will build a text index on this path. + */ + text?: boolean | number | any; + + /** + * Define a transform function for this individual schema type. + * Only called when calling `toJSON()` or `toObject()`. + */ + transform?: (this: any, val: T) => any; + + /** defines a custom getter for this property using [`Object.defineProperty()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/defineProperty). */ + get?: (value: T, schematype?: this) => any; + + /** defines a custom setter for this property using [`Object.defineProperty()`](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/defineProperty). 
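+ *
+ * Minimal sketch (hypothetical `userSchema`): normalize a string before it is stored.
+ *
+ * @example
+ * ```js
+ * const userSchema = new mongoose.Schema({
+ *   email: { type: String, set: v => String(v).trim().toLowerCase() }
+ * });
+ * ```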
*/ + set?: (value: T, schematype?: this) => any; + + /** array of allowed values for this path. Allowed for strings, numbers, and arrays of strings */ + enum?: Array | ReadonlyArray | { values: Array | ReadonlyArray, message?: string } | { [path: string]: string | number | null }; + + /** The default [subtype](http://bsonspec.org/spec.html) associated with this buffer when it is stored in MongoDB. Only allowed for buffer paths */ + subtype?: number + + /** The minimum value allowed for this path. Only allowed for numbers and dates. */ + min?: number | NativeDate | [number, string] | [NativeDate, string] | readonly [number, string] | readonly [NativeDate, string]; + + /** The maximum value allowed for this path. Only allowed for numbers and dates. */ + max?: number | NativeDate | [number, string] | [NativeDate, string] | readonly [number, string] | readonly [NativeDate, string]; + + /** Defines a TTL index on this path. Only allowed for dates. */ + expires?: string | number; + + /** If `true`, Mongoose will skip gathering indexes on subpaths. Only allowed for subdocuments and subdocument arrays. */ + excludeIndexes?: boolean; + + /** If set, overrides the child schema's `_id` option. Only allowed for subdocuments and subdocument arrays. */ + _id?: boolean; + + /** If set, specifies the type of this map's values. Mongoose will cast this map's values to the given type. */ + of?: Function | SchemaDefinitionProperty; + + /** If true, uses Mongoose's default `_id` settings. Only allowed for ObjectIds */ + auto?: boolean; + + /** Attaches a validator that succeeds if the data string matches the given regular expression, and fails otherwise. */ + match?: RegExp | [RegExp, string] | readonly [RegExp, string]; + + /** If truthy, Mongoose will add a custom setter that lowercases this string using JavaScript's built-in `String#toLowerCase()`. */ + lowercase?: boolean; + + /** If truthy, Mongoose will add a custom setter that removes leading and trailing whitespace using JavaScript's built-in `String#trim()`. */ + trim?: boolean; + + /** If truthy, Mongoose will add a custom setter that uppercases this string using JavaScript's built-in `String#toUpperCase()`. */ + uppercase?: boolean; + + /** If set, Mongoose will add a custom validator that ensures the given string's `length` is at least the given number. */ + minlength?: number | [number, string] | readonly [number, string]; + + /** If set, Mongoose will add a custom validator that ensures the given string's `length` is at most the given number. */ + maxlength?: number | [number, string] | readonly [number, string]; + + [other: string]: any; + } + + export type RefType = + | number + | string + | Buffer + | undefined + | mongoose.Types.ObjectId + | mongoose.Types.Buffer + | typeof mongoose.Schema.Types.Number + | typeof mongoose.Schema.Types.String + | typeof mongoose.Schema.Types.Buffer + | typeof mongoose.Schema.Types.ObjectId; + + /** + * Reference another Model + */ + export type PopulatedDoc< + PopulatedType, + RawId extends RefType = (PopulatedType extends { _id?: RefType; } ? 
NonNullable : mongoose.Types.ObjectId) | undefined + > = PopulatedType | RawId; + + interface IndexOptions extends mongodb.CreateIndexesOptions { + /** + * `expires` utilizes the `ms` module from [guille](https://github.com/guille/) allowing us to use a friendlier syntax: + * + * @example + * ```js + * const schema = new Schema({ prop1: Date }); + * + * // expire in 24 hours + * schema.index({ prop1: 1 }, { expires: 60*60*24 }) + * + * // expire in 24 hours + * schema.index({ prop1: 1 }, { expires: '24h' }) + * + * // expire in 1.5 hours + * schema.index({ prop1: 1 }, { expires: '1.5h' }) + * + * // expire in 7 days + * schema.index({ prop1: 1 }, { expires: '7d' }) + * ``` + */ + expires?: number | string; + weights?: AnyObject; + } + + interface ValidatorProps { + path: string; + value: any; + } + + interface ValidatorMessageFn { + (props: ValidatorProps): string; + } + + interface ValidateFn { + (value: T): boolean; + } + + interface LegacyAsyncValidateFn { + (value: T, done: (result: boolean) => void): void; + } + + interface AsyncValidateFn { + (value: any): Promise; + } + + interface ValidateOpts { + msg?: string; + message?: string | ValidatorMessageFn; + type?: string; + validator: ValidateFn | LegacyAsyncValidateFn | AsyncValidateFn; + } + + interface VirtualTypeOptions { + /** If `ref` is not nullish, this becomes a populated virtual. */ + ref?: string | Function; + + /** The local field to populate on if this is a populated virtual. */ + localField?: string | Function; + + /** The foreign field to populate on if this is a populated virtual. */ + foreignField?: string | Function; + + /** + * By default, a populated virtual is an array. If you set `justOne`, + * the populated virtual will be a single doc or `null`. + */ + justOne?: boolean; + + /** If you set this to `true`, Mongoose will call any custom getters you defined on this virtual. */ + getters?: boolean; + + /** + * If you set this to `true`, `populate()` will set this virtual to the number of populated + * documents, as opposed to the documents themselves, using `Query#countDocuments()`. + */ + count?: boolean; + + /** Add an extra match condition to `populate()`. */ + match?: FilterQuery | Function; + + /** Add a default `limit` to the `populate()` query. */ + limit?: number; + + /** Add a default `skip` to the `populate()` query. */ + skip?: number; + + /** + * For legacy reasons, `limit` with `populate()` may give incorrect results because it only + * executes a single query for every document being populated. If you set `perDocumentLimit`, + * Mongoose will ensure correct `limit` per document by executing a separate query for each + * document to `populate()`. For example, `.find().populate({ path: 'test', perDocumentLimit: 2 })` + * will execute 2 additional queries if `.find()` returns 2 documents. + */ + perDocumentLimit?: number; + + /** Additional options like `limit` and `lean`. */ + options?: QueryOptions & { match?: AnyObject }; + + /** Additional options for plugins */ + [extra: string]: any; + } + + class VirtualType { + /** Applies getters to `value`. */ + applyGetters(value: any, doc: Document): any; + + /** Applies setters to `value`. */ + applySetters(value: any, doc: Document): any; + + /** Adds a custom getter to this virtual. */ + get(fn: Function): this; + + /** Adds a custom setter to this virtual. 
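+ *
+ * Hedged example (assumes a hypothetical `userSchema` with `first` and `last` paths):
+ *
+ * @example
+ * ```js
+ * userSchema.virtual('fullName')
+ *   .get(function() { return `${this.first} ${this.last}`; })
+ *   .set(function(v) { [this.first, this.last] = v.split(' '); });
+ * ```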
*/ + set(fn: Function): this; + } + + namespace Schema { + namespace Types { + class Array extends SchemaType implements AcceptsDiscriminator { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + static options: { castNonArrays: boolean; }; + + discriminator(name: string | number, schema: Schema, value?: string): Model; + discriminator(name: string | number, schema: Schema, value?: string): U; + + /** + * Adds an enum validator if this is an array of strings or numbers. Equivalent to + * `SchemaString.prototype.enum()` or `SchemaNumber.prototype.enum()` + */ + enum(vals: string[] | number[]): this; + } + + class Boolean extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** Configure which values get casted to `true`. */ + static convertToTrue: Set; + + /** Configure which values get casted to `false`. */ + static convertToFalse: Set; + } + + class Buffer extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** + * Sets the default [subtype](https://studio3t.com/whats-new/best-practices-uuid-mongodb/) + * for this buffer. You can find a [list of allowed subtypes here](http://api.mongodb.com/python/current/api/bson/binary.html). + */ + subtype(subtype: number): this; + } + + class Date extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** Declares a TTL index (rounded to the nearest second) for _Date_ types only. */ + expires(when: number | string): this; + + /** Sets a maximum date validator. */ + max(value: NativeDate, message: string): this; + + /** Sets a minimum date validator. */ + min(value: NativeDate, message: string): this; + } + + class Decimal128 extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + } + + class DocumentArray extends SchemaType implements AcceptsDiscriminator { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + static options: { castNonArrays: boolean; }; + + discriminator(name: string | number, schema: Schema, value?: string): Model; + discriminator(name: string | number, schema: Schema, value?: string): U; + + /** The schema used for documents in this array */ + schema: Schema; + } + + class Map extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + } + + class Mixed extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + } + + class Number extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** Sets a enum validator */ + enum(vals: number[]): this; + + /** Sets a maximum number validator. */ + max(value: number, message: string): this; + + /** Sets a minimum number validator. */ + min(value: number, message: string): this; + } + + class ObjectId extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** Adds an auto-generated ObjectId default if turnOn is true. 
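+ *
+ * Sketch only: the same behavior is usually reached through the `auto` schema
+ * option, shown here on a hypothetical `token` path:
+ *
+ * @example
+ * ```js
+ * const inviteSchema = new mongoose.Schema({
+ *   token: { type: mongoose.Schema.Types.ObjectId, auto: true } // defaults to a fresh ObjectId
+ * });
+ * ```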
*/ + auto(turnOn: boolean): this; + } + + class Subdocument extends SchemaType implements AcceptsDiscriminator { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** The document's schema */ + schema: Schema; + + discriminator(name: string | number, schema: Schema, value?: string): Model; + discriminator(name: string | number, schema: Schema, value?: string): U; + } + + class String extends SchemaType { + /** This schema type's name, to defend against minifiers that mangle function names. */ + static schemaName: string; + + /** Adds an enum validator */ + enum(vals: string[] | any): this; + + /** Adds a lowercase [setter](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set). */ + lowercase(shouldApply?: boolean): this; + + /** Sets a regexp validator. */ + match(value: RegExp, message: string): this; + + /** Sets a maximum length validator. */ + maxlength(value: number, message: string): this; + + /** Sets a minimum length validator. */ + minlength(value: number, message: string): this; + + /** Adds a trim [setter](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set). */ + trim(shouldTrim?: boolean): this; + + /** Adds an uppercase [setter](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set). */ + uppercase(shouldApply?: boolean): this; + } + } + } + + namespace Types { + class Array extends global.Array { + /** Pops the array atomically at most one time per document `save()`. */ + $pop(): T; + + /** Atomically shifts the array at most one time per document `save()`. */ + $shift(): T; + + /** Adds values to the array if not already present. */ + addToSet(...args: any[]): any[]; + + isMongooseArray: true; + + /** Pushes items to the array non-atomically. */ + nonAtomicPush(...args: any[]): number; + + /** Wraps [`Array#push`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/push) with proper change tracking. */ + push(...args: any[]): number; + + /** + * Pulls items from the array atomically. Equality is determined by casting + * the provided value to an embedded document and comparing using + * [the `Document.equals()` function.](./api.html#document_Document-equals) + */ + pull(...args: any[]): this; + + /** + * Alias of [pull](#mongoosearray_MongooseArray-pull) + */ + remove(...args: any[]): this; + + /** Sets the casted `val` at index `i` and marks the array modified. */ + set(i: number, val: T): this; + + /** Atomically shifts the array at most one time per document `save()`. */ + shift(): T; + + /** Returns a native js Array. */ + toObject(options?: ToObjectOptions): any; + toObject(options?: ToObjectOptions): T; + + /** Wraps [`Array#unshift`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/unshift) with proper change tracking. */ + unshift(...args: any[]): number; + } + + class Buffer extends global.Buffer { + /** Sets the subtype option and marks the buffer modified. */ + subtype(subtype: number | ToObjectOptions): void; + + /** Converts this buffer to its Binary type representation. */ + toObject(subtype?: number): mongodb.Binary; + } + + class Decimal128 extends mongodb.Decimal128 { } + + class DocumentArray extends Types.Array { + /** DocumentArray constructor */ + constructor(values: any[]); + + isMongooseDocumentArray: true; + + /** Creates a subdocument casted to this schema. */ + create(obj: any): T; + + /** Searches array items for the first document with a matching _id. 
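+ *
+ * Illustrative sketch; assumes a hypothetical `Task` model whose schema has a
+ * `subtasks` document array:
+ *
+ * @example
+ * ```js
+ * const task = await Task.findById(taskId);
+ * const sub = task.subtasks.id(subtaskId); // matching subdocument, or null
+ * ```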
*/ + id(id: any): T | null; + + push(...args: (AnyKeys & AnyObject)[]): number; + } + + class Map extends global.Map { + /** Converts a Mongoose map into a vanilla JavaScript map. */ + toObject(options?: ToObjectOptions & { flattenMaps?: boolean }): any; + } + + class ObjectId extends mongodb.ObjectId { + _id: this; + } + + class Subdocument extends Document { + $isSingleNested: true; + + /** Returns the top level document of this sub-document. */ + ownerDocument(): Document; + + /** Returns this sub-documents parent document. */ + parent(): Document; + + /** Returns this sub-documents parent document. */ + $parent(): Document; + } + } + + type ReturnsNewDoc = { new: true } | { returnOriginal: false } | {returnDocument: 'after'}; + + type QueryWithHelpers = Query & THelpers; + + class Query { + _mongooseOptions: MongooseQueryOptions; + + /** + * Returns a wrapper around a [mongodb driver cursor](http://mongodb.github.io/node-mongodb-native/2.1/api/Cursor.html). + * A QueryCursor exposes a Streams3 interface, as well as a `.next()` function. + * This is equivalent to calling `.cursor()` with no arguments. + */ + [Symbol.asyncIterator](): AsyncIterableIterator; + + /** Executes the query */ + exec(): Promise; + exec(callback?: Callback): void; + // @todo: this doesn't seem right + exec(callback?: Callback): Promise | any; + + $where(argument: string | Function): QueryWithHelpers; + + /** Specifies an `$all` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + all(val: Array): this; + all(path: string, val: Array): this; + + /** Sets the allowDiskUse option for the query (ignored for < 4.4.0) */ + allowDiskUse(value: boolean): this; + + /** Specifies arguments for an `$and` condition. */ + and(array: FilterQuery[]): this; + + /** Specifies the batchSize option. */ + batchSize(val: number): this; + + /** Specifies a `$box` condition */ + box(val: any): this; + box(lower: number[], upper: number[]): this; + + /** + * Casts this query to the schema of `model`. + * + * @param {Model} [model] the model to cast to. If not set, defaults to `this.model` + * @param {Object} [obj] If not set, defaults to this query's conditions + * @return {Object} the casted `obj` + */ + cast(model?: Model | null, obj?: any): any; + + /** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + * Like `.then()`, but only takes a rejection handler. + */ + catch: Promise['catch']; + + /** Specifies a `$center` or `$centerSphere` condition. */ + circle(area: any): this; + circle(path: string, area: any): this; + + /** Adds a collation to this op (MongoDB 3.4 and up) */ + collation(value: mongodb.CollationOptions): this; + + /** Specifies the `comment` option. */ + comment(val: string): this; + + /** Specifies this query as a `count` query. */ + count(callback?: Callback): QueryWithHelpers; + count(criteria: FilterQuery, callback?: Callback): QueryWithHelpers; + + /** Specifies this query as a `countDocuments` query. */ + countDocuments(callback?: Callback): QueryWithHelpers; + countDocuments(criteria: FilterQuery, callback?: Callback): QueryWithHelpers; + + /** + * Returns a wrapper around a [mongodb driver cursor](http://mongodb.github.io/node-mongodb-native/2.1/api/Cursor.html). + * A QueryCursor exposes a Streams3 interface, as well as a `.next()` function. + */ + cursor(options?: any): QueryCursor; + + /** + * Declare and/or execute this query as a `deleteMany()` operation. 
Works like + * remove, except it deletes _every_ document that matches `filter` in the + * collection, regardless of the value of `single`. + */ + deleteMany(filter?: FilterQuery, options?: QueryOptions, callback?: Callback): QueryWithHelpers; + deleteMany(filter: FilterQuery, callback: Callback): QueryWithHelpers; + deleteMany(callback: Callback): QueryWithHelpers; + + /** + * Declare and/or execute this query as a `deleteOne()` operation. Works like + * remove, except it deletes at most one document regardless of the `single` + * option. + */ + deleteOne(filter?: FilterQuery, options?: QueryOptions, callback?: Callback): QueryWithHelpers; + deleteOne(filter: FilterQuery, callback: Callback): QueryWithHelpers; + deleteOne(callback: Callback): QueryWithHelpers; + + /** Creates a `distinct` query: returns the distinct values of the given `field` that match `filter`. */ + distinct(field: string, filter?: FilterQuery, callback?: Callback): QueryWithHelpers, DocType, THelpers, RawDocType>; + + /** Specifies a `$elemMatch` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + elemMatch(val: Function | any): this; + elemMatch(path: string, val: Function | any): this; + + /** + * Gets/sets the error flag on this query. If this flag is not null or + * undefined, the `exec()` promise will reject without executing. + */ + error(): NativeError | null; + error(val: NativeError | null): this; + + /** Specifies the complementary comparison value for paths specified with `where()` */ + equals(val: any): this; + + /** Creates a `estimatedDocumentCount` query: counts the number of documents in the collection. */ + estimatedDocumentCount(options?: QueryOptions, callback?: Callback): QueryWithHelpers; + + /** Specifies a `$exists` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + exists(val: boolean): this; + exists(path: string, val: boolean): this; + + /** + * Sets the [`explain` option](https://docs.mongodb.com/manual/reference/method/cursor.explain/), + * which makes this query return detailed execution stats instead of the actual + * query result. This method is useful for determining what index your queries + * use. + */ + explain(verbose?: string): this; + + /** Creates a `find` query: gets a list of documents that match `filter`. */ + find(callback?: Callback): QueryWithHelpers, DocType, THelpers, RawDocType>; + find(filter: FilterQuery, callback?: Callback): QueryWithHelpers, DocType, THelpers, RawDocType>; + find(filter: FilterQuery, projection?: any | null, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers, DocType, THelpers, RawDocType>; + + /** Declares the query a findOne operation. When executed, the first found document is passed to the callback. */ + findOne(filter?: FilterQuery, projection?: any | null, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers; + + /** Creates a `findOneAndDelete` query: atomically finds the given document, deletes it, and returns the document as it was before deletion. */ + findOneAndDelete(filter?: FilterQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: DocType | null, res: any) => void): QueryWithHelpers; + + /** Creates a `findOneAndRemove` query: atomically finds the given document and deletes it. 
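+ *
+ * Hedged example of the chainable query form (hypothetical `Task` model):
+ *
+ * @example
+ * ```js
+ * // resolves with the removed document, or `null` if nothing matched
+ * const doc = await Task.where({ done: true }).findOneAndRemove();
+ * ```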
*/ + findOneAndRemove(filter?: FilterQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: DocType | null, res: any) => void): QueryWithHelpers; + + /** Creates a `findOneAndUpdate` query: atomically find the first document that matches `filter` and apply `update`. */ + findOneAndUpdate(filter: FilterQuery, update: UpdateQuery, options: QueryOptions & { rawResult: true }, callback?: (err: CallbackError, doc: DocType | null, res: mongodb.ModifyResult) => void): QueryWithHelpers, DocType, THelpers, RawDocType>; + findOneAndUpdate(filter: FilterQuery, update: UpdateQuery, options: QueryOptions & { upsert: true } & ReturnsNewDoc, callback?: (err: CallbackError, doc: DocType, res: mongodb.ModifyResult) => void): QueryWithHelpers; + findOneAndUpdate(filter?: FilterQuery, update?: UpdateQuery, options?: QueryOptions | null, callback?: (err: CallbackError, doc: DocType | null, res: mongodb.ModifyResult) => void): QueryWithHelpers; + + /** Creates a `findByIdAndDelete` query, filtering by the given `_id`. */ + findByIdAndDelete(id?: mongodb.ObjectId | any, options?: QueryOptions | null, callback?: (err: CallbackError, doc: DocType | null, res: any) => void): QueryWithHelpers; + + /** Creates a `findOneAndUpdate` query, filtering by the given `_id`. */ + findByIdAndUpdate(id: mongodb.ObjectId | any, update: UpdateQuery, options: QueryOptions & { rawResult: true }, callback?: (err: CallbackError, doc: any, res?: any) => void): QueryWithHelpers; + findByIdAndUpdate(id: mongodb.ObjectId | any, update: UpdateQuery, options: QueryOptions & { upsert: true } & ReturnsNewDoc, callback?: (err: CallbackError, doc: DocType, res?: any) => void): QueryWithHelpers; + findByIdAndUpdate(id?: mongodb.ObjectId | any, update?: UpdateQuery, options?: QueryOptions | null, callback?: (CallbackError: any, doc: DocType | null, res?: any) => void): QueryWithHelpers; + findByIdAndUpdate(id: mongodb.ObjectId | any, update: UpdateQuery, callback: (CallbackError: any, doc: DocType | null, res?: any) => void): QueryWithHelpers; + + /** Specifies a `$geometry` condition */ + geometry(object: { type: string, coordinates: any[] }): this; + + /** + * For update operations, returns the value of a path in the update's `$set`. + * Useful for writing getters/setters that can work with both update operations + * and `save()`. + */ + get(path: string): any; + + /** Returns the current query filter (also known as conditions) as a POJO. */ + getFilter(): FilterQuery; + + /** Gets query options. */ + getOptions(): QueryOptions; + + /** Gets a list of paths to be populated by this query */ + getPopulatedPaths(): Array; + + /** Returns the current query filter. Equivalent to `getFilter()`. */ + getQuery(): FilterQuery; + + /** Returns the current update operations as a JSON object. */ + getUpdate(): UpdateQuery | UpdateWithAggregationPipeline | null; + + /** Specifies a `$gt` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + gt(val: number): this; + gt(path: string, val: number): this; + + /** Specifies a `$gte` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + gte(val: number): this; + gte(path: string, val: number): this; + + /** Sets query hints. */ + hint(val: any): this; + + /** Specifies an `$in` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + in(val: Array): this; + in(path: string, val: Array): this; + + /** Declares an intersects query for `geometry()`. 
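+ *
+ * Sketch only, mirroring the documented `where().intersects().geometry()` chain
+ * (hypothetical `Place` model with a GeoJSON `loc` path):
+ *
+ * @example
+ * ```js
+ * await Place.where('loc').intersects().geometry({
+ *   type: 'LineString',
+ *   coordinates: [[180.0, 11.0], [180, 9.0]]
+ * });
+ * ```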
*/ + intersects(arg?: any): this; + + /** Requests acknowledgement that this operation has been persisted to MongoDB's on-disk journal. */ + j(val: boolean | null): this; + + /** Sets the lean option. */ + lean : LeanDocumentOrArrayWithRawType>(val?: boolean | any): QueryWithHelpers; + + /** Specifies the maximum number of documents the query will return. */ + limit(val: number): this; + + /** Specifies a `$lt` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + lt(val: number): this; + lt(path: string, val: number): this; + + /** Specifies a `$lte` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + lte(val: number): this; + lte(path: string, val: number): this; + + /** + * Runs a function `fn` and treats the return value of `fn` as the new value + * for the query to resolve to. + */ + map(fn: (doc: ResultType) => MappedType): QueryWithHelpers; + + /** Specifies an `$maxDistance` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + maxDistance(val: number): this; + maxDistance(path: string, val: number): this; + + /** Specifies the maxScan option. */ + maxScan(val: number): this; + + /** + * Sets the [maxTimeMS](https://docs.mongodb.com/manual/reference/method/cursor.maxTimeMS/) + * option. This will tell the MongoDB server to abort if the query or write op + * has been running for more than `ms` milliseconds. + */ + maxTimeMS(ms: number): this; + + /** Merges another Query or conditions object into this one. */ + merge(source: Query | FilterQuery): this; + + /** Specifies a `$mod` condition, filters documents for documents whose `path` property is a number that is equal to `remainder` modulo `divisor`. */ + mod(val: Array): this; + mod(path: string, val: Array): this; + + /** The model this query was created from */ + model: typeof Model; + + /** + * Getter/setter around the current mongoose-specific options for this query + * Below are the current Mongoose-specific options. + */ + mongooseOptions(val?: MongooseQueryOptions): MongooseQueryOptions; + + /** Specifies a `$ne` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + ne(val: any): this; + ne(path: string, val: any): this; + + /** Specifies a `$near` or `$nearSphere` condition */ + near(val: any): this; + near(path: string, val: any): this; + + /** Specifies an `$nin` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + nin(val: Array): this; + nin(path: string, val: Array): this; + + /** Specifies arguments for an `$nor` condition. */ + nor(array: Array>): this; + + /** Specifies arguments for an `$or` condition. */ + or(array: Array>): this; + + /** + * Make this query throw an error if no documents match the given `filter`. + * This is handy for integrating with async/await, because `orFail()` saves you + * an extra `if` statement to check if no document was found. + */ + orFail(err?: NativeError | (() => NativeError)): QueryWithHelpers, DocType, THelpers, RawDocType>; + + /** Specifies a `$polygon` condition */ + polygon(...coordinatePairs: number[][]): this; + polygon(path: string, ...coordinatePairs: number[][]): this; + + /** Specifies paths which should be populated with other documents. 
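+ *
+ * Minimal sketch; assumes each hypothetical `Task` document stores an `owner`
+ * ObjectId ref to a `User` model:
+ *
+ * @example
+ * ```js
+ * const tasks = await Task.find().populate({ path: 'owner', select: 'name' });
+ * ```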
*/ + populate(path: string | any, select?: string | any, model?: string | Model, match?: any): this; + populate(options: PopulateOptions | Array): this; + + /** Get/set the current projection (AKA fields). Pass `null` to remove the current projection. */ + projection(fields?: any | null): any; + + /** Determines the MongoDB nodes from which to read. */ + read(pref: string | mongodb.ReadPreferenceMode, tags?: any[]): this; + + /** Sets the readConcern option for the query. */ + readConcern(level: string): this; + + /** Specifies a `$regex` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + regex(val: string | RegExp): this; + regex(path: string, val: string | RegExp): this; + + /** + * Declare and/or execute this query as a remove() operation. `remove()` is + * deprecated, you should use [`deleteOne()`](#query_Query-deleteOne) + * or [`deleteMany()`](#query_Query-deleteMany) instead. + */ + remove(filter?: FilterQuery, callback?: Callback): Query; + + /** + * Declare and/or execute this query as a replaceOne() operation. Same as + * `update()`, except MongoDB will replace the existing document and will + * not accept any [atomic](https://docs.mongodb.com/manual/tutorial/model-data-for-atomic-operations/#pattern) operators (`$set`, etc.) + */ + replaceOne(filter?: FilterQuery, replacement?: DocumentDefinition, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers; + replaceOne(filter?: FilterQuery, replacement?: Object, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers; + + /** Specifies which document fields to include or exclude (also known as the query "projection") */ + select(arg: string | any): this; + + /** Determines if field selection has been made. */ + selected(): boolean; + + /** Determines if exclusive field selection has been made. */ + selectedExclusively(): boolean; + + /** Determines if inclusive field selection has been made. */ + selectedInclusively(): boolean; + + /** + * Sets the [MongoDB session](https://docs.mongodb.com/manual/reference/server-sessions/) + * associated with this query. Sessions are how you mark a query as part of a + * [transaction](/docs/transactions.html). + */ + session(session: mongodb.ClientSession | null): this; + + /** + * Adds a `$set` to this query's update without changing the operation. + * This is useful for query middleware so you can add an update regardless + * of whether you use `updateOne()`, `updateMany()`, `findOneAndUpdate()`, etc. + */ + set(path: string, value: any): this; + + /** Sets query options. Some options only make sense for certain operations. */ + setOptions(options: QueryOptions, overwrite?: boolean): this; + + /** Sets the query conditions to the provided JSON object. */ + setQuery(val: FilterQuery | null): void; + + setUpdate(update: UpdateQuery | UpdateWithAggregationPipeline): void; + + /** Specifies an `$size` query condition. When called with one argument, the most recent path passed to `where()` is used. */ + size(val: number): this; + size(path: string, val: number): this; + + /** Specifies the number of documents to skip. */ + skip(val: number): this; + + /** Specifies a `$slice` projection for an array. */ + slice(val: number | Array): this; + slice(path: string, val: number | Array): this; + + /** Specifies this query as a `snapshot` query. */ + snapshot(val?: boolean): this; + + /** Sets the sort order. If an object is passed, values allowed are `asc`, `desc`, `ascending`, `descending`, `1`, and `-1`. 
*/ + sort(arg: string | any): this; + + /** Sets the tailable option (for use with capped collections). */ + tailable(bool?: boolean, opts?: { + numberOfRetries?: number; + tailableRetryInterval?: number; + }): this; + + /** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + */ + then: Promise['then']; + + /** Converts this query to a customized, reusable query constructor with all arguments and options retained. */ + toConstructor(): new (...args: any[]) => QueryWithHelpers; + + /** Declare and/or execute this query as an update() operation. */ + update(filter?: FilterQuery, update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers; + + /** + * Declare and/or execute this query as an updateMany() operation. Same as + * `update()`, except MongoDB will update _all_ documents that match + * `filter` (as opposed to just the first one) regardless of the value of + * the `multi` option. + */ + updateMany(filter?: FilterQuery, update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers; + + /** + * Declare and/or execute this query as an updateOne() operation. Same as + * `update()`, except it does not support the `multi` or `overwrite` options. + */ + updateOne(filter?: FilterQuery, update?: UpdateQuery | UpdateWithAggregationPipeline, options?: QueryOptions | null, callback?: Callback): QueryWithHelpers; + + /** + * Sets the specified number of `mongod` servers, or tag set of `mongod` servers, + * that must acknowledge this write before this write is considered successful. + */ + w(val: string | number | null): this; + + /** Specifies a path for use with chaining. */ + where(path: string, val?: any): this; + where(obj: object): this; + where(): this; + + /** Defines a `$within` or `$geoWithin` argument for geo-spatial queries. */ + within(val?: any): this; + + /** + * If [`w > 1`](/docs/api.html#query_Query-w), the maximum amount of time to + * wait for this write to propagate through the replica set before this + * operation fails. The default is `0`, which means no timeout. + */ + wtimeout(ms: number): this; + } + + export type QuerySelector = { + // Comparison + $eq?: T; + $gt?: T; + $gte?: T; + $in?: [T] extends AnyArray ? Unpacked[] : T[]; + $lt?: T; + $lte?: T; + $ne?: T; + $nin?: [T] extends AnyArray ? Unpacked[] : T[]; + // Logical + $not?: T extends string ? QuerySelector | RegExp : QuerySelector; + // Element + /** + * When `true`, `$exists` matches the documents that contain the field, + * including documents where the field value is null. + */ + $exists?: boolean; + $type?: string | number; + // Evaluation + $expr?: any; + $jsonSchema?: any; + $mod?: T extends number ? [number, number] : never; + $regex?: T extends string ? RegExp | string : never; + $options?: T extends string ? string : never; + // Geospatial + // TODO: define better types for geo queries + $geoIntersects?: { $geometry: object }; + $geoWithin?: object; + $near?: object; + $nearSphere?: object; + $maxDistance?: number; + // Array + // TODO: define better types for $all and $elemMatch + $all?: T extends ReadonlyArray ? any[] : never; + $elemMatch?: T extends ReadonlyArray ? object : never; + $size?: T extends ReadonlyArray ? 
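As a usage sketch of the chainable query helpers and `QuerySelector` operators declared above — a hypothetical `Task` model and field names are assumed here; none of them come from this diff:

```js
// Illustrative only: `Task`, `priority`, and `done` are hypothetical names.
const mongoose = require('mongoose');

const Task = mongoose.model('Task', new mongoose.Schema({
  title: String,
  priority: Number,
  done: Boolean
}));

async function findOpenTasks() {
  return Task.find({ done: false })   // FilterQuery
    .where('priority').gte(2)         // $gte applied to the most recent where() path
    .sort('-priority')                // descending sort via '-' prefix
    .limit(10)
    .lean()                           // resolve to plain objects instead of Documents
    .exec();                          // returns a Promise
}
```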
number : never; + // Bitwise + $bitsAllClear?: number | mongodb.Binary | number[]; + $bitsAllSet?: number | mongodb.Binary | number[]; + $bitsAnyClear?: number | mongodb.Binary | number[]; + $bitsAnySet?: number | mongodb.Binary | number[]; + }; + + export type RootQuerySelector = { + /** @see https://docs.mongodb.com/manual/reference/operator/query/and/#op._S_and */ + $and?: Array>; + /** @see https://docs.mongodb.com/manual/reference/operator/query/nor/#op._S_nor */ + $nor?: Array>; + /** @see https://docs.mongodb.com/manual/reference/operator/query/or/#op._S_or */ + $or?: Array>; + /** @see https://docs.mongodb.com/manual/reference/operator/query/text */ + $text?: { + $search: string; + $language?: string; + $caseSensitive?: boolean; + $diacriticSensitive?: boolean; + }; + /** @see https://docs.mongodb.com/manual/reference/operator/query/where/#op._S_where */ + $where?: string | Function; + /** @see https://docs.mongodb.com/manual/reference/operator/query/comment/#op._S_comment */ + $comment?: string; + // we could not find a proper TypeScript generic to support nested queries e.g. 'user.friends.name' + // this will mark all unrecognized properties as any (including nested queries) + [key: string]: any; + }; + + type DotAndArrayNotation = { + readonly [key: string]: AssignableType; + }; + + type ReadonlyPartial = { + readonly [key in keyof TSchema]?: TSchema[key]; + }; + + type MatchKeysAndValues = ReadonlyPartial & DotAndArrayNotation; + + type AllowStringForObjectId = T extends mongodb.ObjectId ? T | string : T; + type AllowRegexpForStrings = T extends string ? T | RegExp : T; + type AllowArrayElementQuery = T extends (infer U)[] ? T | U : T; + + type ApplyBasicQueryCasting = AllowStringForObjectId | AllowRegexpForStrings | AllowArrayElementQuery; + type Condition = ApplyBasicQueryCasting | QuerySelector>; + + type _FilterQuery = { + [P in keyof T]?: Condition; + } & + RootQuerySelector; + + export type FilterQuery = _FilterQuery; + + type AddToSetOperators = { + $each: Type; + }; + + type SortValues = -1 | 1 | 'asc' | 'desc'; + + type ArrayOperator = { + $each: Type; + $slice?: number; + $position?: number; + $sort?: SortValues | Record; + }; + + type OnlyFieldsOfType = { + [key in keyof TSchema]?: [Extract] extends [never] ? 
never : AssignableType; + }; + + type NumericTypes = number | Decimal128 | mongodb.Double | mongodb.Int32 | mongodb.Long; + + type _UpdateQuery = { + /** @see https://docs.mongodb.com/manual/reference/operator/update-field/ */ + $currentDate?: OnlyFieldsOfType & AnyObject; + $inc?: OnlyFieldsOfType & AnyObject; + $min?: OnlyFieldsOfType & AnyObject; + $max?: OnlyFieldsOfType & AnyObject; + $mul?: OnlyFieldsOfType & AnyObject; + $rename?: { [key: string]: string }; + $set?: OnlyFieldsOfType & AnyObject; + $setOnInsert?: OnlyFieldsOfType & AnyObject; + $unset?: OnlyFieldsOfType & AnyObject; + + /** @see https://docs.mongodb.com/manual/reference/operator/update-array/ */ + $addToSet?: OnlyFieldsOfType & AnyObject; + $pop?: OnlyFieldsOfType, 1 | -1> & AnyObject; + $pull?: OnlyFieldsOfType, any> & AnyObject; + $push?: OnlyFieldsOfType, any> & AnyObject; + $pullAll?: OnlyFieldsOfType, any> & AnyObject; + + /** @see https://docs.mongodb.com/manual/reference/operator/update-bitwise/ */ + $bit?: { + [key: string]: { [key in 'and' | 'or' | 'xor']?: number }; + }; + }; + + type UpdateWithAggregationPipeline = UpdateAggregationStage[]; + type UpdateAggregationStage = { $addFields: any } | + { $set: any } | + { $project: any } | + { $unset: any } | + { $replaceRoot: any } | + { $replaceWith: any }; + + type __UpdateDefProperty = + 0 extends (1 & T) ? T : // any + T extends unknown[] ? LeanArray : // Array + T extends Types.Subdocument ? Omit, '$isSingleNested' | 'ownerDocument' | 'parent'> : + T extends Document ? LeanDocument : // Subdocument + [Extract] extends [never] ? T : + T | string; + type __UpdateQueryDef = { + [K in keyof T]: __UpdateDefProperty; + }; + type _UpdateQueryDef = __UpdateQueryDef; + + export type UpdateQuery = (_UpdateQuery<_UpdateQueryDef> & MatchKeysAndValues<_UpdateQueryDef>>); + + type _AllowStringsForIds = { + [K in keyof T]: [Extract] extends [never] + ? T[K] extends TreatAsPrimitives + ? T[K] + : _AllowStringsForIds + : T[K] | string; + }; + export type DocumentDefinition = { + [K in keyof Omit>]: + [Extract] extends [never] + ? T[K] extends TreatAsPrimitives + ? T[K] + : LeanDocumentElement + : T[K] | string; + }; + + type actualPrimitives = string | boolean | number | bigint | symbol | null | undefined; + type TreatAsPrimitives = actualPrimitives | + NativeDate | RegExp | symbol | Error | BigInt | Types.ObjectId; + + type LeanType = + 0 extends (1 & T) ? T : // any + T extends TreatAsPrimitives ? T : // primitives + T extends Types.Subdocument ? Omit, '$isSingleNested' | 'ownerDocument' | 'parent'> : // subdocs + LeanDocument; // Documents and everything else + + type LeanArray = T extends unknown[][] ? LeanArray[] : LeanType[]; + + export type _LeanDocument = { + [K in keyof T]: LeanDocumentElement; + }; + + // Keep this a separate type, to ensure that T is a naked type. + // This way, the conditional type is distributive over union types. + // This is required for PopulatedDoc. + type LeanDocumentElement = + 0 extends (1 & T) ? T : // any + T extends unknown[] ? LeanArray : // Array + T extends Document ? LeanDocument : // Subdocument + T; + + export type SchemaDefinitionType = T extends Document ? Omit> : T; + export type LeanDocument = Omit<_LeanDocument, Exclude | '$isSingleNested'>; + + export type LeanDocumentOrArray = 0 extends (1 & T) ? T : + T extends unknown[] ? LeanDocument[] : + T extends Document ? LeanDocument : + T; + + export type LeanDocumentOrArrayWithRawType = 0 extends (1 & T) ? T : + T extends unknown[] ? RawDocType[] : + T extends Document ? 
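A short sketch of what the `FilterQuery` and `UpdateQuery` shapes above look like in practice; the `Task` model and its fields are hypothetical:

```js
// The filter uses QuerySelector operators; the update uses field and array update operators.
await Task.updateOne(
  { priority: { $gte: 2 }, tags: { $in: ['home', 'work'] } },            // FilterQuery
  { $set: { done: true }, $inc: { edits: 1 }, $push: { log: 'closed' } } // UpdateQuery
);

// A lean query resolves to LeanDocument-shaped plain objects:
const doc = await Task.findOne({ done: true }).lean();
// `doc` (if found) has no .save()/.populate() methods, only the raw fields.
```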
RawDocType : + T; + + class QueryCursor extends stream.Readable { + [Symbol.asyncIterator](): AsyncIterableIterator; + + /** + * Adds a [cursor flag](http://mongodb.github.io/node-mongodb-native/2.2/api/Cursor.html#addCursorFlag). + * Useful for setting the `noCursorTimeout` and `tailable` flags. + */ + addCursorFlag(flag: string, value: boolean): this; + + /** + * Marks this cursor as closed. Will stop streaming and subsequent calls to + * `next()` will error. + */ + close(): Promise; + close(callback: CallbackWithoutResult): void; + + /** + * Execute `fn` for every document(s) in the cursor. If batchSize is provided + * `fn` will be executed for each batch of documents. If `fn` returns a promise, + * will wait for the promise to resolve before iterating on to the next one. + * Returns a promise that resolves when done. + */ + eachAsync(fn: (doc: DocType) => any, options?: { parallel?: number }): Promise; + eachAsync(fn: (doc: DocType[]) => any, options: { parallel?: number, batchSize: number }): Promise; + eachAsync(fn: (doc: DocType) => any, options?: { parallel?: number, batchSize?: number }, cb?: CallbackWithoutResult): void; + eachAsync(fn: (doc: DocType[]) => any, options: { parallel?: number, batchSize: number }, cb?: CallbackWithoutResult): void; + + /** + * Registers a transform function which subsequently maps documents retrieved + * via the streams interface or `.next()` + */ + map(fn: (res: DocType) => ResultType): QueryCursor; + + /** + * Get the next document from this cursor. Will return `null` when there are + * no documents left. + */ + next(): Promise; + next(callback: Callback): void; + + options: any; + } + + class Aggregate { + /** + * Sets an option on this aggregation. This function will be deprecated in a + * future release. */ + addCursorFlag(flag: string, value: boolean): this; + + /** + * Appends a new $addFields operator to this aggregate pipeline. + * Requires MongoDB v3.4+ to work + */ + addFields(arg: any): this; + + /** Sets the allowDiskUse option for the aggregation query (ignored for < 2.6.0) */ + allowDiskUse(value: boolean): this; + + /** Appends new operators to this aggregate pipeline */ + append(...args: any[]): this; + + /** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + * Like [`.then()`](#query_Query-then), but only takes a rejection handler. + */ + catch: Promise['catch']; + + /** Adds a collation. */ + collation(options: mongodb.CollationOptions): this; + + /** Appends a new $count operator to this aggregate pipeline. */ + count(countName: string): this; + + /** + * Sets the cursor option for the aggregation query (ignored for < 2.6.0). + */ + cursor(options?: Record): AggregationCursor; + + /** Executes the aggregate pipeline on the currently bound Model. */ + exec(callback?: Callback): Promise; + + /** Execute the aggregation with explain */ + explain(callback?: Callback): Promise; + + /** Combines multiple aggregation pipelines. */ + facet(options: any): this; + + /** Appends new custom $graphLookup operator(s) to this aggregate pipeline, performing a recursive search on a collection. */ + graphLookup(options: any): this; + + /** Appends new custom $group operator to this aggregate pipeline. */ + group(arg: any): this; + + /** Sets the hint option for the aggregation query (ignored for < 3.6.0) */ + hint(value: Record | string): this; + + /** + * Appends a new $limit operator to this aggregate pipeline. 
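The `QueryCursor` class above streams results instead of buffering them all in memory; a minimal sketch, again assuming a hypothetical `Task` model:

```js
// Stream matching documents one at a time via eachAsync()...
await Task.find({ done: false })
  .cursor()
  .eachAsync(doc => console.log(doc.title), { parallel: 2 });

// ...or consume the query with for-await, which goes through Symbol.asyncIterator.
for await (const doc of Task.find({ done: false })) {
  console.log(doc.title);
}
```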
+ * @param num maximum number of records to pass to the next stage + */ + limit(num: number): this; + + /** Appends new custom $lookup operator to this aggregate pipeline. */ + lookup(options: any): this; + + /** + * Appends a new custom $match operator to this aggregate pipeline. + * @param arg $match operator contents + */ + match(arg: any): this; + + /** + * Binds this aggregate to a model. + * @param model the model to which the aggregate is to be bound + */ + model(model: any): this; + + /** + * Append a new $near operator to this aggregation pipeline + * @param arg $near operator contents + */ + near(arg: { near?: number[]; distanceField: string; maxDistance?: number; query?: Record; includeLocs?: string; num?: number; uniqueDocs?: boolean }): this; + + /** Returns the current pipeline */ + pipeline(): any[]; + + /** Appends a new $project operator to this aggregate pipeline. */ + project(arg: string | Object): this; + + /** Sets the readPreference option for the aggregation query. */ + read(pref: string | mongodb.ReadPreferenceMode, tags?: any[]): this; + + /** Sets the readConcern level for the aggregation query. */ + readConcern(level: string): this; + + /** Appends a new $redact operator to this aggregate pipeline. */ + redact(expression: any, thenExpr: string | any, elseExpr: string | any): this; + + /** Appends a new $replaceRoot operator to this aggregate pipeline. */ + replaceRoot(newRoot: object | string): this; + + /** + * Helper for [Atlas Text Search](https://docs.atlas.mongodb.com/reference/atlas-search/tutorial/)'s + * `$search` stage. + */ + search(options: any): this; + + /** Lets you set arbitrary options, for middleware or plugins. */ + option(value: Record): this; + + /** Appends new custom $sample operator to this aggregate pipeline. */ + sample(size: number): this; + + /** Sets the session for this aggregation. Useful for [transactions](/docs/transactions.html). */ + session(session: mongodb.ClientSession | null): this; + + /** + * Appends a new $skip operator to this aggregate pipeline. + * @param num number of records to skip before next stage + */ + skip(num: number): this; + + /** Appends a new $sort operator to this aggregate pipeline. */ + sort(arg: any): this; + + /** Provides promise for aggregate. */ + then: Promise['then']; + + /** + * Appends a new $sortByCount operator to this aggregate pipeline. Accepts either a string field name + * or a pipeline object. + */ + sortByCount(arg: string | any): this; + + /** Appends new custom $unwind operator(s) to this aggregate pipeline. */ + unwind(...args: any[]): this; + } + + class AggregationCursor extends stream.Readable { + /** + * Adds a [cursor flag](http://mongodb.github.io/node-mongodb-native/2.2/api/Cursor.html#addCursorFlag). + * Useful for setting the `noCursorTimeout` and `tailable` flags. + */ + addCursorFlag(flag: string, value: boolean): this; + + /** + * Marks this cursor as closed. Will stop streaming and subsequent calls to + * `next()` will error. + */ + close(): Promise; + close(callback: CallbackWithoutResult): void; + + /** + * Execute `fn` for every document(s) in the cursor. If batchSize is provided + * `fn` will be executed for each batch of documents. If `fn` returns a promise, + * will wait for the promise to resolve before iterating on to the next one. + * Returns a promise that resolves when done. 
+ */ + eachAsync(fn: (doc: any) => any, options?: { parallel?: number, batchSize?: number }): Promise; + eachAsync(fn: (doc: any) => any, options?: { parallel?: number, batchSize?: number }, cb?: CallbackWithoutResult): void; + + /** + * Registers a transform function which subsequently maps documents retrieved + * via the streams interface or `.next()` + */ + map(fn: (res: any) => any): this; + + /** + * Get the next document from this cursor. Will return `null` when there are + * no documents left. + */ + next(): Promise; + next(callback: Callback): void; + } + + class SchemaType { + /** SchemaType constructor */ + constructor(path: string, options?: AnyObject, instance?: string); + + /** Get/set the function used to cast arbitrary values to this type. */ + static cast(caster?: Function | boolean): Function; + + static checkRequired(checkRequired?: (v: any) => boolean): (v: any) => boolean; + + /** Sets a default option for this schema type. */ + static set(option: string, value: any): void; + + /** Attaches a getter for all instances of this schema type. */ + static get(getter: (value: any) => any): void; + + /** The class that Mongoose uses internally to instantiate this SchemaType's `options` property. */ + OptionsConstructor: typeof SchemaTypeOptions; + + /** Cast `val` to this schema type. Each class that inherits from schema type should implement this function. */ + cast(val: any, doc: Document, init: boolean, prev?: any, options?: any): any; + + /** Sets a default value for this SchemaType. */ + default(val: any): any; + + /** Adds a getter to this schematype. */ + get(fn: Function): this; + + /** + * Defines this path as immutable. Mongoose prevents you from changing + * immutable paths unless the parent document has [`isNew: true`](/docs/api.html#document_Document-isNew). + */ + immutable(bool: boolean): this; + + /** Declares the index options for this schematype. */ + index(options: any): this; + + /** String representation of what type this is, like 'ObjectID' or 'Number' */ + instance: string; + + /** The options this SchemaType was instantiated with */ + options: AnyObject; + + /** + * Set the model that this path refers to. This is the option that [populate](https://mongoosejs.com/docs/populate.html) + * looks at to determine the foreign collection it should query. + */ + ref(ref: string | boolean | Model): this; + + /** + * Adds a required validator to this SchemaType. The validator gets added + * to the front of this SchemaType's validators array using unshift(). + */ + required(required: boolean, message?: string): this; + + /** The schema this SchemaType instance is part of */ + schema: Schema; + + /** Sets default select() behavior for this path. */ + select(val: boolean): this; + + /** Adds a setter to this schematype. */ + set(fn: Function): this; + + /** Declares a sparse index. */ + sparse(bool: boolean): this; + + /** Declares a full text index. */ + text(bool: boolean): this; + + /** Defines a custom function for transforming this path when converting a document to JSON. */ + transform(fn: (value: any) => any): this; + + /** Declares an unique index. */ + unique(bool: boolean): this; + + /** Adds validator(s) for this document path. 
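A sketch of how the per-path `SchemaType` API above is usually reached — either through schema definition options or through `schema.path()`; the schema shape here is hypothetical:

```js
// Options in the schema definition map onto SchemaType methods (required(), index(), ...).
const userSchema = new mongoose.Schema({
  email: { type: String, required: true, lowercase: true, index: true }
});

// The same SchemaType instance can be configured directly:
userSchema.path('email')
  .validate(v => /@/.test(v), 'email must contain "@"'); // SchemaType#validate()
```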
*/ + validate(obj: RegExp | Function | any, errorMsg?: string, + type?: string): this; + } + + type Callback = (error: CallbackError, result: T) => void; + + type CallbackWithoutResult = (error: CallbackError) => void; + + class NativeError extends global.Error { } + type CallbackError = NativeError | null; + + class Error extends global.Error { + constructor(msg: string); + + /** The type of error. "MongooseError" for generic errors. */ + name: string; + + static messages: any; + + static Messages: any; + } + + namespace Error { + export class CastError extends Error { + name: 'CastError'; + stringValue: string; + kind: string; + value: any; + path: string; + reason?: NativeError | null; + model?: any; + + constructor(type: string, value: any, path: string, reason?: NativeError, schemaType?: SchemaType); + } + + export class DisconnectedError extends Error { + name: 'DisconnectedError'; + } + + export class DivergentArrayError extends Error { + name: 'DivergentArrayError'; + } + + export class MissingSchemaError extends Error { + name: 'MissingSchemaError'; + } + + export class DocumentNotFoundError extends Error { + name: 'DocumentNotFoundError'; + result: any; + numAffected: number; + filter: any; + query: any; + } + + export class ObjectExpectedError extends Error { + name: 'ObjectExpectedError'; + path: string; + } + + export class ObjectParameterError extends Error { + name: 'ObjectParameterError'; + } + + export class OverwriteModelError extends Error { + name: 'OverwriteModelError'; + } + + export class ParallelSaveError extends Error { + name: 'ParallelSaveError'; + } + + export class ParallelValidateError extends Error { + name: 'ParallelValidateError'; + } + + export class MongooseServerSelectionError extends Error { + name: 'MongooseServerSelectionError'; + } + + export class StrictModeError extends Error { + name: 'StrictModeError'; + isImmutableError: boolean; + path: string; + } + + export class ValidationError extends Error { + name: 'ValidationError'; + + errors: { [path: string]: ValidatorError | CastError | ValidationError }; + addError: (path: string, error: ValidatorError | CastError | ValidationError) => void; + + constructor(instance?: Error); + } + + export class ValidatorError extends Error { + name: 'ValidatorError'; + properties: { + message: string, + type?: string, + path?: string, + value?: any, + reason?: any + }; + kind: string; + path: string; + value: any; + reason?: Error | null; + + constructor(properties: { + message?: string, + type?: string, + path?: string, + value?: any, + reason?: any + }); + } + + export class VersionError extends Error { + name: 'VersionError'; + version: number; + modifiedPaths: Array; + } + } + + /* for ts-mongoose */ + class mquery {} +} diff --git a/node_modules/mongoose/index.js b/node_modules/mongoose/index.js new file mode 100644 index 000000000..a938b443d --- /dev/null +++ b/node_modules/mongoose/index.js @@ -0,0 +1,9 @@ + +/** + * Export lib/mongoose + * + */ + +'use strict'; + +module.exports = require('./lib/'); diff --git a/node_modules/mongoose/lib/aggregate.js b/node_modules/mongoose/lib/aggregate.js new file mode 100644 index 000000000..aa07bc5a8 --- /dev/null +++ b/node_modules/mongoose/lib/aggregate.js @@ -0,0 +1,1141 @@ +'use strict'; + +/*! 
+ * Module dependencies + */ + +const AggregationCursor = require('./cursor/AggregationCursor'); +const Query = require('./query'); +const applyGlobalMaxTimeMS = require('./helpers/query/applyGlobalMaxTimeMS'); +const getConstructorName = require('./helpers/getConstructorName'); +const promiseOrCallback = require('./helpers/promiseOrCallback'); +const stringifyFunctionOperators = require('./helpers/aggregate/stringifyFunctionOperators'); +const util = require('util'); +const utils = require('./utils'); +const read = Query.prototype.read; +const readConcern = Query.prototype.readConcern; + +/** + * Aggregate constructor used for building aggregation pipelines. Do not + * instantiate this class directly, use [Model.aggregate()](/docs/api.html#model_Model.aggregate) instead. + * + * ####Example: + * + * const aggregate = Model.aggregate([ + * { $project: { a: 1, b: 1 } }, + * { $skip: 5 } + * ]); + * + * Model. + * aggregate([{ $match: { age: { $gte: 21 }}}]). + * unwind('tags'). + * exec(callback); + * + * ####Note: + * + * - The documents returned are plain javascript objects, not mongoose documents (since any shape of document can be returned). + * - Mongoose does **not** cast pipeline stages. The below will **not** work unless `_id` is a string in the database + * + * ```javascript + * new Aggregate([{ $match: { _id: '00000000000000000000000a' } }]); + * // Do this instead to cast to an ObjectId + * new Aggregate([{ $match: { _id: mongoose.Types.ObjectId('00000000000000000000000a') } }]); + * ``` + * + * @see MongoDB http://docs.mongodb.org/manual/applications/aggregation/ + * @see driver http://mongodb.github.com/node-mongodb-native/api-generated/collection.html#aggregate + * @param {Array} [pipeline] aggregation pipeline as an array of objects + * @param {Model} [model] the model to use with this aggregate. + * @api public + */ + +function Aggregate(pipeline, model) { + this._pipeline = []; + this._model = model; + this.options = {}; + + if (arguments.length === 1 && util.isArray(pipeline)) { + this.append.apply(this, pipeline); + } +} + +/** + * Contains options passed down to the [aggregate command](https://docs.mongodb.com/manual/reference/command/aggregate/). + * Supported options are: + * + * - `readPreference` + * - [`cursor`](./api.html#aggregate_Aggregate-cursor) + * - [`explain`](./api.html#aggregate_Aggregate-explain) + * - [`allowDiskUse`](./api.html#aggregate_Aggregate-allowDiskUse) + * - `maxTimeMS` + * - `bypassDocumentValidation` + * - `raw` + * - `promoteLongs` + * - `promoteValues` + * - `promoteBuffers` + * - [`collation`](./api.html#aggregate_Aggregate-collation) + * - `comment` + * - [`session`](./api.html#aggregate_Aggregate-session) + * + * @property options + * @memberOf Aggregate + * @api public + */ + +Aggregate.prototype.options; + +/** + * Get/set the model that this aggregation will execute on. + * + * ####Example: + * const aggregate = MyModel.aggregate([{ $match: { answer: 42 } }]); + * aggregate.model() === MyModel; // true + * + * // Change the model. There's rarely any reason to do this. + * aggregate.model(SomeOtherModel); + * aggregate.model() === SomeOtherModel; // true + * + * @param {Model} [model] set the model associated with this aggregate. 
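A minimal sketch of building and running a pipeline with the `Aggregate` helpers defined in this file; the `Task` model and its fields are hypothetical, and the results are plain objects rather than mongoose documents, as the constructor comment above notes:

```js
// Stages can be passed to aggregate() up front or appended through helper methods.
const byPriority = await Task.aggregate([{ $match: { done: false } }])
  .group({ _id: '$priority', count: { $sum: 1 } })
  .sort({ count: -1 })
  .allowDiskUse(true)
  .exec();
```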
+ * @return {Model} + * @api public + */ + +Aggregate.prototype.model = function(model) { + if (arguments.length === 0) { + return this._model; + } + + this._model = model; + if (model.schema != null) { + if (this.options.readPreference == null && + model.schema.options.read != null) { + this.options.readPreference = model.schema.options.read; + } + if (this.options.collation == null && + model.schema.options.collation != null) { + this.options.collation = model.schema.options.collation; + } + } + + return model; +}; + +/** + * Appends new operators to this aggregate pipeline + * + * ####Examples: + * + * aggregate.append({ $project: { field: 1 }}, { $limit: 2 }); + * + * // or pass an array + * const pipeline = [{ $match: { daw: 'Logic Audio X' }} ]; + * aggregate.append(pipeline); + * + * @param {Object} ops operator(s) to append + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.append = function() { + const args = (arguments.length === 1 && util.isArray(arguments[0])) + ? arguments[0] + : utils.args(arguments); + + if (!args.every(isOperator)) { + throw new Error('Arguments must be aggregate pipeline operators'); + } + + this._pipeline = this._pipeline.concat(args); + + return this; +}; + +/** + * Appends a new $addFields operator to this aggregate pipeline. + * Requires MongoDB v3.4+ to work + * + * ####Examples: + * + * // adding new fields based on existing fields + * aggregate.addFields({ + * newField: '$b.nested' + * , plusTen: { $add: ['$val', 10]} + * , sub: { + * name: '$a' + * } + * }) + * + * // etc + * aggregate.addFields({ salary_k: { $divide: [ "$salary", 1000 ] } }); + * + * @param {Object} arg field specification + * @see $addFields https://docs.mongodb.com/manual/reference/operator/aggregation/addFields/ + * @return {Aggregate} + * @api public + */ +Aggregate.prototype.addFields = function(arg) { + const fields = {}; + if (typeof arg === 'object' && !util.isArray(arg)) { + Object.keys(arg).forEach(function(field) { + fields[field] = arg[field]; + }); + } else { + throw new Error('Invalid addFields() argument. Must be an object'); + } + return this.append({ $addFields: fields }); +}; + +/** + * Appends a new $project operator to this aggregate pipeline. + * + * Mongoose query [selection syntax](#query_Query-select) is also supported. + * + * ####Examples: + * + * // include a, include b, exclude _id + * aggregate.project("a b -_id"); + * + * // or you may use object notation, useful when + * // you have keys already prefixed with a "-" + * aggregate.project({a: 1, b: 1, _id: 0}); + * + * // reshaping documents + * aggregate.project({ + * newField: '$b.nested' + * , plusTen: { $add: ['$val', 10]} + * , sub: { + * name: '$a' + * } + * }) + * + * // etc + * aggregate.project({ salary_k: { $divide: [ "$salary", 1000 ] } }); + * + * @param {Object|String} arg field specification + * @see projection http://docs.mongodb.org/manual/reference/aggregation/project/ + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.project = function(arg) { + const fields = {}; + + if (typeof arg === 'object' && !util.isArray(arg)) { + Object.keys(arg).forEach(function(field) { + fields[field] = arg[field]; + }); + } else if (arguments.length === 1 && typeof arg === 'string') { + arg.split(/\s+/).forEach(function(field) { + if (!field) { + return; + } + const include = field[0] === '-' ? 0 : 1; + if (include === 0) { + field = field.substring(1); + } + fields[field] = include; + }); + } else { + throw new Error('Invalid project() argument. 
Must be string or object'); + } + + return this.append({ $project: fields }); +}; + +/** + * Appends a new custom $group operator to this aggregate pipeline. + * + * ####Examples: + * + * aggregate.group({ _id: "$department" }); + * + * @see $group http://docs.mongodb.org/manual/reference/aggregation/group/ + * @method group + * @memberOf Aggregate + * @instance + * @param {Object} arg $group operator contents + * @return {Aggregate} + * @api public + */ + +/** + * Appends a new custom $match operator to this aggregate pipeline. + * + * ####Examples: + * + * aggregate.match({ department: { $in: [ "sales", "engineering" ] } }); + * + * @see $match http://docs.mongodb.org/manual/reference/aggregation/match/ + * @method match + * @memberOf Aggregate + * @instance + * @param {Object} arg $match operator contents + * @return {Aggregate} + * @api public + */ + +/** + * Appends a new $skip operator to this aggregate pipeline. + * + * ####Examples: + * + * aggregate.skip(10); + * + * @see $skip http://docs.mongodb.org/manual/reference/aggregation/skip/ + * @method skip + * @memberOf Aggregate + * @instance + * @param {Number} num number of records to skip before next stage + * @return {Aggregate} + * @api public + */ + +/** + * Appends a new $limit operator to this aggregate pipeline. + * + * ####Examples: + * + * aggregate.limit(10); + * + * @see $limit http://docs.mongodb.org/manual/reference/aggregation/limit/ + * @method limit + * @memberOf Aggregate + * @instance + * @param {Number} num maximum number of records to pass to the next stage + * @return {Aggregate} + * @api public + */ + +/** + * Appends a new $geoNear operator to this aggregate pipeline. + * + * ####NOTE: + * + * **MUST** be used as the first operator in the pipeline. + * + * ####Examples: + * + * aggregate.near({ + * near: [40.724, -73.997], + * distanceField: "dist.calculated", // required + * maxDistance: 0.008, + * query: { type: "public" }, + * includeLocs: "dist.location", + * uniqueDocs: true, + * num: 5 + * }); + * + * @see $geoNear http://docs.mongodb.org/manual/reference/aggregation/geoNear/ + * @method near + * @memberOf Aggregate + * @instance + * @param {Object} arg + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.near = function(arg) { + const op = {}; + op.$geoNear = arg; + return this.append(op); +}; + +/*! + * define methods + */ + +'group match skip limit out'.split(' ').forEach(function($operator) { + Aggregate.prototype[$operator] = function(arg) { + const op = {}; + op['$' + $operator] = arg; + return this.append(op); + }; +}); + +/** + * Appends new custom $unwind operator(s) to this aggregate pipeline. + * + * Note that the `$unwind` operator requires the path name to start with '$'. + * Mongoose will prepend '$' if the specified field doesn't start '$'. + * + * ####Examples: + * + * aggregate.unwind("tags"); + * aggregate.unwind("a", "b", "c"); + * aggregate.unwind({ path: '$tags', preserveNullAndEmptyArrays: true }); + * + * @see $unwind http://docs.mongodb.org/manual/reference/aggregation/unwind/ + * @param {String|Object} fields the field(s) to unwind, either as field names or as [objects with options](https://docs.mongodb.com/manual/reference/operator/aggregation/unwind/#document-operand-with-options). If passing a string, prefixing the field name with '$' is optional. If passing an object, `path` must start with '$'. 
+ * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.unwind = function() { + const args = utils.args(arguments); + + const res = []; + for (const arg of args) { + if (arg && typeof arg === 'object') { + res.push({ $unwind: arg }); + } else if (typeof arg === 'string') { + res.push({ + $unwind: (arg && arg.startsWith('$')) ? arg : '$' + arg + }); + } else { + throw new Error('Invalid arg "' + arg + '" to unwind(), ' + + 'must be string or object'); + } + } + + return this.append.apply(this, res); +}; + +/** + * Appends a new $replaceRoot operator to this aggregate pipeline. + * + * Note that the `$replaceRoot` operator requires field strings to start with '$'. + * If you are passing in a string Mongoose will prepend '$' if the specified field doesn't start '$'. + * If you are passing in an object the strings in your expression will not be altered. + * + * ####Examples: + * + * aggregate.replaceRoot("user"); + * + * aggregate.replaceRoot({ x: { $concat: ['$this', '$that'] } }); + * + * @see $replaceRoot https://docs.mongodb.org/manual/reference/operator/aggregation/replaceRoot + * @param {String|Object} the field or document which will become the new root document + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.replaceRoot = function(newRoot) { + let ret; + + if (typeof newRoot === 'string') { + ret = newRoot.startsWith('$') ? newRoot : '$' + newRoot; + } else { + ret = newRoot; + } + + return this.append({ + $replaceRoot: { + newRoot: ret + } + }); +}; + +/** + * Appends a new $count operator to this aggregate pipeline. + * + * ####Examples: + * + * aggregate.count("userCount"); + * + * @see $count https://docs.mongodb.org/manual/reference/operator/aggregation/count + * @param {String} the name of the count field + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.count = function(countName) { + return this.append({ $count: countName }); +}; + +/** + * Appends a new $sortByCount operator to this aggregate pipeline. Accepts either a string field name + * or a pipeline object. + * + * Note that the `$sortByCount` operator requires the new root to start with '$'. + * Mongoose will prepend '$' if the specified field name doesn't start with '$'. + * + * ####Examples: + * + * aggregate.sortByCount('users'); + * aggregate.sortByCount({ $mergeObjects: [ "$employee", "$business" ] }) + * + * @see $sortByCount https://docs.mongodb.com/manual/reference/operator/aggregation/sortByCount/ + * @param {Object|String} arg + * @return {Aggregate} this + * @api public + */ + +Aggregate.prototype.sortByCount = function(arg) { + if (arg && typeof arg === 'object') { + return this.append({ $sortByCount: arg }); + } else if (typeof arg === 'string') { + return this.append({ + $sortByCount: (arg && arg.startsWith('$')) ? arg : '$' + arg + }); + } else { + throw new TypeError('Invalid arg "' + arg + '" to sortByCount(), ' + + 'must be string or object'); + } +}; + +/** + * Appends new custom $lookup operator to this aggregate pipeline. 
+ * + * ####Examples: + * + * aggregate.lookup({ from: 'users', localField: 'userId', foreignField: '_id', as: 'users' }); + * + * @see $lookup https://docs.mongodb.org/manual/reference/operator/aggregation/lookup/#pipe._S_lookup + * @param {Object} options to $lookup as described in the above link + * @return {Aggregate}* @api public + */ + +Aggregate.prototype.lookup = function(options) { + return this.append({ $lookup: options }); +}; + +/** + * Appends new custom $graphLookup operator(s) to this aggregate pipeline, performing a recursive search on a collection. + * + * Note that graphLookup can only consume at most 100MB of memory, and does not allow disk use even if `{ allowDiskUse: true }` is specified. + * + * #### Examples: + * // Suppose we have a collection of courses, where a document might look like `{ _id: 0, name: 'Calculus', prerequisite: 'Trigonometry'}` and `{ _id: 0, name: 'Trigonometry', prerequisite: 'Algebra' }` + * aggregate.graphLookup({ from: 'courses', startWith: '$prerequisite', connectFromField: 'prerequisite', connectToField: 'name', as: 'prerequisites', maxDepth: 3 }) // this will recursively search the 'courses' collection up to 3 prerequisites + * + * @see $graphLookup https://docs.mongodb.com/manual/reference/operator/aggregation/graphLookup/#pipe._S_graphLookup + * @param {Object} options to $graphLookup as described in the above link + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.graphLookup = function(options) { + const cloneOptions = {}; + if (options) { + if (!utils.isObject(options)) { + throw new TypeError('Invalid graphLookup() argument. Must be an object.'); + } + + utils.mergeClone(cloneOptions, options); + const startWith = cloneOptions.startWith; + + if (startWith && typeof startWith === 'string') { + cloneOptions.startWith = cloneOptions.startWith.startsWith('$') ? + cloneOptions.startWith : + '$' + cloneOptions.startWith; + } + + } + return this.append({ $graphLookup: cloneOptions }); +}; + +/** + * Appends new custom $sample operator to this aggregate pipeline. + * + * ####Examples: + * + * aggregate.sample(3); // Add a pipeline that picks 3 random documents + * + * @see $sample https://docs.mongodb.org/manual/reference/operator/aggregation/sample/#pipe._S_sample + * @param {Number} size number of random documents to pick + * @return {Aggregate} + * @api public + */ + +Aggregate.prototype.sample = function(size) { + return this.append({ $sample: { size: size } }); +}; + +/** + * Appends a new $sort operator to this aggregate pipeline. + * + * If an object is passed, values allowed are `asc`, `desc`, `ascending`, `descending`, `1`, and `-1`. + * + * If a string is passed, it must be a space delimited list of path names. The sort order of each path is ascending unless the path name is prefixed with `-` which will be treated as descending. 
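A sketch of the `$lookup`, `$unwind`, and `$sample` helpers above chained together; the foreign collection and field names are hypothetical:

```js
const withOwners = await Task.aggregate()
  .match({ done: false })
  .lookup({
    from: 'users',          // the collection name, not the model name
    localField: 'ownerId',
    foreignField: '_id',
    as: 'owner'
  })
  .unwind('owner')          // '$' prefix is prepended automatically
  .sample(5);               // keep 5 random joined documents
```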
+ * + * ####Examples: + * + * // these are equivalent + * aggregate.sort({ field: 'asc', test: -1 }); + * aggregate.sort('field -test'); + * + * @see $sort http://docs.mongodb.org/manual/reference/aggregation/sort/ + * @param {Object|String} arg + * @return {Aggregate} this + * @api public + */ + +Aggregate.prototype.sort = function(arg) { + // TODO refactor to reuse the query builder logic + + const sort = {}; + + if (getConstructorName(arg) === 'Object') { + const desc = ['desc', 'descending', -1]; + Object.keys(arg).forEach(function(field) { + // If sorting by text score, skip coercing into 1/-1 + if (arg[field] instanceof Object && arg[field].$meta) { + sort[field] = arg[field]; + return; + } + sort[field] = desc.indexOf(arg[field]) === -1 ? 1 : -1; + }); + } else if (arguments.length === 1 && typeof arg === 'string') { + arg.split(/\s+/).forEach(function(field) { + if (!field) { + return; + } + const ascend = field[0] === '-' ? -1 : 1; + if (ascend === -1) { + field = field.substring(1); + } + sort[field] = ascend; + }); + } else { + throw new TypeError('Invalid sort() argument. Must be a string or object.'); + } + + return this.append({ $sort: sort }); +}; + +/** + * Sets the readPreference option for the aggregation query. + * + * ####Example: + * + * Model.aggregate(..).read('primaryPreferred').exec(callback) + * + * @param {String} pref one of the listed preference options or their aliases + * @param {Array} [tags] optional tags for this query + * @return {Aggregate} this + * @api public + * @see mongodb http://docs.mongodb.org/manual/applications/replication/#read-preference + * @see driver http://mongodb.github.com/node-mongodb-native/driver-articles/anintroductionto1_1and2_2.html#read-preferences + */ + +Aggregate.prototype.read = function(pref, tags) { + if (!this.options) { + this.options = {}; + } + read.call(this, pref, tags); + return this; +}; + +/** + * Sets the readConcern level for the aggregation query. + * + * ####Example: + * + * Model.aggregate(..).readConcern('majority').exec(callback) + * + * @param {String} level one of the listed read concern level or their aliases + * @see mongodb https://docs.mongodb.com/manual/reference/read-concern/ + * @return {Aggregate} this + * @api public + */ + +Aggregate.prototype.readConcern = function(level) { + if (!this.options) { + this.options = {}; + } + readConcern.call(this, level); + return this; +}; + +/** + * Appends a new $redact operator to this aggregate pipeline. + * + * If 3 arguments are supplied, Mongoose will wrap them with if-then-else of $cond operator respectively + * If `thenExpr` or `elseExpr` is string, make sure it starts with $$, like `$$DESCEND`, `$$PRUNE` or `$$KEEP`. + * + * ####Example: + * + * Model.aggregate(...) + * .redact({ + * $cond: { + * if: { $eq: [ '$level', 5 ] }, + * then: '$$PRUNE', + * else: '$$DESCEND' + * } + * }) + * .exec(); + * + * // $redact often comes with $cond operator, you can also use the following syntax provided by mongoose + * Model.aggregate(...) 
+ * .redact({ $eq: [ '$level', 5 ] }, '$$PRUNE', '$$DESCEND') + * .exec(); + * + * @param {Object} expression redact options or conditional expression + * @param {String|Object} [thenExpr] true case for the condition + * @param {String|Object} [elseExpr] false case for the condition + * @return {Aggregate} this + * @see $redact https://docs.mongodb.com/manual/reference/operator/aggregation/redact/ + * @api public + */ + +Aggregate.prototype.redact = function(expression, thenExpr, elseExpr) { + if (arguments.length === 3) { + if ((typeof thenExpr === 'string' && !thenExpr.startsWith('$$')) || + (typeof elseExpr === 'string' && !elseExpr.startsWith('$$'))) { + throw new Error('If thenExpr or elseExpr is string, it must start with $$. e.g. $$DESCEND, $$PRUNE, $$KEEP'); + } + + expression = { + $cond: { + if: expression, + then: thenExpr, + else: elseExpr + } + }; + } else if (arguments.length !== 1) { + throw new TypeError('Invalid arguments'); + } + + return this.append({ $redact: expression }); +}; + +/** + * Execute the aggregation with explain + * + * ####Example: + * + * Model.aggregate(..).explain(callback) + * + * @param {Function} callback + * @return {Promise} + */ + +Aggregate.prototype.explain = function(callback) { + const model = this._model; + + return promiseOrCallback(callback, cb => { + if (!this._pipeline.length) { + const err = new Error('Aggregate has empty pipeline'); + return cb(err); + } + + prepareDiscriminatorPipeline(this); + + model.hooks.execPre('aggregate', this, error => { + if (error) { + const _opts = { error: error }; + return model.hooks.execPost('aggregate', this, [null], _opts, error => { + cb(error); + }); + } + + this.options.explain = true; + + model.collection.aggregate(this._pipeline, this.options, (error, cursor) => { + if (error != null) { + const _opts = { error: error }; + return model.hooks.execPost('aggregate', this, [null], _opts, error => { + cb(error); + }); + } + cursor.explain((error, result) => { + const _opts = { error: error }; + return model.hooks.execPost('aggregate', this, [result], _opts, error => { + if (error) { + return cb(error); + } + return cb(null, result); + }); + }); + }); + }); + }, model.events); +}; + +/** + * Sets the allowDiskUse option for the aggregation query (ignored for < 2.6.0) + * + * ####Example: + * + * await Model.aggregate([{ $match: { foo: 'bar' } }]).allowDiskUse(true); + * + * @param {Boolean} value Should tell server it can use hard drive to store data during aggregation. + * @param {Array} [tags] optional tags for this query + * @see mongodb http://docs.mongodb.org/manual/reference/command/aggregate/ + */ + +Aggregate.prototype.allowDiskUse = function(value) { + this.options.allowDiskUse = value; + return this; +}; + +/** + * Sets the hint option for the aggregation query (ignored for < 3.6.0) + * + * ####Example: + * + * Model.aggregate(..).hint({ qty: 1, category: 1 }).exec(callback) + * + * @param {Object|String} value a hint object or the index name + * @see mongodb http://docs.mongodb.org/manual/reference/command/aggregate/ + */ + +Aggregate.prototype.hint = function(value) { + this.options.hint = value; + return this; +}; + +/** + * Sets the session for this aggregation. Useful for [transactions](/docs/transactions.html). 
+ * + * ####Example: + * + * const session = await Model.startSession(); + * await Model.aggregate(..).session(session); + * + * @param {ClientSession} session + * @see mongodb http://docs.mongodb.org/manual/reference/command/aggregate/ + */ + +Aggregate.prototype.session = function(session) { + if (session == null) { + delete this.options.session; + } else { + this.options.session = session; + } + return this; +}; + +/** + * Lets you set arbitrary options, for middleware or plugins. + * + * ####Example: + * + * const agg = Model.aggregate(..).option({ allowDiskUse: true }); // Set the `allowDiskUse` option + * agg.options; // `{ allowDiskUse: true }` + * + * @param {Object} options keys to merge into current options + * @param [options.maxTimeMS] number limits the time this aggregation will run, see [MongoDB docs on `maxTimeMS`](https://docs.mongodb.com/manual/reference/operator/meta/maxTimeMS/) + * @param [options.allowDiskUse] boolean if true, the MongoDB server will use the hard drive to store data during this aggregation + * @param [options.collation] object see [`Aggregate.prototype.collation()`](./docs/api.html#aggregate_Aggregate-collation) + * @param [options.session] ClientSession see [`Aggregate.prototype.session()`](./docs/api.html#aggregate_Aggregate-session) + * @see mongodb http://docs.mongodb.org/manual/reference/command/aggregate/ + * @return {Aggregate} this + * @api public + */ + +Aggregate.prototype.option = function(value) { + for (const key in value) { + this.options[key] = value[key]; + } + return this; +}; + +/** + * Sets the cursor option option for the aggregation query (ignored for < 2.6.0). + * Note the different syntax below: .exec() returns a cursor object, and no callback + * is necessary. + * + * ####Example: + * + * const cursor = Model.aggregate(..).cursor({ batchSize: 1000 }).exec(); + * cursor.eachAsync(function(doc, i) { + * // use doc + * }); + * + * @param {Object} options + * @param {Number} options.batchSize set the cursor batch size + * @param {Boolean} [options.useMongooseAggCursor] use experimental mongoose-specific aggregation cursor (for `eachAsync()` and other query cursor semantics) + * @return {Aggregate} this + * @api public + * @see mongodb http://mongodb.github.io/node-mongodb-native/2.0/api/AggregationCursor.html + */ + +Aggregate.prototype.cursor = function(options) { + if (!this.options) { + this.options = {}; + } + this.options.cursor = options || {}; + return new AggregationCursor(this); // return this; +}; + +/** + * Adds a collation + * + * ####Example: + * + * Model.aggregate(..).collation({ locale: 'en_US', strength: 1 }).exec(); + * + * @param {Object} collation options + * @return {Aggregate} this + * @api public + * @see mongodb http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#aggregate + */ + +Aggregate.prototype.collation = function(collation) { + if (!this.options) { + this.options = {}; + } + this.options.collation = collation; + return this; +}; + +/** + * Combines multiple aggregation pipelines. + * + * ####Example: + * + * Model.aggregate(...) 
+ * .facet({ + * books: [{ groupBy: '$author' }], + * price: [{ $bucketAuto: { groupBy: '$price', buckets: 2 } }] + * }) + * .exec(); + * + * // Output: { books: [...], price: [{...}, {...}] } + * + * @param {Object} facet options + * @return {Aggregate} this + * @see $facet https://docs.mongodb.com/v3.4/reference/operator/aggregation/facet/ + * @api public + */ + +Aggregate.prototype.facet = function(options) { + return this.append({ $facet: options }); +}; + +/** + * Helper for [Atlas Text Search](https://docs.atlas.mongodb.com/reference/atlas-search/tutorial/)'s + * `$search` stage. + * + * ####Example: + * + * Model.aggregate(). + * search({ + * text: { + * query: 'baseball', + * path: 'plot' + * } + * }); + * + * // Output: [{ plot: '...', title: '...' }] + * + * @param {Object} $search options + * @return {Aggregate} this + * @see $search https://docs.atlas.mongodb.com/reference/atlas-search/tutorial/ + * @api public + */ + +Aggregate.prototype.search = function(options) { + return this.append({ $search: options }); +}; + +/** + * Returns the current pipeline + * + * ####Example: + * + * MyModel.aggregate().match({ test: 1 }).pipeline(); // [{ $match: { test: 1 } }] + * + * @return {Array} + * @api public + */ + + +Aggregate.prototype.pipeline = function() { + return this._pipeline; +}; + +/** + * Executes the aggregate pipeline on the currently bound Model. + * + * ####Example: + * + * aggregate.exec(callback); + * + * // Because a promise is returned, the `callback` is optional. + * const promise = aggregate.exec(); + * promise.then(..); + * + * @see Promise #promise_Promise + * @param {Function} [callback] + * @return {Promise} + * @api public + */ + +Aggregate.prototype.exec = function(callback) { + if (!this._model) { + throw new Error('Aggregate not bound to any Model'); + } + const model = this._model; + const collection = this._model.collection; + + applyGlobalMaxTimeMS(this.options, model); + + if (this.options && this.options.cursor) { + return new AggregationCursor(this); + } + + return promiseOrCallback(callback, cb => { + prepareDiscriminatorPipeline(this); + stringifyFunctionOperators(this._pipeline); + + model.hooks.execPre('aggregate', this, error => { + if (error) { + const _opts = { error: error }; + return model.hooks.execPost('aggregate', this, [null], _opts, error => { + cb(error); + }); + } + if (!this._pipeline.length) { + return cb(new Error('Aggregate has empty pipeline')); + } + + const options = utils.clone(this.options || {}); + + collection.aggregate(this._pipeline, options, (err, cursor) => { + if (err != null) { + return cb(err); + } + + cursor.toArray((error, result) => { + const _opts = { error: error }; + model.hooks.execPost('aggregate', this, [result], _opts, (error, result) => { + if (error) { + return cb(error); + } + + cb(null, result); + }); + }); + }); + }); + }, model.events); +}; + +/** + * Provides promise for aggregate. + * + * ####Example: + * + * Model.aggregate(..).then(successCallback, errorCallback); + * + * @see Promise #promise_Promise + * @param {Function} [resolve] successCallback + * @param {Function} [reject] errorCallback + * @return {Promise} + */ +Aggregate.prototype.then = function(resolve, reject) { + return this.exec().then(resolve, reject); +}; + +/** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + * Like [`.then()`](#query_Query-then), but only takes a rejection handler. 
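Two ways to consume an aggregation per the `exec()`, `then()`, and `cursor()` implementations above — note that in this vendored version `cursor()` returns an `AggregationCursor` directly, so no extra `exec()` call is needed (model and fields remain hypothetical):

```js
// 1. Promise-based execution.
const stats = await Task.aggregate([{ $count: 'total' }]).exec();

// 2. Streaming execution through an AggregationCursor.
await Task.aggregate([{ $match: { done: false } }])
  .cursor({ batchSize: 100 })
  .eachAsync(doc => console.log(doc.title));
```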
+ * + * @param {Function} [reject] + * @return {Promise} + * @api public + */ + +Aggregate.prototype.catch = function(reject) { + return this.exec().then(null, reject); +}; + +/** + * Returns an asyncIterator for use with [`for/await/of` loops](https://thecodebarbarian.com/getting-started-with-async-iterators-in-node-js + * You do not need to call this function explicitly, the JavaScript runtime + * will call it for you. + * + * ####Example + * + * const agg = Model.aggregate([{ $match: { age: { $gte: 25 } } }]); + * for await (const doc of agg) { + * console.log(doc.name); + * } + * + * Node.js 10.x supports async iterators natively without any flags. You can + * enable async iterators in Node.js 8.x using the [`--harmony_async_iteration` flag](https://github.com/tc39/proposal-async-iteration/issues/117#issuecomment-346695187). + * + * **Note:** This function is not set if `Symbol.asyncIterator` is undefined. If + * `Symbol.asyncIterator` is undefined, that means your Node.js version does not + * support async iterators. + * + * @method Symbol.asyncIterator + * @memberOf Aggregate + * @instance + * @api public + */ + +if (Symbol.asyncIterator != null) { + Aggregate.prototype[Symbol.asyncIterator] = function() { + return this.cursor({ useMongooseAggCursor: true }). + transformNull(). + _transformForAsyncIterator(); + }; +} + +/*! + * Helpers + */ + +/** + * Checks whether an object is likely a pipeline operator + * + * @param {Object} obj object to check + * @return {Boolean} + * @api private + */ + +function isOperator(obj) { + if (typeof obj !== 'object') { + return false; + } + + const k = Object.keys(obj); + + return k.length === 1 && k.some(key => { return key[0] === '$'; }); +} + +/*! + * Adds the appropriate `$match` pipeline step to the top of an aggregate's + * pipeline, should it's model is a non-root discriminator type. This is + * analogous to the `prepareDiscriminatorCriteria` function in `lib/query.js`. + * + * @param {Aggregate} aggregate Aggregate to prepare + */ + +Aggregate._prepareDiscriminatorPipeline = prepareDiscriminatorPipeline; + +function prepareDiscriminatorPipeline(aggregate) { + const schema = aggregate._model.schema; + const discriminatorMapping = schema && schema.discriminatorMapping; + + if (discriminatorMapping && !discriminatorMapping.isRoot) { + const originalPipeline = aggregate._pipeline; + const discriminatorKey = discriminatorMapping.key; + const discriminatorValue = discriminatorMapping.value; + + // If the first pipeline stage is a match and it doesn't specify a `__t` + // key, add the discriminator key to it. This allows for potential + // aggregation query optimizations not to be disturbed by this feature. 
+ if (originalPipeline[0] != null && originalPipeline[0].$match && !originalPipeline[0].$match[discriminatorKey]) { + originalPipeline[0].$match[discriminatorKey] = discriminatorValue; + // `originalPipeline` is a ref, so there's no need for + // aggregate._pipeline = originalPipeline + } else if (originalPipeline[0] != null && originalPipeline[0].$geoNear) { + originalPipeline[0].$geoNear.query = + originalPipeline[0].$geoNear.query || {}; + originalPipeline[0].$geoNear.query[discriminatorKey] = discriminatorValue; + } else if (originalPipeline[0] != null && originalPipeline[0].$search) { + if (originalPipeline[1] && originalPipeline[1].$match != null) { + originalPipeline[1].$match[discriminatorKey] = originalPipeline[1].$match[discriminatorKey] || discriminatorValue; + } else { + const match = {}; + match[discriminatorKey] = discriminatorValue; + originalPipeline.splice(1, 0, { $match: match }); + } + } else { + const match = {}; + match[discriminatorKey] = discriminatorValue; + aggregate._pipeline.unshift({ $match: match }); + } + } +} + +/*! + * Exports + */ + +module.exports = Aggregate; diff --git a/node_modules/mongoose/lib/browser.js b/node_modules/mongoose/lib/browser.js new file mode 100644 index 000000000..f71a4793c --- /dev/null +++ b/node_modules/mongoose/lib/browser.js @@ -0,0 +1,158 @@ +/* eslint-env browser */ + +'use strict'; + +require('./driver').set(require('./drivers/browser')); + +const DocumentProvider = require('./document_provider.js'); +const PromiseProvider = require('./promise_provider'); + +DocumentProvider.setBrowser(true); + +/** + * The Mongoose [Promise](#promise_Promise) constructor. + * + * @method Promise + * @api public + */ + +Object.defineProperty(exports, 'Promise', { + get: function() { + return PromiseProvider.get(); + }, + set: function(lib) { + PromiseProvider.set(lib); + } +}); + +/** + * Storage layer for mongoose promises + * + * @method PromiseProvider + * @api public + */ + +exports.PromiseProvider = PromiseProvider; + +/** + * The [MongooseError](#error_MongooseError) constructor. + * + * @method Error + * @api public + */ + +exports.Error = require('./error/index'); + +/** + * The Mongoose [Schema](#schema_Schema) constructor + * + * ####Example: + * + * const mongoose = require('mongoose'); + * const Schema = mongoose.Schema; + * const CatSchema = new Schema(..); + * + * @method Schema + * @api public + */ + +exports.Schema = require('./schema'); + +/** + * The various Mongoose Types. + * + * ####Example: + * + * const mongoose = require('mongoose'); + * const array = mongoose.Types.Array; + * + * ####Types: + * + * - [Array](/docs/schematypes.html#arrays) + * - [Buffer](/docs/schematypes.html#buffers) + * - [Embedded](/docs/schematypes.html#schemas) + * - [DocumentArray](/docs/api/documentarraypath.html) + * - [Decimal128](/docs/api.html#mongoose_Mongoose-Decimal128) + * - [ObjectId](/docs/schematypes.html#objectids) + * - [Map](/docs/schematypes.html#maps) + * - [Subdocument](/docs/schematypes.html#schemas) + * + * Using this exposed access to the `ObjectId` type, we can construct ids on demand. + * + * const ObjectId = mongoose.Types.ObjectId; + * const id1 = new ObjectId; + * + * @property Types + * @api public + */ +exports.Types = require('./types'); + +/** + * The Mongoose [VirtualType](#virtualtype_VirtualType) constructor + * + * @method VirtualType + * @api public + */ +exports.VirtualType = require('./virtualtype'); + +/** + * The various Mongoose SchemaTypes. 
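// Illustrative sketch of the browser entry point's exports shown above. Assumes a
// bundler resolves `mongoose` to `lib/browser.js`; in that build only schemas,
// types and validation are available, not queries.
const mongoose = require('mongoose');

const CatSchema = new mongoose.Schema({ name: String });

// `mongoose.Types` exposes the documented type constructors, e.g. ObjectId:
const id = new mongoose.Types.ObjectId();
console.log(id instanceof mongoose.Types.ObjectId); // true
console.log(CatSchema.path('name').instance);       // 'String'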
+ * + * ####Note: + * + * _Alias of mongoose.Schema.Types for backwards compatibility._ + * + * @property SchemaTypes + * @see Schema.SchemaTypes #schema_Schema.Types + * @api public + */ + +exports.SchemaType = require('./schematype.js'); + +/** + * Internal utils + * + * @property utils + * @api private + */ + +exports.utils = require('./utils.js'); + +/** + * The Mongoose browser [Document](/api/document.html) constructor. + * + * @method Document + * @api public + */ +exports.Document = DocumentProvider(); + +/** + * Return a new browser model. In the browser, a model is just + * a simplified document with a schema - it does **not** have + * functions like `findOne()`, etc. + * + * @method model + * @api public + * @param {String} name + * @param {Schema} schema + * @return Class + */ +exports.model = function(name, schema) { + class Model extends exports.Document { + constructor(obj, fields) { + super(obj, schema, fields); + } + } + Model.modelName = name; + + return Model; +}; + +/*! + * Module exports. + */ + +if (typeof window !== 'undefined') { + window.mongoose = module.exports; + window.Buffer = Buffer; +} diff --git a/node_modules/mongoose/lib/browserDocument.js b/node_modules/mongoose/lib/browserDocument.js new file mode 100644 index 000000000..a44f3077e --- /dev/null +++ b/node_modules/mongoose/lib/browserDocument.js @@ -0,0 +1,100 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const NodeJSDocument = require('./document'); +const EventEmitter = require('events').EventEmitter; +const MongooseError = require('./error/index'); +const Schema = require('./schema'); +const ObjectId = require('./types/objectid'); +const ValidationError = MongooseError.ValidationError; +const applyHooks = require('./helpers/model/applyHooks'); +const isObject = require('./helpers/isObject'); + +/** + * Document constructor. + * + * @param {Object} obj the values to set + * @param {Object} [fields] optional object containing the fields which were selected in the query returning this document and any populated paths data + * @param {Boolean} [skipId] bool, should we auto create an ObjectId _id + * @inherits NodeJS EventEmitter http://nodejs.org/api/events.html#events_class_events_eventemitter + * @event `init`: Emitted on a document after it has was retrieved from the db and fully hydrated by Mongoose. + * @event `save`: Emitted when the document is successfully saved + * @api private + */ + +function Document(obj, schema, fields, skipId, skipInit) { + if (!(this instanceof Document)) { + return new Document(obj, schema, fields, skipId, skipInit); + } + + if (isObject(schema) && !schema.instanceOfSchema) { + schema = new Schema(schema); + } + + // When creating EmbeddedDocument, it already has the schema and he doesn't need the _id + schema = this.schema || schema; + + // Generate ObjectId if it is missing, but it requires a scheme + if (!this.schema && schema.options._id) { + obj = obj || {}; + + if (obj._id === undefined) { + obj._id = new ObjectId(); + } + } + + if (!schema) { + throw new MongooseError.MissingSchemaError(); + } + + this.$__setSchema(schema); + + NodeJSDocument.call(this, obj, fields, skipId, skipInit); + + applyHooks(this, schema, { decorateDoc: true }); + + // apply methods + for (const m in schema.methods) { + this[m] = schema.methods[m]; + } + // apply statics + for (const s in schema.statics) { + this[s] = schema.statics[s]; + } +} + +/*! 
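// Illustrative sketch of the simplified browser `model()` defined above: the class
// it returns is a schema-aware Document with no query helpers such as `findOne()`.
// The schema and names here are examples only.
const mongoose = require('mongoose');

const KittenSchema = new mongoose.Schema({ name: String });
const Kitten = mongoose.model('Kitten', KittenSchema); // browser build: Document subclass

const kitty = new Kitten({ name: 'Zildjian' });
console.log(Kitten.modelName); // 'Kitten'
console.log(kitty.name);       // 'Zildjian'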
+ * Inherit from the NodeJS document + */ + +Document.prototype = Object.create(NodeJSDocument.prototype); +Document.prototype.constructor = Document; + +/*! + * ignore + */ + +Document.events = new EventEmitter(); + +/*! + * Browser doc exposes the event emitter API + */ + +Document.$emitter = new EventEmitter(); + +['on', 'once', 'emit', 'listeners', 'removeListener', 'setMaxListeners', + 'removeAllListeners', 'addListener'].forEach(function(emitterFn) { + Document[emitterFn] = function() { + return Document.$emitter[emitterFn].apply(Document.$emitter, arguments); + }; +}); + +/*! + * Module exports. + */ + +Document.ValidationError = ValidationError; +module.exports = exports = Document; diff --git a/node_modules/mongoose/lib/cast.js b/node_modules/mongoose/lib/cast.js new file mode 100644 index 000000000..cb99ed6a4 --- /dev/null +++ b/node_modules/mongoose/lib/cast.js @@ -0,0 +1,369 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const CastError = require('./error/cast'); +const StrictModeError = require('./error/strict'); +const Types = require('./schema/index'); +const castTextSearch = require('./schema/operators/text'); +const get = require('./helpers/get'); +const getConstructorName = require('./helpers/getConstructorName'); +const getSchemaDiscriminatorByValue = require('./helpers/discriminator/getSchemaDiscriminatorByValue'); +const isOperator = require('./helpers/query/isOperator'); +const util = require('util'); +const isObject = require('./helpers/isObject'); +const isMongooseObject = require('./helpers/isMongooseObject'); + +const ALLOWED_GEOWITHIN_GEOJSON_TYPES = ['Polygon', 'MultiPolygon']; + +/** + * Handles internal casting for query filters. + * + * @param {Schema} schema + * @param {Object} obj Object to cast + * @param {Object} options the query options + * @param {Query} context passed to setters + * @api private + */ +module.exports = function cast(schema, obj, options, context) { + if (Array.isArray(obj)) { + throw new Error('Query filter must be an object, got an array ', util.inspect(obj)); + } + + if (obj == null) { + return obj; + } + + // bson 1.x has the unfortunate tendency to remove filters that have a top-level + // `_bsontype` property. But we should still allow ObjectIds because + // `Collection#find()` has a special case to support `find(objectid)`. + // Should remove this when we upgrade to bson 4.x. See gh-8222, gh-8268 + if (obj.hasOwnProperty('_bsontype') && obj._bsontype !== 'ObjectID') { + delete obj._bsontype; + } + + if (schema != null && schema.discriminators != null && obj[schema.options.discriminatorKey] != null) { + schema = getSchemaDiscriminatorByValue(schema, obj[schema.options.discriminatorKey]) || schema; + } + + const paths = Object.keys(obj); + let i = paths.length; + let _keys; + let any$conditionals; + let schematype; + let nested; + let path; + let type; + let val; + + options = options || {}; + + while (i--) { + path = paths[i]; + val = obj[path]; + + if (path === '$or' || path === '$nor' || path === '$and') { + if (!Array.isArray(val)) { + throw new CastError('Array', val, path); + } + for (let k = 0; k < val.length; ++k) { + if (val[k] == null || typeof val[k] !== 'object') { + throw new CastError('Object', val[k], path + '.' 
+ k); + } + val[k] = cast(schema, val[k], options, context); + } + } else if (path === '$where') { + type = typeof val; + + if (type !== 'string' && type !== 'function') { + throw new Error('Must have a string or function for $where'); + } + + if (type === 'function') { + obj[path] = val.toString(); + } + + continue; + } else if (path === '$expr') { + if (typeof val !== 'object' || val == null) { + throw new Error('`$expr` must be an object'); + } + continue; + } else if (path === '$elemMatch') { + val = cast(schema, val, options, context); + } else if (path === '$text') { + val = castTextSearch(val, path); + } else { + if (!schema) { + // no casting for Mixed types + continue; + } + + schematype = schema.path(path); + + // Check for embedded discriminator paths + if (!schematype) { + const split = path.split('.'); + let j = split.length; + while (j--) { + const pathFirstHalf = split.slice(0, j).join('.'); + const pathLastHalf = split.slice(j).join('.'); + const _schematype = schema.path(pathFirstHalf); + const discriminatorKey = get(_schematype, 'schema.options.discriminatorKey'); + + // gh-6027: if we haven't found the schematype but this path is + // underneath an embedded discriminator and the embedded discriminator + // key is in the query, use the embedded discriminator schema + if (_schematype != null && + get(_schematype, 'schema.discriminators') != null && + discriminatorKey != null && + pathLastHalf !== discriminatorKey) { + const discriminatorVal = get(obj, pathFirstHalf + '.' + discriminatorKey); + if (discriminatorVal != null) { + schematype = _schematype.schema.discriminators[discriminatorVal]. + path(pathLastHalf); + } + } + } + } + + if (!schematype) { + // Handle potential embedded array queries + const split = path.split('.'); + let j = split.length; + let pathFirstHalf; + let pathLastHalf; + let remainingConds; + + // Find the part of the var path that is a path of the Schema + while (j--) { + pathFirstHalf = split.slice(0, j).join('.'); + schematype = schema.path(pathFirstHalf); + if (schematype) { + break; + } + } + + // If a substring of the input path resolves to an actual real path... 
+ if (schematype) { + // Apply the casting; similar code for $elemMatch in schema/array.js + if (schematype.caster && schematype.caster.schema) { + remainingConds = {}; + pathLastHalf = split.slice(j).join('.'); + remainingConds[pathLastHalf] = val; + obj[path] = cast(schematype.caster.schema, remainingConds, options, context)[pathLastHalf]; + } else { + obj[path] = val; + } + continue; + } + + if (isObject(val)) { + // handle geo schemas that use object notation + // { loc: { long: Number, lat: Number } + + let geo = ''; + if (val.$near) { + geo = '$near'; + } else if (val.$nearSphere) { + geo = '$nearSphere'; + } else if (val.$within) { + geo = '$within'; + } else if (val.$geoIntersects) { + geo = '$geoIntersects'; + } else if (val.$geoWithin) { + geo = '$geoWithin'; + } + + if (geo) { + const numbertype = new Types.Number('__QueryCasting__'); + let value = val[geo]; + + if (val.$maxDistance != null) { + val.$maxDistance = numbertype.castForQueryWrapper({ + val: val.$maxDistance, + context: context + }); + } + if (val.$minDistance != null) { + val.$minDistance = numbertype.castForQueryWrapper({ + val: val.$minDistance, + context: context + }); + } + + if (geo === '$within') { + const withinType = value.$center + || value.$centerSphere + || value.$box + || value.$polygon; + + if (!withinType) { + throw new Error('Bad $within parameter: ' + JSON.stringify(val)); + } + + value = withinType; + } else if (geo === '$near' && + typeof value.type === 'string' && Array.isArray(value.coordinates)) { + // geojson; cast the coordinates + value = value.coordinates; + } else if ((geo === '$near' || geo === '$nearSphere' || geo === '$geoIntersects') && + value.$geometry && typeof value.$geometry.type === 'string' && + Array.isArray(value.$geometry.coordinates)) { + if (value.$maxDistance != null) { + value.$maxDistance = numbertype.castForQueryWrapper({ + val: value.$maxDistance, + context: context + }); + } + if (value.$minDistance != null) { + value.$minDistance = numbertype.castForQueryWrapper({ + val: value.$minDistance, + context: context + }); + } + if (isMongooseObject(value.$geometry)) { + value.$geometry = value.$geometry.toObject({ + transform: false, + virtuals: false + }); + } + value = value.$geometry.coordinates; + } else if (geo === '$geoWithin') { + if (value.$geometry) { + if (isMongooseObject(value.$geometry)) { + value.$geometry = value.$geometry.toObject({ virtuals: false }); + } + const geoWithinType = value.$geometry.type; + if (ALLOWED_GEOWITHIN_GEOJSON_TYPES.indexOf(geoWithinType) === -1) { + throw new Error('Invalid geoJSON type for $geoWithin "' + + geoWithinType + '", must be "Polygon" or "MultiPolygon"'); + } + value = value.$geometry.coordinates; + } else { + value = value.$box || value.$polygon || value.$center || + value.$centerSphere; + if (isMongooseObject(value)) { + value = value.toObject({ virtuals: false }); + } + } + } + + _cast(value, numbertype, context); + continue; + } + } + + if (schema.nested[path]) { + continue; + } + if (options.upsert && options.strict) { + if (options.strict === 'throw') { + throw new StrictModeError(path); + } + throw new StrictModeError(path, 'Path "' + path + '" is not in ' + + 'schema, strict mode is `true`, and upsert is `true`.'); + } if (options.strictQuery === 'throw') { + throw new StrictModeError(path, 'Path "' + path + '" is not in ' + + 'schema and strictQuery is \'throw\'.'); + } else if (options.strictQuery) { + delete obj[path]; + } + } else if (val == null) { + continue; + } else if (getConstructorName(val) === 'Object') { 
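// Illustrative sketch of what the filter casting above accomplishes at the model
// level: values in a query filter are converted to the schema's types before the
// query reaches MongoDB. The `User` model is an assumption made for the example.
const mongoose = require('mongoose');
const User = mongoose.model('User', new mongoose.Schema({ age: Number }));

User.find({ age: '25' });                  // cast to { age: 25 }
User.find({ age: { $in: ['25', '30'] } }); // $in values are cast to numbers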
+ any$conditionals = Object.keys(val).some(isOperator); + + if (!any$conditionals) { + obj[path] = schematype.castForQueryWrapper({ + val: val, + context: context + }); + } else { + const ks = Object.keys(val); + let $cond; + + let k = ks.length; + + while (k--) { + $cond = ks[k]; + nested = val[$cond]; + + if ($cond === '$not') { + if (nested && schematype) { + _keys = Object.keys(nested); + if (_keys.length && isOperator(_keys[0])) { + for (const key in nested) { + nested[key] = schematype.castForQueryWrapper({ + $conditional: key, + val: nested[key], + context: context + }); + } + } else { + val[$cond] = schematype.castForQueryWrapper({ + $conditional: $cond, + val: nested, + context: context + }); + } + continue; + } + } else { + val[$cond] = schematype.castForQueryWrapper({ + $conditional: $cond, + val: nested, + context: context + }); + } + } + } + } else if (Array.isArray(val) && ['Buffer', 'Array'].indexOf(schematype.instance) === -1) { + const casted = []; + const valuesArray = val; + + for (const _val of valuesArray) { + casted.push(schematype.castForQueryWrapper({ + val: _val, + context: context + })); + } + + obj[path] = { $in: casted }; + } else { + obj[path] = schematype.castForQueryWrapper({ + val: val, + context: context + }); + } + } + } + + return obj; +}; + +function _cast(val, numbertype, context) { + if (Array.isArray(val)) { + val.forEach(function(item, i) { + if (Array.isArray(item) || isObject(item)) { + return _cast(item, numbertype, context); + } + val[i] = numbertype.castForQueryWrapper({ val: item, context: context }); + }); + } else { + const nearKeys = Object.keys(val); + let nearLen = nearKeys.length; + while (nearLen--) { + const nkey = nearKeys[nearLen]; + const item = val[nkey]; + if (Array.isArray(item) || isObject(item)) { + _cast(item, numbertype, context); + val[nkey] = item; + } else { + val[nkey] = numbertype.castForQuery({ val: item, context: context }); + } + } + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/cast/boolean.js b/node_modules/mongoose/lib/cast/boolean.js new file mode 100644 index 000000000..92551d41e --- /dev/null +++ b/node_modules/mongoose/lib/cast/boolean.js @@ -0,0 +1,32 @@ +'use strict'; + +const CastError = require('../error/cast'); + +/*! + * Given a value, cast it to a boolean, or throw a `CastError` if the value + * cannot be casted. `null` and `undefined` are considered valid. + * + * @param {Any} value + * @param {String} [path] optional the path to set on the CastError + * @return {Boolean|null|undefined} + * @throws {CastError} if `value` is not one of the allowed values + * @api private + */ + +module.exports = function castBoolean(value, path) { + if (module.exports.convertToTrue.has(value)) { + return true; + } + if (module.exports.convertToFalse.has(value)) { + return false; + } + + if (value == null) { + return value; + } + + throw new CastError('boolean', value, path); +}; + +module.exports.convertToTrue = new Set([true, 'true', 1, '1', 'yes']); +module.exports.convertToFalse = new Set([false, 'false', 0, '0', 'no']); diff --git a/node_modules/mongoose/lib/cast/date.js b/node_modules/mongoose/lib/cast/date.js new file mode 100644 index 000000000..ac17006df --- /dev/null +++ b/node_modules/mongoose/lib/cast/date.js @@ -0,0 +1,41 @@ +'use strict'; + +const assert = require('assert'); + +module.exports = function castDate(value) { + // Support empty string because of empty form values. 
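// Illustrative sketch of the boolean casting helper added above. It is an internal
// module, so requiring it directly is for demonstration only.
const castBoolean = require('mongoose/lib/cast/boolean');

console.log(castBoolean('yes'));  // true  ('yes' is in `convertToTrue`)
console.log(castBoolean('0'));    // false ('0' is in `convertToFalse`)
console.log(castBoolean(null));   // null  (null/undefined pass through)
try {
  castBoolean('maybe');           // anything else throws a CastError
} catch (err) {
  console.log(err.name);          // 'CastError'
}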
Originally introduced + // in https://github.com/Automattic/mongoose/commit/efc72a1898fc3c33a319d915b8c5463a22938dfe + if (value == null || value === '') { + return null; + } + + if (value instanceof Date) { + assert.ok(!isNaN(value.valueOf())); + + return value; + } + + let date; + + assert.ok(typeof value !== 'boolean'); + + if (value instanceof Number || typeof value === 'number') { + date = new Date(value); + } else if (typeof value === 'string' && !isNaN(Number(value)) && (Number(value) >= 275761 || Number(value) < -271820)) { + // string representation of milliseconds take this path + date = new Date(Number(value)); + } else if (typeof value.valueOf === 'function') { + // support for moment.js. This is also the path strings will take because + // strings have a `valueOf()` + date = new Date(value.valueOf()); + } else { + // fallback + date = new Date(value); + } + + if (!isNaN(date.valueOf())) { + return date; + } + + assert.ok(false); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/cast/decimal128.js b/node_modules/mongoose/lib/cast/decimal128.js new file mode 100644 index 000000000..bfb1578c4 --- /dev/null +++ b/node_modules/mongoose/lib/cast/decimal128.js @@ -0,0 +1,36 @@ +'use strict'; + +const Decimal128Type = require('../types/decimal128'); +const assert = require('assert'); + +module.exports = function castDecimal128(value) { + if (value == null) { + return value; + } + + if (typeof value === 'object' && typeof value.$numberDecimal === 'string') { + return Decimal128Type.fromString(value.$numberDecimal); + } + + if (value instanceof Decimal128Type) { + return value; + } + + if (typeof value === 'string') { + return Decimal128Type.fromString(value); + } + + if (Buffer.isBuffer(value)) { + return new Decimal128Type(value); + } + + if (typeof value === 'number') { + return Decimal128Type.fromString(String(value)); + } + + if (typeof value.valueOf === 'function' && typeof value.valueOf() === 'string') { + return Decimal128Type.fromString(value.valueOf()); + } + + assert.ok(false); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/cast/number.js b/node_modules/mongoose/lib/cast/number.js new file mode 100644 index 000000000..7cd7f848c --- /dev/null +++ b/node_modules/mongoose/lib/cast/number.js @@ -0,0 +1,43 @@ +'use strict'; + +const assert = require('assert'); + +/*! + * Given a value, cast it to a number, or throw a `CastError` if the value + * cannot be casted. `null` and `undefined` are considered valid. 
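// Illustrative sketch of the date and Decimal128 casting helpers added above
// (internal modules; required directly only for demonstration).
const castDate = require('mongoose/lib/cast/date');
const castDecimal128 = require('mongoose/lib/cast/decimal128');

console.log(castDate(''));               // null (empty form values are allowed)
console.log(castDate(1634083200000));    // Date built from a millisecond timestamp
console.log(castDate('2021-10-11'));     // Date parsed from the string's valueOf()

console.log(castDecimal128('3.14'));                      // Decimal128 from a string
console.log(castDecimal128({ $numberDecimal: '2.718' })); // Decimal128 from extended JSON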
+ * + * @param {Any} value + * @param {String} [path] optional the path to set on the CastError + * @return {Boolean|null|undefined} + * @throws {Error} if `value` is not one of the allowed values + * @api private + */ + +module.exports = function castNumber(val) { + if (val == null) { + return val; + } + if (val === '') { + return null; + } + + if (typeof val === 'string' || typeof val === 'boolean') { + val = Number(val); + } + + assert.ok(!isNaN(val)); + if (val instanceof Number) { + return val.valueOf(); + } + if (typeof val === 'number') { + return val; + } + if (!Array.isArray(val) && typeof val.valueOf === 'function') { + return Number(val.valueOf()); + } + if (val.toString && !Array.isArray(val) && val.toString() == Number(val)) { + return Number(val); + } + + assert.ok(false); +}; diff --git a/node_modules/mongoose/lib/cast/objectid.js b/node_modules/mongoose/lib/cast/objectid.js new file mode 100644 index 000000000..67cffb543 --- /dev/null +++ b/node_modules/mongoose/lib/cast/objectid.js @@ -0,0 +1,29 @@ +'use strict'; + +const ObjectId = require('../driver').get().ObjectId; +const assert = require('assert'); + +module.exports = function castObjectId(value) { + if (value == null) { + return value; + } + + if (value instanceof ObjectId) { + return value; + } + + if (value._id) { + if (value._id instanceof ObjectId) { + return value._id; + } + if (value._id.toString instanceof Function) { + return new ObjectId(value._id.toString()); + } + } + + if (value.toString instanceof Function) { + return new ObjectId(value.toString()); + } + + assert.ok(false); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/cast/string.js b/node_modules/mongoose/lib/cast/string.js new file mode 100644 index 000000000..4d89f8e2b --- /dev/null +++ b/node_modules/mongoose/lib/cast/string.js @@ -0,0 +1,37 @@ +'use strict'; + +const CastError = require('../error/cast'); + +/*! + * Given a value, cast it to a string, or throw a `CastError` if the value + * cannot be casted. `null` and `undefined` are considered valid. + * + * @param {Any} value + * @param {String} [path] optional the path to set on the CastError + * @return {string|null|undefined} + * @throws {CastError} + * @api private + */ + +module.exports = function castString(value, path) { + // If null or undefined + if (value == null) { + return value; + } + + // handle documents being passed + if (value._id && typeof value._id === 'string') { + return value._id; + } + + // Re: gh-647 and gh-3030, we're ok with casting using `toString()` + // **unless** its the default Object.toString, because "[object Object]" + // doesn't really qualify as useful data + if (value.toString && + value.toString !== Object.prototype.toString && + !Array.isArray(value)) { + return value.toString(); + } + + throw new CastError('string', value, path); +}; diff --git a/node_modules/mongoose/lib/collection.js b/node_modules/mongoose/lib/collection.js new file mode 100644 index 000000000..91206362c --- /dev/null +++ b/node_modules/mongoose/lib/collection.js @@ -0,0 +1,318 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const EventEmitter = require('events').EventEmitter; +const STATES = require('./connectionstate'); +const immediate = require('./helpers/immediate'); + +/** + * Abstract Collection constructor + * + * This is the base class that drivers inherit from and implement. 
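// Illustrative sketch of the remaining scalar casting helpers added above
// (internal modules; required directly only for demonstration).
const castNumber = require('mongoose/lib/cast/number');
const castString = require('mongoose/lib/cast/string');
const castObjectId = require('mongoose/lib/cast/objectid');

console.log(castNumber('42'));           // 42
console.log(castNumber(''));             // null
console.log(castString(123));            // '123' via toString()
console.log(castString({ _id: 'abc' })); // 'abc' (documents contribute their _id)
console.log(castObjectId('0123456789abcdef01234567')); // ObjectId from a 24-char hex string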
+ * + * @param {String} name name of the collection + * @param {Connection} conn A MongooseConnection instance + * @param {Object} opts optional collection options + * @api public + */ + +function Collection(name, conn, opts) { + if (opts === void 0) { + opts = {}; + } + if (opts.capped === void 0) { + opts.capped = {}; + } + + if (typeof opts.capped === 'number') { + opts.capped = { size: opts.capped }; + } + + this.opts = opts; + this.name = name; + this.collectionName = name; + this.conn = conn; + this.queue = []; + this.buffer = true; + this.emitter = new EventEmitter(); + + if (STATES.connected === this.conn.readyState) { + this.onOpen(); + } +} + +/** + * The collection name + * + * @api public + * @property name + */ + +Collection.prototype.name; + +/** + * The collection name + * + * @api public + * @property collectionName + */ + +Collection.prototype.collectionName; + +/** + * The Connection instance + * + * @api public + * @property conn + */ + +Collection.prototype.conn; + +/** + * Called when the database connects + * + * @api private + */ + +Collection.prototype.onOpen = function() { + this.buffer = false; + immediate(() => this.doQueue()); +}; + +/** + * Called when the database disconnects + * + * @api private + */ + +Collection.prototype.onClose = function() {}; + +/** + * Queues a method for later execution when its + * database connection opens. + * + * @param {String} name name of the method to queue + * @param {Array} args arguments to pass to the method when executed + * @api private + */ + +Collection.prototype.addQueue = function(name, args) { + this.queue.push([name, args]); + return this; +}; + +/** + * Removes a queued method + * + * @param {String} name name of the method to queue + * @param {Array} args arguments to pass to the method when executed + * @api private + */ + +Collection.prototype.removeQueue = function(name, args) { + const index = this.queue.findIndex(v => v[0] === name && v[1] === args); + if (index === -1) { + return false; + } + this.queue.splice(index, 1); + return true; +}; + +/** + * Executes all queued methods and clears the queue. + * + * @api private + */ + +Collection.prototype.doQueue = function() { + for (const method of this.queue) { + if (typeof method[0] === 'function') { + method[0].apply(this, method[1]); + } else { + this[method[0]].apply(this, method[1]); + } + } + this.queue = []; + const _this = this; + immediate(function() { + _this.emitter.emit('queue'); + }); + return this; +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.ensureIndex = function() { + throw new Error('Collection#ensureIndex unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.createIndex = function() { + throw new Error('Collection#createIndex unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.findAndModify = function() { + throw new Error('Collection#findAndModify unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.findOneAndUpdate = function() { + throw new Error('Collection#findOneAndUpdate unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.findOneAndDelete = function() { + throw new Error('Collection#findOneAndDelete unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. 
+ */ + +Collection.prototype.findOneAndReplace = function() { + throw new Error('Collection#findOneAndReplace unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.findOne = function() { + throw new Error('Collection#findOne unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.find = function() { + throw new Error('Collection#find unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.insert = function() { + throw new Error('Collection#insert unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.insertOne = function() { + throw new Error('Collection#insertOne unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.insertMany = function() { + throw new Error('Collection#insertMany unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.save = function() { + throw new Error('Collection#save unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.update = function() { + throw new Error('Collection#update unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.getIndexes = function() { + throw new Error('Collection#getIndexes unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.mapReduce = function() { + throw new Error('Collection#mapReduce unimplemented by driver'); +}; + +/** + * Abstract method that drivers must implement. + */ + +Collection.prototype.watch = function() { + throw new Error('Collection#watch unimplemented by driver'); +}; + +/*! + * ignore + */ + +Collection.prototype._shouldBufferCommands = function _shouldBufferCommands() { + const opts = this.opts; + + if (opts.bufferCommands != null) { + return opts.bufferCommands; + } + if (opts && opts.schemaUserProvidedOptions != null && opts.schemaUserProvidedOptions.bufferCommands != null) { + return opts.schemaUserProvidedOptions.bufferCommands; + } + + return this.conn._shouldBufferCommands(); +}; + +/*! + * ignore + */ + +Collection.prototype._getBufferTimeoutMS = function _getBufferTimeoutMS() { + const conn = this.conn; + const opts = this.opts; + + if (opts.bufferTimeoutMS != null) { + return opts.bufferTimeoutMS; + } + if (opts && opts.schemaUserProvidedOptions != null && opts.schemaUserProvidedOptions.bufferTimeoutMS != null) { + return opts.schemaUserProvidedOptions.bufferTimeoutMS; + } + if (conn.config.bufferTimeoutMS != null) { + return conn.config.bufferTimeoutMS; + } + if (conn.base != null && conn.base.get('bufferTimeoutMS') != null) { + return conn.base.get('bufferTimeoutMS'); + } + return 10000; +}; + +/*! + * Module exports. + */ + +module.exports = Collection; diff --git a/node_modules/mongoose/lib/connection.js b/node_modules/mongoose/lib/connection.js new file mode 100644 index 000000000..f5286b87b --- /dev/null +++ b/node_modules/mongoose/lib/connection.js @@ -0,0 +1,1394 @@ +'use strict'; + +/*! + * Module dependencies. 
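// Illustrative sketch of the buffering knobs resolved by `_shouldBufferCommands()`
// and `_getBufferTimeoutMS()` above: collection options win, then schema options,
// then connection config, then the global setting, then the 10000 ms default.
const mongoose = require('mongoose');

mongoose.set('bufferTimeoutMS', 2500);   // global fallback
mongoose.set('bufferCommands', true);    // buffer operations while disconnected

const TaskSchema = new mongoose.Schema(
  { title: String },
  { bufferCommands: true, bufferTimeoutMS: 1500 } // per-schema overrides
);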
+ */ + +const ChangeStream = require('./cursor/ChangeStream'); +const EventEmitter = require('events').EventEmitter; +const Schema = require('./schema'); +const Collection = require('./driver').get().Collection; +const STATES = require('./connectionstate'); +const MongooseError = require('./error/index'); +const PromiseProvider = require('./promise_provider'); +const ServerSelectionError = require('./error/serverSelection'); +const applyPlugins = require('./helpers/schema/applyPlugins'); +const promiseOrCallback = require('./helpers/promiseOrCallback'); +const get = require('./helpers/get'); +const immediate = require('./helpers/immediate'); +const mongodb = require('mongodb'); +const pkg = require('../package.json'); +const utils = require('./utils'); +const processConnectionOptions = require('./helpers/processConnectionOptions'); + +const arrayAtomicsSymbol = require('./helpers/symbols').arrayAtomicsSymbol; +const sessionNewDocuments = require('./helpers/symbols').sessionNewDocuments; + +/*! + * A list of authentication mechanisms that don't require a password for authentication. + * This is used by the authMechanismDoesNotRequirePassword method. + * + * @api private + */ +const noPasswordAuthMechanisms = [ + 'MONGODB-X509' +]; + +/** + * Connection constructor + * + * For practical reasons, a Connection equals a Db. + * + * @param {Mongoose} base a mongoose instance + * @inherits NodeJS EventEmitter http://nodejs.org/api/events.html#events_class_events_eventemitter + * @event `connecting`: Emitted when `connection.openUri()` is executed on this connection. + * @event `connected`: Emitted when this connection successfully connects to the db. May be emitted _multiple_ times in `reconnected` scenarios. + * @event `open`: Emitted after we `connected` and `onOpen` is executed on all of this connections models. + * @event `disconnecting`: Emitted when `connection.close()` was executed. + * @event `disconnected`: Emitted after getting disconnected from the db. + * @event `close`: Emitted after we `disconnected` and `onClose` executed on all of this connections models. + * @event `reconnected`: Emitted after we `connected` and subsequently `disconnected`, followed by successfully another successful connection. + * @event `error`: Emitted when an error occurs on this connection. + * @event `fullsetup`: Emitted after the driver has connected to primary and all secondaries if specified in the connection string. + * @api public + */ + +function Connection(base) { + this.base = base; + this.collections = {}; + this.models = {}; + this.config = {}; + this.replica = false; + this.options = null; + this.otherDbs = []; // FIXME: To be replaced with relatedDbs + this.relatedDbs = {}; // Hashmap of other dbs that share underlying connection + this.states = STATES; + this._readyState = STATES.disconnected; + this._closeCalled = false; + this._hasOpened = false; + this.plugins = []; + if (typeof base === 'undefined' || !base.connections.length) { + this.id = 0; + } else { + this.id = base.connections.length; + } + this._queue = []; +} + +/*! + * Inherit from EventEmitter + */ + +Connection.prototype.__proto__ = EventEmitter.prototype; + +/** + * Connection ready state + * + * - 0 = disconnected + * - 1 = connected + * - 2 = connecting + * - 3 = disconnecting + * + * Each state change emits its associated event name. 
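// Illustrative sketch of the connection lifecycle events and `readyState` values
// documented above (0 disconnected, 1 connected, 2 connecting, 3 disconnecting).
// The URI is a local placeholder.
const mongoose = require('mongoose');
const conn = mongoose.createConnection('mongodb://localhost:27017/mydb');

conn.on('connected', () => console.log('readyState:', conn.readyState));    // 1
conn.on('disconnected', () => console.log('readyState:', conn.readyState)); // 0
conn.on('error', err => console.error('connection error:', err));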
+ * + * ####Example + * + * conn.on('connected', callback); + * conn.on('disconnected', callback); + * + * @property readyState + * @memberOf Connection + * @instance + * @api public + */ + +Object.defineProperty(Connection.prototype, 'readyState', { + get: function() { + return this._readyState; + }, + set: function(val) { + if (!(val in STATES)) { + throw new Error('Invalid connection state: ' + val); + } + + if (this._readyState !== val) { + this._readyState = val; + // [legacy] loop over the otherDbs on this connection and change their state + for (const db of this.otherDbs) { + db.readyState = val; + } + + if (STATES.connected === val) { + this._hasOpened = true; + } + + this.emit(STATES[val]); + } + } +}); + +/** + * Gets the value of the option `key`. Equivalent to `conn.options[key]` + * + * ####Example: + * + * conn.get('test'); // returns the 'test' value + * + * @param {String} key + * @method get + * @api public + */ + +Connection.prototype.get = function(key) { + if (this.config.hasOwnProperty(key)) { + return this.config[key]; + } + + return get(this.options, key); +}; + +/** + * Sets the value of the option `key`. Equivalent to `conn.options[key] = val` + * + * Supported options include: + * + * - `maxTimeMS`: Set [`maxTimeMS`](/docs/api.html#query_Query-maxTimeMS) for all queries on this connection. + * + * ####Example: + * + * conn.set('test', 'foo'); + * conn.get('test'); // 'foo' + * conn.options.test; // 'foo' + * + * @param {String} key + * @param {Any} val + * @method set + * @api public + */ + +Connection.prototype.set = function(key, val) { + if (this.config.hasOwnProperty(key)) { + this.config[key] = val; + return val; + } + + this.options = this.options || {}; + this.options[key] = val; + return val; +}; + +/** + * A hash of the collections associated with this connection + * + * @property collections + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.collections; + +/** + * The name of the database this connection points to. + * + * ####Example + * + * mongoose.createConnection('mongodb://localhost:27017/mydb').name; // "mydb" + * + * @property name + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.name; + +/** + * A [POJO](https://masteringjs.io/tutorials/fundamentals/pojo) containing + * a map from model names to models. Contains all models that have been + * added to this connection using [`Connection#model()`](/docs/api/connection.html#connection_Connection-model). + * + * ####Example + * + * const conn = mongoose.createConnection(); + * const Test = conn.model('Test', mongoose.Schema({ name: String })); + * + * Object.keys(conn.models).length; // 1 + * conn.models.Test === Test; // true + * + * @property models + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.models; + +/** + * A number identifier for this connection. Used for debugging when + * you have [multiple connections](/docs/connections.html#multiple_connections). + * + * ####Example + * + * // The default connection has `id = 0` + * mongoose.connection.id; // 0 + * + * // If you create a new connection, Mongoose increments id + * const conn = mongoose.createConnection(); + * conn.id; // 1 + * + * @property id + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.id; + +/** + * The plugins that will be applied to all models created on this connection. 
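// Illustrative sketch of the per-connection option helpers and the `models` map
// documented above. The URI and model name are placeholders.
const mongoose = require('mongoose');
const conn = mongoose.createConnection('mongodb://localhost:27017/mydb');

conn.set('maxTimeMS', 2000);
console.log(conn.get('maxTimeMS')); // 2000

const Test = conn.model('Test', new mongoose.Schema({ name: String }));
console.log(Object.keys(conn.models));  // ['Test']
console.log(conn.models.Test === Test); // true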
+ * + * ####Example: + * + * const db = mongoose.createConnection('mongodb://localhost:27017/mydb'); + * db.plugin(() => console.log('Applied')); + * db.plugins.length; // 1 + * + * db.model('Test', new Schema({})); // Prints "Applied" + * + * @property plugins + * @memberOf Connection + * @instance + * @api public + */ + +Object.defineProperty(Connection.prototype, 'plugins', { + configurable: false, + enumerable: true, + writable: true +}); + +/** + * The host name portion of the URI. If multiple hosts, such as a replica set, + * this will contain the first host name in the URI + * + * ####Example + * + * mongoose.createConnection('mongodb://localhost:27017/mydb').host; // "localhost" + * + * @property host + * @memberOf Connection + * @instance + * @api public + */ + +Object.defineProperty(Connection.prototype, 'host', { + configurable: true, + enumerable: true, + writable: true +}); + +/** + * The port portion of the URI. If multiple hosts, such as a replica set, + * this will contain the port from the first host name in the URI. + * + * ####Example + * + * mongoose.createConnection('mongodb://localhost:27017/mydb').port; // 27017 + * + * @property port + * @memberOf Connection + * @instance + * @api public + */ + +Object.defineProperty(Connection.prototype, 'port', { + configurable: true, + enumerable: true, + writable: true +}); + +/** + * The username specified in the URI + * + * ####Example + * + * mongoose.createConnection('mongodb://val:psw@localhost:27017/mydb').user; // "val" + * + * @property user + * @memberOf Connection + * @instance + * @api public + */ + +Object.defineProperty(Connection.prototype, 'user', { + configurable: true, + enumerable: true, + writable: true +}); + +/** + * The password specified in the URI + * + * ####Example + * + * mongoose.createConnection('mongodb://val:psw@localhost:27017/mydb').pass; // "psw" + * + * @property pass + * @memberOf Connection + * @instance + * @api public + */ + +Object.defineProperty(Connection.prototype, 'pass', { + configurable: true, + enumerable: true, + writable: true +}); + +/** + * The mongodb.Db instance, set when the connection is opened + * + * @property db + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.db; + +/** + * The MongoClient instance this connection uses to talk to MongoDB. Mongoose automatically sets this property + * when the connection is opened. + * + * @property client + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.client; + +/** + * A hash of the global options that are associated with this connection + * + * @property config + * @memberOf Connection + * @instance + * @api public + */ + +Connection.prototype.config; + +/** + * Helper for `createCollection()`. Will explicitly create the given collection + * with specified options. Used to create [capped collections](https://docs.mongodb.com/manual/core/capped-collections/) + * and [views](https://docs.mongodb.com/manual/core/views/) from mongoose. 
+ * + * Options are passed down without modification to the [MongoDB driver's `createCollection()` function](http://mongodb.github.io/node-mongodb-native/2.2/api/Db.html#createCollection) + * + * @method createCollection + * @param {string} collection The collection to create + * @param {Object} [options] see [MongoDB driver docs](http://mongodb.github.io/node-mongodb-native/2.2/api/Db.html#createCollection) + * @param {Function} [callback] + * @return {Promise} + * @api public + */ + +Connection.prototype.createCollection = _wrapConnHelper(function createCollection(collection, options, cb) { + if (typeof options === 'function') { + cb = options; + options = {}; + } + this.db.createCollection(collection, options, cb); +}); + +/** + * _Requires MongoDB >= 3.6.0._ Starts a [MongoDB session](https://docs.mongodb.com/manual/release-notes/3.6/#client-sessions) + * for benefits like causal consistency, [retryable writes](https://docs.mongodb.com/manual/core/retryable-writes/), + * and [transactions](http://thecodebarbarian.com/a-node-js-perspective-on-mongodb-4-transactions.html). + * + * ####Example: + * + * const session = await conn.startSession(); + * let doc = await Person.findOne({ name: 'Ned Stark' }, null, { session }); + * await doc.remove(); + * // `doc` will always be null, even if reading from a replica set + * // secondary. Without causal consistency, it is possible to + * // get a doc back from the below query if the query reads from a + * // secondary that is experiencing replication lag. + * doc = await Person.findOne({ name: 'Ned Stark' }, null, { session, readPreference: 'secondary' }); + * + * + * @method startSession + * @param {Object} [options] see the [mongodb driver options](http://mongodb.github.io/node-mongodb-native/3.0/api/MongoClient.html#startSession) + * @param {Boolean} [options.causalConsistency=true] set to false to disable causal consistency + * @param {Function} [callback] + * @return {Promise} promise that resolves to a MongoDB driver `ClientSession` + * @api public + */ + +Connection.prototype.startSession = _wrapConnHelper(function startSession(options, cb) { + if (typeof options === 'function') { + cb = options; + options = null; + } + const session = this.client.startSession(options); + cb(null, session); +}); + +/** + * _Requires MongoDB >= 3.6.0._ Executes the wrapped async function + * in a transaction. Mongoose will commit the transaction if the + * async function executes successfully and attempt to retry if + * there was a retriable error. + * + * Calls the MongoDB driver's [`session.withTransaction()`](http://mongodb.github.io/node-mongodb-native/3.5/api/ClientSession.html#withTransaction), + * but also handles resetting Mongoose document state as shown below. + * + * ####Example: + * + * const doc = new Person({ name: 'Will Riker' }); + * await db.transaction(async function setRank(session) { + * doc.rank = 'Captain'; + * await doc.save({ session }); + * doc.isNew; // false + * + * // Throw an error to abort the transaction + * throw new Error('Oops!'); + * },{ readPreference: 'primary' }).catch(() => {}); + * + * // true, `transaction()` reset the document's state because the + * // transaction was aborted. 
+ * doc.isNew; + * + * @method transaction + * @param {Function} fn Function to execute in a transaction + * @param {mongodb.TransactionOptions} [options] Optional settings for the transaction + * @return {Promise} promise that is fulfilled if Mongoose successfully committed the transaction, or rejects if the transaction was aborted or if Mongoose failed to commit the transaction. If fulfilled, the promise resolves to a MongoDB command result. + * @api public + */ + +Connection.prototype.transaction = function transaction(fn, options) { + return this.startSession().then(session => { + session[sessionNewDocuments] = new Map(); + return session.withTransaction(() => fn(session), options). + then(res => { + delete session[sessionNewDocuments]; + return res; + }). + catch(err => { + // If transaction was aborted, we need to reset newly + // inserted documents' `isNew`. + for (const doc of session[sessionNewDocuments].keys()) { + const state = session[sessionNewDocuments].get(doc); + if (state.hasOwnProperty('isNew')) { + doc.$isNew = state.$isNew; + } + if (state.hasOwnProperty('versionKey')) { + doc.set(doc.schema.options.versionKey, state.versionKey); + } + + for (const path of state.modifiedPaths) { + doc.$__.activePaths.paths[path] = 'modify'; + doc.$__.activePaths.states.modify[path] = true; + } + + for (const path of state.atomics.keys()) { + const val = doc.$__getValue(path); + if (val == null) { + continue; + } + val[arrayAtomicsSymbol] = state.atomics.get(path); + } + } + delete session[sessionNewDocuments]; + throw err; + }); + }); +}; + +/** + * Helper for `dropCollection()`. Will delete the given collection, including + * all documents and indexes. + * + * @method dropCollection + * @param {string} collection The collection to delete + * @param {Function} [callback] + * @return {Promise} + * @api public + */ + +Connection.prototype.dropCollection = _wrapConnHelper(function dropCollection(collection, cb) { + this.db.dropCollection(collection, cb); +}); + +/** + * Helper for `dropDatabase()`. Deletes the given database, including all + * collections, documents, and indexes. + * + * ####Example: + * + * const conn = mongoose.createConnection('mongodb://localhost:27017/mydb'); + * // Deletes the entire 'mydb' database + * await conn.dropDatabase(); + * + * @method dropDatabase + * @param {Function} [callback] + * @return {Promise} + * @api public + */ + +Connection.prototype.dropDatabase = _wrapConnHelper(function dropDatabase(cb) { + // If `dropDatabase()` is called, this model's collection will not be + // init-ed. It is sufficiently common to call `dropDatabase()` after + // `mongoose.connect()` but before creating models that we want to + // support this. See gh-6967 + for (const name of Object.keys(this.models)) { + delete this.models[name].$init; + } + this.db.dropDatabase(cb); +}); + +/*! + * ignore + */ + +function _wrapConnHelper(fn) { + return function() { + const cb = arguments.length > 0 ? arguments[arguments.length - 1] : null; + const argsWithoutCb = typeof cb === 'function' ? 
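// Illustrative sketch of `startSession()`/`transaction()` as documented above.
// Assumes a replica-set deployment (sessions/transactions require MongoDB >= 3.6/4.0)
// and a hypothetical `Person` model bound to this connection.
const mongoose = require('mongoose');
const conn = mongoose.createConnection('mongodb://localhost:27017/mydb?replicaSet=rs0');
const Person = conn.model('Person', new mongoose.Schema({ name: String, rank: String }));

async function promote(name) {
  await conn.transaction(async (session) => {
    const doc = await Person.findOne({ name }).session(session);
    doc.rank = 'Captain';
    await doc.save({ session }); // rolled back if the callback throws
  });
}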
+ Array.prototype.slice.call(arguments, 0, arguments.length - 1) : + Array.prototype.slice.call(arguments); + const disconnectedError = new MongooseError('Connection ' + this.id + + ' was disconnected when calling `' + fn.name + '`'); + + return promiseOrCallback(cb, cb => { + immediate(() => { + if ((this.readyState === STATES.connecting || this.readyState === STATES.disconnected) && this._shouldBufferCommands()) { + this._queue.push({ fn: fn, ctx: this, args: argsWithoutCb.concat([cb]) }); + } else if (this.readyState === STATES.disconnected && this.db == null) { + cb(disconnectedError); + } else { + try { + fn.apply(this, argsWithoutCb.concat([cb])); + } catch (err) { + return cb(err); + } + } + }); + }); + }; +} + +/*! + * ignore + */ + +Connection.prototype._shouldBufferCommands = function _shouldBufferCommands() { + if (this.config.bufferCommands != null) { + return this.config.bufferCommands; + } + if (this.base.get('bufferCommands') != null) { + return this.base.get('bufferCommands'); + } + return true; +}; + +/** + * error + * + * Graceful error handling, passes error to callback + * if available, else emits error on the connection. + * + * @param {Error} err + * @param {Function} callback optional + * @api private + */ + +Connection.prototype.error = function(err, callback) { + if (callback) { + callback(err); + return null; + } + if (this.listeners('error').length > 0) { + this.emit('error', err); + } + return Promise.reject(err); +}; + +/** + * Called when the connection is opened + * + * @api private + */ + +Connection.prototype.onOpen = function() { + this.readyState = STATES.connected; + + for (const d of this._queue) { + d.fn.apply(d.ctx, d.args); + } + this._queue = []; + + // avoid having the collection subscribe to our event emitter + // to prevent 0.3 warning + for (const i in this.collections) { + if (utils.object.hasOwnProperty(this.collections, i)) { + this.collections[i].onOpen(); + } + } + + this.emit('open'); +}; + +/** + * Opens the connection with a URI using `MongoClient.connect()`. + * + * @param {String} uri The URI to connect with. + * @param {Object} [options] Passed on to http://mongodb.github.io/node-mongodb-native/2.2/api/MongoClient.html#connect + * @param {Boolean} [options.bufferCommands=true] Mongoose specific option. Set to false to [disable buffering](http://mongoosejs.com/docs/faq.html#callback_never_executes) on all models associated with this connection. + * @param {Number} [options.bufferTimeoutMS=10000] Mongoose specific option. If `bufferCommands` is true, Mongoose will throw an error after `bufferTimeoutMS` if the operation is still buffered. + * @param {String} [options.dbName] The name of the database we want to use. If not provided, use database name from connection string. + * @param {String} [options.user] username for authentication, equivalent to `options.auth.user`. Maintained for backwards compatibility. + * @param {String} [options.pass] password for authentication, equivalent to `options.auth.password`. Maintained for backwards compatibility. + * @param {Number} [options.maxPoolSize=100] The maximum number of sockets the MongoDB driver will keep open for this connection. Keep in mind that MongoDB only allows one operation per socket at a time, so you may want to increase this if you find you have a few slow queries that are blocking faster queries from proceeding. See [Slow Trains in MongoDB and Node.js](http://thecodebarbarian.com/slow-trains-in-mongodb-and-nodejs). 
+ * @param {Number} [options.minPoolSize=0] The minimum number of sockets the MongoDB driver will keep open for this connection. Keep in mind that MongoDB only allows one operation per socket at a time, so you may want to increase this if you find you have a few slow queries that are blocking faster queries from proceeding. See [Slow Trains in MongoDB and Node.js](http://thecodebarbarian.com/slow-trains-in-mongodb-and-nodejs). + * @param {Number} [options.serverSelectionTimeoutMS] If `useUnifiedTopology = true`, the MongoDB driver will try to find a server to send any given operation to, and keep retrying for `serverSelectionTimeoutMS` milliseconds before erroring out. If not set, the MongoDB driver defaults to using `30000` (30 seconds). + * @param {Number} [options.heartbeatFrequencyMS] If `useUnifiedTopology = true`, the MongoDB driver sends a heartbeat every `heartbeatFrequencyMS` to check on the status of the connection. A heartbeat is subject to `serverSelectionTimeoutMS`, so the MongoDB driver will retry failed heartbeats for up to 30 seconds by default. Mongoose only emits a `'disconnected'` event after a heartbeat has failed, so you may want to decrease this setting to reduce the time between when your server goes down and when Mongoose emits `'disconnected'`. We recommend you do **not** set this setting below 1000, too many heartbeats can lead to performance degradation. + * @param {Boolean} [options.autoIndex=true] Mongoose-specific option. Set to false to disable automatic index creation for all models associated with this connection. + * @param {Class} [options.promiseLibrary] Sets the [underlying driver's promise library](http://mongodb.github.io/node-mongodb-native/3.1/api/MongoClient.html). + * @param {Number} [options.connectTimeoutMS=30000] How long the MongoDB driver will wait before killing a socket due to inactivity _during initial connection_. Defaults to 30000. This option is passed transparently to [Node.js' `socket#setTimeout()` function](https://nodejs.org/api/net.html#net_socket_settimeout_timeout_callback). + * @param {Number} [options.socketTimeoutMS=30000] How long the MongoDB driver will wait before killing a socket due to inactivity _after initial connection_. A socket may be inactive because of either no activity or a long-running operation. This is set to `30000` by default, you should set this to 2-3x your longest running operation if you expect some of your database operations to run longer than 20 seconds. This option is passed to [Node.js `socket#setTimeout()` function](https://nodejs.org/api/net.html#net_socket_settimeout_timeout_callback) after the MongoDB driver successfully completes. + * @param {Number} [options.family=0] Passed transparently to [Node.js' `dns.lookup()`](https://nodejs.org/api/dns.html#dns_dns_lookup_hostname_options_callback) function. May be either `0, `4`, or `6`. `4` means use IPv4 only, `6` means use IPv6 only, `0` means try both. + * @param {Boolean} [options.autoCreate=false] Set to `true` to make Mongoose automatically call `createCollection()` on every model created on this connection. 
+ * @param {Function} [callback] + * @returns {Connection} this + * @api public + */ + +Connection.prototype.openUri = function(uri, options, callback) { + if (typeof options === 'function') { + callback = options; + options = null; + } + + if (['string', 'number'].indexOf(typeof options) !== -1) { + throw new MongooseError('Mongoose 5.x no longer supports ' + + '`mongoose.connect(host, dbname, port)` or ' + + '`mongoose.createConnection(host, dbname, port)`. See ' + + 'http://mongoosejs.com/docs/connections.html for supported connection syntax'); + } + + if (typeof uri !== 'string') { + throw new MongooseError('The `uri` parameter to `openUri()` must be a ' + + `string, got "${typeof uri}". Make sure the first parameter to ` + + '`mongoose.connect()` or `mongoose.createConnection()` is a string.'); + } + + if (callback != null && typeof callback !== 'function') { + throw new MongooseError('3rd parameter to `mongoose.connect()` or ' + + '`mongoose.createConnection()` must be a function, got "' + + typeof callback + '"'); + } + + if (this.readyState === STATES.connecting || this.readyState === STATES.connected) { + if (this._connectionString !== uri) { + throw new MongooseError('Can\'t call `openUri()` on an active connection with ' + + 'different connection strings. Make sure you aren\'t calling `mongoose.connect()` ' + + 'multiple times. See: https://mongoosejs.com/docs/connections.html#multiple_connections'); + } + + if (typeof callback === 'function') { + this.$initialConnection = this.$initialConnection.then( + () => callback(null, this), + err => callback(err) + ); + } + return this; + } + + this._connectionString = uri; + this.readyState = STATES.connecting; + this._closeCalled = false; + + const Promise = PromiseProvider.get(); + const _this = this; + + options = processConnectionOptions(uri, options); + + if (options) { + options = utils.clone(options); + + const autoIndex = options.config && options.config.autoIndex != null ? 
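// Illustrative sketch of `openUri()` with a few of the options documented above.
// The URI is a local placeholder; the returned promise resolves once the driver connects.
const mongoose = require('mongoose');
const conn = mongoose.createConnection();

conn.openUri('mongodb://localhost:27017/mydb', {
  dbName: 'mydb',
  serverSelectionTimeoutMS: 5000,
  autoIndex: false,
  bufferCommands: true
}).then(
  () => console.log('connected to', conn.name),
  err => console.error('failed to connect:', err)
);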
+ options.config.autoIndex : + options.autoIndex; + if (autoIndex != null) { + this.config.autoIndex = autoIndex !== false; + delete options.config; + delete options.autoIndex; + } + + if ('autoCreate' in options) { + this.config.autoCreate = !!options.autoCreate; + delete options.autoCreate; + } + + if ('sanitizeFilter' in options) { + this.config.sanitizeFilter = options.sanitizeFilter; + delete options.sanitizeFilter; + } + + // Backwards compat + if (options.user || options.pass) { + options.auth = options.auth || {}; + options.auth.username = options.user; + options.auth.password = options.pass; + + this.user = options.user; + this.pass = options.pass; + } + delete options.user; + delete options.pass; + + if (options.bufferCommands != null) { + this.config.bufferCommands = options.bufferCommands; + delete options.bufferCommands; + } + } else { + options = {}; + } + + this._connectionOptions = options; + const dbName = options.dbName; + if (dbName != null) { + this.$dbName = dbName; + } + delete options.dbName; + + if (!utils.hasUserDefinedProperty(options, 'driverInfo')) { + options.driverInfo = { + name: 'Mongoose', + version: pkg.version + }; + } + + const promise = new Promise((resolve, reject) => { + let client; + try { + client = new mongodb.MongoClient(uri, options); + } catch (error) { + _this.readyState = STATES.disconnected; + return reject(error); + } + _this.client = client; + client.connect((error) => { + if (error) { + return reject(error); + } + + _setClient(_this, client, options, dbName); + + resolve(_this); + }); + }); + + const serverSelectionError = new ServerSelectionError(); + this.$initialConnection = promise. + then(() => this). + catch(err => { + this.readyState = STATES.disconnected; + if (err != null && err.name === 'MongoServerSelectionError') { + err = serverSelectionError.assimilateError(err); + } + + if (this.listeners('error').length > 0) { + immediate(() => this.emit('error', err)); + } + throw err; + }); + + if (callback != null) { + this.$initialConnection = this.$initialConnection.then( + () => { callback(null, this); return this; }, + err => callback(err) + ); + } + + return this.$initialConnection; +}; + +/*! + * ignore + */ + +function _setClient(conn, client, options, dbName) { + const db = dbName != null ? client.db(dbName) : client.db(); + conn.db = db; + conn.client = client; + conn.host = get(client, 's.options.hosts.0.host', void 0); + conn.port = get(client, 's.options.hosts.0.port', void 0); + conn.name = dbName != null ? dbName : get(client, 's.options.dbName', void 0); + conn._closeCalled = client._closeCalled; + + const _handleReconnect = () => { + // If we aren't disconnected, we assume this reconnect is due to a + // socket timeout. If there's no activity on a socket for + // `socketTimeoutMS`, the driver will attempt to reconnect and emit + // this event. 
+ if (conn.readyState !== STATES.connected) { + conn.readyState = STATES.connected; + conn.emit('reconnect'); + conn.emit('reconnected'); + conn.onOpen(); + } + }; + + const type = get(client, 'topology.description.type', ''); + + if (type === 'Single') { + client.on('serverDescriptionChanged', ev => { + const newDescription = ev.newDescription; + if (newDescription.type === 'Standalone') { + _handleReconnect(); + } else { + conn.readyState = STATES.disconnected; + } + }); + } else if (type.startsWith('ReplicaSet')) { + client.on('topologyDescriptionChanged', ev => { + // Emit disconnected if we've lost connectivity to the primary + const description = ev.newDescription; + if (conn.readyState === STATES.connected && description.type !== 'ReplicaSetWithPrimary') { + // Implicitly emits 'disconnected' + conn.readyState = STATES.disconnected; + } else if (conn.readyState === STATES.disconnected && description.type === 'ReplicaSetWithPrimary') { + _handleReconnect(); + } + }); + } + + conn.onOpen(); + + for (const i in conn.collections) { + if (utils.object.hasOwnProperty(conn.collections, i)) { + conn.collections[i].onOpen(); + } + } +} + +/** + * Closes the connection + * + * @param {Boolean} [force] optional + * @param {Function} [callback] optional + * @return {Promise} + * @api public + */ + +Connection.prototype.close = function(force, callback) { + if (typeof force === 'function') { + callback = force; + force = false; + } + + this.$wasForceClosed = !!force; + + return promiseOrCallback(callback, cb => { + this._close(force, cb); + }); +}; + +/** + * Handles closing the connection + * + * @param {Boolean} force + * @param {Function} callback + * @api private + */ +Connection.prototype._close = function(force, callback) { + const _this = this; + const closeCalled = this._closeCalled; + this._closeCalled = true; + if (this.client != null) { + this.client._closeCalled = true; + } + + switch (this.readyState) { + case STATES.disconnected: + if (closeCalled) { + callback(); + } else { + this.doClose(force, function(err) { + if (err) { + return callback(err); + } + _this.onClose(force); + callback(null); + }); + } + break; + + case STATES.connected: + this.readyState = STATES.disconnecting; + this.doClose(force, function(err) { + if (err) { + return callback(err); + } + _this.onClose(force); + callback(null); + }); + + break; + case STATES.connecting: + this.once('open', function() { + _this.close(callback); + }); + break; + + case STATES.disconnecting: + this.once('close', function() { + callback(); + }); + break; + } + + return this; +}; + +/** + * Called when the connection closes + * + * @api private + */ + +Connection.prototype.onClose = function(force) { + this.readyState = STATES.disconnected; + + // avoid having the collection subscribe to our event emitter + // to prevent 0.3 warning + for (const i in this.collections) { + if (utils.object.hasOwnProperty(this.collections, i)) { + this.collections[i].onClose(force); + } + } + + this.emit('close', force); + + for (const db of this.otherDbs) { + db.close(force); + } +}; + +/** + * Retrieves a collection, creating it if not cached. + * + * Not typically needed by applications. Just talk to your collection through your model. + * + * @param {String} name of the collection + * @param {Object} [options] optional collection options + * @return {Collection} collection instance + * @api public + */ + +Connection.prototype.collection = function(name, options) { + const defaultOptions = { + autoIndex: this.config.autoIndex != null ? 
this.config.autoIndex : this.base.options.autoIndex, + autoCreate: this.config.autoCreate != null ? this.config.autoCreate : this.base.options.autoCreate + }; + options = Object.assign({}, defaultOptions, options ? utils.clone(options) : {}); + options.$wasForceClosed = this.$wasForceClosed; + if (!(name in this.collections)) { + this.collections[name] = new Collection(name, this, options); + } + return this.collections[name]; +}; + +/** + * Declares a plugin executed on all schemas you pass to `conn.model()` + * + * Equivalent to calling `.plugin(fn)` on each schema you create. + * + * ####Example: + * const db = mongoose.createConnection('mongodb://localhost:27017/mydb'); + * db.plugin(() => console.log('Applied')); + * db.plugins.length; // 1 + * + * db.model('Test', new Schema({})); // Prints "Applied" + * + * @param {Function} fn plugin callback + * @param {Object} [opts] optional options + * @return {Connection} this + * @see plugins ./plugins.html + * @api public + */ + +Connection.prototype.plugin = function(fn, opts) { + this.plugins.push([fn, opts]); + return this; +}; + +/** + * Defines or retrieves a model. + * + * const mongoose = require('mongoose'); + * const db = mongoose.createConnection(..); + * db.model('Venue', new Schema(..)); + * const Ticket = db.model('Ticket', new Schema(..)); + * const Venue = db.model('Venue'); + * + * _When no `collection` argument is passed, Mongoose produces a collection name by passing the model `name` to the [utils.toCollectionName](#utils_exports.toCollectionName) method. This method pluralizes the name. If you don't like this behavior, either pass a collection name or set your schemas collection name option._ + * + * ####Example: + * + * const schema = new Schema({ name: String }, { collection: 'actor' }); + * + * // or + * + * schema.set('collection', 'actor'); + * + * // or + * + * const collectionName = 'actor' + * const M = conn.model('Actor', schema, collectionName) + * + * @param {String|Function} name the model name or class extending Model + * @param {Schema} [schema] a schema. necessary when defining a model + * @param {String} [collection] name of mongodb collection (optional) if not given it will be induced from model name + * @param {Object} [options] + * @param {Boolean} [options.overwriteModels=false] If true, overwrite existing models with the same name to avoid `OverwriteModelError` + * @see Mongoose#model #index_Mongoose-model + * @return {Model} The compiled model + * @api public + */ + +Connection.prototype.model = function(name, schema, collection, options) { + if (!(this instanceof Connection)) { + throw new MongooseError('`connection.model()` should not be run with ' + + '`new`. 
If you are doing `new db.model(foo)(bar)`, use ' + + '`db.model(foo)(bar)` instead'); + } + + let fn; + if (typeof name === 'function') { + fn = name; + name = fn.name; + } + + // collection name discovery + if (typeof schema === 'string') { + collection = schema; + schema = false; + } + + if (utils.isObject(schema) && !schema.instanceOfSchema) { + schema = new Schema(schema); + } + if (schema && !schema.instanceOfSchema) { + throw new Error('The 2nd parameter to `mongoose.model()` should be a ' + + 'schema or a POJO'); + } + + const defaultOptions = { cache: false, overwriteModels: this.base.options.overwriteModels }; + const opts = Object.assign(defaultOptions, options, { connection: this }); + if (this.models[name] && !collection && opts.overwriteModels !== true) { + // model exists but we are not subclassing with custom collection + if (schema && schema.instanceOfSchema && schema !== this.models[name].schema) { + throw new MongooseError.OverwriteModelError(name); + } + return this.models[name]; + } + + let model; + + if (schema && schema.instanceOfSchema) { + applyPlugins(schema, this.plugins, null, '$connectionPluginsApplied'); + + // compile a model + model = this.base._model(fn || name, schema, collection, opts); + + // only the first model with this name is cached to allow + // for one-offs with custom collection names etc. + if (!this.models[name]) { + this.models[name] = model; + } + + // Errors handled internally, so safe to ignore error + model.init(function $modelInitNoop() {}); + + return model; + } + + if (this.models[name] && collection) { + // subclassing current model with alternate collection + model = this.models[name]; + schema = model.prototype.schema; + const sub = model.__subclass(this, schema, collection); + // do not cache the sub model + return sub; + } + + if (!model) { + throw new MongooseError.MissingSchemaError(name); + } + + if (this === model.prototype.db + && (!collection || collection === model.collection.name)) { + // model already uses this connection. + + // only the first model with this name is cached to allow + // for one-offs with custom collection names etc. + if (!this.models[name]) { + this.models[name] = model; + } + + return model; + } + this.models[name] = model.__subclass(this, schema, collection); + return this.models[name]; +}; + +/** + * Removes the model named `name` from this connection, if it exists. You can + * use this function to clean up any models you created in your tests to + * prevent OverwriteModelErrors. + * + * ####Example: + * + * conn.model('User', new Schema({ name: String })); + * console.log(conn.model('User')); // Model object + * conn.deleteModel('User'); + * console.log(conn.model('User')); // undefined + * + * // Usually useful in a Mocha `afterEach()` hook + * afterEach(function() { + * conn.deleteModel(/.+/); // Delete every model + * }); + * + * @api public + * @param {String|RegExp} name if string, the name of the model to remove. If regexp, removes all models whose name matches the regexp. 
+ * @return {Connection} this + */ + +Connection.prototype.deleteModel = function(name) { + if (typeof name === 'string') { + const model = this.model(name); + if (model == null) { + return this; + } + const collectionName = model.collection.name; + delete this.models[name]; + delete this.collections[collectionName]; + + this.emit('deleteModel', model); + } else if (name instanceof RegExp) { + const pattern = name; + const names = this.modelNames(); + for (const name of names) { + if (pattern.test(name)) { + this.deleteModel(name); + } + } + } else { + throw new Error('First parameter to `deleteModel()` must be a string ' + + 'or regexp, got "' + name + '"'); + } + + return this; +}; + +/** + * Watches the entire underlying database for changes. Similar to + * [`Model.watch()`](/docs/api/model.html#model_Model.watch). + * + * This function does **not** trigger any middleware. In particular, it + * does **not** trigger aggregate middleware. + * + * The ChangeStream object is an event emitter that emits the following events: + * + * - 'change': A change occurred, see below example + * - 'error': An unrecoverable error occurred. In particular, change streams currently error out if they lose connection to the replica set primary. Follow [this GitHub issue](https://github.com/Automattic/mongoose/issues/6799) for updates. + * - 'end': Emitted if the underlying stream is closed + * - 'close': Emitted if the underlying stream is closed + * + * ####Example: + * + * const User = conn.model('User', new Schema({ name: String })); + * + * const changeStream = conn.watch().on('change', data => console.log(data)); + * + * // Triggers a 'change' event on the change stream. + * await User.create({ name: 'test' }); + * + * @api public + * @param {Array} [pipeline] + * @param {Object} [options] passed without changes to [the MongoDB driver's `Db#watch()` function](https://mongodb.github.io/node-mongodb-native/3.4/api/Db.html#watch) + * @return {ChangeStream} mongoose-specific change stream wrapper, inherits from EventEmitter + */ + +Connection.prototype.watch = function(pipeline, options) { + const disconnectedError = new MongooseError('Connection ' + this.id + + ' was disconnected when calling `watch()`'); + + const changeStreamThunk = cb => { + immediate(() => { + if (this.readyState === STATES.connecting) { + this.once('open', function() { + const driverChangeStream = this.db.watch(pipeline, options); + cb(null, driverChangeStream); + }); + } else if (this.readyState === STATES.disconnected && this.db == null) { + cb(disconnectedError); + } else { + const driverChangeStream = this.db.watch(pipeline, options); + cb(null, driverChangeStream); + } + }); + }; + + const changeStream = new ChangeStream(changeStreamThunk, pipeline, options); + return changeStream; +}; + +/** + * Returns a promise that resolves when this connection + * successfully connects to MongoDB, or rejects if this connection failed + * to connect. + * + * ####Example: + * const conn = await mongoose.createConnection('mongodb://localhost:27017/test'). + * asPromise(); + * conn.readyState; // 1, means Mongoose is connected + * + * @api public + * @return {Promise} + */ + +Connection.prototype.asPromise = function asPromise() { + return this.$initialConnection; +}; + +/** + * Returns an array of model names created on this connection. 
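+ *
+ * ####Example (illustrative sketch; the `User` model is hypothetical):
+ *
+ *     conn.model('User', new Schema({ name: String }));
+ *     conn.modelNames(); // ['User']
+ *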
+ * @api public + * @return {Array} + */ + +Connection.prototype.modelNames = function() { + return Object.keys(this.models); +}; + +/** + * @brief Returns if the connection requires authentication after it is opened. Generally if a + * username and password are both provided then authentication is needed, but in some cases a + * password is not required. + * @api private + * @return {Boolean} true if the connection should be authenticated after it is opened, otherwise false. + */ +Connection.prototype.shouldAuthenticate = function() { + return this.user != null && + (this.pass != null || this.authMechanismDoesNotRequirePassword()); +}; + +/** + * @brief Returns a boolean value that specifies if the current authentication mechanism needs a + * password to authenticate according to the auth objects passed into the openUri methods. + * @api private + * @return {Boolean} true if the authentication mechanism specified in the options object requires + * a password, otherwise false. + */ +Connection.prototype.authMechanismDoesNotRequirePassword = function() { + if (this.options && this.options.auth) { + return noPasswordAuthMechanisms.indexOf(this.options.auth.authMechanism) >= 0; + } + return true; +}; + +/** + * @brief Returns a boolean value that specifies if the provided options object provides enough + * data to authenticate with. Generally this is true if the username and password are both specified + * but in some authentication methods, a password is not required for authentication so only a username + * is required. + * @param {Object} [options] the options object passed into the openUri methods. + * @api private + * @return {Boolean} true if the provided options object provides enough data to authenticate with, + * otherwise false. + */ +Connection.prototype.optionsProvideAuthenticationData = function(options) { + return (options) && + (options.user) && + ((options.pass) || this.authMechanismDoesNotRequirePassword()); +}; + +/** + * Returns the [MongoDB driver `MongoClient`](http://mongodb.github.io/node-mongodb-native/3.5/api/MongoClient.html) instance + * that this connection uses to talk to MongoDB. + * + * ####Example: + * const conn = await mongoose.createConnection('mongodb://localhost:27017/test'); + * + * conn.getClient(); // MongoClient { ... } + * + * @api public + * @return {MongoClient} + */ + +Connection.prototype.getClient = function getClient() { + return this.client; +}; + +/** + * Set the [MongoDB driver `MongoClient`](http://mongodb.github.io/node-mongodb-native/3.5/api/MongoClient.html) instance + * that this connection uses to talk to MongoDB. This is useful if you already have a MongoClient instance, and want to + * reuse it. + * + * ####Example: + * const client = await mongodb.MongoClient.connect('mongodb://localhost:27017/test'); + * + * const conn = mongoose.createConnection().setClient(client); + * + * conn.getClient(); // MongoClient { ... 
} + * conn.readyState; // 1, means 'CONNECTED' + * + * @api public + * @return {Connection} this + */ + +Connection.prototype.setClient = function setClient(client) { + if (!(client instanceof mongodb.MongoClient)) { + throw new MongooseError('Must call `setClient()` with an instance of MongoClient'); + } + if (this.client != null || this.readyState !== STATES.disconnected) { + throw new MongooseError('Cannot call `setClient()` on a connection that is already connected.'); + } + if (client.topology == null) { + throw new MongooseError('Cannot call `setClient()` with a MongoClient that you have not called `connect()` on yet.'); + } + + this._connectionString = client.s.url; + _setClient(this, client, { useUnifiedTopology: client.s.options.useUnifiedTopology }, client.s.options.dbName); + + return this; +}; + +/** + * Switches to a different database using the same connection pool. + * + * Returns a new connection object, with the new db. + * + * @method useDb + * @memberOf Connection + * @param {String} name The database name + * @param {Object} [options] + * @param {Boolean} [options.useCache=false] If true, cache results so calling `useDb()` multiple times with the same name only creates 1 connection object. + * @param {Boolean} [options.noListener=false] If true, the connection object will not make the db listen to events on the original connection. See [issue #9961](https://github.com/Automattic/mongoose/issues/9961). + * @return {Connection} New Connection Object + * @api public + */ + +/*! + * Module exports. + */ + +Connection.STATES = STATES; +module.exports = Connection; diff --git a/node_modules/mongoose/lib/connectionstate.js b/node_modules/mongoose/lib/connectionstate.js new file mode 100644 index 000000000..920f45bee --- /dev/null +++ b/node_modules/mongoose/lib/connectionstate.js @@ -0,0 +1,26 @@ + +/*! + * Connection states + */ + +'use strict'; + +const STATES = module.exports = exports = Object.create(null); + +const disconnected = 'disconnected'; +const connected = 'connected'; +const connecting = 'connecting'; +const disconnecting = 'disconnecting'; +const uninitialized = 'uninitialized'; + +STATES[0] = disconnected; +STATES[1] = connected; +STATES[2] = connecting; +STATES[3] = disconnecting; +STATES[99] = uninitialized; + +STATES[disconnected] = 0; +STATES[connected] = 1; +STATES[connecting] = 2; +STATES[disconnecting] = 3; +STATES[uninitialized] = 99; diff --git a/node_modules/mongoose/lib/cursor/AggregationCursor.js b/node_modules/mongoose/lib/cursor/AggregationCursor.js new file mode 100644 index 000000000..6131bab5d --- /dev/null +++ b/node_modules/mongoose/lib/cursor/AggregationCursor.js @@ -0,0 +1,369 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('../error/mongooseError'); +const Readable = require('stream').Readable; +const promiseOrCallback = require('../helpers/promiseOrCallback'); +const eachAsync = require('../helpers/cursor/eachAsync'); +const immediate = require('../helpers/immediate'); +const util = require('util'); + +/** + * An AggregationCursor is a concurrency primitive for processing aggregation + * results one document at a time. It is analogous to QueryCursor. + * + * An AggregationCursor fulfills the Node.js streams3 API, + * in addition to several other mechanisms for loading documents from MongoDB + * one at a time. + * + * Creating an AggregationCursor executes the model's pre aggregate hooks, + * but **not** the model's post aggregate hooks. 
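+ *
+ * ####Example (a sketch; the `User` model and pipeline here are illustrative):
+ *
+ *     const cursor = User.aggregate([{ $match: { age: { $gte: 21 } } }]).cursor();
+ *     const doc = await cursor.next(); // next result, or `null` when the cursor is exhausted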
+ * + * Unless you're an advanced user, do **not** instantiate this class directly. + * Use [`Aggregate#cursor()`](/docs/api.html#aggregate_Aggregate-cursor) instead. + * + * @param {Aggregate} agg + * @param {Object} options + * @inherits Readable + * @event `cursor`: Emitted when the cursor is created + * @event `error`: Emitted when an error occurred + * @event `data`: Emitted when the stream is flowing and the next doc is ready + * @event `end`: Emitted when the stream is exhausted + * @api public + */ + +function AggregationCursor(agg) { + Readable.call(this, { objectMode: true }); + + this.cursor = null; + this.agg = agg; + this._transforms = []; + const model = agg._model; + delete agg.options.cursor.useMongooseAggCursor; + this._mongooseOptions = {}; + + _init(model, this, agg); +} + +util.inherits(AggregationCursor, Readable); + +/*! + * ignore + */ + +function _init(model, c, agg) { + if (!model.collection.buffer) { + model.hooks.execPre('aggregate', agg, function() { + c.cursor = model.collection.aggregate(agg._pipeline, agg.options || {}); + c.emit('cursor', c.cursor); + }); + } else { + model.collection.emitter.once('queue', function() { + model.hooks.execPre('aggregate', agg, function() { + c.cursor = model.collection.aggregate(agg._pipeline, agg.options || {}); + c.emit('cursor', c.cursor); + }); + }); + } +} + +/*! + * Necessary to satisfy the Readable API + */ + +AggregationCursor.prototype._read = function() { + const _this = this; + _next(this, function(error, doc) { + if (error) { + return _this.emit('error', error); + } + if (!doc) { + _this.push(null); + _this.cursor.close(function(error) { + if (error) { + return _this.emit('error', error); + } + setTimeout(function() { + // on node >= 14 streams close automatically (gh-8834) + const isNotClosedAutomatically = !_this.destroyed; + if (isNotClosedAutomatically) { + _this.emit('close'); + } + }, 0); + }); + return; + } + _this.push(doc); + }); +}; + +if (Symbol.asyncIterator != null) { + const msg = 'Mongoose does not support using async iterators with an ' + + 'existing aggregation cursor. See http://bit.ly/mongoose-async-iterate-aggregation'; + + AggregationCursor.prototype[Symbol.asyncIterator] = function() { + throw new MongooseError(msg); + }; +} + +/** + * Registers a transform function which subsequently maps documents retrieved + * via the streams interface or `.next()` + * + * ####Example + * + * // Map documents returned by `data` events + * Thing. + * find({ name: /^hello/ }). + * cursor(). + * map(function (doc) { + * doc.foo = "bar"; + * return doc; + * }) + * on('data', function(doc) { console.log(doc.foo); }); + * + * // Or map documents returned by `.next()` + * const cursor = Thing.find({ name: /^hello/ }). + * cursor(). + * map(function (doc) { + * doc.foo = "bar"; + * return doc; + * }); + * cursor.next(function(error, doc) { + * console.log(doc.foo); + * }); + * + * @param {Function} fn + * @return {AggregationCursor} + * @api public + * @method map + */ + +AggregationCursor.prototype.map = function(fn) { + this._transforms.push(fn); + return this; +}; + +/*! + * Marks this cursor as errored + */ + +AggregationCursor.prototype._markError = function(error) { + this._error = error; + return this; +}; + +/** + * Marks this cursor as closed. Will stop streaming and subsequent calls to + * `next()` will error. 
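+ *
+ * ####Example (sketch; assumes `cursor` came from `Model.aggregate(...).cursor()`):
+ *
+ *     await cursor.close(); // emits 'close'; later `next()` calls will error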
+ * + * @param {Function} callback + * @return {Promise} + * @api public + * @method close + * @emits close + * @see MongoDB driver cursor#close http://mongodb.github.io/node-mongodb-native/2.1/api/Cursor.html#close + */ + +AggregationCursor.prototype.close = function(callback) { + return promiseOrCallback(callback, cb => { + this.cursor.close(error => { + if (error) { + cb(error); + return this.listeners('error').length > 0 && this.emit('error', error); + } + this.emit('close'); + cb(null); + }); + }); +}; + +/** + * Get the next document from this cursor. Will return `null` when there are + * no documents left. + * + * @param {Function} callback + * @return {Promise} + * @api public + * @method next + */ + +AggregationCursor.prototype.next = function(callback) { + return promiseOrCallback(callback, cb => { + _next(this, cb); + }); +}; + +/** + * Execute `fn` for every document in the cursor. If `fn` returns a promise, + * will wait for the promise to resolve before iterating on to the next one. + * Returns a promise that resolves when done. + * + * @param {Function} fn + * @param {Object} [options] + * @param {Number} [options.parallel] the number of promises to execute in parallel. Defaults to 1. + * @param {Function} [callback] executed when all docs have been processed + * @return {Promise} + * @api public + * @method eachAsync + */ + +AggregationCursor.prototype.eachAsync = function(fn, opts, callback) { + const _this = this; + if (typeof opts === 'function') { + callback = opts; + opts = {}; + } + opts = opts || {}; + + return eachAsync(function(cb) { return _next(_this, cb); }, fn, opts, callback); +}; + +/** + * Returns an asyncIterator for use with [`for/await/of` loops](https://thecodebarbarian.com/getting-started-with-async-iterators-in-node-js + * You do not need to call this function explicitly, the JavaScript runtime + * will call it for you. + * + * ####Example + * + * // Async iterator without explicitly calling `cursor()`. Mongoose still + * // creates an AggregationCursor instance internally. + * const agg = Model.aggregate([{ $match: { age: { $gte: 25 } } }]); + * for await (const doc of agg) { + * console.log(doc.name); + * } + * + * // You can also use an AggregationCursor instance for async iteration + * const cursor = Model.aggregate([{ $match: { age: { $gte: 25 } } }]).cursor(); + * for await (const doc of cursor) { + * console.log(doc.name); + * } + * + * Node.js 10.x supports async iterators natively without any flags. You can + * enable async iterators in Node.js 8.x using the [`--harmony_async_iteration` flag](https://github.com/tc39/proposal-async-iteration/issues/117#issuecomment-346695187). + * + * **Note:** This function is not set if `Symbol.asyncIterator` is undefined. If + * `Symbol.asyncIterator` is undefined, that means your Node.js version does not + * support async iterators. + * + * @method Symbol.asyncIterator + * @memberOf AggregationCursor + * @instance + * @api public + */ + +if (Symbol.asyncIterator != null) { + AggregationCursor.prototype[Symbol.asyncIterator] = function() { + return this.transformNull()._transformForAsyncIterator(); + }; +} + +/*! + * ignore + */ + +AggregationCursor.prototype._transformForAsyncIterator = function() { + if (this._transforms.indexOf(_transformForAsyncIterator) === -1) { + this.map(_transformForAsyncIterator); + } + return this; +}; + +/*! 
+ * ignore + */ + +AggregationCursor.prototype.transformNull = function(val) { + if (arguments.length === 0) { + val = true; + } + this._mongooseOptions.transformNull = val; + return this; +}; + +/*! + * ignore + */ + +function _transformForAsyncIterator(doc) { + return doc == null ? { done: true } : { value: doc, done: false }; +} + +/** + * Adds a [cursor flag](http://mongodb.github.io/node-mongodb-native/2.2/api/Cursor.html#addCursorFlag). + * Useful for setting the `noCursorTimeout` and `tailable` flags. + * + * @param {String} flag + * @param {Boolean} value + * @return {AggregationCursor} this + * @api public + * @method addCursorFlag + */ + +AggregationCursor.prototype.addCursorFlag = function(flag, value) { + const _this = this; + _waitForCursor(this, function() { + _this.cursor.addCursorFlag(flag, value); + }); + return this; +}; + +/*! + * ignore + */ + +function _waitForCursor(ctx, cb) { + if (ctx.cursor) { + return cb(); + } + ctx.once('cursor', function() { + cb(); + }); +} + +/*! + * Get the next doc from the underlying cursor and mongooseify it + * (populate, etc.) + */ + +function _next(ctx, cb) { + let callback = cb; + if (ctx._transforms.length) { + callback = function(err, doc) { + if (err || (doc === null && !ctx._mongooseOptions.transformNull)) { + return cb(err, doc); + } + cb(err, ctx._transforms.reduce(function(doc, fn) { + return fn(doc); + }, doc)); + }; + } + + if (ctx._error) { + return immediate(function() { + callback(ctx._error); + }); + } + + if (ctx.cursor) { + return ctx.cursor.next(function(error, doc) { + if (error) { + return callback(error); + } + if (!doc) { + return callback(null, null); + } + + callback(null, doc); + }); + } else { + ctx.once('cursor', function() { + _next(ctx, cb); + }); + } +} + +module.exports = AggregationCursor; diff --git a/node_modules/mongoose/lib/cursor/ChangeStream.js b/node_modules/mongoose/lib/cursor/ChangeStream.js new file mode 100644 index 000000000..b3445b06b --- /dev/null +++ b/node_modules/mongoose/lib/cursor/ChangeStream.js @@ -0,0 +1,61 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const EventEmitter = require('events').EventEmitter; + +/*! + * ignore + */ + +class ChangeStream extends EventEmitter { + constructor(changeStreamThunk, pipeline, options) { + super(); + + this.driverChangeStream = null; + this.closed = false; + this.pipeline = pipeline; + this.options = options; + + // This wrapper is necessary because of buffering. + changeStreamThunk((err, driverChangeStream) => { + if (err != null) { + this.emit('error', err); + return; + } + + this.driverChangeStream = driverChangeStream; + this._bindEvents(); + this.emit('ready'); + }); + } + + _bindEvents() { + this.driverChangeStream.on('close', () => { + this.closed = true; + }); + + ['close', 'change', 'end', 'error'].forEach(ev => { + this.driverChangeStream.on(ev, data => this.emit(ev, data)); + }); + } + + _queue(cb) { + this.once('ready', () => cb()); + } + + close() { + this.closed = true; + if (this.driverChangeStream) { + this.driverChangeStream.close(); + } + } +} + +/*! + * ignore + */ + +module.exports = ChangeStream; diff --git a/node_modules/mongoose/lib/cursor/QueryCursor.js b/node_modules/mongoose/lib/cursor/QueryCursor.js new file mode 100644 index 000000000..1bf2470cc --- /dev/null +++ b/node_modules/mongoose/lib/cursor/QueryCursor.js @@ -0,0 +1,522 @@ +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const Readable = require('stream').Readable; +const promiseOrCallback = require('../helpers/promiseOrCallback'); +const eachAsync = require('../helpers/cursor/eachAsync'); +const helpers = require('../queryhelpers'); +const immediate = require('../helpers/immediate'); +const util = require('util'); + +/** + * A QueryCursor is a concurrency primitive for processing query results + * one document at a time. A QueryCursor fulfills the Node.js streams3 API, + * in addition to several other mechanisms for loading documents from MongoDB + * one at a time. + * + * QueryCursors execute the model's pre `find` hooks before loading any documents + * from MongoDB, and the model's post `find` hooks after loading each document. + * + * Unless you're an advanced user, do **not** instantiate this class directly. + * Use [`Query#cursor()`](/docs/api.html#query_Query-cursor) instead. + * + * @param {Query} query + * @param {Object} options query options passed to `.find()` + * @inherits Readable + * @event `cursor`: Emitted when the cursor is created + * @event `error`: Emitted when an error occurred + * @event `data`: Emitted when the stream is flowing and the next doc is ready + * @event `end`: Emitted when the stream is exhausted + * @api public + */ + +function QueryCursor(query, options) { + Readable.call(this, { objectMode: true }); + + this.cursor = null; + this.query = query; + const _this = this; + const model = query.model; + this._mongooseOptions = {}; + this._transforms = []; + this.model = model; + this.options = options || {}; + + model.hooks.execPre('find', query, (err) => { + if (err != null) { + _this._markError(err); + _this.listeners('error').length > 0 && _this.emit('error', err); + return; + } + this._transforms = this._transforms.concat(query._transforms.slice()); + if (this.options.transform) { + this._transforms.push(options.transform); + } + // Re: gh-8039, you need to set the `cursor.batchSize` option, top-level + // `batchSize` option doesn't work. + if (this.options.batchSize) { + this.options.cursor = options.cursor || {}; + this.options.cursor.batchSize = options.batchSize; + + // Max out the number of documents we'll populate in parallel at 5000. + this.options._populateBatchSize = Math.min(this.options.batchSize, 5000); + } + model.collection.find(query._conditions, this.options, (err, cursor) => { + if (err != null) { + _this._markError(err); + _this.listeners('error').length > 0 && _this.emit('error', _this._error); + return; + } + + if (_this._error) { + cursor.close(function() {}); + _this.listeners('error').length > 0 && _this.emit('error', _this._error); + } + _this.cursor = cursor; + _this.emit('cursor', cursor); + }); + }); +} + +util.inherits(QueryCursor, Readable); + +/*! 
+ * Necessary to satisfy the Readable API + */ + +QueryCursor.prototype._read = function() { + const _this = this; + _next(this, function(error, doc) { + if (error) { + return _this.emit('error', error); + } + if (!doc) { + _this.push(null); + _this.cursor.close(function(error) { + if (error) { + return _this.emit('error', error); + } + setTimeout(function() { + // on node >= 14 streams close automatically (gh-8834) + const isNotClosedAutomatically = !_this.destroyed; + if (isNotClosedAutomatically) { + _this.emit('close'); + } + }, 0); + }); + return; + } + _this.push(doc); + }); +}; + +/** + * Registers a transform function which subsequently maps documents retrieved + * via the streams interface or `.next()` + * + * ####Example + * + * // Map documents returned by `data` events + * Thing. + * find({ name: /^hello/ }). + * cursor(). + * map(function (doc) { + * doc.foo = "bar"; + * return doc; + * }) + * on('data', function(doc) { console.log(doc.foo); }); + * + * // Or map documents returned by `.next()` + * const cursor = Thing.find({ name: /^hello/ }). + * cursor(). + * map(function (doc) { + * doc.foo = "bar"; + * return doc; + * }); + * cursor.next(function(error, doc) { + * console.log(doc.foo); + * }); + * + * @param {Function} fn + * @return {QueryCursor} + * @api public + * @method map + */ + +QueryCursor.prototype.map = function(fn) { + this._transforms.push(fn); + return this; +}; + +/*! + * Marks this cursor as errored + */ + +QueryCursor.prototype._markError = function(error) { + this._error = error; + return this; +}; + +/** + * Marks this cursor as closed. Will stop streaming and subsequent calls to + * `next()` will error. + * + * @param {Function} callback + * @return {Promise} + * @api public + * @method close + * @emits close + * @see MongoDB driver cursor#close http://mongodb.github.io/node-mongodb-native/2.1/api/Cursor.html#close + */ + +QueryCursor.prototype.close = function(callback) { + return promiseOrCallback(callback, cb => { + this.cursor.close(error => { + if (error) { + cb(error); + return this.listeners('error').length > 0 && this.emit('error', error); + } + this.emit('close'); + cb(null); + }); + }, this.model.events); +}; + +/** + * Get the next document from this cursor. Will return `null` when there are + * no documents left. + * + * @param {Function} callback + * @return {Promise} + * @api public + * @method next + */ + +QueryCursor.prototype.next = function(callback) { + return promiseOrCallback(callback, cb => { + _next(this, function(error, doc) { + if (error) { + return cb(error); + } + cb(null, doc); + }); + }, this.model.events); +}; + +/** + * Execute `fn` for every document in the cursor. If `fn` returns a promise, + * will wait for the promise to resolve before iterating on to the next one. + * Returns a promise that resolves when done. + * + * ####Example + * + * // Iterate over documents asynchronously + * Thing. + * find({ name: /^hello/ }). + * cursor(). + * eachAsync(async function (doc, i) { + * doc.foo = doc.bar + i; + * await doc.save(); + * }) + * + * @param {Function} fn + * @param {Object} [options] + * @param {Number} [options.parallel] the number of promises to execute in parallel. Defaults to 1. 
+ * @param {Function} [callback] executed when all docs have been processed + * @return {Promise} + * @api public + * @method eachAsync + */ + +QueryCursor.prototype.eachAsync = function(fn, opts, callback) { + const _this = this; + if (typeof opts === 'function') { + callback = opts; + opts = {}; + } + opts = opts || {}; + + return eachAsync(function(cb) { return _next(_this, cb); }, fn, opts, callback); +}; + +/** + * The `options` passed in to the `QueryCursor` constructor. + * + * @api public + * @property options + */ + +QueryCursor.prototype.options; + +/** + * Adds a [cursor flag](http://mongodb.github.io/node-mongodb-native/2.2/api/Cursor.html#addCursorFlag). + * Useful for setting the `noCursorTimeout` and `tailable` flags. + * + * @param {String} flag + * @param {Boolean} value + * @return {AggregationCursor} this + * @api public + * @method addCursorFlag + */ + +QueryCursor.prototype.addCursorFlag = function(flag, value) { + const _this = this; + _waitForCursor(this, function() { + _this.cursor.addCursorFlag(flag, value); + }); + return this; +}; + +/*! + * ignore + */ + +QueryCursor.prototype.transformNull = function(val) { + if (arguments.length === 0) { + val = true; + } + this._mongooseOptions.transformNull = val; + return this; +}; + +/*! + * ignore + */ + +QueryCursor.prototype._transformForAsyncIterator = function() { + if (this._transforms.indexOf(_transformForAsyncIterator) === -1) { + this.map(_transformForAsyncIterator); + } + return this; +}; + +/** + * Returns an asyncIterator for use with [`for/await/of` loops](https://thecodebarbarian.com/getting-started-with-async-iterators-in-node-js). + * You do not need to call this function explicitly, the JavaScript runtime + * will call it for you. + * + * ####Example + * + * // Works without using `cursor()` + * for await (const doc of Model.find([{ $sort: { name: 1 } }])) { + * console.log(doc.name); + * } + * + * // Can also use `cursor()` + * for await (const doc of Model.find([{ $sort: { name: 1 } }]).cursor()) { + * console.log(doc.name); + * } + * + * Node.js 10.x supports async iterators natively without any flags. You can + * enable async iterators in Node.js 8.x using the [`--harmony_async_iteration` flag](https://github.com/tc39/proposal-async-iteration/issues/117#issuecomment-346695187). + * + * **Note:** This function is not if `Symbol.asyncIterator` is undefined. If + * `Symbol.asyncIterator` is undefined, that means your Node.js version does not + * support async iterators. + * + * @method Symbol.asyncIterator + * @memberOf Query + * @instance + * @api public + */ + +if (Symbol.asyncIterator != null) { + QueryCursor.prototype[Symbol.asyncIterator] = function() { + return this.transformNull()._transformForAsyncIterator(); + }; +} + +/*! + * ignore + */ + +function _transformForAsyncIterator(doc) { + return doc == null ? { done: true } : { value: doc, done: false }; +} + +/*! + * Get the next doc from the underlying cursor and mongooseify it + * (populate, etc.) 
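+ * Applies any registered `map()` transforms, populates docs (in batches when
+ * `_populateBatchSize` is set), and hydrates the result unless the query is `lean()`.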
+ */ + +function _next(ctx, cb) { + let callback = cb; + if (ctx._transforms.length) { + callback = function(err, doc) { + if (err || (doc === null && !ctx._mongooseOptions.transformNull)) { + return cb(err, doc); + } + cb(err, ctx._transforms.reduce(function(doc, fn) { + return fn.call(ctx, doc); + }, doc)); + }; + } + + if (ctx._error) { + return immediate(function() { + callback(ctx._error); + }); + } + + if (ctx.cursor) { + if (ctx.query._mongooseOptions.populate && !ctx._pop) { + ctx._pop = helpers.preparePopulationOptionsMQ(ctx.query, + ctx.query._mongooseOptions); + ctx._pop.__noPromise = true; + } + if (ctx.query._mongooseOptions.populate && ctx.options._populateBatchSize > 1) { + if (ctx._batchDocs && ctx._batchDocs.length) { + // Return a cached populated doc + return _nextDoc(ctx, ctx._batchDocs.shift(), ctx._pop, callback); + } else if (ctx._batchExhausted) { + // Internal cursor reported no more docs. Act the same here + return callback(null, null); + } else { + // Request as many docs as batchSize, to populate them also in batch + ctx._batchDocs = []; + return ctx.cursor.next(_onNext.bind({ ctx, callback })); + } + } else { + return ctx.cursor.next(function(error, doc) { + if (error) { + return callback(error); + } + if (!doc) { + return callback(null, null); + } + + if (!ctx.query._mongooseOptions.populate) { + return _nextDoc(ctx, doc, null, callback); + } + + ctx.query.model.populate(doc, ctx._pop, function(err, doc) { + if (err) { + return callback(err); + } + return _nextDoc(ctx, doc, ctx._pop, callback); + }); + }); + } + } else { + ctx.once('error', cb); + + ctx.once('cursor', function(cursor) { + ctx.removeListener('error', cb); + if (cursor == null) { + return; + } + _next(ctx, cb); + }); + } +} + +/*! + * ignore + */ + +function _onNext(error, doc) { + if (error) { + return this.callback(error); + } + if (!doc) { + this.ctx._batchExhausted = true; + return _populateBatch.call(this); + } + + this.ctx._batchDocs.push(doc); + + if (this.ctx._batchDocs.length < this.ctx.options._populateBatchSize) { + // If both `batchSize` and `_populateBatchSize` are huge, calling `next()` repeatedly may + // cause a stack overflow. So make sure we clear the stack regularly. + if (this.ctx._batchDocs.length > 0 && this.ctx._batchDocs.length % 1000 === 0) { + return immediate(() => this.ctx.cursor.next(_onNext.bind(this))); + } + this.ctx.cursor.next(_onNext.bind(this)); + } else { + _populateBatch.call(this); + } +} + +/*! + * ignore + */ + +function _populateBatch() { + if (!this.ctx._batchDocs.length) { + return this.callback(null, null); + } + const _this = this; + this.ctx.query.model.populate(this.ctx._batchDocs, this.ctx._pop, function(err) { + if (err) { + return _this.callback(err); + } + + _nextDoc(_this.ctx, _this.ctx._batchDocs.shift(), _this.ctx._pop, _this.callback); + }); +} + +/*! + * ignore + */ + +function _nextDoc(ctx, doc, pop, callback) { + if (ctx.query._mongooseOptions.lean) { + return ctx.model.hooks.execPost('find', ctx.query, [[doc]], err => { + if (err != null) { + return callback(err); + } + callback(null, doc); + }); + } + + _create(ctx, doc, pop, (err, doc) => { + if (err != null) { + return callback(err); + } + ctx.model.hooks.execPost('find', ctx.query, [[doc]], err => { + if (err != null) { + return callback(err); + } + callback(null, doc); + }); + }); +} + +/*! + * ignore + */ + +function _waitForCursor(ctx, cb) { + if (ctx.cursor) { + return cb(); + } + ctx.once('cursor', function(cursor) { + if (cursor == null) { + return; + } + cb(); + }); +} + +/*! 
+ * Convert a raw doc into a full mongoose doc. + */ + +function _create(ctx, doc, populatedIds, cb) { + const instance = helpers.createModel(ctx.query.model, doc, ctx.query._fields); + const opts = populatedIds ? + { populated: populatedIds } : + undefined; + + instance.$init(doc, opts, function(err) { + if (err) { + return cb(err); + } + cb(null, instance); + }); +} + +module.exports = QueryCursor; diff --git a/node_modules/mongoose/lib/document.js b/node_modules/mongoose/lib/document.js new file mode 100644 index 000000000..3e6162690 --- /dev/null +++ b/node_modules/mongoose/lib/document.js @@ -0,0 +1,4362 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const EventEmitter = require('events').EventEmitter; +const InternalCache = require('./internal'); +const MongooseError = require('./error/index'); +const MixedSchema = require('./schema/mixed'); +const ObjectExpectedError = require('./error/objectExpected'); +const ObjectParameterError = require('./error/objectParameter'); +const ParallelValidateError = require('./error/parallelValidate'); +const Schema = require('./schema'); +const StrictModeError = require('./error/strict'); +const ValidationError = require('./error/validation'); +const ValidatorError = require('./error/validator'); +const VirtualType = require('./virtualtype'); +const promiseOrCallback = require('./helpers/promiseOrCallback'); +const cleanModifiedSubpaths = require('./helpers/document/cleanModifiedSubpaths'); +const compile = require('./helpers/document/compile').compile; +const defineKey = require('./helpers/document/compile').defineKey; +const flatten = require('./helpers/common').flatten; +const flattenObjectWithDottedPaths = require('./helpers/path/flattenObjectWithDottedPaths'); +const get = require('./helpers/get'); +const getEmbeddedDiscriminatorPath = require('./helpers/document/getEmbeddedDiscriminatorPath'); +const getKeysInSchemaOrder = require('./helpers/schema/getKeysInSchemaOrder'); +const handleSpreadDoc = require('./helpers/document/handleSpreadDoc'); +const immediate = require('./helpers/immediate'); +const isDefiningProjection = require('./helpers/projection/isDefiningProjection'); +const isExclusive = require('./helpers/projection/isExclusive'); +const inspect = require('util').inspect; +const internalToObjectOptions = require('./options').internalToObjectOptions; +const markArraySubdocsPopulated = require('./helpers/populate/markArraySubdocsPopulated'); +const mpath = require('mpath'); +const queryhelpers = require('./queryhelpers'); +const utils = require('./utils'); +const isPromise = require('./helpers/isPromise'); + +const clone = utils.clone; +const deepEqual = utils.deepEqual; +const isMongooseObject = utils.isMongooseObject; + +const arrayAtomicsBackupSymbol = require('./helpers/symbols').arrayAtomicsBackupSymbol; +const arrayAtomicsSymbol = require('./helpers/symbols').arrayAtomicsSymbol; +const documentArrayParent = require('./helpers/symbols').documentArrayParent; +const documentIsModified = require('./helpers/symbols').documentIsModified; +const documentModifiedPaths = require('./helpers/symbols').documentModifiedPaths; +const documentSchemaSymbol = require('./helpers/symbols').documentSchemaSymbol; +const getSymbol = require('./helpers/symbols').getSymbol; +const populateModelSymbol = require('./helpers/symbols').populateModelSymbol; +const scopeSymbol = require('./helpers/symbols').scopeSymbol; +const schemaMixedSymbol = require('./schema/symbols').schemaMixedSymbol; +const parentPaths = 
require('./helpers/path/parentPaths'); +let DocumentArray; +let MongooseArray; +let Embedded; + +const specialProperties = utils.specialProperties; + +/** + * The core Mongoose document constructor. You should not call this directly, + * the Mongoose [Model constructor](./api.html#Model) calls this for you. + * + * @param {Object} obj the values to set + * @param {Object} [fields] optional object containing the fields which were selected in the query returning this document and any populated paths data + * @param {Object} [options] various configuration options for the document + * @param {Boolean} [options.defaults=true] if `false`, skip applying default values to this document. + * @inherits NodeJS EventEmitter http://nodejs.org/api/events.html#events_class_events_eventemitter + * @event `init`: Emitted on a document after it has been retrieved from the db and fully hydrated by Mongoose. + * @event `save`: Emitted when the document is successfully saved + * @api private + */ + +function Document(obj, fields, skipId, options) { + if (typeof skipId === 'object' && skipId != null) { + options = skipId; + skipId = options.skipId; + } + options = Object.assign({}, options); + + // Support `browserDocument.js` syntax + if (this.$__schema == null) { + const _schema = utils.isObject(fields) && !fields.instanceOfSchema ? + new Schema(fields) : + fields; + this.$__setSchema(_schema); + fields = skipId; + skipId = options; + options = arguments[4] || {}; + } + + this.$__ = new InternalCache; + this.$__.emitter = new EventEmitter(); + this.$isNew = 'isNew' in options ? options.isNew : true; + + if ('priorDoc' in options) { + this.$__.priorDoc = options.priorDoc; + } + + if (obj != null && typeof obj !== 'object') { + throw new ObjectParameterError(obj, 'obj', 'Document'); + } + + let defaults = true; + if (options.defaults !== undefined) { + this.$__.defaults = options.defaults; + defaults = options.defaults; + } + + const schema = this.$__schema; + + if (typeof fields === 'boolean' || fields === 'throw') { + this.$__.strictMode = fields; + fields = undefined; + } else { + this.$__.strictMode = schema.options.strict; + if (fields !== undefined) { + this.$__.selected = fields; + } + } + + const requiredPaths = schema.requiredPaths(true); + for (const path of requiredPaths) { + this.$__.activePaths.require(path); + } + + this.$__.emitter.setMaxListeners(0); + + let exclude = null; + + // determine if this doc is a result of a query with + // excluded fields + if (utils.isPOJO(fields)) { + exclude = isExclusive(fields); + } + + const hasIncludedChildren = exclude === false && fields ? + $__hasIncludedChildren(fields) : + {}; + + if (this._doc == null) { + this.$__buildDoc(obj, fields, skipId, exclude, hasIncludedChildren, false); + + // By default, defaults get applied **before** setting initial values + // Re: gh-6155 + if (defaults) { + $__applyDefaults(this, fields, skipId, exclude, hasIncludedChildren, true, { + isNew: this.$isNew + }); + } + } + if (obj) { + // Skip set hooks + if (this.$__original_set) { + this.$__original_set(obj, undefined, true); + } else { + this.$set(obj, undefined, true); + } + + if (obj instanceof Document) { + this.$isNew = obj.$isNew; + } + } + + // Function defaults get applied **after** setting initial values so they + // see the full doc rather than an empty one, unless they opt out. 
+ // Re: gh-3781, gh-6155 + if (options.willInit && defaults) { + EventEmitter.prototype.once.call(this, 'init', () => { + $__applyDefaults(this, fields, skipId, exclude, hasIncludedChildren, false, options.skipDefaults, { + isNew: this.$isNew + }); + }); + } else if (defaults) { + $__applyDefaults(this, fields, skipId, exclude, hasIncludedChildren, false, options.skipDefaults, { + isNew: this.$isNew + }); + } + + this.$__._id = this._id; + + if (!this.$__.strictMode && obj) { + const _this = this; + const keys = Object.keys(this._doc); + + keys.forEach(function(key) { + if (!(key in schema.tree)) { + defineKey({ prop: key, subprops: null, prototype: _this }); + } + }); + } + + applyQueue(this); +} + +Object.defineProperty(Document.prototype, 'isNew', { + get: function() { + return this.$isNew; + }, + set: function(value) { + this.$isNew = value; + } +}); + +Object.defineProperty(Document.prototype, 'errors', { + get: function() { + return this.$errors; + }, + set: function(value) { + this.$errors = value; + } +}); +/*! + * Document exposes the NodeJS event emitter API, so you can use + * `on`, `once`, etc. + */ +utils.each( + ['on', 'once', 'emit', 'listeners', 'removeListener', 'setMaxListeners', + 'removeAllListeners', 'addListener'], + function(emitterFn) { + Document.prototype[emitterFn] = function() { + return this.$__.emitter[emitterFn].apply(this.$__.emitter, arguments); + }; + Document.prototype[`$${emitterFn}`] = Document.prototype[emitterFn]; + }); + +Document.prototype.constructor = Document; + +for (const i in EventEmitter.prototype) { + Document[i] = EventEmitter.prototype[i]; +} + +/** + * The document's internal schema. + * + * @api private + * @property schema + * @memberOf Document + * @instance + */ + +Document.prototype.$__schema; + +/** + * The document's schema. + * + * @api public + * @property schema + * @memberOf Document + * @instance + */ + +Document.prototype.schema; + +/** + * Empty object that you can use for storing properties on the document. This + * is handy for passing data to middleware without conflicting with Mongoose + * internals. + * + * ####Example: + * + * schema.pre('save', function() { + * // Mongoose will set `isNew` to `false` if `save()` succeeds + * this.$locals.wasNew = this.isNew; + * }); + * + * schema.post('save', function() { + * // Prints true if `isNew` was set before `save()` + * console.log(this.$locals.wasNew); + * }); + * + * @api public + * @property $locals + * @memberOf Document + * @instance + */ + +Object.defineProperty(Document.prototype, '$locals', { + configurable: false, + enumerable: false, + get: function() { + if (this.$__.locals == null) { + this.$__.locals = {}; + } + return this.$__.locals; + }, + set: function(v) { + this.$__.locals = v; + } +}); + + +/** + * Boolean flag specifying if the document is new. + * + * @api public + * @property $isNew + * @memberOf Document + * @instance + */ + +Document.prototype.$isNew; + +/** + * Boolean flag specifying if the document is new. + * + * @api public + * @property isNew + * @memberOf Document + * @instance + */ + +Document.prototype.isNew; + +/** + * Set this property to add additional query filters when Mongoose saves this document and `isNew` is false. + * + * ####Example: + * + * // Make sure `save()` never updates a soft deleted document. 
+ * schema.pre('save', function() { + * this.$where = { isDeleted: false }; + * }); + * + * @api public + * @property $where + * @memberOf Document + * @instance + */ + +Object.defineProperty(Document.prototype, '$where', { + configurable: false, + enumerable: false, + writable: true +}); + +/** + * The string version of this documents _id. + * + * ####Note: + * + * This getter exists on all documents by default. The getter can be disabled by setting the `id` [option](/docs/guide.html#id) of its `Schema` to false at construction time. + * + * new Schema({ name: String }, { id: false }); + * + * @api public + * @see Schema options /docs/guide.html#options + * @property id + * @memberOf Document + * @instance + */ + +Document.prototype.id; + +/** + * Hash containing current validation $errors. + * + * @api public + * @property $errors + * @memberOf Document + * @instance + */ + +Document.prototype.$errors; + +/** + * Hash containing current validation errors. + * + * @api public + * @property errors + * @memberOf Document + * @instance + */ + +Document.prototype.errors; + +/** + * A string containing the current operation that Mongoose is executing + * on this document. May be `null`, `'save'`, `'validate'`, or `'remove'`. + * + * ####Example: + * + * const doc = new Model({ name: 'test' }); + * doc.$op; // null + * + * const promise = doc.save(); + * doc.$op; // 'save' + * + * await promise; + * doc.$op; // null + * + * @api public + * @property $op + * @memberOf Document + * @instance + */ + +Object.defineProperty(Document.prototype, '$op', { + get: function() { + return this.$__.op || null; + }, + set: function(value) { + this.$__.op = value; + } +}); + +/*! + * ignore + */ + +function $__hasIncludedChildren(fields) { + const hasIncludedChildren = {}; + const keys = Object.keys(fields); + + for (const key of keys) { + const parts = key.split('.'); + const c = []; + + for (const part of parts) { + c.push(part); + hasIncludedChildren[c.join('.')] = 1; + } + } + + return hasIncludedChildren; +} + +/*! + * ignore + */ + +function $__applyDefaults(doc, fields, skipId, exclude, hasIncludedChildren, isBeforeSetters, pathsToSkip) { + const paths = Object.keys(doc.$__schema.paths); + const plen = paths.length; + + for (let i = 0; i < plen; ++i) { + let def; + let curPath = ''; + const p = paths[i]; + + if (p === '_id' && skipId) { + continue; + } + + const type = doc.$__schema.paths[p]; + const path = type.splitPath(); + const len = path.length; + let included = false; + let doc_ = doc._doc; + for (let j = 0; j < len; ++j) { + if (doc_ == null) { + break; + } + + const piece = path[j]; + curPath += (!curPath.length ? 
'' : '.') + piece; + + if (exclude === true) { + if (curPath in fields) { + break; + } + } else if (exclude === false && fields && !included) { + if (curPath in fields) { + included = true; + } else if (!hasIncludedChildren[curPath]) { + break; + } + } + + if (j === len - 1) { + if (doc_[piece] !== void 0) { + break; + } + + if (typeof type.defaultValue === 'function') { + if (!type.defaultValue.$runBeforeSetters && isBeforeSetters) { + break; + } + if (type.defaultValue.$runBeforeSetters && !isBeforeSetters) { + break; + } + } else if (!isBeforeSetters) { + // Non-function defaults should always run **before** setters + continue; + } + + if (pathsToSkip && pathsToSkip[curPath]) { + break; + } + + if (fields && exclude !== null) { + if (exclude === true) { + // apply defaults to all non-excluded fields + if (p in fields) { + continue; + } + + try { + def = type.getDefault(doc, false); + } catch (err) { + doc.invalidate(p, err); + break; + } + + if (typeof def !== 'undefined') { + doc_[piece] = def; + doc.$__.activePaths.default(p); + } + } else if (included) { + // selected field + try { + def = type.getDefault(doc, false); + } catch (err) { + doc.invalidate(p, err); + break; + } + + if (typeof def !== 'undefined') { + doc_[piece] = def; + doc.$__.activePaths.default(p); + } + } + } else { + try { + def = type.getDefault(doc, false); + } catch (err) { + doc.invalidate(p, err); + break; + } + + if (typeof def !== 'undefined') { + doc_[piece] = def; + doc.$__.activePaths.default(p); + } + } + } else { + doc_ = doc_[piece]; + } + } + } +} + +/*! + * ignore + */ + +function $applyDefaultsToNested(val, path, doc) { + if (val == null) { + return; + } + + flattenObjectWithDottedPaths(val); + + const paths = Object.keys(doc.$__schema.paths); + const plen = paths.length; + + const pathPieces = path.indexOf('.') === -1 ? [path] : path.split('.'); + + for (let i = 0; i < plen; ++i) { + let curPath = ''; + const p = paths[i]; + + if (!p.startsWith(path + '.')) { + continue; + } + + const type = doc.$__schema.paths[p]; + const pieces = type.splitPath().slice(pathPieces.length); + const len = pieces.length; + + if (type.defaultValue === void 0) { + continue; + } + + let cur = val; + + for (let j = 0; j < len; ++j) { + if (cur == null) { + break; + } + + const piece = pieces[j]; + + if (j === len - 1) { + if (cur[piece] !== void 0) { + break; + } + + try { + const def = type.getDefault(doc, false); + if (def !== void 0) { + cur[piece] = def; + } + } catch (err) { + doc.invalidate(path + '.' + curPath, err); + break; + } + + break; + } + + curPath += (!curPath.length ? '' : '.') + piece; + + cur[piece] = cur[piece] || {}; + cur = cur[piece]; + } + } +} + +/** + * Builds the default doc structure + * + * @param {Object} obj + * @param {Object} [fields] + * @param {Boolean} [skipId] + * @api private + * @method $__buildDoc + * @memberOf Document + * @instance + */ + +Document.prototype.$__buildDoc = function(obj, fields, skipId, exclude, hasIncludedChildren) { + const doc = {}; + + const paths = Object.keys(this.$__schema.paths). 
+ // Don't build up any paths that are underneath a map, we don't know + // what the keys will be + filter(p => !p.includes('$*')); + const plen = paths.length; + let ii = 0; + + for (; ii < plen; ++ii) { + const p = paths[ii]; + + if (p === '_id') { + if (skipId) { + continue; + } + if (obj && '_id' in obj) { + continue; + } + } + + const path = this.$__schema.paths[p].splitPath(); + const len = path.length; + const last = len - 1; + let curPath = ''; + let doc_ = doc; + let included = false; + + for (let i = 0; i < len; ++i) { + const piece = path[i]; + + curPath += (!curPath.length ? '' : '.') + piece; + + // support excluding intermediary levels + if (exclude === true) { + if (curPath in fields) { + break; + } + } else if (exclude === false && fields && !included) { + if (curPath in fields) { + included = true; + } else if (!hasIncludedChildren[curPath]) { + break; + } + } + + if (i < last) { + doc_ = doc_[piece] || (doc_[piece] = {}); + } + } + } + + this._doc = doc; +}; + +/*! + * Converts to POJO when you use the document for querying + */ + +Document.prototype.toBSON = function() { + return this.toObject(internalToObjectOptions); +}; + +/** + * Initializes the document without setters or marking anything modified. + * + * Called internally after a document is returned from mongodb. Normally, + * you do **not** need to call this function on your own. + * + * This function triggers `init` [middleware](/docs/middleware.html). + * Note that `init` hooks are [synchronous](/docs/middleware.html#synchronous). + * + * @param {Object} doc document returned by mongo + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.init = function(doc, opts, fn) { + if (typeof opts === 'function') { + fn = opts; + opts = null; + } + + this.$__init(doc, opts); + + if (fn) { + fn(null, this); + } + + return this; +}; + +Document.prototype.$init = function() { + return this.constructor.prototype.init.apply(this, arguments); +}; + +/*! + * ignore + */ + +Document.prototype.$__init = function(doc, opts) { + this.$isNew = false; + this.$init = true; + opts = opts || {}; + + // handle docs with populated paths + // If doc._id is not null or undefined + if (doc._id != null && opts.populated && opts.populated.length) { + const id = String(doc._id); + for (const item of opts.populated) { + if (item.isVirtual) { + this.$populated(item.path, utils.getValue(item.path, doc), item); + } else { + this.$populated(item.path, item._docs[id], item); + } + + if (item._childDocs == null) { + continue; + } + for (const child of item._childDocs) { + if (child == null || child.$__ == null) { + continue; + } + child.$__.parent = this; + } + item._childDocs = []; + } + } + + init(this, doc, this._doc, opts); + + markArraySubdocsPopulated(this, opts.populated); + + this.$emit('init', this); + this.constructor.emit('init', this); + + this.$__._id = this._id; + return this; +}; + +/*! + * Init helper. 
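+ *
+ * Recursively copies values from the raw server document onto `doc` without
+ * running setters: known schema paths are cast via their schema type (already
+ * populated values are kept as-is), plain nested objects recurse with a
+ * dotted `prefix`, and unknown paths are copied through unchanged (also onto
+ * `self` when strict mode is off).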
+ * + * @param {Object} self document instance + * @param {Object} obj raw mongodb doc + * @param {Object} doc object we are initializing + * @api private + */ + +function init(self, obj, doc, opts, prefix) { + prefix = prefix || ''; + + const keys = Object.keys(obj); + const len = keys.length; + let schema; + let path; + let i; + let index = 0; + const strict = self.$__.strictMode; + + while (index < len) { + _init(index++); + } + + function _init(index) { + i = keys[index]; + path = prefix + i; + schema = self.$__schema.path(path); + + // Should still work if not a model-level discriminator, but should not be + // necessary. This is *only* to catch the case where we queried using the + // base model and the discriminated model has a projection + if (self.$__schema.$isRootDiscriminator && !self.$__isSelected(path)) { + return; + } + + if (!schema && utils.isPOJO(obj[i])) { + // assume nested object + if (!doc[i]) { + doc[i] = {}; + } + init(self, obj[i], doc[i], opts, path + '.'); + } else if (!schema) { + doc[i] = obj[i]; + if (!strict) { + self[i] = obj[i]; + } + } else { + // Retain order when overwriting defaults + if (doc.hasOwnProperty(i) && obj[i] !== void 0) { + delete doc[i]; + } + if (obj[i] === null) { + doc[i] = schema._castNullish(null); + } else if (obj[i] !== undefined) { + const intCache = obj[i].$__ || {}; + const wasPopulated = intCache.wasPopulated || null; + + if (schema && !wasPopulated) { + try { + doc[i] = schema.cast(obj[i], self, true); + } catch (e) { + self.invalidate(e.path, new ValidatorError({ + path: e.path, + message: e.message, + type: 'cast', + value: e.value, + reason: e + })); + } + } else { + doc[i] = obj[i]; + } + } + // mark as hydrated + if (!self.$isModified(path)) { + self.$__.activePaths.init(path); + } + } + } +} + +/** + * Sends an update command with this document `_id` as the query selector. + * + * ####Example: + * + * weirdCar.update({$inc: {wheels:1}}, { w: 1 }, callback); + * + * ####Valid options: + * + * - same as in [Model.update](#model_Model.update) + * + * @see Model.update #model_Model.update + * @param {Object} doc + * @param {Object} options + * @param {Function} callback + * @return {Query} + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.update = function update() { + const args = utils.args(arguments); + args.unshift({ _id: this._id }); + const query = this.constructor.update.apply(this.constructor, args); + + if (this.$session() != null) { + if (!('session' in query.options)) { + query.options.session = this.$session(); + } + } + + return query; +}; + +/** + * Sends an updateOne command with this document `_id` as the query selector. + * + * ####Example: + * + * weirdCar.updateOne({$inc: {wheels:1}}, { w: 1 }, callback); + * + * ####Valid options: + * + * - same as in [Model.updateOne](#model_Model.updateOne) + * + * @see Model.updateOne #model_Model.updateOne + * @param {Object} doc + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Object} [options.lean] if truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. See [`Query.lean()`](/docs/api.html#query_Query-lean) and the [Mongoose lean tutorial](/docs/tutorials/lean.html). 
+ * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Function} callback + * @return {Query} + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.updateOne = function updateOne(doc, options, callback) { + const query = this.constructor.updateOne({ _id: this._id }, doc, options); + query.pre(cb => { + this.constructor._middleware.execPre('updateOne', this, [this], cb); + }); + query.post(cb => { + this.constructor._middleware.execPost('updateOne', this, [this], {}, cb); + }); + + if (this.$session() != null) { + if (!('session' in query.options)) { + query.options.session = this.$session(); + } + } + + if (callback != null) { + return query.exec(callback); + } + + return query; +}; + +/** + * Sends a replaceOne command with this document `_id` as the query selector. + * + * ####Valid options: + * + * - same as in [Model.replaceOne](https://mongoosejs.com/docs/api/model.html#model_Model.replaceOne) + * + * @see Model.replaceOne #model_Model.replaceOne + * @param {Object} doc + * @param {Object} options + * @param {Function} callback + * @return {Query} + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.replaceOne = function replaceOne() { + const args = utils.args(arguments); + args.unshift({ _id: this._id }); + return this.constructor.replaceOne.apply(this.constructor, args); +}; + +/** + * Getter/setter around the session associated with this document. Used to + * automatically set `session` if you `save()` a doc that you got from a + * query with an associated session. + * + * ####Example: + * + * const session = MyModel.startSession(); + * const doc = await MyModel.findOne().session(session); + * doc.$session() === session; // true + * doc.$session(null); + * doc.$session() === null; // true + * + * If this is a top-level document, setting the session propagates to all child + * docs. + * + * @param {ClientSession} [session] overwrite the current session + * @return {ClientSession} + * @method $session + * @api public + * @memberOf Document + */ + +Document.prototype.$session = function $session(session) { + if (arguments.length === 0) { + if (this.$__.session != null && this.$__.session.hasEnded) { + this.$__.session = null; + return null; + } + return this.$__.session; + } + + if (session != null && session.hasEnded) { + throw new MongooseError('Cannot set a document\'s session to a session that has ended. Make sure you haven\'t ' + + 'called `endSession()` on the session you are passing to `$session()`.'); + } + + this.$__.session = session; + + if (!this.ownerDocument) { + const subdocs = this.$getAllSubdocs(); + for (const child of subdocs) { + child.$session(session); + } + } + + return session; +}; + +/** + * Overwrite all values in this document with the values of `obj`, except + * for immutable properties. Behaves similarly to `set()`, except for it + * unsets all properties that aren't in `obj`. 
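+ *
+ * The sketch below is illustrative only; `Character`, `name`, and `age` are
+ * hypothetical model/path names, not defined in this file.
+ *
+ * ####Example:
+ *
+ *     const doc = await Character.findOne({ name: 'Jean-Luc Picard' }); // hypothetical model
+ *     // overwrite() skips `_id` (and the version/discriminator keys),
+ *     // then applies `$set` semantics to every remaining key
+ *     doc.overwrite({ name: 'Will Riker' }); // sets `name`, unsets `age`
+ *     await doc.save();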
+ * + * @param {Object} obj the object to overwrite this document with + * @method overwrite + * @name overwrite + * @memberOf Document + * @instance + * @api public + */ + +Document.prototype.overwrite = function overwrite(obj) { + const keys = Array.from(new Set(Object.keys(this._doc).concat(Object.keys(obj)))); + + for (const key of keys) { + if (key === '_id') { + continue; + } + // Explicitly skip version key + if (this.$__schema.options.versionKey && key === this.$__schema.options.versionKey) { + continue; + } + if (this.$__schema.options.discriminatorKey && key === this.$__schema.options.discriminatorKey) { + continue; + } + this.$set(key, obj[key]); + } + + return this; +}; + +/** + * Alias for `set()`, used internally to avoid conflicts + * + * @param {String|Object} path path or object of key/vals to set + * @param {Any} val the value to set + * @param {Schema|String|Number|Buffer|*} [type] optionally specify a type for "on-the-fly" attributes + * @param {Object} [options] optionally specify options that modify the behavior of the set + * @method $set + * @name $set + * @memberOf Document + * @instance + * @api public + */ + +Document.prototype.$set = function $set(path, val, type, options) { + if (utils.isPOJO(type)) { + options = type; + type = undefined; + } + + options = options || {}; + const merge = options.merge; + const adhoc = type && type !== true; + const constructing = type === true; + const typeKey = this.$__schema.options.typeKey; + let adhocs; + let keys; + let i = 0; + let pathtype; + let key; + let prefix; + + const strict = 'strict' in options + ? options.strict + : this.$__.strictMode; + + if (adhoc) { + adhocs = this.$__.adhocPaths || (this.$__.adhocPaths = {}); + adhocs[path] = this.$__schema.interpretAsType(path, type, this.$__schema.options); + } + + if (path == null) { + [path, val] = [val, path]; + } else if (typeof path !== 'string') { + // new Document({ key: val }) + if (path instanceof Document) { + if (path.$__isNested) { + path = path.toObject(); + } else { + path = path._doc; + } + } + if (path == null) { + [path, val] = [val, path]; + } + + prefix = val ? val + '.' : ''; + keys = getKeysInSchemaOrder(this.$__schema, path); + + const len = keys.length; + + // `_skipMinimizeTopLevel` is because we may have deleted the top-level + // nested key to ensure key order. + const _skipMinimizeTopLevel = get(options, '_skipMinimizeTopLevel', false); + if (len === 0 && _skipMinimizeTopLevel) { + delete options._skipMinimizeTopLevel; + if (val) { + this.$set(val, {}); + } + return this; + } + + for (let i = 0; i < len; ++i) { + key = keys[i]; + const pathName = prefix + key; + pathtype = this.$__schema.pathType(pathName); + const valForKey = path[key]; + + // On initial set, delete any nested keys if we're going to overwrite + // them to ensure we keep the user's key order. 
+ if (type === true && + !prefix && + path[key] != null && + pathtype === 'nested' && + this._doc[key] != null) { + delete this._doc[key]; + // Make sure we set `{}` back even if we minimize re: gh-8565 + options = Object.assign({}, options, { _skipMinimizeTopLevel: true }); + } else { + // Make sure we set `{_skipMinimizeTopLevel: false}` if don't have overwrite: gh-10441 + options = Object.assign({}, options, { _skipMinimizeTopLevel: false }); + } + + if (utils.isNonBuiltinObject(valForKey) && pathtype === 'nested') { + $applyDefaultsToNested(path[key], prefix + key, this); + this.$set(prefix + key, path[key], constructing, Object.assign({}, options, { _skipMarkModified: true })); + continue; + } else if (strict) { + // Don't overwrite defaults with undefined keys (gh-3981) (gh-9039) + if (constructing && path[key] === void 0 && + this.$get(pathName) !== void 0) { + continue; + } + + if (pathtype === 'adhocOrUndefined') { + pathtype = getEmbeddedDiscriminatorPath(this, pathName, { typeOnly: true }); + } + + if (pathtype === 'real' || pathtype === 'virtual') { + // Check for setting single embedded schema to document (gh-3535) + let p = path[key]; + if (this.$__schema.paths[pathName] && + this.$__schema.paths[pathName].$isSingleNested && + path[key] instanceof Document) { + p = p.toObject({ virtuals: false, transform: false }); + } + this.$set(prefix + key, p, constructing, options); + } else if (pathtype === 'nested' && path[key] instanceof Document) { + this.$set(prefix + key, + path[key].toObject({ transform: false }), constructing, options); + } else if (strict === 'throw') { + if (pathtype === 'nested') { + throw new ObjectExpectedError(key, path[key]); + } else { + throw new StrictModeError(key); + } + } + } else if (path[key] !== void 0) { + this.$set(prefix + key, path[key], constructing, options); + } + } + + // Ensure all properties are in correct order by deleting and recreating every property. + for (const key of Object.keys(this.$__schema.tree)) { + if (this._doc.hasOwnProperty(key)) { + const val = this._doc[key]; + delete this._doc[key]; + this._doc[key] = val; + } + } + + return this; + } + + let pathType = this.$__schema.pathType(path); + if (pathType === 'adhocOrUndefined') { + pathType = getEmbeddedDiscriminatorPath(this, path, { typeOnly: true }); + } + + // Assume this is a Mongoose document that was copied into a POJO using + // `Object.assign()` or `{...doc}` + val = handleSpreadDoc(val); + + // if this doc is being constructed we should not trigger getters + const priorVal = (() => { + if (this.$__.priorDoc != null) { + return this.$__.priorDoc.$__getValue(path); + } + if (constructing) { + return void 0; + } + return this.$__getValue(path); + })(); + + if (pathType === 'nested' && val) { + if (typeof val === 'object' && val != null) { + const hasInitialVal = this.$__.savedState != null && this.$__.savedState.hasOwnProperty(path); + if (this.$__.savedState != null && !this.$isNew && !this.$__.savedState.hasOwnProperty(path)) { + const initialVal = this.$__getValue(path); + this.$__.savedState[path] = initialVal; + + const keys = Object.keys(initialVal || {}); + for (const key of keys) { + this.$__.savedState[path + '.' + key] = initialVal[key]; + } + } + + if (!merge) { + this.$__setValue(path, null); + cleanModifiedSubpaths(this, path); + } else { + return this.$set(val, path, constructing); + } + + const keys = getKeysInSchemaOrder(this.$__schema, val, path); + + this.$__setValue(path, {}); + for (const key of keys) { + this.$set(path + '.' 
+ key, val[key], constructing, options); + } + if (priorVal != null && utils.deepEqual(hasInitialVal ? this.$__.savedState[path] : priorVal, val)) { + this.unmarkModified(path); + } else { + this.markModified(path); + } + cleanModifiedSubpaths(this, path, { skipDocArrays: true }); + return this; + } + this.invalidate(path, new MongooseError.CastError('Object', val, path)); + return this; + } + + let schema; + const parts = path.indexOf('.') === -1 ? [path] : path.split('.'); + + // Might need to change path for top-level alias + if (typeof this.$__schema.aliases[parts[0]] == 'string') { + parts[0] = this.$__schema.aliases[parts[0]]; + } + + if (pathType === 'adhocOrUndefined' && strict) { + // check for roots that are Mixed types + let mixed; + + for (i = 0; i < parts.length; ++i) { + const subpath = parts.slice(0, i + 1).join('.'); + + // If path is underneath a virtual, bypass everything and just set it. + if (i + 1 < parts.length && this.$__schema.pathType(subpath) === 'virtual') { + mpath.set(path, val, this); + return this; + } + + schema = this.$__schema.path(subpath); + if (schema == null) { + continue; + } + + if (schema instanceof MixedSchema) { + // allow changes to sub paths of mixed types + mixed = true; + break; + } + } + + if (schema == null) { + // Check for embedded discriminators + schema = getEmbeddedDiscriminatorPath(this, path); + } + + if (!mixed && !schema) { + if (strict === 'throw') { + throw new StrictModeError(path); + } + return this; + } + } else if (pathType === 'virtual') { + schema = this.$__schema.virtualpath(path); + schema.applySetters(val, this); + return this; + } else { + schema = this.$__path(path); + } + + // gh-4578, if setting a deeply nested path that doesn't exist yet, create it + let cur = this._doc; + let curPath = ''; + for (i = 0; i < parts.length - 1; ++i) { + cur = cur[parts[i]]; + curPath += (curPath.length > 0 ? '.' : '') + parts[i]; + if (!cur) { + this.$set(curPath, {}); + // Hack re: gh-5800. If nested field is not selected, it probably exists + // so `MongoServerError: cannot use the part (nested of nested.num) to + // traverse the element ({nested: null})` is not likely. If user gets + // that error, its their fault for now. We should reconsider disallowing + // modifying not selected paths for 6.x + if (!this.$__isSelected(curPath)) { + this.unmarkModified(curPath); + } + cur = this.$__getValue(curPath); + } + } + + let pathToMark; + + // When using the $set operator the path to the field must already exist. + // Else mongodb throws: "LEFT_SUBFIELD only supports Object" + + if (parts.length <= 1) { + pathToMark = path; + } else { + for (i = 0; i < parts.length; ++i) { + const subpath = parts.slice(0, i + 1).join('.'); + if (this.$get(subpath, null, { getters: false }) === null) { + pathToMark = subpath; + break; + } + } + + if (!pathToMark) { + pathToMark = path; + } + } + + if (!schema) { + this.$__set(pathToMark, path, options, constructing, parts, schema, val, priorVal); + return this; + } + + // If overwriting a subdocument path, make sure to clear out + // any errors _before_ setting, so new errors that happen + // get persisted. Re: #9080 + if (schema.$isSingleNested || schema.$isMongooseArray) { + _markValidSubpaths(this, path); + } + + if (schema.$isSingleNested && val != null && merge) { + if (val instanceof Document) { + val = val.toObject({ virtuals: false, transform: false }); + } + const keys = Object.keys(val); + for (const key of keys) { + this.$set(path + '.' 
+ key, val[key], constructing, options); + } + + return this; + } + + let shouldSet = true; + try { + // If the user is trying to set a ref path to a document with + // the correct model name, treat it as populated + const refMatches = (() => { + if (schema.options == null) { + return false; + } + if (!(val instanceof Document)) { + return false; + } + const model = val.constructor; + + // Check ref + const ref = schema.options.ref; + if (ref != null && (ref === model.modelName || ref === model.baseModelName)) { + return true; + } + + // Check refPath + const refPath = schema.options.refPath; + if (refPath == null) { + return false; + } + const modelName = val.get(refPath); + return modelName === model.modelName || modelName === model.baseModelName; + })(); + + let didPopulate = false; + if (refMatches && val instanceof Document) { + this.$populated(path, val._id, { [populateModelSymbol]: val.constructor }); + val.$__.wasPopulated = true; + didPopulate = true; + } + + let popOpts; + if (schema.options && + Array.isArray(schema.options[typeKey]) && + schema.options[typeKey].length && + schema.options[typeKey][0].ref && + _isManuallyPopulatedArray(val, schema.options[typeKey][0].ref)) { + popOpts = { [populateModelSymbol]: val[0].constructor }; + this.$populated(path, val.map(function(v) { return v._id; }), popOpts); + + for (const doc of val) { + doc.$__.wasPopulated = true; + } + didPopulate = true; + } + + if (this.$__schema.singleNestedPaths[path] == null) { + // If this path is underneath a single nested schema, we'll call the setter + // later in `$__set()` because we don't take `_doc` when we iterate through + // a single nested doc. That's to make sure we get the correct context. + // Otherwise we would double-call the setter, see gh-7196. + val = schema.applySetters(val, this, false, priorVal); + } + + if (schema.$isMongooseDocumentArray && + Array.isArray(val) && + val.length > 0 && + val[0] != null && + val[0].$__ != null && + val[0].$__.populated != null) { + const populatedPaths = Object.keys(val[0].$__.populated); + for (const populatedPath of populatedPaths) { + this.$populated(path + '.' + populatedPath, + val.map(v => v.$populated(populatedPath)), + val[0].$__.populated[populatedPath].options); + } + didPopulate = true; + } + + if (!didPopulate && this.$__.populated) { + // If this array partially contains populated documents, convert them + // all to ObjectIds re: #8443 + if (Array.isArray(val) && this.$__.populated[path]) { + for (let i = 0; i < val.length; ++i) { + if (val[i] instanceof Document) { + val.set(i, val[i]._id, true); + } + } + } + delete this.$__.populated[path]; + } + + if (schema.$isSingleNested && val != null) { + _checkImmutableSubpaths(val, schema, priorVal); + } + + this.$markValid(path); + } catch (e) { + if (e instanceof MongooseError.StrictModeError && e.isImmutableError) { + this.invalidate(path, e); + } else if (e instanceof MongooseError.CastError) { + this.invalidate(e.path, e); + if (e.$originalErrorPath) { + this.invalidate(path, + new MongooseError.CastError(schema.instance, val, path, e.$originalErrorPath)); + } + } else { + this.invalidate(path, + new MongooseError.CastError(schema.instance, val, path, e)); + } + shouldSet = false; + } + + if (shouldSet) { + const doc = this.ownerDocument ? this.ownerDocument() : this; + const savedState = doc.$__.savedState; + const savedStatePath = this.ownerDocument ? this.$__.fullPath + '.' 
+ path : path; + if (savedState != null) { + const firstDot = savedStatePath.indexOf('.'); + const topLevelPath = firstDot === -1 ? savedStatePath : savedStatePath.slice(0, firstDot); + if (!savedState.hasOwnProperty(topLevelPath)) { + savedState[topLevelPath] = utils.clone(doc.$__getValue(topLevelPath)); + } + } + + this.$__set(pathToMark, path, options, constructing, parts, schema, val, priorVal); + + if (savedState != null && savedState.hasOwnProperty(savedStatePath) && utils.deepEqual(val, savedState[savedStatePath])) { + this.unmarkModified(path); + } + } + + if (schema.$isSingleNested && (this.isDirectModified(path) || val == null)) { + cleanModifiedSubpaths(this, path); + } + + return this; +}; + +/*! + * ignore + */ + +function _isManuallyPopulatedArray(val, ref) { + if (!Array.isArray(val)) { + return false; + } + if (val.length === 0) { + return false; + } + + for (const el of val) { + if (!(el instanceof Document)) { + return false; + } + const modelName = el.constructor.modelName; + if (modelName == null) { + return false; + } + if (el.constructor.modelName != ref && el.constructor.baseModelName != ref) { + return false; + } + } + + return true; +} + +/** + * Sets the value of a path, or many paths. + * + * ####Example: + * + * // path, value + * doc.set(path, value) + * + * // object + * doc.set({ + * path : value + * , path2 : { + * path : value + * } + * }) + * + * // on-the-fly cast to number + * doc.set(path, value, Number) + * + * // on-the-fly cast to string + * doc.set(path, value, String) + * + * // changing strict mode behavior + * doc.set(path, value, { strict: false }); + * + * @param {String|Object} path path or object of key/vals to set + * @param {Any} val the value to set + * @param {Schema|String|Number|Buffer|*} [type] optionally specify a type for "on-the-fly" attributes + * @param {Object} [options] optionally specify options that modify the behavior of the set + * @api public + * @method set + * @memberOf Document + * @instance + */ + +Document.prototype.set = Document.prototype.$set; + +/** + * Determine if we should mark this change as modified. + * + * @return {Boolean} + * @api private + * @method $__shouldModify + * @memberOf Document + * @instance + */ + +Document.prototype.$__shouldModify = function(pathToMark, path, options, constructing, parts, schema, val, priorVal) { + if (options._skipMarkModified) { + return false; + } + if (this.$isNew) { + return true; + } + + // Re: the note about gh-7196, `val` is the raw value without casting or + // setters if the full path is under a single nested subdoc because we don't + // want to double run setters. So don't set it as modified. See gh-7264. + if (this.$__schema.singleNestedPaths[path] != null) { + return false; + } + + if (val === void 0 && !this.$__isSelected(path)) { + // when a path is not selected in a query, its initial + // value will be undefined. 
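+    // Setting such a path (even to undefined) therefore still counts as a
+    // modification.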
+ return true; + } + + if (val === void 0 && path in this.$__.activePaths.states.default) { + // we're just unsetting the default value which was never saved + return false; + } + + // gh-3992: if setting a populated field to a doc, don't mark modified + // if they have the same _id + if (this.$populated(path) && + val instanceof Document && + deepEqual(val._id, priorVal)) { + return false; + } + + if (!deepEqual(val, priorVal || utils.getValue(path, this))) { + return true; + } + + if (!constructing && + val !== null && + val !== undefined && + path in this.$__.activePaths.states.default && + deepEqual(val, schema.getDefault(this, constructing))) { + // a path with a default was $unset on the server + // and the user is setting it to the same value again + return true; + } + return false; +}; + +/** + * Handles the actual setting of the value and marking the path modified if appropriate. + * + * @api private + * @method $__set + * @memberOf Document + * @instance + */ + +Document.prototype.$__set = function(pathToMark, path, options, constructing, parts, schema, val, priorVal) { + Embedded = Embedded || require('./types/ArraySubdocument'); + + const shouldModify = this.$__shouldModify(pathToMark, path, options, constructing, parts, + schema, val, priorVal); + const _this = this; + + if (shouldModify) { + this.markModified(pathToMark); + + // handle directly setting arrays (gh-1126) + MongooseArray || (MongooseArray = require('./types/array')); + if (val && val.isMongooseArray) { + val._registerAtomic('$set', val); + + // Update embedded document parent references (gh-5189) + if (val.isMongooseDocumentArray) { + val.forEach(function(item) { + item && item.__parentArray && (item.__parentArray = val); + }); + } + + // Small hack for gh-1638: if we're overwriting the entire array, ignore + // paths that were modified before the array overwrite + this.$__.activePaths.forEach(function(modifiedPath) { + if (modifiedPath.startsWith(path + '.')) { + _this.$__.activePaths.ignore(modifiedPath); + } + }); + } + } else if (Array.isArray(val) && val.isMongooseArray && Array.isArray(priorVal) && priorVal.isMongooseArray) { + val[arrayAtomicsSymbol] = priorVal[arrayAtomicsSymbol]; + val[arrayAtomicsBackupSymbol] = priorVal[arrayAtomicsBackupSymbol]; + } + + let obj = this._doc; + let i = 0; + const l = parts.length; + let cur = ''; + + for (; i < l; i++) { + const next = i + 1; + const last = next === l; + cur += (cur ? '.' 
+ parts[i] : parts[i]); + if (specialProperties.has(parts[i])) { + return; + } + + if (last) { + if (obj instanceof Map) { + obj.set(parts[i], val); + } else { + obj[parts[i]] = val; + } + } else { + if (utils.isPOJO(obj[parts[i]])) { + obj = obj[parts[i]]; + } else if (obj[parts[i]] && obj[parts[i]] instanceof Embedded) { + obj = obj[parts[i]]; + } else if (obj[parts[i]] && obj[parts[i]].$isSingleNested) { + obj = obj[parts[i]]; + } else if (obj[parts[i]] && Array.isArray(obj[parts[i]])) { + obj = obj[parts[i]]; + } else { + obj[parts[i]] = obj[parts[i]] || {}; + obj = obj[parts[i]]; + } + } + } +}; + +/** + * Gets a raw value from a path (no getters) + * + * @param {String} path + * @api private + */ + +Document.prototype.$__getValue = function(path) { + return utils.getValue(path, this._doc); +}; + +/** + * Sets a raw value for a path (no casting, setters, transformations) + * + * @param {String} path + * @param {Object} value + * @api private + */ + +Document.prototype.$__setValue = function(path, val) { + utils.setValue(path, val, this._doc); + return this; +}; + +/** + * Returns the value of a path. + * + * ####Example + * + * // path + * doc.get('age') // 47 + * + * // dynamic casting to a string + * doc.get('age', String) // "47" + * + * @param {String} path + * @param {Schema|String|Number|Buffer|*} [type] optionally specify a type for on-the-fly attributes + * @param {Object} [options] + * @param {Boolean} [options.virtuals=false] Apply virtuals before getting this path + * @param {Boolean} [options.getters=true] If false, skip applying getters and just get the raw value + * @api public + */ + +Document.prototype.get = function(path, type, options) { + let adhoc; + options = options || {}; + if (type) { + adhoc = this.$__schema.interpretAsType(path, type, this.$__schema.options); + } + + let schema = this.$__path(path); + if (schema == null) { + schema = this.$__schema.virtualpath(path); + } + if (schema instanceof MixedSchema) { + const virtual = this.$__schema.virtualpath(path); + if (virtual != null) { + schema = virtual; + } + } + const pieces = path.indexOf('.') === -1 ? [path] : path.split('.'); + let obj = this._doc; + + if (schema instanceof VirtualType) { + return schema.applyGetters(void 0, this); + } + + // Might need to change path for top-level alias + if (typeof this.$__schema.aliases[pieces[0]] == 'string') { + pieces[0] = this.$__schema.aliases[pieces[0]]; + } + + for (let i = 0, l = pieces.length; i < l; i++) { + if (obj && obj._doc) { + obj = obj._doc; + } + + if (obj == null) { + obj = void 0; + } else if (obj instanceof Map) { + obj = obj.get(pieces[i], { getters: false }); + } else if (i === l - 1) { + obj = utils.getValue(pieces[i], obj); + } else { + obj = obj[pieces[i]]; + } + } + + if (adhoc) { + obj = adhoc.cast(obj); + } + + if (schema != null && options.getters !== false) { + obj = schema.applyGetters(obj, this); + } else if (this.$__schema.nested[path] && options.virtuals) { + // Might need to apply virtuals if this is a nested path + return applyVirtuals(this, utils.clone(obj) || {}, { path: path }); + } + + return obj; +}; + +/*! + * ignore + */ + +Document.prototype[getSymbol] = Document.prototype.get; +Document.prototype.$get = Document.prototype.get; +/** + * Returns the schematype for the given `path`. 
+ * + * @param {String} path + * @api private + * @method $__path + * @memberOf Document + * @instance + */ + +Document.prototype.$__path = function(path) { + const adhocs = this.$__.adhocPaths; + const adhocType = adhocs && adhocs.hasOwnProperty(path) ? adhocs[path] : null; + + if (adhocType) { + return adhocType; + } + return this.$__schema.path(path); +}; + +/** + * Marks the path as having pending changes to write to the db. + * + * _Very helpful when using [Mixed](./schematypes.html#mixed) types._ + * + * ####Example: + * + * doc.mixed.type = 'changed'; + * doc.markModified('mixed.type'); + * doc.save() // changes to mixed.type are now persisted + * + * @param {String} path the path to mark modified + * @param {Document} [scope] the scope to run validators with + * @api public + */ + +Document.prototype.markModified = function(path, scope) { + this.$__.activePaths.modify(path); + if (scope != null && !this.ownerDocument) { + this.$__.pathsToScopes = this.$__pathsToScopes || {}; + this.$__.pathsToScopes[path] = scope; + } +}; + +/** + * Clears the modified state on the specified path. + * + * ####Example: + * + * doc.foo = 'bar'; + * doc.unmarkModified('foo'); + * doc.save(); // changes to foo will not be persisted + * + * @param {String} path the path to unmark modified + * @api public + */ + +Document.prototype.unmarkModified = function(path) { + this.$__.activePaths.init(path); + if (this.$__.pathsToScopes != null) { + delete this.$__.pathsToScopes[path]; + } +}; + +/** + * Don't run validation on this path or persist changes to this path. + * + * ####Example: + * + * doc.foo = null; + * doc.$ignore('foo'); + * doc.save(); // changes to foo will not be persisted and validators won't be run + * + * @memberOf Document + * @instance + * @method $ignore + * @param {String} path the path to ignore + * @api public + */ + +Document.prototype.$ignore = function(path) { + this.$__.activePaths.ignore(path); +}; + +/** + * Returns the list of paths that have been directly modified. A direct + * modified path is a path that you explicitly set, whether via `doc.foo = 'bar'`, + * `Object.assign(doc, { foo: 'bar' })`, or `doc.set('foo', 'bar')`. + * + * A path `a` may be in `modifiedPaths()` but not in `directModifiedPaths()` + * because a child of `a` was directly modified. + * + * ####Example + * const schema = new Schema({ foo: String, nested: { bar: String } }); + * const Model = mongoose.model('Test', schema); + * await Model.create({ foo: 'original', nested: { bar: 'original' } }); + * + * const doc = await Model.findOne(); + * doc.nested.bar = 'modified'; + * doc.directModifiedPaths(); // ['nested.bar'] + * doc.modifiedPaths(); // ['nested', 'nested.bar'] + * + * @return {Array} + * @api public + */ + +Document.prototype.directModifiedPaths = function() { + return Object.keys(this.$__.activePaths.states.modify); +}; + +/** + * Returns true if the given path is nullish or only contains empty objects. + * Useful for determining whether this subdoc will get stripped out by the + * [minimize option](/docs/guide.html#minimize). 
+ * + * ####Example: + * const schema = new Schema({ nested: { foo: String } }); + * const Model = mongoose.model('Test', schema); + * const doc = new Model({}); + * doc.$isEmpty('nested'); // true + * doc.nested.$isEmpty(); // true + * + * doc.nested.foo = 'bar'; + * doc.$isEmpty('nested'); // false + * doc.nested.$isEmpty(); // false + * + * @memberOf Document + * @instance + * @api public + * @method $isEmpty + * @return {Boolean} + */ + +Document.prototype.$isEmpty = function(path) { + const isEmptyOptions = { + minimize: true, + virtuals: false, + getters: false, + transform: false + }; + + if (arguments.length > 0) { + const v = this.$get(path); + if (v == null) { + return true; + } + if (typeof v !== 'object') { + return false; + } + if (utils.isPOJO(v)) { + return _isEmpty(v); + } + return Object.keys(v.toObject(isEmptyOptions)).length === 0; + } + + return Object.keys(this.toObject(isEmptyOptions)).length === 0; +}; + +function _isEmpty(v) { + if (v == null) { + return true; + } + if (typeof v !== 'object' || Array.isArray(v)) { + return false; + } + for (const key of Object.keys(v)) { + if (!_isEmpty(v[key])) { + return false; + } + } + return true; +} + +/** + * Returns the list of paths that have been modified. + * + * @param {Object} [options] + * @param {Boolean} [options.includeChildren=false] if true, returns children of modified paths as well. For example, if false, the list of modified paths for `doc.colors = { primary: 'blue' };` will **not** contain `colors.primary`. If true, `modifiedPaths()` will return an array that contains `colors.primary`. + * @return {Array} + * @api public + */ + +Document.prototype.modifiedPaths = function(options) { + options = options || {}; + const directModifiedPaths = Object.keys(this.$__.activePaths.states.modify); + const _this = this; + return directModifiedPaths.reduce(function(list, path) { + const parts = path.split('.'); + list = list.concat(parts.reduce(function(chains, part, i) { + return chains.concat(parts.slice(0, i).concat(part).join('.')); + }, []).filter(function(chain) { + return (list.indexOf(chain) === -1); + })); + + if (!options.includeChildren) { + return list; + } + + let cur = _this.$get(path); + if (cur != null && typeof cur === 'object') { + if (cur._doc) { + cur = cur._doc; + } + if (Array.isArray(cur)) { + const len = cur.length; + for (let i = 0; i < len; ++i) { + if (list.indexOf(path + '.' + i) === -1) { + list.push(path + '.' + i); + if (cur[i] != null && cur[i].$__) { + const modified = cur[i].modifiedPaths(); + for (const childPath of modified) { + list.push(path + '.' + i + '.' + childPath); + } + } + } + } + } else { + Object.keys(cur). + filter(function(key) { + return list.indexOf(path + '.' + key) === -1; + }). + forEach(function(key) { + list.push(path + '.' + key); + }); + } + } + + return list; + }, []); +}; + +Document.prototype[documentModifiedPaths] = Document.prototype.modifiedPaths; + +/** + * Returns true if any of the given paths is modified, else false. If no arguments, returns `true` if any path + * in this document is modified. + * + * If `path` is given, checks if a path or any full path containing `path` as part of its path chain has been modified. 
+ * + * ####Example + * + * doc.set('documents.0.title', 'changed'); + * doc.isModified() // true + * doc.isModified('documents') // true + * doc.isModified('documents.0.title') // true + * doc.isModified('documents otherProp') // true + * doc.isDirectModified('documents') // false + * + * @param {String} [path] optional + * @return {Boolean} + * @api public + */ + +Document.prototype.isModified = function(paths, modifiedPaths) { + if (paths) { + if (!Array.isArray(paths)) { + paths = paths.split(' '); + } + const modified = modifiedPaths || this[documentModifiedPaths](); + const directModifiedPaths = Object.keys(this.$__.activePaths.states.modify); + const isModifiedChild = paths.some(function(path) { + return !!~modified.indexOf(path); + }); + + return isModifiedChild || paths.some(function(path) { + return directModifiedPaths.some(function(mod) { + return mod === path || path.startsWith(mod + '.'); + }); + }); + } + + return this.$__.activePaths.some('modify'); +}; + +Document.prototype.$isModified = Document.prototype.isModified; + +Document.prototype[documentIsModified] = Document.prototype.isModified; + +/** + * Checks if a path is set to its default. + * + * ####Example + * + * MyModel = mongoose.model('test', { name: { type: String, default: 'Val '} }); + * const m = new MyModel(); + * m.$isDefault('name'); // true + * + * @memberOf Document + * @instance + * @method $isDefault + * @param {String} [path] + * @return {Boolean} + * @api public + */ + +Document.prototype.$isDefault = function(path) { + if (path == null) { + return this.$__.activePaths.some('default'); + } + + if (typeof path === 'string' && path.indexOf(' ') === -1) { + return this.$__.activePaths.states.default.hasOwnProperty(path); + } + + let paths = path; + if (!Array.isArray(paths)) { + paths = paths.split(' '); + } + + return paths.some(path => this.$__.activePaths.states.default.hasOwnProperty(path)); +}; + +/** + * Getter/setter, determines whether the document was removed or not. + * + * ####Example: + * const product = await product.remove(); + * product.$isDeleted(); // true + * product.remove(); // no-op, doesn't send anything to the db + * + * product.$isDeleted(false); + * product.$isDeleted(); // false + * product.remove(); // will execute a remove against the db + * + * + * @param {Boolean} [val] optional, overrides whether mongoose thinks the doc is deleted + * @return {Boolean} whether mongoose thinks this doc is deleted. + * @method $isDeleted + * @memberOf Document + * @instance + * @api public + */ + +Document.prototype.$isDeleted = function(val) { + if (arguments.length === 0) { + return !!this.$__.isDeleted; + } + + this.$__.isDeleted = !!val; + return this; +}; + +/** + * Returns true if `path` was directly set and modified, else false. 
+ * + * ####Example + * + * doc.set('documents.0.title', 'changed'); + * doc.isDirectModified('documents.0.title') // true + * doc.isDirectModified('documents') // false + * + * @param {String|Array} path + * @return {Boolean} + * @api public + */ + +Document.prototype.isDirectModified = function(path) { + if (path == null) { + return this.$__.activePaths.some('modify'); + } + + if (typeof path === 'string' && path.indexOf(' ') === -1) { + return this.$__.activePaths.states.modify.hasOwnProperty(path); + } + + let paths = path; + if (!Array.isArray(paths)) { + paths = paths.split(' '); + } + + return paths.some(path => this.$__.activePaths.states.modify.hasOwnProperty(path)); +}; + +/** + * Checks if `path` is in the `init` state, that is, it was set by `Document#init()` and not modified since. + * + * @param {String} path + * @return {Boolean} + * @api public + */ + +Document.prototype.isInit = function(path) { + if (path == null) { + return this.$__.activePaths.some('init'); + } + + if (typeof path === 'string' && path.indexOf(' ') === -1) { + return this.$__.activePaths.states.init.hasOwnProperty(path); + } + + let paths = path; + if (!Array.isArray(paths)) { + paths = paths.split(' '); + } + + return paths.some(path => this.$__.activePaths.states.init.hasOwnProperty(path)); +}; + +/** + * Checks if `path` was selected in the source query which initialized this document. + * + * ####Example + * + * const doc = await Thing.findOne().select('name'); + * doc.isSelected('name') // true + * doc.isSelected('age') // false + * + * @param {String|Array} path + * @return {Boolean} + * @api public + */ + +Document.prototype.isSelected = function isSelected(path) { + if (this.$__.selected == null) { + return true; + } + + if (path === '_id') { + return this.$__.selected._id !== 0; + } + + if (path.indexOf(' ') !== -1) { + path = path.split(' '); + } + if (Array.isArray(path)) { + return path.some(p => this.$__isSelected(p)); + } + + const paths = Object.keys(this.$__.selected); + let inclusive = null; + + if (paths.length === 1 && paths[0] === '_id') { + // only _id was selected. + return this.$__.selected._id === 0; + } + + for (const cur of paths) { + if (cur === '_id') { + continue; + } + if (!isDefiningProjection(this.$__.selected[cur])) { + continue; + } + inclusive = !!this.$__.selected[cur]; + break; + } + + if (inclusive === null) { + return true; + } + + if (path in this.$__.selected) { + return inclusive; + } + + const pathDot = path + '.'; + + for (const cur of paths) { + if (cur === '_id') { + continue; + } + + if (cur.startsWith(pathDot)) { + return inclusive || cur !== pathDot; + } + + if (pathDot.startsWith(cur + '.')) { + return inclusive; + } + } + + return !inclusive; +}; + +Document.prototype.$__isSelected = Document.prototype.isSelected; + +/** + * Checks if `path` was explicitly selected. If no projection, always returns + * true. 
+ * + * ####Example + * + * Thing.findOne().select('nested.name').exec(function (err, doc) { + * doc.isDirectSelected('nested.name') // true + * doc.isDirectSelected('nested.otherName') // false + * doc.isDirectSelected('nested') // false + * }) + * + * @param {String} path + * @return {Boolean} + * @api public + */ + +Document.prototype.isDirectSelected = function isDirectSelected(path) { + if (this.$__.selected == null) { + return true; + } + + if (path === '_id') { + return this.$__.selected._id !== 0; + } + + if (path.indexOf(' ') !== -1) { + path = path.split(' '); + } + if (Array.isArray(path)) { + return path.some(p => this.isDirectSelected(p)); + } + + const paths = Object.keys(this.$__.selected); + let inclusive = null; + + if (paths.length === 1 && paths[0] === '_id') { + // only _id was selected. + return this.$__.selected._id === 0; + } + + for (const cur of paths) { + if (cur === '_id') { + continue; + } + if (!isDefiningProjection(this.$__.selected[cur])) { + continue; + } + inclusive = !!this.$__.selected[cur]; + break; + } + + if (inclusive === null) { + return true; + } + + if (this.$__.selected.hasOwnProperty(path)) { + return inclusive; + } + + return !inclusive; +}; + +/** + * Executes registered validation rules for this document. + * + * ####Note: + * + * This method is called `pre` save and if a validation rule is violated, [save](#model_Model-save) is aborted and the error is returned to your `callback`. + * + * ####Example: + * + * doc.validate(function (err) { + * if (err) handleError(err); + * else // validation passed + * }); + * + * @param {Array|String} [pathsToValidate] list of paths to validate. If set, Mongoose will validate only the modified paths that are in the given list. + * @param {Object} [options] internal options + * @param {Boolean} [options.validateModifiedOnly=false] if `true` mongoose validates only modified paths. + * @param {Array|string} [options.pathsToSkip] list of paths to skip. If set, Mongoose will validate every modified path that is not in this list. + * @param {Function} [callback] optional callback called after validation completes, passing an error if one occurred + * @return {Promise} Promise + * @api public + */ + +Document.prototype.validate = function(pathsToValidate, options, callback) { + let parallelValidate; + this.$op = 'validate'; + + if (this.ownerDocument != null) { + // Skip parallel validate check for subdocuments + } else if (this.$__.validating) { + parallelValidate = new ParallelValidateError(this, { + parentStack: options && options.parentStack, + conflictStack: this.$__.validating.stack + }); + } else { + this.$__.validating = new ParallelValidateError(this, { parentStack: options && options.parentStack }); + } + + if (arguments.length === 1) { + if (typeof arguments[0] === 'object' && !Array.isArray(arguments[0])) { + options = arguments[0]; + callback = null; + pathsToValidate = null; + } else if (typeof arguments[0] === 'function') { + callback = arguments[0]; + options = null; + pathsToValidate = null; + } + } else if (typeof pathsToValidate === 'function') { + callback = pathsToValidate; + options = null; + pathsToValidate = null; + } else if (typeof options === 'function') { + callback = options; + options = pathsToValidate; + pathsToValidate = null; + } + if (options && typeof options.pathsToSkip === 'string') { + const isOnePathOnly = options.pathsToSkip.indexOf(' ') === -1; + options.pathsToSkip = isOnePathOnly ? 
[options.pathsToSkip] : options.pathsToSkip.split(' '); + } + + return promiseOrCallback(callback, cb => { + if (parallelValidate != null) { + return cb(parallelValidate); + } + + this.$__validate(pathsToValidate, options, (error) => { + this.$op = null; + cb(error); + }); + }, this.constructor.events); +}; + +Document.prototype.$validate = Document.prototype.validate; + +/*! + * ignore + */ + +function _evaluateRequiredFunctions(doc) { + Object.keys(doc.$__.activePaths.states.require).forEach(path => { + const p = doc.$__schema.path(path); + + if (p != null && typeof p.originalRequiredValue === 'function') { + doc.$__.cachedRequired = doc.$__.cachedRequired || {}; + doc.$__.cachedRequired[path] = p.originalRequiredValue.call(doc, doc); + } + }); +} + +/*! + * ignore + */ + +function _getPathsToValidate(doc) { + const skipSchemaValidators = {}; + + _evaluateRequiredFunctions(doc); + // only validate required fields when necessary + let paths = new Set(Object.keys(doc.$__.activePaths.states.require).filter(function(path) { + if (!doc.$__isSelected(path) && !doc.$isModified(path)) { + return false; + } + if (doc.$__.cachedRequired != null && path in doc.$__.cachedRequired) { + return doc.$__.cachedRequired[path]; + } + return true; + })); + + + Object.keys(doc.$__.activePaths.states.init).forEach(addToPaths); + Object.keys(doc.$__.activePaths.states.modify).forEach(addToPaths); + Object.keys(doc.$__.activePaths.states.default).forEach(addToPaths); + function addToPaths(p) { paths.add(p); } + + const subdocs = doc.$getAllSubdocs(); + const modifiedPaths = doc.modifiedPaths(); + for (const subdoc of subdocs) { + if (subdoc.$basePath) { + // Remove child paths for now, because we'll be validating the whole + // subdoc + for (const p of paths) { + if (p === null || p.startsWith(subdoc.$basePath + '.')) { + paths.delete(p); + } + } + + if (doc.$isModified(subdoc.$basePath, modifiedPaths) && + !doc.isDirectModified(subdoc.$basePath) && + !doc.$isDefault(subdoc.$basePath)) { + paths.add(subdoc.$basePath); + + skipSchemaValidators[subdoc.$basePath] = true; + } + } + } + + // from here on we're not removing items from paths + + // gh-661: if a whole array is modified, make sure to run validation on all + // the children as well + for (const path of paths) { + const _pathType = doc.$__schema.path(path); + if (!_pathType || + !_pathType.$isMongooseArray || + // To avoid potential performance issues, skip doc arrays whose children + // are not required. `getPositionalPathType()` may be slow, so avoid + // it unless we have a case of #6364 + (_pathType.$isMongooseDocumentArray && !get(_pathType, 'schemaOptions.required'))) { + continue; + } + + const val = doc.$__getValue(path); + _pushNestedArrayPaths(val, paths, path); + } + + function _pushNestedArrayPaths(val, paths, path) { + if (val != null) { + const numElements = val.length; + for (let j = 0; j < numElements; ++j) { + if (Array.isArray(val[j])) { + _pushNestedArrayPaths(val[j], paths, path + '.' + j); + } else { + paths.add(path + '.' 
+ j); + } + } + } + } + + const flattenOptions = { skipArrays: true }; + for (const pathToCheck of paths) { + if (doc.$__schema.nested[pathToCheck]) { + let _v = doc.$__getValue(pathToCheck); + if (isMongooseObject(_v)) { + _v = _v.toObject({ transform: false }); + } + const flat = flatten(_v, pathToCheck, flattenOptions, doc.$__schema); + Object.keys(flat).forEach(addToPaths); + } + } + + for (const path of paths) { + // Single nested paths (paths embedded under single nested subdocs) will + // be validated on their own when we call `validate()` on the subdoc itself. + // Re: gh-8468 + if (doc.$__schema.singleNestedPaths.hasOwnProperty(path)) { + paths.delete(path); + continue; + } + const _pathType = doc.$__schema.path(path); + if (!_pathType || !_pathType.$isSchemaMap) { + continue; + } + + const val = doc.$__getValue(path); + if (val == null) { + continue; + } + for (const key of val.keys()) { + paths.add(path + '.' + key); + } + } + + paths = Array.from(paths); + return [paths, skipSchemaValidators]; +} + +/*! + * ignore + */ + +Document.prototype.$__validate = function(pathsToValidate, options, callback) { + if (typeof pathsToValidate === 'function') { + callback = pathsToValidate; + options = null; + pathsToValidate = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } + + const hasValidateModifiedOnlyOption = options && + (typeof options === 'object') && + ('validateModifiedOnly' in options); + + const pathsToSkip = get(options, 'pathsToSkip', null); + + let shouldValidateModifiedOnly; + if (hasValidateModifiedOnlyOption) { + shouldValidateModifiedOnly = !!options.validateModifiedOnly; + } else { + shouldValidateModifiedOnly = this.$__schema.options.validateModifiedOnly; + } + + const _this = this; + const _complete = () => { + let validationError = this.$__.validationError; + this.$__.validationError = undefined; + + if (shouldValidateModifiedOnly && validationError != null) { + // Remove any validation errors that aren't from modified paths + const errors = Object.keys(validationError.errors); + for (const errPath of errors) { + if (!this.$isModified(errPath)) { + delete validationError.errors[errPath]; + } + } + if (Object.keys(validationError.errors).length === 0) { + validationError = void 0; + } + } + + this.$__.cachedRequired = {}; + this.$emit('validate', _this); + this.constructor.emit('validate', _this); + + this.$__.validating = null; + if (validationError) { + for (const key in validationError.errors) { + // Make sure cast errors persist + if (!this[documentArrayParent] && + validationError.errors[key] instanceof MongooseError.CastError) { + this.invalidate(key, validationError.errors[key]); + } + } + + return validationError; + } + }; + + // only validate required fields when necessary + const pathDetails = _getPathsToValidate(this); + let paths = shouldValidateModifiedOnly ? 
+ pathDetails[0].filter((path) => this.$isModified(path)) : + pathDetails[0]; + const skipSchemaValidators = pathDetails[1]; + if (typeof pathsToValidate === 'string') { + pathsToValidate = pathsToValidate.split(' '); + } + if (Array.isArray(pathsToValidate)) { + paths = _handlePathsToValidate(paths, pathsToValidate); + } else if (pathsToSkip) { + paths = _handlePathsToSkip(paths, pathsToSkip); + } + if (paths.length === 0) { + return immediate(function() { + const error = _complete(); + if (error) { + return _this.$__schema.s.hooks.execPost('validate:error', _this, [_this], { error: error }, function(error) { + callback(error); + }); + } + callback(null, _this); + }); + } + + const validated = {}; + let total = 0; + + for (const path of paths) { + validatePath(path); + } + + function validatePath(path) { + if (path == null || validated[path]) { + return; + } + + validated[path] = true; + total++; + + immediate(function() { + const schemaType = _this.$__schema.path(path); + + if (!schemaType) { + return --total || complete(); + } + + // If user marked as invalid or there was a cast error, don't validate + if (!_this.$isValid(path)) { + --total || complete(); + return; + } + + // If setting a path under a mixed path, avoid using the mixed path validator (gh-10141) + if (schemaType[schemaMixedSymbol] != null && path !== schemaType.path) { + return --total || complete(); + } + + let val = _this.$__getValue(path); + + // If you `populate()` and get back a null value, required validators + // shouldn't fail (gh-8018). We should always fall back to the populated + // value. + let pop; + if ((pop = _this.$populated(path))) { + val = pop; + } else if (val != null && val.$__ != null && val.$__.wasPopulated) { + // Array paths, like `somearray.1`, do not show up as populated with `$populated()`, + // so in that case pull out the document's id + val = val._id; + } + const scope = _this.$__.pathsToScopes != null && path in _this.$__.pathsToScopes ? + _this.$__.pathsToScopes[path] : + _this; + + const doValidateOptions = { + skipSchemaValidators: skipSchemaValidators[path], + path: path, + validateModifiedOnly: shouldValidateModifiedOnly + }; + + schemaType.doValidate(val, function(err) { + if (err) { + const isSubdoc = schemaType.$isSingleNested || + schemaType.$isArraySubdocument || + schemaType.$isMongooseDocumentArray; + if (isSubdoc && err instanceof ValidationError) { + return --total || complete(); + } + _this.invalidate(path, err, undefined, true); + } + --total || complete(); + }, scope, doValidateOptions); + }); + } + + function complete() { + const error = _complete(); + if (error) { + return _this.$__schema.s.hooks.execPost('validate:error', _this, [_this], { error: error }, function(error) { + callback(error); + }); + } + callback(null, _this); + } + +}; + +/*! + * ignore + */ + +function _handlePathsToValidate(paths, pathsToValidate) { + const _pathsToValidate = new Set(pathsToValidate); + const parentPaths = new Map([]); + for (const path of pathsToValidate) { + if (path.indexOf('.') === -1) { + continue; + } + const pieces = path.split('.'); + let cur = pieces[0]; + for (let i = 1; i < pieces.length; ++i) { + // Since we skip subpaths under single nested subdocs to + // avoid double validation, we need to add back the + // single nested subpath if the user asked for it (gh-8626) + parentPaths.set(cur, path); + cur = cur + '.' 
+ pieces[i]; + } + } + + const ret = []; + for (const path of paths) { + if (_pathsToValidate.has(path)) { + ret.push(path); + } else if (parentPaths.has(path)) { + ret.push(parentPaths.get(path)); + } + } + return ret; +} + +/*! + * ignore + */ +function _handlePathsToSkip(paths, pathsToSkip) { + pathsToSkip = new Set(pathsToSkip); + paths = paths.filter(p => !pathsToSkip.has(p)); + return paths; +} + +/** + * Executes registered validation rules (skipping asynchronous validators) for this document. + * + * ####Note: + * + * This method is useful if you need synchronous validation. + * + * ####Example: + * + * const err = doc.validateSync(); + * if (err) { + * handleError(err); + * } else { + * // validation passed + * } + * + * @param {Array|string} pathsToValidate only validate the given paths + * @param {Object} [options] options for validation + * @param {Boolean} [options.validateModifiedOnly=false] If `true`, Mongoose will only validate modified paths, as opposed to modified paths and `required` paths. + * @param {Array|string} [options.pathsToSkip] list of paths to skip. If set, Mongoose will validate every modified path that is not in this list. + * @return {ValidationError|undefined} ValidationError if there are errors during validation, or undefined if there is no error. + * @api public + */ + +Document.prototype.validateSync = function(pathsToValidate, options) { + const _this = this; + + if (arguments.length === 1 && typeof arguments[0] === 'object' && !Array.isArray(arguments[0])) { + options = arguments[0]; + pathsToValidate = null; + } + + const hasValidateModifiedOnlyOption = options && + (typeof options === 'object') && + ('validateModifiedOnly' in options); + + let shouldValidateModifiedOnly; + if (hasValidateModifiedOnlyOption) { + shouldValidateModifiedOnly = !!options.validateModifiedOnly; + } else { + shouldValidateModifiedOnly = this.$__schema.options.validateModifiedOnly; + } + + let pathsToSkip = options && options.pathsToSkip; + + if (typeof pathsToValidate === 'string') { + const isOnePathOnly = pathsToValidate.indexOf(' ') === -1; + pathsToValidate = isOnePathOnly ? [pathsToValidate] : pathsToValidate.split(' '); + } else if (typeof pathsToSkip === 'string' && pathsToSkip.indexOf(' ') !== -1) { + pathsToSkip = pathsToSkip.split(' '); + } + + // only validate required fields when necessary + const pathDetails = _getPathsToValidate(this); + let paths = shouldValidateModifiedOnly ? 
+ pathDetails[0].filter((path) => this.$isModified(path)) : + pathDetails[0]; + const skipSchemaValidators = pathDetails[1]; + + if (Array.isArray(pathsToValidate)) { + paths = _handlePathsToValidate(paths, pathsToValidate); + } else if (Array.isArray(pathsToSkip)) { + paths = _handlePathsToSkip(paths, pathsToSkip); + } + const validating = {}; + + paths.forEach(function(path) { + if (validating[path]) { + return; + } + + validating[path] = true; + + const p = _this.$__schema.path(path); + if (!p) { + return; + } + if (!_this.$isValid(path)) { + return; + } + + const val = _this.$__getValue(path); + const err = p.doValidateSync(val, _this, { + skipSchemaValidators: skipSchemaValidators[path], + path: path, + validateModifiedOnly: shouldValidateModifiedOnly + }); + if (err) { + const isSubdoc = p.$isSingleNested || + p.$isArraySubdocument || + p.$isMongooseDocumentArray; + if (isSubdoc && err instanceof ValidationError) { + return; + } + _this.invalidate(path, err, undefined, true); + } + }); + + const err = _this.$__.validationError; + _this.$__.validationError = undefined; + _this.$emit('validate', _this); + _this.constructor.emit('validate', _this); + + if (err) { + for (const key in err.errors) { + // Make sure cast errors persist + if (err.errors[key] instanceof MongooseError.CastError) { + _this.invalidate(key, err.errors[key]); + } + } + } + + return err; +}; + +/** + * Marks a path as invalid, causing validation to fail. + * + * The `errorMsg` argument will become the message of the `ValidationError`. + * + * The `value` argument (if passed) will be available through the `ValidationError.value` property. + * + * doc.invalidate('size', 'must be less than 20', 14); + + * doc.validate(function (err) { + * console.log(err) + * // prints + * { message: 'Validation failed', + * name: 'ValidationError', + * errors: + * { size: + * { message: 'must be less than 20', + * name: 'ValidatorError', + * path: 'size', + * type: 'user defined', + * value: 14 } } } + * }) + * + * @param {String} path the field to invalidate. For array elements, use the `array.i.field` syntax, where `i` is the 0-based index in the array. + * @param {String|Error} errorMsg the error which states the reason `path` was invalid + * @param {Object|String|Number|any} value optional invalid value + * @param {String} [kind] optional `kind` property for the error + * @return {ValidationError} the current ValidationError, with all currently invalidated paths + * @api public + */ + +Document.prototype.invalidate = function(path, err, val, kind) { + if (!this.$__.validationError) { + this.$__.validationError = new ValidationError(this); + } + + if (this.$__.validationError.errors[path]) { + return; + } + + if (!err || typeof err === 'string') { + err = new ValidatorError({ + path: path, + message: err, + type: kind || 'user defined', + value: val + }); + } + + if (this.$__.validationError === err) { + return this.$__.validationError; + } + + this.$__.validationError.addError(path, err); + return this.$__.validationError; +}; + +/** + * Marks a path as valid, removing existing validation errors. 
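+ *
+ * ####Example (a minimal sketch, assuming the path's own validators pass and no other paths are invalid):
+ *
+ * doc.invalidate('size', 'must be less than 20', 14);
+ * doc.$markValid('size');
+ * doc.validateSync(); // undefined, the manual invalidation was removed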
+ * + * @param {String} path the field to mark as valid + * @api public + * @memberOf Document + * @instance + * @method $markValid + */ + +Document.prototype.$markValid = function(path) { + if (!this.$__.validationError || !this.$__.validationError.errors[path]) { + return; + } + + delete this.$__.validationError.errors[path]; + if (Object.keys(this.$__.validationError.errors).length === 0) { + this.$__.validationError = null; + } +}; + +/*! + * ignore + */ + +function _markValidSubpaths(doc, path) { + if (!doc.$__.validationError) { + return; + } + + const keys = Object.keys(doc.$__.validationError.errors); + for (const key of keys) { + if (key.startsWith(path + '.')) { + delete doc.$__.validationError.errors[key]; + } + } + if (Object.keys(doc.$__.validationError.errors).length === 0) { + doc.$__.validationError = null; + } +} + +/*! + * ignore + */ + +function _checkImmutableSubpaths(subdoc, schematype, priorVal) { + const schema = schematype.schema; + if (schema == null) { + return; + } + + for (const key of Object.keys(schema.paths)) { + const path = schema.paths[key]; + if (path.$immutableSetter == null) { + continue; + } + const oldVal = priorVal == null ? void 0 : priorVal.$__getValue(key); + // Calling immutableSetter with `oldVal` even though it expects `newVal` + // is intentional. That's because `$immutableSetter` compares its param + // to the current value. + path.$immutableSetter.call(subdoc, oldVal); + } +} + +/** + * Saves this document by inserting a new document into the database if [document.isNew](/docs/api.html#document_Document-isNew) is `true`, + * or sends an [updateOne](/docs/api.html#document_Document-updateOne) operation **only** with the modifications to the database, it does not replace the whole document in the latter case. + * + * ####Example: + * + * product.sold = Date.now(); + * product = await product.save(); + * + * If save is successful, the returned promise will fulfill with the document + * saved. + * + * ####Example: + * + * const newProduct = await product.save(); + * newProduct === product; // true + * + * @param {Object} [options] options optional options + * @param {Session} [options.session=null] the [session](https://docs.mongodb.com/manual/reference/server-sessions/) associated with this save operation. If not specified, defaults to the [document's associated session](api.html#document_Document-$session). + * @param {Object} [options.safe] (DEPRECATED) overrides [schema's safe option](http://mongoosejs.com//docs/guide.html#safe). Use the `w` option instead. + * @param {Boolean} [options.validateBeforeSave] set to false to save without validating. + * @param {Boolean} [options.validateModifiedOnly=false] If `true`, Mongoose will only validate modified paths, as opposed to modified paths and `required` paths. + * @param {Number|String} [options.w] set the [write concern](https://docs.mongodb.com/manual/reference/write-concern/#w-option). Overrides the [schema-level `writeConcern` option](/docs/guide.html#writeConcern) + * @param {Boolean} [options.j] set to true for MongoDB to wait until this `save()` has been [journaled before resolving the returned promise](https://docs.mongodb.com/manual/reference/write-concern/#j-option). Overrides the [schema-level `writeConcern` option](/docs/guide.html#writeConcern) + * @param {Number} [options.wtimeout] sets a [timeout for the write concern](https://docs.mongodb.com/manual/reference/write-concern/#wtimeout). Overrides the [schema-level `writeConcern` option](/docs/guide.html#writeConcern). 
+ * @param {Boolean} [options.checkKeys=true] the MongoDB driver prevents you from saving keys that start with '$' or contain '.' by default. Set this option to `false` to skip that check. See [restrictions on field names](https://docs.mongodb.com/manual/reference/limits/#Restrictions-on-Field-Names) + * @param {Boolean} [options.timestamps=true] if `false` and [timestamps](./guide.html#timestamps) are enabled, skip timestamps for this `save()`. + * @param {Function} [fn] optional callback + * @method save + * @memberOf Document + * @instance + * @throws {DocumentNotFoundError} if this [save updates an existing document](api.html#document_Document-isNew) but the document doesn't exist in the database. For example, you will get this error if the document is [deleted between when you retrieved the document and when you saved it](documents.html#updating). + * @return {Promise|undefined} Returns undefined if used with callback or a Promise otherwise. + * @api public + * @see middleware http://mongoosejs.com/docs/middleware.html + */ + +/** + * Checks if a path is invalid + * + * @param {String|Array} path the field to check + * @method $isValid + * @memberOf Document + * @instance + * @api private + */ + +Document.prototype.$isValid = function(path) { + if (this.$__.validationError == null || Object.keys(this.$__.validationError.errors).length === 0) { + return true; + } + if (path == null) { + return false; + } + + if (path.indexOf(' ') !== -1) { + path = path.split(' '); + } + if (Array.isArray(path)) { + return path.some(p => this.$__.validationError.errors[p] == null); + } + + return this.$__.validationError.errors[path] == null; +}; + +/** + * Resets the internal modified state of this document. + * + * @api private + * @return {Document} + * @method $__reset + * @memberOf Document + * @instance + */ + +Document.prototype.$__reset = function reset() { + let _this = this; + DocumentArray || (DocumentArray = require('./types/DocumentArray')); + + this.$__.activePaths + .map('init', 'modify', function(i) { + return _this.$__getValue(i); + }) + .filter(function(val) { + return val && val instanceof Array && val.isMongooseDocumentArray && val.length; + }) + .forEach(function(array) { + let i = array.length; + while (i--) { + const doc = array[i]; + if (!doc) { + continue; + } + doc.$__reset(); + } + + _this.$__.activePaths.init(array.$path()); + + array[arrayAtomicsBackupSymbol] = array[arrayAtomicsSymbol]; + array[arrayAtomicsSymbol] = {}; + }); + + this.$__.activePaths. + map('init', 'modify', function(i) { + return _this.$__getValue(i); + }). + filter(function(val) { + return val && val.$isSingleNested; + }). + forEach(function(doc) { + doc.$__reset(); + if (doc.$parent() === _this) { + _this.$__.activePaths.init(doc.$basePath); + } else if (doc.$parent() != null && doc.$parent().ownerDocument) { + // If map path underneath subdocument, may end up with a case where + // map path is modified but parent still needs to be reset. 
See gh-10295 + doc.$parent().$__reset(); + } + }); + + // clear atomics + this.$__dirty().forEach(function(dirt) { + const type = dirt.value; + + if (type && type[arrayAtomicsSymbol]) { + type[arrayAtomicsBackupSymbol] = type[arrayAtomicsSymbol]; + type[arrayAtomicsSymbol] = {}; + } + }); + + this.$__.backup = {}; + this.$__.backup.activePaths = { + modify: Object.assign({}, this.$__.activePaths.states.modify), + default: Object.assign({}, this.$__.activePaths.states.default) + }; + this.$__.backup.validationError = this.$__.validationError; + this.$__.backup.errors = this.$errors; + + // Clear 'dirty' cache + this.$__.activePaths.clear('modify'); + this.$__.activePaths.clear('default'); + this.$__.validationError = undefined; + this.$errors = undefined; + _this = this; + this.$__schema.requiredPaths().forEach(function(path) { + _this.$__.activePaths.require(path); + }); + + return this; +}; + +/*! + * ignore + */ + +Document.prototype.$__undoReset = function $__undoReset() { + if (this.$__.backup == null || this.$__.backup.activePaths == null) { + return; + } + + this.$__.activePaths.states.modify = this.$__.backup.activePaths.modify; + this.$__.activePaths.states.default = this.$__.backup.activePaths.default; + + this.$__.validationError = this.$__.backup.validationError; + this.$errors = this.$__.backup.errors; + + for (const dirt of this.$__dirty()) { + const type = dirt.value; + + if (type && type[arrayAtomicsSymbol] && type[arrayAtomicsBackupSymbol]) { + type[arrayAtomicsSymbol] = type[arrayAtomicsBackupSymbol]; + } + } + + for (const subdoc of this.$getAllSubdocs()) { + subdoc.$__undoReset(); + } +}; + +/** + * Returns this documents dirty paths / vals. + * + * @api private + * @method $__dirty + * @memberOf Document + * @instance + */ + +Document.prototype.$__dirty = function() { + const _this = this; + let all = this.$__.activePaths.map('modify', function(path) { + return { + path: path, + value: _this.$__getValue(path), + schema: _this.$__path(path) + }; + }); + // gh-2558: if we had to set a default and the value is not undefined, + // we have to save as well + all = all.concat(this.$__.activePaths.map('default', function(path) { + if (path === '_id' || _this.$__getValue(path) == null) { + return; + } + return { + path: path, + value: _this.$__getValue(path), + schema: _this.$__path(path) + }; + })); + + const allPaths = new Map(all.filter((el) => el != null).map((el) => [el.path, el.value])); + // Ignore "foo.a" if "foo" is dirty already. + const minimal = []; + + all.forEach(function(item) { + if (!item) { + return; + } + + let top = null; + + const array = parentPaths(item.path); + for (let i = 0; i < array.length - 1; i++) { + if (allPaths.has(array[i])) { + top = allPaths.get(array[i]); + break; + } + } + if (top == null) { + minimal.push(item); + } else if (top != null && + top[arrayAtomicsSymbol] != null && + top.hasAtomics()) { + // special case for top level MongooseArrays + // the `top` array itself and a sub path of `top` are being set. + // the only way to honor all of both modifications is through a $set + // of entire array. + top[arrayAtomicsSymbol] = {}; + top[arrayAtomicsSymbol].$set = top; + } + }); + return minimal; +}; + +/** + * Assigns/compiles `schema` into this documents prototype. 
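+ *
+ * Besides compiling accessors, this also applies default getters to the
+ * schema's virtuals (gh-6262) and stores the schema on `this.$__schema`.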
+ * + * @param {Schema} schema + * @api private + * @method $__setSchema + * @memberOf Document + * @instance + */ + +Document.prototype.$__setSchema = function(schema) { + compile(schema.tree, this, undefined, schema.options); + + // Apply default getters if virtual doesn't have any (gh-6262) + for (const key of Object.keys(schema.virtuals)) { + schema.virtuals[key]._applyDefaultGetters(); + } + if (schema.path('schema') == null) { + this.schema = schema; + } + this.$__schema = schema; + this[documentSchemaSymbol] = schema; +}; + + +/** + * Get active path that were changed and are arrays + * + * @api private + * @method $__getArrayPathsToValidate + * @memberOf Document + * @instance + */ + +Document.prototype.$__getArrayPathsToValidate = function() { + DocumentArray || (DocumentArray = require('./types/DocumentArray')); + + // validate all document arrays. + return this.$__.activePaths + .map('init', 'modify', function(i) { + return this.$__getValue(i); + }.bind(this)) + .filter(function(val) { + return val && val instanceof Array && val.isMongooseDocumentArray && val.length; + }).reduce(function(seed, array) { + return seed.concat(array); + }, []) + .filter(function(doc) { + return doc; + }); +}; + + +/** + * Get all subdocs (by bfs) + * + * @api public + * @method $getAllSubdocs + * @memberOf Document + * @instance + */ + +Document.prototype.$getAllSubdocs = function() { + DocumentArray || (DocumentArray = require('./types/DocumentArray')); + Embedded = Embedded || require('./types/ArraySubdocument'); + + function docReducer(doc, seed, path) { + let val = doc; + let isNested = false; + if (path) { + if (doc instanceof Document && doc[documentSchemaSymbol].paths[path]) { + val = doc._doc[path]; + } else if (doc instanceof Document && doc[documentSchemaSymbol].nested[path]) { + val = doc._doc[path]; + isNested = true; + } else { + val = doc[path]; + } + } + if (val instanceof Embedded) { + seed.push(val); + } else if (val instanceof Map) { + seed = Array.from(val.keys()).reduce(function(seed, path) { + return docReducer(val.get(path), seed, null); + }, seed); + } else if (val && val.$isSingleNested) { + seed = Object.keys(val._doc).reduce(function(seed, path) { + return docReducer(val._doc, seed, path); + }, seed); + seed.push(val); + } else if (val && val.isMongooseDocumentArray) { + val.forEach(function _docReduce(doc) { + if (!doc || !doc._doc) { + return; + } + seed = Object.keys(doc._doc).reduce(function(seed, path) { + return docReducer(doc._doc, seed, path); + }, seed); + if (doc instanceof Embedded) { + seed.push(doc); + } + }); + } else if (isNested && val != null) { + for (const path of Object.keys(val)) { + docReducer(val, seed, path); + } + } + return seed; + } + + const subDocs = []; + for (const path of Object.keys(this._doc)) { + docReducer(this, subDocs, path); + } + + return subDocs; +}; + +/*! + * Runs queued functions + */ + +function applyQueue(doc) { + const q = doc.$__schema && doc.$__schema.callQueue; + if (!q.length) { + return; + } + + for (const pair of q) { + if (pair[0] !== 'pre' && pair[0] !== 'post' && pair[0] !== 'on') { + doc[pair[0]].apply(doc, pair[1]); + } + } +} + +/*! 
+ * ignore + */ + +Document.prototype.$__handleReject = function handleReject(err) { + // emit on the Model if listening + if (this.$listeners('error').length) { + this.$emit('error', err); + } else if (this.constructor.listeners && this.constructor.listeners('error').length) { + this.constructor.emit('error', err); + } +}; + +/** + * Internal helper for toObject() and toJSON() that doesn't manipulate options + * + * @api private + * @method $toObject + * @memberOf Document + * @instance + */ + +Document.prototype.$toObject = function(options, json) { + let defaultOptions = { + transform: true, + flattenDecimals: true + }; + + const path = json ? 'toJSON' : 'toObject'; + const baseOptions = get(this, 'constructor.base.options.' + path, {}); + const schemaOptions = get(this, '$__schema.options', {}); + // merge base default options with Schema's set default options if available. + // `clone` is necessary here because `utils.options` directly modifies the second input. + defaultOptions = utils.options(defaultOptions, clone(baseOptions)); + defaultOptions = utils.options(defaultOptions, clone(schemaOptions[path] || {})); + + // If options do not exist or is not an object, set it to empty object + options = utils.isPOJO(options) ? clone(options) : {}; + options._calledWithOptions = options._calledWithOptions || clone(options); + + let _minimize; + if (options._calledWithOptions.minimize != null) { + _minimize = options.minimize; + } else if (defaultOptions.minimize != null) { + _minimize = defaultOptions.minimize; + } else { + _minimize = schemaOptions.minimize; + } + + let flattenMaps; + if (options._calledWithOptions.flattenMaps != null) { + flattenMaps = options.flattenMaps; + } else if (defaultOptions.flattenMaps != null) { + flattenMaps = defaultOptions.flattenMaps; + } else { + flattenMaps = schemaOptions.flattenMaps; + } + + // The original options that will be passed to `clone()`. Important because + // `clone()` will recursively call `$toObject()` on embedded docs, so we + // need the original options the user passed in, plus `_isNested` and + // `_parentOptions` for checking whether we need to depopulate. + const cloneOptions = Object.assign(utils.clone(options), { + _isNested: true, + json: json, + minimize: _minimize, + flattenMaps: flattenMaps + }); + + if (utils.hasUserDefinedProperty(options, 'getters')) { + cloneOptions.getters = options.getters; + } + if (utils.hasUserDefinedProperty(options, 'virtuals')) { + cloneOptions.virtuals = options.virtuals; + } + + const depopulate = options.depopulate || + get(options, '_parentOptions.depopulate', false); + // _isNested will only be true if this is not the top level document, we + // should never depopulate + if (depopulate && options._isNested && this.$__.wasPopulated) { + // populated paths that we set to a document + return clone(this._id, cloneOptions); + } + + // merge default options with input options. 
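+ // (`utils.options` only fills in keys that are missing from `options`,
+ // so any option the caller passed explicitly takes precedence)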
+ options = utils.options(defaultOptions, options); + options._isNested = true; + options.json = json; + options.minimize = _minimize; + + cloneOptions._parentOptions = options; + cloneOptions._skipSingleNestedGetters = true; + + const gettersOptions = Object.assign({}, cloneOptions); + gettersOptions._skipSingleNestedGetters = false; + + // remember the root transform function + // to save it from being overwritten by sub-transform functions + const originalTransform = options.transform; + + let ret = clone(this._doc, cloneOptions) || {}; + + if (options.getters) { + applyGetters(this, ret, gettersOptions); + + if (options.minimize) { + ret = minimize(ret) || {}; + } + } + + if (options.virtuals || (options.getters && options.virtuals !== false)) { + applyVirtuals(this, ret, gettersOptions, options); + } + + if (options.versionKey === false && this.$__schema.options.versionKey) { + delete ret[this.$__schema.options.versionKey]; + } + + let transform = options.transform; + + // In the case where a subdocument has its own transform function, we need to + // check and see if the parent has a transform (options.transform) and if the + // child schema has a transform (this.schema.options.toObject) In this case, + // we need to adjust options.transform to be the child schema's transform and + // not the parent schema's + if (transform) { + applySchemaTypeTransforms(this, ret); + } + + if (options.useProjection) { + omitDeselectedFields(this, ret); + } + + if (transform === true || (schemaOptions.toObject && transform)) { + const opts = options.json ? schemaOptions.toJSON : schemaOptions.toObject; + + if (opts) { + transform = (typeof options.transform === 'function' ? options.transform : opts.transform); + } + } else { + options.transform = originalTransform; + } + + if (typeof transform === 'function') { + const xformed = transform(this, ret, options); + if (typeof xformed !== 'undefined') { + ret = xformed; + } + } + + return ret; +}; + +/** + * Converts this document into a plain-old JavaScript object ([POJO](https://masteringjs.io/tutorials/fundamentals/pojo)). + * + * Buffers are converted to instances of [mongodb.Binary](http://mongodb.github.com/node-mongodb-native/api-bson-generated/binary.html) for proper storage. + * + * ####Options: + * + * - `getters` apply all getters (path and virtual getters), defaults to false + * - `aliases` apply all aliases if `virtuals=true`, defaults to true + * - `virtuals` apply virtual getters (can override `getters` option), defaults to false + * - `minimize` remove empty objects, defaults to true + * - `transform` a transform function to apply to the resulting document before returning + * - `depopulate` depopulate any populated paths, replacing them with their original refs, defaults to false + * - `versionKey` whether to include the version key, defaults to true + * - `flattenMaps` convert Maps to POJOs. Useful if you want to JSON.stringify() the result of toObject(), defaults to false + * - `useProjection` set to `true` to omit fields that are excluded in this document's projection. Unless you specified a projection, this will omit any field that has `select: false` in the schema. 
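+ *
+ * For instance, a minimal sketch combining a few of the options above:
+ *
+ * // drop the version key and convert Map paths to plain objects
+ * const obj = doc.toObject({ versionKey: false, flattenMaps: true });
+ * JSON.stringify(obj); // no `__v` key in the output (assuming the default version key)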
+ * + * ####Getters/Virtuals + * + * Example of only applying path getters + * + * doc.toObject({ getters: true, virtuals: false }) + * + * Example of only applying virtual getters + * + * doc.toObject({ virtuals: true }) + * + * Example of applying both path and virtual getters + * + * doc.toObject({ getters: true }) + * + * To apply these options to every document of your schema by default, set your [schemas](#schema_Schema) `toObject` option to the same argument. + * + * schema.set('toObject', { virtuals: true }) + * + * ####Transform + * + * We may need to perform a transformation of the resulting object based on some criteria, say to remove some sensitive information or return a custom object. In this case we set the optional `transform` function. + * + * Transform functions receive three arguments + * + * function (doc, ret, options) {} + * + * - `doc` The mongoose document which is being converted + * - `ret` The plain object representation which has been converted + * - `options` The options in use (either schema options or the options passed inline) + * + * ####Example + * + * // specify the transform schema option + * if (!schema.options.toObject) schema.options.toObject = {}; + * schema.options.toObject.transform = function (doc, ret, options) { + * // remove the _id of every document before returning the result + * delete ret._id; + * return ret; + * } + * + * // without the transformation in the schema + * doc.toObject(); // { _id: 'anId', name: 'Wreck-it Ralph' } + * + * // with the transformation + * doc.toObject(); // { name: 'Wreck-it Ralph' } + * + * With transformations we can do a lot more than remove properties. We can even return completely new customized objects: + * + * if (!schema.options.toObject) schema.options.toObject = {}; + * schema.options.toObject.transform = function (doc, ret, options) { + * return { movie: ret.name } + * } + * + * // without the transformation in the schema + * doc.toObject(); // { _id: 'anId', name: 'Wreck-it Ralph' } + * + * // with the transformation + * doc.toObject(); // { movie: 'Wreck-it Ralph' } + * + * _Note: if a transform function returns `undefined`, the return value will be ignored._ + * + * Transformations may also be applied inline, overridding any transform set in the options: + * + * function xform (doc, ret, options) { + * return { inline: ret.name, custom: true } + * } + * + * // pass the transform as an inline option + * doc.toObject({ transform: xform }); // { inline: 'Wreck-it Ralph', custom: true } + * + * If you want to skip transformations, use `transform: false`: + * + * schema.options.toObject.hide = '_id'; + * schema.options.toObject.transform = function (doc, ret, options) { + * if (options.hide) { + * options.hide.split(' ').forEach(function (prop) { + * delete ret[prop]; + * }); + * } + * return ret; + * } + * + * const doc = new Doc({ _id: 'anId', secret: 47, name: 'Wreck-it Ralph' }); + * doc.toObject(); // { secret: 47, name: 'Wreck-it Ralph' } + * doc.toObject({ hide: 'secret _id', transform: false });// { _id: 'anId', secret: 47, name: 'Wreck-it Ralph' } + * doc.toObject({ hide: 'secret _id', transform: true }); // { name: 'Wreck-it Ralph' } + * + * If you pass a transform in `toObject()` options, Mongoose will apply the transform + * to [subdocuments](/docs/subdocs.html) in addition to the top-level document. + * Similarly, `transform: false` skips transforms for all subdocuments. 
+ * Note that this is behavior is different for transforms defined in the schema: + * if you define a transform in `schema.options.toObject.transform`, that transform + * will **not** apply to subdocuments. + * + * const memberSchema = new Schema({ name: String, email: String }); + * const groupSchema = new Schema({ members: [memberSchema], name: String, email }); + * const Group = mongoose.model('Group', groupSchema); + * + * const doc = new Group({ + * name: 'Engineering', + * email: 'dev@mongoosejs.io', + * members: [{ name: 'Val', email: 'val@mongoosejs.io' }] + * }); + * + * // Removes `email` from both top-level document **and** array elements + * // { name: 'Engineering', members: [{ name: 'Val' }] } + * doc.toObject({ transform: (doc, ret) => { delete ret.email; return ret; } }); + * + * Transforms, like all of these options, are also available for `toJSON`. See [this guide to `JSON.stringify()`](https://thecodebarbarian.com/the-80-20-guide-to-json-stringify-in-javascript.html) to learn why `toJSON()` and `toObject()` are separate functions. + * + * See [schema options](/docs/guide.html#toObject) for some more details. + * + * _During save, no custom options are applied to the document before being sent to the database._ + * + * @param {Object} [options] + * @param {Boolean} [options.getters=false] if true, apply all getters, including virtuals + * @param {Boolean} [options.virtuals=false] if true, apply virtuals, including aliases. Use `{ getters: true, virtuals: false }` to just apply getters, not virtuals + * @param {Boolean} [options.aliases=true] if `options.virtuals = true`, you can set `options.aliases = false` to skip applying aliases. This option is a no-op if `options.virtuals = false`. + * @param {Boolean} [options.minimize=true] if true, omit any empty objects from the output + * @param {Function|null} [options.transform=null] if set, mongoose will call this function to allow you to transform the returned object + * @param {Boolean} [options.depopulate=false] if true, replace any conventionally populated paths with the original id in the output. Has no affect on virtual populated paths. + * @param {Boolean} [options.versionKey=true] if false, exclude the version key (`__v` by default) from the output + * @param {Boolean} [options.flattenMaps=false] if true, convert Maps to POJOs. Useful if you want to `JSON.stringify()` the result of `toObject()`. + * @param {Boolean} [options.useProjection=false] - If true, omits fields that are excluded in this document's projection. Unless you specified a projection, this will omit any field that has `select: false` in the schema. + * @return {Object} js object + * @see mongodb.Binary http://mongodb.github.com/node-mongodb-native/api-bson-generated/binary.html + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.toObject = function(options) { + return this.$toObject(options); +}; + +/*! + * Minimizes an object, removing undefined values and empty objects + * + * @param {Object} object to minimize + * @return {Object} + */ + +function minimize(obj) { + const keys = Object.keys(obj); + let i = keys.length; + let hasKeys; + let key; + let val; + + while (i--) { + key = keys[i]; + val = obj[key]; + + if (utils.isPOJO(val)) { + obj[key] = minimize(val); + } + + if (undefined === obj[key]) { + delete obj[key]; + continue; + } + + hasKeys = true; + } + + return hasKeys + ? obj + : undefined; +} + +/*! + * Applies virtuals properties to `json`. 
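+ *
+ * Note: `options.virtuals` may be `true`, an array of virtual names to apply,
+ * or an object with a `pathsToSkip` array, as handled at the top of the function.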
+ */ + +function applyVirtuals(self, json, options, toObjectOptions) { + const schema = self.$__schema; + const paths = Object.keys(schema.virtuals); + let i = paths.length; + const numPaths = i; + let path; + let assignPath; + let cur = self._doc; + let v; + const aliases = get(toObjectOptions, 'aliases', true); + + let virtualsToApply = null; + if (Array.isArray(options.virtuals)) { + virtualsToApply = new Set(options.virtuals); + } + else if (options.virtuals && options.virtuals.pathsToSkip) { + virtualsToApply = new Set(paths); + for (let i = 0; i < options.virtuals.pathsToSkip.length; i++) { + if (virtualsToApply.has(options.virtuals.pathsToSkip[i])) { + virtualsToApply.delete(options.virtuals.pathsToSkip[i]); + } + } + } + + if (!cur) { + return json; + } + + options = options || {}; + for (i = 0; i < numPaths; ++i) { + path = paths[i]; + + if (virtualsToApply != null && !virtualsToApply.has(path)) { + continue; + } + + // Allow skipping aliases with `toObject({ virtuals: true, aliases: false })` + if (!aliases && schema.aliases.hasOwnProperty(path)) { + continue; + } + + // We may be applying virtuals to a nested object, for example if calling + // `doc.nestedProp.toJSON()`. If so, the path we assign to, `assignPath`, + // will be a trailing substring of the `path`. + assignPath = path; + if (options.path != null) { + if (!path.startsWith(options.path + '.')) { + continue; + } + assignPath = path.substr(options.path.length + 1); + } + const parts = assignPath.split('.'); + v = clone(self.get(path), options); + if (v === void 0) { + continue; + } + const plen = parts.length; + cur = json; + for (let j = 0; j < plen - 1; ++j) { + cur[parts[j]] = cur[parts[j]] || {}; + cur = cur[parts[j]]; + } + cur[parts[plen - 1]] = v; + } + + return json; +} + + +/*! + * Applies virtuals properties to `json`. + * + * @param {Document} self + * @param {Object} json + * @return {Object} `json` + */ + +function applyGetters(self, json, options) { + const schema = self.$__schema; + const paths = Object.keys(schema.paths); + let i = paths.length; + let path; + let cur = self._doc; + let v; + + if (!cur) { + return json; + } + + while (i--) { + path = paths[i]; + + const parts = path.split('.'); + const plen = parts.length; + const last = plen - 1; + let branch = json; + let part; + cur = self._doc; + + if (!self.$__isSelected(path)) { + continue; + } + + for (let ii = 0; ii < plen; ++ii) { + part = parts[ii]; + v = cur[part]; + if (ii === last) { + const val = self.$get(path); + branch[part] = clone(val, options); + } else if (v == null) { + if (part in cur) { + branch[part] = v; + } + break; + } else { + branch = branch[part] || (branch[part] = {}); + } + cur = v; + } + } + + return json; +} + +/*! + * Applies schema type transforms to `json`. 
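+ *
+ * These are `transform` functions declared as individual path options, e.g.
+ * (assumed usage):
+ *
+ * new Schema({ ssn: { type: String, transform: v => v == null ? v : '***-**-' + v.slice(-4) } });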
+ * + * @param {Document} self + * @param {Object} json + * @return {Object} `json` + */ + +function applySchemaTypeTransforms(self, json) { + const schema = self.$__schema; + const paths = Object.keys(schema.paths || {}); + const cur = self._doc; + + if (!cur) { + return json; + } + + for (const path of paths) { + const schematype = schema.paths[path]; + if (typeof schematype.options.transform === 'function') { + const val = self.$get(path); + const transformedValue = schematype.options.transform.call(self, val); + throwErrorIfPromise(path, transformedValue); + utils.setValue(path, transformedValue, json); + } else if (schematype.$embeddedSchemaType != null && + typeof schematype.$embeddedSchemaType.options.transform === 'function') { + const vals = [].concat(self.$get(path)); + const transform = schematype.$embeddedSchemaType.options.transform; + for (let i = 0; i < vals.length; ++i) { + const transformedValue = transform.call(self, vals[i]); + vals[i] = transformedValue; + throwErrorIfPromise(path, transformedValue); + } + + json[path] = vals; + } + } + + return json; +} + +function throwErrorIfPromise(path, transformedValue) { + if (isPromise(transformedValue)) { + throw new Error('`transform` function must be synchronous, but the transform on path `' + path + '` returned a promise.'); + } +} + +/*! + * ignore + */ + +function omitDeselectedFields(self, json) { + const schema = self.$__schema; + const paths = Object.keys(schema.paths || {}); + const cur = self._doc; + + if (!cur) { + return json; + } + + let selected = self.$__.selected; + if (selected === void 0) { + selected = {}; + queryhelpers.applyPaths(selected, schema); + } + if (selected == null || Object.keys(selected).length === 0) { + return json; + } + + for (const path of paths) { + if (selected[path] != null && !selected[path]) { + delete json[path]; + } + } + + return json; +} + +/** + * The return value of this method is used in calls to JSON.stringify(doc). + * + * This method accepts the same options as [Document#toObject](#document_Document-toObject). To apply the options to every document of your schema by default, set your [schemas](#schema_Schema) `toJSON` option to the same argument. + * + * schema.set('toJSON', { virtuals: true }) + * + * See [schema options](/docs/guide.html#toJSON) for details. + * + * @param {Object} options + * @return {Object} + * @see Document#toObject #document_Document-toObject + * @see JSON.stringify() in JavaScript https://thecodebarbarian.com/the-80-20-guide-to-json-stringify-in-javascript.html + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.toJSON = function(options) { + return this.$toObject(options, true); +}; + +/** + * If this document is a subdocument or populated document, returns the document's + * parent. Returns `undefined` otherwise. + * + * @api public + * @method parent + * @memberOf Document + * @instance + */ + +Document.prototype.parent = function() { + return this.$__.parent; +}; + +/** + * Alias for `parent()`. If this document is a subdocument or populated + * document, returns the document's parent. Returns `undefined` otherwise. 
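+ *
+ * ####Example (a minimal sketch, assuming a hypothetical `Parent` model with a single nested `child` path):
+ *
+ * const doc = new Parent({ child: { name: 'Luke' } });
+ * doc.child.$parent() === doc; // true
+ * doc.$parent(); // undefined for a top-level document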
+ * + * @api public + * @method $parent + * @memberOf Document + * @instance + */ + +Document.prototype.$parent = Document.prototype.parent; + +/** + * Helper for console.log + * + * @api public + * @method inspect + * @memberOf Document + * @instance + */ + +Document.prototype.inspect = function(options) { + const isPOJO = utils.isPOJO(options); + let opts; + if (isPOJO) { + opts = options; + opts.minimize = false; + } + const ret = this.toObject(opts); + + if (ret == null) { + // If `toObject()` returns null, `this` is still an object, so if `inspect()` + // prints out null this can cause some serious confusion. See gh-7942. + return 'MongooseDocument { ' + ret + ' }'; + } + + return ret; +}; + +if (inspect.custom) { + /*! + * Avoid Node deprecation warning DEP0079 + */ + + Document.prototype[inspect.custom] = Document.prototype.inspect; +} + +/** + * Helper for console.log + * + * @api public + * @method toString + * @memberOf Document + * @instance + */ + +Document.prototype.toString = function() { + const ret = this.inspect(); + if (typeof ret === 'string') { + return ret; + } + return inspect(ret); +}; + +/** + * Returns true if this document is equal to another document. + * + * Documents are considered equal when they have matching `_id`s, unless neither + * document has an `_id`, in which case this function falls back to using + * `deepEqual()`. + * + * @param {Document} doc a document to compare + * @return {Boolean} + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.equals = function(doc) { + if (!doc) { + return false; + } + + const tid = this.$__getValue('_id'); + const docid = doc.$__ != null ? doc.$__getValue('_id') : doc; + if (!tid && !docid) { + return deepEqual(this, doc); + } + return tid && tid.equals + ? tid.equals(docid) + : tid === docid; +}; + +/** + * Populates paths on an existing document. + * + * ####Example: + * + * await doc.populate([ + * 'stories', + * { path: 'fans', sort: { name: -1 } } + * ]); + * doc.populated('stories'); // Array of ObjectIds + * doc.stories[0].title; // 'Casino Royale' + * doc.populated('fans'); // Array of ObjectIds + * + * await doc.populate('fans', '-email'); + * doc.fans[0].email // not populated + * + * await doc.populate('author fans', '-email'); + * doc.author.email // not populated + * doc.fans[0].email // not populated + * + * @param {String|Object|Array} path either the path to populate or an object specifying all parameters, or either an array of those + * @param {Object|String} [select] Field selection for the population query + * @param {Model} [model] The model you wish to use for population. If not specified, populate will look up the model by the name in the Schema's `ref` field. + * @param {Object} [match] Conditions for the population query + * @param {Object} [options] Options for the population query (sort, etc) + * @param {String} [options.path=null] The path to populate. + * @param {boolean} [options.retainNullValues=false] by default, Mongoose removes null and undefined values from populated arrays. Use this option to make `populate()` retain `null` and `undefined` array entries. + * @param {boolean} [options.getters=false] if true, Mongoose will call any getters defined on the `localField`. By default, Mongoose gets the raw value of `localField`. For example, you would need to set this option to `true` if you wanted to [add a `lowercase` getter to your `localField`](/docs/schematypes.html#schematype-options). 
+ * @param {boolean} [options.clone=false] When you do `BlogPost.find().populate('author')`, blog posts with the same author will share 1 copy of an `author` doc. Enable this option to make Mongoose clone populated docs before assigning them. + * @param {Object|Function} [options.match=null] Add an additional filter to the populate query. Can be a filter object containing [MongoDB query syntax](https://docs.mongodb.com/manual/tutorial/query-documents/), or a function that returns a filter object. + * @param {Function} [options.transform=null] Function that Mongoose will call on every populated document that allows you to transform the populated document. + * @param {Object} [options.options=null] Additional options like `limit` and `lean`. + * @param {Function} [callback] Callback + * @see population ./populate.html + * @see Query#select #query_Query-select + * @see Model.populate #model_Model.populate + * @memberOf Document + * @instance + * @return {Promise|null} + * @api public + */ + +Document.prototype.populate = function populate() { + const pop = {}; + const args = utils.args(arguments); + let fn; + + if (args.length > 0) { + if (typeof args[args.length - 1] === 'function') { + fn = args.pop(); + } + + // use hash to remove duplicate paths + const res = utils.populate.apply(null, args); + for (const populateOptions of res) { + pop[populateOptions.path] = populateOptions; + } + } + + const paths = utils.object.vals(pop); + let topLevelModel = this.constructor; + if (this.$__isNested) { + topLevelModel = this.$__[scopeSymbol].constructor; + const nestedPath = this.$__.nestedPath; + paths.forEach(function(populateOptions) { + populateOptions.path = nestedPath + '.' + populateOptions.path; + }); + } + + // Use `$session()` by default if the document has an associated session + // See gh-6754 + if (this.$session() != null) { + const session = this.$session(); + paths.forEach(path => { + if (path.options == null) { + path.options = { session: session }; + return; + } + if (!('session' in path.options)) { + path.options.session = session; + } + }); + } + + paths.forEach(p => { + p._localModel = topLevelModel; + }); + + return topLevelModel.populate(this, paths, fn); +}; + +/** + * Gets all populated documents associated with this document. + * + * @api public + * @return {Array} array of populated documents. Empty array if there are no populated documents associated with this document. + * @memberOf Document + * @instance + */ +Document.prototype.$getPopulatedDocs = function $getPopulatedDocs() { + let keys = []; + if (this.$__.populated != null) { + keys = keys.concat(Object.keys(this.$__.populated)); + } + let result = []; + for (const key of keys) { + const value = this.$get(key); + if (Array.isArray(value)) { + result = result.concat(value); + } else if (value instanceof Document) { + result.push(value); + } + } + return result; +}; + +/** + * Gets _id(s) used during population of the given `path`. + * + * ####Example: + * + * Model.findOne().populate('author').exec(function (err, doc) { + * console.log(doc.author.name) // Dr.Seuss + * console.log(doc.populated('author')) // '5144cf8050f071d979c118a7' + * }) + * + * If the path was not populated, returns `undefined`. 
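+ *
+ * // For a path that was never populated (hypothetical `tags` path):
+ * doc.populated('tags'); // undefined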
+ * + * @param {String} path + * @return {Array|ObjectId|Number|Buffer|String|undefined} + * @memberOf Document + * @instance + * @api public + */ + +Document.prototype.populated = function(path, val, options) { + // val and options are internal + if (val == null || val === true) { + if (!this.$__.populated) { + return undefined; + } + if (typeof path !== 'string') { + return undefined; + } + + // Map paths can be populated with either `path.$*` or just `path` + const _path = path.endsWith('.$*') ? path.replace(/\.\$\*$/, '') : path; + + const v = this.$__.populated[_path]; + if (v) { + return val === true ? v : v.value; + } + return undefined; + } + + this.$__.populated || (this.$__.populated = {}); + this.$__.populated[path] = { value: val, options: options }; + + // If this was a nested populate, make sure each populated doc knows + // about its populated children (gh-7685) + const pieces = path.split('.'); + for (let i = 0; i < pieces.length - 1; ++i) { + const subpath = pieces.slice(0, i + 1).join('.'); + const subdoc = this.$get(subpath); + if (subdoc != null && subdoc.$__ != null && this.$populated(subpath)) { + const rest = pieces.slice(i + 1).join('.'); + subdoc.$populated(rest, val, options); + // No need to continue because the above recursion should take care of + // marking the rest of the docs as populated + break; + } + } + + return val; +}; + +Document.prototype.$populated = Document.prototype.populated; + +/** + * Takes a populated field and returns it to its unpopulated state. + * + * ####Example: + * + * Model.findOne().populate('author').exec(function (err, doc) { + * console.log(doc.author.name); // Dr.Seuss + * console.log(doc.depopulate('author')); + * console.log(doc.author); // '5144cf8050f071d979c118a7' + * }) + * + * If the path was not provided, then all populated fields are returned to their unpopulated state. + * + * @param {String} path + * @return {Document} this + * @see Document.populate #document_Document-populate + * @api public + * @memberOf Document + * @instance + */ + +Document.prototype.depopulate = function(path) { + if (typeof path === 'string') { + path = path.indexOf(' ') === -1 ? [path] : path.split(' '); + } + + let populatedIds; + const virtualKeys = this.$$populatedVirtuals ? Object.keys(this.$$populatedVirtuals) : []; + const populated = get(this, '$__.populated', {}); + + if (arguments.length === 0) { + // Depopulate all + for (const virtualKey of virtualKeys) { + delete this.$$populatedVirtuals[virtualKey]; + delete this._doc[virtualKey]; + delete populated[virtualKey]; + } + + const keys = Object.keys(populated); + + for (const key of keys) { + populatedIds = this.$populated(key); + if (!populatedIds) { + continue; + } + delete populated[key]; + utils.setValue(key, populatedIds, this._doc); + } + return this; + } + + for (const singlePath of path) { + populatedIds = this.$populated(singlePath); + delete populated[singlePath]; + + if (virtualKeys.indexOf(singlePath) !== -1) { + delete this.$$populatedVirtuals[singlePath]; + delete this._doc[singlePath]; + } else if (populatedIds) { + utils.setValue(singlePath, populatedIds, this._doc); + } + } + return this; +}; + + +/** + * Returns the full path to this document. 
+ * + * @param {String} [path] + * @return {String} + * @api private + * @method $__fullPath + * @memberOf Document + * @instance + */ + +Document.prototype.$__fullPath = function(path) { + // overridden in SubDocuments + return path || ''; +}; + +/** + * Returns the changes that happened to the document + * in the format that will be sent to MongoDB. + * + * #### Example: + * + * const userSchema = new Schema({ + * name: String, + * age: Number, + * country: String + * }); + * const User = mongoose.model('User', userSchema); + * const user = await User.create({ + * name: 'Hafez', + * age: 25, + * country: 'Egypt' + * }); + * + * // returns an empty object, no changes happened yet + * user.getChanges(); // { } + * + * user.country = undefined; + * user.age = 26; + * + * user.getChanges(); // { $set: { age: 26 }, { $unset: { country: 1 } } } + * + * await user.save(); + * + * user.getChanges(); // { } + * + * Modifying the object that `getChanges()` returns does not affect the document's + * change tracking state. Even if you `delete user.getChanges().$set`, Mongoose + * will still send a `$set` to the server. + * + * @return {Object} + * @api public + * @method getChanges + * @memberOf Document + * @instance + */ + +Document.prototype.getChanges = function() { + const delta = this.$__delta(); + const changes = delta ? delta[1] : {}; + return changes; +}; + +/*! + * Module exports. + */ + +Document.ValidationError = ValidationError; +module.exports = exports = Document; diff --git a/node_modules/mongoose/lib/document_provider.js b/node_modules/mongoose/lib/document_provider.js new file mode 100644 index 000000000..1ace61f4f --- /dev/null +++ b/node_modules/mongoose/lib/document_provider.js @@ -0,0 +1,30 @@ +'use strict'; + +/* eslint-env browser */ + +/*! + * Module dependencies. + */ +const Document = require('./document.js'); +const BrowserDocument = require('./browserDocument.js'); + +let isBrowser = false; + +/** + * Returns the Document constructor for the current context + * + * @api private + */ +module.exports = function() { + if (isBrowser) { + return BrowserDocument; + } + return Document; +}; + +/*! + * ignore + */ +module.exports.setBrowser = function(flag) { + isBrowser = flag; +}; diff --git a/node_modules/mongoose/lib/driver.js b/node_modules/mongoose/lib/driver.js new file mode 100644 index 000000000..cf7ca3d7b --- /dev/null +++ b/node_modules/mongoose/lib/driver.js @@ -0,0 +1,15 @@ +'use strict'; + +/*! + * ignore + */ + +let driver = null; + +module.exports.get = function() { + return driver; +}; + +module.exports.set = function(v) { + driver = v; +}; diff --git a/node_modules/mongoose/lib/drivers/SPEC.md b/node_modules/mongoose/lib/drivers/SPEC.md new file mode 100644 index 000000000..64646931e --- /dev/null +++ b/node_modules/mongoose/lib/drivers/SPEC.md @@ -0,0 +1,4 @@ + +# Driver Spec + +TODO diff --git a/node_modules/mongoose/lib/drivers/browser/ReadPreference.js b/node_modules/mongoose/lib/drivers/browser/ReadPreference.js new file mode 100644 index 000000000..136357081 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/browser/ReadPreference.js @@ -0,0 +1,7 @@ +/*! + * ignore + */ + +'use strict'; + +module.exports = function() {}; diff --git a/node_modules/mongoose/lib/drivers/browser/binary.js b/node_modules/mongoose/lib/drivers/browser/binary.js new file mode 100644 index 000000000..4658f7b9e --- /dev/null +++ b/node_modules/mongoose/lib/drivers/browser/binary.js @@ -0,0 +1,14 @@ + +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const Binary = require('bson').Binary; + +/*! + * Module exports. + */ + +module.exports = exports = Binary; diff --git a/node_modules/mongoose/lib/drivers/browser/decimal128.js b/node_modules/mongoose/lib/drivers/browser/decimal128.js new file mode 100644 index 000000000..5668182b3 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/browser/decimal128.js @@ -0,0 +1,7 @@ +/*! + * ignore + */ + +'use strict'; + +module.exports = require('bson').Decimal128; diff --git a/node_modules/mongoose/lib/drivers/browser/index.js b/node_modules/mongoose/lib/drivers/browser/index.js new file mode 100644 index 000000000..bc4ac5f6b --- /dev/null +++ b/node_modules/mongoose/lib/drivers/browser/index.js @@ -0,0 +1,16 @@ +/*! + * Module exports. + */ + +'use strict'; + +exports.Binary = require('./binary'); +exports.Collection = function() { + throw new Error('Cannot create a collection from browser library'); +}; +exports.getConnection = () => function() { + throw new Error('Cannot create a connection from browser library'); +}; +exports.Decimal128 = require('./decimal128'); +exports.ObjectId = require('./objectid'); +exports.ReadPreference = require('./ReadPreference'); diff --git a/node_modules/mongoose/lib/drivers/browser/objectid.js b/node_modules/mongoose/lib/drivers/browser/objectid.js new file mode 100644 index 000000000..b1e603d71 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/browser/objectid.js @@ -0,0 +1,28 @@ + +/*! + * [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) ObjectId + * @constructor NodeMongoDbObjectId + * @see ObjectId + */ + +'use strict'; + +const ObjectId = require('bson').ObjectID; + +/*! + * Getter for convenience with populate, see gh-6115 + */ + +Object.defineProperty(ObjectId.prototype, '_id', { + enumerable: false, + configurable: true, + get: function() { + return this; + } +}); + +/*! + * ignore + */ + +module.exports = exports = ObjectId; diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/ReadPreference.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/ReadPreference.js new file mode 100644 index 000000000..024ee181a --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/ReadPreference.js @@ -0,0 +1,47 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const mongodb = require('mongodb'); +const ReadPref = mongodb.ReadPreference; + +/*! + * Converts arguments to ReadPrefs the driver + * can understand. + * + * @param {String|Array} pref + * @param {Array} [tags] + */ + +module.exports = function readPref(pref, tags) { + if (Array.isArray(pref)) { + tags = pref[1]; + pref = pref[0]; + } + + if (pref instanceof ReadPref) { + return pref; + } + + switch (pref) { + case 'p': + pref = 'primary'; + break; + case 'pp': + pref = 'primaryPreferred'; + break; + case 's': + pref = 'secondary'; + break; + case 'sp': + pref = 'secondaryPreferred'; + break; + case 'n': + pref = 'nearest'; + break; + } + + return new ReadPref(pref, tags); +}; diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/binary.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/binary.js new file mode 100644 index 000000000..4e3c86f78 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/binary.js @@ -0,0 +1,10 @@ + +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const Binary = require('mongodb').Binary; + +module.exports = exports = Binary; diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/collection.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/collection.js new file mode 100644 index 000000000..2c364f810 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/collection.js @@ -0,0 +1,411 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseCollection = require('../../collection'); +const MongooseError = require('../../error/mongooseError'); +const Collection = require('mongodb').Collection; +const ObjectId = require('./objectid'); +const get = require('../../helpers/get'); +const getConstructorName = require('../../helpers/getConstructorName'); +const sliced = require('sliced'); +const stream = require('stream'); +const util = require('util'); + +/** + * A [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) collection implementation. + * + * All methods methods from the [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) driver are copied and wrapped in queue management. + * + * @inherits Collection + * @api private + */ + +function NativeCollection(name, conn, options) { + this.collection = null; + this.Promise = options.Promise || Promise; + this.modelName = options.modelName; + delete options.modelName; + this._closed = false; + MongooseCollection.apply(this, arguments); +} + +/*! + * Inherit from abstract Collection. + */ + +NativeCollection.prototype.__proto__ = MongooseCollection.prototype; + +/** + * Called when the connection opens. + * + * @api private + */ + +NativeCollection.prototype.onOpen = function() { + const _this = this; + + _this.collection = _this.conn.db.collection(_this.name); + MongooseCollection.prototype.onOpen.call(_this); + return _this.collection; +}; + +/** + * Called when the connection closes + * + * @api private + */ + +NativeCollection.prototype.onClose = function(force) { + MongooseCollection.prototype.onClose.call(this, force); +}; + +/*! + * ignore + */ + +const syncCollectionMethods = { watch: true, find: true, aggregate: true }; + +/*! + * Copy the collection methods and make them subject to queues + */ + +function iter(i) { + NativeCollection.prototype[i] = function() { + const collection = this.collection; + const args = Array.from(arguments); + const _this = this; + const debug = get(_this, 'conn.base.options.debug'); + const lastArg = arguments[arguments.length - 1]; + const opId = new ObjectId(); + + // If user force closed, queueing will hang forever. 
See #5664 + if (this.conn.$wasForceClosed) { + const error = new MongooseError('Connection was force closed'); + if (args.length > 0 && + typeof args[args.length - 1] === 'function') { + args[args.length - 1](error); + return; + } else { + throw error; + } + } + + let _args = args; + let callback = null; + if (this._shouldBufferCommands() && this.buffer) { + if (syncCollectionMethods[i] && typeof lastArg !== 'function') { + throw new Error('Collection method ' + i + ' is synchronous'); + } + + this.conn.emit('buffer', { + _id: opId, + modelName: _this.modelName, + collectionName: _this.name, + method: i, + args: args + }); + + let callback; + let _args = args; + let promise = null; + let timeout = null; + if (syncCollectionMethods[i]) { + this.addQueue(() => { + lastArg.call(this, null, this[i].apply(this, _args.slice(0, _args.length - 1))); + }, []); + } else if (typeof lastArg === 'function') { + callback = function collectionOperationCallback() { + if (timeout != null) { + clearTimeout(timeout); + } + return lastArg.apply(this, arguments); + }; + _args = args.slice(0, args.length - 1).concat([callback]); + } else { + promise = new this.Promise((resolve, reject) => { + callback = function collectionOperationCallback(err, res) { + if (timeout != null) { + clearTimeout(timeout); + } + if (err != null) { + return reject(err); + } + resolve(res); + }; + _args = args.concat([callback]); + this.addQueue(i, _args); + }); + } + + const bufferTimeoutMS = this._getBufferTimeoutMS(); + timeout = setTimeout(() => { + const removed = this.removeQueue(i, _args); + if (removed) { + const message = 'Operation `' + this.name + '.' + i + '()` buffering timed out after ' + + bufferTimeoutMS + 'ms'; + const err = new MongooseError(message); + this.conn.emit('buffer-end', { _id: opId, modelName: _this.modelName, collectionName: _this.name, method: i, error: err }); + callback(err); + } + }, bufferTimeoutMS); + + if (!syncCollectionMethods[i] && typeof lastArg === 'function') { + this.addQueue(i, _args); + return; + } + + return promise; + } else if (!syncCollectionMethods[i] && typeof lastArg === 'function') { + callback = function collectionOperationCallback(err, res) { + if (err != null) { + _this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: _this.name, method: i, error: err }); + } else { + _this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: _this.name, method: i, result: res }); + } + return lastArg.apply(this, arguments); + }; + _args = args.slice(0, args.length - 1).concat([callback]); + } + + if (debug) { + if (typeof debug === 'function') { + debug.apply(_this, + [_this.name, i].concat(sliced(args, 0, args.length - 1))); + } else if (debug instanceof stream.Writable) { + this.$printToStream(_this.name, i, args, debug); + } else { + const color = debug.color == null ? true : debug.color; + const shell = debug.shell == null ? false : debug.shell; + this.$print(_this.name, i, args, color, shell); + } + } + + this.conn.emit('operation-start', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, params: _args }); + + try { + if (collection == null) { + const message = 'Cannot call `' + this.name + '.' + i + '()` before initial connection ' + + 'is complete if `bufferCommands = false`. 
Make sure you `await mongoose.connect()` if ' + + 'you have `bufferCommands = false`.'; + throw new MongooseError(message); + } + + if (syncCollectionMethods[i] && typeof lastArg === 'function') { + return lastArg.call(this, null, collection[i].apply(collection, _args.slice(0, _args.length - 1))); + } + + const ret = collection[i].apply(collection, _args); + if (ret != null && typeof ret.then === 'function') { + return ret.then( + res => { + this.conn.emit('operation-end', { _id: opId, modelName: this.modelName, collectionName: this.name, method: i, result: res }); + return res; + }, + err => { + this.conn.emit('operation-end', { _id: opId, modelName: this.modelName, collectionName: this.name, method: i, error: err }); + throw err; + } + ); + } + return ret; + } catch (error) { + // Collection operation may throw because of max bson size, catch it here + // See gh-3906 + if (typeof callback === 'function') { + callback(error); + } else { + this.conn.emit('operation-end', { _id: opId, modelName: _this.modelName, collectionName: this.name, method: i, error: error }); + } + if (typeof lastArg === 'function') { + lastArg(error); + } else { + throw error; + } + } + }; +} + +for (const key of Object.getOwnPropertyNames(Collection.prototype)) { + // Janky hack to work around gh-3005 until we can get rid of the mongoose + // collection abstraction + const descriptor = Object.getOwnPropertyDescriptor(Collection.prototype, key); + // Skip properties with getters because they may throw errors (gh-8528) + if (descriptor.get !== undefined) { + continue; + } + if (typeof Collection.prototype[key] !== 'function') { + continue; + } + + iter(key); +} + +/** + * Debug print helper + * + * @api public + * @method $print + */ + +NativeCollection.prototype.$print = function(name, i, args, color, shell) { + const moduleName = color ? '\x1B[0;36mMongoose:\x1B[0m ' : 'Mongoose: '; + const functionCall = [name, i].join('.'); + const _args = []; + for (let j = args.length - 1; j >= 0; --j) { + if (this.$format(args[j]) || _args.length) { + _args.unshift(this.$format(args[j], color, shell)); + } + } + const params = '(' + _args.join(', ') + ')'; + + console.info(moduleName + functionCall + params); +}; + +/** + * Debug print helper + * + * @api public + * @method $print + */ + +NativeCollection.prototype.$printToStream = function(name, i, args, stream) { + const functionCall = [name, i].join('.'); + const _args = []; + for (let j = args.length - 1; j >= 0; --j) { + if (this.$format(args[j]) || _args.length) { + _args.unshift(this.$format(args[j])); + } + } + const params = '(' + _args.join(', ') + ')'; + + stream.write(functionCall + params, 'utf8'); +}; + +/** + * Formatter for debug print args + * + * @api public + * @method $format + */ + +NativeCollection.prototype.$format = function(arg, color, shell) { + const type = typeof arg; + if (type === 'function' || type === 'undefined') return ''; + return format(arg, false, color, shell); +}; + +/*! 
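// [Editor's note] A hedged illustration, not part of the mongoose source in this
// diff: the iter() wrapper above buffers operations while the connection is down
// and logs them through $print()/$printToStream() when the connection-level
// `debug` option is set. Both behaviours are driven by mongoose's global
// options; the concrete values below are assumptions chosen for illustration.
const mongoose = require('mongoose');
mongoose.set('debug', true);            // each wrapped collection call is printed via $print()
mongoose.set('bufferTimeoutMS', 5000);  // queued ops fail with the "buffering timed out" error after 5s
// Setting `mongoose.set('bufferCommands', false)` skips queueing entirely, so the
// wrapper instead throws the "Cannot call `<collection>.<method>()` before initial
// connection is complete" error constructed above.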
+ * Debug print helper + */ + +function inspectable(representation) { + const ret = { + inspect: function() { return representation; } + }; + if (util.inspect.custom) { + ret[util.inspect.custom] = ret.inspect; + } + return ret; +} +function map(o) { + return format(o, true); +} +function formatObjectId(x, key) { + x[key] = inspectable('ObjectId("' + x[key].toHexString() + '")'); +} +function formatDate(x, key, shell) { + if (shell) { + x[key] = inspectable('ISODate("' + x[key].toUTCString() + '")'); + } else { + x[key] = inspectable('new Date("' + x[key].toUTCString() + '")'); + } +} +function format(obj, sub, color, shell) { + if (obj && typeof obj.toBSON === 'function') { + obj = obj.toBSON(); + } + if (obj == null) { + return obj; + } + + const clone = require('../../helpers/clone'); + let x = clone(obj, { transform: false }); + const constructorName = getConstructorName(x); + + if (constructorName === 'Binary') { + x = 'BinData(' + x.sub_type + ', "' + x.toString('base64') + '")'; + } else if (constructorName === 'ObjectID') { + x = inspectable('ObjectId("' + x.toHexString() + '")'); + } else if (constructorName === 'Date') { + x = inspectable('new Date("' + x.toUTCString() + '")'); + } else if (constructorName === 'Object') { + const keys = Object.keys(x); + const numKeys = keys.length; + let key; + for (let i = 0; i < numKeys; ++i) { + key = keys[i]; + if (x[key]) { + let error; + if (typeof x[key].toBSON === 'function') { + try { + // `session.toBSON()` throws an error. This means we throw errors + // in debug mode when using transactions, see gh-6712. As a + // workaround, catch `toBSON()` errors, try to serialize without + // `toBSON()`, and rethrow if serialization still fails. + x[key] = x[key].toBSON(); + } catch (_error) { + error = _error; + } + } + const _constructorName = getConstructorName(x[key]); + if (_constructorName === 'Binary') { + x[key] = 'BinData(' + x[key].sub_type + ', "' + + x[key].buffer.toString('base64') + '")'; + } else if (_constructorName === 'Object') { + x[key] = format(x[key], true); + } else if (_constructorName === 'ObjectID') { + formatObjectId(x, key); + } else if (_constructorName === 'Date') { + formatDate(x, key, shell); + } else if (_constructorName === 'ClientSession') { + x[key] = inspectable('ClientSession("' + + get(x[key], 'id.id.buffer', '').toString('hex') + '")'); + } else if (Array.isArray(x[key])) { + x[key] = x[key].map(map); + } else if (error != null) { + // If there was an error with `toBSON()` and the object wasn't + // already converted to a string representation, rethrow it. + // Open to better ideas on how to handle this. + throw error; + } + } + } + } + if (sub) { + return x; + } + + return util. + inspect(x, false, 10, color). + replace(/\n/g, ''). + replace(/\s{2,}/g, ' '); +} + +/** + * Retrieves information about this collections indexes. + * + * @param {Function} callback + * @method getIndexes + * @api public + */ + +NativeCollection.prototype.getIndexes = NativeCollection.prototype.indexInformation; + +/*! + * Module exports. + */ + +module.exports = NativeCollection; diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/connection.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/connection.js new file mode 100644 index 000000000..f976c55f1 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/connection.js @@ -0,0 +1,153 @@ +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const MongooseConnection = require('../../connection'); +const STATES = require('../../connectionstate'); +const immediate = require('../../helpers/immediate'); +const setTimeout = require('../../helpers/timers').setTimeout; + +/** + * A [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) connection implementation. + * + * @inherits Connection + * @api private + */ + +function NativeConnection() { + MongooseConnection.apply(this, arguments); + this._listening = false; +} + +/** + * Expose the possible connection states. + * @api public + */ + +NativeConnection.STATES = STATES; + +/*! + * Inherits from Connection. + */ + +NativeConnection.prototype.__proto__ = MongooseConnection.prototype; + +/** + * Switches to a different database using the same connection pool. + * + * Returns a new connection object, with the new db. If you set the `useCache` + * option, `useDb()` will cache connections by `name`. + * + * **Note:** Calling `close()` on a `useDb()` connection will close the base connection as well. + * + * @param {String} name The database name + * @param {Object} [options] + * @param {Boolean} [options.useCache=false] If true, cache results so calling `useDb()` multiple times with the same name only creates 1 connection object. + * @param {Boolean} [options.noListener=false] If true, the new connection object won't listen to any events on the base connection. This is better for memory usage in cases where you're calling `useDb()` for every request. + * @return {Connection} New Connection Object + * @api public + */ + +NativeConnection.prototype.useDb = function(name, options) { + // Return immediately if cached + options = options || {}; + if (options.useCache && this.relatedDbs[name]) { + return this.relatedDbs[name]; + } + + // we have to manually copy all of the attributes... + const newConn = new this.constructor(); + newConn.name = name; + newConn.base = this.base; + newConn.collections = {}; + newConn.models = {}; + newConn.replica = this.replica; + newConn.config = Object.assign({}, this.config, newConn.config); + newConn.name = this.name; + newConn.options = this.options; + newConn._readyState = this._readyState; + newConn._closeCalled = this._closeCalled; + newConn._hasOpened = this._hasOpened; + newConn._listening = false; + + newConn.host = this.host; + newConn.port = this.port; + newConn.user = this.user; + newConn.pass = this.pass; + + // First, when we create another db object, we are not guaranteed to have a + // db object to work with. So, in the case where we have a db object and it + // is connected, we can just proceed with setting everything up. 
However, if + // we do not have a db or the state is not connected, then we need to wait on + // the 'open' event of the connection before doing the rest of the setup + // the 'connected' event is the first time we'll have access to the db object + + const _this = this; + + newConn.client = _this.client; + + if (this.db && this._readyState === STATES.connected) { + wireup(); + } else { + this.once('connected', wireup); + } + + function wireup() { + newConn.client = _this.client; + const _opts = {}; + if (options.hasOwnProperty('noListener')) { + _opts.noListener = options.noListener; + } + newConn.db = _this.client.db(name, _opts); + newConn.onOpen(); + } + + newConn.name = name; + + // push onto the otherDbs stack, this is used when state changes + if (options.noListener !== true) { + this.otherDbs.push(newConn); + } + newConn.otherDbs.push(this); + + // push onto the relatedDbs cache, this is used when state changes + if (options && options.useCache) { + this.relatedDbs[newConn.name] = newConn; + newConn.relatedDbs = this.relatedDbs; + } + + return newConn; +}; + +/** + * Closes the connection + * + * @param {Boolean} [force] + * @param {Function} [fn] + * @return {Connection} this + * @api private + */ + +NativeConnection.prototype.doClose = function(force, fn) { + if (this.client == null) { + immediate(() => fn()); + return this; + } + + this.client.close(force, (err, res) => { + // Defer because the driver will wait at least 1ms before finishing closing + // the pool, see https://github.com/mongodb-js/mongodb-core/blob/a8f8e4ce41936babc3b9112bf42d609779f03b39/lib/connection/pool.js#L1026-L1030. + // If there's queued operations, you may still get some background work + // after the callback is called. + setTimeout(() => fn(err, res), 1); + }); + return this; +}; + +/*! + * Module exports. + */ + +module.exports = NativeConnection; diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/decimal128.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/decimal128.js new file mode 100644 index 000000000..c895f17fd --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/decimal128.js @@ -0,0 +1,7 @@ +/*! + * ignore + */ + +'use strict'; + +module.exports = require('mongodb').Decimal128; diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/index.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/index.js new file mode 100644 index 000000000..2ed9eb0a5 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/index.js @@ -0,0 +1,12 @@ +/*! + * Module exports. + */ + +'use strict'; + +exports.Binary = require('./binary'); +exports.Collection = require('./collection'); +exports.Decimal128 = require('./decimal128'); +exports.ObjectId = require('./objectid'); +exports.ReadPreference = require('./ReadPreference'); +exports.getConnection = () => require('./connection'); \ No newline at end of file diff --git a/node_modules/mongoose/lib/drivers/node-mongodb-native/objectid.js b/node_modules/mongoose/lib/drivers/node-mongodb-native/objectid.js new file mode 100644 index 000000000..6f432b796 --- /dev/null +++ b/node_modules/mongoose/lib/drivers/node-mongodb-native/objectid.js @@ -0,0 +1,16 @@ + +/*! + * [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) ObjectId + * @constructor NodeMongoDbObjectId + * @see ObjectId + */ + +'use strict'; + +const ObjectId = require('mongodb').ObjectId; + +/*! 
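// [Editor's note] A minimal, hedged sketch of the `useDb()` API documented in
// connection.js above. The URI and database names are illustrative assumptions,
// not values taken from this project.
const mongoose = require('mongoose');
const conn = mongoose.createConnection('mongodb://localhost:27017/app');
// Reuse the same underlying socket pool for a second database. With
// `useCache: true`, a later useDb('logs', ...) call returns this same object.
const logsDb = conn.useDb('logs', { useCache: true });
// As the docs above warn, calling close() on `logsDb` also closes `conn`.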
+ * ignore + */ + +module.exports = exports = ObjectId; diff --git a/node_modules/mongoose/lib/error/browserMissingSchema.js b/node_modules/mongoose/lib/error/browserMissingSchema.js new file mode 100644 index 000000000..fe492d53c --- /dev/null +++ b/node_modules/mongoose/lib/error/browserMissingSchema.js @@ -0,0 +1,28 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + + +class MissingSchemaError extends MongooseError { + /*! + * MissingSchema Error constructor. + */ + constructor() { + super('Schema hasn\'t been registered for document.\n' + + 'Use mongoose.Document(name, schema)'); + } +} + +Object.defineProperty(MissingSchemaError.prototype, 'name', { + value: 'MongooseError' +}); + +/*! + * exports + */ + +module.exports = MissingSchemaError; diff --git a/node_modules/mongoose/lib/error/cast.js b/node_modules/mongoose/lib/error/cast.js new file mode 100644 index 000000000..dc604b582 --- /dev/null +++ b/node_modules/mongoose/lib/error/cast.js @@ -0,0 +1,150 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseError = require('./mongooseError'); +const get = require('../helpers/get'); +const util = require('util'); + +/** + * Casting Error constructor. + * + * @param {String} type + * @param {String} value + * @inherits MongooseError + * @api private + */ + +class CastError extends MongooseError { + constructor(type, value, path, reason, schemaType) { + // If no args, assume we'll `init()` later. + if (arguments.length > 0) { + const stringValue = getStringValue(value); + const valueType = getValueType(value); + const messageFormat = getMessageFormat(schemaType); + const msg = formatMessage(null, type, stringValue, path, messageFormat, valueType); + super(msg); + this.init(type, value, path, reason, schemaType); + } else { + super(formatMessage()); + } + } + + toJSON() { + return { + stringValue: this.stringValue, + valueType: this.valueType, + kind: this.kind, + value: this.value, + path: this.path, + reason: this.reason, + name: this.name, + message: this.message + }; + } + /*! + * ignore + */ + init(type, value, path, reason, schemaType) { + this.stringValue = getStringValue(value); + this.messageFormat = getMessageFormat(schemaType); + this.kind = type; + this.value = value; + this.path = path; + this.reason = reason; + this.valueType = getValueType(value); + } + + /*! + * ignore + * @param {Readonly} other + */ + copy(other) { + this.messageFormat = other.messageFormat; + this.stringValue = other.stringValue; + this.kind = other.kind; + this.value = other.value; + this.path = other.path; + this.reason = other.reason; + this.message = other.message; + this.valueType = other.valueType; + } + + /*! 
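// [Editor's note] Hedged example of the custom cast message that
// getMessageFormat()/formatMessage() in this file support: a schema path may
// define a `cast` string using the {KIND}, {VALUE} and {PATH} placeholders.
// The schema shape below is an illustrative assumption, not project code.
//
//   const schema = new mongoose.Schema({
//     age: { type: Number, cast: '`{VALUE}` (kind {KIND}) is not valid for path `{PATH}`' }
//   });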
+ * ignore + */ + setModel(model) { + this.model = model; + this.message = formatMessage(model, this.kind, this.stringValue, this.path, + this.messageFormat, this.valueType); + } +} + +Object.defineProperty(CastError.prototype, 'name', { + value: 'CastError' +}); + +function getStringValue(value) { + let stringValue = util.inspect(value); + stringValue = stringValue.replace(/^'|'$/g, '"'); + if (!stringValue.startsWith('"')) { + stringValue = '"' + stringValue + '"'; + } + return stringValue; +} + +function getValueType(value) { + if (value == null) { + return '' + value; + } + + const t = typeof value; + if (t !== 'object') { + return t; + } + if (typeof value.constructor !== 'function') { + return t; + } + return value.constructor.name; +} + +function getMessageFormat(schemaType) { + const messageFormat = get(schemaType, 'options.cast', null); + if (typeof messageFormat === 'string') { + return messageFormat; + } +} + +/*! + * ignore + */ + +function formatMessage(model, kind, stringValue, path, messageFormat, valueType) { + if (messageFormat != null) { + let ret = messageFormat. + replace('{KIND}', kind). + replace('{VALUE}', stringValue). + replace('{PATH}', path); + if (model != null) { + ret = ret.replace('{MODEL}', model.modelName); + } + + return ret; + } else { + const valueTypeMsg = valueType ? ' (type ' + valueType + ')' : ''; + let ret = 'Cast to ' + kind + ' failed for value ' + + stringValue + valueTypeMsg + ' at path "' + path + '"'; + if (model != null) { + ret += ' for model "' + model.modelName + '"'; + } + return ret; + } +} + +/*! + * exports + */ + +module.exports = CastError; diff --git a/node_modules/mongoose/lib/error/disconnected.js b/node_modules/mongoose/lib/error/disconnected.js new file mode 100644 index 000000000..777a1def2 --- /dev/null +++ b/node_modules/mongoose/lib/error/disconnected.js @@ -0,0 +1,34 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + + +/** + * The connection failed to reconnect and will never successfully reconnect to + * MongoDB without manual intervention. + * @api private + */ +class DisconnectedError extends MongooseError { + /** + * @param {String} connectionString + */ + constructor(connectionString) { + super('Ran out of retries trying to reconnect to "' + + connectionString + '". Try setting `server.reconnectTries` and ' + + '`server.reconnectInterval` to something higher.'); + } +} + +Object.defineProperty(DisconnectedError.prototype, 'name', { + value: 'DisconnectedError' +}); + +/*! + * exports + */ + +module.exports = DisconnectedError; diff --git a/node_modules/mongoose/lib/error/divergentArray.js b/node_modules/mongoose/lib/error/divergentArray.js new file mode 100644 index 000000000..ed86caf17 --- /dev/null +++ b/node_modules/mongoose/lib/error/divergentArray.js @@ -0,0 +1,37 @@ + +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + +class DivergentArrayError extends MongooseError { + /*! + * DivergentArrayError constructor. + * @param {Array} paths + */ + constructor(paths) { + const msg = 'For your own good, using `document.save()` to update an array ' + + 'which was selected using an $elemMatch projection OR ' + + 'populated using skip, limit, query conditions, or exclusion of ' + + 'the _id field when the operation results in a $pop or $set of ' + + 'the entire array is not supported. 
The following ' + + 'path(s) would have been modified unsafely:\n' + + ' ' + paths.join('\n ') + '\n' + + 'Use Model.update() to update these arrays instead.'; + // TODO write up a docs page (FAQ) and link to it + super(msg); + } +} + +Object.defineProperty(DivergentArrayError.prototype, 'name', { + value: 'DivergentArrayError' +}); + +/*! + * exports + */ + +module.exports = DivergentArrayError; diff --git a/node_modules/mongoose/lib/error/index.js b/node_modules/mongoose/lib/error/index.js new file mode 100644 index 000000000..ec4188d61 --- /dev/null +++ b/node_modules/mongoose/lib/error/index.js @@ -0,0 +1,205 @@ +'use strict'; + +/** + * MongooseError constructor. MongooseError is the base class for all + * Mongoose-specific errors. + * + * ####Example: + * const Model = mongoose.model('Test', new Schema({ answer: Number })); + * const doc = new Model({ answer: 'not a number' }); + * const err = doc.validateSync(); + * + * err instanceof mongoose.Error; // true + * + * @constructor Error + * @param {String} msg Error message + * @inherits Error https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Error + */ + +const MongooseError = require('./mongooseError'); + +/** + * The name of the error. The name uniquely identifies this Mongoose error. The + * possible values are: + * + * - `MongooseError`: general Mongoose error + * - `CastError`: Mongoose could not convert a value to the type defined in the schema path. May be in a `ValidationError` class' `errors` property. + * - `DisconnectedError`: This [connection](connections.html) timed out in trying to reconnect to MongoDB and will not successfully reconnect to MongoDB unless you explicitly reconnect. + * - `DivergentArrayError`: You attempted to `save()` an array that was modified after you loaded it with a `$elemMatch` or similar projection + * - `MissingSchemaError`: You tried to access a model with [`mongoose.model()`](api.html#mongoose_Mongoose-model) that was not defined + * - `DocumentNotFoundError`: The document you tried to [`save()`](api.html#document_Document-save) was not found + * - `ValidatorError`: error from an individual schema path's validator + * - `ValidationError`: error returned from [`validate()`](api.html#document_Document-validate) or [`validateSync()`](api.html#document_Document-validateSync). Contains zero or more `ValidatorError` instances in `.errors` property. + * - `MissingSchemaError`: You called `mongoose.Document()` without a schema + * - `ObjectExpectedError`: Thrown when you set a nested path to a non-object value with [strict mode set](guide.html#strict). + * - `ObjectParameterError`: Thrown when you pass a non-object value to a function which expects an object as a paramter + * - `OverwriteModelError`: Thrown when you call [`mongoose.model()`](api.html#mongoose_Mongoose-model) to re-define a model that was already defined. + * - `ParallelSaveError`: Thrown when you call [`save()`](api.html#model_Model-save) on a document when the same document instance is already saving. + * - `StrictModeError`: Thrown when you set a path that isn't the schema and [strict mode](guide.html#strict) is set to `throw`. + * - `VersionError`: Thrown when the [document is out of sync](guide.html#versionKey) + * + * @api public + * @property {String} name + * @memberOf Error + * @instance + */ + +/*! + * Module exports. + */ + +module.exports = exports = MongooseError; + +/** + * The default built-in validator error messages. 
+ * + * @see Error.messages #error_messages_MongooseError-messages + * @api public + * @memberOf Error + * @static messages + */ + +MongooseError.messages = require('./messages'); + +// backward compat +MongooseError.Messages = MongooseError.messages; + +/** + * An instance of this error class will be returned when `save()` fails + * because the underlying + * document was not found. The constructor takes one parameter, the + * conditions that mongoose passed to `update()` when trying to update + * the document. + * + * @api public + * @memberOf Error + * @static DocumentNotFoundError + */ + +MongooseError.DocumentNotFoundError = require('./notFound'); + +/** + * An instance of this error class will be returned when mongoose failed to + * cast a value. + * + * @api public + * @memberOf Error + * @static CastError + */ + +MongooseError.CastError = require('./cast'); + +/** + * An instance of this error class will be returned when [validation](/docs/validation.html) failed. + * The `errors` property contains an object whose keys are the paths that failed and whose values are + * instances of CastError or ValidationError. + * + * @api public + * @memberOf Error + * @static ValidationError + */ + +MongooseError.ValidationError = require('./validation'); + +/** + * A `ValidationError` has a hash of `errors` that contain individual + * `ValidatorError` instances. + * + * ####Example: + * + * const schema = Schema({ name: { type: String, required: true } }); + * const Model = mongoose.model('Test', schema); + * const doc = new Model({}); + * + * // Top-level error is a ValidationError, **not** a ValidatorError + * const err = doc.validateSync(); + * err instanceof mongoose.Error.ValidationError; // true + * + * // A ValidationError `err` has 0 or more ValidatorErrors keyed by the + * // path in the `err.errors` property. + * err.errors['name'] instanceof mongoose.Error.ValidatorError; + * + * err.errors['name'].kind; // 'required' + * err.errors['name'].path; // 'name' + * err.errors['name'].value; // undefined + * + * Instances of `ValidatorError` have the following properties: + * + * - `kind`: The validator's `type`, like `'required'` or `'regexp'` + * - `path`: The path that failed validation + * - `value`: The value that failed validation + * + * @api public + * @memberOf Error + * @static ValidatorError + */ + +MongooseError.ValidatorError = require('./validator'); + +/** + * An instance of this error class will be returned when you call `save()` after + * the document in the database was changed in a potentially unsafe way. See + * the [`versionKey` option](/docs/guide.html#versionKey) for more information. + * + * @api public + * @memberOf Error + * @static VersionError + */ + +MongooseError.VersionError = require('./version'); + +/** + * An instance of this error class will be returned when you call `save()` multiple + * times on the same document in parallel. See the [FAQ](/docs/faq.html) for more + * information. + * + * @api public + * @memberOf Error + * @static ParallelSaveError + */ + +MongooseError.ParallelSaveError = require('./parallelSave'); + +/** + * Thrown when a model with the given name was already registered on the connection. + * See [the FAQ about `OverwriteModelError`](/docs/faq.html#overwrite-model-error). 
+ * + * @api public + * @memberOf Error + * @static OverwriteModelError + */ + +MongooseError.OverwriteModelError = require('./overwriteModel'); + +/** + * Thrown when you try to access a model that has not been registered yet + * + * @api public + * @memberOf Error + * @static MissingSchemaError + */ + +MongooseError.MissingSchemaError = require('./missingSchema'); + +/** + * An instance of this error will be returned if you used an array projection + * and then modified the array in an unsafe way. + * + * @api public + * @memberOf Error + * @static DivergentArrayError + */ + +MongooseError.DivergentArrayError = require('./divergentArray'); + +/** + * Thrown when your try to pass values to model contrtuctor that + * were not specified in schema or change immutable properties when + * `strict` mode is `"throw"` + * + * @api public + * @memberOf Error + * @static StrictModeError + */ + +MongooseError.StrictModeError = require('./strict'); diff --git a/node_modules/mongoose/lib/error/messages.js b/node_modules/mongoose/lib/error/messages.js new file mode 100644 index 000000000..ac0294a39 --- /dev/null +++ b/node_modules/mongoose/lib/error/messages.js @@ -0,0 +1,47 @@ + +/** + * The default built-in validator error messages. These may be customized. + * + * // customize within each schema or globally like so + * const mongoose = require('mongoose'); + * mongoose.Error.messages.String.enum = "Your custom message for {PATH}."; + * + * As you might have noticed, error messages support basic templating + * + * - `{PATH}` is replaced with the invalid document path + * - `{VALUE}` is replaced with the invalid value + * - `{TYPE}` is replaced with the validator type such as "regexp", "min", or "user defined" + * - `{MIN}` is replaced with the declared min value for the Number.min validator + * - `{MAX}` is replaced with the declared max value for the Number.max validator + * + * Click the "show code" link below to see all defaults. + * + * @static messages + * @receiver MongooseError + * @api public + */ + +'use strict'; + +const msg = module.exports = exports = {}; + +msg.DocumentNotFoundError = null; + +msg.general = {}; +msg.general.default = 'Validator failed for path `{PATH}` with value `{VALUE}`'; +msg.general.required = 'Path `{PATH}` is required.'; + +msg.Number = {}; +msg.Number.min = 'Path `{PATH}` ({VALUE}) is less than minimum allowed value ({MIN}).'; +msg.Number.max = 'Path `{PATH}` ({VALUE}) is more than maximum allowed value ({MAX}).'; +msg.Number.enum = '`{VALUE}` is not a valid enum value for path `{PATH}`.'; + +msg.Date = {}; +msg.Date.min = 'Path `{PATH}` ({VALUE}) is before minimum allowed value ({MIN}).'; +msg.Date.max = 'Path `{PATH}` ({VALUE}) is after maximum allowed value ({MAX}).'; + +msg.String = {}; +msg.String.enum = '`{VALUE}` is not a valid enum value for path `{PATH}`.'; +msg.String.match = 'Path `{PATH}` is invalid ({VALUE}).'; +msg.String.minlength = 'Path `{PATH}` (`{VALUE}`) is shorter than the minimum allowed length ({MINLENGTH}).'; +msg.String.maxlength = 'Path `{PATH}` (`{VALUE}`) is longer than the maximum allowed length ({MAXLENGTH}).'; diff --git a/node_modules/mongoose/lib/error/missingSchema.js b/node_modules/mongoose/lib/error/missingSchema.js new file mode 100644 index 000000000..ca306b7e6 --- /dev/null +++ b/node_modules/mongoose/lib/error/missingSchema.js @@ -0,0 +1,30 @@ + +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + +class MissingSchemaError extends MongooseError { + /*! 
+ * MissingSchema Error constructor. + * @param {String} name + */ + constructor(name) { + const msg = 'Schema hasn\'t been registered for model "' + name + '".\n' + + 'Use mongoose.model(name, schema)'; + super(msg); + } +} + +Object.defineProperty(MissingSchemaError.prototype, 'name', { + value: 'MissingSchemaError' +}); + +/*! + * exports + */ + +module.exports = MissingSchemaError; diff --git a/node_modules/mongoose/lib/error/mongooseError.js b/node_modules/mongoose/lib/error/mongooseError.js new file mode 100644 index 000000000..5505105cb --- /dev/null +++ b/node_modules/mongoose/lib/error/mongooseError.js @@ -0,0 +1,13 @@ +'use strict'; + +/*! + * ignore + */ + +class MongooseError extends Error { } + +Object.defineProperty(MongooseError.prototype, 'name', { + value: 'MongooseError' +}); + +module.exports = MongooseError; diff --git a/node_modules/mongoose/lib/error/notFound.js b/node_modules/mongoose/lib/error/notFound.js new file mode 100644 index 000000000..0e8386b8a --- /dev/null +++ b/node_modules/mongoose/lib/error/notFound.js @@ -0,0 +1,44 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseError = require('./'); +const util = require('util'); + +class DocumentNotFoundError extends MongooseError { + /*! + * OverwriteModel Error constructor. + */ + constructor(filter, model, numAffected, result) { + let msg; + const messages = MongooseError.messages; + if (messages.DocumentNotFoundError != null) { + msg = typeof messages.DocumentNotFoundError === 'function' ? + messages.DocumentNotFoundError(filter, model) : + messages.DocumentNotFoundError; + } else { + msg = 'No document found for query "' + util.inspect(filter) + + '" on model "' + model + '"'; + } + + super(msg); + + this.result = result; + this.numAffected = numAffected; + this.filter = filter; + // Backwards compat + this.query = filter; + } +} + +Object.defineProperty(DocumentNotFoundError.prototype, 'name', { + value: 'DocumentNotFoundError' +}); + +/*! + * exports + */ + +module.exports = DocumentNotFoundError; diff --git a/node_modules/mongoose/lib/error/objectExpected.js b/node_modules/mongoose/lib/error/objectExpected.js new file mode 100644 index 000000000..6506f6065 --- /dev/null +++ b/node_modules/mongoose/lib/error/objectExpected.js @@ -0,0 +1,30 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + + +class ObjectExpectedError extends MongooseError { + /** + * Strict mode error constructor + * + * @param {string} type + * @param {string} value + * @api private + */ + constructor(path, val) { + const typeDescription = Array.isArray(val) ? 'array' : 'primitive value'; + super('Tried to set nested object field `' + path + + `\` to ${typeDescription} \`` + val + '`'); + this.path = path; + } +} + +Object.defineProperty(ObjectExpectedError.prototype, 'name', { + value: 'ObjectExpectedError' +}); + +module.exports = ObjectExpectedError; diff --git a/node_modules/mongoose/lib/error/objectParameter.js b/node_modules/mongoose/lib/error/objectParameter.js new file mode 100644 index 000000000..5881a481a --- /dev/null +++ b/node_modules/mongoose/lib/error/objectParameter.js @@ -0,0 +1,30 @@ +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const MongooseError = require('./'); + +class ObjectParameterError extends MongooseError { + /** + * Constructor for errors that happen when a parameter that's expected to be + * an object isn't an object + * + * @param {Any} value + * @param {String} paramName + * @param {String} fnName + * @api private + */ + constructor(value, paramName, fnName) { + super('Parameter "' + paramName + '" to ' + fnName + + '() must be an object, got ' + value.toString()); + } +} + + +Object.defineProperty(ObjectParameterError.prototype, 'name', { + value: 'ObjectParameterError' +}); + +module.exports = ObjectParameterError; diff --git a/node_modules/mongoose/lib/error/overwriteModel.js b/node_modules/mongoose/lib/error/overwriteModel.js new file mode 100644 index 000000000..d509e1955 --- /dev/null +++ b/node_modules/mongoose/lib/error/overwriteModel.js @@ -0,0 +1,29 @@ + +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + + +class OverwriteModelError extends MongooseError { + /*! + * OverwriteModel Error constructor. + * @param {String} name + */ + constructor(name) { + super('Cannot overwrite `' + name + '` model once compiled.'); + } +} + +Object.defineProperty(OverwriteModelError.prototype, 'name', { + value: 'OverwriteModelError' +}); + +/*! + * exports + */ + +module.exports = OverwriteModelError; diff --git a/node_modules/mongoose/lib/error/parallelSave.js b/node_modules/mongoose/lib/error/parallelSave.js new file mode 100644 index 000000000..e0628576d --- /dev/null +++ b/node_modules/mongoose/lib/error/parallelSave.js @@ -0,0 +1,30 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseError = require('./'); + +class ParallelSaveError extends MongooseError { + /** + * ParallelSave Error constructor. + * + * @param {Document} doc + * @api private + */ + constructor(doc) { + const msg = 'Can\'t save() the same doc multiple times in parallel. Document: '; + super(msg + doc._id); + } +} + +Object.defineProperty(ParallelSaveError.prototype, 'name', { + value: 'ParallelSaveError' +}); + +/*! + * exports + */ + +module.exports = ParallelSaveError; diff --git a/node_modules/mongoose/lib/error/parallelValidate.js b/node_modules/mongoose/lib/error/parallelValidate.js new file mode 100644 index 000000000..18697b6bc --- /dev/null +++ b/node_modules/mongoose/lib/error/parallelValidate.js @@ -0,0 +1,31 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseError = require('./mongooseError'); + + +class ParallelValidateError extends MongooseError { + /** + * ParallelValidate Error constructor. + * + * @param {Document} doc + * @api private + */ + constructor(doc) { + const msg = 'Can\'t validate() the same doc multiple times in parallel. Document: '; + super(msg + doc._id); + } +} + +Object.defineProperty(ParallelValidateError.prototype, 'name', { + value: 'ParallelValidateError' +}); + +/*! + * exports + */ + +module.exports = ParallelValidateError; \ No newline at end of file diff --git a/node_modules/mongoose/lib/error/serverSelection.js b/node_modules/mongoose/lib/error/serverSelection.js new file mode 100644 index 000000000..162e2fceb --- /dev/null +++ b/node_modules/mongoose/lib/error/serverSelection.js @@ -0,0 +1,61 @@ +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const MongooseError = require('./mongooseError'); +const allServersUnknown = require('../helpers/topology/allServersUnknown'); +const isAtlas = require('../helpers/topology/isAtlas'); +const isSSLError = require('../helpers/topology/isSSLError'); + +/*! + * ignore + */ + +const atlasMessage = 'Could not connect to any servers in your MongoDB Atlas cluster. ' + + 'One common reason is that you\'re trying to access the database from ' + + 'an IP that isn\'t whitelisted. Make sure your current IP address is on your Atlas ' + + 'cluster\'s IP whitelist: https://docs.atlas.mongodb.com/security-whitelist/'; + +const sslMessage = 'Mongoose is connecting with SSL enabled, but the server is ' + + 'not accepting SSL connections. Please ensure that the MongoDB server you are ' + + 'connecting to is configured to accept SSL connections. Learn more: ' + + 'https://mongoosejs.com/docs/tutorials/ssl.html'; + +class MongooseServerSelectionError extends MongooseError { + /** + * MongooseServerSelectionError constructor + * + * @api private + */ + assimilateError(err) { + const reason = err.reason; + // Special message for a case that is likely due to IP whitelisting issues. + const isAtlasWhitelistError = isAtlas(reason) && + allServersUnknown(reason) && + err.message.indexOf('bad auth') === -1 && + err.message.indexOf('Authentication failed') === -1; + + if (isAtlasWhitelistError) { + this.message = atlasMessage; + } else if (isSSLError(reason)) { + this.message = sslMessage; + } else { + this.message = err.message; + } + for (const key in err) { + if (key !== 'name') { + this[key] = err[key]; + } + } + + return this; + } +} + +Object.defineProperty(MongooseServerSelectionError.prototype, 'name', { + value: 'MongooseServerSelectionError' +}); + +module.exports = MongooseServerSelectionError; diff --git a/node_modules/mongoose/lib/error/strict.js b/node_modules/mongoose/lib/error/strict.js new file mode 100644 index 000000000..393ca6e1f --- /dev/null +++ b/node_modules/mongoose/lib/error/strict.js @@ -0,0 +1,33 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + + +class StrictModeError extends MongooseError { + /** + * Strict mode error constructor + * + * @param {String} path + * @param {String} [msg] + * @param {Boolean} [immutable] + * @inherits MongooseError + * @api private + */ + constructor(path, msg, immutable) { + msg = msg || 'Field `' + path + '` is not in schema and strict ' + + 'mode is set to throw.'; + super(msg); + this.isImmutableError = !!immutable; + this.path = path; + } +} + +Object.defineProperty(StrictModeError.prototype, 'name', { + value: 'StrictModeError' +}); + +module.exports = StrictModeError; diff --git a/node_modules/mongoose/lib/error/validation.js b/node_modules/mongoose/lib/error/validation.js new file mode 100644 index 000000000..5ca993a34 --- /dev/null +++ b/node_modules/mongoose/lib/error/validation.js @@ -0,0 +1,112 @@ +/*! 
+ * Module requirements + */ + +'use strict'; + +const MongooseError = require('./mongooseError'); +const getConstructorName = require('../helpers/getConstructorName'); +const util = require('util'); + +class ValidationError extends MongooseError { + /** + * Document Validation Error + * + * @api private + * @param {Document} [instance] + * @inherits MongooseError + */ + constructor(instance) { + let _message; + if (getConstructorName(instance) === 'model') { + _message = instance.constructor.modelName + ' validation failed'; + } else { + _message = 'Validation failed'; + } + + super(_message); + + this.errors = {}; + this._message = _message; + + if (instance) { + instance.$errors = this.errors; + } + } + + /** + * Console.log helper + */ + toString() { + return this.name + ': ' + _generateMessage(this); + } + + /*! + * inspect helper + */ + inspect() { + return Object.assign(new Error(this.message), this); + } + + /*! + * add message + */ + addError(path, error) { + this.errors[path] = error; + this.message = this._message + ': ' + _generateMessage(this); + } +} + + +if (util.inspect.custom) { + /*! + * Avoid Node deprecation warning DEP0079 + */ + + ValidationError.prototype[util.inspect.custom] = ValidationError.prototype.inspect; +} + +/*! + * Helper for JSON.stringify + * Ensure `name` and `message` show up in toJSON output re: gh-9847 + */ +Object.defineProperty(ValidationError.prototype, 'toJSON', { + enumerable: false, + writable: false, + configurable: true, + value: function() { + return Object.assign({}, this, { name: this.name, message: this.message }); + } +}); + + +Object.defineProperty(ValidationError.prototype, 'name', { + value: 'ValidationError' +}); + +/*! + * ignore + */ + +function _generateMessage(err) { + const keys = Object.keys(err.errors || {}); + const len = keys.length; + const msgs = []; + let key; + + for (let i = 0; i < len; ++i) { + key = keys[i]; + if (err === err.errors[key]) { + continue; + } + msgs.push(key + ': ' + err.errors[key].message); + } + + return msgs.join(', '); +} + +/*! + * Module exports + */ + +module.exports = ValidationError; diff --git a/node_modules/mongoose/lib/error/validator.js b/node_modules/mongoose/lib/error/validator.js new file mode 100644 index 000000000..f880e2b58 --- /dev/null +++ b/node_modules/mongoose/lib/error/validator.js @@ -0,0 +1,94 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseError = require('./'); + + +class ValidatorError extends MongooseError { + /** + * Schema validator error + * + * @param {Object} properties + * @api private + */ + constructor(properties) { + let msg = properties.message; + if (!msg) { + msg = MongooseError.messages.general.default; + } + + const message = formatMessage(msg, properties); + super(message); + + properties = Object.assign({}, properties, { message: message }); + this.properties = properties; + this.kind = properties.type; + this.path = properties.path; + this.value = properties.value; + this.reason = properties.reason; + } + + /*! + * toString helper + * TODO remove? This defaults to `${this.name}: ${this.message}` + */ + toString() { + return this.message; + } + + /*! + * Ensure `name` and `message` show up in toJSON output re: gh-9296 + */ + + toJSON() { + return Object.assign({ name: this.name, message: this.message }, this); + } +} + + +Object.defineProperty(ValidatorError.prototype, 'name', { + value: 'ValidatorError' +}); + +/*! + * The object used to define this validator. 
Not enumerable to hide + * it from `require('util').inspect()` output re: gh-3925 + */ + +Object.defineProperty(ValidatorError.prototype, 'properties', { + enumerable: false, + writable: true, + value: null +}); + +// Exposed for testing +ValidatorError.prototype.formatMessage = formatMessage; + +/*! + * Formats error messages + */ + +function formatMessage(msg, properties) { + if (typeof msg === 'function') { + return msg(properties); + } + + const propertyNames = Object.keys(properties); + for (const propertyName of propertyNames) { + if (propertyName === 'message') { + continue; + } + msg = msg.replace('{' + propertyName.toUpperCase() + '}', properties[propertyName]); + } + + return msg; +} + +/*! + * exports + */ + +module.exports = ValidatorError; diff --git a/node_modules/mongoose/lib/error/version.js b/node_modules/mongoose/lib/error/version.js new file mode 100644 index 000000000..b357fb16c --- /dev/null +++ b/node_modules/mongoose/lib/error/version.js @@ -0,0 +1,36 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseError = require('./'); + +class VersionError extends MongooseError { + /** + * Version Error constructor. + * + * @param {Document} doc + * @param {Number} currentVersion + * @param {Array} modifiedPaths + * @api private + */ + constructor(doc, currentVersion, modifiedPaths) { + const modifiedPathsStr = modifiedPaths.join(', '); + super('No matching document found for id "' + doc._id + + '" version ' + currentVersion + ' modifiedPaths "' + modifiedPathsStr + '"'); + this.version = currentVersion; + this.modifiedPaths = modifiedPaths; + } +} + + +Object.defineProperty(VersionError.prototype, 'name', { + value: 'VersionError' +}); + +/*! + * exports + */ + +module.exports = VersionError; diff --git a/node_modules/mongoose/lib/helpers/aggregate/stringifyFunctionOperators.js b/node_modules/mongoose/lib/helpers/aggregate/stringifyFunctionOperators.js new file mode 100644 index 000000000..124418efd --- /dev/null +++ b/node_modules/mongoose/lib/helpers/aggregate/stringifyFunctionOperators.js @@ -0,0 +1,50 @@ +'use strict'; + +module.exports = function stringifyFunctionOperators(pipeline) { + if (!Array.isArray(pipeline)) { + return; + } + + for (const stage of pipeline) { + if (stage == null) { + continue; + } + + const canHaveAccumulator = stage.$group || stage.$bucket || stage.$bucketAuto; + if (canHaveAccumulator != null) { + for (const key of Object.keys(canHaveAccumulator)) { + handleAccumulator(canHaveAccumulator[key]); + } + } + + const stageType = Object.keys(stage)[0]; + if (stageType && typeof stage[stageType] === 'object') { + const stageOptions = stage[stageType]; + for (const key of Object.keys(stageOptions)) { + if (stageOptions[key] != null && + stageOptions[key].$function != null && + typeof stageOptions[key].$function.body === 'function') { + stageOptions[key].$function.body = stageOptions[key].$function.body.toString(); + } + } + } + + if (stage.$facet != null) { + for (const key of Object.keys(stage.$facet)) { + stringifyFunctionOperators(stage.$facet[key]); + } + } + } +}; + +function handleAccumulator(operator) { + if (operator == null || operator.$accumulator == null) { + return; + } + + for (const key of ['init', 'accumulate', 'merge', 'finalize']) { + if (typeof operator.$accumulator[key] === 'function') { + operator.$accumulator[key] = String(operator.$accumulator[key]); + } + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/arrayDepth.js b/node_modules/mongoose/lib/helpers/arrayDepth.js new file mode 
100644 index 000000000..e55de7ffb --- /dev/null +++ b/node_modules/mongoose/lib/helpers/arrayDepth.js @@ -0,0 +1,33 @@ +'use strict'; + +module.exports = arrayDepth; + +function arrayDepth(arr) { + if (!Array.isArray(arr)) { + return { min: 0, max: 0, containsNonArrayItem: true }; + } + if (arr.length === 0) { + return { min: 1, max: 1, containsNonArrayItem: false }; + } + if (arr.length === 1 && !Array.isArray(arr[0])) { + return { min: 1, max: 1, containsNonArrayItem: false }; + } + + const res = arrayDepth(arr[0]); + + for (let i = 1; i < arr.length; ++i) { + const _res = arrayDepth(arr[i]); + if (_res.min < res.min) { + res.min = _res.min; + } + if (_res.max > res.max) { + res.max = _res.max; + } + res.containsNonArrayItem = res.containsNonArrayItem || _res.containsNonArrayItem; + } + + res.min = res.min + 1; + res.max = res.max + 1; + + return res; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/clone.js b/node_modules/mongoose/lib/helpers/clone.js new file mode 100644 index 000000000..4cdd9ecac --- /dev/null +++ b/node_modules/mongoose/lib/helpers/clone.js @@ -0,0 +1,148 @@ +'use strict'; + + +const cloneRegExp = require('regexp-clone'); +const Decimal = require('../types/decimal128'); +const ObjectId = require('../types/objectid'); +const specialProperties = require('./specialProperties'); +const isMongooseObject = require('./isMongooseObject'); +const getFunctionName = require('./getFunctionName'); +const isBsonType = require('./isBsonType'); +const isObject = require('./isObject'); +const symbols = require('./symbols'); +const trustedSymbol = require('./query/trusted').trustedSymbol; +const utils = require('../utils'); + + +/*! + * Object clone with Mongoose natives support. + * + * If options.minimize is true, creates a minimal data object. Empty objects and undefined values will not be cloned. This makes the data payload sent to MongoDB as small as possible. + * + * Functions are never cloned. + * + * @param {Object} obj the object to clone + * @param {Object} options + * @param {Boolean} isArrayChild true if cloning immediately underneath an array. Special case for minimize. + * @return {Object} the cloned object + * @api private + */ + +function clone(obj, options, isArrayChild) { + if (obj == null) { + return obj; + } + + if (Array.isArray(obj)) { + return cloneArray(obj, options); + } + + if (isMongooseObject(obj)) { + // Single nested subdocs should apply getters later in `applyGetters()` + // when calling `toObject()`. 
See gh-7442, gh-8295 + if (options && options._skipSingleNestedGetters && obj.$isSingleNested) { + options = Object.assign({}, options, { getters: false }); + } + + if (utils.isPOJO(obj) && obj.$__ != null && obj._doc != null) { + return obj._doc; + } + + if (options && options.json && typeof obj.toJSON === 'function') { + return obj.toJSON(options); + } + return obj.toObject(options); + } + + if (obj.constructor) { + switch (getFunctionName(obj.constructor)) { + case 'Object': + return cloneObject(obj, options, isArrayChild); + case 'Date': + return new obj.constructor(+obj); + case 'RegExp': + return cloneRegExp(obj); + default: + // ignore + break; + } + } + + if (obj instanceof ObjectId) { + return new ObjectId(obj.id); + } + + if (isBsonType(obj, 'Decimal128')) { + if (options && options.flattenDecimals) { + return obj.toJSON(); + } + return Decimal.fromString(obj.toString()); + } + + if (!obj.constructor && isObject(obj)) { + // object created with Object.create(null) + return cloneObject(obj, options, isArrayChild); + } + + if (obj[symbols.schemaTypeSymbol]) { + return obj.clone(); + } + + // If we're cloning this object to go into a MongoDB command, + // and there's a `toBSON()` function, assume this object will be + // stored as a primitive in MongoDB and doesn't need to be cloned. + if (options && options.bson && typeof obj.toBSON === 'function') { + return obj; + } + + if (obj.valueOf != null) { + return obj.valueOf(); + } + + return cloneObject(obj, options, isArrayChild); +} +module.exports = clone; + +/*! + * ignore + */ + +function cloneObject(obj, options, isArrayChild) { + const minimize = options && options.minimize; + const ret = {}; + let hasKeys; + + if (obj[trustedSymbol]) { + ret[trustedSymbol] = obj[trustedSymbol]; + } + + for (const k of Object.keys(obj)) { + if (specialProperties.has(k)) { + continue; + } + + // Don't pass `isArrayChild` down + const val = clone(obj[k], options); + + if (!minimize || (typeof val !== 'undefined')) { + if (minimize === false && typeof val === 'undefined') { + delete ret[k]; + } else { + hasKeys || (hasKeys = true); + ret[k] = val; + } + } + } + + return minimize && !isArrayChild ? hasKeys && ret : ret; +} + +function cloneArray(arr, options) { + const ret = []; + + for (const item of arr) { + ret.push(clone(item, options, true)); + } + + return ret; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/common.js b/node_modules/mongoose/lib/helpers/common.js new file mode 100644 index 000000000..a191b8ae2 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/common.js @@ -0,0 +1,115 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const Binary = require('../driver').get().Binary; +const Decimal128 = require('../types/decimal128'); +const ObjectId = require('../types/objectid'); +const isMongooseObject = require('./isMongooseObject'); + +exports.flatten = flatten; +exports.modifiedPaths = modifiedPaths; + +/*! + * ignore + */ + +function flatten(update, path, options, schema) { + let keys; + if (update && isMongooseObject(update) && !Buffer.isBuffer(update)) { + keys = Object.keys(update.toObject({ transform: false, virtuals: false })); + } else { + keys = Object.keys(update || {}); + } + + const numKeys = keys.length; + const result = {}; + path = path ? path + '.' 
: ''; + + for (let i = 0; i < numKeys; ++i) { + const key = keys[i]; + const val = update[key]; + result[path + key] = val; + + // Avoid going into mixed paths if schema is specified + const keySchema = schema && schema.path && schema.path(path + key); + const isNested = schema && schema.nested && schema.nested[path + key]; + if (keySchema && keySchema.instance === 'Mixed') continue; + + if (shouldFlatten(val)) { + if (options && options.skipArrays && Array.isArray(val)) { + continue; + } + const flat = flatten(val, path + key, options, schema); + for (const k in flat) { + result[k] = flat[k]; + } + if (Array.isArray(val)) { + result[path + key] = val; + } + } + + if (isNested) { + const paths = Object.keys(schema.paths); + for (const p of paths) { + if (p.startsWith(path + key + '.') && !result.hasOwnProperty(p)) { + result[p] = void 0; + } + } + } + } + + return result; +} + +/*! + * ignore + */ + +function modifiedPaths(update, path, result) { + const keys = Object.keys(update || {}); + const numKeys = keys.length; + result = result || {}; + path = path ? path + '.' : ''; + + for (let i = 0; i < numKeys; ++i) { + const key = keys[i]; + let val = update[key]; + + const _path = path + key; + result[_path] = true; + if (_path.indexOf('.') !== -1) { + const sp = _path.split('.'); + let cur = sp[0]; + for (let i = 1; i < sp.length; ++i) { + result[cur] = true; + cur += '.' + sp[i]; + } + } + if (isMongooseObject(val) && !Buffer.isBuffer(val)) { + val = val.toObject({ transform: false, virtuals: false }); + } + if (shouldFlatten(val)) { + modifiedPaths(val, path + key, result); + } + } + + return result; +} + +/*! + * ignore + */ + +function shouldFlatten(val) { + return val && + typeof val === 'object' && + !(val instanceof Date) && + !(val instanceof ObjectId) && + (!Array.isArray(val) || val.length > 0) && + !(val instanceof Buffer) && + !(val instanceof Decimal128) && + !(val instanceof Binary); +} diff --git a/node_modules/mongoose/lib/helpers/cursor/eachAsync.js b/node_modules/mongoose/lib/helpers/cursor/eachAsync.js new file mode 100644 index 000000000..2ef0346e0 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/cursor/eachAsync.js @@ -0,0 +1,157 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const immediate = require('../immediate'); +const promiseOrCallback = require('../promiseOrCallback'); + +/** + * Execute `fn` for every document in the cursor. If `fn` returns a promise, + * will wait for the promise to resolve before iterating on to the next one. + * Returns a promise that resolves when done. 
+ * + * @param {Function} next the thunk to call to get the next document + * @param {Function} fn + * @param {Object} options + * @param {Function} [callback] executed when all docs have been processed + * @return {Promise} + * @api public + * @method eachAsync + */ + +module.exports = function eachAsync(next, fn, options, callback) { + const parallel = options.parallel || 1; + const batchSize = options.batchSize; + const enqueue = asyncQueue(); + + return promiseOrCallback(callback, cb => { + if (batchSize != null) { + if (typeof batchSize !== 'number') { + throw new TypeError('batchSize must be a number'); + } + if (batchSize < 1) { + throw new TypeError('batchSize must be at least 1'); + } + if (batchSize !== Math.floor(batchSize)) { + throw new TypeError('batchSize must be a positive integer'); + } + } + + iterate(cb); + }); + + function iterate(finalCallback) { + let drained = false; + let handleResultsInProgress = 0; + let currentDocumentIndex = 0; + let documentsBatch = []; + + let error = null; + for (let i = 0; i < parallel; ++i) { + enqueue(fetch); + } + + function fetch(done) { + if (drained || error) { + return done(); + } + + next(function(err, doc) { + if (drained || error != null) { + return done(); + } + if (err != null) { + error = err; + finalCallback(err); + return done(); + } + if (doc == null) { + drained = true; + if (handleResultsInProgress <= 0) { + finalCallback(null); + } else if (batchSize != null && documentsBatch.length) { + handleNextResult(documentsBatch, currentDocumentIndex++, handleNextResultCallBack); + } + return done(); + } + + ++handleResultsInProgress; + + // Kick off the subsequent `next()` before handling the result, but + // make sure we know that we still have a result to handle re: #8422 + immediate(() => done()); + + if (batchSize != null) { + documentsBatch.push(doc); + } + + // If the current documents size is less than the provided patch size don't process the documents yet + if (batchSize != null && documentsBatch.length !== batchSize) { + setTimeout(() => enqueue(fetch), 0); + return; + } + + const docsToProcess = batchSize != null ? documentsBatch : doc; + + function handleNextResultCallBack(err) { + if (batchSize != null) { + handleResultsInProgress -= documentsBatch.length; + documentsBatch = []; + } else { + --handleResultsInProgress; + } + if (err != null) { + error = err; + return finalCallback(err); + } + if (drained && handleResultsInProgress <= 0) { + return finalCallback(null); + } + + setTimeout(() => enqueue(fetch), 0); + } + + handleNextResult(docsToProcess, currentDocumentIndex++, handleNextResultCallBack); + }); + } + } + + function handleNextResult(doc, i, callback) { + const promise = fn(doc, i); + if (promise && typeof promise.then === 'function') { + promise.then( + function() { callback(null); }, + function(error) { callback(error || new Error('`eachAsync()` promise rejected without error')); }); + } else { + callback(null); + } + } +}; + +// `next()` can only execute one at a time, so make sure we always execute +// `next()` in series, while still allowing multiple `fn()` instances to run +// in parallel. 
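// [Editor's note] Hedged usage sketch of the public eachAsync() helper defined
// above: mongoose exposes it on query and aggregation cursors, and the
// `parallel`/`batchSize` options map onto the options handled in this file.
// `Model` and the handler body are placeholders for illustration only.
async function logAllIds(Model) {
  // Visits every matching document without loading the whole result set into
  // memory; `parallel: 2` lets two handler calls overlap. With `batchSize` set,
  // the handler would receive arrays of documents instead of single docs.
  await Model.find().cursor().eachAsync((doc) => console.log(doc._id), { parallel: 2 });
}
// The asyncQueue() helper below is what keeps the underlying next() calls
// strictly serial while still allowing the parallel handler invocations.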
+function asyncQueue() { + const _queue = []; + let inProgress = null; + let id = 0; + + return function enqueue(fn) { + if (_queue.length === 0 && inProgress == null) { + inProgress = id++; + return fn(_step); + } + _queue.push(fn); + }; + + function _step() { + inProgress = null; + if (_queue.length > 0) { + inProgress = id++; + const fn = _queue.shift(); + fn(_step); + } + } +} diff --git a/node_modules/mongoose/lib/helpers/discriminator/areDiscriminatorValuesEqual.js b/node_modules/mongoose/lib/helpers/discriminator/areDiscriminatorValuesEqual.js new file mode 100644 index 000000000..87b2408e6 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/discriminator/areDiscriminatorValuesEqual.js @@ -0,0 +1,16 @@ +'use strict'; + +const ObjectId = require('../../types/objectid'); + +module.exports = function areDiscriminatorValuesEqual(a, b) { + if (typeof a === 'string' && typeof b === 'string') { + return a === b; + } + if (typeof a === 'number' && typeof b === 'number') { + return a === b; + } + if (a instanceof ObjectId && b instanceof ObjectId) { + return a.toString() === b.toString(); + } + return false; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/discriminator/checkEmbeddedDiscriminatorKeyProjection.js b/node_modules/mongoose/lib/helpers/discriminator/checkEmbeddedDiscriminatorKeyProjection.js new file mode 100644 index 000000000..755de88f1 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/discriminator/checkEmbeddedDiscriminatorKeyProjection.js @@ -0,0 +1,12 @@ +'use strict'; + +module.exports = function checkEmbeddedDiscriminatorKeyProjection(userProjection, path, schema, selected, addedPaths) { + const userProjectedInPath = Object.keys(userProjection). + reduce((cur, key) => cur || key.startsWith(path + '.'), false); + const _discriminatorKey = path + '.' + schema.options.discriminatorKey; + if (!userProjectedInPath && + addedPaths.length === 1 && + addedPaths[0] === _discriminatorKey) { + selected.splice(selected.indexOf(_discriminatorKey), 1); + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/discriminator/getConstructor.js b/node_modules/mongoose/lib/helpers/discriminator/getConstructor.js new file mode 100644 index 000000000..728da3b25 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/discriminator/getConstructor.js @@ -0,0 +1,25 @@ +'use strict'; + +const getDiscriminatorByValue = require('./getDiscriminatorByValue'); + +/*! 
+ * Find the correct constructor, taking into account discriminators + */ + +module.exports = function getConstructor(Constructor, value) { + const discriminatorKey = Constructor.schema.options.discriminatorKey; + if (value != null && + Constructor.discriminators && + value[discriminatorKey] != null) { + if (Constructor.discriminators[value[discriminatorKey]]) { + Constructor = Constructor.discriminators[value[discriminatorKey]]; + } else { + const constructorByValue = getDiscriminatorByValue(Constructor.discriminators, value[discriminatorKey]); + if (constructorByValue) { + Constructor = constructorByValue; + } + } + } + + return Constructor; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/discriminator/getDiscriminatorByValue.js b/node_modules/mongoose/lib/helpers/discriminator/getDiscriminatorByValue.js new file mode 100644 index 000000000..099aaefdd --- /dev/null +++ b/node_modules/mongoose/lib/helpers/discriminator/getDiscriminatorByValue.js @@ -0,0 +1,27 @@ +'use strict'; + +const areDiscriminatorValuesEqual = require('./areDiscriminatorValuesEqual'); + +/*! +* returns discriminator by discriminatorMapping.value +* +* @param {Model} model +* @param {string} value +*/ + +module.exports = function getDiscriminatorByValue(discriminators, value) { + if (discriminators == null) { + return null; + } + for (const name of Object.keys(discriminators)) { + const it = discriminators[name]; + if ( + it.schema && + it.schema.discriminatorMapping && + areDiscriminatorValuesEqual(it.schema.discriminatorMapping.value, value) + ) { + return it; + } + } + return null; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/discriminator/getSchemaDiscriminatorByValue.js b/node_modules/mongoose/lib/helpers/discriminator/getSchemaDiscriminatorByValue.js new file mode 100644 index 000000000..b29fb6521 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/discriminator/getSchemaDiscriminatorByValue.js @@ -0,0 +1,26 @@ +'use strict'; + +const areDiscriminatorValuesEqual = require('./areDiscriminatorValuesEqual'); + +/*! +* returns discriminator by discriminatorMapping.value +* +* @param {Schema} schema +* @param {string} value +*/ + +module.exports = function getSchemaDiscriminatorByValue(schema, value) { + if (schema == null || schema.discriminators == null) { + return null; + } + for (const key of Object.keys(schema.discriminators)) { + const discriminatorSchema = schema.discriminators[key]; + if (discriminatorSchema.discriminatorMapping == null) { + continue; + } + if (areDiscriminatorValuesEqual(discriminatorSchema.discriminatorMapping.value, value)) { + return discriminatorSchema; + } + } + return null; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/document/cleanModifiedSubpaths.js b/node_modules/mongoose/lib/helpers/document/cleanModifiedSubpaths.js new file mode 100644 index 000000000..98de47536 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/document/cleanModifiedSubpaths.js @@ -0,0 +1,28 @@ +'use strict'; + +/*! 
+ * ignore + */ + +module.exports = function cleanModifiedSubpaths(doc, path, options) { + options = options || {}; + const skipDocArrays = options.skipDocArrays; + + let deleted = 0; + if (!doc) { + return deleted; + } + for (const modifiedPath of Object.keys(doc.$__.activePaths.states.modify)) { + if (skipDocArrays) { + const schemaType = doc.$__schema.path(modifiedPath); + if (schemaType && schemaType.$isMongooseDocumentArray) { + continue; + } + } + if (modifiedPath.startsWith(path + '.')) { + delete doc.$__.activePaths.states.modify[modifiedPath]; + ++deleted; + } + } + return deleted; +}; diff --git a/node_modules/mongoose/lib/helpers/document/compile.js b/node_modules/mongoose/lib/helpers/document/compile.js new file mode 100644 index 000000000..9981e8f9f --- /dev/null +++ b/node_modules/mongoose/lib/helpers/document/compile.js @@ -0,0 +1,207 @@ +'use strict'; + +const documentSchemaSymbol = require('../../helpers/symbols').documentSchemaSymbol; +const get = require('../../helpers/get'); +const internalToObjectOptions = require('../../options').internalToObjectOptions; +const utils = require('../../utils'); + +let Document; +const getSymbol = require('../../helpers/symbols').getSymbol; +const scopeSymbol = require('../../helpers/symbols').scopeSymbol; + +/*! + * exports + */ + +exports.compile = compile; +exports.defineKey = defineKey; + +/*! + * Compiles schemas. + */ + +function compile(tree, proto, prefix, options) { + Document = Document || require('../../document'); + + for (const key of Object.keys(tree)) { + const limb = tree[key]; + + const hasSubprops = utils.isPOJO(limb) && Object.keys(limb).length && + (!limb[options.typeKey] || (options.typeKey === 'type' && limb.type.type)); + const subprops = hasSubprops ? limb : null; + + defineKey({ prop: key, subprops: subprops, prototype: proto, prefix: prefix, options: options }); + } +} + +/*! + * Defines the accessor named prop on the incoming prototype. + */ + +function defineKey({ prop, subprops, prototype, prefix, options }) { + Document = Document || require('../../document'); + const path = (prefix ? prefix + '.' 
: '') + prop; + prefix = prefix || ''; + + if (subprops) { + Object.defineProperty(prototype, prop, { + enumerable: true, + configurable: true, + get: function() { + const _this = this; + if (!this.$__.getters) { + this.$__.getters = {}; + } + + if (!this.$__.getters[path]) { + const nested = Object.create(Document.prototype, getOwnPropertyDescriptors(this)); + + // save scope for nested getters/setters + if (!prefix) { + nested.$__[scopeSymbol] = this; + } + nested.$__.nestedPath = path; + + Object.defineProperty(nested, 'schema', { + enumerable: false, + configurable: true, + writable: false, + value: prototype.schema + }); + + Object.defineProperty(nested, '$__schema', { + enumerable: false, + configurable: true, + writable: false, + value: prototype.schema + }); + + Object.defineProperty(nested, documentSchemaSymbol, { + enumerable: false, + configurable: true, + writable: false, + value: prototype.schema + }); + + Object.defineProperty(nested, 'toObject', { + enumerable: false, + configurable: true, + writable: false, + value: function() { + return utils.clone(_this.get(path, null, { + virtuals: get(this, 'schema.options.toObject.virtuals', null) + })); + } + }); + + Object.defineProperty(nested, '$__get', { + enumerable: false, + configurable: true, + writable: false, + value: function() { + return _this.get(path, null, { + virtuals: get(this, 'schema.options.toObject.virtuals', null) + }); + } + }); + + Object.defineProperty(nested, 'toJSON', { + enumerable: false, + configurable: true, + writable: false, + value: function() { + return _this.get(path, null, { + virtuals: get(_this, 'schema.options.toJSON.virtuals', null) + }); + } + }); + + Object.defineProperty(nested, '$__isNested', { + enumerable: false, + configurable: true, + writable: false, + value: true + }); + + const _isEmptyOptions = Object.freeze({ + minimize: true, + virtuals: false, + getters: false, + transform: false + }); + Object.defineProperty(nested, '$isEmpty', { + enumerable: false, + configurable: true, + writable: false, + value: function() { + return Object.keys(this.get(path, null, _isEmptyOptions) || {}).length === 0; + } + }); + + Object.defineProperty(nested, '$__parent', { + enumerable: false, + configurable: true, + writable: false, + value: this + }); + + compile(subprops, nested, path, options); + this.$__.getters[path] = nested; + } + + return this.$__.getters[path]; + }, + set: function(v) { + if (v != null && v.$__isNested) { + // Convert top-level to POJO, but leave subdocs hydrated so `$set` + // can handle them. See gh-9293. 
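
The compile()/defineKey() machinery in this file is what turns nested schema paths into getter/setter pairs on documents. A hedged public-API sketch of the effect (schema and model names are hypothetical; no database connection is needed to run it):

const mongoose = require('mongoose');

const taskSchema = new mongoose.Schema({ meta: { color: String, order: Number } });
const TaskNested = mongoose.model('TaskNestedExample', taskSchema);

const t = new TaskNested({ meta: { color: 'blue', order: 1 } });
console.log(t.meta.color);      // 'blue'  – getter created by defineKey()
t.meta = { color: 'red' };      // setter routes through doc.$set('meta', ...)
console.log(t.meta.toObject()); // { color: 'red' } – nested objects get their own toObject()
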
+ v = v.$__get(); + } else if (v instanceof Document && !v.$__isNested) { + v = v.$toObject(internalToObjectOptions); + } + const doc = this.$__[scopeSymbol] || this; + doc.$set(path, v); + } + }); + } else { + Object.defineProperty(prototype, prop, { + enumerable: true, + configurable: true, + get: function() { + return this[getSymbol].call(this.$__[scopeSymbol] || this, path); + }, + set: function(v) { + this.$set.call(this.$__[scopeSymbol] || this, path, v); + } + }); + } +} + +// gets descriptors for all properties of `object` +// makes all properties non-enumerable to match previous behavior to #2211 +function getOwnPropertyDescriptors(object) { + const result = {}; + + Object.getOwnPropertyNames(object).forEach(function(key) { + const skip = [ + 'isNew', + '$__', + '$errors', + 'errors', + '_doc', + '$locals', + '$op', + '__parentArray', + '__index', + '$isDocumentArrayElement' + ].indexOf(key) === -1; + if (skip) { + return; + } + + result[key] = Object.getOwnPropertyDescriptor(object, key); + result[key].enumerable = false; + }); + + return result; +} diff --git a/node_modules/mongoose/lib/helpers/document/getEmbeddedDiscriminatorPath.js b/node_modules/mongoose/lib/helpers/document/getEmbeddedDiscriminatorPath.js new file mode 100644 index 000000000..3f4d22a21 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/document/getEmbeddedDiscriminatorPath.js @@ -0,0 +1,46 @@ +'use strict'; + +const get = require('../get'); +const getSchemaDiscriminatorByValue = require('../discriminator/getSchemaDiscriminatorByValue'); + +/*! + * Like `schema.path()`, except with a document, because impossible to + * determine path type without knowing the embedded discriminator key. + */ + +module.exports = function getEmbeddedDiscriminatorPath(doc, path, options) { + options = options || {}; + const typeOnly = options.typeOnly; + const parts = path.indexOf('.') === -1 ? [path] : path.split('.'); + let schemaType = null; + let type = 'adhocOrUndefined'; + + const schema = getSchemaDiscriminatorByValue(doc.schema, doc.get(doc.schema.options.discriminatorKey)) || doc.schema; + + for (let i = 0; i < parts.length; ++i) { + const subpath = parts.slice(0, i + 1).join('.'); + schemaType = schema.path(subpath); + if (schemaType == null) { + type = 'adhocOrUndefined'; + continue; + } + if (schemaType.instance === 'Mixed') { + return typeOnly ? 'real' : schemaType; + } + type = schema.pathType(subpath); + if ((schemaType.$isSingleNested || schemaType.$isMongooseDocumentArrayElement) && + schemaType.schema.discriminators != null) { + const discriminators = schemaType.schema.discriminators; + const discriminatorKey = doc.get(subpath + '.' + + get(schemaType, 'schema.options.discriminatorKey')); + if (discriminatorKey == null || discriminators[discriminatorKey] == null) { + continue; + } + const rest = parts.slice(i + 1).join('.'); + return getEmbeddedDiscriminatorPath(doc.get(subpath), rest, options); + } + } + + // Are we getting the whole schema or just the type, 'real', 'nested', etc. + return typeOnly ? type : schemaType; +}; diff --git a/node_modules/mongoose/lib/helpers/document/handleSpreadDoc.js b/node_modules/mongoose/lib/helpers/document/handleSpreadDoc.js new file mode 100644 index 000000000..d3075e413 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/document/handleSpreadDoc.js @@ -0,0 +1,17 @@ +'use strict'; + +const utils = require('../../utils'); + +/** + * Using spread operator on a Mongoose document gives you a + * POJO that has a tendency to cause infinite recursion. 
So + * we use this function on `set()` to prevent that. + */ + +module.exports = function handleSpreadDoc(v) { + if (utils.isPOJO(v) && v.$__ != null && v._doc != null) { + return v._doc; + } + + return v; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/each.js b/node_modules/mongoose/lib/helpers/each.js new file mode 100644 index 000000000..fe7006931 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/each.js @@ -0,0 +1,25 @@ +'use strict'; + +module.exports = function each(arr, cb, done) { + if (arr.length === 0) { + return done(); + } + + let remaining = arr.length; + let err = null; + for (const v of arr) { + cb(v, function(_err) { + if (err != null) { + return; + } + if (_err != null) { + err = _err; + return done(err); + } + + if (--remaining <= 0) { + return done(); + } + }); + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/get.js b/node_modules/mongoose/lib/helpers/get.js new file mode 100644 index 000000000..dbab30601 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/get.js @@ -0,0 +1,64 @@ +'use strict'; + +/*! + * Simplified lodash.get to work around the annoying null quirk. See: + * https://github.com/lodash/lodash/issues/3659 + */ + +module.exports = function get(obj, path, def) { + let parts; + let isPathArray = false; + if (typeof path === 'string') { + if (path.indexOf('.') === -1) { + const _v = getProperty(obj, path); + if (_v == null) { + return def; + } + return _v; + } + + parts = path.split('.'); + } else { + isPathArray = true; + parts = path; + + if (parts.length === 1) { + const _v = getProperty(obj, parts[0]); + if (_v == null) { + return def; + } + return _v; + } + } + let rest = path; + let cur = obj; + for (const part of parts) { + if (cur == null) { + return def; + } + + // `lib/cast.js` depends on being able to get dotted paths in updates, + // like `{ $set: { 'a.b': 42 } }` + if (!isPathArray && cur[rest] != null) { + return cur[rest]; + } + + cur = getProperty(cur, part); + + if (!isPathArray) { + rest = rest.substr(part.length + 1); + } + } + + return cur == null ? def : cur; +}; + +function getProperty(obj, prop) { + if (obj == null) { + return obj; + } + if (obj instanceof Map) { + return obj.get(prop); + } + return obj[prop]; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/getConstructorName.js b/node_modules/mongoose/lib/helpers/getConstructorName.js new file mode 100644 index 000000000..a4e192493 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/getConstructorName.js @@ -0,0 +1,15 @@ +'use strict'; + +/*! + * If `val` is an object, returns constructor name, if possible. Otherwise returns undefined. 
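
get.js above is a small lodash.get-style accessor where a null or undefined value anywhere along the path falls back to the default. Illustrative usage (relative require path assumes a script at the project root):

const get = require('./node_modules/mongoose/lib/helpers/get');

const task = { owner: { name: 'Orest' }, subtasks: null };
console.log(get(task, 'owner.name', 'unknown'));      // 'Orest'
console.log(get(task, 'subtasks.length', 0));         // 0 – null along the path falls back to the default
console.log(get(task, ['owner', 'name'], 'unknown')); // array-of-parts paths are supported too
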
+ */ + +module.exports = function getConstructorName(val) { + if (val == null) { + return void 0; + } + if (typeof val.constructor !== 'function') { + return void 0; + } + return val.constructor.name; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/getDefaultBulkwriteResult.js b/node_modules/mongoose/lib/helpers/getDefaultBulkwriteResult.js new file mode 100644 index 000000000..7d10f1748 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/getDefaultBulkwriteResult.js @@ -0,0 +1,27 @@ +'use strict'; +function getDefaultBulkwriteResult() { + return { + result: { + ok: 1, + writeErrors: [], + writeConcernErrors: [], + insertedIds: [], + nInserted: 0, + nUpserted: 0, + nMatched: 0, + nModified: 0, + nRemoved: 0, + upserted: [] + }, + insertedCount: 0, + matchedCount: 0, + modifiedCount: 0, + deletedCount: 0, + upsertedCount: 0, + upsertedIds: {}, + insertedIds: {}, + n: 0 + }; +} + +module.exports = getDefaultBulkwriteResult; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/getFunctionName.js b/node_modules/mongoose/lib/helpers/getFunctionName.js new file mode 100644 index 000000000..87a2c690a --- /dev/null +++ b/node_modules/mongoose/lib/helpers/getFunctionName.js @@ -0,0 +1,8 @@ +'use strict'; + +module.exports = function(fn) { + if (fn.name) { + return fn.name; + } + return (fn.toString().trim().match(/^function\s*([^\s(]+)/) || [])[1]; +}; diff --git a/node_modules/mongoose/lib/helpers/immediate.js b/node_modules/mongoose/lib/helpers/immediate.js new file mode 100644 index 000000000..0a1c26c32 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/immediate.js @@ -0,0 +1,14 @@ +/*! + * Centralize this so we can more easily work around issues with people + * stubbing out `process.nextTick()` in tests using sinon: + * https://github.com/sinonjs/lolex#automatically-incrementing-mocked-time + * See gh-6074 + */ + +'use strict'; + +const nextTick = process.nextTick.bind(process); + +module.exports = function immediate(cb) { + return nextTick(cb); +}; diff --git a/node_modules/mongoose/lib/helpers/indexes/isDefaultIdIndex.js b/node_modules/mongoose/lib/helpers/indexes/isDefaultIdIndex.js new file mode 100644 index 000000000..c975dcfc3 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/indexes/isDefaultIdIndex.js @@ -0,0 +1,18 @@ +'use strict'; + +const get = require('../get'); + +module.exports = function isDefaultIdIndex(index) { + if (Array.isArray(index)) { + // Mongoose syntax + const keys = Object.keys(index[0]); + return keys.length === 1 && keys[0] === '_id' && index[0]._id !== 'hashed'; + } + + if (typeof index !== 'object') { + return false; + } + + const key = get(index, 'key', {}); + return Object.keys(key).length === 1 && key.hasOwnProperty('_id'); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/indexes/isIndexEqual.js b/node_modules/mongoose/lib/helpers/indexes/isIndexEqual.js new file mode 100644 index 000000000..12e5c3d08 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/indexes/isIndexEqual.js @@ -0,0 +1,96 @@ +'use strict'; + +const get = require('../get'); +const utils = require('../../utils'); +/** + * Given a Mongoose index definition (key + options objects) and a MongoDB server + * index definition, determine if the two indexes are equal. 
+ * + * @param {Object} key the Mongoose index spec + * @param {Object} options the Mongoose index definition's options + * @param {Object} dbIndex the index in MongoDB as returned by `listIndexes()` + * @api private + */ + +module.exports = function isIndexEqual(key, options, dbIndex) { + // Special case: text indexes have a special format in the db. For example, + // `{ name: 'text' }` becomes: + // { + // v: 2, + // key: { _fts: 'text', _ftsx: 1 }, + // name: 'name_text', + // ns: 'test.tests', + // background: true, + // weights: { name: 1 }, + // default_language: 'english', + // language_override: 'language', + // textIndexVersion: 3 + // } + if (dbIndex.textIndexVersion != null) { + delete dbIndex.key._fts; + delete dbIndex.key._ftsx; + const weights = { ...dbIndex.weights, ...dbIndex.key }; + if (Object.keys(weights).length !== Object.keys(key).length) { + return false; + } + for (const prop of Object.keys(weights)) { + if (!(prop in key)) { + return false; + } + const weight = weights[prop]; + if (weight !== get(options, 'weights.' + prop) && !(weight === 1 && get(options, 'weights.' + prop) == null)) { + return false; + } + } + + if (options['default_language'] !== dbIndex['default_language']) { + return dbIndex['default_language'] === 'english' && options['default_language'] == null; + } + + return true; + } + + const optionKeys = [ + 'unique', + 'partialFilterExpression', + 'sparse', + 'expireAfterSeconds', + 'collation' + ]; + for (const key of optionKeys) { + if (!(key in options) && !(key in dbIndex)) { + continue; + } + if (key === 'collation') { + if (options[key] == null || dbIndex[key] == null) { + return options[key] == null && dbIndex[key] == null; + } + const definedKeys = Object.keys(options.collation); + const schemaCollation = options.collation; + const dbCollation = dbIndex.collation; + for (const opt of definedKeys) { + if (get(schemaCollation, opt) !== get(dbCollation, opt)) { + return false; + } + } + } else if (!utils.deepEqual(options[key], dbIndex[key])) { + return false; + } + } + + const schemaIndexKeys = Object.keys(key); + const dbIndexKeys = Object.keys(dbIndex.key); + if (schemaIndexKeys.length !== dbIndexKeys.length) { + return false; + } + for (let i = 0; i < schemaIndexKeys.length; ++i) { + if (schemaIndexKeys[i] !== dbIndexKeys[i]) { + return false; + } + if (!utils.deepEqual(key[schemaIndexKeys[i]], dbIndex.key[dbIndexKeys[i]])) { + return false; + } + } + + return true; +}; diff --git a/node_modules/mongoose/lib/helpers/isAsyncFunction.js b/node_modules/mongoose/lib/helpers/isAsyncFunction.js new file mode 100644 index 000000000..ecd2872af --- /dev/null +++ b/node_modules/mongoose/lib/helpers/isAsyncFunction.js @@ -0,0 +1,11 @@ +'use strict'; + +const { inspect } = require('util'); + +module.exports = function isAsyncFunction(v) { + if (typeof v !== 'function') { + return; + } + + return inspect(v).startsWith('[AsyncFunction:'); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/isBsonType.js b/node_modules/mongoose/lib/helpers/isBsonType.js new file mode 100644 index 000000000..01435d3f2 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/isBsonType.js @@ -0,0 +1,13 @@ +'use strict'; + +const get = require('./get'); + +/*! 
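
isIndexEqual() lets mongoose compare a schema-level index definition against what the server reports from listIndexes(), e.g. during index syncing. A hand-written dbIndex object, for illustration only:

const isIndexEqual = require('./node_modules/mongoose/lib/helpers/indexes/isIndexEqual');

const key = { user: 1, createdAt: -1 };   // schema-level index spec
const options = { unique: true };         // schema-level index options
const dbIndex = {                         // shape returned by listIndexes()
  v: 2,
  key: { user: 1, createdAt: -1 },
  name: 'user_1_createdAt_-1',
  unique: true
};

console.log(isIndexEqual(key, options, dbIndex)); // true
console.log(isIndexEqual(key, {}, dbIndex));      // false – the db index is unique, the schema spec is not
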
+ * Get the bson type, if it exists + */ + +function isBsonType(obj, typename) { + return get(obj, '_bsontype', void 0) === typename; +} + +module.exports = isBsonType; diff --git a/node_modules/mongoose/lib/helpers/isMongooseObject.js b/node_modules/mongoose/lib/helpers/isMongooseObject.js new file mode 100644 index 000000000..016f9e65a --- /dev/null +++ b/node_modules/mongoose/lib/helpers/isMongooseObject.js @@ -0,0 +1,21 @@ +'use strict'; + +/*! + * Returns if `v` is a mongoose object that has a `toObject()` method we can use. + * + * This is for compatibility with libs like Date.js which do foolish things to Natives. + * + * @param {any} v + * @api private + */ + +module.exports = function(v) { + if (v == null) { + return false; + } + + return v.$__ != null || // Document + v.isMongooseArray || // Array or Document Array + v.isMongooseBuffer || // Buffer + v.$isMongooseMap; // Map +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/isObject.js b/node_modules/mongoose/lib/helpers/isObject.js new file mode 100644 index 000000000..f8ac31326 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/isObject.js @@ -0,0 +1,16 @@ +'use strict'; + +/*! + * Determines if `arg` is an object. + * + * @param {Object|Array|String|Function|RegExp|any} arg + * @api private + * @return {Boolean} + */ + +module.exports = function(arg) { + if (Buffer.isBuffer(arg)) { + return true; + } + return Object.prototype.toString.call(arg) === '[object Object]'; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/isPromise.js b/node_modules/mongoose/lib/helpers/isPromise.js new file mode 100644 index 000000000..d6db2608c --- /dev/null +++ b/node_modules/mongoose/lib/helpers/isPromise.js @@ -0,0 +1,6 @@ +'use strict'; +function isPromise(val) { + return !!val && (typeof val === 'object' || typeof val === 'function') && typeof val.then === 'function'; +} + +module.exports = isPromise; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/model/applyHooks.js b/node_modules/mongoose/lib/helpers/model/applyHooks.js new file mode 100644 index 000000000..0498c75e6 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/model/applyHooks.js @@ -0,0 +1,138 @@ +'use strict'; + +const symbols = require('../../schema/symbols'); +const promiseOrCallback = require('../promiseOrCallback'); + +/*! + * ignore + */ + +module.exports = applyHooks; + +/*! + * ignore + */ + +applyHooks.middlewareFunctions = [ + 'deleteOne', + 'save', + 'validate', + 'remove', + 'updateOne', + 'init' +]; + +/*! + * Register hooks for this model + * + * @param {Model} model + * @param {Schema} schema + */ + +function applyHooks(model, schema, options) { + options = options || {}; + + const kareemOptions = { + useErrorHandlers: true, + numCallbackParams: 1, + nullResultByDefault: true, + contextParameter: true + }; + const objToDecorate = options.decorateDoc ? 
model : model.prototype; + + model.$appliedHooks = true; + for (const key of Object.keys(schema.paths)) { + const type = schema.paths[key]; + let childModel = null; + if (type.$isSingleNested) { + childModel = type.caster; + } else if (type.$isMongooseDocumentArray) { + childModel = type.Constructor; + } else { + continue; + } + + if (childModel.$appliedHooks) { + continue; + } + + applyHooks(childModel, type.schema, options); + if (childModel.discriminators != null) { + const keys = Object.keys(childModel.discriminators); + for (const key of keys) { + applyHooks(childModel.discriminators[key], + childModel.discriminators[key].schema, options); + } + } + } + + // Built-in hooks rely on hooking internal functions in order to support + // promises and make it so that `doc.save.toString()` provides meaningful + // information. + + const middleware = schema.s.hooks. + filter(hook => { + if (hook.name === 'updateOne' || hook.name === 'deleteOne') { + return !!hook['document']; + } + if (hook.name === 'remove' || hook.name === 'init') { + return hook['document'] == null || !!hook['document']; + } + if (hook.query != null || hook.document != null) { + return hook.document !== false; + } + return true; + }). + filter(hook => { + // If user has overwritten the method, don't apply built-in middleware + if (schema.methods[hook.name]) { + return !hook.fn[symbols.builtInMiddleware]; + } + + return true; + }); + + model._middleware = middleware; + + objToDecorate.$__originalValidate = objToDecorate.$__originalValidate || objToDecorate.$__validate; + + for (const method of ['save', 'validate', 'remove', 'deleteOne']) { + const toWrap = method === 'validate' ? '$__originalValidate' : `$__${method}`; + const wrapped = middleware. + createWrapper(method, objToDecorate[toWrap], null, kareemOptions); + objToDecorate[`$__${method}`] = wrapped; + } + objToDecorate.$__init = middleware. + createWrapperSync('init', objToDecorate.$__init, null, kareemOptions); + + // Support hooks for custom methods + const customMethods = Object.keys(schema.methods); + const customMethodOptions = Object.assign({}, kareemOptions, { + // Only use `checkForPromise` for custom methods, because mongoose + // query thunks are not as consistent as I would like about returning + // a nullish value rather than the query. If a query thunk returns + // a query, `checkForPromise` causes infinite recursion + checkForPromise: true + }); + for (const method of customMethods) { + if (!middleware.hasHooks(method)) { + // Don't wrap if there are no hooks for the custom method to avoid + // surprises. Also, `createWrapper()` enforces consistent async, + // so wrapping a sync method would break it. + continue; + } + const originalMethod = objToDecorate[method]; + objToDecorate[method] = function() { + const args = Array.prototype.slice.call(arguments); + const cb = args.slice(-1).pop(); + const argsWithoutCallback = typeof cb === 'function' ? + args.slice(0, args.length - 1) : args; + return promiseOrCallback(cb, callback => { + return this[`$__${method}`].apply(this, + argsWithoutCallback.concat([callback])); + }, model.events); + }; + objToDecorate[`$__${method}`] = middleware. 
+ createWrapper(method, originalMethod, null, customMethodOptions); + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/model/applyMethods.js b/node_modules/mongoose/lib/helpers/model/applyMethods.js new file mode 100644 index 000000000..bd3718da3 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/model/applyMethods.js @@ -0,0 +1,57 @@ +'use strict'; + +const get = require('../get'); +const utils = require('../../utils'); + +/*! + * Register methods for this model + * + * @param {Model} model + * @param {Schema} schema + */ + +module.exports = function applyMethods(model, schema) { + function apply(method, schema) { + Object.defineProperty(model.prototype, method, { + get: function() { + const h = {}; + for (const k in schema.methods[method]) { + h[k] = schema.methods[method][k].bind(this); + } + return h; + }, + configurable: true + }); + } + for (const method of Object.keys(schema.methods)) { + const fn = schema.methods[method]; + if (schema.tree.hasOwnProperty(method)) { + throw new Error('You have a method and a property in your schema both ' + + 'named "' + method + '"'); + } + if (schema.reserved[method] && + !get(schema, `methodOptions.${method}.suppressWarning`, false)) { + utils.warn(`mongoose: the method name "${method}" is used by mongoose ` + + 'internally, overwriting it may cause bugs. If you\'re sure you know ' + + 'what you\'re doing, you can suppress this error by using ' + + `\`schema.method('${method}', fn, { suppressWarning: true })\`.`); + } + if (typeof fn === 'function') { + model.prototype[method] = fn; + } else { + apply(method, schema); + } + } + + // Recursively call `applyMethods()` on child schemas + model.$appliedMethods = true; + for (const key of Object.keys(schema.paths)) { + const type = schema.paths[key]; + if (type.$isSingleNested && !type.caster.$appliedMethods) { + applyMethods(type.caster, type.schema); + } + if (type.$isMongooseDocumentArray && !type.Constructor.$appliedMethods) { + applyMethods(type.Constructor, type.schema); + } + } +}; diff --git a/node_modules/mongoose/lib/helpers/model/applyStaticHooks.js b/node_modules/mongoose/lib/helpers/model/applyStaticHooks.js new file mode 100644 index 000000000..219e28903 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/model/applyStaticHooks.js @@ -0,0 +1,71 @@ +'use strict'; + +const middlewareFunctions = require('../query/applyQueryMiddleware').middlewareFunctions; +const promiseOrCallback = require('../promiseOrCallback'); + +module.exports = function applyStaticHooks(model, hooks, statics) { + const kareemOptions = { + useErrorHandlers: true, + numCallbackParams: 1 + }; + + hooks = hooks.filter(hook => { + // If the custom static overwrites an existing query middleware, don't apply + // middleware to it by default. This avoids a potential backwards breaking + // change with plugins like `mongoose-delete` that use statics to overwrite + // built-in Mongoose functions. + if (middlewareFunctions.indexOf(hook.name) !== -1) { + return !!hook.model; + } + return hook.model !== false; + }); + + model.$__insertMany = hooks.createWrapper('insertMany', + model.$__insertMany, model, kareemOptions); + + for (const key of Object.keys(statics)) { + if (hooks.hasHooks(key)) { + const original = model[key]; + + model[key] = function() { + const numArgs = arguments.length; + const lastArg = numArgs > 0 ? arguments[numArgs - 1] : null; + const cb = typeof lastArg === 'function' ? lastArg : null; + const args = Array.prototype.slice. + call(arguments, 0, cb == null ? 
numArgs : numArgs - 1); + // Special case: can't use `Kareem#wrap()` because it doesn't currently + // support wrapped functions that return a promise. + return promiseOrCallback(cb, callback => { + hooks.execPre(key, model, args, function(err) { + if (err != null) { + return callback(err); + } + + let postCalled = 0; + const ret = original.apply(model, args.concat(post)); + if (ret != null && typeof ret.then === 'function') { + ret.then(res => post(null, res), err => post(err)); + } + + function post(error, res) { + if (postCalled++ > 0) { + return; + } + + if (error != null) { + return callback(error); + } + + hooks.execPost(key, model, [res], function(error) { + if (error != null) { + return callback(error); + } + callback(null, res); + }); + } + }); + }, model.events); + }; + } + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/model/applyStatics.js b/node_modules/mongoose/lib/helpers/model/applyStatics.js new file mode 100644 index 000000000..3b9501e04 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/model/applyStatics.js @@ -0,0 +1,12 @@ +'use strict'; + +/*! + * Register statics for this model + * @param {Model} model + * @param {Schema} schema + */ +module.exports = function applyStatics(model, schema) { + for (const i in schema.statics) { + model[i] = schema.statics[i]; + } +}; diff --git a/node_modules/mongoose/lib/helpers/model/castBulkWrite.js b/node_modules/mongoose/lib/helpers/model/castBulkWrite.js new file mode 100644 index 000000000..84bcb79f8 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/model/castBulkWrite.js @@ -0,0 +1,223 @@ +'use strict'; + +const getDiscriminatorByValue = require('../../helpers/discriminator/getDiscriminatorByValue'); +const applyTimestampsToChildren = require('../update/applyTimestampsToChildren'); +const applyTimestampsToUpdate = require('../update/applyTimestampsToUpdate'); +const cast = require('../../cast'); +const castUpdate = require('../query/castUpdate'); +const setDefaultsOnInsert = require('../setDefaultsOnInsert'); + +/*! + * Given a model and a bulkWrite op, return a thunk that handles casting and + * validating the individual op. + */ + +module.exports = function castBulkWrite(originalModel, op, options) { + const now = originalModel.base.now(); + + if (op['insertOne']) { + return (callback) => { + const model = decideModelByObject(originalModel, op['insertOne']['document']); + + const doc = new model(op['insertOne']['document']); + if (model.schema.options.timestamps) { + doc.initializeTimestamps(); + } + if (options.session != null) { + doc.$session(options.session); + } + op['insertOne']['document'] = doc; + op['insertOne']['document'].$validate({ __noPromise: true }, function(error) { + if (error) { + return callback(error, null); + } + callback(null); + }); + }; + } else if (op['updateOne']) { + return (callback) => { + try { + if (!op['updateOne']['filter']) { + throw new Error('Must provide a filter object.'); + } + if (!op['updateOne']['update']) { + throw new Error('Must provide an update object.'); + } + + const model = decideModelByObject(originalModel, op['updateOne']['filter']); + const schema = model.schema; + const strict = options.strict != null ? 
options.strict : model.schema.options.strict; + + _addDiscriminatorToObject(schema, op['updateOne']['filter']); + + if (model.schema.$timestamps != null && op['updateOne'].timestamps !== false) { + const createdAt = model.schema.$timestamps.createdAt; + const updatedAt = model.schema.$timestamps.updatedAt; + applyTimestampsToUpdate(now, createdAt, updatedAt, op['updateOne']['update'], {}); + } + + applyTimestampsToChildren(now, op['updateOne']['update'], model.schema); + + if (op['updateOne'].setDefaultsOnInsert !== false) { + setDefaultsOnInsert(op['updateOne']['filter'], model.schema, op['updateOne']['update'], { + setDefaultsOnInsert: true, + upsert: op['updateOne'].upsert + }); + } + + op['updateOne']['filter'] = cast(model.schema, op['updateOne']['filter'], { + strict: strict, + upsert: op['updateOne'].upsert + }); + + op['updateOne']['update'] = castUpdate(model.schema, op['updateOne']['update'], { + strict: strict, + overwrite: false, + upsert: op['updateOne'].upsert + }, model, op['updateOne']['filter']); + } catch (error) { + return callback(error, null); + } + + callback(null); + }; + } else if (op['updateMany']) { + return (callback) => { + try { + if (!op['updateMany']['filter']) { + throw new Error('Must provide a filter object.'); + } + if (!op['updateMany']['update']) { + throw new Error('Must provide an update object.'); + } + + const model = decideModelByObject(originalModel, op['updateMany']['filter']); + const schema = model.schema; + const strict = options.strict != null ? options.strict : model.schema.options.strict; + + if (op['updateMany'].setDefaultsOnInsert !== false) { + setDefaultsOnInsert(op['updateMany']['filter'], model.schema, op['updateMany']['update'], { + setDefaultsOnInsert: true, + upsert: op['updateMany'].upsert + }); + } + + if (model.schema.$timestamps != null && op['updateMany'].timestamps !== false) { + const createdAt = model.schema.$timestamps.createdAt; + const updatedAt = model.schema.$timestamps.updatedAt; + applyTimestampsToUpdate(now, createdAt, updatedAt, op['updateMany']['update'], {}); + } + + applyTimestampsToChildren(now, op['updateMany']['update'], model.schema); + + _addDiscriminatorToObject(schema, op['updateMany']['filter']); + + op['updateMany']['filter'] = cast(model.schema, op['updateMany']['filter'], { + strict: strict, + upsert: op['updateMany'].upsert + }); + + op['updateMany']['update'] = castUpdate(model.schema, op['updateMany']['update'], { + strict: strict, + overwrite: false, + upsert: op['updateMany'].upsert + }, model, op['updateMany']['filter']); + } catch (error) { + return callback(error, null); + } + + callback(null); + }; + } else if (op['replaceOne']) { + return (callback) => { + const model = decideModelByObject(originalModel, op['replaceOne']['filter']); + const schema = model.schema; + const strict = options.strict != null ? 
options.strict : model.schema.options.strict; + + _addDiscriminatorToObject(schema, op['replaceOne']['filter']); + try { + op['replaceOne']['filter'] = cast(model.schema, op['replaceOne']['filter'], { + strict: strict, + upsert: op['replaceOne'].upsert + }); + } catch (error) { + return callback(error, null); + } + + // set `skipId`, otherwise we get "_id field cannot be changed" + const doc = new model(op['replaceOne']['replacement'], strict, true); + if (model.schema.options.timestamps) { + doc.initializeTimestamps(); + } + if (options.session != null) { + doc.$session(options.session); + } + op['replaceOne']['replacement'] = doc; + + op['replaceOne']['replacement'].$validate({ __noPromise: true }, function(error) { + if (error) { + return callback(error, null); + } + op['replaceOne']['replacement'] = op['replaceOne']['replacement'].toBSON(); + callback(null); + }); + }; + } else if (op['deleteOne']) { + return (callback) => { + const model = decideModelByObject(originalModel, op['deleteOne']['filter']); + const schema = model.schema; + + _addDiscriminatorToObject(schema, op['deleteOne']['filter']); + + try { + op['deleteOne']['filter'] = cast(model.schema, + op['deleteOne']['filter']); + } catch (error) { + return callback(error, null); + } + + callback(null); + }; + } else if (op['deleteMany']) { + return (callback) => { + const model = decideModelByObject(originalModel, op['deleteMany']['filter']); + const schema = model.schema; + + _addDiscriminatorToObject(schema, op['deleteMany']['filter']); + + try { + op['deleteMany']['filter'] = cast(model.schema, + op['deleteMany']['filter']); + } catch (error) { + return callback(error, null); + } + + callback(null); + }; + } else { + return (callback) => { + callback(new Error('Invalid op passed to `bulkWrite()`'), null); + }; + } +}; + +function _addDiscriminatorToObject(schema, obj) { + if (schema == null) { + return; + } + if (schema.discriminatorMapping && !schema.discriminatorMapping.isRoot) { + obj[schema.discriminatorMapping.key] = schema.discriminatorMapping.value; + } +} + +/*! + * gets discriminator model if discriminator key is present in object + */ + +function decideModelByObject(model, object) { + const discriminatorKey = model.schema.options.discriminatorKey; + if (object != null && object.hasOwnProperty(discriminatorKey)) { + model = getDiscriminatorByValue(model.discriminators, object[discriminatorKey]) || model; + } + return model; +} diff --git a/node_modules/mongoose/lib/helpers/model/discriminator.js b/node_modules/mongoose/lib/helpers/model/discriminator.js new file mode 100644 index 000000000..25485e90f --- /dev/null +++ b/node_modules/mongoose/lib/helpers/model/discriminator.js @@ -0,0 +1,209 @@ +'use strict'; + +const Mixed = require('../../schema/mixed'); +const defineKey = require('../document/compile').defineKey; +const get = require('../get'); +const utils = require('../../utils'); + +const CUSTOMIZABLE_DISCRIMINATOR_OPTIONS = { + toJSON: true, + toObject: true, + _id: true, + id: true +}; + +/*! 
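
castBulkWrite() above services Model.bulkWrite(): each op is cast and validated against the right (possibly discriminator) model before being handed to the driver. An application-side sketch of a call it handles (model name hypothetical; assumes mongoose.connect(...) has already been called elsewhere):

const mongoose = require('mongoose');

const TaskBulk = mongoose.model('TaskBulkExample',
  new mongoose.Schema({ title: String, done: Boolean }));

TaskBulk.bulkWrite([
  { insertOne: { document: { title: 'write README', done: false } } },
  { updateOne: { filter: { title: 'write README' }, update: { $set: { done: true } } } },
  { deleteMany: { filter: { done: true } } }
]).then(res => console.log(res.insertedCount, res.modifiedCount, res.deletedCount));
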
+ * ignore + */ + +module.exports = function discriminator(model, name, schema, tiedValue, applyPlugins) { + if (!(schema && schema.instanceOfSchema)) { + throw new Error('You must pass a valid discriminator Schema'); + } + + if (model.schema.discriminatorMapping && + !model.schema.discriminatorMapping.isRoot) { + throw new Error('Discriminator "' + name + + '" can only be a discriminator of the root model'); + } + + if (applyPlugins) { + const applyPluginsToDiscriminators = get(model.base, + 'options.applyPluginsToDiscriminators', false); + // Even if `applyPluginsToDiscriminators` isn't set, we should still apply + // global plugins to schemas embedded in the discriminator schema (gh-7370) + model.base._applyPlugins(schema, { + skipTopLevel: !applyPluginsToDiscriminators + }); + } + + const key = model.schema.options.discriminatorKey; + + const existingPath = model.schema.path(key); + if (existingPath != null) { + if (!utils.hasUserDefinedProperty(existingPath.options, 'select')) { + existingPath.options.select = true; + } + existingPath.options.$skipDiscriminatorCheck = true; + } else { + const baseSchemaAddition = {}; + baseSchemaAddition[key] = { + default: void 0, + select: true, + $skipDiscriminatorCheck: true + }; + baseSchemaAddition[key][model.schema.options.typeKey] = String; + model.schema.add(baseSchemaAddition); + defineKey({ + prop: key, + prototype: model.prototype, + options: model.schema.options + }); + } + + if (schema.path(key) && schema.path(key).options.$skipDiscriminatorCheck !== true) { + throw new Error('Discriminator "' + name + + '" cannot have field with name "' + key + '"'); + } + + let value = name; + if ((typeof tiedValue === 'string' && tiedValue.length) || tiedValue != null) { + value = tiedValue; + } + + function merge(schema, baseSchema) { + // Retain original schema before merging base schema + schema._baseSchema = baseSchema; + if (baseSchema.paths._id && + baseSchema.paths._id.options && + !baseSchema.paths._id.options.auto) { + schema.remove('_id'); + } + + // Find conflicting paths: if something is a path in the base schema + // and a nested path in the child schema, overwrite the base schema path. + // See gh-6076 + const baseSchemaPaths = Object.keys(baseSchema.paths); + const conflictingPaths = []; + + for (const path of baseSchemaPaths) { + if (schema.nested[path]) { + conflictingPaths.push(path); + continue; + } + + if (path.indexOf('.') === -1) { + continue; + } + const sp = path.split('.').slice(0, -1); + let cur = ''; + for (const piece of sp) { + cur += (cur.length ? '.' : '') + piece; + if (schema.paths[cur] instanceof Mixed || + schema.singleNestedPaths[cur] instanceof Mixed) { + conflictingPaths.push(path); + } + } + } + + utils.merge(schema, baseSchema, { + isDiscriminatorSchemaMerge: true, + omit: { discriminators: true, base: true }, + omitNested: conflictingPaths.reduce((cur, path) => { + cur['tree.' 
+ path] = true; + return cur; + }, {}) + }); + + // Clean up conflicting paths _after_ merging re: gh-6076 + for (const conflictingPath of conflictingPaths) { + delete schema.paths[conflictingPath]; + } + + // Rebuild schema models because schemas may have been merged re: #7884 + schema.childSchemas.forEach(obj => { + obj.model.prototype.$__setSchema(obj.schema); + }); + + const obj = {}; + obj[key] = { + default: value, + select: true, + set: function(newName) { + if (newName === value || (Array.isArray(value) && utils.deepEqual(newName, value))) { + return value; + } + throw new Error('Can\'t set discriminator key "' + key + '"'); + }, + $skipDiscriminatorCheck: true + }; + obj[key][schema.options.typeKey] = existingPath ? existingPath.options[schema.options.typeKey] : String; + schema.add(obj); + + + schema.discriminatorMapping = { key: key, value: value, isRoot: false }; + + if (baseSchema.options.collection) { + schema.options.collection = baseSchema.options.collection; + } + + const toJSON = schema.options.toJSON; + const toObject = schema.options.toObject; + const _id = schema.options._id; + const id = schema.options.id; + + const keys = Object.keys(schema.options); + schema.options.discriminatorKey = baseSchema.options.discriminatorKey; + + for (const _key of keys) { + if (!CUSTOMIZABLE_DISCRIMINATOR_OPTIONS[_key]) { + // Special case: compiling a model sets `pluralization = true` by default. Avoid throwing an error + // for that case. See gh-9238 + if (_key === 'pluralization' && schema.options[_key] == true && baseSchema.options[_key] == null) { + continue; + } + + if (!utils.deepEqual(schema.options[_key], baseSchema.options[_key])) { + throw new Error('Can\'t customize discriminator option ' + _key + + ' (can only modify ' + + Object.keys(CUSTOMIZABLE_DISCRIMINATOR_OPTIONS).join(', ') + + ')'); + } + } + } + schema.options = utils.clone(baseSchema.options); + if (toJSON) schema.options.toJSON = toJSON; + if (toObject) schema.options.toObject = toObject; + if (typeof _id !== 'undefined') { + schema.options._id = _id; + } + schema.options.id = id; + schema.s.hooks = model.schema.s.hooks.merge(schema.s.hooks); + + schema.plugins = Array.prototype.slice.call(baseSchema.plugins); + schema.callQueue = baseSchema.callQueue.concat(schema.callQueue); + delete schema._requiredpaths; // reset just in case Schema#requiredPaths() was called on either schema + } + + // merges base schema into new discriminator schema and sets new type field. 
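
The discriminator() helper in this file merges a child schema into its base under the discriminator key. A public-API sketch of the behaviour it implements (names hypothetical; constructing documents needs no connection):

const mongoose = require('mongoose');
const options = { discriminatorKey: 'kind' };

const eventSchema = new mongoose.Schema({ occurredAt: Date }, options);
const EventBase = mongoose.model('EventExample', eventSchema);
const ClickEvent = EventBase.discriminator('ClickEventExample',
  new mongoose.Schema({ url: String }, options));

const click = new ClickEvent({ occurredAt: new Date(), url: '/tasks' });
console.log(click.kind); // 'ClickEventExample' – filled in from the tied discriminator value
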
+ merge(schema, model.schema); + + if (!model.discriminators) { + model.discriminators = {}; + } + + if (!model.schema.discriminatorMapping) { + model.schema.discriminatorMapping = { key: key, value: null, isRoot: true }; + } + if (!model.schema.discriminators) { + model.schema.discriminators = {}; + } + + model.schema.discriminators[name] = schema; + + if (model.discriminators[name]) { + throw new Error('Discriminator with name "' + name + '" already exists'); + } + + return schema; +}; diff --git a/node_modules/mongoose/lib/helpers/once.js b/node_modules/mongoose/lib/helpers/once.js new file mode 100644 index 000000000..02675799c --- /dev/null +++ b/node_modules/mongoose/lib/helpers/once.js @@ -0,0 +1,12 @@ +'use strict'; + +module.exports = function once(fn) { + let called = false; + return function() { + if (called) { + return; + } + called = true; + return fn.apply(null, arguments); + }; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/parallelLimit.js b/node_modules/mongoose/lib/helpers/parallelLimit.js new file mode 100644 index 000000000..9b07c028b --- /dev/null +++ b/node_modules/mongoose/lib/helpers/parallelLimit.js @@ -0,0 +1,55 @@ +'use strict'; + +module.exports = parallelLimit; + +/*! + * ignore + */ + +function parallelLimit(fns, limit, callback) { + let numInProgress = 0; + let numFinished = 0; + let error = null; + + if (limit <= 0) { + throw new Error('Limit must be positive'); + } + + if (fns.length === 0) { + return callback(null, []); + } + + for (let i = 0; i < fns.length && i < limit; ++i) { + _start(); + } + + function _start() { + fns[numFinished + numInProgress](_done(numFinished + numInProgress)); + ++numInProgress; + } + + const results = []; + + function _done(index) { + return (err, res) => { + --numInProgress; + ++numFinished; + + if (error != null) { + return; + } + if (err != null) { + error = err; + return callback(error); + } + + results[index] = res; + + if (numFinished === fns.length) { + return callback(null, results); + } else if (numFinished + numInProgress < fns.length) { + _start(); + } + }; + } +} diff --git a/node_modules/mongoose/lib/helpers/path/flattenObjectWithDottedPaths.js b/node_modules/mongoose/lib/helpers/path/flattenObjectWithDottedPaths.js new file mode 100644 index 000000000..9c2e28998 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/path/flattenObjectWithDottedPaths.js @@ -0,0 +1,25 @@ +'use strict'; + +const setDottedPath = require('../path/setDottedPath'); + +/** + * Given an object that may contain dotted paths, flatten the paths out. 
+ * For example: `flattenObjectWithDottedPaths({ a: { 'b.c': 42 } })` => `{ a: { b: { c: 42 } } }` + */ + +module.exports = function flattenObjectWithDottedPaths(obj) { + if (obj == null || typeof obj !== 'object' || Array.isArray(obj)) { + return; + } + const keys = Object.keys(obj); + for (const key of keys) { + const val = obj[key]; + if (key.indexOf('.') !== -1) { + delete obj[key]; + setDottedPath(obj, key, val); + continue; + } + + flattenObjectWithDottedPaths(obj[key]); + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/path/parentPaths.js b/node_modules/mongoose/lib/helpers/path/parentPaths.js new file mode 100644 index 000000000..77d00ae89 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/path/parentPaths.js @@ -0,0 +1,13 @@ +'use strict'; + +module.exports = function parentPaths(path) { + const pieces = path.split('.'); + let cur = ''; + const ret = []; + for (let i = 0; i < pieces.length; ++i) { + cur += (cur.length > 0 ? '.' : '') + pieces[i]; + ret.push(cur); + } + + return ret; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/path/setDottedPath.js b/node_modules/mongoose/lib/helpers/path/setDottedPath.js new file mode 100644 index 000000000..853769c68 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/path/setDottedPath.js @@ -0,0 +1,16 @@ +'use strict'; + +module.exports = function setDottedPath(obj, path, val) { + const parts = path.indexOf('.') === -1 ? [path] : path.split('.'); + let cur = obj; + for (const part of parts.slice(0, -1)) { + if (cur[part] == null) { + cur[part] = {}; + } + + cur = cur[part]; + } + + const last = parts[parts.length - 1]; + cur[last] = val; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/pluralize.js b/node_modules/mongoose/lib/helpers/pluralize.js new file mode 100644 index 000000000..c56795017 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/pluralize.js @@ -0,0 +1,94 @@ +'use strict'; + +module.exports = pluralize; + +/** + * Pluralization rules. + */ + +exports.pluralization = [ + [/(m)an$/gi, '$1en'], + [/(pe)rson$/gi, '$1ople'], + [/(child)$/gi, '$1ren'], + [/^(ox)$/gi, '$1en'], + [/(ax|test)is$/gi, '$1es'], + [/(octop|vir)us$/gi, '$1i'], + [/(alias|status)$/gi, '$1es'], + [/(bu)s$/gi, '$1ses'], + [/(buffal|tomat|potat)o$/gi, '$1oes'], + [/([ti])um$/gi, '$1a'], + [/sis$/gi, 'ses'], + [/(?:([^f])fe|([lr])f)$/gi, '$1$2ves'], + [/(hive)$/gi, '$1s'], + [/([^aeiouy]|qu)y$/gi, '$1ies'], + [/(x|ch|ss|sh)$/gi, '$1es'], + [/(matr|vert|ind)ix|ex$/gi, '$1ices'], + [/([m|l])ouse$/gi, '$1ice'], + [/(kn|w|l)ife$/gi, '$1ives'], + [/(quiz)$/gi, '$1zes'], + [/^goose$/i, 'geese'], + [/s$/gi, 's'], + [/([^a-z])$/, '$1'], + [/$/gi, 's'] +]; +const rules = exports.pluralization; + +/** + * Uncountable words. + * + * These words are applied while processing the argument to `toCollectionName`. + * @api public + */ + +exports.uncountables = [ + 'advice', + 'energy', + 'excretion', + 'digestion', + 'cooperation', + 'health', + 'justice', + 'labour', + 'machinery', + 'equipment', + 'information', + 'pollution', + 'sewage', + 'paper', + 'money', + 'species', + 'series', + 'rain', + 'rice', + 'fish', + 'sheep', + 'moose', + 'deer', + 'news', + 'expertise', + 'status', + 'media' +]; +const uncountables = exports.uncountables; + +/*! + * Pluralize function. 
+ * + * @author TJ Holowaychuk (extracted from _ext.js_) + * @param {String} string to pluralize + * @api private + */ + +function pluralize(str) { + let found; + str = str.toLowerCase(); + if (!~uncountables.indexOf(str)) { + found = rules.filter(function(rule) { + return str.match(rule[0]); + }); + if (found[0]) { + return str.replace(found[0][0], found[0][1]); + } + } + return str; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/SkipPopulateValue.js b/node_modules/mongoose/lib/helpers/populate/SkipPopulateValue.js new file mode 100644 index 000000000..5d46cfdd8 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/SkipPopulateValue.js @@ -0,0 +1,10 @@ +'use strict'; + +module.exports = function SkipPopulateValue(val) { + if (!(this instanceof SkipPopulateValue)) { + return new SkipPopulateValue(val); + } + + this.val = val; + return this; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/assignRawDocsToIdStructure.js b/node_modules/mongoose/lib/helpers/populate/assignRawDocsToIdStructure.js new file mode 100644 index 000000000..562b15cf5 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/assignRawDocsToIdStructure.js @@ -0,0 +1,107 @@ +'use strict'; + +const leanPopulateMap = require('./leanPopulateMap'); +const modelSymbol = require('../symbols').modelSymbol; +const utils = require('../../utils'); + +module.exports = assignRawDocsToIdStructure; + +/*! + * Assign `vals` returned by mongo query to the `rawIds` + * structure returned from utils.getVals() honoring + * query sort order if specified by user. + * + * This can be optimized. + * + * Rules: + * + * if the value of the path is not an array, use findOne rules, else find. + * for findOne the results are assigned directly to doc path (including null results). 
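
pluralize() above is the helper mongoose uses when deriving a collection name from a model name (see the note about toCollectionName in the uncountables comment). A quick illustration (relative require assumes the project root):

const pluralize = require('./node_modules/mongoose/lib/helpers/pluralize');

console.log(pluralize('Task'));   // 'tasks'
console.log(pluralize('Person')); // 'people'
console.log(pluralize('Series')); // 'series' – listed as uncountable
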
+ * for find, if user specified sort order, results are assigned directly + * else documents are put back in original order of array if found in results + * + * @param {Array} rawIds + * @param {Array} vals + * @param {Boolean} sort + * @api private + */ + +function assignRawDocsToIdStructure(rawIds, resultDocs, resultOrder, options, recursed) { + // honor user specified sort order + const newOrder = []; + const sorting = options.sort && rawIds.length > 1; + const nullIfNotFound = options.$nullIfNotFound; + let doc; + let sid; + let id; + + if (rawIds.isMongooseArrayProxy) { + rawIds = rawIds.__array; + } + + for (let i = 0; i < rawIds.length; ++i) { + id = rawIds[i]; + + if (Array.isArray(id)) { + // handle [ [id0, id2], [id3] ] + assignRawDocsToIdStructure(id, resultDocs, resultOrder, options, true); + newOrder.push(id); + continue; + } + + if (id === null && !sorting) { + // keep nulls for findOne unless sorting, which always + // removes them (backward compat) + newOrder.push(id); + continue; + } + + sid = String(id); + + doc = resultDocs[sid]; + // If user wants separate copies of same doc, use this option + if (options.clone && doc != null) { + if (options.lean) { + const _model = leanPopulateMap.get(doc); + doc = utils.clone(doc); + leanPopulateMap.set(doc, _model); + } else { + doc = doc.constructor.hydrate(doc._doc); + } + } + + if (recursed) { + if (doc) { + if (sorting) { + const _resultOrder = resultOrder[sid]; + if (Array.isArray(_resultOrder) && Array.isArray(doc) && _resultOrder.length === doc.length) { + newOrder.push(doc); + } else { + newOrder[_resultOrder] = doc; + } + } else { + newOrder.push(doc); + } + } else if (id != null && id[modelSymbol] != null) { + newOrder.push(id); + } else { + newOrder.push(options.retainNullValues || nullIfNotFound ? null : id); + } + } else { + // apply findOne behavior - if document in results, assign, else assign null + newOrder[i] = doc || null; + } + } + + rawIds.length = 0; + if (newOrder.length) { + // reassign the documents based on corrected order + + // forEach skips over sparse entries in arrays so we + // can safely use this to our advantage dealing with sorted + // result sets too. + newOrder.forEach(function(doc, i) { + rawIds[i] = doc; + }); + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/assignVals.js b/node_modules/mongoose/lib/helpers/populate/assignVals.js new file mode 100644 index 000000000..f0a394901 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/assignVals.js @@ -0,0 +1,307 @@ +'use strict'; + +const MongooseMap = require('../../types/map'); +const SkipPopulateValue = require('./SkipPopulateValue'); +const assignRawDocsToIdStructure = require('./assignRawDocsToIdStructure'); +const get = require('../get'); +const getVirtual = require('./getVirtual'); +const leanPopulateMap = require('./leanPopulateMap'); +const lookupLocalFields = require('./lookupLocalFields'); +const markArraySubdocsPopulated = require('./markArraySubdocsPopulated'); +const mpath = require('mpath'); +const sift = require('sift').default; +const utils = require('../../utils'); +const { populateModelSymbol } = require('../symbols'); + +module.exports = function assignVals(o) { + // Options that aren't explicitly listed in `populateOptions` + const userOptions = Object.assign({}, get(o, 'allOptions.options.options'), get(o, 'allOptions.options')); + // `o.options` contains options explicitly listed in `populateOptions`, like + // `match` and `limit`. 
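
assignVals()/assignRawDocsToIdStructure() are the internals that write query results back onto the populated path, honoring options such as sort and justOne. The application-side call that ultimately exercises them looks roughly like this (schemas and model names hypothetical; assumes an open connection):

const mongoose = require('mongoose');

// register the ref'd model so populate() can find it
const Subtask = mongoose.model('SubtaskExample',
  new mongoose.Schema({ title: String, order: Number }));
const TaskPop = mongoose.model('TaskPopulateExample', new mongoose.Schema({
  title: String,
  subtasks: [{ type: mongoose.Schema.Types.ObjectId, ref: 'SubtaskExample' }]
}));

TaskPop.find()
  .populate({ path: 'subtasks', options: { sort: { order: 1 } } })
  .then(tasks => console.log(tasks.length && tasks[0].subtasks));
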
+ const populateOptions = Object.assign({}, o.options, userOptions, { + justOne: o.justOne + }); + populateOptions.$nullIfNotFound = o.isVirtual; + const populatedModel = o.populatedModel; + + const originalIds = [].concat(o.rawIds); + + // replace the original ids in our intermediate _ids structure + // with the documents found by query + o.allIds = [].concat(o.allIds); + assignRawDocsToIdStructure(o.rawIds, o.rawDocs, o.rawOrder, populateOptions); + + // now update the original documents being populated using the + // result structure that contains real documents. + const docs = o.docs; + const rawIds = o.rawIds; + const options = o.options; + const count = o.count && o.isVirtual; + let i; + + function setValue(val) { + if (count) { + return val; + } + if (val instanceof SkipPopulateValue) { + return val.val; + } + if (val === void 0) { + return val; + } + + const _allIds = o.allIds[i]; + + if (o.justOne === true && Array.isArray(val)) { + // Might be an embedded discriminator (re: gh-9244) with multiple models, so make sure to pick the right + // model before assigning. + const ret = []; + for (const doc of val) { + const _docPopulatedModel = leanPopulateMap.get(doc); + if (_docPopulatedModel == null || _docPopulatedModel === populatedModel) { + ret.push(doc); + } + } + // Since we don't want to have to create a new mongoosearray, make sure to + // modify the array in place + while (val.length > ret.length) { + Array.prototype.pop.apply(val, []); + } + for (let i = 0; i < ret.length; ++i) { + val[i] = ret[i]; + } + + return valueFilter(val[0], options, populateOptions, _allIds); + } else if (o.justOne === false && !Array.isArray(val)) { + if (val === null && o.isVirtual) { + return void 0; + } + return valueFilter([val], options, populateOptions, _allIds); + } + return valueFilter(val, options, populateOptions, _allIds); + } + + for (i = 0; i < docs.length; ++i) { + const _path = o.path.endsWith('.$*') ? o.path.slice(0, -3) : o.path; + const existingVal = mpath.get(_path, docs[i], lookupLocalFields); + if (existingVal == null && !getVirtual(o.originalModel.schema, _path)) { + continue; + } + + let valueToSet; + if (count) { + valueToSet = numDocs(rawIds[i]); + } else if (Array.isArray(o.match)) { + valueToSet = Array.isArray(rawIds[i]) ? + rawIds[i].filter(sift(o.match[i])) : + [rawIds[i]].filter(sift(o.match[i]))[0]; + } else { + valueToSet = rawIds[i]; + } + + // If we're populating a map, the existing value will be an object, so + // we need to transform again + const originalSchema = o.originalModel.schema; + const isDoc = get(docs[i], '$__', null) != null; + let isMap = isDoc ? + existingVal instanceof Map : + utils.isPOJO(existingVal); + // If we pass the first check, also make sure the local field's schematype + // is map (re: gh-6460) + isMap = isMap && get(originalSchema._getSchema(_path), '$isSchemaMap'); + if (!o.isVirtual && isMap) { + const _keys = existingVal instanceof Map ? + Array.from(existingVal.keys()) : + Object.keys(existingVal); + valueToSet = valueToSet.reduce((cur, v, i) => { + cur.set(_keys[i], v); + return cur; + }, new Map()); + } + + if (isDoc && Array.isArray(valueToSet)) { + for (const val of valueToSet) { + if (val != null && val.$__ != null) { + val.$__.parent = docs[i]; + } + } + } else if (isDoc && valueToSet != null && valueToSet.$__ != null) { + valueToSet.$__.parent = docs[i]; + } + + if (o.isVirtual && isDoc) { + docs[i].$populated(_path, o.justOne ? 
originalIds[0] : originalIds, o.allOptions); + // If virtual populate and doc is already init-ed, need to walk through + // the actual doc to set rather than setting `_doc` directly + mpath.set(_path, valueToSet, docs[i], void 0, setValue, false); + continue; + } + + const parts = _path.split('.'); + let cur = docs[i]; + const curPath = parts[0]; + for (let j = 0; j < parts.length - 1; ++j) { + // If we get to an array with a dotted path, like `arr.foo`, don't set + // `foo` on the array. + if (Array.isArray(cur) && !utils.isArrayIndex(parts[j])) { + break; + } + + if (parts[j] === '$*') { + break; + } + + if (cur[parts[j]] == null) { + // If nothing to set, avoid creating an unnecessary array. Otherwise + // we'll end up with a single doc in the array with only defaults. + // See gh-8342, gh-8455 + const schematype = originalSchema._getSchema(curPath); + if (valueToSet == null && schematype != null && schematype.$isMongooseArray) { + break; + } + cur[parts[j]] = {}; + } + cur = cur[parts[j]]; + // If the property in MongoDB is a primitive, we won't be able to populate + // the nested path, so skip it. See gh-7545 + if (typeof cur !== 'object') { + break; + } + } + if (docs[i].$__) { + o.allOptions.options[populateModelSymbol] = o.allOptions.model; + docs[i].$populated(_path, o.allIds[i], o.allOptions.options); + + if (valueToSet instanceof Map && !valueToSet.$isMongooseMap) { + valueToSet = new MongooseMap(valueToSet, _path, docs[i], docs[i].schema.path(_path).$__schemaType); + } + } + + // If lean, need to check that each individual virtual respects + // `justOne`, because you may have a populated virtual with `justOne` + // underneath an array. See gh-6867 + mpath.set(_path, valueToSet, docs[i], lookupLocalFields, setValue, false); + + if (docs[i].$__) { + markArraySubdocsPopulated(docs[i], [o.allOptions.options]); + } + } +}; + +function numDocs(v) { + if (Array.isArray(v)) { + // If setting underneath an array of populated subdocs, we may have an + // array of arrays. See gh-7573 + if (v.some(el => Array.isArray(el))) { + return v.map(el => numDocs(el)); + } + return v.length; + } + return v == null ? 0 : 1; +} + +/*! + * 1) Apply backwards compatible find/findOne behavior to sub documents + * + * find logic: + * a) filter out non-documents + * b) remove _id from sub docs when user specified + * + * findOne + * a) if no doc found, set to null + * b) remove _id from sub docs when user specified + * + * 2) Remove _ids when specified by users query. + * + * background: + * _ids are left in the query even when user excludes them so + * that population mapping can occur. + */ + +function valueFilter(val, assignmentOpts, populateOptions, allIds) { + const userSpecifiedTransform = typeof populateOptions.transform === 'function'; + const transform = userSpecifiedTransform ? populateOptions.transform : noop; + if (Array.isArray(val)) { + // find logic + const ret = []; + const numValues = val.length; + for (let i = 0; i < numValues; ++i) { + let subdoc = val[i]; + const _allIds = Array.isArray(allIds) ? allIds[i] : allIds; + if (!isPopulatedObject(subdoc) && (!populateOptions.retainNullValues || subdoc != null) && !userSpecifiedTransform) { + continue; + } else if (userSpecifiedTransform) { + subdoc = transform(isPopulatedObject(subdoc) ? 
subdoc : null, _allIds); + } + maybeRemoveId(subdoc, assignmentOpts); + ret.push(subdoc); + if (assignmentOpts.originalLimit && + ret.length >= assignmentOpts.originalLimit) { + break; + } + } + + // Since we don't want to have to create a new mongoosearray, make sure to + // modify the array in place + while (val.length > ret.length) { + Array.prototype.pop.apply(val, []); + } + for (let i = 0; i < ret.length; ++i) { + if (val.isMongooseArrayProxy) { + val.set(i, ret[i], true); + } else { + val[i] = ret[i]; + } + } + return val; + } + + // findOne + if (isPopulatedObject(val) || utils.isPOJO(val)) { + maybeRemoveId(val, assignmentOpts); + return transform(val, allIds); + } + if (val instanceof Map) { + return val; + } + + if (populateOptions.justOne === false) { + return []; + } + + return val == null ? transform(val, allIds) : transform(null, allIds); +} + +/*! + * Remove _id from `subdoc` if user specified "lean" query option + */ + +function maybeRemoveId(subdoc, assignmentOpts) { + if (subdoc != null && assignmentOpts.excludeId) { + if (typeof subdoc.$__setValue === 'function') { + delete subdoc._doc._id; + } else { + delete subdoc._id; + } + } +} + +/*! + * Determine if `obj` is something we can set a populated path to. Can be a + * document, a lean document, or an array/map that contains docs. + */ + +function isPopulatedObject(obj) { + if (obj == null) { + return false; + } + + return Array.isArray(obj) || + obj.$isMongooseMap || + obj.$__ != null || + leanPopulateMap.has(obj); +} + +function noop(v) { + return v; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/createPopulateQueryFilter.js b/node_modules/mongoose/lib/helpers/populate/createPopulateQueryFilter.js new file mode 100644 index 000000000..1f133b7d1 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/createPopulateQueryFilter.js @@ -0,0 +1,80 @@ +'use strict'; + +const SkipPopulateValue = require('./SkipPopulateValue'); +const parentPaths = require('../path/parentPaths'); +const { trusted } = require('../query/trusted'); + +module.exports = function createPopulateQueryFilter(ids, _match, _foreignField, model, skipInvalidIds) { + const match = _formatMatch(_match); + + if (_foreignField.size === 1) { + const foreignField = Array.from(_foreignField)[0]; + const foreignSchemaType = model.schema.path(foreignField); + if (foreignField !== '_id' || !match['_id']) { + ids = _filterInvalidIds(ids, foreignSchemaType, skipInvalidIds); + match[foreignField] = trusted({ $in: ids }); + } + + const _parentPaths = parentPaths(foreignField); + for (let i = 0; i < _parentPaths.length - 1; ++i) { + const cur = _parentPaths[i]; + if (match[cur] != null && match[cur].$elemMatch != null) { + match[cur].$elemMatch[foreignField.slice(cur.length + 1)] = trusted({ $in: ids }); + delete match[foreignField]; + break; + } + } + } else { + const $or = []; + if (Array.isArray(match.$or)) { + match.$and = [{ $or: match.$or }, { $or: $or }]; + delete match.$or; + } else { + match.$or = $or; + } + for (const foreignField of _foreignField) { + if (foreignField !== '_id' || !match['_id']) { + const foreignSchemaType = model.schema.path(foreignField); + ids = _filterInvalidIds(ids, foreignSchemaType, skipInvalidIds); + $or.push({ [foreignField]: { $in: ids } }); + } + } + } + + return match; +}; + +/*! 
+ * Optionally filter out invalid ids that don't conform to foreign field's schema + * to avoid cast errors (gh-7706) + */ + +function _filterInvalidIds(ids, foreignSchemaType, skipInvalidIds) { + ids = ids.filter(v => !(v instanceof SkipPopulateValue)); + if (!skipInvalidIds) { + return ids; + } + return ids.filter(id => { + try { + foreignSchemaType.cast(id); + return true; + } catch (err) { + return false; + } + }); +} + +/*! + * Format `mod.match` given that it may be an array that we need to $or if + * the client has multiple docs with match functions + */ + +function _formatMatch(match) { + if (Array.isArray(match)) { + if (match.length > 1) { + return { $or: [].concat(match.map(m => Object.assign({}, m))) }; + } + return Object.assign({}, match[0]); + } + return Object.assign({}, match); +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/getModelsMapForPopulate.js b/node_modules/mongoose/lib/helpers/populate/getModelsMapForPopulate.js new file mode 100644 index 000000000..08cdef478 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/getModelsMapForPopulate.js @@ -0,0 +1,680 @@ +'use strict'; + +const MongooseError = require('../../error/index'); +const SkipPopulateValue = require('./SkipPopulateValue'); +const get = require('../get'); +const getDiscriminatorByValue = require('../discriminator/getDiscriminatorByValue'); +const getConstructorName = require('../getConstructorName'); +const getSchemaTypes = require('./getSchemaTypes'); +const getVirtual = require('./getVirtual'); +const lookupLocalFields = require('./lookupLocalFields'); +const mpath = require('mpath'); +const modelNamesFromRefPath = require('./modelNamesFromRefPath'); +const utils = require('../../utils'); + +const modelSymbol = require('../symbols').modelSymbol; +const populateModelSymbol = require('../symbols').populateModelSymbol; +const schemaMixedSymbol = require('../../schema/symbols').schemaMixedSymbol; + +module.exports = function getModelsMapForPopulate(model, docs, options) { + let doc; + const len = docs.length; + const map = []; + const modelNameFromQuery = options.model && options.model.modelName || options.model; + let schema; + let refPath; + let modelNames; + const available = {}; + + const modelSchema = model.schema; + + // Populating a nested path should always be a no-op re: #9073. + // People shouldn't do this, but apparently they do. + if (options._localModel != null && options._localModel.schema.nested[options.path]) { + return []; + } + + const _virtualRes = getVirtual(model.schema, options.path); + const virtual = _virtualRes == null ? null : _virtualRes.virtual; + if (virtual != null) { + return _virtualPopulate(model, docs, options, _virtualRes); + } + + let allSchemaTypes = getSchemaTypes(model, modelSchema, null, options.path); + allSchemaTypes = Array.isArray(allSchemaTypes) ? allSchemaTypes : [allSchemaTypes].filter(v => v != null); + + if (allSchemaTypes.length <= 0 && options.strictPopulate !== false && options._localModel != null) { + return new MongooseError('Cannot populate path `' + options.path + + '` because it is not in your schema. Set the `strictPopulate` option ' + + 'to false to override.'); + } + + for (let i = 0; i < len; i++) { + doc = docs[i]; + let justOne = null; + + const docSchema = doc != null && doc.$__ != null ? 
doc.$__schema : modelSchema; + schema = getSchemaTypes(model, docSchema, doc, options.path); + + // Special case: populating a path that's a DocumentArray unless + // there's an explicit `ref` or `refPath` re: gh-8946 + if (schema != null && + schema.$isMongooseDocumentArray && + schema.options.ref == null && + schema.options.refPath == null) { + continue; + } + const isUnderneathDocArray = schema && schema.$isUnderneathDocArray; + if (isUnderneathDocArray && get(options, 'options.sort') != null) { + return new MongooseError('Cannot populate with `sort` on path ' + options.path + + ' because it is a subproperty of a document array'); + } + + modelNames = null; + let isRefPath = false; + let normalizedRefPath = null; + let schemaOptions = null; + + if (Array.isArray(schema)) { + const schemasArray = schema; + for (const _schema of schemasArray) { + let _modelNames; + let res; + try { + res = _getModelNames(doc, _schema, modelNameFromQuery, model); + _modelNames = res.modelNames; + isRefPath = isRefPath || res.isRefPath; + normalizedRefPath = normalizedRefPath || res.refPath; + justOne = res.justOne; + } catch (error) { + return error; + } + + if (isRefPath && !res.isRefPath) { + continue; + } + if (!_modelNames) { + continue; + } + modelNames = modelNames || []; + for (const modelName of _modelNames) { + if (modelNames.indexOf(modelName) === -1) { + modelNames.push(modelName); + } + } + } + } else { + try { + const res = _getModelNames(doc, schema, modelNameFromQuery, model); + modelNames = res.modelNames; + isRefPath = res.isRefPath; + normalizedRefPath = normalizedRefPath || res.refPath; + justOne = res.justOne; + schemaOptions = get(schema, 'options.populate', null); + } catch (error) { + return error; + } + + if (!modelNames) { + continue; + } + } + + const data = {}; + const localField = options.path; + const foreignField = '_id'; + + // `justOne = null` means we don't know from the schema whether the end + // result should be an array or a single doc. This can result from + // populating a POJO using `Model.populate()` + if ('justOne' in options && options.justOne !== void 0) { + justOne = options.justOne; + } else if (schema && !schema[schemaMixedSymbol]) { + // Skip Mixed types because we explicitly don't do casting on those. + if (options.path.endsWith('.' + schema.path) || options.path === schema.path) { + justOne = Array.isArray(schema) ? + schema.every(schema => !schema.$isMongooseArray) : + !schema.$isMongooseArray; + } + } + + if (!modelNames) { + continue; + } + + data.isVirtual = false; + data.justOne = justOne; + data.localField = localField; + data.foreignField = foreignField; + + // Get local fields + const ret = _getLocalFieldValues(doc, localField, model, options, null, schema); + + const id = String(utils.getValue(foreignField, doc)); + options._docs[id] = Array.isArray(ret) ? 
ret.slice() : ret; + + let match = get(options, 'match', null); + + const hasMatchFunction = typeof match === 'function'; + if (hasMatchFunction) { + match = match.call(doc, doc); + } + data.match = match; + data.hasMatchFunction = hasMatchFunction; + data.isRefPath = isRefPath; + + if (isRefPath) { + const embeddedDiscriminatorModelNames = _findRefPathForDiscriminators(doc, + modelSchema, data, options, normalizedRefPath, ret); + + modelNames = embeddedDiscriminatorModelNames || modelNames; + } + + try { + addModelNamesToMap(model, map, available, modelNames, options, data, ret, doc, schemaOptions); + } catch (err) { + return err; + } + } + return map; + + function _getModelNames(doc, schema, modelNameFromQuery, model) { + let modelNames; + let isRefPath = false; + let justOne = null; + + if (schema && schema.caster) { + schema = schema.caster; + } + if (schema && schema.$isSchemaMap) { + schema = schema.$__schemaType; + } + + const ref = schema && schema.options && schema.options.ref; + refPath = schema && schema.options && schema.options.refPath; + if (schema != null && + schema[schemaMixedSymbol] && + !ref && + !refPath && + !modelNameFromQuery) { + return { modelNames: null }; + } + + if (modelNameFromQuery) { + modelNames = [modelNameFromQuery]; // query options + } else if (refPath != null) { + if (typeof refPath === 'function') { + const subdocPath = options.path.slice(0, options.path.length - schema.path.length - 1); + const vals = mpath.get(subdocPath, doc, lookupLocalFields); + const subdocsBeingPopulated = Array.isArray(vals) ? + utils.array.flatten(vals) : + (vals ? [vals] : []); + + modelNames = new Set(); + for (const subdoc of subdocsBeingPopulated) { + refPath = refPath.call(subdoc, subdoc, options.path); + modelNamesFromRefPath(refPath, doc, options.path, modelSchema, options._queryProjection). + forEach(name => modelNames.add(name)); + } + modelNames = Array.from(modelNames); + } else { + modelNames = modelNamesFromRefPath(refPath, doc, options.path, modelSchema, options._queryProjection); + } + + isRefPath = true; + } else { + let ref; + let refPath; + let schemaForCurrentDoc; + let discriminatorValue; + let modelForCurrentDoc = model; + const discriminatorKey = model.schema.options.discriminatorKey; + + if (!schema && discriminatorKey && (discriminatorValue = utils.getValue(discriminatorKey, doc))) { + // `modelNameForFind` is the discriminator value, so we might need + // find the discriminated model name + const discriminatorModel = getDiscriminatorByValue(model.discriminators, discriminatorValue) || model; + if (discriminatorModel != null) { + modelForCurrentDoc = discriminatorModel; + } else { + try { + modelForCurrentDoc = model.db.model(discriminatorValue); + } catch (error) { + return error; + } + } + + schemaForCurrentDoc = modelForCurrentDoc.schema._getSchema(options.path); + + if (schemaForCurrentDoc && schemaForCurrentDoc.caster) { + schemaForCurrentDoc = schemaForCurrentDoc.caster; + } + } else { + schemaForCurrentDoc = schema; + } + + if (schemaForCurrentDoc != null) { + justOne = !schemaForCurrentDoc.$isMongooseArray && !schemaForCurrentDoc._arrayPath; + } + + if ((ref = get(schemaForCurrentDoc, 'options.ref')) != null) { + if (schemaForCurrentDoc != null && + typeof ref === 'function' && + options.path.endsWith('.' + schemaForCurrentDoc.path)) { + // Ensure correct context for ref functions: subdoc, not top-level doc. 
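// --- Editorial aside: illustration only, not part of the vendored mongoose file or this diff ---
// Sketch of a `ref` defined as a function, the case the surrounding code evaluates
// with the (sub)document as context. Review/Book/Movie are hypothetical models;
// the function returns the model name to populate from for that particular doc.
const mongoose = require('mongoose');
const { Schema } = mongoose;

mongoose.model('Book', new Schema({ title: String }));
mongoose.model('Movie', new Schema({ title: String }));

const reviewSchema = new Schema({
  kind: String, // 'Book' or 'Movie'
  item: {
    type: Schema.Types.ObjectId,
    ref: function() { return this.kind; } // `this` is the review being populated
  }
});
const Review = mongoose.model('Review', reviewSchema);

async function loadReviews() {
  // Assumes an open connection; each `item` is populated from the model
  // named by that review's `kind`.
  return Review.find().populate('item');
}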
See gh-8469 + modelNames = new Set(); + + const subdocPath = options.path.slice(0, options.path.length - schemaForCurrentDoc.path.length - 1); + const vals = mpath.get(subdocPath, doc, lookupLocalFields); + const subdocsBeingPopulated = Array.isArray(vals) ? + utils.array.flatten(vals) : + (vals ? [vals] : []); + for (const subdoc of subdocsBeingPopulated) { + modelNames.add(handleRefFunction(ref, subdoc)); + } + + if (subdocsBeingPopulated.length === 0) { + modelNames = [handleRefFunction(ref, doc)]; + } else { + modelNames = Array.from(modelNames); + } + } else { + ref = handleRefFunction(ref, doc); + modelNames = [ref]; + } + } else if ((schemaForCurrentDoc = get(schema, 'options.refPath')) != null) { + isRefPath = true; + if (typeof refPath === 'function') { + const subdocPath = options.path.slice(0, options.path.length - schemaForCurrentDoc.path.length - 1); + const vals = mpath.get(subdocPath, doc, lookupLocalFields); + const subdocsBeingPopulated = Array.isArray(vals) ? + utils.array.flatten(vals) : + (vals ? [vals] : []); + + modelNames = new Set(); + for (const subdoc of subdocsBeingPopulated) { + refPath = refPath.call(subdoc, subdoc, options.path); + modelNamesFromRefPath(refPath, doc, options.path, modelSchema, options._queryProjection). + forEach(name => modelNames.add(name)); + } + modelNames = Array.from(modelNames); + } else { + modelNames = modelNamesFromRefPath(refPath, doc, options.path, modelSchema, options._queryProjection); + } + } + } + + if (!modelNames) { + return { modelNames: modelNames, justOne: justOne, isRefPath: isRefPath, refPath: refPath }; + } + + if (!Array.isArray(modelNames)) { + modelNames = [modelNames]; + } + + return { modelNames: modelNames, justOne: justOne, isRefPath: isRefPath, refPath: refPath }; + } +}; + +/*! + * ignore + */ + +function _virtualPopulate(model, docs, options, _virtualRes) { + const map = []; + const available = {}; + const virtual = _virtualRes.virtual; + + for (const doc of docs) { + let modelNames = null; + const data = {}; + + // localField and foreignField + let localField; + const virtualPrefix = _virtualRes.nestedSchemaPath ? + _virtualRes.nestedSchemaPath + '.' : ''; + if (typeof virtual.options.localField === 'function') { + localField = virtualPrefix + virtual.options.localField.call(doc, doc); + } else if (Array.isArray(virtual.options.localField)) { + localField = virtual.options.localField.map(field => virtualPrefix + field); + } else { + localField = virtualPrefix + virtual.options.localField; + } + data.count = virtual.options.count; + + if (virtual.options.skip != null && !options.hasOwnProperty('skip')) { + options.skip = virtual.options.skip; + } + if (virtual.options.limit != null && !options.hasOwnProperty('limit')) { + options.limit = virtual.options.limit; + } + if (virtual.options.perDocumentLimit != null && !options.hasOwnProperty('perDocumentLimit')) { + options.perDocumentLimit = virtual.options.perDocumentLimit; + } + let foreignField = virtual.options.foreignField; + + if (!localField || !foreignField) { + return new MongooseError('If you are populating a virtual, you must set the ' + + 'localField and foreignField options'); + } + + if (typeof localField === 'function') { + localField = localField.call(doc, doc); + } + if (typeof foreignField === 'function') { + foreignField = foreignField.call(doc); + } + + data.isRefPath = false; + + // `justOne = null` means we don't know from the schema whether the end + // result should be an array or a single doc. 
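// --- Editorial aside: illustration only, not part of the vendored mongoose file or this diff ---
// Sketch of a populated virtual, the case handled by `_virtualPopulate`:
// `localField`/`foreignField` define the join and `justOne` decides whether the
// result is a single doc or an array. User/Profile are hypothetical models.
const mongoose = require('mongoose');
const { Schema } = mongoose;

const userSchema = new Schema({ email: String });
userSchema.virtual('profile', {
  ref: 'Profile',     // model to populate from
  localField: '_id',  // value on the User doc...
  foreignField: 'user', // ...matched against Profile.user
  justOne: true       // single doc (or null) instead of an array
});

const User = mongoose.model('User', userSchema);
const Profile = mongoose.model('Profile', new Schema({
  user: { type: Schema.Types.ObjectId, ref: 'User' },
  bio: String
}));

async function loadUser(id) {
  // Assumes an open connection; `user.profile` is a Profile doc or null.
  return User.findById(id).populate('profile');
}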
This can result from + // populating a POJO using `Model.populate()` + let justOne = null; + if ('justOne' in options && options.justOne !== void 0) { + justOne = options.justOne; + } + + if (virtual.options.refPath) { + modelNames = + modelNamesFromRefPath(virtual.options.refPath, doc, options.path); + justOne = !!virtual.options.justOne; + data.isRefPath = true; + } else if (virtual.options.ref) { + let normalizedRef; + if (typeof virtual.options.ref === 'function' && !virtual.options.ref[modelSymbol]) { + normalizedRef = virtual.options.ref.call(doc, doc); + } else { + normalizedRef = virtual.options.ref; + } + justOne = !!virtual.options.justOne; + // When referencing nested arrays, the ref should be an Array + // of modelNames. + if (Array.isArray(normalizedRef)) { + modelNames = normalizedRef; + } else { + modelNames = [normalizedRef]; + } + } + + data.isVirtual = true; + data.virtual = virtual; + data.justOne = justOne; + + // `match` + let match = get(options, 'match', null) || + get(data, 'virtual.options.match', null) || + get(data, 'virtual.options.options.match', null); + + let hasMatchFunction = typeof match === 'function'; + if (hasMatchFunction) { + match = match.call(doc, doc); + } + + if (Array.isArray(localField) && Array.isArray(foreignField) && localField.length === foreignField.length) { + match = Object.assign({}, match); + for (let i = 1; i < localField.length; ++i) { + match[foreignField[i]] = convertTo_id(mpath.get(localField[i], doc, lookupLocalFields), model.schema); + hasMatchFunction = true; + } + + localField = localField[0]; + foreignField = foreignField[0]; + } + + data.localField = localField; + data.foreignField = foreignField; + data.match = match; + data.hasMatchFunction = hasMatchFunction; + + // Get local fields + const ret = _getLocalFieldValues(doc, localField, model, options, virtual); + + try { + addModelNamesToMap(model, map, available, modelNames, options, data, ret, doc); + } catch (err) { + return err; + } + } + + return map; +} + +/*! + * ignore + */ + +function addModelNamesToMap(model, map, available, modelNames, options, data, ret, doc, schemaOptions) { + // `PopulateOptions#connection`: if the model is passed as a string, the + // connection matters because different connections have different models. + const connection = options.connection != null ? options.connection : model.db; + + if (modelNames == null) { + return; + } + + let k = modelNames.length; + while (k--) { + const modelName = modelNames[k]; + if (modelName == null) { + continue; + } + + let Model; + if (options.model && options.model[modelSymbol]) { + Model = options.model; + } else if (modelName[modelSymbol]) { + Model = modelName; + } else { + try { + Model = connection.model(modelName); + } catch (err) { + if (ret !== void 0) { + throw err; + } + Model = null; + } + } + + let ids = ret; + const flat = Array.isArray(ret) ? utils.array.flatten(ret) : []; + + if (data.isRefPath && Array.isArray(ret) && flat.length === modelNames.length) { + ids = flat.filter((val, i) => modelNames[i] === modelName); + } + + const perDocumentLimit = options.perDocumentLimit == null ? 
+ get(options, 'options.perDocumentLimit', null) : + options.perDocumentLimit; + + if (!available[modelName] || perDocumentLimit != null) { + const currentOptions = { + model: Model + }; + + if (data.isVirtual && get(data.virtual, 'options.options')) { + currentOptions.options = utils.clone(data.virtual.options.options); + } else if (schemaOptions != null) { + currentOptions.options = Object.assign({}, schemaOptions); + } + utils.merge(currentOptions, options); + + // Used internally for checking what model was used to populate this + // path. + options[populateModelSymbol] = Model; + + available[modelName] = { + model: Model, + options: currentOptions, + match: data.hasMatchFunction ? [data.match] : data.match, + docs: [doc], + ids: [ids], + allIds: [ret], + localField: new Set([data.localField]), + foreignField: new Set([data.foreignField]), + justOne: data.justOne, + isVirtual: data.isVirtual, + virtual: data.virtual, + count: data.count, + [populateModelSymbol]: Model + }; + map.push(available[modelName]); + } else { + available[modelName].localField.add(data.localField); + available[modelName].foreignField.add(data.foreignField); + available[modelName].docs.push(doc); + available[modelName].ids.push(ids); + available[modelName].allIds.push(ret); + if (data.hasMatchFunction) { + available[modelName].match.push(data.match); + } + } + } +} + +/*! + * ignore + */ + +function handleRefFunction(ref, doc) { + if (typeof ref === 'function' && !ref[modelSymbol]) { + return ref.call(doc, doc); + } + return ref; +} + +/*! + * ignore + */ + +function _getLocalFieldValues(doc, localField, model, options, virtual, schema) { + // Get Local fields + const localFieldPathType = model.schema._getPathType(localField); + const localFieldPath = localFieldPathType === 'real' ? + model.schema.path(localField) : + localFieldPathType.schema; + const localFieldGetters = localFieldPath && localFieldPath.getters ? + localFieldPath.getters : []; + + const _populateOptions = get(options, 'options', {}); + + const getters = 'getters' in _populateOptions ? + _populateOptions.getters : + get(virtual, 'options.getters', false); + if (localFieldGetters.length > 0 && getters) { + const hydratedDoc = (doc.$__ != null) ? doc : model.hydrate(doc); + const localFieldValue = utils.getValue(localField, doc); + if (Array.isArray(localFieldValue)) { + const localFieldHydratedValue = utils.getValue(localField.split('.').slice(0, -1), hydratedDoc); + return localFieldValue.map((localFieldArrVal, localFieldArrIndex) => + localFieldPath.applyGetters(localFieldArrVal, localFieldHydratedValue[localFieldArrIndex])); + } else { + return localFieldPath.applyGetters(localFieldValue, hydratedDoc); + } + } else { + return convertTo_id(mpath.get(localField, doc, lookupLocalFields), schema); + } +} + +/*! + * Retrieve the _id of `val` if a Document or Array of Documents. + * + * @param {Array|Document|Any} val + * @return {Array|Document|Any} + */ + +function convertTo_id(val, schema) { + if (val != null && val.$__ != null) { + return val._id; + } + if (val != null && val._id != null && (schema == null || !schema.$isSchemaMap)) { + return val._id; + } + + if (Array.isArray(val)) { + const rawVal = val.__array != null ? 
val.__array : val; + for (let i = 0; i < rawVal.length; ++i) { + if (rawVal[i] != null && rawVal[i].$__ != null) { + rawVal[i] = rawVal[i]._id; + } + } + if (val.isMongooseArray && val.$schema()) { + return val.$schema()._castForPopulate(val, val.$parent()); + } + + return [].concat(val); + } + + // `populate('map')` may be an object if populating on a doc that hasn't + // been hydrated yet + if (getConstructorName(val) === 'Object' && + // The intent here is we should only flatten the object if we expect + // to get a Map in the end. Avoid doing this for mixed types. + (schema == null || schema[schemaMixedSymbol] == null)) { + const ret = []; + for (const key of Object.keys(val)) { + ret.push(val[key]); + } + return ret; + } + // If doc has already been hydrated, e.g. `doc.populate('map')` + // then `val` will already be a map + if (val instanceof Map) { + return Array.from(val.values()); + } + + return val; +} + +/*! + * ignore + */ + +function _findRefPathForDiscriminators(doc, modelSchema, data, options, normalizedRefPath, ret) { + // Re: gh-8452. Embedded discriminators may not have `refPath`, so clear + // out embedded discriminator docs that don't have a `refPath` on the + // populated path. + if (!data.isRefPath || normalizedRefPath == null) { + return; + } + + const pieces = normalizedRefPath.split('.'); + let cur = ''; + let modelNames = void 0; + for (let i = 0; i < pieces.length; ++i) { + const piece = pieces[i]; + cur = cur + (cur.length === 0 ? '' : '.') + piece; + const schematype = modelSchema.path(cur); + if (schematype != null && + schematype.$isMongooseArray && + schematype.caster.discriminators != null && + Object.keys(schematype.caster.discriminators).length > 0) { + const subdocs = utils.getValue(cur, doc); + const remnant = options.path.substr(cur.length + 1); + const discriminatorKey = schematype.caster.schema.options.discriminatorKey; + modelNames = []; + for (const subdoc of subdocs) { + const discriminatorName = utils.getValue(discriminatorKey, subdoc); + const discriminator = schematype.caster.discriminators[discriminatorName]; + const discriminatorSchema = discriminator && discriminator.schema; + if (discriminatorSchema == null) { + continue; + } + const _path = discriminatorSchema.path(remnant); + if (_path == null || _path.options.refPath == null) { + const docValue = utils.getValue(data.localField.substr(cur.length + 1), subdoc); + ret.forEach((v, i) => { + if (v === docValue) { + ret[i] = SkipPopulateValue(v); + } + }); + continue; + } + const modelName = utils.getValue(pieces.slice(i + 1).join('.'), subdoc); + modelNames.push(modelName); + } + } + } + + return modelNames; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/getSchemaTypes.js b/node_modules/mongoose/lib/helpers/populate/getSchemaTypes.js new file mode 100644 index 000000000..76c428653 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/getSchemaTypes.js @@ -0,0 +1,229 @@ +'use strict'; + +/*! + * ignore + */ + +const Mixed = require('../../schema/mixed'); +const get = require('../get'); +const getDiscriminatorByValue = require('../discriminator/getDiscriminatorByValue'); +const leanPopulateMap = require('./leanPopulateMap'); +const mpath = require('mpath'); + +const populateModelSymbol = require('../symbols').populateModelSymbol; + +/*! + * Given a model and its schema, find all possible schema types for `path`, + * including searching through discriminators. 
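// --- Editorial aside: illustration only, not part of the vendored mongoose file or this diff ---
// Sketch of the discriminator case this helper searches for: a ref that exists
// only on a discriminator's schema, so the base schema alone cannot resolve the
// populated path. Event/ClickedEvent/User are hypothetical models.
const mongoose = require('mongoose');
const { Schema } = mongoose;

const User = mongoose.model('User', new Schema({ name: String }));
const Event = mongoose.model('Event', new Schema({ time: Date }));

// `user` is defined only on the ClickedEvent discriminator, not on Event itself.
const ClickedEvent = Event.discriminator('ClickedEvent',
  new Schema({ user: { type: Schema.Types.ObjectId, ref: 'User' } }));

async function loadEvents() {
  // Assumes an open connection. Populating `user` from a query on the base
  // model relies on the schema-type lookup walking through discriminators.
  return Event.find().populate('user');
}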
If `doc` is specified, will + * use the doc's values for discriminator keys when searching, otherwise + * will search all discriminators. + * + * @param {Schema} schema + * @param {Object} doc POJO + * @param {string} path + */ + +module.exports = function getSchemaTypes(model, schema, doc, path) { + const pathschema = schema.path(path); + const topLevelDoc = doc; + if (pathschema) { + return pathschema; + } + + const discriminatorKey = schema.discriminatorMapping && + schema.discriminatorMapping.key; + if (discriminatorKey && model != null) { + if (doc != null && doc[discriminatorKey] != null) { + const discriminator = getDiscriminatorByValue(model.discriminators, doc[discriminatorKey]); + schema = discriminator ? discriminator.schema : schema; + } else if (model.discriminators != null) { + return Object.keys(model.discriminators).reduce((arr, name) => { + const disc = model.discriminators[name]; + return arr.concat(getSchemaTypes(disc, disc.schema, null, path)); + }, []); + } + } + + function search(parts, schema, subdoc, nestedPath) { + let p = parts.length + 1; + let foundschema; + let trypath; + + while (p--) { + trypath = parts.slice(0, p).join('.'); + foundschema = schema.path(trypath); + if (foundschema == null) { + continue; + } + + if (foundschema.caster) { + // array of Mixed? + if (foundschema.caster instanceof Mixed) { + return foundschema.caster; + } + + let schemas = null; + if (foundschema.schema != null && foundschema.schema.discriminators != null) { + const discriminators = foundschema.schema.discriminators; + const discriminatorKeyPath = trypath + '.' + + foundschema.schema.options.discriminatorKey; + const keys = subdoc ? mpath.get(discriminatorKeyPath, subdoc) || [] : []; + schemas = Object.keys(discriminators). + reduce(function(cur, discriminator) { + const tiedValue = discriminators[discriminator].discriminatorMapping.value; + if (doc == null || keys.indexOf(discriminator) !== -1 || keys.indexOf(tiedValue) !== -1) { + cur.push(discriminators[discriminator]); + } + return cur; + }, []); + } + + // Now that we found the array, we need to check if there + // are remaining document paths to look up for casting. + // Also we need to handle array.$.path since schema.path + // doesn't work for that. + // If there is no foundschema.schema we are dealing with + // a path like array.$ + if (p !== parts.length && foundschema.schema) { + let ret; + if (parts[p] === '$') { + if (p + 1 === parts.length) { + // comments.$ + return foundschema; + } + // comments.$.comments.$.title + ret = search( + parts.slice(p + 1), + schema, + subdoc ? mpath.get(trypath, subdoc) : null, + nestedPath.concat(parts.slice(0, p)) + ); + if (ret) { + ret.$isUnderneathDocArray = ret.$isUnderneathDocArray || + !foundschema.schema.$isSingleNested; + } + return ret; + } + + if (schemas != null && schemas.length > 0) { + ret = []; + for (const schema of schemas) { + const _ret = search( + parts.slice(p), + schema, + subdoc ? mpath.get(trypath, subdoc) : null, + nestedPath.concat(parts.slice(0, p)) + ); + if (_ret != null) { + _ret.$isUnderneathDocArray = _ret.$isUnderneathDocArray || + !foundschema.schema.$isSingleNested; + if (_ret.$isUnderneathDocArray) { + ret.$isUnderneathDocArray = true; + } + ret.push(_ret); + } + } + return ret; + } else { + ret = search( + parts.slice(p), + foundschema.schema, + subdoc ? 
mpath.get(trypath, subdoc) : null, + nestedPath.concat(parts.slice(0, p)) + ); + + if (ret) { + ret.$isUnderneathDocArray = ret.$isUnderneathDocArray || + !foundschema.schema.$isSingleNested; + } + return ret; + } + } else if (p !== parts.length && + foundschema.$isMongooseArray && + foundschema.casterConstructor.$isMongooseArray) { + // Nested arrays. Drill down to the bottom of the nested array. + let type = foundschema; + while (type.$isMongooseArray && !type.$isMongooseDocumentArray) { + type = type.casterConstructor; + } + + const ret = search( + parts.slice(p), + type.schema, + null, + nestedPath.concat(parts.slice(0, p)) + ); + if (ret != null) { + return ret; + } + + if (type.schema.discriminators) { + const discriminatorPaths = []; + for (const discriminatorName of Object.keys(type.schema.discriminators)) { + const _schema = type.schema.discriminators[discriminatorName] || type.schema; + const ret = search(parts.slice(p), _schema, null, nestedPath.concat(parts.slice(0, p))); + if (ret != null) { + discriminatorPaths.push(ret); + } + } + if (discriminatorPaths.length > 0) { + return discriminatorPaths; + } + } + } + } + + const fullPath = nestedPath.concat([trypath]).join('.'); + if (topLevelDoc != null && topLevelDoc.$__ && topLevelDoc.$populated(fullPath) && p < parts.length) { + const model = doc.$__.populated[fullPath].options[populateModelSymbol]; + if (model != null) { + const ret = search( + parts.slice(p), + model.schema, + subdoc ? mpath.get(trypath, subdoc) : null, + nestedPath.concat(parts.slice(0, p)) + ); + + if (ret) { + ret.$isUnderneathDocArray = ret.$isUnderneathDocArray || + !model.schema.$isSingleNested; + } + return ret; + } + } + + const _val = get(topLevelDoc, trypath); + if (_val != null) { + const model = Array.isArray(_val) && _val.length > 0 ? + leanPopulateMap.get(_val[0]) : + leanPopulateMap.get(_val); + // Populated using lean, `leanPopulateMap` value is the foreign model + const schema = model != null ? model.schema : null; + if (schema != null) { + const ret = search( + parts.slice(p), + schema, + subdoc ? mpath.get(trypath, subdoc) : null, + nestedPath.concat(parts.slice(0, p)) + ); + + if (ret != null) { + ret.$isUnderneathDocArray = ret.$isUnderneathDocArray || + !schema.$isSingleNested; + return ret; + } + } + } + return foundschema; + } + } + // look for arrays + const parts = path.split('.'); + for (let i = 0; i < parts.length; ++i) { + if (parts[i] === '$') { + // Re: gh-5628, because `schema.path()` doesn't take $ into account. + parts[i] = '0'; + } + } + return search(parts, schema, doc, []); +}; diff --git a/node_modules/mongoose/lib/helpers/populate/getVirtual.js b/node_modules/mongoose/lib/helpers/populate/getVirtual.js new file mode 100644 index 000000000..fc1641d40 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/getVirtual.js @@ -0,0 +1,72 @@ +'use strict'; + +module.exports = getVirtual; + +/*! + * ignore + */ + +function getVirtual(schema, name) { + if (schema.virtuals[name]) { + return { virtual: schema.virtuals[name], path: void 0 }; + } + + const parts = name.split('.'); + let cur = ''; + let nestedSchemaPath = ''; + for (let i = 0; i < parts.length; ++i) { + cur += (cur.length > 0 ? '.' 
: '') + parts[i]; + if (schema.virtuals[cur]) { + if (i === parts.length - 1) { + return { virtual: schema.virtuals[cur], path: nestedSchemaPath }; + } + continue; + } + + if (schema.nested[cur]) { + continue; + } + + if (schema.paths[cur] && schema.paths[cur].schema) { + schema = schema.paths[cur].schema; + const rest = parts.slice(i + 1).join('.'); + + if (schema.virtuals[rest]) { + if (i === parts.length - 2) { + return { + virtual: schema.virtuals[rest], + nestedSchemaPath: [nestedSchemaPath, cur].filter(v => !!v).join('.') + }; + } + continue; + } + + if (i + 1 < parts.length && schema.discriminators) { + for (const key of Object.keys(schema.discriminators)) { + const res = getVirtual(schema.discriminators[key], rest); + if (res != null) { + const _path = [nestedSchemaPath, cur, res.nestedSchemaPath]. + filter(v => !!v).join('.'); + return { + virtual: res.virtual, + nestedSchemaPath: _path + }; + } + } + } + + nestedSchemaPath += (nestedSchemaPath.length > 0 ? '.' : '') + cur; + cur = ''; + continue; + } + + if (schema.discriminators) { + for (const discriminatorKey of Object.keys(schema.discriminators)) { + const virtualFromDiscriminator = getVirtual(schema.discriminators[discriminatorKey], name); + if (virtualFromDiscriminator) return virtualFromDiscriminator; + } + } + + return null; + } +} diff --git a/node_modules/mongoose/lib/helpers/populate/leanPopulateMap.js b/node_modules/mongoose/lib/helpers/populate/leanPopulateMap.js new file mode 100644 index 000000000..a333124fa --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/leanPopulateMap.js @@ -0,0 +1,7 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = new WeakMap(); \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/lookupLocalFields.js b/node_modules/mongoose/lib/helpers/populate/lookupLocalFields.js new file mode 100644 index 000000000..be553ebf9 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/lookupLocalFields.js @@ -0,0 +1,40 @@ +'use strict'; + +module.exports = function lookupLocalFields(cur, path, val) { + if (cur == null) { + return cur; + } + + if (cur._doc != null) { + cur = cur._doc; + } + + if (arguments.length >= 3) { + if (typeof cur !== 'object') { + return void 0; + } + if (val === void 0) { + return void 0; + } + if (cur instanceof Map) { + cur.set(path, val); + } else { + cur[path] = val; + } + return val; + } + + + // Support populating paths under maps using `map.$*.subpath` + if (path === '$*') { + return cur instanceof Map ? + Array.from(cur.values()) : + Object.keys(cur).map(key => cur[key]); + } + + if (cur instanceof Map) { + return cur.get(path); + } + + return cur[path]; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/markArraySubdocsPopulated.js b/node_modules/mongoose/lib/helpers/populate/markArraySubdocsPopulated.js new file mode 100644 index 000000000..fd7a2dd83 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/markArraySubdocsPopulated.js @@ -0,0 +1,40 @@ +'use strict'; + +/*! + * If populating a path within a document array, make sure each + * subdoc within the array knows its subpaths are populated. 
+ * + * ####Example: + * const doc = await Article.findOne().populate('comments.author'); + * doc.comments[0].populated('author'); // Should be set + */ + +module.exports = function markArraySubdocsPopulated(doc, populated) { + if (doc._id == null || populated == null || populated.length === 0) { + return; + } + + const id = String(doc._id); + for (const item of populated) { + if (item.isVirtual) { + continue; + } + const path = item.path; + const pieces = path.split('.'); + for (let i = 0; i < pieces.length - 1; ++i) { + const subpath = pieces.slice(0, i + 1).join('.'); + const rest = pieces.slice(i + 1).join('.'); + const val = doc.get(subpath); + if (val == null) { + continue; + } + + if (val.isMongooseDocumentArray) { + for (let j = 0; j < val.length; ++j) { + val[j].populated(rest, item._docs[id] == null ? void 0 : item._docs[id][j], item); + } + break; + } + } + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/modelNamesFromRefPath.js b/node_modules/mongoose/lib/helpers/populate/modelNamesFromRefPath.js new file mode 100644 index 000000000..c441f565e --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/modelNamesFromRefPath.js @@ -0,0 +1,65 @@ +'use strict'; + +const MongooseError = require('../../error/mongooseError'); +const isPathExcluded = require('../projection/isPathExcluded'); +const util = require('util'); +const utils = require('../../utils'); + +module.exports = function modelNamesFromRefPath(refPath, doc, populatedPath, modelSchema, queryProjection) { + if (refPath == null) { + return []; + } + + if (typeof refPath === 'string' && queryProjection != null && isPathExcluded(queryProjection, refPath)) { + throw new MongooseError('refPath `' + refPath + '` must not be excluded in projection, got ' + + util.inspect(queryProjection)); + } + + // If populated path has numerics, the end `refPath` should too. For example, + // if populating `a.0.b` instead of `a.b` and `b` has `refPath = a.c`, we + // should return `a.0.c` for the refPath. + const hasNumericProp = /(\.\d+$|\.\d+\.)/g; + + if (hasNumericProp.test(populatedPath)) { + const chunks = populatedPath.split(hasNumericProp); + + if (chunks[chunks.length - 1] === '') { + throw new Error('Can\'t populate individual element in an array'); + } + + let _refPath = ''; + let _remaining = refPath; + // 2nd, 4th, etc. will be numeric props. For example: `[ 'a', '.0.', 'b' ]` + for (let i = 0; i < chunks.length; i += 2) { + const chunk = chunks[i]; + if (_remaining.startsWith(chunk + '.')) { + _refPath += _remaining.substr(0, chunk.length) + chunks[i + 1]; + _remaining = _remaining.substr(chunk.length + 1); + } else if (i === chunks.length - 1) { + _refPath += _remaining; + _remaining = ''; + break; + } else { + throw new Error('Could not normalize ref path, chunk ' + chunk + ' not in populated path'); + } + } + + const refValue = utils.getValue(_refPath, doc); + let modelNames = Array.isArray(refValue) ? refValue : [refValue]; + modelNames = utils.array.flatten(modelNames); + return modelNames; + } + + const refValue = utils.getValue(refPath, doc); + + let modelNames; + if (modelSchema != null && modelSchema.virtuals.hasOwnProperty(refPath)) { + modelNames = [modelSchema.virtuals[refPath].applyGetters(void 0, doc)]; + } else { + modelNames = Array.isArray(refValue) ? 
refValue : [refValue]; + } + + modelNames = utils.array.flatten(modelNames); + + return modelNames; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/removeDeselectedForeignField.js b/node_modules/mongoose/lib/helpers/populate/removeDeselectedForeignField.js new file mode 100644 index 000000000..39b893a9d --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/removeDeselectedForeignField.js @@ -0,0 +1,31 @@ +'use strict'; + +const get = require('../get'); +const mpath = require('mpath'); +const parseProjection = require('../projection/parseProjection'); + +/*! + * ignore + */ + +module.exports = function removeDeselectedForeignField(foreignFields, options, docs) { + const projection = parseProjection(get(options, 'select', null), true) || + parseProjection(get(options, 'options.select', null), true); + + if (projection == null) { + return; + } + for (const foreignField of foreignFields) { + if (!projection.hasOwnProperty('-' + foreignField)) { + continue; + } + + for (const val of docs) { + if (val.$__ != null) { + mpath.unset(foreignField, val._doc); + } else { + mpath.unset(foreignField, val); + } + } + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/populate/validateRef.js b/node_modules/mongoose/lib/helpers/populate/validateRef.js new file mode 100644 index 000000000..9dc2b6fc6 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/populate/validateRef.js @@ -0,0 +1,19 @@ +'use strict'; + +const MongooseError = require('../../error/mongooseError'); +const util = require('util'); + +module.exports = validateRef; + +function validateRef(ref, path) { + if (typeof ref === 'string') { + return; + } + + if (typeof ref === 'function') { + return; + } + + throw new MongooseError('Invalid ref at path "' + path + '". Got ' + + util.inspect(ref, { depth: 0 })); +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/printJestWarning.js b/node_modules/mongoose/lib/helpers/printJestWarning.js new file mode 100644 index 000000000..cc112e2bc --- /dev/null +++ b/node_modules/mongoose/lib/helpers/printJestWarning.js @@ -0,0 +1,17 @@ +'use strict'; + +const utils = require('../utils'); + +if (typeof jest !== 'undefined' && typeof window !== 'undefined') { + utils.warn('Mongoose: looks like you\'re trying to test a Mongoose app ' + + 'with Jest\'s default jsdom test environment. Please make sure you read ' + + 'Mongoose\'s docs on configuring Jest to test Node.js apps: ' + + 'http://mongoosejs.com/docs/jest.html'); +} + +if (typeof jest !== 'undefined' && process.nextTick.toString().indexOf('nextTick') === -1) { + utils.warn('Mongoose: looks like you\'re trying to test a Mongoose app ' + + 'with Jest\'s mock timers enabled. Please make sure you read ' + + 'Mongoose\'s docs on configuring Jest to test Node.js apps: ' + + 'http://mongoosejs.com/docs/jest.html'); +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/processConnectionOptions.js b/node_modules/mongoose/lib/helpers/processConnectionOptions.js new file mode 100644 index 000000000..3dd0c130d --- /dev/null +++ b/node_modules/mongoose/lib/helpers/processConnectionOptions.js @@ -0,0 +1,64 @@ +'use strict'; + +const clone = require('./clone'); +const MongooseError = require('../error/index'); + +function processConnectionOptions(uri, options) { + const opts = options ? options : {}; + const readPreference = opts.readPreference + ? opts.readPreference + : getUriReadPreference(uri); + + const resolvedOpts = readPreference + ? 
resolveOptsConflicts(readPreference, opts) + : opts; + + return clone(resolvedOpts); +} + +function resolveOptsConflicts(pref, opts) { + // don't silently override user-provided indexing options + if (setsIndexOptions(opts) && setsSecondaryRead(pref)) { + throwReadPreferenceError(); + } + + // if user has not explicitly set any auto-indexing options, + // we can silently default them all to false + else { + return defaultIndexOptsToFalse(opts); + } +} + +function setsIndexOptions(opts) { + const configIdx = opts.config && opts.config.autoIndex; + const { autoCreate, autoIndex } = opts; + return !!(configIdx || autoCreate || autoIndex); +} + +function setsSecondaryRead(prefString) { + return !!(prefString === 'secondary' || prefString === 'secondaryPreferred'); +} + +function getUriReadPreference(connectionString) { + const exp = /(?:&|\?)readPreference=(\w+)(?:&|$)/; + const match = exp.exec(connectionString); + return match ? match[1] : null; +} + +function defaultIndexOptsToFalse(opts) { + opts.config = { autoIndex: false }; + opts.autoCreate = false; + opts.autoIndex = false; + return opts; +} + +function throwReadPreferenceError() { + throw new MongooseError( + 'MongoDB prohibits index creation on connections that read from ' + + 'non-primary replicas. Connections that set "readPreference" to "secondary" or ' + + '"secondaryPreferred" may not opt-in to the following connection options: ' + + 'autoCreate, autoIndex' + ); +} + +module.exports = processConnectionOptions; diff --git a/node_modules/mongoose/lib/helpers/projection/isDefiningProjection.js b/node_modules/mongoose/lib/helpers/projection/isDefiningProjection.js new file mode 100644 index 000000000..67dfb39fc --- /dev/null +++ b/node_modules/mongoose/lib/helpers/projection/isDefiningProjection.js @@ -0,0 +1,18 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = function isDefiningProjection(val) { + if (val == null) { + // `undefined` or `null` become exclusive projections + return true; + } + if (typeof val === 'object') { + // Only cases where a value does **not** define whether the whole projection + // is inclusive or exclusive are `$meta` and `$slice`. + return !('$meta' in val) && !('$slice' in val); + } + return true; +}; diff --git a/node_modules/mongoose/lib/helpers/projection/isExclusive.js b/node_modules/mongoose/lib/helpers/projection/isExclusive.js new file mode 100644 index 000000000..abc1bce72 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/projection/isExclusive.js @@ -0,0 +1,32 @@ +'use strict'; + +const isDefiningProjection = require('./isDefiningProjection'); + +/*! + * ignore + */ + +module.exports = function isExclusive(projection) { + if (projection == null) { + return null; + } + + const keys = Object.keys(projection); + let ki = keys.length; + let exclude = null; + + if (ki === 1 && keys[0] === '_id') { + exclude = !projection._id; + } else { + while (ki--) { + // Does this projection explicitly define inclusion/exclusion? + // Explicitly avoid `$meta` and `$slice` + if (keys[ki] !== '_id' && isDefiningProjection(projection[keys[ki]])) { + exclude = !projection[keys[ki]]; + break; + } + } + } + + return exclude; +}; diff --git a/node_modules/mongoose/lib/helpers/projection/isInclusive.js b/node_modules/mongoose/lib/helpers/projection/isInclusive.js new file mode 100644 index 000000000..098309ff3 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/projection/isInclusive.js @@ -0,0 +1,34 @@ +'use strict'; + +const isDefiningProjection = require('./isDefiningProjection'); + +/*! 
+ * ignore + */ + +module.exports = function isInclusive(projection) { + if (projection == null) { + return false; + } + + const props = Object.keys(projection); + const numProps = props.length; + if (numProps === 0) { + return false; + } + + for (let i = 0; i < numProps; ++i) { + const prop = props[i]; + // Plus paths can't define the projection (see gh-7050) + if (prop.startsWith('+')) { + continue; + } + // If field is truthy (1, true, etc.) and not an object, then this + // projection must be inclusive. If object, assume its $meta, $slice, etc. + if (isDefiningProjection(projection[prop]) && !!projection[prop]) { + return true; + } + } + + return false; +}; diff --git a/node_modules/mongoose/lib/helpers/projection/isPathExcluded.js b/node_modules/mongoose/lib/helpers/projection/isPathExcluded.js new file mode 100644 index 000000000..fc2592cda --- /dev/null +++ b/node_modules/mongoose/lib/helpers/projection/isPathExcluded.js @@ -0,0 +1,35 @@ +'use strict'; + +const isDefiningProjection = require('./isDefiningProjection'); + +/*! + * Determines if `path` is excluded by `projection` + * + * @param {Object} projection + * @param {string} path + * @return {Boolean} + */ + +module.exports = function isPathExcluded(projection, path) { + if (path === '_id') { + return projection._id === 0; + } + + const paths = Object.keys(projection); + let type = null; + + for (const _path of paths) { + if (isDefiningProjection(projection[_path])) { + type = projection[path] === 1 ? 'inclusive' : 'exclusive'; + break; + } + } + + if (type === 'inclusive') { + return projection[path] !== 1; + } + if (type === 'exclusive') { + return projection[path] === 0; + } + return false; +}; diff --git a/node_modules/mongoose/lib/helpers/projection/isPathSelectedInclusive.js b/node_modules/mongoose/lib/helpers/projection/isPathSelectedInclusive.js new file mode 100644 index 000000000..8a05fc948 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/projection/isPathSelectedInclusive.js @@ -0,0 +1,28 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = function isPathSelectedInclusive(fields, path) { + const chunks = path.split('.'); + let cur = ''; + let j; + let keys; + let numKeys; + for (let i = 0; i < chunks.length; ++i) { + cur += cur.length ? '.' : '' + chunks[i]; + if (fields[cur]) { + keys = Object.keys(fields); + numKeys = keys.length; + for (j = 0; j < numKeys; ++j) { + if (keys[i].indexOf(cur + '.') === 0 && keys[i].indexOf(path) !== 0) { + continue; + } + } + return true; + } + } + + return false; +}; diff --git a/node_modules/mongoose/lib/helpers/projection/parseProjection.js b/node_modules/mongoose/lib/helpers/projection/parseProjection.js new file mode 100644 index 000000000..d2a44b106 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/projection/parseProjection.js @@ -0,0 +1,33 @@ +'use strict'; + +/** + * Convert a string or array into a projection object, retaining all + * `-` and `+` paths. + */ + +module.exports = function parseProjection(v, retainMinusPaths) { + const type = typeof v; + + if (type === 'string') { + v = v.split(/\s+/); + } + if (!Array.isArray(v) && Object.prototype.toString.call(v) !== '[object Arguments]') { + return v; + } + + const len = v.length; + const ret = {}; + for (let i = 0; i < len; ++i) { + let field = v[i]; + if (!field) { + continue; + } + const include = '-' == field[0] ? 
0 : 1; + if (!retainMinusPaths && include === 0) { + field = field.substring(1); + } + ret[field] = include; + } + + return ret; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/promiseOrCallback.js b/node_modules/mongoose/lib/helpers/promiseOrCallback.js new file mode 100644 index 000000000..69ba38530 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/promiseOrCallback.js @@ -0,0 +1,46 @@ +'use strict'; + +const PromiseProvider = require('../promise_provider'); +const immediate = require('./immediate'); + +const emittedSymbol = Symbol('mongoose:emitted'); + +module.exports = function promiseOrCallback(callback, fn, ee, Promise) { + if (typeof callback === 'function') { + return fn(function(error) { + if (error != null) { + if (ee != null && ee.listeners != null && ee.listeners('error').length > 0 && !error[emittedSymbol]) { + error[emittedSymbol] = true; + ee.emit('error', error); + } + try { + callback(error); + } catch (error) { + return immediate(() => { + throw error; + }); + } + return; + } + callback.apply(this, arguments); + }); + } + + Promise = Promise || PromiseProvider.get(); + + return new Promise((resolve, reject) => { + fn(function(error, res) { + if (error != null) { + if (ee != null && ee.listeners != null && ee.listeners('error').length > 0 && !error[emittedSymbol]) { + error[emittedSymbol] = true; + ee.emit('error', error); + } + return reject(error); + } + if (arguments.length > 2) { + return resolve(Array.prototype.slice.call(arguments, 1)); + } + resolve(res); + }); + }); +}; diff --git a/node_modules/mongoose/lib/helpers/query/applyGlobalMaxTimeMS.js b/node_modules/mongoose/lib/helpers/query/applyGlobalMaxTimeMS.js new file mode 100644 index 000000000..cb4926055 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/applyGlobalMaxTimeMS.js @@ -0,0 +1,15 @@ +'use strict'; + +const utils = require('../../utils'); + +module.exports = function applyGlobalMaxTimeMS(options, model) { + if (utils.hasUserDefinedProperty(options, 'maxTimeMS')) { + return; + } + + if (utils.hasUserDefinedProperty(model.db.options, 'maxTimeMS')) { + options.maxTimeMS = model.db.options.maxTimeMS; + } else if (utils.hasUserDefinedProperty(model.base.options, 'maxTimeMS')) { + options.maxTimeMS = model.base.options.maxTimeMS; + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/applyQueryMiddleware.js b/node_modules/mongoose/lib/helpers/query/applyQueryMiddleware.js new file mode 100644 index 000000000..99e096b19 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/applyQueryMiddleware.js @@ -0,0 +1,78 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = applyQueryMiddleware; + +const validOps = require('./validOps'); + +/*! + * ignore + */ + +applyQueryMiddleware.middlewareFunctions = validOps.concat([ + 'validate' +]); + +/*! 
+ * Apply query middleware + * + * @param {Query} query constructor + * @param {Model} model + */ + +function applyQueryMiddleware(Query, model) { + const kareemOptions = { + useErrorHandlers: true, + numCallbackParams: 1, + nullResultByDefault: true + }; + + const middleware = model.hooks.filter(hook => { + const contexts = _getContexts(hook); + if (hook.name === 'updateOne') { + return contexts.query == null || !!contexts.query; + } + if (hook.name === 'deleteOne') { + return !!contexts.query || Object.keys(contexts).length === 0; + } + if (hook.name === 'validate' || hook.name === 'remove') { + return !!contexts.query; + } + if (hook.query != null || hook.document != null) { + return !!hook.query; + } + return true; + }); + + // `update()` thunk has a different name because `_update` was already taken + Query.prototype._execUpdate = middleware.createWrapper('update', + Query.prototype._execUpdate, null, kareemOptions); + // `distinct()` thunk has a different name because `_distinct` was already taken + Query.prototype.__distinct = middleware.createWrapper('distinct', + Query.prototype.__distinct, null, kareemOptions); + + // `validate()` doesn't have a thunk because it doesn't execute a query. + Query.prototype.validate = middleware.createWrapper('validate', + Query.prototype.validate, null, kareemOptions); + + applyQueryMiddleware.middlewareFunctions. + filter(v => v !== 'update' && v !== 'distinct' && v !== 'validate'). + forEach(fn => { + Query.prototype[`_${fn}`] = middleware.createWrapper(fn, + Query.prototype[`_${fn}`], null, kareemOptions); + }); +} + +function _getContexts(hook) { + const ret = {}; + if (hook.hasOwnProperty('query')) { + ret.query = hook.query; + } + if (hook.hasOwnProperty('document')) { + ret.document = hook.document; + } + return ret; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/castFilterPath.js b/node_modules/mongoose/lib/helpers/query/castFilterPath.js new file mode 100644 index 000000000..42b8460ce --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/castFilterPath.js @@ -0,0 +1,55 @@ +'use strict'; + +const isOperator = require('./isOperator'); + +module.exports = function castFilterPath(query, schematype, val) { + const ctx = query; + const any$conditionals = Object.keys(val).some(isOperator); + + if (!any$conditionals) { + return schematype.castForQueryWrapper({ + val: val, + context: ctx + }); + } + + const ks = Object.keys(val); + + let k = ks.length; + + while (k--) { + const $cond = ks[k]; + const nested = val[$cond]; + + if ($cond === '$not') { + if (nested && schematype && !schematype.caster) { + const _keys = Object.keys(nested); + if (_keys.length && isOperator(_keys[0])) { + for (const key of Object.keys(nested)) { + nested[key] = schematype.castForQueryWrapper({ + $conditional: key, + val: nested[key], + context: ctx + }); + } + } else { + val[$cond] = schematype.castForQueryWrapper({ + $conditional: $cond, + val: nested, + context: ctx + }); + } + continue; + } + // cast(schematype.caster ? 
schematype.caster.schema : schema, nested, options, context); + } else { + val[$cond] = schematype.castForQueryWrapper({ + $conditional: $cond, + val: nested, + context: ctx + }); + } + } + + return val; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/castUpdate.js b/node_modules/mongoose/lib/helpers/query/castUpdate.js new file mode 100644 index 000000000..21b7f947c --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/castUpdate.js @@ -0,0 +1,532 @@ +'use strict'; + +const CastError = require('../../error/cast'); +const MongooseError = require('../../error/mongooseError'); +const StrictModeError = require('../../error/strict'); +const ValidationError = require('../../error/validation'); +const castNumber = require('../../cast/number'); +const cast = require('../../cast'); +const getConstructorName = require('../getConstructorName'); +const getEmbeddedDiscriminatorPath = require('./getEmbeddedDiscriminatorPath'); +const handleImmutable = require('./handleImmutable'); +const moveImmutableProperties = require('../update/moveImmutableProperties'); +const schemaMixedSymbol = require('../../schema/symbols').schemaMixedSymbol; +const setDottedPath = require('../path/setDottedPath'); +const utils = require('../../utils'); + +/*! + * Casts an update op based on the given schema + * + * @param {Schema} schema + * @param {Object} obj + * @param {Object} options + * @param {Boolean} [options.overwrite] defaults to false + * @param {Boolean|String} [options.strict] defaults to true + * @param {Query} context passed to setters + * @return {Boolean} true iff the update is non-empty + */ +module.exports = function castUpdate(schema, obj, options, context, filter) { + if (obj == null) { + return undefined; + } + options = options || {}; + // Update pipeline + if (Array.isArray(obj)) { + const len = obj.length; + for (let i = 0; i < len; ++i) { + const ops = Object.keys(obj[i]); + for (const op of ops) { + obj[i][op] = castPipelineOperator(op, obj[i][op]); + } + } + return obj; + } + if (schema.options.strict === 'throw' && obj.hasOwnProperty(schema.options.discriminatorKey)) { + throw new StrictModeError(schema.options.discriminatorKey); + } else if (context._mongooseOptions != null && !context._mongooseOptions.overwriteDiscriminatorKey) { + delete obj[schema.options.discriminatorKey]; + } + if (options.upsert) { + moveImmutableProperties(schema, obj, context); + } + + const ops = Object.keys(obj); + let i = ops.length; + const ret = {}; + let val; + let hasDollarKey = false; + const overwrite = options.overwrite; + + filter = filter || {}; + while (i--) { + const op = ops[i]; + // if overwrite is set, don't do any of the special $set stuff + if (op[0] !== '$' && !overwrite) { + // fix up $set sugar + if (!ret.$set) { + if (obj.$set) { + ret.$set = obj.$set; + } else { + ret.$set = {}; + } + } + ret.$set[op] = obj[op]; + ops.splice(i, 1); + if (!~ops.indexOf('$set')) ops.push('$set'); + } else if (op === '$set') { + if (!ret.$set) { + ret[op] = obj[op]; + } + } else { + ret[op] = obj[op]; + } + } + // cast each value + i = ops.length; + while (i--) { + const op = ops[i]; + val = ret[op]; + hasDollarKey = hasDollarKey || op.startsWith('$'); + + if (val && + typeof val === 'object' && + !Buffer.isBuffer(val) && + (!overwrite || hasDollarKey)) { + walkUpdatePath(schema, val, op, options, context, filter); + } else if (overwrite && ret && typeof ret === 'object') { + walkUpdatePath(schema, ret, '$set', options, context, filter); + } else { + const msg = 'Invalid 
atomic update value for ' + op + '. ' + + 'Expected an object, received ' + typeof val; + throw new Error(msg); + } + + if (op.startsWith('$') && utils.isEmptyObject(val)) { + delete ret[op]; + } + } + + if (Object.keys(ret).length === 0 && + options.upsert && + Object.keys(filter).length > 0) { + // Trick the driver into allowing empty upserts to work around + // https://github.com/mongodb/node-mongodb-native/pull/2490 + return { $setOnInsert: filter }; + } + return ret; +}; + +/*! + * ignore + */ + +function castPipelineOperator(op, val) { + if (op === '$unset') { + if (!Array.isArray(val) || val.find(v => typeof v !== 'string')) { + throw new MongooseError('Invalid $unset in pipeline, must be ' + + 'an array of strings'); + } + return val; + } + if (op === '$project') { + if (val == null || typeof val !== 'object') { + throw new MongooseError('Invalid $project in pipeline, must be an object'); + } + return val; + } + if (op === '$addFields' || op === '$set') { + if (val == null || typeof val !== 'object') { + throw new MongooseError('Invalid ' + op + ' in pipeline, must be an object'); + } + return val; + } else if (op === '$replaceRoot' || op === '$replaceWith') { + if (val == null || typeof val !== 'object') { + throw new MongooseError('Invalid ' + op + ' in pipeline, must be an object'); + } + return val; + } + + throw new MongooseError('Invalid update pipeline operator: "' + op + '"'); +} + +/*! + * Walk each path of obj and cast its values + * according to its schema. + * + * @param {Schema} schema + * @param {Object} obj - part of a query + * @param {String} op - the atomic operator ($pull, $set, etc) + * @param {Object} options + * @param {Boolean|String} [options.strict] + * @param {Query} context + * @param {String} pref - path prefix (internal only) + * @return {Bool} true if this path has keys to update + * @api private + */ + +function walkUpdatePath(schema, obj, op, options, context, filter, pref) { + const strict = options.strict; + const prefix = pref ? pref + '.' : ''; + const keys = Object.keys(obj); + let i = keys.length; + let hasKeys = false; + let schematype; + let key; + let val; + + let aggregatedError = null; + + while (i--) { + key = keys[i]; + val = obj[key]; + + // `$pull` is special because we need to cast the RHS as a query, not as + // an update. 
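+ // For example (hypothetical update): `{ $pull: { comments: { flagged: true } } }` casts
+ // `{ flagged: true }` against the `comments` subdocument schema as a query filter below,
+ // not as an update value.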
+ if (op === '$pull') { + schematype = schema._getSchema(prefix + key); + if (schematype != null && schematype.schema != null) { + obj[key] = cast(schematype.schema, obj[key], options, context); + hasKeys = true; + continue; + } + } + + if (getConstructorName(val) === 'Object') { + // watch for embedded doc schemas + schematype = schema._getSchema(prefix + key); + + if (schematype == null) { + const _res = getEmbeddedDiscriminatorPath(schema, obj, filter, prefix + key, options); + if (_res.schematype != null) { + schematype = _res.schematype; + } + } + + if (op !== '$setOnInsert' && + handleImmutable(schematype, strict, obj, key, prefix + key, context)) { + continue; + } + + if (schematype && schematype.caster && op in castOps) { + // embedded doc schema + if ('$each' in val) { + hasKeys = true; + try { + obj[key] = { + $each: castUpdateVal(schematype, val.$each, op, key, context, prefix + key) + }; + } catch (error) { + aggregatedError = _handleCastError(error, context, key, aggregatedError); + } + + if (val.$slice != null) { + obj[key].$slice = val.$slice | 0; + } + + if (val.$sort) { + obj[key].$sort = val.$sort; + } + + if (val.$position != null) { + obj[key].$position = castNumber(val.$position); + } + } else { + if (schematype != null && schematype.$isSingleNested) { + const _strict = strict == null ? schematype.schema.options.strict : strict; + try { + obj[key] = schematype.castForQuery(val, context, { strict: _strict }); + } catch (error) { + aggregatedError = _handleCastError(error, context, key, aggregatedError); + } + } else { + try { + obj[key] = castUpdateVal(schematype, val, op, key, context, prefix + key); + } catch (error) { + aggregatedError = _handleCastError(error, context, key, aggregatedError); + } + } + + if (obj[key] === void 0) { + delete obj[key]; + continue; + } + + hasKeys = true; + } + } else if ((op === '$currentDate') || (op in castOps && schematype)) { + // $currentDate can take an object + try { + obj[key] = castUpdateVal(schematype, val, op, key, context, prefix + key); + } catch (error) { + aggregatedError = _handleCastError(error, context, key, aggregatedError); + } + + if (obj[key] === void 0) { + delete obj[key]; + continue; + } + + hasKeys = true; + } else { + const pathToCheck = (prefix + key); + const v = schema._getPathType(pathToCheck); + let _strict = strict; + if (v && v.schema && _strict == null) { + _strict = v.schema.options.strict; + } + + if (v.pathType === 'undefined') { + if (_strict === 'throw') { + throw new StrictModeError(pathToCheck); + } else if (_strict) { + delete obj[key]; + continue; + } + } + + // gh-2314 + // we should be able to set a schema-less field + // to an empty object literal + hasKeys |= walkUpdatePath(schema, val, op, options, context, filter, prefix + key) || + (utils.isObject(val) && Object.keys(val).length === 0); + } + } else { + const checkPath = (key === '$each' || key === '$or' || key === '$and' || key === '$in') ? + pref : prefix + key; + schematype = schema._getSchema(checkPath); + + // You can use `$setOnInsert` with immutable keys + if (op !== '$setOnInsert' && + handleImmutable(schematype, strict, obj, key, prefix + key, context)) { + continue; + } + + let pathDetails = schema._getPathType(checkPath); + + // If no schema type, check for embedded discriminators because the + // filter or update may imply an embedded discriminator type. 
See #8378 + if (schematype == null) { + const _res = getEmbeddedDiscriminatorPath(schema, obj, filter, checkPath, options); + if (_res.schematype != null) { + schematype = _res.schematype; + pathDetails = _res.type; + } + } + + let isStrict = strict; + if (pathDetails && pathDetails.schema && strict == null) { + isStrict = pathDetails.schema.options.strict; + } + + const skip = isStrict && + !schematype && + !/real|nested/.test(pathDetails.pathType); + + if (skip) { + // Even if strict is `throw`, avoid throwing an error because of + // virtuals because of #6731 + if (isStrict === 'throw' && schema.virtuals[checkPath] == null) { + throw new StrictModeError(prefix + key); + } else { + delete obj[key]; + } + } else { + // gh-1845 temporary fix: ignore $rename. See gh-3027 for tracking + // improving this. + if (op === '$rename') { + hasKeys = true; + continue; + } + + try { + if (prefix.length === 0 || key.indexOf('.') === -1) { + obj[key] = castUpdateVal(schematype, val, op, key, context, prefix + key); + } else { + // Setting a nested dotted path that's in the schema. We don't allow paths with '.' in + // a schema, so replace the dotted path with a nested object to avoid ending up with + // dotted properties in the updated object. See (gh-10200) + setDottedPath(obj, key, castUpdateVal(schematype, val, op, key, context, prefix + key)); + delete obj[key]; + } + } catch (error) { + aggregatedError = _handleCastError(error, context, key, aggregatedError); + } + + if (Array.isArray(obj[key]) && (op === '$addToSet' || op === '$push') && key !== '$each') { + if (schematype && schematype.caster && !schematype.caster.$isMongooseArray) { + obj[key] = { $each: obj[key] }; + } + } + + if (obj[key] === void 0) { + delete obj[key]; + continue; + } + + hasKeys = true; + } + } + } + + if (aggregatedError != null) { + throw aggregatedError; + } + + return hasKeys; +} + +/*! + * ignore + */ + +function _handleCastError(error, query, key, aggregatedError) { + if (typeof query !== 'object' || !query.options.multipleCastError) { + throw error; + } + aggregatedError = aggregatedError || new ValidationError(); + aggregatedError.addError(key, error); + return aggregatedError; +} + +/*! + * These operators should be cast to numbers instead + * of their path schema type. + */ + +const numberOps = { + $pop: 1, + $inc: 1 +}; + +/*! + * These ops require no casting because the RHS doesn't do anything. + */ + +const noCastOps = { + $unset: 1 +}; + +/*! + * These operators require casting docs + * to real Documents for Update operations. + */ + +const castOps = { + $push: 1, + $addToSet: 1, + $set: 1, + $setOnInsert: 1 +}; + +/*! + * ignore + */ + +const overwriteOps = { + $set: 1, + $setOnInsert: 1 +}; + +/*! + * Casts `val` according to `schema` and atomic `op`. + * + * @param {SchemaType} schema + * @param {Object} val + * @param {String} op - the atomic operator ($pull, $set, etc) + * @param {String} $conditional + * @param {Query} context + * @api private + */ + +function castUpdateVal(schema, val, op, $conditional, context, path) { + if (!schema) { + // non-existing schema path + if (op in numberOps) { + try { + return castNumber(val); + } catch (err) { + throw new CastError('number', val, path); + } + } + return val; + } + + const cond = schema.caster && op in castOps && + (utils.isObject(val) || Array.isArray(val)); + if (cond && !overwriteOps[op]) { + // Cast values for ops that add data to MongoDB. + // Ensures embedded documents get ObjectIds etc. 
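+ // Wrap `val` in arrays until it matches the schema's array nesting depth so
+ // `applySetters()` runs per element, then unwrap back to the original depth.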
+ let schemaArrayDepth = 0; + let cur = schema; + while (cur.$isMongooseArray) { + ++schemaArrayDepth; + cur = cur.caster; + } + let arrayDepth = 0; + let _val = val; + while (Array.isArray(_val)) { + ++arrayDepth; + _val = _val[0]; + } + + const additionalNesting = schemaArrayDepth - arrayDepth; + while (arrayDepth < schemaArrayDepth) { + val = [val]; + ++arrayDepth; + } + + let tmp = schema.applySetters(Array.isArray(val) ? val : [val], context); + + for (let i = 0; i < additionalNesting; ++i) { + tmp = tmp[0]; + } + return tmp; + } + + if (op in noCastOps) { + return val; + } + if (op in numberOps) { + // Null and undefined not allowed for $pop, $inc + if (val == null) { + throw new CastError('number', val, schema.path); + } + if (op === '$inc') { + // Support `$inc` with long, int32, etc. (gh-4283) + return schema.castForQueryWrapper({ + val: val, + context: context + }); + } + try { + return castNumber(val); + } catch (error) { + throw new CastError('number', val, schema.path); + } + } + if (op === '$currentDate') { + if (typeof val === 'object') { + return { $type: val.$type }; + } + return Boolean(val); + } + + if (/^\$/.test($conditional)) { + return schema.castForQueryWrapper({ + $conditional: $conditional, + val: val, + context: context + }); + } + + if (overwriteOps[op]) { + return schema.castForQueryWrapper({ + val: val, + context: context, + $skipQueryCastForUpdate: val != null && schema.$isMongooseArray && schema.$fullPath != null && !schema.$fullPath.match(/\d+$/), + $applySetters: schema[schemaMixedSymbol] != null + }); + } + + return schema.castForQueryWrapper({ val: val, context: context }); +} diff --git a/node_modules/mongoose/lib/helpers/query/completeMany.js b/node_modules/mongoose/lib/helpers/query/completeMany.js new file mode 100644 index 000000000..b52da432a --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/completeMany.js @@ -0,0 +1,51 @@ +'use strict'; + +const helpers = require('../../queryhelpers'); +const immediate = require('../immediate'); + +module.exports = completeMany; + +/*! + * Given a model and an array of docs, hydrates all the docs to be instances + * of the model. 
Used to initialize docs returned from the db from `find()` + * + * @param {Model} model + * @param {Array} docs + * @param {Object} fields the projection used, including `select` from schemas + * @param {Object} userProvidedFields the user-specified projection + * @param {Object} opts + * @param {Array} [opts.populated] + * @param {ClientSession} [opts.session] + * @param {Function} callback + */ + +function completeMany(model, docs, fields, userProvidedFields, opts, callback) { + const arr = []; + let count = docs.length; + const len = count; + let error = null; + + function init(_error) { + if (_error != null) { + error = error || _error; + } + if (error != null) { + --count || immediate(() => callback(error)); + return; + } + --count || immediate(() => callback(error, arr)); + } + + for (let i = 0; i < len; ++i) { + arr[i] = helpers.createModel(model, docs[i], fields, userProvidedFields); + try { + arr[i].$init(docs[i], opts, init); + } catch (error) { + init(error); + } + + if (opts.session != null) { + arr[i].$session(opts.session); + } + } +} diff --git a/node_modules/mongoose/lib/helpers/query/getEmbeddedDiscriminatorPath.js b/node_modules/mongoose/lib/helpers/query/getEmbeddedDiscriminatorPath.js new file mode 100644 index 000000000..2f3f39d97 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/getEmbeddedDiscriminatorPath.js @@ -0,0 +1,84 @@ +'use strict'; + +const cleanPositionalOperators = require('../schema/cleanPositionalOperators'); +const get = require('../get'); +const getDiscriminatorByValue = require('../discriminator/getDiscriminatorByValue'); +const updatedPathsByArrayFilter = require('../update/updatedPathsByArrayFilter'); + +/*! + * Like `schema.path()`, except with a document, because impossible to + * determine path type without knowing the embedded discriminator key. + */ + +module.exports = function getEmbeddedDiscriminatorPath(schema, update, filter, path, options) { + const parts = path.split('.'); + let schematype = null; + let type = 'adhocOrUndefined'; + + filter = filter || {}; + update = update || {}; + const arrayFilters = options != null && Array.isArray(options.arrayFilters) ? + options.arrayFilters : []; + const updatedPathsByFilter = updatedPathsByArrayFilter(update); + + for (let i = 0; i < parts.length; ++i) { + const subpath = cleanPositionalOperators(parts.slice(0, i + 1).join('.')); + schematype = schema.path(subpath); + if (schematype == null) { + continue; + } + + type = schema.pathType(subpath); + if ((schematype.$isSingleNested || schematype.$isMongooseDocumentArrayElement) && + schematype.schema.discriminators != null) { + const key = get(schematype, 'schema.options.discriminatorKey'); + const discriminatorValuePath = subpath + '.' + key; + const discriminatorFilterPath = + discriminatorValuePath.replace(/\.\d+\./, '.'); + let discriminatorKey = null; + + if (discriminatorValuePath in filter) { + discriminatorKey = filter[discriminatorValuePath]; + } + if (discriminatorFilterPath in filter) { + discriminatorKey = filter[discriminatorFilterPath]; + } + + const wrapperPath = subpath.replace(/\.\d+$/, ''); + if (schematype.$isMongooseDocumentArrayElement && + get(filter[wrapperPath], '$elemMatch.' + key) != null) { + discriminatorKey = filter[wrapperPath].$elemMatch[key]; + } + + if (discriminatorValuePath in update) { + discriminatorKey = update[discriminatorValuePath]; + } + + for (const filterKey of Object.keys(updatedPathsByFilter)) { + const schemaKey = updatedPathsByFilter[filterKey] + '.' 
+ key; + const arrayFilterKey = filterKey + '.' + key; + if (schemaKey === discriminatorFilterPath) { + const filter = arrayFilters.find(filter => filter.hasOwnProperty(arrayFilterKey)); + if (filter != null) { + discriminatorKey = filter[arrayFilterKey]; + } + } + } + + if (discriminatorKey == null) { + continue; + } + + const discriminatorSchema = getDiscriminatorByValue(schematype.caster.discriminators, discriminatorKey).schema; + + const rest = parts.slice(i + 1).join('.'); + schematype = discriminatorSchema.path(rest); + if (schematype != null) { + type = discriminatorSchema._getPathType(rest); + break; + } + } + } + + return { type: type, schematype: schematype }; +}; diff --git a/node_modules/mongoose/lib/helpers/query/handleImmutable.js b/node_modules/mongoose/lib/helpers/query/handleImmutable.js new file mode 100644 index 000000000..22adb3c50 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/handleImmutable.js @@ -0,0 +1,28 @@ +'use strict'; + +const StrictModeError = require('../../error/strict'); + +module.exports = function handleImmutable(schematype, strict, obj, key, fullPath, ctx) { + if (schematype == null || !schematype.options || !schematype.options.immutable) { + return false; + } + let immutable = schematype.options.immutable; + + if (typeof immutable === 'function') { + immutable = immutable.call(ctx, ctx); + } + if (!immutable) { + return false; + } + + if (strict === false) { + return false; + } + if (strict === 'throw') { + throw new StrictModeError(null, + `Field ${fullPath} is immutable and strict = 'throw'`); + } + + delete obj[key]; + return true; +}; diff --git a/node_modules/mongoose/lib/helpers/query/hasDollarKeys.js b/node_modules/mongoose/lib/helpers/query/hasDollarKeys.js new file mode 100644 index 000000000..09c40d775 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/hasDollarKeys.js @@ -0,0 +1,19 @@ +'use strict'; + +/*! 
+ * ignore + */ + +module.exports = function(obj) { + if (obj == null || typeof obj !== 'object') { + return false; + } + const keys = Object.keys(obj); + const len = keys.length; + for (let i = 0; i < len; ++i) { + if (keys[i].startsWith('$')) { + return true; + } + } + return false; +}; diff --git a/node_modules/mongoose/lib/helpers/query/isOperator.js b/node_modules/mongoose/lib/helpers/query/isOperator.js new file mode 100644 index 000000000..3b9813920 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/isOperator.js @@ -0,0 +1,11 @@ +'use strict'; + +const specialKeys = new Set([ + '$ref', + '$id', + '$db' +]); + +module.exports = function isOperator(path) { + return path.startsWith('$') && !specialKeys.has(path); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/sanitizeFilter.js b/node_modules/mongoose/lib/helpers/query/sanitizeFilter.js new file mode 100644 index 000000000..4177a0c68 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/sanitizeFilter.js @@ -0,0 +1,38 @@ +'use strict'; + +const hasDollarKeys = require('./hasDollarKeys'); +const { trustedSymbol } = require('./trusted'); + +module.exports = function sanitizeFilter(filter) { + if (filter == null || typeof filter !== 'object') { + return filter; + } + if (Array.isArray(filter)) { + for (const subfilter of filter) { + sanitizeFilter(subfilter); + } + return filter; + } + + const filterKeys = Object.keys(filter); + for (const key of filterKeys) { + const value = filter[key]; + if (value != null && value[trustedSymbol]) { + continue; + } + if (key === '$and' || key === '$or') { + sanitizeFilter(value); + continue; + } + + if (hasDollarKeys(value)) { + const keys = Object.keys(value); + if (keys.length === 1 && keys[0] === '$eq') { + continue; + } + filter[key] = { $eq: filter[key] }; + } + } + + return filter; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/sanitizeProjection.js b/node_modules/mongoose/lib/helpers/query/sanitizeProjection.js new file mode 100644 index 000000000..dcb61fba9 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/sanitizeProjection.js @@ -0,0 +1,14 @@ +'use strict'; + +module.exports = function sanitizeProjection(projection) { + if (projection == null) { + return; + } + + const keys = Object.keys(projection); + for (let i = 0; i < keys.length; ++i) { + if (typeof projection[keys[i]] === 'string') { + projection[keys[i]] = 1; + } + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/selectPopulatedFields.js b/node_modules/mongoose/lib/helpers/query/selectPopulatedFields.js new file mode 100644 index 000000000..92f1d87e0 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/selectPopulatedFields.js @@ -0,0 +1,49 @@ +'use strict'; + +const isExclusive = require('../projection/isExclusive'); +const isInclusive = require('../projection/isInclusive'); + +/*! + * ignore + */ + +module.exports = function selectPopulatedFields(fields, userProvidedFields, populateOptions) { + if (populateOptions == null) { + return; + } + + const paths = Object.keys(populateOptions); + userProvidedFields = userProvidedFields || {}; + if (isInclusive(fields)) { + for (const path of paths) { + if (!isPathInFields(userProvidedFields, path)) { + fields[path] = 1; + } else if (userProvidedFields[path] === 0) { + delete fields[path]; + } + } + } else if (isExclusive(fields)) { + for (const path of paths) { + if (userProvidedFields[path] == null) { + delete fields[path]; + } + } + } +}; + +/*! 
+ * ignore + */ + +function isPathInFields(userProvidedFields, path) { + const pieces = path.split('.'); + const len = pieces.length; + let cur = pieces[0]; + for (let i = 1; i < len; ++i) { + if (userProvidedFields[cur] != null || userProvidedFields[cur + '.$'] != null) { + return true; + } + cur += '.' + pieces[i]; + } + return userProvidedFields[cur] != null || userProvidedFields[cur + '.$'] != null; +} diff --git a/node_modules/mongoose/lib/helpers/query/trusted.js b/node_modules/mongoose/lib/helpers/query/trusted.js new file mode 100644 index 000000000..98c6054ad --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/trusted.js @@ -0,0 +1,13 @@ +'use strict'; + +const trustedSymbol = Symbol('mongoose#trustedSymbol'); + +exports.trustedSymbol = trustedSymbol; + +exports.trusted = function trusted(obj) { + if (obj == null || typeof obj !== 'object') { + return obj; + } + obj[trustedSymbol] = true; + return obj; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/validOps.js b/node_modules/mongoose/lib/helpers/query/validOps.js new file mode 100644 index 000000000..c0ac397f2 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/validOps.js @@ -0,0 +1,24 @@ +'use strict'; + +module.exports = Object.freeze([ + // Read + 'count', + 'countDocuments', + 'distinct', + 'estimatedDocumentCount', + 'find', + 'findOne', + // Update + 'findOneAndReplace', + 'findOneAndUpdate', + 'replaceOne', + 'update', + 'updateMany', + 'updateOne', + // Delete + 'deleteMany', + 'deleteOne', + 'findOneAndDelete', + 'findOneAndRemove', + 'remove' +]); \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/query/wrapThunk.js b/node_modules/mongoose/lib/helpers/query/wrapThunk.js new file mode 100644 index 000000000..daa4681ef --- /dev/null +++ b/node_modules/mongoose/lib/helpers/query/wrapThunk.js @@ -0,0 +1,29 @@ +'use strict'; + +const MongooseError = require('../../error/mongooseError'); + +/*! + * A query thunk is the function responsible for sending the query to MongoDB, + * like `Query#_findOne()` or `Query#_execUpdate()`. The `Query#exec()` function + * calls a thunk. The term "thunk" here is the traditional Node.js definition: + * a function that takes exactly 1 parameter, a callback. + * + * This function defines common behavior for all query thunks. 
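+ * Currently the shared behavior is guarding against re-execution: calling a
+ * thunk on a query that has already run passes a "Query was already executed"
+ * error to the callback.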
+ */ + +module.exports = function wrapThunk(fn) { + return function _wrappedThunk(cb) { + if (this._executionStack != null) { + let str = this.toString(); + if (str.length > 60) { + str = str.slice(0, 60) + '...'; + } + const err = new MongooseError('Query was already executed: ' + str); + err.originalStack = this._executionStack.stack; + return cb(err); + } + this._executionStack = new Error(); + + fn.call(this, cb); + }; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/addAutoId.js b/node_modules/mongoose/lib/helpers/schema/addAutoId.js new file mode 100644 index 000000000..11a1f2302 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/addAutoId.js @@ -0,0 +1,7 @@ +'use strict'; + +module.exports = function addAutoId(schema) { + const _obj = { _id: { auto: true } }; + _obj._id[schema.options.typeKey] = 'ObjectId'; + schema.add(_obj); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/applyPlugins.js b/node_modules/mongoose/lib/helpers/schema/applyPlugins.js new file mode 100644 index 000000000..f1daf4012 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/applyPlugins.js @@ -0,0 +1,44 @@ +'use strict'; + +module.exports = function applyPlugins(schema, plugins, options, cacheKey) { + if (schema[cacheKey]) { + return; + } + schema[cacheKey] = true; + + if (!options || !options.skipTopLevel) { + for (const plugin of plugins) { + schema.plugin(plugin[0], plugin[1]); + } + } + + options = Object.assign({}, options); + delete options.skipTopLevel; + + if (options.applyPluginsToChildSchemas !== false) { + for (const path of Object.keys(schema.paths)) { + const type = schema.paths[path]; + if (type.schema != null) { + applyPlugins(type.schema, plugins, options, cacheKey); + + // Recompile schema because plugins may have changed it, see gh-7572 + type.caster.prototype.$__setSchema(type.schema); + } + } + } + + const discriminators = schema.discriminators; + if (discriminators == null) { + return; + } + + const applyPluginsToDiscriminators = options.applyPluginsToDiscriminators; + + const keys = Object.keys(discriminators); + for (const discriminatorKey of keys) { + const discriminatorSchema = discriminators[discriminatorKey]; + + applyPlugins(discriminatorSchema, plugins, + { skipTopLevel: !applyPluginsToDiscriminators }, cacheKey); + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/applyWriteConcern.js b/node_modules/mongoose/lib/helpers/schema/applyWriteConcern.js new file mode 100644 index 000000000..4095bd94b --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/applyWriteConcern.js @@ -0,0 +1,30 @@ +'use strict'; + +const get = require('../get'); + +module.exports = function applyWriteConcern(schema, options) { + const writeConcern = get(schema, 'options.writeConcern', {}); + if (Object.keys(writeConcern).length != 0) { + options.writeConcern = {}; + if (!('w' in options) && writeConcern.w != null) { + options.writeConcern.w = writeConcern.w; + } + if (!('j' in options) && writeConcern.j != null) { + options.writeConcern.j = writeConcern.j; + } + if (!('wtimeout' in options) && writeConcern.wtimeout != null) { + options.writeConcern.wtimeout = writeConcern.wtimeout; + } + } + else { + if (!('w' in options) && writeConcern.w != null) { + options.w = writeConcern.w; + } + if (!('j' in options) && writeConcern.j != null) { + options.j = writeConcern.j; + } + if (!('wtimeout' in options) && writeConcern.wtimeout != null) { + options.wtimeout = 
writeConcern.wtimeout; + } + } +}; diff --git a/node_modules/mongoose/lib/helpers/schema/cleanPositionalOperators.js b/node_modules/mongoose/lib/helpers/schema/cleanPositionalOperators.js new file mode 100644 index 000000000..905bb7882 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/cleanPositionalOperators.js @@ -0,0 +1,12 @@ +'use strict'; + +/** + * For consistency's sake, we replace positional operator `$` and array filters + * `$[]` and `$[foo]` with `0` when looking up schema paths. + */ + +module.exports = function cleanPositionalOperators(path) { + return path. + replace(/\.\$(\[[^\]]*\])?(?=\.)/g, '.0'). + replace(/\.\$(\[[^\]]*\])?$/g, '.0'); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/getIndexes.js b/node_modules/mongoose/lib/helpers/schema/getIndexes.js new file mode 100644 index 000000000..be907dbd6 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/getIndexes.js @@ -0,0 +1,155 @@ +'use strict'; + +const get = require('../get'); +const helperIsObject = require('../isObject'); + +/*! + * Gather all indexes defined in the schema, including single nested, + * document arrays, and embedded discriminators. + */ + +module.exports = function getIndexes(schema) { + let indexes = []; + const schemaStack = new WeakMap(); + const indexTypes = schema.constructor.indexTypes; + const indexByName = new Map(); + + collectIndexes(schema); + return indexes; + + function collectIndexes(schema, prefix, baseSchema) { + // Ignore infinitely nested schemas, if we've already seen this schema + // along this path there must be a cycle + if (schemaStack.has(schema)) { + return; + } + schemaStack.set(schema, true); + + prefix = prefix || ''; + const keys = Object.keys(schema.paths); + + for (const key of keys) { + const path = schema.paths[key]; + if (baseSchema != null && baseSchema.paths[key]) { + // If looking at an embedded discriminator schema, don't look at paths + // that the + continue; + } + + if (path.$isMongooseDocumentArray || path.$isSingleNested) { + if (get(path, 'options.excludeIndexes') !== true && + get(path, 'schemaOptions.excludeIndexes') !== true && + get(path, 'schema.options.excludeIndexes') !== true) { + collectIndexes(path.schema, prefix + key + '.'); + } + + if (path.schema.discriminators != null) { + const discriminators = path.schema.discriminators; + const discriminatorKeys = Object.keys(discriminators); + for (const discriminatorKey of discriminatorKeys) { + collectIndexes(discriminators[discriminatorKey], + prefix + key + '.', path.schema); + } + } + + // Retained to minimize risk of backwards breaking changes due to + // gh-6113 + if (path.$isMongooseDocumentArray) { + continue; + } + } + + const index = path._index || (path.caster && path.caster._index); + + if (index !== false && index !== null && index !== undefined) { + const field = {}; + const isObject = helperIsObject(index); + const options = isObject ? index : {}; + const type = typeof index === 'string' ? index : + isObject ? index.type : + false; + + if (type && indexTypes.indexOf(type) !== -1) { + field[prefix + key] = type; + } else if (options.text) { + field[prefix + key] = 'text'; + delete options.text; + } else { + const isDescendingIndex = Number(index) === -1; + field[prefix + key] = isDescendingIndex ? 
-1 : 1; + } + + delete options.type; + if (!('background' in options)) { + options.background = true; + } + if (schema.options.autoIndex != null) { + options._autoIndex = schema.options.autoIndex; + } + + const indexName = options && options.name; + if (typeof indexName === 'string') { + if (indexByName.has(indexName)) { + Object.assign(indexByName.get(indexName), field); + } else { + indexes.push([field, options]); + indexByName.set(indexName, field); + } + } else { + indexes.push([field, options]); + indexByName.set(indexName, field); + } + } + } + + schemaStack.delete(schema); + + if (prefix) { + fixSubIndexPaths(schema, prefix); + } else { + schema._indexes.forEach(function(index) { + if (!('background' in index[1])) { + index[1].background = true; + } + }); + indexes = indexes.concat(schema._indexes); + } + } + + /*! + * Checks for indexes added to subdocs using Schema.index(). + * These indexes need their paths prefixed properly. + * + * schema._indexes = [ [indexObj, options], [indexObj, options] ..] + */ + + function fixSubIndexPaths(schema, prefix) { + const subindexes = schema._indexes; + const len = subindexes.length; + for (let i = 0; i < len; ++i) { + const indexObj = subindexes[i][0]; + const indexOptions = subindexes[i][1]; + const keys = Object.keys(indexObj); + const klen = keys.length; + const newindex = {}; + + // use forward iteration, order matters + for (let j = 0; j < klen; ++j) { + const key = keys[j]; + newindex[prefix + key] = indexObj[key]; + } + + const newIndexOptions = Object.assign({}, indexOptions); + if (indexOptions != null && indexOptions.partialFilterExpression != null) { + newIndexOptions.partialFilterExpression = {}; + const partialFilterExpression = indexOptions.partialFilterExpression; + for (const key of Object.keys(partialFilterExpression)) { + newIndexOptions.partialFilterExpression[prefix + key] = + partialFilterExpression[key]; + } + } + + indexes.push([newindex, newIndexOptions]); + } + } +}; diff --git a/node_modules/mongoose/lib/helpers/schema/getKeysInSchemaOrder.js b/node_modules/mongoose/lib/helpers/schema/getKeysInSchemaOrder.js new file mode 100644 index 000000000..f84a88f04 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/getKeysInSchemaOrder.js @@ -0,0 +1,28 @@ +'use strict'; + +const get = require('../get'); + +module.exports = function getKeysInSchemaOrder(schema, val, path) { + const schemaKeys = path != null ? Object.keys(get(schema.tree, path, {})) : Object.keys(schema.tree); + const valKeys = new Set(Object.keys(val)); + + let keys; + if (valKeys.size > 1) { + keys = new Set(); + for (const key of schemaKeys) { + if (valKeys.has(key)) { + keys.add(key); + } + } + for (const key of valKeys) { + if (!keys.has(key)) { + keys.add(key); + } + } + keys = Array.from(keys); + } else { + keys = Array.from(valKeys); + } + + return keys; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/getPath.js b/node_modules/mongoose/lib/helpers/schema/getPath.js new file mode 100644 index 000000000..ccbc67c81 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/getPath.js @@ -0,0 +1,35 @@ +'use strict'; + +/*! + * Behaves like `Schema#path()`, except for it also digs into arrays without + * needing to put `.0.`, so `getPath(schema, 'docArr.elProp')` works. 
+ */ + +module.exports = function getPath(schema, path) { + let schematype = schema.path(path); + if (schematype != null) { + return schematype; + } + + const pieces = path.split('.'); + let cur = ''; + let isArray = false; + + for (const piece of pieces) { + if (/^\d+$/.test(piece) && isArray) { + continue; + } + cur = cur.length === 0 ? piece : cur + '.' + piece; + + schematype = schema.path(cur); + if (schematype != null && schematype.schema) { + schema = schematype.schema; + cur = ''; + if (schematype.$isMongooseDocumentArray) { + isArray = true; + } + } + } + + return schematype; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/handleIdOption.js b/node_modules/mongoose/lib/helpers/schema/handleIdOption.js new file mode 100644 index 000000000..569bf9f51 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/handleIdOption.js @@ -0,0 +1,20 @@ +'use strict'; + +const addAutoId = require('./addAutoId'); + +module.exports = function handleIdOption(schema, options) { + if (options == null || options._id == null) { + return schema; + } + + schema = schema.clone(); + if (!options._id) { + schema.remove('_id'); + schema.options._id = false; + } else if (!schema.paths['_id']) { + addAutoId(schema); + schema.options._id = true; + } + + return schema; +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/handleTimestampOption.js b/node_modules/mongoose/lib/helpers/schema/handleTimestampOption.js new file mode 100644 index 000000000..1551b7c10 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/handleTimestampOption.js @@ -0,0 +1,24 @@ +'use strict'; + +module.exports = handleTimestampOption; + +/*! + * ignore + */ + +function handleTimestampOption(arg, prop) { + if (arg == null) { + return null; + } + + if (typeof arg === 'boolean') { + return prop; + } + if (typeof arg[prop] === 'boolean') { + return arg[prop] ? prop : null; + } + if (!(prop in arg)) { + return prop; + } + return arg[prop]; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/schema/idGetter.js b/node_modules/mongoose/lib/helpers/schema/idGetter.js new file mode 100644 index 000000000..5a0741773 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/idGetter.js @@ -0,0 +1,31 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = function addIdGetter(schema) { + // ensure the documents receive an id getter unless disabled + const autoIdGetter = !schema.paths['id'] && + schema.paths['_id'] && + schema.options.id; + if (!autoIdGetter) { + return schema; + } + + schema.virtual('id').get(idGetter); + + return schema; +}; + +/*! + * Returns this documents _id cast to a string. 
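+ * Returns `null` if the document has no `_id`.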
+ */ + +function idGetter() { + if (this._id != null) { + return String(this._id); + } + + return null; +} diff --git a/node_modules/mongoose/lib/helpers/schema/merge.js b/node_modules/mongoose/lib/helpers/schema/merge.js new file mode 100644 index 000000000..edc5175ee --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schema/merge.js @@ -0,0 +1,27 @@ +'use strict'; + +module.exports = function merge(s1, s2, skipConflictingPaths) { + const paths = Object.keys(s2.tree); + const pathsToAdd = {}; + for (const key of paths) { + if (skipConflictingPaths && (s1.paths[key] || s1.nested[key] || s1.singleNestedPaths[key])) { + continue; + } + pathsToAdd[key] = s2.tree[key]; + } + s1.add(pathsToAdd); + + s1.callQueue = s1.callQueue.concat(s2.callQueue); + s1.method(s2.methods); + s1.static(s2.statics); + + for (const query in s2.query) { + s1.query[query] = s2.query[query]; + } + + for (const virtual in s2.virtuals) { + s1.virtuals[virtual] = s2.virtuals[virtual].clone(); + } + + s1.s.hooks.merge(s2.s.hooks, false); +}; diff --git a/node_modules/mongoose/lib/helpers/schematype/handleImmutable.js b/node_modules/mongoose/lib/helpers/schematype/handleImmutable.js new file mode 100644 index 000000000..a9a0b7f10 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/schematype/handleImmutable.js @@ -0,0 +1,47 @@ +'use strict'; + +const StrictModeError = require('../../error/strict'); + +/*! + * ignore + */ + +module.exports = function(schematype) { + if (schematype.$immutable) { + schematype.$immutableSetter = createImmutableSetter(schematype.path, + schematype.options.immutable); + schematype.set(schematype.$immutableSetter); + } else if (schematype.$immutableSetter) { + schematype.setters = schematype.setters. + filter(fn => fn !== schematype.$immutableSetter); + delete schematype.$immutableSetter; + } +}; + +function createImmutableSetter(path, immutable) { + return function immutableSetter(v) { + if (this == null || this.$__ == null) { + return v; + } + if (this.isNew) { + return v; + } + + const _immutable = typeof immutable === 'function' ? + immutable.call(this, this) : + immutable; + if (!_immutable) { + return v; + } + + const _value = this.$__.priorDoc != null ? + this.$__.priorDoc.$__getValue(path) : + this.$__getValue(path); + if (this.$__.strictMode === 'throw' && v !== _value) { + throw new StrictModeError(path, 'Path `' + path + '` is immutable ' + + 'and strict mode is set to throw.', true); + } + + return _value; + }; +} diff --git a/node_modules/mongoose/lib/helpers/setDefaultsOnInsert.js b/node_modules/mongoose/lib/helpers/setDefaultsOnInsert.js new file mode 100644 index 000000000..03cc4e877 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/setDefaultsOnInsert.js @@ -0,0 +1,108 @@ +'use strict'; +const modifiedPaths = require('./common').modifiedPaths; +const get = require('./get'); + +/** + * Applies defaults to update and findOneAndUpdate operations. + * + * @param {Object} filter + * @param {Schema} schema + * @param {Object} castedDoc + * @param {Object} options + * @method setDefaultsOnInsert + * @api private + */ + +module.exports = function(filter, schema, castedDoc, options) { + options = options || {}; + + const shouldSetDefaultsOnInsert = + options.setDefaultsOnInsert != null ? 
+ options.setDefaultsOnInsert : + schema.base.options.setDefaultsOnInsert; + + if (!options.upsert || shouldSetDefaultsOnInsert === false) { + return castedDoc; + } + + const keys = Object.keys(castedDoc || {}); + const updatedKeys = {}; + const updatedValues = {}; + const numKeys = keys.length; + const modified = {}; + + let hasDollarUpdate = false; + + for (let i = 0; i < numKeys; ++i) { + if (keys[i].startsWith('$')) { + modifiedPaths(castedDoc[keys[i]], '', modified); + hasDollarUpdate = true; + } + } + + if (!hasDollarUpdate) { + modifiedPaths(castedDoc, '', modified); + } + + const paths = Object.keys(filter); + const numPaths = paths.length; + for (let i = 0; i < numPaths; ++i) { + const path = paths[i]; + const condition = filter[path]; + if (condition && typeof condition === 'object') { + const conditionKeys = Object.keys(condition); + const numConditionKeys = conditionKeys.length; + let hasDollarKey = false; + for (let j = 0; j < numConditionKeys; ++j) { + if (conditionKeys[j].startsWith('$')) { + hasDollarKey = true; + break; + } + } + if (hasDollarKey) { + continue; + } + } + updatedKeys[path] = true; + modified[path] = true; + } + + if (options && options.overwrite && !hasDollarUpdate) { + // Defaults will be set later, since we're overwriting we'll cast + // the whole update to a document + return castedDoc; + } + + schema.eachPath(function(path, schemaType) { + // Skip single nested paths if underneath a map + if (schemaType.path === '_id' && schemaType.options.auto) { + return; + } + const def = schemaType.getDefault(null, true); + if (!isModified(modified, path) && typeof def !== 'undefined') { + castedDoc = castedDoc || {}; + castedDoc.$setOnInsert = castedDoc.$setOnInsert || {}; + if (get(castedDoc, path) == null) { + castedDoc.$setOnInsert[path] = def; + } + updatedValues[path] = def; + } + }); + + return castedDoc; +}; + +function isModified(modified, path) { + if (modified[path]) { + return true; + } + const sp = path.split('.'); + let cur = sp[0]; + for (let i = 1; i < sp.length; ++i) { + if (modified[cur]) { + return true; + } + cur += '.' 
+ sp[i]; + } + return false; +} diff --git a/node_modules/mongoose/lib/helpers/specialProperties.js b/node_modules/mongoose/lib/helpers/specialProperties.js new file mode 100644 index 000000000..1e1aca50a --- /dev/null +++ b/node_modules/mongoose/lib/helpers/specialProperties.js @@ -0,0 +1,3 @@ +'use strict'; + +module.exports = new Set(['__proto__', 'constructor', 'prototype']); \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/symbols.js b/node_modules/mongoose/lib/helpers/symbols.js new file mode 100644 index 000000000..7e323e285 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/symbols.js @@ -0,0 +1,20 @@ +'use strict'; + +exports.arrayAtomicsBackupSymbol = Symbol('mongoose#Array#atomicsBackup'); +exports.arrayAtomicsSymbol = Symbol('mongoose#Array#_atomics'); +exports.arrayParentSymbol = Symbol('mongoose#Array#_parent'); +exports.arrayPathSymbol = Symbol('mongoose#Array#_path'); +exports.arraySchemaSymbol = Symbol('mongoose#Array#_schema'); +exports.documentArrayParent = Symbol('mongoose:documentArrayParent'); +exports.documentIsSelected = Symbol('mongoose#Document#isSelected'); +exports.documentIsModified = Symbol('mongoose#Document#isModified'); +exports.documentModifiedPaths = Symbol('mongoose#Document#modifiedPaths'); +exports.documentSchemaSymbol = Symbol('mongoose#Document#schema'); +exports.getSymbol = Symbol('mongoose#Document#get'); +exports.modelSymbol = Symbol('mongoose#Model'); +exports.objectIdSymbol = Symbol('mongoose#ObjectId'); +exports.populateModelSymbol = Symbol('mongoose.PopulateOptions#Model'); +exports.schemaTypeSymbol = Symbol('mongoose#schemaType'); +exports.sessionNewDocuments = Symbol('mongoose:ClientSession#newDocuments'); +exports.scopeSymbol = Symbol('mongoose#Document#scope'); +exports.validatorErrorSymbol = Symbol('mongoose:validatorError'); \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/timers.js b/node_modules/mongoose/lib/helpers/timers.js new file mode 100644 index 000000000..7bd09c741 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/timers.js @@ -0,0 +1,3 @@ +'use strict'; + +exports.setTimeout = setTimeout; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/timestamps/setupTimestamps.js b/node_modules/mongoose/lib/helpers/timestamps/setupTimestamps.js new file mode 100644 index 000000000..8b3984272 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/timestamps/setupTimestamps.js @@ -0,0 +1,107 @@ +'use strict'; + +const applyTimestampsToChildren = require('../update/applyTimestampsToChildren'); +const applyTimestampsToUpdate = require('../update/applyTimestampsToUpdate'); +const get = require('../get'); +const handleTimestampOption = require('../schema/handleTimestampOption'); +const symbols = require('../../schema/symbols'); + +module.exports = function setupTimestamps(schema, timestamps) { + const childHasTimestamp = schema.childSchemas.find(withTimestamp); + function withTimestamp(s) { + const ts = s.schema.options.timestamps; + return !!ts; + } + + if (!timestamps && !childHasTimestamp) { + return; + } + + const createdAt = handleTimestampOption(timestamps, 'createdAt'); + const updatedAt = handleTimestampOption(timestamps, 'updatedAt'); + const currentTime = timestamps != null && timestamps.hasOwnProperty('currentTime') ? 
+ timestamps.currentTime : + null; + const schemaAdditions = {}; + + schema.$timestamps = { createdAt: createdAt, updatedAt: updatedAt }; + + if (updatedAt && !schema.paths[updatedAt]) { + schemaAdditions[updatedAt] = Date; + } + + if (createdAt && !schema.paths[createdAt]) { + schemaAdditions[createdAt] = { [schema.options.typeKey || 'type']: Date, immutable: true }; + } + schema.add(schemaAdditions); + + schema.pre('save', function(next) { + const timestampOption = get(this, '$__.saveOptions.timestamps'); + if (timestampOption === false) { + return next(); + } + + const skipUpdatedAt = timestampOption != null && timestampOption.updatedAt === false; + const skipCreatedAt = timestampOption != null && timestampOption.createdAt === false; + + const defaultTimestamp = currentTime != null ? + currentTime() : + (this.ownerDocument ? this.ownerDocument() : this).constructor.base.now(); + const auto_id = this._id && this._id.auto; + + if (!skipCreatedAt && createdAt && !this.$__getValue(createdAt) && this.$__isSelected(createdAt)) { + this.$set(createdAt, auto_id ? this._id.getTimestamp() : defaultTimestamp); + } + + if (!skipUpdatedAt && updatedAt && (this.isNew || this.$isModified())) { + let ts = defaultTimestamp; + if (this.isNew) { + if (createdAt != null) { + ts = this.$__getValue(createdAt); + } else if (auto_id) { + ts = this._id.getTimestamp(); + } + } + this.$set(updatedAt, ts); + } + + next(); + }); + + schema.methods.initializeTimestamps = function() { + const ts = currentTime != null ? + currentTime() : + this.constructor.base.now(); + if (createdAt && !this.get(createdAt)) { + this.$set(createdAt, ts); + } + if (updatedAt && !this.get(updatedAt)) { + this.$set(updatedAt, ts); + } + return this; + }; + + _setTimestampsOnUpdate[symbols.builtInMiddleware] = true; + + const opts = { query: true, model: false }; + schema.pre('findOneAndReplace', opts, _setTimestampsOnUpdate); + schema.pre('findOneAndUpdate', opts, _setTimestampsOnUpdate); + schema.pre('replaceOne', opts, _setTimestampsOnUpdate); + schema.pre('update', opts, _setTimestampsOnUpdate); + schema.pre('updateOne', opts, _setTimestampsOnUpdate); + schema.pre('updateMany', opts, _setTimestampsOnUpdate); + + function _setTimestampsOnUpdate(next) { + const now = currentTime != null ? 
+ currentTime() : + this.model.base.now(); + // Replacing with null update should still trigger timestamps + if (this.op === 'findOneAndReplace' && this.getUpdate() == null) { + this.setUpdate({}); + } + applyTimestampsToUpdate(now, createdAt, updatedAt, this.getUpdate(), + this.options, this.schema); + applyTimestampsToChildren(now, this.getUpdate(), this.model.schema); + next(); + } +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/topology/allServersUnknown.js b/node_modules/mongoose/lib/helpers/topology/allServersUnknown.js new file mode 100644 index 000000000..33d23ab3e --- /dev/null +++ b/node_modules/mongoose/lib/helpers/topology/allServersUnknown.js @@ -0,0 +1,12 @@ +'use strict'; + +const getConstructorName = require('../getConstructorName'); + +module.exports = function allServersUnknown(topologyDescription) { + if (getConstructorName(topologyDescription) !== 'TopologyDescription') { + return false; + } + + const servers = Array.from(topologyDescription.servers.values()); + return servers.length > 0 && servers.every(server => server.type === 'Unknown'); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/topology/isAtlas.js b/node_modules/mongoose/lib/helpers/topology/isAtlas.js new file mode 100644 index 000000000..f0638d551 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/topology/isAtlas.js @@ -0,0 +1,13 @@ +'use strict'; + +const getConstructorName = require('../getConstructorName'); + +module.exports = function isAtlas(topologyDescription) { + if (getConstructorName(topologyDescription) !== 'TopologyDescription') { + return false; + } + + const hostnames = Array.from(topologyDescription.servers.keys()); + return hostnames.length > 0 && + hostnames.every(host => host.endsWith('.mongodb.net:27017')); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/topology/isSSLError.js b/node_modules/mongoose/lib/helpers/topology/isSSLError.js new file mode 100644 index 000000000..d910fc834 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/topology/isSSLError.js @@ -0,0 +1,16 @@ +'use strict'; + +const getConstructorName = require('../getConstructorName'); + +const nonSSLMessage = 'Client network socket disconnected before secure TLS ' + + 'connection was established'; + +module.exports = function isSSLError(topologyDescription) { + if (getConstructorName(topologyDescription) !== 'TopologyDescription') { + return false; + } + + const descriptions = Array.from(topologyDescription.servers.values()); + return descriptions.length > 0 && + descriptions.every(descr => descr.error && descr.error.message.indexOf(nonSSLMessage) !== -1); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/update/applyTimestampsToChildren.js b/node_modules/mongoose/lib/helpers/update/applyTimestampsToChildren.js new file mode 100644 index 000000000..6186c3d4d --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/applyTimestampsToChildren.js @@ -0,0 +1,185 @@ +'use strict'; + +const cleanPositionalOperators = require('../schema/cleanPositionalOperators'); +const handleTimestampOption = require('../schema/handleTimestampOption'); + +module.exports = applyTimestampsToChildren; + +/*! 
+ * ignore + */ + +function applyTimestampsToChildren(now, update, schema) { + if (update == null) { + return; + } + + const keys = Object.keys(update); + const hasDollarKey = keys.some(key => key.startsWith('$')); + + if (hasDollarKey) { + if (update.$push) { + _applyTimestampToUpdateOperator(update.$push); + } + if (update.$addToSet) { + _applyTimestampToUpdateOperator(update.$addToSet); + } + if (update.$set != null) { + const keys = Object.keys(update.$set); + for (const key of keys) { + applyTimestampsToUpdateKey(schema, key, update.$set, now); + } + } + if (update.$setOnInsert != null) { + const keys = Object.keys(update.$setOnInsert); + for (const key of keys) { + applyTimestampsToUpdateKey(schema, key, update.$setOnInsert, now); + } + } + } + + const updateKeys = Object.keys(update).filter(key => !key.startsWith('$')); + for (const key of updateKeys) { + applyTimestampsToUpdateKey(schema, key, update, now); + } + + function _applyTimestampToUpdateOperator(op) { + for (const key of Object.keys(op)) { + const $path = schema.path(key.replace(/\.\$\./i, '.').replace(/.\$$/, '')); + if (op[key] && + $path && + $path.$isMongooseDocumentArray && + $path.schema.options.timestamps) { + const timestamps = $path.schema.options.timestamps; + const createdAt = handleTimestampOption(timestamps, 'createdAt'); + const updatedAt = handleTimestampOption(timestamps, 'updatedAt'); + if (op[key].$each) { + op[key].$each.forEach(function(subdoc) { + if (updatedAt != null) { + subdoc[updatedAt] = now; + } + if (createdAt != null) { + subdoc[createdAt] = now; + } + }); + } else { + if (updatedAt != null) { + op[key][updatedAt] = now; + } + if (createdAt != null) { + op[key][createdAt] = now; + } + } + } + } + } +} + +function applyTimestampsToDocumentArray(arr, schematype, now) { + const timestamps = schematype.schema.options.timestamps; + + if (!timestamps) { + return; + } + + const len = arr.length; + + const createdAt = handleTimestampOption(timestamps, 'createdAt'); + const updatedAt = handleTimestampOption(timestamps, 'updatedAt'); + for (let i = 0; i < len; ++i) { + if (updatedAt != null) { + arr[i][updatedAt] = now; + } + if (createdAt != null) { + arr[i][createdAt] = now; + } + + applyTimestampsToChildren(now, arr[i], schematype.schema); + } +} + +function applyTimestampsToSingleNested(subdoc, schematype, now) { + const timestamps = schematype.schema.options.timestamps; + if (!timestamps) { + return; + } + + const createdAt = handleTimestampOption(timestamps, 'createdAt'); + const updatedAt = handleTimestampOption(timestamps, 'updatedAt'); + if (updatedAt != null) { + subdoc[updatedAt] = now; + } + if (createdAt != null) { + subdoc[createdAt] = now; + } + + applyTimestampsToChildren(now, subdoc, schematype.schema); +} + +function applyTimestampsToUpdateKey(schema, key, update, now) { + // Replace positional operator `$` and array filters `$[]` and `$[.*]` + const keyToSearch = cleanPositionalOperators(key); + const path = schema.path(keyToSearch); + if (!path) { + return; + } + + const parentSchemaTypes = []; + const pieces = keyToSearch.split('.'); + for (let i = pieces.length - 1; i > 0; --i) { + const s = schema.path(pieces.slice(0, i).join('.')); + if (s != null && + (s.$isMongooseDocumentArray || s.$isSingleNested)) { + parentSchemaTypes.push({ parentPath: key.split('.').slice(0, i).join('.'), parentSchemaType: s }); + } + } + + if (Array.isArray(update[key]) && path.$isMongooseDocumentArray) { + applyTimestampsToDocumentArray(update[key], path, now); + } else if (update[key] && 
path.$isSingleNested) { + applyTimestampsToSingleNested(update[key], path, now); + } else if (parentSchemaTypes.length > 0) { + for (const item of parentSchemaTypes) { + const parentPath = item.parentPath; + const parentSchemaType = item.parentSchemaType; + const timestamps = parentSchemaType.schema.options.timestamps; + const updatedAt = handleTimestampOption(timestamps, 'updatedAt'); + + if (!timestamps || updatedAt == null) { + continue; + } + + if (parentSchemaType.$isSingleNested) { + // Single nested is easy + update[parentPath + '.' + updatedAt] = now; + } else if (parentSchemaType.$isMongooseDocumentArray) { + let childPath = key.substr(parentPath.length + 1); + + if (/^\d+$/.test(childPath)) { + update[parentPath + '.' + childPath][updatedAt] = now; + continue; + } + + const firstDot = childPath.indexOf('.'); + childPath = firstDot !== -1 ? childPath.substr(0, firstDot) : childPath; + + update[parentPath + '.' + childPath + '.' + updatedAt] = now; + } + } + } else if (path.schema != null && path.schema != schema && update[key]) { + const timestamps = path.schema.options.timestamps; + const createdAt = handleTimestampOption(timestamps, 'createdAt'); + const updatedAt = handleTimestampOption(timestamps, 'updatedAt'); + + if (!timestamps) { + return; + } + + if (updatedAt != null) { + update[key][updatedAt] = now; + } + if (createdAt != null) { + update[key][createdAt] = now; + } + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/update/applyTimestampsToUpdate.js b/node_modules/mongoose/lib/helpers/update/applyTimestampsToUpdate.js new file mode 100644 index 000000000..6fab29b5d --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/applyTimestampsToUpdate.js @@ -0,0 +1,118 @@ +'use strict'; + +/*! + * ignore + */ + +const get = require('../get'); + +module.exports = applyTimestampsToUpdate; + +/*! 
+ * ignore + */ + +function applyTimestampsToUpdate(now, createdAt, updatedAt, currentUpdate, options) { + const updates = currentUpdate; + let _updates = updates; + const overwrite = get(options, 'overwrite', false); + const timestamps = get(options, 'timestamps', true); + + // Support skipping timestamps at the query level, see gh-6980 + if (!timestamps || updates == null) { + return currentUpdate; + } + + const skipCreatedAt = timestamps != null && timestamps.createdAt === false; + const skipUpdatedAt = timestamps != null && timestamps.updatedAt === false; + + if (overwrite) { + if (currentUpdate && currentUpdate.$set) { + currentUpdate = currentUpdate.$set; + updates.$set = {}; + _updates = updates.$set; + } + if (!skipUpdatedAt && updatedAt && !currentUpdate[updatedAt]) { + _updates[updatedAt] = now; + } + if (!skipCreatedAt && createdAt && !currentUpdate[createdAt]) { + _updates[createdAt] = now; + } + return updates; + } + currentUpdate = currentUpdate || {}; + + if (Array.isArray(updates)) { + // Update with aggregation pipeline + updates.push({ $set: { updatedAt: now } }); + + return updates; + } + + updates.$set = updates.$set || {}; + if (!skipUpdatedAt && updatedAt && + (!currentUpdate.$currentDate || !currentUpdate.$currentDate[updatedAt])) { + let timestampSet = false; + if (updatedAt.indexOf('.') !== -1) { + const pieces = updatedAt.split('.'); + for (let i = 1; i < pieces.length; ++i) { + const remnant = pieces.slice(-i).join('.'); + const start = pieces.slice(0, -i).join('.'); + if (currentUpdate[start] != null) { + currentUpdate[start][remnant] = now; + timestampSet = true; + break; + } else if (currentUpdate.$set && currentUpdate.$set[start]) { + currentUpdate.$set[start][remnant] = now; + timestampSet = true; + break; + } + } + } + + if (!timestampSet) { + updates.$set[updatedAt] = now; + } + + if (updates.hasOwnProperty(updatedAt)) { + delete updates[updatedAt]; + } + } + + if (!skipCreatedAt && createdAt) { + + if (currentUpdate[createdAt]) { + delete currentUpdate[createdAt]; + } + if (currentUpdate.$set && currentUpdate.$set[createdAt]) { + delete currentUpdate.$set[createdAt]; + } + let timestampSet = false; + if (createdAt.indexOf('.') !== -1) { + const pieces = createdAt.split('.'); + for (let i = 1; i < pieces.length; ++i) { + const remnant = pieces.slice(-i).join('.'); + const start = pieces.slice(0, -i).join('.'); + if (currentUpdate[start] != null) { + currentUpdate[start][remnant] = now; + timestampSet = true; + break; + } else if (currentUpdate.$set && currentUpdate.$set[start]) { + currentUpdate.$set[start][remnant] = now; + timestampSet = true; + break; + } + } + } + + if (!timestampSet) { + updates.$setOnInsert = updates.$setOnInsert || {}; + updates.$setOnInsert[createdAt] = now; + } + } + + if (Object.keys(updates.$set).length === 0) { + delete updates.$set; + } + return updates; +} diff --git a/node_modules/mongoose/lib/helpers/update/castArrayFilters.js b/node_modules/mongoose/lib/helpers/update/castArrayFilters.js new file mode 100644 index 000000000..c60ebdcde --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/castArrayFilters.js @@ -0,0 +1,70 @@ +'use strict'; + +const castFilterPath = require('../query/castFilterPath'); +const cleanPositionalOperators = require('../schema/cleanPositionalOperators'); +const getPath = require('../schema/getPath'); +const updatedPathsByArrayFilter = require('./updatedPathsByArrayFilter'); + +module.exports = function castArrayFilters(query) { + const arrayFilters = query.options.arrayFilters; + const 
update = query.getUpdate(); + const schema = query.schema; + const updatedPathsByFilter = updatedPathsByArrayFilter(update); + const strictQuery = schema.options.strictQuery; + + _castArrayFilters(arrayFilters, schema, strictQuery, updatedPathsByFilter, query); +}; + +function _castArrayFilters(arrayFilters, schema, strictQuery, updatedPathsByFilter, query) { + if (!Array.isArray(arrayFilters)) { + return; + } + + for (const filter of arrayFilters) { + if (filter == null) { + throw new Error(`Got null array filter in ${arrayFilters}`); + } + for (const key of Object.keys(filter)) { + if (key === '$and' || key === '$or') { + _castArrayFilters(filter[key], schema, strictQuery, updatedPathsByFilter, query); + continue; + } + if (filter[key] == null) { + continue; + } + if (updatedPathsByFilter[key] === null) { + continue; + } + if (Object.keys(updatedPathsByFilter).length === 0) { + continue; + } + const dot = key.indexOf('.'); + let filterPath = dot === -1 ? + updatedPathsByFilter[key] + '.0' : + updatedPathsByFilter[key.substr(0, dot)] + '.0' + key.substr(dot); + if (filterPath == null) { + throw new Error(`Filter path not found for ${key}`); + } + + // If there are multiple array filters in the path being updated, make sure + // to replace them so we can get the schema path. + filterPath = cleanPositionalOperators(filterPath); + const schematype = getPath(schema, filterPath); + if (schematype == null) { + if (!strictQuery) { + return; + } + // For now, treat `strictQuery = true` and `strictQuery = 'throw'` as + // equivalent for casting array filters. `strictQuery = true` doesn't + // quite work in this context because we never want to silently strip out + // array filters, even if the path isn't in the schema. + throw new Error(`Could not find path "${filterPath}" in schema`); + } + if (typeof filter[key] === 'object') { + filter[key] = castFilterPath(query, schematype, filter[key]); + } else { + filter[key] = schematype.castForQuery(filter[key]); + } + } + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/update/modifiedPaths.js b/node_modules/mongoose/lib/helpers/update/modifiedPaths.js new file mode 100644 index 000000000..9cb567d5a --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/modifiedPaths.js @@ -0,0 +1,33 @@ +'use strict'; + +const _modifiedPaths = require('../common').modifiedPaths; + +/** + * Given an update document with potential update operators (`$set`, etc.) + * returns an object whose keys are the directly modified paths. + * + * If there are any top-level keys that don't start with `$`, we assume those + * will get wrapped in a `$set`. The Mongoose Query is responsible for wrapping + * top-level keys in `$set`. 
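// --- Editor's example (not part of the mongoose source above) ---
// A hedged illustration of the kind of update the castArrayFilters helper above
// deals with: a positional array filter (`$[elem]`) in the update path plus the
// matching `arrayFilters` option. The Task model and field names are hypothetical,
// and an already-open mongoose connection is assumed.
const mongoose = require('mongoose');

const taskSchema = new mongoose.Schema({
  title: String,
  subtasks: [{ title: String, done: Boolean }]
});
const Task = mongoose.model('TaskExample', taskSchema);

async function completeSubtask(taskId, subtaskTitle) {
  // Mongoose casts each array filter ('elem.title' here) against the schema
  // path it refers to before the update is sent to MongoDB.
  return Task.updateOne(
    { _id: taskId },
    { $set: { 'subtasks.$[elem].done': true } },
    { arrayFilters: [{ 'elem.title': subtaskTitle }] }
  );
}
// --- end editor's example ---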
+ * + * @param {Object} update + * @return {Object} modified + */ + +module.exports = function modifiedPaths(update) { + const keys = Object.keys(update); + const res = {}; + + const withoutDollarKeys = {}; + for (const key of keys) { + if (key.startsWith('$')) { + _modifiedPaths(update[key], '', res); + continue; + } + withoutDollarKeys[key] = update[key]; + } + + _modifiedPaths(withoutDollarKeys, '', res); + + return res; +}; diff --git a/node_modules/mongoose/lib/helpers/update/moveImmutableProperties.js b/node_modules/mongoose/lib/helpers/update/moveImmutableProperties.js new file mode 100644 index 000000000..8541c5bd2 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/moveImmutableProperties.js @@ -0,0 +1,53 @@ +'use strict'; + +const get = require('../get'); + +/** + * Given an update, move all $set on immutable properties to $setOnInsert. + * This should only be called for upserts, because $setOnInsert bypasses the + * strictness check for immutable properties. + */ + +module.exports = function moveImmutableProperties(schema, update, ctx) { + if (update == null) { + return; + } + + const keys = Object.keys(update); + for (const key of keys) { + const isDollarKey = key.startsWith('$'); + + if (key === '$set') { + const updatedPaths = Object.keys(update[key]); + for (const path of updatedPaths) { + _walkUpdatePath(schema, update[key], path, update, ctx); + } + } else if (!isDollarKey) { + _walkUpdatePath(schema, update, key, update, ctx); + } + + } +}; + +function _walkUpdatePath(schema, op, path, update, ctx) { + const schematype = schema.path(path); + if (schematype == null) { + return; + } + + let immutable = get(schematype, 'options.immutable', null); + if (immutable == null) { + return; + } + if (typeof immutable === 'function') { + immutable = immutable.call(ctx, ctx); + } + + if (!immutable) { + return; + } + + update.$setOnInsert = update.$setOnInsert || {}; + update.$setOnInsert[path] = op[path]; + delete op[path]; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/update/removeUnusedArrayFilters.js b/node_modules/mongoose/lib/helpers/update/removeUnusedArrayFilters.js new file mode 100644 index 000000000..6a52f9166 --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/removeUnusedArrayFilters.js @@ -0,0 +1,32 @@ +'use strict'; + +/** + * MongoDB throws an error if there's unused array filters. That is, if `options.arrayFilters` defines + * a filter, but none of the `update` keys use it. This should be enough to filter out all unused array + * filters. + */ + +module.exports = function removeUnusedArrayFilters(update, arrayFilters) { + const updateKeys = Object.keys(update). + map(key => Object.keys(update[key])). + reduce((cur, arr) => cur.concat(arr), []); + return arrayFilters.filter(obj => { + return _checkSingleFilterKey(obj, updateKeys); + }); +}; + +function _checkSingleFilterKey(arrayFilter, updateKeys) { + const firstKey = Object.keys(arrayFilter)[0]; + + if (firstKey === '$and' || firstKey === '$or') { + if (!Array.isArray(arrayFilter[firstKey])) { + return false; + } + return arrayFilter[firstKey].find(filter => _checkSingleFilterKey(filter, updateKeys)) != null; + } + + const firstDot = firstKey.indexOf('.'); + const arrayFilterKey = firstDot === -1 ? 
firstKey : firstKey.slice(0, firstDot); + + return updateKeys.find(key => key.includes('$[' + arrayFilterKey + ']')) != null; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/update/updatedPathsByArrayFilter.js b/node_modules/mongoose/lib/helpers/update/updatedPathsByArrayFilter.js new file mode 100644 index 000000000..0958808db --- /dev/null +++ b/node_modules/mongoose/lib/helpers/update/updatedPathsByArrayFilter.js @@ -0,0 +1,27 @@ +'use strict'; + +const modifiedPaths = require('./modifiedPaths'); + +module.exports = function updatedPathsByArrayFilter(update) { + if (update == null) { + return {}; + } + const updatedPaths = modifiedPaths(update); + + return Object.keys(updatedPaths).reduce((cur, path) => { + const matches = path.match(/\$\[[^\]]+\]/g); + if (matches == null) { + return cur; + } + for (const match of matches) { + const firstMatch = path.indexOf(match); + if (firstMatch !== path.lastIndexOf(match)) { + throw new Error(`Path '${path}' contains the same array filter multiple times`); + } + cur[match.substring(2, match.length - 1)] = path. + substr(0, firstMatch - 1). + replace(/\$\[[^\]]+\]/g, '0'); + } + return cur; + }, {}); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/helpers/updateValidators.js b/node_modules/mongoose/lib/helpers/updateValidators.js new file mode 100644 index 000000000..eddb8574a --- /dev/null +++ b/node_modules/mongoose/lib/helpers/updateValidators.js @@ -0,0 +1,256 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const ValidationError = require('../error/validation'); +const cleanPositionalOperators = require('./schema/cleanPositionalOperators'); +const flatten = require('./common').flatten; +const modifiedPaths = require('./common').modifiedPaths; + +/** + * Applies validators and defaults to update and findOneAndUpdate operations, + * specifically passing a null doc as `this` to validators and defaults + * + * @param {Query} query + * @param {Schema} schema + * @param {Object} castedDoc + * @param {Object} options + * @method runValidatorsOnUpdate + * @api private + */ + +module.exports = function(query, schema, castedDoc, options, callback) { + const keys = Object.keys(castedDoc || {}); + let updatedKeys = {}; + let updatedValues = {}; + const isPull = {}; + const arrayAtomicUpdates = {}; + const numKeys = keys.length; + let hasDollarUpdate = false; + const modified = {}; + let currentUpdate; + let key; + let i; + + for (i = 0; i < numKeys; ++i) { + if (keys[i].startsWith('$')) { + hasDollarUpdate = true; + if (keys[i] === '$push' || keys[i] === '$addToSet') { + const _keys = Object.keys(castedDoc[keys[i]]); + for (let ii = 0; ii < _keys.length; ++ii) { + currentUpdate = castedDoc[keys[i]][_keys[ii]]; + if (currentUpdate && currentUpdate.$each) { + arrayAtomicUpdates[_keys[ii]] = (arrayAtomicUpdates[_keys[ii]] || []). + concat(currentUpdate.$each); + } else { + arrayAtomicUpdates[_keys[ii]] = (arrayAtomicUpdates[_keys[ii]] || []). + concat([currentUpdate]); + } + } + continue; + } + modifiedPaths(castedDoc[keys[i]], '', modified); + const flat = flatten(castedDoc[keys[i]], null, null, schema); + const paths = Object.keys(flat); + const numPaths = paths.length; + for (let j = 0; j < numPaths; ++j) { + const updatedPath = cleanPositionalOperators(paths[j]); + key = keys[i]; + // With `$pull` we might flatten `$in`. Skip stuff nested under `$in` + // for the rest of the logic, it will get handled later. 
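// --- Editor's example (not part of the mongoose source above) ---
// A hedged sketch of when the updateValidators helper above runs: schema
// validators are only applied to update operations when the query opts in with
// `runValidators: true`. Model and field names are hypothetical, and an open
// connection is assumed.
const mongoose = require('mongoose');

const userSchema = new mongoose.Schema({
  username: { type: String, required: true, minlength: 3 },
  password: { type: String, required: true, minlength: 6 }
});
const User = mongoose.model('UserExample', userSchema);

async function renameUser(id, newUsername) {
  // Without `runValidators: true` the minlength validator would be skipped for
  // this update; with it, a too-short username rejects with a ValidationError.
  return User.updateOne(
    { _id: id },
    { $set: { username: newUsername } },
    { runValidators: true }
  );
}
// --- end editor's example ---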
+ if (updatedPath.includes('$')) { + continue; + } + if (key === '$set' || key === '$setOnInsert' || + key === '$pull' || key === '$pullAll') { + updatedValues[updatedPath] = flat[paths[j]]; + isPull[updatedPath] = key === '$pull' || key === '$pullAll'; + } else if (key === '$unset') { + updatedValues[updatedPath] = undefined; + } + updatedKeys[updatedPath] = true; + } + } + } + + if (!hasDollarUpdate) { + modifiedPaths(castedDoc, '', modified); + updatedValues = flatten(castedDoc, null, null, schema); + updatedKeys = Object.keys(updatedValues); + } + + const updates = Object.keys(updatedValues); + const numUpdates = updates.length; + const validatorsToExecute = []; + const validationErrors = []; + + const alreadyValidated = []; + + const context = query; + function iter(i, v) { + const schemaPath = schema._getSchema(updates[i]); + if (schemaPath == null) { + return; + } + if (schemaPath.instance === 'Mixed' && schemaPath.path !== updates[i]) { + return; + } + + if (v && Array.isArray(v.$in)) { + v.$in.forEach((v, i) => { + validatorsToExecute.push(function(callback) { + schemaPath.doValidate( + v, + function(err) { + if (err) { + err.path = updates[i] + '.$in.' + i; + validationErrors.push(err); + } + callback(null); + }, + context, + { updateValidator: true }); + }); + }); + } else { + if (isPull[updates[i]] && + schemaPath.$isMongooseArray) { + return; + } + + if (schemaPath.$isMongooseDocumentArrayElement && v != null && v.$__ != null) { + alreadyValidated.push(updates[i]); + validatorsToExecute.push(function(callback) { + schemaPath.doValidate(v, function(err) { + if (err) { + err.path = updates[i]; + validationErrors.push(err); + return callback(null); + } + + v.validate(function(err) { + if (err) { + if (err.errors) { + for (const key of Object.keys(err.errors)) { + const _err = err.errors[key]; + _err.path = updates[i] + '.' + key; + validationErrors.push(_err); + } + } else { + err.path = updates[i]; + validationErrors.push(err); + } + } + callback(null); + }); + }, context, { updateValidator: true }); + }); + } else { + validatorsToExecute.push(function(callback) { + for (const path of alreadyValidated) { + if (updates[i].startsWith(path + '.')) { + return callback(null); + } + } + + schemaPath.doValidate(v, function(err) { + if (schemaPath.schema != null && + schemaPath.schema.options.storeSubdocValidationError === false && + err instanceof ValidationError) { + return callback(null); + } + + if (err) { + err.path = updates[i]; + validationErrors.push(err); + } + callback(null); + }, context, { updateValidator: true }); + }); + } + } + } + for (i = 0; i < numUpdates; ++i) { + iter(i, updatedValues[updates[i]]); + } + + const arrayUpdates = Object.keys(arrayAtomicUpdates); + for (const arrayUpdate of arrayUpdates) { + let schemaPath = schema._getSchema(arrayUpdate); + if (schemaPath && schemaPath.$isMongooseDocumentArray) { + validatorsToExecute.push(function(callback) { + schemaPath.doValidate( + arrayAtomicUpdates[arrayUpdate], + getValidationCallback(arrayUpdate, validationErrors, callback), + options && options.context === 'query' ? query : null); + }); + } else { + schemaPath = schema._getSchema(arrayUpdate + '.0'); + for (const atomicUpdate of arrayAtomicUpdates[arrayUpdate]) { + validatorsToExecute.push(function(callback) { + schemaPath.doValidate( + atomicUpdate, + getValidationCallback(arrayUpdate, validationErrors, callback), + options && options.context === 'query' ? 
query : null, + { updateValidator: true }); + }); + } + } + } + + if (callback != null) { + let numValidators = validatorsToExecute.length; + if (numValidators === 0) { + return _done(callback); + } + for (const validator of validatorsToExecute) { + validator(function() { + if (--numValidators <= 0) { + _done(callback); + } + }); + } + + return; + } + + return function(callback) { + let numValidators = validatorsToExecute.length; + if (numValidators === 0) { + return _done(callback); + } + for (const validator of validatorsToExecute) { + validator(function() { + if (--numValidators <= 0) { + _done(callback); + } + }); + } + }; + + function _done(callback) { + if (validationErrors.length) { + const err = new ValidationError(null); + + for (const validationError of validationErrors) { + err.addError(validationError.path, validationError); + } + + return callback(err); + } + callback(null); + } + + function getValidationCallback(arrayUpdate, validationErrors, callback) { + return function(err) { + if (err) { + err.path = arrayUpdate; + validationErrors.push(err); + } + callback(null); + }; + } +}; + diff --git a/node_modules/mongoose/lib/index.js b/node_modules/mongoose/lib/index.js new file mode 100644 index 000000000..a0e7b982a --- /dev/null +++ b/node_modules/mongoose/lib/index.js @@ -0,0 +1,1162 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +require('./driver').set(require('./drivers/node-mongodb-native')); + +const Document = require('./document'); +const EventEmitter = require('events').EventEmitter; +const Schema = require('./schema'); +const SchemaType = require('./schematype'); +const SchemaTypes = require('./schema/index'); +const VirtualType = require('./virtualtype'); +const STATES = require('./connectionstate'); +const VALID_OPTIONS = require('./validoptions'); +const Types = require('./types'); +const Query = require('./query'); +const Model = require('./model'); +const applyPlugins = require('./helpers/schema/applyPlugins'); +const driver = require('./driver'); +const get = require('./helpers/get'); +const promiseOrCallback = require('./helpers/promiseOrCallback'); +const legacyPluralize = require('./helpers/pluralize'); +const utils = require('./utils'); +const pkg = require('../package.json'); +const cast = require('./cast'); +const removeSubdocs = require('./plugins/removeSubdocs'); +const saveSubdocs = require('./plugins/saveSubdocs'); +const trackTransaction = require('./plugins/trackTransaction'); +const validateBeforeSave = require('./plugins/validateBeforeSave'); + +const Aggregate = require('./aggregate'); +const PromiseProvider = require('./promise_provider'); +const shardingPlugin = require('./plugins/sharding'); +const trusted = require('./helpers/query/trusted').trusted; +const sanitizeFilter = require('./helpers/query/sanitizeFilter'); + +const defaultMongooseSymbol = Symbol.for('mongoose:default'); + +require('./helpers/printJestWarning'); + +/** + * Mongoose constructor. + * + * The exports object of the `mongoose` module is an instance of this class. + * Most apps will only use this one instance. + * + * ####Example: + * const mongoose = require('mongoose'); + * mongoose instanceof mongoose.Mongoose; // true + * + * // Create a new Mongoose instance with its own `connect()`, `set()`, `model()`, etc. 
+ * const m = new mongoose.Mongoose(); + * + * @api public + * @param {Object} options see [`Mongoose#set()` docs](/docs/api/mongoose.html#mongoose_Mongoose-set) + */ +function Mongoose(options) { + this.connections = []; + this.models = {}; + this.events = new EventEmitter(); + // default global options + this.options = Object.assign({ + pluralization: true, + autoIndex: true, + autoCreate: true + }, options); + const conn = this.createConnection(); // default connection + conn.models = this.models; + + if (this.options.pluralization) { + this._pluralize = legacyPluralize; + } + + // If a user creates their own Mongoose instance, give them a separate copy + // of the `Schema` constructor so they get separate custom types. (gh-6933) + if (!options || !options[defaultMongooseSymbol]) { + const _this = this; + this.Schema = function() { + this.base = _this; + return Schema.apply(this, arguments); + }; + this.Schema.prototype = Object.create(Schema.prototype); + + Object.assign(this.Schema, Schema); + this.Schema.base = this; + this.Schema.Types = Object.assign({}, Schema.Types); + } else { + // Hack to work around babel's strange behavior with + // `import mongoose, { Schema } from 'mongoose'`. Because `Schema` is not + // an own property of a Mongoose global, Schema will be undefined. See gh-5648 + for (const key of ['Schema', 'model']) { + this[key] = Mongoose.prototype[key]; + } + } + this.Schema.prototype.base = this; + + Object.defineProperty(this, 'plugins', { + configurable: false, + enumerable: true, + writable: false, + value: [ + [saveSubdocs, { deduplicate: true }], + [validateBeforeSave, { deduplicate: true }], + [shardingPlugin, { deduplicate: true }], + [removeSubdocs, { deduplicate: true }], + [trackTransaction, { deduplicate: true }] + ] + }); +} +Mongoose.prototype.cast = cast; +/** + * Expose connection states for user-land + * + * @memberOf Mongoose + * @property STATES + * @api public + */ +Mongoose.prototype.STATES = STATES; + +/** + * Object with `get()` and `set()` containing the underlying driver this Mongoose instance + * uses to communicate with the database. A driver is a Mongoose-specific interface that defines functions + * like `find()`. + * + * @memberOf Mongoose + * @property driver + * @api public + */ + +Mongoose.prototype.driver = driver; + +/** + * Sets mongoose options + * + * ####Example: + * + * mongoose.set('test', value) // sets the 'test' option to `value` + * + * mongoose.set('debug', true) // enable logging collection methods + arguments to the console/file + * + * mongoose.set('debug', function(collectionName, methodName, ...methodArgs) {}); // use custom function to log collection methods + arguments + * + * Currently supported options are: + * - 'debug': If `true`, prints the operations mongoose sends to MongoDB to the console. If a writable stream is passed, it will log to that stream, without colorization. If a callback function is passed, it will receive the collection name, the method name, then all arugments passed to the method. For example, if you wanted to replicate the default logging, you could output from the callback `Mongoose: ${collectionName}.${methodName}(${methodArgs.join(', ')})`. + * - 'returnOriginal': If `false`, changes the default `returnOriginal` option to `findOneAndUpdate()`, `findByIdAndUpdate`, and `findOneAndReplace()` to false. This is equivalent to setting the `new` option to `true` for `findOneAndX()` calls by default. 
Read our [`findOneAndUpdate()` tutorial](/docs/tutorials/findoneandupdate.html) for more information. + * - 'bufferCommands': enable/disable mongoose's buffering mechanism for all connections and models + * - 'cloneSchemas': false by default. Set to `true` to `clone()` all schemas before compiling into a model. + * - 'applyPluginsToDiscriminators': false by default. Set to true to apply global plugins to discriminator schemas. This typically isn't necessary because plugins are applied to the base schema and discriminators copy all middleware, methods, statics, and properties from the base schema. + * - 'applyPluginsToChildSchemas': true by default. Set to false to skip applying global plugins to child schemas + * - 'objectIdGetter': true by default. Mongoose adds a getter to MongoDB ObjectId's called `_id` that returns `this` for convenience with populate. Set this to false to remove the getter. + * - 'runValidators': false by default. Set to true to enable [update validators](/docs/validation.html#update-validators) for all validators by default. + * - 'toObject': `{ transform: true, flattenDecimals: true }` by default. Overwrites default objects to [`toObject()`](/docs/api.html#document_Document-toObject) + * - 'toJSON': `{ transform: true, flattenDecimals: true }` by default. Overwrites default objects to [`toJSON()`](/docs/api.html#document_Document-toJSON), for determining how Mongoose documents get serialized by `JSON.stringify()` + * - 'strict': true by default, may be `false`, `true`, or `'throw'`. Sets the default strict mode for schemas. + * - 'strictQuery': false by default, may be `false`, `true`, or `'throw'`. Sets the default [strictQuery](/docs/guide.html#strictQuery) mode for schemas. + * - 'selectPopulatedPaths': true by default. Set to false to opt out of Mongoose adding all fields that you `populate()` to your `select()`. The schema-level option `selectPopulatedPaths` overwrites this one. + * - 'maxTimeMS': If set, attaches [maxTimeMS](https://docs.mongodb.com/manual/reference/operator/meta/maxTimeMS/) to every query + * - 'autoIndex': true by default. Set to false to disable automatic index creation for all models associated with this Mongoose instance. + * - 'autoCreate': Set to `true` to make Mongoose call [`Model.createCollection()`](/docs/api/model.html#model_Model.createCollection) automatically when you create a model with `mongoose.model()` or `conn.model()`. This is useful for testing transactions, change streams, and other features that require the collection to exist. + * - 'overwriteModels': Set to `true` to default to overwriting models with the same name when calling `mongoose.model()`, as opposed to throwing an `OverwriteModelError`. + * + * @param {String} key + * @param {String|Function|Boolean} value + * @api public + */ + +Mongoose.prototype.set = function(key, value) { + const _mongoose = this instanceof Mongoose ? 
this : mongoose; + + if (VALID_OPTIONS.indexOf(key) === -1) throw new Error(`\`${key}\` is an invalid option.`); + + if (arguments.length === 1) { + return _mongoose.options[key]; + } + + _mongoose.options[key] = value; + + if (key === 'objectIdGetter') { + if (value) { + Object.defineProperty(mongoose.Types.ObjectId.prototype, '_id', { + enumerable: false, + configurable: true, + get: function() { + return this; + } + }); + } else { + delete mongoose.Types.ObjectId.prototype._id; + } + } + + return _mongoose; +}; + +/** + * Gets mongoose options + * + * ####Example: + * + * mongoose.get('test') // returns the 'test' value + * + * @param {String} key + * @method get + * @api public + */ + +Mongoose.prototype.get = Mongoose.prototype.set; + +/** + * Creates a Connection instance. + * + * Each `connection` instance maps to a single database. This method is helpful when managing multiple db connections. + * + * + * _Options passed take precedence over options included in connection strings._ + * + * ####Example: + * + * // with mongodb:// URI + * db = mongoose.createConnection('mongodb://user:pass@localhost:port/database'); + * + * // and options + * const opts = { db: { native_parser: true }} + * db = mongoose.createConnection('mongodb://user:pass@localhost:port/database', opts); + * + * // replica sets + * db = mongoose.createConnection('mongodb://user:pass@localhost:port,anotherhost:port,yetanother:port/database'); + * + * // and options + * const opts = { replset: { strategy: 'ping', rs_name: 'testSet' }} + * db = mongoose.createConnection('mongodb://user:pass@localhost:port,anotherhost:port,yetanother:port/database', opts); + * + * // and options + * const opts = { server: { auto_reconnect: false }, user: 'username', pass: 'mypassword' } + * db = mongoose.createConnection('localhost', 'database', port, opts) + * + * // initialize now, connect later + * db = mongoose.createConnection(); + * db.openUri('localhost', 'database', port, [opts]); + * + * @param {String} [uri] a mongodb:// URI + * @param {Object} [options] passed down to the [MongoDB driver's `connect()` function](http://mongodb.github.io/node-mongodb-native/3.0/api/MongoClient.html), except for 4 mongoose-specific options explained below. + * @param {Boolean} [options.bufferCommands=true] Mongoose specific option. Set to false to [disable buffering](http://mongoosejs.com/docs/faq.html#callback_never_executes) on all models associated with this connection. + * @param {String} [options.dbName] The name of the database you want to use. If not provided, Mongoose uses the database name from connection string. + * @param {String} [options.user] username for authentication, equivalent to `options.auth.user`. Maintained for backwards compatibility. + * @param {String} [options.pass] password for authentication, equivalent to `options.auth.password`. Maintained for backwards compatibility. + * @param {Boolean} [options.autoIndex=true] Mongoose-specific option. Set to false to disable automatic index creation for all models associated with this connection. + * @param {Number} [options.reconnectTries=30] If you're connected to a single server or mongos proxy (as opposed to a replica set), the MongoDB driver will try to reconnect every `reconnectInterval` milliseconds for `reconnectTries` times, and give up afterward. When the driver gives up, the mongoose connection emits a `reconnectFailed` event. This option does nothing for replica set connections. + * @param {Number} [options.reconnectInterval=1000] See `reconnectTries` option above. 
+ * @param {Class} [options.promiseLibrary] Sets the [underlying driver's promise library](http://mongodb.github.io/node-mongodb-native/3.1/api/MongoClient.html). + * @param {Number} [options.maxPoolSize=5] The maximum number of sockets the MongoDB driver will keep open for this connection. Keep in mind that MongoDB only allows one operation per socket at a time, so you may want to increase this if you find you have a few slow queries that are blocking faster queries from proceeding. See [Slow Trains in MongoDB and Node.js](http://thecodebarbarian.com/slow-trains-in-mongodb-and-nodejs). + * @param {Number} [options.minPoolSize=1] The minimum number of sockets the MongoDB driver will keep open for this connection. Keep in mind that MongoDB only allows one operation per socket at a time, so you may want to increase this if you find you have a few slow queries that are blocking faster queries from proceeding. See [Slow Trains in MongoDB and Node.js](http://thecodebarbarian.com/slow-trains-in-mongodb-and-nodejs). + * @param {Number} [options.connectTimeoutMS=30000] How long the MongoDB driver will wait before killing a socket due to inactivity _during initial connection_. Defaults to 30000. This option is passed transparently to [Node.js' `socket#setTimeout()` function](https://nodejs.org/api/net.html#net_socket_settimeout_timeout_callback). + * @param {Number} [options.socketTimeoutMS=30000] How long the MongoDB driver will wait before killing a socket due to inactivity _after initial connection_. A socket may be inactive because of either no activity or a long-running operation. This is set to `30000` by default, you should set this to 2-3x your longest running operation if you expect some of your database operations to run longer than 20 seconds. This option is passed to [Node.js `socket#setTimeout()` function](https://nodejs.org/api/net.html#net_socket_settimeout_timeout_callback) after the MongoDB driver successfully completes. + * @param {Number} [options.family=0] Passed transparently to [Node.js' `dns.lookup()`](https://nodejs.org/api/dns.html#dns_dns_lookup_hostname_options_callback) function. May be either `0`, `4`, or `6`. `4` means use IPv4 only, `6` means use IPv6 only, `0` means try both. + * @return {Connection} the created Connection object. Connections are thenable, so you can do `await mongoose.createConnection()` + * @api public + */ + +Mongoose.prototype.createConnection = function(uri, options, callback) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + const conn = new Connection(_mongoose); + if (typeof options === 'function') { + callback = options; + options = null; + } + _mongoose.connections.push(conn); + _mongoose.events.emit('createConnection', conn); + + if (arguments.length > 0) { + conn.openUri(uri, options, callback); + } + + return conn; +}; + +/** + * Opens the default mongoose connection. + * + * ####Example: + * + * mongoose.connect('mongodb://user:pass@localhost:port/database'); + * + * // replica sets + * const uri = 'mongodb://user:pass@localhost:port,anotherhost:port,yetanother:port/mydatabase'; + * mongoose.connect(uri); + * + * // with options + * mongoose.connect(uri, options); + * + * // optional callback that gets fired when initial connection completed + * const uri = 'mongodb://nonexistent.domain:27000'; + * mongoose.connect(uri, function(error) { + * // if error is truthy, the initial connection failed. 
+ * }) + * + * @param {String} uri(s) + * @param {Object} [options] passed down to the [MongoDB driver's `connect()` function](http://mongodb.github.io/node-mongodb-native/3.0/api/MongoClient.html), except for 4 mongoose-specific options explained below. + * @param {Boolean} [options.bufferCommands=true] Mongoose specific option. Set to false to [disable buffering](http://mongoosejs.com/docs/faq.html#callback_never_executes) on all models associated with this connection. + * @param {Number} [options.bufferTimeoutMS=10000] Mongoose specific option. If `bufferCommands` is true, Mongoose will throw an error after `bufferTimeoutMS` if the operation is still buffered. + * @param {String} [options.dbName] The name of the database we want to use. If not provided, use database name from connection string. + * @param {String} [options.user] username for authentication, equivalent to `options.auth.user`. Maintained for backwards compatibility. + * @param {String} [options.pass] password for authentication, equivalent to `options.auth.password`. Maintained for backwards compatibility. + * @param {Number} [options.maxPoolSize=100] The maximum number of sockets the MongoDB driver will keep open for this connection. Keep in mind that MongoDB only allows one operation per socket at a time, so you may want to increase this if you find you have a few slow queries that are blocking faster queries from proceeding. See [Slow Trains in MongoDB and Node.js](http://thecodebarbarian.com/slow-trains-in-mongodb-and-nodejs). + * @param {Number} [options.minPoolSize=0] The minimum number of sockets the MongoDB driver will keep open for this connection. + * @param {Number} [options.serverSelectionTimeoutMS] If `useUnifiedTopology = true`, the MongoDB driver will try to find a server to send any given operation to, and keep retrying for `serverSelectionTimeoutMS` milliseconds before erroring out. If not set, the MongoDB driver defaults to using `30000` (30 seconds). + * @param {Number} [options.heartbeatFrequencyMS] If `useUnifiedTopology = true`, the MongoDB driver sends a heartbeat every `heartbeatFrequencyMS` to check on the status of the connection. A heartbeat is subject to `serverSelectionTimeoutMS`, so the MongoDB driver will retry failed heartbeats for up to 30 seconds by default. Mongoose only emits a `'disconnected'` event after a heartbeat has failed, so you may want to decrease this setting to reduce the time between when your server goes down and when Mongoose emits `'disconnected'`. We recommend you do **not** set this setting below 1000, too many heartbeats can lead to performance degradation. + * @param {Boolean} [options.autoIndex=true] Mongoose-specific option. Set to false to disable automatic index creation for all models associated with this connection. + * @param {Number} [options.reconnectTries=30] If you're connected to a single server or mongos proxy (as opposed to a replica set), the MongoDB driver will try to reconnect every `reconnectInterval` milliseconds for `reconnectTries` times, and give up afterward. When the driver gives up, the mongoose connection emits a `reconnectFailed` event. This option does nothing for replica set connections. + * @param {Number} [options.reconnectInterval=1000] See `reconnectTries` option above. + * @param {Class} [options.promiseLibrary] Sets the [underlying driver's promise library](http://mongodb.github.io/node-mongodb-native/3.1/api/MongoClient.html). 
+ * @param {Number} [options.connectTimeoutMS=30000] How long the MongoDB driver will wait before killing a socket due to inactivity _during initial connection_. Defaults to 30000. This option is passed transparently to [Node.js' `socket#setTimeout()` function](https://nodejs.org/api/net.html#net_socket_settimeout_timeout_callback). + * @param {Number} [options.socketTimeoutMS=30000] How long the MongoDB driver will wait before killing a socket due to inactivity _after initial connection_. A socket may be inactive because of either no activity or a long-running operation. This is set to `30000` by default, you should set this to 2-3x your longest running operation if you expect some of your database operations to run longer than 20 seconds. This option is passed to [Node.js `socket#setTimeout()` function](https://nodejs.org/api/net.html#net_socket_settimeout_timeout_callback) after the MongoDB driver successfully completes. + * @param {Number} [options.family=0] Passed transparently to [Node.js' `dns.lookup()`](https://nodejs.org/api/dns.html#dns_dns_lookup_hostname_options_callback) function. May be either `0`, `4`, or `6`. `4` means use IPv4 only, `6` means use IPv6 only, `0` means try both. + * @param {Boolean} [options.autoCreate=false] Set to `true` to make Mongoose automatically call `createCollection()` on every model created on this connection. + * @param {Function} [callback] + * @see Mongoose#createConnection #index_Mongoose-createConnection + * @api public + * @return {Promise} resolves to `this` if connection succeeded + */ + +Mongoose.prototype.connect = function(uri, options, callback) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + const conn = _mongoose.connection; + + return _mongoose._promiseOrCallback(callback, cb => { + conn.openUri(uri, options, err => { + if (err != null) { + return cb(err); + } + return cb(null, _mongoose); + }); + }); +}; + +/** + * Runs `.close()` on all connections in parallel. + * + * @param {Function} [callback] called after all connection close, or when first error occurred. + * @return {Promise} resolves when all connections are closed, or rejects with the first error that occurred. + * @api public + */ + +Mongoose.prototype.disconnect = function(callback) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + return _mongoose._promiseOrCallback(callback, cb => { + let remaining = _mongoose.connections.length; + if (remaining <= 0) { + return cb(null); + } + _mongoose.connections.forEach(conn => { + conn.close(function(error) { + if (error) { + return cb(error); + } + if (!--remaining) { + cb(null); + } + }); + }); + }); +}; + +/** + * _Requires MongoDB >= 3.6.0._ Starts a [MongoDB session](https://docs.mongodb.com/manual/release-notes/3.6/#client-sessions) + * for benefits like causal consistency, [retryable writes](https://docs.mongodb.com/manual/core/retryable-writes/), + * and [transactions](http://thecodebarbarian.com/a-node-js-perspective-on-mongodb-4-transactions.html). + * + * Calling `mongoose.startSession()` is equivalent to calling `mongoose.connection.startSession()`. + * Sessions are scoped to a connection, so calling `mongoose.startSession()` + * starts a session on the [default mongoose connection](/docs/api.html#mongoose_Mongoose-connection). 
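// --- Editor's example (not part of the mongoose source above) ---
// A minimal sketch of the `mongoose.connect()` call documented above, using a
// few of the listed options. The URI is a placeholder (a real app would read it
// from an environment variable) and the option values are illustrative, not
// recommendations.
const mongoose = require('mongoose');

async function connectToDatabase() {
  await mongoose.connect('mongodb://localhost:27017/tasks', {
    dbName: 'tasks',                 // override the database named in the URI
    serverSelectionTimeoutMS: 5000,  // fail faster if no server can be selected
    autoIndex: true                  // build indexes declared in schemas
  });
  console.log('connected, readyState =', mongoose.connection.readyState);
}

connectToDatabase().catch(err => {
  // If this fires, the initial connection failed.
  console.error('initial connection failed:', err.message);
});
// --- end editor's example ---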
+ * + * @param {Object} [options] see the [mongodb driver options](http://mongodb.github.io/node-mongodb-native/3.0/api/MongoClient.html#startSession) + * @param {Boolean} [options.causalConsistency=true] set to false to disable causal consistency + * @param {Function} [callback] + * @return {Promise} promise that resolves to a MongoDB driver `ClientSession` + * @api public + */ + +Mongoose.prototype.startSession = function() { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + return _mongoose.connection.startSession.apply(_mongoose.connection, arguments); +}; + +/** + * Getter/setter around function for pluralizing collection names. + * + * @param {Function|null} [fn] overwrites the function used to pluralize collection names + * @return {Function|null} the current function used to pluralize collection names, defaults to the legacy function from `mongoose-legacy-pluralize`. + * @api public + */ + +Mongoose.prototype.pluralize = function(fn) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + if (arguments.length > 0) { + _mongoose._pluralize = fn; + } + return _mongoose._pluralize; +}; + +/** + * Defines a model or retrieves it. + * + * Models defined on the `mongoose` instance are available to all connection + * created by the same `mongoose` instance. + * + * If you call `mongoose.model()` with twice the same name but a different schema, + * you will get an `OverwriteModelError`. If you call `mongoose.model()` with + * the same name and same schema, you'll get the same schema back. + * + * ####Example: + * + * const mongoose = require('mongoose'); + * + * // define an Actor model with this mongoose instance + * const schema = new Schema({ name: String }); + * mongoose.model('Actor', schema); + * + * // create a new connection + * const conn = mongoose.createConnection(..); + * + * // create Actor model + * const Actor = conn.model('Actor', schema); + * conn.model('Actor') === Actor; // true + * conn.model('Actor', schema) === Actor; // true, same schema + * conn.model('Actor', schema, 'actors') === Actor; // true, same schema and collection name + * + * // This throws an `OverwriteModelError` because the schema is different. + * conn.model('Actor', new Schema({ name: String })); + * + * _When no `collection` argument is passed, Mongoose uses the model name. If you don't like this behavior, either pass a collection name, use `mongoose.pluralize()`, or set your schemas collection name option._ + * + * ####Example: + * + * const schema = new Schema({ name: String }, { collection: 'actor' }); + * + * // or + * + * schema.set('collection', 'actor'); + * + * // or + * + * const collectionName = 'actor' + * const M = mongoose.model('Actor', schema, collectionName) + * + * @param {String|Function} name model name or class extending Model + * @param {Schema} [schema] the schema to use. + * @param {String} [collection] name (optional, inferred from model name) + * @return {Model} The model associated with `name`. Mongoose will create the model if it doesn't already exist. + * @api public + */ + +Mongoose.prototype.model = function(name, schema, collection, options) { + const _mongoose = this instanceof Mongoose ? 
this : mongoose; + + if (typeof schema === 'string') { + collection = schema; + schema = false; + } + + if (utils.isObject(schema) && !(schema instanceof Schema)) { + schema = new Schema(schema); + } + if (schema && !(schema instanceof Schema)) { + throw new Error('The 2nd parameter to `mongoose.model()` should be a ' + + 'schema or a POJO'); + } + + // handle internal options from connection.model() + options = options || {}; + + const originalSchema = schema; + if (schema) { + if (_mongoose.get('cloneSchemas')) { + schema = schema.clone(); + } + _mongoose._applyPlugins(schema); + } + + // connection.model() may be passing a different schema for + // an existing model name. in this case don't read from cache. + const overwriteModels = _mongoose.options.hasOwnProperty('overwriteModels') ? + _mongoose.options.overwriteModels : + options.overwriteModels; + if (_mongoose.models.hasOwnProperty(name) && options.cache !== false && overwriteModels !== true) { + if (originalSchema && + originalSchema.instanceOfSchema && + originalSchema !== _mongoose.models[name].schema) { + throw new _mongoose.Error.OverwriteModelError(name); + } + if (collection && collection !== _mongoose.models[name].collection.name) { + // subclass current model with alternate collection + const model = _mongoose.models[name]; + schema = model.prototype.schema; + const sub = model.__subclass(_mongoose.connection, schema, collection); + // do not cache the sub model + return sub; + } + return _mongoose.models[name]; + } + if (schema == null) { + throw new _mongoose.Error.MissingSchemaError(name); + } + + const model = _mongoose._model(name, schema, collection, options); + + _mongoose.connection.models[name] = model; + _mongoose.models[name] = model; + + return model; +}; + +/*! + * ignore + */ + +Mongoose.prototype._model = function(name, schema, collection, options) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + let model; + if (typeof name === 'function') { + model = name; + name = model.name; + if (!(model.prototype instanceof Model)) { + throw new _mongoose.Error('The provided class ' + name + ' must extend Model'); + } + } + + if (schema) { + if (_mongoose.get('cloneSchemas')) { + schema = schema.clone(); + } + _mongoose._applyPlugins(schema); + } + + // Apply relevant "global" options to the schema + if (schema == null || !('pluralization' in schema.options)) { + schema.options.pluralization = _mongoose.options.pluralization; + } + + if (!collection) { + collection = schema.get('collection') || + utils.toCollectionName(name, _mongoose.pluralize()); + } + + const connection = options.connection || _mongoose.connection; + model = _mongoose.Model.compile(model || name, schema, collection, connection, _mongoose); + + // Errors handled internally, so safe to ignore error + model.init(function $modelInitNoop() {}); + + connection.emit('model', model); + + return model; +}; + +/** + * Removes the model named `name` from the default connection, if it exists. + * You can use this function to clean up any models you created in your tests to + * prevent OverwriteModelErrors. + * + * Equivalent to `mongoose.connection.deleteModel(name)`. 
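// --- Editor's example (not part of the mongoose source above) ---
// Hedged example of the `mongoose.model()` behavior described above: the
// collection name is pluralized from the model name unless a third argument
// (or the schema-level `collection` option) is given. Model and collection
// names here are hypothetical.
const mongoose = require('mongoose');

const noteSchema = new mongoose.Schema({ text: String });

// Collection inferred from the model name: 'notes'
const Note = mongoose.model('Note', noteSchema);

// An explicit collection name overrides the pluralized default
const ArchivedNote = mongoose.model('ArchivedNote', noteSchema, 'notes_archive');

// Calling model() again with the same name returns the cached model
console.log(mongoose.model('Note') === Note); // true
// --- end editor's example ---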
+ * + * ####Example: + * + * mongoose.model('User', new Schema({ name: String })); + * console.log(mongoose.model('User')); // Model object + * mongoose.deleteModel('User'); + * console.log(mongoose.model('User')); // undefined + * + * // Usually useful in a Mocha `afterEach()` hook + * afterEach(function() { + * mongoose.deleteModel(/.+/); // Delete every model + * }); + * + * @api public + * @param {String|RegExp} name if string, the name of the model to remove. If regexp, removes all models whose name matches the regexp. + * @return {Mongoose} this + */ + +Mongoose.prototype.deleteModel = function(name) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + _mongoose.connection.deleteModel(name); + delete _mongoose.models[name]; + return _mongoose; +}; + +/** + * Returns an array of model names created on this instance of Mongoose. + * + * ####Note: + * + * _Does not include names of models created using `connection.model()`._ + * + * @api public + * @return {Array} + */ + +Mongoose.prototype.modelNames = function() { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + const names = Object.keys(_mongoose.models); + return names; +}; + +/** + * Applies global plugins to `schema`. + * + * @param {Schema} schema + * @api private + */ + +Mongoose.prototype._applyPlugins = function(schema, options) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + options = options || {}; + options.applyPluginsToDiscriminators = get(_mongoose, + 'options.applyPluginsToDiscriminators', false); + options.applyPluginsToChildSchemas = get(_mongoose, + 'options.applyPluginsToChildSchemas', true); + applyPlugins(schema, _mongoose.plugins, options, '$globalPluginsApplied'); +}; + +/** + * Declares a global plugin executed on all Schemas. + * + * Equivalent to calling `.plugin(fn)` on each Schema you create. + * + * @param {Function} fn plugin callback + * @param {Object} [opts] optional options + * @return {Mongoose} this + * @see plugins ./plugins.html + * @api public + */ + +Mongoose.prototype.plugin = function(fn, opts) { + const _mongoose = this instanceof Mongoose ? this : mongoose; + + _mongoose.plugins.push([fn, opts]); + return _mongoose; +}; + +/** + * The Mongoose module's default connection. Equivalent to `mongoose.connections[0]`, see [`connections`](#mongoose_Mongoose-connections). + * + * ####Example: + * + * const mongoose = require('mongoose'); + * mongoose.connect(...); + * mongoose.connection.on('error', cb); + * + * This is the connection used by default for every model created using [mongoose.model](#index_Mongoose-model). + * + * To create a new connection, use [`createConnection()`](#mongoose_Mongoose-createConnection). + * + * @memberOf Mongoose + * @instance + * @property {Connection} connection + * @api public + */ + +Mongoose.prototype.__defineGetter__('connection', function() { + return this.connections[0]; +}); + +Mongoose.prototype.__defineSetter__('connection', function(v) { + if (v instanceof Connection) { + this.connections[0] = v; + this.models = v.models; + } +}); + +/** + * An array containing all [connections](connections.html) associated with this + * Mongoose instance. By default, there is 1 connection. Calling + * [`createConnection()`](#mongoose_Mongoose-createConnection) adds a connection + * to this array. 
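// --- Editor's example (not part of the mongoose source above) ---
// A small sketch of `createConnection()` (documented above) for an app that
// talks to more than one database. Models compiled with `conn.model()` are
// scoped to that connection rather than to the default one. The URI, schema,
// and model names are placeholders.
const mongoose = require('mongoose');

const logsDb = mongoose.createConnection('mongodb://localhost:27017/logs');

const logSchema = new mongoose.Schema({ message: String, at: Date });
const Log = logsDb.model('Log', logSchema);

async function recordLog(message) {
  // Uses the second connection; mongoose.connection (the default) is untouched.
  return Log.create({ message, at: new Date() });
}
// --- end editor's example ---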
+ * + * ####Example: + * + * const mongoose = require('mongoose'); + * mongoose.connections.length; // 1, just the default connection + * mongoose.connections[0] === mongoose.connection; // true + * + * mongoose.createConnection('mongodb://localhost:27017/test'); + * mongoose.connections.length; // 2 + * + * @memberOf Mongoose + * @instance + * @property {Array} connections + * @api public + */ + +Mongoose.prototype.connections; + +/*! + * Connection + */ + +const Connection = driver.get().getConnection(); + +/*! + * Collection + */ + +const Collection = driver.get().Collection; + +/** + * The Mongoose Aggregate constructor + * + * @method Aggregate + * @api public + */ + +Mongoose.prototype.Aggregate = Aggregate; + +/** + * The Mongoose Collection constructor + * + * @method Collection + * @api public + */ + +Mongoose.prototype.Collection = Collection; + +/** + * The Mongoose [Connection](#connection_Connection) constructor + * + * @memberOf Mongoose + * @instance + * @method Connection + * @api public + */ + +Mongoose.prototype.Connection = Connection; + +/** + * The Mongoose version + * + * #### Example + * + * console.log(mongoose.version); // '5.x.x' + * + * @property version + * @api public + */ + +Mongoose.prototype.version = pkg.version; + +/** + * The Mongoose constructor + * + * The exports of the mongoose module is an instance of this class. + * + * ####Example: + * + * const mongoose = require('mongoose'); + * const mongoose2 = new mongoose.Mongoose(); + * + * @method Mongoose + * @api public + */ + +Mongoose.prototype.Mongoose = Mongoose; + +/** + * The Mongoose [Schema](#schema_Schema) constructor + * + * ####Example: + * + * const mongoose = require('mongoose'); + * const Schema = mongoose.Schema; + * const CatSchema = new Schema(..); + * + * @method Schema + * @api public + */ + +Mongoose.prototype.Schema = Schema; + +/** + * The Mongoose [SchemaType](#schematype_SchemaType) constructor + * + * @method SchemaType + * @api public + */ + +Mongoose.prototype.SchemaType = SchemaType; + +/** + * The various Mongoose SchemaTypes. + * + * ####Note: + * + * _Alias of mongoose.Schema.Types for backwards compatibility._ + * + * @property SchemaTypes + * @see Schema.SchemaTypes #schema_Schema.Types + * @api public + */ + +Mongoose.prototype.SchemaTypes = Schema.Types; + +/** + * The Mongoose [VirtualType](#virtualtype_VirtualType) constructor + * + * @method VirtualType + * @api public + */ + +Mongoose.prototype.VirtualType = VirtualType; + +/** + * The various Mongoose Types. + * + * ####Example: + * + * const mongoose = require('mongoose'); + * const array = mongoose.Types.Array; + * + * ####Types: + * + * - [Array](/docs/schematypes.html#arrays) + * - [Buffer](/docs/schematypes.html#buffers) + * - [Embedded](/docs/schematypes.html#schemas) + * - [DocumentArray](/docs/api/documentarraypath.html) + * - [Decimal128](/docs/api.html#mongoose_Mongoose-Decimal128) + * - [ObjectId](/docs/schematypes.html#objectids) + * - [Map](/docs/schematypes.html#maps) + * - [Subdocument](/docs/schematypes.html#schemas) + * + * Using this exposed access to the `ObjectId` type, we can construct ids on demand. + * + * const ObjectId = mongoose.Types.ObjectId; + * const id1 = new ObjectId; + * + * @property Types + * @api public + */ + +Mongoose.prototype.Types = Types; + +/** + * The Mongoose [Query](#query_Query) constructor. + * + * @method Query + * @api public + */ + +Mongoose.prototype.Query = Query; + +/** + * The Mongoose [Promise](#promise_Promise) constructor. 
+ * + * @memberOf Mongoose + * @instance + * @property Promise + * @api public + */ + +Object.defineProperty(Mongoose.prototype, 'Promise', { + get: function() { + return PromiseProvider.get(); + }, + set: function(lib) { + PromiseProvider.set(lib); + } +}); + +/** + * Storage layer for mongoose promises + * + * @method PromiseProvider + * @api public + */ + +Mongoose.prototype.PromiseProvider = PromiseProvider; + +/** + * The Mongoose [Model](#model_Model) constructor. + * + * @method Model + * @api public + */ + +Mongoose.prototype.Model = Model; + +/** + * The Mongoose [Document](/api/document.html) constructor. + * + * @method Document + * @api public + */ + +Mongoose.prototype.Document = Document; + +/** + * The Mongoose DocumentProvider constructor. Mongoose users should not have to + * use this directly + * + * @method DocumentProvider + * @api public + */ + +Mongoose.prototype.DocumentProvider = require('./document_provider'); + +/** + * The Mongoose ObjectId [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that should be + * [MongoDB ObjectIds](https://docs.mongodb.com/manual/reference/method/ObjectId/). + * Do not use this to create a new ObjectId instance, use `mongoose.Types.ObjectId` + * instead. + * + * ####Example: + * + * const childSchema = new Schema({ parentId: mongoose.ObjectId }); + * + * @property ObjectId + * @api public + */ + +Mongoose.prototype.ObjectId = SchemaTypes.ObjectId; + +/** + * Returns true if Mongoose can cast the given value to an ObjectId, or + * false otherwise. + * + * ####Example: + * + * mongoose.isValidObjectId(new mongoose.Types.ObjectId()); // true + * mongoose.isValidObjectId('0123456789ab'); // true + * mongoose.isValidObjectId(6); // false + * + * @method isValidObjectId + * @api public + */ + +Mongoose.prototype.isValidObjectId = function(v) { + if (v == null) { + return true; + } + const base = this || mongoose; + const ObjectId = base.driver.get().ObjectId; + if (v instanceof ObjectId) { + return true; + } + + if (v._id != null) { + if (v._id instanceof ObjectId) { + return true; + } + if (v._id.toString instanceof Function) { + v = v._id.toString(); + if (typeof v === 'string' && v.length === 12) { + return true; + } + if (typeof v === 'string' && /^[0-9A-Fa-f]{24}$/.test(v)) { + return true; + } + return false; + } + } + + if (v.toString instanceof Function) { + v = v.toString(); + } + + if (typeof v === 'string' && v.length === 12) { + return true; + } + if (typeof v === 'string' && /^[0-9A-Fa-f]{24}$/.test(v)) { + return true; + } + + return false; +}; + +/** + * The Mongoose Decimal128 [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that should be + * [128-bit decimal floating points](http://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-34-decimal.html). + * Do not use this to create a new Decimal128 instance, use `mongoose.Types.Decimal128` + * instead. + * + * ####Example: + * + * const vehicleSchema = new Schema({ fuelLevel: mongoose.Decimal128 }); + * + * @property Decimal128 + * @api public + */ + +Mongoose.prototype.Decimal128 = SchemaTypes.Decimal128; + +/** + * The Mongoose Mixed [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that Mongoose's change tracking, casting, + * and validation should ignore. 
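// --- Editor's example (not part of the mongoose source above) ---
// Hedged example for the Mixed type described above: because Mongoose does not
// track changes inside a Mixed path, in-place mutations should be flagged with
// `markModified()` before saving. Schema and field names are hypothetical, and
// an open connection is assumed.
const mongoose = require('mongoose');

const eventSchema = new mongoose.Schema({
  name: String,
  meta: mongoose.Mixed // anything goes here; no casting or validation
});
const Event = mongoose.model('EventExample', eventSchema);

async function tagEvent(eventId, key, value) {
  const doc = await Event.findById(eventId);
  doc.meta = doc.meta || {};
  doc.meta[key] = value;
  doc.markModified('meta'); // tell Mongoose the Mixed path changed
  return doc.save();
}
// --- end editor's example ---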
+ * + * ####Example: + * + * const schema = new Schema({ arbitrary: mongoose.Mixed }); + * + * @property Mixed + * @api public + */ + +Mongoose.prototype.Mixed = SchemaTypes.Mixed; + +/** + * The Mongoose Date [SchemaType](/docs/schematypes.html). + * + * ####Example: + * + * const schema = new Schema({ test: Date }); + * schema.path('test') instanceof mongoose.Date; // true + * + * @property Date + * @api public + */ + +Mongoose.prototype.Date = SchemaTypes.Date; + +/** + * The Mongoose Number [SchemaType](/docs/schematypes.html). Used for + * declaring paths in your schema that Mongoose should cast to numbers. + * + * ####Example: + * + * const schema = new Schema({ num: mongoose.Number }); + * // Equivalent to: + * const schema = new Schema({ num: 'number' }); + * + * @property Number + * @api public + */ + +Mongoose.prototype.Number = SchemaTypes.Number; + +/** + * The [MongooseError](#error_MongooseError) constructor. + * + * @method Error + * @api public + */ + +Mongoose.prototype.Error = require('./error/index'); + +/** + * Mongoose uses this function to get the current time when setting + * [timestamps](/docs/guide.html#timestamps). You may stub out this function + * using a tool like [Sinon](https://www.npmjs.com/package/sinon) for testing. + * + * @method now + * @returns Date the current time + * @api public + */ + +Mongoose.prototype.now = function now() { return new Date(); }; + +/** + * The Mongoose CastError constructor + * + * @method CastError + * @param {String} type The name of the type + * @param {Any} value The value that failed to cast + * @param {String} path The path `a.b.c` in the doc where this cast error occurred + * @param {Error} [reason] The original error that was thrown + * @api public + */ + +Mongoose.prototype.CastError = require('./error/cast'); + +/** + * The constructor used for schematype options + * + * @method SchemaTypeOptions + * @api public + */ + +Mongoose.prototype.SchemaTypeOptions = require('./options/SchemaTypeOptions'); + +/** + * The [node-mongodb-native](https://github.com/mongodb/node-mongodb-native) driver Mongoose uses. + * + * @property mongo + * @api public + */ + +Mongoose.prototype.mongo = require('mongodb'); + +/** + * The [mquery](https://github.com/aheckmann/mquery) query builder Mongoose uses. + * + * @property mquery + * @api public + */ + +Mongoose.prototype.mquery = require('mquery'); + +/** + * Sanitizes query filters against [query selector injection attacks](https://thecodebarbarian.com/2014/09/04/defending-against-query-selector-injection-attacks.html) + * by wrapping any nested objects that have a property whose name starts with `$` in a `$eq`. + * + * ```javascript + * const obj = { username: 'val', pwd: { $ne: null } }; + * sanitizeFilter(obj); + * obj; // { username: 'val', pwd: { $eq: { $ne: null } } }); + * ``` + * + * @method sanitizeFilter + * @param {Object} filter + * @returns Object the sanitized object + * @api public + */ + +Mongoose.prototype.sanitizeFilter = sanitizeFilter; + +/** + * Tells `sanitizeFilter()` to skip the given object when filtering out potential [query selector injection attacks](https://thecodebarbarian.com/2014/09/04/defending-against-query-selector-injection-attacks.html). + * Use this method when you have a known query selector that you want to use. + * + * ```javascript + * const obj = { username: 'val', pwd: trusted({ $type: 'string', $eq: 'my secret' }) }; + * sanitizeFilter(obj); + * + * // Note that `sanitizeFilter()` did not add `$eq` around `$type`. 
+ * obj; // { username: 'val', pwd: { $type: 'string', $eq: 'my secret' } }); + * ``` + * + * @method trusted + * @param {Object} obj + * @returns Object the passed in object + * @api public + */ + +Mongoose.prototype.trusted = trusted; + +/*! + * ignore + */ + +Mongoose.prototype._promiseOrCallback = function(callback, fn, ee) { + return promiseOrCallback(callback, fn, ee, this.Promise); +}; + +/*! + * The exports object is an instance of Mongoose. + * + * @api public + */ + +const mongoose = module.exports = exports = new Mongoose({ + [defaultMongooseSymbol]: true +}); diff --git a/node_modules/mongoose/lib/internal.js b/node_modules/mongoose/lib/internal.js new file mode 100644 index 000000000..7e534dc99 --- /dev/null +++ b/node_modules/mongoose/lib/internal.js @@ -0,0 +1,38 @@ +/*! + * Dependencies + */ + +'use strict'; + +const StateMachine = require('./statemachine'); +const ActiveRoster = StateMachine.ctor('require', 'modify', 'init', 'default', 'ignore'); + +module.exports = exports = InternalCache; + +function InternalCache() { + this.activePaths = new ActiveRoster; + + // embedded docs + this.ownerDocument = undefined; + this.fullPath = undefined; +} + +InternalCache.prototype.strictMode = undefined; +InternalCache.prototype.selected = undefined; +InternalCache.prototype.shardval = undefined; +InternalCache.prototype.saveError = undefined; +InternalCache.prototype.validationError = undefined; +InternalCache.prototype.adhocPaths = undefined; +InternalCache.prototype.removing = undefined; +InternalCache.prototype.inserting = undefined; +InternalCache.prototype.saving = undefined; +InternalCache.prototype.version = undefined; +InternalCache.prototype._id = undefined; +InternalCache.prototype.populate = undefined; // what we want to populate in this doc +InternalCache.prototype.populated = undefined;// the _ids that have been populated +InternalCache.prototype.wasPopulated = false; // if this doc was the result of a population +InternalCache.prototype.scope = undefined; + +InternalCache.prototype.session = null; +InternalCache.prototype.pathsToScopes = null; +InternalCache.prototype.cachedRequired = null; diff --git a/node_modules/mongoose/lib/model.js b/node_modules/mongoose/lib/model.js new file mode 100644 index 000000000..c7aeb5a25 --- /dev/null +++ b/node_modules/mongoose/lib/model.js @@ -0,0 +1,4980 @@ +'use strict'; + +/*! + * Module dependencies. 
+ */ + +const Aggregate = require('./aggregate'); +const ChangeStream = require('./cursor/ChangeStream'); +const Document = require('./document'); +const DocumentNotFoundError = require('./error/notFound'); +const DivergentArrayError = require('./error/divergentArray'); +const EventEmitter = require('events').EventEmitter; +const MongooseBuffer = require('./types/buffer'); +const MongooseError = require('./error/index'); +const OverwriteModelError = require('./error/overwriteModel'); +const PromiseProvider = require('./promise_provider'); +const Query = require('./query'); +const RemoveOptions = require('./options/removeOptions'); +const SaveOptions = require('./options/saveOptions'); +const Schema = require('./schema'); +const ServerSelectionError = require('./error/serverSelection'); +const ValidationError = require('./error/validation'); +const VersionError = require('./error/version'); +const ParallelSaveError = require('./error/parallelSave'); +const applyQueryMiddleware = require('./helpers/query/applyQueryMiddleware'); +const applyHooks = require('./helpers/model/applyHooks'); +const applyMethods = require('./helpers/model/applyMethods'); +const applyStaticHooks = require('./helpers/model/applyStaticHooks'); +const applyStatics = require('./helpers/model/applyStatics'); +const applyWriteConcern = require('./helpers/schema/applyWriteConcern'); +const assignVals = require('./helpers/populate/assignVals'); +const castBulkWrite = require('./helpers/model/castBulkWrite'); +const createPopulateQueryFilter = require('./helpers/populate/createPopulateQueryFilter'); +const getDefaultBulkwriteResult = require('./helpers/getDefaultBulkwriteResult'); +const discriminator = require('./helpers/model/discriminator'); +const each = require('./helpers/each'); +const get = require('./helpers/get'); +const getConstructorName = require('./helpers/getConstructorName'); +const getDiscriminatorByValue = require('./helpers/discriminator/getDiscriminatorByValue'); +const getModelsMapForPopulate = require('./helpers/populate/getModelsMapForPopulate'); +const immediate = require('./helpers/immediate'); +const internalToObjectOptions = require('./options').internalToObjectOptions; +const isDefaultIdIndex = require('./helpers/indexes/isDefaultIdIndex'); +const isIndexEqual = require('./helpers/indexes/isIndexEqual'); +const isPathSelectedInclusive = require('./helpers/projection/isPathSelectedInclusive'); +const leanPopulateMap = require('./helpers/populate/leanPopulateMap'); +const modifiedPaths = require('./helpers/update/modifiedPaths'); +const parallelLimit = require('./helpers/parallelLimit'); +const removeDeselectedForeignField = require('./helpers/populate/removeDeselectedForeignField'); +const util = require('util'); +const utils = require('./utils'); + +const VERSION_WHERE = 1; +const VERSION_INC = 2; +const VERSION_ALL = VERSION_WHERE | VERSION_INC; + +const arrayAtomicsSymbol = require('./helpers/symbols').arrayAtomicsSymbol; +const modelCollectionSymbol = Symbol('mongoose#Model#collection'); +const modelDbSymbol = Symbol('mongoose#Model#db'); +const modelSymbol = require('./helpers/symbols').modelSymbol; +const subclassedSymbol = Symbol('mongoose#Model#subclassed'); + +const saveToObjectOptions = Object.assign({}, internalToObjectOptions, { + bson: true +}); + +/** + * A Model is a class that's your primary tool for interacting with MongoDB. + * An instance of a Model is called a [Document](./api.html#Document). 
+ * + * In Mongoose, the term "Model" refers to subclasses of the `mongoose.Model` + * class. You should not use the `mongoose.Model` class directly. The + * [`mongoose.model()`](./api.html#mongoose_Mongoose-model) and + * [`connection.model()`](./api.html#connection_Connection-model) functions + * create subclasses of `mongoose.Model` as shown below. + * + * ####Example: + * + * // `UserModel` is a "Model", a subclass of `mongoose.Model`. + * const UserModel = mongoose.model('User', new Schema({ name: String })); + * + * // You can use a Model to create new documents using `new`: + * const userDoc = new UserModel({ name: 'Foo' }); + * await userDoc.save(); + * + * // You also use a model to create queries: + * const userFromDb = await UserModel.findOne({ name: 'Foo' }); + * + * @param {Object} doc values for initial set + * @param [fields] optional object containing the fields that were selected in the query which returned this document. You do **not** need to set this parameter to ensure Mongoose handles your [query projection](./api.html#query_Query-select). + * @param {Boolean} [skipId=false] optional boolean. If true, mongoose doesn't add an `_id` field to the document. + * @inherits Document http://mongoosejs.com/docs/api/document.html + * @event `error`: If listening to this event, 'error' is emitted when a document was saved without passing a callback and an `error` occurred. If not listening, the event bubbles to the connection used to create this Model. + * @event `index`: Emitted after `Model#ensureIndexes` completes. If an error occurred it is passed with the event. + * @event `index-single-start`: Emitted when an individual index starts within `Model#ensureIndexes`. The fields and options being used to build the index are also passed with the event. + * @event `index-single-done`: Emitted when an individual index finishes within `Model#ensureIndexes`. If an error occurred it is passed with the event. The fields, options, and index name are also passed. + * @api public + */ + +function Model(doc, fields, skipId) { + if (fields instanceof Schema) { + throw new TypeError('2nd argument to `Model` must be a POJO or string, ' + + '**not** a schema. Make sure you\'re calling `mongoose.model()`, not ' + + '`mongoose.Model()`.'); + } + Document.call(this, doc, fields, skipId); +} + +/*! + * Inherits from Document. + * + * All Model.prototype features are available on + * top level (non-sub) documents. + */ + +Model.prototype.__proto__ = Document.prototype; +Model.prototype.$isMongooseModelPrototype = true; + +/** + * Connection the model uses. + * + * @api public + * @property db + * @memberOf Model + * @instance + */ + +Model.prototype.db; + +/** + * Collection the model uses. + * + * This property is read-only. Modifying this property is a no-op. + * + * @api public + * @property collection + * @memberOf Model + * @instance + */ + +Model.prototype.collection; + +/** + * Internal collection the model uses. + * + * This property is read-only. Modifying this property is a no-op. + * + * @api private + * @property collection + * @memberOf Model + * @instance + */ + + +Model.prototype.$__collection; + +/** + * The name of the model + * + * @api public + * @property modelName + * @memberOf Model + * @instance + */ + +Model.prototype.modelName; + +/** + * Additional properties to attach to the query when calling `save()` and + * `isNew` is false. 
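+ *
+ * ####Example:
+ *
+ *     // a rough sketch; the `owner`/`userId` condition is hypothetical
+ *     doc.$where = { owner: userId };
+ *     // save() now issues updateOne({ _id: doc._id, owner: userId }, ...)
+ *     await doc.save();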
+ * + * @api public + * @property $where + * @memberOf Model + * @instance + */ + +Model.prototype.$where; + +/** + * If this is a discriminator model, `baseModelName` is the name of + * the base model. + * + * @api public + * @property baseModelName + * @memberOf Model + * @instance + */ + +Model.prototype.baseModelName; + +/** + * Event emitter that reports any errors that occurred. Useful for global error + * handling. + * + * ####Example: + * + * MyModel.events.on('error', err => console.log(err.message)); + * + * // Prints a 'CastError' because of the above handler + * await MyModel.findOne({ _id: 'notanid' }).catch(noop); + * + * @api public + * @fires error whenever any query or model function errors + * @memberOf Model + * @static events + */ + +Model.events; + +/*! + * Compiled middleware for this model. Set in `applyHooks()`. + * + * @api private + * @property _middleware + * @memberOf Model + * @static + */ + +Model._middleware; + +/*! + * ignore + */ + +function _applyCustomWhere(doc, where) { + if (doc.$where == null) { + return; + } + for (const key of Object.keys(doc.$where)) { + where[key] = doc.$where[key]; + } +} + +/*! + * ignore + */ + +Model.prototype.$__handleSave = function(options, callback) { + const _this = this; + let saveOptions = {}; + + applyWriteConcern(this.$__schema, options); + if (typeof options.writeConcern != 'undefined') { + saveOptions.writeConcern = {}; + if ('w' in options.writeConcern) { + saveOptions.writeConcern.w = options.writeConcern.w; + } + if ('j' in options.writeConcern) { + saveOptions.writeConcern.j = options.writeConcern.j; + } + if ('wtimeout' in options.writeConcern) { + saveOptions.writeConcern.wtimeout = options.writeConcern.wtimeout; + } + } else { + if ('w' in options) { + saveOptions.w = options.w; + } + if ('j' in options) { + saveOptions.j = options.j; + } + if ('wtimeout' in options) { + saveOptions.wtimeout = options.wtimeout; + } + } + if ('checkKeys' in options) { + saveOptions.checkKeys = options.checkKeys; + } + const session = this.$session(); + if (!saveOptions.hasOwnProperty('session')) { + saveOptions.session = session; + } + + if (Object.keys(saveOptions).length === 0) { + saveOptions = null; + } + if (this.$isNew) { + // send entire doc + const obj = this.toObject(saveToObjectOptions); + if ((obj || {})._id === void 0) { + // documents must have an _id else mongoose won't know + // what to update later if more changes are made. the user + // wouldn't know what _id was generated by mongodb either + // nor would the ObjectId generated by mongodb necessarily + // match the schema definition. 
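+ // note: in the usual case the schema's ObjectId default has already
+ // populated `_id` client-side, so this branch only fires when `_id`
+ // generation was disabled or the value was explicitly unset.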
+ immediate(function() { + callback(new MongooseError('document must have an _id before saving')); + }); + return; + } + + this.$__version(true, obj); + this[modelCollectionSymbol].insertOne(obj, saveOptions, function(err, ret) { + if (err) { + _setIsNew(_this, true); + + callback(err, null); + return; + } + + callback(null, ret); + }); + this.$__reset(); + _setIsNew(this, false); + // Make it possible to retry the insert + this.$__.inserting = true; + } else { + // Make sure we don't treat it as a new object on error, + // since it already exists + this.$__.inserting = false; + + const delta = this.$__delta(); + if (delta) { + if (delta instanceof MongooseError) { + callback(delta); + return; + } + + const where = this.$__where(delta[0]); + if (where instanceof MongooseError) { + callback(where); + return; + } + + _applyCustomWhere(this, where); + this[modelCollectionSymbol].updateOne(where, delta[1], saveOptions, (err, ret) => { + if (err) { + this.$__undoReset(); + + callback(err); + return; + } + ret.$where = where; + callback(null, ret); + }); + } else { + const optionsWithCustomValues = Object.assign({}, options, saveOptions); + const where = this.$__where(); + if (this.$__schema.options.optimisticConcurrency) { + const key = this.$__schema.options.versionKey; + const val = this.$__getValue(key); + if (val != null) { + where[key] = val; + } + } + this.constructor.exists(where, optionsWithCustomValues) + .then((documentExists) => { + if (!documentExists) { + throw new DocumentNotFoundError(this.$__where(), this.constructor.modelName); + } + + callback(); + }) + .catch(callback); + return; + } + + // store the modified paths before the document is reset + this.$__.modifiedPaths = this.modifiedPaths(); + this.$__reset(); + + _setIsNew(this, false); + } +}; + +/*! + * ignore + */ + +Model.prototype.$__save = function(options, callback) { + this.$__handleSave(options, (error, result) => { + const hooks = this.$__schema.s.hooks; + if (error) { + return hooks.execPost('save:error', this, [this], { error: error }, (error) => { + callback(error, this); + }); + } + let numAffected = 0; + if (get(options, 'safe.w') !== 0 && get(options, 'w') !== 0) { + // Skip checking if write succeeded if writeConcern is set to + // unacknowledged writes, because otherwise `numAffected` will always be 0 + if (result) { + if (Array.isArray(result)) { + numAffected = result.length; + } else if (result.matchedCount != null) { + numAffected = result.matchedCount; + } else { + numAffected = result; + } + } + // was this an update that required a version bump? + if (this.$__.version && !this.$__.inserting) { + const doIncrement = VERSION_INC === (VERSION_INC & this.$__.version); + this.$__.version = undefined; + const key = this.$__schema.options.versionKey; + const version = this.$__getValue(key) || 0; + if (numAffected <= 0) { + // the update failed. 
pass an error back + this.$__undoReset(); + const err = this.$__.$versionError || + new VersionError(this, version, this.$__.modifiedPaths); + return callback(err); + } + + // increment version if was successful + if (doIncrement) { + this.$__setValue(key, version + 1); + } + } + if (result != null && numAffected <= 0) { + this.$__undoReset(); + error = new DocumentNotFoundError(result.$where, + this.constructor.modelName, numAffected, result); + return hooks.execPost('save:error', this, [this], { error: error }, (error) => { + callback(error, this); + }); + } + } + this.$__.saving = undefined; + this.$__.savedState = {}; + this.$emit('save', this, numAffected); + this.constructor.emit('save', this, numAffected); + callback(null, this); + }); +}; + +/*! + * ignore + */ + +function generateVersionError(doc, modifiedPaths) { + const key = doc.$__schema.options.versionKey; + if (!key) { + return null; + } + const version = doc.$__getValue(key) || 0; + return new VersionError(doc, version, modifiedPaths); +} + +/** + * Saves this document by inserting a new document into the database if [document.isNew](/docs/api.html#document_Document-isNew) is `true`, + * or sends an [updateOne](/docs/api.html#document_Document-updateOne) operation with just the modified paths if `isNew` is `false`. + * + * ####Example: + * + * product.sold = Date.now(); + * product = await product.save(); + * + * If save is successful, the returned promise will fulfill with the document + * saved. + * + * ####Example: + * + * const newProduct = await product.save(); + * newProduct === product; // true + * + * @param {Object} [options] options optional options + * @param {Session} [options.session=null] the [session](https://docs.mongodb.com/manual/reference/server-sessions/) associated with this save operation. If not specified, defaults to the [document's associated session](api.html#document_Document-$session). + * @param {Object} [options.safe] (DEPRECATED) overrides [schema's safe option](http://mongoosejs.com//docs/guide.html#safe). Use the `w` option instead. + * @param {Boolean} [options.validateBeforeSave] set to false to save without validating. + * @param {Boolean} [options.validateModifiedOnly=false] if `true`, Mongoose will only validate modified paths, as opposed to modified paths and `required` paths. + * @param {Number|String} [options.w] set the [write concern](https://docs.mongodb.com/manual/reference/write-concern/#w-option). Overrides the [schema-level `writeConcern` option](/docs/guide.html#writeConcern) + * @param {Boolean} [options.j] set to true for MongoDB to wait until this `save()` has been [journaled before resolving the returned promise](https://docs.mongodb.com/manual/reference/write-concern/#j-option). Overrides the [schema-level `writeConcern` option](/docs/guide.html#writeConcern) + * @param {Number} [options.wtimeout] sets a [timeout for the write concern](https://docs.mongodb.com/manual/reference/write-concern/#wtimeout). Overrides the [schema-level `writeConcern` option](/docs/guide.html#writeConcern). + * @param {Boolean} [options.checkKeys=true] the MongoDB driver prevents you from saving keys that start with '$' or contain '.' by default. Set this option to `false` to skip that check. See [restrictions on field names](https://docs.mongodb.com/manual/reference/limits/#Restrictions-on-Field-Names) + * @param {Boolean} [options.timestamps=true] if `false` and [timestamps](./guide.html#timestamps) are enabled, skip timestamps for this `save()`. 
+ * @param {Function} [fn] optional callback + * @throws {DocumentNotFoundError} if this [save updates an existing document](api.html#document_Document-isNew) but the document doesn't exist in the database. For example, you will get this error if the document is [deleted between when you retrieved the document and when you saved it](documents.html#updating). + * @return {Promise|undefined} Returns undefined if used with callback or a Promise otherwise. + * @api public + * @see middleware http://mongoosejs.com/docs/middleware.html + */ + +Model.prototype.save = function(options, fn) { + let parallelSave; + this.$op = 'save'; + + if (this.$__.saving) { + parallelSave = new ParallelSaveError(this); + } else { + this.$__.saving = new ParallelSaveError(this); + } + + if (typeof options === 'function') { + fn = options; + options = undefined; + } + + options = new SaveOptions(options); + if (options.hasOwnProperty('session')) { + this.$session(options.session); + } + this.$__.$versionError = generateVersionError(this, this.modifiedPaths()); + + fn = this.constructor.$handleCallbackError(fn); + return this.constructor.db.base._promiseOrCallback(fn, cb => { + cb = this.constructor.$wrapCallback(cb); + + if (parallelSave) { + this.$__handleReject(parallelSave); + return cb(parallelSave); + } + + this.$__.saveOptions = options; + + this.$__save(options, error => { + this.$__.saving = undefined; + delete this.$__.saveOptions; + delete this.$__.$versionError; + this.$op = null; + + if (error) { + this.$__handleReject(error); + return cb(error); + } + cb(null, this); + }); + }, this.constructor.events); +}; + +Model.prototype.$save = Model.prototype.save; + +/*! + * Determines whether versioning should be skipped for the given path + * + * @param {Document} self + * @param {String} path + * @return {Boolean} true if versioning should be skipped for the given path + */ +function shouldSkipVersioning(self, path) { + const skipVersioning = self.$__schema.options.skipVersioning; + if (!skipVersioning) return false; + + // Remove any array indexes from the path + path = path.replace(/\.\d+\./, '.'); + + return skipVersioning[path]; +} + +/*! + * Apply the operation to the delta (update) clause as + * well as track versioning for our where clause. + * + * @param {Document} self + * @param {Object} where + * @param {Object} delta + * @param {Object} data + * @param {Mixed} val + * @param {String} [operation] + */ + +function operand(self, where, delta, data, val, op) { + // delta + op || (op = '$set'); + if (!delta[op]) delta[op] = {}; + delta[op][data.path] = val; + // disabled versioning? + if (self.$__schema.options.versionKey === false) return; + + // path excluded from versioning? + if (shouldSkipVersioning(self, data.path)) return; + + // already marked for versioning? + if (VERSION_ALL === (VERSION_ALL & self.$__.version)) return; + + if (self.$__schema.options.optimisticConcurrency) { + self.$__.version = VERSION_ALL; + return; + } + + switch (op) { + case '$set': + case '$unset': + case '$pop': + case '$pull': + case '$pullAll': + case '$push': + case '$addToSet': + break; + default: + // nothing to do + return; + } + + // ensure updates sent with positional notation are + // editing the correct array element. + // only increment the version if an array position changes. + // modifying elements of an array is ok if position does not change. 
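+ // rough effect of the cases below: $push/$addToSet/$pull/$pullAll only mark
+ // the version for an increment; $pop and a whole-array $set bump both the
+ // increment and the where-clause check; a numeric subpath like `tags.3`
+ // only adds the version to the where clause.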
+ if (op === '$push' || op === '$addToSet' || op === '$pullAll' || op === '$pull') { + self.$__.version = VERSION_INC; + } else if (/^\$p/.test(op)) { + // potentially changing array positions + increment.call(self); + } else if (Array.isArray(val)) { + // $set an array + increment.call(self); + } else if (/\.\d+\.|\.\d+$/.test(data.path)) { + // now handling $set, $unset + // subpath of array + self.$__.version = VERSION_WHERE; + } +} + +/*! + * Compiles an update and where clause for a `val` with _atomics. + * + * @param {Document} self + * @param {Object} where + * @param {Object} delta + * @param {Object} data + * @param {Array} value + */ + +function handleAtomics(self, where, delta, data, value) { + if (delta.$set && delta.$set[data.path]) { + // $set has precedence over other atomics + return; + } + + if (typeof value.$__getAtomics === 'function') { + value.$__getAtomics().forEach(function(atomic) { + const op = atomic[0]; + const val = atomic[1]; + operand(self, where, delta, data, val, op); + }); + return; + } + + // legacy support for plugins + + const atomics = value[arrayAtomicsSymbol]; + const ops = Object.keys(atomics); + let i = ops.length; + let val; + let op; + + if (i === 0) { + // $set + + if (utils.isMongooseObject(value)) { + value = value.toObject({ depopulate: 1, _isNested: true }); + } else if (value.valueOf) { + value = value.valueOf(); + } + + return operand(self, where, delta, data, value); + } + + function iter(mem) { + return utils.isMongooseObject(mem) + ? mem.toObject({ depopulate: 1, _isNested: true }) + : mem; + } + + while (i--) { + op = ops[i]; + val = atomics[op]; + + if (utils.isMongooseObject(val)) { + val = val.toObject({ depopulate: true, transform: false, _isNested: true }); + } else if (Array.isArray(val)) { + val = val.map(iter); + } else if (val.valueOf) { + val = val.valueOf(); + } + + if (op === '$addToSet') { + val = { $each: val }; + } + + operand(self, where, delta, data, val, op); + } +} + +/** + * Produces a special query document of the modified properties used in updates. + * + * @api private + * @method $__delta + * @memberOf Model + * @instance + */ + +Model.prototype.$__delta = function() { + const dirty = this.$__dirty(); + if (!dirty.length && VERSION_ALL !== this.$__.version) { + return; + } + const where = {}; + const delta = {}; + const len = dirty.length; + const divergent = []; + let d = 0; + + where._id = this._doc._id; + // If `_id` is an object, need to depopulate, but also need to be careful + // because `_id` can technically be null (see gh-6406) + if (get(where, '_id.$__', null) != null) { + where._id = where._id.toObject({ transform: false, depopulate: true }); + } + for (; d < len; ++d) { + const data = dirty[d]; + let value = data.value; + const match = checkDivergentArray(this, data.path, value); + if (match) { + divergent.push(match); + continue; + } + + const pop = this.$populated(data.path, true); + if (!pop && this.$__.selected) { + // If any array was selected using an $elemMatch projection, we alter the path and where clause + // NOTE: MongoDB only supports projected $elemMatch on top level array. 
+ const pathSplit = data.path.split('.'); + const top = pathSplit[0]; + if (this.$__.selected[top] && this.$__.selected[top].$elemMatch) { + // If the selected array entry was modified + if (pathSplit.length > 1 && pathSplit[1] == 0 && typeof where[top] === 'undefined') { + where[top] = this.$__.selected[top]; + pathSplit[1] = '$'; + data.path = pathSplit.join('.'); + } + // if the selected array was modified in any other way throw an error + else { + divergent.push(data.path); + continue; + } + } + } + + if (divergent.length) continue; + if (value === undefined) { + operand(this, where, delta, data, 1, '$unset'); + } else if (value === null) { + operand(this, where, delta, data, null); + } else if (value.isMongooseArray && value.$path() && value[arrayAtomicsSymbol]) { + // arrays and other custom types (support plugins etc) + handleAtomics(this, where, delta, data, value); + } else if (value[MongooseBuffer.pathSymbol] && Buffer.isBuffer(value)) { + // MongooseBuffer + value = value.toObject(); + operand(this, where, delta, data, value); + } else { + value = utils.clone(value, { + depopulate: true, + transform: false, + virtuals: false, + getters: false, + _isNested: true + }); + operand(this, where, delta, data, value); + } + } + + if (divergent.length) { + return new DivergentArrayError(divergent); + } + + if (this.$__.version) { + this.$__version(where, delta); + } + return [where, delta]; +}; + +/*! + * Determine if array was populated with some form of filter and is now + * being updated in a manner which could overwrite data unintentionally. + * + * @see https://github.com/Automattic/mongoose/issues/1334 + * @param {Document} doc + * @param {String} path + * @return {String|undefined} + */ + +function checkDivergentArray(doc, path, array) { + // see if we populated this path + const pop = doc.$populated(path, true); + + if (!pop && doc.$__.selected) { + // If any array was selected using an $elemMatch projection, we deny the update. + // NOTE: MongoDB only supports projected $elemMatch on top level array. + const top = path.split('.')[0]; + if (doc.$__.selected[top + '.$']) { + return top; + } + } + + if (!(pop && array && array.isMongooseArray)) return; + + // If the array was populated using options that prevented all + // documents from being returned (match, skip, limit) or they + // deselected the _id field, $pop and $set of the array are + // not safe operations. If _id was deselected, we do not know + // how to remove elements. $pop will pop off the _id from the end + // of the array in the db which is not guaranteed to be the + // same as the last element we have here. $set of the entire array + // would be similarily destructive as we never received all + // elements of the array and potentially would overwrite data. + const check = pop.options.match || + pop.options.options && utils.object.hasOwnProperty(pop.options.options, 'limit') || // 0 is not permitted + pop.options.options && pop.options.options.skip || // 0 is permitted + pop.options.select && // deselected _id? + (pop.options.select._id === 0 || + /\s?-_id\s?/.test(pop.options.select)); + + if (check) { + const atomics = array[arrayAtomicsSymbol]; + if (Object.keys(atomics).length === 0 || atomics.$set || atomics.$pop) { + return path; + } + } +} + +/** + * Appends versioning to the where and update clauses. 
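+ *
+ * For example, with the default `__v` key, a whole-array `$set` roughly
+ * produces `where: { _id, __v: <current> }` plus `$inc: { __v: 1 }` in the
+ * delta, while `$push`/`$addToSet` only add the `$inc` (a sketch; the exact
+ * shape depends on the operation and the `versionKey` option).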
+ * + * @api private + * @method $__version + * @memberOf Model + * @instance + */ + +Model.prototype.$__version = function(where, delta) { + const key = this.$__schema.options.versionKey; + + if (where === true) { + // this is an insert + if (key) { + this.$__setValue(key, delta[key] = 0); + } + return; + } + + // updates + + // only apply versioning if our versionKey was selected. else + // there is no way to select the correct version. we could fail + // fast here and force them to include the versionKey but + // thats a bit intrusive. can we do this automatically? + if (!this.$__isSelected(key)) { + return; + } + + // $push $addToSet don't need the where clause set + if (VERSION_WHERE === (VERSION_WHERE & this.$__.version)) { + const value = this.$__getValue(key); + if (value != null) where[key] = value; + } + + if (VERSION_INC === (VERSION_INC & this.$__.version)) { + if (get(delta.$set, key, null) != null) { + // Version key is getting set, means we'll increment the doc's version + // after a successful save, so we should set the incremented version so + // future saves don't fail (gh-5779) + ++delta.$set[key]; + } else { + delta.$inc = delta.$inc || {}; + delta.$inc[key] = 1; + } + } +}; + +/** + * Signal that we desire an increment of this documents version. + * + * ####Example: + * + * Model.findById(id, function (err, doc) { + * doc.increment(); + * doc.save(function (err) { .. }) + * }) + * + * @see versionKeys http://mongoosejs.com/docs/guide.html#versionKey + * @api public + */ + +function increment() { + this.$__.version = VERSION_ALL; + return this; +} + +Model.prototype.increment = increment; + +/** + * Returns a query object + * + * @api private + * @method $__where + * @memberOf Model + * @instance + */ + +Model.prototype.$__where = function _where(where) { + where || (where = {}); + + if (!where._id) { + where._id = this._doc._id; + } + + if (this._doc._id === void 0) { + return new MongooseError('No _id found on document!'); + } + + return where; +}; + +/** + * Removes this document from the db. + * + * ####Example: + * product.remove(function (err, product) { + * if (err) return handleError(err); + * Product.findById(product._id, function (err, product) { + * console.log(product) // null + * }) + * }) + * + * + * As an extra measure of flow control, remove will return a Promise (bound to `fn` if passed) so it could be chained, or hooked to recieve errors + * + * ####Example: + * product.remove().then(function (product) { + * ... + * }).catch(function (err) { + * assert.ok(err) + * }) + * + * @param {Object} [options] + * @param {Session} [options.session=null] the [session](https://docs.mongodb.com/manual/reference/server-sessions/) associated with this operation. If not specified, defaults to the [document's associated session](api.html#document_Document-$session). + * @param {function(err,product)} [fn] optional callback + * @return {Promise} Promise + * @api public + */ + +Model.prototype.remove = function remove(options, fn) { + if (typeof options === 'function') { + fn = options; + options = undefined; + } + + options = new RemoveOptions(options); + if (options.hasOwnProperty('session')) { + this.$session(options.session); + } + this.$op = 'remove'; + + fn = this.constructor.$handleCallbackError(fn); + + return this.constructor.db.base._promiseOrCallback(fn, cb => { + cb = this.constructor.$wrapCallback(cb); + this.$__remove(options, (err, res) => { + this.$op = null; + cb(err, res); + }); + }, this.constructor.events); +}; + +/*! 
+ * Alias for remove + */ + +Model.prototype.$remove = Model.prototype.remove; +Model.prototype.delete = Model.prototype.remove; + +/** + * Removes this document from the db. Equivalent to `.remove()`. + * + * ####Example: + * product = await product.deleteOne(); + * await Product.findById(product._id); // null + * + * @param {function(err,product)} [fn] optional callback + * @return {Promise} Promise + * @api public + */ + +Model.prototype.deleteOne = function deleteOne(options, fn) { + if (typeof options === 'function') { + fn = options; + options = undefined; + } + + if (!options) { + options = {}; + } + + fn = this.constructor.$handleCallbackError(fn); + + return this.constructor.db.base._promiseOrCallback(fn, cb => { + cb = this.constructor.$wrapCallback(cb); + this.$__deleteOne(options, cb); + }, this.constructor.events); +}; + +/*! + * ignore + */ + +Model.prototype.$__remove = function $__remove(options, cb) { + if (this.$__.isDeleted) { + return immediate(() => cb(null, this)); + } + + const where = this.$__where(); + if (where instanceof MongooseError) { + return cb(where); + } + + _applyCustomWhere(this, where); + + const session = this.$session(); + if (!options.hasOwnProperty('session')) { + options.session = session; + } + + this[modelCollectionSymbol].deleteOne(where, options, err => { + if (!err) { + this.$__.isDeleted = true; + this.$emit('remove', this); + this.constructor.emit('remove', this); + return cb(null, this); + } + this.$__.isDeleted = false; + cb(err); + }); +}; + +/*! + * ignore + */ + +Model.prototype.$__deleteOne = Model.prototype.$__remove; + +/** + * Returns another Model instance. + * + * ####Example: + * + * const doc = new Tank; + * doc.model('User').findById(id, callback); + * + * @param {String} name model name + * @api public + */ + +Model.prototype.model = function model(name) { + return this[modelDbSymbol].model(name); +}; + +/** + * Returns true if at least one document exists in the database that matches + * the given `filter`, and false otherwise. + * + * Under the hood, `MyModel.exists({ answer: 42 })` is equivalent to + * `MyModel.findOne({ answer: 42 }).select({ _id: 1 }).lean()` + * + * ####Example: + * await Character.deleteMany({}); + * await Character.create({ name: 'Jean-Luc Picard' }); + * + * await Character.exists({ name: /picard/i }); // { _id: ... } + * await Character.exists({ name: /riker/i }); // null + * + * This function triggers the following middleware. + * + * - `findOne()` + * + * @param {Object} filter + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] callback + * @return {Query} + */ + +Model.exists = function exists(filter, options, callback) { + _checkContext(this, 'exists'); + if (typeof options === 'function') { + callback = options; + options = null; + } + + const query = this.findOne(filter). + select({ _id: 1 }). + lean(). + setOptions(options); + + if (typeof callback === 'function') { + return query.exec(callback); + } + options = options || {}; + if (!options.explain) { + return query.then(doc => !!doc); + } + + return query; +}; + +/** + * Adds a discriminator type. 
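+ *
+ * In the simplest case you pass a plain schema (a minimal sketch; the
+ * `Event`/`Clicked` names below are illustrative):
+ *
+ *     const Event = mongoose.model('Event', new Schema({ time: Date }));
+ *     const Clicked = Event.discriminator('Clicked', new Schema({ url: String }));
+ *     new Clicked({ url: '/home' }).__t; // 'Clicked'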
+ * + * ####Example: + * + * function BaseSchema() { + * Schema.apply(this, arguments); + * + * this.add({ + * name: String, + * createdAt: Date + * }); + * } + * util.inherits(BaseSchema, Schema); + * + * const PersonSchema = new BaseSchema(); + * const BossSchema = new BaseSchema({ department: String }); + * + * const Person = mongoose.model('Person', PersonSchema); + * const Boss = Person.discriminator('Boss', BossSchema); + * new Boss().__t; // "Boss". `__t` is the default `discriminatorKey` + * + * const employeeSchema = new Schema({ boss: ObjectId }); + * const Employee = Person.discriminator('Employee', employeeSchema, 'staff'); + * new Employee().__t; // "staff" because of 3rd argument above + * + * @param {String} name discriminator model name + * @param {Schema} schema discriminator model schema + * @param {Object|String} [options] If string, same as `options.value`. + * @param {String} [options.value] the string stored in the `discriminatorKey` property. If not specified, Mongoose uses the `name` parameter. + * @param {Boolean} [options.clone=true] By default, `discriminator()` clones the given `schema`. Set to `false` to skip cloning. + * @return {Model} The newly created discriminator model + * @api public + */ + +Model.discriminator = function(name, schema, options) { + let model; + if (typeof name === 'function') { + model = name; + name = utils.getFunctionName(model); + if (!(model.prototype instanceof Model)) { + throw new MongooseError('The provided class ' + name + ' must extend Model'); + } + } + + options = options || {}; + const value = utils.isPOJO(options) ? options.value : options; + const clone = get(options, 'clone', true); + + _checkContext(this, 'discriminator'); + + if (utils.isObject(schema) && !schema.instanceOfSchema) { + schema = new Schema(schema); + } + if (schema instanceof Schema && clone) { + schema = schema.clone(); + } + + schema = discriminator(this, name, schema, value, true); + if (this.db.models[name]) { + throw new OverwriteModelError(name); + } + + schema.$isRootDiscriminator = true; + schema.$globalPluginsApplied = true; + + model = this.db.model(model || name, schema, this.$__collection.name); + this.discriminators[name] = model; + const d = this.discriminators[name]; + d.prototype.__proto__ = this.prototype; + Object.defineProperty(d, 'baseModelName', { + value: this.modelName, + configurable: true, + writable: false + }); + + // apply methods and statics + applyMethods(d, schema); + applyStatics(d, schema); + + if (this[subclassedSymbol] != null) { + for (const submodel of this[subclassedSymbol]) { + submodel.discriminators = submodel.discriminators || {}; + submodel.discriminators[name] = + model.__subclass(model.db, schema, submodel.collection.name); + } + } + + return d; +}; + +/*! + * Make sure `this` is a model + */ + +function _checkContext(ctx, fnName) { + // Check context, because it is easy to mistakenly type + // `new Model.discriminator()` and get an incomprehensible error + if (ctx == null || ctx === global) { + throw new MongooseError('`Model.' + fnName + '()` cannot run without a ' + + 'model as `this`. Make sure you are calling `MyModel.' + fnName + '()` ' + + 'where `MyModel` is a Mongoose model.'); + } else if (ctx[modelSymbol] == null) { + throw new MongooseError('`Model.' + fnName + '()` cannot run without a ' + + 'model as `this`. Make sure you are not calling ' + + '`new Model.' + fnName + '()`'); + } +} + +// Model (class) features + +/*! + * Give the constructor the ability to emit events. 
+ */ + +for (const i in EventEmitter.prototype) { + Model[i] = EventEmitter.prototype[i]; +} + +/** + * This function is responsible for building [indexes](https://docs.mongodb.com/manual/indexes/), + * unless [`autoIndex`](http://mongoosejs.com/docs/guide.html#autoIndex) is turned off. + * + * Mongoose calls this function automatically when a model is created using + * [`mongoose.model()`](/docs/api.html#mongoose_Mongoose-model) or + * [`connection.model()`](/docs/api.html#connection_Connection-model), so you + * don't need to call it. This function is also idempotent, so you may call it + * to get back a promise that will resolve when your indexes are finished + * building as an alternative to [`MyModel.on('index')`](/docs/guide.html#indexes) + * + * ####Example: + * + * const eventSchema = new Schema({ thing: { type: 'string', unique: true }}) + * // This calls `Event.init()` implicitly, so you don't need to call + * // `Event.init()` on your own. + * const Event = mongoose.model('Event', eventSchema); + * + * Event.init().then(function(Event) { + * // You can also use `Event.on('index')` if you prefer event emitters + * // over promises. + * console.log('Indexes are done building!'); + * }); + * + * @api public + * @param {Function} [callback] + * @returns {Promise} + */ + +Model.init = function init(callback) { + _checkContext(this, 'init'); + + this.schema.emit('init', this); + + if (this.$init != null) { + if (callback) { + this.$init.then(() => callback(), err => callback(err)); + return null; + } + return this.$init; + } + + const Promise = PromiseProvider.get(); + const autoIndex = utils.getOption('autoIndex', + this.schema.options, this.db.config, this.db.base.options); + const autoCreate = utils.getOption('autoCreate', + this.schema.options, this.db.config, this.db.base.options); + + const _ensureIndexes = autoIndex ? + cb => this.ensureIndexes({ _automatic: true }, cb) : + cb => cb(); + const _createCollection = autoCreate ? + cb => this.createCollection({}, cb) : + cb => cb(); + + this.$init = new Promise((resolve, reject) => { + _createCollection(error => { + if (error) { + return reject(error); + } + _ensureIndexes(error => { + if (error) { + return reject(error); + } + resolve(this); + }); + }); + }); + + if (callback) { + this.$init.then(() => callback(), err => callback(err)); + this.$caught = true; + return null; + } else { + const _catch = this.$init.catch; + const _this = this; + this.$init.catch = function() { + this.$caught = true; + return _catch.apply(_this.$init, arguments); + }; + } + + return this.$init; +}; + + +/** + * Create the collection for this model. By default, if no indexes are specified, + * mongoose will not create the collection for the model until any documents are + * created. Use this method to create the collection explicitly. + * + * Note 1: You may need to call this before starting a transaction + * See https://docs.mongodb.com/manual/core/transactions/#transactions-and-operations + * + * Note 2: You don't have to call this if your schema contains index or unique field. 
+ * In that case, just use `Model.init()` + * + * ####Example: + * + * const userSchema = new Schema({ name: String }) + * const User = mongoose.model('User', userSchema); + * + * User.createCollection().then(function(collection) { + * console.log('Collection is created!'); + * }); + * + * @api public + * @param {Object} [options] see [MongoDB driver docs](http://mongodb.github.io/node-mongodb-native/3.1/api/Db.html#createCollection) + * @param {Function} [callback] + * @returns {Promise} + */ + +Model.createCollection = function createCollection(options, callback) { + _checkContext(this, 'createCollection'); + + if (typeof options === 'string') { + throw new MongooseError('You can\'t specify a new collection name in Model.createCollection.' + + 'This is not like Connection.createCollection. Only options are accepted here.'); + } else if (typeof options === 'function') { + callback = options; + options = void 0; + } + + const schemaCollation = get(this, 'schema.options.collation', null); + if (schemaCollation != null) { + options = Object.assign({ collation: schemaCollation }, options); + } + const capped = get(this, 'schema.options.capped'); + if (capped) { + options = Object.assign({ capped: true }, capped, options); + } + + callback = this.$handleCallbackError(callback); + + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + + this.db.createCollection(this.$__collection.collectionName, options, utils.tick((err) => { + if (err != null && (err.name !== 'MongoServerError' || err.code !== 48)) { + return cb(err); + } + this.$__collection = this.db.collection(this.$__collection.collectionName, options); + cb(null, this.$__collection); + })); + }, this.events); +}; + +/** + * Makes the indexes in MongoDB match the indexes defined in this model's + * schema. This function will drop any indexes that are not defined in + * the model's schema except the `_id` index, and build any indexes that + * are in your schema but not in MongoDB. + * + * See the [introductory blog post](http://thecodebarbarian.com/whats-new-in-mongoose-5-2-syncindexes) + * for more information. + * + * ####Example: + * + * const schema = new Schema({ name: { type: String, unique: true } }); + * const Customer = mongoose.model('Customer', schema); + * await Customer.collection.createIndex({ age: 1 }); // Index is not in schema + * // Will drop the 'age' index and create an index on `name` + * await Customer.syncIndexes(); + * + * @param {Object} [options] options to pass to `ensureIndexes()` + * @param {Boolean} [options.background=null] if specified, overrides each index's `background` property + * @param {Function} [callback] optional callback + * @return {Promise|undefined} Returns `undefined` if callback is specified, returns a promise if no callback. 
+ * @api public + */ + +Model.syncIndexes = function syncIndexes(options, callback) { + _checkContext(this, 'syncIndexes'); + + callback = this.$handleCallbackError(callback); + + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + + this.createCollection(err => { + if (err != null && (err.name !== 'MongoServerError' || err.code !== 48)) { + return cb(err); + } + this.cleanIndexes((err, dropped) => { + if (err != null) { + return cb(err); + } + this.createIndexes(options, err => { + if (err != null) { + return cb(err); + } + cb(null, dropped); + }); + }); + }); + }, this.events); +}; + +/** + * Does a dry-run of Model.syncIndexes(), meaning that + * the result of this function would be the result of + * Model.syncIndexes(). + * + * @param {Object} options not used at all. + * @param {Function} callback optional callback + * @returns {Promise} which containts an object, {toDrop, toCreate}, which + * are indexes that would be dropped in mongodb and indexes that would be created in mongodb. + */ + +Model.diffIndexes = function diffIndexes(options, callback) { + const toDrop = []; + const toCreate = []; + callback = this.$handleCallbackError(callback); + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + this.listIndexes((err, indexes) => { + if (indexes === undefined) { + indexes = []; + } + const schemaIndexes = this.schema.indexes(); + // Iterate through the indexes created in mongodb and + // compare against the indexes in the schema. + for (const index of indexes) { + let found = false; + // Never try to drop `_id` index, MongoDB server doesn't allow it + if (isDefaultIdIndex(index)) { + continue; + } + + for (const schemaIndex of schemaIndexes) { + const key = schemaIndex[0]; + const options = _decorateDiscriminatorIndexOptions(this, + utils.clone(schemaIndex[1])); + if (isIndexEqual(key, options, index)) { + found = true; + } + } + + if (!found) { + toDrop.push(index.name); + } + } + // Iterate through the indexes created on the schema and + // compare against the indexes in mongodb. + for (const schemaIndex of schemaIndexes) { + const key = schemaIndex[0]; + let found = false; + const options = _decorateDiscriminatorIndexOptions(this, + utils.clone(schemaIndex[1])); + for (const index of indexes) { + if (isDefaultIdIndex(index)) { + continue; + } + if (isIndexEqual(key, options, index)) { + found = true; + } + } + if (!found) { + toCreate.push(key); + } + } + cb(null, { toDrop, toCreate }); + }); + }); +}; + +/** + * Deletes all indexes that aren't defined in this model's schema. Used by + * `syncIndexes()`. + * + * The returned promise resolves to a list of the dropped indexes' names as an array + * + * @param {Function} [callback] optional callback + * @return {Promise|undefined} Returns `undefined` if callback is specified, returns a promise if no callback. 
+ * @api public + */ + +Model.cleanIndexes = function cleanIndexes(callback) { + _checkContext(this, 'cleanIndexes'); + + callback = this.$handleCallbackError(callback); + + return this.db.base._promiseOrCallback(callback, cb => { + const collection = this.$__collection; + + this.listIndexes((err, indexes) => { + if (err != null) { + return cb(err); + } + + const schemaIndexes = this.schema.indexes(); + const toDrop = []; + + for (const index of indexes) { + let found = false; + // Never try to drop `_id` index, MongoDB server doesn't allow it + if (isDefaultIdIndex(index)) { + continue; + } + + for (const schemaIndex of schemaIndexes) { + const key = schemaIndex[0]; + const options = _decorateDiscriminatorIndexOptions(this, + utils.clone(schemaIndex[1])); + if (isIndexEqual(key, options, index)) { + found = true; + } + } + + if (!found) { + toDrop.push(index.name); + } + } + + if (toDrop.length === 0) { + return cb(null, []); + } + + dropIndexes(toDrop, cb); + }); + + function dropIndexes(toDrop, cb) { + let remaining = toDrop.length; + let error = false; + toDrop.forEach(indexName => { + collection.dropIndex(indexName, err => { + if (err != null) { + error = true; + return cb(err); + } + if (!error) { + --remaining || cb(null, toDrop); + } + }); + }); + } + }); +}; + +/** + * Lists the indexes currently defined in MongoDB. This may or may not be + * the same as the indexes defined in your schema depending on whether you + * use the [`autoIndex` option](/docs/guide.html#autoIndex) and if you + * build indexes manually. + * + * @param {Function} [cb] optional callback + * @return {Promise|undefined} Returns `undefined` if callback is specified, returns a promise if no callback. + * @api public + */ + +Model.listIndexes = function init(callback) { + _checkContext(this, 'listIndexes'); + + const _listIndexes = cb => { + this.$__collection.listIndexes().toArray(cb); + }; + + callback = this.$handleCallbackError(callback); + + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + + // Buffering + if (this.$__collection.buffer) { + this.$__collection.addQueue(_listIndexes, [cb]); + } else { + _listIndexes(cb); + } + }, this.events); +}; + +/** + * Sends `createIndex` commands to mongo for each index declared in the schema. + * The `createIndex` commands are sent in series. + * + * ####Example: + * + * Event.ensureIndexes(function (err) { + * if (err) return handleError(err); + * }); + * + * After completion, an `index` event is emitted on this `Model` passing an error if one occurred. + * + * ####Example: + * + * const eventSchema = new Schema({ thing: { type: 'string', unique: true }}) + * const Event = mongoose.model('Event', eventSchema); + * + * Event.on('index', function (err) { + * if (err) console.error(err); // error occurred during index creation + * }) + * + * _NOTE: It is not recommended that you run this in production. Index creation may impact database performance depending on your load. 
Use with caution._ + * + * @param {Object} [options] internal options + * @param {Function} [cb] optional callback + * @return {Promise} + * @api public + */ + +Model.ensureIndexes = function ensureIndexes(options, callback) { + _checkContext(this, 'ensureIndexes'); + + if (typeof options === 'function') { + callback = options; + options = null; + } + + callback = this.$handleCallbackError(callback); + + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + + _ensureIndexes(this, options || {}, error => { + if (error) { + return cb(error); + } + cb(null); + }); + }, this.events); +}; + +/** + * Similar to `ensureIndexes()`, except for it uses the [`createIndex`](http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#createIndex) + * function. + * + * @param {Object} [options] internal options + * @param {Function} [cb] optional callback + * @return {Promise} + * @api public + */ + +Model.createIndexes = function createIndexes(options, callback) { + _checkContext(this, 'createIndexes'); + + if (typeof options === 'function') { + callback = options; + options = {}; + } + options = options || {}; + options.createIndex = true; + return this.ensureIndexes(options, callback); +}; + +/*! + * ignore + */ + +function _ensureIndexes(model, options, callback) { + const indexes = model.schema.indexes(); + let indexError; + + options = options || {}; + const done = function(err) { + if (err && !model.$caught) { + model.emit('error', err); + } + model.emit('index', err || indexError); + callback && callback(err || indexError); + }; + + for (const index of indexes) { + if (isDefaultIdIndex(index)) { + utils.warn('mongoose: Cannot specify a custom index on `_id` for ' + + 'model name "' + model.modelName + '", ' + + 'MongoDB does not allow overwriting the default `_id` index. See ' + + 'http://bit.ly/mongodb-id-index'); + } + } + + if (!indexes.length) { + immediate(function() { + done(); + }); + return; + } + // Indexes are created one-by-one to support how MongoDB < 2.4 deals + // with background indexes. + + const indexSingleDone = function(err, fields, options, name) { + model.emit('index-single-done', err, fields, options, name); + }; + const indexSingleStart = function(fields, options) { + model.emit('index-single-start', fields, options); + }; + + const baseSchema = model.schema._baseSchema; + const baseSchemaIndexes = baseSchema ? 
baseSchema.indexes() : []; + + const create = function() { + if (options._automatic) { + if (model.schema.options.autoIndex === false || + (model.schema.options.autoIndex == null && model.db.config.autoIndex === false)) { + return done(); + } + } + + const index = indexes.shift(); + if (!index) { + return done(); + } + if (options._automatic && index[1]._autoIndex === false) { + return create(); + } + + if (baseSchemaIndexes.find(i => utils.deepEqual(i, index))) { + return create(); + } + + const indexFields = utils.clone(index[0]); + const indexOptions = utils.clone(index[1]); + let isTextIndex = false; + for (const key of Object.keys(indexFields)) { + if (indexFields[key] === 'text') { + isTextIndex = true; + } + } + delete indexOptions._autoIndex; + _decorateDiscriminatorIndexOptions(model, indexOptions); + applyWriteConcern(model.schema, indexOptions); + + indexSingleStart(indexFields, options); + + if ('background' in options) { + indexOptions.background = options.background; + } + if (model.schema.options.hasOwnProperty('collation') && + !indexOptions.hasOwnProperty('collation') && + !isTextIndex) { + indexOptions.collation = model.schema.options.collation; + } + + model.collection.createIndex(indexFields, indexOptions, utils.tick(function(err, name) { + indexSingleDone(err, indexFields, indexOptions, name); + if (err) { + if (!indexError) { + indexError = err; + } + if (!model.$caught) { + model.emit('error', err); + } + } + create(); + })); + }; + + immediate(function() { + // If buffering is off, do this manually. + if (options._automatic && !model.collection.collection) { + model.collection.addQueue(create, []); + } else { + create(); + } + }); +} + +function _decorateDiscriminatorIndexOptions(model, indexOptions) { + // If the model is a discriminator and it has a unique index, add a + // partialFilterExpression by default so the unique index will only apply + // to that discriminator. + if (model.baseModelName != null && + !('partialFilterExpression' in indexOptions) && + !('sparse' in indexOptions)) { + const value = ( + model.schema.discriminatorMapping && + model.schema.discriminatorMapping.value + ) || model.modelName; + const discriminatorKey = model.schema.options.discriminatorKey; + + indexOptions.partialFilterExpression = { [discriminatorKey]: value }; + } + return indexOptions; +} + +/** + * Schema the model uses. + * + * @property schema + * @receiver Model + * @api public + * @memberOf Model + */ + +Model.schema; + +/*! + * Connection instance the model uses. + * + * @property db + * @api public + * @memberOf Model + */ + +Model.db; + +/*! + * Collection the model uses. + * + * @property collection + * @api public + * @memberOf Model + */ + +Model.collection; + +/** + * Internal collection the model uses. + * + * @property collection + * @api private + * @memberOf Model + */ +Model.$__collection; + +/** + * Base Mongoose instance the model uses. + * + * @property base + * @api public + * @memberOf Model + */ + +Model.base; + +/** + * Registered discriminators for this model. 
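+ *
+ * e.g. after `Person.discriminator('Boss', bossSchema)`, the `Boss` model is
+ * available as `Person.discriminators.Boss` (names here are just illustrative).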
+ * + * @property discriminators + * @api public + * @memberOf Model + */ + +Model.discriminators; + +/** + * Translate any aliases fields/conditions so the final query or document object is pure + * + * ####Example: + * + * Character + * .find(Character.translateAliases({ + * '名': 'Eddard Stark' // Alias for 'name' + * }) + * .exec(function(err, characters) {}) + * + * ####Note: + * Only translate arguments of object type anything else is returned raw + * + * @param {Object} raw fields/conditions that may contain aliased keys + * @return {Object} the translated 'pure' fields/conditions + */ +Model.translateAliases = function translateAliases(fields) { + _checkContext(this, 'translateAliases'); + + const translate = (key, value) => { + let alias; + const translated = []; + const fieldKeys = key.split('.'); + let currentSchema = this.schema; + for (const i in fieldKeys) { + const name = fieldKeys[i]; + if (currentSchema && currentSchema.aliases[name]) { + alias = currentSchema.aliases[name]; + // Alias found, + translated.push(alias); + } else { + alias = name; + // Alias not found, so treat as un-aliased key + translated.push(name); + } + + // Check if aliased path is a schema + if (currentSchema && currentSchema.paths[alias]) { + currentSchema = currentSchema.paths[alias].schema; + } + else + currentSchema = null; + } + + const translatedKey = translated.join('.'); + if (fields instanceof Map) + fields.set(translatedKey, value); + else + fields[translatedKey] = value; + + if (translatedKey !== key) { + // We'll be using the translated key instead + if (fields instanceof Map) { + // Delete from map + fields.delete(key); + } else { + // Delete from object + delete fields[key]; // We'll be using the translated key instead + } + } + return fields; + }; + + if (typeof fields === 'object') { + // Fields is an object (query conditions or document fields) + if (fields instanceof Map) { + // A Map was supplied + for (const field of new Map(fields)) { + fields = translate(field[0], field[1]); + } + } else { + // Infer a regular object was supplied + for (const key of Object.keys(fields)) { + fields = translate(key, fields[key]); + if (key[0] === '$') { + if (Array.isArray(fields[key])) { + for (const i in fields[key]) { + // Recursively translate nested queries + fields[key][i] = this.translateAliases(fields[key][i]); + } + } + } + } + } + + return fields; + } else { + // Don't know typeof fields + return fields; + } +}; + +/** + * Removes all documents that match `conditions` from the collection. + * To remove just the first document that matches `conditions`, set the `single` + * option to true. + * + * This method is deprecated. See [Deprecation Warnings](../deprecations.html#remove) for details. + * + * ####Example: + * + * const res = await Character.remove({ name: 'Eddard Stark' }); + * res.deletedCount; // Number of documents removed + * + * ####Note: + * + * This method sends a remove command directly to MongoDB, no Mongoose documents + * are involved. Because no Mongoose documents are involved, Mongoose does + * not execute [document middleware](/docs/middleware.html#types-of-middleware). + * + * @deprecated + * @param {Object} conditions + * @param {Object} [options] + * @param {Session} [options.session=null] the [session](https://docs.mongodb.com/manual/reference/server-sessions/) associated with this operation. 
+ * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.remove = function remove(conditions, options, callback) { + _checkContext(this, 'remove'); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + options = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } + + // get the mongodb collection object + const mq = new this.Query({}, {}, this, this.$__collection); + mq.setOptions(options); + + callback = this.$handleCallbackError(callback); + + return mq.remove(conditions, callback); +}; + +/** + * Deletes the first document that matches `conditions` from the collection. + * Behaves like `remove()`, but deletes at most one document regardless of the + * `single` option. + * + * ####Example: + * + * await Character.deleteOne({ name: 'Eddard Stark' }); + * + * ####Note: + * + * This function triggers `deleteOne` query hooks. Read the + * [middleware docs](/docs/middleware.html#naming) to learn more. + * + * @param {Object} conditions + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.deleteOne = function deleteOne(conditions, options, callback) { + _checkContext(this, 'deleteOne'); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + options = null; + } + else if (typeof options === 'function') { + callback = options; + options = null; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.setOptions(options); + + callback = this.$handleCallbackError(callback); + + return mq.deleteOne(conditions, callback); +}; + +/** + * Deletes all of the documents that match `conditions` from the collection. + * Behaves like `remove()`, but deletes all documents that match `conditions` + * regardless of the `single` option. + * + * ####Example: + * + * await Character.deleteMany({ name: /Stark/, age: { $gte: 18 } }); + * + * ####Note: + * + * This function triggers `deleteMany` query hooks. Read the + * [middleware docs](/docs/middleware.html#naming) to learn more. + * + * @param {Object} conditions + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.deleteMany = function deleteMany(conditions, options, callback) { + _checkContext(this, 'deleteMany'); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + options = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.setOptions(options); + + callback = this.$handleCallbackError(callback); + + return mq.deleteMany(conditions, callback); +}; + +/** + * Finds documents. + * + * Mongoose casts the `filter` to match the model's schema before the command is sent. + * See our [query casting tutorial](/docs/tutorials/query_casting.html) for + * more information on how Mongoose casts `filter`. 
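+ *
+ * For large result sets, one option is to iterate a query cursor instead of loading every
+ * document into memory at once (an illustrative sketch using the query's `cursor()` helper,
+ * which is documented separately):
+ *
+ *     for await (const doc of MyModel.find({ age: { $gte: 18 } }).cursor()) {
+ *       console.log(doc.name);
+ *     }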
+ * + * ####Examples: + * + * // find all documents + * await MyModel.find({}); + * + * // find all documents named john and at least 18 + * await MyModel.find({ name: 'john', age: { $gte: 18 } }).exec(); + * + * // executes, passing results to callback + * MyModel.find({ name: 'john', age: { $gte: 18 }}, function (err, docs) {}); + * + * // executes, name LIKE john and only selecting the "name" and "friends" fields + * await MyModel.find({ name: /john/i }, 'name friends').exec(); + * + * // passing options + * await MyModel.find({ name: /john/i }, null, { skip: 10 }).exec(); + * + * @param {Object|ObjectId} filter + * @param {Object|String|Array} [projection] optional fields to return, see [`Query.prototype.select()`](http://mongoosejs.com/docs/api.html#query_Query-select) + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] + * @return {Query} + * @see field selection #query_Query-select + * @see query casting /docs/tutorials/query_casting.html + * @api public + */ + +Model.find = function find(conditions, projection, options, callback) { + _checkContext(this, 'find'); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + projection = null; + options = null; + } else if (typeof projection === 'function') { + callback = projection; + projection = null; + options = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.select(projection); + + mq.setOptions(options); + if (this.schema.discriminatorMapping && + this.schema.discriminatorMapping.isRoot && + mq.selectedInclusively()) { + // Need to select discriminator key because original schema doesn't have it + mq.select(this.schema.options.discriminatorKey); + } + + callback = this.$handleCallbackError(callback); + + return mq.find(conditions, callback); +}; + +/** + * Finds a single document by its _id field. `findById(id)` is almost* + * equivalent to `findOne({ _id: id })`. If you want to query by a document's + * `_id`, use `findById()` instead of `findOne()`. + * + * The `id` is cast based on the Schema before sending the command. + * + * This function triggers the following middleware. + * + * - `findOne()` + * + * \* Except for how it treats `undefined`. If you use `findOne()`, you'll see + * that `findOne(undefined)` and `findOne({ _id: undefined })` are equivalent + * to `findOne({})` and return arbitrary documents. However, mongoose + * translates `findById(undefined)` into `findOne({ _id: null })`. 
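+ *
+ * A quick sketch of that difference:
+ *
+ *     await Adventure.findOne({ _id: undefined }); // same as findOne({}): returns an arbitrary doc
+ *     await Adventure.findById(undefined);         // sent as findOne({ _id: null })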
+ * + * ####Example: + * + * // Find the adventure with the given `id`, or `null` if not found + * await Adventure.findById(id).exec(); + * + * // using callback + * Adventure.findById(id, function (err, adventure) {}); + * + * // select only the adventures name and length + * await Adventure.findById(id, 'name length').exec(); + * + * @param {Any} id value of `_id` to query by + * @param {Object|String|Array} [projection] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] + * @return {Query} + * @see field selection #query_Query-select + * @see lean queries /docs/tutorials/lean.html + * @see findById in Mongoose https://masteringjs.io/tutorials/mongoose/find-by-id + * @api public + */ + +Model.findById = function findById(id, projection, options, callback) { + _checkContext(this, 'findById'); + + if (typeof id === 'undefined') { + id = null; + } + + callback = this.$handleCallbackError(callback); + + return this.findOne({ _id: id }, projection, options, callback); +}; + +/** + * Finds one document. + * + * The `conditions` are cast to their respective SchemaTypes before the command is sent. + * + * *Note:* `conditions` is optional, and if `conditions` is null or undefined, + * mongoose will send an empty `findOne` command to MongoDB, which will return + * an arbitrary document. If you're querying by `_id`, use `findById()` instead. + * + * ####Example: + * + * // Find one adventure whose `country` is 'Croatia', otherwise `null` + * await Adventure.findOne({ country: 'Croatia' }).exec(); + * + * // using callback + * Adventure.findOne({ country: 'Croatia' }, function (err, adventure) {}); + * + * // select only the adventures name and length + * await Adventure.findOne({ country: 'Croatia' }, 'name length').exec(); + * + * @param {Object} [conditions] + * @param {Object|String|Array} [projection] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] + * @return {Query} + * @see field selection #query_Query-select + * @see lean queries /docs/tutorials/lean.html + * @api public + */ + +Model.findOne = function findOne(conditions, projection, options, callback) { + _checkContext(this, 'findOne'); + if (typeof options === 'function') { + callback = options; + options = null; + } else if (typeof projection === 'function') { + callback = projection; + projection = null; + options = null; + } else if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + projection = null; + options = null; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.select(projection); + mq.setOptions(options); + if (this.schema.discriminatorMapping && + this.schema.discriminatorMapping.isRoot && + mq.selectedInclusively()) { + mq.select(this.schema.options.discriminatorKey); + } + + callback = this.$handleCallbackError(callback); + return mq.findOne(conditions, callback); +}; + +/** + * Estimates the number of documents in the MongoDB collection. Faster than + * using `countDocuments()` for large collections because + * `estimatedDocumentCount()` uses collection metadata rather than scanning + * the entire collection. 
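+ *
+ * A rough comparison of the two counting helpers (illustrative sketch):
+ *
+ *     await Adventure.estimatedDocumentCount();           // fast: uses collection metadata, no filter
+ *     await Adventure.countDocuments({ type: 'jungle' }); // exact count for a given filter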
+ * + * ####Example: + * + * const numAdventures = Adventure.estimatedDocumentCount(); + * + * @param {Object} [options] + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.estimatedDocumentCount = function estimatedDocumentCount(options, callback) { + _checkContext(this, 'estimatedDocumentCount'); + + const mq = new this.Query({}, {}, this, this.$__collection); + + callback = this.$handleCallbackError(callback); + + return mq.estimatedDocumentCount(options, callback); +}; + +/** + * Counts number of documents matching `filter` in a database collection. + * + * ####Example: + * + * Adventure.countDocuments({ type: 'jungle' }, function (err, count) { + * console.log('there are %d jungle adventures', count); + * }); + * + * If you want to count all documents in a large collection, + * use the [`estimatedDocumentCount()` function](/docs/api.html#model_Model.estimatedDocumentCount) + * instead. If you call `countDocuments({})`, MongoDB will always execute + * a full collection scan and **not** use any indexes. + * + * The `countDocuments()` function is similar to `count()`, but there are a + * [few operators that `countDocuments()` does not support](https://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#countDocuments). + * Below are the operators that `count()` supports but `countDocuments()` does not, + * and the suggested replacement: + * + * - `$where`: [`$expr`](https://docs.mongodb.com/manual/reference/operator/query/expr/) + * - `$near`: [`$geoWithin`](https://docs.mongodb.com/manual/reference/operator/query/geoWithin/) with [`$center`](https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center) + * - `$nearSphere`: [`$geoWithin`](https://docs.mongodb.com/manual/reference/operator/query/geoWithin/) with [`$centerSphere`](https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere) + * + * @param {Object} filter + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.countDocuments = function countDocuments(conditions, options, callback) { + _checkContext(this, 'countDocuments'); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + } + if (typeof options === 'function') { + callback = options; + options = null; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + if (options != null) { + mq.setOptions(options); + } + + callback = this.$handleCallbackError(callback); + + return mq.countDocuments(conditions, callback); +}; + +/** + * Counts number of documents that match `filter` in a database collection. + * + * This method is deprecated. If you want to count the number of documents in + * a collection, e.g. `count({})`, use the [`estimatedDocumentCount()` function](/docs/api.html#model_Model.estimatedDocumentCount) + * instead. Otherwise, use the [`countDocuments()`](/docs/api.html#model_Model.countDocuments) function instead. 
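+ *
+ * A migration sketch based on the guidance above:
+ *
+ *     // before
+ *     const total = await Adventure.count({});
+ *     const jungle = await Adventure.count({ type: 'jungle' });
+ *
+ *     // after
+ *     const total2 = await Adventure.estimatedDocumentCount();
+ *     const jungle2 = await Adventure.countDocuments({ type: 'jungle' });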
+ * + * ####Example: + * + * const count = await Adventure.count({ type: 'jungle' }); + * console.log('there are %d jungle adventures', count); + * + * @deprecated + * @param {Object} filter + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.count = function count(conditions, callback) { + _checkContext(this, 'count'); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + + callback = this.$handleCallbackError(callback); + + return mq.count(conditions, callback); +}; + +/** + * Creates a Query for a `distinct` operation. + * + * Passing a `callback` executes the query. + * + * ####Example + * + * Link.distinct('url', { clicks: {$gt: 100}}, function (err, result) { + * if (err) return handleError(err); + * + * assert(Array.isArray(result)); + * console.log('unique urls with more than 100 clicks', result); + * }) + * + * const query = Link.distinct('url'); + * query.exec(callback); + * + * @param {String} field + * @param {Object} [conditions] optional + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.distinct = function distinct(field, conditions, callback) { + _checkContext(this, 'distinct'); + + const mq = new this.Query({}, {}, this, this.$__collection); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + } + callback = this.$handleCallbackError(callback); + + return mq.distinct(field, conditions, callback); +}; + +/** + * Creates a Query, applies the passed conditions, and returns the Query. + * + * For example, instead of writing: + * + * User.find({age: {$gte: 21, $lte: 65}}, callback); + * + * we can instead write: + * + * User.where('age').gte(21).lte(65).exec(callback); + * + * Since the Query class also supports `where` you can continue chaining + * + * User + * .where('age').gte(21).lte(65) + * .where('name', /^b/i) + * ... etc + * + * @param {String} path + * @param {Object} [val] optional value + * @return {Query} + * @api public + */ + +Model.where = function where(path, val) { + _checkContext(this, 'where'); + + void val; // eslint + const mq = new this.Query({}, {}, this, this.$__collection).find({}); + return mq.where.apply(mq, arguments); +}; + +/** + * Creates a `Query` and specifies a `$where` condition. + * + * Sometimes you need to query for things in mongodb using a JavaScript expression. You can do so via `find({ $where: javascript })`, or you can use the mongoose shortcut method $where via a Query chain or from your mongoose Model. + * + * Blog.$where('this.username.indexOf("val") !== -1').exec(function (err, docs) {}); + * + * @param {String|Function} argument is a javascript string or anonymous function + * @method $where + * @memberOf Model + * @return {Query} + * @see Query.$where #query_Query-%24where + * @api public + */ + +Model.$where = function $where() { + _checkContext(this, '$where'); + + const mq = new this.Query({}, {}, this, this.$__collection).find({}); + return mq.$where.apply(mq, arguments); +}; + +/** + * Issues a mongodb findAndModify update command. + * + * Finds a matching document, updates it according to the `update` arg, passing any `options`, and returns the found document (if any) to the callback. The query executes if `callback` is passed else a Query object is returned. + * + * ####Options: + * + * - `new`: bool - if true, return the modified document rather than the original. 
defaults to false (changed in 4.0) + * - `upsert`: bool - creates the object if it doesn't exist. defaults to false. + * - `overwrite`: bool - if true, replace the entire document. + * - `fields`: {Object|String} - Field selection. Equivalent to `.select(fields).findOneAndUpdate()` + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `runValidators`: if true, runs [update validators](/docs/validation.html#update-validators) on this command. Update validators validate the update operation against the model's schema. + * - `setDefaultsOnInsert`: `true` by default. If `setDefaultsOnInsert` and `upsert` are true, mongoose will apply the [defaults](http://mongoosejs.com/docs/defaults.html) specified in the model's schema if a new document is created. + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * - `strict`: overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) for this update + * + * ####Examples: + * + * A.findOneAndUpdate(conditions, update, options, callback) // executes + * A.findOneAndUpdate(conditions, update, options) // returns Query + * A.findOneAndUpdate(conditions, update, callback) // executes + * A.findOneAndUpdate(conditions, update) // returns Query + * A.findOneAndUpdate() // returns Query + * + * ####Note: + * + * All top level update keys which are not `atomic` operation names are treated as set operations: + * + * ####Example: + * + * const query = { name: 'borne' }; + * Model.findOneAndUpdate(query, { name: 'jason bourne' }, options, callback) + * + * // is sent as + * Model.findOneAndUpdate(query, { $set: { name: 'jason bourne' }}, options, callback) + * + * This helps prevent accidentally overwriting your document with `{ name: 'jason bourne' }`. + * + * ####Note: + * + * `findOneAndX` and `findByIdAndX` functions support limited validation that + * you can enable by setting the `runValidators` option. + * + * If you need full-fledged validation, use the traditional approach of first + * retrieving the document. + * + * const doc = await Model.findById(id); + * doc.name = 'jason bourne'; + * await doc.save(); + * + * @param {Object} [conditions] + * @param {Object} [update] + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean} [options.new=false] By default, `findOneAndUpdate()` returns the document as it was **before** `update` was applied. If you set `new: true`, `findOneAndUpdate()` will instead give you the object after `update` was applied. To change the default to `true`, use `mongoose.set('returnOriginal', false);`. + * @param {Object} [options.lean] if truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. See [`Query.lean()`](/docs/api.html#query_Query-lean) and [the Mongoose lean tutorial](/docs/tutorials/lean.html). + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). 
+ * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Boolean} [options.returnOriginal=null] An alias for the `new` option. `returnOriginal: false` is equivalent to `new: true`. + * @param {Boolean} [options.overwrite=false] By default, if you don't include any [update operators](https://docs.mongodb.com/manual/reference/operator/update/) in `update`, Mongoose will wrap `update` in `$set` for you. This prevents you from accidentally overwriting the document. This option tells Mongoose to skip adding `$set`. An alternative to this would be using [Model.findOneAndReplace(conditions, update, options, callback)](https://mongoosejs.com/docs/api/model.html#model_Model.findOneAndReplace). + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object|String|Array} [options.projection=null] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {Function} [callback] + * @return {Query} + * @see Tutorial /docs/tutorials/findoneandupdate.html + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + * @api public + */ + +Model.findOneAndUpdate = function(conditions, update, options, callback) { + _checkContext(this, 'findOneAndUpdate'); + + if (typeof options === 'function') { + callback = options; + options = null; + } else if (arguments.length === 1) { + if (typeof conditions === 'function') { + const msg = 'Model.findOneAndUpdate(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findOneAndUpdate(conditions, update, options, callback)\n' + + ' ' + this.modelName + '.findOneAndUpdate(conditions, update, options)\n' + + ' ' + this.modelName + '.findOneAndUpdate(conditions, update)\n' + + ' ' + this.modelName + '.findOneAndUpdate(update)\n' + + ' ' + this.modelName + '.findOneAndUpdate()\n'; + throw new TypeError(msg); + } + update = conditions; + conditions = undefined; + } + callback = this.$handleCallbackError(callback); + + let fields; + if (options) { + fields = options.fields || options.projection; + } + + update = utils.clone(update, { + depopulate: true, + _isNested: true + }); + + _decorateUpdateWithVersionKey(update, options, this.schema.options.versionKey); + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.select(fields); + + return mq.findOneAndUpdate(conditions, update, options, callback); +}; + +/*! + * Decorate the update with a version key, if necessary + */ + +function _decorateUpdateWithVersionKey(update, options, versionKey) { + if (!versionKey || !get(options, 'upsert', false)) { + return; + } + + const updatedPaths = modifiedPaths(update); + if (!updatedPaths[versionKey]) { + if (options.overwrite) { + update[versionKey] = 0; + } else { + if (!update.$setOnInsert) { + update.$setOnInsert = {}; + } + update.$setOnInsert[versionKey] = 0; + } + } +} + +/** + * Issues a mongodb findAndModify update command by a document's _id field. + * `findByIdAndUpdate(id, ...)` is equivalent to `findOneAndUpdate({ _id: id }, ...)`. 
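+ *
+ * For example, the following two calls behave the same (a sketch; the `Kitten` model is
+ * assumed, not part of these docs):
+ *
+ *     await Kitten.findByIdAndUpdate(id, { name: 'Tom' }, { new: true });
+ *     await Kitten.findOneAndUpdate({ _id: id }, { name: 'Tom' }, { new: true });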
+ * + * Finds a matching document, updates it according to the `update` arg, + * passing any `options`, and returns the found document (if any) to the + * callback. The query executes if `callback` is passed. + * + * This function triggers the following middleware. + * + * - `findOneAndUpdate()` + * + * ####Options: + * + * - `new`: bool - true to return the modified document rather than the original. defaults to false + * - `upsert`: bool - creates the object if it doesn't exist. defaults to false. + * - `runValidators`: if true, runs [update validators](/docs/validation.html#update-validators) on this command. Update validators validate the update operation against the model's schema. + * - `setDefaultsOnInsert`: `true` by default. If `setDefaultsOnInsert` and `upsert` are true, mongoose will apply the [defaults](http://mongoosejs.com/docs/defaults.html) specified in the model's schema if a new document is created. + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `select`: sets the document fields to return + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * - `strict`: overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) for this update + * + * ####Examples: + * + * A.findByIdAndUpdate(id, update, options, callback) // executes + * A.findByIdAndUpdate(id, update, options) // returns Query + * A.findByIdAndUpdate(id, update, callback) // executes + * A.findByIdAndUpdate(id, update) // returns Query + * A.findByIdAndUpdate() // returns Query + * + * ####Note: + * + * All top level update keys which are not `atomic` operation names are treated as set operations: + * + * ####Example: + * + * Model.findByIdAndUpdate(id, { name: 'jason bourne' }, options, callback) + * + * // is sent as + * Model.findByIdAndUpdate(id, { $set: { name: 'jason bourne' }}, options, callback) + * + * This helps prevent accidentally overwriting your document with `{ name: 'jason bourne' }`. + * + * ####Note: + * + * `findOneAndX` and `findByIdAndX` functions support limited validation. You can + * enable validation by setting the `runValidators` option. + * + * If you need full-fledged validation, use the traditional approach of first + * retrieving the document. + * + * Model.findById(id, function (err, doc) { + * if (err) .. + * doc.name = 'jason bourne'; + * doc.save(callback); + * }); + * + * @param {Object|Number|String} id value of `_id` to query by + * @param {Object} [update] + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean} [options.new=false] By default, `findByIdAndUpdate()` returns the document as it was **before** `update` was applied. If you set `new: true`, `findOneAndUpdate()` will instead give you the object after `update` was applied. To change the default to `true`, use `mongoose.set('returnOriginal', false);`. + * @param {Object} [options.lean] if truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. See [`Query.lean()`](/docs/api.html#query_Query-lean) and [the Mongoose lean tutorial](/docs/tutorials/lean.html). + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). 
+ * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Boolean} [options.returnOriginal=null] An alias for the `new` option. `returnOriginal: false` is equivalent to `new: true`. + * @param {Boolean} [options.overwrite=false] By default, if you don't include any [update operators](https://docs.mongodb.com/manual/reference/operator/update/) in `update`, Mongoose will wrap `update` in `$set` for you. This prevents you from accidentally overwriting the document. This option tells Mongoose to skip adding `$set`. An alternative to this would be using [Model.findOneAndReplace({ _id: id }, update, options, callback)](https://mongoosejs.com/docs/api/model.html#model_Model.findOneAndReplace). + * @param {Function} [callback] + * @return {Query} + * @see Model.findOneAndUpdate #model_Model.findOneAndUpdate + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + * @api public + */ + +Model.findByIdAndUpdate = function(id, update, options, callback) { + _checkContext(this, 'findByIdAndUpdate'); + + callback = this.$handleCallbackError(callback); + if (arguments.length === 1) { + if (typeof id === 'function') { + const msg = 'Model.findByIdAndUpdate(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findByIdAndUpdate(id, callback)\n' + + ' ' + this.modelName + '.findByIdAndUpdate(id)\n' + + ' ' + this.modelName + '.findByIdAndUpdate()\n'; + throw new TypeError(msg); + } + return this.findOneAndUpdate({ _id: id }, undefined); + } + + // if a model is passed in instead of an id + if (id instanceof Document) { + id = id._id; + } + + return this.findOneAndUpdate.call(this, { _id: id }, update, options, callback); +}; + +/** + * Issue a MongoDB `findOneAndDelete()` command. + * + * Finds a matching document, removes it, and passes the found document + * (if any) to the callback. + * + * Executes the query if `callback` is passed. + * + * This function triggers the following middleware. + * + * - `findOneAndDelete()` + * + * This function differs slightly from `Model.findOneAndRemove()` in that + * `findOneAndRemove()` becomes a [MongoDB `findAndModify()` command](https://docs.mongodb.com/manual/reference/method/db.collection.findAndModify/), + * as opposed to a `findOneAndDelete()` command. For most mongoose use cases, + * this distinction is purely pedantic. You should use `findOneAndDelete()` + * unless you have a good reason not to. + * + * ####Options: + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `select`: sets the document fields to return, ex. 
`{ projection: { _id: 0 } }` + * - `projection`: equivalent to `select` + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * - `strict`: overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) for this update + * + * ####Examples: + * + * A.findOneAndDelete(conditions, options, callback) // executes + * A.findOneAndDelete(conditions, options) // return Query + * A.findOneAndDelete(conditions, callback) // executes + * A.findOneAndDelete(conditions) // returns Query + * A.findOneAndDelete() // returns Query + * + * `findOneAndX` and `findByIdAndX` functions support limited validation. You can + * enable validation by setting the `runValidators` option. + * + * If you need full-fledged validation, use the traditional approach of first + * retrieving the document. + * + * Model.findById(id, function (err, doc) { + * if (err) .. + * doc.name = 'jason bourne'; + * doc.save(callback); + * }); + * + * @param {Object} conditions + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Object|String|Array} [options.projection=null] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.findOneAndDelete = function(conditions, options, callback) { + _checkContext(this, 'findOneAndDelete'); + + if (arguments.length === 1 && typeof conditions === 'function') { + const msg = 'Model.findOneAndDelete(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findOneAndDelete(conditions, callback)\n' + + ' ' + this.modelName + '.findOneAndDelete(conditions)\n' + + ' ' + this.modelName + '.findOneAndDelete()\n'; + throw new TypeError(msg); + } + + if (typeof options === 'function') { + callback = options; + options = undefined; + } + callback = this.$handleCallbackError(callback); + + let fields; + if (options) { + fields = options.select; + options.select = undefined; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.select(fields); + + return mq.findOneAndDelete(conditions, options, callback); +}; + +/** + * Issue a MongoDB `findOneAndDelete()` command by a document's _id field. + * In other words, `findByIdAndDelete(id)` is a shorthand for + * `findOneAndDelete({ _id: id })`. + * + * This function triggers the following middleware. 
+ * + * - `findOneAndDelete()` + * + * @param {Object|Number|String} id value of `_id` to query by + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Function} [callback] + * @return {Query} + * @see Model.findOneAndRemove #model_Model.findOneAndRemove + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + */ + +Model.findByIdAndDelete = function(id, options, callback) { + _checkContext(this, 'findByIdAndDelete'); + + if (arguments.length === 1 && typeof id === 'function') { + const msg = 'Model.findByIdAndDelete(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findByIdAndDelete(id, callback)\n' + + ' ' + this.modelName + '.findByIdAndDelete(id)\n' + + ' ' + this.modelName + '.findByIdAndDelete()\n'; + throw new TypeError(msg); + } + callback = this.$handleCallbackError(callback); + + return this.findOneAndDelete({ _id: id }, options, callback); +}; + +/** + * Issue a MongoDB `findOneAndReplace()` command. + * + * Finds a matching document, replaces it with the provided doc, and passes the + * returned doc to the callback. + * + * Executes the query if `callback` is passed. + * + * This function triggers the following query middleware. + * + * - `findOneAndReplace()` + * + * ####Options: + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `select`: sets the document fields to return + * - `projection`: like select, it determines which fields to return, ex. `{ projection: { _id: 0 } }` + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * - `strict`: overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) for this update + * + * ####Examples: + * + * A.findOneAndReplace(conditions, options, callback) // executes + * A.findOneAndReplace(conditions, options) // return Query + * A.findOneAndReplace(conditions, callback) // executes + * A.findOneAndReplace(conditions) // returns Query + * A.findOneAndReplace() // returns Query + * + * @param {Object} filter Replace the first document that matches this filter + * @param {Object} [replacement] Replace with this document + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean} [options.new=false] By default, `findOneAndReplace()` returns the document as it was **before** `update` was applied. If you set `new: true`, `findOneAndReplace()` will instead give you the object after `update` was applied. To change the default to `true`, use `mongoose.set('returnOriginal', false);`. + * @param {Object} [options.lean] if truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. See [`Query.lean()`](/docs/api.html#query_Query-lean) and [the Mongoose lean tutorial](/docs/tutorials/lean.html). + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). 
+ * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Boolean} [options.returnOriginal=null] An alias for the `new` option. `returnOriginal: false` is equivalent to `new: true`. + * @param {Object|String|Array} [options.projection=null] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {Function} [callback] + * @return {Query} + * @api public + */ + +Model.findOneAndReplace = function(filter, replacement, options, callback) { + _checkContext(this, 'findOneAndReplace'); + + if (arguments.length === 1 && typeof filter === 'function') { + const msg = 'Model.findOneAndReplace(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findOneAndReplace(conditions, callback)\n' + + ' ' + this.modelName + '.findOneAndReplace(conditions)\n' + + ' ' + this.modelName + '.findOneAndReplace()\n'; + throw new TypeError(msg); + } + + if (arguments.length === 3 && typeof options === 'function') { + callback = options; + options = replacement; + replacement = void 0; + } + if (arguments.length === 2 && typeof replacement === 'function') { + callback = replacement; + replacement = void 0; + options = void 0; + } + callback = this.$handleCallbackError(callback); + + let fields; + if (options) { + fields = options.select; + options.select = undefined; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.select(fields); + + return mq.findOneAndReplace(filter, replacement, options, callback); +}; + +/** + * Issue a mongodb findAndModify remove command. + * + * Finds a matching document, removes it, passing the found document (if any) to the callback. + * + * Executes the query if `callback` is passed. + * + * This function triggers the following middleware. + * + * - `findOneAndRemove()` + * + * ####Options: + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `select`: sets the document fields to return + * - `projection`: like select, it determines which fields to return, ex. `{ projection: { _id: 0 } }` + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * - `strict`: overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) for this update + * + * ####Examples: + * + * A.findOneAndRemove(conditions, options, callback) // executes + * A.findOneAndRemove(conditions, options) // return Query + * A.findOneAndRemove(conditions, callback) // executes + * A.findOneAndRemove(conditions) // returns Query + * A.findOneAndRemove() // returns Query + * + * `findOneAndX` and `findByIdAndX` functions support limited validation. You can + * enable validation by setting the `runValidators` option. + * + * If you need full-fledged validation, use the traditional approach of first + * retrieving the document. + * + * Model.findById(id, function (err, doc) { + * if (err) .. 
+ * doc.name = 'jason bourne'; + * doc.save(callback); + * }); + * + * @param {Object} conditions + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Object|String|Array} [options.projection=null] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {Function} [callback] + * @return {Query} + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + * @api public + */ + +Model.findOneAndRemove = function(conditions, options, callback) { + _checkContext(this, 'findOneAndRemove'); + + if (arguments.length === 1 && typeof conditions === 'function') { + const msg = 'Model.findOneAndRemove(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findOneAndRemove(conditions, callback)\n' + + ' ' + this.modelName + '.findOneAndRemove(conditions)\n' + + ' ' + this.modelName + '.findOneAndRemove()\n'; + throw new TypeError(msg); + } + + if (typeof options === 'function') { + callback = options; + options = undefined; + } + callback = this.$handleCallbackError(callback); + + let fields; + if (options) { + fields = options.select; + options.select = undefined; + } + + const mq = new this.Query({}, {}, this, this.$__collection); + mq.select(fields); + + return mq.findOneAndRemove(conditions, options, callback); +}; + +/** + * Issue a mongodb findAndModify remove command by a document's _id field. `findByIdAndRemove(id, ...)` is equivalent to `findOneAndRemove({ _id: id }, ...)`. + * + * Finds a matching document, removes it, passing the found document (if any) to the callback. + * + * Executes the query if `callback` is passed. + * + * This function triggers the following middleware. + * + * - `findOneAndRemove()` + * + * ####Options: + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `select`: sets the document fields to return + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * - `strict`: overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) for this update + * + * ####Examples: + * + * A.findByIdAndRemove(id, options, callback) // executes + * A.findByIdAndRemove(id, options) // return Query + * A.findByIdAndRemove(id, callback) // executes + * A.findByIdAndRemove(id) // returns Query + * A.findByIdAndRemove() // returns Query + * + * @param {Object|Number|String} id value of `_id` to query by + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). 
+ * @param {Object|String|Array} [options.projection=null] optional fields to return, see [`Query.prototype.select()`](#query_Query-select) + * @param {Function} [callback] + * @return {Query} + * @see Model.findOneAndRemove #model_Model.findOneAndRemove + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + */ + +Model.findByIdAndRemove = function(id, options, callback) { + _checkContext(this, 'findByIdAndRemove'); + + if (arguments.length === 1 && typeof id === 'function') { + const msg = 'Model.findByIdAndRemove(): First argument must not be a function.\n\n' + + ' ' + this.modelName + '.findByIdAndRemove(id, callback)\n' + + ' ' + this.modelName + '.findByIdAndRemove(id)\n' + + ' ' + this.modelName + '.findByIdAndRemove()\n'; + throw new TypeError(msg); + } + callback = this.$handleCallbackError(callback); + + return this.findOneAndRemove({ _id: id }, options, callback); +}; + +/** + * Shortcut for saving one or more documents to the database. + * `MyModel.create(docs)` does `new MyModel(doc).save()` for every doc in + * docs. + * + * This function triggers the following middleware. + * + * - `save()` + * + * ####Example: + * + * // Insert one new `Character` document + * await Character.create({ name: 'Jean-Luc Picard' }); + * + * // Insert multiple new `Character` documents + * await Character.create([{ name: 'Will Riker' }, { name: 'Geordi LaForge' }]); + * + * // Create a new character within a transaction. Note that you **must** + * // pass an array as the first parameter to `create()` if you want to + * // specify options. + * await Character.create([{ name: 'Jean-Luc Picard' }], { session }); + * + * @param {Array|Object} docs Documents to insert, as a spread or array + * @param {Object} [options] Options passed down to `save()`. To specify `options`, `docs` **must** be an array, not a spread. + * @param {Function} [callback] callback + * @return {Promise} + * @api public + */ + +Model.create = function create(doc, options, callback) { + _checkContext(this, 'create'); + + let args; + let cb; + const discriminatorKey = this.schema.options.discriminatorKey; + + if (Array.isArray(doc)) { + args = doc; + cb = typeof options === 'function' ? options : callback; + options = options != null && typeof options === 'object' ? options : {}; + } else { + const last = arguments[arguments.length - 1]; + options = {}; + // Handle falsy callbacks re: #5061 + if (typeof last === 'function' || (arguments.length > 1 && !last)) { + cb = last; + args = utils.args(arguments, 0, arguments.length - 1); + } else { + args = utils.args(arguments); + } + + if (args.length === 2 && + args[0] != null && + args[1] != null && + args[0].session == null && + getConstructorName(last.session) === 'ClientSession' && + !this.schema.path('session')) { + // Probably means the user is running into the common mistake of trying + // to use a spread to specify options, see gh-7535 + utils.warn('WARNING: to pass a `session` to `Model.create()` in ' + + 'Mongoose, you **must** pass an array as the first argument. See: ' + + 'https://mongoosejs.com/docs/api.html#model_Model.create'); + } + } + + return this.db.base._promiseOrCallback(cb, cb => { + cb = this.$wrapCallback(cb); + + if (args.length === 0) { + if (Array.isArray(doc)) { + return cb(null, []); + } else { + return cb(null); + } + } + + const toExecute = []; + let firstError; + args.forEach(doc => { + toExecute.push(callback => { + const Model = this.discriminators && doc[discriminatorKey] != null ? 
+ this.discriminators[doc[discriminatorKey]] || getDiscriminatorByValue(this.discriminators, doc[discriminatorKey]) : + this; + if (Model == null) { + throw new MongooseError(`Discriminator "${doc[discriminatorKey]}" not ` + + `found for model "${this.modelName}"`); + } + let toSave = doc; + const callbackWrapper = (error, doc) => { + if (error) { + if (!firstError) { + firstError = error; + } + return callback(null, { error: error }); + } + callback(null, { doc: doc }); + }; + + if (!(toSave instanceof Model)) { + try { + toSave = new Model(toSave); + } catch (error) { + return callbackWrapper(error); + } + } + + toSave.$save(options, callbackWrapper); + }); + }); + + let numFns = toExecute.length; + if (numFns === 0) { + return cb(null, []); + } + const _done = (error, res) => { + const savedDocs = []; + const len = res.length; + for (let i = 0; i < len; ++i) { + if (res[i].doc) { + savedDocs.push(res[i].doc); + } + } + + if (firstError) { + return cb(firstError, savedDocs); + } + + if (doc instanceof Array) { + cb(null, savedDocs); + } else { + cb.apply(this, [null].concat(savedDocs)); + } + }; + + const _res = []; + toExecute.forEach((fn, i) => { + fn((err, res) => { + _res[i] = res; + if (--numFns <= 0) { + return _done(null, _res); + } + }); + }); + }, this.events); +}; + +/** + * _Requires a replica set running MongoDB >= 3.6.0._ Watches the + * underlying collection for changes using + * [MongoDB change streams](https://docs.mongodb.com/manual/changeStreams/). + * + * This function does **not** trigger any middleware. In particular, it + * does **not** trigger aggregate middleware. + * + * The ChangeStream object is an event emitter that emits the following events: + * + * - 'change': A change occurred, see below example + * - 'error': An unrecoverable error occurred. In particular, change streams currently error out if they lose connection to the replica set primary. Follow [this GitHub issue](https://github.com/Automattic/mongoose/issues/6799) for updates. + * - 'end': Emitted if the underlying stream is closed + * - 'close': Emitted if the underlying stream is closed + * + * ####Example: + * + * const doc = await Person.create({ name: 'Ned Stark' }); + * const changeStream = Person.watch().on('change', change => console.log(change)); + * // Will print from the above `console.log()`: + * // { _id: { _data: ... 
}, + * // operationType: 'delete', + * // ns: { db: 'mydb', coll: 'Person' }, + * // documentKey: { _id: 5a51b125c5500f5aa094c7bd } } + * await doc.remove(); + * + * @param {Array} [pipeline] + * @param {Object} [options] see the [mongodb driver options](http://mongodb.github.io/node-mongodb-native/3.0/api/Collection.html#watch) + * @return {ChangeStream} mongoose-specific change stream wrapper, inherits from EventEmitter + * @api public + */ + +Model.watch = function(pipeline, options) { + _checkContext(this, 'watch'); + + const changeStreamThunk = cb => { + if (this.$__collection.buffer) { + this.$__collection.addQueue(() => { + if (this.closed) { + return; + } + const driverChangeStream = this.$__collection.watch(pipeline, options); + cb(null, driverChangeStream); + }); + } else { + const driverChangeStream = this.$__collection.watch(pipeline, options); + cb(null, driverChangeStream); + } + }; + + return new ChangeStream(changeStreamThunk, pipeline, options); +}; + +/** + * _Requires MongoDB >= 3.6.0._ Starts a [MongoDB session](https://docs.mongodb.com/manual/release-notes/3.6/#client-sessions) + * for benefits like causal consistency, [retryable writes](https://docs.mongodb.com/manual/core/retryable-writes/), + * and [transactions](http://thecodebarbarian.com/a-node-js-perspective-on-mongodb-4-transactions.html). + * + * Calling `MyModel.startSession()` is equivalent to calling `MyModel.db.startSession()`. + * + * This function does not trigger any middleware. + * + * ####Example: + * + * const session = await Person.startSession(); + * let doc = await Person.findOne({ name: 'Ned Stark' }, null, { session }); + * await doc.remove(); + * // `doc` will always be null, even if reading from a replica set + * // secondary. Without causal consistency, it is possible to + * // get a doc back from the below query if the query reads from a + * // secondary that is experiencing replication lag. + * doc = await Person.findOne({ name: 'Ned Stark' }, null, { session, readPreference: 'secondary' }); + * + * @param {Object} [options] see the [mongodb driver options](http://mongodb.github.io/node-mongodb-native/3.0/api/MongoClient.html#startSession) + * @param {Boolean} [options.causalConsistency=true] set to false to disable causal consistency + * @param {Function} [callback] + * @return {Promise} promise that resolves to a MongoDB driver `ClientSession` + * @api public + */ + +Model.startSession = function() { + _checkContext(this, 'startSession'); + + return this.db.startSession.apply(this.db, arguments); +}; + +/** + * Shortcut for validating an array of documents and inserting them into + * MongoDB if they're all valid. This function is faster than `.create()` + * because it only sends one operation to the server, rather than one for each + * document. + * + * Mongoose always validates each document **before** sending `insertMany` + * to MongoDB. So if one document has a validation error, no documents will + * be saved, unless you set + * [the `ordered` option to false](https://docs.mongodb.com/manual/reference/method/db.collection.insertMany/#error-handling). + * + * This function does **not** trigger save middleware. + * + * This function triggers the following middleware. 
+ * + * - `insertMany()` + * + * ####Example: + * + * const arr = [{ name: 'Star Wars' }, { name: 'The Empire Strikes Back' }]; + * Movies.insertMany(arr, function(error, docs) {}); + * + * @param {Array|Object|*} doc(s) + * @param {Object} [options] see the [mongodb driver options](http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#insertMany) + * @param {Boolean} [options.ordered = true] if true, will fail fast on the first error encountered. If false, will insert all the documents it can and report errors later. An `insertMany()` with `ordered = false` is called an "unordered" `insertMany()`. + * @param {Boolean} [options.rawResult = false] if false, the returned promise resolves to the documents that passed mongoose document validation. If `true`, will return the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~insertWriteOpCallback) with a `mongoose` property that contains `validationErrors` if this is an unordered `insertMany`. + * @param {Boolean} [options.lean = false] if `true`, skips hydrating and validating the documents. This option is useful if you need the extra performance, but Mongoose won't validate the documents before inserting. + * @param {Number} [options.limit = null] this limits the number of documents being processed (validation/casting) by mongoose in parallel, this does **NOT** send the documents in batches to MongoDB. Use this option if you're processing a large number of documents and your app is running out of memory. + * @param {String|Object|Array} [options.populate = null] populates the result documents. This option is a no-op if `rawResult` is set. + * @param {Function} [callback] callback + * @return {Promise} resolving to the raw result from the MongoDB driver if `options.rawResult` was `true`, or the documents that passed validation, otherwise + * @api public + */ + +Model.insertMany = function(arr, options, callback) { + _checkContext(this, 'insertMany'); + + if (typeof options === 'function') { + callback = options; + options = null; + } + return this.db.base._promiseOrCallback(callback, cb => { + this.$__insertMany(arr, options, cb); + }, this.events); +}; + +/*! + * ignore + */ + +Model.$__insertMany = function(arr, options, callback) { + const _this = this; + if (typeof options === 'function') { + callback = options; + options = null; + } + if (callback) { + callback = this.$handleCallbackError(callback); + callback = this.$wrapCallback(callback); + } + callback = callback || utils.noop; + options = options || {}; + const limit = get(options, 'limit', 1000); + const rawResult = get(options, 'rawResult', false); + const ordered = get(options, 'ordered', true); + const lean = get(options, 'lean', false); + + if (!Array.isArray(arr)) { + arr = [arr]; + } + + const validationErrors = []; + const toExecute = arr.map(doc => + callback => { + if (!(doc instanceof _this)) { + try { + doc = new _this(doc); + } catch (err) { + return callback(err); + } + } + if (options.session != null) { + doc.$session(options.session); + } + // If option `lean` is set to true bypass validation + if (lean) { + // we have to execute callback at the nextTick to be compatible + // with parallelLimit, as `results` variable has TDZ issue if we + // execute the callback synchronously + return immediate(() => callback(null, doc)); + } + doc.$validate({ __noPromise: true }, function(error) { + if (error) { + // Option `ordered` signals that insert should be continued after reaching + // a failing insert. 
Therefore we delegate "null", meaning the validation + // failed. It's up to the next function to filter out all failed models + if (ordered === false) { + validationErrors.push(error); + return callback(null, null); + } + return callback(error); + } + callback(null, doc); + }); + }); + + parallelLimit(toExecute, limit, function(error, docs) { + if (error) { + callback(error, null); + return; + } + // We filter all failed pre-validations by removing nulls + const docAttributes = docs.filter(function(doc) { + return doc != null; + }); + // Quickly escape while there aren't any valid docAttributes + if (docAttributes.length < 1) { + if (rawResult) { + const res = { + mongoose: { + validationErrors: validationErrors + } + }; + return callback(null, res); + } + callback(null, []); + return; + } + const docObjects = docAttributes.map(function(doc) { + if (doc.$__schema.options.versionKey) { + doc[doc.$__schema.options.versionKey] = 0; + } + if (doc.initializeTimestamps) { + return doc.initializeTimestamps().toObject(internalToObjectOptions); + } + return doc.toObject(internalToObjectOptions); + }); + + _this.$__collection.insertMany(docObjects, options, function(error, res) { + if (error) { + // `writeErrors` is a property reported by the MongoDB driver, + // just not if there's only 1 error. + if (error.writeErrors == null && + get(error, 'result.result.writeErrors') != null) { + error.writeErrors = error.result.result.writeErrors; + } + + // `insertedDocs` is a Mongoose-specific property + const erroredIndexes = new Set(get(error, 'writeErrors', []).map(err => err.index)); + + let firstErroredIndex = -1; + error.insertedDocs = docAttributes. + filter((doc, i) => { + const isErrored = erroredIndexes.has(i); + + if (ordered) { + if (firstErroredIndex > -1) { + return i < firstErroredIndex; + } + + if (isErrored) { + firstErroredIndex = i; + } + } + + return !isErrored; + }). + map(function setIsNewForInsertedDoc(doc) { + doc.$__reset(); + _setIsNew(doc, false); + return doc; + }); + + callback(error, null); + return; + } + + for (const attribute of docAttributes) { + attribute.$__reset(); + _setIsNew(attribute, false); + } + + if (rawResult) { + if (ordered === false) { + // Decorate with mongoose validation errors in case of unordered, + // because then still do `insertMany()` + res.mongoose = { + validationErrors: validationErrors + }; + } + return callback(null, res); + } + + if (options.populate != null) { + return _this.populate(docAttributes, options.populate, err => { + if (err != null) { + error.insertedDocs = docAttributes; + return callback(err); + } + + callback(null, docs); + }); + } + + callback(null, docAttributes); + }); + }); +}; + +/*! + * ignore + */ + +function _setIsNew(doc, val) { + doc.$isNew = val; + doc.$emit('isNew', val); + doc.constructor.emit('isNew', val); + + const subdocs = doc.$getAllSubdocs(); + for (const subdoc of subdocs) { + subdoc.$isNew = val; + } +} + +/** + * Sends multiple `insertOne`, `updateOne`, `updateMany`, `replaceOne`, + * `deleteOne`, and/or `deleteMany` operations to the MongoDB server in one + * command. This is faster than sending multiple independent operations (e.g. + * if you use `create()`) because with `bulkWrite()` there is only one round + * trip to MongoDB. + * + * Mongoose will perform casting on all operations you provide. + * + * This function does **not** trigger any middleware, neither `save()`, nor `update()`. 
+ * If you need to trigger
+ * `save()` middleware for every document use [`create()`](http://mongoosejs.com/docs/api.html#model_Model.create) instead.
+ *
+ * ####Example:
+ *
+ *     Character.bulkWrite([
+ *       {
+ *         insertOne: {
+ *           document: {
+ *             name: 'Eddard Stark',
+ *             title: 'Warden of the North'
+ *           }
+ *         }
+ *       },
+ *       {
+ *         updateOne: {
+ *           filter: { name: 'Eddard Stark' },
+ *           // If you were using the MongoDB driver directly, you'd need to do
+ *           // `update: { $set: { title: ... } }` but mongoose adds $set for
+ *           // you.
+ *           update: { title: 'Hand of the King' }
+ *         }
+ *       },
+ *       {
+ *         deleteOne: {
+ *           filter: { name: 'Eddard Stark' }
+ *         }
+ *       }
+ *     ]).then(res => {
+ *       // Prints "1 1 1"
+ *       console.log(res.insertedCount, res.modifiedCount, res.deletedCount);
+ *     });
+ *
+ * The [supported operations](https://docs.mongodb.com/manual/reference/method/db.collection.bulkWrite/#db.collection.bulkWrite) are:
+ *
+ * - `insertOne`
+ * - `updateOne`
+ * - `updateMany`
+ * - `deleteOne`
+ * - `deleteMany`
+ * - `replaceOne`
+ *
+ * @param {Array} ops
+ * @param {Object} [ops.insertOne.document] The document to insert
+ * @param {Object} [ops.updateOne.filter] Update the first document that matches this filter
+ * @param {Object} [ops.updateOne.update] An object containing [update operators](https://docs.mongodb.com/manual/reference/operator/update/)
+ * @param {Boolean} [ops.updateOne.upsert=false] If true, insert a doc if none match
+ * @param {Boolean} [ops.updateOne.timestamps=true] If false, do not apply [timestamps](https://mongoosejs.com/docs/guide.html#timestamps) to the operation
+ * @param {Object} [ops.updateOne.collation] The [MongoDB collation](https://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-34-collations) to use
+ * @param {Array} [ops.updateOne.arrayFilters] The [array filters](https://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-36-array-filters.html) used in `update`
+ * @param {Object} [ops.updateMany.filter] Update all the documents that match this filter
+ * @param {Object} [ops.updateMany.update] An object containing [update operators](https://docs.mongodb.com/manual/reference/operator/update/)
+ * @param {Boolean} [ops.updateMany.upsert=false] If true, insert a doc if no documents match `filter`
+ * @param {Boolean} [ops.updateMany.timestamps=true] If false, do not apply [timestamps](https://mongoosejs.com/docs/guide.html#timestamps) to the operation
+ * @param {Object} [ops.updateMany.collation] The [MongoDB collation](https://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-34-collations) to use
+ * @param {Array} [ops.updateMany.arrayFilters] The [array filters](https://thecodebarbarian.com/a-nodejs-perspective-on-mongodb-36-array-filters.html) used in `update`
+ * @param {Object} [ops.deleteOne.filter] Delete the first document that matches this filter
+ * @param {Object} [ops.deleteMany.filter] Delete all documents that match this filter
+ * @param {Object} [ops.replaceOne.filter] Replace the first document that matches this filter
+ * @param {Object} [ops.replaceOne.replacement] The replacement document
+ * @param {Boolean} [ops.replaceOne.upsert=false] If true, insert a doc if no documents match `filter`
+ * @param {Object} [options]
+ * @param {Boolean} [options.ordered=true] If true, execute writes in order and stop at the first error. If false, execute writes in parallel and continue until all writes have either succeeded or errored.
+ * @param {ClientSession} [options.session=null] The session associated with this bulk write. See [transactions docs](/docs/transactions.html). + * @param {String|number} [options.w=1] The [write concern](https://docs.mongodb.com/manual/reference/write-concern/). See [`Query#w()`](/docs/api.html#query_Query-w) for more information. + * @param {number} [options.wtimeout=null] The [write concern timeout](https://docs.mongodb.com/manual/reference/write-concern/#wtimeout). + * @param {Boolean} [options.j=true] If false, disable [journal acknowledgement](https://docs.mongodb.com/manual/reference/write-concern/#j-option) + * @param {Boolean} [options.bypassDocumentValidation=false] If true, disable [MongoDB server-side schema validation](https://docs.mongodb.com/manual/core/schema-validation/) for all writes in this bulk. + * @param {Boolean} [options.strict=null] Overwrites the [`strict` option](/docs/guide.html#strict) on schema. If false, allows filtering and writing fields not defined in the schema for all writes in this bulk. + * @param {Function} [callback] callback `function(error, bulkWriteOpResult) {}` + * @return {Promise} resolves to a [`BulkWriteOpResult`](http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#~BulkWriteOpResult) if the operation succeeds + * @api public + */ + +Model.bulkWrite = function(ops, options, callback) { + _checkContext(this, 'bulkWrite'); + + if (typeof options === 'function') { + callback = options; + options = null; + } + options = options || {}; + + const validations = ops.map(op => castBulkWrite(this, op, options)); + + callback = this.$handleCallbackError(callback); + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + each(validations, (fn, cb) => fn(cb), error => { + if (error) { + return cb(error); + } + + if (ops.length === 0) { + return cb(null, getDefaultBulkwriteResult()); + } + + this.$__collection.bulkWrite(ops, options, (error, res) => { + if (error) { + return cb(error); + } + + cb(null, res); + }); + }); + }, this.events); +}; + +/** + * takes an array of documents, gets the changes and inserts/updates documents in the database + * according to whether or not the document is new, or whether it has changes or not. 
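+ *
+ * ####Example (illustrative sketch; the `Character` model and its fields are assumed for this snippet, not part of the API):
+ *
+ *     const Character = mongoose.model('Character', new Schema({ name: String, title: String }));
+ *
+ *     const docs = await Character.find();
+ *     docs[0].title = 'Hand of the King'; // existing, modified document -> becomes an `updateOne` op
+ *     docs.push(new Character({ name: 'Brienne of Tarth' })); // new document -> becomes an `insertOne` op
+ *
+ *     // A single `bulkWrite` round trip persists both the update and the insert
+ *     await Character.bulkSave(docs);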
+ * + * `bulkSave` uses `bulkWrite` under the hood, so it's mostly useful when dealing with many documents (10K+) + * + * @param {[Document]} documents + * + */ +Model.bulkSave = function(documents) { + const preSavePromises = documents.map(buildPreSavePromise); + + const writeOperations = this.buildBulkWriteOperations(documents, { skipValidation: true }); + + let bulkWriteResultPromise; + return Promise.all(preSavePromises) + .then(() => bulkWriteResultPromise = this.bulkWrite(writeOperations)) + .then(() => documents.map(buildSuccessfulWriteHandlerPromise)) + .then(() => bulkWriteResultPromise) + .catch((err) => { + if (!get(err, 'writeErrors.length')) { + throw err; + } + return Promise.all( + documents.map((document) => { + const documentError = err.writeErrors.find(writeError => { + const writeErrorDocumentId = writeError.err.op._id || writeError.err.op.q._id; + return writeErrorDocumentId.toString() === document._id.toString(); + }); + + if (documentError == null) { + return buildSuccessfulWriteHandlerPromise(document); + } + }) + ).then(() => { + throw err; + }); + }); +}; + +function buildPreSavePromise(document) { + return new Promise((resolve, reject) => { + document.schema.s.hooks.execPre('save', document, (err) => { + if (err) { + reject(err); + return; + } + resolve(); + }); + }); +} + +function buildSuccessfulWriteHandlerPromise(document) { + return new Promise((resolve, reject) => { + handleSuccessfulWrite(document, resolve, reject); + }); +} + +function handleSuccessfulWrite(document, resolve, reject) { + if (document.$isNew) { + _setIsNew(document, false); + } + + document.$__reset(); + document.schema.s.hooks.execPost('save', document, {}, (err) => { + if (err) { + reject(err); + return; + } + resolve(); + }); +} + +/** + * + * @param {[Document]} documents The array of documents to build write operations of + * @param {Object} options + * @param {Boolean} options.skipValidation defaults to `false`, when set to true, building the write operations will bypass validating the documents. 
+ * @returns + */ +Model.buildBulkWriteOperations = function buildBulkWriteOperations(documents, options) { + if (!Array.isArray(documents)) { + throw new Error(`bulkSave expects an array of documents to be passed, received \`${documents}\` instead`); + } + + setDefaultOptions(); + + const writeOperations = documents.reduce((accumulator, document, i) => { + if (!options.skipValidation) { + if (!(document instanceof Document)) { + throw new Error(`documents.${i} was not a mongoose document, documents must be an array of mongoose documents (instanceof mongoose.Document).`); + } + const validationError = document.validateSync(); + if (validationError) { + throw validationError; + } + } + + const isANewDocument = document.isNew; + if (isANewDocument) { + accumulator.push({ + insertOne: { document } + }); + + return accumulator; + } + + const delta = document.$__delta(); + const isDocumentWithChanges = delta != null && !utils.isEmptyObject(delta[0]); + + if (isDocumentWithChanges) { + const where = document.$__where(delta[0]); + const changes = delta[1]; + + _applyCustomWhere(document, where); + + document.$__version(where, delta); + + accumulator.push({ + updateOne: { + filter: where, + update: changes + } + }); + + return accumulator; + } + + return accumulator; + }, []); + + return writeOperations; + + + function setDefaultOptions() { + options = options || {}; + if (options.skipValidation == null) { + options.skipValidation = false; + } + } +}; + +/** + * Shortcut for creating a new Document from existing raw data, pre-saved in the DB. + * The document returned has no paths marked as modified initially. + * + * ####Example: + * + * // hydrate previous data into a Mongoose document + * const mongooseCandy = Candy.hydrate({ _id: '54108337212ffb6d459f854c', type: 'jelly bean' }); + * + * @param {Object} obj + * @param {Object|String|Array} [projection] optional projection containing which fields should be selected for this document + * @return {Document} document instance + * @api public + */ + +Model.hydrate = function(obj, projection) { + _checkContext(this, 'hydrate'); + + const document = require('./queryhelpers').createModel(this, obj, projection); + document.$init(obj); + return document; +}; + +/** + * Updates one document in the database without returning it. + * + * This function triggers the following middleware. + * + * - `update()` + * + * This method is deprecated. See [Deprecation Warnings](../deprecations.html#update) for details. + * + * ####Examples: + * + * MyModel.update({ age: { $gt: 18 } }, { oldEnough: true }, fn); + * + * const res = await MyModel.update({ name: 'Tobi' }, { ferret: true }); + * res.n; // Number of documents that matched `{ name: 'Tobi' }` + * // Number of documents that were changed. If every doc matched already + * // had `ferret` set to `true`, `nModified` will be 0. + * res.nModified; + * + * ####Valid options: + * + * - `strict` (boolean): overrides the [schema-level `strict` option](/docs/guide.html#strict) for this update + * - `upsert` (boolean): whether to create the doc if it doesn't match (false) + * - `writeConcern` (object): sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * - `multi` (boolean): whether multiple documents should be updated (false) + * - `runValidators`: if true, runs [update validators](/docs/validation.html#update-validators) on this command. 
Update validators validate the update operation against the model's schema. + * - `setDefaultsOnInsert` (boolean): if this and `upsert` are true, mongoose will apply the [defaults](http://mongoosejs.com/docs/defaults.html) specified in the model's schema if a new document is created. This option only works on MongoDB >= 2.4 because it relies on [MongoDB's `$setOnInsert` operator](https://docs.mongodb.org/v2.4/reference/operator/update/setOnInsert/). + * - `timestamps` (boolean): If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. + * - `overwrite` (boolean): disables update-only mode, allowing you to overwrite the doc (false) + * + * All `update` values are cast to their appropriate SchemaTypes before being sent. + * + * The `callback` function receives `(err, rawResponse)`. + * + * - `err` is the error if any occurred + * - `rawResponse` is the full response from Mongo + * + * ####Note: + * + * All top level keys which are not `atomic` operation names are treated as set operations: + * + * ####Example: + * + * const query = { name: 'borne' }; + * Model.update(query, { name: 'jason bourne' }, options, callback); + * + * // is sent as + * Model.update(query, { $set: { name: 'jason bourne' }}, options, function(err, res)); + * // if overwrite option is false. If overwrite is true, sent without the $set wrapper. + * + * This helps prevent accidentally overwriting all documents in your collection with `{ name: 'jason bourne' }`. + * + * ####Note: + * + * Be careful to not use an existing model instance for the update clause (this won't work and can cause weird behavior like infinite loops). Also, ensure that the update clause does not have an _id property, which causes Mongo to return a "Mod on _id not allowed" error. + * + * @deprecated + * @see strict http://mongoosejs.com/docs/guide.html#strict + * @see response http://docs.mongodb.org/v2.6/reference/command/update/#output + * @param {Object} filter + * @param {Object} doc + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.multi=false] whether multiple documents should be updated or just the first one that matches `filter`. + * @param {Boolean} [options.runValidators=false] if true, runs [update validators](/docs/validation.html#update-validators) on this command. Update validators validate the update operation against the model's schema. + * @param {Boolean} [options.setDefaultsOnInsert=false] `true` by default. If `setDefaultsOnInsert` and `upsert` are true, mongoose will apply the [defaults](http://mongoosejs.com/docs/defaults.html) specified in the model's schema if a new document is created. + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. 
+ * @param {Boolean} [options.overwrite=false] By default, if you don't include any [update operators](https://docs.mongodb.com/manual/reference/operator/update/) in `doc`, Mongoose will wrap `doc` in `$set` for you. This prevents you from accidentally overwriting the document. This option tells Mongoose to skip adding `$set`. + * @param {Function} [callback] params are (error, [updateWriteOpResult](https://mongodb.github.io/node-mongodb-native/3.6/api/Collection.html#~updateWriteOpResult)) + * @param {Function} [callback] + * @return {Query} + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @see writeOpResult https://mongodb.github.io/node-mongodb-native/3.6/api/Collection.html#~updateWriteOpResult + * @see Query docs https://mongoosejs.com/docs/queries.html + * @api public + */ + +Model.update = function update(conditions, doc, options, callback) { + _checkContext(this, 'update'); + + return _update(this, 'update', conditions, doc, options, callback); +}; + +/** + * Same as `update()`, except MongoDB will update _all_ documents that match + * `filter` (as opposed to just the first one) regardless of the value of + * the `multi` option. + * + * **Note** updateMany will _not_ fire update middleware. Use `pre('updateMany')` + * and `post('updateMany')` instead. + * + * ####Example: + * const res = await Person.updateMany({ name: /Stark$/ }, { isDeleted: true }); + * res.n; // Number of documents matched + * res.nModified; // Number of documents modified + * + * This function triggers the following middleware. + * + * - `updateMany()` + * + * @param {Object} filter + * @param {Object|Array} update + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. + * @param {Function} [callback] `function(error, res) {}` where `res` has 3 properties: `n`, `nModified`, `ok`. + * @return {Query} + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @api public + */ + +Model.updateMany = function updateMany(conditions, doc, options, callback) { + _checkContext(this, 'updateMany'); + + return _update(this, 'updateMany', conditions, doc, options, callback); +}; + +/** + * Same as `update()`, except it does not support the `multi` or `overwrite` + * options. + * + * - MongoDB will update _only_ the first document that matches `filter` regardless of the value of the `multi` option. + * - Use `replaceOne()` if you want to overwrite an entire document rather than using atomic operators like `$set`. 
+ * + * ####Example: + * const res = await Person.updateOne({ name: 'Jean-Luc Picard' }, { ship: 'USS Enterprise' }); + * res.n; // Number of documents matched + * res.nModified; // Number of documents modified + * + * This function triggers the following middleware. + * + * - `updateOne()` + * + * @param {Object} filter + * @param {Object|Array} update + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Function} [callback] params are (error, writeOpResult) + * @return {Query} + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @api public + */ + +Model.updateOne = function updateOne(conditions, doc, options, callback) { + _checkContext(this, 'updateOne'); + + return _update(this, 'updateOne', conditions, doc, options, callback); +}; + +/** + * Same as `update()`, except MongoDB replace the existing document with the + * given document (no atomic operators like `$set`). + * + * ####Example: + * const res = await Person.replaceOne({ _id: 24601 }, { name: 'Jean Valjean' }); + * res.n; // Number of documents matched + * res.nModified; // Number of documents modified + * + * This function triggers the following middleware. + * + * - `replaceOne()` + * + * @param {Object} filter + * @param {Object} doc + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. + * @param {Function} [callback] `function(error, res) {}` where `res` has 3 properties: `n`, `nModified`, `ok`. 
+ * @return {Query} + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @return {Query} + * @api public + */ + +Model.replaceOne = function replaceOne(conditions, doc, options, callback) { + _checkContext(this, 'replaceOne'); + + const versionKey = get(this, 'schema.options.versionKey', null); + if (versionKey && !doc[versionKey]) { + doc[versionKey] = 0; + } + + return _update(this, 'replaceOne', conditions, doc, options, callback); +}; + +/*! + * Common code for `updateOne()`, `updateMany()`, `replaceOne()`, and `update()` + * because they need to do the same thing + */ + +function _update(model, op, conditions, doc, options, callback) { + const mq = new model.Query({}, {}, model, model.collection); + callback = model.$handleCallbackError(callback); + // gh-2406 + // make local deep copy of conditions + if (conditions instanceof Document) { + conditions = conditions.toObject(); + } else { + conditions = utils.clone(conditions); + } + options = typeof options === 'function' ? options : utils.clone(options); + + const versionKey = get(model, 'schema.options.versionKey', null); + _decorateUpdateWithVersionKey(doc, options, versionKey); + + return mq[op](conditions, doc, options, callback); +} + +/** + * Executes a mapReduce command. + * + * `o` is an object specifying all mapReduce options as well as the map and reduce functions. All options are delegated to the driver implementation. See [node-mongodb-native mapReduce() documentation](http://mongodb.github.io/node-mongodb-native/api-generated/collection.html#mapreduce) for more detail about options. + * + * This function does not trigger any middleware. + * + * ####Example: + * + * const o = {}; + * // `map()` and `reduce()` are run on the MongoDB server, not Node.js, + * // these functions are converted to strings + * o.map = function () { emit(this.name, 1) }; + * o.reduce = function (k, vals) { return vals.length }; + * User.mapReduce(o, function (err, results) { + * console.log(results) + * }) + * + * ####Other options: + * + * - `query` {Object} query filter object. + * - `sort` {Object} sort input objects using this key + * - `limit` {Number} max number of documents + * - `keeptemp` {Boolean, default:false} keep temporary data + * - `finalize` {Function} finalize function + * - `scope` {Object} scope variables exposed to map/reduce/finalize during execution + * - `jsMode` {Boolean, default:false} it is possible to make the execution stay in JS. Provided in MongoDB > 2.0.X + * - `verbose` {Boolean, default:false} provide statistics on job execution time. + * - `readPreference` {String} + * - `out*` {Object, default: {inline:1}} sets the output target for the map reduce job. + * + * ####* out options: + * + * - `{inline:1}` the results are returned in an array + * - `{replace: 'collectionName'}` add the results to collectionName: the results replace the collection + * - `{reduce: 'collectionName'}` add the results to collectionName: if dups are detected, uses the reducer / finalize functions + * - `{merge: 'collectionName'}` add the results to collectionName: if dups exist the new docs overwrite the old + * + * If `options.out` is set to `replace`, `merge`, or `reduce`, a Model instance is returned that can be used for further querying. 
Queries run against this model are all executed with the [`lean` option](/docs/tutorials/lean.html); meaning only the js object is returned and no Mongoose magic is applied (getters, setters, etc). + * + * ####Example: + * + * const o = {}; + * // You can also define `map()` and `reduce()` as strings if your + * // linter complains about `emit()` not being defined + * o.map = 'function () { emit(this.name, 1) }'; + * o.reduce = 'function (k, vals) { return vals.length }'; + * o.out = { replace: 'createdCollectionNameForResults' } + * o.verbose = true; + * + * User.mapReduce(o, function (err, model, stats) { + * console.log('map reduce took %d ms', stats.processtime) + * model.find().where('value').gt(10).exec(function (err, docs) { + * console.log(docs); + * }); + * }) + * + * // `mapReduce()` returns a promise. However, ES6 promises can only + * // resolve to exactly one value, + * o.resolveToObject = true; + * const promise = User.mapReduce(o); + * promise.then(function (res) { + * const model = res.model; + * const stats = res.stats; + * console.log('map reduce took %d ms', stats.processtime) + * return model.find().where('value').gt(10).exec(); + * }).then(function (docs) { + * console.log(docs); + * }).then(null, handleError).end() + * + * @param {Object} o an object specifying map-reduce options + * @param {Function} [callback] optional callback + * @see http://www.mongodb.org/display/DOCS/MapReduce + * @return {Promise} + * @api public + */ + +Model.mapReduce = function mapReduce(o, callback) { + _checkContext(this, 'mapReduce'); + + callback = this.$handleCallbackError(callback); + + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + + if (!Model.mapReduce.schema) { + const opts = { _id: false, id: false, strict: false }; + Model.mapReduce.schema = new Schema({}, opts); + } + + if (!o.out) o.out = { inline: 1 }; + if (o.verbose !== false) o.verbose = true; + + o.map = String(o.map); + o.reduce = String(o.reduce); + + if (o.query) { + let q = new this.Query(o.query); + q.cast(this); + o.query = q._conditions; + q = undefined; + } + + this.$__collection.mapReduce(null, null, o, (err, res) => { + if (err) { + return cb(err); + } + if (res.collection) { + // returned a collection, convert to Model + const model = Model.compile('_mapreduce_' + res.collection.collectionName, + Model.mapReduce.schema, res.collection.collectionName, this.db, + this.base); + + model._mapreduce = true; + res.model = model; + + return cb(null, res); + } + + cb(null, res); + }); + }, this.events); +}; + +/** + * Performs [aggregations](http://docs.mongodb.org/manual/applications/aggregation/) on the models collection. + * + * If a `callback` is passed, the `aggregate` is executed and a `Promise` is returned. If a callback is not passed, the `aggregate` itself is returned. + * + * This function triggers the following middleware. + * + * - `aggregate()` + * + * ####Example: + * + * // Find the max balance of all accounts + * const res = await Users.aggregate([ + * { $group: { _id: null, maxBalance: { $max: '$balance' }}}, + * { $project: { _id: 0, maxBalance: 1 }} + * ]); + * + * console.log(res); // [ { maxBalance: 98000 } ] + * + * // Or use the aggregation pipeline builder. + * const res = await Users.aggregate(). + * group({ _id: null, maxBalance: { $max: '$balance' } }). + * project('-id maxBalance'). 
+ * exec(); + * console.log(res); // [ { maxBalance: 98 } ] + * + * ####NOTE: + * + * - Mongoose does **not** cast aggregation pipelines to the model's schema because `$project` and `$group` operators allow redefining the "shape" of the documents at any stage of the pipeline, which may leave documents in an incompatible format. You can use the [mongoose-cast-aggregation plugin](https://github.com/AbdelrahmanHafez/mongoose-cast-aggregation) to enable minimal casting for aggregation pipelines. + * - The documents returned are plain javascript objects, not mongoose documents (since any shape of document can be returned). + * + * #### More About Aggregations: + * + * - [Mongoose `Aggregate`](/docs/api/aggregate.html) + * - [An Introduction to Mongoose Aggregate](https://masteringjs.io/tutorials/mongoose/aggregate) + * - [MongoDB Aggregation docs](http://docs.mongodb.org/manual/applications/aggregation/) + * + * @see Aggregate #aggregate_Aggregate + * @see MongoDB http://docs.mongodb.org/manual/applications/aggregation/ + * @param {Array} [pipeline] aggregation pipeline as an array of objects + * @param {Object} [options] aggregation options + * @param {Function} [callback] + * @return {Aggregate} + * @api public + */ + +Model.aggregate = function aggregate(pipeline, options, callback) { + _checkContext(this, 'aggregate'); + + if (arguments.length > 3 || get(pipeline, 'constructor.name') === 'Object') { + throw new MongooseError('Mongoose 5.x disallows passing a spread of operators ' + + 'to `Model.aggregate()`. Instead of ' + + '`Model.aggregate({ $match }, { $skip })`, do ' + + '`Model.aggregate([{ $match }, { $skip }])`'); + } + + if (typeof pipeline === 'function') { + callback = pipeline; + pipeline = []; + } + + if (typeof options === 'function') { + callback = options; + options = null; + } + + const aggregate = new Aggregate(pipeline || []); + aggregate.model(this); + + if (options != null) { + aggregate.option(options); + } + + if (typeof callback === 'undefined') { + return aggregate; + } + + callback = this.$handleCallbackError(callback); + callback = this.$wrapCallback(callback); + + aggregate.exec(callback); + return aggregate; +}; + +/** + * Casts and validates the given object against this model's schema, passing the + * given `context` to custom validators. 
+ * + * ####Example: + * + * const Model = mongoose.model('Test', Schema({ + * name: { type: String, required: true }, + * age: { type: Number, required: true } + * }); + * + * try { + * await Model.validate({ name: null }, ['name']) + * } catch (err) { + * err instanceof mongoose.Error.ValidationError; // true + * Object.keys(err.errors); // ['name'] + * } + * + * @param {Object} obj + * @param {Array} pathsToValidate + * @param {Object} [context] + * @param {Function} [callback] + * @return {Promise|undefined} + * @api public + */ + +Model.validate = function validate(obj, pathsToValidate, context, callback) { + if ((arguments.length < 3) || (arguments.length === 3 && typeof arguments[2] === 'function')) { + // For convenience, if we're validating a document or an object, make `context` default to + // the model so users don't have to always pass `context`, re: gh-10132, gh-10346 + context = obj; + } + + return this.db.base._promiseOrCallback(callback, cb => { + const schema = this.schema; + let paths = Object.keys(schema.paths); + + if (pathsToValidate != null) { + const _pathsToValidate = new Set(pathsToValidate); + paths = paths.filter(p => { + const pieces = p.split('.'); + let cur = pieces[0]; + + for (const piece of pieces) { + if (_pathsToValidate.has(cur)) { + return true; + } + cur += '.' + piece; + } + + return _pathsToValidate.has(p); + }); + } + + for (const path of paths) { + const schemaType = schema.path(path); + if (!schemaType || !schemaType.$isMongooseArray) { + continue; + } + + const val = get(obj, path); + pushNestedArrayPaths(val, path); + } + + let remaining = paths.length; + let error = null; + + for (const path of paths) { + const schemaType = schema.path(path); + if (schemaType == null) { + _checkDone(); + continue; + } + + const pieces = path.split('.'); + let cur = obj; + for (let i = 0; i < pieces.length - 1; ++i) { + cur = cur[pieces[i]]; + } + + let val = get(obj, path, void 0); + + if (val != null) { + try { + val = schemaType.cast(val); + cur[pieces[pieces.length - 1]] = val; + } catch (err) { + error = error || new ValidationError(); + error.addError(path, err); + + _checkDone(); + continue; + } + } + + schemaType.doValidate(val, err => { + if (err) { + error = error || new ValidationError(); + if (err instanceof ValidationError) { + for (const _err of Object.keys(err.errors)) { + error.addError(`${path}.${err.errors[_err].path}`, _err); + } + } else { + error.addError(err.path, err); + } + } + _checkDone(); + }, context, { path: path }); + } + + function pushNestedArrayPaths(nestedArray, path) { + if (nestedArray == null) { + return; + } + + for (let i = 0; i < nestedArray.length; ++i) { + if (Array.isArray(nestedArray[i])) { + pushNestedArrayPaths(nestedArray[i], path + '.' + i); + } else { + paths.push(path + '.' + i); + } + } + } + + function _checkDone() { + if (--remaining <= 0) { + return cb(error); + } + } + }); +}; + +/** + * Populates document references. + * + * Changed in Mongoose 6: the model you call `populate()` on should be the + * "local field" model, **not** the "foreign field" model. + * + * ####Available top-level options: + * + * - path: space delimited path(s) to populate + * - select: optional fields to select + * - match: optional query conditions to match + * - model: optional name of the model to use for population + * - options: optional query options like sort, limit, etc + * - justOne: optional boolean, if true Mongoose will always set `path` to an array. Inferred from schema by default. 
+ * - strictPopulate: optional boolean, set to `false` to allow populating paths that aren't in the schema.
+ *
+ * ####Examples:
+ *
+ *     const Dog = mongoose.model('Dog', new Schema({ name: String, breed: String }));
+ *     const Person = mongoose.model('Person', new Schema({
+ *       name: String,
+ *       pet: { type: mongoose.ObjectId, ref: 'Dog' }
+ *     }));
+ *
+ *     const pets = await Dog.create([
+ *       { name: 'Daisy', breed: 'Beagle' },
+ *       { name: 'Einstein', breed: 'Catalan Sheepdog' }
+ *     ]);
+ *
+ *     // populate many plain objects
+ *     const users = [
+ *       { name: 'John Wick', pet: pets[0]._id },
+ *       { name: 'Doc Brown', pet: pets[1]._id }
+ *     ];
+ *     await Person.populate(users, { path: 'pet', select: 'name' });
+ *     users[0].pet.name; // 'Daisy'
+ *     users[0].pet.breed; // undefined because of `select`
+ *
+ * @param {Document|Array} docs Either a single document or array of documents to populate.
+ * @param {Object|String} options Either the paths to populate or an object specifying all parameters
+ * @param {string} [options.path=null] The path to populate.
+ * @param {boolean} [options.retainNullValues=false] By default, Mongoose removes null and undefined values from populated arrays. Use this option to make `populate()` retain `null` and `undefined` array entries.
+ * @param {boolean} [options.getters=false] If true, Mongoose will call any getters defined on the `localField`. By default, Mongoose gets the raw value of `localField`. For example, you would need to set this option to `true` if you wanted to [add a `lowercase` getter to your `localField`](/docs/schematypes.html#schematype-options).
+ * @param {boolean} [options.clone=false] When you do `BlogPost.find().populate('author')`, blog posts with the same author will share 1 copy of an `author` doc. Enable this option to make Mongoose clone populated docs before assigning them.
+ * @param {Object|Function} [options.match=null] Add an additional filter to the populate query. Can be a filter object containing [MongoDB query syntax](https://docs.mongodb.com/manual/tutorial/query-documents/), or a function that returns a filter object.
+ * @param {Boolean} [options.skipInvalidIds=false] By default, Mongoose throws a cast error if `localField` and `foreignField` schemas don't line up. If you enable this option, Mongoose will instead filter out any `localField` properties that cannot be cast to `foreignField`'s schema type.
+ * @param {Number} [options.perDocumentLimit=null] For legacy reasons, `limit` with `populate()` may give incorrect results because it only executes a single query for every document being populated. If you set `perDocumentLimit`, Mongoose will ensure correct `limit` per document by executing a separate query for each document to `populate()`. For example, `.find().populate({ path: 'test', perDocumentLimit: 2 })` will execute 2 additional queries if `.find()` returns 2 documents.
+ * @param {Boolean} [options.strictPopulate=true] Set to false to allow populating paths that aren't defined in the given model's schema.
+ * @param {Object} [options.options=null] Additional options like `limit` and `lean`.
+ * @param {Function} [options.transform=null] Function that Mongoose will call on every populated document that allows you to transform the populated document.
+ * @param {Function} [callback(err,doc)] Optional callback, executed upon completion. Receives `err` and the `doc(s)`.
+ * @return {Promise} + * @api public + */ + +Model.populate = function(docs, paths, callback) { + _checkContext(this, 'populate'); + + const _this = this; + + // normalized paths + paths = utils.populate(paths); + + // data that should persist across subPopulate calls + const cache = {}; + + callback = this.$handleCallbackError(callback); + return this.db.base._promiseOrCallback(callback, cb => { + cb = this.$wrapCallback(cb); + _populate(_this, docs, paths, cache, cb); + }, this.events); +}; + +/*! + * Populate helper + * + * @param {Model} model the model to use + * @param {Document|Array} docs Either a single document or array of documents to populate. + * @param {Object} paths + * @param {Function} [cb(err,doc)] Optional callback, executed upon completion. Receives `err` and the `doc(s)`. + * @return {Function} + * @api private + */ + +function _populate(model, docs, paths, cache, callback) { + let pending = paths.length; + if (paths.length === 0) { + return callback(null, docs); + } + + // each path has its own query options and must be executed separately + for (const path of paths) { + populate(model, docs, path, next); + } + + function next(err) { + if (err) { + return callback(err, null); + } + if (--pending) { + return; + } + callback(null, docs); + } +} + +/*! + * Populates `docs` + */ +const excludeIdReg = /\s?-_id\s?/; +const excludeIdRegGlobal = /\s?-_id\s?/g; + +function populate(model, docs, options, callback) { + // normalize single / multiple docs passed + if (!Array.isArray(docs)) { + docs = [docs]; + } + if (docs.length === 0 || docs.every(utils.isNullOrUndefined)) { + return callback(); + } + + const modelsMap = getModelsMapForPopulate(model, docs, options); + if (modelsMap instanceof MongooseError) { + return immediate(function() { + callback(modelsMap); + }); + } + + const len = modelsMap.length; + let vals = []; + + function flatten(item) { + // no need to include undefined values in our query + return undefined !== item; + } + + let _remaining = len; + let hasOne = false; + const params = []; + for (let i = 0; i < len; ++i) { + const mod = modelsMap[i]; + let select = mod.options.select; + let ids = utils.array.flatten(mod.ids, flatten); + ids = utils.array.unique(ids); + + const assignmentOpts = {}; + assignmentOpts.sort = get(mod, 'options.options.sort', void 0); + assignmentOpts.excludeId = excludeIdReg.test(select) || (select && select._id === 0); + + if (ids.length === 0 || ids.every(utils.isNullOrUndefined)) { + // Ensure that we set to 0 or empty array even + // if we don't actually execute a query to make sure there's a value + // and we know this path was populated for future sets. See gh-7731, gh-8230 + --_remaining; + _assign(model, [], mod, assignmentOpts); + continue; + } + + hasOne = true; + const match = createPopulateQueryFilter(ids, mod.match, mod.foreignField, mod.model, mod.options.skipInvalidIds); + + if (assignmentOpts.excludeId) { + // override the exclusion from the query so we can use the _id + // for document matching during assignment. we'll delete the + // _id back off before returning the result. 
+ if (typeof select === 'string') { + select = select.replace(excludeIdRegGlobal, ' '); + } else { + // preserve original select conditions by copying + select = utils.object.shallowCopy(select); + delete select._id; + } + } + + if (mod.options.options && mod.options.options.limit != null) { + assignmentOpts.originalLimit = mod.options.options.limit; + } else if (mod.options.limit != null) { + assignmentOpts.originalLimit = mod.options.limit; + } + params.push([mod, match, select, assignmentOpts, _next]); + } + if (!hasOne) { + // If models but no docs, skip further deep populate. + if (modelsMap.length > 0) { + return callback(); + } + // If no models to populate but we have a nested populate, + // keep trying, re: gh-8946 + if (options.populate != null) { + const opts = utils.populate(options.populate).map(pop => Object.assign({}, pop, { + path: options.path + '.' + pop.path + })); + return model.populate(docs, opts, callback); + } + return callback(); + } + + for (const arr of params) { + _execPopulateQuery.apply(null, arr); + } + function _next(err, valsFromDb) { + if (err != null) { + return callback(err, null); + } + vals = vals.concat(valsFromDb); + if (--_remaining === 0) { + _done(); + } + } + + function _done() { + for (const arr of params) { + const mod = arr[0]; + const assignmentOpts = arr[3]; + for (const val of vals) { + mod.options._childDocs.push(val); + } + _assign(model, vals, mod, assignmentOpts); + } + + for (const arr of params) { + removeDeselectedForeignField(arr[0].foreignField, arr[0].options, vals); + } + callback(); + } +} + +/*! + * ignore + */ + +function _execPopulateQuery(mod, match, select, assignmentOpts, callback) { + const subPopulate = utils.clone(mod.options.populate); + const queryOptions = Object.assign({ + skip: mod.options.skip, + limit: mod.options.limit, + perDocumentLimit: mod.options.perDocumentLimit + }, mod.options.options); + + if (mod.count) { + delete queryOptions.skip; + } + + if (queryOptions.perDocumentLimit != null) { + queryOptions.limit = queryOptions.perDocumentLimit; + delete queryOptions.perDocumentLimit; + } else if (queryOptions.limit != null) { + queryOptions.limit = queryOptions.limit * mod.ids.length; + } + + const query = mod.model.find(match, select, queryOptions); + // If we're doing virtual populate and projection is inclusive and foreign + // field is not selected, automatically select it because mongoose needs it. + // If projection is exclusive and client explicitly unselected the foreign + // field, that's the client's fault. + for (const foreignField of mod.foreignField) { + if (foreignField !== '_id' && query.selectedInclusively() && + !isPathSelectedInclusive(query._fields, foreignField)) { + query.select(foreignField); + } + } + + // If using count, still need the `foreignField` so we can match counts + // to documents, otherwise we would need a separate `count()` for every doc. + if (mod.count) { + for (const foreignField of mod.foreignField) { + query.select(foreignField); + } + } + + // If we need to sub-populate, call populate recursively + if (subPopulate) { + // If subpopulating on a discriminator, skip check for non-existent + // paths. Because the discriminator may not have the path defined. 
+ if (mod.model.baseModelName != null) { + if (Array.isArray(subPopulate)) { + subPopulate.forEach(pop => { pop.strictPopulate = false; }); + } else { + subPopulate.strictPopulate = false; + } + } + query.populate(subPopulate); + } + + query.exec((err, docs) => { + if (err != null) { + return callback(err); + } + + for (const val of docs) { + leanPopulateMap.set(val, mod.model); + } + callback(null, docs); + }); +} + +/*! + * ignore + */ + +function _assign(model, vals, mod, assignmentOpts) { + const options = mod.options; + const isVirtual = mod.isVirtual; + const justOne = mod.justOne; + let _val; + const lean = get(options, 'options.lean', false); + const len = vals.length; + const rawOrder = {}; + const rawDocs = {}; + let key; + let val; + + // Clone because `assignRawDocsToIdStructure` will mutate the array + const allIds = utils.clone(mod.allIds); + // optimization: + // record the document positions as returned by + // the query result. + for (let i = 0; i < len; i++) { + val = vals[i]; + if (val == null) { + continue; + } + + for (const foreignField of mod.foreignField) { + _val = utils.getValue(foreignField, val); + if (Array.isArray(_val)) { + _val = utils.array.unique(utils.array.flatten(_val)); + + for (let __val of _val) { + if (__val instanceof Document) { + __val = __val._id; + } + key = String(__val); + if (rawDocs[key]) { + if (Array.isArray(rawDocs[key])) { + rawDocs[key].push(val); + rawOrder[key].push(i); + } else { + rawDocs[key] = [rawDocs[key], val]; + rawOrder[key] = [rawOrder[key], i]; + } + } else { + if (isVirtual && !justOne) { + rawDocs[key] = [val]; + rawOrder[key] = [i]; + } else { + rawDocs[key] = val; + rawOrder[key] = i; + } + } + } + } else { + if (_val instanceof Document) { + _val = _val._id; + } + key = String(_val); + if (rawDocs[key]) { + if (Array.isArray(rawDocs[key])) { + rawDocs[key].push(val); + rawOrder[key].push(i); + } else if (isVirtual || + rawDocs[key].constructor !== val.constructor || + String(rawDocs[key]._id) !== String(val._id)) { + // May need to store multiple docs with the same id if there's multiple models + // if we have discriminators or a ref function. But avoid converting to an array + // if we have multiple queries on the same model because of `perDocumentLimit` re: gh-9906 + rawDocs[key] = [rawDocs[key], val]; + rawOrder[key] = [rawOrder[key], i]; + } + } else { + rawDocs[key] = val; + rawOrder[key] = i; + } + } + // flag each as result of population + if (!lean) { + val.$__.wasPopulated = true; + } + } + } + + assignVals({ + originalModel: model, + // If virtual, make sure to not mutate original field + rawIds: mod.isVirtual ? allIds : mod.allIds, + allIds: allIds, + foreignField: mod.foreignField, + rawDocs: rawDocs, + rawOrder: rawOrder, + docs: mod.docs, + path: options.path, + options: assignmentOpts, + justOne: mod.justOne, + isVirtual: mod.isVirtual, + allOptions: mod, + populatedModel: mod.model, + lean: lean, + virtual: mod.virtual, + count: mod.count, + match: mod.match + }); +} + +/*! + * Compiler utility. 
+ * + * @param {String|Function} name model name or class extending Model + * @param {Schema} schema + * @param {String} collectionName + * @param {Connection} connection + * @param {Mongoose} base mongoose instance + */ + +Model.compile = function compile(name, schema, collectionName, connection, base) { + const versioningEnabled = schema.options.versionKey !== false; + + if (versioningEnabled && !schema.paths[schema.options.versionKey]) { + // add versioning to top level documents only + const o = {}; + o[schema.options.versionKey] = Number; + schema.add(o); + } + + let model; + if (typeof name === 'function' && name.prototype instanceof Model) { + model = name; + name = model.name; + schema.loadClass(model, false); + model.prototype.$isMongooseModelPrototype = true; + } else { + // generate new class + model = function model(doc, fields, skipId) { + model.hooks.execPreSync('createModel', doc); + if (!(this instanceof model)) { + return new model(doc, fields, skipId); + } + const discriminatorKey = model.schema.options.discriminatorKey; + + if (model.discriminators == null || doc == null || doc[discriminatorKey] == null) { + Model.call(this, doc, fields, skipId); + return; + } + + // If discriminator key is set, use the discriminator instead (gh-7586) + const Discriminator = model.discriminators[doc[discriminatorKey]] || + getDiscriminatorByValue(model.discriminators, doc[discriminatorKey]); + if (Discriminator != null) { + return new Discriminator(doc, fields, skipId); + } + + // Otherwise, just use the top-level model + Model.call(this, doc, fields, skipId); + }; + } + + model.hooks = schema.s.hooks.clone(); + model.base = base; + model.modelName = name; + + if (!(model.prototype instanceof Model)) { + model.__proto__ = Model; + model.prototype.__proto__ = Model.prototype; + } + model.model = function model(name) { + return this.db.model(name); + }; + model.db = connection; + model.prototype.db = connection; + model.prototype[modelDbSymbol] = connection; + model.discriminators = model.prototype.discriminators = undefined; + model[modelSymbol] = true; + model.events = new EventEmitter(); + + schema._preCompile(); + + model.prototype.$__setSchema(schema); + + const _userProvidedOptions = schema._userProvidedOptions || {}; + + const collectionOptions = { + schemaUserProvidedOptions: _userProvidedOptions, + capped: schema.options.capped, + Promise: model.base.Promise, + modelName: name + }; + if (schema.options.autoCreate !== void 0) { + collectionOptions.autoCreate = schema.options.autoCreate; + } + + model.prototype.collection = connection.collection( + collectionName, + collectionOptions + ); + + model.prototype.$collection = model.prototype.collection; + model.prototype[modelCollectionSymbol] = model.prototype.collection; + + // apply methods and statics + applyMethods(model, schema); + applyStatics(model, schema); + applyHooks(model, schema); + applyStaticHooks(model, schema.s.hooks, schema.statics); + + model.schema = model.prototype.$__schema; + model.collection = model.prototype.collection; + model.$__collection = model.collection; + + // Create custom query constructor + model.Query = function() { + Query.apply(this, arguments); + }; + model.Query.prototype = Object.create(Query.prototype); + model.Query.base = Query.base; + applyQueryMiddleware(model.Query, model); + applyQueryMethods(model, schema.query); + + return model; +}; + +/*! 
+ * Register custom query methods for this model + * + * @param {Model} model + * @param {Schema} schema + */ + +function applyQueryMethods(model, methods) { + for (const i in methods) { + model.Query.prototype[i] = methods[i]; + } +} + +/*! + * Subclass this model with `conn`, `schema`, and `collection` settings. + * + * @param {Connection} conn + * @param {Schema} [schema] + * @param {String} [collection] + * @return {Model} + */ + +Model.__subclass = function subclass(conn, schema, collection) { + // subclass model using this connection and collection name + const _this = this; + + const Model = function Model(doc, fields, skipId) { + if (!(this instanceof Model)) { + return new Model(doc, fields, skipId); + } + _this.call(this, doc, fields, skipId); + }; + + Model.__proto__ = _this; + Model.prototype.__proto__ = _this.prototype; + Model.db = conn; + Model.prototype.db = conn; + Model.prototype[modelDbSymbol] = conn; + + _this[subclassedSymbol] = _this[subclassedSymbol] || []; + _this[subclassedSymbol].push(Model); + if (_this.discriminators != null) { + Model.discriminators = {}; + for (const key of Object.keys(_this.discriminators)) { + Model.discriminators[key] = _this.discriminators[key]. + __subclass(_this.db, _this.discriminators[key].schema, collection); + } + } + + const s = schema && typeof schema !== 'string' + ? schema + : _this.prototype.$__schema; + + const options = s.options || {}; + const _userProvidedOptions = s._userProvidedOptions || {}; + + if (!collection) { + collection = _this.prototype.$__schema.get('collection') || + utils.toCollectionName(_this.modelName, this.base.pluralize()); + } + + const collectionOptions = { + schemaUserProvidedOptions: _userProvidedOptions, + capped: s && options.capped + }; + + Model.prototype.collection = conn.collection(collection, collectionOptions); + Model.prototype.$collection = Model.prototype.collection; + Model.prototype[modelCollectionSymbol] = Model.prototype.collection; + Model.collection = Model.prototype.collection; + Model.$__collection = Model.collection; + // Errors handled internally, so ignore + Model.init(() => {}); + return Model; +}; + +Model.$handleCallbackError = function(callback) { + if (callback == null) { + return callback; + } + if (typeof callback !== 'function') { + throw new MongooseError('Callback must be a function, got ' + callback); + } + + const _this = this; + return function() { + immediate(() => { + try { + callback.apply(null, arguments); + } catch (error) { + _this.emit('error', error); + } + }); + }; +}; + +/*! + * ignore + */ + +Model.$wrapCallback = function(callback) { + const serverSelectionError = new ServerSelectionError(); + const _this = this; + + return function(err) { + if (err != null && err.name === 'MongoServerSelectionError') { + arguments[0] = serverSelectionError.assimilateError(err); + } + if (err != null && err.name === 'MongoNetworkTimeoutError' && err.message.endsWith('timed out')) { + _this.db.emit('timeout'); + } + + return callback.apply(null, arguments); + }; +}; + +/** + * Helper for console.log. Given a model named 'MyModel', returns the string + * `'Model { MyModel }'`. + * + * ####Example: + * + * const MyModel = mongoose.model('Test', Schema({ name: String })); + * MyModel.inspect(); // 'Model { Test }' + * console.log(MyModel); // Prints 'Model { Test }' + * + * @api public + */ + +Model.inspect = function() { + return `Model { ${this.modelName} }`; +}; + +if (util.inspect.custom) { + /*! 
+ * Avoid Node deprecation warning DEP0079 + */ + + Model[util.inspect.custom] = Model.inspect; +} + +/*! + * Module exports. + */ + +module.exports = exports = Model; diff --git a/node_modules/mongoose/lib/options.js b/node_modules/mongoose/lib/options.js new file mode 100644 index 000000000..4826e59fd --- /dev/null +++ b/node_modules/mongoose/lib/options.js @@ -0,0 +1,15 @@ +'use strict'; + +/*! + * ignore + */ + +exports.internalToObjectOptions = { + transform: false, + virtuals: false, + getters: false, + _skipDepopulateTopLevel: true, + depopulate: true, + flattenDecimals: false, + useProjection: false +}; diff --git a/node_modules/mongoose/lib/options/PopulateOptions.js b/node_modules/mongoose/lib/options/PopulateOptions.js new file mode 100644 index 000000000..5b9819460 --- /dev/null +++ b/node_modules/mongoose/lib/options/PopulateOptions.js @@ -0,0 +1,36 @@ +'use strict'; + +const clone = require('../helpers/clone'); + +class PopulateOptions { + constructor(obj) { + this._docs = {}; + this._childDocs = []; + + if (obj == null) { + return; + } + obj = clone(obj); + Object.assign(this, obj); + if (typeof obj.subPopulate === 'object') { + this.populate = obj.subPopulate; + } + + + if (obj.perDocumentLimit != null && obj.limit != null) { + throw new Error('Can not use `limit` and `perDocumentLimit` at the same time. Path: `' + obj.path + '`.'); + } + } +} + +/** + * The connection used to look up models by name. If not specified, Mongoose + * will default to using the connection associated with the model in + * `PopulateOptions#model`. + * + * @memberOf PopulateOptions + * @property {Connection} connection + * @api public + */ + +module.exports = PopulateOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaArrayOptions.js b/node_modules/mongoose/lib/options/SchemaArrayOptions.js new file mode 100644 index 000000000..e3734ae49 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaArrayOptions.js @@ -0,0 +1,59 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on an Array schematype. + * + * ####Example: + * + * const schema = new Schema({ tags: [String] }); + * schema.path('tags').options; // SchemaArrayOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaArrayOptions + */ + +class SchemaArrayOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If this is an array of strings, an array of allowed values for this path. + * Throws an error if this array isn't an array of strings. + * + * @api public + * @property enum + * @memberOf SchemaArrayOptions + * @type Array + * @instance + */ + +Object.defineProperty(SchemaArrayOptions.prototype, 'enum', opts); + +/** + * If set, specifies the type of this array's values. Equivalent to setting + * `type` to an array whose first element is `of`. + * + * ####Example: + * + * // `arr` is an array of numbers. + * new Schema({ arr: [Number] }); + * // Equivalent way to define `arr` as an array of numbers + * new Schema({ arr: { type: Array, of: Number } }); + * + * @api public + * @property of + * @memberOf SchemaArrayOptions + * @type Function|String + * @instance + */ + +Object.defineProperty(SchemaArrayOptions.prototype, 'of', opts); + +/*! 
+ * ignore + */ + +module.exports = SchemaArrayOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaBufferOptions.js b/node_modules/mongoose/lib/options/SchemaBufferOptions.js new file mode 100644 index 000000000..258130dfe --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaBufferOptions.js @@ -0,0 +1,38 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on a Buffer schematype. + * + * ####Example: + * + * const schema = new Schema({ bitmap: Buffer }); + * schema.path('bitmap').options; // SchemaBufferOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaBufferOptions + */ + +class SchemaBufferOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * Set the default subtype for this buffer. + * + * @api public + * @property subtype + * @memberOf SchemaBufferOptions + * @type Number + * @instance + */ + +Object.defineProperty(SchemaBufferOptions.prototype, 'subtype', opts); + +/*! + * ignore + */ + +module.exports = SchemaBufferOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaDateOptions.js b/node_modules/mongoose/lib/options/SchemaDateOptions.js new file mode 100644 index 000000000..09bf27f6a --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaDateOptions.js @@ -0,0 +1,64 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on a Date schematype. + * + * ####Example: + * + * const schema = new Schema({ startedAt: Date }); + * schema.path('startedAt').options; // SchemaDateOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaDateOptions + */ + +class SchemaDateOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If set, Mongoose adds a validator that checks that this path is after the + * given `min`. + * + * @api public + * @property min + * @memberOf SchemaDateOptions + * @type Date + * @instance + */ + +Object.defineProperty(SchemaDateOptions.prototype, 'min', opts); + +/** + * If set, Mongoose adds a validator that checks that this path is before the + * given `max`. + * + * @api public + * @property max + * @memberOf SchemaDateOptions + * @type Date + * @instance + */ + +Object.defineProperty(SchemaDateOptions.prototype, 'max', opts); + +/** + * If set, Mongoose creates a TTL index on this path. + * + * @api public + * @property expires + * @memberOf SchemaDateOptions + * @type Date + * @instance + */ + +Object.defineProperty(SchemaDateOptions.prototype, 'expires', opts); + +/*! + * ignore + */ + +module.exports = SchemaDateOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaDocumentArrayOptions.js b/node_modules/mongoose/lib/options/SchemaDocumentArrayOptions.js new file mode 100644 index 000000000..f3fb7a977 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaDocumentArrayOptions.js @@ -0,0 +1,68 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on an Document Array schematype. 
+ * + * ####Example: + * + * const schema = new Schema({ users: [{ name: string }] }); + * schema.path('users').options; // SchemaDocumentArrayOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaDocumentOptions + */ + +class SchemaDocumentArrayOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If `true`, Mongoose will skip building any indexes defined in this array's schema. + * If not set, Mongoose will build all indexes defined in this array's schema. + * + * ####Example: + * + * const childSchema = Schema({ name: { type: String, index: true } }); + * // If `excludeIndexes` is `true`, Mongoose will skip building an index + * // on `arr.name`. Otherwise, Mongoose will build an index on `arr.name`. + * const parentSchema = Schema({ + * arr: { type: [childSchema], excludeIndexes: true } + * }); + * + * @api public + * @property excludeIndexes + * @memberOf SchemaDocumentArrayOptions + * @type Array + * @instance + */ + +Object.defineProperty(SchemaDocumentArrayOptions.prototype, 'excludeIndexes', opts); + +/** + * If set, overwrites the child schema's `_id` option. + * + * ####Example: + * + * const childSchema = Schema({ name: String }); + * const parentSchema = Schema({ + * child: { type: childSchema, _id: false } + * }); + * parentSchema.path('child').schema.options._id; // false + * + * @api public + * @property _id + * @memberOf SchemaDocumentArrayOptions + * @type Array + * @instance + */ + +Object.defineProperty(SchemaDocumentArrayOptions.prototype, '_id', opts); + +/*! + * ignore + */ + +module.exports = SchemaDocumentArrayOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaMapOptions.js b/node_modules/mongoose/lib/options/SchemaMapOptions.js new file mode 100644 index 000000000..335d84a98 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaMapOptions.js @@ -0,0 +1,43 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on a Map schematype. + * + * ####Example: + * + * const schema = new Schema({ socialMediaHandles: { type: Map, of: String } }); + * schema.path('socialMediaHandles').options; // SchemaMapOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaMapOptions + */ + +class SchemaMapOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If set, specifies the type of this map's values. Mongoose will cast + * this map's values to the given type. + * + * If not set, Mongoose will not cast the map's values. + * + * ####Example: + * + * // Mongoose will cast `socialMediaHandles` values to strings + * const schema = new Schema({ socialMediaHandles: { type: Map, of: String } }); + * schema.path('socialMediaHandles').options.of; // String + * + * @api public + * @property of + * @memberOf SchemaMapOptions + * @type Function|string + * @instance + */ + +Object.defineProperty(SchemaMapOptions.prototype, 'of', opts); + +module.exports = SchemaMapOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaNumberOptions.js b/node_modules/mongoose/lib/options/SchemaNumberOptions.js new file mode 100644 index 000000000..38553fb78 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaNumberOptions.js @@ -0,0 +1,99 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on a Number schematype. 
+ * + * ####Example: + * + * const schema = new Schema({ count: Number }); + * schema.path('count').options; // SchemaNumberOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaNumberOptions + */ + +class SchemaNumberOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If set, Mongoose adds a validator that checks that this path is at least the + * given `min`. + * + * @api public + * @property min + * @memberOf SchemaNumberOptions + * @type Number + * @instance + */ + +Object.defineProperty(SchemaNumberOptions.prototype, 'min', opts); + +/** + * If set, Mongoose adds a validator that checks that this path is less than the + * given `max`. + * + * @api public + * @property max + * @memberOf SchemaNumberOptions + * @type Number + * @instance + */ + +Object.defineProperty(SchemaNumberOptions.prototype, 'max', opts); + +/** + * If set, Mongoose adds a validator that checks that this path is strictly + * equal to one of the given values. + * + * ####Example: + * const schema = new Schema({ + * favoritePrime: { + * type: Number, + * enum: [3, 5, 7] + * } + * }); + * schema.path('favoritePrime').options.enum; // [3, 5, 7] + * + * @api public + * @property enum + * @memberOf SchemaNumberOptions + * @type Array + * @instance + */ + +Object.defineProperty(SchemaNumberOptions.prototype, 'enum', opts); + +/** + * Sets default [populate options](/docs/populate.html#query-conditions). + * + * ####Example: + * const schema = new Schema({ + * child: { + * type: Number, + * ref: 'Child', + * populate: { select: 'name' } + * } + * }); + * const Parent = mongoose.model('Parent', schema); + * + * // Automatically adds `.select('name')` + * Parent.findOne().populate('child'); + * + * @api public + * @property populate + * @memberOf SchemaNumberOptions + * @type Object + * @instance + */ + +Object.defineProperty(SchemaNumberOptions.prototype, 'populate', opts); + +/*! + * ignore + */ + +module.exports = SchemaNumberOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaObjectIdOptions.js b/node_modules/mongoose/lib/options/SchemaObjectIdOptions.js new file mode 100644 index 000000000..e07ed9ecc --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaObjectIdOptions.js @@ -0,0 +1,63 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on an ObjectId schematype. + * + * ####Example: + * + * const schema = new Schema({ testId: mongoose.ObjectId }); + * schema.path('testId').options; // SchemaObjectIdOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaObjectIdOptions + */ + +class SchemaObjectIdOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If truthy, uses Mongoose's default built-in ObjectId path. + * + * @api public + * @property auto + * @memberOf SchemaObjectIdOptions + * @type Boolean + * @instance + */ + +Object.defineProperty(SchemaObjectIdOptions.prototype, 'auto', opts); + +/** + * Sets default [populate options](/docs/populate.html#query-conditions). 
+ * + * ####Example: + * const schema = new Schema({ + * child: { + * type: 'ObjectId', + * ref: 'Child', + * populate: { select: 'name' } + * } + * }); + * const Parent = mongoose.model('Parent', schema); + * + * // Automatically adds `.select('name')` + * Parent.findOne().populate('child'); + * + * @api public + * @property populate + * @memberOf SchemaObjectIdOptions + * @type Object + * @instance + */ + +Object.defineProperty(SchemaObjectIdOptions.prototype, 'populate', opts); + +/*! + * ignore + */ + +module.exports = SchemaObjectIdOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaStringOptions.js b/node_modules/mongoose/lib/options/SchemaStringOptions.js new file mode 100644 index 000000000..380050c40 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaStringOptions.js @@ -0,0 +1,138 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on a string schematype. + * + * ####Example: + * + * const schema = new Schema({ name: String }); + * schema.path('name').options; // SchemaStringOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaStringOptions + */ + +class SchemaStringOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * Array of allowed values for this path + * + * @api public + * @property enum + * @memberOf SchemaStringOptions + * @type Array + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'enum', opts); + +/** + * Attach a validator that succeeds if the data string matches the given regular + * expression, and fails otherwise. + * + * @api public + * @property match + * @memberOf SchemaStringOptions + * @type RegExp + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'match', opts); + +/** + * If truthy, Mongoose will add a custom setter that lowercases this string + * using JavaScript's built-in `String#toLowerCase()`. + * + * @api public + * @property lowercase + * @memberOf SchemaStringOptions + * @type Boolean + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'lowercase', opts); + +/** + * If truthy, Mongoose will add a custom setter that removes leading and trailing + * whitespace using JavaScript's built-in `String#trim()`. + * + * @api public + * @property trim + * @memberOf SchemaStringOptions + * @type Boolean + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'trim', opts); + +/** + * If truthy, Mongoose will add a custom setter that uppercases this string + * using JavaScript's built-in `String#toUpperCase()`. + * + * @api public + * @property uppercase + * @memberOf SchemaStringOptions + * @type Boolean + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'uppercase', opts); + +/** + * If set, Mongoose will add a custom validator that ensures the given + * string's `length` is at least the given number. + * + * Mongoose supports two different spellings for this option: `minLength` and `minlength`. + * `minLength` is the recommended way to specify this option, but Mongoose also supports + * `minlength` (lowercase "l"). 
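+ *
+ * A minimal usage sketch (the `User` model and `username` path below are
+ * illustrative; `Schema` and `mongoose` are assumed to be in scope as in the
+ * surrounding examples):
+ *
+ *     const schema = new Schema({
+ *       username: { type: String, minLength: 3 }
+ *     });
+ *     const User = mongoose.model('User', schema);
+ *
+ *     const doc = new User({ username: 'ab' });
+ *     const err = doc.validateSync();
+ *     err.errors['username']; // ValidatorError: shorter than the minimum allowed length (3)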
+ * + * @api public + * @property minLength + * @memberOf SchemaStringOptions + * @type Number + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'minLength', opts); +Object.defineProperty(SchemaStringOptions.prototype, 'minlength', opts); + +/** + * If set, Mongoose will add a custom validator that ensures the given + * string's `length` is at most the given number. + * + * Mongoose supports two different spellings for this option: `maxLength` and `maxlength`. + * `maxLength` is the recommended way to specify this option, but Mongoose also supports + * `maxlength` (lowercase "l"). + * + * @api public + * @property maxLength + * @memberOf SchemaStringOptions + * @type Number + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'maxLength', opts); +Object.defineProperty(SchemaStringOptions.prototype, 'maxlength', opts); + +/** + * Sets default [populate options](/docs/populate.html#query-conditions). + * + * @api public + * @property populate + * @memberOf SchemaStringOptions + * @type Object + * @instance + */ + +Object.defineProperty(SchemaStringOptions.prototype, 'populate', opts); + +/*! + * ignore + */ + +module.exports = SchemaStringOptions; diff --git a/node_modules/mongoose/lib/options/SchemaSubdocumentOptions.js b/node_modules/mongoose/lib/options/SchemaSubdocumentOptions.js new file mode 100644 index 000000000..167565376 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaSubdocumentOptions.js @@ -0,0 +1,42 @@ +'use strict'; + +const SchemaTypeOptions = require('./SchemaTypeOptions'); + +/** + * The options defined on a single nested schematype. + * + * ####Example: + * + * const schema = Schema({ child: Schema({ name: String }) }); + * schema.path('child').options; // SchemaSubdocumentOptions instance + * + * @api public + * @inherits SchemaTypeOptions + * @constructor SchemaSubdocumentOptions + */ + +class SchemaSubdocumentOptions extends SchemaTypeOptions {} + +const opts = require('./propertyOptions'); + +/** + * If set, overwrites the child schema's `_id` option. + * + * ####Example: + * + * const childSchema = Schema({ name: String }); + * const parentSchema = Schema({ + * child: { type: childSchema, _id: false } + * }); + * parentSchema.path('child').schema.options._id; // false + * + * @api public + * @property of + * @memberOf SchemaSubdocumentOptions + * @type Function|string + * @instance + */ + +Object.defineProperty(SchemaSubdocumentOptions.prototype, '_id', opts); + +module.exports = SchemaSubdocumentOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/SchemaTypeOptions.js b/node_modules/mongoose/lib/options/SchemaTypeOptions.js new file mode 100644 index 000000000..d9149bac0 --- /dev/null +++ b/node_modules/mongoose/lib/options/SchemaTypeOptions.js @@ -0,0 +1,231 @@ +'use strict'; + +const clone = require('../helpers/clone'); + +/** + * The options defined on a schematype. + * + * ####Example: + * + * const schema = new Schema({ name: String }); + * schema.path('name').options instanceof mongoose.SchemaTypeOptions; // true + * + * @api public + * @constructor SchemaTypeOptions + */ + +class SchemaTypeOptions { + constructor(obj) { + if (obj == null) { + return this; + } + Object.assign(this, clone(obj)); + } +} + +const opts = require('./propertyOptions'); + +/** + * The type to cast this path to. 
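+ *
+ * A short sketch of equivalent ways to set `type` (the paths below are
+ * illustrative):
+ *
+ *     const schema = new Schema({
+ *       name: String,                 // shorthand
+ *       nickname: { type: String },   // explicit `type` option
+ *       age: { type: Number, min: 0 } // `type` combined with other options
+ *     });
+ *
+ *     schema.path('name').instance;     // 'String'
+ *     schema.path('nickname').instance; // 'String'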
+ * + * @api public + * @property type + * @memberOf SchemaTypeOptions + * @type Function|String|Object + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'type', opts); + +/** + * Function or object describing how to validate this schematype. + * + * @api public + * @property validate + * @memberOf SchemaTypeOptions + * @type Function|Object + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'validate', opts); + +/** + * Allows overriding casting logic for this individual path. If a string, the + * given string overwrites Mongoose's default cast error message. + * + * ####Example: + * + * const schema = new Schema({ + * num: { + * type: Number, + * cast: '{VALUE} is not a valid number' + * } + * }); + * + * // Throws 'CastError: "bad" is not a valid number' + * schema.path('num').cast('bad'); + * + * const Model = mongoose.model('Test', schema); + * const doc = new Model({ num: 'fail' }); + * const err = doc.validateSync(); + * + * err.errors['num']; // 'CastError: "fail" is not a valid number' + * + * @api public + * @property cast + * @memberOf SchemaTypeOptions + * @type String + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'cast', opts); + +/** + * If true, attach a required validator to this path, which ensures this path + * path cannot be set to a nullish value. If a function, Mongoose calls the + * function and only checks for nullish values if the function returns a truthy value. + * + * @api public + * @property required + * @memberOf SchemaTypeOptions + * @type Function|Boolean + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'required', opts); + +/** + * The default value for this path. If a function, Mongoose executes the function + * and uses the return value as the default. + * + * @api public + * @property default + * @memberOf SchemaTypeOptions + * @type Function|Any + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'default', opts); + +/** + * The model that `populate()` should use if populating this path. + * + * @api public + * @property ref + * @memberOf SchemaTypeOptions + * @type Function|String + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'ref', opts); + +/** + * Whether to include or exclude this path by default when loading documents + * using `find()`, `findOne()`, etc. + * + * @api public + * @property select + * @memberOf SchemaTypeOptions + * @type Boolean|Number + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'select', opts); + +/** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose will + * build an index on this path when the model is compiled. + * + * @api public + * @property index + * @memberOf SchemaTypeOptions + * @type Boolean|Number|Object + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'index', opts); + +/** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose + * will build a unique index on this path when the + * model is compiled. [The `unique` option is **not** a validator](/docs/validation.html#the-unique-option-is-not-a-validator). 
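+ *
+ * A rough sketch (the `User` model and `email` path are illustrative):
+ *
+ *     const userSchema = new Schema({
+ *       email: { type: String, unique: true }
+ *     });
+ *     const User = mongoose.model('User', userSchema);
+ *
+ *     // Mongoose builds a unique index on `email` when the model is compiled.
+ *     // Duplicate values are rejected by the MongoDB server (E11000 duplicate
+ *     // key error), not by document validation.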
+ * + * @api public + * @property unique + * @memberOf SchemaTypeOptions + * @type Boolean|Number + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'unique', opts); + +/** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose will + * disallow changes to this path once the document + * is saved to the database for the first time. Read more about [immutability in Mongoose here](http://thecodebarbarian.com/whats-new-in-mongoose-5-6-immutable-properties.html). + * + * @api public + * @property immutable + * @memberOf SchemaTypeOptions + * @type Function|Boolean + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'immutable', opts); + +/** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose will + * build a sparse index on this path. + * + * @api public + * @property sparse + * @memberOf SchemaTypeOptions + * @type Boolean|Number + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'sparse', opts); + +/** + * If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), Mongoose + * will build a text index on this path. + * + * @api public + * @property text + * @memberOf SchemaTypeOptions + * @type Boolean|Number|Object + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'text', opts); + +/** + * Define a transform function for this individual schema type. + * Only called when calling `toJSON()` or `toObject()`. + * + * ####Example: + * + * const schema = Schema({ + * myDate: { + * type: Date, + * transform: v => v.getFullYear() + * } + * }); + * const Model = mongoose.model('Test', schema); + * + * const doc = new Model({ myDate: new Date('2019/06/01') }); + * doc.myDate instanceof Date; // true + * + * const res = doc.toObject({ transform: true }); + * res.myDate; // 2019 + * + * @api public + * @property transform + * @memberOf SchemaTypeOptions + * @type Function + * @instance + */ + +Object.defineProperty(SchemaTypeOptions.prototype, 'transform', opts); + +module.exports = SchemaTypeOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/VirtualOptions.js b/node_modules/mongoose/lib/options/VirtualOptions.js new file mode 100644 index 000000000..a26641459 --- /dev/null +++ b/node_modules/mongoose/lib/options/VirtualOptions.js @@ -0,0 +1,164 @@ +'use strict'; + +const opts = require('./propertyOptions'); + +class VirtualOptions { + constructor(obj) { + Object.assign(this, obj); + + if (obj != null && obj.options != null) { + this.options = Object.assign({}, obj.options); + } + } +} + +/** + * Marks this virtual as a populate virtual, and specifies the model to + * use for populate. + * + * @api public + * @property ref + * @memberOf VirtualOptions + * @type String|Model|Function + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'ref', opts); + +/** + * Marks this virtual as a populate virtual, and specifies the path that + * contains the name of the model to populate + * + * @api public + * @property refPath + * @memberOf VirtualOptions + * @type String|Function + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'refPath', opts); + +/** + * The name of the property in the local model to match to `foreignField` + * in the foreign model. 
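+ *
+ * A minimal populate-virtual sketch (the `Author`/`Post` models and their
+ * paths are illustrative):
+ *
+ *     const authorSchema = new Schema({ name: String });
+ *     const postSchema = new Schema({ title: String, authorName: String });
+ *
+ *     postSchema.virtual('author', {
+ *       ref: 'Author',
+ *       localField: 'authorName', // value stored on the post...
+ *       foreignField: 'name',     // ...matched against `name` on Author
+ *       justOne: true
+ *     });
+ *
+ *     const Author = mongoose.model('Author', authorSchema);
+ *     const Post = mongoose.model('Post', postSchema);
+ *
+ *     // Inside an async function:
+ *     const post = await Post.findOne().populate('author');
+ *     post.author.name; // equals post.authorName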
+ * + * @api public + * @property localField + * @memberOf VirtualOptions + * @type String|Function + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'localField', opts); + +/** + * The name of the property in the foreign model to match to `localField` + * in the local model. + * + * @api public + * @property foreignField + * @memberOf VirtualOptions + * @type String|Function + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'foreignField', opts); + +/** + * Whether to populate this virtual as a single document (true) or an + * array of documents (false). + * + * @api public + * @property justOne + * @memberOf VirtualOptions + * @type Boolean + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'justOne', opts); + +/** + * If true, populate just the number of documents where `localField` + * matches `foreignField`, as opposed to the documents themselves. + * + * If `count` is set, it overrides `justOne`. + * + * @api public + * @property count + * @memberOf VirtualOptions + * @type Boolean + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'count', opts); + +/** + * Add an additional filter to populate, in addition to `localField` + * matches `foreignField`. + * + * @api public + * @property match + * @memberOf VirtualOptions + * @type Object|Function + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'match', opts); + +/** + * Additional options to pass to the query used to `populate()`: + * + * - `sort` + * - `skip` + * - `limit` + * + * @api public + * @property options + * @memberOf VirtualOptions + * @type Object + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'options', opts); + +/** + * If true, add a `skip` to the query used to `populate()`. + * + * @api public + * @property skip + * @memberOf VirtualOptions + * @type Number + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'skip', opts); + +/** + * If true, add a `limit` to the query used to `populate()`. + * + * @api public + * @property limit + * @memberOf VirtualOptions + * @type Number + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'limit', opts); + +/** + * The `limit` option for `populate()` has [some unfortunate edge cases](/docs/populate.html#query-conditions) + * when working with multiple documents, like `.find().populate()`. The + * `perDocumentLimit` option makes `populate()` execute a separate query + * for each document returned from `find()` to ensure each document + * gets up to `perDocumentLimit` populated docs if possible. 
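+ *
+ * A rough sketch, assuming an `Author` model whose `posts` path is a populate
+ * virtual or an array of references (names are illustrative):
+ *
+ *     // Inside an async function:
+ *     const authors = await Author.find().populate({
+ *       path: 'posts',
+ *       perDocumentLimit: 2 // at most 2 populated posts per author,
+ *                           // fetched with one query per author document
+ *     });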
+ * + * @api public + * @property perDocumentLimit + * @memberOf VirtualOptions + * @type Number + * @instance + */ + +Object.defineProperty(VirtualOptions.prototype, 'perDocumentLimit', opts); + +module.exports = VirtualOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/propertyOptions.js b/node_modules/mongoose/lib/options/propertyOptions.js new file mode 100644 index 000000000..b7488a37e --- /dev/null +++ b/node_modules/mongoose/lib/options/propertyOptions.js @@ -0,0 +1,8 @@ +'use strict'; + +module.exports = Object.freeze({ + enumerable: true, + configurable: true, + writable: true, + value: void 0 +}); \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/removeOptions.js b/node_modules/mongoose/lib/options/removeOptions.js new file mode 100644 index 000000000..0d9158645 --- /dev/null +++ b/node_modules/mongoose/lib/options/removeOptions.js @@ -0,0 +1,14 @@ +'use strict'; + +const clone = require('../helpers/clone'); + +class RemoveOptions { + constructor(obj) { + if (obj == null) { + return; + } + Object.assign(this, clone(obj)); + } +} + +module.exports = RemoveOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/options/saveOptions.js b/node_modules/mongoose/lib/options/saveOptions.js new file mode 100644 index 000000000..22ef4375f --- /dev/null +++ b/node_modules/mongoose/lib/options/saveOptions.js @@ -0,0 +1,14 @@ +'use strict'; + +const clone = require('../helpers/clone'); + +class SaveOptions { + constructor(obj) { + if (obj == null) { + return; + } + Object.assign(this, clone(obj)); + } +} + +module.exports = SaveOptions; \ No newline at end of file diff --git a/node_modules/mongoose/lib/plugins/removeSubdocs.js b/node_modules/mongoose/lib/plugins/removeSubdocs.js new file mode 100644 index 000000000..c2756fc53 --- /dev/null +++ b/node_modules/mongoose/lib/plugins/removeSubdocs.js @@ -0,0 +1,31 @@ +'use strict'; + +const each = require('../helpers/each'); + +/*! + * ignore + */ + +module.exports = function(schema) { + const unshift = true; + schema.s.hooks.pre('remove', false, function(next) { + if (this.ownerDocument) { + next(); + return; + } + + const _this = this; + const subdocs = this.$getAllSubdocs(); + + each(subdocs, function(subdoc, cb) { + subdoc.$__remove(cb); + }, function(error) { + if (error) { + return _this.$__schema.s.hooks.execPost('remove:error', _this, [_this], { error: error }, function(error) { + next(error); + }); + } + next(); + }); + }, null, unshift); +}; diff --git a/node_modules/mongoose/lib/plugins/saveSubdocs.js b/node_modules/mongoose/lib/plugins/saveSubdocs.js new file mode 100644 index 000000000..fcc73d88a --- /dev/null +++ b/node_modules/mongoose/lib/plugins/saveSubdocs.js @@ -0,0 +1,66 @@ +'use strict'; + +const each = require('../helpers/each'); + +/*! 
+ * ignore + */ + +module.exports = function(schema) { + const unshift = true; + schema.s.hooks.pre('save', false, function(next) { + if (this.ownerDocument) { + next(); + return; + } + + const _this = this; + const subdocs = this.$getAllSubdocs(); + + if (!subdocs.length) { + next(); + return; + } + + each(subdocs, function(subdoc, cb) { + subdoc.$__schema.s.hooks.execPre('save', subdoc, function(err) { + cb(err); + }); + }, function(error) { + if (error) { + return _this.$__schema.s.hooks.execPost('save:error', _this, [_this], { error: error }, function(error) { + next(error); + }); + } + next(); + }); + }, null, unshift); + + schema.s.hooks.post('save', function(doc, next) { + if (this.ownerDocument) { + next(); + return; + } + + const _this = this; + const subdocs = this.$getAllSubdocs(); + + if (!subdocs.length) { + next(); + return; + } + + each(subdocs, function(subdoc, cb) { + subdoc.$__schema.s.hooks.execPost('save', subdoc, [subdoc], function(err) { + cb(err); + }); + }, function(error) { + if (error) { + return _this.$__schema.s.hooks.execPost('save:error', _this, [_this], { error: error }, function(error) { + next(error); + }); + } + next(); + }); + }, null, unshift); +}; diff --git a/node_modules/mongoose/lib/plugins/sharding.js b/node_modules/mongoose/lib/plugins/sharding.js new file mode 100644 index 000000000..020ec06c6 --- /dev/null +++ b/node_modules/mongoose/lib/plugins/sharding.js @@ -0,0 +1,83 @@ +'use strict'; + +const objectIdSymbol = require('../helpers/symbols').objectIdSymbol; +const utils = require('../utils'); + +/*! + * ignore + */ + +module.exports = function shardingPlugin(schema) { + schema.post('init', function() { + storeShard.call(this); + return this; + }); + schema.pre('save', function(next) { + applyWhere.call(this); + next(); + }); + schema.pre('remove', function(next) { + applyWhere.call(this); + next(); + }); + schema.post('save', function() { + storeShard.call(this); + }); +}; + +/*! + * ignore + */ + +function applyWhere() { + let paths; + let len; + + if (this.$__.shardval) { + paths = Object.keys(this.$__.shardval); + len = paths.length; + + this.$where = this.$where || {}; + for (let i = 0; i < len; ++i) { + this.$where[paths[i]] = this.$__.shardval[paths[i]]; + } + } +} + +/*! + * ignore + */ + +module.exports.storeShard = storeShard; + +/*! 
+ * ignore + */ + +function storeShard() { + // backwards compat + const key = this.$__schema.options.shardKey || this.$__schema.options.shardkey; + if (!utils.isPOJO(key)) { + return; + } + + const orig = this.$__.shardval = {}; + const paths = Object.keys(key); + const len = paths.length; + let val; + + for (let i = 0; i < len; ++i) { + val = this.$__getValue(paths[i]); + if (val == null) { + orig[paths[i]] = val; + } else if (utils.isMongooseObject(val)) { + orig[paths[i]] = val.toObject({ depopulate: true, _isNested: true }); + } else if (val instanceof Date || val[objectIdSymbol]) { + orig[paths[i]] = val; + } else if (typeof val.valueOf === 'function') { + orig[paths[i]] = val.valueOf(); + } else { + orig[paths[i]] = val; + } + } +} diff --git a/node_modules/mongoose/lib/plugins/trackTransaction.js b/node_modules/mongoose/lib/plugins/trackTransaction.js new file mode 100644 index 000000000..30ded8785 --- /dev/null +++ b/node_modules/mongoose/lib/plugins/trackTransaction.js @@ -0,0 +1,91 @@ +'use strict'; + +const arrayAtomicsSymbol = require('../helpers/symbols').arrayAtomicsSymbol; +const sessionNewDocuments = require('../helpers/symbols').sessionNewDocuments; + +module.exports = function trackTransaction(schema) { + schema.pre('save', function() { + const session = this.$session(); + if (session == null) { + return; + } + if (session.transaction == null || session[sessionNewDocuments] == null) { + return; + } + + if (!session[sessionNewDocuments].has(this)) { + const initialState = {}; + if (this.isNew) { + initialState.isNew = true; + } + if (this.$__schema.options.versionKey) { + initialState.versionKey = this.get(this.$__schema.options.versionKey); + } + + initialState.modifiedPaths = new Set(Object.keys(this.$__.activePaths.states.modify)); + initialState.atomics = _getAtomics(this); + + session[sessionNewDocuments].set(this, initialState); + } else { + const state = session[sessionNewDocuments].get(this); + + for (const path of Object.keys(this.$__.activePaths.states.modify)) { + state.modifiedPaths.add(path); + } + state.atomics = _getAtomics(this, state.atomics); + } + }); +}; + +function _getAtomics(doc, previous) { + const pathToAtomics = new Map(); + previous = previous || new Map(); + + const pathsToCheck = Object.keys(doc.$__.activePaths.init).concat(Object.keys(doc.$__.activePaths.modify)); + + for (const path of pathsToCheck) { + const val = doc.$__getValue(path); + if (val != null && + val instanceof Array && + val.isMongooseDocumentArray && + val.length && + val[arrayAtomicsSymbol] != null && + Object.keys(val[arrayAtomicsSymbol]).length > 0) { + const existing = previous.get(path) || {}; + pathToAtomics.set(path, mergeAtomics(existing, val[arrayAtomicsSymbol])); + } + } + + const dirty = doc.$__dirty(); + for (const dirt of dirty) { + const path = dirt.path; + + const val = dirt.value; + if (val != null && val[arrayAtomicsSymbol] != null && Object.keys(val[arrayAtomicsSymbol]).length > 0) { + const existing = previous.get(path) || {}; + pathToAtomics.set(path, mergeAtomics(existing, val[arrayAtomicsSymbol])); + } + } + + return pathToAtomics; +} + +function mergeAtomics(destination, source) { + destination = destination || {}; + + if (source.$pullAll != null) { + destination.$pullAll = (destination.$pullAll || []).concat(source.$pullAll); + } + if (source.$push != null) { + destination.$push = destination.$push || {}; + destination.$push.$each = (destination.$push.$each || []).concat(source.$push.$each); + } + if (source.$addToSet != null) { + destination.$addToSet 
= (destination.$addToSet || []).concat(source.$addToSet); + } + if (source.$set != null) { + destination.$set = Object.assign(destination.$set || {}, source.$set); + } + + return destination; +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/plugins/validateBeforeSave.js b/node_modules/mongoose/lib/plugins/validateBeforeSave.js new file mode 100644 index 000000000..b9a93b299 --- /dev/null +++ b/node_modules/mongoose/lib/plugins/validateBeforeSave.js @@ -0,0 +1,45 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = function(schema) { + const unshift = true; + schema.pre('save', false, function validateBeforeSave(next, options) { + const _this = this; + // Nested docs have their own presave + if (this.ownerDocument) { + return next(); + } + + const hasValidateBeforeSaveOption = options && + (typeof options === 'object') && + ('validateBeforeSave' in options); + + let shouldValidate; + if (hasValidateBeforeSaveOption) { + shouldValidate = !!options.validateBeforeSave; + } else { + shouldValidate = this.$__schema.options.validateBeforeSave; + } + + // Validate + if (shouldValidate) { + const hasValidateModifiedOnlyOption = options && + (typeof options === 'object') && + ('validateModifiedOnly' in options); + const validateOptions = hasValidateModifiedOnlyOption ? + { validateModifiedOnly: options.validateModifiedOnly } : + null; + this.$validate(validateOptions, function(error) { + return _this.$__schema.s.hooks.execPost('save:error', _this, [_this], { error: error }, function(error) { + _this.$op = 'save'; + next(error); + }); + }); + } else { + next(); + } + }, null, unshift); +}; diff --git a/node_modules/mongoose/lib/promise_provider.js b/node_modules/mongoose/lib/promise_provider.js new file mode 100644 index 000000000..3febf3687 --- /dev/null +++ b/node_modules/mongoose/lib/promise_provider.js @@ -0,0 +1,49 @@ +/*! + * ignore + */ + +'use strict'; + +const assert = require('assert'); +const mquery = require('mquery'); + +/** + * Helper for multiplexing promise implementations + * + * @api private + */ + +const store = { + _promise: null +}; + +/** + * Get the current promise constructor + * + * @api private + */ + +store.get = function() { + return store._promise; +}; + +/** + * Set the current promise constructor + * + * @api private + */ + +store.set = function(lib) { + assert.ok(typeof lib === 'function', + `mongoose.Promise must be a function, got ${lib}`); + store._promise = lib; + mquery.Promise = lib; +}; + +/*! + * Use native promises by default + */ + +store.set(global.Promise); + +module.exports = store; diff --git a/node_modules/mongoose/lib/query.js b/node_modules/mongoose/lib/query.js new file mode 100644 index 000000000..bfefb2b70 --- /dev/null +++ b/node_modules/mongoose/lib/query.js @@ -0,0 +1,5598 @@ +'use strict'; + +/*! + * Module dependencies. 
+ */ + +const CastError = require('./error/cast'); +const DocumentNotFoundError = require('./error/notFound'); +const Kareem = require('kareem'); +const MongooseError = require('./error/mongooseError'); +const ObjectParameterError = require('./error/objectParameter'); +const QueryCursor = require('./cursor/QueryCursor'); +const ReadPreference = require('./driver').get().ReadPreference; +const applyGlobalMaxTimeMS = require('./helpers/query/applyGlobalMaxTimeMS'); +const applyWriteConcern = require('./helpers/schema/applyWriteConcern'); +const cast = require('./cast'); +const castArrayFilters = require('./helpers/update/castArrayFilters'); +const castUpdate = require('./helpers/query/castUpdate'); +const completeMany = require('./helpers/query/completeMany'); +const get = require('./helpers/get'); +const promiseOrCallback = require('./helpers/promiseOrCallback'); +const getDiscriminatorByValue = require('./helpers/discriminator/getDiscriminatorByValue'); +const hasDollarKeys = require('./helpers/query/hasDollarKeys'); +const helpers = require('./queryhelpers'); +const immediate = require('./helpers/immediate'); +const isExclusive = require('./helpers/projection/isExclusive'); +const isInclusive = require('./helpers/projection/isInclusive'); +const mquery = require('mquery'); +const parseProjection = require('./helpers/projection/parseProjection'); +const removeUnusedArrayFilters = require('./helpers/update/removeUnusedArrayFilters'); +const sanitizeFilter = require('./helpers/query/sanitizeFilter'); +const sanitizeProjection = require('./helpers/query/sanitizeProjection'); +const selectPopulatedFields = require('./helpers/query/selectPopulatedFields'); +const setDefaultsOnInsert = require('./helpers/setDefaultsOnInsert'); +const slice = require('sliced'); +const updateValidators = require('./helpers/updateValidators'); +const util = require('util'); +const utils = require('./utils'); +const validOps = require('./helpers/query/validOps'); +const wrapThunk = require('./helpers/query/wrapThunk'); + +/** + * Query constructor used for building queries. You do not need + * to instantiate a `Query` directly. Instead use Model functions like + * [`Model.find()`](/docs/api.html#find_find). + * + * ####Example: + * + * const query = MyModel.find(); // `query` is an instance of `Query` + * query.setOptions({ lean : true }); + * query.collection(MyModel.collection); + * query.where('age').gte(21).exec(callback); + * + * // You can instantiate a query directly. There is no need to do + * // this unless you're an advanced user with a very good reason to. 
+ * const query = new mongoose.Query(); + * + * @param {Object} [options] + * @param {Object} [model] + * @param {Object} [conditions] + * @param {Object} [collection] Mongoose collection + * @api public + */ + +function Query(conditions, options, model, collection) { + // this stuff is for dealing with custom queries created by #toConstructor + if (!this._mongooseOptions) { + this._mongooseOptions = {}; + } + options = options || {}; + + this._transforms = []; + this._hooks = new Kareem(); + this._executionStack = null; + + // this is the case where we have a CustomQuery, we need to check if we got + // options passed in, and if we did, merge them in + const keys = Object.keys(options); + for (const key of keys) { + this._mongooseOptions[key] = options[key]; + } + + if (collection) { + this.mongooseCollection = collection; + } + + if (model) { + this.model = model; + this.schema = model.schema; + } + + + // this is needed because map reduce returns a model that can be queried, but + // all of the queries on said model should be lean + if (this.model && this.model._mapreduce) { + this.lean(); + } + + // inherit mquery + mquery.call(this, this.mongooseCollection, options); + + if (conditions) { + this.find(conditions); + } + + this.options = this.options || {}; + + // For gh-6880. mquery still needs to support `fields` by default for old + // versions of MongoDB + this.$useProjection = true; + + const collation = get(this, 'schema.options.collation', null); + if (collation != null) { + this.options.collation = collation; + } +} + +/*! + * inherit mquery + */ + +Query.prototype = new mquery; +Query.prototype.constructor = Query; +Query.base = mquery.prototype; + +/** + * Flag to opt out of using `$geoWithin`. + * + * mongoose.Query.use$geoWithin = false; + * + * MongoDB 2.4 deprecated the use of `$within`, replacing it with `$geoWithin`. Mongoose uses `$geoWithin` by default (which is 100% backward compatible with `$within`). If you are running an older version of MongoDB, set this flag to `false` so your `within()` queries continue to work. + * + * @see http://docs.mongodb.org/manual/reference/operator/geoWithin/ + * @default true + * @property use$geoWithin + * @memberOf Query + * @receiver Query + * @api public + */ + +Query.use$geoWithin = mquery.use$geoWithin; + +/** + * Converts this query to a customized, reusable query constructor with all arguments and options retained. + * + * ####Example + * + * // Create a query for adventure movies and read from the primary + * // node in the replica-set unless it is down, in which case we'll + * // read from a secondary node. + * const query = Movie.find({ tags: 'adventure' }).read('primaryPreferred'); + * + * // create a custom Query constructor based off these settings + * const Adventure = query.toConstructor(); + * + * // Adventure is now a subclass of mongoose.Query and works the same way but with the + * // default query parameters and options set. 
+ * Adventure().exec(callback) + * + * // further narrow down our query results while still using the previous settings + * Adventure().where({ name: /^Life/ }).exec(callback); + * + * // since Adventure is a stand-alone constructor we can also add our own + * // helper methods and getters without impacting global queries + * Adventure.prototype.startsWith = function (prefix) { + * this.where({ name: new RegExp('^' + prefix) }) + * return this; + * } + * Object.defineProperty(Adventure.prototype, 'highlyRated', { + * get: function () { + * this.where({ rating: { $gt: 4.5 }}); + * return this; + * } + * }) + * Adventure().highlyRated.startsWith('Life').exec(callback) + * + * @return {Query} subclass-of-Query + * @api public + */ + +Query.prototype.toConstructor = function toConstructor() { + const model = this.model; + const coll = this.mongooseCollection; + + const CustomQuery = function(criteria, options) { + if (!(this instanceof CustomQuery)) { + return new CustomQuery(criteria, options); + } + this._mongooseOptions = utils.clone(p._mongooseOptions); + Query.call(this, criteria, options || null, model, coll); + }; + + util.inherits(CustomQuery, model.Query); + + // set inherited defaults + const p = CustomQuery.prototype; + + p.options = {}; + + // Need to handle `sort()` separately because entries-style `sort()` syntax + // `sort([['prop1', 1]])` confuses mquery into losing the outer nested array. + // See gh-8159 + const options = Object.assign({}, this.options); + if (options.sort != null) { + p.sort(options.sort); + delete options.sort; + } + p.setOptions(options); + + p.op = this.op; + p._validateOp(); + p._conditions = utils.clone(this._conditions); + p._fields = utils.clone(this._fields); + p._update = utils.clone(this._update, { + flattenDecimals: false + }); + p._path = this._path; + p._distinct = this._distinct; + p._collection = this._collection; + p._mongooseOptions = this._mongooseOptions; + + return CustomQuery; +}; + +/** + * Make a copy of this query so you can re-execute it. + * + * ####Example: + * const q = Book.findOne({ title: 'Casino Royale' }); + * await q.exec(); + * await q.exec(); // Throws an error because you can't execute a query twice + * + * await q.clone().exec(); // Works + * + * @method clone + * @return {Query} copy + * @memberOf Query + * @instance + * @api public + */ + +Query.prototype.clone = function clone() { + const model = this.model; + const collection = this.mongooseCollection; + + const q = new this.constructor({}, {}, model, collection); + + // Need to handle `sort()` separately because entries-style `sort()` syntax + // `sort([['prop1', 1]])` confuses mquery into losing the outer nested array. + // See gh-8159 + const options = Object.assign({}, this.options); + if (options.sort != null) { + q.sort(options.sort); + delete options.sort; + } + q.setOptions(options); + + q.op = this.op; + q._validateOp(); + q._conditions = utils.clone(this._conditions); + q._fields = utils.clone(this._fields); + q._update = utils.clone(this._update, { + flattenDecimals: false + }); + q._path = this._path; + q._distinct = this._distinct; + q._collection = this._collection; + q._mongooseOptions = this._mongooseOptions; + + return q; +}; + +/** + * Specifies a javascript function or expression to pass to MongoDBs query system. 
+ * + * ####Example + * + * query.$where('this.comments.length === 10 || this.name.length === 5') + * + * // or + * + * query.$where(function () { + * return this.comments.length === 10 || this.name.length === 5; + * }) + * + * ####NOTE: + * + * Only use `$where` when you have a condition that cannot be met using other MongoDB operators like `$lt`. + * **Be sure to read about all of [its caveats](http://docs.mongodb.org/manual/reference/operator/where/) before using.** + * + * @see $where http://docs.mongodb.org/manual/reference/operator/where/ + * @method $where + * @param {String|Function} js javascript string or function + * @return {Query} this + * @memberOf Query + * @instance + * @method $where + * @api public + */ + +/** + * Specifies a `path` for use with chaining. + * + * ####Example + * + * // instead of writing: + * User.find({age: {$gte: 21, $lte: 65}}, callback); + * + * // we can instead write: + * User.where('age').gte(21).lte(65); + * + * // passing query conditions is permitted + * User.find().where({ name: 'vonderful' }) + * + * // chaining + * User + * .where('age').gte(21).lte(65) + * .where('name', /^vonderful/i) + * .where('friends').slice(10) + * .exec(callback) + * + * @method where + * @memberOf Query + * @instance + * @param {String|Object} [path] + * @param {any} [val] + * @return {Query} this + * @api public + */ + +/** + * Specifies a `$slice` projection for an array. + * + * ####Example + * + * query.slice('comments', 5) + * query.slice('comments', -5) + * query.slice('comments', [10, 5]) + * query.where('comments').slice(5) + * query.where('comments').slice([-10, 5]) + * + * @method slice + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val number/range of elements to slice + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/Retrieving+a+Subset+of+Fields#RetrievingaSubsetofFields-RetrievingaSubrangeofArrayElements + * @see $slice http://docs.mongodb.org/manual/reference/projection/slice/#prj._S_slice + * @api public + */ + +Query.prototype.slice = function() { + if (arguments.length === 0) { + return this; + } + + this._validate('slice'); + + let path; + let val; + + if (arguments.length === 1) { + const arg = arguments[0]; + if (typeof arg === 'object' && !Array.isArray(arg)) { + const keys = Object.keys(arg); + const numKeys = keys.length; + for (let i = 0; i < numKeys; ++i) { + this.slice(keys[i], arg[keys[i]]); + } + return this; + } + this._ensurePath('slice'); + path = this._path; + val = arguments[0]; + } else if (arguments.length === 2) { + if ('number' === typeof arguments[0]) { + this._ensurePath('slice'); + path = this._path; + val = slice(arguments); + } else { + path = arguments[0]; + val = arguments[1]; + } + } else if (arguments.length === 3) { + path = arguments[0]; + val = slice(arguments, 1); + } + + const p = {}; + p[path] = { $slice: val }; + this.select(p); + + return this; +}; + +/*! 
+ * ignore + */ + +const validOpsSet = new Set(validOps); + +Query.prototype._validateOp = function() { + if (this.op != null && !validOpsSet.has(this.op)) { + this.error(new Error('Query has invalid `op`: "' + this.op + '"')); + } +}; + +/** + * Specifies the complementary comparison value for paths specified with `where()` + * + * ####Example + * + * User.where('age').equals(49); + * + * // is the same as + * + * User.where('age', 49); + * + * @method equals + * @memberOf Query + * @instance + * @param {Object} val + * @return {Query} this + * @api public + */ + +/** + * Specifies arguments for an `$or` condition. + * + * ####Example + * + * query.or([{ color: 'red' }, { status: 'emergency' }]) + * + * @see $or http://docs.mongodb.org/manual/reference/operator/or/ + * @method or + * @memberOf Query + * @instance + * @param {Array} array array of conditions + * @return {Query} this + * @api public + */ + +/** + * Specifies arguments for a `$nor` condition. + * + * ####Example + * + * query.nor([{ color: 'green' }, { status: 'ok' }]) + * + * @see $nor http://docs.mongodb.org/manual/reference/operator/nor/ + * @method nor + * @memberOf Query + * @instance + * @param {Array} array array of conditions + * @return {Query} this + * @api public + */ + +/** + * Specifies arguments for a `$and` condition. + * + * ####Example + * + * query.and([{ color: 'green' }, { status: 'ok' }]) + * + * @method and + * @memberOf Query + * @instance + * @see $and http://docs.mongodb.org/manual/reference/operator/and/ + * @param {Array} array array of conditions + * @return {Query} this + * @api public + */ + +/** + * Specifies a `$gt` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * ####Example + * + * Thing.find().where('age').gt(21) + * + * // or + * Thing.find().gt('age', 21) + * + * @method gt + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val + * @see $gt http://docs.mongodb.org/manual/reference/operator/gt/ + * @api public + */ + +/** + * Specifies a `$gte` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method gte + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val + * @see $gte http://docs.mongodb.org/manual/reference/operator/gte/ + * @api public + */ + +/** + * Specifies a `$lt` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method lt + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val + * @see $lt http://docs.mongodb.org/manual/reference/operator/lt/ + * @api public + */ + +/** + * Specifies a `$lte` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method lte + * @see $lte http://docs.mongodb.org/manual/reference/operator/lte/ + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a `$ne` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @see $ne http://docs.mongodb.org/manual/reference/operator/ne/ + * @method ne + * @memberOf Query + * @instance + * @param {String} [path] + * @param {any} val + * @api public + */ + +/** + * Specifies an `$in` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. 
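+ *
+ * A brief sketch (the `MyModel` name and `status` path are illustrative);
+ * both forms below produce the filter `{ status: { $in: ['active', 'pending'] } }`:
+ *
+ *     MyModel.find().where('status').in(['active', 'pending']);
+ *     // or, equivalently:
+ *     MyModel.find().in('status', ['active', 'pending']);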
+ * + * @see $in http://docs.mongodb.org/manual/reference/operator/in/ + * @method in + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Array} val + * @api public + */ + +/** + * Specifies an `$nin` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @see $nin http://docs.mongodb.org/manual/reference/operator/nin/ + * @method nin + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Array} val + * @api public + */ + +/** + * Specifies an `$all` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * ####Example: + * + * MyModel.find().where('pets').all(['dog', 'cat', 'ferret']); + * // Equivalent: + * MyModel.find().all('pets', ['dog', 'cat', 'ferret']); + * + * @see $all http://docs.mongodb.org/manual/reference/operator/all/ + * @method all + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Array} val + * @api public + */ + +/** + * Specifies a `$size` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * ####Example + * + * const docs = await MyModel.where('tags').size(0).exec(); + * assert(Array.isArray(docs)); + * console.log('documents with 0 tags', docs); + * + * @see $size http://docs.mongodb.org/manual/reference/operator/size/ + * @method size + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a `$regex` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @see $regex http://docs.mongodb.org/manual/reference/operator/regex/ + * @method regex + * @memberOf Query + * @instance + * @param {String} [path] + * @param {String|RegExp} val + * @api public + */ + +/** + * Specifies a `maxDistance` query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @see $maxDistance http://docs.mongodb.org/manual/reference/operator/maxDistance/ + * @method maxDistance + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a `$mod` condition, filters documents for documents whose + * `path` property is a number that is equal to `remainder` modulo `divisor`. + * + * ####Example + * + * // All find products whose inventory is odd + * Product.find().mod('inventory', [2, 1]); + * Product.find().where('inventory').mod([2, 1]); + * // This syntax is a little strange, but supported. + * Product.find().where('inventory').mod(2, 1); + * + * @method mod + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Array} val must be of length 2, first element is `divisor`, 2nd element is `remainder`. 
+ * @return {Query} this + * @see $mod http://docs.mongodb.org/manual/reference/operator/mod/ + * @api public + */ + +Query.prototype.mod = function() { + let val; + let path; + + if (arguments.length === 1) { + this._ensurePath('mod'); + val = arguments[0]; + path = this._path; + } else if (arguments.length === 2 && !Array.isArray(arguments[1])) { + this._ensurePath('mod'); + val = slice(arguments); + path = this._path; + } else if (arguments.length === 3) { + val = slice(arguments, 1); + path = arguments[0]; + } else { + val = arguments[1]; + path = arguments[0]; + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds.$mod = val; + return this; +}; + +/** + * Specifies an `$exists` condition + * + * ####Example + * + * // { name: { $exists: true }} + * Thing.where('name').exists() + * Thing.where('name').exists(true) + * Thing.find().exists('name') + * + * // { name: { $exists: false }} + * Thing.where('name').exists(false); + * Thing.find().exists('name', false); + * + * @method exists + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Boolean} val + * @return {Query} this + * @see $exists http://docs.mongodb.org/manual/reference/operator/exists/ + * @api public + */ + +/** + * Specifies an `$elemMatch` condition + * + * ####Example + * + * query.elemMatch('comment', { author: 'autobot', votes: {$gte: 5}}) + * + * query.where('comment').elemMatch({ author: 'autobot', votes: {$gte: 5}}) + * + * query.elemMatch('comment', function (elem) { + * elem.where('author').equals('autobot'); + * elem.where('votes').gte(5); + * }) + * + * query.where('comment').elemMatch(function (elem) { + * elem.where({ author: 'autobot' }); + * elem.where('votes').gte(5); + * }) + * + * @method elemMatch + * @memberOf Query + * @instance + * @param {String|Object|Function} path + * @param {Object|Function} filter + * @return {Query} this + * @see $elemMatch http://docs.mongodb.org/manual/reference/operator/elemMatch/ + * @api public + */ + +/** + * Defines a `$within` or `$geoWithin` argument for geo-spatial queries. + * + * ####Example + * + * query.where(path).within().box() + * query.where(path).within().circle() + * query.where(path).within().geometry() + * + * query.where('loc').within({ center: [50,50], radius: 10, unique: true, spherical: true }); + * query.where('loc').within({ box: [[40.73, -73.9], [40.7, -73.988]] }); + * query.where('loc').within({ polygon: [[],[],[],[]] }); + * + * query.where('loc').within([], [], []) // polygon + * query.where('loc').within([], []) // box + * query.where('loc').within({ type: 'LineString', coordinates: [...] }); // geometry + * + * **MUST** be used after `where()`. + * + * ####NOTE: + * + * As of Mongoose 3.7, `$geoWithin` is always used for queries. To change this behavior, see [Query.use$geoWithin](#query_Query-use%2524geoWithin). + * + * ####NOTE: + * + * In Mongoose 3.7, `within` changed from a getter to a function. If you need the old syntax, use [this](https://github.com/ebensing/mongoose-within). 
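+ *
+ * ####Example (illustrative sketch; `Place` and the `loc` path are placeholder names, not part of this API)
+ *
+ *     // Produces { loc: { $geoWithin: { $box: [[40.7, -74.0], [40.8, -73.9]] } } }
+ *     Place.find().where('loc').within().box([40.7, -74.0], [40.8, -73.9]);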
+ * + * @method within + * @see $polygon http://docs.mongodb.org/manual/reference/operator/polygon/ + * @see $box http://docs.mongodb.org/manual/reference/operator/box/ + * @see $geometry http://docs.mongodb.org/manual/reference/operator/geometry/ + * @see $center http://docs.mongodb.org/manual/reference/operator/center/ + * @see $centerSphere http://docs.mongodb.org/manual/reference/operator/centerSphere/ + * @memberOf Query + * @instance + * @return {Query} this + * @api public + */ + +/** + * Specifies the maximum number of documents the query will return. + * + * ####Example + * + * query.limit(20) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method limit + * @memberOf Query + * @instance + * @param {Number} val + * @api public + */ + +/** + * Specifies the number of documents to skip. + * + * ####Example + * + * query.skip(100).limit(20) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method skip + * @memberOf Query + * @instance + * @param {Number} val + * @see cursor.skip http://docs.mongodb.org/manual/reference/method/cursor.skip/ + * @api public + */ + +/** + * Specifies the maxScan option. + * + * ####Example + * + * query.maxScan(100) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method maxScan + * @memberOf Query + * @instance + * @param {Number} val + * @see maxScan http://docs.mongodb.org/manual/reference/operator/maxScan/ + * @api public + */ + +/** + * Specifies the batchSize option. + * + * ####Example + * + * query.batchSize(100) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method batchSize + * @memberOf Query + * @instance + * @param {Number} val + * @see batchSize http://docs.mongodb.org/manual/reference/method/cursor.batchSize/ + * @api public + */ + +/** + * Specifies the `comment` option. + * + * ####Example + * + * query.comment('login query') + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method comment + * @memberOf Query + * @instance + * @param {String} val + * @see comment http://docs.mongodb.org/manual/reference/operator/comment/ + * @api public + */ + +/** + * Specifies this query as a `snapshot` query. + * + * ####Example + * + * query.snapshot() // true + * query.snapshot(true) + * query.snapshot(false) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method snapshot + * @memberOf Query + * @instance + * @see snapshot http://docs.mongodb.org/manual/reference/operator/snapshot/ + * @return {Query} this + * @api public + */ + +/** + * Sets query hints. + * + * ####Example + * + * query.hint({ indexA: 1, indexB: -1}) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method hint + * @memberOf Query + * @instance + * @param {Object} val a hint object + * @return {Query} this + * @see $hint http://docs.mongodb.org/manual/reference/operator/hint/ + * @api public + */ + +/** + * Get/set the current projection (AKA fields). Pass `null` to remove the + * current projection. + * + * Unlike `projection()`, the `select()` function modifies the current + * projection in place. This function overwrites the existing projection. 
+ * + * ####Example: + * + * const q = Model.find(); + * q.projection(); // null + * + * q.select('a b'); + * q.projection(); // { a: 1, b: 1 } + * + * q.projection({ c: 1 }); + * q.projection(); // { c: 1 } + * + * q.projection(null); + * q.projection(); // null + * + * + * @method projection + * @memberOf Query + * @instance + * @param {Object|null} arg + * @return {Object} the current projection + * @api public + */ + +Query.prototype.projection = function(arg) { + if (arguments.length === 0) { + return this._fields; + } + + this._fields = {}; + this._userProvidedFields = {}; + this.select(arg); + return this._fields; +}; + +/** + * Specifies which document fields to include or exclude (also known as the query "projection") + * + * When using string syntax, prefixing a path with `-` will flag that path as excluded. When a path does not have the `-` prefix, it is included. Lastly, if a path is prefixed with `+`, it forces inclusion of the path, which is useful for paths excluded at the [schema level](/docs/api.html#schematype_SchemaType-select). + * + * A projection _must_ be either inclusive or exclusive. In other words, you must + * either list the fields to include (which excludes all others), or list the fields + * to exclude (which implies all other fields are included). The [`_id` field is the only exception because MongoDB includes it by default](https://docs.mongodb.com/manual/tutorial/project-fields-from-query-results/#suppress-id-field). + * + * ####Example + * + * // include a and b, exclude other fields + * query.select('a b'); + * // Equivalent syntaxes: + * query.select(['a', 'b']); + * query.select({ a: 1, b: 1 }); + * + * // exclude c and d, include other fields + * query.select('-c -d'); + * + * // Use `+` to override schema-level `select: false` without making the + * // projection inclusive. + * const schema = new Schema({ + * foo: { type: String, select: false }, + * bar: String + * }); + * // ... 
+ * query.select('+foo'); // Override foo's `select: false` without excluding `bar` + * + * // or you may use object notation, useful when + * // you have keys already prefixed with a "-" + * query.select({ a: 1, b: 1 }); + * query.select({ c: 0, d: 0 }); + * + * + * @method select + * @memberOf Query + * @instance + * @param {Object|String|Array} arg + * @return {Query} this + * @see SchemaType + * @api public + */ + +Query.prototype.select = function select() { + let arg = arguments[0]; + if (!arg) return this; + + if (arguments.length !== 1) { + throw new Error('Invalid select: select only takes 1 argument'); + } + + this._validate('select'); + + const fields = this._fields || (this._fields = {}); + const userProvidedFields = this._userProvidedFields || (this._userProvidedFields = {}); + let sanitizeProjection = undefined; + if (this.model != null && utils.hasUserDefinedProperty(this.model.db.options, 'sanitizeProjection')) { + sanitizeProjection = this.model.db.options.sanitizeProjection; + } else if (this.model != null && utils.hasUserDefinedProperty(this.model.base.options, 'sanitizeProjection')) { + sanitizeProjection = this.model.base.options.sanitizeProjection; + } else { + sanitizeProjection = this._mongooseOptions.sanitizeProjection; + } + + arg = parseProjection(arg); + + if (utils.isObject(arg)) { + const keys = Object.keys(arg); + for (let i = 0; i < keys.length; ++i) { + let value = arg[keys[i]]; + if (typeof value === 'string' && sanitizeProjection) { + value = 1; + } + fields[keys[i]] = value; + userProvidedFields[keys[i]] = value; + } + return this; + } + + throw new TypeError('Invalid select() argument. Must be string or object.'); +}; + +/** + * _DEPRECATED_ Sets the slaveOk option. + * + * **Deprecated** in MongoDB 2.2 in favor of [read preferences](#query_Query-read). + * + * ####Example: + * + * query.slaveOk() // true + * query.slaveOk(true) + * query.slaveOk(false) + * + * @method slaveOk + * @memberOf Query + * @instance + * @deprecated use read() preferences instead if on mongodb >= 2.2 + * @param {Boolean} v defaults to true + * @see mongodb http://docs.mongodb.org/manual/applications/replication/#read-preference + * @see slaveOk http://docs.mongodb.org/manual/reference/method/rs.slaveOk/ + * @see read() #query_Query-read + * @return {Query} this + * @api public + */ + +/** + * Determines the MongoDB nodes from which to read. + * + * ####Preferences: + * + * primary - (default) Read from primary only. Operations will produce an error if primary is unavailable. Cannot be combined with tags. + * secondary Read from secondary if available, otherwise error. + * primaryPreferred Read from primary if available, otherwise a secondary. + * secondaryPreferred Read from a secondary if available, otherwise read from the primary. + * nearest All operations read from among the nearest candidates, but unlike other modes, this option will include both the primary and all secondaries in the random selection. 
+ * + * Aliases + * + * p primary + * pp primaryPreferred + * s secondary + * sp secondaryPreferred + * n nearest + * + * ####Example: + * + * new Query().read('primary') + * new Query().read('p') // same as primary + * + * new Query().read('primaryPreferred') + * new Query().read('pp') // same as primaryPreferred + * + * new Query().read('secondary') + * new Query().read('s') // same as secondary + * + * new Query().read('secondaryPreferred') + * new Query().read('sp') // same as secondaryPreferred + * + * new Query().read('nearest') + * new Query().read('n') // same as nearest + * + * // read from secondaries with matching tags + * new Query().read('s', [{ dc:'sf', s: 1 },{ dc:'ma', s: 2 }]) + * + * Read more about how to use read preferences [here](http://docs.mongodb.org/manual/applications/replication/#read-preference) and [here](http://mongodb.github.com/node-mongodb-native/driver-articles/anintroductionto1_1and2_2.html#read-preferences). + * + * @method read + * @memberOf Query + * @instance + * @param {String} pref one of the listed preference options or aliases + * @param {Array} [tags] optional tags for this query + * @see mongodb http://docs.mongodb.org/manual/applications/replication/#read-preference + * @see driver http://mongodb.github.com/node-mongodb-native/driver-articles/anintroductionto1_1and2_2.html#read-preferences + * @return {Query} this + * @api public + */ + +Query.prototype.read = function read(pref, tags) { + // first cast into a ReadPreference object to support tags + const read = new ReadPreference(pref, tags); + this.options.readPreference = read; + return this; +}; + +/*! + * ignore + */ + +Query.prototype.toString = function toString() { + if (this.op === 'count' || + this.op === 'countDocuments' || + this.op === 'find' || + this.op === 'findOne' || + this.op === 'deleteMany' || + this.op === 'deleteOne' || + this.op === 'findOneAndDelete' || + this.op === 'findOneAndRemove' || + this.op === 'remove') { + return `${this.model.modelName}.${this.op}(${util.inspect(this._conditions)})`; + } + if (this.op === 'distinct') { + return `${this.model.modelName}.distinct('${this._distinct}', ${util.inspect(this._conditions)})`; + } + if (this.op === 'findOneAndReplace' || + this.op === 'findOneAndUpdate' || + this.op === 'replaceOne' || + this.op === 'update' || + this.op === 'updateMany' || + this.op === 'updateOne') { + return `${this.model.modelName}.${this.op}(${util.inspect(this._conditions)}, ${util.inspect(this._update)})`; + } + + // 'estimatedDocumentCount' or any others + return `${this.model.modelName}.${this.op}()`; +}; + +/** + * Sets the [MongoDB session](https://docs.mongodb.com/manual/reference/server-sessions/) + * associated with this query. Sessions are how you mark a query as part of a + * [transaction](/docs/transactions.html). + * + * Calling `session(null)` removes the session from this query. 
+ * + * ####Example: + * + * const s = await mongoose.startSession(); + * await mongoose.model('Person').findOne({ name: 'Axl Rose' }).session(s); + * + * @method session + * @memberOf Query + * @instance + * @param {ClientSession} [session] from `await conn.startSession()` + * @see Connection.prototype.startSession() /docs/api.html#connection_Connection-startSession + * @see mongoose.startSession() /docs/api.html#mongoose_Mongoose-startSession + * @return {Query} this + * @api public + */ + +Query.prototype.session = function session(v) { + if (v == null) { + delete this.options.session; + } + this.options.session = v; + return this; +}; + +/** + * Sets the 3 write concern parameters for this query: + * + * - `w`: Sets the specified number of `mongod` servers, or tag set of `mongod` servers, that must acknowledge this write before this write is considered successful. + * - `j`: Boolean, set to `true` to request acknowledgement that this operation has been persisted to MongoDB's on-disk journal. + * - `wtimeout`: If [`w > 1`](/docs/api.html#query_Query-w), the maximum amount of time to wait for this write to propagate through the replica set before this operation fails. The default is `0`, which means no timeout. + * + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndReplace()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to the schema's [`writeConcern` option](/docs/guide.html#writeConcern) + * + * ####Example: + * + * // The 'majority' option means the `deleteOne()` promise won't resolve + * // until the `deleteOne()` has propagated to the majority of the replica set + * await mongoose.model('Person'). + * deleteOne({ name: 'Ned Stark' }). + * writeConcern({ w: 'majority' }); + * + * @method writeConcern + * @memberOf Query + * @instance + * @param {Object} writeConcern the write concern value to set + * @see mongodb https://mongodb.github.io/node-mongodb-native/3.1/api/global.html#WriteConcern + * @return {Query} this + * @api public + */ + +Query.prototype.writeConcern = function writeConcern(val) { + if (val == null) { + delete this.options.writeConcern; + return this; + } + this.options.writeConcern = val; + return this; +}; + +/** + * Sets the specified number of `mongod` servers, or tag set of `mongod` servers, + * that must acknowledge this write before this write is considered successful. + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndReplace()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to the schema's [`writeConcern.w` option](/docs/guide.html#writeConcern) + * + * ####Example: + * + * // The 'majority' option means the `deleteOne()` promise won't resolve + * // until the `deleteOne()` has propagated to the majority of the replica set + * await mongoose.model('Person'). + * deleteOne({ name: 'Ned Stark' }). + * w('majority'); + * + * @method w + * @memberOf Query + * @instance + * @param {String|number} val 0 for fire-and-forget, 1 for acknowledged by one server, 'majority' for majority of the replica set, or [any of the more advanced options](https://docs.mongodb.com/manual/reference/write-concern/#w-option). 
+ * @see mongodb https://docs.mongodb.com/manual/reference/write-concern/#w-option + * @return {Query} this + * @api public + */ + +Query.prototype.w = function w(val) { + if (val == null) { + delete this.options.w; + } + if (this.options.writeConcern != null) { + this.options.writeConcern.w = val; + } else { + this.options.w = val; + } + return this; +}; + +/** + * Requests acknowledgement that this operation has been persisted to MongoDB's + * on-disk journal. + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndReplace()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to the schema's [`writeConcern.j` option](/docs/guide.html#writeConcern) + * + * ####Example: + * + * await mongoose.model('Person').deleteOne({ name: 'Ned Stark' }).j(true); + * + * @method j + * @memberOf Query + * @instance + * @param {boolean} val + * @see mongodb https://docs.mongodb.com/manual/reference/write-concern/#j-option + * @return {Query} this + * @api public + */ + +Query.prototype.j = function j(val) { + if (val == null) { + delete this.options.j; + } + if (this.options.writeConcern != null) { + this.options.writeConcern.j = val; + } else { + this.options.j = val; + } + return this; +}; + +/** + * If [`w > 1`](/docs/api.html#query_Query-w), the maximum amount of time to + * wait for this write to propagate through the replica set before this + * operation fails. The default is `0`, which means no timeout. + * + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndReplace()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to the schema's [`writeConcern.wtimeout` option](/docs/guide.html#writeConcern) + * + * ####Example: + * + * // The `deleteOne()` promise won't resolve until this `deleteOne()` has + * // propagated to at least `w = 2` members of the replica set. If it takes + * // longer than 1 second, this `deleteOne()` will fail. + * await mongoose.model('Person'). + * deleteOne({ name: 'Ned Stark' }). + * w(2). + * wtimeout(1000); + * + * @method wtimeout + * @memberOf Query + * @instance + * @param {number} ms number of milliseconds to wait + * @see mongodb https://docs.mongodb.com/manual/reference/write-concern/#wtimeout + * @return {Query} this + * @api public + */ + +Query.prototype.wtimeout = function wtimeout(ms) { + if (ms == null) { + delete this.options.wtimeout; + } + if (this.options.writeConcern != null) { + this.options.writeConcern.wtimeout = ms; + } else { + this.options.wtimeout = ms; + } + return this; +}; + +/** + * Sets the readConcern option for the query. 
+ *
+ * ####Example:
+ *
+ *     new Query().readConcern('local')
+ *     new Query().readConcern('l')  // same as local
+ *
+ *     new Query().readConcern('available')
+ *     new Query().readConcern('a')  // same as available
+ *
+ *     new Query().readConcern('majority')
+ *     new Query().readConcern('m')  // same as majority
+ *
+ *     new Query().readConcern('linearizable')
+ *     new Query().readConcern('lz') // same as linearizable
+ *
+ *     new Query().readConcern('snapshot')
+ *     new Query().readConcern('s')  // same as snapshot
+ *
+ *
+ * ####Read Concern Level:
+ *
+ *     local         MongoDB 3.2+ The query returns from the instance with no guarantee that the data has been written to a majority of the replica set members (i.e. may be rolled back).
+ *     available     MongoDB 3.6+ The query returns from the instance with no guarantee that the data has been written to a majority of the replica set members (i.e. may be rolled back).
+ *     majority      MongoDB 3.2+ The query returns the data that has been acknowledged by a majority of the replica set members. The documents returned by the read operation are durable, even in the event of failure.
+ *     linearizable  MongoDB 3.4+ The query returns data that reflects all successful majority-acknowledged writes that completed prior to the start of the read operation. The query may wait for concurrently executing writes to propagate to a majority of replica set members before returning results.
+ *     snapshot      MongoDB 4.0+ Only available for operations within multi-document transactions. Upon transaction commit with write concern "majority", the transaction operations are guaranteed to have read from a snapshot of majority-committed data.
+ *
+ * Aliases
+ *
+ *     l   local
+ *     a   available
+ *     m   majority
+ *     lz  linearizable
+ *     s   snapshot
+ *
+ * Read more about how to use read concern [here](https://docs.mongodb.com/manual/reference/read-concern/).
+ *
+ * @memberOf Query
+ * @method readConcern
+ * @param {String} level one of the listed read concern levels or their aliases
+ * @see mongodb https://docs.mongodb.com/manual/reference/read-concern/
+ * @return {Query} this
+ * @api public
+ */
+
+/**
+ * Gets query options.
+ *
+ * ####Example:
+ *
+ *     const query = new Query();
+ *     query.limit(10);
+ *     query.setOptions({ maxTimeMS: 1000 });
+ *     query.getOptions(); // { limit: 10, maxTimeMS: 1000 }
+ *
+ * @return {Object} the options
+ * @api public
+ */
+
+Query.prototype.getOptions = function() {
+  return this.options;
+};
+
+/**
+ * Sets query options. Some options only make sense for certain operations.
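+ *
+ * ####Example (illustrative sketch; `Person` is a placeholder model name)
+ *
+ *     // Equivalent to chaining .lean().limit(10) on the query
+ *     Person.find({ age: { $gte: 21 } }).setOptions({ lean: true, limit: 10 });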
+ * + * ####Options: + * + * The following options are only for `find()`: + * + * - [tailable](http://www.mongodb.org/display/DOCS/Tailable+Cursors) + * - [sort](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bsort(\)%7D%7D) + * - [limit](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Blimit%28%29%7D%7D) + * - [skip](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bskip%28%29%7D%7D) + * - [allowDiskUse](https://docs.mongodb.com/manual/reference/method/cursor.allowDiskUse/) + * - [batchSize](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7BbatchSize%28%29%7D%7D) + * - [readPreference](http://docs.mongodb.org/manual/applications/replication/#read-preference) + * - [hint](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24hint) + * - [comment](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24comment) + * - [snapshot](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bsnapshot%28%29%7D%7D) + * - [maxscan](https://docs.mongodb.org/v3.2/reference/operator/meta/maxScan/#metaOp._S_maxScan) + * + * The following options are only for write operations: `update()`, `updateOne()`, `updateMany()`, `replaceOne()`, `findOneAndUpdate()`, and `findByIdAndUpdate()`: + * + * - [upsert](https://docs.mongodb.com/manual/reference/method/db.collection.update/) + * - [writeConcern](https://docs.mongodb.com/manual/reference/method/db.collection.update/) + * - [timestamps](https://mongoosejs.com/docs/guide.html#timestamps): If `timestamps` is set in the schema, set this option to `false` to skip timestamps for that particular update. Has no effect if `timestamps` is not enabled in the schema options. + * - omitUndefined: delete any properties whose value is `undefined` when casting an update. In other words, if this is set, Mongoose will delete `baz` from the update in `Model.updateOne({}, { foo: 'bar', baz: undefined })` before sending the update to the server. + * - overwriteDiscriminatorKey: allow setting the discriminator key in the update. Will use the correct discriminator schema if the update changes the discriminator key. 
+ * + * The following options are only for `find()`, `findOne()`, `findById()`, `findOneAndUpdate()`, and `findByIdAndUpdate()`: + * + * - [lean](./api.html#query_Query-lean) + * - [populate](/docs/populate.html) + * - [projection](/docs/api/query.html#query_Query-projection) + * - sanitizeProjection + * + * The following options are only for all operations **except** `update()`, `updateOne()`, `updateMany()`, `remove()`, `deleteOne()`, and `deleteMany()`: + * + * - [maxTimeMS](https://docs.mongodb.com/manual/reference/operator/meta/maxTimeMS/) + * + * The following options are for `findOneAndUpdate()` and `findOneAndRemove()` + * + * - rawResult + * + * The following options are for all operations: + * + * - [strict](/docs/guide.html#strict) + * - [collation](https://docs.mongodb.com/manual/reference/collation/) + * - [session](https://docs.mongodb.com/manual/reference/server-sessions/) + * - [explain](https://docs.mongodb.com/manual/reference/method/cursor.explain/) + * + * @param {Object} options + * @return {Query} this + * @api public + */ + +Query.prototype.setOptions = function(options, overwrite) { + // overwrite is only for internal use + if (overwrite) { + // ensure that _mongooseOptions & options are two different objects + this._mongooseOptions = (options && utils.clone(options)) || {}; + this.options = options || {}; + + if ('populate' in options) { + this.populate(this._mongooseOptions); + } + return this; + } + if (options == null) { + return this; + } + if (typeof options !== 'object') { + throw new Error('Options must be an object, got "' + options + '"'); + } + + if (Array.isArray(options.populate)) { + const populate = options.populate; + delete options.populate; + const _numPopulate = populate.length; + for (let i = 0; i < _numPopulate; ++i) { + this.populate(populate[i]); + } + } + + if ('setDefaultsOnInsert' in options) { + this._mongooseOptions.setDefaultsOnInsert = options.setDefaultsOnInsert; + delete options.setDefaultsOnInsert; + } + if ('overwriteDiscriminatorKey' in options) { + this._mongooseOptions.overwriteDiscriminatorKey = options.overwriteDiscriminatorKey; + delete options.overwriteDiscriminatorKey; + } + if ('sanitizeProjection' in options) { + if (options.sanitizeProjection && !this._mongooseOptions.sanitizeProjection) { + sanitizeProjection(this._fields); + } + + this._mongooseOptions.sanitizeProjection = options.sanitizeProjection; + delete options.sanitizeProjection; + } + if ('sanitizeFilter' in options) { + this._mongooseOptions.sanitizeFilter = options.sanitizeFilter; + delete options.sanitizeFilter; + } + + if ('defaults' in options) { + this._mongooseOptions.defaults = options.defaults; + // deleting options.defaults will cause 7287 to fail + } + + return Query.base.setOptions.call(this, options); +}; + +/** + * Sets the [`explain` option](https://docs.mongodb.com/manual/reference/method/cursor.explain/), + * which makes this query return detailed execution stats instead of the actual + * query result. This method is useful for determining what index your queries + * use. + * + * Calling `query.explain(v)` is equivalent to `query.setOptions({ explain: v })` + * + * ####Example: + * + * const query = new Query(); + * const res = await query.find({ a: 1 }).explain('queryPlanner'); + * console.log(res); + * + * @param {String} [verbose] The verbosity mode. Either 'queryPlanner', 'executionStats', or 'allPlansExecution'. 
The default is 'queryPlanner' + * @return {Query} this + * @api public + */ + +Query.prototype.explain = function(verbose) { + if (arguments.length === 0) { + this.options.explain = true; + } else if (verbose === false) { + delete this.options.explain; + } else { + this.options.explain = verbose; + } + return this; +}; + +/** + * Sets the [`allowDiskUse` option](https://docs.mongodb.com/manual/reference/method/cursor.allowDiskUse/), + * which allows the MongoDB server to use more than 100 MB for this query's `sort()`. This option can + * let you work around `QueryExceededMemoryLimitNoDiskUseAllowed` errors from the MongoDB server. + * + * Note that this option requires MongoDB server >= 4.4. Setting this option is a no-op for MongoDB 4.2 + * and earlier. + * + * Calling `query.allowDiskUse(v)` is equivalent to `query.setOptions({ allowDiskUse: v })` + * + * ####Example: + * + * await query.find().sort({ name: 1 }).allowDiskUse(true); + * // Equivalent: + * await query.find().sort({ name: 1 }).allowDiskUse(); + * + * @param {Boolean} [v] Enable/disable `allowDiskUse`. If called with 0 arguments, sets `allowDiskUse: true` + * @return {Query} this + * @api public + */ + +Query.prototype.allowDiskUse = function(v) { + if (arguments.length === 0) { + this.options.allowDiskUse = true; + } else if (v === false) { + delete this.options.allowDiskUse; + } else { + this.options.allowDiskUse = v; + } + return this; +}; + +/** + * Sets the [maxTimeMS](https://docs.mongodb.com/manual/reference/method/cursor.maxTimeMS/) + * option. This will tell the MongoDB server to abort if the query or write op + * has been running for more than `ms` milliseconds. + * + * Calling `query.maxTimeMS(v)` is equivalent to `query.setOptions({ maxTimeMS: v })` + * + * ####Example: + * + * const query = new Query(); + * // Throws an error 'operation exceeded time limit' as long as there's + * // >= 1 doc in the queried collection + * const res = await query.find({ $where: 'sleep(1000) || true' }).maxTimeMS(100); + * + * @param {Number} [ms] The number of milliseconds + * @return {Query} this + * @api public + */ + +Query.prototype.maxTimeMS = function(ms) { + this.options.maxTimeMS = ms; + return this; +}; + +/** + * Returns the current query filter (also known as conditions) as a [POJO](https://masteringjs.io/tutorials/fundamentals/pojo). + * + * ####Example: + * + * const query = new Query(); + * query.find({ a: 1 }).where('b').gt(2); + * query.getFilter(); // { a: 1, b: { $gt: 2 } } + * + * @return {Object} current query filter + * @api public + */ + +Query.prototype.getFilter = function() { + return this._conditions; +}; + +/** + * Returns the current query filter. Equivalent to `getFilter()`. + * + * You should use `getFilter()` instead of `getQuery()` where possible. `getQuery()` + * will likely be deprecated in a future release. + * + * ####Example: + * + * const query = new Query(); + * query.find({ a: 1 }).where('b').gt(2); + * query.getQuery(); // { a: 1, b: { $gt: 2 } } + * + * @return {Object} current query filter + * @api public + */ + +Query.prototype.getQuery = function() { + return this._conditions; +}; + +/** + * Sets the query conditions to the provided JSON object. 
+ * + * ####Example: + * + * const query = new Query(); + * query.find({ a: 1 }) + * query.setQuery({ a: 2 }); + * query.getQuery(); // { a: 2 } + * + * @param {Object} new query conditions + * @return {undefined} + * @api public + */ + +Query.prototype.setQuery = function(val) { + this._conditions = val; +}; + +/** + * Returns the current update operations as a JSON object. + * + * ####Example: + * + * const query = new Query(); + * query.update({}, { $set: { a: 5 } }); + * query.getUpdate(); // { $set: { a: 5 } } + * + * @return {Object} current update operations + * @api public + */ + +Query.prototype.getUpdate = function() { + return this._update; +}; + +/** + * Sets the current update operation to new value. + * + * ####Example: + * + * const query = new Query(); + * query.update({}, { $set: { a: 5 } }); + * query.setUpdate({ $set: { b: 6 } }); + * query.getUpdate(); // { $set: { b: 6 } } + * + * @param {Object} new update operation + * @return {undefined} + * @api public + */ + +Query.prototype.setUpdate = function(val) { + this._update = val; +}; + +/** + * Returns fields selection for this query. + * + * @method _fieldsForExec + * @return {Object} + * @api private + * @receiver Query + */ + +Query.prototype._fieldsForExec = function() { + return utils.clone(this._fields); +}; + + +/** + * Return an update document with corrected `$set` operations. + * + * @method _updateForExec + * @api private + * @receiver Query + */ + +Query.prototype._updateForExec = function() { + const update = utils.clone(this._update, { + transform: false, + depopulate: true + }); + const ops = Object.keys(update); + let i = ops.length; + const ret = {}; + + while (i--) { + const op = ops[i]; + + if (this.options.overwrite) { + ret[op] = update[op]; + continue; + } + + if ('$' !== op[0]) { + // fix up $set sugar + if (!ret.$set) { + if (update.$set) { + ret.$set = update.$set; + } else { + ret.$set = {}; + } + } + ret.$set[op] = update[op]; + ops.splice(i, 1); + if (!~ops.indexOf('$set')) ops.push('$set'); + } else if ('$set' === op) { + if (!ret.$set) { + ret[op] = update[op]; + } + } else { + ret[op] = update[op]; + } + } + + return ret; +}; + +/** + * Makes sure _path is set. + * + * @method _ensurePath + * @param {String} method + * @api private + * @receiver Query + */ + +/** + * Determines if `conds` can be merged using `mquery().merge()` + * + * @method canMerge + * @memberOf Query + * @instance + * @param {Object} conds + * @return {Boolean} + * @api private + */ + +/** + * Returns default options for this query. + * + * @param {Model} model + * @api private + */ + +Query.prototype._optionsForExec = function(model) { + const options = utils.clone(this.options); + delete options.populate; + model = model || this.model; + + if (!model) { + return options; + } + + // Apply schema-level `writeConcern` option + applyWriteConcern(model.schema, options); + + const readPreference = get(model, 'schema.options.read'); + if (!('readPreference' in options) && readPreference) { + options.readPreference = readPreference; + } + + if (options.upsert !== void 0) { + options.upsert = !!options.upsert; + } + if (options.writeConcern) { + if (options.j) { + options.writeConcern.j = options.j; + delete options.j; + } + if (options.w) { + options.writeConcern.w = options.w; + delete options.w; + } + if (options.wtimeout) { + options.writeConcern.wtimeout = options.wtimeout; + delete options.wtimeout; + } + } + return options; +}; + +/** + * Sets the lean option. 
+ * + * Documents returned from queries with the `lean` option enabled are plain + * javascript objects, not [Mongoose Documents](/api/document.html). They have no + * `save` method, getters/setters, virtuals, or other Mongoose features. + * + * ####Example: + * + * new Query().lean() // true + * new Query().lean(true) + * new Query().lean(false) + * + * const docs = await Model.find().lean(); + * docs[0] instanceof mongoose.Document; // false + * + * [Lean is great for high-performance, read-only cases](/docs/tutorials/lean.html), + * especially when combined + * with [cursors](/docs/queries.html#streaming). + * + * If you need virtuals, getters/setters, or defaults with `lean()`, you need + * to use a plugin. See: + * + * - [mongoose-lean-virtuals](https://plugins.mongoosejs.io/plugins/lean-virtuals) + * - [mongoose-lean-getters](https://plugins.mongoosejs.io/plugins/lean-getters) + * - [mongoose-lean-defaults](https://www.npmjs.com/package/mongoose-lean-defaults) + * + * @param {Boolean|Object} bool defaults to true + * @return {Query} this + * @api public + */ + +Query.prototype.lean = function(v) { + this._mongooseOptions.lean = arguments.length ? v : true; + return this; +}; + +/** + * Adds a `$set` to this query's update without changing the operation. + * This is useful for query middleware so you can add an update regardless + * of whether you use `updateOne()`, `updateMany()`, `findOneAndUpdate()`, etc. + * + * ####Example: + * + * // Updates `{ $set: { updatedAt: new Date() } }` + * new Query().updateOne({}, {}).set('updatedAt', new Date()); + * new Query().updateMany({}, {}).set({ updatedAt: new Date() }); + * + * @param {String|Object} path path or object of key/value pairs to set + * @param {Any} [val] the value to set + * @return {Query} this + * @api public + */ + +Query.prototype.set = function(path, val) { + if (typeof path === 'object') { + const keys = Object.keys(path); + for (const key of keys) { + this.set(key, path[key]); + } + return this; + } + + this._update = this._update || {}; + this._update.$set = this._update.$set || {}; + this._update.$set[path] = val; + return this; +}; + +/** + * For update operations, returns the value of a path in the update's `$set`. + * Useful for writing getters/setters that can work with both update operations + * and `save()`. + * + * ####Example: + * + * const query = Model.updateOne({}, { $set: { name: 'Jean-Luc Picard' } }); + * query.get('name'); // 'Jean-Luc Picard' + * + * @param {String|Object} path path or object of key/value pairs to get + * @return {Query} this + * @api public + */ + +Query.prototype.get = function get(path) { + const update = this._update; + if (update == null) { + return void 0; + } + const $set = update.$set; + if ($set == null) { + return update[path]; + } + + if (utils.hasUserDefinedProperty(update, path)) { + return update[path]; + } + if (utils.hasUserDefinedProperty($set, path)) { + return $set[path]; + } + + return void 0; +}; + +/** + * Gets/sets the error flag on this query. If this flag is not null or + * undefined, the `exec()` promise will reject without executing. 
+ *
+ * ####Example:
+ *
+ *     Query().error(); // Get current error value
+ *     Query().error(null); // Unset the current error
+ *     Query().error(new Error('test')); // `exec()` will reject with test
+ *     Schema.pre('find', function() {
+ *       if (!this.getQuery().userId) {
+ *         this.error(new Error('Not allowed to query without setting userId'));
+ *       }
+ *     });
+ *
+ * Note that query casting runs **after** hooks, so cast errors will override
+ * custom errors.
+ *
+ * ####Example:
+ *
+ *     const TestSchema = new Schema({ num: Number });
+ *     const TestModel = db.model('Test', TestSchema);
+ *     TestModel.find({ num: 'not a number' }).error(new Error('woops')).exec(function(error) {
+ *       // `error` will be a cast error because `num` failed to cast
+ *     });
+ *
+ * @param {Error|null} err if set, `exec()` will fail fast before sending the query to MongoDB
+ * @return {Query} this
+ * @api public
+ */
+
+Query.prototype.error = function error(err) {
+  if (arguments.length === 0) {
+    return this._error;
+  }
+
+  this._error = err;
+  return this;
+};
+
+/*!
+ * ignore
+ */
+
+Query.prototype._unsetCastError = function _unsetCastError() {
+  if (this._error != null && !(this._error instanceof CastError)) {
+    return;
+  }
+  return this.error(null);
+};
+
+/**
+ * Getter/setter around the current mongoose-specific options for this query.
+ * Below are the current Mongoose-specific options.
+ *
+ * - `populate`: an array representing what paths will be populated. Should have one entry for each call to [`Query.prototype.populate()`](/docs/api.html#query_Query-populate)
+ * - `lean`: if truthy, Mongoose will not [hydrate](/docs/api.html#model_Model.hydrate) any documents that are returned from this query. See [`Query.prototype.lean()`](/docs/api.html#query_Query-lean) for more information.
+ * - `strict`: controls how Mongoose handles keys that aren't in the schema for updates. This option is `true` by default, which means Mongoose will silently strip any paths in the update that aren't in the schema. See the [`strict` mode docs](/docs/guide.html#strict) for more information.
+ * - `strictQuery`: controls how Mongoose handles keys that aren't in the schema for the query `filter`. This option is `false` by default for backwards compatibility, which means Mongoose will allow `Model.find({ foo: 'bar' })` even if `foo` is not in the schema. See the [`strictQuery` docs](/docs/guide.html#strictQuery) for more information.
+ * - `nearSphere`: use `$nearSphere` instead of `near()`. See the [`Query.prototype.nearSphere()` docs](/docs/api.html#query_Query-nearSphere)
+ *
+ * Mongoose maintains a separate object for internal options because
+ * Mongoose sends `Query.prototype.options` to the MongoDB server, and the
+ * above options are not relevant for the MongoDB server.
+ *
+ * @param {Object} options if specified, overwrites the current options
+ * @return {Object} the options
+ * @api public
+ */
+
+Query.prototype.mongooseOptions = function(v) {
+  if (arguments.length > 0) {
+    this._mongooseOptions = v;
+  }
+  return this._mongooseOptions;
+};
+
+/*!
+ * ignore + */ + +Query.prototype._castConditions = function() { + let sanitizeFilterOpt = undefined; + if (this.model != null && utils.hasUserDefinedProperty(this.model.db.options, 'sanitizeFilter')) { + sanitizeFilterOpt = this.model.db.options.sanitizeFilter; + } else if (this.model != null && utils.hasUserDefinedProperty(this.model.base.options, 'sanitizeFilter')) { + sanitizeFilterOpt = this.model.base.options.sanitizeFilter; + } else { + sanitizeFilterOpt = this._mongooseOptions.sanitizeFilter; + } + + if (sanitizeFilterOpt) { + sanitizeFilter(this._conditions); + } + + try { + this.cast(this.model); + this._unsetCastError(); + } catch (err) { + this.error(err); + } +}; + +/*! + * ignore + */ + +function _castArrayFilters(query) { + try { + castArrayFilters(query); + } catch (err) { + query.error(err); + } +} + +/** + * Thunk around find() + * + * @param {Function} [callback] + * @return {Query} this + * @api private + */ +Query.prototype._find = wrapThunk(function(callback) { + this._castConditions(); + + if (this.error() != null) { + callback(this.error()); + return null; + } + + callback = _wrapThunkCallback(this, callback); + + this._applyPaths(); + this._fields = this._castFields(this._fields); + + const fields = this._fieldsForExec(); + const mongooseOptions = this._mongooseOptions; + const _this = this; + const userProvidedFields = _this._userProvidedFields || {}; + + applyGlobalMaxTimeMS(this.options, this.model); + + // Separate options to pass down to `completeMany()` in case we need to + // set a session on the document + const completeManyOptions = Object.assign({}, { + session: get(this, 'options.session', null) + }); + + const cb = (err, docs) => { + if (err) { + return callback(err); + } + + if (docs.length === 0) { + return callback(null, docs); + } + if (this.options.explain) { + return callback(null, docs); + } + + if (!mongooseOptions.populate) { + return mongooseOptions.lean ? + callback(null, docs) : + completeMany(_this.model, docs, fields, userProvidedFields, completeManyOptions, callback); + } + + const pop = helpers.preparePopulationOptionsMQ(_this, mongooseOptions); + + if (mongooseOptions.lean) { + return _this.model.populate(docs, pop, callback); + } + + completeMany(_this.model, docs, fields, userProvidedFields, completeManyOptions, (err, docs) => { + if (err != null) { + return callback(err); + } + _this.model.populate(docs, pop, callback); + }); + }; + + const options = this._optionsForExec(); + options.projection = this._fieldsForExec(); + const filter = this._conditions; + + this._collection.collection.find(filter, options, (err, cursor) => { + if (err != null) { + return cb(err); + } + + if (options.explain) { + return cursor.explain(cb); + } + try { + return cursor.toArray(cb); + } catch (err) { + return cb(err); + } + }); +}); + +/** + * Find all documents that match `selector`. The result will be an array of documents. + * + * If there are too many documents in the result to fit in memory, use + * [`Query.prototype.cursor()`](api.html#query_Query-cursor) + * + * ####Example + * + * // Using async/await + * const arr = await Movie.find({ year: { $gte: 1980, $lte: 1989 } }); + * + * // Using callbacks + * Movie.find({ year: { $gte: 1980, $lte: 1989 } }, function(err, arr) {}); + * + * @param {Object|ObjectId} [filter] mongodb selector. If not specified, returns all documents. 
+ * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.find = function(conditions, callback) { + this.op = 'find'; + + if (typeof conditions === 'function') { + callback = conditions; + conditions = {}; + } + + conditions = utils.toObject(conditions); + + if (mquery.canMerge(conditions)) { + this.merge(conditions); + + prepareDiscriminatorCriteria(this); + } else if (conditions != null) { + this.error(new ObjectParameterError(conditions, 'filter', 'find')); + } + + // if we don't have a callback, then just return the query object + if (!callback) { + return Query.base.find.call(this); + } + + this.exec(callback); + + return this; +}; + +/** + * Merges another Query or conditions object into this one. + * + * When a Query is passed, conditions, field selection and options are merged. + * + * @param {Query|Object} source + * @return {Query} this + */ + +Query.prototype.merge = function(source) { + if (!source) { + return this; + } + + const opts = { overwrite: true }; + + if (source instanceof Query) { + // if source has a feature, apply it to ourselves + + if (source._conditions) { + utils.merge(this._conditions, source._conditions, opts); + } + + if (source._fields) { + this._fields || (this._fields = {}); + utils.merge(this._fields, source._fields, opts); + } + + if (source.options) { + this.options || (this.options = {}); + utils.merge(this.options, source.options, opts); + } + + if (source._update) { + this._update || (this._update = {}); + utils.mergeClone(this._update, source._update); + } + + if (source._distinct) { + this._distinct = source._distinct; + } + + utils.merge(this._mongooseOptions, source._mongooseOptions); + + return this; + } + + // plain object + utils.merge(this._conditions, source, opts); + + return this; +}; + +/** + * Adds a collation to this op (MongoDB 3.4 and up) + * + * @param {Object} value + * @return {Query} this + * @see MongoDB docs https://docs.mongodb.com/manual/reference/method/cursor.collation/#cursor.collation + * @api public + */ + +Query.prototype.collation = function(value) { + if (this.options == null) { + this.options = {}; + } + this.options.collation = value; + return this; +}; + +/** + * Hydrate a single doc from `findOne()`, `findOneAndUpdate()`, etc. + * + * @api private + */ + +Query.prototype._completeOne = function(doc, res, callback) { + if (!doc && !this.options.rawResult) { + return callback(null, null); + } + + const model = this.model; + const projection = utils.clone(this._fields); + const userProvidedFields = this._userProvidedFields || {}; + // `populate`, `lean` + const mongooseOptions = this._mongooseOptions; + // `rawResult` + const options = this.options; + + if (options.explain) { + return callback(null, doc); + } + + if (!mongooseOptions.populate) { + return mongooseOptions.lean ? 
+ _completeOneLean(doc, res, options, callback) : + completeOne(model, doc, res, options, projection, userProvidedFields, + null, callback); + } + + const pop = helpers.preparePopulationOptionsMQ(this, this._mongooseOptions); + if (mongooseOptions.lean) { + return model.populate(doc, pop, (err, doc) => { + if (err != null) { + return callback(err); + } + _completeOneLean(doc, res, options, callback); + }); + } + + completeOne(model, doc, res, options, projection, userProvidedFields, [], (err, doc) => { + if (err != null) { + return callback(err); + } + model.populate(doc, pop, callback); + }); +}; + +/** + * Thunk around findOne() + * + * @param {Function} [callback] + * @see findOne http://docs.mongodb.org/manual/reference/method/db.collection.findOne/ + * @api private + */ + +Query.prototype._findOne = wrapThunk(function(callback) { + this._castConditions(); + + if (this.error()) { + callback(this.error()); + return null; + } + + this._applyPaths(); + this._fields = this._castFields(this._fields); + + applyGlobalMaxTimeMS(this.options, this.model); + + // don't pass in the conditions because we already merged them in + Query.base.findOne.call(this, {}, (err, doc) => { + if (err) { + callback(err); + return null; + } + + this._completeOne(doc, null, _wrapThunkCallback(this, callback)); + }); +}); + +/** + * Declares the query a findOne operation. When executed, the first found document is passed to the callback. + * + * Passing a `callback` executes the query. The result of the query is a single document. + * + * * *Note:* `conditions` is optional, and if `conditions` is null or undefined, + * mongoose will send an empty `findOne` command to MongoDB, which will return + * an arbitrary document. If you're querying by `_id`, use `Model.findById()` + * instead. + * + * This function triggers the following middleware. + * + * - `findOne()` + * + * ####Example + * + * const query = Kitten.where({ color: 'white' }); + * query.findOne(function (err, kitten) { + * if (err) return handleError(err); + * if (kitten) { + * // doc may be null if no document matched + * } + * }); + * + * @param {Object} [filter] mongodb selector + * @param {Object} [projection] optional fields to return + * @param {Object} [options] see [`setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] optional params are (error, document) + * @return {Query} this + * @see findOne http://docs.mongodb.org/manual/reference/method/db.collection.findOne/ + * @see Query.select #query_Query-select + * @api public + */ + +Query.prototype.findOne = function(conditions, projection, options, callback) { + this.op = 'findOne'; + this._validateOp(); + + if (typeof conditions === 'function') { + callback = conditions; + conditions = null; + projection = null; + options = null; + } else if (typeof projection === 'function') { + callback = projection; + options = null; + projection = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } + + // make sure we don't send in the whole Document to merge() + conditions = utils.toObject(conditions); + + if (options) { + this.setOptions(options); + } + + if (projection) { + this.select(projection); + } + + if (mquery.canMerge(conditions)) { + this.merge(conditions); + + prepareDiscriminatorCriteria(this); + } else if (conditions != null) { + this.error(new ObjectParameterError(conditions, 'filter', 'findOne')); + } + + if (!callback) { + // already merged in the conditions, don't need to send them in. 
+ return Query.base.findOne.call(this); + } + + this.exec(callback); + return this; +}; + +/** + * Thunk around count() + * + * @param {Function} [callback] + * @see count http://docs.mongodb.org/manual/reference/method/db.collection.count/ + * @api private + */ + +Query.prototype._count = wrapThunk(function(callback) { + try { + this.cast(this.model); + } catch (err) { + this.error(err); + } + + if (this.error()) { + return callback(this.error()); + } + + applyGlobalMaxTimeMS(this.options, this.model); + + const conds = this._conditions; + const options = this._optionsForExec(); + + this._collection.count(conds, options, utils.tick(callback)); +}); + +/** + * Thunk around countDocuments() + * + * @param {Function} [callback] + * @see countDocuments http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#countDocuments + * @api private + */ + +Query.prototype._countDocuments = wrapThunk(function(callback) { + try { + this.cast(this.model); + } catch (err) { + this.error(err); + } + + if (this.error()) { + return callback(this.error()); + } + + applyGlobalMaxTimeMS(this.options, this.model); + + const conds = this._conditions; + const options = this._optionsForExec(); + + this._collection.collection.countDocuments(conds, options, utils.tick(callback)); +}); + +/** + * Thunk around estimatedDocumentCount() + * + * @param {Function} [callback] + * @see estimatedDocumentCount http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#estimatedDocumentCount + * @api private + */ + +Query.prototype._estimatedDocumentCount = wrapThunk(function(callback) { + if (this.error()) { + return callback(this.error()); + } + + const options = this._optionsForExec(); + + this._collection.collection.estimatedDocumentCount(options, utils.tick(callback)); +}); + +/** + * Specifies this query as a `count` query. + * + * This method is deprecated. If you want to count the number of documents in + * a collection, e.g. `count({})`, use the [`estimatedDocumentCount()` function](/docs/api.html#query_Query-estimatedDocumentCount) + * instead. Otherwise, use the [`countDocuments()`](/docs/api.html#query_Query-countDocuments) function instead. + * + * Passing a `callback` executes the query. + * + * This function triggers the following middleware. + * + * - `count()` + * + * ####Example: + * + * const countQuery = model.where({ 'color': 'black' }).count(); + * + * query.count({ color: 'black' }).count(callback) + * + * query.count({ color: 'black' }, callback) + * + * query.where('color', 'black').count(function (err, count) { + * if (err) return handleError(err); + * console.log('there are %d kittens', count); + * }) + * + * @deprecated + * @param {Object} [filter] count documents that match this object + * @param {Function} [callback] optional params are (error, count) + * @return {Query} this + * @see count http://docs.mongodb.org/manual/reference/method/db.collection.count/ + * @api public + */ + +Query.prototype.count = function(filter, callback) { + this.op = 'count'; + this._validateOp(); + if (typeof filter === 'function') { + callback = filter; + filter = undefined; + } + + filter = utils.toObject(filter); + + if (mquery.canMerge(filter)) { + this.merge(filter); + } + + if (!callback) { + return this; + } + + this.exec(callback); + + return this; +}; + +/** + * Specifies this query as a `estimatedDocumentCount()` query. 
Faster than + * using `countDocuments()` for large collections because + * `estimatedDocumentCount()` uses collection metadata rather than scanning + * the entire collection. + * + * `estimatedDocumentCount()` does **not** accept a filter. `Model.find({ foo: bar }).estimatedDocumentCount()` + * is equivalent to `Model.find().estimatedDocumentCount()` + * + * This function triggers the following middleware. + * + * - `estimatedDocumentCount()` + * + * ####Example: + * + * await Model.find().estimatedDocumentCount(); + * + * @param {Object} [options] passed transparently to the [MongoDB driver](http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#estimatedDocumentCount) + * @param {Function} [callback] optional params are (error, count) + * @return {Query} this + * @see estimatedDocumentCount http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#estimatedDocumentCount + * @api public + */ + +Query.prototype.estimatedDocumentCount = function(options, callback) { + this.op = 'estimatedDocumentCount'; + this._validateOp(); + if (typeof options === 'function') { + callback = options; + options = undefined; + } + + if (typeof options === 'object' && options != null) { + this.setOptions(options); + } + + if (!callback) { + return this; + } + + this.exec(callback); + + return this; +}; + +/** + * Specifies this query as a `countDocuments()` query. Behaves like `count()`, + * except it always does a full collection scan when passed an empty filter `{}`. + * + * There are also minor differences in how `countDocuments()` handles + * [`$where` and a couple geospatial operators](http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#countDocuments). + * versus `count()`. + * + * Passing a `callback` executes the query. + * + * This function triggers the following middleware. + * + * - `countDocuments()` + * + * ####Example: + * + * const countQuery = model.where({ 'color': 'black' }).countDocuments(); + * + * query.countDocuments({ color: 'black' }).count(callback); + * + * query.countDocuments({ color: 'black' }, callback); + * + * query.where('color', 'black').countDocuments(function(err, count) { + * if (err) return handleError(err); + * console.log('there are %d kittens', count); + * }); + * + * The `countDocuments()` function is similar to `count()`, but there are a + * [few operators that `countDocuments()` does not support](https://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#countDocuments). 
+ * Below are the operators that `count()` supports but `countDocuments()` does not, + * and the suggested replacement: + * + * - `$where`: [`$expr`](https://docs.mongodb.com/manual/reference/operator/query/expr/) + * - `$near`: [`$geoWithin`](https://docs.mongodb.com/manual/reference/operator/query/geoWithin/) with [`$center`](https://docs.mongodb.com/manual/reference/operator/query/center/#op._S_center) + * - `$nearSphere`: [`$geoWithin`](https://docs.mongodb.com/manual/reference/operator/query/geoWithin/) with [`$centerSphere`](https://docs.mongodb.com/manual/reference/operator/query/centerSphere/#op._S_centerSphere) + * + * @param {Object} [filter] mongodb selector + * @param {Function} [callback] optional params are (error, count) + * @return {Query} this + * @see countDocuments http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#countDocuments + * @api public + */ + +Query.prototype.countDocuments = function(conditions, callback) { + this.op = 'countDocuments'; + this._validateOp(); + if (typeof conditions === 'function') { + callback = conditions; + conditions = undefined; + } + + conditions = utils.toObject(conditions); + + if (mquery.canMerge(conditions)) { + this.merge(conditions); + } + + if (!callback) { + return this; + } + + this.exec(callback); + + return this; +}; + +/** + * Thunk around distinct() + * + * @param {Function} [callback] + * @see distinct http://docs.mongodb.org/manual/reference/method/db.collection.distinct/ + * @api private + */ + +Query.prototype.__distinct = wrapThunk(function __distinct(callback) { + this._castConditions(); + + if (this.error()) { + callback(this.error()); + return null; + } + + applyGlobalMaxTimeMS(this.options, this.model); + + const options = this._optionsForExec(); + + // don't pass in the conditions because we already merged them in + this._collection.collection. + distinct(this._distinct, this._conditions, options, callback); +}); + +/** + * Declares or executes a distinct() operation. + * + * Passing a `callback` executes the query. + * + * This function does not trigger any middleware. + * + * ####Example + * + * distinct(field, conditions, callback) + * distinct(field, conditions) + * distinct(field, callback) + * distinct(field) + * distinct(callback) + * distinct() + * + * @param {String} [field] + * @param {Object|Query} [filter] + * @param {Function} [callback] optional params are (error, arr) + * @return {Query} this + * @see distinct http://docs.mongodb.org/manual/reference/method/db.collection.distinct/ + * @api public + */ + +Query.prototype.distinct = function(field, conditions, callback) { + this.op = 'distinct'; + this._validateOp(); + if (!callback) { + if (typeof conditions === 'function') { + callback = conditions; + conditions = undefined; + } else if (typeof field === 'function') { + callback = field; + field = undefined; + conditions = undefined; + } + } + + conditions = utils.toObject(conditions); + + if (mquery.canMerge(conditions)) { + this.merge(conditions); + + prepareDiscriminatorCriteria(this); + } else if (conditions != null) { + this.error(new ObjectParameterError(conditions, 'filter', 'distinct')); + } + + if (field != null) { + this._distinct = field; + } + + if (callback != null) { + this.exec(callback); + } + + return this; +}; + +/** + * Sets the sort order + * + * If an object is passed, values allowed are `asc`, `desc`, `ascending`, `descending`, `1`, and `-1`. + * + * If a string is passed, it must be a space delimited list of path names. 
The + * sort order of each path is ascending unless the path name is prefixed with `-` + * which will be treated as descending. + * + * ####Example + * + * // sort by "field" ascending and "test" descending + * query.sort({ field: 'asc', test: -1 }); + * + * // equivalent + * query.sort('field -test'); + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @param {Object|String} arg + * @return {Query} this + * @see cursor.sort http://docs.mongodb.org/manual/reference/method/cursor.sort/ + * @api public + */ + +Query.prototype.sort = function(arg) { + if (arguments.length > 1) { + throw new Error('sort() only takes 1 Argument'); + } + + return Query.base.sort.call(this, arg); +}; + +/** + * Declare and/or execute this query as a remove() operation. `remove()` is + * deprecated, you should use [`deleteOne()`](#query_Query-deleteOne) + * or [`deleteMany()`](#query_Query-deleteMany) instead. + * + * This function does not trigger any middleware + * + * ####Example + * + * Character.remove({ name: /Stark/ }, callback); + * + * This function calls the MongoDB driver's [`Collection#remove()` function](http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#remove). + * The returned [promise](https://mongoosejs.com/docs/queries.html) resolves to an + * object that contains 3 properties: + * + * - `ok`: `1` if no errors occurred + * - `deletedCount`: the number of documents deleted + * - `n`: the number of documents deleted. Equal to `deletedCount`. + * + * ####Example + * + * const res = await Character.remove({ name: /Stark/ }); + * // Number of docs deleted + * res.deletedCount; + * + * ####Note + * + * Calling `remove()` creates a [Mongoose query](./queries.html), and a query + * does not execute until you either pass a callback, call [`Query#then()`](#query_Query-then), + * or call [`Query#exec()`](#query_Query-exec). + * + * // not executed + * const query = Character.remove({ name: /Stark/ }); + * + * // executed + * Character.remove({ name: /Stark/ }, callback); + * Character.remove({ name: /Stark/ }).remove(callback); + * + * // executed without a callback + * Character.exec(); + * + * @param {Object|Query} [filter] mongodb selector + * @param {Function} [callback] optional params are (error, mongooseDeleteResult) + * @return {Query} this + * @deprecated + * @see deleteWriteOpResult http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#~deleteWriteOpResult + * @see MongoDB driver remove http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#remove + * @api public + */ + +Query.prototype.remove = function(filter, callback) { + this.op = 'remove'; + if (typeof filter === 'function') { + callback = filter; + filter = null; + } + + filter = utils.toObject(filter); + + if (mquery.canMerge(filter)) { + this.merge(filter); + + prepareDiscriminatorCriteria(this); + } else if (filter != null) { + this.error(new ObjectParameterError(filter, 'filter', 'remove')); + } + + if (!callback) { + return Query.base.remove.call(this); + } + + this.exec(callback); + return this; +}; + +/*! + * ignore + */ + +Query.prototype._remove = wrapThunk(function(callback) { + this._castConditions(); + + if (this.error() != null) { + callback(this.error()); + return this; + } + + callback = _wrapThunkCallback(this, callback); + + return Query.base.remove.call(this, helpers.handleDeleteWriteOpResult(callback)); +}); + +/** + * Declare and/or execute this query as a `deleteOne()` operation. 
Works like + * remove, except it deletes at most one document regardless of the `single` + * option. + * + * This function triggers `deleteOne` middleware. + * + * ####Example + * + * await Character.deleteOne({ name: 'Eddard Stark' }); + * + * // Using callbacks: + * Character.deleteOne({ name: 'Eddard Stark' }, callback); + * + * This function calls the MongoDB driver's [`Collection#deleteOne()` function](http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#deleteOne). + * The returned [promise](https://mongoosejs.com/docs/queries.html) resolves to an + * object that contains 3 properties: + * + * - `ok`: `1` if no errors occurred + * - `deletedCount`: the number of documents deleted + * - `n`: the number of documents deleted. Equal to `deletedCount`. + * + * ####Example + * + * const res = await Character.deleteOne({ name: 'Eddard Stark' }); + * // `1` if MongoDB deleted a doc, `0` if no docs matched the filter `{ name: ... }` + * res.deletedCount; + * + * @param {Object|Query} [filter] mongodb selector + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] optional params are (error, mongooseDeleteResult) + * @return {Query} this + * @see deleteWriteOpResult http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#~deleteWriteOpResult + * @see MongoDB Driver deleteOne http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#deleteOne + * @api public + */ + +Query.prototype.deleteOne = function(filter, options, callback) { + this.op = 'deleteOne'; + if (typeof filter === 'function') { + callback = filter; + filter = null; + options = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } else { + this.setOptions(options); + } + + filter = utils.toObject(filter); + + if (mquery.canMerge(filter)) { + this.merge(filter); + + prepareDiscriminatorCriteria(this); + } else if (filter != null) { + this.error(new ObjectParameterError(filter, 'filter', 'deleteOne')); + } + + if (!callback) { + return Query.base.deleteOne.call(this); + } + + this.exec.call(this, callback); + + return this; +}; + +/*! + * Internal thunk for `deleteOne()` + */ + +Query.prototype._deleteOne = wrapThunk(function(callback) { + this._castConditions(); + + if (this.error() != null) { + callback(this.error()); + return this; + } + + callback = _wrapThunkCallback(this, callback); + + return Query.base.deleteOne.call(this, helpers.handleDeleteWriteOpResult(callback)); +}); + +/** + * Declare and/or execute this query as a `deleteMany()` operation. Works like + * remove, except it deletes _every_ document that matches `filter` in the + * collection, regardless of the value of `single`. + * + * This function triggers `deleteMany` middleware. + * + * ####Example + * + * await Character.deleteMany({ name: /Stark/, age: { $gte: 18 } }); + * + * // Using callbacks: + * Character.deleteMany({ name: /Stark/, age: { $gte: 18 } }, callback); + * + * This function calls the MongoDB driver's [`Collection#deleteMany()` function](http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#deleteMany). + * The returned [promise](https://mongoosejs.com/docs/queries.html) resolves to an + * object that contains 3 properties: + * + * - `ok`: `1` if no errors occurred + * - `deletedCount`: the number of documents deleted + * - `n`: the number of documents deleted. Equal to `deletedCount`. 
+ * + * ####Example + * + * const res = await Character.deleteMany({ name: /Stark/, age: { $gte: 18 } }); + * // `0` if no docs matched the filter, number of docs deleted otherwise + * res.deletedCount; + * + * @param {Object|Query} [filter] mongodb selector + * @param {Object} [options] optional see [`Query.prototype.setOptions()`](http://mongoosejs.com/docs/api.html#query_Query-setOptions) + * @param {Function} [callback] optional params are (error, mongooseDeleteResult) + * @return {Query} this + * @see deleteWriteOpResult http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#~deleteWriteOpResult + * @see MongoDB Driver deleteMany http://mongodb.github.io/node-mongodb-native/3.1/api/Collection.html#deleteMany + * @api public + */ + +Query.prototype.deleteMany = function(filter, options, callback) { + this.op = 'deleteMany'; + if (typeof filter === 'function') { + callback = filter; + filter = null; + options = null; + } else if (typeof options === 'function') { + callback = options; + options = null; + } else { + this.setOptions(options); + } + + filter = utils.toObject(filter); + + if (mquery.canMerge(filter)) { + this.merge(filter); + + prepareDiscriminatorCriteria(this); + } else if (filter != null) { + this.error(new ObjectParameterError(filter, 'filter', 'deleteMany')); + } + + if (!callback) { + return Query.base.deleteMany.call(this); + } + + this.exec.call(this, callback); + + return this; +}; + +/*! + * Internal thunk around `deleteMany()` + */ + +Query.prototype._deleteMany = wrapThunk(function(callback) { + this._castConditions(); + + if (this.error() != null) { + callback(this.error()); + return this; + } + + callback = _wrapThunkCallback(this, callback); + + return Query.base.deleteMany.call(this, helpers.handleDeleteWriteOpResult(callback)); +}); + +/*! + * hydrates a document + * + * @param {Model} model + * @param {Document} doc + * @param {Object} res 3rd parameter to callback + * @param {Object} fields + * @param {Query} self + * @param {Array} [pop] array of paths used in population + * @param {Function} callback + */ + +function completeOne(model, doc, res, options, fields, userProvidedFields, pop, callback) { + const opts = pop ? + { populated: pop } + : undefined; + + if (options.rawResult && doc == null) { + _init(null); + return null; + } + + const casted = helpers.createModel(model, doc, fields, userProvidedFields, options); + try { + casted.$init(doc, opts, _init); + } catch (error) { + _init(error); + } + + function _init(err) { + if (err) { + return immediate(() => callback(err)); + } + + + if (options.rawResult) { + if (doc && casted) { + if (options.session != null) { + casted.$session(options.session); + } + res.value = casted; + } else { + res.value = null; + } + return immediate(() => callback(null, res)); + } + if (options.session != null) { + casted.$session(options.session); + } + immediate(() => callback(null, casted)); + } +} + +/*! + * If the model is a discriminator type and not root, then add the key & value to the criteria. + */ + +function prepareDiscriminatorCriteria(query) { + if (!query || !query.model || !query.model.schema) { + return; + } + + const schema = query.model.schema; + + if (schema && schema.discriminatorMapping && !schema.discriminatorMapping.isRoot) { + query._conditions[schema.discriminatorMapping.key] = schema.discriminatorMapping.value; + } +} + +/** + * Issues a mongodb [findAndModify](http://www.mongodb.org/display/DOCS/findAndModify+Command) update command. 
+ * + * Finds a matching document, updates it according to the `update` arg, passing any `options`, and returns the found + * document (if any) to the callback. The query executes if + * `callback` is passed. + * + * This function triggers the following middleware. + * + * - `findOneAndUpdate()` + * + * ####Available options + * + * - `new`: bool - if true, return the modified document rather than the original. defaults to false (changed in 4.0) + * - `upsert`: bool - creates the object if it doesn't exist. defaults to false. + * - `fields`: {Object|String} - Field selection. Equivalent to `.select(fields).findOneAndUpdate()` + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `runValidators`: if true, runs [update validators](/docs/validation.html#update-validators) on this command. Update validators validate the update operation against the model's schema. + * - `setDefaultsOnInsert`: `true` by default. If `setDefaultsOnInsert` and `upsert` are true, mongoose will apply the [defaults](http://mongoosejs.com/docs/defaults.html) specified in the model's schema if a new document is created. + * - `rawResult`: if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * + * ####Callback Signature + * function(error, doc) { + * // error: any errors that occurred + * // doc: the document before updates are applied if `new: false`, or after updates if `new = true` + * } + * + * ####Examples + * + * query.findOneAndUpdate(conditions, update, options, callback) // executes + * query.findOneAndUpdate(conditions, update, options) // returns Query + * query.findOneAndUpdate(conditions, update, callback) // executes + * query.findOneAndUpdate(conditions, update) // returns Query + * query.findOneAndUpdate(update, callback) // returns Query + * query.findOneAndUpdate(update) // returns Query + * query.findOneAndUpdate(callback) // executes + * query.findOneAndUpdate() // returns Query + * + * @method findOneAndUpdate + * @memberOf Query + * @instance + * @param {Object|Query} [filter] + * @param {Object} [doc] + * @param {Object} [options] + * @param {Boolean} [options.rawResult] if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). + * @param {Boolean} [options.multipleCastError] by default, mongoose only returns the first error that occurred in casting the query. Turn on this option to aggregate all the cast errors. + * @param {Boolean} [options.new=false] By default, `findOneAndUpdate()` returns the document as it was **before** `update` was applied. If you set `new: true`, `findOneAndUpdate()` will instead give you the object after `update` was applied. + * @param {Object} [options.lean] if truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. See [`Query.lean()`](/docs/api.html#query_Query-lean) and [the Mongoose lean tutorial](/docs/tutorials/lean.html). + * @param {ClientSession} [options.session=null] The session associated with this query. 
See [transactions docs](/docs/transactions.html). + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Boolean} [options.returnOriginal=null] An alias for the `new` option. `returnOriginal: false` is equivalent to `new: true`. + * @param {Function} [callback] optional params are (error, doc), _unless_ `rawResult` is used, in which case params are (error, writeOpResult) + * @see Tutorial /docs/tutorials/findoneandupdate.html + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @return {Query} this + * @api public + */ + +Query.prototype.findOneAndUpdate = function(criteria, doc, options, callback) { + this.op = 'findOneAndUpdate'; + this._validateOp(); + this._validate(); + + switch (arguments.length) { + case 3: + if (typeof options === 'function') { + callback = options; + options = {}; + } + break; + case 2: + if (typeof doc === 'function') { + callback = doc; + doc = criteria; + criteria = undefined; + } + options = undefined; + break; + case 1: + if (typeof criteria === 'function') { + callback = criteria; + criteria = options = doc = undefined; + } else { + doc = criteria; + criteria = options = undefined; + } + } + + if (mquery.canMerge(criteria)) { + this.merge(criteria); + } + + // apply doc + if (doc) { + this._mergeUpdate(doc); + } + + options = options ? utils.clone(options) : {}; + + if (options.projection) { + this.select(options.projection); + delete options.projection; + } + if (options.fields) { + this.select(options.fields); + delete options.fields; + } + + + const returnOriginal = get(this, 'model.base.options.returnOriginal'); + if (options.new == null && options.returnDocument == null && options.returnOriginal == null && returnOriginal != null) { + options.returnOriginal = returnOriginal; + } + + this.setOptions(options); + + if (!callback) { + return this; + } + + this.exec(callback); + + return this; +}; + +/*! + * Thunk around findOneAndUpdate() + * + * @param {Function} [callback] + * @api private + */ + +Query.prototype._findOneAndUpdate = wrapThunk(function(callback) { + if (this.error() != null) { + return callback(this.error()); + } + + this._findAndModify('update', callback); +}); + +/** + * Issues a mongodb [findAndModify](http://www.mongodb.org/display/DOCS/findAndModify+Command) remove command. + * + * Finds a matching document, removes it, passing the found document (if any) to + * the callback. Executes if `callback` is passed. + * + * This function triggers the following middleware. 
+ * + * - `findOneAndRemove()` + * + * ####Available options + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `rawResult`: if true, resolves to the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * + * ####Callback Signature + * function(error, doc) { + * // error: any errors that occurred + * // doc: the document before updates are applied if `new: false`, or after updates if `new = true` + * } + * + * ####Examples + * + * A.where().findOneAndRemove(conditions, options, callback) // executes + * A.where().findOneAndRemove(conditions, options) // return Query + * A.where().findOneAndRemove(conditions, callback) // executes + * A.where().findOneAndRemove(conditions) // returns Query + * A.where().findOneAndRemove(callback) // executes + * A.where().findOneAndRemove() // returns Query + * + * @method findOneAndRemove + * @memberOf Query + * @instance + * @param {Object} [conditions] + * @param {Object} [options] + * @param {Boolean} [options.rawResult] if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Function} [callback] optional params are (error, document) + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + * @api public + */ + +Query.prototype.findOneAndRemove = function(conditions, options, callback) { + this.op = 'findOneAndRemove'; + this._validateOp(); + this._validate(); + + switch (arguments.length) { + case 2: + if (typeof options === 'function') { + callback = options; + options = {}; + } + break; + case 1: + if (typeof conditions === 'function') { + callback = conditions; + conditions = undefined; + options = undefined; + } + break; + } + + if (mquery.canMerge(conditions)) { + this.merge(conditions); + } + + options && this.setOptions(options); + + if (!callback) { + return this; + } + + this.exec(callback); + + return this; +}; + +/** + * Issues a MongoDB [findOneAndDelete](https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndDelete/) command. + * + * Finds a matching document, removes it, and passes the found document (if any) + * to the callback. Executes if `callback` is passed. + * + * This function triggers the following middleware. + * + * - `findOneAndDelete()` + * + * This function differs slightly from `Model.findOneAndRemove()` in that + * `findOneAndRemove()` becomes a [MongoDB `findAndModify()` command](https://docs.mongodb.com/manual/reference/method/db.collection.findAndModify/), + * as opposed to a `findOneAndDelete()` command. For most mongoose use cases, + * this distinction is purely pedantic. You should use `findOneAndDelete()` + * unless you have a good reason not to. 
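+ *
+ * For illustration, a hedged usage sketch (the `Character` model name is an
+ * assumption, not part of these docs):
+ *
+ *     // Deletes at most one matching document and resolves with it,
+ *     // or with `null` if nothing matched the filter.
+ *     const deleted = await Character.findOneAndDelete({ name: 'Eddard Stark' });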
+ *
+ * ####Available options
+ *
+ * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update
+ * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0
+ * - `rawResult`: if true, resolves to the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify)
+ *
+ * ####Callback Signature
+ *     function(error, doc) {
+ *       // error: any errors that occurred
+ *       // doc: the document before updates are applied if `new: false`, or after updates if `new: true`
+ *     }
+ *
+ * ####Examples
+ *
+ *     A.where().findOneAndDelete(conditions, options, callback) // executes
+ *     A.where().findOneAndDelete(conditions, options)  // returns Query
+ *     A.where().findOneAndDelete(conditions, callback) // executes
+ *     A.where().findOneAndDelete(conditions) // returns Query
+ *     A.where().findOneAndDelete(callback)   // executes
+ *     A.where().findOneAndDelete()           // returns Query
+ *
+ * @method findOneAndDelete
+ * @memberOf Query
+ * @param {Object} [conditions]
+ * @param {Object} [options]
+ * @param {Boolean} [options.rawResult] if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify)
+ * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html).
+ * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict)
+ * @param {Function} [callback] optional params are (error, document)
+ * @return {Query} this
+ * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command
+ * @api public
+ */
+
+Query.prototype.findOneAndDelete = function(conditions, options, callback) {
+  this.op = 'findOneAndDelete';
+  this._validateOp();
+  this._validate();
+
+  switch (arguments.length) {
+    case 2:
+      if (typeof options === 'function') {
+        callback = options;
+        options = {};
+      }
+      break;
+    case 1:
+      if (typeof conditions === 'function') {
+        callback = conditions;
+        conditions = undefined;
+        options = undefined;
+      }
+      break;
+  }
+
+  if (mquery.canMerge(conditions)) {
+    this.merge(conditions);
+  }
+
+  options && this.setOptions(options);
+
+  if (!callback) {
+    return this;
+  }
+
+  this.exec(callback);
+
+  return this;
+};
+
+/*!
+ * Thunk around findOneAndDelete()
+ *
+ * @param {Function} [callback]
+ * @return {Query} this
+ * @api private
+ */
+Query.prototype._findOneAndDelete = wrapThunk(function(callback) {
+  this._castConditions();
+
+  if (this.error() != null) {
+    callback(this.error());
+    return null;
+  }
+
+  const filter = this._conditions;
+  const options = this._optionsForExec();
+  let fields = null;
+
+  if (this._fields != null) {
+    options.projection = this._castFields(utils.clone(this._fields));
+    fields = options.projection;
+    if (fields instanceof Error) {
+      callback(fields);
+      return null;
+    }
+  }
+
+  this._collection.collection.findOneAndDelete(filter, options, _wrapThunkCallback(this, (err, res) => {
+    if (err) {
+      return callback(err);
+    }
+
+    const doc = res.value;
+
+    return this._completeOne(doc, res, callback);
+  }));
+});
+
+/**
+ * Issues a MongoDB [findOneAndReplace](https://docs.mongodb.com/manual/reference/method/db.collection.findOneAndReplace/) command.
+ *
+ * Finds a matching document, replaces it, and passes the found document (if any)
+ * to the callback. Executes if `callback` is passed.
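+ *
+ * A hedged usage sketch for comparison (again assuming a `Character` model):
+ *
+ *     // The replacement document is applied wholesale and may not contain
+ *     // atomic operators such as `$set`.
+ *     const doc = await Character.findOneAndReplace(
+ *       { name: 'Eddard Stark' },
+ *       { name: 'Ned Stark' },
+ *       { new: true } // resolve with the replacement instead of the original
+ *     );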
+ * + * This function triggers the following middleware. + * + * - `findOneAndReplace()` + * + * ####Available options + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * - `maxTimeMS`: puts a time limit on the query - requires mongodb >= 2.6.0 + * - `rawResult`: if true, resolves to the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * + * ####Callback Signature + * function(error, doc) { + * // error: any errors that occurred + * // doc: the document before updates are applied if `new: false`, or after updates if `new = true` + * } + * + * ####Examples + * + * A.where().findOneAndReplace(filter, replacement, options, callback); // executes + * A.where().findOneAndReplace(filter, replacement, options); // return Query + * A.where().findOneAndReplace(filter, replacement, callback); // executes + * A.where().findOneAndReplace(filter); // returns Query + * A.where().findOneAndReplace(callback); // executes + * A.where().findOneAndReplace(); // returns Query + * + * @method findOneAndReplace + * @memberOf Query + * @param {Object} [filter] + * @param {Object} [replacement] + * @param {Object} [options] + * @param {Boolean} [options.rawResult] if true, returns the [raw result from the MongoDB driver](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#findAndModify) + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.new=false] By default, `findOneAndUpdate()` returns the document as it was **before** `update` was applied. If you set `new: true`, `findOneAndUpdate()` will instead give you the object after `update` was applied. + * @param {Object} [options.lean] if truthy, mongoose will return the document as a plain JavaScript object rather than a mongoose document. See [`Query.lean()`](/docs/api.html#query_Query-lean) and [the Mongoose lean tutorial](/docs/tutorials/lean.html). + * @param {ClientSession} [options.session=null] The session associated with this query. See [transactions docs](/docs/transactions.html). + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. + * @param {Boolean} [options.returnOriginal=null] An alias for the `new` option. `returnOriginal: false` is equivalent to `new: true`. 
+ * @param {Function} [callback] optional params are (error, document) + * @return {Query} this + * @api public + */ + +Query.prototype.findOneAndReplace = function(filter, replacement, options, callback) { + this.op = 'findOneAndReplace'; + this._validateOp(); + this._validate(); + + switch (arguments.length) { + case 3: + if (typeof options === 'function') { + callback = options; + options = void 0; + } + break; + case 2: + if (typeof replacement === 'function') { + callback = replacement; + replacement = void 0; + } + break; + case 1: + if (typeof filter === 'function') { + callback = filter; + filter = void 0; + replacement = void 0; + options = void 0; + } + break; + } + + if (mquery.canMerge(filter)) { + this.merge(filter); + } + + if (replacement != null) { + if (hasDollarKeys(replacement)) { + throw new Error('The replacement document must not contain atomic operators.'); + } + this._mergeUpdate(replacement); + } + + options = options || {}; + + const returnOriginal = get(this, 'model.base.options.returnOriginal'); + if (options.new == null && options.returnDocument == null && options.returnOriginal == null && returnOriginal != null) { + options.returnOriginal = returnOriginal; + } + this.setOptions(options); + this.setOptions({ overwrite: true }); + + if (!callback) { + return this; + } + + this.exec(callback); + + return this; +}; + +/*! + * Thunk around findOneAndReplace() + * + * @param {Function} [callback] + * @return {Query} this + * @api private + */ +Query.prototype._findOneAndReplace = wrapThunk(function(callback) { + this._castConditions(); + + if (this.error() != null) { + callback(this.error()); + return null; + } + + const filter = this._conditions; + const options = this._optionsForExec(); + convertNewToReturnDocument(options); + let fields = null; + + let castedDoc = new this.model(this._update, null, true); + this._update = castedDoc; + + this._applyPaths(); + if (this._fields != null) { + options.projection = this._castFields(utils.clone(this._fields)); + fields = options.projection; + if (fields instanceof Error) { + callback(fields); + return null; + } + } + + castedDoc.$validate(err => { + if (err != null) { + return callback(err); + } + + if (castedDoc.toBSON) { + castedDoc = castedDoc.toBSON(); + } + + this._collection.collection.findOneAndReplace(filter, castedDoc, options, _wrapThunkCallback(this, (err, res) => { + if (err) { + return callback(err); + } + + const doc = res.value; + + return this._completeOne(doc, res, callback); + })); + }); +}); + +/*! + * Support the `new` option as an alternative to `returnOriginal` for backwards + * compat. + */ + +function convertNewToReturnDocument(options) { + if ('new' in options) { + options.returnDocument = options['new'] ? 'after' : 'before'; + delete options['new']; + } + if ('returnOriginal' in options) { + options.returnDocument = options['returnOriginal'] ? 'before' : 'after'; + delete options['returnOriginal']; + } + // Temporary since driver 4.0.0-beta does not support `returnDocument` + if (typeof options.returnDocument === 'string') { + options.returnOriginal = options.returnDocument === 'before'; + } +} + +/*! + * Thunk around findOneAndRemove() + * + * @param {Function} [callback] + * @return {Query} this + * @api private + */ +Query.prototype._findOneAndRemove = wrapThunk(function(callback) { + if (this.error() != null) { + callback(this.error()); + return; + } + + this._findAndModify('remove', callback); +}); + +/*! + * Get options from query opts, falling back to the base mongoose object. 
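+ * For example, `runValidators` can be enabled per query via `setOptions()` or
+ * globally on the Mongoose instance; when both are set, the per-query value wins.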
+ */ + +function _getOption(query, option, def) { + const opts = query._optionsForExec(query.model); + + if (option in opts) { + return opts[option]; + } + if (option in query.model.base.options) { + return query.model.base.options[option]; + } + return def; +} + +/*! + * Override mquery.prototype._findAndModify to provide casting etc. + * + * @param {String} type - either "remove" or "update" + * @param {Function} callback + * @api private + */ + +Query.prototype._findAndModify = function(type, callback) { + if (typeof callback !== 'function') { + throw new Error('Expected callback in _findAndModify'); + } + + const model = this.model; + const schema = model.schema; + const _this = this; + let fields; + + const castedQuery = castQuery(this); + if (castedQuery instanceof Error) { + return callback(castedQuery); + } + + _castArrayFilters(this); + + const opts = this._optionsForExec(model); + + if ('strict' in opts) { + this._mongooseOptions.strict = opts.strict; + } + + const isOverwriting = this.options.overwrite && !hasDollarKeys(this._update); + if (isOverwriting) { + this._update = new this.model(this._update, null, true); + } + + if (type === 'remove') { + opts.remove = true; + } else { + if (!('new' in opts) && !('returnOriginal' in opts) && !('returnDocument' in opts)) { + opts.new = false; + } + if (!('upsert' in opts)) { + opts.upsert = false; + } + if (opts.upsert || opts['new']) { + opts.remove = false; + } + + if (!isOverwriting) { + this._update = castDoc(this, opts.overwrite); + const _opts = Object.assign({}, opts, { + setDefaultsOnInsert: this._mongooseOptions.setDefaultsOnInsert + }); + this._update = setDefaultsOnInsert(this._conditions, schema, this._update, _opts); + if (!this._update || Object.keys(this._update).length === 0) { + if (opts.upsert) { + // still need to do the upsert to empty doc + const doc = utils.clone(castedQuery); + delete doc._id; + this._update = { $set: doc }; + } else { + this._executionStack = null; + this.findOne(callback); + return this; + } + } else if (this._update instanceof Error) { + return callback(this._update); + } else { + // In order to make MongoDB 2.6 happy (see + // https://jira.mongodb.org/browse/SERVER-12266 and related issues) + // if we have an actual update document but $set is empty, junk the $set. + if (this._update.$set && Object.keys(this._update.$set).length === 0) { + delete this._update.$set; + } + } + } + + if (Array.isArray(opts.arrayFilters)) { + opts.arrayFilters = removeUnusedArrayFilters(this._update, opts.arrayFilters); + } + } + + this._applyPaths(); + + if (this._fields) { + fields = utils.clone(this._fields); + opts.projection = this._castFields(fields); + if (opts.projection instanceof Error) { + return callback(opts.projection); + } + } + + if (opts.sort) convertSortToArray(opts); + + const cb = function(err, doc, res) { + if (err) { + return callback(err); + } + + _this._completeOne(doc, res, callback); + }; + + const runValidators = _getOption(this, 'runValidators', false); + + // Bypass mquery + const collection = _this._collection.collection; + convertNewToReturnDocument(opts); + + if (type === 'remove') { + collection.findOneAndDelete(castedQuery, opts, _wrapThunkCallback(_this, function(error, res) { + return cb(error, res ? res.value : res, res); + })); + + return this; + } + + // honors legacy overwrite option for backward compatibility + const updateMethod = isOverwriting ? 
'findOneAndReplace' : 'findOneAndUpdate'; + + if (runValidators) { + this.validate(this._update, opts, isOverwriting, error => { + if (error) { + return callback(error); + } + if (this._update && this._update.toBSON) { + this._update = this._update.toBSON(); + } + + collection[updateMethod](castedQuery, this._update, opts, _wrapThunkCallback(_this, function(error, res) { + return cb(error, res ? res.value : res, res); + })); + }); + } else { + if (this._update && this._update.toBSON) { + this._update = this._update.toBSON(); + } + collection[updateMethod](castedQuery, this._update, opts, _wrapThunkCallback(_this, function(error, res) { + return cb(error, res ? res.value : res, res); + })); + } + + return this; +}; + +/*! + * ignore + */ + +function _completeOneLean(doc, res, opts, callback) { + if (opts.rawResult) { + return callback(null, res); + } + return callback(null, doc); +} + +/*! + * Override mquery.prototype._mergeUpdate to handle mongoose objects in + * updates. + * + * @param {Object} doc + * @api private + */ + +Query.prototype._mergeUpdate = function(doc) { + if (doc == null || (typeof doc === 'object' && Object.keys(doc).length === 0)) { + return; + } + + if (!this._update) { + this._update = Array.isArray(doc) ? [] : {}; + } + if (doc instanceof Query) { + if (Array.isArray(this._update)) { + throw new Error('Cannot mix array and object updates'); + } + if (doc._update) { + utils.mergeClone(this._update, doc._update); + } + } else if (Array.isArray(doc)) { + if (!Array.isArray(this._update)) { + throw new Error('Cannot mix array and object updates'); + } + this._update = this._update.concat(doc); + } else { + if (Array.isArray(this._update)) { + throw new Error('Cannot mix array and object updates'); + } + utils.mergeClone(this._update, doc); + } +}; + +/*! + * The mongodb driver 1.3.23 only supports the nested array sort + * syntax. We must convert it or sorting findAndModify will not work. + */ + +function convertSortToArray(opts) { + if (Array.isArray(opts.sort)) { + return; + } + if (!utils.isObject(opts.sort)) { + return; + } + + const sort = []; + + for (const key in opts.sort) { + if (utils.object.hasOwnProperty(opts.sort, key)) { + sort.push([key, opts.sort[key]]); + } + } + + opts.sort = sort; +} + +/*! + * ignore + */ + +function _updateThunk(op, callback) { + this._castConditions(); + + _castArrayFilters(this); + + if (this.error() != null) { + callback(this.error()); + return null; + } + + callback = _wrapThunkCallback(this, callback); + + const castedQuery = this._conditions; + const options = this._optionsForExec(this.model); + + this._update = utils.clone(this._update, options); + const isOverwriting = this.options.overwrite && !hasDollarKeys(this._update); + if (isOverwriting) { + if (op === 'updateOne' || op === 'updateMany') { + return callback(new MongooseError('The MongoDB server disallows ' + + 'overwriting documents using `' + op + '`. 
See: ' + + 'https://mongoosejs.com/docs/deprecations.html#update')); + } + this._update = new this.model(this._update, null, true); + } else { + this._update = castDoc(this, options.overwrite); + + if (this._update instanceof Error) { + callback(this._update); + return null; + } + + if (this._update == null || Object.keys(this._update).length === 0) { + callback(null, { acknowledged: false }); + return null; + } + + const _opts = Object.assign({}, options, { + setDefaultsOnInsert: this._mongooseOptions.setDefaultsOnInsert + }); + this._update = setDefaultsOnInsert(this._conditions, this.model.schema, + this._update, _opts); + } + + if (Array.isArray(options.arrayFilters)) { + options.arrayFilters = removeUnusedArrayFilters(this._update, options.arrayFilters); + } + + const runValidators = _getOption(this, 'runValidators', false); + if (runValidators) { + this.validate(this._update, options, isOverwriting, err => { + if (err) { + return callback(err); + } + + if (this._update.toBSON) { + this._update = this._update.toBSON(); + } + this._collection[op](castedQuery, this._update, options, callback); + }); + return null; + } + + if (this._update.toBSON) { + this._update = this._update.toBSON(); + } + + this._collection[op](castedQuery, this._update, options, callback); + return null; +} + +/*! + * Mongoose calls this function internally to validate the query if + * `runValidators` is set + * + * @param {Object} castedDoc the update, after casting + * @param {Object} options the options from `_optionsForExec()` + * @param {Function} callback + * @api private + */ + +Query.prototype.validate = function validate(castedDoc, options, isOverwriting, callback) { + return promiseOrCallback(callback, cb => { + try { + if (isOverwriting) { + castedDoc.$validate(cb); + } else { + updateValidators(this, this.model.schema, castedDoc, options, cb); + } + } catch (err) { + immediate(function() { + cb(err); + }); + } + }); +}; + +/*! + * Internal thunk for .update() + * + * @param {Function} callback + * @see Model.update #model_Model.update + * @api private + */ +Query.prototype._execUpdate = wrapThunk(function(callback) { + return _updateThunk.call(this, 'update', callback); +}); + +/*! + * Internal thunk for .updateMany() + * + * @param {Function} callback + * @see Model.update #model_Model.update + * @api private + */ +Query.prototype._updateMany = wrapThunk(function(callback) { + return _updateThunk.call(this, 'updateMany', callback); +}); + +/*! + * Internal thunk for .updateOne() + * + * @param {Function} callback + * @see Model.update #model_Model.update + * @api private + */ +Query.prototype._updateOne = wrapThunk(function(callback) { + return _updateThunk.call(this, 'updateOne', callback); +}); + +/*! + * Internal thunk for .replaceOne() + * + * @param {Function} callback + * @see Model.replaceOne #model_Model.replaceOne + * @api private + */ +Query.prototype._replaceOne = wrapThunk(function(callback) { + return _updateThunk.call(this, 'replaceOne', callback); +}); + +/** + * Declare and/or execute this query as an update() operation. + * + * _All paths passed that are not [atomic](https://docs.mongodb.com/manual/tutorial/model-data-for-atomic-operations/#pattern) operations will become `$set` ops._ + * + * This function triggers the following middleware. 
+ * + * - `update()` + * + * ####Example + * + * Model.where({ _id: id }).update({ title: 'words' }) + * + * // becomes + * + * Model.where({ _id: id }).update({ $set: { title: 'words' }}) + * + * ####Valid options: + * + * - `upsert` (boolean) whether to create the doc if it doesn't match (false) + * - `multi` (boolean) whether multiple documents should be updated (false) + * - `runValidators` (boolean) if true, runs [update validators](/docs/validation.html#update-validators) on this command. Update validators validate the update operation against the model's schema. + * - `setDefaultsOnInsert` (boolean) `true` by default. If `setDefaultsOnInsert` and `upsert` are true, mongoose will apply the [defaults](http://mongoosejs.com/docs/defaults.html) specified in the model's schema if a new document is created. + * - `strict` (boolean) overrides the `strict` option for this update + * - `read` + * - `writeConcern` + * + * ####Note + * + * Passing an empty object `{}` as the doc will result in a no-op. The update operation will be ignored and the callback executed without sending the command to MongoDB. + * + * ####Note + * + * The operation is only executed when a callback is passed. To force execution without a callback, we must first call update() and then execute it by using the `exec()` method. + * + * const q = Model.where({ _id: id }); + * q.update({ $set: { name: 'bob' }}).update(); // not executed + * + * q.update({ $set: { name: 'bob' }}).exec(); // executed + * + * // keys that are not [atomic](https://docs.mongodb.com/manual/tutorial/model-data-for-atomic-operations/#pattern) ops become `$set`. + * // this executes the same command as the previous example. + * q.update({ name: 'bob' }).exec(); + * + * // multi updates + * Model.where() + * .update({ name: /^match/ }, { $set: { arr: [] }}, { multi: true }, callback) + * + * // more multi updates + * Model.where() + * .setOptions({ multi: true }) + * .update({ $set: { arr: [] }}, callback) + * + * // single update by default + * Model.where({ email: 'address@example.com' }) + * .update({ $inc: { counter: 1 }}, callback) + * + * API summary + * + * update(filter, doc, options, cb) // executes + * update(filter, doc, options) + * update(filter, doc, cb) // executes + * update(filter, doc) + * update(doc, cb) // executes + * update(doc) + * update(cb) // executes + * update(true) // executes + * update() + * + * @param {Object} [filter] + * @param {Object} [doc] the update command + * @param {Object} [options] + * @param {Boolean} [options.multipleCastError] by default, mongoose only returns the first error that occurred in casting the query. Turn on this option to aggregate all the cast errors. + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. 
+ * @param {Function} [callback] params are (error, writeOpResult) + * @return {Query} this + * @see Model.update #model_Model.update + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see update http://docs.mongodb.org/manual/reference/method/db.collection.update/ + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @api public + */ + +Query.prototype.update = function(conditions, doc, options, callback) { + if (typeof options === 'function') { + // .update(conditions, doc, callback) + callback = options; + options = null; + } else if (typeof doc === 'function') { + // .update(doc, callback); + callback = doc; + doc = conditions; + conditions = {}; + options = null; + } else if (typeof conditions === 'function') { + // .update(callback) + callback = conditions; + conditions = undefined; + doc = undefined; + options = undefined; + } else if (typeof conditions === 'object' && !doc && !options && !callback) { + // .update(doc) + doc = conditions; + conditions = undefined; + options = undefined; + callback = undefined; + } + + return _update(this, 'update', conditions, doc, options, callback); +}; + +/** + * Declare and/or execute this query as an updateMany() operation. Same as + * `update()`, except MongoDB will update _all_ documents that match + * `filter` (as opposed to just the first one) regardless of the value of + * the `multi` option. + * + * **Note** updateMany will _not_ fire update middleware. Use `pre('updateMany')` + * and `post('updateMany')` instead. + * + * ####Example: + * const res = await Person.updateMany({ name: /Stark$/ }, { isDeleted: true }); + * res.n; // Number of documents matched + * res.nModified; // Number of documents modified + * + * This function triggers the following middleware. + * + * - `updateMany()` + * + * @param {Object} [filter] + * @param {Object|Array} [update] the update command + * @param {Object} [options] + * @param {Boolean} [options.multipleCastError] by default, mongoose only returns the first error that occurred in casting the query. Turn on this option to aggregate all the cast errors. + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. 
+ * @param {Function} [callback] params are (error, writeOpResult) + * @return {Query} this + * @see Model.update #model_Model.update + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see update http://docs.mongodb.org/manual/reference/method/db.collection.update/ + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @api public + */ + +Query.prototype.updateMany = function(conditions, doc, options, callback) { + if (typeof options === 'function') { + // .update(conditions, doc, callback) + callback = options; + options = null; + } else if (typeof doc === 'function') { + // .update(doc, callback); + callback = doc; + doc = conditions; + conditions = {}; + options = null; + } else if (typeof conditions === 'function') { + // .update(callback) + callback = conditions; + conditions = undefined; + doc = undefined; + options = undefined; + } else if (typeof conditions === 'object' && !doc && !options && !callback) { + // .update(doc) + doc = conditions; + conditions = undefined; + options = undefined; + callback = undefined; + } + + return _update(this, 'updateMany', conditions, doc, options, callback); +}; + +/** + * Declare and/or execute this query as an updateOne() operation. Same as + * `update()`, except it does not support the `multi` option. + * + * - MongoDB will update _only_ the first document that matches `filter` regardless of the value of the `multi` option. + * - Use `replaceOne()` if you want to overwrite an entire document rather than using [atomic](https://docs.mongodb.com/manual/tutorial/model-data-for-atomic-operations/#pattern) operators like `$set`. + * + * **Note** updateOne will _not_ fire update middleware. Use `pre('updateOne')` + * and `post('updateOne')` instead. + * + * ####Example: + * const res = await Person.updateOne({ name: 'Jean-Luc Picard' }, { ship: 'USS Enterprise' }); + * res.n; // Number of documents matched + * res.nModified; // Number of documents modified + * + * This function triggers the following middleware. + * + * - `updateOne()` + * + * @param {Object} [filter] + * @param {Object|Array} [update] the update command + * @param {Object} [options] + * @param {Boolean} [options.multipleCastError] by default, mongoose only returns the first error that occurred in casting the query. Turn on this option to aggregate all the cast errors. + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Note that this allows you to overwrite timestamps. Does nothing if schema-level timestamps are not set. 
+ * @param {Function} [callback] params are (error, writeOpResult) + * @return {Query} this + * @see Model.update #model_Model.update + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see update http://docs.mongodb.org/manual/reference/method/db.collection.update/ + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @api public + */ + +Query.prototype.updateOne = function(conditions, doc, options, callback) { + if (typeof options === 'function') { + // .update(conditions, doc, callback) + callback = options; + options = null; + } else if (typeof doc === 'function') { + // .update(doc, callback); + callback = doc; + doc = conditions; + conditions = {}; + options = null; + } else if (typeof conditions === 'function') { + // .update(callback) + callback = conditions; + conditions = undefined; + doc = undefined; + options = undefined; + } else if (typeof conditions === 'object' && !doc && !options && !callback) { + // .update(doc) + doc = conditions; + conditions = undefined; + options = undefined; + callback = undefined; + } + + return _update(this, 'updateOne', conditions, doc, options, callback); +}; + +/** + * Declare and/or execute this query as a replaceOne() operation. Same as + * `update()`, except MongoDB will replace the existing document and will + * not accept any [atomic](https://docs.mongodb.com/manual/tutorial/model-data-for-atomic-operations/#pattern) operators (`$set`, etc.) + * + * **Note** replaceOne will _not_ fire update middleware. Use `pre('replaceOne')` + * and `post('replaceOne')` instead. + * + * ####Example: + * const res = await Person.replaceOne({ _id: 24601 }, { name: 'Jean Valjean' }); + * res.n; // Number of documents matched + * res.nModified; // Number of documents modified + * + * This function triggers the following middleware. + * + * - `replaceOne()` + * + * @param {Object} [filter] + * @param {Object} [doc] the update command + * @param {Object} [options] + * @param {Boolean} [options.multipleCastError] by default, mongoose only returns the first error that occurred in casting the query. Turn on this option to aggregate all the cast errors. + * @param {Boolean|String} [options.strict] overwrites the schema's [strict mode option](http://mongoosejs.com/docs/guide.html#strict) + * @param {Boolean} [options.upsert=false] if true, and no documents found, insert a new document + * @param {Object} [options.writeConcern=null] sets the [write concern](https://docs.mongodb.com/manual/reference/write-concern/) for replica sets. Overrides the [schema-level write concern](/docs/guide.html#writeConcern) + * @param {Boolean} [options.timestamps=null] If set to `false` and [schema-level timestamps](/docs/guide.html#timestamps) are enabled, skip timestamps for this update. Does nothing if schema-level timestamps are not set. 
+ * @param {Function} [callback] params are (error, writeOpResult) + * @return {Query} this + * @see Model.update #model_Model.update + * @see Query docs https://mongoosejs.com/docs/queries.html + * @see update http://docs.mongodb.org/manual/reference/method/db.collection.update/ + * @see writeOpResult http://mongodb.github.io/node-mongodb-native/2.2/api/Collection.html#~WriteOpResult + * @see MongoDB docs https://docs.mongodb.com/manual/reference/command/update/#update-command-output + * @api public + */ + +Query.prototype.replaceOne = function(conditions, doc, options, callback) { + if (typeof options === 'function') { + // .update(conditions, doc, callback) + callback = options; + options = null; + } else if (typeof doc === 'function') { + // .update(doc, callback); + callback = doc; + doc = conditions; + conditions = {}; + options = null; + } else if (typeof conditions === 'function') { + // .update(callback) + callback = conditions; + conditions = undefined; + doc = undefined; + options = undefined; + } else if (typeof conditions === 'object' && !doc && !options && !callback) { + // .update(doc) + doc = conditions; + conditions = undefined; + options = undefined; + callback = undefined; + } + + this.setOptions({ overwrite: true }); + return _update(this, 'replaceOne', conditions, doc, options, callback); +}; + +/*! + * Internal helper for update, updateMany, updateOne, replaceOne + */ + +function _update(query, op, filter, doc, options, callback) { + // make sure we don't send in the whole Document to merge() + query.op = op; + query._validateOp(); + filter = utils.toObject(filter); + doc = doc || {}; + + // strict is an option used in the update checking, make sure it gets set + if (options != null) { + if ('strict' in options) { + query._mongooseOptions.strict = options.strict; + } + } + + if (!(filter instanceof Query) && + filter != null && + filter.toString() !== '[object Object]') { + query.error(new ObjectParameterError(filter, 'filter', op)); + } else { + query.merge(filter); + } + + if (utils.isObject(options)) { + query.setOptions(options); + } + + query._mergeUpdate(doc); + + // Hooks + if (callback) { + query.exec(callback); + + return query; + } + + return Query.base[op].call(query, filter, void 0, options, callback); +} + +/** + * Runs a function `fn` and treats the return value of `fn` as the new value + * for the query to resolve to. + * + * Any functions you pass to `map()` will run **after** any post hooks. + * + * ####Example: + * + * const res = await MyModel.findOne().transform(res => { + * // Sets a `loadedAt` property on the doc that tells you the time the + * // document was loaded. + * return res == null ? + * res : + * Object.assign(res, { loadedAt: new Date() }); + * }); + * + * @method map + * @memberOf Query + * @instance + * @param {Function} fn function to run to transform the query result + * @return {Query} this + */ + +Query.prototype.transform = function(fn) { + this._transforms.push(fn); + return this; +}; + +/** + * Make this query throw an error if no documents match the given `filter`. + * This is handy for integrating with async/await, because `orFail()` saves you + * an extra `if` statement to check if no document was found. + * + * ####Example: + * + * // Throws if no doc returned + * await Model.findOne({ foo: 'bar' }).orFail(); + * + * // Throws if no document was updated + * await Model.updateOne({ foo: 'bar' }, { name: 'test' }).orFail(); + * + * // Throws "No docs found!" 
error if no docs match `{ foo: 'bar' }` + * await Model.find({ foo: 'bar' }).orFail(new Error('No docs found!')); + * + * // Throws "Not found" error if no document was found + * await Model.findOneAndUpdate({ foo: 'bar' }, { name: 'test' }). + * orFail(() => Error('Not found')); + * + * @method orFail + * @memberOf Query + * @instance + * @param {Function|Error} [err] optional error to throw if no docs match `filter`. If not specified, `orFail()` will throw a `DocumentNotFoundError` + * @return {Query} this + */ + +Query.prototype.orFail = function(err) { + this.transform(res => { + switch (this.op) { + case 'find': + if (res.length === 0) { + throw _orFailError(err, this); + } + break; + case 'findOne': + if (res == null) { + throw _orFailError(err, this); + } + break; + case 'update': + case 'updateMany': + case 'updateOne': + if (get(res, 'modifiedCount') === 0) { + throw _orFailError(err, this); + } + break; + case 'findOneAndDelete': + case 'findOneAndRemove': + if (get(res, 'lastErrorObject.n') === 0) { + throw _orFailError(err, this); + } + break; + case 'findOneAndUpdate': + case 'findOneAndReplace': + if (get(res, 'lastErrorObject.updatedExisting') === false) { + throw _orFailError(err, this); + } + break; + case 'deleteMany': + case 'deleteOne': + case 'remove': + if (res.deletedCount === 0) { + throw _orFailError(err, this); + } + break; + default: + break; + } + + return res; + }); + return this; +}; + +/*! + * Get the error to throw for `orFail()` + */ + +function _orFailError(err, query) { + if (typeof err === 'function') { + err = err.call(query); + } + + if (err == null) { + err = new DocumentNotFoundError(query.getQuery(), query.model.modelName); + } + + return err; +} + +/** + * Executes the query + * + * ####Examples: + * + * const promise = query.exec(); + * const promise = query.exec('update'); + * + * query.exec(callback); + * query.exec('find', callback); + * + * @param {String|Function} [operation] + * @param {Function} [callback] optional params depend on the function being called + * @return {Promise} + * @api public + */ + +Query.prototype.exec = function exec(op, callback) { + const _this = this; + // Ensure that `exec()` is the first thing that shows up in + // the stack when cast errors happen. + const castError = new CastError(); + + if (typeof op === 'function') { + callback = op; + op = null; + } else if (typeof op === 'string') { + this.op = op; + } + + if (this.op == null) { + throw new Error('Query must have `op` before executing'); + } + this._validateOp(); + + callback = this.model.$handleCallbackError(callback); + + return promiseOrCallback(callback, (cb) => { + cb = this.model.$wrapCallback(cb); + + if (!_this.op) { + cb(); + return; + } + + this._hooks.execPre('exec', this, [], (error) => { + if (error != null) { + return cb(_cleanCastErrorStack(castError, error)); + } + let thunk = '_' + this.op; + if (this.op === 'update') { + thunk = '_execUpdate'; + } else if (this.op === 'distinct') { + thunk = '__distinct'; + } + this[thunk].call(this, (error, res) => { + if (error) { + return cb(_cleanCastErrorStack(castError, error)); + } + + this._hooks.execPost('exec', this, [], {}, (error) => { + if (error) { + return cb(_cleanCastErrorStack(castError, error)); + } + + cb(null, res); + }); + }); + }); + }, this.model.events); +}; + +/*! + * ignore + */ + +function _cleanCastErrorStack(castError, error) { + if (error instanceof CastError) { + castError.copy(error); + return castError; + } + + return error; +} + +/*! 
+ * ignore + */ + +function _wrapThunkCallback(query, cb) { + return function(error, res) { + if (error != null) { + return cb(error); + } + + for (const fn of query._transforms) { + try { + res = fn(res); + } catch (error) { + return cb(error); + } + } + + return cb(null, res); + }; +} + +/** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + * + * More about [`then()` in JavaScript](https://masteringjs.io/tutorials/fundamentals/then). + * + * @param {Function} [resolve] + * @param {Function} [reject] + * @return {Promise} + * @api public + */ + +Query.prototype.then = function(resolve, reject) { + return this.exec().then(resolve, reject); +}; + +/** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + * Like `.then()`, but only takes a rejection handler. + * + * More about [Promise `catch()` in JavaScript](https://masteringjs.io/tutorials/fundamentals/catch). + * + * @param {Function} [reject] + * @return {Promise} + * @api public + */ + +Query.prototype.catch = function(reject) { + return this.exec().then(null, reject); +}; + +/** + * Add pre [middleware](/docs/middleware.html) to this query instance. Doesn't affect + * other queries. + * + * ####Example: + * + * const q1 = Question.find({ answer: 42 }); + * q1.pre(function middleware() { + * console.log(this.getFilter()); + * }); + * await q1.exec(); // Prints "{ answer: 42 }" + * + * // Doesn't print anything, because `middleware()` is only + * // registered on `q1`. + * await Question.find({ answer: 42 }); + * + * @param {Function} fn + * @return {Promise} + * @api public + */ + +Query.prototype.pre = function(fn) { + this._hooks.pre('exec', fn); + return this; +}; + +/** + * Add post [middleware](/docs/middleware.html) to this query instance. Doesn't affect + * other queries. + * + * ####Example: + * + * const q1 = Question.find({ answer: 42 }); + * q1.post(function middleware() { + * console.log(this.getFilter()); + * }); + * await q1.exec(); // Prints "{ answer: 42 }" + * + * // Doesn't print anything, because `middleware()` is only + * // registered on `q1`. + * await Question.find({ answer: 42 }); + * + * @param {Function} fn + * @return {Promise} + * @api public + */ + +Query.prototype.post = function(fn) { + this._hooks.post('exec', fn); + return this; +}; + +/*! + * Casts obj for an update command. + * + * @param {Object} obj + * @return {Object} obj after casting its values + * @api private + */ + +Query.prototype._castUpdate = function _castUpdate(obj, overwrite) { + let schema = this.schema; + + const discriminatorKey = schema.options.discriminatorKey; + const baseSchema = schema._baseSchema ? 
schema._baseSchema : schema; + if (this._mongooseOptions.overwriteDiscriminatorKey && + obj[discriminatorKey] != null && + baseSchema.discriminators) { + const _schema = baseSchema.discriminators[obj[discriminatorKey]]; + if (_schema != null) { + schema = _schema; + } + } + + let upsert; + if ('upsert' in this.options) { + upsert = this.options.upsert; + } + + const filter = this._conditions; + if (schema != null && + utils.hasUserDefinedProperty(filter, schema.options.discriminatorKey) && + typeof filter[schema.options.discriminatorKey] !== 'object' && + schema.discriminators != null) { + const discriminatorValue = filter[schema.options.discriminatorKey]; + const byValue = getDiscriminatorByValue(this.model.discriminators, discriminatorValue); + schema = schema.discriminators[discriminatorValue] || + (byValue && byValue.schema) || + schema; + } + + return castUpdate(schema, obj, { + overwrite: overwrite, + strict: this._mongooseOptions.strict, + upsert: upsert, + arrayFilters: this.options.arrayFilters + }, this, this._conditions); +}; + +/*! + * castQuery + * @api private + */ + +function castQuery(query) { + try { + return query.cast(query.model); + } catch (err) { + return err; + } +} + +/*! + * castDoc + * @api private + */ + +function castDoc(query, overwrite) { + try { + return query._castUpdate(query._update, overwrite); + } catch (err) { + return err; + } +} + +/** + * Specifies paths which should be populated with other documents. + * + * ####Example: + * + * let book = await Book.findOne().populate('authors'); + * book.title; // 'Node.js in Action' + * book.authors[0].name; // 'TJ Holowaychuk' + * book.authors[1].name; // 'Nathan Rajlich' + * + * let books = await Book.find().populate({ + * path: 'authors', + * // `match` and `sort` apply to the Author model, + * // not the Book model. These options do not affect + * // which documents are in `books`, just the order and + * // contents of each book document's `authors`. + * match: { name: new RegExp('.*h.*', 'i') }, + * sort: { name: -1 } + * }); + * books[0].title; // 'Node.js in Action' + * // Each book's `authors` are sorted by name, descending. + * books[0].authors[0].name; // 'TJ Holowaychuk' + * books[0].authors[1].name; // 'Marc Harter' + * + * books[1].title; // 'Professional AngularJS' + * // Empty array, no authors' name has the letter 'h' + * books[1].authors; // [] + * + * Paths are populated after the query executes and a response is received. A + * separate query is then executed for each path specified for population. After + * a response for each query has also been returned, the results are passed to + * the callback. + * + * @param {Object|String} path either the path to populate or an object specifying all parameters + * @param {Object|String} [select] Field selection for the population query + * @param {Model} [model] The model you wish to use for population. If not specified, populate will look up the model by the name in the Schema's `ref` field. + * @param {Object} [match] Conditions for the population query + * @param {Object} [options] Options for the population query (sort, etc) + * @param {String} [options.path=null] The path to populate. + * @param {boolean} [options.retainNullValues=false] by default, Mongoose removes null and undefined values from populated arrays. Use this option to make `populate()` retain `null` and `undefined` array entries. + * @param {boolean} [options.getters=false] if true, Mongoose will call any getters defined on the `localField`. 
By default, Mongoose gets the raw value of `localField`. For example, you would need to set this option to `true` if you wanted to [add a `lowercase` getter to your `localField`](/docs/schematypes.html#schematype-options). + * @param {boolean} [options.clone=false] When you do `BlogPost.find().populate('author')`, blog posts with the same author will share 1 copy of an `author` doc. Enable this option to make Mongoose clone populated docs before assigning them. + * @param {Object|Function} [options.match=null] Add an additional filter to the populate query. Can be a filter object containing [MongoDB query syntax](https://docs.mongodb.com/manual/tutorial/query-documents/), or a function that returns a filter object. + * @param {Function} [options.transform=null] Function that Mongoose will call on every populated document that allows you to transform the populated document. + * @param {Object} [options.options=null] Additional options like `limit` and `lean`. + * @see population ./populate.html + * @see Query#select #query_Query-select + * @see Model.populate #model_Model.populate + * @return {Query} this + * @api public + */ + +Query.prototype.populate = function() { + // Bail when given no truthy arguments + if (!Array.from(arguments).some(Boolean)) { + return this; + } + + const res = utils.populate.apply(null, arguments); + + // Propagate readConcern and readPreference and lean from parent query, + // unless one already specified + if (this.options != null) { + const readConcern = this.options.readConcern; + const readPref = this.options.readPreference; + + for (const populateOptions of res) { + if (readConcern != null && get(populateOptions, 'options.readConcern') == null) { + populateOptions.options = populateOptions.options || {}; + populateOptions.options.readConcern = readConcern; + } + if (readPref != null && get(populateOptions, 'options.readPreference') == null) { + populateOptions.options = populateOptions.options || {}; + populateOptions.options.readPreference = readPref; + } + } + } + + const opts = this._mongooseOptions; + + if (opts.lean != null) { + const lean = opts.lean; + for (const populateOptions of res) { + if (get(populateOptions, 'options.lean') == null) { + populateOptions.options = populateOptions.options || {}; + populateOptions.options.lean = lean; + } + } + } + + if (!utils.isObject(opts.populate)) { + opts.populate = {}; + } + + const pop = opts.populate; + + for (const populateOptions of res) { + const path = populateOptions.path; + if (pop[path] && pop[path].populate && populateOptions.populate) { + populateOptions.populate = pop[path].populate.concat(populateOptions.populate); + } + + pop[populateOptions.path] = populateOptions; + } + return this; +}; + +/** + * Gets a list of paths to be populated by this query + * + * ####Example: + * bookSchema.pre('findOne', function() { + * let keys = this.getPopulatedPaths(); // ['author'] + * }); + * ... 
+ * Book.findOne({}).populate('author'); + * + * ####Example: + * // Deep populate + * const q = L1.find().populate({ + * path: 'level2', + * populate: { path: 'level3' } + * }); + * q.getPopulatedPaths(); // ['level2', 'level2.level3'] + * + * @return {Array} an array of strings representing populated paths + * @api public + */ + +Query.prototype.getPopulatedPaths = function getPopulatedPaths() { + const obj = this._mongooseOptions.populate || {}; + const ret = Object.keys(obj); + for (const path of Object.keys(obj)) { + const pop = obj[path]; + if (!Array.isArray(pop.populate)) { + continue; + } + _getPopulatedPaths(ret, pop.populate, path + '.'); + } + return ret; +}; + +/*! + * ignore + */ + +function _getPopulatedPaths(list, arr, prefix) { + for (const pop of arr) { + list.push(prefix + pop.path); + if (!Array.isArray(pop.populate)) { + continue; + } + _getPopulatedPaths(list, pop.populate, prefix + pop.path + '.'); + } +} + +/** + * Casts this query to the schema of `model` + * + * ####Note + * + * If `obj` is present, it is cast instead of this query. + * + * @param {Model} [model] the model to cast to. If not set, defaults to `this.model` + * @param {Object} [obj] + * @return {Object} + * @api public + */ + +Query.prototype.cast = function(model, obj) { + obj || (obj = this._conditions); + + model = model || this.model; + const discriminatorKey = model.schema.options.discriminatorKey; + if (obj != null && + obj.hasOwnProperty(discriminatorKey)) { + model = getDiscriminatorByValue(model.discriminators, obj[discriminatorKey]) || model; + } + + try { + return cast(model.schema, obj, { + upsert: this.options && this.options.upsert, + strict: (this.options && 'strict' in this.options) ? + this.options.strict : + get(model, 'schema.options.strict', null), + strictQuery: (this.options && 'strictQuery' in this.options) ? + this.options.strictQuery : + (this.options && 'strict' in this.options) ? + this.options.strict : + get(model, 'schema.options.strictQuery', null) + }, this); + } catch (err) { + // CastError, assign model + if (typeof err.setModel === 'function') { + err.setModel(model); + } + throw err; + } +}; + +/** + * Casts selected field arguments for field selection with mongo 2.2 + * + * query.select({ ids: { $elemMatch: { $in: [hexString] }}) + * + * @param {Object} fields + * @see https://github.com/Automattic/mongoose/issues/1091 + * @see http://docs.mongodb.org/manual/reference/projection/elemMatch/ + * @api private + */ + +Query.prototype._castFields = function _castFields(fields) { + let selected, + elemMatchKeys, + keys, + key, + out, + i; + + if (fields) { + keys = Object.keys(fields); + elemMatchKeys = []; + i = keys.length; + + // collect $elemMatch args + while (i--) { + key = keys[i]; + if (fields[key].$elemMatch) { + selected || (selected = {}); + selected[key] = fields[key]; + elemMatchKeys.push(key); + } + } + } + + if (selected) { + // they passed $elemMatch, cast em + try { + out = this.cast(this.model, selected); + } catch (err) { + return err; + } + + // apply the casted field args + i = elemMatchKeys.length; + while (i--) { + key = elemMatchKeys[i]; + fields[key] = out[key]; + } + } + + return fields; +}; + +/** + * Applies schematype selected options to this query. 
+ * @api private + */ + +Query.prototype._applyPaths = function applyPaths() { + this._fields = this._fields || {}; + helpers.applyPaths(this._fields, this.model.schema); + + let _selectPopulatedPaths = true; + + if ('selectPopulatedPaths' in this.model.base.options) { + _selectPopulatedPaths = this.model.base.options.selectPopulatedPaths; + } + if ('selectPopulatedPaths' in this.model.schema.options) { + _selectPopulatedPaths = this.model.schema.options.selectPopulatedPaths; + } + + if (_selectPopulatedPaths) { + selectPopulatedFields(this._fields, this._userProvidedFields, this._mongooseOptions.populate); + } +}; + +/** + * Returns a wrapper around a [mongodb driver cursor](http://mongodb.github.io/node-mongodb-native/2.1/api/Cursor.html). + * A QueryCursor exposes a Streams3 interface, as well as a `.next()` function. + * + * The `.cursor()` function triggers pre find hooks, but **not** post find hooks. + * + * ####Example + * + * // There are 2 ways to use a cursor. First, as a stream: + * Thing. + * find({ name: /^hello/ }). + * cursor(). + * on('data', function(doc) { console.log(doc); }). + * on('end', function() { console.log('Done!'); }); + * + * // Or you can use `.next()` to manually get the next doc in the stream. + * // `.next()` returns a promise, so you can use promises or callbacks. + * const cursor = Thing.find({ name: /^hello/ }).cursor(); + * cursor.next(function(error, doc) { + * console.log(doc); + * }); + * + * // Because `.next()` returns a promise, you can use co + * // to easily iterate through all documents without loading them + * // all into memory. + * const cursor = Thing.find({ name: /^hello/ }).cursor(); + * for (let doc = await cursor.next(); doc != null; doc = await cursor.next()) { + * console.log(doc); + * } + * + * ####Valid options + * + * - `transform`: optional function which accepts a mongoose document. The return value of the function will be emitted on `data` and returned by `.next()`. + * + * @return {QueryCursor} + * @param {Object} [options] + * @see QueryCursor + * @api public + */ + +Query.prototype.cursor = function cursor(opts) { + this._applyPaths(); + this._fields = this._castFields(this._fields); + this.setOptions({ projection: this._fieldsForExec() }); + if (opts) { + this.setOptions(opts); + } + + const options = Object.assign({}, this._optionsForExec(), { + projection: this.projection() + }); + try { + this.cast(this.model); + } catch (err) { + return (new QueryCursor(this, options))._markError(err); + } + + return new QueryCursor(this, options); +}; + +// the rest of these are basically to support older Mongoose syntax with mquery + +/** + * _DEPRECATED_ Alias of `maxScan` + * + * @deprecated + * @see maxScan #query_Query-maxScan + * @method maxscan + * @memberOf Query + * @instance + */ + +Query.prototype.maxscan = Query.base.maxScan; + +/** + * Sets the tailable option (for use with capped collections). 
+ * + * ####Example + * + * query.tailable() // true + * query.tailable(true) + * query.tailable(false) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @param {Boolean} bool defaults to true + * @param {Object} [opts] options to set + * @param {Number} [opts.numberOfRetries] if cursor is exhausted, retry this many times before giving up + * @param {Number} [opts.tailableRetryInterval] if cursor is exhausted, wait this many milliseconds before retrying + * @see tailable http://docs.mongodb.org/manual/tutorial/create-tailable-cursor/ + * @api public + */ + +Query.prototype.tailable = function(val, opts) { + // we need to support the tailable({ awaitdata : true }) as well as the + // tailable(true, {awaitdata :true}) syntax that mquery does not support + if (val != null && typeof val.constructor === 'function' && val.constructor.name === 'Object') { + opts = val; + val = true; + } + + if (val === undefined) { + val = true; + } + + if (opts && typeof opts === 'object') { + for (const key of Object.keys(opts)) { + if (key === 'awaitdata') { + // For backwards compatibility + this.options[key] = !!opts[key]; + } else { + this.options[key] = opts[key]; + } + } + } + + return Query.base.tailable.call(this, val); +}; + +/** + * Declares an intersects query for `geometry()`. + * + * ####Example + * + * query.where('path').intersects().geometry({ + * type: 'LineString' + * , coordinates: [[180.0, 11.0], [180, 9.0]] + * }) + * + * query.where('path').intersects({ + * type: 'LineString' + * , coordinates: [[180.0, 11.0], [180, 9.0]] + * }) + * + * ####NOTE: + * + * **MUST** be used after `where()`. + * + * ####NOTE: + * + * In Mongoose 3.7, `intersects` changed from a getter to a function. If you need the old syntax, use [this](https://github.com/ebensing/mongoose-within). + * + * @method intersects + * @memberOf Query + * @instance + * @param {Object} [arg] + * @return {Query} this + * @see $geometry http://docs.mongodb.org/manual/reference/operator/geometry/ + * @see geoIntersects http://docs.mongodb.org/manual/reference/operator/geoIntersects/ + * @api public + */ + +/** + * Specifies a `$geometry` condition + * + * ####Example + * + * const polyA = [[[ 10, 20 ], [ 10, 40 ], [ 30, 40 ], [ 30, 20 ]]] + * query.where('loc').within().geometry({ type: 'Polygon', coordinates: polyA }) + * + * // or + * const polyB = [[ 0, 0 ], [ 1, 1 ]] + * query.where('loc').within().geometry({ type: 'LineString', coordinates: polyB }) + * + * // or + * const polyC = [ 0, 0 ] + * query.where('loc').within().geometry({ type: 'Point', coordinates: polyC }) + * + * // or + * query.where('loc').intersects().geometry({ type: 'Point', coordinates: polyC }) + * + * The argument is assigned to the most recent path passed to `where()`. + * + * ####NOTE: + * + * `geometry()` **must** come after either `intersects()` or `within()`. + * + * The `object` argument must contain `type` and `coordinates` properties. + * - type {String} + * - coordinates {Array} + * + * @method geometry + * @memberOf Query + * @instance + * @param {Object} object Must contain a `type` property which is a String and a `coordinates` property which is an Array. See the examples. 
+ * @return {Query} this + * @see $geometry http://docs.mongodb.org/manual/reference/operator/geometry/ + * @see http://docs.mongodb.org/manual/release-notes/2.4/#new-geospatial-indexes-with-geojson-and-improved-spherical-geometry + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @api public + */ + +/** + * Specifies a `$near` or `$nearSphere` condition + * + * These operators return documents sorted by distance. + * + * ####Example + * + * query.where('loc').near({ center: [10, 10] }); + * query.where('loc').near({ center: [10, 10], maxDistance: 5 }); + * query.where('loc').near({ center: [10, 10], maxDistance: 5, spherical: true }); + * query.near('loc', { center: [10, 10], maxDistance: 5 }); + * + * @method near + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Object} val + * @return {Query} this + * @see $near http://docs.mongodb.org/manual/reference/operator/near/ + * @see $nearSphere http://docs.mongodb.org/manual/reference/operator/nearSphere/ + * @see $maxDistance http://docs.mongodb.org/manual/reference/operator/maxDistance/ + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @api public + */ + +/*! + * Overwriting mquery is needed to support a couple different near() forms found in older + * versions of mongoose + * near([1,1]) + * near(1,1) + * near(field, [1,2]) + * near(field, 1, 2) + * In addition to all of the normal forms supported by mquery + */ + +Query.prototype.near = function() { + const params = []; + const sphere = this._mongooseOptions.nearSphere; + + // TODO refactor + + if (arguments.length === 1) { + if (Array.isArray(arguments[0])) { + params.push({ center: arguments[0], spherical: sphere }); + } else if (typeof arguments[0] === 'string') { + // just passing a path + params.push(arguments[0]); + } else if (utils.isObject(arguments[0])) { + if (typeof arguments[0].spherical !== 'boolean') { + arguments[0].spherical = sphere; + } + params.push(arguments[0]); + } else { + throw new TypeError('invalid argument'); + } + } else if (arguments.length === 2) { + if (typeof arguments[0] === 'number' && typeof arguments[1] === 'number') { + params.push({ center: [arguments[0], arguments[1]], spherical: sphere }); + } else if (typeof arguments[0] === 'string' && Array.isArray(arguments[1])) { + params.push(arguments[0]); + params.push({ center: arguments[1], spherical: sphere }); + } else if (typeof arguments[0] === 'string' && utils.isObject(arguments[1])) { + params.push(arguments[0]); + if (typeof arguments[1].spherical !== 'boolean') { + arguments[1].spherical = sphere; + } + params.push(arguments[1]); + } else { + throw new TypeError('invalid argument'); + } + } else if (arguments.length === 3) { + if (typeof arguments[0] === 'string' && typeof arguments[1] === 'number' + && typeof arguments[2] === 'number') { + params.push(arguments[0]); + params.push({ center: [arguments[1], arguments[2]], spherical: sphere }); + } else { + throw new TypeError('invalid argument'); + } + } else { + throw new TypeError('invalid argument'); + } + + return Query.base.near.apply(this, params); +}; + +/** + * _DEPRECATED_ Specifies a `$nearSphere` condition + * + * ####Example + * + * query.where('loc').nearSphere({ center: [10, 10], maxDistance: 5 }); + * + * **Deprecated.** Use `query.near()` instead with the `spherical` option set to `true`. 
+ *
+ * ####Example
+ *
+ *     query.where('loc').near({ center: [10, 10], spherical: true });
+ *
+ * @deprecated
+ * @see near() #query_Query-near
+ * @see $near http://docs.mongodb.org/manual/reference/operator/near/
+ * @see $nearSphere http://docs.mongodb.org/manual/reference/operator/nearSphere/
+ * @see $maxDistance http://docs.mongodb.org/manual/reference/operator/maxDistance/
+ */
+
+Query.prototype.nearSphere = function() {
+  this._mongooseOptions.nearSphere = true;
+  this.near.apply(this, arguments);
+  return this;
+};
+
+/**
+ * Returns an asyncIterator for use with [`for/await/of` loops](https://thecodebarbarian.com/getting-started-with-async-iterators-in-node-js)
+ * This function *only* works for `find()` queries.
+ * You do not need to call this function explicitly, the JavaScript runtime
+ * will call it for you.
+ *
+ * ####Example
+ *
+ *     for await (const doc of Model.aggregate([{ $sort: { name: 1 } }])) {
+ *       console.log(doc.name);
+ *     }
+ *
+ * Node.js 10.x supports async iterators natively without any flags. You can
+ * enable async iterators in Node.js 8.x using the [`--harmony_async_iteration` flag](https://github.com/tc39/proposal-async-iteration/issues/117#issuecomment-346695187).
+ *
+ * **Note:** This function is not set if `Symbol.asyncIterator` is undefined. If
+ * `Symbol.asyncIterator` is undefined, that means your Node.js version does not
+ * support async iterators.
+ *
+ * @method Symbol.asyncIterator
+ * @memberOf Query
+ * @instance
+ * @api public
+ */
+
+if (Symbol.asyncIterator != null) {
+  Query.prototype[Symbol.asyncIterator] = function() {
+    return this.cursor().transformNull()._transformForAsyncIterator();
+  };
+}
+
+/**
+ * Specifies a `$polygon` condition
+ *
+ * ####Example
+ *
+ *     query.where('loc').within().polygon([10,20], [13, 25], [7,15])
+ *     query.polygon('loc', [10,20], [13, 25], [7,15])
+ *
+ * @method polygon
+ * @memberOf Query
+ * @instance
+ * @param {String|Array} [path]
+ * @param {Array|Object} [coordinatePairs...]
+ * @return {Query} this
+ * @see $polygon http://docs.mongodb.org/manual/reference/operator/polygon/
+ * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing
+ * @api public
+ */
+
+/**
+ * Specifies a `$box` condition
+ *
+ * ####Example
+ *
+ *     const lowerLeft = [40.73083, -73.99756]
+ *     const upperRight= [40.741404, -73.988135]
+ *
+ *     query.where('loc').within().box(lowerLeft, upperRight)
+ *     query.box({ ll : lowerLeft, ur : upperRight })
+ *
+ * @method box
+ * @memberOf Query
+ * @instance
+ * @see $box http://docs.mongodb.org/manual/reference/operator/box/
+ * @see within() Query#within #query_Query-within
+ * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing
+ * @param {Object} val
+ * @param [Array] Upper Right Coords
+ * @return {Query} this
+ * @api public
+ */
+
+/*!
+ * this is needed to support the mongoose syntax of:
+ * box(field, { ll : [x,y], ur : [x2,y2] })
+ * box({ ll : [x,y], ur : [x2,y2] })
+ */
+
+Query.prototype.box = function(ll, ur) {
+  if (!Array.isArray(ll) && utils.isObject(ll)) {
+    ur = ll.ur;
+    ll = ll.ll;
+  }
+  return Query.base.box.call(this, ll, ur);
+};
+
+/**
+ * Specifies a `$center` or `$centerSphere` condition.
+ * + * ####Example + * + * const area = { center: [50, 50], radius: 10, unique: true } + * query.where('loc').within().circle(area) + * // alternatively + * query.circle('loc', area); + * + * // spherical calculations + * const area = { center: [50, 50], radius: 10, unique: true, spherical: true } + * query.where('loc').within().circle(area) + * // alternatively + * query.circle('loc', area); + * + * @method circle + * @memberOf Query + * @instance + * @param {String} [path] + * @param {Object} area + * @return {Query} this + * @see $center http://docs.mongodb.org/manual/reference/operator/center/ + * @see $centerSphere http://docs.mongodb.org/manual/reference/operator/centerSphere/ + * @see $geoWithin http://docs.mongodb.org/manual/reference/operator/geoWithin/ + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @api public + */ + +/** + * _DEPRECATED_ Alias for [circle](#query_Query-circle) + * + * **Deprecated.** Use [circle](#query_Query-circle) instead. + * + * @deprecated + * @method center + * @memberOf Query + * @instance + * @api public + */ + +Query.prototype.center = Query.base.circle; + +/** + * _DEPRECATED_ Specifies a `$centerSphere` condition + * + * **Deprecated.** Use [circle](#query_Query-circle) instead. + * + * ####Example + * + * const area = { center: [50, 50], radius: 10 }; + * query.where('loc').within().centerSphere(area); + * + * @deprecated + * @param {String} [path] + * @param {Object} val + * @return {Query} this + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @see $centerSphere http://docs.mongodb.org/manual/reference/operator/centerSphere/ + * @api public + */ + +Query.prototype.centerSphere = function() { + if (arguments[0] != null && typeof arguments[0].constructor === 'function' && arguments[0].constructor.name === 'Object') { + arguments[0].spherical = true; + } + + if (arguments[1] != null && typeof arguments[1].constructor === 'function' && arguments[1].constructor.name === 'Object') { + arguments[1].spherical = true; + } + + Query.base.circle.apply(this, arguments); +}; + +/** + * Determines if field selection has been made. + * + * @method selected + * @memberOf Query + * @instance + * @return {Boolean} + * @api public + */ + +/** + * Determines if inclusive field selection has been made. + * + * query.selectedInclusively() // false + * query.select('name') + * query.selectedInclusively() // true + * + * @method selectedInclusively + * @memberOf Query + * @instance + * @return {Boolean} + * @api public + */ + +Query.prototype.selectedInclusively = function selectedInclusively() { + return isInclusive(this._fields); +}; + +/** + * Determines if exclusive field selection has been made. + * + * query.selectedExclusively() // false + * query.select('-name') + * query.selectedExclusively() // true + * query.selectedInclusively() // false + * + * @method selectedExclusively + * @memberOf Query + * @instance + * @return {Boolean} + * @api public + */ + +Query.prototype.selectedExclusively = function selectedExclusively() { + return isExclusive(this._fields); +}; + +/** + * The model this query is associated with. + * + * #### Example: + * + * const q = MyModel.find(); + * q.model === MyModel; // true + * + * @api public + * @property model + * @memberOf Query + * @instance + */ + +Query.prototype.model; + +/*! 
+ * Export + */ + +module.exports = Query; diff --git a/node_modules/mongoose/lib/queryhelpers.js b/node_modules/mongoose/lib/queryhelpers.js new file mode 100644 index 000000000..7fdebdb63 --- /dev/null +++ b/node_modules/mongoose/lib/queryhelpers.js @@ -0,0 +1,330 @@ +'use strict'; + +/*! + * Module dependencies + */ + +const checkEmbeddedDiscriminatorKeyProjection = + require('./helpers/discriminator/checkEmbeddedDiscriminatorKeyProjection'); +const get = require('./helpers/get'); +const getDiscriminatorByValue = + require('./helpers/discriminator/getDiscriminatorByValue'); +const isDefiningProjection = require('./helpers/projection/isDefiningProjection'); +const clone = require('./helpers/clone'); + +/*! + * Prepare a set of path options for query population. + * + * @param {Query} query + * @param {Object} options + * @return {Array} + */ + +exports.preparePopulationOptions = function preparePopulationOptions(query, options) { + const _populate = query.options.populate; + const pop = Object.keys(_populate).reduce((vals, key) => vals.concat([_populate[key]]), []); + + // lean options should trickle through all queries + if (options.lean != null) { + pop. + filter(p => get(p, 'options.lean') == null). + forEach(makeLean(options.lean)); + } + + pop.forEach(opts => { + opts._localModel = query.model; + }); + + return pop; +}; + +/*! + * Prepare a set of path options for query population. This is the MongooseQuery + * version + * + * @param {Query} query + * @param {Object} options + * @return {Array} + */ + +exports.preparePopulationOptionsMQ = function preparePopulationOptionsMQ(query, options) { + const _populate = query._mongooseOptions.populate; + const pop = Object.keys(_populate).reduce((vals, key) => vals.concat([_populate[key]]), []); + + // lean options should trickle through all queries + if (options.lean != null) { + pop. + filter(p => get(p, 'options.lean') == null). + forEach(makeLean(options.lean)); + } + + const session = get(query, 'options.session', null); + if (session != null) { + pop.forEach(path => { + if (path.options == null) { + path.options = { session: session }; + return; + } + if (!('session' in path.options)) { + path.options.session = session; + } + }); + } + + const projection = query._fieldsForExec(); + pop.forEach(p => { + p._queryProjection = projection; + }); + pop.forEach(opts => { + opts._localModel = query.model; + }); + + return pop; +}; + +/*! + * If the document is a mapped discriminator type, it returns a model instance for that type, otherwise, + * it returns an instance of the given model. + * + * @param {Model} model + * @param {Object} doc + * @param {Object} fields + * + * @return {Document} + */ +exports.createModel = function createModel(model, doc, fields, userProvidedFields, options) { + model.hooks.execPreSync('createModel', doc); + const discriminatorMapping = model.schema ? + model.schema.discriminatorMapping : + null; + + const key = discriminatorMapping && discriminatorMapping.isRoot ? 
+ discriminatorMapping.key : + null; + + const value = doc[key]; + if (key && value && model.discriminators) { + const discriminator = model.discriminators[value] || getDiscriminatorByValue(model.discriminators, value); + if (discriminator) { + const _fields = clone(userProvidedFields); + exports.applyPaths(_fields, discriminator.schema); + return new discriminator(undefined, _fields, true); + } + } + if (typeof options === 'undefined') { + options = {}; + options.defaults = true; + } + return new model(undefined, fields, { + skipId: true, + isNew: false, + willInit: true, + defaults: options.defaults + }); +}; + +/*! + * ignore + */ + +exports.applyPaths = function applyPaths(fields, schema) { + // determine if query is selecting or excluding fields + let exclude; + let keys; + let keyIndex; + + if (fields) { + keys = Object.keys(fields); + keyIndex = keys.length; + + while (keyIndex--) { + if (keys[keyIndex][0] === '+') { + continue; + } + const field = fields[keys[keyIndex]]; + // Skip `$meta` and `$slice` + if (!isDefiningProjection(field)) { + continue; + } + exclude = !field; + break; + } + } + + // if selecting, apply default schematype select:true fields + // if excluding, apply schematype select:false fields + + const selected = []; + const excluded = []; + const stack = []; + + analyzeSchema(schema); + + switch (exclude) { + case true: + for (const fieldName of excluded) { + fields[fieldName] = 0; + } + break; + case false: + if (schema && + schema.paths['_id'] && + schema.paths['_id'].options && + schema.paths['_id'].options.select === false) { + fields._id = 0; + } + + for (const fieldName of selected) { + fields[fieldName] = fields[fieldName] || 1; + } + break; + case undefined: + if (fields == null) { + break; + } + // Any leftover plus paths must in the schema, so delete them (gh-7017) + for (const key of Object.keys(fields || {})) { + if (key.startsWith('+')) { + delete fields[key]; + } + } + + // user didn't specify fields, implies returning all fields. + // only need to apply excluded fields and delete any plus paths + for (const fieldName of excluded) { + fields[fieldName] = 0; + } + break; + } + + function analyzeSchema(schema, prefix) { + prefix || (prefix = ''); + + // avoid recursion + if (stack.indexOf(schema) !== -1) { + return []; + } + stack.push(schema); + + const addedPaths = []; + schema.eachPath(function(path, type) { + if (prefix) path = prefix + '.' + path; + + let addedPath = analyzePath(path, type); + // arrays + if (addedPath == null && type.$isMongooseArray && !type.$isMongooseDocumentArray) { + addedPath = analyzePath(path, type.caster); + } + if (addedPath != null) { + addedPaths.push(addedPath); + } + + // nested schemas + if (type.schema) { + const _addedPaths = analyzeSchema(type.schema, path); + + // Special case: if discriminator key is the only field that would + // be projected in, remove it. 
+ if (exclude === false) { + checkEmbeddedDiscriminatorKeyProjection(fields, path, type.schema, + selected, _addedPaths); + } + } + }); + + stack.pop(); + return addedPaths; + } + + function analyzePath(path, type) { + const plusPath = '+' + path; + const hasPlusPath = fields && plusPath in fields; + if (hasPlusPath) { + // forced inclusion + delete fields[plusPath]; + } + + if (typeof type.selected !== 'boolean') return; + + if (hasPlusPath) { + // forced inclusion + delete fields[plusPath]; + + // if there are other fields being included, add this one + // if no other included fields, leave this out (implied inclusion) + if (exclude === false && keys.length > 1 && !~keys.indexOf(path)) { + fields[path] = 1; + } + + return; + } + + // check for parent exclusions + const pieces = path.split('.'); + let cur = ''; + for (let i = 0; i < pieces.length; ++i) { + cur += cur.length ? '.' + pieces[i] : pieces[i]; + if (excluded.indexOf(cur) !== -1) { + return; + } + } + + // Special case: if user has included a parent path of a discriminator key, + // don't explicitly project in the discriminator key because that will + // project out everything else under the parent path + if (!exclude && get(type, 'options.$skipDiscriminatorCheck', false)) { + let cur = ''; + for (let i = 0; i < pieces.length; ++i) { + cur += (cur.length === 0 ? '' : '.') + pieces[i]; + const projection = get(fields, cur, false) || get(fields, cur + '.$', false); + if (projection && typeof projection !== 'object') { + return; + } + } + } + + (type.selected ? selected : excluded).push(path); + return path; + } +}; + +/*! + * Set each path query option to lean + * + * @param {Object} option + */ + +function makeLean(val) { + return function(option) { + option.options || (option.options = {}); + + if (val != null && Array.isArray(val.virtuals)) { + val = Object.assign({}, val); + val.virtuals = val.virtuals. + filter(path => typeof path === 'string' && path.startsWith(option.path + '.')). + map(path => path.slice(option.path.length + 1)); + } + + option.options.lean = val; + }; +} + +/*! + * Handle the `WriteOpResult` from the server + */ + +exports.handleDeleteWriteOpResult = function handleDeleteWriteOpResult(callback) { + return function _handleDeleteWriteOpResult(error, res) { + if (error) { + return callback(error); + } + const mongooseResult = Object.assign({}, res.result); + if (get(res, 'result.n', null) != null) { + mongooseResult.deletedCount = res.result.n; + } + if (res.deletedCount != null) { + mongooseResult.deletedCount = res.deletedCount; + } + return callback(null, mongooseResult); + }; +}; diff --git a/node_modules/mongoose/lib/schema.js b/node_modules/mongoose/lib/schema.js new file mode 100644 index 000000000..d84bee729 --- /dev/null +++ b/node_modules/mongoose/lib/schema.js @@ -0,0 +1,2241 @@ +'use strict'; + +/*! + * Module dependencies. 
+ */
+
+const EventEmitter = require('events').EventEmitter;
+const Kareem = require('kareem');
+const MongooseError = require('./error/mongooseError');
+const SchemaType = require('./schematype');
+const SchemaTypeOptions = require('./options/SchemaTypeOptions');
+const VirtualOptions = require('./options/VirtualOptions');
+const VirtualType = require('./virtualtype');
+const addAutoId = require('./helpers/schema/addAutoId');
+const get = require('./helpers/get');
+const getConstructorName = require('./helpers/getConstructorName');
+const getIndexes = require('./helpers/schema/getIndexes');
+const idGetter = require('./helpers/schema/idGetter');
+const merge = require('./helpers/schema/merge');
+const mpath = require('mpath');
+const readPref = require('./driver').get().ReadPreference;
+const setupTimestamps = require('./helpers/timestamps/setupTimestamps');
+const utils = require('./utils');
+const validateRef = require('./helpers/populate/validateRef');
+
+let MongooseTypes;
+
+const queryHooks = require('./helpers/query/applyQueryMiddleware').
+  middlewareFunctions;
+const documentHooks = require('./helpers/model/applyHooks').middlewareFunctions;
+const hookNames = queryHooks.concat(documentHooks).
+  reduce((s, hook) => s.add(hook), new Set());
+
+let id = 0;
+
+/**
+ * Schema constructor.
+ *
+ * ####Example:
+ *
+ *     const child = new Schema({ name: String });
+ *     const schema = new Schema({ name: String, age: Number, children: [child] });
+ *     const Tree = mongoose.model('Tree', schema);
+ *
+ *     // setting schema options
+ *     new Schema({ name: String }, { _id: false, autoIndex: false })
+ *
+ * ####Options:
+ *
+ * - [autoIndex](/docs/guide.html#autoIndex): bool - defaults to null (which means use the connection's autoIndex option)
+ * - [autoCreate](/docs/guide.html#autoCreate): bool - defaults to null (which means use the connection's autoCreate option)
+ * - [bufferCommands](/docs/guide.html#bufferCommands): bool - defaults to true
+ * - [bufferTimeoutMS](/docs/guide.html#bufferTimeoutMS): number - defaults to 10000 (10 seconds). If `bufferCommands` is enabled, the amount of time Mongoose will wait for connectivity to be re-established before erroring out.
+ * - [capped](/docs/guide.html#capped): bool - defaults to false + * - [collection](/docs/guide.html#collection): string - no default + * - [discriminatorKey](/docs/guide.html#discriminatorKey): string - defaults to `__t` + * - [id](/docs/guide.html#id): bool - defaults to true + * - [_id](/docs/guide.html#_id): bool - defaults to true + * - [minimize](/docs/guide.html#minimize): bool - controls [document#toObject](#document_Document-toObject) behavior when called manually - defaults to true + * - [read](/docs/guide.html#read): string + * - [writeConcern](/docs/guide.html#writeConcern): object - defaults to null, use to override [the MongoDB server's default write concern settings](https://docs.mongodb.com/manual/reference/write-concern/) + * - [shardKey](/docs/guide.html#shardKey): object - defaults to `null` + * - [strict](/docs/guide.html#strict): bool - defaults to true + * - [strictQuery](/docs/guide.html#strictQuery): bool - defaults to false + * - [toJSON](/docs/guide.html#toJSON) - object - no default + * - [toObject](/docs/guide.html#toObject) - object - no default + * - [typeKey](/docs/guide.html#typeKey) - string - defaults to 'type' + * - [validateBeforeSave](/docs/guide.html#validateBeforeSave) - bool - defaults to `true` + * - [versionKey](/docs/guide.html#versionKey): string or object - defaults to "__v" + * - [optimisticConcurrency](/docs/guide.html#optimisticConcurrency): bool - defaults to false. Set to true to enable [optimistic concurrency](https://thecodebarbarian.com/whats-new-in-mongoose-5-10-optimistic-concurrency.html). + * - [collation](/docs/guide.html#collation): object - defaults to null (which means use no collation) + * - [selectPopulatedPaths](/docs/guide.html#selectPopulatedPaths): boolean - defaults to `true` + * - [skipVersioning](/docs/guide.html#skipVersioning): object - paths to exclude from versioning + * - [timestamps](/docs/guide.html#timestamps): object or boolean - defaults to `false`. If true, Mongoose adds `createdAt` and `updatedAt` properties to your schema and manages those properties for you. + * + * ####Options for Nested Schemas: + * - `excludeIndexes`: bool - defaults to `false`. If `true`, skip building indexes on this schema's paths. + * + * ####Note: + * + * _When nesting schemas, (`children` in the example above), always declare the child schema first before passing it into its parent._ + * + * @param {Object|Schema|Array} [definition] Can be one of: object describing schema paths, or schema to copy, or array of objects and schemas + * @param {Object} [options] + * @inherits NodeJS EventEmitter http://nodejs.org/api/events.html#events_class_events_eventemitter + * @event `init`: Emitted after the schema is compiled into a `Model`. + * @api public + */ + +function Schema(obj, options) { + if (!(this instanceof Schema)) { + return new Schema(obj, options); + } + + this.obj = obj; + this.paths = {}; + this.aliases = {}; + this.subpaths = {}; + this.virtuals = {}; + this.singleNestedPaths = {}; + this.nested = {}; + this.inherits = {}; + this.callQueue = []; + this._indexes = []; + this.methods = {}; + this.methodOptions = {}; + this.statics = {}; + this.tree = {}; + this.query = {}; + this.childSchemas = []; + this.plugins = []; + // For internal debugging. Do not use this to try to save a schema in MDB. 
+ this.$id = ++id; + this.mapPaths = []; + + this.s = { + hooks: new Kareem() + }; + + this.options = this.defaultOptions(options); + + // build paths + if (Array.isArray(obj)) { + for (const definition of obj) { + this.add(definition); + } + } else if (obj) { + this.add(obj); + } + + // check if _id's value is a subdocument (gh-2276) + const _idSubDoc = obj && obj._id && utils.isObject(obj._id); + + // ensure the documents get an auto _id unless disabled + const auto_id = !this.paths['_id'] && + (this.options._id) && !_idSubDoc; + + if (auto_id) { + addAutoId(this); + } + + this.setupTimestamp(this.options.timestamps); +} + +/*! + * Create virtual properties with alias field + */ +function aliasFields(schema, paths) { + paths = paths || Object.keys(schema.paths); + for (const path of paths) { + const options = get(schema.paths[path], 'options'); + if (options == null) { + continue; + } + + const prop = schema.paths[path].path; + const alias = options.alias; + + if (!alias) { + continue; + } + + if (typeof alias !== 'string') { + throw new Error('Invalid value for alias option on ' + prop + ', got ' + alias); + } + + schema.aliases[alias] = prop; + + schema. + virtual(alias). + get((function(p) { + return function() { + if (typeof this.get === 'function') { + return this.get(p); + } + return this[p]; + }; + })(prop)). + set((function(p) { + return function(v) { + return this.$set(p, v); + }; + })(prop)); + } +} + +/*! + * Inherit from EventEmitter. + */ +Schema.prototype = Object.create(EventEmitter.prototype); +Schema.prototype.constructor = Schema; +Schema.prototype.instanceOfSchema = true; + +/*! + * ignore + */ + +Object.defineProperty(Schema.prototype, '$schemaType', { + configurable: false, + enumerable: false, + writable: true +}); + +/** + * Array of child schemas (from document arrays and single nested subdocs) + * and their corresponding compiled models. Each element of the array is + * an object with 2 properties: `schema` and `model`. + * + * This property is typically only useful for plugin authors and advanced users. + * You do not need to interact with this property at all to use mongoose. + * + * @api public + * @property childSchemas + * @memberOf Schema + * @instance + */ + +Object.defineProperty(Schema.prototype, 'childSchemas', { + configurable: false, + enumerable: true, + writable: true +}); + +/** + * Object containing all virtuals defined on this schema. + * The objects' keys are the virtual paths and values are instances of `VirtualType`. + * + * This property is typically only useful for plugin authors and advanced users. + * You do not need to interact with this property at all to use mongoose. + * + * ####Example: + * const schema = new Schema({}); + * schema.virtual('answer').get(() => 42); + * + * console.log(schema.virtuals); // { answer: VirtualType { path: 'answer', ... } } + * console.log(schema.virtuals['answer'].getters[0].call()); // 42 + * + * @api public + * @property virtuals + * @memberOf Schema + * @instance + */ + +Object.defineProperty(Schema.prototype, 'virtuals', { + configurable: false, + enumerable: true, + writable: true +}); + +/** + * The original object passed to the schema constructor + * + * ####Example: + * + * const schema = new Schema({ a: String }).add({ b: String }); + * schema.obj; // { a: String } + * + * @api public + * @property obj + * @memberOf Schema + * @instance + */ + +Schema.prototype.obj; + +/** + * The paths defined on this schema. 
The keys are the top-level paths
+ * in this schema, and the values are instances of the SchemaType class.
+ *
+ * ####Example:
+ *     const schema = new Schema({ name: String }, { _id: false });
+ *     schema.paths; // { name: SchemaString { ... } }
+ *
+ *     schema.add({ age: Number });
+ *     schema.paths; // { name: SchemaString { ... }, age: SchemaNumber { ... } }
+ *
+ * @api public
+ * @property paths
+ * @memberOf Schema
+ * @instance
+ */
+
+Schema.prototype.paths;
+
+/**
+ * Schema as a tree
+ *
+ * ####Example:
+ *     {
+ *         '_id'     : ObjectId
+ *       , 'nested'  : {
+ *             'key' : String
+ *         }
+ *     }
+ *
+ * @api private
+ * @property tree
+ * @memberOf Schema
+ * @instance
+ */
+
+Schema.prototype.tree;
+
+/**
+ * Returns a deep copy of the schema
+ *
+ * ####Example:
+ *
+ *     const schema = new Schema({ name: String });
+ *     const clone = schema.clone();
+ *     clone === schema; // false
+ *     clone.path('name'); // SchemaString { ... }
+ *
+ * @return {Schema} the cloned schema
+ * @api public
+ * @memberOf Schema
+ * @instance
+ */
+
+Schema.prototype.clone = function() {
+  const Constructor = this.base == null ? Schema : this.base.Schema;
+
+  const s = new Constructor({}, this._userProvidedOptions);
+  s.base = this.base;
+  s.obj = this.obj;
+  s.options = utils.clone(this.options);
+  s.callQueue = this.callQueue.map(function(f) { return f; });
+  s.methods = utils.clone(this.methods);
+  s.methodOptions = utils.clone(this.methodOptions);
+  s.statics = utils.clone(this.statics);
+  s.query = utils.clone(this.query);
+  s.plugins = Array.prototype.slice.call(this.plugins);
+  s._indexes = utils.clone(this._indexes);
+  s.s.hooks = this.s.hooks.clone();
+
+  s.tree = utils.clone(this.tree);
+  s.paths = utils.clone(this.paths);
+  s.nested = utils.clone(this.nested);
+  s.subpaths = utils.clone(this.subpaths);
+  s.singleNestedPaths = utils.clone(this.singleNestedPaths);
+  s.childSchemas = gatherChildSchemas(s);
+
+  s.virtuals = utils.clone(this.virtuals);
+  s.$globalPluginsApplied = this.$globalPluginsApplied;
+  s.$isRootDiscriminator = this.$isRootDiscriminator;
+  s.$implicitlyCreated = this.$implicitlyCreated;
+  s.$id = ++id;
+  s.$originalSchemaId = this.$id;
+  s.mapPaths = [].concat(this.mapPaths);
+
+  if (this.discriminatorMapping != null) {
+    s.discriminatorMapping = Object.assign({}, this.discriminatorMapping);
+  }
+  if (this.discriminators != null) {
+    s.discriminators = Object.assign({}, this.discriminators);
+  }
+
+  s.aliases = Object.assign({}, this.aliases);
+
+  // Bubble up `init` for backwards compat
+  s.on('init', v => this.emit('init', v));
+
+  return s;
+};
+
+/**
+ * Returns a new schema that has the picked `paths` from this schema.
+ *
+ * This method is analogous to [Lodash's `pick()` function](https://lodash.com/docs/4.17.15#pick) for Mongoose schemas.
+ *
+ * ####Example:
+ *
+ *     const schema = Schema({ name: String, age: Number });
+ *     // Creates a new schema with the same `name` path as `schema`,
+ *     // but no `age` path.
+ *     const newSchema = schema.pick(['name']);
+ *
+ *     newSchema.path('name'); // SchemaString { ... }
+ *     newSchema.path('age'); // undefined
+ *
+ * @param {Array} paths list of paths to pick
+ * @param {Object} [options] options to pass to the schema constructor. Defaults to `this.options` if not set.
+ * @return {Schema} + * @api public + */ + +Schema.prototype.pick = function(paths, options) { + const newSchema = new Schema({}, options || this.options); + if (!Array.isArray(paths)) { + throw new MongooseError('Schema#pick() only accepts an array argument, ' + + 'got "' + typeof paths + '"'); + } + + for (const path of paths) { + if (this.nested[path]) { + newSchema.add({ [path]: get(this.tree, path) }); + } else { + const schematype = this.path(path); + if (schematype == null) { + throw new MongooseError('Path `' + path + '` is not in the schema'); + } + newSchema.add({ [path]: schematype }); + } + } + + return newSchema; +}; + +/** + * Returns default options for this schema, merged with `options`. + * + * @param {Object} options + * @return {Object} + * @api private + */ + +Schema.prototype.defaultOptions = function(options) { + this._userProvidedOptions = options == null ? {} : utils.clone(options); + + const baseOptions = get(this, 'base.options', {}); + + const strict = 'strict' in baseOptions ? baseOptions.strict : true; + + options = utils.options({ + strict: strict, + strictQuery: 'strict' in this._userProvidedOptions ? + this._userProvidedOptions.strict : + 'strictQuery' in baseOptions ? + baseOptions.strictQuery : strict, + bufferCommands: true, + capped: false, // { size, max, autoIndexId } + versionKey: '__v', + optimisticConcurrency: false, + minimize: true, + autoIndex: null, + discriminatorKey: '__t', + shardKey: null, + read: null, + validateBeforeSave: true, + // the following are only applied at construction time + _id: true, + id: true, + typeKey: 'type' + }, utils.clone(options)); + + if (options.read) { + options.read = readPref(options.read); + } + + if (options.versionKey && typeof options.versionKey !== 'string') { + throw new MongooseError('`versionKey` must be falsy or string, got `' + (typeof options.versionKey) + '`'); + } + + if (options.optimisticConcurrency && !options.versionKey) { + throw new MongooseError('Must set `versionKey` if using `optimisticConcurrency`'); + } + + return options; +}; + +/** + * Adds key path / schema type pairs to this schema. + * + * ####Example: + * + * const ToySchema = new Schema(); + * ToySchema.add({ name: 'string', color: 'string', price: 'number' }); + * + * const TurboManSchema = new Schema(); + * // You can also `add()` another schema and copy over all paths, virtuals, + * // getters, setters, indexes, methods, and statics. + * TurboManSchema.add(ToySchema).add({ year: Number }); + * + * @param {Object|Schema} obj plain object with paths to add, or another schema + * @param {String} [prefix] path to prefix the newly added paths with + * @return {Schema} the Schema instance + * @api public + */ + +Schema.prototype.add = function add(obj, prefix) { + if (obj instanceof Schema || (obj != null && obj.instanceOfSchema)) { + merge(this, obj); + + return this; + } + + // Special case: setting top-level `_id` to false should convert to disabling + // the `_id` option. This behavior never worked before 5.4.11 but numerous + // codebases use it (see gh-7516, gh-7512). + if (obj._id === false && prefix == null) { + this.options._id = false; + } + + prefix = prefix || ''; + // avoid prototype pollution + if (prefix === '__proto__.' || prefix === 'constructor.' 
|| prefix === 'prototype.') { + return this; + } + + const keys = Object.keys(obj); + + for (const key of keys) { + const fullPath = prefix + key; + + if (obj[key] == null) { + throw new TypeError('Invalid value for schema path `' + fullPath + + '`, got value "' + obj[key] + '"'); + } + // Retain `_id: false` but don't set it as a path, re: gh-8274. + if (key === '_id' && obj[key] === false) { + continue; + } + if (obj[key] instanceof VirtualType || get(obj[key], 'constructor.name', null) === 'VirtualType') { + this.virtual(obj[key]); + continue; + } + + if (Array.isArray(obj[key]) && obj[key].length === 1 && obj[key][0] == null) { + throw new TypeError('Invalid value for schema Array path `' + fullPath + + '`, got value "' + obj[key][0] + '"'); + } + + if (!(utils.isPOJO(obj[key]) || obj[key] instanceof SchemaTypeOptions)) { + // Special-case: Non-options definitely a path so leaf at this node + // Examples: Schema instances, SchemaType instances + if (prefix) { + this.nested[prefix.substr(0, prefix.length - 1)] = true; + } + this.path(prefix + key, obj[key]); + } else if (Object.keys(obj[key]).length < 1) { + // Special-case: {} always interpreted as Mixed path so leaf at this node + if (prefix) { + this.nested[prefix.substr(0, prefix.length - 1)] = true; + } + this.path(fullPath, obj[key]); // mixed type + } else if (!obj[key][this.options.typeKey] || (this.options.typeKey === 'type' && obj[key].type.type)) { + // Special-case: POJO with no bona-fide type key - interpret as tree of deep paths so recurse + // nested object { last: { name: String }} + this.nested[fullPath] = true; + this.add(obj[key], fullPath + '.'); + } else { + // There IS a bona-fide type key that may also be a POJO + const _typeDef = obj[key][this.options.typeKey]; + if (utils.isPOJO(_typeDef) && Object.keys(_typeDef).length > 0) { + // If a POJO is the value of a type key, make it a subdocument + if (prefix) { + this.nested[prefix.substr(0, prefix.length - 1)] = true; + } + const _schema = new Schema(_typeDef); + const schemaWrappedPath = Object.assign({}, obj[key], { type: _schema }); + this.path(prefix + key, schemaWrappedPath); + } else { + // Either the type is non-POJO or we interpret it as Mixed anyway + if (prefix) { + this.nested[prefix.substr(0, prefix.length - 1)] = true; + } + this.path(prefix + key, obj[key]); + } + } + } + + const addedKeys = Object.keys(obj). + map(key => prefix ? prefix + key : key); + aliasFields(this, addedKeys); + return this; +}; + +/** + * Reserved document keys. + * + * Keys in this object are names that are warned in schema declarations + * because they have the potential to break Mongoose/ Mongoose plugins functionality. If you create a schema + * using `new Schema()` with one of these property names, Mongoose will log a warning. + * + * - _posts + * - _pres + * - collection + * - emit + * - errors + * - get + * - init + * - isModified + * - isNew + * - listeners + * - modelName + * - on + * - once + * - populated + * - prototype + * - remove + * - removeListener + * - save + * - schema + * - toObject + * - validate + * + * _NOTE:_ Use of these terms as method names is permitted, but play at your own risk, as they may be existing mongoose document methods you are stomping on. 
+ * + * const schema = new Schema(..); + * schema.methods.init = function () {} // potentially breaking + */ + +Schema.reserved = Object.create(null); +Schema.prototype.reserved = Schema.reserved; + +const reserved = Schema.reserved; +// Core object +reserved['prototype'] = +// EventEmitter +reserved.emit = +reserved.listeners = +reserved.on = +reserved.removeListener = + +// document properties and functions +reserved.collection = +reserved.errors = +reserved.get = +reserved.init = +reserved.isModified = +reserved.isNew = +reserved.populated = +reserved.remove = +reserved.save = +reserved.toObject = +reserved.validate = 1; +reserved.collection = 1; + +/** + * Gets/sets schema paths. + * + * Sets a path (if arity 2) + * Gets a path (if arity 1) + * + * ####Example + * + * schema.path('name') // returns a SchemaType + * schema.path('name', Number) // changes the schemaType of `name` to Number + * + * @param {String} path + * @param {Object} constructor + * @api public + */ + +Schema.prototype.path = function(path, obj) { + // Convert to '.$' to check subpaths re: gh-6405 + const cleanPath = _pathToPositionalSyntax(path); + if (obj === undefined) { + let schematype = _getPath(this, path, cleanPath); + if (schematype != null) { + return schematype; + } + + // Look for maps + const mapPath = getMapPath(this, path); + if (mapPath != null) { + return mapPath; + } + + // Look if a parent of this path is mixed + schematype = this.hasMixedParent(cleanPath); + if (schematype != null) { + return schematype; + } + + // subpaths? + return /\.\d+\.?.*$/.test(path) + ? getPositionalPath(this, path) + : undefined; + } + + // some path names conflict with document methods + const firstPieceOfPath = path.split('.')[0]; + if (reserved[firstPieceOfPath] && !this.options.supressReservedKeysWarning) { + const errorMessage = `\`${firstPieceOfPath}\` is a reserved schema pathname and may break some functionality. ` + + 'You are allowed to use it, but use at your own risk. ' + + 'To disable this warning pass `supressReservedKeysWarning` as a schema option.'; + + utils.warn(errorMessage); + } + + if (typeof obj === 'object' && utils.hasUserDefinedProperty(obj, 'ref')) { + validateRef(obj.ref, path); + } + + // update the tree + const subpaths = path.split(/\./); + const last = subpaths.pop(); + let branch = this.tree; + let fullPath = ''; + + for (const sub of subpaths) { + fullPath = fullPath += (fullPath.length > 0 ? '.' : '') + sub; + if (!branch[sub]) { + this.nested[fullPath] = true; + branch[sub] = {}; + } + if (typeof branch[sub] !== 'object') { + const msg = 'Cannot set nested path `' + path + '`. ' + + 'Parent path `' + + fullPath + + '` already set to type ' + branch[sub].name + + '.'; + throw new Error(msg); + } + branch = branch[sub]; + } + + branch[last] = utils.clone(obj); + + this.paths[path] = this.interpretAsType(path, obj, this.options); + const schemaType = this.paths[path]; + + if (schemaType.$isSchemaMap) { + // Maps can have arbitrary keys, so `$*` is internal shorthand for "any key" + // The '$' is to imply this path should never be stored in MongoDB so we + // can easily build a regexp out of this path, and '*' to imply "any key." + const mapPath = path + '.$*'; + + this.paths[mapPath] = schemaType.$__schemaType; + this.mapPaths.push(this.paths[mapPath]); + } + + if (schemaType.$isSingleNested) { + for (const key of Object.keys(schemaType.schema.paths)) { + this.singleNestedPaths[path + '.' 
+ key] = schemaType.schema.paths[key]; + } + for (const key of Object.keys(schemaType.schema.singleNestedPaths)) { + this.singleNestedPaths[path + '.' + key] = + schemaType.schema.singleNestedPaths[key]; + } + for (const key of Object.keys(schemaType.schema.subpaths)) { + this.singleNestedPaths[path + '.' + key] = + schemaType.schema.subpaths[key]; + } + for (const key of Object.keys(schemaType.schema.nested)) { + this.singleNestedPaths[path + '.' + key] = 'nested'; + } + + Object.defineProperty(schemaType.schema, 'base', { + configurable: true, + enumerable: false, + writable: false, + value: this.base + }); + + schemaType.caster.base = this.base; + this.childSchemas.push({ + schema: schemaType.schema, + model: schemaType.caster + }); + } else if (schemaType.$isMongooseDocumentArray) { + Object.defineProperty(schemaType.schema, 'base', { + configurable: true, + enumerable: false, + writable: false, + value: this.base + }); + + schemaType.casterConstructor.base = this.base; + this.childSchemas.push({ + schema: schemaType.schema, + model: schemaType.casterConstructor + }); + } + + if (schemaType.$isMongooseArray && schemaType.caster instanceof SchemaType) { + let arrayPath = path; + let _schemaType = schemaType; + + const toAdd = []; + while (_schemaType.$isMongooseArray) { + arrayPath = arrayPath + '.$'; + + // Skip arrays of document arrays + if (_schemaType.$isMongooseDocumentArray) { + _schemaType.$embeddedSchemaType._arrayPath = arrayPath; + _schemaType.$embeddedSchemaType._arrayParentPath = path; + _schemaType = _schemaType.$embeddedSchemaType.clone(); + } else { + _schemaType.caster._arrayPath = arrayPath; + _schemaType.caster._arrayParentPath = path; + _schemaType = _schemaType.caster.clone(); + } + + _schemaType.path = arrayPath; + toAdd.push(_schemaType); + } + + for (const _schemaType of toAdd) { + this.subpaths[_schemaType.path] = _schemaType; + } + } + + if (schemaType.$isMongooseDocumentArray) { + for (const key of Object.keys(schemaType.schema.paths)) { + const _schemaType = schemaType.schema.paths[key]; + this.subpaths[path + '.' + key] = _schemaType; + if (typeof _schemaType === 'object' && _schemaType != null) { + _schemaType.$isUnderneathDocArray = true; + } + } + for (const key of Object.keys(schemaType.schema.subpaths)) { + const _schemaType = schemaType.schema.subpaths[key]; + this.subpaths[path + '.' + key] = _schemaType; + if (typeof _schemaType === 'object' && _schemaType != null) { + _schemaType.$isUnderneathDocArray = true; + } + } + for (const key of Object.keys(schemaType.schema.singleNestedPaths)) { + const _schemaType = schemaType.schema.singleNestedPaths[key]; + this.subpaths[path + '.' + key] = _schemaType; + if (typeof _schemaType === 'object' && _schemaType != null) { + _schemaType.$isUnderneathDocArray = true; + } + } + } + + return this; +}; + +/*! + * ignore + */ + +function gatherChildSchemas(schema) { + const childSchemas = []; + + for (const path of Object.keys(schema.paths)) { + const schematype = schema.paths[path]; + if (schematype.$isMongooseDocumentArray || schematype.$isSingleNested) { + childSchemas.push({ schema: schematype.schema, model: schematype.caster }); + } + } + + return childSchemas; +} + +/*! 
+ * ignore + */ + +function _getPath(schema, path, cleanPath) { + if (schema.paths.hasOwnProperty(path)) { + return schema.paths[path]; + } + if (schema.subpaths.hasOwnProperty(cleanPath)) { + return schema.subpaths[cleanPath]; + } + if (schema.singleNestedPaths.hasOwnProperty(cleanPath) && typeof schema.singleNestedPaths[cleanPath] === 'object') { + return schema.singleNestedPaths[cleanPath]; + } + + return null; +} + +/*! + * ignore + */ + +function _pathToPositionalSyntax(path) { + if (!/\.\d+/.test(path)) { + return path; + } + return path.replace(/\.\d+\./g, '.$.').replace(/\.\d+$/, '.$'); +} + +/*! + * ignore + */ + +function getMapPath(schema, path) { + if (schema.mapPaths.length === 0) { + return null; + } + for (const val of schema.mapPaths) { + const _path = val.path; + const re = new RegExp('^' + _path.replace(/\.\$\*/g, '\\.[^.]+') + '$'); + if (re.test(path)) { + return schema.paths[_path]; + } + } + + return null; +} + +/** + * The Mongoose instance this schema is associated with + * + * @property base + * @api private + */ + +Object.defineProperty(Schema.prototype, 'base', { + configurable: true, + enumerable: false, + writable: true, + value: null +}); + +/** + * Converts type arguments into Mongoose Types. + * + * @param {String} path + * @param {Object} obj constructor + * @api private + */ + +Schema.prototype.interpretAsType = function(path, obj, options) { + if (obj instanceof SchemaType) { + if (obj.path === path) { + return obj; + } + const clone = obj.clone(); + clone.path = path; + return clone; + } + + // If this schema has an associated Mongoose object, use the Mongoose object's + // copy of SchemaTypes re: gh-7158 gh-6933 + const MongooseTypes = this.base != null ? this.base.Schema.Types : Schema.Types; + + if (!utils.isPOJO(obj) && !(obj instanceof SchemaTypeOptions)) { + const constructorName = utils.getFunctionName(obj.constructor); + if (constructorName !== 'Object') { + const oldObj = obj; + obj = {}; + obj[options.typeKey] = oldObj; + } + } + + // Get the type making sure to allow keys named "type" + // and default to mixed if not specified. + // { type: { type: String, default: 'freshcut' } } + let type = obj[options.typeKey] && (options.typeKey !== 'type' || !obj.type.type) + ? obj[options.typeKey] + : {}; + let name; + + if (utils.isPOJO(type) || type === 'mixed') { + return new MongooseTypes.Mixed(path, obj); + } + + if (Array.isArray(type) || type === Array || type === 'array' || type === MongooseTypes.Array) { + // if it was specified through { type } look for `cast` + let cast = (type === Array || type === 'array') + ? obj.cast || obj.of + : type[0]; + + // new Schema({ path: [new Schema({ ... })] }) + if (cast && cast.instanceOfSchema) { + if (!(cast instanceof Schema)) { + throw new TypeError('Schema for array path `' + path + + '` is from a different copy of the Mongoose module. Please make sure you\'re using the same version ' + + 'of Mongoose everywhere with `npm list mongoose`.'); + } + return new MongooseTypes.DocumentArray(path, cast, obj); + } + if (cast && + cast[options.typeKey] && + cast[options.typeKey].instanceOfSchema) { + if (!(cast[options.typeKey] instanceof Schema)) { + throw new TypeError('Schema for array path `' + path + + '` is from a different copy of the Mongoose module. 
Please make sure you\'re using the same version ' + + 'of Mongoose everywhere with `npm list mongoose`.'); + } + return new MongooseTypes.DocumentArray(path, cast[options.typeKey], obj, cast); + } + + if (Array.isArray(cast)) { + return new MongooseTypes.Array(path, this.interpretAsType(path, cast, options), obj); + } + + // Handle both `new Schema({ arr: [{ subpath: String }] })` and `new Schema({ arr: [{ type: { subpath: string } }] })` + const castFromTypeKey = (cast != null && cast[options.typeKey] && (options.typeKey !== 'type' || !cast.type.type)) ? + cast[options.typeKey] : + cast; + if (typeof cast === 'string') { + cast = MongooseTypes[cast.charAt(0).toUpperCase() + cast.substring(1)]; + } else if (utils.isPOJO(castFromTypeKey)) { + if (Object.keys(castFromTypeKey).length) { + // The `minimize` and `typeKey` options propagate to child schemas + // declared inline, like `{ arr: [{ val: { $type: String } }] }`. + // See gh-3560 + const childSchemaOptions = { minimize: options.minimize }; + if (options.typeKey) { + childSchemaOptions.typeKey = options.typeKey; + } + // propagate 'strict' option to child schema + if (options.hasOwnProperty('strict')) { + childSchemaOptions.strict = options.strict; + } + if (this._userProvidedOptions.hasOwnProperty('_id')) { + childSchemaOptions._id = this._userProvidedOptions._id; + } else if (Schema.Types.DocumentArray.defaultOptions._id != null) { + childSchemaOptions._id = Schema.Types.DocumentArray.defaultOptions._id; + } + const childSchema = new Schema(castFromTypeKey, childSchemaOptions); + childSchema.$implicitlyCreated = true; + return new MongooseTypes.DocumentArray(path, childSchema, obj); + } else { + // Special case: empty object becomes mixed + return new MongooseTypes.Array(path, MongooseTypes.Mixed, obj); + } + } + + if (cast) { + type = cast[options.typeKey] && (options.typeKey !== 'type' || !cast.type.type) + ? cast[options.typeKey] + : cast; + + name = typeof type === 'string' + ? type + : type.schemaName || utils.getFunctionName(type); + + // For Jest 26+, see #10296 + if (name === 'ClockDate') { + name = 'Date'; + } + + if (name === void 0) { + throw new TypeError('Invalid schema configuration: ' + + `Could not determine the embedded type for array \`${path}\`. ` + + 'See https://mongoosejs.com/docs/guide.html#definition for more info on supported schema syntaxes.'); + } + if (!MongooseTypes.hasOwnProperty(name)) { + throw new TypeError('Invalid schema configuration: ' + + `\`${name}\` is not a valid type within the array \`${path}\`.` + + 'See http://bit.ly/mongoose-schematypes for a list of valid schema types.'); + } + } + + return new MongooseTypes.Array(path, cast || MongooseTypes.Mixed, obj, options); + } + + if (type && type.instanceOfSchema) { + return new MongooseTypes.Subdocument(type, path, obj); + } + + if (Buffer.isBuffer(type)) { + name = 'Buffer'; + } else if (typeof type === 'function' || typeof type === 'object') { + name = type.schemaName || utils.getFunctionName(type); + } else { + name = type == null ? '' + type : type.toString(); + } + + if (name) { + name = name.charAt(0).toUpperCase() + name.substring(1); + } + // Special case re: gh-7049 because the bson `ObjectID` class' capitalization + // doesn't line up with Mongoose's. + if (name === 'ObjectID') { + name = 'ObjectId'; + } + // For Jest 26+, see #10296 + if (name === 'ClockDate') { + name = 'Date'; + } + + if (name === void 0) { + throw new TypeError(`Invalid schema configuration: \`${path}\` schematype definition is ` + + 'invalid. 
See ' + + 'https://mongoosejs.com/docs/guide.html#definition for more info on supported schema syntaxes.'); + } + if (MongooseTypes[name] == null) { + throw new TypeError(`Invalid schema configuration: \`${name}\` is not ` + + `a valid type at path \`${path}\`. See ` + + 'http://bit.ly/mongoose-schematypes for a list of valid schema types.'); + } + + const schemaType = new MongooseTypes[name](path, obj); + + if (schemaType.$isSchemaMap) { + createMapNestedSchemaType(this, schemaType, path, obj, options); + } + + return schemaType; +}; + +/*! + * ignore + */ + +function createMapNestedSchemaType(schema, schemaType, path, obj, options) { + const mapPath = path + '.$*'; + let _mapType = { type: {} }; + if (utils.hasUserDefinedProperty(obj, 'of')) { + const isInlineSchema = utils.isPOJO(obj.of) && + Object.keys(obj.of).length > 0 && + !utils.hasUserDefinedProperty(obj.of, schema.options.typeKey); + if (isInlineSchema) { + _mapType = { [schema.options.typeKey]: new Schema(obj.of) }; + } else if (utils.isPOJO(obj.of)) { + _mapType = Object.assign({}, obj.of); + } else { + _mapType = { [schema.options.typeKey]: obj.of }; + } + + if (utils.hasUserDefinedProperty(obj, 'ref')) { + _mapType.ref = obj.ref; + } + } + schemaType.$__schemaType = schema.interpretAsType(mapPath, _mapType, options); +} + +/** + * Iterates the schemas paths similar to Array#forEach. + * + * The callback is passed the pathname and the schemaType instance. + * + * ####Example: + * + * const userSchema = new Schema({ name: String, registeredAt: Date }); + * userSchema.eachPath((pathname, schematype) => { + * // Prints twice: + * // name SchemaString { ... } + * // registeredAt SchemaDate { ... } + * console.log(pathname, schematype); + * }); + * + * @param {Function} fn callback function + * @return {Schema} this + * @api public + */ + +Schema.prototype.eachPath = function(fn) { + const keys = Object.keys(this.paths); + const len = keys.length; + + for (let i = 0; i < len; ++i) { + fn(keys[i], this.paths[keys[i]]); + } + + return this; +}; + +/** + * Returns an Array of path strings that are required by this schema. + * + * ####Example: + * const s = new Schema({ + * name: { type: String, required: true }, + * age: { type: String, required: true }, + * notes: String + * }); + * s.requiredPaths(); // [ 'age', 'name' ] + * + * @api public + * @param {Boolean} invalidate refresh the cache + * @return {Array} + */ + +Schema.prototype.requiredPaths = function requiredPaths(invalidate) { + if (this._requiredpaths && !invalidate) { + return this._requiredpaths; + } + + const paths = Object.keys(this.paths); + let i = paths.length; + const ret = []; + + while (i--) { + const path = paths[i]; + if (this.paths[path].isRequired) { + ret.push(path); + } + } + this._requiredpaths = ret; + return this._requiredpaths; +}; + +/** + * Returns indexes from fields and schema-level indexes (cached). + * + * @api private + * @return {Array} + */ + +Schema.prototype.indexedPaths = function indexedPaths() { + if (this._indexedpaths) { + return this._indexedpaths; + } + this._indexedpaths = this.indexes(); + return this._indexedpaths; +}; + +/** + * Returns the pathType of `path` for this schema. + * + * Given a path, returns whether it is a real, virtual, nested, or ad-hoc/undefined path. 
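A short sketch of the path-introspection helpers defined above (`eachPath()`, `requiredPaths()`), using a hypothetical user schema:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const userSchema = new Schema({
  name: { type: String, required: true },
  age: Number,
  address: { city: String } // nested path -> 'address.city'
});

// eachPath() iterates (pathname, SchemaType) pairs, including the automatic '_id'.
userSchema.eachPath((pathname, schematype) => {
  console.log(pathname, schematype.instance);
});

console.log(userSchema.requiredPaths()); // [ 'name' ]
```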
+ * + * ####Example: + * const s = new Schema({ name: String, nested: { foo: String } }); + * s.virtual('foo').get(() => 42); + * s.pathType('name'); // "real" + * s.pathType('nested'); // "nested" + * s.pathType('foo'); // "virtual" + * s.pathType('fail'); // "adhocOrUndefined" + * + * @param {String} path + * @return {String} + * @api public + */ + +Schema.prototype.pathType = function(path) { + // Convert to '.$' to check subpaths re: gh-6405 + const cleanPath = _pathToPositionalSyntax(path); + + if (this.paths.hasOwnProperty(path)) { + return 'real'; + } + if (this.virtuals.hasOwnProperty(path)) { + return 'virtual'; + } + if (this.nested.hasOwnProperty(path)) { + return 'nested'; + } + if (this.subpaths.hasOwnProperty(cleanPath) || this.subpaths.hasOwnProperty(path)) { + return 'real'; + } + + const singleNestedPath = this.singleNestedPaths.hasOwnProperty(cleanPath) || this.singleNestedPaths.hasOwnProperty(path); + if (singleNestedPath) { + return singleNestedPath === 'nested' ? 'nested' : 'real'; + } + + // Look for maps + const mapPath = getMapPath(this, path); + if (mapPath != null) { + return 'real'; + } + + if (/\.\d+\.|\.\d+$/.test(path)) { + return getPositionalPathType(this, path); + } + return 'adhocOrUndefined'; +}; + +/** + * Returns true iff this path is a child of a mixed schema. + * + * @param {String} path + * @return {Boolean} + * @api private + */ + +Schema.prototype.hasMixedParent = function(path) { + const subpaths = path.split(/\./g); + path = ''; + for (let i = 0; i < subpaths.length; ++i) { + path = i > 0 ? path + '.' + subpaths[i] : subpaths[i]; + if (this.paths.hasOwnProperty(path) && + this.paths[path] instanceof MongooseTypes.Mixed) { + return this.paths[path]; + } + } + + return null; +}; + +/** + * Setup updatedAt and createdAt timestamps to documents if enabled + * + * @param {Boolean|Object} timestamps timestamps options + * @api private + */ +Schema.prototype.setupTimestamp = function(timestamps) { + return setupTimestamps(this, timestamps); +}; + +/*! + * ignore. Deprecated re: #6405 + */ + +function getPositionalPathType(self, path) { + const subpaths = path.split(/\.(\d+)\.|\.(\d+)$/).filter(Boolean); + if (subpaths.length < 2) { + return self.paths.hasOwnProperty(subpaths[0]) ? + self.paths[subpaths[0]] : + 'adhocOrUndefined'; + } + + let val = self.path(subpaths[0]); + let isNested = false; + if (!val) { + return 'adhocOrUndefined'; + } + + const last = subpaths.length - 1; + + for (let i = 1; i < subpaths.length; ++i) { + isNested = false; + const subpath = subpaths[i]; + + if (i === last && val && !/\D/.test(subpath)) { + if (val.$isMongooseDocumentArray) { + val = val.$embeddedSchemaType; + } else if (val instanceof MongooseTypes.Array) { + // StringSchema, NumberSchema, etc + val = val.caster; + } else { + val = undefined; + } + break; + } + + // ignore if its just a position segment: path.0.subpath + if (!/\D/.test(subpath)) { + // Nested array + if (val instanceof MongooseTypes.Array && i !== last) { + val = val.caster; + } + continue; + } + + if (!(val && val.schema)) { + val = undefined; + break; + } + + const type = val.schema.pathType(subpath); + isNested = (type === 'nested'); + val = val.schema.path(subpath); + } + + self.subpaths[path] = val; + if (val) { + return 'real'; + } + if (isNested) { + return 'nested'; + } + return 'adhocOrUndefined'; +} + + +/*! + * ignore + */ + +function getPositionalPath(self, path) { + getPositionalPathType(self, path); + return self.subpaths[path]; +} + +/** + * Adds a method call to the queue. 
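The private `setupTimestamp()` helper above is what backs the public `timestamps` schema option; a hedged sketch of how that option is typically enabled (the note schema is illustrative):

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

// `timestamps: true` asks Mongoose to add and maintain createdAt/updatedAt.
const noteSchema = new Schema({ body: String }, { timestamps: true });

console.log(noteSchema.path('createdAt') != null); // expected: true
console.log(noteSchema.path('updatedAt') != null); // expected: true
```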
+ * + * ####Example: + * + * schema.methods.print = function() { console.log(this); }; + * schema.queue('print', []); // Print the doc every one is instantiated + * + * const Model = mongoose.model('Test', schema); + * new Model({ name: 'test' }); // Prints '{"_id": ..., "name": "test" }' + * + * @param {String} name name of the document method to call later + * @param {Array} args arguments to pass to the method + * @api public + */ + +Schema.prototype.queue = function(name, args) { + this.callQueue.push([name, args]); + return this; +}; + +/** + * Defines a pre hook for the model. + * + * ####Example + * + * const toySchema = new Schema({ name: String, created: Date }); + * + * toySchema.pre('save', function(next) { + * if (!this.created) this.created = new Date; + * next(); + * }); + * + * toySchema.pre('validate', function(next) { + * if (this.name !== 'Woody') this.name = 'Woody'; + * next(); + * }); + * + * // Equivalent to calling `pre()` on `find`, `findOne`, `findOneAndUpdate`. + * toySchema.pre(/^find/, function(next) { + * console.log(this.getFilter()); + * }); + * + * // Equivalent to calling `pre()` on `updateOne`, `findOneAndUpdate`. + * toySchema.pre(['updateOne', 'findOneAndUpdate'], function(next) { + * console.log(this.getFilter()); + * }); + * + * toySchema.pre('deleteOne', function() { + * // Runs when you call `Toy.deleteOne()` + * }); + * + * toySchema.pre('deleteOne', { document: true }, function() { + * // Runs when you call `doc.deleteOne()` + * }); + * + * @param {String|RegExp} The method name or regular expression to match method name + * @param {Object} [options] + * @param {Boolean} [options.document] If `name` is a hook for both document and query middleware, set to `true` to run on document middleware. For example, set `options.document` to `true` to apply this hook to `Document#deleteOne()` rather than `Query#deleteOne()`. + * @param {Boolean} [options.query] If `name` is a hook for both document and query middleware, set to `true` to run on query middleware. 
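Besides the callback style shown in the examples above, recent Mongoose versions also accept an async function as `pre()` middleware and await the returned promise; a brief sketch with an illustrative account schema:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const accountSchema = new Schema({ email: String });

// No `next` needed: Mongoose waits for the async function to resolve.
accountSchema.pre('save', async function() {
  // `this` is the document being saved.
  this.email = this.email.trim().toLowerCase();
});
```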
+ * @param {Function} callback + * @api public + */ + +Schema.prototype.pre = function(name) { + if (name instanceof RegExp) { + const remainingArgs = Array.prototype.slice.call(arguments, 1); + for (const fn of hookNames) { + if (name.test(fn)) { + this.pre.apply(this, [fn].concat(remainingArgs)); + } + } + return this; + } + if (Array.isArray(name)) { + const remainingArgs = Array.prototype.slice.call(arguments, 1); + for (const el of name) { + this.pre.apply(this, [el].concat(remainingArgs)); + } + return this; + } + this.s.hooks.pre.apply(this.s.hooks, arguments); + return this; +}; + +/** + * Defines a post hook for the document + * + * const schema = new Schema(..); + * schema.post('save', function (doc) { + * console.log('this fired after a document was saved'); + * }); + * + * schema.post('find', function(docs) { + * console.log('this fired after you ran a find query'); + * }); + * + * schema.post(/Many$/, function(res) { + * console.log('this fired after you ran `updateMany()` or `deleteMany()`); + * }); + * + * const Model = mongoose.model('Model', schema); + * + * const m = new Model(..); + * m.save(function(err) { + * console.log('this fires after the `post` hook'); + * }); + * + * m.find(function(err, docs) { + * console.log('this fires after the post find hook'); + * }); + * + * @param {String|RegExp} The method name or regular expression to match method name + * @param {Object} [options] + * @param {Boolean} [options.document] If `name` is a hook for both document and query middleware, set to `true` to run on document middleware. + * @param {Boolean} [options.query] If `name` is a hook for both document and query middleware, set to `true` to run on query middleware. + * @param {Function} fn callback + * @see middleware http://mongoosejs.com/docs/middleware.html + * @see kareem http://npmjs.org/package/kareem + * @api public + */ + +Schema.prototype.post = function(name) { + if (name instanceof RegExp) { + const remainingArgs = Array.prototype.slice.call(arguments, 1); + for (const fn of hookNames) { + if (name.test(fn)) { + this.post.apply(this, [fn].concat(remainingArgs)); + } + } + return this; + } + if (Array.isArray(name)) { + const remainingArgs = Array.prototype.slice.call(arguments, 1); + for (const el of name) { + this.post.apply(this, [el].concat(remainingArgs)); + } + return this; + } + this.s.hooks.post.apply(this.s.hooks, arguments); + return this; +}; + +/** + * Registers a plugin for this schema. + * + * ####Example: + * + * const s = new Schema({ name: String }); + * s.plugin(schema => console.log(schema.path('name').path)); + * mongoose.model('Test', s); // Prints 'name' + * + * @param {Function} plugin callback + * @param {Object} [opts] + * @see plugins + * @api public + */ + +Schema.prototype.plugin = function(fn, opts) { + if (typeof fn !== 'function') { + throw new Error('First param to `schema.plugin()` must be a function, ' + + 'got "' + (typeof fn) + '"'); + } + + if (opts && opts.deduplicate) { + for (const plugin of this.plugins) { + if (plugin.fn === fn) { + return this; + } + } + } + this.plugins.push({ fn: fn, opts: opts }); + + fn(this, opts); + return this; +}; + +/** + * Adds an instance method to documents constructed from Models compiled from this schema. 
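A small sketch of `plugin()` in action; the `softDeletePlugin` helper and the flag it adds are hypothetical, not part of Mongoose:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

// A reusable plugin: adds a `deleted` flag plus an instance method to any
// schema it is applied to.
function softDeletePlugin(schema, options) {
  schema.add({ deleted: { type: Boolean, default: false } });
  schema.methods.softDelete = function() {
    this.deleted = true;
    return this.save();
  };
}

const postSchema = new Schema({ title: String });
// `deduplicate: true` prevents registering the same plugin function twice.
postSchema.plugin(softDeletePlugin, { deduplicate: true });
```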
+ * + * ####Example + * + * const schema = kittySchema = new Schema(..); + * + * schema.method('meow', function () { + * console.log('meeeeeoooooooooooow'); + * }) + * + * const Kitty = mongoose.model('Kitty', schema); + * + * const fizz = new Kitty; + * fizz.meow(); // meeeeeooooooooooooow + * + * If a hash of name/fn pairs is passed as the only argument, each name/fn pair will be added as methods. + * + * schema.method({ + * purr: function () {} + * , scratch: function () {} + * }); + * + * // later + * fizz.purr(); + * fizz.scratch(); + * + * NOTE: `Schema.method()` adds instance methods to the `Schema.methods` object. You can also add instance methods directly to the `Schema.methods` object as seen in the [guide](/docs/guide.html#methods) + * + * @param {String|Object} method name + * @param {Function} [fn] + * @api public + */ + +Schema.prototype.method = function(name, fn, options) { + if (typeof name !== 'string') { + for (const i in name) { + this.methods[i] = name[i]; + this.methodOptions[i] = utils.clone(options); + } + } else { + this.methods[name] = fn; + this.methodOptions[name] = utils.clone(options); + } + return this; +}; + +/** + * Adds static "class" methods to Models compiled from this schema. + * + * ####Example + * + * const schema = new Schema(..); + * // Equivalent to `schema.statics.findByName = function(name) {}`; + * schema.static('findByName', function(name) { + * return this.find({ name: name }); + * }); + * + * const Drink = mongoose.model('Drink', schema); + * await Drink.findByName('LaCroix'); + * + * If a hash of name/fn pairs is passed as the only argument, each name/fn pair will be added as statics. + * + * @param {String|Object} name + * @param {Function} [fn] + * @api public + * @see Statics /docs/guide.html#statics + */ + +Schema.prototype.static = function(name, fn) { + if (typeof name !== 'string') { + for (const i in name) { + this.statics[i] = name[i]; + } + } else { + this.statics[name] = fn; + } + return this; +}; + +/** + * Defines an index (most likely compound) for this schema. + * + * ####Example + * + * schema.index({ first: 1, last: -1 }) + * + * @param {Object} fields + * @param {Object} [options] Options to pass to [MongoDB driver's `createIndex()` function](http://mongodb.github.io/node-mongodb-native/2.0/api/Collection.html#createIndex) + * @param {String | number} [options.expires=null] Mongoose-specific syntactic sugar, uses [ms](https://www.npmjs.com/package/ms) to convert `expires` option into seconds for the `expireAfterSeconds` in the above link. + * @api public + */ + +Schema.prototype.index = function(fields, options) { + fields || (fields = {}); + options || (options = {}); + + if (options.expires) { + utils.expires(options); + } + + this._indexes.push([fields, options]); + return this; +}; + +/** + * Sets a schema option. 
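A quick sketch combining `static()` and `index()` as documented above, including the `expires` TTL shorthand; the session schema is illustrative:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const sessionSchema = new Schema({
  token: String,
  user: String,
  createdAt: { type: Date, default: Date.now }
});

// Compound secondary index, plus a TTL index via the `expires` sugar
// (converted to `expireAfterSeconds` for the MongoDB driver).
sessionSchema.index({ user: 1, token: 1 });
sessionSchema.index({ createdAt: 1 }, { expires: '2h' });

// Static "class" method available on the compiled model.
sessionSchema.static('findByUser', function(user) {
  return this.find({ user });
});

console.log(sessionSchema.indexes().length >= 2); // expected: true
```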
+ * + * ####Example + * + * schema.set('strict'); // 'true' by default + * schema.set('strict', false); // Sets 'strict' to false + * schema.set('strict'); // 'false' + * + * @param {String} key option name + * @param {Object} [value] if not passed, the current option value is returned + * @see Schema ./ + * @api public + */ + +Schema.prototype.set = function(key, value, _tags) { + if (arguments.length === 1) { + return this.options[key]; + } + + switch (key) { + case 'read': + this.options[key] = readPref(value, _tags); + this._userProvidedOptions[key] = this.options[key]; + break; + case 'timestamps': + this.setupTimestamp(value); + this.options[key] = value; + this._userProvidedOptions[key] = this.options[key]; + break; + case '_id': + this.options[key] = value; + this._userProvidedOptions[key] = this.options[key]; + + if (value && !this.paths['_id']) { + addAutoId(this); + } else if (!value && this.paths['_id'] != null && this.paths['_id'].auto) { + this.remove('_id'); + } + break; + default: + this.options[key] = value; + this._userProvidedOptions[key] = this.options[key]; + break; + } + + return this; +}; + +/** + * Gets a schema option. + * + * ####Example: + * + * schema.get('strict'); // true + * schema.set('strict', false); + * schema.get('strict'); // false + * + * @param {String} key option name + * @api public + * @return {Any} the option's value + */ + +Schema.prototype.get = function(key) { + return this.options[key]; +}; + +/** + * The allowed index types + * + * @receiver Schema + * @static indexTypes + * @api public + */ + +const indexTypes = '2d 2dsphere hashed text'.split(' '); + +Object.defineProperty(Schema, 'indexTypes', { + get: function() { + return indexTypes; + }, + set: function() { + throw new Error('Cannot overwrite Schema.indexTypes'); + } +}); + +/** + * Returns a list of indexes that this schema declares, via `schema.index()` or by `index: true` in a path's options. + * Indexes are expressed as an array `[spec, options]`. + * + * ####Example: + * + * const userSchema = new Schema({ + * email: { type: String, required: true, unique: true }, + * registeredAt: { type: Date, index: true } + * }); + * + * // [ [ { email: 1 }, { unique: true, background: true } ], + * // [ { registeredAt: 1 }, { background: true } ] ] + * userSchema.indexes(); + * + * [Plugins](/docs/plugins.html) can use the return value of this function to modify a schema's indexes. + * For example, the below plugin makes every index unique by default. + * + * function myPlugin(schema) { + * for (const index of schema.indexes()) { + * if (index[1].unique === undefined) { + * index[1].unique = true; + * } + * } + * } + * + * @api public + * @return {Array} list of indexes defined in the schema + */ + +Schema.prototype.indexes = function() { + return getIndexes(this); +}; + +/** + * Creates a virtual type with the given name. + * + * @param {String} name + * @param {Object} [options] + * @param {String|Model} [options.ref] model name or model instance. Marks this as a [populate virtual](populate.html#populate-virtuals). + * @param {String|Function} [options.localField] Required for populate virtuals. See [populate virtual docs](populate.html#populate-virtuals) for more information. + * @param {String|Function} [options.foreignField] Required for populate virtuals. See [populate virtual docs](populate.html#populate-virtuals) for more information. + * @param {Boolean|Function} [options.justOne=false] Only works with populate virtuals. 
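A brief sketch of reading and changing schema options with `get()`/`set()`; the comment schema is illustrative:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const commentSchema = new Schema({ body: String });

console.log(commentSchema.get('strict')); // true by default
commentSchema.set('strict', false);       // allow paths not declared in the schema
console.log(commentSchema.get('strict')); // false

// Options with side effects, like `timestamps`, can also be set afterwards.
commentSchema.set('timestamps', true);
```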
If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), will be a single doc or `null`. Otherwise, the populate virtual will be an array. + * @param {Boolean} [options.count=false] Only works with populate virtuals. If [truthy](https://masteringjs.io/tutorials/fundamentals/truthy), this populate virtual will contain the number of documents rather than the documents themselves when you `populate()`. + * @param {Function|null} [options.get=null] Adds a [getter](/docs/tutorials/getters-setters.html) to this virtual to transform the populated doc. + * @return {VirtualType} + */ + +Schema.prototype.virtual = function(name, options) { + if (name instanceof VirtualType || getConstructorName(name) === 'VirtualType') { + return this.virtual(name.path, name.options); + } + + options = new VirtualOptions(options); + + if (utils.hasUserDefinedProperty(options, ['ref', 'refPath'])) { + if (options.localField == null) { + throw new Error('Reference virtuals require `localField` option'); + } + + if (options.foreignField == null) { + throw new Error('Reference virtuals require `foreignField` option'); + } + + this.pre('init', function(obj) { + if (mpath.has(name, obj)) { + const _v = mpath.get(name, obj); + if (!this.$$populatedVirtuals) { + this.$$populatedVirtuals = {}; + } + + if (options.justOne || options.count) { + this.$$populatedVirtuals[name] = Array.isArray(_v) ? + _v[0] : + _v; + } else { + this.$$populatedVirtuals[name] = Array.isArray(_v) ? + _v : + _v == null ? [] : [_v]; + } + + mpath.unset(name, obj); + } + }); + + const virtual = this.virtual(name); + virtual.options = options; + + virtual. + set(function(_v) { + if (!this.$$populatedVirtuals) { + this.$$populatedVirtuals = {}; + } + + if (options.justOne || options.count) { + this.$$populatedVirtuals[name] = Array.isArray(_v) ? + _v[0] : + _v; + + if (typeof this.$$populatedVirtuals[name] !== 'object') { + this.$$populatedVirtuals[name] = options.count ? _v : null; + } + } else { + this.$$populatedVirtuals[name] = Array.isArray(_v) ? + _v : + _v == null ? [] : [_v]; + + this.$$populatedVirtuals[name] = this.$$populatedVirtuals[name].filter(function(doc) { + return doc && typeof doc === 'object'; + }); + } + }); + + if (typeof options.get === 'function') { + virtual.get(options.get); + } + + // Workaround for gh-8198: if virtual is under document array, make a fake + // virtual. See gh-8210 + const parts = name.split('.'); + let cur = parts[0]; + for (let i = 0; i < parts.length - 1; ++i) { + if (this.paths[cur] != null && this.paths[cur].$isMongooseDocumentArray) { + const remnant = parts.slice(i + 1).join('.'); + this.paths[cur].schema.virtual(remnant, options); + break; + } + + cur += '.' + parts[i + 1]; + } + + return virtual; + } + + const virtuals = this.virtuals; + const parts = name.split('.'); + + if (this.pathType(name) === 'real') { + throw new Error('Virtual path "' + name + '"' + + ' conflicts with a real path in the schema'); + } + + virtuals[name] = parts.reduce(function(mem, part, i) { + mem[part] || (mem[part] = (i === parts.length - 1) + ? new VirtualType(options, name) + : {}); + return mem[part]; + }, this.tree); + + return virtuals[name]; +}; + +/** + * Returns the virtual type with the given `name`. + * + * @param {String} name + * @return {VirtualType} + */ + +Schema.prototype.virtualpath = function(name) { + return this.virtuals.hasOwnProperty(name) ? this.virtuals[name] : null; +}; + +/** + * Removes the given `path` (or [`paths`]). 
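The `ref`/`localField`/`foreignField` handling above is what powers populate virtuals; a hedged sketch in which the `Post` model and the author schema are hypothetical:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const authorSchema = new Schema({ name: String });

// `posts` is never stored in MongoDB; populate() fills it by matching
// Post.author against this author's _id.
authorSchema.virtual('posts', {
  ref: 'Post',
  localField: '_id',
  foreignField: 'author'
});

// Later, assuming an Author model exists:
//   Author.findOne().populate('posts')
```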
+ * + * ####Example: + * + * const schema = new Schema({ name: String, age: Number }); + * schema.remove('name'); + * schema.path('name'); // Undefined + * schema.path('age'); // SchemaNumber { ... } + * + * @param {String|Array} path + * @return {Schema} the Schema instance + * @api public + */ +Schema.prototype.remove = function(path) { + if (typeof path === 'string') { + path = [path]; + } + if (Array.isArray(path)) { + path.forEach(function(name) { + if (this.path(name) == null && !this.nested[name]) { + return; + } + if (this.nested[name]) { + const allKeys = Object.keys(this.paths). + concat(Object.keys(this.nested)); + for (const path of allKeys) { + if (path.startsWith(name + '.')) { + delete this.paths[path]; + delete this.nested[path]; + _deletePath(this, path); + } + } + + delete this.nested[name]; + _deletePath(this, name); + return; + } + + delete this.paths[name]; + _deletePath(this, name); + }, this); + } + return this; +}; + +/*! + * ignore + */ + +function _deletePath(schema, name) { + const pieces = name.split('.'); + const last = pieces.pop(); + + let branch = schema.tree; + + for (const piece of pieces) { + branch = branch[piece]; + } + + delete branch[last]; +} + +/** + * Loads an ES6 class into a schema. Maps [setters](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/set) + [getters](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Functions/get), [static methods](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes/static), + * and [instance methods](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes#Class_body_and_method_definitions) + * to schema [virtuals](/docs/guide.html#virtuals), + * [statics](/docs/guide.html#statics), and + * [methods](/docs/guide.html#methods). 
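A short sketch of `remove()` with an array of paths, including a nested parent path; the profile schema is illustrative:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const profileSchema = new Schema({
  nickname: String,
  contact: { phone: String, fax: String }
});

// Removing the nested parent drops every child path under it as well.
profileSchema.remove(['nickname', 'contact']);

console.log(profileSchema.path('contact.phone')); // undefined
```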
+ * + * ####Example: + * + * ```javascript + * const md5 = require('md5'); + * const userSchema = new Schema({ email: String }); + * class UserClass { + * // `gravatarImage` becomes a virtual + * get gravatarImage() { + * const hash = md5(this.email.toLowerCase()); + * return `https://www.gravatar.com/avatar/${hash}`; + * } + * + * // `getProfileUrl()` becomes a document method + * getProfileUrl() { + * return `https://mysite.com/${this.email}`; + * } + * + * // `findByEmail()` becomes a static + * static findByEmail(email) { + * return this.findOne({ email }); + * } + * } + * + * // `schema` will now have a `gravatarImage` virtual, a `getProfileUrl()` method, + * // and a `findByEmail()` static + * userSchema.loadClass(UserClass); + * ``` + * + * @param {Function} model + * @param {Boolean} [virtualsOnly] if truthy, only pulls virtuals from the class, not methods or statics + */ +Schema.prototype.loadClass = function(model, virtualsOnly) { + if (model === Object.prototype || + model === Function.prototype || + model.prototype.hasOwnProperty('$isMongooseModelPrototype')) { + return this; + } + + this.loadClass(Object.getPrototypeOf(model), virtualsOnly); + + // Add static methods + if (!virtualsOnly) { + Object.getOwnPropertyNames(model).forEach(function(name) { + if (name.match(/^(length|name|prototype|constructor|__proto__)$/)) { + return; + } + const prop = Object.getOwnPropertyDescriptor(model, name); + if (prop.hasOwnProperty('value')) { + this.static(name, prop.value); + } + }, this); + } + + // Add methods and virtuals + Object.getOwnPropertyNames(model.prototype).forEach(function(name) { + if (name.match(/^(constructor)$/)) { + return; + } + const method = Object.getOwnPropertyDescriptor(model.prototype, name); + if (!virtualsOnly) { + if (typeof method.value === 'function') { + this.method(name, method.value); + } + } + if (typeof method.get === 'function') { + if (this.virtuals[name]) { + this.virtuals[name].getters = []; + } + this.virtual(name).get(method.get); + } + if (typeof method.set === 'function') { + if (this.virtuals[name]) { + this.virtuals[name].setters = []; + } + this.virtual(name).set(method.set); + } + }, this); + + return this; +}; + +/*! + * ignore + */ + +Schema.prototype._getSchema = function(path) { + const _this = this; + const pathschema = _this.path(path); + const resultPath = []; + + if (pathschema) { + pathschema.$fullPath = path; + return pathschema; + } + + function search(parts, schema) { + let p = parts.length + 1; + let foundschema; + let trypath; + + while (p--) { + trypath = parts.slice(0, p).join('.'); + foundschema = schema.path(trypath); + if (foundschema) { + resultPath.push(trypath); + + if (foundschema.caster) { + // array of Mixed? + if (foundschema.caster instanceof MongooseTypes.Mixed) { + foundschema.caster.$fullPath = resultPath.join('.'); + return foundschema.caster; + } + + // Now that we found the array, we need to check if there + // are remaining document paths to look up for casting. + // Also we need to handle array.$.path since schema.path + // doesn't work for that. 
+ // If there is no foundschema.schema we are dealing with + // a path like array.$ + if (p !== parts.length) { + if (foundschema.schema) { + let ret; + if (parts[p] === '$' || isArrayFilter(parts[p])) { + if (p + 1 === parts.length) { + // comments.$ + return foundschema; + } + // comments.$.comments.$.title + ret = search(parts.slice(p + 1), foundschema.schema); + if (ret) { + ret.$isUnderneathDocArray = ret.$isUnderneathDocArray || + !foundschema.schema.$isSingleNested; + } + return ret; + } + // this is the last path of the selector + ret = search(parts.slice(p), foundschema.schema); + if (ret) { + ret.$isUnderneathDocArray = ret.$isUnderneathDocArray || + !foundschema.schema.$isSingleNested; + } + return ret; + } + } + } else if (foundschema.$isSchemaMap) { + if (p >= parts.length) { + return foundschema; + } + // Any path in the map will be an instance of the map's embedded schematype + if (p + 1 >= parts.length) { + return foundschema.$__schemaType; + } + const ret = search(parts.slice(p + 1), foundschema.$__schemaType.schema); + return ret; + } + + foundschema.$fullPath = resultPath.join('.'); + + return foundschema; + } + } + } + + // look for arrays + const parts = path.split('.'); + for (let i = 0; i < parts.length; ++i) { + if (parts[i] === '$' || isArrayFilter(parts[i])) { + // Re: gh-5628, because `schema.path()` doesn't take $ into account. + parts[i] = '0'; + } + } + return search(parts, _this); +}; + +/*! + * ignore + */ + +Schema.prototype._getPathType = function(path) { + const _this = this; + const pathschema = _this.path(path); + + if (pathschema) { + return 'real'; + } + + function search(parts, schema) { + let p = parts.length + 1, + foundschema, + trypath; + + while (p--) { + trypath = parts.slice(0, p).join('.'); + foundschema = schema.path(trypath); + if (foundschema) { + if (foundschema.caster) { + // array of Mixed? + if (foundschema.caster instanceof MongooseTypes.Mixed) { + return { schema: foundschema, pathType: 'mixed' }; + } + + // Now that we found the array, we need to check if there + // are remaining document paths to look up for casting. + // Also we need to handle array.$.path since schema.path + // doesn't work for that. + // If there is no foundschema.schema we are dealing with + // a path like array.$ + if (p !== parts.length && foundschema.schema) { + if (parts[p] === '$' || isArrayFilter(parts[p])) { + if (p === parts.length - 1) { + return { schema: foundschema, pathType: 'nested' }; + } + // comments.$.comments.$.title + return search(parts.slice(p + 1), foundschema.schema); + } + // this is the last path of the selector + return search(parts.slice(p), foundschema.schema); + } + return { + schema: foundschema, + pathType: foundschema.$isSingleNested ? 'nested' : 'array' + }; + } + return { schema: foundschema, pathType: 'real' }; + } else if (p === parts.length && schema.nested[trypath]) { + return { schema: schema, pathType: 'nested' }; + } + } + return { schema: foundschema || schema, pathType: 'undefined' }; + } + + // look for arrays + return search(path.split('.'), _this); +}; + +/*! + * ignore + */ + +function isArrayFilter(piece) { + return piece.startsWith('$[') && piece.endsWith(']'); +} + +/*! + * Called by `compile()` _right before_ compiling. Good for making any changes to + * the schema that should respect options set by plugins, like `id` + */ + +Schema.prototype._preCompile = function _preCompile() { + idGetter(this); +}; + +/*! + * Module exports. 
+ */ + +module.exports = exports = Schema; + +// require down here because of reference issues + +/** + * The various built-in Mongoose Schema Types. + * + * ####Example: + * + * const mongoose = require('mongoose'); + * const ObjectId = mongoose.Schema.Types.ObjectId; + * + * ####Types: + * + * - [String](/docs/schematypes.html#strings) + * - [Number](/docs/schematypes.html#numbers) + * - [Boolean](/docs/schematypes.html#booleans) | Bool + * - [Array](/docs/schematypes.html#arrays) + * - [Buffer](/docs/schematypes.html#buffers) + * - [Date](/docs/schematypes.html#dates) + * - [ObjectId](/docs/schematypes.html#objectids) | Oid + * - [Mixed](/docs/schematypes.html#mixed) + * + * Using this exposed access to the `Mixed` SchemaType, we can use them in our schema. + * + * const Mixed = mongoose.Schema.Types.Mixed; + * new mongoose.Schema({ _user: Mixed }) + * + * @api public + */ + +Schema.Types = MongooseTypes = require('./schema/index'); + +/*! + * ignore + */ + +exports.ObjectId = MongooseTypes.ObjectId; diff --git a/node_modules/mongoose/lib/schema/SubdocumentPath.js b/node_modules/mongoose/lib/schema/SubdocumentPath.js new file mode 100644 index 000000000..bffae008e --- /dev/null +++ b/node_modules/mongoose/lib/schema/SubdocumentPath.js @@ -0,0 +1,349 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const CastError = require('../error/cast'); +const EventEmitter = require('events').EventEmitter; +const ObjectExpectedError = require('../error/objectExpected'); +const SchemaSubdocumentOptions = require('../options/SchemaSubdocumentOptions'); +const SchemaType = require('../schematype'); +const $exists = require('./operators/exists'); +const castToNumber = require('./operators/helpers').castToNumber; +const discriminator = require('../helpers/model/discriminator'); +const geospatial = require('./operators/geospatial'); +const get = require('../helpers/get'); +const getConstructor = require('../helpers/discriminator/getConstructor'); +const handleIdOption = require('../helpers/schema/handleIdOption'); +const internalToObjectOptions = require('../options').internalToObjectOptions; +const utils = require('../utils'); + +let Subdocument; + +module.exports = SubdocumentPath; + +/** + * Single nested subdocument SchemaType constructor. + * + * @param {Schema} schema + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SubdocumentPath(schema, path, options) { + schema = handleIdOption(schema, options); + + this.caster = _createConstructor(schema); + this.caster.path = path; + this.caster.prototype.$basePath = path; + this.schema = schema; + this.$isSingleNested = true; + SchemaType.call(this, path, options, 'Embedded'); +} + +/*! + * ignore + */ + +SubdocumentPath.prototype = Object.create(SchemaType.prototype); +SubdocumentPath.prototype.constructor = SubdocumentPath; +SubdocumentPath.prototype.OptionsConstructor = SchemaSubdocumentOptions; + +/*! 
+ * ignore + */ + +function _createConstructor(schema, baseClass) { + // lazy load + Subdocument || (Subdocument = require('../types/subdocument')); + + const _embedded = function SingleNested(value, path, parent) { + const _this = this; + + this.$__parent = parent; + Subdocument.apply(this, arguments); + + this.$session(this.ownerDocument().$session()); + + if (parent) { + parent.$on('save', function() { + _this.emit('save', _this); + _this.constructor.emit('save', _this); + }); + + parent.$on('isNew', function(val) { + _this.isNew = val; + _this.emit('isNew', val); + _this.constructor.emit('isNew', val); + }); + } + }; + + schema._preCompile(); + + const proto = baseClass != null ? baseClass.prototype : Subdocument.prototype; + _embedded.prototype = Object.create(proto); + _embedded.prototype.$__setSchema(schema); + _embedded.prototype.constructor = _embedded; + _embedded.schema = schema; + _embedded.$isSingleNested = true; + _embedded.events = new EventEmitter(); + _embedded.prototype.toBSON = function() { + return this.toObject(internalToObjectOptions); + }; + + // apply methods + for (const i in schema.methods) { + _embedded.prototype[i] = schema.methods[i]; + } + + // apply statics + for (const i in schema.statics) { + _embedded[i] = schema.statics[i]; + } + + for (const i in EventEmitter.prototype) { + _embedded[i] = EventEmitter.prototype[i]; + } + + return _embedded; +} + +/*! + * Special case for when users use a common location schema to represent + * locations for use with $geoWithin. + * https://docs.mongodb.org/manual/reference/operator/query/geoWithin/ + * + * @param {Object} val + * @api private + */ + +SubdocumentPath.prototype.$conditionalHandlers.$geoWithin = function handle$geoWithin(val) { + return { $geometry: this.castForQuery(val.$geometry) }; +}; + +/*! 
+ * ignore + */ + +SubdocumentPath.prototype.$conditionalHandlers.$near = +SubdocumentPath.prototype.$conditionalHandlers.$nearSphere = geospatial.cast$near; + +SubdocumentPath.prototype.$conditionalHandlers.$within = +SubdocumentPath.prototype.$conditionalHandlers.$geoWithin = geospatial.cast$within; + +SubdocumentPath.prototype.$conditionalHandlers.$geoIntersects = + geospatial.cast$geoIntersects; + +SubdocumentPath.prototype.$conditionalHandlers.$minDistance = castToNumber; +SubdocumentPath.prototype.$conditionalHandlers.$maxDistance = castToNumber; + +SubdocumentPath.prototype.$conditionalHandlers.$exists = $exists; + +/** + * Casts contents + * + * @param {Object} value + * @api private + */ + +SubdocumentPath.prototype.cast = function(val, doc, init, priorVal, options) { + if (val && val.$isSingleNested && val.parent === doc) { + return val; + } + + if (val != null && (typeof val !== 'object' || Array.isArray(val))) { + throw new ObjectExpectedError(this.path, val); + } + + const Constructor = getConstructor(this.caster, val); + + let subdoc; + + // Only pull relevant selected paths and pull out the base path + const parentSelected = get(doc, '$__.selected', {}); + const path = this.path; + const selected = Object.keys(parentSelected).reduce((obj, key) => { + if (key.startsWith(path + '.')) { + obj[key.substr(path.length + 1)] = parentSelected[key]; + } + return obj; + }, {}); + options = Object.assign({}, options, { priorDoc: priorVal }); + if (init) { + subdoc = new Constructor(void 0, selected, doc); + subdoc.$init(val); + } else { + if (Object.keys(val).length === 0) { + return new Constructor({}, selected, doc, undefined, options); + } + + return new Constructor(val, selected, doc, undefined, options); + } + + return subdoc; +}; + +/** + * Casts contents for query + * + * @param {string} [$conditional] optional query operator (like `$eq` or `$in`) + * @param {any} value + * @api private + */ + +SubdocumentPath.prototype.castForQuery = function($conditional, val, options) { + let handler; + if (arguments.length === 2) { + handler = this.$conditionalHandlers[$conditional]; + if (!handler) { + throw new Error('Can\'t use ' + $conditional); + } + return handler.call(this, val); + } + val = $conditional; + if (val == null) { + return val; + } + + if (this.options.runSetters) { + val = this._applySetters(val); + } + + const Constructor = getConstructor(this.caster, val); + const overrideStrict = options != null && options.strict != null ? + options.strict : + void 0; + + try { + val = new Constructor(val, overrideStrict); + } catch (error) { + // Make sure we always wrap in a CastError (gh-6803) + if (!(error instanceof CastError)) { + throw new CastError('Embedded', val, this.path, error, this); + } + throw error; + } + return val; +}; + +/** + * Async validation on this single nested doc. 
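Assigning a schema directly to a path is what produces a `SubdocumentPath` (a single nested subdocument); a minimal sketch with illustrative address/customer schemas:

```javascript
const mongoose = require('mongoose');
const { Schema } = mongoose;

const addressSchema = new Schema({ street: String, city: String });

const customerSchema = new Schema({
  name: String,
  address: addressSchema // single nested subdocument path
});

console.log(customerSchema.path('address').$isSingleNested); // true
```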
+ * + * @api private + */ + +SubdocumentPath.prototype.doValidate = function(value, fn, scope, options) { + const Constructor = getConstructor(this.caster, value); + + if (options && options.skipSchemaValidators) { + if (!(value instanceof Constructor)) { + value = new Constructor(value, null, scope); + } + return value.validate(fn); + } + + SchemaType.prototype.doValidate.call(this, value, function(error) { + if (error) { + return fn(error); + } + if (!value) { + return fn(null); + } + + value.validate(fn); + }, scope, options); +}; + +/** + * Synchronously validate this single nested doc + * + * @api private + */ + +SubdocumentPath.prototype.doValidateSync = function(value, scope, options) { + if (!options || !options.skipSchemaValidators) { + const schemaTypeError = SchemaType.prototype.doValidateSync.call(this, value, scope); + if (schemaTypeError) { + return schemaTypeError; + } + } + if (!value) { + return; + } + return value.validateSync(); +}; + +/** + * Adds a discriminator to this single nested subdocument. + * + * ####Example: + * const shapeSchema = Schema({ name: String }, { discriminatorKey: 'kind' }); + * const schema = Schema({ shape: shapeSchema }); + * + * const singleNestedPath = parentSchema.path('shape'); + * singleNestedPath.discriminator('Circle', Schema({ radius: Number })); + * + * @param {String} name + * @param {Schema} schema fields to add to the schema for instances of this sub-class + * @param {Object|string} [options] If string, same as `options.value`. + * @param {String} [options.value] the string stored in the `discriminatorKey` property. If not specified, Mongoose uses the `name` parameter. + * @param {Boolean} [options.clone=true] By default, `discriminator()` clones the given `schema`. Set to `false` to skip cloning. + * @return {Function} the constructor Mongoose will use for creating instances of this discriminator model + * @see discriminators /docs/discriminators.html + * @api public + */ + +SubdocumentPath.prototype.discriminator = function(name, schema, options) { + options = options || {}; + const value = utils.isPOJO(options) ? options.value : options; + const clone = get(options, 'clone', true); + + if (schema.instanceOfSchema && clone) { + schema = schema.clone(); + } + + schema = discriminator(this.caster, name, schema, value); + + this.caster.discriminators[name] = _createConstructor(schema, this.caster); + + return this.caster.discriminators[name]; +}; + +/** + * Sets a default option for all SubdocumentPath instances. + * + * ####Example: + * + * // Make all numbers have option `min` equal to 0. + * mongoose.Schema.Embedded.set('required', true); + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +SubdocumentPath.defaultOptions = {}; + +SubdocumentPath.set = SchemaType.set; + +/*! 
+ * ignore + */ + +SubdocumentPath.prototype.clone = function() { + const options = Object.assign({}, this.options); + const schematype = new this.constructor(this.schema, this.path, options); + schematype.validators = this.validators.slice(); + if (this.requiredValidator !== undefined) { + schematype.requiredValidator = this.requiredValidator; + } + schematype.caster.discriminators = Object.assign({}, this.caster.discriminators); + return schematype; +}; diff --git a/node_modules/mongoose/lib/schema/array.js b/node_modules/mongoose/lib/schema/array.js new file mode 100644 index 000000000..95291867d --- /dev/null +++ b/node_modules/mongoose/lib/schema/array.js @@ -0,0 +1,651 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const $exists = require('./operators/exists'); +const $type = require('./operators/type'); +const MongooseError = require('../error/mongooseError'); +const SchemaArrayOptions = require('../options/SchemaArrayOptions'); +const SchemaType = require('../schematype'); +const CastError = SchemaType.CastError; +const Mixed = require('./mixed'); +const arrayDepth = require('../helpers/arrayDepth'); +const cast = require('../cast'); +const get = require('../helpers/get'); +const isOperator = require('../helpers/query/isOperator'); +const util = require('util'); +const utils = require('../utils'); +const castToNumber = require('./operators/helpers').castToNumber; +const geospatial = require('./operators/geospatial'); +const getDiscriminatorByValue = require('../helpers/discriminator/getDiscriminatorByValue'); + +let MongooseArray; +let EmbeddedDoc; + +const isNestedArraySymbol = Symbol('mongoose#isNestedArray'); +const emptyOpts = Object.freeze({}); + +/** + * Array SchemaType constructor + * + * @param {String} key + * @param {SchemaType} cast + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SchemaArray(key, cast, options, schemaOptions) { + // lazy load + EmbeddedDoc || (EmbeddedDoc = require('../types').Embedded); + + let typeKey = 'type'; + if (schemaOptions && schemaOptions.typeKey) { + typeKey = schemaOptions.typeKey; + } + this.schemaOptions = schemaOptions; + + if (cast) { + let castOptions = {}; + + if (utils.isPOJO(cast)) { + if (cast[typeKey]) { + // support { type: Woot } + castOptions = utils.clone(cast); // do not alter user arguments + delete castOptions[typeKey]; + cast = cast[typeKey]; + } else { + cast = Mixed; + } + } + + if (options != null && options.ref != null && castOptions.ref == null) { + castOptions.ref = options.ref; + } + + if (cast === Object) { + cast = Mixed; + } + + // support { type: 'String' } + const name = typeof cast === 'string' + ? cast + : utils.getFunctionName(cast); + + const Types = require('./index.js'); + const caster = Types.hasOwnProperty(name) ? Types[name] : cast; + + this.casterConstructor = caster; + + if (this.casterConstructor instanceof SchemaArray) { + this.casterConstructor[isNestedArraySymbol] = true; + } + + if (typeof caster === 'function' && + !caster.$isArraySubdocument && + !caster.$isSchemaMap) { + const path = this.caster instanceof EmbeddedDoc ? 
null : key; + this.caster = new caster(path, castOptions); + } else { + this.caster = caster; + if (!(this.caster instanceof EmbeddedDoc)) { + this.caster.path = key; + } + } + + this.$embeddedSchemaType = this.caster; + } + + this.$isMongooseArray = true; + + SchemaType.call(this, key, options, 'Array'); + + let defaultArr; + let fn; + + if (this.defaultValue != null) { + defaultArr = this.defaultValue; + fn = typeof defaultArr === 'function'; + } + + if (!('defaultValue' in this) || this.defaultValue !== void 0) { + const defaultFn = function() { + let arr = []; + if (fn) { + arr = defaultArr.call(this); + } else if (defaultArr != null) { + arr = arr.concat(defaultArr); + } + // Leave it up to `cast()` to convert the array + return arr; + }; + defaultFn.$runBeforeSetters = !fn; + this.default(defaultFn); + } +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +SchemaArray.schemaName = 'Array'; + + +/** + * Options for all arrays. + * + * - `castNonArrays`: `true` by default. If `false`, Mongoose will throw a CastError when a value isn't an array. If `true`, Mongoose will wrap the provided value in an array before casting. + * + * @static options + * @api public + */ + +SchemaArray.options = { castNonArrays: true }; + +/*! + * ignore + */ + +SchemaArray.defaultOptions = {}; + +/** + * Sets a default option for all Array instances. + * + * ####Example: + * + * // Make all Array instances have `required` of true by default. + * mongoose.Schema.Array.set('required', true); + * + * const User = mongoose.model('User', new Schema({ test: Array })); + * new User({ }).validateSync().errors.test.message; // Path `test` is required. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @api public + */ +SchemaArray.set = SchemaType.set; + +/*! + * Inherits from SchemaType. + */ +SchemaArray.prototype = Object.create(SchemaType.prototype); +SchemaArray.prototype.constructor = SchemaArray; +SchemaArray.prototype.OptionsConstructor = SchemaArrayOptions; + +/*! + * ignore + */ + +SchemaArray._checkRequired = SchemaType.prototype.checkRequired; + +/** + * Override the function the required validator uses to check whether an array + * passes the `required` check. + * + * ####Example: + * + * // Require non-empty array to pass `required` check + * mongoose.Schema.Types.Array.checkRequired(v => Array.isArray(v) && v.length); + * + * const M = mongoose.model({ arr: { type: Array, required: true } }); + * new M({ arr: [] }).validateSync(); // `null`, validation fails! + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @api public + */ + +SchemaArray.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies the `required` validator. + * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +SchemaArray.prototype.checkRequired = function checkRequired(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + + // `require('util').inherits()` does **not** copy static properties, and + // plugins like mongoose-float use `inherits()` for pre-ES6. + const _checkRequired = typeof this.constructor.checkRequired == 'function' ? + this.constructor.checkRequired() : + SchemaArray.checkRequired(); + + return _checkRequired(value); +}; + +/** + * Adds an enum validator if this is an array of strings or numbers. 
Equivalent to + * `SchemaString.prototype.enum()` or `SchemaNumber.prototype.enum()` + * + * @param {String|Object} [args...] enumeration values + * @return {SchemaArray} this + */ + +SchemaArray.prototype.enum = function() { + let arr = this; + while (true) { + const instance = get(arr, 'caster.instance'); + if (instance === 'Array') { + arr = arr.caster; + continue; + } + if (instance !== 'String' && instance !== 'Number') { + throw new Error('`enum` can only be set on an array of strings or numbers ' + + ', not ' + instance); + } + break; + } + + let enumArray = arguments; + if (!Array.isArray(arguments) && utils.isObject(arguments)) { + enumArray = utils.object.vals(enumArray); + } + + arr.caster.enum.apply(arr.caster, enumArray); + return this; +}; + +/** + * Overrides the getters application for the population special-case + * + * @param {Object} value + * @param {Object} scope + * @api private + */ + +SchemaArray.prototype.applyGetters = function(value, scope) { + if (scope != null && scope.$__ != null && scope.$populated(this.path)) { + // means the object id was populated + return value; + } + + const ret = SchemaType.prototype.applyGetters.call(this, value, scope); + if (Array.isArray(ret)) { + const rawValue = ret.isMongooseArrayProxy ? ret.__array : ret; + const len = rawValue.length; + for (let i = 0; i < len; ++i) { + rawValue[i] = this.caster.applyGetters(rawValue[i], scope); + } + } + return ret; +}; + +SchemaArray.prototype._applySetters = function(value, scope, init, priorVal) { + if (this.casterConstructor.$isMongooseArray && + SchemaArray.options.castNonArrays && + !this[isNestedArraySymbol]) { + // Check nesting levels and wrap in array if necessary + let depth = 0; + let arr = this; + while (arr != null && + arr.$isMongooseArray && + !arr.$isMongooseDocumentArray) { + ++depth; + arr = arr.casterConstructor; + } + + // No need to wrap empty arrays + if (value != null && value.length > 0) { + const valueDepth = arrayDepth(value); + if (valueDepth.min === valueDepth.max && valueDepth.max < depth && valueDepth.containsNonArrayItem) { + for (let i = valueDepth.max; i < depth; ++i) { + value = [value]; + } + } + } + } + + return SchemaType.prototype._applySetters.call(this, value, scope, init, priorVal); +}; + +/** + * Casts values for set(). + * + * @param {Object} value + * @param {Document} doc document that triggers the casting + * @param {Boolean} init whether this is an initialization cast + * @api private + */ + +SchemaArray.prototype.cast = function(value, doc, init, prev, options) { + // lazy load + MongooseArray || (MongooseArray = require('../types').Array); + + let i; + let l; + + if (Array.isArray(value)) { + const len = value.length; + if (!len && doc) { + const indexes = doc.schema.indexedPaths(); + + const arrayPath = this.path; + for (i = 0, l = indexes.length; i < l; ++i) { + const pathIndex = indexes[i][0][arrayPath]; + if (pathIndex === '2dsphere' || pathIndex === '2d') { + return; + } + } + + // Special case: if this index is on the parent of what looks like + // GeoJSON, skip setting the default to empty array re: #1668, #3233 + const arrayGeojsonPath = this.path.endsWith('.coordinates') ? + this.path.substr(0, this.path.lastIndexOf('.')) : null; + if (arrayGeojsonPath != null) { + for (i = 0, l = indexes.length; i < l; ++i) { + const pathIndex = indexes[i][0][arrayGeojsonPath]; + if (pathIndex === '2dsphere') { + return; + } + } + } + } + + options = options || emptyOpts; + + let rawValue = value.isMongooseArrayProxy ? 
value.__array : value; + value = MongooseArray(rawValue, options.path || this._arrayPath || this.path, doc, this); + rawValue = value.__array; + + if (init && doc != null && doc.$__ != null && doc.$populated(this.path)) { + return value; + } + + const caster = this.caster; + const isMongooseArray = caster.$isMongooseArray; + if (caster && this.casterConstructor !== Mixed) { + try { + const len = rawValue.length; + for (i = 0; i < len; i++) { + const opts = {}; + // Perf: creating `arrayPath` is expensive for large arrays. + // We only need `arrayPath` if this is a nested array, so + // skip if possible. + if (isMongooseArray) { + if (options.arrayPath != null) { + opts.arrayPathIndex = i; + } else if (caster._arrayParentPath != null) { + opts.arrayPathIndex = i; + } + } + rawValue[i] = caster.applySetters(rawValue[i], doc, init, void 0, opts); + } + } catch (e) { + // rethrow + throw new CastError('[' + e.kind + ']', util.inspect(value), this.path + '.' + i, e, this); + } + } + + return value; + } + + if (init || SchemaArray.options.castNonArrays) { + // gh-2442: if we're loading this from the db and its not an array, mark + // the whole array as modified. + if (!!doc && !!init) { + doc.markModified(this.path); + } + return this.cast([value], doc, init); + } + + throw new CastError('Array', util.inspect(value), this.path, null, this); +}; + +/*! + * ignore + */ + +SchemaArray.prototype._castForPopulate = function _castForPopulate(value, doc) { + // lazy load + MongooseArray || (MongooseArray = require('../types').Array); + + if (Array.isArray(value)) { + let i; + const rawValue = value.__array ? value.__array : value; + const len = rawValue.length; + + const caster = this.caster; + if (caster && this.casterConstructor !== Mixed) { + try { + for (i = 0; i < len; i++) { + const opts = {}; + // Perf: creating `arrayPath` is expensive for large arrays. + // We only need `arrayPath` if this is a nested array, so + // skip if possible. + if (caster.$isMongooseArray && caster._arrayParentPath != null) { + opts.arrayPathIndex = i; + } + + rawValue[i] = caster.cast(rawValue[i], doc, false, void 0, opts); + } + } catch (e) { + // rethrow + throw new CastError('[' + e.kind + ']', util.inspect(value), this.path + '.' + i, e, this); + } + } + + return value; + } + + throw new CastError('Array', util.inspect(value), this.path, null, this); +}; + +SchemaArray.prototype.$toObject = SchemaArray.prototype.toObject; + +/*! + * Ignore + */ + +SchemaArray.prototype.discriminator = function(name, schema) { + let arr = this; + while (arr.$isMongooseArray && !arr.$isMongooseDocumentArray) { + arr = arr.casterConstructor; + if (arr == null || typeof arr === 'function') { + throw new MongooseError('You can only add an embedded discriminator on ' + + 'a document array, ' + this.path + ' is a plain array'); + } + } + return arr.discriminator(name, schema); +}; + +/*! + * ignore + */ + +SchemaArray.prototype.clone = function() { + const options = Object.assign({}, this.options); + const schematype = new this.constructor(this.path, this.caster, options, this.schemaOptions); + schematype.validators = this.validators.slice(); + if (this.requiredValidator !== undefined) { + schematype.requiredValidator = this.requiredValidator; + } + return schematype; +}; + +/** + * Casts values for queries. 
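+ *
+ * ####Example (illustrative sketch; the `Tag` model and `tags` path are hypothetical):
+ *
+ *     // Values in a query filter are run through the array's element caster,
+ *     // so for a `[String]` path the Number 42 below is queried as the String '42'.
+ *     Tag.find({ tags: 42 });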
+ * + * @param {String} $conditional + * @param {any} [value] + * @api private + */ + +SchemaArray.prototype.castForQuery = function($conditional, value) { + let handler; + let val; + + if (arguments.length === 2) { + handler = this.$conditionalHandlers[$conditional]; + + if (!handler) { + throw new Error('Can\'t use ' + $conditional + ' with Array.'); + } + + val = handler.call(this, value); + } else { + val = $conditional; + let Constructor = this.casterConstructor; + + if (val && + Constructor.discriminators && + Constructor.schema && + Constructor.schema.options && + Constructor.schema.options.discriminatorKey) { + if (typeof val[Constructor.schema.options.discriminatorKey] === 'string' && + Constructor.discriminators[val[Constructor.schema.options.discriminatorKey]]) { + Constructor = Constructor.discriminators[val[Constructor.schema.options.discriminatorKey]]; + } else { + const constructorByValue = getDiscriminatorByValue(Constructor.discriminators, val[Constructor.schema.options.discriminatorKey]); + if (constructorByValue) { + Constructor = constructorByValue; + } + } + } + + const proto = this.casterConstructor.prototype; + let method = proto && (proto.castForQuery || proto.cast); + if (!method && Constructor.castForQuery) { + method = Constructor.castForQuery; + } + const caster = this.caster; + + if (Array.isArray(val)) { + this.setters.reverse().forEach(setter => { + val = setter.call(this, val, this); + }); + val = val.map(function(v) { + if (utils.isObject(v) && v.$elemMatch) { + return v; + } + if (method) { + v = method.call(caster, v); + return v; + } + if (v != null) { + v = new Constructor(v); + return v; + } + return v; + }); + } else if (method) { + val = method.call(caster, val); + } else if (val != null) { + val = new Constructor(val); + } + } + + return val; +}; + +function cast$all(val) { + if (!Array.isArray(val)) { + val = [val]; + } + + val = val.map(function(v) { + if (utils.isObject(v)) { + const o = {}; + o[this.path] = v; + return cast(this.casterConstructor.schema, o)[this.path]; + } + return v; + }, this); + + return this.castForQuery(val); +} + +function cast$elemMatch(val) { + const keys = Object.keys(val); + const numKeys = keys.length; + for (let i = 0; i < numKeys; ++i) { + const key = keys[i]; + const value = val[key]; + if (isOperator(key) && value != null) { + val[key] = this.castForQuery(key, value); + } + } + + // Is this an embedded discriminator and is the discriminator key set? + // If so, use the discriminator schema. 
See gh-7449 + const discriminatorKey = get(this, + 'casterConstructor.schema.options.discriminatorKey'); + const discriminators = get(this, 'casterConstructor.schema.discriminators', {}); + if (discriminatorKey != null && + val[discriminatorKey] != null && + discriminators[val[discriminatorKey]] != null) { + return cast(discriminators[val[discriminatorKey]], val); + } + + return cast(this.casterConstructor.schema, val); +} + +const handle = SchemaArray.prototype.$conditionalHandlers = {}; + +handle.$all = cast$all; +handle.$options = String; +handle.$elemMatch = cast$elemMatch; +handle.$geoIntersects = geospatial.cast$geoIntersects; +handle.$or = createLogicalQueryOperatorHandler('$or'); +handle.$and = createLogicalQueryOperatorHandler('$and'); +handle.$nor = createLogicalQueryOperatorHandler('$nor'); + +function createLogicalQueryOperatorHandler(op) { + return function logicalQueryOperatorHandler(val) { + if (!Array.isArray(val)) { + throw new TypeError('conditional ' + op + ' requires an array'); + } + + const ret = []; + for (const obj of val) { + ret.push(cast(this.casterConstructor.schema, obj)); + } + + return ret; + }; +} + +handle.$near = +handle.$nearSphere = geospatial.cast$near; + +handle.$within = +handle.$geoWithin = geospatial.cast$within; + +handle.$size = +handle.$minDistance = +handle.$maxDistance = castToNumber; + +handle.$exists = $exists; +handle.$type = $type; + +handle.$eq = +handle.$gt = +handle.$gte = +handle.$lt = +handle.$lte = +handle.$ne = +handle.$regex = SchemaArray.prototype.castForQuery; + +// `$in` is special because you can also include an empty array in the query +// like `$in: [1, []]`, see gh-5913 +handle.$nin = SchemaType.prototype.$conditionalHandlers.$nin; +handle.$in = SchemaType.prototype.$conditionalHandlers.$in; + +/*! + * Module exports. + */ + +module.exports = SchemaArray; diff --git a/node_modules/mongoose/lib/schema/boolean.js b/node_modules/mongoose/lib/schema/boolean.js new file mode 100644 index 000000000..8ef7e28b9 --- /dev/null +++ b/node_modules/mongoose/lib/schema/boolean.js @@ -0,0 +1,270 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const CastError = require('../error/cast'); +const SchemaType = require('../schematype'); +const castBoolean = require('../cast/boolean'); +const utils = require('../utils'); + +/** + * Boolean SchemaType constructor. + * + * @param {String} path + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SchemaBoolean(path, options) { + SchemaType.call(this, path, options, 'Boolean'); +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +SchemaBoolean.schemaName = 'Boolean'; + +SchemaBoolean.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +SchemaBoolean.prototype = Object.create(SchemaType.prototype); +SchemaBoolean.prototype.constructor = SchemaBoolean; + +/*! + * ignore + */ + +SchemaBoolean._cast = castBoolean; + +/** + * Sets a default option for all Boolean instances. + * + * ####Example: + * + * // Make all booleans have `default` of false. 
+ * mongoose.Schema.Boolean.set('default', false); + * + * const Order = mongoose.model('Order', new Schema({ isPaid: Boolean })); + * new Order({ }).isPaid; // false + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +SchemaBoolean.set = SchemaType.set; + +/** + * Get/set the function used to cast arbitrary values to booleans. + * + * ####Example: + * + * // Make Mongoose cast empty string '' to false. + * const original = mongoose.Schema.Boolean.cast(); + * mongoose.Schema.Boolean.cast(v => { + * if (v === '') { + * return false; + * } + * return original(v); + * }); + * + * // Or disable casting entirely + * mongoose.Schema.Boolean.cast(false); + * + * @param {Function} caster + * @return {Function} + * @function get + * @static + * @api public + */ + +SchemaBoolean.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = this._defaultCaster; + } + this._cast = caster; + + return this._cast; +}; + +/*! + * ignore + */ + +SchemaBoolean._defaultCaster = v => { + if (v != null && typeof v !== 'boolean') { + throw new Error(); + } + return v; +}; + +/*! + * ignore + */ + +SchemaBoolean._checkRequired = v => v === true || v === false; + +/** + * Override the function the required validator uses to check whether a boolean + * passes the `required` check. + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +SchemaBoolean.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies a required validator. For a boolean + * to satisfy a required validator, it must be strictly equal to true or to + * false. + * + * @param {Any} value + * @return {Boolean} + * @api public + */ + +SchemaBoolean.prototype.checkRequired = function(value) { + return this.constructor._checkRequired(value); +}; + +/** + * Configure which values get casted to `true`. + * + * ####Example: + * + * const M = mongoose.model('Test', new Schema({ b: Boolean })); + * new M({ b: 'affirmative' }).b; // undefined + * mongoose.Schema.Boolean.convertToTrue.add('affirmative'); + * new M({ b: 'affirmative' }).b; // true + * + * @property convertToTrue + * @type Set + * @api public + */ + +Object.defineProperty(SchemaBoolean, 'convertToTrue', { + get: () => castBoolean.convertToTrue, + set: v => { castBoolean.convertToTrue = v; } +}); + +/** + * Configure which values get casted to `false`. 
+ * + * ####Example: + * + * const M = mongoose.model('Test', new Schema({ b: Boolean })); + * new M({ b: 'nay' }).b; // undefined + * mongoose.Schema.Types.Boolean.convertToFalse.add('nay'); + * new M({ b: 'nay' }).b; // false + * + * @property convertToFalse + * @type Set + * @api public + */ + +Object.defineProperty(SchemaBoolean, 'convertToFalse', { + get: () => castBoolean.convertToFalse, + set: v => { castBoolean.convertToFalse = v; } +}); + +/** + * Casts to boolean + * + * @param {Object} value + * @param {Object} model - this value is optional + * @api private + */ + +SchemaBoolean.prototype.cast = function(value) { + let castBoolean; + if (typeof this._castFunction === 'function') { + castBoolean = this._castFunction; + } else if (typeof this.constructor.cast === 'function') { + castBoolean = this.constructor.cast(); + } else { + castBoolean = SchemaBoolean.cast(); + } + + try { + return castBoolean(value); + } catch (error) { + throw new CastError('Boolean', value, this.path, error, this); + } +}; + +SchemaBoolean.$conditionalHandlers = + utils.options(SchemaType.prototype.$conditionalHandlers, {}); + +/** + * Casts contents for queries. + * + * @param {String} $conditional + * @param {any} val + * @api private + */ + +SchemaBoolean.prototype.castForQuery = function($conditional, val) { + let handler; + if (arguments.length === 2) { + handler = SchemaBoolean.$conditionalHandlers[$conditional]; + + if (handler) { + return handler.call(this, val); + } + + return this._castForQuery(val); + } + + return this._castForQuery($conditional); +}; + +/** + * + * @api private + */ + +SchemaBoolean.prototype._castNullish = function _castNullish(v) { + if (typeof v === 'undefined' && + this.$$context != null && + this.$$context._mongooseOptions != null && + this.$$context._mongooseOptions.omitUndefined) { + return v; + } + const castBoolean = typeof this.constructor.cast === 'function' ? + this.constructor.cast() : + SchemaBoolean.cast(); + if (castBoolean == null) { + return v; + } + if (castBoolean.convertToFalse instanceof Set && castBoolean.convertToFalse.has(v)) { + return false; + } + if (castBoolean.convertToTrue instanceof Set && castBoolean.convertToTrue.has(v)) { + return true; + } + return v; +}; + +/*! + * Module exports. + */ + +module.exports = SchemaBoolean; diff --git a/node_modules/mongoose/lib/schema/buffer.js b/node_modules/mongoose/lib/schema/buffer.js new file mode 100644 index 000000000..d83541662 --- /dev/null +++ b/node_modules/mongoose/lib/schema/buffer.js @@ -0,0 +1,267 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const MongooseBuffer = require('../types/buffer'); +const SchemaBufferOptions = require('../options/SchemaBufferOptions'); +const SchemaType = require('../schematype'); +const handleBitwiseOperator = require('./operators/bitwise'); +const utils = require('../utils'); + +const Binary = MongooseBuffer.Binary; +const CastError = SchemaType.CastError; + +/** + * Buffer SchemaType constructor + * + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SchemaBuffer(key, options) { + SchemaType.call(this, key, options, 'Buffer'); +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +SchemaBuffer.schemaName = 'Buffer'; + +SchemaBuffer.defaultOptions = {}; + +/*! + * Inherits from SchemaType. 
+ */ +SchemaBuffer.prototype = Object.create(SchemaType.prototype); +SchemaBuffer.prototype.constructor = SchemaBuffer; +SchemaBuffer.prototype.OptionsConstructor = SchemaBufferOptions; + +/*! + * ignore + */ + +SchemaBuffer._checkRequired = v => !!(v && v.length); + +/** + * Sets a default option for all Buffer instances. + * + * ####Example: + * + * // Make all buffers have `required` of true by default. + * mongoose.Schema.Buffer.set('required', true); + * + * const User = mongoose.model('User', new Schema({ test: Buffer })); + * new User({ }).validateSync().errors.test.message; // Path `test` is required. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +SchemaBuffer.set = SchemaType.set; + +/** + * Override the function the required validator uses to check whether a string + * passes the `required` check. + * + * ####Example: + * + * // Allow empty strings to pass `required` check + * mongoose.Schema.Types.String.checkRequired(v => v != null); + * + * const M = mongoose.model({ buf: { type: Buffer, required: true } }); + * new M({ buf: Buffer.from('') }).validateSync(); // validation passes! + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +SchemaBuffer.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies a required validator. To satisfy a + * required validator, a buffer must not be null or undefined and have + * non-zero length. + * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +SchemaBuffer.prototype.checkRequired = function(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + return this.constructor._checkRequired(value); +}; + +/** + * Casts contents + * + * @param {Object} value + * @param {Document} doc document that triggers the casting + * @param {Boolean} init + * @api private + */ + +SchemaBuffer.prototype.cast = function(value, doc, init) { + let ret; + if (SchemaType._isRef(this, value, doc, init)) { + if (value && value.isMongooseBuffer) { + return value; + } + + if (Buffer.isBuffer(value)) { + if (!value || !value.isMongooseBuffer) { + value = new MongooseBuffer(value, [this.path, doc]); + if (this.options.subtype != null) { + value._subtype = this.options.subtype; + } + } + return value; + } + + if (value instanceof Binary) { + ret = new MongooseBuffer(value.value(true), [this.path, doc]); + if (typeof value.sub_type !== 'number') { + throw new CastError('Buffer', value, this.path, null, this); + } + ret._subtype = value.sub_type; + return ret; + } + + return this._castRef(value, doc, init); + } + + // documents + if (value && value._id) { + value = value._id; + } + + if (value && value.isMongooseBuffer) { + return value; + } + + if (Buffer.isBuffer(value)) { + if (!value || !value.isMongooseBuffer) { + value = new MongooseBuffer(value, [this.path, doc]); + if (this.options.subtype != null) { + value._subtype = this.options.subtype; + } + } + return value; + } + + if (value instanceof Binary) { + ret = new MongooseBuffer(value.value(true), [this.path, doc]); + if (typeof value.sub_type !== 'number') { + throw new CastError('Buffer', value, this.path, null, this); + } + ret._subtype = value.sub_type; + return ret; + } + + if (value === null) { + return value; + } + + + const type = typeof value; + if ( + type === 'string' || type === 'number' || 
Array.isArray(value) || + (type === 'object' && value.type === 'Buffer' && Array.isArray(value.data)) // gh-6863 + ) { + if (type === 'number') { + value = [value]; + } + ret = new MongooseBuffer(value, [this.path, doc]); + if (this.options.subtype != null) { + ret._subtype = this.options.subtype; + } + return ret; + } + + throw new CastError('Buffer', value, this.path, null, this); +}; + +/** + * Sets the default [subtype](https://studio3t.com/whats-new/best-practices-uuid-mongodb/) + * for this buffer. You can find a [list of allowed subtypes here](http://api.mongodb.com/python/current/api/bson/binary.html). + * + * ####Example: + * + * const s = new Schema({ uuid: { type: Buffer, subtype: 4 }); + * const M = db.model('M', s); + * const m = new M({ uuid: 'test string' }); + * m.uuid._subtype; // 4 + * + * @param {Number} subtype the default subtype + * @return {SchemaType} this + * @api public + */ + +SchemaBuffer.prototype.subtype = function(subtype) { + this.options.subtype = subtype; + return this; +}; + +/*! + * ignore + */ +function handleSingle(val) { + return this.castForQuery(val); +} + +SchemaBuffer.prototype.$conditionalHandlers = + utils.options(SchemaType.prototype.$conditionalHandlers, { + $bitsAllClear: handleBitwiseOperator, + $bitsAnyClear: handleBitwiseOperator, + $bitsAllSet: handleBitwiseOperator, + $bitsAnySet: handleBitwiseOperator, + $gt: handleSingle, + $gte: handleSingle, + $lt: handleSingle, + $lte: handleSingle + }); + +/** + * Casts contents for queries. + * + * @param {String} $conditional + * @param {any} [value] + * @api private + */ + +SchemaBuffer.prototype.castForQuery = function($conditional, val) { + let handler; + if (arguments.length === 2) { + handler = this.$conditionalHandlers[$conditional]; + if (!handler) { + throw new Error('Can\'t use ' + $conditional + ' with Buffer.'); + } + return handler.call(this, val); + } + val = $conditional; + const casted = this._castForQuery(val); + return casted ? casted.toObject({ transform: false, virtuals: false }) : casted; +}; + +/*! + * Module exports. + */ + +module.exports = SchemaBuffer; diff --git a/node_modules/mongoose/lib/schema/date.js b/node_modules/mongoose/lib/schema/date.js new file mode 100644 index 000000000..5ee0eb9e9 --- /dev/null +++ b/node_modules/mongoose/lib/schema/date.js @@ -0,0 +1,403 @@ +/*! + * Module requirements. + */ + +'use strict'; + +const MongooseError = require('../error/index'); +const SchemaDateOptions = require('../options/SchemaDateOptions'); +const SchemaType = require('../schematype'); +const castDate = require('../cast/date'); +const getConstructorName = require('../helpers/getConstructorName'); +const utils = require('../utils'); + +const CastError = SchemaType.CastError; + +/** + * Date SchemaType constructor. + * + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SchemaDate(key, options) { + SchemaType.call(this, key, options, 'Date'); +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +SchemaDate.schemaName = 'Date'; + +SchemaDate.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +SchemaDate.prototype = Object.create(SchemaType.prototype); +SchemaDate.prototype.constructor = SchemaDate; +SchemaDate.prototype.OptionsConstructor = SchemaDateOptions; + +/*! + * ignore + */ + +SchemaDate._cast = castDate; + +/** + * Sets a default option for all Date instances. 
+ * + * ####Example: + * + * // Make all dates have `required` of true by default. + * mongoose.Schema.Date.set('required', true); + * + * const User = mongoose.model('User', new Schema({ test: Date })); + * new User({ }).validateSync().errors.test.message; // Path `test` is required. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +SchemaDate.set = SchemaType.set; + +/** + * Get/set the function used to cast arbitrary values to dates. + * + * ####Example: + * + * // Mongoose converts empty string '' into `null` for date types. You + * // can create a custom caster to disable it. + * const original = mongoose.Schema.Types.Date.cast(); + * mongoose.Schema.Types.Date.cast(v => { + * assert.ok(v !== ''); + * return original(v); + * }); + * + * // Or disable casting entirely + * mongoose.Schema.Types.Date.cast(false); + * + * @param {Function} caster + * @return {Function} + * @function get + * @static + * @api public + */ + +SchemaDate.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = this._defaultCaster; + } + this._cast = caster; + + return this._cast; +}; + +/*! + * ignore + */ + +SchemaDate._defaultCaster = v => { + if (v != null && !(v instanceof Date)) { + throw new Error(); + } + return v; +}; + +/** + * Declares a TTL index (rounded to the nearest second) for _Date_ types only. + * + * This sets the `expireAfterSeconds` index option available in MongoDB >= 2.1.2. + * This index type is only compatible with Date types. + * + * ####Example: + * + * // expire in 24 hours + * new Schema({ createdAt: { type: Date, expires: 60*60*24 }}); + * + * `expires` utilizes the `ms` module from [guille](https://github.com/guille/) allowing us to use a friendlier syntax: + * + * ####Example: + * + * // expire in 24 hours + * new Schema({ createdAt: { type: Date, expires: '24h' }}); + * + * // expire in 1.5 hours + * new Schema({ createdAt: { type: Date, expires: '1.5h' }}); + * + * // expire in 7 days + * const schema = new Schema({ createdAt: Date }); + * schema.path('createdAt').expires('7d'); + * + * @param {Number|String} when + * @added 3.0.0 + * @return {SchemaType} this + * @api public + */ + +SchemaDate.prototype.expires = function(when) { + if (getConstructorName(this._index) !== 'Object') { + this._index = {}; + } + + this._index.expires = when; + utils.expires(this._index); + return this; +}; + +/*! + * ignore + */ + +SchemaDate._checkRequired = v => v instanceof Date; + +/** + * Override the function the required validator uses to check whether a string + * passes the `required` check. + * + * ####Example: + * + * // Allow empty strings to pass `required` check + * mongoose.Schema.Types.String.checkRequired(v => v != null); + * + * const M = mongoose.model({ str: { type: String, required: true } }); + * new M({ str: '' }).validateSync(); // `null`, validation passes! + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +SchemaDate.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies a required validator. To satisfy + * a required validator, the given value must be an instance of `Date`. 
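+ *
+ * ####Example (illustrative sketch; the schema and path name are hypothetical):
+ *
+ *     // Only an actual `Date` instance satisfies the required check;
+ *     // `null` or `undefined` does not.
+ *     const s = new Schema({ dueAt: { type: Date, required: true } });
+ *     s.path('dueAt').checkRequired(new Date()); // true
+ *     s.path('dueAt').checkRequired(null);       // false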
+ * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +SchemaDate.prototype.checkRequired = function(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + + // `require('util').inherits()` does **not** copy static properties, and + // plugins like mongoose-float use `inherits()` for pre-ES6. + const _checkRequired = typeof this.constructor.checkRequired == 'function' ? + this.constructor.checkRequired() : + SchemaDate.checkRequired(); + return _checkRequired(value); +}; + +/** + * Sets a minimum date validator. + * + * ####Example: + * + * const s = new Schema({ d: { type: Date, min: Date('1970-01-01') }) + * const M = db.model('M', s) + * const m = new M({ d: Date('1969-12-31') }) + * m.save(function (err) { + * console.error(err) // validator error + * m.d = Date('2014-12-08'); + * m.save() // success + * }) + * + * // custom error messages + * // We can also use the special {MIN} token which will be replaced with the invalid value + * const min = [Date('1970-01-01'), 'The value of path `{PATH}` ({VALUE}) is beneath the limit ({MIN}).']; + * const schema = new Schema({ d: { type: Date, min: min }) + * const M = mongoose.model('M', schema); + * const s= new M({ d: Date('1969-12-31') }); + * s.validate(function (err) { + * console.log(String(err)) // ValidationError: The value of path `d` (1969-12-31) is before the limit (1970-01-01). + * }) + * + * @param {Date} value minimum date + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaDate.prototype.min = function(value, message) { + if (this.minValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.minValidator; + }, this); + } + + if (value) { + let msg = message || MongooseError.messages.Date.min; + if (typeof msg === 'string') { + msg = msg.replace(/{MIN}/, (value === Date.now ? 'Date.now()' : value.toString())); + } + const _this = this; + this.validators.push({ + validator: this.minValidator = function(val) { + let _value = value; + if (typeof value === 'function' && value !== Date.now) { + _value = _value.call(this); + } + const min = (_value === Date.now ? _value() : _this.cast(_value)); + return val === null || val.valueOf() >= min.valueOf(); + }, + message: msg, + type: 'min', + min: value + }); + } + + return this; +}; + +/** + * Sets a maximum date validator. + * + * ####Example: + * + * const s = new Schema({ d: { type: Date, max: Date('2014-01-01') }) + * const M = db.model('M', s) + * const m = new M({ d: Date('2014-12-08') }) + * m.save(function (err) { + * console.error(err) // validator error + * m.d = Date('2013-12-31'); + * m.save() // success + * }) + * + * // custom error messages + * // We can also use the special {MAX} token which will be replaced with the invalid value + * const max = [Date('2014-01-01'), 'The value of path `{PATH}` ({VALUE}) exceeds the limit ({MAX}).']; + * const schema = new Schema({ d: { type: Date, max: max }) + * const M = mongoose.model('M', schema); + * const s= new M({ d: Date('2014-12-08') }); + * s.validate(function (err) { + * console.log(String(err)) // ValidationError: The value of path `d` (2014-12-08) exceeds the limit (2014-01-01). 
+ * }) + * + * @param {Date} maximum date + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaDate.prototype.max = function(value, message) { + if (this.maxValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.maxValidator; + }, this); + } + + if (value) { + let msg = message || MongooseError.messages.Date.max; + if (typeof msg === 'string') { + msg = msg.replace(/{MAX}/, (value === Date.now ? 'Date.now()' : value.toString())); + } + const _this = this; + this.validators.push({ + validator: this.maxValidator = function(val) { + let _value = value; + if (typeof _value === 'function' && _value !== Date.now) { + _value = _value.call(this); + } + const max = (_value === Date.now ? _value() : _this.cast(_value)); + return val === null || val.valueOf() <= max.valueOf(); + }, + message: msg, + type: 'max', + max: value + }); + } + + return this; +}; + +/** + * Casts to date + * + * @param {Object} value to cast + * @api private + */ + +SchemaDate.prototype.cast = function(value) { + let castDate; + if (typeof this._castFunction === 'function') { + castDate = this._castFunction; + } else if (typeof this.constructor.cast === 'function') { + castDate = this.constructor.cast(); + } else { + castDate = SchemaDate.cast(); + } + + try { + return castDate(value); + } catch (error) { + throw new CastError('date', value, this.path, error, this); + } +}; + +/*! + * Date Query casting. + * + * @api private + */ + +function handleSingle(val) { + return this.cast(val); +} + +SchemaDate.prototype.$conditionalHandlers = + utils.options(SchemaType.prototype.$conditionalHandlers, { + $gt: handleSingle, + $gte: handleSingle, + $lt: handleSingle, + $lte: handleSingle + }); + + +/** + * Casts contents for queries. + * + * @param {String} $conditional + * @param {any} [value] + * @api private + */ + +SchemaDate.prototype.castForQuery = function($conditional, val) { + if (arguments.length !== 2) { + return this._castForQuery($conditional); + } + + const handler = this.$conditionalHandlers[$conditional]; + + if (!handler) { + throw new Error('Can\'t use ' + $conditional + ' with Date.'); + } + + return handler.call(this, val); +}; + +/*! + * Module exports. + */ + +module.exports = SchemaDate; diff --git a/node_modules/mongoose/lib/schema/decimal128.js b/node_modules/mongoose/lib/schema/decimal128.js new file mode 100644 index 000000000..9c0e1d40d --- /dev/null +++ b/node_modules/mongoose/lib/schema/decimal128.js @@ -0,0 +1,210 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const SchemaType = require('../schematype'); +const CastError = SchemaType.CastError; +const Decimal128Type = require('../types/decimal128'); +const castDecimal128 = require('../cast/decimal128'); +const utils = require('../utils'); + +/** + * Decimal128 SchemaType constructor. + * + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function Decimal128(key, options) { + SchemaType.call(this, key, options, 'Decimal128'); +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +Decimal128.schemaName = 'Decimal128'; + +Decimal128.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +Decimal128.prototype = Object.create(SchemaType.prototype); +Decimal128.prototype.constructor = Decimal128; + +/*! 
+ * ignore + */ + +Decimal128._cast = castDecimal128; + +/** + * Sets a default option for all Decimal128 instances. + * + * ####Example: + * + * // Make all decimal 128s have `required` of true by default. + * mongoose.Schema.Decimal128.set('required', true); + * + * const User = mongoose.model('User', new Schema({ test: mongoose.Decimal128 })); + * new User({ }).validateSync().errors.test.message; // Path `test` is required. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +Decimal128.set = SchemaType.set; + +/** + * Get/set the function used to cast arbitrary values to decimals. + * + * ####Example: + * + * // Make Mongoose only refuse to cast numbers as decimal128 + * const original = mongoose.Schema.Types.Decimal128.cast(); + * mongoose.Decimal128.cast(v => { + * assert.ok(typeof v !== 'number'); + * return original(v); + * }); + * + * // Or disable casting entirely + * mongoose.Decimal128.cast(false); + * + * @param {Function} [caster] + * @return {Function} + * @function get + * @static + * @api public + */ + +Decimal128.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = this._defaultCaster; + } + this._cast = caster; + + return this._cast; +}; + +/*! + * ignore + */ + +Decimal128._defaultCaster = v => { + if (v != null && !(v instanceof Decimal128Type)) { + throw new Error(); + } + return v; +}; + +/*! + * ignore + */ + +Decimal128._checkRequired = v => v instanceof Decimal128Type; + +/** + * Override the function the required validator uses to check whether a string + * passes the `required` check. + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +Decimal128.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies a required validator. + * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +Decimal128.prototype.checkRequired = function checkRequired(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + + // `require('util').inherits()` does **not** copy static properties, and + // plugins like mongoose-float use `inherits()` for pre-ES6. + const _checkRequired = typeof this.constructor.checkRequired == 'function' ? + this.constructor.checkRequired() : + Decimal128.checkRequired(); + + return _checkRequired(value); +}; + +/** + * Casts to Decimal128 + * + * @param {Object} value + * @param {Object} doc + * @param {Boolean} init whether this is an initialization cast + * @api private + */ + +Decimal128.prototype.cast = function(value, doc, init) { + if (SchemaType._isRef(this, value, doc, init)) { + if (value instanceof Decimal128Type) { + return value; + } + + return this._castRef(value, doc, init); + } + + let castDecimal128; + if (typeof this._castFunction === 'function') { + castDecimal128 = this._castFunction; + } else if (typeof this.constructor.cast === 'function') { + castDecimal128 = this.constructor.cast(); + } else { + castDecimal128 = Decimal128.cast(); + } + + try { + return castDecimal128(value); + } catch (error) { + throw new CastError('Decimal128', value, this.path, error, this); + } +}; + +/*! 
+ * ignore + */ + +function handleSingle(val) { + return this.cast(val); +} + +Decimal128.prototype.$conditionalHandlers = + utils.options(SchemaType.prototype.$conditionalHandlers, { + $gt: handleSingle, + $gte: handleSingle, + $lt: handleSingle, + $lte: handleSingle + }); + +/*! + * Module exports. + */ + +module.exports = Decimal128; diff --git a/node_modules/mongoose/lib/schema/documentarray.js b/node_modules/mongoose/lib/schema/documentarray.js new file mode 100644 index 000000000..fc1356a5d --- /dev/null +++ b/node_modules/mongoose/lib/schema/documentarray.js @@ -0,0 +1,575 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const ArrayType = require('./array'); +const CastError = require('../error/cast'); +const EventEmitter = require('events').EventEmitter; +const SchemaDocumentArrayOptions = + require('../options/SchemaDocumentArrayOptions'); +const SchemaType = require('../schematype'); +const discriminator = require('../helpers/model/discriminator'); +const get = require('../helpers/get'); +const handleIdOption = require('../helpers/schema/handleIdOption'); +const util = require('util'); +const utils = require('../utils'); +const getConstructor = require('../helpers/discriminator/getConstructor'); + +const arrayAtomicsSymbol = require('../helpers/symbols').arrayAtomicsSymbol; +const arrayPathSymbol = require('../helpers/symbols').arrayPathSymbol; +const documentArrayParent = require('../helpers/symbols').documentArrayParent; + +let MongooseDocumentArray; +let Subdocument; + +/** + * SubdocsArray SchemaType constructor + * + * @param {String} key + * @param {Schema} schema + * @param {Object} options + * @inherits SchemaArray + * @api public + */ + +function DocumentArrayPath(key, schema, options, schemaOptions) { + if (schemaOptions != null && schemaOptions._id != null) { + schema = handleIdOption(schema, schemaOptions); + } else if (options != null && options._id != null) { + schema = handleIdOption(schema, options); + } + + const EmbeddedDocument = _createConstructor(schema, options); + EmbeddedDocument.prototype.$basePath = key; + + ArrayType.call(this, key, EmbeddedDocument, options); + + this.schema = schema; + this.schemaOptions = schemaOptions || {}; + this.$isMongooseDocumentArray = true; + this.Constructor = EmbeddedDocument; + + EmbeddedDocument.base = schema.base; + + const fn = this.defaultValue; + + if (!('defaultValue' in this) || fn !== void 0) { + this.default(function() { + let arr = fn.call(this); + if (!Array.isArray(arr)) { + arr = [arr]; + } + // Leave it up to `cast()` to convert this to a documentarray + return arr; + }); + } + + const parentSchemaType = this; + this.$embeddedSchemaType = new SchemaType(key + '.$', { + required: get(this, 'schemaOptions.required', false) + }); + this.$embeddedSchemaType.cast = function(value, doc, init) { + return parentSchemaType.cast(value, doc, init)[0]; + }; + this.$embeddedSchemaType.$isMongooseDocumentArrayElement = true; + this.$embeddedSchemaType.caster = this.Constructor; + this.$embeddedSchemaType.schema = this.schema; +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +DocumentArrayPath.schemaName = 'DocumentArray'; + +/** + * Options for all document arrays. + * + * - `castNonArrays`: `true` by default. If `false`, Mongoose will throw a CastError when a value isn't an array. If `true`, Mongoose will wrap the provided value in an array before casting. 
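+ *
+ * ####Example (illustrative sketch; the `Parent` model and `children` path are hypothetical):
+ *
+ *     // With the default `castNonArrays: true`, a single subdocument is wrapped in an array:
+ *     const doc = new Parent({ children: { name: 'Luke' } });
+ *     doc.children.length; // 1
+ *     // Setting `mongoose.Schema.Types.DocumentArray.options.castNonArrays = false`
+ *     // makes the same assignment throw a CastError instead.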
+ * + * @api public + */ + +DocumentArrayPath.options = { castNonArrays: true }; + +/*! + * Inherits from ArrayType. + */ +DocumentArrayPath.prototype = Object.create(ArrayType.prototype); +DocumentArrayPath.prototype.constructor = DocumentArrayPath; +DocumentArrayPath.prototype.OptionsConstructor = SchemaDocumentArrayOptions; + +/*! + * Ignore + */ + +function _createConstructor(schema, options, baseClass) { + Subdocument || (Subdocument = require('../types/ArraySubdocument')); + + // compile an embedded document for this schema + function EmbeddedDocument() { + Subdocument.apply(this, arguments); + + this.$session(this.ownerDocument().$session()); + } + + schema._preCompile(); + + const proto = baseClass != null ? baseClass.prototype : Subdocument.prototype; + EmbeddedDocument.prototype = Object.create(proto); + EmbeddedDocument.prototype.$__setSchema(schema); + EmbeddedDocument.schema = schema; + EmbeddedDocument.prototype.constructor = EmbeddedDocument; + EmbeddedDocument.$isArraySubdocument = true; + EmbeddedDocument.events = new EventEmitter(); + + // apply methods + for (const i in schema.methods) { + EmbeddedDocument.prototype[i] = schema.methods[i]; + } + + // apply statics + for (const i in schema.statics) { + EmbeddedDocument[i] = schema.statics[i]; + } + + for (const i in EventEmitter.prototype) { + EmbeddedDocument[i] = EventEmitter.prototype[i]; + } + + EmbeddedDocument.options = options; + + return EmbeddedDocument; +} + +/** + * Adds a discriminator to this document array. + * + * ####Example: + * const shapeSchema = Schema({ name: String }, { discriminatorKey: 'kind' }); + * const schema = Schema({ shapes: [shapeSchema] }); + * + * const docArrayPath = parentSchema.path('shapes'); + * docArrayPath.discriminator('Circle', Schema({ radius: Number })); + * + * @param {String} name + * @param {Schema} schema fields to add to the schema for instances of this sub-class + * @param {Object|string} [options] If string, same as `options.value`. + * @param {String} [options.value] the string stored in the `discriminatorKey` property. If not specified, Mongoose uses the `name` parameter. + * @param {Boolean} [options.clone=true] By default, `discriminator()` clones the given `schema`. Set to `false` to skip cloning. + * @see discriminators /docs/discriminators.html + * @return {Function} the constructor Mongoose will use for creating instances of this discriminator model + * @api public + */ + +DocumentArrayPath.prototype.discriminator = function(name, schema, options) { + if (typeof name === 'function') { + name = utils.getFunctionName(name); + } + + options = options || {}; + const tiedValue = utils.isPOJO(options) ? 
options.value : options; + const clone = get(options, 'clone', true); + + if (schema.instanceOfSchema && clone) { + schema = schema.clone(); + } + + schema = discriminator(this.casterConstructor, name, schema, tiedValue); + + const EmbeddedDocument = _createConstructor(schema, null, this.casterConstructor); + EmbeddedDocument.baseCasterConstructor = this.casterConstructor; + + try { + Object.defineProperty(EmbeddedDocument, 'name', { + value: name + }); + } catch (error) { + // Ignore error, only happens on old versions of node + } + + this.casterConstructor.discriminators[name] = EmbeddedDocument; + + return this.casterConstructor.discriminators[name]; +}; + +/** + * Performs local validations first, then validations on each embedded doc + * + * @api private + */ + +DocumentArrayPath.prototype.doValidate = function(array, fn, scope, options) { + // lazy load + MongooseDocumentArray || (MongooseDocumentArray = require('../types/DocumentArray')); + + const _this = this; + try { + SchemaType.prototype.doValidate.call(this, array, cb, scope); + } catch (err) { + return fn(err); + } + + function cb(err) { + if (err) { + return fn(err); + } + + let count = array && array.length; + let error; + + if (!count) { + return fn(); + } + if (options && options.updateValidator) { + return fn(); + } + if (!array.isMongooseDocumentArray) { + array = new MongooseDocumentArray(array, _this.path, scope); + } + + // handle sparse arrays, do not use array.forEach which does not + // iterate over sparse elements yet reports array.length including + // them :( + + function callback(err) { + if (err != null) { + error = err; + } + --count || fn(error); + } + + for (let i = 0, len = count; i < len; ++i) { + // sidestep sparse entries + let doc = array[i]; + if (doc == null) { + --count || fn(error); + continue; + } + + // If you set the array index directly, the doc might not yet be + // a full fledged mongoose subdoc, so make it into one. + if (!(doc instanceof Subdocument)) { + const Constructor = getConstructor(_this.casterConstructor, array[i]); + doc = array[i] = new Constructor(doc, array, undefined, undefined, i); + } + + if (options != null && options.validateModifiedOnly && !doc.$isModified()) { + --count || fn(error); + continue; + } + + doc.$__validate(callback); + } + } +}; + +/** + * Performs local validations first, then validations on each embedded doc. + * + * ####Note: + * + * This method ignores the asynchronous validators. + * + * @return {MongooseError|undefined} + * @api private + */ + +DocumentArrayPath.prototype.doValidateSync = function(array, scope, options) { + const schemaTypeError = SchemaType.prototype.doValidateSync.call(this, array, scope); + if (schemaTypeError != null) { + return schemaTypeError; + } + + const count = array && array.length; + let resultError = null; + + if (!count) { + return; + } + + // handle sparse arrays, do not use array.forEach which does not + // iterate over sparse elements yet reports array.length including + // them :( + + for (let i = 0, len = count; i < len; ++i) { + // sidestep sparse entries + let doc = array[i]; + if (!doc) { + continue; + } + + // If you set the array index directly, the doc might not yet be + // a full fledged mongoose subdoc, so make it into one. 
+ if (!(doc instanceof Subdocument)) { + const Constructor = getConstructor(this.casterConstructor, array[i]); + doc = array[i] = new Constructor(doc, array, undefined, undefined, i); + } + + if (options != null && options.validateModifiedOnly && !doc.$isModified()) { + continue; + } + + const subdocValidateError = doc.validateSync(); + + if (subdocValidateError && resultError == null) { + resultError = subdocValidateError; + } + } + + return resultError; +}; + +/*! + * ignore + */ + +DocumentArrayPath.prototype.getDefault = function(scope) { + let ret = typeof this.defaultValue === 'function' + ? this.defaultValue.call(scope) + : this.defaultValue; + + if (ret == null) { + return ret; + } + + // lazy load + MongooseDocumentArray || (MongooseDocumentArray = require('../types/DocumentArray')); + + if (!Array.isArray(ret)) { + ret = [ret]; + } + + ret = new MongooseDocumentArray(ret, this.path, scope); + + for (let i = 0; i < ret.length; ++i) { + const Constructor = getConstructor(this.casterConstructor, ret[i]); + const _subdoc = new Constructor({}, ret, undefined, + undefined, i); + _subdoc.$init(ret[i]); + _subdoc.isNew = true; + + // Make sure all paths in the subdoc are set to `default` instead + // of `init` since we used `init`. + Object.assign(_subdoc.$__.activePaths.default, _subdoc.$__.activePaths.init); + _subdoc.$__.activePaths.init = {}; + + ret[i] = _subdoc; + } + + return ret; +}; + +/** + * Casts contents + * + * @param {Object} value + * @param {Document} document that triggers the casting + * @api private + */ + +DocumentArrayPath.prototype.cast = function(value, doc, init, prev, options) { + // lazy load + MongooseDocumentArray || (MongooseDocumentArray = require('../types/DocumentArray')); + + // Skip casting if `value` is the same as the previous value, no need to cast. See gh-9266 + if (value != null && value[arrayPathSymbol] != null && value === prev) { + return value; + } + + let selected; + let subdoc; + const _opts = { transform: false, virtuals: false }; + options = options || {}; + + if (!Array.isArray(value)) { + if (!init && !DocumentArrayPath.options.castNonArrays) { + throw new CastError('DocumentArray', util.inspect(value), this.path, null, this); + } + // gh-2442 mark whole array as modified if we're initializing a doc from + // the db and the path isn't an array in the document + if (!!doc && init) { + doc.markModified(this.path); + } + return this.cast([value], doc, init, prev, options); + } + + if (!(value && value.isMongooseDocumentArray) && + !options.skipDocumentArrayCast) { + value = new MongooseDocumentArray(value, this.path, doc); + } else if (value && value.isMongooseDocumentArray) { + // We need to create a new array, otherwise change tracking will + // update the old doc (gh-4449) + value = new MongooseDocumentArray(value, this.path, doc); + } + + if (prev != null) { + value[arrayAtomicsSymbol] = prev[arrayAtomicsSymbol] || {}; + } + + if (options.arrayPathIndex != null) { + value[arrayPathSymbol] = this.path + '.' + options.arrayPathIndex; + } + + const rawArray = value.isMongooseDocumentArrayProxy ? 
value.__array : value; + + const len = rawArray.length; + const initDocumentOptions = { skipId: true, willInit: true }; + + for (let i = 0; i < len; ++i) { + if (!rawArray[i]) { + continue; + } + + const Constructor = getConstructor(this.casterConstructor, value[i]); + + // Check if the document has a different schema (re gh-3701) + if ((rawArray[i].$__) && + (!(rawArray[i] instanceof Constructor) || rawArray[i][documentArrayParent] !== doc)) { + rawArray[i] = rawArray[i].toObject({ + transform: false, + // Special case: if different model, but same schema, apply virtuals + // re: gh-7898 + virtuals: rawArray[i].schema === Constructor.schema + }); + } + + if (rawArray[i] instanceof Subdocument) { + // Might not have the correct index yet, so ensure it does. + if (rawArray[i].__index == null) { + rawArray[i].$setIndex(i); + } + } else if (rawArray[i] != null) { + if (init) { + if (doc) { + selected || (selected = scopePaths(this, doc.$__.selected, init)); + } else { + selected = true; + } + + subdoc = new Constructor(null, value, initDocumentOptions, selected, i); + rawArray[i] = subdoc.$init(rawArray[i]); + } else { + if (prev && typeof prev.id === 'function') { + subdoc = prev.id(rawArray[i]._id); + } + + if (prev && subdoc && utils.deepEqual(subdoc.toObject(_opts), rawArray[i])) { + // handle resetting doc with existing id and same data + subdoc.set(rawArray[i]); + // if set() is hooked it will have no return value + // see gh-746 + rawArray[i] = subdoc; + } else { + try { + subdoc = new Constructor(rawArray[i], value, undefined, + undefined, i); + // if set() is hooked it will have no return value + // see gh-746 + rawArray[i] = subdoc; + } catch (error) { + const valueInErrorMessage = util.inspect(rawArray[i]); + throw new CastError('embedded', valueInErrorMessage, + value[arrayPathSymbol], error, this); + } + } + } + } + } + + return value; +}; + +/*! + * ignore + */ + +DocumentArrayPath.prototype.clone = function() { + const options = Object.assign({}, this.options); + const schematype = new this.constructor(this.path, this.schema, options, this.schemaOptions); + schematype.validators = this.validators.slice(); + if (this.requiredValidator !== undefined) { + schematype.requiredValidator = this.requiredValidator; + } + schematype.Constructor.discriminators = Object.assign({}, + this.Constructor.discriminators); + return schematype; +}; + +/*! + * ignore + */ + +DocumentArrayPath.prototype.applyGetters = function(value, scope) { + return SchemaType.prototype.applyGetters.call(this, value, scope); +}; + +/*! + * Scopes paths selected in a query to this array. + * Necessary for proper default application of subdocument values. + * + * @param {DocumentArrayPath} array - the array to scope `fields` paths + * @param {Object|undefined} fields - the root fields selected in the query + * @param {Boolean|undefined} init - if we are being created part of a query result + */ + +function scopePaths(array, fields, init) { + if (!(init && fields)) { + return undefined; + } + + const path = array.path + '.'; + const keys = Object.keys(fields); + let i = keys.length; + const selected = {}; + let hasKeys; + let key; + let sub; + + while (i--) { + key = keys[i]; + if (key.startsWith(path)) { + sub = key.substring(path.length); + if (sub === '$') { + continue; + } + if (sub.startsWith('$.')) { + sub = sub.substr(2); + } + hasKeys || (hasKeys = true); + selected[sub] = fields[key]; + } + } + + return hasKeys && selected || undefined; +} + +/** + * Sets a default option for all DocumentArray instances. 
+ * + * ####Example: + * + * // Make all numbers have option `min` equal to 0. + * mongoose.Schema.DocumentArray.set('_id', false); + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +DocumentArrayPath.defaultOptions = {}; + +DocumentArrayPath.set = SchemaType.set; + +/*! + * Module exports. + */ + +module.exports = DocumentArrayPath; diff --git a/node_modules/mongoose/lib/schema/index.js b/node_modules/mongoose/lib/schema/index.js new file mode 100644 index 000000000..999bef7c6 --- /dev/null +++ b/node_modules/mongoose/lib/schema/index.js @@ -0,0 +1,37 @@ + +/*! + * Module exports. + */ + +'use strict'; + +exports.String = require('./string'); + +exports.Number = require('./number'); + +exports.Boolean = require('./boolean'); + +exports.DocumentArray = require('./documentarray'); + +exports.Subdocument = require('./SubdocumentPath'); + +exports.Array = require('./array'); + +exports.Buffer = require('./buffer'); + +exports.Date = require('./date'); + +exports.ObjectId = require('./objectid'); + +exports.Mixed = require('./mixed'); + +exports.Decimal128 = exports.Decimal = require('./decimal128'); + +exports.Map = require('./map'); + +// alias + +exports.Oid = exports.ObjectId; +exports.Object = exports.Mixed; +exports.Bool = exports.Boolean; +exports.ObjectID = exports.ObjectId; diff --git a/node_modules/mongoose/lib/schema/map.js b/node_modules/mongoose/lib/schema/map.js new file mode 100644 index 000000000..e81dccf45 --- /dev/null +++ b/node_modules/mongoose/lib/schema/map.js @@ -0,0 +1,76 @@ +'use strict'; + +/*! + * ignore + */ + +const MongooseMap = require('../types/map'); +const SchemaMapOptions = require('../options/SchemaMapOptions'); +const SchemaType = require('../schematype'); +/*! + * ignore + */ + +class Map extends SchemaType { + constructor(key, options) { + super(key, options, 'Map'); + this.$isSchemaMap = true; + } + + set(option, value) { + return SchemaType.set(option, value); + } + + cast(val, doc, init) { + if (val instanceof MongooseMap) { + return val; + } + + const path = this.path; + + if (init) { + const map = new MongooseMap({}, path, doc, this.$__schemaType); + + if (val instanceof global.Map) { + for (const key of val.keys()) { + let _val = val.get(key); + if (_val == null) { + _val = map.$__schemaType._castNullish(_val); + } else { + _val = map.$__schemaType.cast(_val, doc, true, null, { path: path + '.' + key }); + } + map.$init(key, _val); + } + } else { + for (const key of Object.keys(val)) { + let _val = val[key]; + if (_val == null) { + _val = map.$__schemaType._castNullish(_val); + } else { + _val = map.$__schemaType.cast(_val, doc, true, null, { path: path + '.' + key }); + } + map.$init(key, _val); + } + } + + return map; + } + + return new MongooseMap(val, path, doc, this.$__schemaType); + } + + clone() { + const schematype = super.clone(); + + if (this.$__schemaType != null) { + schematype.$__schemaType = this.$__schemaType.clone(); + } + return schematype; + } +} + +Map.prototype.OptionsConstructor = SchemaMapOptions; + +Map.defaultOptions = {}; + +module.exports = Map; diff --git a/node_modules/mongoose/lib/schema/mixed.js b/node_modules/mongoose/lib/schema/mixed.js new file mode 100644 index 000000000..09944bf60 --- /dev/null +++ b/node_modules/mongoose/lib/schema/mixed.js @@ -0,0 +1,132 @@ +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const SchemaType = require('../schematype'); +const symbols = require('./symbols'); +const isObject = require('../helpers/isObject'); +const utils = require('../utils'); + +/** + * Mixed SchemaType constructor. + * + * @param {String} path + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function Mixed(path, options) { + if (options && options.default) { + const def = options.default; + if (Array.isArray(def) && def.length === 0) { + // make sure empty array defaults are handled + options.default = Array; + } else if (!options.shared && isObject(def) && Object.keys(def).length === 0) { + // prevent odd "shared" objects between documents + options.default = function() { + return {}; + }; + } + } + + SchemaType.call(this, path, options, 'Mixed'); + + this[symbols.schemaMixedSymbol] = true; +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +Mixed.schemaName = 'Mixed'; + +Mixed.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +Mixed.prototype = Object.create(SchemaType.prototype); +Mixed.prototype.constructor = Mixed; + +/** + * Attaches a getter for all Mixed paths. + * + * ####Example: + * + * // Hide the 'hidden' path + * mongoose.Schema.Mixed.get(v => Object.assign({}, v, { hidden: null })); + * + * const Model = mongoose.model('Test', new Schema({ test: {} })); + * new Model({ test: { hidden: 'Secret!' } }).test.hidden; // null + * + * @param {Function} getter + * @return {this} + * @function get + * @static + * @api public + */ + +Mixed.get = SchemaType.get; + +/** + * Sets a default option for all Mixed instances. + * + * ####Example: + * + * // Make all mixed instances have `required` of true by default. + * mongoose.Schema.Mixed.set('required', true); + * + * const User = mongoose.model('User', new Schema({ test: mongoose.Mixed })); + * new User({ }).validateSync().errors.test.message; // Path `test` is required. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +Mixed.set = SchemaType.set; + +/** + * Casts `val` for Mixed. + * + * _this is a no-op_ + * + * @param {Object} value to cast + * @api private + */ + +Mixed.prototype.cast = function(val) { + if (val instanceof Error) { + return utils.errorToPOJO(val); + } + return val; +}; + +/** + * Casts contents for queries. + * + * @param {String} $cond + * @param {any} [val] + * @api private + */ + +Mixed.prototype.castForQuery = function($cond, val) { + if (arguments.length === 2) { + return val; + } + return $cond; +}; + +/*! + * Module exports. + */ + +module.exports = Mixed; diff --git a/node_modules/mongoose/lib/schema/number.js b/node_modules/mongoose/lib/schema/number.js new file mode 100644 index 000000000..77e51bbc0 --- /dev/null +++ b/node_modules/mongoose/lib/schema/number.js @@ -0,0 +1,440 @@ +'use strict'; + +/*! + * Module requirements. + */ + +const MongooseError = require('../error/index'); +const SchemaNumberOptions = require('../options/SchemaNumberOptions'); +const SchemaType = require('../schematype'); +const castNumber = require('../cast/number'); +const handleBitwiseOperator = require('./operators/bitwise'); +const utils = require('../utils'); + +const CastError = SchemaType.CastError; + +/** + * Number SchemaType constructor. 
+ * + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SchemaNumber(key, options) { + SchemaType.call(this, key, options, 'Number'); +} + +/** + * Attaches a getter for all Number instances. + * + * ####Example: + * + * // Make all numbers round down + * mongoose.Number.get(function(v) { return Math.floor(v); }); + * + * const Model = mongoose.model('Test', new Schema({ test: Number })); + * new Model({ test: 3.14 }).test; // 3 + * + * @param {Function} getter + * @return {this} + * @function get + * @static + * @api public + */ + +SchemaNumber.get = SchemaType.get; + +/** + * Sets a default option for all Number instances. + * + * ####Example: + * + * // Make all numbers have option `min` equal to 0. + * mongoose.Schema.Number.set('min', 0); + * + * const Order = mongoose.model('Order', new Schema({ amount: Number })); + * new Order({ amount: -10 }).validateSync().errors.amount.message; // Path `amount` must be larger than 0. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +SchemaNumber.set = SchemaType.set; + +/*! + * ignore + */ + +SchemaNumber._cast = castNumber; + +/** + * Get/set the function used to cast arbitrary values to numbers. + * + * ####Example: + * + * // Make Mongoose cast empty strings '' to 0 for paths declared as numbers + * const original = mongoose.Number.cast(); + * mongoose.Number.cast(v => { + * if (v === '') { return 0; } + * return original(v); + * }); + * + * // Or disable casting entirely + * mongoose.Number.cast(false); + * + * @param {Function} caster + * @return {Function} + * @function get + * @static + * @api public + */ + +SchemaNumber.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = this._defaultCaster; + } + this._cast = caster; + + return this._cast; +}; + +/*! + * ignore + */ + +SchemaNumber._defaultCaster = v => { + if (typeof v !== 'number') { + throw new Error(); + } + return v; +}; + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +SchemaNumber.schemaName = 'Number'; + +SchemaNumber.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +SchemaNumber.prototype = Object.create(SchemaType.prototype); +SchemaNumber.prototype.constructor = SchemaNumber; +SchemaNumber.prototype.OptionsConstructor = SchemaNumberOptions; + +/*! + * ignore + */ + +SchemaNumber._checkRequired = v => typeof v === 'number' || v instanceof Number; + +/** + * Override the function the required validator uses to check whether a string + * passes the `required` check. + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +SchemaNumber.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies a required validator. + * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +SchemaNumber.prototype.checkRequired = function checkRequired(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + + // `require('util').inherits()` does **not** copy static properties, and + // plugins like mongoose-float use `inherits()` for pre-ES6. + const _checkRequired = typeof this.constructor.checkRequired == 'function' ? 
+ this.constructor.checkRequired() : + SchemaNumber.checkRequired(); + + return _checkRequired(value); +}; + +/** + * Sets a minimum number validator. + * + * ####Example: + * + * const s = new Schema({ n: { type: Number, min: 10 }) + * const M = db.model('M', s) + * const m = new M({ n: 9 }) + * m.save(function (err) { + * console.error(err) // validator error + * m.n = 10; + * m.save() // success + * }) + * + * // custom error messages + * // We can also use the special {MIN} token which will be replaced with the invalid value + * const min = [10, 'The value of path `{PATH}` ({VALUE}) is beneath the limit ({MIN}).']; + * const schema = new Schema({ n: { type: Number, min: min }) + * const M = mongoose.model('Measurement', schema); + * const s= new M({ n: 4 }); + * s.validate(function (err) { + * console.log(String(err)) // ValidationError: The value of path `n` (4) is beneath the limit (10). + * }) + * + * @param {Number} value minimum number + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaNumber.prototype.min = function(value, message) { + if (this.minValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.minValidator; + }, this); + } + + if (value !== null && value !== undefined) { + let msg = message || MongooseError.messages.Number.min; + msg = msg.replace(/{MIN}/, value); + this.validators.push({ + validator: this.minValidator = function(v) { + return v == null || v >= value; + }, + message: msg, + type: 'min', + min: value + }); + } + + return this; +}; + +/** + * Sets a maximum number validator. + * + * ####Example: + * + * const s = new Schema({ n: { type: Number, max: 10 }) + * const M = db.model('M', s) + * const m = new M({ n: 11 }) + * m.save(function (err) { + * console.error(err) // validator error + * m.n = 10; + * m.save() // success + * }) + * + * // custom error messages + * // We can also use the special {MAX} token which will be replaced with the invalid value + * const max = [10, 'The value of path `{PATH}` ({VALUE}) exceeds the limit ({MAX}).']; + * const schema = new Schema({ n: { type: Number, max: max }) + * const M = mongoose.model('Measurement', schema); + * const s= new M({ n: 4 }); + * s.validate(function (err) { + * console.log(String(err)) // ValidationError: The value of path `n` (4) exceeds the limit (10). 
+ * }) + * + * @param {Number} maximum number + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaNumber.prototype.max = function(value, message) { + if (this.maxValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.maxValidator; + }, this); + } + + if (value !== null && value !== undefined) { + let msg = message || MongooseError.messages.Number.max; + msg = msg.replace(/{MAX}/, value); + this.validators.push({ + validator: this.maxValidator = function(v) { + return v == null || v <= value; + }, + message: msg, + type: 'max', + max: value + }); + } + + return this; +}; + +/** + * Sets a enum validator + * + * ####Example: + * + * const s = new Schema({ n: { type: Number, enum: [1, 2, 3] }); + * const M = db.model('M', s); + * + * const m = new M({ n: 4 }); + * await m.save(); // throws validation error + * + * m.n = 3; + * await m.save(); // succeeds + * + * @param {Array} values allowed values + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaNumber.prototype.enum = function(values, message) { + if (this.enumValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.enumValidator; + }, this); + } + + + if (!Array.isArray(values)) { + const isObjectSyntax = utils.isPOJO(values) && values.values != null; + if (isObjectSyntax) { + message = values.message; + values = values.values; + } else if (typeof values === 'number') { + values = Array.prototype.slice.call(arguments); + message = null; + } + + if (utils.isPOJO(values)) { + values = Object.values(values); + } + message = message || MongooseError.messages.Number.enum; + } + + message = message == null ? MongooseError.messages.Number.enum : message; + + this.enumValidator = v => v == null || values.indexOf(v) !== -1; + this.validators.push({ + validator: this.enumValidator, + message: message, + type: 'enum', + enumValues: values + }); + + return this; +}; + +/** + * Casts to number + * + * @param {Object} value value to cast + * @param {Document} doc document that triggers the casting + * @param {Boolean} init + * @api private + */ + +SchemaNumber.prototype.cast = function(value, doc, init) { + if (SchemaType._isRef(this, value, doc, init)) { + if (typeof value === 'number') { + return value; + } + + return this._castRef(value, doc, init); + } + + const val = value && typeof value._id !== 'undefined' ? + value._id : // documents + value; + + let castNumber; + if (typeof this._castFunction === 'function') { + castNumber = this._castFunction; + } else if (typeof this.constructor.cast === 'function') { + castNumber = this.constructor.cast(); + } else { + castNumber = SchemaNumber.cast(); + } + + try { + return castNumber(val); + } catch (err) { + throw new CastError('Number', val, this.path, err, this); + } +}; + +/*! 
+ * ignore + */ + +function handleSingle(val) { + return this.cast(val); +} + +function handleArray(val) { + const _this = this; + if (!Array.isArray(val)) { + return [this.cast(val)]; + } + return val.map(function(m) { + return _this.cast(m); + }); +} + +SchemaNumber.prototype.$conditionalHandlers = + utils.options(SchemaType.prototype.$conditionalHandlers, { + $bitsAllClear: handleBitwiseOperator, + $bitsAnyClear: handleBitwiseOperator, + $bitsAllSet: handleBitwiseOperator, + $bitsAnySet: handleBitwiseOperator, + $gt: handleSingle, + $gte: handleSingle, + $lt: handleSingle, + $lte: handleSingle, + $mod: handleArray + }); + +/** + * Casts contents for queries. + * + * @param {String} $conditional + * @param {any} [value] + * @api private + */ + +SchemaNumber.prototype.castForQuery = function($conditional, val) { + let handler; + if (arguments.length === 2) { + handler = this.$conditionalHandlers[$conditional]; + if (!handler) { + throw new CastError('number', val, this.path, null, this); + } + return handler.call(this, val); + } + val = this._castForQuery($conditional); + return val; +}; + +/*! + * Module exports. + */ + +module.exports = SchemaNumber; diff --git a/node_modules/mongoose/lib/schema/objectid.js b/node_modules/mongoose/lib/schema/objectid.js new file mode 100644 index 000000000..9e481f13a --- /dev/null +++ b/node_modules/mongoose/lib/schema/objectid.js @@ -0,0 +1,297 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const SchemaObjectIdOptions = require('../options/SchemaObjectIdOptions'); +const SchemaType = require('../schematype'); +const castObjectId = require('../cast/objectid'); +const getConstructorName = require('../helpers/getConstructorName'); +const oid = require('../types/objectid'); +const utils = require('../utils'); + +const CastError = SchemaType.CastError; +let Document; + +/** + * ObjectId SchemaType constructor. + * + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function ObjectId(key, options) { + const isKeyHexStr = typeof key === 'string' && key.length === 24 && /^[a-f0-9]+$/i.test(key); + const suppressWarning = options && options.suppressWarning; + if ((isKeyHexStr || typeof key === 'undefined') && !suppressWarning) { + utils.warn('mongoose: To create a new ObjectId please try ' + + '`Mongoose.Types.ObjectId` instead of using ' + + '`Mongoose.Schema.ObjectId`. Set the `suppressWarning` option if ' + + 'you\'re trying to create a hex char path in your schema.'); + } + SchemaType.call(this, key, options, 'ObjectID'); +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +ObjectId.schemaName = 'ObjectId'; + +ObjectId.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +ObjectId.prototype = Object.create(SchemaType.prototype); +ObjectId.prototype.constructor = ObjectId; +ObjectId.prototype.OptionsConstructor = SchemaObjectIdOptions; + +/** + * Attaches a getter for all ObjectId instances + * + * ####Example: + * + * // Always convert to string when getting an ObjectId + * mongoose.ObjectId.get(v => v.toString()); + * + * const Model = mongoose.model('Test', new Schema({})); + * typeof (new Model({})._id); // 'string' + * + * @param {Function} getter + * @return {this} + * @function get + * @static + * @api public + */ + +ObjectId.get = SchemaType.get; + +/** + * Sets a default option for all ObjectId instances. + * + * ####Example: + * + * // Make all object ids have option `required` equal to true. 
+ * mongoose.Schema.ObjectId.set('required', true); + * + * const Order = mongoose.model('Order', new Schema({ userId: ObjectId })); + * new Order({ }).validateSync().errors.userId.message; // Path `userId` is required. + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +ObjectId.set = SchemaType.set; + +/** + * Adds an auto-generated ObjectId default if turnOn is true. + * @param {Boolean} turnOn auto generated ObjectId defaults + * @api public + * @return {SchemaType} this + */ + +ObjectId.prototype.auto = function(turnOn) { + if (turnOn) { + this.default(defaultId); + this.set(resetId); + } + + return this; +}; + +/*! + * ignore + */ + +ObjectId._checkRequired = v => v instanceof oid; + +/*! + * ignore + */ + +ObjectId._cast = castObjectId; + +/** + * Get/set the function used to cast arbitrary values to objectids. + * + * ####Example: + * + * // Make Mongoose only try to cast length 24 strings. By default, any 12 + * // char string is a valid ObjectId. + * const original = mongoose.ObjectId.cast(); + * mongoose.ObjectId.cast(v => { + * assert.ok(typeof v !== 'string' || v.length === 24); + * return original(v); + * }); + * + * // Or disable casting entirely + * mongoose.ObjectId.cast(false); + * + * @param {Function} caster + * @return {Function} + * @function get + * @static + * @api public + */ + +ObjectId.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = this._defaultCaster; + } + this._cast = caster; + + return this._cast; +}; + +/*! + * ignore + */ + +ObjectId._defaultCaster = v => { + if (!(v instanceof oid)) { + throw new Error(v + ' is not an instance of ObjectId'); + } + return v; +}; + +/** + * Override the function the required validator uses to check whether a string + * passes the `required` check. + * + * ####Example: + * + * // Allow empty strings to pass `required` check + * mongoose.Schema.Types.String.checkRequired(v => v != null); + * + * const M = mongoose.model({ str: { type: String, required: true } }); + * new M({ str: '' }).validateSync(); // `null`, validation passes! + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +ObjectId.checkRequired = SchemaType.checkRequired; + +/** + * Check if the given value satisfies a required validator. + * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +ObjectId.prototype.checkRequired = function checkRequired(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + + // `require('util').inherits()` does **not** copy static properties, and + // plugins like mongoose-float use `inherits()` for pre-ES6. + const _checkRequired = typeof this.constructor.checkRequired == 'function' ? + this.constructor.checkRequired() : + ObjectId.checkRequired(); + + return _checkRequired(value); +}; + +/** + * Casts to ObjectId + * + * @param {Object} value + * @param {Object} doc + * @param {Boolean} init whether this is an initialization cast + * @api private + */ + +ObjectId.prototype.cast = function(value, doc, init) { + if (SchemaType._isRef(this, value, doc, init)) { + // wait! 
we may need to cast this to a document + if (value instanceof oid) { + return value; + } else if ((getConstructorName(value) || '').toLowerCase() === 'objectid') { + return new oid(value.toHexString()); + } + + return this._castRef(value, doc, init); + } + + let castObjectId; + if (typeof this._castFunction === 'function') { + castObjectId = this._castFunction; + } else if (typeof this.constructor.cast === 'function') { + castObjectId = this.constructor.cast(); + } else { + castObjectId = ObjectId.cast(); + } + + try { + return castObjectId(value); + } catch (error) { + throw new CastError('ObjectId', value, this.path, error, this); + } +}; + +/*! + * ignore + */ + +function handleSingle(val) { + return this.cast(val); +} + +ObjectId.prototype.$conditionalHandlers = + utils.options(SchemaType.prototype.$conditionalHandlers, { + $gt: handleSingle, + $gte: handleSingle, + $lt: handleSingle, + $lte: handleSingle + }); + +/*! + * ignore + */ + +function defaultId() { + return new oid(); +} + +defaultId.$runBeforeSetters = true; + +function resetId(v) { + Document || (Document = require('./../document')); + + if (this instanceof Document) { + if (v === void 0) { + const _v = new oid; + this.$__._id = _v; + return _v; + } + + this.$__._id = v; + } + + return v; +} + +/*! + * Module exports. + */ + +module.exports = ObjectId; diff --git a/node_modules/mongoose/lib/schema/operators/bitwise.js b/node_modules/mongoose/lib/schema/operators/bitwise.js new file mode 100644 index 000000000..07e18cd0c --- /dev/null +++ b/node_modules/mongoose/lib/schema/operators/bitwise.js @@ -0,0 +1,38 @@ +/*! + * Module requirements. + */ + +'use strict'; + +const CastError = require('../../error/cast'); + +/*! + * ignore + */ + +function handleBitwiseOperator(val) { + const _this = this; + if (Array.isArray(val)) { + return val.map(function(v) { + return _castNumber(_this.path, v); + }); + } else if (Buffer.isBuffer(val)) { + return val; + } + // Assume trying to cast to number + return _castNumber(_this.path, val); +} + +/*! + * ignore + */ + +function _castNumber(path, num) { + const v = Number(num); + if (isNaN(v)) { + throw new CastError('number', num, path); + } + return v; +} + +module.exports = handleBitwiseOperator; diff --git a/node_modules/mongoose/lib/schema/operators/exists.js b/node_modules/mongoose/lib/schema/operators/exists.js new file mode 100644 index 000000000..916b4cbf6 --- /dev/null +++ b/node_modules/mongoose/lib/schema/operators/exists.js @@ -0,0 +1,12 @@ +'use strict'; + +const castBoolean = require('../../cast/boolean'); + +/*! + * ignore + */ + +module.exports = function(val) { + const path = this != null ? this.path : null; + return castBoolean(val, path); +}; diff --git a/node_modules/mongoose/lib/schema/operators/geospatial.js b/node_modules/mongoose/lib/schema/operators/geospatial.js new file mode 100644 index 000000000..80a605207 --- /dev/null +++ b/node_modules/mongoose/lib/schema/operators/geospatial.js @@ -0,0 +1,107 @@ +/*! + * Module requirements. + */ + +'use strict'; + +const castArraysOfNumbers = require('./helpers').castArraysOfNumbers; +const castToNumber = require('./helpers').castToNumber; + +/*! 
+ * ignore + */ + +exports.cast$geoIntersects = cast$geoIntersects; +exports.cast$near = cast$near; +exports.cast$within = cast$within; + +function cast$near(val) { + const SchemaArray = require('../array'); + + if (Array.isArray(val)) { + castArraysOfNumbers(val, this); + return val; + } + + _castMinMaxDistance(this, val); + + if (val && val.$geometry) { + return cast$geometry(val, this); + } + + if (!Array.isArray(val)) { + throw new TypeError('$near must be either an array or an object ' + + 'with a $geometry property'); + } + + return SchemaArray.prototype.castForQuery.call(this, val); +} + +function cast$geometry(val, self) { + switch (val.$geometry.type) { + case 'Polygon': + case 'LineString': + case 'Point': + castArraysOfNumbers(val.$geometry.coordinates, self); + break; + default: + // ignore unknowns + break; + } + + _castMinMaxDistance(self, val); + + return val; +} + +function cast$within(val) { + _castMinMaxDistance(this, val); + + if (val.$box || val.$polygon) { + const type = val.$box ? '$box' : '$polygon'; + val[type].forEach(arr => { + if (!Array.isArray(arr)) { + const msg = 'Invalid $within $box argument. ' + + 'Expected an array, received ' + arr; + throw new TypeError(msg); + } + arr.forEach((v, i) => { + arr[i] = castToNumber.call(this, v); + }); + }); + } else if (val.$center || val.$centerSphere) { + const type = val.$center ? '$center' : '$centerSphere'; + val[type].forEach((item, i) => { + if (Array.isArray(item)) { + item.forEach((v, j) => { + item[j] = castToNumber.call(this, v); + }); + } else { + val[type][i] = castToNumber.call(this, item); + } + }); + } else if (val.$geometry) { + cast$geometry(val, this); + } + + return val; +} + +function cast$geoIntersects(val) { + const geo = val.$geometry; + if (!geo) { + return; + } + + cast$geometry(val, this); + return val; +} + +function _castMinMaxDistance(self, val) { + if (val.$maxDistance) { + val.$maxDistance = castToNumber.call(self, val.$maxDistance); + } + if (val.$minDistance) { + val.$minDistance = castToNumber.call(self, val.$minDistance); + } +} diff --git a/node_modules/mongoose/lib/schema/operators/helpers.js b/node_modules/mongoose/lib/schema/operators/helpers.js new file mode 100644 index 000000000..a17951cd7 --- /dev/null +++ b/node_modules/mongoose/lib/schema/operators/helpers.js @@ -0,0 +1,32 @@ +'use strict'; + +/*! + * Module requirements. + */ + +const SchemaNumber = require('../number'); + +/*! + * @ignore + */ + +exports.castToNumber = castToNumber; +exports.castArraysOfNumbers = castArraysOfNumbers; + +/*! + * @ignore + */ + +function castToNumber(val) { + return SchemaNumber.cast()(val); +} + +function castArraysOfNumbers(arr, self) { + arr.forEach(function(v, i) { + if (Array.isArray(v)) { + castArraysOfNumbers(v, self); + } else { + arr[i] = castToNumber.call(self, v); + } + }); +} diff --git a/node_modules/mongoose/lib/schema/operators/text.js b/node_modules/mongoose/lib/schema/operators/text.js new file mode 100644 index 000000000..4b959168d --- /dev/null +++ b/node_modules/mongoose/lib/schema/operators/text.js @@ -0,0 +1,39 @@ +'use strict'; + +const CastError = require('../../error/cast'); +const castBoolean = require('../../cast/boolean'); +const castString = require('../../cast/string'); + +/*! + * Casts val to an object suitable for `$text`. Throws an error if the object + * can't be casted. 
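(Editorial aside, not part of the vendored mongoose source: a minimal sketch of the kind of filter the `$text` caster documented below normalizes, assuming a hypothetical `Task` model defined elsewhere in the app.)

    // $search is cast to a string; $caseSensitive and $diacriticSensitive to booleans,
    // so a query like this reaches MongoDB with consistently typed $text options.
    Task.find({ $text: { $search: 'groceries', $caseSensitive: false } });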
+ * + * @param {Any} val value to cast + * @param {String} [path] path to associate with any errors that occured + * @return {Object} casted object + * @see https://docs.mongodb.com/manual/reference/operator/query/text/ + * @api private + */ + +module.exports = function(val, path) { + if (val == null || typeof val !== 'object') { + throw new CastError('$text', val, path); + } + + if (val.$search != null) { + val.$search = castString(val.$search, path + '.$search'); + } + if (val.$language != null) { + val.$language = castString(val.$language, path + '.$language'); + } + if (val.$caseSensitive != null) { + val.$caseSensitive = castBoolean(val.$caseSensitive, + path + '.$castSensitive'); + } + if (val.$diacriticSensitive != null) { + val.$diacriticSensitive = castBoolean(val.$diacriticSensitive, + path + '.$diacriticSensitive'); + } + + return val; +}; diff --git a/node_modules/mongoose/lib/schema/operators/type.js b/node_modules/mongoose/lib/schema/operators/type.js new file mode 100644 index 000000000..952c79007 --- /dev/null +++ b/node_modules/mongoose/lib/schema/operators/type.js @@ -0,0 +1,20 @@ +'use strict'; + +/*! + * ignore + */ + +module.exports = function(val) { + if (Array.isArray(val)) { + if (!val.every(v => typeof v === 'number' || typeof v === 'string')) { + throw new Error('$type array values must be strings or numbers'); + } + return val; + } + + if (typeof val !== 'number' && typeof val !== 'string') { + throw new Error('$type parameter must be number, string, or array of numbers and strings'); + } + + return val; +}; diff --git a/node_modules/mongoose/lib/schema/string.js b/node_modules/mongoose/lib/schema/string.js new file mode 100644 index 000000000..d62928afb --- /dev/null +++ b/node_modules/mongoose/lib/schema/string.js @@ -0,0 +1,672 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const SchemaType = require('../schematype'); +const MongooseError = require('../error/index'); +const SchemaStringOptions = require('../options/SchemaStringOptions'); +const castString = require('../cast/string'); +const utils = require('../utils'); + +const CastError = SchemaType.CastError; + +/** + * String SchemaType constructor. + * + * @param {String} key + * @param {Object} options + * @inherits SchemaType + * @api public + */ + +function SchemaString(key, options) { + this.enumValues = []; + this.regExp = null; + SchemaType.call(this, key, options, 'String'); +} + +/** + * This schema type's name, to defend against minifiers that mangle + * function names. + * + * @api public + */ +SchemaString.schemaName = 'String'; + +SchemaString.defaultOptions = {}; + +/*! + * Inherits from SchemaType. + */ +SchemaString.prototype = Object.create(SchemaType.prototype); +SchemaString.prototype.constructor = SchemaString; +Object.defineProperty(SchemaString.prototype, 'OptionsConstructor', { + configurable: false, + enumerable: false, + writable: false, + value: SchemaStringOptions +}); + +/*! + * ignore + */ + +SchemaString._cast = castString; + +/** + * Get/set the function used to cast arbitrary values to strings. + * + * ####Example: + * + * // Throw an error if you pass in an object. Normally, Mongoose allows + * // objects with custom `toString()` functions. 
+ * const original = mongoose.Schema.Types.String.cast(); + * mongoose.Schema.Types.String.cast(v => { + * assert.ok(v == null || typeof v !== 'object'); + * return original(v); + * }); + * + * // Or disable casting entirely + * mongoose.Schema.Types.String.cast(false); + * + * @param {Function} caster + * @return {Function} + * @function get + * @static + * @api public + */ + +SchemaString.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = this._defaultCaster; + } + this._cast = caster; + + return this._cast; +}; + +/*! + * ignore + */ + +SchemaString._defaultCaster = v => { + if (v != null && typeof v !== 'string') { + throw new Error(); + } + return v; +}; + +/** + * Attaches a getter for all String instances. + * + * ####Example: + * + * // Make all numbers round down + * mongoose.Schema.String.get(v => v.toLowerCase()); + * + * const Model = mongoose.model('Test', new Schema({ test: String })); + * new Model({ test: 'FOO' }).test; // 'foo' + * + * @param {Function} getter + * @return {this} + * @function get + * @static + * @api public + */ + +SchemaString.get = SchemaType.get; + +/** + * Sets a default option for all String instances. + * + * ####Example: + * + * // Make all strings have option `trim` equal to true. + * mongoose.Schema.String.set('trim', true); + * + * const User = mongoose.model('User', new Schema({ name: String })); + * new User({ name: ' John Doe ' }).name; // 'John Doe' + * + * @param {String} option - The option you'd like to set the value for + * @param {*} value - value for option + * @return {undefined} + * @function set + * @static + * @api public + */ + +SchemaString.set = SchemaType.set; + +/*! + * ignore + */ + +SchemaString._checkRequired = v => (v instanceof String || typeof v === 'string') && v.length; + +/** + * Override the function the required validator uses to check whether a string + * passes the `required` check. + * + * ####Example: + * + * // Allow empty strings to pass `required` check + * mongoose.Schema.Types.String.checkRequired(v => v != null); + * + * const M = mongoose.model({ str: { type: String, required: true } }); + * new M({ str: '' }).validateSync(); // `null`, validation passes! + * + * @param {Function} fn + * @return {Function} + * @function checkRequired + * @static + * @api public + */ + +SchemaString.checkRequired = SchemaType.checkRequired; + +/** + * Adds an enum validator + * + * ####Example: + * + * const states = ['opening', 'open', 'closing', 'closed'] + * const s = new Schema({ state: { type: String, enum: states }}) + * const M = db.model('M', s) + * const m = new M({ state: 'invalid' }) + * m.save(function (err) { + * console.error(String(err)) // ValidationError: `invalid` is not a valid enum value for path `state`. + * m.state = 'open' + * m.save(callback) // success + * }) + * + * // or with custom error messages + * const enum = { + * values: ['opening', 'open', 'closing', 'closed'], + * message: 'enum validator failed for path `{PATH}` with value `{VALUE}`' + * } + * const s = new Schema({ state: { type: String, enum: enum }) + * const M = db.model('M', s) + * const m = new M({ state: 'invalid' }) + * m.save(function (err) { + * console.error(String(err)) // ValidationError: enum validator failed for path `state` with value `invalid` + * m.state = 'open' + * m.save(callback) // success + * }) + * + * @param {String|Object} [args...] 
enumeration values + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaString.prototype.enum = function() { + if (this.enumValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.enumValidator; + }, this); + this.enumValidator = false; + } + + if (arguments[0] === void 0 || arguments[0] === false) { + return this; + } + + let values; + let errorMessage; + + if (utils.isObject(arguments[0])) { + if (Array.isArray(arguments[0].values)) { + values = arguments[0].values; + errorMessage = arguments[0].message; + } else { + values = utils.object.vals(arguments[0]); + errorMessage = MongooseError.messages.String.enum; + } + } else { + values = arguments; + errorMessage = MongooseError.messages.String.enum; + } + + for (const value of values) { + if (value !== undefined) { + this.enumValues.push(this.cast(value)); + } + } + + const vals = this.enumValues; + this.enumValidator = function(v) { + return undefined === v || ~vals.indexOf(v); + }; + this.validators.push({ + validator: this.enumValidator, + message: errorMessage, + type: 'enum', + enumValues: vals + }); + + return this; +}; + +/** + * Adds a lowercase [setter](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set). + * + * ####Example: + * + * const s = new Schema({ email: { type: String, lowercase: true }}) + * const M = db.model('M', s); + * const m = new M({ email: 'SomeEmail@example.COM' }); + * console.log(m.email) // someemail@example.com + * M.find({ email: 'SomeEmail@example.com' }); // Queries by 'someemail@example.com' + * + * Note that `lowercase` does **not** affect regular expression queries: + * + * ####Example: + * // Still queries for documents whose `email` matches the regular + * // expression /SomeEmail/. Mongoose does **not** convert the RegExp + * // to lowercase. + * M.find({ email: /SomeEmail/ }); + * + * @api public + * @return {SchemaType} this + */ + +SchemaString.prototype.lowercase = function(shouldApply) { + if (arguments.length > 0 && !shouldApply) { + return this; + } + return this.set(v => { + if (typeof v !== 'string') { + v = this.cast(v); + } + if (v) { + return v.toLowerCase(); + } + return v; + }); +}; + +/** + * Adds an uppercase [setter](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set). + * + * ####Example: + * + * const s = new Schema({ caps: { type: String, uppercase: true }}) + * const M = db.model('M', s); + * const m = new M({ caps: 'an example' }); + * console.log(m.caps) // AN EXAMPLE + * M.find({ caps: 'an example' }) // Matches documents where caps = 'AN EXAMPLE' + * + * Note that `uppercase` does **not** affect regular expression queries: + * + * ####Example: + * // Mongoose does **not** convert the RegExp to uppercase. + * M.find({ email: /an example/ }); + * + * @api public + * @return {SchemaType} this + */ + +SchemaString.prototype.uppercase = function(shouldApply) { + if (arguments.length > 0 && !shouldApply) { + return this; + } + return this.set(v => { + if (typeof v !== 'string') { + v = this.cast(v); + } + if (v) { + return v.toUpperCase(); + } + return v; + }); +}; + +/** + * Adds a trim [setter](http://mongoosejs.com/docs/api.html#schematype_SchemaType-set). + * + * The string value will be trimmed when set. 
+ * + * ####Example: + * + * const s = new Schema({ name: { type: String, trim: true }}); + * const M = db.model('M', s); + * const string = ' some name '; + * console.log(string.length); // 11 + * const m = new M({ name: string }); + * console.log(m.name.length); // 9 + * + * // Equivalent to `findOne({ name: string.trim() })` + * M.findOne({ name: string }); + * + * Note that `trim` does **not** affect regular expression queries: + * + * ####Example: + * // Mongoose does **not** trim whitespace from the RegExp. + * M.find({ name: / some name / }); + * + * @api public + * @return {SchemaType} this + */ + +SchemaString.prototype.trim = function(shouldTrim) { + if (arguments.length > 0 && !shouldTrim) { + return this; + } + return this.set(v => { + if (typeof v !== 'string') { + v = this.cast(v); + } + if (v) { + return v.trim(); + } + return v; + }); +}; + +/** + * Sets a minimum length validator. + * + * ####Example: + * + * const schema = new Schema({ postalCode: { type: String, minlength: 5 }) + * const Address = db.model('Address', schema) + * const address = new Address({ postalCode: '9512' }) + * address.save(function (err) { + * console.error(err) // validator error + * address.postalCode = '95125'; + * address.save() // success + * }) + * + * // custom error messages + * // We can also use the special {MINLENGTH} token which will be replaced with the minimum allowed length + * const minlength = [5, 'The value of path `{PATH}` (`{VALUE}`) is shorter than the minimum allowed length ({MINLENGTH}).']; + * const schema = new Schema({ postalCode: { type: String, minlength: minlength }) + * const Address = mongoose.model('Address', schema); + * const address = new Address({ postalCode: '9512' }); + * address.validate(function (err) { + * console.log(String(err)) // ValidationError: The value of path `postalCode` (`9512`) is shorter than the minimum length (5). + * }) + * + * @param {Number} value minimum string length + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaString.prototype.minlength = function(value, message) { + if (this.minlengthValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.minlengthValidator; + }, this); + } + + if (value !== null && value !== undefined) { + let msg = message || MongooseError.messages.String.minlength; + msg = msg.replace(/{MINLENGTH}/, value); + this.validators.push({ + validator: this.minlengthValidator = function(v) { + return v === null || v.length >= value; + }, + message: msg, + type: 'minlength', + minlength: value + }); + } + + return this; +}; + +SchemaString.prototype.minLength = SchemaString.prototype.minlength; + +/** + * Sets a maximum length validator. 
+ * + * ####Example: + * + * const schema = new Schema({ postalCode: { type: String, maxlength: 9 }) + * const Address = db.model('Address', schema) + * const address = new Address({ postalCode: '9512512345' }) + * address.save(function (err) { + * console.error(err) // validator error + * address.postalCode = '95125'; + * address.save() // success + * }) + * + * // custom error messages + * // We can also use the special {MAXLENGTH} token which will be replaced with the maximum allowed length + * const maxlength = [9, 'The value of path `{PATH}` (`{VALUE}`) exceeds the maximum allowed length ({MAXLENGTH}).']; + * const schema = new Schema({ postalCode: { type: String, maxlength: maxlength }) + * const Address = mongoose.model('Address', schema); + * const address = new Address({ postalCode: '9512512345' }); + * address.validate(function (err) { + * console.log(String(err)) // ValidationError: The value of path `postalCode` (`9512512345`) exceeds the maximum allowed length (9). + * }) + * + * @param {Number} value maximum string length + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaString.prototype.maxlength = function(value, message) { + if (this.maxlengthValidator) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.maxlengthValidator; + }, this); + } + + if (value !== null && value !== undefined) { + let msg = message || MongooseError.messages.String.maxlength; + msg = msg.replace(/{MAXLENGTH}/, value); + this.validators.push({ + validator: this.maxlengthValidator = function(v) { + return v === null || v.length <= value; + }, + message: msg, + type: 'maxlength', + maxlength: value + }); + } + + return this; +}; + +SchemaString.prototype.maxLength = SchemaString.prototype.maxlength; + +/** + * Sets a regexp validator. + * + * Any value that does not pass `regExp`.test(val) will fail validation. + * + * ####Example: + * + * const s = new Schema({ name: { type: String, match: /^a/ }}) + * const M = db.model('M', s) + * const m = new M({ name: 'I am invalid' }) + * m.validate(function (err) { + * console.error(String(err)) // "ValidationError: Path `name` is invalid (I am invalid)." + * m.name = 'apples' + * m.validate(function (err) { + * assert.ok(err) // success + * }) + * }) + * + * // using a custom error message + * const match = [ /\.html$/, "That file doesn't end in .html ({VALUE})" ]; + * const s = new Schema({ file: { type: String, match: match }}) + * const M = db.model('M', s); + * const m = new M({ file: 'invalid' }); + * m.validate(function (err) { + * console.log(String(err)) // "ValidationError: That file doesn't end in .html (invalid)" + * }) + * + * Empty strings, `undefined`, and `null` values always pass the match validator. If you require these values, enable the `required` validator also. 
+ * + * const s = new Schema({ name: { type: String, match: /^a/, required: true }}) + * + * @param {RegExp} regExp regular expression to test against + * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @api public + */ + +SchemaString.prototype.match = function match(regExp, message) { + // yes, we allow multiple match validators + + const msg = message || MongooseError.messages.String.match; + + const matchValidator = function(v) { + if (!regExp) { + return false; + } + + // In case RegExp happens to have `/g` flag set, we need to reset the + // `lastIndex`, otherwise `match` will intermittently fail. + regExp.lastIndex = 0; + + const ret = ((v != null && v !== '') + ? regExp.test(v) + : true); + return ret; + }; + + this.validators.push({ + validator: matchValidator, + message: msg, + type: 'regexp', + regexp: regExp + }); + return this; +}; + +/** + * Check if the given value satisfies the `required` validator. The value is + * considered valid if it is a string (that is, not `null` or `undefined`) and + * has positive length. The `required` validator **will** fail for empty + * strings. + * + * @param {Any} value + * @param {Document} doc + * @return {Boolean} + * @api public + */ + +SchemaString.prototype.checkRequired = function checkRequired(value, doc) { + if (SchemaType._isRef(this, value, doc, true)) { + return !!value; + } + + // `require('util').inherits()` does **not** copy static properties, and + // plugins like mongoose-float use `inherits()` for pre-ES6. + const _checkRequired = typeof this.constructor.checkRequired == 'function' ? + this.constructor.checkRequired() : + SchemaString.checkRequired(); + + return _checkRequired(value); +}; + +/** + * Casts to String + * + * @api private + */ + +SchemaString.prototype.cast = function(value, doc, init) { + if (SchemaType._isRef(this, value, doc, init)) { + if (typeof value === 'string') { + return value; + } + + return this._castRef(value, doc, init); + } + + let castString; + if (typeof this._castFunction === 'function') { + castString = this._castFunction; + } else if (typeof this.constructor.cast === 'function') { + castString = this.constructor.cast(); + } else { + castString = SchemaString.cast(); + } + + try { + return castString(value); + } catch (error) { + throw new CastError('string', value, this.path, null, this); + } +}; + +/*! + * ignore + */ + +function handleSingle(val) { + return this.castForQuery(val); +} + +function handleArray(val) { + const _this = this; + if (!Array.isArray(val)) { + return [this.castForQuery(val)]; + } + return val.map(function(m) { + return _this.castForQuery(m); + }); +} + +const $conditionalHandlers = utils.options(SchemaType.prototype.$conditionalHandlers, { + $all: handleArray, + $gt: handleSingle, + $gte: handleSingle, + $lt: handleSingle, + $lte: handleSingle, + $options: String, + $regex: handleSingle, + $not: handleSingle +}); + +Object.defineProperty(SchemaString.prototype, '$conditionalHandlers', { + configurable: false, + enumerable: false, + writable: false, + value: Object.freeze($conditionalHandlers) +}); + +/** + * Casts contents for queries. 
+ * + * @param {String} $conditional + * @param {any} [val] + * @api private + */ + +SchemaString.prototype.castForQuery = function($conditional, val) { + let handler; + if (arguments.length === 2) { + handler = this.$conditionalHandlers[$conditional]; + if (!handler) { + throw new Error('Can\'t use ' + $conditional + ' with String.'); + } + return handler.call(this, val); + } + val = $conditional; + if (Object.prototype.toString.call(val) === '[object RegExp]') { + return val; + } + + return this._castForQuery(val); +}; + +/*! + * Module exports. + */ + +module.exports = SchemaString; diff --git a/node_modules/mongoose/lib/schema/symbols.js b/node_modules/mongoose/lib/schema/symbols.js new file mode 100644 index 000000000..08d1d27ed --- /dev/null +++ b/node_modules/mongoose/lib/schema/symbols.js @@ -0,0 +1,5 @@ +'use strict'; + +exports.schemaMixedSymbol = Symbol.for('mongoose:schema_mixed'); + +exports.builtInMiddleware = Symbol.for('mongoose:built-in-middleware'); \ No newline at end of file diff --git a/node_modules/mongoose/lib/schematype.js b/node_modules/mongoose/lib/schematype.js new file mode 100644 index 000000000..85ab26446 --- /dev/null +++ b/node_modules/mongoose/lib/schematype.js @@ -0,0 +1,1638 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const MongooseError = require('./error/index'); +const SchemaTypeOptions = require('./options/SchemaTypeOptions'); +const $exists = require('./schema/operators/exists'); +const $type = require('./schema/operators/type'); +const get = require('./helpers/get'); +const handleImmutable = require('./helpers/schematype/handleImmutable'); +const isAsyncFunction = require('./helpers/isAsyncFunction'); +const immediate = require('./helpers/immediate'); +const schemaTypeSymbol = require('./helpers/symbols').schemaTypeSymbol; +const utils = require('./utils'); +const validatorErrorSymbol = require('./helpers/symbols').validatorErrorSymbol; +const documentIsModified = require('./helpers/symbols').documentIsModified; + +const populateModelSymbol = require('./helpers/symbols').populateModelSymbol; + +const CastError = MongooseError.CastError; +const ValidatorError = MongooseError.ValidatorError; + +/** + * SchemaType constructor. Do **not** instantiate `SchemaType` directly. + * Mongoose converts your schema paths into SchemaTypes automatically. + * + * ####Example: + * + * const schema = new Schema({ name: String }); + * schema.path('name') instanceof SchemaType; // true + * + * @param {String} path + * @param {SchemaTypeOptions} [options] See [SchemaTypeOptions docs](/docs/api/schematypeoptions.html) + * @param {String} [instance] + * @api public + */ + +function SchemaType(path, options, instance) { + this[schemaTypeSymbol] = true; + this.path = path; + this.instance = instance; + this.validators = []; + this.getters = this.constructor.hasOwnProperty('getters') ? 
+ this.constructor.getters.slice() : + []; + this.setters = []; + + this.splitPath(); + + options = options || {}; + const defaultOptions = this.constructor.defaultOptions || {}; + const defaultOptionsKeys = Object.keys(defaultOptions); + + for (const option of defaultOptionsKeys) { + if (defaultOptions.hasOwnProperty(option) && !options.hasOwnProperty(option)) { + options[option] = defaultOptions[option]; + } + } + + if (options.select == null) { + delete options.select; + } + + const Options = this.OptionsConstructor || SchemaTypeOptions; + this.options = new Options(options); + this._index = null; + + + if (utils.hasUserDefinedProperty(this.options, 'immutable')) { + this.$immutable = this.options.immutable; + + handleImmutable(this); + } + + const keys = Object.keys(this.options); + for (const prop of keys) { + if (prop === 'cast') { + this.castFunction(this.options[prop]); + continue; + } + if (utils.hasUserDefinedProperty(this.options, prop) && typeof this[prop] === 'function') { + // { unique: true, index: true } + if (prop === 'index' && this._index) { + if (options.index === false) { + const index = this._index; + if (typeof index === 'object' && index != null) { + if (index.unique) { + throw new Error('Path "' + this.path + '" may not have `index` ' + + 'set to false and `unique` set to true'); + } + if (index.sparse) { + throw new Error('Path "' + this.path + '" may not have `index` ' + + 'set to false and `sparse` set to true'); + } + } + + this._index = false; + } + continue; + } + + const val = options[prop]; + // Special case so we don't screw up array defaults, see gh-5780 + if (prop === 'default') { + this.default(val); + continue; + } + + const opts = Array.isArray(val) ? val : [val]; + + this[prop].apply(this, opts); + } + } + + Object.defineProperty(this, '$$context', { + enumerable: false, + configurable: false, + writable: true, + value: null + }); +} + +/*! + * The class that Mongoose uses internally to instantiate this SchemaType's `options` property. + */ + +SchemaType.prototype.OptionsConstructor = SchemaTypeOptions; + +/*! + * ignore + */ + +SchemaType.prototype.splitPath = function() { + if (this._presplitPath != null) { + return this._presplitPath; + } + if (this.path == null) { + return undefined; + } + + this._presplitPath = this.path.indexOf('.') === -1 ? [this.path] : this.path.split('.'); + return this._presplitPath; +}; + +/** + * Get/set the function used to cast arbitrary values to this type. + * + * ####Example: + * + * // Disallow `null` for numbers, and don't try to cast any values to + * // numbers, so even strings like '123' will cause a CastError. + * mongoose.Number.cast(function(v) { + * assert.ok(v === undefined || typeof v === 'number'); + * return v; + * }); + * + * @param {Function|false} caster Function that casts arbitrary values to this type, or throws an error if casting failed + * @return {Function} + * @static + * @receiver SchemaType + * @function cast + * @api public + */ + +SchemaType.cast = function cast(caster) { + if (arguments.length === 0) { + return this._cast; + } + if (caster === false) { + caster = v => v; + } + this._cast = caster; + + return this._cast; +}; + +/** + * Get/set the function used to cast arbitrary values to this particular schematype instance. + * Overrides `SchemaType.cast()`. + * + * ####Example: + * + * // Disallow `null` for numbers, and don't try to cast any values to + * // numbers, so even strings like '123' will cause a CastError. 
+ * const number = new mongoose.Number('mypath', {}); + * number.cast(function(v) { + * assert.ok(v === undefined || typeof v === 'number'); + * return v; + * }); + * + * @param {Function|false} caster Function that casts arbitrary values to this type, or throws an error if casting failed + * @return {Function} + * @static + * @receiver SchemaType + * @function cast + * @api public + */ + +SchemaType.prototype.castFunction = function castFunction(caster) { + if (arguments.length === 0) { + return this._castFunction; + } + if (caster === false) { + caster = this.constructor._defaultCaster || (v => v); + } + this._castFunction = caster; + + return this._castFunction; +}; + +/** + * The function that Mongoose calls to cast arbitrary values to this SchemaType. + * + * @param {Object} value value to cast + * @param {Document} doc document that triggers the casting + * @param {Boolean} init + * @api public + */ + +SchemaType.prototype.cast = function cast() { + throw new Error('Base SchemaType class does not implement a `cast()` function'); +}; + +/** + * Sets a default option for this schema type. + * + * ####Example: + * + * // Make all strings be trimmed by default + * mongoose.SchemaTypes.String.set('trim', true); + * + * @param {String} option The name of the option you'd like to set (e.g. trim, lowercase, etc...) + * @param {*} value The value of the option you'd like to set. + * @return {void} + * @static + * @receiver SchemaType + * @function set + * @api public + */ + +SchemaType.set = function set(option, value) { + if (!this.hasOwnProperty('defaultOptions')) { + this.defaultOptions = Object.assign({}, this.defaultOptions); + } + this.defaultOptions[option] = value; +}; + +/** + * Attaches a getter for all instances of this schema type. + * + * ####Example: + * + * // Make all numbers round down + * mongoose.Number.get(function(v) { return Math.floor(v); }); + * + * @param {Function} getter + * @return {this} + * @static + * @receiver SchemaType + * @function get + * @api public + */ + +SchemaType.get = function(getter) { + this.getters = this.hasOwnProperty('getters') ? this.getters : []; + this.getters.push(getter); +}; + +/** + * Sets a default value for this SchemaType. + * + * ####Example: + * + * const schema = new Schema({ n: { type: Number, default: 10 }) + * const M = db.model('M', schema) + * const m = new M; + * console.log(m.n) // 10 + * + * Defaults can be either `functions` which return the value to use as the default or the literal value itself. Either way, the value will be cast based on its schema type before being set during document creation. 
+ * + * ####Example: + * + * // values are cast: + * const schema = new Schema({ aNumber: { type: Number, default: 4.815162342 }}) + * const M = db.model('M', schema) + * const m = new M; + * console.log(m.aNumber) // 4.815162342 + * + * // default unique objects for Mixed types: + * const schema = new Schema({ mixed: Schema.Types.Mixed }); + * schema.path('mixed').default(function () { + * return {}; + * }); + * + * // if we don't use a function to return object literals for Mixed defaults, + * // each document will receive a reference to the same object literal creating + * // a "shared" object instance: + * const schema = new Schema({ mixed: Schema.Types.Mixed }); + * schema.path('mixed').default({}); + * const M = db.model('M', schema); + * const m1 = new M; + * m1.mixed.added = 1; + * console.log(m1.mixed); // { added: 1 } + * const m2 = new M; + * console.log(m2.mixed); // { added: 1 } + * + * @param {Function|any} val the default value + * @return {defaultValue} + * @api public + */ + +SchemaType.prototype.default = function(val) { + if (arguments.length === 1) { + if (val === void 0) { + this.defaultValue = void 0; + return void 0; + } + + if (val != null && val.instanceOfSchema) { + throw new MongooseError('Cannot set default value of path `' + this.path + + '` to a mongoose Schema instance.'); + } + + this.defaultValue = val; + return this.defaultValue; + } else if (arguments.length > 1) { + this.defaultValue = utils.args(arguments); + } + return this.defaultValue; +}; + +/** + * Declares the index options for this schematype. + * + * ####Example: + * + * const s = new Schema({ name: { type: String, index: true }) + * const s = new Schema({ loc: { type: [Number], index: 'hashed' }) + * const s = new Schema({ loc: { type: [Number], index: '2d', sparse: true }) + * const s = new Schema({ loc: { type: [Number], index: { type: '2dsphere', sparse: true }}) + * const s = new Schema({ date: { type: Date, index: { unique: true, expires: '1d' }}) + * s.path('my.path').index(true); + * s.path('my.date').index({ expires: 60 }); + * s.path('my.path').index({ unique: true, sparse: true }); + * + * ####NOTE: + * + * _Indexes are created [in the background](https://docs.mongodb.com/manual/core/index-creation/#index-creation-background) + * by default. If `background` is set to `false`, MongoDB will not execute any + * read/write operations you send until the index build. + * Specify `background: false` to override Mongoose's default._ + * + * @param {Object|Boolean|String} options + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.index = function(options) { + this._index = options; + utils.expires(this._index); + return this; +}; + +/** + * Declares an unique index. 
+ * + * ####Example: + * + * const s = new Schema({ name: { type: String, unique: true }}); + * s.path('name').index({ unique: true }); + * + * _NOTE: violating the constraint returns an `E11000` error from MongoDB when saving, not a Mongoose validation error._ + * + * @param {Boolean} bool + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.unique = function(bool) { + if (this._index === false) { + if (!bool) { + return; + } + throw new Error('Path "' + this.path + '" may not have `index` set to ' + + 'false and `unique` set to true'); + } + + if (!this.options.hasOwnProperty('index') && bool === false) { + return this; + } + + if (this._index == null || this._index === true) { + this._index = {}; + } else if (typeof this._index === 'string') { + this._index = { type: this._index }; + } + + this._index.unique = bool; + return this; +}; + +/** + * Declares a full text index. + * + * ###Example: + * + * const s = new Schema({name : {type: String, text : true }) + * s.path('name').index({text : true}); + * @param {Boolean} bool + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.text = function(bool) { + if (this._index === false) { + if (!bool) { + return; + } + throw new Error('Path "' + this.path + '" may not have `index` set to ' + + 'false and `text` set to true'); + } + + if (!this.options.hasOwnProperty('index') && bool === false) { + return this; + } + + if (this._index === null || this._index === undefined || + typeof this._index === 'boolean') { + this._index = {}; + } else if (typeof this._index === 'string') { + this._index = { type: this._index }; + } + + this._index.text = bool; + return this; +}; + +/** + * Declares a sparse index. + * + * ####Example: + * + * const s = new Schema({ name: { type: String, sparse: true } }); + * s.path('name').index({ sparse: true }); + * + * @param {Boolean} bool + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.sparse = function(bool) { + if (this._index === false) { + if (!bool) { + return; + } + throw new Error('Path "' + this.path + '" may not have `index` set to ' + + 'false and `sparse` set to true'); + } + + if (!this.options.hasOwnProperty('index') && bool === false) { + return this; + } + + if (this._index == null || typeof this._index === 'boolean') { + this._index = {}; + } else if (typeof this._index === 'string') { + this._index = { type: this._index }; + } + + this._index.sparse = bool; + return this; +}; + +/** + * Defines this path as immutable. Mongoose prevents you from changing + * immutable paths unless the parent document has [`isNew: true`](/docs/api.html#document_Document-isNew). + * + * ####Example: + * + * const schema = new Schema({ + * name: { type: String, immutable: true }, + * age: Number + * }); + * const Model = mongoose.model('Test', schema); + * + * await Model.create({ name: 'test' }); + * const doc = await Model.findOne(); + * + * doc.isNew; // false + * doc.name = 'new name'; + * doc.name; // 'test', because `name` is immutable + * + * Mongoose also prevents changing immutable properties using `updateOne()` + * and `updateMany()` based on [strict mode](/docs/guide.html#strict). + * + * ####Example: + * + * // Mongoose will strip out the `name` update, because `name` is immutable + * Model.updateOne({}, { $set: { name: 'test2' }, $inc: { age: 1 } }); + * + * // If `strict` is set to 'throw', Mongoose will throw an error if you + * // update `name` + * const err = await Model.updateOne({}, { name: 'test2' }, { strict: 'throw' }). 
+ * then(() => null, err => err); + * err.name; // StrictModeError + * + * // If `strict` is `false`, Mongoose allows updating `name` even though + * // the property is immutable. + * Model.updateOne({}, { name: 'test2' }, { strict: false }); + * + * @param {Boolean} bool + * @return {SchemaType} this + * @see isNew /docs/api.html#document_Document-isNew + * @api public + */ + +SchemaType.prototype.immutable = function(bool) { + this.$immutable = bool; + handleImmutable(this); + + return this; +}; + +/** + * Defines a custom function for transforming this path when converting a document to JSON. + * + * Mongoose calls this function with one parameter: the current `value` of the path. Mongoose + * then uses the return value in the JSON output. + * + * ####Example: + * + * const schema = new Schema({ + * date: { type: Date, transform: v => v.getFullYear() } + * }); + * const Model = mongoose.model('Test', schema); + * + * await Model.create({ date: new Date('2016-06-01') }); + * const doc = await Model.findOne(); + * + * doc.date instanceof Date; // true + * + * doc.toJSON().date; // 2016 as a number + * JSON.stringify(doc); // '{"_id":...,"date":2016}' + * + * @param {Function} fn + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.transform = function(fn) { + this.options.transform = fn; + + return this; +}; + +/** + * Adds a setter to this schematype. + * + * ####Example: + * + * function capitalize (val) { + * if (typeof val !== 'string') val = ''; + * return val.charAt(0).toUpperCase() + val.substring(1); + * } + * + * // defining within the schema + * const s = new Schema({ name: { type: String, set: capitalize }}); + * + * // or with the SchemaType + * const s = new Schema({ name: String }) + * s.path('name').set(capitalize); + * + * Setters allow you to transform the data before it gets to the raw mongodb + * document or query. + * + * Suppose you are implementing user registration for a website. Users provide + * an email and password, which gets saved to mongodb. The email is a string + * that you will want to normalize to lower case, in order to avoid one email + * having more than one account -- e.g., otherwise, avenue@q.com can be registered for 2 accounts via avenue@q.com and AvEnUe@Q.CoM. + * + * You can set up email lower case normalization easily via a Mongoose setter. + * + * function toLower(v) { + * return v.toLowerCase(); + * } + * + * const UserSchema = new Schema({ + * email: { type: String, set: toLower } + * }); + * + * const User = db.model('User', UserSchema); + * + * const user = new User({email: 'AVENUE@Q.COM'}); + * console.log(user.email); // 'avenue@q.com' + * + * // or + * const user = new User(); + * user.email = 'Avenue@Q.com'; + * console.log(user.email); // 'avenue@q.com' + * User.updateOne({ _id: _id }, { $set: { email: 'AVENUE@Q.COM' } }); // update to 'avenue@q.com' + * + * As you can see above, setters allow you to transform the data before it + * stored in MongoDB, or before executing a query. + * + * _NOTE: we could have also just used the built-in `lowercase: true` SchemaType option instead of defining our own function._ + * + * new Schema({ email: { type: String, lowercase: true }}) + * + * Setters are also passed a second argument, the schematype on which the setter was defined. This allows for tailored behavior based on options passed in the schema. 
+ * + * function inspector (val, schematype) { + * if (schematype.options.required) { + * return schematype.path + ' is required'; + * } else { + * return val; + * } + * } + * + * const VirusSchema = new Schema({ + * name: { type: String, required: true, set: inspector }, + * taxonomy: { type: String, set: inspector } + * }) + * + * const Virus = db.model('Virus', VirusSchema); + * const v = new Virus({ name: 'Parvoviridae', taxonomy: 'Parvovirinae' }); + * + * console.log(v.name); // name is required + * console.log(v.taxonomy); // Parvovirinae + * + * You can also use setters to modify other properties on the document. If + * you're setting a property `name` on a document, the setter will run with + * `this` as the document. Be careful, in mongoose 5 setters will also run + * when querying by `name` with `this` as the query. + * + * ```javascript + * const nameSchema = new Schema({ name: String, keywords: [String] }); + * nameSchema.path('name').set(function(v) { + * // Need to check if `this` is a document, because in mongoose 5 + * // setters will also run on queries, in which case `this` will be a + * // mongoose query object. + * if (this instanceof Document && v != null) { + * this.keywords = v.split(' '); + * } + * return v; + * }); + * ``` + * + * @param {Function} fn + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.set = function(fn) { + if (typeof fn !== 'function') { + throw new TypeError('A setter must be a function.'); + } + this.setters.push(fn); + return this; +}; + +/** + * Adds a getter to this schematype. + * + * ####Example: + * + * function dob (val) { + * if (!val) return val; + * return (val.getMonth() + 1) + "/" + val.getDate() + "/" + val.getFullYear(); + * } + * + * // defining within the schema + * const s = new Schema({ born: { type: Date, get: dob }) + * + * // or by retreiving its SchemaType + * const s = new Schema({ born: Date }) + * s.path('born').get(dob) + * + * Getters allow you to transform the representation of the data as it travels from the raw mongodb document to the value that you see. + * + * Suppose you are storing credit card numbers and you want to hide everything except the last 4 digits to the mongoose user. You can do so by defining a getter in the following way: + * + * function obfuscate (cc) { + * return '****-****-****-' + cc.slice(cc.length-4, cc.length); + * } + * + * const AccountSchema = new Schema({ + * creditCardNumber: { type: String, get: obfuscate } + * }); + * + * const Account = db.model('Account', AccountSchema); + * + * Account.findById(id, function (err, found) { + * console.log(found.creditCardNumber); // '****-****-****-1234' + * }); + * + * Getters are also passed a second argument, the schematype on which the getter was defined. This allows for tailored behavior based on options passed in the schema. 
+ * + * function inspector (val, schematype) { + * if (schematype.options.required) { + * return schematype.path + ' is required'; + * } else { + * return schematype.path + ' is not'; + * } + * } + * + * const VirusSchema = new Schema({ + * name: { type: String, required: true, get: inspector }, + * taxonomy: { type: String, get: inspector } + * }) + * + * const Virus = db.model('Virus', VirusSchema); + * + * Virus.findById(id, function (err, virus) { + * console.log(virus.name); // name is required + * console.log(virus.taxonomy); // taxonomy is not + * }) + * + * @param {Function} fn + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.get = function(fn) { + if (typeof fn !== 'function') { + throw new TypeError('A getter must be a function.'); + } + this.getters.push(fn); + return this; +}; + +/** + * Adds validator(s) for this document path. + * + * Validators always receive the value to validate as their first argument and + * must return `Boolean`. Returning `false` or throwing an error means + * validation failed. + * + * The error message argument is optional. If not passed, the [default generic error message template](#error_messages_MongooseError-messages) will be used. + * + * ####Examples: + * + * // make sure every value is equal to "something" + * function validator (val) { + * return val == 'something'; + * } + * new Schema({ name: { type: String, validate: validator }}); + * + * // with a custom error message + * + * const custom = [validator, 'Uh oh, {PATH} does not equal "something".'] + * new Schema({ name: { type: String, validate: custom }}); + * + * // adding many validators at a time + * + * const many = [ + * { validator: validator, msg: 'uh oh' } + * , { validator: anotherValidator, msg: 'failed' } + * ] + * new Schema({ name: { type: String, validate: many }}); + * + * // or utilizing SchemaType methods directly: + * + * const schema = new Schema({ name: 'string' }); + * schema.path('name').validate(validator, 'validation of `{PATH}` failed with value `{VALUE}`'); + * + * ####Error message templates: + * + * From the examples above, you may have noticed that error messages support + * basic templating. There are a few other template keywords besides `{PATH}` + * and `{VALUE}` too. To find out more, details are available + * [here](#error_messages_MongooseError.messages). + * + * If Mongoose's built-in error message templating isn't enough, Mongoose + * supports setting the `message` property to a function. + * + * schema.path('name').validate({ + * validator: function() { return v.length > 5; }, + * // `errors['name']` will be "name must have length 5, got 'foo'" + * message: function(props) { + * return `${props.path} must have length 5, got '${props.value}'`; + * } + * }); + * + * To bypass Mongoose's error messages and just copy the error message that + * the validator throws, do this: + * + * schema.path('name').validate({ + * validator: function() { throw new Error('Oops!'); }, + * // `errors['name']` will be "Oops!" + * message: function(props) { return props.reason.message; } + * }); + * + * ####Asynchronous validation: + * + * Mongoose supports validators that return a promise. A validator that returns + * a promise is called an _async validator_. Async validators run in + * parallel, and `validate()` will wait until all async validators have settled. 
+ * + * schema.path('name').validate({ + * validator: function (value) { + * return new Promise(function (resolve, reject) { + * resolve(false); // validation failed + * }); + * } + * }); + * + * You might use asynchronous validators to retreive other documents from the database to validate against or to meet other I/O bound validation needs. + * + * Validation occurs `pre('save')` or whenever you manually execute [document#validate](#document_Document-validate). + * + * If validation fails during `pre('save')` and no callback was passed to receive the error, an `error` event will be emitted on your Models associated db [connection](#connection_Connection), passing the validation error object along. + * + * const conn = mongoose.createConnection(..); + * conn.on('error', handleError); + * + * const Product = conn.model('Product', yourSchema); + * const dvd = new Product(..); + * dvd.save(); // emits error on the `conn` above + * + * If you want to handle these errors at the Model level, add an `error` + * listener to your Model as shown below. + * + * // registering an error listener on the Model lets us handle errors more locally + * Product.on('error', handleError); + * + * @param {RegExp|Function|Object} obj validator function, or hash describing options + * @param {Function} [obj.validator] validator function. If the validator function returns `undefined` or a truthy value, validation succeeds. If it returns [falsy](https://masteringjs.io/tutorials/fundamentals/falsy) (except `undefined`) or throws an error, validation fails. + * @param {String|Function} [obj.message] optional error message. If function, should return the error message as a string + * @param {Boolean} [obj.propsParameter=false] If true, Mongoose will pass the validator properties object (with the `validator` function, `message`, etc.) as the 2nd arg to the validator function. This is disabled by default because many validators [rely on positional args](https://github.com/chriso/validator.js#validators), so turning this on may cause unpredictable behavior in external validators. + * @param {String|Function} [errorMsg] optional error message. If function, should return the error message as a string + * @param {String} [type] optional validator type + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.validate = function(obj, message, type) { + if (typeof obj === 'function' || obj && utils.getFunctionName(obj.constructor) === 'RegExp') { + let properties; + if (typeof message === 'function') { + properties = { validator: obj, message: message }; + properties.type = type || 'user defined'; + } else if (message instanceof Object && !type) { + properties = utils.clone(message); + if (!properties.message) { + properties.message = properties.msg; + } + properties.validator = obj; + properties.type = properties.type || 'user defined'; + } else { + if (message == null) { + message = MongooseError.messages.general.default; + } + if (!type) { + type = 'user defined'; + } + properties = { message: message, type: type, validator: obj }; + } + + this.validators.push(properties); + return this; + } + + let i; + let length; + let arg; + + for (i = 0, length = arguments.length; i < length; i++) { + arg = arguments[i]; + if (!utils.isPOJO(arg)) { + const msg = 'Invalid validator. Received (' + typeof arg + ') ' + + arg + + '. 
See http://mongoosejs.com/docs/api.html#schematype_SchemaType-validate'; + + throw new Error(msg); + } + this.validate(arg.validator, arg); + } + + return this; +}; + +/** + * Adds a required validator to this SchemaType. The validator gets added + * to the front of this SchemaType's validators array using `unshift()`. + * + * ####Example: + * + * const s = new Schema({ born: { type: Date, required: true }) + * + * // or with custom error message + * + * const s = new Schema({ born: { type: Date, required: '{PATH} is required!' }) + * + * // or with a function + * + * const s = new Schema({ + * userId: ObjectId, + * username: { + * type: String, + * required: function() { return this.userId != null; } + * } + * }) + * + * // or with a function and a custom message + * const s = new Schema({ + * userId: ObjectId, + * username: { + * type: String, + * required: [ + * function() { return this.userId != null; }, + * 'username is required if id is specified' + * ] + * } + * }) + * + * // or through the path API + * + * s.path('name').required(true); + * + * // with custom error messaging + * + * s.path('name').required(true, 'grrr :( '); + * + * // or make a path conditionally required based on a function + * const isOver18 = function() { return this.age >= 18; }; + * s.path('voterRegistrationId').required(isOver18); + * + * The required validator uses the SchemaType's `checkRequired` function to + * determine whether a given value satisfies the required validator. By default, + * a value satisfies the required validator if `val != null` (that is, if + * the value is not null nor undefined). However, most built-in mongoose schema + * types override the default `checkRequired` function: + * + * @param {Boolean|Function|Object} required enable/disable the validator, or function that returns required boolean, or options object + * @param {Boolean|Function} [options.isRequired] enable/disable the validator, or function that returns required boolean + * @param {Function} [options.ErrorConstructor] custom error constructor. The constructor receives 1 parameter, an object containing the validator properties. 
+ * @param {String} [message] optional custom error message + * @return {SchemaType} this + * @see Customized Error Messages #error_messages_MongooseError-messages + * @see SchemaArray#checkRequired #schema_array_SchemaArray.checkRequired + * @see SchemaBoolean#checkRequired #schema_boolean_SchemaBoolean-checkRequired + * @see SchemaBuffer#checkRequired #schema_buffer_SchemaBuffer.schemaName + * @see SchemaNumber#checkRequired #schema_number_SchemaNumber-min + * @see SchemaObjectId#checkRequired #schema_objectid_ObjectId-auto + * @see SchemaString#checkRequired #schema_string_SchemaString-checkRequired + * @api public + */ + +SchemaType.prototype.required = function(required, message) { + let customOptions = {}; + + if (arguments.length > 0 && required == null) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.requiredValidator; + }, this); + + this.isRequired = false; + delete this.originalRequiredValue; + return this; + } + + if (typeof required === 'object') { + customOptions = required; + message = customOptions.message || message; + required = required.isRequired; + } + + if (required === false) { + this.validators = this.validators.filter(function(v) { + return v.validator !== this.requiredValidator; + }, this); + + this.isRequired = false; + delete this.originalRequiredValue; + return this; + } + + const _this = this; + this.isRequired = true; + + this.requiredValidator = function(v) { + const cachedRequired = get(this, '$__.cachedRequired'); + + // no validation when this path wasn't selected in the query. + if (cachedRequired != null && !this.$__isSelected(_this.path) && !this[documentIsModified](_this.path)) { + return true; + } + + // `$cachedRequired` gets set in `_evaluateRequiredFunctions()` so we + // don't call required functions multiple times in one validate call + // See gh-6801 + if (cachedRequired != null && _this.path in cachedRequired) { + const res = cachedRequired[_this.path] ? + _this.checkRequired(v, this) : + true; + delete cachedRequired[_this.path]; + return res; + } else if (typeof required === 'function') { + return required.apply(this) ? _this.checkRequired(v, this) : true; + } + + return _this.checkRequired(v, this); + }; + this.originalRequiredValue = required; + + if (typeof required === 'string') { + message = required; + required = undefined; + } + + const msg = message || MongooseError.messages.general.required; + this.validators.unshift(Object.assign({}, customOptions, { + validator: this.requiredValidator, + message: msg, + type: 'required' + })); + + return this; +}; + +/** + * Set the model that this path refers to. This is the option that [populate](https://mongoosejs.com/docs/populate.html) + * looks at to determine the foreign collection it should query. 
+ * + * ####Example: + * const userSchema = new Schema({ name: String }); + * const User = mongoose.model('User', userSchema); + * + * const postSchema = new Schema({ user: mongoose.ObjectId }); + * postSchema.path('user').ref('User'); // Can set ref to a model name + * postSchema.path('user').ref(User); // Or a model class + * postSchema.path('user').ref(() => 'User'); // Or a function that returns the model name + * postSchema.path('user').ref(() => User); // Or a function that returns the model class + * + * // Or you can just declare the `ref` inline in your schema + * const postSchema2 = new Schema({ + * user: { type: mongoose.ObjectId, ref: User } + * }); + * + * @param {String|Model|Function} ref either a model name, a [Model](https://mongoosejs.com/docs/models.html), or a function that returns a model name or model. + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.ref = function(ref) { + this.options.ref = ref; + return this; +}; + +/** + * Gets the default value + * + * @param {Object} scope the scope which callback are executed + * @param {Boolean} init + * @api private + */ + +SchemaType.prototype.getDefault = function(scope, init) { + let ret; + if (typeof this.defaultValue === 'function') { + if ( + this.defaultValue === Date.now || + this.defaultValue === Array || + this.defaultValue.name.toLowerCase() === 'objectid' + ) { + ret = this.defaultValue.call(scope); + } else { + ret = this.defaultValue.call(scope, scope); + } + } else { + ret = this.defaultValue; + } + + + if (ret !== null && ret !== undefined) { + if (typeof ret === 'object' && (!this.options || !this.options.shared)) { + ret = utils.clone(ret); + } + + const casted = this.applySetters(ret, scope, init); + if (casted && casted.$isSingleNested) { + casted.$__parent = scope; + } + return casted; + } + return ret; +}; + +/*! + * Applies setters without casting + * + * @api private + */ + +SchemaType.prototype._applySetters = function(value, scope, init, priorVal) { + let v = value; + if (init) { + return v; + } + const setters = this.setters; + + for (let i = setters.length - 1; i >= 0; i--) { + v = setters[i].call(scope, v, priorVal, this); + } + + return v; +}; + +/*! + * ignore + */ + +SchemaType.prototype._castNullish = function _castNullish(v) { + return v; +}; + +/** + * Applies setters + * + * @param {Object} value + * @param {Object} scope + * @param {Boolean} init + * @api private + */ + +SchemaType.prototype.applySetters = function(value, scope, init, priorVal, options) { + let v = this._applySetters(value, scope, init, priorVal, options); + if (v == null) { + return this._castNullish(v); + } + + // do not cast until all setters are applied #665 + v = this.cast(v, scope, init, priorVal, options); + + return v; +}; + +/** + * Applies getters to a value + * + * @param {Object} value + * @param {Object} scope + * @api private + */ + +SchemaType.prototype.applyGetters = function(value, scope) { + let v = value; + const getters = this.getters; + const len = getters.length; + + if (len === 0) { + return v; + } + + for (let i = 0; i < len; ++i) { + v = getters[i].call(scope, v, this); + } + + return v; +}; + +/** + * Sets default `select()` behavior for this path. + * + * Set to `true` if this path should always be included in the results, `false` if it should be excluded by default. This setting can be overridden at the query level. + * + * ####Example: + * + * T = db.model('T', new Schema({ x: { type: String, select: true }})); + * T.find(..); // field x will always be selected .. 
+ * // .. unless overridden; + * T.find().select('-x').exec(callback); + * + * @param {Boolean} val + * @return {SchemaType} this + * @api public + */ + +SchemaType.prototype.select = function select(val) { + this.selected = !!val; + return this; +}; + +/** + * Performs a validation of `value` using the validators declared for this SchemaType. + * + * @param {any} value + * @param {Function} callback + * @param {Object} scope + * @api private + */ + +SchemaType.prototype.doValidate = function(value, fn, scope, options) { + let err = false; + const path = this.path; + + // Avoid non-object `validators` + const validators = this.validators. + filter(v => v != null && typeof v === 'object'); + + let count = validators.length; + + if (!count) { + return fn(null); + } + + const _this = this; + validators.forEach(function(v) { + if (err) { + return; + } + + const validator = v.validator; + let ok; + + const validatorProperties = utils.clone(v); + validatorProperties.path = options && options.path ? options.path : path; + validatorProperties.value = value; + + if (validator instanceof RegExp) { + validate(validator.test(value), validatorProperties); + return; + } + + if (typeof validator !== 'function') { + return; + } + + if (value === undefined && validator !== _this.requiredValidator) { + validate(true, validatorProperties); + return; + } + + try { + if (validatorProperties.propsParameter) { + ok = validator.call(scope, value, validatorProperties); + } else { + ok = validator.call(scope, value); + } + } catch (error) { + ok = false; + validatorProperties.reason = error; + if (error.message) { + validatorProperties.message = error.message; + } + } + + if (ok != null && typeof ok.then === 'function') { + ok.then( + function(ok) { validate(ok, validatorProperties); }, + function(error) { + validatorProperties.reason = error; + validatorProperties.message = error.message; + ok = false; + validate(ok, validatorProperties); + }); + } else { + validate(ok, validatorProperties); + } + + }); + + function validate(ok, validatorProperties) { + if (err) { + return; + } + if (ok === undefined || ok) { + if (--count <= 0) { + immediate(function() { + fn(null); + }); + } + } else { + const ErrorConstructor = validatorProperties.ErrorConstructor || ValidatorError; + err = new ErrorConstructor(validatorProperties); + err[validatorErrorSymbol] = true; + immediate(function() { + fn(err); + }); + } + } +}; + +/** + * Performs a validation of `value` using the validators declared for this SchemaType. + * + * ####Note: + * + * This method ignores the asynchronous validators. + * + * @param {any} value + * @param {Object} scope + * @return {MongooseError|undefined} + * @api private + */ + +SchemaType.prototype.doValidateSync = function(value, scope, options) { + const path = this.path; + const count = this.validators.length; + + if (!count) { + return null; + } + + let validators = this.validators; + if (value === void 0) { + if (this.validators.length > 0 && this.validators[0].type === 'required') { + validators = [this.validators[0]]; + } else { + return null; + } + } + + let err = null; + validators.forEach(function(v) { + if (err) { + return; + } + + if (v == null || typeof v !== 'object') { + return; + } + + const validator = v.validator; + const validatorProperties = utils.clone(v); + validatorProperties.path = options && options.path ? options.path : path; + validatorProperties.value = value; + let ok; + + // Skip any explicit async validators. 
Validators that return a promise + // will still run, but won't trigger any errors. + if (isAsyncFunction(validator)) { + return; + } + + if (validator instanceof RegExp) { + validate(validator.test(value), validatorProperties); + return; + } + + if (typeof validator !== 'function') { + return; + } + + try { + if (validatorProperties.propsParameter) { + ok = validator.call(scope, value, validatorProperties); + } else { + ok = validator.call(scope, value); + } + } catch (error) { + ok = false; + validatorProperties.reason = error; + } + + // Skip any validators that return a promise, we can't handle those + // synchronously + if (ok != null && typeof ok.then === 'function') { + return; + } + validate(ok, validatorProperties); + }); + + return err; + + function validate(ok, validatorProperties) { + if (err) { + return; + } + if (ok !== undefined && !ok) { + const ErrorConstructor = validatorProperties.ErrorConstructor || ValidatorError; + err = new ErrorConstructor(validatorProperties); + err[validatorErrorSymbol] = true; + } + } +}; + +/** + * Determines if value is a valid Reference. + * + * @param {SchemaType} self + * @param {Object} value + * @param {Document} doc + * @param {Boolean} init + * @return {Boolean} + * @api private + */ + +SchemaType._isRef = function(self, value, doc, init) { + // fast path + let ref = init && self.options && (self.options.ref || self.options.refPath); + + if (!ref && doc && doc.$__ != null) { + // checks for + // - this populated with adhoc model and no ref was set in schema OR + // - setting / pushing values after population + const path = doc.$__fullPath(self.path, true); + + const owner = doc.ownerDocument ? doc.ownerDocument() : doc; + ref = (path != null && owner.$populated(path)) || doc.$populated(self.path); + } + + if (ref) { + if (value == null) { + return true; + } + if (!Buffer.isBuffer(value) && // buffers are objects too + value._bsontype !== 'Binary' // raw binary value from the db + && utils.isObject(value) // might have deselected _id in population query + ) { + return true; + } + + return init; + } + + return false; +}; + +/*! + * ignore + */ + +SchemaType.prototype._castRef = function _castRef(value, doc, init) { + if (value == null) { + return value; + } + + if (value.$__ != null) { + value.$__.wasPopulated = true; + return value; + } + + // setting a populated path + if (Buffer.isBuffer(value) || !utils.isObject(value)) { + if (init) { + return value; + } + throw new CastError(this.instance, value, this.path, null, this); + } + + // Handle the case where user directly sets a populated + // path to a plain object; cast to the Model used in + // the population query. + const path = doc.$__fullPath(this.path, true); + const owner = doc.ownerDocument ? doc.ownerDocument() : doc; + const pop = owner.$populated(path, true); + let ret = value; + if (!doc.$__.populated || + !doc.$__.populated[path] || + !doc.$__.populated[path].options || + !doc.$__.populated[path].options.options || + !doc.$__.populated[path].options.options.lean) { + ret = new pop.options[populateModelSymbol](value); + ret.$__.wasPopulated = true; + } + + return ret; +}; + +/*! + * ignore + */ + +function handleSingle(val) { + return this.castForQuery(val); +} + +/*! + * ignore + */ + +function handleArray(val) { + const _this = this; + if (!Array.isArray(val)) { + return [this.castForQuery(val)]; + } + return val.map(function(m) { + return _this.castForQuery(m); + }); +} + +/*! 
+ * Just like handleArray, except also allows `[]` because surprisingly + * `$in: [1, []]` works fine + */ + +function handle$in(val) { + const _this = this; + if (!Array.isArray(val)) { + return [this.castForQuery(val)]; + } + return val.map(function(m) { + if (Array.isArray(m) && m.length === 0) { + return m; + } + return _this.castForQuery(m); + }); +} + +/*! + * ignore + */ + +SchemaType.prototype.$conditionalHandlers = { + $all: handleArray, + $eq: handleSingle, + $in: handle$in, + $ne: handleSingle, + $nin: handle$in, + $exists: $exists, + $type: $type +}; + +/*! + * Wraps `castForQuery` to handle context + */ + +SchemaType.prototype.castForQueryWrapper = function(params) { + this.$$context = params.context; + if ('$conditional' in params) { + const ret = this.castForQuery(params.$conditional, params.val); + this.$$context = null; + return ret; + } + if (params.$skipQueryCastForUpdate || params.$applySetters) { + const ret = this._castForQuery(params.val); + this.$$context = null; + return ret; + } + + const ret = this.castForQuery(params.val); + this.$$context = null; + return ret; +}; + +/** + * Cast the given value with the given optional query operator. + * + * @param {String} [$conditional] query operator, like `$eq` or `$in` + * @param {any} val + * @api private + */ + +SchemaType.prototype.castForQuery = function($conditional, val) { + let handler; + if (arguments.length === 2) { + handler = this.$conditionalHandlers[$conditional]; + if (!handler) { + throw new Error('Can\'t use ' + $conditional); + } + return handler.call(this, val); + } + val = $conditional; + return this._castForQuery(val); +}; + +/*! + * Internal switch for runSetters + * + * @api private + */ + +SchemaType.prototype._castForQuery = function(val) { + return this.applySetters(val, this.$$context); +}; + +/** + * Override the function the required validator uses to check whether a value + * passes the `required` check. Override this on the individual SchemaType. + * + * ####Example: + * + * // Use this to allow empty strings to pass the `required` validator + * mongoose.Schema.Types.String.checkRequired(v => typeof v === 'string'); + * + * @param {Function} fn + * @return {Function} + * @static + * @receiver SchemaType + * @function checkRequired + * @api public + */ + +SchemaType.checkRequired = function(fn) { + if (arguments.length > 0) { + this._checkRequired = fn; + } + + return this._checkRequired; +}; + +/** + * Default check for if this path satisfies the `required` validator. + * + * @param {any} val + * @api private + */ + +SchemaType.prototype.checkRequired = function(val) { + return val != null; +}; + +/*! 
+ * ignore + */ + +SchemaType.prototype.clone = function() { + const options = Object.assign({}, this.options); + const schematype = new this.constructor(this.path, options, this.instance); + schematype.validators = this.validators.slice(); + if (this.requiredValidator !== undefined) schematype.requiredValidator = this.requiredValidator; + if (this.defaultValue !== undefined) schematype.defaultValue = this.defaultValue; + if (this.$immutable !== undefined && this.options.immutable === undefined) { + schematype.$immutable = this.$immutable; + + handleImmutable(schematype); + } + if (this._index !== undefined) schematype._index = this._index; + if (this.selected !== undefined) schematype.selected = this.selected; + if (this.isRequired !== undefined) schematype.isRequired = this.isRequired; + if (this.originalRequiredValue !== undefined) schematype.originalRequiredValue = this.originalRequiredValue; + schematype.getters = this.getters.slice(); + schematype.setters = this.setters.slice(); + return schematype; +}; + +/*! + * Module exports. + */ + +module.exports = exports = SchemaType; + +exports.CastError = CastError; + +exports.ValidatorError = ValidatorError; diff --git a/node_modules/mongoose/lib/statemachine.js b/node_modules/mongoose/lib/statemachine.js new file mode 100644 index 000000000..7e36dc1ce --- /dev/null +++ b/node_modules/mongoose/lib/statemachine.js @@ -0,0 +1,180 @@ + +/*! + * Module dependencies. + */ + +'use strict'; + +const utils = require('./utils'); + +/*! + * StateMachine represents a minimal `interface` for the + * constructors it builds via StateMachine.ctor(...). + * + * @api private + */ + +const StateMachine = module.exports = exports = function StateMachine() { +}; + +/*! + * StateMachine.ctor('state1', 'state2', ...) + * A factory method for subclassing StateMachine. + * The arguments are a list of states. For each state, + * the constructor's prototype gets state transition + * methods named after each state. These transition methods + * place their path argument into the given state. + * + * @param {String} state + * @param {String} [state] + * @return {Function} subclass constructor + * @private + */ + +StateMachine.ctor = function() { + const states = utils.args(arguments); + + const ctor = function() { + StateMachine.apply(this, arguments); + this.paths = {}; + this.states = {}; + this.stateNames = states; + + let i = states.length, + state; + + while (i--) { + state = states[i]; + this.states[state] = {}; + } + }; + + ctor.prototype = new StateMachine(); + + states.forEach(function(state) { + // Changes the `path`'s state to `state`. + ctor.prototype[state] = function(path) { + this._changeState(path, state); + }; + }); + + return ctor; +}; + +/*! + * This function is wrapped by the state change functions: + * + * - `require(path)` + * - `modify(path)` + * - `init(path)` + * + * @api private + */ + +StateMachine.prototype._changeState = function _changeState(path, nextState) { + const prevBucket = this.states[this.paths[path]]; + if (prevBucket) delete prevBucket[path]; + + this.paths[path] = nextState; + this.states[nextState][path] = true; +}; + +/*! + * ignore + */ + +StateMachine.prototype.clear = function clear(state) { + const keys = Object.keys(this.states[state]); + let i = keys.length; + let path; + + while (i--) { + path = keys[i]; + delete this.states[state][path]; + delete this.paths[path]; + } +}; + +/*! 
+ * Checks to see if at least one path is in the states passed in via `arguments` + * e.g., this.some('required', 'inited') + * + * @param {String} state that we want to check for. + * @private + */ + +StateMachine.prototype.some = function some() { + const _this = this; + const what = arguments.length ? arguments : this.stateNames; + return Array.prototype.some.call(what, function(state) { + return Object.keys(_this.states[state]).length; + }); +}; + +/*! + * This function builds the functions that get assigned to `forEach` and `map`, + * since both of those methods share a lot of the same logic. + * + * @param {String} iterMethod is either 'forEach' or 'map' + * @return {Function} + * @api private + */ + +StateMachine.prototype._iter = function _iter(iterMethod) { + return function() { + const numArgs = arguments.length; + let states = utils.args(arguments, 0, numArgs - 1); + const callback = arguments[numArgs - 1]; + + if (!states.length) states = this.stateNames; + + const _this = this; + + const paths = states.reduce(function(paths, state) { + return paths.concat(Object.keys(_this.states[state])); + }, []); + + return paths[iterMethod](function(path, i, paths) { + return callback(path, i, paths); + }); + }; +}; + +/*! + * Iterates over the paths that belong to one of the parameter states. + * + * The function profile can look like: + * this.forEach(state1, fn); // iterates over all paths in state1 + * this.forEach(state1, state2, fn); // iterates over all paths in state1 or state2 + * this.forEach(fn); // iterates over all paths in all states + * + * @param {String} [state] + * @param {String} [state] + * @param {Function} callback + * @private + */ + +StateMachine.prototype.forEach = function forEach() { + this.forEach = this._iter('forEach'); + return this.forEach.apply(this, arguments); +}; + +/*! + * Maps over the paths that belong to one of the parameter states. + * + * The function profile can look like: + * this.forEach(state1, fn); // iterates over all paths in state1 + * this.forEach(state1, state2, fn); // iterates over all paths in state1 or state2 + * this.forEach(fn); // iterates over all paths in all states + * + * @param {String} [state] + * @param {String} [state] + * @param {Function} callback + * @return {Array} + * @private + */ + +StateMachine.prototype.map = function map() { + this.map = this._iter('map'); + return this.map.apply(this, arguments); +}; diff --git a/node_modules/mongoose/lib/types/ArraySubdocument.js b/node_modules/mongoose/lib/types/ArraySubdocument.js new file mode 100644 index 000000000..73514c899 --- /dev/null +++ b/node_modules/mongoose/lib/types/ArraySubdocument.js @@ -0,0 +1,170 @@ +/* eslint no-func-assign: 1 */ + +/*! + * Module dependencies. + */ + +'use strict'; + +const EventEmitter = require('events').EventEmitter; +const Subdocument = require('./subdocument'); +const get = require('../helpers/get'); + +const documentArrayParent = require('../helpers/symbols').documentArrayParent; + +/** + * A constructor. 
+ * + * @param {Object} obj js object returned from the db + * @param {MongooseDocumentArray} parentArr the parent array of this document + * @param {Boolean} skipId + * @inherits Document + * @api private + */ + +function ArraySubdocument(obj, parentArr, skipId, fields, index) { + if (parentArr != null && parentArr.isMongooseDocumentArray) { + this.__parentArray = parentArr; + this[documentArrayParent] = parentArr.$parent(); + } else { + this.__parentArray = undefined; + this[documentArrayParent] = undefined; + } + this.$setIndex(index); + this.$__parent = this[documentArrayParent]; + + Subdocument.call(this, obj, fields, this[documentArrayParent], void 0, { isNew: true }); +} + +/*! + * Inherit from Subdocument + */ +ArraySubdocument.prototype = Object.create(Subdocument.prototype); +ArraySubdocument.prototype.constructor = ArraySubdocument; + +Object.defineProperty(ArraySubdocument.prototype, '$isSingleNested', { + configurable: false, + writable: false, + value: false +}); + +Object.defineProperty(ArraySubdocument.prototype, '$isDocumentArrayElement', { + configurable: false, + writable: false, + value: true +}); + +for (const i in EventEmitter.prototype) { + ArraySubdocument[i] = EventEmitter.prototype[i]; +} + +/*! + * ignore + */ + +ArraySubdocument.prototype.$setIndex = function(index) { + this.__index = index; + + if (get(this, '$__.validationError', null) != null) { + const keys = Object.keys(this.$__.validationError.errors); + for (const key of keys) { + this.invalidate(key, this.$__.validationError.errors[key]); + } + } +}; + +/*! + * ignore + */ + +ArraySubdocument.prototype.populate = function() { + throw new Error('Mongoose does not support calling populate() on nested ' + + 'docs. Instead of `doc.arr[0].populate("path")`, use ' + + '`doc.populate("arr.0.path")`'); +}; + +/*! + * ignore + */ + +ArraySubdocument.prototype.$__removeFromParent = function() { + const _id = this._doc._id; + if (!_id) { + throw new Error('For your own good, Mongoose does not know ' + + 'how to remove an ArraySubdocument that has no _id'); + } + this.__parentArray.pull({ _id: _id }); +}; + +/** + * Returns the full path to this document. If optional `path` is passed, it is appended to the full path. + * + * @param {String} [path] + * @param {Boolean} [skipIndex] Skip adding the array index. For example `arr.foo` instead of `arr.0.foo`. + * @return {String} + * @api private + * @method $__fullPath + * @memberOf ArraySubdocument + * @instance + */ + +ArraySubdocument.prototype.$__fullPath = function(path, skipIndex) { + if (this.__index == null) { + return null; + } + if (!this.$__.fullPath) { + this.ownerDocument(); + } + + if (skipIndex) { + return path ? + this.$__.fullPath + '.' + path : + this.$__.fullPath; + } + + return path ? + this.$__.fullPath + '.' + this.__index + '.' + path : + this.$__.fullPath + '.' + this.__index; +}; + +/*! + * Given a path relative to this document, return the path relative + * to the top-level document. + */ + +ArraySubdocument.prototype.$__pathRelativeToParent = function(path, skipIndex) { + if (this.__index == null) { + return null; + } + if (skipIndex) { + return path == null ? this.__parentArray.$path() : this.__parentArray.$path() + '.' + path; + } + if (path == null) { + return this.__parentArray.$path() + '.' + this.__index; + } + return this.__parentArray.$path() + '.' + this.__index + '.' + path; +}; + +/*! + * Returns this sub-documents parent document. 
+ */ + +ArraySubdocument.prototype.$parent = function() { + return this[documentArrayParent]; +}; + +/** + * Returns this sub-documents parent array. + * + * @api public + */ + +ArraySubdocument.prototype.parentArray = function() { + return this.__parentArray; +}; + +/*! + * Module exports. + */ + +module.exports = ArraySubdocument; diff --git a/node_modules/mongoose/lib/types/DocumentArray/index.js b/node_modules/mongoose/lib/types/DocumentArray/index.js new file mode 100644 index 000000000..6f428ee7d --- /dev/null +++ b/node_modules/mongoose/lib/types/DocumentArray/index.js @@ -0,0 +1,130 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const ArrayMethods = require('../array/methods'); +const DocumentArrayMethods = require('./methods'); +const Document = require('../../document'); + +const arrayAtomicsSymbol = require('../../helpers/symbols').arrayAtomicsSymbol; +const arrayAtomicsBackupSymbol = require('../../helpers/symbols').arrayAtomicsBackupSymbol; +const arrayParentSymbol = require('../../helpers/symbols').arrayParentSymbol; +const arrayPathSymbol = require('../../helpers/symbols').arrayPathSymbol; +const arraySchemaSymbol = require('../../helpers/symbols').arraySchemaSymbol; + +const _basePush = Array.prototype.push; + +/** + * DocumentArray constructor + * + * @param {Array} values + * @param {String} path the path to this array + * @param {Document} doc parent document + * @api private + * @return {MongooseDocumentArray} + * @inherits MongooseArray + * @see http://bit.ly/f6CnZU + */ + +function MongooseDocumentArray(values, path, doc) { + const arr = []; + + const internals = { + [arrayAtomicsSymbol]: {}, + [arrayAtomicsBackupSymbol]: void 0, + [arrayPathSymbol]: path, + [arraySchemaSymbol]: void 0, + [arrayParentSymbol]: void 0 + }; + + if (Array.isArray(values)) { + if (values[arrayPathSymbol] === path && + values[arrayParentSymbol] === doc) { + internals[arrayAtomicsSymbol] = Object.assign({}, values[arrayAtomicsSymbol]); + } + values.forEach(v => { + _basePush.call(arr, v); + }); + } + internals[arrayPathSymbol] = path; + + // Because doc comes from the context of another function, doc === global + // can happen if there was a null somewhere up the chain (see #3020 && #3034) + // RB Jun 17, 2015 updated to check for presence of expected paths instead + // to make more proof against unusual node environments + if (doc && doc instanceof Document) { + internals[arrayParentSymbol] = doc; + internals[arraySchemaSymbol] = doc.schema.path(path); + + // `schema.path()` doesn't drill into nested arrays properly yet, see + // gh-6398, gh-6602. This is a workaround because nested arrays are + // always plain non-document arrays, so once you get to a document array + // nesting is done. Matryoshka code. 
+ while (internals[arraySchemaSymbol] != null && + internals[arraySchemaSymbol].$isMongooseArray && + !internals[arraySchemaSymbol].$isMongooseDocumentArray) { + internals[arraySchemaSymbol] = internals[arraySchemaSymbol].casterConstructor; + } + } + + const proxy = new Proxy(arr, { + get: function(target, prop) { + if (prop === 'isMongooseArray' || + prop === 'isMongooseArrayProxy' || + prop === 'isMongooseDocumentArray' || + prop === 'isMongooseDocumentArrayProxy') { + return true; + } + if (prop === '__array') { + return arr; + } + if (prop === 'set') { + return set; + } + if (internals.hasOwnProperty(prop)) { + return internals[prop]; + } + if (DocumentArrayMethods.hasOwnProperty(prop)) { + return DocumentArrayMethods[prop]; + } + if (ArrayMethods.hasOwnProperty(prop)) { + return ArrayMethods[prop]; + } + + return arr[prop]; + }, + set: function(target, prop, value) { + if (typeof prop === 'string' && /^\d+$/.test(prop)) { + set.call(proxy, prop, value); + } else if (internals.hasOwnProperty(prop)) { + internals[prop] = value; + } else { + arr[prop] = value; + } + + return true; + } + }); + + return proxy; +} + +function set(i, val, skipModified) { + const arr = this.__array; + if (skipModified) { + arr[i] = val; + return arr; + } + const value = DocumentArrayMethods._cast.call(this, val, i); + arr[i] = value; + DocumentArrayMethods._markModified.call(this, i); + return arr; +} + +/*! + * Module exports. + */ + +module.exports = MongooseDocumentArray; diff --git a/node_modules/mongoose/lib/types/DocumentArray/methods/index.js b/node_modules/mongoose/lib/types/DocumentArray/methods/index.js new file mode 100644 index 000000000..2f9e89e53 --- /dev/null +++ b/node_modules/mongoose/lib/types/DocumentArray/methods/index.js @@ -0,0 +1,359 @@ +'use strict'; + +const ArrayMethods = require('../../array/methods'); +const Document = require('../../../document'); +const ObjectId = require('../../objectid'); +const castObjectId = require('../../../cast/objectid'); +const getDiscriminatorByValue = require('../../../helpers/discriminator/getDiscriminatorByValue'); +const internalToObjectOptions = require('../../../options').internalToObjectOptions; +const utils = require('../../../utils'); + +const arrayParentSymbol = require('../../../helpers/symbols').arrayParentSymbol; +const arrayPathSymbol = require('../../../helpers/symbols').arrayPathSymbol; +const arraySchemaSymbol = require('../../../helpers/symbols').arraySchemaSymbol; +const documentArrayParent = require('../../../helpers/symbols').documentArrayParent; + +const methods = { + /*! + * ignore + */ + + toBSON() { + return this.toObject(internalToObjectOptions); + }, + + /** + * Overrides MongooseArray#cast + * + * @method _cast + * @api private + * @receiver MongooseDocumentArray + */ + + _cast(value, index) { + if (this[arraySchemaSymbol] == null) { + return value; + } + let Constructor = this[arraySchemaSymbol].casterConstructor; + const isInstance = Constructor.$isMongooseDocumentArray ? 
+ value && value.isMongooseDocumentArray : + value instanceof Constructor; + if (isInstance || + // Hack re: #5001, see #5005 + (value && value.constructor && value.constructor.baseCasterConstructor === Constructor)) { + if (!(value[documentArrayParent] && value.__parentArray)) { + // value may have been created using array.create() + value[documentArrayParent] = this[arrayParentSymbol]; + value.__parentArray = this; + } + value.$setIndex(index); + return value; + } + + if (value === undefined || value === null) { + return null; + } + + // handle cast('string') or cast(ObjectId) etc. + // only objects are permitted so we can safely assume that + // non-objects are to be interpreted as _id + if (Buffer.isBuffer(value) || + value instanceof ObjectId || !utils.isObject(value)) { + value = { _id: value }; + } + + if (value && + Constructor.discriminators && + Constructor.schema && + Constructor.schema.options && + Constructor.schema.options.discriminatorKey) { + if (typeof value[Constructor.schema.options.discriminatorKey] === 'string' && + Constructor.discriminators[value[Constructor.schema.options.discriminatorKey]]) { + Constructor = Constructor.discriminators[value[Constructor.schema.options.discriminatorKey]]; + } else { + const constructorByValue = getDiscriminatorByValue(Constructor.discriminators, value[Constructor.schema.options.discriminatorKey]); + if (constructorByValue) { + Constructor = constructorByValue; + } + } + } + + if (Constructor.$isMongooseDocumentArray) { + return Constructor.cast(value, this, undefined, undefined, index); + } + const ret = new Constructor(value, this, undefined, undefined, index); + ret.isNew = true; + return ret; + }, + + /** + * Searches array items for the first document with a matching _id. + * + * ####Example: + * + * const embeddedDoc = m.array.id(some_id); + * + * @return {EmbeddedDocument|null} the subdocument or null if not found. + * @param {ObjectId|String|Number|Buffer} id + * @TODO cast to the _id based on schema for proper comparison + * @method id + * @api public + * @receiver MongooseDocumentArray + */ + + id(id) { + let casted; + let sid; + let _id; + + try { + casted = castObjectId(id).toString(); + } catch (e) { + casted = null; + } + + for (const val of this) { + if (!val) { + continue; + } + + _id = val.get('_id'); + + if (_id === null || typeof _id === 'undefined') { + continue; + } else if (_id instanceof Document) { + sid || (sid = String(id)); + if (sid == _id._id) { + return val; + } + } else if (!(id instanceof ObjectId) && !(_id instanceof ObjectId)) { + if (id == _id || utils.deepEqual(id, _id)) { + return val; + } + } else if (casted == _id) { + return val; + } + } + + return null; + }, + + /** + * Returns a native js Array of plain js objects + * + * ####NOTE: + * + * _Each sub-document is converted to a plain object by calling its `#toObject` method._ + * + * @param {Object} [options] optional options to pass to each documents `toObject` method call during conversion + * @return {Array} + * @method toObject + * @api public + * @receiver MongooseDocumentArray + */ + + toObject(options) { + // `[].concat` coerces the return value into a vanilla JS array, rather + // than a Mongoose array. 
+ return [].concat(this.map(function(doc) { + if (doc == null) { + return null; + } + if (typeof doc.toObject !== 'function') { + return doc; + } + return doc.toObject(options); + })); + }, + + $toObject() { + return this.constructor.prototype.toObject.apply(this, arguments); + }, + + /** + * Wraps [`Array#push`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/push) with proper change tracking. + * + * @param {Object} [args...] + * @api public + * @method push + * @memberOf MongooseDocumentArray + */ + + push() { + const ret = ArrayMethods.push.apply(this, arguments); + + _updateParentPopulated(this); + + return ret; + }, + + /** + * Pulls items from the array atomically. + * + * @param {Object} [args...] + * @api public + * @method pull + * @memberOf MongooseDocumentArray + */ + + pull() { + const ret = ArrayMethods.pull.apply(this, arguments); + + _updateParentPopulated(this); + + return ret; + }, + + /** + * Wraps [`Array#shift`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/unshift) with proper change tracking. + */ + + shift() { + const ret = ArrayMethods.shift.apply(this, arguments); + + _updateParentPopulated(this); + + return ret; + }, + + /** + * Wraps [`Array#splice`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/splice) with proper change tracking and casting. + */ + + splice() { + const ret = ArrayMethods.splice.apply(this, arguments); + + _updateParentPopulated(this); + + return ret; + }, + + /** + * Helper for console.log + * + * @method inspect + * @api public + * @receiver MongooseDocumentArray + */ + + inspect() { + return this.toObject(); + }, + + /** + * Creates a subdocument casted to this schema. + * + * This is the same subdocument constructor used for casting. + * + * @param {Object} obj the value to cast to this arrays SubDocument schema + * @method create + * @api public + * @receiver MongooseDocumentArray + */ + + create(obj) { + let Constructor = this[arraySchemaSymbol].casterConstructor; + if (obj && + Constructor.discriminators && + Constructor.schema && + Constructor.schema.options && + Constructor.schema.options.discriminatorKey) { + if (typeof obj[Constructor.schema.options.discriminatorKey] === 'string' && + Constructor.discriminators[obj[Constructor.schema.options.discriminatorKey]]) { + Constructor = Constructor.discriminators[obj[Constructor.schema.options.discriminatorKey]]; + } else { + const constructorByValue = getDiscriminatorByValue(Constructor.discriminators, obj[Constructor.schema.options.discriminatorKey]); + if (constructorByValue) { + Constructor = constructorByValue; + } + } + } + + return new Constructor(obj, this); + }, + + /*! + * ignore + */ + + notify(event) { + const _this = this; + return function notify(val, _arr) { + _arr = _arr || _this; + let i = _arr.length; + while (i--) { + if (_arr[i] == null) { + continue; + } + switch (event) { + // only swap for save event for now, we may change this to all event types later + case 'save': + val = _this[i]; + break; + default: + // NO-OP + break; + } + + if (_arr[i].isMongooseArray) { + notify(val, _arr[i]); + } else if (_arr[i]) { + _arr[i].emit(event, val); + } + } + }; + }, + + _markModified(elem, embeddedPath) { + const parent = this[arrayParentSymbol]; + let dirtyPath; + + if (parent) { + dirtyPath = this[arrayPathSymbol]; + + if (arguments.length) { + if (embeddedPath != null) { + // an embedded doc bubbled up the change + const index = elem.__index; + dirtyPath = dirtyPath + '.' + index + '.' 
+ embeddedPath; + } else { + // directly set an index + dirtyPath = dirtyPath + '.' + elem; + } + } + + if (dirtyPath != null && dirtyPath.endsWith('.$')) { + return this; + } + + parent.markModified(dirtyPath, arguments.length > 0 ? elem : parent); + } + + return this; + } +}; + +module.exports = methods; + +/*! + * If this is a document array, each element may contain single + * populated paths, so we need to modify the top-level document's + * populated cache. See gh-8247, gh-8265. + */ + +function _updateParentPopulated(arr) { + const parent = arr[arrayParentSymbol]; + if (!parent || parent.$__.populated == null) return; + + const populatedPaths = Object.keys(parent.$__.populated). + filter(p => p.startsWith(arr[arrayPathSymbol] + '.')); + + for (const path of populatedPaths) { + const remnant = path.slice((arr[arrayPathSymbol] + '.').length); + if (!Array.isArray(parent.$__.populated[path].value)) { + continue; + } + + parent.$__.populated[path].value = arr.map(val => val.$populated(remnant)); + } +} \ No newline at end of file diff --git a/node_modules/mongoose/lib/types/array/ArrayWrapper.js b/node_modules/mongoose/lib/types/array/ArrayWrapper.js new file mode 100644 index 000000000..90d057f94 --- /dev/null +++ b/node_modules/mongoose/lib/types/array/ArrayWrapper.js @@ -0,0 +1,981 @@ +'use strict'; + +const Document = require('../../document'); +const ArraySubdocument = require('../ArraySubdocument'); +const MongooseError = require('../../error/mongooseError'); +const ObjectId = require('../objectid'); +const cleanModifiedSubpaths = require('../../helpers/document/cleanModifiedSubpaths'); +const get = require('../../helpers/get'); +const internalToObjectOptions = require('../../options').internalToObjectOptions; +const utils = require('../../utils'); +const util = require('util'); + +const arrayAtomicsSymbol = require('../../helpers/symbols').arrayAtomicsSymbol; +const arrayParentSymbol = require('../../helpers/symbols').arrayParentSymbol; +const arrayPathSymbol = require('../../helpers/symbols').arrayPathSymbol; +const arraySchemaSymbol = require('../../helpers/symbols').arraySchemaSymbol; +const populateModelSymbol = require('../../helpers/symbols').populateModelSymbol; +const slicedSymbol = Symbol('mongoose#Array#sliced'); + +const _basePush = Array.prototype.push; + +const validatorsSymbol = Symbol('mongoose#MongooseCoreArray#validators'); + +/*! + * ignore + */ + +class ArrayWrapper extends Array { + get isMongooseArray() { + return true; + } + + get validators() { + return this[validatorsSymbol]; + } + + set validators(v) { + this[validatorsSymbol] = v; + } + + /** + * Depopulates stored atomic operation values as necessary for direct insertion to MongoDB. + * + * If no atomics exist, we return all array values after conversion. + * + * @return {Array} + * @method $__getAtomics + * @memberOf MongooseArray + * @instance + * @api private + */ + + $__getAtomics() { + const ret = []; + const keys = Object.keys(this[arrayAtomicsSymbol] || {}); + let i = keys.length; + + const opts = Object.assign({}, internalToObjectOptions, { _isNested: true }); + + if (i === 0) { + ret[0] = ['$set', this.toObject(opts)]; + return ret; + } + + while (i--) { + const op = keys[i]; + let val = this[arrayAtomicsSymbol][op]; + + // the atomic values which are arrays are not MongooseArrays. we + // need to convert their elements as if they were MongooseArrays + // to handle populated arrays versus DocumentArrays properly. 
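// Illustrative sketch of what `$__getAtomics()` ultimately returns (not part of
// the upstream mongoose source; `doc.tags` is a hypothetical `[String]` path):
//
//   doc.tags.push('a', 'b');
//   doc.tags.$__getAtomics(); // roughly [ ['$push', { $each: ['a', 'b'] }] ]
//
//   // and with no pending atomics, the whole array is sent as a single $set:
//   // [ ['$set', [ /* current array values */ ]] ]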
+ if (utils.isMongooseObject(val)) { + val = val.toObject(opts); + } else if (Array.isArray(val)) { + val = this.toObject.call(val, opts); + } else if (val != null && Array.isArray(val.$each)) { + val.$each = this.toObject.call(val.$each, opts); + } else if (val != null && typeof val.valueOf === 'function') { + val = val.valueOf(); + } + + if (op === '$addToSet') { + val = { $each: val }; + } + + ret.push([op, val]); + } + + return ret; + } + + /*! + * ignore + */ + + $atomics() { + return this[arrayAtomicsSymbol] || {}; + } + + /*! + * ignore + */ + + $parent() { + return this[arrayParentSymbol]; + } + + /*! + * ignore + */ + + $path() { + return this[arrayPathSymbol]; + } + + /** + * Atomically shifts the array at most one time per document `save()`. + * + * ####NOTE: + * + * _Calling this multiple times on an array before saving sends the same command as calling it once._ + * _This update is implemented using the MongoDB [$pop](http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop) method which enforces this restriction._ + * + * doc.array = [1,2,3]; + * + * const shifted = doc.array.$shift(); + * console.log(shifted); // 1 + * console.log(doc.array); // [2,3] + * + * // no affect + * shifted = doc.array.$shift(); + * console.log(doc.array); // [2,3] + * + * doc.save(function (err) { + * if (err) return handleError(err); + * + * // we saved, now $shift works again + * shifted = doc.array.$shift(); + * console.log(shifted ); // 2 + * console.log(doc.array); // [3] + * }) + * + * @api public + * @memberOf MongooseArray + * @instance + * @method $shift + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop + */ + + $shift() { + this._registerAtomic('$pop', -1); + this._markModified(); + + // only allow shifting once + if (this._shifted) { + return; + } + this._shifted = true; + + return [].shift.call(this); + } + + /** + * Pops the array atomically at most one time per document `save()`. + * + * #### NOTE: + * + * _Calling this mulitple times on an array before saving sends the same command as calling it once._ + * _This update is implemented using the MongoDB [$pop](http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop) method which enforces this restriction._ + * + * doc.array = [1,2,3]; + * + * const popped = doc.array.$pop(); + * console.log(popped); // 3 + * console.log(doc.array); // [1,2] + * + * // no affect + * popped = doc.array.$pop(); + * console.log(doc.array); // [1,2] + * + * doc.save(function (err) { + * if (err) return handleError(err); + * + * // we saved, now $pop works again + * popped = doc.array.$pop(); + * console.log(popped); // 2 + * console.log(doc.array); // [1] + * }) + * + * @api public + * @method $pop + * @memberOf MongooseArray + * @instance + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop + * @method $pop + * @memberOf MongooseArray + */ + + $pop() { + this._registerAtomic('$pop', 1); + this._markModified(); + + // only allow popping once + if (this._popped) { + return; + } + this._popped = true; + + return [].pop.call(this); + } + + /*! + * ignore + */ + + $schema() { + return this[arraySchemaSymbol]; + } + + /** + * Casts a member based on this arrays schema. 
+ * + * @param {any} value + * @return value the casted value + * @method _cast + * @api private + * @memberOf MongooseArray + */ + + _cast(value) { + let populated = false; + let Model; + + if (this[arrayParentSymbol]) { + populated = this[arrayParentSymbol].$populated(this[arrayPathSymbol], true); + } + + if (populated && value !== null && value !== undefined) { + // cast to the populated Models schema + Model = populated.options[populateModelSymbol]; + + // only objects are permitted so we can safely assume that + // non-objects are to be interpreted as _id + if (Buffer.isBuffer(value) || + value instanceof ObjectId || !utils.isObject(value)) { + value = { _id: value }; + } + + // gh-2399 + // we should cast model only when it's not a discriminator + const isDisc = value.$__schema && value.$__schema.discriminatorMapping && + value.$__schema.discriminatorMapping.key !== undefined; + if (!isDisc) { + value = new Model(value); + } + return this[arraySchemaSymbol].caster.applySetters(value, this[arrayParentSymbol], true); + } + + return this[arraySchemaSymbol].caster.applySetters(value, this[arrayParentSymbol], false); + } + + /** + * Internal helper for .map() + * + * @api private + * @return {Number} + * @method _mapCast + * @memberOf MongooseArray + */ + + _mapCast(val, index) { + return this._cast(val, this.length + index); + } + + /** + * Marks this array as modified. + * + * If it bubbles up from an embedded document change, then it takes the following arguments (otherwise, takes 0 arguments) + * + * @param {ArraySubdocument} subdoc the embedded doc that invoked this method on the Array + * @param {String} embeddedPath the path which changed in the subdoc + * @method _markModified + * @api private + * @memberOf MongooseArray + */ + + _markModified(elem) { + const parent = this[arrayParentSymbol]; + let dirtyPath; + + if (parent) { + dirtyPath = this[arrayPathSymbol]; + + if (arguments.length) { + dirtyPath = dirtyPath + '.' + elem; + } + + if (dirtyPath != null && dirtyPath.endsWith('.$')) { + return this; + } + + parent.markModified(dirtyPath, arguments.length > 0 ? elem : parent); + } + + return this; + } + + /** + * Register an atomic operation with the parent. + * + * @param {Array} op operation + * @param {any} val + * @method _registerAtomic + * @api private + * @memberOf MongooseArray + */ + + _registerAtomic(op, val) { + if (this[slicedSymbol]) { + return; + } + if (op === '$set') { + // $set takes precedence over all other ops. + // mark entire array modified. + this[arrayAtomicsSymbol] = { $set: val }; + cleanModifiedSubpaths(this[arrayParentSymbol], this[arrayPathSymbol]); + this._markModified(); + return this; + } + + this[arrayAtomicsSymbol] || (this[arrayAtomicsSymbol] = {}); + + const atomics = this[arrayAtomicsSymbol]; + + // reset pop/shift after save + if (op === '$pop' && !('$pop' in atomics)) { + const _this = this; + this[arrayParentSymbol].once('save', function() { + _this._popped = _this._shifted = null; + }); + } + + // check for impossible $atomic combos (Mongo denies more than one + // $atomic op on a single path + if (atomics.$set || Object.keys(atomics).length && !(op in atomics)) { + // a different op was previously registered. + // save the entire thing. 
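// Illustrative sketch of when this fallback is hit (not part of the upstream
// mongoose source; `doc.tags` is a hypothetical array path). Registering two
// different atomic ops on the same path before a save collapses them into a
// single `$set` of the entire array:
//
//   doc.tags.push('a'); // registers a $push
//   doc.tags.pull('b'); // different op -> atomics become { $set: [ ...whole array ] }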
+ this[arrayAtomicsSymbol] = { $set: this }; + return this; + } + + let selector; + + if (op === '$pullAll' || op === '$addToSet') { + atomics[op] || (atomics[op] = []); + atomics[op] = atomics[op].concat(val); + } else if (op === '$pullDocs') { + const pullOp = atomics['$pull'] || (atomics['$pull'] = {}); + if (val[0] instanceof ArraySubdocument) { + selector = pullOp['$or'] || (pullOp['$or'] = []); + Array.prototype.push.apply(selector, val.map(function(v) { + return v.toObject({ transform: false, virtuals: false }); + })); + } else { + selector = pullOp['_id'] || (pullOp['_id'] = { $in: [] }); + selector['$in'] = selector['$in'].concat(val); + } + } else if (op === '$push') { + atomics.$push = atomics.$push || { $each: [] }; + if (val != null && utils.hasUserDefinedProperty(val, '$each')) { + atomics.$push = val; + } else { + atomics.$push.$each = atomics.$push.$each.concat(val); + } + } else { + atomics[op] = val; + } + + return this; + } + + /** + * Adds values to the array if not already present. + * + * ####Example: + * + * console.log(doc.array) // [2,3,4] + * const added = doc.array.addToSet(4,5); + * console.log(doc.array) // [2,3,4,5] + * console.log(added) // [5] + * + * @param {any} [args...] + * @return {Array} the values that were added + * @memberOf MongooseArray + * @api public + * @method addToSet + */ + + addToSet() { + _checkManualPopulation(this, arguments); + + let values = [].map.call(arguments, this._mapCast, this); + values = this[arraySchemaSymbol].applySetters(values, this[arrayParentSymbol]); + const added = []; + let type = ''; + if (values[0] instanceof ArraySubdocument) { + type = 'doc'; + } else if (values[0] instanceof Date) { + type = 'date'; + } + + const rawValues = values.isMongooseArrayProxy ? values.__array : this; + const rawArray = this.isMongooseArrayProxy ? this.__array : this; + + rawValues.forEach(function(v) { + let found; + const val = +v; + switch (type) { + case 'doc': + found = this.some(function(doc) { + return doc.equals(v); + }); + break; + case 'date': + found = this.some(function(d) { + return +d === val; + }); + break; + default: + found = ~this.indexOf(v); + } + + if (!found) { + rawArray.push(v); + this._registerAtomic('$addToSet', v); + this._markModified(); + [].push.call(added, v); + } + }, this); + + return added; + } + + /** + * Returns the number of pending atomic operations to send to the db for this array. + * + * @api private + * @return {Number} + * @method hasAtomics + * @memberOf MongooseArray + */ + + hasAtomics() { + if (!utils.isPOJO(this[arrayAtomicsSymbol])) { + return 0; + } + + return Object.keys(this[arrayAtomicsSymbol]).length; + } + + /** + * Return whether or not the `obj` is included in the array. + * + * @param {Object} obj the item to check + * @return {Boolean} + * @api public + * @method includes + * @memberOf MongooseArray + */ + + includes(obj, fromIndex) { + const ret = this.indexOf(obj, fromIndex); + return ret !== -1; + } + + /** + * Return the index of `obj` or `-1` if not found. + * + * @param {Object} obj the item to look for + * @return {Number} + * @api public + * @method indexOf + * @memberOf MongooseArray + */ + + indexOf(obj, fromIndex) { + if (obj instanceof ObjectId) { + obj = obj.toString(); + } + + fromIndex = fromIndex == null ? 
0 : fromIndex; + const len = this.length; + for (let i = fromIndex; i < len; ++i) { + if (obj == this[i]) { + return i; + } + } + return -1; + } + + /** + * Helper for console.log + * + * @api public + * @method inspect + * @memberOf MongooseArray + */ + + inspect() { + return JSON.stringify(this); + } + + /** + * Pushes items to the array non-atomically. + * + * ####NOTE: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @param {any} [args...] + * @api public + * @method nonAtomicPush + * @memberOf MongooseArray + */ + + nonAtomicPush() { + const values = [].map.call(arguments, this._mapCast, this); + const ret = [].push.apply(this, values); + this._registerAtomic('$set', this); + this._markModified(); + return ret; + } + + /** + * Wraps [`Array#pop`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/pop) with proper change tracking. + * + * ####Note: + * + * _marks the entire array as modified which will pass the entire thing to $set potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @see MongooseArray#$pop #types_array_MongooseArray-%24pop + * @api public + * @method pop + * @memberOf MongooseArray + */ + + pop() { + const ret = [].pop.call(this); + this._registerAtomic('$set', this); + this._markModified(); + return ret; + } + + /** + * Pulls items from the array atomically. Equality is determined by casting + * the provided value to an embedded document and comparing using + * [the `Document.equals()` function.](./api.html#document_Document-equals) + * + * ####Examples: + * + * doc.array.pull(ObjectId) + * doc.array.pull({ _id: 'someId' }) + * doc.array.pull(36) + * doc.array.pull('tag 1', 'tag 2') + * + * To remove a document from a subdocument array we may pass an object with a matching `_id`. + * + * doc.subdocs.push({ _id: 4815162342 }) + * doc.subdocs.pull({ _id: 4815162342 }) // removed + * + * Or we may passing the _id directly and let mongoose take care of it. + * + * doc.subdocs.push({ _id: 4815162342 }) + * doc.subdocs.pull(4815162342); // works + * + * The first pull call will result in a atomic operation on the database, if pull is called repeatedly without saving the document, a $set operation is used on the complete array instead, overwriting possible changes that happened on the database in the meantime. + * + * @param {any} [args...] + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pull + * @api public + * @method pull + * @memberOf MongooseArray + */ + + pull() { + const values = [].map.call(arguments, this._cast, this); + const cur = this[arrayParentSymbol].get(this[arrayPathSymbol]); + let i = cur.length; + let mem; + + while (i--) { + mem = cur[i]; + if (mem instanceof Document) { + const some = values.some(function(v) { + return mem.equals(v); + }); + if (some) { + [].splice.call(cur, i, 1); + } + } else if (~cur.indexOf.call(values, mem)) { + [].splice.call(cur, i, 1); + } + } + + if (values[0] instanceof ArraySubdocument) { + this._registerAtomic('$pullDocs', values.map(function(v) { + return v.$__getValue('_id') || v; + })); + } else { + this._registerAtomic('$pullAll', values); + } + + this._markModified(); + + // Might have modified child paths and then pulled, like + // `doc.children[1].name = 'test';` followed by + // `doc.children.remove(doc.children[0]);`. 
In this case we fall back + // to a `$set` on the whole array. See #3511 + if (cleanModifiedSubpaths(this[arrayParentSymbol], this[arrayPathSymbol]) > 0) { + this._registerAtomic('$set', this); + } + + return this; + } + + /** + * Wraps [`Array#push`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/push) with proper change tracking. + * + * ####Example: + * + * const schema = Schema({ nums: [Number] }); + * const Model = mongoose.model('Test', schema); + * + * const doc = await Model.create({ nums: [3, 4] }); + * doc.nums.push(5); // Add 5 to the end of the array + * await doc.save(); + * + * // You can also pass an object with `$each` as the + * // first parameter to use MongoDB's `$position` + * doc.nums.push({ + * $each: [1, 2], + * $position: 0 + * }); + * doc.nums; // [1, 2, 3, 4, 5] + * + * @param {Object} [args...] + * @api public + * @method push + * @memberOf MongooseArray + */ + + push() { + let values = arguments; + let atomic = values; + const isOverwrite = values[0] != null && + utils.hasUserDefinedProperty(values[0], '$each'); + const arr = this.isMongooseArrayProxy ? this.__array : this; + if (isOverwrite) { + atomic = values[0]; + values = values[0].$each; + } + + if (this[arraySchemaSymbol] == null) { + return _basePush.apply(this, values); + } + + _checkManualPopulation(this, values); + + const parent = this[arrayParentSymbol]; + values = [].map.call(values, this._mapCast, this); + values = this[arraySchemaSymbol].applySetters(values, parent, undefined, + undefined, { skipDocumentArrayCast: true }); + let ret; + const atomics = this[arrayAtomicsSymbol]; + + if (isOverwrite) { + atomic.$each = values; + + if (get(atomics, '$push.$each.length', 0) > 0 && + atomics.$push.$position != atomic.$position) { + throw new MongooseError('Cannot call `Array#push()` multiple times ' + + 'with different `$position`'); + } + + if (atomic.$position != null) { + [].splice.apply(arr, [atomic.$position, 0].concat(values)); + ret = this.length; + } else { + ret = [].push.apply(arr, values); + } + } else { + if (get(atomics, '$push.$each.length', 0) > 0 && + atomics.$push.$position != null) { + throw new MongooseError('Cannot call `Array#push()` multiple times ' + + 'with different `$position`'); + } + atomic = values; + ret = [].push.apply(arr, values); + } + + this._registerAtomic('$push', atomic); + this._markModified(); + return ret; + } + + /** + * Alias of [pull](#mongoosearray_MongooseArray-pull) + * + * @see MongooseArray#pull #types_array_MongooseArray-pull + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pull + * @api public + * @memberOf MongooseArray + * @instance + * @method remove + */ + + remove() { + return this.pull.apply(this, arguments); + } + + /** + * Sets the casted `val` at index `i` and marks the array modified. 
+ * + * ####Example: + * + * // given documents based on the following + * const Doc = mongoose.model('Doc', new Schema({ array: [Number] })); + * + * const doc = new Doc({ array: [2,3,4] }) + * + * console.log(doc.array) // [2,3,4] + * + * doc.array.set(1,"5"); + * console.log(doc.array); // [2,5,4] // properly cast to number + * doc.save() // the change is saved + * + * // VS not using array#set + * doc.array[1] = "5"; + * console.log(doc.array); // [2,"5",4] // no casting + * doc.save() // change is not saved + * + * @return {Array} this + * @api public + * @method set + * @memberOf MongooseArray + */ + + set(i, val, skipModified) { + if (skipModified) { + this[i] = val; + return this; + } + const value = this._cast(val, i); + this[i] = value; + this._markModified(i); + return this; + } + + /** + * Wraps [`Array#shift`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/unshift) with proper change tracking. + * + * ####Example: + * + * doc.array = [2,3]; + * const res = doc.array.shift(); + * console.log(res) // 2 + * console.log(doc.array) // [3] + * + * ####Note: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method shift + * @memberOf MongooseArray + */ + + shift() { + const arr = this.isMongooseArrayProxy ? this.__array : this; + const ret = [].shift.call(arr); + this._registerAtomic('$set', this); + this._markModified(); + return ret; + } + + /** + * Wraps [`Array#sort`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/sort) with proper change tracking. + * + * ####NOTE: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method sort + * @memberOf MongooseArray + * @see https://masteringjs.io/tutorials/fundamentals/array-sort + */ + + sort() { + const arr = this.isMongooseArrayProxy ? this.__array : this; + const ret = [].sort.apply(arr, arguments); + this._registerAtomic('$set', this); + return ret; + } + + /** + * Wraps [`Array#splice`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/splice) with proper change tracking and casting. + * + * ####Note: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method splice + * @memberOf MongooseArray + * @see https://masteringjs.io/tutorials/fundamentals/array-splice + */ + + splice() { + let ret; + const arr = this.isMongooseArrayProxy ? this.__array : this; + + _checkManualPopulation(this, Array.prototype.slice.call(arguments, 2)); + + if (arguments.length) { + let vals; + if (this[arraySchemaSymbol] == null) { + vals = arguments; + } else { + vals = []; + for (let i = 0; i < arguments.length; ++i) { + vals[i] = i < 2 ? + arguments[i] : + this._cast(arguments[i], arguments[0] + (i - 2)); + } + } + + ret = [].splice.apply(arr, vals); + this._registerAtomic('$set', this); + } + + return ret; + } + + /*! + * ignore + */ + + toBSON() { + return this.toObject(internalToObjectOptions); + } + + /** + * Returns a native js Array. 
+ * + * @param {Object} options + * @return {Array} + * @api public + * @method toObject + * @memberOf MongooseArray + */ + + toObject(options) { + const arr = this.isMongooseArrayProxy ? this.__array : this; + if (options && options.depopulate) { + options = utils.clone(options); + options._isNested = true; + // Ensure return value is a vanilla array, because in Node.js 6+ `map()` + // is smart enough to use the inherited array's constructor. + return [].concat(arr).map(function(doc) { + return doc instanceof Document + ? doc.toObject(options) + : doc; + }); + } + + return [].concat(arr); + } + + $toObject() { + return this.constructor.prototype.toObject.apply(this, arguments); + } + + /** + * Wraps [`Array#unshift`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/unshift) with proper change tracking. + * + * ####Note: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwriting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method unshift + * @memberOf MongooseArray + */ + + unshift() { + _checkManualPopulation(this, arguments); + + let values; + if (this[arraySchemaSymbol] == null) { + values = arguments; + } else { + values = [].map.call(arguments, this._cast, this); + values = this[arraySchemaSymbol].applySetters(values, this[arrayParentSymbol]); + } + + const arr = this.isMongooseArrayProxy ? this.__array : this; + [].unshift.apply(arr, values); + this._registerAtomic('$set', this); + this._markModified(); + return this.length; + } +} + +if (util.inspect.custom) { + ArrayWrapper.prototype[util.inspect.custom] = + ArrayWrapper.prototype.inspect; +} + +/*! + * ignore + */ + +function _isAllSubdocs(docs, ref) { + if (!ref) { + return false; + } + + for (const arg of docs) { + if (arg == null) { + return false; + } + const model = arg.constructor; + if (!(arg instanceof Document) || + (model.modelName !== ref && model.baseModelName !== ref)) { + return false; + } + } + + return true; +} + +/*! + * ignore + */ + +function _checkManualPopulation(arr, docs) { + const ref = arr == null ? + null : + get(arr[arraySchemaSymbol], 'caster.options.ref', null); + if (arr.length === 0 && + docs.length > 0) { + if (_isAllSubdocs(docs, ref)) { + arr[arrayParentSymbol].$populated(arr[arrayPathSymbol], [], { + [populateModelSymbol]: docs[0].constructor + }); + } + } +} + +const returnVanillaArrayMethods = [ + 'filter', + 'flat', + 'flatMap', + 'map', + 'slice' +]; +for (const method of returnVanillaArrayMethods) { + if (Array.prototype[method] == null) { + continue; + } + + ArrayWrapper.prototype[method] = function() { + const _arr = this.isMongooseArrayProxy ? this.__array : this; + const arr = [].concat(_arr); + + return arr[method].apply(arr, arguments); + }; +} + +module.exports = ArrayWrapper; diff --git a/node_modules/mongoose/lib/types/array/index.js b/node_modules/mongoose/lib/types/array/index.js new file mode 100644 index 000000000..9b71bf400 --- /dev/null +++ b/node_modules/mongoose/lib/types/array/index.js @@ -0,0 +1,115 @@ +/*! + * Module dependencies. 
+ */ + +'use strict'; + +const Document = require('../../document'); +const mongooseArrayMethods = require('./methods'); + +const arrayAtomicsSymbol = require('../../helpers/symbols').arrayAtomicsSymbol; +const arrayAtomicsBackupSymbol = require('../../helpers/symbols').arrayAtomicsBackupSymbol; +const arrayParentSymbol = require('../../helpers/symbols').arrayParentSymbol; +const arrayPathSymbol = require('../../helpers/symbols').arrayPathSymbol; +const arraySchemaSymbol = require('../../helpers/symbols').arraySchemaSymbol; + +/** + * Mongoose Array constructor. + * + * ####NOTE: + * + * _Values always have to be passed to the constructor to initialize, otherwise `MongooseArray#push` will mark the array as modified._ + * + * @param {Array} values + * @param {String} path + * @param {Document} doc parent document + * @api private + * @inherits Array + * @see http://bit.ly/f6CnZU + */ + +function MongooseArray(values, path, doc, schematype) { + let arr; + if (Array.isArray(values)) { + const len = values.length; + + // Perf optimizations for small arrays: much faster to use `...` than `for` + `push`, + // but large arrays may cause stack overflows. And for arrays of length 0/1, just + // modifying the array is faster. Seems small, but adds up when you have a document + // with thousands of nested arrays. + if (len === 0) { + arr = new Array(); + } else if (len === 1) { + arr = new Array(1); + arr[0] = values[0]; + } else if (len < 10000) { + arr = new Array(); + arr.push.apply(arr, values); + } else { + arr = new Array(); + for (let i = 0; i < len; ++i) { + arr.push(values[i]); + } + } + } else { + arr = []; + } + + const internals = { + [arrayAtomicsSymbol]: {}, + [arrayAtomicsBackupSymbol]: void 0, + [arrayPathSymbol]: path, + [arraySchemaSymbol]: schematype, + [arrayParentSymbol]: void 0, + isMongooseArray: true, + isMongooseArrayProxy: true, + __array: arr + }; + + if (values[arrayAtomicsSymbol] != null) { + internals[arrayAtomicsSymbol] = values[arrayAtomicsSymbol]; + } + + // Because doc comes from the context of another function, doc === global + // can happen if there was a null somewhere up the chain (see #3020) + // RB Jun 17, 2015 updated to check for presence of expected paths instead + // to make more proof against unusual node environments + if (doc != null && doc instanceof Document) { + internals[arrayParentSymbol] = doc; + internals[arraySchemaSymbol] = schematype || doc.schema.path(path); + } + + const proxy = new Proxy(arr, { + get: function(target, prop) { + if (internals.hasOwnProperty(prop)) { + return internals[prop]; + } + if (mongooseArrayMethods.hasOwnProperty(prop)) { + return mongooseArrayMethods[prop]; + } + + return arr[prop]; + }, + set: function(target, prop, val) { + if (typeof prop === 'string' && /^\d+$/.test(prop)) { + const value = mongooseArrayMethods._cast.call(proxy, val, prop); + arr[prop] = value; + mongooseArrayMethods._markModified.call(proxy, prop); + } else if (internals.hasOwnProperty(prop)) { + internals[prop] = val; + } else { + arr[prop] = val; + } + + return true; + } + }); + + return proxy; +} + +/*! + * Module exports. 
+ */ + +module.exports = exports = MongooseArray; diff --git a/node_modules/mongoose/lib/types/array/methods/index.js b/node_modules/mongoose/lib/types/array/methods/index.js new file mode 100644 index 000000000..154f578d6 --- /dev/null +++ b/node_modules/mongoose/lib/types/array/methods/index.js @@ -0,0 +1,960 @@ +'use strict'; + +const Document = require('../../../document'); +const ArraySubdocument = require('../../ArraySubdocument'); +const MongooseError = require('../../../error/mongooseError'); +const ObjectId = require('../../objectid'); +const cleanModifiedSubpaths = require('../../../helpers/document/cleanModifiedSubpaths'); +const get = require('../../../helpers/get'); +const internalToObjectOptions = require('../../../options').internalToObjectOptions; +const utils = require('../../../utils'); + +const arrayAtomicsSymbol = require('../../../helpers/symbols').arrayAtomicsSymbol; +const arrayParentSymbol = require('../../../helpers/symbols').arrayParentSymbol; +const arrayPathSymbol = require('../../../helpers/symbols').arrayPathSymbol; +const arraySchemaSymbol = require('../../../helpers/symbols').arraySchemaSymbol; +const populateModelSymbol = require('../../../helpers/symbols').populateModelSymbol; +const slicedSymbol = Symbol('mongoose#Array#sliced'); + +const _basePush = Array.prototype.push; + +/*! + * ignore + */ + +const methods = { + /** + * Depopulates stored atomic operation values as necessary for direct insertion to MongoDB. + * + * If no atomics exist, we return all array values after conversion. + * + * @return {Array} + * @method $__getAtomics + * @memberOf MongooseArray + * @instance + * @api private + */ + + $__getAtomics() { + const ret = []; + const keys = Object.keys(this[arrayAtomicsSymbol] || {}); + let i = keys.length; + + const opts = Object.assign({}, internalToObjectOptions, { _isNested: true }); + + if (i === 0) { + ret[0] = ['$set', this.toObject(opts)]; + return ret; + } + + while (i--) { + const op = keys[i]; + let val = this[arrayAtomicsSymbol][op]; + + // the atomic values which are arrays are not MongooseArrays. we + // need to convert their elements as if they were MongooseArrays + // to handle populated arrays versus DocumentArrays properly. + if (utils.isMongooseObject(val)) { + val = val.toObject(opts); + } else if (Array.isArray(val)) { + val = this.toObject.call(val, opts); + } else if (val != null && Array.isArray(val.$each)) { + val.$each = this.toObject.call(val.$each, opts); + } else if (val != null && typeof val.valueOf === 'function') { + val = val.valueOf(); + } + + if (op === '$addToSet') { + val = { $each: val }; + } + + ret.push([op, val]); + } + + return ret; + }, + + /*! + * ignore + */ + + $atomics() { + return this[arrayAtomicsSymbol]; + }, + + /*! + * ignore + */ + + $parent() { + return this[arrayParentSymbol]; + }, + + /*! + * ignore + */ + + $path() { + return this[arrayPathSymbol]; + }, + + /** + * Atomically shifts the array at most one time per document `save()`. 
+ * + * ####NOTE: + * + * _Calling this multiple times on an array before saving sends the same command as calling it once._ + * _This update is implemented using the MongoDB [$pop](http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop) method which enforces this restriction._ + * + * doc.array = [1,2,3]; + * + * const shifted = doc.array.$shift(); + * console.log(shifted); // 1 + * console.log(doc.array); // [2,3] + * + * // no affect + * shifted = doc.array.$shift(); + * console.log(doc.array); // [2,3] + * + * doc.save(function (err) { + * if (err) return handleError(err); + * + * // we saved, now $shift works again + * shifted = doc.array.$shift(); + * console.log(shifted ); // 2 + * console.log(doc.array); // [3] + * }) + * + * @api public + * @memberOf MongooseArray + * @instance + * @method $shift + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop + */ + + $shift() { + this._registerAtomic('$pop', -1); + this._markModified(); + + // only allow shifting once + if (this._shifted) { + return; + } + this._shifted = true; + + return [].shift.call(this); + }, + + /** + * Pops the array atomically at most one time per document `save()`. + * + * #### NOTE: + * + * _Calling this mulitple times on an array before saving sends the same command as calling it once._ + * _This update is implemented using the MongoDB [$pop](http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop) method which enforces this restriction._ + * + * doc.array = [1,2,3]; + * + * const popped = doc.array.$pop(); + * console.log(popped); // 3 + * console.log(doc.array); // [1,2] + * + * // no affect + * popped = doc.array.$pop(); + * console.log(doc.array); // [1,2] + * + * doc.save(function (err) { + * if (err) return handleError(err); + * + * // we saved, now $pop works again + * popped = doc.array.$pop(); + * console.log(popped); // 2 + * console.log(doc.array); // [1] + * }) + * + * @api public + * @method $pop + * @memberOf MongooseArray + * @instance + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pop + * @method $pop + * @memberOf MongooseArray + */ + + $pop() { + this._registerAtomic('$pop', 1); + this._markModified(); + + // only allow popping once + if (this._popped) { + return; + } + this._popped = true; + + return [].pop.call(this); + }, + + /*! + * ignore + */ + + $schema() { + return this[arraySchemaSymbol]; + }, + + /** + * Casts a member based on this arrays schema. 
+ * + * @param {any} value + * @return value the casted value + * @method _cast + * @api private + * @memberOf MongooseArray + */ + + _cast(value) { + let populated = false; + let Model; + + const parent = this[arrayParentSymbol]; + if (parent) { + populated = parent.$populated(this[arrayPathSymbol], true); + } + + if (populated && value !== null && value !== undefined) { + // cast to the populated Models schema + Model = populated.options[populateModelSymbol]; + + // only objects are permitted so we can safely assume that + // non-objects are to be interpreted as _id + if (Buffer.isBuffer(value) || + value instanceof ObjectId || !utils.isObject(value)) { + value = { _id: value }; + } + + // gh-2399 + // we should cast model only when it's not a discriminator + const isDisc = value.schema && value.schema.discriminatorMapping && + value.schema.discriminatorMapping.key !== undefined; + if (!isDisc) { + value = new Model(value); + } + return this[arraySchemaSymbol].caster.applySetters(value, parent, true); + } + + return this[arraySchemaSymbol].caster.applySetters(value, parent, false); + }, + + /** + * Internal helper for .map() + * + * @api private + * @return {Number} + * @method _mapCast + * @memberOf MongooseArray + */ + + _mapCast(val, index) { + return this._cast(val, this.length + index); + }, + + /** + * Marks this array as modified. + * + * If it bubbles up from an embedded document change, then it takes the following arguments (otherwise, takes 0 arguments) + * + * @param {ArraySubdocument} subdoc the embedded doc that invoked this method on the Array + * @param {String} embeddedPath the path which changed in the subdoc + * @method _markModified + * @api private + * @memberOf MongooseArray + */ + + _markModified(elem) { + const parent = this[arrayParentSymbol]; + let dirtyPath; + + if (parent) { + dirtyPath = this[arrayPathSymbol]; + + if (arguments.length) { + dirtyPath = dirtyPath + '.' + elem; + } + + if (dirtyPath != null && dirtyPath.endsWith('.$')) { + return this; + } + + parent.markModified(dirtyPath, arguments.length > 0 ? elem : parent); + } + + return this; + }, + + /** + * Register an atomic operation with the parent. + * + * @param {Array} op operation + * @param {any} val + * @method _registerAtomic + * @api private + * @memberOf MongooseArray + */ + + _registerAtomic(op, val) { + if (this[slicedSymbol]) { + return; + } + if (op === '$set') { + // $set takes precedence over all other ops. + // mark entire array modified. + this[arrayAtomicsSymbol] = { $set: val }; + cleanModifiedSubpaths(this[arrayParentSymbol], this[arrayPathSymbol]); + this._markModified(); + return this; + } + + const atomics = this[arrayAtomicsSymbol]; + + // reset pop/shift after save + if (op === '$pop' && !('$pop' in atomics)) { + const _this = this; + this[arrayParentSymbol].once('save', function() { + _this._popped = _this._shifted = null; + }); + } + + // check for impossible $atomic combos (Mongo denies more than one + // $atomic op on a single path + if (atomics.$set || Object.keys(atomics).length && !(op in atomics)) { + // a different op was previously registered. + // save the entire thing. 
+ this[arrayAtomicsSymbol] = { $set: this }; + return this; + } + + let selector; + + if (op === '$pullAll' || op === '$addToSet') { + atomics[op] || (atomics[op] = []); + atomics[op] = atomics[op].concat(val); + } else if (op === '$pullDocs') { + const pullOp = atomics['$pull'] || (atomics['$pull'] = {}); + if (val[0] instanceof ArraySubdocument) { + selector = pullOp['$or'] || (pullOp['$or'] = []); + Array.prototype.push.apply(selector, val.map(function(v) { + return v.toObject({ transform: false, virtuals: false }); + })); + } else { + selector = pullOp['_id'] || (pullOp['_id'] = { $in: [] }); + selector['$in'] = selector['$in'].concat(val); + } + } else if (op === '$push') { + atomics.$push = atomics.$push || { $each: [] }; + if (val != null && utils.hasUserDefinedProperty(val, '$each')) { + atomics.$push = val; + } else { + atomics.$push.$each = atomics.$push.$each.concat(val); + } + } else { + atomics[op] = val; + } + + return this; + }, + + /** + * Adds values to the array if not already present. + * + * ####Example: + * + * console.log(doc.array) // [2,3,4] + * const added = doc.array.addToSet(4,5); + * console.log(doc.array) // [2,3,4,5] + * console.log(added) // [5] + * + * @param {any} [args...] + * @return {Array} the values that were added + * @memberOf MongooseArray + * @api public + * @method addToSet + */ + + addToSet() { + _checkManualPopulation(this, arguments); + + let values = [].map.call(arguments, this._mapCast, this); + values = this[arraySchemaSymbol].applySetters(values, this[arrayParentSymbol]); + const added = []; + let type = ''; + if (values[0] instanceof ArraySubdocument) { + type = 'doc'; + } else if (values[0] instanceof Date) { + type = 'date'; + } + + const rawValues = values.isMongooseArrayProxy ? values.__array : this; + const rawArray = this.isMongooseArrayProxy ? this.__array : this; + + rawValues.forEach(function(v) { + let found; + const val = +v; + switch (type) { + case 'doc': + found = this.some(function(doc) { + return doc.equals(v); + }); + break; + case 'date': + found = this.some(function(d) { + return +d === val; + }); + break; + default: + found = ~this.indexOf(v); + } + + if (!found) { + rawArray.push(v); + this._registerAtomic('$addToSet', v); + this._markModified(); + [].push.call(added, v); + } + }, this); + + return added; + }, + + /** + * Returns the number of pending atomic operations to send to the db for this array. + * + * @api private + * @return {Number} + * @method hasAtomics + * @memberOf MongooseArray + */ + + hasAtomics() { + if (!utils.isPOJO(this[arrayAtomicsSymbol])) { + return 0; + } + + return Object.keys(this[arrayAtomicsSymbol]).length; + }, + + /** + * Return whether or not the `obj` is included in the array. + * + * @param {Object} obj the item to check + * @return {Boolean} + * @api public + * @method includes + * @memberOf MongooseArray + */ + + includes(obj, fromIndex) { + const ret = this.indexOf(obj, fromIndex); + return ret !== -1; + }, + + /** + * Return the index of `obj` or `-1` if not found. + * + * @param {Object} obj the item to look for + * @return {Number} + * @api public + * @method indexOf + * @memberOf MongooseArray + */ + + indexOf(obj, fromIndex) { + if (obj instanceof ObjectId) { + obj = obj.toString(); + } + + fromIndex = fromIndex == null ? 
0 : fromIndex; + const len = this.length; + for (let i = fromIndex; i < len; ++i) { + if (obj == this[i]) { + return i; + } + } + return -1; + }, + + /** + * Helper for console.log + * + * @api public + * @method inspect + * @memberOf MongooseArray + */ + + inspect() { + return JSON.stringify(this); + }, + + /** + * Pushes items to the array non-atomically. + * + * ####NOTE: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @param {any} [args...] + * @api public + * @method nonAtomicPush + * @memberOf MongooseArray + */ + + nonAtomicPush() { + const values = [].map.call(arguments, this._mapCast, this); + const ret = [].push.apply(this, values); + this._registerAtomic('$set', this); + this._markModified(); + return ret; + }, + + /** + * Wraps [`Array#pop`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/pop) with proper change tracking. + * + * ####Note: + * + * _marks the entire array as modified which will pass the entire thing to $set potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @see MongooseArray#$pop #types_array_MongooseArray-%24pop + * @api public + * @method pop + * @memberOf MongooseArray + */ + + pop() { + const ret = [].pop.call(this); + this._registerAtomic('$set', this); + this._markModified(); + return ret; + }, + + /** + * Pulls items from the array atomically. Equality is determined by casting + * the provided value to an embedded document and comparing using + * [the `Document.equals()` function.](./api.html#document_Document-equals) + * + * ####Examples: + * + * doc.array.pull(ObjectId) + * doc.array.pull({ _id: 'someId' }) + * doc.array.pull(36) + * doc.array.pull('tag 1', 'tag 2') + * + * To remove a document from a subdocument array we may pass an object with a matching `_id`. + * + * doc.subdocs.push({ _id: 4815162342 }) + * doc.subdocs.pull({ _id: 4815162342 }) // removed + * + * Or we may passing the _id directly and let mongoose take care of it. + * + * doc.subdocs.push({ _id: 4815162342 }) + * doc.subdocs.pull(4815162342); // works + * + * The first pull call will result in a atomic operation on the database, if pull is called repeatedly without saving the document, a $set operation is used on the complete array instead, overwriting possible changes that happened on the database in the meantime. + * + * @param {any} [args...] + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pull + * @api public + * @method pull + * @memberOf MongooseArray + */ + + pull() { + const values = [].map.call(arguments, this._cast, this); + const cur = this[arrayParentSymbol].get(this[arrayPathSymbol]); + let i = cur.length; + let mem; + + while (i--) { + mem = cur[i]; + if (mem instanceof Document) { + const some = values.some(function(v) { + return mem.equals(v); + }); + if (some) { + [].splice.call(cur, i, 1); + } + } else if (~cur.indexOf.call(values, mem)) { + [].splice.call(cur, i, 1); + } + } + + if (values[0] instanceof ArraySubdocument) { + this._registerAtomic('$pullDocs', values.map(function(v) { + return v.$__getValue('_id') || v; + })); + } else { + this._registerAtomic('$pullAll', values); + } + + this._markModified(); + + // Might have modified child paths and then pulled, like + // `doc.children[1].name = 'test';` followed by + // `doc.children.remove(doc.children[0]);`. 
In this case we fall back + // to a `$set` on the whole array. See #3511 + if (cleanModifiedSubpaths(this[arrayParentSymbol], this[arrayPathSymbol]) > 0) { + this._registerAtomic('$set', this); + } + + return this; + }, + + /** + * Wraps [`Array#push`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/push) with proper change tracking. + * + * ####Example: + * + * const schema = Schema({ nums: [Number] }); + * const Model = mongoose.model('Test', schema); + * + * const doc = await Model.create({ nums: [3, 4] }); + * doc.nums.push(5); // Add 5 to the end of the array + * await doc.save(); + * + * // You can also pass an object with `$each` as the + * // first parameter to use MongoDB's `$position` + * doc.nums.push({ + * $each: [1, 2], + * $position: 0 + * }); + * doc.nums; // [1, 2, 3, 4, 5] + * + * @param {Object} [args...] + * @api public + * @method push + * @memberOf MongooseArray + */ + + push() { + let values = arguments; + let atomic = values; + const isOverwrite = values[0] != null && + utils.hasUserDefinedProperty(values[0], '$each'); + const arr = this.isMongooseArrayProxy ? this.__array : this; + if (isOverwrite) { + atomic = values[0]; + values = values[0].$each; + } + + if (this[arraySchemaSymbol] == null) { + return _basePush.apply(this, values); + } + + _checkManualPopulation(this, values); + + const parent = this[arrayParentSymbol]; + values = [].map.call(values, this._mapCast, this); + values = this[arraySchemaSymbol].applySetters(values, parent, undefined, + undefined, { skipDocumentArrayCast: true }); + let ret; + const atomics = this[arrayAtomicsSymbol]; + + if (isOverwrite) { + atomic.$each = values; + + if (get(atomics, '$push.$each.length', 0) > 0 && + atomics.$push.$position != atomic.$position) { + throw new MongooseError('Cannot call `Array#push()` multiple times ' + + 'with different `$position`'); + } + + if (atomic.$position != null) { + [].splice.apply(arr, [atomic.$position, 0].concat(values)); + ret = this.length; + } else { + ret = [].push.apply(arr, values); + } + } else { + if (get(atomics, '$push.$each.length', 0) > 0 && + atomics.$push.$position != null) { + throw new MongooseError('Cannot call `Array#push()` multiple times ' + + 'with different `$position`'); + } + atomic = values; + ret = [].push.apply(arr, values); + } + + this._registerAtomic('$push', atomic); + this._markModified(); + return ret; + }, + + /** + * Alias of [pull](#mongoosearray_MongooseArray-pull) + * + * @see MongooseArray#pull #types_array_MongooseArray-pull + * @see mongodb http://www.mongodb.org/display/DOCS/Updating/#Updating-%24pull + * @api public + * @memberOf MongooseArray + * @instance + * @method remove + */ + + remove() { + return this.pull.apply(this, arguments); + }, + + /** + * Sets the casted `val` at index `i` and marks the array modified. 
+ * + * ####Example: + * + * // given documents based on the following + * const Doc = mongoose.model('Doc', new Schema({ array: [Number] })); + * + * const doc = new Doc({ array: [2,3,4] }) + * + * console.log(doc.array) // [2,3,4] + * + * doc.array.set(1,"5"); + * console.log(doc.array); // [2,5,4] // properly cast to number + * doc.save() // the change is saved + * + * // VS not using array#set + * doc.array[1] = "5"; + * console.log(doc.array); // [2,"5",4] // no casting + * doc.save() // change is not saved + * + * @return {Array} this + * @api public + * @method set + * @memberOf MongooseArray + */ + + set(i, val, skipModified) { + const arr = this.__array; + if (skipModified) { + arr[i] = val; + return this; + } + const value = methods._cast.call(this, val, i); + arr[i] = value; + methods._markModified.call(this, i); + return this; + }, + + /** + * Wraps [`Array#shift`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/unshift) with proper change tracking. + * + * ####Example: + * + * doc.array = [2,3]; + * const res = doc.array.shift(); + * console.log(res) // 2 + * console.log(doc.array) // [3] + * + * ####Note: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method shift + * @memberOf MongooseArray + */ + + shift() { + const arr = this.isMongooseArrayProxy ? this.__array : this; + const ret = [].shift.call(arr); + this._registerAtomic('$set', this); + this._markModified(); + return ret; + }, + + /** + * Wraps [`Array#sort`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/sort) with proper change tracking. + * + * ####NOTE: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method sort + * @memberOf MongooseArray + * @see https://masteringjs.io/tutorials/fundamentals/array-sort + */ + + sort() { + const arr = this.isMongooseArrayProxy ? this.__array : this; + const ret = [].sort.apply(arr, arguments); + this._registerAtomic('$set', this); + return ret; + }, + + /** + * Wraps [`Array#splice`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/splice) with proper change tracking and casting. + * + * ####Note: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwritting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method splice + * @memberOf MongooseArray + * @see https://masteringjs.io/tutorials/fundamentals/array-splice + */ + + splice() { + let ret; + const arr = this.isMongooseArrayProxy ? this.__array : this; + + _checkManualPopulation(this, Array.prototype.slice.call(arguments, 2)); + + if (arguments.length) { + let vals; + if (this[arraySchemaSymbol] == null) { + vals = arguments; + } else { + vals = []; + for (let i = 0; i < arguments.length; ++i) { + vals[i] = i < 2 ? + arguments[i] : + this._cast(arguments[i], arguments[0] + (i - 2)); + } + } + + ret = [].splice.apply(arr, vals); + this._registerAtomic('$set', this); + } + + return ret; + }, + + /*! + * ignore + */ + + toBSON() { + return this.toObject(internalToObjectOptions); + }, + + /** + * Returns a native js Array. 
+ * + * @param {Object} options + * @return {Array} + * @api public + * @method toObject + * @memberOf MongooseArray + */ + + toObject(options) { + const arr = this.isMongooseArrayProxy ? this.__array : this; + if (options && options.depopulate) { + options = utils.clone(options); + options._isNested = true; + // Ensure return value is a vanilla array, because in Node.js 6+ `map()` + // is smart enough to use the inherited array's constructor. + return [].concat(arr).map(function(doc) { + return doc instanceof Document + ? doc.toObject(options) + : doc; + }); + } + + return [].concat(arr); + }, + + $toObject() { + return this.constructor.prototype.toObject.apply(this, arguments); + }, + /** + * Wraps [`Array#unshift`](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/Array/unshift) with proper change tracking. + * + * ####Note: + * + * _marks the entire array as modified, which if saved, will store it as a `$set` operation, potentially overwriting any changes that happen between when you retrieved the object and when you save it._ + * + * @api public + * @method unshift + * @memberOf MongooseArray + */ + + unshift() { + _checkManualPopulation(this, arguments); + + let values; + if (this[arraySchemaSymbol] == null) { + values = arguments; + } else { + values = [].map.call(arguments, this._cast, this); + values = this[arraySchemaSymbol].applySetters(values, this[arrayParentSymbol]); + } + + const arr = this.isMongooseArrayProxy ? this.__array : this; + [].unshift.apply(arr, values); + this._registerAtomic('$set', this); + this._markModified(); + return this.length; + } +}; + +/*! + * ignore + */ + +function _isAllSubdocs(docs, ref) { + if (!ref) { + return false; + } + + for (const arg of docs) { + if (arg == null) { + return false; + } + const model = arg.constructor; + if (!(arg instanceof Document) || + (model.modelName !== ref && model.baseModelName !== ref)) { + return false; + } + } + + return true; +} + +/*! + * ignore + */ + +function _checkManualPopulation(arr, docs) { + const ref = arr == null ? + null : + get(arr[arraySchemaSymbol], 'caster.options.ref', null); + if (arr.length === 0 && + docs.length > 0) { + if (_isAllSubdocs(docs, ref)) { + arr[arrayParentSymbol].$populated(arr[arrayPathSymbol], [], { + [populateModelSymbol]: docs[0].constructor + }); + } + } +} + +const returnVanillaArrayMethods = [ + 'filter', + 'flat', + 'flatMap', + 'map', + 'slice' +]; +for (const method of returnVanillaArrayMethods) { + if (Array.prototype[method] == null) { + continue; + } + + methods[method] = function() { + const _arr = this.isMongooseArrayProxy ? this.__array : this; + const arr = [].concat(_arr); + + return arr[method].apply(arr, arguments); + }; +} + +module.exports = methods; diff --git a/node_modules/mongoose/lib/types/buffer.js b/node_modules/mongoose/lib/types/buffer.js new file mode 100644 index 000000000..41ba16612 --- /dev/null +++ b/node_modules/mongoose/lib/types/buffer.js @@ -0,0 +1,273 @@ +/*! + * Module dependencies. + */ + +'use strict'; + +const Binary = require('../driver').get().Binary; +const utils = require('../utils'); + +/** + * Mongoose Buffer constructor. + * + * Values always have to be passed to the constructor to initialize. 
+ * + * @param {Buffer} value + * @param {String} encode + * @param {Number} offset + * @api private + * @inherits Buffer + * @see http://bit.ly/f6CnZU + */ + +function MongooseBuffer(value, encode, offset) { + let val = value; + if (value == null) { + val = 0; + } + + let encoding; + let path; + let doc; + + if (Array.isArray(encode)) { + // internal casting + path = encode[0]; + doc = encode[1]; + } else { + encoding = encode; + } + + let buf; + if (typeof val === 'number' || val instanceof Number) { + buf = Buffer.alloc(val); + } else { // string, array or object { type: 'Buffer', data: [...] } + buf = Buffer.from(val, encoding, offset); + } + utils.decorate(buf, MongooseBuffer.mixin); + buf.isMongooseBuffer = true; + + // make sure these internal props don't show up in Object.keys() + buf[MongooseBuffer.pathSymbol] = path; + buf[parentSymbol] = doc; + + buf._subtype = 0; + return buf; +} + +const pathSymbol = Symbol.for('mongoose#Buffer#_path'); +const parentSymbol = Symbol.for('mongoose#Buffer#_parent'); +MongooseBuffer.pathSymbol = pathSymbol; + +/*! + * Inherit from Buffer. + */ + +MongooseBuffer.mixin = { + + /** + * Default subtype for the Binary representing this Buffer + * + * @api private + * @property _subtype + * @receiver MongooseBuffer + */ + + _subtype: undefined, + + /** + * Marks this buffer as modified. + * + * @api private + * @method _markModified + * @receiver MongooseBuffer + */ + + _markModified: function() { + const parent = this[parentSymbol]; + + if (parent) { + parent.markModified(this[MongooseBuffer.pathSymbol]); + } + return this; + }, + + /** + * Writes the buffer. + * + * @api public + * @method write + * @receiver MongooseBuffer + */ + + write: function() { + const written = Buffer.prototype.write.apply(this, arguments); + + if (written > 0) { + this._markModified(); + } + + return written; + }, + + /** + * Copies the buffer. + * + * ####Note: + * + * `Buffer#copy` does not mark `target` as modified so you must copy from a `MongooseBuffer` for it to work as expected. This is a work around since `copy` modifies the target, not this. + * + * @return {Number} The number of bytes copied. + * @param {Buffer} target + * @method copy + * @receiver MongooseBuffer + */ + + copy: function(target) { + const ret = Buffer.prototype.copy.apply(this, arguments); + + if (target && target.isMongooseBuffer) { + target._markModified(); + } + + return ret; + } +}; + +/*! + * Compile other Buffer methods marking this buffer as modified. + */ + +( +// node < 0.5 + ('writeUInt8 writeUInt16 writeUInt32 writeInt8 writeInt16 writeInt32 ' + + 'writeFloat writeDouble fill ' + + 'utf8Write binaryWrite asciiWrite set ' + + + // node >= 0.5 + 'writeUInt16LE writeUInt16BE writeUInt32LE writeUInt32BE ' + + 'writeInt16LE writeInt16BE writeInt32LE writeInt32BE ' + 'writeFloatLE writeFloatBE writeDoubleLE writeDoubleBE') +).split(' ').forEach(function(method) { + if (!Buffer.prototype[method]) { + return; + } + MongooseBuffer.mixin[method] = function() { + const ret = Buffer.prototype[method].apply(this, arguments); + this._markModified(); + return ret; + }; +}); + +/** + * Converts this buffer to its Binary type representation. 
+ * + * ####SubTypes: + * + * const bson = require('bson') + * bson.BSON_BINARY_SUBTYPE_DEFAULT + * bson.BSON_BINARY_SUBTYPE_FUNCTION + * bson.BSON_BINARY_SUBTYPE_BYTE_ARRAY + * bson.BSON_BINARY_SUBTYPE_UUID + * bson.BSON_BINARY_SUBTYPE_MD5 + * bson.BSON_BINARY_SUBTYPE_USER_DEFINED + * + * doc.buffer.toObject(bson.BSON_BINARY_SUBTYPE_USER_DEFINED); + * + * @see http://bsonspec.org/#/specification + * @param {Hex} [subtype] + * @return {Binary} + * @api public + * @method toObject + * @receiver MongooseBuffer + */ + +MongooseBuffer.mixin.toObject = function(options) { + const subtype = typeof options === 'number' + ? options + : (this._subtype || 0); + return new Binary(Buffer.from(this), subtype); +}; + +MongooseBuffer.mixin.$toObject = MongooseBuffer.mixin.toObject; + +/** + * Converts this buffer for storage in MongoDB, including subtype + * + * @return {Binary} + * @api public + * @method toBSON + * @receiver MongooseBuffer + */ + +MongooseBuffer.mixin.toBSON = function() { + return new Binary(this, this._subtype || 0); +}; + +/** + * Determines if this buffer is equals to `other` buffer + * + * @param {Buffer} other + * @return {Boolean} + * @method equals + * @receiver MongooseBuffer + */ + +MongooseBuffer.mixin.equals = function(other) { + if (!Buffer.isBuffer(other)) { + return false; + } + + if (this.length !== other.length) { + return false; + } + + for (let i = 0; i < this.length; ++i) { + if (this[i] !== other[i]) { + return false; + } + } + + return true; +}; + +/** + * Sets the subtype option and marks the buffer modified. + * + * ####SubTypes: + * + * const bson = require('bson') + * bson.BSON_BINARY_SUBTYPE_DEFAULT + * bson.BSON_BINARY_SUBTYPE_FUNCTION + * bson.BSON_BINARY_SUBTYPE_BYTE_ARRAY + * bson.BSON_BINARY_SUBTYPE_UUID + * bson.BSON_BINARY_SUBTYPE_MD5 + * bson.BSON_BINARY_SUBTYPE_USER_DEFINED + * + * doc.buffer.subtype(bson.BSON_BINARY_SUBTYPE_UUID); + * + * @see http://bsonspec.org/#/specification + * @param {Hex} subtype + * @api public + * @method subtype + * @receiver MongooseBuffer + */ + +MongooseBuffer.mixin.subtype = function(subtype) { + if (typeof subtype !== 'number') { + throw new TypeError('Invalid subtype. Expected a number'); + } + + if (this._subtype !== subtype) { + this._markModified(); + } + + this._subtype = subtype; +}; + +/*! + * Module exports. + */ + +MongooseBuffer.Binary = Binary; + +module.exports = MongooseBuffer; diff --git a/node_modules/mongoose/lib/types/decimal128.js b/node_modules/mongoose/lib/types/decimal128.js new file mode 100644 index 000000000..cb0886184 --- /dev/null +++ b/node_modules/mongoose/lib/types/decimal128.js @@ -0,0 +1,13 @@ +/** + * ObjectId type constructor + * + * ####Example + * + * const id = new mongoose.Types.ObjectId; + * + * @constructor ObjectId + */ + +'use strict'; + +module.exports = require('../driver').get().Decimal128; diff --git a/node_modules/mongoose/lib/types/index.js b/node_modules/mongoose/lib/types/index.js new file mode 100644 index 000000000..fbfb89a55 --- /dev/null +++ b/node_modules/mongoose/lib/types/index.js @@ -0,0 +1,20 @@ + +/*! + * Module exports. 
+ */ + +'use strict'; + +exports.Array = require('./array'); +exports.Buffer = require('./buffer'); + +exports.Document = // @deprecate +exports.Embedded = require('./ArraySubdocument'); + +exports.DocumentArray = require('./DocumentArray'); +exports.Decimal128 = require('./decimal128'); +exports.ObjectId = require('./objectid'); + +exports.Map = require('./map'); + +exports.Subdocument = require('./subdocument'); diff --git a/node_modules/mongoose/lib/types/map.js b/node_modules/mongoose/lib/types/map.js new file mode 100644 index 000000000..d6f3fb378 --- /dev/null +++ b/node_modules/mongoose/lib/types/map.js @@ -0,0 +1,244 @@ +'use strict'; + +const Mixed = require('../schema/mixed'); +const ObjectId = require('./objectid'); +const clone = require('../helpers/clone'); +const deepEqual = require('../utils').deepEqual; +const get = require('../helpers/get'); +const getConstructorName = require('../helpers/getConstructorName'); +const handleSpreadDoc = require('../helpers/document/handleSpreadDoc'); +const util = require('util'); +const specialProperties = require('../helpers/specialProperties'); + +const populateModelSymbol = require('../helpers/symbols').populateModelSymbol; + +/*! + * ignore + */ + +class MongooseMap extends Map { + constructor(v, path, doc, schemaType) { + if (getConstructorName(v) === 'Object') { + v = Object.keys(v).reduce((arr, key) => arr.concat([[key, v[key]]]), []); + } + super(v); + this.$__parent = doc != null && doc.$__ != null ? doc : null; + this.$__path = path; + this.$__schemaType = schemaType == null ? new Mixed(path) : schemaType; + + this.$__runDeferred(); + } + + $init(key, value) { + checkValidKey(key); + + super.set(key, value); + + if (value != null && value.$isSingleNested) { + value.$basePath = this.$__path + '.' + key; + } + } + + $__set(key, value) { + super.set(key, value); + } + + get(key, options) { + if (key instanceof ObjectId) { + key = key.toString(); + } + + options = options || {}; + if (options.getters === false) { + return super.get(key); + } + return this.$__schemaType.applyGetters(super.get(key), this.$__parent); + } + + set(key, value) { + if (key instanceof ObjectId) { + key = key.toString(); + } + + checkValidKey(key); + value = handleSpreadDoc(value); + + // Weird, but because you can't assign to `this` before calling `super()` + // you can't get access to `$__schemaType` to cast in the initial call to + // `set()` from the `super()` constructor. + + if (this.$__schemaType == null) { + this.$__deferred = this.$__deferred || []; + this.$__deferred.push({ key: key, value: value }); + return; + } + + const fullPath = this.$__path + '.' + key; + const populated = this.$__parent != null && this.$__parent.$__ ? + this.$__parent.$populated(fullPath) || this.$__parent.$populated(this.$__path) : + null; + const priorVal = this.get(key); + + if (populated != null) { + if (value.$__ == null) { + value = new populated.options[populateModelSymbol](value); + } + value.$__.wasPopulated = true; + } else { + try { + value = this.$__schemaType. + applySetters(value, this.$__parent, false, this.get(key), { path: fullPath }); + } catch (error) { + if (this.$__parent != null && this.$__parent.$__ != null) { + this.$__parent.invalidate(fullPath, error); + return; + } + throw error; + } + } + + super.set(key, value); + + if (value != null && value.$isSingleNested) { + value.$basePath = this.$__path + '.' 
+ key; + } + + const parent = this.$__parent; + if (parent != null && parent.$__ != null && !deepEqual(value, priorVal)) { + parent.markModified(this.$__path + '.' + key); + } + } + + clear() { + super.clear(); + const parent = this.$__parent; + if (parent != null) { + parent.markModified(this.$__path); + } + } + + delete(key) { + if (key instanceof ObjectId) { + key = key.toString(); + } + + this.set(key, undefined); + super.delete(key); + } + + toBSON() { + return new Map(this); + } + + toObject(options) { + if (get(options, 'flattenMaps')) { + const ret = {}; + const keys = this.keys(); + for (const key of keys) { + ret[key] = clone(this.get(key), options); + } + return ret; + } + + return new Map(this); + } + + $toObject() { + return this.constructor.prototype.toObject.apply(this, arguments); + } + + toJSON() { + const ret = {}; + const keys = this.keys(); + for (const key of keys) { + ret[key] = this.get(key); + } + return ret; + } + + inspect() { + return new Map(this); + } + + $__runDeferred() { + if (!this.$__deferred) { + return; + } + + for (const keyValueObject of this.$__deferred) { + this.set(keyValueObject.key, keyValueObject.value); + } + + this.$__deferred = null; + } +} + +if (util.inspect.custom) { + Object.defineProperty(MongooseMap.prototype, util.inspect.custom, { + enumerable: false, + writable: false, + configurable: false, + value: MongooseMap.prototype.inspect + }); +} + +Object.defineProperty(MongooseMap.prototype, '$__set', { + enumerable: false, + writable: true, + configurable: false +}); + +Object.defineProperty(MongooseMap.prototype, '$__parent', { + enumerable: false, + writable: true, + configurable: false +}); + +Object.defineProperty(MongooseMap.prototype, '$__path', { + enumerable: false, + writable: true, + configurable: false +}); + +Object.defineProperty(MongooseMap.prototype, '$__schemaType', { + enumerable: false, + writable: true, + configurable: false +}); + +Object.defineProperty(MongooseMap.prototype, '$isMongooseMap', { + enumerable: false, + writable: false, + configurable: false, + value: true +}); + +Object.defineProperty(MongooseMap.prototype, '$__deferredCalls', { + enumerable: false, + writable: false, + configurable: false, + value: true +}); + +/*! + * Since maps are stored as objects under the hood, keys must be strings + * and can't contain any invalid characters + */ + +function checkValidKey(key) { + const keyType = typeof key; + if (keyType !== 'string') { + throw new TypeError(`Mongoose maps only support string keys, got ${keyType}`); + } + if (key.startsWith('$')) { + throw new Error(`Mongoose maps do not support keys that start with "$", got "${key}"`); + } + if (key.includes('.')) { + throw new Error(`Mongoose maps do not support keys that contain ".", got "${key}"`); + } + if (specialProperties.has(key)) { + throw new Error(`Mongoose maps do not support reserved key name "${key}"`); + } +} + +module.exports = MongooseMap; diff --git a/node_modules/mongoose/lib/types/objectid.js b/node_modules/mongoose/lib/types/objectid.js new file mode 100644 index 000000000..e4237c293 --- /dev/null +++ b/node_modules/mongoose/lib/types/objectid.js @@ -0,0 +1,41 @@ +/** + * ObjectId type constructor + * + * ####Example + * + * const id = new mongoose.Types.ObjectId; + * + * @constructor ObjectId + */ + +'use strict'; + +const ObjectId = require('../driver').get().ObjectId; +const objectIdSymbol = require('../helpers/symbols').objectIdSymbol; + +/*! 
+ * Getter for convenience with populate, see gh-6115 + */ + +Object.defineProperty(ObjectId.prototype, '_id', { + enumerable: false, + configurable: true, + get: function() { + return this; + } +}); + +/*! + * Convenience `valueOf()` to allow comparing ObjectIds using double equals re: gh-7299 + */ + + +if (!ObjectId.prototype.hasOwnProperty('valueOf')) { + ObjectId.prototype.valueOf = function objectIdValueOf() { + return this.toString(); + }; +} + +ObjectId.prototype[objectIdSymbol] = true; + +module.exports = ObjectId; diff --git a/node_modules/mongoose/lib/types/subdocument.js b/node_modules/mongoose/lib/types/subdocument.js new file mode 100644 index 000000000..269335777 --- /dev/null +++ b/node_modules/mongoose/lib/types/subdocument.js @@ -0,0 +1,372 @@ +'use strict'; + +const Document = require('../document'); +const immediate = require('../helpers/immediate'); +const internalToObjectOptions = require('../options').internalToObjectOptions; +const promiseOrCallback = require('../helpers/promiseOrCallback'); +const util = require('util'); +const utils = require('../utils'); + +module.exports = Subdocument; + +/** + * Subdocument constructor. + * + * @inherits Document + * @api private + */ + +function Subdocument(value, fields, parent, skipId, options) { + if (parent != null) { + // If setting a nested path, should copy isNew from parent re: gh-7048 + const parentOptions = { isNew: parent.isNew, defaults: 'defaults' in parent.$__ ? parent.$__.defaults : true }; + options = Object.assign({}, parentOptions, options); + } + if (options != null && options.path != null) { + this.$basePath = options.path; + } + Document.call(this, value, fields, skipId, options); + + delete this.$__.priorDoc; +} + +Subdocument.prototype = Object.create(Document.prototype); + +Object.defineProperty(Subdocument.prototype, '$isSingleNested', { + configurable: false, + writable: false, + value: true +}); + +/*! + * ignore + */ + +Subdocument.prototype.toBSON = function() { + return this.toObject(internalToObjectOptions); +}; + +/** + * Used as a stub for middleware + * + * ####NOTE: + * + * _This is a no-op. Does not actually save the doc to the db._ + * + * @param {Function} [fn] + * @return {Promise} resolved Promise + * @api private + */ + +Subdocument.prototype.save = function(options, fn) { + if (typeof options === 'function') { + fn = options; + options = {}; + } + options = options || {}; + + if (!options.suppressWarning) { + utils.warn('mongoose: calling `save()` on a subdoc does **not** save ' + + 'the document to MongoDB, it only runs save middleware. ' + + 'Use `subdoc.save({ suppressWarning: true })` to hide this warning ' + + 'if you\'re sure this behavior is right for your app.'); + } + + return promiseOrCallback(fn, cb => { + this.$__save(cb); + }); +}; + +/*! + * Given a path relative to this document, return the path relative + * to the top-level document. + */ + +Subdocument.prototype.$__fullPath = function(path) { + if (!this.$__.fullPath) { + this.ownerDocument(); + } + + return path ? + this.$__.fullPath + '.' + path : + this.$__.fullPath; +}; + +/*! + * Given a path relative to this document, return the path relative + * to the top-level document. + */ + +Subdocument.prototype.$__pathRelativeToParent = function(p) { + if (p == null) { + return this.$basePath; + } + return [this.$basePath, p].join('.'); +}; + +/** + * Used as a stub for middleware + * + * ####NOTE: + * + * _This is a no-op. 
Does not actually save the doc to the db._ + * + * @param {Function} [fn] + * @method $__save + * @api private + */ + +Subdocument.prototype.$__save = function(fn) { + return immediate(() => fn(null, this)); +}; + +/*! + * ignore + */ + +Subdocument.prototype.$isValid = function(path) { + const parent = this.$parent(); + const fullPath = this.$__pathRelativeToParent(path); + if (parent != null && fullPath != null) { + return parent.$isValid(fullPath); + } + return Document.prototype.$isValid.call(this, path); +}; + +/*! + * ignore + */ + +Subdocument.prototype.markModified = function(path) { + Document.prototype.markModified.call(this, path); + const parent = this.$parent(); + const fullPath = this.$__pathRelativeToParent(path); + + if (parent == null || fullPath == null) { + return; + } + + const myPath = this.$__pathRelativeToParent().replace(/\.$/, ''); + if (parent.isDirectModified(myPath) || this.isNew) { + return; + } + this.$__parent.markModified(fullPath, this); +}; + +/*! + * ignore + */ + +Subdocument.prototype.isModified = function(paths, modifiedPaths) { + const parent = this.$parent(); + if (parent != null) { + if (Array.isArray(paths) || typeof paths === 'string') { + paths = (Array.isArray(paths) ? paths : paths.split(' ')); + paths = paths.map(p => this.$__pathRelativeToParent(p)).filter(p => p != null); + } else if (!paths) { + paths = this.$__pathRelativeToParent(); + } + + return parent.$isModified(paths, modifiedPaths); + } + + return Document.prototype.isModified.call(this, paths, modifiedPaths); +}; + +/** + * Marks a path as valid, removing existing validation errors. + * + * @param {String} path the field to mark as valid + * @api private + * @method $markValid + * @receiver Subdocument + */ + +Subdocument.prototype.$markValid = function(path) { + Document.prototype.$markValid.call(this, path); + const parent = this.$parent(); + const fullPath = this.$__pathRelativeToParent(path); + if (parent != null && fullPath != null) { + parent.$markValid(fullPath); + } +}; + +/*! + * ignore + */ + +Subdocument.prototype.invalidate = function(path, err, val) { + Document.prototype.invalidate.call(this, path, err, val); + + const parent = this.$parent(); + const fullPath = this.$__pathRelativeToParent(path); + if (parent != null && fullPath != null) { + parent.invalidate(fullPath, err, val); + } else if (err.kind === 'cast' || err.name === 'CastError' || fullPath == null) { + throw err; + } + + return this.ownerDocument().$__.validationError; +}; + +/*! + * ignore + */ + +Subdocument.prototype.$ignore = function(path) { + Document.prototype.$ignore.call(this, path); + const parent = this.$parent(); + const fullPath = this.$__pathRelativeToParent(path); + if (parent != null && fullPath != null) { + parent.$ignore(fullPath); + } +}; + +/** + * Returns the top level document of this sub-document. 
+ * + * @return {Document} + */ + +Subdocument.prototype.ownerDocument = function() { + if (this.$__.ownerDocument) { + return this.$__.ownerDocument; + } + + let parent = this; // eslint-disable-line consistent-this + + const paths = []; + const seenDocs = new Set([parent]); + + while (true) { + if (typeof parent.$__pathRelativeToParent !== 'function') { + break; + } + paths.unshift(parent.$__pathRelativeToParent(void 0, true)); + const _parent = parent.$parent(); + if (_parent == null) { + break; + } + parent = _parent; + if (seenDocs.has(parent)) { + throw new Error('Infinite subdocument loop: subdoc with _id ' + parent._id + ' is a parent of itself'); + } + + seenDocs.add(parent); + } + + this.$__.fullPath = paths.join('.'); + + this.$__.ownerDocument = parent; + return this.$__.ownerDocument; +}; + +/** + * Returns this sub-documents parent document. + * + * @api public + */ + +Subdocument.prototype.parent = function() { + return this.$__parent; +}; + +/** + * Returns this sub-documents parent document. + * + * @api public + */ + +Subdocument.prototype.$parent = Subdocument.prototype.parent; + +/*! + * no-op for hooks + */ + +Subdocument.prototype.$__remove = function(cb) { + if (cb == null) { + return; + } + return cb(null, this); +}; + +Subdocument.prototype.$__removeFromParent = function() { + this.$__parent.set(this.$basePath, null); +}; + +/** + * Null-out this subdoc + * + * @param {Object} [options] + * @param {Function} [callback] optional callback for compatibility with Document.prototype.remove + */ + +Subdocument.prototype.remove = function(options, callback) { + if (typeof options === 'function') { + callback = options; + options = null; + } + registerRemoveListener(this); + + // If removing entire doc, no need to remove subdoc + if (!options || !options.noop) { + this.$__removeFromParent(); + } + + return this.$__remove(callback); +}; + +/*! + * ignore + */ + +Subdocument.prototype.populate = function() { + throw new Error('Mongoose does not support calling populate() on nested ' + + 'docs. Instead of `doc.nested.populate("path")`, use ' + + '`doc.populate("nested.path")`'); +}; + +/** + * Helper for console.log + * + * @api public + */ + +Subdocument.prototype.inspect = function() { + return this.toObject({ + transform: false, + virtuals: false, + flattenDecimals: false + }); +}; + +if (util.inspect.custom) { + /*! + * Avoid Node deprecation warning DEP0079 + */ + + Subdocument.prototype[util.inspect.custom] = Subdocument.prototype.inspect; +} + +/*! + * Registers remove event listeners for triggering + * on subdocuments. + * + * @param {Subdocument} sub + * @api private + */ + +function registerRemoveListener(sub) { + let owner = sub.ownerDocument(); + + function emitRemove() { + owner.$removeListener('save', emitRemove); + owner.$removeListener('remove', emitRemove); + sub.emit('remove', sub); + sub.constructor.emit('remove', sub); + owner = sub = null; + } + + owner.$on('save', emitRemove); + owner.$on('remove', emitRemove); +} diff --git a/node_modules/mongoose/lib/utils.js b/node_modules/mongoose/lib/utils.js new file mode 100644 index 000000000..5d1d85ae0 --- /dev/null +++ b/node_modules/mongoose/lib/utils.js @@ -0,0 +1,961 @@ +'use strict'; + +/*! + * Module dependencies. 
+ */ + +const ms = require('ms'); +const mpath = require('mpath'); +const sliced = require('sliced'); +const Decimal = require('./types/decimal128'); +const ObjectId = require('./types/objectid'); +const PopulateOptions = require('./options/PopulateOptions'); +const clone = require('./helpers/clone'); +const immediate = require('./helpers/immediate'); +const isObject = require('./helpers/isObject'); +const isBsonType = require('./helpers/isBsonType'); +const getFunctionName = require('./helpers/getFunctionName'); +const isMongooseObject = require('./helpers/isMongooseObject'); +const promiseOrCallback = require('./helpers/promiseOrCallback'); +const schemaMerge = require('./helpers/schema/merge'); +const specialProperties = require('./helpers/specialProperties'); +const { trustedSymbol } = require('./helpers/query/trusted'); + +let Document; + +exports.specialProperties = specialProperties; + +/*! + * Produces a collection name from model `name`. By default, just returns + * the model name + * + * @param {String} name a model name + * @param {Function} pluralize function that pluralizes the collection name + * @return {String} a collection name + * @api private + */ + +exports.toCollectionName = function(name, pluralize) { + if (name === 'system.profile') { + return name; + } + if (name === 'system.indexes') { + return name; + } + if (typeof pluralize === 'function') { + return pluralize(name); + } + return name; +}; + +/*! + * Determines if `a` and `b` are deep equal. + * + * Modified from node/lib/assert.js + * + * @param {any} a a value to compare to `b` + * @param {any} b a value to compare to `a` + * @return {Boolean} + * @api private + */ + +exports.deepEqual = function deepEqual(a, b) { + if (a === b) { + return true; + } + + if (typeof a !== 'object' && typeof b !== 'object') { + return a === b; + } + + if (a instanceof Date && b instanceof Date) { + return a.getTime() === b.getTime(); + } + + if ((isBsonType(a, 'ObjectID') && isBsonType(b, 'ObjectID')) || + (isBsonType(a, 'Decimal128') && isBsonType(b, 'Decimal128'))) { + return a.toString() === b.toString(); + } + + if (a instanceof RegExp && b instanceof RegExp) { + return a.source === b.source && + a.ignoreCase === b.ignoreCase && + a.multiline === b.multiline && + a.global === b.global; + } + + if (a == null || b == null) { + return false; + } + + if (a.prototype !== b.prototype) { + return false; + } + + if (a instanceof Map && b instanceof Map) { + return deepEqual(Array.from(a.keys()), Array.from(b.keys())) && + deepEqual(Array.from(a.values()), Array.from(b.values())); + } + + // Handle MongooseNumbers + if (a instanceof Number && b instanceof Number) { + return a.valueOf() === b.valueOf(); + } + + if (Buffer.isBuffer(a)) { + return exports.buffer.areEqual(a, b); + } + + if (Array.isArray(a) && Array.isArray(b)) { + const len = a.length; + if (len !== b.length) { + return false; + } + for (let i = 0; i < len; ++i) { + if (!deepEqual(a[i], b[i])) { + return false; + } + } + return true; + } + + if (a.$__ != null) { + a = a._doc; + } else if (isMongooseObject(a)) { + a = a.toObject(); + } + + if (b.$__ != null) { + b = b._doc; + } else if (isMongooseObject(b)) { + b = b.toObject(); + } + + const ka = Object.keys(a); + const kb = Object.keys(b); + const kaLength = ka.length; + + // having the same number of owned properties (keys incorporates + // hasOwnProperty) + if (kaLength !== kb.length) { + return false; + } + + // ~~~cheap key test + for (let i = kaLength - 1; i >= 0; i--) { + if (ka[i] !== kb[i]) { + return false; + } 
+ } + + // equivalent values for every corresponding key, and + // ~~~possibly expensive deep test + for (const key of ka) { + if (!deepEqual(a[key], b[key])) { + return false; + } + } + + return true; +}; + +/*! + * Get the last element of an array + */ + +exports.last = function(arr) { + if (arr.length > 0) { + return arr[arr.length - 1]; + } + return void 0; +}; + +exports.clone = clone; + +/*! + * ignore + */ + +exports.promiseOrCallback = promiseOrCallback; + +/*! + * ignore + */ + +exports.omit = function omit(obj, keys) { + if (keys == null) { + return Object.assign({}, obj); + } + if (!Array.isArray(keys)) { + keys = [keys]; + } + + const ret = Object.assign({}, obj); + for (const key of keys) { + delete ret[key]; + } + return ret; +}; + + +/*! + * Shallow copies defaults into options. + * + * @param {Object} defaults + * @param {Object} options + * @return {Object} the merged object + * @api private + */ + +exports.options = function(defaults, options) { + const keys = Object.keys(defaults); + let i = keys.length; + let k; + + options = options || {}; + + while (i--) { + k = keys[i]; + if (!(k in options)) { + options[k] = defaults[k]; + } + } + + return options; +}; + +/*! + * Generates a random string + * + * @api private + */ + +exports.random = function() { + return Math.random().toString().substr(3); +}; + +/*! + * Merges `from` into `to` without overwriting existing properties. + * + * @param {Object} to + * @param {Object} from + * @api private + */ + +exports.merge = function merge(to, from, options, path) { + options = options || {}; + + const keys = Object.keys(from); + let i = 0; + const len = keys.length; + let key; + + if (from[trustedSymbol]) { + to[trustedSymbol] = from[trustedSymbol]; + } + + path = path || ''; + const omitNested = options.omitNested || {}; + + while (i < len) { + key = keys[i++]; + if (options.omit && options.omit[key]) { + continue; + } + if (omitNested[path]) { + continue; + } + if (specialProperties.has(key)) { + continue; + } + if (to[key] == null) { + to[key] = from[key]; + } else if (exports.isObject(from[key])) { + if (!exports.isObject(to[key])) { + to[key] = {}; + } + if (from[key] != null) { + // Skip merging schemas if we're creating a discriminator schema and + // base schema has a given path as a single nested but discriminator schema + // has the path as a document array, or vice versa (gh-9534) + if (options.isDiscriminatorSchemaMerge && + (from[key].$isSingleNested && to[key].$isMongooseDocumentArray) || + (from[key].$isMongooseDocumentArray && to[key].$isSingleNested)) { + continue; + } else if (from[key].instanceOfSchema) { + if (to[key].instanceOfSchema) { + schemaMerge(to[key], from[key].clone(), options.isDiscriminatorSchemaMerge); + } else { + to[key] = from[key].clone(); + } + continue; + } else if (from[key] instanceof ObjectId) { + to[key] = new ObjectId(from[key]); + continue; + } + } + merge(to[key], from[key], options, path ? path + '.' + key : key); + } else if (options.overwrite) { + to[key] = from[key]; + } + } +}; + +/*! + * Applies toObject recursively. 
+ * + * @param {Document|Array|Object} obj + * @return {Object} + * @api private + */ + +exports.toObject = function toObject(obj) { + Document || (Document = require('./document')); + let ret; + + if (obj == null) { + return obj; + } + + if (obj instanceof Document) { + return obj.toObject(); + } + + if (Array.isArray(obj)) { + ret = []; + + for (const doc of obj) { + ret.push(toObject(doc)); + } + + return ret; + } + + if (exports.isPOJO(obj)) { + ret = {}; + + if (obj[trustedSymbol]) { + ret[trustedSymbol] = obj[trustedSymbol]; + } + + for (const k of Object.keys(obj)) { + if (specialProperties.has(k)) { + continue; + } + ret[k] = toObject(obj[k]); + } + + return ret; + } + + return obj; +}; + +exports.isObject = isObject; + +/*! + * Determines if `arg` is a plain old JavaScript object (POJO). Specifically, + * `arg` must be an object but not an instance of any special class, like String, + * ObjectId, etc. + * + * `Object.getPrototypeOf()` is part of ES5: https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/getPrototypeOf + * + * @param {Object|Array|String|Function|RegExp|any} arg + * @api private + * @return {Boolean} + */ + +exports.isPOJO = function isPOJO(arg) { + if (arg == null || typeof arg !== 'object') { + return false; + } + const proto = Object.getPrototypeOf(arg); + // Prototype may be null if you used `Object.create(null)` + // Checking `proto`'s constructor is safe because `getPrototypeOf()` + // explicitly crosses the boundary from object data to object metadata + return !proto || proto.constructor.name === 'Object'; +}; + +/*! + * Determines if `arg` is an object that isn't an instance of a built-in value + * class, like Array, Buffer, ObjectId, etc. + */ + +exports.isNonBuiltinObject = function isNonBuiltinObject(val) { + return typeof val === 'object' && + !exports.isNativeObject(val) && + !exports.isMongooseType(val) && + val != null; +}; + +/*! + * Determines if `obj` is a built-in object like an array, date, boolean, + * etc. + */ + +exports.isNativeObject = function(arg) { + return Array.isArray(arg) || + arg instanceof Date || + arg instanceof Boolean || + arg instanceof Number || + arg instanceof String; +}; + +/*! + * Determines if `val` is an object that has no own keys + */ + +exports.isEmptyObject = function(val) { + return val != null && + typeof val === 'object' && + Object.keys(val).length === 0; +}; + +/*! + * Search if `obj` or any POJOs nested underneath `obj` has a property named + * `key` + */ + +exports.hasKey = function hasKey(obj, key) { + const props = Object.keys(obj); + for (const prop of props) { + if (prop === key) { + return true; + } + if (exports.isPOJO(obj[prop]) && exports.hasKey(obj[prop], key)) { + return true; + } + } + return false; +}; + +/*! + * A faster Array.prototype.slice.call(arguments) alternative + * @api private + */ + +exports.args = sliced; + +/*! + * process.nextTick helper. + * + * Wraps `callback` in a try/catch + nextTick. + * + * node-mongodb-native has a habit of state corruption when an error is immediately thrown from within a collection callback. + * + * @param {Function} callback + * @api private + */ + +exports.tick = function tick(callback) { + if (typeof callback !== 'function') { + return; + } + return function() { + try { + callback.apply(this, arguments); + } catch (err) { + // only nextTick on err to get out of + // the event loop and avoid state corruption. + immediate(function() { + throw err; + }); + } + }; +}; + +/*! 
+ * Returns true if `v` is an object that can be serialized as a primitive in + * MongoDB + */ + +exports.isMongooseType = function(v) { + return v instanceof ObjectId || v instanceof Decimal || v instanceof Buffer; +}; + +exports.isMongooseObject = isMongooseObject; + +/*! + * Converts `expires` options of index objects to `expiresAfterSeconds` options for MongoDB. + * + * @param {Object} object + * @api private + */ + +exports.expires = function expires(object) { + if (!(object && object.constructor.name === 'Object')) { + return; + } + if (!('expires' in object)) { + return; + } + + let when; + if (typeof object.expires !== 'string') { + when = object.expires; + } else { + when = Math.round(ms(object.expires) / 1000); + } + object.expireAfterSeconds = when; + delete object.expires; +}; + +/*! + * populate helper + */ + +exports.populate = function populate(path, select, model, match, options, subPopulate, justOne, count) { + // might have passed an object specifying all arguments + let obj = null; + if (arguments.length === 1) { + if (path instanceof PopulateOptions) { + return [path]; + } + + if (Array.isArray(path)) { + const singles = makeSingles(path); + return singles.map(o => exports.populate(o)[0]); + } + + if (exports.isObject(path)) { + obj = Object.assign({}, path); + } else { + obj = { path: path }; + } + } else if (typeof model === 'object') { + obj = { + path: path, + select: select, + match: model, + options: match + }; + } else { + obj = { + path: path, + select: select, + model: model, + match: match, + options: options, + populate: subPopulate, + justOne: justOne, + count: count + }; + } + + if (typeof obj.path !== 'string') { + throw new TypeError('utils.populate: invalid path. Expected string. Got typeof `' + typeof path + '`'); + } + + return _populateObj(obj); + + // The order of select/conditions args is opposite Model.find but + // necessary to keep backward compatibility (select could be + // an array, string, or object literal). + function makeSingles(arr) { + const ret = []; + arr.forEach(function(obj) { + if (/[\s]/.test(obj.path)) { + const paths = obj.path.split(' '); + paths.forEach(function(p) { + const copy = Object.assign({}, obj); + copy.path = p; + ret.push(copy); + }); + } else { + ret.push(obj); + } + }); + + return ret; + } +}; + +function _populateObj(obj) { + if (Array.isArray(obj.populate)) { + const ret = []; + obj.populate.forEach(function(obj) { + if (/[\s]/.test(obj.path)) { + const copy = Object.assign({}, obj); + const paths = copy.path.split(' '); + paths.forEach(function(p) { + copy.path = p; + ret.push(exports.populate(copy)[0]); + }); + } else { + ret.push(exports.populate(obj)[0]); + } + }); + obj.populate = exports.populate(ret); + } else if (obj.populate != null && typeof obj.populate === 'object') { + obj.populate = exports.populate(obj.populate); + } + + const ret = []; + const paths = obj.path.split(' '); + if (obj.options != null) { + obj.options = exports.clone(obj.options); + } + + for (const path of paths) { + ret.push(new PopulateOptions(Object.assign({}, obj, { path: path }))); + } + + return ret; +} + +/*! + * Return the value of `obj` at the given `path`. + * + * @param {String} path + * @param {Object} obj + */ + +exports.getValue = function(path, obj, map) { + return mpath.get(path, obj, '_doc', map); +}; + +/*! + * Sets the value of `obj` at the given `path`. 
+ * + * @param {String} path + * @param {Anything} val + * @param {Object} obj + */ + +exports.setValue = function(path, val, obj, map, _copying) { + mpath.set(path, val, obj, '_doc', map, _copying); +}; + +/*! + * Returns an array of values from object `o`. + * + * @param {Object} o + * @return {Array} + * @private + */ + +exports.object = {}; +exports.object.vals = function vals(o) { + const keys = Object.keys(o); + let i = keys.length; + const ret = []; + + while (i--) { + ret.push(o[keys[i]]); + } + + return ret; +}; + +/*! + * @see exports.options + */ + +exports.object.shallowCopy = exports.options; + +/*! + * Safer helper for hasOwnProperty checks + * + * @param {Object} obj + * @param {String} prop + */ + +const hop = Object.prototype.hasOwnProperty; +exports.object.hasOwnProperty = function(obj, prop) { + return hop.call(obj, prop); +}; + +/*! + * Determine if `val` is null or undefined + * + * @return {Boolean} + */ + +exports.isNullOrUndefined = function(val) { + return val === null || val === undefined; +}; + +/*! + * ignore + */ + +exports.array = {}; + +/*! + * Flattens an array. + * + * [ 1, [ 2, 3, [4] ]] -> [1,2,3,4] + * + * @param {Array} arr + * @param {Function} [filter] If passed, will be invoked with each item in the array. If `filter` returns a falsy value, the item will not be included in the results. + * @return {Array} + * @private + */ + +exports.array.flatten = function flatten(arr, filter, ret) { + ret || (ret = []); + + arr.forEach(function(item) { + if (Array.isArray(item)) { + flatten(item, filter, ret); + } else { + if (!filter || filter(item)) { + ret.push(item); + } + } + }); + + return ret; +}; + +/*! + * ignore + */ + +const _hasOwnProperty = Object.prototype.hasOwnProperty; + +exports.hasUserDefinedProperty = function(obj, key) { + if (obj == null) { + return false; + } + + if (Array.isArray(key)) { + for (const k of key) { + if (exports.hasUserDefinedProperty(obj, k)) { + return true; + } + } + return false; + } + + if (_hasOwnProperty.call(obj, key)) { + return true; + } + if (typeof obj === 'object' && key in obj) { + const v = obj[key]; + return v !== Object.prototype[key] && v !== Array.prototype[key]; + } + + return false; +}; + +/*! + * ignore + */ + +const MAX_ARRAY_INDEX = Math.pow(2, 32) - 1; + +exports.isArrayIndex = function(val) { + if (typeof val === 'number') { + return val >= 0 && val <= MAX_ARRAY_INDEX; + } + if (typeof val === 'string') { + if (!/^\d+$/.test(val)) { + return false; + } + val = +val; + return val >= 0 && val <= MAX_ARRAY_INDEX; + } + + return false; +}; + +/*! + * Removes duplicate values from an array + * + * [1, 2, 3, 3, 5] => [1, 2, 3, 5] + * [ ObjectId("550988ba0c19d57f697dc45e"), ObjectId("550988ba0c19d57f697dc45e") ] + * => [ObjectId("550988ba0c19d57f697dc45e")] + * + * @param {Array} arr + * @return {Array} + * @private + */ + +exports.array.unique = function(arr) { + const primitives = new Set(); + const ids = new Set(); + const ret = []; + + for (const item of arr) { + if (typeof item === 'number' || typeof item === 'string' || item == null) { + if (primitives.has(item)) { + continue; + } + ret.push(item); + primitives.add(item); + } else if (item instanceof ObjectId) { + if (ids.has(item.toString())) { + continue; + } + ret.push(item); + ids.add(item.toString()); + } else { + ret.push(item); + } + } + + return ret; +}; + +/*! + * Determines if two buffers are equal. 
+ * + * @param {Buffer} a + * @param {Object} b + */ + +exports.buffer = {}; +exports.buffer.areEqual = function(a, b) { + if (!Buffer.isBuffer(a)) { + return false; + } + if (!Buffer.isBuffer(b)) { + return false; + } + if (a.length !== b.length) { + return false; + } + for (let i = 0, len = a.length; i < len; ++i) { + if (a[i] !== b[i]) { + return false; + } + } + return true; +}; + +exports.getFunctionName = getFunctionName; +/*! + * Decorate buffers + */ + +exports.decorate = function(destination, source) { + for (const key in source) { + if (specialProperties.has(key)) { + continue; + } + destination[key] = source[key]; + } +}; + +/** + * merges to with a copy of from + * + * @param {Object} to + * @param {Object} fromObj + * @api private + */ + +exports.mergeClone = function(to, fromObj) { + if (isMongooseObject(fromObj)) { + fromObj = fromObj.toObject({ + transform: false, + virtuals: false, + depopulate: true, + getters: false, + flattenDecimals: false + }); + } + const keys = Object.keys(fromObj); + const len = keys.length; + let i = 0; + let key; + + while (i < len) { + key = keys[i++]; + if (specialProperties.has(key)) { + continue; + } + if (typeof to[key] === 'undefined') { + to[key] = exports.clone(fromObj[key], { + transform: false, + virtuals: false, + depopulate: true, + getters: false, + flattenDecimals: false + }); + } else { + let val = fromObj[key]; + if (val != null && val.valueOf && !(val instanceof Date)) { + val = val.valueOf(); + } + if (exports.isObject(val)) { + let obj = val; + if (isMongooseObject(val) && !val.isMongooseBuffer) { + obj = obj.toObject({ + transform: false, + virtuals: false, + depopulate: true, + getters: false, + flattenDecimals: false + }); + } + if (val.isMongooseBuffer) { + obj = Buffer.from(obj); + } + exports.mergeClone(to[key], obj); + } else { + to[key] = exports.clone(val, { + flattenDecimals: false + }); + } + } + } +}; + +/** + * Executes a function on each element of an array (like _.each) + * + * @param {Array} arr + * @param {Function} fn + * @api private + */ + +exports.each = function(arr, fn) { + for (const item of arr) { + fn(item); + } +}; + +/*! + * ignore + */ + +exports.getOption = function(name) { + const sources = Array.prototype.slice.call(arguments, 1); + + for (const source of sources) { + if (source[name] != null) { + return source[name]; + } + } + + return null; +}; + +/*! + * ignore + */ + +exports.noop = function() {}; + +exports.errorToPOJO = function errorToPOJO(error) { + const isError = error instanceof Error; + if (!isError) { + throw new Error('`error` must be `instanceof Error`.'); + } + + const ret = {}; + for (const properyName of Object.getOwnPropertyNames(error)) { + ret[properyName] = error[properyName]; + } + return ret; +}; + +/*! + * ignore + */ + +exports.warn = function warn(message) { + return process.emitWarning(message, { code: 'MONGOOSE' }); +}; \ No newline at end of file diff --git a/node_modules/mongoose/lib/validoptions.js b/node_modules/mongoose/lib/validoptions.js new file mode 100644 index 000000000..c64894de3 --- /dev/null +++ b/node_modules/mongoose/lib/validoptions.js @@ -0,0 +1,32 @@ + +/*! 
+ * Valid mongoose options + */ + +'use strict'; + +const VALID_OPTIONS = Object.freeze([ + 'applyPluginsToChildSchemas', + 'applyPluginsToDiscriminators', + 'autoCreate', + 'autoIndex', + 'bufferCommands', + 'bufferTimeoutMS', + 'cloneSchemas', + 'debug', + 'maxTimeMS', + 'objectIdGetter', + 'overwriteModels', + 'returnOriginal', + 'runValidators', + 'sanitizeFilter', + 'sanitizeProjection', + 'selectPopulatedPaths', + 'setDefaultsOnInsert', + 'strict', + 'strictQuery', + 'toJSON', + 'toObject' +]); + +module.exports = VALID_OPTIONS; diff --git a/node_modules/mongoose/lib/virtualtype.js b/node_modules/mongoose/lib/virtualtype.js new file mode 100644 index 000000000..58a44b59c --- /dev/null +++ b/node_modules/mongoose/lib/virtualtype.js @@ -0,0 +1,176 @@ +'use strict'; + +const utils = require('./utils'); + +/** + * VirtualType constructor + * + * This is what mongoose uses to define virtual attributes via `Schema.prototype.virtual`. + * + * ####Example: + * + * const fullname = schema.virtual('fullname'); + * fullname instanceof mongoose.VirtualType // true + * + * @param {Object} options + * @param {string|function} [options.ref] if `ref` is not nullish, this becomes a [populated virtual](/docs/populate.html#populate-virtuals) + * @param {string|function} [options.localField] the local field to populate on if this is a populated virtual. + * @param {string|function} [options.foreignField] the foreign field to populate on if this is a populated virtual. + * @param {boolean} [options.justOne=false] by default, a populated virtual is an array. If you set `justOne`, the populated virtual will be a single doc or `null`. + * @param {boolean} [options.getters=false] if you set this to `true`, Mongoose will call any custom getters you defined on this virtual + * @param {boolean} [options.count=false] if you set this to `true`, `populate()` will set this virtual to the number of populated documents, as opposed to the documents themselves, using [`Query#countDocuments()`](./api.html#query_Query-countDocuments) + * @param {Object|Function} [options.match=null] add an extra match condition to `populate()` + * @param {Number} [options.limit=null] add a default `limit` to the `populate()` query + * @param {Number} [options.skip=null] add a default `skip` to the `populate()` query + * @param {Number} [options.perDocumentLimit=null] For legacy reasons, `limit` with `populate()` may give incorrect results because it only executes a single query for every document being populated. If you set `perDocumentLimit`, Mongoose will ensure correct `limit` per document by executing a separate query for each document to `populate()`. For example, `.find().populate({ path: 'test', perDocumentLimit: 2 })` will execute 2 additional queries if `.find()` returns 2 documents. + * @param {Object} [options.options=null] Additional options like `limit` and `lean`. + * @api public + */ + +function VirtualType(options, name) { + this.path = name; + this.getters = []; + this.setters = []; + this.options = Object.assign({}, options); +} + +/** + * If no getters/getters, add a default + * + * @param {Function} fn + * @return {VirtualType} this + * @api private + */ + +VirtualType.prototype._applyDefaultGetters = function() { + if (this.getters.length > 0 || this.setters.length > 0) { + return; + } + + const path = this.path; + const internalProperty = '$' + path; + this.getters.push(function() { + return this[internalProperty]; + }); + this.setters.push(function(v) { + this[internalProperty] = v; + }); +}; + +/*! 
+ * ignore + */ + +VirtualType.prototype.clone = function() { + const clone = new VirtualType(this.options, this.path); + clone.getters = [].concat(this.getters); + clone.setters = [].concat(this.setters); + return clone; +}; + +/** + * Adds a custom getter to this virtual. + * + * Mongoose calls the getter function with the below 3 parameters. + * + * - `value`: the value returned by the previous getter. If there is only one getter, `value` will be `undefined`. + * - `virtual`: the virtual object you called `.get()` on + * - `doc`: the document this virtual is attached to. Equivalent to `this`. + * + * ####Example: + * + * const virtual = schema.virtual('fullname'); + * virtual.get(function(value, virtual, doc) { + * return this.name.first + ' ' + this.name.last; + * }); + * + * @param {Function(Any, VirtualType, Document)} fn + * @return {VirtualType} this + * @api public + */ + +VirtualType.prototype.get = function(fn) { + this.getters.push(fn); + return this; +}; + +/** + * Adds a custom setter to this virtual. + * + * Mongoose calls the setter function with the below 3 parameters. + * + * - `value`: the value being set + * - `virtual`: the virtual object you're calling `.set()` on + * - `doc`: the document this virtual is attached to. Equivalent to `this`. + * + * ####Example: + * + * const virtual = schema.virtual('fullname'); + * virtual.set(function(value, virtual, doc) { + * const parts = value.split(' '); + * this.name.first = parts[0]; + * this.name.last = parts[1]; + * }); + * + * const Model = mongoose.model('Test', schema); + * const doc = new Model(); + * // Calls the setter with `value = 'Jean-Luc Picard'` + * doc.fullname = 'Jean-Luc Picard'; + * doc.name.first; // 'Jean-Luc' + * doc.name.last; // 'Picard' + * + * @param {Function(Any, VirtualType, Document)} fn + * @return {VirtualType} this + * @api public + */ + +VirtualType.prototype.set = function(fn) { + this.setters.push(fn); + return this; +}; + +/** + * Applies getters to `value`. + * + * @param {Object} value + * @param {Document} doc The document this virtual is attached to + * @return {any} the value after applying all getters + * @api public + */ + +VirtualType.prototype.applyGetters = function(value, doc) { + if (utils.hasUserDefinedProperty(this.options, ['ref', 'refPath']) && + doc.$$populatedVirtuals && + doc.$$populatedVirtuals.hasOwnProperty(this.path)) { + value = doc.$$populatedVirtuals[this.path]; + } + + let v = value; + for (const getter of this.getters) { + v = getter.call(doc, v, this, doc); + } + return v; +}; + +/** + * Applies setters to `value`. + * + * @param {Object} value + * @param {Document} doc + * @return {any} the value after applying all setters + * @api public + */ + +VirtualType.prototype.applySetters = function(value, doc) { + let v = value; + for (const setter of this.setters) { + v = setter.call(doc, v, this, doc); + } + return v; +}; + +/*! + * exports + */ + +module.exports = VirtualType; diff --git a/node_modules/mongoose/node_modules/ms/index.js b/node_modules/mongoose/node_modules/ms/index.js new file mode 100644 index 000000000..c4498bcc2 --- /dev/null +++ b/node_modules/mongoose/node_modules/ms/index.js @@ -0,0 +1,162 @@ +/** + * Helpers. + */ + +var s = 1000; +var m = s * 60; +var h = m * 60; +var d = h * 24; +var w = d * 7; +var y = d * 365.25; + +/** + * Parse or format the given `val`. 
+ * + * Options: + * + * - `long` verbose formatting [false] + * + * @param {String|Number} val + * @param {Object} [options] + * @throws {Error} throw an error if val is not a non-empty string or a number + * @return {String|Number} + * @api public + */ + +module.exports = function(val, options) { + options = options || {}; + var type = typeof val; + if (type === 'string' && val.length > 0) { + return parse(val); + } else if (type === 'number' && isFinite(val)) { + return options.long ? fmtLong(val) : fmtShort(val); + } + throw new Error( + 'val is not a non-empty string or a valid number. val=' + + JSON.stringify(val) + ); +}; + +/** + * Parse the given `str` and return milliseconds. + * + * @param {String} str + * @return {Number} + * @api private + */ + +function parse(str) { + str = String(str); + if (str.length > 100) { + return; + } + var match = /^(-?(?:\d+)?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|weeks?|w|years?|yrs?|y)?$/i.exec( + str + ); + if (!match) { + return; + } + var n = parseFloat(match[1]); + var type = (match[2] || 'ms').toLowerCase(); + switch (type) { + case 'years': + case 'year': + case 'yrs': + case 'yr': + case 'y': + return n * y; + case 'weeks': + case 'week': + case 'w': + return n * w; + case 'days': + case 'day': + case 'd': + return n * d; + case 'hours': + case 'hour': + case 'hrs': + case 'hr': + case 'h': + return n * h; + case 'minutes': + case 'minute': + case 'mins': + case 'min': + case 'm': + return n * m; + case 'seconds': + case 'second': + case 'secs': + case 'sec': + case 's': + return n * s; + case 'milliseconds': + case 'millisecond': + case 'msecs': + case 'msec': + case 'ms': + return n; + default: + return undefined; + } +} + +/** + * Short format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtShort(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return Math.round(ms / d) + 'd'; + } + if (msAbs >= h) { + return Math.round(ms / h) + 'h'; + } + if (msAbs >= m) { + return Math.round(ms / m) + 'm'; + } + if (msAbs >= s) { + return Math.round(ms / s) + 's'; + } + return ms + 'ms'; +} + +/** + * Long format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtLong(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return plural(ms, msAbs, d, 'day'); + } + if (msAbs >= h) { + return plural(ms, msAbs, h, 'hour'); + } + if (msAbs >= m) { + return plural(ms, msAbs, m, 'minute'); + } + if (msAbs >= s) { + return plural(ms, msAbs, s, 'second'); + } + return ms + ' ms'; +} + +/** + * Pluralization helper. + */ + +function plural(ms, msAbs, n, name) { + var isPlural = msAbs >= n * 1.5; + return Math.round(ms / n) + ' ' + name + (isPlural ? 's' : ''); +} diff --git a/node_modules/mongoose/node_modules/ms/license.md b/node_modules/mongoose/node_modules/ms/license.md new file mode 100644 index 000000000..69b61253a --- /dev/null +++ b/node_modules/mongoose/node_modules/ms/license.md @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Zeit, Inc. 
+ +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/mongoose/node_modules/ms/package.json b/node_modules/mongoose/node_modules/ms/package.json new file mode 100644 index 000000000..ffeb58533 --- /dev/null +++ b/node_modules/mongoose/node_modules/ms/package.json @@ -0,0 +1,69 @@ +{ + "_from": "ms@2.1.2", + "_id": "ms@2.1.2", + "_inBundle": false, + "_integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w==", + "_location": "/mongoose/ms", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "ms@2.1.2", + "name": "ms", + "escapedName": "ms", + "rawSpec": "2.1.2", + "saveSpec": null, + "fetchSpec": "2.1.2" + }, + "_requiredBy": [ + "/mongoose" + ], + "_resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz", + "_shasum": "d09d1f357b443f493382a8eb3ccd183872ae6009", + "_spec": "ms@2.1.2", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "bugs": { + "url": "https://github.com/zeit/ms/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Tiny millisecond conversion utility", + "devDependencies": { + "eslint": "4.12.1", + "expect.js": "0.3.1", + "husky": "0.14.3", + "lint-staged": "5.0.0", + "mocha": "4.0.1" + }, + "eslintConfig": { + "extends": "eslint:recommended", + "env": { + "node": true, + "es6": true + } + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/zeit/ms#readme", + "license": "MIT", + "lint-staged": { + "*.js": [ + "npm run lint", + "prettier --single-quote --write", + "git add" + ] + }, + "main": "./index", + "name": "ms", + "repository": { + "type": "git", + "url": "git+https://github.com/zeit/ms.git" + }, + "scripts": { + "lint": "eslint lib/* bin/*", + "precommit": "lint-staged", + "test": "mocha tests.js" + }, + "version": "2.1.2" +} diff --git a/node_modules/mongoose/node_modules/ms/readme.md b/node_modules/mongoose/node_modules/ms/readme.md new file mode 100644 index 000000000..9a1996b17 --- /dev/null +++ b/node_modules/mongoose/node_modules/ms/readme.md @@ -0,0 +1,60 @@ +# ms + +[![Build Status](https://travis-ci.org/zeit/ms.svg?branch=master)](https://travis-ci.org/zeit/ms) +[![Join the community on Spectrum](https://withspectrum.github.io/badge/badge.svg)](https://spectrum.chat/zeit) + +Use this package to easily convert various time formats to milliseconds. 
+ +## Examples + +```js +ms('2 days') // 172800000 +ms('1d') // 86400000 +ms('10h') // 36000000 +ms('2.5 hrs') // 9000000 +ms('2h') // 7200000 +ms('1m') // 60000 +ms('5s') // 5000 +ms('1y') // 31557600000 +ms('100') // 100 +ms('-3 days') // -259200000 +ms('-1h') // -3600000 +ms('-200') // -200 +``` + +### Convert from Milliseconds + +```js +ms(60000) // "1m" +ms(2 * 60000) // "2m" +ms(-3 * 60000) // "-3m" +ms(ms('10 hours')) // "10h" +``` + +### Time Format Written-Out + +```js +ms(60000, { long: true }) // "1 minute" +ms(2 * 60000, { long: true }) // "2 minutes" +ms(-3 * 60000, { long: true }) // "-3 minutes" +ms(ms('10 hours'), { long: true }) // "10 hours" +``` + +## Features + +- Works both in [Node.js](https://nodejs.org) and in the browser +- If a number is supplied to `ms`, a string with a unit is returned +- If a string that contains the number is supplied, it returns it as a number (e.g.: it returns `100` for `'100'`) +- If you pass a string with a number and a valid unit, the number of equivalent milliseconds is returned + +## Related Packages + +- [ms.macro](https://github.com/knpwrs/ms.macro) - Run `ms` as a macro at build-time. + +## Caught a Bug? + +1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device +2. Link the package to the global module directory: `npm link` +3. Within the module you want to test your local development instance of ms, just link it to the dependencies: `npm link ms`. Instead of the default one from npm, Node.js will now use your clone of ms! + +As always, you can run the tests using: `npm test` diff --git a/node_modules/mongoose/package.json b/node_modules/mongoose/package.json new file mode 100644 index 000000000..7caa90567 --- /dev/null +++ b/node_modules/mongoose/package.json @@ -0,0 +1,279 @@ +{ + "_from": "mongoose", + "_id": "mongoose@6.0.10", + "_inBundle": false, + "_integrity": "sha512-p/wiEDUXoQuyb/xQx8QW/YGN92ZsojJ5E/DDgMCUU0WOGxc5uhcWoZ7ijLu6Ssjq8UkwVSv+jzkYp4Wbr+NqBg==", + "_location": "/mongoose", + "_phantomChildren": {}, + "_requested": { + "type": "tag", + "registry": true, + "raw": "mongoose", + "name": "mongoose", + "escapedName": "mongoose", + "rawSpec": "", + "saveSpec": null, + "fetchSpec": "latest" + }, + "_requiredBy": [ + "#USER", + "/" + ], + "_resolved": "https://registry.npmjs.org/mongoose/-/mongoose-6.0.10.tgz", + "_shasum": "c22ac914afccf648778edfe11954f4aa0425a105", + "_spec": "mongoose", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1", + "author": { + "name": "Guillermo Rauch", + "email": "guillermo@learnboost.com" + }, + "browser": "./dist/browser.umd.js", + "bugs": { + "url": "https://github.com/Automattic/mongoose/issues/new" + }, + "bundleDependencies": false, + "dependencies": { + "bson": "^4.2.2", + "kareem": "2.3.2", + "mongodb": "4.1.2", + "mpath": "0.8.4", + "mquery": "4.0.0", + "ms": "2.1.2", + "regexp-clone": "1.0.0", + "sift": "13.5.2", + "sliced": "1.0.1" + }, + "deprecated": false, + "description": "Mongoose MongoDB ODM", + "devDependencies": { + "@babel/core": "7.10.5", + "@babel/preset-env": "7.10.4", + "@typescript-eslint/eslint-plugin": "4.33.0", + "@typescript-eslint/parser": "4.33.0", + "acquit": "1.x", + "acquit-ignore": "0.1.x", + "acquit-require": "0.1.x", + "babel-loader": "8.1.0", + "benchmark": "2.1.4", + "bluebird": "3.5.5", + "chalk": "2.4.2", + "cheerio": "1.0.0-rc.2", + "dox": "0.3.1", + "eslint": "7.1.0", + 
"eslint-plugin-mocha-no-only": "1.1.0", + "highlight.js": "9.18.2", + "lodash.isequal": "4.5.0", + "lodash.isequalwith": "4.4.0", + "marked": "1.1.1", + "mkdirp": "0.5.5", + "mocha": "5.x", + "moment": "2.x", + "mongoose-long": "0.2.1", + "node-static": "0.7.11", + "object-sizeof": "1.3.0", + "pug": "2.0.3", + "q": "1.5.1", + "rimraf": "2.6.3", + "semver": "5.5.0", + "typescript": "4.1.x", + "uuid": "2.0.3", + "uuid-parse": "1.0.0", + "validator": "10.8.0", + "webpack": "4.44.0" + }, + "directories": { + "lib": "./lib/mongoose" + }, + "engines": { + "node": ">=12.0.0" + }, + "eslintConfig": { + "extends": [ + "eslint:recommended" + ], + "overrides": [ + { + "files": [ + "**/*.{ts,tsx}" + ], + "extends": [ + "plugin:@typescript-eslint/eslint-recommended", + "plugin:@typescript-eslint/recommended" + ], + "plugins": [ + "@typescript-eslint" + ], + "rules": { + "@typescript-eslint/no-explicit-any": "off", + "@typescript-eslint/ban-types": "off", + "@typescript-eslint/no-unused-vars": "off", + "@typescript-eslint/explicit-module-boundary-types": "off" + } + } + ], + "plugins": [ + "mocha-no-only" + ], + "parserOptions": { + "ecmaVersion": 2020 + }, + "env": { + "node": true, + "es6": true + }, + "rules": { + "comma-style": "error", + "indent": [ + "error", + 2, + { + "SwitchCase": 1, + "VariableDeclarator": 2 + } + ], + "keyword-spacing": "error", + "no-whitespace-before-property": "error", + "no-buffer-constructor": "warn", + "no-console": "off", + "no-multi-spaces": "error", + "no-constant-condition": "off", + "func-call-spacing": "error", + "no-trailing-spaces": "error", + "no-undef": "error", + "no-unneeded-ternary": "error", + "no-const-assign": "error", + "no-useless-rename": "error", + "no-dupe-keys": "error", + "space-in-parens": [ + "error", + "never" + ], + "spaced-comment": [ + "error", + "always", + { + "block": { + "markers": [ + "!" 
+ ], + "balanced": true + } + } + ], + "key-spacing": [ + "error", + { + "beforeColon": false, + "afterColon": true + } + ], + "comma-spacing": [ + "error", + { + "before": false, + "after": true + } + ], + "array-bracket-spacing": 1, + "arrow-spacing": [ + "error", + { + "before": true, + "after": true + } + ], + "object-curly-spacing": [ + "error", + "always" + ], + "comma-dangle": [ + "error", + "never" + ], + "no-unreachable": "error", + "quotes": [ + "error", + "single" + ], + "quote-props": [ + "error", + "as-needed" + ], + "semi": "error", + "no-extra-semi": "error", + "semi-spacing": "error", + "no-spaced-func": "error", + "no-throw-literal": "error", + "space-before-blocks": "error", + "space-before-function-paren": [ + "error", + "never" + ], + "space-infix-ops": "error", + "space-unary-ops": "error", + "no-var": "warn", + "prefer-const": "warn", + "strict": [ + "error", + "global" + ], + "no-restricted-globals": [ + "error", + { + "name": "context", + "message": "Don't use Mocha's global context" + } + ], + "no-prototype-builtins": "off", + "mocha-no-only/mocha-no-only": [ + "error" + ] + } + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/mongoose" + }, + "homepage": "https://mongoosejs.com", + "keywords": [ + "mongodb", + "document", + "model", + "schema", + "database", + "odm", + "data", + "datastore", + "query", + "nosql", + "orm", + "db" + ], + "license": "MIT", + "main": "./index.js", + "mocha": { + "extension": [ + "test.js" + ], + "watch-files": [ + "test/**/*.js" + ] + }, + "name": "mongoose", + "repository": { + "type": "git", + "url": "git://github.com/Automattic/mongoose.git" + }, + "scripts": { + "build-browser": "node build-browser.js", + "lint": "eslint .", + "prepublishOnly": "npm run build-browser", + "release": "git pull && git push origin master --tags && npm publish", + "release-legacy": "git pull origin 5.x && git push origin 5.x --tags && npm publish --tag legacy", + "tdd": "mocha ./test/*.test.js ./test/typescript/main.test.js --inspect --watch --recursive --watch-files ./**/*.js", + "test": "mocha --exit ./test/*.test.js ./test/typescript/main.test.js", + "test-cov": "nyc --reporter=html --reporter=text npm test" + }, + "types": "./index.d.ts", + "version": "6.0.10" +} diff --git a/node_modules/mongoose/tools/auth.js b/node_modules/mongoose/tools/auth.js new file mode 100644 index 000000000..2fddafaf7 --- /dev/null +++ b/node_modules/mongoose/tools/auth.js @@ -0,0 +1,31 @@ +'use strict'; + +const Server = require('mongodb-topology-manager').Server; +const mongodb = require('mongodb'); + +run().catch(error => { + console.error(error); + process.exit(-1); +}); + +async function run() { + // Create new instance + const server = new Server('mongod', { + auth: null, + dbpath: '/data/db/27017' + }); + + // Purge the directory + await server.purge(); + + // Start process + await server.start(); + + const db = await mongodb.MongoClient.connect('mongodb://localhost:27017/admin'); + + await db.addUser('passwordIsTaco', 'taco', { + roles: ['dbOwner'] + }); + + console.log('done'); +} \ No newline at end of file diff --git a/node_modules/mongoose/tools/repl.js b/node_modules/mongoose/tools/repl.js new file mode 100644 index 000000000..a2ff64552 --- /dev/null +++ b/node_modules/mongoose/tools/repl.js @@ -0,0 +1,36 @@ +'use strict'; + +run().catch(error => { + console.error(error); + process.exit(-1); +}); + +async function run () { + const ReplSet = require('mongodb-topology-manager').ReplSet; + + // Create new instance + const 
topology = new ReplSet('mongod', [{ + // mongod process options + options: { + bind_ip: 'localhost', port: 31000, dbpath: `/data/db/31000` + } + }, { + // mongod process options + options: { + bind_ip: 'localhost', port: 31001, dbpath: `/data/db/31001` + } + }, { + // Type of node + arbiterOnly: true, + // mongod process options + options: { + bind_ip: 'localhost', port: 31002, dbpath: `/data/db/31002` + } + }], { + replSet: 'rs' + }); + + await topology.start(); + + console.log('done'); +} diff --git a/node_modules/mongoose/tools/sharded.js b/node_modules/mongoose/tools/sharded.js new file mode 100644 index 000000000..cfab884f5 --- /dev/null +++ b/node_modules/mongoose/tools/sharded.js @@ -0,0 +1,46 @@ +'use strict'; + +run().catch(error => { + console.error(error); + process.exit(-1); +}); + + +async function run () { + const Sharded = require('mongodb-topology-manager').Sharded; + + // Create new instance + const topology = new Sharded({ + mongod: 'mongod', + mongos: 'mongos' + }); + + await topology.addShard([{ + options: { + bind_ip: 'localhost', port: 31000, dbpath: `/data/db/31000`, shardsvr: null + } + }], { replSet: 'rs1' }); + + await topology.addConfigurationServers([{ + options: { + bind_ip: 'localhost', port: 35000, dbpath: `/data/db/35000` + } + }], { replSet: 'rs0' }); + + await topology.addProxies([{ + bind_ip: 'localhost', port: 51000, configdb: 'localhost:35000' + }], { + binary: 'mongos' + }); + + console.log('Start...'); + // Start up topology + await topology.start(); + + console.log('Started'); + + // Shard db + await topology.enableSharding('test'); + + console.log('done'); +} \ No newline at end of file diff --git a/node_modules/mpath/.travis.yml b/node_modules/mpath/.travis.yml new file mode 100644 index 000000000..9bdf212d6 --- /dev/null +++ b/node_modules/mpath/.travis.yml @@ -0,0 +1,9 @@ +language: node_js +node_js: + - "4" + - "5" + - "6" + - "7" + - "8" + - "9" + - "10" diff --git a/node_modules/mpath/History.md b/node_modules/mpath/History.md new file mode 100644 index 000000000..58fea9614 --- /dev/null +++ b/node_modules/mpath/History.md @@ -0,0 +1,84 @@ +0.8.4 / 2021-09-01 +================== + * fix: throw error if `parts` contains an element that isn't a string or number #13 + +0.8.3 / 2020-12-30 +================== + * fix: use var instead of let/const for Node.js 4.x support + +0.8.2 / 2020-12-30 +================== + * fix(stringToParts): fall back to legacy treatment for square brackets if square brackets contents aren't a number Automattic/mongoose#9640 + * chore: add eslint + +0.8.1 / 2020-12-10 +================== + * fix(stringToParts): handle empty string and trailing dot the same way that `split()` does for backwards compat + +0.8.0 / 2020-11-14 +================== + * feat: support square bracket indexing for `get()`, `set()`, `has()`, and `unset()` + +0.7.0 / 2020-03-24 +================== + * BREAKING CHANGE: remove `component.json` #9 [AlexeyGrigorievBoost](https://github.com/AlexeyGrigorievBoost) + +0.6.0 / 2019-05-01 +================== + * feat: support setting dotted paths within nested arrays + +0.5.2 / 2019-04-25 +================== + * fix: avoid using subclassed array constructor when doing `map()` + +0.5.1 / 2018-08-30 +================== + * fix: prevent writing to constructor and prototype as well as __proto__ + +0.5.0 / 2018-08-30 +================== + * BREAKING CHANGE: disallow setting/unsetting __proto__ properties + * feat: re-add support for Node < 4 for this release + +0.4.1 / 2018-04-08 +================== + * fix: allow 
opting out of weird `$` set behavior re: Automattic/mongoose#6273 + +0.4.0 / 2018-03-27 +================== + * feat: add support for ES6 maps + * BREAKING CHANGE: drop support for Node < 4 + +0.3.0 / 2017-06-05 +================== + * feat: add has() and unset() functions + +0.2.1 / 2013-03-22 +================== + + * test; added for #5 + * fix typo that breaks set #5 [Contra](https://github.com/Contra) + +0.2.0 / 2013-03-15 +================== + + * added; adapter support for set + * added; adapter support for get + * add basic benchmarks + * add support for using module as a component #2 [Contra](https://github.com/Contra) + +0.1.1 / 2012-12-21 +================== + + * added; map support + +0.1.0 / 2012-12-13 +================== + + * added; set('array.property', val, object) support + * added; get('array.property', object) support + +0.0.1 / 2012-11-03 +================== + + * initial release diff --git a/node_modules/mpath/LICENSE b/node_modules/mpath/LICENSE new file mode 100644 index 000000000..38c529daa --- /dev/null +++ b/node_modules/mpath/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2012 [Aaron Heckmann](aaron.heckmann+github@gmail.com) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/mpath/Makefile b/node_modules/mpath/Makefile new file mode 100644 index 000000000..8d6a79c01 --- /dev/null +++ b/node_modules/mpath/Makefile @@ -0,0 +1,4 @@ +bench: + node bench.js + +.PHONY: test diff --git a/node_modules/mpath/README.md b/node_modules/mpath/README.md new file mode 100644 index 000000000..9831dd066 --- /dev/null +++ b/node_modules/mpath/README.md @@ -0,0 +1,278 @@ +#mpath + +{G,S}et javascript object values using MongoDB-like path notation. + +###Getting + +```js +var mpath = require('mpath'); + +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.get('comments.1.title', obj) // 'exciting!' +``` + +`mpath.get` supports array property notation as well. + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.get('comments.title', obj) // ['funny', 'exciting!'] +``` + +Array property and indexing syntax, when used together, are very powerful. 
+ +```js +var obj = { + array: [ + { o: { array: [{x: {b: [4,6,8]}}, { y: 10} ] }} + , { o: { array: [{x: {b: [1,2,3]}}, { x: {z: 10 }}, { x: 'Turkey Day' }] }} + , { o: { array: [{x: {b: null }}, { x: { b: [null, 1]}}] }} + , { o: { array: [{x: null }] }} + , { o: { array: [{y: 3 }] }} + , { o: { array: [3, 0, null] }} + , { o: { name: 'ha' }} + ] +} + +var found = mpath.get('array.o.array.x.b.1', obj); + +console.log(found); // prints.. + + [ [6, undefined] + , [2, undefined, undefined] + , [null, 1] + , [null] + , [undefined] + , [undefined, undefined, undefined] + , undefined + ] + +``` + +#####Field selection rules: + +The following rules are iteratively applied to each `segment` in the passed `path`. For example: + +```js +var path = 'one.two.14'; // path +'one' // segment 0 +'two' // segment 1 +14 // segment 2 +``` + +- 1) when the value of the segment's parent is not an array, return the value of `parent.segment` +- 2) when the value of the segment's parent is an array + - a) if the segment is an integer, replace the parent array with the value at `parent[segment]` + - b) if not an integer, keep the array but replace each array `item` with the value returned from calling `get(remainingSegments, item)`, or undefined if falsy. + +#####Maps + +`mpath.get` also accepts an optional `map` argument which receives each individual found value. The value returned from the `map` function will be used in the original found value's place. + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.get('comments.title', obj, function (val) { + return 'funny' == val + ? 'amusing' + : val; +}); +// ['amusing', 'exciting!'] +``` + +###Setting + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.set('comments.1.title', 'hilarious', obj) +console.log(obj.comments[1].title) // 'hilarious' +``` + +`mpath.set` supports the same array property notation as `mpath.get`. + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.set('comments.title', ['hilarious', 'fruity'], obj); + +console.log(obj); // prints.. + + { comments: [ + { title: 'hilarious' }, + { title: 'fruity' } + ]} +``` + +Array property and indexing syntax can also be used together when setting. + +```js +var obj = { + array: [ + { o: { array: [{x: {b: [4,6,8]}}, { y: 10} ] }} + , { o: { array: [{x: {b: [1,2,3]}}, { x: {z: 10 }}, { x: 'Turkey Day' }] }} + , { o: { array: [{x: {b: null }}, { x: { b: [null, 1]}}] }} + , { o: { array: [{x: null }] }} + , { o: { array: [{y: 3 }] }} + , { o: { array: [3, 0, null] }} + , { o: { name: 'ha' }} + ] +} + +mpath.set('array.1.o', 'this was changed', obj); + +console.log(require('util').inspect(obj, false, 1000)); // prints.. + +{ + array: [ + { o: { array: [{x: {b: [4,6,8]}}, { y: 10} ] }} + , { o: 'this was changed' } + , { o: { array: [{x: {b: null }}, { x: { b: [null, 1]}}] }} + , { o: { array: [{x: null }] }} + , { o: { array: [{y: 3 }] }} + , { o: { array: [3, 0, null] }} + , { o: { name: 'ha' }} + ] +} + +mpath.set('array.o.array.x', 'this was changed too', obj); + +console.log(require('util').inspect(obj, false, 1000)); // prints..
+ +{ + array: [ + { o: { array: [{x: 'this was changed too'}, { y: 10, x: 'this was changed too'} ] }} + , { o: 'this was changed' } + , { o: { array: [{x: 'this was changed too'}, { x: 'this was changed too'}] }} + , { o: { array: [{x: 'this was changed too'}] }} + , { o: { array: [{x: 'this was changed too', y: 3 }] }} + , { o: { array: [3, 0, null] }} + , { o: { name: 'ha' }} + ] +} +``` + +####Setting arrays + +By default, setting a property within an array to another array results in each element of the new array being set to the item in the destination array at the matching index. An example is helpful. + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.set('comments.title', ['hilarious', 'fruity'], obj); + +console.log(obj); // prints.. + + { comments: [ + { title: 'hilarious' }, + { title: 'fruity' } + ]} +``` + +If we do not desire this destructuring-like assignment behavior, we may instead specify the `$` operator in the path being set to force the array to be copied directly. + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.set('comments.$.title', ['hilarious', 'fruity'], obj); + +console.log(obj); // prints.. + + { comments: [ + { title: ['hilarious', 'fruity'] }, + { title: ['hilarious', 'fruity'] } + ]} +``` + +####Field assignment rules + +The rules mirror those used by `mpath.get`, meaning we can take values returned from `mpath.get`, update them, and reassign them using `mpath.set`. Note that setting nested arrays of arrays can get unwieldy quickly. Check out the [tests](https://github.com/aheckmann/mpath/blob/master/test/index.js) for more extreme examples. + +#####Maps + +`mpath.set` also accepts an optional `map` argument which receives each individual value being set. The value returned from the `map` function will be used in the original value's place. + +```js +var obj = { + comments: [ + { title: 'funny' }, + { title: 'exciting!' } + ] +} + +mpath.set('comments.title', ['hilarious', 'fruity'], obj, function (val) { + return val.length; +}); + +console.log(obj); // prints.. + + { comments: [ + { title: 9 }, + { title: 6 } + ]} +``` + +### Custom object types + +Sometimes you may want to enact the same functionality on custom object types that store all their real data internally, say for an ODM type object. No fear, `mpath` has you covered. Simply pass the name of the property being used to store the internal data and it will be traversed instead: + +```js +var mpath = require('mpath'); + +var obj = { + comments: [ + { title: 'exciting!', _doc: { title: 'great!' }} + ] +} + +mpath.get('comments.0.title', obj, '_doc') // 'great!' +mpath.set('comments.0.title', 'nov 3rd', obj, '_doc') +mpath.get('comments.0.title', obj, '_doc') // 'nov 3rd' +mpath.get('comments.0.title', obj) // 'exciting!' +``` + +When used with a `map`, the `map` argument comes last. + +```js +mpath.get(path, obj, '_doc', map); +mpath.set(path, val, obj, '_doc', map); +``` + +[LICENSE](https://github.com/aheckmann/mpath/blob/master/LICENSE) + diff --git a/node_modules/mpath/SECURITY.md b/node_modules/mpath/SECURITY.md new file mode 100644 index 000000000..1c916a34f --- /dev/null +++ b/node_modules/mpath/SECURITY.md @@ -0,0 +1,5 @@ +# Reporting a Vulnerability + +Please report suspected security vulnerabilities to val [at] karpov [dot] io. +You will receive a response from us within 72 hours. +If the issue is confirmed, we will release a patch as soon as possible depending on complexity.
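The mpath README above documents the `special` (`'_doc'`) lookup and the `map` callback separately. The sketch below combines the two, with `map` passed as the last argument as the README specifies; it is a minimal illustration only, assuming the vendored mpath 0.8.4 API, and the object shape, path strings, and values are invented for this example.

```js
// Sketch only: combines mpath's `special` ('_doc') lookup with a `map`
// callback, per the vendored mpath README above. The document shape and
// values here are made up for illustration.
var mpath = require('mpath');

var post = {
  comments: [
    { title: 'raw title', _doc: { title: 'hydrated title' } },
    { title: 'plain title' }
  ]
};

// Array property notation visits every comment; '_doc' is preferred when it
// exists, and the map callback sees each found value before it is returned.
var titles = mpath.get('comments.title', post, '_doc', function (val) {
  return String(val).toUpperCase();
});
console.log(titles); // [ 'HYDRATED TITLE', 'PLAIN TITLE' ]

// set() takes the same trailing arguments; this writes through '_doc'
// on the first comment only, leaving the plain `title` untouched.
mpath.set('comments.0.title', 'edited', post, '_doc');
console.log(post.comments[0]._doc.title); // 'edited'
console.log(post.comments[0].title);      // 'raw title'
```

Because `'_doc'` is only consulted where it is present, arrays that mix hydrated and plain objects resolve cleanly, which appears to be the hook mongoose (also vendored in this diff) relies on.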
diff --git a/node_modules/mpath/bench.js b/node_modules/mpath/bench.js new file mode 100644 index 000000000..9503cf279 --- /dev/null +++ b/node_modules/mpath/bench.js @@ -0,0 +1,110 @@ +'use strict'; + +const mpath = require('./'); +const Bench = require('benchmark'); +require('child_process').exec('git log --pretty=format:\'%h\' -n 1', function(err, sha) { + if (err) throw err; + + const fs = require('fs'); + const filename = __dirname + '/bench.out'; + const out = fs.createWriteStream(filename, { flags: 'a', encoding: 'utf8' }); + + /** + * test doc creator + */ + + function doc() { + const o = { first: { second: { third: [3, { name: 'aaron' }, 9] } } }; + o.comments = [ + { name: 'one' }, + { name: 'two', _doc: { name: '2' } }, + { name: 'three', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: '3', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } } + ]; + o.name = 'jiro'; + o.array = [ + { o: { array: [{ x: { b: [4, 6, 8] } }, { y: 10 }] } }, + { o: { array: [{ x: { b: [1, 2, 3] } }, { x: { z: 10 } }, { x: { b: 'hi' } }] } }, + { o: { array: [{ x: { b: null } }, { x: { b: [null, 1] } }] } }, + { o: { array: [{ x: null }] } }, + { o: { array: [{ y: 3 }] } }, + { o: { array: [3, 0, null] } }, + { o: { name: 'ha' } } + ]; + o.arr = [ + { arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true } + ]; + return o; + } + + const o = doc(); + + const s = new Bench.Suite; + s.add('mpath.get("first", obj)', function() { + mpath.get('first', o); + }); + s.add('mpath.get("first.second", obj)', function() { + mpath.get('first.second', o); + }); + s.add('mpath.get("first.second.third.1.name", obj)', function() { + mpath.get('first.second.third.1.name', o); + }); + s.add('mpath.get("comments", obj)', function() { + mpath.get('comments', o); + }); + s.add('mpath.get("comments.1", obj)', function() { + mpath.get('comments.1', o); + }); + s.add('mpath.get("comments.2.name", obj)', function() { + mpath.get('comments.2.name', o); + }); + s.add('mpath.get("comments.2.comments.1.comments.0.val", obj)', function() { + mpath.get('comments.2.comments.1.comments.0.val', o); + }); + s.add('mpath.get("comments.name", obj)', function() { + mpath.get('comments.name', o); + }); + + s.add('mpath.set("first", obj, val)', function() { + mpath.set('first', o, 1); + }); + s.add('mpath.set("first.second", obj, val)', function() { + mpath.set('first.second', o, 1); + }); + s.add('mpath.set("first.second.third.1.name", obj, val)', function() { + mpath.set('first.second.third.1.name', o, 1); + }); + s.add('mpath.set("comments", obj, val)', function() { + mpath.set('comments', o, 1); + }); + s.add('mpath.set("comments.1", obj, val)', function() { + mpath.set('comments.1', o, 1); + }); + s.add('mpath.set("comments.2.name", obj, val)', function() { + mpath.set('comments.2.name', o, 1); + }); + s.add('mpath.set("comments.2.comments.1.comments.0.val", obj, val)', function() { + mpath.set('comments.2.comments.1.comments.0.val', o, 1); + }); + s.add('mpath.set("comments.name", obj, val)', function() { + mpath.set('comments.name', o, 1); + }); + + s.on('start', function() { + console.log('starting...'); + out.write('*' + sha + ': ' + String(new Date()) + '\n'); + }); + s.on('cycle', function(e) { + const s = String(e.target); + console.log(s); + out.write(s + '\n'); + }); + s.on('complete', function() { + console.log('done'); + out.end(''); + }); + s.run(); +}); + diff --git a/node_modules/mpath/bench.log b/node_modules/mpath/bench.log new file mode 100644 index 000000000..e69de29bb diff 
--git a/node_modules/mpath/bench.out b/node_modules/mpath/bench.out new file mode 100644 index 000000000..b9b0371d3 --- /dev/null +++ b/node_modules/mpath/bench.out @@ -0,0 +1,52 @@ +*b566c26: Fri Mar 15 2013 14:14:04 GMT-0700 (PDT) +mpath.get("first", obj) x 3,827,405 ops/sec ±0.91% (90 runs sampled) +mpath.get("first.second", obj) x 4,930,222 ops/sec ±1.92% (91 runs sampled) +mpath.get("first.second.third.1.name", obj) x 3,070,837 ops/sec ±1.45% (97 runs sampled) +mpath.get("comments", obj) x 3,649,771 ops/sec ±1.71% (93 runs sampled) +mpath.get("comments.1", obj) x 3,846,728 ops/sec ±0.86% (94 runs sampled) +mpath.get("comments.2.name", obj) x 3,527,680 ops/sec ±0.95% (96 runs sampled) +mpath.get("comments.2.comments.1.comments.0.val", obj) x 2,046,982 ops/sec ±0.80% (96 runs sampled) +mpath.get("comments.name", obj) x 625,546 ops/sec ±2.02% (82 runs sampled) +*e42bdb1: Fri Mar 15 2013 14:19:28 GMT-0700 (PDT) +mpath.get("first", obj) x 3,700,783 ops/sec ±1.30% (95 runs sampled) +mpath.get("first.second", obj) x 4,621,795 ops/sec ±0.86% (95 runs sampled) +mpath.get("first.second.third.1.name", obj) x 3,012,671 ops/sec ±1.21% (100 runs sampled) +mpath.get("comments", obj) x 3,677,694 ops/sec ±0.80% (96 runs sampled) +mpath.get("comments.1", obj) x 3,798,862 ops/sec ±0.81% (91 runs sampled) +mpath.get("comments.2.name", obj) x 3,489,356 ops/sec ±0.66% (98 runs sampled) +mpath.get("comments.2.comments.1.comments.0.val", obj) x 2,004,076 ops/sec ±0.85% (99 runs sampled) +mpath.get("comments.name", obj) x 613,270 ops/sec ±1.33% (83 runs sampled) +*0521aac: Fri Mar 15 2013 16:37:16 GMT-0700 (PDT) +mpath.get("first", obj) x 3,834,755 ops/sec ±0.70% (100 runs sampled) +mpath.get("first.second", obj) x 4,999,965 ops/sec ±1.01% (98 runs sampled) +mpath.get("first.second.third.1.name", obj) x 3,125,953 ops/sec ±0.97% (100 runs sampled) +mpath.get("comments", obj) x 3,759,233 ops/sec ±0.81% (97 runs sampled) +mpath.get("comments.1", obj) x 3,894,893 ops/sec ±0.76% (96 runs sampled) +mpath.get("comments.2.name", obj) x 3,576,929 ops/sec ±0.68% (98 runs sampled) +mpath.get("comments.2.comments.1.comments.0.val", obj) x 2,149,610 ops/sec ±0.67% (97 runs sampled) +mpath.get("comments.name", obj) x 629,259 ops/sec ±1.30% (87 runs sampled) +mpath.set("first", obj, val) x 2,869,477 ops/sec ±0.63% (97 runs sampled) +mpath.set("first.second", obj, val) x 2,418,751 ops/sec ±0.62% (98 runs sampled) +mpath.set("first.second.third.1.name", obj, val) x 2,313,099 ops/sec ±0.69% (94 runs sampled) +mpath.set("comments", obj, val) x 2,680,882 ops/sec ±0.76% (99 runs sampled) +mpath.set("comments.1", obj, val) x 2,401,829 ops/sec ±0.68% (98 runs sampled) +mpath.set("comments.2.name", obj, val) x 2,335,081 ops/sec ±1.07% (96 runs sampled) +mpath.set("comments.2.comments.1.comments.0.val", obj, val) x 2,245,436 ops/sec ±0.76% (92 runs sampled) +mpath.set("comments.name", obj, val) x 2,356,278 ops/sec ±1.15% (100 runs sampled) +*97e85d3: Fri Mar 15 2013 16:39:21 GMT-0700 (PDT) +mpath.get("first", obj) x 3,837,614 ops/sec ±0.74% (99 runs sampled) +mpath.get("first.second", obj) x 4,991,779 ops/sec ±1.01% (94 runs sampled) +mpath.get("first.second.third.1.name", obj) x 3,078,455 ops/sec ±1.17% (96 runs sampled) +mpath.get("comments", obj) x 3,770,961 ops/sec ±0.45% (101 runs sampled) +mpath.get("comments.1", obj) x 3,832,814 ops/sec ±0.67% (92 runs sampled) +mpath.get("comments.2.name", obj) x 3,536,436 ops/sec ±0.49% (100 runs sampled) +mpath.get("comments.2.comments.1.comments.0.val", obj) x 2,141,947 ops/sec ±0.72% (98 
runs sampled) +mpath.get("comments.name", obj) x 667,898 ops/sec ±1.62% (85 runs sampled) +mpath.set("first", obj, val) x 2,642,517 ops/sec ±0.72% (98 runs sampled) +mpath.set("first.second", obj, val) x 2,502,124 ops/sec ±1.28% (99 runs sampled) +mpath.set("first.second.third.1.name", obj, val) x 2,426,804 ops/sec ±0.55% (99 runs sampled) +mpath.set("comments", obj, val) x 2,699,478 ops/sec ±0.85% (98 runs sampled) +mpath.set("comments.1", obj, val) x 2,494,454 ops/sec ±1.05% (96 runs sampled) +mpath.set("comments.2.name", obj, val) x 2,463,894 ops/sec ±0.86% (98 runs sampled) +mpath.set("comments.2.comments.1.comments.0.val", obj, val) x 2,320,398 ops/sec ±0.82% (95 runs sampled) +mpath.set("comments.name", obj, val) x 2,512,408 ops/sec ±0.77% (95 runs sampled) diff --git a/node_modules/mpath/index.js b/node_modules/mpath/index.js new file mode 100644 index 000000000..47c17cdc8 --- /dev/null +++ b/node_modules/mpath/index.js @@ -0,0 +1,3 @@ +'use strict'; + +module.exports = exports = require('./lib'); diff --git a/node_modules/mpath/lib/index.js b/node_modules/mpath/lib/index.js new file mode 100644 index 000000000..c9e7e5f06 --- /dev/null +++ b/node_modules/mpath/lib/index.js @@ -0,0 +1,326 @@ +/* eslint strict:off */ +/* eslint no-var: off */ +/* eslint no-redeclare: off */ + +var stringToParts = require('./stringToParts'); + +// These properties are special and can open client libraries to security +// issues +var ignoreProperties = ['__proto__', 'constructor', 'prototype']; + +/** + * Returns the value of object `o` at the given `path`. + * + * ####Example: + * + * var obj = { + * comments: [ + * { title: 'exciting!', _doc: { title: 'great!' }} + * , { title: 'number dos' } + * ] + * } + * + * mpath.get('comments.0.title', o) // 'exciting!' + * mpath.get('comments.0.title', o, '_doc') // 'great!' + * mpath.get('comments.title', o) // ['exciting!', 'number dos'] + * + * // summary + * mpath.get(path, o) + * mpath.get(path, o, special) + * mpath.get(path, o, map) + * mpath.get(path, o, special, map) + * + * @param {String} path + * @param {Object} o + * @param {String} [special] When this property name is present on any object in the path, walking will continue on the value of this property. + * @param {Function} [map] Optional function which receives each individual found value. The value returned from `map` is used in the original values place. + */ + +exports.get = function(path, o, special, map) { + var lookup; + + if ('function' == typeof special) { + if (special.length < 2) { + map = special; + special = undefined; + } else { + lookup = special; + special = undefined; + } + } + + map || (map = K); + + var parts = 'string' == typeof path + ? stringToParts(path) + : path; + + if (!Array.isArray(parts)) { + throw new TypeError('Invalid `path`. Must be either string or array'); + } + + var obj = o, + part; + + for (var i = 0; i < parts.length; ++i) { + part = parts[i]; + if (typeof parts[i] !== 'string' && typeof parts[i] !== 'number') { + throw new TypeError('Each segment of path to `get()` must be a string or number, got ' + typeof parts[i]); + } + + if (Array.isArray(obj) && !/^\d+$/.test(part)) { + // reading a property from the array items + var paths = parts.slice(i); + + // Need to `concat()` to avoid `map()` calling a constructor of an array + // subclass + return [].concat(obj).map(function(item) { + return item + ? 
exports.get(paths, item, special || lookup, map) + : map(undefined); + }); + } + + if (lookup) { + obj = lookup(obj, part); + } else { + var _from = special && obj[special] ? obj[special] : obj; + obj = _from instanceof Map ? + _from.get(part) : + _from[part]; + } + + if (!obj) return map(obj); + } + + return map(obj); +}; + +/** + * Returns true if `in` returns true for every piece of the path + * + * @param {String} path + * @param {Object} o + */ + +exports.has = function(path, o) { + var parts = typeof path === 'string' ? + stringToParts(path) : + path; + + if (!Array.isArray(parts)) { + throw new TypeError('Invalid `path`. Must be either string or array'); + } + + var len = parts.length; + var cur = o; + for (var i = 0; i < len; ++i) { + if (typeof parts[i] !== 'string' && typeof parts[i] !== 'number') { + throw new TypeError('Each segment of path to `has()` must be a string or number, got ' + typeof parts[i]); + } + if (cur == null || typeof cur !== 'object' || !(parts[i] in cur)) { + return false; + } + cur = cur[parts[i]]; + } + + return true; +}; + +/** + * Deletes the last piece of `path` + * + * @param {String} path + * @param {Object} o + */ + +exports.unset = function(path, o) { + var parts = typeof path === 'string' ? + stringToParts(path) : + path; + + if (!Array.isArray(parts)) { + throw new TypeError('Invalid `path`. Must be either string or array'); + } + + var len = parts.length; + var cur = o; + for (var i = 0; i < len; ++i) { + if (cur == null || typeof cur !== 'object' || !(parts[i] in cur)) { + return false; + } + if (typeof parts[i] !== 'string' && typeof parts[i] !== 'number') { + throw new TypeError('Each segment of path to `unset()` must be a string or number, got ' + typeof parts[i]); + } + // Disallow any updates to __proto__ or special properties. + if (ignoreProperties.indexOf(parts[i]) !== -1) { + return false; + } + if (i === len - 1) { + delete cur[parts[i]]; + return true; + } + cur = cur instanceof Map ? cur.get(parts[i]) : cur[parts[i]]; + } + + return true; +}; + +/** + * Sets the `val` at the given `path` of object `o`. + * + * @param {String} path + * @param {Anything} val + * @param {Object} o + * @param {String} [special] When this property name is present on any object in the path, walking will continue on the value of this property. + * @param {Function} [map] Optional function which is passed each individual value before setting it. The value returned from `map` is used in the original values place. + */ + +exports.set = function(path, val, o, special, map, _copying) { + var lookup; + + if ('function' == typeof special) { + if (special.length < 2) { + map = special; + special = undefined; + } else { + lookup = special; + special = undefined; + } + } + + map || (map = K); + + var parts = 'string' == typeof path + ? stringToParts(path) + : path; + + if (!Array.isArray(parts)) { + throw new TypeError('Invalid `path`. Must be either string or array'); + } + + if (null == o) return; + + for (var i = 0; i < parts.length; ++i) { + if (typeof parts[i] !== 'string' && typeof parts[i] !== 'number') { + throw new TypeError('Each segment of path to `set()` must be a string or number, got ' + typeof parts[i]); + } + // Silently ignore any updates to `__proto__`, these are potentially + // dangerous if using mpath with unsanitized data. 
+ if (ignoreProperties.indexOf(parts[i]) !== -1) { + return; + } + } + + // the existance of $ in a path tells us if the user desires + // the copying of an array instead of setting each value of + // the array to the one by one to matching positions of the + // current array. Unless the user explicitly opted out by passing + // false, see Automattic/mongoose#6273 + var copy = _copying || (/\$/.test(path) && _copying !== false), + obj = o, + part; + + for (var i = 0, len = parts.length - 1; i < len; ++i) { + part = parts[i]; + + if ('$' == part) { + if (i == len - 1) { + break; + } else { + continue; + } + } + + if (Array.isArray(obj) && !/^\d+$/.test(part)) { + var paths = parts.slice(i); + if (!copy && Array.isArray(val)) { + for (var j = 0; j < obj.length && j < val.length; ++j) { + // assignment of single values of array + exports.set(paths, val[j], obj[j], special || lookup, map, copy); + } + } else { + for (var j = 0; j < obj.length; ++j) { + // assignment of entire value + exports.set(paths, val, obj[j], special || lookup, map, copy); + } + } + return; + } + + if (lookup) { + obj = lookup(obj, part); + } else { + var _to = special && obj[special] ? obj[special] : obj; + obj = _to instanceof Map ? + _to.get(part) : + _to[part]; + } + + if (!obj) return; + } + + // process the last property of the path + + part = parts[len]; + + // use the special property if exists + if (special && obj[special]) { + obj = obj[special]; + } + + // set the value on the last branch + if (Array.isArray(obj) && !/^\d+$/.test(part)) { + if (!copy && Array.isArray(val)) { + _setArray(obj, val, part, lookup, special, map); + } else { + for (var j = 0; j < obj.length; ++j) { + var item = obj[j]; + if (item) { + if (lookup) { + lookup(item, part, map(val)); + } else { + if (item[special]) item = item[special]; + item[part] = map(val); + } + } + } + } + } else { + if (lookup) { + lookup(obj, part, map(val)); + } else if (obj instanceof Map) { + obj.set(part, map(val)); + } else { + obj[part] = map(val); + } + } +}; + +/*! + * Recursively set nested arrays + */ + +function _setArray(obj, val, part, lookup, special, map) { + for (var item, j = 0; j < obj.length && j < val.length; ++j) { + item = obj[j]; + if (Array.isArray(item) && Array.isArray(val[j])) { + _setArray(item, val[j], part, lookup, special, map); + } else if (item) { + if (lookup) { + lookup(item, part, map(val[j])); + } else { + if (item[special]) item = item[special]; + item[part] = map(val[j]); + } + } + } +} + +/*! + * Returns the value passed to it. + */ + +function K(v) { + return v; +} \ No newline at end of file diff --git a/node_modules/mpath/lib/stringToParts.js b/node_modules/mpath/lib/stringToParts.js new file mode 100644 index 000000000..f70f33334 --- /dev/null +++ b/node_modules/mpath/lib/stringToParts.js @@ -0,0 +1,48 @@ +'use strict'; + +module.exports = function stringToParts(str) { + const result = []; + + let curPropertyName = ''; + let state = 'DEFAULT'; + for (let i = 0; i < str.length; ++i) { + // Fall back to treating as property name rather than bracket notation if + // square brackets contains something other than a number. 
+ if (state === 'IN_SQUARE_BRACKETS' && !/\d/.test(str[i]) && str[i] !== ']') { + state = 'DEFAULT'; + curPropertyName = result[result.length - 1] + '[' + curPropertyName; + result.splice(result.length - 1, 1); + } + + if (str[i] === '[') { + if (state !== 'IMMEDIATELY_AFTER_SQUARE_BRACKETS') { + result.push(curPropertyName); + curPropertyName = ''; + } + state = 'IN_SQUARE_BRACKETS'; + } else if (str[i] === ']') { + if (state === 'IN_SQUARE_BRACKETS') { + state = 'IMMEDIATELY_AFTER_SQUARE_BRACKETS'; + result.push(curPropertyName); + curPropertyName = ''; + } else { + state = 'DEFAULT'; + curPropertyName += str[i]; + } + } else if (str[i] === '.') { + if (state !== 'IMMEDIATELY_AFTER_SQUARE_BRACKETS') { + result.push(curPropertyName); + curPropertyName = ''; + } + state = 'DEFAULT'; + } else { + curPropertyName += str[i]; + } + } + + if (state !== 'IMMEDIATELY_AFTER_SQUARE_BRACKETS') { + result.push(curPropertyName); + } + + return result; +}; \ No newline at end of file diff --git a/node_modules/mpath/package.json b/node_modules/mpath/package.json new file mode 100644 index 000000000..94faf9459 --- /dev/null +++ b/node_modules/mpath/package.json @@ -0,0 +1,179 @@ +{ + "_from": "mpath@0.8.4", + "_id": "mpath@0.8.4", + "_inBundle": false, + "_integrity": "sha512-DTxNZomBcTWlrMW76jy1wvV37X/cNNxPW1y2Jzd4DZkAaC5ZGsm8bfGfNOthcDuRJujXLqiuS6o3Tpy0JEoh7g==", + "_location": "/mpath", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "mpath@0.8.4", + "name": "mpath", + "escapedName": "mpath", + "rawSpec": "0.8.4", + "saveSpec": null, + "fetchSpec": "0.8.4" + }, + "_requiredBy": [ + "/mongoose" + ], + "_resolved": "https://registry.npmjs.org/mpath/-/mpath-0.8.4.tgz", + "_shasum": "6b566d9581621d9e931dd3b142ed3618e7599313", + "_spec": "mpath@0.8.4", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "Aaron Heckmann", + "email": "aaron.heckmann+github@gmail.com" + }, + "bugs": { + "url": "https://github.com/aheckmann/mpath/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "{G,S}et object values using MongoDB-like path notation", + "devDependencies": { + "benchmark": "~1.0.0", + "eslint": "7.16.0", + "mocha": "5.x" + }, + "engines": { + "node": ">=4.0.0" + }, + "eslintConfig": { + "extends": [ + "eslint:recommended" + ], + "parserOptions": { + "ecmaVersion": 2015 + }, + "env": { + "node": true, + "es6": true + }, + "rules": { + "comma-style": "error", + "indent": [ + "error", + 2, + { + "SwitchCase": 1, + "VariableDeclarator": 2 + } + ], + "keyword-spacing": "error", + "no-whitespace-before-property": "error", + "no-buffer-constructor": "warn", + "no-console": "off", + "no-multi-spaces": "error", + "no-constant-condition": "off", + "func-call-spacing": "error", + "no-trailing-spaces": "error", + "no-undef": "error", + "no-unneeded-ternary": "error", + "no-const-assign": "error", + "no-useless-rename": "error", + "no-dupe-keys": "error", + "space-in-parens": [ + "error", + "never" + ], + "spaced-comment": [ + "error", + "always", + { + "block": { + "markers": [ + "!" 
+ ], + "balanced": true + } + } + ], + "key-spacing": [ + "error", + { + "beforeColon": false, + "afterColon": true + } + ], + "comma-spacing": [ + "error", + { + "before": false, + "after": true + } + ], + "array-bracket-spacing": 1, + "arrow-spacing": [ + "error", + { + "before": true, + "after": true + } + ], + "object-curly-spacing": [ + "error", + "always" + ], + "comma-dangle": [ + "error", + "never" + ], + "no-unreachable": "error", + "quotes": [ + "error", + "single" + ], + "quote-props": [ + "error", + "as-needed" + ], + "semi": "error", + "no-extra-semi": "error", + "semi-spacing": "error", + "no-spaced-func": "error", + "no-throw-literal": "error", + "space-before-blocks": "error", + "space-before-function-paren": [ + "error", + "never" + ], + "space-infix-ops": "error", + "space-unary-ops": "error", + "no-var": "warn", + "prefer-const": "warn", + "strict": [ + "error", + "global" + ], + "no-restricted-globals": [ + "error", + { + "name": "context", + "message": "Don't use Mocha's global context" + } + ], + "no-prototype-builtins": "off" + } + }, + "homepage": "https://github.com/aheckmann/mpath#readme", + "keywords": [ + "mongodb", + "path", + "get", + "set" + ], + "license": "MIT", + "main": "index.js", + "name": "mpath", + "repository": { + "type": "git", + "url": "git://github.com/aheckmann/mpath.git" + }, + "scripts": { + "lint": "eslint .", + "test": "mocha test/*" + }, + "version": "0.8.4" +} diff --git a/node_modules/mpath/test/.eslintrc.yml b/node_modules/mpath/test/.eslintrc.yml new file mode 100644 index 000000000..c0c68038f --- /dev/null +++ b/node_modules/mpath/test/.eslintrc.yml @@ -0,0 +1,4 @@ +env: + mocha: true +rules: + no-unused-vars: off \ No newline at end of file diff --git a/node_modules/mpath/test/index.js b/node_modules/mpath/test/index.js new file mode 100644 index 000000000..ce07b1989 --- /dev/null +++ b/node_modules/mpath/test/index.js @@ -0,0 +1,1879 @@ +'use strict'; + +/** + * Test dependencies. 
+ */ + +const mpath = require('../'); +const assert = require('assert'); + +/** + * logging helper + */ + +function log(o) { + console.log(); + console.log(require('util').inspect(o, false, 1000)); +} + +/** + * special path for override tests + */ + +const special = '_doc'; + +/** + * Tests + */ + +describe('mpath', function() { + + /** + * test doc creator + */ + + function doc() { + const o = { first: { second: { third: [3, { name: 'aaron' }, 9] } } }; + o.comments = [ + { name: 'one' }, + { name: 'two', _doc: { name: '2' } }, + { name: 'three', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: '3', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } } + ]; + o.name = 'jiro'; + o.array = [ + { o: { array: [{ x: { b: [4, 6, 8] } }, { y: 10 }] } }, + { o: { array: [{ x: { b: [1, 2, 3] } }, { x: { z: 10 } }, { x: { b: 'hi' } }] } }, + { o: { array: [{ x: { b: null } }, { x: { b: [null, 1] } }] } }, + { o: { array: [{ x: null }] } }, + { o: { array: [{ y: 3 }] } }, + { o: { array: [3, 0, null] } }, + { o: { name: 'ha' } } + ]; + o.arr = [ + { arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true } + ]; + return o; + } + + describe('get', function() { + const o = doc(); + + it('`path` must be a string or array', function(done) { + assert.throws(function() { + mpath.get({}, o); + }, /Must be either string or array/); + assert.throws(function() { + mpath.get(4, o); + }, /Must be either string or array/); + assert.throws(function() { + mpath.get(function() {}, o); + }, /Must be either string or array/); + assert.throws(function() { + mpath.get(/asdf/, o); + }, /Must be either string or array/); + assert.throws(function() { + mpath.get(Math, o); + }, /Must be either string or array/); + assert.throws(function() { + mpath.get(Buffer, o); + }, /Must be either string or array/); + assert.doesNotThrow(function() { + mpath.get('string', o); + }); + assert.doesNotThrow(function() { + mpath.get([], o); + }); + done(); + }); + + describe('without `special`', function() { + it('works', function(done) { + assert.equal('jiro', mpath.get('name', o)); + + assert.deepEqual( + { second: { third: [3, { name: 'aaron' }, 9] } } + , mpath.get('first', o) + ); + + assert.deepEqual( + { third: [3, { name: 'aaron' }, 9] } + , mpath.get('first.second', o) + ); + + assert.deepEqual( + [3, { name: 'aaron' }, 9] + , mpath.get('first.second.third', o) + ); + + assert.deepEqual( + 3 + , mpath.get('first.second.third.0', o) + ); + + assert.deepEqual( + 9 + , mpath.get('first.second.third.2', o) + ); + + assert.deepEqual( + { name: 'aaron' } + , mpath.get('first.second.third.1', o) + ); + + assert.deepEqual( + 'aaron' + , mpath.get('first.second.third.1.name', o) + ); + + assert.deepEqual([ + { name: 'one' }, + { name: 'two', _doc: { name: '2' } }, + { name: 'three', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: '3', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } }], + mpath.get('comments', o)); + + assert.deepEqual({ name: 'one' }, mpath.get('comments.0', o)); + assert.deepEqual('one', mpath.get('comments.0.name', o)); + assert.deepEqual('two', mpath.get('comments.1.name', o)); + assert.deepEqual('three', mpath.get('comments.2.name', o)); + + assert.deepEqual([{}, { comments: [{ val: 'twoo' }] }] + , mpath.get('comments.2.comments', o)); + + assert.deepEqual({ comments: [{ val: 'twoo' }] } + , mpath.get('comments.2.comments.1', o)); + + assert.deepEqual('twoo', mpath.get('comments.2.comments.1.comments.0.val', o)); + + done(); + }); + + it('handles 
array.property dot-notation', function(done) { + assert.deepEqual( + ['one', 'two', 'three'] + , mpath.get('comments.name', o) + ); + done(); + }); + + it('handles array.array notation', function(done) { + assert.deepEqual( + [undefined, undefined, [{}, { comments: [{ val: 'twoo' }] }]] + , mpath.get('comments.comments', o) + ); + done(); + }); + + it('handles prop.prop.prop.arrayProperty notation', function(done) { + assert.deepEqual( + [undefined, 'aaron', undefined] + , mpath.get('first.second.third.name', o) + ); + assert.deepEqual( + [1, 'aaron', 1] + , mpath.get('first.second.third.name', o, function(v) { + return undefined === v ? 1 : v; + }) + ); + done(); + }); + + it('handles array.prop.array', function(done) { + assert.deepEqual( + [[{ x: { b: [4, 6, 8] } }, { y: 10 }], + [{ x: { b: [1, 2, 3] } }, { x: { z: 10 } }, { x: { b: 'hi' } }], + [{ x: { b: null } }, { x: { b: [null, 1] } }], + [{ x: null }], + [{ y: 3 }], + [3, 0, null], + undefined + ] + , mpath.get('array.o.array', o) + ); + done(); + }); + + it('handles array.prop.array.index', function(done) { + assert.deepEqual( + [{ x: { b: [4, 6, 8] } }, + { x: { b: [1, 2, 3] } }, + { x: { b: null } }, + { x: null }, + { y: 3 }, + 3, + undefined + ] + , mpath.get('array.o.array.0', o) + ); + done(); + }); + + it('handles array.prop.array.index.prop', function(done) { + assert.deepEqual( + [{ b: [4, 6, 8] }, + { b: [1, 2, 3] }, + { b: null }, + null, + undefined, + undefined, + undefined + ] + , mpath.get('array.o.array.0.x', o) + ); + done(); + }); + + it('handles array.prop.array.prop', function(done) { + assert.deepEqual( + [[undefined, 10], + [undefined, undefined, undefined], + [undefined, undefined], + [undefined], + [3], + [undefined, undefined, undefined], + undefined + ] + , mpath.get('array.o.array.y', o) + ); + assert.deepEqual( + [[{ b: [4, 6, 8] }, undefined], + [{ b: [1, 2, 3] }, { z: 10 }, { b: 'hi' }], + [{ b: null }, { b: [null, 1] }], + [null], + [undefined], + [undefined, undefined, undefined], + undefined + ] + , mpath.get('array.o.array.x', o) + ); + done(); + }); + + it('handles array.prop.array.prop.prop', function(done) { + assert.deepEqual( + [[[4, 6, 8], undefined], + [[1, 2, 3], undefined, 'hi'], + [null, [null, 1]], + [null], + [undefined], + [undefined, undefined, undefined], + undefined + ] + , mpath.get('array.o.array.x.b', o) + ); + done(); + }); + + it('handles array.prop.array.prop.prop.index', function(done) { + assert.deepEqual( + [[6, undefined], + [2, undefined, 'i'], // undocumented feature (string indexing) + [null, 1], + [null], + [undefined], + [undefined, undefined, undefined], + undefined + ] + , mpath.get('array.o.array.x.b.1', o) + ); + assert.deepEqual( + [[6, 0], + [2, 0, 'i'], // undocumented feature (string indexing) + [null, 1], + [null], + [0], + [0, 0, 0], + 0 + ] + , mpath.get('array.o.array.x.b.1', o, function(v) { + return undefined === v ? 
0 : v; + }) + ); + done(); + }); + + it('handles array.index.prop.prop', function(done) { + assert.deepEqual( + [{ x: { b: [1, 2, 3] } }, { x: { z: 10 } }, { x: { b: 'hi' } }] + , mpath.get('array.1.o.array', o) + ); + assert.deepEqual( + ['hi', 'hi', 'hi'] + , mpath.get('array.1.o.array', o, function(v) { + if (Array.isArray(v)) { + return v.map(function(val) { + return 'hi'; + }); + } + return v; + }) + ); + done(); + }); + + it('handles array.array.index', function(done) { + assert.deepEqual( + [{ a: { c: 48 } }, undefined] + , mpath.get('arr.arr.1', o) + ); + assert.deepEqual( + ['woot', undefined] + , mpath.get('arr.arr.1', o, function(v) { + if (v && v.a && v.a.c) return 'woot'; + return v; + }) + ); + done(); + }); + + it('handles array.array.index.prop', function(done) { + assert.deepEqual( + [{ c: 48 }, 'woot'] + , mpath.get('arr.arr.1.a', o, function(v) { + if (undefined === v) return 'woot'; + return v; + }) + ); + assert.deepEqual( + [{ c: 48 }, undefined] + , mpath.get('arr.arr.1.a', o) + ); + mpath.set('arr.arr.1.a', [{ c: 49 }, undefined], o); + assert.deepEqual( + [{ c: 49 }, undefined] + , mpath.get('arr.arr.1.a', o) + ); + mpath.set('arr.arr.1.a', [{ c: 48 }, undefined], o); + done(); + }); + + it('handles array.array.index.prop.prop', function(done) { + assert.deepEqual( + [48, undefined] + , mpath.get('arr.arr.1.a.c', o) + ); + assert.deepEqual( + [48, 'woot'] + , mpath.get('arr.arr.1.a.c', o, function(v) { + if (undefined === v) return 'woot'; + return v; + }) + ); + done(); + }); + + }); + + describe('with `special`', function() { + describe('that is a string', function() { + it('works', function(done) { + assert.equal('jiro', mpath.get('name', o, special)); + + assert.deepEqual( + { second: { third: [3, { name: 'aaron' }, 9] } } + , mpath.get('first', o, special) + ); + + assert.deepEqual( + { third: [3, { name: 'aaron' }, 9] } + , mpath.get('first.second', o, special) + ); + + assert.deepEqual( + [3, { name: 'aaron' }, 9] + , mpath.get('first.second.third', o, special) + ); + + assert.deepEqual( + 3 + , mpath.get('first.second.third.0', o, special) + ); + + assert.deepEqual( + 4 + , mpath.get('first.second.third.0', o, special, function(v) { + return 3 === v ? 4 : v; + }) + ); + + assert.deepEqual( + 9 + , mpath.get('first.second.third.2', o, special) + ); + + assert.deepEqual( + { name: 'aaron' } + , mpath.get('first.second.third.1', o, special) + ); + + assert.deepEqual( + 'aaron' + , mpath.get('first.second.third.1.name', o, special) + ); + + assert.deepEqual([ + { name: 'one' }, + { name: 'two', _doc: { name: '2' } }, + { name: 'three', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: '3', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } }], + mpath.get('comments', o, special)); + + assert.deepEqual({ name: 'one' }, mpath.get('comments.0', o, special)); + assert.deepEqual('one', mpath.get('comments.0.name', o, special)); + assert.deepEqual('2', mpath.get('comments.1.name', o, special)); + assert.deepEqual('3', mpath.get('comments.2.name', o, special)); + assert.deepEqual('nice', mpath.get('comments.2.name', o, special, function(v) { + return '3' === v ? 
'nice' : v; + })); + + assert.deepEqual([{}, { _doc: { comments: [{ val: 2 }] } }] + , mpath.get('comments.2.comments', o, special)); + + assert.deepEqual({ _doc: { comments: [{ val: 2 }] } } + , mpath.get('comments.2.comments.1', o, special)); + + assert.deepEqual(2, mpath.get('comments.2.comments.1.comments.0.val', o, special)); + done(); + }); + + it('handles array.property dot-notation', function(done) { + assert.deepEqual( + ['one', '2', '3'] + , mpath.get('comments.name', o, special) + ); + assert.deepEqual( + ['one', 2, '3'] + , mpath.get('comments.name', o, special, function(v) { + return '2' === v ? 2 : v; + }) + ); + done(); + }); + + it('handles array.array notation', function(done) { + assert.deepEqual( + [undefined, undefined, [{}, { _doc: { comments: [{ val: 2 }] } }]] + , mpath.get('comments.comments', o, special) + ); + done(); + }); + + it('handles array.array.index.array', function(done) { + assert.deepEqual( + [undefined, undefined, [{ val: 2 }]] + , mpath.get('comments.comments.1.comments', o, special) + ); + done(); + }); + + it('handles array.array.index.array.prop', function(done) { + assert.deepEqual( + [undefined, undefined, [2]] + , mpath.get('comments.comments.1.comments.val', o, special) + ); + assert.deepEqual( + ['nil', 'nil', [2]] + , mpath.get('comments.comments.1.comments.val', o, special, function(v) { + return undefined === v ? 'nil' : v; + }) + ); + done(); + }); + }); + + describe('that is a function', function() { + const special = function(obj, key) { + return obj[key]; + }; + + it('works', function(done) { + assert.equal('jiro', mpath.get('name', o, special)); + + assert.deepEqual( + { second: { third: [3, { name: 'aaron' }, 9] } } + , mpath.get('first', o, special) + ); + + assert.deepEqual( + { third: [3, { name: 'aaron' }, 9] } + , mpath.get('first.second', o, special) + ); + + assert.deepEqual( + [3, { name: 'aaron' }, 9] + , mpath.get('first.second.third', o, special) + ); + + assert.deepEqual( + 3 + , mpath.get('first.second.third.0', o, special) + ); + + assert.deepEqual( + 4 + , mpath.get('first.second.third.0', o, special, function(v) { + return 3 === v ? 4 : v; + }) + ); + + assert.deepEqual( + 9 + , mpath.get('first.second.third.2', o, special) + ); + + assert.deepEqual( + { name: 'aaron' } + , mpath.get('first.second.third.1', o, special) + ); + + assert.deepEqual( + 'aaron' + , mpath.get('first.second.third.1.name', o, special) + ); + + assert.deepEqual([ + { name: 'one' }, + { name: 'two', _doc: { name: '2' } }, + { name: 'three', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: '3', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } }], + mpath.get('comments', o, special)); + + assert.deepEqual({ name: 'one' }, mpath.get('comments.0', o, special)); + assert.deepEqual('one', mpath.get('comments.0.name', o, special)); + assert.deepEqual('two', mpath.get('comments.1.name', o, special)); + assert.deepEqual('three', mpath.get('comments.2.name', o, special)); + assert.deepEqual('nice', mpath.get('comments.2.name', o, special, function(v) { + return 'three' === v ? 
'nice' : v; + })); + + assert.deepEqual([{}, { comments: [{ val: 'twoo' }] }] + , mpath.get('comments.2.comments', o, special)); + + assert.deepEqual({ comments: [{ val: 'twoo' }] } + , mpath.get('comments.2.comments.1', o, special)); + + assert.deepEqual('twoo', mpath.get('comments.2.comments.1.comments.0.val', o, special)); + + let overide = false; + assert.deepEqual('twoo', mpath.get('comments.8.comments.1.comments.0.val', o, function(obj, path) { + if (Array.isArray(obj) && 8 == path) { + overide = true; + return obj[2]; + } + return obj[path]; + })); + assert.ok(overide); + + done(); + }); + + it('in combination with map', function(done) { + const special = function(obj, key) { + if (Array.isArray(obj)) return obj[key]; + return obj.mpath; + }; + const map = function(val) { + return 'convert' == val + ? 'mpath' + : val; + }; + const o = { mpath: [{ mpath: 'converse' }, { mpath: 'convert' }] }; + + assert.equal('mpath', mpath.get('something.1.kewl', o, special, map)); + done(); + }); + }); + }); + }); + + describe('set', function() { + it('prevents writing to __proto__', function() { + const obj = {}; + mpath.set('__proto__.x', 'foobar', obj); + assert.ok(!({}.x)); + + mpath.set('constructor.prototype.x', 'foobar', obj); + assert.ok(!({}.x)); + }); + + describe('without `special`', function() { + const o = doc(); + + it('works', function(done) { + mpath.set('name', 'a new val', o, function(v) { + return 'a new val' === v ? 'changed' : v; + }); + assert.deepEqual('changed', o.name); + + mpath.set('name', 'changed', o); + assert.deepEqual('changed', o.name); + + mpath.set('first.second.third', [1, { name: 'x' }, 9], o); + assert.deepEqual([1, { name: 'x' }, 9], o.first.second.third); + + mpath.set('first.second.third.1.name', 'y', o); + assert.deepEqual([1, { name: 'y' }, 9], o.first.second.third); + + mpath.set('comments.1.name', 'ttwwoo', o); + assert.deepEqual({ name: 'ttwwoo', _doc: { name: '2' } }, o.comments[1]); + + mpath.set('comments.2.comments.1.comments.0.expand', 'added', o); + assert.deepEqual( + { val: 'twoo', expand: 'added' } + , o.comments[2].comments[1].comments[0]); + + mpath.set('comments.2.comments.1.comments.2', 'added', o); + assert.equal(3, o.comments[2].comments[1].comments.length); + assert.deepEqual( + { val: 'twoo', expand: 'added' } + , o.comments[2].comments[1].comments[0]); + assert.deepEqual( + undefined + , o.comments[2].comments[1].comments[1]); + assert.deepEqual( + 'added' + , o.comments[2].comments[1].comments[2]); + + done(); + }); + + describe('array.path', function() { + describe('with single non-array value', function() { + it('works', function(done) { + mpath.set('arr.yep', false, o, function(v) { + return false === v ? true : v; + }); + assert.deepEqual([ + { yep: true, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true } + ], o.arr); + + mpath.set('arr.yep', false, o); + + assert.deepEqual([ + { yep: false, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: false } + ], o.arr); + + done(); + }); + }); + describe('with array of values', function() { + it('that are equal in length', function(done) { + mpath.set('arr.yep', ['one', 2], o, function(v) { + return 'one' === v ? 
1 : v; + }); + assert.deepEqual([ + { yep: 1, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 2 } + ], o.arr); + mpath.set('arr.yep', ['one', 2], o); + + assert.deepEqual([ + { yep: 'one', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 2 } + ], o.arr); + + done(); + }); + + it('that is less than length', function(done) { + mpath.set('arr.yep', [47], o, function(v) { + return 47 === v ? 4 : v; + }); + assert.deepEqual([ + { yep: 4, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 2 } + ], o.arr); + + mpath.set('arr.yep', [47], o); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 2 } + ], o.arr); + + done(); + }); + + it('that is greater than length', function(done) { + mpath.set('arr.yep', [5, 6, 7], o, function(v) { + return 5 === v ? 'five' : v; + }); + assert.deepEqual([ + { yep: 'five', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 6 } + ], o.arr); + + mpath.set('arr.yep', [5, 6, 7], o); + assert.deepEqual([ + { yep: 5, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 6 } + ], o.arr); + + done(); + }); + }); + }); + + describe('array.$.path', function() { + describe('with single non-array value', function() { + it('copies the value to each item in array', function(done) { + mpath.set('arr.$.yep', { xtra: 'double good' }, o, function(v) { + return v && v.xtra ? 'hi' : v; + }); + assert.deepEqual([ + { yep: 'hi', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 'hi' } + ], o.arr); + + mpath.set('arr.$.yep', { xtra: 'double good' }, o); + assert.deepEqual([ + { yep: { xtra: 'double good' }, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: { xtra: 'double good' } } + ], o.arr); + + done(); + }); + }); + describe('with array of values', function() { + it('copies the value to each item in array', function(done) { + mpath.set('arr.$.yep', [15], o, function(v) { + return v.length === 1 ? [] : v; + }); + assert.deepEqual([ + { yep: [], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: [] } + ], o.arr); + + mpath.set('arr.$.yep', [15], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: [15] } + ], o.arr); + + done(); + }); + }); + }); + + describe('array.index.path', function() { + it('works', function(done) { + mpath.set('arr.1.yep', 0, o, function(v) { + return 0 === v ? 'zero' : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 'zero' } + ], o.arr); + + mpath.set('arr.1.yep', 0, o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + }); + + describe('array.index.array.path', function() { + it('with single value', function(done) { + mpath.set('arr.0.arr.e', 35, o, function(v) { + return 35 === v ? 3 : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 }, e: 3 }, { a: { c: 48 }, e: 3 }, { d: 'yep', e: 3 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.0.arr.e', 35, o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 }, e: 35 }, { a: { c: 48 }, e: 35 }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.0.arr.e', ['a', 'b'], o, function(v) { + return 'a' === v ? 
'x' : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 }, e: 'x' }, { a: { c: 48 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.0.arr.e', ['a', 'b'], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 47 }, e: 'a' }, { a: { c: 48 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + }); + + describe('array.index.array.path.path', function() { + it('with single value', function(done) { + mpath.set('arr.0.arr.a.b', 36, o, function(v) { + return 36 === v ? 3 : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 3 }, e: 'a' }, { a: { c: 48, b: 3 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.0.arr.a.b', 36, o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 36 }, e: 'a' }, { a: { c: 48, b: 36 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.0.arr.a.b', [1, 2, 3, 4], o, function(v) { + return 2 === v ? 'two' : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 1 }, e: 'a' }, { a: { c: 48, b: 'two' }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.0.arr.a.b', [1, 2, 3, 4], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 1 }, e: 'a' }, { a: { c: 48, b: 2 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + }); + + describe('array.index.array.$.path.path', function() { + it('with single value', function(done) { + mpath.set('arr.0.arr.$.a.b', '$', o, function(v) { + return '$' === v ? 'dolla billz' : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: 'dolla billz' }, e: 'a' }, { a: { c: 48, b: 'dolla billz' }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.0.arr.$.a.b', '$', o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: '$' }, e: 'a' }, { a: { c: 48, b: '$' }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.0.arr.$.a.b', [1], o, function(v) { + return Array.isArray(v) ? {} : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: {} }, e: 'a' }, { a: { c: 48, b: {} }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.0.arr.$.a.b', [1], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: { b: [1] }, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + }); + + describe('array.array.index.path', function() { + it('with single value', function(done) { + mpath.set('arr.arr.0.a', 'single', o, function(v) { + return 'single' === v ? 'double' : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: 'double', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.arr.0.a', 'single', o); + assert.deepEqual([ + { yep: [15], arr: [{ a: 'single', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.arr.0.a', [4, 8, 15, 16, 23, 42], o, function(v) { + return 4 === v ? 
3 : v; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: 3, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: false } + ], o.arr); + + mpath.set('arr.arr.0.a', [4, 8, 15, 16, 23, 42], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: 4, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: false } + ], o.arr); + + done(); + }); + }); + + describe('array.array.$.index.path', function() { + it('with single value', function(done) { + mpath.set('arr.arr.$.0.a', 'singles', o, function(v) { + return 0; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: 0, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.arr.$.0.a', 'singles', o); + assert.deepEqual([ + { yep: [15], arr: [{ a: 'singles', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('$.arr.arr.0.a', 'single', o); + assert.deepEqual([ + { yep: [15], arr: [{ a: 'single', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.arr.$.0.a', [4, 8, 15, 16, 23, 42], o, function(v) { + return 'nope'; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: 'nope', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.arr.$.0.a', [4, 8, 15, 16, 23, 42], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42], e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.$.arr.0.a', [4, 8, 15, 16, 23, 42, 108], o); + assert.deepEqual([ + { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108], e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + }); + + describe('array.array.path.index', function() { + it('with single value', function(done) { + mpath.set('arr.arr.a.7', 47, o, function(v) { + return 1; + }); + assert.deepEqual([ + { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, 1], e: 'a' }, { a: { c: 48, b: [1], 7: 1 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + mpath.set('arr.arr.a.7', 47, o); + assert.deepEqual([ + { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, 47], e: 'a' }, { a: { c: 48, b: [1], 7: 47 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0 } + ], o.arr); + + done(); + }); + it('with array', function(done) { + o.arr[1].arr = [{ a: [] }, { a: [] }, { a: null }]; + mpath.set('arr.arr.a.7', [[null, 46], [undefined, 'woot']], o); + + const a1 = []; + const a2 = []; + a1[7] = undefined; + a2[7] = 'woot'; + + assert.deepEqual([ + { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, null], e: 'a' }, { a: { c: 48, b: [1], 7: 46 }, e: 'b' }, { d: 'yep', e: 35 }] }, + { yep: 0, arr: [{ a: a1 }, { a: a2 }, { a: null }] } + ], o.arr); + + done(); + }); + }); + + describe('handles array.array.path', function() { + it('with single', function(done) { + o.arr[1].arr = [{}, {}]; + assert.deepEqual([{}, {}], o.arr[1].arr); + o.arr.push({ arr: 'something else' }); + o.arr.push({ arr: ['something else'] }); + o.arr.push({ arr: [[]] }); + o.arr.push({ arr: [5] }); + + const weird = []; + weird.e = 'xmas'; + + // test + mpath.set('arr.arr.e', 47, o, function(v) { + return 'xmas'; + }); + assert.deepEqual([ + { yep: [15], arr: [ + { a: [4, 8, 15, 16, 23, 42, 108, null], e: 'xmas' }, + { a: { c: 48, b: [1], 7: 46 }, e: 'xmas' }, + { d: 'yep', e: 'xmas' } + ] + }, + { yep: 0, 
arr: [{ e: 'xmas' }, { e: 'xmas' }] }, + { arr: 'something else' }, + { arr: ['something else'] }, + { arr: [weird] }, + { arr: [5] } + ] + , o.arr); + + weird.e = 47; + + mpath.set('arr.arr.e', 47, o); + assert.deepEqual([ + { yep: [15], arr: [ + { a: [4, 8, 15, 16, 23, 42, 108, null], e: 47 }, + { a: { c: 48, b: [1], 7: 46 }, e: 47 }, + { d: 'yep', e: 47 } + ] + }, + { yep: 0, arr: [{ e: 47 }, { e: 47 }] }, + { arr: 'something else' }, + { arr: ['something else'] }, + { arr: [weird] }, + { arr: [5] } + ] + , o.arr); + + done(); + }); + it('with arrays', function(done) { + mpath.set('arr.arr.e', [[1, 2, 3], [4, 5], null, [], [6], [7, 8, 9]], o, function(v) { + return 10; + }); + + const weird = []; + weird.e = 10; + + assert.deepEqual([ + { yep: [15], arr: [ + { a: [4, 8, 15, 16, 23, 42, 108, null], e: 10 }, + { a: { c: 48, b: [1], 7: 46 }, e: 10 }, + { d: 'yep', e: 10 } + ] + }, + { yep: 0, arr: [{ e: 10 }, { e: 10 }] }, + { arr: 'something else' }, + { arr: ['something else'] }, + { arr: [weird] }, + { arr: [5] } + ] + , o.arr); + + mpath.set('arr.arr.e', [[1, 2, 3], [4, 5], null, [], [6], [7, 8, 9]], o); + + weird.e = 6; + + assert.deepEqual([ + { yep: [15], arr: [ + { a: [4, 8, 15, 16, 23, 42, 108, null], e: 1 }, + { a: { c: 48, b: [1], 7: 46 }, e: 2 }, + { d: 'yep', e: 3 } + ] + }, + { yep: 0, arr: [{ e: 4 }, { e: 5 }] }, + { arr: 'something else' }, + { arr: ['something else'] }, + { arr: [weird] }, + { arr: [5] } + ] + , o.arr); + + done(); + }); + }); + }); + + describe('with `special`', function() { + const o = doc(); + + it('works', function(done) { + mpath.set('name', 'chan', o, special, function(v) { + return 'hi'; + }); + assert.deepEqual('hi', o.name); + + mpath.set('name', 'changer', o, special); + assert.deepEqual('changer', o.name); + + mpath.set('first.second.third', [1, { name: 'y' }, 9], o, special); + assert.deepEqual([1, { name: 'y' }, 9], o.first.second.third); + + mpath.set('first.second.third.1.name', 'z', o, special); + assert.deepEqual([1, { name: 'z' }, 9], o.first.second.third); + + mpath.set('comments.1.name', 'ttwwoo', o, special); + assert.deepEqual({ name: 'two', _doc: { name: 'ttwwoo' } }, o.comments[1]); + + mpath.set('comments.2.comments.1.comments.0.expander', 'adder', o, special, function(v) { + return 'super'; + }); + assert.deepEqual( + { val: 2, expander: 'super' } + , o.comments[2]._doc.comments[1]._doc.comments[0]); + + mpath.set('comments.2.comments.1.comments.0.expander', 'adder', o, special); + assert.deepEqual( + { val: 2, expander: 'adder' } + , o.comments[2]._doc.comments[1]._doc.comments[0]); + + mpath.set('comments.2.comments.1.comments.2', 'set', o, special); + assert.equal(3, o.comments[2]._doc.comments[1]._doc.comments.length); + assert.deepEqual( + { val: 2, expander: 'adder' } + , o.comments[2]._doc.comments[1]._doc.comments[0]); + assert.deepEqual( + undefined + , o.comments[2]._doc.comments[1]._doc.comments[1]); + assert.deepEqual( + 'set' + , o.comments[2]._doc.comments[1]._doc.comments[2]); + done(); + }); + + describe('array.path', function() { + describe('with single non-array value', function() { + it('works', function(done) { + o.arr[1]._doc = { special: true }; + + mpath.set('arr.yep', false, o, special, function(v) { + return 'yes'; + }); + assert.deepEqual([ + { yep: 'yes', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true, _doc: { special: true, yep: 'yes' } } + ], o.arr); + + mpath.set('arr.yep', false, o, special); + assert.deepEqual([ + { yep: false, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 
'yep' }] }, + { yep: true, _doc: { special: true, yep: false } } + ], o.arr); + + done(); + }); + }); + describe('with array of values', function() { + it('that are equal in length', function(done) { + mpath.set('arr.yep', ['one', 2], o, special, function(v) { + return 2 === v ? 20 : v; + }); + assert.deepEqual([ + { yep: 'one', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true, _doc: { special: true, yep: 20 } } + ], o.arr); + + mpath.set('arr.yep', ['one', 2], o, special); + assert.deepEqual([ + { yep: 'one', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true, _doc: { special: true, yep: 2 } } + ], o.arr); + + done(); + }); + + it('that is less than length', function(done) { + mpath.set('arr.yep', [47], o, special, function(v) { + return 80; + }); + assert.deepEqual([ + { yep: 80, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true, _doc: { special: true, yep: 2 } } + ], o.arr); + + mpath.set('arr.yep', [47], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }, + { yep: true, _doc: { special: true, yep: 2 } } + ], o.arr); + + // add _doc to first element + o.arr[0]._doc = { yep: 46, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] }; + + mpath.set('arr.yep', [20], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], _doc: { yep: 20, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 2 } } + ], o.arr); + + done(); + }); + + it('that is greater than length', function(done) { + mpath.set('arr.yep', [5, 6, 7], o, special, function() { + return 'x'; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], _doc: { yep: 'x', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 'x' } } + ], o.arr); + + mpath.set('arr.yep', [5, 6, 7], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], _doc: { yep: 5, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 6 } } + ], o.arr); + + done(); + }); + }); + }); + + describe('array.$.path', function() { + describe('with single non-array value', function() { + it('copies the value to each item in array', function(done) { + mpath.set('arr.$.yep', { xtra: 'double good' }, o, special, function(v) { + return 9; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: 9, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 9 } } + ], o.arr); + + mpath.set('arr.$.yep', { xtra: 'double good' }, o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: { xtra: 'double good' }, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: { xtra: 'double good' } } } + ], o.arr); + + done(); + }); + }); + describe('with array of values', function() { + it('copies the value to each item in array', function(done) { + mpath.set('arr.$.yep', [15], o, special, function(v) { + return 'array'; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: 'array', arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 'array' } } + ], o.arr); + + 
mpath.set('arr.$.yep', [15], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: [15] } } + ], o.arr); + + done(); + }); + }); + }); + + describe('array.index.path', function() { + it('works', function(done) { + mpath.set('arr.1.yep', 0, o, special, function(v) { + return 1; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 1 } } + ], o.arr); + + mpath.set('arr.1.yep', 0, o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('array.index.array.path', function() { + it('with single value', function(done) { + mpath.set('arr.0.arr.e', 35, o, special, function(v) { + return 30; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 }, e: 30 }, { a: { c: 48 }, e: 30 }, { d: 'yep', e: 30 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.0.arr.e', 35, o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 }, e: 35 }, { a: { c: 48 }, e: 35 }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.0.arr.e', ['a', 'b'], o, special, function(v) { + return 'a' === v ? 
'A' : v; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 }, e: 'A' }, { a: { c: 48 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.0.arr.e', ['a', 'b'], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 47 }, e: 'a' }, { a: { c: 48 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('array.index.array.path.path', function() { + it('with single value', function(done) { + mpath.set('arr.0.arr.a.b', 36, o, special, function(v) { + return 20; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 20 }, e: 'a' }, { a: { c: 48, b: 20 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.0.arr.a.b', 36, o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 36 }, e: 'a' }, { a: { c: 48, b: 36 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.0.arr.a.b', [1, 2, 3, 4], o, special, function(v) { + return v * 2; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 2 }, e: 'a' }, { a: { c: 48, b: 4 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.0.arr.a.b', [1, 2, 3, 4], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 1 }, e: 'a' }, { a: { c: 48, b: 2 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('array.index.array.$.path.path', function() { + it('with single value', function(done) { + mpath.set('arr.0.arr.$.a.b', '$', o, special, function(v) { + return 'dollaz'; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: 'dollaz' }, e: 'a' }, { a: { c: 48, b: 'dollaz' }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.0.arr.$.a.b', '$', o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: '$' }, e: 'a' }, { a: { c: 48, b: '$' }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.0.arr.$.a.b', [1], o, special, function(v) { + return {}; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: {} }, e: 'a' }, { a: { c: 48, b: {} }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.0.arr.$.a.b', [1], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: { b: [1] }, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } 
}, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('array.array.index.path', function() { + it('with single value', function(done) { + mpath.set('arr.arr.0.a', 'single', o, special, function(v) { + return 88; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 88, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.arr.0.a', 'single', o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 'single', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.arr.0.a', [4, 8, 15, 16, 23, 42], o, special, function(v) { + return v * 2; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 8, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.arr.0.a', [4, 8, 15, 16, 23, 42], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 4, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('array.array.$.index.path', function() { + it('with single value', function(done) { + mpath.set('arr.arr.$.0.a', 'singles', o, special, function(v) { + return v.toUpperCase(); + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 'SINGLES', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.arr.$.0.a', 'singles', o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 'singles', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('$.arr.arr.0.a', 'single', o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: 'single', e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + it('with array', function(done) { + mpath.set('arr.arr.$.0.a', [4, 8, 15, 16, 23, 42], o, special, function(v) { + return Array; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: Array, e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.arr.$.0.a', [4, 8, 15, 16, 23, 42], o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42], e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.$.arr.0.a', [4, 8, 15, 16, 23, 42, 108], o, special); + 
assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108], e: 'a' }, { a: { c: 48, b: [1] }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('array.array.path.index', function() { + it('with single value', function(done) { + mpath.set('arr.arr.a.7', 47, o, special, function(v) { + return Object; + }); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, Object], e: 'a' }, { a: { c: 48, b: [1], 7: Object }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.arr.a.7', 47, o, special); + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, 47], e: 'a' }, { a: { c: 48, b: [1], 7: 47 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { special: true, yep: 0 } } + ], o.arr); + + done(); + }); + it('with array', function(done) { + o.arr[1]._doc.arr = [{ a: [] }, { a: [] }, { a: null }]; + mpath.set('arr.arr.a.7', [[null, 46], [undefined, 'woot']], o, special, function(v) { + return undefined === v ? 'nope' : v; + }); + + const a1 = []; + const a2 = []; + a1[7] = 'nope'; + a2[7] = 'woot'; + + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, null], e: 'a' }, { a: { c: 48, b: [1], 7: 46 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { arr: [{ a: a1 }, { a: a2 }, { a: null }], special: true, yep: 0 } } + ], o.arr); + + mpath.set('arr.arr.a.7', [[null, 46], [undefined, 'woot']], o, special); + + a1[7] = undefined; + + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { yep: [15], arr: [{ a: [4, 8, 15, 16, 23, 42, 108, null], e: 'a' }, { a: { c: 48, b: [1], 7: 46 }, e: 'b' }, { d: 'yep', e: 35 }] } }, + { yep: true, _doc: { arr: [{ a: a1 }, { a: a2 }, { a: null }], special: true, yep: 0 } } + ], o.arr); + + done(); + }); + }); + + describe('handles array.array.path', function() { + it('with single', function(done) { + o.arr[1]._doc.arr = [{}, {}]; + assert.deepEqual([{}, {}], o.arr[1]._doc.arr); + o.arr.push({ _doc: { arr: 'something else' } }); + o.arr.push({ _doc: { arr: ['something else'] } }); + o.arr.push({ _doc: { arr: [[]] } }); + o.arr.push({ _doc: { arr: [5] } }); + + // test + mpath.set('arr.arr.e', 47, o, special); + + const weird = []; + weird.e = 47; + + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { + yep: [15], + arr: [ + { a: [4, 8, 15, 16, 23, 42, 108, null], e: 47 }, + { a: { c: 48, b: [1], 7: 46 }, e: 47 }, + { d: 'yep', e: 47 } + ] + } + }, + { yep: true, + _doc: { + arr: [ + { e: 47 }, + { e: 47 } + ], + special: true, + yep: 0 + } + }, + { _doc: { arr: 'something else' } }, + { _doc: { arr: ['something else'] } }, + { _doc: { arr: [weird] } }, + { _doc: { arr: [5] } } + ] + , o.arr); + + done(); + }); + it('with arrays', function(done) { + mpath.set('arr.arr.e', [[1, 2, 3], [4, 5], null, [], [6], [7, 8, 9]], o, special); + + const weird = []; + weird.e = 6; + + assert.deepEqual([ + { yep: 47, arr: [{ a: { b: 47 } }, { a: { c: 48 } }, { d: 'yep' }], + _doc: { + yep: [15], + arr: [ + { a: [4, 8, 15, 16, 23, 42, 108, null], e: 1 }, + { 
a: { c: 48, b: [1], 7: 46 }, e: 2 }, + { d: 'yep', e: 3 } + ] + } + }, + { yep: true, + _doc: { + arr: [ + { e: 4 }, + { e: 5 } + ], + special: true, + yep: 0 + } + }, + { _doc: { arr: 'something else' } }, + { _doc: { arr: ['something else'] } }, + { _doc: { arr: [weird] } }, + { _doc: { arr: [5] } } + ] + , o.arr); + + done(); + }); + }); + + describe('that is a function', function() { + describe('without map', function() { + it('works on array value', function(done) { + const o = { hello: { world: [{ how: 'are' }, { you: '?' }] } }; + const special = function(obj, key, val) { + if (val) { + obj[key] = val; + } else { + return 'thing' == key + ? obj.world + : obj[key]; + } + }; + mpath.set('hello.thing.how', 'arrrr', o, special); + assert.deepEqual(o, { hello: { world: [{ how: 'arrrr' }, { you: '?', how: 'arrrr' }] } }); + done(); + }); + it('works on non-array value', function(done) { + const o = { hello: { world: { how: 'are you' } } }; + const special = function(obj, key, val) { + if (val) { + obj[key] = val; + } else { + return 'thing' == key + ? obj.world + : obj[key]; + } + }; + mpath.set('hello.thing.how', 'RU', o, special); + assert.deepEqual(o, { hello: { world: { how: 'RU' } } }); + done(); + }); + }); + it('works with map', function(done) { + const o = { hello: { world: [{ how: 'are' }, { you: '?' }] } }; + const special = function(obj, key, val) { + if (val) { + obj[key] = val; + } else { + return 'thing' == key + ? obj.world + : obj[key]; + } + }; + const map = function(val) { + return 'convert' == val + ? 'ºº' + : val; + }; + mpath.set('hello.thing.how', 'convert', o, special, map); + assert.deepEqual(o, { hello: { world: [{ how: 'ºº' }, { you: '?', how: 'ºº' }] } }); + done(); + }); + }); + + }); + + describe('get/set integration', function() { + const o = doc(); + + it('works', function(done) { + const vals = mpath.get('array.o.array.x.b', o); + + vals[0][0][2] = 10; + vals[1][0][1] = 0; + vals[1][1] = 'Rambaldi'; + vals[1][2] = [12, 14]; + vals[2] = [{ changed: true }, [null, ['changed', 'to', 'array']]]; + + mpath.set('array.o.array.x.b', vals, o); + + const t = [ + { o: { array: [{ x: { b: [4, 6, 10] } }, { y: 10 }] } }, + { o: { array: [{ x: { b: [1, 0, 3] } }, { x: { b: 'Rambaldi', z: 10 } }, { x: { b: [12, 14] } }] } }, + { o: { array: [{ x: { b: { changed: true } } }, { x: { b: [null, ['changed', 'to', 'array']] } }] } }, + { o: { array: [{ x: null }] } }, + { o: { array: [{ y: 3 }] } }, + { o: { array: [3, 0, null] } }, + { o: { name: 'ha' } } + ]; + assert.deepEqual(t, o.array); + done(); + }); + + it('array.prop', function(done) { + mpath.set('comments.name', ['this', 'was', 'changed'], o); + + assert.deepEqual([ + { name: 'this' }, + { name: 'was', _doc: { name: '2' } }, + { name: 'changed', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: '3', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } } + ], o.comments); + + mpath.set('comments.name', ['also', 'changed', 'this'], o, special); + + assert.deepEqual([ + { name: 'also' }, + { name: 'was', _doc: { name: 'changed' } }, + { name: 'changed', + comments: [{}, { comments: [{ val: 'twoo' }] }], + _doc: { name: 'this', comments: [{}, { _doc: { comments: [{ val: 2 }] } }] } } + ], o.comments); + + done(); + }); + + it('nested array', function(done) { + const obj = { arr: [[{ test: 41 }]] }; + mpath.set('arr.test', [[42]], obj); + assert.deepEqual(obj.arr, [[{ test: 42 }]]); + done(); + }); + }); + + describe('multiple $ use', function() { + const o = doc(); + it('is ok', function(done) { + 
assert.doesNotThrow(function() { + mpath.set('arr.$.arr.$.a', 35, o); + }); + done(); + }); + }); + + it('has', function(done) { + assert.ok(mpath.has('a', { a: 1 })); + assert.ok(mpath.has('a', { a: undefined })); + assert.ok(!mpath.has('a', {})); + assert.ok(!mpath.has('a', null)); + + assert.ok(mpath.has('a.b', { a: { b: 1 } })); + assert.ok(mpath.has('a.b', { a: { b: undefined } })); + assert.ok(!mpath.has('a.b', { a: 1 })); + assert.ok(!mpath.has('a.b', { a: null })); + + done(); + }); + + it('underneath a map', function(done) { + if (!global.Map) { + done(); + return; + } + assert.equal(mpath.get('a.b', { a: new Map([['b', 1]]) }), 1); + + const m = new Map([['b', 1]]); + const obj = { a: m }; + mpath.set('a.c', 2, obj); + assert.equal(m.get('c'), 2); + + done(); + }); + + it('unset', function(done) { + let o = { a: 1 }; + mpath.unset('a', o); + assert.deepEqual(o, {}); + + o = { a: { b: 1 } }; + mpath.unset('a.b', o); + assert.deepEqual(o, { a: {} }); + + o = { a: null }; + mpath.unset('a.b', o); + assert.deepEqual(o, { a: null }); + + done(); + }); + + it('unset with __proto__', function(done) { + // Should refuse to set __proto__ + function Clazz() {} + Clazz.prototype.foobar = true; + + mpath.unset('__proto__.foobar', new Clazz()); + assert.ok(Clazz.prototype.foobar); + + mpath.unset('constructor.prototype.foobar', new Clazz()); + assert.ok(Clazz.prototype.foobar); + + done(); + }); + + it('get() underneath subclassed array', function(done) { + class MyArray extends Array {} + + const obj = { + arr: new MyArray() + }; + obj.arr.push({ test: 2 }); + + const arr = mpath.get('arr.test', obj); + assert.equal(arr.constructor.name, 'Array'); + assert.ok(!(arr instanceof MyArray)); + + done(); + }); + + it('ignores setting a nested path that doesnt exist', function(done) { + const o = doc(); + assert.doesNotThrow(function() { + mpath.set('thing.that.is.new', 10, o); + }); + done(); + }); + }); +}); diff --git a/node_modules/mpath/test/stringToParts.js b/node_modules/mpath/test/stringToParts.js new file mode 100644 index 000000000..09940dfa9 --- /dev/null +++ b/node_modules/mpath/test/stringToParts.js @@ -0,0 +1,30 @@ +'use strict'; + +const assert = require('assert'); +const stringToParts = require('../lib/stringToParts'); + +describe('stringToParts', function() { + it('handles brackets for numbers', function() { + assert.deepEqual(stringToParts('list[0].name'), ['list', '0', 'name']); + assert.deepEqual(stringToParts('list[0][1].name'), ['list', '0', '1', 'name']); + }); + + it('handles dot notation', function() { + assert.deepEqual(stringToParts('a.b.c'), ['a', 'b', 'c']); + assert.deepEqual(stringToParts('a..b.d'), ['a', '', 'b', 'd']); + }); + + it('ignores invalid numbers in square brackets', function() { + assert.deepEqual(stringToParts('foo[1mystring]'), ['foo[1mystring]']); + assert.deepEqual(stringToParts('foo[1mystring].bar[1]'), ['foo[1mystring]', 'bar', '1']); + assert.deepEqual(stringToParts('foo[1mystring][2]'), ['foo[1mystring]', '2']); + }); + + it('handles empty string', function() { + assert.deepEqual(stringToParts(''), ['']); + }); + + it('handles trailing dot', function() { + assert.deepEqual(stringToParts('a.b.'), ['a', 'b', '']); + }); +}); \ No newline at end of file diff --git a/node_modules/mquery/.eslintignore b/node_modules/mquery/.eslintignore new file mode 100644 index 000000000..4b4d86310 --- /dev/null +++ b/node_modules/mquery/.eslintignore @@ -0,0 +1 @@ +coverage/ \ No newline at end of file diff --git a/node_modules/mquery/.eslintrc.json 
b/node_modules/mquery/.eslintrc.json new file mode 100644 index 000000000..8dabf1a21 --- /dev/null +++ b/node_modules/mquery/.eslintrc.json @@ -0,0 +1,123 @@ +{ + "extends": [ + "eslint:recommended" + ], + "plugins": [ + "mocha-no-only" + ], + "parserOptions": { + "ecmaVersion": 2017 + }, + "env": { + "node": true, + "es6": true + }, + "rules": { + "comma-style": "error", + "indent": [ + "error", + 2, + { + "SwitchCase": 1, + "VariableDeclarator": 2 + } + ], + "keyword-spacing": "error", + "no-whitespace-before-property": "error", + "no-buffer-constructor": "warn", + "no-console": "off", + "no-multi-spaces": "error", + "no-constant-condition": "off", + "func-call-spacing": "error", + "no-trailing-spaces": "error", + "no-undef": "error", + "no-unneeded-ternary": "error", + "no-const-assign": "error", + "no-useless-rename": "error", + "no-dupe-keys": "error", + "space-in-parens": [ + "error", + "never" + ], + "spaced-comment": [ + "error", + "always", + { + "block": { + "markers": [ + "!" + ], + "balanced": true + } + } + ], + "key-spacing": [ + "error", + { + "beforeColon": false, + "afterColon": true + } + ], + "comma-spacing": [ + "error", + { + "before": false, + "after": true + } + ], + "array-bracket-spacing": 1, + "arrow-spacing": [ + "error", + { + "before": true, + "after": true + } + ], + "object-curly-spacing": [ + "error", + "always" + ], + "comma-dangle": [ + "error", + "never" + ], + "no-unreachable": "error", + "quotes": [ + "error", + "single" + ], + "quote-props": [ + "error", + "as-needed" + ], + "semi": "error", + "no-extra-semi": "error", + "semi-spacing": "error", + "no-spaced-func": "error", + "no-throw-literal": "error", + "space-before-blocks": "error", + "space-before-function-paren": [ + "error", + "never" + ], + "space-infix-ops": "error", + "space-unary-ops": "error", + "no-var": "warn", + "prefer-const": "warn", + "strict": [ + "error", + "global" + ], + "no-restricted-globals": [ + "error", + { + "name": "context", + "message": "Don't use Mocha's global context" + } + ], + "no-prototype-builtins": "off", + "mocha-no-only/mocha-no-only": [ + "error" + ] + } +} \ No newline at end of file diff --git a/node_modules/mquery/.travis.yml b/node_modules/mquery/.travis.yml new file mode 100644 index 000000000..3d207e1d0 --- /dev/null +++ b/node_modules/mquery/.travis.yml @@ -0,0 +1,17 @@ +language: node_js +node_js: + - "8" + - "10" + - "12" +matrix: + include: + - node_js: "13" + env: "NVM_NODEJS_ORG_MIRROR=https://nodejs.org/download/nightly" + allow_failures: + # Allow the nightly installs to fail + - env: "NVM_NODEJS_ORG_MIRROR=https://nodejs.org/download/nightly" +script: + - npm test + - npm run lint +services: + - mongodb diff --git a/node_modules/mquery/History.md b/node_modules/mquery/History.md new file mode 100644 index 000000000..6b40aa997 --- /dev/null +++ b/node_modules/mquery/History.md @@ -0,0 +1,363 @@ +4.0.0 / 2021-08-24 + +4.0.0-rc0 / 2021-08-19 +====================== + * BREAKING CHANGE: drop support for Node < 12 #123 + * BREAKING CHANGE: upgrade to mongodb driver 4.x: drop support for `findAndModify()`, use native `findOneAndUpdate/Delete` #124 + * BREAKING CHANGE: rename findStream -> findCursor #124 + * BREAKING CHANGE: use native ES6 promises by default, remove bluebird dependency #123 + +3.2.5 / 2021-03-29 +================== + * fix(utils): make `mergeClone()` skip special properties like `__proto__` #121 [zpbrent](https://github.com/zpbrent) + +3.2.4 / 2021-02-12 +================== + * fix(utils): make clone() only copy own properties 
Automattic/mongoose#9876 + +3.2.3 / 2020-12-10 +================== + * fix(utils): avoid copying special properties like `__proto__` when merging and cloning. Fix CVE-2020-35149 + +3.2.2 / 2019-09-22 +================== + * fix: dont re-call setOptions() when pulling base class options Automattic/mongoose#8159 + +3.2.1 / 2018-08-24 +================== + * chore: upgrade deps + +3.2.0 / 2018-08-24 +================== + * feat: add $useProjection to opt in to using `projection` instead of `fields` re: MongoDB deprecation warnings Automattic/mongoose#6880 + +3.1.2 / 2018-08-01 +================== + * chore: move eslint to devDependencies #110 [jakesjews](https://github.com/jakesjews) + +3.1.1 / 2018-07-30 +================== + * chore: add eslint #107 [Fonger](https://github.com/Fonger) + * docs: clean up readConcern docs #106 [Fonger](https://github.com/Fonger) + +3.1.0 / 2018-07-29 +================== + * feat: add `readConcern()` helper #105 [Fonger](https://github.com/Fonger) + * feat: add `maxTimeMS()` as alias of `maxTime()` #105 [Fonger](https://github.com/Fonger) + * feat: add `collation()` helper #105 [Fonger](https://github.com/Fonger) + +3.0.1 / 2018-07-02 +================== + * fix: parse sort array options correctly #103 #102 [Fonger](https://github.com/Fonger) + +3.0.0 / 2018-01-20 +================== + * chore: upgrade deps and add nsp + +3.0.0-rc0 / 2017-12-06 +====================== + * BREAKING CHANGE: remove support for node < 4 + * BREAKING CHANGE: remove support for retainKeyOrder, will always be true by default re: Automattic/mongoose#2749 + +2.3.3 / 2017-11-19 +================== + * fixed; catch sync errors in cursor.toArray() re: Automattic/mongoose#5812 + +2.3.2 / 2017-09-27 +================== + * fixed; bumped debug -> 2.6.9 re: #89 + +2.3.1 / 2017-05-22 +================== + * fixed; bumped debug -> 2.6.7 re: #86 + +2.3.0 / 2017-03-05 +================== + * added; replaceOne function + * added; deleteOne and deleteMany functions + +2.2.3 / 2017-01-31 +================== + * fixed; throw correct error when passing incorrectly formatted array to sort() + +2.2.2 / 2017-01-31 +================== + * fixed; allow passing maps to sort() + +2.2.1 / 2017-01-29 +================== + * fixed; allow passing string to hint() + +2.2.0 / 2017-01-08 +================== + * added; updateOne and updateMany functions + +2.1.0 / 2016-12-22 +================== + * added; ability to pass an array to select() #81 [dciccale](https://github.com/dciccale) + +2.0.0 / 2016-09-25 +================== + * added; support for mongodb driver 2.0 streams + +1.12.0 / 2016-09-25 +=================== + * added; `retainKeyOrder` option re: Automattic/mongoose#4542 + +1.11.0 / 2016-06-04 +=================== + * added; `.minDistance()` helper and minDistance for `.near()` Automattic/mongoose#4179 + +1.10.1 / 2016-04-26 +=================== + * fixed; ensure conditions is an object before assigning #75 + +1.10.0 / 2016-03-16 +================== + + * updated; bluebird to latest 2.10.2 version #74 [matskiv](https://github.com/matskiv) + +1.9.0 / 2016-03-15 +================== + * added; `.eq` as a shortcut for `.equals` #72 [Fonger](https://github.com/Fonger) + * added; ability to use array syntax for sort re: https://jira.mongodb.org/browse/NODE-578 #67 + +1.8.0 / 2016-03-01 +================== + * fixed; dont throw an error if count used with sort or select Automattic/mongoose#3914 + +1.7.0 / 2016-02-23 +================== + * fixed; don't treat objects with a length property as argument objects #70 
+ * added; `.findCursor()` method #69 [nswbmw](https://github.com/nswbmw) + * added; `_compiledUpdate` property #68 [nswbmw](https://github.com/nswbmw) + +1.6.2 / 2015-07-12 +================== + + * fixed; support exec cb being called synchronously #66 + +1.6.1 / 2015-06-16 +================== + + * fixed; do not treat $meta projection as inclusive [vkarpov15](https://github.com/vkarpov15) + +1.6.0 / 2015-05-27 +================== + + * update dependencies #65 [bachp](https://github.com/bachp) + +1.5.0 / 2015-03-31 +================== + + * fixed; debug output + * fixed; allow hint usage with count #61 [trueinsider](https://github.com/trueinsider) + +1.4.0 / 2015-03-29 +================== + + * added; object support to slice() #60 [vkarpov15](https://github.com/vkarpov15) + * debug; improved output #57 [flyvictor](https://github.com/flyvictor) + +1.3.0 / 2014-11-06 +================== + + * added; setTraceFunction() #53 from [jlai](https://github.com/jlai) + +1.2.1 / 2014-09-26 +================== + + * fixed; distinct assignment in toConstructor() #51 from [esco](https://github.com/esco) + +1.2.0 / 2014-09-18 +================== + + * added; stream() support for find() + +1.1.0 / 2014-09-15 +================== + + * add #then for co / koa support + * start checking code coverage + +1.0.0 / 2014-07-07 +================== + + * Remove broken require() calls until they're actually implemented #48 [vkarpov15](https://github.com/vkarpov15) + +0.9.0 / 2014-05-22 +================== + + * added; thunk() support + * release 0.8.0 + +0.8.0 / 2014-05-15 +================== + + * added; support for maxTimeMS #44 [yoitsro](https://github.com/yoitsro) + * updated; devDependency (driver to 1.4.4) + +0.7.0 / 2014-05-02 +================== + + * fixed; pass $maxDistance in $near object as described in docs #43 [vkarpov15](https://github.com/vkarpov15) + * fixed; cloning buffers #42 [gjohnson](https://github.com/gjohnson) + * tests; a little bit more `mongodb` agnostic #34 [refack](https://github.com/refack) + +0.6.0 / 2014-04-01 +================== + + * fixed; Allow $meta args in sort() so text search sorting works #37 [vkarpov15](https://github.com/vkarpov15) + +0.5.3 / 2014-02-22 +================== + + * fixed; cloning mongodb.Binary + +0.5.2 / 2014-01-30 +================== + + * fixed; cloning ObjectId constructors + * fixed; cloning of ReadPreferences #30 [ashtuchkin](https://github.com/ashtuchkin) + * tests; use specific mongodb version #29 [AvianFlu](https://github.com/AvianFlu) + * tests; remove dependency on ObjectId #28 [refack](https://github.com/refack) + * tests; add failing ReadPref test + +0.5.1 / 2014-01-17 +================== + + * added; deprecation notice to tags parameter #27 [ashtuchkin](https://github.com/ashtuchkin) + * readme; add links + +0.5.0 / 2014-01-16 +================== + + * removed; mongodb driver dependency #26 [ashtuchkin](https://github.com/ashtuchkin) + * removed; first class support of read preference tags #26 (still supported though) [ashtuchkin](https://github.com/ashtuchkin) + * added; better ObjectId clone support + * fixed; cloning objects that have no constructor #21 + * docs; cleaned up [ashtuchkin](https://github.com/ashtuchkin) + +0.4.2 / 2014-01-08 +================== + + * updated; debug module 0.7.4 [refack](https://github.com/refack) + +0.4.1 / 2014-01-07 +================== + + * fixed; inclusive/exclusive logic + +0.4.0 / 2014-01-06 +================== + + * added; selected() + * added; selectedInclusively() + * added; selectedExclusively() + 
+0.3.3 / 2013-11-14 +================== + + * Fix Mongo DB Dependency #20 [rschmukler](https://github.com/rschmukler) + +0.3.2 / 2013-09-06 +================== + + * added; geometry support for near() + +0.3.1 / 2013-08-22 +================== + + * fixed; update retains key order #19 + +0.3.0 / 2013-08-22 +================== + + * less hardcoded isNode env detection #18 [vshulyak](https://github.com/vshulyak) + * added; validation of findAndModify varients + * clone update doc before execution + * stricter env checks + +0.2.7 / 2013-08-2 +================== + + * Now support GeoJSON point values for Query#near + +0.2.6 / 2013-07-30 +================== + + * internally, 'asc' and 'desc' for sorts are now converted into 1 and -1, respectively + +0.2.5 / 2013-07-30 +================== + + * updated docs + * changed internal representation of `sort` to use objects instead of arrays + +0.2.4 / 2013-07-25 +================== + + * updated; sliced to 0.0.5 + +0.2.3 / 2013-07-09 +================== + + * now using a callback in collection.find instead of directly calling toArray() on the cursor [ebensing](https://github.com/ebensing) + +0.2.2 / 2013-07-09 +================== + + * now exposing mongodb export to allow for better testing [ebensing](https://github.com/ebensing) + +0.2.1 / 2013-07-08 +================== + + * select no longer accepts arrays as parameters [ebensing](https://github.com/ebensing) + +0.2.0 / 2013-07-05 +================== + + * use $geoWithin by default + +0.1.2 / 2013-07-02 +================== + + * added use$geoWithin flag [ebensing](https://github.com/ebensing) + * fix read preferences typo [ebensing](https://github.com/ebensing) + * fix reference to old param name in exists() [ebensing](https://github.com/ebensing) + +0.1.1 / 2013-06-24 +================== + + * fixed; $intersects -> $geoIntersects #14 [ebensing](https://github.com/ebensing) + * fixed; Retain key order when copying objects #15 [ebensing](https://github.com/ebensing) + * bump mongodb dev dep + +0.1.0 / 2013-05-06 +================== + + * findAndModify; return the query + * move mquery.proto.canMerge to mquery.canMerge + * overwrite option now works with non-empty objects + * use strict mode + * validate count options + * validate distinct options + * add aggregate to base collection methods + * clone merge arguments + * clone merged update arguments + * move subclass to mquery.prototype.toConstructor + * fixed; maxScan casing + * use regexp-clone + * added; geometry/intersects support + * support $and + * near: do not use "radius" + * callbacks always fire on next turn of loop + * defined collection interface + * remove time from tests + * clarify goals + * updated docs; + +0.0.1 / 2012-12-15 +================== + + * initial release diff --git a/node_modules/mquery/LICENSE b/node_modules/mquery/LICENSE new file mode 100644 index 000000000..38c529daa --- /dev/null +++ b/node_modules/mquery/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2012 [Aaron Heckmann](aaron.heckmann+github@gmail.com) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be 
+included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/mquery/Makefile b/node_modules/mquery/Makefile new file mode 100644 index 000000000..587655db2 --- /dev/null +++ b/node_modules/mquery/Makefile @@ -0,0 +1,26 @@ + +test: + @NODE_ENV=test ./node_modules/.bin/mocha $(T) $(TESTS) + +test-cov: + @NODE_ENV=test node \ + node_modules/.bin/istanbul cover \ + ./node_modules/.bin/_mocha \ + -- -u exports \ + +open-cov: + open coverage/lcov-report/index.html + +lint: + @NODE_ENV=test node ./node_modules/eslint/bin/eslint.js . + +test-travis: + @NODE_ENV=test node \ + node_modules/.bin/istanbul cover \ + ./node_modules/.bin/_mocha \ + --report lcovonly \ + --bail + @NODE_ENV=test node \ + ./node_modules/eslint/bin/eslint.js . + +.PHONY: test test-cov open-cov lint test-travis diff --git a/node_modules/mquery/README.md b/node_modules/mquery/README.md new file mode 100644 index 000000000..8e2136969 --- /dev/null +++ b/node_modules/mquery/README.md @@ -0,0 +1,1373 @@ +# mquery + +`mquery` is a fluent mongodb query builder designed to run in multiple environments. + +[![Build Status](https://travis-ci.org/aheckmann/mquery.svg?branch=master)](https://travis-ci.org/aheckmann/mquery) +[![NPM version](https://badge.fury.io/js/mquery.svg)](http://badge.fury.io/js/mquery) + +[![npm](https://nodei.co/npm/mquery.png)](https://www.npmjs.com/package/mquery) + +## Features + + - fluent query builder api + - custom base query support + - MongoDB 2.4 geoJSON support + - method + option combinations validation + - node.js driver compatibility + - environment detection + - [debug](https://github.com/visionmedia/debug) support + - separated collection implementations for maximum flexibility + +## Use + +```js +require('mongodb').connect(uri, function (err, db) { + if (err) return handleError(err); + + // get a collection + var collection = db.collection('artists'); + + // pass it to the constructor + mquery(collection).find({..}, callback); + + // or pass it to the collection method + mquery().find({..}).collection(collection).exec(callback) + + // or better yet, create a custom query constructor that has it always set + var Artist = mquery(collection).toConstructor(); + Artist().find(..).where(..).exec(callback) +}) +``` + +`mquery` requires a collection object to work with. In the example above we just pass the collection object created using the official [MongoDB driver](https://github.com/mongodb/node-mongodb-native). 
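Editor's note: as a quick orientation before the API reference that follows, here is a minimal end-to-end sketch chaining several of the helpers documented below (`where`, `gte`, `select`, `sort`, `limit`, `exec`). It is only an illustrative sketch: the `collection` variable and `handleError` are assumed to exist as in the snippet above, and the field names (`genre`, `albums`, `artist`) are invented.

```js
// build the query incrementally, then execute it with a callback
mquery(collection)
  .find({ genre: 'jazz' })     // declare a find operation with a match clause
  .where('albums').gte(2)      // add a chained condition (see gte() below)
  .select('artist albums')     // project only the fields we need
  .sort('-albums')             // sort descending by albums
  .limit(10)
  .exec(function (err, docs) { // exec() actually runs the query
    if (err) return handleError(err);
    console.log(docs);
  });
```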
+ + +## Fluent API + +- [find](#find) +- [findOne](#findOne) +- [count](#count) +- [remove](#remove) +- [update](#update) +- [findOneAndUpdate](#findoneandupdate) +- [findOneAndDelete, findOneAndRemove](#findoneandremove) +- [distinct](#distinct) +- [exec](#exec) +- [stream](#stream) +- [all](#all) +- [and](#and) +- [box](#box) +- [circle](#circle) +- [elemMatch](#elemmatch) +- [equals](#equals) +- [exists](#exists) +- [geometry](#geometry) +- [gt](#gt) +- [gte](#gte) +- [in](#in) +- [intersects](#intersects) +- [lt](#lt) +- [lte](#lte) +- [maxDistance](#maxdistance) +- [mod](#mod) +- [ne](#ne) +- [nin](#nin) +- [nor](#nor) +- [near](#near) +- [or](#or) +- [polygon](#polygon) +- [regex](#regex) +- [select](#select) +- [selected](#selected) +- [selectedInclusively](#selectedinclusively) +- [selectedExclusively](#selectedexclusively) +- [size](#size) +- [slice](#slice) +- [within](#within) +- [where](#where) +- [$where](#where-1) +- [batchSize](#batchsize) +- [collation](#collation) +- [comment](#comment) +- [hint](#hint) +- [j](#j) +- [limit](#limit) +- [maxScan](#maxscan) +- [maxTime, maxTimeMS](#maxtime) +- [skip](#skip) +- [sort](#sort) +- [read, setReadPreference](#read) +- [readConcern, r](#readconcern) +- [slaveOk](#slaveok) +- [snapshot](#snapshot) +- [tailable](#tailable) +- [writeConcern, w](#writeconcern) +- [wtimeout, wTimeout](#wtimeout) + +## Helpers + +- [collection](#collection) +- [then](#then) +- [thunk](#thunk) +- [merge](#mergeobject) +- [setOptions](#setoptionsoptions) +- [setTraceFunction](#settracefunctionfunc) +- [mquery.setGlobalTraceFunction](#mquerysetglobaltracefunctionfunc) +- [mquery.canMerge](#mquerycanmerge) +- [mquery.use$geoWithin](#mqueryusegeowithin) + +### find() + +Declares this query a _find_ query. Optionally pass a match clause and / or callback. If a callback is passed the query is executed. + +```js +mquery().find() +mquery().find(match) +mquery().find(callback) +mquery().find(match, function (err, docs) { + assert(Array.isArray(docs)); +}) +``` + +### findOne() + +Declares this query a _findOne_ query. Optionally pass a match clause and / or callback. If a callback is passed the query is executed. + +```js +mquery().findOne() +mquery().findOne(match) +mquery().findOne(callback) +mquery().findOne(match, function (err, doc) { + if (doc) { + // the document may not be found + console.log(doc); + } +}) +``` + +### count() + +Declares this query a _count_ query. Optionally pass a match clause and / or callback. If a callback is passed the query is executed. + +```js +mquery().count() +mquery().count(match) +mquery().count(callback) +mquery().count(match, function (err, number){ + console.log('we found %d matching documents', number); +}) +``` + +### remove() + +Declares this query a _remove_ query. Optionally pass a match clause and / or callback. If a callback is passed the query is executed. + +```js +mquery().remove() +mquery().remove(match) +mquery().remove(callback) +mquery().remove(match, function (err){}) +``` + +### update() + +Declares this query an _update_ query. Optionally pass an update document, match clause, options or callback. If a callback is passed, the query is executed. To force execution without passing a callback, run `update(true)`. 
+
+```js
+mquery().update()
+mquery().update(match, updateDocument)
+mquery().update(match, updateDocument, options)
+
+// the following all execute the command
+mquery().update(callback)
+mquery().update({$set: updateDocument}, callback)
+mquery().update(match, updateDocument, callback)
+mquery().update(match, updateDocument, options, function (err, result){})
+mquery().update(true) // executes (unsafe write)
+```
+
+##### the update document
+
+All paths passed that are not `$atomic` operations will become `$set` ops. For example:
+
+```js
+mquery(collection).where({ _id: id }).update({ title: 'words' }, callback)
+```
+
+becomes
+
+```js
+collection.update({ _id: id }, { $set: { title: 'words' }}, callback)
+```
+
+This behavior can be overridden using the `overwrite` option (see below).
+
+##### options
+
+Options are passed to the `setOptions()` method.
+
+- overwrite
+
+Passing an empty object `{ }` as the update document will result in a no-op unless the `overwrite` option is passed. Without the `overwrite` option, the update operation will be ignored and the callback executed without sending the command to MongoDB, to prevent accidentally overwriting documents in the collection.
+
+```js
+var q = mquery(collection).where({ _id: id }).setOptions({ overwrite: true });
+q.update({ }, callback); // overwrite with an empty doc
+```
+
+The `overwrite` option isn't just for empty objects; it also provides a means to override the default `$set` conversion and send the update document as is.
+
+```js
+// create a base query
+var base = mquery({ _id: 108 }).collection(collection).toConstructor();
+
+base().findOne(function (err, doc) {
+  console.log(doc); // { _id: 108, name: 'cajon' }
+
+  base().setOptions({ overwrite: true }).update({ changed: true }, function (err) {
+    base().findOne(function (err, doc) {
+      console.log(doc); // { _id: 108, changed: true } - the doc was overwritten
+    });
+  });
+})
+```
+
+- multi
+
+Updates only modify a single document by default. To update multiple documents, set the `multi` option to `true`.
+
+```js
+mquery()
+  .collection(coll)
+  .update({ name: /^match/ }, { $addToSet: { arr: 4 }}, { multi: true }, callback)
+
+// another way of doing it
+mquery({ name: /^match/ })
+  .collection(coll)
+  .setOptions({ multi: true })
+  .update({ $addToSet: { arr: 4 }}, callback)
+
+// update multiple documents with an empty doc
+var q = mquery(collection).where({ name: /^match/ });
+q.setOptions({ multi: true, overwrite: true });
+q.update({ });
+q.update(function (err, result) {
+  console.log(arguments);
+});
+```
+
+### findOneAndUpdate()
+
+Declares this query a _findAndModify_ with update query. Optionally pass a match clause, update document, options, or callback. If a callback is passed, the query is executed.
+
+When executed, the first matching document (if found) is modified according to the update document and passed back to the callback.
+
+##### options
+
+Options are passed to the `setOptions()` method.
+
+- `returnDocument`: string - `'after'` to return the modified document rather than the original. defaults to `'before'`
+- `upsert`: boolean - creates the object if it doesn't exist. defaults to false
defaults to false +- `sort`: if multiple docs are found by the match condition, sets the sort order to choose which doc to update + +```js +query.findOneAndUpdate() +query.findOneAndUpdate(updateDocument) +query.findOneAndUpdate(match, updateDocument) +query.findOneAndUpdate(match, updateDocument, options) + +// the following all execute the command +query.findOneAndUpdate(callback) +query.findOneAndUpdate(updateDocument, callback) +query.findOneAndUpdate(match, updateDocument, callback) +query.findOneAndUpdate(match, updateDocument, options, function (err, doc) { + if (doc) { + // the document may not be found + console.log(doc); + } +}) + ``` + +### findOneAndRemove() + +Declares this query a _findAndModify_ with remove query. Alias of findOneAndDelete. +Optionally pass a match clause, options, or callback. If a callback is passed, the query is executed. + +When executed, the first matching document (if found) is modified according to the update document, removed from the collection and passed to the callback. + +##### options + +Options are passed to the `setOptions()` method. + +- `sort`: if multiple docs are found by the condition, sets the sort order to choose which doc to modify and remove + +```js +A.where().findOneAndDelete() +A.where().findOneAndRemove() +A.where().findOneAndRemove(match) +A.where().findOneAndRemove(match, options) + +// the following all execute the command +A.where().findOneAndRemove(callback) +A.where().findOneAndRemove(match, callback) +A.where().findOneAndRemove(match, options, function (err, doc) { + if (doc) { + // the document may not be found + console.log(doc); + } +}) + ``` + +### distinct() + +Declares this query a _distinct_ query. Optionally pass the distinct field, a match clause or callback. If a callback is passed the query is executed. + +```js +mquery().distinct() +mquery().distinct(match) +mquery().distinct(match, field) +mquery().distinct(field) + +// the following all execute the command +mquery().distinct(callback) +mquery().distinct(field, callback) +mquery().distinct(match, callback) +mquery().distinct(match, field, function (err, result) { + console.log(result); +}) +``` + +### exec() + +Executes the query. + +```js +mquery().findOne().where('route').intersects(polygon).exec(function (err, docs){}) +``` + +### stream() + +Executes the query and returns a stream. + +```js +var stream = mquery().find().stream(options); +stream.on('data', cb); +stream.on('close', fn); +``` + +Note: this only works with `find()` operations. + +Note: returns the stream object directly from the node-mongodb-native driver. (currently streams1 type stream). Any options will be passed along to the [driver method](http://mongodb.github.io/node-mongodb-native/api-generated/cursor.html#stream). 
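
To show how the execution helpers above combine with the condition and option helpers documented below, here is a minimal end-to-end sketch. It assumes the 2.x Node.js driver's callback API, a locally running MongoDB instance, and a `test` database with a `users` collection; those names are illustrative only, not part of the documented API.

```js
var MongoClient = require('mongodb').MongoClient;
var mquery = require('mquery');

MongoClient.connect('mongodb://localhost:27017/test', function (err, db) {
  if (err) throw err;

  // hand the driver collection to mquery so it knows where to send the query
  var users = db.collection('users');

  mquery(users)
    .where('age').gte(21).lte(65)   // { age: { $gte: 21, $lte: 65 }}
    .select('name age -_id')        // include name and age, exclude _id
    .sort('-age')                   // sort by age, descending
    .limit(10)
    .exec(function (err, docs) {
      if (err) throw err;
      console.log('found %d users', docs.length);
      db.close();
    });
});
```

The same chain could end in `.stream()` instead of `.exec()` when the result set is too large to buffer in memory.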
+ +------------- + +### all() + +Specifies an `$all` query condition + +```js +mquery().where('permission').all(['read', 'write']) +``` + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/operator/all/) + +### and() + +Specifies arguments for an `$and` condition + +```js +mquery().and([{ color: 'green' }, { status: 'ok' }]) +``` + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/operator/and/) + +### box() + +Specifies a `$box` condition + +```js +var lowerLeft = [40.73083, -73.99756] +var upperRight= [40.741404, -73.988135] + +mquery().where('location').within().box(lowerLeft, upperRight) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/box/) + +### circle() + +Specifies a `$center` or `$centerSphere` condition. + +```js +var area = { center: [50, 50], radius: 10, unique: true } +query.where('loc').within().circle(area) +query.circle('loc', area); + +// for spherical calculations +var area = { center: [50, 50], radius: 10, unique: true, spherical: true } +query.where('loc').within().circle(area) +query.circle('loc', area); +``` + +- [MongoDB Documentation - center](http://docs.mongodb.org/manual/reference/operator/center/) +- [MongoDB Documentation - centerSphere](http://docs.mongodb.org/manual/reference/operator/centerSphere/) + +### elemMatch() + +Specifies an `$elemMatch` condition + +```js +query.where('comment').elemMatch({ author: 'autobot', votes: {$gte: 5}}) + +query.elemMatch('comment', function (elem) { + elem.where('author').equals('autobot'); + elem.where('votes').gte(5); +}) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/elemMatch/) + +### equals() + +Specifies the complementary comparison value for the path specified with `where()`. + +```js +mquery().where('age').equals(49); + +// is the same as + +mquery().where({ 'age': 49 }); +``` + +### exists() + +Specifies an `$exists` condition + +```js +// { name: { $exists: true }} +mquery().where('name').exists() +mquery().where('name').exists(true) +mquery().exists('name') + +// { name: { $exists: false }} +mquery().where('name').exists(false); +mquery().exists('name', false); +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/exists/) + +### geometry() + +Specifies a `$geometry` condition + +```js +var polyA = [[[ 10, 20 ], [ 10, 40 ], [ 30, 40 ], [ 30, 20 ]]] +query.where('loc').within().geometry({ type: 'Polygon', coordinates: polyA }) + +// or +var polyB = [[ 0, 0 ], [ 1, 1 ]] +query.where('loc').within().geometry({ type: 'LineString', coordinates: polyB }) + +// or +var polyC = [ 0, 0 ] +query.where('loc').within().geometry({ type: 'Point', coordinates: polyC }) + +// or +query.where('loc').intersects().geometry({ type: 'Point', coordinates: polyC }) + +// or +query.where('loc').near().geometry({ type: 'Point', coordinates: [3,5] }) +``` + +`geometry()` **must** come after `intersects()`, `within()`, or `near()`. + +The `object` argument must contain `type` and `coordinates` properties. + +- type `String` +- coordinates `Array` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/geometry/) + +### gt() + +Specifies a `$gt` query condition. + +```js +mquery().where('clicks').gt(999) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/gt/) + +### gte() + +Specifies a `$gte` query condition. 
+ +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/gte/) + +```js +mquery().where('clicks').gte(1000) +``` + +### in() + +Specifies an `$in` query condition. + +```js +mquery().where('author_id').in([3, 48901, 761]) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/in/) + +### intersects() + +Declares an `$geoIntersects` query for `geometry()`. + +```js +query.where('path').intersects().geometry({ + type: 'LineString' + , coordinates: [[180.0, 11.0], [180, 9.0]] +}) + +// geometry arguments are supported +query.where('path').intersects({ + type: 'LineString' + , coordinates: [[180.0, 11.0], [180, 9.0]] +}) +``` + +**Must** be used after `where()`. + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/geoIntersects/) + +### lt() + +Specifies a `$lt` query condition. + +```js +mquery().where('clicks').lt(50) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/lt/) + +### lte() + +Specifies a `$lte` query condition. + +```js +mquery().where('clicks').lte(49) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/lte/) + +### maxDistance() + +Specifies a `$maxDistance` query condition. + +```js +mquery().where('location').near({ center: [139, 74.3] }).maxDistance(5) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/maxDistance/) + +### mod() + +Specifies a `$mod` condition + +```js +mquery().where('count').mod(2, 0) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/mod/) + +### ne() + +Specifies a `$ne` query condition. + +```js +mquery().where('status').ne('ok') +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/ne/) + +### nin() + +Specifies an `$nin` query condition. + +```js +mquery().where('author_id').nin([3, 48901, 761]) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/nin/) + +### nor() + +Specifies arguments for an `$nor` condition. + +```js +mquery().nor([{ color: 'green' }, { status: 'ok' }]) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/nor/) + +### near() + +Specifies arguments for a `$near` or `$nearSphere` condition. + +These operators return documents sorted by distance. + +#### Example + +```js +query.where('loc').near({ center: [10, 10] }); +query.where('loc').near({ center: [10, 10], maxDistance: 5 }); +query.near('loc', { center: [10, 10], maxDistance: 5 }); + +// GeoJSON +query.where('loc').near({ center: { type: 'Point', coordinates: [10, 10] }}); +query.where('loc').near({ center: { type: 'Point', coordinates: [10, 10] }, maxDistance: 5, spherical: true }); +query.where('loc').near().geometry({ type: 'Point', coordinates: [10, 10] }); + +// For a $nearSphere condition, pass the `spherical` option. +query.near({ center: [10, 10], maxDistance: 5, spherical: true }); +``` + +[MongoDB Documentation](http://www.mongodb.org/display/DOCS/Geospatial+Indexing) + +### or() + +Specifies arguments for an `$or` condition. 
+ +```js +mquery().or([{ color: 'red' }, { status: 'emergency' }]) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/or/) + +### polygon() + +Specifies a `$polygon` condition + +```js +mquery().where('loc').within().polygon([10,20], [13, 25], [7,15]) +mquery().polygon('loc', [10,20], [13, 25], [7,15]) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/polygon/) + +### regex() + +Specifies a `$regex` query condition. + +```js +mquery().where('name').regex(/^sixstepsrecords/) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/regex/) + +### select() + +Specifies which document fields to include or exclude + +```js +// 1 means include, 0 means exclude +mquery().select({ name: 1, address: 1, _id: 0 }) + +// or + +mquery().select('name address -_id') +``` + +##### String syntax + +When passing a string, prefixing a path with `-` will flag that path as excluded. When a path does not have the `-` prefix, it is included. + +```js +// include a and b, exclude c +query.select('a b -c'); + +// or you may use object notation, useful when +// you have keys already prefixed with a "-" +query.select({a: 1, b: 1, c: 0}); +``` + +_Cannot be used with `distinct()`._ + +### selected() + +Determines if the query has selected any fields. + +```js +var query = mquery(); +query.selected() // false +query.select('-name'); +query.selected() // true +``` + +### selectedInclusively() + +Determines if the query has selected any fields inclusively. + +```js +var query = mquery().select('name'); +query.selectedInclusively() // true + +var query = mquery(); +query.selected() // false +query.select('-name'); +query.selectedInclusively() // false +query.selectedExclusively() // true +``` + +### selectedExclusively() + +Determines if the query has selected any fields exclusively. + +```js +var query = mquery().select('-name'); +query.selectedExclusively() // true + +var query = mquery(); +query.selected() // false +query.select('name'); +query.selectedExclusively() // false +query.selectedInclusively() // true +``` + +### size() + +Specifies a `$size` query condition. + +```js +mquery().where('someArray').size(6) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/size/) + +### slice() + +Specifies a `$slice` projection for a `path` + +```js +mquery().where('comments').slice(5) +mquery().where('comments').slice(-5) +mquery().where('comments').slice([-10, 5]) +``` + +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/projection/slice/) + +### within() + +Sets a `$geoWithin` or `$within` argument for geo-spatial queries. + +```js +mquery().within().box() +mquery().within().circle() +mquery().within().geometry() + +mquery().where('loc').within({ center: [50,50], radius: 10, unique: true, spherical: true }); +mquery().where('loc').within({ box: [[40.73, -73.9], [40.7, -73.988]] }); +mquery().where('loc').within({ polygon: [[],[],[],[]] }); + +mquery().where('loc').within([], [], []) // polygon +mquery().where('loc').within([], []) // box +mquery().where('loc').within({ type: 'LineString', coordinates: [...] }); // geometry +``` + +As of mquery 2.0, `$geoWithin` is used by default. This impacts you if running MongoDB < 2.4. To alter this behavior, see [mquery.use$geoWithin](#mqueryusegeowithin). + +**Must** be used after `where()`. 
+ +[MongoDB Documentation](http://docs.mongodb.org/manual/reference/operator/geoWithin/) + +### where() + +Specifies a `path` for use with chaining + +```js +// instead of writing: +mquery().find({age: {$gte: 21, $lte: 65}}); + +// we can instead write: +mquery().where('age').gte(21).lte(65); + +// passing query conditions is permitted too +mquery().find().where({ name: 'vonderful' }) + +// chaining +mquery() +.where('age').gte(21).lte(65) +.where({ 'name': /^vonderful/i }) +.where('friends').slice(10) +.exec(callback) +``` + +### $where() + +Specifies a `$where` condition. + +Use `$where` when you need to select documents using a JavaScript expression. + +```js +query.$where('this.comments.length > 10 || this.name.length > 5').exec(callback) + +query.$where(function () { + return this.comments.length > 10 || this.name.length > 5; +}) +``` + +Only use `$where` when you have a condition that cannot be met using other MongoDB operators like `$lt`. Be sure to read about all of [its caveats](http://docs.mongodb.org/manual/reference/operator/where/) before using. + +----------- + +### batchSize() + +Specifies the batchSize option. + +```js +query.batchSize(100) +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/method/cursor.batchSize/) + +### collation() + +Specifies the collation option. + +```js +query.collation({ locale: "en_US", strength: 1 }) +``` + +[MongoDB documentation](https://docs.mongodb.com/manual/reference/method/cursor.collation/#cursor.collation) + +### comment() + +Specifies the comment option. + +```js +query.comment('login query'); +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/operator/) + +### hint() + +Sets query hints. + +```js +mquery().hint({ indexA: 1, indexB: -1 }) +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/operator/hint/) + +### j() + +Requests acknowledgement that this operation has been persisted to MongoDB's on-disk journal. + +This option is only valid for operations that write to the database: + +- `deleteOne()` +- `deleteMany()` +- `findOneAndDelete()` +- `findOneAndUpdate()` +- `remove()` +- `update()` +- `updateOne()` +- `updateMany()` + +Defaults to the `j` value if it is specified in [writeConcern](#writeconcern) + +```js +mquery().j(true); +``` + +### limit() + +Specifies the limit option. + +```js +query.limit(20) +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/method/cursor.limit/) + +### maxScan() + +Specifies the maxScan option. + +```js +query.maxScan(100) +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/operator/maxScan/) + +### maxTime() + +Specifies the maxTimeMS option. + +```js +query.maxTime(100) +query.maxTimeMS(100) +``` + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/method/cursor.maxTimeMS/) + + +### skip() + +Specifies the skip option. + +```js +query.skip(100).limit(20) +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/method/cursor.skip/) + +### sort() + +Sets the query sort order. + +If an object is passed, key values allowed are `asc`, `desc`, `ascending`, `descending`, `1`, and `-1`. + +If a string is passed, it must be a space delimited list of path names. 
The sort order of each path is ascending unless the path name is prefixed with `-` which will be treated as descending. + +```js +// these are equivalent +query.sort({ field: 'asc', test: -1 }); +query.sort('field -test'); +``` + +_Cannot be used with `distinct()`._ + +[MongoDB documentation](http://docs.mongodb.org/manual/reference/method/cursor.sort/) + +### read() + +Sets the readPreference option for the query. + +```js +mquery().read('primary') +mquery().read('p') // same as primary + +mquery().read('primaryPreferred') +mquery().read('pp') // same as primaryPreferred + +mquery().read('secondary') +mquery().read('s') // same as secondary + +mquery().read('secondaryPreferred') +mquery().read('sp') // same as secondaryPreferred + +mquery().read('nearest') +mquery().read('n') // same as nearest + +mquery().setReadPreference('primary') // alias of .read() +``` + +##### Preferences: + +- `primary` - (default) Read from primary only. Operations will produce an error if primary is unavailable. Cannot be combined with tags. +- `secondary` - Read from secondary if available, otherwise error. +- `primaryPreferred` - Read from primary if available, otherwise a secondary. +- `secondaryPreferred` - Read from a secondary if available, otherwise read from the primary. +- `nearest` - All operations read from among the nearest candidates, but unlike other modes, this option will include both the primary and all secondaries in the random selection. + +Aliases + +- `p` primary +- `pp` primaryPreferred +- `s` secondary +- `sp` secondaryPreferred +- `n` nearest + +##### Preference Tags: + +To keep the separation of concerns between `mquery` and your driver +clean, `mquery#read()` no longer handles specifying a second `tags` argument as of version 0.5. +If you need to specify tags, pass any non-string argument as the first argument. +`mquery` will pass this argument untouched to your collections methods later. +For example: + +```js +// example of specifying tags using the Node.js driver +var ReadPref = require('mongodb').ReadPreference; +var preference = new ReadPref('secondary', [{ dc:'sf', s: 1 },{ dc:'ma', s: 2 }]); +mquery(..).read(preference).exec(); +``` + +Read more about how to use read preferences [here](http://docs.mongodb.org/manual/applications/replication/#read-preference) and [here](http://mongodb.github.com/node-mongodb-native/driver-articles/anintroductionto1_1and2_2.html#read-preferences). + + +### readConcern() + +Sets the readConcern option for the query. + +```js +// local +mquery().readConcern('local') +mquery().readConcern('l') +mquery().r('l') + +// available +mquery().readConcern('available') +mquery().readConcern('a') +mquery().r('a') + +// majority +mquery().readConcern('majority') +mquery().readConcern('m') +mquery().r('m') + +// linearizable +mquery().readConcern('linearizable') +mquery().readConcern('lz') +mquery().r('lz') + +// snapshot +mquery().readConcern('snapshot') +mquery().readConcern('s') +mquery().r('s') +``` + +##### Read Concern Level: + +- `local` - The query returns from the instance with no guarantee guarantee that the data has been written to a majority of the replica set members (i.e. may be rolled back). (MongoDB 3.2+) +- `available` - The query returns from the instance with no guarantee guarantee that the data has been written to a majority of the replica set members (i.e. may be rolled back). (MongoDB 3.6+) +- `majority` - The query returns the data that has been acknowledged by a majority of the replica set members. 
The documents returned by the read operation are durable, even in the event of failure. (MongoDB 3.2+)
- `linearizable` - The query returns data that reflects all successful majority-acknowledged writes that completed prior to the start of the read operation. The query may wait for concurrently executing writes to propagate to a majority of replica set members before returning results. (MongoDB 3.4+)
- `snapshot` - Only available for operations within multi-document transactions. Upon transaction commit with write concern "majority", the transaction operations are guaranteed to have read from a snapshot of majority-committed data. (MongoDB 4.0+)

Aliases

- `l`  local
- `a`  available
- `m`  majority
- `lz` linearizable
- `s`  snapshot

Read more about how to use read concern [here](https://docs.mongodb.com/manual/reference/read-concern/).

### writeConcern()

Sets the writeConcern option for the query.

This option is only valid for operations that write to the database:

- `deleteOne()`
- `deleteMany()`
- `findOneAndDelete()`
- `findOneAndUpdate()`
- `remove()`
- `update()`
- `updateOne()`
- `updateMany()`

```js
mquery().writeConcern(0)
mquery().writeConcern(1)
mquery().writeConcern({ w: 1, j: true, wtimeout: 2000 })
mquery().writeConcern('majority')
mquery().writeConcern('m') // same as majority
mquery().writeConcern('tagSetName') // if the tag set is 'm', use .writeConcern({ w: 'm' }) instead
mquery().w(1) // w is alias of writeConcern
```

##### Write Concern:

`writeConcern({ w: <value>, j: <boolean>, wtimeout: <number> })`

- the `w` option requests acknowledgement that the write operation has propagated to a specified number of mongod instances or to mongod instances with specified tags
- the `j` option requests acknowledgement that the write operation has been written to the journal
- the `wtimeout` option specifies a time limit to prevent write operations from blocking indefinitely

This can be broken down into the following equivalent syntax:

`mquery().w(<value>).j(<boolean>).wtimeout(<number>)`

Read more about how to use write concern [here](https://docs.mongodb.com/manual/reference/write-concern/).

### slaveOk()

Sets the slaveOk option. `true` allows reading from secondaries.

**Deprecated.** Use [read()](#read) preferences instead on MongoDB >= 2.2.

```js
query.slaveOk() // true
query.slaveOk(true)
query.slaveOk(false)
```

[MongoDB documentation](http://docs.mongodb.org/manual/reference/method/rs.slaveOk/)

### snapshot()

Specifies this query as a snapshot query.

```js
mquery().snapshot() // true
mquery().snapshot(true)
mquery().snapshot(false)
```

_Cannot be used with `distinct()`._

[MongoDB documentation](http://docs.mongodb.org/manual/reference/operator/snapshot/)

### tailable()

Sets the tailable option.

```js
mquery().tailable() // true
mquery().tailable(true)
mquery().tailable(false)
```

_Cannot be used with `distinct()`._

[MongoDB Documentation](http://docs.mongodb.org/manual/tutorial/create-tailable-cursor/)

### wtimeout()

Specifies a time limit, in milliseconds, for the write concern. If `w > 1`, it is the maximum amount of time to
wait for this write to propagate through the replica set before this operation fails. The default is `0`, which means no timeout.
+ +This option is only valid for operations that write to the database: + +- `deleteOne()` +- `deleteMany()` +- `findOneAndDelete()` +- `findOneAndUpdate()` +- `remove()` +- `update()` +- `updateOne()` +- `updateMany()` + +Defaults to `wtimeout` value if it is specified in [writeConcern](#writeconcern) + +```js +mquery().wtimeout(2000) +mquery().wTimeout(2000) +``` + +## Helpers + +### collection() + +Sets the querys collection. + +```js +mquery().collection(aCollection) +``` + +### then() + +Executes the query and returns a promise which will be resolved with the query results or rejected if the query responds with an error. + +```js +mquery().find(..).then(success, error); +``` + +This is very useful when combined with [co](https://github.com/visionmedia/co) or [koa](https://github.com/koajs/koa), which automatically resolve promise-like objects for you. + +```js +co(function*(){ + var doc = yield mquery().findOne({ _id: 499 }); + console.log(doc); // { _id: 499, name: 'amazing', .. } +})(); +``` + +_NOTE_: +The returned promise is a [bluebird](https://github.com/petkaantonov/bluebird/) promise but this is customizable. If you want to +use your favorite promise library, simply set `mquery.Promise = YourPromiseConstructor`. +Your `Promise` must be [promises A+](http://promisesaplus.com/) compliant. + +### thunk() + +Returns a thunk which when called runs the query's `exec` method passing the results to the callback. + +```js +var thunk = mquery(collection).find({..}).thunk(); + +thunk(function(err, results) { + +}) +``` + +### merge(object) + +Merges other mquery or match condition objects into this one. When an mquery instance is passed, its match conditions, field selection and options are merged. + +```js +var drum = mquery({ type: 'drum' }).collection(instruments); +var redDrum = mquery({ color: 'red' }).merge(drum); +redDrum.count(function (err, n) { + console.log('there are %d red drums', n); +}) +``` + +Internally uses `mquery.canMerge` to determine validity. + +### setOptions(options) + +Sets query options. + +```js +mquery().setOptions({ collection: coll, limit: 20 }) +``` + +##### options + +- [tailable](#tailable) * +- [sort](#sort) * +- [limit](#limit) * +- [skip](#skip) * +- [maxScan](#maxscan) * +- [maxTime](#maxtime) * +- [batchSize](#batchSize) * +- [comment](#comment) * +- [snapshot](#snapshot) * +- [hint](#hint) * +- [collection](#collection): the collection to query against + +_* denotes a query helper method is also available_ + +### setTraceFunction(func) + +Set a function to trace this query. Useful for profiling or logging. + +```js +function traceFunction (method, queryInfo, query) { + console.log('starting ' + method + ' query'); + + return function (err, result, millis) { + console.log('finished ' + method + ' query in ' + millis + 'ms'); + }; +} + +mquery().setTraceFunction(traceFunction).findOne({name: 'Joe'}, cb); +``` + +The trace function is passed (method, queryInfo, query) + +- method is the name of the method being called (e.g. findOne) +- queryInfo contains information about the query: + - conditions: query conditions/criteria + - options: options such as sort, fields, etc + - doc: document being updated +- query is the query object + +The trace function should return a callback function which accepts: +- err: error, if any +- result: result, if any +- millis: time spent waiting for query result + +NOTE: stream requests are not traced. + +### mquery.setGlobalTraceFunction(func) + +Similar to `setTraceFunction()` but automatically applied to all queries. 

```js
mquery.setGlobalTraceFunction(traceFunction);
```

### mquery.canMerge(conditions)

Determines if `conditions` can be merged using `mquery().merge()`.

```js
var query = mquery({ type: 'drum' });
var okToMerge = mquery.canMerge(anObject);
if (okToMerge) {
  query.merge(anObject);
}
```

## mquery.use$geoWithin

MongoDB 2.4 introduced the `$geoWithin` operator which replaces and is 100% backward compatible with `$within`. As of mquery 0.2, we default to using `$geoWithin` for all `within()` calls.

If you are running MongoDB < 2.4 this will be problematic. To force `mquery` to be backward compatible and always use `$within`, set the `mquery.use$geoWithin` flag to `false`.

```js
mquery.use$geoWithin = false;
```

## Custom Base Queries

Often we want custom base queries that encapsulate predefined criteria. With `mquery` this is easy: first create the query you want to reuse, then call its `toConstructor()` method, which returns a new subclass of `mquery` that retains all options and criteria of the original.

```js
var greatMovies = mquery(movieCollection).where('rating').gte(4.5).toConstructor();

// use it!
greatMovies().count(function (err, n) {
  console.log('There are %d great movies', n);
});

greatMovies().where({ name: /^Life/ }).select('name').find(function (err, docs) {
  console.log(docs);
});
```

## Validation

Method and option combinations are checked for validity at runtime to prevent the creation of invalid query constructs. For example, a `distinct` query does not support specifying options like `hint` or field selection. In this case an error is thrown so you can catch these mistakes in development.

## Debug support

Debug mode is provided through the use of the [debug](https://github.com/visionmedia/debug) module. To enable:

    DEBUG=mquery node yourprogram.js

Read the debug module documentation for more details.

## General compatibility

#### ObjectIds

`mquery` clones query arguments before passing them to a `collection` method for execution.
This prevents accidental side-effects on the objects you pass.
To clone `ObjectIds` we need to make some assumptions.

First, to check whether an object is an `ObjectId`, we check its constructor's name. If it matches either
`ObjectId` or `ObjectID`, we clone it.

To clone an `ObjectId`, we call its optional `clone` method. If a `clone` method does not exist, we fall
back to calling `new obj.constructor(obj.id)`. We assume, for compatibility with the
Node.js driver, that the `ObjectId` instance has a public `id` property and that
when creating an `ObjectId` instance we can pass that `id` as an argument.

#### Read Preferences

`mquery` supports specifying [Read Preferences](http://docs.mongodb.org/manual/applications/replication/#read-preference) to control which MongoDB node your query will read from.
The Read Preferences spec also supports specifying tags. To pass tags, some
drivers (e.g. the Node.js driver) require passing a special constructor that handles both the read preference and its tags.
If you need to specify tags, pass an instance of your driver's ReadPreference constructor or roll your own. `mquery` will store whatever you provide and pass it to your collection later during execution.
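
To make the ObjectId cloning rules described above concrete, here is a rough illustration of that logic. This is a sketch only, not `mquery`'s actual source, and the helper name `cloneObjectId` is hypothetical.

```js
// Illustrative sketch of the cloning rules described under "ObjectIds" above.
// Not mquery's implementation; cloneObjectId is a made-up helper name.
function cloneObjectId(obj) {
  var name = obj && obj.constructor && obj.constructor.name;

  // only values whose constructor is named ObjectId or ObjectID are treated as ObjectIds
  if (name !== 'ObjectId' && name !== 'ObjectID') return obj;

  // prefer the optional clone() method when the implementation provides one
  if (typeof obj.clone === 'function') return obj.clone();

  // otherwise assume a driver-compatible shape: a public `id` property that
  // the constructor accepts as its argument
  return new obj.constructor(obj.id);
}
```

Cloning this way lets `mquery` copy query arguments without mutating the values you pass in.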
+ +## Future goals + + - mongo shell compatibility + - browser compatibility + +## Installation + + $ npm install mquery + +## License + +[MIT](https://github.com/aheckmann/mquery/blob/master/LICENSE) + diff --git a/node_modules/mquery/SECURITY.md b/node_modules/mquery/SECURITY.md new file mode 100644 index 000000000..41b89d834 --- /dev/null +++ b/node_modules/mquery/SECURITY.md @@ -0,0 +1 @@ +Please follow the instructions on [Tidelift's security page](https://tidelift.com/docs/security) to report a security issue. diff --git a/node_modules/mquery/lib/collection/collection.js b/node_modules/mquery/lib/collection/collection.js new file mode 100644 index 000000000..28b69828a --- /dev/null +++ b/node_modules/mquery/lib/collection/collection.js @@ -0,0 +1,47 @@ +'use strict'; + +/** + * methods a collection must implement + */ + +const methods = [ + 'find', + 'findOne', + 'update', + 'updateMany', + 'updateOne', + 'replaceOne', + 'remove', + 'count', + 'distinct', + 'findOneAndDelete', + 'findOneAndUpdate', + 'aggregate', + 'findCursor', + 'deleteOne', + 'deleteMany' +]; + +/** + * Collection base class from which implementations inherit + */ + +function Collection() {} + +for (let i = 0, len = methods.length; i < len; ++i) { + const method = methods[i]; + Collection.prototype[method] = notImplemented(method); +} + +module.exports = exports = Collection; +Collection.methods = methods; + +/** + * creates a function which throws an implementation error + */ + +function notImplemented(method) { + return function() { + throw new Error('collection.' + method + ' not implemented'); + }; +} diff --git a/node_modules/mquery/lib/collection/index.js b/node_modules/mquery/lib/collection/index.js new file mode 100644 index 000000000..4faa03220 --- /dev/null +++ b/node_modules/mquery/lib/collection/index.js @@ -0,0 +1,13 @@ +'use strict'; + +const env = require('../env'); + +if ('unknown' == env.type) { + throw new Error('Unknown environment'); +} + +module.exports = + env.isNode ? require('./node') : + env.isMongo ? 
require('./collection') : + require('./collection'); + diff --git a/node_modules/mquery/lib/collection/node.js b/node_modules/mquery/lib/collection/node.js new file mode 100644 index 000000000..7c6b6f886 --- /dev/null +++ b/node_modules/mquery/lib/collection/node.js @@ -0,0 +1,148 @@ +'use strict'; + +/** + * Module dependencies + */ + +const Collection = require('./collection'); +const utils = require('../utils'); + +function NodeCollection(col) { + this.collection = col; + this.collectionName = col.collectionName; +} + +/** + * inherit from collection base class + */ + +utils.inherits(NodeCollection, Collection); + +/** + * find(match, options, function(err, docs)) + */ + +NodeCollection.prototype.find = function(match, options, cb) { + const cursor = this.collection.find(match, options); + + try { + cursor.toArray(cb); + } catch (error) { + cb(error); + } +}; + +/** + * findOne(match, options, function(err, doc)) + */ + +NodeCollection.prototype.findOne = function(match, options, cb) { + this.collection.findOne(match, options, cb); +}; + +/** + * count(match, options, function(err, count)) + */ + +NodeCollection.prototype.count = function(match, options, cb) { + this.collection.count(match, options, cb); +}; + +/** + * distinct(prop, match, options, function(err, count)) + */ + +NodeCollection.prototype.distinct = function(prop, match, options, cb) { + this.collection.distinct(prop, match, options, cb); +}; + +/** + * update(match, update, options, function(err[, result])) + */ + +NodeCollection.prototype.update = function(match, update, options, cb) { + this.collection.update(match, update, options, cb); +}; + +/** + * update(match, update, options, function(err[, result])) + */ + +NodeCollection.prototype.updateMany = function(match, update, options, cb) { + this.collection.updateMany(match, update, options, cb); +}; + +/** + * update(match, update, options, function(err[, result])) + */ + +NodeCollection.prototype.updateOne = function(match, update, options, cb) { + this.collection.updateOne(match, update, options, cb); +}; + +/** + * replaceOne(match, update, options, function(err[, result])) + */ + +NodeCollection.prototype.replaceOne = function(match, update, options, cb) { + this.collection.replaceOne(match, update, options, cb); +}; + +/** + * deleteOne(match, options, function(err[, result]) + */ + +NodeCollection.prototype.deleteOne = function(match, options, cb) { + this.collection.deleteOne(match, options, cb); +}; + +/** + * deleteMany(match, options, function(err[, result]) + */ + +NodeCollection.prototype.deleteMany = function(match, options, cb) { + this.collection.deleteMany(match, options, cb); +}; + +/** + * remove(match, options, function(err[, result]) + */ + +NodeCollection.prototype.remove = function(match, options, cb) { + this.collection.remove(match, options, cb); +}; + +/** + * findOneAndDelete(match, options, function(err[, result]) + */ + +NodeCollection.prototype.findOneAndDelete = function(match, options, cb) { + this.collection.findOneAndDelete(match, options, cb); +}; + +/** + * findOneAndUpdate(match, update, options, function(err[, result]) + */ + +NodeCollection.prototype.findOneAndUpdate = function(match, update, options, cb) { + this.collection.findOneAndUpdate(match, update, options, cb); +}; + +/** + * var cursor = findCursor(match, options) + */ + +NodeCollection.prototype.findCursor = function(match, options) { + return this.collection.find(match, options); +}; + +/** + * aggregation(operators..., function(err, doc)) + * TODO + */ + +/** + * 
Expose + */ + +module.exports = exports = NodeCollection; diff --git a/node_modules/mquery/lib/env.js b/node_modules/mquery/lib/env.js new file mode 100644 index 000000000..d3d225b7c --- /dev/null +++ b/node_modules/mquery/lib/env.js @@ -0,0 +1,22 @@ +'use strict'; + +exports.isNode = 'undefined' != typeof process + && 'object' == typeof module + && 'object' == typeof global + && 'function' == typeof Buffer + && process.argv; + +exports.isMongo = !exports.isNode + && 'function' == typeof printjson + && 'function' == typeof ObjectId + && 'function' == typeof rs + && 'function' == typeof sh; + +exports.isBrowser = !exports.isNode + && !exports.isMongo + && 'undefined' != typeof window; + +exports.type = exports.isNode ? 'node' + : exports.isMongo ? 'mongo' + : exports.isBrowser ? 'browser' + : 'unknown'; diff --git a/node_modules/mquery/lib/mquery.js b/node_modules/mquery/lib/mquery.js new file mode 100644 index 000000000..628f3e105 --- /dev/null +++ b/node_modules/mquery/lib/mquery.js @@ -0,0 +1,3201 @@ +'use strict'; + +/** + * Dependencies + */ + +const slice = require('sliced'); +const assert = require('assert'); +const util = require('util'); +const utils = require('./utils'); +const debug = require('debug')('mquery'); + +/* global Map */ + +/** + * Query constructor used for building queries. + * + * ####Example: + * + * var query = new Query({ name: 'mquery' }); + * query.setOptions({ collection: moduleCollection }) + * query.where('age').gte(21).exec(callback); + * + * @param {Object} [criteria] + * @param {Object} [options] + * @api public + */ + +function Query(criteria, options) { + if (!(this instanceof Query)) + return new Query(criteria, options); + + const proto = this.constructor.prototype; + + this.op = proto.op || undefined; + + this.options = Object.assign({}, proto.options); + + this._conditions = proto._conditions + ? utils.clone(proto._conditions) + : {}; + + this._fields = proto._fields + ? utils.clone(proto._fields) + : undefined; + + this._update = proto._update + ? utils.clone(proto._update) + : undefined; + + this._path = proto._path || undefined; + this._distinct = proto._distinct || undefined; + this._collection = proto._collection || undefined; + this._traceFunction = proto._traceFunction || undefined; + + if (options) { + this.setOptions(options); + } + + if (criteria) { + if (criteria.find && criteria.remove && criteria.update) { + // quack quack! + this.collection(criteria); + } else { + this.find(criteria); + } + } +} + +/** + * This is a parameter that the user can set which determines if mquery + * uses $within or $geoWithin for queries. It defaults to true which + * means $geoWithin will be used. If using MongoDB < 2.4 you should + * set this to false. + * + * @api public + * @property use$geoWithin + */ + +let $withinCmd = '$geoWithin'; +Object.defineProperty(Query, 'use$geoWithin', { + get: function() { return $withinCmd == '$geoWithin'; }, + set: function(v) { + if (true === v) { + // mongodb >= 2.4 + $withinCmd = '$geoWithin'; + } else { + $withinCmd = '$within'; + } + } +}); + +/** + * Converts this query to a constructor function with all arguments and options retained. + * + * ####Example + * + * // Create a query that will read documents with a "video" category from + * // `aCollection` on the primary node in the replica-set unless it is down, + * // in which case we'll read from a secondary node. 
+ * var query = mquery({ category: 'video' }) + * query.setOptions({ collection: aCollection, read: 'primaryPreferred' }); + * + * // create a constructor based off these settings + * var Video = query.toConstructor(); + * + * // Video is now a subclass of mquery() and works the same way but with the + * // default query parameters and options set. + * + * // run a query with the previous settings but filter for movies with names + * // that start with "Life". + * Video().where({ name: /^Life/ }).exec(cb); + * + * @return {Query} new Query + * @api public + */ + +Query.prototype.toConstructor = function toConstructor() { + function CustomQuery(criteria, options) { + if (!(this instanceof CustomQuery)) + return new CustomQuery(criteria, options); + Query.call(this, criteria, options); + } + + utils.inherits(CustomQuery, Query); + + // set inherited defaults + const p = CustomQuery.prototype; + + p.options = {}; + p.setOptions(this.options); + + p.op = this.op; + p._conditions = utils.clone(this._conditions); + p._fields = utils.clone(this._fields); + p._update = utils.clone(this._update); + p._path = this._path; + p._distinct = this._distinct; + p._collection = this._collection; + p._traceFunction = this._traceFunction; + + return CustomQuery; +}; + +/** + * Sets query options. + * + * ####Options: + * + * - [tailable](http://www.mongodb.org/display/DOCS/Tailable+Cursors) * + * - [sort](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bsort(\)%7D%7D) * + * - [limit](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Blimit%28%29%7D%7D) * + * - [skip](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bskip%28%29%7D%7D) * + * - [maxScan](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24maxScan) * + * - [maxTime](http://docs.mongodb.org/manual/reference/operator/meta/maxTimeMS/#op._S_maxTimeMS) * + * - [batchSize](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7BbatchSize%28%29%7D%7D) * + * - [comment](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24comment) * + * - [snapshot](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bsnapshot%28%29%7D%7D) * + * - [hint](http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24hint) * + * - [slaveOk](http://docs.mongodb.org/manual/applications/replication/#read-preference) * + * - [safe](http://www.mongodb.org/display/DOCS/getLastError+Command) + * - collection the collection to query against + * + * _* denotes a query helper method is also available_ + * + * @param {Object} options + * @api public + */ + +Query.prototype.setOptions = function(options) { + if (!(options && utils.isObject(options))) + return this; + + // set arbitrary options + const methods = utils.keys(options); + let method; + + for (let i = 0; i < methods.length; ++i) { + method = methods[i]; + + // use methods if exist (safer option manipulation) + if ('function' == typeof this[method]) { + const args = utils.isArray(options[method]) + ? options[method] + : [options[method]]; + this[method].apply(this, args); + } else { + this.options[method] = options[method]; + } + } + + return this; +}; + +/** + * Sets this Querys collection. 
+ * + * @param {Collection} coll + * @return {Query} this + */ + +Query.prototype.collection = function collection(coll) { + this._collection = new Query.Collection(coll); + + return this; +}; + +/** + * Adds a collation to this op (MongoDB 3.4 and up) + * + * ####Example + * + * query.find().collation({ locale: "en_US", strength: 1 }) + * + * @param {Object} value + * @return {Query} this + * @see MongoDB docs https://docs.mongodb.com/manual/reference/method/cursor.collation/#cursor.collation + * @api public + */ + +Query.prototype.collation = function(value) { + this.options.collation = value; + return this; +}; + +/** + * Specifies a `$where` condition + * + * Use `$where` when you need to select documents using a JavaScript expression. + * + * ####Example + * + * query.$where('this.comments.length > 10 || this.name.length > 5') + * + * query.$where(function () { + * return this.comments.length > 10 || this.name.length > 5; + * }) + * + * @param {String|Function} js javascript string or function + * @return {Query} this + * @memberOf Query + * @method $where + * @api public + */ + +Query.prototype.$where = function(js) { + this._conditions.$where = js; + return this; +}; + +/** + * Specifies a `path` for use with chaining. + * + * ####Example + * + * // instead of writing: + * User.find({age: {$gte: 21, $lte: 65}}, callback); + * + * // we can instead write: + * User.where('age').gte(21).lte(65); + * + * // passing query conditions is permitted + * User.find().where({ name: 'vonderful' }) + * + * // chaining + * User + * .where('age').gte(21).lte(65) + * .where('name', /^vonderful/i) + * .where('friends').slice(10) + * .exec(callback) + * + * @param {String} [path] + * @param {Object} [val] + * @return {Query} this + * @api public + */ + +Query.prototype.where = function() { + if (!arguments.length) return this; + if (!this.op) this.op = 'find'; + + const type = typeof arguments[0]; + + if ('string' == type) { + this._path = arguments[0]; + + if (2 === arguments.length) { + this._conditions[this._path] = arguments[1]; + } + + return this; + } + + if ('object' == type && !Array.isArray(arguments[0])) { + return this.merge(arguments[0]); + } + + throw new TypeError('path must be a string or object'); +}; + +/** + * Specifies the complementary comparison value for paths specified with `where()` + * + * ####Example + * + * User.where('age').equals(49); + * + * // is the same as + * + * User.where('age', 49); + * + * @param {Object} val + * @return {Query} this + * @api public + */ + +Query.prototype.equals = function equals(val) { + this._ensurePath('equals'); + const path = this._path; + this._conditions[path] = val; + return this; +}; + +/** + * Specifies the complementary comparison value for paths specified with `where()` + * This is alias of `equals` + * + * ####Example + * + * User.where('age').eq(49); + * + * // is the same as + * + * User.shere('age').equals(49); + * + * // is the same as + * + * User.where('age', 49); + * + * @param {Object} val + * @return {Query} this + * @api public + */ + +Query.prototype.eq = function eq(val) { + this._ensurePath('eq'); + const path = this._path; + this._conditions[path] = val; + return this; +}; + +/** + * Specifies arguments for an `$or` condition. 
+ * + * ####Example + * + * query.or([{ color: 'red' }, { status: 'emergency' }]) + * + * @param {Array} array array of conditions + * @return {Query} this + * @api public + */ + +Query.prototype.or = function or(array) { + const or = this._conditions.$or || (this._conditions.$or = []); + if (!utils.isArray(array)) array = [array]; + or.push.apply(or, array); + return this; +}; + +/** + * Specifies arguments for a `$nor` condition. + * + * ####Example + * + * query.nor([{ color: 'green' }, { status: 'ok' }]) + * + * @param {Array} array array of conditions + * @return {Query} this + * @api public + */ + +Query.prototype.nor = function nor(array) { + const nor = this._conditions.$nor || (this._conditions.$nor = []); + if (!utils.isArray(array)) array = [array]; + nor.push.apply(nor, array); + return this; +}; + +/** + * Specifies arguments for a `$and` condition. + * + * ####Example + * + * query.and([{ color: 'green' }, { status: 'ok' }]) + * + * @see $and http://docs.mongodb.org/manual/reference/operator/and/ + * @param {Array} array array of conditions + * @return {Query} this + * @api public + */ + +Query.prototype.and = function and(array) { + const and = this._conditions.$and || (this._conditions.$and = []); + if (!Array.isArray(array)) array = [array]; + and.push.apply(and, array); + return this; +}; + +/** + * Specifies a $gt query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * ####Example + * + * Thing.find().where('age').gt(21) + * + * // or + * Thing.find().gt('age', 21) + * + * @method gt + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a $gte query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method gte + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a $lt query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method lt + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a $lte query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method lte + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a $ne query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method ne + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies an $in query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method in + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies an $nin query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method nin + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies an $all query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method all + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a $size query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. 
+ * + * @method size + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/** + * Specifies a $regex query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method regex + * @memberOf Query + * @param {String} [path] + * @param {String|RegExp} val + * @api public + */ + +/** + * Specifies a $maxDistance query condition. + * + * When called with one argument, the most recent path passed to `where()` is used. + * + * @method maxDistance + * @memberOf Query + * @param {String} [path] + * @param {Number} val + * @api public + */ + +/*! + * gt, gte, lt, lte, ne, in, nin, all, regex, size, maxDistance + * + * Thing.where('type').nin(array) + */ + +'gt gte lt lte ne in nin all regex size maxDistance minDistance'.split(' ').forEach(function($conditional) { + Query.prototype[$conditional] = function() { + let path, val; + + if (1 === arguments.length) { + this._ensurePath($conditional); + val = arguments[0]; + path = this._path; + } else { + val = arguments[1]; + path = arguments[0]; + } + + const conds = this._conditions[path] === null || typeof this._conditions[path] === 'object' ? + this._conditions[path] : + (this._conditions[path] = {}); + conds['$' + $conditional] = val; + return this; + }; +}); + +/** + * Specifies a `$mod` condition + * + * @param {String} [path] + * @param {Number} val + * @return {Query} this + * @api public + */ + +Query.prototype.mod = function() { + let val, path; + + if (1 === arguments.length) { + this._ensurePath('mod'); + val = arguments[0]; + path = this._path; + } else if (2 === arguments.length && !utils.isArray(arguments[1])) { + this._ensurePath('mod'); + val = slice(arguments); + path = this._path; + } else if (3 === arguments.length) { + val = slice(arguments, 1); + path = arguments[0]; + } else { + val = arguments[1]; + path = arguments[0]; + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds.$mod = val; + return this; +}; + +/** + * Specifies an `$exists` condition + * + * ####Example + * + * // { name: { $exists: true }} + * Thing.where('name').exists() + * Thing.where('name').exists(true) + * Thing.find().exists('name') + * + * // { name: { $exists: false }} + * Thing.where('name').exists(false); + * Thing.find().exists('name', false); + * + * @param {String} [path] + * @param {Number} val + * @return {Query} this + * @api public + */ + +Query.prototype.exists = function() { + let path, val; + + if (0 === arguments.length) { + this._ensurePath('exists'); + path = this._path; + val = true; + } else if (1 === arguments.length) { + if ('boolean' === typeof arguments[0]) { + this._ensurePath('exists'); + path = this._path; + val = arguments[0]; + } else { + path = arguments[0]; + val = true; + } + } else if (2 === arguments.length) { + path = arguments[0]; + val = arguments[1]; + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds.$exists = val; + return this; +}; + +/** + * Specifies an `$elemMatch` condition + * + * ####Example + * + * query.elemMatch('comment', { author: 'autobot', votes: {$gte: 5}}) + * + * query.where('comment').elemMatch({ author: 'autobot', votes: {$gte: 5}}) + * + * query.elemMatch('comment', function (elem) { + * elem.where('author').equals('autobot'); + * elem.where('votes').gte(5); + * }) + * + * query.where('comment').elemMatch(function (elem) { + * elem.where({ author: 'autobot' }); + * elem.where('votes').gte(5); + * }) + * + * @param {String|Object|Function} 
path + * @param {Object|Function} criteria + * @return {Query} this + * @api public + */ + +Query.prototype.elemMatch = function() { + if (null == arguments[0]) + throw new TypeError('Invalid argument'); + + let fn, path, criteria; + + if ('function' === typeof arguments[0]) { + this._ensurePath('elemMatch'); + path = this._path; + fn = arguments[0]; + } else if (utils.isObject(arguments[0])) { + this._ensurePath('elemMatch'); + path = this._path; + criteria = arguments[0]; + } else if ('function' === typeof arguments[1]) { + path = arguments[0]; + fn = arguments[1]; + } else if (arguments[1] && utils.isObject(arguments[1])) { + path = arguments[0]; + criteria = arguments[1]; + } else { + throw new TypeError('Invalid argument'); + } + + if (fn) { + criteria = new Query; + fn(criteria); + criteria = criteria._conditions; + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds.$elemMatch = criteria; + return this; +}; + +// Spatial queries + +/** + * Sugar for geo-spatial queries. + * + * ####Example + * + * query.within().box() + * query.within().circle() + * query.within().geometry() + * + * query.where('loc').within({ center: [50,50], radius: 10, unique: true, spherical: true }); + * query.where('loc').within({ box: [[40.73, -73.9], [40.7, -73.988]] }); + * query.where('loc').within({ polygon: [[],[],[],[]] }); + * + * query.where('loc').within([], [], []) // polygon + * query.where('loc').within([], []) // box + * query.where('loc').within({ type: 'LineString', coordinates: [...] }); // geometry + * + * ####NOTE: + * + * Must be used after `where()`. + * + * @memberOf Query + * @return {Query} this + * @api public + */ + +Query.prototype.within = function within() { + // opinionated, must be used after where + this._ensurePath('within'); + this._geoComparison = $withinCmd; + + if (0 === arguments.length) { + return this; + } + + if (2 === arguments.length) { + return this.box.apply(this, arguments); + } else if (2 < arguments.length) { + return this.polygon.apply(this, arguments); + } + + const area = arguments[0]; + + if (!area) + throw new TypeError('Invalid argument'); + + if (area.center) + return this.circle(area); + + if (area.box) + return this.box.apply(this, area.box); + + if (area.polygon) + return this.polygon.apply(this, area.polygon); + + if (area.type && area.coordinates) + return this.geometry(area); + + throw new TypeError('Invalid argument'); +}; + +/** + * Specifies a $box condition + * + * ####Example + * + * var lowerLeft = [40.73083, -73.99756] + * var upperRight= [40.741404, -73.988135] + * + * query.where('loc').within().box(lowerLeft, upperRight) + * query.box('loc', lowerLeft, upperRight ) + * + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @see Query#within #query_Query-within + * @param {String} path + * @param {Object} val + * @return {Query} this + * @api public + */ + +Query.prototype.box = function() { + let path, box; + + if (3 === arguments.length) { + // box('loc', [], []) + path = arguments[0]; + box = [arguments[1], arguments[2]]; + } else if (2 === arguments.length) { + // box([], []) + this._ensurePath('box'); + path = this._path; + box = [arguments[0], arguments[1]]; + } else { + throw new TypeError('Invalid argument'); + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds[this._geoComparison || $withinCmd] = { $box: box }; + return this; +}; + +/** + * Specifies a $polygon condition + * + * ####Example + * + * query.where('loc').within().polygon([10,20], [13, 25], 
[7,15]) + * query.polygon('loc', [10,20], [13, 25], [7,15]) + * + * @param {String|Array} [path] + * @param {Array|Object} [val] + * @return {Query} this + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @api public + */ + +Query.prototype.polygon = function() { + let val, path; + + if ('string' == typeof arguments[0]) { + // polygon('loc', [],[],[]) + path = arguments[0]; + val = slice(arguments, 1); + } else { + // polygon([],[],[]) + this._ensurePath('polygon'); + path = this._path; + val = slice(arguments); + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds[this._geoComparison || $withinCmd] = { $polygon: val }; + return this; +}; + +/** + * Specifies a $center or $centerSphere condition. + * + * ####Example + * + * var area = { center: [50, 50], radius: 10, unique: true } + * query.where('loc').within().circle(area) + * query.center('loc', area); + * + * // for spherical calculations + * var area = { center: [50, 50], radius: 10, unique: true, spherical: true } + * query.where('loc').within().circle(area) + * query.center('loc', area); + * + * @param {String} [path] + * @param {Object} area + * @return {Query} this + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @api public + */ + +Query.prototype.circle = function() { + let path, val; + + if (1 === arguments.length) { + this._ensurePath('circle'); + path = this._path; + val = arguments[0]; + } else if (2 === arguments.length) { + path = arguments[0]; + val = arguments[1]; + } else { + throw new TypeError('Invalid argument'); + } + + if (!('radius' in val && val.center)) + throw new Error('center and radius are required'); + + const conds = this._conditions[path] || (this._conditions[path] = {}); + + const type = val.spherical + ? '$centerSphere' + : '$center'; + + const wKey = this._geoComparison || $withinCmd; + conds[wKey] = {}; + conds[wKey][type] = [val.center, val.radius]; + + if ('unique' in val) + conds[wKey].$uniqueDocs = !!val.unique; + + return this; +}; + +/** + * Specifies a `$near` or `$nearSphere` condition + * + * These operators return documents sorted by distance. + * + * ####Example + * + * query.where('loc').near({ center: [10, 10] }); + * query.where('loc').near({ center: [10, 10], maxDistance: 5 }); + * query.where('loc').near({ center: [10, 10], maxDistance: 5, spherical: true }); + * query.near('loc', { center: [10, 10], maxDistance: 5 }); + * query.near({ center: { type: 'Point', coordinates: [..] }}) + * query.near().geometry({ type: 'Point', coordinates: [..] }) + * + * @param {String} [path] + * @param {Object} val + * @return {Query} this + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @api public + */ + +Query.prototype.near = function near() { + let path, val; + + this._geoComparison = '$near'; + + if (0 === arguments.length) { + return this; + } else if (1 === arguments.length) { + this._ensurePath('near'); + path = this._path; + val = arguments[0]; + } else if (2 === arguments.length) { + path = arguments[0]; + val = arguments[1]; + } else { + throw new TypeError('Invalid argument'); + } + + if (!val.center) { + throw new Error('center is required'); + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + + const type = val.spherical + ? '$nearSphere' + : '$near'; + + // center could be a GeoJSON object or an Array + if (Array.isArray(val.center)) { + conds[type] = val.center; + + const radius = 'maxDistance' in val + ? 
val.maxDistance + : null; + + if (null != radius) { + conds.$maxDistance = radius; + } + if (null != val.minDistance) { + conds.$minDistance = val.minDistance; + } + } else { + // GeoJSON? + if (val.center.type != 'Point' || !Array.isArray(val.center.coordinates)) { + throw new Error(util.format('Invalid GeoJSON specified for %s', type)); + } + conds[type] = { $geometry: val.center }; + + // MongoDB 2.6 insists on maxDistance being in $near / $nearSphere + if ('maxDistance' in val) { + conds[type]['$maxDistance'] = val.maxDistance; + } + if ('minDistance' in val) { + conds[type]['$minDistance'] = val.minDistance; + } + } + + return this; +}; + +/** + * Declares an intersects query for `geometry()`. + * + * ####Example + * + * query.where('path').intersects().geometry({ + * type: 'LineString' + * , coordinates: [[180.0, 11.0], [180, 9.0]] + * }) + * + * query.where('path').intersects({ + * type: 'LineString' + * , coordinates: [[180.0, 11.0], [180, 9.0]] + * }) + * + * @param {Object} [arg] + * @return {Query} this + * @api public + */ + +Query.prototype.intersects = function intersects() { + // opinionated, must be used after where + this._ensurePath('intersects'); + + this._geoComparison = '$geoIntersects'; + + if (0 === arguments.length) { + return this; + } + + const area = arguments[0]; + + if (null != area && area.type && area.coordinates) + return this.geometry(area); + + throw new TypeError('Invalid argument'); +}; + +/** + * Specifies a `$geometry` condition + * + * ####Example + * + * var polyA = [[[ 10, 20 ], [ 10, 40 ], [ 30, 40 ], [ 30, 20 ]]] + * query.where('loc').within().geometry({ type: 'Polygon', coordinates: polyA }) + * + * // or + * var polyB = [[ 0, 0 ], [ 1, 1 ]] + * query.where('loc').within().geometry({ type: 'LineString', coordinates: polyB }) + * + * // or + * var polyC = [ 0, 0 ] + * query.where('loc').within().geometry({ type: 'Point', coordinates: polyC }) + * + * // or + * query.where('loc').intersects().geometry({ type: 'Point', coordinates: polyC }) + * + * ####NOTE: + * + * `geometry()` **must** come after either `intersects()` or `within()`. + * + * The `object` argument must contain `type` and `coordinates` properties. + * - type {String} + * - coordinates {Array} + * + * The most recent path passed to `where()` is used. + * + * @param {Object} object Must contain a `type` property which is a String and a `coordinates` property which is an Array. See the examples. 
+ * @return {Query} this + * @see http://docs.mongodb.org/manual/release-notes/2.4/#new-geospatial-indexes-with-geojson-and-improved-spherical-geometry + * @see http://www.mongodb.org/display/DOCS/Geospatial+Indexing + * @see $geometry http://docs.mongodb.org/manual/reference/operator/geometry/ + * @api public + */ + +Query.prototype.geometry = function geometry() { + if (!('$within' == this._geoComparison || + '$geoWithin' == this._geoComparison || + '$near' == this._geoComparison || + '$geoIntersects' == this._geoComparison)) { + throw new Error('geometry() must come after `within()`, `intersects()`, or `near()'); + } + + let val, path; + + if (1 === arguments.length) { + this._ensurePath('geometry'); + path = this._path; + val = arguments[0]; + } else { + throw new TypeError('Invalid argument'); + } + + if (!(val.type && Array.isArray(val.coordinates))) { + throw new TypeError('Invalid argument'); + } + + const conds = this._conditions[path] || (this._conditions[path] = {}); + conds[this._geoComparison] = { $geometry: val }; + + return this; +}; + +// end spatial + +/** + * Specifies which document fields to include or exclude + * + * ####String syntax + * + * When passing a string, prefixing a path with `-` will flag that path as excluded. When a path does not have the `-` prefix, it is included. + * + * ####Example + * + * // include a and b, exclude c + * query.select('a b -c'); + * + * // or you may use object notation, useful when + * // you have keys already prefixed with a "-" + * query.select({a: 1, b: 1, c: 0}); + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @param {Object|String} arg + * @return {Query} this + * @see SchemaType + * @api public + */ + +Query.prototype.select = function select() { + let arg = arguments[0]; + if (!arg) return this; + + if (arguments.length !== 1) { + throw new Error('Invalid select: select only takes 1 argument'); + } + + this._validate('select'); + + const fields = this._fields || (this._fields = {}); + const type = typeof arg; + let i, len; + + if (('string' == type || utils.isArgumentsObject(arg)) && + 'number' == typeof arg.length || Array.isArray(arg)) { + if ('string' == type) + arg = arg.split(/\s+/); + + for (i = 0, len = arg.length; i < len; ++i) { + let field = arg[i]; + if (!field) continue; + const include = '-' == field[0] ? 0 : 1; + if (include === 0) field = field.substring(1); + fields[field] = include; + } + + return this; + } + + if (utils.isObject(arg)) { + const keys = utils.keys(arg); + for (i = 0; i < keys.length; ++i) { + fields[keys[i]] = arg[keys[i]]; + } + return this; + } + + throw new TypeError('Invalid select() argument. 
Must be string or object.'); +}; + +/** + * Specifies a $slice condition for a `path` + * + * ####Example + * + * query.slice('comments', 5) + * query.slice('comments', -5) + * query.slice('comments', [10, 5]) + * query.where('comments').slice(5) + * query.where('comments').slice([-10, 5]) + * + * @param {String} [path] + * @param {Number} val number/range of elements to slice + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/Retrieving+a+Subset+of+Fields#RetrievingaSubsetofFields-RetrievingaSubrangeofArrayElements + * @api public + */ + +Query.prototype.slice = function() { + if (0 === arguments.length) + return this; + + this._validate('slice'); + + let path, val; + + if (1 === arguments.length) { + const arg = arguments[0]; + if (typeof arg === 'object' && !Array.isArray(arg)) { + const keys = Object.keys(arg); + const numKeys = keys.length; + for (let i = 0; i < numKeys; ++i) { + this.slice(keys[i], arg[keys[i]]); + } + return this; + } + this._ensurePath('slice'); + path = this._path; + val = arguments[0]; + } else if (2 === arguments.length) { + if ('number' === typeof arguments[0]) { + this._ensurePath('slice'); + path = this._path; + val = slice(arguments); + } else { + path = arguments[0]; + val = arguments[1]; + } + } else if (3 === arguments.length) { + path = arguments[0]; + val = slice(arguments, 1); + } + + const myFields = this._fields || (this._fields = {}); + myFields[path] = { $slice: val }; + return this; +}; + +/** + * Sets the sort order + * + * If an object is passed, values allowed are 'asc', 'desc', 'ascending', 'descending', 1, and -1. + * + * If a string is passed, it must be a space delimited list of path names. The sort order of each path is ascending unless the path name is prefixed with `-` which will be treated as descending. + * + * ####Example + * + * // these are equivalent + * query.sort({ field: 'asc', test: -1 }); + * query.sort('field -test'); + * query.sort([['field', 1], ['test', -1]]); + * + * ####Note + * + * - The array syntax `.sort([['field', 1], ['test', -1]])` can only be used with [mongodb driver >= 2.0.46](https://github.com/mongodb/node-mongodb-native/blob/2.1/HISTORY.md#2046-2015-10-15). + * - Cannot be used with `distinct()` + * + * @param {Object|String|Array} arg + * @return {Query} this + * @api public + */ + +Query.prototype.sort = function(arg) { + if (!arg) return this; + let i, len, field; + + this._validate('sort'); + + const type = typeof arg; + + // .sort([['field', 1], ['test', -1]]) + if (Array.isArray(arg)) { + len = arg.length; + for (i = 0; i < arg.length; ++i) { + if (!Array.isArray(arg[i])) { + throw new Error('Invalid sort() argument, must be array of arrays'); + } + _pushArr(this.options, arg[i][0], arg[i][1]); + } + return this; + } + + // .sort('field -test') + if (1 === arguments.length && 'string' == type) { + arg = arg.split(/\s+/); + len = arg.length; + for (i = 0; i < len; ++i) { + field = arg[i]; + if (!field) continue; + const ascend = '-' == field[0] ? -1 : 1; + if (ascend === -1) field = field.substring(1); + push(this.options, field, ascend); + } + + return this; + } + + // .sort({ field: 1, test: -1 }) + if (utils.isObject(arg)) { + const keys = utils.keys(arg); + for (i = 0; i < keys.length; ++i) { + field = keys[i]; + push(this.options, field, arg[field]); + } + + return this; + } + + if (typeof Map !== 'undefined' && arg instanceof Map) { + _pushMap(this.options, arg); + return this; + } + throw new TypeError('Invalid sort() argument. 
Must be a string, object, or array.'); +}; + +/*! + * @ignore + */ + +const _validSortValue = { + 1: 1, + '-1': -1, + asc: 1, + ascending: 1, + desc: -1, + descending: -1 +}; + +function push(opts, field, value) { + if (Array.isArray(opts.sort)) { + throw new TypeError('Can\'t mix sort syntaxes. Use either array or object:' + + '\n- `.sort([[\'field\', 1], [\'test\', -1]])`' + + '\n- `.sort({ field: 1, test: -1 })`'); + } + + let s; + if (value && value.$meta) { + s = opts.sort || (opts.sort = {}); + s[field] = { $meta: value.$meta }; + return; + } + + s = opts.sort || (opts.sort = {}); + let val = String(value || 1).toLowerCase(); + val = _validSortValue[val]; + if (!val) throw new TypeError('Invalid sort value: { ' + field + ': ' + value + ' }'); + + s[field] = val; +} + +function _pushArr(opts, field, value) { + opts.sort = opts.sort || []; + if (!Array.isArray(opts.sort)) { + throw new TypeError('Can\'t mix sort syntaxes. Use either array or object:' + + '\n- `.sort([[\'field\', 1], [\'test\', -1]])`' + + '\n- `.sort({ field: 1, test: -1 })`'); + } + + let val = String(value || 1).toLowerCase(); + val = _validSortValue[val]; + if (!val) throw new TypeError('Invalid sort value: [ ' + field + ', ' + value + ' ]'); + + opts.sort.push([field, val]); +} + +function _pushMap(opts, map) { + opts.sort = opts.sort || new Map(); + if (!(opts.sort instanceof Map)) { + throw new TypeError('Can\'t mix sort syntaxes. Use either array or ' + + 'object or map consistently'); + } + map.forEach(function(value, key) { + let val = String(value || 1).toLowerCase(); + val = _validSortValue[val]; + if (!val) throw new TypeError('Invalid sort value: < ' + key + ': ' + value + ' >'); + + opts.sort.set(key, val); + }); +} + + + +/** + * Specifies the limit option. + * + * ####Example + * + * query.limit(20) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method limit + * @memberOf Query + * @param {Number} val + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Blimit%28%29%7D%7D + * @api public + */ +/** + * Specifies the skip option. + * + * ####Example + * + * query.skip(100).limit(20) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method skip + * @memberOf Query + * @param {Number} val + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bskip%28%29%7D%7D + * @api public + */ +/** + * Specifies the maxScan option. + * + * ####Example + * + * query.maxScan(100) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method maxScan + * @memberOf Query + * @param {Number} val + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24maxScan + * @api public + */ +/** + * Specifies the batchSize option. + * + * ####Example + * + * query.batchSize(100) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method batchSize + * @memberOf Query + * @param {Number} val + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7BbatchSize%28%29%7D%7D + * @api public + */ +/** + * Specifies the `comment` option. + * + * ####Example + * + * query.comment('login query') + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @method comment + * @memberOf Query + * @param {Number} val + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24comment + * @api public + */ + +/*! + * limit, skip, maxScan, batchSize, comment + * + * Sets these associated options. 
+ * + * query.comment('feed query'); + */ + +['limit', 'skip', 'maxScan', 'batchSize', 'comment'].forEach(function(method) { + Query.prototype[method] = function(v) { + this._validate(method); + this.options[method] = v; + return this; + }; +}); + +/** + * Specifies the maxTimeMS option. + * + * ####Example + * + * query.maxTime(100) + * query.maxTimeMS(100) + * + * @method maxTime + * @memberOf Query + * @param {Number} ms + * @see mongodb http://docs.mongodb.org/manual/reference/operator/meta/maxTimeMS/#op._S_maxTimeMS + * @api public + */ + +Query.prototype.maxTime = Query.prototype.maxTimeMS = function(ms) { + this._validate('maxTime'); + this.options.maxTimeMS = ms; + return this; +}; + +/** + * Specifies this query as a `snapshot` query. + * + * ####Example + * + * mquery().snapshot() // true + * mquery().snapshot(true) + * mquery().snapshot(false) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%7B%7Bsnapshot%28%29%7D%7D + * @return {Query} this + * @api public + */ + +Query.prototype.snapshot = function() { + this._validate('snapshot'); + + this.options.snapshot = arguments.length + ? !!arguments[0] + : true; + + return this; +}; + +/** + * Sets query hints. + * + * ####Example + * + * query.hint({ indexA: 1, indexB: -1}); + * query.hint('indexA_1_indexB_1'); + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @param {Object|string} val a hint object or the index name + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/Advanced+Queries#AdvancedQueries-%24hint + * @api public + */ + +Query.prototype.hint = function() { + if (0 === arguments.length) return this; + + this._validate('hint'); + + const arg = arguments[0]; + if (utils.isObject(arg)) { + const hint = this.options.hint || (this.options.hint = {}); + + // must keep object keys in order so don't use Object.keys() + for (const k in arg) { + hint[k] = arg[k]; + } + + return this; + } + if (typeof arg === 'string') { + this.options.hint = arg; + return this; + } + + throw new TypeError('Invalid hint. ' + arg); +}; + +/** + * Requests acknowledgement that this operation has been persisted to MongoDB's + * on-disk journal. + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to the `j` value if it is specified in writeConcern options + * + * ####Example: + * + * mquery().w(2).j(true).wtimeout(2000); + * + * @method j + * @memberOf Query + * @instance + * @param {boolean} val + * @see mongodb https://docs.mongodb.com/manual/reference/write-concern/#j-option + * @return {Query} this + * @api public + */ + +Query.prototype.j = function j(val) { + this.options.j = val; + return this; +}; + +/** + * Sets the slaveOk option. _Deprecated_ in MongoDB 2.2 in favor of read preferences. + * + * ####Example: + * + * query.slaveOk() // true + * query.slaveOk(true) + * query.slaveOk(false) + * + * @deprecated use read() preferences instead if on mongodb >= 2.2 + * @param {Boolean} v defaults to true + * @see mongodb http://docs.mongodb.org/manual/applications/replication/#read-preference + * @see read() + * @return {Query} this + * @api public + */ + +Query.prototype.slaveOk = function(v) { + this.options.slaveOk = arguments.length ? 
!!v : true; + return this; +}; + +/** + * Sets the readPreference option for the query. + * + * ####Example: + * + * new Query().read('primary') + * new Query().read('p') // same as primary + * + * new Query().read('primaryPreferred') + * new Query().read('pp') // same as primaryPreferred + * + * new Query().read('secondary') + * new Query().read('s') // same as secondary + * + * new Query().read('secondaryPreferred') + * new Query().read('sp') // same as secondaryPreferred + * + * new Query().read('nearest') + * new Query().read('n') // same as nearest + * + * // you can also use mongodb.ReadPreference class to also specify tags + * new Query().read(mongodb.ReadPreference('secondary', [{ dc:'sf', s: 1 },{ dc:'ma', s: 2 }])) + * + * new Query().setReadPreference('primary') // alias of .read() + * + * ####Preferences: + * + * primary - (default) Read from primary only. Operations will produce an error if primary is unavailable. Cannot be combined with tags. + * secondary Read from secondary if available, otherwise error. + * primaryPreferred Read from primary if available, otherwise a secondary. + * secondaryPreferred Read from a secondary if available, otherwise read from the primary. + * nearest All operations read from among the nearest candidates, but unlike other modes, this option will include both the primary and all secondaries in the random selection. + * + * Aliases + * + * p primary + * pp primaryPreferred + * s secondary + * sp secondaryPreferred + * n nearest + * + * Read more about how to use read preferences [here](http://docs.mongodb.org/manual/applications/replication/#read-preference) and [here](http://mongodb.github.com/node-mongodb-native/driver-articles/anintroductionto1_1and2_2.html#read-preferences). + * + * @param {String|ReadPreference} pref one of the listed preference options or their aliases + * @see mongodb http://docs.mongodb.org/manual/applications/replication/#read-preference + * @see driver http://mongodb.github.com/node-mongodb-native/driver-articles/anintroductionto1_1and2_2.html#read-preferences + * @return {Query} this + * @api public + */ + +Query.prototype.read = Query.prototype.setReadPreference = function(pref) { + if (arguments.length > 1 && !Query.prototype.read.deprecationWarningIssued) { + console.error('Deprecation warning: \'tags\' argument is not supported anymore in Query.read() method. Please use mongodb.ReadPreference object instead.'); + Query.prototype.read.deprecationWarningIssued = true; + } + this.options.readPreference = utils.readPref(pref); + return this; +}; + +/** + * Sets the readConcern option for the query. + * + * ####Example: + * + * new Query().readConcern('local') + * new Query().readConcern('l') // same as local + * + * new Query().readConcern('available') + * new Query().readConcern('a') // same as available + * + * new Query().readConcern('majority') + * new Query().readConcern('m') // same as majority + * + * new Query().readConcern('linearizable') + * new Query().readConcern('lz') // same as linearizable + * + * new Query().readConcern('snapshot') + * new Query().readConcern('s') // same as snapshot + * + * new Query().r('s') // r is alias of readConcern + * + * + * ####Read Concern Level: + * + * local MongoDB 3.2+ The query returns from the instance with no guarantee guarantee that the data has been written to a majority of the replica set members (i.e. may be rolled back). 
+ * available MongoDB 3.6+ The query returns from the instance with no guarantee guarantee that the data has been written to a majority of the replica set members (i.e. may be rolled back). + * majority MongoDB 3.2+ The query returns the data that has been acknowledged by a majority of the replica set members. The documents returned by the read operation are durable, even in the event of failure. + * linearizable MongoDB 3.4+ The query returns data that reflects all successful majority-acknowledged writes that completed prior to the start of the read operation. The query may wait for concurrently executing writes to propagate to a majority of replica set members before returning results. + * snapshot MongoDB 4.0+ Only available for operations within multi-document transactions. Upon transaction commit with write concern "majority", the transaction operations are guaranteed to have read from a snapshot of majority-committed data. + + + * + * + * Aliases + * + * l local + * a available + * m majority + * lz linearizable + * s snapshot + * + * Read more about how to use read concern [here](https://docs.mongodb.com/manual/reference/read-concern/). + * + * @param {String} level one of the listed read concern level or their aliases + * @see mongodb https://docs.mongodb.com/manual/reference/read-concern/ + * @return {Query} this + * @api public + */ + +Query.prototype.readConcern = Query.prototype.r = function(level) { + this.options.readConcern = utils.readConcern(level); + return this; +}; + +/** + * Sets tailable option. + * + * ####Example + * + * query.tailable() <== true + * query.tailable(true) + * query.tailable(false) + * + * ####Note + * + * Cannot be used with `distinct()` + * + * @param {Boolean} v defaults to true + * @see mongodb http://www.mongodb.org/display/DOCS/Tailable+Cursors + * @api public + */ + +Query.prototype.tailable = function() { + this._validate('tailable'); + + this.options.tailable = arguments.length + ? !!arguments[0] + : true; + + return this; +}; + +/** + * Sets the specified number of `mongod` servers, or tag set of `mongod` servers, + * that must acknowledge this write before this write is considered successful. + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to the `w` value if it is specified in writeConcern options + * + * ####Example: + * + * mquery().writeConcern(0) + * mquery().writeConcern(1) + * mquery().writeConcern({ w: 1, j: true, wtimeout: 2000 }) + * mquery().writeConcern('majority') + * mquery().writeConcern('m') // same as majority + * mquery().writeConcern('tagSetName') // if the tag set is 'm', use .writeConcern({ w: 'm' }) instead + * mquery().w(1) // w is alias of writeConcern + * + * @method writeConcern + * @memberOf Query + * @instance + * @param {String|number|object} concern 0 for fire-and-forget, 1 for acknowledged by one server, 'majority' for majority of the replica set, or [any of the more advanced options](https://docs.mongodb.com/manual/reference/write-concern/#w-option). 
+ * @see mongodb https://docs.mongodb.com/manual/reference/write-concern/#w-option + * @return {Query} this + * @api public + */ + +Query.prototype.writeConcern = Query.prototype.w = function writeConcern(concern) { + if ('object' === typeof concern) { + if ('undefined' !== typeof concern.j) this.options.j = concern.j; + if ('undefined' !== typeof concern.w) this.options.w = concern.w; + if ('undefined' !== typeof concern.wtimeout) this.options.wtimeout = concern.wtimeout; + } else { + this.options.w = 'm' === concern ? 'majority' : concern; + } + return this; +}; + +/** + * Specifies a time limit, in milliseconds, for the write concern. + * If `ms > 1`, it is maximum amount of time to wait for this write + * to propagate through the replica set before this operation fails. + * The default is `0`, which means no timeout. + * + * This option is only valid for operations that write to the database: + * + * - `deleteOne()` + * - `deleteMany()` + * - `findOneAndDelete()` + * - `findOneAndUpdate()` + * - `remove()` + * - `update()` + * - `updateOne()` + * - `updateMany()` + * + * Defaults to `wtimeout` value if it is specified in writeConcern + * + * ####Example: + * + * mquery().w(2).j(true).wtimeout(2000) + * + * @method wtimeout + * @memberOf Query + * @instance + * @param {number} ms number of milliseconds to wait + * @see mongodb https://docs.mongodb.com/manual/reference/write-concern/#wtimeout + * @return {Query} this + * @api public + */ + +Query.prototype.wtimeout = Query.prototype.wTimeout = function wtimeout(ms) { + this.options.wtimeout = ms; + return this; +}; + +/** + * Merges another Query or conditions object into this one. + * + * When a Query is passed, conditions, field selection and options are merged. + * + * @param {Query|Object} source + * @return {Query} this + */ + +Query.prototype.merge = function(source) { + if (!source) + return this; + + if (!Query.canMerge(source)) + throw new TypeError('Invalid argument. Expected instanceof mquery or plain object'); + + if (source instanceof Query) { + // if source has a feature, apply it to ourselves + + if (source._conditions) { + utils.merge(this._conditions, source._conditions); + } + + if (source._fields) { + this._fields || (this._fields = {}); + utils.merge(this._fields, source._fields); + } + + if (source.options) { + this.options || (this.options = {}); + utils.merge(this.options, source.options); + } + + if (source._update) { + this._update || (this._update = {}); + utils.mergeClone(this._update, source._update); + } + + if (source._distinct) { + this._distinct = source._distinct; + } + + return this; + } + + // plain object + utils.merge(this._conditions, source); + + return this; +}; + +/** + * Finds documents. + * + * Passing a `callback` executes the query. 
+ * + * ####Example + * + * query.find() + * query.find(callback) + * query.find({ name: 'Burning Lights' }, callback) + * + * @param {Object} [criteria] mongodb selector + * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.find = function(criteria, callback) { + this.op = 'find'; + + if ('function' === typeof criteria) { + callback = criteria; + criteria = undefined; + } else if (Query.canMerge(criteria)) { + this.merge(criteria); + } + + if (!callback) return this; + + const conds = this._conditions; + const options = this._optionsForExec(); + + if (this.$useProjection) { + options.projection = this._fieldsForExec(); + } else { + options.fields = this._fieldsForExec(); + } + + debug('find', this._collection.collectionName, conds, options); + callback = this._wrapCallback('find', callback, { + conditions: conds, + options: options + }); + + this._collection.find(conds, options, utils.tick(callback)); + return this; +}; + +/** + * Returns the query cursor + * + * ####Examples + * + * query.find().cursor(); + * query.cursor({ name: 'Burning Lights' }); + * + * @param {Object} [criteria] mongodb selector + * @return {Object} cursor + * @api public + */ + +Query.prototype.cursor = function cursor(criteria) { + if (this.op) { + if (this.op !== 'find') { + throw new TypeError('.cursor only support .find method'); + } + } else { + this.find(criteria); + } + + const conds = this._conditions; + const options = this._optionsForExec(); + + if (this.$useProjection) { + options.projection = this._fieldsForExec(); + } else { + options.fields = this._fieldsForExec(); + } + + debug('findCursor', this._collection.collectionName, conds, options); + return this._collection.findCursor(conds, options); +}; + +/** + * Executes the query as a findOne() operation. + * + * Passing a `callback` executes the query. + * + * ####Example + * + * query.findOne().where('name', /^Burning/); + * + * query.findOne({ name: /^Burning/ }) + * + * query.findOne({ name: /^Burning/ }, callback); // executes + * + * query.findOne(function (err, doc) { + * if (err) return handleError(err); + * if (doc) { + * // doc may be null if no document matched + * + * } + * }); + * + * @param {Object|Query} [criteria] mongodb selector + * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.findOne = function(criteria, callback) { + this.op = 'findOne'; + + if ('function' === typeof criteria) { + callback = criteria; + criteria = undefined; + } else if (Query.canMerge(criteria)) { + this.merge(criteria); + } + + if (!callback) return this; + + const conds = this._conditions; + const options = this._optionsForExec(); + + if (this.$useProjection) { + options.projection = this._fieldsForExec(); + } else { + options.fields = this._fieldsForExec(); + } + + debug('findOne', this._collection.collectionName, conds, options); + callback = this._wrapCallback('findOne', callback, { + conditions: conds, + options: options + }); + + this._collection.findOne(conds, options, utils.tick(callback)); + + return this; +}; + +/** + * Exectues the query as a count() operation. + * + * Passing a `callback` executes the query. 
+ * + * ####Example + * + * query.count().where('color', 'black').exec(callback); + * + * query.count({ color: 'black' }).count(callback) + * + * query.count({ color: 'black' }, callback) + * + * query.where('color', 'black').count(function (err, count) { + * if (err) return handleError(err); + * console.log('there are %d kittens', count); + * }) + * + * @param {Object} [criteria] mongodb selector + * @param {Function} [callback] + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/Aggregation#Aggregation-Count + * @api public + */ + +Query.prototype.count = function(criteria, callback) { + this.op = 'count'; + this._validate(); + + if ('function' === typeof criteria) { + callback = criteria; + criteria = undefined; + } else if (Query.canMerge(criteria)) { + this.merge(criteria); + } + + if (!callback) return this; + + const conds = this._conditions, + options = this._optionsForExec(); + + debug('count', this._collection.collectionName, conds, options); + callback = this._wrapCallback('count', callback, { + conditions: conds, + options: options + }); + + this._collection.count(conds, options, utils.tick(callback)); + return this; +}; + +/** + * Declares or executes a distinct() operation. + * + * Passing a `callback` executes the query. + * + * ####Example + * + * distinct(criteria, field, fn) + * distinct(criteria, field) + * distinct(field, fn) + * distinct(field) + * distinct(fn) + * distinct() + * + * @param {Object|Query} [criteria] + * @param {String} [field] + * @param {Function} [callback] + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/Aggregation#Aggregation-Distinct + * @api public + */ + +Query.prototype.distinct = function(criteria, field, callback) { + this.op = 'distinct'; + this._validate(); + + if (!callback) { + switch (typeof field) { + case 'function': + callback = field; + if ('string' == typeof criteria) { + field = criteria; + criteria = undefined; + } + break; + case 'undefined': + case 'string': + break; + default: + throw new TypeError('Invalid `field` argument. Must be string or function'); + } + + switch (typeof criteria) { + case 'function': + callback = criteria; + criteria = field = undefined; + break; + case 'string': + field = criteria; + criteria = undefined; + break; + } + } + + if ('string' == typeof field) { + this._distinct = field; + } + + if (Query.canMerge(criteria)) { + this.merge(criteria); + } + + if (!callback) { + return this; + } + + if (!this._distinct) { + throw new Error('No value for `distinct` has been declared'); + } + + const conds = this._conditions, + options = this._optionsForExec(); + + debug('distinct', this._collection.collectionName, conds, options); + callback = this._wrapCallback('distinct', callback, { + conditions: conds, + options: options + }); + + this._collection.distinct(this._distinct, conds, options, utils.tick(callback)); + + return this; +}; + +/** + * Declare and/or execute this query as an update() operation. By default, + * `update()` only modifies the _first_ document that matches `criteria`. + * + * _All paths passed that are not $atomic operations will become $set ops._ + * + * ####Example + * + * mquery({ _id: id }).update({ title: 'words' }, ...) + * + * becomes + * + * collection.update({ _id: id }, { $set: { title: 'words' }}, ...) + * + * ####Note + * + * Passing an empty object `{}` as the doc will result in a no-op unless the `overwrite` option is passed. 
Without the `overwrite` option set, the update operation will be ignored and the callback executed without sending the command to MongoDB so as to prevent accidently overwritting documents in the collection. + * + * ####Note + * + * The operation is only executed when a callback is passed. To force execution without a callback (which would be an unsafe write), we must first call update() and then execute it by using the `exec()` method. + * + * var q = mquery(collection).where({ _id: id }); + * q.update({ $set: { name: 'bob' }}).update(); // not executed + * + * var q = mquery(collection).where({ _id: id }); + * q.update({ $set: { name: 'bob' }}).exec(); // executed as unsafe + * + * // keys that are not $atomic ops become $set. + * // this executes the same command as the previous example. + * q.update({ name: 'bob' }).where({ _id: id }).exec(); + * + * var q = mquery(collection).update(); // not executed + * + * // overwriting with empty docs + * var q.where({ _id: id }).setOptions({ overwrite: true }) + * q.update({ }, callback); // executes + * + * // multi update with overwrite to empty doc + * var q = mquery(collection).where({ _id: id }); + * q.setOptions({ multi: true, overwrite: true }) + * q.update({ }); + * q.update(callback); // executed + * + * // multi updates + * mquery() + * .collection(coll) + * .update({ name: /^match/ }, { $set: { arr: [] }}, { multi: true }, callback) + * // more multi updates + * mquery({ }) + * .collection(coll) + * .setOptions({ multi: true }) + * .update({ $set: { arr: [] }}, callback) + * + * // single update by default + * mquery({ email: 'address@example.com' }) + * .collection(coll) + * .update({ $inc: { counter: 1 }}, callback) + * + * // summary + * update(criteria, doc, opts, cb) // executes + * update(criteria, doc, opts) + * update(criteria, doc, cb) // executes + * update(criteria, doc) + * update(doc, cb) // executes + * update(doc) + * update(cb) // executes + * update(true) // executes (unsafe write) + * update() + * + * @param {Object} [criteria] + * @param {Object} [doc] the update command + * @param {Object} [options] + * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.update = function update(criteria, doc, options, callback) { + var force; + + switch (arguments.length) { + case 3: + if ('function' == typeof options) { + callback = options; + options = undefined; + } + break; + case 2: + if ('function' == typeof doc) { + callback = doc; + doc = criteria; + criteria = undefined; + } + break; + case 1: + switch (typeof criteria) { + case 'function': + callback = criteria; + criteria = options = doc = undefined; + break; + case 'boolean': + // execution with no callback (unsafe write) + force = criteria; + criteria = undefined; + break; + default: + doc = criteria; + criteria = options = undefined; + break; + } + } + + return _update(this, 'update', criteria, doc, options, force, callback); +}; + +/** + * Declare and/or execute this query as an `updateMany()` operation. Identical + * to `update()` except `updateMany()` will update _all_ documents that match + * `criteria`, rather than just the first one. 
+ *
+ * _All paths passed that are not $atomic operations will become $set ops._
+ *
+ * ####Example
+ *
+ *     // Update every document whose `title` contains 'test'
+ *     mquery().updateMany({ title: /test/ }, { year: 2017 })
+ *
+ * @param {Object} [criteria]
+ * @param {Object} [doc] the update command
+ * @param {Object} [options]
+ * @param {Function} [callback]
+ * @return {Query} this
+ * @api public
+ */
+
+Query.prototype.updateMany = function updateMany(criteria, doc, options, callback) {
+  let force;
+
+  switch (arguments.length) {
+    case 3:
+      if ('function' == typeof options) {
+        callback = options;
+        options = undefined;
+      }
+      break;
+    case 2:
+      if ('function' == typeof doc) {
+        callback = doc;
+        doc = criteria;
+        criteria = undefined;
+      }
+      break;
+    case 1:
+      switch (typeof criteria) {
+        case 'function':
+          callback = criteria;
+          criteria = options = doc = undefined;
+          break;
+        case 'boolean':
+          // execution with no callback (unsafe write)
+          force = criteria;
+          criteria = undefined;
+          break;
+        default:
+          doc = criteria;
+          criteria = options = undefined;
+          break;
+      }
+  }
+
+  return _update(this, 'updateMany', criteria, doc, options, force, callback);
+};
+
+/**
+ * Declare and/or execute this query as an `updateOne()` operation. Identical
+ * to `update()` except `updateOne()` will _always_ update just one document,
+ * regardless of the `multi` option.
+ *
+ * _All paths passed that are not $atomic operations will become $set ops._
+ *
+ * ####Example
+ *
+ *     // Update the first document whose `title` contains 'test'
+ *     mquery().updateOne({ title: /test/ }, { year: 2017 })
+ *
+ * @param {Object} [criteria]
+ * @param {Object} [doc] the update command
+ * @param {Object} [options]
+ * @param {Function} [callback]
+ * @return {Query} this
+ * @api public
+ */
+
+Query.prototype.updateOne = function updateOne(criteria, doc, options, callback) {
+  let force;
+
+  switch (arguments.length) {
+    case 3:
+      if ('function' == typeof options) {
+        callback = options;
+        options = undefined;
+      }
+      break;
+    case 2:
+      if ('function' == typeof doc) {
+        callback = doc;
+        doc = criteria;
+        criteria = undefined;
+      }
+      break;
+    case 1:
+      switch (typeof criteria) {
+        case 'function':
+          callback = criteria;
+          criteria = options = doc = undefined;
+          break;
+        case 'boolean':
+          // execution with no callback (unsafe write)
+          force = criteria;
+          criteria = undefined;
+          break;
+        default:
+          doc = criteria;
+          criteria = options = undefined;
+          break;
+      }
+  }
+
+  return _update(this, 'updateOne', criteria, doc, options, force, callback);
+};
+
+/**
+ * Declare and/or execute this query as a `replaceOne()` operation. Similar
+ * to `updateOne()`, except `replaceOne()` is not allowed to use atomic
+ * modifiers (`$set`, `$push`, etc.). Calling `replaceOne()` will always
+ * replace the existing doc.
+ * + * ####Example + * + * // Replace the document with `_id` 1 with `{ _id: 1, year: 2017 }` + * mquery().replaceOne({ _id: 1 }, { year: 2017 }) + * + * @param {Object} [criteria] + * @param {Object} [doc] the update command + * @param {Object} [options] + * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.replaceOne = function replaceOne(criteria, doc, options, callback) { + let force; + + switch (arguments.length) { + case 3: + if ('function' == typeof options) { + callback = options; + options = undefined; + } + break; + case 2: + if ('function' == typeof doc) { + callback = doc; + doc = criteria; + criteria = undefined; + } + break; + case 1: + switch (typeof criteria) { + case 'function': + callback = criteria; + criteria = options = doc = undefined; + break; + case 'boolean': + // execution with no callback (unsafe write) + force = criteria; + criteria = undefined; + break; + default: + doc = criteria; + criteria = options = undefined; + break; + } + } + + this.setOptions({ overwrite: true }); + return _update(this, 'replaceOne', criteria, doc, options, force, callback); +}; + + +/*! + * Internal helper for update, updateMany, updateOne + */ + +function _update(query, op, criteria, doc, options, force, callback) { + query.op = op; + + if (Query.canMerge(criteria)) { + query.merge(criteria); + } + + if (doc) { + query._mergeUpdate(doc); + } + + if (utils.isObject(options)) { + // { overwrite: true } + query.setOptions(options); + } + + // we are done if we don't have callback and they are + // not forcing an unsafe write. + if (!(force || callback)) { + return query; + } + + if (!query._update || + !query.options.overwrite && 0 === utils.keys(query._update).length) { + callback && utils.soon(callback.bind(null, null, 0)); + return query; + } + + options = query._optionsForExec(); + if (!callback) options.safe = false; + + criteria = query._conditions; + doc = query._updateForExec(); + + debug('update', query._collection.collectionName, criteria, doc, options); + callback = query._wrapCallback(op, callback, { + conditions: criteria, + doc: doc, + options: options + }); + + query._collection[op](criteria, doc, options, utils.tick(callback)); + + return query; +} + +/** + * Declare and/or execute this query as a remove() operation. + * + * ####Example + * + * mquery(collection).remove({ artist: 'Anne Murray' }, callback) + * + * ####Note + * + * The operation is only executed when a callback is passed. To force execution without a callback (which would be an unsafe write), we must first call remove() and then execute it by using the `exec()` method. 
+ * + * // not executed + * var query = mquery(collection).remove({ name: 'Anne Murray' }) + * + * // executed + * mquery(collection).remove({ name: 'Anne Murray' }, callback) + * mquery(collection).remove({ name: 'Anne Murray' }).remove(callback) + * + * // executed without a callback (unsafe write) + * query.exec() + * + * // summary + * query.remove(conds, fn); // executes + * query.remove(conds) + * query.remove(fn) // executes + * query.remove() + * + * @param {Object|Query} [criteria] mongodb selector + * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.remove = function(criteria, callback) { + this.op = 'remove'; + let force; + + if ('function' === typeof criteria) { + callback = criteria; + criteria = undefined; + } else if (Query.canMerge(criteria)) { + this.merge(criteria); + } else if (true === criteria) { + force = criteria; + criteria = undefined; + } + + if (!(force || callback)) + return this; + + const options = this._optionsForExec(); + if (!callback) options.safe = false; + + const conds = this._conditions; + + debug('remove', this._collection.collectionName, conds, options); + callback = this._wrapCallback('remove', callback, { + conditions: conds, + options: options + }); + + this._collection.remove(conds, options, utils.tick(callback)); + + return this; +}; + +/** + * Declare and/or execute this query as a `deleteOne()` operation. Behaves like + * `remove()`, except for ignores the `justOne` option and always deletes at + * most one document. + * + * ####Example + * + * mquery(collection).deleteOne({ artist: 'Anne Murray' }, callback) + * + * @param {Object|Query} [criteria] mongodb selector + * @param {Function} [callback] + * @return {Query} this + * @api public + */ + +Query.prototype.deleteOne = function(criteria, callback) { + this.op = 'deleteOne'; + let force; + + if ('function' === typeof criteria) { + callback = criteria; + criteria = undefined; + } else if (Query.canMerge(criteria)) { + this.merge(criteria); + } else if (true === criteria) { + force = criteria; + criteria = undefined; + } + + if (!(force || callback)) + return this; + + const options = this._optionsForExec(); + if (!callback) options.safe = false; + delete options.justOne; + + const conds = this._conditions; + + debug('deleteOne', this._collection.collectionName, conds, options); + callback = this._wrapCallback('deleteOne', callback, { + conditions: conds, + options: options + }); + + this._collection.deleteOne(conds, options, utils.tick(callback)); + + return this; +}; + +/** + * Declare and/or execute this query as a `deleteMany()` operation. Behaves like + * `remove()`, except for ignores the `justOne` option and always deletes + * _every_ document that matches `criteria`. 
+ *
+ * ####Example
+ *
+ *     mquery(collection).deleteMany({ artist: 'Anne Murray' }, callback)
+ *
+ * @param {Object|Query} [criteria] mongodb selector
+ * @param {Function} [callback]
+ * @return {Query} this
+ * @api public
+ */
+
+Query.prototype.deleteMany = function(criteria, callback) {
+  this.op = 'deleteMany';
+  let force;
+
+  if ('function' === typeof criteria) {
+    callback = criteria;
+    criteria = undefined;
+  } else if (Query.canMerge(criteria)) {
+    this.merge(criteria);
+  } else if (true === criteria) {
+    force = criteria;
+    criteria = undefined;
+  }
+
+  if (!(force || callback))
+    return this;
+
+  const options = this._optionsForExec();
+  if (!callback) options.safe = false;
+  delete options.justOne;
+
+  const conds = this._conditions;
+
+  debug('deleteMany', this._collection.collectionName, conds, options);
+  callback = this._wrapCallback('deleteMany', callback, {
+    conditions: conds,
+    options: options
+  });
+
+  this._collection.deleteMany(conds, options, utils.tick(callback));
+
+  return this;
+};
+
+/**
+ * Issues a mongodb [findAndModify](http://www.mongodb.org/display/DOCS/findAndModify+Command) update command.
+ *
+ * Finds a matching document, updates it according to the `update` arg, passing any `options`, and returns the found document (if any) to the callback. The query executes immediately if `callback` is passed.
+ *
+ * ####Available options
+ *
+ * - `new`: bool - true to return the modified document rather than the original. defaults to true
+ * - `upsert`: bool - creates the object if it doesn't exist. defaults to false.
+ * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update
+ *
+ * ####Examples
+ *
+ *     query.findOneAndUpdate(conditions, update, options, callback) // executes
+ *     query.findOneAndUpdate(conditions, update, options) // returns Query
+ *     query.findOneAndUpdate(conditions, update, callback) // executes
+ *     query.findOneAndUpdate(conditions, update) // returns Query
+ *     query.findOneAndUpdate(update, callback) // returns Query
+ *     query.findOneAndUpdate(update) // returns Query
+ *     query.findOneAndUpdate(callback) // executes
+ *     query.findOneAndUpdate() // returns Query
+ *
+ * @param {Object|Query} [query]
+ * @param {Object} [doc]
+ * @param {Object} [options]
+ * @param {Function} [callback]
+ * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command
+ * @return {Query} this
+ * @api public
+ */
+
+Query.prototype.findOneAndUpdate = function(criteria, doc, options, callback) {
+  this.op = 'findOneAndUpdate';
+  this._validate();
+
+  switch (arguments.length) {
+    case 3:
+      if ('function' == typeof options) {
+        callback = options;
+        options = {};
+      }
+      break;
+    case 2:
+      if ('function' == typeof doc) {
+        callback = doc;
+        doc = criteria;
+        criteria = undefined;
+      }
+      options = undefined;
+      break;
+    case 1:
+      if ('function' == typeof criteria) {
+        callback = criteria;
+        criteria = options = doc = undefined;
+      } else {
+        doc = criteria;
+        criteria = options = undefined;
+      }
+  }
+
+  if (Query.canMerge(criteria)) {
+    this.merge(criteria);
+  }
+
+  // apply doc
+  if (doc) {
+    this._mergeUpdate(doc);
+  }
+
+  options && this.setOptions(options);
+
+  if (!callback) return this;
+
+  const conds = this._conditions;
+  const update = this._updateForExec();
+  options = this._optionsForExec();
+
+  return this._collection.findOneAndUpdate(conds, update, options, utils.tick(callback));
+};
+
+/**
+ * Issues a mongodb [findAndModify](http://www.mongodb.org/display/DOCS/findAndModify+Command)
remove command. + * + * Finds a matching document, removes it, passing the found document (if any) to the callback. Executes immediately if `callback` is passed. + * + * ####Available options + * + * - `sort`: if multiple docs are found by the conditions, sets the sort order to choose which doc to update + * + * ####Examples + * + * A.where().findOneAndRemove(conditions, options, callback) // executes + * A.where().findOneAndRemove(conditions, options) // return Query + * A.where().findOneAndRemove(conditions, callback) // executes + * A.where().findOneAndRemove(conditions) // returns Query + * A.where().findOneAndRemove(callback) // executes + * A.where().findOneAndRemove() // returns Query + * A.where().findOneAndDelete() // alias of .findOneAndRemove() + * + * @param {Object} [conditions] + * @param {Object} [options] + * @param {Function} [callback] + * @return {Query} this + * @see mongodb http://www.mongodb.org/display/DOCS/findAndModify+Command + * @api public + */ + +Query.prototype.findOneAndRemove = Query.prototype.findOneAndDelete = function(conditions, options, callback) { + this.op = 'findOneAndRemove'; + this._validate(); + + if ('function' == typeof options) { + callback = options; + options = undefined; + } else if ('function' == typeof conditions) { + callback = conditions; + conditions = undefined; + } + + // apply conditions + if (Query.canMerge(conditions)) { + this.merge(conditions); + } + + // apply options + options && this.setOptions(options); + + if (!callback) return this; + + options = this._optionsForExec(); + const conds = this._conditions; + + return this._collection.findOneAndDelete(conds, options, utils.tick(callback)); +}; + +/** + * Wrap callback to add tracing + * + * @param {Function} callback + * @param {Object} [queryInfo] + * @api private + */ +Query.prototype._wrapCallback = function(method, callback, queryInfo) { + const traceFunction = this._traceFunction || Query.traceFunction; + + if (traceFunction) { + queryInfo.collectionName = this._collection.collectionName; + + const traceCallback = traceFunction && + traceFunction.call(null, method, queryInfo, this); + + const startTime = new Date().getTime(); + + return function wrapperCallback(err, result) { + if (traceCallback) { + const millis = new Date().getTime() - startTime; + traceCallback.call(null, err, result, millis); + } + + if (callback) { + callback.apply(null, arguments); + } + }; + } + + return callback; +}; + +/** + * Add trace function that gets called when the query is executed. + * The function will be called with (method, queryInfo, query) and + * should return a callback function which will be called + * with (err, result, millis) when the query is complete. + * + * queryInfo is an object containing: { + * collectionName: , + * conditions: , + * options: , + * doc: [document to update, if applicable] + * } + * + * NOTE: Does not trace stream queries. 
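+ *
+ * ####Example
+ *
+ * A minimal sketch (not part of the upstream docs): a hypothetical trace
+ * function that logs each operation and its duration, using only the
+ * documented arguments above (`method`, `queryInfo`, and the completion
+ * callback's `err`, `result`, `millis`).
+ *
+ *     query.setTraceFunction(function (method, queryInfo) {
+ *       console.log('running %s on %s', method, queryInfo.collectionName);
+ *       return function (err, result, millis) {
+ *         console.log('%s finished in %d ms', method, millis);
+ *       };
+ *     });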
+ * + * @param {Function} traceFunction + * @return {Query} this + * @api public + */ +Query.prototype.setTraceFunction = function(traceFunction) { + this._traceFunction = traceFunction; + return this; +}; + +/** + * Executes the query + * + * ####Examples + * + * query.exec(); + * query.exec(callback); + * query.exec('update'); + * query.exec('find', callback); + * + * @param {String|Function} [operation] + * @param {Function} [callback] + * @api public + */ + +Query.prototype.exec = function exec(op, callback) { + switch (typeof op) { + case 'function': + callback = op; + op = null; + break; + case 'string': + this.op = op; + break; + } + + assert.ok(this.op, 'Missing query type: (find, update, etc)'); + + if ('update' == this.op || 'remove' == this.op) { + callback || (callback = true); + } + + const _this = this; + + if ('function' == typeof callback) { + this[this.op](callback); + } else { + return new Query.Promise(function(success, error) { + _this[_this.op](function(err, val) { + if (err) error(err); + else success(val); + success = error = null; + }); + }); + } +}; + +/** + * Returns a thunk which when called runs this.exec() + * + * The thunk receives a callback function which will be + * passed to `this.exec()` + * + * @return {Function} + * @api public + */ + +Query.prototype.thunk = function() { + const _this = this; + return function(cb) { + _this.exec(cb); + }; +}; + +/** + * Executes the query returning a `Promise` which will be + * resolved with either the doc(s) or rejected with the error. + * + * @param {Function} [resolve] + * @param {Function} [reject] + * @return {Promise} + * @api public + */ + +Query.prototype.then = function(resolve, reject) { + const _this = this; + const promise = new Query.Promise(function(success, error) { + _this.exec(function(err, val) { + if (err) error(err); + else success(val); + success = error = null; + }); + }); + return promise.then(resolve, reject); +}; + +/** + * Returns a cursor for the given `find` query. + * + * @throws Error if operation is not a find + * @returns {Cursor} MongoDB driver cursor + */ + +Query.prototype.cursor = function() { + if ('find' != this.op) + throw new Error('cursor() is only available for find'); + + const conds = this._conditions; + + const options = this._optionsForExec(); + if (this.$useProjection) { + options.projection = this._fieldsForExec(); + } else { + options.fields = this._fieldsForExec(); + } + + debug('cursor', this._collection.collectionName, conds, options); + + return this._collection.findCursor(conds, options); +}; + +/** + * Determines if field selection has been made. + * + * @return {Boolean} + * @api public + */ + +Query.prototype.selected = function selected() { + return !!(this._fields && Object.keys(this._fields).length > 0); +}; + +/** + * Determines if inclusive field selection has been made. + * + * query.selectedInclusively() // false + * query.select('name') + * query.selectedInclusively() // true + * query.selectedExlusively() // false + * + * @returns {Boolean} + */ + +Query.prototype.selectedInclusively = function selectedInclusively() { + if (!this._fields) return false; + + const keys = Object.keys(this._fields); + if (0 === keys.length) return false; + + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + if (0 === this._fields[key]) return false; + if (this._fields[key] && + typeof this._fields[key] === 'object' && + this._fields[key].$meta) { + return false; + } + } + + return true; +}; + +/** + * Determines if exclusive field selection has been made. 
+ * + * query.selectedExlusively() // false + * query.select('-name') + * query.selectedExlusively() // true + * query.selectedInclusively() // false + * + * @returns {Boolean} + */ + +Query.prototype.selectedExclusively = function selectedExclusively() { + if (!this._fields) return false; + + const keys = Object.keys(this._fields); + if (0 === keys.length) return false; + + for (let i = 0; i < keys.length; ++i) { + const key = keys[i]; + if (0 === this._fields[key]) return true; + } + + return false; +}; + +/** + * Merges `doc` with the current update object. + * + * @param {Object} doc + */ + +Query.prototype._mergeUpdate = function(doc) { + if (!this._update) this._update = {}; + if (doc instanceof Query) { + if (doc._update) { + utils.mergeClone(this._update, doc._update); + } + } else { + utils.mergeClone(this._update, doc); + } +}; + +/** + * Returns default options. + * + * @return {Object} + * @api private + */ + +Query.prototype._optionsForExec = function() { + const options = utils.clone(this.options); + return options; +}; + +/** + * Returns fields selection for this query. + * + * @return {Object} + * @api private + */ + +Query.prototype._fieldsForExec = function() { + return utils.clone(this._fields); +}; + +/** + * Return an update document with corrected $set operations. + * + * @api private + */ + +Query.prototype._updateForExec = function() { + const update = utils.clone(this._update); + const ops = utils.keys(update); + const ret = {}; + + for (const op of ops) { + if (this.options.overwrite) { + ret[op] = update[op]; + continue; + } + + if ('$' !== op[0]) { + // fix up $set sugar + if (!ret.$set) { + if (update.$set) { + ret.$set = update.$set; + } else { + ret.$set = {}; + } + } + ret.$set[op] = update[op]; + if (!~ops.indexOf('$set')) ops.push('$set'); + } else if ('$set' === op) { + if (!ret.$set) { + ret[op] = update[op]; + } + } else { + ret[op] = update[op]; + } + } + + this._compiledUpdate = ret; + return ret; +}; + +/** + * Make sure _path is set. + * + * @parmam {String} method + */ + +Query.prototype._ensurePath = function(method) { + if (!this._path) { + const msg = method + '() must be used after where() ' + + 'when called with these arguments'; + throw new Error(msg); + } +}; + +/*! + * Permissions + */ + +Query.permissions = require('./permissions'); + +Query._isPermitted = function(a, b) { + const denied = Query.permissions[b]; + if (!denied) return true; + return true !== denied[a]; +}; + +Query.prototype._validate = function(action) { + let fail; + let validator; + + if (undefined === action) { + + validator = Query.permissions[this.op]; + if ('function' != typeof validator) return true; + + fail = validator(this); + + } else if (!Query._isPermitted(action, this.op)) { + fail = action; + } + + if (fail) { + throw new Error(fail + ' cannot be used with ' + this.op); + } +}; + +/** + * Determines if `conds` can be merged using `mquery().merge()` + * + * @param {Object} conds + * @return {Boolean} + */ + +Query.canMerge = function(conds) { + return conds instanceof Query || utils.isObject(conds); +}; + +/** + * Set a trace function that will get called whenever a + * query is executed. + * + * See `setTraceFunction()` for details. + * + * @param {Object} conds + * @return {Boolean} + */ +Query.setGlobalTraceFunction = function(traceFunction) { + Query.traceFunction = traceFunction; +}; + +/*! + * Exports. 
+ */ + +Query.utils = utils; +Query.env = require('./env'); +Query.Collection = require('./collection'); +Query.BaseCollection = require('./collection/collection'); +Query.Promise = Promise; +module.exports = exports = Query; + +// TODO +// test utils diff --git a/node_modules/mquery/lib/permissions.js b/node_modules/mquery/lib/permissions.js new file mode 100644 index 000000000..09b0590ae --- /dev/null +++ b/node_modules/mquery/lib/permissions.js @@ -0,0 +1,88 @@ +'use strict'; + +const denied = exports; + +denied.distinct = function(self) { + if (self._fields && Object.keys(self._fields).length > 0) { + return 'field selection and slice'; + } + + const keys = Object.keys(denied.distinct); + let err; + + keys.every(function(option) { + if (self.options[option]) { + err = option; + return false; + } + return true; + }); + + return err; +}; +denied.distinct.select = +denied.distinct.slice = +denied.distinct.sort = +denied.distinct.limit = +denied.distinct.skip = +denied.distinct.batchSize = +denied.distinct.comment = +denied.distinct.maxScan = +denied.distinct.snapshot = +denied.distinct.hint = +denied.distinct.tailable = true; + + +// aggregation integration + + +denied.findOneAndUpdate = +denied.findOneAndRemove = function(self) { + const keys = Object.keys(denied.findOneAndUpdate); + let err; + + keys.every(function(option) { + if (self.options[option]) { + err = option; + return false; + } + return true; + }); + + return err; +}; +denied.findOneAndUpdate.limit = +denied.findOneAndUpdate.skip = +denied.findOneAndUpdate.batchSize = +denied.findOneAndUpdate.maxScan = +denied.findOneAndUpdate.snapshot = +denied.findOneAndUpdate.hint = +denied.findOneAndUpdate.tailable = +denied.findOneAndUpdate.comment = true; + + +denied.count = function(self) { + if (self._fields && Object.keys(self._fields).length > 0) { + return 'field selection and slice'; + } + + const keys = Object.keys(denied.count); + let err; + + keys.every(function(option) { + if (self.options[option]) { + err = option; + return false; + } + return true; + }); + + return err; +}; + +denied.count.slice = +denied.count.batchSize = +denied.count.comment = +denied.count.maxScan = +denied.count.snapshot = +denied.count.tailable = true; diff --git a/node_modules/mquery/lib/utils.js b/node_modules/mquery/lib/utils.js new file mode 100644 index 000000000..63e8284ad --- /dev/null +++ b/node_modules/mquery/lib/utils.js @@ -0,0 +1,359 @@ +'use strict'; + +/*! + * Module dependencies. + */ + +const RegExpClone = require('regexp-clone'); + +const specialProperties = ['__proto__', 'constructor', 'prototype']; + +/** + * Clones objects + * + * @param {Object} obj the object to clone + * @param {Object} options + * @return {Object} the cloned object + * @api private + */ + +const clone = exports.clone = function clone(obj, options) { + if (obj === undefined || obj === null) + return obj; + + if (Array.isArray(obj)) + return exports.cloneArray(obj, options); + + if (obj.constructor) { + if (/ObjectI[dD]$/.test(obj.constructor.name)) { + return 'function' == typeof obj.clone + ? obj.clone() + : new obj.constructor(obj.id); + } + + if (obj.constructor.name === 'ReadPreference') { + return new obj.constructor(obj.mode, clone(obj.tags, options)); + } + + if ('Binary' == obj._bsontype && obj.buffer && obj.value) { + return 'function' == typeof obj.clone + ? 
obj.clone() + : new obj.constructor(obj.value(true), obj.sub_type); + } + + if ('Date' === obj.constructor.name || 'Function' === obj.constructor.name) + return new obj.constructor(+obj); + + if ('RegExp' === obj.constructor.name) + return RegExpClone(obj); + + if ('Buffer' === obj.constructor.name) + return exports.cloneBuffer(obj); + } + + if (isObject(obj)) + return exports.cloneObject(obj, options); + + if (obj.valueOf) + return obj.valueOf(); +}; + +/*! + * ignore + */ + +exports.cloneObject = function cloneObject(obj, options) { + const minimize = options && options.minimize; + const ret = {}; + let hasKeys; + let val; + const keys = Object.keys(obj); + let k; + + for (let i = 0; i < keys.length; ++i) { + k = keys[i]; + // Not technically prototype pollution because this wouldn't merge properties + // onto `Object.prototype`, but avoid properties like __proto__ as a precaution. + if (specialProperties.indexOf(k) !== -1) { + continue; + } + + val = clone(obj[k], options); + + if (!minimize || ('undefined' !== typeof val)) { + hasKeys || (hasKeys = true); + ret[k] = val; + } + } + + return minimize + ? hasKeys && ret + : ret; +}; + +exports.cloneArray = function cloneArray(arr, options) { + const ret = []; + for (let i = 0, l = arr.length; i < l; i++) + ret.push(clone(arr[i], options)); + return ret; +}; + +/** + * process.nextTick helper. + * + * Wraps the given `callback` in a try/catch. If an error is + * caught it will be thrown on nextTick. + * + * node-mongodb-native had a habit of state corruption when + * an error was immediately thrown from within a collection + * method (find, update, etc) callback. + * + * @param {Function} [callback] + * @api private + */ + +exports.tick = function tick(callback) { + if ('function' !== typeof callback) return; + return function() { + // callbacks should always be fired on the next + // turn of the event loop. A side benefit is + // errors thrown from executing the callback + // will not cause drivers state to be corrupted + // which has historically been a problem. + const args = arguments; + soon(function() { + callback.apply(this, args); + }); + }; +}; + +/** + * Merges `from` into `to` without overwriting existing properties. + * + * @param {Object} to + * @param {Object} from + * @api private + */ + +exports.merge = function merge(to, from) { + const keys = Object.keys(from); + + for (const key of keys) { + if (specialProperties.indexOf(key) !== -1) { + continue; + } + if ('undefined' === typeof to[key]) { + to[key] = from[key]; + } else { + if (exports.isObject(from[key])) { + merge(to[key], from[key]); + } else { + to[key] = from[key]; + } + } + } +}; + +/** + * Same as merge but clones the assigned values. 
+ * + * @param {Object} to + * @param {Object} from + * @api private + */ + +exports.mergeClone = function mergeClone(to, from) { + const keys = Object.keys(from); + + for (const key of keys) { + if (specialProperties.indexOf(key) !== -1) { + continue; + } + if ('undefined' === typeof to[key]) { + to[key] = clone(from[key]); + } else { + if (exports.isObject(from[key])) { + mergeClone(to[key], from[key]); + } else { + to[key] = clone(from[key]); + } + } + } +}; + +/** + * Read pref helper (mongo 2.2 drivers support this) + * + * Allows using aliases instead of full preference names: + * + * p primary + * pp primaryPreferred + * s secondary + * sp secondaryPreferred + * n nearest + * + * @param {String} pref + */ + +exports.readPref = function readPref(pref) { + switch (pref) { + case 'p': + pref = 'primary'; + break; + case 'pp': + pref = 'primaryPreferred'; + break; + case 's': + pref = 'secondary'; + break; + case 'sp': + pref = 'secondaryPreferred'; + break; + case 'n': + pref = 'nearest'; + break; + } + + return pref; +}; + + +/** + * Read Concern helper (mongo 3.2 drivers support this) + * + * Allows using string to specify read concern level: + * + * local 3.2+ + * available 3.6+ + * majority 3.2+ + * linearizable 3.4+ + * snapshot 4.0+ + * + * @param {String|Object} concern + */ + +exports.readConcern = function readConcern(concern) { + if ('string' === typeof concern) { + switch (concern) { + case 'l': + concern = 'local'; + break; + case 'a': + concern = 'available'; + break; + case 'm': + concern = 'majority'; + break; + case 'lz': + concern = 'linearizable'; + break; + case 's': + concern = 'snapshot'; + break; + } + concern = { level: concern }; + } + return concern; +}; + +/** + * Object.prototype.toString.call helper + */ + +const _toString = Object.prototype.toString; +exports.toString = function(arg) { + return _toString.call(arg); +}; + +/** + * Determines if `arg` is an object. + * + * @param {Object|Array|String|Function|RegExp|any} arg + * @return {Boolean} + */ + +const isObject = exports.isObject = function(arg) { + return '[object Object]' == exports.toString(arg); +}; + +/** + * Determines if `arg` is an array. + * + * @param {Object} + * @return {Boolean} + * @see nodejs utils + */ + +exports.isArray = function(arg) { + return Array.isArray(arg) || + 'object' == typeof arg && '[object Array]' == exports.toString(arg); +}; + +/** + * Object.keys helper + */ + +exports.keys = Object.keys; + +/** + * Basic Object.create polyfill. + * Only one argument is supported. + * + * Based on https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/Object/create + */ + +exports.create = 'function' == typeof Object.create + ? Object.create + : create; + +function create(proto) { + if (arguments.length > 1) { + throw new Error('Adding properties is not supported'); + } + + function F() {} + F.prototype = proto; + return new F; +} + +/** + * inheritance + */ + +exports.inherits = function(ctor, superCtor) { + ctor.prototype = exports.create(superCtor.prototype); + ctor.prototype.constructor = ctor; +}; + +/** + * nextTick helper + * compat with node 0.10 which behaves differently than previous versions + */ + +const soon = exports.soon = 'function' == typeof setImmediate + ? setImmediate + : process.nextTick; + +/** + * Clones the contents of a buffer. 
+ * + * @param {Buffer} buff + * @return {Buffer} + */ + +exports.cloneBuffer = function(buff) { + const dupe = Buffer.alloc(buff.length); + buff.copy(dupe, 0, 0, buff.length); + return dupe; +}; + +/** + * Check if this object is an arguments object + * + * @param {Any} v + * @return {Boolean} + */ + +exports.isArgumentsObject = function(v) { + return Object.prototype.toString.call(v) === '[object Arguments]'; +}; diff --git a/node_modules/mquery/node_modules/debug/LICENSE b/node_modules/mquery/node_modules/debug/LICENSE new file mode 100644 index 000000000..658c933d2 --- /dev/null +++ b/node_modules/mquery/node_modules/debug/LICENSE @@ -0,0 +1,19 @@ +(The MIT License) + +Copyright (c) 2014 TJ Holowaychuk + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software +and associated documentation files (the 'Software'), to deal in the Software without restriction, +including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, +and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial +portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT +LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, +WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + diff --git a/node_modules/mquery/node_modules/debug/README.md b/node_modules/mquery/node_modules/debug/README.md new file mode 100644 index 000000000..88dae35d9 --- /dev/null +++ b/node_modules/mquery/node_modules/debug/README.md @@ -0,0 +1,455 @@ +# debug +[![Build Status](https://travis-ci.org/visionmedia/debug.svg?branch=master)](https://travis-ci.org/visionmedia/debug) [![Coverage Status](https://coveralls.io/repos/github/visionmedia/debug/badge.svg?branch=master)](https://coveralls.io/github/visionmedia/debug?branch=master) [![Slack](https://visionmedia-community-slackin.now.sh/badge.svg)](https://visionmedia-community-slackin.now.sh/) [![OpenCollective](https://opencollective.com/debug/backers/badge.svg)](#backers) +[![OpenCollective](https://opencollective.com/debug/sponsors/badge.svg)](#sponsors) + + + +A tiny JavaScript debugging utility modelled after Node.js core's debugging +technique. Works in Node.js and web browsers. + +## Installation + +```bash +$ npm install debug +``` + +## Usage + +`debug` exposes a function; simply pass this function the name of your module, and it will return a decorated version of `console.error` for you to pass debug statements to. This will allow you to toggle the debug output for different parts of your module as well as the module as a whole. 
+ +Example [_app.js_](./examples/node/app.js): + +```js +var debug = require('debug')('http') + , http = require('http') + , name = 'My App'; + +// fake app + +debug('booting %o', name); + +http.createServer(function(req, res){ + debug(req.method + ' ' + req.url); + res.end('hello\n'); +}).listen(3000, function(){ + debug('listening'); +}); + +// fake worker of some kind + +require('./worker'); +``` + +Example [_worker.js_](./examples/node/worker.js): + +```js +var a = require('debug')('worker:a') + , b = require('debug')('worker:b'); + +function work() { + a('doing lots of uninteresting work'); + setTimeout(work, Math.random() * 1000); +} + +work(); + +function workb() { + b('doing some work'); + setTimeout(workb, Math.random() * 2000); +} + +workb(); +``` + +The `DEBUG` environment variable is then used to enable these based on space or +comma-delimited names. + +Here are some examples: + +screen shot 2017-08-08 at 12 53 04 pm +screen shot 2017-08-08 at 12 53 38 pm +screen shot 2017-08-08 at 12 53 25 pm + +#### Windows command prompt notes + +##### CMD + +On Windows the environment variable is set using the `set` command. + +```cmd +set DEBUG=*,-not_this +``` + +Example: + +```cmd +set DEBUG=* & node app.js +``` + +##### PowerShell (VS Code default) + +PowerShell uses different syntax to set environment variables. + +```cmd +$env:DEBUG = "*,-not_this" +``` + +Example: + +```cmd +$env:DEBUG='app';node app.js +``` + +Then, run the program to be debugged as usual. + +npm script example: +```js + "windowsDebug": "@powershell -Command $env:DEBUG='*';node app.js", +``` + +## Namespace Colors + +Every debug instance has a color generated for it based on its namespace name. +This helps when visually parsing the debug output to identify which debug instance +a debug line belongs to. + +#### Node.js + +In Node.js, colors are enabled when stderr is a TTY. You also _should_ install +the [`supports-color`](https://npmjs.org/supports-color) module alongside debug, +otherwise debug will only use a small handful of basic colors. + + + +#### Web Browser + +Colors are also enabled on "Web Inspectors" that understand the `%c` formatting +option. These are WebKit web inspectors, Firefox ([since version +31](https://hacks.mozilla.org/2014/05/editable-box-model-multiple-selection-sublime-text-keys-much-more-firefox-developer-tools-episode-31/)) +and the Firebug plugin for Firefox (any version). + + + + +## Millisecond diff + +When actively developing an application it can be useful to see when the time spent between one `debug()` call and the next. Suppose for example you invoke `debug()` before requesting a resource, and after as well, the "+NNNms" will show you how much time was spent between calls. + + + +When stdout is not a TTY, `Date#toISOString()` is used, making it more useful for logging the debug information as shown below: + + + + +## Conventions + +If you're using this in one or more of your libraries, you _should_ use the name of your library so that developers may toggle debugging as desired without guessing names. If you have more than one debuggers you _should_ prefix them with your library name and use ":" to separate features. For example "bodyParser" from Connect would then be "connect:bodyParser". If you append a "*" to the end of your name, it will always be enabled regardless of the setting of the DEBUG environment variable. You can then use it for normal output as well as debug output. + +## Wildcards + +The `*` character may be used as a wildcard. 
Suppose for example your library has +debuggers named "connect:bodyParser", "connect:compress", "connect:session", +instead of listing all three with +`DEBUG=connect:bodyParser,connect:compress,connect:session`, you may simply do +`DEBUG=connect:*`, or to run everything using this module simply use `DEBUG=*`. + +You can also exclude specific debuggers by prefixing them with a "-" character. +For example, `DEBUG=*,-connect:*` would include all debuggers except those +starting with "connect:". + +## Environment Variables + +When running through Node.js, you can set a few environment variables that will +change the behavior of the debug logging: + +| Name | Purpose | +|-----------|-------------------------------------------------| +| `DEBUG` | Enables/disables specific debugging namespaces. | +| `DEBUG_HIDE_DATE` | Hide date from debug output (non-TTY). | +| `DEBUG_COLORS`| Whether or not to use colors in the debug output. | +| `DEBUG_DEPTH` | Object inspection depth. | +| `DEBUG_SHOW_HIDDEN` | Shows hidden properties on inspected objects. | + + +__Note:__ The environment variables beginning with `DEBUG_` end up being +converted into an Options object that gets used with `%o`/`%O` formatters. +See the Node.js documentation for +[`util.inspect()`](https://nodejs.org/api/util.html#util_util_inspect_object_options) +for the complete list. + +## Formatters + +Debug uses [printf-style](https://wikipedia.org/wiki/Printf_format_string) formatting. +Below are the officially supported formatters: + +| Formatter | Representation | +|-----------|----------------| +| `%O` | Pretty-print an Object on multiple lines. | +| `%o` | Pretty-print an Object all on a single line. | +| `%s` | String. | +| `%d` | Number (both integer and float). | +| `%j` | JSON. Replaced with the string '[Circular]' if the argument contains circular references. | +| `%%` | Single percent sign ('%'). This does not consume an argument. | + + +### Custom formatters + +You can add custom formatters by extending the `debug.formatters` object. +For example, if you wanted to add support for rendering a Buffer as hex with +`%h`, you could do something like: + +```js +const createDebug = require('debug') +createDebug.formatters.h = (v) => { + return v.toString('hex') +} + +// …elsewhere +const debug = createDebug('foo') +debug('this is hex: %h', new Buffer('hello world')) +// foo this is hex: 68656c6c6f20776f726c6421 +0ms +``` + + +## Browser Support + +You can build a browser-ready script using [browserify](https://github.com/substack/node-browserify), +or just use the [browserify-as-a-service](https://wzrd.in/) [build](https://wzrd.in/standalone/debug@latest), +if you don't want to build it yourself. + +Debug's enable state is currently persisted by `localStorage`. +Consider the situation shown below where you have `worker:a` and `worker:b`, +and wish to debug both. You can enable this using `localStorage.debug`: + +```js +localStorage.debug = 'worker:*' +``` + +And then refresh the page. 
+ +```js +a = debug('worker:a'); +b = debug('worker:b'); + +setInterval(function(){ + a('doing some work'); +}, 1000); + +setInterval(function(){ + b('doing some work'); +}, 1200); +``` + + +## Output streams + + By default `debug` will log to stderr, however this can be configured per-namespace by overriding the `log` method: + +Example [_stdout.js_](./examples/node/stdout.js): + +```js +var debug = require('debug'); +var error = debug('app:error'); + +// by default stderr is used +error('goes to stderr!'); + +var log = debug('app:log'); +// set this namespace to log via console.log +log.log = console.log.bind(console); // don't forget to bind to console! +log('goes to stdout'); +error('still goes to stderr!'); + +// set all output to go via console.info +// overrides all per-namespace log settings +debug.log = console.info.bind(console); +error('now goes to stdout via console.info'); +log('still goes to stdout, but via console.info now'); +``` + +## Extend +You can simply extend debugger +```js +const log = require('debug')('auth'); + +//creates new debug instance with extended namespace +const logSign = log.extend('sign'); +const logLogin = log.extend('login'); + +log('hello'); // auth hello +logSign('hello'); //auth:sign hello +logLogin('hello'); //auth:login hello +``` + +## Set dynamically + +You can also enable debug dynamically by calling the `enable()` method : + +```js +let debug = require('debug'); + +console.log(1, debug.enabled('test')); + +debug.enable('test'); +console.log(2, debug.enabled('test')); + +debug.disable(); +console.log(3, debug.enabled('test')); + +``` + +print : +``` +1 false +2 true +3 false +``` + +Usage : +`enable(namespaces)` +`namespaces` can include modes separated by a colon and wildcards. + +Note that calling `enable()` completely overrides previously set DEBUG variable : + +``` +$ DEBUG=foo node -e 'var dbg = require("debug"); dbg.enable("bar"); console.log(dbg.enabled("foo"))' +=> false +``` + +`disable()` + +Will disable all namespaces. The functions returns the namespaces currently +enabled (and skipped). This can be useful if you want to disable debugging +temporarily without knowing what was enabled to begin with. + +For example: + +```js +let debug = require('debug'); +debug.enable('foo:*,-foo:bar'); +let namespaces = debug.disable(); +debug.enable(namespaces); +``` + +Note: There is no guarantee that the string will be identical to the initial +enable string, but semantically they will be identical. + +## Checking whether a debug target is enabled + +After you've created a debug instance, you can determine whether or not it is +enabled by checking the `enabled` property: + +```javascript +const debug = require('debug')('http'); + +if (debug.enabled) { + // do stuff... +} +``` + +You can also manually toggle this property to force the debug instance to be +enabled or disabled. + + +## Authors + + - TJ Holowaychuk + - Nathan Rajlich + - Andrew Rhyne + +## Backers + +Support us with a monthly donation and help us continue our activities. [[Become a backer](https://opencollective.com/debug#backer)] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +## Sponsors + +Become a sponsor and get your logo on our README on Github with a link to your site. 
[[Become a sponsor](https://opencollective.com/debug#sponsor)] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +## License + +(The MIT License) + +Copyright (c) 2014-2017 TJ Holowaychuk <tj@vision-media.ca> + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/mquery/node_modules/debug/package.json b/node_modules/mquery/node_modules/debug/package.json new file mode 100644 index 000000000..6807310d2 --- /dev/null +++ b/node_modules/mquery/node_modules/debug/package.json @@ -0,0 +1,101 @@ +{ + "_from": "debug@4.x", + "_id": "debug@4.3.2", + "_inBundle": false, + "_integrity": "sha512-mOp8wKcvj7XxC78zLgw/ZA+6TSgkoE2C/ienthhRD298T7UNwAg9diBpLRxC0mOezLl4B0xV7M0cCO6P/O0Xhw==", + "_location": "/mquery/debug", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "debug@4.x", + "name": "debug", + "escapedName": "debug", + "rawSpec": "4.x", + "saveSpec": null, + "fetchSpec": "4.x" + }, + "_requiredBy": [ + "/mquery" + ], + "_resolved": "https://registry.npmjs.org/debug/-/debug-4.3.2.tgz", + "_shasum": "f0a49c18ac8779e31d4a0c6029dfb76873c7428b", + "_spec": "debug@4.x", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mquery", + "author": { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca" + }, + "browser": "./src/browser.js", + "bugs": { + "url": "https://github.com/visionmedia/debug/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Nathan Rajlich", + "email": "nathan@tootallnate.net", + "url": "http://n8.io" + }, + { + "name": "Andrew Rhyne", + "email": "rhyneandrew@gmail.com" + }, + { + "name": "Josh Junon", + "email": "josh@junon.me" + } + ], + "dependencies": { + "ms": "2.1.2" + }, + "deprecated": false, + "description": "small debugging utility", + "devDependencies": { + "brfs": "^2.0.1", + "browserify": "^16.2.3", + "coveralls": "^3.0.2", + "istanbul": "^0.4.5", + "karma": "^3.1.4", + "karma-browserify": "^6.0.0", + "karma-chrome-launcher": "^2.2.0", + "karma-mocha": "^1.3.0", + "mocha": "^5.2.0", + "mocha-lcov-reporter": "^1.2.0", + "xo": "^0.23.0" + }, + "engines": { + "node": ">=6.0" + }, + "files": [ + "src", + "LICENSE", + "README.md" + ], + "homepage": "https://github.com/visionmedia/debug#readme", + "keywords": [ + "debug", + "log", + "debugger" + ], + "license": "MIT", + "main": "./src/index.js", + "name": "debug", + "peerDependenciesMeta": { + "supports-color": { + "optional": true + } + }, + "repository": { + "type": "git", + "url": 
"git://github.com/visionmedia/debug.git" + }, + "scripts": { + "lint": "xo", + "test": "npm run test:node && npm run test:browser && npm run lint", + "test:browser": "karma start --single-run", + "test:coverage": "cat ./coverage/lcov.info | coveralls", + "test:node": "istanbul cover _mocha -- test.js" + }, + "version": "4.3.2" +} diff --git a/node_modules/mquery/node_modules/debug/src/browser.js b/node_modules/mquery/node_modules/debug/src/browser.js new file mode 100644 index 000000000..cd0fc35d1 --- /dev/null +++ b/node_modules/mquery/node_modules/debug/src/browser.js @@ -0,0 +1,269 @@ +/* eslint-env browser */ + +/** + * This is the web browser implementation of `debug()`. + */ + +exports.formatArgs = formatArgs; +exports.save = save; +exports.load = load; +exports.useColors = useColors; +exports.storage = localstorage(); +exports.destroy = (() => { + let warned = false; + + return () => { + if (!warned) { + warned = true; + console.warn('Instance method `debug.destroy()` is deprecated and no longer does anything. It will be removed in the next major version of `debug`.'); + } + }; +})(); + +/** + * Colors. + */ + +exports.colors = [ + '#0000CC', + '#0000FF', + '#0033CC', + '#0033FF', + '#0066CC', + '#0066FF', + '#0099CC', + '#0099FF', + '#00CC00', + '#00CC33', + '#00CC66', + '#00CC99', + '#00CCCC', + '#00CCFF', + '#3300CC', + '#3300FF', + '#3333CC', + '#3333FF', + '#3366CC', + '#3366FF', + '#3399CC', + '#3399FF', + '#33CC00', + '#33CC33', + '#33CC66', + '#33CC99', + '#33CCCC', + '#33CCFF', + '#6600CC', + '#6600FF', + '#6633CC', + '#6633FF', + '#66CC00', + '#66CC33', + '#9900CC', + '#9900FF', + '#9933CC', + '#9933FF', + '#99CC00', + '#99CC33', + '#CC0000', + '#CC0033', + '#CC0066', + '#CC0099', + '#CC00CC', + '#CC00FF', + '#CC3300', + '#CC3333', + '#CC3366', + '#CC3399', + '#CC33CC', + '#CC33FF', + '#CC6600', + '#CC6633', + '#CC9900', + '#CC9933', + '#CCCC00', + '#CCCC33', + '#FF0000', + '#FF0033', + '#FF0066', + '#FF0099', + '#FF00CC', + '#FF00FF', + '#FF3300', + '#FF3333', + '#FF3366', + '#FF3399', + '#FF33CC', + '#FF33FF', + '#FF6600', + '#FF6633', + '#FF9900', + '#FF9933', + '#FFCC00', + '#FFCC33' +]; + +/** + * Currently only WebKit-based Web Inspectors, Firefox >= v31, + * and the Firebug extension (any Firefox version) are known + * to support "%c" CSS customizations. + * + * TODO: add a `localStorage` variable to explicitly enable/disable colors + */ + +// eslint-disable-next-line complexity +function useColors() { + // NB: In an Electron preload script, document will be defined but not fully + // initialized. Since we know we're in Chrome, we'll just detect this case + // explicitly + if (typeof window !== 'undefined' && window.process && (window.process.type === 'renderer' || window.process.__nwjs)) { + return true; + } + + // Internet Explorer and Edge do not support colors. + if (typeof navigator !== 'undefined' && navigator.userAgent && navigator.userAgent.toLowerCase().match(/(edge|trident)\/(\d+)/)) { + return false; + } + + // Is webkit? http://stackoverflow.com/a/16459606/376773 + // document is undefined in react-native: https://github.com/facebook/react-native/pull/1632 + return (typeof document !== 'undefined' && document.documentElement && document.documentElement.style && document.documentElement.style.WebkitAppearance) || + // Is firebug? http://stackoverflow.com/a/398120/376773 + (typeof window !== 'undefined' && window.console && (window.console.firebug || (window.console.exception && window.console.table))) || + // Is firefox >= v31? 
+ // https://developer.mozilla.org/en-US/docs/Tools/Web_Console#Styling_messages + (typeof navigator !== 'undefined' && navigator.userAgent && navigator.userAgent.toLowerCase().match(/firefox\/(\d+)/) && parseInt(RegExp.$1, 10) >= 31) || + // Double check webkit in userAgent just in case we are in a worker + (typeof navigator !== 'undefined' && navigator.userAgent && navigator.userAgent.toLowerCase().match(/applewebkit\/(\d+)/)); +} + +/** + * Colorize log arguments if enabled. + * + * @api public + */ + +function formatArgs(args) { + args[0] = (this.useColors ? '%c' : '') + + this.namespace + + (this.useColors ? ' %c' : ' ') + + args[0] + + (this.useColors ? '%c ' : ' ') + + '+' + module.exports.humanize(this.diff); + + if (!this.useColors) { + return; + } + + const c = 'color: ' + this.color; + args.splice(1, 0, c, 'color: inherit'); + + // The final "%c" is somewhat tricky, because there could be other + // arguments passed either before or after the %c, so we need to + // figure out the correct index to insert the CSS into + let index = 0; + let lastC = 0; + args[0].replace(/%[a-zA-Z%]/g, match => { + if (match === '%%') { + return; + } + index++; + if (match === '%c') { + // We only are interested in the *last* %c + // (the user may have provided their own) + lastC = index; + } + }); + + args.splice(lastC, 0, c); +} + +/** + * Invokes `console.debug()` when available. + * No-op when `console.debug` is not a "function". + * If `console.debug` is not available, falls back + * to `console.log`. + * + * @api public + */ +exports.log = console.debug || console.log || (() => {}); + +/** + * Save `namespaces`. + * + * @param {String} namespaces + * @api private + */ +function save(namespaces) { + try { + if (namespaces) { + exports.storage.setItem('debug', namespaces); + } else { + exports.storage.removeItem('debug'); + } + } catch (error) { + // Swallow + // XXX (@Qix-) should we be logging these? + } +} + +/** + * Load `namespaces`. + * + * @return {String} returns the previously persisted debug modes + * @api private + */ +function load() { + let r; + try { + r = exports.storage.getItem('debug'); + } catch (error) { + // Swallow + // XXX (@Qix-) should we be logging these? + } + + // If debug isn't set in LS, and we're in Electron, try to load $DEBUG + if (!r && typeof process !== 'undefined' && 'env' in process) { + r = process.env.DEBUG; + } + + return r; +} + +/** + * Localstorage attempts to return the localstorage. + * + * This is necessary because safari throws + * when a user disables cookies/localstorage + * and you attempt to access it. + * + * @return {LocalStorage} + * @api private + */ + +function localstorage() { + try { + // TVMLKit (Apple TV JS Runtime) does not have a window object, just localStorage in the global context + // The Browser also has localStorage in the global context. + return localStorage; + } catch (error) { + // Swallow + // XXX (@Qix-) should we be logging these? + } +} + +module.exports = require('./common')(exports); + +const {formatters} = module.exports; + +/** + * Map %j to `JSON.stringify()`, since no Web Inspectors do that by default. 
+ */ + +formatters.j = function (v) { + try { + return JSON.stringify(v); + } catch (error) { + return '[UnexpectedJSONParseError]: ' + error.message; + } +}; diff --git a/node_modules/mquery/node_modules/debug/src/common.js b/node_modules/mquery/node_modules/debug/src/common.js new file mode 100644 index 000000000..50ce29251 --- /dev/null +++ b/node_modules/mquery/node_modules/debug/src/common.js @@ -0,0 +1,274 @@ + +/** + * This is the common logic for both the Node.js and web browser + * implementations of `debug()`. + */ + +function setup(env) { + createDebug.debug = createDebug; + createDebug.default = createDebug; + createDebug.coerce = coerce; + createDebug.disable = disable; + createDebug.enable = enable; + createDebug.enabled = enabled; + createDebug.humanize = require('ms'); + createDebug.destroy = destroy; + + Object.keys(env).forEach(key => { + createDebug[key] = env[key]; + }); + + /** + * The currently active debug mode names, and names to skip. + */ + + createDebug.names = []; + createDebug.skips = []; + + /** + * Map of special "%n" handling functions, for the debug "format" argument. + * + * Valid key names are a single, lower or upper-case letter, i.e. "n" and "N". + */ + createDebug.formatters = {}; + + /** + * Selects a color for a debug namespace + * @param {String} namespace The namespace string for the for the debug instance to be colored + * @return {Number|String} An ANSI color code for the given namespace + * @api private + */ + function selectColor(namespace) { + let hash = 0; + + for (let i = 0; i < namespace.length; i++) { + hash = ((hash << 5) - hash) + namespace.charCodeAt(i); + hash |= 0; // Convert to 32bit integer + } + + return createDebug.colors[Math.abs(hash) % createDebug.colors.length]; + } + createDebug.selectColor = selectColor; + + /** + * Create a debugger with the given `namespace`. + * + * @param {String} namespace + * @return {Function} + * @api public + */ + function createDebug(namespace) { + let prevTime; + let enableOverride = null; + let namespacesCache; + let enabledCache; + + function debug(...args) { + // Disabled? + if (!debug.enabled) { + return; + } + + const self = debug; + + // Set `diff` timestamp + const curr = Number(new Date()); + const ms = curr - (prevTime || curr); + self.diff = ms; + self.prev = prevTime; + self.curr = curr; + prevTime = curr; + + args[0] = createDebug.coerce(args[0]); + + if (typeof args[0] !== 'string') { + // Anything else let's inspect with %O + args.unshift('%O'); + } + + // Apply any `formatters` transformations + let index = 0; + args[0] = args[0].replace(/%([a-zA-Z%])/g, (match, format) => { + // If we encounter an escaped % then don't increase the array index + if (match === '%%') { + return '%'; + } + index++; + const formatter = createDebug.formatters[format]; + if (typeof formatter === 'function') { + const val = args[index]; + match = formatter.call(self, val); + + // Now we need to remove `args[index]` since it's inlined in the `format` + args.splice(index, 1); + index--; + } + return match; + }); + + // Apply env-specific formatting (colors, etc.) + createDebug.formatArgs.call(self, args); + + const logFn = self.log || createDebug.log; + logFn.apply(self, args); + } + + debug.namespace = namespace; + debug.useColors = createDebug.useColors(); + debug.color = createDebug.selectColor(namespace); + debug.extend = extend; + debug.destroy = createDebug.destroy; // XXX Temporary. Will be removed in the next major release. 
+ + Object.defineProperty(debug, 'enabled', { + enumerable: true, + configurable: false, + get: () => { + if (enableOverride !== null) { + return enableOverride; + } + if (namespacesCache !== createDebug.namespaces) { + namespacesCache = createDebug.namespaces; + enabledCache = createDebug.enabled(namespace); + } + + return enabledCache; + }, + set: v => { + enableOverride = v; + } + }); + + // Env-specific initialization logic for debug instances + if (typeof createDebug.init === 'function') { + createDebug.init(debug); + } + + return debug; + } + + function extend(namespace, delimiter) { + const newDebug = createDebug(this.namespace + (typeof delimiter === 'undefined' ? ':' : delimiter) + namespace); + newDebug.log = this.log; + return newDebug; + } + + /** + * Enables a debug mode by namespaces. This can include modes + * separated by a colon and wildcards. + * + * @param {String} namespaces + * @api public + */ + function enable(namespaces) { + createDebug.save(namespaces); + createDebug.namespaces = namespaces; + + createDebug.names = []; + createDebug.skips = []; + + let i; + const split = (typeof namespaces === 'string' ? namespaces : '').split(/[\s,]+/); + const len = split.length; + + for (i = 0; i < len; i++) { + if (!split[i]) { + // ignore empty strings + continue; + } + + namespaces = split[i].replace(/\*/g, '.*?'); + + if (namespaces[0] === '-') { + createDebug.skips.push(new RegExp('^' + namespaces.substr(1) + '$')); + } else { + createDebug.names.push(new RegExp('^' + namespaces + '$')); + } + } + } + + /** + * Disable debug output. + * + * @return {String} namespaces + * @api public + */ + function disable() { + const namespaces = [ + ...createDebug.names.map(toNamespace), + ...createDebug.skips.map(toNamespace).map(namespace => '-' + namespace) + ].join(','); + createDebug.enable(''); + return namespaces; + } + + /** + * Returns true if the given mode name is enabled, false otherwise. + * + * @param {String} name + * @return {Boolean} + * @api public + */ + function enabled(name) { + if (name[name.length - 1] === '*') { + return true; + } + + let i; + let len; + + for (i = 0, len = createDebug.skips.length; i < len; i++) { + if (createDebug.skips[i].test(name)) { + return false; + } + } + + for (i = 0, len = createDebug.names.length; i < len; i++) { + if (createDebug.names[i].test(name)) { + return true; + } + } + + return false; + } + + /** + * Convert regexp to namespace + * + * @param {RegExp} regxep + * @return {String} namespace + * @api private + */ + function toNamespace(regexp) { + return regexp.toString() + .substring(2, regexp.toString().length - 2) + .replace(/\.\*\?$/, '*'); + } + + /** + * Coerce `val`. + * + * @param {Mixed} val + * @return {Mixed} + * @api private + */ + function coerce(val) { + if (val instanceof Error) { + return val.stack || val.message; + } + return val; + } + + /** + * XXX DO NOT USE. This is a temporary stub function. + * XXX It WILL be removed in the next major release. + */ + function destroy() { + console.warn('Instance method `debug.destroy()` is deprecated and no longer does anything. 
It will be removed in the next major version of `debug`.'); + } + + createDebug.enable(createDebug.load()); + + return createDebug; +} + +module.exports = setup; diff --git a/node_modules/mquery/node_modules/debug/src/index.js b/node_modules/mquery/node_modules/debug/src/index.js new file mode 100644 index 000000000..bf4c57f25 --- /dev/null +++ b/node_modules/mquery/node_modules/debug/src/index.js @@ -0,0 +1,10 @@ +/** + * Detect Electron renderer / nwjs process, which is node, but we should + * treat as a browser. + */ + +if (typeof process === 'undefined' || process.type === 'renderer' || process.browser === true || process.__nwjs) { + module.exports = require('./browser.js'); +} else { + module.exports = require('./node.js'); +} diff --git a/node_modules/mquery/node_modules/debug/src/node.js b/node_modules/mquery/node_modules/debug/src/node.js new file mode 100644 index 000000000..79bc085cb --- /dev/null +++ b/node_modules/mquery/node_modules/debug/src/node.js @@ -0,0 +1,263 @@ +/** + * Module dependencies. + */ + +const tty = require('tty'); +const util = require('util'); + +/** + * This is the Node.js implementation of `debug()`. + */ + +exports.init = init; +exports.log = log; +exports.formatArgs = formatArgs; +exports.save = save; +exports.load = load; +exports.useColors = useColors; +exports.destroy = util.deprecate( + () => {}, + 'Instance method `debug.destroy()` is deprecated and no longer does anything. It will be removed in the next major version of `debug`.' +); + +/** + * Colors. + */ + +exports.colors = [6, 2, 3, 4, 5, 1]; + +try { + // Optional dependency (as in, doesn't need to be installed, NOT like optionalDependencies in package.json) + // eslint-disable-next-line import/no-extraneous-dependencies + const supportsColor = require('supports-color'); + + if (supportsColor && (supportsColor.stderr || supportsColor).level >= 2) { + exports.colors = [ + 20, + 21, + 26, + 27, + 32, + 33, + 38, + 39, + 40, + 41, + 42, + 43, + 44, + 45, + 56, + 57, + 62, + 63, + 68, + 69, + 74, + 75, + 76, + 77, + 78, + 79, + 80, + 81, + 92, + 93, + 98, + 99, + 112, + 113, + 128, + 129, + 134, + 135, + 148, + 149, + 160, + 161, + 162, + 163, + 164, + 165, + 166, + 167, + 168, + 169, + 170, + 171, + 172, + 173, + 178, + 179, + 184, + 185, + 196, + 197, + 198, + 199, + 200, + 201, + 202, + 203, + 204, + 205, + 206, + 207, + 208, + 209, + 214, + 215, + 220, + 221 + ]; + } +} catch (error) { + // Swallow - we only care if `supports-color` is available; it doesn't have to be. +} + +/** + * Build up the default `inspectOpts` object from the environment variables. + * + * $ DEBUG_COLORS=no DEBUG_DEPTH=10 DEBUG_SHOW_HIDDEN=enabled node script.js + */ + +exports.inspectOpts = Object.keys(process.env).filter(key => { + return /^debug_/i.test(key); +}).reduce((obj, key) => { + // Camel-case + const prop = key + .substring(6) + .toLowerCase() + .replace(/_([a-z])/g, (_, k) => { + return k.toUpperCase(); + }); + + // Coerce string value into JS value + let val = process.env[key]; + if (/^(yes|on|true|enabled)$/i.test(val)) { + val = true; + } else if (/^(no|off|false|disabled)$/i.test(val)) { + val = false; + } else if (val === 'null') { + val = null; + } else { + val = Number(val); + } + + obj[prop] = val; + return obj; +}, {}); + +/** + * Is stdout a TTY? Colored output is enabled when `true`. + */ + +function useColors() { + return 'colors' in exports.inspectOpts ? + Boolean(exports.inspectOpts.colors) : + tty.isatty(process.stderr.fd); +} + +/** + * Adds ANSI color escape codes if enabled. 
+ * + * @api public + */ + +function formatArgs(args) { + const {namespace: name, useColors} = this; + + if (useColors) { + const c = this.color; + const colorCode = '\u001B[3' + (c < 8 ? c : '8;5;' + c); + const prefix = ` ${colorCode};1m${name} \u001B[0m`; + + args[0] = prefix + args[0].split('\n').join('\n' + prefix); + args.push(colorCode + 'm+' + module.exports.humanize(this.diff) + '\u001B[0m'); + } else { + args[0] = getDate() + name + ' ' + args[0]; + } +} + +function getDate() { + if (exports.inspectOpts.hideDate) { + return ''; + } + return new Date().toISOString() + ' '; +} + +/** + * Invokes `util.format()` with the specified arguments and writes to stderr. + */ + +function log(...args) { + return process.stderr.write(util.format(...args) + '\n'); +} + +/** + * Save `namespaces`. + * + * @param {String} namespaces + * @api private + */ +function save(namespaces) { + if (namespaces) { + process.env.DEBUG = namespaces; + } else { + // If you set a process.env field to null or undefined, it gets cast to the + // string 'null' or 'undefined'. Just delete instead. + delete process.env.DEBUG; + } +} + +/** + * Load `namespaces`. + * + * @return {String} returns the previously persisted debug modes + * @api private + */ + +function load() { + return process.env.DEBUG; +} + +/** + * Init logic for `debug` instances. + * + * Create a new `inspectOpts` object in case `useColors` is set + * differently for a particular `debug` instance. + */ + +function init(debug) { + debug.inspectOpts = {}; + + const keys = Object.keys(exports.inspectOpts); + for (let i = 0; i < keys.length; i++) { + debug.inspectOpts[keys[i]] = exports.inspectOpts[keys[i]]; + } +} + +module.exports = require('./common')(exports); + +const {formatters} = module.exports; + +/** + * Map %o to `util.inspect()`, all on a single line. + */ + +formatters.o = function (v) { + this.inspectOpts.colors = this.useColors; + return util.inspect(v, this.inspectOpts) + .split('\n') + .map(str => str.trim()) + .join(' '); +}; + +/** + * Map %O to `util.inspect()`, allowing multiple lines if needed. + */ + +formatters.O = function (v) { + this.inspectOpts.colors = this.useColors; + return util.inspect(v, this.inspectOpts); +}; diff --git a/node_modules/mquery/node_modules/ms/index.js b/node_modules/mquery/node_modules/ms/index.js new file mode 100644 index 000000000..c4498bcc2 --- /dev/null +++ b/node_modules/mquery/node_modules/ms/index.js @@ -0,0 +1,162 @@ +/** + * Helpers. + */ + +var s = 1000; +var m = s * 60; +var h = m * 60; +var d = h * 24; +var w = d * 7; +var y = d * 365.25; + +/** + * Parse or format the given `val`. + * + * Options: + * + * - `long` verbose formatting [false] + * + * @param {String|Number} val + * @param {Object} [options] + * @throws {Error} throw an error if val is not a non-empty string or a number + * @return {String|Number} + * @api public + */ + +module.exports = function(val, options) { + options = options || {}; + var type = typeof val; + if (type === 'string' && val.length > 0) { + return parse(val); + } else if (type === 'number' && isFinite(val)) { + return options.long ? fmtLong(val) : fmtShort(val); + } + throw new Error( + 'val is not a non-empty string or a valid number. val=' + + JSON.stringify(val) + ); +}; + +/** + * Parse the given `str` and return milliseconds. 
+ * + * @param {String} str + * @return {Number} + * @api private + */ + +function parse(str) { + str = String(str); + if (str.length > 100) { + return; + } + var match = /^(-?(?:\d+)?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|weeks?|w|years?|yrs?|y)?$/i.exec( + str + ); + if (!match) { + return; + } + var n = parseFloat(match[1]); + var type = (match[2] || 'ms').toLowerCase(); + switch (type) { + case 'years': + case 'year': + case 'yrs': + case 'yr': + case 'y': + return n * y; + case 'weeks': + case 'week': + case 'w': + return n * w; + case 'days': + case 'day': + case 'd': + return n * d; + case 'hours': + case 'hour': + case 'hrs': + case 'hr': + case 'h': + return n * h; + case 'minutes': + case 'minute': + case 'mins': + case 'min': + case 'm': + return n * m; + case 'seconds': + case 'second': + case 'secs': + case 'sec': + case 's': + return n * s; + case 'milliseconds': + case 'millisecond': + case 'msecs': + case 'msec': + case 'ms': + return n; + default: + return undefined; + } +} + +/** + * Short format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtShort(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return Math.round(ms / d) + 'd'; + } + if (msAbs >= h) { + return Math.round(ms / h) + 'h'; + } + if (msAbs >= m) { + return Math.round(ms / m) + 'm'; + } + if (msAbs >= s) { + return Math.round(ms / s) + 's'; + } + return ms + 'ms'; +} + +/** + * Long format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtLong(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return plural(ms, msAbs, d, 'day'); + } + if (msAbs >= h) { + return plural(ms, msAbs, h, 'hour'); + } + if (msAbs >= m) { + return plural(ms, msAbs, m, 'minute'); + } + if (msAbs >= s) { + return plural(ms, msAbs, s, 'second'); + } + return ms + ' ms'; +} + +/** + * Pluralization helper. + */ + +function plural(ms, msAbs, n, name) { + var isPlural = msAbs >= n * 1.5; + return Math.round(ms / n) + ' ' + name + (isPlural ? 's' : ''); +} diff --git a/node_modules/mquery/node_modules/ms/license.md b/node_modules/mquery/node_modules/ms/license.md new file mode 100644 index 000000000..69b61253a --- /dev/null +++ b/node_modules/mquery/node_modules/ms/license.md @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Zeit, Inc. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. 
diff --git a/node_modules/mquery/node_modules/ms/package.json b/node_modules/mquery/node_modules/ms/package.json new file mode 100644 index 000000000..cf623769a --- /dev/null +++ b/node_modules/mquery/node_modules/ms/package.json @@ -0,0 +1,69 @@ +{ + "_from": "ms@2.1.2", + "_id": "ms@2.1.2", + "_inBundle": false, + "_integrity": "sha512-sGkPx+VjMtmA6MX27oA4FBFELFCZZ4S4XqeGOXCv68tT+jb3vk/RyaKWP0PTKyWtmLSM0b+adUTEvbs1PEaH2w==", + "_location": "/mquery/ms", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "ms@2.1.2", + "name": "ms", + "escapedName": "ms", + "rawSpec": "2.1.2", + "saveSpec": null, + "fetchSpec": "2.1.2" + }, + "_requiredBy": [ + "/mquery/debug" + ], + "_resolved": "https://registry.npmjs.org/ms/-/ms-2.1.2.tgz", + "_shasum": "d09d1f357b443f493382a8eb3ccd183872ae6009", + "_spec": "ms@2.1.2", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mquery\\node_modules\\debug", + "bugs": { + "url": "https://github.com/zeit/ms/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Tiny millisecond conversion utility", + "devDependencies": { + "eslint": "4.12.1", + "expect.js": "0.3.1", + "husky": "0.14.3", + "lint-staged": "5.0.0", + "mocha": "4.0.1" + }, + "eslintConfig": { + "extends": "eslint:recommended", + "env": { + "node": true, + "es6": true + } + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/zeit/ms#readme", + "license": "MIT", + "lint-staged": { + "*.js": [ + "npm run lint", + "prettier --single-quote --write", + "git add" + ] + }, + "main": "./index", + "name": "ms", + "repository": { + "type": "git", + "url": "git+https://github.com/zeit/ms.git" + }, + "scripts": { + "lint": "eslint lib/* bin/*", + "precommit": "lint-staged", + "test": "mocha tests.js" + }, + "version": "2.1.2" +} diff --git a/node_modules/mquery/node_modules/ms/readme.md b/node_modules/mquery/node_modules/ms/readme.md new file mode 100644 index 000000000..9a1996b17 --- /dev/null +++ b/node_modules/mquery/node_modules/ms/readme.md @@ -0,0 +1,60 @@ +# ms + +[![Build Status](https://travis-ci.org/zeit/ms.svg?branch=master)](https://travis-ci.org/zeit/ms) +[![Join the community on Spectrum](https://withspectrum.github.io/badge/badge.svg)](https://spectrum.chat/zeit) + +Use this package to easily convert various time formats to milliseconds. + +## Examples + +```js +ms('2 days') // 172800000 +ms('1d') // 86400000 +ms('10h') // 36000000 +ms('2.5 hrs') // 9000000 +ms('2h') // 7200000 +ms('1m') // 60000 +ms('5s') // 5000 +ms('1y') // 31557600000 +ms('100') // 100 +ms('-3 days') // -259200000 +ms('-1h') // -3600000 +ms('-200') // -200 +``` + +### Convert from Milliseconds + +```js +ms(60000) // "1m" +ms(2 * 60000) // "2m" +ms(-3 * 60000) // "-3m" +ms(ms('10 hours')) // "10h" +``` + +### Time Format Written-Out + +```js +ms(60000, { long: true }) // "1 minute" +ms(2 * 60000, { long: true }) // "2 minutes" +ms(-3 * 60000, { long: true }) // "-3 minutes" +ms(ms('10 hours'), { long: true }) // "10 hours" +``` + +## Features + +- Works both in [Node.js](https://nodejs.org) and in the browser +- If a number is supplied to `ms`, a string with a unit is returned +- If a string that contains the number is supplied, it returns it as a number (e.g.: it returns `100` for `'100'`) +- If you pass a string with a number and a valid unit, the number of equivalent milliseconds is returned + +## Related Packages + +- [ms.macro](https://github.com/knpwrs/ms.macro) - Run `ms` as a macro at build-time. 
+ +## Caught a Bug? + +1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device +2. Link the package to the global module directory: `npm link` +3. Within the module you want to test your local development instance of ms, just link it to the dependencies: `npm link ms`. Instead of the default one from npm, Node.js will now use your clone of ms! + +As always, you can run the tests using: `npm test` diff --git a/node_modules/mquery/package.json b/node_modules/mquery/package.json new file mode 100644 index 000000000..88c01e3e0 --- /dev/null +++ b/node_modules/mquery/package.json @@ -0,0 +1,68 @@ +{ + "_from": "mquery@4.0.0", + "_id": "mquery@4.0.0", + "_inBundle": false, + "_integrity": "sha512-nGjm89lHja+T/b8cybAby6H0YgA4qYC/lx6UlwvHGqvTq8bDaNeCwl1sY8uRELrNbVWJzIihxVd+vphGGn1vBw==", + "_location": "/mquery", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "mquery@4.0.0", + "name": "mquery", + "escapedName": "mquery", + "rawSpec": "4.0.0", + "saveSpec": null, + "fetchSpec": "4.0.0" + }, + "_requiredBy": [ + "/mongoose" + ], + "_resolved": "https://registry.npmjs.org/mquery/-/mquery-4.0.0.tgz", + "_shasum": "6c62160ad25289e99e0840907757cdfd62bde775", + "_spec": "mquery@4.0.0", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "Aaron Heckmann", + "email": "aaron.heckmann+github@gmail.com" + }, + "bugs": { + "url": "https://github.com/aheckmann/mquery/issues/new" + }, + "bundleDependencies": false, + "dependencies": { + "debug": "4.x", + "regexp-clone": "^1.0.0", + "sliced": "1.0.1" + }, + "deprecated": false, + "description": "Expressive query building for MongoDB", + "devDependencies": { + "eslint": "5.x", + "eslint-plugin-mocha-no-only": "1.1.0", + "mocha": "9.x", + "mongodb": "4.x" + }, + "engines": { + "node": ">=12.0.0" + }, + "homepage": "https://github.com/aheckmann/mquery/", + "keywords": [ + "mongodb", + "query", + "builder" + ], + "license": "MIT", + "main": "lib/mquery.js", + "name": "mquery", + "repository": { + "type": "git", + "url": "git://github.com/aheckmann/mquery.git" + }, + "scripts": { + "fix-lint": "eslint . 
--fix", + "lint": "eslint .", + "test": "mocha test/index.js test/*.test.js" + }, + "version": "4.0.0" +} diff --git a/node_modules/mquery/test/.eslintrc.yml b/node_modules/mquery/test/.eslintrc.yml new file mode 100644 index 000000000..7e104dbcf --- /dev/null +++ b/node_modules/mquery/test/.eslintrc.yml @@ -0,0 +1,2 @@ +env: + mocha: true \ No newline at end of file diff --git a/node_modules/mquery/test/collection/browser.js b/node_modules/mquery/test/collection/browser.js new file mode 100644 index 000000000..e69de29bb diff --git a/node_modules/mquery/test/collection/mongo.js b/node_modules/mquery/test/collection/mongo.js new file mode 100644 index 000000000..e69de29bb diff --git a/node_modules/mquery/test/collection/node.js b/node_modules/mquery/test/collection/node.js new file mode 100644 index 000000000..a5eb875e6 --- /dev/null +++ b/node_modules/mquery/test/collection/node.js @@ -0,0 +1,29 @@ +'use strict'; + +const assert = require('assert'); +const mongo = require('mongodb'); + +const uri = process.env.MQUERY_URI || 'mongodb://localhost/mquery'; +let client; +let db; + +exports.getCollection = function(cb) { + mongo.MongoClient.connect(uri, function(err, _client) { + assert.ifError(err); + client = _client; + db = client.db(); + + const collection = db.collection('stuff'); + + // clean test db before starting + db.dropDatabase(function() { + cb(null, collection); + }); + }); +}; + +exports.dropCollection = function(cb) { + db.dropDatabase(function() { + client.close(cb); + }); +}; diff --git a/node_modules/mquery/test/env.js b/node_modules/mquery/test/env.js new file mode 100644 index 000000000..0bf8cf9ed --- /dev/null +++ b/node_modules/mquery/test/env.js @@ -0,0 +1,22 @@ +'use strict'; + +const env = require('../').env; + +console.log('environment: %s', env.type); + +let col; +switch (env.type) { + case 'node': + col = require('./collection/node'); + break; + case 'mongo': + col = require('./collection/mongo'); + break; + case 'browser': + col = require('./collection/browser'); + break; + default: + throw new Error('missing collection implementation for environment: ' + env.type); +} + +module.exports = exports = col; diff --git a/node_modules/mquery/test/index.js b/node_modules/mquery/test/index.js new file mode 100644 index 000000000..1f0524b00 --- /dev/null +++ b/node_modules/mquery/test/index.js @@ -0,0 +1,2956 @@ +'use strict'; + +const mquery = require('../'); +const assert = require('assert'); + +/* global Map */ + +describe('mquery', function() { + let col; + + before(function(done) { + // get the env specific collection interface + require('./env').getCollection(function(err, collection) { + assert.ifError(err); + col = collection; + done(); + }); + }); + + after(function(done) { + require('./env').dropCollection(done); + }); + + describe('mquery', function() { + it('is a function', function() { + assert.equal('function', typeof mquery); + }); + it('creates instances with the `new` keyword', function() { + assert.ok(mquery() instanceof mquery); + }); + describe('defaults', function() { + it('are set', function() { + const m = mquery(); + assert.strictEqual(undefined, m.op); + assert.deepEqual({}, m.options); + }); + }); + describe('criteria', function() { + it('if collection-like is used as collection', function() { + const m = mquery(col); + assert.equal(col, m._collection.collection); + }); + it('non-collection-like is used as criteria', function() { + const m = mquery({ works: true }); + assert.ok(!m._collection); + assert.deepEqual({ works: true }, m._conditions); + 
}); + }); + describe('options', function() { + it('are merged when passed', function() { + let m; + m = mquery(col, { w: 'majority' }); + assert.deepEqual({ w: 'majority' }, m.options); + m = mquery({ name: 'mquery' }, { w: 'majority' }); + assert.deepEqual({ w: 'majority' }, m.options); + }); + }); + }); + + describe('toConstructor', function() { + it('creates subclasses of mquery', function() { + const opts = { safe: { w: 'majority' }, readPreference: 'p' }; + const match = { name: 'test', count: { $gt: 101 } }; + const select = { name: 1, count: 0 }; + const update = { $set: { x: true } }; + const path = 'street'; + + const q = mquery().setOptions(opts); + q.where(match); + q.select(select); + q.updateOne(update); + q.where(path); + q.find(); + + const M = q.toConstructor(); + const m = M(); + + assert.ok(m instanceof mquery); + assert.deepEqual(opts, m.options); + assert.deepEqual(match, m._conditions); + assert.deepEqual(select, m._fields); + assert.deepEqual(update, m._update); + assert.equal(path, m._path); + assert.equal('find', m.op); + }); + }); + + describe('setOptions', function() { + it('calls associated methods', function() { + const m = mquery(); + assert.equal(m._collection, null); + m.setOptions({ collection: col }); + assert.equal(m._collection.collection, col); + }); + it('directly sets option when no method exists', function() { + const m = mquery(); + assert.equal(m.options.woot, null); + m.setOptions({ woot: 'yay' }); + assert.equal(m.options.woot, 'yay'); + }); + it('is chainable', function() { + const m = mquery(); + let n; + + n = m.setOptions(); + assert.equal(m, n); + n = m.setOptions({ x: 1 }); + assert.equal(m, n); + }); + }); + + describe('collection', function() { + it('sets the _collection', function() { + const m = mquery(); + m.collection(col); + assert.equal(m._collection.collection, col); + }); + it('is chainable', function() { + const m = mquery(); + const n = m.collection(col); + assert.equal(m, n); + }); + }); + + describe('$where', function() { + it('sets the $where condition', function() { + const m = mquery(); + function go() {} + m.$where(go); + assert.ok(go === m._conditions.$where); + }); + it('is chainable', function() { + const m = mquery(); + const n = m.$where('x'); + assert.equal(m, n); + }); + }); + + describe('where', function() { + it('without arguments', function() { + const m = mquery(); + m.where(); + assert.deepEqual({}, m._conditions); + }); + it('with non-string/object argument', function() { + const m = mquery(); + + assert.throws(function() { + m.where([]); + }, /path must be a string or object/); + }); + describe('with one argument', function() { + it('that is an object', function() { + const m = mquery(); + m.where({ name: 'flawed' }); + assert.strictEqual(m._conditions.name, 'flawed'); + }); + it('that is a query', function() { + const m = mquery({ name: 'first' }); + const n = mquery({ name: 'changed' }); + m.where(n); + assert.strictEqual(m._conditions.name, 'changed'); + }); + it('that is a string', function() { + const m = mquery(); + m.where('name'); + assert.equal('name', m._path); + assert.strictEqual(m._conditions.name, undefined); + }); + }); + it('with two arguments', function() { + const m = mquery(); + m.where('name', 'The Great Pumpkin'); + assert.equal('name', m._path); + assert.strictEqual(m._conditions.name, 'The Great Pumpkin'); + }); + it('is chainable', function() { + const m = mquery(); + + let n = m.where('x', 'y'); + assert.equal(m, n); + n = m.where(); + assert.equal(m, n); + }); + }); + 
describe('equals', function() { + it('must be called after where()', function() { + const m = mquery(); + assert.throws(function() { + m.equals(); + }, /must be used after where/); + }); + it('sets value of path set with where()', function() { + const m = mquery(); + m.where('age').equals(1000); + assert.deepEqual({ age: 1000 }, m._conditions); + }); + it('is chainable', function() { + const m = mquery(); + const n = m.where('x').equals(3); + assert.equal(m, n); + }); + }); + describe('eq', function() { + it('is alias of equals', function() { + const m = mquery(); + m.where('age').eq(1000); + assert.deepEqual({ age: 1000 }, m._conditions); + }); + }); + describe('or', function() { + it('pushes onto the internal $or condition', function() { + const m = mquery(); + m.or({ 'Nightmare Before Christmas': true }); + assert.deepEqual([{ 'Nightmare Before Christmas': true }], m._conditions.$or); + }); + it('allows passing arrays', function() { + const m = mquery(); + const arg = [{ 'Nightmare Before Christmas': true }, { x: 1 }]; + m.or(arg); + assert.deepEqual(arg, m._conditions.$or); + }); + it('allows calling multiple times', function() { + const m = mquery(); + const arg = [{ looper: true }, { x: 1 }]; + m.or(arg); + m.or({ y: 1 }); + m.or([{ w: 'oo' }, { z: 'oo' }]); + assert.deepEqual([{ looper: true }, { x: 1 }, { y: 1 }, { w: 'oo' }, { z: 'oo' }], m._conditions.$or); + }); + it('is chainable', function() { + const m = mquery(); + m.or({ o: 'k' }).where('name', 'table'); + assert.deepEqual({ name: 'table', $or: [{ o: 'k' }] }, m._conditions); + }); + }); + + describe('nor', function() { + it('pushes onto the internal $nor condition', function() { + const m = mquery(); + m.nor({ 'Nightmare Before Christmas': true }); + assert.deepEqual([{ 'Nightmare Before Christmas': true }], m._conditions.$nor); + }); + it('allows passing arrays', function() { + const m = mquery(); + const arg = [{ 'Nightmare Before Christmas': true }, { x: 1 }]; + m.nor(arg); + assert.deepEqual(arg, m._conditions.$nor); + }); + it('allows calling multiple times', function() { + const m = mquery(); + const arg = [{ looper: true }, { x: 1 }]; + m.nor(arg); + m.nor({ y: 1 }); + m.nor([{ w: 'oo' }, { z: 'oo' }]); + assert.deepEqual([{ looper: true }, { x: 1 }, { y: 1 }, { w: 'oo' }, { z: 'oo' }], m._conditions.$nor); + }); + it('is chainable', function() { + const m = mquery(); + m.nor({ o: 'k' }).where('name', 'table'); + assert.deepEqual({ name: 'table', $nor: [{ o: 'k' }] }, m._conditions); + }); + }); + + describe('and', function() { + it('pushes onto the internal $and condition', function() { + const m = mquery(); + m.and({ 'Nightmare Before Christmas': true }); + assert.deepEqual([{ 'Nightmare Before Christmas': true }], m._conditions.$and); + }); + it('allows passing arrays', function() { + const m = mquery(); + const arg = [{ 'Nightmare Before Christmas': true }, { x: 1 }]; + m.and(arg); + assert.deepEqual(arg, m._conditions.$and); + }); + it('allows calling multiple times', function() { + const m = mquery(); + const arg = [{ looper: true }, { x: 1 }]; + m.and(arg); + m.and({ y: 1 }); + m.and([{ w: 'oo' }, { z: 'oo' }]); + assert.deepEqual([{ looper: true }, { x: 1 }, { y: 1 }, { w: 'oo' }, { z: 'oo' }], m._conditions.$and); + }); + it('is chainable', function() { + const m = mquery(); + m.and({ o: 'k' }).where('name', 'table'); + assert.deepEqual({ name: 'table', $and: [{ o: 'k' }] }, m._conditions); + }); + }); + + function generalCondition(type) { + return function() { + it('accepts 2 args', function() { + const m 
= mquery()[type]('count', 3); + const check = {}; + check['$' + type] = 3; + assert.deepEqual(m._conditions.count, check); + }); + it('uses previously set `where` path if 1 arg passed', function() { + const m = mquery().where('count')[type](3); + const check = {}; + check['$' + type] = 3; + assert.deepEqual(m._conditions.count, check); + }); + it('throws if 1 arg was passed but no previous `where` was used', function() { + assert.throws(function() { + mquery()[type](3); + }, /must be used after where/); + }); + it('is chainable', function() { + const m = mquery().where('count')[type](3).where('x', 8); + const check = { x: 8, count: {} }; + check.count['$' + type] = 3; + assert.deepEqual(m._conditions, check); + }); + it('overwrites previous value', function() { + const m = mquery().where('count')[type](3)[type](8); + const check = {}; + check['$' + type] = 8; + assert.deepEqual(m._conditions.count, check); + }); + }; + } + + 'gt gte lt lte ne in nin regex size maxDistance minDistance'.split(' ').forEach(function(type) { + describe(type, generalCondition(type)); + }); + + describe('mod', function() { + describe('with 1 argument', function() { + it('requires a previous where()', function() { + assert.throws(function() { + mquery().mod([30, 10]); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().where('madmen').mod([10, 20]); + assert.deepEqual(m._conditions, { madmen: { $mod: [10, 20] } }); + }); + }); + + describe('with 2 arguments and second is non-Array', function() { + it('requires a previous where()', function() { + assert.throws(function() { + mquery().mod('x', 10); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().where('madmen').mod(10, 20); + assert.deepEqual(m._conditions, { madmen: { $mod: [10, 20] } }); + }); + }); + + it('with 2 arguments and second is an array', function() { + const m = mquery().mod('madmen', [10, 20]); + assert.deepEqual(m._conditions, { madmen: { $mod: [10, 20] } }); + }); + + it('with 3 arguments', function() { + const m = mquery().mod('madmen', 10, 20); + assert.deepEqual(m._conditions, { madmen: { $mod: [10, 20] } }); + }); + + it('is chainable', function() { + const m = mquery().mod('madmen', 10, 20).where('x', 8); + const check = { madmen: { $mod: [10, 20] }, x: 8 }; + assert.deepEqual(m._conditions, check); + }); + }); + + describe('exists', function() { + it('with 0 args', function() { + it('throws if not used after where()', function() { + assert.throws(function() { + mquery().exists(); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().where('name').exists(); + const check = { name: { $exists: true } }; + assert.deepEqual(m._conditions, check); + }); + }); + + describe('with 1 arg', function() { + describe('that is boolean', function() { + it('throws if not used after where()', function() { + assert.throws(function() { + mquery().exists(); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().exists('name', false); + const check = { name: { $exists: false } }; + assert.deepEqual(m._conditions, check); + }); + }); + describe('that is not boolean', function() { + it('sets the value to `true`', function() { + const m = mquery().where('name').exists('yummy'); + const check = { yummy: { $exists: true } }; + assert.deepEqual(m._conditions, check); + }); + }); + }); + + describe('with 2 args', function() { + it('works', function() { + const m = mquery().exists('yummy', false); + const check = { yummy: { $exists: 
false } }; + assert.deepEqual(m._conditions, check); + }); + }); + + it('is chainable', function() { + const m = mquery().where('name').exists().find({ x: 1 }); + const check = { name: { $exists: true }, x: 1 }; + assert.deepEqual(m._conditions, check); + }); + }); + + describe('elemMatch', function() { + describe('with null/undefined first argument', function() { + assert.throws(function() { + mquery().elemMatch(); + }, /Invalid argument/); + assert.throws(function() { + mquery().elemMatch(null); + }, /Invalid argument/); + assert.doesNotThrow(function() { + mquery().elemMatch('', {}); + }); + }); + + describe('with 1 argument', function() { + it('throws if not a function or object', function() { + assert.throws(function() { + mquery().elemMatch([]); + }, /Invalid argument/); + }); + + describe('that is an object', function() { + it('throws if no previous `where` was used', function() { + assert.throws(function() { + mquery().elemMatch({}); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().where('comment').elemMatch({ author: 'joe', votes: { $gte: 3 } }); + assert.deepEqual({ comment: { $elemMatch: { author: 'joe', votes: { $gte: 3 } } } }, m._conditions); + }); + }); + describe('that is a function', function() { + it('throws if no previous `where` was used', function() { + assert.throws(function() { + mquery().elemMatch(function() {}); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().where('comment').elemMatch(function(query) { + query.where({ author: 'joe', votes: { $gte: 3 } }); + }); + assert.deepEqual({ comment: { $elemMatch: { author: 'joe', votes: { $gte: 3 } } } }, m._conditions); + }); + }); + }); + + describe('with 2 arguments', function() { + describe('and the 2nd is an object', function() { + it('works', function() { + const m = mquery().elemMatch('comment', { author: 'joe', votes: { $gte: 3 } }); + assert.deepEqual({ comment: { $elemMatch: { author: 'joe', votes: { $gte: 3 } } } }, m._conditions); + }); + }); + describe('and the 2nd is a function', function() { + it('works', function() { + const m = mquery().elemMatch('comment', function(query) { + query.where({ author: 'joe', votes: { $gte: 3 } }); + }); + assert.deepEqual({ comment: { $elemMatch: { author: 'joe', votes: { $gte: 3 } } } }, m._conditions); + }); + }); + it('and the 2nd is not a function or object', function() { + assert.throws(function() { + mquery().elemMatch('comment', []); + }, /Invalid argument/); + }); + }); + }); + + describe('within', function() { + it('is chainable', function() { + const m = mquery(); + assert.equal(m.where('a').within(), m); + }); + describe('when called with arguments', function() { + it('must follow where()', function() { + assert.throws(function() { + mquery().within([]); + }, /must be used after where/); + }); + + describe('of length 1', function() { + it('throws if not a recognized shape', function() { + assert.throws(function() { + mquery().where('loc').within({}); + }, /Invalid argument/); + assert.throws(function() { + mquery().where('loc').within(null); + }, /Invalid argument/); + }); + it('delegates to circle when center exists', function() { + const m = mquery().where('loc').within({ center: [10, 10], radius: 3 }); + assert.deepEqual({ $geoWithin: { $center: [[10, 10], 3] } }, m._conditions.loc); + }); + it('delegates to box when exists', function() { + const m = mquery().where('loc').within({ box: [[10, 10], [11, 14]] }); + assert.deepEqual({ $geoWithin: { $box: [[10, 10], [11, 14]] } }, 
m._conditions.loc); + }); + it('delegates to polygon when exists', function() { + const m = mquery().where('loc').within({ polygon: [[10, 10], [11, 14], [10, 9]] }); + assert.deepEqual({ $geoWithin: { $polygon: [[10, 10], [11, 14], [10, 9]] } }, m._conditions.loc); + }); + it('delegates to geometry when exists', function() { + const m = mquery().where('loc').within({ type: 'Polygon', coordinates: [[10, 10], [11, 14], [10, 9]] }); + assert.deepEqual({ $geoWithin: { $geometry: { type: 'Polygon', coordinates: [[10, 10], [11, 14], [10, 9]] } } }, m._conditions.loc); + }); + }); + + describe('of length 2', function() { + it('delegates to box()', function() { + const m = mquery().where('loc').within([1, 2], [2, 5]); + assert.deepEqual(m._conditions.loc, { $geoWithin: { $box: [[1, 2], [2, 5]] } }); + }); + }); + + describe('of length > 2', function() { + it('delegates to polygon()', function() { + const m = mquery().where('loc').within([1, 2], [2, 5], [2, 4], [1, 3]); + assert.deepEqual(m._conditions.loc, { $geoWithin: { $polygon: [[1, 2], [2, 5], [2, 4], [1, 3]] } }); + }); + }); + }); + }); + + describe('geoWithin', function() { + before(function() { + mquery.use$geoWithin = false; + }); + after(function() { + mquery.use$geoWithin = true; + }); + describe('when called with arguments', function() { + describe('of length 1', function() { + it('delegates to circle when center exists', function() { + const m = mquery().where('loc').within({ center: [10, 10], radius: 3 }); + assert.deepEqual({ $within: { $center: [[10, 10], 3] } }, m._conditions.loc); + }); + it('delegates to box when exists', function() { + const m = mquery().where('loc').within({ box: [[10, 10], [11, 14]] }); + assert.deepEqual({ $within: { $box: [[10, 10], [11, 14]] } }, m._conditions.loc); + }); + it('delegates to polygon when exists', function() { + const m = mquery().where('loc').within({ polygon: [[10, 10], [11, 14], [10, 9]] }); + assert.deepEqual({ $within: { $polygon: [[10, 10], [11, 14], [10, 9]] } }, m._conditions.loc); + }); + it('delegates to geometry when exists', function() { + const m = mquery().where('loc').within({ type: 'Polygon', coordinates: [[10, 10], [11, 14], [10, 9]] }); + assert.deepEqual({ $within: { $geometry: { type: 'Polygon', coordinates: [[10, 10], [11, 14], [10, 9]] } } }, m._conditions.loc); + }); + }); + + describe('of length 2', function() { + it('delegates to box()', function() { + const m = mquery().where('loc').within([1, 2], [2, 5]); + assert.deepEqual(m._conditions.loc, { $within: { $box: [[1, 2], [2, 5]] } }); + }); + }); + + describe('of length > 2', function() { + it('delegates to polygon()', function() { + const m = mquery().where('loc').within([1, 2], [2, 5], [2, 4], [1, 3]); + assert.deepEqual(m._conditions.loc, { $within: { $polygon: [[1, 2], [2, 5], [2, 4], [1, 3]] } }); + }); + }); + }); + }); + + describe('box', function() { + describe('with 1 argument', function() { + it('throws', function() { + assert.throws(function() { + mquery().box('sometihng'); + }, /Invalid argument/); + }); + }); + describe('with > 3 arguments', function() { + it('throws', function() { + assert.throws(function() { + mquery().box(1, 2, 3, 4); + }, /Invalid argument/); + }); + }); + + describe('with 2 arguments', function() { + it('throws if not used after where()', function() { + assert.throws(function() { + mquery().box([], []); + }, /must be used after where/); + }); + it('works', function() { + const m = mquery().where('loc').box([1, 2], [3, 4]); + assert.deepEqual(m._conditions.loc, { $geoWithin: { 
$box: [[1, 2], [3, 4]] } }); + }); + }); + + describe('with 3 arguments', function() { + it('works', function() { + const m = mquery().box('loc', [1, 2], [3, 4]); + assert.deepEqual(m._conditions.loc, { $geoWithin: { $box: [[1, 2], [3, 4]] } }); + }); + }); + }); + + describe('polygon', function() { + describe('when first argument is not a string', function() { + it('throws if not used after where()', function() { + assert.throws(function() { + mquery().polygon({}); + }, /must be used after where/); + + assert.doesNotThrow(function() { + mquery().where('loc').polygon([1, 2], [2, 3], [3, 6]); + }); + }); + + it('assigns arguments to within polygon condition', function() { + const m = mquery().where('loc').polygon([1, 2], [2, 3], [3, 6]); + assert.deepEqual(m._conditions, { loc: { $geoWithin: { $polygon: [[1, 2], [2, 3], [3, 6]] } } }); + }); + }); + + describe('when first arg is a string', function() { + it('assigns remaining arguments to within polygon condition', function() { + const m = mquery().polygon('loc', [1, 2], [2, 3], [3, 6]); + assert.deepEqual(m._conditions, { loc: { $geoWithin: { $polygon: [[1, 2], [2, 3], [3, 6]] } } }); + }); + }); + }); + + describe('circle', function() { + describe('with one arg', function() { + it('must follow where()', function() { + assert.throws(function() { + mquery().circle('x'); + }, /must be used after where/); + assert.doesNotThrow(function() { + mquery().where('loc').circle({ center: [0, 0], radius: 3 }); + }); + }); + it('works', function() { + const m = mquery().where('loc').circle({ center: [0, 0], radius: 3 }); + assert.deepEqual(m._conditions, { loc: { $geoWithin: { $center: [[0, 0], 3] } } }); + }); + }); + describe('with 3 args', function() { + it('throws', function() { + assert.throws(function() { + mquery().where('loc').circle(1, 2, 3); + }, /Invalid argument/); + }); + }); + describe('requires radius and center', function() { + assert.throws(function() { + mquery().circle('loc', { center: 1 }); + }, /center and radius are required/); + assert.throws(function() { + mquery().circle('loc', { radius: 1 }); + }, /center and radius are required/); + assert.doesNotThrow(function() { + mquery().circle('loc', { center: [1, 2], radius: 1 }); + }); + }); + }); + + describe('geometry', function() { + // within + intersects + const point = { type: 'Point', coordinates: [[0, 0], [1, 1]] }; + + it('must be called after within or intersects', function(done) { + assert.throws(function() { + mquery().where('a').geometry(point); + }, /must come after/); + + assert.doesNotThrow(function() { + mquery().where('a').within().geometry(point); + }); + + assert.doesNotThrow(function() { + mquery().where('a').intersects().geometry(point); + }); + + done(); + }); + + describe('when called with one argument', function() { + describe('after within()', function() { + it('and arg quacks like geoJSON', function(done) { + const m = mquery().where('a').within().geometry(point); + assert.deepEqual({ a: { $geoWithin: { $geometry: point } } }, m._conditions); + done(); + }); + }); + + describe('after intersects()', function() { + it('and arg quacks like geoJSON', function(done) { + const m = mquery().where('a').intersects().geometry(point); + assert.deepEqual({ a: { $geoIntersects: { $geometry: point } } }, m._conditions); + done(); + }); + }); + + it('and arg does not quack like geoJSON', function(done) { + assert.throws(function() { + mquery().where('b').within().geometry({ type: 1, coordinates: 2 }); + }, /Invalid argument/); + done(); + }); + }); + + describe('when 
called with zero arguments', function() { + it('throws', function(done) { + assert.throws(function() { + mquery().where('a').within().geometry(); + }, /Invalid argument/); + + done(); + }); + }); + + describe('when called with more than one arguments', function() { + it('throws', function(done) { + assert.throws(function() { + mquery().where('a').within().geometry({ type: 'a', coordinates: [] }, 2); + }, /Invalid argument/); + done(); + }); + }); + }); + + describe('intersects', function() { + it('must be used after where()', function(done) { + const m = mquery(); + assert.throws(function() { + m.intersects(); + }, /must be used after where/); + done(); + }); + + it('sets geo comparison to "$intersects"', function(done) { + const n = mquery().where('a').intersects(); + assert.equal('$geoIntersects', n._geoComparison); + done(); + }); + + it('is chainable', function() { + const m = mquery(); + assert.equal(m.where('a').intersects(), m); + }); + + it('calls geometry if argument quacks like geojson', function(done) { + const m = mquery(); + const o = { type: 'LineString', coordinates: [[0, 1], [3, 40]] }; + let ran = false; + + m.geometry = function(arg) { + ran = true; + assert.deepEqual(o, arg); + }; + + m.where('a').intersects(o); + assert.ok(ran); + + done(); + }); + + it('throws if argument is not geometry-like', function(done) { + const m = mquery().where('a'); + + assert.throws(function() { + m.intersects(null); + }, /Invalid argument/); + + assert.throws(function() { + m.intersects(undefined); + }, /Invalid argument/); + + assert.throws(function() { + m.intersects(false); + }, /Invalid argument/); + + assert.throws(function() { + m.intersects({}); + }, /Invalid argument/); + + assert.throws(function() { + m.intersects([]); + }, /Invalid argument/); + + assert.throws(function() { + m.intersects(function() {}); + }, /Invalid argument/); + + assert.throws(function() { + m.intersects(NaN); + }, /Invalid argument/); + + done(); + }); + }); + + describe('near', function() { + // near nearSphere + describe('with 0 args', function() { + it('is compatible with geometry()', function(done) { + const q = mquery().where('x').near().geometry({ type: 'Point', coordinates: [180, 11] }); + assert.deepEqual({ $near: { $geometry: { type: 'Point', coordinates: [180, 11] } } }, q._conditions.x); + done(); + }); + }); + + describe('with 1 arg', function() { + it('throws if not used after where()', function() { + assert.throws(function() { + mquery().near(1); + }, /must be used after where/); + }); + it('does not throw if used after where()', function() { + assert.doesNotThrow(function() { + mquery().where('loc').near({ center: [1, 1] }); + }); + }); + }); + describe('with > 2 args', function() { + it('throws', function() { + assert.throws(function() { + mquery().near(1, 2, 3); + }, /Invalid argument/); + }); + }); + + it('creates $geometry args for GeoJSON', function() { + const m = mquery().where('loc').near({ center: { type: 'Point', coordinates: [10, 10] } }); + assert.deepEqual({ $near: { $geometry: { type: 'Point', coordinates: [10, 10] } } }, m._conditions.loc); + }); + + it('expects `center`', function() { + assert.throws(function() { + mquery().near('loc', { maxDistance: 3 }); + }, /center is required/); + assert.doesNotThrow(function() { + mquery().near('loc', { center: [3, 4] }); + }); + }); + + it('accepts spherical conditions', function() { + const m = mquery().where('loc').near({ center: [1, 2], spherical: true }); + assert.deepEqual(m._conditions, { loc: { $nearSphere: [1, 2] } }); + }); + + 
it('is non-spherical by default', function() { + const m = mquery().where('loc').near({ center: [1, 2] }); + assert.deepEqual(m._conditions, { loc: { $near: [1, 2] } }); + }); + + it('supports maxDistance', function() { + const m = mquery().where('loc').near({ center: [1, 2], maxDistance: 4 }); + assert.deepEqual(m._conditions, { loc: { $near: [1, 2], $maxDistance: 4 } }); + }); + + it('supports minDistance', function() { + const m = mquery().where('loc').near({ center: [1, 2], minDistance: 4 }); + assert.deepEqual(m._conditions, { loc: { $near: [1, 2], $minDistance: 4 } }); + }); + + it('is chainable', function() { + const m = mquery().where('loc').near({ center: [1, 2], maxDistance: 4 }).find({ x: 1 }); + assert.deepEqual(m._conditions, { loc: { $near: [1, 2], $maxDistance: 4 }, x: 1 }); + }); + + describe('supports passing GeoJSON, gh-13', function() { + it('with center', function() { + const m = mquery().where('loc').near({ + center: { type: 'Point', coordinates: [1, 1] }, + maxDistance: 2 + }); + + const expect = { + loc: { + $near: { + $geometry: { + type: 'Point', + coordinates: [1, 1] + }, + $maxDistance: 2 + } + } + }; + + assert.deepEqual(m._conditions, expect); + }); + }); + }); + + // fields + + describe('select', function() { + describe('with 0 args', function() { + it('is chainable', function() { + const m = mquery(); + assert.equal(m, m.select()); + }); + }); + + it('accepts an object', function() { + const o = { x: 1, y: 1 }; + const m = mquery().select(o); + assert.deepEqual(m._fields, o); + }); + + it('accepts a string', function() { + const o = 'x -y'; + const m = mquery().select(o); + assert.deepEqual(m._fields, { x: 1, y: 0 }); + }); + + it('does accept an array', function() { + const o = ['x', '-y']; + const m = mquery().select(o); + assert.deepEqual(m._fields, { x: 1, y: 0 }); + }); + + it('merges previous arguments', function() { + const o = { x: 1, y: 0, a: 1 }; + const m = mquery().select(o); + m.select('z -u w').select({ x: 0 }); + assert.deepEqual(m._fields, { + x: 0, + y: 0, + z: 1, + u: 0, + w: 1, + a: 1 + }); + }); + + it('rejects non-string, object, arrays', function() { + assert.throws(function() { + mquery().select(function() {}); + }, /Invalid select\(\) argument/); + }); + + it('accepts arguments objects', function() { + const m = mquery(); + function t() { + m.select(arguments); + assert.deepEqual(m._fields, { x: 1, y: 0 }); + } + t('x', '-y'); + }); + + noDistinct('select'); + }); + + describe('selected', function() { + it('returns true when fields have been selected', function(done) { + let m; + + m = mquery().select({ name: 1 }); + assert.ok(m.selected()); + + m = mquery().select('name'); + assert.ok(m.selected()); + + done(); + }); + + it('returns false when no fields have been selected', function(done) { + const m = mquery(); + assert.strictEqual(false, m.selected()); + done(); + }); + }); + + describe('selectedInclusively', function() { + describe('returns false', function() { + it('when no fields have been selected', function(done) { + assert.strictEqual(false, mquery().selectedInclusively()); + assert.equal(false, mquery().select({}).selectedInclusively()); + done(); + }); + it('when any fields have been excluded', function(done) { + assert.strictEqual(false, mquery().select('-name').selectedInclusively()); + assert.strictEqual(false, mquery().select({ name: 0 }).selectedInclusively()); + assert.strictEqual(false, mquery().select('name bio -_id').selectedInclusively()); + assert.strictEqual(false, mquery().select({ name: 1, _id: 0 
}).selectedInclusively()); + done(); + }); + it('when using $meta', function(done) { + assert.strictEqual(false, mquery().select({ name: { $meta: 'textScore' } }).selectedInclusively()); + done(); + }); + }); + + describe('returns true', function() { + it('when fields have been included', function(done) { + assert.equal(true, mquery().select('name').selectedInclusively()); + assert.equal(true, mquery().select({ name: 1 }).selectedInclusively()); + done(); + }); + }); + }); + + describe('selectedExclusively', function() { + describe('returns false', function() { + it('when no fields have been selected', function(done) { + assert.equal(false, mquery().selectedExclusively()); + assert.equal(false, mquery().select({}).selectedExclusively()); + done(); + }); + it('when fields have only been included', function(done) { + assert.equal(false, mquery().select('name').selectedExclusively()); + assert.equal(false, mquery().select({ name: 1 }).selectedExclusively()); + done(); + }); + }); + + describe('returns true', function() { + it('when any field has been excluded', function(done) { + assert.equal(true, mquery().select('-name').selectedExclusively()); + assert.equal(true, mquery().select({ name: 0 }).selectedExclusively()); + assert.equal(true, mquery().select('-_id').selectedExclusively()); + assert.strictEqual(true, mquery().select('name bio -_id').selectedExclusively()); + assert.strictEqual(true, mquery().select({ name: 1, _id: 0 }).selectedExclusively()); + done(); + }); + }); + }); + + describe('slice', function() { + describe('with 0 args', function() { + it('is chainable', function() { + const m = mquery(); + assert.equal(m, m.slice()); + }); + it('is a noop', function() { + const m = mquery().slice(); + assert.deepEqual(m._fields, undefined); + }); + }); + + describe('with 1 arg', function() { + it('throws if not called after where()', function() { + assert.throws(function() { + mquery().slice(1); + }, /must be used after where/); + assert.doesNotThrow(function() { + mquery().where('a').slice(1); + }); + }); + it('that is a number', function() { + const query = mquery(); + query.where('collection').slice(5); + assert.deepEqual(query._fields, { collection: { $slice: 5 } }); + }); + it('that is an array', function() { + const query = mquery(); + query.where('collection').slice([5, 10]); + assert.deepEqual(query._fields, { collection: { $slice: [5, 10] } }); + }); + it('that is an object', function() { + const query = mquery(); + query.slice({ collection: [5, 10] }); + assert.deepEqual(query._fields, { collection: { $slice: [5, 10] } }); + }); + }); + + describe('with 2 args', function() { + describe('and first is a number', function() { + it('throws if not called after where', function() { + assert.throws(function() { + mquery().slice(2, 3); + }, /must be used after where/); + }); + it('does not throw if used after where', function() { + const query = mquery(); + query.where('collection').slice(2, 3); + assert.deepEqual(query._fields, { collection: { $slice: [2, 3] } }); + }); + }); + it('and first is not a number', function() { + const query = mquery().slice('collection', [-5, 2]); + assert.deepEqual(query._fields, { collection: { $slice: [-5, 2] } }); + }); + }); + + describe('with 3 args', function() { + it('works', function() { + const query = mquery(); + query.slice('collection', 14, 10); + assert.deepEqual(query._fields, { collection: { $slice: [14, 10] } }); + }); + }); + + noDistinct('slice'); + no('count', 'slice'); + }); + + // options + + describe('sort', function() { + 
describe('with 0 args', function() { + it('chains', function() { + const m = mquery(); + assert.equal(m, m.sort()); + }); + it('has no affect', function() { + const m = mquery(); + assert.equal(m.options.sort, undefined); + }); + }); + + it('works', function() { + let query = mquery(); + query.sort('a -c b'); + assert.deepEqual(query.options.sort, { a: 1, b: 1, c: -1 }); + + query = mquery(); + query.sort({ a: 1, c: -1, b: 'asc', e: 'descending', f: 'ascending' }); + assert.deepEqual(query.options.sort, { a: 1, c: -1, b: 1, e: -1, f: 1 }); + + query = mquery(); + query.sort([['a', -1], ['c', 1], ['b', 'desc'], ['e', 'ascending'], ['f', 'descending']]); + assert.deepEqual(query.options.sort, [['a', -1], ['c', 1], ['b', -1], ['e', 1], ['f', -1]]); + + query = mquery(); + let e = undefined; + try { + query.sort([['a', 1], { b: 5 }]); + } catch (err) { + e = err; + } + assert.ok(e, 'uh oh. no error was thrown'); + assert.equal(e.message, 'Invalid sort() argument, must be array of arrays'); + + query = mquery(); + e = undefined; + + try { + query.sort('a', 1, 'c', -1, 'b', 1); + } catch (err) { + e = err; + } + assert.ok(e, 'uh oh. no error was thrown'); + assert.equal(e.message, 'Invalid sort() argument. Must be a string, object, or array.'); + }); + + it('handles $meta sort options', function() { + const query = mquery(); + query.sort({ score: { $meta: 'textScore' } }); + assert.deepEqual(query.options.sort, { score: { $meta: 'textScore' } }); + }); + + it('array syntax', function() { + const query = mquery(); + query.sort([['field', 1], ['test', -1]]); + assert.deepEqual(query.options.sort, [['field', 1], ['test', -1]]); + }); + + it('throws with mixed array/object syntax', function() { + const query = mquery(); + assert.throws(function() { + query.sort({ field: 1 }).sort([['test', -1]]); + }, /Can't mix sort syntaxes/); + assert.throws(function() { + query.sort([['field', 1]]).sort({ test: 1 }); + }, /Can't mix sort syntaxes/); + }); + + it('works with maps', function() { + if (typeof Map === 'undefined') { + return this.skip(); + } + const query = mquery(); + query.sort(new Map().set('field', 1).set('test', -1)); + assert.deepEqual(query.options.sort, new Map().set('field', 1).set('test', -1)); + }); + }); + + function simpleOption(type, options) { + describe(type, function() { + it('sets the ' + type + ' option', function() { + const m = mquery()[type](2); + const optionName = options.name || type; + assert.equal(2, m.options[optionName]); + }); + it('is chainable', function() { + const m = mquery(); + assert.equal(m[type](3), m); + }); + + if (!options.distinct) noDistinct(type); + if (!options.count) no('count', type); + }); + } + + const negated = { + limit: { distinct: false, count: true }, + skip: { distinct: false, count: true }, + maxScan: { distinct: false, count: false }, + batchSize: { distinct: false, count: false }, + maxTime: { distinct: true, count: true, name: 'maxTimeMS' }, + comment: { distinct: false, count: false } + }; + Object.keys(negated).forEach(function(key) { + simpleOption(key, negated[key]); + }); + + describe('snapshot', function() { + it('works', function() { + let query; + + query = mquery(); + query.snapshot(); + assert.equal(true, query.options.snapshot); + + query = mquery(); + query.snapshot(true); + assert.equal(true, query.options.snapshot); + + query = mquery(); + query.snapshot(false); + assert.equal(false, query.options.snapshot); + }); + noDistinct('snapshot'); + no('count', 'snapshot'); + }); + + describe('hint', function() { + it('accepts an 
object', function() { + const query2 = mquery(); + query2.hint({ a: 1, b: -1 }); + assert.deepEqual(query2.options.hint, { a: 1, b: -1 }); + }); + + it('accepts a string', function() { + const query2 = mquery(); + query2.hint('a'); + assert.deepEqual(query2.options.hint, 'a'); + }); + + it('rejects everything else', function() { + assert.throws(function() { + mquery().hint(['c']); + }, /Invalid hint./); + assert.throws(function() { + mquery().hint(1); + }, /Invalid hint./); + }); + + describe('does not have side affects', function() { + it('on invalid arg', function() { + const m = mquery(); + try { + m.hint(1); + } catch (err) { + // ignore + } + assert.equal(undefined, m.options.hint); + }); + it('on missing arg', function() { + const m = mquery().hint(); + assert.equal(undefined, m.options.hint); + }); + }); + + noDistinct('hint'); + }); + + describe('j', function() { + it('works', function() { + const m = mquery().j(true); + assert.equal(true, m.options.j); + }); + }); + + describe('slaveOk', function() { + it('works', function() { + let query; + + query = mquery(); + query.slaveOk(); + assert.equal(true, query.options.slaveOk); + + query = mquery(); + query.slaveOk(true); + assert.equal(true, query.options.slaveOk); + + query = mquery(); + query.slaveOk(false); + assert.equal(false, query.options.slaveOk); + }); + }); + + describe('read', function() { + it('sets associated readPreference option', function() { + const m = mquery(); + m.read('p'); + assert.equal('primary', m.options.readPreference); + }); + it('is chainable', function() { + const m = mquery(); + assert.equal(m, m.read('sp')); + }); + }); + + describe('readConcern', function() { + it('sets associated readConcern option', function() { + let m; + + m = mquery(); + m.readConcern('s'); + assert.deepEqual({ level: 'snapshot' }, m.options.readConcern); + + m = mquery(); + m.r('local'); + assert.deepEqual({ level: 'local' }, m.options.readConcern); + }); + it('is chainable', function() { + const m = mquery(); + assert.equal(m, m.readConcern('lz')); + }); + }); + + describe('tailable', function() { + it('works', function() { + let query; + + query = mquery(); + query.tailable(); + assert.equal(true, query.options.tailable); + + query = mquery(); + query.tailable(true); + assert.equal(true, query.options.tailable); + + query = mquery(); + query.tailable(false); + assert.equal(false, query.options.tailable); + }); + it('is chainable', function() { + const m = mquery(); + assert.equal(m, m.tailable()); + }); + noDistinct('tailable'); + no('count', 'tailable'); + }); + + describe('writeConcern', function() { + it('sets associated writeConcern option', function() { + let m; + m = mquery(); + m.writeConcern('majority'); + assert.equal('majority', m.options.w); + + m = mquery(); + m.writeConcern('m'); // m is alias of majority + assert.equal('majority', m.options.w); + + m = mquery(); + m.writeConcern(1); + assert.equal(1, m.options.w); + }); + it('accepts object', function() { + let m; + + m = mquery().writeConcern({ w: 'm', j: true, wtimeout: 1000 }); + assert.equal('m', m.options.w); // check it does not convert m to majority + assert.equal(true, m.options.j); + assert.equal(1000, m.options.wtimeout); + + m = mquery().w('m').w({ j: false, wtimeout: 0 }); + assert.equal('majority', m.options.w); + assert.strictEqual(false, m.options.j); + assert.strictEqual(0, m.options.wtimeout); + }); + it('is chainable', function() { + const m = mquery(); + assert.equal(m, m.writeConcern('majority')); + }); + }); + + // query utilities + + 
describe('merge', function() { + describe('with falsy arg', function() { + it('returns itself', function() { + const m = mquery(); + assert.equal(m, m.merge()); + assert.equal(m, m.merge(null)); + assert.equal(m, m.merge(0)); + }); + }); + describe('with an argument', function() { + describe('that is not a query or plain object', function() { + it('throws', function() { + assert.throws(function() { + mquery().merge([]); + }, /Invalid argument/); + assert.throws(function() { + mquery().merge('merge'); + }, /Invalid argument/); + assert.doesNotThrow(function() { + mquery().merge({}); + }, /Invalid argument/); + }); + }); + + describe('that is a query', function() { + it('merges conditions, field selection, and options', function() { + const m = mquery({ x: 'hi' }, { select: 'x y', another: true }); + const n = mquery().merge(m); + assert.deepEqual(n._conditions, m._conditions); + assert.deepEqual(n._fields, m._fields); + assert.deepEqual(n.options, m.options); + }); + it('clones update arguments', function(done) { + const original = { $set: { iTerm: true } }; + const m = mquery().updateOne(original); + const n = mquery().merge(m); + m.updateOne({ $set: { x: 2 } }); + assert.notDeepEqual(m._update, n._update); + done(); + }); + it('is chainable', function() { + const m = mquery({ x: 'hi' }); + const n = mquery(); + assert.equal(n, n.merge(m)); + }); + }); + + describe('that is an object', function() { + it('merges', function() { + const m = { x: 'hi' }; + const n = mquery().merge(m); + assert.deepEqual(n._conditions, { x: 'hi' }); + }); + it('clones update arguments', function(done) { + const original = { $set: { iTerm: true } }; + const m = mquery().updateOne(original); + const n = mquery().merge(original); + m.updateOne({ $set: { x: 2 } }); + assert.notDeepEqual(m._update, n._update); + done(); + }); + it('is chainable', function() { + const m = { x: 'hi' }; + const n = mquery(); + assert.equal(n, n.merge(m)); + }); + }); + }); + }); + + // queries + + describe('find', function() { + describe('with no callback', function() { + it('does not execute', function() { + const m = mquery(); + assert.doesNotThrow(function() { + m.find(); + }); + assert.doesNotThrow(function() { + m.find({ x: 1 }); + }); + }); + }); + + it('is chainable', function() { + const m = mquery().find({ x: 1 }).find().find({ y: 2 }); + assert.deepEqual(m._conditions, { x: 1, y: 2 }); + }); + + it('merges other queries', function() { + const m = mquery({ name: 'mquery' }); + m.tailable(); + m.select('_id'); + const a = mquery().find(m); + assert.deepEqual(a._conditions, m._conditions); + assert.deepEqual(a.options, m.options); + assert.deepEqual(a._fields, m._fields); + }); + + describe('executes', function() { + before(function(done) { + col.insertOne({ name: 'mquery' }, done); + }); + + after(function(done) { + col.remove({ name: 'mquery' }, done); + }); + + it('when criteria is passed with a callback', function(done) { + mquery(col).find({ name: 'mquery' }, function(err, docs) { + assert.ifError(err); + assert.equal(1, docs.length); + done(); + }); + }); + it('when Query is passed with a callback', function(done) { + const m = mquery({ name: 'mquery' }); + mquery(col).find(m, function(err, docs) { + assert.ifError(err); + assert.equal(1, docs.length); + done(); + }); + }); + it('when just a callback is passed', function(done) { + mquery({ name: 'mquery' }).collection(col).find(function(err, docs) { + assert.ifError(err); + assert.equal(1, docs.length); + done(); + }); + }); + }); + }); + + describe('findOne', function() { 
+ describe('with no callback', function() { + it('does not execute', function() { + const m = mquery(); + assert.doesNotThrow(function() { + m.findOne(); + }); + assert.doesNotThrow(function() { + m.findOne({ x: 1 }); + }); + }); + }); + + it('is chainable', function() { + const m = mquery(); + const n = m.findOne({ x: 1 }).findOne().findOne({ y: 2 }); + assert.equal(m, n); + assert.deepEqual(m._conditions, { x: 1, y: 2 }); + assert.equal('findOne', m.op); + }); + + it('merges other queries', function() { + const m = mquery({ name: 'mquery' }); + m.read('nearest'); + m.select('_id'); + const a = mquery().findOne(m); + assert.deepEqual(a._conditions, m._conditions); + assert.deepEqual(a.options, m.options); + assert.deepEqual(a._fields, m._fields); + }); + + describe('executes', function() { + before(function(done) { + col.insertOne({ name: 'mquery findone' }, done); + }); + + after(function(done) { + col.remove({ name: 'mquery findone' }, done); + }); + + it('when criteria is passed with a callback', function(done) { + mquery(col).findOne({ name: 'mquery findone' }, function(err, doc) { + assert.ifError(err); + assert.ok(doc); + assert.equal('mquery findone', doc.name); + done(); + }); + }); + it('when Query is passed with a callback', function(done) { + const m = mquery(col).where({ name: 'mquery findone' }); + mquery(col).findOne(m, function(err, doc) { + assert.ifError(err); + assert.ok(doc); + assert.equal('mquery findone', doc.name); + done(); + }); + }); + it('when just a callback is passed', function(done) { + mquery({ name: 'mquery findone' }).collection(col).findOne(function(err, doc) { + assert.ifError(err); + assert.ok(doc); + assert.equal('mquery findone', doc.name); + done(); + }); + }); + }); + }); + + describe('count', function() { + describe('with no callback', function() { + it('does not execute', function() { + const m = mquery(); + assert.doesNotThrow(function() { + m.count(); + }); + assert.doesNotThrow(function() { + m.count({ x: 1 }); + }); + }); + }); + + it('is chainable', function() { + const m = mquery(); + const n = m.count({ x: 1 }).count().count({ y: 2 }); + assert.equal(m, n); + assert.deepEqual(m._conditions, { x: 1, y: 2 }); + assert.equal('count', m.op); + }); + + it('merges other queries', function() { + const m = mquery({ name: 'mquery' }); + m.read('nearest'); + m.select('_id'); + const a = mquery().count(m); + assert.deepEqual(a._conditions, m._conditions); + assert.deepEqual(a.options, m.options); + assert.deepEqual(a._fields, m._fields); + }); + + describe('executes', function() { + before(function(done) { + col.insertOne({ name: 'mquery count' }, done); + }); + + after(function(done) { + col.remove({ name: 'mquery count' }, done); + }); + + it('when criteria is passed with a callback', function(done) { + mquery(col).count({ name: 'mquery count' }, function(err, count) { + assert.ifError(err); + assert.ok(count); + assert.ok(1 === count); + done(); + }); + }); + it('when Query is passed with a callback', function(done) { + const m = mquery({ name: 'mquery count' }); + mquery(col).count(m, function(err, count) { + assert.ifError(err); + assert.ok(count); + assert.ok(1 === count); + done(); + }); + }); + it('when just a callback is passed', function(done) { + mquery({ name: 'mquery count' }).collection(col).count(function(err, count) { + assert.ifError(err); + assert.ok(1 === count); + done(); + }); + }); + }); + + describe('validates its option', function() { + it('sort', function(done) { + assert.doesNotThrow(function() { + mquery().sort('x').count(); 
+ }); + done(); + }); + + it('select', function(done) { + assert.throws(function() { + mquery().select('x').count(); + }, /field selection and slice cannot be used with count/); + done(); + }); + + it('slice', function(done) { + assert.throws(function() { + mquery().where('x').slice(-3).count(); + }, /field selection and slice cannot be used with count/); + done(); + }); + + it('limit', function(done) { + assert.doesNotThrow(function() { + mquery().limit(3).count(); + }); + done(); + }); + + it('skip', function(done) { + assert.doesNotThrow(function() { + mquery().skip(3).count(); + }); + done(); + }); + + it('batchSize', function(done) { + assert.throws(function() { + mquery({}, { batchSize: 3 }).count(); + }, /batchSize cannot be used with count/); + done(); + }); + + it('comment', function(done) { + assert.throws(function() { + mquery().comment('mquery').count(); + }, /comment cannot be used with count/); + done(); + }); + + it('maxScan', function(done) { + assert.throws(function() { + mquery().maxScan(300).count(); + }, /maxScan cannot be used with count/); + done(); + }); + + it('snapshot', function(done) { + assert.throws(function() { + mquery().snapshot().count(); + }, /snapshot cannot be used with count/); + done(); + }); + + it('tailable', function(done) { + assert.throws(function() { + mquery().tailable().count(); + }, /tailable cannot be used with count/); + done(); + }); + }); + }); + + describe('distinct', function() { + describe('with no callback', function() { + it('does not execute', function() { + const m = mquery(); + assert.doesNotThrow(function() { + m.distinct(); + }); + assert.doesNotThrow(function() { + m.distinct('name'); + }); + assert.doesNotThrow(function() { + m.distinct({ name: 'mquery distinct' }); + }); + assert.doesNotThrow(function() { + m.distinct({ name: 'mquery distinct' }, 'name'); + }); + }); + }); + + it('is chainable', function() { + const m = mquery({ x: 1 }).distinct('name'); + const n = m.distinct({ y: 2 }); + assert.equal(m, n); + assert.deepEqual(n._conditions, { x: 1, y: 2 }); + assert.equal('name', n._distinct); + assert.equal('distinct', n.op); + }); + + it('overwrites field', function() { + const m = mquery({ name: 'mquery' }).distinct('name'); + m.distinct('rename'); + assert.equal(m._distinct, 'rename'); + m.distinct({ x: 1 }, 'renamed'); + assert.equal(m._distinct, 'renamed'); + }); + + it('merges other queries', function() { + const m = mquery().distinct({ name: 'mquery' }, 'age'); + m.read('nearest'); + const a = mquery().distinct(m); + assert.deepEqual(a._conditions, m._conditions); + assert.deepEqual(a.options, m.options); + assert.deepEqual(a._fields, m._fields); + assert.deepEqual(a._distinct, m._distinct); + }); + + describe('executes', function() { + before(function(done) { + col.insertOne({ name: 'mquery distinct', age: 1 }, done); + }); + + after(function(done) { + col.remove({ name: 'mquery distinct' }, done); + }); + + it('when distinct arg is passed with a callback', function(done) { + mquery(col).distinct('distinct', function(err, doc) { + assert.ifError(err); + assert.ok(doc); + done(); + }); + }); + describe('when criteria is passed with a callback', function() { + it('if distinct arg was declared', function(done) { + mquery(col).distinct('age').distinct({ name: 'mquery distinct' }, function(err, doc) { + assert.ifError(err); + assert.ok(doc); + done(); + }); + }); + it('but not if distinct arg was not declared', function() { + assert.throws(function() { + mquery(col).distinct({ name: 'mquery distinct' }, function() {}); + 
}, /No value for `distinct`/); + }); + }); + describe('when Query is passed with a callback', function() { + const m = mquery({ name: 'mquery distinct' }); + it('if distinct arg was declared', function(done) { + mquery(col).distinct('age').distinct(m, function(err, doc) { + assert.ifError(err); + assert.ok(doc); + done(); + }); + }); + it('but not if distinct arg was not declared', function() { + assert.throws(function() { + mquery(col).distinct(m, function() {}); + }, /No value for `distinct`/); + }); + }); + describe('when just a callback is passed', function() { + it('if distinct arg was declared', function(done) { + const m = mquery({ name: 'mquery distinct' }); + m.collection(col); + m.distinct('age'); + m.distinct(function(err, doc) { + assert.ifError(err); + assert.ok(doc); + done(); + }); + }); + it('but not if no distinct arg was declared', function() { + const m = mquery(); + m.collection(col); + assert.throws(function() { + m.distinct(function() {}); + }, /No value for `distinct`/); + }); + }); + }); + + describe('validates its option', function() { + it('sort', function(done) { + assert.throws(function() { + mquery().sort('x').distinct(); + }, /sort cannot be used with distinct/); + done(); + }); + + it('select', function(done) { + assert.throws(function() { + mquery().select('x').distinct(); + }, /field selection and slice cannot be used with distinct/); + done(); + }); + + it('slice', function(done) { + assert.throws(function() { + mquery().where('x').slice(-3).distinct(); + }, /field selection and slice cannot be used with distinct/); + done(); + }); + + it('limit', function(done) { + assert.throws(function() { + mquery().limit(3).distinct(); + }, /limit cannot be used with distinct/); + done(); + }); + + it('skip', function(done) { + assert.throws(function() { + mquery().skip(3).distinct(); + }, /skip cannot be used with distinct/); + done(); + }); + + it('batchSize', function(done) { + assert.throws(function() { + mquery({}, { batchSize: 3 }).distinct(); + }, /batchSize cannot be used with distinct/); + done(); + }); + + it('comment', function(done) { + assert.throws(function() { + mquery().comment('mquery').distinct(); + }, /comment cannot be used with distinct/); + done(); + }); + + it('maxScan', function(done) { + assert.throws(function() { + mquery().maxScan(300).distinct(); + }, /maxScan cannot be used with distinct/); + done(); + }); + + it('snapshot', function(done) { + assert.throws(function() { + mquery().snapshot().distinct(); + }, /snapshot cannot be used with distinct/); + done(); + }); + + it('hint', function(done) { + assert.throws(function() { + mquery().hint({ x: 1 }).distinct(); + }, /hint cannot be used with distinct/); + done(); + }); + + it('tailable', function(done) { + assert.throws(function() { + mquery().tailable().distinct(); + }, /tailable cannot be used with distinct/); + done(); + }); + }); + }); + + describe('update', function() { + describe('with no callback', function() { + it('does not execute', function() { + const m = mquery(); + assert.doesNotThrow(function() { + m.updateOne({ name: 'old' }, { name: 'updated' }, { multi: true }); + }); + assert.doesNotThrow(function() { + m.updateOne({ name: 'old' }, { name: 'updated' }); + }); + assert.doesNotThrow(function() { + m.updateOne({ name: 'updated' }); + }); + assert.doesNotThrow(function() { + m.updateOne(); + }); + }); + }); + + it('is chainable', function() { + const m = mquery({ x: 1 }).updateOne({ y: 2 }); + const n = m.where({ y: 2 }); + assert.equal(m, n); + 
assert.deepEqual(n._conditions, { x: 1, y: 2 }); + assert.deepEqual({ y: 2 }, n._update); + assert.equal('updateOne', n.op); + }); + + it('merges update doc arg', function() { + const a = [1, 2]; + const m = mquery().where({ name: 'mquery' }).updateOne({ x: 'stuff', a: a }); + m.updateOne({ z: 'stuff' }); + assert.deepEqual(m._update, { z: 'stuff', x: 'stuff', a: a }); + assert.deepEqual(m._conditions, { name: 'mquery' }); + assert.ok(!m.options.overwrite); + m.updateOne({}, { z: 'renamed' }, { overwrite: true }); + assert.ok(m.options.overwrite === true); + assert.deepEqual(m._conditions, { name: 'mquery' }); + assert.deepEqual(m._update, { z: 'renamed', x: 'stuff', a: a }); + a.push(3); + assert.notDeepEqual(m._update, { z: 'renamed', x: 'stuff', a: a }); + }); + + describe('executes', function() { + let id; + before(function(done) { + col.insertOne({ name: 'mquery update', age: 1 }, function(err, res) { + id = res.insertedId; + done(); + }); + }); + + after(function(done) { + col.remove({ _id: id }, done); + }); + + describe('when conds + doc + opts + callback passed', function() { + it('works', function(done) { + const m = mquery(col).where({ _id: id }); + m.updateOne({}, { name: 'Sparky' }, {}, function(err, res) { + assert.ifError(err); + assert.equal(res.modifiedCount, 1); + m.findOne(function(err, doc) { + assert.ifError(err); + assert.equal(doc.name, 'Sparky'); + done(); + }); + }); + }); + }); + + describe('when conds + doc + callback passed', function() { + it('works', function(done) { + const m = mquery(col).updateOne({ _id: id }, { name: 'fairgrounds' }, function(err, num) { + assert.ifError(err); + assert.ok(1, num); + m.findOne(function(err, doc) { + assert.ifError(err); + assert.equal(doc.name, 'fairgrounds'); + done(); + }); + }); + }); + }); + + describe('when doc + callback passed', function() { + it('works', function(done) { + const m = mquery(col).where({ _id: id }).updateOne({ name: 'changed' }, function(err, num) { + assert.ifError(err); + assert.ok(1, num); + m.findOne(function(err, doc) { + assert.ifError(err); + assert.equal(doc.name, 'changed'); + done(); + }); + }); + }); + }); + + describe('when just callback passed', function() { + it('works', function(done) { + const m = mquery(col).where({ _id: id }); + m.updateOne({ name: 'Frankenweenie' }); + m.updateOne(function(err, res) { + assert.ifError(err); + assert.equal(res.modifiedCount, 1); + m.findOne(function(err, doc) { + assert.ifError(err); + assert.equal(doc.name, 'Frankenweenie'); + done(); + }); + }); + }); + }); + + describe('without a callback', function() { + it('when forced by exec()', function(done) { + const m = mquery(col).where({ _id: id }); + m.setOptions({ w: 'majority' }); + m.updateOne({ name: 'forced' }); + + const update = m._collection.update; + m._collection.updateOne = function(conds, doc, opts) { + m._collection.update = update; + + assert.equal(opts.w, 'majority'); + assert.equal('forced', doc.$set.name); + done(); + }; + + m.exec(); + }); + }); + + describe('except when update doc is empty and missing overwrite flag', function() { + it('works', function(done) { + const m = mquery(col).where({ _id: id }); + m.updateOne({}, function(err, num) { + assert.ifError(err); + assert.ok(0 === num); + setTimeout(function() { + m.findOne(function(err, doc) { + assert.ifError(err); + assert.equal(3, mquery.utils.keys(doc).length); + assert.equal(id, doc._id.toString()); + assert.equal('Frankenweenie', doc.name); + done(); + }); + }, 300); + }); + }); + }); + + describe('when boolean (true) - 
exec()', function() { + it('works', function(done) { + const m = mquery(col).where({ _id: id }); + m.updateOne({ name: 'bool' }).updateOne(true); + setTimeout(function() { + m.findOne(function(err, doc) { + assert.ifError(err); + assert.ok(doc); + assert.equal('bool', doc.name); + done(); + }); + }, 300); + }); + }); + }); + }); + + describe('remove', function() { + describe('with 0 args', function() { + const name = 'remove: no args test'; + before(function(done) { + col.insertOne({ name: name }, done); + }); + after(function(done) { + col.remove({ name: name }, done); + }); + + it('does not execute', function(done) { + const remove = col.remove; + col.remove = function() { + col.remove = remove; + done(new Error('remove executed!')); + }; + + mquery(col).where({ name: name }).remove(); + setTimeout(function() { + col.remove = remove; + done(); + }, 10); + }); + + it('chains', function() { + const m = mquery(); + assert.equal(m, m.remove()); + }); + }); + + describe('with 1 argument', function() { + const name = 'remove: 1 arg test'; + before(function(done) { + col.insertOne({ name: name }, done); + }); + after(function(done) { + col.remove({ name: name }, done); + }); + + describe('that is a', function() { + it('plain object', function() { + const m = mquery(col).remove({ name: 'Whiskers' }); + m.remove({ color: '#fff' }); + assert.deepEqual({ name: 'Whiskers', color: '#fff' }, m._conditions); + }); + + it('query', function() { + const q = mquery({ color: '#fff' }); + const m = mquery(col).remove({ name: 'Whiskers' }); + m.remove(q); + assert.deepEqual({ name: 'Whiskers', color: '#fff' }, m._conditions); + }); + + it('function', function(done) { + mquery(col).where({ name: name }).remove(function(err) { + assert.ifError(err); + mquery(col).findOne({ name: name }, function(err, doc) { + assert.ifError(err); + assert.equal(null, doc); + done(); + }); + }); + }); + + it('boolean (true) - execute', function(done) { + col.insertOne({ name: name }, function(err) { + assert.ifError(err); + mquery(col).findOne({ name: name }, function(err, doc) { + assert.ifError(err); + assert.ok(doc); + mquery(col).remove(true); + setTimeout(function() { + mquery(col).find(function(err, docs) { + assert.ifError(err); + assert.ok(docs); + assert.equal(0, docs.length); + done(); + }); + }, 300); + }); + }); + }); + }); + }); + + describe('with 2 arguments', function() { + const name = 'remove: 2 arg test'; + beforeEach(function(done) { + col.remove({}, function(err) { + assert.ifError(err); + col.insertMany([{ name: 'shelly' }, { name: name }], function(err) { + assert.ifError(err); + mquery(col).find(function(err, docs) { + assert.ifError(err); + assert.equal(2, docs.length); + done(); + }); + }); + }); + }); + + describe('plain object + callback', function() { + it('works', function(done) { + mquery(col).remove({ name: name }, function(err) { + assert.ifError(err); + mquery(col).find(function(err, docs) { + assert.ifError(err); + assert.ok(docs); + assert.equal(1, docs.length); + assert.equal('shelly', docs[0].name); + done(); + }); + }); + }); + }); + + describe('mquery + callback', function() { + it('works', function(done) { + const m = mquery({ name: name }); + mquery(col).remove(m, function(err) { + assert.ifError(err); + mquery(col).find(function(err, docs) { + assert.ifError(err); + assert.ok(docs); + assert.equal(1, docs.length); + assert.equal('shelly', docs[0].name); + done(); + }); + }); + }); + }); + }); + }); + + function validateFindAndModifyOptions(method) { + describe('validates its option', 
function() { + it('sort', function(done) { + assert.doesNotThrow(function() { + mquery().sort('x')[method](); + }); + done(); + }); + + it('select', function(done) { + assert.doesNotThrow(function() { + mquery().select('x')[method](); + }); + done(); + }); + + it('limit', function(done) { + assert.throws(function() { + mquery().limit(3)[method](); + }, new RegExp('limit cannot be used with ' + method)); + done(); + }); + + it('skip', function(done) { + assert.throws(function() { + mquery().skip(3)[method](); + }, new RegExp('skip cannot be used with ' + method)); + done(); + }); + + it('batchSize', function(done) { + assert.throws(function() { + mquery({}, { batchSize: 3 })[method](); + }, new RegExp('batchSize cannot be used with ' + method)); + done(); + }); + + it('maxScan', function(done) { + assert.throws(function() { + mquery().maxScan(300)[method](); + }, new RegExp('maxScan cannot be used with ' + method)); + done(); + }); + + it('snapshot', function(done) { + assert.throws(function() { + mquery().snapshot()[method](); + }, new RegExp('snapshot cannot be used with ' + method)); + done(); + }); + + it('hint', function(done) { + assert.throws(function() { + mquery().hint({ x: 1 })[method](); + }, new RegExp('hint cannot be used with ' + method)); + done(); + }); + + it('tailable', function(done) { + assert.throws(function() { + mquery().tailable()[method](); + }, new RegExp('tailable cannot be used with ' + method)); + done(); + }); + + it('comment', function(done) { + assert.throws(function() { + mquery().comment('mquery')[method](); + }, new RegExp('comment cannot be used with ' + method)); + done(); + }); + }); + } + + describe('findOneAndUpdate', function() { + let name = 'findOneAndUpdate + fn'; + + validateFindAndModifyOptions('findOneAndUpdate'); + + describe('with 0 args', function() { + it('makes no changes', function() { + const m = mquery(); + const n = m.findOneAndUpdate(); + assert.deepEqual(m, n); + }); + }); + describe('with 1 arg', function() { + describe('that is an object', function() { + it('updates the doc', function() { + const m = mquery(); + const n = m.findOneAndUpdate({ $set: { name: '1 arg' } }); + assert.deepEqual(n._update, { $set: { name: '1 arg' } }); + }); + }); + describe('that is a query', function() { + it('updates the doc', function() { + const m = mquery({ name: name }).updateOne({ x: 1 }); + const n = mquery().findOneAndUpdate(m); + assert.deepEqual(n._update, { x: 1 }); + }); + }); + it('that is a function', function(done) { + col.insertOne({ name: name }, function(err) { + assert.ifError(err); + const m = mquery({ name: name }).collection(col); + name = '1 arg'; + const n = m.updateOne({ $set: { name: name } }).setOptions({ returnDocument: 'after' }); + n.findOneAndUpdate(function(err, res) { + assert.ifError(err); + assert.ok(res.value); + assert.equal(res.value.name, name); + done(); + }); + }); + }); + }); + describe('with 2 args', function() { + it('conditions + update', function() { + const m = mquery(col); + m.findOneAndUpdate({ name: name }, { age: 100 }); + assert.deepEqual({ name: name }, m._conditions); + assert.deepEqual({ age: 100 }, m._update); + }); + it('query + update', function() { + const n = mquery({ name: name }); + const m = mquery(col); + m.findOneAndUpdate(n, { age: 100 }); + assert.deepEqual({ name: name }, m._conditions); + assert.deepEqual({ age: 100 }, m._update); + }); + it('update + callback', function(done) { + const m = mquery(col).where({ name: name }); + m.findOneAndUpdate({}, { $inc: { age: 10 } }, { 
returnDocument: 'after' }, function(err, res) { + assert.ifError(err); + assert.equal(10, res.value.age); + done(); + }); + }); + }); + describe('with 3 args', function() { + it('conditions + update + options', function() { + const m = mquery(); + const n = m.findOneAndUpdate({ name: name }, { works: true }, { returnDocument: 'before' }); + assert.deepEqual({ name: name }, n._conditions); + assert.deepEqual({ works: true }, n._update); + assert.deepEqual({ returnDocument: 'before' }, n.options); + }); + it('conditions + update + callback', function(done) { + const m = mquery(col); + m.findOneAndUpdate({ name: name }, { works: true }, { returnDocument: 'after' }, function(err, res) { + assert.ifError(err); + assert.ok(res.value); + assert.equal(name, res.value.name); + assert.ok(true === res.value.works); + done(); + }); + }); + }); + describe('with 4 args', function() { + it('conditions + update + options + callback', function(done) { + const m = mquery(col); + m.findOneAndUpdate({ name: name }, { works: false }, {}, function(err, res) { + assert.ifError(err); + assert.ok(res.value); + assert.equal(name, res.value.name); + assert.ok(true === res.value.works); + done(); + }); + }); + }); + }); + + describe('findOneAndRemove', function() { + let name = 'findOneAndRemove'; + + validateFindAndModifyOptions('findOneAndRemove'); + + describe('with 0 args', function() { + it('makes no changes', function() { + const m = mquery(); + const n = m.findOneAndRemove(); + assert.deepEqual(m, n); + }); + }); + describe('with 1 arg', function() { + describe('that is an object', function() { + it('updates the doc', function() { + const m = mquery(); + const n = m.findOneAndRemove({ name: '1 arg' }); + assert.deepEqual(n._conditions, { name: '1 arg' }); + }); + }); + describe('that is a query', function() { + it('updates the doc', function() { + const m = mquery({ name: name }); + const n = m.findOneAndRemove(m); + assert.deepEqual(n._conditions, { name: name }); + }); + }); + it('that is a function', function(done) { + col.insertOne({ name: name }, function(err) { + assert.ifError(err); + const m = mquery({ name: name }).collection(col); + m.findOneAndRemove(function(err, res) { + assert.ifError(err); + assert.ok(res.value); + assert.equal(name, res.value.name); + done(); + }); + }); + }); + }); + describe('with 2 args', function() { + it('conditions + options', function() { + const m = mquery(col); + m.findOneAndRemove({ name: name }, { returnDocument: 'after' }); + assert.deepEqual({ name: name }, m._conditions); + assert.deepEqual({ returnDocument: 'after' }, m.options); + }); + it('query + options', function() { + const n = mquery({ name: name }); + const m = mquery(col); + m.findOneAndRemove(n, { sort: { x: 1 } }); + assert.deepEqual({ name: name }, m._conditions); + assert.deepEqual({ sort: { x: 1 } }, m.options); + }); + it('conditions + callback', function(done) { + col.insertOne({ name: name }, function(err) { + assert.ifError(err); + const m = mquery(col); + m.findOneAndRemove({ name: name }, function(err, res) { + assert.ifError(err); + assert.equal(name, res.value.name); + done(); + }); + }); + }); + it('query + callback', function(done) { + col.insertOne({ name: name }, function(err) { + assert.ifError(err); + const n = mquery({ name: name }); + const m = mquery(col); + m.findOneAndRemove(n, function(err, res) { + assert.ifError(err); + assert.equal(name, res.value.name); + done(); + }); + }); + }); + }); + describe('with 3 args', function() { + it('conditions + options + callback', 
function(done) { + name = 'findOneAndRemove + conds + options + cb'; + col.insertMany([{ name: name }, { name: 'a' }], function(err) { + assert.ifError(err); + const m = mquery(col); + m.findOneAndRemove({ name: name }, { sort: { name: 1 } }, function(err, res) { + assert.ifError(err); + assert.ok(res.value); + assert.equal(name, res.value.name); + done(); + }); + }); + }); + }); + }); + + describe('exec', function() { + beforeEach(function(done) { + col.insertMany([{ name: 'exec', age: 1 }, { name: 'exec', age: 2 }], done); + }); + + afterEach(function(done) { + mquery(col).remove(done); + }); + + it('requires an op', function() { + assert.throws(function() { + mquery().exec(); + }, /Missing query type/); + }); + + describe('find', function() { + it('works', function(done) { + const m = mquery(col).find({ name: 'exec' }); + m.exec(function(err, docs) { + assert.ifError(err); + assert.equal(2, docs.length); + done(); + }); + }); + + it('works with readPreferences', function(done) { + const m = mquery(col).find({ name: 'exec' }); + try { + const ReadPreference = require('mongodb').ReadPreference; + const rp = new ReadPreference('primary'); + m.read(rp); + } catch (e) { + done(e.code === 'MODULE_NOT_FOUND' ? null : e); + return; + } + m.exec(function(err, docs) { + assert.ifError(err); + assert.equal(2, docs.length); + done(); + }); + }); + + it('works with hint', function(done) { + mquery(col).hint({ _id: 1 }).find({ name: 'exec' }).exec(function(err, docs) { + assert.ifError(err); + assert.equal(2, docs.length); + + mquery(col).hint('_id_').find({ age: 1 }).exec(function(err, docs) { + assert.ifError(err); + assert.equal(1, docs.length); + done(); + }); + }); + }); + + it('works with readConcern', function(done) { + const m = mquery(col).find({ name: 'exec' }); + m.readConcern('l'); + m.exec(function(err, docs) { + assert.ifError(err); + assert.equal(2, docs.length); + done(); + }); + }); + + it('works with collation', function(done) { + const m = mquery(col).find({ name: 'EXEC' }); + m.collation({ locale: 'en_US', strength: 1 }); + m.exec(function(err, docs) { + assert.ifError(err); + assert.equal(2, docs.length); + done(); + }); + }); + }); + + it('findOne', function(done) { + const m = mquery(col).findOne({ age: 2 }); + m.exec(function(err, doc) { + assert.ifError(err); + assert.equal(2, doc.age); + done(); + }); + }); + + it('count', function(done) { + const m = mquery(col).count({ name: 'exec' }); + m.exec(function(err, count) { + assert.ifError(err); + assert.equal(2, count); + done(); + }); + }); + + it('distinct', function(done) { + const m = mquery({ name: 'exec' }); + m.collection(col); + m.distinct('age'); + m.exec(function(err, array) { + assert.ifError(err); + assert.ok(Array.isArray(array)); + assert.equal(2, array.length); + assert(~array.indexOf(1)); + assert(~array.indexOf(2)); + done(); + }); + }); + + describe('update', function() { + describe('updateMany', function() { + it('works', function(done) { + mquery(col).updateMany({ name: 'exec' }, { name: 'test' }). 
+ exec(function(error) { + assert.ifError(error); + mquery(col).count({ name: 'test' }).exec(function(error, res) { + assert.ifError(error); + assert.equal(res, 2); + done(); + }); + }); + }); + it('works with write concern', function(done) { + mquery(col).updateMany({ name: 'exec' }, { name: 'test' }) + .w(1).j(true).wtimeout(1000) + .exec(function(error) { + assert.ifError(error); + mquery(col).count({ name: 'test' }).exec(function(error, res) { + assert.ifError(error); + assert.equal(res, 2); + done(); + }); + }); + }); + }); + + describe('updateOne', function() { + it('works', function(done) { + mquery(col).updateOne({ name: 'exec' }, { name: 'test' }). + exec(function(error) { + assert.ifError(error); + mquery(col).count({ name: 'test' }).exec(function(error, res) { + assert.ifError(error); + assert.equal(res, 1); + done(); + }); + }); + }); + }); + + describe('replaceOne', function() { + it('works', function(done) { + mquery(col).replaceOne({ name: 'exec' }, { name: 'test' }). + exec(function(error) { + assert.ifError(error); + mquery(col).findOne({ name: 'test' }).exec(function(error, res) { + assert.ifError(error); + assert.equal(res.name, 'test'); + assert.ok(res.age == null); + done(); + }); + }); + }); + }); + }); + + describe('remove', function() { + it('with a callback', function(done) { + const m = mquery(col).where({ age: 2 }).remove(); + m.exec(function(err, res) { + assert.ifError(err); + assert.equal(1, res.deletedCount); + done(); + }); + }); + + it('without a callback', function(done) { + const m = mquery(col).where({ age: 1 }).remove(); + m.exec(); + + setTimeout(function() { + mquery(col).where('name', 'exec').count(function(err, num) { + assert.equal(1, num); + done(); + }); + }, 200); + }); + }); + + describe('deleteOne', function() { + it('with a callback', function(done) { + const m = mquery(col).where({ age: { $gte: 0 } }).deleteOne(); + m.exec(function(err, res) { + assert.ifError(err); + assert.equal(res.deletedCount, 1); + done(); + }); + }); + + it('with justOne set', function(done) { + const m = mquery(col).where({ age: { $gte: 0 } }). + // Should ignore `justOne` + setOptions({ justOne: false }). 
+ deleteOne(); + m.exec(function(err, res) { + assert.ifError(err); + assert.equal(res.deletedCount, 1); + done(); + }); + }); + }); + + describe('deleteMany', function() { + it('with a callback', function(done) { + const m = mquery(col).where({ age: { $gte: 0 } }).deleteMany(); + m.exec(function(err, res) { + assert.ifError(err); + assert.equal(res.deletedCount, 2); + done(); + }); + }); + }); + + describe('findOneAndUpdate', function() { + it('with a callback', function(done) { + const m = mquery(col); + m.findOneAndUpdate({ name: 'exec', age: 1 }, { $set: { name: 'findOneAndUpdate' } }, { returnDocument: 'after' }); + m.exec(function(err, res) { + assert.ifError(err); + assert.equal('findOneAndUpdate', res.value.name); + done(); + }); + }); + }); + + describe('findOneAndRemove', function() { + it('with a callback', function(done) { + const m = mquery(col); + m.findOneAndRemove({ name: 'exec', age: 2 }); + m.exec(function(err, res) { + assert.ifError(err); + assert.equal('exec', res.value.name); + assert.equal(2, res.value.age); + mquery(col).count({ name: 'exec' }, function(err, num) { + assert.ifError(err); + assert.equal(1, num); + done(); + }); + }); + }); + }); + }); + + describe('setTraceFunction', function() { + beforeEach(function(done) { + col.insertMany([{ name: 'trace', age: 93 }], done); + }); + + it('calls trace function when executing query', function(done) { + const m = mquery(col); + + let resultTraceCalled; + + m.setTraceFunction(function(method, queryInfo) { + try { + assert.equal('findOne', method); + assert.equal('trace', queryInfo.conditions.name); + } catch (e) { + done(e); + } + + return function(err, result, millis) { + try { + assert.equal(93, result.age); + assert.ok(typeof millis === 'number'); + } catch (e) { + done(e); + } + resultTraceCalled = true; + }; + }); + + m.findOne({ name: 'trace' }, function(err, doc) { + assert.ifError(err); + assert.equal(resultTraceCalled, true); + assert.equal(93, doc.age); + done(); + }); + }); + + it('inherits trace function when calling toConstructor', function(done) { + function traceFunction() { return function() {}; } + + const tracedQuery = mquery().setTraceFunction(traceFunction).toConstructor(); + + const query = tracedQuery(); + assert.equal(traceFunction, query._traceFunction); + + done(); + }); + }); + + describe('thunk', function() { + it('returns a function', function(done) { + assert.equal('function', typeof mquery().thunk()); + done(); + }); + + it('passes the fn arg to `exec`', function(done) { + function cb() {} + const m = mquery(); + + m.exec = function testing(fn) { + assert.equal(this, m); + assert.equal(cb, fn); + done(); + }; + + m.thunk()(cb); + }); + }); + + describe('then', function() { + before(function(done) { + col.insertMany([{ name: 'then', age: 1 }, { name: 'then', age: 2 }], done); + }); + + after(function(done) { + mquery(col).remove({ name: 'then' }).exec(done); + }); + + it('returns a promise A+ compat object', function(done) { + const m = mquery(col).find(); + assert.equal('function', typeof m.then); + done(); + }); + + it('creates a promise that is resolved on success', function(done) { + const promise = mquery(col).count({ name: 'then' }).then(); + promise.then(function(count) { + assert.equal(2, count); + done(); + }, done); + }); + + it('supports exec() cb being called synchronously #66', function(done) { + const query = mquery(col).count({ name: 'then' }); + query.exec = function(cb) { + cb(null, 66); + }; + + query.then(success, done); + function success(count) { + assert.equal(66, 
count); + done(); + } + }); + }); + + describe('stream', function() { + before(function(done) { + col.insertMany([{ name: 'stream', age: 1 }, { name: 'stream', age: 2 }], done); + }); + + after(function(done) { + mquery(col).remove({ name: 'stream' }).exec(done); + }); + + describe('throws', function() { + describe('if used with non-find operations', function() { + const ops = ['update', 'findOneAndUpdate', 'remove', 'count', 'distinct']; + + ops.forEach(function(op) { + assert.throws(function() { + mquery(col)[op]().stream(); + }); + }); + }); + }); + + it('returns a stream', function(done) { + const stream = mquery(col).find({ name: 'stream' }).cursor().stream(); + let count = 0; + let err; + + stream.on('data', function(doc) { + assert.equal('stream', doc.name); + ++count; + }); + + stream.on('error', function(er) { + err = er; + }); + + stream.on('end', function() { + if (err) return done(err); + assert.equal(2, count); + done(); + }); + }); + }); + + function noDistinct(type) { + it('cannot be used with distinct()', function(done) { + assert.throws(function() { + mquery().distinct('name')[type](4); + }, new RegExp(type + ' cannot be used with distinct')); + done(); + }); + } + + function no(method, type) { + it('cannot be used with ' + method + '()', function(done) { + assert.throws(function() { + mquery()[method]()[type](4); + }, new RegExp(type + ' cannot be used with ' + method)); + done(); + }); + } + + // query internal + + describe('_updateForExec', function() { + it('returns a clone of the update object with same key order #19', function(done) { + const update = {}; + update.$push = { n: { $each: [{ x: 10 }], $slice: -1, $sort: { x: 1 } } }; + + const q = mquery().updateOne({ x: 1 }, update); + + // capture original key order + const order = []; + let key; + for (key in q._update.$push.n) { + order.push(key); + } + + // compare output + const doc = q._updateForExec(); + let i = 0; + for (key in doc.$push.n) { + assert.equal(key, order[i]); + i++; + } + + done(); + }); + }); +}); diff --git a/node_modules/mquery/test/utils.test.js b/node_modules/mquery/test/utils.test.js new file mode 100644 index 000000000..8e358e4e9 --- /dev/null +++ b/node_modules/mquery/test/utils.test.js @@ -0,0 +1,161 @@ +'use strict'; + +const utils = require('../lib/utils'); +const assert = require('assert'); +const debug = require('debug'); + +let mongo; +try { + mongo = new require('mongodb'); +} catch (e) { + debug('mongo', 'cannot construct mongodb instance'); +} + +describe('lib/utils', function() { + describe('clone', function() { + it('clones constructors named ObjectId', function(done) { + function ObjectId(id) { + this.id = id; + } + + const o1 = new ObjectId('1234'); + const o2 = utils.clone(o1); + assert.ok(o2 instanceof ObjectId); + + done(); + }); + + it('clones constructors named ObjectID', function(done) { + function ObjectID(id) { + this.id = id; + } + + const o1 = new ObjectID('1234'); + const o2 = utils.clone(o1); + + assert.ok(o2 instanceof ObjectID); + done(); + }); + + it('does not clone constructors named ObjectIdd', function(done) { + function ObjectIdd(id) { + this.id = id; + } + + const o1 = new ObjectIdd('1234'); + const o2 = utils.clone(o1); + assert.ok(!(o2 instanceof ObjectIdd)); + + done(); + }); + + it('optionally clones ObjectId constructors using its clone method', function(done) { + function ObjectID(id) { + this.id = id; + this.cloned = false; + } + + ObjectID.prototype.clone = function() { + const ret = new ObjectID(this.id); + ret.cloned = true; + return ret; + }; + + 
const id = 1234; + const o1 = new ObjectID(id); + assert.equal(id, o1.id); + assert.equal(false, o1.cloned); + + const o2 = utils.clone(o1); + assert.ok(o2 instanceof ObjectID); + assert.equal(id, o2.id); + assert.ok(o2.cloned); + done(); + }); + + it('clones mongodb.ReadPreferences', function(done) { + if (!mongo) return done(); + + const tags = [ + { dc: 'tag1' } + ]; + const prefs = [ + new mongo.ReadPreference('primary'), + new mongo.ReadPreference(mongo.ReadPreference.PRIMARY_PREFERRED), + new mongo.ReadPreference('secondary', tags) + ]; + + const prefsCloned = utils.clone(prefs); + + for (let i = 0; i < prefsCloned.length; i++) { + assert.notEqual(prefs[i], prefsCloned[i]); + if (prefs[i].tags) { + assert.ok(prefsCloned[i].tags); + assert.notEqual(prefs[i].tags, prefsCloned[i].tags); + assert.notEqual(prefs[i].tags[0], prefsCloned[i].tags[0]); + } else { + assert.equal(prefsCloned[i].tags, null); + } + } + + done(); + }); + + it('clones mongodb.Binary', function(done) { + if (!mongo) return done(); + const buf = Buffer.from('hi'); + const binary = new mongo.Binary(buf, 2); + const clone = utils.clone(binary); + assert.equal(binary.sub_type, clone.sub_type); + assert.equal(String(binary.buffer), String(buf)); + assert.ok(binary !== clone); + done(); + }); + + it('handles objects with no constructor', function(done) { + const name = '335'; + + const o = Object.create(null); + o.name = name; + + let clone; + assert.doesNotThrow(function() { + clone = utils.clone(o); + }); + + assert.equal(name, clone.name); + assert.ok(o != clone); + done(); + }); + + it('handles buffers', function(done) { + const buff = Buffer.alloc(10); + buff.fill(1); + const clone = utils.clone(buff); + + for (let i = 0; i < buff.length; i++) { + assert.equal(buff[i], clone[i]); + } + + done(); + }); + + it('skips __proto__', function() { + const payload = JSON.parse('{"__proto__": {"polluted": "vulnerable"}}'); + const res = utils.clone(payload); + + assert.strictEqual({}.polluted, void 0); + assert.strictEqual(res.__proto__, Object.prototype); + }); + }); + + describe('merge', function() { + it('avoids prototype pollution', function() { + const payload = JSON.parse('{"__proto__": {"polluted": "vulnerable"}}'); + const obj = {}; + utils.merge(obj, payload); + + assert.strictEqual({}.polluted, void 0); + }); + }); +}); diff --git a/node_modules/ms/index.js b/node_modules/ms/index.js new file mode 100644 index 000000000..6a522b16b --- /dev/null +++ b/node_modules/ms/index.js @@ -0,0 +1,152 @@ +/** + * Helpers. + */ + +var s = 1000; +var m = s * 60; +var h = m * 60; +var d = h * 24; +var y = d * 365.25; + +/** + * Parse or format the given `val`. + * + * Options: + * + * - `long` verbose formatting [false] + * + * @param {String|Number} val + * @param {Object} [options] + * @throws {Error} throw an error if val is not a non-empty string or a number + * @return {String|Number} + * @api public + */ + +module.exports = function(val, options) { + options = options || {}; + var type = typeof val; + if (type === 'string' && val.length > 0) { + return parse(val); + } else if (type === 'number' && isNaN(val) === false) { + return options.long ? fmtLong(val) : fmtShort(val); + } + throw new Error( + 'val is not a non-empty string or a valid number. val=' + + JSON.stringify(val) + ); +}; + +/** + * Parse the given `str` and return milliseconds. 
+ * + * @param {String} str + * @return {Number} + * @api private + */ + +function parse(str) { + str = String(str); + if (str.length > 100) { + return; + } + var match = /^((?:\d+)?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|years?|yrs?|y)?$/i.exec( + str + ); + if (!match) { + return; + } + var n = parseFloat(match[1]); + var type = (match[2] || 'ms').toLowerCase(); + switch (type) { + case 'years': + case 'year': + case 'yrs': + case 'yr': + case 'y': + return n * y; + case 'days': + case 'day': + case 'd': + return n * d; + case 'hours': + case 'hour': + case 'hrs': + case 'hr': + case 'h': + return n * h; + case 'minutes': + case 'minute': + case 'mins': + case 'min': + case 'm': + return n * m; + case 'seconds': + case 'second': + case 'secs': + case 'sec': + case 's': + return n * s; + case 'milliseconds': + case 'millisecond': + case 'msecs': + case 'msec': + case 'ms': + return n; + default: + return undefined; + } +} + +/** + * Short format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtShort(ms) { + if (ms >= d) { + return Math.round(ms / d) + 'd'; + } + if (ms >= h) { + return Math.round(ms / h) + 'h'; + } + if (ms >= m) { + return Math.round(ms / m) + 'm'; + } + if (ms >= s) { + return Math.round(ms / s) + 's'; + } + return ms + 'ms'; +} + +/** + * Long format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtLong(ms) { + return plural(ms, d, 'day') || + plural(ms, h, 'hour') || + plural(ms, m, 'minute') || + plural(ms, s, 'second') || + ms + ' ms'; +} + +/** + * Pluralization helper. + */ + +function plural(ms, n, name) { + if (ms < n) { + return; + } + if (ms < n * 1.5) { + return Math.floor(ms / n) + ' ' + name; + } + return Math.ceil(ms / n) + ' ' + name + 's'; +} diff --git a/node_modules/ms/license.md b/node_modules/ms/license.md new file mode 100644 index 000000000..69b61253a --- /dev/null +++ b/node_modules/ms/license.md @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Zeit, Inc. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. 
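The `ms` module committed above boils down to one exported function: strings go through `parse()` (unit suffix mapped to a millisecond multiplier) and numbers go through `fmtShort()` or `fmtLong()`. A brief usage sketch follows; it is not part of the committed diff, and the return values are simply what the constants and rounding rules in `index.js` above produce:

```js
// Illustrative only -- exercises the ms@2.0.0 code shown above.
const ms = require('ms');

ms('2 hours');             // 7200000   -> parse(): 2 * h, where h = 60 * 60 * 1000
ms('1d');                  // 86400000  -> the 'd' branch returns n * d
ms('100');                 // 100       -> a bare number string defaults to the 'ms' unit
ms('bogus');               // undefined -> parse() returns nothing for unrecognized input
ms(90000);                 // '2m'      -> fmtShort() rounds 1.5 minutes up
ms(90000, { long: true }); // '2 minutes' -> fmtLong() -> plural() ceils at or above 1.5x
```

The package's own readme, which follows in this diff, documents the same behavior.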
diff --git a/node_modules/ms/package.json b/node_modules/ms/package.json new file mode 100644 index 000000000..6d145dbca --- /dev/null +++ b/node_modules/ms/package.json @@ -0,0 +1,69 @@ +{ + "_from": "ms@2.0.0", + "_id": "ms@2.0.0", + "_inBundle": false, + "_integrity": "sha1-VgiurfwAvmwpAd9fmGF4jeDVl8g=", + "_location": "/ms", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "ms@2.0.0", + "name": "ms", + "escapedName": "ms", + "rawSpec": "2.0.0", + "saveSpec": null, + "fetchSpec": "2.0.0" + }, + "_requiredBy": [ + "/debug" + ], + "_resolved": "https://registry.npmjs.org/ms/-/ms-2.0.0.tgz", + "_shasum": "5608aeadfc00be6c2901df5f9861788de0d597c8", + "_spec": "ms@2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\debug", + "bugs": { + "url": "https://github.com/zeit/ms/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Tiny milisecond conversion utility", + "devDependencies": { + "eslint": "3.19.0", + "expect.js": "0.3.1", + "husky": "0.13.3", + "lint-staged": "3.4.1", + "mocha": "3.4.1" + }, + "eslintConfig": { + "extends": "eslint:recommended", + "env": { + "node": true, + "es6": true + } + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/zeit/ms#readme", + "license": "MIT", + "lint-staged": { + "*.js": [ + "npm run lint", + "prettier --single-quote --write", + "git add" + ] + }, + "main": "./index", + "name": "ms", + "repository": { + "type": "git", + "url": "git+https://github.com/zeit/ms.git" + }, + "scripts": { + "lint": "eslint lib/* bin/*", + "precommit": "lint-staged", + "test": "mocha tests.js" + }, + "version": "2.0.0" +} diff --git a/node_modules/ms/readme.md b/node_modules/ms/readme.md new file mode 100644 index 000000000..84a9974cc --- /dev/null +++ b/node_modules/ms/readme.md @@ -0,0 +1,51 @@ +# ms + +[![Build Status](https://travis-ci.org/zeit/ms.svg?branch=master)](https://travis-ci.org/zeit/ms) +[![Slack Channel](http://zeit-slackin.now.sh/badge.svg)](https://zeit.chat/) + +Use this package to easily convert various time formats to milliseconds. + +## Examples + +```js +ms('2 days') // 172800000 +ms('1d') // 86400000 +ms('10h') // 36000000 +ms('2.5 hrs') // 9000000 +ms('2h') // 7200000 +ms('1m') // 60000 +ms('5s') // 5000 +ms('1y') // 31557600000 +ms('100') // 100 +``` + +### Convert from milliseconds + +```js +ms(60000) // "1m" +ms(2 * 60000) // "2m" +ms(ms('10 hours')) // "10h" +``` + +### Time format written-out + +```js +ms(60000, { long: true }) // "1 minute" +ms(2 * 60000, { long: true }) // "2 minutes" +ms(ms('10 hours'), { long: true }) // "10 hours" +``` + +## Features + +- Works both in [node](https://nodejs.org) and in the browser. +- If a number is supplied to `ms`, a string with a unit is returned. +- If a string that contains the number is supplied, it returns it as a number (e.g.: it returns `100` for `'100'`). +- If you pass a string with a number and a valid unit, the number of equivalent ms is returned. + +## Caught a bug? + +1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device +2. Link the package to the global module directory: `npm link` +3. Within the module you want to test your local development instance of ms, just link it to the dependencies: `npm link ms`. Instead of the default one from npm, node will now use your clone of ms! 
+ +As always, you can run the tests using: `npm test` diff --git a/node_modules/negotiator/HISTORY.md b/node_modules/negotiator/HISTORY.md new file mode 100644 index 000000000..6d06c76aa --- /dev/null +++ b/node_modules/negotiator/HISTORY.md @@ -0,0 +1,103 @@ +0.6.2 / 2019-04-29 +================== + + * Fix sorting charset, encoding, and language with extra parameters + +0.6.1 / 2016-05-02 +================== + + * perf: improve `Accept` parsing speed + * perf: improve `Accept-Charset` parsing speed + * perf: improve `Accept-Encoding` parsing speed + * perf: improve `Accept-Language` parsing speed + +0.6.0 / 2015-09-29 +================== + + * Fix including type extensions in parameters in `Accept` parsing + * Fix parsing `Accept` parameters with quoted equals + * Fix parsing `Accept` parameters with quoted semicolons + * Lazy-load modules from main entry point + * perf: delay type concatenation until needed + * perf: enable strict mode + * perf: hoist regular expressions + * perf: remove closures getting spec properties + * perf: remove a closure from media type parsing + * perf: remove property delete from media type parsing + +0.5.3 / 2015-05-10 +================== + + * Fix media type parameter matching to be case-insensitive + +0.5.2 / 2015-05-06 +================== + + * Fix comparing media types with quoted values + * Fix splitting media types with quoted commas + +0.5.1 / 2015-02-14 +================== + + * Fix preference sorting to be stable for long acceptable lists + +0.5.0 / 2014-12-18 +================== + + * Fix list return order when large accepted list + * Fix missing identity encoding when q=0 exists + * Remove dynamic building of Negotiator class + +0.4.9 / 2014-10-14 +================== + + * Fix error when media type has invalid parameter + +0.4.8 / 2014-09-28 +================== + + * Fix all negotiations to be case-insensitive + * Stable sort preferences of same quality according to client order + * Support Node.js 0.6 + +0.4.7 / 2014-06-24 +================== + + * Handle invalid provided languages + * Handle invalid provided media types + +0.4.6 / 2014-06-11 +================== + + * Order by specificity when quality is the same + +0.4.5 / 2014-05-29 +================== + + * Fix regression in empty header handling + +0.4.4 / 2014-05-29 +================== + + * Fix behaviors when headers are not present + +0.4.3 / 2014-04-16 +================== + + * Handle slashes on media params correctly + +0.4.2 / 2014-02-28 +================== + + * Fix media type sorting + * Handle media types params strictly + +0.4.1 / 2014-01-16 +================== + + * Use most specific matches + +0.4.0 / 2014-01-09 +================== + + * Remove preferred prefix from methods diff --git a/node_modules/negotiator/LICENSE b/node_modules/negotiator/LICENSE new file mode 100644 index 000000000..ea6b9e2e9 --- /dev/null +++ b/node_modules/negotiator/LICENSE @@ -0,0 +1,24 @@ +(The MIT License) + +Copyright (c) 2012-2014 Federico Romero +Copyright (c) 2012-2014 Isaac Z. 
Schlueter +Copyright (c) 2014-2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/negotiator/README.md b/node_modules/negotiator/README.md new file mode 100644 index 000000000..04a67ff76 --- /dev/null +++ b/node_modules/negotiator/README.md @@ -0,0 +1,203 @@ +# negotiator + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +An HTTP content negotiator for Node.js + +## Installation + +```sh +$ npm install negotiator +``` + +## API + +```js +var Negotiator = require('negotiator') +``` + +### Accept Negotiation + +```js +availableMediaTypes = ['text/html', 'text/plain', 'application/json'] + +// The negotiator constructor receives a request object +negotiator = new Negotiator(request) + +// Let's say Accept header is 'text/html, application/*;q=0.2, image/jpeg;q=0.8' + +negotiator.mediaTypes() +// -> ['text/html', 'image/jpeg', 'application/*'] + +negotiator.mediaTypes(availableMediaTypes) +// -> ['text/html', 'application/json'] + +negotiator.mediaType(availableMediaTypes) +// -> 'text/html' +``` + +You can check a working example at `examples/accept.js`. + +#### Methods + +##### mediaType() + +Returns the most preferred media type from the client. + +##### mediaType(availableMediaType) + +Returns the most preferred media type from a list of available media types. + +##### mediaTypes() + +Returns an array of preferred media types ordered by the client preference. + +##### mediaTypes(availableMediaTypes) + +Returns an array of preferred media types ordered by priority from a list of +available media types. + +### Accept-Language Negotiation + +```js +negotiator = new Negotiator(request) + +availableLanguages = ['en', 'es', 'fr'] + +// Let's say Accept-Language header is 'en;q=0.8, es, pt' + +negotiator.languages() +// -> ['es', 'pt', 'en'] + +negotiator.languages(availableLanguages) +// -> ['es', 'en'] + +language = negotiator.language(availableLanguages) +// -> 'es' +``` + +You can check a working example at `examples/language.js`. + +#### Methods + +##### language() + +Returns the most preferred language from the client. + +##### language(availableLanguages) + +Returns the most preferred language from a list of available languages. + +##### languages() + +Returns an array of preferred languages ordered by the client preference. 
+ +##### languages(availableLanguages) + +Returns an array of preferred languages ordered by priority from a list of +available languages. + +### Accept-Charset Negotiation + +```js +availableCharsets = ['utf-8', 'iso-8859-1', 'iso-8859-5'] + +negotiator = new Negotiator(request) + +// Let's say Accept-Charset header is 'utf-8, iso-8859-1;q=0.8, utf-7;q=0.2' + +negotiator.charsets() +// -> ['utf-8', 'iso-8859-1', 'utf-7'] + +negotiator.charsets(availableCharsets) +// -> ['utf-8', 'iso-8859-1'] + +negotiator.charset(availableCharsets) +// -> 'utf-8' +``` + +You can check a working example at `examples/charset.js`. + +#### Methods + +##### charset() + +Returns the most preferred charset from the client. + +##### charset(availableCharsets) + +Returns the most preferred charset from a list of available charsets. + +##### charsets() + +Returns an array of preferred charsets ordered by the client preference. + +##### charsets(availableCharsets) + +Returns an array of preferred charsets ordered by priority from a list of +available charsets. + +### Accept-Encoding Negotiation + +```js +availableEncodings = ['identity', 'gzip'] + +negotiator = new Negotiator(request) + +// Let's say Accept-Encoding header is 'gzip, compress;q=0.2, identity;q=0.5' + +negotiator.encodings() +// -> ['gzip', 'identity', 'compress'] + +negotiator.encodings(availableEncodings) +// -> ['gzip', 'identity'] + +negotiator.encoding(availableEncodings) +// -> 'gzip' +``` + +You can check a working example at `examples/encoding.js`. + +#### Methods + +##### encoding() + +Returns the most preferred encoding from the client. + +##### encoding(availableEncodings) + +Returns the most preferred encoding from a list of available encodings. + +##### encodings() + +Returns an array of preferred encodings ordered by the client preference. + +##### encodings(availableEncodings) + +Returns an array of preferred encodings ordered by priority from a list of +available encodings. + +## See Also + +The [accepts](https://npmjs.org/package/accepts#readme) module builds on +this module and provides an alternative interface, mime type validation, +and more. + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/negotiator.svg +[npm-url]: https://npmjs.org/package/negotiator +[node-version-image]: https://img.shields.io/node/v/negotiator.svg +[node-version-url]: https://nodejs.org/en/download/ +[travis-image]: https://img.shields.io/travis/jshttp/negotiator/master.svg +[travis-url]: https://travis-ci.org/jshttp/negotiator +[coveralls-image]: https://img.shields.io/coveralls/jshttp/negotiator/master.svg +[coveralls-url]: https://coveralls.io/r/jshttp/negotiator?branch=master +[downloads-image]: https://img.shields.io/npm/dm/negotiator.svg +[downloads-url]: https://npmjs.org/package/negotiator diff --git a/node_modules/negotiator/index.js b/node_modules/negotiator/index.js new file mode 100644 index 000000000..8d4f6a226 --- /dev/null +++ b/node_modules/negotiator/index.js @@ -0,0 +1,124 @@ +/*! + * negotiator + * Copyright(c) 2012 Federico Romero + * Copyright(c) 2012-2014 Isaac Z. Schlueter + * Copyright(c) 2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Cached loaded submodules. + * @private + */ + +var modules = Object.create(null); + +/** + * Module exports. + * @public + */ + +module.exports = Negotiator; +module.exports.Negotiator = Negotiator; + +/** + * Create a Negotiator instance from a request. 
+ * @param {object} request + * @public + */ + +function Negotiator(request) { + if (!(this instanceof Negotiator)) { + return new Negotiator(request); + } + + this.request = request; +} + +Negotiator.prototype.charset = function charset(available) { + var set = this.charsets(available); + return set && set[0]; +}; + +Negotiator.prototype.charsets = function charsets(available) { + var preferredCharsets = loadModule('charset').preferredCharsets; + return preferredCharsets(this.request.headers['accept-charset'], available); +}; + +Negotiator.prototype.encoding = function encoding(available) { + var set = this.encodings(available); + return set && set[0]; +}; + +Negotiator.prototype.encodings = function encodings(available) { + var preferredEncodings = loadModule('encoding').preferredEncodings; + return preferredEncodings(this.request.headers['accept-encoding'], available); +}; + +Negotiator.prototype.language = function language(available) { + var set = this.languages(available); + return set && set[0]; +}; + +Negotiator.prototype.languages = function languages(available) { + var preferredLanguages = loadModule('language').preferredLanguages; + return preferredLanguages(this.request.headers['accept-language'], available); +}; + +Negotiator.prototype.mediaType = function mediaType(available) { + var set = this.mediaTypes(available); + return set && set[0]; +}; + +Negotiator.prototype.mediaTypes = function mediaTypes(available) { + var preferredMediaTypes = loadModule('mediaType').preferredMediaTypes; + return preferredMediaTypes(this.request.headers.accept, available); +}; + +// Backwards compatibility +Negotiator.prototype.preferredCharset = Negotiator.prototype.charset; +Negotiator.prototype.preferredCharsets = Negotiator.prototype.charsets; +Negotiator.prototype.preferredEncoding = Negotiator.prototype.encoding; +Negotiator.prototype.preferredEncodings = Negotiator.prototype.encodings; +Negotiator.prototype.preferredLanguage = Negotiator.prototype.language; +Negotiator.prototype.preferredLanguages = Negotiator.prototype.languages; +Negotiator.prototype.preferredMediaType = Negotiator.prototype.mediaType; +Negotiator.prototype.preferredMediaTypes = Negotiator.prototype.mediaTypes; + +/** + * Load the given module. + * @private + */ + +function loadModule(moduleName) { + var module = modules[moduleName]; + + if (module !== undefined) { + return module; + } + + // This uses a switch for static require analysis + switch (moduleName) { + case 'charset': + module = require('./lib/charset'); + break; + case 'encoding': + module = require('./lib/encoding'); + break; + case 'language': + module = require('./lib/language'); + break; + case 'mediaType': + module = require('./lib/mediaType'); + break; + default: + throw new Error('Cannot find module \'' + moduleName + '\''); + } + + // Store to prevent invoking require() + modules[moduleName] = module; + + return module; +} diff --git a/node_modules/negotiator/lib/charset.js b/node_modules/negotiator/lib/charset.js new file mode 100644 index 000000000..cdd014803 --- /dev/null +++ b/node_modules/negotiator/lib/charset.js @@ -0,0 +1,169 @@ +/** + * negotiator + * Copyright(c) 2012 Isaac Z. Schlueter + * Copyright(c) 2014 Federico Romero + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module exports. + * @public + */ + +module.exports = preferredCharsets; +module.exports.preferredCharsets = preferredCharsets; + +/** + * Module variables. 
+ * @private + */ + +var simpleCharsetRegExp = /^\s*([^\s;]+)\s*(?:;(.*))?$/; + +/** + * Parse the Accept-Charset header. + * @private + */ + +function parseAcceptCharset(accept) { + var accepts = accept.split(','); + + for (var i = 0, j = 0; i < accepts.length; i++) { + var charset = parseCharset(accepts[i].trim(), i); + + if (charset) { + accepts[j++] = charset; + } + } + + // trim accepts + accepts.length = j; + + return accepts; +} + +/** + * Parse a charset from the Accept-Charset header. + * @private + */ + +function parseCharset(str, i) { + var match = simpleCharsetRegExp.exec(str); + if (!match) return null; + + var charset = match[1]; + var q = 1; + if (match[2]) { + var params = match[2].split(';') + for (var j = 0; j < params.length; j++) { + var p = params[j].trim().split('='); + if (p[0] === 'q') { + q = parseFloat(p[1]); + break; + } + } + } + + return { + charset: charset, + q: q, + i: i + }; +} + +/** + * Get the priority of a charset. + * @private + */ + +function getCharsetPriority(charset, accepted, index) { + var priority = {o: -1, q: 0, s: 0}; + + for (var i = 0; i < accepted.length; i++) { + var spec = specify(charset, accepted[i], index); + + if (spec && (priority.s - spec.s || priority.q - spec.q || priority.o - spec.o) < 0) { + priority = spec; + } + } + + return priority; +} + +/** + * Get the specificity of the charset. + * @private + */ + +function specify(charset, spec, index) { + var s = 0; + if(spec.charset.toLowerCase() === charset.toLowerCase()){ + s |= 1; + } else if (spec.charset !== '*' ) { + return null + } + + return { + i: index, + o: spec.i, + q: spec.q, + s: s + } +} + +/** + * Get the preferred charsets from an Accept-Charset header. + * @public + */ + +function preferredCharsets(accept, provided) { + // RFC 2616 sec 14.2: no header = * + var accepts = parseAcceptCharset(accept === undefined ? '*' : accept || ''); + + if (!provided) { + // sorted list of all charsets + return accepts + .filter(isQuality) + .sort(compareSpecs) + .map(getFullCharset); + } + + var priorities = provided.map(function getPriority(type, index) { + return getCharsetPriority(type, accepts, index); + }); + + // sorted list of accepted charsets + return priorities.filter(isQuality).sort(compareSpecs).map(function getCharset(priority) { + return provided[priorities.indexOf(priority)]; + }); +} + +/** + * Compare two specs. + * @private + */ + +function compareSpecs(a, b) { + return (b.q - a.q) || (b.s - a.s) || (a.o - b.o) || (a.i - b.i) || 0; +} + +/** + * Get full charset string. + * @private + */ + +function getFullCharset(spec) { + return spec.charset; +} + +/** + * Check if a spec has any quality. + * @private + */ + +function isQuality(spec) { + return spec.q > 0; +} diff --git a/node_modules/negotiator/lib/encoding.js b/node_modules/negotiator/lib/encoding.js new file mode 100644 index 000000000..8432cd77b --- /dev/null +++ b/node_modules/negotiator/lib/encoding.js @@ -0,0 +1,184 @@ +/** + * negotiator + * Copyright(c) 2012 Isaac Z. Schlueter + * Copyright(c) 2014 Federico Romero + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module exports. + * @public + */ + +module.exports = preferredEncodings; +module.exports.preferredEncodings = preferredEncodings; + +/** + * Module variables. + * @private + */ + +var simpleEncodingRegExp = /^\s*([^\s;]+)\s*(?:;(.*))?$/; + +/** + * Parse the Accept-Encoding header. 
+ * @private + */ + +function parseAcceptEncoding(accept) { + var accepts = accept.split(','); + var hasIdentity = false; + var minQuality = 1; + + for (var i = 0, j = 0; i < accepts.length; i++) { + var encoding = parseEncoding(accepts[i].trim(), i); + + if (encoding) { + accepts[j++] = encoding; + hasIdentity = hasIdentity || specify('identity', encoding); + minQuality = Math.min(minQuality, encoding.q || 1); + } + } + + if (!hasIdentity) { + /* + * If identity doesn't explicitly appear in the accept-encoding header, + * it's added to the list of acceptable encoding with the lowest q + */ + accepts[j++] = { + encoding: 'identity', + q: minQuality, + i: i + }; + } + + // trim accepts + accepts.length = j; + + return accepts; +} + +/** + * Parse an encoding from the Accept-Encoding header. + * @private + */ + +function parseEncoding(str, i) { + var match = simpleEncodingRegExp.exec(str); + if (!match) return null; + + var encoding = match[1]; + var q = 1; + if (match[2]) { + var params = match[2].split(';'); + for (var j = 0; j < params.length; j++) { + var p = params[j].trim().split('='); + if (p[0] === 'q') { + q = parseFloat(p[1]); + break; + } + } + } + + return { + encoding: encoding, + q: q, + i: i + }; +} + +/** + * Get the priority of an encoding. + * @private + */ + +function getEncodingPriority(encoding, accepted, index) { + var priority = {o: -1, q: 0, s: 0}; + + for (var i = 0; i < accepted.length; i++) { + var spec = specify(encoding, accepted[i], index); + + if (spec && (priority.s - spec.s || priority.q - spec.q || priority.o - spec.o) < 0) { + priority = spec; + } + } + + return priority; +} + +/** + * Get the specificity of the encoding. + * @private + */ + +function specify(encoding, spec, index) { + var s = 0; + if(spec.encoding.toLowerCase() === encoding.toLowerCase()){ + s |= 1; + } else if (spec.encoding !== '*' ) { + return null + } + + return { + i: index, + o: spec.i, + q: spec.q, + s: s + } +}; + +/** + * Get the preferred encodings from an Accept-Encoding header. + * @public + */ + +function preferredEncodings(accept, provided) { + var accepts = parseAcceptEncoding(accept || ''); + + if (!provided) { + // sorted list of all encodings + return accepts + .filter(isQuality) + .sort(compareSpecs) + .map(getFullEncoding); + } + + var priorities = provided.map(function getPriority(type, index) { + return getEncodingPriority(type, accepts, index); + }); + + // sorted list of accepted encodings + return priorities.filter(isQuality).sort(compareSpecs).map(function getEncoding(priority) { + return provided[priorities.indexOf(priority)]; + }); +} + +/** + * Compare two specs. + * @private + */ + +function compareSpecs(a, b) { + return (b.q - a.q) || (b.s - a.s) || (a.o - b.o) || (a.i - b.i) || 0; +} + +/** + * Get full encoding string. + * @private + */ + +function getFullEncoding(spec) { + return spec.encoding; +} + +/** + * Check if a spec has any quality. + * @private + */ + +function isQuality(spec) { + return spec.q > 0; +} diff --git a/node_modules/negotiator/lib/language.js b/node_modules/negotiator/lib/language.js new file mode 100644 index 000000000..62f737f00 --- /dev/null +++ b/node_modules/negotiator/lib/language.js @@ -0,0 +1,179 @@ +/** + * negotiator + * Copyright(c) 2012 Isaac Z. Schlueter + * Copyright(c) 2014 Federico Romero + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module exports. 
+ * @public + */ + +module.exports = preferredLanguages; +module.exports.preferredLanguages = preferredLanguages; + +/** + * Module variables. + * @private + */ + +var simpleLanguageRegExp = /^\s*([^\s\-;]+)(?:-([^\s;]+))?\s*(?:;(.*))?$/; + +/** + * Parse the Accept-Language header. + * @private + */ + +function parseAcceptLanguage(accept) { + var accepts = accept.split(','); + + for (var i = 0, j = 0; i < accepts.length; i++) { + var language = parseLanguage(accepts[i].trim(), i); + + if (language) { + accepts[j++] = language; + } + } + + // trim accepts + accepts.length = j; + + return accepts; +} + +/** + * Parse a language from the Accept-Language header. + * @private + */ + +function parseLanguage(str, i) { + var match = simpleLanguageRegExp.exec(str); + if (!match) return null; + + var prefix = match[1], + suffix = match[2], + full = prefix; + + if (suffix) full += "-" + suffix; + + var q = 1; + if (match[3]) { + var params = match[3].split(';') + for (var j = 0; j < params.length; j++) { + var p = params[j].split('='); + if (p[0] === 'q') q = parseFloat(p[1]); + } + } + + return { + prefix: prefix, + suffix: suffix, + q: q, + i: i, + full: full + }; +} + +/** + * Get the priority of a language. + * @private + */ + +function getLanguagePriority(language, accepted, index) { + var priority = {o: -1, q: 0, s: 0}; + + for (var i = 0; i < accepted.length; i++) { + var spec = specify(language, accepted[i], index); + + if (spec && (priority.s - spec.s || priority.q - spec.q || priority.o - spec.o) < 0) { + priority = spec; + } + } + + return priority; +} + +/** + * Get the specificity of the language. + * @private + */ + +function specify(language, spec, index) { + var p = parseLanguage(language) + if (!p) return null; + var s = 0; + if(spec.full.toLowerCase() === p.full.toLowerCase()){ + s |= 4; + } else if (spec.prefix.toLowerCase() === p.full.toLowerCase()) { + s |= 2; + } else if (spec.full.toLowerCase() === p.prefix.toLowerCase()) { + s |= 1; + } else if (spec.full !== '*' ) { + return null + } + + return { + i: index, + o: spec.i, + q: spec.q, + s: s + } +}; + +/** + * Get the preferred languages from an Accept-Language header. + * @public + */ + +function preferredLanguages(accept, provided) { + // RFC 2616 sec 14.4: no header = * + var accepts = parseAcceptLanguage(accept === undefined ? '*' : accept || ''); + + if (!provided) { + // sorted list of all languages + return accepts + .filter(isQuality) + .sort(compareSpecs) + .map(getFullLanguage); + } + + var priorities = provided.map(function getPriority(type, index) { + return getLanguagePriority(type, accepts, index); + }); + + // sorted list of accepted languages + return priorities.filter(isQuality).sort(compareSpecs).map(function getLanguage(priority) { + return provided[priorities.indexOf(priority)]; + }); +} + +/** + * Compare two specs. + * @private + */ + +function compareSpecs(a, b) { + return (b.q - a.q) || (b.s - a.s) || (a.o - b.o) || (a.i - b.i) || 0; +} + +/** + * Get full language string. + * @private + */ + +function getFullLanguage(spec) { + return spec.full; +} + +/** + * Check if a spec has any quality. + * @private + */ + +function isQuality(spec) { + return spec.q > 0; +} diff --git a/node_modules/negotiator/lib/mediaType.js b/node_modules/negotiator/lib/mediaType.js new file mode 100644 index 000000000..67309dd75 --- /dev/null +++ b/node_modules/negotiator/lib/mediaType.js @@ -0,0 +1,294 @@ +/** + * negotiator + * Copyright(c) 2012 Isaac Z. 
Schlueter + * Copyright(c) 2014 Federico Romero + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict'; + +/** + * Module exports. + * @public + */ + +module.exports = preferredMediaTypes; +module.exports.preferredMediaTypes = preferredMediaTypes; + +/** + * Module variables. + * @private + */ + +var simpleMediaTypeRegExp = /^\s*([^\s\/;]+)\/([^;\s]+)\s*(?:;(.*))?$/; + +/** + * Parse the Accept header. + * @private + */ + +function parseAccept(accept) { + var accepts = splitMediaTypes(accept); + + for (var i = 0, j = 0; i < accepts.length; i++) { + var mediaType = parseMediaType(accepts[i].trim(), i); + + if (mediaType) { + accepts[j++] = mediaType; + } + } + + // trim accepts + accepts.length = j; + + return accepts; +} + +/** + * Parse a media type from the Accept header. + * @private + */ + +function parseMediaType(str, i) { + var match = simpleMediaTypeRegExp.exec(str); + if (!match) return null; + + var params = Object.create(null); + var q = 1; + var subtype = match[2]; + var type = match[1]; + + if (match[3]) { + var kvps = splitParameters(match[3]).map(splitKeyValuePair); + + for (var j = 0; j < kvps.length; j++) { + var pair = kvps[j]; + var key = pair[0].toLowerCase(); + var val = pair[1]; + + // get the value, unwrapping quotes + var value = val && val[0] === '"' && val[val.length - 1] === '"' + ? val.substr(1, val.length - 2) + : val; + + if (key === 'q') { + q = parseFloat(value); + break; + } + + // store parameter + params[key] = value; + } + } + + return { + type: type, + subtype: subtype, + params: params, + q: q, + i: i + }; +} + +/** + * Get the priority of a media type. + * @private + */ + +function getMediaTypePriority(type, accepted, index) { + var priority = {o: -1, q: 0, s: 0}; + + for (var i = 0; i < accepted.length; i++) { + var spec = specify(type, accepted[i], index); + + if (spec && (priority.s - spec.s || priority.q - spec.q || priority.o - spec.o) < 0) { + priority = spec; + } + } + + return priority; +} + +/** + * Get the specificity of the media type. + * @private + */ + +function specify(type, spec, index) { + var p = parseMediaType(type); + var s = 0; + + if (!p) { + return null; + } + + if(spec.type.toLowerCase() == p.type.toLowerCase()) { + s |= 4 + } else if(spec.type != '*') { + return null; + } + + if(spec.subtype.toLowerCase() == p.subtype.toLowerCase()) { + s |= 2 + } else if(spec.subtype != '*') { + return null; + } + + var keys = Object.keys(spec.params); + if (keys.length > 0) { + if (keys.every(function (k) { + return spec.params[k] == '*' || (spec.params[k] || '').toLowerCase() == (p.params[k] || '').toLowerCase(); + })) { + s |= 1 + } else { + return null + } + } + + return { + i: index, + o: spec.i, + q: spec.q, + s: s, + } +} + +/** + * Get the preferred media types from an Accept header. + * @public + */ + +function preferredMediaTypes(accept, provided) { + // RFC 2616 sec 14.2: no header = */* + var accepts = parseAccept(accept === undefined ? '*/*' : accept || ''); + + if (!provided) { + // sorted list of all types + return accepts + .filter(isQuality) + .sort(compareSpecs) + .map(getFullType); + } + + var priorities = provided.map(function getPriority(type, index) { + return getMediaTypePriority(type, accepts, index); + }); + + // sorted list of accepted types + return priorities.filter(isQuality).sort(compareSpecs).map(function getType(priority) { + return provided[priorities.indexOf(priority)]; + }); +} + +/** + * Compare two specs. 
+ * @private + */ + +function compareSpecs(a, b) { + return (b.q - a.q) || (b.s - a.s) || (a.o - b.o) || (a.i - b.i) || 0; +} + +/** + * Get full type string. + * @private + */ + +function getFullType(spec) { + return spec.type + '/' + spec.subtype; +} + +/** + * Check if a spec has any quality. + * @private + */ + +function isQuality(spec) { + return spec.q > 0; +} + +/** + * Count the number of quotes in a string. + * @private + */ + +function quoteCount(string) { + var count = 0; + var index = 0; + + while ((index = string.indexOf('"', index)) !== -1) { + count++; + index++; + } + + return count; +} + +/** + * Split a key value pair. + * @private + */ + +function splitKeyValuePair(str) { + var index = str.indexOf('='); + var key; + var val; + + if (index === -1) { + key = str; + } else { + key = str.substr(0, index); + val = str.substr(index + 1); + } + + return [key, val]; +} + +/** + * Split an Accept header into media types. + * @private + */ + +function splitMediaTypes(accept) { + var accepts = accept.split(','); + + for (var i = 1, j = 0; i < accepts.length; i++) { + if (quoteCount(accepts[j]) % 2 == 0) { + accepts[++j] = accepts[i]; + } else { + accepts[j] += ',' + accepts[i]; + } + } + + // trim accepts + accepts.length = j + 1; + + return accepts; +} + +/** + * Split a string of parameters. + * @private + */ + +function splitParameters(str) { + var parameters = str.split(';'); + + for (var i = 1, j = 0; i < parameters.length; i++) { + if (quoteCount(parameters[j]) % 2 == 0) { + parameters[++j] = parameters[i]; + } else { + parameters[j] += ';' + parameters[i]; + } + } + + // trim parameters + parameters.length = j + 1; + + for (var i = 0; i < parameters.length; i++) { + parameters[i] = parameters[i].trim(); + } + + return parameters; +} diff --git a/node_modules/negotiator/package.json b/node_modules/negotiator/package.json new file mode 100644 index 000000000..62dcdd691 --- /dev/null +++ b/node_modules/negotiator/package.json @@ -0,0 +1,84 @@ +{ + "_from": "negotiator@0.6.2", + "_id": "negotiator@0.6.2", + "_inBundle": false, + "_integrity": "sha512-hZXc7K2e+PgeI1eDBe/10Ard4ekbfrrqG8Ep+8Jmf4JID2bNg7NvCPOZN+kfF574pFQI7mum2AUqDidoKqcTOw==", + "_location": "/negotiator", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "negotiator@0.6.2", + "name": "negotiator", + "escapedName": "negotiator", + "rawSpec": "0.6.2", + "saveSpec": null, + "fetchSpec": "0.6.2" + }, + "_requiredBy": [ + "/accepts" + ], + "_resolved": "https://registry.npmjs.org/negotiator/-/negotiator-0.6.2.tgz", + "_shasum": "feacf7ccf525a77ae9634436a64883ffeca346fb", + "_spec": "negotiator@0.6.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\accepts", + "bugs": { + "url": "https://github.com/jshttp/negotiator/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Federico Romero", + "email": "federico.romero@outboxlabs.com" + }, + { + "name": "Isaac Z. 
Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me/" + } + ], + "deprecated": false, + "description": "HTTP content negotiation", + "devDependencies": { + "eslint": "5.16.0", + "eslint-plugin-markdown": "1.0.0", + "mocha": "6.1.4", + "nyc": "14.0.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "lib/", + "HISTORY.md", + "LICENSE", + "index.js", + "README.md" + ], + "homepage": "https://github.com/jshttp/negotiator#readme", + "keywords": [ + "http", + "content negotiation", + "accept", + "accept-language", + "accept-encoding", + "accept-charset" + ], + "license": "MIT", + "name": "negotiator", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/negotiator.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --check-leaks --bail test/", + "test-cov": "nyc --reporter=html --reporter=text npm test", + "test-travis": "nyc --reporter=text npm test" + }, + "version": "0.6.2" +} diff --git a/node_modules/nodemon/.eslintrc.json b/node_modules/nodemon/.eslintrc.json new file mode 100644 index 000000000..4fe657217 --- /dev/null +++ b/node_modules/nodemon/.eslintrc.json @@ -0,0 +1,19 @@ +{ + "env": { + "browser": true, + "commonjs": true, + "es2021": true + }, + "parserOptions": { + "ecmaVersion": 12 + }, + "rules": { + "space-before-function-paren": [ + 2, + { + "anonymous": "ignore", + "named": "never" + } + ] + } +} diff --git a/node_modules/nodemon/.jshintrc b/node_modules/nodemon/.jshintrc new file mode 100644 index 000000000..fb991ae89 --- /dev/null +++ b/node_modules/nodemon/.jshintrc @@ -0,0 +1,16 @@ +{ + "browser": true, + "camelcase": true, + "curly": true, + "devel": true, + "eqeqeq": true, + "forin": true, + "indent": 2, + "noarg": true, + "node": true, + "quotmark": "single", + "undef": true, + "strict": false, + "unused": true +} + diff --git a/node_modules/nodemon/.releaserc b/node_modules/nodemon/.releaserc new file mode 100644 index 000000000..0a563eee1 --- /dev/null +++ b/node_modules/nodemon/.releaserc @@ -0,0 +1,3 @@ +{ + "branches": ["main"] +} diff --git a/node_modules/nodemon/.travis.yml b/node_modules/nodemon/.travis.yml new file mode 100644 index 000000000..8057e91c4 --- /dev/null +++ b/node_modules/nodemon/.travis.yml @@ -0,0 +1,17 @@ +language: node_js +cache: + directories: + - ~/.npm +notifications: + email: false +node_js: + - '14' + - '12' + - '10' +before_install: + - if [ "$TRAVIS_PULL_REQUEST_BRANCH" == "" ]; then echo "//registry.npmjs.org/:_authToken=\${NPM_TOKEN}" >> .npmrc; fi +after_success: + - npm run semantic-release +branches: + except: + - /^v\d+\.\d+\.\d+$/ diff --git a/node_modules/nodemon/LICENSE b/node_modules/nodemon/LICENSE new file mode 100644 index 000000000..19c91a2f3 --- /dev/null +++ b/node_modules/nodemon/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2010 - present, Remy Sharp, https://remysharp.com + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/nodemon/README.md b/node_modules/nodemon/README.md new file mode 100644 index 000000000..bfc6b7a5d --- /dev/null +++ b/node_modules/nodemon/README.md @@ -0,0 +1,381 @@ +

+  Nodemon Logo (header image)

+ +# nodemon + +nodemon is a tool that helps develop node.js based applications by automatically restarting the node application when file changes in the directory are detected. + +nodemon does **not** require *any* additional changes to your code or method of development. nodemon is a replacement wrapper for `node`. To use `nodemon`, replace the word `node` on the command line when executing your script. + +[![NPM version](https://badge.fury.io/js/nodemon.svg)](https://npmjs.org/package/nodemon) +[![Travis Status](https://travis-ci.org/remy/nodemon.svg?branch=master)](https://travis-ci.org/remy/nodemon) [![Backers on Open Collective](https://opencollective.com/nodemon/backers/badge.svg)](#backers) [![Sponsors on Open Collective](https://opencollective.com/nodemon/sponsors/badge.svg)](#sponsors) + +# Installation + +Either through cloning with git or by using [npm](http://npmjs.org) (the recommended way): + +```bash +npm install -g nodemon # or using yarn: yarn global add nodemon +``` + +And nodemon will be installed globally to your system path. + +You can also install nodemon as a development dependency: + +```bash +npm install --save-dev nodemon # or using yarn: yarn add nodemon -D +``` + +With a local installation, nodemon will not be available in your system path or you can't use it directly from the command line. Instead, the local installation of nodemon can be run by calling it from within an npm script (such as `npm start`) or using `npx nodemon`. + +# Usage + +nodemon wraps your application, so you can pass all the arguments you would normally pass to your app: + +```bash +nodemon [your node app] +``` + +For CLI options, use the `-h` (or `--help`) argument: + +```bash +nodemon -h +``` + +Using nodemon is simple, if my application accepted a host and port as the arguments, I would start it as so: + +```bash +nodemon ./server.js localhost 8080 +``` + +Any output from this script is prefixed with `[nodemon]`, otherwise all output from your application, errors included, will be echoed out as expected. + +You can also pass the `inspect` flag to node through the command line as you would normally: + +```bash +nodemon --inspect ./server.js 80 +``` + +If you have a `package.json` file for your app, you can omit the main script entirely and nodemon will read the `package.json` for the `main` property and use that value as the app ([ref](https://github.com/remy/nodemon/issues/14)). + +nodemon will also search for the `scripts.start` property in `package.json` (as of nodemon 1.1.x). + +Also check out the [FAQ](https://github.com/remy/nodemon/blob/master/faq.md) or [issues](https://github.com/remy/nodemon/issues) for nodemon. + +## Automatic re-running + +nodemon was originally written to restart hanging processes such as web servers, but now supports apps that cleanly exit. If your script exits cleanly, nodemon will continue to monitor the directory (or directories) and restart the script if there are any changes. + +## Manual restarting + +Whilst nodemon is running, if you need to manually restart your application, instead of stopping and restart nodemon, you can type `rs` with a carriage return, and nodemon will restart your process. + +## Config files + +nodemon supports local and global configuration files. These are usually named `nodemon.json` and can be located in the current working directory or in your home directory. An alternative local configuration file can be specified with the `--config ` option. 
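As a hedged aside on the sentence above: `--config` takes a path to an alternative JSON config file (the CLI parser later in this diff stores it as `configFile`), and the same option is available when nodemon is required as a module. A minimal sketch, assuming placeholder file names:

```js
// Programmatic rough equivalent of `nodemon --config nodemon.dev.json server.js`.
// `nodemon.dev.json` and `server.js` are placeholder names, not files from this project.
const nodemon = require('nodemon');

nodemon({
  configFile: 'nodemon.dev.json', // the setting the --config flag populates
  script: 'server.js',
});

// when required as a module, nodemon is also an event emitter
nodemon.on('restart', (files) => console.log('[nodemon] restarted due to', files));
```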
+ +The specificity is as follows, so that a command line argument will always override the config file settings: + +- command line arguments +- local config +- global config + +A config file can take any of the command line arguments as JSON key values, for example: + +```json +{ + "verbose": true, + "ignore": ["*.test.js", "fixtures/*"], + "execMap": { + "rb": "ruby", + "pde": "processing --sketch={{pwd}} --run" + } +} +``` + +The above `nodemon.json` file might be my global config so that I have support for ruby files and processing files, and I can run `nodemon demo.pde` and nodemon will automatically know how to run the script even though out of the box support for processing scripts. + +A further example of options can be seen in [sample-nodemon.md](https://github.com/remy/nodemon/blob/master/doc/sample-nodemon.md) + +### package.json + +If you want to keep all your package configurations in one place, nodemon supports using `package.json` for configuration. +Specify the config in the same format as you would for a config file but under `nodemonConfig` in the `package.json` file, for example, take the following `package.json`: + +```json +{ + "name": "nodemon", + "homepage": "http://nodemon.io", + "...": "... other standard package.json values", + "nodemonConfig": { + "ignore": ["test/*", "docs/*"], + "delay": 2500 + } +} +``` + +Note that if you specify a `--config` file or provide a local `nodemon.json` any `package.json` config is ignored. + +*This section needs better documentation, but for now you can also see `nodemon --help config` ([also here](https://github.com/remy/nodemon/blob/master/doc/cli/config.txt))*. + +## Using nodemon as a module + +Please see [doc/requireable.md](doc/requireable.md) + +## Using nodemon as child process + +Please see [doc/events.md](doc/events.md#Using_nodemon_as_child_process) + +## Running non-node scripts + +nodemon can also be used to execute and monitor other programs. nodemon will read the file extension of the script being run and monitor that extension instead of `.js` if there's no `nodemon.json`: + +```bash +nodemon --exec "python -v" ./app.py +``` + +Now nodemon will run `app.py` with python in verbose mode (note that if you're not passing args to the exec program, you don't need the quotes), and look for new or modified files with the `.py` extension. + +### Default executables + +Using the `nodemon.json` config file, you can define your own default executables using the `execMap` property. This is particularly useful if you're working with a language that isn't supported by default by nodemon. + +To add support for nodemon to know about the `.pl` extension (for Perl), the `nodemon.json` file would add: + +```json +{ + "execMap": { + "pl": "perl" + } +} +``` + +Now running the following, nodemon will know to use `perl` as the executable: + +```bash +nodemon script.pl +``` + +It's generally recommended to use the global `nodemon.json` to add your own `execMap` options. However, if there's a common default that's missing, this can be merged in to the project so that nodemon supports it by default, by changing [default.js](https://github.com/remy/nodemon/blob/master/lib/config/defaults.js) and sending a pull request. + +## Monitoring multiple directories + +By default nodemon monitors the current working directory. 
If you want to take control of that option, use the `--watch` option to add specific paths: + +```bash +nodemon --watch app --watch libs app/server.js +``` + +Now nodemon will only restart if there are changes in the `./app` or `./libs` directory. By default nodemon will traverse sub-directories, so there's no need in explicitly including sub-directories. + +Don't use unix globbing to pass multiple directories, e.g `--watch ./lib/*`, it won't work. You need a `--watch` flag per directory watched. + +## Specifying extension watch list + +By default, nodemon looks for files with the `.js`, `.mjs`, `.coffee`, `.litcoffee`, and `.json` extensions. If you use the `--exec` option and monitor `app.py` nodemon will monitor files with the extension of `.py`. However, you can specify your own list with the `-e` (or `--ext`) switch like so: + +```bash +nodemon -e js,pug +``` + +Now nodemon will restart on any changes to files in the directory (or subdirectories) with the extensions `.js`, `.pug`. + +## Ignoring files + +By default, nodemon will only restart when a `.js` JavaScript file changes. In some cases you will want to ignore some specific files, directories or file patterns, to prevent nodemon from prematurely restarting your application. + +This can be done via the command line: + +```bash +nodemon --ignore lib/ --ignore tests/ +``` + +Or specific files can be ignored: + +```bash +nodemon --ignore lib/app.js +``` + +Patterns can also be ignored (but be sure to quote the arguments): + +```bash +nodemon --ignore 'lib/*.js' +``` + +Note that by default, nodemon will ignore the `.git`, `node_modules`, `bower_components`, `.nyc_output`, `coverage` and `.sass-cache` directories and *add* your ignored patterns to the list. If you want to indeed watch a directory like `node_modules`, you need to [override the underlying default ignore rules](https://github.com/remy/nodemon/blob/master/faq.md#overriding-the-underlying-default-ignore-rules). + +## Application isn't restarting + +In some networked environments (such as a container running nodemon reading across a mounted drive), you will need to use the `legacyWatch: true` which enables Chokidar's polling. + +Via the CLI, use either `--legacy-watch` or `-L` for short: + +```bash +nodemon -L +``` + +Though this should be a last resort as it will poll every file it can find. + +## Delaying restarting + +In some situations, you may want to wait until a number of files have changed. The timeout before checking for new file changes is 1 second. If you're uploading a number of files and it's taking some number of seconds, this could cause your app to restart multiple times unnecessarily. + +To add an extra throttle, or delay restarting, use the `--delay` command: + +```bash +nodemon --delay 10 server.js +``` + +For more precision, milliseconds can be specified. Either as a float: + +```bash +nodemon --delay 2.5 server.js +``` + +Or using the time specifier (ms): + +```bash +nodemon --delay 2500ms server.js +``` + +The delay figure is number of seconds (or milliseconds, if specified) to delay before restarting. So nodemon will only restart your app the given number of seconds after the *last* file change. + +If you are setting this value in `nodemon.json`, the value will always be interpreted in milliseconds. E.g., the following are equivalent: + +```bash +nodemon --delay 2.5 + +{ + "delay": 2500 +} +``` + +## Gracefully reloading down your script + +It is possible to have nodemon send any signal that you specify to your application. 
+ +```bash +nodemon --signal SIGHUP server.js +``` + +Your application can handle the signal as follows. + +```js +process.once("SIGHUP", function () { + reloadSomeConfiguration(); +}) +``` + +Please note that nodemon will send this signal to every process in the process tree. + +If you are using `cluster`, then each workers (as well as the master) will receive the signal. If you wish to terminate all workers on receiving a `SIGHUP`, a common pattern is to catch the `SIGHUP` in the master, and forward `SIGTERM` to all workers, while ensuring that all workers ignore `SIGHUP`. + +```js +if (cluster.isMaster) { + process.on("SIGHUP", function () { + for (const worker of Object.values(cluster.workers)) { + worker.process.kill("SIGTERM"); + } + }); +} else { + process.on("SIGHUP", function() {}) +} +``` + +## Controlling shutdown of your script + +nodemon sends a kill signal to your application when it sees a file update. If you need to clean up on shutdown inside your script you can capture the kill signal and handle it yourself. + +The following example will listen once for the `SIGUSR2` signal (used by nodemon to restart), run the clean up process and then kill itself for nodemon to continue control: + +```js +process.once('SIGUSR2', function () { + gracefulShutdown(function () { + process.kill(process.pid, 'SIGUSR2'); + }); +}); +``` + +Note that the `process.kill` is *only* called once your shutdown jobs are complete. Hat tip to [Benjie Gillam](http://www.benjiegillam.com/2011/08/node-js-clean-restart-and-faster-development-with-nodemon/) for writing this technique up. + +## Triggering events when nodemon state changes + +If you want growl like notifications when nodemon restarts or to trigger an action when an event happens, then you can either `require` nodemon or add event actions to your `nodemon.json` file. + +For example, to trigger a notification on a Mac when nodemon restarts, `nodemon.json` looks like this: + +```json +{ + "events": { + "restart": "osascript -e 'display notification \"app restarted\" with title \"nodemon\"'" + } +} +``` + +A full list of available events is listed on the [event states wiki](https://github.com/remy/nodemon/wiki/Events#states). Note that you can bind to both states and messages. + +## Pipe output to somewhere else + +```js +nodemon({ + script: ..., + stdout: false // important: this tells nodemon not to output to console +}).on('readable', function() { // the `readable` event indicates that data is ready to pick up + this.stdout.pipe(fs.createWriteStream('output.txt')); + this.stderr.pipe(fs.createWriteStream('err.txt')); +}); +``` + +## Using nodemon in your gulp workflow + +Check out the [gulp-nodemon](https://github.com/JacksonGariety/gulp-nodemon) plugin to integrate nodemon with the rest of your project's gulp workflow. + +## Using nodemon in your Grunt workflow + +Check out the [grunt-nodemon](https://github.com/ChrisWren/grunt-nodemon) plugin to integrate nodemon with the rest of your project's grunt workflow. + +## Pronunciation + +> nodemon, is it pronounced: node-mon, no-demon or node-e-mon (like pokémon)? + +Well...I've been asked this many times before. I like that I've been asked this before. There's been bets as to which one it actually is. + +The answer is simple, but possibly frustrating. I'm not saying (how I pronounce it). It's up to you to call it as you like. 
All answers are correct :) + +## Design principles + +- Fewer flags is better +- Works across all platforms +- Fewer features +- Let individuals build on top of nodemon +- Offer all CLI functionality as an API +- Contributions must have and pass tests + +Nodemon is not perfect, and CLI arguments has sprawled beyond where I'm completely happy, but perhaps it can be reduced a little one day. + +## FAQ + +See the [FAQ](https://github.com/remy/nodemon/blob/master/faq.md) and please add your own questions if you think they would help others. + +## Backers + +Thank you to all [our backers](https://opencollective.com/nodemon#backer)! 🙏 + +[![nodemon backers](https://opencollective.com/nodemon/backers.svg?width=890)](https://opencollective.com/nodemon#backers) + +## Sponsors + +Support this project by becoming a sponsor. Your logo will show up here with a link to your website. [Sponsor this project today ❤️](https://opencollective.com/nodemon#sponsor) + +
+ topratedcasinosites.co.uk (sponsor logo link: "uk casinos 10 pounds minimum deposit offers on casinosters")
+ +# License + +MIT [http://rem.mit-license.org](http://rem.mit-license.org) diff --git a/node_modules/nodemon/bin/nodemon.js b/node_modules/nodemon/bin/nodemon.js new file mode 100644 index 000000000..31aff345d --- /dev/null +++ b/node_modules/nodemon/bin/nodemon.js @@ -0,0 +1,16 @@ +#!/usr/bin/env node + +const cli = require('../lib/cli'); +const nodemon = require('../lib/'); +const options = cli.parse(process.argv); + +nodemon(options); + +const fs = require('fs'); + +// checks for available update and returns an instance +const pkg = JSON.parse(fs.readFileSync(__dirname + '/../package.json')); + +if (pkg.version.indexOf('0.0.0') !== 0 && options.noUpdateNotifier !== true) { + require('update-notifier')({ pkg }).notify(); +} diff --git a/node_modules/nodemon/bin/postinstall.js b/node_modules/nodemon/bin/postinstall.js new file mode 100644 index 000000000..34c8194d4 --- /dev/null +++ b/node_modules/nodemon/bin/postinstall.js @@ -0,0 +1,31 @@ +#!/usr/bin/env node + +function main() { + if (process.env.SUPPRESS_SUPPORT || process.env.OPENCOLLECTIVE_HIDE || process.env.CI) { + return; + } + + const message = '\u001b[32mLove nodemon? You can now support the project via the open collective:\u001b[22m\u001b[39m\n > \u001b[96m\u001b[1mhttps://opencollective.com/nodemon/donate\u001b[0m\n'; + + try { + const Configstore = require('configstore'); + const pkg = require(__dirname + '/../package.json'); + const now = Date.now(); + + var week = 1000 * 60 * 60 * 24 * 7; + + // create a Configstore instance with an unique ID e.g. + // Package name and optionally some default values + const conf = new Configstore(pkg.name); + const last = conf.get('lastCheck'); + + if (!last || now - week > last) { + console.log(message); + conf.set('lastCheck', now); + } + } catch (e) { + console.log(message); + } +} + +main(); diff --git a/node_modules/nodemon/bin/windows-kill.exe b/node_modules/nodemon/bin/windows-kill.exe new file mode 100644 index 000000000..98d7d7f7e Binary files /dev/null and b/node_modules/nodemon/bin/windows-kill.exe differ diff --git a/node_modules/nodemon/commitlint.config.js b/node_modules/nodemon/commitlint.config.js new file mode 100644 index 000000000..84dcb122a --- /dev/null +++ b/node_modules/nodemon/commitlint.config.js @@ -0,0 +1,3 @@ +module.exports = { + extends: ['@commitlint/config-conventional'], +}; diff --git a/node_modules/nodemon/doc/cli/authors.txt b/node_modules/nodemon/doc/cli/authors.txt new file mode 100644 index 000000000..6c77a12ae --- /dev/null +++ b/node_modules/nodemon/doc/cli/authors.txt @@ -0,0 +1,8 @@ + + Remy Sharp - author and maintainer + https://github.com/remy + https://twitter.com/rem + + Contributors: https://github.com/remy/nodemon/graphs/contributors ❤︎ + + Please help make nodemon better: https://github.com/remy/nodemon/ diff --git a/node_modules/nodemon/doc/cli/config.txt b/node_modules/nodemon/doc/cli/config.txt new file mode 100644 index 000000000..5de9bba57 --- /dev/null +++ b/node_modules/nodemon/doc/cli/config.txt @@ -0,0 +1,44 @@ + + Typically the options to control nodemon are passed in via the CLI and are + listed under: nodemon --help options + + nodemon can also be configured via a local and global config file: + + * $HOME/nodemon.json + * $PWD/nodemon.json OR --config + * nodemonConfig in package.json + + All config options in the .json file map 1-to-1 with the CLI options, so a + config could read as: + + { + "ext": "*.pde", + "verbose": true, + "exec": "processing --sketch=game --run" + } + + There are a limited number of variables 
available in the config (since you + could use backticks on the CLI to use a variable, backticks won't work in + the .json config). + + * {{pwd}} - the current directory + * {{filename}} - the filename you pass to nodemon + + For example: + + { + "ext": "*.pde", + "verbose": true, + "exec": "processing --sketch={{pwd}} --run" + } + + The global config file is useful for setting up default executables + instead of repeating the same option in each of your local configs: + + { + "verbose": true, + "execMap": { + "rb": "ruby", + "pde": "processing --sketch={{pwd}} --run" + } + } diff --git a/node_modules/nodemon/doc/cli/help.txt b/node_modules/nodemon/doc/cli/help.txt new file mode 100644 index 000000000..7ba4ff2a3 --- /dev/null +++ b/node_modules/nodemon/doc/cli/help.txt @@ -0,0 +1,29 @@ + Usage: nodemon [options] [script.js] [args] + + Options: + + --config file ............ alternate nodemon.json config file to use + -e, --ext ................ extensions to look for, ie. js,pug,hbs. + -x, --exec app ........... execute script with "app", ie. -x "python -v". + -w, --watch path ......... watch directory "path" or files. use once for + each directory or file to watch. + -i, --ignore ............. ignore specific files or directories. + -V, --verbose ............ show detail on what is causing restarts. + -- ........... to tell nodemon stop slurping arguments. + + Note: if the script is omitted, nodemon will try to read "main" from + package.json and without a nodemon.json, nodemon will monitor .js, .mjs, .coffee, + .litcoffee, and .json by default. + + For advanced nodemon configuration use nodemon.json: nodemon --help config + See also the sample: https://github.com/remy/nodemon/wiki/Sample-nodemon.json + + Examples: + + $ nodemon server.js + $ nodemon -w ../foo server.js apparg1 apparg2 + $ nodemon --exec python app.py + $ nodemon --exec "make build" -e "styl hbs" + $ nodemon app.js -- --config # pass config to app.js + + \x1B[1mAll options are documented under: \x1B[4mnodemon --help options\x1B[0m diff --git a/node_modules/nodemon/doc/cli/logo.txt b/node_modules/nodemon/doc/cli/logo.txt new file mode 100644 index 000000000..150f97f59 --- /dev/null +++ b/node_modules/nodemon/doc/cli/logo.txt @@ -0,0 +1,20 @@ + ; ; + kO. x0 + KMX, .:x0kc. 'KMN + 0MMM0: 'oKMMMMMMMXd, ;OMMMX + oMMMMMWKOONMMMMMMMMMMMMMWOOKWMMMMMx + OMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMK. + .oWMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMd. + KMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMMN + KMMMMMMMMMMMMMMW0k0WMMMMMMMMMMMMMMW + KMMMMMMMMMMMNk:. :xNMMMMMMMMMMMW + KMMMMMMMMMMK OMMMMMMMMMMW + KMMMMMMMMMMO xMMMMMMMMMMN + KMMMMMMMMMMO xMMMMMMMMMMN + KMMMMMMMMMMO xMMMMMMMMMMN + KMMMMMMMMMMO xMMMMMMMMMMN + KMMMMMMMMMMO xMMMMMMMMMMN + KMMMMMMMMMNc ;NMMMMMMMMMN + KMMMMMW0o' .lOWMMMMMN + KMMKd; ,oKMMN + kX: ,K0 \ No newline at end of file diff --git a/node_modules/nodemon/doc/cli/options.txt b/node_modules/nodemon/doc/cli/options.txt new file mode 100644 index 000000000..598ae63ba --- /dev/null +++ b/node_modules/nodemon/doc/cli/options.txt @@ -0,0 +1,36 @@ + +Configuration + --config .......... alternate nodemon.json config file to use + --exitcrash .............. exit on crash, allows nodemon to work with other watchers + -i, --ignore ............. ignore specific files or directories + --no-colors .............. disable color output + --signal ........ use specified kill signal instead of default (ex. SIGTERM) + -w, --watch path ......... watch directory "dir" or files. use once for each + directory or file to watch + --no-update-notifier ..... 
opt-out of update version check + +Execution + -C, --on-change-only ..... execute script on change only, not startup + --cwd .............. change into before running the script + -e, --ext ................ extensions to look for, ie. "js,pug,hbs" + -I, --no-stdin ........... nodemon passes stdin directly to child process + --spawn .................. force nodemon to use spawn (over fork) [node only] + -x, --exec app ........... execute script with "app", ie. -x "python -v" + -- ........... to tell nodemon stop slurping arguments + +Watching + -d, --delay n ............ debounce restart for "n" seconds + -L, --legacy-watch ....... use polling to watch for changes (typically needed + when watching over a network/Docker) + -P, --polling-interval ... combined with -L, milliseconds to poll for (default 100) + +Information + --dump ................... print full debug configuration + -h, --help ............... default help + --help ........... help on a specific feature. Try "--help topics" + -q, --quiet .............. minimise nodemon messages to start/stop only + -v, --version ............ current nodemon version + -V, --verbose ............ show detail on what is causing restarts + + +> Note that any unrecognised arguments are passed to the executing command. diff --git a/node_modules/nodemon/doc/cli/topics.txt b/node_modules/nodemon/doc/cli/topics.txt new file mode 100644 index 000000000..9fe3e2b59 --- /dev/null +++ b/node_modules/nodemon/doc/cli/topics.txt @@ -0,0 +1,8 @@ + + options .................. show all available nodemon options + config ................... default config options using nodemon.json + authors .................. contributors to this project + logo ..................... <3 + whoami ................... I, AM, NODEMON \o/ + + Please support https://github.com/remy/nodemon/ diff --git a/node_modules/nodemon/doc/cli/usage.txt b/node_modules/nodemon/doc/cli/usage.txt new file mode 100644 index 000000000..bca98b5e6 --- /dev/null +++ b/node_modules/nodemon/doc/cli/usage.txt @@ -0,0 +1,3 @@ + Usage: nodemon [nodemon options] [script.js] [args] + + See "nodemon --help" for more. 
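The CLI flags listed above map one-to-one onto the options object accepted when nodemon is required as a module (the parser in `lib/cli/parse.js` later in this diff performs that translation). A minimal sketch, assuming a placeholder `server.js` entry point and watch path:

```js
// Rough module-options equivalent of:
//   nodemon --watch app --ext js,pug --delay 2.5 --verbose server.js
// Paths and values are placeholders only.
const nodemon = require('nodemon');

nodemon({
  script: 'server.js', // placeholder entry point
  watch: ['app'],      // -w/--watch: one entry per watched path
  ext: 'js,pug',       // -e/--ext
  delay: 2500,         // --delay: milliseconds in option/config form
  verbose: true,       // -V/--verbose
});
```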
diff --git a/node_modules/nodemon/doc/cli/whoami.txt b/node_modules/nodemon/doc/cli/whoami.txt new file mode 100644 index 000000000..efc3382ef --- /dev/null +++ b/node_modules/nodemon/doc/cli/whoami.txt @@ -0,0 +1,9 @@ +__/\\\\\_____/\\\_______/\\\\\_______/\\\\\\\\\\\\_____/\\\\\\\\\\\\\\\__/\\\\____________/\\\\_______/\\\\\_______/\\\\\_____/\\\_ + _\/\\\\\\___\/\\\_____/\\\///\\\____\/\\\////////\\\__\/\\\///////////__\/\\\\\\________/\\\\\\_____/\\\///\\\____\/\\\\\\___\/\\\_ + _\/\\\/\\\__\/\\\___/\\\/__\///\\\__\/\\\______\//\\\_\/\\\_____________\/\\\//\\\____/\\\//\\\___/\\\/__\///\\\__\/\\\/\\\__\/\\\_ + _\/\\\//\\\_\/\\\__/\\\______\//\\\_\/\\\_______\/\\\_\/\\\\\\\\\\\_____\/\\\\///\\\/\\\/_\/\\\__/\\\______\//\\\_\/\\\//\\\_\/\\\_ + _\/\\\\//\\\\/\\\_\/\\\_______\/\\\_\/\\\_______\/\\\_\/\\\///////______\/\\\__\///\\\/___\/\\\_\/\\\_______\/\\\_\/\\\\//\\\\/\\\_ + _\/\\\_\//\\\/\\\_\//\\\______/\\\__\/\\\_______\/\\\_\/\\\_____________\/\\\____\///_____\/\\\_\//\\\______/\\\__\/\\\_\//\\\/\\\_ + _\/\\\__\//\\\\\\__\///\\\__/\\\____\/\\\_______/\\\__\/\\\_____________\/\\\_____________\/\\\__\///\\\__/\\\____\/\\\__\//\\\\\\_ + _\/\\\___\//\\\\\____\///\\\\\/_____\/\\\\\\\\\\\\/___\/\\\\\\\\\\\\\\\_\/\\\_____________\/\\\____\///\\\\\/_____\/\\\___\//\\\\\_ + _\///_____\/////_______\/////_______\////////////_____\///////////////__\///______________\///_______\/////_______\///_____\/////__ \ No newline at end of file diff --git a/node_modules/nodemon/lib/cli/index.js b/node_modules/nodemon/lib/cli/index.js new file mode 100644 index 000000000..bf9e80991 --- /dev/null +++ b/node_modules/nodemon/lib/cli/index.js @@ -0,0 +1,49 @@ +var parse = require('./parse'); + +/** + * Converts a string to command line args, in particular + * groups together quoted values. + * This is a utility function to allow calling nodemon as a required + * library, but with the CLI args passed in (instead of an object). + * + * @param {String} string + * @return {Array} + */ +function stringToArgs(string) { + var args = []; + + var parts = string.split(' '); + var length = parts.length; + var i = 0; + var open = false; + var grouped = ''; + var lead = ''; + + for (; i < length; i++) { + lead = parts[i].substring(0, 1); + if (lead === '"' || lead === '\'') { + open = lead; + grouped = parts[i].substring(1); + } else if (open && parts[i].slice(-1) === open) { + open = false; + grouped += ' ' + parts[i].slice(0, -1); + args.push(grouped); + } else if (open) { + grouped += ' ' + parts[i]; + } else { + args.push(parts[i]); + } + } + + return args; +} + +module.exports = { + parse: function (argv) { + if (typeof argv === 'string') { + argv = stringToArgs(argv); + } + + return parse(argv); + }, +}; \ No newline at end of file diff --git a/node_modules/nodemon/lib/cli/parse.js b/node_modules/nodemon/lib/cli/parse.js new file mode 100644 index 000000000..ad7400386 --- /dev/null +++ b/node_modules/nodemon/lib/cli/parse.js @@ -0,0 +1,230 @@ +/* + +nodemon is a utility for node, and replaces the use of the executable +node. So the user calls `nodemon foo.js` instead. 
+ +nodemon can be run in a number of ways: + +`nodemon` - tries to use package.json#main property to run +`nodemon` - if no package, looks for index.js +`nodemon app.js` - runs app.js +`nodemon --arg app.js --apparg` - eats arg1, and runs app.js with apparg +`nodemon --apparg` - as above, but passes apparg to package.json#main (or + index.js) +`nodemon --debug app.js + +*/ + +var fs = require('fs'); +var path = require('path'); +var existsSync = fs.existsSync || path.existsSync; + +module.exports = parse; + +/** + * Parses the command line arguments `process.argv` and returns the + * nodemon options, the user script and the executable script. + * + * @param {Array} full process arguments, including `node` leading arg + * @return {Object} { options, script, args } + */ +function parse(argv) { + if (typeof argv === 'string') { + argv = argv.split(' '); + } + + var eat = function (i, args) { + if (i <= args.length) { + return args.splice(i + 1, 1).pop(); + } + }; + + var args = argv.slice(2); + var script = null; + var nodemonOptions = { scriptPosition: null }; + + var nodemonOpt = nodemonOption.bind(null, nodemonOptions); + var lookForArgs = true; + + // move forward through the arguments + for (var i = 0; i < args.length; i++) { + // if the argument looks like a file, then stop eating + if (!script) { + if (args[i] === '.' || existsSync(args[i])) { + script = args.splice(i, 1).pop(); + + // we capture the position of the script because we'll reinsert it in + // the right place in run.js:command (though I'm not sure we should even + // take it out of the array in the first place, but this solves passing + // arguments to the exec process for now). + nodemonOptions.scriptPosition = i; + i--; + continue; + } + } + + if (lookForArgs) { + // respect the standard way of saying: hereafter belongs to my script + if (args[i] === '--') { + args.splice(i, 1); + nodemonOptions.scriptPosition = i; + // cycle back one argument, as we just ate this one up + i--; + + // ignore all further nodemon arguments + lookForArgs = false; + + // move to the next iteration + continue; + } + + if (nodemonOpt(args[i], eat.bind(null, i, args)) !== false) { + args.splice(i, 1); + // cycle back one argument, as we just ate this one up + i--; + } + } + } + + nodemonOptions.script = script; + nodemonOptions.args = args; + + return nodemonOptions; +} + + +/** + * Given an argument (ie. from process.argv), sets nodemon + * options and can eat up the argument value + * + * @param {Object} options object that will be updated + * @param {Sting} current argument from argv + * @param {Function} the callback to eat up the next argument in argv + * @return {Boolean} false if argument was not a nodemon arg + */ +function nodemonOption(options, arg, eatNext) { + // line separation on purpose to help legibility + if (arg === '--help' || arg === '-h' || arg === '-?') { + var help = eatNext(); + options.help = help ? 
help : true; + } else + + if (arg === '--version' || arg === '-v') { + options.version = true; + } else + + if (arg === '--no-update-notifier') { + options.noUpdateNotifier = true; + } else + + if (arg === '--spawn') { + options.spawn = true; + } else + + if (arg === '--dump') { + options.dump = true; + } else + + if (arg === '--verbose' || arg === '-V') { + options.verbose = true; + } else + + if (arg === '--legacy-watch' || arg === '-L') { + options.legacyWatch = true; + } else + + if (arg === '--polling-interval' || arg === '-P') { + options.pollingInterval = parseInt(eatNext(), 10); + } else + + // Depricated as this is "on" by default + if (arg === '--js') { + options.js = true; + } else + + if (arg === '--quiet' || arg === '-q') { + options.quiet = true; + } else + + if (arg === '--config') { + options.configFile = eatNext(); + } else + + if (arg === '--watch' || arg === '-w') { + if (!options.watch) { options.watch = []; } + options.watch.push(eatNext()); + } else + + if (arg === '--ignore' || arg === '-i') { + if (!options.ignore) { options.ignore = []; } + options.ignore.push(eatNext()); + } else + + if (arg === '--exitcrash') { + options.exitcrash = true; + } else + + if (arg === '--delay' || arg === '-d') { + options.delay = parseDelay(eatNext()); + } else + + if (arg === '--exec' || arg === '-x') { + options.exec = eatNext(); + } else + + if (arg === '--no-stdin' || arg === '-I') { + options.stdin = false; + } else + + if (arg === '--on-change-only' || arg === '-C') { + options.runOnChangeOnly = true; + } else + + if (arg === '--ext' || arg === '-e') { + options.ext = eatNext(); + } else + + if (arg === '--no-colours' || arg === '--no-colors') { + options.colours = false; + } else + + if (arg === '--signal' || arg === '-s') { + options.signal = eatNext(); + } else + + if (arg === '--cwd') { + options.cwd = eatNext(); + + // go ahead and change directory. This is primarily for nodemon tools like + // grunt-nodemon - we're doing this early because it will affect where the + // user script is searched for. + process.chdir(path.resolve(options.cwd)); + } else { + + // this means we didn't match + return false; + } +} + +/** + * Given an argument (ie. from nodemonOption()), will parse and return the + * equivalent millisecond value or 0 if the argument cannot be parsed + * + * @param {String} argument value given to the --delay option + * @return {Number} millisecond equivalent of the argument + */ +function parseDelay(value) { + var millisPerSecond = 1000; + var millis = 0; + + if (value.match(/^\d*ms$/)) { + // Explicitly parse for milliseconds when using ms time specifier + millis = parseInt(value, 10); + } else { + // Otherwise, parse for seconds, with or without time specifier then convert + millis = parseFloat(value) * millisPerSecond; + } + + return isNaN(millis) ? 0 : millis; +} + diff --git a/node_modules/nodemon/lib/config/command.js b/node_modules/nodemon/lib/config/command.js new file mode 100644 index 000000000..9839b5c7c --- /dev/null +++ b/node_modules/nodemon/lib/config/command.js @@ -0,0 +1,43 @@ +module.exports = command; + +/** + * command constructs the executable command to run in a shell including the + * user script, the command arguments. 
+ * + * @param {Object} settings Object as: + * { execOptions: { + * exec: String, + * [script: String], + * [scriptPosition: Number], + * [execArgs: Array] + * } + * } + * @return {Object} an object with the node executable and the + * arguments to the command + */ +function command(settings) { + var options = settings.execOptions; + var executable = options.exec; + var args = []; + + // after "executable" go the exec args (like --debug, etc) + if (options.execArgs) { + [].push.apply(args, options.execArgs); + } + + // then goes the user's script arguments + if (options.args) { + [].push.apply(args, options.args); + } + + // after the "executable" goes the user's script + if (options.script) { + args.splice((options.scriptPosition || 0) + + options.execArgs.length, 0, options.script); + } + + return { + executable: executable, + args: args, + }; +} diff --git a/node_modules/nodemon/lib/config/defaults.js b/node_modules/nodemon/lib/config/defaults.js new file mode 100644 index 000000000..e2a448b44 --- /dev/null +++ b/node_modules/nodemon/lib/config/defaults.js @@ -0,0 +1,28 @@ +var ignoreRoot = require('ignore-by-default').directories(); + +// default options for config.options +module.exports = { + restartable: 'rs', + colours: true, + execMap: { + py: 'python', + rb: 'ruby', + ts: 'ts-node', + // more can be added here such as ls: lsc - but please ensure it's cross + // compatible with linux, mac and windows, or make the default.js + // dynamically append the `.cmd` for node based utilities + }, + ignoreRoot: ignoreRoot.map(_ => `**/${_}/**`), + watch: ['*.*'], + stdin: true, + runOnChangeOnly: false, + verbose: false, + signal: 'SIGUSR2', + // 'stdout' refers to the default behaviour of a required nodemon's child, + // but also includes stderr. If this is false, data is still dispatched via + // nodemon.on('stdout/stderr') + stdout: true, + watchOptions: { + + }, +}; diff --git a/node_modules/nodemon/lib/config/exec.js b/node_modules/nodemon/lib/config/exec.js new file mode 100644 index 000000000..68c2a2de1 --- /dev/null +++ b/node_modules/nodemon/lib/config/exec.js @@ -0,0 +1,225 @@ +const path = require('path'); +const fs = require('fs'); +const existsSync = fs.existsSync; +const utils = require('../utils'); + +module.exports = exec; +module.exports.expandScript = expandScript; + +/** + * Reads the cwd/package.json file and looks to see if it can load a script + * and possibly an exec first from package.main, then package.start. + * + * @return {Object} exec & script if found + */ +function execFromPackage() { + // doing a try/catch because we can't use the path.exist callback pattern + // or we could, but the code would get messy, so this will do exactly + // what we're after - if the file doesn't exist, it'll throw. 
+ try { + // note: this isn't nodemon's package, it's the user's cwd package + var pkg = require(path.join(process.cwd(), 'package.json')); + if (pkg.main !== undefined) { + // no app found to run - so give them a tip and get the feck out + return { exec: null, script: pkg.main }; + } + + if (pkg.scripts && pkg.scripts.start) { + return { exec: pkg.scripts.start }; + } + } catch (e) { } + + return null; +} + +function replace(map, str) { + var re = new RegExp('{{(' + Object.keys(map).join('|') + ')}}', 'g'); + return str.replace(re, function (all, m) { + return map[m] || all || ''; + }); +} + +function expandScript(script, ext) { + if (!ext) { + ext = '.js'; + } + if (script.indexOf(ext) !== -1) { + return script; + } + + if (existsSync(path.resolve(script))) { + return script; + } + + if (existsSync(path.resolve(script + ext))) { + return script + ext; + } + + return script; +} + +/** + * Discovers all the options required to run the script + * and if a custom exec has been passed in, then it will + * also try to work out what extensions to monitor and + * whether there's a special way of running that script. + * + * @param {Object} nodemonOptions + * @param {Object} execMap + * @return {Object} new and updated version of nodemonOptions + */ +function exec(nodemonOptions, execMap) { + if (!execMap) { + execMap = {}; + } + + var options = utils.clone(nodemonOptions || {}); + var script; + + // if there's no script passed, try to get it from the first argument + if (!options.script && (options.args || []).length) { + script = expandScript(options.args[0], + options.ext && ('.' + (options.ext || 'js').split(',')[0])); + + // if the script was found, shift it off our args + if (script !== options.args[0]) { + options.script = script; + options.args.shift(); + } + } + + // if there's no exec found yet, then try to read it from the local + // package.json this logic used to sit in the cli/parse, but actually the cli + // should be parsed first, then the user options (via nodemon.json) then + // finally default down to pot shots at the directory via package.json + if (!options.exec && !options.script) { + var found = execFromPackage(); + if (found !== null) { + if (found.exec) { + options.exec = found.exec; + } + if (!options.script) { + options.script = found.script; + } + if (Array.isArray(options.args) && + options.scriptPosition === null) { + options.scriptPosition = options.args.length; + } + } + } + + // var options = utils.clone(nodemonOptions || {}); + script = path.basename(options.script || ''); + + var scriptExt = path.extname(script).slice(1); + + var extension = options.ext; + if (extension === undefined) { + var isJS = scriptExt === 'js' || scriptExt === 'mjs'; + extension = (isJS || !scriptExt) ? 
'js,mjs' : scriptExt; + extension += ',json'; // Always watch JSON files + } + + var execDefined = !!options.exec; + + // allows the user to simplify cli usage: + // https://github.com/remy/nodemon/issues/195 + // but always give preference to the user defined argument + if (!options.exec && execMap[scriptExt] !== undefined) { + options.exec = execMap[scriptExt]; + execDefined = true; + } + + options.execArgs = nodemonOptions.execArgs || []; + + if (Array.isArray(options.exec)) { + options.execArgs = options.exec; + options.exec = options.execArgs.shift(); + } + + if (options.exec === undefined) { + options.exec = 'node'; + } else { + // allow variable substitution for {{filename}} and {{pwd}} + var substitution = replace.bind(null, { + filename: options.script, + pwd: process.cwd(), + }); + + var newExec = substitution(options.exec); + if (newExec !== options.exec && + options.exec.indexOf('{{filename}}') !== -1) { + options.script = null; + } + options.exec = newExec; + + var newExecArgs = options.execArgs.map(substitution); + if (newExecArgs.join('') !== options.execArgs.join('')) { + options.execArgs = newExecArgs; + delete options.script; + } + } + + + if (options.exec === 'node' && options.nodeArgs && options.nodeArgs.length) { + options.execArgs = options.execArgs.concat(options.nodeArgs); + } + + // note: indexOf('coffee') handles both .coffee and .litcoffee + if (!execDefined && options.exec === 'node' && + scriptExt.indexOf('coffee') !== -1) { + options.exec = 'coffee'; + + // we need to get execArgs set before the script + // for example, in `nodemon --debug my-script.coffee --my-flag`, debug is an + // execArg, while my-flag is a script arg + var leadingArgs = (options.args || []).splice(0, options.scriptPosition); + options.execArgs = options.execArgs.concat(leadingArgs); + options.scriptPosition = 0; + + if (options.execArgs.length > 0) { + // because this is the coffee executable, we need to combine the exec args + // into a single argument after the nodejs flag + options.execArgs = ['--nodejs', options.execArgs.join(' ')]; + } + } + + if (options.exec === 'coffee') { + // don't override user specified extension tracking + if (options.ext === undefined) { + if (extension) { extension += ','; } + extension += 'coffee,litcoffee'; + } + + // because windows can't find 'coffee', it needs the real file 'coffee.cmd' + if (utils.isWindows) { + options.exec += '.cmd'; + } + } + + // allow users to make a mistake on the extension to monitor + // converts .js, pug => js,pug + // BIG NOTE: user can't do this: nodemon -e *.js + // because the terminal will automatically expand the glob against + // the file system :( + extension = (extension.match(/[^,*\s]+/g) || []) + .map(ext => ext.replace(/^\./, '')) + .join(','); + + options.ext = extension; + + if (options.script) { + options.script = expandScript(options.script, + extension && ('.' 
+ extension.split(',')[0])); + } + + options.env = {}; + // make sure it's an object (and since we don't have ) + if (({}).toString.apply(nodemonOptions.env) === '[object Object]') { + options.env = utils.clone(nodemonOptions.env); + } else if (nodemonOptions.env !== undefined) { + throw new Error('nodemon env values must be an object: { PORT: 8000 }'); + } + + return options; +} diff --git a/node_modules/nodemon/lib/config/index.js b/node_modules/nodemon/lib/config/index.js new file mode 100644 index 000000000..c78c435cd --- /dev/null +++ b/node_modules/nodemon/lib/config/index.js @@ -0,0 +1,93 @@ +/** + * Manages the internal config of nodemon, checking for the state of support + * with fs.watch, how nodemon can watch files (using find or fs methods). + * + * This is *not* the user's config. + */ +var debug = require('debug')('nodemon'); +var load = require('./load'); +var rules = require('../rules'); +var utils = require('../utils'); +var pinVersion = require('../version').pin; +var command = require('./command'); +var rulesToMonitor = require('../monitor/match').rulesToMonitor; +var bus = utils.bus; + +function reset() { + rules.reset(); + + config.dirs = []; + config.options = { ignore: [], watch: [], monitor: [] }; + config.lastStarted = 0; + config.loaded = []; +} + +var config = { + run: false, + system: { + cwd: process.cwd(), + }, + required: false, + dirs: [], + timeout: 1000, + options: {}, +}; + +/** + * Take user defined settings, then detect the local machine capability, then + * look for local and global nodemon.json files and merge together the final + * settings with the config for nodemon. + * + * @param {Object} settings user defined settings for nodemon (typically on + * the cli) + * @param {Function} ready callback fired once the config is loaded + */ +config.load = function (settings, ready) { + reset(); + var config = this; + load(settings, config.options, config, function (options) { + config.options = options; + + if (options.watch.length === 0) { + // this is to catch when the watch is left blank + options.watch.push('*.*'); + } + + if (options['watch_interval']) { // jshint ignore:line + options.watchInterval = options['watch_interval']; // jshint ignore:line + } + + config.watchInterval = options.watchInterval || null; + if (options.signal) { + config.signal = options.signal; + } + + var cmd = command(config.options); + config.command = { + raw: cmd, + string: utils.stringify(cmd.executable, cmd.args), + }; + + // now run automatic checks on system adding to the config object + options.monitor = rulesToMonitor(options.watch, options.ignore, config); + + var cwd = process.cwd(); + debug('config: dirs', config.dirs); + if (config.dirs.length === 0) { + config.dirs.unshift(cwd); + } + + bus.emit('config:update', config); + pinVersion().then(function () { + ready(config); + }).catch(e => { + // this doesn't help testing, but does give exposure on syntax errors + console.error(e.stack); + setTimeout(() => { throw e; }, 0); + }); + }); +}; + +config.reset = reset; + +module.exports = config; diff --git a/node_modules/nodemon/lib/config/load.js b/node_modules/nodemon/lib/config/load.js new file mode 100644 index 000000000..bd5a03d8a --- /dev/null +++ b/node_modules/nodemon/lib/config/load.js @@ -0,0 +1,256 @@ +var debug = require('debug')('nodemon'); +var fs = require('fs'); +var path = require('path'); +var exists = fs.exists || path.exists; +var utils = require('../utils'); +var rules = require('../rules'); +var parse = require('../rules/parse'); +var exec = 
require('./exec'); +var defaults = require('./defaults'); + +module.exports = load; +module.exports.mutateExecOptions = mutateExecOptions; + +var existsSync = fs.existsSync || path.existsSync; + +function findAppScript() { + // nodemon has been run alone, so try to read the package file + // or try to read the index.js file + if (existsSync('./index.js')) { + return 'index.js'; + } +} + +/** + * Load the nodemon config, first reading the global root/nodemon.json, then + * the local nodemon.json to the exec and then overwriting using any user + * specified settings (i.e. from the cli) + * + * @param {Object} settings user defined settings + * @param {Function} ready callback that receives complete config + */ +function load(settings, options, config, callback) { + config.loaded = []; + // first load the root nodemon.json + loadFile(options, config, utils.home, function (options) { + // then load the user's local configuration file + if (settings.configFile) { + options.configFile = path.resolve(settings.configFile); + } + loadFile(options, config, process.cwd(), function (options) { + // Then merge over with the user settings (parsed from the cli). + // Note that merge protects and favours existing values over new values, + // and thus command line arguments get priority + options = utils.merge(settings, options); + + // legacy support + if (!Array.isArray(options.ignore)) { + options.ignore = [options.ignore]; + } + + if (!options.ignoreRoot) { + options.ignoreRoot = defaults.ignoreRoot; + } + + // blend the user ignore and the default ignore together + if (options.ignoreRoot && options.ignore) { + if (!Array.isArray(options.ignoreRoot)) { + options.ignoreRoot = [options.ignoreRoot]; + } + options.ignore = options.ignoreRoot.concat(options.ignore); + } else { + options.ignore = defaults.ignore.concat(options.ignore); + } + + + // add in any missing defaults + options = utils.merge(options, defaults); + + if (!options.script && !options.exec) { + var found = findAppScript(); + if (found) { + if (!options.args) { + options.args = []; + } + // if the script is found as a result of not being on the command + // line, then we move any of the pre double-dash args in execArgs + const n = options.scriptPosition === null ? + options.args.length : options.scriptPosition; + + options.execArgs = (options.execArgs || []) + .concat(options.args.splice(0, n)); + options.scriptPosition = null; + + options.script = found; + } + } + + mutateExecOptions(options); + + if (options.quiet) { + utils.quiet(); + } + + if (options.verbose) { + utils.debug = true; + } + + // simplify the ready callback to be called after the rules are normalised + // from strings to regexp through the rules lib. Note that this gets + // created *after* options is overwritten twice in the lines above. 
+ var ready = function (options) { + normaliseRules(options, callback); + }; + + // if we didn't pick up a nodemon.json file & there's no cli ignores + // then try loading an old style .nodemonignore file + if (config.loaded.length === 0) { + var legacy = loadLegacyIgnore.bind(null, options, config, ready); + + // first try .nodemonignore, if that doesn't exist, try nodemon-ignore + return legacy('.nodemonignore', function () { + legacy('nodemon-ignore', function (options) { + ready(options); + }); + }); + } + + ready(options); + }); + }); +} + +/** + * Loads the old style nodemonignore files which is a list of patterns + * in a file to ignore + * + * @param {Object} options nodemon user options + * @param {Function} success + * @param {String} filename ignore file (.nodemonignore or nodemon-ignore) + * @param {Function} fail (optional) failure callback + */ +function loadLegacyIgnore(options, config, success, filename, fail) { + var ignoreFile = path.join(process.cwd(), filename); + + exists(ignoreFile, function (exists) { + if (exists) { + config.loaded.push(ignoreFile); + return parse(ignoreFile, function (error, rules) { + options.ignore = rules.raw; + success(options); + }); + } + + if (fail) { + fail(options); + } else { + success(options); + } + }); +} + +function normaliseRules(options, ready) { + // convert ignore and watch options to rules/regexp + rules.watch.add(options.watch); + rules.ignore.add(options.ignore); + + // normalise the watch and ignore arrays + options.watch = options.watch === false ? false : rules.rules.watch; + options.ignore = rules.rules.ignore; + + ready(options); +} + +/** + * Looks for a config in the current working directory, and a config in the + * user's home directory, merging the two together, giving priority to local + * config. 
This can then be overwritten later by command line arguments + * + * @param {Function} ready callback to pass loaded settings to + */ +function loadFile(options, config, dir, ready) { + if (!ready) { + ready = function () { }; + } + + var callback = function (settings) { + // prefer the local nodemon.json and fill in missing items using + // the global options + ready(utils.merge(settings, options)); + }; + + if (!dir) { + return callback({}); + } + + var filename = options.configFile || path.join(dir, 'nodemon.json'); + + if (config.loaded.indexOf(filename) !== -1) { + // don't bother re-parsing the same config file + return callback({}); + } + + fs.readFile(filename, 'utf8', function (err, data) { + if (err) { + if (err.code === 'ENOENT') { + if (!options.configFile && dir !== utils.home) { + // if no specified local config file and local nodemon.json + // doesn't exist, try the package.json + return loadPackageJSON(config, callback); + } + } + return callback({}); + } + + var settings = {}; + + try { + settings = JSON.parse(data.toString('utf8').replace(/^\uFEFF/, '')); + if (!filename.endsWith('package.json') || settings.nodemonConfig) { + config.loaded.push(filename); + } + } catch (e) { + utils.log.fail('Failed to parse config ' + filename); + console.error(e); + process.exit(1); + } + + // options values will overwrite settings + callback(settings); + }); +} + +function loadPackageJSON(config, ready) { + if (!ready) { + ready = () => { }; + } + + const dir = process.cwd(); + const filename = path.join(dir, 'package.json'); + const packageLoadOptions = { configFile: filename }; + return loadFile(packageLoadOptions, config, dir, settings => { + ready(settings.nodemonConfig || {}); + }); +} + +function mutateExecOptions(options) { + // work out the execOptions based on the final config we have + options.execOptions = exec({ + script: options.script, + exec: options.exec, + args: options.args, + scriptPosition: options.scriptPosition, + nodeArgs: options.nodeArgs, + execArgs: options.execArgs, + ext: options.ext, + env: options.env, + }, options.execMap); + + // clean up values that we don't need at the top level + delete options.scriptPosition; + delete options.script; + delete options.args; + delete options.ext; + + return options; +} diff --git a/node_modules/nodemon/lib/help/index.js b/node_modules/nodemon/lib/help/index.js new file mode 100644 index 000000000..1054b6025 --- /dev/null +++ b/node_modules/nodemon/lib/help/index.js @@ -0,0 +1,27 @@ +var fs = require('fs'); +var path = require('path'); +const supportsColor = require('supports-color'); + +module.exports = help; + +const highlight = supportsColor.stdout ? 
'\x1B\[$1m' : ''; + +function help(item) { + if (!item) { + item = 'help'; + } else if (item === true) { // if used with -h or --help and no args + item = 'help'; + } + + // cleanse the filename to only contain letters + // aka: /\W/g but figured this was eaiser to read + item = item.replace(/[^a-z]/gi, ''); + + try { + var dir = path.join(__dirname, '..', '..', 'doc', 'cli', item + '.txt'); + var body = fs.readFileSync(dir, 'utf8'); + return body.replace(/\\x1B\[(.)m/g, highlight); + } catch (e) { + return '"' + item + '" help can\'t be found'; + } +} diff --git a/node_modules/nodemon/lib/index.js b/node_modules/nodemon/lib/index.js new file mode 100644 index 000000000..0eca5c457 --- /dev/null +++ b/node_modules/nodemon/lib/index.js @@ -0,0 +1 @@ +module.exports = require('./nodemon'); \ No newline at end of file diff --git a/node_modules/nodemon/lib/monitor/index.js b/node_modules/nodemon/lib/monitor/index.js new file mode 100644 index 000000000..89db029b0 --- /dev/null +++ b/node_modules/nodemon/lib/monitor/index.js @@ -0,0 +1,4 @@ +module.exports = { + run: require('./run'), + watch: require('./watch').watch, +}; diff --git a/node_modules/nodemon/lib/monitor/match.js b/node_modules/nodemon/lib/monitor/match.js new file mode 100644 index 000000000..2ac3b2910 --- /dev/null +++ b/node_modules/nodemon/lib/monitor/match.js @@ -0,0 +1,276 @@ +const minimatch = require('minimatch'); +const path = require('path'); +const fs = require('fs'); +const debug = require('debug')('nodemon:match'); +const utils = require('../utils'); + +module.exports = match; +module.exports.rulesToMonitor = rulesToMonitor; + +function rulesToMonitor(watch, ignore, config) { + var monitor = []; + + if (!Array.isArray(ignore)) { + if (ignore) { + ignore = [ignore]; + } else { + ignore = []; + } + } + + if (!Array.isArray(watch)) { + if (watch) { + watch = [watch]; + } else { + watch = []; + } + } + + if (watch && watch.length) { + monitor = utils.clone(watch); + } + + if (ignore) { + [].push.apply(monitor, (ignore || []).map(function (rule) { + return '!' + rule; + })); + } + + var cwd = process.cwd(); + + // next check if the monitored paths are actual directories + // or just patterns - and expand the rule to include *.* + monitor = monitor.map(function (rule) { + var not = rule.slice(0, 1) === '!'; + + if (not) { + rule = rule.slice(1); + } + + if (rule === '.' || rule === '.*') { + rule = '*.*'; + } + + var dir = path.resolve(cwd, rule); + + try { + var stat = fs.statSync(dir); + if (stat.isDirectory()) { + rule = dir; + if (rule.slice(-1) !== '/') { + rule += '/'; + } + rule += '**/*'; + + // `!not` ... sorry. + if (!not) { + config.dirs.push(dir); + } + } else { + // ensures we end up in the check that tries to get a base directory + // and then adds it to the watch list + throw new Error(); + } + } catch (e) { + var base = tryBaseDir(dir); + if (!not && base) { + if (config.dirs.indexOf(base) === -1) { + config.dirs.push(base); + } + } + } + + if (rule.slice(-1) === '/') { + // just slap on a * anyway + rule += '*'; + } + + // if the url ends with * but not **/* and not *.* + // then convert to **/* - somehow it was missed :-\ + if (rule.slice(-4) !== '**/*' && + rule.slice(-1) === '*' && + rule.indexOf('*.') === -1) { + + if (rule.slice(-2) !== '**') { + rule += '*/*'; + } + } + + + return (not ? '!' 
: '') + rule; + }); + + return monitor; +} + +function tryBaseDir(dir) { + var stat; + if (/[?*\{\[]+/.test(dir)) { // if this is pattern, then try to find the base + try { + var base = path.dirname(dir.replace(/([?*\{\[]+.*$)/, 'foo')); + stat = fs.statSync(base); + if (stat.isDirectory()) { + return base; + } + } catch (error) { + // console.log(error); + } + } else { + try { + stat = fs.statSync(dir); + // if this path is actually a single file that exists, then just monitor + // that, *specifically*. + if (stat.isFile() || stat.isDirectory()) { + return dir; + } + } catch (e) { } + } + + return false; +} + +function match(files, monitor, ext) { + // sort the rules by highest specificity (based on number of slashes) + // ignore rules (!) get sorted highest as they take precedent + const cwd = process.cwd(); + var rules = monitor.sort(function (a, b) { + var r = b.split(path.sep).length - a.split(path.sep).length; + var aIsIgnore = a.slice(0, 1) === '!'; + var bIsIgnore = b.slice(0, 1) === '!'; + + if (aIsIgnore || bIsIgnore) { + if (aIsIgnore) { + return -1; + } + + return 1; + } + + if (r === 0) { + return b.length - a.length; + } + return r; + }).map(function (s) { + var prefix = s.slice(0, 1); + + if (prefix === '!') { + if (s.indexOf('!' + cwd) === 0) { + return s; + } + + // if it starts with a period, then let's get the relative path + if (s.indexOf('!.') === 0) { + return '!' + path.resolve(cwd, s.substring(1)); + } + + return '!**' + (prefix !== path.sep ? path.sep : '') + s.slice(1); + } + + // if it starts with a period, then let's get the relative path + if (s.indexOf('.') === 0) { + return path.resolve(cwd, s); + } + + if (s.indexOf(cwd) === 0) { + return s; + } + + return '**' + (prefix !== path.sep ? path.sep : '') + s; + }); + + debug('rules', rules); + + var good = []; + var whitelist = []; // files that we won't check against the extension + var ignored = 0; + var watched = 0; + var usedRules = []; + var minimatchOpts = { + dot: true, + }; + + // enable case-insensitivity on Windows + if (utils.isWindows) { + minimatchOpts.nocase = true; + } + + files.forEach(function (file) { + file = path.resolve(cwd, file); + + var matched = false; + for (var i = 0; i < rules.length; i++) { + if (rules[i].slice(0, 1) === '!') { + if (!minimatch(file, rules[i], minimatchOpts)) { + debug('ignored', file, 'rule:', rules[i]); + ignored++; + matched = true; + break; + } + } else { + debug('matched', file, 'rule:', rules[i]); + if (minimatch(file, rules[i], minimatchOpts)) { + watched++; + + // don't repeat the output if a rule is matched + if (usedRules.indexOf(rules[i]) === -1) { + usedRules.push(rules[i]); + utils.log.detail('matched rule: ' + rules[i]); + } + + // if the rule doesn't match the WATCH EVERYTHING + // but *does* match a rule that ends with *.*, then + // white list it - in that we don't run it through + // the extension check too. + if (rules[i] !== '**' + path.sep + '*.*' && + rules[i].slice(-3) === '*.*') { + whitelist.push(file); + } else if (path.basename(file) === path.basename(rules[i])) { + // if the file matches the actual rule, then it's put on whitelist + whitelist.push(file); + } else { + good.push(file); + } + matched = true; + break; + } else { + // utils.log.detail('no match: ' + rules[i], file); + } + } + } + if (!matched) { + ignored++; + } + }); + + debug('good', good) + + // finally check the good files against the extensions that we're monitoring + if (ext) { + if (ext.indexOf(',') === -1) { + ext = '**/*.' 
+ ext; + } else { + ext = '**/*.{' + ext + '}'; + } + + good = good.filter(function (file) { + // only compare the filename to the extension test + return minimatch(path.basename(file), ext, minimatchOpts); + }); + } // else assume *.* + + var result = good.concat(whitelist); + + if (utils.isWindows) { + // fix for windows testing - I *think* this is okay to do + result = result.map(function (file) { + return file.slice(0, 1).toLowerCase() + file.slice(1); + }); + } + + return { + result: result, + ignored: ignored, + watched: watched, + total: files.length, + }; +} diff --git a/node_modules/nodemon/lib/monitor/run.js b/node_modules/nodemon/lib/monitor/run.js new file mode 100644 index 000000000..342310fc9 --- /dev/null +++ b/node_modules/nodemon/lib/monitor/run.js @@ -0,0 +1,541 @@ +var debug = require('debug')('nodemon:run'); +const statSync = require('fs').statSync; +var utils = require('../utils'); +var bus = utils.bus; +var childProcess = require('child_process'); +var spawn = childProcess.spawn; +var exec = childProcess.exec; +var execSync = childProcess.execSync; +var fork = childProcess.fork; +var watch = require('./watch').watch; +var config = require('../config'); +var child = null; // the actual child process we spawn +var killedAfterChange = false; +var noop = () => {}; +var restart = null; +var psTree = require('pstree.remy'); +var path = require('path'); +var signals = require('./signals'); +const osRelease = parseInt(require('os').release().split('.')[0], 10); + +function run(options) { + var cmd = config.command.raw; + // moved up + // we need restart function below in the global scope for run.kill + /*jshint validthis:true*/ + restart = run.bind(this, options); + run.restart = restart; + + // binding options with instance of run + // so that we can use it in run.kill + run.options = options; + + var runCmd = !options.runOnChangeOnly || config.lastStarted !== 0; + if (runCmd) { + utils.log.status('starting `' + config.command.string + '`'); + } else { + // should just watch file if command is not to be run + // had another alternate approach + // to stop process being forked/spawned in the below code + // but this approach does early exit and makes code cleaner + debug('start watch on: %s', config.options.watch); + if (config.options.watch !== false) { + watch(); + return; + } + } + + config.lastStarted = Date.now(); + + var stdio = ['pipe', 'pipe', 'pipe']; + + if (config.options.stdout) { + stdio = ['pipe', process.stdout, process.stderr]; + } + + if (config.options.stdin === false) { + stdio = [process.stdin, process.stdout, process.stderr]; + } + + var sh = 'sh'; + var shFlag = '-c'; + + const binPath = process.cwd() + '/node_modules/.bin'; + + const spawnOptions = { + env: Object.assign({}, process.env, options.execOptions.env, { + PATH: binPath + ':' + process.env.PATH, + }), + stdio: stdio, + }; + + var executable = cmd.executable; + + if (utils.isWindows) { + // if the exec includes a forward slash, reverse it for windows compat + // but *only* apply to the first command, and none of the arguments. + // ref #1251 and #1236 + if (executable.indexOf('/') !== -1) { + executable = executable + .split(' ') + .map((e, i) => { + if (i === 0) { + return path.normalize(e); + } + return e; + }) + .join(' '); + } + // taken from npm's cli: https://git.io/vNFD4 + sh = process.env.comspec || 'cmd'; + shFlag = '/d /s /c'; + spawnOptions.windowsVerbatimArguments = true; + } + + var args = runCmd ? 
utils.stringify(executable, cmd.args) : ':'; + var spawnArgs = [sh, [shFlag, args], spawnOptions]; + + const firstArg = cmd.args[0] || ''; + + var inBinPath = false; + try { + inBinPath = statSync(`${binPath}/${executable}`).isFile(); + } catch (e) {} + + // hasStdio allows us to correctly handle stdin piping + // see: https://git.io/vNtX3 + const hasStdio = utils.satisfies('>= 6.4.0 || < 5'); + + // forking helps with sub-process handling and tends to clean up better + // than spawning, but it should only be used under specific conditions + const shouldFork = + !config.options.spawn && + !inBinPath && + !(firstArg.indexOf('-') === 0) && // don't fork if there's a node exec arg + firstArg !== 'inspect' && // don't fork it's `inspect` debugger + executable === 'node' && // only fork if node + utils.version.major > 4; // only fork if node version > 4 + + if (shouldFork) { + // this assumes the first argument is the script and slices it out, since + // we're forking + var forkArgs = cmd.args.slice(1); + var env = utils.merge(options.execOptions.env, process.env); + stdio.push('ipc'); + child = fork(options.execOptions.script, forkArgs, { + env: env, + stdio: stdio, + silent: !hasStdio, + }); + utils.log.detail('forking'); + debug('fork', sh, shFlag, args); + } else { + utils.log.detail('spawning'); + child = spawn.apply(null, spawnArgs); + debug('spawn', sh, shFlag, args); + } + + if (config.required) { + var emit = { + stdout: function (data) { + bus.emit('stdout', data); + }, + stderr: function (data) { + bus.emit('stderr', data); + }, + }; + + // now work out what to bind to... + if (config.options.stdout) { + child.on('stdout', emit.stdout).on('stderr', emit.stderr); + } else { + child.stdout.on('data', emit.stdout); + child.stderr.on('data', emit.stderr); + + bus.stdout = child.stdout; + bus.stderr = child.stderr; + } + + if (shouldFork) { + child.on('message', function (message, sendHandle) { + bus.emit('message', message, sendHandle); + }); + } + } + + bus.emit('start'); + + utils.log.detail('child pid: ' + child.pid); + + child.on('error', function (error) { + bus.emit('error', error); + if (error.code === 'ENOENT') { + utils.log.error('unable to run executable: "' + cmd.executable + '"'); + process.exit(1); + } else { + utils.log.error('failed to start child process: ' + error.code); + throw error; + } + }); + + child.on('exit', function (code, signal) { + if (child && child.stdin) { + process.stdin.unpipe(child.stdin); + } + + if (code === 127) { + utils.log.error( + 'failed to start process, "' + cmd.executable + '" exec not found' + ); + bus.emit('error', code); + process.exit(); + } + + // If the command failed with code 2, it may or may not be a syntax error + // See: http://git.io/fNOAR + // We will only assume a parse error, if the child failed quickly + if (code === 2 && Date.now() < config.lastStarted + 500) { + utils.log.error('process failed, unhandled exit code (2)'); + utils.log.error(''); + utils.log.error('Either the command has a syntax error,'); + utils.log.error('or it is exiting with reserved code 2.'); + utils.log.error(''); + utils.log.error('To keep nodemon running even after a code 2,'); + utils.log.error('add this to the end of your command: || exit 1'); + utils.log.error(''); + utils.log.error('Read more here: https://git.io/fNOAG'); + utils.log.error(''); + utils.log.error('nodemon will stop now so that you can fix the command.'); + utils.log.error(''); + bus.emit('error', code); + process.exit(); + } + + // In case we killed the app ourselves, set the 
signal thusly + if (killedAfterChange) { + killedAfterChange = false; + signal = config.signal; + } + // this is nasty, but it gives it windows support + if (utils.isWindows && signal === 'SIGTERM') { + signal = config.signal; + } + + if (signal === config.signal || code === 0) { + // this was a clean exit, so emit exit, rather than crash + debug('bus.emit(exit) via ' + config.signal); + bus.emit('exit', signal); + + // exit the monitor, but do it gracefully + if (signal === config.signal) { + return restart(); + } + + if (code === 0) { + // clean exit - wait until file change to restart + if (runCmd) { + utils.log.status('clean exit - waiting for changes before restart'); + } + child = null; + } + } else { + bus.emit('crash'); + if (options.exitcrash) { + utils.log.fail('app crashed'); + if (!config.required) { + process.exit(1); + } + } else { + utils.log.fail( + 'app crashed - waiting for file changes before' + ' starting...' + ); + child = null; + } + } + + if (config.options.restartable) { + // stdin needs to kick in again to be able to listen to the + // restart command + process.stdin.resume(); + } + }); + + // moved the run.kill outside to handle both the cases + // intial start + // no start + + // connect stdin to the child process (options.stdin is on by default) + if (options.stdin) { + process.stdin.resume(); + // FIXME decide whether or not we need to decide the encoding + // process.stdin.setEncoding('utf8'); + + // swallow the stdin error if it happens + // ref: https://github.com/remy/nodemon/issues/1195 + if (hasStdio) { + child.stdin.on('error', () => {}); + process.stdin.pipe(child.stdin); + } else { + if (child.stdout) { + child.stdout.pipe(process.stdout); + } else { + utils.log.error( + 'running an unsupported version of node ' + process.version + ); + utils.log.error( + 'nodemon may not work as expected - ' + + 'please consider upgrading to LTS' + ); + } + } + + bus.once('exit', function () { + if (child && process.stdin.unpipe) { + // node > 0.8 + process.stdin.unpipe(child.stdin); + } + }); + } + + debug('start watch on: %s', config.options.watch); + if (config.options.watch !== false) { + watch(); + } +} + +function waitForSubProcesses(pid, callback) { + debug('checking ps tree for pids of ' + pid); + psTree(pid, (err, pids) => { + if (!pids.length) { + return callback(); + } + + utils.log.status( + `still waiting for ${pids.length} sub-process${ + pids.length > 2 ? 'es' : '' + } to finish...` + ); + setTimeout(() => waitForSubProcesses(pid, callback), 1000); + }); +} + +function kill(child, signal, callback) { + if (!callback) { + callback = noop; + } + + if (utils.isWindows) { + const taskKill = () => { + try { + exec('taskkill /pid ' + child.pid + ' /T /F'); + } catch (e) { + utils.log.error('Could not shutdown sub process cleanly'); + } + }; + + // We are handling a 'SIGKILL' POSIX signal under Windows the + // same way it is handled on a UNIX system: We are performing + // a hard shutdown without waiting for the process to clean-up. + if (signal === 'SIGKILL' || osRelease < 10) { + debug('terminating process group by force: %s', child.pid); + + // We are using the taskkill utility to terminate the whole + // process group ('/t') of the child ('/pid') by force ('/f'). + // We need to end all sub processes, because the 'child' + // process in this context is actually a cmd.exe wrapper. 
+ taskKill(); + callback(); + return; + } + + try { + // We are using the Windows Management Instrumentation Command-line + // (wmic.exe) to resolve the sub-child process identifier, because the + // 'child' process in this context is actually a cmd.exe wrapper. + // We want to send the termination signal directly to the node process. + // The '2> nul' silences the no process found error message. + const resultBuffer = execSync( + `wmic process where (ParentProcessId=${child.pid}) get ProcessId 2> nul` + ); + const result = resultBuffer.toString().match(/^[0-9]+/m); + + // If there is no sub-child process we fall back to the child process. + const processId = Array.isArray(result) ? result[0] : child.pid; + + debug('sending kill signal SIGINT to process: %s', processId); + + // We are using the standalone 'windows-kill' executable to send the + // standard POSIX signal 'SIGINT' to the node process. This fixes #1720. + const windowsKill = path.normalize( + `${__dirname}/../../bin/windows-kill.exe` + ); + + // We have to detach the 'windows-kill' execution completely from this + // process group to avoid terminating the nodemon process itself. + // See: https://github.com/alirdn/windows-kill#how-it-works--limitations + // + // Therefore we are using 'start' to create a new cmd.exe context. + // The '/min' option hides the new terminal window and the '/wait' + // option lets the process wait for the command to finish. + + execSync( + `start "windows-kill" /min /wait "${windowsKill}" -SIGINT ${processId}` + ); + } catch (e) { + taskKill(); + } + callback(); + } else { + // we use psTree to kill the full subtree of nodemon, because when + // spawning processes like `coffee` under the `--debug` flag, it'll spawn + // it's own child, and that can't be killed by nodemon, so psTree gives us + // an array of PIDs that have spawned under nodemon, and we send each the + // configured signal (default: SIGUSR2) signal, which fixes #335 + // note that psTree also works if `ps` is missing by looking in /proc + let sig = signal.replace('SIG', ''); + + psTree(child.pid, function (err, pids) { + // if ps isn't native to the OS, then we need to send the numeric value + // for the signal during the kill, `signals` is a lookup table for that. + if (!psTree.hasPS) { + sig = signals[signal]; + } + + // the sub processes need to be killed from smallest to largest + debug('sending kill signal to ' + pids.join(', ')); + + child.kill(signal); + + pids.sort().forEach((pid) => exec(`kill -${sig} ${pid}`, noop)); + + waitForSubProcesses(child.pid, () => { + // finally kill the main user process + exec(`kill -${sig} ${child.pid}`, callback); + }); + }); + } +} + +run.kill = function (noRestart, callback) { + // I hate code like this :( - Remy (author of said code) + if (typeof noRestart === 'function') { + callback = noRestart; + noRestart = false; + } + + if (!callback) { + callback = noop; + } + + if (child !== null) { + // if the stdin piping is on, we need to unpipe, but also close stdin on + // the child, otherwise linux can throw EPIPE or ECONNRESET errors. + if (run.options.stdin) { + process.stdin.unpipe(child.stdin); + } + + // For the on('exit', ...) 
handler above the following looks like a + // crash, so we set the killedAfterChange flag if a restart is planned + if (!noRestart) { + killedAfterChange = true; + } + + /* Now kill the entire subtree of processes belonging to nodemon */ + var oldPid = child.pid; + if (child) { + kill(child, config.signal, function () { + // this seems to fix the 0.11.x issue with the "rs" restart command, + // though I'm unsure why. it seems like more data is streamed in to + // stdin after we close. + if (child && run.options.stdin && child.stdin && oldPid === child.pid) { + child.stdin.end(); + } + callback(); + }); + } + } else if (!noRestart) { + // if there's no child, then we need to manually start the process + // this is because as there was no child, the child.on('exit') event + // handler doesn't exist which would normally trigger the restart. + bus.once('start', callback); + run.restart(); + } else { + callback(); + } +}; + +run.restart = noop; + +bus.on('quit', function onQuit(code) { + if (code === undefined) { + code = 0; + } + + // remove event listener + var exitTimer = null; + var exit = function () { + clearTimeout(exitTimer); + exit = noop; // null out in case of race condition + child = null; + if (!config.required) { + // Execute all other quit listeners. + bus.listeners('quit').forEach(function (listener) { + if (listener !== onQuit) { + listener(); + } + }); + process.exit(code); + } else { + bus.emit('exit'); + } + }; + + // if we're not running already, don't bother with trying to kill + if (config.run === false) { + return exit(); + } + + // immediately try to stop any polling + config.run = false; + + if (child) { + // give up waiting for the kids after 10 seconds + exitTimer = setTimeout(exit, 10 * 1000); + child.removeAllListeners('exit'); + child.once('exit', exit); + + kill(child, 'SIGINT'); + } else { + exit(); + } +}); + +bus.on('restart', function () { + // run.kill will send a SIGINT to the child process, which will cause it + // to terminate, which in turn uses the 'exit' event handler to restart + run.kill(); +}); + +// remove the child file on exit +process.on('exit', function () { + utils.log.detail('exiting'); + if (child) { + child.kill(); + } +}); + +// because windows borks when listening for the SIG* events +if (!utils.isWindows) { + bus.once('boot', () => { + // usual suspect: ctrl+c exit + process.once('SIGINT', () => bus.emit('quit', 130)); + process.once('SIGTERM', () => { + bus.emit('quit', 143); + if (child) { + child.kill('SIGTERM'); + } + }); + }); +} + +module.exports = run; diff --git a/node_modules/nodemon/lib/monitor/signals.js b/node_modules/nodemon/lib/monitor/signals.js new file mode 100644 index 000000000..daff6e053 --- /dev/null +++ b/node_modules/nodemon/lib/monitor/signals.js @@ -0,0 +1,34 @@ +module.exports = { + SIGHUP: 1, + SIGINT: 2, + SIGQUIT: 3, + SIGILL: 4, + SIGTRAP: 5, + SIGABRT: 6, + SIGBUS: 7, + SIGFPE: 8, + SIGKILL: 9, + SIGUSR1: 10, + SIGSEGV: 11, + SIGUSR2: 12, + SIGPIPE: 13, + SIGALRM: 14, + SIGTERM: 15, + SIGSTKFLT: 16, + SIGCHLD: 17, + SIGCONT: 18, + SIGSTOP: 19, + SIGTSTP: 20, + SIGTTIN: 21, + SIGTTOU: 22, + SIGURG: 23, + SIGXCPU: 24, + SIGXFSZ: 25, + SIGVTALRM: 26, + SIGPROF: 27, + SIGWINCH: 28, + SIGIO: 29, + SIGPWR: 30, + SIGSYS: 31, + SIGRTMIN: 35, +} diff --git a/node_modules/nodemon/lib/monitor/watch.js b/node_modules/nodemon/lib/monitor/watch.js new file mode 100644 index 000000000..1ef14086d --- /dev/null +++ b/node_modules/nodemon/lib/monitor/watch.js @@ -0,0 +1,239 @@ +module.exports.watch = watch; 
+module.exports.resetWatchers = resetWatchers; + +var debug = require('debug')('nodemon:watch'); +var debugRoot = require('debug')('nodemon'); +var chokidar = require('chokidar'); +var undefsafe = require('undefsafe'); +var config = require('../config'); +var path = require('path'); +var utils = require('../utils'); +var bus = utils.bus; +var match = require('./match'); +var watchers = []; +var debouncedBus; + +bus.on('reset', resetWatchers); + +function resetWatchers() { + debugRoot('resetting watchers'); + watchers.forEach(function (watcher) { + watcher.close(); + }); + watchers = []; +} + +function watch() { + if (watchers.length) { + debug('early exit on watch, still watching (%s)', watchers.length); + return; + } + + var dirs = [].slice.call(config.dirs); + + debugRoot('start watch on: %s', dirs.join(', ')); + const rootIgnored = config.options.ignore; + debugRoot('ignored', rootIgnored); + + var watchedFiles = []; + + const promise = new Promise(function (resolve) { + const dotFilePattern = /[/\\]\./; + var ignored = match.rulesToMonitor( + [], // not needed + Array.from(rootIgnored), + config + ).map(pattern => pattern.slice(1)); + + const addDotFile = dirs.filter(dir => dir.match(dotFilePattern)); + + // don't ignore dotfiles if explicitly watched. + if (addDotFile.length === 0) { + ignored.push(dotFilePattern); + } + + var watchOptions = { + ignorePermissionErrors: true, + ignored: ignored, + persistent: true, + usePolling: config.options.legacyWatch || false, + interval: config.options.pollingInterval, + // note to future developer: I've gone back and forth on adding `cwd` + // to the props and in some cases it fixes bugs but typically it causes + // bugs elsewhere (since nodemon is used is so many ways). the final + // decision is to *not* use it at all and work around it + // cwd: ... + }; + + if (utils.isWindows) { + watchOptions.disableGlobbing = true; + } + + if (process.env.TEST) { + watchOptions.useFsEvents = false; + } + + var watcher = chokidar.watch( + dirs, + Object.assign({}, watchOptions, config.options.watchOptions || {}) + ); + + watcher.ready = false; + + var total = 0; + + watcher.on('change', filterAndRestart); + watcher.on('add', function (file) { + if (watcher.ready) { + return filterAndRestart(file); + } + + watchedFiles.push(file); + bus.emit('watching', file); + debug('chokidar watching: %s', file); + }); + watcher.on('ready', function () { + watchedFiles = Array.from(new Set(watchedFiles)); // ensure no dupes + total = watchedFiles.length; + watcher.ready = true; + resolve(total); + debugRoot('watch is complete'); + }); + + watcher.on('error', function (error) { + if (error.code === 'EINVAL') { + utils.log.error( + 'Internal watch failed. Likely cause: too many ' + + 'files being watched (perhaps from the root of a drive?\n' + + 'See https://github.com/paulmillr/chokidar/issues/229 for details' + ); + } else { + utils.log.error('Internal watch failed: ' + error.message); + process.exit(1); + } + }); + + watchers.push(watcher); + }); + + return promise.catch(e => { + // this is a core error and it should break nodemon - so I have to break + // out of a promise using the setTimeout + setTimeout(() => { + throw e; + }); + }).then(function () { + utils.log.detail(`watching ${watchedFiles.length} file${ + watchedFiles.length === 1 ? 
'' : 's'}`); + return watchedFiles; + }); +} + +function filterAndRestart(files) { + if (!Array.isArray(files)) { + files = [files]; + } + + if (files.length) { + var cwd = process.cwd(); + if (this.options && this.options.cwd) { + cwd = this.options.cwd; + } + + utils.log.detail( + 'files triggering change check: ' + + files + .map(file => { + const res = path.relative(cwd, file); + return res; + }) + .join(', ') + ); + + // make sure the path is right and drop an empty + // filenames (sometimes on windows) + files = files.filter(Boolean).map(file => { + return path.relative(process.cwd(), path.relative(cwd, file)); + }); + + if (utils.isWindows) { + // ensure the drive letter is in uppercase (c:\foo -> C:\foo) + files = files.map(f => { + if (f.indexOf(':') === -1) { return f; } + return f[0].toUpperCase() + f.slice(1); + }); + } + + + debug('filterAndRestart on', files); + + var matched = match( + files, + config.options.monitor, + undefsafe(config, 'options.execOptions.ext') + ); + + debug('matched?', JSON.stringify(matched)); + + // if there's no matches, then test to see if the changed file is the + // running script, if so, let's allow a restart + if (config.options.execOptions && config.options.execOptions.script) { + const script = path.resolve(config.options.execOptions.script); + if (matched.result.length === 0 && script) { + const length = script.length; + files.find(file => { + if (file.substr(-length, length) === script) { + matched = { + result: [file], + total: 1, + }; + return true; + } + }); + } + } + + utils.log.detail( + 'changes after filters (before/after): ' + + [files.length, matched.result.length].join('/') + ); + + // reset the last check so we're only looking at recently modified files + config.lastStarted = Date.now(); + + if (matched.result.length) { + if (config.options.delay > 0) { + utils.log.detail('delaying restart for ' + config.options.delay + 'ms'); + if (debouncedBus === undefined) { + debouncedBus = debounce(restartBus, config.options.delay); + } + debouncedBus(matched); + } else { + return restartBus(matched); + } + } + } +} + +function restartBus(matched) { + utils.log.status('restarting due to changes...'); + matched.result.map(file => { + utils.log.detail(path.relative(process.cwd(), file)); + }); + + if (config.options.verbose) { + utils.log._log(''); + } + + bus.emit('restart', matched.result); +} + +function debounce(fn, delay) { + var timer = null; + return function () { + const context = this; + const args = arguments; + clearTimeout(timer); + timer = setTimeout(() =>fn.apply(context, args), delay); + }; +} diff --git a/node_modules/nodemon/lib/nodemon.js b/node_modules/nodemon/lib/nodemon.js new file mode 100644 index 000000000..ce649cb6d --- /dev/null +++ b/node_modules/nodemon/lib/nodemon.js @@ -0,0 +1,311 @@ +var debug = require('debug')('nodemon'); +var path = require('path'); +var monitor = require('./monitor'); +var cli = require('./cli'); +var version = require('./version'); +var util = require('util'); +var utils = require('./utils'); +var bus = utils.bus; +var help = require('./help'); +var config = require('./config'); +var spawn = require('./spawn'); +const defaults = require('./config/defaults') +var eventHandlers = {}; + +// this is fairly dirty, but theoretically sound since it's part of the +// stable module API +config.required = utils.isRequired; + +function nodemon(settings) { + bus.emit('boot'); + nodemon.reset(); + + // allow the cli string as the argument to nodemon, and allow for + // `node nodemon -V app.js` or just 
`-V app.js` + if (typeof settings === 'string') { + settings = settings.trim(); + if (settings.indexOf('node') !== 0) { + if (settings.indexOf('nodemon') !== 0) { + settings = 'nodemon ' + settings; + } + settings = 'node ' + settings; + } + settings = cli.parse(settings); + } + + // set the debug flag as early as possible to get all the detailed logging + if (settings.verbose) { + utils.debug = true; + } + + if (settings.help) { + if (process.stdout.isTTY) { + process.stdout._handle.setBlocking(true); // nodejs/node#6456 + } + console.log(help(settings.help)); + if (!config.required) { + process.exit(0); + } + } + + if (settings.version) { + version().then(function (v) { + console.log(v); + if (!config.required) { + process.exit(0); + } + }); + return; + } + + // nodemon tools like grunt-nodemon. This affects where + // the script is being run from, and will affect where + // nodemon looks for the nodemon.json files + if (settings.cwd) { + // this is protection to make sure we haven't dont the chdir already... + // say like in cli/parse.js (which is where we do this once already!) + if (process.cwd() !== path.resolve(config.system.cwd, settings.cwd)) { + process.chdir(settings.cwd); + } + } + + const cwd = process.cwd(); + + config.load(settings, function (config) { + if (!config.options.dump && !config.options.execOptions.script && + config.options.execOptions.exec === 'node') { + if (!config.required) { + console.log(help('usage')); + process.exit(); + } + return; + } + + // before we print anything, update the colour setting on logging + utils.colours = config.options.colours; + + // always echo out the current version + utils.log.info(version.pinned); + + const cwd = process.cwd(); + + if (config.options.cwd) { + utils.log.detail('process root: ' + cwd); + } + + config.loaded.map(file => file.replace(cwd, '.')).forEach(file => { + utils.log.detail('reading config ' + file); + }); + + if (config.options.stdin && config.options.restartable) { + // allow nodemon to restart when the use types 'rs\n' + process.stdin.resume(); + process.stdin.setEncoding('utf8'); + process.stdin.on('data', data => { + const str = data.toString().trim().toLowerCase(); + + // if the keys entered match the restartable value, then restart! + if (str === config.options.restartable) { + bus.emit('restart'); + } else if (data.charCodeAt(0) === 12) { // ctrl+l + console.clear(); + } + }); + } else if (config.options.stdin) { + // so let's make sure we don't eat the key presses + // but also, since we're wrapping, watch out for + // special keys, like ctrl+c x 2 or '.exit' or ctrl+d or ctrl+l + var ctrlC = false; + var buffer = ''; + + process.stdin.on('data', function (data) { + data = data.toString(); + buffer += data; + const chr = data.charCodeAt(0); + + // if restartable, echo back + if (chr === 3) { + if (ctrlC) { + process.exit(0); + } + + ctrlC = true; + return; + } else if (buffer === '.exit' || chr === 4) { // ctrl+d + process.exit(); + } else if (chr === 13 || chr === 10) { // enter / carriage return + buffer = ''; + } else if (chr === 12) { // ctrl+l + console.clear(); + buffer = ''; + } + ctrlC = false; + }); + if (process.stdin.setRawMode) { + process.stdin.setRawMode(true); + } + } + + if (config.options.restartable) { + utils.log.info('to restart at any time, enter `' + + config.options.restartable + '`'); + } + + if (!config.required) { + const restartSignal = config.options.signal === 'SIGUSR2' ? 
'SIGHUP' : 'SIGUSR2'; + process.on(restartSignal, nodemon.restart); + utils.bus.on('error', () => { + utils.log.fail((new Error().stack)); + }); + utils.log.detail((config.options.restartable ? 'or ' : '') + 'send ' + + restartSignal + ' to ' + process.pid + ' to restart'); + } + + const ignoring = config.options.monitor.map(function (rule) { + if (rule.slice(0, 1) !== '!') { + return false; + } + + rule = rule.slice(1); + + // don't notify of default ignores + if (defaults.ignoreRoot.indexOf(rule) !== -1) { + return false; + return rule.slice(3).slice(0, -3); + } + + if (rule.startsWith(cwd)) { + return rule.replace(cwd, '.'); + } + + return rule; + }).filter(Boolean).join(' '); + if (ignoring) utils.log.detail('ignoring: ' + ignoring); + + utils.log.info('watching path(s): ' + config.options.monitor.map(function (rule) { + if (rule.slice(0, 1) !== '!') { + try { + rule = path.relative(process.cwd(), rule); + } catch (e) {} + + return rule; + } + + return false; + }).filter(Boolean).join(' ')); + + utils.log.info('watching extensions: ' + (config.options.execOptions.ext || '(all)')); + + if (config.options.dump) { + utils.log._log('log', '--------------'); + utils.log._log('log', 'node: ' + process.version); + utils.log._log('log', 'nodemon: ' + version.pinned); + utils.log._log('log', 'command: ' + process.argv.join(' ')); + utils.log._log('log', 'cwd: ' + cwd); + utils.log._log('log', ['OS:', process.platform, process.arch].join(' ')); + utils.log._log('log', '--------------'); + utils.log._log('log', util.inspect(config, { depth: null })); + utils.log._log('log', '--------------'); + if (!config.required) { + process.exit(); + } + + return; + } + + config.run = true; + + if (config.options.stdout === false) { + nodemon.on('start', function () { + nodemon.stdout = bus.stdout; + nodemon.stderr = bus.stderr; + + bus.emit('readable'); + }); + } + + if (config.options.events && Object.keys(config.options.events).length) { + Object.keys(config.options.events).forEach(function (key) { + utils.log.detail('bind ' + key + ' -> `' + + config.options.events[key] + '`'); + nodemon.on(key, function () { + if (config.options && config.options.events) { + spawn(config.options.events[key], config, + [].slice.apply(arguments)); + } + }); + }); + } + + monitor.run(config.options); + + }); + + return nodemon; +} + +nodemon.restart = function () { + utils.log.status('restarting child process'); + bus.emit('restart'); + return nodemon; +}; + +nodemon.addListener = nodemon.on = function (event, handler) { + if (!eventHandlers[event]) { eventHandlers[event] = []; } + eventHandlers[event].push(handler); + bus.on(event, handler); + return nodemon; +}; + +nodemon.once = function (event, handler) { + if (!eventHandlers[event]) { eventHandlers[event] = []; } + eventHandlers[event].push(handler); + bus.once(event, function () { + debug('bus.once(%s)', event); + eventHandlers[event].splice(eventHandlers[event].indexOf(handler), 1); + handler.apply(this, arguments); + }); + return nodemon; +}; + +nodemon.emit = function () { + bus.emit.apply(bus, [].slice.call(arguments)); + return nodemon; +}; + +nodemon.removeAllListeners = function (event) { + // unbind only the `nodemon.on` event handlers + Object.keys(eventHandlers).filter(function (e) { + return event ? 
e === event : true; + }).forEach(function (event) { + eventHandlers[event].forEach(function (handler) { + bus.removeListener(event, handler); + eventHandlers[event].splice(eventHandlers[event].indexOf(handler), 1); + }); + }); + + return nodemon; +}; + +nodemon.reset = function (done) { + bus.emit('reset', done); +}; + +bus.on('reset', function (done) { + debug('reset'); + nodemon.removeAllListeners(); + monitor.run.kill(true, function () { + utils.reset(); + config.reset(); + config.run = false; + if (done) { + done(); + } + }); +}); + +// expose the full config +nodemon.config = config; + +module.exports = nodemon; + diff --git a/node_modules/nodemon/lib/rules/add.js b/node_modules/nodemon/lib/rules/add.js new file mode 100644 index 000000000..de85bb7fd --- /dev/null +++ b/node_modules/nodemon/lib/rules/add.js @@ -0,0 +1,89 @@ +'use strict'; + +var utils = require('../utils'); + +// internal +var reEscComments = /\\#/g; +// note that '^^' is used in place of escaped comments +var reUnescapeComments = /\^\^/g; +var reComments = /#.*$/; +var reEscapeChars = /[.|\-[\]()\\]/g; +var reAsterisk = /\*/g; + +module.exports = add; + +/** + * Converts file patterns or regular expressions to nodemon + * compatible RegExp matching rules. Note: the `rules` argument + * object is modified to include the new rule and new RegExp + * + * ### Example: + * + * var rules = { watch: [], ignore: [] }; + * add(rules, 'watch', '*.js'); + * add(rules, 'ignore', '/public/'); + * add(rules, 'watch', ':(\d)*\.js'); // note: string based regexp + * add(rules, 'watch', /\d*\.js/); + * + * @param {Object} rules containing `watch` and `ignore`. Also updated during + * execution + * @param {String} which must be either "watch" or "ignore" + * @param {String|RegExp} the actual rule. + */ +function add(rules, which, rule) { + if (!{ ignore: 1, watch: 1}[which]) { + throw new Error('rules/index.js#add requires "ignore" or "watch" as the ' + + 'first argument'); + } + + if (Array.isArray(rule)) { + rule.forEach(function (rule) { + add(rules, which, rule); + }); + return; + } + + // support the rule being a RegExp, but reformat it to + // the custom : format that we're working with. 
+ if (rule instanceof RegExp) { + // rule = ':' + rule.toString().replace(/^\/(.*?)\/$/g, '$1'); + utils.log.error('RegExp format no longer supported, but globs are.'); + return; + } + + // remove comments and trim lines + // this mess of replace methods is escaping "\#" to allow for emacs temp files + + // first up strip comments and remove blank head or tails + rule = (rule || '').replace(reEscComments, '^^') + .replace(reComments, '') + .replace(reUnescapeComments, '#').trim(); + + var regexp = false; + + if (typeof rule === 'string' && rule.substring(0, 1) === ':') { + rule = rule.substring(1); + utils.log.error('RegExp no longer supported: ' + rule); + regexp = true; + } else if (rule.length === 0) { + // blank line (or it was a comment) + return; + } + + if (regexp) { + // rules[which].push(rule); + } else { + // rule = rule.replace(reEscapeChars, '\\$&') + // .replace(reAsterisk, '.*'); + + rules[which].push(rule); + // compile a regexp of all the rules for this ignore or watch + var re = rules[which].map(function (rule) { + return rule.replace(reEscapeChars, '\\$&') + .replace(reAsterisk, '.*'); + }).join('|'); + + // used for the directory matching + rules[which].re = new RegExp(re); + } +} diff --git a/node_modules/nodemon/lib/rules/index.js b/node_modules/nodemon/lib/rules/index.js new file mode 100644 index 000000000..04aa92f87 --- /dev/null +++ b/node_modules/nodemon/lib/rules/index.js @@ -0,0 +1,53 @@ +'use strict'; +var utils = require('../utils'); +var add = require('./add'); +var parse = require('./parse'); + +// exported +var rules = { ignore: [], watch: [] }; + +/** + * Loads a nodemon config file and populates the ignore + * and watch rules with it's contents, and calls callback + * with the new rules + * + * @param {String} filename + * @param {Function} callback + */ +function load(filename, callback) { + parse(filename, function (err, result) { + if (err) { + // we should have bombed already, but + utils.log.error(err); + callback(err); + } + + if (result.raw) { + result.raw.forEach(add.bind(null, rules, 'ignore')); + } else { + result.ignore.forEach(add.bind(null, rules, 'ignore')); + result.watch.forEach(add.bind(null, rules, 'watch')); + } + + callback(null, rules); + }); +} + +module.exports = { + reset: function () { // just used for testing + rules.ignore.length = rules.watch.length = 0; + delete rules.ignore.re; + delete rules.watch.re; + }, + load: load, + ignore: { + test: add.bind(null, rules, 'ignore'), + add: add.bind(null, rules, 'ignore'), + }, + watch: { + test: add.bind(null, rules, 'watch'), + add: add.bind(null, rules, 'watch'), + }, + add: add.bind(null, rules), + rules: rules, +}; \ No newline at end of file diff --git a/node_modules/nodemon/lib/rules/parse.js b/node_modules/nodemon/lib/rules/parse.js new file mode 100644 index 000000000..6e1caceaa --- /dev/null +++ b/node_modules/nodemon/lib/rules/parse.js @@ -0,0 +1,43 @@ +'use strict'; +var fs = require('fs'); + +/** + * Parse the nodemon config file, supporting both old style + * plain text config file, and JSON version of the config + * + * @param {String} filename + * @param {Function} callback + */ +function parse(filename, callback) { + var rules = { + ignore: [], + watch: [], + }; + + fs.readFile(filename, 'utf8', function (err, content) { + + if (err) { + return callback(err); + } + + var json = null; + try { + json = JSON.parse(content); + } catch (e) {} + + if (json !== null) { + rules = { + ignore: json.ignore || [], + watch: json.watch || [], + }; + + return callback(null, rules); + 
} + + // otherwise return the raw file + return callback(null, { raw: content.split(/\n/) }); + }); +} + +module.exports = parse; + diff --git a/node_modules/nodemon/lib/spawn.js b/node_modules/nodemon/lib/spawn.js new file mode 100644 index 000000000..9aa4f80eb --- /dev/null +++ b/node_modules/nodemon/lib/spawn.js @@ -0,0 +1,73 @@ +const path = require('path'); +const utils = require('./utils'); +const merge = utils.merge; +const bus = utils.bus; +const spawn = require('child_process').spawn; + +module.exports = function spawnCommand(command, config, eventArgs) { + var stdio = ['pipe', 'pipe', 'pipe']; + + if (config.options.stdout) { + stdio = ['pipe', process.stdout, process.stderr]; + } + + const env = merge(process.env, { FILENAME: eventArgs[0] }); + + var sh = 'sh'; + var shFlag = '-c'; + var spawnOptions = { + env: merge(config.options.execOptions.env, env), + stdio: stdio, + }; + + if (!Array.isArray(command)) { + command = [command]; + } + + if (utils.isWindows) { + // if the exec includes a forward slash, reverse it for windows compat + // but *only* apply to the first command, and none of the arguments. + // ref #1251 and #1236 + command = command.map(executable => { + if (executable.indexOf('/') === -1) { + return executable; + } + + return executable.split(' ').map((e, i) => { + if (i === 0) { + return path.normalize(e); + } + return e; + }).join(' '); + }); + // taken from npm's cli: https://git.io/vNFD4 + sh = process.env.comspec || 'cmd'; + shFlag = '/d /s /c'; + spawnOptions.windowsVerbatimArguments = true; + } + + const args = command.join(' '); + const child = spawn(sh, [shFlag, args], spawnOptions); + + if (config.required) { + var emit = { + stdout: function (data) { + bus.emit('stdout', data); + }, + stderr: function (data) { + bus.emit('stderr', data); + }, + }; + + // now work out what to bind to... 
+ if (config.options.stdout) { + child.on('stdout', emit.stdout).on('stderr', emit.stderr); + } else { + child.stdout.on('data', emit.stdout); + child.stderr.on('data', emit.stderr); + + bus.stdout = child.stdout; + bus.stderr = child.stderr; + } + } +}; diff --git a/node_modules/nodemon/lib/utils/bus.js b/node_modules/nodemon/lib/utils/bus.js new file mode 100644 index 000000000..4e120c58d --- /dev/null +++ b/node_modules/nodemon/lib/utils/bus.js @@ -0,0 +1,44 @@ +var events = require('events'); +var debug = require('debug')('nodemon'); +var util = require('util'); + +var Bus = function () { + events.EventEmitter.call(this); +}; + +util.inherits(Bus, events.EventEmitter); + +var bus = new Bus(); + +// /* +var collected = {}; +bus.on('newListener', function (event) { + debug('bus new listener: %s (%s)', event, bus.listeners(event).length); + if (!collected[event]) { + collected[event] = true; + bus.on(event, function () { + debug('bus emit: %s', event); + }); + } +}); + +// */ + +// proxy process messages (if forked) to the bus +process.on('message', function (event) { + debug('process.message(%s)', event); + bus.emit(event); +}); + +var emit = bus.emit; + +// if nodemon was spawned via a fork, allow upstream communication +// via process.send +if (process.send) { + bus.emit = function (event, data) { + process.send({ type: event, data: data }); + emit.apply(bus, arguments); + }; +} + +module.exports = bus; diff --git a/node_modules/nodemon/lib/utils/clone.js b/node_modules/nodemon/lib/utils/clone.js new file mode 100644 index 000000000..6ba6330f3 --- /dev/null +++ b/node_modules/nodemon/lib/utils/clone.js @@ -0,0 +1,40 @@ +module.exports = clone; + +// via http://stackoverflow.com/a/728694/22617 +function clone(obj) { + // Handle the 3 simple types, and null or undefined + if (null === obj || 'object' !== typeof obj) { + return obj; + } + + var copy; + + // Handle Date + if (obj instanceof Date) { + copy = new Date(); + copy.setTime(obj.getTime()); + return copy; + } + + // Handle Array + if (obj instanceof Array) { + copy = []; + for (var i = 0, len = obj.length; i < len; i++) { + copy[i] = clone(obj[i]); + } + return copy; + } + + // Handle Object + if (obj instanceof Object) { + copy = {}; + for (var attr in obj) { + if (obj.hasOwnProperty && obj.hasOwnProperty(attr)) { + copy[attr] = clone(obj[attr]); + } + } + return copy; + } + + throw new Error('Unable to copy obj! 
Its type isn\'t supported.'); +} \ No newline at end of file diff --git a/node_modules/nodemon/lib/utils/colour.js b/node_modules/nodemon/lib/utils/colour.js new file mode 100644 index 000000000..8c1b59054 --- /dev/null +++ b/node_modules/nodemon/lib/utils/colour.js @@ -0,0 +1,26 @@ +/** + * Encodes a string in a colour: red, yellow or green + * @param {String} c colour to highlight in + * @param {String} str the string to encode + * @return {String} coloured string for terminal printing + */ +function colour(c, str) { + return (colour[c] || colour.black) + str + colour.black; +} + +function strip(str) { + re.lastIndex = 0; // reset position + return str.replace(re, ''); +} + +colour.red = '\x1B[31m'; +colour.yellow = '\x1B[33m'; +colour.green = '\x1B[32m'; +colour.black = '\x1B[39m'; + +var reStr = Object.keys(colour).map(key => colour[key]).join('|'); +var re = new RegExp(('(' + reStr + ')').replace(/\[/g, '\\['), 'g'); + +colour.strip = strip; + +module.exports = colour; diff --git a/node_modules/nodemon/lib/utils/index.js b/node_modules/nodemon/lib/utils/index.js new file mode 100644 index 000000000..c4803383f --- /dev/null +++ b/node_modules/nodemon/lib/utils/index.js @@ -0,0 +1,102 @@ +var noop = function () { }; +var path = require('path'); +const semver = require('semver'); +var version = process.versions.node.split('.') || [null, null, null]; + +var utils = (module.exports = { + semver: semver, + satisfies: test => semver.satisfies(process.versions.node, test), + version: { + major: parseInt(version[0] || 0, 10), + minor: parseInt(version[1] || 0, 10), + patch: parseInt(version[2] || 0, 10), + }, + clone: require('./clone'), + merge: require('./merge'), + bus: require('./bus'), + isWindows: process.platform === 'win32', + isMac: process.platform === 'darwin', + isLinux: process.platform === 'linux', + isRequired: (function () { + var p = module.parent; + while (p) { + // in electron.js engine it happens + if (!p.filename) { + return true; + } + if (p.filename.indexOf('bin' + path.sep + 'nodemon.js') !== -1) { + return false; + } + p = p.parent; + } + + return true; + })(), + home: process.env.HOME || process.env.HOMEPATH, + quiet: function () { + // nukes the logging + if (!this.debug) { + for (var method in utils.log) { + if (typeof utils.log[method] === 'function') { + utils.log[method] = noop; + } + } + } + }, + reset: function () { + if (!this.debug) { + for (var method in utils.log) { + if (typeof utils.log[method] === 'function') { + delete utils.log[method]; + } + } + } + this.debug = false; + }, + regexpToText: function (t) { + return t + .replace(/\.\*\\./g, '*.') + .replace(/\\{2}/g, '^^') + .replace(/\\/g, '') + .replace(/\^\^/g, '\\'); + }, + stringify: function (exec, args) { + // serializes an executable string and array of arguments into a string + args = args || []; + + return [exec] + .concat( + args.map(function (arg) { + // if an argument contains a space, we want to show it with quotes + // around it to indicate that it is a single argument + if (arg.length > 0 && arg.indexOf(' ') === -1) { + return arg; + } + // this should correctly escape nested quotes + return JSON.stringify(arg); + }) + ) + .join(' ') + .trim(); + }, +}); + +utils.log = require('./log')(utils.isRequired); + +Object.defineProperty(utils, 'debug', { + set: function (value) { + this.log.debug = value; + }, + get: function () { + return this.log.debug; + }, +}); + +Object.defineProperty(utils, 'colours', { + set: function (value) { + this.log.useColours = value; + }, + get: function () { + 
return this.log.useColours; + }, +}); diff --git a/node_modules/nodemon/lib/utils/log.js b/node_modules/nodemon/lib/utils/log.js new file mode 100644 index 000000000..65800872d --- /dev/null +++ b/node_modules/nodemon/lib/utils/log.js @@ -0,0 +1,82 @@ +var colour = require('./colour'); +var bus = require('./bus'); +var required = false; +var useColours = true; + +var coding = { + log: 'black', + info: 'yellow', + status: 'green', + detail: 'yellow', + fail: 'red', + error: 'red', +}; + +function log(type, text) { + var msg = '[nodemon] ' + (text || ''); + + if (useColours) { + msg = colour(coding[type], msg); + } + + // always push the message through our bus, using nextTick + // to help testing and get _out of_ promises. + process.nextTick(() => { + bus.emit('log', { type: type, message: text, colour: msg }); + }); + + // but if we're running on the command line, also echo out + // question: should we actually just consume our own events? + if (!required) { + if (type === 'error') { + console.error(msg); + } else { + console.log(msg || ''); + } + } +} + +var Logger = function (r) { + if (!(this instanceof Logger)) { + return new Logger(r); + } + this.required(r); + return this; +}; + +Object.keys(coding).forEach(function (type) { + Logger.prototype[type] = log.bind(null, type); +}); + +// detail is for messages that are turned on during debug +Logger.prototype.detail = function (msg) { + if (this.debug) { + log('detail', msg); + } +}; + +Logger.prototype.required = function (val) { + required = val; +}; + +Logger.prototype.debug = false; +Logger.prototype._log = function (type, msg) { + if (required) { + bus.emit('log', { type: type, message: msg || '', colour: msg || '' }); + } else if (type === 'error') { + console.error(msg); + } else { + console.log(msg || ''); + } +}; + +Object.defineProperty(Logger.prototype, 'useColours', { + set: function (val) { + useColours = val; + }, + get: function () { + return useColours; + }, +}); + +module.exports = Logger; diff --git a/node_modules/nodemon/lib/utils/merge.js b/node_modules/nodemon/lib/utils/merge.js new file mode 100644 index 000000000..1f3440bd6 --- /dev/null +++ b/node_modules/nodemon/lib/utils/merge.js @@ -0,0 +1,47 @@ +var clone = require('./clone'); + +module.exports = merge; + +function typesMatch(a, b) { + return (typeof a === typeof b) && (Array.isArray(a) === Array.isArray(b)); +} + +/** + * A deep merge of the source based on the target. 
+ * @param {Object} source [description] + * @param {Object} target [description] + * @return {Object} [description] + */ +function merge(source, target, result) { + if (result === undefined) { + result = clone(source); + } + + // merge missing values from the target to the source + Object.getOwnPropertyNames(target).forEach(function (key) { + if (source[key] === undefined) { + result[key] = target[key]; + } + }); + + Object.getOwnPropertyNames(source).forEach(function (key) { + var value = source[key]; + + if (target[key] && typesMatch(value, target[key])) { + // merge empty values + if (value === '') { + result[key] = target[key]; + } + + if (Array.isArray(value)) { + if (value.length === 0 && target[key].length) { + result[key] = target[key].slice(0); + } + } else if (typeof value === 'object') { + result[key] = merge(value, target[key]); + } + } + }); + + return result; +} \ No newline at end of file diff --git a/node_modules/nodemon/lib/version.js b/node_modules/nodemon/lib/version.js new file mode 100644 index 000000000..d0f510447 --- /dev/null +++ b/node_modules/nodemon/lib/version.js @@ -0,0 +1,100 @@ +module.exports = version; +module.exports.pin = pin; + +var fs = require('fs'); +var path = require('path'); +var exec = require('child_process').exec; +var root = null; + +function pin() { + return version().then(function (v) { + version.pinned = v; + }); +} + +function version(callback) { + // first find the package.json as this will be our root + var promise = findPackage(path.dirname(module.parent.filename)) + .then(function (dir) { + // now try to load the package + var v = require(path.resolve(dir, 'package.json')).version; + + if (v && v !== '0.0.0-development') { + return v; + } + + root = dir; + + // else we're in development, give the commit out + // get the last commit and whether the working dir is dirty + var promises = [ + branch().catch(function () { return 'master'; }), + commit().catch(function () { return ''; }), + dirty().catch(function () { return 0; }), + ]; + + // use the cached result as the export + return Promise.all(promises).then(function (res) { + var branch = res[0]; + var commit = res[1]; + var dirtyCount = parseInt(res[2], 10); + var curr = branch + ': ' + commit; + if (dirtyCount !== 0) { + curr += ' (' + dirtyCount + ' dirty files)'; + } + + return curr; + }); + }).catch(function (error) { + console.log(error.stack); + throw error; + }); + + if (callback) { + promise.then(function (res) { + callback(null, res); + }, callback); + } + + return promise; +} + +function findPackage(dir) { + if (dir === '/') { + return Promise.reject(new Error('package not found')); + } + return new Promise(function (resolve) { + fs.stat(path.resolve(dir, 'package.json'), function (error, exists) { + if (error || !exists) { + return resolve(findPackage(path.resolve(dir, '..'))); + } + + resolve(dir); + }); + }); +} + +function command(cmd) { + return new Promise(function (resolve, reject) { + exec(cmd, { cwd: root }, function (err, stdout, stderr) { + var error = stderr.trim(); + if (error) { + return reject(new Error(error)); + } + resolve(stdout.split('\n').join('')); + }); + }); +} + +function commit() { + return command('git rev-parse HEAD'); +} + +function branch() { + return command('git rev-parse --abbrev-ref HEAD'); +} + +function dirty() { + return command('expr $(git status --porcelain 2>/dev/null| ' + + 'egrep "^(M| M)" | wc -l)'); +} diff --git a/node_modules/nodemon/node_modules/debug/CHANGELOG.md b/node_modules/nodemon/node_modules/debug/CHANGELOG.md new file 
mode 100644 index 000000000..820d21e33 --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/CHANGELOG.md @@ -0,0 +1,395 @@ + +3.1.0 / 2017-09-26 +================== + + * Add `DEBUG_HIDE_DATE` env var (#486) + * Remove ReDoS regexp in %o formatter (#504) + * Remove "component" from package.json + * Remove `component.json` + * Ignore package-lock.json + * Examples: fix colors printout + * Fix: browser detection + * Fix: spelling mistake (#496, @EdwardBetts) + +3.0.1 / 2017-08-24 +================== + + * Fix: Disable colors in Edge and Internet Explorer (#489) + +3.0.0 / 2017-08-08 +================== + + * Breaking: Remove DEBUG_FD (#406) + * Breaking: Use `Date#toISOString()` instead to `Date#toUTCString()` when output is not a TTY (#418) + * Breaking: Make millisecond timer namespace specific and allow 'always enabled' output (#408) + * Addition: document `enabled` flag (#465) + * Addition: add 256 colors mode (#481) + * Addition: `enabled()` updates existing debug instances, add `destroy()` function (#440) + * Update: component: update "ms" to v2.0.0 + * Update: separate the Node and Browser tests in Travis-CI + * Update: refactor Readme, fixed documentation, added "Namespace Colors" section, redid screenshots + * Update: separate Node.js and web browser examples for organization + * Update: update "browserify" to v14.4.0 + * Fix: fix Readme typo (#473) + +2.6.9 / 2017-09-22 +================== + + * remove ReDoS regexp in %o formatter (#504) + +2.6.8 / 2017-05-18 +================== + + * Fix: Check for undefined on browser globals (#462, @marbemac) + +2.6.7 / 2017-05-16 +================== + + * Fix: Update ms to 2.0.0 to fix regular expression denial of service vulnerability (#458, @hubdotcom) + * Fix: Inline extend function in node implementation (#452, @dougwilson) + * Docs: Fix typo (#455, @msasad) + +2.6.5 / 2017-04-27 +================== + + * Fix: null reference check on window.documentElement.style.WebkitAppearance (#447, @thebigredgeek) + * Misc: clean up browser reference checks (#447, @thebigredgeek) + * Misc: add npm-debug.log to .gitignore (@thebigredgeek) + + +2.6.4 / 2017-04-20 +================== + + * Fix: bug that would occur if process.env.DEBUG is a non-string value. (#444, @LucianBuzzo) + * Chore: ignore bower.json in npm installations. 
(#437, @joaovieira) + * Misc: update "ms" to v0.7.3 (@tootallnate) + +2.6.3 / 2017-03-13 +================== + + * Fix: Electron reference to `process.env.DEBUG` (#431, @paulcbetts) + * Docs: Changelog fix (@thebigredgeek) + +2.6.2 / 2017-03-10 +================== + + * Fix: DEBUG_MAX_ARRAY_LENGTH (#420, @slavaGanzin) + * Docs: Add backers and sponsors from Open Collective (#422, @piamancini) + * Docs: Add Slackin invite badge (@tootallnate) + +2.6.1 / 2017-02-10 +================== + + * Fix: Module's `export default` syntax fix for IE8 `Expected identifier` error + * Fix: Whitelist DEBUG_FD for values 1 and 2 only (#415, @pi0) + * Fix: IE8 "Expected identifier" error (#414, @vgoma) + * Fix: Namespaces would not disable once enabled (#409, @musikov) + +2.6.0 / 2016-12-28 +================== + + * Fix: added better null pointer checks for browser useColors (@thebigredgeek) + * Improvement: removed explicit `window.debug` export (#404, @tootallnate) + * Improvement: deprecated `DEBUG_FD` environment variable (#405, @tootallnate) + +2.5.2 / 2016-12-25 +================== + + * Fix: reference error on window within webworkers (#393, @KlausTrainer) + * Docs: fixed README typo (#391, @lurch) + * Docs: added notice about v3 api discussion (@thebigredgeek) + +2.5.1 / 2016-12-20 +================== + + * Fix: babel-core compatibility + +2.5.0 / 2016-12-20 +================== + + * Fix: wrong reference in bower file (@thebigredgeek) + * Fix: webworker compatibility (@thebigredgeek) + * Fix: output formatting issue (#388, @kribblo) + * Fix: babel-loader compatibility (#383, @escwald) + * Misc: removed built asset from repo and publications (@thebigredgeek) + * Misc: moved source files to /src (#378, @yamikuronue) + * Test: added karma integration and replaced babel with browserify for browser tests (#378, @yamikuronue) + * Test: coveralls integration (#378, @yamikuronue) + * Docs: simplified language in the opening paragraph (#373, @yamikuronue) + +2.4.5 / 2016-12-17 +================== + + * Fix: `navigator` undefined in Rhino (#376, @jochenberger) + * Fix: custom log function (#379, @hsiliev) + * Improvement: bit of cleanup + linting fixes (@thebigredgeek) + * Improvement: rm non-maintainted `dist/` dir (#375, @freewil) + * Docs: simplified language in the opening paragraph. (#373, @yamikuronue) + +2.4.4 / 2016-12-14 +================== + + * Fix: work around debug being loaded in preload scripts for electron (#368, @paulcbetts) + +2.4.3 / 2016-12-14 +================== + + * Fix: navigation.userAgent error for react native (#364, @escwald) + +2.4.2 / 2016-12-14 +================== + + * Fix: browser colors (#367, @tootallnate) + * Misc: travis ci integration (@thebigredgeek) + * Misc: added linting and testing boilerplate with sanity check (@thebigredgeek) + +2.4.1 / 2016-12-13 +================== + + * Fix: typo that broke the package (#356) + +2.4.0 / 2016-12-13 +================== + + * Fix: bower.json references unbuilt src entry point (#342, @justmatt) + * Fix: revert "handle regex special characters" (@tootallnate) + * Feature: configurable util.inspect()`options for NodeJS (#327, @tootallnate) + * Feature: %O`(big O) pretty-prints objects (#322, @tootallnate) + * Improvement: allow colors in workers (#335, @botverse) + * Improvement: use same color for same namespace. 
(#338, @lchenay) + +2.3.3 / 2016-11-09 +================== + + * Fix: Catch `JSON.stringify()` errors (#195, Jovan Alleyne) + * Fix: Returning `localStorage` saved values (#331, Levi Thomason) + * Improvement: Don't create an empty object when no `process` (Nathan Rajlich) + +2.3.2 / 2016-11-09 +================== + + * Fix: be super-safe in index.js as well (@TooTallNate) + * Fix: should check whether process exists (Tom Newby) + +2.3.1 / 2016-11-09 +================== + + * Fix: Added electron compatibility (#324, @paulcbetts) + * Improvement: Added performance optimizations (@tootallnate) + * Readme: Corrected PowerShell environment variable example (#252, @gimre) + * Misc: Removed yarn lock file from source control (#321, @fengmk2) + +2.3.0 / 2016-11-07 +================== + + * Fix: Consistent placement of ms diff at end of output (#215, @gorangajic) + * Fix: Escaping of regex special characters in namespace strings (#250, @zacronos) + * Fix: Fixed bug causing crash on react-native (#282, @vkarpov15) + * Feature: Enabled ES6+ compatible import via default export (#212 @bucaran) + * Feature: Added %O formatter to reflect Chrome's console.log capability (#279, @oncletom) + * Package: Update "ms" to 0.7.2 (#315, @DevSide) + * Package: removed superfluous version property from bower.json (#207 @kkirsche) + * Readme: fix USE_COLORS to DEBUG_COLORS + * Readme: Doc fixes for format string sugar (#269, @mlucool) + * Readme: Updated docs for DEBUG_FD and DEBUG_COLORS environment variables (#232, @mattlyons0) + * Readme: doc fixes for PowerShell (#271 #243, @exoticknight @unreadable) + * Readme: better docs for browser support (#224, @matthewmueller) + * Tooling: Added yarn integration for development (#317, @thebigredgeek) + * Misc: Renamed History.md to CHANGELOG.md (@thebigredgeek) + * Misc: Added license file (#226 #274, @CantemoInternal @sdaitzman) + * Misc: Updated contributors (@thebigredgeek) + +2.2.0 / 2015-05-09 +================== + + * package: update "ms" to v0.7.1 (#202, @dougwilson) + * README: add logging to file example (#193, @DanielOchoa) + * README: fixed a typo (#191, @amir-s) + * browser: expose `storage` (#190, @stephenmathieson) + * Makefile: add a `distclean` target (#189, @stephenmathieson) + +2.1.3 / 2015-03-13 +================== + + * Updated stdout/stderr example (#186) + * Updated example/stdout.js to match debug current behaviour + * Renamed example/stderr.js to stdout.js + * Update Readme.md (#184) + * replace high intensity foreground color for bold (#182, #183) + +2.1.2 / 2015-03-01 +================== + + * dist: recompile + * update "ms" to v0.7.0 + * package: update "browserify" to v9.0.3 + * component: fix "ms.js" repo location + * changed bower package name + * updated documentation about using debug in a browser + * fix: security error on safari (#167, #168, @yields) + +2.1.1 / 2014-12-29 +================== + + * browser: use `typeof` to check for `console` existence + * browser: check for `console.log` truthiness (fix IE 8/9) + * browser: add support for Chrome apps + * Readme: added Windows usage remarks + * Add `bower.json` to properly support bower install + +2.1.0 / 2014-10-15 +================== + + * node: implement `DEBUG_FD` env variable support + * package: update "browserify" to v6.1.0 + * package: add "license" field to package.json (#135, @panuhorsmalahti) + +2.0.0 / 2014-09-01 +================== + + * package: update "browserify" to v5.11.0 + * node: use stderr rather than stdout for logging (#29, @stephenmathieson) + +1.0.4 / 2014-07-15 
+================== + + * dist: recompile + * example: remove `console.info()` log usage + * example: add "Content-Type" UTF-8 header to browser example + * browser: place %c marker after the space character + * browser: reset the "content" color via `color: inherit` + * browser: add colors support for Firefox >= v31 + * debug: prefer an instance `log()` function over the global one (#119) + * Readme: update documentation about styled console logs for FF v31 (#116, @wryk) + +1.0.3 / 2014-07-09 +================== + + * Add support for multiple wildcards in namespaces (#122, @seegno) + * browser: fix lint + +1.0.2 / 2014-06-10 +================== + + * browser: update color palette (#113, @gscottolson) + * common: make console logging function configurable (#108, @timoxley) + * node: fix %o colors on old node <= 0.8.x + * Makefile: find node path using shell/which (#109, @timoxley) + +1.0.1 / 2014-06-06 +================== + + * browser: use `removeItem()` to clear localStorage + * browser, node: don't set DEBUG if namespaces is undefined (#107, @leedm777) + * package: add "contributors" section + * node: fix comment typo + * README: list authors + +1.0.0 / 2014-06-04 +================== + + * make ms diff be global, not be scope + * debug: ignore empty strings in enable() + * node: make DEBUG_COLORS able to disable coloring + * *: export the `colors` array + * npmignore: don't publish the `dist` dir + * Makefile: refactor to use browserify + * package: add "browserify" as a dev dependency + * Readme: add Web Inspector Colors section + * node: reset terminal color for the debug content + * node: map "%o" to `util.inspect()` + * browser: map "%j" to `JSON.stringify()` + * debug: add custom "formatters" + * debug: use "ms" module for humanizing the diff + * Readme: add "bash" syntax highlighting + * browser: add Firebug color support + * browser: add colors for WebKit browsers + * node: apply log to `console` + * rewrite: abstract common logic for Node & browsers + * add .jshintrc file + +0.8.1 / 2014-04-14 +================== + + * package: re-add the "component" section + +0.8.0 / 2014-03-30 +================== + + * add `enable()` method for nodejs. Closes #27 + * change from stderr to stdout + * remove unnecessary index.js file + +0.7.4 / 2013-11-13 +================== + + * remove "browserify" key from package.json (fixes something in browserify) + +0.7.3 / 2013-10-30 +================== + + * fix: catch localStorage security error when cookies are blocked (Chrome) + * add debug(err) support. Closes #46 + * add .browser prop to package.json. Closes #42 + +0.7.2 / 2013-02-06 +================== + + * fix package.json + * fix: Mobile Safari (private mode) is broken with debug + * fix: Use unicode to send escape character to shell instead of octal to work with strict mode javascript + +0.7.1 / 2013-02-05 +================== + + * add repository URL to package.json + * add DEBUG_COLORED to force colored output + * add browserify support + * fix component. Closes #24 + +0.7.0 / 2012-05-04 +================== + + * Added .component to package.json + * Added debug.component.js build + +0.6.0 / 2012-03-16 +================== + + * Added support for "-" prefix in DEBUG [Vinay Pulim] + * Added `.enabled` flag to the node version [TooTallNate] + +0.5.0 / 2012-02-02 +================== + + * Added: humanize diffs. Closes #8 + * Added `debug.disable()` to the CS variant + * Removed padding. Closes #10 + * Fixed: persist client-side variant again. 
Closes #9 + +0.4.0 / 2012-02-01 +================== + + * Added browser variant support for older browsers [TooTallNate] + * Added `debug.enable('project:*')` to browser variant [TooTallNate] + * Added padding to diff (moved it to the right) + +0.3.0 / 2012-01-26 +================== + + * Added millisecond diff when isatty, otherwise UTC string + +0.2.0 / 2012-01-22 +================== + + * Added wildcard support + +0.1.0 / 2011-12-02 +================== + + * Added: remove colors unless stderr isatty [TooTallNate] + +0.0.1 / 2010-01-03 +================== + + * Initial release diff --git a/node_modules/nodemon/node_modules/debug/LICENSE b/node_modules/nodemon/node_modules/debug/LICENSE new file mode 100644 index 000000000..658c933d2 --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/LICENSE @@ -0,0 +1,19 @@ +(The MIT License) + +Copyright (c) 2014 TJ Holowaychuk + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software +and associated documentation files (the 'Software'), to deal in the Software without restriction, +including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, +and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial +portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT +LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, +WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. + diff --git a/node_modules/nodemon/node_modules/debug/README.md b/node_modules/nodemon/node_modules/debug/README.md new file mode 100644 index 000000000..0ee7634dd --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/README.md @@ -0,0 +1,437 @@ +# debug +[![Build Status](https://travis-ci.org/visionmedia/debug.svg?branch=master)](https://travis-ci.org/visionmedia/debug) [![Coverage Status](https://coveralls.io/repos/github/visionmedia/debug/badge.svg?branch=master)](https://coveralls.io/github/visionmedia/debug?branch=master) [![Slack](https://visionmedia-community-slackin.now.sh/badge.svg)](https://visionmedia-community-slackin.now.sh/) [![OpenCollective](https://opencollective.com/debug/backers/badge.svg)](#backers) +[![OpenCollective](https://opencollective.com/debug/sponsors/badge.svg)](#sponsors) + + + +A tiny JavaScript debugging utility modelled after Node.js core's debugging +technique. Works in Node.js and web browsers. + +## Installation + +```bash +$ npm install debug +``` + +## Usage + +`debug` exposes a function; simply pass this function the name of your module, and it will return a decorated version of `console.error` for you to pass debug statements to. This will allow you to toggle the debug output for different parts of your module as well as the module as a whole. 
+ +Example [_app.js_](./examples/node/app.js): + +```js +var debug = require('debug')('http') + , http = require('http') + , name = 'My App'; + +// fake app + +debug('booting %o', name); + +http.createServer(function(req, res){ + debug(req.method + ' ' + req.url); + res.end('hello\n'); +}).listen(3000, function(){ + debug('listening'); +}); + +// fake worker of some kind + +require('./worker'); +``` + +Example [_worker.js_](./examples/node/worker.js): + +```js +var a = require('debug')('worker:a') + , b = require('debug')('worker:b'); + +function work() { + a('doing lots of uninteresting work'); + setTimeout(work, Math.random() * 1000); +} + +work(); + +function workb() { + b('doing some work'); + setTimeout(workb, Math.random() * 2000); +} + +workb(); +``` + +The `DEBUG` environment variable is then used to enable these based on space or +comma-delimited names. + +Here are some examples: + +screen shot 2017-08-08 at 12 53 04 pm +screen shot 2017-08-08 at 12 53 38 pm +screen shot 2017-08-08 at 12 53 25 pm + +#### Windows command prompt notes + +##### CMD + +On Windows the environment variable is set using the `set` command. + +```cmd +set DEBUG=*,-not_this +``` + +Example: + +```cmd +set DEBUG=* & node app.js +``` + +##### PowerShell (VS Code default) + +PowerShell uses different syntax to set environment variables. + +```cmd +$env:DEBUG = "*,-not_this" +``` + +Example: + +```cmd +$env:DEBUG='app';node app.js +``` + +Then, run the program to be debugged as usual. + +npm script example: +```js + "windowsDebug": "@powershell -Command $env:DEBUG='*';node app.js", +``` + +## Namespace Colors + +Every debug instance has a color generated for it based on its namespace name. +This helps when visually parsing the debug output to identify which debug instance +a debug line belongs to. + +#### Node.js + +In Node.js, colors are enabled when stderr is a TTY. You also _should_ install +the [`supports-color`](https://npmjs.org/supports-color) module alongside debug, +otherwise debug will only use a small handful of basic colors. + + + +#### Web Browser + +Colors are also enabled on "Web Inspectors" that understand the `%c` formatting +option. These are WebKit web inspectors, Firefox ([since version +31](https://hacks.mozilla.org/2014/05/editable-box-model-multiple-selection-sublime-text-keys-much-more-firefox-developer-tools-episode-31/)) +and the Firebug plugin for Firefox (any version). + + + + +## Millisecond diff + +When actively developing an application it can be useful to see when the time spent between one `debug()` call and the next. Suppose for example you invoke `debug()` before requesting a resource, and after as well, the "+NNNms" will show you how much time was spent between calls. + + + +When stdout is not a TTY, `Date#toISOString()` is used, making it more useful for logging the debug information as shown below: + + + + +## Conventions + +If you're using this in one or more of your libraries, you _should_ use the name of your library so that developers may toggle debugging as desired without guessing names. If you have more than one debuggers you _should_ prefix them with your library name and use ":" to separate features. For example "bodyParser" from Connect would then be "connect:bodyParser". If you append a "*" to the end of your name, it will always be enabled regardless of the setting of the DEBUG environment variable. You can then use it for normal output as well as debug output. + +## Wildcards + +The `*` character may be used as a wildcard. 
Suppose for example your library has +debuggers named "connect:bodyParser", "connect:compress", "connect:session", +instead of listing all three with +`DEBUG=connect:bodyParser,connect:compress,connect:session`, you may simply do +`DEBUG=connect:*`, or to run everything using this module simply use `DEBUG=*`. + +You can also exclude specific debuggers by prefixing them with a "-" character. +For example, `DEBUG=*,-connect:*` would include all debuggers except those +starting with "connect:". + +## Environment Variables + +When running through Node.js, you can set a few environment variables that will +change the behavior of the debug logging: + +| Name | Purpose | +|-----------|-------------------------------------------------| +| `DEBUG` | Enables/disables specific debugging namespaces. | +| `DEBUG_HIDE_DATE` | Hide date from debug output (non-TTY). | +| `DEBUG_COLORS`| Whether or not to use colors in the debug output. | +| `DEBUG_DEPTH` | Object inspection depth. | +| `DEBUG_SHOW_HIDDEN` | Shows hidden properties on inspected objects. | + + +__Note:__ The environment variables beginning with `DEBUG_` end up being +converted into an Options object that gets used with `%o`/`%O` formatters. +See the Node.js documentation for +[`util.inspect()`](https://nodejs.org/api/util.html#util_util_inspect_object_options) +for the complete list. + +## Formatters + +Debug uses [printf-style](https://wikipedia.org/wiki/Printf_format_string) formatting. +Below are the officially supported formatters: + +| Formatter | Representation | +|-----------|----------------| +| `%O` | Pretty-print an Object on multiple lines. | +| `%o` | Pretty-print an Object all on a single line. | +| `%s` | String. | +| `%d` | Number (both integer and float). | +| `%j` | JSON. Replaced with the string '[Circular]' if the argument contains circular references. | +| `%%` | Single percent sign ('%'). This does not consume an argument. | + + +### Custom formatters + +You can add custom formatters by extending the `debug.formatters` object. +For example, if you wanted to add support for rendering a Buffer as hex with +`%h`, you could do something like: + +```js +const createDebug = require('debug') +createDebug.formatters.h = (v) => { + return v.toString('hex') +} + +// …elsewhere +const debug = createDebug('foo') +debug('this is hex: %h', new Buffer('hello world')) +// foo this is hex: 68656c6c6f20776f726c6421 +0ms +``` + + +## Browser Support + +You can build a browser-ready script using [browserify](https://github.com/substack/node-browserify), +or just use the [browserify-as-a-service](https://wzrd.in/) [build](https://wzrd.in/standalone/debug@latest), +if you don't want to build it yourself. + +Debug's enable state is currently persisted by `localStorage`. +Consider the situation shown below where you have `worker:a` and `worker:b`, +and wish to debug both. You can enable this using `localStorage.debug`: + +```js +localStorage.debug = 'worker:*' +``` + +And then refresh the page. 
+ +```js +a = debug('worker:a'); +b = debug('worker:b'); + +setInterval(function(){ + a('doing some work'); +}, 1000); + +setInterval(function(){ + b('doing some work'); +}, 1200); +``` + + +## Output streams + + By default `debug` will log to stderr, however this can be configured per-namespace by overriding the `log` method: + +Example [_stdout.js_](./examples/node/stdout.js): + +```js +var debug = require('debug'); +var error = debug('app:error'); + +// by default stderr is used +error('goes to stderr!'); + +var log = debug('app:log'); +// set this namespace to log via console.log +log.log = console.log.bind(console); // don't forget to bind to console! +log('goes to stdout'); +error('still goes to stderr!'); + +// set all output to go via console.info +// overrides all per-namespace log settings +debug.log = console.info.bind(console); +error('now goes to stdout via console.info'); +log('still goes to stdout, but via console.info now'); +``` + +## Extend +You can simply extend debugger +```js +const log = require('debug')('auth'); + +//creates new debug instance with extended namespace +const logSign = log.extend('sign'); +const logLogin = log.extend('login'); + +log('hello'); // auth hello +logSign('hello'); //auth:sign hello +logLogin('hello'); //auth:login hello +``` + +## Set dynamically + +You can also enable debug dynamically by calling the `enable()` method : + +```js +let debug = require('debug'); + +console.log(1, debug.enabled('test')); + +debug.enable('test'); +console.log(2, debug.enabled('test')); + +debug.disable(); +console.log(3, debug.enabled('test')); + +``` + +print : +``` +1 false +2 true +3 false +``` + +Usage : +`enable(namespaces)` +`namespaces` can include modes separated by a colon and wildcards. + +Note that calling `enable()` completely overrides previously set DEBUG variable : + +``` +$ DEBUG=foo node -e 'var dbg = require("debug"); dbg.enable("bar"); console.log(dbg.enabled("foo"))' +=> false +``` + +## Checking whether a debug target is enabled + +After you've created a debug instance, you can determine whether or not it is +enabled by checking the `enabled` property: + +```javascript +const debug = require('debug')('http'); + +if (debug.enabled) { + // do stuff... +} +``` + +You can also manually toggle this property to force the debug instance to be +enabled or disabled. + + +## Authors + + - TJ Holowaychuk + - Nathan Rajlich + - Andrew Rhyne + +## Backers + +Support us with a monthly donation and help us continue our activities. [[Become a backer](https://opencollective.com/debug#backer)] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +## Sponsors + +Become a sponsor and get your logo on our README on Github with a link to your site. [[Become a sponsor](https://opencollective.com/debug#sponsor)] + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + +## License + +(The MIT License) + +Copyright (c) 2014-2017 TJ Holowaychuk <tj@vision-media.ca> + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/nodemon/node_modules/debug/node.js b/node_modules/nodemon/node_modules/debug/node.js new file mode 100644 index 000000000..7fc36fe6d --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/node.js @@ -0,0 +1 @@ +module.exports = require('./src/node'); diff --git a/node_modules/nodemon/node_modules/debug/package.json b/node_modules/nodemon/node_modules/debug/package.json new file mode 100644 index 000000000..0f767a644 --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/package.json @@ -0,0 +1,90 @@ +{ + "_from": "debug@^3.2.6", + "_id": "debug@3.2.7", + "_inBundle": false, + "_integrity": "sha512-CFjzYYAi4ThfiQvizrFQevTTXHtnCqWfe7x1AhgEscTz6ZbLbfoLRLPugTQyBth6f8ZERVUSyWHFD/7Wu4t1XQ==", + "_location": "/nodemon/debug", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "debug@^3.2.6", + "name": "debug", + "escapedName": "debug", + "rawSpec": "^3.2.6", + "saveSpec": null, + "fetchSpec": "^3.2.6" + }, + "_requiredBy": [ + "/nodemon" + ], + "_resolved": "https://registry.npmjs.org/debug/-/debug-3.2.7.tgz", + "_shasum": "72580b7e9145fb39b6676f9c5e5fb100b934179a", + "_spec": "debug@^3.2.6", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon", + "author": { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca" + }, + "browser": "./src/browser.js", + "bugs": { + "url": "https://github.com/visionmedia/debug/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Nathan Rajlich", + "email": "nathan@tootallnate.net", + "url": "http://n8.io" + }, + { + "name": "Andrew Rhyne", + "email": "rhyneandrew@gmail.com" + } + ], + "dependencies": { + "ms": "^2.1.1" + }, + "deprecated": false, + "description": "small debugging utility", + "devDependencies": { + "@babel/cli": "^7.0.0", + "@babel/core": "^7.0.0", + "@babel/preset-env": "^7.0.0", + "browserify": "14.4.0", + "chai": "^3.5.0", + "concurrently": "^3.1.0", + "coveralls": "^3.0.2", + "istanbul": "^0.4.5", + "karma": "^3.0.0", + "karma-chai": "^0.1.0", + "karma-mocha": "^1.3.0", + "karma-phantomjs-launcher": "^1.0.2", + "mocha": "^5.2.0", + "mocha-lcov-reporter": "^1.2.0", + "rimraf": "^2.5.4", + "xo": "^0.23.0" + }, + "files": [ + "src", + "node.js", + "dist/debug.js", + "LICENSE", + "README.md" + ], + "homepage": "https://github.com/visionmedia/debug#readme", + "keywords": [ + "debug", + "log", + "debugger" + ], + "license": "MIT", + "main": "./src/index.js", + "name": "debug", + "repository": { + "type": "git", + "url": "git://github.com/visionmedia/debug.git" + }, + "unpkg": "./dist/debug.js", + "version": "3.2.7" +} diff --git a/node_modules/nodemon/node_modules/debug/src/browser.js b/node_modules/nodemon/node_modules/debug/src/browser.js new file mode 100644 index 000000000..c924b0ac4 --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/src/browser.js @@ -0,0 +1,180 @@ +"use strict"; + +function _typeof(obj) { if (typeof Symbol === "function" && typeof Symbol.iterator === "symbol") { _typeof = function _typeof(obj) { 
return typeof obj; }; } else { _typeof = function _typeof(obj) { return obj && typeof Symbol === "function" && obj.constructor === Symbol && obj !== Symbol.prototype ? "symbol" : typeof obj; }; } return _typeof(obj); } + +/* eslint-env browser */ + +/** + * This is the web browser implementation of `debug()`. + */ +exports.log = log; +exports.formatArgs = formatArgs; +exports.save = save; +exports.load = load; +exports.useColors = useColors; +exports.storage = localstorage(); +/** + * Colors. + */ + +exports.colors = ['#0000CC', '#0000FF', '#0033CC', '#0033FF', '#0066CC', '#0066FF', '#0099CC', '#0099FF', '#00CC00', '#00CC33', '#00CC66', '#00CC99', '#00CCCC', '#00CCFF', '#3300CC', '#3300FF', '#3333CC', '#3333FF', '#3366CC', '#3366FF', '#3399CC', '#3399FF', '#33CC00', '#33CC33', '#33CC66', '#33CC99', '#33CCCC', '#33CCFF', '#6600CC', '#6600FF', '#6633CC', '#6633FF', '#66CC00', '#66CC33', '#9900CC', '#9900FF', '#9933CC', '#9933FF', '#99CC00', '#99CC33', '#CC0000', '#CC0033', '#CC0066', '#CC0099', '#CC00CC', '#CC00FF', '#CC3300', '#CC3333', '#CC3366', '#CC3399', '#CC33CC', '#CC33FF', '#CC6600', '#CC6633', '#CC9900', '#CC9933', '#CCCC00', '#CCCC33', '#FF0000', '#FF0033', '#FF0066', '#FF0099', '#FF00CC', '#FF00FF', '#FF3300', '#FF3333', '#FF3366', '#FF3399', '#FF33CC', '#FF33FF', '#FF6600', '#FF6633', '#FF9900', '#FF9933', '#FFCC00', '#FFCC33']; +/** + * Currently only WebKit-based Web Inspectors, Firefox >= v31, + * and the Firebug extension (any Firefox version) are known + * to support "%c" CSS customizations. + * + * TODO: add a `localStorage` variable to explicitly enable/disable colors + */ +// eslint-disable-next-line complexity + +function useColors() { + // NB: In an Electron preload script, document will be defined but not fully + // initialized. Since we know we're in Chrome, we'll just detect this case + // explicitly + if (typeof window !== 'undefined' && window.process && (window.process.type === 'renderer' || window.process.__nwjs)) { + return true; + } // Internet Explorer and Edge do not support colors. + + + if (typeof navigator !== 'undefined' && navigator.userAgent && navigator.userAgent.toLowerCase().match(/(edge|trident)\/(\d+)/)) { + return false; + } // Is webkit? http://stackoverflow.com/a/16459606/376773 + // document is undefined in react-native: https://github.com/facebook/react-native/pull/1632 + + + return typeof document !== 'undefined' && document.documentElement && document.documentElement.style && document.documentElement.style.WebkitAppearance || // Is firebug? http://stackoverflow.com/a/398120/376773 + typeof window !== 'undefined' && window.console && (window.console.firebug || window.console.exception && window.console.table) || // Is firefox >= v31? + // https://developer.mozilla.org/en-US/docs/Tools/Web_Console#Styling_messages + typeof navigator !== 'undefined' && navigator.userAgent && navigator.userAgent.toLowerCase().match(/firefox\/(\d+)/) && parseInt(RegExp.$1, 10) >= 31 || // Double check webkit in userAgent just in case we are in a worker + typeof navigator !== 'undefined' && navigator.userAgent && navigator.userAgent.toLowerCase().match(/applewebkit\/(\d+)/); +} +/** + * Colorize log arguments if enabled. + * + * @api public + */ + + +function formatArgs(args) { + args[0] = (this.useColors ? '%c' : '') + this.namespace + (this.useColors ? ' %c' : ' ') + args[0] + (this.useColors ? 
'%c ' : ' ') + '+' + module.exports.humanize(this.diff); + + if (!this.useColors) { + return; + } + + var c = 'color: ' + this.color; + args.splice(1, 0, c, 'color: inherit'); // The final "%c" is somewhat tricky, because there could be other + // arguments passed either before or after the %c, so we need to + // figure out the correct index to insert the CSS into + + var index = 0; + var lastC = 0; + args[0].replace(/%[a-zA-Z%]/g, function (match) { + if (match === '%%') { + return; + } + + index++; + + if (match === '%c') { + // We only are interested in the *last* %c + // (the user may have provided their own) + lastC = index; + } + }); + args.splice(lastC, 0, c); +} +/** + * Invokes `console.log()` when available. + * No-op when `console.log` is not a "function". + * + * @api public + */ + + +function log() { + var _console; + + // This hackery is required for IE8/9, where + // the `console.log` function doesn't have 'apply' + return (typeof console === "undefined" ? "undefined" : _typeof(console)) === 'object' && console.log && (_console = console).log.apply(_console, arguments); +} +/** + * Save `namespaces`. + * + * @param {String} namespaces + * @api private + */ + + +function save(namespaces) { + try { + if (namespaces) { + exports.storage.setItem('debug', namespaces); + } else { + exports.storage.removeItem('debug'); + } + } catch (error) {// Swallow + // XXX (@Qix-) should we be logging these? + } +} +/** + * Load `namespaces`. + * + * @return {String} returns the previously persisted debug modes + * @api private + */ + + +function load() { + var r; + + try { + r = exports.storage.getItem('debug'); + } catch (error) {} // Swallow + // XXX (@Qix-) should we be logging these? + // If debug isn't set in LS, and we're in Electron, try to load $DEBUG + + + if (!r && typeof process !== 'undefined' && 'env' in process) { + r = process.env.DEBUG; + } + + return r; +} +/** + * Localstorage attempts to return the localstorage. + * + * This is necessary because safari throws + * when a user disables cookies/localstorage + * and you attempt to access it. + * + * @return {LocalStorage} + * @api private + */ + + +function localstorage() { + try { + // TVMLKit (Apple TV JS Runtime) does not have a window object, just localStorage in the global context + // The Browser also has localStorage in the global context. + return localStorage; + } catch (error) {// Swallow + // XXX (@Qix-) should we be logging these? + } +} + +module.exports = require('./common')(exports); +var formatters = module.exports.formatters; +/** + * Map %j to `JSON.stringify()`, since no Web Inspectors do that by default. + */ + +formatters.j = function (v) { + try { + return JSON.stringify(v); + } catch (error) { + return '[UnexpectedJSONParseError]: ' + error.message; + } +}; + diff --git a/node_modules/nodemon/node_modules/debug/src/common.js b/node_modules/nodemon/node_modules/debug/src/common.js new file mode 100644 index 000000000..e0de3fb53 --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/src/common.js @@ -0,0 +1,249 @@ +"use strict"; + +/** + * This is the common logic for both the Node.js and web browser + * implementations of `debug()`. 
+ */ +function setup(env) { + createDebug.debug = createDebug; + createDebug.default = createDebug; + createDebug.coerce = coerce; + createDebug.disable = disable; + createDebug.enable = enable; + createDebug.enabled = enabled; + createDebug.humanize = require('ms'); + Object.keys(env).forEach(function (key) { + createDebug[key] = env[key]; + }); + /** + * Active `debug` instances. + */ + + createDebug.instances = []; + /** + * The currently active debug mode names, and names to skip. + */ + + createDebug.names = []; + createDebug.skips = []; + /** + * Map of special "%n" handling functions, for the debug "format" argument. + * + * Valid key names are a single, lower or upper-case letter, i.e. "n" and "N". + */ + + createDebug.formatters = {}; + /** + * Selects a color for a debug namespace + * @param {String} namespace The namespace string for the for the debug instance to be colored + * @return {Number|String} An ANSI color code for the given namespace + * @api private + */ + + function selectColor(namespace) { + var hash = 0; + + for (var i = 0; i < namespace.length; i++) { + hash = (hash << 5) - hash + namespace.charCodeAt(i); + hash |= 0; // Convert to 32bit integer + } + + return createDebug.colors[Math.abs(hash) % createDebug.colors.length]; + } + + createDebug.selectColor = selectColor; + /** + * Create a debugger with the given `namespace`. + * + * @param {String} namespace + * @return {Function} + * @api public + */ + + function createDebug(namespace) { + var prevTime; + + function debug() { + // Disabled? + if (!debug.enabled) { + return; + } + + for (var _len = arguments.length, args = new Array(_len), _key = 0; _key < _len; _key++) { + args[_key] = arguments[_key]; + } + + var self = debug; // Set `diff` timestamp + + var curr = Number(new Date()); + var ms = curr - (prevTime || curr); + self.diff = ms; + self.prev = prevTime; + self.curr = curr; + prevTime = curr; + args[0] = createDebug.coerce(args[0]); + + if (typeof args[0] !== 'string') { + // Anything else let's inspect with %O + args.unshift('%O'); + } // Apply any `formatters` transformations + + + var index = 0; + args[0] = args[0].replace(/%([a-zA-Z%])/g, function (match, format) { + // If we encounter an escaped % then don't increase the array index + if (match === '%%') { + return match; + } + + index++; + var formatter = createDebug.formatters[format]; + + if (typeof formatter === 'function') { + var val = args[index]; + match = formatter.call(self, val); // Now we need to remove `args[index]` since it's inlined in the `format` + + args.splice(index, 1); + index--; + } + + return match; + }); // Apply env-specific formatting (colors, etc.) 
+ + createDebug.formatArgs.call(self, args); + var logFn = self.log || createDebug.log; + logFn.apply(self, args); + } + + debug.namespace = namespace; + debug.enabled = createDebug.enabled(namespace); + debug.useColors = createDebug.useColors(); + debug.color = selectColor(namespace); + debug.destroy = destroy; + debug.extend = extend; // Debug.formatArgs = formatArgs; + // debug.rawLog = rawLog; + // env-specific initialization logic for debug instances + + if (typeof createDebug.init === 'function') { + createDebug.init(debug); + } + + createDebug.instances.push(debug); + return debug; + } + + function destroy() { + var index = createDebug.instances.indexOf(this); + + if (index !== -1) { + createDebug.instances.splice(index, 1); + return true; + } + + return false; + } + + function extend(namespace, delimiter) { + return createDebug(this.namespace + (typeof delimiter === 'undefined' ? ':' : delimiter) + namespace); + } + /** + * Enables a debug mode by namespaces. This can include modes + * separated by a colon and wildcards. + * + * @param {String} namespaces + * @api public + */ + + + function enable(namespaces) { + createDebug.save(namespaces); + createDebug.names = []; + createDebug.skips = []; + var i; + var split = (typeof namespaces === 'string' ? namespaces : '').split(/[\s,]+/); + var len = split.length; + + for (i = 0; i < len; i++) { + if (!split[i]) { + // ignore empty strings + continue; + } + + namespaces = split[i].replace(/\*/g, '.*?'); + + if (namespaces[0] === '-') { + createDebug.skips.push(new RegExp('^' + namespaces.substr(1) + '$')); + } else { + createDebug.names.push(new RegExp('^' + namespaces + '$')); + } + } + + for (i = 0; i < createDebug.instances.length; i++) { + var instance = createDebug.instances[i]; + instance.enabled = createDebug.enabled(instance.namespace); + } + } + /** + * Disable debug output. + * + * @api public + */ + + + function disable() { + createDebug.enable(''); + } + /** + * Returns true if the given mode name is enabled, false otherwise. + * + * @param {String} name + * @return {Boolean} + * @api public + */ + + + function enabled(name) { + if (name[name.length - 1] === '*') { + return true; + } + + var i; + var len; + + for (i = 0, len = createDebug.skips.length; i < len; i++) { + if (createDebug.skips[i].test(name)) { + return false; + } + } + + for (i = 0, len = createDebug.names.length; i < len; i++) { + if (createDebug.names[i].test(name)) { + return true; + } + } + + return false; + } + /** + * Coerce `val`. + * + * @param {Mixed} val + * @return {Mixed} + * @api private + */ + + + function coerce(val) { + if (val instanceof Error) { + return val.stack || val.message; + } + + return val; + } + + createDebug.enable(createDebug.load()); + return createDebug; +} + +module.exports = setup; + diff --git a/node_modules/nodemon/node_modules/debug/src/index.js b/node_modules/nodemon/node_modules/debug/src/index.js new file mode 100644 index 000000000..021731593 --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/src/index.js @@ -0,0 +1,12 @@ +"use strict"; + +/** + * Detect Electron renderer / nwjs process, which is node, but we should + * treat as a browser. 
+ */ +if (typeof process === 'undefined' || process.type === 'renderer' || process.browser === true || process.__nwjs) { + module.exports = require('./browser.js'); +} else { + module.exports = require('./node.js'); +} + diff --git a/node_modules/nodemon/node_modules/debug/src/node.js b/node_modules/nodemon/node_modules/debug/src/node.js new file mode 100644 index 000000000..1e6a5f16a --- /dev/null +++ b/node_modules/nodemon/node_modules/debug/src/node.js @@ -0,0 +1,177 @@ +"use strict"; + +/** + * Module dependencies. + */ +var tty = require('tty'); + +var util = require('util'); +/** + * This is the Node.js implementation of `debug()`. + */ + + +exports.init = init; +exports.log = log; +exports.formatArgs = formatArgs; +exports.save = save; +exports.load = load; +exports.useColors = useColors; +/** + * Colors. + */ + +exports.colors = [6, 2, 3, 4, 5, 1]; + +try { + // Optional dependency (as in, doesn't need to be installed, NOT like optionalDependencies in package.json) + // eslint-disable-next-line import/no-extraneous-dependencies + var supportsColor = require('supports-color'); + + if (supportsColor && (supportsColor.stderr || supportsColor).level >= 2) { + exports.colors = [20, 21, 26, 27, 32, 33, 38, 39, 40, 41, 42, 43, 44, 45, 56, 57, 62, 63, 68, 69, 74, 75, 76, 77, 78, 79, 80, 81, 92, 93, 98, 99, 112, 113, 128, 129, 134, 135, 148, 149, 160, 161, 162, 163, 164, 165, 166, 167, 168, 169, 170, 171, 172, 173, 178, 179, 184, 185, 196, 197, 198, 199, 200, 201, 202, 203, 204, 205, 206, 207, 208, 209, 214, 215, 220, 221]; + } +} catch (error) {} // Swallow - we only care if `supports-color` is available; it doesn't have to be. + +/** + * Build up the default `inspectOpts` object from the environment variables. + * + * $ DEBUG_COLORS=no DEBUG_DEPTH=10 DEBUG_SHOW_HIDDEN=enabled node script.js + */ + + +exports.inspectOpts = Object.keys(process.env).filter(function (key) { + return /^debug_/i.test(key); +}).reduce(function (obj, key) { + // Camel-case + var prop = key.substring(6).toLowerCase().replace(/_([a-z])/g, function (_, k) { + return k.toUpperCase(); + }); // Coerce string value into JS value + + var val = process.env[key]; + + if (/^(yes|on|true|enabled)$/i.test(val)) { + val = true; + } else if (/^(no|off|false|disabled)$/i.test(val)) { + val = false; + } else if (val === 'null') { + val = null; + } else { + val = Number(val); + } + + obj[prop] = val; + return obj; +}, {}); +/** + * Is stdout a TTY? Colored output is enabled when `true`. + */ + +function useColors() { + return 'colors' in exports.inspectOpts ? Boolean(exports.inspectOpts.colors) : tty.isatty(process.stderr.fd); +} +/** + * Adds ANSI color escape codes if enabled. + * + * @api public + */ + + +function formatArgs(args) { + var name = this.namespace, + useColors = this.useColors; + + if (useColors) { + var c = this.color; + var colorCode = "\x1B[3" + (c < 8 ? c : '8;5;' + c); + var prefix = " ".concat(colorCode, ";1m").concat(name, " \x1B[0m"); + args[0] = prefix + args[0].split('\n').join('\n' + prefix); + args.push(colorCode + 'm+' + module.exports.humanize(this.diff) + "\x1B[0m"); + } else { + args[0] = getDate() + name + ' ' + args[0]; + } +} + +function getDate() { + if (exports.inspectOpts.hideDate) { + return ''; + } + + return new Date().toISOString() + ' '; +} +/** + * Invokes `util.format()` with the specified arguments and writes to stderr. + */ + + +function log() { + return process.stderr.write(util.format.apply(util, arguments) + '\n'); +} +/** + * Save `namespaces`. 
+ * + * @param {String} namespaces + * @api private + */ + + +function save(namespaces) { + if (namespaces) { + process.env.DEBUG = namespaces; + } else { + // If you set a process.env field to null or undefined, it gets cast to the + // string 'null' or 'undefined'. Just delete instead. + delete process.env.DEBUG; + } +} +/** + * Load `namespaces`. + * + * @return {String} returns the previously persisted debug modes + * @api private + */ + + +function load() { + return process.env.DEBUG; +} +/** + * Init logic for `debug` instances. + * + * Create a new `inspectOpts` object in case `useColors` is set + * differently for a particular `debug` instance. + */ + + +function init(debug) { + debug.inspectOpts = {}; + var keys = Object.keys(exports.inspectOpts); + + for (var i = 0; i < keys.length; i++) { + debug.inspectOpts[keys[i]] = exports.inspectOpts[keys[i]]; + } +} + +module.exports = require('./common')(exports); +var formatters = module.exports.formatters; +/** + * Map %o to `util.inspect()`, all on a single line. + */ + +formatters.o = function (v) { + this.inspectOpts.colors = this.useColors; + return util.inspect(v, this.inspectOpts) + .split('\n') + .map(function (str) { return str.trim(); }) + .join(' '); +}; +/** + * Map %O to `util.inspect()`, allowing multiple lines if needed. + */ + + +formatters.O = function (v) { + this.inspectOpts.colors = this.useColors; + return util.inspect(v, this.inspectOpts); +}; + diff --git a/node_modules/nodemon/node_modules/ms/index.js b/node_modules/nodemon/node_modules/ms/index.js new file mode 100644 index 000000000..ea734fb73 --- /dev/null +++ b/node_modules/nodemon/node_modules/ms/index.js @@ -0,0 +1,162 @@ +/** + * Helpers. + */ + +var s = 1000; +var m = s * 60; +var h = m * 60; +var d = h * 24; +var w = d * 7; +var y = d * 365.25; + +/** + * Parse or format the given `val`. + * + * Options: + * + * - `long` verbose formatting [false] + * + * @param {String|Number} val + * @param {Object} [options] + * @throws {Error} throw an error if val is not a non-empty string or a number + * @return {String|Number} + * @api public + */ + +module.exports = function (val, options) { + options = options || {}; + var type = typeof val; + if (type === 'string' && val.length > 0) { + return parse(val); + } else if (type === 'number' && isFinite(val)) { + return options.long ? fmtLong(val) : fmtShort(val); + } + throw new Error( + 'val is not a non-empty string or a valid number. val=' + + JSON.stringify(val) + ); +}; + +/** + * Parse the given `str` and return milliseconds. 
+ * + * @param {String} str + * @return {Number} + * @api private + */ + +function parse(str) { + str = String(str); + if (str.length > 100) { + return; + } + var match = /^(-?(?:\d+)?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|weeks?|w|years?|yrs?|y)?$/i.exec( + str + ); + if (!match) { + return; + } + var n = parseFloat(match[1]); + var type = (match[2] || 'ms').toLowerCase(); + switch (type) { + case 'years': + case 'year': + case 'yrs': + case 'yr': + case 'y': + return n * y; + case 'weeks': + case 'week': + case 'w': + return n * w; + case 'days': + case 'day': + case 'd': + return n * d; + case 'hours': + case 'hour': + case 'hrs': + case 'hr': + case 'h': + return n * h; + case 'minutes': + case 'minute': + case 'mins': + case 'min': + case 'm': + return n * m; + case 'seconds': + case 'second': + case 'secs': + case 'sec': + case 's': + return n * s; + case 'milliseconds': + case 'millisecond': + case 'msecs': + case 'msec': + case 'ms': + return n; + default: + return undefined; + } +} + +/** + * Short format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtShort(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return Math.round(ms / d) + 'd'; + } + if (msAbs >= h) { + return Math.round(ms / h) + 'h'; + } + if (msAbs >= m) { + return Math.round(ms / m) + 'm'; + } + if (msAbs >= s) { + return Math.round(ms / s) + 's'; + } + return ms + 'ms'; +} + +/** + * Long format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtLong(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return plural(ms, msAbs, d, 'day'); + } + if (msAbs >= h) { + return plural(ms, msAbs, h, 'hour'); + } + if (msAbs >= m) { + return plural(ms, msAbs, m, 'minute'); + } + if (msAbs >= s) { + return plural(ms, msAbs, s, 'second'); + } + return ms + ' ms'; +} + +/** + * Pluralization helper. + */ + +function plural(ms, msAbs, n, name) { + var isPlural = msAbs >= n * 1.5; + return Math.round(ms / n) + ' ' + name + (isPlural ? 's' : ''); +} diff --git a/node_modules/nodemon/node_modules/ms/license.md b/node_modules/nodemon/node_modules/ms/license.md new file mode 100644 index 000000000..fa5d39b62 --- /dev/null +++ b/node_modules/nodemon/node_modules/ms/license.md @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2020 Vercel, Inc. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. 
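The `fmtShort()`, `fmtLong()` and `plural()` helpers above round to the nearest whole unit, and `plural()` only switches to the plural form once the value reaches 1.5x the unit. Below is a minimal sketch of that behaviour, not part of the diff, assuming `require('ms')` resolves to the implementation shown above; the expected outputs in the comments follow directly from that code.

```js
// Illustration only -- not part of the vendored file. Assumes require('ms')
// resolves to the ms implementation shown above.
var ms = require('ms');

console.log(ms('2.5 hrs'));              // 9000000: parse() -> 2.5 * 60 * 60 * 1000
console.log(ms(89000));                  // "1m": fmtShort() rounds 1.48 minutes to 1
console.log(ms(90000, { long: true }));  // "2 minutes": plural() hits the 1.5x threshold
console.log(ms(89999, { long: true }));  // "1 minute": just under 1.5 * 60000
```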
diff --git a/node_modules/nodemon/node_modules/ms/package.json b/node_modules/nodemon/node_modules/ms/package.json new file mode 100644 index 000000000..302ee2048 --- /dev/null +++ b/node_modules/nodemon/node_modules/ms/package.json @@ -0,0 +1,70 @@ +{ + "_from": "ms@^2.1.1", + "_id": "ms@2.1.3", + "_inBundle": false, + "_integrity": "sha512-6FlzubTLZG3J2a/NVCAleEhjzq5oxgHyaCU9yYXvcLsvoVaHJq/s5xXI6/XXP6tz7R9xAOtHnSO/tXtF3WRTlA==", + "_location": "/nodemon/ms", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "ms@^2.1.1", + "name": "ms", + "escapedName": "ms", + "rawSpec": "^2.1.1", + "saveSpec": null, + "fetchSpec": "^2.1.1" + }, + "_requiredBy": [ + "/nodemon/debug" + ], + "_resolved": "https://registry.npmjs.org/ms/-/ms-2.1.3.tgz", + "_shasum": "574c8138ce1d2b5861f0b44579dbadd60c6615b2", + "_spec": "ms@^2.1.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon\\node_modules\\debug", + "bugs": { + "url": "https://github.com/vercel/ms/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Tiny millisecond conversion utility", + "devDependencies": { + "eslint": "4.18.2", + "expect.js": "0.3.1", + "husky": "0.14.3", + "lint-staged": "5.0.0", + "mocha": "4.0.1", + "prettier": "2.0.5" + }, + "eslintConfig": { + "extends": "eslint:recommended", + "env": { + "node": true, + "es6": true + } + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/vercel/ms#readme", + "license": "MIT", + "lint-staged": { + "*.js": [ + "npm run lint", + "prettier --single-quote --write", + "git add" + ] + }, + "main": "./index", + "name": "ms", + "repository": { + "type": "git", + "url": "git+https://github.com/vercel/ms.git" + }, + "scripts": { + "lint": "eslint lib/* bin/*", + "precommit": "lint-staged", + "test": "mocha tests.js" + }, + "version": "2.1.3" +} diff --git a/node_modules/nodemon/node_modules/ms/readme.md b/node_modules/nodemon/node_modules/ms/readme.md new file mode 100644 index 000000000..0fc1abb3b --- /dev/null +++ b/node_modules/nodemon/node_modules/ms/readme.md @@ -0,0 +1,59 @@ +# ms + +![CI](https://github.com/vercel/ms/workflows/CI/badge.svg) + +Use this package to easily convert various time formats to milliseconds. + +## Examples + +```js +ms('2 days') // 172800000 +ms('1d') // 86400000 +ms('10h') // 36000000 +ms('2.5 hrs') // 9000000 +ms('2h') // 7200000 +ms('1m') // 60000 +ms('5s') // 5000 +ms('1y') // 31557600000 +ms('100') // 100 +ms('-3 days') // -259200000 +ms('-1h') // -3600000 +ms('-200') // -200 +``` + +### Convert from Milliseconds + +```js +ms(60000) // "1m" +ms(2 * 60000) // "2m" +ms(-3 * 60000) // "-3m" +ms(ms('10 hours')) // "10h" +``` + +### Time Format Written-Out + +```js +ms(60000, { long: true }) // "1 minute" +ms(2 * 60000, { long: true }) // "2 minutes" +ms(-3 * 60000, { long: true }) // "-3 minutes" +ms(ms('10 hours'), { long: true }) // "10 hours" +``` + +## Features + +- Works both in [Node.js](https://nodejs.org) and in the browser +- If a number is supplied to `ms`, a string with a unit is returned +- If a string that contains the number is supplied, it returns it as a number (e.g.: it returns `100` for `'100'`) +- If you pass a string with a number and a valid unit, the number of equivalent milliseconds is returned + +## Related Packages + +- [ms.macro](https://github.com/knpwrs/ms.macro) - Run `ms` as a macro at build-time. + +## Caught a Bug? + +1. 
[Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device +2. Link the package to the global module directory: `npm link` +3. Within the module you want to test your local development instance of ms, just link it to the dependencies: `npm link ms`. Instead of the default one from npm, Node.js will now use your clone of ms! + +As always, you can run the tests using: `npm test` diff --git a/node_modules/nodemon/package.json b/node_modules/nodemon/package.json new file mode 100644 index 000000000..cd191035b --- /dev/null +++ b/node_modules/nodemon/package.json @@ -0,0 +1,102 @@ +{ + "_from": "nodemon@^2.0.13", + "_id": "nodemon@2.0.13", + "_inBundle": false, + "_integrity": "sha512-UMXMpsZsv1UXUttCn6gv8eQPhn6DR4BW+txnL3IN5IHqrCwcrT/yWHfL35UsClGXknTH79r5xbu+6J1zNHuSyA==", + "_location": "/nodemon", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "nodemon@^2.0.13", + "name": "nodemon", + "escapedName": "nodemon", + "rawSpec": "^2.0.13", + "saveSpec": null, + "fetchSpec": "^2.0.13" + }, + "_requiredBy": [ + "/" + ], + "_resolved": "https://registry.npmjs.org/nodemon/-/nodemon-2.0.13.tgz", + "_shasum": "67d40d3a4d5bd840aa785c56587269cfcf5d24aa", + "_spec": "nodemon@^2.0.13", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project", + "author": { + "name": "Remy Sharp", + "url": "https://github.com/remy" + }, + "bin": { + "nodemon": "bin/nodemon.js" + }, + "bugs": { + "url": "https://github.com/remy/nodemon/issues" + }, + "bundleDependencies": false, + "dependencies": { + "chokidar": "^3.2.2", + "debug": "^3.2.6", + "ignore-by-default": "^1.0.1", + "minimatch": "^3.0.4", + "pstree.remy": "^1.1.7", + "semver": "^5.7.1", + "supports-color": "^5.5.0", + "touch": "^3.1.0", + "undefsafe": "^2.0.3", + "update-notifier": "^5.1.0" + }, + "deprecated": false, + "description": "Simple monitor script for use during development of a node.js app.", + "devDependencies": { + "@commitlint/cli": "^11.0.0", + "@commitlint/config-conventional": "^11.0.0", + "async": "1.4.2", + "coffee-script": "~1.7.1", + "eslint": "^7.11.0", + "husky": "^7.0.2", + "mocha": "^2.5.3", + "nyc": "^15.1.0", + "proxyquire": "^1.8.0", + "semantic-release": "^18.0.0", + "should": "~4.0.0" + }, + "engines": { + "node": ">=8.10.0" + }, + "funding": { + "type": "opencollective", + "url": "https://opencollective.com/nodemon" + }, + "homepage": "https://nodemon.io", + "keywords": [ + "monitor", + "development", + "restart", + "autoload", + "reload", + "terminal" + ], + "license": "MIT", + "main": "./lib/nodemon", + "name": "nodemon", + "repository": { + "type": "git", + "url": "git+https://github.com/remy/nodemon.git" + }, + "scripts": { + ":spec": "mocha --timeout 30000 --ui bdd test/**/*.test.js", + "clean": "rm -rf test/fixtures/test*.js test/fixtures/test*.md", + "commitmsg": "commitlint -e", + "coverage": "istanbul cover _mocha -- --timeout 30000 --ui bdd --reporter list test/**/*.test.js", + "killall": "ps auxww | grep node | grep -v grep | awk '{ print $2 }' | xargs kill -9", + "lint": "eslint lib/**/*.js", + "postinstall": "node bin/postinstall || exit 0", + "postspec": "npm run clean", + "prepush": "npm run lint", + "semantic-release": "semantic-release", + "spec": "for FILE in test/**/*.test.js; do echo $FILE; TEST=1 mocha --exit --timeout 30000 $FILE; if [ $? 
-ne 0 ]; then exit 1; fi; sleep 1; done", + "test": "npm run lint && npm run spec", + "web": "node web" + }, + "version": "2.0.13" +} diff --git a/node_modules/nopt/.npmignore b/node_modules/nopt/.npmignore new file mode 100644 index 000000000..e69de29bb diff --git a/node_modules/nopt/LICENSE b/node_modules/nopt/LICENSE new file mode 100644 index 000000000..05a401094 --- /dev/null +++ b/node_modules/nopt/LICENSE @@ -0,0 +1,23 @@ +Copyright 2009, 2010, 2011 Isaac Z. Schlueter. +All rights reserved. + +Permission is hereby granted, free of charge, to any person +obtaining a copy of this software and associated documentation +files (the "Software"), to deal in the Software without +restriction, including without limitation the rights to use, +copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the +Software is furnished to do so, subject to the following +conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES +OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT +HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, +WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING +FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR +OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/nopt/README.md b/node_modules/nopt/README.md new file mode 100644 index 000000000..eeddfd4fe --- /dev/null +++ b/node_modules/nopt/README.md @@ -0,0 +1,208 @@ +If you want to write an option parser, and have it be good, there are +two ways to do it. The Right Way, and the Wrong Way. + +The Wrong Way is to sit down and write an option parser. We've all done +that. + +The Right Way is to write some complex configurable program with so many +options that you go half-insane just trying to manage them all, and put +it off with duct-tape solutions until you see exactly to the core of the +problem, and finally snap and write an awesome option parser. + +If you want to write an option parser, don't write an option parser. +Write a package manager, or a source control system, or a service +restarter, or an operating system. You probably won't end up with a +good one of those, but if you don't give up, and you are relentless and +diligent enough in your procrastination, you may just end up with a very +nice option parser. + +## USAGE + + // my-program.js + var nopt = require("nopt") + , Stream = require("stream").Stream + , path = require("path") + , knownOpts = { "foo" : [String, null] + , "bar" : [Stream, Number] + , "baz" : path + , "bloo" : [ "big", "medium", "small" ] + , "flag" : Boolean + , "pick" : Boolean + , "many" : [String, Array] + } + , shortHands = { "foofoo" : ["--foo", "Mr. Foo"] + , "b7" : ["--bar", "7"] + , "m" : ["--bloo", "medium"] + , "p" : ["--pick"] + , "f" : ["--flag"] + } + // everything is optional. + // knownOpts and shorthands default to {} + // arg list defaults to process.argv + // slice defaults to 2 + , parsed = nopt(knownOpts, shortHands, process.argv, 2) + console.log(parsed) + +This would give you support for any of the following: + +```bash +$ node my-program.js --foo "blerp" --no-flag +{ "foo" : "blerp", "flag" : false } + +$ node my-program.js ---bar 7 --foo "Mr. Hand" --flag +{ bar: 7, foo: "Mr. 
Hand", flag: true } + +$ node my-program.js --foo "blerp" -f -----p +{ foo: "blerp", flag: true, pick: true } + +$ node my-program.js -fp --foofoo +{ foo: "Mr. Foo", flag: true, pick: true } + +$ node my-program.js --foofoo -- -fp # -- stops the flag parsing. +{ foo: "Mr. Foo", argv: { remain: ["-fp"] } } + +$ node my-program.js --blatzk 1000 -fp # unknown opts are ok. +{ blatzk: 1000, flag: true, pick: true } + +$ node my-program.js --blatzk true -fp # but they need a value +{ blatzk: true, flag: true, pick: true } + +$ node my-program.js --no-blatzk -fp # unless they start with "no-" +{ blatzk: false, flag: true, pick: true } + +$ node my-program.js --baz b/a/z # known paths are resolved. +{ baz: "/Users/isaacs/b/a/z" } + +# if Array is one of the types, then it can take many +# values, and will always be an array. The other types provided +# specify what types are allowed in the list. + +$ node my-program.js --many 1 --many null --many foo +{ many: ["1", "null", "foo"] } + +$ node my-program.js --many foo +{ many: ["foo"] } +``` + +Read the tests at the bottom of `lib/nopt.js` for more examples of +what this puppy can do. + +## Types + +The following types are supported, and defined on `nopt.typeDefs` + +* String: A normal string. No parsing is done. +* path: A file system path. Gets resolved against cwd if not absolute. +* url: A url. If it doesn't parse, it isn't accepted. +* Number: Must be numeric. +* Date: Must parse as a date. If it does, and `Date` is one of the options, + then it will return a Date object, not a string. +* Boolean: Must be either `true` or `false`. If an option is a boolean, + then it does not need a value, and its presence will imply `true` as + the value. To negate boolean flags, do `--no-whatever` or `--whatever + false` +* NaN: Means that the option is strictly not allowed. Any value will + fail. +* Stream: An object matching the "Stream" class in node. Valuable + for use when validating programmatically. (npm uses this to let you + supply any WriteStream on the `outfd` and `logfd` config options.) +* Array: If `Array` is specified as one of the types, then the value + will be parsed as a list of options. This means that multiple values + can be specified, and that the value will always be an array. + +If a type is an array of values not on this list, then those are +considered valid values. For instance, in the example above, the +`--bloo` option can only be one of `"big"`, `"medium"`, or `"small"`, +and any other value will be rejected. + +When parsing unknown fields, `"true"`, `"false"`, and `"null"` will be +interpreted as their JavaScript equivalents, and numeric values will be +interpreted as a number. + +You can also mix types and values, or multiple types, in a list. For +instance `{ blah: [Number, null] }` would allow a value to be set to +either a Number or null. + +To define a new type, add it to `nopt.typeDefs`. Each item in that +hash is an object with a `type` member and a `validate` method. The +`type` member is an object that matches what goes in the type list. The +`validate` method is a function that gets called with `validate(data, +key, val)`. Validate methods should assign `data[key]` to the valid +value of `val` if it can be handled properly, or return boolean +`false` if it cannot. + +You can also call `nopt.clean(data, types, typeDefs)` to clean up a +config object and remove its invalid properties. + +## Error Handling + +By default, nopt outputs a warning to standard error when invalid +options are found. 
You can change this behavior by assigning a method +to `nopt.invalidHandler`. This method will be called with +the offending `nopt.invalidHandler(key, val, types)`. + +If no `nopt.invalidHandler` is assigned, then it will console.error +its whining. If it is assigned to boolean `false` then the warning is +suppressed. + +## Abbreviations + +Yes, they are supported. If you define options like this: + +```javascript +{ "foolhardyelephants" : Boolean +, "pileofmonkeys" : Boolean } +``` + +Then this will work: + +```bash +node program.js --foolhar --pil +node program.js --no-f --pileofmon +# etc. +``` + +## Shorthands + +Shorthands are a hash of shorter option names to a snippet of args that +they expand to. + +If multiple one-character shorthands are all combined, and the +combination does not unambiguously match any other option or shorthand, +then they will be broken up into their constituent parts. For example: + +```json +{ "s" : ["--loglevel", "silent"] +, "g" : "--global" +, "f" : "--force" +, "p" : "--parseable" +, "l" : "--long" +} +``` + +```bash +npm ls -sgflp +# just like doing this: +npm ls --loglevel silent --global --force --long --parseable +``` + +## The Rest of the args + +The config object returned by nopt is given a special member called +`argv`, which is an object with the following fields: + +* `remain`: The remaining args after all the parsing has occurred. +* `original`: The args as they originally appeared. +* `cooked`: The args after flags and shorthands are expanded. + +## Slicing + +Node programs are called with more or less the exact argv as it appears +in C land, after the v8 and node-specific options have been plucked off. +As such, `argv[0]` is always `node` and `argv[1]` is always the +JavaScript program being run. + +That's usually not very useful to you. So they're sliced off by +default. If you want them, then you can pass in `0` as the last +argument, or any other number that you'd like to slice off the start of +the list. 
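The "Error Handling" section above describes assigning `nopt.invalidHandler` to intercept invalid values but gives no example, so here is a minimal sketch. The `port` option and the hard-coded argv array are hypothetical and used only for illustration; the hook itself and its `(key, val, type)` arguments come from the README text above and the `clean()` code in `lib/nopt.js` below.

```js
// Sketch only -- not part of the vendored package. "port" and the argv array
// are made up for illustration; nopt.invalidHandler is the hook documented
// in the Error Handling section above.
var nopt = require("nopt")
  , knownOpts = { "port" : Number }

nopt.invalidHandler = function (key, val, type) {
  console.error("ignoring --" + key + "=" + val + "; expected", type)
}

// "--port not-a-number" fails Number validation, so clean() drops the key
// and calls the handler instead of printing nopt's default warning.
var parsed = nopt(knownOpts, {}, ["node", "prog.js", "--port", "not-a-number"], 2)
console.log(parsed.port)   // undefined
```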
diff --git a/node_modules/nopt/bin/nopt.js b/node_modules/nopt/bin/nopt.js new file mode 100644 index 000000000..df90c729a --- /dev/null +++ b/node_modules/nopt/bin/nopt.js @@ -0,0 +1,44 @@ +#!/usr/bin/env node +var nopt = require("../lib/nopt") + , types = { num: Number + , bool: Boolean + , help: Boolean + , list: Array + , "num-list": [Number, Array] + , "str-list": [String, Array] + , "bool-list": [Boolean, Array] + , str: String } + , shorthands = { s: [ "--str", "astring" ] + , b: [ "--bool" ] + , nb: [ "--no-bool" ] + , tft: [ "--bool-list", "--no-bool-list", "--bool-list", "true" ] + , "?": ["--help"] + , h: ["--help"] + , H: ["--help"] + , n: [ "--num", "125" ] } + , parsed = nopt( types + , shorthands + , process.argv + , 2 ) + +console.log("parsed", parsed) + +if (parsed.help) { + console.log("") + console.log("nopt cli tester") + console.log("") + console.log("types") + console.log(Object.keys(types).map(function M (t) { + var type = types[t] + if (Array.isArray(type)) { + return [t, type.map(function (type) { return type.name })] + } + return [t, type && type.name] + }).reduce(function (s, i) { + s[i[0]] = i[1] + return s + }, {})) + console.log("") + console.log("shorthands") + console.log(shorthands) +} diff --git a/node_modules/nopt/examples/my-program.js b/node_modules/nopt/examples/my-program.js new file mode 100644 index 000000000..142447e18 --- /dev/null +++ b/node_modules/nopt/examples/my-program.js @@ -0,0 +1,30 @@ +#!/usr/bin/env node + +//process.env.DEBUG_NOPT = 1 + +// my-program.js +var nopt = require("../lib/nopt") + , Stream = require("stream").Stream + , path = require("path") + , knownOpts = { "foo" : [String, null] + , "bar" : [Stream, Number] + , "baz" : path + , "bloo" : [ "big", "medium", "small" ] + , "flag" : Boolean + , "pick" : Boolean + } + , shortHands = { "foofoo" : ["--foo", "Mr. Foo"] + , "b7" : ["--bar", "7"] + , "m" : ["--bloo", "medium"] + , "p" : ["--pick"] + , "f" : ["--flag", "true"] + , "g" : ["--flag"] + , "s" : "--flag" + } + // everything is optional. + // knownOpts and shorthands default to {} + // arg list defaults to process.argv + // slice defaults to 2 + , parsed = nopt(knownOpts, shortHands, process.argv, 2) + +console.log("parsed =\n"+ require("util").inspect(parsed)) diff --git a/node_modules/nopt/lib/nopt.js b/node_modules/nopt/lib/nopt.js new file mode 100644 index 000000000..ff802dafe --- /dev/null +++ b/node_modules/nopt/lib/nopt.js @@ -0,0 +1,552 @@ +// info about each config option. + +var debug = process.env.DEBUG_NOPT || process.env.NOPT_DEBUG + ? 
function () { console.error.apply(console, arguments) } + : function () {} + +var url = require("url") + , path = require("path") + , Stream = require("stream").Stream + , abbrev = require("abbrev") + +module.exports = exports = nopt +exports.clean = clean + +exports.typeDefs = + { String : { type: String, validate: validateString } + , Boolean : { type: Boolean, validate: validateBoolean } + , url : { type: url, validate: validateUrl } + , Number : { type: Number, validate: validateNumber } + , path : { type: path, validate: validatePath } + , Stream : { type: Stream, validate: validateStream } + , Date : { type: Date, validate: validateDate } + } + +function nopt (types, shorthands, args, slice) { + args = args || process.argv + types = types || {} + shorthands = shorthands || {} + if (typeof slice !== "number") slice = 2 + + debug(types, shorthands, args, slice) + + args = args.slice(slice) + var data = {} + , key + , remain = [] + , cooked = args + , original = args.slice(0) + + parse(args, data, remain, types, shorthands) + // now data is full + clean(data, types, exports.typeDefs) + data.argv = {remain:remain,cooked:cooked,original:original} + data.argv.toString = function () { + return this.original.map(JSON.stringify).join(" ") + } + return data +} + +function clean (data, types, typeDefs) { + typeDefs = typeDefs || exports.typeDefs + var remove = {} + , typeDefault = [false, true, null, String, Number] + + Object.keys(data).forEach(function (k) { + if (k === "argv") return + var val = data[k] + , isArray = Array.isArray(val) + , type = types[k] + if (!isArray) val = [val] + if (!type) type = typeDefault + if (type === Array) type = typeDefault.concat(Array) + if (!Array.isArray(type)) type = [type] + + debug("val=%j", val) + debug("types=", type) + val = val.map(function (val) { + // if it's an unknown value, then parse false/true/null/numbers/dates + if (typeof val === "string") { + debug("string %j", val) + val = val.trim() + if ((val === "null" && ~type.indexOf(null)) + || (val === "true" && + (~type.indexOf(true) || ~type.indexOf(Boolean))) + || (val === "false" && + (~type.indexOf(false) || ~type.indexOf(Boolean)))) { + val = JSON.parse(val) + debug("jsonable %j", val) + } else if (~type.indexOf(Number) && !isNaN(val)) { + debug("convert to number", val) + val = +val + } else if (~type.indexOf(Date) && !isNaN(Date.parse(val))) { + debug("convert to date", val) + val = new Date(val) + } + } + + if (!types.hasOwnProperty(k)) { + return val + } + + // allow `--no-blah` to set 'blah' to null if null is allowed + if (val === false && ~type.indexOf(null) && + !(~type.indexOf(false) || ~type.indexOf(Boolean))) { + val = null + } + + var d = {} + d[k] = val + debug("prevalidated val", d, val, types[k]) + if (!validate(d, k, val, types[k], typeDefs)) { + if (exports.invalidHandler) { + exports.invalidHandler(k, val, types[k], data) + } else if (exports.invalidHandler !== false) { + debug("invalid: "+k+"="+val, types[k]) + } + return remove + } + debug("validated val", d, val, types[k]) + return d[k] + }).filter(function (val) { return val !== remove }) + + if (!val.length) delete data[k] + else if (isArray) { + debug(isArray, data[k], val) + data[k] = val + } else data[k] = val[0] + + debug("k=%s val=%j", k, val, data[k]) + }) +} + +function validateString (data, k, val) { + data[k] = String(val) +} + +function validatePath (data, k, val) { + data[k] = path.resolve(String(val)) + return true +} + +function validateNumber (data, k, val) { + debug("validate Number %j %j %j", k, val, 
isNaN(val)) + if (isNaN(val)) return false + data[k] = +val +} + +function validateDate (data, k, val) { + debug("validate Date %j %j %j", k, val, Date.parse(val)) + var s = Date.parse(val) + if (isNaN(s)) return false + data[k] = new Date(val) +} + +function validateBoolean (data, k, val) { + if (val instanceof Boolean) val = val.valueOf() + else if (typeof val === "string") { + if (!isNaN(val)) val = !!(+val) + else if (val === "null" || val === "false") val = false + else val = true + } else val = !!val + data[k] = val +} + +function validateUrl (data, k, val) { + val = url.parse(String(val)) + if (!val.host) return false + data[k] = val.href +} + +function validateStream (data, k, val) { + if (!(val instanceof Stream)) return false + data[k] = val +} + +function validate (data, k, val, type, typeDefs) { + // arrays are lists of types. + if (Array.isArray(type)) { + for (var i = 0, l = type.length; i < l; i ++) { + if (type[i] === Array) continue + if (validate(data, k, val, type[i], typeDefs)) return true + } + delete data[k] + return false + } + + // an array of anything? + if (type === Array) return true + + // NaN is poisonous. Means that something is not allowed. + if (type !== type) { + debug("Poison NaN", k, val, type) + delete data[k] + return false + } + + // explicit list of values + if (val === type) { + debug("Explicitly allowed %j", val) + // if (isArray) (data[k] = data[k] || []).push(val) + // else data[k] = val + data[k] = val + return true + } + + // now go through the list of typeDefs, validate against each one. + var ok = false + , types = Object.keys(typeDefs) + for (var i = 0, l = types.length; i < l; i ++) { + debug("test type %j %j %j", k, val, types[i]) + var t = typeDefs[types[i]] + if (t && type === t.type) { + var d = {} + ok = false !== t.validate(d, k, val) + val = d[k] + if (ok) { + // if (isArray) (data[k] = data[k] || []).push(val) + // else data[k] = val + data[k] = val + break + } + } + } + debug("OK? %j (%j %j %j)", ok, k, val, types[i]) + + if (!ok) delete data[k] + return ok +} + +function parse (args, data, remain, types, shorthands) { + debug("parse", args, data, remain) + + var key = null + , abbrevs = abbrev(Object.keys(types)) + , shortAbbr = abbrev(Object.keys(shorthands)) + + for (var i = 0; i < args.length; i ++) { + var arg = args[i] + debug("arg", arg) + + if (arg.match(/^-{2,}$/)) { + // done with keys. + // the rest are args. + remain.push.apply(remain, args.slice(i + 1)) + args[i] = "--" + break + } + if (arg.charAt(0) === "-") { + if (arg.indexOf("=") !== -1) { + var v = arg.split("=") + arg = v.shift() + v = v.join("=") + args.splice.apply(args, [i, 1].concat([arg, v])) + } + // see if it's a shorthand + // if so, splice and back up to re-parse it. 
+ var shRes = resolveShort(arg, shorthands, shortAbbr, abbrevs) + debug("arg=%j shRes=%j", arg, shRes) + if (shRes) { + debug(arg, shRes) + args.splice.apply(args, [i, 1].concat(shRes)) + if (arg !== shRes[0]) { + i -- + continue + } + } + arg = arg.replace(/^-+/, "") + var no = false + while (arg.toLowerCase().indexOf("no-") === 0) { + no = !no + arg = arg.substr(3) + } + + if (abbrevs[arg]) arg = abbrevs[arg] + + var isArray = types[arg] === Array || + Array.isArray(types[arg]) && types[arg].indexOf(Array) !== -1 + + var val + , la = args[i + 1] + + var isBool = no || + types[arg] === Boolean || + Array.isArray(types[arg]) && types[arg].indexOf(Boolean) !== -1 || + (la === "false" && + (types[arg] === null || + Array.isArray(types[arg]) && ~types[arg].indexOf(null))) + + if (isBool) { + // just set and move along + val = !no + // however, also support --bool true or --bool false + if (la === "true" || la === "false") { + val = JSON.parse(la) + la = null + if (no) val = !val + i ++ + } + + // also support "foo":[Boolean, "bar"] and "--foo bar" + if (Array.isArray(types[arg]) && la) { + if (~types[arg].indexOf(la)) { + // an explicit type + val = la + i ++ + } else if ( la === "null" && ~types[arg].indexOf(null) ) { + // null allowed + val = null + i ++ + } else if ( !la.match(/^-{2,}[^-]/) && + !isNaN(la) && + ~types[arg].indexOf(Number) ) { + // number + val = +la + i ++ + } else if ( !la.match(/^-[^-]/) && ~types[arg].indexOf(String) ) { + // string + val = la + i ++ + } + } + + if (isArray) (data[arg] = data[arg] || []).push(val) + else data[arg] = val + + continue + } + + if (la && la.match(/^-{2,}$/)) { + la = undefined + i -- + } + + val = la === undefined ? true : la + if (isArray) (data[arg] = data[arg] || []).push(val) + else data[arg] = val + + i ++ + continue + } + remain.push(arg) + } +} + +function resolveShort (arg, shorthands, shortAbbr, abbrevs) { + // handle single-char shorthands glommed together, like + // npm ls -glp, but only if there is one dash, and only if + // all of the chars are single-char shorthands, and it's + // not a match to some other abbrev. + arg = arg.replace(/^-+/, '') + if (abbrevs[arg] && !shorthands[arg]) { + return null + } + if (shortAbbr[arg]) { + arg = shortAbbr[arg] + } else { + var singles = shorthands.___singles + if (!singles) { + singles = Object.keys(shorthands).filter(function (s) { + return s.length === 1 + }).reduce(function (l,r) { l[r] = true ; return l }, {}) + shorthands.___singles = singles + } + var chrs = arg.split("").filter(function (c) { + return singles[c] + }) + if (chrs.join("") === arg) return chrs.map(function (c) { + return shorthands[c] + }).reduce(function (l, r) { + return l.concat(r) + }, []) + } + + if (shorthands[arg] && !Array.isArray(shorthands[arg])) { + shorthands[arg] = shorthands[arg].split(/\s+/) + } + return shorthands[arg] +} + +if (module === require.main) { +var assert = require("assert") + , util = require("util") + + , shorthands = + { s : ["--loglevel", "silent"] + , d : ["--loglevel", "info"] + , dd : ["--loglevel", "verbose"] + , ddd : ["--loglevel", "silly"] + , noreg : ["--no-registry"] + , reg : ["--registry"] + , "no-reg" : ["--no-registry"] + , silent : ["--loglevel", "silent"] + , verbose : ["--loglevel", "verbose"] + , h : ["--usage"] + , H : ["--usage"] + , "?" 
: ["--usage"] + , help : ["--usage"] + , v : ["--version"] + , f : ["--force"] + , desc : ["--description"] + , "no-desc" : ["--no-description"] + , "local" : ["--no-global"] + , l : ["--long"] + , p : ["--parseable"] + , porcelain : ["--parseable"] + , g : ["--global"] + } + + , types = + { aoa: Array + , nullstream: [null, Stream] + , date: Date + , str: String + , browser : String + , cache : path + , color : ["always", Boolean] + , depth : Number + , description : Boolean + , dev : Boolean + , editor : path + , force : Boolean + , global : Boolean + , globalconfig : path + , group : [String, Number] + , gzipbin : String + , logfd : [Number, Stream] + , loglevel : ["silent","win","error","warn","info","verbose","silly"] + , long : Boolean + , "node-version" : [false, String] + , npaturl : url + , npat : Boolean + , "onload-script" : [false, String] + , outfd : [Number, Stream] + , parseable : Boolean + , pre: Boolean + , prefix: path + , proxy : url + , "rebuild-bundle" : Boolean + , registry : url + , searchopts : String + , searchexclude: [null, String] + , shell : path + , t: [Array, String] + , tag : String + , tar : String + , tmp : path + , "unsafe-perm" : Boolean + , usage : Boolean + , user : String + , username : String + , userconfig : path + , version : Boolean + , viewer: path + , _exit : Boolean + } + +; [["-v", {version:true}, []] + ,["---v", {version:true}, []] + ,["ls -s --no-reg connect -d", + {loglevel:"info",registry:null},["ls","connect"]] + ,["ls ---s foo",{loglevel:"silent"},["ls","foo"]] + ,["ls --registry blargle", {}, ["ls"]] + ,["--no-registry", {registry:null}, []] + ,["--no-color true", {color:false}, []] + ,["--no-color false", {color:true}, []] + ,["--no-color", {color:false}, []] + ,["--color false", {color:false}, []] + ,["--color --logfd 7", {logfd:7,color:true}, []] + ,["--color=true", {color:true}, []] + ,["--logfd=10", {logfd:10}, []] + ,["--tmp=/tmp -tar=gtar",{tmp:"/tmp",tar:"gtar"},[]] + ,["--tmp=tmp -tar=gtar", + {tmp:path.resolve(process.cwd(), "tmp"),tar:"gtar"},[]] + ,["--logfd x", {}, []] + ,["a -true -- -no-false", {true:true},["a","-no-false"]] + ,["a -no-false", {false:false},["a"]] + ,["a -no-no-true", {true:true}, ["a"]] + ,["a -no-no-no-false", {false:false}, ["a"]] + ,["---NO-no-No-no-no-no-nO-no-no"+ + "-No-no-no-no-no-no-no-no-no"+ + "-no-no-no-no-NO-NO-no-no-no-no-no-no"+ + "-no-body-can-do-the-boogaloo-like-I-do" + ,{"body-can-do-the-boogaloo-like-I-do":false}, []] + ,["we are -no-strangers-to-love "+ + "--you-know the-rules --and so-do-i "+ + "---im-thinking-of=a-full-commitment "+ + "--no-you-would-get-this-from-any-other-guy "+ + "--no-gonna-give-you-up "+ + "-no-gonna-let-you-down=true "+ + "--no-no-gonna-run-around false "+ + "--desert-you=false "+ + "--make-you-cry false "+ + "--no-tell-a-lie "+ + "--no-no-and-hurt-you false" + ,{"strangers-to-love":false + ,"you-know":"the-rules" + ,"and":"so-do-i" + ,"you-would-get-this-from-any-other-guy":false + ,"gonna-give-you-up":false + ,"gonna-let-you-down":false + ,"gonna-run-around":false + ,"desert-you":false + ,"make-you-cry":false + ,"tell-a-lie":false + ,"and-hurt-you":false + },["we", "are"]] + ,["-t one -t two -t three" + ,{t: ["one", "two", "three"]} + ,[]] + ,["-t one -t null -t three four five null" + ,{t: ["one", "null", "three"]} + ,["four", "five", "null"]] + ,["-t foo" + ,{t:["foo"]} + ,[]] + ,["--no-t" + ,{t:["false"]} + ,[]] + ,["-no-no-t" + ,{t:["true"]} + ,[]] + ,["-aoa one -aoa null -aoa 100" + ,{aoa:["one", null, 100]} + ,[]] + ,["-str 100" + ,{str:"100"} + ,[]] + 
,["--color always" + ,{color:"always"} + ,[]] + ,["--no-nullstream" + ,{nullstream:null} + ,[]] + ,["--nullstream false" + ,{nullstream:null} + ,[]] + ,["--notadate 2011-01-25" + ,{notadate: "2011-01-25"} + ,[]] + ,["--date 2011-01-25" + ,{date: new Date("2011-01-25")} + ,[]] + ].forEach(function (test) { + var argv = test[0].split(/\s+/) + , opts = test[1] + , rem = test[2] + , actual = nopt(types, shorthands, argv, 0) + , parsed = actual.argv + delete actual.argv + console.log(util.inspect(actual, false, 2, true), parsed.remain) + for (var i in opts) { + var e = JSON.stringify(opts[i]) + , a = JSON.stringify(actual[i] === undefined ? null : actual[i]) + if (e && typeof e === "object") { + assert.deepEqual(e, a) + } else { + assert.equal(e, a) + } + } + assert.deepEqual(rem, parsed.remain) + }) +} diff --git a/node_modules/nopt/package.json b/node_modules/nopt/package.json new file mode 100644 index 000000000..efe9d1266 --- /dev/null +++ b/node_modules/nopt/package.json @@ -0,0 +1,60 @@ +{ + "_from": "nopt@~1.0.10", + "_id": "nopt@1.0.10", + "_inBundle": false, + "_integrity": "sha1-bd0hvSoxQXuScn3Vhfim83YI6+4=", + "_location": "/nopt", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "nopt@~1.0.10", + "name": "nopt", + "escapedName": "nopt", + "rawSpec": "~1.0.10", + "saveSpec": null, + "fetchSpec": "~1.0.10" + }, + "_requiredBy": [ + "/touch" + ], + "_resolved": "https://registry.npmjs.org/nopt/-/nopt-1.0.10.tgz", + "_shasum": "6ddd21bd2a31417b92727dd585f8a6f37608ebee", + "_spec": "nopt@~1.0.10", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\touch", + "author": { + "name": "Isaac Z. Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me/" + }, + "bin": { + "nopt": "bin/nopt.js" + }, + "bugs": { + "url": "https://github.com/isaacs/nopt/issues" + }, + "bundleDependencies": false, + "dependencies": { + "abbrev": "1" + }, + "deprecated": false, + "description": "Option parsing for Node, supporting types, shorthands, etc. Used by npm.", + "engines": { + "node": "*" + }, + "homepage": "https://github.com/isaacs/nopt#readme", + "license": { + "type": "MIT", + "url": "https://github.com/isaacs/nopt/raw/master/LICENSE" + }, + "main": "lib/nopt.js", + "name": "nopt", + "repository": { + "type": "git", + "url": "git+ssh://git@github.com/isaacs/nopt.git" + }, + "scripts": { + "test": "node lib/nopt.js" + }, + "version": "1.0.10" +} diff --git a/node_modules/normalize-path/LICENSE b/node_modules/normalize-path/LICENSE new file mode 100644 index 000000000..d32ab4426 --- /dev/null +++ b/node_modules/normalize-path/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014-2018, Jon Schlinkert. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/normalize-path/README.md b/node_modules/normalize-path/README.md new file mode 100644 index 000000000..726d4d689 --- /dev/null +++ b/node_modules/normalize-path/README.md @@ -0,0 +1,127 @@ +# normalize-path [![NPM version](https://img.shields.io/npm/v/normalize-path.svg?style=flat)](https://www.npmjs.com/package/normalize-path) [![NPM monthly downloads](https://img.shields.io/npm/dm/normalize-path.svg?style=flat)](https://npmjs.org/package/normalize-path) [![NPM total downloads](https://img.shields.io/npm/dt/normalize-path.svg?style=flat)](https://npmjs.org/package/normalize-path) [![Linux Build Status](https://img.shields.io/travis/jonschlinkert/normalize-path.svg?style=flat&label=Travis)](https://travis-ci.org/jonschlinkert/normalize-path) + +> Normalize slashes in a file path to be posix/unix-like forward slashes. Also condenses repeat slashes to a single slash and removes and trailing slashes, unless disabled. + +Please consider following this project's author, [Jon Schlinkert](https://github.com/jonschlinkert), and consider starring the project to show your :heart: and support. + +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +$ npm install --save normalize-path +``` + +## Usage + +```js +const normalize = require('normalize-path'); + +console.log(normalize('\\foo\\bar\\baz\\')); +//=> '/foo/bar/baz' +``` + +**win32 namespaces** + +```js +console.log(normalize('\\\\?\\UNC\\Server01\\user\\docs\\Letter.txt')); +//=> '//?/UNC/Server01/user/docs/Letter.txt' + +console.log(normalize('\\\\.\\CdRomX')); +//=> '//./CdRomX' +``` + +**Consecutive slashes** + +Condenses multiple consecutive forward slashes (except for leading slashes in win32 namespaces) to a single slash. + +```js +console.log(normalize('.//foo//bar///////baz/')); +//=> './foo/bar/baz' +``` + +### Trailing slashes + +By default trailing slashes are removed. Pass `false` as the last argument to disable this behavior and _**keep** trailing slashes_: + +```js +console.log(normalize('foo\\bar\\baz\\', false)); //=> 'foo/bar/baz/' +console.log(normalize('./foo/bar/baz/', false)); //=> './foo/bar/baz/' +``` + +## Release history + +### v3.0 + +No breaking changes in this release. + +* a check was added to ensure that [win32 namespaces](https://msdn.microsoft.com/library/windows/desktop/aa365247(v=vs.85).aspx#namespaces) are handled properly by win32 `path.parse()` after a path has been normalized by this library. +* a minor optimization was made to simplify how the trailing separator was handled + +## About + +
+### Contributing + +Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new).
+
+### Running Tests + +Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command: + +```sh +$ npm install && npm test +```
+
+### Building docs + +_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_ + +To generate the readme, run the following command: + +```sh +$ npm install -g verbose/verb#dev verb-generate-readme && verb +```
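For readers comparing this with Node's built-in `path` module, here is a brief illustrative sketch (not part of the package; the input string is made up) showing why the separator conversion matters: `path.posix.normalize` leaves backslashes untouched, whereas `normalize-path` converts them and condenses repeats.

```js
// Illustrative comparison; assumes `npm install normalize-path` has been run.
const path = require('path');
const normalize = require('normalize-path');

const input = 'src\\components\\\\Button\\'; // i.e. src\components\\Button\

console.log(normalize(input));        //=> 'src/components/Button'
console.log(normalize(input, false)); //=> 'src/components/Button/'

// The built-in posix normalizer treats backslashes as ordinary characters:
console.log(path.posix.normalize(input)); // backslashes are kept as-is
```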
+ +### Related projects + +Other useful path-related libraries: + +* [contains-path](https://www.npmjs.com/package/contains-path): Return true if a file path contains the given path. | [homepage](https://github.com/jonschlinkert/contains-path "Return true if a file path contains the given path.") +* [is-absolute](https://www.npmjs.com/package/is-absolute): Returns true if a file path is absolute. Does not rely on the path module… [more](https://github.com/jonschlinkert/is-absolute) | [homepage](https://github.com/jonschlinkert/is-absolute "Returns true if a file path is absolute. Does not rely on the path module and can be used as a polyfill for node.js native `path.isAbolute`.") +* [is-relative](https://www.npmjs.com/package/is-relative): Returns `true` if the path appears to be relative. | [homepage](https://github.com/jonschlinkert/is-relative "Returns `true` if the path appears to be relative.") +* [parse-filepath](https://www.npmjs.com/package/parse-filepath): Pollyfill for node.js `path.parse`, parses a filepath into an object. | [homepage](https://github.com/jonschlinkert/parse-filepath "Pollyfill for node.js `path.parse`, parses a filepath into an object.") +* [path-ends-with](https://www.npmjs.com/package/path-ends-with): Return `true` if a file path ends with the given string/suffix. | [homepage](https://github.com/jonschlinkert/path-ends-with "Return `true` if a file path ends with the given string/suffix.") +* [unixify](https://www.npmjs.com/package/unixify): Convert Windows file paths to unix paths. | [homepage](https://github.com/jonschlinkert/unixify "Convert Windows file paths to unix paths.") + +### Contributors + +| **Commits** | **Contributor** | +| --- | --- | +| 35 | [jonschlinkert](https://github.com/jonschlinkert) | +| 1 | [phated](https://github.com/phated) | + +### Author + +**Jon Schlinkert** + +* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert) +* [GitHub Profile](https://github.com/jonschlinkert) +* [Twitter Profile](https://twitter.com/jonschlinkert) + +### License + +Copyright © 2018, [Jon Schlinkert](https://github.com/jonschlinkert). +Released under the [MIT License](LICENSE). + +*** + +_This file was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme), v0.6.0, on April 19, 2018._ \ No newline at end of file diff --git a/node_modules/normalize-path/index.js b/node_modules/normalize-path/index.js new file mode 100644 index 000000000..6fac553a3 --- /dev/null +++ b/node_modules/normalize-path/index.js @@ -0,0 +1,35 @@ +/*! + * normalize-path + * + * Copyright (c) 2014-2018, Jon Schlinkert. + * Released under the MIT License. + */ + +module.exports = function(path, stripTrailing) { + if (typeof path !== 'string') { + throw new TypeError('expected path to be a string'); + } + + if (path === '\\' || path === '/') return '/'; + + var len = path.length; + if (len <= 1) return path; + + // ensure that win32 namespaces has two leading slashes, so that the path is + // handled properly by the win32 version of path.parse() after being normalized + // https://msdn.microsoft.com/library/windows/desktop/aa365247(v=vs.85).aspx#namespaces + var prefix = ''; + if (len > 4 && path[3] === '\\') { + var ch = path[2]; + if ((ch === '?' 
|| ch === '.') && path.slice(0, 2) === '\\\\') { + path = path.slice(2); + prefix = '//'; + } + } + + var segs = path.split(/[/\\]+/); + if (stripTrailing !== false && segs[segs.length - 1] === '') { + segs.pop(); + } + return prefix + segs.join('/'); +}; diff --git a/node_modules/normalize-path/package.json b/node_modules/normalize-path/package.json new file mode 100644 index 000000000..6a038ac7a --- /dev/null +++ b/node_modules/normalize-path/package.json @@ -0,0 +1,115 @@ +{ + "_from": "normalize-path@~3.0.0", + "_id": "normalize-path@3.0.0", + "_inBundle": false, + "_integrity": "sha512-6eZs5Ls3WtCisHWp9S2GUy8dqkpGi4BVSz3GaqiE6ezub0512ESztXUwUB6C6IKbQkY2Pnb/mD4WYojCRwcwLA==", + "_location": "/normalize-path", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "normalize-path@~3.0.0", + "name": "normalize-path", + "escapedName": "normalize-path", + "rawSpec": "~3.0.0", + "saveSpec": null, + "fetchSpec": "~3.0.0" + }, + "_requiredBy": [ + "/anymatch", + "/chokidar" + ], + "_resolved": "https://registry.npmjs.org/normalize-path/-/normalize-path-3.0.0.tgz", + "_shasum": "0dcd69ff23a1c9b11fd0978316644a0388216a65", + "_spec": "normalize-path@~3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\chokidar", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/jonschlinkert/normalize-path/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Blaine Bublitz", + "url": "https://twitter.com/BlaineBublitz" + }, + { + "name": "Jon Schlinkert", + "url": "http://twitter.com/jonschlinkert" + } + ], + "deprecated": false, + "description": "Normalize slashes in a file path to be posix/unix-like forward slashes. Also condenses repeat slashes to a single slash and removes and trailing slashes, unless disabled.", + "devDependencies": { + "gulp-format-md": "^1.0.0", + "minimist": "^1.2.0", + "mocha": "^3.5.3" + }, + "engines": { + "node": ">=0.10.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/jonschlinkert/normalize-path", + "keywords": [ + "absolute", + "backslash", + "delimiter", + "file", + "file-path", + "filepath", + "fix", + "forward", + "fp", + "fs", + "normalize", + "path", + "relative", + "separator", + "slash", + "slashes", + "trailing", + "unix", + "urix" + ], + "license": "MIT", + "main": "index.js", + "name": "normalize-path", + "repository": { + "type": "git", + "url": "git+https://github.com/jonschlinkert/normalize-path.git" + }, + "scripts": { + "test": "mocha" + }, + "verb": { + "toc": false, + "layout": "default", + "tasks": [ + "readme" + ], + "plugins": [ + "gulp-format-md" + ], + "related": { + "description": "Other useful path-related libraries:", + "list": [ + "contains-path", + "is-absolute", + "is-relative", + "parse-filepath", + "path-ends-with", + "path-ends-with", + "unixify" + ] + }, + "lint": { + "reflinks": true + } + }, + "version": "3.0.0" +} diff --git a/node_modules/normalize-url/index.d.ts b/node_modules/normalize-url/index.d.ts new file mode 100644 index 000000000..7e332f2f6 --- /dev/null +++ b/node_modules/normalize-url/index.d.ts @@ -0,0 +1,216 @@ +declare namespace normalizeUrl { + interface Options { + /** + @default 'http:' + */ + readonly defaultProtocol?: string; + + /** + Prepends `defaultProtocol` to the URL if it's protocol-relative. 
+ + @default true + + @example + ``` + normalizeUrl('//sindresorhus.com:80/'); + //=> 'http://sindresorhus.com' + + normalizeUrl('//sindresorhus.com:80/', {normalizeProtocol: false}); + //=> '//sindresorhus.com' + ``` + */ + readonly normalizeProtocol?: boolean; + + /** + Normalizes `https:` URLs to `http:`. + + @default false + + @example + ``` + normalizeUrl('https://sindresorhus.com:80/'); + //=> 'https://sindresorhus.com' + + normalizeUrl('https://sindresorhus.com:80/', {forceHttp: true}); + //=> 'http://sindresorhus.com' + ``` + */ + readonly forceHttp?: boolean; + + /** + Normalizes `http:` URLs to `https:`. + + This option can't be used with the `forceHttp` option at the same time. + + @default false + + @example + ``` + normalizeUrl('https://sindresorhus.com:80/'); + //=> 'https://sindresorhus.com' + + normalizeUrl('http://sindresorhus.com:80/', {forceHttps: true}); + //=> 'https://sindresorhus.com' + ``` + */ + readonly forceHttps?: boolean; + + /** + Strip the [authentication](https://en.wikipedia.org/wiki/Basic_access_authentication) part of a URL. + + @default true + + @example + ``` + normalizeUrl('user:password@sindresorhus.com'); + //=> 'https://sindresorhus.com' + + normalizeUrl('user:password@sindresorhus.com', {stripAuthentication: false}); + //=> 'https://user:password@sindresorhus.com' + ``` + */ + readonly stripAuthentication?: boolean; + + /** + Removes hash from the URL. + + @default false + + @example + ``` + normalizeUrl('sindresorhus.com/about.html#contact'); + //=> 'http://sindresorhus.com/about.html#contact' + + normalizeUrl('sindresorhus.com/about.html#contact', {stripHash: true}); + //=> 'http://sindresorhus.com/about.html' + ``` + */ + readonly stripHash?: boolean; + + /** + Removes HTTP(S) protocol from an URL `http://sindresorhus.com` → `sindresorhus.com`. + + @default false + + @example + ``` + normalizeUrl('https://sindresorhus.com'); + //=> 'https://sindresorhus.com' + + normalizeUrl('sindresorhus.com', {stripProtocol: true}); + //=> 'sindresorhus.com' + ``` + */ + readonly stripProtocol?: boolean; + + /** + Removes `www.` from the URL. + + @default true + + @example + ``` + normalizeUrl('http://www.sindresorhus.com'); + //=> 'http://sindresorhus.com' + + normalizeUrl('http://www.sindresorhus.com', {stripWWW: false}); + //=> 'http://www.sindresorhus.com' + ``` + */ + readonly stripWWW?: boolean; + + /** + Removes query parameters that matches any of the provided strings or regexes. + + @default [/^utm_\w+/i] + + @example + ``` + normalizeUrl('www.sindresorhus.com?foo=bar&ref=test_ref', { + removeQueryParameters: ['ref'] + }); + //=> 'http://sindresorhus.com/?foo=bar' + ``` + */ + readonly removeQueryParameters?: ReadonlyArray; + + /** + Removes trailing slash. + + __Note__: Trailing slash is always removed if the URL doesn't have a pathname. + + @default true + + @example + ``` + normalizeUrl('http://sindresorhus.com/redirect/'); + //=> 'http://sindresorhus.com/redirect' + + normalizeUrl('http://sindresorhus.com/redirect/', {removeTrailingSlash: false}); + //=> 'http://sindresorhus.com/redirect/' + + normalizeUrl('http://sindresorhus.com/', {removeTrailingSlash: false}); + //=> 'http://sindresorhus.com' + ``` + */ + readonly removeTrailingSlash?: boolean; + + /** + Removes the default directory index file from path that matches any of the provided strings or regexes. + When `true`, the regex `/^index\.[a-z]+$/` is used. 
+ + @default false + + @example + ``` + normalizeUrl('www.sindresorhus.com/foo/default.php', { + removeDirectoryIndex: [/^default\.[a-z]+$/] + }); + //=> 'http://sindresorhus.com/foo' + ``` + */ + readonly removeDirectoryIndex?: ReadonlyArray; + + /** + Sorts the query parameters alphabetically by key. + + @default true + + @example + ``` + normalizeUrl('www.sindresorhus.com?b=two&a=one&c=three', { + sortQueryParameters: false + }); + //=> 'http://sindresorhus.com/?b=two&a=one&c=three' + ``` + */ + readonly sortQueryParameters?: boolean; + } +} + +declare const normalizeUrl: { + /** + [Normalize](https://en.wikipedia.org/wiki/URL_normalization) a URL. + + @param url - URL to normalize, including [data URL](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs). + + @example + ``` + import normalizeUrl = require('normalize-url'); + + normalizeUrl('sindresorhus.com'); + //=> 'http://sindresorhus.com' + + normalizeUrl('HTTP://xn--xample-hva.com:80/?b=bar&a=foo'); + //=> 'http://êxample.com/?a=foo&b=bar' + ``` + */ + (url: string, options?: normalizeUrl.Options): string; + + // TODO: Remove this for the next major release, refactor the whole definition to: + // declare function normalizeUrl(url: string, options?: normalizeUrl.Options): string; + // export = normalizeUrl; + default: typeof normalizeUrl; +}; + +export = normalizeUrl; diff --git a/node_modules/normalize-url/index.js b/node_modules/normalize-url/index.js new file mode 100644 index 000000000..2ab7f5714 --- /dev/null +++ b/node_modules/normalize-url/index.js @@ -0,0 +1,221 @@ +'use strict'; +// TODO: Use the `URL` global when targeting Node.js 10 +const URLParser = typeof URL === 'undefined' ? require('url').URL : URL; + +// https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs +const DATA_URL_DEFAULT_MIME_TYPE = 'text/plain'; +const DATA_URL_DEFAULT_CHARSET = 'us-ascii'; + +const testParameter = (name, filters) => { + return filters.some(filter => filter instanceof RegExp ? filter.test(name) : filter === name); +}; + +const normalizeDataURL = (urlString, {stripHash}) => { + const parts = urlString.match(/^data:([^,]*?),([^#]*?)(?:#(.*))?$/); + + if (!parts) { + throw new Error(`Invalid URL: ${urlString}`); + } + + const mediaType = parts[1].split(';'); + const body = parts[2]; + const hash = stripHash ? '' : parts[3]; + + let base64 = false; + + if (mediaType[mediaType.length - 1] === 'base64') { + mediaType.pop(); + base64 = true; + } + + // Lowercase MIME type + const mimeType = (mediaType.shift() || '').toLowerCase(); + const attributes = mediaType + .map(attribute => { + let [key, value = ''] = attribute.split('=').map(string => string.trim()); + + // Lowercase `charset` + if (key === 'charset') { + value = value.toLowerCase(); + + if (value === DATA_URL_DEFAULT_CHARSET) { + return ''; + } + } + + return `${key}${value ? `=${value}` : ''}`; + }) + .filter(Boolean); + + const normalizedMediaType = [ + ...attributes + ]; + + if (base64) { + normalizedMediaType.push('base64'); + } + + if (normalizedMediaType.length !== 0 || (mimeType && mimeType !== DATA_URL_DEFAULT_MIME_TYPE)) { + normalizedMediaType.unshift(mimeType); + } + + return `data:${normalizedMediaType.join(';')},${base64 ? body.trim() : body}${hash ? 
`#${hash}` : ''}`; +}; + +const normalizeUrl = (urlString, options) => { + options = { + defaultProtocol: 'http:', + normalizeProtocol: true, + forceHttp: false, + forceHttps: false, + stripAuthentication: true, + stripHash: false, + stripWWW: true, + removeQueryParameters: [/^utm_\w+/i], + removeTrailingSlash: true, + removeDirectoryIndex: false, + sortQueryParameters: true, + ...options + }; + + // TODO: Remove this at some point in the future + if (Reflect.has(options, 'normalizeHttps')) { + throw new Error('options.normalizeHttps is renamed to options.forceHttp'); + } + + if (Reflect.has(options, 'normalizeHttp')) { + throw new Error('options.normalizeHttp is renamed to options.forceHttps'); + } + + if (Reflect.has(options, 'stripFragment')) { + throw new Error('options.stripFragment is renamed to options.stripHash'); + } + + urlString = urlString.trim(); + + // Data URL + if (/^data:/i.test(urlString)) { + return normalizeDataURL(urlString, options); + } + + const hasRelativeProtocol = urlString.startsWith('//'); + const isRelativeUrl = !hasRelativeProtocol && /^\.*\//.test(urlString); + + // Prepend protocol + if (!isRelativeUrl) { + urlString = urlString.replace(/^(?!(?:\w+:)?\/\/)|^\/\//, options.defaultProtocol); + } + + const urlObj = new URLParser(urlString); + + if (options.forceHttp && options.forceHttps) { + throw new Error('The `forceHttp` and `forceHttps` options cannot be used together'); + } + + if (options.forceHttp && urlObj.protocol === 'https:') { + urlObj.protocol = 'http:'; + } + + if (options.forceHttps && urlObj.protocol === 'http:') { + urlObj.protocol = 'https:'; + } + + // Remove auth + if (options.stripAuthentication) { + urlObj.username = ''; + urlObj.password = ''; + } + + // Remove hash + if (options.stripHash) { + urlObj.hash = ''; + } + + // Remove duplicate slashes if not preceded by a protocol + if (urlObj.pathname) { + // TODO: Use the following instead when targeting Node.js 10 + // `urlObj.pathname = urlObj.pathname.replace(/(? { + if (/^(?!\/)/g.test(p1)) { + return `${p1}/`; + } + + return '/'; + }); + } + + // Decode URI octets + if (urlObj.pathname) { + urlObj.pathname = decodeURI(urlObj.pathname); + } + + // Remove directory index + if (options.removeDirectoryIndex === true) { + options.removeDirectoryIndex = [/^index\.[a-z]+$/]; + } + + if (Array.isArray(options.removeDirectoryIndex) && options.removeDirectoryIndex.length > 0) { + let pathComponents = urlObj.pathname.split('/'); + const lastComponent = pathComponents[pathComponents.length - 1]; + + if (testParameter(lastComponent, options.removeDirectoryIndex)) { + pathComponents = pathComponents.slice(0, pathComponents.length - 1); + urlObj.pathname = pathComponents.slice(1).join('/') + '/'; + } + } + + if (urlObj.hostname) { + // Remove trailing dot + urlObj.hostname = urlObj.hostname.replace(/\.$/, ''); + + // Remove `www.` + if (options.stripWWW && /^www\.([a-z\-\d]{2,63})\.([a-z.]{2,5})$/.test(urlObj.hostname)) { + // Each label should be max 63 at length (min: 2). + // The extension should be max 5 at length (min: 2). 
+ // Source: https://en.wikipedia.org/wiki/Hostname#Restrictions_on_valid_host_names + urlObj.hostname = urlObj.hostname.replace(/^www\./, ''); + } + } + + // Remove query unwanted parameters + if (Array.isArray(options.removeQueryParameters)) { + for (const key of [...urlObj.searchParams.keys()]) { + if (testParameter(key, options.removeQueryParameters)) { + urlObj.searchParams.delete(key); + } + } + } + + // Sort query parameters + if (options.sortQueryParameters) { + urlObj.searchParams.sort(); + } + + if (options.removeTrailingSlash) { + urlObj.pathname = urlObj.pathname.replace(/\/$/, ''); + } + + // Take advantage of many of the Node `url` normalizations + urlString = urlObj.toString(); + + // Remove ending `/` + if ((options.removeTrailingSlash || urlObj.pathname === '/') && urlObj.hash === '') { + urlString = urlString.replace(/\/$/, ''); + } + + // Restore relative protocol, if applicable + if (hasRelativeProtocol && !options.normalizeProtocol) { + urlString = urlString.replace(/^http:\/\//, '//'); + } + + // Remove http/https + if (options.stripProtocol) { + urlString = urlString.replace(/^(?:https?:)?\/\//, ''); + } + + return urlString; +}; + +module.exports = normalizeUrl; +// TODO: Remove this for the next major release +module.exports.default = normalizeUrl; diff --git a/node_modules/normalize-url/license b/node_modules/normalize-url/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/normalize-url/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/normalize-url/package.json b/node_modules/normalize-url/package.json new file mode 100644 index 000000000..70b680fd7 --- /dev/null +++ b/node_modules/normalize-url/package.json @@ -0,0 +1,76 @@ +{ + "_from": "normalize-url@^4.1.0", + "_id": "normalize-url@4.5.1", + "_inBundle": false, + "_integrity": "sha512-9UZCFRHQdNrfTpGg8+1INIg93B6zE0aXMVFkw1WFwvO4SlZywU6aLg5Of0Ap/PgcbSw4LNxvMWXMeugwMCX0AA==", + "_location": "/normalize-url", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "normalize-url@^4.1.0", + "name": "normalize-url", + "escapedName": "normalize-url", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/cacheable-request" + ], + "_resolved": "https://registry.npmjs.org/normalize-url/-/normalize-url-4.5.1.tgz", + "_shasum": "0dd90cf1288ee1d1313b87081c9a5932ee48518a", + "_spec": "normalize-url@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cacheable-request", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/normalize-url/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Normalize a URL", + "devDependencies": { + "ava": "^2.4.0", + "coveralls": "^3.0.6", + "nyc": "^14.1.1", + "tsd": "^0.8.0", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/normalize-url#readme", + "keywords": [ + "normalize", + "url", + "uri", + "address", + "string", + "normalization", + "normalisation", + "query", + "querystring", + "simplify", + "strip", + "trim", + "canonical" + ], + "license": "MIT", + "name": "normalize-url", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/normalize-url.git" + }, + "scripts": { + "test": "xo && nyc ava && tsd" + }, + "version": "4.5.1" +} diff --git a/node_modules/normalize-url/readme.md b/node_modules/normalize-url/readme.md new file mode 100644 index 000000000..a851fdd57 --- /dev/null +++ b/node_modules/normalize-url/readme.md @@ -0,0 +1,232 @@ +# normalize-url [![Build Status](https://travis-ci.org/sindresorhus/normalize-url.svg?branch=master)](https://travis-ci.org/sindresorhus/normalize-url) [![Coverage Status](https://coveralls.io/repos/github/sindresorhus/normalize-url/badge.svg?branch=master)](https://coveralls.io/github/sindresorhus/normalize-url?branch=master) + +> [Normalize](https://en.wikipedia.org/wiki/URL_normalization) a URL + +Useful when you need to display, store, deduplicate, sort, compare, etc, URLs. + + +## Install + +``` +$ npm install normalize-url +``` + + +## Usage + +```js +const normalizeUrl = require('normalize-url'); + +normalizeUrl('sindresorhus.com'); +//=> 'http://sindresorhus.com' + +normalizeUrl('HTTP://xn--xample-hva.com:80/?b=bar&a=foo'); +//=> 'http://êxample.com/?a=foo&b=bar' +``` + + +## API + +### normalizeUrl(url, options?) + +#### url + +Type: `string` + +URL to normalize, including [data URL](https://developer.mozilla.org/en-US/docs/Web/HTTP/Basics_of_HTTP/Data_URIs). + +#### options + +Type: `object` + +##### defaultProtocol + +Type: `string`
+Default: `http:` + +##### normalizeProtocol + +Type: `boolean`
+Default: `true` + +Prepend `defaultProtocol` to the URL if it's protocol-relative. + +```js +normalizeUrl('//sindresorhus.com:80/'); +//=> 'http://sindresorhus.com' + +normalizeUrl('//sindresorhus.com:80/', {normalizeProtocol: false}); +//=> '//sindresorhus.com' +``` + +##### forceHttp + +Type: `boolean`
+Default: `false` + +Normalize `https:` to `http:`. + +```js +normalizeUrl('https://sindresorhus.com:80/'); +//=> 'https://sindresorhus.com' + +normalizeUrl('https://sindresorhus.com:80/', {forceHttp: true}); +//=> 'http://sindresorhus.com' +``` + +##### forceHttps + +Type: `boolean`
+Default: `false` + +Normalize `http:` to `https:`. + +```js +normalizeUrl('https://sindresorhus.com:80/'); +//=> 'https://sindresorhus.com' + +normalizeUrl('http://sindresorhus.com:80/', {forceHttps: true}); +//=> 'https://sindresorhus.com' +``` + +This option can't be used with the `forceHttp` option at the same time. + +##### stripAuthentication + +Type: `boolean`
+Default: `true` + +Strip the [authentication](https://en.wikipedia.org/wiki/Basic_access_authentication) part of the URL. + +```js +normalizeUrl('user:password@sindresorhus.com'); +//=> 'https://sindresorhus.com' + +normalizeUrl('user:password@sindresorhus.com', {stripAuthentication: false}); +//=> 'https://user:password@sindresorhus.com' +``` + +##### stripHash + +Type: `boolean`
+Default: `false` + +Strip the hash part of the URL. + +```js +normalizeUrl('sindresorhus.com/about.html#contact'); +//=> 'http://sindresorhus.com/about.html#contact' + +normalizeUrl('sindresorhus.com/about.html#contact', {stripHash: true}); +//=> 'http://sindresorhus.com/about.html' +``` + +##### stripProtocol + +Type: `boolean`
+Default: `false` + +Remove HTTP(S) protocol from the URL: `http://sindresorhus.com` → `sindresorhus.com`. + +```js +normalizeUrl('https://sindresorhus.com'); +//=> 'https://sindresorhus.com' + +normalizeUrl('sindresorhus.com', {stripProtocol: true}); +//=> 'sindresorhus.com' +``` + +##### stripWWW + +Type: `boolean`
+Default: `true` + +Remove `www.` from the URL. + +```js +normalizeUrl('http://www.sindresorhus.com'); +//=> 'http://sindresorhus.com' + +normalizeUrl('http://www.sindresorhus.com', {stripWWW: false}); +//=> 'http://www.sindresorhus.com' +``` + +##### removeQueryParameters + +Type: `Array`
+Default: `[/^utm_\w+/i]` + +Remove query parameters that match any of the provided strings or regexes. + +```js +normalizeUrl('www.sindresorhus.com?foo=bar&ref=test_ref', { + removeQueryParameters: ['ref'] +}); +//=> 'http://sindresorhus.com/?foo=bar' +``` + +##### removeTrailingSlash + +Type: `boolean`
+Default: `true` + +Remove trailing slash. + +**Note:** Trailing slash is always removed if the URL doesn't have a pathname. + +```js +normalizeUrl('http://sindresorhus.com/redirect/'); +//=> 'http://sindresorhus.com/redirect' + +normalizeUrl('http://sindresorhus.com/redirect/', {removeTrailingSlash: false}); +//=> 'http://sindresorhus.com/redirect/' + +normalizeUrl('http://sindresorhus.com/', {removeTrailingSlash: false}); +//=> 'http://sindresorhus.com' +``` + +##### removeDirectoryIndex + +Type: `boolean | Array`
+Default: `false` + +Removes the default directory index file from path that matches any of the provided strings or regexes. When `true`, the regex `/^index\.[a-z]+$/` is used. + +```js +normalizeUrl('www.sindresorhus.com/foo/default.php', { + removeDirectoryIndex: [/^default\.[a-z]+$/] +}); +//=> 'http://sindresorhus.com/foo' +``` + +##### sortQueryParameters + +Type: `boolean`
+Default: `true` + +Sorts the query parameters alphabetically by key. + +```js +normalizeUrl('www.sindresorhus.com?b=two&a=one&c=three', { + sortQueryParameters: false +}); +//=> 'http://sindresorhus.com/?b=two&a=one&c=three' +``` + + +## Related + +- [compare-urls](https://github.com/sindresorhus/compare-urls) - Compare URLs by first normalizing them + + +--- + +
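As a combined illustration of the options documented above (not from the package readme; the URL list is invented), the sketch below canonicalizes a few equivalent URLs and then de-duplicates them, which is the kind of use this module is intended for:

```js
// Canonicalize, then de-duplicate; assumes `npm install normalize-url` has been run.
const normalizeUrl = require('normalize-url');

const urls = [
  'HTTP://www.Example.com:80/a/?utm_source=feed&b=2&a=1',
  'http://example.com/a?a=1&b=2',
  'https://example.com/a/?b=2&a=1'
];

const options = {
  forceHttps: true,                     // fold http onto https for comparison
  removeQueryParameters: [/^utm_\w+/i], // the default, shown explicitly
  removeTrailingSlash: true,
  sortQueryParameters: true
};

const unique = [...new Set(urls.map(url => normalizeUrl(url, options)))];
console.log(unique); //=> [ 'https://example.com/a?a=1&b=2' ]
```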
+ +Get professional support for this package with a Tidelift subscription. + +Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies. +
diff --git a/node_modules/on-finished/HISTORY.md b/node_modules/on-finished/HISTORY.md new file mode 100644 index 000000000..98ff0e992 --- /dev/null +++ b/node_modules/on-finished/HISTORY.md @@ -0,0 +1,88 @@ +2.3.0 / 2015-05-26 +================== + + * Add defined behavior for HTTP `CONNECT` requests + * Add defined behavior for HTTP `Upgrade` requests + * deps: ee-first@1.1.1 + +2.2.1 / 2015-04-22 +================== + + * Fix `isFinished(req)` when data buffered + +2.2.0 / 2014-12-22 +================== + + * Add message object to callback arguments + +2.1.1 / 2014-10-22 +================== + + * Fix handling of pipelined requests + +2.1.0 / 2014-08-16 +================== + + * Check if `socket` is detached + * Return `undefined` for `isFinished` if state unknown + +2.0.0 / 2014-08-16 +================== + + * Add `isFinished` function + * Move to `jshttp` organization + * Remove support for plain socket argument + * Rename to `on-finished` + * Support both `req` and `res` as arguments + * deps: ee-first@1.0.5 + +1.2.2 / 2014-06-10 +================== + + * Reduce listeners added to emitters + - avoids "event emitter leak" warnings when used multiple times on same request + +1.2.1 / 2014-06-08 +================== + + * Fix returned value when already finished + +1.2.0 / 2014-06-05 +================== + + * Call callback when called on already-finished socket + +1.1.4 / 2014-05-27 +================== + + * Support node.js 0.8 + +1.1.3 / 2014-04-30 +================== + + * Make sure errors passed as instanceof `Error` + +1.1.2 / 2014-04-18 +================== + + * Default the `socket` to passed-in object + +1.1.1 / 2014-01-16 +================== + + * Rename module to `finished` + +1.1.0 / 2013-12-25 +================== + + * Call callback when called on already-errored socket + +1.0.1 / 2013-12-20 +================== + + * Actually pass the error to the callback + +1.0.0 / 2013-12-20 +================== + + * Initial release diff --git a/node_modules/on-finished/LICENSE b/node_modules/on-finished/LICENSE new file mode 100644 index 000000000..5931fd23e --- /dev/null +++ b/node_modules/on-finished/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2013 Jonathan Ong +Copyright (c) 2014 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/on-finished/README.md b/node_modules/on-finished/README.md new file mode 100644 index 000000000..a0e115744 --- /dev/null +++ b/node_modules/on-finished/README.md @@ -0,0 +1,154 @@ +# on-finished + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Execute a callback when a HTTP request closes, finishes, or errors. + +## Install + +```sh +$ npm install on-finished +``` + +## API + +```js +var onFinished = require('on-finished') +``` + +### onFinished(res, listener) + +Attach a listener to listen for the response to finish. The listener will +be invoked only once when the response finished. If the response finished +to an error, the first argument will contain the error. If the response +has already finished, the listener will be invoked. + +Listening to the end of a response would be used to close things associated +with the response, like open files. + +Listener is invoked as `listener(err, res)`. + +```js +onFinished(res, function (err, res) { + // clean up open fds, etc. + // err contains the error is request error'd +}) +``` + +### onFinished(req, listener) + +Attach a listener to listen for the request to finish. The listener will +be invoked only once when the request finished. If the request finished +to an error, the first argument will contain the error. If the request +has already finished, the listener will be invoked. + +Listening to the end of a request would be used to know when to continue +after reading the data. + +Listener is invoked as `listener(err, req)`. + +```js +var data = '' + +req.setEncoding('utf8') +res.on('data', function (str) { + data += str +}) + +onFinished(req, function (err, req) { + // data is read unless there is err +}) +``` + +### onFinished.isFinished(res) + +Determine if `res` is already finished. This would be useful to check and +not even start certain operations if the response has already finished. + +### onFinished.isFinished(req) + +Determine if `req` is already finished. This would be useful to check and +not even start certain operations if the request has already finished. + +## Special Node.js requests + +### HTTP CONNECT method + +The meaning of the `CONNECT` method from RFC 7231, section 4.3.6: + +> The CONNECT method requests that the recipient establish a tunnel to +> the destination origin server identified by the request-target and, +> if successful, thereafter restrict its behavior to blind forwarding +> of packets, in both directions, until the tunnel is closed. Tunnels +> are commonly used to create an end-to-end virtual connection, through +> one or more proxies, which can then be secured using TLS (Transport +> Layer Security, [RFC5246]). + +In Node.js, these request objects come from the `'connect'` event on +the HTTP server. + +When this module is used on a HTTP `CONNECT` request, the request is +considered "finished" immediately, **due to limitations in the Node.js +interface**. This means if the `CONNECT` request contains a request entity, +the request will be considered "finished" even before it has been read. + +There is no such thing as a response object to a `CONNECT` request in +Node.js, so there is no support for for one. 
+ +### HTTP Upgrade request + +The meaning of the `Upgrade` header from RFC 7230, section 6.1: + +> The "Upgrade" header field is intended to provide a simple mechanism +> for transitioning from HTTP/1.1 to some other protocol on the same +> connection. + +In Node.js, these request objects come from the `'upgrade'` event on +the HTTP server. + +When this module is used on a HTTP request with an `Upgrade` header, the +request is considered "finished" immediately, **due to limitations in the +Node.js interface**. This means if the `Upgrade` request contains a request +entity, the request will be considered "finished" even before it has been +read. + +There is no such thing as a response object to a `Upgrade` request in +Node.js, so there is no support for for one. + +## Example + +The following code ensures that file descriptors are always closed +once the response finishes. + +```js +var destroy = require('destroy') +var http = require('http') +var onFinished = require('on-finished') + +http.createServer(function onRequest(req, res) { + var stream = fs.createReadStream('package.json') + stream.pipe(res) + onFinished(res, function (err) { + destroy(stream) + }) +}) +``` + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/on-finished.svg +[npm-url]: https://npmjs.org/package/on-finished +[node-version-image]: https://img.shields.io/node/v/on-finished.svg +[node-version-url]: http://nodejs.org/download/ +[travis-image]: https://img.shields.io/travis/jshttp/on-finished/master.svg +[travis-url]: https://travis-ci.org/jshttp/on-finished +[coveralls-image]: https://img.shields.io/coveralls/jshttp/on-finished/master.svg +[coveralls-url]: https://coveralls.io/r/jshttp/on-finished?branch=master +[downloads-image]: https://img.shields.io/npm/dm/on-finished.svg +[downloads-url]: https://npmjs.org/package/on-finished diff --git a/node_modules/on-finished/index.js b/node_modules/on-finished/index.js new file mode 100644 index 000000000..9abd98f9d --- /dev/null +++ b/node_modules/on-finished/index.js @@ -0,0 +1,196 @@ +/*! + * on-finished + * Copyright(c) 2013 Jonathan Ong + * Copyright(c) 2014 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = onFinished +module.exports.isFinished = isFinished + +/** + * Module dependencies. + * @private + */ + +var first = require('ee-first') + +/** + * Variables. + * @private + */ + +/* istanbul ignore next */ +var defer = typeof setImmediate === 'function' + ? setImmediate + : function(fn){ process.nextTick(fn.bind.apply(fn, arguments)) } + +/** + * Invoke callback when the response has finished, useful for + * cleaning up resources afterwards. + * + * @param {object} msg + * @param {function} listener + * @return {object} + * @public + */ + +function onFinished(msg, listener) { + if (isFinished(msg) !== false) { + defer(listener, null, msg) + return msg + } + + // attach the listener to the message + attachListener(msg, listener) + + return msg +} + +/** + * Determine if message is already finished. 
+ * + * @param {object} msg + * @return {boolean} + * @public + */ + +function isFinished(msg) { + var socket = msg.socket + + if (typeof msg.finished === 'boolean') { + // OutgoingMessage + return Boolean(msg.finished || (socket && !socket.writable)) + } + + if (typeof msg.complete === 'boolean') { + // IncomingMessage + return Boolean(msg.upgrade || !socket || !socket.readable || (msg.complete && !msg.readable)) + } + + // don't know + return undefined +} + +/** + * Attach a finished listener to the message. + * + * @param {object} msg + * @param {function} callback + * @private + */ + +function attachFinishedListener(msg, callback) { + var eeMsg + var eeSocket + var finished = false + + function onFinish(error) { + eeMsg.cancel() + eeSocket.cancel() + + finished = true + callback(error) + } + + // finished on first message event + eeMsg = eeSocket = first([[msg, 'end', 'finish']], onFinish) + + function onSocket(socket) { + // remove listener + msg.removeListener('socket', onSocket) + + if (finished) return + if (eeMsg !== eeSocket) return + + // finished on first socket event + eeSocket = first([[socket, 'error', 'close']], onFinish) + } + + if (msg.socket) { + // socket already assigned + onSocket(msg.socket) + return + } + + // wait for socket to be assigned + msg.on('socket', onSocket) + + if (msg.socket === undefined) { + // node.js 0.8 patch + patchAssignSocket(msg, onSocket) + } +} + +/** + * Attach the listener to the message. + * + * @param {object} msg + * @return {function} + * @private + */ + +function attachListener(msg, listener) { + var attached = msg.__onFinished + + // create a private single listener with queue + if (!attached || !attached.queue) { + attached = msg.__onFinished = createListener(msg) + attachFinishedListener(msg, attached) + } + + attached.queue.push(listener) +} + +/** + * Create listener on message. + * + * @param {object} msg + * @return {function} + * @private + */ + +function createListener(msg) { + function listener(err) { + if (msg.__onFinished === listener) msg.__onFinished = null + if (!listener.queue) return + + var queue = listener.queue + listener.queue = null + + for (var i = 0; i < queue.length; i++) { + queue[i](err, msg) + } + } + + listener.queue = [] + + return listener +} + +/** + * Patch ServerResponse.prototype.assignSocket for node.js 0.8. 
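+ * (res.on('socket') is unreliable on node 0.8, so the assignSocket call itself is wrapped and the callback is invoked with the socket from inside the wrapper.)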
+ * + * @param {ServerResponse} res + * @param {function} callback + * @private + */ + +function patchAssignSocket(res, callback) { + var assignSocket = res.assignSocket + + if (typeof assignSocket !== 'function') return + + // res.on('socket', callback) is broken in 0.8 + res.assignSocket = function _assignSocket(socket) { + assignSocket.call(this, socket) + callback(socket) + } +} diff --git a/node_modules/on-finished/package.json b/node_modules/on-finished/package.json new file mode 100644 index 000000000..8336e032d --- /dev/null +++ b/node_modules/on-finished/package.json @@ -0,0 +1,73 @@ +{ + "_from": "on-finished@~2.3.0", + "_id": "on-finished@2.3.0", + "_inBundle": false, + "_integrity": "sha1-IPEzZIGwg811M3mSoWlxqi2QaUc=", + "_location": "/on-finished", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "on-finished@~2.3.0", + "name": "on-finished", + "escapedName": "on-finished", + "rawSpec": "~2.3.0", + "saveSpec": null, + "fetchSpec": "~2.3.0" + }, + "_requiredBy": [ + "/body-parser", + "/express", + "/finalhandler", + "/send" + ], + "_resolved": "https://registry.npmjs.org/on-finished/-/on-finished-2.3.0.tgz", + "_shasum": "20f1336481b083cd75337992a16971aa2d906947", + "_spec": "on-finished@~2.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/jshttp/on-finished/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "dependencies": { + "ee-first": "1.1.1" + }, + "deprecated": false, + "description": "Execute a callback when a request closes, finishes, or errors", + "devDependencies": { + "istanbul": "0.3.9", + "mocha": "2.2.5" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "index.js" + ], + "homepage": "https://github.com/jshttp/on-finished#readme", + "license": "MIT", + "name": "on-finished", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/on-finished.git" + }, + "scripts": { + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" + }, + "version": "2.3.0" +} diff --git a/node_modules/on-headers/HISTORY.md b/node_modules/on-headers/HISTORY.md new file mode 100644 index 000000000..090598d8b --- /dev/null +++ b/node_modules/on-headers/HISTORY.md @@ -0,0 +1,21 @@ +1.0.2 / 2019-02-21 +================== + + * Fix `res.writeHead` patch missing return value + +1.0.1 / 2015-09-29 +================== + + * perf: enable strict mode + +1.0.0 / 2014-08-10 +================== + + * Honor `res.statusCode` change in `listener` + * Move to `jshttp` organization + * Prevent `arguments`-related de-opt + +0.0.0 / 2014-05-13 +================== + + * Initial implementation diff --git a/node_modules/on-headers/LICENSE b/node_modules/on-headers/LICENSE new file mode 100644 index 000000000..b7dce6cf9 --- /dev/null +++ b/node_modules/on-headers/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2014 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files 
(the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/on-headers/README.md b/node_modules/on-headers/README.md new file mode 100644 index 000000000..ae8428246 --- /dev/null +++ b/node_modules/on-headers/README.md @@ -0,0 +1,81 @@ +# on-headers + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Execute a listener when a response is about to write headers. + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install on-headers +``` + +## API + + + +```js +var onHeaders = require('on-headers') +``` + +### onHeaders(res, listener) + +This will add the listener `listener` to fire when headers are emitted for `res`. +The listener is passed the `response` object as it's context (`this`). Headers are +considered to be emitted only once, right before they are sent to the client. + +When this is called multiple times on the same `res`, the `listener`s are fired +in the reverse order they were added. + +## Examples + +```js +var http = require('http') +var onHeaders = require('on-headers') + +http + .createServer(onRequest) + .listen(3000) + +function addPoweredBy () { + // set if not set by end of request + if (!this.getHeader('X-Powered-By')) { + this.setHeader('X-Powered-By', 'Node.js') + } +} + +function onRequest (req, res) { + onHeaders(res, addPoweredBy) + + res.setHeader('Content-Type', 'text/plain') + res.end('hello!') +} +``` + +## Testing + +```sh +$ npm test +``` + +## License + +[MIT](LICENSE) + +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/on-headers/master +[coveralls-url]: https://coveralls.io/r/jshttp/on-headers?branch=master +[node-version-image]: https://badgen.net/npm/node/on-headers +[node-version-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/on-headers +[npm-url]: https://npmjs.org/package/on-headers +[npm-version-image]: https://badgen.net/npm/v/on-headers +[travis-image]: https://badgen.net/travis/jshttp/on-headers/master +[travis-url]: https://travis-ci.org/jshttp/on-headers diff --git a/node_modules/on-headers/index.js b/node_modules/on-headers/index.js new file mode 100644 index 000000000..7db6375ed --- /dev/null +++ b/node_modules/on-headers/index.js @@ -0,0 +1,132 @@ +/*! 
+ * on-headers + * Copyright(c) 2014 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = onHeaders + +/** + * Create a replacement writeHead method. + * + * @param {function} prevWriteHead + * @param {function} listener + * @private + */ + +function createWriteHead (prevWriteHead, listener) { + var fired = false + + // return function with core name and argument list + return function writeHead (statusCode) { + // set headers from arguments + var args = setWriteHeadHeaders.apply(this, arguments) + + // fire listener + if (!fired) { + fired = true + listener.call(this) + + // pass-along an updated status code + if (typeof args[0] === 'number' && this.statusCode !== args[0]) { + args[0] = this.statusCode + args.length = 1 + } + } + + return prevWriteHead.apply(this, args) + } +} + +/** + * Execute a listener when a response is about to write headers. + * + * @param {object} res + * @return {function} listener + * @public + */ + +function onHeaders (res, listener) { + if (!res) { + throw new TypeError('argument res is required') + } + + if (typeof listener !== 'function') { + throw new TypeError('argument listener must be a function') + } + + res.writeHead = createWriteHead(res.writeHead, listener) +} + +/** + * Set headers contained in array on the response object. + * + * @param {object} res + * @param {array} headers + * @private + */ + +function setHeadersFromArray (res, headers) { + for (var i = 0; i < headers.length; i++) { + res.setHeader(headers[i][0], headers[i][1]) + } +} + +/** + * Set headers contained in object on the response object. + * + * @param {object} res + * @param {object} headers + * @private + */ + +function setHeadersFromObject (res, headers) { + var keys = Object.keys(headers) + for (var i = 0; i < keys.length; i++) { + var k = keys[i] + if (k) res.setHeader(k, headers[k]) + } +} + +/** + * Set headers and other properties on the response object. + * + * @param {number} statusCode + * @private + */ + +function setWriteHeadHeaders (statusCode) { + var length = arguments.length + var headerIndex = length > 1 && typeof arguments[1] === 'string' + ? 2 + : 1 + + var headers = length >= headerIndex + 1 + ? 
arguments[headerIndex] + : undefined + + this.statusCode = statusCode + + if (Array.isArray(headers)) { + // handle array case + setHeadersFromArray(this, headers) + } else if (headers) { + // handle object case + setHeadersFromObject(this, headers) + } + + // copy leading arguments + var args = new Array(Math.min(length, headerIndex)) + for (var i = 0; i < args.length; i++) { + args[i] = arguments[i] + } + + return args +} diff --git a/node_modules/on-headers/package.json b/node_modules/on-headers/package.json new file mode 100644 index 000000000..2ca15fd42 --- /dev/null +++ b/node_modules/on-headers/package.json @@ -0,0 +1,77 @@ +{ + "_from": "on-headers@~1.0.2", + "_id": "on-headers@1.0.2", + "_inBundle": false, + "_integrity": "sha512-pZAE+FJLoyITytdqK0U5s+FIpjN0JP3OzFi/u8Rx+EV5/W+JTWGXG8xFzevE7AjBfDqHv/8vL8qQsIhHnqRkrA==", + "_location": "/on-headers", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "on-headers@~1.0.2", + "name": "on-headers", + "escapedName": "on-headers", + "rawSpec": "~1.0.2", + "saveSpec": null, + "fetchSpec": "~1.0.2" + }, + "_requiredBy": [ + "/cookie-session" + ], + "_resolved": "https://registry.npmjs.org/on-headers/-/on-headers-1.0.2.tgz", + "_shasum": "772b0ae6aaa525c399e489adfad90c403eb3c28f", + "_spec": "on-headers@~1.0.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cookie-session", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/jshttp/on-headers/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Execute a listener when a response is about to write headers", + "devDependencies": { + "eslint": "5.14.1", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.16.0", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.0.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "mocha": "6.0.1", + "supertest": "3.4.2" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "README.md", + "index.js" + ], + "homepage": "https://github.com/jshttp/on-headers#readme", + "keywords": [ + "event", + "headers", + "http", + "onheaders" + ], + "license": "MIT", + "name": "on-headers", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/on-headers.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/", + "version": "node scripts/version-history.js && git add HISTORY.md" + }, + "version": "1.0.2" +} diff --git a/node_modules/once/LICENSE b/node_modules/once/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/once/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/once/README.md b/node_modules/once/README.md new file mode 100644 index 000000000..1f1ffca93 --- /dev/null +++ b/node_modules/once/README.md @@ -0,0 +1,79 @@ +# once + +Only call a function once. + +## usage + +```javascript +var once = require('once') + +function load (file, cb) { + cb = once(cb) + loader.load('file') + loader.once('load', cb) + loader.once('error', cb) +} +``` + +Or add to the Function.prototype in a responsible way: + +```javascript +// only has to be done once +require('once').proto() + +function load (file, cb) { + cb = cb.once() + loader.load('file') + loader.once('load', cb) + loader.once('error', cb) +} +``` + +Ironically, the prototype feature makes this module twice as +complicated as necessary. + +To check whether you function has been called, use `fn.called`. Once the +function is called for the first time the return value of the original +function is saved in `fn.value` and subsequent calls will continue to +return this value. + +```javascript +var once = require('once') + +function load (cb) { + cb = once(cb) + var stream = createStream() + stream.once('data', cb) + stream.once('end', function () { + if (!cb.called) cb(new Error('not found')) + }) +} +``` + +## `once.strict(func)` + +Throw an error if the function is called twice. + +Some functions are expected to be called only once. Using `once` for them would +potentially hide logical errors. + +In the example below, the `greet` function has to call the callback only once: + +```javascript +function greet (name, cb) { + // return is missing from the if statement + // when no name is passed, the callback is called twice + if (!name) cb('Hello anonymous') + cb('Hello ' + name) +} + +function log (msg) { + console.log(msg) +} + +// this will print 'Hello anonymous' but the logical error will be missed +greet(null, once(msg)) + +// once.strict will print 'Hello anonymous' and throw an error when the callback will be called the second time +greet(null, once.strict(msg)) +``` diff --git a/node_modules/once/once.js b/node_modules/once/once.js new file mode 100644 index 000000000..235406736 --- /dev/null +++ b/node_modules/once/once.js @@ -0,0 +1,42 @@ +var wrappy = require('wrappy') +module.exports = wrappy(once) +module.exports.strict = wrappy(onceStrict) + +once.proto = once(function () { + Object.defineProperty(Function.prototype, 'once', { + value: function () { + return once(this) + }, + configurable: true + }) + + Object.defineProperty(Function.prototype, 'onceStrict', { + value: function () { + return onceStrict(this) + }, + configurable: true + }) +}) + +function once (fn) { + var f = function () { + if (f.called) return f.value + f.called = true + return f.value = fn.apply(this, arguments) + } + f.called = false + return f +} + +function onceStrict (fn) { + var f = function () { + if (f.called) + throw new Error(f.onceError) + f.called = true + return f.value = fn.apply(this, arguments) + } + var name = fn.name || 'Function wrapped with `once`' + f.onceError = name + " shouldn't be called more than once" + f.called = false + return f +} diff --git a/node_modules/once/package.json b/node_modules/once/package.json new file mode 100644 index 000000000..ef9aeee02 
--- /dev/null +++ b/node_modules/once/package.json @@ -0,0 +1,67 @@ +{ + "_from": "once@^1.3.1", + "_id": "once@1.4.0", + "_inBundle": false, + "_integrity": "sha1-WDsap3WWHUsROsF9nFC6753Xa9E=", + "_location": "/once", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "once@^1.3.1", + "name": "once", + "escapedName": "once", + "rawSpec": "^1.3.1", + "saveSpec": null, + "fetchSpec": "^1.3.1" + }, + "_requiredBy": [ + "/end-of-stream", + "/pump" + ], + "_resolved": "https://registry.npmjs.org/once/-/once-1.4.0.tgz", + "_shasum": "583b1aa775961d4b113ac17d9c50baef9dd76bd1", + "_spec": "once@^1.3.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\pump", + "author": { + "name": "Isaac Z. Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me/" + }, + "bugs": { + "url": "https://github.com/isaacs/once/issues" + }, + "bundleDependencies": false, + "dependencies": { + "wrappy": "1" + }, + "deprecated": false, + "description": "Run a function exactly one time", + "devDependencies": { + "tap": "^7.0.1" + }, + "directories": { + "test": "test" + }, + "files": [ + "once.js" + ], + "homepage": "https://github.com/isaacs/once#readme", + "keywords": [ + "once", + "function", + "one", + "single" + ], + "license": "ISC", + "main": "once.js", + "name": "once", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/once.git" + }, + "scripts": { + "test": "tap test/*.js" + }, + "version": "1.4.0" +} diff --git a/node_modules/p-cancelable/index.d.ts b/node_modules/p-cancelable/index.d.ts new file mode 100644 index 000000000..316d636e0 --- /dev/null +++ b/node_modules/p-cancelable/index.d.ts @@ -0,0 +1,168 @@ +/** + * Accepts a function that is called when the promise is canceled. + * + * You're not required to call this function. You can call this function multiple times to add multiple cancel handlers. + */ +export interface OnCancelFunction { + (cancelHandler: () => void): void; + shouldReject: boolean; +} + +declare class PCancelable extends Promise { + /** + * Convenience method to make your promise-returning or async function cancelable. + * + * @param fn - A promise-returning function. The function you specify will have `onCancel` appended to its parameters. 
+ * + * @example + * + * import PCancelable from 'p-cancelable'; + * + * const fn = PCancelable.fn((input, onCancel) => { + * const job = new Job(); + * + * onCancel(() => { + * job.cleanup(); + * }); + * + * return job.start(); //=> Promise + * }); + * + * const cancelablePromise = fn('input'); //=> PCancelable + * + * // … + * + * cancelablePromise.cancel(); + */ + static fn( + userFn: (onCancel: OnCancelFunction) => PromiseLike + ): () => PCancelable; + static fn( + userFn: ( + argument1: Agument1Type, + onCancel: OnCancelFunction + ) => PromiseLike + ): (argument1: Agument1Type) => PCancelable; + static fn( + userFn: ( + argument1: Agument1Type, + argument2: Agument2Type, + onCancel: OnCancelFunction + ) => PromiseLike + ): ( + argument1: Agument1Type, + argument2: Agument2Type + ) => PCancelable; + static fn( + userFn: ( + argument1: Agument1Type, + argument2: Agument2Type, + argument3: Agument3Type, + onCancel: OnCancelFunction + ) => PromiseLike + ): ( + argument1: Agument1Type, + argument2: Agument2Type, + argument3: Agument3Type + ) => PCancelable; + static fn( + userFn: ( + argument1: Agument1Type, + argument2: Agument2Type, + argument3: Agument3Type, + argument4: Agument4Type, + onCancel: OnCancelFunction + ) => PromiseLike + ): ( + argument1: Agument1Type, + argument2: Agument2Type, + argument3: Agument3Type, + argument4: Agument4Type + ) => PCancelable; + static fn< + Agument1Type, + Agument2Type, + Agument3Type, + Agument4Type, + Agument5Type, + ReturnType + >( + userFn: ( + argument1: Agument1Type, + argument2: Agument2Type, + argument3: Agument3Type, + argument4: Agument4Type, + argument5: Agument5Type, + onCancel: OnCancelFunction + ) => PromiseLike + ): ( + argument1: Agument1Type, + argument2: Agument2Type, + argument3: Agument3Type, + argument4: Agument4Type, + argument5: Agument5Type + ) => PCancelable; + static fn( + userFn: (...arguments: unknown[]) => PromiseLike + ): (...arguments: unknown[]) => PCancelable; + + /** + * Create a promise that can be canceled. + * + * Can be constructed in the same was as a [`Promise` constructor](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise), but with an appended `onCancel` parameter in `executor`. `PCancelable` is a subclass of `Promise`. + * + * Cancelling will reject the promise with `CancelError`. To avoid that, set `onCancel.shouldReject` to `false`. + * + * @example + * + * import PCancelable from 'p-cancelable'; + * + * const cancelablePromise = new PCancelable((resolve, reject, onCancel) => { + * const job = new Job(); + * + * onCancel.shouldReject = false; + * onCancel(() => { + * job.stop(); + * }); + * + * job.on('finish', resolve); + * }); + * + * cancelablePromise.cancel(); // Doesn't throw an error + */ + constructor( + executor: ( + resolve: (value?: ValueType | PromiseLike) => void, + reject: (reason?: unknown) => void, + onCancel: OnCancelFunction + ) => void + ); + + /** + * Whether the promise is canceled. + */ + readonly isCanceled: boolean; + + /** + * Cancel the promise and optionally provide a reason. + * + * The cancellation is synchronous. Calling it after the promise has settled or multiple times does nothing. + * + * @param reason - The cancellation reason to reject the promise with. + */ + cancel(reason?: string): void; +} + +export default PCancelable; + +/** + * Rejection reason when `.cancel()` is called. + * + * It includes a `.isCanceled` property for convenience. 
+ */ +export class CancelError extends Error { + readonly name: 'CancelError'; + readonly isCanceled: true; + + constructor(reason?: string); +} diff --git a/node_modules/p-cancelable/index.js b/node_modules/p-cancelable/index.js new file mode 100644 index 000000000..26bd42eec --- /dev/null +++ b/node_modules/p-cancelable/index.js @@ -0,0 +1,103 @@ +'use strict'; + +class CancelError extends Error { + constructor(reason) { + super(reason || 'Promise was canceled'); + this.name = 'CancelError'; + } + + get isCanceled() { + return true; + } +} + +class PCancelable { + static fn(userFn) { + return (...args) => { + return new PCancelable((resolve, reject, onCancel) => { + args.push(onCancel); + userFn(...args).then(resolve, reject); + }); + }; + } + + constructor(executor) { + this._cancelHandlers = []; + this._isPending = true; + this._isCanceled = false; + this._rejectOnCancel = true; + + this._promise = new Promise((resolve, reject) => { + this._reject = reject; + + const onResolve = value => { + this._isPending = false; + resolve(value); + }; + + const onReject = error => { + this._isPending = false; + reject(error); + }; + + const onCancel = handler => { + this._cancelHandlers.push(handler); + }; + + Object.defineProperties(onCancel, { + shouldReject: { + get: () => this._rejectOnCancel, + set: bool => { + this._rejectOnCancel = bool; + } + } + }); + + return executor(onResolve, onReject, onCancel); + }); + } + + then(onFulfilled, onRejected) { + return this._promise.then(onFulfilled, onRejected); + } + + catch(onRejected) { + return this._promise.catch(onRejected); + } + + finally(onFinally) { + return this._promise.finally(onFinally); + } + + cancel(reason) { + if (!this._isPending || this._isCanceled) { + return; + } + + if (this._cancelHandlers.length > 0) { + try { + for (const handler of this._cancelHandlers) { + handler(); + } + } catch (error) { + this._reject(error); + } + } + + this._isCanceled = true; + if (this._rejectOnCancel) { + this._reject(new CancelError(reason)); + } + } + + get isCanceled() { + return this._isCanceled; + } +} + +Object.setPrototypeOf(PCancelable.prototype, Promise.prototype); + +module.exports = PCancelable; +module.exports.default = PCancelable; + +module.exports.CancelError = CancelError; diff --git a/node_modules/p-cancelable/license b/node_modules/p-cancelable/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/p-cancelable/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/p-cancelable/package.json b/node_modules/p-cancelable/package.json new file mode 100644 index 000000000..b6b70d529 --- /dev/null +++ b/node_modules/p-cancelable/package.json @@ -0,0 +1,81 @@ +{ + "_from": "p-cancelable@^1.0.0", + "_id": "p-cancelable@1.1.0", + "_inBundle": false, + "_integrity": "sha512-s73XxOZ4zpt1edZYZzvhqFa6uvQc1vwUa0K0BdtIZgQMAJj9IbebH+JkgKZc9h+B05PKHLOTl4ajG1BmNrVZlw==", + "_location": "/p-cancelable", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "p-cancelable@^1.0.0", + "name": "p-cancelable", + "escapedName": "p-cancelable", + "rawSpec": "^1.0.0", + "saveSpec": null, + "fetchSpec": "^1.0.0" + }, + "_requiredBy": [ + "/got" + ], + "_resolved": "https://registry.npmjs.org/p-cancelable/-/p-cancelable-1.1.0.tgz", + "_shasum": "d078d15a3af409220c886f1d9a0ca2e441ab26cc", + "_spec": "p-cancelable@^1.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/p-cancelable/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Create a promise that can be canceled", + "devDependencies": { + "ava": "^1.3.1", + "delay": "^4.1.0", + "promise.prototype.finally": "^3.1.0", + "tsd-check": "^0.3.0", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=6" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/p-cancelable#readme", + "keywords": [ + "promise", + "cancelable", + "cancel", + "canceled", + "canceling", + "cancellable", + "cancellation", + "abort", + "abortable", + "aborting", + "cleanup", + "task", + "token", + "async", + "function", + "await", + "promises", + "bluebird" + ], + "license": "MIT", + "name": "p-cancelable", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/p-cancelable.git" + }, + "scripts": { + "test": "xo && ava && tsd-check" + }, + "version": "1.1.0" +} diff --git a/node_modules/p-cancelable/readme.md b/node_modules/p-cancelable/readme.md new file mode 100644 index 000000000..b66e96a8a --- /dev/null +++ b/node_modules/p-cancelable/readme.md @@ -0,0 +1,155 @@ +# p-cancelable [![Build Status](https://travis-ci.org/sindresorhus/p-cancelable.svg?branch=master)](https://travis-ci.org/sindresorhus/p-cancelable) + +> Create a promise that can be canceled + +Useful for animation, loading resources, long-running async computations, async iteration, etc. 
+ + +## Install + +``` +$ npm install p-cancelable +``` + + +## Usage + +```js +const PCancelable = require('p-cancelable'); + +const cancelablePromise = new PCancelable((resolve, reject, onCancel) => { + const worker = new SomeLongRunningOperation(); + + onCancel(() => { + worker.close(); + }); + + worker.on('finish', resolve); + worker.on('error', reject); +}); + +(async () => { + try { + console.log('Operation finished successfully:', await cancelablePromise); + } catch (error) { + if (cancelablePromise.isCanceled) { + // Handle the cancelation here + console.log('Operation was canceled'); + return; + } + + throw error; + } +})(); + +// Cancel the operation after 10 seconds +setTimeout(() => { + cancelablePromise.cancel('Unicorn has changed its color'); +}, 10000); +``` + + +## API + +### new PCancelable(executor) + +Same as the [`Promise` constructor](https://developer.mozilla.org/en/docs/Web/JavaScript/Reference/Global_Objects/Promise), but with an appended `onCancel` parameter in `executor`.
+Cancelling will reject the promise with `PCancelable.CancelError`. To avoid that, set `onCancel.shouldReject` to `false`. + +```js +const PCancelable = require('p-cancelable'); + +const cancelablePromise = new PCancelable((resolve, reject, onCancel) => { + const job = new Job(); + + onCancel.shouldReject = false; + onCancel(() => { + job.stop(); + }); + + job.on('finish', resolve); +}); + +cancelablePromise.cancel(); // Doesn't throw an error +``` + +`PCancelable` is a subclass of `Promise`. + +#### onCanceled(fn) + +Type: `Function` + +Accepts a function that is called when the promise is canceled. + +You're not required to call this function. You can call this function multiple times to add multiple cancel handlers. + +### PCancelable#cancel([reason]) + +Type: `Function` + +Cancel the promise and optionally provide a reason. + +The cancellation is synchronous. Calling it after the promise has settled or multiple times does nothing. + +### PCancelable#isCanceled + +Type: `boolean` + +Whether the promise is canceled. + +### PCancelable.CancelError + +Type: `Error` + +Rejection reason when `.cancel()` is called. + +It includes a `.isCanceled` property for convenience. + +### PCancelable.fn(fn) + +Convenience method to make your promise-returning or async function cancelable. + +The function you specify will have `onCancel` appended to its parameters. + +```js +const PCancelable = require('p-cancelable'); + +const fn = PCancelable.fn((input, onCancel) => { + const job = new Job(); + + onCancel(() => { + job.cleanup(); + }); + + return job.start(); //=> Promise +}); + +const cancelablePromise = fn('input'); //=> PCancelable + +// … + +cancelablePromise.cancel(); +``` + + +## FAQ + +### Cancelable vs. Cancellable + +[In American English, the verb cancel is usually inflected canceled and canceling—with one l.](http://grammarist.com/spelling/cancel/)
Both a [browser API](https://developer.mozilla.org/en-US/docs/Web/API/Event/cancelable) and the [Cancelable Promises proposal](https://github.com/tc39/proposal-cancelable-promises) use this spelling. + +### What about the official [Cancelable Promises proposal](https://github.com/tc39/proposal-cancelable-promises)? + +~~It's still an early draft and I don't really like its current direction. It complicates everything and will require deep changes in the ecosystem to adapt to it. And the way you have to use cancel tokens is verbose and convoluted. I much prefer the more pragmatic and less invasive approach in this module.~~ The proposal was withdrawn. + + +## Related + +- [p-progress](https://github.com/sindresorhus/p-progress) - Create a promise that reports progress +- [p-lazy](https://github.com/sindresorhus/p-lazy) - Create a lazy promise that defers execution until `.then()` or `.catch()` is called +- [More…](https://github.com/sindresorhus/promise-fun) + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/package-json/index.d.ts b/node_modules/package-json/index.d.ts new file mode 100644 index 000000000..d55fc57e1 --- /dev/null +++ b/node_modules/package-json/index.d.ts @@ -0,0 +1,199 @@ +/// +import {Agent as HttpAgent} from 'http'; +import {Agent as HttpsAgent} from 'https'; + +declare class VersionNotFoundErrorClass extends Error { + readonly name: 'VersionNotFoundError'; + + constructor(packageName: string, version: string); +} + +declare class PackageNotFoundErrorClass extends Error { + readonly name: 'PackageNotFoundError'; + + constructor(packageName: string); +} + +declare namespace packageJson { + interface Agents { + http?: HttpAgent; + https?: HttpsAgent; + } + + interface Options { + /** + Package version such as `1.0.0` or a [dist tag](https://docs.npmjs.com/cli/dist-tag) such as `latest`. + + The version can also be in any format supported by the [semver](https://github.com/npm/node-semver) module. For example: + - `1` - Get the latest `1.x.x` + - `1.2` - Get the latest `1.2.x` + - `^1.2.3` - Get the latest `1.x.x` but at least `1.2.3` + - `~1.2.3` - Get the latest `1.2.x` but at least `1.2.3` + + @default 'latest' + */ + readonly version?: string; + + /** + By default, only an abbreviated metadata object is returned for performance reasons. [Read more.](https://github.com/npm/registry/blob/master/docs/responses/package-metadata.md) + + @default false + */ + readonly fullMetadata?: boolean; + + /** + Return the [main entry](https://registry.npmjs.org/ava) containing all versions. + + @default false + */ + readonly allVersions?: boolean; + + /** + The registry URL is by default inferred from the npm defaults and `.npmrc`. This is beneficial as `package-json` and any project using it will work just like npm. This option is*only** intended for internal tools. You should*not** use this option in reusable packages. Prefer just using `.npmrc` whenever possible. + */ + readonly registryUrl?: string; + + /** + Overwrite the `agent` option that is passed down to [`got`](https://github.com/sindresorhus/got#agent). This might be useful to add [proxy support](https://github.com/sindresorhus/got#proxies). + */ + readonly agent?: HttpAgent | HttpsAgent | Agents | false; + } + + interface FullMetadataOptions extends Options { + /** + By default, only an abbreviated metadata object is returned for performance reasons. 
[Read more.](https://github.com/npm/registry/blob/master/docs/responses/package-metadata.md) + + @default false + */ + readonly fullMetadata: true; + } + + interface DistTags { + readonly latest: string; + readonly [tagName: string]: string; + } + + interface AbbreviatedMetadata { + readonly 'dist-tags': DistTags; + readonly modified: string; + readonly name: string; + readonly versions: {readonly [version: string]: AbbreviatedVersion}; + readonly [key: string]: unknown; + } + + interface AbbreviatedVersion { + readonly name: string; + readonly version: string; + readonly dist: { + readonly shasum: string; + readonly tarball: string; + readonly integrity?: string; + }; + readonly deprecated?: string; + readonly dependencies?: {readonly [name: string]: string}; + readonly optionalDependencies?: {readonly [name: string]: string}; + readonly devDependencies?: {readonly [name: string]: string}; + readonly bundleDependencies?: {readonly [name: string]: string}; + readonly peerDependencies?: {readonly [name: string]: string}; + readonly bin?: {readonly [key: string]: string}; + readonly directories?: readonly string[]; + readonly engines?: {readonly [type: string]: string}; + readonly _hasShrinkwrap?: boolean; + readonly [key: string]: unknown; + } + + interface Person { + readonly name?: string; + readonly email?: string; + readonly url?: string; + } + + interface HoistedData { + readonly author?: Person; + readonly bugs?: + | {readonly url: string; readonly email?: string} + | {readonly url?: string; readonly email: string}; + readonly contributors?: readonly Person[]; + readonly description?: string; + readonly homepage?: string; + readonly keywords?: readonly string[]; + readonly license?: string; + readonly maintainers?: readonly Person[]; + readonly readme?: string; + readonly readmeFilename?: string; + readonly repository?: {readonly type: string; readonly url: string}; + } + + interface FullMetadata extends AbbreviatedMetadata, HoistedData { + readonly _id: string; + readonly _rev: string; + readonly time: { + readonly created: string; + readonly modified: string; + readonly [version: string]: string; + }; + readonly users?: {readonly [user: string]: boolean}; + readonly versions: {readonly [version: string]: FullVersion}; + readonly [key: string]: unknown; + } + + interface FullVersion extends AbbreviatedVersion, HoistedData { + readonly _id: string; + readonly _nodeVersion: string; + readonly _npmUser: string; + readonly _npmVersion: string; + readonly main?: string; + readonly files?: readonly string[]; + readonly man?: readonly string[]; + readonly scripts?: {readonly [scriptName: string]: string}; + readonly gitHead?: string; + readonly types?: string; + readonly typings?: string; + readonly [key: string]: unknown; + } + + type VersionNotFoundError = VersionNotFoundErrorClass; + type PackageNotFoundError = PackageNotFoundErrorClass; +} + +declare const packageJson: { + /** + Get metadata of a package from the npm registry. + + @param packageName - Name of the package. 
+ + @example + ``` + import packageJson = require('package-json'); + + (async () => { + console.log(await packageJson('ava')); + //=> {name: 'ava', ...} + + // Also works with scoped packages + console.log(await packageJson('@sindresorhus/df')); + })(); + ``` + */ + (packageName: string, options: packageJson.FullMetadataOptions): Promise< + packageJson.FullMetadata + >; + (packageName: string, options?: packageJson.Options): Promise< + packageJson.AbbreviatedMetadata + >; + + /** + The error thrown when the given package version cannot be found. + */ + VersionNotFoundError: typeof VersionNotFoundErrorClass; + + /** + The error thrown when the given package name cannot be found. + */ + PackageNotFoundError: typeof PackageNotFoundErrorClass; + + // TODO: remove this in the next major version + default: typeof packageJson; +}; + +export = packageJson; diff --git a/node_modules/package-json/index.js b/node_modules/package-json/index.js new file mode 100644 index 000000000..b19cf2e3e --- /dev/null +++ b/node_modules/package-json/index.js @@ -0,0 +1,115 @@ +'use strict'; +const {URL} = require('url'); +const {Agent: HttpAgent} = require('http'); +const {Agent: HttpsAgent} = require('https'); +const got = require('got'); +const registryUrl = require('registry-url'); +const registryAuthToken = require('registry-auth-token'); +const semver = require('semver'); + +// These agent options are chosen to match the npm client defaults and help with performance +// See: `npm config get maxsockets` and #50 +const agentOptions = { + keepAlive: true, + maxSockets: 50 +}; +const httpAgent = new HttpAgent(agentOptions); +const httpsAgent = new HttpsAgent(agentOptions); + +class PackageNotFoundError extends Error { + constructor(packageName) { + super(`Package \`${packageName}\` could not be found`); + this.name = 'PackageNotFoundError'; + } +} + +class VersionNotFoundError extends Error { + constructor(packageName, version) { + super(`Version \`${version}\` for package \`${packageName}\` could not be found`); + this.name = 'VersionNotFoundError'; + } +} + +const packageJson = async (packageName, options) => { + options = { + version: 'latest', + ...options + }; + + const scope = packageName.split('/')[0]; + const registryUrl_ = options.registryUrl || registryUrl(scope); + const packageUrl = new URL(encodeURIComponent(packageName).replace(/^%40/, '@'), registryUrl_); + const authInfo = registryAuthToken(registryUrl_.toString(), {recursive: true}); + + const headers = { + accept: 'application/vnd.npm.install-v1+json; q=1.0, application/json; q=0.8, */*' + }; + + if (options.fullMetadata) { + delete headers.accept; + } + + if (authInfo) { + headers.authorization = `${authInfo.type} ${authInfo.token}`; + } + + const gotOptions = { + json: true, + headers, + agent: { + http: httpAgent, + https: httpsAgent + } + }; + + if (options.agent) { + gotOptions.agent = options.agent; + } + + let response; + try { + response = await got(packageUrl, gotOptions); + } catch (error) { + if (error.statusCode === 404) { + throw new PackageNotFoundError(packageName); + } + + throw error; + } + + let data = response.body; + + if (options.allVersions) { + return data; + } + + let {version} = options; + const versionError = new VersionNotFoundError(packageName, version); + + if (data['dist-tags'][version]) { + data = data.versions[data['dist-tags'][version]]; + } else if (version) { + if (!data.versions[version]) { + const versions = Object.keys(data.versions); + version = semver.maxSatisfying(versions, version); + + if (!version) { + 
throw versionError; + } + } + + data = data.versions[version]; + + if (!data) { + throw versionError; + } + } + + return data; +}; + +module.exports = packageJson; +// TODO: remove this in the next major version +module.exports.default = packageJson; +module.exports.PackageNotFoundError = PackageNotFoundError; +module.exports.VersionNotFoundError = VersionNotFoundError; diff --git a/node_modules/package-json/license b/node_modules/package-json/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/package-json/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/package-json/node_modules/.bin/semver b/node_modules/package-json/node_modules/.bin/semver new file mode 100644 index 000000000..7e365277d --- /dev/null +++ b/node_modules/package-json/node_modules/.bin/semver @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../semver/bin/semver.js" "$@" + ret=$? +else + node "$basedir/../semver/bin/semver.js" "$@" + ret=$? 
+fi +exit $ret diff --git a/node_modules/package-json/node_modules/.bin/semver.cmd b/node_modules/package-json/node_modules/.bin/semver.cmd new file mode 100644 index 000000000..164cdeac5 --- /dev/null +++ b/node_modules/package-json/node_modules/.bin/semver.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\semver\bin\semver.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/package-json/node_modules/.bin/semver.ps1 b/node_modules/package-json/node_modules/.bin/semver.ps1 new file mode 100644 index 000000000..6a85e3405 --- /dev/null +++ b/node_modules/package-json/node_modules/.bin/semver.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/package-json/node_modules/semver/CHANGELOG.md b/node_modules/package-json/node_modules/semver/CHANGELOG.md new file mode 100644 index 000000000..f567dd3fe --- /dev/null +++ b/node_modules/package-json/node_modules/semver/CHANGELOG.md @@ -0,0 +1,70 @@ +# changes log + +## 6.2.0 + +* Coerce numbers to strings when passed to semver.coerce() +* Add `rtl` option to coerce from right to left + +## 6.1.3 + +* Handle X-ranges properly in includePrerelease mode + +## 6.1.2 + +* Do not throw when testing invalid version strings + +## 6.1.1 + +* Add options support for semver.coerce() +* Handle undefined version passed to Range.test + +## 6.1.0 + +* Add semver.compareBuild function +* Support `*` in semver.intersects + +## 6.0 + +* Fix `intersects` logic. + + This is technically a bug fix, but since it is also a change to behavior + that may require users updating their code, it is marked as a major + version increment. + +## 5.7 + +* Add `minVersion` method + +## 5.6 + +* Move boolean `loose` param to an options object, with + backwards-compatibility protection. +* Add ability to opt out of special prerelease version handling with + the `includePrerelease` option flag. + +## 5.5 + +* Add version coercion capabilities + +## 5.4 + +* Add intersection checking + +## 5.3 + +* Add `minSatisfying` method + +## 5.2 + +* Add `prerelease(v)` that returns prerelease components + +## 5.1 + +* Add Backus-Naur for ranges +* Remove excessively cute inspection methods + +## 5.0 + +* Remove AMD/Browserified build artifacts +* Fix ltr and gtr when using the `*` range +* Fix for range `*` with a prerelease identifier diff --git a/node_modules/package-json/node_modules/semver/LICENSE b/node_modules/package-json/node_modules/semver/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/package-json/node_modules/semver/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. 
+ +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/package-json/node_modules/semver/README.md b/node_modules/package-json/node_modules/semver/README.md new file mode 100644 index 000000000..2293a14fd --- /dev/null +++ b/node_modules/package-json/node_modules/semver/README.md @@ -0,0 +1,443 @@ +semver(1) -- The semantic versioner for npm +=========================================== + +## Install + +```bash +npm install semver +```` + +## Usage + +As a node module: + +```js +const semver = require('semver') + +semver.valid('1.2.3') // '1.2.3' +semver.valid('a.b.c') // null +semver.clean(' =v1.2.3 ') // '1.2.3' +semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true +semver.gt('1.2.3', '9.8.7') // false +semver.lt('1.2.3', '9.8.7') // true +semver.minVersion('>=1.0.0') // '1.0.0' +semver.valid(semver.coerce('v2')) // '2.0.0' +semver.valid(semver.coerce('42.6.7.9.3-alpha')) // '42.6.7' +``` + +As a command-line utility: + +``` +$ semver -h + +A JavaScript implementation of the https://semver.org/ specification +Copyright Isaac Z. Schlueter + +Usage: semver [options] [ [...]] +Prints valid versions sorted by SemVer precedence + +Options: +-r --range + Print versions that match the specified range. + +-i --increment [] + Increment a version by the specified level. Level can + be one of: major, minor, patch, premajor, preminor, + prepatch, or prerelease. Default level is 'patch'. + Only one version may be specified. + +--preid + Identifier to be used to prefix premajor, preminor, + prepatch or prerelease version increments. + +-l --loose + Interpret versions and ranges loosely + +-p --include-prerelease + Always include prerelease versions in range matching + +-c --coerce + Coerce a string into SemVer if possible + (does not imply --loose) + +--rtl + Coerce version strings right to left + +--ltr + Coerce version strings left to right (default) + +Program exits successfully if any valid version satisfies +all supplied ranges, and prints all satisfying versions. + +If no satisfying versions are found, then exits failure. + +Versions are printed in ascending order, so supplying +multiple versions to the utility will just sort them. +``` + +## Versions + +A "version" is described by the `v2.0.0` specification found at +. + +A leading `"="` or `"v"` character is stripped off and ignored. + +## Ranges + +A `version range` is a set of `comparators` which specify versions +that satisfy the range. + +A `comparator` is composed of an `operator` and a `version`. The set +of primitive `operators` is: + +* `<` Less than +* `<=` Less than or equal to +* `>` Greater than +* `>=` Greater than or equal to +* `=` Equal. If no operator is specified, then equality is assumed, + so this operator is optional, but MAY be included. + +For example, the comparator `>=1.2.7` would match the versions +`1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` +or `1.1.0`. + +Comparators can be joined by whitespace to form a `comparator set`, +which is satisfied by the **intersection** of all of the comparators +it includes. 
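+
+To make the intersection rule concrete, here is a small sketch (added here, not part of the upstream README) using `semver.satisfies` from the Usage section above:
+
+```js
+const semver = require('semver')
+
+// A comparator set is satisfied only when every comparator in it holds.
+semver.satisfies('1.2.8', '>=1.2.7 <1.3.0') // true: satisfies both comparators
+semver.satisfies('1.3.9', '>=1.2.7 <1.3.0') // false: fails `<1.3.0`
+semver.satisfies('1.2.6', '>=1.2.7 <1.3.0') // false: fails `>=1.2.7`
+```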
+ +A range is composed of one or more comparator sets, joined by `||`. A +version matches a range if and only if every comparator in at least +one of the `||`-separated comparator sets is satisfied by the version. + +For example, the range `>=1.2.7 <1.3.0` would match the versions +`1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, +or `1.1.0`. + +The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, +`1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. + +### Prerelease Tags + +If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then +it will only be allowed to satisfy comparator sets if at least one +comparator with the same `[major, minor, patch]` tuple also has a +prerelease tag. + +For example, the range `>1.2.3-alpha.3` would be allowed to match the +version `1.2.3-alpha.7`, but it would *not* be satisfied by +`3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater +than" `1.2.3-alpha.3` according to the SemVer sort rules. The version +range only accepts prerelease tags on the `1.2.3` version. The +version `3.4.5` *would* satisfy the range, because it does not have a +prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. + +The purpose for this behavior is twofold. First, prerelease versions +frequently are updated very quickly, and contain many breaking changes +that are (by the author's design) not yet fit for public consumption. +Therefore, by default, they are excluded from range matching +semantics. + +Second, a user who has opted into using a prerelease version has +clearly indicated the intent to use *that specific* set of +alpha/beta/rc versions. By including a prerelease tag in the range, +the user is indicating that they are aware of the risk. However, it +is still not appropriate to assume that they have opted into taking a +similar risk on the *next* set of prerelease versions. + +Note that this behavior can be suppressed (treating all prerelease +versions as if they were normal versions, for the purpose of range +matching) by setting the `includePrerelease` flag on the options +object to any +[functions](https://github.com/npm/node-semver#functions) that do +range matching. + +#### Prerelease Identifiers + +The method `.inc` takes an additional `identifier` string argument that +will append the value of the string as a prerelease identifier: + +```javascript +semver.inc('1.2.3', 'prerelease', 'beta') +// '1.2.4-beta.0' +``` + +command-line example: + +```bash +$ semver 1.2.3 -i prerelease --preid beta +1.2.4-beta.0 +``` + +Which then can be used to increment further: + +```bash +$ semver 1.2.4-beta.0 -i prerelease +1.2.4-beta.1 +``` + +### Advanced Range Syntax + +Advanced range syntax desugars to primitive comparators in +deterministic ways. + +Advanced ranges may be combined in the same way as primitive +comparators using white space or `||`. + +#### Hyphen Ranges `X.Y.Z - A.B.C` + +Specifies an inclusive set. + +* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` + +If a partial version is provided as the first version in the inclusive +range, then the missing pieces are replaced with zeroes. + +* `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` + +If a partial version is provided as the second version in the +inclusive range, then all versions that start with the supplied parts +of the tuple are accepted, but nothing that would be greater than the +provided tuple parts. 
+ +* `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` +* `1.2.3 - 2` := `>=1.2.3 <3.0.0` + +#### X-Ranges `1.2.x` `1.X` `1.2.*` `*` + +Any of `X`, `x`, or `*` may be used to "stand in" for one of the +numeric values in the `[major, minor, patch]` tuple. + +* `*` := `>=0.0.0` (Any version satisfies) +* `1.x` := `>=1.0.0 <2.0.0` (Matching major version) +* `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) + +A partial version range is treated as an X-Range, so the special +character is in fact optional. + +* `""` (empty string) := `*` := `>=0.0.0` +* `1` := `1.x.x` := `>=1.0.0 <2.0.0` +* `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` + +#### Tilde Ranges `~1.2.3` `~1.2` `~1` + +Allows patch-level changes if a minor version is specified on the +comparator. Allows minor-level changes if not. + +* `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` +* `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) +* `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) +* `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` +* `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) +* `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) +* `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. + +#### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` + +Allows changes that do not modify the left-most non-zero element in the +`[major, minor, patch]` tuple. In other words, this allows patch and +minor updates for versions `1.0.0` and above, patch updates for +versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. + +Many authors treat a `0.x` version as if the `x` were the major +"breaking-change" indicator. + +Caret ranges are ideal when an author may make breaking changes +between `0.2.4` and `0.3.0` releases, which is a common practice. +However, it presumes that there will *not* be breaking changes between +`0.2.4` and `0.2.5`. It allows for changes that are presumed to be +additive (but non-breaking), according to commonly observed practices. + +* `^1.2.3` := `>=1.2.3 <2.0.0` +* `^0.2.3` := `>=0.2.3 <0.3.0` +* `^0.0.3` := `>=0.0.3 <0.0.4` +* `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. +* `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the + `0.0.3` version *only* will be allowed, if they are greater than or + equal to `beta`. So, `0.0.3-pr.2` would be allowed. + +When parsing caret ranges, a missing `patch` value desugars to the +number `0`, but will allow flexibility within that value, even if the +major and minor versions are both `0`. + +* `^1.2.x` := `>=1.2.0 <2.0.0` +* `^0.0.x` := `>=0.0.0 <0.1.0` +* `^0.0` := `>=0.0.0 <0.1.0` + +A missing `minor` and `patch` values will desugar to zero, but also +allow flexibility within those values, even if the major version is +zero. 
+ +* `^1.x` := `>=1.0.0 <2.0.0` +* `^0.x` := `>=0.0.0 <1.0.0` + +### Range Grammar + +Putting all this together, here is a Backus-Naur grammar for ranges, +for the benefit of parser authors: + +```bnf +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | ['1'-'9'] ( ['0'-'9'] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' part ) * +part ::= nr | [-0-9A-Za-z]+ +``` + +## Functions + +All methods and classes take a final `options` object argument. All +options in this object are `false` by default. The options supported +are: + +- `loose` Be more forgiving about not-quite-valid semver strings. + (Any resulting output will always be 100% strict compliant, of + course.) For backwards compatibility reasons, if the `options` + argument is a boolean value instead of an object, it is interpreted + to be the `loose` param. +- `includePrerelease` Set to suppress the [default + behavior](https://github.com/npm/node-semver#prerelease-tags) of + excluding prerelease tagged versions from ranges unless they are + explicitly opted into. + +Strict-mode Comparators and Ranges will be strict about the SemVer +strings that they parse. + +* `valid(v)`: Return the parsed version, or null if it's not valid. +* `inc(v, release)`: Return the version incremented by the release + type (`major`, `premajor`, `minor`, `preminor`, `patch`, + `prepatch`, or `prerelease`), or null if it's not valid + * `premajor` in one call will bump the version up to the next major + version and down to a prerelease of that major version. + `preminor`, and `prepatch` work the same way. + * If called from a non-prerelease version, the `prerelease` will work the + same as `prepatch`. It increments the patch version, then makes a + prerelease. If the input version is already a prerelease it simply + increments it. +* `prerelease(v)`: Returns an array of prerelease components, or null + if none exist. Example: `prerelease('1.2.3-alpha.1') -> ['alpha', 1]` +* `major(v)`: Return the major version number. +* `minor(v)`: Return the minor version number. +* `patch(v)`: Return the patch version number. +* `intersects(r1, r2, loose)`: Return true if the two supplied ranges + or comparators intersect. +* `parse(v)`: Attempt to parse a string as a semantic version, returning either + a `SemVer` object or `null`. + +### Comparison + +* `gt(v1, v2)`: `v1 > v2` +* `gte(v1, v2)`: `v1 >= v2` +* `lt(v1, v2)`: `v1 < v2` +* `lte(v1, v2)`: `v1 <= v2` +* `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, + even if they're not the exact same string. You already know how to + compare strings. +* `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. +* `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call + the corresponding function above. `"==="` and `"!=="` do simple + string comparison, but are included for completeness. Throws if an + invalid comparison string is provided. +* `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `rcompare(v1, v2)`: The reverse of compare. 
Sorts an array of versions + in descending order when passed to `Array.sort()`. +* `compareBuild(v1, v2)`: The same as `compare` but considers `build` when two versions + are equal. Sorts in ascending order if passed to `Array.sort()`. + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `diff(v1, v2)`: Returns difference between two versions by the release type + (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), + or null if the versions are the same. + +### Comparators + +* `intersects(comparator)`: Return true if the comparators intersect + +### Ranges + +* `validRange(range)`: Return the valid range or null if it's not valid +* `satisfies(version, range)`: Return true if the version satisfies the + range. +* `maxSatisfying(versions, range)`: Return the highest version in the list + that satisfies the range, or `null` if none of them do. +* `minSatisfying(versions, range)`: Return the lowest version in the list + that satisfies the range, or `null` if none of them do. +* `minVersion(range)`: Return the lowest version that can possibly match + the given range. +* `gtr(version, range)`: Return `true` if version is greater than all the + versions possible in the range. +* `ltr(version, range)`: Return `true` if version is less than all the + versions possible in the range. +* `outside(version, range, hilo)`: Return true if the version is outside + the bounds of the range in either the high or low direction. The + `hilo` argument must be either the string `'>'` or `'<'`. (This is + the function called by `gtr` and `ltr`.) +* `intersects(range)`: Return true if any of the ranges comparators intersect + +Note that, since ranges may be non-contiguous, a version might not be +greater than a range, less than a range, *or* satisfy a range! For +example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` +until `2.0.0`, so the version `1.2.10` would not be greater than the +range (because `2.0.1` satisfies, which is higher), nor less than the +range (since `1.2.8` satisfies, which is lower), and it also does not +satisfy the range. + +If you want to know if a version satisfies or does not satisfy a +range, use the `satisfies(version, range)` function. + +### Coercion + +* `coerce(version, options)`: Coerces a string to semver if possible + +This aims to provide a very forgiving translation of a non-semver string to +semver. It looks for the first digit in a string, and consumes all +remaining characters which satisfy at least a partial semver (e.g., `1`, +`1.2`, `1.2.3`) up to the max permitted length (256 characters). Longer +versions are simply truncated (`4.6.3.9.2-alpha2` becomes `4.6.3`). All +surrounding text is simply ignored (`v3.4 replaces v3.3.1` becomes +`3.4.0`). Only text which lacks digits will fail coercion (`version one` +is not valid). The maximum length for any semver component considered for +coercion is 16 characters; longer components will be ignored +(`10000000000000000.4.7.4` becomes `4.7.4`). The maximum value for any +semver component is `Integer.MAX_SAFE_INTEGER || (2**53 - 1)`; higher value +components are invalid (`9999999999999999.4.7.4` is likely invalid). + +If the `options.rtl` flag is set, then `coerce` will return the right-most +coercible tuple that does not share an ending index with a longer coercible +tuple. For example, `1.2.3.4` will return `2.3.4` in rtl mode, not +`4.0.0`. `1.2.3/4` will return `4.0.0`, because the `4` is not a part of +any other overlapping SemVer tuple. 
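+
+The coercion rules above can be sketched as follows. This is a minimal illustration, assuming the package is loaded as `const semver = require('semver')`; `coerce` returns a `SemVer` object (or `null`), so `.version` is read off it here. The inputs reuse the examples already given in this section.
+
+```javascript
+const semver = require('semver')
+
+// Leading text and surplus numeric components are discarded.
+semver.coerce('v3.4 replaces v3.3.1').version    // '3.4.0'
+semver.coerce('4.6.3.9.2-alpha2').version        // '4.6.3'
+
+// Text without any digits cannot be coerced.
+semver.coerce('version one')                     // null
+
+// With `options.rtl`, the right-most coercible tuple is used.
+semver.coerce('1.2.3.4').version                 // '1.2.3'
+semver.coerce('1.2.3.4', { rtl: true }).version  // '2.3.4'
+```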
+ +### Clean + +* `clean(version)`: Clean a string to be a valid semver if possible + +This will return a cleaned and trimmed semver version. If the provided version is not valid a null will be returned. This does not work for ranges. + +ex. +* `s.clean(' = v 2.1.5foo')`: `null` +* `s.clean(' = v 2.1.5foo', { loose: true })`: `'2.1.5-foo'` +* `s.clean(' = v 2.1.5-foo')`: `null` +* `s.clean(' = v 2.1.5-foo', { loose: true })`: `'2.1.5-foo'` +* `s.clean('=v2.1.5')`: `'2.1.5'` +* `s.clean(' =v2.1.5')`: `2.1.5` +* `s.clean(' 2.1.5 ')`: `'2.1.5'` +* `s.clean('~1.0.0')`: `null` diff --git a/node_modules/package-json/node_modules/semver/bin/semver.js b/node_modules/package-json/node_modules/semver/bin/semver.js new file mode 100644 index 000000000..666034a75 --- /dev/null +++ b/node_modules/package-json/node_modules/semver/bin/semver.js @@ -0,0 +1,174 @@ +#!/usr/bin/env node +// Standalone semver comparison program. +// Exits successfully and prints matching version(s) if +// any supplied version is valid and passes all tests. + +var argv = process.argv.slice(2) + +var versions = [] + +var range = [] + +var inc = null + +var version = require('../package.json').version + +var loose = false + +var includePrerelease = false + +var coerce = false + +var rtl = false + +var identifier + +var semver = require('../semver') + +var reverse = false + +var options = {} + +main() + +function main () { + if (!argv.length) return help() + while (argv.length) { + var a = argv.shift() + var indexOfEqualSign = a.indexOf('=') + if (indexOfEqualSign !== -1) { + a = a.slice(0, indexOfEqualSign) + argv.unshift(a.slice(indexOfEqualSign + 1)) + } + switch (a) { + case '-rv': case '-rev': case '--rev': case '--reverse': + reverse = true + break + case '-l': case '--loose': + loose = true + break + case '-p': case '--include-prerelease': + includePrerelease = true + break + case '-v': case '--version': + versions.push(argv.shift()) + break + case '-i': case '--inc': case '--increment': + switch (argv[0]) { + case 'major': case 'minor': case 'patch': case 'prerelease': + case 'premajor': case 'preminor': case 'prepatch': + inc = argv.shift() + break + default: + inc = 'patch' + break + } + break + case '--preid': + identifier = argv.shift() + break + case '-r': case '--range': + range.push(argv.shift()) + break + case '-c': case '--coerce': + coerce = true + break + case '--rtl': + rtl = true + break + case '--ltr': + rtl = false + break + case '-h': case '--help': case '-?': + return help() + default: + versions.push(a) + break + } + } + + var options = { loose: loose, includePrerelease: includePrerelease, rtl: rtl } + + versions = versions.map(function (v) { + return coerce ? (semver.coerce(v, options) || { version: v }).version : v + }).filter(function (v) { + return semver.valid(v) + }) + if (!versions.length) return fail() + if (inc && (versions.length !== 1 || range.length)) { return failInc() } + + for (var i = 0, l = range.length; i < l; i++) { + versions = versions.filter(function (v) { + return semver.satisfies(v, range[i], options) + }) + if (!versions.length) return fail() + } + return success(versions) +} + +function failInc () { + console.error('--inc can only be used on a single version with no range') + fail() +} + +function fail () { process.exit(1) } + +function success () { + var compare = reverse ? 'rcompare' : 'compare' + versions.sort(function (a, b) { + return semver[compare](a, b, options) + }).map(function (v) { + return semver.clean(v, options) + }).map(function (v) { + return inc ? 
semver.inc(v, inc, options, identifier) : v + }).forEach(function (v, i, _) { console.log(v) }) +} + +function help () { + console.log(['SemVer ' + version, + '', + 'A JavaScript implementation of the https://semver.org/ specification', + 'Copyright Isaac Z. Schlueter', + '', + 'Usage: semver [options] [ [...]]', + 'Prints valid versions sorted by SemVer precedence', + '', + 'Options:', + '-r --range ', + ' Print versions that match the specified range.', + '', + '-i --increment []', + ' Increment a version by the specified level. Level can', + ' be one of: major, minor, patch, premajor, preminor,', + " prepatch, or prerelease. Default level is 'patch'.", + ' Only one version may be specified.', + '', + '--preid ', + ' Identifier to be used to prefix premajor, preminor,', + ' prepatch or prerelease version increments.', + '', + '-l --loose', + ' Interpret versions and ranges loosely', + '', + '-p --include-prerelease', + ' Always include prerelease versions in range matching', + '', + '-c --coerce', + ' Coerce a string into SemVer if possible', + ' (does not imply --loose)', + '', + '--rtl', + ' Coerce version strings right to left', + '', + '--ltr', + ' Coerce version strings left to right (default)', + '', + 'Program exits successfully if any valid version satisfies', + 'all supplied ranges, and prints all satisfying versions.', + '', + 'If no satisfying versions are found, then exits failure.', + '', + 'Versions are printed in ascending order, so supplying', + 'multiple versions to the utility will just sort them.' + ].join('\n')) +} diff --git a/node_modules/package-json/node_modules/semver/package.json b/node_modules/package-json/node_modules/semver/package.json new file mode 100644 index 000000000..58f1e99c9 --- /dev/null +++ b/node_modules/package-json/node_modules/semver/package.json @@ -0,0 +1,60 @@ +{ + "_from": "semver@^6.2.0", + "_id": "semver@6.3.0", + "_inBundle": false, + "_integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==", + "_location": "/package-json/semver", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "semver@^6.2.0", + "name": "semver", + "escapedName": "semver", + "rawSpec": "^6.2.0", + "saveSpec": null, + "fetchSpec": "^6.2.0" + }, + "_requiredBy": [ + "/package-json" + ], + "_resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz", + "_shasum": "ee0a64c8af5e8ceea67687b133761e1becbd1d3d", + "_spec": "semver@^6.2.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\package-json", + "bin": { + "semver": "bin/semver.js" + }, + "bugs": { + "url": "https://github.com/npm/node-semver/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "The semantic version parser used by npm.", + "devDependencies": { + "tap": "^14.3.1" + }, + "files": [ + "bin", + "range.bnf", + "semver.js" + ], + "homepage": "https://github.com/npm/node-semver#readme", + "license": "ISC", + "main": "semver.js", + "name": "semver", + "repository": { + "type": "git", + "url": "git+https://github.com/npm/node-semver.git" + }, + "scripts": { + "postpublish": "git push origin --follow-tags", + "postversion": "npm publish", + "preversion": "npm test", + "test": "tap" + }, + "tap": { + "check-coverage": true + }, + "version": "6.3.0" +} diff --git a/node_modules/package-json/node_modules/semver/range.bnf b/node_modules/package-json/node_modules/semver/range.bnf new file mode 100644 index 000000000..d4c6ae0d7 --- 
/dev/null +++ b/node_modules/package-json/node_modules/semver/range.bnf @@ -0,0 +1,16 @@ +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | [1-9] ( [0-9] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' part ) * +part ::= nr | [-0-9A-Za-z]+ diff --git a/node_modules/package-json/node_modules/semver/semver.js b/node_modules/package-json/node_modules/semver/semver.js new file mode 100644 index 000000000..636fa4365 --- /dev/null +++ b/node_modules/package-json/node_modules/semver/semver.js @@ -0,0 +1,1596 @@ +exports = module.exports = SemVer + +var debug +/* istanbul ignore next */ +if (typeof process === 'object' && + process.env && + process.env.NODE_DEBUG && + /\bsemver\b/i.test(process.env.NODE_DEBUG)) { + debug = function () { + var args = Array.prototype.slice.call(arguments, 0) + args.unshift('SEMVER') + console.log.apply(console, args) + } +} else { + debug = function () {} +} + +// Note: this is the semver.org version of the spec that it implements +// Not necessarily the package version of this code. +exports.SEMVER_SPEC_VERSION = '2.0.0' + +var MAX_LENGTH = 256 +var MAX_SAFE_INTEGER = Number.MAX_SAFE_INTEGER || + /* istanbul ignore next */ 9007199254740991 + +// Max safe segment length for coercion. +var MAX_SAFE_COMPONENT_LENGTH = 16 + +// The actual regexps go on exports.re +var re = exports.re = [] +var src = exports.src = [] +var t = exports.tokens = {} +var R = 0 + +function tok (n) { + t[n] = R++ +} + +// The following Regular Expressions can be used for tokenizing, +// validating, and parsing SemVer version strings. + +// ## Numeric Identifier +// A single `0`, or a non-zero digit followed by zero or more digits. + +tok('NUMERICIDENTIFIER') +src[t.NUMERICIDENTIFIER] = '0|[1-9]\\d*' +tok('NUMERICIDENTIFIERLOOSE') +src[t.NUMERICIDENTIFIERLOOSE] = '[0-9]+' + +// ## Non-numeric Identifier +// Zero or more digits, followed by a letter or hyphen, and then zero or +// more letters, digits, or hyphens. + +tok('NONNUMERICIDENTIFIER') +src[t.NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*' + +// ## Main Version +// Three dot-separated numeric identifiers. + +tok('MAINVERSION') +src[t.MAINVERSION] = '(' + src[t.NUMERICIDENTIFIER] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIER] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIER] + ')' + +tok('MAINVERSIONLOOSE') +src[t.MAINVERSIONLOOSE] = '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')' + +// ## Pre-release Version Identifier +// A numeric identifier, or a non-numeric identifier. + +tok('PRERELEASEIDENTIFIER') +src[t.PRERELEASEIDENTIFIER] = '(?:' + src[t.NUMERICIDENTIFIER] + + '|' + src[t.NONNUMERICIDENTIFIER] + ')' + +tok('PRERELEASEIDENTIFIERLOOSE') +src[t.PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[t.NUMERICIDENTIFIERLOOSE] + + '|' + src[t.NONNUMERICIDENTIFIER] + ')' + +// ## Pre-release Version +// Hyphen, followed by one or more dot-separated pre-release version +// identifiers. + +tok('PRERELEASE') +src[t.PRERELEASE] = '(?:-(' + src[t.PRERELEASEIDENTIFIER] + + '(?:\\.' 
+ src[t.PRERELEASEIDENTIFIER] + ')*))' + +tok('PRERELEASELOOSE') +src[t.PRERELEASELOOSE] = '(?:-?(' + src[t.PRERELEASEIDENTIFIERLOOSE] + + '(?:\\.' + src[t.PRERELEASEIDENTIFIERLOOSE] + ')*))' + +// ## Build Metadata Identifier +// Any combination of digits, letters, or hyphens. + +tok('BUILDIDENTIFIER') +src[t.BUILDIDENTIFIER] = '[0-9A-Za-z-]+' + +// ## Build Metadata +// Plus sign, followed by one or more period-separated build metadata +// identifiers. + +tok('BUILD') +src[t.BUILD] = '(?:\\+(' + src[t.BUILDIDENTIFIER] + + '(?:\\.' + src[t.BUILDIDENTIFIER] + ')*))' + +// ## Full Version String +// A main version, followed optionally by a pre-release version and +// build metadata. + +// Note that the only major, minor, patch, and pre-release sections of +// the version string are capturing groups. The build metadata is not a +// capturing group, because it should not ever be used in version +// comparison. + +tok('FULL') +tok('FULLPLAIN') +src[t.FULLPLAIN] = 'v?' + src[t.MAINVERSION] + + src[t.PRERELEASE] + '?' + + src[t.BUILD] + '?' + +src[t.FULL] = '^' + src[t.FULLPLAIN] + '$' + +// like full, but allows v1.2.3 and =1.2.3, which people do sometimes. +// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty +// common in the npm registry. +tok('LOOSEPLAIN') +src[t.LOOSEPLAIN] = '[v=\\s]*' + src[t.MAINVERSIONLOOSE] + + src[t.PRERELEASELOOSE] + '?' + + src[t.BUILD] + '?' + +tok('LOOSE') +src[t.LOOSE] = '^' + src[t.LOOSEPLAIN] + '$' + +tok('GTLT') +src[t.GTLT] = '((?:<|>)?=?)' + +// Something like "2.*" or "1.2.x". +// Note that "x.x" is a valid xRange identifer, meaning "any version" +// Only the first item is strictly required. +tok('XRANGEIDENTIFIERLOOSE') +src[t.XRANGEIDENTIFIERLOOSE] = src[t.NUMERICIDENTIFIERLOOSE] + '|x|X|\\*' +tok('XRANGEIDENTIFIER') +src[t.XRANGEIDENTIFIER] = src[t.NUMERICIDENTIFIER] + '|x|X|\\*' + +tok('XRANGEPLAIN') +src[t.XRANGEPLAIN] = '[v=\\s]*(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:' + src[t.PRERELEASE] + ')?' + + src[t.BUILD] + '?' + + ')?)?' + +tok('XRANGEPLAINLOOSE') +src[t.XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:' + src[t.PRERELEASELOOSE] + ')?' + + src[t.BUILD] + '?' + + ')?)?' + +tok('XRANGE') +src[t.XRANGE] = '^' + src[t.GTLT] + '\\s*' + src[t.XRANGEPLAIN] + '$' +tok('XRANGELOOSE') +src[t.XRANGELOOSE] = '^' + src[t.GTLT] + '\\s*' + src[t.XRANGEPLAINLOOSE] + '$' + +// Coercion. +// Extract anything that could conceivably be a part of a valid semver +tok('COERCE') +src[t.COERCE] = '(^|[^\\d])' + + '(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '})' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:$|[^\\d])' +tok('COERCERTL') +re[t.COERCERTL] = new RegExp(src[t.COERCE], 'g') + +// Tilde ranges. +// Meaning is "reasonably at or greater than" +tok('LONETILDE') +src[t.LONETILDE] = '(?:~>?)' + +tok('TILDETRIM') +src[t.TILDETRIM] = '(\\s*)' + src[t.LONETILDE] + '\\s+' +re[t.TILDETRIM] = new RegExp(src[t.TILDETRIM], 'g') +var tildeTrimReplace = '$1~' + +tok('TILDE') +src[t.TILDE] = '^' + src[t.LONETILDE] + src[t.XRANGEPLAIN] + '$' +tok('TILDELOOSE') +src[t.TILDELOOSE] = '^' + src[t.LONETILDE] + src[t.XRANGEPLAINLOOSE] + '$' + +// Caret ranges. 
+// Meaning is "at least and backwards compatible with" +tok('LONECARET') +src[t.LONECARET] = '(?:\\^)' + +tok('CARETTRIM') +src[t.CARETTRIM] = '(\\s*)' + src[t.LONECARET] + '\\s+' +re[t.CARETTRIM] = new RegExp(src[t.CARETTRIM], 'g') +var caretTrimReplace = '$1^' + +tok('CARET') +src[t.CARET] = '^' + src[t.LONECARET] + src[t.XRANGEPLAIN] + '$' +tok('CARETLOOSE') +src[t.CARETLOOSE] = '^' + src[t.LONECARET] + src[t.XRANGEPLAINLOOSE] + '$' + +// A simple gt/lt/eq thing, or just "" to indicate "any version" +tok('COMPARATORLOOSE') +src[t.COMPARATORLOOSE] = '^' + src[t.GTLT] + '\\s*(' + src[t.LOOSEPLAIN] + ')$|^$' +tok('COMPARATOR') +src[t.COMPARATOR] = '^' + src[t.GTLT] + '\\s*(' + src[t.FULLPLAIN] + ')$|^$' + +// An expression to strip any whitespace between the gtlt and the thing +// it modifies, so that `> 1.2.3` ==> `>1.2.3` +tok('COMPARATORTRIM') +src[t.COMPARATORTRIM] = '(\\s*)' + src[t.GTLT] + + '\\s*(' + src[t.LOOSEPLAIN] + '|' + src[t.XRANGEPLAIN] + ')' + +// this one has to use the /g flag +re[t.COMPARATORTRIM] = new RegExp(src[t.COMPARATORTRIM], 'g') +var comparatorTrimReplace = '$1$2$3' + +// Something like `1.2.3 - 1.2.4` +// Note that these all use the loose form, because they'll be +// checked against either the strict or loose comparator form +// later. +tok('HYPHENRANGE') +src[t.HYPHENRANGE] = '^\\s*(' + src[t.XRANGEPLAIN] + ')' + + '\\s+-\\s+' + + '(' + src[t.XRANGEPLAIN] + ')' + + '\\s*$' + +tok('HYPHENRANGELOOSE') +src[t.HYPHENRANGELOOSE] = '^\\s*(' + src[t.XRANGEPLAINLOOSE] + ')' + + '\\s+-\\s+' + + '(' + src[t.XRANGEPLAINLOOSE] + ')' + + '\\s*$' + +// Star ranges basically just allow anything at all. +tok('STAR') +src[t.STAR] = '(<|>)?=?\\s*\\*' + +// Compile to actual regexp objects. +// All are flag-free, unless they were created above with a flag. +for (var i = 0; i < R; i++) { + debug(i, src[i]) + if (!re[i]) { + re[i] = new RegExp(src[i]) + } +} + +exports.parse = parse +function parse (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (version instanceof SemVer) { + return version + } + + if (typeof version !== 'string') { + return null + } + + if (version.length > MAX_LENGTH) { + return null + } + + var r = options.loose ? re[t.LOOSE] : re[t.FULL] + if (!r.test(version)) { + return null + } + + try { + return new SemVer(version, options) + } catch (er) { + return null + } +} + +exports.valid = valid +function valid (version, options) { + var v = parse(version, options) + return v ? v.version : null +} + +exports.clean = clean +function clean (version, options) { + var s = parse(version.trim().replace(/^[=v]+/, ''), options) + return s ? s.version : null +} + +exports.SemVer = SemVer + +function SemVer (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + if (version instanceof SemVer) { + if (version.loose === options.loose) { + return version + } else { + version = version.version + } + } else if (typeof version !== 'string') { + throw new TypeError('Invalid Version: ' + version) + } + + if (version.length > MAX_LENGTH) { + throw new TypeError('version is longer than ' + MAX_LENGTH + ' characters') + } + + if (!(this instanceof SemVer)) { + return new SemVer(version, options) + } + + debug('SemVer', version, options) + this.options = options + this.loose = !!options.loose + + var m = version.trim().match(options.loose ? 
re[t.LOOSE] : re[t.FULL]) + + if (!m) { + throw new TypeError('Invalid Version: ' + version) + } + + this.raw = version + + // these are actually numbers + this.major = +m[1] + this.minor = +m[2] + this.patch = +m[3] + + if (this.major > MAX_SAFE_INTEGER || this.major < 0) { + throw new TypeError('Invalid major version') + } + + if (this.minor > MAX_SAFE_INTEGER || this.minor < 0) { + throw new TypeError('Invalid minor version') + } + + if (this.patch > MAX_SAFE_INTEGER || this.patch < 0) { + throw new TypeError('Invalid patch version') + } + + // numberify any prerelease numeric ids + if (!m[4]) { + this.prerelease = [] + } else { + this.prerelease = m[4].split('.').map(function (id) { + if (/^[0-9]+$/.test(id)) { + var num = +id + if (num >= 0 && num < MAX_SAFE_INTEGER) { + return num + } + } + return id + }) + } + + this.build = m[5] ? m[5].split('.') : [] + this.format() +} + +SemVer.prototype.format = function () { + this.version = this.major + '.' + this.minor + '.' + this.patch + if (this.prerelease.length) { + this.version += '-' + this.prerelease.join('.') + } + return this.version +} + +SemVer.prototype.toString = function () { + return this.version +} + +SemVer.prototype.compare = function (other) { + debug('SemVer.compare', this.version, this.options, other) + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return this.compareMain(other) || this.comparePre(other) +} + +SemVer.prototype.compareMain = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return compareIdentifiers(this.major, other.major) || + compareIdentifiers(this.minor, other.minor) || + compareIdentifiers(this.patch, other.patch) +} + +SemVer.prototype.comparePre = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + // NOT having a prerelease is > having one + if (this.prerelease.length && !other.prerelease.length) { + return -1 + } else if (!this.prerelease.length && other.prerelease.length) { + return 1 + } else if (!this.prerelease.length && !other.prerelease.length) { + return 0 + } + + var i = 0 + do { + var a = this.prerelease[i] + var b = other.prerelease[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +SemVer.prototype.compareBuild = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + var i = 0 + do { + var a = this.build[i] + var b = other.build[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +// preminor will bump the version up to the next minor release, and immediately +// down to pre-release. premajor and prepatch work the same way. 
+SemVer.prototype.inc = function (release, identifier) { + switch (release) { + case 'premajor': + this.prerelease.length = 0 + this.patch = 0 + this.minor = 0 + this.major++ + this.inc('pre', identifier) + break + case 'preminor': + this.prerelease.length = 0 + this.patch = 0 + this.minor++ + this.inc('pre', identifier) + break + case 'prepatch': + // If this is already a prerelease, it will bump to the next version + // drop any prereleases that might already exist, since they are not + // relevant at this point. + this.prerelease.length = 0 + this.inc('patch', identifier) + this.inc('pre', identifier) + break + // If the input is a non-prerelease version, this acts the same as + // prepatch. + case 'prerelease': + if (this.prerelease.length === 0) { + this.inc('patch', identifier) + } + this.inc('pre', identifier) + break + + case 'major': + // If this is a pre-major version, bump up to the same major version. + // Otherwise increment major. + // 1.0.0-5 bumps to 1.0.0 + // 1.1.0 bumps to 2.0.0 + if (this.minor !== 0 || + this.patch !== 0 || + this.prerelease.length === 0) { + this.major++ + } + this.minor = 0 + this.patch = 0 + this.prerelease = [] + break + case 'minor': + // If this is a pre-minor version, bump up to the same minor version. + // Otherwise increment minor. + // 1.2.0-5 bumps to 1.2.0 + // 1.2.1 bumps to 1.3.0 + if (this.patch !== 0 || this.prerelease.length === 0) { + this.minor++ + } + this.patch = 0 + this.prerelease = [] + break + case 'patch': + // If this is not a pre-release version, it will increment the patch. + // If it is a pre-release it will bump up to the same patch version. + // 1.2.0-5 patches to 1.2.0 + // 1.2.0 patches to 1.2.1 + if (this.prerelease.length === 0) { + this.patch++ + } + this.prerelease = [] + break + // This probably shouldn't be used publicly. + // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. 
+ case 'pre': + if (this.prerelease.length === 0) { + this.prerelease = [0] + } else { + var i = this.prerelease.length + while (--i >= 0) { + if (typeof this.prerelease[i] === 'number') { + this.prerelease[i]++ + i = -2 + } + } + if (i === -1) { + // didn't increment anything + this.prerelease.push(0) + } + } + if (identifier) { + // 1.2.0-beta.1 bumps to 1.2.0-beta.2, + // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 + if (this.prerelease[0] === identifier) { + if (isNaN(this.prerelease[1])) { + this.prerelease = [identifier, 0] + } + } else { + this.prerelease = [identifier, 0] + } + } + break + + default: + throw new Error('invalid increment argument: ' + release) + } + this.format() + this.raw = this.version + return this +} + +exports.inc = inc +function inc (version, release, loose, identifier) { + if (typeof (loose) === 'string') { + identifier = loose + loose = undefined + } + + try { + return new SemVer(version, loose).inc(release, identifier).version + } catch (er) { + return null + } +} + +exports.diff = diff +function diff (version1, version2) { + if (eq(version1, version2)) { + return null + } else { + var v1 = parse(version1) + var v2 = parse(version2) + var prefix = '' + if (v1.prerelease.length || v2.prerelease.length) { + prefix = 'pre' + var defaultResult = 'prerelease' + } + for (var key in v1) { + if (key === 'major' || key === 'minor' || key === 'patch') { + if (v1[key] !== v2[key]) { + return prefix + key + } + } + } + return defaultResult // may be undefined + } +} + +exports.compareIdentifiers = compareIdentifiers + +var numeric = /^[0-9]+$/ +function compareIdentifiers (a, b) { + var anum = numeric.test(a) + var bnum = numeric.test(b) + + if (anum && bnum) { + a = +a + b = +b + } + + return a === b ? 0 + : (anum && !bnum) ? -1 + : (bnum && !anum) ? 1 + : a < b ? 
-1 + : 1 +} + +exports.rcompareIdentifiers = rcompareIdentifiers +function rcompareIdentifiers (a, b) { + return compareIdentifiers(b, a) +} + +exports.major = major +function major (a, loose) { + return new SemVer(a, loose).major +} + +exports.minor = minor +function minor (a, loose) { + return new SemVer(a, loose).minor +} + +exports.patch = patch +function patch (a, loose) { + return new SemVer(a, loose).patch +} + +exports.compare = compare +function compare (a, b, loose) { + return new SemVer(a, loose).compare(new SemVer(b, loose)) +} + +exports.compareLoose = compareLoose +function compareLoose (a, b) { + return compare(a, b, true) +} + +exports.compareBuild = compareBuild +function compareBuild (a, b, loose) { + var versionA = new SemVer(a, loose) + var versionB = new SemVer(b, loose) + return versionA.compare(versionB) || versionA.compareBuild(versionB) +} + +exports.rcompare = rcompare +function rcompare (a, b, loose) { + return compare(b, a, loose) +} + +exports.sort = sort +function sort (list, loose) { + return list.sort(function (a, b) { + return exports.compareBuild(a, b, loose) + }) +} + +exports.rsort = rsort +function rsort (list, loose) { + return list.sort(function (a, b) { + return exports.compareBuild(b, a, loose) + }) +} + +exports.gt = gt +function gt (a, b, loose) { + return compare(a, b, loose) > 0 +} + +exports.lt = lt +function lt (a, b, loose) { + return compare(a, b, loose) < 0 +} + +exports.eq = eq +function eq (a, b, loose) { + return compare(a, b, loose) === 0 +} + +exports.neq = neq +function neq (a, b, loose) { + return compare(a, b, loose) !== 0 +} + +exports.gte = gte +function gte (a, b, loose) { + return compare(a, b, loose) >= 0 +} + +exports.lte = lte +function lte (a, b, loose) { + return compare(a, b, loose) <= 0 +} + +exports.cmp = cmp +function cmp (a, op, b, loose) { + switch (op) { + case '===': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a === b + + case '!==': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a !== b + + case '': + case '=': + case '==': + return eq(a, b, loose) + + case '!=': + return neq(a, b, loose) + + case '>': + return gt(a, b, loose) + + case '>=': + return gte(a, b, loose) + + case '<': + return lt(a, b, loose) + + case '<=': + return lte(a, b, loose) + + default: + throw new TypeError('Invalid operator: ' + op) + } +} + +exports.Comparator = Comparator +function Comparator (comp, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (comp instanceof Comparator) { + if (comp.loose === !!options.loose) { + return comp + } else { + comp = comp.value + } + } + + if (!(this instanceof Comparator)) { + return new Comparator(comp, options) + } + + debug('comparator', comp, options) + this.options = options + this.loose = !!options.loose + this.parse(comp) + + if (this.semver === ANY) { + this.value = '' + } else { + this.value = this.operator + this.semver.version + } + + debug('comp', this) +} + +var ANY = {} +Comparator.prototype.parse = function (comp) { + var r = this.options.loose ? re[t.COMPARATORLOOSE] : re[t.COMPARATOR] + var m = comp.match(r) + + if (!m) { + throw new TypeError('Invalid comparator: ' + comp) + } + + this.operator = m[1] !== undefined ? m[1] : '' + if (this.operator === '=') { + this.operator = '' + } + + // if it literally is just '>' or '' then allow anything. 
+ if (!m[2]) { + this.semver = ANY + } else { + this.semver = new SemVer(m[2], this.options.loose) + } +} + +Comparator.prototype.toString = function () { + return this.value +} + +Comparator.prototype.test = function (version) { + debug('Comparator.test', version, this.options.loose) + + if (this.semver === ANY || version === ANY) { + return true + } + + if (typeof version === 'string') { + try { + version = new SemVer(version, this.options) + } catch (er) { + return false + } + } + + return cmp(version, this.operator, this.semver, this.options) +} + +Comparator.prototype.intersects = function (comp, options) { + if (!(comp instanceof Comparator)) { + throw new TypeError('a Comparator is required') + } + + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + var rangeTmp + + if (this.operator === '') { + if (this.value === '') { + return true + } + rangeTmp = new Range(comp.value, options) + return satisfies(this.value, rangeTmp, options) + } else if (comp.operator === '') { + if (comp.value === '') { + return true + } + rangeTmp = new Range(this.value, options) + return satisfies(comp.semver, rangeTmp, options) + } + + var sameDirectionIncreasing = + (this.operator === '>=' || this.operator === '>') && + (comp.operator === '>=' || comp.operator === '>') + var sameDirectionDecreasing = + (this.operator === '<=' || this.operator === '<') && + (comp.operator === '<=' || comp.operator === '<') + var sameSemVer = this.semver.version === comp.semver.version + var differentDirectionsInclusive = + (this.operator === '>=' || this.operator === '<=') && + (comp.operator === '>=' || comp.operator === '<=') + var oppositeDirectionsLessThan = + cmp(this.semver, '<', comp.semver, options) && + ((this.operator === '>=' || this.operator === '>') && + (comp.operator === '<=' || comp.operator === '<')) + var oppositeDirectionsGreaterThan = + cmp(this.semver, '>', comp.semver, options) && + ((this.operator === '<=' || this.operator === '<') && + (comp.operator === '>=' || comp.operator === '>')) + + return sameDirectionIncreasing || sameDirectionDecreasing || + (sameSemVer && differentDirectionsInclusive) || + oppositeDirectionsLessThan || oppositeDirectionsGreaterThan +} + +exports.Range = Range +function Range (range, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (range instanceof Range) { + if (range.loose === !!options.loose && + range.includePrerelease === !!options.includePrerelease) { + return range + } else { + return new Range(range.raw, options) + } + } + + if (range instanceof Comparator) { + return new Range(range.value, options) + } + + if (!(this instanceof Range)) { + return new Range(range, options) + } + + this.options = options + this.loose = !!options.loose + this.includePrerelease = !!options.includePrerelease + + // First, split based on boolean or || + this.raw = range + this.set = range.split(/\s*\|\|\s*/).map(function (range) { + return this.parseRange(range.trim()) + }, this).filter(function (c) { + // throw out any that are not relevant for whatever reason + return c.length + }) + + if (!this.set.length) { + throw new TypeError('Invalid SemVer Range: ' + range) + } + + this.format() +} + +Range.prototype.format = function () { + this.range = this.set.map(function (comps) { + return comps.join(' ').trim() + }).join('||').trim() + return this.range +} + +Range.prototype.toString = function () { + return this.range +} + 
+Range.prototype.parseRange = function (range) { + var loose = this.options.loose + range = range.trim() + // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` + var hr = loose ? re[t.HYPHENRANGELOOSE] : re[t.HYPHENRANGE] + range = range.replace(hr, hyphenReplace) + debug('hyphen replace', range) + // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` + range = range.replace(re[t.COMPARATORTRIM], comparatorTrimReplace) + debug('comparator trim', range, re[t.COMPARATORTRIM]) + + // `~ 1.2.3` => `~1.2.3` + range = range.replace(re[t.TILDETRIM], tildeTrimReplace) + + // `^ 1.2.3` => `^1.2.3` + range = range.replace(re[t.CARETTRIM], caretTrimReplace) + + // normalize spaces + range = range.split(/\s+/).join(' ') + + // At this point, the range is completely trimmed and + // ready to be split into comparators. + + var compRe = loose ? re[t.COMPARATORLOOSE] : re[t.COMPARATOR] + var set = range.split(' ').map(function (comp) { + return parseComparator(comp, this.options) + }, this).join(' ').split(/\s+/) + if (this.options.loose) { + // in loose mode, throw out any that are not valid comparators + set = set.filter(function (comp) { + return !!comp.match(compRe) + }) + } + set = set.map(function (comp) { + return new Comparator(comp, this.options) + }, this) + + return set +} + +Range.prototype.intersects = function (range, options) { + if (!(range instanceof Range)) { + throw new TypeError('a Range is required') + } + + return this.set.some(function (thisComparators) { + return ( + isSatisfiable(thisComparators, options) && + range.set.some(function (rangeComparators) { + return ( + isSatisfiable(rangeComparators, options) && + thisComparators.every(function (thisComparator) { + return rangeComparators.every(function (rangeComparator) { + return thisComparator.intersects(rangeComparator, options) + }) + }) + ) + }) + ) + }) +} + +// take a set of comparators and determine whether there +// exists a version which can satisfy it +function isSatisfiable (comparators, options) { + var result = true + var remainingComparators = comparators.slice() + var testComparator = remainingComparators.pop() + + while (result && remainingComparators.length) { + result = remainingComparators.every(function (otherComparator) { + return testComparator.intersects(otherComparator, options) + }) + + testComparator = remainingComparators.pop() + } + + return result +} + +// Mostly just for testing and legacy API reasons +exports.toComparators = toComparators +function toComparators (range, options) { + return new Range(range, options).set.map(function (comp) { + return comp.map(function (c) { + return c.value + }).join(' ').trim().split(' ') + }) +} + +// comprised of xranges, tildes, stars, and gtlt's at this point. +// already replaced the hyphen ranges +// turn into a set of JUST comparators. 
+function parseComparator (comp, options) { + debug('comp', comp, options) + comp = replaceCarets(comp, options) + debug('caret', comp) + comp = replaceTildes(comp, options) + debug('tildes', comp) + comp = replaceXRanges(comp, options) + debug('xrange', comp) + comp = replaceStars(comp, options) + debug('stars', comp) + return comp +} + +function isX (id) { + return !id || id.toLowerCase() === 'x' || id === '*' +} + +// ~, ~> --> * (any, kinda silly) +// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 +// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 +// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 +// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 +// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 +function replaceTildes (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceTilde(comp, options) + }).join(' ') +} + +function replaceTilde (comp, options) { + var r = options.loose ? re[t.TILDELOOSE] : re[t.TILDE] + return comp.replace(r, function (_, M, m, p, pr) { + debug('tilde', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + // ~1.2 == >=1.2.0 <1.3.0 + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else if (pr) { + debug('replaceTilde pr', pr) + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } else { + // ~1.2.3 == >=1.2.3 <1.3.0 + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + + debug('tilde return', ret) + return ret + }) +} + +// ^ --> * (any, kinda silly) +// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 +// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 +// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 +// ^1.2.3 --> >=1.2.3 <2.0.0 +// ^1.2.0 --> >=1.2.0 <2.0.0 +function replaceCarets (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceCaret(comp, options) + }).join(' ') +} + +function replaceCaret (comp, options) { + debug('caret', comp, options) + var r = options.loose ? re[t.CARETLOOSE] : re[t.CARET] + return comp.replace(r, function (_, M, m, p, pr) { + debug('caret', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + if (M === '0') { + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else { + ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0' + } + } else if (pr) { + debug('replaceCaret pr', pr) + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + (+M + 1) + '.0.0' + } + } else { + debug('no pr') + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + (+M + 1) + '.0.0' + } + } + + debug('caret return', ret) + return ret + }) +} + +function replaceXRanges (comp, options) { + debug('replaceXRanges', comp, options) + return comp.split(/\s+/).map(function (comp) { + return replaceXRange(comp, options) + }).join(' ') +} + +function replaceXRange (comp, options) { + comp = comp.trim() + var r = options.loose ? 
re[t.XRANGELOOSE] : re[t.XRANGE] + return comp.replace(r, function (ret, gtlt, M, m, p, pr) { + debug('xRange', comp, ret, gtlt, M, m, p, pr) + var xM = isX(M) + var xm = xM || isX(m) + var xp = xm || isX(p) + var anyX = xp + + if (gtlt === '=' && anyX) { + gtlt = '' + } + + // if we're including prereleases in the match, then we need + // to fix this to -0, the lowest possible prerelease value + pr = options.includePrerelease ? '-0' : '' + + if (xM) { + if (gtlt === '>' || gtlt === '<') { + // nothing is allowed + ret = '<0.0.0-0' + } else { + // nothing is forbidden + ret = '*' + } + } else if (gtlt && anyX) { + // we know patch is an x, because we have any x at all. + // replace X with 0 + if (xm) { + m = 0 + } + p = 0 + + if (gtlt === '>') { + // >1 => >=2.0.0 + // >1.2 => >=1.3.0 + // >1.2.3 => >= 1.2.4 + gtlt = '>=' + if (xm) { + M = +M + 1 + m = 0 + p = 0 + } else { + m = +m + 1 + p = 0 + } + } else if (gtlt === '<=') { + // <=0.7.x is actually <0.8.0, since any 0.7.x should + // pass. Similarly, <=7.x is actually <8.0.0, etc. + gtlt = '<' + if (xm) { + M = +M + 1 + } else { + m = +m + 1 + } + } + + ret = gtlt + M + '.' + m + '.' + p + pr + } else if (xm) { + ret = '>=' + M + '.0.0' + pr + ' <' + (+M + 1) + '.0.0' + pr + } else if (xp) { + ret = '>=' + M + '.' + m + '.0' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + pr + } + + debug('xRange return', ret) + + return ret + }) +} + +// Because * is AND-ed with everything else in the comparator, +// and '' means "any version", just remove the *s entirely. +function replaceStars (comp, options) { + debug('replaceStars', comp, options) + // Looseness is ignored here. star is always as loose as it gets! + return comp.trim().replace(re[t.STAR], '') +} + +// This function is passed to string.replace(re[t.HYPHENRANGE]) +// M, m, patch, prerelease, build +// 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 +// 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do +// 1.2 - 3.4 => >=1.2.0 <3.5.0 +function hyphenReplace ($0, + from, fM, fm, fp, fpr, fb, + to, tM, tm, tp, tpr, tb) { + if (isX(fM)) { + from = '' + } else if (isX(fm)) { + from = '>=' + fM + '.0.0' + } else if (isX(fp)) { + from = '>=' + fM + '.' + fm + '.0' + } else { + from = '>=' + from + } + + if (isX(tM)) { + to = '' + } else if (isX(tm)) { + to = '<' + (+tM + 1) + '.0.0' + } else if (isX(tp)) { + to = '<' + tM + '.' + (+tm + 1) + '.0' + } else if (tpr) { + to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr + } else { + to = '<=' + to + } + + return (from + ' ' + to).trim() +} + +// if ANY of the sets match ALL of its comparators, then pass +Range.prototype.test = function (version) { + if (!version) { + return false + } + + if (typeof version === 'string') { + try { + version = new SemVer(version, this.options) + } catch (er) { + return false + } + } + + for (var i = 0; i < this.set.length; i++) { + if (testSet(this.set[i], version, this.options)) { + return true + } + } + return false +} + +function testSet (set, version, options) { + for (var i = 0; i < set.length; i++) { + if (!set[i].test(version)) { + return false + } + } + + if (version.prerelease.length && !options.includePrerelease) { + // Find the set of versions that are allowed to have prereleases + // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 + // That should allow `1.2.3-pr.2` to pass. + // However, `1.2.4-alpha.notready` should NOT be allowed, + // even though it's within the range set by the comparators. 
+ for (i = 0; i < set.length; i++) { + debug(set[i].semver) + if (set[i].semver === ANY) { + continue + } + + if (set[i].semver.prerelease.length > 0) { + var allowed = set[i].semver + if (allowed.major === version.major && + allowed.minor === version.minor && + allowed.patch === version.patch) { + return true + } + } + } + + // Version has a -pre, but it's not one of the ones we like. + return false + } + + return true +} + +exports.satisfies = satisfies +function satisfies (version, range, options) { + try { + range = new Range(range, options) + } catch (er) { + return false + } + return range.test(version) +} + +exports.maxSatisfying = maxSatisfying +function maxSatisfying (versions, range, options) { + var max = null + var maxSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!max || maxSV.compare(v) === -1) { + // compare(max, v, true) + max = v + maxSV = new SemVer(max, options) + } + } + }) + return max +} + +exports.minSatisfying = minSatisfying +function minSatisfying (versions, range, options) { + var min = null + var minSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!min || minSV.compare(v) === 1) { + // compare(min, v, true) + min = v + minSV = new SemVer(min, options) + } + } + }) + return min +} + +exports.minVersion = minVersion +function minVersion (range, loose) { + range = new Range(range, loose) + + var minver = new SemVer('0.0.0') + if (range.test(minver)) { + return minver + } + + minver = new SemVer('0.0.0-0') + if (range.test(minver)) { + return minver + } + + minver = null + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + comparators.forEach(function (comparator) { + // Clone to avoid manipulating the comparator's semver object. + var compver = new SemVer(comparator.semver.version) + switch (comparator.operator) { + case '>': + if (compver.prerelease.length === 0) { + compver.patch++ + } else { + compver.prerelease.push(0) + } + compver.raw = compver.format() + /* fallthrough */ + case '': + case '>=': + if (!minver || gt(minver, compver)) { + minver = compver + } + break + case '<': + case '<=': + /* Ignore maximum versions */ + break + /* istanbul ignore next */ + default: + throw new Error('Unexpected operation: ' + comparator.operator) + } + }) + } + + if (minver && range.test(minver)) { + return minver + } + + return null +} + +exports.validRange = validRange +function validRange (range, options) { + try { + // Return '*' instead of '' so that truthiness works. + // This will throw if it's invalid anyway + return new Range(range, options).range || '*' + } catch (er) { + return null + } +} + +// Determine if version is less than all the versions possible in the range +exports.ltr = ltr +function ltr (version, range, options) { + return outside(version, range, '<', options) +} + +// Determine if version is greater than all the versions possible in the range. 
+exports.gtr = gtr +function gtr (version, range, options) { + return outside(version, range, '>', options) +} + +exports.outside = outside +function outside (version, range, hilo, options) { + version = new SemVer(version, options) + range = new Range(range, options) + + var gtfn, ltefn, ltfn, comp, ecomp + switch (hilo) { + case '>': + gtfn = gt + ltefn = lte + ltfn = lt + comp = '>' + ecomp = '>=' + break + case '<': + gtfn = lt + ltefn = gte + ltfn = gt + comp = '<' + ecomp = '<=' + break + default: + throw new TypeError('Must provide a hilo val of "<" or ">"') + } + + // If it satisifes the range it is not outside + if (satisfies(version, range, options)) { + return false + } + + // From now on, variable terms are as if we're in "gtr" mode. + // but note that everything is flipped for the "ltr" function. + + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + var high = null + var low = null + + comparators.forEach(function (comparator) { + if (comparator.semver === ANY) { + comparator = new Comparator('>=0.0.0') + } + high = high || comparator + low = low || comparator + if (gtfn(comparator.semver, high.semver, options)) { + high = comparator + } else if (ltfn(comparator.semver, low.semver, options)) { + low = comparator + } + }) + + // If the edge version comparator has a operator then our version + // isn't outside it + if (high.operator === comp || high.operator === ecomp) { + return false + } + + // If the lowest version comparator has an operator and our version + // is less than it then it isn't higher than the range + if ((!low.operator || low.operator === comp) && + ltefn(version, low.semver)) { + return false + } else if (low.operator === ecomp && ltfn(version, low.semver)) { + return false + } + } + return true +} + +exports.prerelease = prerelease +function prerelease (version, options) { + var parsed = parse(version, options) + return (parsed && parsed.prerelease.length) ? parsed.prerelease : null +} + +exports.intersects = intersects +function intersects (r1, r2, options) { + r1 = new Range(r1, options) + r2 = new Range(r2, options) + return r1.intersects(r2) +} + +exports.coerce = coerce +function coerce (version, options) { + if (version instanceof SemVer) { + return version + } + + if (typeof version === 'number') { + version = String(version) + } + + if (typeof version !== 'string') { + return null + } + + options = options || {} + + var match = null + if (!options.rtl) { + match = version.match(re[t.COERCE]) + } else { + // Find the right-most coercible string that does not share + // a terminus with a more left-ward coercible string. + // Eg, '1.2.3.4' wants to coerce '2.3.4', not '3.4' or '4' + // + // Walk through the string checking with a /g regexp + // Manually set the index so as to pick up overlapping matches. + // Stop when we get a match that ends at the string end, since no + // coercible string can be more right-ward without the same terminus. + var next + while ((next = re[t.COERCERTL].exec(version)) && + (!match || match.index + match[0].length !== version.length) + ) { + if (!match || + next.index + next[0].length !== match.index + match[0].length) { + match = next + } + re[t.COERCERTL].lastIndex = next.index + next[1].length + next[2].length + } + // leave it in a clean state + re[t.COERCERTL].lastIndex = -1 + } + + if (match === null) { + return null + } + + return parse(match[2] + + '.' + (match[3] || '0') + + '.' 
+ (match[4] || '0'), options) +} diff --git a/node_modules/package-json/package.json b/node_modules/package-json/package.json new file mode 100644 index 000000000..2647a7b04 --- /dev/null +++ b/node_modules/package-json/package.json @@ -0,0 +1,78 @@ +{ + "_from": "package-json@^6.3.0", + "_id": "package-json@6.5.0", + "_inBundle": false, + "_integrity": "sha512-k3bdm2n25tkyxcjSKzB5x8kfVxlMdgsbPr0GkZcwHsLpba6cBjqCt1KlcChKEvxHIcTB1FVMuwoijZ26xex5MQ==", + "_location": "/package-json", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "package-json@^6.3.0", + "name": "package-json", + "escapedName": "package-json", + "rawSpec": "^6.3.0", + "saveSpec": null, + "fetchSpec": "^6.3.0" + }, + "_requiredBy": [ + "/latest-version" + ], + "_resolved": "https://registry.npmjs.org/package-json/-/package-json-6.5.0.tgz", + "_shasum": "6feedaca35e75725876d0b0e64974697fed145b0", + "_spec": "package-json@^6.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\latest-version", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/package-json/issues" + }, + "bundleDependencies": false, + "dependencies": { + "got": "^9.6.0", + "registry-auth-token": "^4.0.0", + "registry-url": "^5.0.0", + "semver": "^6.2.0" + }, + "deprecated": false, + "description": "Get metadata of a package from the npm registry", + "devDependencies": { + "@types/node": "^12.6.8", + "ava": "^2.2.0", + "mock-private-registry": "^1.1.2", + "tsd": "^0.7.4", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/package-json#readme", + "keywords": [ + "npm", + "registry", + "package", + "pkg", + "package.json", + "json", + "module", + "scope", + "scoped" + ], + "license": "MIT", + "name": "package-json", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/package-json.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "6.5.0" +} diff --git a/node_modules/package-json/readme.md b/node_modules/package-json/readme.md new file mode 100644 index 000000000..88a17bde3 --- /dev/null +++ b/node_modules/package-json/readme.md @@ -0,0 +1,118 @@ +# package-json [![Build Status](https://travis-ci.org/sindresorhus/package-json.svg?branch=master)](https://travis-ci.org/sindresorhus/package-json) + +> Get metadata of a package from the npm registry + + +## Install + +``` +$ npm install package-json +``` + + +## Usage + +```js +const packageJson = require('package-json'); + +(async () => { + console.log(await packageJson('ava')); + //=> {name: 'ava', ...} + + // Also works with scoped packages + console.log(await packageJson('@sindresorhus/df')); +})(); +``` + + +## API + +### packageJson(packageName, options?) + +#### packageName + +Type: `string` + +Name of the package. + +#### options + +Type: `object` + +##### version + +Type: `string`
+Default: `latest` + +Package version such as `1.0.0` or a [dist tag](https://docs.npmjs.com/cli/dist-tag) such as `latest`. + +The version can also be in any format supported by the [semver](https://github.com/npm/node-semver) module. For example: + +- `1` - Get the latest `1.x.x` +- `1.2` - Get the latest `1.2.x` +- `^1.2.3` - Get the latest `1.x.x` but at least `1.2.3` +- `~1.2.3` - Get the latest `1.2.x` but at least `1.2.3` + +##### fullMetadata + +Type: `boolean`
+Default: `false` + +By default, only an abbreviated metadata object is returned for performance reasons. [Read more.](https://github.com/npm/registry/blob/master/docs/responses/package-metadata.md) + +##### allVersions + +Type: `boolean`
+Default: `false` + +Return the [main entry](https://registry.npmjs.org/ava) containing all versions. + +##### registryUrl + +Type: `string`
+Default: Auto-detected + +The registry URL is by default inferred from the npm defaults and `.npmrc`. This is beneficial as `package-json` and any project using it will work just like npm. This option is **only** intended for internal tools. You should **not** use this option in reusable packages. Prefer just using `.npmrc` whenever possible. + +##### agent + +Type: `http.Agent | https.Agent | object | false` + +Overwrite the `agent` option that is passed down to [`got`](https://github.com/sindresorhus/got#agent). This might be useful to add [proxy support](https://github.com/sindresorhus/got#proxies). + + +### packageJson.PackageNotFoundError + +The error thrown when the given package name cannot be found. + +### packageJson.VersionNotFoundError + +The error thrown when the given package version cannot be found. + + +## Authentication + +Both public and private registries are supported, for both scoped and unscoped packages, as long as the registry uses either bearer tokens or basic authentication. + + +## Related + +- [package-json-cli](https://github.com/sindresorhus/package-json-cli) - CLI for this module +- [latest-version](https://github.com/sindresorhus/latest-version) - Get the latest version of an npm package +- [pkg-versions](https://github.com/sindresorhus/pkg-versions) - Get the version numbers of a package from the npm registry +- [npm-keyword](https://github.com/sindresorhus/npm-keyword) - Get a list of npm packages with a certain keyword +- [npm-user](https://github.com/sindresorhus/npm-user) - Get user info of an npm user +- [npm-email](https://github.com/sindresorhus/npm-email) - Get the email of an npm user + + +--- + +
+Get professional support for this package with a Tidelift subscription. Tidelift helps make open source sustainable for maintainers while giving companies assurances about security, maintenance, and licensing for their dependencies.
+
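For quick reference alongside the options documented above, here is a minimal sketch of calling `packageJson()` with the `version` and `fullMetadata` options from this readme; the package name and version range are arbitrary illustrations, not part of the original documentation.

```js
const packageJson = require('package-json');

(async () => {
  // `version` accepts a dist tag or any semver range, as described above.
  const pkg = await packageJson('ava', {version: '^2.0.0'});
  console.log(pkg.name, pkg.version);

  // `fullMetadata` returns the complete registry document instead of the
  // abbreviated one that is used by default.
  const full = await packageJson('ava', {fullMetadata: true});
  console.log(Object.keys(full).length); // more fields than the abbreviated form
})();
```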
diff --git a/node_modules/parseurl/HISTORY.md b/node_modules/parseurl/HISTORY.md new file mode 100644 index 000000000..8e409541d --- /dev/null +++ b/node_modules/parseurl/HISTORY.md @@ -0,0 +1,58 @@ +1.3.3 / 2019-04-15 +================== + + * Fix Node.js 0.8 return value inconsistencies + +1.3.2 / 2017-09-09 +================== + + * perf: reduce overhead for full URLs + * perf: unroll the "fast-path" `RegExp` + +1.3.1 / 2016-01-17 +================== + + * perf: enable strict mode + +1.3.0 / 2014-08-09 +================== + + * Add `parseurl.original` for parsing `req.originalUrl` with fallback + * Return `undefined` if `req.url` is `undefined` + +1.2.0 / 2014-07-21 +================== + + * Cache URLs based on original value + * Remove no-longer-needed URL mis-parse work-around + * Simplify the "fast-path" `RegExp` + +1.1.3 / 2014-07-08 +================== + + * Fix typo + +1.1.2 / 2014-07-08 +================== + + * Seriously fix Node.js 0.8 compatibility + +1.1.1 / 2014-07-08 +================== + + * Fix Node.js 0.8 compatibility + +1.1.0 / 2014-07-08 +================== + + * Incorporate URL href-only parse fast-path + +1.0.1 / 2014-03-08 +================== + + * Add missing `require` + +1.0.0 / 2014-03-08 +================== + + * Genesis from `connect` diff --git a/node_modules/parseurl/LICENSE b/node_modules/parseurl/LICENSE new file mode 100644 index 000000000..27653d3db --- /dev/null +++ b/node_modules/parseurl/LICENSE @@ -0,0 +1,24 @@ + +(The MIT License) + +Copyright (c) 2014 Jonathan Ong +Copyright (c) 2014-2017 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/parseurl/README.md b/node_modules/parseurl/README.md new file mode 100644 index 000000000..443e716b8 --- /dev/null +++ b/node_modules/parseurl/README.md @@ -0,0 +1,133 @@ +# parseurl + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Node.js Version][node-image]][node-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Parse a URL with memoization. + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). 
Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install parseurl +``` + +## API + +```js +var parseurl = require('parseurl') +``` + +### parseurl(req) + +Parse the URL of the given request object (looks at the `req.url` property) +and return the result. The result is the same as `url.parse` in Node.js core. +Calling this function multiple times on the same `req` where `req.url` does +not change will return a cached parsed object, rather than parsing again. + +### parseurl.original(req) + +Parse the original URL of the given request object and return the result. +This works by trying to parse `req.originalUrl` if it is a string, otherwise +parses `req.url`. The result is the same as `url.parse` in Node.js core. +Calling this function multiple times on the same `req` where `req.originalUrl` +does not change will return a cached parsed object, rather than parsing again. + +## Benchmark + +```bash +$ npm run-script bench + +> parseurl@1.3.3 bench nodejs-parseurl +> node benchmark/index.js + + http_parser@2.8.0 + node@10.6.0 + v8@6.7.288.46-node.13 + uv@1.21.0 + zlib@1.2.11 + ares@1.14.0 + modules@64 + nghttp2@1.32.0 + napi@3 + openssl@1.1.0h + icu@61.1 + unicode@10.0 + cldr@33.0 + tz@2018c + +> node benchmark/fullurl.js + + Parsing URL "http://localhost:8888/foo/bar?user=tj&pet=fluffy" + + 4 tests completed. + + fasturl x 2,207,842 ops/sec ±3.76% (184 runs sampled) + nativeurl - legacy x 507,180 ops/sec ±0.82% (191 runs sampled) + nativeurl - whatwg x 290,044 ops/sec ±1.96% (189 runs sampled) + parseurl x 488,907 ops/sec ±2.13% (192 runs sampled) + +> node benchmark/pathquery.js + + Parsing URL "/foo/bar?user=tj&pet=fluffy" + + 4 tests completed. + + fasturl x 3,812,564 ops/sec ±3.15% (188 runs sampled) + nativeurl - legacy x 2,651,631 ops/sec ±1.68% (189 runs sampled) + nativeurl - whatwg x 161,837 ops/sec ±2.26% (189 runs sampled) + parseurl x 4,166,338 ops/sec ±2.23% (184 runs sampled) + +> node benchmark/samerequest.js + + Parsing URL "/foo/bar?user=tj&pet=fluffy" on same request object + + 4 tests completed. + + fasturl x 3,821,651 ops/sec ±2.42% (185 runs sampled) + nativeurl - legacy x 2,651,162 ops/sec ±1.90% (187 runs sampled) + nativeurl - whatwg x 175,166 ops/sec ±1.44% (188 runs sampled) + parseurl x 14,912,606 ops/sec ±3.59% (183 runs sampled) + +> node benchmark/simplepath.js + + Parsing URL "/foo/bar" + + 4 tests completed. + + fasturl x 12,421,765 ops/sec ±2.04% (191 runs sampled) + nativeurl - legacy x 7,546,036 ops/sec ±1.41% (188 runs sampled) + nativeurl - whatwg x 198,843 ops/sec ±1.83% (189 runs sampled) + parseurl x 24,244,006 ops/sec ±0.51% (194 runs sampled) + +> node benchmark/slash.js + + Parsing URL "/" + + 4 tests completed. 
+ + fasturl x 17,159,456 ops/sec ±3.25% (188 runs sampled) + nativeurl - legacy x 11,635,097 ops/sec ±3.79% (184 runs sampled) + nativeurl - whatwg x 240,693 ops/sec ±0.83% (189 runs sampled) + parseurl x 42,279,067 ops/sec ±0.55% (190 runs sampled) +``` + +## License + + [MIT](LICENSE) + +[coveralls-image]: https://badgen.net/coveralls/c/github/pillarjs/parseurl/master +[coveralls-url]: https://coveralls.io/r/pillarjs/parseurl?branch=master +[node-image]: https://badgen.net/npm/node/parseurl +[node-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/parseurl +[npm-url]: https://npmjs.org/package/parseurl +[npm-version-image]: https://badgen.net/npm/v/parseurl +[travis-image]: https://badgen.net/travis/pillarjs/parseurl/master +[travis-url]: https://travis-ci.org/pillarjs/parseurl diff --git a/node_modules/parseurl/index.js b/node_modules/parseurl/index.js new file mode 100644 index 000000000..ece722327 --- /dev/null +++ b/node_modules/parseurl/index.js @@ -0,0 +1,158 @@ +/*! + * parseurl + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2014-2017 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var url = require('url') +var parse = url.parse +var Url = url.Url + +/** + * Module exports. + * @public + */ + +module.exports = parseurl +module.exports.original = originalurl + +/** + * Parse the `req` url with memoization. + * + * @param {ServerRequest} req + * @return {Object} + * @public + */ + +function parseurl (req) { + var url = req.url + + if (url === undefined) { + // URL is undefined + return undefined + } + + var parsed = req._parsedUrl + + if (fresh(url, parsed)) { + // Return cached URL parse + return parsed + } + + // Parse the URL + parsed = fastparse(url) + parsed._raw = url + + return (req._parsedUrl = parsed) +}; + +/** + * Parse the `req` original url with fallback and memoization. + * + * @param {ServerRequest} req + * @return {Object} + * @public + */ + +function originalurl (req) { + var url = req.originalUrl + + if (typeof url !== 'string') { + // Fallback + return parseurl(req) + } + + var parsed = req._parsedOriginalUrl + + if (fresh(url, parsed)) { + // Return cached URL parse + return parsed + } + + // Parse the URL + parsed = fastparse(url) + parsed._raw = url + + return (req._parsedOriginalUrl = parsed) +}; + +/** + * Parse the `str` url with fast-path short-cut. + * + * @param {string} str + * @return {Object} + * @private + */ + +function fastparse (str) { + if (typeof str !== 'string' || str.charCodeAt(0) !== 0x2f /* / */) { + return parse(str) + } + + var pathname = str + var query = null + var search = null + + // This takes the regexp from https://github.com/joyent/node/pull/7878 + // Which is /^(\/[^?#\s]*)(\?[^#\s]*)?$/ + // And unrolls it into a for loop + for (var i = 1; i < str.length; i++) { + switch (str.charCodeAt(i)) { + case 0x3f: /* ? */ + if (search === null) { + pathname = str.substring(0, i) + query = str.substring(i + 1) + search = str.substring(i) + } + break + case 0x09: /* \t */ + case 0x0a: /* \n */ + case 0x0c: /* \f */ + case 0x0d: /* \r */ + case 0x20: /* */ + case 0x23: /* # */ + case 0xa0: + case 0xfeff: + return parse(str) + } + } + + var url = Url !== undefined + ? new Url() + : {} + + url.path = str + url.href = str + url.pathname = pathname + + if (search !== null) { + url.query = query + url.search = search + } + + return url +} + +/** + * Determine if parsed is still fresh for url. 
+ * + * @param {string} url + * @param {object} parsedUrl + * @return {boolean} + * @private + */ + +function fresh (url, parsedUrl) { + return typeof parsedUrl === 'object' && + parsedUrl !== null && + (Url === undefined || parsedUrl instanceof Url) && + parsedUrl._raw === url +} diff --git a/node_modules/parseurl/package.json b/node_modules/parseurl/package.json new file mode 100644 index 000000000..7f76ba592 --- /dev/null +++ b/node_modules/parseurl/package.json @@ -0,0 +1,81 @@ +{ + "_from": "parseurl@~1.3.3", + "_id": "parseurl@1.3.3", + "_inBundle": false, + "_integrity": "sha512-CiyeOxFT/JZyN5m0z9PfXw4SCBJ6Sygz1Dpl0wqjlhDEGGBP1GnsUVEL0p63hoG1fcj3fHynXi9NYO4nWOL+qQ==", + "_location": "/parseurl", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "parseurl@~1.3.3", + "name": "parseurl", + "escapedName": "parseurl", + "rawSpec": "~1.3.3", + "saveSpec": null, + "fetchSpec": "~1.3.3" + }, + "_requiredBy": [ + "/express", + "/finalhandler", + "/serve-static" + ], + "_resolved": "https://registry.npmjs.org/parseurl/-/parseurl-1.3.3.tgz", + "_shasum": "9da19e7bee8d12dff0513ed5b76957793bc2e8d4", + "_spec": "parseurl@~1.3.3", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/pillarjs/parseurl/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "deprecated": false, + "description": "parse a url with memoization", + "devDependencies": { + "beautify-benchmark": "0.2.4", + "benchmark": "2.1.4", + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.1", + "eslint-plugin-node": "7.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "fast-url-parser": "1.1.3", + "istanbul": "0.4.5", + "mocha": "6.1.3" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "README.md", + "index.js" + ], + "homepage": "https://github.com/pillarjs/parseurl#readme", + "license": "MIT", + "name": "parseurl", + "repository": { + "type": "git", + "url": "git+https://github.com/pillarjs/parseurl.git" + }, + "scripts": { + "bench": "node benchmark/index.js", + "lint": "eslint .", + "test": "mocha --check-leaks --bail --reporter spec test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --check-leaks --reporter dot test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --check-leaks --reporter spec test/" + }, + "version": "1.3.3" +} diff --git a/node_modules/path-to-regexp/History.md b/node_modules/path-to-regexp/History.md new file mode 100644 index 000000000..7f6587846 --- /dev/null +++ b/node_modules/path-to-regexp/History.md @@ -0,0 +1,36 @@ +0.1.7 / 2015-07-28 +================== + + * Fixed regression with escaped round brackets and matching groups. + +0.1.6 / 2015-06-19 +================== + + * Replace `index` feature by outputting all parameters, unnamed and named. + +0.1.5 / 2015-05-08 +================== + + * Add an index property for position in match result. 
+ +0.1.4 / 2015-03-05 +================== + + * Add license information + +0.1.3 / 2014-07-06 +================== + + * Better array support + * Improved support for trailing slash in non-ending mode + +0.1.0 / 2014-03-06 +================== + + * add options.end + +0.0.2 / 2013-02-10 +================== + + * Update to match current express + * add .license property to component.json diff --git a/node_modules/path-to-regexp/LICENSE b/node_modules/path-to-regexp/LICENSE new file mode 100644 index 000000000..983fbe8ae --- /dev/null +++ b/node_modules/path-to-regexp/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014 Blake Embrey (hello@blakeembrey.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/path-to-regexp/Readme.md b/node_modules/path-to-regexp/Readme.md new file mode 100644 index 000000000..95452a6e9 --- /dev/null +++ b/node_modules/path-to-regexp/Readme.md @@ -0,0 +1,35 @@ +# Path-to-RegExp + +Turn an Express-style path string such as `/user/:name` into a regular expression. + +**Note:** This is a legacy branch. You should upgrade to `1.x`. + +## Usage + +```javascript +var pathToRegexp = require('path-to-regexp'); +``` + +### pathToRegexp(path, keys, options) + + - **path** A string in the express format, an array of such strings, or a regular expression + - **keys** An array to be populated with the keys present in the url. Once the function completes, this will be an array of strings. + - **options** + - **options.sensitive** Defaults to false, set this to true to make routes case sensitive + - **options.strict** Defaults to false, set this to true to make the trailing slash matter. + - **options.end** Defaults to true, set this to false to only match the prefix of the URL. + +```javascript +var keys = []; +var exp = pathToRegexp('/foo/:bar', keys); +//keys = ['bar'] +//exp = /^\/foo\/(?:([^\/]+?))\/?$/i +``` + +## Live Demo + +You can see a live demo of this library in use at [express-route-tester](http://forbeslindesay.github.com/express-route-tester/). + +## License + + MIT diff --git a/node_modules/path-to-regexp/index.js b/node_modules/path-to-regexp/index.js new file mode 100644 index 000000000..500d1dad0 --- /dev/null +++ b/node_modules/path-to-regexp/index.js @@ -0,0 +1,129 @@ +/** + * Expose `pathtoRegexp`. + */ + +module.exports = pathtoRegexp; + +/** + * Match matching groups in a regular expression. + */ +var MATCHING_GROUP_REGEXP = /\((?!\?)/g; + +/** + * Normalize the given path string, + * returning a regular expression. 
+ * + * An empty array should be passed, + * which will contain the placeholder + * key names. For example "/user/:id" will + * then contain ["id"]. + * + * @param {String|RegExp|Array} path + * @param {Array} keys + * @param {Object} options + * @return {RegExp} + * @api private + */ + +function pathtoRegexp(path, keys, options) { + options = options || {}; + keys = keys || []; + var strict = options.strict; + var end = options.end !== false; + var flags = options.sensitive ? '' : 'i'; + var extraOffset = 0; + var keysOffset = keys.length; + var i = 0; + var name = 0; + var m; + + if (path instanceof RegExp) { + while (m = MATCHING_GROUP_REGEXP.exec(path.source)) { + keys.push({ + name: name++, + optional: false, + offset: m.index + }); + } + + return path; + } + + if (Array.isArray(path)) { + // Map array parts into regexps and return their source. We also pass + // the same keys and options instance into every generation to get + // consistent matching groups before we join the sources together. + path = path.map(function (value) { + return pathtoRegexp(value, keys, options).source; + }); + + return new RegExp('(?:' + path.join('|') + ')', flags); + } + + path = ('^' + path + (strict ? '' : path[path.length - 1] === '/' ? '?' : '/?')) + .replace(/\/\(/g, '/(?:') + .replace(/([\/\.])/g, '\\$1') + .replace(/(\\\/)?(\\\.)?:(\w+)(\(.*?\))?(\*)?(\?)?/g, function (match, slash, format, key, capture, star, optional, offset) { + slash = slash || ''; + format = format || ''; + capture = capture || '([^\\/' + format + ']+?)'; + optional = optional || ''; + + keys.push({ + name: key, + optional: !!optional, + offset: offset + extraOffset + }); + + var result = '' + + (optional ? '' : slash) + + '(?:' + + format + (optional ? slash : '') + capture + + (star ? '((?:[\\/' + format + '].+?)?)' : '') + + ')' + + optional; + + extraOffset += result.length - match.length; + + return result; + }) + .replace(/\*/g, function (star, index) { + var len = keys.length + + while (len-- > keysOffset && keys[len].offset > index) { + keys[len].offset += 3; // Replacement length minus asterisk length. + } + + return '(.*)'; + }); + + // This is a workaround for handling unnamed matching groups. + while (m = MATCHING_GROUP_REGEXP.exec(path)) { + var escapeCount = 0; + var index = m.index; + + while (path.charAt(--index) === '\\') { + escapeCount++; + } + + // It's possible to escape the bracket. + if (escapeCount % 2 === 1) { + continue; + } + + if (keysOffset + i === keys.length || keys[keysOffset + i].offset > m.index) { + keys.splice(keysOffset + i, 0, { + name: name++, // Unnamed matching groups must be consistently linear. + optional: false, + offset: m.index + }); + } + + i++; + } + + // If the path is non-ending, match until the end or a slash. + path += (end ? '$' : (path[path.length - 1] === '/' ? 
'' : '(?=\\/|$)')); + + return new RegExp(path, flags); +}; diff --git a/node_modules/path-to-regexp/package.json b/node_modules/path-to-regexp/package.json new file mode 100644 index 000000000..174da2f84 --- /dev/null +++ b/node_modules/path-to-regexp/package.json @@ -0,0 +1,59 @@ +{ + "_from": "path-to-regexp@0.1.7", + "_id": "path-to-regexp@0.1.7", + "_inBundle": false, + "_integrity": "sha1-32BBeABfUi8V60SQ5yR6G/qmf4w=", + "_location": "/path-to-regexp", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "path-to-regexp@0.1.7", + "name": "path-to-regexp", + "escapedName": "path-to-regexp", + "rawSpec": "0.1.7", + "saveSpec": null, + "fetchSpec": "0.1.7" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/path-to-regexp/-/path-to-regexp-0.1.7.tgz", + "_shasum": "df604178005f522f15eb4490e7247a1bfaa67f8c", + "_spec": "path-to-regexp@0.1.7", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/component/path-to-regexp/issues" + }, + "bundleDependencies": false, + "component": { + "scripts": { + "path-to-regexp": "index.js" + } + }, + "deprecated": false, + "description": "Express style path to RegExp utility", + "devDependencies": { + "istanbul": "^0.2.6", + "mocha": "^1.17.1" + }, + "files": [ + "index.js", + "LICENSE" + ], + "homepage": "https://github.com/component/path-to-regexp#readme", + "keywords": [ + "express", + "regexp" + ], + "license": "MIT", + "name": "path-to-regexp", + "repository": { + "type": "git", + "url": "git+https://github.com/component/path-to-regexp.git" + }, + "scripts": { + "test": "istanbul cover _mocha -- -R spec" + }, + "version": "0.1.7" +} diff --git a/node_modules/picomatch/CHANGELOG.md b/node_modules/picomatch/CHANGELOG.md new file mode 100644 index 000000000..b9b9554df --- /dev/null +++ b/node_modules/picomatch/CHANGELOG.md @@ -0,0 +1,126 @@ +# Release history + +**All notable changes to this project will be documented in this file.** + +The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/) +and this project adheres to [Semantic Versioning](http://semver.org/spec/v2.0.0.html). + +
+**Guiding Principles**
+
+- Changelogs are for humans, not machines.
+- There should be an entry for every single version.
+- The same types of changes should be grouped.
+- Versions and sections should be linkable.
+- The latest version comes first.
+- The release date of each version is displayed.
+- Mention whether you follow Semantic Versioning.
+
+**Types of changes**
+
+Changelog entries are classified using the following labels _(from [keep-a-changelog](http://keepachangelog.com/))_:
+
+- `Added` for new features.
+- `Changed` for changes in existing functionality.
+- `Deprecated` for soon-to-be removed features.
+- `Removed` for now removed features.
+- `Fixed` for any bug fixes.
+- `Security` in case of vulnerabilities.
+
+ +## 2.3.0 (2021-05-21) + +### Fixed + +* Fixes bug where file names with two dots were not being matched consistently with negation extglobs containing a star ([56083ef](https://github.com/micromatch/picomatch/commit/56083ef)) + +## 2.2.3 (2021-04-10) + +### Fixed + +* Do not skip pattern seperator for square brackets ([fb08a30](https://github.com/micromatch/picomatch/commit/fb08a30)). +* Set negatedExtGlob also if it does not span the whole pattern ([032e3f5](https://github.com/micromatch/picomatch/commit/032e3f5)). + +## 2.2.2 (2020-03-21) + +### Fixed + +* Correctly handle parts of the pattern after parentheses in the `scan` method ([e15b920](https://github.com/micromatch/picomatch/commit/e15b920)). + +## 2.2.1 (2020-01-04) + +* Fixes [#49](https://github.com/micromatch/picomatch/issues/49), so that braces with no sets or ranges are now propertly treated as literals. + +## 2.2.0 (2020-01-04) + +* Disable fastpaths mode for the parse method ([5b8d33f](https://github.com/micromatch/picomatch/commit/5b8d33f)) +* Add `tokens`, `slashes`, and `parts` to the object returned by `picomatch.scan()`. + +## 2.1.0 (2019-10-31) + +* add benchmarks for scan ([4793b92](https://github.com/micromatch/picomatch/commit/4793b92)) +* Add eslint object-curly-spacing rule ([707c650](https://github.com/micromatch/picomatch/commit/707c650)) +* Add prefer-const eslint rule ([5c7501c](https://github.com/micromatch/picomatch/commit/5c7501c)) +* Add support for nonegate in scan API ([275c9b9](https://github.com/micromatch/picomatch/commit/275c9b9)) +* Change lets to consts. Move root import up. ([4840625](https://github.com/micromatch/picomatch/commit/4840625)) +* closes https://github.com/micromatch/picomatch/issues/21 ([766bcb0](https://github.com/micromatch/picomatch/commit/766bcb0)) +* Fix "Extglobs" table in readme ([eb19da8](https://github.com/micromatch/picomatch/commit/eb19da8)) +* fixes https://github.com/micromatch/picomatch/issues/20 ([9caca07](https://github.com/micromatch/picomatch/commit/9caca07)) +* fixes https://github.com/micromatch/picomatch/issues/26 ([fa58f45](https://github.com/micromatch/picomatch/commit/fa58f45)) +* Lint test ([d433a34](https://github.com/micromatch/picomatch/commit/d433a34)) +* lint unit tests ([0159b55](https://github.com/micromatch/picomatch/commit/0159b55)) +* Make scan work with noext ([6c02e03](https://github.com/micromatch/picomatch/commit/6c02e03)) +* minor linting ([c2a2b87](https://github.com/micromatch/picomatch/commit/c2a2b87)) +* minor parser improvements ([197671d](https://github.com/micromatch/picomatch/commit/197671d)) +* remove eslint since it... 
([07876fa](https://github.com/micromatch/picomatch/commit/07876fa)) +* remove funding file ([8ebe96d](https://github.com/micromatch/picomatch/commit/8ebe96d)) +* Remove unused funks ([cbc6d54](https://github.com/micromatch/picomatch/commit/cbc6d54)) +* Run eslint during pretest, fix existing eslint findings ([0682367](https://github.com/micromatch/picomatch/commit/0682367)) +* support `noparen` in scan ([3d37569](https://github.com/micromatch/picomatch/commit/3d37569)) +* update changelog ([7b34e77](https://github.com/micromatch/picomatch/commit/7b34e77)) +* update travis ([777f038](https://github.com/micromatch/picomatch/commit/777f038)) +* Use eslint-disable-next-line instead of eslint-disable ([4e7c1fd](https://github.com/micromatch/picomatch/commit/4e7c1fd)) + +## 2.0.7 (2019-05-14) + +* 2.0.7 ([9eb9a71](https://github.com/micromatch/picomatch/commit/9eb9a71)) +* supports lookbehinds ([1f63f7e](https://github.com/micromatch/picomatch/commit/1f63f7e)) +* update .verb.md file with typo change ([2741279](https://github.com/micromatch/picomatch/commit/2741279)) +* fix: typo in README ([0753e44](https://github.com/micromatch/picomatch/commit/0753e44)) + +## 2.0.4 (2019-04-10) + +### Fixed + +- Readme link [fixed](https://github.com/micromatch/picomatch/pull/13/commits/a96ab3aa2b11b6861c23289964613d85563b05df) by @danez. +- `options.capture` now works as expected when fastpaths are enabled. See https://github.com/micromatch/picomatch/pull/12/commits/26aefd71f1cfaf95c37f1c1fcab68a693b037304. Thanks to @DrPizza. + +## 2.0.0 (2019-04-10) + +### Added + +- Adds support for `options.onIgnore`. See the readme for details +- Adds support for `options.onResult`. See the readme for details + +### Breaking changes + +- The unixify option was renamed to `windows` +- caching and all related options and methods have been removed + +## 1.0.0 (2018-11-05) + +- adds `.onMatch` option +- improvements to `.scan` method +- numerous improvements and optimizations for matching and parsing +- better windows path handling + +## 0.1.0 - 2017-04-13 + +First release. + + +[keep-a-changelog]: https://github.com/olivierlacan/keep-a-changelog diff --git a/node_modules/picomatch/LICENSE b/node_modules/picomatch/LICENSE new file mode 100644 index 000000000..3608dca25 --- /dev/null +++ b/node_modules/picomatch/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2017-present, Jon Schlinkert. + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
diff --git a/node_modules/picomatch/README.md b/node_modules/picomatch/README.md new file mode 100644 index 000000000..54822d498 --- /dev/null +++ b/node_modules/picomatch/README.md @@ -0,0 +1,707 @@ +

+# Picomatch
+
+> Blazing fast and accurate glob matcher written in JavaScript.
+> No dependencies and full support for standard and extended Bash glob features, including braces, extglobs, POSIX brackets, and regular expressions.
+
+ +## Why picomatch? + +* **Lightweight** - No dependencies +* **Minimal** - Tiny API surface. Main export is a function that takes a glob pattern and returns a matcher function. +* **Fast** - Loads in about 2ms (that's several times faster than a [single frame of a HD movie](http://www.endmemo.com/sconvert/framespersecondframespermillisecond.php) at 60fps) +* **Performant** - Use the returned matcher function to speed up repeat matching (like when watching files) +* **Accurate matching** - Using wildcards (`*` and `?`), globstars (`**`) for nested directories, [advanced globbing](#advanced-globbing) with extglobs, braces, and POSIX brackets, and support for escaping special characters with `\` or quotes. +* **Well tested** - Thousands of unit tests + +See the [library comparison](#library-comparisons) to other libraries. + +
+
+ +## Table of Contents + +
+- [Install](#install)
+- [Usage](#usage)
+- [API](#api)
+  * [picomatch](#picomatch)
+  * [.test](#test)
+  * [.matchBase](#matchbase)
+  * [.isMatch](#ismatch)
+  * [.parse](#parse)
+  * [.scan](#scan)
+  * [.compileRe](#compilere)
+  * [.makeRe](#makere)
+  * [.toRegex](#toregex)
+- [Options](#options)
+  * [Picomatch options](#picomatch-options)
+  * [Scan Options](#scan-options)
+  * [Options Examples](#options-examples)
+- [Globbing features](#globbing-features)
+  * [Basic globbing](#basic-globbing)
+  * [Advanced globbing](#advanced-globbing)
+  * [Braces](#braces)
+  * [Matching special characters as literals](#matching-special-characters-as-literals)
+- [Library Comparisons](#library-comparisons)
+- [Benchmarks](#benchmarks)
+- [Philosophies](#philosophies)
+- [About](#about)
+  * [Author](#author)
+  * [License](#license)
+
+_(TOC generated by [verb](https://github.com/verbose/verb) using [markdown-toc](https://github.com/jonschlinkert/markdown-toc))_
+
+ +
+
+ +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +npm install --save picomatch +``` + +
+ +## Usage + +The main export is a function that takes a glob pattern and an options object and returns a function for matching strings. + +```js +const pm = require('picomatch'); +const isMatch = pm('*.js'); + +console.log(isMatch('abcd')); //=> false +console.log(isMatch('a.js')); //=> true +console.log(isMatch('a.md')); //=> false +console.log(isMatch('a/b.js')); //=> false +``` + +
+ +## API + +### [picomatch](lib/picomatch.js#L32) + +Creates a matcher function from one or more glob patterns. The returned function takes a string to match as its first argument, and returns true if the string is a match. The returned matcher function also takes a boolean as the second argument that, when true, returns an object with additional information. + +**Params** + +* `globs` **{String|Array}**: One or more glob patterns. +* `options` **{Object=}** +* `returns` **{Function=}**: Returns a matcher function. + +**Example** + +```js +const picomatch = require('picomatch'); +// picomatch(glob[, options]); + +const isMatch = picomatch('*.!(*a)'); +console.log(isMatch('a.a')); //=> false +console.log(isMatch('a.b')); //=> true +``` + +### [.test](lib/picomatch.js#L117) + +Test `input` with the given `regex`. This is used by the main `picomatch()` function to test the input string. + +**Params** + +* `input` **{String}**: String to test. +* `regex` **{RegExp}** +* `returns` **{Object}**: Returns an object with matching info. + +**Example** + +```js +const picomatch = require('picomatch'); +// picomatch.test(input, regex[, options]); + +console.log(picomatch.test('foo/bar', /^(?:([^/]*?)\/([^/]*?))$/)); +// { isMatch: true, match: [ 'foo/', 'foo', 'bar' ], output: 'foo/bar' } +``` + +### [.matchBase](lib/picomatch.js#L161) + +Match the basename of a filepath. + +**Params** + +* `input` **{String}**: String to test. +* `glob` **{RegExp|String}**: Glob pattern or regex created by [.makeRe](#makeRe). +* `returns` **{Boolean}** + +**Example** + +```js +const picomatch = require('picomatch'); +// picomatch.matchBase(input, glob[, options]); +console.log(picomatch.matchBase('foo/bar.js', '*.js'); // true +``` + +### [.isMatch](lib/picomatch.js#L183) + +Returns true if **any** of the given glob `patterns` match the specified `string`. + +**Params** + +* **{String|Array}**: str The string to test. +* **{String|Array}**: patterns One or more glob patterns to use for matching. +* **{Object}**: See available [options](#options). +* `returns` **{Boolean}**: Returns true if any patterns match `str` + +**Example** + +```js +const picomatch = require('picomatch'); +// picomatch.isMatch(string, patterns[, options]); + +console.log(picomatch.isMatch('a.a', ['b.*', '*.a'])); //=> true +console.log(picomatch.isMatch('a.a', 'b.*')); //=> false +``` + +### [.parse](lib/picomatch.js#L199) + +Parse a glob pattern to create the source string for a regular expression. + +**Params** + +* `pattern` **{String}** +* `options` **{Object}** +* `returns` **{Object}**: Returns an object with useful properties and output to be used as a regex source string. + +**Example** + +```js +const picomatch = require('picomatch'); +const result = picomatch.parse(pattern[, options]); +``` + +### [.scan](lib/picomatch.js#L231) + +Scan a glob pattern to separate the pattern into segments. + +**Params** + +* `input` **{String}**: Glob pattern to scan. +* `options` **{Object}** +* `returns` **{Object}**: Returns an object with + +**Example** + +```js +const picomatch = require('picomatch'); +// picomatch.scan(input[, options]); + +const result = picomatch.scan('!./foo/*.js'); +console.log(result); +{ prefix: '!./', + input: '!./foo/*.js', + start: 3, + base: 'foo', + glob: '*.js', + isBrace: false, + isBracket: false, + isGlob: true, + isExtglob: false, + isGlobstar: false, + negated: true } +``` + +### [.compileRe](lib/picomatch.js#L245) + +Compile a regular expression from the `state` object returned by the +[parse()](#parse) method. 
+ +**Params** + +* `state` **{Object}** +* `options` **{Object}** +* `returnOutput` **{Boolean}**: Intended for implementors, this argument allows you to return the raw output from the parser. +* `returnState` **{Boolean}**: Adds the state to a `state` property on the returned regex. Useful for implementors and debugging. +* `returns` **{RegExp}** + +### [.makeRe](lib/picomatch.js#L286) + +Create a regular expression from a parsed glob pattern. + +**Params** + +* `state` **{String}**: The object returned from the `.parse` method. +* `options` **{Object}** +* `returnOutput` **{Boolean}**: Implementors may use this argument to return the compiled output, instead of a regular expression. This is not exposed on the options to prevent end-users from mutating the result. +* `returnState` **{Boolean}**: Implementors may use this argument to return the state from the parsed glob with the returned regular expression. +* `returns` **{RegExp}**: Returns a regex created from the given pattern. + +**Example** + +```js +const picomatch = require('picomatch'); +const state = picomatch.parse('*.js'); +// picomatch.compileRe(state[, options]); + +console.log(picomatch.compileRe(state)); +//=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/ +``` + +### [.toRegex](lib/picomatch.js#L321) + +Create a regular expression from the given regex source string. + +**Params** + +* `source` **{String}**: Regular expression source string. +* `options` **{Object}** +* `returns` **{RegExp}** + +**Example** + +```js +const picomatch = require('picomatch'); +// picomatch.toRegex(source[, options]); + +const { output } = picomatch.parse('*.js'); +console.log(picomatch.toRegex(output)); +//=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/ +``` + +
+ +## Options + +### Picomatch options + +The following options may be used with the main `picomatch()` function or any of the methods on the picomatch API. + +| **Option** | **Type** | **Default value** | **Description** | +| --- | --- | --- | --- | +| `basename` | `boolean` | `false` | If set, then patterns without slashes will be matched against the basename of the path if it contains slashes. For example, `a?b` would match the path `/xyz/123/acb`, but not `/xyz/acb/123`. | +| `bash` | `boolean` | `false` | Follow bash matching rules more strictly - disallows backslashes as escape characters, and treats single stars as globstars (`**`). | +| `capture` | `boolean` | `undefined` | Return regex matches in supporting methods. | +| `contains` | `boolean` | `undefined` | Allows glob to match any part of the given string(s). | +| `cwd` | `string` | `process.cwd()` | Current working directory. Used by `picomatch.split()` | +| `debug` | `boolean` | `undefined` | Debug regular expressions when an error is thrown. | +| `dot` | `boolean` | `false` | Enable dotfile matching. By default, dotfiles are ignored unless a `.` is explicitly defined in the pattern, or `options.dot` is true | +| `expandRange` | `function` | `undefined` | Custom function for expanding ranges in brace patterns, such as `{a..z}`. The function receives the range values as two arguments, and it must return a string to be used in the generated regex. It's recommended that returned strings be wrapped in parentheses. | +| `failglob` | `boolean` | `false` | Throws an error if no matches are found. Based on the bash option of the same name. | +| `fastpaths` | `boolean` | `true` | To speed up processing, full parsing is skipped for a handful common glob patterns. Disable this behavior by setting this option to `false`. | +| `flags` | `boolean` | `undefined` | Regex flags to use in the generated regex. If defined, the `nocase` option will be overridden. | +| [format](#optionsformat) | `function` | `undefined` | Custom function for formatting the returned string. This is useful for removing leading slashes, converting Windows paths to Posix paths, etc. | +| `ignore` | `array\|string` | `undefined` | One or more glob patterns for excluding strings that should not be matched from the result. | +| `keepQuotes` | `boolean` | `false` | Retain quotes in the generated regex, since quotes may also be used as an alternative to backslashes. | +| `literalBrackets` | `boolean` | `undefined` | When `true`, brackets in the glob pattern will be escaped so that only literal brackets will be matched. | +| `lookbehinds` | `boolean` | `true` | Support regex positive and negative lookbehinds. Note that you must be using Node 8.1.10 or higher to enable regex lookbehinds. | +| `matchBase` | `boolean` | `false` | Alias for `basename` | +| `maxLength` | `boolean` | `65536` | Limit the max length of the input string. An error is thrown if the input string is longer than this value. | +| `nobrace` | `boolean` | `false` | Disable brace matching, so that `{a,b}` and `{1..3}` would be treated as literal characters. | +| `nobracket` | `boolean` | `undefined` | Disable matching with regex brackets. | +| `nocase` | `boolean` | `false` | Make matching case-insensitive. Equivalent to the regex `i` flag. Note that this option is overridden by the `flags` option. | +| `nodupes` | `boolean` | `true` | Deprecated, use `nounique` instead. This option will be removed in a future major release. By default duplicates are removed. 
Disable uniquification by setting this option to false. | +| `noext` | `boolean` | `false` | Alias for `noextglob` | +| `noextglob` | `boolean` | `false` | Disable support for matching with extglobs (like `+(a\|b)`) | +| `noglobstar` | `boolean` | `false` | Disable support for matching nested directories with globstars (`**`) | +| `nonegate` | `boolean` | `false` | Disable support for negating with leading `!` | +| `noquantifiers` | `boolean` | `false` | Disable support for regex quantifiers (like `a{1,2}`) and treat them as brace patterns to be expanded. | +| [onIgnore](#optionsonIgnore) | `function` | `undefined` | Function to be called on ignored items. | +| [onMatch](#optionsonMatch) | `function` | `undefined` | Function to be called on matched items. | +| [onResult](#optionsonResult) | `function` | `undefined` | Function to be called on all items, regardless of whether or not they are matched or ignored. | +| `posix` | `boolean` | `false` | Support POSIX character classes ("posix brackets"). | +| `posixSlashes` | `boolean` | `undefined` | Convert all slashes in file paths to forward slashes. This does not convert slashes in the glob pattern itself | +| `prepend` | `boolean` | `undefined` | String to prepend to the generated regex used for matching. | +| `regex` | `boolean` | `false` | Use regular expression rules for `+` (instead of matching literal `+`), and for stars that follow closing parentheses or brackets (as in `)*` and `]*`). | +| `strictBrackets` | `boolean` | `undefined` | Throw an error if brackets, braces, or parens are imbalanced. | +| `strictSlashes` | `boolean` | `undefined` | When true, picomatch won't match trailing slashes with single stars. | +| `unescape` | `boolean` | `undefined` | Remove backslashes preceding escaped characters in the glob pattern. By default, backslashes are retained. | +| `unixify` | `boolean` | `undefined` | Alias for `posixSlashes`, for backwards compatibility. | + +### Scan Options + +In addition to the main [picomatch options](#picomatch-options), the following options may also be used with the [.scan](#scan) method. + +| **Option** | **Type** | **Default value** | **Description** | +| --- | --- | --- | --- | +| `tokens` | `boolean` | `false` | When `true`, the returned object will include an array of tokens (objects), representing each path "segment" in the scanned glob pattern | +| `parts` | `boolean` | `false` | When `true`, the returned object will include an array of strings representing each path "segment" in the scanned glob pattern. This is automatically enabled when `options.tokens` is true | + +**Example** + +```js +const picomatch = require('picomatch'); +const result = picomatch.scan('!./foo/*.js', { tokens: true }); +console.log(result); +// { +// prefix: '!./', +// input: '!./foo/*.js', +// start: 3, +// base: 'foo', +// glob: '*.js', +// isBrace: false, +// isBracket: false, +// isGlob: true, +// isExtglob: false, +// isGlobstar: false, +// negated: true, +// maxDepth: 2, +// tokens: [ +// { value: '!./', depth: 0, isGlob: false, negated: true, isPrefix: true }, +// { value: 'foo', depth: 1, isGlob: false }, +// { value: '*.js', depth: 1, isGlob: true } +// ], +// slashes: [ 2, 6 ], +// parts: [ 'foo', '*.js' ] +// } +``` + +
+ +### Options Examples + +#### options.expandRange + +**Type**: `function` + +**Default**: `undefined` + +Custom function for expanding ranges in brace patterns. The [fill-range](https://github.com/jonschlinkert/fill-range) library is ideal for this purpose, or you can use custom code to do whatever you need. + +**Example** + +The following example shows how to create a glob that matches a folder + +```js +const fill = require('fill-range'); +const regex = pm.makeRe('foo/{01..25}/bar', { + expandRange(a, b) { + return `(${fill(a, b, { toRegex: true })})`; + } +}); + +console.log(regex); +//=> /^(?:foo\/((?:0[1-9]|1[0-9]|2[0-5]))\/bar)$/ + +console.log(regex.test('foo/00/bar')) // false +console.log(regex.test('foo/01/bar')) // true +console.log(regex.test('foo/10/bar')) // true +console.log(regex.test('foo/22/bar')) // true +console.log(regex.test('foo/25/bar')) // true +console.log(regex.test('foo/26/bar')) // false +``` + +#### options.format + +**Type**: `function` + +**Default**: `undefined` + +Custom function for formatting strings before they're matched. + +**Example** + +```js +// strip leading './' from strings +const format = str => str.replace(/^\.\//, ''); +const isMatch = picomatch('foo/*.js', { format }); +console.log(isMatch('./foo/bar.js')); //=> true +``` + +#### options.onMatch + +```js +const onMatch = ({ glob, regex, input, output }) => { + console.log({ glob, regex, input, output }); +}; + +const isMatch = picomatch('*', { onMatch }); +isMatch('foo'); +isMatch('bar'); +isMatch('baz'); +``` + +#### options.onIgnore + +```js +const onIgnore = ({ glob, regex, input, output }) => { + console.log({ glob, regex, input, output }); +}; + +const isMatch = picomatch('*', { onIgnore, ignore: 'f*' }); +isMatch('foo'); +isMatch('bar'); +isMatch('baz'); +``` + +#### options.onResult + +```js +const onResult = ({ glob, regex, input, output }) => { + console.log({ glob, regex, input, output }); +}; + +const isMatch = picomatch('*', { onResult, ignore: 'f*' }); +isMatch('foo'); +isMatch('bar'); +isMatch('baz'); +``` + +
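The options table above also documents simpler boolean flags such as `dot` and `nocase` that have no dedicated example in this readme. The following sketch is added here for illustration only, with outputs inferred from those option descriptions rather than taken from the original documentation.

```js
const picomatch = require('picomatch');

// `dot`: dotfiles are ignored unless the option is enabled.
console.log(picomatch('*.js')('.eslintrc.js'));                 //=> false
console.log(picomatch('*.js', { dot: true })('.eslintrc.js'));  //=> true

// `nocase`: case-insensitive matching, equivalent to the regex `i` flag.
console.log(picomatch('*.MD')('readme.md'));                    //=> false
console.log(picomatch('*.MD', { nocase: true })('readme.md'));  //=> true
```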
+
+ +## Globbing features + +* [Basic globbing](#basic-globbing) (Wildcard matching) +* [Advanced globbing](#advanced-globbing) (extglobs, posix brackets, brace matching) + +### Basic globbing + +| **Character** | **Description** | +| --- | --- | +| `*` | Matches any character zero or more times, excluding path separators. Does _not match_ path separators or hidden files or directories ("dotfiles"), unless explicitly enabled by setting the `dot` option to `true`. | +| `**` | Matches any character zero or more times, including path separators. Note that `**` will only match path separators (`/`, and `\\` on Windows) when they are the only characters in a path segment. Thus, `foo**/bar` is equivalent to `foo*/bar`, and `foo/a**b/bar` is equivalent to `foo/a*b/bar`, and _more than two_ consecutive stars in a glob path segment are regarded as _a single star_. Thus, `foo/***/bar` is equivalent to `foo/*/bar`. | +| `?` | Matches any character excluding path separators one time. Does _not match_ path separators or leading dots. | +| `[abc]` | Matches any characters inside the brackets. For example, `[abc]` would match the characters `a`, `b` or `c`, and nothing else. | + +#### Matching behavior vs. Bash + +Picomatch's matching features and expected results in unit tests are based on Bash's unit tests and the Bash 4.3 specification, with the following exceptions: + +* Bash will match `foo/bar/baz` with `*`. Picomatch only matches nested directories with `**`. +* Bash greedily matches with negated extglobs. For example, Bash 4.3 says that `!(foo)*` should match `foo` and `foobar`, since the trailing `*` bracktracks to match the preceding pattern. This is very memory-inefficient, and IMHO, also incorrect. Picomatch would return `false` for both `foo` and `foobar`. + +
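To make the two differences listed above concrete, here is a short sketch (not part of the original readme) that uses exactly the behavior stated in this section:

```js
const pm = require('picomatch');

// A single star does not cross path separators; nested paths need a globstar.
console.log(pm.isMatch('foo/bar/baz', '*'));   //=> false
console.log(pm.isMatch('foo/bar/baz', '**'));  //=> true

// Negated extglobs are not greedily backtracked the way Bash 4.3 does it.
console.log(pm.isMatch('foo', '!(foo)*'));     //=> false
console.log(pm.isMatch('foobar', '!(foo)*'));  //=> false
```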
+ +### Advanced globbing + +* [extglobs](#extglobs) +* [POSIX brackets](#posix-brackets) +* [Braces](#brace-expansion) + +#### Extglobs + +| **Pattern** | **Description** | +| --- | --- | +| `@(pattern)` | Match _only one_ consecutive occurrence of `pattern` | +| `*(pattern)` | Match _zero or more_ consecutive occurrences of `pattern` | +| `+(pattern)` | Match _one or more_ consecutive occurrences of `pattern` | +| `?(pattern)` | Match _zero or **one**_ consecutive occurrences of `pattern` | +| `!(pattern)` | Match _anything but_ `pattern` | + +**Examples** + +```js +const pm = require('picomatch'); + +// *(pattern) matches ZERO or more of "pattern" +console.log(pm.isMatch('a', 'a*(z)')); // true +console.log(pm.isMatch('az', 'a*(z)')); // true +console.log(pm.isMatch('azzz', 'a*(z)')); // true + +// +(pattern) matches ONE or more of "pattern" +console.log(pm.isMatch('a', 'a*(z)')); // true +console.log(pm.isMatch('az', 'a*(z)')); // true +console.log(pm.isMatch('azzz', 'a*(z)')); // true + +// supports multiple extglobs +console.log(pm.isMatch('foo.bar', '!(foo).!(bar)')); // false + +// supports nested extglobs +console.log(pm.isMatch('foo.bar', '!(!(foo)).!(!(bar))')); // true +``` + +#### POSIX brackets + +POSIX classes are disabled by default. Enable this feature by setting the `posix` option to true. + +**Enable POSIX bracket support** + +```js +console.log(pm.makeRe('[[:word:]]+', { posix: true })); +//=> /^(?:(?=.)[A-Za-z0-9_]+\/?)$/ +``` + +**Supported POSIX classes** + +The following named POSIX bracket expressions are supported: + +* `[:alnum:]` - Alphanumeric characters, equ `[a-zA-Z0-9]` +* `[:alpha:]` - Alphabetical characters, equivalent to `[a-zA-Z]`. +* `[:ascii:]` - ASCII characters, equivalent to `[\\x00-\\x7F]`. +* `[:blank:]` - Space and tab characters, equivalent to `[ \\t]`. +* `[:cntrl:]` - Control characters, equivalent to `[\\x00-\\x1F\\x7F]`. +* `[:digit:]` - Numerical digits, equivalent to `[0-9]`. +* `[:graph:]` - Graph characters, equivalent to `[\\x21-\\x7E]`. +* `[:lower:]` - Lowercase letters, equivalent to `[a-z]`. +* `[:print:]` - Print characters, equivalent to `[\\x20-\\x7E ]`. +* `[:punct:]` - Punctuation and symbols, equivalent to `[\\-!"#$%&\'()\\*+,./:;<=>?@[\\]^_`{|}~]`. +* `[:space:]` - Extended space characters, equivalent to `[ \\t\\r\\n\\v\\f]`. +* `[:upper:]` - Uppercase letters, equivalent to `[A-Z]`. +* `[:word:]` - Word characters (letters, numbers and underscores), equivalent to `[A-Za-z0-9_]`. +* `[:xdigit:]` - Hexadecimal digits, equivalent to `[A-Fa-f0-9]`. + +See the [Bash Reference Manual](https://www.gnu.org/software/bash/manual/html_node/Pattern-Matching.html) for more information. + +### Braces + +Picomatch does not do brace expansion. For [brace expansion](https://www.gnu.org/software/bash/manual/html_node/Brace-Expansion.html) and advanced matching with braces, use [micromatch](https://github.com/micromatch/micromatch) instead. Picomatch has very basic support for braces. + +### Matching special characters as literals + +If you wish to match the following special characters in a filepath, and you want to use these characters in your glob pattern, they must be escaped with backslashes or quotes: + +**Special Characters** + +Some characters that are used for matching in regular expressions are also regarded as valid file path characters on some platforms. 
+
+To match any of the following characters as literals: `$^*+?()[]`
+
+Examples:
+
+```js
+console.log(pm.makeRe('foo/bar \\(1\\)'));
+console.log(pm.makeRe('foo/bar \\(1\\)'));
+```
+
+
+ +## Library Comparisons + +The following table shows which features are supported by [minimatch](https://github.com/isaacs/minimatch), [micromatch](https://github.com/micromatch/micromatch), [picomatch](https://github.com/micromatch/picomatch), [nanomatch](https://github.com/micromatch/nanomatch), [extglob](https://github.com/micromatch/extglob), [braces](https://github.com/micromatch/braces), and [expand-brackets](https://github.com/micromatch/expand-brackets). + +| **Feature** | `minimatch` | `micromatch` | `picomatch` | `nanomatch` | `extglob` | `braces` | `expand-brackets` | +| --- | --- | --- | --- | --- | --- | --- | --- | +| Wildcard matching (`*?+`) | ✔ | ✔ | ✔ | ✔ | - | - | - | +| Advancing globbing | ✔ | ✔ | ✔ | - | - | - | - | +| Brace _matching_ | ✔ | ✔ | ✔ | - | - | ✔ | - | +| Brace _expansion_ | ✔ | ✔ | - | - | - | ✔ | - | +| Extglobs | partial | ✔ | ✔ | - | ✔ | - | - | +| Posix brackets | - | ✔ | ✔ | - | - | - | ✔ | +| Regular expression syntax | - | ✔ | ✔ | ✔ | ✔ | - | ✔ | +| File system operations | - | - | - | - | - | - | - | + +
+
+ +## Benchmarks + +Performance comparison of picomatch and minimatch. + +``` +# .makeRe star + picomatch x 1,993,050 ops/sec ±0.51% (91 runs sampled) + minimatch x 627,206 ops/sec ±1.96% (87 runs sampled)) + +# .makeRe star; dot=true + picomatch x 1,436,640 ops/sec ±0.62% (91 runs sampled) + minimatch x 525,876 ops/sec ±0.60% (88 runs sampled) + +# .makeRe globstar + picomatch x 1,592,742 ops/sec ±0.42% (90 runs sampled) + minimatch x 962,043 ops/sec ±1.76% (91 runs sampled)d) + +# .makeRe globstars + picomatch x 1,615,199 ops/sec ±0.35% (94 runs sampled) + minimatch x 477,179 ops/sec ±1.33% (91 runs sampled) + +# .makeRe with leading star + picomatch x 1,220,856 ops/sec ±0.40% (92 runs sampled) + minimatch x 453,564 ops/sec ±1.43% (94 runs sampled) + +# .makeRe - basic braces + picomatch x 392,067 ops/sec ±0.70% (90 runs sampled) + minimatch x 99,532 ops/sec ±2.03% (87 runs sampled)) +``` + +
+
+ +## Philosophies + +The goal of this library is to be blazing fast, without compromising on accuracy. + +**Accuracy** + +The number one of goal of this library is accuracy. However, it's not unusual for different glob implementations to have different rules for matching behavior, even with simple wildcard matching. It gets increasingly more complicated when combinations of different features are combined, like when extglobs are combined with globstars, braces, slashes, and so on: `!(**/{a,b,*/c})`. + +Thus, given that there is no canonical glob specification to use as a single source of truth when differences of opinion arise regarding behavior, sometimes we have to implement our best judgement and rely on feedback from users to make improvements. + +**Performance** + +Although this library performs well in benchmarks, and in most cases it's faster than other popular libraries we benchmarked against, we will always choose accuracy over performance. It's not helpful to anyone if our library is faster at returning the wrong answer. + +
+
+ +## About + +
+### Contributing
+
+Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new).
+
+Please read the [contributing guide](.github/contributing.md) for advice on opening issues, pull requests, and coding standards.
+
+ +
+### Running Tests
+
+Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command:
+
+```sh
+npm install && npm test
+```
+
+ +
+### Building docs
+
+_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_
+
+To generate the readme, run the following command:
+
+```sh
+npm install -g verbose/verb#dev verb-generate-readme && verb
+```
+
+
+ +### Author + +**Jon Schlinkert** + +* [GitHub Profile](https://github.com/jonschlinkert) +* [Twitter Profile](https://twitter.com/jonschlinkert) +* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert) + +### License + +Copyright © 2017-present, [Jon Schlinkert](https://github.com/jonschlinkert). +Released under the [MIT License](LICENSE). \ No newline at end of file diff --git a/node_modules/picomatch/index.js b/node_modules/picomatch/index.js new file mode 100644 index 000000000..d2f2bc59d --- /dev/null +++ b/node_modules/picomatch/index.js @@ -0,0 +1,3 @@ +'use strict'; + +module.exports = require('./lib/picomatch'); diff --git a/node_modules/picomatch/lib/constants.js b/node_modules/picomatch/lib/constants.js new file mode 100644 index 000000000..a62ef3879 --- /dev/null +++ b/node_modules/picomatch/lib/constants.js @@ -0,0 +1,179 @@ +'use strict'; + +const path = require('path'); +const WIN_SLASH = '\\\\/'; +const WIN_NO_SLASH = `[^${WIN_SLASH}]`; + +/** + * Posix glob regex + */ + +const DOT_LITERAL = '\\.'; +const PLUS_LITERAL = '\\+'; +const QMARK_LITERAL = '\\?'; +const SLASH_LITERAL = '\\/'; +const ONE_CHAR = '(?=.)'; +const QMARK = '[^/]'; +const END_ANCHOR = `(?:${SLASH_LITERAL}|$)`; +const START_ANCHOR = `(?:^|${SLASH_LITERAL})`; +const DOTS_SLASH = `${DOT_LITERAL}{1,2}${END_ANCHOR}`; +const NO_DOT = `(?!${DOT_LITERAL})`; +const NO_DOTS = `(?!${START_ANCHOR}${DOTS_SLASH})`; +const NO_DOT_SLASH = `(?!${DOT_LITERAL}{0,1}${END_ANCHOR})`; +const NO_DOTS_SLASH = `(?!${DOTS_SLASH})`; +const QMARK_NO_DOT = `[^.${SLASH_LITERAL}]`; +const STAR = `${QMARK}*?`; + +const POSIX_CHARS = { + DOT_LITERAL, + PLUS_LITERAL, + QMARK_LITERAL, + SLASH_LITERAL, + ONE_CHAR, + QMARK, + END_ANCHOR, + DOTS_SLASH, + NO_DOT, + NO_DOTS, + NO_DOT_SLASH, + NO_DOTS_SLASH, + QMARK_NO_DOT, + STAR, + START_ANCHOR +}; + +/** + * Windows glob regex + */ + +const WINDOWS_CHARS = { + ...POSIX_CHARS, + + SLASH_LITERAL: `[${WIN_SLASH}]`, + QMARK: WIN_NO_SLASH, + STAR: `${WIN_NO_SLASH}*?`, + DOTS_SLASH: `${DOT_LITERAL}{1,2}(?:[${WIN_SLASH}]|$)`, + NO_DOT: `(?!${DOT_LITERAL})`, + NO_DOTS: `(?!(?:^|[${WIN_SLASH}])${DOT_LITERAL}{1,2}(?:[${WIN_SLASH}]|$))`, + NO_DOT_SLASH: `(?!${DOT_LITERAL}{0,1}(?:[${WIN_SLASH}]|$))`, + NO_DOTS_SLASH: `(?!${DOT_LITERAL}{1,2}(?:[${WIN_SLASH}]|$))`, + QMARK_NO_DOT: `[^.${WIN_SLASH}]`, + START_ANCHOR: `(?:^|[${WIN_SLASH}])`, + END_ANCHOR: `(?:[${WIN_SLASH}]|$)` +}; + +/** + * POSIX Bracket Regex + */ + +const POSIX_REGEX_SOURCE = { + alnum: 'a-zA-Z0-9', + alpha: 'a-zA-Z', + ascii: '\\x00-\\x7F', + blank: ' \\t', + cntrl: '\\x00-\\x1F\\x7F', + digit: '0-9', + graph: '\\x21-\\x7E', + lower: 'a-z', + print: '\\x20-\\x7E ', + punct: '\\-!"#$%&\'()\\*+,./:;<=>?@[\\]^_`{|}~', + space: ' \\t\\r\\n\\v\\f', + upper: 'A-Z', + word: 'A-Za-z0-9_', + xdigit: 'A-Fa-f0-9' +}; + +module.exports = { + MAX_LENGTH: 1024 * 64, + POSIX_REGEX_SOURCE, + + // regular expressions + REGEX_BACKSLASH: /\\(?![*+?^${}(|)[\]])/g, + REGEX_NON_SPECIAL_CHARS: /^[^@![\].,$*+?^{}()|\\/]+/, + REGEX_SPECIAL_CHARS: /[-*+?.^${}(|)[\]]/, + REGEX_SPECIAL_CHARS_BACKREF: /(\\?)((\W)(\3*))/g, + REGEX_SPECIAL_CHARS_GLOBAL: /([-*+?.^${}(|)[\]])/g, + REGEX_REMOVE_BACKSLASH: /(?:\[.*?[^\\]\]|\\(?=.))/g, + + // Replace globs with equivalent patterns to reduce parsing time. + REPLACEMENTS: { + '***': '*', + '**/**': '**', + '**/**/**': '**' + }, + + // Digits + CHAR_0: 48, /* 0 */ + CHAR_9: 57, /* 9 */ + + // Alphabet chars. 
+ CHAR_UPPERCASE_A: 65, /* A */ + CHAR_LOWERCASE_A: 97, /* a */ + CHAR_UPPERCASE_Z: 90, /* Z */ + CHAR_LOWERCASE_Z: 122, /* z */ + + CHAR_LEFT_PARENTHESES: 40, /* ( */ + CHAR_RIGHT_PARENTHESES: 41, /* ) */ + + CHAR_ASTERISK: 42, /* * */ + + // Non-alphabetic chars. + CHAR_AMPERSAND: 38, /* & */ + CHAR_AT: 64, /* @ */ + CHAR_BACKWARD_SLASH: 92, /* \ */ + CHAR_CARRIAGE_RETURN: 13, /* \r */ + CHAR_CIRCUMFLEX_ACCENT: 94, /* ^ */ + CHAR_COLON: 58, /* : */ + CHAR_COMMA: 44, /* , */ + CHAR_DOT: 46, /* . */ + CHAR_DOUBLE_QUOTE: 34, /* " */ + CHAR_EQUAL: 61, /* = */ + CHAR_EXCLAMATION_MARK: 33, /* ! */ + CHAR_FORM_FEED: 12, /* \f */ + CHAR_FORWARD_SLASH: 47, /* / */ + CHAR_GRAVE_ACCENT: 96, /* ` */ + CHAR_HASH: 35, /* # */ + CHAR_HYPHEN_MINUS: 45, /* - */ + CHAR_LEFT_ANGLE_BRACKET: 60, /* < */ + CHAR_LEFT_CURLY_BRACE: 123, /* { */ + CHAR_LEFT_SQUARE_BRACKET: 91, /* [ */ + CHAR_LINE_FEED: 10, /* \n */ + CHAR_NO_BREAK_SPACE: 160, /* \u00A0 */ + CHAR_PERCENT: 37, /* % */ + CHAR_PLUS: 43, /* + */ + CHAR_QUESTION_MARK: 63, /* ? */ + CHAR_RIGHT_ANGLE_BRACKET: 62, /* > */ + CHAR_RIGHT_CURLY_BRACE: 125, /* } */ + CHAR_RIGHT_SQUARE_BRACKET: 93, /* ] */ + CHAR_SEMICOLON: 59, /* ; */ + CHAR_SINGLE_QUOTE: 39, /* ' */ + CHAR_SPACE: 32, /* */ + CHAR_TAB: 9, /* \t */ + CHAR_UNDERSCORE: 95, /* _ */ + CHAR_VERTICAL_LINE: 124, /* | */ + CHAR_ZERO_WIDTH_NOBREAK_SPACE: 65279, /* \uFEFF */ + + SEP: path.sep, + + /** + * Create EXTGLOB_CHARS + */ + + extglobChars(chars) { + return { + '!': { type: 'negate', open: '(?:(?!(?:', close: `))${chars.STAR})` }, + '?': { type: 'qmark', open: '(?:', close: ')?' }, + '+': { type: 'plus', open: '(?:', close: ')+' }, + '*': { type: 'star', open: '(?:', close: ')*' }, + '@': { type: 'at', open: '(?:', close: ')' } + }; + }, + + /** + * Create GLOB_CHARS + */ + + globChars(win32) { + return win32 === true ? WINDOWS_CHARS : POSIX_CHARS; + } +}; diff --git a/node_modules/picomatch/lib/parse.js b/node_modules/picomatch/lib/parse.js new file mode 100644 index 000000000..c16d59d9a --- /dev/null +++ b/node_modules/picomatch/lib/parse.js @@ -0,0 +1,1084 @@ +'use strict'; + +const constants = require('./constants'); +const utils = require('./utils'); + +/** + * Constants + */ + +const { + MAX_LENGTH, + POSIX_REGEX_SOURCE, + REGEX_NON_SPECIAL_CHARS, + REGEX_SPECIAL_CHARS_BACKREF, + REPLACEMENTS +} = constants; + +/** + * Helpers + */ + +const expandRange = (args, options) => { + if (typeof options.expandRange === 'function') { + return options.expandRange(...args, options); + } + + args.sort(); + const value = `[${args.join('-')}]`; + + try { + /* eslint-disable-next-line no-new */ + new RegExp(value); + } catch (ex) { + return args.map(v => utils.escapeRegex(v)).join('..'); + } + + return value; +}; + +/** + * Create the message for a syntax error + */ + +const syntaxError = (type, char) => { + return `Missing ${type}: "${char}" - use "\\\\${char}" to match literal characters`; +}; + +/** + * Parse the given input string. + * @param {String} input + * @param {Object} options + * @return {Object} + */ + +const parse = (input, options) => { + if (typeof input !== 'string') { + throw new TypeError('Expected a string'); + } + + input = REPLACEMENTS[input] || input; + + const opts = { ...options }; + const max = typeof opts.maxLength === 'number' ? 
Math.min(MAX_LENGTH, opts.maxLength) : MAX_LENGTH; + + let len = input.length; + if (len > max) { + throw new SyntaxError(`Input length: ${len}, exceeds maximum allowed length: ${max}`); + } + + const bos = { type: 'bos', value: '', output: opts.prepend || '' }; + const tokens = [bos]; + + const capture = opts.capture ? '' : '?:'; + const win32 = utils.isWindows(options); + + // create constants based on platform, for windows or posix + const PLATFORM_CHARS = constants.globChars(win32); + const EXTGLOB_CHARS = constants.extglobChars(PLATFORM_CHARS); + + const { + DOT_LITERAL, + PLUS_LITERAL, + SLASH_LITERAL, + ONE_CHAR, + DOTS_SLASH, + NO_DOT, + NO_DOT_SLASH, + NO_DOTS_SLASH, + QMARK, + QMARK_NO_DOT, + STAR, + START_ANCHOR + } = PLATFORM_CHARS; + + const globstar = opts => { + return `(${capture}(?:(?!${START_ANCHOR}${opts.dot ? DOTS_SLASH : DOT_LITERAL}).)*?)`; + }; + + const nodot = opts.dot ? '' : NO_DOT; + const qmarkNoDot = opts.dot ? QMARK : QMARK_NO_DOT; + let star = opts.bash === true ? globstar(opts) : STAR; + + if (opts.capture) { + star = `(${star})`; + } + + // minimatch options support + if (typeof opts.noext === 'boolean') { + opts.noextglob = opts.noext; + } + + const state = { + input, + index: -1, + start: 0, + dot: opts.dot === true, + consumed: '', + output: '', + prefix: '', + backtrack: false, + negated: false, + brackets: 0, + braces: 0, + parens: 0, + quotes: 0, + globstar: false, + tokens + }; + + input = utils.removePrefix(input, state); + len = input.length; + + const extglobs = []; + const braces = []; + const stack = []; + let prev = bos; + let value; + + /** + * Tokenizing helpers + */ + + const eos = () => state.index === len - 1; + const peek = state.peek = (n = 1) => input[state.index + n]; + const advance = state.advance = () => input[++state.index] || ''; + const remaining = () => input.slice(state.index + 1); + const consume = (value = '', num = 0) => { + state.consumed += value; + state.index += num; + }; + + const append = token => { + state.output += token.output != null ? token.output : token.value; + consume(token.value); + }; + + const negate = () => { + let count = 1; + + while (peek() === '!' && (peek(2) !== '(' || peek(3) === '?')) { + advance(); + state.start++; + count++; + } + + if (count % 2 === 0) { + return false; + } + + state.negated = true; + state.start++; + return true; + }; + + const increment = type => { + state[type]++; + stack.push(type); + }; + + const decrement = type => { + state[type]--; + stack.pop(); + }; + + /** + * Push tokens onto the tokens array. This helper speeds up + * tokenizing by 1) helping us avoid backtracking as much as possible, + * and 2) helping us avoid creating extra tokens when consecutive + * characters are plain text. This improves performance and simplifies + * lookbehinds. 
+ */ + + const push = tok => { + if (prev.type === 'globstar') { + const isBrace = state.braces > 0 && (tok.type === 'comma' || tok.type === 'brace'); + const isExtglob = tok.extglob === true || (extglobs.length && (tok.type === 'pipe' || tok.type === 'paren')); + + if (tok.type !== 'slash' && tok.type !== 'paren' && !isBrace && !isExtglob) { + state.output = state.output.slice(0, -prev.output.length); + prev.type = 'star'; + prev.value = '*'; + prev.output = star; + state.output += prev.output; + } + } + + if (extglobs.length && tok.type !== 'paren') { + extglobs[extglobs.length - 1].inner += tok.value; + } + + if (tok.value || tok.output) append(tok); + if (prev && prev.type === 'text' && tok.type === 'text') { + prev.value += tok.value; + prev.output = (prev.output || '') + tok.value; + return; + } + + tok.prev = prev; + tokens.push(tok); + prev = tok; + }; + + const extglobOpen = (type, value) => { + const token = { ...EXTGLOB_CHARS[value], conditions: 1, inner: '' }; + + token.prev = prev; + token.parens = state.parens; + token.output = state.output; + const output = (opts.capture ? '(' : '') + token.open; + + increment('parens'); + push({ type, value, output: state.output ? '' : ONE_CHAR }); + push({ type: 'paren', extglob: true, value: advance(), output }); + extglobs.push(token); + }; + + const extglobClose = token => { + let output = token.close + (opts.capture ? ')' : ''); + let rest; + + if (token.type === 'negate') { + let extglobStar = star; + + if (token.inner && token.inner.length > 1 && token.inner.includes('/')) { + extglobStar = globstar(opts); + } + + if (extglobStar !== star || eos() || /^\)+$/.test(remaining())) { + output = token.close = `)$))${extglobStar}`; + } + + if (token.inner.includes('*') && (rest = remaining()) && /^\.[^\\/.]+$/.test(rest)) { + output = token.close = `)${rest})${extglobStar})`; + } + + if (token.prev.type === 'bos') { + state.negatedExtglob = true; + } + } + + push({ type: 'paren', extglob: true, value, output }); + decrement('parens'); + }; + + /** + * Fast paths + */ + + if (opts.fastpaths !== false && !/(^[*!]|[/()[\]{}"])/.test(input)) { + let backslashes = false; + + let output = input.replace(REGEX_SPECIAL_CHARS_BACKREF, (m, esc, chars, first, rest, index) => { + if (first === '\\') { + backslashes = true; + return m; + } + + if (first === '?') { + if (esc) { + return esc + first + (rest ? QMARK.repeat(rest.length) : ''); + } + if (index === 0) { + return qmarkNoDot + (rest ? QMARK.repeat(rest.length) : ''); + } + return QMARK.repeat(chars.length); + } + + if (first === '.') { + return DOT_LITERAL.repeat(chars.length); + } + + if (first === '*') { + if (esc) { + return esc + first + (rest ? star : ''); + } + return star; + } + return esc ? m : `\\${m}`; + }); + + if (backslashes === true) { + if (opts.unescape === true) { + output = output.replace(/\\/g, ''); + } else { + output = output.replace(/\\+/g, m => { + return m.length % 2 === 0 ? '\\\\' : (m ? '\\' : ''); + }); + } + } + + if (output === input && opts.contains === true) { + state.output = input; + return state; + } + + state.output = utils.wrapOutput(output, state, options); + return state; + } + + /** + * Tokenize input until we reach end-of-string + */ + + while (!eos()) { + value = advance(); + + if (value === '\u0000') { + continue; + } + + /** + * Escaped characters + */ + + if (value === '\\') { + const next = peek(); + + if (next === '/' && opts.bash !== true) { + continue; + } + + if (next === '.' 
|| next === ';') { + continue; + } + + if (!next) { + value += '\\'; + push({ type: 'text', value }); + continue; + } + + // collapse slashes to reduce potential for exploits + const match = /^\\+/.exec(remaining()); + let slashes = 0; + + if (match && match[0].length > 2) { + slashes = match[0].length; + state.index += slashes; + if (slashes % 2 !== 0) { + value += '\\'; + } + } + + if (opts.unescape === true) { + value = advance(); + } else { + value += advance(); + } + + if (state.brackets === 0) { + push({ type: 'text', value }); + continue; + } + } + + /** + * If we're inside a regex character class, continue + * until we reach the closing bracket. + */ + + if (state.brackets > 0 && (value !== ']' || prev.value === '[' || prev.value === '[^')) { + if (opts.posix !== false && value === ':') { + const inner = prev.value.slice(1); + if (inner.includes('[')) { + prev.posix = true; + + if (inner.includes(':')) { + const idx = prev.value.lastIndexOf('['); + const pre = prev.value.slice(0, idx); + const rest = prev.value.slice(idx + 2); + const posix = POSIX_REGEX_SOURCE[rest]; + if (posix) { + prev.value = pre + posix; + state.backtrack = true; + advance(); + + if (!bos.output && tokens.indexOf(prev) === 1) { + bos.output = ONE_CHAR; + } + continue; + } + } + } + } + + if ((value === '[' && peek() !== ':') || (value === '-' && peek() === ']')) { + value = `\\${value}`; + } + + if (value === ']' && (prev.value === '[' || prev.value === '[^')) { + value = `\\${value}`; + } + + if (opts.posix === true && value === '!' && prev.value === '[') { + value = '^'; + } + + prev.value += value; + append({ value }); + continue; + } + + /** + * If we're inside a quoted string, continue + * until we reach the closing double quote. + */ + + if (state.quotes === 1 && value !== '"') { + value = utils.escapeRegex(value); + prev.value += value; + append({ value }); + continue; + } + + /** + * Double quotes + */ + + if (value === '"') { + state.quotes = state.quotes === 1 ? 0 : 1; + if (opts.keepQuotes === true) { + push({ type: 'text', value }); + } + continue; + } + + /** + * Parentheses + */ + + if (value === '(') { + increment('parens'); + push({ type: 'paren', value }); + continue; + } + + if (value === ')') { + if (state.parens === 0 && opts.strictBrackets === true) { + throw new SyntaxError(syntaxError('opening', '(')); + } + + const extglob = extglobs[extglobs.length - 1]; + if (extglob && state.parens === extglob.parens + 1) { + extglobClose(extglobs.pop()); + continue; + } + + push({ type: 'paren', value, output: state.parens ? 
')' : '\\)' }); + decrement('parens'); + continue; + } + + /** + * Square brackets + */ + + if (value === '[') { + if (opts.nobracket === true || !remaining().includes(']')) { + if (opts.nobracket !== true && opts.strictBrackets === true) { + throw new SyntaxError(syntaxError('closing', ']')); + } + + value = `\\${value}`; + } else { + increment('brackets'); + } + + push({ type: 'bracket', value }); + continue; + } + + if (value === ']') { + if (opts.nobracket === true || (prev && prev.type === 'bracket' && prev.value.length === 1)) { + push({ type: 'text', value, output: `\\${value}` }); + continue; + } + + if (state.brackets === 0) { + if (opts.strictBrackets === true) { + throw new SyntaxError(syntaxError('opening', '[')); + } + + push({ type: 'text', value, output: `\\${value}` }); + continue; + } + + decrement('brackets'); + + const prevValue = prev.value.slice(1); + if (prev.posix !== true && prevValue[0] === '^' && !prevValue.includes('/')) { + value = `/${value}`; + } + + prev.value += value; + append({ value }); + + // when literal brackets are explicitly disabled + // assume we should match with a regex character class + if (opts.literalBrackets === false || utils.hasRegexChars(prevValue)) { + continue; + } + + const escaped = utils.escapeRegex(prev.value); + state.output = state.output.slice(0, -prev.value.length); + + // when literal brackets are explicitly enabled + // assume we should escape the brackets to match literal characters + if (opts.literalBrackets === true) { + state.output += escaped; + prev.value = escaped; + continue; + } + + // when the user specifies nothing, try to match both + prev.value = `(${capture}${escaped}|${prev.value})`; + state.output += prev.value; + continue; + } + + /** + * Braces + */ + + if (value === '{' && opts.nobrace !== true) { + increment('braces'); + + const open = { + type: 'brace', + value, + output: '(', + outputIndex: state.output.length, + tokensIndex: state.tokens.length + }; + + braces.push(open); + push(open); + continue; + } + + if (value === '}') { + const brace = braces[braces.length - 1]; + + if (opts.nobrace === true || !brace) { + push({ type: 'text', value, output: value }); + continue; + } + + let output = ')'; + + if (brace.dots === true) { + const arr = tokens.slice(); + const range = []; + + for (let i = arr.length - 1; i >= 0; i--) { + tokens.pop(); + if (arr[i].type === 'brace') { + break; + } + if (arr[i].type !== 'dots') { + range.unshift(arr[i].value); + } + } + + output = expandRange(range, opts); + state.backtrack = true; + } + + if (brace.comma !== true && brace.dots !== true) { + const out = state.output.slice(0, brace.outputIndex); + const toks = state.tokens.slice(brace.tokensIndex); + brace.value = brace.output = '\\{'; + value = output = '\\}'; + state.output = out; + for (const t of toks) { + state.output += (t.output || t.value); + } + } + + push({ type: 'brace', value, output }); + decrement('braces'); + braces.pop(); + continue; + } + + /** + * Pipes + */ + + if (value === '|') { + if (extglobs.length > 0) { + extglobs[extglobs.length - 1].conditions++; + } + push({ type: 'text', value }); + continue; + } + + /** + * Commas + */ + + if (value === ',') { + let output = value; + + const brace = braces[braces.length - 1]; + if (brace && stack[stack.length - 1] === 'braces') { + brace.comma = true; + output = '|'; + } + + push({ type: 'comma', value, output }); + continue; + } + + /** + * Slashes + */ + + if (value === '/') { + // if the beginning of the glob is "./", advance the start + // to the current 
index, and don't add the "./" characters + // to the state. This greatly simplifies lookbehinds when + // checking for BOS characters like "!" and "." (not "./") + if (prev.type === 'dot' && state.index === state.start + 1) { + state.start = state.index + 1; + state.consumed = ''; + state.output = ''; + tokens.pop(); + prev = bos; // reset "prev" to the first token + continue; + } + + push({ type: 'slash', value, output: SLASH_LITERAL }); + continue; + } + + /** + * Dots + */ + + if (value === '.') { + if (state.braces > 0 && prev.type === 'dot') { + if (prev.value === '.') prev.output = DOT_LITERAL; + const brace = braces[braces.length - 1]; + prev.type = 'dots'; + prev.output += value; + prev.value += value; + brace.dots = true; + continue; + } + + if ((state.braces + state.parens) === 0 && prev.type !== 'bos' && prev.type !== 'slash') { + push({ type: 'text', value, output: DOT_LITERAL }); + continue; + } + + push({ type: 'dot', value, output: DOT_LITERAL }); + continue; + } + + /** + * Question marks + */ + + if (value === '?') { + const isGroup = prev && prev.value === '('; + if (!isGroup && opts.noextglob !== true && peek() === '(' && peek(2) !== '?') { + extglobOpen('qmark', value); + continue; + } + + if (prev && prev.type === 'paren') { + const next = peek(); + let output = value; + + if (next === '<' && !utils.supportsLookbehinds()) { + throw new Error('Node.js v10 or higher is required for regex lookbehinds'); + } + + if ((prev.value === '(' && !/[!=<:]/.test(next)) || (next === '<' && !/<([!=]|\w+>)/.test(remaining()))) { + output = `\\${value}`; + } + + push({ type: 'text', value, output }); + continue; + } + + if (opts.dot !== true && (prev.type === 'slash' || prev.type === 'bos')) { + push({ type: 'qmark', value, output: QMARK_NO_DOT }); + continue; + } + + push({ type: 'qmark', value, output: QMARK }); + continue; + } + + /** + * Exclamation + */ + + if (value === '!') { + if (opts.noextglob !== true && peek() === '(') { + if (peek(2) !== '?' 
|| !/[!=<:]/.test(peek(3))) { + extglobOpen('negate', value); + continue; + } + } + + if (opts.nonegate !== true && state.index === 0) { + negate(); + continue; + } + } + + /** + * Plus + */ + + if (value === '+') { + if (opts.noextglob !== true && peek() === '(' && peek(2) !== '?') { + extglobOpen('plus', value); + continue; + } + + if ((prev && prev.value === '(') || opts.regex === false) { + push({ type: 'plus', value, output: PLUS_LITERAL }); + continue; + } + + if ((prev && (prev.type === 'bracket' || prev.type === 'paren' || prev.type === 'brace')) || state.parens > 0) { + push({ type: 'plus', value }); + continue; + } + + push({ type: 'plus', value: PLUS_LITERAL }); + continue; + } + + /** + * Plain text + */ + + if (value === '@') { + if (opts.noextglob !== true && peek() === '(' && peek(2) !== '?') { + push({ type: 'at', extglob: true, value, output: '' }); + continue; + } + + push({ type: 'text', value }); + continue; + } + + /** + * Plain text + */ + + if (value !== '*') { + if (value === '$' || value === '^') { + value = `\\${value}`; + } + + const match = REGEX_NON_SPECIAL_CHARS.exec(remaining()); + if (match) { + value += match[0]; + state.index += match[0].length; + } + + push({ type: 'text', value }); + continue; + } + + /** + * Stars + */ + + if (prev && (prev.type === 'globstar' || prev.star === true)) { + prev.type = 'star'; + prev.star = true; + prev.value += value; + prev.output = star; + state.backtrack = true; + state.globstar = true; + consume(value); + continue; + } + + let rest = remaining(); + if (opts.noextglob !== true && /^\([^?]/.test(rest)) { + extglobOpen('star', value); + continue; + } + + if (prev.type === 'star') { + if (opts.noglobstar === true) { + consume(value); + continue; + } + + const prior = prev.prev; + const before = prior.prev; + const isStart = prior.type === 'slash' || prior.type === 'bos'; + const afterStar = before && (before.type === 'star' || before.type === 'globstar'); + + if (opts.bash === true && (!isStart || (rest[0] && rest[0] !== '/'))) { + push({ type: 'star', value, output: '' }); + continue; + } + + const isBrace = state.braces > 0 && (prior.type === 'comma' || prior.type === 'brace'); + const isExtglob = extglobs.length && (prior.type === 'pipe' || prior.type === 'paren'); + if (!isStart && prior.type !== 'paren' && !isBrace && !isExtglob) { + push({ type: 'star', value, output: '' }); + continue; + } + + // strip consecutive `/**/` + while (rest.slice(0, 3) === '/**') { + const after = input[state.index + 4]; + if (after && after !== '/') { + break; + } + rest = rest.slice(3); + consume('/**', 3); + } + + if (prior.type === 'bos' && eos()) { + prev.type = 'globstar'; + prev.value += value; + prev.output = globstar(opts); + state.output = prev.output; + state.globstar = true; + consume(value); + continue; + } + + if (prior.type === 'slash' && prior.prev.type !== 'bos' && !afterStar && eos()) { + state.output = state.output.slice(0, -(prior.output + prev.output).length); + prior.output = `(?:${prior.output}`; + + prev.type = 'globstar'; + prev.output = globstar(opts) + (opts.strictSlashes ? ')' : '|$)'); + prev.value += value; + state.globstar = true; + state.output += prior.output + prev.output; + consume(value); + continue; + } + + if (prior.type === 'slash' && prior.prev.type !== 'bos' && rest[0] === '/') { + const end = rest[1] !== void 0 ? 
'|$' : ''; + + state.output = state.output.slice(0, -(prior.output + prev.output).length); + prior.output = `(?:${prior.output}`; + + prev.type = 'globstar'; + prev.output = `${globstar(opts)}${SLASH_LITERAL}|${SLASH_LITERAL}${end})`; + prev.value += value; + + state.output += prior.output + prev.output; + state.globstar = true; + + consume(value + advance()); + + push({ type: 'slash', value: '/', output: '' }); + continue; + } + + if (prior.type === 'bos' && rest[0] === '/') { + prev.type = 'globstar'; + prev.value += value; + prev.output = `(?:^|${SLASH_LITERAL}|${globstar(opts)}${SLASH_LITERAL})`; + state.output = prev.output; + state.globstar = true; + consume(value + advance()); + push({ type: 'slash', value: '/', output: '' }); + continue; + } + + // remove single star from output + state.output = state.output.slice(0, -prev.output.length); + + // reset previous token to globstar + prev.type = 'globstar'; + prev.output = globstar(opts); + prev.value += value; + + // reset output with globstar + state.output += prev.output; + state.globstar = true; + consume(value); + continue; + } + + const token = { type: 'star', value, output: star }; + + if (opts.bash === true) { + token.output = '.*?'; + if (prev.type === 'bos' || prev.type === 'slash') { + token.output = nodot + token.output; + } + push(token); + continue; + } + + if (prev && (prev.type === 'bracket' || prev.type === 'paren') && opts.regex === true) { + token.output = value; + push(token); + continue; + } + + if (state.index === state.start || prev.type === 'slash' || prev.type === 'dot') { + if (prev.type === 'dot') { + state.output += NO_DOT_SLASH; + prev.output += NO_DOT_SLASH; + + } else if (opts.dot === true) { + state.output += NO_DOTS_SLASH; + prev.output += NO_DOTS_SLASH; + + } else { + state.output += nodot; + prev.output += nodot; + } + + if (peek() !== '*') { + state.output += ONE_CHAR; + prev.output += ONE_CHAR; + } + } + + push(token); + } + + while (state.brackets > 0) { + if (opts.strictBrackets === true) throw new SyntaxError(syntaxError('closing', ']')); + state.output = utils.escapeLast(state.output, '['); + decrement('brackets'); + } + + while (state.parens > 0) { + if (opts.strictBrackets === true) throw new SyntaxError(syntaxError('closing', ')')); + state.output = utils.escapeLast(state.output, '('); + decrement('parens'); + } + + while (state.braces > 0) { + if (opts.strictBrackets === true) throw new SyntaxError(syntaxError('closing', '}')); + state.output = utils.escapeLast(state.output, '{'); + decrement('braces'); + } + + if (opts.strictSlashes !== true && (prev.type === 'star' || prev.type === 'bracket')) { + push({ type: 'maybe_slash', value: '', output: `${SLASH_LITERAL}?` }); + } + + // rebuild the output if we had to backtrack at any point + if (state.backtrack === true) { + state.output = ''; + + for (const token of state.tokens) { + state.output += token.output != null ? token.output : token.value; + + if (token.suffix) { + state.output += token.suffix; + } + } + } + + return state; +}; + +/** + * Fast paths for creating regular expressions for common glob patterns. + * This can significantly speed up processing and has very little downside + * impact when none of the fast paths match. + */ + +parse.fastpaths = (input, options) => { + const opts = { ...options }; + const max = typeof opts.maxLength === 'number' ? 
Math.min(MAX_LENGTH, opts.maxLength) : MAX_LENGTH; + const len = input.length; + if (len > max) { + throw new SyntaxError(`Input length: ${len}, exceeds maximum allowed length: ${max}`); + } + + input = REPLACEMENTS[input] || input; + const win32 = utils.isWindows(options); + + // create constants based on platform, for windows or posix + const { + DOT_LITERAL, + SLASH_LITERAL, + ONE_CHAR, + DOTS_SLASH, + NO_DOT, + NO_DOTS, + NO_DOTS_SLASH, + STAR, + START_ANCHOR + } = constants.globChars(win32); + + const nodot = opts.dot ? NO_DOTS : NO_DOT; + const slashDot = opts.dot ? NO_DOTS_SLASH : NO_DOT; + const capture = opts.capture ? '' : '?:'; + const state = { negated: false, prefix: '' }; + let star = opts.bash === true ? '.*?' : STAR; + + if (opts.capture) { + star = `(${star})`; + } + + const globstar = opts => { + if (opts.noglobstar === true) return star; + return `(${capture}(?:(?!${START_ANCHOR}${opts.dot ? DOTS_SLASH : DOT_LITERAL}).)*?)`; + }; + + const create = str => { + switch (str) { + case '*': + return `${nodot}${ONE_CHAR}${star}`; + + case '.*': + return `${DOT_LITERAL}${ONE_CHAR}${star}`; + + case '*.*': + return `${nodot}${star}${DOT_LITERAL}${ONE_CHAR}${star}`; + + case '*/*': + return `${nodot}${star}${SLASH_LITERAL}${ONE_CHAR}${slashDot}${star}`; + + case '**': + return nodot + globstar(opts); + + case '**/*': + return `(?:${nodot}${globstar(opts)}${SLASH_LITERAL})?${slashDot}${ONE_CHAR}${star}`; + + case '**/*.*': + return `(?:${nodot}${globstar(opts)}${SLASH_LITERAL})?${slashDot}${star}${DOT_LITERAL}${ONE_CHAR}${star}`; + + case '**/.*': + return `(?:${nodot}${globstar(opts)}${SLASH_LITERAL})?${DOT_LITERAL}${ONE_CHAR}${star}`; + + default: { + const match = /^(.*?)\.(\w+)$/.exec(str); + if (!match) return; + + const source = create(match[1]); + if (!source) return; + + return source + DOT_LITERAL + match[2]; + } + } + }; + + const output = utils.removePrefix(input, state); + let source = create(output); + + if (source && opts.strictSlashes !== true) { + source += `${SLASH_LITERAL}?`; + } + + return source; +}; + +module.exports = parse; diff --git a/node_modules/picomatch/lib/picomatch.js b/node_modules/picomatch/lib/picomatch.js new file mode 100644 index 000000000..782d80943 --- /dev/null +++ b/node_modules/picomatch/lib/picomatch.js @@ -0,0 +1,342 @@ +'use strict'; + +const path = require('path'); +const scan = require('./scan'); +const parse = require('./parse'); +const utils = require('./utils'); +const constants = require('./constants'); +const isObject = val => val && typeof val === 'object' && !Array.isArray(val); + +/** + * Creates a matcher function from one or more glob patterns. The + * returned function takes a string to match as its first argument, + * and returns true if the string is a match. The returned matcher + * function also takes a boolean as the second argument that, when true, + * returns an object with additional information. + * + * ```js + * const picomatch = require('picomatch'); + * // picomatch(glob[, options]); + * + * const isMatch = picomatch('*.!(*a)'); + * console.log(isMatch('a.a')); //=> false + * console.log(isMatch('a.b')); //=> true + * ``` + * @name picomatch + * @param {String|Array} `globs` One or more glob patterns. + * @param {Object=} `options` + * @return {Function=} Returns a matcher function. 
+ * @api public + */ + +const picomatch = (glob, options, returnState = false) => { + if (Array.isArray(glob)) { + const fns = glob.map(input => picomatch(input, options, returnState)); + const arrayMatcher = str => { + for (const isMatch of fns) { + const state = isMatch(str); + if (state) return state; + } + return false; + }; + return arrayMatcher; + } + + const isState = isObject(glob) && glob.tokens && glob.input; + + if (glob === '' || (typeof glob !== 'string' && !isState)) { + throw new TypeError('Expected pattern to be a non-empty string'); + } + + const opts = options || {}; + const posix = utils.isWindows(options); + const regex = isState + ? picomatch.compileRe(glob, options) + : picomatch.makeRe(glob, options, false, true); + + const state = regex.state; + delete regex.state; + + let isIgnored = () => false; + if (opts.ignore) { + const ignoreOpts = { ...options, ignore: null, onMatch: null, onResult: null }; + isIgnored = picomatch(opts.ignore, ignoreOpts, returnState); + } + + const matcher = (input, returnObject = false) => { + const { isMatch, match, output } = picomatch.test(input, regex, options, { glob, posix }); + const result = { glob, state, regex, posix, input, output, match, isMatch }; + + if (typeof opts.onResult === 'function') { + opts.onResult(result); + } + + if (isMatch === false) { + result.isMatch = false; + return returnObject ? result : false; + } + + if (isIgnored(input)) { + if (typeof opts.onIgnore === 'function') { + opts.onIgnore(result); + } + result.isMatch = false; + return returnObject ? result : false; + } + + if (typeof opts.onMatch === 'function') { + opts.onMatch(result); + } + return returnObject ? result : true; + }; + + if (returnState) { + matcher.state = state; + } + + return matcher; +}; + +/** + * Test `input` with the given `regex`. This is used by the main + * `picomatch()` function to test the input string. + * + * ```js + * const picomatch = require('picomatch'); + * // picomatch.test(input, regex[, options]); + * + * console.log(picomatch.test('foo/bar', /^(?:([^/]*?)\/([^/]*?))$/)); + * // { isMatch: true, match: [ 'foo/', 'foo', 'bar' ], output: 'foo/bar' } + * ``` + * @param {String} `input` String to test. + * @param {RegExp} `regex` + * @return {Object} Returns an object with matching info. + * @api public + */ + +picomatch.test = (input, regex, options, { glob, posix } = {}) => { + if (typeof input !== 'string') { + throw new TypeError('Expected input to be a string'); + } + + if (input === '') { + return { isMatch: false, output: '' }; + } + + const opts = options || {}; + const format = opts.format || (posix ? utils.toPosixSlashes : null); + let match = input === glob; + let output = (match && format) ? format(input) : input; + + if (match === false) { + output = format ? format(input) : input; + match = output === glob; + } + + if (match === false || opts.capture === true) { + if (opts.matchBase === true || opts.basename === true) { + match = picomatch.matchBase(input, regex, options, posix); + } else { + match = regex.exec(output); + } + } + + return { isMatch: Boolean(match), match, output }; +}; + +/** + * Match the basename of a filepath. + * + * ```js + * const picomatch = require('picomatch'); + * // picomatch.matchBase(input, glob[, options]); + * console.log(picomatch.matchBase('foo/bar.js', '*.js'); // true + * ``` + * @param {String} `input` String to test. + * @param {RegExp|String} `glob` Glob pattern or regex created by [.makeRe](#makeRe). 
+ * @return {Boolean} + * @api public + */ + +picomatch.matchBase = (input, glob, options, posix = utils.isWindows(options)) => { + const regex = glob instanceof RegExp ? glob : picomatch.makeRe(glob, options); + return regex.test(path.basename(input)); +}; + +/** + * Returns true if **any** of the given glob `patterns` match the specified `string`. + * + * ```js + * const picomatch = require('picomatch'); + * // picomatch.isMatch(string, patterns[, options]); + * + * console.log(picomatch.isMatch('a.a', ['b.*', '*.a'])); //=> true + * console.log(picomatch.isMatch('a.a', 'b.*')); //=> false + * ``` + * @param {String|Array} str The string to test. + * @param {String|Array} patterns One or more glob patterns to use for matching. + * @param {Object} [options] See available [options](#options). + * @return {Boolean} Returns true if any patterns match `str` + * @api public + */ + +picomatch.isMatch = (str, patterns, options) => picomatch(patterns, options)(str); + +/** + * Parse a glob pattern to create the source string for a regular + * expression. + * + * ```js + * const picomatch = require('picomatch'); + * const result = picomatch.parse(pattern[, options]); + * ``` + * @param {String} `pattern` + * @param {Object} `options` + * @return {Object} Returns an object with useful properties and output to be used as a regex source string. + * @api public + */ + +picomatch.parse = (pattern, options) => { + if (Array.isArray(pattern)) return pattern.map(p => picomatch.parse(p, options)); + return parse(pattern, { ...options, fastpaths: false }); +}; + +/** + * Scan a glob pattern to separate the pattern into segments. + * + * ```js + * const picomatch = require('picomatch'); + * // picomatch.scan(input[, options]); + * + * const result = picomatch.scan('!./foo/*.js'); + * console.log(result); + * { prefix: '!./', + * input: '!./foo/*.js', + * start: 3, + * base: 'foo', + * glob: '*.js', + * isBrace: false, + * isBracket: false, + * isGlob: true, + * isExtglob: false, + * isGlobstar: false, + * negated: true } + * ``` + * @param {String} `input` Glob pattern to scan. + * @param {Object} `options` + * @return {Object} Returns an object with + * @api public + */ + +picomatch.scan = (input, options) => scan(input, options); + +/** + * Compile a regular expression from the `state` object returned by the + * [parse()](#parse) method. + * + * @param {Object} `state` + * @param {Object} `options` + * @param {Boolean} `returnOutput` Intended for implementors, this argument allows you to return the raw output from the parser. + * @param {Boolean} `returnState` Adds the state to a `state` property on the returned regex. Useful for implementors and debugging. + * @return {RegExp} + * @api public + */ + +picomatch.compileRe = (state, options, returnOutput = false, returnState = false) => { + if (returnOutput === true) { + return state.output; + } + + const opts = options || {}; + const prepend = opts.contains ? '' : '^'; + const append = opts.contains ? '' : '$'; + + let source = `${prepend}(?:${state.output})${append}`; + if (state && state.negated === true) { + source = `^(?!${source}).*$`; + } + + const regex = picomatch.toRegex(source, options); + if (returnState === true) { + regex.state = state; + } + + return regex; +}; + +/** + * Create a regular expression from a parsed glob pattern. 
+ * + * ```js + * const picomatch = require('picomatch'); + * const state = picomatch.parse('*.js'); + * // picomatch.compileRe(state[, options]); + * + * console.log(picomatch.compileRe(state)); + * //=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/ + * ``` + * @param {String} `state` The object returned from the `.parse` method. + * @param {Object} `options` + * @param {Boolean} `returnOutput` Implementors may use this argument to return the compiled output, instead of a regular expression. This is not exposed on the options to prevent end-users from mutating the result. + * @param {Boolean} `returnState` Implementors may use this argument to return the state from the parsed glob with the returned regular expression. + * @return {RegExp} Returns a regex created from the given pattern. + * @api public + */ + +picomatch.makeRe = (input, options = {}, returnOutput = false, returnState = false) => { + if (!input || typeof input !== 'string') { + throw new TypeError('Expected a non-empty string'); + } + + let parsed = { negated: false, fastpaths: true }; + + if (options.fastpaths !== false && (input[0] === '.' || input[0] === '*')) { + parsed.output = parse.fastpaths(input, options); + } + + if (!parsed.output) { + parsed = parse(input, options); + } + + return picomatch.compileRe(parsed, options, returnOutput, returnState); +}; + +/** + * Create a regular expression from the given regex source string. + * + * ```js + * const picomatch = require('picomatch'); + * // picomatch.toRegex(source[, options]); + * + * const { output } = picomatch.parse('*.js'); + * console.log(picomatch.toRegex(output)); + * //=> /^(?:(?!\.)(?=.)[^/]*?\.js)$/ + * ``` + * @param {String} `source` Regular expression source string. + * @param {Object} `options` + * @return {RegExp} + * @api public + */ + +picomatch.toRegex = (source, options) => { + try { + const opts = options || {}; + return new RegExp(source, opts.flags || (opts.nocase ? 'i' : '')); + } catch (err) { + if (options && options.debug === true) throw err; + return /$^/; + } +}; + +/** + * Picomatch constants. + * @return {Object} + */ + +picomatch.constants = constants; + +/** + * Expose "picomatch" + */ + +module.exports = picomatch; diff --git a/node_modules/picomatch/lib/scan.js b/node_modules/picomatch/lib/scan.js new file mode 100644 index 000000000..e59cd7a13 --- /dev/null +++ b/node_modules/picomatch/lib/scan.js @@ -0,0 +1,391 @@ +'use strict'; + +const utils = require('./utils'); +const { + CHAR_ASTERISK, /* * */ + CHAR_AT, /* @ */ + CHAR_BACKWARD_SLASH, /* \ */ + CHAR_COMMA, /* , */ + CHAR_DOT, /* . */ + CHAR_EXCLAMATION_MARK, /* ! */ + CHAR_FORWARD_SLASH, /* / */ + CHAR_LEFT_CURLY_BRACE, /* { */ + CHAR_LEFT_PARENTHESES, /* ( */ + CHAR_LEFT_SQUARE_BRACKET, /* [ */ + CHAR_PLUS, /* + */ + CHAR_QUESTION_MARK, /* ? */ + CHAR_RIGHT_CURLY_BRACE, /* } */ + CHAR_RIGHT_PARENTHESES, /* ) */ + CHAR_RIGHT_SQUARE_BRACKET /* ] */ +} = require('./constants'); + +const isPathSeparator = code => { + return code === CHAR_FORWARD_SLASH || code === CHAR_BACKWARD_SLASH; +}; + +const depth = token => { + if (token.isPrefix !== true) { + token.depth = token.isGlobstar ? Infinity : 1; + } +}; + +/** + * Quickly scans a glob pattern and returns an object with a handful of + * useful properties, like `isGlob`, `path` (the leading non-glob, if it exists), + * `glob` (the actual pattern), `negated` (true if the path starts with `!` but not + * with `!(`) and `negatedExtglob` (true if the path starts with `!(`). 
+ * + * ```js + * const pm = require('picomatch'); + * console.log(pm.scan('foo/bar/*.js')); + * { isGlob: true, input: 'foo/bar/*.js', base: 'foo/bar', glob: '*.js' } + * ``` + * @param {String} `str` + * @param {Object} `options` + * @return {Object} Returns an object with tokens and regex source string. + * @api public + */ + +const scan = (input, options) => { + const opts = options || {}; + + const length = input.length - 1; + const scanToEnd = opts.parts === true || opts.scanToEnd === true; + const slashes = []; + const tokens = []; + const parts = []; + + let str = input; + let index = -1; + let start = 0; + let lastIndex = 0; + let isBrace = false; + let isBracket = false; + let isGlob = false; + let isExtglob = false; + let isGlobstar = false; + let braceEscaped = false; + let backslashes = false; + let negated = false; + let negatedExtglob = false; + let finished = false; + let braces = 0; + let prev; + let code; + let token = { value: '', depth: 0, isGlob: false }; + + const eos = () => index >= length; + const peek = () => str.charCodeAt(index + 1); + const advance = () => { + prev = code; + return str.charCodeAt(++index); + }; + + while (index < length) { + code = advance(); + let next; + + if (code === CHAR_BACKWARD_SLASH) { + backslashes = token.backslashes = true; + code = advance(); + + if (code === CHAR_LEFT_CURLY_BRACE) { + braceEscaped = true; + } + continue; + } + + if (braceEscaped === true || code === CHAR_LEFT_CURLY_BRACE) { + braces++; + + while (eos() !== true && (code = advance())) { + if (code === CHAR_BACKWARD_SLASH) { + backslashes = token.backslashes = true; + advance(); + continue; + } + + if (code === CHAR_LEFT_CURLY_BRACE) { + braces++; + continue; + } + + if (braceEscaped !== true && code === CHAR_DOT && (code = advance()) === CHAR_DOT) { + isBrace = token.isBrace = true; + isGlob = token.isGlob = true; + finished = true; + + if (scanToEnd === true) { + continue; + } + + break; + } + + if (braceEscaped !== true && code === CHAR_COMMA) { + isBrace = token.isBrace = true; + isGlob = token.isGlob = true; + finished = true; + + if (scanToEnd === true) { + continue; + } + + break; + } + + if (code === CHAR_RIGHT_CURLY_BRACE) { + braces--; + + if (braces === 0) { + braceEscaped = false; + isBrace = token.isBrace = true; + finished = true; + break; + } + } + } + + if (scanToEnd === true) { + continue; + } + + break; + } + + if (code === CHAR_FORWARD_SLASH) { + slashes.push(index); + tokens.push(token); + token = { value: '', depth: 0, isGlob: false }; + + if (finished === true) continue; + if (prev === CHAR_DOT && index === (start + 1)) { + start += 2; + continue; + } + + lastIndex = index + 1; + continue; + } + + if (opts.noext !== true) { + const isExtglobChar = code === CHAR_PLUS + || code === CHAR_AT + || code === CHAR_ASTERISK + || code === CHAR_QUESTION_MARK + || code === CHAR_EXCLAMATION_MARK; + + if (isExtglobChar === true && peek() === CHAR_LEFT_PARENTHESES) { + isGlob = token.isGlob = true; + isExtglob = token.isExtglob = true; + finished = true; + if (code === CHAR_EXCLAMATION_MARK && index === start) { + negatedExtglob = true; + } + + if (scanToEnd === true) { + while (eos() !== true && (code = advance())) { + if (code === CHAR_BACKWARD_SLASH) { + backslashes = token.backslashes = true; + code = advance(); + continue; + } + + if (code === CHAR_RIGHT_PARENTHESES) { + isGlob = token.isGlob = true; + finished = true; + break; + } + } + continue; + } + break; + } + } + + if (code === CHAR_ASTERISK) { + if (prev === CHAR_ASTERISK) isGlobstar = 
token.isGlobstar = true; + isGlob = token.isGlob = true; + finished = true; + + if (scanToEnd === true) { + continue; + } + break; + } + + if (code === CHAR_QUESTION_MARK) { + isGlob = token.isGlob = true; + finished = true; + + if (scanToEnd === true) { + continue; + } + break; + } + + if (code === CHAR_LEFT_SQUARE_BRACKET) { + while (eos() !== true && (next = advance())) { + if (next === CHAR_BACKWARD_SLASH) { + backslashes = token.backslashes = true; + advance(); + continue; + } + + if (next === CHAR_RIGHT_SQUARE_BRACKET) { + isBracket = token.isBracket = true; + isGlob = token.isGlob = true; + finished = true; + break; + } + } + + if (scanToEnd === true) { + continue; + } + + break; + } + + if (opts.nonegate !== true && code === CHAR_EXCLAMATION_MARK && index === start) { + negated = token.negated = true; + start++; + continue; + } + + if (opts.noparen !== true && code === CHAR_LEFT_PARENTHESES) { + isGlob = token.isGlob = true; + + if (scanToEnd === true) { + while (eos() !== true && (code = advance())) { + if (code === CHAR_LEFT_PARENTHESES) { + backslashes = token.backslashes = true; + code = advance(); + continue; + } + + if (code === CHAR_RIGHT_PARENTHESES) { + finished = true; + break; + } + } + continue; + } + break; + } + + if (isGlob === true) { + finished = true; + + if (scanToEnd === true) { + continue; + } + + break; + } + } + + if (opts.noext === true) { + isExtglob = false; + isGlob = false; + } + + let base = str; + let prefix = ''; + let glob = ''; + + if (start > 0) { + prefix = str.slice(0, start); + str = str.slice(start); + lastIndex -= start; + } + + if (base && isGlob === true && lastIndex > 0) { + base = str.slice(0, lastIndex); + glob = str.slice(lastIndex); + } else if (isGlob === true) { + base = ''; + glob = str; + } else { + base = str; + } + + if (base && base !== '' && base !== '/' && base !== str) { + if (isPathSeparator(base.charCodeAt(base.length - 1))) { + base = base.slice(0, -1); + } + } + + if (opts.unescape === true) { + if (glob) glob = utils.removeBackslashes(glob); + + if (base && backslashes === true) { + base = utils.removeBackslashes(base); + } + } + + const state = { + prefix, + input, + start, + base, + glob, + isBrace, + isBracket, + isGlob, + isExtglob, + isGlobstar, + negated, + negatedExtglob + }; + + if (opts.tokens === true) { + state.maxDepth = 0; + if (!isPathSeparator(code)) { + tokens.push(token); + } + state.tokens = tokens; + } + + if (opts.parts === true || opts.tokens === true) { + let prevIndex; + + for (let idx = 0; idx < slashes.length; idx++) { + const n = prevIndex ? 
prevIndex + 1 : start; + const i = slashes[idx]; + const value = input.slice(n, i); + if (opts.tokens) { + if (idx === 0 && start !== 0) { + tokens[idx].isPrefix = true; + tokens[idx].value = prefix; + } else { + tokens[idx].value = value; + } + depth(tokens[idx]); + state.maxDepth += tokens[idx].depth; + } + if (idx !== 0 || value !== '') { + parts.push(value); + } + prevIndex = i; + } + + if (prevIndex && prevIndex + 1 < input.length) { + const value = input.slice(prevIndex + 1); + parts.push(value); + + if (opts.tokens) { + tokens[tokens.length - 1].value = value; + depth(tokens[tokens.length - 1]); + state.maxDepth += tokens[tokens.length - 1].depth; + } + } + + state.slashes = slashes; + state.parts = parts; + } + + return state; +}; + +module.exports = scan; diff --git a/node_modules/picomatch/lib/utils.js b/node_modules/picomatch/lib/utils.js new file mode 100644 index 000000000..c3ca766a7 --- /dev/null +++ b/node_modules/picomatch/lib/utils.js @@ -0,0 +1,64 @@ +'use strict'; + +const path = require('path'); +const win32 = process.platform === 'win32'; +const { + REGEX_BACKSLASH, + REGEX_REMOVE_BACKSLASH, + REGEX_SPECIAL_CHARS, + REGEX_SPECIAL_CHARS_GLOBAL +} = require('./constants'); + +exports.isObject = val => val !== null && typeof val === 'object' && !Array.isArray(val); +exports.hasRegexChars = str => REGEX_SPECIAL_CHARS.test(str); +exports.isRegexChar = str => str.length === 1 && exports.hasRegexChars(str); +exports.escapeRegex = str => str.replace(REGEX_SPECIAL_CHARS_GLOBAL, '\\$1'); +exports.toPosixSlashes = str => str.replace(REGEX_BACKSLASH, '/'); + +exports.removeBackslashes = str => { + return str.replace(REGEX_REMOVE_BACKSLASH, match => { + return match === '\\' ? '' : match; + }); +}; + +exports.supportsLookbehinds = () => { + const segs = process.version.slice(1).split('.').map(Number); + if (segs.length === 3 && segs[0] >= 9 || (segs[0] === 8 && segs[1] >= 10)) { + return true; + } + return false; +}; + +exports.isWindows = options => { + if (options && typeof options.windows === 'boolean') { + return options.windows; + } + return win32 === true || path.sep === '\\'; +}; + +exports.escapeLast = (input, char, lastIdx) => { + const idx = input.lastIndexOf(char, lastIdx); + if (idx === -1) return input; + if (input[idx - 1] === '\\') return exports.escapeLast(input, char, idx - 1); + return `${input.slice(0, idx)}\\${input.slice(idx)}`; +}; + +exports.removePrefix = (input, state = {}) => { + let output = input; + if (output.startsWith('./')) { + output = output.slice(2); + state.prefix = './'; + } + return output; +}; + +exports.wrapOutput = (input, state = {}, options = {}) => { + const prepend = options.contains ? '' : '^'; + const append = options.contains ? 
'' : '$'; + + let output = `${prepend}(?:${input})${append}`; + if (state.negated === true) { + output = `(?:^(?!${output}).*$)`; + } + return output; +}; diff --git a/node_modules/picomatch/package.json b/node_modules/picomatch/package.json new file mode 100644 index 000000000..8c35da204 --- /dev/null +++ b/node_modules/picomatch/package.json @@ -0,0 +1,113 @@ +{ + "_from": "picomatch@^2.0.4", + "_id": "picomatch@2.3.0", + "_inBundle": false, + "_integrity": "sha512-lY1Q/PiJGC2zOv/z391WOTD+Z02bCgsFfvxoXXf6h7kv9o+WmsmzYqrAwY63sNgOxE4xEdq0WyUnXfKeBrSvYw==", + "_location": "/picomatch", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "picomatch@^2.0.4", + "name": "picomatch", + "escapedName": "picomatch", + "rawSpec": "^2.0.4", + "saveSpec": null, + "fetchSpec": "^2.0.4" + }, + "_requiredBy": [ + "/anymatch", + "/readdirp" + ], + "_resolved": "https://registry.npmjs.org/picomatch/-/picomatch-2.3.0.tgz", + "_shasum": "f1f061de8f6a4bf022892e2d128234fb98302972", + "_spec": "picomatch@^2.0.4", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\anymatch", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/micromatch/picomatch/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Blazing fast and accurate glob matcher written in JavaScript, with no dependencies and full support for standard and extended Bash glob features, including braces, extglobs, POSIX brackets, and regular expressions.", + "devDependencies": { + "eslint": "^6.8.0", + "fill-range": "^7.0.1", + "gulp-format-md": "^2.0.0", + "mocha": "^6.2.2", + "nyc": "^15.0.0", + "time-require": "github:jonschlinkert/time-require" + }, + "engines": { + "node": ">=8.6" + }, + "files": [ + "index.js", + "lib" + ], + "funding": "https://github.com/sponsors/jonschlinkert", + "homepage": "https://github.com/micromatch/picomatch", + "keywords": [ + "glob", + "match", + "picomatch" + ], + "license": "MIT", + "main": "index.js", + "name": "picomatch", + "nyc": { + "reporter": [ + "html", + "lcov", + "text-summary" + ] + }, + "repository": { + "type": "git", + "url": "git+https://github.com/micromatch/picomatch.git" + }, + "scripts": { + "lint": "eslint --cache --cache-location node_modules/.cache/.eslintcache --report-unused-disable-directives --ignore-path .gitignore .", + "mocha": "mocha --reporter dot", + "test": "npm run lint && npm run mocha", + "test:ci": "npm run test:cover", + "test:cover": "nyc npm run mocha" + }, + "verb": { + "toc": { + "render": true, + "method": "preWrite", + "maxdepth": 3 + }, + "layout": "empty", + "tasks": [ + "readme" + ], + "plugins": [ + "gulp-format-md" + ], + "lint": { + "reflinks": true + }, + "related": { + "list": [ + "braces", + "micromatch" + ] + }, + "reflinks": [ + "braces", + "expand-brackets", + "extglob", + "fill-range", + "micromatch", + "minimatch", + "nanomatch", + "picomatch" + ] + }, + "version": "2.3.0" +} diff --git a/node_modules/prepend-http/index.js b/node_modules/prepend-http/index.js new file mode 100644 index 000000000..82b3a6b27 --- /dev/null +++ b/node_modules/prepend-http/index.js @@ -0,0 +1,15 @@ +'use strict'; +module.exports = (url, opts) => { + if (typeof url !== 'string') { + throw new TypeError(`Expected \`url\` to be of type \`string\`, got \`${typeof url}\``); + } + + url = url.trim(); + opts = Object.assign({https: false}, opts); + + if (/^\.*\/|^(?!localhost)\w+:/.test(url)) { + 
return url; + } + + return url.replace(/^(?!(?:\w+:)?\/\/)/, opts.https ? 'https://' : 'http://'); +}; diff --git a/node_modules/prepend-http/license b/node_modules/prepend-http/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/prepend-http/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/prepend-http/package.json b/node_modules/prepend-http/package.json new file mode 100644 index 000000000..cf78fd1f3 --- /dev/null +++ b/node_modules/prepend-http/package.json @@ -0,0 +1,67 @@ +{ + "_from": "prepend-http@^2.0.0", + "_id": "prepend-http@2.0.0", + "_inBundle": false, + "_integrity": "sha1-6SQ0v6XqjBn0HN/UAddBo8gZ2Jc=", + "_location": "/prepend-http", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "prepend-http@^2.0.0", + "name": "prepend-http", + "escapedName": "prepend-http", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/url-parse-lax" + ], + "_resolved": "https://registry.npmjs.org/prepend-http/-/prepend-http-2.0.0.tgz", + "_shasum": "e92434bfa5ea8c19f41cdfd401d741a3c819d897", + "_spec": "prepend-http@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\url-parse-lax", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/prepend-http/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Prepend `http://` to humanized URLs like todomvc.com and localhost", + "devDependencies": { + "ava": "*", + "xo": "*" + }, + "engines": { + "node": ">=4" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/prepend-http#readme", + "keywords": [ + "prepend", + "protocol", + "scheme", + "url", + "uri", + "http", + "https", + "humanized" + ], + "license": "MIT", + "name": "prepend-http", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/prepend-http.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "2.0.0" +} diff --git a/node_modules/prepend-http/readme.md b/node_modules/prepend-http/readme.md new file mode 100644 index 000000000..55d640df6 --- /dev/null +++ b/node_modules/prepend-http/readme.md @@ -0,0 +1,56 @@ +# prepend-http [![Build 
Status](https://travis-ci.org/sindresorhus/prepend-http.svg?branch=master)](https://travis-ci.org/sindresorhus/prepend-http) + +> Prepend `http://` to humanized URLs like `todomvc.com` and `localhost` + + +## Install + +``` +$ npm install prepend-http +``` + + +## Usage + +```js +const prependHttp = require('prepend-http'); + +prependHttp('todomvc.com'); +//=> 'http://todomvc.com' + +prependHttp('localhost'); +//=> 'http://localhost' + +prependHttp('http://todomvc.com'); +//=> 'http://todomvc.com' + +prependHttp('todomvc.com', {https: true}); +//=> 'https://todomvc.com' +``` + + +## API + +### prependHttp(url, [options]) + +#### url + +Type: `string` + +URL to prepend `http://` on. + +#### options + +Type: `Object` + +##### https + +Type: `boolean`
+Default: `false` + +Prepend `https://` instead of `http://`. + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/proxy-addr/HISTORY.md b/node_modules/proxy-addr/HISTORY.md new file mode 100644 index 000000000..8480242a0 --- /dev/null +++ b/node_modules/proxy-addr/HISTORY.md @@ -0,0 +1,161 @@ +2.0.7 / 2021-05-31 +================== + + * deps: forwarded@0.2.0 + - Use `req.socket` over deprecated `req.connection` + +2.0.6 / 2020-02-24 +================== + + * deps: ipaddr.js@1.9.1 + +2.0.5 / 2019-04-16 +================== + + * deps: ipaddr.js@1.9.0 + +2.0.4 / 2018-07-26 +================== + + * deps: ipaddr.js@1.8.0 + +2.0.3 / 2018-02-19 +================== + + * deps: ipaddr.js@1.6.0 + +2.0.2 / 2017-09-24 +================== + + * deps: forwarded@~0.1.2 + - perf: improve header parsing + - perf: reduce overhead when no `X-Forwarded-For` header + +2.0.1 / 2017-09-10 +================== + + * deps: forwarded@~0.1.1 + - Fix trimming leading / trailing OWS + - perf: hoist regular expression + * deps: ipaddr.js@1.5.2 + +2.0.0 / 2017-08-08 +================== + + * Drop support for Node.js below 0.10 + +1.1.5 / 2017-07-25 +================== + + * Fix array argument being altered + * deps: ipaddr.js@1.4.0 + +1.1.4 / 2017-03-24 +================== + + * deps: ipaddr.js@1.3.0 + +1.1.3 / 2017-01-14 +================== + + * deps: ipaddr.js@1.2.0 + +1.1.2 / 2016-05-29 +================== + + * deps: ipaddr.js@1.1.1 + - Fix IPv6-mapped IPv4 validation edge cases + +1.1.1 / 2016-05-03 +================== + + * Fix regression matching mixed versions against multiple subnets + +1.1.0 / 2016-05-01 +================== + + * Fix accepting various invalid netmasks + - IPv4 netmasks must be contingous + - IPv6 addresses cannot be used as a netmask + * deps: ipaddr.js@1.1.0 + +1.0.10 / 2015-12-09 +=================== + + * deps: ipaddr.js@1.0.5 + - Fix regression in `isValid` with non-string arguments + +1.0.9 / 2015-12-01 +================== + + * deps: ipaddr.js@1.0.4 + - Fix accepting some invalid IPv6 addresses + - Reject CIDRs with negative or overlong masks + * perf: enable strict mode + +1.0.8 / 2015-05-10 +================== + + * deps: ipaddr.js@1.0.1 + +1.0.7 / 2015-03-16 +================== + + * deps: ipaddr.js@0.1.9 + - Fix OOM on certain inputs to `isValid` + +1.0.6 / 2015-02-01 +================== + + * deps: ipaddr.js@0.1.8 + +1.0.5 / 2015-01-08 +================== + + * deps: ipaddr.js@0.1.6 + +1.0.4 / 2014-11-23 +================== + + * deps: ipaddr.js@0.1.5 + - Fix edge cases with `isValid` + +1.0.3 / 2014-09-21 +================== + + * Use `forwarded` npm module + +1.0.2 / 2014-09-18 +================== + + * Fix a global leak when multiple subnets are trusted + * Support Node.js 0.6 + * deps: ipaddr.js@0.1.3 + +1.0.1 / 2014-06-03 +================== + + * Fix links in npm package + +1.0.0 / 2014-05-08 +================== + + * Add `trust` argument to determine proxy trust on + * Accepts custom function + * Accepts IPv4/IPv6 address(es) + * Accepts subnets + * Accepts pre-defined names + * Add optional `trust` argument to `proxyaddr.all` to + stop at first untrusted + * Add `proxyaddr.compile` to pre-compile `trust` function + to make subsequent calls faster + +0.0.1 / 2014-05-04 +================== + + * Fix bad npm publish + +0.0.0 / 2014-05-04 +================== + + * Initial release diff --git a/node_modules/proxy-addr/LICENSE b/node_modules/proxy-addr/LICENSE new file mode 100644 index 000000000..cab251c2b --- /dev/null +++ 
b/node_modules/proxy-addr/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2014-2016 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/proxy-addr/README.md b/node_modules/proxy-addr/README.md new file mode 100644 index 000000000..69c0b63ea --- /dev/null +++ b/node_modules/proxy-addr/README.md @@ -0,0 +1,139 @@ +# proxy-addr + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Node.js Version][node-image]][node-url] +[![Build Status][ci-image]][ci-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Determine address of proxied request + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install proxy-addr +``` + +## API + +```js +var proxyaddr = require('proxy-addr') +``` + +### proxyaddr(req, trust) + +Return the address of the request, using the given `trust` parameter. + +The `trust` argument is a function that returns `true` if you trust +the address, `false` if you don't. The closest untrusted address is +returned. + +```js +proxyaddr(req, function (addr) { return addr === '127.0.0.1' }) +proxyaddr(req, function (addr, i) { return i < 1 }) +``` + +The `trust` arugment may also be a single IP address string or an +array of trusted addresses, as plain IP addresses, CIDR-formatted +strings, or IP/netmask strings. + +```js +proxyaddr(req, '127.0.0.1') +proxyaddr(req, ['127.0.0.0/8', '10.0.0.0/8']) +proxyaddr(req, ['127.0.0.0/255.0.0.0', '192.168.0.0/255.255.0.0']) +``` + +This module also supports IPv6. Your IPv6 addresses will be normalized +automatically (i.e. `fe80::00ed:1` equals `fe80:0:0:0:0:0:ed:1`). + +```js +proxyaddr(req, '::1') +proxyaddr(req, ['::1/128', 'fe80::/10']) +``` + +This module will automatically work with IPv4-mapped IPv6 addresses +as well to support node.js in IPv6-only mode. This means that you do +not have to specify both `::ffff:a00:1` and `10.0.0.1`. + +As a convenience, this module also takes certain pre-defined names +in addition to IP addresses, which expand into IP addresses: + +```js +proxyaddr(req, 'loopback') +proxyaddr(req, ['loopback', 'fc00:ac:1ab5:fff::1/64']) +``` + + * `loopback`: IPv4 and IPv6 loopback addresses (like `::1` and + `127.0.0.1`). 
+ * `linklocal`: IPv4 and IPv6 link-local addresses (like + `fe80::1:1:1:1` and `169.254.0.1`). + * `uniquelocal`: IPv4 private addresses and IPv6 unique-local + addresses (like `fc00:ac:1ab5:fff::1` and `192.168.0.1`). + +When `trust` is specified as a function, it will be called for each +address to determine if it is a trusted address. The function is +given two arguments: `addr` and `i`, where `addr` is a string of +the address to check and `i` is a number that represents the distance +from the socket address. + +### proxyaddr.all(req, [trust]) + +Return all the addresses of the request, optionally stopping at the +first untrusted. This array is ordered from closest to furthest +(i.e. `arr[0] === req.connection.remoteAddress`). + +```js +proxyaddr.all(req) +``` + +The optional `trust` argument takes the same arguments as `trust` +does in `proxyaddr(req, trust)`. + +```js +proxyaddr.all(req, 'loopback') +``` + +### proxyaddr.compile(val) + +Compiles argument `val` into a `trust` function. This function takes +the same arguments as `trust` does in `proxyaddr(req, trust)` and +returns a function suitable for `proxyaddr(req, trust)`. + +```js +var trust = proxyaddr.compile('loopback') +var addr = proxyaddr(req, trust) +``` + +This function is meant to be optimized for use against every request. +It is recommend to compile a trust function up-front for the trusted +configuration and pass that to `proxyaddr(req, trust)` for each request. + +## Testing + +```sh +$ npm test +``` + +## Benchmarks + +```sh +$ npm run-script bench +``` + +## License + +[MIT](LICENSE) + +[ci-image]: https://badgen.net/github/checks/jshttp/proxy-addr/master?label=ci +[ci-url]: https://github.com/jshttp/proxy-addr/actions?query=workflow%3Aci +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/proxy-addr/master +[coveralls-url]: https://coveralls.io/r/jshttp/proxy-addr?branch=master +[node-image]: https://badgen.net/npm/node/proxy-addr +[node-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/proxy-addr +[npm-url]: https://npmjs.org/package/proxy-addr +[npm-version-image]: https://badgen.net/npm/v/proxy-addr diff --git a/node_modules/proxy-addr/index.js b/node_modules/proxy-addr/index.js new file mode 100644 index 000000000..a909b0506 --- /dev/null +++ b/node_modules/proxy-addr/index.js @@ -0,0 +1,327 @@ +/*! + * proxy-addr + * Copyright(c) 2014-2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = proxyaddr +module.exports.all = alladdrs +module.exports.compile = compile + +/** + * Module dependencies. + * @private + */ + +var forwarded = require('forwarded') +var ipaddr = require('ipaddr.js') + +/** + * Variables. + * @private + */ + +var DIGIT_REGEXP = /^[0-9]+$/ +var isip = ipaddr.isValid +var parseip = ipaddr.parse + +/** + * Pre-defined IP ranges. + * @private + */ + +var IP_RANGES = { + linklocal: ['169.254.0.0/16', 'fe80::/10'], + loopback: ['127.0.0.1/8', '::1/128'], + uniquelocal: ['10.0.0.0/8', '172.16.0.0/12', '192.168.0.0/16', 'fc00::/7'] +} + +/** + * Get all addresses in the request, optionally stopping + * at the first untrusted. 
+ * + * @param {Object} request + * @param {Function|Array|String} [trust] + * @public + */ + +function alladdrs (req, trust) { + // get addresses + var addrs = forwarded(req) + + if (!trust) { + // Return all addresses + return addrs + } + + if (typeof trust !== 'function') { + trust = compile(trust) + } + + for (var i = 0; i < addrs.length - 1; i++) { + if (trust(addrs[i], i)) continue + + addrs.length = i + 1 + } + + return addrs +} + +/** + * Compile argument into trust function. + * + * @param {Array|String} val + * @private + */ + +function compile (val) { + if (!val) { + throw new TypeError('argument is required') + } + + var trust + + if (typeof val === 'string') { + trust = [val] + } else if (Array.isArray(val)) { + trust = val.slice() + } else { + throw new TypeError('unsupported trust argument') + } + + for (var i = 0; i < trust.length; i++) { + val = trust[i] + + if (!Object.prototype.hasOwnProperty.call(IP_RANGES, val)) { + continue + } + + // Splice in pre-defined range + val = IP_RANGES[val] + trust.splice.apply(trust, [i, 1].concat(val)) + i += val.length - 1 + } + + return compileTrust(compileRangeSubnets(trust)) +} + +/** + * Compile `arr` elements into range subnets. + * + * @param {Array} arr + * @private + */ + +function compileRangeSubnets (arr) { + var rangeSubnets = new Array(arr.length) + + for (var i = 0; i < arr.length; i++) { + rangeSubnets[i] = parseipNotation(arr[i]) + } + + return rangeSubnets +} + +/** + * Compile range subnet array into trust function. + * + * @param {Array} rangeSubnets + * @private + */ + +function compileTrust (rangeSubnets) { + // Return optimized function based on length + var len = rangeSubnets.length + return len === 0 + ? trustNone + : len === 1 + ? trustSingle(rangeSubnets[0]) + : trustMulti(rangeSubnets) +} + +/** + * Parse IP notation string into range subnet. + * + * @param {String} note + * @private + */ + +function parseipNotation (note) { + var pos = note.lastIndexOf('/') + var str = pos !== -1 + ? note.substring(0, pos) + : note + + if (!isip(str)) { + throw new TypeError('invalid IP address: ' + str) + } + + var ip = parseip(str) + + if (pos === -1 && ip.kind() === 'ipv6' && ip.isIPv4MappedAddress()) { + // Store as IPv4 + ip = ip.toIPv4Address() + } + + var max = ip.kind() === 'ipv6' + ? 128 + : 32 + + var range = pos !== -1 + ? note.substring(pos + 1, note.length) + : null + + if (range === null) { + range = max + } else if (DIGIT_REGEXP.test(range)) { + range = parseInt(range, 10) + } else if (ip.kind() === 'ipv4' && isip(range)) { + range = parseNetmask(range) + } else { + range = null + } + + if (range <= 0 || range > max) { + throw new TypeError('invalid range on address: ' + note) + } + + return [ip, range] +} + +/** + * Parse netmask string into CIDR range. + * + * @param {String} netmask + * @private + */ + +function parseNetmask (netmask) { + var ip = parseip(netmask) + var kind = ip.kind() + + return kind === 'ipv4' + ? ip.prefixLengthFromSubnetMask() + : null +} + +/** + * Determine address of proxied request. + * + * @param {Object} request + * @param {Function|Array|String} trust + * @public + */ + +function proxyaddr (req, trust) { + if (!req) { + throw new TypeError('req argument is required') + } + + if (!trust) { + throw new TypeError('trust argument is required') + } + + var addrs = alladdrs(req, trust) + var addr = addrs[addrs.length - 1] + + return addr +} + +/** + * Static trust function to trust nothing. 
+ * + * @private + */ + +function trustNone () { + return false +} + +/** + * Compile trust function for multiple subnets. + * + * @param {Array} subnets + * @private + */ + +function trustMulti (subnets) { + return function trust (addr) { + if (!isip(addr)) return false + + var ip = parseip(addr) + var ipconv + var kind = ip.kind() + + for (var i = 0; i < subnets.length; i++) { + var subnet = subnets[i] + var subnetip = subnet[0] + var subnetkind = subnetip.kind() + var subnetrange = subnet[1] + var trusted = ip + + if (kind !== subnetkind) { + if (subnetkind === 'ipv4' && !ip.isIPv4MappedAddress()) { + // Incompatible IP addresses + continue + } + + if (!ipconv) { + // Convert IP to match subnet IP kind + ipconv = subnetkind === 'ipv4' + ? ip.toIPv4Address() + : ip.toIPv4MappedAddress() + } + + trusted = ipconv + } + + if (trusted.match(subnetip, subnetrange)) { + return true + } + } + + return false + } +} + +/** + * Compile trust function for single subnet. + * + * @param {Object} subnet + * @private + */ + +function trustSingle (subnet) { + var subnetip = subnet[0] + var subnetkind = subnetip.kind() + var subnetisipv4 = subnetkind === 'ipv4' + var subnetrange = subnet[1] + + return function trust (addr) { + if (!isip(addr)) return false + + var ip = parseip(addr) + var kind = ip.kind() + + if (kind !== subnetkind) { + if (subnetisipv4 && !ip.isIPv4MappedAddress()) { + // Incompatible IP addresses + return false + } + + // Convert IP to match subnet IP kind + ip = subnetisipv4 + ? ip.toIPv4Address() + : ip.toIPv4MappedAddress() + } + + return ip.match(subnetip, subnetrange) + } +} diff --git a/node_modules/proxy-addr/package.json b/node_modules/proxy-addr/package.json new file mode 100644 index 000000000..b20e52bdb --- /dev/null +++ b/node_modules/proxy-addr/package.json @@ -0,0 +1,82 @@ +{ + "_from": "proxy-addr@~2.0.5", + "_id": "proxy-addr@2.0.7", + "_inBundle": false, + "_integrity": "sha512-llQsMLSUDUPT44jdrU/O37qlnifitDP+ZwrmmZcoSKyLKvtZxpyV0n2/bD/N4tBAAZ/gJEdZU7KMraoK1+XYAg==", + "_location": "/proxy-addr", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "proxy-addr@~2.0.5", + "name": "proxy-addr", + "escapedName": "proxy-addr", + "rawSpec": "~2.0.5", + "saveSpec": null, + "fetchSpec": "~2.0.5" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/proxy-addr/-/proxy-addr-2.0.7.tgz", + "_shasum": "f19fe69ceab311eeb94b42e70e8c2070f9ba1025", + "_spec": "proxy-addr@~2.0.5", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/jshttp/proxy-addr/issues" + }, + "bundleDependencies": false, + "dependencies": { + "forwarded": "0.2.0", + "ipaddr.js": "1.9.1" + }, + "deprecated": false, + "description": "Determine address of proxied request", + "devDependencies": { + "beautify-benchmark": "0.2.4", + "benchmark": "2.1.4", + "deep-equal": "1.0.1", + "eslint": "7.26.0", + "eslint-config-standard": "14.1.1", + "eslint-plugin-import": "2.23.4", + "eslint-plugin-markdown": "2.2.0", + "eslint-plugin-node": "11.1.0", + "eslint-plugin-promise": "4.3.1", + "eslint-plugin-standard": "4.1.0", + "mocha": "8.4.0", + "nyc": "15.1.0" + }, + "engines": { + "node": ">= 0.10" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "README.md", + "index.js" + ], + "homepage": "https://github.com/jshttp/proxy-addr#readme", + "keywords": [ + "ip", + 
"proxy", + "x-forwarded-for" + ], + "license": "MIT", + "name": "proxy-addr", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/proxy-addr.git" + }, + "scripts": { + "bench": "node benchmark/index.js", + "lint": "eslint .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-ci": "nyc --reporter=lcov --reporter=text npm test", + "test-cov": "nyc --reporter=html --reporter=text npm test" + }, + "version": "2.0.7" +} diff --git a/node_modules/pstree.remy/.travis.yml b/node_modules/pstree.remy/.travis.yml new file mode 100644 index 000000000..5bf093eef --- /dev/null +++ b/node_modules/pstree.remy/.travis.yml @@ -0,0 +1,8 @@ +language: node_js +cache: + directories: + - ~/.npm +notifications: + email: false +node_js: + - '8' diff --git a/node_modules/pstree.remy/LICENSE b/node_modules/pstree.remy/LICENSE new file mode 100644 index 000000000..e83bea65c --- /dev/null +++ b/node_modules/pstree.remy/LICENSE @@ -0,0 +1,7 @@ +The MIT License (MIT) +Copyright © 2019 Remy Sharp, https://remysharp.com +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/pstree.remy/README.md b/node_modules/pstree.remy/README.md new file mode 100644 index 000000000..5f44c6292 --- /dev/null +++ b/node_modules/pstree.remy/README.md @@ -0,0 +1,26 @@ +# pstree.remy + +> Cross platform ps-tree (including unix flavours without ps) + +## Installation + +```shel +npm install pstree.remy +``` + +## Usage + +```js +const psTree = psTree require('pstree.remy'); + +psTree(PID, (err, pids) => { + if (err) { + console.error(err); + } + console.log(pids) +}); + +console.log(psTree.hasPS + ? 
"This platform has the ps shell command" + : "This platform does not have the ps shell command"); +``` diff --git a/node_modules/pstree.remy/lib/index.js b/node_modules/pstree.remy/lib/index.js new file mode 100644 index 000000000..743e99795 --- /dev/null +++ b/node_modules/pstree.remy/lib/index.js @@ -0,0 +1,37 @@ +const exec = require('child_process').exec; +const tree = require('./tree'); +const utils = require('./utils'); +var hasPS = true; + +// discover if the OS has `ps`, and therefore can use psTree +exec('ps', (error) => { + module.exports.hasPS = hasPS = !error; +}); + +module.exports = function main(pid, callback) { + if (typeof pid === 'number') { + pid = pid.toString(); + } + + if (hasPS && !process.env.NO_PS) { + return tree(pid, callback); + } + + utils + .getStat() + .then(utils.tree) + .then((tree) => utils.pidsForTree(tree, pid)) + .then((res) => + callback( + null, + res.map((p) => p.PID) + ) + ) + .catch((error) => callback(error)); +}; + +if (!module.parent) { + module.exports(process.argv[2], (e, pids) => console.log(pids)); +} + +module.exports.hasPS = hasPS; diff --git a/node_modules/pstree.remy/lib/tree.js b/node_modules/pstree.remy/lib/tree.js new file mode 100644 index 000000000..bac7cce65 --- /dev/null +++ b/node_modules/pstree.remy/lib/tree.js @@ -0,0 +1,37 @@ +const spawn = require('child_process').spawn; + +module.exports = function (rootPid, callback) { + const pidsOfInterest = new Set([parseInt(rootPid, 10)]); + var output = ''; + + // *nix + const ps = spawn('ps', ['-A', '-o', 'ppid,pid']); + ps.stdout.on('data', (data) => { + output += data.toString('ascii'); + }); + + ps.on('close', () => { + try { + const res = output + .split('\n') + .slice(1) + .map((_) => _.trim()) + .reduce((acc, line) => { + const pids = line.split(/\s+/); + const ppid = parseInt(pids[0], 10); + + if (pidsOfInterest.has(ppid)) { + const pid = parseInt(pids[1], 10); + acc.push(pid); + pidsOfInterest.add(pid); + } + + return acc; + }, []); + + callback(null, res); + } catch (e) { + callback(e, null); + } + }); +}; diff --git a/node_modules/pstree.remy/lib/utils.js b/node_modules/pstree.remy/lib/utils.js new file mode 100644 index 000000000..8fa5719e2 --- /dev/null +++ b/node_modules/pstree.remy/lib/utils.js @@ -0,0 +1,53 @@ +const spawn = require('child_process').spawn; + +module.exports = { tree, pidsForTree, getStat }; + +function getStat() { + return new Promise((resolve) => { + const command = `ls /proc | grep -E '^[0-9]+$' | xargs -I{} cat /proc/{}/stat`; + const spawned = spawn('sh', ['-c', command], { + stdio: ['pipe', 'pipe', 'pipe'], + }); + + var res = ''; + spawned.stdout.on('data', (data) => (res += data)); + spawned.on('close', () => resolve(res)); + }); +} + +function template(s) { + var stat = null; + // 'pid', 'comm', 'state', 'ppid', 'pgrp' + // %d (%s) %c %d %d + s.replace( + /(\d+) \((.*?)\)\s(.+?)\s(\d+)\s/g, + (all, PID, COMMAND, STAT, PPID) => { + stat = { PID, COMMAND, PPID, STAT }; + } + ); + + return stat; +} + +function tree(stats) { + const processes = stats.split('\n').map(template).filter(Boolean); + + return processes; +} + +function pidsForTree(tree, pid) { + if (typeof pid === 'number') { + pid = pid.toString(); + } + const parents = [pid]; + const pids = []; + + tree.forEach((proc) => { + if (parents.indexOf(proc.PPID) !== -1) { + parents.push(proc.PID); + pids.push(proc); + } + }); + + return pids; +} diff --git a/node_modules/pstree.remy/package.json b/node_modules/pstree.remy/package.json new file mode 100644 index 000000000..4f68a9692 --- 
/dev/null +++ b/node_modules/pstree.remy/package.json @@ -0,0 +1,64 @@ +{ + "_from": "pstree.remy@^1.1.7", + "_id": "pstree.remy@1.1.8", + "_inBundle": false, + "_integrity": "sha512-77DZwxQmxKnu3aR542U+X8FypNzbfJ+C5XQDk3uWjWxn6151aIMGthWYRXTqT1E5oJvg+ljaa2OJi+VfvCOQ8w==", + "_location": "/pstree.remy", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "pstree.remy@^1.1.7", + "name": "pstree.remy", + "escapedName": "pstree.remy", + "rawSpec": "^1.1.7", + "saveSpec": null, + "fetchSpec": "^1.1.7" + }, + "_requiredBy": [ + "/nodemon" + ], + "_resolved": "https://registry.npmjs.org/pstree.remy/-/pstree.remy-1.1.8.tgz", + "_shasum": "c242224f4a67c21f686839bbdb4ac282b8373d3a", + "_spec": "pstree.remy@^1.1.7", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon", + "author": { + "name": "Remy Sharp" + }, + "bugs": { + "url": "https://github.com/remy/pstree/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Collects the full tree of processes from /proc", + "devDependencies": { + "tap": "^11.0.0" + }, + "directories": { + "test": "tests" + }, + "homepage": "https://github.com/remy/pstree#readme", + "keywords": [ + "ps", + "pstree", + "ps tree" + ], + "license": "MIT", + "main": "lib/index.js", + "name": "pstree.remy", + "prettier": { + "trailingComma": "es5", + "semi": true, + "singleQuote": true + }, + "repository": { + "type": "git", + "url": "git+https://github.com/remy/pstree.git" + }, + "scripts": { + "_prepublish": "npm test", + "test": "tap tests/*.test.js" + }, + "version": "1.1.8" +} diff --git a/node_modules/pstree.remy/tests/fixtures/index.js b/node_modules/pstree.remy/tests/fixtures/index.js new file mode 100644 index 000000000..4cdbcb1b3 --- /dev/null +++ b/node_modules/pstree.remy/tests/fixtures/index.js @@ -0,0 +1,13 @@ +const spawn = require('child_process').spawn; +function run() { + spawn( + 'sh', + ['-c', 'node -e "setInterval(() => console.log(`running`), 200)"'], + { + stdio: 'pipe', + } + ); +} + +var runCallCount = process.argv[2] || 1; +for (var i = 0; i < runCallCount; i++) run(); diff --git a/node_modules/pstree.remy/tests/fixtures/out1 b/node_modules/pstree.remy/tests/fixtures/out1 new file mode 100644 index 000000000..abfe5810e --- /dev/null +++ b/node_modules/pstree.remy/tests/fixtures/out1 @@ -0,0 +1,10 @@ +1 (npm) S 0 1 1 34816 1 4210944 11112 0 0 0 45 8 0 0 20 0 10 0 330296 1089871872 11809 18446744073709551615 4194304 29343848 140726436642896 0 0 0 0 4096 2072112895 0 0 0 17 0 0 0 0 0 0 31441000 31537208 37314560 140726436650815 140726436650847 140726436650847 140726436650986 0 +15 (sh) S 1 1 1 34816 1 4210688 115 0 0 0 0 0 0 0 20 0 1 0 330372 4399104 187 18446744073709551615 94374393548800 94374393655428 140722913272992 0 0 0 0 0 65538 0 0 0 17 0 0 0 0 0 0 94374395756424 94374395761184 94374404673536 140722913278928 140722913278959 140722913278959 140722913284080 0 +16 (node) S 15 1 1 34816 1 4210688 6930 103 0 0 32 2 0 0 20 0 10 0 330373 1068478464 8412 18446744073709551615 4194304 29343848 140727228046064 0 0 0 0 4096 134300162 0 0 0 17 1 0 0 1 0 0 31441000 31537208 52584448 140727228050313 140727228050383 140727228050383 140727228055530 0 +27 (sh) S 16 1 1 34816 1 4210688 111 0 0 0 0 0 0 0 20 0 1 0 330410 4399104 193 18446744073709551615 94848235986944 94848236093572 140727019991184 0 0 0 0 0 65538 0 0 0 17 1 0 0 0 0 0 94848238194568 94848238199328 94848261660672 140727019998122 140727019998165 140727019998165 
140727020003312 0 +28 (node) S 27 1 1 34816 1 4210688 3576 268 0 0 12 2 0 0 20 0 10 0 330411 930213888 6760 18446744073709551615 4194304 29343848 140726559664992 0 0 0 0 4096 134300162 0 0 0 17 1 0 0 0 0 0 31441000 31537208 32591872 140726559669117 140726559669199 140726559669199 140726559674346 0 +39 (node) S 28 1 1 34816 1 4210688 47517 0 0 0 151 9 0 0 20 0 6 0 330427 985739264 31859 18446744073709551615 4194304 29343848 140737324503920 0 0 0 0 4096 134234626 0 0 0 17 0 0 0 0 0 0 31441000 31537208 51585024 140737324510060 140737324510159 140737324510159 140737324515306 0 +45 (bash) S 0 45 45 34817 50 4210944 752 256 0 0 2 0 0 0 20 0 1 0 331039 18628608 789 18446744073709551615 4194304 5242124 140724425887696 0 0 0 65536 3670020 1266777851 0 0 0 17 1 0 0 0 0 0 7341384 7388228 30310400 140724425891678 140724425891683 140724425891683 140724425891822 0 +cat: /proc/50/stat: No such file or directory +cat: /proc/51/stat: No such file or directory +52 (xargs) S 45 50 45 34817 50 4210688 179 661 0 0 0 0 0 0 20 0 1 0 331544 4608000 346 18446744073709551615 94587588550656 94587588614028 140735223856048 0 0 0 0 0 2560 0 0 0 17 1 0 0 0 0 0 94587590711464 94587590713504 94587603169280 140735223861006 140735223861035 140735223861035 140735223861225 0 diff --git a/node_modules/pstree.remy/tests/fixtures/out2 b/node_modules/pstree.remy/tests/fixtures/out2 new file mode 100644 index 000000000..3b31137db --- /dev/null +++ b/node_modules/pstree.remy/tests/fixtures/out2 @@ -0,0 +1,29 @@ +cat: /proc/4087/stat: No such file or directory +cat: /proc/4088/stat: No such file or directory +1 (init) S 0 1 1 0 -1 4210944 9227 55994 29 319 7 5 68 16 20 0 1 0 1286281 33660928 855 18446744073709551615 1 1 0 0 0 0 0 4096 536962595 0 0 0 17 4 0 0 3 0 0 0 0 0 0 0 0 0 0 +1032 (ntpd) S 1 1032 1032 0 -1 4211008 178 0 1 0 0 0 0 0 20 0 1 0 1287033 25743360 1058 18446744073709551615 1 1 0 0 0 0 0 4096 27207 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +126 (irqbalance) S 1 126 126 0 -1 1077952832 1217 0 0 0 1 6 0 0 20 0 1 0 1286749 20189184 647 18446744073709551615 1 1 0 0 0 0 0 0 3 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +181 (mysqld) S 1 181 181 0 -1 4210944 6399 0 46 0 8 6 0 0 20 0 22 0 1286761 748453888 14476 18446744073709551615 1 1 0 0 0 0 552967 4096 26345 0 0 0 17 4 0 0 10 0 0 0 0 0 0 0 0 0 0 +194 (memcached) S 1 187 187 0 -1 4210944 252 0 4 0 0 0 0 0 20 0 6 0 1286766 333221888 648 18446744073709551615 1 1 0 0 0 0 0 4096 2 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +243 (dbus-daemon) S 1 243 243 0 -1 4211008 67 0 0 0 0 0 0 0 20 0 1 0 1286779 40087552 598 18446744073709551615 1 1 0 0 0 0 0 0 16385 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +254 (rsyslogd) S 1 254 254 0 -1 4211008 107 0 0 0 2 2 0 0 20 0 3 0 1286782 186601472 696 18446744073709551615 1 1 0 0 0 0 0 16781830 1133601 0 0 0 17 5 0 0 0 0 0 0 0 0 0 0 0 0 0 +265 (systemd-logind) S 1 265 265 0 -1 4210944 276 0 2 0 0 0 0 0 20 0 1 0 1286786 35880960 720 18446744073709551615 1 1 0 0 0 0 0 0 0 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +333 (postgres) S 1 303 303 0 -1 4210688 3169 3466 15 18 0 1 1 1 20 0 1 0 1286817 156073984 5002 18446744073709551615 1 1 0 0 0 0 0 19935232 84487 0 0 0 17 5 0 0 1 0 0 0 0 0 0 0 0 0 0 +359 (postgres) S 333 359 359 0 -1 4210752 90 0 0 0 0 0 0 0 20 0 1 0 1286822 156073984 827 18446744073709551615 1 1 0 0 0 0 0 16805888 2567 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +360 (postgres) S 333 360 360 0 -1 4210752 119 0 0 0 0 0 0 0 20 0 1 0 1286822 156073984 827 18446744073709551615 1 1 0 0 0 0 0 16791554 16901 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +361 (postgres) S 333 361 361 
0 -1 4210752 87 0 0 0 0 0 0 0 20 0 1 0 1286822 156073984 827 18446744073709551615 1 1 0 0 0 0 0 16791552 16903 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +362 (postgres) S 333 362 362 0 -1 4210752 292 0 3 0 0 0 0 0 20 0 1 0 1286822 156930048 1373 18446744073709551615 1 1 0 0 0 0 0 19927040 27271 0 0 0 17 5 0 0 0 0 0 0 0 0 0 0 0 0 0 +363 (postgres) S 333 363 363 0 -1 4210752 82 0 0 0 0 0 0 0 20 0 1 0 1286822 115924992 887 18446744073709551615 1 1 0 0 0 0 0 16808450 5 0 0 0 17 5 0 0 0 0 0 0 0 0 0 0 0 0 0 +4050 (npm) S 50 50 50 34817 50 4210688 5109 0 0 0 36 3 0 0 20 0 10 0 1292968 738025472 10051 18446744073709551615 4194304 33165900 140723623956256 0 0 0 0 4096 134300162 0 0 0 17 4 0 0 0 0 0 35263056 35370992 48369664 140723623964237 140723623964294 140723623964294 140723623968712 0 +4060 (sh) S 4050 50 50 34817 50 4210688 121 0 0 0 0 0 0 0 20 0 1 0 1293007 4579328 174 18446744073709551615 94347643936768 94347644049516 140735136055088 0 0 0 0 0 65538 1 0 0 17 5 0 0 0 0 0 94347646148008 94347646153216 94347660038144 140735136063095 140735136063129 140735136063129 140735136071664 0 +4061 (node) S 4060 50 50 34817 50 4210688 6501 0 0 0 42 2 0 0 20 0 6 0 1293008 705769472 10211 18446744073709551615 4194304 33165900 140730532686288 0 0 0 0 4096 2072111671 0 0 0 17 5 0 0 0 0 0 35263056 35370992 45867008 140730532695579 140730532695657 140730532695657 140730532704200 0 +4067 (node) S 4061 50 50 34817 50 4210688 6746 221 0 0 38 3 0 0 20 0 10 0 1293051 738910208 10527 18446744073709551615 4194304 33165900 140724824971632 0 0 0 0 4096 2072111671 0 0 0 17 4 0 0 0 0 0 35263056 35370992 68595712 140724824980995 140724824981063 140724824981063 140724824989640 0 +4079 (sh) S 4067 50 50 34817 50 4210688 118 0 0 0 0 0 0 0 20 0 1 0 1293092 4579328 194 18446744073709551615 94573702131712 94573702244460 140724712357120 0 0 0 0 0 65538 1 0 0 17 4 0 0 0 0 0 94573704342952 94573704348160 94573718511616 140724712361487 140724712361583 140724712361583 140724712370160 0 +4080 (node) S 4079 50 50 34817 50 4210688 2428 0 0 0 8 1 0 0 20 0 6 0 1293093 693059584 7251 18446744073709551615 4194304 33165900 140726023392816 0 0 0 0 4096 134234626 0 0 0 17 5 0 0 0 0 0 35263056 35370992 55226368 140726023396847 140726023396935 140726023396935 140726023405512 0 +4086 (sh) S 4067 50 50 34817 50 4210688 131 244 0 0 0 0 0 0 20 0 1 0 1293143 4579328 200 18446744073709551615 94347550273536 94347550386284 140737219399136 0 0 0 0 0 65538 1 0 0 17 5 0 0 0 0 0 94347552484776 94347552489984 94347554299904 140737219403308 140737219403375 140737219403375 140737219411952 0 +4089 (xargs) S 4086 50 50 34817 50 4210688 333 1924 0 0 0 0 0 0 20 0 1 0 1293143 17600512 477 18446744073709551615 4194304 4232732 140721633759248 0 0 0 0 0 0 1 0 0 17 5 0 0 0 0 0 6331920 6332980 32182272 140721633762891 140721633762920 140721633762920 140721633771497 0 +50 (bash) S 0 50 50 34817 50 4210944 43914 1032463 9 705 44 21 4213 818 20 0 1 0 1286336 42266624 3599 18446744073709551615 4194304 5173404 140732749083280 0 0 0 65536 4 1132560123 1 0 0 17 4 0 0 410 0 0 7273968 7310504 21196800 140732749086490 140732749086517 140732749086517 140732749086702 0 +79 (acpid) S 1 79 79 0 -1 4210752 46 0 0 0 0 0 0 0 20 0 1 0 1286717 4493312 407 18446744073709551615 1 1 0 0 0 0 0 4096 16391 0 0 0 17 5 0 0 0 0 0 0 0 0 0 0 0 0 0 +83 (sshd) S 1 83 83 0 -1 4210944 354 0 27 0 0 0 0 0 20 0 1 0 1286718 62873600 1290 18446744073709551615 1 1 0 0 0 0 0 4096 81925 0 0 0 17 4 0 0 30 0 0 0 0 0 0 0 0 0 0 +94 (cron) S 1 94 94 0 -1 1077952576 103 449 0 1 0 0 0 0 20 0 1 0 1286743 24240128 559 
18446744073709551615 1 1 0 0 0 0 0 0 65537 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 +95 (atd) S 1 95 95 0 -1 1077952576 28 0 0 0 0 0 0 0 20 0 1 0 1286743 19615744 41 18446744073709551615 1 1 0 0 0 0 0 0 81923 0 0 0 17 4 0 0 0 0 0 0 0 0 0 0 0 0 0 diff --git a/node_modules/pstree.remy/tests/index.test.js b/node_modules/pstree.remy/tests/index.test.js new file mode 100644 index 000000000..50096b95a --- /dev/null +++ b/node_modules/pstree.remy/tests/index.test.js @@ -0,0 +1,51 @@ +const tap = require('tap'); +const test = tap.test; +const readFile = require('fs').readFileSync; +const spawn = require('child_process').spawn; +const pstree = require('../'); +const { tree, pidsForTree, getStat } = require('../lib/utils'); + +if (process.platform !== 'darwin') { + test('reads from /proc', async (t) => { + const ps = await getStat(); + t.ok(ps.split('\n').length > 1); + }); +} + +test('tree for live env', async (t) => { + const pid = 4079; + const fixture = readFile(__dirname + '/fixtures/out2', 'utf8'); + const ps = await tree(fixture); + t.deepEqual( + pidsForTree(ps, pid).map((_) => _.PID), + ['4080'] + ); +}); + +function testTree(t, runCallCount) { + const sub = spawn('node', [`${__dirname}/fixtures/index.js`, runCallCount], { + stdio: 'pipe', + }); + setTimeout(() => { + const pid = sub.pid; + + pstree(pid, (error, pids) => { + pids.concat([pid]).forEach((p) => { + spawn('kill', ['-s', 'SIGTERM', p]); + }); + + // the fixture launches `sh` which launches node which is why we + // are looking for two processes. + // Important: IDKW but MacOS seems to skip the `sh` process. no idea. + t.equal(pids.length, runCallCount * 2); + t.end(); + }); + }, 1000); +} + +test('can read full process tree', (t) => { + testTree(t, 1); +}); +test('can read full process tree with multiple processes', (t) => { + testTree(t, 2); +}); diff --git a/node_modules/pump/.travis.yml b/node_modules/pump/.travis.yml new file mode 100644 index 000000000..17f94330e --- /dev/null +++ b/node_modules/pump/.travis.yml @@ -0,0 +1,5 @@ +language: node_js +node_js: + - "0.10" + +script: "npm test" diff --git a/node_modules/pump/LICENSE b/node_modules/pump/LICENSE new file mode 100644 index 000000000..757562ec5 --- /dev/null +++ b/node_modules/pump/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2014 Mathias Buus + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
\ No newline at end of file diff --git a/node_modules/pump/README.md b/node_modules/pump/README.md new file mode 100644 index 000000000..4c81471a0 --- /dev/null +++ b/node_modules/pump/README.md @@ -0,0 +1,65 @@ +# pump + +pump is a small node module that pipes streams together and destroys all of them if one of them closes. + +``` +npm install pump +``` + +[![build status](http://img.shields.io/travis/mafintosh/pump.svg?style=flat)](http://travis-ci.org/mafintosh/pump) + +## What problem does it solve? + +When using standard `source.pipe(dest)` source will _not_ be destroyed if dest emits close or an error. +You are also not able to provide a callback to tell when then pipe has finished. + +pump does these two things for you + +## Usage + +Simply pass the streams you want to pipe together to pump and add an optional callback + +``` js +var pump = require('pump') +var fs = require('fs') + +var source = fs.createReadStream('/dev/random') +var dest = fs.createWriteStream('/dev/null') + +pump(source, dest, function(err) { + console.log('pipe finished', err) +}) + +setTimeout(function() { + dest.destroy() // when dest is closed pump will destroy source +}, 1000) +``` + +You can use pump to pipe more than two streams together as well + +``` js +var transform = someTransformStream() + +pump(source, transform, anotherTransform, dest, function(err) { + console.log('pipe finished', err) +}) +``` + +If `source`, `transform`, `anotherTransform` or `dest` closes all of them will be destroyed. + +Similarly to `stream.pipe()`, `pump()` returns the last stream passed in, so you can do: + +``` +return pump(s1, s2) // returns s2 +``` + +If you want to return a stream that combines *both* s1 and s2 to a single stream use +[pumpify](https://github.com/mafintosh/pumpify) instead. + +## License + +MIT + +## Related + +`pump` is part of the [mississippi stream utility collection](https://github.com/maxogden/mississippi) which includes more useful stream modules similar to this one. 
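*(Editor's note: a minimal usage sketch of the `pump` API described in the README above, for readers skimming this vendored dependency. It is not part of the committed files; the file names and the gzip transform are illustrative assumptions.)*

```js
// Sketch: pipe a read stream through a transform into a write stream and
// get a single completion callback, per the pump README above.
const fs = require('fs');
const zlib = require('zlib');
const pump = require('pump');

const source = fs.createReadStream('app.log');   // assumed input file
const gzip = zlib.createGzip();                   // any Transform stream works here
const dest = fs.createWriteStream('app.log.gz');  // assumed output file

// pump returns the last stream (dest) and calls the callback once,
// with an error if any stream in the chain failed or was destroyed.
pump(source, gzip, dest, (err) => {
  if (err) {
    console.error('pipeline failed', err);
  } else {
    console.log('pipeline finished');
  }
});
```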
diff --git a/node_modules/pump/index.js b/node_modules/pump/index.js new file mode 100644 index 000000000..c15059f17 --- /dev/null +++ b/node_modules/pump/index.js @@ -0,0 +1,82 @@ +var once = require('once') +var eos = require('end-of-stream') +var fs = require('fs') // we only need fs to get the ReadStream and WriteStream prototypes + +var noop = function () {} +var ancient = /^v?\.0/.test(process.version) + +var isFn = function (fn) { + return typeof fn === 'function' +} + +var isFS = function (stream) { + if (!ancient) return false // newer node version do not need to care about fs is a special way + if (!fs) return false // browser + return (stream instanceof (fs.ReadStream || noop) || stream instanceof (fs.WriteStream || noop)) && isFn(stream.close) +} + +var isRequest = function (stream) { + return stream.setHeader && isFn(stream.abort) +} + +var destroyer = function (stream, reading, writing, callback) { + callback = once(callback) + + var closed = false + stream.on('close', function () { + closed = true + }) + + eos(stream, {readable: reading, writable: writing}, function (err) { + if (err) return callback(err) + closed = true + callback() + }) + + var destroyed = false + return function (err) { + if (closed) return + if (destroyed) return + destroyed = true + + if (isFS(stream)) return stream.close(noop) // use close for fs streams to avoid fd leaks + if (isRequest(stream)) return stream.abort() // request.destroy just do .end - .abort is what we want + + if (isFn(stream.destroy)) return stream.destroy() + + callback(err || new Error('stream was destroyed')) + } +} + +var call = function (fn) { + fn() +} + +var pipe = function (from, to) { + return from.pipe(to) +} + +var pump = function () { + var streams = Array.prototype.slice.call(arguments) + var callback = isFn(streams[streams.length - 1] || noop) && streams.pop() || noop + + if (Array.isArray(streams[0])) streams = streams[0] + if (streams.length < 2) throw new Error('pump requires two streams per minimum') + + var error + var destroys = streams.map(function (stream, i) { + var reading = i < streams.length - 1 + var writing = i > 0 + return destroyer(stream, reading, writing, function (err) { + if (!error) error = err + if (err) destroys.forEach(call) + if (reading) return + destroys.forEach(call) + callback(error) + }) + }) + + return streams.reduce(pipe) +} + +module.exports = pump diff --git a/node_modules/pump/package.json b/node_modules/pump/package.json new file mode 100644 index 000000000..81b3554bb --- /dev/null +++ b/node_modules/pump/package.json @@ -0,0 +1,60 @@ +{ + "_from": "pump@^3.0.0", + "_id": "pump@3.0.0", + "_inBundle": false, + "_integrity": "sha512-LwZy+p3SFs1Pytd/jYct4wpv49HiYCqd9Rlc5ZVdk0V+8Yzv6jR5Blk3TRmPL1ft69TxP0IMZGJ+WPFU2BFhww==", + "_location": "/pump", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "pump@^3.0.0", + "name": "pump", + "escapedName": "pump", + "rawSpec": "^3.0.0", + "saveSpec": null, + "fetchSpec": "^3.0.0" + }, + "_requiredBy": [ + "/cacheable-request/get-stream", + "/get-stream" + ], + "_resolved": "https://registry.npmjs.org/pump/-/pump-3.0.0.tgz", + "_shasum": "b4a2116815bde2f4e1ea602354e8c75565107a64", + "_spec": "pump@^3.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cacheable-request\\node_modules\\get-stream", + "author": { + "name": "Mathias Buus Madsen", + "email": "mathiasbuus@gmail.com" + }, + "browser": { + "fs": false + }, + "bugs": { + "url": 
"https://github.com/mafintosh/pump/issues" + }, + "bundleDependencies": false, + "dependencies": { + "end-of-stream": "^1.1.0", + "once": "^1.3.1" + }, + "deprecated": false, + "description": "pipe streams together and close all of them if one of them closes", + "homepage": "https://github.com/mafintosh/pump#readme", + "keywords": [ + "streams", + "pipe", + "destroy", + "callback" + ], + "license": "MIT", + "name": "pump", + "repository": { + "type": "git", + "url": "git://github.com/mafintosh/pump.git" + }, + "scripts": { + "test": "node test-browser.js && node test-node.js" + }, + "version": "3.0.0" +} diff --git a/node_modules/pump/test-browser.js b/node_modules/pump/test-browser.js new file mode 100644 index 000000000..9a06c8a4c --- /dev/null +++ b/node_modules/pump/test-browser.js @@ -0,0 +1,66 @@ +var stream = require('stream') +var pump = require('./index') + +var rs = new stream.Readable() +var ws = new stream.Writable() + +rs._read = function (size) { + this.push(Buffer(size).fill('abc')) +} + +ws._write = function (chunk, encoding, cb) { + setTimeout(function () { + cb() + }, 100) +} + +var toHex = function () { + var reverse = new (require('stream').Transform)() + + reverse._transform = function (chunk, enc, callback) { + reverse.push(chunk.toString('hex')) + callback() + } + + return reverse +} + +var wsClosed = false +var rsClosed = false +var callbackCalled = false + +var check = function () { + if (wsClosed && rsClosed && callbackCalled) { + console.log('test-browser.js passes') + clearTimeout(timeout) + } +} + +ws.on('finish', function () { + wsClosed = true + check() +}) + +rs.on('end', function () { + rsClosed = true + check() +}) + +var res = pump(rs, toHex(), toHex(), toHex(), ws, function () { + callbackCalled = true + check() +}) + +if (res !== ws) { + throw new Error('should return last stream') +} + +setTimeout(function () { + rs.push(null) + rs.emit('close') +}, 1000) + +var timeout = setTimeout(function () { + check() + throw new Error('timeout') +}, 5000) diff --git a/node_modules/pump/test-node.js b/node_modules/pump/test-node.js new file mode 100644 index 000000000..561251a08 --- /dev/null +++ b/node_modules/pump/test-node.js @@ -0,0 +1,53 @@ +var pump = require('./index') + +var rs = require('fs').createReadStream('/dev/random') +var ws = require('fs').createWriteStream('/dev/null') + +var toHex = function () { + var reverse = new (require('stream').Transform)() + + reverse._transform = function (chunk, enc, callback) { + reverse.push(chunk.toString('hex')) + callback() + } + + return reverse +} + +var wsClosed = false +var rsClosed = false +var callbackCalled = false + +var check = function () { + if (wsClosed && rsClosed && callbackCalled) { + console.log('test-node.js passes') + clearTimeout(timeout) + } +} + +ws.on('close', function () { + wsClosed = true + check() +}) + +rs.on('close', function () { + rsClosed = true + check() +}) + +var res = pump(rs, toHex(), toHex(), toHex(), ws, function () { + callbackCalled = true + check() +}) + +if (res !== ws) { + throw new Error('should return last stream') +} + +setTimeout(function () { + rs.destroy() +}, 1000) + +var timeout = setTimeout(function () { + throw new Error('timeout') +}, 5000) diff --git a/node_modules/punycode/LICENSE-MIT.txt b/node_modules/punycode/LICENSE-MIT.txt new file mode 100644 index 000000000..a41e0a7ef --- /dev/null +++ b/node_modules/punycode/LICENSE-MIT.txt @@ -0,0 +1,20 @@ +Copyright Mathias Bynens + +Permission is hereby granted, free of charge, to any person obtaining +a copy of 
this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/punycode/README.md b/node_modules/punycode/README.md new file mode 100644 index 000000000..ee2f9d633 --- /dev/null +++ b/node_modules/punycode/README.md @@ -0,0 +1,122 @@ +# Punycode.js [![Build status](https://travis-ci.org/bestiejs/punycode.js.svg?branch=master)](https://travis-ci.org/bestiejs/punycode.js) [![Code coverage status](http://img.shields.io/codecov/c/github/bestiejs/punycode.js.svg)](https://codecov.io/gh/bestiejs/punycode.js) [![Dependency status](https://gemnasium.com/bestiejs/punycode.js.svg)](https://gemnasium.com/bestiejs/punycode.js) + +Punycode.js is a robust Punycode converter that fully complies to [RFC 3492](https://tools.ietf.org/html/rfc3492) and [RFC 5891](https://tools.ietf.org/html/rfc5891). + +This JavaScript library is the result of comparing, optimizing and documenting different open-source implementations of the Punycode algorithm: + +* [The C example code from RFC 3492](https://tools.ietf.org/html/rfc3492#appendix-C) +* [`punycode.c` by _Markus W. Scherer_ (IBM)](http://opensource.apple.com/source/ICU/ICU-400.42/icuSources/common/punycode.c) +* [`punycode.c` by _Ben Noordhuis_](https://github.com/bnoordhuis/punycode/blob/master/punycode.c) +* [JavaScript implementation by _some_](http://stackoverflow.com/questions/183485/can-anyone-recommend-a-good-free-javascript-for-punycode-to-unicode-conversion/301287#301287) +* [`punycode.js` by _Ben Noordhuis_](https://github.com/joyent/node/blob/426298c8c1c0d5b5224ac3658c41e7c2a3fe9377/lib/punycode.js) (note: [not fully compliant](https://github.com/joyent/node/issues/2072)) + +This project was [bundled](https://github.com/joyent/node/blob/master/lib/punycode.js) with Node.js from [v0.6.2+](https://github.com/joyent/node/compare/975f1930b1...61e796decc) until [v7](https://github.com/nodejs/node/pull/7941) (soft-deprecated). + +The current version supports recent versions of Node.js only. It provides a CommonJS module and an ES6 module. For the old version that offers the same functionality with broader support, including Rhino, Ringo, Narwhal, and web browsers, see [v1.4.1](https://github.com/bestiejs/punycode.js/releases/tag/v1.4.1). + +## Installation + +Via [npm](https://www.npmjs.com/): + +```bash +npm install punycode --save +``` + +In [Node.js](https://nodejs.org/): + +```js +const punycode = require('punycode'); +``` + +## API + +### `punycode.decode(string)` + +Converts a Punycode string of ASCII symbols to a string of Unicode symbols. 
+ +```js +// decode domain name parts +punycode.decode('maana-pta'); // 'mañana' +punycode.decode('--dqo34k'); // '☃-⌘' +``` + +### `punycode.encode(string)` + +Converts a string of Unicode symbols to a Punycode string of ASCII symbols. + +```js +// encode domain name parts +punycode.encode('mañana'); // 'maana-pta' +punycode.encode('☃-⌘'); // '--dqo34k' +``` + +### `punycode.toUnicode(input)` + +Converts a Punycode string representing a domain name or an email address to Unicode. Only the Punycoded parts of the input will be converted, i.e. it doesn’t matter if you call it on a string that has already been converted to Unicode. + +```js +// decode domain names +punycode.toUnicode('xn--maana-pta.com'); +// → 'mañana.com' +punycode.toUnicode('xn----dqo34k.com'); +// → '☃-⌘.com' + +// decode email addresses +punycode.toUnicode('джумла@xn--p-8sbkgc5ag7bhce.xn--ba-lmcq'); +// → 'джумла@джpумлатест.bрфa' +``` + +### `punycode.toASCII(input)` + +Converts a lowercased Unicode string representing a domain name or an email address to Punycode. Only the non-ASCII parts of the input will be converted, i.e. it doesn’t matter if you call it with a domain that’s already in ASCII. + +```js +// encode domain names +punycode.toASCII('mañana.com'); +// → 'xn--maana-pta.com' +punycode.toASCII('☃-⌘.com'); +// → 'xn----dqo34k.com' + +// encode email addresses +punycode.toASCII('джумла@джpумлатест.bрфa'); +// → 'джумла@xn--p-8sbkgc5ag7bhce.xn--ba-lmcq' +``` + +### `punycode.ucs2` + +#### `punycode.ucs2.decode(string)` + +Creates an array containing the numeric code point values of each Unicode symbol in the string. While [JavaScript uses UCS-2 internally](https://mathiasbynens.be/notes/javascript-encoding), this function will convert a pair of surrogate halves (each of which UCS-2 exposes as separate characters) into a single code point, matching UTF-16. + +```js +punycode.ucs2.decode('abc'); +// → [0x61, 0x62, 0x63] +// surrogate pair for U+1D306 TETRAGRAM FOR CENTRE: +punycode.ucs2.decode('\uD834\uDF06'); +// → [0x1D306] +``` + +#### `punycode.ucs2.encode(codePoints)` + +Creates a string based on an array of numeric code point values. + +```js +punycode.ucs2.encode([0x61, 0x62, 0x63]); +// → 'abc' +punycode.ucs2.encode([0x1D306]); +// → '\uD834\uDF06' +``` + +### `punycode.version` + +A string representing the current Punycode.js version number. + +## Author + +| [![twitter/mathias](https://gravatar.com/avatar/24e08a9ea84deb17ae121074d0f17125?s=70)](https://twitter.com/mathias "Follow @mathias on Twitter") | +|---| +| [Mathias Bynens](https://mathiasbynens.be/) | + +## License + +Punycode.js is available under the [MIT](https://mths.be/mit) license. 
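*(Editor's note: a short sketch exercising the Punycode.js API documented in the README above. It is not part of the committed files; the sample label and domain values are taken from the README's own examples and are illustrative only.)*

```js
// Sketch: round-trip a label and a domain name with the punycode package.
const punycode = require('punycode');

// Encode/decode operate on a single label without the 'xn--' prefix.
const label = punycode.encode('mañana');             // 'maana-pta'
console.log(label, '->', punycode.decode(label));    // 'mañana'

// toASCII/toUnicode handle whole domain names and only touch the
// non-ASCII (or already Punycoded) parts.
console.log(punycode.toASCII('mañana.com'));          // 'xn--maana-pta.com'
console.log(punycode.toUnicode('xn--maana-pta.com')); // 'mañana.com'

// ucs2.decode returns code points, combining surrogate pairs (UTF-16 aware).
console.log(punycode.ucs2.decode('\uD834\uDF06'));    // [ 0x1D306 ]
```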
diff --git a/node_modules/punycode/package.json b/node_modules/punycode/package.json new file mode 100644 index 000000000..fe14ee920 --- /dev/null +++ b/node_modules/punycode/package.json @@ -0,0 +1,85 @@ +{ + "_from": "punycode@^2.1.1", + "_id": "punycode@2.1.1", + "_inBundle": false, + "_integrity": "sha512-XRsRjdf+j5ml+y/6GKHPZbrF/8p2Yga0JPtdqTIY2Xe5ohJPD9saDJJLPvp9+NSBprVvevdXZybnj2cv8OEd0A==", + "_location": "/punycode", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "punycode@^2.1.1", + "name": "punycode", + "escapedName": "punycode", + "rawSpec": "^2.1.1", + "saveSpec": null, + "fetchSpec": "^2.1.1" + }, + "_requiredBy": [ + "/tr46" + ], + "_resolved": "https://registry.npmjs.org/punycode/-/punycode-2.1.1.tgz", + "_shasum": "b58b010ac40c22c5657616c8d2c2c02c7bf479ec", + "_spec": "punycode@^2.1.1", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\tr46", + "author": { + "name": "Mathias Bynens", + "url": "https://mathiasbynens.be/" + }, + "bugs": { + "url": "https://github.com/bestiejs/punycode.js/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Mathias Bynens", + "url": "https://mathiasbynens.be/" + } + ], + "deprecated": false, + "description": "A robust Punycode converter that fully complies to RFC 3492 and RFC 5891, and works on nearly all JavaScript platforms.", + "devDependencies": { + "codecov": "^1.0.1", + "istanbul": "^0.4.1", + "mocha": "^2.5.3" + }, + "engines": { + "node": ">=6" + }, + "files": [ + "LICENSE-MIT.txt", + "punycode.js", + "punycode.es6.js" + ], + "homepage": "https://mths.be/punycode", + "jsnext:main": "punycode.es6.js", + "jspm": { + "map": { + "./punycode.js": { + "node": "@node/punycode" + } + } + }, + "keywords": [ + "punycode", + "unicode", + "idn", + "idna", + "dns", + "url", + "domain" + ], + "license": "MIT", + "main": "punycode.js", + "module": "punycode.es6.js", + "name": "punycode", + "repository": { + "type": "git", + "url": "git+https://github.com/bestiejs/punycode.js.git" + }, + "scripts": { + "prepublish": "node scripts/prepublish.js", + "test": "mocha tests" + }, + "version": "2.1.1" +} diff --git a/node_modules/punycode/punycode.es6.js b/node_modules/punycode/punycode.es6.js new file mode 100644 index 000000000..4610bc9eb --- /dev/null +++ b/node_modules/punycode/punycode.es6.js @@ -0,0 +1,441 @@ +'use strict'; + +/** Highest positive signed 32-bit float value */ +const maxInt = 2147483647; // aka. 0x7FFFFFFF or 2^31-1 + +/** Bootstring parameters */ +const base = 36; +const tMin = 1; +const tMax = 26; +const skew = 38; +const damp = 700; +const initialBias = 72; +const initialN = 128; // 0x80 +const delimiter = '-'; // '\x2D' + +/** Regular expressions */ +const regexPunycode = /^xn--/; +const regexNonASCII = /[^\0-\x7E]/; // non-ASCII chars +const regexSeparators = /[\x2E\u3002\uFF0E\uFF61]/g; // RFC 3490 separators + +/** Error messages */ +const errors = { + 'overflow': 'Overflow: input needs wider integers to process', + 'not-basic': 'Illegal input >= 0x80 (not a basic code point)', + 'invalid-input': 'Invalid input' +}; + +/** Convenience shortcuts */ +const baseMinusTMin = base - tMin; +const floor = Math.floor; +const stringFromCharCode = String.fromCharCode; + +/*--------------------------------------------------------------------------*/ + +/** + * A generic error utility function. + * @private + * @param {String} type The error type. + * @returns {Error} Throws a `RangeError` with the applicable error message. 
+ */ +function error(type) { + throw new RangeError(errors[type]); +} + +/** + * A generic `Array#map` utility function. + * @private + * @param {Array} array The array to iterate over. + * @param {Function} callback The function that gets called for every array + * item. + * @returns {Array} A new array of values returned by the callback function. + */ +function map(array, fn) { + const result = []; + let length = array.length; + while (length--) { + result[length] = fn(array[length]); + } + return result; +} + +/** + * A simple `Array#map`-like wrapper to work with domain name strings or email + * addresses. + * @private + * @param {String} domain The domain name or email address. + * @param {Function} callback The function that gets called for every + * character. + * @returns {Array} A new string of characters returned by the callback + * function. + */ +function mapDomain(string, fn) { + const parts = string.split('@'); + let result = ''; + if (parts.length > 1) { + // In email addresses, only the domain name should be punycoded. Leave + // the local part (i.e. everything up to `@`) intact. + result = parts[0] + '@'; + string = parts[1]; + } + // Avoid `split(regex)` for IE8 compatibility. See #17. + string = string.replace(regexSeparators, '\x2E'); + const labels = string.split('.'); + const encoded = map(labels, fn).join('.'); + return result + encoded; +} + +/** + * Creates an array containing the numeric code points of each Unicode + * character in the string. While JavaScript uses UCS-2 internally, + * this function will convert a pair of surrogate halves (each of which + * UCS-2 exposes as separate characters) into a single code point, + * matching UTF-16. + * @see `punycode.ucs2.encode` + * @see + * @memberOf punycode.ucs2 + * @name decode + * @param {String} string The Unicode input string (UCS-2). + * @returns {Array} The new array of code points. + */ +function ucs2decode(string) { + const output = []; + let counter = 0; + const length = string.length; + while (counter < length) { + const value = string.charCodeAt(counter++); + if (value >= 0xD800 && value <= 0xDBFF && counter < length) { + // It's a high surrogate, and there is a next character. + const extra = string.charCodeAt(counter++); + if ((extra & 0xFC00) == 0xDC00) { // Low surrogate. + output.push(((value & 0x3FF) << 10) + (extra & 0x3FF) + 0x10000); + } else { + // It's an unmatched surrogate; only append this code unit, in case the + // next code unit is the high surrogate of a surrogate pair. + output.push(value); + counter--; + } + } else { + output.push(value); + } + } + return output; +} + +/** + * Creates a string based on an array of numeric code points. + * @see `punycode.ucs2.decode` + * @memberOf punycode.ucs2 + * @name encode + * @param {Array} codePoints The array of numeric code points. + * @returns {String} The new Unicode string (UCS-2). + */ +const ucs2encode = array => String.fromCodePoint(...array); + +/** + * Converts a basic code point into a digit/integer. + * @see `digitToBasic()` + * @private + * @param {Number} codePoint The basic numeric code point value. + * @returns {Number} The numeric value of a basic code point (for use in + * representing integers) in the range `0` to `base - 1`, or `base` if + * the code point does not represent a value. 
+ */ +const basicToDigit = function(codePoint) { + if (codePoint - 0x30 < 0x0A) { + return codePoint - 0x16; + } + if (codePoint - 0x41 < 0x1A) { + return codePoint - 0x41; + } + if (codePoint - 0x61 < 0x1A) { + return codePoint - 0x61; + } + return base; +}; + +/** + * Converts a digit/integer into a basic code point. + * @see `basicToDigit()` + * @private + * @param {Number} digit The numeric value of a basic code point. + * @returns {Number} The basic code point whose value (when used for + * representing integers) is `digit`, which needs to be in the range + * `0` to `base - 1`. If `flag` is non-zero, the uppercase form is + * used; else, the lowercase form is used. The behavior is undefined + * if `flag` is non-zero and `digit` has no uppercase form. + */ +const digitToBasic = function(digit, flag) { + // 0..25 map to ASCII a..z or A..Z + // 26..35 map to ASCII 0..9 + return digit + 22 + 75 * (digit < 26) - ((flag != 0) << 5); +}; + +/** + * Bias adaptation function as per section 3.4 of RFC 3492. + * https://tools.ietf.org/html/rfc3492#section-3.4 + * @private + */ +const adapt = function(delta, numPoints, firstTime) { + let k = 0; + delta = firstTime ? floor(delta / damp) : delta >> 1; + delta += floor(delta / numPoints); + for (/* no initialization */; delta > baseMinusTMin * tMax >> 1; k += base) { + delta = floor(delta / baseMinusTMin); + } + return floor(k + (baseMinusTMin + 1) * delta / (delta + skew)); +}; + +/** + * Converts a Punycode string of ASCII-only symbols to a string of Unicode + * symbols. + * @memberOf punycode + * @param {String} input The Punycode string of ASCII-only symbols. + * @returns {String} The resulting string of Unicode symbols. + */ +const decode = function(input) { + // Don't use UCS-2. + const output = []; + const inputLength = input.length; + let i = 0; + let n = initialN; + let bias = initialBias; + + // Handle the basic code points: let `basic` be the number of input code + // points before the last delimiter, or `0` if there is none, then copy + // the first basic code points to the output. + + let basic = input.lastIndexOf(delimiter); + if (basic < 0) { + basic = 0; + } + + for (let j = 0; j < basic; ++j) { + // if it's not a basic code point + if (input.charCodeAt(j) >= 0x80) { + error('not-basic'); + } + output.push(input.charCodeAt(j)); + } + + // Main decoding loop: start just after the last delimiter if any basic code + // points were copied; start at the beginning otherwise. + + for (let index = basic > 0 ? basic + 1 : 0; index < inputLength; /* no final expression */) { + + // `index` is the index of the next character to be consumed. + // Decode a generalized variable-length integer into `delta`, + // which gets added to `i`. The overflow checking is easier + // if we increase `i` as we go, then subtract off its starting + // value at the end to obtain `delta`. + let oldi = i; + for (let w = 1, k = base; /* no condition */; k += base) { + + if (index >= inputLength) { + error('invalid-input'); + } + + const digit = basicToDigit(input.charCodeAt(index++)); + + if (digit >= base || digit > floor((maxInt - i) / w)) { + error('overflow'); + } + + i += digit * w; + const t = k <= bias ? tMin : (k >= bias + tMax ? 
tMax : k - bias); + + if (digit < t) { + break; + } + + const baseMinusT = base - t; + if (w > floor(maxInt / baseMinusT)) { + error('overflow'); + } + + w *= baseMinusT; + + } + + const out = output.length + 1; + bias = adapt(i - oldi, out, oldi == 0); + + // `i` was supposed to wrap around from `out` to `0`, + // incrementing `n` each time, so we'll fix that now: + if (floor(i / out) > maxInt - n) { + error('overflow'); + } + + n += floor(i / out); + i %= out; + + // Insert `n` at position `i` of the output. + output.splice(i++, 0, n); + + } + + return String.fromCodePoint(...output); +}; + +/** + * Converts a string of Unicode symbols (e.g. a domain name label) to a + * Punycode string of ASCII-only symbols. + * @memberOf punycode + * @param {String} input The string of Unicode symbols. + * @returns {String} The resulting Punycode string of ASCII-only symbols. + */ +const encode = function(input) { + const output = []; + + // Convert the input in UCS-2 to an array of Unicode code points. + input = ucs2decode(input); + + // Cache the length. + let inputLength = input.length; + + // Initialize the state. + let n = initialN; + let delta = 0; + let bias = initialBias; + + // Handle the basic code points. + for (const currentValue of input) { + if (currentValue < 0x80) { + output.push(stringFromCharCode(currentValue)); + } + } + + let basicLength = output.length; + let handledCPCount = basicLength; + + // `handledCPCount` is the number of code points that have been handled; + // `basicLength` is the number of basic code points. + + // Finish the basic string with a delimiter unless it's empty. + if (basicLength) { + output.push(delimiter); + } + + // Main encoding loop: + while (handledCPCount < inputLength) { + + // All non-basic code points < n have been handled already. Find the next + // larger one: + let m = maxInt; + for (const currentValue of input) { + if (currentValue >= n && currentValue < m) { + m = currentValue; + } + } + + // Increase `delta` enough to advance the decoder's state to , + // but guard against overflow. + const handledCPCountPlusOne = handledCPCount + 1; + if (m - n > floor((maxInt - delta) / handledCPCountPlusOne)) { + error('overflow'); + } + + delta += (m - n) * handledCPCountPlusOne; + n = m; + + for (const currentValue of input) { + if (currentValue < n && ++delta > maxInt) { + error('overflow'); + } + if (currentValue == n) { + // Represent delta as a generalized variable-length integer. + let q = delta; + for (let k = base; /* no condition */; k += base) { + const t = k <= bias ? tMin : (k >= bias + tMax ? tMax : k - bias); + if (q < t) { + break; + } + const qMinusT = q - t; + const baseMinusT = base - t; + output.push( + stringFromCharCode(digitToBasic(t + qMinusT % baseMinusT, 0)) + ); + q = floor(qMinusT / baseMinusT); + } + + output.push(stringFromCharCode(digitToBasic(q, 0))); + bias = adapt(delta, handledCPCountPlusOne, handledCPCount == basicLength); + delta = 0; + ++handledCPCount; + } + } + + ++delta; + ++n; + + } + return output.join(''); +}; + +/** + * Converts a Punycode string representing a domain name or an email address + * to Unicode. Only the Punycoded parts of the input will be converted, i.e. + * it doesn't matter if you call it on a string that has already been + * converted to Unicode. + * @memberOf punycode + * @param {String} input The Punycoded domain name or email address to + * convert to Unicode. + * @returns {String} The Unicode representation of the given Punycode + * string. 
+ */ +const toUnicode = function(input) { + return mapDomain(input, function(string) { + return regexPunycode.test(string) + ? decode(string.slice(4).toLowerCase()) + : string; + }); +}; + +/** + * Converts a Unicode string representing a domain name or an email address to + * Punycode. Only the non-ASCII parts of the domain name will be converted, + * i.e. it doesn't matter if you call it with a domain that's already in + * ASCII. + * @memberOf punycode + * @param {String} input The domain name or email address to convert, as a + * Unicode string. + * @returns {String} The Punycode representation of the given domain name or + * email address. + */ +const toASCII = function(input) { + return mapDomain(input, function(string) { + return regexNonASCII.test(string) + ? 'xn--' + encode(string) + : string; + }); +}; + +/*--------------------------------------------------------------------------*/ + +/** Define the public API */ +const punycode = { + /** + * A string representing the current Punycode.js version number. + * @memberOf punycode + * @type String + */ + 'version': '2.1.0', + /** + * An object of methods to convert from JavaScript's internal character + * representation (UCS-2) to Unicode code points, and back. + * @see + * @memberOf punycode + * @type Object + */ + 'ucs2': { + 'decode': ucs2decode, + 'encode': ucs2encode + }, + 'decode': decode, + 'encode': encode, + 'toASCII': toASCII, + 'toUnicode': toUnicode +}; + +export { ucs2decode, ucs2encode, decode, encode, toASCII, toUnicode }; +export default punycode; diff --git a/node_modules/punycode/punycode.js b/node_modules/punycode/punycode.js new file mode 100644 index 000000000..ea61fd0d3 --- /dev/null +++ b/node_modules/punycode/punycode.js @@ -0,0 +1,440 @@ +'use strict'; + +/** Highest positive signed 32-bit float value */ +const maxInt = 2147483647; // aka. 0x7FFFFFFF or 2^31-1 + +/** Bootstring parameters */ +const base = 36; +const tMin = 1; +const tMax = 26; +const skew = 38; +const damp = 700; +const initialBias = 72; +const initialN = 128; // 0x80 +const delimiter = '-'; // '\x2D' + +/** Regular expressions */ +const regexPunycode = /^xn--/; +const regexNonASCII = /[^\0-\x7E]/; // non-ASCII chars +const regexSeparators = /[\x2E\u3002\uFF0E\uFF61]/g; // RFC 3490 separators + +/** Error messages */ +const errors = { + 'overflow': 'Overflow: input needs wider integers to process', + 'not-basic': 'Illegal input >= 0x80 (not a basic code point)', + 'invalid-input': 'Invalid input' +}; + +/** Convenience shortcuts */ +const baseMinusTMin = base - tMin; +const floor = Math.floor; +const stringFromCharCode = String.fromCharCode; + +/*--------------------------------------------------------------------------*/ + +/** + * A generic error utility function. + * @private + * @param {String} type The error type. + * @returns {Error} Throws a `RangeError` with the applicable error message. + */ +function error(type) { + throw new RangeError(errors[type]); +} + +/** + * A generic `Array#map` utility function. + * @private + * @param {Array} array The array to iterate over. + * @param {Function} callback The function that gets called for every array + * item. + * @returns {Array} A new array of values returned by the callback function. + */ +function map(array, fn) { + const result = []; + let length = array.length; + while (length--) { + result[length] = fn(array[length]); + } + return result; +} + +/** + * A simple `Array#map`-like wrapper to work with domain name strings or email + * addresses. 
+ * @private + * @param {String} domain The domain name or email address. + * @param {Function} callback The function that gets called for every + * character. + * @returns {Array} A new string of characters returned by the callback + * function. + */ +function mapDomain(string, fn) { + const parts = string.split('@'); + let result = ''; + if (parts.length > 1) { + // In email addresses, only the domain name should be punycoded. Leave + // the local part (i.e. everything up to `@`) intact. + result = parts[0] + '@'; + string = parts[1]; + } + // Avoid `split(regex)` for IE8 compatibility. See #17. + string = string.replace(regexSeparators, '\x2E'); + const labels = string.split('.'); + const encoded = map(labels, fn).join('.'); + return result + encoded; +} + +/** + * Creates an array containing the numeric code points of each Unicode + * character in the string. While JavaScript uses UCS-2 internally, + * this function will convert a pair of surrogate halves (each of which + * UCS-2 exposes as separate characters) into a single code point, + * matching UTF-16. + * @see `punycode.ucs2.encode` + * @see + * @memberOf punycode.ucs2 + * @name decode + * @param {String} string The Unicode input string (UCS-2). + * @returns {Array} The new array of code points. + */ +function ucs2decode(string) { + const output = []; + let counter = 0; + const length = string.length; + while (counter < length) { + const value = string.charCodeAt(counter++); + if (value >= 0xD800 && value <= 0xDBFF && counter < length) { + // It's a high surrogate, and there is a next character. + const extra = string.charCodeAt(counter++); + if ((extra & 0xFC00) == 0xDC00) { // Low surrogate. + output.push(((value & 0x3FF) << 10) + (extra & 0x3FF) + 0x10000); + } else { + // It's an unmatched surrogate; only append this code unit, in case the + // next code unit is the high surrogate of a surrogate pair. + output.push(value); + counter--; + } + } else { + output.push(value); + } + } + return output; +} + +/** + * Creates a string based on an array of numeric code points. + * @see `punycode.ucs2.decode` + * @memberOf punycode.ucs2 + * @name encode + * @param {Array} codePoints The array of numeric code points. + * @returns {String} The new Unicode string (UCS-2). + */ +const ucs2encode = array => String.fromCodePoint(...array); + +/** + * Converts a basic code point into a digit/integer. + * @see `digitToBasic()` + * @private + * @param {Number} codePoint The basic numeric code point value. + * @returns {Number} The numeric value of a basic code point (for use in + * representing integers) in the range `0` to `base - 1`, or `base` if + * the code point does not represent a value. + */ +const basicToDigit = function(codePoint) { + if (codePoint - 0x30 < 0x0A) { + return codePoint - 0x16; + } + if (codePoint - 0x41 < 0x1A) { + return codePoint - 0x41; + } + if (codePoint - 0x61 < 0x1A) { + return codePoint - 0x61; + } + return base; +}; + +/** + * Converts a digit/integer into a basic code point. + * @see `basicToDigit()` + * @private + * @param {Number} digit The numeric value of a basic code point. + * @returns {Number} The basic code point whose value (when used for + * representing integers) is `digit`, which needs to be in the range + * `0` to `base - 1`. If `flag` is non-zero, the uppercase form is + * used; else, the lowercase form is used. The behavior is undefined + * if `flag` is non-zero and `digit` has no uppercase form. 
+ */ +const digitToBasic = function(digit, flag) { + // 0..25 map to ASCII a..z or A..Z + // 26..35 map to ASCII 0..9 + return digit + 22 + 75 * (digit < 26) - ((flag != 0) << 5); +}; + +/** + * Bias adaptation function as per section 3.4 of RFC 3492. + * https://tools.ietf.org/html/rfc3492#section-3.4 + * @private + */ +const adapt = function(delta, numPoints, firstTime) { + let k = 0; + delta = firstTime ? floor(delta / damp) : delta >> 1; + delta += floor(delta / numPoints); + for (/* no initialization */; delta > baseMinusTMin * tMax >> 1; k += base) { + delta = floor(delta / baseMinusTMin); + } + return floor(k + (baseMinusTMin + 1) * delta / (delta + skew)); +}; + +/** + * Converts a Punycode string of ASCII-only symbols to a string of Unicode + * symbols. + * @memberOf punycode + * @param {String} input The Punycode string of ASCII-only symbols. + * @returns {String} The resulting string of Unicode symbols. + */ +const decode = function(input) { + // Don't use UCS-2. + const output = []; + const inputLength = input.length; + let i = 0; + let n = initialN; + let bias = initialBias; + + // Handle the basic code points: let `basic` be the number of input code + // points before the last delimiter, or `0` if there is none, then copy + // the first basic code points to the output. + + let basic = input.lastIndexOf(delimiter); + if (basic < 0) { + basic = 0; + } + + for (let j = 0; j < basic; ++j) { + // if it's not a basic code point + if (input.charCodeAt(j) >= 0x80) { + error('not-basic'); + } + output.push(input.charCodeAt(j)); + } + + // Main decoding loop: start just after the last delimiter if any basic code + // points were copied; start at the beginning otherwise. + + for (let index = basic > 0 ? basic + 1 : 0; index < inputLength; /* no final expression */) { + + // `index` is the index of the next character to be consumed. + // Decode a generalized variable-length integer into `delta`, + // which gets added to `i`. The overflow checking is easier + // if we increase `i` as we go, then subtract off its starting + // value at the end to obtain `delta`. + let oldi = i; + for (let w = 1, k = base; /* no condition */; k += base) { + + if (index >= inputLength) { + error('invalid-input'); + } + + const digit = basicToDigit(input.charCodeAt(index++)); + + if (digit >= base || digit > floor((maxInt - i) / w)) { + error('overflow'); + } + + i += digit * w; + const t = k <= bias ? tMin : (k >= bias + tMax ? tMax : k - bias); + + if (digit < t) { + break; + } + + const baseMinusT = base - t; + if (w > floor(maxInt / baseMinusT)) { + error('overflow'); + } + + w *= baseMinusT; + + } + + const out = output.length + 1; + bias = adapt(i - oldi, out, oldi == 0); + + // `i` was supposed to wrap around from `out` to `0`, + // incrementing `n` each time, so we'll fix that now: + if (floor(i / out) > maxInt - n) { + error('overflow'); + } + + n += floor(i / out); + i %= out; + + // Insert `n` at position `i` of the output. + output.splice(i++, 0, n); + + } + + return String.fromCodePoint(...output); +}; + +/** + * Converts a string of Unicode symbols (e.g. a domain name label) to a + * Punycode string of ASCII-only symbols. + * @memberOf punycode + * @param {String} input The string of Unicode symbols. + * @returns {String} The resulting Punycode string of ASCII-only symbols. + */ +const encode = function(input) { + const output = []; + + // Convert the input in UCS-2 to an array of Unicode code points. + input = ucs2decode(input); + + // Cache the length. 
+ let inputLength = input.length; + + // Initialize the state. + let n = initialN; + let delta = 0; + let bias = initialBias; + + // Handle the basic code points. + for (const currentValue of input) { + if (currentValue < 0x80) { + output.push(stringFromCharCode(currentValue)); + } + } + + let basicLength = output.length; + let handledCPCount = basicLength; + + // `handledCPCount` is the number of code points that have been handled; + // `basicLength` is the number of basic code points. + + // Finish the basic string with a delimiter unless it's empty. + if (basicLength) { + output.push(delimiter); + } + + // Main encoding loop: + while (handledCPCount < inputLength) { + + // All non-basic code points < n have been handled already. Find the next + // larger one: + let m = maxInt; + for (const currentValue of input) { + if (currentValue >= n && currentValue < m) { + m = currentValue; + } + } + + // Increase `delta` enough to advance the decoder's state to , + // but guard against overflow. + const handledCPCountPlusOne = handledCPCount + 1; + if (m - n > floor((maxInt - delta) / handledCPCountPlusOne)) { + error('overflow'); + } + + delta += (m - n) * handledCPCountPlusOne; + n = m; + + for (const currentValue of input) { + if (currentValue < n && ++delta > maxInt) { + error('overflow'); + } + if (currentValue == n) { + // Represent delta as a generalized variable-length integer. + let q = delta; + for (let k = base; /* no condition */; k += base) { + const t = k <= bias ? tMin : (k >= bias + tMax ? tMax : k - bias); + if (q < t) { + break; + } + const qMinusT = q - t; + const baseMinusT = base - t; + output.push( + stringFromCharCode(digitToBasic(t + qMinusT % baseMinusT, 0)) + ); + q = floor(qMinusT / baseMinusT); + } + + output.push(stringFromCharCode(digitToBasic(q, 0))); + bias = adapt(delta, handledCPCountPlusOne, handledCPCount == basicLength); + delta = 0; + ++handledCPCount; + } + } + + ++delta; + ++n; + + } + return output.join(''); +}; + +/** + * Converts a Punycode string representing a domain name or an email address + * to Unicode. Only the Punycoded parts of the input will be converted, i.e. + * it doesn't matter if you call it on a string that has already been + * converted to Unicode. + * @memberOf punycode + * @param {String} input The Punycoded domain name or email address to + * convert to Unicode. + * @returns {String} The Unicode representation of the given Punycode + * string. + */ +const toUnicode = function(input) { + return mapDomain(input, function(string) { + return regexPunycode.test(string) + ? decode(string.slice(4).toLowerCase()) + : string; + }); +}; + +/** + * Converts a Unicode string representing a domain name or an email address to + * Punycode. Only the non-ASCII parts of the domain name will be converted, + * i.e. it doesn't matter if you call it with a domain that's already in + * ASCII. + * @memberOf punycode + * @param {String} input The domain name or email address to convert, as a + * Unicode string. + * @returns {String} The Punycode representation of the given domain name or + * email address. + */ +const toASCII = function(input) { + return mapDomain(input, function(string) { + return regexNonASCII.test(string) + ? 'xn--' + encode(string) + : string; + }); +}; + +/*--------------------------------------------------------------------------*/ + +/** Define the public API */ +const punycode = { + /** + * A string representing the current Punycode.js version number. 
+ * @memberOf punycode + * @type String + */ + 'version': '2.1.0', + /** + * An object of methods to convert from JavaScript's internal character + * representation (UCS-2) to Unicode code points, and back. + * @see + * @memberOf punycode + * @type Object + */ + 'ucs2': { + 'decode': ucs2decode, + 'encode': ucs2encode + }, + 'decode': decode, + 'encode': encode, + 'toASCII': toASCII, + 'toUnicode': toUnicode +}; + +module.exports = punycode; diff --git a/node_modules/pupa/index.d.ts b/node_modules/pupa/index.d.ts new file mode 100644 index 000000000..762aae0c2 --- /dev/null +++ b/node_modules/pupa/index.d.ts @@ -0,0 +1,32 @@ +/** +Simple micro templating. + +@param template - Text with placeholders for `data` properties. +@param data - Data to interpolate into `template`. + +@example +``` +import pupa = require('pupa'); + +pupa('The mobile number of {name} is {phone.mobile}', { + name: 'Sindre', + phone: { + mobile: '609 24 363' + } +}); +//=> 'The mobile number of Sindre is 609 24 363' + +pupa('I like {0} and {1}', ['🦄', '🐮']); +//=> 'I like 🦄 and 🐮' + +// Double braces encodes the HTML entities to avoid code injection +pupa('I like {{0}} and {{1}}', ['
<br>🦄</br>
', '🐮']); +//=> 'I like <br>🦄</br> and <i>🐮</i>' +``` +*/ +declare function pupa( + template: string, + data: unknown[] | {[key: string]: any} +): string; + +export = pupa; diff --git a/node_modules/pupa/index.js b/node_modules/pupa/index.js new file mode 100644 index 000000000..85739ebbb --- /dev/null +++ b/node_modules/pupa/index.js @@ -0,0 +1,39 @@ +'use strict'; +const {htmlEscape} = require('escape-goat'); + +module.exports = (template, data) => { + if (typeof template !== 'string') { + throw new TypeError(`Expected a \`string\` in the first argument, got \`${typeof template}\``); + } + + if (typeof data !== 'object') { + throw new TypeError(`Expected an \`object\` or \`Array\` in the second argument, got \`${typeof data}\``); + } + + // The regex tries to match either a number inside `{{ }}` or a valid JS identifier or key path. + const doubleBraceRegex = /{{(\d+|[a-z$_][a-z\d$_]*?(?:\.[a-z\d$_]*?)*?)}}/gi; + + if (doubleBraceRegex.test(template)) { + template = template.replace(doubleBraceRegex, (_, key) => { + let result = data; + + for (const property of key.split('.')) { + result = result ? result[property] : ''; + } + + return htmlEscape(String(result)); + }); + } + + const braceRegex = /{(\d+|[a-z$_][a-z\d$_]*?(?:\.[a-z\d$_]*?)*?)}/gi; + + return template.replace(braceRegex, (_, key) => { + let result = data; + + for (const property of key.split('.')) { + result = result ? result[property] : ''; + } + + return String(result); + }); +}; diff --git a/node_modules/pupa/license b/node_modules/pupa/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/pupa/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
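As a usage note on the `pupa` implementation above: single braces interpolate raw values (including dotted key paths), while double braces run the value through `escape-goat`'s `htmlEscape` first. A small hedged sketch follows; the `user`/`comment` field names are illustrative, not taken from the project code.

```js
// Sketch only; field names are made up for illustration.
const pupa = require('pupa');

// Dotted paths walk nested objects.
console.log(pupa('Hi {user.name}!', { user: { name: 'Ada' } }));
//=> 'Hi Ada!'

// Double braces HTML-escape the interpolated value, which matters when
// echoing untrusted input into markup.
console.log(pupa('<p>{{comment}}</p>', { comment: '<script>alert(1)</script>' }));
//=> '<p>&lt;script&gt;alert(1)&lt;/script&gt;</p>'
```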
diff --git a/node_modules/pupa/package.json b/node_modules/pupa/package.json new file mode 100644 index 000000000..e4f078a32 --- /dev/null +++ b/node_modules/pupa/package.json @@ -0,0 +1,79 @@ +{ + "_from": "pupa@^2.1.1", + "_id": "pupa@2.1.1", + "_inBundle": false, + "_integrity": "sha512-l1jNAspIBSFqbT+y+5FosojNpVpF94nlI+wDUpqP9enwOTfHx9f0gh5nB96vl+6yTpsJsypeNrwfzPrKuHB41A==", + "_location": "/pupa", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "pupa@^2.1.1", + "name": "pupa", + "escapedName": "pupa", + "rawSpec": "^2.1.1", + "saveSpec": null, + "fetchSpec": "^2.1.1" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/pupa/-/pupa-2.1.1.tgz", + "_shasum": "f5e8fd4afc2c5d97828faa523549ed8744a20d62", + "_spec": "pupa@^2.1.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/pupa/issues" + }, + "bundleDependencies": false, + "dependencies": { + "escape-goat": "^2.0.0" + }, + "deprecated": false, + "description": "Simple micro templating", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/pupa#readme", + "keywords": [ + "string", + "formatting", + "template", + "object", + "format", + "interpolate", + "interpolation", + "templating", + "expand", + "simple", + "replace", + "placeholders", + "values", + "transform", + "micro" + ], + "license": "MIT", + "name": "pupa", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/pupa.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "2.1.1" +} diff --git a/node_modules/pupa/readme.md b/node_modules/pupa/readme.md new file mode 100644 index 000000000..bb9ab1805 --- /dev/null +++ b/node_modules/pupa/readme.md @@ -0,0 +1,63 @@ +# pupa [![Build Status](https://travis-ci.org/sindresorhus/pupa.svg?branch=master)](https://travis-ci.org/sindresorhus/pupa) + +> Simple micro templating + +Useful when all you need is to fill in some placeholders. + + +## Install + +``` +$ npm install pupa +``` + + +## Usage + +```js +const pupa = require('pupa'); + +pupa('The mobile number of {name} is {phone.mobile}', { + name: 'Sindre', + phone: { + mobile: '609 24 363' + } +}); +//=> 'The mobile number of Sindre is 609 24 363' + +pupa('I like {0} and {1}', ['🦄', '🐮']); +//=> 'I like 🦄 and 🐮' + +// Double braces encodes the HTML entities to avoid code injection +pupa('I like {{0}} and {{1}}', ['
<br>🦄</br>
', '🐮']); +//=> 'I like <br>🦄</br> and <i>🐮</i>' +``` + + +## API + +### pupa(template, data) + +#### template + +Type: `string` + +Text with placeholders for `data` properties. + +#### data + +Type: `object | unknown[]` + +Data to interpolate into `template`. + + +## FAQ + +### What about template literals? + +Template literals expand on creation. This module expands the template on execution, which can be useful if either or both template and data are lazily created or user-supplied. + + +## Related + +- [pupa-cli](https://github.com/sindresorhus/pupa-cli) - CLI for this module diff --git a/node_modules/qs/.editorconfig b/node_modules/qs/.editorconfig new file mode 100644 index 000000000..a4893ddfa --- /dev/null +++ b/node_modules/qs/.editorconfig @@ -0,0 +1,30 @@ +root = true + +[*] +indent_style = space +indent_size = 4 +end_of_line = lf +charset = utf-8 +trim_trailing_whitespace = true +insert_final_newline = true +max_line_length = 160 + +[test/*] +max_line_length = off + +[*.md] +max_line_length = off + +[*.json] +max_line_length = off + +[Makefile] +max_line_length = off + +[CHANGELOG.md] +indent_style = space +indent_size = 2 + +[LICENSE] +indent_size = 2 +max_line_length = off diff --git a/node_modules/qs/.eslintignore b/node_modules/qs/.eslintignore new file mode 100644 index 000000000..1521c8b76 --- /dev/null +++ b/node_modules/qs/.eslintignore @@ -0,0 +1 @@ +dist diff --git a/node_modules/qs/.eslintrc b/node_modules/qs/.eslintrc new file mode 100644 index 000000000..e3bde898e --- /dev/null +++ b/node_modules/qs/.eslintrc @@ -0,0 +1,21 @@ +{ + "root": true, + + "extends": "@ljharb", + + "rules": { + "complexity": 0, + "consistent-return": 1, + "func-name-matching": 0, + "id-length": [2, { "min": 1, "max": 25, "properties": "never" }], + "indent": [2, 4], + "max-lines-per-function": [2, { "max": 150 }], + "max-params": [2, 14], + "max-statements": [2, 52], + "multiline-comment-style": 0, + "no-continue": 1, + "no-magic-numbers": 0, + "no-restricted-syntax": [2, "BreakStatement", "DebuggerStatement", "ForInStatement", "LabeledStatement", "WithStatement"], + "operator-linebreak": [2, "before"], + } +} diff --git a/node_modules/qs/CHANGELOG.md b/node_modules/qs/CHANGELOG.md new file mode 100644 index 000000000..50505c462 --- /dev/null +++ b/node_modules/qs/CHANGELOG.md @@ -0,0 +1,256 @@ +## **6.7.0** +- [New] `stringify`/`parse`: add `comma` as an `arrayFormat` option (#276, #219) +- [Fix] correctly parse nested arrays (#212) +- [Fix] `utils.merge`: avoid a crash with a null target and a truthy non-array source, also with an array source +- [Robustness] `stringify`: cache `Object.prototype.hasOwnProperty` +- [Refactor] `utils`: `isBuffer`: small tweak; add tests +- [Refactor] use cached `Array.isArray` +- [Refactor] `parse`/`stringify`: make a function to normalize the options +- [Refactor] `utils`: reduce observable [[Get]]s +- [Refactor] `stringify`/`utils`: cache `Array.isArray` +- [Tests] always use `String(x)` over `x.toString()` +- [Tests] fix Buffer tests to work in node < 4.5 and node < 5.10 +- [Tests] temporarily allow coverage to fail + +## **6.6.0** +- [New] Add support for iso-8859-1, utf8 "sentinel" and numeric entities (#268) +- [New] move two-value combine to a `utils` function (#189) +- [Fix] `stringify`: fix a crash with `strictNullHandling` and a custom `filter`/`serializeDate` (#279) +- [Fix] when `parseArrays` is false, properly handle keys ending in `[]` (#260) +- [Fix] `stringify`: do not crash in an obscure combo of `interpretNumericEntities`, a bad custom 
`decoder`, & `iso-8859-1` +- [Fix] `utils`: `merge`: fix crash when `source` is a truthy primitive & no options are provided +- [refactor] `stringify`: Avoid arr = arr.concat(...), push to the existing instance (#269) +- [Refactor] `parse`: only need to reassign the var once +- [Refactor] `parse`/`stringify`: clean up `charset` options checking; fix defaults +- [Refactor] add missing defaults +- [Refactor] `parse`: one less `concat` call +- [Refactor] `utils`: `compactQueue`: make it explicitly side-effecting +- [Dev Deps] update `browserify`, `eslint`, `@ljharb/eslint-config`, `iconv-lite`, `safe-publish-latest`, `tape` +- [Tests] up to `node` `v10.10`, `v9.11`, `v8.12`, `v6.14`, `v4.9`; pin included builds to LTS + +## **6.5.2** +- [Fix] use `safer-buffer` instead of `Buffer` constructor +- [Refactor] utils: `module.exports` one thing, instead of mutating `exports` (#230) +- [Dev Deps] update `browserify`, `eslint`, `iconv-lite`, `safer-buffer`, `tape`, `browserify` + +## **6.5.1** +- [Fix] Fix parsing & compacting very deep objects (#224) +- [Refactor] name utils functions +- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `tape` +- [Tests] up to `node` `v8.4`; use `nvm install-latest-npm` so newer npm doesn’t break older node +- [Tests] Use precise dist for Node.js 0.6 runtime (#225) +- [Tests] make 0.6 required, now that it’s passing +- [Tests] on `node` `v8.2`; fix npm on node 0.6 + +## **6.5.0** +- [New] add `utils.assign` +- [New] pass default encoder/decoder to custom encoder/decoder functions (#206) +- [New] `parse`/`stringify`: add `ignoreQueryPrefix`/`addQueryPrefix` options, respectively (#213) +- [Fix] Handle stringifying empty objects with addQueryPrefix (#217) +- [Fix] do not mutate `options` argument (#207) +- [Refactor] `parse`: cache index to reuse in else statement (#182) +- [Docs] add various badges to readme (#208) +- [Dev Deps] update `eslint`, `browserify`, `iconv-lite`, `tape` +- [Tests] up to `node` `v8.1`, `v7.10`, `v6.11`; npm v4.6 breaks on node < v1; npm v5+ breaks on node < v4 +- [Tests] add `editorconfig-tools` + +## **6.4.0** +- [New] `qs.stringify`: add `encodeValuesOnly` option +- [Fix] follow `allowPrototypes` option during merge (#201, #201) +- [Fix] support keys starting with brackets (#202, #200) +- [Fix] chmod a-x +- [Dev Deps] update `eslint` +- [Tests] up to `node` `v7.7`, `v6.10`,` v4.8`; disable osx builds since they block linux builds +- [eslint] reduce warnings + +## **6.3.2** +- [Fix] follow `allowPrototypes` option during merge (#201, #200) +- [Dev Deps] update `eslint` +- [Fix] chmod a-x +- [Fix] support keys starting with brackets (#202, #200) +- [Tests] up to `node` `v7.7`, `v6.10`,` v4.8`; disable osx builds since they block linux builds + +## **6.3.1** +- [Fix] ensure that `allowPrototypes: false` does not ever shadow Object.prototype properties (thanks, @snyk!) 
+- [Dev Deps] update `eslint`, `@ljharb/eslint-config`, `browserify`, `iconv-lite`, `qs-iconv`, `tape` +- [Tests] on all node minors; improve test matrix +- [Docs] document stringify option `allowDots` (#195) +- [Docs] add empty object and array values example (#195) +- [Docs] Fix minor inconsistency/typo (#192) +- [Docs] document stringify option `sort` (#191) +- [Refactor] `stringify`: throw faster with an invalid encoder +- [Refactor] remove unnecessary escapes (#184) +- Remove contributing.md, since `qs` is no longer part of `hapi` (#183) + +## **6.3.0** +- [New] Add support for RFC 1738 (#174, #173) +- [New] `stringify`: Add `serializeDate` option to customize Date serialization (#159) +- [Fix] ensure `utils.merge` handles merging two arrays +- [Refactor] only constructors should be capitalized +- [Refactor] capitalized var names are for constructors only +- [Refactor] avoid using a sparse array +- [Robustness] `formats`: cache `String#replace` +- [Dev Deps] update `browserify`, `eslint`, `@ljharb/eslint-config`; add `safe-publish-latest` +- [Tests] up to `node` `v6.8`, `v4.6`; improve test matrix +- [Tests] flesh out arrayLimit/arrayFormat tests (#107) +- [Tests] skip Object.create tests when null objects are not available +- [Tests] Turn on eslint for test files (#175) + +## **6.2.3** +- [Fix] follow `allowPrototypes` option during merge (#201, #200) +- [Fix] chmod a-x +- [Fix] support keys starting with brackets (#202, #200) +- [Tests] up to `node` `v7.7`, `v6.10`,` v4.8`; disable osx builds since they block linux builds + +## **6.2.2** +- [Fix] ensure that `allowPrototypes: false` does not ever shadow Object.prototype properties + +## **6.2.1** +- [Fix] ensure `key[]=x&key[]&key[]=y` results in 3, not 2, values +- [Refactor] Be explicit and use `Object.prototype.hasOwnProperty.call` +- [Tests] remove `parallelshell` since it does not reliably report failures +- [Tests] up to `node` `v6.3`, `v5.12` +- [Dev Deps] update `tape`, `eslint`, `@ljharb/eslint-config`, `qs-iconv` + +## [**6.2.0**](https://github.com/ljharb/qs/issues?milestone=36&state=closed) +- [New] pass Buffers to the encoder/decoder directly (#161) +- [New] add "encoder" and "decoder" options, for custom param encoding/decoding (#160) +- [Fix] fix compacting of nested sparse arrays (#150) + +## **6.1.2 +- [Fix] follow `allowPrototypes` option during merge (#201, #200) +- [Fix] chmod a-x +- [Fix] support keys starting with brackets (#202, #200) +- [Tests] up to `node` `v7.7`, `v6.10`,` v4.8`; disable osx builds since they block linux builds + +## **6.1.1** +- [Fix] ensure that `allowPrototypes: false` does not ever shadow Object.prototype properties + +## [**6.1.0**](https://github.com/ljharb/qs/issues?milestone=35&state=closed) +- [New] allowDots option for `stringify` (#151) +- [Fix] "sort" option should work at a depth of 3 or more (#151) +- [Fix] Restore `dist` directory; will be removed in v7 (#148) + +## **6.0.4** +- [Fix] follow `allowPrototypes` option during merge (#201, #200) +- [Fix] chmod a-x +- [Fix] support keys starting with brackets (#202, #200) +- [Tests] up to `node` `v7.7`, `v6.10`,` v4.8`; disable osx builds since they block linux builds + +## **6.0.3** +- [Fix] ensure that `allowPrototypes: false` does not ever shadow Object.prototype properties +- [Fix] Restore `dist` directory; will be removed in v7 (#148) + +## [**6.0.2**](https://github.com/ljharb/qs/issues?milestone=33&state=closed) +- Revert ES6 requirement and restore support for node down to v0.8. 
+ +## [**6.0.1**](https://github.com/ljharb/qs/issues?milestone=32&state=closed) +- [**#127**](https://github.com/ljharb/qs/pull/127) Fix engines definition in package.json + +## [**6.0.0**](https://github.com/ljharb/qs/issues?milestone=31&state=closed) +- [**#124**](https://github.com/ljharb/qs/issues/124) Use ES6 and drop support for node < v4 + +## **5.2.1** +- [Fix] ensure `key[]=x&key[]&key[]=y` results in 3, not 2, values + +## [**5.2.0**](https://github.com/ljharb/qs/issues?milestone=30&state=closed) +- [**#64**](https://github.com/ljharb/qs/issues/64) Add option to sort object keys in the query string + +## [**5.1.0**](https://github.com/ljharb/qs/issues?milestone=29&state=closed) +- [**#117**](https://github.com/ljharb/qs/issues/117) make URI encoding stringified results optional +- [**#106**](https://github.com/ljharb/qs/issues/106) Add flag `skipNulls` to optionally skip null values in stringify + +## [**5.0.0**](https://github.com/ljharb/qs/issues?milestone=28&state=closed) +- [**#114**](https://github.com/ljharb/qs/issues/114) default allowDots to false +- [**#100**](https://github.com/ljharb/qs/issues/100) include dist to npm + +## [**4.0.0**](https://github.com/ljharb/qs/issues?milestone=26&state=closed) +- [**#98**](https://github.com/ljharb/qs/issues/98) make returning plain objects and allowing prototype overwriting properties optional + +## [**3.1.0**](https://github.com/ljharb/qs/issues?milestone=24&state=closed) +- [**#89**](https://github.com/ljharb/qs/issues/89) Add option to disable "Transform dot notation to bracket notation" + +## [**3.0.0**](https://github.com/ljharb/qs/issues?milestone=23&state=closed) +- [**#80**](https://github.com/ljharb/qs/issues/80) qs.parse silently drops properties +- [**#77**](https://github.com/ljharb/qs/issues/77) Perf boost +- [**#60**](https://github.com/ljharb/qs/issues/60) Add explicit option to disable array parsing +- [**#74**](https://github.com/ljharb/qs/issues/74) Bad parse when turning array into object +- [**#81**](https://github.com/ljharb/qs/issues/81) Add a `filter` option +- [**#68**](https://github.com/ljharb/qs/issues/68) Fixed issue with recursion and passing strings into objects. +- [**#66**](https://github.com/ljharb/qs/issues/66) Add mixed array and object dot notation support Closes: #47 +- [**#76**](https://github.com/ljharb/qs/issues/76) RFC 3986 +- [**#85**](https://github.com/ljharb/qs/issues/85) No equal sign +- [**#84**](https://github.com/ljharb/qs/issues/84) update license attribute + +## [**2.4.1**](https://github.com/ljharb/qs/issues?milestone=20&state=closed) +- [**#73**](https://github.com/ljharb/qs/issues/73) Property 'hasOwnProperty' of object # is not a function + +## [**2.4.0**](https://github.com/ljharb/qs/issues?milestone=19&state=closed) +- [**#70**](https://github.com/ljharb/qs/issues/70) Add arrayFormat option + +## [**2.3.3**](https://github.com/ljharb/qs/issues?milestone=18&state=closed) +- [**#59**](https://github.com/ljharb/qs/issues/59) make sure array indexes are >= 0, closes #57 +- [**#58**](https://github.com/ljharb/qs/issues/58) make qs usable for browser loader + +## [**2.3.2**](https://github.com/ljharb/qs/issues?milestone=17&state=closed) +- [**#55**](https://github.com/ljharb/qs/issues/55) allow merging a string into an object + +## [**2.3.1**](https://github.com/ljharb/qs/issues?milestone=16&state=closed) +- [**#52**](https://github.com/ljharb/qs/issues/52) Return "undefined" and "false" instead of throwing "TypeError". 
+ +## [**2.3.0**](https://github.com/ljharb/qs/issues?milestone=15&state=closed) +- [**#50**](https://github.com/ljharb/qs/issues/50) add option to omit array indices, closes #46 + +## [**2.2.5**](https://github.com/ljharb/qs/issues?milestone=14&state=closed) +- [**#39**](https://github.com/ljharb/qs/issues/39) Is there an alternative to Buffer.isBuffer? +- [**#49**](https://github.com/ljharb/qs/issues/49) refactor utils.merge, fixes #45 +- [**#41**](https://github.com/ljharb/qs/issues/41) avoid browserifying Buffer, for #39 + +## [**2.2.4**](https://github.com/ljharb/qs/issues?milestone=13&state=closed) +- [**#38**](https://github.com/ljharb/qs/issues/38) how to handle object keys beginning with a number + +## [**2.2.3**](https://github.com/ljharb/qs/issues?milestone=12&state=closed) +- [**#37**](https://github.com/ljharb/qs/issues/37) parser discards first empty value in array +- [**#36**](https://github.com/ljharb/qs/issues/36) Update to lab 4.x + +## [**2.2.2**](https://github.com/ljharb/qs/issues?milestone=11&state=closed) +- [**#33**](https://github.com/ljharb/qs/issues/33) Error when plain object in a value +- [**#34**](https://github.com/ljharb/qs/issues/34) use Object.prototype.hasOwnProperty.call instead of obj.hasOwnProperty +- [**#24**](https://github.com/ljharb/qs/issues/24) Changelog? Semver? + +## [**2.2.1**](https://github.com/ljharb/qs/issues?milestone=10&state=closed) +- [**#32**](https://github.com/ljharb/qs/issues/32) account for circular references properly, closes #31 +- [**#31**](https://github.com/ljharb/qs/issues/31) qs.parse stackoverflow on circular objects + +## [**2.2.0**](https://github.com/ljharb/qs/issues?milestone=9&state=closed) +- [**#26**](https://github.com/ljharb/qs/issues/26) Don't use Buffer global if it's not present +- [**#30**](https://github.com/ljharb/qs/issues/30) Bug when merging non-object values into arrays +- [**#29**](https://github.com/ljharb/qs/issues/29) Don't call Utils.clone at the top of Utils.merge +- [**#23**](https://github.com/ljharb/qs/issues/23) Ability to not limit parameters? + +## [**2.1.0**](https://github.com/ljharb/qs/issues?milestone=8&state=closed) +- [**#22**](https://github.com/ljharb/qs/issues/22) Enable using a RegExp as delimiter + +## [**2.0.0**](https://github.com/ljharb/qs/issues?milestone=7&state=closed) +- [**#18**](https://github.com/ljharb/qs/issues/18) Why is there arrayLimit? 
+- [**#20**](https://github.com/ljharb/qs/issues/20) Configurable parametersLimit +- [**#21**](https://github.com/ljharb/qs/issues/21) make all limits optional, for #18, for #20 + +## [**1.2.2**](https://github.com/ljharb/qs/issues?milestone=6&state=closed) +- [**#19**](https://github.com/ljharb/qs/issues/19) Don't overwrite null values + +## [**1.2.1**](https://github.com/ljharb/qs/issues?milestone=5&state=closed) +- [**#16**](https://github.com/ljharb/qs/issues/16) ignore non-string delimiters +- [**#15**](https://github.com/ljharb/qs/issues/15) Close code block + +## [**1.2.0**](https://github.com/ljharb/qs/issues?milestone=4&state=closed) +- [**#12**](https://github.com/ljharb/qs/issues/12) Add optional delim argument +- [**#13**](https://github.com/ljharb/qs/issues/13) fix #11: flattened keys in array are now correctly parsed + +## [**1.1.0**](https://github.com/ljharb/qs/issues?milestone=3&state=closed) +- [**#7**](https://github.com/ljharb/qs/issues/7) Empty values of a POST array disappear after being submitted +- [**#9**](https://github.com/ljharb/qs/issues/9) Should not omit equals signs (=) when value is null +- [**#6**](https://github.com/ljharb/qs/issues/6) Minor grammar fix in README + +## [**1.0.2**](https://github.com/ljharb/qs/issues?milestone=2&state=closed) +- [**#5**](https://github.com/ljharb/qs/issues/5) array holes incorrectly copied into object on large index diff --git a/node_modules/qs/LICENSE b/node_modules/qs/LICENSE new file mode 100644 index 000000000..d4569487a --- /dev/null +++ b/node_modules/qs/LICENSE @@ -0,0 +1,28 @@ +Copyright (c) 2014 Nathan LaFreniere and other contributors. +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + * Redistributions of source code must retain the above copyright + notice, this list of conditions and the following disclaimer. + * Redistributions in binary form must reproduce the above copyright + notice, this list of conditions and the following disclaimer in the + documentation and/or other materials provided with the distribution. + * The names of any contributors may not be used to endorse or promote + products derived from this software without specific prior written + permission. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND +ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED +WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDERS AND CONTRIBUTORS BE LIABLE FOR ANY +DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES +(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; +LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND +ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS +SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
+ + * * * + +The complete list of contributors can be found at: https://github.com/hapijs/qs/graphs/contributors diff --git a/node_modules/qs/README.md b/node_modules/qs/README.md new file mode 100644 index 000000000..8590cfd3c --- /dev/null +++ b/node_modules/qs/README.md @@ -0,0 +1,570 @@ +# qs [![Version Badge][2]][1] + +[![Build Status][3]][4] +[![dependency status][5]][6] +[![dev dependency status][7]][8] +[![License][license-image]][license-url] +[![Downloads][downloads-image]][downloads-url] + +[![npm badge][11]][1] + +A querystring parsing and stringifying library with some added security. + +Lead Maintainer: [Jordan Harband](https://github.com/ljharb) + +The **qs** module was originally created and maintained by [TJ Holowaychuk](https://github.com/visionmedia/node-querystring). + +## Usage + +```javascript +var qs = require('qs'); +var assert = require('assert'); + +var obj = qs.parse('a=c'); +assert.deepEqual(obj, { a: 'c' }); + +var str = qs.stringify(obj); +assert.equal(str, 'a=c'); +``` + +### Parsing Objects + +[](#preventEval) +```javascript +qs.parse(string, [options]); +``` + +**qs** allows you to create nested objects within your query strings, by surrounding the name of sub-keys with square brackets `[]`. +For example, the string `'foo[bar]=baz'` converts to: + +```javascript +assert.deepEqual(qs.parse('foo[bar]=baz'), { + foo: { + bar: 'baz' + } +}); +``` + +When using the `plainObjects` option the parsed value is returned as a null object, created via `Object.create(null)` and as such you should be aware that prototype methods will not exist on it and a user may set those names to whatever value they like: + +```javascript +var nullObject = qs.parse('a[hasOwnProperty]=b', { plainObjects: true }); +assert.deepEqual(nullObject, { a: { hasOwnProperty: 'b' } }); +``` + +By default parameters that would overwrite properties on the object prototype are ignored, if you wish to keep the data from those fields either use `plainObjects` as mentioned above, or set `allowPrototypes` to `true` which will allow user input to overwrite those properties. *WARNING* It is generally a bad idea to enable this option as it can cause problems when attempting to use the properties that have been overwritten. Always be careful with this option. + +```javascript +var protoObject = qs.parse('a[hasOwnProperty]=b', { allowPrototypes: true }); +assert.deepEqual(protoObject, { a: { hasOwnProperty: 'b' } }); +``` + +URI encoded strings work too: + +```javascript +assert.deepEqual(qs.parse('a%5Bb%5D=c'), { + a: { b: 'c' } +}); +``` + +You can also nest your objects, like `'foo[bar][baz]=foobarbaz'`: + +```javascript +assert.deepEqual(qs.parse('foo[bar][baz]=foobarbaz'), { + foo: { + bar: { + baz: 'foobarbaz' + } + } +}); +``` + +By default, when nesting objects **qs** will only parse up to 5 children deep. 
This means if you attempt to parse a string like +`'a[b][c][d][e][f][g][h][i]=j'` your resulting object will be: + +```javascript +var expected = { + a: { + b: { + c: { + d: { + e: { + f: { + '[g][h][i]': 'j' + } + } + } + } + } + } +}; +var string = 'a[b][c][d][e][f][g][h][i]=j'; +assert.deepEqual(qs.parse(string), expected); +``` + +This depth can be overridden by passing a `depth` option to `qs.parse(string, [options])`: + +```javascript +var deep = qs.parse('a[b][c][d][e][f][g][h][i]=j', { depth: 1 }); +assert.deepEqual(deep, { a: { b: { '[c][d][e][f][g][h][i]': 'j' } } }); +``` + +The depth limit helps mitigate abuse when **qs** is used to parse user input, and it is recommended to keep it a reasonably small number. + +For similar reasons, by default **qs** will only parse up to 1000 parameters. This can be overridden by passing a `parameterLimit` option: + +```javascript +var limited = qs.parse('a=b&c=d', { parameterLimit: 1 }); +assert.deepEqual(limited, { a: 'b' }); +``` + +To bypass the leading question mark, use `ignoreQueryPrefix`: + +```javascript +var prefixed = qs.parse('?a=b&c=d', { ignoreQueryPrefix: true }); +assert.deepEqual(prefixed, { a: 'b', c: 'd' }); +``` + +An optional delimiter can also be passed: + +```javascript +var delimited = qs.parse('a=b;c=d', { delimiter: ';' }); +assert.deepEqual(delimited, { a: 'b', c: 'd' }); +``` + +Delimiters can be a regular expression too: + +```javascript +var regexed = qs.parse('a=b;c=d,e=f', { delimiter: /[;,]/ }); +assert.deepEqual(regexed, { a: 'b', c: 'd', e: 'f' }); +``` + +Option `allowDots` can be used to enable dot notation: + +```javascript +var withDots = qs.parse('a.b=c', { allowDots: true }); +assert.deepEqual(withDots, { a: { b: 'c' } }); +``` + +If you have to deal with legacy browsers or services, there's +also support for decoding percent-encoded octets as iso-8859-1: + +```javascript +var oldCharset = qs.parse('a=%A7', { charset: 'iso-8859-1' }); +assert.deepEqual(oldCharset, { a: '§' }); +``` + +Some services add an initial `utf8=✓` value to forms so that old +Internet Explorer versions are more likely to submit the form as +utf-8. Additionally, the server can check the value against wrong +encodings of the checkmark character and detect that a query string +or `application/x-www-form-urlencoded` body was *not* sent as +utf-8, eg. if the form had an `accept-charset` parameter or the +containing page had a different character set. + +**qs** supports this mechanism via the `charsetSentinel` option. +If specified, the `utf8` parameter will be omitted from the +returned object. It will be used to switch to `iso-8859-1`/`utf-8` +mode depending on how the checkmark is encoded. + +**Important**: When you specify both the `charset` option and the +`charsetSentinel` option, the `charset` will be overridden when +the request contains a `utf8` parameter from which the actual +charset can be deduced. In that sense the `charset` will behave +as the default charset rather than the authoritative charset. 
+ +```javascript +var detectedAsUtf8 = qs.parse('utf8=%E2%9C%93&a=%C3%B8', { + charset: 'iso-8859-1', + charsetSentinel: true +}); +assert.deepEqual(detectedAsUtf8, { a: 'ø' }); + +// Browsers encode the checkmark as ✓ when submitting as iso-8859-1: +var detectedAsIso8859_1 = qs.parse('utf8=%26%2310003%3B&a=%F8', { + charset: 'utf-8', + charsetSentinel: true +}); +assert.deepEqual(detectedAsIso8859_1, { a: 'ø' }); +``` + +If you want to decode the `&#...;` syntax to the actual character, +you can specify the `interpretNumericEntities` option as well: + +```javascript +var detectedAsIso8859_1 = qs.parse('a=%26%239786%3B', { + charset: 'iso-8859-1', + interpretNumericEntities: true +}); +assert.deepEqual(detectedAsIso8859_1, { a: '☺' }); +``` + +It also works when the charset has been detected in `charsetSentinel` +mode. + +### Parsing Arrays + +**qs** can also parse arrays using a similar `[]` notation: + +```javascript +var withArray = qs.parse('a[]=b&a[]=c'); +assert.deepEqual(withArray, { a: ['b', 'c'] }); +``` + +You may specify an index as well: + +```javascript +var withIndexes = qs.parse('a[1]=c&a[0]=b'); +assert.deepEqual(withIndexes, { a: ['b', 'c'] }); +``` + +Note that the only difference between an index in an array and a key in an object is that the value between the brackets must be a number +to create an array. When creating arrays with specific indices, **qs** will compact a sparse array to only the existing values preserving +their order: + +```javascript +var noSparse = qs.parse('a[1]=b&a[15]=c'); +assert.deepEqual(noSparse, { a: ['b', 'c'] }); +``` + +Note that an empty string is also a value, and will be preserved: + +```javascript +var withEmptyString = qs.parse('a[]=&a[]=b'); +assert.deepEqual(withEmptyString, { a: ['', 'b'] }); + +var withIndexedEmptyString = qs.parse('a[0]=b&a[1]=&a[2]=c'); +assert.deepEqual(withIndexedEmptyString, { a: ['b', '', 'c'] }); +``` + +**qs** will also limit specifying indices in an array to a maximum index of `20`. Any array members with an index of greater than `20` will +instead be converted to an object with the index as the key. This is needed to handle cases when someone sent, for example, `a[999999999]` and it will take significant time to iterate over this huge array. + +```javascript +var withMaxIndex = qs.parse('a[100]=b'); +assert.deepEqual(withMaxIndex, { a: { '100': 'b' } }); +``` + +This limit can be overridden by passing an `arrayLimit` option: + +```javascript +var withArrayLimit = qs.parse('a[1]=b', { arrayLimit: 0 }); +assert.deepEqual(withArrayLimit, { a: { '1': 'b' } }); +``` + +To disable array parsing entirely, set `parseArrays` to `false`. 
+
+```javascript
+var noParsingArrays = qs.parse('a[]=b', { parseArrays: false });
+assert.deepEqual(noParsingArrays, { a: { '0': 'b' } });
+```
+
+If you mix notations, **qs** will merge the two items into an object:
+
+```javascript
+var mixedNotation = qs.parse('a[0]=b&a[b]=c');
+assert.deepEqual(mixedNotation, { a: { '0': 'b', b: 'c' } });
+```
+
+You can also create arrays of objects:
+
+```javascript
+var arraysOfObjects = qs.parse('a[][b]=c');
+assert.deepEqual(arraysOfObjects, { a: [{ b: 'c' }] });
+```
+
+Some services join array values with a comma; **qs** can parse this with the `comma` option:
+
+```javascript
+var commaJoined = qs.parse('a=b,c', { comma: true })
+assert.deepEqual(commaJoined, { a: ['b', 'c'] })
+```
+(_this cannot convert nested objects, such as `a={b:1},{c:d}`_)
+
+### Stringifying
+
+[](#preventEval)
+```javascript
+qs.stringify(object, [options]);
+```
+
+When stringifying, **qs** URI encodes the output by default. Objects are stringified as you would expect:
+
+```javascript
+assert.equal(qs.stringify({ a: 'b' }), 'a=b');
+assert.equal(qs.stringify({ a: { b: 'c' } }), 'a%5Bb%5D=c');
+```
+
+This encoding can be disabled by setting the `encode` option to `false`:
+
+```javascript
+var unencoded = qs.stringify({ a: { b: 'c' } }, { encode: false });
+assert.equal(unencoded, 'a[b]=c');
+```
+
+Encoding can be disabled for keys by setting the `encodeValuesOnly` option to `true`:
+
+```javascript
+var encodedValues = qs.stringify(
+    { a: 'b', c: ['d', 'e=f'], f: [['g'], ['h']] },
+    { encodeValuesOnly: true }
+);
+assert.equal(encodedValues,'a=b&c[0]=d&c[1]=e%3Df&f[0][0]=g&f[1][0]=h');
+```
+
+The default encoding can also be replaced by a custom encoding function passed as the `encoder` option:
+
+```javascript
+var encoded = qs.stringify({ a: { b: 'c' } }, { encoder: function (str) {
+    // Passed in values `a`, `b`, `c`
+    return // Return encoded string
+}})
+```
+
+_(Note: the `encoder` option does not apply if `encode` is `false`)_
+
+Analogous to the `encoder`, there is a `decoder` option for `parse` that overrides the decoding of keys and values:
+
+```javascript
+var decoded = qs.parse('x=z', { decoder: function (str) {
+    // Passed in values `x`, `z`
+    return // Return decoded string
+}})
+```
+
+Examples beyond this point will be shown as though the output is not URI encoded for clarity. Please note that the return values in these cases *will* be URI encoded during real usage.
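+
+The `encoder` and `decoder` hooks above are shown with placeholder bodies. As a
+minimal sketch (one possible choice, not the library's required implementation,
+and assuming plain `encodeURIComponent`/`decodeURIComponent` are sufficient for
+the data involved), a working pair looks like this; unlike the examples that
+follow, its output is shown fully URI encoded:
+
+```javascript
+var assert = require('assert');
+var qs = require('qs');
+
+// The custom encoder is invoked once per key and once per value;
+// whatever string it returns is used verbatim in the output.
+var encoded = qs.stringify({ a: { b: 'c d' } }, {
+    encoder: function (str) {
+        return encodeURIComponent(str); // e.g. 'a[b]' -> 'a%5Bb%5D'
+    }
+});
+assert.equal(encoded, 'a%5Bb%5D=c%20d');
+
+// Likewise, the custom decoder is invoked once per key and once per value,
+// before qs interprets brackets and builds the nested result.
+var decoded = qs.parse('a%5Bb%5D=c%20d', {
+    decoder: function (str) {
+        return decodeURIComponent(str); // e.g. 'c%20d' -> 'c d'
+    }
+});
+assert.deepEqual(decoded, { a: { b: 'c d' } });
+```
+
+Unlike the built-in decoder, `decodeURIComponent` throws on malformed percent
+sequences, so a real decoder would likely wrap it in a try/catch.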
+ +When arrays are stringified, by default they are given explicit indices: + +```javascript +qs.stringify({ a: ['b', 'c', 'd'] }); +// 'a[0]=b&a[1]=c&a[2]=d' +``` + +You may override this by setting the `indices` option to `false`: + +```javascript +qs.stringify({ a: ['b', 'c', 'd'] }, { indices: false }); +// 'a=b&a=c&a=d' +``` + +You may use the `arrayFormat` option to specify the format of the output array: + +```javascript +qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'indices' }) +// 'a[0]=b&a[1]=c' +qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'brackets' }) +// 'a[]=b&a[]=c' +qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'repeat' }) +// 'a=b&a=c' +qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'comma' }) +// 'a=b,c' +``` + +When objects are stringified, by default they use bracket notation: + +```javascript +qs.stringify({ a: { b: { c: 'd', e: 'f' } } }); +// 'a[b][c]=d&a[b][e]=f' +``` + +You may override this to use dot notation by setting the `allowDots` option to `true`: + +```javascript +qs.stringify({ a: { b: { c: 'd', e: 'f' } } }, { allowDots: true }); +// 'a.b.c=d&a.b.e=f' +``` + +Empty strings and null values will omit the value, but the equals sign (=) remains in place: + +```javascript +assert.equal(qs.stringify({ a: '' }), 'a='); +``` + +Key with no values (such as an empty object or array) will return nothing: + +```javascript +assert.equal(qs.stringify({ a: [] }), ''); +assert.equal(qs.stringify({ a: {} }), ''); +assert.equal(qs.stringify({ a: [{}] }), ''); +assert.equal(qs.stringify({ a: { b: []} }), ''); +assert.equal(qs.stringify({ a: { b: {}} }), ''); +``` + +Properties that are set to `undefined` will be omitted entirely: + +```javascript +assert.equal(qs.stringify({ a: null, b: undefined }), 'a='); +``` + +The query string may optionally be prepended with a question mark: + +```javascript +assert.equal(qs.stringify({ a: 'b', c: 'd' }, { addQueryPrefix: true }), '?a=b&c=d'); +``` + +The delimiter may be overridden with stringify as well: + +```javascript +assert.equal(qs.stringify({ a: 'b', c: 'd' }, { delimiter: ';' }), 'a=b;c=d'); +``` + +If you only want to override the serialization of `Date` objects, you can provide a `serializeDate` option: + +```javascript +var date = new Date(7); +assert.equal(qs.stringify({ a: date }), 'a=1970-01-01T00:00:00.007Z'.replace(/:/g, '%3A')); +assert.equal( + qs.stringify({ a: date }, { serializeDate: function (d) { return d.getTime(); } }), + 'a=7' +); +``` + +You may use the `sort` option to affect the order of parameter keys: + +```javascript +function alphabeticalSort(a, b) { + return a.localeCompare(b); +} +assert.equal(qs.stringify({ a: 'c', z: 'y', b : 'f' }, { sort: alphabeticalSort }), 'a=c&b=f&z=y'); +``` + +Finally, you can use the `filter` option to restrict which keys will be included in the stringified output. +If you pass a function, it will be called for each key to obtain the replacement value. Otherwise, if you +pass an array, it will be used to select properties and array indices for stringification: + +```javascript +function filterFunc(prefix, value) { + if (prefix == 'b') { + // Return an `undefined` value to omit a property. 
+ return; + } + if (prefix == 'e[f]') { + return value.getTime(); + } + if (prefix == 'e[g][0]') { + return value * 2; + } + return value; +} +qs.stringify({ a: 'b', c: 'd', e: { f: new Date(123), g: [2] } }, { filter: filterFunc }); +// 'a=b&c=d&e[f]=123&e[g][0]=4' +qs.stringify({ a: 'b', c: 'd', e: 'f' }, { filter: ['a', 'e'] }); +// 'a=b&e=f' +qs.stringify({ a: ['b', 'c', 'd'], e: 'f' }, { filter: ['a', 0, 2] }); +// 'a[0]=b&a[2]=d' +``` + +### Handling of `null` values + +By default, `null` values are treated like empty strings: + +```javascript +var withNull = qs.stringify({ a: null, b: '' }); +assert.equal(withNull, 'a=&b='); +``` + +Parsing does not distinguish between parameters with and without equal signs. Both are converted to empty strings. + +```javascript +var equalsInsensitive = qs.parse('a&b='); +assert.deepEqual(equalsInsensitive, { a: '', b: '' }); +``` + +To distinguish between `null` values and empty strings use the `strictNullHandling` flag. In the result string the `null` +values have no `=` sign: + +```javascript +var strictNull = qs.stringify({ a: null, b: '' }, { strictNullHandling: true }); +assert.equal(strictNull, 'a&b='); +``` + +To parse values without `=` back to `null` use the `strictNullHandling` flag: + +```javascript +var parsedStrictNull = qs.parse('a&b=', { strictNullHandling: true }); +assert.deepEqual(parsedStrictNull, { a: null, b: '' }); +``` + +To completely skip rendering keys with `null` values, use the `skipNulls` flag: + +```javascript +var nullsSkipped = qs.stringify({ a: 'b', c: null}, { skipNulls: true }); +assert.equal(nullsSkipped, 'a=b'); +``` + +If you're communicating with legacy systems, you can switch to `iso-8859-1` +using the `charset` option: + +```javascript +var iso = qs.stringify({ æ: 'æ' }, { charset: 'iso-8859-1' }); +assert.equal(iso, '%E6=%E6'); +``` + +Characters that don't exist in `iso-8859-1` will be converted to numeric +entities, similar to what browsers do: + +```javascript +var numeric = qs.stringify({ a: '☺' }, { charset: 'iso-8859-1' }); +assert.equal(numeric, 'a=%26%239786%3B'); +``` + +You can use the `charsetSentinel` option to announce the character by +including an `utf8=✓` parameter with the proper encoding if the checkmark, +similar to what Ruby on Rails and others do when submitting forms. + +```javascript +var sentinel = qs.stringify({ a: '☺' }, { charsetSentinel: true }); +assert.equal(sentinel, 'utf8=%E2%9C%93&a=%E2%98%BA'); + +var isoSentinel = qs.stringify({ a: 'æ' }, { charsetSentinel: true, charset: 'iso-8859-1' }); +assert.equal(isoSentinel, 'utf8=%26%2310003%3B&a=%E6'); +``` + +### Dealing with special character sets + +By default the encoding and decoding of characters is done in `utf-8`, +and `iso-8859-1` support is also built in via the `charset` parameter. + +If you wish to encode querystrings to a different character set (i.e. +[Shift JIS](https://en.wikipedia.org/wiki/Shift_JIS)) you can use the +[`qs-iconv`](https://github.com/martinheidegger/qs-iconv) library: + +```javascript +var encoder = require('qs-iconv/encoder')('shift_jis'); +var shiftJISEncoded = qs.stringify({ a: 'こんにちは!' }, { encoder: encoder }); +assert.equal(shiftJISEncoded, 'a=%82%B1%82%F1%82%C9%82%BF%82%CD%81I'); +``` + +This also works for decoding of query strings: + +```javascript +var decoder = require('qs-iconv/decoder')('shift_jis'); +var obj = qs.parse('a=%82%B1%82%F1%82%C9%82%BF%82%CD%81I', { decoder: decoder }); +assert.deepEqual(obj, { a: 'こんにちは!' 
}); +``` + +### RFC 3986 and RFC 1738 space encoding + +RFC3986 used as default option and encodes ' ' to *%20* which is backward compatible. +In the same time, output can be stringified as per RFC1738 with ' ' equal to '+'. + +``` +assert.equal(qs.stringify({ a: 'b c' }), 'a=b%20c'); +assert.equal(qs.stringify({ a: 'b c' }, { format : 'RFC3986' }), 'a=b%20c'); +assert.equal(qs.stringify({ a: 'b c' }, { format : 'RFC1738' }), 'a=b+c'); +``` + +[1]: https://npmjs.org/package/qs +[2]: http://versionbadg.es/ljharb/qs.svg +[3]: https://api.travis-ci.org/ljharb/qs.svg +[4]: https://travis-ci.org/ljharb/qs +[5]: https://david-dm.org/ljharb/qs.svg +[6]: https://david-dm.org/ljharb/qs +[7]: https://david-dm.org/ljharb/qs/dev-status.svg +[8]: https://david-dm.org/ljharb/qs?type=dev +[9]: https://ci.testling.com/ljharb/qs.png +[10]: https://ci.testling.com/ljharb/qs +[11]: https://nodei.co/npm/qs.png?downloads=true&stars=true +[license-image]: http://img.shields.io/npm/l/qs.svg +[license-url]: LICENSE +[downloads-image]: http://img.shields.io/npm/dm/qs.svg +[downloads-url]: http://npm-stat.com/charts.html?package=qs diff --git a/node_modules/qs/dist/qs.js b/node_modules/qs/dist/qs.js new file mode 100644 index 000000000..17f4e600c --- /dev/null +++ b/node_modules/qs/dist/qs.js @@ -0,0 +1,782 @@ +(function(f){if(typeof exports==="object"&&typeof module!=="undefined"){module.exports=f()}else if(typeof define==="function"&&define.amd){define([],f)}else{var g;if(typeof window!=="undefined"){g=window}else if(typeof global!=="undefined"){g=global}else if(typeof self!=="undefined"){g=self}else{g=this}g.Qs = f()}})(function(){var define,module,exports;return (function(){function r(e,n,t){function o(i,f){if(!n[i]){if(!e[i]){var c="function"==typeof require&&require;if(!f&&c)return c(i,!0);if(u)return u(i,!0);var a=new Error("Cannot find module '"+i+"'");throw a.code="MODULE_NOT_FOUND",a}var p=n[i]={exports:{}};e[i][0].call(p.exports,function(r){var n=e[i][1][r];return o(n||r)},p,p.exports,r,e,n,t)}return n[i].exports}for(var u="function"==typeof require&&require,i=0;i -1) { + val = val.split(','); + } + + if (has.call(obj, key)) { + obj[key] = utils.combine(obj[key], val); + } else { + obj[key] = val; + } + } + + return obj; +}; + +var parseObject = function (chain, val, options) { + var leaf = val; + + for (var i = chain.length - 1; i >= 0; --i) { + var obj; + var root = chain[i]; + + if (root === '[]' && options.parseArrays) { + obj = [].concat(leaf); + } else { + obj = options.plainObjects ? Object.create(null) : {}; + var cleanRoot = root.charAt(0) === '[' && root.charAt(root.length - 1) === ']' ? root.slice(1, -1) : root; + var index = parseInt(cleanRoot, 10); + if (!options.parseArrays && cleanRoot === '') { + obj = { 0: leaf }; + } else if ( + !isNaN(index) + && root !== cleanRoot + && String(index) === cleanRoot + && index >= 0 + && (options.parseArrays && index <= options.arrayLimit) + ) { + obj = []; + obj[index] = leaf; + } else { + obj[cleanRoot] = leaf; + } + } + + leaf = obj; + } + + return leaf; +}; + +var parseKeys = function parseQueryStringKeys(givenKey, val, options) { + if (!givenKey) { + return; + } + + // Transform dot notation to bracket notation + var key = options.allowDots ? givenKey.replace(/\.([^.[]+)/g, '[$1]') : givenKey; + + // The regex chunks + + var brackets = /(\[[^[\]]*])/; + var child = /(\[[^[\]]*])/g; + + // Get the parent + + var segment = brackets.exec(key); + var parent = segment ? 
key.slice(0, segment.index) : key; + + // Stash the parent if it exists + + var keys = []; + if (parent) { + // If we aren't using plain objects, optionally prefix keys that would overwrite object prototype properties + if (!options.plainObjects && has.call(Object.prototype, parent)) { + if (!options.allowPrototypes) { + return; + } + } + + keys.push(parent); + } + + // Loop through children appending to the array until we hit depth + + var i = 0; + while ((segment = child.exec(key)) !== null && i < options.depth) { + i += 1; + if (!options.plainObjects && has.call(Object.prototype, segment[1].slice(1, -1))) { + if (!options.allowPrototypes) { + return; + } + } + keys.push(segment[1]); + } + + // If there's a remainder, just add whatever is left + + if (segment) { + keys.push('[' + key.slice(segment.index) + ']'); + } + + return parseObject(keys, val, options); +}; + +var normalizeParseOptions = function normalizeParseOptions(opts) { + if (!opts) { + return defaults; + } + + if (opts.decoder !== null && opts.decoder !== undefined && typeof opts.decoder !== 'function') { + throw new TypeError('Decoder has to be a function.'); + } + + if (typeof opts.charset !== 'undefined' && opts.charset !== 'utf-8' && opts.charset !== 'iso-8859-1') { + throw new Error('The charset option must be either utf-8, iso-8859-1, or undefined'); + } + var charset = typeof opts.charset === 'undefined' ? defaults.charset : opts.charset; + + return { + allowDots: typeof opts.allowDots === 'undefined' ? defaults.allowDots : !!opts.allowDots, + allowPrototypes: typeof opts.allowPrototypes === 'boolean' ? opts.allowPrototypes : defaults.allowPrototypes, + arrayLimit: typeof opts.arrayLimit === 'number' ? opts.arrayLimit : defaults.arrayLimit, + charset: charset, + charsetSentinel: typeof opts.charsetSentinel === 'boolean' ? opts.charsetSentinel : defaults.charsetSentinel, + comma: typeof opts.comma === 'boolean' ? opts.comma : defaults.comma, + decoder: typeof opts.decoder === 'function' ? opts.decoder : defaults.decoder, + delimiter: typeof opts.delimiter === 'string' || utils.isRegExp(opts.delimiter) ? opts.delimiter : defaults.delimiter, + depth: typeof opts.depth === 'number' ? opts.depth : defaults.depth, + ignoreQueryPrefix: opts.ignoreQueryPrefix === true, + interpretNumericEntities: typeof opts.interpretNumericEntities === 'boolean' ? opts.interpretNumericEntities : defaults.interpretNumericEntities, + parameterLimit: typeof opts.parameterLimit === 'number' ? opts.parameterLimit : defaults.parameterLimit, + parseArrays: opts.parseArrays !== false, + plainObjects: typeof opts.plainObjects === 'boolean' ? opts.plainObjects : defaults.plainObjects, + strictNullHandling: typeof opts.strictNullHandling === 'boolean' ? opts.strictNullHandling : defaults.strictNullHandling + }; +}; + +module.exports = function (str, opts) { + var options = normalizeParseOptions(opts); + + if (str === '' || str === null || typeof str === 'undefined') { + return options.plainObjects ? Object.create(null) : {}; + } + + var tempObj = typeof str === 'string' ? parseValues(str, options) : str; + var obj = options.plainObjects ? 
Object.create(null) : {}; + + // Iterate over the keys and setup the new object + + var keys = Object.keys(tempObj); + for (var i = 0; i < keys.length; ++i) { + var key = keys[i]; + var newObj = parseKeys(key, tempObj[key], options); + obj = utils.merge(obj, newObj, options); + } + + return utils.compact(obj); +}; + +},{"./utils":5}],4:[function(require,module,exports){ +'use strict'; + +var utils = require('./utils'); +var formats = require('./formats'); +var has = Object.prototype.hasOwnProperty; + +var arrayPrefixGenerators = { + brackets: function brackets(prefix) { // eslint-disable-line func-name-matching + return prefix + '[]'; + }, + comma: 'comma', + indices: function indices(prefix, key) { // eslint-disable-line func-name-matching + return prefix + '[' + key + ']'; + }, + repeat: function repeat(prefix) { // eslint-disable-line func-name-matching + return prefix; + } +}; + +var isArray = Array.isArray; +var push = Array.prototype.push; +var pushToArray = function (arr, valueOrArray) { + push.apply(arr, isArray(valueOrArray) ? valueOrArray : [valueOrArray]); +}; + +var toISO = Date.prototype.toISOString; + +var defaults = { + addQueryPrefix: false, + allowDots: false, + charset: 'utf-8', + charsetSentinel: false, + delimiter: '&', + encode: true, + encoder: utils.encode, + encodeValuesOnly: false, + formatter: formats.formatters[formats['default']], + // deprecated + indices: false, + serializeDate: function serializeDate(date) { // eslint-disable-line func-name-matching + return toISO.call(date); + }, + skipNulls: false, + strictNullHandling: false +}; + +var stringify = function stringify( // eslint-disable-line func-name-matching + object, + prefix, + generateArrayPrefix, + strictNullHandling, + skipNulls, + encoder, + filter, + sort, + allowDots, + serializeDate, + formatter, + encodeValuesOnly, + charset +) { + var obj = object; + if (typeof filter === 'function') { + obj = filter(prefix, obj); + } else if (obj instanceof Date) { + obj = serializeDate(obj); + } else if (generateArrayPrefix === 'comma' && isArray(obj)) { + obj = obj.join(','); + } + + if (obj === null) { + if (strictNullHandling) { + return encoder && !encodeValuesOnly ? encoder(prefix, defaults.encoder, charset) : prefix; + } + + obj = ''; + } + + if (typeof obj === 'string' || typeof obj === 'number' || typeof obj === 'boolean' || utils.isBuffer(obj)) { + if (encoder) { + var keyValue = encodeValuesOnly ? prefix : encoder(prefix, defaults.encoder, charset); + return [formatter(keyValue) + '=' + formatter(encoder(obj, defaults.encoder, charset))]; + } + return [formatter(prefix) + '=' + formatter(String(obj))]; + } + + var values = []; + + if (typeof obj === 'undefined') { + return values; + } + + var objKeys; + if (isArray(filter)) { + objKeys = filter; + } else { + var keys = Object.keys(obj); + objKeys = sort ? keys.sort(sort) : keys; + } + + for (var i = 0; i < objKeys.length; ++i) { + var key = objKeys[i]; + + if (skipNulls && obj[key] === null) { + continue; + } + + if (isArray(obj)) { + pushToArray(values, stringify( + obj[key], + typeof generateArrayPrefix === 'function' ? generateArrayPrefix(prefix, key) : prefix, + generateArrayPrefix, + strictNullHandling, + skipNulls, + encoder, + filter, + sort, + allowDots, + serializeDate, + formatter, + encodeValuesOnly, + charset + )); + } else { + pushToArray(values, stringify( + obj[key], + prefix + (allowDots ? '.' 
+ key : '[' + key + ']'), + generateArrayPrefix, + strictNullHandling, + skipNulls, + encoder, + filter, + sort, + allowDots, + serializeDate, + formatter, + encodeValuesOnly, + charset + )); + } + } + + return values; +}; + +var normalizeStringifyOptions = function normalizeStringifyOptions(opts) { + if (!opts) { + return defaults; + } + + if (opts.encoder !== null && opts.encoder !== undefined && typeof opts.encoder !== 'function') { + throw new TypeError('Encoder has to be a function.'); + } + + var charset = opts.charset || defaults.charset; + if (typeof opts.charset !== 'undefined' && opts.charset !== 'utf-8' && opts.charset !== 'iso-8859-1') { + throw new TypeError('The charset option must be either utf-8, iso-8859-1, or undefined'); + } + + var format = formats['default']; + if (typeof opts.format !== 'undefined') { + if (!has.call(formats.formatters, opts.format)) { + throw new TypeError('Unknown format option provided.'); + } + format = opts.format; + } + var formatter = formats.formatters[format]; + + var filter = defaults.filter; + if (typeof opts.filter === 'function' || isArray(opts.filter)) { + filter = opts.filter; + } + + return { + addQueryPrefix: typeof opts.addQueryPrefix === 'boolean' ? opts.addQueryPrefix : defaults.addQueryPrefix, + allowDots: typeof opts.allowDots === 'undefined' ? defaults.allowDots : !!opts.allowDots, + charset: charset, + charsetSentinel: typeof opts.charsetSentinel === 'boolean' ? opts.charsetSentinel : defaults.charsetSentinel, + delimiter: typeof opts.delimiter === 'undefined' ? defaults.delimiter : opts.delimiter, + encode: typeof opts.encode === 'boolean' ? opts.encode : defaults.encode, + encoder: typeof opts.encoder === 'function' ? opts.encoder : defaults.encoder, + encodeValuesOnly: typeof opts.encodeValuesOnly === 'boolean' ? opts.encodeValuesOnly : defaults.encodeValuesOnly, + filter: filter, + formatter: formatter, + serializeDate: typeof opts.serializeDate === 'function' ? opts.serializeDate : defaults.serializeDate, + skipNulls: typeof opts.skipNulls === 'boolean' ? opts.skipNulls : defaults.skipNulls, + sort: typeof opts.sort === 'function' ? opts.sort : null, + strictNullHandling: typeof opts.strictNullHandling === 'boolean' ? opts.strictNullHandling : defaults.strictNullHandling + }; +}; + +module.exports = function (object, opts) { + var obj = object; + var options = normalizeStringifyOptions(opts); + + var objKeys; + var filter; + + if (typeof options.filter === 'function') { + filter = options.filter; + obj = filter('', obj); + } else if (isArray(options.filter)) { + filter = options.filter; + objKeys = filter; + } + + var keys = []; + + if (typeof obj !== 'object' || obj === null) { + return ''; + } + + var arrayFormat; + if (opts && opts.arrayFormat in arrayPrefixGenerators) { + arrayFormat = opts.arrayFormat; + } else if (opts && 'indices' in opts) { + arrayFormat = opts.indices ? 'indices' : 'repeat'; + } else { + arrayFormat = 'indices'; + } + + var generateArrayPrefix = arrayPrefixGenerators[arrayFormat]; + + if (!objKeys) { + objKeys = Object.keys(obj); + } + + if (options.sort) { + objKeys.sort(options.sort); + } + + for (var i = 0; i < objKeys.length; ++i) { + var key = objKeys[i]; + + if (options.skipNulls && obj[key] === null) { + continue; + } + pushToArray(keys, stringify( + obj[key], + key, + generateArrayPrefix, + options.strictNullHandling, + options.skipNulls, + options.encode ? 
options.encoder : null, + options.filter, + options.sort, + options.allowDots, + options.serializeDate, + options.formatter, + options.encodeValuesOnly, + options.charset + )); + } + + var joined = keys.join(options.delimiter); + var prefix = options.addQueryPrefix === true ? '?' : ''; + + if (options.charsetSentinel) { + if (options.charset === 'iso-8859-1') { + // encodeURIComponent('✓'), the "numeric entity" representation of a checkmark + prefix += 'utf8=%26%2310003%3B&'; + } else { + // encodeURIComponent('✓') + prefix += 'utf8=%E2%9C%93&'; + } + } + + return joined.length > 0 ? prefix + joined : ''; +}; + +},{"./formats":1,"./utils":5}],5:[function(require,module,exports){ +'use strict'; + +var has = Object.prototype.hasOwnProperty; +var isArray = Array.isArray; + +var hexTable = (function () { + var array = []; + for (var i = 0; i < 256; ++i) { + array.push('%' + ((i < 16 ? '0' : '') + i.toString(16)).toUpperCase()); + } + + return array; +}()); + +var compactQueue = function compactQueue(queue) { + while (queue.length > 1) { + var item = queue.pop(); + var obj = item.obj[item.prop]; + + if (isArray(obj)) { + var compacted = []; + + for (var j = 0; j < obj.length; ++j) { + if (typeof obj[j] !== 'undefined') { + compacted.push(obj[j]); + } + } + + item.obj[item.prop] = compacted; + } + } +}; + +var arrayToObject = function arrayToObject(source, options) { + var obj = options && options.plainObjects ? Object.create(null) : {}; + for (var i = 0; i < source.length; ++i) { + if (typeof source[i] !== 'undefined') { + obj[i] = source[i]; + } + } + + return obj; +}; + +var merge = function merge(target, source, options) { + if (!source) { + return target; + } + + if (typeof source !== 'object') { + if (isArray(target)) { + target.push(source); + } else if (target && typeof target === 'object') { + if ((options && (options.plainObjects || options.allowPrototypes)) || !has.call(Object.prototype, source)) { + target[source] = true; + } + } else { + return [target, source]; + } + + return target; + } + + if (!target || typeof target !== 'object') { + return [target].concat(source); + } + + var mergeTarget = target; + if (isArray(target) && !isArray(source)) { + mergeTarget = arrayToObject(target, options); + } + + if (isArray(target) && isArray(source)) { + source.forEach(function (item, i) { + if (has.call(target, i)) { + var targetItem = target[i]; + if (targetItem && typeof targetItem === 'object' && item && typeof item === 'object') { + target[i] = merge(targetItem, item, options); + } else { + target.push(item); + } + } else { + target[i] = item; + } + }); + return target; + } + + return Object.keys(source).reduce(function (acc, key) { + var value = source[key]; + + if (has.call(acc, key)) { + acc[key] = merge(acc[key], value, options); + } else { + acc[key] = value; + } + return acc; + }, mergeTarget); +}; + +var assign = function assignSingleSource(target, source) { + return Object.keys(source).reduce(function (acc, key) { + acc[key] = source[key]; + return acc; + }, target); +}; + +var decode = function (str, decoder, charset) { + var strWithoutPlus = str.replace(/\+/g, ' '); + if (charset === 'iso-8859-1') { + // unescape never throws, no try...catch needed: + return strWithoutPlus.replace(/%[0-9a-f]{2}/gi, unescape); + } + // utf-8 + try { + return decodeURIComponent(strWithoutPlus); + } catch (e) { + return strWithoutPlus; + } +}; + +var encode = function encode(str, defaultEncoder, charset) { + // This code was originally written by Brian White (mscdex) for the io.js core 
querystring library. + // It has been adapted here for stricter adherence to RFC 3986 + if (str.length === 0) { + return str; + } + + var string = typeof str === 'string' ? str : String(str); + + if (charset === 'iso-8859-1') { + return escape(string).replace(/%u[0-9a-f]{4}/gi, function ($0) { + return '%26%23' + parseInt($0.slice(2), 16) + '%3B'; + }); + } + + var out = ''; + for (var i = 0; i < string.length; ++i) { + var c = string.charCodeAt(i); + + if ( + c === 0x2D // - + || c === 0x2E // . + || c === 0x5F // _ + || c === 0x7E // ~ + || (c >= 0x30 && c <= 0x39) // 0-9 + || (c >= 0x41 && c <= 0x5A) // a-z + || (c >= 0x61 && c <= 0x7A) // A-Z + ) { + out += string.charAt(i); + continue; + } + + if (c < 0x80) { + out = out + hexTable[c]; + continue; + } + + if (c < 0x800) { + out = out + (hexTable[0xC0 | (c >> 6)] + hexTable[0x80 | (c & 0x3F)]); + continue; + } + + if (c < 0xD800 || c >= 0xE000) { + out = out + (hexTable[0xE0 | (c >> 12)] + hexTable[0x80 | ((c >> 6) & 0x3F)] + hexTable[0x80 | (c & 0x3F)]); + continue; + } + + i += 1; + c = 0x10000 + (((c & 0x3FF) << 10) | (string.charCodeAt(i) & 0x3FF)); + out += hexTable[0xF0 | (c >> 18)] + + hexTable[0x80 | ((c >> 12) & 0x3F)] + + hexTable[0x80 | ((c >> 6) & 0x3F)] + + hexTable[0x80 | (c & 0x3F)]; + } + + return out; +}; + +var compact = function compact(value) { + var queue = [{ obj: { o: value }, prop: 'o' }]; + var refs = []; + + for (var i = 0; i < queue.length; ++i) { + var item = queue[i]; + var obj = item.obj[item.prop]; + + var keys = Object.keys(obj); + for (var j = 0; j < keys.length; ++j) { + var key = keys[j]; + var val = obj[key]; + if (typeof val === 'object' && val !== null && refs.indexOf(val) === -1) { + queue.push({ obj: obj, prop: key }); + refs.push(val); + } + } + } + + compactQueue(queue); + + return value; +}; + +var isRegExp = function isRegExp(obj) { + return Object.prototype.toString.call(obj) === '[object RegExp]'; +}; + +var isBuffer = function isBuffer(obj) { + if (!obj || typeof obj !== 'object') { + return false; + } + + return !!(obj.constructor && obj.constructor.isBuffer && obj.constructor.isBuffer(obj)); +}; + +var combine = function combine(a, b) { + return [].concat(a, b); +}; + +module.exports = { + arrayToObject: arrayToObject, + assign: assign, + combine: combine, + compact: compact, + decode: decode, + encode: encode, + isBuffer: isBuffer, + isRegExp: isRegExp, + merge: merge +}; + +},{}]},{},[2])(2) +}); diff --git a/node_modules/qs/lib/formats.js b/node_modules/qs/lib/formats.js new file mode 100644 index 000000000..df4599752 --- /dev/null +++ b/node_modules/qs/lib/formats.js @@ -0,0 +1,18 @@ +'use strict'; + +var replace = String.prototype.replace; +var percentTwenties = /%20/g; + +module.exports = { + 'default': 'RFC3986', + formatters: { + RFC1738: function (value) { + return replace.call(value, percentTwenties, '+'); + }, + RFC3986: function (value) { + return value; + } + }, + RFC1738: 'RFC1738', + RFC3986: 'RFC3986' +}; diff --git a/node_modules/qs/lib/index.js b/node_modules/qs/lib/index.js new file mode 100644 index 000000000..0d6a97dcf --- /dev/null +++ b/node_modules/qs/lib/index.js @@ -0,0 +1,11 @@ +'use strict'; + +var stringify = require('./stringify'); +var parse = require('./parse'); +var formats = require('./formats'); + +module.exports = { + formats: formats, + parse: parse, + stringify: stringify +}; diff --git a/node_modules/qs/lib/parse.js b/node_modules/qs/lib/parse.js new file mode 100644 index 000000000..d81628b57 --- /dev/null +++ b/node_modules/qs/lib/parse.js @@ -0,0 
+1,242 @@ +'use strict'; + +var utils = require('./utils'); + +var has = Object.prototype.hasOwnProperty; + +var defaults = { + allowDots: false, + allowPrototypes: false, + arrayLimit: 20, + charset: 'utf-8', + charsetSentinel: false, + comma: false, + decoder: utils.decode, + delimiter: '&', + depth: 5, + ignoreQueryPrefix: false, + interpretNumericEntities: false, + parameterLimit: 1000, + parseArrays: true, + plainObjects: false, + strictNullHandling: false +}; + +var interpretNumericEntities = function (str) { + return str.replace(/&#(\d+);/g, function ($0, numberStr) { + return String.fromCharCode(parseInt(numberStr, 10)); + }); +}; + +// This is what browsers will submit when the ✓ character occurs in an +// application/x-www-form-urlencoded body and the encoding of the page containing +// the form is iso-8859-1, or when the submitted form has an accept-charset +// attribute of iso-8859-1. Presumably also with other charsets that do not contain +// the ✓ character, such as us-ascii. +var isoSentinel = 'utf8=%26%2310003%3B'; // encodeURIComponent('✓') + +// These are the percent-encoded utf-8 octets representing a checkmark, indicating that the request actually is utf-8 encoded. +var charsetSentinel = 'utf8=%E2%9C%93'; // encodeURIComponent('✓') + +var parseValues = function parseQueryStringValues(str, options) { + var obj = {}; + var cleanStr = options.ignoreQueryPrefix ? str.replace(/^\?/, '') : str; + var limit = options.parameterLimit === Infinity ? undefined : options.parameterLimit; + var parts = cleanStr.split(options.delimiter, limit); + var skipIndex = -1; // Keep track of where the utf8 sentinel was found + var i; + + var charset = options.charset; + if (options.charsetSentinel) { + for (i = 0; i < parts.length; ++i) { + if (parts[i].indexOf('utf8=') === 0) { + if (parts[i] === charsetSentinel) { + charset = 'utf-8'; + } else if (parts[i] === isoSentinel) { + charset = 'iso-8859-1'; + } + skipIndex = i; + i = parts.length; // The eslint settings do not allow break; + } + } + } + + for (i = 0; i < parts.length; ++i) { + if (i === skipIndex) { + continue; + } + var part = parts[i]; + + var bracketEqualsPos = part.indexOf(']='); + var pos = bracketEqualsPos === -1 ? part.indexOf('=') : bracketEqualsPos + 1; + + var key, val; + if (pos === -1) { + key = options.decoder(part, defaults.decoder, charset); + val = options.strictNullHandling ? null : ''; + } else { + key = options.decoder(part.slice(0, pos), defaults.decoder, charset); + val = options.decoder(part.slice(pos + 1), defaults.decoder, charset); + } + + if (val && options.interpretNumericEntities && charset === 'iso-8859-1') { + val = interpretNumericEntities(val); + } + + if (val && options.comma && val.indexOf(',') > -1) { + val = val.split(','); + } + + if (has.call(obj, key)) { + obj[key] = utils.combine(obj[key], val); + } else { + obj[key] = val; + } + } + + return obj; +}; + +var parseObject = function (chain, val, options) { + var leaf = val; + + for (var i = chain.length - 1; i >= 0; --i) { + var obj; + var root = chain[i]; + + if (root === '[]' && options.parseArrays) { + obj = [].concat(leaf); + } else { + obj = options.plainObjects ? Object.create(null) : {}; + var cleanRoot = root.charAt(0) === '[' && root.charAt(root.length - 1) === ']' ? 
root.slice(1, -1) : root; + var index = parseInt(cleanRoot, 10); + if (!options.parseArrays && cleanRoot === '') { + obj = { 0: leaf }; + } else if ( + !isNaN(index) + && root !== cleanRoot + && String(index) === cleanRoot + && index >= 0 + && (options.parseArrays && index <= options.arrayLimit) + ) { + obj = []; + obj[index] = leaf; + } else { + obj[cleanRoot] = leaf; + } + } + + leaf = obj; + } + + return leaf; +}; + +var parseKeys = function parseQueryStringKeys(givenKey, val, options) { + if (!givenKey) { + return; + } + + // Transform dot notation to bracket notation + var key = options.allowDots ? givenKey.replace(/\.([^.[]+)/g, '[$1]') : givenKey; + + // The regex chunks + + var brackets = /(\[[^[\]]*])/; + var child = /(\[[^[\]]*])/g; + + // Get the parent + + var segment = brackets.exec(key); + var parent = segment ? key.slice(0, segment.index) : key; + + // Stash the parent if it exists + + var keys = []; + if (parent) { + // If we aren't using plain objects, optionally prefix keys that would overwrite object prototype properties + if (!options.plainObjects && has.call(Object.prototype, parent)) { + if (!options.allowPrototypes) { + return; + } + } + + keys.push(parent); + } + + // Loop through children appending to the array until we hit depth + + var i = 0; + while ((segment = child.exec(key)) !== null && i < options.depth) { + i += 1; + if (!options.plainObjects && has.call(Object.prototype, segment[1].slice(1, -1))) { + if (!options.allowPrototypes) { + return; + } + } + keys.push(segment[1]); + } + + // If there's a remainder, just add whatever is left + + if (segment) { + keys.push('[' + key.slice(segment.index) + ']'); + } + + return parseObject(keys, val, options); +}; + +var normalizeParseOptions = function normalizeParseOptions(opts) { + if (!opts) { + return defaults; + } + + if (opts.decoder !== null && opts.decoder !== undefined && typeof opts.decoder !== 'function') { + throw new TypeError('Decoder has to be a function.'); + } + + if (typeof opts.charset !== 'undefined' && opts.charset !== 'utf-8' && opts.charset !== 'iso-8859-1') { + throw new Error('The charset option must be either utf-8, iso-8859-1, or undefined'); + } + var charset = typeof opts.charset === 'undefined' ? defaults.charset : opts.charset; + + return { + allowDots: typeof opts.allowDots === 'undefined' ? defaults.allowDots : !!opts.allowDots, + allowPrototypes: typeof opts.allowPrototypes === 'boolean' ? opts.allowPrototypes : defaults.allowPrototypes, + arrayLimit: typeof opts.arrayLimit === 'number' ? opts.arrayLimit : defaults.arrayLimit, + charset: charset, + charsetSentinel: typeof opts.charsetSentinel === 'boolean' ? opts.charsetSentinel : defaults.charsetSentinel, + comma: typeof opts.comma === 'boolean' ? opts.comma : defaults.comma, + decoder: typeof opts.decoder === 'function' ? opts.decoder : defaults.decoder, + delimiter: typeof opts.delimiter === 'string' || utils.isRegExp(opts.delimiter) ? opts.delimiter : defaults.delimiter, + depth: typeof opts.depth === 'number' ? opts.depth : defaults.depth, + ignoreQueryPrefix: opts.ignoreQueryPrefix === true, + interpretNumericEntities: typeof opts.interpretNumericEntities === 'boolean' ? opts.interpretNumericEntities : defaults.interpretNumericEntities, + parameterLimit: typeof opts.parameterLimit === 'number' ? opts.parameterLimit : defaults.parameterLimit, + parseArrays: opts.parseArrays !== false, + plainObjects: typeof opts.plainObjects === 'boolean' ? 
opts.plainObjects : defaults.plainObjects, + strictNullHandling: typeof opts.strictNullHandling === 'boolean' ? opts.strictNullHandling : defaults.strictNullHandling + }; +}; + +module.exports = function (str, opts) { + var options = normalizeParseOptions(opts); + + if (str === '' || str === null || typeof str === 'undefined') { + return options.plainObjects ? Object.create(null) : {}; + } + + var tempObj = typeof str === 'string' ? parseValues(str, options) : str; + var obj = options.plainObjects ? Object.create(null) : {}; + + // Iterate over the keys and setup the new object + + var keys = Object.keys(tempObj); + for (var i = 0; i < keys.length; ++i) { + var key = keys[i]; + var newObj = parseKeys(key, tempObj[key], options); + obj = utils.merge(obj, newObj, options); + } + + return utils.compact(obj); +}; diff --git a/node_modules/qs/lib/stringify.js b/node_modules/qs/lib/stringify.js new file mode 100644 index 000000000..7455049ca --- /dev/null +++ b/node_modules/qs/lib/stringify.js @@ -0,0 +1,269 @@ +'use strict'; + +var utils = require('./utils'); +var formats = require('./formats'); +var has = Object.prototype.hasOwnProperty; + +var arrayPrefixGenerators = { + brackets: function brackets(prefix) { // eslint-disable-line func-name-matching + return prefix + '[]'; + }, + comma: 'comma', + indices: function indices(prefix, key) { // eslint-disable-line func-name-matching + return prefix + '[' + key + ']'; + }, + repeat: function repeat(prefix) { // eslint-disable-line func-name-matching + return prefix; + } +}; + +var isArray = Array.isArray; +var push = Array.prototype.push; +var pushToArray = function (arr, valueOrArray) { + push.apply(arr, isArray(valueOrArray) ? valueOrArray : [valueOrArray]); +}; + +var toISO = Date.prototype.toISOString; + +var defaults = { + addQueryPrefix: false, + allowDots: false, + charset: 'utf-8', + charsetSentinel: false, + delimiter: '&', + encode: true, + encoder: utils.encode, + encodeValuesOnly: false, + formatter: formats.formatters[formats['default']], + // deprecated + indices: false, + serializeDate: function serializeDate(date) { // eslint-disable-line func-name-matching + return toISO.call(date); + }, + skipNulls: false, + strictNullHandling: false +}; + +var stringify = function stringify( // eslint-disable-line func-name-matching + object, + prefix, + generateArrayPrefix, + strictNullHandling, + skipNulls, + encoder, + filter, + sort, + allowDots, + serializeDate, + formatter, + encodeValuesOnly, + charset +) { + var obj = object; + if (typeof filter === 'function') { + obj = filter(prefix, obj); + } else if (obj instanceof Date) { + obj = serializeDate(obj); + } else if (generateArrayPrefix === 'comma' && isArray(obj)) { + obj = obj.join(','); + } + + if (obj === null) { + if (strictNullHandling) { + return encoder && !encodeValuesOnly ? encoder(prefix, defaults.encoder, charset) : prefix; + } + + obj = ''; + } + + if (typeof obj === 'string' || typeof obj === 'number' || typeof obj === 'boolean' || utils.isBuffer(obj)) { + if (encoder) { + var keyValue = encodeValuesOnly ? prefix : encoder(prefix, defaults.encoder, charset); + return [formatter(keyValue) + '=' + formatter(encoder(obj, defaults.encoder, charset))]; + } + return [formatter(prefix) + '=' + formatter(String(obj))]; + } + + var values = []; + + if (typeof obj === 'undefined') { + return values; + } + + var objKeys; + if (isArray(filter)) { + objKeys = filter; + } else { + var keys = Object.keys(obj); + objKeys = sort ? 
keys.sort(sort) : keys; + } + + for (var i = 0; i < objKeys.length; ++i) { + var key = objKeys[i]; + + if (skipNulls && obj[key] === null) { + continue; + } + + if (isArray(obj)) { + pushToArray(values, stringify( + obj[key], + typeof generateArrayPrefix === 'function' ? generateArrayPrefix(prefix, key) : prefix, + generateArrayPrefix, + strictNullHandling, + skipNulls, + encoder, + filter, + sort, + allowDots, + serializeDate, + formatter, + encodeValuesOnly, + charset + )); + } else { + pushToArray(values, stringify( + obj[key], + prefix + (allowDots ? '.' + key : '[' + key + ']'), + generateArrayPrefix, + strictNullHandling, + skipNulls, + encoder, + filter, + sort, + allowDots, + serializeDate, + formatter, + encodeValuesOnly, + charset + )); + } + } + + return values; +}; + +var normalizeStringifyOptions = function normalizeStringifyOptions(opts) { + if (!opts) { + return defaults; + } + + if (opts.encoder !== null && opts.encoder !== undefined && typeof opts.encoder !== 'function') { + throw new TypeError('Encoder has to be a function.'); + } + + var charset = opts.charset || defaults.charset; + if (typeof opts.charset !== 'undefined' && opts.charset !== 'utf-8' && opts.charset !== 'iso-8859-1') { + throw new TypeError('The charset option must be either utf-8, iso-8859-1, or undefined'); + } + + var format = formats['default']; + if (typeof opts.format !== 'undefined') { + if (!has.call(formats.formatters, opts.format)) { + throw new TypeError('Unknown format option provided.'); + } + format = opts.format; + } + var formatter = formats.formatters[format]; + + var filter = defaults.filter; + if (typeof opts.filter === 'function' || isArray(opts.filter)) { + filter = opts.filter; + } + + return { + addQueryPrefix: typeof opts.addQueryPrefix === 'boolean' ? opts.addQueryPrefix : defaults.addQueryPrefix, + allowDots: typeof opts.allowDots === 'undefined' ? defaults.allowDots : !!opts.allowDots, + charset: charset, + charsetSentinel: typeof opts.charsetSentinel === 'boolean' ? opts.charsetSentinel : defaults.charsetSentinel, + delimiter: typeof opts.delimiter === 'undefined' ? defaults.delimiter : opts.delimiter, + encode: typeof opts.encode === 'boolean' ? opts.encode : defaults.encode, + encoder: typeof opts.encoder === 'function' ? opts.encoder : defaults.encoder, + encodeValuesOnly: typeof opts.encodeValuesOnly === 'boolean' ? opts.encodeValuesOnly : defaults.encodeValuesOnly, + filter: filter, + formatter: formatter, + serializeDate: typeof opts.serializeDate === 'function' ? opts.serializeDate : defaults.serializeDate, + skipNulls: typeof opts.skipNulls === 'boolean' ? opts.skipNulls : defaults.skipNulls, + sort: typeof opts.sort === 'function' ? opts.sort : null, + strictNullHandling: typeof opts.strictNullHandling === 'boolean' ? opts.strictNullHandling : defaults.strictNullHandling + }; +}; + +module.exports = function (object, opts) { + var obj = object; + var options = normalizeStringifyOptions(opts); + + var objKeys; + var filter; + + if (typeof options.filter === 'function') { + filter = options.filter; + obj = filter('', obj); + } else if (isArray(options.filter)) { + filter = options.filter; + objKeys = filter; + } + + var keys = []; + + if (typeof obj !== 'object' || obj === null) { + return ''; + } + + var arrayFormat; + if (opts && opts.arrayFormat in arrayPrefixGenerators) { + arrayFormat = opts.arrayFormat; + } else if (opts && 'indices' in opts) { + arrayFormat = opts.indices ? 
'indices' : 'repeat'; + } else { + arrayFormat = 'indices'; + } + + var generateArrayPrefix = arrayPrefixGenerators[arrayFormat]; + + if (!objKeys) { + objKeys = Object.keys(obj); + } + + if (options.sort) { + objKeys.sort(options.sort); + } + + for (var i = 0; i < objKeys.length; ++i) { + var key = objKeys[i]; + + if (options.skipNulls && obj[key] === null) { + continue; + } + pushToArray(keys, stringify( + obj[key], + key, + generateArrayPrefix, + options.strictNullHandling, + options.skipNulls, + options.encode ? options.encoder : null, + options.filter, + options.sort, + options.allowDots, + options.serializeDate, + options.formatter, + options.encodeValuesOnly, + options.charset + )); + } + + var joined = keys.join(options.delimiter); + var prefix = options.addQueryPrefix === true ? '?' : ''; + + if (options.charsetSentinel) { + if (options.charset === 'iso-8859-1') { + // encodeURIComponent('✓'), the "numeric entity" representation of a checkmark + prefix += 'utf8=%26%2310003%3B&'; + } else { + // encodeURIComponent('✓') + prefix += 'utf8=%E2%9C%93&'; + } + } + + return joined.length > 0 ? prefix + joined : ''; +}; diff --git a/node_modules/qs/lib/utils.js b/node_modules/qs/lib/utils.js new file mode 100644 index 000000000..1b219cdde --- /dev/null +++ b/node_modules/qs/lib/utils.js @@ -0,0 +1,230 @@ +'use strict'; + +var has = Object.prototype.hasOwnProperty; +var isArray = Array.isArray; + +var hexTable = (function () { + var array = []; + for (var i = 0; i < 256; ++i) { + array.push('%' + ((i < 16 ? '0' : '') + i.toString(16)).toUpperCase()); + } + + return array; +}()); + +var compactQueue = function compactQueue(queue) { + while (queue.length > 1) { + var item = queue.pop(); + var obj = item.obj[item.prop]; + + if (isArray(obj)) { + var compacted = []; + + for (var j = 0; j < obj.length; ++j) { + if (typeof obj[j] !== 'undefined') { + compacted.push(obj[j]); + } + } + + item.obj[item.prop] = compacted; + } + } +}; + +var arrayToObject = function arrayToObject(source, options) { + var obj = options && options.plainObjects ? 
Object.create(null) : {}; + for (var i = 0; i < source.length; ++i) { + if (typeof source[i] !== 'undefined') { + obj[i] = source[i]; + } + } + + return obj; +}; + +var merge = function merge(target, source, options) { + if (!source) { + return target; + } + + if (typeof source !== 'object') { + if (isArray(target)) { + target.push(source); + } else if (target && typeof target === 'object') { + if ((options && (options.plainObjects || options.allowPrototypes)) || !has.call(Object.prototype, source)) { + target[source] = true; + } + } else { + return [target, source]; + } + + return target; + } + + if (!target || typeof target !== 'object') { + return [target].concat(source); + } + + var mergeTarget = target; + if (isArray(target) && !isArray(source)) { + mergeTarget = arrayToObject(target, options); + } + + if (isArray(target) && isArray(source)) { + source.forEach(function (item, i) { + if (has.call(target, i)) { + var targetItem = target[i]; + if (targetItem && typeof targetItem === 'object' && item && typeof item === 'object') { + target[i] = merge(targetItem, item, options); + } else { + target.push(item); + } + } else { + target[i] = item; + } + }); + return target; + } + + return Object.keys(source).reduce(function (acc, key) { + var value = source[key]; + + if (has.call(acc, key)) { + acc[key] = merge(acc[key], value, options); + } else { + acc[key] = value; + } + return acc; + }, mergeTarget); +}; + +var assign = function assignSingleSource(target, source) { + return Object.keys(source).reduce(function (acc, key) { + acc[key] = source[key]; + return acc; + }, target); +}; + +var decode = function (str, decoder, charset) { + var strWithoutPlus = str.replace(/\+/g, ' '); + if (charset === 'iso-8859-1') { + // unescape never throws, no try...catch needed: + return strWithoutPlus.replace(/%[0-9a-f]{2}/gi, unescape); + } + // utf-8 + try { + return decodeURIComponent(strWithoutPlus); + } catch (e) { + return strWithoutPlus; + } +}; + +var encode = function encode(str, defaultEncoder, charset) { + // This code was originally written by Brian White (mscdex) for the io.js core querystring library. + // It has been adapted here for stricter adherence to RFC 3986 + if (str.length === 0) { + return str; + } + + var string = typeof str === 'string' ? str : String(str); + + if (charset === 'iso-8859-1') { + return escape(string).replace(/%u[0-9a-f]{4}/gi, function ($0) { + return '%26%23' + parseInt($0.slice(2), 16) + '%3B'; + }); + } + + var out = ''; + for (var i = 0; i < string.length; ++i) { + var c = string.charCodeAt(i); + + if ( + c === 0x2D // - + || c === 0x2E // . 
+ || c === 0x5F // _ + || c === 0x7E // ~ + || (c >= 0x30 && c <= 0x39) // 0-9 + || (c >= 0x41 && c <= 0x5A) // a-z + || (c >= 0x61 && c <= 0x7A) // A-Z + ) { + out += string.charAt(i); + continue; + } + + if (c < 0x80) { + out = out + hexTable[c]; + continue; + } + + if (c < 0x800) { + out = out + (hexTable[0xC0 | (c >> 6)] + hexTable[0x80 | (c & 0x3F)]); + continue; + } + + if (c < 0xD800 || c >= 0xE000) { + out = out + (hexTable[0xE0 | (c >> 12)] + hexTable[0x80 | ((c >> 6) & 0x3F)] + hexTable[0x80 | (c & 0x3F)]); + continue; + } + + i += 1; + c = 0x10000 + (((c & 0x3FF) << 10) | (string.charCodeAt(i) & 0x3FF)); + out += hexTable[0xF0 | (c >> 18)] + + hexTable[0x80 | ((c >> 12) & 0x3F)] + + hexTable[0x80 | ((c >> 6) & 0x3F)] + + hexTable[0x80 | (c & 0x3F)]; + } + + return out; +}; + +var compact = function compact(value) { + var queue = [{ obj: { o: value }, prop: 'o' }]; + var refs = []; + + for (var i = 0; i < queue.length; ++i) { + var item = queue[i]; + var obj = item.obj[item.prop]; + + var keys = Object.keys(obj); + for (var j = 0; j < keys.length; ++j) { + var key = keys[j]; + var val = obj[key]; + if (typeof val === 'object' && val !== null && refs.indexOf(val) === -1) { + queue.push({ obj: obj, prop: key }); + refs.push(val); + } + } + } + + compactQueue(queue); + + return value; +}; + +var isRegExp = function isRegExp(obj) { + return Object.prototype.toString.call(obj) === '[object RegExp]'; +}; + +var isBuffer = function isBuffer(obj) { + if (!obj || typeof obj !== 'object') { + return false; + } + + return !!(obj.constructor && obj.constructor.isBuffer && obj.constructor.isBuffer(obj)); +}; + +var combine = function combine(a, b) { + return [].concat(a, b); +}; + +module.exports = { + arrayToObject: arrayToObject, + assign: assign, + combine: combine, + compact: compact, + decode: decode, + encode: encode, + isBuffer: isBuffer, + isRegExp: isRegExp, + merge: merge +}; diff --git a/node_modules/qs/package.json b/node_modules/qs/package.json new file mode 100644 index 000000000..4cc48cfe7 --- /dev/null +++ b/node_modules/qs/package.json @@ -0,0 +1,87 @@ +{ + "_from": "qs@6.7.0", + "_id": "qs@6.7.0", + "_inBundle": false, + "_integrity": "sha512-VCdBRNFTX1fyE7Nb6FYoURo/SPe62QCaAyzJvUjwRaIsc+NePBEniHlvxFmmX56+HZphIGtV0XeCirBtpDrTyQ==", + "_location": "/qs", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "qs@6.7.0", + "name": "qs", + "escapedName": "qs", + "rawSpec": "6.7.0", + "saveSpec": null, + "fetchSpec": "6.7.0" + }, + "_requiredBy": [ + "/body-parser", + "/express" + ], + "_resolved": "https://registry.npmjs.org/qs/-/qs-6.7.0.tgz", + "_shasum": "41dc1a015e3d581f1621776be31afb2876a9b1bc", + "_spec": "qs@6.7.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/ljharb/qs/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Jordan Harband", + "email": "ljharb@gmail.com", + "url": "http://ljharb.codes" + } + ], + "dependencies": {}, + "deprecated": false, + "description": "A querystring parser that supports nesting and arrays, with a depth limit", + "devDependencies": { + "@ljharb/eslint-config": "^13.1.1", + "browserify": "^16.2.3", + "covert": "^1.1.1", + "editorconfig-tools": "^0.1.1", + "eslint": "^5.15.3", + "evalmd": "^0.0.17", + "for-each": "^0.3.3", + "iconv-lite": "^0.4.24", + "mkdirp": "^0.5.1", + "object-inspect": "^1.6.0", + "qs-iconv": "^1.0.4", + "safe-publish-latest": "^1.1.2", + 
"safer-buffer": "^2.1.2", + "tape": "^4.10.1" + }, + "engines": { + "node": ">=0.6" + }, + "homepage": "https://github.com/ljharb/qs", + "keywords": [ + "querystring", + "qs", + "query", + "url", + "parse", + "stringify" + ], + "license": "BSD-3-Clause", + "main": "lib/index.js", + "name": "qs", + "repository": { + "type": "git", + "url": "git+https://github.com/ljharb/qs.git" + }, + "scripts": { + "coverage": "covert test", + "dist": "mkdirp dist && browserify --standalone Qs lib/index.js > dist/qs.js", + "lint": "eslint lib/*.js test/*.js", + "postlint": "editorconfig-tools check * lib/* test/*", + "prepublish": "safe-publish-latest && npm run dist", + "pretest": "npm run --silent readme && npm run --silent lint", + "readme": "evalmd README.md", + "test": "npm run --silent coverage", + "tests-only": "node test" + }, + "version": "6.7.0" +} diff --git a/node_modules/qs/test/.eslintrc b/node_modules/qs/test/.eslintrc new file mode 100644 index 000000000..9ebbb921e --- /dev/null +++ b/node_modules/qs/test/.eslintrc @@ -0,0 +1,17 @@ +{ + "rules": { + "array-bracket-newline": 0, + "array-element-newline": 0, + "consistent-return": 2, + "function-paren-newline": 0, + "max-lines": 0, + "max-lines-per-function": 0, + "max-nested-callbacks": [2, 3], + "max-statements": 0, + "no-buffer-constructor": 0, + "no-extend-native": 0, + "no-magic-numbers": 0, + "object-curly-newline": 0, + "sort-keys": 0 + } +} diff --git a/node_modules/qs/test/index.js b/node_modules/qs/test/index.js new file mode 100644 index 000000000..5e6bc8fbd --- /dev/null +++ b/node_modules/qs/test/index.js @@ -0,0 +1,7 @@ +'use strict'; + +require('./parse'); + +require('./stringify'); + +require('./utils'); diff --git a/node_modules/qs/test/parse.js b/node_modules/qs/test/parse.js new file mode 100644 index 000000000..896778996 --- /dev/null +++ b/node_modules/qs/test/parse.js @@ -0,0 +1,676 @@ +'use strict'; + +var test = require('tape'); +var qs = require('../'); +var utils = require('../lib/utils'); +var iconv = require('iconv-lite'); +var SaferBuffer = require('safer-buffer').Buffer; + +test('parse()', function (t) { + t.test('parses a simple string', function (st) { + st.deepEqual(qs.parse('0=foo'), { 0: 'foo' }); + st.deepEqual(qs.parse('foo=c++'), { foo: 'c ' }); + st.deepEqual(qs.parse('a[>=]=23'), { a: { '>=': '23' } }); + st.deepEqual(qs.parse('a[<=>]==23'), { a: { '<=>': '=23' } }); + st.deepEqual(qs.parse('a[==]=23'), { a: { '==': '23' } }); + st.deepEqual(qs.parse('foo', { strictNullHandling: true }), { foo: null }); + st.deepEqual(qs.parse('foo'), { foo: '' }); + st.deepEqual(qs.parse('foo='), { foo: '' }); + st.deepEqual(qs.parse('foo=bar'), { foo: 'bar' }); + st.deepEqual(qs.parse(' foo = bar = baz '), { ' foo ': ' bar = baz ' }); + st.deepEqual(qs.parse('foo=bar=baz'), { foo: 'bar=baz' }); + st.deepEqual(qs.parse('foo=bar&bar=baz'), { foo: 'bar', bar: 'baz' }); + st.deepEqual(qs.parse('foo2=bar2&baz2='), { foo2: 'bar2', baz2: '' }); + st.deepEqual(qs.parse('foo=bar&baz', { strictNullHandling: true }), { foo: 'bar', baz: null }); + st.deepEqual(qs.parse('foo=bar&baz'), { foo: 'bar', baz: '' }); + st.deepEqual(qs.parse('cht=p3&chd=t:60,40&chs=250x100&chl=Hello|World'), { + cht: 'p3', + chd: 't:60,40', + chs: '250x100', + chl: 'Hello|World' + }); + st.end(); + }); + + t.test('allows enabling dot notation', function (st) { + st.deepEqual(qs.parse('a.b=c'), { 'a.b': 'c' }); + st.deepEqual(qs.parse('a.b=c', { allowDots: true }), { a: { b: 'c' } }); + st.end(); + }); + + t.deepEqual(qs.parse('a[b]=c'), { a: { b: 'c' } 
}, 'parses a single nested string'); + t.deepEqual(qs.parse('a[b][c]=d'), { a: { b: { c: 'd' } } }, 'parses a double nested string'); + t.deepEqual( + qs.parse('a[b][c][d][e][f][g][h]=i'), + { a: { b: { c: { d: { e: { f: { '[g][h]': 'i' } } } } } } }, + 'defaults to a depth of 5' + ); + + t.test('only parses one level when depth = 1', function (st) { + st.deepEqual(qs.parse('a[b][c]=d', { depth: 1 }), { a: { b: { '[c]': 'd' } } }); + st.deepEqual(qs.parse('a[b][c][d]=e', { depth: 1 }), { a: { b: { '[c][d]': 'e' } } }); + st.end(); + }); + + t.deepEqual(qs.parse('a=b&a=c'), { a: ['b', 'c'] }, 'parses a simple array'); + + t.test('parses an explicit array', function (st) { + st.deepEqual(qs.parse('a[]=b'), { a: ['b'] }); + st.deepEqual(qs.parse('a[]=b&a[]=c'), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a[]=b&a[]=c&a[]=d'), { a: ['b', 'c', 'd'] }); + st.end(); + }); + + t.test('parses a mix of simple and explicit arrays', function (st) { + st.deepEqual(qs.parse('a=b&a[]=c'), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a[]=b&a=c'), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a[0]=b&a=c'), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a=b&a[0]=c'), { a: ['b', 'c'] }); + + st.deepEqual(qs.parse('a[1]=b&a=c', { arrayLimit: 20 }), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a[]=b&a=c', { arrayLimit: 0 }), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a[]=b&a=c'), { a: ['b', 'c'] }); + + st.deepEqual(qs.parse('a=b&a[1]=c', { arrayLimit: 20 }), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a=b&a[]=c', { arrayLimit: 0 }), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a=b&a[]=c'), { a: ['b', 'c'] }); + + st.end(); + }); + + t.test('parses a nested array', function (st) { + st.deepEqual(qs.parse('a[b][]=c&a[b][]=d'), { a: { b: ['c', 'd'] } }); + st.deepEqual(qs.parse('a[>=]=25'), { a: { '>=': '25' } }); + st.end(); + }); + + t.test('allows to specify array indices', function (st) { + st.deepEqual(qs.parse('a[1]=c&a[0]=b&a[2]=d'), { a: ['b', 'c', 'd'] }); + st.deepEqual(qs.parse('a[1]=c&a[0]=b'), { a: ['b', 'c'] }); + st.deepEqual(qs.parse('a[1]=c', { arrayLimit: 20 }), { a: ['c'] }); + st.deepEqual(qs.parse('a[1]=c', { arrayLimit: 0 }), { a: { 1: 'c' } }); + st.deepEqual(qs.parse('a[1]=c'), { a: ['c'] }); + st.end(); + }); + + t.test('limits specific array indices to arrayLimit', function (st) { + st.deepEqual(qs.parse('a[20]=a', { arrayLimit: 20 }), { a: ['a'] }); + st.deepEqual(qs.parse('a[21]=a', { arrayLimit: 20 }), { a: { 21: 'a' } }); + st.end(); + }); + + t.deepEqual(qs.parse('a[12b]=c'), { a: { '12b': 'c' } }, 'supports keys that begin with a number'); + + t.test('supports encoded = signs', function (st) { + st.deepEqual(qs.parse('he%3Dllo=th%3Dere'), { 'he=llo': 'th=ere' }); + st.end(); + }); + + t.test('is ok with url encoded strings', function (st) { + st.deepEqual(qs.parse('a[b%20c]=d'), { a: { 'b c': 'd' } }); + st.deepEqual(qs.parse('a[b]=c%20d'), { a: { b: 'c d' } }); + st.end(); + }); + + t.test('allows brackets in the value', function (st) { + st.deepEqual(qs.parse('pets=["tobi"]'), { pets: '["tobi"]' }); + st.deepEqual(qs.parse('operators=[">=", "<="]'), { operators: '[">=", "<="]' }); + st.end(); + }); + + t.test('allows empty values', function (st) { + st.deepEqual(qs.parse(''), {}); + st.deepEqual(qs.parse(null), {}); + st.deepEqual(qs.parse(undefined), {}); + st.end(); + }); + + t.test('transforms arrays to objects', function (st) { + st.deepEqual(qs.parse('foo[0]=bar&foo[bad]=baz'), { foo: { 0: 'bar', bad: 'baz' } }); + st.deepEqual(qs.parse('foo[bad]=baz&foo[0]=bar'), { foo: { 
bad: 'baz', 0: 'bar' } }); + st.deepEqual(qs.parse('foo[bad]=baz&foo[]=bar'), { foo: { bad: 'baz', 0: 'bar' } }); + st.deepEqual(qs.parse('foo[]=bar&foo[bad]=baz'), { foo: { 0: 'bar', bad: 'baz' } }); + st.deepEqual(qs.parse('foo[bad]=baz&foo[]=bar&foo[]=foo'), { foo: { bad: 'baz', 0: 'bar', 1: 'foo' } }); + st.deepEqual(qs.parse('foo[0][a]=a&foo[0][b]=b&foo[1][a]=aa&foo[1][b]=bb'), { foo: [{ a: 'a', b: 'b' }, { a: 'aa', b: 'bb' }] }); + + st.deepEqual(qs.parse('a[]=b&a[t]=u&a[hasOwnProperty]=c', { allowPrototypes: false }), { a: { 0: 'b', t: 'u' } }); + st.deepEqual(qs.parse('a[]=b&a[t]=u&a[hasOwnProperty]=c', { allowPrototypes: true }), { a: { 0: 'b', t: 'u', hasOwnProperty: 'c' } }); + st.deepEqual(qs.parse('a[]=b&a[hasOwnProperty]=c&a[x]=y', { allowPrototypes: false }), { a: { 0: 'b', x: 'y' } }); + st.deepEqual(qs.parse('a[]=b&a[hasOwnProperty]=c&a[x]=y', { allowPrototypes: true }), { a: { 0: 'b', hasOwnProperty: 'c', x: 'y' } }); + st.end(); + }); + + t.test('transforms arrays to objects (dot notation)', function (st) { + st.deepEqual(qs.parse('foo[0].baz=bar&fool.bad=baz', { allowDots: true }), { foo: [{ baz: 'bar' }], fool: { bad: 'baz' } }); + st.deepEqual(qs.parse('foo[0].baz=bar&fool.bad.boo=baz', { allowDots: true }), { foo: [{ baz: 'bar' }], fool: { bad: { boo: 'baz' } } }); + st.deepEqual(qs.parse('foo[0][0].baz=bar&fool.bad=baz', { allowDots: true }), { foo: [[{ baz: 'bar' }]], fool: { bad: 'baz' } }); + st.deepEqual(qs.parse('foo[0].baz[0]=15&foo[0].bar=2', { allowDots: true }), { foo: [{ baz: ['15'], bar: '2' }] }); + st.deepEqual(qs.parse('foo[0].baz[0]=15&foo[0].baz[1]=16&foo[0].bar=2', { allowDots: true }), { foo: [{ baz: ['15', '16'], bar: '2' }] }); + st.deepEqual(qs.parse('foo.bad=baz&foo[0]=bar', { allowDots: true }), { foo: { bad: 'baz', 0: 'bar' } }); + st.deepEqual(qs.parse('foo.bad=baz&foo[]=bar', { allowDots: true }), { foo: { bad: 'baz', 0: 'bar' } }); + st.deepEqual(qs.parse('foo[]=bar&foo.bad=baz', { allowDots: true }), { foo: { 0: 'bar', bad: 'baz' } }); + st.deepEqual(qs.parse('foo.bad=baz&foo[]=bar&foo[]=foo', { allowDots: true }), { foo: { bad: 'baz', 0: 'bar', 1: 'foo' } }); + st.deepEqual(qs.parse('foo[0].a=a&foo[0].b=b&foo[1].a=aa&foo[1].b=bb', { allowDots: true }), { foo: [{ a: 'a', b: 'b' }, { a: 'aa', b: 'bb' }] }); + st.end(); + }); + + t.test('correctly prunes undefined values when converting an array to an object', function (st) { + st.deepEqual(qs.parse('a[2]=b&a[99999999]=c'), { a: { 2: 'b', 99999999: 'c' } }); + st.end(); + }); + + t.test('supports malformed uri characters', function (st) { + st.deepEqual(qs.parse('{%:%}', { strictNullHandling: true }), { '{%:%}': null }); + st.deepEqual(qs.parse('{%:%}='), { '{%:%}': '' }); + st.deepEqual(qs.parse('foo=%:%}'), { foo: '%:%}' }); + st.end(); + }); + + t.test('doesn\'t produce empty keys', function (st) { + st.deepEqual(qs.parse('_r=1&'), { _r: '1' }); + st.end(); + }); + + t.test('cannot access Object prototype', function (st) { + qs.parse('constructor[prototype][bad]=bad'); + qs.parse('bad[constructor][prototype][bad]=bad'); + st.equal(typeof Object.prototype.bad, 'undefined'); + st.end(); + }); + + t.test('parses arrays of objects', function (st) { + st.deepEqual(qs.parse('a[][b]=c'), { a: [{ b: 'c' }] }); + st.deepEqual(qs.parse('a[0][b]=c'), { a: [{ b: 'c' }] }); + st.end(); + }); + + t.test('allows for empty strings in arrays', function (st) { + st.deepEqual(qs.parse('a[]=b&a[]=&a[]=c'), { a: ['b', '', 'c'] }); + + st.deepEqual( + qs.parse('a[0]=b&a[1]&a[2]=c&a[19]=', { 
strictNullHandling: true, arrayLimit: 20 }), + { a: ['b', null, 'c', ''] }, + 'with arrayLimit 20 + array indices: null then empty string works' + ); + st.deepEqual( + qs.parse('a[]=b&a[]&a[]=c&a[]=', { strictNullHandling: true, arrayLimit: 0 }), + { a: ['b', null, 'c', ''] }, + 'with arrayLimit 0 + array brackets: null then empty string works' + ); + + st.deepEqual( + qs.parse('a[0]=b&a[1]=&a[2]=c&a[19]', { strictNullHandling: true, arrayLimit: 20 }), + { a: ['b', '', 'c', null] }, + 'with arrayLimit 20 + array indices: empty string then null works' + ); + st.deepEqual( + qs.parse('a[]=b&a[]=&a[]=c&a[]', { strictNullHandling: true, arrayLimit: 0 }), + { a: ['b', '', 'c', null] }, + 'with arrayLimit 0 + array brackets: empty string then null works' + ); + + st.deepEqual( + qs.parse('a[]=&a[]=b&a[]=c'), + { a: ['', 'b', 'c'] }, + 'array brackets: empty strings work' + ); + st.end(); + }); + + t.test('compacts sparse arrays', function (st) { + st.deepEqual(qs.parse('a[10]=1&a[2]=2', { arrayLimit: 20 }), { a: ['2', '1'] }); + st.deepEqual(qs.parse('a[1][b][2][c]=1', { arrayLimit: 20 }), { a: [{ b: [{ c: '1' }] }] }); + st.deepEqual(qs.parse('a[1][2][3][c]=1', { arrayLimit: 20 }), { a: [[[{ c: '1' }]]] }); + st.deepEqual(qs.parse('a[1][2][3][c][1]=1', { arrayLimit: 20 }), { a: [[[{ c: ['1'] }]]] }); + st.end(); + }); + + t.test('parses semi-parsed strings', function (st) { + st.deepEqual(qs.parse({ 'a[b]': 'c' }), { a: { b: 'c' } }); + st.deepEqual(qs.parse({ 'a[b]': 'c', 'a[d]': 'e' }), { a: { b: 'c', d: 'e' } }); + st.end(); + }); + + t.test('parses buffers correctly', function (st) { + var b = SaferBuffer.from('test'); + st.deepEqual(qs.parse({ a: b }), { a: b }); + st.end(); + }); + + t.test('parses jquery-param strings', function (st) { + // readable = 'filter[0][]=int1&filter[0][]==&filter[0][]=77&filter[]=and&filter[2][]=int2&filter[2][]==&filter[2][]=8' + var encoded = 'filter%5B0%5D%5B%5D=int1&filter%5B0%5D%5B%5D=%3D&filter%5B0%5D%5B%5D=77&filter%5B%5D=and&filter%5B2%5D%5B%5D=int2&filter%5B2%5D%5B%5D=%3D&filter%5B2%5D%5B%5D=8'; + var expected = { filter: [['int1', '=', '77'], 'and', ['int2', '=', '8']] }; + st.deepEqual(qs.parse(encoded), expected); + st.end(); + }); + + t.test('continues parsing when no parent is found', function (st) { + st.deepEqual(qs.parse('[]=&a=b'), { 0: '', a: 'b' }); + st.deepEqual(qs.parse('[]&a=b', { strictNullHandling: true }), { 0: null, a: 'b' }); + st.deepEqual(qs.parse('[foo]=bar'), { foo: 'bar' }); + st.end(); + }); + + t.test('does not error when parsing a very long array', function (st) { + var str = 'a[]=a'; + while (Buffer.byteLength(str) < 128 * 1024) { + str = str + '&' + str; + } + + st.doesNotThrow(function () { + qs.parse(str); + }); + + st.end(); + }); + + t.test('should not throw when a native prototype has an enumerable property', function (st) { + Object.prototype.crash = ''; + Array.prototype.crash = ''; + st.doesNotThrow(qs.parse.bind(null, 'a=b')); + st.deepEqual(qs.parse('a=b'), { a: 'b' }); + st.doesNotThrow(qs.parse.bind(null, 'a[][b]=c')); + st.deepEqual(qs.parse('a[][b]=c'), { a: [{ b: 'c' }] }); + delete Object.prototype.crash; + delete Array.prototype.crash; + st.end(); + }); + + t.test('parses a string with an alternative string delimiter', function (st) { + st.deepEqual(qs.parse('a=b;c=d', { delimiter: ';' }), { a: 'b', c: 'd' }); + st.end(); + }); + + t.test('parses a string with an alternative RegExp delimiter', function (st) { + st.deepEqual(qs.parse('a=b; c=d', { delimiter: /[;,] */ }), { a: 'b', c: 'd' }); + st.end(); + 
}); + + t.test('does not use non-splittable objects as delimiters', function (st) { + st.deepEqual(qs.parse('a=b&c=d', { delimiter: true }), { a: 'b', c: 'd' }); + st.end(); + }); + + t.test('allows overriding parameter limit', function (st) { + st.deepEqual(qs.parse('a=b&c=d', { parameterLimit: 1 }), { a: 'b' }); + st.end(); + }); + + t.test('allows setting the parameter limit to Infinity', function (st) { + st.deepEqual(qs.parse('a=b&c=d', { parameterLimit: Infinity }), { a: 'b', c: 'd' }); + st.end(); + }); + + t.test('allows overriding array limit', function (st) { + st.deepEqual(qs.parse('a[0]=b', { arrayLimit: -1 }), { a: { 0: 'b' } }); + st.deepEqual(qs.parse('a[-1]=b', { arrayLimit: -1 }), { a: { '-1': 'b' } }); + st.deepEqual(qs.parse('a[0]=b&a[1]=c', { arrayLimit: 0 }), { a: { 0: 'b', 1: 'c' } }); + st.end(); + }); + + t.test('allows disabling array parsing', function (st) { + var indices = qs.parse('a[0]=b&a[1]=c', { parseArrays: false }); + st.deepEqual(indices, { a: { 0: 'b', 1: 'c' } }); + st.equal(Array.isArray(indices.a), false, 'parseArrays:false, indices case is not an array'); + + var emptyBrackets = qs.parse('a[]=b', { parseArrays: false }); + st.deepEqual(emptyBrackets, { a: { 0: 'b' } }); + st.equal(Array.isArray(emptyBrackets.a), false, 'parseArrays:false, empty brackets case is not an array'); + + st.end(); + }); + + t.test('allows for query string prefix', function (st) { + st.deepEqual(qs.parse('?foo=bar', { ignoreQueryPrefix: true }), { foo: 'bar' }); + st.deepEqual(qs.parse('foo=bar', { ignoreQueryPrefix: true }), { foo: 'bar' }); + st.deepEqual(qs.parse('?foo=bar', { ignoreQueryPrefix: false }), { '?foo': 'bar' }); + st.end(); + }); + + t.test('parses an object', function (st) { + var input = { + 'user[name]': { 'pop[bob]': 3 }, + 'user[email]': null + }; + + var expected = { + user: { + name: { 'pop[bob]': 3 }, + email: null + } + }; + + var result = qs.parse(input); + + st.deepEqual(result, expected); + st.end(); + }); + + t.test('parses string with comma as array divider', function (st) { + st.deepEqual(qs.parse('foo=bar,tee', { comma: true }), { foo: ['bar', 'tee'] }); + st.deepEqual(qs.parse('foo[bar]=coffee,tee', { comma: true }), { foo: { bar: ['coffee', 'tee'] } }); + st.deepEqual(qs.parse('foo=', { comma: true }), { foo: '' }); + st.deepEqual(qs.parse('foo', { comma: true }), { foo: '' }); + st.deepEqual(qs.parse('foo', { comma: true, strictNullHandling: true }), { foo: null }); + st.end(); + }); + + t.test('parses an object in dot notation', function (st) { + var input = { + 'user.name': { 'pop[bob]': 3 }, + 'user.email.': null + }; + + var expected = { + user: { + name: { 'pop[bob]': 3 }, + email: null + } + }; + + var result = qs.parse(input, { allowDots: true }); + + st.deepEqual(result, expected); + st.end(); + }); + + t.test('parses an object and not child values', function (st) { + var input = { + 'user[name]': { 'pop[bob]': { test: 3 } }, + 'user[email]': null + }; + + var expected = { + user: { + name: { 'pop[bob]': { test: 3 } }, + email: null + } + }; + + var result = qs.parse(input); + + st.deepEqual(result, expected); + st.end(); + }); + + t.test('does not blow up when Buffer global is missing', function (st) { + var tempBuffer = global.Buffer; + delete global.Buffer; + var result = qs.parse('a=b&c=d'); + global.Buffer = tempBuffer; + st.deepEqual(result, { a: 'b', c: 'd' }); + st.end(); + }); + + t.test('does not crash when parsing circular references', function (st) { + var a = {}; + a.b = a; + + var parsed; + + st.doesNotThrow(function 
() { + parsed = qs.parse({ 'foo[bar]': 'baz', 'foo[baz]': a }); + }); + + st.equal('foo' in parsed, true, 'parsed has "foo" property'); + st.equal('bar' in parsed.foo, true); + st.equal('baz' in parsed.foo, true); + st.equal(parsed.foo.bar, 'baz'); + st.deepEqual(parsed.foo.baz, a); + st.end(); + }); + + t.test('does not crash when parsing deep objects', function (st) { + var parsed; + var str = 'foo'; + + for (var i = 0; i < 5000; i++) { + str += '[p]'; + } + + str += '=bar'; + + st.doesNotThrow(function () { + parsed = qs.parse(str, { depth: 5000 }); + }); + + st.equal('foo' in parsed, true, 'parsed has "foo" property'); + + var depth = 0; + var ref = parsed.foo; + while ((ref = ref.p)) { + depth += 1; + } + + st.equal(depth, 5000, 'parsed is 5000 properties deep'); + + st.end(); + }); + + t.test('parses null objects correctly', { skip: !Object.create }, function (st) { + var a = Object.create(null); + a.b = 'c'; + + st.deepEqual(qs.parse(a), { b: 'c' }); + var result = qs.parse({ a: a }); + st.equal('a' in result, true, 'result has "a" property'); + st.deepEqual(result.a, a); + st.end(); + }); + + t.test('parses dates correctly', function (st) { + var now = new Date(); + st.deepEqual(qs.parse({ a: now }), { a: now }); + st.end(); + }); + + t.test('parses regular expressions correctly', function (st) { + var re = /^test$/; + st.deepEqual(qs.parse({ a: re }), { a: re }); + st.end(); + }); + + t.test('does not allow overwriting prototype properties', function (st) { + st.deepEqual(qs.parse('a[hasOwnProperty]=b', { allowPrototypes: false }), {}); + st.deepEqual(qs.parse('hasOwnProperty=b', { allowPrototypes: false }), {}); + + st.deepEqual( + qs.parse('toString', { allowPrototypes: false }), + {}, + 'bare "toString" results in {}' + ); + + st.end(); + }); + + t.test('can allow overwriting prototype properties', function (st) { + st.deepEqual(qs.parse('a[hasOwnProperty]=b', { allowPrototypes: true }), { a: { hasOwnProperty: 'b' } }); + st.deepEqual(qs.parse('hasOwnProperty=b', { allowPrototypes: true }), { hasOwnProperty: 'b' }); + + st.deepEqual( + qs.parse('toString', { allowPrototypes: true }), + { toString: '' }, + 'bare "toString" results in { toString: "" }' + ); + + st.end(); + }); + + t.test('params starting with a closing bracket', function (st) { + st.deepEqual(qs.parse(']=toString'), { ']': 'toString' }); + st.deepEqual(qs.parse(']]=toString'), { ']]': 'toString' }); + st.deepEqual(qs.parse(']hello]=toString'), { ']hello]': 'toString' }); + st.end(); + }); + + t.test('params starting with a starting bracket', function (st) { + st.deepEqual(qs.parse('[=toString'), { '[': 'toString' }); + st.deepEqual(qs.parse('[[=toString'), { '[[': 'toString' }); + st.deepEqual(qs.parse('[hello[=toString'), { '[hello[': 'toString' }); + st.end(); + }); + + t.test('add keys to objects', function (st) { + st.deepEqual( + qs.parse('a[b]=c&a=d'), + { a: { b: 'c', d: true } }, + 'can add keys to objects' + ); + + st.deepEqual( + qs.parse('a[b]=c&a=toString'), + { a: { b: 'c' } }, + 'can not overwrite prototype' + ); + + st.deepEqual( + qs.parse('a[b]=c&a=toString', { allowPrototypes: true }), + { a: { b: 'c', toString: true } }, + 'can overwrite prototype with allowPrototypes true' + ); + + st.deepEqual( + qs.parse('a[b]=c&a=toString', { plainObjects: true }), + { a: { b: 'c', toString: true } }, + 'can overwrite prototype with plainObjects true' + ); + + st.end(); + }); + + t.test('can return null objects', { skip: !Object.create }, function (st) { + var expected = Object.create(null); + expected.a = 
Object.create(null); + expected.a.b = 'c'; + expected.a.hasOwnProperty = 'd'; + st.deepEqual(qs.parse('a[b]=c&a[hasOwnProperty]=d', { plainObjects: true }), expected); + st.deepEqual(qs.parse(null, { plainObjects: true }), Object.create(null)); + var expectedArray = Object.create(null); + expectedArray.a = Object.create(null); + expectedArray.a[0] = 'b'; + expectedArray.a.c = 'd'; + st.deepEqual(qs.parse('a[]=b&a[c]=d', { plainObjects: true }), expectedArray); + st.end(); + }); + + t.test('can parse with custom encoding', function (st) { + st.deepEqual(qs.parse('%8c%a7=%91%e5%8d%e3%95%7b', { + decoder: function (str) { + var reg = /%([0-9A-F]{2})/ig; + var result = []; + var parts = reg.exec(str); + while (parts) { + result.push(parseInt(parts[1], 16)); + parts = reg.exec(str); + } + return String(iconv.decode(SaferBuffer.from(result), 'shift_jis')); + } + }), { 県: '大阪府' }); + st.end(); + }); + + t.test('receives the default decoder as a second argument', function (st) { + st.plan(1); + qs.parse('a', { + decoder: function (str, defaultDecoder) { + st.equal(defaultDecoder, utils.decode); + } + }); + st.end(); + }); + + t.test('throws error with wrong decoder', function (st) { + st['throws'](function () { + qs.parse({}, { decoder: 'string' }); + }, new TypeError('Decoder has to be a function.')); + st.end(); + }); + + t.test('does not mutate the options argument', function (st) { + var options = {}; + qs.parse('a[b]=true', options); + st.deepEqual(options, {}); + st.end(); + }); + + t.test('throws if an invalid charset is specified', function (st) { + st['throws'](function () { + qs.parse('a=b', { charset: 'foobar' }); + }, new TypeError('The charset option must be either utf-8, iso-8859-1, or undefined')); + st.end(); + }); + + t.test('parses an iso-8859-1 string if asked to', function (st) { + st.deepEqual(qs.parse('%A2=%BD', { charset: 'iso-8859-1' }), { '¢': '½' }); + st.end(); + }); + + var urlEncodedCheckmarkInUtf8 = '%E2%9C%93'; + var urlEncodedOSlashInUtf8 = '%C3%B8'; + var urlEncodedNumCheckmark = '%26%2310003%3B'; + var urlEncodedNumSmiley = '%26%239786%3B'; + + t.test('prefers an utf-8 charset specified by the utf8 sentinel to a default charset of iso-8859-1', function (st) { + st.deepEqual(qs.parse('utf8=' + urlEncodedCheckmarkInUtf8 + '&' + urlEncodedOSlashInUtf8 + '=' + urlEncodedOSlashInUtf8, { charsetSentinel: true, charset: 'iso-8859-1' }), { ø: 'ø' }); + st.end(); + }); + + t.test('prefers an iso-8859-1 charset specified by the utf8 sentinel to a default charset of utf-8', function (st) { + st.deepEqual(qs.parse('utf8=' + urlEncodedNumCheckmark + '&' + urlEncodedOSlashInUtf8 + '=' + urlEncodedOSlashInUtf8, { charsetSentinel: true, charset: 'utf-8' }), { 'ø': 'ø' }); + st.end(); + }); + + t.test('does not require the utf8 sentinel to be defined before the parameters whose decoding it affects', function (st) { + st.deepEqual(qs.parse('a=' + urlEncodedOSlashInUtf8 + '&utf8=' + urlEncodedNumCheckmark, { charsetSentinel: true, charset: 'utf-8' }), { a: 'ø' }); + st.end(); + }); + + t.test('should ignore an utf8 sentinel with an unknown value', function (st) { + st.deepEqual(qs.parse('utf8=foo&' + urlEncodedOSlashInUtf8 + '=' + urlEncodedOSlashInUtf8, { charsetSentinel: true, charset: 'utf-8' }), { ø: 'ø' }); + st.end(); + }); + + t.test('uses the utf8 sentinel to switch to utf-8 when no default charset is given', function (st) { + st.deepEqual(qs.parse('utf8=' + urlEncodedCheckmarkInUtf8 + '&' + urlEncodedOSlashInUtf8 + '=' + urlEncodedOSlashInUtf8, { charsetSentinel: true }), { 
ø: 'ø' }); + st.end(); + }); + + t.test('uses the utf8 sentinel to switch to iso-8859-1 when no default charset is given', function (st) { + st.deepEqual(qs.parse('utf8=' + urlEncodedNumCheckmark + '&' + urlEncodedOSlashInUtf8 + '=' + urlEncodedOSlashInUtf8, { charsetSentinel: true }), { 'ø': 'ø' }); + st.end(); + }); + + t.test('interprets numeric entities in iso-8859-1 when `interpretNumericEntities`', function (st) { + st.deepEqual(qs.parse('foo=' + urlEncodedNumSmiley, { charset: 'iso-8859-1', interpretNumericEntities: true }), { foo: '☺' }); + st.end(); + }); + + t.test('handles a custom decoder returning `null`, in the `iso-8859-1` charset, when `interpretNumericEntities`', function (st) { + st.deepEqual(qs.parse('foo=&bar=' + urlEncodedNumSmiley, { + charset: 'iso-8859-1', + decoder: function (str, defaultDecoder, charset) { + return str ? defaultDecoder(str, defaultDecoder, charset) : null; + }, + interpretNumericEntities: true + }), { foo: null, bar: '☺' }); + st.end(); + }); + + t.test('does not interpret numeric entities in iso-8859-1 when `interpretNumericEntities` is absent', function (st) { + st.deepEqual(qs.parse('foo=' + urlEncodedNumSmiley, { charset: 'iso-8859-1' }), { foo: '☺' }); + st.end(); + }); + + t.test('does not interpret numeric entities when the charset is utf-8, even when `interpretNumericEntities`', function (st) { + st.deepEqual(qs.parse('foo=' + urlEncodedNumSmiley, { charset: 'utf-8', interpretNumericEntities: true }), { foo: '☺' }); + st.end(); + }); + + t.test('does not interpret %uXXXX syntax in iso-8859-1 mode', function (st) { + st.deepEqual(qs.parse('%u263A=%u263A', { charset: 'iso-8859-1' }), { '%u263A': '%u263A' }); + st.end(); + }); + + t.end(); +}); diff --git a/node_modules/qs/test/stringify.js b/node_modules/qs/test/stringify.js new file mode 100644 index 000000000..53041c2e4 --- /dev/null +++ b/node_modules/qs/test/stringify.js @@ -0,0 +1,679 @@ +'use strict'; + +var test = require('tape'); +var qs = require('../'); +var utils = require('../lib/utils'); +var iconv = require('iconv-lite'); +var SaferBuffer = require('safer-buffer').Buffer; + +test('stringify()', function (t) { + t.test('stringifies a querystring object', function (st) { + st.equal(qs.stringify({ a: 'b' }), 'a=b'); + st.equal(qs.stringify({ a: 1 }), 'a=1'); + st.equal(qs.stringify({ a: 1, b: 2 }), 'a=1&b=2'); + st.equal(qs.stringify({ a: 'A_Z' }), 'a=A_Z'); + st.equal(qs.stringify({ a: '€' }), 'a=%E2%82%AC'); + st.equal(qs.stringify({ a: '' }), 'a=%EE%80%80'); + st.equal(qs.stringify({ a: 'א' }), 'a=%D7%90'); + st.equal(qs.stringify({ a: '𐐷' }), 'a=%F0%90%90%B7'); + st.end(); + }); + + t.test('stringifies falsy values', function (st) { + st.equal(qs.stringify(undefined), ''); + st.equal(qs.stringify(null), ''); + st.equal(qs.stringify(null, { strictNullHandling: true }), ''); + st.equal(qs.stringify(false), ''); + st.equal(qs.stringify(0), ''); + st.end(); + }); + + t.test('adds query prefix', function (st) { + st.equal(qs.stringify({ a: 'b' }, { addQueryPrefix: true }), '?a=b'); + st.end(); + }); + + t.test('with query prefix, outputs blank string given an empty object', function (st) { + st.equal(qs.stringify({}, { addQueryPrefix: true }), ''); + st.end(); + }); + + t.test('stringifies nested falsy values', function (st) { + st.equal(qs.stringify({ a: { b: { c: null } } }), 'a%5Bb%5D%5Bc%5D='); + st.equal(qs.stringify({ a: { b: { c: null } } }, { strictNullHandling: true }), 'a%5Bb%5D%5Bc%5D'); + st.equal(qs.stringify({ a: { b: { c: false } } }), 'a%5Bb%5D%5Bc%5D=false'); + 
st.end(); + }); + + t.test('stringifies a nested object', function (st) { + st.equal(qs.stringify({ a: { b: 'c' } }), 'a%5Bb%5D=c'); + st.equal(qs.stringify({ a: { b: { c: { d: 'e' } } } }), 'a%5Bb%5D%5Bc%5D%5Bd%5D=e'); + st.end(); + }); + + t.test('stringifies a nested object with dots notation', function (st) { + st.equal(qs.stringify({ a: { b: 'c' } }, { allowDots: true }), 'a.b=c'); + st.equal(qs.stringify({ a: { b: { c: { d: 'e' } } } }, { allowDots: true }), 'a.b.c.d=e'); + st.end(); + }); + + t.test('stringifies an array value', function (st) { + st.equal( + qs.stringify({ a: ['b', 'c', 'd'] }, { arrayFormat: 'indices' }), + 'a%5B0%5D=b&a%5B1%5D=c&a%5B2%5D=d', + 'indices => indices' + ); + st.equal( + qs.stringify({ a: ['b', 'c', 'd'] }, { arrayFormat: 'brackets' }), + 'a%5B%5D=b&a%5B%5D=c&a%5B%5D=d', + 'brackets => brackets' + ); + st.equal( + qs.stringify({ a: ['b', 'c', 'd'] }, { arrayFormat: 'comma' }), + 'a=b%2Cc%2Cd', + 'comma => comma' + ); + st.equal( + qs.stringify({ a: ['b', 'c', 'd'] }), + 'a%5B0%5D=b&a%5B1%5D=c&a%5B2%5D=d', + 'default => indices' + ); + st.end(); + }); + + t.test('omits nulls when asked', function (st) { + st.equal(qs.stringify({ a: 'b', c: null }, { skipNulls: true }), 'a=b'); + st.end(); + }); + + t.test('omits nested nulls when asked', function (st) { + st.equal(qs.stringify({ a: { b: 'c', d: null } }, { skipNulls: true }), 'a%5Bb%5D=c'); + st.end(); + }); + + t.test('omits array indices when asked', function (st) { + st.equal(qs.stringify({ a: ['b', 'c', 'd'] }, { indices: false }), 'a=b&a=c&a=d'); + st.end(); + }); + + t.test('stringifies a nested array value', function (st) { + st.equal(qs.stringify({ a: { b: ['c', 'd'] } }, { arrayFormat: 'indices' }), 'a%5Bb%5D%5B0%5D=c&a%5Bb%5D%5B1%5D=d'); + st.equal(qs.stringify({ a: { b: ['c', 'd'] } }, { arrayFormat: 'brackets' }), 'a%5Bb%5D%5B%5D=c&a%5Bb%5D%5B%5D=d'); + st.equal(qs.stringify({ a: { b: ['c', 'd'] } }, { arrayFormat: 'comma' }), 'a%5Bb%5D=c%2Cd'); // a[b]=c,d + st.equal(qs.stringify({ a: { b: ['c', 'd'] } }), 'a%5Bb%5D%5B0%5D=c&a%5Bb%5D%5B1%5D=d'); + st.end(); + }); + + t.test('stringifies a nested array value with dots notation', function (st) { + st.equal( + qs.stringify( + { a: { b: ['c', 'd'] } }, + { allowDots: true, encode: false, arrayFormat: 'indices' } + ), + 'a.b[0]=c&a.b[1]=d', + 'indices: stringifies with dots + indices' + ); + st.equal( + qs.stringify( + { a: { b: ['c', 'd'] } }, + { allowDots: true, encode: false, arrayFormat: 'brackets' } + ), + 'a.b[]=c&a.b[]=d', + 'brackets: stringifies with dots + brackets' + ); + st.equal( + qs.stringify( + { a: { b: ['c', 'd'] } }, + { allowDots: true, encode: false, arrayFormat: 'comma' } + ), + 'a.b=c,d', + 'comma: stringifies with dots + comma' + ); + st.equal( + qs.stringify( + { a: { b: ['c', 'd'] } }, + { allowDots: true, encode: false } + ), + 'a.b[0]=c&a.b[1]=d', + 'default: stringifies with dots + indices' + ); + st.end(); + }); + + t.test('stringifies an object inside an array', function (st) { + st.equal( + qs.stringify({ a: [{ b: 'c' }] }, { arrayFormat: 'indices' }), + 'a%5B0%5D%5Bb%5D=c', // a[0][b]=c + 'indices => brackets' + ); + st.equal( + qs.stringify({ a: [{ b: 'c' }] }, { arrayFormat: 'brackets' }), + 'a%5B%5D%5Bb%5D=c', // a[][b]=c + 'brackets => brackets' + ); + st.equal( + qs.stringify({ a: [{ b: 'c' }] }), + 'a%5B0%5D%5Bb%5D=c', + 'default => indices' + ); + + st.equal( + qs.stringify({ a: [{ b: { c: [1] } }] }, { arrayFormat: 'indices' }), + 'a%5B0%5D%5Bb%5D%5Bc%5D%5B0%5D=1', + 'indices => indices' + ); + + 
st.equal( + qs.stringify({ a: [{ b: { c: [1] } }] }, { arrayFormat: 'brackets' }), + 'a%5B%5D%5Bb%5D%5Bc%5D%5B%5D=1', + 'brackets => brackets' + ); + + st.equal( + qs.stringify({ a: [{ b: { c: [1] } }] }), + 'a%5B0%5D%5Bb%5D%5Bc%5D%5B0%5D=1', + 'default => indices' + ); + + st.end(); + }); + + t.test('stringifies an array with mixed objects and primitives', function (st) { + st.equal( + qs.stringify({ a: [{ b: 1 }, 2, 3] }, { encode: false, arrayFormat: 'indices' }), + 'a[0][b]=1&a[1]=2&a[2]=3', + 'indices => indices' + ); + st.equal( + qs.stringify({ a: [{ b: 1 }, 2, 3] }, { encode: false, arrayFormat: 'brackets' }), + 'a[][b]=1&a[]=2&a[]=3', + 'brackets => brackets' + ); + st.equal( + qs.stringify({ a: [{ b: 1 }, 2, 3] }, { encode: false }), + 'a[0][b]=1&a[1]=2&a[2]=3', + 'default => indices' + ); + + st.end(); + }); + + t.test('stringifies an object inside an array with dots notation', function (st) { + st.equal( + qs.stringify( + { a: [{ b: 'c' }] }, + { allowDots: true, encode: false, arrayFormat: 'indices' } + ), + 'a[0].b=c', + 'indices => indices' + ); + st.equal( + qs.stringify( + { a: [{ b: 'c' }] }, + { allowDots: true, encode: false, arrayFormat: 'brackets' } + ), + 'a[].b=c', + 'brackets => brackets' + ); + st.equal( + qs.stringify( + { a: [{ b: 'c' }] }, + { allowDots: true, encode: false } + ), + 'a[0].b=c', + 'default => indices' + ); + + st.equal( + qs.stringify( + { a: [{ b: { c: [1] } }] }, + { allowDots: true, encode: false, arrayFormat: 'indices' } + ), + 'a[0].b.c[0]=1', + 'indices => indices' + ); + st.equal( + qs.stringify( + { a: [{ b: { c: [1] } }] }, + { allowDots: true, encode: false, arrayFormat: 'brackets' } + ), + 'a[].b.c[]=1', + 'brackets => brackets' + ); + st.equal( + qs.stringify( + { a: [{ b: { c: [1] } }] }, + { allowDots: true, encode: false } + ), + 'a[0].b.c[0]=1', + 'default => indices' + ); + + st.end(); + }); + + t.test('does not omit object keys when indices = false', function (st) { + st.equal(qs.stringify({ a: [{ b: 'c' }] }, { indices: false }), 'a%5Bb%5D=c'); + st.end(); + }); + + t.test('uses indices notation for arrays when indices=true', function (st) { + st.equal(qs.stringify({ a: ['b', 'c'] }, { indices: true }), 'a%5B0%5D=b&a%5B1%5D=c'); + st.end(); + }); + + t.test('uses indices notation for arrays when no arrayFormat is specified', function (st) { + st.equal(qs.stringify({ a: ['b', 'c'] }), 'a%5B0%5D=b&a%5B1%5D=c'); + st.end(); + }); + + t.test('uses indices notation for arrays when no arrayFormat=indices', function (st) { + st.equal(qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'indices' }), 'a%5B0%5D=b&a%5B1%5D=c'); + st.end(); + }); + + t.test('uses repeat notation for arrays when no arrayFormat=repeat', function (st) { + st.equal(qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'repeat' }), 'a=b&a=c'); + st.end(); + }); + + t.test('uses brackets notation for arrays when no arrayFormat=brackets', function (st) { + st.equal(qs.stringify({ a: ['b', 'c'] }, { arrayFormat: 'brackets' }), 'a%5B%5D=b&a%5B%5D=c'); + st.end(); + }); + + t.test('stringifies a complicated object', function (st) { + st.equal(qs.stringify({ a: { b: 'c', d: 'e' } }), 'a%5Bb%5D=c&a%5Bd%5D=e'); + st.end(); + }); + + t.test('stringifies an empty value', function (st) { + st.equal(qs.stringify({ a: '' }), 'a='); + st.equal(qs.stringify({ a: null }, { strictNullHandling: true }), 'a'); + + st.equal(qs.stringify({ a: '', b: '' }), 'a=&b='); + st.equal(qs.stringify({ a: null, b: '' }, { strictNullHandling: true }), 'a&b='); + + st.equal(qs.stringify({ a: { b: '' } }), 
'a%5Bb%5D='); + st.equal(qs.stringify({ a: { b: null } }, { strictNullHandling: true }), 'a%5Bb%5D'); + st.equal(qs.stringify({ a: { b: null } }, { strictNullHandling: false }), 'a%5Bb%5D='); + + st.end(); + }); + + t.test('stringifies a null object', { skip: !Object.create }, function (st) { + var obj = Object.create(null); + obj.a = 'b'; + st.equal(qs.stringify(obj), 'a=b'); + st.end(); + }); + + t.test('returns an empty string for invalid input', function (st) { + st.equal(qs.stringify(undefined), ''); + st.equal(qs.stringify(false), ''); + st.equal(qs.stringify(null), ''); + st.equal(qs.stringify(''), ''); + st.end(); + }); + + t.test('stringifies an object with a null object as a child', { skip: !Object.create }, function (st) { + var obj = { a: Object.create(null) }; + + obj.a.b = 'c'; + st.equal(qs.stringify(obj), 'a%5Bb%5D=c'); + st.end(); + }); + + t.test('drops keys with a value of undefined', function (st) { + st.equal(qs.stringify({ a: undefined }), ''); + + st.equal(qs.stringify({ a: { b: undefined, c: null } }, { strictNullHandling: true }), 'a%5Bc%5D'); + st.equal(qs.stringify({ a: { b: undefined, c: null } }, { strictNullHandling: false }), 'a%5Bc%5D='); + st.equal(qs.stringify({ a: { b: undefined, c: '' } }), 'a%5Bc%5D='); + st.end(); + }); + + t.test('url encodes values', function (st) { + st.equal(qs.stringify({ a: 'b c' }), 'a=b%20c'); + st.end(); + }); + + t.test('stringifies a date', function (st) { + var now = new Date(); + var str = 'a=' + encodeURIComponent(now.toISOString()); + st.equal(qs.stringify({ a: now }), str); + st.end(); + }); + + t.test('stringifies the weird object from qs', function (st) { + st.equal(qs.stringify({ 'my weird field': '~q1!2"\'w$5&7/z8)?' }), 'my%20weird%20field=~q1%212%22%27w%245%267%2Fz8%29%3F'); + st.end(); + }); + + t.test('skips properties that are part of the object prototype', function (st) { + Object.prototype.crash = 'test'; + st.equal(qs.stringify({ a: 'b' }), 'a=b'); + st.equal(qs.stringify({ a: { b: 'c' } }), 'a%5Bb%5D=c'); + delete Object.prototype.crash; + st.end(); + }); + + t.test('stringifies boolean values', function (st) { + st.equal(qs.stringify({ a: true }), 'a=true'); + st.equal(qs.stringify({ a: { b: true } }), 'a%5Bb%5D=true'); + st.equal(qs.stringify({ b: false }), 'b=false'); + st.equal(qs.stringify({ b: { c: false } }), 'b%5Bc%5D=false'); + st.end(); + }); + + t.test('stringifies buffer values', function (st) { + st.equal(qs.stringify({ a: SaferBuffer.from('test') }), 'a=test'); + st.equal(qs.stringify({ a: { b: SaferBuffer.from('test') } }), 'a%5Bb%5D=test'); + st.end(); + }); + + t.test('stringifies an object using an alternative delimiter', function (st) { + st.equal(qs.stringify({ a: 'b', c: 'd' }, { delimiter: ';' }), 'a=b;c=d'); + st.end(); + }); + + t.test('doesn\'t blow up when Buffer global is missing', function (st) { + var tempBuffer = global.Buffer; + delete global.Buffer; + var result = qs.stringify({ a: 'b', c: 'd' }); + global.Buffer = tempBuffer; + st.equal(result, 'a=b&c=d'); + st.end(); + }); + + t.test('selects properties when filter=array', function (st) { + st.equal(qs.stringify({ a: 'b' }, { filter: ['a'] }), 'a=b'); + st.equal(qs.stringify({ a: 1 }, { filter: [] }), ''); + + st.equal( + qs.stringify( + { a: { b: [1, 2, 3, 4], c: 'd' }, c: 'f' }, + { filter: ['a', 'b', 0, 2], arrayFormat: 'indices' } + ), + 'a%5Bb%5D%5B0%5D=1&a%5Bb%5D%5B2%5D=3', + 'indices => indices' + ); + st.equal( + qs.stringify( + { a: { b: [1, 2, 3, 4], c: 'd' }, c: 'f' }, + { filter: ['a', 'b', 0, 2], arrayFormat: 
'brackets' } + ), + 'a%5Bb%5D%5B%5D=1&a%5Bb%5D%5B%5D=3', + 'brackets => brackets' + ); + st.equal( + qs.stringify( + { a: { b: [1, 2, 3, 4], c: 'd' }, c: 'f' }, + { filter: ['a', 'b', 0, 2] } + ), + 'a%5Bb%5D%5B0%5D=1&a%5Bb%5D%5B2%5D=3', + 'default => indices' + ); + + st.end(); + }); + + t.test('supports custom representations when filter=function', function (st) { + var calls = 0; + var obj = { a: 'b', c: 'd', e: { f: new Date(1257894000000) } }; + var filterFunc = function (prefix, value) { + calls += 1; + if (calls === 1) { + st.equal(prefix, '', 'prefix is empty'); + st.equal(value, obj); + } else if (prefix === 'c') { + return void 0; + } else if (value instanceof Date) { + st.equal(prefix, 'e[f]'); + return value.getTime(); + } + return value; + }; + + st.equal(qs.stringify(obj, { filter: filterFunc }), 'a=b&e%5Bf%5D=1257894000000'); + st.equal(calls, 5); + st.end(); + }); + + t.test('can disable uri encoding', function (st) { + st.equal(qs.stringify({ a: 'b' }, { encode: false }), 'a=b'); + st.equal(qs.stringify({ a: { b: 'c' } }, { encode: false }), 'a[b]=c'); + st.equal(qs.stringify({ a: 'b', c: null }, { strictNullHandling: true, encode: false }), 'a=b&c'); + st.end(); + }); + + t.test('can sort the keys', function (st) { + var sort = function (a, b) { + return a.localeCompare(b); + }; + st.equal(qs.stringify({ a: 'c', z: 'y', b: 'f' }, { sort: sort }), 'a=c&b=f&z=y'); + st.equal(qs.stringify({ a: 'c', z: { j: 'a', i: 'b' }, b: 'f' }, { sort: sort }), 'a=c&b=f&z%5Bi%5D=b&z%5Bj%5D=a'); + st.end(); + }); + + t.test('can sort the keys at depth 3 or more too', function (st) { + var sort = function (a, b) { + return a.localeCompare(b); + }; + st.equal( + qs.stringify( + { a: 'a', z: { zj: { zjb: 'zjb', zja: 'zja' }, zi: { zib: 'zib', zia: 'zia' } }, b: 'b' }, + { sort: sort, encode: false } + ), + 'a=a&b=b&z[zi][zia]=zia&z[zi][zib]=zib&z[zj][zja]=zja&z[zj][zjb]=zjb' + ); + st.equal( + qs.stringify( + { a: 'a', z: { zj: { zjb: 'zjb', zja: 'zja' }, zi: { zib: 'zib', zia: 'zia' } }, b: 'b' }, + { sort: null, encode: false } + ), + 'a=a&z[zj][zjb]=zjb&z[zj][zja]=zja&z[zi][zib]=zib&z[zi][zia]=zia&b=b' + ); + st.end(); + }); + + t.test('can stringify with custom encoding', function (st) { + st.equal(qs.stringify({ 県: '大阪府', '': '' }, { + encoder: function (str) { + if (str.length === 0) { + return ''; + } + var buf = iconv.encode(str, 'shiftjis'); + var result = []; + for (var i = 0; i < buf.length; ++i) { + result.push(buf.readUInt8(i).toString(16)); + } + return '%' + result.join('%'); + } + }), '%8c%a7=%91%e5%8d%e3%95%7b&='); + st.end(); + }); + + t.test('receives the default encoder as a second argument', function (st) { + st.plan(2); + qs.stringify({ a: 1 }, { + encoder: function (str, defaultEncoder) { + st.equal(defaultEncoder, utils.encode); + } + }); + st.end(); + }); + + t.test('throws error with wrong encoder', function (st) { + st['throws'](function () { + qs.stringify({}, { encoder: 'string' }); + }, new TypeError('Encoder has to be a function.')); + st.end(); + }); + + t.test('can use custom encoder for a buffer object', { skip: typeof Buffer === 'undefined' }, function (st) { + st.equal(qs.stringify({ a: SaferBuffer.from([1]) }, { + encoder: function (buffer) { + if (typeof buffer === 'string') { + return buffer; + } + return String.fromCharCode(buffer.readUInt8(0) + 97); + } + }), 'a=b'); + st.end(); + }); + + t.test('serializeDate option', function (st) { + var date = new Date(); + st.equal( + qs.stringify({ a: date }), + 'a=' + date.toISOString().replace(/:/g, '%3A'), + 
'default is toISOString' + ); + + var mutatedDate = new Date(); + mutatedDate.toISOString = function () { + throw new SyntaxError(); + }; + st['throws'](function () { + mutatedDate.toISOString(); + }, SyntaxError); + st.equal( + qs.stringify({ a: mutatedDate }), + 'a=' + Date.prototype.toISOString.call(mutatedDate).replace(/:/g, '%3A'), + 'toISOString works even when method is not locally present' + ); + + var specificDate = new Date(6); + st.equal( + qs.stringify( + { a: specificDate }, + { serializeDate: function (d) { return d.getTime() * 7; } } + ), + 'a=42', + 'custom serializeDate function called' + ); + + st.end(); + }); + + t.test('RFC 1738 spaces serialization', function (st) { + st.equal(qs.stringify({ a: 'b c' }, { format: qs.formats.RFC1738 }), 'a=b+c'); + st.equal(qs.stringify({ 'a b': 'c d' }, { format: qs.formats.RFC1738 }), 'a+b=c+d'); + st.end(); + }); + + t.test('RFC 3986 spaces serialization', function (st) { + st.equal(qs.stringify({ a: 'b c' }, { format: qs.formats.RFC3986 }), 'a=b%20c'); + st.equal(qs.stringify({ 'a b': 'c d' }, { format: qs.formats.RFC3986 }), 'a%20b=c%20d'); + st.end(); + }); + + t.test('Backward compatibility to RFC 3986', function (st) { + st.equal(qs.stringify({ a: 'b c' }), 'a=b%20c'); + st.end(); + }); + + t.test('Edge cases and unknown formats', function (st) { + ['UFO1234', false, 1234, null, {}, []].forEach( + function (format) { + st['throws']( + function () { + qs.stringify({ a: 'b c' }, { format: format }); + }, + new TypeError('Unknown format option provided.') + ); + } + ); + st.end(); + }); + + t.test('encodeValuesOnly', function (st) { + st.equal( + qs.stringify( + { a: 'b', c: ['d', 'e=f'], f: [['g'], ['h']] }, + { encodeValuesOnly: true } + ), + 'a=b&c[0]=d&c[1]=e%3Df&f[0][0]=g&f[1][0]=h' + ); + st.equal( + qs.stringify( + { a: 'b', c: ['d', 'e'], f: [['g'], ['h']] } + ), + 'a=b&c%5B0%5D=d&c%5B1%5D=e&f%5B0%5D%5B0%5D=g&f%5B1%5D%5B0%5D=h' + ); + st.end(); + }); + + t.test('encodeValuesOnly - strictNullHandling', function (st) { + st.equal( + qs.stringify( + { a: { b: null } }, + { encodeValuesOnly: true, strictNullHandling: true } + ), + 'a[b]' + ); + st.end(); + }); + + t.test('throws if an invalid charset is specified', function (st) { + st['throws'](function () { + qs.stringify({ a: 'b' }, { charset: 'foobar' }); + }, new TypeError('The charset option must be either utf-8, iso-8859-1, or undefined')); + st.end(); + }); + + t.test('respects a charset of iso-8859-1', function (st) { + st.equal(qs.stringify({ æ: 'æ' }, { charset: 'iso-8859-1' }), '%E6=%E6'); + st.end(); + }); + + t.test('encodes unrepresentable chars as numeric entities in iso-8859-1 mode', function (st) { + st.equal(qs.stringify({ a: '☺' }, { charset: 'iso-8859-1' }), 'a=%26%239786%3B'); + st.end(); + }); + + t.test('respects an explicit charset of utf-8 (the default)', function (st) { + st.equal(qs.stringify({ a: 'æ' }, { charset: 'utf-8' }), 'a=%C3%A6'); + st.end(); + }); + + t.test('adds the right sentinel when instructed to and the charset is utf-8', function (st) { + st.equal(qs.stringify({ a: 'æ' }, { charsetSentinel: true, charset: 'utf-8' }), 'utf8=%E2%9C%93&a=%C3%A6'); + st.end(); + }); + + t.test('adds the right sentinel when instructed to and the charset is iso-8859-1', function (st) { + st.equal(qs.stringify({ a: 'æ' }, { charsetSentinel: true, charset: 'iso-8859-1' }), 'utf8=%26%2310003%3B&a=%E6'); + st.end(); + }); + + t.test('does not mutate the options argument', function (st) { + var options = {}; + qs.stringify({}, options); + st.deepEqual(options, 
{}); + st.end(); + }); + + t.test('strictNullHandling works with custom filter', function (st) { + var filter = function (prefix, value) { + return value; + }; + + var options = { strictNullHandling: true, filter: filter }; + st.equal(qs.stringify({ key: null }, options), 'key'); + st.end(); + }); + + t.test('strictNullHandling works with null serializeDate', function (st) { + var serializeDate = function () { + return null; + }; + var options = { strictNullHandling: true, serializeDate: serializeDate }; + var date = new Date(); + st.equal(qs.stringify({ key: date }, options), 'key'); + st.end(); + }); + + t.end(); +}); diff --git a/node_modules/qs/test/utils.js b/node_modules/qs/test/utils.js new file mode 100644 index 000000000..da31ce53e --- /dev/null +++ b/node_modules/qs/test/utils.js @@ -0,0 +1,136 @@ +'use strict'; + +var test = require('tape'); +var inspect = require('object-inspect'); +var SaferBuffer = require('safer-buffer').Buffer; +var forEach = require('for-each'); +var utils = require('../lib/utils'); + +test('merge()', function (t) { + t.deepEqual(utils.merge(null, true), [null, true], 'merges true into null'); + + t.deepEqual(utils.merge(null, [42]), [null, 42], 'merges null into an array'); + + t.deepEqual(utils.merge({ a: 'b' }, { a: 'c' }), { a: ['b', 'c'] }, 'merges two objects with the same key'); + + var oneMerged = utils.merge({ foo: 'bar' }, { foo: { first: '123' } }); + t.deepEqual(oneMerged, { foo: ['bar', { first: '123' }] }, 'merges a standalone and an object into an array'); + + var twoMerged = utils.merge({ foo: ['bar', { first: '123' }] }, { foo: { second: '456' } }); + t.deepEqual(twoMerged, { foo: { 0: 'bar', 1: { first: '123' }, second: '456' } }, 'merges a standalone and two objects into an array'); + + var sandwiched = utils.merge({ foo: ['bar', { first: '123', second: '456' }] }, { foo: 'baz' }); + t.deepEqual(sandwiched, { foo: ['bar', { first: '123', second: '456' }, 'baz'] }, 'merges an object sandwiched by two standalones into an array'); + + var nestedArrays = utils.merge({ foo: ['baz'] }, { foo: ['bar', 'xyzzy'] }); + t.deepEqual(nestedArrays, { foo: ['baz', 'bar', 'xyzzy'] }); + + var noOptionsNonObjectSource = utils.merge({ foo: 'baz' }, 'bar'); + t.deepEqual(noOptionsNonObjectSource, { foo: 'baz', bar: true }); + + t.test( + 'avoids invoking array setters unnecessarily', + { skip: typeof Object.defineProperty !== 'function' }, + function (st) { + var setCount = 0; + var getCount = 0; + var observed = []; + Object.defineProperty(observed, 0, { + get: function () { + getCount += 1; + return { bar: 'baz' }; + }, + set: function () { setCount += 1; } + }); + utils.merge(observed, [null]); + st.equal(setCount, 0); + st.equal(getCount, 1); + observed[0] = observed[0]; // eslint-disable-line no-self-assign + st.equal(setCount, 1); + st.equal(getCount, 2); + st.end(); + } + ); + + t.end(); +}); + +test('assign()', function (t) { + var target = { a: 1, b: 2 }; + var source = { b: 3, c: 4 }; + var result = utils.assign(target, source); + + t.equal(result, target, 'returns the target'); + t.deepEqual(target, { a: 1, b: 3, c: 4 }, 'target and source are merged'); + t.deepEqual(source, { b: 3, c: 4 }, 'source is untouched'); + + t.end(); +}); + +test('combine()', function (t) { + t.test('both arrays', function (st) { + var a = [1]; + var b = [2]; + var combined = utils.combine(a, b); + + st.deepEqual(a, [1], 'a is not mutated'); + st.deepEqual(b, [2], 'b is not mutated'); + st.notEqual(a, combined, 'a !== combined'); + st.notEqual(b, combined, 'b !== 
combined'); + st.deepEqual(combined, [1, 2], 'combined is a + b'); + + st.end(); + }); + + t.test('one array, one non-array', function (st) { + var aN = 1; + var a = [aN]; + var bN = 2; + var b = [bN]; + + var combinedAnB = utils.combine(aN, b); + st.deepEqual(b, [bN], 'b is not mutated'); + st.notEqual(aN, combinedAnB, 'aN + b !== aN'); + st.notEqual(a, combinedAnB, 'aN + b !== a'); + st.notEqual(bN, combinedAnB, 'aN + b !== bN'); + st.notEqual(b, combinedAnB, 'aN + b !== b'); + st.deepEqual([1, 2], combinedAnB, 'first argument is array-wrapped when not an array'); + + var combinedABn = utils.combine(a, bN); + st.deepEqual(a, [aN], 'a is not mutated'); + st.notEqual(aN, combinedABn, 'a + bN !== aN'); + st.notEqual(a, combinedABn, 'a + bN !== a'); + st.notEqual(bN, combinedABn, 'a + bN !== bN'); + st.notEqual(b, combinedABn, 'a + bN !== b'); + st.deepEqual([1, 2], combinedABn, 'second argument is array-wrapped when not an array'); + + st.end(); + }); + + t.test('neither is an array', function (st) { + var combined = utils.combine(1, 2); + st.notEqual(1, combined, '1 + 2 !== 1'); + st.notEqual(2, combined, '1 + 2 !== 2'); + st.deepEqual([1, 2], combined, 'both arguments are array-wrapped when not an array'); + + st.end(); + }); + + t.end(); +}); + +test('isBuffer()', function (t) { + forEach([null, undefined, true, false, '', 'abc', 42, 0, NaN, {}, [], function () {}, /a/g], function (x) { + t.equal(utils.isBuffer(x), false, inspect(x) + ' is not a buffer'); + }); + + var fakeBuffer = { constructor: Buffer }; + t.equal(utils.isBuffer(fakeBuffer), false, 'fake buffer is not a buffer'); + + var saferBuffer = SaferBuffer.from('abc'); + t.equal(utils.isBuffer(saferBuffer), true, 'SaferBuffer instance is a buffer'); + + var buffer = Buffer.from ? Buffer.from('abc') : new Buffer('abc'); + t.equal(utils.isBuffer(buffer), true, 'real Buffer instance is a buffer'); + t.end(); +}); diff --git a/node_modules/range-parser/HISTORY.md b/node_modules/range-parser/HISTORY.md new file mode 100644 index 000000000..70a973d80 --- /dev/null +++ b/node_modules/range-parser/HISTORY.md @@ -0,0 +1,56 @@ +1.2.1 / 2019-05-10 +================== + + * Improve error when `str` is not a string + +1.2.0 / 2016-06-01 +================== + + * Add `combine` option to combine overlapping ranges + +1.1.0 / 2016-05-13 +================== + + * Fix incorrectly returning -1 when there is at least one valid range + * perf: remove internal function + +1.0.3 / 2015-10-29 +================== + + * perf: enable strict mode + +1.0.2 / 2014-09-08 +================== + + * Support Node.js 0.6 + +1.0.1 / 2014-09-07 +================== + + * Move repository to jshttp + +1.0.0 / 2013-12-11 +================== + + * Add repository to package.json + * Add MIT license + +0.0.4 / 2012-06-17 +================== + + * Change ret -1 for unsatisfiable and -2 when invalid + +0.0.3 / 2012-06-17 +================== + + * Fix last-byte-pos default to len - 1 + +0.0.2 / 2012-06-14 +================== + + * Add `.type` + +0.0.1 / 2012-06-11 +================== + + * Initial release diff --git a/node_modules/range-parser/LICENSE b/node_modules/range-parser/LICENSE new file mode 100644 index 000000000..359995436 --- /dev/null +++ b/node_modules/range-parser/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2012-2014 TJ Holowaychuk +Copyright (c) 2015-2016 Douglas Christopher Wilson + +```js +var parseRange = require('range-parser') +``` + +### parseRange(size, header, options) + +Parse the given `header` string where `size` is the maximum size 
of the resource. +An array of ranges will be returned or negative numbers indicating an error parsing. + + * `-2` signals a malformed header string + * `-1` signals an unsatisfiable range + + + +```js +// parse header from request +var range = parseRange(size, req.headers.range) + +// the type of the range +if (range.type === 'bytes') { + // the ranges + range.forEach(function (r) { + // do something with r.start and r.end + }) +} +``` + +#### Options + +These properties are accepted in the options object. + +##### combine + +Specifies if overlapping & adjacent ranges should be combined, defaults to `false`. +When `true`, ranges will be combined and returned as if they were specified that +way in the header. + + + +```js +parseRange(100, 'bytes=50-55,0-10,5-10,56-60', { combine: true }) +// => [ +// { start: 0, end: 10 }, +// { start: 50, end: 60 } +// ] +``` + +## License + +[MIT](LICENSE) + +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/range-parser/master +[coveralls-url]: https://coveralls.io/r/jshttp/range-parser?branch=master +[node-image]: https://badgen.net/npm/node/range-parser +[node-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/range-parser +[npm-url]: https://npmjs.org/package/range-parser +[npm-version-image]: https://badgen.net/npm/v/range-parser +[travis-image]: https://badgen.net/travis/jshttp/range-parser/master +[travis-url]: https://travis-ci.org/jshttp/range-parser diff --git a/node_modules/range-parser/index.js b/node_modules/range-parser/index.js new file mode 100644 index 000000000..b7dc5c0f1 --- /dev/null +++ b/node_modules/range-parser/index.js @@ -0,0 +1,162 @@ +/*! + * range-parser + * Copyright(c) 2012-2014 TJ Holowaychuk + * Copyright(c) 2015-2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = rangeParser + +/** + * Parse "Range" header `str` relative to the given file `size`. + * + * @param {Number} size + * @param {String} str + * @param {Object} [options] + * @return {Array} + * @public + */ + +function rangeParser (size, str, options) { + if (typeof str !== 'string') { + throw new TypeError('argument str must be a string') + } + + var index = str.indexOf('=') + + if (index === -1) { + return -2 + } + + // split the range string + var arr = str.slice(index + 1).split(',') + var ranges = [] + + // add ranges type + ranges.type = str.slice(0, index) + + // parse all ranges + for (var i = 0; i < arr.length; i++) { + var range = arr[i].split('-') + var start = parseInt(range[0], 10) + var end = parseInt(range[1], 10) + + // -nnn + if (isNaN(start)) { + start = size - end + end = size - 1 + // nnn- + } else if (isNaN(end)) { + end = size - 1 + } + + // limit last-byte-pos to current length + if (end > size - 1) { + end = size - 1 + } + + // invalid or unsatisifiable + if (isNaN(start) || isNaN(end) || start > end || start < 0) { + continue + } + + // add range + ranges.push({ + start: start, + end: end + }) + } + + if (ranges.length < 1) { + // unsatisifiable + return -1 + } + + return options && options.combine + ? combineRanges(ranges) + : ranges +} + +/** + * Combine overlapping & adjacent ranges. 
+ * @private + */ + +function combineRanges (ranges) { + var ordered = ranges.map(mapWithIndex).sort(sortByRangeStart) + + for (var j = 0, i = 1; i < ordered.length; i++) { + var range = ordered[i] + var current = ordered[j] + + if (range.start > current.end + 1) { + // next range + ordered[++j] = range + } else if (range.end > current.end) { + // extend range + current.end = range.end + current.index = Math.min(current.index, range.index) + } + } + + // trim ordered array + ordered.length = j + 1 + + // generate combined range + var combined = ordered.sort(sortByRangeIndex).map(mapWithoutIndex) + + // copy ranges type + combined.type = ranges.type + + return combined +} + +/** + * Map function to add index value to ranges. + * @private + */ + +function mapWithIndex (range, index) { + return { + start: range.start, + end: range.end, + index: index + } +} + +/** + * Map function to remove index value from ranges. + * @private + */ + +function mapWithoutIndex (range) { + return { + start: range.start, + end: range.end + } +} + +/** + * Sort function to sort ranges by index. + * @private + */ + +function sortByRangeIndex (a, b) { + return a.index - b.index +} + +/** + * Sort function to sort ranges by start position. + * @private + */ + +function sortByRangeStart (a, b) { + return a.start - b.start +} diff --git a/node_modules/range-parser/package.json b/node_modules/range-parser/package.json new file mode 100644 index 000000000..649e2e665 --- /dev/null +++ b/node_modules/range-parser/package.json @@ -0,0 +1,91 @@ +{ + "_from": "range-parser@~1.2.1", + "_id": "range-parser@1.2.1", + "_inBundle": false, + "_integrity": "sha512-Hrgsx+orqoygnmhFbKaHE6c296J+HTAQXoxEF6gNupROmmGJRoyzfG3ccAveqCBrwr/2yxQ5BVd/GTl5agOwSg==", + "_location": "/range-parser", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "range-parser@~1.2.1", + "name": "range-parser", + "escapedName": "range-parser", + "rawSpec": "~1.2.1", + "saveSpec": null, + "fetchSpec": "~1.2.1" + }, + "_requiredBy": [ + "/express", + "/send" + ], + "_resolved": "https://registry.npmjs.org/range-parser/-/range-parser-1.2.1.tgz", + "_shasum": "3cf37023d199e1c24d1a55b84800c2f3e6468031", + "_spec": "range-parser@~1.2.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca", + "url": "http://tjholowaychuk.com" + }, + "bugs": { + "url": "https://github.com/jshttp/range-parser/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "James Wyatt Cready", + "email": "wyatt.cready@lanetix.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "deprecated": false, + "description": "Range header field string parser", + "devDependencies": { + "deep-equal": "1.0.1", + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.2", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "mocha": "6.1.4", + "nyc": "14.1.1" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "index.js" + ], + "homepage": "https://github.com/jshttp/range-parser#readme", + "keywords": [ + "range", + "parser", + "http" + ], + "license": "MIT", + "name": "range-parser", + "repository": { + 
"type": "git", + "url": "git+https://github.com/jshttp/range-parser.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec", + "test-cov": "nyc --reporter=html --reporter=text npm test", + "test-travis": "nyc --reporter=text npm test" + }, + "version": "1.2.1" +} diff --git a/node_modules/raw-body/HISTORY.md b/node_modules/raw-body/HISTORY.md new file mode 100644 index 000000000..88c79fce9 --- /dev/null +++ b/node_modules/raw-body/HISTORY.md @@ -0,0 +1,270 @@ +2.4.0 / 2019-04-17 +================== + + * deps: bytes@3.1.0 + - Add petabyte (`pb`) support + * deps: http-errors@1.7.2 + - Set constructor name when possible + - deps: setprototypeof@1.1.1 + - deps: statuses@'>= 1.5.0 < 2' + * deps: iconv-lite@0.4.24 + - Added encoding MIK + +2.3.3 / 2018-05-08 +================== + + * deps: http-errors@1.6.3 + - deps: depd@~1.1.2 + - deps: setprototypeof@1.1.0 + - deps: statuses@'>= 1.3.1 < 2' + * deps: iconv-lite@0.4.23 + - Fix loading encoding with year appended + - Fix deprecation warnings on Node.js 10+ + +2.3.2 / 2017-09-09 +================== + + * deps: iconv-lite@0.4.19 + - Fix ISO-8859-1 regression + - Update Windows-1255 + +2.3.1 / 2017-09-07 +================== + + * deps: bytes@3.0.0 + * deps: http-errors@1.6.2 + - deps: depd@1.1.1 + * perf: skip buffer decoding on overage chunk + +2.3.0 / 2017-08-04 +================== + + * Add TypeScript definitions + * Use `http-errors` for standard emitted errors + * deps: bytes@2.5.0 + * deps: iconv-lite@0.4.18 + - Add support for React Native + - Add a warning if not loaded as utf-8 + - Fix CESU-8 decoding in Node.js 8 + - Improve speed of ISO-8859-1 encoding + +2.2.0 / 2017-01-02 +================== + + * deps: iconv-lite@0.4.15 + - Added encoding MS-31J + - Added encoding MS-932 + - Added encoding MS-936 + - Added encoding MS-949 + - Added encoding MS-950 + - Fix GBK/GB18030 handling of Euro character + +2.1.7 / 2016-06-19 +================== + + * deps: bytes@2.4.0 + * perf: remove double-cleanup on happy path + +2.1.6 / 2016-03-07 +================== + + * deps: bytes@2.3.0 + - Drop partial bytes on all parsed units + - Fix parsing byte string that looks like hex + +2.1.5 / 2015-11-30 +================== + + * deps: bytes@2.2.0 + * deps: iconv-lite@0.4.13 + +2.1.4 / 2015-09-27 +================== + + * Fix masking critical errors from `iconv-lite` + * deps: iconv-lite@0.4.12 + - Fix CESU-8 decoding in Node.js 4.x + +2.1.3 / 2015-09-12 +================== + + * Fix sync callback when attaching data listener causes sync read + - Node.js 0.10 compatibility issue + +2.1.2 / 2015-07-05 +================== + + * Fix error stack traces to skip `makeError` + * deps: iconv-lite@0.4.11 + - Add encoding CESU-8 + +2.1.1 / 2015-06-14 +================== + + * Use `unpipe` module for unpiping requests + +2.1.0 / 2015-05-28 +================== + + * deps: iconv-lite@0.4.10 + - Improved UTF-16 endianness detection + - Leading BOM is now removed when decoding + - The encoding UTF-16 without BOM now defaults to UTF-16LE when detection fails + +2.0.2 / 2015-05-21 +================== + + * deps: bytes@2.1.0 + - Slight optimizations + +2.0.1 / 2015-05-10 +================== + + * Fix a false-positive when unpiping in Node.js 0.8 + +2.0.0 / 2015-05-08 +================== + + * Return a promise without callback instead of thunk + * deps: bytes@2.0.1 + - units no longer case sensitive when parsing + +1.3.4 / 2015-04-15 +================== + + * Fix hanging callback if request aborts during read + * deps: 
iconv-lite@0.4.8 + - Add encoding alias UNICODE-1-1-UTF-7 + +1.3.3 / 2015-02-08 +================== + + * deps: iconv-lite@0.4.7 + - Gracefully support enumerables on `Object.prototype` + +1.3.2 / 2015-01-20 +================== + + * deps: iconv-lite@0.4.6 + - Fix rare aliases of single-byte encodings + +1.3.1 / 2014-11-21 +================== + + * deps: iconv-lite@0.4.5 + - Fix Windows-31J and X-SJIS encoding support + +1.3.0 / 2014-07-20 +================== + + * Fully unpipe the stream on error + - Fixes `Cannot switch to old mode now` error on Node.js 0.10+ + +1.2.3 / 2014-07-20 +================== + + * deps: iconv-lite@0.4.4 + - Added encoding UTF-7 + +1.2.2 / 2014-06-19 +================== + + * Send invalid encoding error to callback + +1.2.1 / 2014-06-15 +================== + + * deps: iconv-lite@0.4.3 + - Added encodings UTF-16BE and UTF-16 with BOM + +1.2.0 / 2014-06-13 +================== + + * Passing string as `options` interpreted as encoding + * Support all encodings from `iconv-lite` + +1.1.7 / 2014-06-12 +================== + + * use `string_decoder` module from npm + +1.1.6 / 2014-05-27 +================== + + * check encoding for old streams1 + * support node.js < 0.10.6 + +1.1.5 / 2014-05-14 +================== + + * bump bytes + +1.1.4 / 2014-04-19 +================== + + * allow true as an option + * bump bytes + +1.1.3 / 2014-03-02 +================== + + * fix case when length=null + +1.1.2 / 2013-12-01 +================== + + * be less strict on state.encoding check + +1.1.1 / 2013-11-27 +================== + + * add engines + +1.1.0 / 2013-11-27 +================== + + * add err.statusCode and err.type + * allow for encoding option to be true + * pause the stream instead of dumping on error + * throw if the stream's encoding is set + +1.0.1 / 2013-11-19 +================== + + * dont support streams1, throw if dev set encoding + +1.0.0 / 2013-11-17 +================== + + * rename `expected` option to `length` + +0.2.0 / 2013-11-15 +================== + + * republish + +0.1.1 / 2013-11-15 +================== + + * use bytes + +0.1.0 / 2013-11-11 +================== + + * generator support + +0.0.3 / 2013-10-10 +================== + + * update repo + +0.0.2 / 2013-09-14 +================== + + * dump stream on bad headers + * listen to events after defining received and buffers + +0.0.1 / 2013-09-14 +================== + + * Initial release diff --git a/node_modules/raw-body/LICENSE b/node_modules/raw-body/LICENSE new file mode 100644 index 000000000..d695c8fd6 --- /dev/null +++ b/node_modules/raw-body/LICENSE @@ -0,0 +1,22 @@ +The MIT License (MIT) + +Copyright (c) 2013-2014 Jonathan Ong +Copyright (c) 2014-2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/raw-body/README.md b/node_modules/raw-body/README.md new file mode 100644 index 000000000..2ce79d27d --- /dev/null +++ b/node_modules/raw-body/README.md @@ -0,0 +1,219 @@ +# raw-body + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build status][travis-image]][travis-url] +[![Test coverage][coveralls-image]][coveralls-url] + +Gets the entire buffer of a stream either as a `Buffer` or a string. +Validates the stream's length against an expected length and maximum limit. +Ideal for parsing request bodies. + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install raw-body +``` + +### TypeScript + +This module includes a [TypeScript](https://www.typescriptlang.org/) +declaration file to enable auto complete in compatible editors and type +information for TypeScript projects. This module depends on the Node.js +types, so install `@types/node`: + +```sh +$ npm install @types/node +``` + +## API + + + +```js +var getRawBody = require('raw-body') +``` + +### getRawBody(stream, [options], [callback]) + +**Returns a promise if no callback specified and global `Promise` exists.** + +Options: + +- `length` - The length of the stream. + If the contents of the stream do not add up to this length, + an `400` error code is returned. +- `limit` - The byte limit of the body. + This is the number of bytes or any string format supported by + [bytes](https://www.npmjs.com/package/bytes), + for example `1000`, `'500kb'` or `'3mb'`. + If the body ends up being larger than this limit, + a `413` error code is returned. +- `encoding` - The encoding to use to decode the body into a string. + By default, a `Buffer` instance will be returned when no encoding is specified. + Most likely, you want `utf-8`, so setting `encoding` to `true` will decode as `utf-8`. + You can use any type of encoding supported by [iconv-lite](https://www.npmjs.org/package/iconv-lite#readme). + +You can also pass a string in place of options to just specify the encoding. + +If an error occurs, the stream will be paused, everything unpiped, +and you are responsible for correctly disposing the stream. +For HTTP requests, no handling is required if you send a response. +For streams that use file descriptors, you should `stream.destroy()` or `stream.close()` to prevent leaks. + +## Errors + +This module creates errors depending on the error condition during reading. +The error may be an error from the underlying Node.js implementation, but is +otherwise an error created by this module, which has the following attributes: + + * `limit` - the limit in bytes + * `length` and `expected` - the expected length of the stream + * `received` - the received bytes + * `encoding` - the invalid encoding + * `status` and `statusCode` - the corresponding status code for the error + * `type` - the error type + +### Types + +The errors from this module have a `type` property which allows for the progamatic +determination of the type of error returned. 
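The subsections below enumerate the individual `type` values. As a rough sketch (not taken from the module's own docs), an application using the callback style might branch on `err.type` once an error is reported; the `readBody` helper name here is invented purely for illustration:

```js
var getRawBody = require('raw-body')

// Hypothetical wrapper: read a request body and translate raw-body's
// typed errors into application-level messages.
function readBody (req, callback) {
  getRawBody(req, { limit: '1mb', encoding: true }, function (err, body) {
    if (!err) return callback(null, body)

    switch (err.type) {
      case 'entity.too.large':
        // err.limit and err.received are set for this error type
        return callback(new Error('body exceeded ' + err.limit + ' bytes'))
      case 'encoding.unsupported':
        return callback(new Error('cannot decode body as ' + err.encoding))
      default:
        return callback(err)
    }
  })
}
```

Branching on `err.type` (or `err.statusCode`) rather than on `err.message` keeps such handling stable even if the wording of the error messages changes between releases.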
+ +#### encoding.unsupported + +This error will occur when the `encoding` option is specified, but the value does +not map to an encoding supported by the [iconv-lite](https://www.npmjs.org/package/iconv-lite#readme) +module. + +#### entity.too.large + +This error will occur when the `limit` option is specified, but the stream has +an entity that is larger. + +#### request.aborted + +This error will occur when the request stream is aborted by the client before +reading the body has finished. + +#### request.size.invalid + +This error will occur when the `length` option is specified, but the stream has +emitted more bytes. + +#### stream.encoding.set + +This error will occur when the given stream has an encoding set on it, making it +a decoded stream. The stream should not have an encoding set and is expected to +emit `Buffer` objects. + +## Examples + +### Simple Express example + +```js +var contentType = require('content-type') +var express = require('express') +var getRawBody = require('raw-body') + +var app = express() + +app.use(function (req, res, next) { + getRawBody(req, { + length: req.headers['content-length'], + limit: '1mb', + encoding: contentType.parse(req).parameters.charset + }, function (err, string) { + if (err) return next(err) + req.text = string + next() + }) +}) + +// now access req.text +``` + +### Simple Koa example + +```js +var contentType = require('content-type') +var getRawBody = require('raw-body') +var koa = require('koa') + +var app = koa() + +app.use(function * (next) { + this.text = yield getRawBody(this.req, { + length: this.req.headers['content-length'], + limit: '1mb', + encoding: contentType.parse(this.req).parameters.charset + }) + yield next +}) + +// now access this.text +``` + +### Using as a promise + +To use this library as a promise, simply omit the `callback` and a promise is +returned, provided that a global `Promise` is defined. 
+ +```js +var getRawBody = require('raw-body') +var http = require('http') + +var server = http.createServer(function (req, res) { + getRawBody(req) + .then(function (buf) { + res.statusCode = 200 + res.end(buf.length + ' bytes submitted') + }) + .catch(function (err) { + res.statusCode = 500 + res.end(err.message) + }) +}) + +server.listen(3000) +``` + +### Using with TypeScript + +```ts +import * as getRawBody from 'raw-body'; +import * as http from 'http'; + +const server = http.createServer((req, res) => { + getRawBody(req) + .then((buf) => { + res.statusCode = 200; + res.end(buf.length + ' bytes submitted'); + }) + .catch((err) => { + res.statusCode = err.statusCode; + res.end(err.message); + }); +}); + +server.listen(3000); +``` + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/raw-body.svg +[npm-url]: https://npmjs.org/package/raw-body +[node-version-image]: https://img.shields.io/node/v/raw-body.svg +[node-version-url]: https://nodejs.org/en/download/ +[travis-image]: https://img.shields.io/travis/stream-utils/raw-body/master.svg +[travis-url]: https://travis-ci.org/stream-utils/raw-body +[coveralls-image]: https://img.shields.io/coveralls/stream-utils/raw-body/master.svg +[coveralls-url]: https://coveralls.io/r/stream-utils/raw-body?branch=master +[downloads-image]: https://img.shields.io/npm/dm/raw-body.svg +[downloads-url]: https://npmjs.org/package/raw-body diff --git a/node_modules/raw-body/index.d.ts b/node_modules/raw-body/index.d.ts new file mode 100644 index 000000000..dcbbebd4c --- /dev/null +++ b/node_modules/raw-body/index.d.ts @@ -0,0 +1,87 @@ +import { Readable } from 'stream'; + +declare namespace getRawBody { + export type Encoding = string | true; + + export interface Options { + /** + * The expected length of the stream. + */ + length?: number | string | null; + /** + * The byte limit of the body. This is the number of bytes or any string + * format supported by `bytes`, for example `1000`, `'500kb'` or `'3mb'`. + */ + limit?: number | string | null; + /** + * The encoding to use to decode the body into a string. By default, a + * `Buffer` instance will be returned when no encoding is specified. Most + * likely, you want `utf-8`, so setting encoding to `true` will decode as + * `utf-8`. You can use any type of encoding supported by `iconv-lite`. + */ + encoding?: Encoding | null; + } + + export interface RawBodyError extends Error { + /** + * The limit in bytes. + */ + limit?: number; + /** + * The expected length of the stream. + */ + length?: number; + expected?: number; + /** + * The received bytes. + */ + received?: number; + /** + * The encoding. + */ + encoding?: string; + /** + * The corresponding status code for the error. + */ + status: number; + statusCode: number; + /** + * The error type. + */ + type: string; + } +} + +/** + * Gets the entire buffer of a stream either as a `Buffer` or a string. + * Validates the stream's length against an expected length and maximum + * limit. Ideal for parsing request bodies. 
+ */ +declare function getRawBody( + stream: Readable, + callback: (err: getRawBody.RawBodyError, body: Buffer) => void +): void; + +declare function getRawBody( + stream: Readable, + options: (getRawBody.Options & { encoding: getRawBody.Encoding }) | getRawBody.Encoding, + callback: (err: getRawBody.RawBodyError, body: string) => void +): void; + +declare function getRawBody( + stream: Readable, + options: getRawBody.Options, + callback: (err: getRawBody.RawBodyError, body: Buffer) => void +): void; + +declare function getRawBody( + stream: Readable, + options: (getRawBody.Options & { encoding: getRawBody.Encoding }) | getRawBody.Encoding +): Promise; + +declare function getRawBody( + stream: Readable, + options?: getRawBody.Options +): Promise; + +export = getRawBody; diff --git a/node_modules/raw-body/index.js b/node_modules/raw-body/index.js new file mode 100644 index 000000000..7fe818604 --- /dev/null +++ b/node_modules/raw-body/index.js @@ -0,0 +1,286 @@ +/*! + * raw-body + * Copyright(c) 2013-2014 Jonathan Ong + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var bytes = require('bytes') +var createError = require('http-errors') +var iconv = require('iconv-lite') +var unpipe = require('unpipe') + +/** + * Module exports. + * @public + */ + +module.exports = getRawBody + +/** + * Module variables. + * @private + */ + +var ICONV_ENCODING_MESSAGE_REGEXP = /^Encoding not recognized: / + +/** + * Get the decoder for a given encoding. + * + * @param {string} encoding + * @private + */ + +function getDecoder (encoding) { + if (!encoding) return null + + try { + return iconv.getDecoder(encoding) + } catch (e) { + // error getting decoder + if (!ICONV_ENCODING_MESSAGE_REGEXP.test(e.message)) throw e + + // the encoding was not found + throw createError(415, 'specified encoding unsupported', { + encoding: encoding, + type: 'encoding.unsupported' + }) + } +} + +/** + * Get the raw body of a stream (typically HTTP). + * + * @param {object} stream + * @param {object|string|function} [options] + * @param {function} [callback] + * @public + */ + +function getRawBody (stream, options, callback) { + var done = callback + var opts = options || {} + + if (options === true || typeof options === 'string') { + // short cut for encoding + opts = { + encoding: options + } + } + + if (typeof options === 'function') { + done = options + opts = {} + } + + // validate callback is a function, if provided + if (done !== undefined && typeof done !== 'function') { + throw new TypeError('argument callback must be a function') + } + + // require the callback without promises + if (!done && !global.Promise) { + throw new TypeError('argument callback is required') + } + + // get encoding + var encoding = opts.encoding !== true + ? opts.encoding + : 'utf-8' + + // convert the limit to an integer + var limit = bytes.parse(opts.limit) + + // convert the expected length to an integer + var length = opts.length != null && !isNaN(opts.length) + ? parseInt(opts.length, 10) + : null + + if (done) { + // classic callback style + return readStream(stream, encoding, length, limit, done) + } + + return new Promise(function executor (resolve, reject) { + readStream(stream, encoding, length, limit, function onRead (err, buf) { + if (err) return reject(err) + resolve(buf) + }) + }) +} + +/** + * Halt a stream. 
+ * + * @param {Object} stream + * @private + */ + +function halt (stream) { + // unpipe everything from the stream + unpipe(stream) + + // pause stream + if (typeof stream.pause === 'function') { + stream.pause() + } +} + +/** + * Read the data from the stream. + * + * @param {object} stream + * @param {string} encoding + * @param {number} length + * @param {number} limit + * @param {function} callback + * @public + */ + +function readStream (stream, encoding, length, limit, callback) { + var complete = false + var sync = true + + // check the length and limit options. + // note: we intentionally leave the stream paused, + // so users should handle the stream themselves. + if (limit !== null && length !== null && length > limit) { + return done(createError(413, 'request entity too large', { + expected: length, + length: length, + limit: limit, + type: 'entity.too.large' + })) + } + + // streams1: assert request encoding is buffer. + // streams2+: assert the stream encoding is buffer. + // stream._decoder: streams1 + // state.encoding: streams2 + // state.decoder: streams2, specifically < 0.10.6 + var state = stream._readableState + if (stream._decoder || (state && (state.encoding || state.decoder))) { + // developer error + return done(createError(500, 'stream encoding should not be set', { + type: 'stream.encoding.set' + })) + } + + var received = 0 + var decoder + + try { + decoder = getDecoder(encoding) + } catch (err) { + return done(err) + } + + var buffer = decoder + ? '' + : [] + + // attach listeners + stream.on('aborted', onAborted) + stream.on('close', cleanup) + stream.on('data', onData) + stream.on('end', onEnd) + stream.on('error', onEnd) + + // mark sync section complete + sync = false + + function done () { + var args = new Array(arguments.length) + + // copy arguments + for (var i = 0; i < args.length; i++) { + args[i] = arguments[i] + } + + // mark complete + complete = true + + if (sync) { + process.nextTick(invokeCallback) + } else { + invokeCallback() + } + + function invokeCallback () { + cleanup() + + if (args[0]) { + // halt the stream on error + halt(stream) + } + + callback.apply(null, args) + } + } + + function onAborted () { + if (complete) return + + done(createError(400, 'request aborted', { + code: 'ECONNABORTED', + expected: length, + length: length, + received: received, + type: 'request.aborted' + })) + } + + function onData (chunk) { + if (complete) return + + received += chunk.length + + if (limit !== null && received > limit) { + done(createError(413, 'request entity too large', { + limit: limit, + received: received, + type: 'entity.too.large' + })) + } else if (decoder) { + buffer += decoder.write(chunk) + } else { + buffer.push(chunk) + } + } + + function onEnd (err) { + if (complete) return + if (err) return done(err) + + if (length !== null && received !== length) { + done(createError(400, 'request size did not match content length', { + expected: length, + length: length, + received: received, + type: 'request.size.invalid' + })) + } else { + var string = decoder + ? 
buffer + (decoder.end() || '') + : Buffer.concat(buffer) + done(null, string) + } + } + + function cleanup () { + buffer = null + + stream.removeListener('aborted', onAborted) + stream.removeListener('data', onData) + stream.removeListener('end', onEnd) + stream.removeListener('error', onEnd) + stream.removeListener('close', cleanup) + } +} diff --git a/node_modules/raw-body/package.json b/node_modules/raw-body/package.json new file mode 100644 index 000000000..1302755a0 --- /dev/null +++ b/node_modules/raw-body/package.json @@ -0,0 +1,90 @@ +{ + "_from": "raw-body@2.4.0", + "_id": "raw-body@2.4.0", + "_inBundle": false, + "_integrity": "sha512-4Oz8DUIwdvoa5qMJelxipzi/iJIi40O5cGV1wNYp5hvZP8ZN0T+jiNkL0QepXs+EsQ9XJ8ipEDoiH70ySUJP3Q==", + "_location": "/raw-body", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "raw-body@2.4.0", + "name": "raw-body", + "escapedName": "raw-body", + "rawSpec": "2.4.0", + "saveSpec": null, + "fetchSpec": "2.4.0" + }, + "_requiredBy": [ + "/body-parser" + ], + "_resolved": "https://registry.npmjs.org/raw-body/-/raw-body-2.4.0.tgz", + "_shasum": "a1ce6fb9c9bc356ca52e89256ab59059e13d0332", + "_spec": "raw-body@2.4.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\body-parser", + "author": { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + }, + "bugs": { + "url": "https://github.com/stream-utils/raw-body/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Raynos", + "email": "raynos2@gmail.com" + } + ], + "dependencies": { + "bytes": "3.1.0", + "http-errors": "1.7.2", + "iconv-lite": "0.4.24", + "unpipe": "1.0.0" + }, + "deprecated": false, + "description": "Get and validate the raw body of a readable stream.", + "devDependencies": { + "bluebird": "3.5.4", + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.16.0", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "mocha": "6.1.3", + "readable-stream": "2.3.6", + "safe-buffer": "5.1.2" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "README.md", + "index.d.ts", + "index.js" + ], + "homepage": "https://github.com/stream-utils/raw-body#readme", + "license": "MIT", + "name": "raw-body", + "repository": { + "type": "git", + "url": "git+https://github.com/stream-utils/raw-body.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --trace-deprecation --reporter spec --bail --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --trace-deprecation --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --trace-deprecation --reporter spec --check-leaks test/" + }, + "version": "2.4.0" +} diff --git a/node_modules/rc/LICENSE.APACHE2 b/node_modules/rc/LICENSE.APACHE2 new file mode 100644 index 000000000..6366c0471 --- /dev/null +++ b/node_modules/rc/LICENSE.APACHE2 @@ -0,0 +1,15 @@ +Apache License, Version 2.0 + +Copyright (c) 2011 Dominic Tarr + +Licensed under the Apache License, Version 2.0 (the "License"); +you may not use this file except in compliance with the License. 
+You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + +Unless required by applicable law or agreed to in writing, software +distributed under the License is distributed on an "AS IS" BASIS, +WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +See the License for the specific language governing permissions and +limitations under the License. diff --git a/node_modules/rc/LICENSE.BSD b/node_modules/rc/LICENSE.BSD new file mode 100644 index 000000000..96bb796aa --- /dev/null +++ b/node_modules/rc/LICENSE.BSD @@ -0,0 +1,26 @@ +Copyright (c) 2013, Dominic Tarr +All rights reserved. + +Redistribution and use in source and binary forms, with or without +modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this + list of conditions and the following disclaimer. +2. Redistributions in binary form must reproduce the above copyright notice, + this list of conditions and the following disclaimer in the documentation + and/or other materials provided with the distribution. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND +ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED +WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE +DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE FOR +ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES +(INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; +LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND +ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT +(INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS +SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. + +The views and conclusions contained in the software and documentation are those +of the authors and should not be interpreted as representing official policies, +either expressed or implied, of the FreeBSD Project. diff --git a/node_modules/rc/LICENSE.MIT b/node_modules/rc/LICENSE.MIT new file mode 100644 index 000000000..6eafbd734 --- /dev/null +++ b/node_modules/rc/LICENSE.MIT @@ -0,0 +1,24 @@ +The MIT License + +Copyright (c) 2011 Dominic Tarr + +Permission is hereby granted, free of charge, +to any person obtaining a copy of this software and +associated documentation files (the "Software"), to +deal in the Software without restriction, including +without limitation the rights to use, copy, modify, +merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom +the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice +shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES +OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR +ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/rc/README.md b/node_modules/rc/README.md new file mode 100644 index 000000000..e6522e267 --- /dev/null +++ b/node_modules/rc/README.md @@ -0,0 +1,227 @@ +# rc + +The non-configurable configuration loader for lazy people. + +## Usage + +The only option is to pass rc the name of your app, and your default configuration. + +```javascript +var conf = require('rc')(appname, { + //defaults go here. + port: 2468, + + //defaults which are objects will be merged, not replaced + views: { + engine: 'jade' + } +}); +``` + +`rc` will return your configuration options merged with the defaults you specify. +If you pass in a predefined defaults object, it will be mutated: + +```javascript +var conf = {}; +require('rc')(appname, conf); +``` + +If `rc` finds any config files for your app, the returned config object will have +a `configs` array containing their paths: + +```javascript +var appCfg = require('rc')(appname, conf); +appCfg.configs[0] // /etc/appnamerc +appCfg.configs[1] // /home/dominictarr/.config/appname +appCfg.config // same as appCfg.configs[appCfg.configs.length - 1] +``` + +## Standards + +Given your application name (`appname`), rc will look in all the obvious places for configuration. + + * command line arguments, parsed by minimist _(e.g. `--foo baz`, also nested: `--foo.bar=baz`)_ + * environment variables prefixed with `${appname}_` + * or use "\_\_" to indicate nested properties
_(e.g. `appname_foo__bar__baz` => `foo.bar.baz`)_ + * if you passed an option `--config file` then from that file + * a local `.${appname}rc` or the first found looking in `./ ../ ../../ ../../../` etc. + * `$HOME/.${appname}rc` + * `$HOME/.${appname}/config` + * `$HOME/.config/${appname}` + * `$HOME/.config/${appname}/config` + * `/etc/${appname}rc` + * `/etc/${appname}/config` + * the defaults object you passed in. + +All configuration sources that were found will be flattened into one object, +so that sources **earlier** in this list override later ones. + + +## Configuration File Formats + +Configuration files (e.g. `.appnamerc`) may be in either [json](http://json.org/example) or [ini](http://en.wikipedia.org/wiki/INI_file) format. **No** file extension (`.json` or `.ini`) should be used. The example configurations below are equivalent: + + +#### Formatted as `ini` + +``` +; You can include comments in `ini` format if you want. + +dependsOn=0.10.0 + + +; `rc` has built-in support for ini sections, see? + +[commands] + www = ./commands/www + console = ./commands/repl + + +; You can even do nested sections + +[generators.options] + engine = ejs + +[generators.modules] + new = generate-new + engine = generate-backend + +``` + +#### Formatted as `json` + +```javascript +{ + // You can even comment your JSON, if you want + "dependsOn": "0.10.0", + "commands": { + "www": "./commands/www", + "console": "./commands/repl" + }, + "generators": { + "options": { + "engine": "ejs" + }, + "modules": { + "new": "generate-new", + "backend": "generate-backend" + } + } +} +``` + +Comments are stripped from JSON config via [strip-json-comments](https://github.com/sindresorhus/strip-json-comments). + +> Since ini, and env variables do not have a standard for types, your application needs be prepared for strings. + +To ensure that string representations of booleans and numbers are always converted into their proper types (especially useful if you intend to do strict `===` comparisons), consider using a module such as [parse-strings-in-object](https://github.com/anselanza/parse-strings-in-object) to wrap the config object returned from rc. + + +## Simple example demonstrating precedence +Assume you have an application like this (notice the hard-coded defaults passed to rc): +``` +const conf = require('rc')('myapp', { + port: 12345, + mode: 'test' +}); + +console.log(JSON.stringify(conf, null, 2)); +``` +You also have a file `config.json`, with these contents: +``` +{ + "port": 9000, + "foo": "from config json", + "something": "else" +} +``` +And a file `.myapprc` in the same folder, with these contents: +``` +{ + "port": "3001", + "foo": "bar" +} +``` +Here is the expected output from various commands: + +`node .` +``` +{ + "port": "3001", + "mode": "test", + "foo": "bar", + "_": [], + "configs": [ + "/Users/stephen/repos/conftest/.myapprc" + ], + "config": "/Users/stephen/repos/conftest/.myapprc" +} +``` +*Default `mode` from hard-coded object is retained, but port is overridden by `.myapprc` file (automatically found based on appname match), and `foo` is added.* + + +`node . --foo baz` +``` +{ + "port": "3001", + "mode": "test", + "foo": "baz", + "_": [], + "configs": [ + "/Users/stephen/repos/conftest/.myapprc" + ], + "config": "/Users/stephen/repos/conftest/.myapprc" +} +``` +*Same result as above but `foo` is overridden because command-line arguments take precedence over `.myapprc` file.* + +`node . 
--foo barbar --config config.json` +``` +{ + "port": 9000, + "mode": "test", + "foo": "barbar", + "something": "else", + "_": [], + "config": "config.json", + "configs": [ + "/Users/stephen/repos/conftest/.myapprc", + "config.json" + ] +} +``` +*Now the `port` comes from the `config.json` file specified (overriding the value from `.myapprc`), and `foo` value is overriden by command-line despite also being specified in the `config.json` file.* + + + +## Advanced Usage + +#### Pass in your own `argv` + +You may pass in your own `argv` as the third argument to `rc`. This is in case you want to [use your own command-line opts parser](https://github.com/dominictarr/rc/pull/12). + +```javascript +require('rc')(appname, defaults, customArgvParser); +``` + +## Pass in your own parser + +If you have a special need to use a non-standard parser, +you can do so by passing in the parser as the 4th argument. +(leave the 3rd as null to get the default args parser) + +```javascript +require('rc')(appname, defaults, null, parser); +``` + +This may also be used to force a more strict format, +such as strict, valid JSON only. + +## Note on Performance + +`rc` is running `fs.statSync`-- so make sure you don't use it in a hot code path (e.g. a request handler) + + +## License + +Multi-licensed under the two-clause BSD License, MIT License, or Apache License, version 2.0 diff --git a/node_modules/rc/browser.js b/node_modules/rc/browser.js new file mode 100644 index 000000000..8c230c5cd --- /dev/null +++ b/node_modules/rc/browser.js @@ -0,0 +1,7 @@ + +// when this is loaded into the browser, +// just use the defaults... + +module.exports = function (name, defaults) { + return defaults +} diff --git a/node_modules/rc/cli.js b/node_modules/rc/cli.js new file mode 100644 index 000000000..ab05b6072 --- /dev/null +++ b/node_modules/rc/cli.js @@ -0,0 +1,4 @@ +#! /usr/bin/env node +var rc = require('./index') + +console.log(JSON.stringify(rc(process.argv[2]), false, 2)) diff --git a/node_modules/rc/index.js b/node_modules/rc/index.js new file mode 100644 index 000000000..65eb47afe --- /dev/null +++ b/node_modules/rc/index.js @@ -0,0 +1,53 @@ +var cc = require('./lib/utils') +var join = require('path').join +var deepExtend = require('deep-extend') +var etc = '/etc' +var win = process.platform === "win32" +var home = win + ? process.env.USERPROFILE + : process.env.HOME + +module.exports = function (name, defaults, argv, parse) { + if('string' !== typeof name) + throw new Error('rc(name): name *must* be string') + if(!argv) + argv = require('minimist')(process.argv.slice(2)) + defaults = ( + 'string' === typeof defaults + ? cc.json(defaults) : defaults + ) || {} + + parse = parse || cc.parse + + var env = cc.env(name + '_') + + var configs = [defaults] + var configFiles = [] + function addConfigFile (file) { + if (configFiles.indexOf(file) >= 0) return + var fileConfig = cc.file(file) + if (fileConfig) { + configs.push(parse(fileConfig)) + configFiles.push(file) + } + } + + // which files do we look at? + if (!win) + [join(etc, name, 'config'), + join(etc, name + 'rc')].forEach(addConfigFile) + if (home) + [join(home, '.config', name, 'config'), + join(home, '.config', name), + join(home, '.' + name, 'config'), + join(home, '.' + name + 'rc')].forEach(addConfigFile) + addConfigFile(cc.find('.'+name+'rc')) + if (env.config) addConfigFile(env.config) + if (argv.config) addConfigFile(argv.config) + + return deepExtend.apply(null, configs.concat([ + env, + argv, + configFiles.length ? 
{configs: configFiles, config: configFiles[configFiles.length - 1]} : undefined, + ])) +} diff --git a/node_modules/rc/lib/utils.js b/node_modules/rc/lib/utils.js new file mode 100644 index 000000000..8b3beffa3 --- /dev/null +++ b/node_modules/rc/lib/utils.js @@ -0,0 +1,104 @@ +'use strict'; +var fs = require('fs') +var ini = require('ini') +var path = require('path') +var stripJsonComments = require('strip-json-comments') + +var parse = exports.parse = function (content) { + + //if it ends in .json or starts with { then it must be json. + //must be done this way, because ini accepts everything. + //can't just try and parse it and let it throw if it's not ini. + //everything is ini. even json with a syntax error. + + if(/^\s*{/.test(content)) + return JSON.parse(stripJsonComments(content)) + return ini.parse(content) + +} + +var file = exports.file = function () { + var args = [].slice.call(arguments).filter(function (arg) { return arg != null }) + + //path.join breaks if it's a not a string, so just skip this. + for(var i in args) + if('string' !== typeof args[i]) + return + + var file = path.join.apply(null, args) + var content + try { + return fs.readFileSync(file,'utf-8') + } catch (err) { + return + } +} + +var json = exports.json = function () { + var content = file.apply(null, arguments) + return content ? parse(content) : null +} + +var env = exports.env = function (prefix, env) { + env = env || process.env + var obj = {} + var l = prefix.length + for(var k in env) { + if(k.toLowerCase().indexOf(prefix.toLowerCase()) === 0) { + + var keypath = k.substring(l).split('__') + + // Trim empty strings from keypath array + var _emptyStringIndex + while ((_emptyStringIndex=keypath.indexOf('')) > -1) { + keypath.splice(_emptyStringIndex, 1) + } + + var cursor = obj + keypath.forEach(function _buildSubObj(_subkey,i){ + + // (check for _subkey first so we ignore empty strings) + // (check for cursor to avoid assignment to primitive objects) + if (!_subkey || typeof cursor !== 'object') + return + + // If this is the last key, just stuff the value in there + // Assigns actual value from env variable to final key + // (unless it's just an empty string- in that case use the last valid key) + if (i === keypath.length-1) + cursor[_subkey] = env[k] + + + // Build sub-object if nothing already exists at the keypath + if (cursor[_subkey] === undefined) + cursor[_subkey] = {} + + // Increment cursor used to track the object at the current depth + cursor = cursor[_subkey] + + }) + + } + + } + + return obj +} + +var find = exports.find = function () { + var rel = path.join.apply(null, [].slice.call(arguments)) + + function find(start, rel) { + var file = path.join(start, rel) + try { + fs.statSync(file) + return file + } catch (err) { + if(path.dirname(start) !== start) // root + return find(path.dirname(start), rel) + } + } + return find(process.cwd(), rel) +} + + diff --git a/node_modules/rc/node_modules/ini/LICENSE b/node_modules/rc/node_modules/ini/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/rc/node_modules/ini/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. 
+ +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/rc/node_modules/ini/README.md b/node_modules/rc/node_modules/ini/README.md new file mode 100644 index 000000000..33df25829 --- /dev/null +++ b/node_modules/rc/node_modules/ini/README.md @@ -0,0 +1,102 @@ +An ini format parser and serializer for node. + +Sections are treated as nested objects. Items before the first +heading are saved on the object directly. + +## Usage + +Consider an ini-file `config.ini` that looks like this: + + ; this comment is being ignored + scope = global + + [database] + user = dbuser + password = dbpassword + database = use_this_database + + [paths.default] + datadir = /var/lib/data + array[] = first value + array[] = second value + array[] = third value + +You can read, manipulate and write the ini-file like so: + + var fs = require('fs') + , ini = require('ini') + + var config = ini.parse(fs.readFileSync('./config.ini', 'utf-8')) + + config.scope = 'local' + config.database.database = 'use_another_database' + config.paths.default.tmpdir = '/tmp' + delete config.paths.default.datadir + config.paths.default.array.push('fourth value') + + fs.writeFileSync('./config_modified.ini', ini.stringify(config, { section: 'section' })) + +This will result in a file called `config_modified.ini` being written +to the filesystem with the following content: + + [section] + scope=local + [section.database] + user=dbuser + password=dbpassword + database=use_another_database + [section.paths.default] + tmpdir=/tmp + array[]=first value + array[]=second value + array[]=third value + array[]=fourth value + + +## API + +### decode(inistring) + +Decode the ini-style formatted `inistring` into a nested object. + +### parse(inistring) + +Alias for `decode(inistring)` + +### encode(object, [options]) + +Encode the object `object` into an ini-style formatted string. If the +optional parameter `section` is given, then all top-level properties +of the object are put into this section and the `section`-string is +prepended to all sub-sections, see the usage example above. + +The `options` object may contain the following: + +* `section` A string which will be the first `section` in the encoded + ini data. Defaults to none. +* `whitespace` Boolean to specify whether to put whitespace around the + `=` character. By default, whitespace is omitted, to be friendly to + some persnickety old parsers that don't tolerate it well. But some + find that it's more human-readable and pretty with the whitespace. + +For backwards compatibility reasons, if a `string` options is passed +in, then it is assumed to be the `section` value. + +### stringify(object, [options]) + +Alias for `encode(object, [options])` + +### safe(val) + +Escapes the string `val` such that it is safe to be used as a key or +value in an ini-file. Basically escapes quotes. 
For example + + ini.safe('"unsafe string"') + +would result in + + "\"unsafe string\"" + +### unsafe(val) + +Unescapes the string `val` diff --git a/node_modules/rc/node_modules/ini/ini.js b/node_modules/rc/node_modules/ini/ini.js new file mode 100644 index 000000000..b576f08d7 --- /dev/null +++ b/node_modules/rc/node_modules/ini/ini.js @@ -0,0 +1,206 @@ +exports.parse = exports.decode = decode + +exports.stringify = exports.encode = encode + +exports.safe = safe +exports.unsafe = unsafe + +var eol = typeof process !== 'undefined' && + process.platform === 'win32' ? '\r\n' : '\n' + +function encode (obj, opt) { + var children = [] + var out = '' + + if (typeof opt === 'string') { + opt = { + section: opt, + whitespace: false, + } + } else { + opt = opt || {} + opt.whitespace = opt.whitespace === true + } + + var separator = opt.whitespace ? ' = ' : '=' + + Object.keys(obj).forEach(function (k, _, __) { + var val = obj[k] + if (val && Array.isArray(val)) { + val.forEach(function (item) { + out += safe(k + '[]') + separator + safe(item) + '\n' + }) + } else if (val && typeof val === 'object') + children.push(k) + else + out += safe(k) + separator + safe(val) + eol + }) + + if (opt.section && out.length) + out = '[' + safe(opt.section) + ']' + eol + out + + children.forEach(function (k, _, __) { + var nk = dotSplit(k).join('\\.') + var section = (opt.section ? opt.section + '.' : '') + nk + var child = encode(obj[k], { + section: section, + whitespace: opt.whitespace, + }) + if (out.length && child.length) + out += eol + + out += child + }) + + return out +} + +function dotSplit (str) { + return str.replace(/\1/g, '\u0002LITERAL\\1LITERAL\u0002') + .replace(/\\\./g, '\u0001') + .split(/\./).map(function (part) { + return part.replace(/\1/g, '\\.') + .replace(/\2LITERAL\\1LITERAL\2/g, '\u0001') + }) +} + +function decode (str) { + var out = {} + var p = out + var section = null + // section |key = value + var re = /^\[([^\]]*)\]$|^([^=]+)(=(.*))?$/i + var lines = str.split(/[\r\n]+/g) + + lines.forEach(function (line, _, __) { + if (!line || line.match(/^\s*[;#]/)) + return + var match = line.match(re) + if (!match) + return + if (match[1] !== undefined) { + section = unsafe(match[1]) + if (section === '__proto__') { + // not allowed + // keep parsing the section, but don't attach it. + p = {} + return + } + p = out[section] = out[section] || {} + return + } + var key = unsafe(match[2]) + if (key === '__proto__') + return + var value = match[3] ? unsafe(match[4]) : true + switch (value) { + case 'true': + case 'false': + case 'null': value = JSON.parse(value) + } + + // Convert keys with '[]' suffix to an array + if (key.length > 2 && key.slice(-2) === '[]') { + key = key.substring(0, key.length - 2) + if (key === '__proto__') + return + if (!p[key]) + p[key] = [] + else if (!Array.isArray(p[key])) + p[key] = [p[key]] + } + + // safeguard against resetting a previously defined + // array by accidentally forgetting the brackets + if (Array.isArray(p[key])) + p[key].push(value) + else + p[key] = value + }) + + // {a:{y:1},"a.b":{x:2}} --> {a:{y:1,b:{x:2}}} + // use a filter to return the keys that have to be deleted. + Object.keys(out).filter(function (k, _, __) { + if (!out[k] || + typeof out[k] !== 'object' || + Array.isArray(out[k])) + return false + + // see if the parent section is also an object. 
+ // if so, add it to that, and mark this one for deletion + var parts = dotSplit(k) + var p = out + var l = parts.pop() + var nl = l.replace(/\\\./g, '.') + parts.forEach(function (part, _, __) { + if (part === '__proto__') + return + if (!p[part] || typeof p[part] !== 'object') + p[part] = {} + p = p[part] + }) + if (p === out && nl === l) + return false + + p[nl] = out[k] + return true + }).forEach(function (del, _, __) { + delete out[del] + }) + + return out +} + +function isQuoted (val) { + return (val.charAt(0) === '"' && val.slice(-1) === '"') || + (val.charAt(0) === "'" && val.slice(-1) === "'") +} + +function safe (val) { + return (typeof val !== 'string' || + val.match(/[=\r\n]/) || + val.match(/^\[/) || + (val.length > 1 && + isQuoted(val)) || + val !== val.trim()) + ? JSON.stringify(val) + : val.replace(/;/g, '\\;').replace(/#/g, '\\#') +} + +function unsafe (val, doUnesc) { + val = (val || '').trim() + if (isQuoted(val)) { + // remove the single quotes before calling JSON.parse + if (val.charAt(0) === "'") + val = val.substr(1, val.length - 2) + + try { + val = JSON.parse(val) + } catch (_) {} + } else { + // walk the val to find the first not-escaped ; character + var esc = false + var unesc = '' + for (var i = 0, l = val.length; i < l; i++) { + var c = val.charAt(i) + if (esc) { + if ('\\;#'.indexOf(c) !== -1) + unesc += c + else + unesc += '\\' + c + + esc = false + } else if (';#'.indexOf(c) !== -1) + break + else if (c === '\\') + esc = true + else + unesc += c + } + if (esc) + unesc += '\\' + + return unesc.trim() + } + return val +} diff --git a/node_modules/rc/node_modules/ini/package.json b/node_modules/rc/node_modules/ini/package.json new file mode 100644 index 000000000..6c7ad4be1 --- /dev/null +++ b/node_modules/rc/node_modules/ini/package.json @@ -0,0 +1,66 @@ +{ + "_from": "ini@~1.3.0", + "_id": "ini@1.3.8", + "_inBundle": false, + "_integrity": "sha512-JV/yugV2uzW5iMRSiZAyDtQd+nxtUnjeLt0acNdw98kKLrvuRVyB80tsREOE7yvGVgalhZ6RNXCmEHkUKBKxew==", + "_location": "/rc/ini", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "ini@~1.3.0", + "name": "ini", + "escapedName": "ini", + "rawSpec": "~1.3.0", + "saveSpec": null, + "fetchSpec": "~1.3.0" + }, + "_requiredBy": [ + "/rc" + ], + "_resolved": "https://registry.npmjs.org/ini/-/ini-1.3.8.tgz", + "_shasum": "a29da425b48806f34767a4efce397269af28432c", + "_spec": "ini@~1.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\rc", + "author": { + "name": "Isaac Z. 
Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me/" + }, + "bugs": { + "url": "https://github.com/isaacs/ini/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "An ini encoder/decoder for node", + "devDependencies": { + "eslint": "^7.9.0", + "eslint-plugin-import": "^2.22.0", + "eslint-plugin-node": "^11.1.0", + "eslint-plugin-promise": "^4.2.1", + "eslint-plugin-standard": "^4.0.1", + "tap": "14" + }, + "files": [ + "ini.js" + ], + "homepage": "https://github.com/isaacs/ini#readme", + "license": "ISC", + "main": "ini.js", + "name": "ini", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/ini.git" + }, + "scripts": { + "eslint": "eslint", + "lint": "npm run eslint -- ini.js test/*.js", + "lintfix": "npm run lint -- --fix", + "posttest": "npm run lint", + "postversion": "npm publish", + "prepublishOnly": "git push origin --follow-tags", + "preversion": "npm test", + "test": "tap" + }, + "version": "1.3.8" +} diff --git a/node_modules/rc/package.json b/node_modules/rc/package.json new file mode 100644 index 000000000..5c58efff9 --- /dev/null +++ b/node_modules/rc/package.json @@ -0,0 +1,65 @@ +{ + "_from": "rc@^1.2.8", + "_id": "rc@1.2.8", + "_inBundle": false, + "_integrity": "sha512-y3bGgqKj3QBdxLbLkomlohkvsA8gdAiUQlSBJnBhfn+BPxg4bc62d8TcBW15wavDfgexCgccckhcZvywyQYPOw==", + "_location": "/rc", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "rc@^1.2.8", + "name": "rc", + "escapedName": "rc", + "rawSpec": "^1.2.8", + "saveSpec": null, + "fetchSpec": "^1.2.8" + }, + "_requiredBy": [ + "/registry-auth-token", + "/registry-url" + ], + "_resolved": "https://registry.npmjs.org/rc/-/rc-1.2.8.tgz", + "_shasum": "cd924bf5200a075b83c188cd6b9e211b7fc0d3ed", + "_spec": "rc@^1.2.8", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\registry-auth-token", + "author": { + "name": "Dominic Tarr", + "email": "dominic.tarr@gmail.com", + "url": "dominictarr.com" + }, + "bin": { + "rc": "cli.js" + }, + "browser": "browser.js", + "bugs": { + "url": "https://github.com/dominictarr/rc/issues" + }, + "bundleDependencies": false, + "dependencies": { + "deep-extend": "^0.6.0", + "ini": "~1.3.0", + "minimist": "^1.2.0", + "strip-json-comments": "~2.0.1" + }, + "deprecated": false, + "description": "hardwired configuration loader", + "homepage": "https://github.com/dominictarr/rc#readme", + "keywords": [ + "config", + "rc", + "unix", + "defaults" + ], + "license": "(BSD-2-Clause OR MIT OR Apache-2.0)", + "main": "index.js", + "name": "rc", + "repository": { + "type": "git", + "url": "git+https://github.com/dominictarr/rc.git" + }, + "scripts": { + "test": "set -e; node test/test.js; node test/ini.js; node test/nested-env-vars.js" + }, + "version": "1.2.8" +} diff --git a/node_modules/rc/test/ini.js b/node_modules/rc/test/ini.js new file mode 100644 index 000000000..e6857f8b3 --- /dev/null +++ b/node_modules/rc/test/ini.js @@ -0,0 +1,16 @@ +var cc =require('../lib/utils') +var INI = require('ini') +var assert = require('assert') + +function test(obj) { + + var _json, _ini + var json = cc.parse (_json = JSON.stringify(obj)) + var ini = cc.parse (_ini = INI.stringify(obj)) + console.log(_ini, _json) + assert.deepEqual(json, ini) +} + + +test({hello: true}) + diff --git a/node_modules/rc/test/nested-env-vars.js b/node_modules/rc/test/nested-env-vars.js new file mode 100644 index 000000000..0ecd17634 --- /dev/null +++ b/node_modules/rc/test/nested-env-vars.js @@ -0,0 
+1,50 @@ + +var seed = Math.random(); +var n = 'rc'+ seed; +var N = 'RC'+ seed; +var assert = require('assert') + + +// Basic usage +process.env[n+'_someOpt__a'] = 42 +process.env[n+'_someOpt__x__'] = 99 +process.env[n+'_someOpt__a__b'] = 186 +process.env[n+'_someOpt__a__b__c'] = 243 +process.env[n+'_someOpt__x__y'] = 1862 +process.env[n+'_someOpt__z'] = 186577 + +// Should ignore empty strings from orphaned '__' +process.env[n+'_someOpt__z__x__'] = 18629 +process.env[n+'_someOpt__w__w__'] = 18629 + +// Leading '__' should ignore everything up to 'z' +process.env[n+'___z__i__'] = 9999 + +// should ignore case for config name section. +process.env[N+'_test_upperCase'] = 187 + +function testPrefix(prefix) { + var config = require('../')(prefix, { + option: true + }) + + console.log('\n\n------ nested-env-vars ------\n',{prefix: prefix}, '\n', config); + + assert.equal(config.option, true) + assert.equal(config.someOpt.a, 42) + assert.equal(config.someOpt.x, 99) + // Should not override `a` once it's been set + assert.equal(config.someOpt.a/*.b*/, 42) + // Should not override `x` once it's been set + assert.equal(config.someOpt.x/*.y*/, 99) + assert.equal(config.someOpt.z, 186577) + // Should not override `z` once it's been set + assert.equal(config.someOpt.z/*.x*/, 186577) + assert.equal(config.someOpt.w.w, 18629) + assert.equal(config.z.i, 9999) + + assert.equal(config.test_upperCase, 187) +} + +testPrefix(n); +testPrefix(N); diff --git a/node_modules/rc/test/test.js b/node_modules/rc/test/test.js new file mode 100644 index 000000000..4f6335189 --- /dev/null +++ b/node_modules/rc/test/test.js @@ -0,0 +1,59 @@ + +var n = 'rc'+Math.random() +var assert = require('assert') + +process.env[n+'_envOption'] = 42 + +var config = require('../')(n, { + option: true +}) + +console.log(config) + +assert.equal(config.option, true) +assert.equal(config.envOption, 42) + +var customArgv = require('../')(n, { + option: true +}, { // nopt-like argv + option: false, + envOption: 24, + argv: { + remain: [], + cooked: ['--no-option', '--envOption', '24'], + original: ['--no-option', '--envOption=24'] + } +}) + +console.log(customArgv) + +assert.equal(customArgv.option, false) +assert.equal(customArgv.envOption, 24) + +var fs = require('fs') +var path = require('path') +var jsonrc = path.resolve('.' 
+ n + 'rc'); + +fs.writeFileSync(jsonrc, [ + '{', + '// json overrides default', + '"option": false,', + '/* env overrides json */', + '"envOption": 24', + '}' +].join('\n')); + +var commentedJSON = require('../')(n, { + option: true +}) + +fs.unlinkSync(jsonrc); + +console.log(commentedJSON) + +assert.equal(commentedJSON.option, false) +assert.equal(commentedJSON.envOption, 42) + +assert.equal(commentedJSON.config, jsonrc) +assert.equal(commentedJSON.configs.length, 1) +assert.equal(commentedJSON.configs[0], jsonrc) diff --git a/node_modules/readdirp/LICENSE b/node_modules/readdirp/LICENSE new file mode 100644 index 000000000..037cbb4e6 --- /dev/null +++ b/node_modules/readdirp/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2012-2019 Thorsten Lorenz, Paul Miller (https://paulmillr.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/readdirp/README.md b/node_modules/readdirp/README.md new file mode 100644 index 000000000..465593c9d --- /dev/null +++ b/node_modules/readdirp/README.md @@ -0,0 +1,122 @@ +# readdirp [![Weekly downloads](https://img.shields.io/npm/dw/readdirp.svg)](https://github.com/paulmillr/readdirp) + +Recursive version of [fs.readdir](https://nodejs.org/api/fs.html#fs_fs_readdir_path_options_callback). Exposes a **stream API** and a **promise API**. + + +```sh +npm install readdirp +``` + +```javascript +const readdirp = require('readdirp'); + +// Use streams to achieve small RAM & CPU footprint. +// 1) Streams example with for-await. +for await (const entry of readdirp('.')) { + const {path} = entry; + console.log(`${JSON.stringify({path})}`); +} + +// 2) Streams example, non for-await. +// Print out all JS files along with their size within the current folder & subfolders. +readdirp('.', {fileFilter: '*.js', alwaysStat: true}) + .on('data', (entry) => { + const {path, stats: {size}} = entry; + console.log(`${JSON.stringify({path, size})}`); + }) + // Optionally call stream.destroy() in `warn()` in order to abort and cause 'close' to be emitted + .on('warn', error => console.error('non-fatal error', error)) + .on('error', error => console.error('fatal error', error)) + .on('end', () => console.log('done')); + +// 3) Promise example. More RAM and CPU than streams / for-await. +const files = await readdirp.promise('.'); +console.log(files.map(file => file.path)); + +// Other options. 
+readdirp('test', { + fileFilter: '*.js', + directoryFilter: ['!.git', '!*modules'] + // directoryFilter: (di) => di.basename.length === 9 + type: 'files_directories', + depth: 1 +}); +``` + +For more examples, check out `examples` directory. + +## API + +`const stream = readdirp(root[, options])` — **Stream API** + +- Reads given root recursively and returns a `stream` of [entry infos](#entryinfo) +- Optionally can be used like `for await (const entry of stream)` with node.js 10+ (`asyncIterator`). +- `on('data', (entry) => {})` [entry info](#entryinfo) for every file / dir. +- `on('warn', (error) => {})` non-fatal `Error` that prevents a file / dir from being processed. Example: inaccessible to the user. +- `on('error', (error) => {})` fatal `Error` which also ends the stream. Example: illegal options where passed. +- `on('end')` — we are done. Called when all entries were found and no more will be emitted. +- `on('close')` — stream is destroyed via `stream.destroy()`. + Could be useful if you want to manually abort even on a non fatal error. + At that point the stream is no longer `readable` and no more entries, warning or errors are emitted +- To learn more about streams, consult the very detailed [nodejs streams documentation](https://nodejs.org/api/stream.html) + or the [stream-handbook](https://github.com/substack/stream-handbook) + +`const entries = await readdirp.promise(root[, options])` — **Promise API**. Returns a list of [entry infos](#entryinfo). + +First argument is awalys `root`, path in which to start reading and recursing into subdirectories. + +### options + +- `fileFilter: ["*.js"]`: filter to include or exclude files. A `Function`, Glob string or Array of glob strings. + - **Function**: a function that takes an entry info as a parameter and returns true to include or false to exclude the entry + - **Glob string**: a string (e.g., `*.js`) which is matched using [picomatch](https://github.com/micromatch/picomatch), so go there for more + information. Globstars (`**`) are not supported since specifying a recursive pattern for an already recursive function doesn't make sense. Negated globs (as explained in the minimatch documentation) are allowed, e.g., `!*.txt` matches everything but text files. + - **Array of glob strings**: either need to be all inclusive or all exclusive (negated) patterns otherwise an error is thrown. + `['*.json', '*.js']` includes all JavaScript and Json files. + `['!.git', '!node_modules']` includes all directories except the '.git' and 'node_modules'. + - Directories that do not pass a filter will not be recursed into. +- `directoryFilter: ['!.git']`: filter to include/exclude directories found and to recurse into. Directories that do not pass a filter will not be recursed into. +- `depth: 5`: depth at which to stop recursing even if more subdirectories are found +- `type: 'files'`: determines if data events on the stream should be emitted for `'files'` (default), `'directories'`, `'files_directories'`, or `'all'`. Setting to `'all'` will also include entries for other types of file descriptors like character devices, unix sockets and named pipes. +- `alwaysStat: false`: always return `stats` property for every file. Default is `false`, readdirp will return `Dirent` entries. Setting it to `true` can double readdir execution time - use it only when you need file `size`, `mtime` etc. Cannot be enabled on node <10.10.0. +- `lstat: false`: include symlink entries in the stream along with files. 
When `true`, `fs.lstat` would be used instead of `fs.stat` + +### `EntryInfo` + +Has the following properties: + +- `path: 'assets/javascripts/react.js'`: path to the file/directory (relative to given root) +- `fullPath: '/Users/dev/projects/app/assets/javascripts/react.js'`: full path to the file/directory found +- `basename: 'react.js'`: name of the file/directory +- `dirent: fs.Dirent`: built-in [dir entry object](https://nodejs.org/api/fs.html#fs_class_fs_dirent) - only with `alwaysStat: false` +- `stats: fs.Stats`: built in [stat object](https://nodejs.org/api/fs.html#fs_class_fs_stats) - only with `alwaysStat: true` + +## Changelog + +- 3.5 (Oct 13, 2020) disallows recursive directory-based symlinks. + Before, it could have entered infinite loop. +- 3.4 (Mar 19, 2020) adds support for directory-based symlinks. +- 3.3 (Dec 6, 2019) stabilizes RAM consumption and enables perf management with `highWaterMark` option. Fixes race conditions related to `for-await` looping. +- 3.2 (Oct 14, 2019) improves performance by 250% and makes streams implementation more idiomatic. +- 3.1 (Jul 7, 2019) brings `bigint` support to `stat` output on Windows. This is backwards-incompatible for some cases. Be careful. It you use it incorrectly, you'll see "TypeError: Cannot mix BigInt and other types, use explicit conversions". +- 3.0 brings huge performance improvements and stream backpressure support. +- Upgrading 2.x to 3.x: + - Signature changed from `readdirp(options)` to `readdirp(root, options)` + - Replaced callback API with promise API. + - Renamed `entryType` option to `type` + - Renamed `entryType: 'both'` to `'files_directories'` + - `EntryInfo` + - Renamed `stat` to `stats` + - Emitted only when `alwaysStat: true` + - `dirent` is emitted instead of `stats` by default with `alwaysStat: false` + - Renamed `name` to `basename` + - Removed `parentDir` and `fullParentDir` properties +- Supported node.js versions: + - 3.x: node 8+ + - 2.x: node 0.6+ + +## License + +Copyright (c) 2012-2019 Thorsten Lorenz, Paul Miller () + +MIT License, see [LICENSE](LICENSE) file. 
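Tying the options and `EntryInfo` fields above together, here is a minimal sketch using the promise API; the `src` folder and the filter values are arbitrary examples rather than defaults, and `alwaysStat: true` assumes Node.js 10.10+ as noted above:

```js
const readdirp = require('readdirp');

(async () => {
  // JS/JSON files up to two levels deep, skipping .git, with stats attached
  // so that entry.stats.size is available on every entry.
  const entries = await readdirp.promise('src', {
    fileFilter: ['*.js', '*.json'],
    directoryFilter: ['!.git'],
    type: 'files',
    depth: 2,
    alwaysStat: true
  });

  for (const entry of entries) {
    // `path` is relative to the root passed above; `fullPath` is absolute.
    console.log(`${entry.path} (${entry.stats.size} bytes)`);
  }
})();
```

For large trees the stream form shown earlier is usually preferable, since the promise API buffers every entry in memory before resolving.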
diff --git a/node_modules/readdirp/index.d.ts b/node_modules/readdirp/index.d.ts new file mode 100644 index 000000000..cbbd76ca1 --- /dev/null +++ b/node_modules/readdirp/index.d.ts @@ -0,0 +1,43 @@ +// TypeScript Version: 3.2 + +/// + +import * as fs from 'fs'; +import { Readable } from 'stream'; + +declare namespace readdir { + interface EntryInfo { + path: string; + fullPath: string; + basename: string; + stats?: fs.Stats; + dirent?: fs.Dirent; + } + + interface ReaddirpOptions { + root?: string; + fileFilter?: string | string[] | ((entry: EntryInfo) => boolean); + directoryFilter?: string | string[] | ((entry: EntryInfo) => boolean); + type?: 'files' | 'directories' | 'files_directories' | 'all'; + lstat?: boolean; + depth?: number; + alwaysStat?: boolean; + } + + interface ReaddirpStream extends Readable, AsyncIterable { + read(): EntryInfo; + [Symbol.asyncIterator](): AsyncIterableIterator; + } + + function promise( + root: string, + options?: ReaddirpOptions + ): Promise; +} + +declare function readdir( + root: string, + options?: readdir.ReaddirpOptions +): readdir.ReaddirpStream; + +export = readdir; diff --git a/node_modules/readdirp/index.js b/node_modules/readdirp/index.js new file mode 100644 index 000000000..cf739b2dc --- /dev/null +++ b/node_modules/readdirp/index.js @@ -0,0 +1,287 @@ +'use strict'; + +const fs = require('fs'); +const { Readable } = require('stream'); +const sysPath = require('path'); +const { promisify } = require('util'); +const picomatch = require('picomatch'); + +const readdir = promisify(fs.readdir); +const stat = promisify(fs.stat); +const lstat = promisify(fs.lstat); +const realpath = promisify(fs.realpath); + +/** + * @typedef {Object} EntryInfo + * @property {String} path + * @property {String} fullPath + * @property {fs.Stats=} stats + * @property {fs.Dirent=} dirent + * @property {String} basename + */ + +const BANG = '!'; +const RECURSIVE_ERROR_CODE = 'READDIRP_RECURSIVE_ERROR'; +const NORMAL_FLOW_ERRORS = new Set(['ENOENT', 'EPERM', 'EACCES', 'ELOOP', RECURSIVE_ERROR_CODE]); +const FILE_TYPE = 'files'; +const DIR_TYPE = 'directories'; +const FILE_DIR_TYPE = 'files_directories'; +const EVERYTHING_TYPE = 'all'; +const ALL_TYPES = [FILE_TYPE, DIR_TYPE, FILE_DIR_TYPE, EVERYTHING_TYPE]; + +const isNormalFlowError = error => NORMAL_FLOW_ERRORS.has(error.code); +const [maj, min] = process.versions.node.split('.').slice(0, 2).map(n => Number.parseInt(n, 10)); +const wantBigintFsStats = process.platform === 'win32' && (maj > 10 || (maj === 10 && min >= 5)); + +const normalizeFilter = filter => { + if (filter === undefined) return; + if (typeof filter === 'function') return filter; + + if (typeof filter === 'string') { + const glob = picomatch(filter.trim()); + return entry => glob(entry.basename); + } + + if (Array.isArray(filter)) { + const positive = []; + const negative = []; + for (const item of filter) { + const trimmed = item.trim(); + if (trimmed.charAt(0) === BANG) { + negative.push(picomatch(trimmed.slice(1))); + } else { + positive.push(picomatch(trimmed)); + } + } + + if (negative.length > 0) { + if (positive.length > 0) { + return entry => + positive.some(f => f(entry.basename)) && !negative.some(f => f(entry.basename)); + } + return entry => !negative.some(f => f(entry.basename)); + } + return entry => positive.some(f => f(entry.basename)); + } +}; + +class ReaddirpStream extends Readable { + static get defaultOptions() { + return { + root: '.', + /* eslint-disable no-unused-vars */ + fileFilter: (path) => true, + directoryFilter: (path) => 
true, + /* eslint-enable no-unused-vars */ + type: FILE_TYPE, + lstat: false, + depth: 2147483648, + alwaysStat: false + }; + } + + constructor(options = {}) { + super({ + objectMode: true, + autoDestroy: true, + highWaterMark: options.highWaterMark || 4096 + }); + const opts = { ...ReaddirpStream.defaultOptions, ...options }; + const { root, type } = opts; + + this._fileFilter = normalizeFilter(opts.fileFilter); + this._directoryFilter = normalizeFilter(opts.directoryFilter); + + const statMethod = opts.lstat ? lstat : stat; + // Use bigint stats if it's windows and stat() supports options (node 10+). + if (wantBigintFsStats) { + this._stat = path => statMethod(path, { bigint: true }); + } else { + this._stat = statMethod; + } + + this._maxDepth = opts.depth; + this._wantsDir = [DIR_TYPE, FILE_DIR_TYPE, EVERYTHING_TYPE].includes(type); + this._wantsFile = [FILE_TYPE, FILE_DIR_TYPE, EVERYTHING_TYPE].includes(type); + this._wantsEverything = type === EVERYTHING_TYPE; + this._root = sysPath.resolve(root); + this._isDirent = ('Dirent' in fs) && !opts.alwaysStat; + this._statsProp = this._isDirent ? 'dirent' : 'stats'; + this._rdOptions = { encoding: 'utf8', withFileTypes: this._isDirent }; + + // Launch stream with one parent, the root dir. + this.parents = [this._exploreDir(root, 1)]; + this.reading = false; + this.parent = undefined; + } + + async _read(batch) { + if (this.reading) return; + this.reading = true; + + try { + while (!this.destroyed && batch > 0) { + const { path, depth, files = [] } = this.parent || {}; + + if (files.length > 0) { + const slice = files.splice(0, batch).map(dirent => this._formatEntry(dirent, path)); + for (const entry of await Promise.all(slice)) { + if (this.destroyed) return; + + const entryType = await this._getEntryType(entry); + if (entryType === 'directory' && this._directoryFilter(entry)) { + if (depth <= this._maxDepth) { + this.parents.push(this._exploreDir(entry.fullPath, depth + 1)); + } + + if (this._wantsDir) { + this.push(entry); + batch--; + } + } else if ((entryType === 'file' || this._includeAsFile(entry)) && this._fileFilter(entry)) { + if (this._wantsFile) { + this.push(entry); + batch--; + } + } + } + } else { + const parent = this.parents.pop(); + if (!parent) { + this.push(null); + break; + } + this.parent = await parent; + if (this.destroyed) return; + } + } + } catch (error) { + this.destroy(error); + } finally { + this.reading = false; + } + } + + async _exploreDir(path, depth) { + let files; + try { + files = await readdir(path, this._rdOptions); + } catch (error) { + this._onError(error); + } + return { files, depth, path }; + } + + async _formatEntry(dirent, path) { + let entry; + try { + const basename = this._isDirent ? dirent.name : dirent; + const fullPath = sysPath.resolve(sysPath.join(path, basename)); + entry = { path: sysPath.relative(this._root, fullPath), fullPath, basename }; + entry[this._statsProp] = this._isDirent ? 
dirent : await this._stat(fullPath); + } catch (err) { + this._onError(err); + } + return entry; + } + + _onError(err) { + if (isNormalFlowError(err) && !this.destroyed) { + this.emit('warn', err); + } else { + this.destroy(err); + } + } + + async _getEntryType(entry) { + // entry may be undefined, because a warning or an error were emitted + // and the statsProp is undefined + const stats = entry && entry[this._statsProp]; + if (!stats) { + return; + } + if (stats.isFile()) { + return 'file'; + } + if (stats.isDirectory()) { + return 'directory'; + } + if (stats && stats.isSymbolicLink()) { + const full = entry.fullPath; + try { + const entryRealPath = await realpath(full); + const entryRealPathStats = await lstat(entryRealPath); + if (entryRealPathStats.isFile()) { + return 'file'; + } + if (entryRealPathStats.isDirectory()) { + const len = entryRealPath.length; + if (full.startsWith(entryRealPath) && full.substr(len, 1) === sysPath.sep) { + const recursiveError = new Error( + `Circular symlink detected: "${full}" points to "${entryRealPath}"` + ); + recursiveError.code = RECURSIVE_ERROR_CODE; + return this._onError(recursiveError); + } + return 'directory'; + } + } catch (error) { + this._onError(error); + } + } + } + + _includeAsFile(entry) { + const stats = entry && entry[this._statsProp]; + + return stats && this._wantsEverything && !stats.isDirectory(); + } +} + +/** + * @typedef {Object} ReaddirpArguments + * @property {Function=} fileFilter + * @property {Function=} directoryFilter + * @property {String=} type + * @property {Number=} depth + * @property {String=} root + * @property {Boolean=} lstat + * @property {Boolean=} bigint + */ + +/** + * Main function which ends up calling readdirRec and reads all files and directories in given root recursively. + * @param {String} root Root directory + * @param {ReaddirpArguments=} options Options to specify root (start directory), filters and recursion depth + */ +const readdirp = (root, options = {}) => { + let type = options.entryType || options.type; + if (type === 'both') type = FILE_DIR_TYPE; // backwards-compatibility + if (type) options.type = type; + if (!root) { + throw new Error('readdirp: root argument is required. Usage: readdirp(root, options)'); + } else if (typeof root !== 'string') { + throw new TypeError('readdirp: root argument must be a string. Usage: readdirp(root, options)'); + } else if (type && !ALL_TYPES.includes(type)) { + throw new Error(`readdirp: Invalid type passed. 
Use one of ${ALL_TYPES.join(', ')}`); + } + + options.root = root; + return new ReaddirpStream(options); +}; + +const readdirpPromise = (root, options = {}) => { + return new Promise((resolve, reject) => { + const files = []; + readdirp(root, options) + .on('data', entry => files.push(entry)) + .on('end', () => resolve(files)) + .on('error', error => reject(error)); + }); +}; + +readdirp.promise = readdirpPromise; +readdirp.ReaddirpStream = ReaddirpStream; +readdirp.default = readdirp; + +module.exports = readdirp; diff --git a/node_modules/readdirp/package.json b/node_modules/readdirp/package.json new file mode 100644 index 000000000..e5b1dde95 --- /dev/null +++ b/node_modules/readdirp/package.json @@ -0,0 +1,158 @@ +{ + "_from": "readdirp@~3.6.0", + "_id": "readdirp@3.6.0", + "_inBundle": false, + "_integrity": "sha512-hOS089on8RduqdbhvQ5Z37A0ESjsqz6qnRcffsMU3495FuTdqSm+7bhJ29JvIOsBDEEnan5DPu9t3To9VRlMzA==", + "_location": "/readdirp", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "readdirp@~3.6.0", + "name": "readdirp", + "escapedName": "readdirp", + "rawSpec": "~3.6.0", + "saveSpec": null, + "fetchSpec": "~3.6.0" + }, + "_requiredBy": [ + "/chokidar" + ], + "_resolved": "https://registry.npmjs.org/readdirp/-/readdirp-3.6.0.tgz", + "_shasum": "74a370bd857116e245b29cc97340cd431a02a6c7", + "_spec": "readdirp@~3.6.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\chokidar", + "author": { + "name": "Thorsten Lorenz", + "email": "thlorenz@gmx.de", + "url": "thlorenz.com" + }, + "bugs": { + "url": "https://github.com/paulmillr/readdirp/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Thorsten Lorenz", + "email": "thlorenz@gmx.de", + "url": "thlorenz.com" + }, + { + "name": "Paul Miller", + "url": "https://paulmillr.com" + } + ], + "dependencies": { + "picomatch": "^2.2.1" + }, + "deprecated": false, + "description": "Recursive version of fs.readdir with streaming API.", + "devDependencies": { + "@types/node": "^14", + "chai": "^4.2", + "chai-subset": "^1.6", + "dtslint": "^3.3.0", + "eslint": "^7.0.0", + "mocha": "^7.1.1", + "nyc": "^15.0.0", + "rimraf": "^3.0.0", + "typescript": "^4.0.3" + }, + "engines": { + "node": ">=8.10.0" + }, + "eslintConfig": { + "root": true, + "extends": "eslint:recommended", + "parserOptions": { + "ecmaVersion": 9, + "sourceType": "script" + }, + "env": { + "node": true, + "es6": true + }, + "rules": { + "array-callback-return": "error", + "no-empty": [ + "error", + { + "allowEmptyCatch": true + } + ], + "no-else-return": [ + "error", + { + "allowElseIf": false + } + ], + "no-lonely-if": "error", + "no-var": "error", + "object-shorthand": "error", + "prefer-arrow-callback": [ + "error", + { + "allowNamedFunctions": true + } + ], + "prefer-const": [ + "error", + { + "ignoreReadBeforeAssign": true + } + ], + "prefer-destructuring": [ + "error", + { + "object": true, + "array": false + } + ], + "prefer-spread": "error", + "prefer-template": "error", + "radix": "error", + "semi": "error", + "strict": "error", + "quotes": [ + "error", + "single" + ] + } + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/paulmillr/readdirp", + "keywords": [ + "recursive", + "fs", + "stream", + "streams", + "readdir", + "filesystem", + "find", + "filter" + ], + "license": "MIT", + "main": "index.js", + "name": "readdirp", + "nyc": { + "reporter": [ + "html", + "text" + ] + }, + "repository": { + "type": "git", + "url": 
"git://github.com/paulmillr/readdirp.git" + }, + "scripts": { + "dtslint": "dtslint", + "lint": "eslint --report-unused-disable-directives --ignore-path .gitignore .", + "mocha": "mocha --exit", + "nyc": "nyc", + "test": "npm run lint && nyc npm run mocha" + }, + "version": "3.6.0" +} diff --git a/node_modules/regexp-clone/.travis.yml b/node_modules/regexp-clone/.travis.yml new file mode 100644 index 000000000..220a17bde --- /dev/null +++ b/node_modules/regexp-clone/.travis.yml @@ -0,0 +1,14 @@ +sudo: false +language: node_js +node_js: + - 8 + - 10 + - 12 +matrix: + include: + - node_js: "13" + env: "NVM_NODEJS_ORG_MIRROR=https://nodejs.org/download/nightly" + allow_failures: + # Allow the nightly installs to fail + - env: "NVM_NODEJS_ORG_MIRROR=https://nodejs.org/download/nightly" +script: "npm test" diff --git a/node_modules/regexp-clone/History.md b/node_modules/regexp-clone/History.md new file mode 100644 index 000000000..0beedfc35 --- /dev/null +++ b/node_modules/regexp-clone/History.md @@ -0,0 +1,5 @@ + +0.0.1 / 2013-04-17 +================== + + * initial commit diff --git a/node_modules/regexp-clone/LICENSE b/node_modules/regexp-clone/LICENSE new file mode 100644 index 000000000..98c7c8983 --- /dev/null +++ b/node_modules/regexp-clone/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2013 [Aaron Heckmann](aaron.heckmann+github@gmail.com) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/regexp-clone/Makefile b/node_modules/regexp-clone/Makefile new file mode 100644 index 000000000..6c8fb7516 --- /dev/null +++ b/node_modules/regexp-clone/Makefile @@ -0,0 +1,5 @@ + +test: + @./node_modules/.bin/mocha $(T) --async-only $(TESTS) + +.PHONY: test diff --git a/node_modules/regexp-clone/README.md b/node_modules/regexp-clone/README.md new file mode 100644 index 000000000..4a16bad3d --- /dev/null +++ b/node_modules/regexp-clone/README.md @@ -0,0 +1,42 @@ +#regexp-clone +============== + +Clones RegExps with flag and `lastIndex` preservation. 
+ +```js +const regexpClone = require('regexp-clone'); + +const a = /somethin/misguy; +console.log(a.global); // true +console.log(a.ignoreCase); // true +console.log(a.multiline); // true +console.log(a.dotAll); // true +console.log(a.unicode); // true +console.log(a.sticky); // true + +const b = regexpClone(a); +console.log(b.global); // true +console.log(b.ignoreCase); // true +console.log(b.multiline); // true +console.log(b.dotAll); // true +console.log(b.unicode); // true +console.log(b.sticky); // true + +const c = /hi/g; +c.test('this string hi there'); +assert.strictEqual(c.lastIndex, 3); + +const d = regexpClone(c); +assert.strictEqual(d.lastIndex, 3); +d.test('this string hi there'); +assert.strictEqual(d.lastIndex, 14); +assert.strictEqual(c.lastIndex, 3); +``` + +``` +npm install regexp-clone +``` + +## License + +[MIT](https://github.com/aheckmann/regexp-clone/blob/master/LICENSE) diff --git a/node_modules/regexp-clone/index.js b/node_modules/regexp-clone/index.js new file mode 100644 index 000000000..19512b392 --- /dev/null +++ b/node_modules/regexp-clone/index.js @@ -0,0 +1,27 @@ + +const toString = Object.prototype.toString; + +function isRegExp (o) { + return 'object' == typeof o + && '[object RegExp]' == toString.call(o); +} + +module.exports = exports = function (regexp) { + if (!isRegExp(regexp)) { + throw new TypeError('Not a RegExp'); + } + + const flags = []; + if (regexp.global) flags.push('g'); + if (regexp.multiline) flags.push('m'); + if (regexp.ignoreCase) flags.push('i'); + if (regexp.dotAll) flags.push('s'); + if (regexp.unicode) flags.push('u'); + if (regexp.sticky) flags.push('y'); + const result = new RegExp(regexp.source, flags.join('')); + if (typeof regexp.lastIndex === 'number') { + result.lastIndex = regexp.lastIndex; + } + return result; +} + diff --git a/node_modules/regexp-clone/package.json b/node_modules/regexp-clone/package.json new file mode 100644 index 000000000..1e9d49c6c --- /dev/null +++ b/node_modules/regexp-clone/package.json @@ -0,0 +1,55 @@ +{ + "_from": "regexp-clone@1.0.0", + "_id": "regexp-clone@1.0.0", + "_inBundle": false, + "_integrity": "sha512-TuAasHQNamyyJ2hb97IuBEif4qBHGjPHBS64sZwytpLEqtBQ1gPJTnOaQ6qmpET16cK14kkjbazl6+p0RRv0yw==", + "_location": "/regexp-clone", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "regexp-clone@1.0.0", + "name": "regexp-clone", + "escapedName": "regexp-clone", + "rawSpec": "1.0.0", + "saveSpec": null, + "fetchSpec": "1.0.0" + }, + "_requiredBy": [ + "/mongoose", + "/mquery" + ], + "_resolved": "https://registry.npmjs.org/regexp-clone/-/regexp-clone-1.0.0.tgz", + "_shasum": "222db967623277056260b992626354a04ce9bf63", + "_spec": "regexp-clone@1.0.0", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "Aaron Heckmann", + "email": "aaron.heckmann+github@gmail.com" + }, + "bugs": { + "url": "https://github.com/aheckmann/regexp-clone/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Clone RegExps with options", + "devDependencies": { + "mocha": "^6.1.4" + }, + "homepage": "https://github.com/aheckmann/regexp-clone#readme", + "keywords": [ + "RegExp", + "clone" + ], + "license": "MIT", + "main": "index.js", + "name": "regexp-clone", + "repository": { + "type": "git", + "url": "git://github.com/aheckmann/regexp-clone.git" + }, + "scripts": { + "test": "make test" + }, + "version": "1.0.0" +} diff --git a/node_modules/regexp-clone/test/index.js 
b/node_modules/regexp-clone/test/index.js new file mode 100644 index 000000000..ceaea287a --- /dev/null +++ b/node_modules/regexp-clone/test/index.js @@ -0,0 +1,171 @@ + +const assert = require('assert') +const clone = require('../'); + +describe('regexp-clone', function(){ + function hasEqualSource (a, b) { + assert.ok(a !== b); + assert.equal(a.source, b.source); + } + + function isIgnoreCase (a) { + assert.ok(a.ignoreCase); + } + + function isGlobal (a) { + assert.ok(a.global); + } + + function isMultiline (a) { + assert.ok(a.multiline); + } + + function isDotAll (a) { + assert.ok(a.dotAll); + } + + function isUnicode (a) { + assert.ok(a.unicode); + } + + function isSticky(a) { + assert.ok(a.sticky); + } + + function testFlag (a, method) { + const b = clone(a); + hasEqualSource(a, b); + method(a); + method(b); + } + + function lastIndex(a) { + a.test('this string hi there'); + assert.strictEqual(a.lastIndex, 3); + const b = clone(a); + assert.strictEqual(b.lastIndex, 3); + assert.strictEqual(a.lastIndex, 3); + b.test('this string hi there'); + assert.strictEqual(b.lastIndex, 14); + assert.strictEqual(a.lastIndex, 3); + } + + function allFlags(a) { + const b = clone(a); + hasEqualSource(a, b); + testFlag(b, isIgnoreCase); + testFlag(b, isGlobal); + testFlag(b, isMultiline); + testFlag(b, isDotAll); + testFlag(b, isUnicode); + testFlag(b, isSticky); + } + + function noFlags(a) { + const b = clone(a); + hasEqualSource(a, b); + assert.ok(!b.ignoreCase); + assert.ok(!b.global); + assert.ok(!b.multiline); + assert.ok(!b.dotAll); + assert.ok(!b.unicode); + assert.ok(!b.sticky); + } + + describe('literals', function(){ + it('ignoreCase flag', function(done){ + const a = /hello/i; + testFlag(a, isIgnoreCase); + done(); + }) + it('global flag', function(done){ + const a = /hello/g; + testFlag(a, isGlobal); + done(); + }) + it('multiline flag', function(done){ + const a = /hello/m; + testFlag(a, isMultiline); + done(); + }) + it('dotAll flag', function(done){ + const a = /hello/s; + testFlag(a, isDotAll); + done(); + }) + it('unicode flag', function(done){ + const a = /hello/u; + testFlag(a, isUnicode); + done(); + }) + it('sticky flag', function(done){ + const a = /hello/y; + testFlag(a, isSticky); + done(); + }) + it('no flags', function(done){ + const a = /hello/; + noFlags(a); + done(); + }) + it('all flags', function(done){ + const a = /hello/gimsuy; + allFlags(a); + done(); + }) + it('lastIndex', function(done) { + const a = /hi/g; + lastIndex(a); + done(); + }) + }) + + describe('instances', function(){ + it('ignoreCase flag', function(done){ + const a = new RegExp('hello', 'i'); + testFlag(a, isIgnoreCase); + done(); + }) + it('global flag', function(done){ + const a = new RegExp('hello', 'g'); + testFlag(a, isGlobal); + done(); + }) + it('multiline flag', function(done){ + const a = new RegExp('hello', 'm'); + testFlag(a, isMultiline); + done(); + }) + it('dotAll flag', function(done){ + const a = new RegExp('hello', 's'); + testFlag(a, isDotAll); + done(); + }) + it('unicode flag', function(done){ + const a = new RegExp('hello', 'u'); + testFlag(a, isUnicode); + done(); + }) + it('sticky flag', function(done){ + const a = new RegExp('hello', 'y'); + testFlag(a, isSticky); + done(); + }) + it('no flags', function(done){ + const a = new RegExp('hmm'); + noFlags(a); + done(); + }) + it('all flags', function(done){ + const a = new RegExp('hello', 'misguy'); + allFlags(a); + done(); + }) + it('lastIndex', function(done) { + const a = new RegExp('hi', 'g'); + lastIndex(a); + done(); + }) + }) 
+}) + diff --git a/node_modules/registry-auth-token/CHANGELOG.md b/node_modules/registry-auth-token/CHANGELOG.md new file mode 100644 index 000000000..e96f146e3 --- /dev/null +++ b/node_modules/registry-auth-token/CHANGELOG.md @@ -0,0 +1,134 @@ +# Change Log + +All notable changes will be documented in this file. + +## [4.2.0] - 2020-07-13 + +### Changes + +- Add support for `NPM_CONFIG_USERCONFIG` environment variable (Ben Sorohan) + +## [4.1.0] - 2020-01-17 + +### Changes + +- Add support for legacy auth token on the registry url (Gustav Blomér) + +## [4.0.0] - 2019-06-17 + +### BREAKING + +- Minimum node.js version requirement is now v6 + +### Changes + +- Upgraded dependencies (Espen Hovlandsdal) + +## [3.4.0] - 2019-03-20 + +### Changes + +- Enabled legacy auth token to be read from environment variable (Martin Flodin) + +## [3.3.2] - 2018-01-26 + +### Changes + +- Support password with ENV variable tokens (Nowell Strite) + +## [3.3.1] - 2017-05-02 + +### Fixes + +- Auth legacy token is basic auth (Hutson Betts) + +## [3.3.0] - 2017-04-24 + +### Changes + +- Support legacy auth token config key (Zoltan Kochan) +- Use safe-buffer module for backwards-compatible base64 encoding/decoding (Espen Hovlandsdal) +- Change to standard.js coding style (Espen Hovlandsdal) + +## [3.2.0] - 2017-04-20 + +### Changes + +- Allow passing parsed npmrc from outside (Zoltan Kochan) + +## [3.1.2] - 2017-04-07 + +### Changes + +- Avoid infinite loop on invalid URL (Zoltan Kochan) + +## [3.1.1] - 2017-04-06 + +### Changes + +- Nerf-dart URLs even if recursive is set to false (Espen Hovlandsdal) + +## [3.1.0] - 2016-10-19 + +### Changes + +- Return the password and username for Basic authorization (Zoltan Kochan) + +## [3.0.1] - 2016-08-07 + +### Changes + +- Fix recursion bug (Lukas Eipert) +- Implement alternative base64 encoding/decoding implementation for Node 6 (Lukas Eipert) + +## [3.0.0] - 2016-08-04 + +### Added + +- Support for Basic Authentication (username/password) (Lukas Eipert) + +### Changes + +- The result format of the output changed from a simple string to an object which contains the token type + +```js + // before: returns 'tokenString' + // after: returns {token: 'tokenString', type: 'Bearer'} + getAuthToken() +``` + +## [2.1.1] - 2016-07-10 + +### Changes + +- Fix infinite loop when recursively resolving registry URLs on Windows (Espen Hovlandsdal) + +## [2.1.0] - 2016-07-07 + +### Added + +- Add feature to find configured registry URL for a scope (Espen Hovlandsdal) + +## [2.0.0] - 2016-06-17 + +### Changes + +- Fix tokens defined by reference to environment variables (Dan MacTough) + +## [1.1.1] - 2016-04-26 + +### Changes + +- Fix for registries with port number in URL (Ryan Day) + +[1.1.1]: https://github.com/rexxars/registry-auth-token/compare/a5b4fe2f5ff982110eb8a813ba1b3b3c5d851af1...v1.1.1 +[2.0.0]: https://github.com/rexxars/registry-auth-token/compare/v1.1.1...v2.0.0 +[2.1.0]: https://github.com/rexxars/registry-auth-token/compare/v2.0.0...v2.1.0 +[2.1.1]: https://github.com/rexxars/registry-auth-token/compare/v2.1.0...v2.1.1 +[3.0.0]: https://github.com/rexxars/registry-auth-token/compare/v2.1.1...v3.0.0 +[3.0.1]: https://github.com/rexxars/registry-auth-token/compare/v3.0.0...v3.0.1 +[3.1.0]: https://github.com/rexxars/registry-auth-token/compare/v3.0.1...v3.1.0 +[3.1.1]: https://github.com/rexxars/registry-auth-token/compare/v3.1.0...v3.1.1 +[3.1.2]: https://github.com/rexxars/registry-auth-token/compare/v3.1.1...v3.1.2 +[3.2.0]: 
https://github.com/rexxars/registry-auth-token/compare/v3.1.2...v3.2.0 +[3.3.0]: https://github.com/rexxars/registry-auth-token/compare/v3.2.0...v3.3.0 diff --git a/node_modules/registry-auth-token/LICENSE b/node_modules/registry-auth-token/LICENSE new file mode 100644 index 000000000..0de12e338 --- /dev/null +++ b/node_modules/registry-auth-token/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Espen Hovlandsdal + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/registry-auth-token/README.md b/node_modules/registry-auth-token/README.md new file mode 100644 index 000000000..084dc0aff --- /dev/null +++ b/node_modules/registry-auth-token/README.md @@ -0,0 +1,65 @@ +# registry-auth-token + +[![npm version](http://img.shields.io/npm/v/registry-auth-token.svg?style=flat-square)](http://browsenpm.org/package/registry-auth-token)[![Build Status](http://img.shields.io/travis/rexxars/registry-auth-token/main.svg?style=flat-square)](https://travis-ci.org/rexxars/registry-auth-token) + +Get the auth token set for an npm registry from `.npmrc`. Also allows fetching the configured registry URL for a given npm scope. + +## Installing + +``` +npm install --save registry-auth-token +``` + +## Usage + +Returns an object containing `token` and `type`, or `undefined` if no token can be found. `type` can be either `Bearer` or `Basic`. + +```js +var getAuthToken = require('registry-auth-token') +var getRegistryUrl = require('registry-auth-token/registry-url') + +// Get auth token and type for default `registry` set in `.npmrc` +console.log(getAuthToken()) // {token: 'someToken', type: 'Bearer'} + +// Get auth token for a specific registry URL +console.log(getAuthToken('//registry.foo.bar')) + +// Find the registry auth token for a given URL (with deep path): +// If registry is at `//some.host/registry` +// URL passed is `//some.host/registry/deep/path` +// Will find token the closest matching path; `//some.host/registry` +console.log(getAuthToken('//some.host/registry/deep/path', {recursive: true})) + +// Find the configured registry url for scope `@foobar`. +// Falls back to the global registry if not defined. 
+console.log(getRegistryUrl('@foobar')) + +// Use the npm config that is passed in +console.log(getRegistryUrl('http://registry.foobar.eu/', { + npmrc: { + 'registry': 'http://registry.foobar.eu/', + '//registry.foobar.eu/:_authToken': 'qar' + } +})) +``` + +## Return value + +```js +// If auth info can be found: +{token: 'someToken', type: 'Bearer'} + +// Or: +{token: 'someOtherToken', type: 'Basic'} + +// Or, if nothing is found: +undefined +``` + +## Security + +Please be careful when using this. Leaking your auth token is dangerous. + +## License + +MIT-licensed. See LICENSE. diff --git a/node_modules/registry-auth-token/base64.js b/node_modules/registry-auth-token/base64.js new file mode 100644 index 000000000..ff0f6cb99 --- /dev/null +++ b/node_modules/registry-auth-token/base64.js @@ -0,0 +1,12 @@ +function decodeBase64 (base64) { + return Buffer.from(base64, 'base64').toString('utf8') +} + +function encodeBase64 (string) { + return Buffer.from(string, 'utf8').toString('base64') +} + +module.exports = { + decodeBase64: decodeBase64, + encodeBase64: encodeBase64 +} diff --git a/node_modules/registry-auth-token/index.js b/node_modules/registry-auth-token/index.js new file mode 100644 index 000000000..28eedfd73 --- /dev/null +++ b/node_modules/registry-auth-token/index.js @@ -0,0 +1,142 @@ +var url = require('url') +var base64 = require('./base64') + +var decodeBase64 = base64.decodeBase64 +var encodeBase64 = base64.encodeBase64 + +var tokenKey = ':_authToken' +var legacyTokenKey = ':_auth' +var userKey = ':username' +var passwordKey = ':_password' + +module.exports = function () { + var checkUrl + var options + if (arguments.length >= 2) { + checkUrl = arguments[0] + options = arguments[1] + } else if (typeof arguments[0] === 'string') { + checkUrl = arguments[0] + } else { + options = arguments[0] + } + options = options || {} + options.npmrc = options.npmrc || require('rc')('npm', { registry: 'https://registry.npmjs.org/' }, { + config: process.env.npm_config_userconfig || process.env.NPM_CONFIG_USERCONFIG + }) + checkUrl = checkUrl || options.npmrc.registry + return getRegistryAuthInfo(checkUrl, options) || getLegacyAuthInfo(options.npmrc) +} + +function getRegistryAuthInfo (checkUrl, options) { + var parsed = url.parse(checkUrl, false, true) + var pathname + + while (pathname !== '/' && parsed.pathname !== pathname) { + pathname = parsed.pathname || '/' + + var regUrl = '//' + parsed.host + pathname.replace(/\/$/, '') + var authInfo = getAuthInfoForUrl(regUrl, options.npmrc) + if (authInfo) { + return authInfo + } + + // break if not recursive + if (!options.recursive) { + return /\/$/.test(checkUrl) + ? undefined + : getRegistryAuthInfo(url.resolve(checkUrl, '.'), options) + } + + parsed.pathname = url.resolve(normalizePath(pathname), '..') || '/' + } + + return undefined +} + +function getLegacyAuthInfo (npmrc) { + if (!npmrc._auth) { + return undefined + } + + var token = replaceEnvironmentVariable(npmrc._auth) + + return { token: token, type: 'Basic' } +} + +function normalizePath (path) { + return path[path.length - 1] === '/' ? 
path : path + '/' +} + +function getAuthInfoForUrl (regUrl, npmrc) { + // try to get bearer token + var bearerAuth = getBearerToken(npmrc[regUrl + tokenKey] || npmrc[regUrl + '/' + tokenKey]) + if (bearerAuth) { + return bearerAuth + } + + // try to get basic token + var username = npmrc[regUrl + userKey] || npmrc[regUrl + '/' + userKey] + var password = npmrc[regUrl + passwordKey] || npmrc[regUrl + '/' + passwordKey] + var basicAuth = getTokenForUsernameAndPassword(username, password) + if (basicAuth) { + return basicAuth + } + + var basicAuthWithToken = getLegacyAuthToken(npmrc[regUrl + legacyTokenKey] || npmrc[regUrl + '/' + legacyTokenKey]) + if (basicAuthWithToken) { + return basicAuthWithToken + } + + return undefined +} + +function replaceEnvironmentVariable (token) { + return token.replace(/^\$\{?([^}]*)\}?$/, function (fullMatch, envVar) { + return process.env[envVar] + }) +} + +function getBearerToken (tok) { + if (!tok) { + return undefined + } + + // check if bearer token is set as environment variable + var token = replaceEnvironmentVariable(tok) + + return { token: token, type: 'Bearer' } +} + +function getTokenForUsernameAndPassword (username, password) { + if (!username || !password) { + return undefined + } + + // passwords are base64 encoded, so we need to decode it + // See https://github.com/npm/npm/blob/v3.10.6/lib/config/set-credentials-by-uri.js#L26 + var pass = decodeBase64(replaceEnvironmentVariable(password)) + + // a basic auth token is base64 encoded 'username:password' + // See https://github.com/npm/npm/blob/v3.10.6/lib/config/get-credentials-by-uri.js#L70 + var token = encodeBase64(username + ':' + pass) + + // we found a basicToken token so let's exit the loop + return { + token: token, + type: 'Basic', + password: pass, + username: username + } +} + +function getLegacyAuthToken (tok) { + if (!tok) { + return undefined + } + + // check if legacy auth token is set as environment variable + var token = replaceEnvironmentVariable(tok) + + return { token: token, type: 'Basic' } +} diff --git a/node_modules/registry-auth-token/package.json b/node_modules/registry-auth-token/package.json new file mode 100644 index 000000000..960c8daa6 --- /dev/null +++ b/node_modules/registry-auth-token/package.json @@ -0,0 +1,76 @@ +{ + "_from": "registry-auth-token@^4.0.0", + "_id": "registry-auth-token@4.2.1", + "_inBundle": false, + "_integrity": "sha512-6gkSb4U6aWJB4SF2ZvLb76yCBjcvufXBqvvEx1HbmKPkutswjW1xNVRY0+daljIYRbogN7O0etYSlbiaEQyMyw==", + "_location": "/registry-auth-token", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "registry-auth-token@^4.0.0", + "name": "registry-auth-token", + "escapedName": "registry-auth-token", + "rawSpec": "^4.0.0", + "saveSpec": null, + "fetchSpec": "^4.0.0" + }, + "_requiredBy": [ + "/package-json" + ], + "_resolved": "https://registry.npmjs.org/registry-auth-token/-/registry-auth-token-4.2.1.tgz", + "_shasum": "6d7b4006441918972ccd5fedcd41dc322c79b250", + "_spec": "registry-auth-token@^4.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\package-json", + "author": { + "name": "Espen Hovlandsdal", + "email": "espen@hovlandsdal.com" + }, + "bugs": { + "url": "https://github.com/rexxars/registry-auth-token/issues" + }, + "bundleDependencies": false, + "dependencies": { + "rc": "^1.2.8" + }, + "deprecated": false, + "description": "Get the auth token set for an npm registry (if any)", + "devDependencies": { + "istanbul": "^0.4.2", + "mocha": 
"^6.1.4", + "require-uncached": "^1.0.2", + "standard": "^12.0.1" + }, + "engines": { + "node": ">=6.0.0" + }, + "homepage": "https://github.com/rexxars/registry-auth-token#readme", + "keywords": [ + "npm", + "conf", + "config", + "npmconf", + "registry", + "auth", + "token", + "authtoken" + ], + "license": "MIT", + "main": "index.js", + "name": "registry-auth-token", + "repository": { + "type": "git", + "url": "git+ssh://git@github.com/rexxars/registry-auth-token.git" + }, + "scripts": { + "coverage": "istanbul cover _mocha", + "posttest": "standard", + "test": "mocha" + }, + "standard": { + "ignore": [ + "coverage/**" + ] + }, + "version": "4.2.1" +} diff --git a/node_modules/registry-auth-token/registry-url.js b/node_modules/registry-auth-token/registry-url.js new file mode 100644 index 000000000..b4532d634 --- /dev/null +++ b/node_modules/registry-auth-token/registry-url.js @@ -0,0 +1,5 @@ +module.exports = function (scope, npmrc) { + var rc = npmrc || require('rc')('npm', { registry: 'https://registry.npmjs.org/' }) + var url = rc[scope + ':registry'] || rc.registry + return url.slice(-1) === '/' ? url : url + '/' +} diff --git a/node_modules/registry-url/index.d.ts b/node_modules/registry-url/index.d.ts new file mode 100644 index 000000000..5f2c58612 --- /dev/null +++ b/node_modules/registry-url/index.d.ts @@ -0,0 +1,33 @@ +declare const registryUrl: { + /** + Get the set npm registry URL. + + @param scope - Retrieve the registry URL associated with an [npm scope](https://docs.npmjs.com/misc/scope). If the provided scope is not in the user's `.npmrc` file, then `registry-url` will check for the existence of `registry`, or if that's not set, fallback to the default npm registry. + + @example + ``` + import registryUrl = require('registry-url'); + + // # .npmrc + // registry = 'https://custom-registry.com/' + + console.log(registryUrl()); + //=> 'https://custom-registry.com/' + + + // # .npmrc + // @myco:registry = 'https://custom-registry.com/' + + console.log(registryUrl('@myco')); + //=> 'https://custom-registry.com/' + ``` + */ + (scope?: string): string; + + // TODO: Remove this for the next major release, refactor the whole definition to: + // declare function registryUrl(scope?: string): string; + // export = registryUrl; + default: typeof registryUrl; +}; + +export = registryUrl; diff --git a/node_modules/registry-url/index.js b/node_modules/registry-url/index.js new file mode 100644 index 000000000..23ea1f870 --- /dev/null +++ b/node_modules/registry-url/index.js @@ -0,0 +1,12 @@ +'use strict'; +const rc = require('rc'); + +const registryUrl = scope => { + const result = rc('npm', {registry: 'https://registry.npmjs.org/'}); + const url = result[`${scope}:registry`] || result.config_registry || result.registry; + return url.slice(-1) === '/' ? 
url : `${url}/`; +}; + +module.exports = registryUrl; +// TODO: Remove this for the next major release +module.exports.default = registryUrl; diff --git a/node_modules/registry-url/license b/node_modules/registry-url/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/registry-url/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/registry-url/package.json b/node_modules/registry-url/package.json new file mode 100644 index 000000000..c0e9e1ece --- /dev/null +++ b/node_modules/registry-url/package.json @@ -0,0 +1,76 @@ +{ + "_from": "registry-url@^5.0.0", + "_id": "registry-url@5.1.0", + "_inBundle": false, + "_integrity": "sha512-8acYXXTI0AkQv6RAOjE3vOaIXZkT9wo4LOFbBKYQEEnnMNBpKqdUrI6S4NT0KPIo/WVvJ5tE/X5LF/TQUf0ekw==", + "_location": "/registry-url", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "registry-url@^5.0.0", + "name": "registry-url", + "escapedName": "registry-url", + "rawSpec": "^5.0.0", + "saveSpec": null, + "fetchSpec": "^5.0.0" + }, + "_requiredBy": [ + "/package-json" + ], + "_resolved": "https://registry.npmjs.org/registry-url/-/registry-url-5.1.0.tgz", + "_shasum": "e98334b50d5434b81136b44ec638d9c2009c5009", + "_spec": "registry-url@^5.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\package-json", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "ava": { + "serial": true + }, + "bugs": { + "url": "https://github.com/sindresorhus/registry-url/issues" + }, + "bundleDependencies": false, + "dependencies": { + "rc": "^1.2.8" + }, + "deprecated": false, + "description": "Get the set npm registry URL", + "devDependencies": { + "ava": "^1.4.1", + "import-fresh": "^3.0.0", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/registry-url#readme", + "keywords": [ + "npm", + "conf", + "config", + "npmconf", + "registry", + "url", + "uri", + "scope" + ], + "license": "MIT", + "name": "registry-url", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/registry-url.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "5.1.0" +} diff --git a/node_modules/registry-url/readme.md b/node_modules/registry-url/readme.md new file mode 100644 index 
000000000..dfcb99ae5 --- /dev/null +++ b/node_modules/registry-url/readme.md @@ -0,0 +1,50 @@ +# registry-url [![Build Status](https://travis-ci.org/sindresorhus/registry-url.svg?branch=master)](https://travis-ci.org/sindresorhus/registry-url) + +> Get the set npm registry URL + +It's usually `https://registry.npmjs.org/`, but it's [configurable](https://docs.npmjs.com/misc/registry). + +Use this if you do anything with the npm registry as users will expect it to use their configured registry. + + +## Install + +``` +$ npm install registry-url +``` + + +## Usage + +```ini +# .npmrc +registry = 'https://custom-registry.com/' +``` + +```js +const registryUrl = require('registry-url'); + +console.log(registryUrl()); +//=> 'https://custom-registry.com/' +``` + +It can also retrieve the registry URL associated with an [npm scope](https://docs.npmjs.com/misc/scope). + +```ini +# .npmrc +@myco:registry = 'https://custom-registry.com/' +``` + +```js +const registryUrl = require('registry-url'); + +console.log(registryUrl('@myco')); +//=> 'https://custom-registry.com/' +``` + +If the provided scope is not in the user's `.npmrc` file, then `registry-url` will check for the existence of `registry`, or if that's not set, fallback to the default npm registry. + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/responselike/LICENSE b/node_modules/responselike/LICENSE new file mode 100644 index 000000000..8829a001a --- /dev/null +++ b/node_modules/responselike/LICENSE @@ -0,0 +1,19 @@ +Copyright (c) 2017 Luke Childs + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. 
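The two registry helpers vendored above are commonly used together: `registry-url` resolves the registry for an npm scope and `registry-auth-token` returns any credentials configured for that registry in `.npmrc`. A minimal sketch, assuming a hypothetical `@myco` scope; the output depends entirely on the local npm configuration.

```js
// Sketch only: '@myco' is a hypothetical scope; results depend on the local .npmrc.
const registryUrl = require('registry-url');
const getAuthToken = require('registry-auth-token');

const url = registryUrl('@myco');                    // e.g. 'https://custom-registry.com/'
const auth = getAuthToken(url, { recursive: true }); // {token, type} or undefined

if (auth) {
  console.log(`Registry ${url} uses ${auth.type} auth`);
} else {
  console.log(`No credentials configured for ${url}`);
}
```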
diff --git a/node_modules/responselike/README.md b/node_modules/responselike/README.md new file mode 100644 index 000000000..6361931c1 --- /dev/null +++ b/node_modules/responselike/README.md @@ -0,0 +1,77 @@ +# responselike + +> A response-like object for mocking a Node.js HTTP response stream + +[![Build Status](https://travis-ci.org/lukechilds/responselike.svg?branch=master)](https://travis-ci.org/lukechilds/responselike) +[![Coverage Status](https://coveralls.io/repos/github/lukechilds/responselike/badge.svg?branch=master)](https://coveralls.io/github/lukechilds/responselike?branch=master) +[![npm](https://img.shields.io/npm/dm/responselike.svg)](https://www.npmjs.com/package/responselike) +[![npm](https://img.shields.io/npm/v/responselike.svg)](https://www.npmjs.com/package/responselike) + +Returns a streamable response object similar to a [Node.js HTTP response stream](https://nodejs.org/api/http.html#http_class_http_incomingmessage). Useful for formatting cached responses so they can be consumed by code expecting a real response. + +## Install + +```shell +npm install --save responselike +``` + +Or if you're just using for testing you'll want: + +```shell +npm install --save-dev responselike +``` + +## Usage + +```js +const Response = require('responselike'); + +const response = new Response(200, { foo: 'bar' }, Buffer.from('Hi!'), 'https://example.com'); + +response.statusCode; +// 200 +response.headers; +// { foo: 'bar' } +response.body; +// +response.url; +// 'https://example.com' + +response.pipe(process.stdout); +// Hi! +``` + + +## API + +### new Response(statusCode, headers, body, url) + +Returns a streamable response object similar to a [Node.js HTTP response stream](https://nodejs.org/api/http.html#http_class_http_incomingmessage). + +#### statusCode + +Type: `number` + +HTTP response status code. + +#### headers + +Type: `object` + +HTTP headers object. Keys will be automatically lowercased. + +#### body + +Type: `buffer` + +A Buffer containing the response body. The Buffer contents will be streamable but is also exposed directly as `response.body`. + +#### url + +Type: `string` + +Request URL string. 
+ +## License + +MIT © Luke Childs diff --git a/node_modules/responselike/package.json b/node_modules/responselike/package.json new file mode 100644 index 000000000..1d967f5d4 --- /dev/null +++ b/node_modules/responselike/package.json @@ -0,0 +1,69 @@ +{ + "_from": "responselike@^1.0.2", + "_id": "responselike@1.0.2", + "_inBundle": false, + "_integrity": "sha1-kYcg7ztjHFZCvgaPFa3lpG9Loec=", + "_location": "/responselike", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "responselike@^1.0.2", + "name": "responselike", + "escapedName": "responselike", + "rawSpec": "^1.0.2", + "saveSpec": null, + "fetchSpec": "^1.0.2" + }, + "_requiredBy": [ + "/cacheable-request" + ], + "_resolved": "https://registry.npmjs.org/responselike/-/responselike-1.0.2.tgz", + "_shasum": "918720ef3b631c5642be068f15ade5a46f4ba1e7", + "_spec": "responselike@^1.0.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\cacheable-request", + "author": { + "name": "lukechilds" + }, + "bugs": { + "url": "https://github.com/lukechilds/responselike/issues" + }, + "bundleDependencies": false, + "dependencies": { + "lowercase-keys": "^1.0.0" + }, + "deprecated": false, + "description": "A response-like object for mocking a Node.js HTTP response stream", + "devDependencies": { + "ava": "^0.22.0", + "coveralls": "^2.13.1", + "eslint-config-xo-lukechilds": "^1.0.0", + "get-stream": "^3.0.0", + "nyc": "^11.1.0", + "xo": "^0.19.0" + }, + "homepage": "https://github.com/lukechilds/responselike#readme", + "keywords": [ + "http", + "https", + "response", + "mock", + "request", + "responselike" + ], + "license": "MIT", + "main": "src/index.js", + "name": "responselike", + "repository": { + "type": "git", + "url": "git+https://github.com/lukechilds/responselike.git" + }, + "scripts": { + "coverage": "nyc report --reporter=text-lcov | coveralls", + "test": "xo && nyc ava" + }, + "version": "1.0.2", + "xo": { + "extends": "xo-lukechilds" + } +} diff --git a/node_modules/responselike/src/index.js b/node_modules/responselike/src/index.js new file mode 100644 index 000000000..b17b4813a --- /dev/null +++ b/node_modules/responselike/src/index.js @@ -0,0 +1,34 @@ +'use strict'; + +const Readable = require('stream').Readable; +const lowercaseKeys = require('lowercase-keys'); + +class Response extends Readable { + constructor(statusCode, headers, body, url) { + if (typeof statusCode !== 'number') { + throw new TypeError('Argument `statusCode` should be a number'); + } + if (typeof headers !== 'object') { + throw new TypeError('Argument `headers` should be an object'); + } + if (!(body instanceof Buffer)) { + throw new TypeError('Argument `body` should be a buffer'); + } + if (typeof url !== 'string') { + throw new TypeError('Argument `url` should be a string'); + } + + super(); + this.statusCode = statusCode; + this.headers = lowercaseKeys(headers); + this.body = body; + this.url = url; + } + + _read() { + this.push(this.body); + this.push(null); + } +} + +module.exports = Response; diff --git a/node_modules/safe-buffer/LICENSE b/node_modules/safe-buffer/LICENSE new file mode 100644 index 000000000..0c068ceec --- /dev/null +++ b/node_modules/safe-buffer/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) Feross Aboukhadijeh + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without 
limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/safe-buffer/README.md b/node_modules/safe-buffer/README.md new file mode 100644 index 000000000..e9a81afd0 --- /dev/null +++ b/node_modules/safe-buffer/README.md @@ -0,0 +1,584 @@ +# safe-buffer [![travis][travis-image]][travis-url] [![npm][npm-image]][npm-url] [![downloads][downloads-image]][downloads-url] [![javascript style guide][standard-image]][standard-url] + +[travis-image]: https://img.shields.io/travis/feross/safe-buffer/master.svg +[travis-url]: https://travis-ci.org/feross/safe-buffer +[npm-image]: https://img.shields.io/npm/v/safe-buffer.svg +[npm-url]: https://npmjs.org/package/safe-buffer +[downloads-image]: https://img.shields.io/npm/dm/safe-buffer.svg +[downloads-url]: https://npmjs.org/package/safe-buffer +[standard-image]: https://img.shields.io/badge/code_style-standard-brightgreen.svg +[standard-url]: https://standardjs.com + +#### Safer Node.js Buffer API + +**Use the new Node.js Buffer APIs (`Buffer.from`, `Buffer.alloc`, +`Buffer.allocUnsafe`, `Buffer.allocUnsafeSlow`) in all versions of Node.js.** + +**Uses the built-in implementation when available.** + +## install + +``` +npm install safe-buffer +``` + +## usage + +The goal of this package is to provide a safe replacement for the node.js `Buffer`. + +It's a drop-in replacement for `Buffer`. You can use it by adding one `require` line to +the top of your node.js modules: + +```js +var Buffer = require('safe-buffer').Buffer + +// Existing buffer code will continue to work without issues: + +new Buffer('hey', 'utf8') +new Buffer([1, 2, 3], 'utf8') +new Buffer(obj) +new Buffer(16) // create an uninitialized buffer (potentially unsafe) + +// But you can use these new explicit APIs to make clear what you want: + +Buffer.from('hey', 'utf8') // convert from many types to a Buffer +Buffer.alloc(16) // create a zero-filled buffer (safe) +Buffer.allocUnsafe(16) // create an uninitialized buffer (potentially unsafe) +``` + +## api + +### Class Method: Buffer.from(array) + + +* `array` {Array} + +Allocates a new `Buffer` using an `array` of octets. + +```js +const buf = Buffer.from([0x62,0x75,0x66,0x66,0x65,0x72]); + // creates a new Buffer containing ASCII bytes + // ['b','u','f','f','e','r'] +``` + +A `TypeError` will be thrown if `array` is not an `Array`. 
+ +### Class Method: Buffer.from(arrayBuffer[, byteOffset[, length]]) + + +* `arrayBuffer` {ArrayBuffer} The `.buffer` property of a `TypedArray` or + a `new ArrayBuffer()` +* `byteOffset` {Number} Default: `0` +* `length` {Number} Default: `arrayBuffer.length - byteOffset` + +When passed a reference to the `.buffer` property of a `TypedArray` instance, +the newly created `Buffer` will share the same allocated memory as the +TypedArray. + +```js +const arr = new Uint16Array(2); +arr[0] = 5000; +arr[1] = 4000; + +const buf = Buffer.from(arr.buffer); // shares the memory with arr; + +console.log(buf); + // Prints: + +// changing the TypedArray changes the Buffer also +arr[1] = 6000; + +console.log(buf); + // Prints: +``` + +The optional `byteOffset` and `length` arguments specify a memory range within +the `arrayBuffer` that will be shared by the `Buffer`. + +```js +const ab = new ArrayBuffer(10); +const buf = Buffer.from(ab, 0, 2); +console.log(buf.length); + // Prints: 2 +``` + +A `TypeError` will be thrown if `arrayBuffer` is not an `ArrayBuffer`. + +### Class Method: Buffer.from(buffer) + + +* `buffer` {Buffer} + +Copies the passed `buffer` data onto a new `Buffer` instance. + +```js +const buf1 = Buffer.from('buffer'); +const buf2 = Buffer.from(buf1); + +buf1[0] = 0x61; +console.log(buf1.toString()); + // 'auffer' +console.log(buf2.toString()); + // 'buffer' (copy is not changed) +``` + +A `TypeError` will be thrown if `buffer` is not a `Buffer`. + +### Class Method: Buffer.from(str[, encoding]) + + +* `str` {String} String to encode. +* `encoding` {String} Encoding to use, Default: `'utf8'` + +Creates a new `Buffer` containing the given JavaScript string `str`. If +provided, the `encoding` parameter identifies the character encoding. +If not provided, `encoding` defaults to `'utf8'`. + +```js +const buf1 = Buffer.from('this is a tést'); +console.log(buf1.toString()); + // prints: this is a tést +console.log(buf1.toString('ascii')); + // prints: this is a tC)st + +const buf2 = Buffer.from('7468697320697320612074c3a97374', 'hex'); +console.log(buf2.toString()); + // prints: this is a tést +``` + +A `TypeError` will be thrown if `str` is not a string. + +### Class Method: Buffer.alloc(size[, fill[, encoding]]) + + +* `size` {Number} +* `fill` {Value} Default: `undefined` +* `encoding` {String} Default: `utf8` + +Allocates a new `Buffer` of `size` bytes. If `fill` is `undefined`, the +`Buffer` will be *zero-filled*. + +```js +const buf = Buffer.alloc(5); +console.log(buf); + // +``` + +The `size` must be less than or equal to the value of +`require('buffer').kMaxLength` (on 64-bit architectures, `kMaxLength` is +`(2^31)-1`). Otherwise, a [`RangeError`][] is thrown. A zero-length Buffer will +be created if a `size` less than or equal to 0 is specified. + +If `fill` is specified, the allocated `Buffer` will be initialized by calling +`buf.fill(fill)`. See [`buf.fill()`][] for more information. + +```js +const buf = Buffer.alloc(5, 'a'); +console.log(buf); + // +``` + +If both `fill` and `encoding` are specified, the allocated `Buffer` will be +initialized by calling `buf.fill(fill, encoding)`. For example: + +```js +const buf = Buffer.alloc(11, 'aGVsbG8gd29ybGQ=', 'base64'); +console.log(buf); + // +``` + +Calling `Buffer.alloc(size)` can be significantly slower than the alternative +`Buffer.allocUnsafe(size)` but ensures that the newly created `Buffer` instance +contents will *never contain sensitive data*. + +A `TypeError` will be thrown if `size` is not a number. 
+ +### Class Method: Buffer.allocUnsafe(size) + + +* `size` {Number} + +Allocates a new *non-zero-filled* `Buffer` of `size` bytes. The `size` must +be less than or equal to the value of `require('buffer').kMaxLength` (on 64-bit +architectures, `kMaxLength` is `(2^31)-1`). Otherwise, a [`RangeError`][] is +thrown. A zero-length Buffer will be created if a `size` less than or equal to +0 is specified. + +The underlying memory for `Buffer` instances created in this way is *not +initialized*. The contents of the newly created `Buffer` are unknown and +*may contain sensitive data*. Use [`buf.fill(0)`][] to initialize such +`Buffer` instances to zeroes. + +```js +const buf = Buffer.allocUnsafe(5); +console.log(buf); + // + // (octets will be different, every time) +buf.fill(0); +console.log(buf); + // +``` + +A `TypeError` will be thrown if `size` is not a number. + +Note that the `Buffer` module pre-allocates an internal `Buffer` instance of +size `Buffer.poolSize` that is used as a pool for the fast allocation of new +`Buffer` instances created using `Buffer.allocUnsafe(size)` (and the deprecated +`new Buffer(size)` constructor) only when `size` is less than or equal to +`Buffer.poolSize >> 1` (floor of `Buffer.poolSize` divided by two). The default +value of `Buffer.poolSize` is `8192` but can be modified. + +Use of this pre-allocated internal memory pool is a key difference between +calling `Buffer.alloc(size, fill)` vs. `Buffer.allocUnsafe(size).fill(fill)`. +Specifically, `Buffer.alloc(size, fill)` will *never* use the internal Buffer +pool, while `Buffer.allocUnsafe(size).fill(fill)` *will* use the internal +Buffer pool if `size` is less than or equal to half `Buffer.poolSize`. The +difference is subtle but can be important when an application requires the +additional performance that `Buffer.allocUnsafe(size)` provides. + +### Class Method: Buffer.allocUnsafeSlow(size) + + +* `size` {Number} + +Allocates a new *non-zero-filled* and non-pooled `Buffer` of `size` bytes. The +`size` must be less than or equal to the value of +`require('buffer').kMaxLength` (on 64-bit architectures, `kMaxLength` is +`(2^31)-1`). Otherwise, a [`RangeError`][] is thrown. A zero-length Buffer will +be created if a `size` less than or equal to 0 is specified. + +The underlying memory for `Buffer` instances created in this way is *not +initialized*. The contents of the newly created `Buffer` are unknown and +*may contain sensitive data*. Use [`buf.fill(0)`][] to initialize such +`Buffer` instances to zeroes. + +When using `Buffer.allocUnsafe()` to allocate new `Buffer` instances, +allocations under 4KB are, by default, sliced from a single pre-allocated +`Buffer`. This allows applications to avoid the garbage collection overhead of +creating many individually allocated Buffers. This approach improves both +performance and memory usage by eliminating the need to track and cleanup as +many `Persistent` objects. + +However, in the case where a developer may need to retain a small chunk of +memory from a pool for an indeterminate amount of time, it may be appropriate +to create an un-pooled Buffer instance using `Buffer.allocUnsafeSlow()` then +copy out the relevant bits. 
+ +```js +// need to keep around a few small chunks of memory +const store = []; + +socket.on('readable', () => { + const data = socket.read(); + // allocate for retained data + const sb = Buffer.allocUnsafeSlow(10); + // copy the data into the new allocation + data.copy(sb, 0, 0, 10); + store.push(sb); +}); +``` + +Use of `Buffer.allocUnsafeSlow()` should be used only as a last resort *after* +a developer has observed undue memory retention in their applications. + +A `TypeError` will be thrown if `size` is not a number. + +### All the Rest + +The rest of the `Buffer` API is exactly the same as in node.js. +[See the docs](https://nodejs.org/api/buffer.html). + + +## Related links + +- [Node.js issue: Buffer(number) is unsafe](https://github.com/nodejs/node/issues/4660) +- [Node.js Enhancement Proposal: Buffer.from/Buffer.alloc/Buffer.zalloc/Buffer() soft-deprecate](https://github.com/nodejs/node-eps/pull/4) + +## Why is `Buffer` unsafe? + +Today, the node.js `Buffer` constructor is overloaded to handle many different argument +types like `String`, `Array`, `Object`, `TypedArrayView` (`Uint8Array`, etc.), +`ArrayBuffer`, and also `Number`. + +The API is optimized for convenience: you can throw any type at it, and it will try to do +what you want. + +Because the Buffer constructor is so powerful, you often see code like this: + +```js +// Convert UTF-8 strings to hex +function toHex (str) { + return new Buffer(str).toString('hex') +} +``` + +***But what happens if `toHex` is called with a `Number` argument?*** + +### Remote Memory Disclosure + +If an attacker can make your program call the `Buffer` constructor with a `Number` +argument, then they can make it allocate uninitialized memory from the node.js process. +This could potentially disclose TLS private keys, user data, or database passwords. + +When the `Buffer` constructor is passed a `Number` argument, it returns an +**UNINITIALIZED** block of memory of the specified `size`. When you create a `Buffer` like +this, you **MUST** overwrite the contents before returning it to the user. + +From the [node.js docs](https://nodejs.org/api/buffer.html#buffer_new_buffer_size): + +> `new Buffer(size)` +> +> - `size` Number +> +> The underlying memory for `Buffer` instances created in this way is not initialized. +> **The contents of a newly created `Buffer` are unknown and could contain sensitive +> data.** Use `buf.fill(0)` to initialize a Buffer to zeroes. + +(Emphasis our own.) + +Whenever the programmer intended to create an uninitialized `Buffer` you often see code +like this: + +```js +var buf = new Buffer(16) + +// Immediately overwrite the uninitialized buffer with data from another buffer +for (var i = 0; i < buf.length; i++) { + buf[i] = otherBuf[i] +} +``` + + +### Would this ever be a problem in real code? + +Yes. It's surprisingly common to forget to check the type of your variables in a +dynamically-typed language like JavaScript. + +Usually the consequences of assuming the wrong type is that your program crashes with an +uncaught exception. But the failure mode for forgetting to check the type of arguments to +the `Buffer` constructor is more catastrophic. 
+ +Here's an example of a vulnerable service that takes a JSON payload and converts it to +hex: + +```js +// Take a JSON payload {str: "some string"} and convert it to hex +var server = http.createServer(function (req, res) { + var data = '' + req.setEncoding('utf8') + req.on('data', function (chunk) { + data += chunk + }) + req.on('end', function () { + var body = JSON.parse(data) + res.end(new Buffer(body.str).toString('hex')) + }) +}) + +server.listen(8080) +``` + +In this example, an http client just has to send: + +```json +{ + "str": 1000 +} +``` + +and it will get back 1,000 bytes of uninitialized memory from the server. + +This is a very serious bug. It's similar in severity to the +[the Heartbleed bug](http://heartbleed.com/) that allowed disclosure of OpenSSL process +memory by remote attackers. + + +### Which real-world packages were vulnerable? + +#### [`bittorrent-dht`](https://www.npmjs.com/package/bittorrent-dht) + +[Mathias Buus](https://github.com/mafintosh) and I +([Feross Aboukhadijeh](http://feross.org/)) found this issue in one of our own packages, +[`bittorrent-dht`](https://www.npmjs.com/package/bittorrent-dht). The bug would allow +anyone on the internet to send a series of messages to a user of `bittorrent-dht` and get +them to reveal 20 bytes at a time of uninitialized memory from the node.js process. + +Here's +[the commit](https://github.com/feross/bittorrent-dht/commit/6c7da04025d5633699800a99ec3fbadf70ad35b8) +that fixed it. We released a new fixed version, created a +[Node Security Project disclosure](https://nodesecurity.io/advisories/68), and deprecated all +vulnerable versions on npm so users will get a warning to upgrade to a newer version. + +#### [`ws`](https://www.npmjs.com/package/ws) + +That got us wondering if there were other vulnerable packages. Sure enough, within a short +period of time, we found the same issue in [`ws`](https://www.npmjs.com/package/ws), the +most popular WebSocket implementation in node.js. + +If certain APIs were called with `Number` parameters instead of `String` or `Buffer` as +expected, then uninitialized server memory would be disclosed to the remote peer. + +These were the vulnerable methods: + +```js +socket.send(number) +socket.ping(number) +socket.pong(number) +``` + +Here's a vulnerable socket server with some echo functionality: + +```js +server.on('connection', function (socket) { + socket.on('message', function (message) { + message = JSON.parse(message) + if (message.type === 'echo') { + socket.send(message.data) // send back the user's message + } + }) +}) +``` + +`socket.send(number)` called on the server, will disclose server memory. + +Here's [the release](https://github.com/websockets/ws/releases/tag/1.0.1) where the issue +was fixed, with a more detailed explanation. Props to +[Arnout Kazemier](https://github.com/3rd-Eden) for the quick fix. Here's the +[Node Security Project disclosure](https://nodesecurity.io/advisories/67). + + +### What's the solution? + +It's important that node.js offers a fast way to get memory otherwise performance-critical +applications would needlessly get a lot slower. + +But we need a better way to *signal our intent* as programmers. **When we want +uninitialized memory, we should request it explicitly.** + +Sensitive functionality should not be packed into a developer-friendly API that loosely +accepts many different types. This type of API encourages the lazy practice of passing +variables in without checking the type very carefully. 
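Here is a hedged sketch of the kind of explicit check this section argues for (the `encodeEcho` helper is hypothetical, not the actual `ws` fix): validate the type of untrusted input before it can reach any `Buffer` API.

```js
// Hypothetical helper: reject anything that is not a string or Buffer
// before converting it, so a numeric payload can never allocate memory.
var Buffer = require('safe-buffer').Buffer

function encodeEcho (payload) {
  if (typeof payload !== 'string' && !Buffer.isBuffer(payload)) {
    throw new TypeError('payload must be a string or Buffer')
  }
  return Buffer.from(payload).toString('hex')
}

console.log(encodeEcho('hello'))  // '68656c6c6f'

try {
  encodeEcho(1000)                // rejected instead of disclosing memory
} catch (err) {
  console.log(err.message)        // 'payload must be a string or Buffer'
}
```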
+ +#### A new API: `Buffer.allocUnsafe(number)` + +The functionality of creating buffers with uninitialized memory should be part of another +API. We propose `Buffer.allocUnsafe(number)`. This way, it's not part of an API that +frequently gets user input of all sorts of different types passed into it. + +```js +var buf = Buffer.allocUnsafe(16) // careful, uninitialized memory! + +// Immediately overwrite the uninitialized buffer with data from another buffer +for (var i = 0; i < buf.length; i++) { + buf[i] = otherBuf[i] +} +``` + + +### How do we fix node.js core? + +We sent [a PR to node.js core](https://github.com/nodejs/node/pull/4514) (merged as +`semver-major`) which defends against one case: + +```js +var str = 16 +new Buffer(str, 'utf8') +``` + +In this situation, it's implied that the programmer intended the first argument to be a +string, since they passed an encoding as a second argument. Today, node.js will allocate +uninitialized memory in the case of `new Buffer(number, encoding)`, which is probably not +what the programmer intended. + +But this is only a partial solution, since if the programmer does `new Buffer(variable)` +(without an `encoding` parameter) there's no way to know what they intended. If `variable` +is sometimes a number, then uninitialized memory will sometimes be returned. + +### What's the real long-term fix? + +We could deprecate and remove `new Buffer(number)` and use `Buffer.allocUnsafe(number)` when +we need uninitialized memory. But that would break 1000s of packages. + +~~We believe the best solution is to:~~ + +~~1. Change `new Buffer(number)` to return safe, zeroed-out memory~~ + +~~2. Create a new API for creating uninitialized Buffers. We propose: `Buffer.allocUnsafe(number)`~~ + +#### Update + +We now support adding three new APIs: + +- `Buffer.from(value)` - convert from any type to a buffer +- `Buffer.alloc(size)` - create a zero-filled buffer +- `Buffer.allocUnsafe(size)` - create an uninitialized buffer with given size + +This solves the core problem that affected `ws` and `bittorrent-dht` which is +`Buffer(variable)` getting tricked into taking a number argument. + +This way, existing code continues working and the impact on the npm ecosystem will be +minimal. Over time, npm maintainers can migrate performance-critical code to use +`Buffer.allocUnsafe(number)` instead of `new Buffer(number)`. + + +### Conclusion + +We think there's a serious design issue with the `Buffer` API as it exists today. It +promotes insecure software by putting high-risk functionality into a convenient API +with friendly "developer ergonomics". + +This wasn't merely a theoretical exercise because we found the issue in some of the +most popular npm packages. + +Fortunately, there's an easy fix that can be applied today. Use `safe-buffer` in place of +`buffer`. + +```js +var Buffer = require('safe-buffer').Buffer +``` + +Eventually, we hope that node.js core can switch to this new, safer behavior. We believe +the impact on the ecosystem would be minimal since it's not a breaking change. +Well-maintained, popular packages would be updated to use `Buffer.alloc` quickly, while +older, insecure packages would magically become safe from this attack vector. 
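As a closing illustration (a sketch, not code from either disclosure), the `toHex` helper shown earlier in this README behaves safely once the explicit API is used:

```js
// Sketch: the earlier toHex example, rewritten against the explicit API.
var Buffer = require('safe-buffer').Buffer

function toHex (str) {
  return Buffer.from(str, 'utf8').toString('hex')
}

console.log(toHex('abc'))   // '616263'

try {
  toHex(1000)               // now throws instead of returning uninitialized memory
} catch (err) {
  console.log(err instanceof TypeError) // true
}
```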
+ + +## links + +- [Node.js PR: buffer: throw if both length and enc are passed](https://github.com/nodejs/node/pull/4514) +- [Node Security Project disclosure for `ws`](https://nodesecurity.io/advisories/67) +- [Node Security Project disclosure for`bittorrent-dht`](https://nodesecurity.io/advisories/68) + + +## credit + +The original issues in `bittorrent-dht` +([disclosure](https://nodesecurity.io/advisories/68)) and +`ws` ([disclosure](https://nodesecurity.io/advisories/67)) were discovered by +[Mathias Buus](https://github.com/mafintosh) and +[Feross Aboukhadijeh](http://feross.org/). + +Thanks to [Adam Baldwin](https://github.com/evilpacket) for helping disclose these issues +and for his work running the [Node Security Project](https://nodesecurity.io/). + +Thanks to [John Hiesey](https://github.com/jhiesey) for proofreading this README and +auditing the code. + + +## license + +MIT. Copyright (C) [Feross Aboukhadijeh](http://feross.org) diff --git a/node_modules/safe-buffer/index.d.ts b/node_modules/safe-buffer/index.d.ts new file mode 100644 index 000000000..e9fed809a --- /dev/null +++ b/node_modules/safe-buffer/index.d.ts @@ -0,0 +1,187 @@ +declare module "safe-buffer" { + export class Buffer { + length: number + write(string: string, offset?: number, length?: number, encoding?: string): number; + toString(encoding?: string, start?: number, end?: number): string; + toJSON(): { type: 'Buffer', data: any[] }; + equals(otherBuffer: Buffer): boolean; + compare(otherBuffer: Buffer, targetStart?: number, targetEnd?: number, sourceStart?: number, sourceEnd?: number): number; + copy(targetBuffer: Buffer, targetStart?: number, sourceStart?: number, sourceEnd?: number): number; + slice(start?: number, end?: number): Buffer; + writeUIntLE(value: number, offset: number, byteLength: number, noAssert?: boolean): number; + writeUIntBE(value: number, offset: number, byteLength: number, noAssert?: boolean): number; + writeIntLE(value: number, offset: number, byteLength: number, noAssert?: boolean): number; + writeIntBE(value: number, offset: number, byteLength: number, noAssert?: boolean): number; + readUIntLE(offset: number, byteLength: number, noAssert?: boolean): number; + readUIntBE(offset: number, byteLength: number, noAssert?: boolean): number; + readIntLE(offset: number, byteLength: number, noAssert?: boolean): number; + readIntBE(offset: number, byteLength: number, noAssert?: boolean): number; + readUInt8(offset: number, noAssert?: boolean): number; + readUInt16LE(offset: number, noAssert?: boolean): number; + readUInt16BE(offset: number, noAssert?: boolean): number; + readUInt32LE(offset: number, noAssert?: boolean): number; + readUInt32BE(offset: number, noAssert?: boolean): number; + readInt8(offset: number, noAssert?: boolean): number; + readInt16LE(offset: number, noAssert?: boolean): number; + readInt16BE(offset: number, noAssert?: boolean): number; + readInt32LE(offset: number, noAssert?: boolean): number; + readInt32BE(offset: number, noAssert?: boolean): number; + readFloatLE(offset: number, noAssert?: boolean): number; + readFloatBE(offset: number, noAssert?: boolean): number; + readDoubleLE(offset: number, noAssert?: boolean): number; + readDoubleBE(offset: number, noAssert?: boolean): number; + swap16(): Buffer; + swap32(): Buffer; + swap64(): Buffer; + writeUInt8(value: number, offset: number, noAssert?: boolean): number; + writeUInt16LE(value: number, offset: number, noAssert?: boolean): number; + writeUInt16BE(value: number, offset: number, noAssert?: boolean): number; + 
writeUInt32LE(value: number, offset: number, noAssert?: boolean): number; + writeUInt32BE(value: number, offset: number, noAssert?: boolean): number; + writeInt8(value: number, offset: number, noAssert?: boolean): number; + writeInt16LE(value: number, offset: number, noAssert?: boolean): number; + writeInt16BE(value: number, offset: number, noAssert?: boolean): number; + writeInt32LE(value: number, offset: number, noAssert?: boolean): number; + writeInt32BE(value: number, offset: number, noAssert?: boolean): number; + writeFloatLE(value: number, offset: number, noAssert?: boolean): number; + writeFloatBE(value: number, offset: number, noAssert?: boolean): number; + writeDoubleLE(value: number, offset: number, noAssert?: boolean): number; + writeDoubleBE(value: number, offset: number, noAssert?: boolean): number; + fill(value: any, offset?: number, end?: number): this; + indexOf(value: string | number | Buffer, byteOffset?: number, encoding?: string): number; + lastIndexOf(value: string | number | Buffer, byteOffset?: number, encoding?: string): number; + includes(value: string | number | Buffer, byteOffset?: number, encoding?: string): boolean; + + /** + * Allocates a new buffer containing the given {str}. + * + * @param str String to store in buffer. + * @param encoding encoding to use, optional. Default is 'utf8' + */ + constructor (str: string, encoding?: string); + /** + * Allocates a new buffer of {size} octets. + * + * @param size count of octets to allocate. + */ + constructor (size: number); + /** + * Allocates a new buffer containing the given {array} of octets. + * + * @param array The octets to store. + */ + constructor (array: Uint8Array); + /** + * Produces a Buffer backed by the same allocated memory as + * the given {ArrayBuffer}. + * + * + * @param arrayBuffer The ArrayBuffer with which to share memory. + */ + constructor (arrayBuffer: ArrayBuffer); + /** + * Allocates a new buffer containing the given {array} of octets. + * + * @param array The octets to store. + */ + constructor (array: any[]); + /** + * Copies the passed {buffer} data onto a new {Buffer} instance. + * + * @param buffer The buffer to copy. + */ + constructor (buffer: Buffer); + prototype: Buffer; + /** + * Allocates a new Buffer using an {array} of octets. + * + * @param array + */ + static from(array: any[]): Buffer; + /** + * When passed a reference to the .buffer property of a TypedArray instance, + * the newly created Buffer will share the same allocated memory as the TypedArray. + * The optional {byteOffset} and {length} arguments specify a memory range + * within the {arrayBuffer} that will be shared by the Buffer. + * + * @param arrayBuffer The .buffer property of a TypedArray or a new ArrayBuffer() + * @param byteOffset + * @param length + */ + static from(arrayBuffer: ArrayBuffer, byteOffset?: number, length?: number): Buffer; + /** + * Copies the passed {buffer} data onto a new Buffer instance. + * + * @param buffer + */ + static from(buffer: Buffer): Buffer; + /** + * Creates a new Buffer containing the given JavaScript string {str}. + * If provided, the {encoding} parameter identifies the character encoding. + * If not provided, {encoding} defaults to 'utf8'. + * + * @param str + */ + static from(str: string, encoding?: string): Buffer; + /** + * Returns true if {obj} is a Buffer + * + * @param obj object to test. + */ + static isBuffer(obj: any): obj is Buffer; + /** + * Returns true if {encoding} is a valid encoding argument. 
+ * Valid string encodings in Node 0.12: 'ascii'|'utf8'|'utf16le'|'ucs2'(alias of 'utf16le')|'base64'|'binary'(deprecated)|'hex' + * + * @param encoding string to test. + */ + static isEncoding(encoding: string): boolean; + /** + * Gives the actual byte length of a string. encoding defaults to 'utf8'. + * This is not the same as String.prototype.length since that returns the number of characters in a string. + * + * @param string string to test. + * @param encoding encoding used to evaluate (defaults to 'utf8') + */ + static byteLength(string: string, encoding?: string): number; + /** + * Returns a buffer which is the result of concatenating all the buffers in the list together. + * + * If the list has no items, or if the totalLength is 0, then it returns a zero-length buffer. + * If the list has exactly one item, then the first item of the list is returned. + * If the list has more than one item, then a new Buffer is created. + * + * @param list An array of Buffer objects to concatenate + * @param totalLength Total length of the buffers when concatenated. + * If totalLength is not provided, it is read from the buffers in the list. However, this adds an additional loop to the function, so it is faster to provide the length explicitly. + */ + static concat(list: Buffer[], totalLength?: number): Buffer; + /** + * The same as buf1.compare(buf2). + */ + static compare(buf1: Buffer, buf2: Buffer): number; + /** + * Allocates a new buffer of {size} octets. + * + * @param size count of octets to allocate. + * @param fill if specified, buffer will be initialized by calling buf.fill(fill). + * If parameter is omitted, buffer will be filled with zeros. + * @param encoding encoding used for call to buf.fill while initalizing + */ + static alloc(size: number, fill?: string | Buffer | number, encoding?: string): Buffer; + /** + * Allocates a new buffer of {size} octets, leaving memory not initialized, so the contents + * of the newly created Buffer are unknown and may contain sensitive data. + * + * @param size count of octets to allocate + */ + static allocUnsafe(size: number): Buffer; + /** + * Allocates a new non-pooled buffer of {size} octets, leaving memory not initialized, so the contents + * of the newly created Buffer are unknown and may contain sensitive data. 
+ * + * @param size count of octets to allocate + */ + static allocUnsafeSlow(size: number): Buffer; + } +} \ No newline at end of file diff --git a/node_modules/safe-buffer/index.js b/node_modules/safe-buffer/index.js new file mode 100644 index 000000000..22438dabb --- /dev/null +++ b/node_modules/safe-buffer/index.js @@ -0,0 +1,62 @@ +/* eslint-disable node/no-deprecated-api */ +var buffer = require('buffer') +var Buffer = buffer.Buffer + +// alternative to using Object.keys for old browsers +function copyProps (src, dst) { + for (var key in src) { + dst[key] = src[key] + } +} +if (Buffer.from && Buffer.alloc && Buffer.allocUnsafe && Buffer.allocUnsafeSlow) { + module.exports = buffer +} else { + // Copy properties from require('buffer') + copyProps(buffer, exports) + exports.Buffer = SafeBuffer +} + +function SafeBuffer (arg, encodingOrOffset, length) { + return Buffer(arg, encodingOrOffset, length) +} + +// Copy static methods from Buffer +copyProps(Buffer, SafeBuffer) + +SafeBuffer.from = function (arg, encodingOrOffset, length) { + if (typeof arg === 'number') { + throw new TypeError('Argument must not be a number') + } + return Buffer(arg, encodingOrOffset, length) +} + +SafeBuffer.alloc = function (size, fill, encoding) { + if (typeof size !== 'number') { + throw new TypeError('Argument must be a number') + } + var buf = Buffer(size) + if (fill !== undefined) { + if (typeof encoding === 'string') { + buf.fill(fill, encoding) + } else { + buf.fill(fill) + } + } else { + buf.fill(0) + } + return buf +} + +SafeBuffer.allocUnsafe = function (size) { + if (typeof size !== 'number') { + throw new TypeError('Argument must be a number') + } + return Buffer(size) +} + +SafeBuffer.allocUnsafeSlow = function (size) { + if (typeof size !== 'number') { + throw new TypeError('Argument must be a number') + } + return buffer.SlowBuffer(size) +} diff --git a/node_modules/safe-buffer/package.json b/node_modules/safe-buffer/package.json new file mode 100644 index 000000000..cc7c7ad98 --- /dev/null +++ b/node_modules/safe-buffer/package.json @@ -0,0 +1,63 @@ +{ + "_from": "safe-buffer@5.1.2", + "_id": "safe-buffer@5.1.2", + "_inBundle": false, + "_integrity": "sha512-Gd2UZBJDkXlY7GbJxfsE8/nvKkUEU1G38c1siN6QP6a9PT9MmHB8GnpscSmMJSoF8LOIrt8ud/wPtojys4G6+g==", + "_location": "/safe-buffer", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "safe-buffer@5.1.2", + "name": "safe-buffer", + "escapedName": "safe-buffer", + "rawSpec": "5.1.2", + "saveSpec": null, + "fetchSpec": "5.1.2" + }, + "_requiredBy": [ + "/content-disposition", + "/express" + ], + "_resolved": "https://registry.npmjs.org/safe-buffer/-/safe-buffer-5.1.2.tgz", + "_shasum": "991ec69d296e0313747d59bdfd2b745c35f8828d", + "_spec": "safe-buffer@5.1.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "Feross Aboukhadijeh", + "email": "feross@feross.org", + "url": "http://feross.org" + }, + "bugs": { + "url": "https://github.com/feross/safe-buffer/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Safer Node.js Buffer API", + "devDependencies": { + "standard": "*", + "tape": "^4.0.0" + }, + "homepage": "https://github.com/feross/safe-buffer", + "keywords": [ + "buffer", + "buffer allocate", + "node security", + "safe", + "safe-buffer", + "security", + "uninitialized" + ], + "license": "MIT", + "main": "index.js", + "name": "safe-buffer", + "repository": { + "type": "git", + "url": 
"git://github.com/feross/safe-buffer.git" + }, + "scripts": { + "test": "standard && tape test/*.js" + }, + "types": "index.d.ts", + "version": "5.1.2" +} diff --git a/node_modules/safer-buffer/LICENSE b/node_modules/safer-buffer/LICENSE new file mode 100644 index 000000000..4fe9e6f10 --- /dev/null +++ b/node_modules/safer-buffer/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2018 Nikita Skovoroda + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/safer-buffer/Porting-Buffer.md b/node_modules/safer-buffer/Porting-Buffer.md new file mode 100644 index 000000000..68d86bab0 --- /dev/null +++ b/node_modules/safer-buffer/Porting-Buffer.md @@ -0,0 +1,268 @@ +# Porting to the Buffer.from/Buffer.alloc API + + +## Overview + +- [Variant 1: Drop support for Node.js ≤ 4.4.x and 5.0.0 — 5.9.x.](#variant-1) (*recommended*) +- [Variant 2: Use a polyfill](#variant-2) +- [Variant 3: manual detection, with safeguards](#variant-3) + +### Finding problematic bits of code using grep + +Just run `grep -nrE '[^a-zA-Z](Slow)?Buffer\s*\(' --exclude-dir node_modules`. + +It will find all the potentially unsafe places in your own code (with some considerably unlikely +exceptions). + +### Finding problematic bits of code using Node.js 8 + +If you’re using Node.js ≥ 8.0.0 (which is recommended), Node.js exposes multiple options that help with finding the relevant pieces of code: + +- `--trace-warnings` will make Node.js show a stack trace for this warning and other warnings that are printed by Node.js. +- `--trace-deprecation` does the same thing, but only for deprecation warnings. +- `--pending-deprecation` will show more types of deprecation warnings. In particular, it will show the `Buffer()` deprecation warning, even on Node.js 8. + +You can set these flags using an environment variable: + +```console +$ export NODE_OPTIONS='--trace-warnings --pending-deprecation' +$ cat example.js +'use strict'; +const foo = new Buffer('foo'); +$ node example.js +(node:7147) [DEP0005] DeprecationWarning: The Buffer() and new Buffer() constructors are not recommended for use due to security and usability concerns. Please use the new Buffer.alloc(), Buffer.allocUnsafe(), or Buffer.from() construction methods instead. + at showFlaggedDeprecation (buffer.js:127:13) + at new Buffer (buffer.js:148:3) + at Object. (/path/to/example.js:2:13) + [... more stack trace lines ...] 
+``` + +### Finding problematic bits of code using linters + +Eslint rules [no-buffer-constructor](https://eslint.org/docs/rules/no-buffer-constructor) +or +[node/no-deprecated-api](https://github.com/mysticatea/eslint-plugin-node/blob/master/docs/rules/no-deprecated-api.md) +also find calls to deprecated `Buffer()` API. Those rules are included in some pre-sets. + +There is a drawback, though, that it doesn't always +[work correctly](https://github.com/chalker/safer-buffer#why-not-safe-buffer) when `Buffer` is +overriden e.g. with a polyfill, so recommended is a combination of this and some other method +described above. + + +## Variant 1: Drop support for Node.js ≤ 4.4.x and 5.0.0 — 5.9.x. + +This is the recommended solution nowadays that would imply only minimal overhead. + +The Node.js 5.x release line has been unsupported since July 2016, and the Node.js 4.x release line reaches its End of Life in April 2018 (→ [Schedule](https://github.com/nodejs/Release#release-schedule)). This means that these versions of Node.js will *not* receive any updates, even in case of security issues, so using these release lines should be avoided, if at all possible. + +What you would do in this case is to convert all `new Buffer()` or `Buffer()` calls to use `Buffer.alloc()` or `Buffer.from()`, in the following way: + +- For `new Buffer(number)`, replace it with `Buffer.alloc(number)`. +- For `new Buffer(string)` (or `new Buffer(string, encoding)`), replace it with `Buffer.from(string)` (or `Buffer.from(string, encoding)`). +- For all other combinations of arguments (these are much rarer), also replace `new Buffer(...arguments)` with `Buffer.from(...arguments)`. + +Note that `Buffer.alloc()` is also _faster_ on the current Node.js versions than +`new Buffer(size).fill(0)`, which is what you would otherwise need to ensure zero-filling. + +Enabling eslint rule [no-buffer-constructor](https://eslint.org/docs/rules/no-buffer-constructor) +or +[node/no-deprecated-api](https://github.com/mysticatea/eslint-plugin-node/blob/master/docs/rules/no-deprecated-api.md) +is recommended to avoid accidential unsafe Buffer API usage. + +There is also a [JSCodeshift codemod](https://github.com/joyeecheung/node-dep-codemod#dep005) +for automatically migrating Buffer constructors to `Buffer.alloc()` or `Buffer.from()`. +Note that it currently only works with cases where the arguments are literals or where the +constructor is invoked with two arguments. + +_If you currently support those older Node.js versions and dropping them would be a semver-major change +for you, or if you support older branches of your packages, consider using [Variant 2](#variant-2) +or [Variant 3](#variant-3) on older branches, so people using those older branches will also receive +the fix. That way, you will eradicate potential issues caused by unguarded Buffer API usage and +your users will not observe a runtime deprecation warning when running your code on Node.js 10._ + + +## Variant 2: Use a polyfill + +Utilize [safer-buffer](https://www.npmjs.com/package/safer-buffer) as a polyfill to support older +Node.js versions. + +You would take exacly the same steps as in [Variant 1](#variant-1), but with a polyfill +`const Buffer = require('safer-buffer').Buffer` in all files where you use the new `Buffer` api. + +Make sure that you do not use old `new Buffer` API — in any files where the line above is added, +using old `new Buffer()` API will _throw_. It will be easy to notice that in CI, though. 
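A short sketch of what a converted file looks like under this variant (assuming `safer-buffer` is installed as a dependency); note how a leftover old-style call throws immediately:

```js
// Sketch: a file after the Variant 2 conversion, with the polyfill in place.
var Buffer = require('safer-buffer').Buffer

var zeroed = Buffer.alloc(16)            // was: new Buffer(16)
var greeting = Buffer.from('hi', 'utf8') // was: new Buffer('hi', 'utf8')
console.log(zeroed.length, greeting.toString()) // 16 'hi'

// Any call to the old constructor that slipped through now throws,
// which is what makes such leftovers easy to notice in CI.
try {
  new Buffer(16)
} catch (err) {
  console.log('old constructor rejected:', err.message)
}
```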
+ +Alternatively, you could use [buffer-from](https://www.npmjs.com/package/buffer-from) and/or +[buffer-alloc](https://www.npmjs.com/package/buffer-alloc) [ponyfills](https://ponyfill.com/) — +those are great, the only downsides being 4 deps in the tree and slightly more code changes to +migrate off them (as you would be using e.g. `Buffer.from` under a different name). If you need only +`Buffer.from` polyfilled — `buffer-from` alone which comes with no extra dependencies. + +_Alternatively, you could use [safe-buffer](https://www.npmjs.com/package/safe-buffer) — it also +provides a polyfill, but takes a different approach which has +[it's drawbacks](https://github.com/chalker/safer-buffer#why-not-safe-buffer). It will allow you +to also use the older `new Buffer()` API in your code, though — but that's arguably a benefit, as +it is problematic, can cause issues in your code, and will start emitting runtime deprecation +warnings starting with Node.js 10._ + +Note that in either case, it is important that you also remove all calls to the old Buffer +API manually — just throwing in `safe-buffer` doesn't fix the problem by itself, it just provides +a polyfill for the new API. I have seen people doing that mistake. + +Enabling eslint rule [no-buffer-constructor](https://eslint.org/docs/rules/no-buffer-constructor) +or +[node/no-deprecated-api](https://github.com/mysticatea/eslint-plugin-node/blob/master/docs/rules/no-deprecated-api.md) +is recommended. + +_Don't forget to drop the polyfill usage once you drop support for Node.js < 4.5.0._ + + +## Variant 3 — manual detection, with safeguards + +This is useful if you create Buffer instances in only a few places (e.g. one), or you have your own +wrapper around them. + +### Buffer(0) + +This special case for creating empty buffers can be safely replaced with `Buffer.concat([])`, which +returns the same result all the way down to Node.js 0.8.x. + +### Buffer(notNumber) + +Before: + +```js +var buf = new Buffer(notNumber, encoding); +``` + +After: + +```js +var buf; +if (Buffer.from && Buffer.from !== Uint8Array.from) { + buf = Buffer.from(notNumber, encoding); +} else { + if (typeof notNumber === 'number') + throw new Error('The "size" argument must be of type number.'); + buf = new Buffer(notNumber, encoding); +} +``` + +`encoding` is optional. + +Note that the `typeof notNumber` before `new Buffer` is required (for cases when `notNumber` argument is not +hard-coded) and _is not caused by the deprecation of Buffer constructor_ — it's exactly _why_ the +Buffer constructor is deprecated. Ecosystem packages lacking this type-check caused numereous +security issues — situations when unsanitized user input could end up in the `Buffer(arg)` create +problems ranging from DoS to leaking sensitive information to the attacker from the process memory. + +When `notNumber` argument is hardcoded (e.g. literal `"abc"` or `[0,1,2]`), the `typeof` check can +be omitted. + +Also note that using TypeScript does not fix this problem for you — when libs written in +`TypeScript` are used from JS, or when user input ends up there — it behaves exactly as pure JS, as +all type checks are translation-time only and are not present in the actual JS code which TS +compiles to. + +### Buffer(number) + +For Node.js 0.10.x (and below) support: + +```js +var buf; +if (Buffer.alloc) { + buf = Buffer.alloc(number); +} else { + buf = new Buffer(number); + buf.fill(0); +} +``` + +Otherwise (Node.js ≥ 0.12.x): + +```js +const buf = Buffer.alloc ? 
Buffer.alloc(number) : new Buffer(number).fill(0); +``` + +## Regarding Buffer.allocUnsafe + +Be extra cautious when using `Buffer.allocUnsafe`: + * Don't use it if you don't have a good reason to + * e.g. you probably won't ever see a performance difference for small buffers, in fact, those + might be even faster with `Buffer.alloc()`, + * if your code is not in the hot code path — you also probably won't notice a difference, + * keep in mind that zero-filling minimizes the potential risks. + * If you use it, make sure that you never return the buffer in a partially-filled state, + * if you are writing to it sequentially — always truncate it to the actuall written length + +Errors in handling buffers allocated with `Buffer.allocUnsafe` could result in various issues, +ranged from undefined behaviour of your code to sensitive data (user input, passwords, certs) +leaking to the remote attacker. + +_Note that the same applies to `new Buffer` usage without zero-filling, depending on the Node.js +version (and lacking type checks also adds DoS to the list of potential problems)._ + + +## FAQ + + +### What is wrong with the `Buffer` constructor? + +The `Buffer` constructor could be used to create a buffer in many different ways: + +- `new Buffer(42)` creates a `Buffer` of 42 bytes. Before Node.js 8, this buffer contained + *arbitrary memory* for performance reasons, which could include anything ranging from + program source code to passwords and encryption keys. +- `new Buffer('abc')` creates a `Buffer` that contains the UTF-8-encoded version of + the string `'abc'`. A second argument could specify another encoding: For example, + `new Buffer(string, 'base64')` could be used to convert a Base64 string into the original + sequence of bytes that it represents. +- There are several other combinations of arguments. + +This meant that, in code like `var buffer = new Buffer(foo);`, *it is not possible to tell +what exactly the contents of the generated buffer are* without knowing the type of `foo`. + +Sometimes, the value of `foo` comes from an external source. For example, this function +could be exposed as a service on a web server, converting a UTF-8 string into its Base64 form: + +``` +function stringToBase64(req, res) { + // The request body should have the format of `{ string: 'foobar' }` + const rawBytes = new Buffer(req.body.string) + const encoded = rawBytes.toString('base64') + res.end({ encoded: encoded }) +} +``` + +Note that this code does *not* validate the type of `req.body.string`: + +- `req.body.string` is expected to be a string. If this is the case, all goes well. +- `req.body.string` is controlled by the client that sends the request. +- If `req.body.string` is the *number* `50`, the `rawBytes` would be 50 bytes: + - Before Node.js 8, the content would be uninitialized + - After Node.js 8, the content would be `50` bytes with the value `0` + +Because of the missing type check, an attacker could intentionally send a number +as part of the request. Using this, they can either: + +- Read uninitialized memory. This **will** leak passwords, encryption keys and other + kinds of sensitive information. (Information leak) +- Force the program to allocate a large amount of memory. For example, when specifying + `500000000` as the input value, each request will allocate 500MB of memory. + This can be used to either exhaust the memory available of a program completely + and make it crash, or slow it down significantly. 
(Denial of Service) + +Both of these scenarios are considered serious security issues in a real-world +web server context. + +when using `Buffer.from(req.body.string)` instead, passing a number will always +throw an exception instead, giving a controlled behaviour that can always be +handled by the program. + + +### The `Buffer()` constructor has been deprecated for a while. Is this really an issue? + +Surveys of code in the `npm` ecosystem have shown that the `Buffer()` constructor is still +widely used. This includes new code, and overall usage of such code has actually been +*increasing*. diff --git a/node_modules/safer-buffer/Readme.md b/node_modules/safer-buffer/Readme.md new file mode 100644 index 000000000..14b082290 --- /dev/null +++ b/node_modules/safer-buffer/Readme.md @@ -0,0 +1,156 @@ +# safer-buffer [![travis][travis-image]][travis-url] [![npm][npm-image]][npm-url] [![javascript style guide][standard-image]][standard-url] [![Security Responsible Disclosure][secuirty-image]][secuirty-url] + +[travis-image]: https://travis-ci.org/ChALkeR/safer-buffer.svg?branch=master +[travis-url]: https://travis-ci.org/ChALkeR/safer-buffer +[npm-image]: https://img.shields.io/npm/v/safer-buffer.svg +[npm-url]: https://npmjs.org/package/safer-buffer +[standard-image]: https://img.shields.io/badge/code_style-standard-brightgreen.svg +[standard-url]: https://standardjs.com +[secuirty-image]: https://img.shields.io/badge/Security-Responsible%20Disclosure-green.svg +[secuirty-url]: https://github.com/nodejs/security-wg/blob/master/processes/responsible_disclosure_template.md + +Modern Buffer API polyfill without footguns, working on Node.js from 0.8 to current. + +## How to use? + +First, port all `Buffer()` and `new Buffer()` calls to `Buffer.alloc()` and `Buffer.from()` API. + +Then, to achieve compatibility with outdated Node.js versions (`<4.5.0` and 5.x `<5.9.0`), use +`const Buffer = require('safer-buffer').Buffer` in all files where you make calls to the new +Buffer API. _Use `var` instead of `const` if you need that for your Node.js version range support._ + +Also, see the +[porting Buffer](https://github.com/ChALkeR/safer-buffer/blob/master/Porting-Buffer.md) guide. + +## Do I need it? + +Hopefully, not — dropping support for outdated Node.js versions should be fine nowdays, and that +is the recommended path forward. You _do_ need to port to the `Buffer.alloc()` and `Buffer.from()` +though. + +See the [porting guide](https://github.com/ChALkeR/safer-buffer/blob/master/Porting-Buffer.md) +for a better description. + +## Why not [safe-buffer](https://npmjs.com/safe-buffer)? + +_In short: while `safe-buffer` serves as a polyfill for the new API, it allows old API usage and +itself contains footguns._ + +`safe-buffer` could be used safely to get the new API while still keeping support for older +Node.js versions (like this module), but while analyzing ecosystem usage of the old Buffer API +I found out that `safe-buffer` is itself causing problems in some cases. + +For example, consider the following snippet: + +```console +$ cat example.unsafe.js +console.log(Buffer(20)) +$ ./node-v6.13.0-linux-x64/bin/node example.unsafe.js + +$ standard example.unsafe.js +standard: Use JavaScript Standard Style (https://standardjs.com) + /home/chalker/repo/safer-buffer/example.unsafe.js:2:13: 'Buffer()' was deprecated since v6. Use 'Buffer.alloc()' or 'Buffer.from()' (use 'https://www.npmjs.com/package/safe-buffer' for '<4.5.0') instead. 
+``` + +This is allocates and writes to console an uninitialized chunk of memory. +[standard](https://www.npmjs.com/package/standard) linter (among others) catch that and warn people +to avoid using unsafe API. + +Let's now throw in `safe-buffer`! + +```console +$ cat example.safe-buffer.js +const Buffer = require('safe-buffer').Buffer +console.log(Buffer(20)) +$ standard example.safe-buffer.js +$ ./node-v6.13.0-linux-x64/bin/node example.safe-buffer.js + +``` + +See the problem? Adding in `safe-buffer` _magically removes the lint warning_, but the behavior +remains identiсal to what we had before, and when launched on Node.js 6.x LTS — this dumps out +chunks of uninitialized memory. +_And this code will still emit runtime warnings on Node.js 10.x and above._ + +That was done by design. I first considered changing `safe-buffer`, prohibiting old API usage or +emitting warnings on it, but that significantly diverges from `safe-buffer` design. After some +discussion, it was decided to move my approach into a separate package, and _this is that separate +package_. + +This footgun is not imaginary — I observed top-downloaded packages doing that kind of thing, +«fixing» the lint warning by blindly including `safe-buffer` without any actual changes. + +Also in some cases, even if the API _was_ migrated to use of safe Buffer API — a random pull request +can bring unsafe Buffer API usage back to the codebase by adding new calls — and that could go +unnoticed even if you have a linter prohibiting that (becase of the reason stated above), and even +pass CI. _I also observed that being done in popular packages._ + +Some examples: + * [webdriverio](https://github.com/webdriverio/webdriverio/commit/05cbd3167c12e4930f09ef7cf93b127ba4effae4#diff-124380949022817b90b622871837d56cR31) + (a module with 548 759 downloads/month), + * [websocket-stream](https://github.com/maxogden/websocket-stream/commit/c9312bd24d08271687d76da0fe3c83493871cf61) + (218 288 d/m, fix in [maxogden/websocket-stream#142](https://github.com/maxogden/websocket-stream/pull/142)), + * [node-serialport](https://github.com/node-serialport/node-serialport/commit/e8d9d2b16c664224920ce1c895199b1ce2def48c) + (113 138 d/m, fix in [node-serialport/node-serialport#1510](https://github.com/node-serialport/node-serialport/pull/1510)), + * [karma](https://github.com/karma-runner/karma/commit/3d94b8cf18c695104ca195334dc75ff054c74eec) + (3 973 193 d/m, fix in [karma-runner/karma#2947](https://github.com/karma-runner/karma/pull/2947)), + * [spdy-transport](https://github.com/spdy-http2/spdy-transport/commit/5375ac33f4a62a4f65bcfc2827447d42a5dbe8b1) + (5 970 727 d/m, fix in [spdy-http2/spdy-transport#53](https://github.com/spdy-http2/spdy-transport/pull/53)). + * And there are a lot more over the ecosystem. + +I filed a PR at +[mysticatea/eslint-plugin-node#110](https://github.com/mysticatea/eslint-plugin-node/pull/110) to +partially fix that (for cases when that lint rule is used), but it is a semver-major change for +linter rules and presets, so it would take significant time for that to reach actual setups. +_It also hasn't been released yet (2018-03-20)._ + +Also, `safer-buffer` discourages the usage of `.allocUnsafe()`, which is often done by a mistake. +It still supports it with an explicit concern barier, by placing it under +`require('safer-buffer/dangereous')`. + +## But isn't throwing bad? + +Not really. 
It's an error that could be noticed and fixed early, instead of causing havoc later like +unguarded `new Buffer()` calls that end up receiving user input can do. + +This package affects only the files where `var Buffer = require('safer-buffer').Buffer` was done, so +it is really simple to keep track of things and make sure that you don't mix old API usage with that. +Also, CI should hint anything that you might have missed. + +New commits, if tested, won't land new usage of unsafe Buffer API this way. +_Node.js 10.x also deals with that by printing a runtime depecation warning._ + +### Would it affect third-party modules? + +No, unless you explicitly do an awful thing like monkey-patching or overriding the built-in `Buffer`. +Don't do that. + +### But I don't want throwing… + +That is also fine! + +Also, it could be better in some cases when you don't comprehensive enough test coverage. + +In that case — just don't override `Buffer` and use +`var SaferBuffer = require('safer-buffer').Buffer` instead. + +That way, everything using `Buffer` natively would still work, but there would be two drawbacks: + +* `Buffer.from`/`Buffer.alloc` won't be polyfilled — use `SaferBuffer.from` and + `SaferBuffer.alloc` instead. +* You are still open to accidentally using the insecure deprecated API — use a linter to catch that. + +Note that using a linter to catch accidential `Buffer` constructor usage in this case is strongly +recommended. `Buffer` is not overriden in this usecase, so linters won't get confused. + +## «Without footguns»? + +Well, it is still possible to do _some_ things with `Buffer` API, e.g. accessing `.buffer` property +on older versions and duping things from there. You shouldn't do that in your code, probabably. + +The intention is to remove the most significant footguns that affect lots of packages in the +ecosystem, and to do it in the proper way. + +Also, this package doesn't protect against security issues affecting some Node.js versions, so for +usage in your own production code, it is still recommended to update to a Node.js version +[supported by upstream](https://github.com/nodejs/release#release-schedule). diff --git a/node_modules/safer-buffer/dangerous.js b/node_modules/safer-buffer/dangerous.js new file mode 100644 index 000000000..ca41fdc54 --- /dev/null +++ b/node_modules/safer-buffer/dangerous.js @@ -0,0 +1,58 @@ +/* eslint-disable node/no-deprecated-api */ + +'use strict' + +var buffer = require('buffer') +var Buffer = buffer.Buffer +var safer = require('./safer.js') +var Safer = safer.Buffer + +var dangerous = {} + +var key + +for (key in safer) { + if (!safer.hasOwnProperty(key)) continue + dangerous[key] = safer[key] +} + +var Dangereous = dangerous.Buffer = {} + +// Copy Safer API +for (key in Safer) { + if (!Safer.hasOwnProperty(key)) continue + Dangereous[key] = Safer[key] +} + +// Copy those missing unsafe methods, if they are present +for (key in Buffer) { + if (!Buffer.hasOwnProperty(key)) continue + if (Dangereous.hasOwnProperty(key)) continue + Dangereous[key] = Buffer[key] +} + +if (!Dangereous.allocUnsafe) { + Dangereous.allocUnsafe = function (size) { + if (typeof size !== 'number') { + throw new TypeError('The "size" argument must be of type number. 
Received type ' + typeof size) + } + if (size < 0 || size >= 2 * (1 << 30)) { + throw new RangeError('The value "' + size + '" is invalid for option "size"') + } + return Buffer(size) + } +} + +if (!Dangereous.allocUnsafeSlow) { + Dangereous.allocUnsafeSlow = function (size) { + if (typeof size !== 'number') { + throw new TypeError('The "size" argument must be of type number. Received type ' + typeof size) + } + if (size < 0 || size >= 2 * (1 << 30)) { + throw new RangeError('The value "' + size + '" is invalid for option "size"') + } + return buffer.SlowBuffer(size) + } +} + +module.exports = dangerous diff --git a/node_modules/safer-buffer/package.json b/node_modules/safer-buffer/package.json new file mode 100644 index 000000000..9290a49ef --- /dev/null +++ b/node_modules/safer-buffer/package.json @@ -0,0 +1,60 @@ +{ + "_from": "safer-buffer@>= 2.1.2 < 3", + "_id": "safer-buffer@2.1.2", + "_inBundle": false, + "_integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==", + "_location": "/safer-buffer", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "safer-buffer@>= 2.1.2 < 3", + "name": "safer-buffer", + "escapedName": "safer-buffer", + "rawSpec": ">= 2.1.2 < 3", + "saveSpec": null, + "fetchSpec": ">= 2.1.2 < 3" + }, + "_requiredBy": [ + "/iconv-lite" + ], + "_resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz", + "_shasum": "44fa161b0187b9549dd84bb91802f9bd8385cd6a", + "_spec": "safer-buffer@>= 2.1.2 < 3", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\iconv-lite", + "author": { + "name": "Nikita Skovoroda", + "email": "chalkerx@gmail.com", + "url": "https://github.com/ChALkeR" + }, + "bugs": { + "url": "https://github.com/ChALkeR/safer-buffer/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Modern Buffer API polyfill without footguns", + "devDependencies": { + "standard": "^11.0.1", + "tape": "^4.9.0" + }, + "files": [ + "Porting-Buffer.md", + "Readme.md", + "tests.js", + "dangerous.js", + "safer.js" + ], + "homepage": "https://github.com/ChALkeR/safer-buffer#readme", + "license": "MIT", + "main": "safer.js", + "name": "safer-buffer", + "repository": { + "type": "git", + "url": "git+https://github.com/ChALkeR/safer-buffer.git" + }, + "scripts": { + "browserify-test": "browserify --external tape tests.js > browserify-tests.js && tape browserify-tests.js", + "test": "standard && tape tests.js" + }, + "version": "2.1.2" +} diff --git a/node_modules/safer-buffer/safer.js b/node_modules/safer-buffer/safer.js new file mode 100644 index 000000000..37c7e1aa6 --- /dev/null +++ b/node_modules/safer-buffer/safer.js @@ -0,0 +1,77 @@ +/* eslint-disable node/no-deprecated-api */ + +'use strict' + +var buffer = require('buffer') +var Buffer = buffer.Buffer + +var safer = {} + +var key + +for (key in buffer) { + if (!buffer.hasOwnProperty(key)) continue + if (key === 'SlowBuffer' || key === 'Buffer') continue + safer[key] = buffer[key] +} + +var Safer = safer.Buffer = {} +for (key in Buffer) { + if (!Buffer.hasOwnProperty(key)) continue + if (key === 'allocUnsafe' || key === 'allocUnsafeSlow') continue + Safer[key] = Buffer[key] +} + +safer.Buffer.prototype = Buffer.prototype + +if (!Safer.from || Safer.from === Uint8Array.from) { + Safer.from = function (value, encodingOrOffset, length) { + if (typeof value === 'number') { + throw new TypeError('The "value" argument must not be of type number. 
Received type ' + typeof value) + } + if (value && typeof value.length === 'undefined') { + throw new TypeError('The first argument must be one of type string, Buffer, ArrayBuffer, Array, or Array-like Object. Received type ' + typeof value) + } + return Buffer(value, encodingOrOffset, length) + } +} + +if (!Safer.alloc) { + Safer.alloc = function (size, fill, encoding) { + if (typeof size !== 'number') { + throw new TypeError('The "size" argument must be of type number. Received type ' + typeof size) + } + if (size < 0 || size >= 2 * (1 << 30)) { + throw new RangeError('The value "' + size + '" is invalid for option "size"') + } + var buf = Buffer(size) + if (!fill || fill.length === 0) { + buf.fill(0) + } else if (typeof encoding === 'string') { + buf.fill(fill, encoding) + } else { + buf.fill(fill) + } + return buf + } +} + +if (!safer.kStringMaxLength) { + try { + safer.kStringMaxLength = process.binding('buffer').kStringMaxLength + } catch (e) { + // we can't determine kStringMaxLength in environments where process.binding + // is unsupported, so let's not set it + } +} + +if (!safer.constants) { + safer.constants = { + MAX_LENGTH: safer.kMaxLength + } + if (safer.kStringMaxLength) { + safer.constants.MAX_STRING_LENGTH = safer.kStringMaxLength + } +} + +module.exports = safer diff --git a/node_modules/safer-buffer/tests.js b/node_modules/safer-buffer/tests.js new file mode 100644 index 000000000..7ed2777c9 --- /dev/null +++ b/node_modules/safer-buffer/tests.js @@ -0,0 +1,406 @@ +/* eslint-disable node/no-deprecated-api */ + +'use strict' + +var test = require('tape') + +var buffer = require('buffer') + +var index = require('./') +var safer = require('./safer') +var dangerous = require('./dangerous') + +/* Inheritance tests */ + +test('Default is Safer', function (t) { + t.equal(index, safer) + t.notEqual(safer, dangerous) + t.notEqual(index, dangerous) + t.end() +}) + +test('Is not a function', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.equal(typeof impl, 'object') + t.equal(typeof impl.Buffer, 'object') + }); + [buffer].forEach(function (impl) { + t.equal(typeof impl, 'object') + t.equal(typeof impl.Buffer, 'function') + }) + t.end() +}) + +test('Constructor throws', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.throws(function () { impl.Buffer() }) + t.throws(function () { impl.Buffer(0) }) + t.throws(function () { impl.Buffer('a') }) + t.throws(function () { impl.Buffer('a', 'utf-8') }) + t.throws(function () { return new impl.Buffer() }) + t.throws(function () { return new impl.Buffer(0) }) + t.throws(function () { return new impl.Buffer('a') }) + t.throws(function () { return new impl.Buffer('a', 'utf-8') }) + }) + t.end() +}) + +test('Safe methods exist', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.equal(typeof impl.Buffer.alloc, 'function', 'alloc') + t.equal(typeof impl.Buffer.from, 'function', 'from') + }) + t.end() +}) + +test('Unsafe methods exist only in Dangerous', function (t) { + [index, safer].forEach(function (impl) { + t.equal(typeof impl.Buffer.allocUnsafe, 'undefined') + t.equal(typeof impl.Buffer.allocUnsafeSlow, 'undefined') + }); + [dangerous].forEach(function (impl) { + t.equal(typeof impl.Buffer.allocUnsafe, 'function') + t.equal(typeof impl.Buffer.allocUnsafeSlow, 'function') + }) + t.end() +}) + +test('Generic methods/properties are defined and equal', function (t) { + ['poolSize', 'isBuffer', 'concat', 'byteLength'].forEach(function (method) { + [index, safer, 
dangerous].forEach(function (impl) { + t.equal(impl.Buffer[method], buffer.Buffer[method], method) + t.notEqual(typeof impl.Buffer[method], 'undefined', method) + }) + }) + t.end() +}) + +test('Built-in buffer static methods/properties are inherited', function (t) { + Object.keys(buffer).forEach(function (method) { + if (method === 'SlowBuffer' || method === 'Buffer') return; + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl[method], buffer[method], method) + t.notEqual(typeof impl[method], 'undefined', method) + }) + }) + t.end() +}) + +test('Built-in Buffer static methods/properties are inherited', function (t) { + Object.keys(buffer.Buffer).forEach(function (method) { + if (method === 'allocUnsafe' || method === 'allocUnsafeSlow') return; + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl.Buffer[method], buffer.Buffer[method], method) + t.notEqual(typeof impl.Buffer[method], 'undefined', method) + }) + }) + t.end() +}) + +test('.prototype property of Buffer is inherited', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl.Buffer.prototype, buffer.Buffer.prototype, 'prototype') + t.notEqual(typeof impl.Buffer.prototype, 'undefined', 'prototype') + }) + t.end() +}) + +test('All Safer methods are present in Dangerous', function (t) { + Object.keys(safer).forEach(function (method) { + if (method === 'Buffer') return; + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl[method], safer[method], method) + if (method !== 'kStringMaxLength') { + t.notEqual(typeof impl[method], 'undefined', method) + } + }) + }) + Object.keys(safer.Buffer).forEach(function (method) { + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl.Buffer[method], safer.Buffer[method], method) + t.notEqual(typeof impl.Buffer[method], 'undefined', method) + }) + }) + t.end() +}) + +test('Safe methods from Dangerous methods are present in Safer', function (t) { + Object.keys(dangerous).forEach(function (method) { + if (method === 'Buffer') return; + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl[method], dangerous[method], method) + if (method !== 'kStringMaxLength') { + t.notEqual(typeof impl[method], 'undefined', method) + } + }) + }) + Object.keys(dangerous.Buffer).forEach(function (method) { + if (method === 'allocUnsafe' || method === 'allocUnsafeSlow') return; + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl.Buffer[method], dangerous.Buffer[method], method) + t.notEqual(typeof impl.Buffer[method], 'undefined', method) + }) + }) + t.end() +}) + +/* Behaviour tests */ + +test('Methods return Buffers', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.ok(buffer.Buffer.isBuffer(impl.Buffer.alloc(0))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.alloc(0, 10))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.alloc(0, 'a'))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.alloc(10))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.alloc(10, 'x'))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.alloc(9, 'ab'))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from(''))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from('string'))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from('string', 'utf-8'))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from('b25ldHdvdGhyZWU=', 'base64'))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from([0, 42, 3]))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from(new Uint8Array([0, 42, 3])))) + t.ok(buffer.Buffer.isBuffer(impl.Buffer.from([]))) + }); + ['allocUnsafe', 
'allocUnsafeSlow'].forEach(function (method) { + t.ok(buffer.Buffer.isBuffer(dangerous.Buffer[method](0))) + t.ok(buffer.Buffer.isBuffer(dangerous.Buffer[method](10))) + }) + t.end() +}) + +test('Constructor is buffer.Buffer', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl.Buffer.alloc(0).constructor, buffer.Buffer) + t.equal(impl.Buffer.alloc(0, 10).constructor, buffer.Buffer) + t.equal(impl.Buffer.alloc(0, 'a').constructor, buffer.Buffer) + t.equal(impl.Buffer.alloc(10).constructor, buffer.Buffer) + t.equal(impl.Buffer.alloc(10, 'x').constructor, buffer.Buffer) + t.equal(impl.Buffer.alloc(9, 'ab').constructor, buffer.Buffer) + t.equal(impl.Buffer.from('').constructor, buffer.Buffer) + t.equal(impl.Buffer.from('string').constructor, buffer.Buffer) + t.equal(impl.Buffer.from('string', 'utf-8').constructor, buffer.Buffer) + t.equal(impl.Buffer.from('b25ldHdvdGhyZWU=', 'base64').constructor, buffer.Buffer) + t.equal(impl.Buffer.from([0, 42, 3]).constructor, buffer.Buffer) + t.equal(impl.Buffer.from(new Uint8Array([0, 42, 3])).constructor, buffer.Buffer) + t.equal(impl.Buffer.from([]).constructor, buffer.Buffer) + }); + [0, 10, 100].forEach(function (arg) { + t.equal(dangerous.Buffer.allocUnsafe(arg).constructor, buffer.Buffer) + t.equal(dangerous.Buffer.allocUnsafeSlow(arg).constructor, buffer.SlowBuffer(0).constructor) + }) + t.end() +}) + +test('Invalid calls throw', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.throws(function () { impl.Buffer.from(0) }) + t.throws(function () { impl.Buffer.from(10) }) + t.throws(function () { impl.Buffer.from(10, 'utf-8') }) + t.throws(function () { impl.Buffer.from('string', 'invalid encoding') }) + t.throws(function () { impl.Buffer.from(-10) }) + t.throws(function () { impl.Buffer.from(1e90) }) + t.throws(function () { impl.Buffer.from(Infinity) }) + t.throws(function () { impl.Buffer.from(-Infinity) }) + t.throws(function () { impl.Buffer.from(NaN) }) + t.throws(function () { impl.Buffer.from(null) }) + t.throws(function () { impl.Buffer.from(undefined) }) + t.throws(function () { impl.Buffer.from() }) + t.throws(function () { impl.Buffer.from({}) }) + t.throws(function () { impl.Buffer.alloc('') }) + t.throws(function () { impl.Buffer.alloc('string') }) + t.throws(function () { impl.Buffer.alloc('string', 'utf-8') }) + t.throws(function () { impl.Buffer.alloc('b25ldHdvdGhyZWU=', 'base64') }) + t.throws(function () { impl.Buffer.alloc(-10) }) + t.throws(function () { impl.Buffer.alloc(1e90) }) + t.throws(function () { impl.Buffer.alloc(2 * (1 << 30)) }) + t.throws(function () { impl.Buffer.alloc(Infinity) }) + t.throws(function () { impl.Buffer.alloc(-Infinity) }) + t.throws(function () { impl.Buffer.alloc(null) }) + t.throws(function () { impl.Buffer.alloc(undefined) }) + t.throws(function () { impl.Buffer.alloc() }) + t.throws(function () { impl.Buffer.alloc([]) }) + t.throws(function () { impl.Buffer.alloc([0, 42, 3]) }) + t.throws(function () { impl.Buffer.alloc({}) }) + }); + ['allocUnsafe', 'allocUnsafeSlow'].forEach(function (method) { + t.throws(function () { dangerous.Buffer[method]('') }) + t.throws(function () { dangerous.Buffer[method]('string') }) + t.throws(function () { dangerous.Buffer[method]('string', 'utf-8') }) + t.throws(function () { dangerous.Buffer[method](2 * (1 << 30)) }) + t.throws(function () { dangerous.Buffer[method](Infinity) }) + if (dangerous.Buffer[method] === buffer.Buffer.allocUnsafe) { + t.skip('Skipping, older impl of allocUnsafe coerced negative 
sizes to 0') + } else { + t.throws(function () { dangerous.Buffer[method](-10) }) + t.throws(function () { dangerous.Buffer[method](-1e90) }) + t.throws(function () { dangerous.Buffer[method](-Infinity) }) + } + t.throws(function () { dangerous.Buffer[method](null) }) + t.throws(function () { dangerous.Buffer[method](undefined) }) + t.throws(function () { dangerous.Buffer[method]() }) + t.throws(function () { dangerous.Buffer[method]([]) }) + t.throws(function () { dangerous.Buffer[method]([0, 42, 3]) }) + t.throws(function () { dangerous.Buffer[method]({}) }) + }) + t.end() +}) + +test('Buffers have appropriate lengths', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.equal(impl.Buffer.alloc(0).length, 0) + t.equal(impl.Buffer.alloc(10).length, 10) + t.equal(impl.Buffer.from('').length, 0) + t.equal(impl.Buffer.from('string').length, 6) + t.equal(impl.Buffer.from('string', 'utf-8').length, 6) + t.equal(impl.Buffer.from('b25ldHdvdGhyZWU=', 'base64').length, 11) + t.equal(impl.Buffer.from([0, 42, 3]).length, 3) + t.equal(impl.Buffer.from(new Uint8Array([0, 42, 3])).length, 3) + t.equal(impl.Buffer.from([]).length, 0) + }); + ['allocUnsafe', 'allocUnsafeSlow'].forEach(function (method) { + t.equal(dangerous.Buffer[method](0).length, 0) + t.equal(dangerous.Buffer[method](10).length, 10) + }) + t.end() +}) + +test('Buffers have appropriate lengths (2)', function (t) { + t.equal(index.Buffer.alloc, safer.Buffer.alloc) + t.equal(index.Buffer.alloc, dangerous.Buffer.alloc) + var ok = true; + [ safer.Buffer.alloc, + dangerous.Buffer.allocUnsafe, + dangerous.Buffer.allocUnsafeSlow + ].forEach(function (method) { + for (var i = 0; i < 1e2; i++) { + var length = Math.round(Math.random() * 1e5) + var buf = method(length) + if (!buffer.Buffer.isBuffer(buf)) ok = false + if (buf.length !== length) ok = false + } + }) + t.ok(ok) + t.end() +}) + +test('.alloc(size) is zero-filled and has correct length', function (t) { + t.equal(index.Buffer.alloc, safer.Buffer.alloc) + t.equal(index.Buffer.alloc, dangerous.Buffer.alloc) + var ok = true + for (var i = 0; i < 1e2; i++) { + var length = Math.round(Math.random() * 2e6) + var buf = index.Buffer.alloc(length) + if (!buffer.Buffer.isBuffer(buf)) ok = false + if (buf.length !== length) ok = false + var j + for (j = 0; j < length; j++) { + if (buf[j] !== 0) ok = false + } + buf.fill(1) + for (j = 0; j < length; j++) { + if (buf[j] !== 1) ok = false + } + } + t.ok(ok) + t.end() +}) + +test('.allocUnsafe / .allocUnsafeSlow are fillable and have correct lengths', function (t) { + ['allocUnsafe', 'allocUnsafeSlow'].forEach(function (method) { + var ok = true + for (var i = 0; i < 1e2; i++) { + var length = Math.round(Math.random() * 2e6) + var buf = dangerous.Buffer[method](length) + if (!buffer.Buffer.isBuffer(buf)) ok = false + if (buf.length !== length) ok = false + buf.fill(0, 0, length) + var j + for (j = 0; j < length; j++) { + if (buf[j] !== 0) ok = false + } + buf.fill(1, 0, length) + for (j = 0; j < length; j++) { + if (buf[j] !== 1) ok = false + } + } + t.ok(ok, method) + }) + t.end() +}) + +test('.alloc(size, fill) is `fill`-filled', function (t) { + t.equal(index.Buffer.alloc, safer.Buffer.alloc) + t.equal(index.Buffer.alloc, dangerous.Buffer.alloc) + var ok = true + for (var i = 0; i < 1e2; i++) { + var length = Math.round(Math.random() * 2e6) + var fill = Math.round(Math.random() * 255) + var buf = index.Buffer.alloc(length, fill) + if (!buffer.Buffer.isBuffer(buf)) ok = false + if (buf.length !== length) ok = false + for (var j 
= 0; j < length; j++) { + if (buf[j] !== fill) ok = false + } + } + t.ok(ok) + t.end() +}) + +test('.alloc(size, fill) is `fill`-filled', function (t) { + t.equal(index.Buffer.alloc, safer.Buffer.alloc) + t.equal(index.Buffer.alloc, dangerous.Buffer.alloc) + var ok = true + for (var i = 0; i < 1e2; i++) { + var length = Math.round(Math.random() * 2e6) + var fill = Math.round(Math.random() * 255) + var buf = index.Buffer.alloc(length, fill) + if (!buffer.Buffer.isBuffer(buf)) ok = false + if (buf.length !== length) ok = false + for (var j = 0; j < length; j++) { + if (buf[j] !== fill) ok = false + } + } + t.ok(ok) + t.deepEqual(index.Buffer.alloc(9, 'a'), index.Buffer.alloc(9, 97)) + t.notDeepEqual(index.Buffer.alloc(9, 'a'), index.Buffer.alloc(9, 98)) + + var tmp = new buffer.Buffer(2) + tmp.fill('ok') + if (tmp[1] === tmp[0]) { + // Outdated Node.js + t.deepEqual(index.Buffer.alloc(5, 'ok'), index.Buffer.from('ooooo')) + } else { + t.deepEqual(index.Buffer.alloc(5, 'ok'), index.Buffer.from('okoko')) + } + t.notDeepEqual(index.Buffer.alloc(5, 'ok'), index.Buffer.from('kokok')) + + t.end() +}) + +test('safer.Buffer.from returns results same as Buffer constructor', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.deepEqual(impl.Buffer.from(''), new buffer.Buffer('')) + t.deepEqual(impl.Buffer.from('string'), new buffer.Buffer('string')) + t.deepEqual(impl.Buffer.from('string', 'utf-8'), new buffer.Buffer('string', 'utf-8')) + t.deepEqual(impl.Buffer.from('b25ldHdvdGhyZWU=', 'base64'), new buffer.Buffer('b25ldHdvdGhyZWU=', 'base64')) + t.deepEqual(impl.Buffer.from([0, 42, 3]), new buffer.Buffer([0, 42, 3])) + t.deepEqual(impl.Buffer.from(new Uint8Array([0, 42, 3])), new buffer.Buffer(new Uint8Array([0, 42, 3]))) + t.deepEqual(impl.Buffer.from([]), new buffer.Buffer([])) + }) + t.end() +}) + +test('safer.Buffer.from returns consistent results', function (t) { + [index, safer, dangerous].forEach(function (impl) { + t.deepEqual(impl.Buffer.from(''), impl.Buffer.alloc(0)) + t.deepEqual(impl.Buffer.from([]), impl.Buffer.alloc(0)) + t.deepEqual(impl.Buffer.from(new Uint8Array([])), impl.Buffer.alloc(0)) + t.deepEqual(impl.Buffer.from('string', 'utf-8'), impl.Buffer.from('string')) + t.deepEqual(impl.Buffer.from('string'), impl.Buffer.from([115, 116, 114, 105, 110, 103])) + t.deepEqual(impl.Buffer.from('string'), impl.Buffer.from(impl.Buffer.from('string'))) + t.deepEqual(impl.Buffer.from('b25ldHdvdGhyZWU=', 'base64'), impl.Buffer.from('onetwothree')) + t.notDeepEqual(impl.Buffer.from('b25ldHdvdGhyZWU='), impl.Buffer.from('onetwothree')) + }) + t.end() +}) diff --git a/node_modules/saslprep/.editorconfig b/node_modules/saslprep/.editorconfig new file mode 100644 index 000000000..d1d8a4176 --- /dev/null +++ b/node_modules/saslprep/.editorconfig @@ -0,0 +1,10 @@ +# http://editorconfig.org +root = true + +[*] +indent_style = space +indent_size = 2 +end_of_line = lf +charset = utf-8 +trim_trailing_whitespace = true +insert_final_newline = true diff --git a/node_modules/saslprep/.gitattributes b/node_modules/saslprep/.gitattributes new file mode 100644 index 000000000..3ba45360c --- /dev/null +++ b/node_modules/saslprep/.gitattributes @@ -0,0 +1 @@ +*.mem binary diff --git a/node_modules/saslprep/.travis.yml b/node_modules/saslprep/.travis.yml new file mode 100644 index 000000000..0bca82653 --- /dev/null +++ b/node_modules/saslprep/.travis.yml @@ -0,0 +1,10 @@ +sudo: false +language: node_js +node_js: + - "6" + - "8" + - "10" + - "12" + +before_install: +- npm install -g npm@6 
diff --git a/node_modules/saslprep/CHANGELOG.md b/node_modules/saslprep/CHANGELOG.md new file mode 100644 index 000000000..779807877 --- /dev/null +++ b/node_modules/saslprep/CHANGELOG.md @@ -0,0 +1,19 @@ +# Change Log +All notable changes to the "saslprep" package will be documented in this file. + +## [1.0.3] - 2019-05-01 + +- Correctly get code points >U+FFFF ([#5](https://github.com/reklatsmasters/saslprep/pull/5)) +- Fix perfomance downgrades from [#5](https://github.com/reklatsmasters/saslprep/pull/5). + +## [1.0.2] - 2018-09-13 + +- Reduced initialization time ([#3](https://github.com/reklatsmasters/saslprep/issues/3)) + +## [1.0.1] - 2018-06-20 + +- Reduced stack overhead of range creation ([#2](https://github.com/reklatsmasters/saslprep/pull/2)) + +## [1.0.0] - 2017-06-21 + +- First release diff --git a/node_modules/saslprep/LICENSE b/node_modules/saslprep/LICENSE new file mode 100644 index 000000000..481c7a50f --- /dev/null +++ b/node_modules/saslprep/LICENSE @@ -0,0 +1,22 @@ +Copyright (c) 2014 Dmitry Tsvettsikh + +Permission is hereby granted, free of charge, to any person +obtaining a copy of this software and associated documentation +files (the "Software"), to deal in the Software without +restriction, including without limitation the rights to use, +copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the +Software is furnished to do so, subject to the following +conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES +OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT +HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, +WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING +FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR +OTHER DEALINGS IN THE SOFTWARE. \ No newline at end of file diff --git a/node_modules/saslprep/code-points.mem b/node_modules/saslprep/code-points.mem new file mode 100644 index 000000000..4781b0668 Binary files /dev/null and b/node_modules/saslprep/code-points.mem differ diff --git a/node_modules/saslprep/generate-code-points.js b/node_modules/saslprep/generate-code-points.js new file mode 100644 index 000000000..c5162ca7a --- /dev/null +++ b/node_modules/saslprep/generate-code-points.js @@ -0,0 +1,51 @@ +'use strict'; + +const bitfield = require('sparse-bitfield'); +const codePoints = require('./lib/code-points'); + +const unassigned_code_points = bitfield(); +const commonly_mapped_to_nothing = bitfield(); +const non_ascii_space_characters = bitfield(); +const prohibited_characters = bitfield(); +const bidirectional_r_al = bitfield(); +const bidirectional_l = bitfield(); + +/** + * Iterare over code points and + * convert it into an buffer. 
+ * @param {bitfield} bits + * @param {Array} src + * @returns {Buffer} + */ +function traverse(bits, src) { + for (const code of src.keys()) { + bits.set(code, true); + } + + const buffer = bits.toBuffer(); + return Buffer.concat([createSize(buffer), buffer]); +} + +/** + * @param {Buffer} buffer + * @returns {Buffer} + */ +function createSize(buffer) { + const buf = Buffer.alloc(4); + buf.writeUInt32BE(buffer.length); + + return buf; +} + +const memory = []; + +memory.push( + traverse(unassigned_code_points, codePoints.unassigned_code_points), + traverse(commonly_mapped_to_nothing, codePoints.commonly_mapped_to_nothing), + traverse(non_ascii_space_characters, codePoints.non_ASCII_space_characters), + traverse(prohibited_characters, codePoints.prohibited_characters), + traverse(bidirectional_r_al, codePoints.bidirectional_r_al), + traverse(bidirectional_l, codePoints.bidirectional_l) +); + +process.stdout.write(Buffer.concat(memory)); diff --git a/node_modules/saslprep/index.js b/node_modules/saslprep/index.js new file mode 100644 index 000000000..21bb0fed3 --- /dev/null +++ b/node_modules/saslprep/index.js @@ -0,0 +1,157 @@ +'use strict'; + +const { + unassigned_code_points, + commonly_mapped_to_nothing, + non_ASCII_space_characters, + prohibited_characters, + bidirectional_r_al, + bidirectional_l, +} = require('./lib/memory-code-points'); + +module.exports = saslprep; + +// 2.1. Mapping + +/** + * non-ASCII space characters [StringPrep, C.1.2] that can be + * mapped to SPACE (U+0020) + */ +const mapping2space = non_ASCII_space_characters; + +/** + * the "commonly mapped to nothing" characters [StringPrep, B.1] + * that can be mapped to nothing. + */ +const mapping2nothing = commonly_mapped_to_nothing; + +// utils +const getCodePoint = character => character.codePointAt(0); +const first = x => x[0]; +const last = x => x[x.length - 1]; + +/** + * Convert provided string into an array of Unicode Code Points. + * Based on https://stackoverflow.com/a/21409165/1556249 + * and https://www.npmjs.com/package/code-point-at. + * @param {string} input + * @returns {number[]} + */ +function toCodePoints(input) { + const codepoints = []; + const size = input.length; + + for (let i = 0; i < size; i += 1) { + const before = input.charCodeAt(i); + + if (before >= 0xd800 && before <= 0xdbff && size > i + 1) { + const next = input.charCodeAt(i + 1); + + if (next >= 0xdc00 && next <= 0xdfff) { + codepoints.push((before - 0xd800) * 0x400 + next - 0xdc00 + 0x10000); + i += 1; + continue; + } + } + + codepoints.push(before); + } + + return codepoints; +} + +/** + * SASLprep. + * @param {string} input + * @param {Object} opts + * @param {boolean} opts.allowUnassigned + * @returns {string} + */ +function saslprep(input, opts = {}) { + if (typeof input !== 'string') { + throw new TypeError('Expected string.'); + } + + if (input.length === 0) { + return ''; + } + + // 1. Map + const mapped_input = toCodePoints(input) + // 1.1 mapping to space + .map(character => (mapping2space.get(character) ? 0x20 : character)) + // 1.2 mapping to nothing + .filter(character => !mapping2nothing.get(character)); + + // 2. Normalize + const normalized_input = String.fromCodePoint + .apply(null, mapped_input) + .normalize('NFKC'); + + const normalized_map = toCodePoints(normalized_input); + + // 3. 
Prohibit + const hasProhibited = normalized_map.some(character => + prohibited_characters.get(character) + ); + + if (hasProhibited) { + throw new Error( + 'Prohibited character, see https://tools.ietf.org/html/rfc4013#section-2.3' + ); + } + + // Unassigned Code Points + if (opts.allowUnassigned !== true) { + const hasUnassigned = normalized_map.some(character => + unassigned_code_points.get(character) + ); + + if (hasUnassigned) { + throw new Error( + 'Unassigned code point, see https://tools.ietf.org/html/rfc4013#section-2.5' + ); + } + } + + // 4. check bidi + + const hasBidiRAL = normalized_map.some(character => + bidirectional_r_al.get(character) + ); + + const hasBidiL = normalized_map.some(character => + bidirectional_l.get(character) + ); + + // 4.1 If a string contains any RandALCat character, the string MUST NOT + // contain any LCat character. + if (hasBidiRAL && hasBidiL) { + throw new Error( + 'String must not contain RandALCat and LCat at the same time,' + + ' see https://tools.ietf.org/html/rfc3454#section-6' + ); + } + + /** + * 4.2 If a string contains any RandALCat character, a RandALCat + * character MUST be the first character of the string, and a + * RandALCat character MUST be the last character of the string. + */ + + const isFirstBidiRAL = bidirectional_r_al.get( + getCodePoint(first(normalized_input)) + ); + const isLastBidiRAL = bidirectional_r_al.get( + getCodePoint(last(normalized_input)) + ); + + if (hasBidiRAL && !(isFirstBidiRAL && isLastBidiRAL)) { + throw new Error( + 'Bidirectional RandALCat character must be the first and the last' + + ' character of the string, see https://tools.ietf.org/html/rfc3454#section-6' + ); + } + + return normalized_input; +} diff --git a/node_modules/saslprep/lib/code-points.js b/node_modules/saslprep/lib/code-points.js new file mode 100644 index 000000000..222182c83 --- /dev/null +++ b/node_modules/saslprep/lib/code-points.js @@ -0,0 +1,996 @@ +'use strict'; + +const { range } = require('./util'); + +/** + * A.1 Unassigned code points in Unicode 3.2 + * @link https://tools.ietf.org/html/rfc3454#appendix-A.1 + */ +const unassigned_code_points = new Set([ + 0x0221, + ...range(0x0234, 0x024f), + ...range(0x02ae, 0x02af), + ...range(0x02ef, 0x02ff), + ...range(0x0350, 0x035f), + ...range(0x0370, 0x0373), + ...range(0x0376, 0x0379), + ...range(0x037b, 0x037d), + ...range(0x037f, 0x0383), + 0x038b, + 0x038d, + 0x03a2, + 0x03cf, + ...range(0x03f7, 0x03ff), + 0x0487, + 0x04cf, + ...range(0x04f6, 0x04f7), + ...range(0x04fa, 0x04ff), + ...range(0x0510, 0x0530), + ...range(0x0557, 0x0558), + 0x0560, + 0x0588, + ...range(0x058b, 0x0590), + 0x05a2, + 0x05ba, + ...range(0x05c5, 0x05cf), + ...range(0x05eb, 0x05ef), + ...range(0x05f5, 0x060b), + ...range(0x060d, 0x061a), + ...range(0x061c, 0x061e), + 0x0620, + ...range(0x063b, 0x063f), + ...range(0x0656, 0x065f), + ...range(0x06ee, 0x06ef), + 0x06ff, + 0x070e, + ...range(0x072d, 0x072f), + ...range(0x074b, 0x077f), + ...range(0x07b2, 0x0900), + 0x0904, + ...range(0x093a, 0x093b), + ...range(0x094e, 0x094f), + ...range(0x0955, 0x0957), + ...range(0x0971, 0x0980), + 0x0984, + ...range(0x098d, 0x098e), + ...range(0x0991, 0x0992), + 0x09a9, + 0x09b1, + ...range(0x09b3, 0x09b5), + ...range(0x09ba, 0x09bb), + 0x09bd, + ...range(0x09c5, 0x09c6), + ...range(0x09c9, 0x09ca), + ...range(0x09ce, 0x09d6), + ...range(0x09d8, 0x09db), + 0x09de, + ...range(0x09e4, 0x09e5), + ...range(0x09fb, 0x0a01), + ...range(0x0a03, 0x0a04), + ...range(0x0a0b, 0x0a0e), + ...range(0x0a11, 0x0a12), + 0x0a29, + 0x0a31, + 
0x0a34, + 0x0a37, + ...range(0x0a3a, 0x0a3b), + 0x0a3d, + ...range(0x0a43, 0x0a46), + ...range(0x0a49, 0x0a4a), + ...range(0x0a4e, 0x0a58), + 0x0a5d, + ...range(0x0a5f, 0x0a65), + ...range(0x0a75, 0x0a80), + 0x0a84, + 0x0a8c, + 0x0a8e, + 0x0a92, + 0x0aa9, + 0x0ab1, + 0x0ab4, + ...range(0x0aba, 0x0abb), + 0x0ac6, + 0x0aca, + ...range(0x0ace, 0x0acf), + ...range(0x0ad1, 0x0adf), + ...range(0x0ae1, 0x0ae5), + ...range(0x0af0, 0x0b00), + 0x0b04, + ...range(0x0b0d, 0x0b0e), + ...range(0x0b11, 0x0b12), + 0x0b29, + 0x0b31, + ...range(0x0b34, 0x0b35), + ...range(0x0b3a, 0x0b3b), + ...range(0x0b44, 0x0b46), + ...range(0x0b49, 0x0b4a), + ...range(0x0b4e, 0x0b55), + ...range(0x0b58, 0x0b5b), + 0x0b5e, + ...range(0x0b62, 0x0b65), + ...range(0x0b71, 0x0b81), + 0x0b84, + ...range(0x0b8b, 0x0b8d), + 0x0b91, + ...range(0x0b96, 0x0b98), + 0x0b9b, + 0x0b9d, + ...range(0x0ba0, 0x0ba2), + ...range(0x0ba5, 0x0ba7), + ...range(0x0bab, 0x0bad), + 0x0bb6, + ...range(0x0bba, 0x0bbd), + ...range(0x0bc3, 0x0bc5), + 0x0bc9, + ...range(0x0bce, 0x0bd6), + ...range(0x0bd8, 0x0be6), + ...range(0x0bf3, 0x0c00), + 0x0c04, + 0x0c0d, + 0x0c11, + 0x0c29, + 0x0c34, + ...range(0x0c3a, 0x0c3d), + 0x0c45, + 0x0c49, + ...range(0x0c4e, 0x0c54), + ...range(0x0c57, 0x0c5f), + ...range(0x0c62, 0x0c65), + ...range(0x0c70, 0x0c81), + 0x0c84, + 0x0c8d, + 0x0c91, + 0x0ca9, + 0x0cb4, + ...range(0x0cba, 0x0cbd), + 0x0cc5, + 0x0cc9, + ...range(0x0cce, 0x0cd4), + ...range(0x0cd7, 0x0cdd), + 0x0cdf, + ...range(0x0ce2, 0x0ce5), + ...range(0x0cf0, 0x0d01), + 0x0d04, + 0x0d0d, + 0x0d11, + 0x0d29, + ...range(0x0d3a, 0x0d3d), + ...range(0x0d44, 0x0d45), + 0x0d49, + ...range(0x0d4e, 0x0d56), + ...range(0x0d58, 0x0d5f), + ...range(0x0d62, 0x0d65), + ...range(0x0d70, 0x0d81), + 0x0d84, + ...range(0x0d97, 0x0d99), + 0x0db2, + 0x0dbc, + ...range(0x0dbe, 0x0dbf), + ...range(0x0dc7, 0x0dc9), + ...range(0x0dcb, 0x0dce), + 0x0dd5, + 0x0dd7, + ...range(0x0de0, 0x0df1), + ...range(0x0df5, 0x0e00), + ...range(0x0e3b, 0x0e3e), + ...range(0x0e5c, 0x0e80), + 0x0e83, + ...range(0x0e85, 0x0e86), + 0x0e89, + ...range(0x0e8b, 0x0e8c), + ...range(0x0e8e, 0x0e93), + 0x0e98, + 0x0ea0, + 0x0ea4, + 0x0ea6, + ...range(0x0ea8, 0x0ea9), + 0x0eac, + 0x0eba, + ...range(0x0ebe, 0x0ebf), + 0x0ec5, + 0x0ec7, + ...range(0x0ece, 0x0ecf), + ...range(0x0eda, 0x0edb), + ...range(0x0ede, 0x0eff), + 0x0f48, + ...range(0x0f6b, 0x0f70), + ...range(0x0f8c, 0x0f8f), + 0x0f98, + 0x0fbd, + ...range(0x0fcd, 0x0fce), + ...range(0x0fd0, 0x0fff), + 0x1022, + 0x1028, + 0x102b, + ...range(0x1033, 0x1035), + ...range(0x103a, 0x103f), + ...range(0x105a, 0x109f), + ...range(0x10c6, 0x10cf), + ...range(0x10f9, 0x10fa), + ...range(0x10fc, 0x10ff), + ...range(0x115a, 0x115e), + ...range(0x11a3, 0x11a7), + ...range(0x11fa, 0x11ff), + 0x1207, + 0x1247, + 0x1249, + ...range(0x124e, 0x124f), + 0x1257, + 0x1259, + ...range(0x125e, 0x125f), + 0x1287, + 0x1289, + ...range(0x128e, 0x128f), + 0x12af, + 0x12b1, + ...range(0x12b6, 0x12b7), + 0x12bf, + 0x12c1, + ...range(0x12c6, 0x12c7), + 0x12cf, + 0x12d7, + 0x12ef, + 0x130f, + 0x1311, + ...range(0x1316, 0x1317), + 0x131f, + 0x1347, + ...range(0x135b, 0x1360), + ...range(0x137d, 0x139f), + ...range(0x13f5, 0x1400), + ...range(0x1677, 0x167f), + ...range(0x169d, 0x169f), + ...range(0x16f1, 0x16ff), + 0x170d, + ...range(0x1715, 0x171f), + ...range(0x1737, 0x173f), + ...range(0x1754, 0x175f), + 0x176d, + 0x1771, + ...range(0x1774, 0x177f), + ...range(0x17dd, 0x17df), + ...range(0x17ea, 0x17ff), + 0x180f, + ...range(0x181a, 0x181f), + ...range(0x1878, 0x187f), + 
...range(0x18aa, 0x1dff), + ...range(0x1e9c, 0x1e9f), + ...range(0x1efa, 0x1eff), + ...range(0x1f16, 0x1f17), + ...range(0x1f1e, 0x1f1f), + ...range(0x1f46, 0x1f47), + ...range(0x1f4e, 0x1f4f), + 0x1f58, + 0x1f5a, + 0x1f5c, + 0x1f5e, + ...range(0x1f7e, 0x1f7f), + 0x1fb5, + 0x1fc5, + ...range(0x1fd4, 0x1fd5), + 0x1fdc, + ...range(0x1ff0, 0x1ff1), + 0x1ff5, + 0x1fff, + ...range(0x2053, 0x2056), + ...range(0x2058, 0x205e), + ...range(0x2064, 0x2069), + ...range(0x2072, 0x2073), + ...range(0x208f, 0x209f), + ...range(0x20b2, 0x20cf), + ...range(0x20eb, 0x20ff), + ...range(0x213b, 0x213c), + ...range(0x214c, 0x2152), + ...range(0x2184, 0x218f), + ...range(0x23cf, 0x23ff), + ...range(0x2427, 0x243f), + ...range(0x244b, 0x245f), + 0x24ff, + ...range(0x2614, 0x2615), + 0x2618, + ...range(0x267e, 0x267f), + ...range(0x268a, 0x2700), + 0x2705, + ...range(0x270a, 0x270b), + 0x2728, + 0x274c, + 0x274e, + ...range(0x2753, 0x2755), + 0x2757, + ...range(0x275f, 0x2760), + ...range(0x2795, 0x2797), + 0x27b0, + ...range(0x27bf, 0x27cf), + ...range(0x27ec, 0x27ef), + ...range(0x2b00, 0x2e7f), + 0x2e9a, + ...range(0x2ef4, 0x2eff), + ...range(0x2fd6, 0x2fef), + ...range(0x2ffc, 0x2fff), + 0x3040, + ...range(0x3097, 0x3098), + ...range(0x3100, 0x3104), + ...range(0x312d, 0x3130), + 0x318f, + ...range(0x31b8, 0x31ef), + ...range(0x321d, 0x321f), + ...range(0x3244, 0x3250), + ...range(0x327c, 0x327e), + ...range(0x32cc, 0x32cf), + 0x32ff, + ...range(0x3377, 0x337a), + ...range(0x33de, 0x33df), + 0x33ff, + ...range(0x4db6, 0x4dff), + ...range(0x9fa6, 0x9fff), + ...range(0xa48d, 0xa48f), + ...range(0xa4c7, 0xabff), + ...range(0xd7a4, 0xd7ff), + ...range(0xfa2e, 0xfa2f), + ...range(0xfa6b, 0xfaff), + ...range(0xfb07, 0xfb12), + ...range(0xfb18, 0xfb1c), + 0xfb37, + 0xfb3d, + 0xfb3f, + 0xfb42, + 0xfb45, + ...range(0xfbb2, 0xfbd2), + ...range(0xfd40, 0xfd4f), + ...range(0xfd90, 0xfd91), + ...range(0xfdc8, 0xfdcf), + ...range(0xfdfd, 0xfdff), + ...range(0xfe10, 0xfe1f), + ...range(0xfe24, 0xfe2f), + ...range(0xfe47, 0xfe48), + 0xfe53, + 0xfe67, + ...range(0xfe6c, 0xfe6f), + 0xfe75, + ...range(0xfefd, 0xfefe), + 0xff00, + ...range(0xffbf, 0xffc1), + ...range(0xffc8, 0xffc9), + ...range(0xffd0, 0xffd1), + ...range(0xffd8, 0xffd9), + ...range(0xffdd, 0xffdf), + 0xffe7, + ...range(0xffef, 0xfff8), + ...range(0x10000, 0x102ff), + 0x1031f, + ...range(0x10324, 0x1032f), + ...range(0x1034b, 0x103ff), + ...range(0x10426, 0x10427), + ...range(0x1044e, 0x1cfff), + ...range(0x1d0f6, 0x1d0ff), + ...range(0x1d127, 0x1d129), + ...range(0x1d1de, 0x1d3ff), + 0x1d455, + 0x1d49d, + ...range(0x1d4a0, 0x1d4a1), + ...range(0x1d4a3, 0x1d4a4), + ...range(0x1d4a7, 0x1d4a8), + 0x1d4ad, + 0x1d4ba, + 0x1d4bc, + 0x1d4c1, + 0x1d4c4, + 0x1d506, + ...range(0x1d50b, 0x1d50c), + 0x1d515, + 0x1d51d, + 0x1d53a, + 0x1d53f, + 0x1d545, + ...range(0x1d547, 0x1d549), + 0x1d551, + ...range(0x1d6a4, 0x1d6a7), + ...range(0x1d7ca, 0x1d7cd), + ...range(0x1d800, 0x1fffd), + ...range(0x2a6d7, 0x2f7ff), + ...range(0x2fa1e, 0x2fffd), + ...range(0x30000, 0x3fffd), + ...range(0x40000, 0x4fffd), + ...range(0x50000, 0x5fffd), + ...range(0x60000, 0x6fffd), + ...range(0x70000, 0x7fffd), + ...range(0x80000, 0x8fffd), + ...range(0x90000, 0x9fffd), + ...range(0xa0000, 0xafffd), + ...range(0xb0000, 0xbfffd), + ...range(0xc0000, 0xcfffd), + ...range(0xd0000, 0xdfffd), + 0xe0000, + ...range(0xe0002, 0xe001f), + ...range(0xe0080, 0xefffd), +]); + +/** + * B.1 Commonly mapped to nothing + * @link https://tools.ietf.org/html/rfc3454#appendix-B.1 + */ +const 
commonly_mapped_to_nothing = new Set([ + 0x00ad, + 0x034f, + 0x1806, + 0x180b, + 0x180c, + 0x180d, + 0x200b, + 0x200c, + 0x200d, + 0x2060, + 0xfe00, + 0xfe01, + 0xfe02, + 0xfe03, + 0xfe04, + 0xfe05, + 0xfe06, + 0xfe07, + 0xfe08, + 0xfe09, + 0xfe0a, + 0xfe0b, + 0xfe0c, + 0xfe0d, + 0xfe0e, + 0xfe0f, + 0xfeff, +]); + +/** + * C.1.2 Non-ASCII space characters + * @link https://tools.ietf.org/html/rfc3454#appendix-C.1.2 + */ +const non_ASCII_space_characters = new Set([ + 0x00a0 /* NO-BREAK SPACE */, + 0x1680 /* OGHAM SPACE MARK */, + 0x2000 /* EN QUAD */, + 0x2001 /* EM QUAD */, + 0x2002 /* EN SPACE */, + 0x2003 /* EM SPACE */, + 0x2004 /* THREE-PER-EM SPACE */, + 0x2005 /* FOUR-PER-EM SPACE */, + 0x2006 /* SIX-PER-EM SPACE */, + 0x2007 /* FIGURE SPACE */, + 0x2008 /* PUNCTUATION SPACE */, + 0x2009 /* THIN SPACE */, + 0x200a /* HAIR SPACE */, + 0x200b /* ZERO WIDTH SPACE */, + 0x202f /* NARROW NO-BREAK SPACE */, + 0x205f /* MEDIUM MATHEMATICAL SPACE */, + 0x3000 /* IDEOGRAPHIC SPACE */, +]); + +/** + * 2.3. Prohibited Output + * @type {Set} + */ +const prohibited_characters = new Set([ + ...non_ASCII_space_characters, + + /** + * C.2.1 ASCII control characters + * @link https://tools.ietf.org/html/rfc3454#appendix-C.2.1 + */ + ...range(0, 0x001f) /* [CONTROL CHARACTERS] */, + 0x007f /* DELETE */, + + /** + * C.2.2 Non-ASCII control characters + * @link https://tools.ietf.org/html/rfc3454#appendix-C.2.2 + */ + ...range(0x0080, 0x009f) /* [CONTROL CHARACTERS] */, + 0x06dd /* ARABIC END OF AYAH */, + 0x070f /* SYRIAC ABBREVIATION MARK */, + 0x180e /* MONGOLIAN VOWEL SEPARATOR */, + 0x200c /* ZERO WIDTH NON-JOINER */, + 0x200d /* ZERO WIDTH JOINER */, + 0x2028 /* LINE SEPARATOR */, + 0x2029 /* PARAGRAPH SEPARATOR */, + 0x2060 /* WORD JOINER */, + 0x2061 /* FUNCTION APPLICATION */, + 0x2062 /* INVISIBLE TIMES */, + 0x2063 /* INVISIBLE SEPARATOR */, + ...range(0x206a, 0x206f) /* [CONTROL CHARACTERS] */, + 0xfeff /* ZERO WIDTH NO-BREAK SPACE */, + ...range(0xfff9, 0xfffc) /* [CONTROL CHARACTERS] */, + ...range(0x1d173, 0x1d17a) /* [MUSICAL CONTROL CHARACTERS] */, + + /** + * C.3 Private use + * @link https://tools.ietf.org/html/rfc3454#appendix-C.3 + */ + ...range(0xe000, 0xf8ff) /* [PRIVATE USE, PLANE 0] */, + ...range(0xf0000, 0xffffd) /* [PRIVATE USE, PLANE 15] */, + ...range(0x100000, 0x10fffd) /* [PRIVATE USE, PLANE 16] */, + + /** + * C.4 Non-character code points + * @link https://tools.ietf.org/html/rfc3454#appendix-C.4 + */ + ...range(0xfdd0, 0xfdef) /* [NONCHARACTER CODE POINTS] */, + ...range(0xfffe, 0xffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x1fffe, 0x1ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x2fffe, 0x2ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x3fffe, 0x3ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x4fffe, 0x4ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x5fffe, 0x5ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x6fffe, 0x6ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x7fffe, 0x7ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x8fffe, 0x8ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x9fffe, 0x9ffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0xafffe, 0xaffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0xbfffe, 0xbffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0xcfffe, 0xcffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0xdfffe, 0xdffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0xefffe, 0xeffff) /* [NONCHARACTER CODE POINTS] */, + ...range(0x10fffe, 0x10ffff) /* [NONCHARACTER CODE POINTS] */, + + /** + * C.5 
Surrogate codes + * @link https://tools.ietf.org/html/rfc3454#appendix-C.5 + */ + ...range(0xd800, 0xdfff), + + /** + * C.6 Inappropriate for plain text + * @link https://tools.ietf.org/html/rfc3454#appendix-C.6 + */ + 0xfff9 /* INTERLINEAR ANNOTATION ANCHOR */, + 0xfffa /* INTERLINEAR ANNOTATION SEPARATOR */, + 0xfffb /* INTERLINEAR ANNOTATION TERMINATOR */, + 0xfffc /* OBJECT REPLACEMENT CHARACTER */, + 0xfffd /* REPLACEMENT CHARACTER */, + + /** + * C.7 Inappropriate for canonical representation + * @link https://tools.ietf.org/html/rfc3454#appendix-C.7 + */ + ...range(0x2ff0, 0x2ffb) /* [IDEOGRAPHIC DESCRIPTION CHARACTERS] */, + + /** + * C.8 Change display properties or are deprecated + * @link https://tools.ietf.org/html/rfc3454#appendix-C.8 + */ + 0x0340 /* COMBINING GRAVE TONE MARK */, + 0x0341 /* COMBINING ACUTE TONE MARK */, + 0x200e /* LEFT-TO-RIGHT MARK */, + 0x200f /* RIGHT-TO-LEFT MARK */, + 0x202a /* LEFT-TO-RIGHT EMBEDDING */, + 0x202b /* RIGHT-TO-LEFT EMBEDDING */, + 0x202c /* POP DIRECTIONAL FORMATTING */, + 0x202d /* LEFT-TO-RIGHT OVERRIDE */, + 0x202e /* RIGHT-TO-LEFT OVERRIDE */, + 0x206a /* INHIBIT SYMMETRIC SWAPPING */, + 0x206b /* ACTIVATE SYMMETRIC SWAPPING */, + 0x206c /* INHIBIT ARABIC FORM SHAPING */, + 0x206d /* ACTIVATE ARABIC FORM SHAPING */, + 0x206e /* NATIONAL DIGIT SHAPES */, + 0x206f /* NOMINAL DIGIT SHAPES */, + + /** + * C.9 Tagging characters + * @link https://tools.ietf.org/html/rfc3454#appendix-C.9 + */ + 0xe0001 /* LANGUAGE TAG */, + ...range(0xe0020, 0xe007f) /* [TAGGING CHARACTERS] */, +]); + +/** + * D.1 Characters with bidirectional property "R" or "AL" + * @link https://tools.ietf.org/html/rfc3454#appendix-D.1 + */ +const bidirectional_r_al = new Set([ + 0x05be, + 0x05c0, + 0x05c3, + ...range(0x05d0, 0x05ea), + ...range(0x05f0, 0x05f4), + 0x061b, + 0x061f, + ...range(0x0621, 0x063a), + ...range(0x0640, 0x064a), + ...range(0x066d, 0x066f), + ...range(0x0671, 0x06d5), + 0x06dd, + ...range(0x06e5, 0x06e6), + ...range(0x06fa, 0x06fe), + ...range(0x0700, 0x070d), + 0x0710, + ...range(0x0712, 0x072c), + ...range(0x0780, 0x07a5), + 0x07b1, + 0x200f, + 0xfb1d, + ...range(0xfb1f, 0xfb28), + ...range(0xfb2a, 0xfb36), + ...range(0xfb38, 0xfb3c), + 0xfb3e, + ...range(0xfb40, 0xfb41), + ...range(0xfb43, 0xfb44), + ...range(0xfb46, 0xfbb1), + ...range(0xfbd3, 0xfd3d), + ...range(0xfd50, 0xfd8f), + ...range(0xfd92, 0xfdc7), + ...range(0xfdf0, 0xfdfc), + ...range(0xfe70, 0xfe74), + ...range(0xfe76, 0xfefc), +]); + +/** + * D.2 Characters with bidirectional property "L" + * @link https://tools.ietf.org/html/rfc3454#appendix-D.2 + */ +const bidirectional_l = new Set([ + ...range(0x0041, 0x005a), + ...range(0x0061, 0x007a), + 0x00aa, + 0x00b5, + 0x00ba, + ...range(0x00c0, 0x00d6), + ...range(0x00d8, 0x00f6), + ...range(0x00f8, 0x0220), + ...range(0x0222, 0x0233), + ...range(0x0250, 0x02ad), + ...range(0x02b0, 0x02b8), + ...range(0x02bb, 0x02c1), + ...range(0x02d0, 0x02d1), + ...range(0x02e0, 0x02e4), + 0x02ee, + 0x037a, + 0x0386, + ...range(0x0388, 0x038a), + 0x038c, + ...range(0x038e, 0x03a1), + ...range(0x03a3, 0x03ce), + ...range(0x03d0, 0x03f5), + ...range(0x0400, 0x0482), + ...range(0x048a, 0x04ce), + ...range(0x04d0, 0x04f5), + ...range(0x04f8, 0x04f9), + ...range(0x0500, 0x050f), + ...range(0x0531, 0x0556), + ...range(0x0559, 0x055f), + ...range(0x0561, 0x0587), + 0x0589, + 0x0903, + ...range(0x0905, 0x0939), + ...range(0x093d, 0x0940), + ...range(0x0949, 0x094c), + 0x0950, + ...range(0x0958, 0x0961), + ...range(0x0964, 0x0970), + ...range(0x0982, 
0x0983), + ...range(0x0985, 0x098c), + ...range(0x098f, 0x0990), + ...range(0x0993, 0x09a8), + ...range(0x09aa, 0x09b0), + 0x09b2, + ...range(0x09b6, 0x09b9), + ...range(0x09be, 0x09c0), + ...range(0x09c7, 0x09c8), + ...range(0x09cb, 0x09cc), + 0x09d7, + ...range(0x09dc, 0x09dd), + ...range(0x09df, 0x09e1), + ...range(0x09e6, 0x09f1), + ...range(0x09f4, 0x09fa), + ...range(0x0a05, 0x0a0a), + ...range(0x0a0f, 0x0a10), + ...range(0x0a13, 0x0a28), + ...range(0x0a2a, 0x0a30), + ...range(0x0a32, 0x0a33), + ...range(0x0a35, 0x0a36), + ...range(0x0a38, 0x0a39), + ...range(0x0a3e, 0x0a40), + ...range(0x0a59, 0x0a5c), + 0x0a5e, + ...range(0x0a66, 0x0a6f), + ...range(0x0a72, 0x0a74), + 0x0a83, + ...range(0x0a85, 0x0a8b), + 0x0a8d, + ...range(0x0a8f, 0x0a91), + ...range(0x0a93, 0x0aa8), + ...range(0x0aaa, 0x0ab0), + ...range(0x0ab2, 0x0ab3), + ...range(0x0ab5, 0x0ab9), + ...range(0x0abd, 0x0ac0), + 0x0ac9, + ...range(0x0acb, 0x0acc), + 0x0ad0, + 0x0ae0, + ...range(0x0ae6, 0x0aef), + ...range(0x0b02, 0x0b03), + ...range(0x0b05, 0x0b0c), + ...range(0x0b0f, 0x0b10), + ...range(0x0b13, 0x0b28), + ...range(0x0b2a, 0x0b30), + ...range(0x0b32, 0x0b33), + ...range(0x0b36, 0x0b39), + ...range(0x0b3d, 0x0b3e), + 0x0b40, + ...range(0x0b47, 0x0b48), + ...range(0x0b4b, 0x0b4c), + 0x0b57, + ...range(0x0b5c, 0x0b5d), + ...range(0x0b5f, 0x0b61), + ...range(0x0b66, 0x0b70), + 0x0b83, + ...range(0x0b85, 0x0b8a), + ...range(0x0b8e, 0x0b90), + ...range(0x0b92, 0x0b95), + ...range(0x0b99, 0x0b9a), + 0x0b9c, + ...range(0x0b9e, 0x0b9f), + ...range(0x0ba3, 0x0ba4), + ...range(0x0ba8, 0x0baa), + ...range(0x0bae, 0x0bb5), + ...range(0x0bb7, 0x0bb9), + ...range(0x0bbe, 0x0bbf), + ...range(0x0bc1, 0x0bc2), + ...range(0x0bc6, 0x0bc8), + ...range(0x0bca, 0x0bcc), + 0x0bd7, + ...range(0x0be7, 0x0bf2), + ...range(0x0c01, 0x0c03), + ...range(0x0c05, 0x0c0c), + ...range(0x0c0e, 0x0c10), + ...range(0x0c12, 0x0c28), + ...range(0x0c2a, 0x0c33), + ...range(0x0c35, 0x0c39), + ...range(0x0c41, 0x0c44), + ...range(0x0c60, 0x0c61), + ...range(0x0c66, 0x0c6f), + ...range(0x0c82, 0x0c83), + ...range(0x0c85, 0x0c8c), + ...range(0x0c8e, 0x0c90), + ...range(0x0c92, 0x0ca8), + ...range(0x0caa, 0x0cb3), + ...range(0x0cb5, 0x0cb9), + 0x0cbe, + ...range(0x0cc0, 0x0cc4), + ...range(0x0cc7, 0x0cc8), + ...range(0x0cca, 0x0ccb), + ...range(0x0cd5, 0x0cd6), + 0x0cde, + ...range(0x0ce0, 0x0ce1), + ...range(0x0ce6, 0x0cef), + ...range(0x0d02, 0x0d03), + ...range(0x0d05, 0x0d0c), + ...range(0x0d0e, 0x0d10), + ...range(0x0d12, 0x0d28), + ...range(0x0d2a, 0x0d39), + ...range(0x0d3e, 0x0d40), + ...range(0x0d46, 0x0d48), + ...range(0x0d4a, 0x0d4c), + 0x0d57, + ...range(0x0d60, 0x0d61), + ...range(0x0d66, 0x0d6f), + ...range(0x0d82, 0x0d83), + ...range(0x0d85, 0x0d96), + ...range(0x0d9a, 0x0db1), + ...range(0x0db3, 0x0dbb), + 0x0dbd, + ...range(0x0dc0, 0x0dc6), + ...range(0x0dcf, 0x0dd1), + ...range(0x0dd8, 0x0ddf), + ...range(0x0df2, 0x0df4), + ...range(0x0e01, 0x0e30), + ...range(0x0e32, 0x0e33), + ...range(0x0e40, 0x0e46), + ...range(0x0e4f, 0x0e5b), + ...range(0x0e81, 0x0e82), + 0x0e84, + ...range(0x0e87, 0x0e88), + 0x0e8a, + 0x0e8d, + ...range(0x0e94, 0x0e97), + ...range(0x0e99, 0x0e9f), + ...range(0x0ea1, 0x0ea3), + 0x0ea5, + 0x0ea7, + ...range(0x0eaa, 0x0eab), + ...range(0x0ead, 0x0eb0), + ...range(0x0eb2, 0x0eb3), + 0x0ebd, + ...range(0x0ec0, 0x0ec4), + 0x0ec6, + ...range(0x0ed0, 0x0ed9), + ...range(0x0edc, 0x0edd), + ...range(0x0f00, 0x0f17), + ...range(0x0f1a, 0x0f34), + 0x0f36, + 0x0f38, + ...range(0x0f3e, 0x0f47), + ...range(0x0f49, 0x0f6a), + 0x0f7f, 
+ 0x0f85, + ...range(0x0f88, 0x0f8b), + ...range(0x0fbe, 0x0fc5), + ...range(0x0fc7, 0x0fcc), + 0x0fcf, + ...range(0x1000, 0x1021), + ...range(0x1023, 0x1027), + ...range(0x1029, 0x102a), + 0x102c, + 0x1031, + 0x1038, + ...range(0x1040, 0x1057), + ...range(0x10a0, 0x10c5), + ...range(0x10d0, 0x10f8), + 0x10fb, + ...range(0x1100, 0x1159), + ...range(0x115f, 0x11a2), + ...range(0x11a8, 0x11f9), + ...range(0x1200, 0x1206), + ...range(0x1208, 0x1246), + 0x1248, + ...range(0x124a, 0x124d), + ...range(0x1250, 0x1256), + 0x1258, + ...range(0x125a, 0x125d), + ...range(0x1260, 0x1286), + 0x1288, + ...range(0x128a, 0x128d), + ...range(0x1290, 0x12ae), + 0x12b0, + ...range(0x12b2, 0x12b5), + ...range(0x12b8, 0x12be), + 0x12c0, + ...range(0x12c2, 0x12c5), + ...range(0x12c8, 0x12ce), + ...range(0x12d0, 0x12d6), + ...range(0x12d8, 0x12ee), + ...range(0x12f0, 0x130e), + 0x1310, + ...range(0x1312, 0x1315), + ...range(0x1318, 0x131e), + ...range(0x1320, 0x1346), + ...range(0x1348, 0x135a), + ...range(0x1361, 0x137c), + ...range(0x13a0, 0x13f4), + ...range(0x1401, 0x1676), + ...range(0x1681, 0x169a), + ...range(0x16a0, 0x16f0), + ...range(0x1700, 0x170c), + ...range(0x170e, 0x1711), + ...range(0x1720, 0x1731), + ...range(0x1735, 0x1736), + ...range(0x1740, 0x1751), + ...range(0x1760, 0x176c), + ...range(0x176e, 0x1770), + ...range(0x1780, 0x17b6), + ...range(0x17be, 0x17c5), + ...range(0x17c7, 0x17c8), + ...range(0x17d4, 0x17da), + 0x17dc, + ...range(0x17e0, 0x17e9), + ...range(0x1810, 0x1819), + ...range(0x1820, 0x1877), + ...range(0x1880, 0x18a8), + ...range(0x1e00, 0x1e9b), + ...range(0x1ea0, 0x1ef9), + ...range(0x1f00, 0x1f15), + ...range(0x1f18, 0x1f1d), + ...range(0x1f20, 0x1f45), + ...range(0x1f48, 0x1f4d), + ...range(0x1f50, 0x1f57), + 0x1f59, + 0x1f5b, + 0x1f5d, + ...range(0x1f5f, 0x1f7d), + ...range(0x1f80, 0x1fb4), + ...range(0x1fb6, 0x1fbc), + 0x1fbe, + ...range(0x1fc2, 0x1fc4), + ...range(0x1fc6, 0x1fcc), + ...range(0x1fd0, 0x1fd3), + ...range(0x1fd6, 0x1fdb), + ...range(0x1fe0, 0x1fec), + ...range(0x1ff2, 0x1ff4), + ...range(0x1ff6, 0x1ffc), + 0x200e, + 0x2071, + 0x207f, + 0x2102, + 0x2107, + ...range(0x210a, 0x2113), + 0x2115, + ...range(0x2119, 0x211d), + 0x2124, + 0x2126, + 0x2128, + ...range(0x212a, 0x212d), + ...range(0x212f, 0x2131), + ...range(0x2133, 0x2139), + ...range(0x213d, 0x213f), + ...range(0x2145, 0x2149), + ...range(0x2160, 0x2183), + ...range(0x2336, 0x237a), + 0x2395, + ...range(0x249c, 0x24e9), + ...range(0x3005, 0x3007), + ...range(0x3021, 0x3029), + ...range(0x3031, 0x3035), + ...range(0x3038, 0x303c), + ...range(0x3041, 0x3096), + ...range(0x309d, 0x309f), + ...range(0x30a1, 0x30fa), + ...range(0x30fc, 0x30ff), + ...range(0x3105, 0x312c), + ...range(0x3131, 0x318e), + ...range(0x3190, 0x31b7), + ...range(0x31f0, 0x321c), + ...range(0x3220, 0x3243), + ...range(0x3260, 0x327b), + ...range(0x327f, 0x32b0), + ...range(0x32c0, 0x32cb), + ...range(0x32d0, 0x32fe), + ...range(0x3300, 0x3376), + ...range(0x337b, 0x33dd), + ...range(0x33e0, 0x33fe), + ...range(0x3400, 0x4db5), + ...range(0x4e00, 0x9fa5), + ...range(0xa000, 0xa48c), + ...range(0xac00, 0xd7a3), + ...range(0xd800, 0xfa2d), + ...range(0xfa30, 0xfa6a), + ...range(0xfb00, 0xfb06), + ...range(0xfb13, 0xfb17), + ...range(0xff21, 0xff3a), + ...range(0xff41, 0xff5a), + ...range(0xff66, 0xffbe), + ...range(0xffc2, 0xffc7), + ...range(0xffca, 0xffcf), + ...range(0xffd2, 0xffd7), + ...range(0xffda, 0xffdc), + ...range(0x10300, 0x1031e), + ...range(0x10320, 0x10323), + ...range(0x10330, 0x1034a), + ...range(0x10400, 0x10425), 
+ ...range(0x10428, 0x1044d), + ...range(0x1d000, 0x1d0f5), + ...range(0x1d100, 0x1d126), + ...range(0x1d12a, 0x1d166), + ...range(0x1d16a, 0x1d172), + ...range(0x1d183, 0x1d184), + ...range(0x1d18c, 0x1d1a9), + ...range(0x1d1ae, 0x1d1dd), + ...range(0x1d400, 0x1d454), + ...range(0x1d456, 0x1d49c), + ...range(0x1d49e, 0x1d49f), + 0x1d4a2, + ...range(0x1d4a5, 0x1d4a6), + ...range(0x1d4a9, 0x1d4ac), + ...range(0x1d4ae, 0x1d4b9), + 0x1d4bb, + ...range(0x1d4bd, 0x1d4c0), + ...range(0x1d4c2, 0x1d4c3), + ...range(0x1d4c5, 0x1d505), + ...range(0x1d507, 0x1d50a), + ...range(0x1d50d, 0x1d514), + ...range(0x1d516, 0x1d51c), + ...range(0x1d51e, 0x1d539), + ...range(0x1d53b, 0x1d53e), + ...range(0x1d540, 0x1d544), + 0x1d546, + ...range(0x1d54a, 0x1d550), + ...range(0x1d552, 0x1d6a3), + ...range(0x1d6a8, 0x1d7c9), + ...range(0x20000, 0x2a6d6), + ...range(0x2f800, 0x2fa1d), + ...range(0xf0000, 0xffffd), + ...range(0x100000, 0x10fffd), +]); + +module.exports = { + unassigned_code_points, + commonly_mapped_to_nothing, + non_ASCII_space_characters, + prohibited_characters, + bidirectional_r_al, + bidirectional_l, +}; diff --git a/node_modules/saslprep/lib/memory-code-points.js b/node_modules/saslprep/lib/memory-code-points.js new file mode 100644 index 000000000..cb0289c87 --- /dev/null +++ b/node_modules/saslprep/lib/memory-code-points.js @@ -0,0 +1,39 @@ +'use strict'; + +const fs = require('fs'); +const path = require('path'); +const bitfield = require('sparse-bitfield'); + +/* eslint-disable-next-line security/detect-non-literal-fs-filename */ +const memory = fs.readFileSync(path.resolve(__dirname, '../code-points.mem')); +let offset = 0; + +/** + * Loads each code points sequence from buffer. + * @returns {bitfield} + */ +function read() { + const size = memory.readUInt32BE(offset); + offset += 4; + + const codepoints = memory.slice(offset, offset + size); + offset += size; + + return bitfield({ buffer: codepoints }); +} + +const unassigned_code_points = read(); +const commonly_mapped_to_nothing = read(); +const non_ASCII_space_characters = read(); +const prohibited_characters = read(); +const bidirectional_r_al = read(); +const bidirectional_l = read(); + +module.exports = { + unassigned_code_points, + commonly_mapped_to_nothing, + non_ASCII_space_characters, + prohibited_characters, + bidirectional_r_al, + bidirectional_l, +}; diff --git a/node_modules/saslprep/lib/util.js b/node_modules/saslprep/lib/util.js new file mode 100644 index 000000000..506bdc992 --- /dev/null +++ b/node_modules/saslprep/lib/util.js @@ -0,0 +1,21 @@ +'use strict'; + +/** + * Create an array of numbers. + * @param {number} from + * @param {number} to + * @returns {number[]} + */ +function range(from, to) { + // TODO: make this inlined. 
+ const list = new Array(to - from + 1); + + for (let i = 0; i < list.length; i += 1) { + list[i] = from + i; + } + return list; +} + +module.exports = { + range, +}; diff --git a/node_modules/saslprep/package.json b/node_modules/saslprep/package.json new file mode 100644 index 000000000..c624e3b6d --- /dev/null +++ b/node_modules/saslprep/package.json @@ -0,0 +1,100 @@ +{ + "_from": "saslprep@^1.0.3", + "_id": "saslprep@1.0.3", + "_inBundle": false, + "_integrity": "sha512-/MY/PEMbk2SuY5sScONwhUDsV2p77Znkb/q3nSVstq/yQzYJOH/Azh29p9oJLsl3LnQwSvZDKagDGBsBwSooag==", + "_location": "/saslprep", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "saslprep@^1.0.3", + "name": "saslprep", + "escapedName": "saslprep", + "rawSpec": "^1.0.3", + "saveSpec": null, + "fetchSpec": "^1.0.3" + }, + "_requiredBy": [ + "/mongodb" + ], + "_resolved": "https://registry.npmjs.org/saslprep/-/saslprep-1.0.3.tgz", + "_shasum": "4c02f946b56cf54297e347ba1093e7acac4cf226", + "_spec": "saslprep@^1.0.3", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongodb", + "author": { + "name": "Dmitry Tsvettsikh", + "email": "me@reklatsmasters.com" + }, + "bugs": { + "url": "https://github.com/reklatsmasters/saslprep/issues" + }, + "bundleDependencies": false, + "dependencies": { + "sparse-bitfield": "^3.0.3" + }, + "deprecated": false, + "description": "SASLprep: Stringprep Profile for User Names and Passwords, rfc4013.", + "devDependencies": { + "@nodertc/eslint-config": "^0.2.1", + "eslint": "^5.16.0", + "jest": "^23.6.0", + "prettier": "^1.14.3" + }, + "engines": { + "node": ">=6" + }, + "eslintConfig": { + "extends": "@nodertc", + "rules": { + "camelcase": "off", + "no-continue": "off" + }, + "overrides": [ + { + "files": [ + "test/*.js" + ], + "env": { + "jest": true + }, + "rules": { + "require-jsdoc": "off" + } + } + ] + }, + "homepage": "https://github.com/reklatsmasters/saslprep#readme", + "jest": { + "modulePaths": [ + "" + ], + "testMatch": [ + "**/test/*.js" + ], + "testPathIgnorePatterns": [ + "/node_modules/" + ] + }, + "keywords": [ + "sasl", + "saslprep", + "stringprep", + "rfc4013", + "4013" + ], + "license": "MIT", + "main": "index.js", + "name": "saslprep", + "repository": { + "type": "git", + "url": "git+https://github.com/reklatsmasters/saslprep.git" + }, + "scripts": { + "gen-code-points": "node generate-code-points.js > code-points.mem", + "lint": "npx eslint --quiet .", + "test": "npm run lint && npm run unit-test", + "unit-test": "npx jest" + }, + "version": "1.0.3" +} diff --git a/node_modules/saslprep/readme.md b/node_modules/saslprep/readme.md new file mode 100644 index 000000000..8ff3d70d7 --- /dev/null +++ b/node_modules/saslprep/readme.md @@ -0,0 +1,31 @@ +# saslprep +[![Build Status](https://travis-ci.org/reklatsmasters/saslprep.svg?branch=master)](https://travis-ci.org/reklatsmasters/saslprep) +[![npm](https://img.shields.io/npm/v/saslprep.svg)](https://npmjs.org/package/saslprep) +[![node](https://img.shields.io/node/v/saslprep.svg)](https://npmjs.org/package/saslprep) +[![license](https://img.shields.io/npm/l/saslprep.svg)](https://npmjs.org/package/saslprep) +[![downloads](https://img.shields.io/npm/dm/saslprep.svg)](https://npmjs.org/package/saslprep) + +Stringprep Profile for User Names and Passwords, [rfc4013](https://tools.ietf.org/html/rfc4013) + +### Usage + +```js +const saslprep = require('saslprep') + +saslprep('password\u00AD') // password +saslprep('password\u0007') // Error: prohibited character 
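+
+// Illustrative sketch (not from the upstream readme), mirroring this package's
+// own test/index.js: U+0487 is unassigned in Unicode 3.2, so it is rejected
+// unless the `allowUnassigned` option described under API below is set.
+saslprep('a\u0487')                            // Error: unassigned code point
+saslprep('a\u0487', { allowUnassigned: true }) // 'a\u0487'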
+``` + +### API + +##### `saslprep(input: String, opts: Options): String` + +Normalize user name or password. + +##### `Options.allowUnassigned: bool` + +A special behavior for unassigned code points, see https://tools.ietf.org/html/rfc4013#section-2.5. Disabled by default. + +## License + +MIT, 2017-2019 (c) Dmitriy Tsvettsikh diff --git a/node_modules/saslprep/test/index.js b/node_modules/saslprep/test/index.js new file mode 100644 index 000000000..80c71af5e --- /dev/null +++ b/node_modules/saslprep/test/index.js @@ -0,0 +1,76 @@ +'use strict'; + +const saslprep = require('..'); + +const chr = String.fromCodePoint; + +test('should work with liatin letters', () => { + const str = 'user'; + expect(saslprep(str)).toEqual(str); +}); + +test('should work be case preserved', () => { + const str = 'USER'; + expect(saslprep(str)).toEqual(str); +}); + +test('should work with high code points (> U+FFFF)', () => { + const str = '\uD83D\uDE00'; + expect(saslprep(str, { allowUnassigned: true })).toEqual(str); +}); + +test('should remove `mapped to nothing` characters', () => { + expect(saslprep('I\u00ADX')).toEqual('IX'); +}); + +test('should replace `Non-ASCII space characters` with space', () => { + expect(saslprep('a\u00A0b')).toEqual('a\u0020b'); +}); + +test('should normalize as NFKC', () => { + expect(saslprep('\u00AA')).toEqual('a'); + expect(saslprep('\u2168')).toEqual('IX'); +}); + +test('should throws when prohibited characters', () => { + // C.2.1 ASCII control characters + expect(() => saslprep('a\u007Fb')).toThrow(); + + // C.2.2 Non-ASCII control characters + expect(() => saslprep('a\u06DDb')).toThrow(); + + // C.3 Private use + expect(() => saslprep('a\uE000b')).toThrow(); + + // C.4 Non-character code points + expect(() => saslprep(`a${chr(0x1fffe)}b`)).toThrow(); + + // C.5 Surrogate codes + expect(() => saslprep('a\uD800b')).toThrow(); + + // C.6 Inappropriate for plain text + expect(() => saslprep('a\uFFF9b')).toThrow(); + + // C.7 Inappropriate for canonical representation + expect(() => saslprep('a\u2FF0b')).toThrow(); + + // C.8 Change display properties or are deprecated + expect(() => saslprep('a\u200Eb')).toThrow(); + + // C.9 Tagging characters + expect(() => saslprep(`a${chr(0xe0001)}b`)).toThrow(); +}); + +test('should not containt RandALCat and LCat bidi', () => { + expect(() => saslprep('a\u06DD\u00AAb')).toThrow(); +}); + +test('RandALCat should be first and last', () => { + expect(() => saslprep('\u0627\u0031\u0628')).not.toThrow(); + expect(() => saslprep('\u0627\u0031')).toThrow(); +}); + +test('should handle unassigned code points', () => { + expect(() => saslprep('a\u0487')).toThrow(); + expect(() => saslprep('a\u0487', { allowUnassigned: true })).not.toThrow(); +}); diff --git a/node_modules/saslprep/test/util.js b/node_modules/saslprep/test/util.js new file mode 100644 index 000000000..355db3f85 --- /dev/null +++ b/node_modules/saslprep/test/util.js @@ -0,0 +1,16 @@ +'use strict'; + +const { setFlagsFromString } = require('v8'); +const { range } = require('../lib/util'); + +// 984 by default. 
+setFlagsFromString('--stack_size=500'); + +test('should work', () => { + const list = range(1, 3); + expect(list).toEqual([1, 2, 3]); +}); + +test('should work for large ranges', () => { + expect(() => range(1, 1e6)).not.toThrow(); +}); diff --git a/node_modules/semver-diff/index.d.ts b/node_modules/semver-diff/index.d.ts new file mode 100644 index 000000000..d6f6a4277 --- /dev/null +++ b/node_modules/semver-diff/index.d.ts @@ -0,0 +1,58 @@ +declare namespace semverDiff { + type Result = + | 'major' + | 'premajor' + | 'minor' + | 'preminor' + | 'patch' + | 'prepatch' + | 'prerelease' + | 'build'; +} + +/** +Get the diff type of two [semver](https://github.com/npm/node-semver) versions: `0.0.1 0.0.2` → `patch`. + +@returns The difference type between two semver versions, or `undefined` if they're identical or the second one is lower than the first. + +@example +``` +import semverDiff = require('semver-diff'); + +semverDiff('1.1.1', '1.1.2'); +//=> 'patch' + +semverDiff('1.1.1-foo', '1.1.2'); +//=> 'prepatch' + +semverDiff('0.0.1', '1.0.0'); +//=> 'major' + +semverDiff('0.0.1-foo', '1.0.0'); +//=> 'premajor' + +semverDiff('0.0.1', '0.1.0'); +//=> 'minor' + +semverDiff('0.0.1-foo', '0.1.0'); +//=> 'preminor' + +semverDiff('0.0.1-foo', '0.0.1-foo.bar'); +//=> 'prerelease' + +semverDiff('0.1.0', '0.1.0+foo'); +//=> 'build' + +semverDiff('0.0.1', '0.0.1'); +//=> undefined + +semverDiff('0.0.2', '0.0.1'); +//=> undefined +``` +*/ +declare function semverDiff( + versionA: string, + versionB: string +): semverDiff.Result | undefined; + +export = semverDiff; diff --git a/node_modules/semver-diff/index.js b/node_modules/semver-diff/index.js new file mode 100644 index 000000000..552180153 --- /dev/null +++ b/node_modules/semver-diff/index.js @@ -0,0 +1,13 @@ +'use strict'; +const semver = require('semver'); + +module.exports = (versionA, versionB) => { + versionA = semver.parse(versionA); + versionB = semver.parse(versionB); + + if (semver.compareBuild(versionA, versionB) >= 0) { + return; + } + + return semver.diff(versionA, versionB) || 'build'; +}; diff --git a/node_modules/semver-diff/license b/node_modules/semver-diff/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/semver-diff/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/semver-diff/node_modules/.bin/semver b/node_modules/semver-diff/node_modules/.bin/semver new file mode 100644 index 000000000..7e365277d --- /dev/null +++ b/node_modules/semver-diff/node_modules/.bin/semver @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../semver/bin/semver.js" "$@" + ret=$? +else + node "$basedir/../semver/bin/semver.js" "$@" + ret=$? +fi +exit $ret diff --git a/node_modules/semver-diff/node_modules/.bin/semver.cmd b/node_modules/semver-diff/node_modules/.bin/semver.cmd new file mode 100644 index 000000000..164cdeac5 --- /dev/null +++ b/node_modules/semver-diff/node_modules/.bin/semver.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\semver\bin\semver.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/semver-diff/node_modules/.bin/semver.ps1 b/node_modules/semver-diff/node_modules/.bin/semver.ps1 new file mode 100644 index 000000000..6a85e3405 --- /dev/null +++ b/node_modules/semver-diff/node_modules/.bin/semver.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/semver-diff/node_modules/semver/CHANGELOG.md b/node_modules/semver-diff/node_modules/semver/CHANGELOG.md new file mode 100644 index 000000000..f567dd3fe --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/CHANGELOG.md @@ -0,0 +1,70 @@ +# changes log + +## 6.2.0 + +* Coerce numbers to strings when passed to semver.coerce() +* Add `rtl` option to coerce from right to left + +## 6.1.3 + +* Handle X-ranges properly in includePrerelease mode + +## 6.1.2 + +* Do not throw when testing invalid version strings + +## 6.1.1 + +* Add options support for semver.coerce() +* Handle undefined version passed to Range.test + +## 6.1.0 + +* Add semver.compareBuild function +* Support `*` in semver.intersects + +## 6.0 + +* Fix `intersects` logic. + + This is technically a bug fix, but since it is also a change to behavior + that may require users updating their code, it is marked as a major + version increment. + +## 5.7 + +* Add `minVersion` method + +## 5.6 + +* Move boolean `loose` param to an options object, with + backwards-compatibility protection. +* Add ability to opt out of special prerelease version handling with + the `includePrerelease` option flag. 
+ +## 5.5 + +* Add version coercion capabilities + +## 5.4 + +* Add intersection checking + +## 5.3 + +* Add `minSatisfying` method + +## 5.2 + +* Add `prerelease(v)` that returns prerelease components + +## 5.1 + +* Add Backus-Naur for ranges +* Remove excessively cute inspection methods + +## 5.0 + +* Remove AMD/Browserified build artifacts +* Fix ltr and gtr when using the `*` range +* Fix for range `*` with a prerelease identifier diff --git a/node_modules/semver-diff/node_modules/semver/LICENSE b/node_modules/semver-diff/node_modules/semver/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/semver-diff/node_modules/semver/README.md b/node_modules/semver-diff/node_modules/semver/README.md new file mode 100644 index 000000000..2293a14fd --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/README.md @@ -0,0 +1,443 @@ +semver(1) -- The semantic versioner for npm +=========================================== + +## Install + +```bash +npm install semver +```` + +## Usage + +As a node module: + +```js +const semver = require('semver') + +semver.valid('1.2.3') // '1.2.3' +semver.valid('a.b.c') // null +semver.clean(' =v1.2.3 ') // '1.2.3' +semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true +semver.gt('1.2.3', '9.8.7') // false +semver.lt('1.2.3', '9.8.7') // true +semver.minVersion('>=1.0.0') // '1.0.0' +semver.valid(semver.coerce('v2')) // '2.0.0' +semver.valid(semver.coerce('42.6.7.9.3-alpha')) // '42.6.7' +``` + +As a command-line utility: + +``` +$ semver -h + +A JavaScript implementation of the https://semver.org/ specification +Copyright Isaac Z. Schlueter + +Usage: semver [options] [ [...]] +Prints valid versions sorted by SemVer precedence + +Options: +-r --range + Print versions that match the specified range. + +-i --increment [] + Increment a version by the specified level. Level can + be one of: major, minor, patch, premajor, preminor, + prepatch, or prerelease. Default level is 'patch'. + Only one version may be specified. + +--preid + Identifier to be used to prefix premajor, preminor, + prepatch or prerelease version increments. + +-l --loose + Interpret versions and ranges loosely + +-p --include-prerelease + Always include prerelease versions in range matching + +-c --coerce + Coerce a string into SemVer if possible + (does not imply --loose) + +--rtl + Coerce version strings right to left + +--ltr + Coerce version strings left to right (default) + +Program exits successfully if any valid version satisfies +all supplied ranges, and prints all satisfying versions. + +If no satisfying versions are found, then exits failure. 
+ +Versions are printed in ascending order, so supplying +multiple versions to the utility will just sort them. +``` + +## Versions + +A "version" is described by the `v2.0.0` specification found at +. + +A leading `"="` or `"v"` character is stripped off and ignored. + +## Ranges + +A `version range` is a set of `comparators` which specify versions +that satisfy the range. + +A `comparator` is composed of an `operator` and a `version`. The set +of primitive `operators` is: + +* `<` Less than +* `<=` Less than or equal to +* `>` Greater than +* `>=` Greater than or equal to +* `=` Equal. If no operator is specified, then equality is assumed, + so this operator is optional, but MAY be included. + +For example, the comparator `>=1.2.7` would match the versions +`1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` +or `1.1.0`. + +Comparators can be joined by whitespace to form a `comparator set`, +which is satisfied by the **intersection** of all of the comparators +it includes. + +A range is composed of one or more comparator sets, joined by `||`. A +version matches a range if and only if every comparator in at least +one of the `||`-separated comparator sets is satisfied by the version. + +For example, the range `>=1.2.7 <1.3.0` would match the versions +`1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, +or `1.1.0`. + +The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, +`1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. + +### Prerelease Tags + +If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then +it will only be allowed to satisfy comparator sets if at least one +comparator with the same `[major, minor, patch]` tuple also has a +prerelease tag. + +For example, the range `>1.2.3-alpha.3` would be allowed to match the +version `1.2.3-alpha.7`, but it would *not* be satisfied by +`3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater +than" `1.2.3-alpha.3` according to the SemVer sort rules. The version +range only accepts prerelease tags on the `1.2.3` version. The +version `3.4.5` *would* satisfy the range, because it does not have a +prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. + +The purpose for this behavior is twofold. First, prerelease versions +frequently are updated very quickly, and contain many breaking changes +that are (by the author's design) not yet fit for public consumption. +Therefore, by default, they are excluded from range matching +semantics. + +Second, a user who has opted into using a prerelease version has +clearly indicated the intent to use *that specific* set of +alpha/beta/rc versions. By including a prerelease tag in the range, +the user is indicating that they are aware of the risk. However, it +is still not appropriate to assume that they have opted into taking a +similar risk on the *next* set of prerelease versions. + +Note that this behavior can be suppressed (treating all prerelease +versions as if they were normal versions, for the purpose of range +matching) by setting the `includePrerelease` flag on the options +object to any +[functions](https://github.com/npm/node-semver#functions) that do +range matching. 
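As a quick illustration of the prerelease rules described above, here is a minimal sketch using the `satisfies` call shown earlier in this README (the boolean results follow directly from the rules in this section):

```js
const semver = require('semver')

// A prerelease version only satisfies a range whose comparator shares the
// same [major, minor, patch] tuple and itself carries a prerelease tag.
semver.satisfies('1.2.3-alpha.7', '>1.2.3-alpha.3') // true
semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3') // false (different tuple)
semver.satisfies('3.4.5', '>1.2.3-alpha.3')         // true (no prerelease flag)

// Opting in with includePrerelease treats prereleases as ordinary versions.
semver.satisfies('3.4.5-alpha.9', '>1.2.3-alpha.3', { includePrerelease: true }) // true
```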
+ +#### Prerelease Identifiers + +The method `.inc` takes an additional `identifier` string argument that +will append the value of the string as a prerelease identifier: + +```javascript +semver.inc('1.2.3', 'prerelease', 'beta') +// '1.2.4-beta.0' +``` + +command-line example: + +```bash +$ semver 1.2.3 -i prerelease --preid beta +1.2.4-beta.0 +``` + +Which then can be used to increment further: + +```bash +$ semver 1.2.4-beta.0 -i prerelease +1.2.4-beta.1 +``` + +### Advanced Range Syntax + +Advanced range syntax desugars to primitive comparators in +deterministic ways. + +Advanced ranges may be combined in the same way as primitive +comparators using white space or `||`. + +#### Hyphen Ranges `X.Y.Z - A.B.C` + +Specifies an inclusive set. + +* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` + +If a partial version is provided as the first version in the inclusive +range, then the missing pieces are replaced with zeroes. + +* `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` + +If a partial version is provided as the second version in the +inclusive range, then all versions that start with the supplied parts +of the tuple are accepted, but nothing that would be greater than the +provided tuple parts. + +* `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` +* `1.2.3 - 2` := `>=1.2.3 <3.0.0` + +#### X-Ranges `1.2.x` `1.X` `1.2.*` `*` + +Any of `X`, `x`, or `*` may be used to "stand in" for one of the +numeric values in the `[major, minor, patch]` tuple. + +* `*` := `>=0.0.0` (Any version satisfies) +* `1.x` := `>=1.0.0 <2.0.0` (Matching major version) +* `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) + +A partial version range is treated as an X-Range, so the special +character is in fact optional. + +* `""` (empty string) := `*` := `>=0.0.0` +* `1` := `1.x.x` := `>=1.0.0 <2.0.0` +* `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` + +#### Tilde Ranges `~1.2.3` `~1.2` `~1` + +Allows patch-level changes if a minor version is specified on the +comparator. Allows minor-level changes if not. + +* `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` +* `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) +* `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) +* `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` +* `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) +* `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) +* `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. + +#### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` + +Allows changes that do not modify the left-most non-zero element in the +`[major, minor, patch]` tuple. In other words, this allows patch and +minor updates for versions `1.0.0` and above, patch updates for +versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. + +Many authors treat a `0.x` version as if the `x` were the major +"breaking-change" indicator. + +Caret ranges are ideal when an author may make breaking changes +between `0.2.4` and `0.3.0` releases, which is a common practice. +However, it presumes that there will *not* be breaking changes between +`0.2.4` and `0.2.5`. It allows for changes that are presumed to be +additive (but non-breaking), according to commonly observed practices. 
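Before the caret desugarings listed below, here is a minimal sketch of how the advanced forms covered above (hyphen ranges, X-ranges, and tilde ranges) behave when checked with `semver.satisfies`; the expected results follow from the desugared comparators given in this section:

```js
const semver = require('semver')

// Hyphen range: 1.2.3 - 2.3 desugars to >=1.2.3 <2.4.0
semver.satisfies('2.3.9', '1.2.3 - 2.3')  // true
semver.satisfies('2.4.0', '1.2.3 - 2.3')  // false

// X-range: 1.2.x desugars to >=1.2.0 <1.3.0
semver.satisfies('1.2.99', '1.2.x')       // true
semver.satisfies('1.3.0', '1.2.x')        // false

// Tilde: ~1.2.3 allows patch-level changes only (>=1.2.3 <1.3.0)
semver.satisfies('1.2.9', '~1.2.3')       // true
semver.satisfies('1.3.0', '~1.2.3')       // false
```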
+ +* `^1.2.3` := `>=1.2.3 <2.0.0` +* `^0.2.3` := `>=0.2.3 <0.3.0` +* `^0.0.3` := `>=0.0.3 <0.0.4` +* `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. +* `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the + `0.0.3` version *only* will be allowed, if they are greater than or + equal to `beta`. So, `0.0.3-pr.2` would be allowed. + +When parsing caret ranges, a missing `patch` value desugars to the +number `0`, but will allow flexibility within that value, even if the +major and minor versions are both `0`. + +* `^1.2.x` := `>=1.2.0 <2.0.0` +* `^0.0.x` := `>=0.0.0 <0.1.0` +* `^0.0` := `>=0.0.0 <0.1.0` + +A missing `minor` and `patch` values will desugar to zero, but also +allow flexibility within those values, even if the major version is +zero. + +* `^1.x` := `>=1.0.0 <2.0.0` +* `^0.x` := `>=0.0.0 <1.0.0` + +### Range Grammar + +Putting all this together, here is a Backus-Naur grammar for ranges, +for the benefit of parser authors: + +```bnf +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | ['1'-'9'] ( ['0'-'9'] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' part ) * +part ::= nr | [-0-9A-Za-z]+ +``` + +## Functions + +All methods and classes take a final `options` object argument. All +options in this object are `false` by default. The options supported +are: + +- `loose` Be more forgiving about not-quite-valid semver strings. + (Any resulting output will always be 100% strict compliant, of + course.) For backwards compatibility reasons, if the `options` + argument is a boolean value instead of an object, it is interpreted + to be the `loose` param. +- `includePrerelease` Set to suppress the [default + behavior](https://github.com/npm/node-semver#prerelease-tags) of + excluding prerelease tagged versions from ranges unless they are + explicitly opted into. + +Strict-mode Comparators and Ranges will be strict about the SemVer +strings that they parse. + +* `valid(v)`: Return the parsed version, or null if it's not valid. +* `inc(v, release)`: Return the version incremented by the release + type (`major`, `premajor`, `minor`, `preminor`, `patch`, + `prepatch`, or `prerelease`), or null if it's not valid + * `premajor` in one call will bump the version up to the next major + version and down to a prerelease of that major version. + `preminor`, and `prepatch` work the same way. + * If called from a non-prerelease version, the `prerelease` will work the + same as `prepatch`. It increments the patch version, then makes a + prerelease. If the input version is already a prerelease it simply + increments it. +* `prerelease(v)`: Returns an array of prerelease components, or null + if none exist. Example: `prerelease('1.2.3-alpha.1') -> ['alpha', 1]` +* `major(v)`: Return the major version number. +* `minor(v)`: Return the minor version number. +* `patch(v)`: Return the patch version number. 
+* `intersects(r1, r2, loose)`: Return true if the two supplied ranges + or comparators intersect. +* `parse(v)`: Attempt to parse a string as a semantic version, returning either + a `SemVer` object or `null`. + +### Comparison + +* `gt(v1, v2)`: `v1 > v2` +* `gte(v1, v2)`: `v1 >= v2` +* `lt(v1, v2)`: `v1 < v2` +* `lte(v1, v2)`: `v1 <= v2` +* `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, + even if they're not the exact same string. You already know how to + compare strings. +* `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. +* `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call + the corresponding function above. `"==="` and `"!=="` do simple + string comparison, but are included for completeness. Throws if an + invalid comparison string is provided. +* `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `rcompare(v1, v2)`: The reverse of compare. Sorts an array of versions + in descending order when passed to `Array.sort()`. +* `compareBuild(v1, v2)`: The same as `compare` but considers `build` when two versions + are equal. Sorts in ascending order if passed to `Array.sort()`. + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `diff(v1, v2)`: Returns difference between two versions by the release type + (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), + or null if the versions are the same. + +### Comparators + +* `intersects(comparator)`: Return true if the comparators intersect + +### Ranges + +* `validRange(range)`: Return the valid range or null if it's not valid +* `satisfies(version, range)`: Return true if the version satisfies the + range. +* `maxSatisfying(versions, range)`: Return the highest version in the list + that satisfies the range, or `null` if none of them do. +* `minSatisfying(versions, range)`: Return the lowest version in the list + that satisfies the range, or `null` if none of them do. +* `minVersion(range)`: Return the lowest version that can possibly match + the given range. +* `gtr(version, range)`: Return `true` if version is greater than all the + versions possible in the range. +* `ltr(version, range)`: Return `true` if version is less than all the + versions possible in the range. +* `outside(version, range, hilo)`: Return true if the version is outside + the bounds of the range in either the high or low direction. The + `hilo` argument must be either the string `'>'` or `'<'`. (This is + the function called by `gtr` and `ltr`.) +* `intersects(range)`: Return true if any of the ranges comparators intersect + +Note that, since ranges may be non-contiguous, a version might not be +greater than a range, less than a range, *or* satisfy a range! For +example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` +until `2.0.0`, so the version `1.2.10` would not be greater than the +range (because `2.0.1` satisfies, which is higher), nor less than the +range (since `1.2.8` satisfies, which is lower), and it also does not +satisfy the range. + +If you want to know if a version satisfies or does not satisfy a +range, use the `satisfies(version, range)` function. + +### Coercion + +* `coerce(version, options)`: Coerces a string to semver if possible + +This aims to provide a very forgiving translation of a non-semver string to +semver. 
It looks for the first digit in a string, and consumes all +remaining characters which satisfy at least a partial semver (e.g., `1`, +`1.2`, `1.2.3`) up to the max permitted length (256 characters). Longer +versions are simply truncated (`4.6.3.9.2-alpha2` becomes `4.6.3`). All +surrounding text is simply ignored (`v3.4 replaces v3.3.1` becomes +`3.4.0`). Only text which lacks digits will fail coercion (`version one` +is not valid). The maximum length for any semver component considered for +coercion is 16 characters; longer components will be ignored +(`10000000000000000.4.7.4` becomes `4.7.4`). The maximum value for any +semver component is `Integer.MAX_SAFE_INTEGER || (2**53 - 1)`; higher value +components are invalid (`9999999999999999.4.7.4` is likely invalid). + +If the `options.rtl` flag is set, then `coerce` will return the right-most +coercible tuple that does not share an ending index with a longer coercible +tuple. For example, `1.2.3.4` will return `2.3.4` in rtl mode, not +`4.0.0`. `1.2.3/4` will return `4.0.0`, because the `4` is not a part of +any other overlapping SemVer tuple. + +### Clean + +* `clean(version)`: Clean a string to be a valid semver if possible + +This will return a cleaned and trimmed semver version. If the provided version is not valid a null will be returned. This does not work for ranges. + +ex. +* `s.clean(' = v 2.1.5foo')`: `null` +* `s.clean(' = v 2.1.5foo', { loose: true })`: `'2.1.5-foo'` +* `s.clean(' = v 2.1.5-foo')`: `null` +* `s.clean(' = v 2.1.5-foo', { loose: true })`: `'2.1.5-foo'` +* `s.clean('=v2.1.5')`: `'2.1.5'` +* `s.clean(' =v2.1.5')`: `2.1.5` +* `s.clean(' 2.1.5 ')`: `'2.1.5'` +* `s.clean('~1.0.0')`: `null` diff --git a/node_modules/semver-diff/node_modules/semver/bin/semver.js b/node_modules/semver-diff/node_modules/semver/bin/semver.js new file mode 100644 index 000000000..666034a75 --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/bin/semver.js @@ -0,0 +1,174 @@ +#!/usr/bin/env node +// Standalone semver comparison program. +// Exits successfully and prints matching version(s) if +// any supplied version is valid and passes all tests. 
+ +var argv = process.argv.slice(2) + +var versions = [] + +var range = [] + +var inc = null + +var version = require('../package.json').version + +var loose = false + +var includePrerelease = false + +var coerce = false + +var rtl = false + +var identifier + +var semver = require('../semver') + +var reverse = false + +var options = {} + +main() + +function main () { + if (!argv.length) return help() + while (argv.length) { + var a = argv.shift() + var indexOfEqualSign = a.indexOf('=') + if (indexOfEqualSign !== -1) { + a = a.slice(0, indexOfEqualSign) + argv.unshift(a.slice(indexOfEqualSign + 1)) + } + switch (a) { + case '-rv': case '-rev': case '--rev': case '--reverse': + reverse = true + break + case '-l': case '--loose': + loose = true + break + case '-p': case '--include-prerelease': + includePrerelease = true + break + case '-v': case '--version': + versions.push(argv.shift()) + break + case '-i': case '--inc': case '--increment': + switch (argv[0]) { + case 'major': case 'minor': case 'patch': case 'prerelease': + case 'premajor': case 'preminor': case 'prepatch': + inc = argv.shift() + break + default: + inc = 'patch' + break + } + break + case '--preid': + identifier = argv.shift() + break + case '-r': case '--range': + range.push(argv.shift()) + break + case '-c': case '--coerce': + coerce = true + break + case '--rtl': + rtl = true + break + case '--ltr': + rtl = false + break + case '-h': case '--help': case '-?': + return help() + default: + versions.push(a) + break + } + } + + var options = { loose: loose, includePrerelease: includePrerelease, rtl: rtl } + + versions = versions.map(function (v) { + return coerce ? (semver.coerce(v, options) || { version: v }).version : v + }).filter(function (v) { + return semver.valid(v) + }) + if (!versions.length) return fail() + if (inc && (versions.length !== 1 || range.length)) { return failInc() } + + for (var i = 0, l = range.length; i < l; i++) { + versions = versions.filter(function (v) { + return semver.satisfies(v, range[i], options) + }) + if (!versions.length) return fail() + } + return success(versions) +} + +function failInc () { + console.error('--inc can only be used on a single version with no range') + fail() +} + +function fail () { process.exit(1) } + +function success () { + var compare = reverse ? 'rcompare' : 'compare' + versions.sort(function (a, b) { + return semver[compare](a, b, options) + }).map(function (v) { + return semver.clean(v, options) + }).map(function (v) { + return inc ? semver.inc(v, inc, options, identifier) : v + }).forEach(function (v, i, _) { console.log(v) }) +} + +function help () { + console.log(['SemVer ' + version, + '', + 'A JavaScript implementation of the https://semver.org/ specification', + 'Copyright Isaac Z. Schlueter', + '', + 'Usage: semver [options] [ [...]]', + 'Prints valid versions sorted by SemVer precedence', + '', + 'Options:', + '-r --range ', + ' Print versions that match the specified range.', + '', + '-i --increment []', + ' Increment a version by the specified level. Level can', + ' be one of: major, minor, patch, premajor, preminor,', + " prepatch, or prerelease. 
Default level is 'patch'.", + ' Only one version may be specified.', + '', + '--preid ', + ' Identifier to be used to prefix premajor, preminor,', + ' prepatch or prerelease version increments.', + '', + '-l --loose', + ' Interpret versions and ranges loosely', + '', + '-p --include-prerelease', + ' Always include prerelease versions in range matching', + '', + '-c --coerce', + ' Coerce a string into SemVer if possible', + ' (does not imply --loose)', + '', + '--rtl', + ' Coerce version strings right to left', + '', + '--ltr', + ' Coerce version strings left to right (default)', + '', + 'Program exits successfully if any valid version satisfies', + 'all supplied ranges, and prints all satisfying versions.', + '', + 'If no satisfying versions are found, then exits failure.', + '', + 'Versions are printed in ascending order, so supplying', + 'multiple versions to the utility will just sort them.' + ].join('\n')) +} diff --git a/node_modules/semver-diff/node_modules/semver/package.json b/node_modules/semver-diff/node_modules/semver/package.json new file mode 100644 index 000000000..ff2596dd3 --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/package.json @@ -0,0 +1,60 @@ +{ + "_from": "semver@^6.3.0", + "_id": "semver@6.3.0", + "_inBundle": false, + "_integrity": "sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==", + "_location": "/semver-diff/semver", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "semver@^6.3.0", + "name": "semver", + "escapedName": "semver", + "rawSpec": "^6.3.0", + "saveSpec": null, + "fetchSpec": "^6.3.0" + }, + "_requiredBy": [ + "/semver-diff" + ], + "_resolved": "https://registry.npmjs.org/semver/-/semver-6.3.0.tgz", + "_shasum": "ee0a64c8af5e8ceea67687b133761e1becbd1d3d", + "_spec": "semver@^6.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\semver-diff", + "bin": { + "semver": "bin/semver.js" + }, + "bugs": { + "url": "https://github.com/npm/node-semver/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "The semantic version parser used by npm.", + "devDependencies": { + "tap": "^14.3.1" + }, + "files": [ + "bin", + "range.bnf", + "semver.js" + ], + "homepage": "https://github.com/npm/node-semver#readme", + "license": "ISC", + "main": "semver.js", + "name": "semver", + "repository": { + "type": "git", + "url": "git+https://github.com/npm/node-semver.git" + }, + "scripts": { + "postpublish": "git push origin --follow-tags", + "postversion": "npm publish", + "preversion": "npm test", + "test": "tap" + }, + "tap": { + "check-coverage": true + }, + "version": "6.3.0" +} diff --git a/node_modules/semver-diff/node_modules/semver/range.bnf b/node_modules/semver-diff/node_modules/semver/range.bnf new file mode 100644 index 000000000..d4c6ae0d7 --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/range.bnf @@ -0,0 +1,16 @@ +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | [1-9] ( [0-9] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' 
part ) * +part ::= nr | [-0-9A-Za-z]+ diff --git a/node_modules/semver-diff/node_modules/semver/semver.js b/node_modules/semver-diff/node_modules/semver/semver.js new file mode 100644 index 000000000..636fa4365 --- /dev/null +++ b/node_modules/semver-diff/node_modules/semver/semver.js @@ -0,0 +1,1596 @@ +exports = module.exports = SemVer + +var debug +/* istanbul ignore next */ +if (typeof process === 'object' && + process.env && + process.env.NODE_DEBUG && + /\bsemver\b/i.test(process.env.NODE_DEBUG)) { + debug = function () { + var args = Array.prototype.slice.call(arguments, 0) + args.unshift('SEMVER') + console.log.apply(console, args) + } +} else { + debug = function () {} +} + +// Note: this is the semver.org version of the spec that it implements +// Not necessarily the package version of this code. +exports.SEMVER_SPEC_VERSION = '2.0.0' + +var MAX_LENGTH = 256 +var MAX_SAFE_INTEGER = Number.MAX_SAFE_INTEGER || + /* istanbul ignore next */ 9007199254740991 + +// Max safe segment length for coercion. +var MAX_SAFE_COMPONENT_LENGTH = 16 + +// The actual regexps go on exports.re +var re = exports.re = [] +var src = exports.src = [] +var t = exports.tokens = {} +var R = 0 + +function tok (n) { + t[n] = R++ +} + +// The following Regular Expressions can be used for tokenizing, +// validating, and parsing SemVer version strings. + +// ## Numeric Identifier +// A single `0`, or a non-zero digit followed by zero or more digits. + +tok('NUMERICIDENTIFIER') +src[t.NUMERICIDENTIFIER] = '0|[1-9]\\d*' +tok('NUMERICIDENTIFIERLOOSE') +src[t.NUMERICIDENTIFIERLOOSE] = '[0-9]+' + +// ## Non-numeric Identifier +// Zero or more digits, followed by a letter or hyphen, and then zero or +// more letters, digits, or hyphens. + +tok('NONNUMERICIDENTIFIER') +src[t.NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*' + +// ## Main Version +// Three dot-separated numeric identifiers. + +tok('MAINVERSION') +src[t.MAINVERSION] = '(' + src[t.NUMERICIDENTIFIER] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIER] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIER] + ')' + +tok('MAINVERSIONLOOSE') +src[t.MAINVERSIONLOOSE] = '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[t.NUMERICIDENTIFIERLOOSE] + ')' + +// ## Pre-release Version Identifier +// A numeric identifier, or a non-numeric identifier. + +tok('PRERELEASEIDENTIFIER') +src[t.PRERELEASEIDENTIFIER] = '(?:' + src[t.NUMERICIDENTIFIER] + + '|' + src[t.NONNUMERICIDENTIFIER] + ')' + +tok('PRERELEASEIDENTIFIERLOOSE') +src[t.PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[t.NUMERICIDENTIFIERLOOSE] + + '|' + src[t.NONNUMERICIDENTIFIER] + ')' + +// ## Pre-release Version +// Hyphen, followed by one or more dot-separated pre-release version +// identifiers. + +tok('PRERELEASE') +src[t.PRERELEASE] = '(?:-(' + src[t.PRERELEASEIDENTIFIER] + + '(?:\\.' + src[t.PRERELEASEIDENTIFIER] + ')*))' + +tok('PRERELEASELOOSE') +src[t.PRERELEASELOOSE] = '(?:-?(' + src[t.PRERELEASEIDENTIFIERLOOSE] + + '(?:\\.' + src[t.PRERELEASEIDENTIFIERLOOSE] + ')*))' + +// ## Build Metadata Identifier +// Any combination of digits, letters, or hyphens. + +tok('BUILDIDENTIFIER') +src[t.BUILDIDENTIFIER] = '[0-9A-Za-z-]+' + +// ## Build Metadata +// Plus sign, followed by one or more period-separated build metadata +// identifiers. + +tok('BUILD') +src[t.BUILD] = '(?:\\+(' + src[t.BUILDIDENTIFIER] + + '(?:\\.' + src[t.BUILDIDENTIFIER] + ')*))' + +// ## Full Version String +// A main version, followed optionally by a pre-release version and +// build metadata. 
+ +// Note that the only major, minor, patch, and pre-release sections of +// the version string are capturing groups. The build metadata is not a +// capturing group, because it should not ever be used in version +// comparison. + +tok('FULL') +tok('FULLPLAIN') +src[t.FULLPLAIN] = 'v?' + src[t.MAINVERSION] + + src[t.PRERELEASE] + '?' + + src[t.BUILD] + '?' + +src[t.FULL] = '^' + src[t.FULLPLAIN] + '$' + +// like full, but allows v1.2.3 and =1.2.3, which people do sometimes. +// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty +// common in the npm registry. +tok('LOOSEPLAIN') +src[t.LOOSEPLAIN] = '[v=\\s]*' + src[t.MAINVERSIONLOOSE] + + src[t.PRERELEASELOOSE] + '?' + + src[t.BUILD] + '?' + +tok('LOOSE') +src[t.LOOSE] = '^' + src[t.LOOSEPLAIN] + '$' + +tok('GTLT') +src[t.GTLT] = '((?:<|>)?=?)' + +// Something like "2.*" or "1.2.x". +// Note that "x.x" is a valid xRange identifer, meaning "any version" +// Only the first item is strictly required. +tok('XRANGEIDENTIFIERLOOSE') +src[t.XRANGEIDENTIFIERLOOSE] = src[t.NUMERICIDENTIFIERLOOSE] + '|x|X|\\*' +tok('XRANGEIDENTIFIER') +src[t.XRANGEIDENTIFIER] = src[t.NUMERICIDENTIFIER] + '|x|X|\\*' + +tok('XRANGEPLAIN') +src[t.XRANGEPLAIN] = '[v=\\s]*(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIER] + ')' + + '(?:' + src[t.PRERELEASE] + ')?' + + src[t.BUILD] + '?' + + ')?)?' + +tok('XRANGEPLAINLOOSE') +src[t.XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[t.XRANGEIDENTIFIERLOOSE] + ')' + + '(?:' + src[t.PRERELEASELOOSE] + ')?' + + src[t.BUILD] + '?' + + ')?)?' + +tok('XRANGE') +src[t.XRANGE] = '^' + src[t.GTLT] + '\\s*' + src[t.XRANGEPLAIN] + '$' +tok('XRANGELOOSE') +src[t.XRANGELOOSE] = '^' + src[t.GTLT] + '\\s*' + src[t.XRANGEPLAINLOOSE] + '$' + +// Coercion. +// Extract anything that could conceivably be a part of a valid semver +tok('COERCE') +src[t.COERCE] = '(^|[^\\d])' + + '(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '})' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:$|[^\\d])' +tok('COERCERTL') +re[t.COERCERTL] = new RegExp(src[t.COERCE], 'g') + +// Tilde ranges. +// Meaning is "reasonably at or greater than" +tok('LONETILDE') +src[t.LONETILDE] = '(?:~>?)' + +tok('TILDETRIM') +src[t.TILDETRIM] = '(\\s*)' + src[t.LONETILDE] + '\\s+' +re[t.TILDETRIM] = new RegExp(src[t.TILDETRIM], 'g') +var tildeTrimReplace = '$1~' + +tok('TILDE') +src[t.TILDE] = '^' + src[t.LONETILDE] + src[t.XRANGEPLAIN] + '$' +tok('TILDELOOSE') +src[t.TILDELOOSE] = '^' + src[t.LONETILDE] + src[t.XRANGEPLAINLOOSE] + '$' + +// Caret ranges. 
+// Meaning is "at least and backwards compatible with" +tok('LONECARET') +src[t.LONECARET] = '(?:\\^)' + +tok('CARETTRIM') +src[t.CARETTRIM] = '(\\s*)' + src[t.LONECARET] + '\\s+' +re[t.CARETTRIM] = new RegExp(src[t.CARETTRIM], 'g') +var caretTrimReplace = '$1^' + +tok('CARET') +src[t.CARET] = '^' + src[t.LONECARET] + src[t.XRANGEPLAIN] + '$' +tok('CARETLOOSE') +src[t.CARETLOOSE] = '^' + src[t.LONECARET] + src[t.XRANGEPLAINLOOSE] + '$' + +// A simple gt/lt/eq thing, or just "" to indicate "any version" +tok('COMPARATORLOOSE') +src[t.COMPARATORLOOSE] = '^' + src[t.GTLT] + '\\s*(' + src[t.LOOSEPLAIN] + ')$|^$' +tok('COMPARATOR') +src[t.COMPARATOR] = '^' + src[t.GTLT] + '\\s*(' + src[t.FULLPLAIN] + ')$|^$' + +// An expression to strip any whitespace between the gtlt and the thing +// it modifies, so that `> 1.2.3` ==> `>1.2.3` +tok('COMPARATORTRIM') +src[t.COMPARATORTRIM] = '(\\s*)' + src[t.GTLT] + + '\\s*(' + src[t.LOOSEPLAIN] + '|' + src[t.XRANGEPLAIN] + ')' + +// this one has to use the /g flag +re[t.COMPARATORTRIM] = new RegExp(src[t.COMPARATORTRIM], 'g') +var comparatorTrimReplace = '$1$2$3' + +// Something like `1.2.3 - 1.2.4` +// Note that these all use the loose form, because they'll be +// checked against either the strict or loose comparator form +// later. +tok('HYPHENRANGE') +src[t.HYPHENRANGE] = '^\\s*(' + src[t.XRANGEPLAIN] + ')' + + '\\s+-\\s+' + + '(' + src[t.XRANGEPLAIN] + ')' + + '\\s*$' + +tok('HYPHENRANGELOOSE') +src[t.HYPHENRANGELOOSE] = '^\\s*(' + src[t.XRANGEPLAINLOOSE] + ')' + + '\\s+-\\s+' + + '(' + src[t.XRANGEPLAINLOOSE] + ')' + + '\\s*$' + +// Star ranges basically just allow anything at all. +tok('STAR') +src[t.STAR] = '(<|>)?=?\\s*\\*' + +// Compile to actual regexp objects. +// All are flag-free, unless they were created above with a flag. +for (var i = 0; i < R; i++) { + debug(i, src[i]) + if (!re[i]) { + re[i] = new RegExp(src[i]) + } +} + +exports.parse = parse +function parse (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (version instanceof SemVer) { + return version + } + + if (typeof version !== 'string') { + return null + } + + if (version.length > MAX_LENGTH) { + return null + } + + var r = options.loose ? re[t.LOOSE] : re[t.FULL] + if (!r.test(version)) { + return null + } + + try { + return new SemVer(version, options) + } catch (er) { + return null + } +} + +exports.valid = valid +function valid (version, options) { + var v = parse(version, options) + return v ? v.version : null +} + +exports.clean = clean +function clean (version, options) { + var s = parse(version.trim().replace(/^[=v]+/, ''), options) + return s ? s.version : null +} + +exports.SemVer = SemVer + +function SemVer (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + if (version instanceof SemVer) { + if (version.loose === options.loose) { + return version + } else { + version = version.version + } + } else if (typeof version !== 'string') { + throw new TypeError('Invalid Version: ' + version) + } + + if (version.length > MAX_LENGTH) { + throw new TypeError('version is longer than ' + MAX_LENGTH + ' characters') + } + + if (!(this instanceof SemVer)) { + return new SemVer(version, options) + } + + debug('SemVer', version, options) + this.options = options + this.loose = !!options.loose + + var m = version.trim().match(options.loose ? 
re[t.LOOSE] : re[t.FULL]) + + if (!m) { + throw new TypeError('Invalid Version: ' + version) + } + + this.raw = version + + // these are actually numbers + this.major = +m[1] + this.minor = +m[2] + this.patch = +m[3] + + if (this.major > MAX_SAFE_INTEGER || this.major < 0) { + throw new TypeError('Invalid major version') + } + + if (this.minor > MAX_SAFE_INTEGER || this.minor < 0) { + throw new TypeError('Invalid minor version') + } + + if (this.patch > MAX_SAFE_INTEGER || this.patch < 0) { + throw new TypeError('Invalid patch version') + } + + // numberify any prerelease numeric ids + if (!m[4]) { + this.prerelease = [] + } else { + this.prerelease = m[4].split('.').map(function (id) { + if (/^[0-9]+$/.test(id)) { + var num = +id + if (num >= 0 && num < MAX_SAFE_INTEGER) { + return num + } + } + return id + }) + } + + this.build = m[5] ? m[5].split('.') : [] + this.format() +} + +SemVer.prototype.format = function () { + this.version = this.major + '.' + this.minor + '.' + this.patch + if (this.prerelease.length) { + this.version += '-' + this.prerelease.join('.') + } + return this.version +} + +SemVer.prototype.toString = function () { + return this.version +} + +SemVer.prototype.compare = function (other) { + debug('SemVer.compare', this.version, this.options, other) + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return this.compareMain(other) || this.comparePre(other) +} + +SemVer.prototype.compareMain = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return compareIdentifiers(this.major, other.major) || + compareIdentifiers(this.minor, other.minor) || + compareIdentifiers(this.patch, other.patch) +} + +SemVer.prototype.comparePre = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + // NOT having a prerelease is > having one + if (this.prerelease.length && !other.prerelease.length) { + return -1 + } else if (!this.prerelease.length && other.prerelease.length) { + return 1 + } else if (!this.prerelease.length && !other.prerelease.length) { + return 0 + } + + var i = 0 + do { + var a = this.prerelease[i] + var b = other.prerelease[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +SemVer.prototype.compareBuild = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + var i = 0 + do { + var a = this.build[i] + var b = other.build[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +// preminor will bump the version up to the next minor release, and immediately +// down to pre-release. premajor and prepatch work the same way. 
+SemVer.prototype.inc = function (release, identifier) { + switch (release) { + case 'premajor': + this.prerelease.length = 0 + this.patch = 0 + this.minor = 0 + this.major++ + this.inc('pre', identifier) + break + case 'preminor': + this.prerelease.length = 0 + this.patch = 0 + this.minor++ + this.inc('pre', identifier) + break + case 'prepatch': + // If this is already a prerelease, it will bump to the next version + // drop any prereleases that might already exist, since they are not + // relevant at this point. + this.prerelease.length = 0 + this.inc('patch', identifier) + this.inc('pre', identifier) + break + // If the input is a non-prerelease version, this acts the same as + // prepatch. + case 'prerelease': + if (this.prerelease.length === 0) { + this.inc('patch', identifier) + } + this.inc('pre', identifier) + break + + case 'major': + // If this is a pre-major version, bump up to the same major version. + // Otherwise increment major. + // 1.0.0-5 bumps to 1.0.0 + // 1.1.0 bumps to 2.0.0 + if (this.minor !== 0 || + this.patch !== 0 || + this.prerelease.length === 0) { + this.major++ + } + this.minor = 0 + this.patch = 0 + this.prerelease = [] + break + case 'minor': + // If this is a pre-minor version, bump up to the same minor version. + // Otherwise increment minor. + // 1.2.0-5 bumps to 1.2.0 + // 1.2.1 bumps to 1.3.0 + if (this.patch !== 0 || this.prerelease.length === 0) { + this.minor++ + } + this.patch = 0 + this.prerelease = [] + break + case 'patch': + // If this is not a pre-release version, it will increment the patch. + // If it is a pre-release it will bump up to the same patch version. + // 1.2.0-5 patches to 1.2.0 + // 1.2.0 patches to 1.2.1 + if (this.prerelease.length === 0) { + this.patch++ + } + this.prerelease = [] + break + // This probably shouldn't be used publicly. + // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. 
+ case 'pre': + if (this.prerelease.length === 0) { + this.prerelease = [0] + } else { + var i = this.prerelease.length + while (--i >= 0) { + if (typeof this.prerelease[i] === 'number') { + this.prerelease[i]++ + i = -2 + } + } + if (i === -1) { + // didn't increment anything + this.prerelease.push(0) + } + } + if (identifier) { + // 1.2.0-beta.1 bumps to 1.2.0-beta.2, + // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 + if (this.prerelease[0] === identifier) { + if (isNaN(this.prerelease[1])) { + this.prerelease = [identifier, 0] + } + } else { + this.prerelease = [identifier, 0] + } + } + break + + default: + throw new Error('invalid increment argument: ' + release) + } + this.format() + this.raw = this.version + return this +} + +exports.inc = inc +function inc (version, release, loose, identifier) { + if (typeof (loose) === 'string') { + identifier = loose + loose = undefined + } + + try { + return new SemVer(version, loose).inc(release, identifier).version + } catch (er) { + return null + } +} + +exports.diff = diff +function diff (version1, version2) { + if (eq(version1, version2)) { + return null + } else { + var v1 = parse(version1) + var v2 = parse(version2) + var prefix = '' + if (v1.prerelease.length || v2.prerelease.length) { + prefix = 'pre' + var defaultResult = 'prerelease' + } + for (var key in v1) { + if (key === 'major' || key === 'minor' || key === 'patch') { + if (v1[key] !== v2[key]) { + return prefix + key + } + } + } + return defaultResult // may be undefined + } +} + +exports.compareIdentifiers = compareIdentifiers + +var numeric = /^[0-9]+$/ +function compareIdentifiers (a, b) { + var anum = numeric.test(a) + var bnum = numeric.test(b) + + if (anum && bnum) { + a = +a + b = +b + } + + return a === b ? 0 + : (anum && !bnum) ? -1 + : (bnum && !anum) ? 1 + : a < b ? 
-1 + : 1 +} + +exports.rcompareIdentifiers = rcompareIdentifiers +function rcompareIdentifiers (a, b) { + return compareIdentifiers(b, a) +} + +exports.major = major +function major (a, loose) { + return new SemVer(a, loose).major +} + +exports.minor = minor +function minor (a, loose) { + return new SemVer(a, loose).minor +} + +exports.patch = patch +function patch (a, loose) { + return new SemVer(a, loose).patch +} + +exports.compare = compare +function compare (a, b, loose) { + return new SemVer(a, loose).compare(new SemVer(b, loose)) +} + +exports.compareLoose = compareLoose +function compareLoose (a, b) { + return compare(a, b, true) +} + +exports.compareBuild = compareBuild +function compareBuild (a, b, loose) { + var versionA = new SemVer(a, loose) + var versionB = new SemVer(b, loose) + return versionA.compare(versionB) || versionA.compareBuild(versionB) +} + +exports.rcompare = rcompare +function rcompare (a, b, loose) { + return compare(b, a, loose) +} + +exports.sort = sort +function sort (list, loose) { + return list.sort(function (a, b) { + return exports.compareBuild(a, b, loose) + }) +} + +exports.rsort = rsort +function rsort (list, loose) { + return list.sort(function (a, b) { + return exports.compareBuild(b, a, loose) + }) +} + +exports.gt = gt +function gt (a, b, loose) { + return compare(a, b, loose) > 0 +} + +exports.lt = lt +function lt (a, b, loose) { + return compare(a, b, loose) < 0 +} + +exports.eq = eq +function eq (a, b, loose) { + return compare(a, b, loose) === 0 +} + +exports.neq = neq +function neq (a, b, loose) { + return compare(a, b, loose) !== 0 +} + +exports.gte = gte +function gte (a, b, loose) { + return compare(a, b, loose) >= 0 +} + +exports.lte = lte +function lte (a, b, loose) { + return compare(a, b, loose) <= 0 +} + +exports.cmp = cmp +function cmp (a, op, b, loose) { + switch (op) { + case '===': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a === b + + case '!==': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a !== b + + case '': + case '=': + case '==': + return eq(a, b, loose) + + case '!=': + return neq(a, b, loose) + + case '>': + return gt(a, b, loose) + + case '>=': + return gte(a, b, loose) + + case '<': + return lt(a, b, loose) + + case '<=': + return lte(a, b, loose) + + default: + throw new TypeError('Invalid operator: ' + op) + } +} + +exports.Comparator = Comparator +function Comparator (comp, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (comp instanceof Comparator) { + if (comp.loose === !!options.loose) { + return comp + } else { + comp = comp.value + } + } + + if (!(this instanceof Comparator)) { + return new Comparator(comp, options) + } + + debug('comparator', comp, options) + this.options = options + this.loose = !!options.loose + this.parse(comp) + + if (this.semver === ANY) { + this.value = '' + } else { + this.value = this.operator + this.semver.version + } + + debug('comp', this) +} + +var ANY = {} +Comparator.prototype.parse = function (comp) { + var r = this.options.loose ? re[t.COMPARATORLOOSE] : re[t.COMPARATOR] + var m = comp.match(r) + + if (!m) { + throw new TypeError('Invalid comparator: ' + comp) + } + + this.operator = m[1] !== undefined ? m[1] : '' + if (this.operator === '=') { + this.operator = '' + } + + // if it literally is just '>' or '' then allow anything. 
+ if (!m[2]) { + this.semver = ANY + } else { + this.semver = new SemVer(m[2], this.options.loose) + } +} + +Comparator.prototype.toString = function () { + return this.value +} + +Comparator.prototype.test = function (version) { + debug('Comparator.test', version, this.options.loose) + + if (this.semver === ANY || version === ANY) { + return true + } + + if (typeof version === 'string') { + try { + version = new SemVer(version, this.options) + } catch (er) { + return false + } + } + + return cmp(version, this.operator, this.semver, this.options) +} + +Comparator.prototype.intersects = function (comp, options) { + if (!(comp instanceof Comparator)) { + throw new TypeError('a Comparator is required') + } + + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + var rangeTmp + + if (this.operator === '') { + if (this.value === '') { + return true + } + rangeTmp = new Range(comp.value, options) + return satisfies(this.value, rangeTmp, options) + } else if (comp.operator === '') { + if (comp.value === '') { + return true + } + rangeTmp = new Range(this.value, options) + return satisfies(comp.semver, rangeTmp, options) + } + + var sameDirectionIncreasing = + (this.operator === '>=' || this.operator === '>') && + (comp.operator === '>=' || comp.operator === '>') + var sameDirectionDecreasing = + (this.operator === '<=' || this.operator === '<') && + (comp.operator === '<=' || comp.operator === '<') + var sameSemVer = this.semver.version === comp.semver.version + var differentDirectionsInclusive = + (this.operator === '>=' || this.operator === '<=') && + (comp.operator === '>=' || comp.operator === '<=') + var oppositeDirectionsLessThan = + cmp(this.semver, '<', comp.semver, options) && + ((this.operator === '>=' || this.operator === '>') && + (comp.operator === '<=' || comp.operator === '<')) + var oppositeDirectionsGreaterThan = + cmp(this.semver, '>', comp.semver, options) && + ((this.operator === '<=' || this.operator === '<') && + (comp.operator === '>=' || comp.operator === '>')) + + return sameDirectionIncreasing || sameDirectionDecreasing || + (sameSemVer && differentDirectionsInclusive) || + oppositeDirectionsLessThan || oppositeDirectionsGreaterThan +} + +exports.Range = Range +function Range (range, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (range instanceof Range) { + if (range.loose === !!options.loose && + range.includePrerelease === !!options.includePrerelease) { + return range + } else { + return new Range(range.raw, options) + } + } + + if (range instanceof Comparator) { + return new Range(range.value, options) + } + + if (!(this instanceof Range)) { + return new Range(range, options) + } + + this.options = options + this.loose = !!options.loose + this.includePrerelease = !!options.includePrerelease + + // First, split based on boolean or || + this.raw = range + this.set = range.split(/\s*\|\|\s*/).map(function (range) { + return this.parseRange(range.trim()) + }, this).filter(function (c) { + // throw out any that are not relevant for whatever reason + return c.length + }) + + if (!this.set.length) { + throw new TypeError('Invalid SemVer Range: ' + range) + } + + this.format() +} + +Range.prototype.format = function () { + this.range = this.set.map(function (comps) { + return comps.join(' ').trim() + }).join('||').trim() + return this.range +} + +Range.prototype.toString = function () { + return this.range +} + 
+Range.prototype.parseRange = function (range) { + var loose = this.options.loose + range = range.trim() + // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` + var hr = loose ? re[t.HYPHENRANGELOOSE] : re[t.HYPHENRANGE] + range = range.replace(hr, hyphenReplace) + debug('hyphen replace', range) + // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` + range = range.replace(re[t.COMPARATORTRIM], comparatorTrimReplace) + debug('comparator trim', range, re[t.COMPARATORTRIM]) + + // `~ 1.2.3` => `~1.2.3` + range = range.replace(re[t.TILDETRIM], tildeTrimReplace) + + // `^ 1.2.3` => `^1.2.3` + range = range.replace(re[t.CARETTRIM], caretTrimReplace) + + // normalize spaces + range = range.split(/\s+/).join(' ') + + // At this point, the range is completely trimmed and + // ready to be split into comparators. + + var compRe = loose ? re[t.COMPARATORLOOSE] : re[t.COMPARATOR] + var set = range.split(' ').map(function (comp) { + return parseComparator(comp, this.options) + }, this).join(' ').split(/\s+/) + if (this.options.loose) { + // in loose mode, throw out any that are not valid comparators + set = set.filter(function (comp) { + return !!comp.match(compRe) + }) + } + set = set.map(function (comp) { + return new Comparator(comp, this.options) + }, this) + + return set +} + +Range.prototype.intersects = function (range, options) { + if (!(range instanceof Range)) { + throw new TypeError('a Range is required') + } + + return this.set.some(function (thisComparators) { + return ( + isSatisfiable(thisComparators, options) && + range.set.some(function (rangeComparators) { + return ( + isSatisfiable(rangeComparators, options) && + thisComparators.every(function (thisComparator) { + return rangeComparators.every(function (rangeComparator) { + return thisComparator.intersects(rangeComparator, options) + }) + }) + ) + }) + ) + }) +} + +// take a set of comparators and determine whether there +// exists a version which can satisfy it +function isSatisfiable (comparators, options) { + var result = true + var remainingComparators = comparators.slice() + var testComparator = remainingComparators.pop() + + while (result && remainingComparators.length) { + result = remainingComparators.every(function (otherComparator) { + return testComparator.intersects(otherComparator, options) + }) + + testComparator = remainingComparators.pop() + } + + return result +} + +// Mostly just for testing and legacy API reasons +exports.toComparators = toComparators +function toComparators (range, options) { + return new Range(range, options).set.map(function (comp) { + return comp.map(function (c) { + return c.value + }).join(' ').trim().split(' ') + }) +} + +// comprised of xranges, tildes, stars, and gtlt's at this point. +// already replaced the hyphen ranges +// turn into a set of JUST comparators. 
+function parseComparator (comp, options) { + debug('comp', comp, options) + comp = replaceCarets(comp, options) + debug('caret', comp) + comp = replaceTildes(comp, options) + debug('tildes', comp) + comp = replaceXRanges(comp, options) + debug('xrange', comp) + comp = replaceStars(comp, options) + debug('stars', comp) + return comp +} + +function isX (id) { + return !id || id.toLowerCase() === 'x' || id === '*' +} + +// ~, ~> --> * (any, kinda silly) +// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 +// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 +// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 +// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 +// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 +function replaceTildes (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceTilde(comp, options) + }).join(' ') +} + +function replaceTilde (comp, options) { + var r = options.loose ? re[t.TILDELOOSE] : re[t.TILDE] + return comp.replace(r, function (_, M, m, p, pr) { + debug('tilde', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + // ~1.2 == >=1.2.0 <1.3.0 + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else if (pr) { + debug('replaceTilde pr', pr) + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } else { + // ~1.2.3 == >=1.2.3 <1.3.0 + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + + debug('tilde return', ret) + return ret + }) +} + +// ^ --> * (any, kinda silly) +// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 +// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 +// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 +// ^1.2.3 --> >=1.2.3 <2.0.0 +// ^1.2.0 --> >=1.2.0 <2.0.0 +function replaceCarets (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceCaret(comp, options) + }).join(' ') +} + +function replaceCaret (comp, options) { + debug('caret', comp, options) + var r = options.loose ? re[t.CARETLOOSE] : re[t.CARET] + return comp.replace(r, function (_, M, m, p, pr) { + debug('caret', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + if (M === '0') { + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else { + ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0' + } + } else if (pr) { + debug('replaceCaret pr', pr) + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + (+M + 1) + '.0.0' + } + } else { + debug('no pr') + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + (+M + 1) + '.0.0' + } + } + + debug('caret return', ret) + return ret + }) +} + +function replaceXRanges (comp, options) { + debug('replaceXRanges', comp, options) + return comp.split(/\s+/).map(function (comp) { + return replaceXRange(comp, options) + }).join(' ') +} + +function replaceXRange (comp, options) { + comp = comp.trim() + var r = options.loose ? 
re[t.XRANGELOOSE] : re[t.XRANGE] + return comp.replace(r, function (ret, gtlt, M, m, p, pr) { + debug('xRange', comp, ret, gtlt, M, m, p, pr) + var xM = isX(M) + var xm = xM || isX(m) + var xp = xm || isX(p) + var anyX = xp + + if (gtlt === '=' && anyX) { + gtlt = '' + } + + // if we're including prereleases in the match, then we need + // to fix this to -0, the lowest possible prerelease value + pr = options.includePrerelease ? '-0' : '' + + if (xM) { + if (gtlt === '>' || gtlt === '<') { + // nothing is allowed + ret = '<0.0.0-0' + } else { + // nothing is forbidden + ret = '*' + } + } else if (gtlt && anyX) { + // we know patch is an x, because we have any x at all. + // replace X with 0 + if (xm) { + m = 0 + } + p = 0 + + if (gtlt === '>') { + // >1 => >=2.0.0 + // >1.2 => >=1.3.0 + // >1.2.3 => >= 1.2.4 + gtlt = '>=' + if (xm) { + M = +M + 1 + m = 0 + p = 0 + } else { + m = +m + 1 + p = 0 + } + } else if (gtlt === '<=') { + // <=0.7.x is actually <0.8.0, since any 0.7.x should + // pass. Similarly, <=7.x is actually <8.0.0, etc. + gtlt = '<' + if (xm) { + M = +M + 1 + } else { + m = +m + 1 + } + } + + ret = gtlt + M + '.' + m + '.' + p + pr + } else if (xm) { + ret = '>=' + M + '.0.0' + pr + ' <' + (+M + 1) + '.0.0' + pr + } else if (xp) { + ret = '>=' + M + '.' + m + '.0' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + pr + } + + debug('xRange return', ret) + + return ret + }) +} + +// Because * is AND-ed with everything else in the comparator, +// and '' means "any version", just remove the *s entirely. +function replaceStars (comp, options) { + debug('replaceStars', comp, options) + // Looseness is ignored here. star is always as loose as it gets! + return comp.trim().replace(re[t.STAR], '') +} + +// This function is passed to string.replace(re[t.HYPHENRANGE]) +// M, m, patch, prerelease, build +// 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 +// 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do +// 1.2 - 3.4 => >=1.2.0 <3.5.0 +function hyphenReplace ($0, + from, fM, fm, fp, fpr, fb, + to, tM, tm, tp, tpr, tb) { + if (isX(fM)) { + from = '' + } else if (isX(fm)) { + from = '>=' + fM + '.0.0' + } else if (isX(fp)) { + from = '>=' + fM + '.' + fm + '.0' + } else { + from = '>=' + from + } + + if (isX(tM)) { + to = '' + } else if (isX(tm)) { + to = '<' + (+tM + 1) + '.0.0' + } else if (isX(tp)) { + to = '<' + tM + '.' + (+tm + 1) + '.0' + } else if (tpr) { + to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr + } else { + to = '<=' + to + } + + return (from + ' ' + to).trim() +} + +// if ANY of the sets match ALL of its comparators, then pass +Range.prototype.test = function (version) { + if (!version) { + return false + } + + if (typeof version === 'string') { + try { + version = new SemVer(version, this.options) + } catch (er) { + return false + } + } + + for (var i = 0; i < this.set.length; i++) { + if (testSet(this.set[i], version, this.options)) { + return true + } + } + return false +} + +function testSet (set, version, options) { + for (var i = 0; i < set.length; i++) { + if (!set[i].test(version)) { + return false + } + } + + if (version.prerelease.length && !options.includePrerelease) { + // Find the set of versions that are allowed to have prereleases + // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 + // That should allow `1.2.3-pr.2` to pass. + // However, `1.2.4-alpha.notready` should NOT be allowed, + // even though it's within the range set by the comparators. 
+ for (i = 0; i < set.length; i++) { + debug(set[i].semver) + if (set[i].semver === ANY) { + continue + } + + if (set[i].semver.prerelease.length > 0) { + var allowed = set[i].semver + if (allowed.major === version.major && + allowed.minor === version.minor && + allowed.patch === version.patch) { + return true + } + } + } + + // Version has a -pre, but it's not one of the ones we like. + return false + } + + return true +} + +exports.satisfies = satisfies +function satisfies (version, range, options) { + try { + range = new Range(range, options) + } catch (er) { + return false + } + return range.test(version) +} + +exports.maxSatisfying = maxSatisfying +function maxSatisfying (versions, range, options) { + var max = null + var maxSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!max || maxSV.compare(v) === -1) { + // compare(max, v, true) + max = v + maxSV = new SemVer(max, options) + } + } + }) + return max +} + +exports.minSatisfying = minSatisfying +function minSatisfying (versions, range, options) { + var min = null + var minSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!min || minSV.compare(v) === 1) { + // compare(min, v, true) + min = v + minSV = new SemVer(min, options) + } + } + }) + return min +} + +exports.minVersion = minVersion +function minVersion (range, loose) { + range = new Range(range, loose) + + var minver = new SemVer('0.0.0') + if (range.test(minver)) { + return minver + } + + minver = new SemVer('0.0.0-0') + if (range.test(minver)) { + return minver + } + + minver = null + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + comparators.forEach(function (comparator) { + // Clone to avoid manipulating the comparator's semver object. + var compver = new SemVer(comparator.semver.version) + switch (comparator.operator) { + case '>': + if (compver.prerelease.length === 0) { + compver.patch++ + } else { + compver.prerelease.push(0) + } + compver.raw = compver.format() + /* fallthrough */ + case '': + case '>=': + if (!minver || gt(minver, compver)) { + minver = compver + } + break + case '<': + case '<=': + /* Ignore maximum versions */ + break + /* istanbul ignore next */ + default: + throw new Error('Unexpected operation: ' + comparator.operator) + } + }) + } + + if (minver && range.test(minver)) { + return minver + } + + return null +} + +exports.validRange = validRange +function validRange (range, options) { + try { + // Return '*' instead of '' so that truthiness works. + // This will throw if it's invalid anyway + return new Range(range, options).range || '*' + } catch (er) { + return null + } +} + +// Determine if version is less than all the versions possible in the range +exports.ltr = ltr +function ltr (version, range, options) { + return outside(version, range, '<', options) +} + +// Determine if version is greater than all the versions possible in the range. 
+exports.gtr = gtr +function gtr (version, range, options) { + return outside(version, range, '>', options) +} + +exports.outside = outside +function outside (version, range, hilo, options) { + version = new SemVer(version, options) + range = new Range(range, options) + + var gtfn, ltefn, ltfn, comp, ecomp + switch (hilo) { + case '>': + gtfn = gt + ltefn = lte + ltfn = lt + comp = '>' + ecomp = '>=' + break + case '<': + gtfn = lt + ltefn = gte + ltfn = gt + comp = '<' + ecomp = '<=' + break + default: + throw new TypeError('Must provide a hilo val of "<" or ">"') + } + + // If it satisifes the range it is not outside + if (satisfies(version, range, options)) { + return false + } + + // From now on, variable terms are as if we're in "gtr" mode. + // but note that everything is flipped for the "ltr" function. + + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + var high = null + var low = null + + comparators.forEach(function (comparator) { + if (comparator.semver === ANY) { + comparator = new Comparator('>=0.0.0') + } + high = high || comparator + low = low || comparator + if (gtfn(comparator.semver, high.semver, options)) { + high = comparator + } else if (ltfn(comparator.semver, low.semver, options)) { + low = comparator + } + }) + + // If the edge version comparator has a operator then our version + // isn't outside it + if (high.operator === comp || high.operator === ecomp) { + return false + } + + // If the lowest version comparator has an operator and our version + // is less than it then it isn't higher than the range + if ((!low.operator || low.operator === comp) && + ltefn(version, low.semver)) { + return false + } else if (low.operator === ecomp && ltfn(version, low.semver)) { + return false + } + } + return true +} + +exports.prerelease = prerelease +function prerelease (version, options) { + var parsed = parse(version, options) + return (parsed && parsed.prerelease.length) ? parsed.prerelease : null +} + +exports.intersects = intersects +function intersects (r1, r2, options) { + r1 = new Range(r1, options) + r2 = new Range(r2, options) + return r1.intersects(r2) +} + +exports.coerce = coerce +function coerce (version, options) { + if (version instanceof SemVer) { + return version + } + + if (typeof version === 'number') { + version = String(version) + } + + if (typeof version !== 'string') { + return null + } + + options = options || {} + + var match = null + if (!options.rtl) { + match = version.match(re[t.COERCE]) + } else { + // Find the right-most coercible string that does not share + // a terminus with a more left-ward coercible string. + // Eg, '1.2.3.4' wants to coerce '2.3.4', not '3.4' or '4' + // + // Walk through the string checking with a /g regexp + // Manually set the index so as to pick up overlapping matches. + // Stop when we get a match that ends at the string end, since no + // coercible string can be more right-ward without the same terminus. + var next + while ((next = re[t.COERCERTL].exec(version)) && + (!match || match.index + match[0].length !== version.length) + ) { + if (!match || + next.index + next[0].length !== match.index + match[0].length) { + match = next + } + re[t.COERCERTL].lastIndex = next.index + next[1].length + next[2].length + } + // leave it in a clean state + re[t.COERCERTL].lastIndex = -1 + } + + if (match === null) { + return null + } + + return parse(match[2] + + '.' + (match[3] || '0') + + '.' 
+ (match[4] || '0'), options) +} diff --git a/node_modules/semver-diff/package.json b/node_modules/semver-diff/package.json new file mode 100644 index 000000000..98af8443c --- /dev/null +++ b/node_modules/semver-diff/package.json @@ -0,0 +1,69 @@ +{ + "_from": "semver-diff@^3.1.1", + "_id": "semver-diff@3.1.1", + "_inBundle": false, + "_integrity": "sha512-GX0Ix/CJcHyB8c4ykpHGIAvLyOwOobtM/8d+TQkAd81/bEjgPHrfba41Vpesr7jX/t8Uh+R3EX9eAS5be+jQYg==", + "_location": "/semver-diff", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "semver-diff@^3.1.1", + "name": "semver-diff", + "escapedName": "semver-diff", + "rawSpec": "^3.1.1", + "saveSpec": null, + "fetchSpec": "^3.1.1" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/semver-diff/-/semver-diff-3.1.1.tgz", + "_shasum": "05f77ce59f325e00e2706afd67bb506ddb1ca32b", + "_spec": "semver-diff@^3.1.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/semver-diff/issues" + }, + "bundleDependencies": false, + "dependencies": { + "semver": "^6.3.0" + }, + "deprecated": false, + "description": "Get the diff type of two semver versions: 0.0.1 0.0.2 → patch", + "devDependencies": { + "ava": "^2.4.0", + "tsd": "^0.9.0", + "xo": "^0.25.3" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/semver-diff#readme", + "keywords": [ + "semver", + "version", + "semantic", + "diff", + "difference" + ], + "license": "MIT", + "name": "semver-diff", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/semver-diff.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "3.1.1" +} diff --git a/node_modules/semver-diff/readme.md b/node_modules/semver-diff/readme.md new file mode 100644 index 000000000..6e21afff5 --- /dev/null +++ b/node_modules/semver-diff/readme.md @@ -0,0 +1,77 @@ +# semver-diff [![Build Status](https://travis-ci.org/sindresorhus/semver-diff.svg?branch=master)](https://travis-ci.org/sindresorhus/semver-diff) + +> Get the diff type of two [semver](https://github.com/npm/node-semver) versions: `0.0.1 0.0.2` → `patch` + + +## Install + +``` +$ npm install semver-diff +``` + + +## Usage + +```js +const semverDiff = require('semver-diff'); + +semverDiff('1.1.1', '1.1.2'); +//=> 'patch' + +semverDiff('1.1.1-foo', '1.1.2'); +//=> 'prepatch' + +semverDiff('0.0.1', '1.0.0'); +//=> 'major' + +semverDiff('0.0.1-foo', '1.0.0'); +//=> 'premajor' + +semverDiff('0.0.1', '0.1.0'); +//=> 'minor' + +semverDiff('0.0.1-foo', '0.1.0'); +//=> 'preminor' + +semverDiff('0.0.1-foo', '0.0.1-foo.bar'); +//=> 'prerelease' + +semverDiff('0.1.0', '0.1.0+foo'); +//=> 'build' + +semverDiff('0.0.1', '0.0.1'); +//=> undefined + +semverDiff('0.0.2', '0.0.1'); +//=> undefined +``` + + +## API + +### semverDiff(versionA, versionB) + +Returns the difference type between two semver versions, or `undefined` if they're identical or the second one is lower than the first. + +Possible values: `'major'`, `'premajor'`, `'minor'`, `'preminor'`, `'patch'`, `'prepatch'`, `'prerelease'`, `'build'`, `undefined`. 
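As a rough illustration of how this return value tends to be consumed by a package such as `update-notifier` (listed in `_requiredBy` above), here is a small sketch; the `notify` helper and the version numbers are hypothetical and not part of this package:

```js
// Illustrative sketch (not part of the vendored file): choosing a message
// based on the diff type between an installed and a latest version.
const semverDiff = require('semver-diff');

function notify(installed, latest) {
	// 'major', 'premajor', 'minor', 'preminor', 'patch', 'prepatch',
	// 'prerelease', 'build', or undefined (identical / downgrade)
	const type = semverDiff(installed, latest);

	if (type === undefined) {
		return 'Already up to date';
	}
	return `A ${type} update is available: ${installed} → ${latest}`;
}

console.log(notify('1.1.1', '1.1.2')); //=> 'A patch update is available: 1.1.1 → 1.1.2'
console.log(notify('1.1.2', '1.1.2')); //=> 'Already up to date'
```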
+ + +## Related + +- [latest-semver](https://github.com/sindresorhus/latest-semver) - Get the latest stable semver version from an array of versions +- [to-semver](https://github.com/sindresorhus/to-semver) - Get an array of valid, sorted, and cleaned semver versions from an array of strings +- [semver-regex](https://github.com/sindresorhus/semver-regex) - Regular expression for matching semver versions +- [semver-truncate](https://github.com/sindresorhus/semver-truncate) - Truncate a semver version: `1.2.3` → `1.2.0` + + +--- + +
+Get professional support for this package with a Tidelift subscription
+
+Tidelift helps make open source sustainable for maintainers while giving companies
+assurances about security, maintenance, and licensing for their dependencies.
diff --git a/node_modules/semver/CHANGELOG.md b/node_modules/semver/CHANGELOG.md new file mode 100644 index 000000000..66304fdd2 --- /dev/null +++ b/node_modules/semver/CHANGELOG.md @@ -0,0 +1,39 @@ +# changes log + +## 5.7 + +* Add `minVersion` method + +## 5.6 + +* Move boolean `loose` param to an options object, with + backwards-compatibility protection. +* Add ability to opt out of special prerelease version handling with + the `includePrerelease` option flag. + +## 5.5 + +* Add version coercion capabilities + +## 5.4 + +* Add intersection checking + +## 5.3 + +* Add `minSatisfying` method + +## 5.2 + +* Add `prerelease(v)` that returns prerelease components + +## 5.1 + +* Add Backus-Naur for ranges +* Remove excessively cute inspection methods + +## 5.0 + +* Remove AMD/Browserified build artifacts +* Fix ltr and gtr when using the `*` range +* Fix for range `*` with a prerelease identifier diff --git a/node_modules/semver/LICENSE b/node_modules/semver/LICENSE new file mode 100644 index 000000000..19129e315 --- /dev/null +++ b/node_modules/semver/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter and Contributors + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/semver/README.md b/node_modules/semver/README.md new file mode 100644 index 000000000..f8dfa5a0d --- /dev/null +++ b/node_modules/semver/README.md @@ -0,0 +1,412 @@ +semver(1) -- The semantic versioner for npm +=========================================== + +## Install + +```bash +npm install --save semver +```` + +## Usage + +As a node module: + +```js +const semver = require('semver') + +semver.valid('1.2.3') // '1.2.3' +semver.valid('a.b.c') // null +semver.clean(' =v1.2.3 ') // '1.2.3' +semver.satisfies('1.2.3', '1.x || >=2.5.0 || 5.0.0 - 7.2.3') // true +semver.gt('1.2.3', '9.8.7') // false +semver.lt('1.2.3', '9.8.7') // true +semver.minVersion('>=1.0.0') // '1.0.0' +semver.valid(semver.coerce('v2')) // '2.0.0' +semver.valid(semver.coerce('42.6.7.9.3-alpha')) // '42.6.7' +``` + +As a command-line utility: + +``` +$ semver -h + +A JavaScript implementation of the https://semver.org/ specification +Copyright Isaac Z. Schlueter + +Usage: semver [options] [ [...]] +Prints valid versions sorted by SemVer precedence + +Options: +-r --range + Print versions that match the specified range. + +-i --increment [] + Increment a version by the specified level. Level can + be one of: major, minor, patch, premajor, preminor, + prepatch, or prerelease. Default level is 'patch'. + Only one version may be specified. + +--preid + Identifier to be used to prefix premajor, preminor, + prepatch or prerelease version increments. 
+ +-l --loose + Interpret versions and ranges loosely + +-p --include-prerelease + Always include prerelease versions in range matching + +-c --coerce + Coerce a string into SemVer if possible + (does not imply --loose) + +Program exits successfully if any valid version satisfies +all supplied ranges, and prints all satisfying versions. + +If no satisfying versions are found, then exits failure. + +Versions are printed in ascending order, so supplying +multiple versions to the utility will just sort them. +``` + +## Versions + +A "version" is described by the `v2.0.0` specification found at +. + +A leading `"="` or `"v"` character is stripped off and ignored. + +## Ranges + +A `version range` is a set of `comparators` which specify versions +that satisfy the range. + +A `comparator` is composed of an `operator` and a `version`. The set +of primitive `operators` is: + +* `<` Less than +* `<=` Less than or equal to +* `>` Greater than +* `>=` Greater than or equal to +* `=` Equal. If no operator is specified, then equality is assumed, + so this operator is optional, but MAY be included. + +For example, the comparator `>=1.2.7` would match the versions +`1.2.7`, `1.2.8`, `2.5.3`, and `1.3.9`, but not the versions `1.2.6` +or `1.1.0`. + +Comparators can be joined by whitespace to form a `comparator set`, +which is satisfied by the **intersection** of all of the comparators +it includes. + +A range is composed of one or more comparator sets, joined by `||`. A +version matches a range if and only if every comparator in at least +one of the `||`-separated comparator sets is satisfied by the version. + +For example, the range `>=1.2.7 <1.3.0` would match the versions +`1.2.7`, `1.2.8`, and `1.2.99`, but not the versions `1.2.6`, `1.3.0`, +or `1.1.0`. + +The range `1.2.7 || >=1.2.9 <2.0.0` would match the versions `1.2.7`, +`1.2.9`, and `1.4.6`, but not the versions `1.2.8` or `2.0.0`. + +### Prerelease Tags + +If a version has a prerelease tag (for example, `1.2.3-alpha.3`) then +it will only be allowed to satisfy comparator sets if at least one +comparator with the same `[major, minor, patch]` tuple also has a +prerelease tag. + +For example, the range `>1.2.3-alpha.3` would be allowed to match the +version `1.2.3-alpha.7`, but it would *not* be satisfied by +`3.4.5-alpha.9`, even though `3.4.5-alpha.9` is technically "greater +than" `1.2.3-alpha.3` according to the SemVer sort rules. The version +range only accepts prerelease tags on the `1.2.3` version. The +version `3.4.5` *would* satisfy the range, because it does not have a +prerelease flag, and `3.4.5` is greater than `1.2.3-alpha.7`. + +The purpose for this behavior is twofold. First, prerelease versions +frequently are updated very quickly, and contain many breaking changes +that are (by the author's design) not yet fit for public consumption. +Therefore, by default, they are excluded from range matching +semantics. + +Second, a user who has opted into using a prerelease version has +clearly indicated the intent to use *that specific* set of +alpha/beta/rc versions. By including a prerelease tag in the range, +the user is indicating that they are aware of the risk. However, it +is still not appropriate to assume that they have opted into taking a +similar risk on the *next* set of prerelease versions. 
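The behaviour described above can be checked directly in code. A minimal sketch, using only the range and versions quoted in this section:

```js
const semver = require('semver')

const range = '>1.2.3-alpha.3'

semver.satisfies('1.2.3-alpha.7', range) // true  – prerelease of the same [major, minor, patch] tuple
semver.satisfies('3.4.5-alpha.9', range) // false – prerelease of a different tuple
semver.satisfies('3.4.5', range)         // true  – not a prerelease, and greater than 1.2.3-alpha.3

// Opting in changes the second result (see the includePrerelease note that follows):
semver.satisfies('3.4.5-alpha.9', range, { includePrerelease: true }) // true
```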
+ +Note that this behavior can be suppressed (treating all prerelease +versions as if they were normal versions, for the purpose of range +matching) by setting the `includePrerelease` flag on the options +object to any +[functions](https://github.com/npm/node-semver#functions) that do +range matching. + +#### Prerelease Identifiers + +The method `.inc` takes an additional `identifier` string argument that +will append the value of the string as a prerelease identifier: + +```javascript +semver.inc('1.2.3', 'prerelease', 'beta') +// '1.2.4-beta.0' +``` + +command-line example: + +```bash +$ semver 1.2.3 -i prerelease --preid beta +1.2.4-beta.0 +``` + +Which then can be used to increment further: + +```bash +$ semver 1.2.4-beta.0 -i prerelease +1.2.4-beta.1 +``` + +### Advanced Range Syntax + +Advanced range syntax desugars to primitive comparators in +deterministic ways. + +Advanced ranges may be combined in the same way as primitive +comparators using white space or `||`. + +#### Hyphen Ranges `X.Y.Z - A.B.C` + +Specifies an inclusive set. + +* `1.2.3 - 2.3.4` := `>=1.2.3 <=2.3.4` + +If a partial version is provided as the first version in the inclusive +range, then the missing pieces are replaced with zeroes. + +* `1.2 - 2.3.4` := `>=1.2.0 <=2.3.4` + +If a partial version is provided as the second version in the +inclusive range, then all versions that start with the supplied parts +of the tuple are accepted, but nothing that would be greater than the +provided tuple parts. + +* `1.2.3 - 2.3` := `>=1.2.3 <2.4.0` +* `1.2.3 - 2` := `>=1.2.3 <3.0.0` + +#### X-Ranges `1.2.x` `1.X` `1.2.*` `*` + +Any of `X`, `x`, or `*` may be used to "stand in" for one of the +numeric values in the `[major, minor, patch]` tuple. + +* `*` := `>=0.0.0` (Any version satisfies) +* `1.x` := `>=1.0.0 <2.0.0` (Matching major version) +* `1.2.x` := `>=1.2.0 <1.3.0` (Matching major and minor versions) + +A partial version range is treated as an X-Range, so the special +character is in fact optional. + +* `""` (empty string) := `*` := `>=0.0.0` +* `1` := `1.x.x` := `>=1.0.0 <2.0.0` +* `1.2` := `1.2.x` := `>=1.2.0 <1.3.0` + +#### Tilde Ranges `~1.2.3` `~1.2` `~1` + +Allows patch-level changes if a minor version is specified on the +comparator. Allows minor-level changes if not. + +* `~1.2.3` := `>=1.2.3 <1.(2+1).0` := `>=1.2.3 <1.3.0` +* `~1.2` := `>=1.2.0 <1.(2+1).0` := `>=1.2.0 <1.3.0` (Same as `1.2.x`) +* `~1` := `>=1.0.0 <(1+1).0.0` := `>=1.0.0 <2.0.0` (Same as `1.x`) +* `~0.2.3` := `>=0.2.3 <0.(2+1).0` := `>=0.2.3 <0.3.0` +* `~0.2` := `>=0.2.0 <0.(2+1).0` := `>=0.2.0 <0.3.0` (Same as `0.2.x`) +* `~0` := `>=0.0.0 <(0+1).0.0` := `>=0.0.0 <1.0.0` (Same as `0.x`) +* `~1.2.3-beta.2` := `>=1.2.3-beta.2 <1.3.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. + +#### Caret Ranges `^1.2.3` `^0.2.5` `^0.0.4` + +Allows changes that do not modify the left-most non-zero digit in the +`[major, minor, patch]` tuple. In other words, this allows patch and +minor updates for versions `1.0.0` and above, patch updates for +versions `0.X >=0.1.0`, and *no* updates for versions `0.0.X`. + +Many authors treat a `0.x` version as if the `x` were the major +"breaking-change" indicator. + +Caret ranges are ideal when an author may make breaking changes +between `0.2.4` and `0.3.0` releases, which is a common practice. 
+However, it presumes that there will *not* be breaking changes between +`0.2.4` and `0.2.5`. It allows for changes that are presumed to be +additive (but non-breaking), according to commonly observed practices. + +* `^1.2.3` := `>=1.2.3 <2.0.0` +* `^0.2.3` := `>=0.2.3 <0.3.0` +* `^0.0.3` := `>=0.0.3 <0.0.4` +* `^1.2.3-beta.2` := `>=1.2.3-beta.2 <2.0.0` Note that prereleases in + the `1.2.3` version will be allowed, if they are greater than or + equal to `beta.2`. So, `1.2.3-beta.4` would be allowed, but + `1.2.4-beta.2` would not, because it is a prerelease of a + different `[major, minor, patch]` tuple. +* `^0.0.3-beta` := `>=0.0.3-beta <0.0.4` Note that prereleases in the + `0.0.3` version *only* will be allowed, if they are greater than or + equal to `beta`. So, `0.0.3-pr.2` would be allowed. + +When parsing caret ranges, a missing `patch` value desugars to the +number `0`, but will allow flexibility within that value, even if the +major and minor versions are both `0`. + +* `^1.2.x` := `>=1.2.0 <2.0.0` +* `^0.0.x` := `>=0.0.0 <0.1.0` +* `^0.0` := `>=0.0.0 <0.1.0` + +A missing `minor` and `patch` values will desugar to zero, but also +allow flexibility within those values, even if the major version is +zero. + +* `^1.x` := `>=1.0.0 <2.0.0` +* `^0.x` := `>=0.0.0 <1.0.0` + +### Range Grammar + +Putting all this together, here is a Backus-Naur grammar for ranges, +for the benefit of parser authors: + +```bnf +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | ['1'-'9'] ( ['0'-'9'] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' part ) * +part ::= nr | [-0-9A-Za-z]+ +``` + +## Functions + +All methods and classes take a final `options` object argument. All +options in this object are `false` by default. The options supported +are: + +- `loose` Be more forgiving about not-quite-valid semver strings. + (Any resulting output will always be 100% strict compliant, of + course.) For backwards compatibility reasons, if the `options` + argument is a boolean value instead of an object, it is interpreted + to be the `loose` param. +- `includePrerelease` Set to suppress the [default + behavior](https://github.com/npm/node-semver#prerelease-tags) of + excluding prerelease tagged versions from ranges unless they are + explicitly opted into. + +Strict-mode Comparators and Ranges will be strict about the SemVer +strings that they parse. + +* `valid(v)`: Return the parsed version, or null if it's not valid. +* `inc(v, release)`: Return the version incremented by the release + type (`major`, `premajor`, `minor`, `preminor`, `patch`, + `prepatch`, or `prerelease`), or null if it's not valid + * `premajor` in one call will bump the version up to the next major + version and down to a prerelease of that major version. + `preminor`, and `prepatch` work the same way. + * If called from a non-prerelease version, the `prerelease` will work the + same as `prepatch`. It increments the patch version, then makes a + prerelease. If the input version is already a prerelease it simply + increments it. +* `prerelease(v)`: Returns an array of prerelease components, or null + if none exist. 
Example: `prerelease('1.2.3-alpha.1') -> ['alpha', 1]` +* `major(v)`: Return the major version number. +* `minor(v)`: Return the minor version number. +* `patch(v)`: Return the patch version number. +* `intersects(r1, r2, loose)`: Return true if the two supplied ranges + or comparators intersect. +* `parse(v)`: Attempt to parse a string as a semantic version, returning either + a `SemVer` object or `null`. + +### Comparison + +* `gt(v1, v2)`: `v1 > v2` +* `gte(v1, v2)`: `v1 >= v2` +* `lt(v1, v2)`: `v1 < v2` +* `lte(v1, v2)`: `v1 <= v2` +* `eq(v1, v2)`: `v1 == v2` This is true if they're logically equivalent, + even if they're not the exact same string. You already know how to + compare strings. +* `neq(v1, v2)`: `v1 != v2` The opposite of `eq`. +* `cmp(v1, comparator, v2)`: Pass in a comparison string, and it'll call + the corresponding function above. `"==="` and `"!=="` do simple + string comparison, but are included for completeness. Throws if an + invalid comparison string is provided. +* `compare(v1, v2)`: Return `0` if `v1 == v2`, or `1` if `v1` is greater, or `-1` if + `v2` is greater. Sorts in ascending order if passed to `Array.sort()`. +* `rcompare(v1, v2)`: The reverse of compare. Sorts an array of versions + in descending order when passed to `Array.sort()`. +* `diff(v1, v2)`: Returns difference between two versions by the release type + (`major`, `premajor`, `minor`, `preminor`, `patch`, `prepatch`, or `prerelease`), + or null if the versions are the same. + +### Comparators + +* `intersects(comparator)`: Return true if the comparators intersect + +### Ranges + +* `validRange(range)`: Return the valid range or null if it's not valid +* `satisfies(version, range)`: Return true if the version satisfies the + range. +* `maxSatisfying(versions, range)`: Return the highest version in the list + that satisfies the range, or `null` if none of them do. +* `minSatisfying(versions, range)`: Return the lowest version in the list + that satisfies the range, or `null` if none of them do. +* `minVersion(range)`: Return the lowest version that can possibly match + the given range. +* `gtr(version, range)`: Return `true` if version is greater than all the + versions possible in the range. +* `ltr(version, range)`: Return `true` if version is less than all the + versions possible in the range. +* `outside(version, range, hilo)`: Return true if the version is outside + the bounds of the range in either the high or low direction. The + `hilo` argument must be either the string `'>'` or `'<'`. (This is + the function called by `gtr` and `ltr`.) +* `intersects(range)`: Return true if any of the ranges comparators intersect + +Note that, since ranges may be non-contiguous, a version might not be +greater than a range, less than a range, *or* satisfy a range! For +example, the range `1.2 <1.2.9 || >2.0.0` would have a hole from `1.2.9` +until `2.0.0`, so the version `1.2.10` would not be greater than the +range (because `2.0.1` satisfies, which is higher), nor less than the +range (since `1.2.8` satisfies, which is lower), and it also does not +satisfy the range. + +If you want to know if a version satisfies or does not satisfy a +range, use the `satisfies(version, range)` function. + +### Coercion + +* `coerce(version)`: Coerces a string to semver if possible + +This aims to provide a very forgiving translation of a non-semver string to +semver. 
It looks for the first digit in a string, and consumes all +remaining characters which satisfy at least a partial semver (e.g., `1`, +`1.2`, `1.2.3`) up to the max permitted length (256 characters). Longer +versions are simply truncated (`4.6.3.9.2-alpha2` becomes `4.6.3`). All +surrounding text is simply ignored (`v3.4 replaces v3.3.1` becomes +`3.4.0`). Only text which lacks digits will fail coercion (`version one` +is not valid). The maximum length for any semver component considered for +coercion is 16 characters; longer components will be ignored +(`10000000000000000.4.7.4` becomes `4.7.4`). The maximum value for any +semver component is `Number.MAX_SAFE_INTEGER || (2**53 - 1)`; higher value +components are invalid (`9999999999999999.4.7.4` is likely invalid). diff --git a/node_modules/semver/bin/semver b/node_modules/semver/bin/semver new file mode 100644 index 000000000..801e77f13 --- /dev/null +++ b/node_modules/semver/bin/semver @@ -0,0 +1,160 @@ +#!/usr/bin/env node +// Standalone semver comparison program. +// Exits successfully and prints matching version(s) if +// any supplied version is valid and passes all tests. + +var argv = process.argv.slice(2) + +var versions = [] + +var range = [] + +var inc = null + +var version = require('../package.json').version + +var loose = false + +var includePrerelease = false + +var coerce = false + +var identifier + +var semver = require('../semver') + +var reverse = false + +var options = {} + +main() + +function main () { + if (!argv.length) return help() + while (argv.length) { + var a = argv.shift() + var indexOfEqualSign = a.indexOf('=') + if (indexOfEqualSign !== -1) { + a = a.slice(0, indexOfEqualSign) + argv.unshift(a.slice(indexOfEqualSign + 1)) + } + switch (a) { + case '-rv': case '-rev': case '--rev': case '--reverse': + reverse = true + break + case '-l': case '--loose': + loose = true + break + case '-p': case '--include-prerelease': + includePrerelease = true + break + case '-v': case '--version': + versions.push(argv.shift()) + break + case '-i': case '--inc': case '--increment': + switch (argv[0]) { + case 'major': case 'minor': case 'patch': case 'prerelease': + case 'premajor': case 'preminor': case 'prepatch': + inc = argv.shift() + break + default: + inc = 'patch' + break + } + break + case '--preid': + identifier = argv.shift() + break + case '-r': case '--range': + range.push(argv.shift()) + break + case '-c': case '--coerce': + coerce = true + break + case '-h': case '--help': case '-?': + return help() + default: + versions.push(a) + break + } + } + + var options = { loose: loose, includePrerelease: includePrerelease } + + versions = versions.map(function (v) { + return coerce ? (semver.coerce(v) || { version: v }).version : v + }).filter(function (v) { + return semver.valid(v) + }) + if (!versions.length) return fail() + if (inc && (versions.length !== 1 || range.length)) { return failInc() } + + for (var i = 0, l = range.length; i < l; i++) { + versions = versions.filter(function (v) { + return semver.satisfies(v, range[i], options) + }) + if (!versions.length) return fail() + } + return success(versions) +} + +function failInc () { + console.error('--inc can only be used on a single version with no range') + fail() +} + +function fail () { process.exit(1) } + +function success () { + var compare = reverse ? 'rcompare' : 'compare' + versions.sort(function (a, b) { + return semver[compare](a, b, options) + }).map(function (v) { + return semver.clean(v, options) + }).map(function (v) { + return inc ? 
semver.inc(v, inc, options, identifier) : v + }).forEach(function (v, i, _) { console.log(v) }) +} + +function help () { + console.log(['SemVer ' + version, + '', + 'A JavaScript implementation of the https://semver.org/ specification', + 'Copyright Isaac Z. Schlueter', + '', + 'Usage: semver [options] [ [...]]', + 'Prints valid versions sorted by SemVer precedence', + '', + 'Options:', + '-r --range ', + ' Print versions that match the specified range.', + '', + '-i --increment []', + ' Increment a version by the specified level. Level can', + ' be one of: major, minor, patch, premajor, preminor,', + " prepatch, or prerelease. Default level is 'patch'.", + ' Only one version may be specified.', + '', + '--preid ', + ' Identifier to be used to prefix premajor, preminor,', + ' prepatch or prerelease version increments.', + '', + '-l --loose', + ' Interpret versions and ranges loosely', + '', + '-p --include-prerelease', + ' Always include prerelease versions in range matching', + '', + '-c --coerce', + ' Coerce a string into SemVer if possible', + ' (does not imply --loose)', + '', + 'Program exits successfully if any valid version satisfies', + 'all supplied ranges, and prints all satisfying versions.', + '', + 'If no satisfying versions are found, then exits failure.', + '', + 'Versions are printed in ascending order, so supplying', + 'multiple versions to the utility will just sort them.' + ].join('\n')) +} diff --git a/node_modules/semver/package.json b/node_modules/semver/package.json new file mode 100644 index 000000000..3121da25a --- /dev/null +++ b/node_modules/semver/package.json @@ -0,0 +1,60 @@ +{ + "_from": "semver@^5.7.1", + "_id": "semver@5.7.1", + "_inBundle": false, + "_integrity": "sha512-sauaDf/PZdVgrLTNYHRtpXa1iRiKcaebiKQ1BJdpQlWH2lCvexQdX55snPFyK7QzpudqbCI0qXFfOasHdyNDGQ==", + "_location": "/semver", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "semver@^5.7.1", + "name": "semver", + "escapedName": "semver", + "rawSpec": "^5.7.1", + "saveSpec": null, + "fetchSpec": "^5.7.1" + }, + "_requiredBy": [ + "/nodemon" + ], + "_resolved": "https://registry.npmjs.org/semver/-/semver-5.7.1.tgz", + "_shasum": "a954f931aeba508d307bbf069eff0c01c96116f7", + "_spec": "semver@^5.7.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon", + "bin": { + "semver": "bin/semver" + }, + "bugs": { + "url": "https://github.com/npm/node-semver/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "The semantic version parser used by npm.", + "devDependencies": { + "tap": "^13.0.0-rc.18" + }, + "files": [ + "bin", + "range.bnf", + "semver.js" + ], + "homepage": "https://github.com/npm/node-semver#readme", + "license": "ISC", + "main": "semver.js", + "name": "semver", + "repository": { + "type": "git", + "url": "git+https://github.com/npm/node-semver.git" + }, + "scripts": { + "postpublish": "git push origin --all; git push origin --tags", + "postversion": "npm publish", + "preversion": "npm test", + "test": "tap" + }, + "tap": { + "check-coverage": true + }, + "version": "5.7.1" +} diff --git a/node_modules/semver/range.bnf b/node_modules/semver/range.bnf new file mode 100644 index 000000000..d4c6ae0d7 --- /dev/null +++ b/node_modules/semver/range.bnf @@ -0,0 +1,16 @@ +range-set ::= range ( logical-or range ) * +logical-or ::= ( ' ' ) * '||' ( ' ' ) * +range ::= hyphen | simple ( ' ' simple ) * | '' +hyphen ::= partial ' - ' partial +simple ::= primitive | partial | 
tilde | caret +primitive ::= ( '<' | '>' | '>=' | '<=' | '=' ) partial +partial ::= xr ( '.' xr ( '.' xr qualifier ? )? )? +xr ::= 'x' | 'X' | '*' | nr +nr ::= '0' | [1-9] ( [0-9] ) * +tilde ::= '~' partial +caret ::= '^' partial +qualifier ::= ( '-' pre )? ( '+' build )? +pre ::= parts +build ::= parts +parts ::= part ( '.' part ) * +part ::= nr | [-0-9A-Za-z]+ diff --git a/node_modules/semver/semver.js b/node_modules/semver/semver.js new file mode 100644 index 000000000..d315d5d68 --- /dev/null +++ b/node_modules/semver/semver.js @@ -0,0 +1,1483 @@ +exports = module.exports = SemVer + +var debug +/* istanbul ignore next */ +if (typeof process === 'object' && + process.env && + process.env.NODE_DEBUG && + /\bsemver\b/i.test(process.env.NODE_DEBUG)) { + debug = function () { + var args = Array.prototype.slice.call(arguments, 0) + args.unshift('SEMVER') + console.log.apply(console, args) + } +} else { + debug = function () {} +} + +// Note: this is the semver.org version of the spec that it implements +// Not necessarily the package version of this code. +exports.SEMVER_SPEC_VERSION = '2.0.0' + +var MAX_LENGTH = 256 +var MAX_SAFE_INTEGER = Number.MAX_SAFE_INTEGER || + /* istanbul ignore next */ 9007199254740991 + +// Max safe segment length for coercion. +var MAX_SAFE_COMPONENT_LENGTH = 16 + +// The actual regexps go on exports.re +var re = exports.re = [] +var src = exports.src = [] +var R = 0 + +// The following Regular Expressions can be used for tokenizing, +// validating, and parsing SemVer version strings. + +// ## Numeric Identifier +// A single `0`, or a non-zero digit followed by zero or more digits. + +var NUMERICIDENTIFIER = R++ +src[NUMERICIDENTIFIER] = '0|[1-9]\\d*' +var NUMERICIDENTIFIERLOOSE = R++ +src[NUMERICIDENTIFIERLOOSE] = '[0-9]+' + +// ## Non-numeric Identifier +// Zero or more digits, followed by a letter or hyphen, and then zero or +// more letters, digits, or hyphens. + +var NONNUMERICIDENTIFIER = R++ +src[NONNUMERICIDENTIFIER] = '\\d*[a-zA-Z-][a-zA-Z0-9-]*' + +// ## Main Version +// Three dot-separated numeric identifiers. + +var MAINVERSION = R++ +src[MAINVERSION] = '(' + src[NUMERICIDENTIFIER] + ')\\.' + + '(' + src[NUMERICIDENTIFIER] + ')\\.' + + '(' + src[NUMERICIDENTIFIER] + ')' + +var MAINVERSIONLOOSE = R++ +src[MAINVERSIONLOOSE] = '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[NUMERICIDENTIFIERLOOSE] + ')\\.' + + '(' + src[NUMERICIDENTIFIERLOOSE] + ')' + +// ## Pre-release Version Identifier +// A numeric identifier, or a non-numeric identifier. + +var PRERELEASEIDENTIFIER = R++ +src[PRERELEASEIDENTIFIER] = '(?:' + src[NUMERICIDENTIFIER] + + '|' + src[NONNUMERICIDENTIFIER] + ')' + +var PRERELEASEIDENTIFIERLOOSE = R++ +src[PRERELEASEIDENTIFIERLOOSE] = '(?:' + src[NUMERICIDENTIFIERLOOSE] + + '|' + src[NONNUMERICIDENTIFIER] + ')' + +// ## Pre-release Version +// Hyphen, followed by one or more dot-separated pre-release version +// identifiers. + +var PRERELEASE = R++ +src[PRERELEASE] = '(?:-(' + src[PRERELEASEIDENTIFIER] + + '(?:\\.' + src[PRERELEASEIDENTIFIER] + ')*))' + +var PRERELEASELOOSE = R++ +src[PRERELEASELOOSE] = '(?:-?(' + src[PRERELEASEIDENTIFIERLOOSE] + + '(?:\\.' + src[PRERELEASEIDENTIFIERLOOSE] + ')*))' + +// ## Build Metadata Identifier +// Any combination of digits, letters, or hyphens. + +var BUILDIDENTIFIER = R++ +src[BUILDIDENTIFIER] = '[0-9A-Za-z-]+' + +// ## Build Metadata +// Plus sign, followed by one or more period-separated build metadata +// identifiers. 
+ +var BUILD = R++ +src[BUILD] = '(?:\\+(' + src[BUILDIDENTIFIER] + + '(?:\\.' + src[BUILDIDENTIFIER] + ')*))' + +// ## Full Version String +// A main version, followed optionally by a pre-release version and +// build metadata. + +// Note that the only major, minor, patch, and pre-release sections of +// the version string are capturing groups. The build metadata is not a +// capturing group, because it should not ever be used in version +// comparison. + +var FULL = R++ +var FULLPLAIN = 'v?' + src[MAINVERSION] + + src[PRERELEASE] + '?' + + src[BUILD] + '?' + +src[FULL] = '^' + FULLPLAIN + '$' + +// like full, but allows v1.2.3 and =1.2.3, which people do sometimes. +// also, 1.0.0alpha1 (prerelease without the hyphen) which is pretty +// common in the npm registry. +var LOOSEPLAIN = '[v=\\s]*' + src[MAINVERSIONLOOSE] + + src[PRERELEASELOOSE] + '?' + + src[BUILD] + '?' + +var LOOSE = R++ +src[LOOSE] = '^' + LOOSEPLAIN + '$' + +var GTLT = R++ +src[GTLT] = '((?:<|>)?=?)' + +// Something like "2.*" or "1.2.x". +// Note that "x.x" is a valid xRange identifer, meaning "any version" +// Only the first item is strictly required. +var XRANGEIDENTIFIERLOOSE = R++ +src[XRANGEIDENTIFIERLOOSE] = src[NUMERICIDENTIFIERLOOSE] + '|x|X|\\*' +var XRANGEIDENTIFIER = R++ +src[XRANGEIDENTIFIER] = src[NUMERICIDENTIFIER] + '|x|X|\\*' + +var XRANGEPLAIN = R++ +src[XRANGEPLAIN] = '[v=\\s]*(' + src[XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + + '(?:\\.(' + src[XRANGEIDENTIFIER] + ')' + + '(?:' + src[PRERELEASE] + ')?' + + src[BUILD] + '?' + + ')?)?' + +var XRANGEPLAINLOOSE = R++ +src[XRANGEPLAINLOOSE] = '[v=\\s]*(' + src[XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + + '(?:\\.(' + src[XRANGEIDENTIFIERLOOSE] + ')' + + '(?:' + src[PRERELEASELOOSE] + ')?' + + src[BUILD] + '?' + + ')?)?' + +var XRANGE = R++ +src[XRANGE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAIN] + '$' +var XRANGELOOSE = R++ +src[XRANGELOOSE] = '^' + src[GTLT] + '\\s*' + src[XRANGEPLAINLOOSE] + '$' + +// Coercion. +// Extract anything that could conceivably be a part of a valid semver +var COERCE = R++ +src[COERCE] = '(?:^|[^\\d])' + + '(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '})' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:\\.(\\d{1,' + MAX_SAFE_COMPONENT_LENGTH + '}))?' + + '(?:$|[^\\d])' + +// Tilde ranges. +// Meaning is "reasonably at or greater than" +var LONETILDE = R++ +src[LONETILDE] = '(?:~>?)' + +var TILDETRIM = R++ +src[TILDETRIM] = '(\\s*)' + src[LONETILDE] + '\\s+' +re[TILDETRIM] = new RegExp(src[TILDETRIM], 'g') +var tildeTrimReplace = '$1~' + +var TILDE = R++ +src[TILDE] = '^' + src[LONETILDE] + src[XRANGEPLAIN] + '$' +var TILDELOOSE = R++ +src[TILDELOOSE] = '^' + src[LONETILDE] + src[XRANGEPLAINLOOSE] + '$' + +// Caret ranges. 
+// Meaning is "at least and backwards compatible with" +var LONECARET = R++ +src[LONECARET] = '(?:\\^)' + +var CARETTRIM = R++ +src[CARETTRIM] = '(\\s*)' + src[LONECARET] + '\\s+' +re[CARETTRIM] = new RegExp(src[CARETTRIM], 'g') +var caretTrimReplace = '$1^' + +var CARET = R++ +src[CARET] = '^' + src[LONECARET] + src[XRANGEPLAIN] + '$' +var CARETLOOSE = R++ +src[CARETLOOSE] = '^' + src[LONECARET] + src[XRANGEPLAINLOOSE] + '$' + +// A simple gt/lt/eq thing, or just "" to indicate "any version" +var COMPARATORLOOSE = R++ +src[COMPARATORLOOSE] = '^' + src[GTLT] + '\\s*(' + LOOSEPLAIN + ')$|^$' +var COMPARATOR = R++ +src[COMPARATOR] = '^' + src[GTLT] + '\\s*(' + FULLPLAIN + ')$|^$' + +// An expression to strip any whitespace between the gtlt and the thing +// it modifies, so that `> 1.2.3` ==> `>1.2.3` +var COMPARATORTRIM = R++ +src[COMPARATORTRIM] = '(\\s*)' + src[GTLT] + + '\\s*(' + LOOSEPLAIN + '|' + src[XRANGEPLAIN] + ')' + +// this one has to use the /g flag +re[COMPARATORTRIM] = new RegExp(src[COMPARATORTRIM], 'g') +var comparatorTrimReplace = '$1$2$3' + +// Something like `1.2.3 - 1.2.4` +// Note that these all use the loose form, because they'll be +// checked against either the strict or loose comparator form +// later. +var HYPHENRANGE = R++ +src[HYPHENRANGE] = '^\\s*(' + src[XRANGEPLAIN] + ')' + + '\\s+-\\s+' + + '(' + src[XRANGEPLAIN] + ')' + + '\\s*$' + +var HYPHENRANGELOOSE = R++ +src[HYPHENRANGELOOSE] = '^\\s*(' + src[XRANGEPLAINLOOSE] + ')' + + '\\s+-\\s+' + + '(' + src[XRANGEPLAINLOOSE] + ')' + + '\\s*$' + +// Star ranges basically just allow anything at all. +var STAR = R++ +src[STAR] = '(<|>)?=?\\s*\\*' + +// Compile to actual regexp objects. +// All are flag-free, unless they were created above with a flag. +for (var i = 0; i < R; i++) { + debug(i, src[i]) + if (!re[i]) { + re[i] = new RegExp(src[i]) + } +} + +exports.parse = parse +function parse (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (version instanceof SemVer) { + return version + } + + if (typeof version !== 'string') { + return null + } + + if (version.length > MAX_LENGTH) { + return null + } + + var r = options.loose ? re[LOOSE] : re[FULL] + if (!r.test(version)) { + return null + } + + try { + return new SemVer(version, options) + } catch (er) { + return null + } +} + +exports.valid = valid +function valid (version, options) { + var v = parse(version, options) + return v ? v.version : null +} + +exports.clean = clean +function clean (version, options) { + var s = parse(version.trim().replace(/^[=v]+/, ''), options) + return s ? s.version : null +} + +exports.SemVer = SemVer + +function SemVer (version, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + if (version instanceof SemVer) { + if (version.loose === options.loose) { + return version + } else { + version = version.version + } + } else if (typeof version !== 'string') { + throw new TypeError('Invalid Version: ' + version) + } + + if (version.length > MAX_LENGTH) { + throw new TypeError('version is longer than ' + MAX_LENGTH + ' characters') + } + + if (!(this instanceof SemVer)) { + return new SemVer(version, options) + } + + debug('SemVer', version, options) + this.options = options + this.loose = !!options.loose + + var m = version.trim().match(options.loose ? 
re[LOOSE] : re[FULL]) + + if (!m) { + throw new TypeError('Invalid Version: ' + version) + } + + this.raw = version + + // these are actually numbers + this.major = +m[1] + this.minor = +m[2] + this.patch = +m[3] + + if (this.major > MAX_SAFE_INTEGER || this.major < 0) { + throw new TypeError('Invalid major version') + } + + if (this.minor > MAX_SAFE_INTEGER || this.minor < 0) { + throw new TypeError('Invalid minor version') + } + + if (this.patch > MAX_SAFE_INTEGER || this.patch < 0) { + throw new TypeError('Invalid patch version') + } + + // numberify any prerelease numeric ids + if (!m[4]) { + this.prerelease = [] + } else { + this.prerelease = m[4].split('.').map(function (id) { + if (/^[0-9]+$/.test(id)) { + var num = +id + if (num >= 0 && num < MAX_SAFE_INTEGER) { + return num + } + } + return id + }) + } + + this.build = m[5] ? m[5].split('.') : [] + this.format() +} + +SemVer.prototype.format = function () { + this.version = this.major + '.' + this.minor + '.' + this.patch + if (this.prerelease.length) { + this.version += '-' + this.prerelease.join('.') + } + return this.version +} + +SemVer.prototype.toString = function () { + return this.version +} + +SemVer.prototype.compare = function (other) { + debug('SemVer.compare', this.version, this.options, other) + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return this.compareMain(other) || this.comparePre(other) +} + +SemVer.prototype.compareMain = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + return compareIdentifiers(this.major, other.major) || + compareIdentifiers(this.minor, other.minor) || + compareIdentifiers(this.patch, other.patch) +} + +SemVer.prototype.comparePre = function (other) { + if (!(other instanceof SemVer)) { + other = new SemVer(other, this.options) + } + + // NOT having a prerelease is > having one + if (this.prerelease.length && !other.prerelease.length) { + return -1 + } else if (!this.prerelease.length && other.prerelease.length) { + return 1 + } else if (!this.prerelease.length && !other.prerelease.length) { + return 0 + } + + var i = 0 + do { + var a = this.prerelease[i] + var b = other.prerelease[i] + debug('prerelease compare', i, a, b) + if (a === undefined && b === undefined) { + return 0 + } else if (b === undefined) { + return 1 + } else if (a === undefined) { + return -1 + } else if (a === b) { + continue + } else { + return compareIdentifiers(a, b) + } + } while (++i) +} + +// preminor will bump the version up to the next minor release, and immediately +// down to pre-release. premajor and prepatch work the same way. +SemVer.prototype.inc = function (release, identifier) { + switch (release) { + case 'premajor': + this.prerelease.length = 0 + this.patch = 0 + this.minor = 0 + this.major++ + this.inc('pre', identifier) + break + case 'preminor': + this.prerelease.length = 0 + this.patch = 0 + this.minor++ + this.inc('pre', identifier) + break + case 'prepatch': + // If this is already a prerelease, it will bump to the next version + // drop any prereleases that might already exist, since they are not + // relevant at this point. + this.prerelease.length = 0 + this.inc('patch', identifier) + this.inc('pre', identifier) + break + // If the input is a non-prerelease version, this acts the same as + // prepatch. 
+ case 'prerelease': + if (this.prerelease.length === 0) { + this.inc('patch', identifier) + } + this.inc('pre', identifier) + break + + case 'major': + // If this is a pre-major version, bump up to the same major version. + // Otherwise increment major. + // 1.0.0-5 bumps to 1.0.0 + // 1.1.0 bumps to 2.0.0 + if (this.minor !== 0 || + this.patch !== 0 || + this.prerelease.length === 0) { + this.major++ + } + this.minor = 0 + this.patch = 0 + this.prerelease = [] + break + case 'minor': + // If this is a pre-minor version, bump up to the same minor version. + // Otherwise increment minor. + // 1.2.0-5 bumps to 1.2.0 + // 1.2.1 bumps to 1.3.0 + if (this.patch !== 0 || this.prerelease.length === 0) { + this.minor++ + } + this.patch = 0 + this.prerelease = [] + break + case 'patch': + // If this is not a pre-release version, it will increment the patch. + // If it is a pre-release it will bump up to the same patch version. + // 1.2.0-5 patches to 1.2.0 + // 1.2.0 patches to 1.2.1 + if (this.prerelease.length === 0) { + this.patch++ + } + this.prerelease = [] + break + // This probably shouldn't be used publicly. + // 1.0.0 "pre" would become 1.0.0-0 which is the wrong direction. + case 'pre': + if (this.prerelease.length === 0) { + this.prerelease = [0] + } else { + var i = this.prerelease.length + while (--i >= 0) { + if (typeof this.prerelease[i] === 'number') { + this.prerelease[i]++ + i = -2 + } + } + if (i === -1) { + // didn't increment anything + this.prerelease.push(0) + } + } + if (identifier) { + // 1.2.0-beta.1 bumps to 1.2.0-beta.2, + // 1.2.0-beta.fooblz or 1.2.0-beta bumps to 1.2.0-beta.0 + if (this.prerelease[0] === identifier) { + if (isNaN(this.prerelease[1])) { + this.prerelease = [identifier, 0] + } + } else { + this.prerelease = [identifier, 0] + } + } + break + + default: + throw new Error('invalid increment argument: ' + release) + } + this.format() + this.raw = this.version + return this +} + +exports.inc = inc +function inc (version, release, loose, identifier) { + if (typeof (loose) === 'string') { + identifier = loose + loose = undefined + } + + try { + return new SemVer(version, loose).inc(release, identifier).version + } catch (er) { + return null + } +} + +exports.diff = diff +function diff (version1, version2) { + if (eq(version1, version2)) { + return null + } else { + var v1 = parse(version1) + var v2 = parse(version2) + var prefix = '' + if (v1.prerelease.length || v2.prerelease.length) { + prefix = 'pre' + var defaultResult = 'prerelease' + } + for (var key in v1) { + if (key === 'major' || key === 'minor' || key === 'patch') { + if (v1[key] !== v2[key]) { + return prefix + key + } + } + } + return defaultResult // may be undefined + } +} + +exports.compareIdentifiers = compareIdentifiers + +var numeric = /^[0-9]+$/ +function compareIdentifiers (a, b) { + var anum = numeric.test(a) + var bnum = numeric.test(b) + + if (anum && bnum) { + a = +a + b = +b + } + + return a === b ? 0 + : (anum && !bnum) ? -1 + : (bnum && !anum) ? 1 + : a < b ? 
-1 + : 1 +} + +exports.rcompareIdentifiers = rcompareIdentifiers +function rcompareIdentifiers (a, b) { + return compareIdentifiers(b, a) +} + +exports.major = major +function major (a, loose) { + return new SemVer(a, loose).major +} + +exports.minor = minor +function minor (a, loose) { + return new SemVer(a, loose).minor +} + +exports.patch = patch +function patch (a, loose) { + return new SemVer(a, loose).patch +} + +exports.compare = compare +function compare (a, b, loose) { + return new SemVer(a, loose).compare(new SemVer(b, loose)) +} + +exports.compareLoose = compareLoose +function compareLoose (a, b) { + return compare(a, b, true) +} + +exports.rcompare = rcompare +function rcompare (a, b, loose) { + return compare(b, a, loose) +} + +exports.sort = sort +function sort (list, loose) { + return list.sort(function (a, b) { + return exports.compare(a, b, loose) + }) +} + +exports.rsort = rsort +function rsort (list, loose) { + return list.sort(function (a, b) { + return exports.rcompare(a, b, loose) + }) +} + +exports.gt = gt +function gt (a, b, loose) { + return compare(a, b, loose) > 0 +} + +exports.lt = lt +function lt (a, b, loose) { + return compare(a, b, loose) < 0 +} + +exports.eq = eq +function eq (a, b, loose) { + return compare(a, b, loose) === 0 +} + +exports.neq = neq +function neq (a, b, loose) { + return compare(a, b, loose) !== 0 +} + +exports.gte = gte +function gte (a, b, loose) { + return compare(a, b, loose) >= 0 +} + +exports.lte = lte +function lte (a, b, loose) { + return compare(a, b, loose) <= 0 +} + +exports.cmp = cmp +function cmp (a, op, b, loose) { + switch (op) { + case '===': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a === b + + case '!==': + if (typeof a === 'object') + a = a.version + if (typeof b === 'object') + b = b.version + return a !== b + + case '': + case '=': + case '==': + return eq(a, b, loose) + + case '!=': + return neq(a, b, loose) + + case '>': + return gt(a, b, loose) + + case '>=': + return gte(a, b, loose) + + case '<': + return lt(a, b, loose) + + case '<=': + return lte(a, b, loose) + + default: + throw new TypeError('Invalid operator: ' + op) + } +} + +exports.Comparator = Comparator +function Comparator (comp, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (comp instanceof Comparator) { + if (comp.loose === !!options.loose) { + return comp + } else { + comp = comp.value + } + } + + if (!(this instanceof Comparator)) { + return new Comparator(comp, options) + } + + debug('comparator', comp, options) + this.options = options + this.loose = !!options.loose + this.parse(comp) + + if (this.semver === ANY) { + this.value = '' + } else { + this.value = this.operator + this.semver.version + } + + debug('comp', this) +} + +var ANY = {} +Comparator.prototype.parse = function (comp) { + var r = this.options.loose ? re[COMPARATORLOOSE] : re[COMPARATOR] + var m = comp.match(r) + + if (!m) { + throw new TypeError('Invalid comparator: ' + comp) + } + + this.operator = m[1] + if (this.operator === '=') { + this.operator = '' + } + + // if it literally is just '>' or '' then allow anything. 
+ if (!m[2]) { + this.semver = ANY + } else { + this.semver = new SemVer(m[2], this.options.loose) + } +} + +Comparator.prototype.toString = function () { + return this.value +} + +Comparator.prototype.test = function (version) { + debug('Comparator.test', version, this.options.loose) + + if (this.semver === ANY) { + return true + } + + if (typeof version === 'string') { + version = new SemVer(version, this.options) + } + + return cmp(version, this.operator, this.semver, this.options) +} + +Comparator.prototype.intersects = function (comp, options) { + if (!(comp instanceof Comparator)) { + throw new TypeError('a Comparator is required') + } + + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + var rangeTmp + + if (this.operator === '') { + rangeTmp = new Range(comp.value, options) + return satisfies(this.value, rangeTmp, options) + } else if (comp.operator === '') { + rangeTmp = new Range(this.value, options) + return satisfies(comp.semver, rangeTmp, options) + } + + var sameDirectionIncreasing = + (this.operator === '>=' || this.operator === '>') && + (comp.operator === '>=' || comp.operator === '>') + var sameDirectionDecreasing = + (this.operator === '<=' || this.operator === '<') && + (comp.operator === '<=' || comp.operator === '<') + var sameSemVer = this.semver.version === comp.semver.version + var differentDirectionsInclusive = + (this.operator === '>=' || this.operator === '<=') && + (comp.operator === '>=' || comp.operator === '<=') + var oppositeDirectionsLessThan = + cmp(this.semver, '<', comp.semver, options) && + ((this.operator === '>=' || this.operator === '>') && + (comp.operator === '<=' || comp.operator === '<')) + var oppositeDirectionsGreaterThan = + cmp(this.semver, '>', comp.semver, options) && + ((this.operator === '<=' || this.operator === '<') && + (comp.operator === '>=' || comp.operator === '>')) + + return sameDirectionIncreasing || sameDirectionDecreasing || + (sameSemVer && differentDirectionsInclusive) || + oppositeDirectionsLessThan || oppositeDirectionsGreaterThan +} + +exports.Range = Range +function Range (range, options) { + if (!options || typeof options !== 'object') { + options = { + loose: !!options, + includePrerelease: false + } + } + + if (range instanceof Range) { + if (range.loose === !!options.loose && + range.includePrerelease === !!options.includePrerelease) { + return range + } else { + return new Range(range.raw, options) + } + } + + if (range instanceof Comparator) { + return new Range(range.value, options) + } + + if (!(this instanceof Range)) { + return new Range(range, options) + } + + this.options = options + this.loose = !!options.loose + this.includePrerelease = !!options.includePrerelease + + // First, split based on boolean or || + this.raw = range + this.set = range.split(/\s*\|\|\s*/).map(function (range) { + return this.parseRange(range.trim()) + }, this).filter(function (c) { + // throw out any that are not relevant for whatever reason + return c.length + }) + + if (!this.set.length) { + throw new TypeError('Invalid SemVer Range: ' + range) + } + + this.format() +} + +Range.prototype.format = function () { + this.range = this.set.map(function (comps) { + return comps.join(' ').trim() + }).join('||').trim() + return this.range +} + +Range.prototype.toString = function () { + return this.range +} + +Range.prototype.parseRange = function (range) { + var loose = this.options.loose + range = range.trim() + // `1.2.3 - 1.2.4` => `>=1.2.3 <=1.2.4` + var hr = loose 
? re[HYPHENRANGELOOSE] : re[HYPHENRANGE] + range = range.replace(hr, hyphenReplace) + debug('hyphen replace', range) + // `> 1.2.3 < 1.2.5` => `>1.2.3 <1.2.5` + range = range.replace(re[COMPARATORTRIM], comparatorTrimReplace) + debug('comparator trim', range, re[COMPARATORTRIM]) + + // `~ 1.2.3` => `~1.2.3` + range = range.replace(re[TILDETRIM], tildeTrimReplace) + + // `^ 1.2.3` => `^1.2.3` + range = range.replace(re[CARETTRIM], caretTrimReplace) + + // normalize spaces + range = range.split(/\s+/).join(' ') + + // At this point, the range is completely trimmed and + // ready to be split into comparators. + + var compRe = loose ? re[COMPARATORLOOSE] : re[COMPARATOR] + var set = range.split(' ').map(function (comp) { + return parseComparator(comp, this.options) + }, this).join(' ').split(/\s+/) + if (this.options.loose) { + // in loose mode, throw out any that are not valid comparators + set = set.filter(function (comp) { + return !!comp.match(compRe) + }) + } + set = set.map(function (comp) { + return new Comparator(comp, this.options) + }, this) + + return set +} + +Range.prototype.intersects = function (range, options) { + if (!(range instanceof Range)) { + throw new TypeError('a Range is required') + } + + return this.set.some(function (thisComparators) { + return thisComparators.every(function (thisComparator) { + return range.set.some(function (rangeComparators) { + return rangeComparators.every(function (rangeComparator) { + return thisComparator.intersects(rangeComparator, options) + }) + }) + }) + }) +} + +// Mostly just for testing and legacy API reasons +exports.toComparators = toComparators +function toComparators (range, options) { + return new Range(range, options).set.map(function (comp) { + return comp.map(function (c) { + return c.value + }).join(' ').trim().split(' ') + }) +} + +// comprised of xranges, tildes, stars, and gtlt's at this point. +// already replaced the hyphen ranges +// turn into a set of JUST comparators. +function parseComparator (comp, options) { + debug('comp', comp, options) + comp = replaceCarets(comp, options) + debug('caret', comp) + comp = replaceTildes(comp, options) + debug('tildes', comp) + comp = replaceXRanges(comp, options) + debug('xrange', comp) + comp = replaceStars(comp, options) + debug('stars', comp) + return comp +} + +function isX (id) { + return !id || id.toLowerCase() === 'x' || id === '*' +} + +// ~, ~> --> * (any, kinda silly) +// ~2, ~2.x, ~2.x.x, ~>2, ~>2.x ~>2.x.x --> >=2.0.0 <3.0.0 +// ~2.0, ~2.0.x, ~>2.0, ~>2.0.x --> >=2.0.0 <2.1.0 +// ~1.2, ~1.2.x, ~>1.2, ~>1.2.x --> >=1.2.0 <1.3.0 +// ~1.2.3, ~>1.2.3 --> >=1.2.3 <1.3.0 +// ~1.2.0, ~>1.2.0 --> >=1.2.0 <1.3.0 +function replaceTildes (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceTilde(comp, options) + }).join(' ') +} + +function replaceTilde (comp, options) { + var r = options.loose ? re[TILDELOOSE] : re[TILDE] + return comp.replace(r, function (_, M, m, p, pr) { + debug('tilde', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + // ~1.2 == >=1.2.0 <1.3.0 + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else if (pr) { + debug('replaceTilde pr', pr) + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } else { + // ~1.2.3 == >=1.2.3 <1.3.0 + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' 
+ (+m + 1) + '.0' + } + + debug('tilde return', ret) + return ret + }) +} + +// ^ --> * (any, kinda silly) +// ^2, ^2.x, ^2.x.x --> >=2.0.0 <3.0.0 +// ^2.0, ^2.0.x --> >=2.0.0 <3.0.0 +// ^1.2, ^1.2.x --> >=1.2.0 <2.0.0 +// ^1.2.3 --> >=1.2.3 <2.0.0 +// ^1.2.0 --> >=1.2.0 <2.0.0 +function replaceCarets (comp, options) { + return comp.trim().split(/\s+/).map(function (comp) { + return replaceCaret(comp, options) + }).join(' ') +} + +function replaceCaret (comp, options) { + debug('caret', comp, options) + var r = options.loose ? re[CARETLOOSE] : re[CARET] + return comp.replace(r, function (_, M, m, p, pr) { + debug('caret', comp, _, M, m, p, pr) + var ret + + if (isX(M)) { + ret = '' + } else if (isX(m)) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (isX(p)) { + if (M === '0') { + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } else { + ret = '>=' + M + '.' + m + '.0 <' + (+M + 1) + '.0.0' + } + } else if (pr) { + debug('replaceCaret pr', pr) + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + '-' + pr + + ' <' + (+M + 1) + '.0.0' + } + } else { + debug('no pr') + if (M === '0') { + if (m === '0') { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + m + '.' + (+p + 1) + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + M + '.' + (+m + 1) + '.0' + } + } else { + ret = '>=' + M + '.' + m + '.' + p + + ' <' + (+M + 1) + '.0.0' + } + } + + debug('caret return', ret) + return ret + }) +} + +function replaceXRanges (comp, options) { + debug('replaceXRanges', comp, options) + return comp.split(/\s+/).map(function (comp) { + return replaceXRange(comp, options) + }).join(' ') +} + +function replaceXRange (comp, options) { + comp = comp.trim() + var r = options.loose ? re[XRANGELOOSE] : re[XRANGE] + return comp.replace(r, function (ret, gtlt, M, m, p, pr) { + debug('xRange', comp, ret, gtlt, M, m, p, pr) + var xM = isX(M) + var xm = xM || isX(m) + var xp = xm || isX(p) + var anyX = xp + + if (gtlt === '=' && anyX) { + gtlt = '' + } + + if (xM) { + if (gtlt === '>' || gtlt === '<') { + // nothing is allowed + ret = '<0.0.0' + } else { + // nothing is forbidden + ret = '*' + } + } else if (gtlt && anyX) { + // we know patch is an x, because we have any x at all. + // replace X with 0 + if (xm) { + m = 0 + } + p = 0 + + if (gtlt === '>') { + // >1 => >=2.0.0 + // >1.2 => >=1.3.0 + // >1.2.3 => >= 1.2.4 + gtlt = '>=' + if (xm) { + M = +M + 1 + m = 0 + p = 0 + } else { + m = +m + 1 + p = 0 + } + } else if (gtlt === '<=') { + // <=0.7.x is actually <0.8.0, since any 0.7.x should + // pass. Similarly, <=7.x is actually <8.0.0, etc. + gtlt = '<' + if (xm) { + M = +M + 1 + } else { + m = +m + 1 + } + } + + ret = gtlt + M + '.' + m + '.' + p + } else if (xm) { + ret = '>=' + M + '.0.0 <' + (+M + 1) + '.0.0' + } else if (xp) { + ret = '>=' + M + '.' + m + '.0 <' + M + '.' + (+m + 1) + '.0' + } + + debug('xRange return', ret) + + return ret + }) +} + +// Because * is AND-ed with everything else in the comparator, +// and '' means "any version", just remove the *s entirely. +function replaceStars (comp, options) { + debug('replaceStars', comp, options) + // Looseness is ignored here. star is always as loose as it gets! 
+ return comp.trim().replace(re[STAR], '') +} + +// This function is passed to string.replace(re[HYPHENRANGE]) +// M, m, patch, prerelease, build +// 1.2 - 3.4.5 => >=1.2.0 <=3.4.5 +// 1.2.3 - 3.4 => >=1.2.0 <3.5.0 Any 3.4.x will do +// 1.2 - 3.4 => >=1.2.0 <3.5.0 +function hyphenReplace ($0, + from, fM, fm, fp, fpr, fb, + to, tM, tm, tp, tpr, tb) { + if (isX(fM)) { + from = '' + } else if (isX(fm)) { + from = '>=' + fM + '.0.0' + } else if (isX(fp)) { + from = '>=' + fM + '.' + fm + '.0' + } else { + from = '>=' + from + } + + if (isX(tM)) { + to = '' + } else if (isX(tm)) { + to = '<' + (+tM + 1) + '.0.0' + } else if (isX(tp)) { + to = '<' + tM + '.' + (+tm + 1) + '.0' + } else if (tpr) { + to = '<=' + tM + '.' + tm + '.' + tp + '-' + tpr + } else { + to = '<=' + to + } + + return (from + ' ' + to).trim() +} + +// if ANY of the sets match ALL of its comparators, then pass +Range.prototype.test = function (version) { + if (!version) { + return false + } + + if (typeof version === 'string') { + version = new SemVer(version, this.options) + } + + for (var i = 0; i < this.set.length; i++) { + if (testSet(this.set[i], version, this.options)) { + return true + } + } + return false +} + +function testSet (set, version, options) { + for (var i = 0; i < set.length; i++) { + if (!set[i].test(version)) { + return false + } + } + + if (version.prerelease.length && !options.includePrerelease) { + // Find the set of versions that are allowed to have prereleases + // For example, ^1.2.3-pr.1 desugars to >=1.2.3-pr.1 <2.0.0 + // That should allow `1.2.3-pr.2` to pass. + // However, `1.2.4-alpha.notready` should NOT be allowed, + // even though it's within the range set by the comparators. + for (i = 0; i < set.length; i++) { + debug(set[i].semver) + if (set[i].semver === ANY) { + continue + } + + if (set[i].semver.prerelease.length > 0) { + var allowed = set[i].semver + if (allowed.major === version.major && + allowed.minor === version.minor && + allowed.patch === version.patch) { + return true + } + } + } + + // Version has a -pre, but it's not one of the ones we like. 
+ return false + } + + return true +} + +exports.satisfies = satisfies +function satisfies (version, range, options) { + try { + range = new Range(range, options) + } catch (er) { + return false + } + return range.test(version) +} + +exports.maxSatisfying = maxSatisfying +function maxSatisfying (versions, range, options) { + var max = null + var maxSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!max || maxSV.compare(v) === -1) { + // compare(max, v, true) + max = v + maxSV = new SemVer(max, options) + } + } + }) + return max +} + +exports.minSatisfying = minSatisfying +function minSatisfying (versions, range, options) { + var min = null + var minSV = null + try { + var rangeObj = new Range(range, options) + } catch (er) { + return null + } + versions.forEach(function (v) { + if (rangeObj.test(v)) { + // satisfies(v, range, options) + if (!min || minSV.compare(v) === 1) { + // compare(min, v, true) + min = v + minSV = new SemVer(min, options) + } + } + }) + return min +} + +exports.minVersion = minVersion +function minVersion (range, loose) { + range = new Range(range, loose) + + var minver = new SemVer('0.0.0') + if (range.test(minver)) { + return minver + } + + minver = new SemVer('0.0.0-0') + if (range.test(minver)) { + return minver + } + + minver = null + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + comparators.forEach(function (comparator) { + // Clone to avoid manipulating the comparator's semver object. + var compver = new SemVer(comparator.semver.version) + switch (comparator.operator) { + case '>': + if (compver.prerelease.length === 0) { + compver.patch++ + } else { + compver.prerelease.push(0) + } + compver.raw = compver.format() + /* fallthrough */ + case '': + case '>=': + if (!minver || gt(minver, compver)) { + minver = compver + } + break + case '<': + case '<=': + /* Ignore maximum versions */ + break + /* istanbul ignore next */ + default: + throw new Error('Unexpected operation: ' + comparator.operator) + } + }) + } + + if (minver && range.test(minver)) { + return minver + } + + return null +} + +exports.validRange = validRange +function validRange (range, options) { + try { + // Return '*' instead of '' so that truthiness works. + // This will throw if it's invalid anyway + return new Range(range, options).range || '*' + } catch (er) { + return null + } +} + +// Determine if version is less than all the versions possible in the range +exports.ltr = ltr +function ltr (version, range, options) { + return outside(version, range, '<', options) +} + +// Determine if version is greater than all the versions possible in the range. +exports.gtr = gtr +function gtr (version, range, options) { + return outside(version, range, '>', options) +} + +exports.outside = outside +function outside (version, range, hilo, options) { + version = new SemVer(version, options) + range = new Range(range, options) + + var gtfn, ltefn, ltfn, comp, ecomp + switch (hilo) { + case '>': + gtfn = gt + ltefn = lte + ltfn = lt + comp = '>' + ecomp = '>=' + break + case '<': + gtfn = lt + ltefn = gte + ltfn = gt + comp = '<' + ecomp = '<=' + break + default: + throw new TypeError('Must provide a hilo val of "<" or ">"') + } + + // If it satisifes the range it is not outside + if (satisfies(version, range, options)) { + return false + } + + // From now on, variable terms are as if we're in "gtr" mode. 
+ // but note that everything is flipped for the "ltr" function. + + for (var i = 0; i < range.set.length; ++i) { + var comparators = range.set[i] + + var high = null + var low = null + + comparators.forEach(function (comparator) { + if (comparator.semver === ANY) { + comparator = new Comparator('>=0.0.0') + } + high = high || comparator + low = low || comparator + if (gtfn(comparator.semver, high.semver, options)) { + high = comparator + } else if (ltfn(comparator.semver, low.semver, options)) { + low = comparator + } + }) + + // If the edge version comparator has a operator then our version + // isn't outside it + if (high.operator === comp || high.operator === ecomp) { + return false + } + + // If the lowest version comparator has an operator and our version + // is less than it then it isn't higher than the range + if ((!low.operator || low.operator === comp) && + ltefn(version, low.semver)) { + return false + } else if (low.operator === ecomp && ltfn(version, low.semver)) { + return false + } + } + return true +} + +exports.prerelease = prerelease +function prerelease (version, options) { + var parsed = parse(version, options) + return (parsed && parsed.prerelease.length) ? parsed.prerelease : null +} + +exports.intersects = intersects +function intersects (r1, r2, options) { + r1 = new Range(r1, options) + r2 = new Range(r2, options) + return r1.intersects(r2) +} + +exports.coerce = coerce +function coerce (version) { + if (version instanceof SemVer) { + return version + } + + if (typeof version !== 'string') { + return null + } + + var match = version.match(re[COERCE]) + + if (match == null) { + return null + } + + return parse(match[1] + + '.' + (match[2] || '0') + + '.' + (match[3] || '0')) +} diff --git a/node_modules/send/HISTORY.md b/node_modules/send/HISTORY.md new file mode 100644 index 000000000..d14ac069d --- /dev/null +++ b/node_modules/send/HISTORY.md @@ -0,0 +1,496 @@ +0.17.1 / 2019-05-10 +=================== + + * Set stricter CSP header in redirect & error responses + * deps: range-parser@~1.2.1 + +0.17.0 / 2019-05-03 +=================== + + * deps: http-errors@~1.7.2 + - Set constructor name when possible + - Use `toidentifier` module to make class names + - deps: depd@~1.1.2 + - deps: setprototypeof@1.1.1 + - deps: statuses@'>= 1.5.0 < 2' + * deps: mime@1.6.0 + - Add extensions for JPEG-2000 images + - Add new `font/*` types from IANA + - Add WASM mapping + - Update `.bdoc` to `application/bdoc` + - Update `.bmp` to `image/bmp` + - Update `.m4a` to `audio/mp4` + - Update `.rtf` to `application/rtf` + - Update `.wav` to `audio/wav` + - Update `.xml` to `application/xml` + - Update generic extensions to `application/octet-stream`: + `.deb`, `.dll`, `.dmg`, `.exe`, `.iso`, `.msi` + - Use mime-score module to resolve extension conflicts + * deps: ms@2.1.1 + - Add `week`/`w` support + - Fix negative number handling + * deps: statuses@~1.5.0 + * perf: remove redundant `path.normalize` call + +0.16.2 / 2018-02-07 +=================== + + * Fix incorrect end tag in default error & redirects + * deps: depd@~1.1.2 + - perf: remove argument reassignment + * deps: encodeurl@~1.0.2 + - Fix encoding `%` as last character + * deps: statuses@~1.4.0 + +0.16.1 / 2017-09-29 +=================== + + * Fix regression in edge-case behavior for empty `path` + +0.16.0 / 2017-09-27 +=================== + + * Add `immutable` option + * Fix missing `` in default error & redirects + * Use instance methods on steam to check for listeners + * deps: mime@1.4.1 + - Add 70 new types for file 
extensions + - Set charset as "UTF-8" for .js and .json + * perf: improve path validation speed + +0.15.6 / 2017-09-22 +=================== + + * deps: debug@2.6.9 + * perf: improve `If-Match` token parsing + +0.15.5 / 2017-09-20 +=================== + + * deps: etag@~1.8.1 + - perf: replace regular expression with substring + * deps: fresh@0.5.2 + - Fix handling of modified headers with invalid dates + - perf: improve ETag match loop + - perf: improve `If-None-Match` token parsing + +0.15.4 / 2017-08-05 +=================== + + * deps: debug@2.6.8 + * deps: depd@~1.1.1 + - Remove unnecessary `Buffer` loading + * deps: http-errors@~1.6.2 + - deps: depd@1.1.1 + +0.15.3 / 2017-05-16 +=================== + + * deps: debug@2.6.7 + - deps: ms@2.0.0 + * deps: ms@2.0.0 + +0.15.2 / 2017-04-26 +=================== + + * deps: debug@2.6.4 + - Fix `DEBUG_MAX_ARRAY_LENGTH` + - deps: ms@0.7.3 + * deps: ms@1.0.0 + +0.15.1 / 2017-03-04 +=================== + + * Fix issue when `Date.parse` does not return `NaN` on invalid date + * Fix strict violation in broken environments + +0.15.0 / 2017-02-25 +=================== + + * Support `If-Match` and `If-Unmodified-Since` headers + * Add `res` and `path` arguments to `directory` event + * Remove usage of `res._headers` private field + - Improves compatibility with Node.js 8 nightly + * Send complete HTML document in redirect & error responses + * Set default CSP header in redirect & error responses + * Use `res.getHeaderNames()` when available + * Use `res.headersSent` when available + * deps: debug@2.6.1 + - Allow colors in workers + - Deprecated `DEBUG_FD` environment variable set to `3` or higher + - Fix error when running under React Native + - Use same color for same namespace + - deps: ms@0.7.2 + * deps: etag@~1.8.0 + * deps: fresh@0.5.0 + - Fix false detection of `no-cache` request directive + - Fix incorrect result when `If-None-Match` has both `*` and ETags + - Fix weak `ETag` matching to match spec + - perf: delay reading header values until needed + - perf: enable strict mode + - perf: hoist regular expressions + - perf: remove duplicate conditional + - perf: remove unnecessary boolean coercions + - perf: skip checking modified time if ETag check failed + - perf: skip parsing `If-None-Match` when no `ETag` header + - perf: use `Date.parse` instead of `new Date` + * deps: http-errors@~1.6.1 + - Make `message` property enumerable for `HttpError`s + - deps: setprototypeof@1.0.3 + +0.14.2 / 2017-01-23 +=================== + + * deps: http-errors@~1.5.1 + - deps: inherits@2.0.3 + - deps: setprototypeof@1.0.2 + - deps: statuses@'>= 1.3.1 < 2' + * deps: ms@0.7.2 + * deps: statuses@~1.3.1 + +0.14.1 / 2016-06-09 +=================== + + * Fix redirect error when `path` contains raw non-URL characters + * Fix redirect when `path` starts with multiple forward slashes + +0.14.0 / 2016-06-06 +=================== + + * Add `acceptRanges` option + * Add `cacheControl` option + * Attempt to combine multiple ranges into single range + * Correctly inherit from `Stream` class + * Fix `Content-Range` header in 416 responses when using `start`/`end` options + * Fix `Content-Range` header missing from default 416 responses + * Ignore non-byte `Range` headers + * deps: http-errors@~1.5.0 + - Add `HttpError` export, for `err instanceof createError.HttpError` + - Support new code `421 Misdirected Request` + - Use `setprototypeof` module to replace `__proto__` setting + - deps: inherits@2.0.1 + - deps: statuses@'>= 1.3.0 < 2' + - perf: enable strict mode + * deps: 
range-parser@~1.2.0 + - Fix incorrectly returning -1 when there is at least one valid range + - perf: remove internal function + * deps: statuses@~1.3.0 + - Add `421 Misdirected Request` + - perf: enable strict mode + * perf: remove argument reassignment + +0.13.2 / 2016-03-05 +=================== + + * Fix invalid `Content-Type` header when `send.mime.default_type` unset + +0.13.1 / 2016-01-16 +=================== + + * deps: depd@~1.1.0 + - Support web browser loading + - perf: enable strict mode + * deps: destroy@~1.0.4 + - perf: enable strict mode + * deps: escape-html@~1.0.3 + - perf: enable strict mode + - perf: optimize string replacement + - perf: use faster string coercion + * deps: range-parser@~1.0.3 + - perf: enable strict mode + +0.13.0 / 2015-06-16 +=================== + + * Allow Node.js HTTP server to set `Date` response header + * Fix incorrectly removing `Content-Location` on 304 response + * Improve the default redirect response headers + * Send appropriate headers on default error response + * Use `http-errors` for standard emitted errors + * Use `statuses` instead of `http` module for status messages + * deps: escape-html@1.0.2 + * deps: etag@~1.7.0 + - Improve stat performance by removing hashing + * deps: fresh@0.3.0 + - Add weak `ETag` matching support + * deps: on-finished@~2.3.0 + - Add defined behavior for HTTP `CONNECT` requests + - Add defined behavior for HTTP `Upgrade` requests + - deps: ee-first@1.1.1 + * perf: enable strict mode + * perf: remove unnecessary array allocations + +0.12.3 / 2015-05-13 +=================== + + * deps: debug@~2.2.0 + - deps: ms@0.7.1 + * deps: depd@~1.0.1 + * deps: etag@~1.6.0 + - Improve support for JXcore + - Support "fake" stats objects in environments without `fs` + * deps: ms@0.7.1 + - Prevent extraordinarily long inputs + * deps: on-finished@~2.2.1 + +0.12.2 / 2015-03-13 +=================== + + * Throw errors early for invalid `extensions` or `index` options + * deps: debug@~2.1.3 + - Fix high intensity foreground color for bold + - deps: ms@0.7.0 + +0.12.1 / 2015-02-17 +=================== + + * Fix regression sending zero-length files + +0.12.0 / 2015-02-16 +=================== + + * Always read the stat size from the file + * Fix mutating passed-in `options` + * deps: mime@1.3.4 + +0.11.1 / 2015-01-20 +=================== + + * Fix `root` path disclosure + +0.11.0 / 2015-01-05 +=================== + + * deps: debug@~2.1.1 + * deps: etag@~1.5.1 + - deps: crc@3.2.1 + * deps: ms@0.7.0 + - Add `milliseconds` + - Add `msecs` + - Add `secs` + - Add `mins` + - Add `hrs` + - Add `yrs` + * deps: on-finished@~2.2.0 + +0.10.1 / 2014-10-22 +=================== + + * deps: on-finished@~2.1.1 + - Fix handling of pipelined requests + +0.10.0 / 2014-10-15 +=================== + + * deps: debug@~2.1.0 + - Implement `DEBUG_FD` env variable support + * deps: depd@~1.0.0 + * deps: etag@~1.5.0 + - Improve string performance + - Slightly improve speed for weak ETags over 1KB + +0.9.3 / 2014-09-24 +================== + + * deps: etag@~1.4.0 + - Support "fake" stats objects + +0.9.2 / 2014-09-15 +================== + + * deps: depd@0.4.5 + * deps: etag@~1.3.1 + * deps: range-parser@~1.0.2 + +0.9.1 / 2014-09-07 +================== + + * deps: fresh@0.2.4 + +0.9.0 / 2014-09-07 +================== + + * Add `lastModified` option + * Use `etag` to generate `ETag` header + * deps: debug@~2.0.0 + +0.8.5 / 2014-09-04 +================== + + * Fix malicious path detection for empty string path + +0.8.4 / 2014-09-04 +================== + + * Fix a path 
traversal issue when using `root` + +0.8.3 / 2014-08-16 +================== + + * deps: destroy@1.0.3 + - renamed from dethroy + * deps: on-finished@2.1.0 + +0.8.2 / 2014-08-14 +================== + + * Work around `fd` leak in Node.js 0.10 for `fs.ReadStream` + * deps: dethroy@1.0.2 + +0.8.1 / 2014-08-05 +================== + + * Fix `extensions` behavior when file already has extension + +0.8.0 / 2014-08-05 +================== + + * Add `extensions` option + +0.7.4 / 2014-08-04 +================== + + * Fix serving index files without root dir + +0.7.3 / 2014-07-29 +================== + + * Fix incorrect 403 on Windows and Node.js 0.11 + +0.7.2 / 2014-07-27 +================== + + * deps: depd@0.4.4 + - Work-around v8 generating empty stack traces + +0.7.1 / 2014-07-26 +================== + + * deps: depd@0.4.3 + - Fix exception when global `Error.stackTraceLimit` is too low + +0.7.0 / 2014-07-20 +================== + + * Deprecate `hidden` option; use `dotfiles` option + * Add `dotfiles` option + * deps: debug@1.0.4 + * deps: depd@0.4.2 + - Add `TRACE_DEPRECATION` environment variable + - Remove non-standard grey color from color output + - Support `--no-deprecation` argument + - Support `--trace-deprecation` argument + +0.6.0 / 2014-07-11 +================== + + * Deprecate `from` option; use `root` option + * Deprecate `send.etag()` -- use `etag` in `options` + * Deprecate `send.hidden()` -- use `hidden` in `options` + * Deprecate `send.index()` -- use `index` in `options` + * Deprecate `send.maxage()` -- use `maxAge` in `options` + * Deprecate `send.root()` -- use `root` in `options` + * Cap `maxAge` value to 1 year + * deps: debug@1.0.3 + - Add support for multiple wildcards in namespaces + +0.5.0 / 2014-06-28 +================== + + * Accept string for `maxAge` (converted by `ms`) + * Add `headers` event + * Include link in default redirect response + * Use `EventEmitter.listenerCount` to count listeners + +0.4.3 / 2014-06-11 +================== + + * Do not throw un-catchable error on file open race condition + * Use `escape-html` for HTML escaping + * deps: debug@1.0.2 + - fix some debugging output colors on node.js 0.8 + * deps: finished@1.2.2 + * deps: fresh@0.2.2 + +0.4.2 / 2014-06-09 +================== + + * fix "event emitter leak" warnings + * deps: debug@1.0.1 + * deps: finished@1.2.1 + +0.4.1 / 2014-06-02 +================== + + * Send `max-age` in `Cache-Control` in correct format + +0.4.0 / 2014-05-27 +================== + + * Calculate ETag with md5 for reduced collisions + * Fix wrong behavior when index file matches directory + * Ignore stream errors after request ends + - Goodbye `EBADF, read` + * Skip directories in index file search + * deps: debug@0.8.1 + +0.3.0 / 2014-04-24 +================== + + * Fix sending files with dots without root set + * Coerce option types + * Accept API options in options object + * Set etags to "weak" + * Include file path in etag + * Make "Can't set headers after they are sent." 
catchable + * Send full entity-body for multi range requests + * Default directory access to 403 when index disabled + * Support multiple index paths + * Support "If-Range" header + * Control whether to generate etags + * deps: mime@1.2.11 + +0.2.0 / 2014-01-29 +================== + + * update range-parser and fresh + +0.1.4 / 2013-08-11 +================== + + * update fresh + +0.1.3 / 2013-07-08 +================== + + * Revert "Fix fd leak" + +0.1.2 / 2013-07-03 +================== + + * Fix fd leak + +0.1.0 / 2012-08-25 +================== + + * add options parameter to send() that is passed to fs.createReadStream() [kanongil] + +0.0.4 / 2012-08-16 +================== + + * allow custom "Accept-Ranges" definition + +0.0.3 / 2012-07-16 +================== + + * fix normalization of the root directory. Closes #3 + +0.0.2 / 2012-07-09 +================== + + * add passing of req explicitly for now (YUCK) + +0.0.1 / 2010-01-03 +================== + + * Initial release diff --git a/node_modules/send/LICENSE b/node_modules/send/LICENSE new file mode 100644 index 000000000..4aa69e83d --- /dev/null +++ b/node_modules/send/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2012 TJ Holowaychuk +Copyright (c) 2014-2016 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/send/README.md b/node_modules/send/README.md new file mode 100644 index 000000000..179e8c32f --- /dev/null +++ b/node_modules/send/README.md @@ -0,0 +1,329 @@ +# send + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Linux Build][travis-image]][travis-url] +[![Windows Build][appveyor-image]][appveyor-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Send is a library for streaming files from the file system as a http response +supporting partial responses (Ranges), conditional-GET negotiation (If-Match, +If-Unmodified-Since, If-None-Match, If-Modified-Since), high test coverage, +and granular events which may be leveraged to take appropriate actions in your +application or framework. + +Looking to serve up entire folders mapped to URLs? Try [serve-static](https://www.npmjs.org/package/serve-static). + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). 
Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```bash +$ npm install send +``` + +## API + + + +```js +var send = require('send') +``` + +### send(req, path, [options]) + +Create a new `SendStream` for the given path to send to a `res`. The `req` is +the Node.js HTTP request and the `path` is a urlencoded path to send (urlencoded, +not the actual file-system path). + +#### Options + +##### acceptRanges + +Enable or disable accepting ranged requests, defaults to true. +Disabling this will not send `Accept-Ranges` and ignore the contents +of the `Range` request header. + +##### cacheControl + +Enable or disable setting `Cache-Control` response header, defaults to +true. Disabling this will ignore the `immutable` and `maxAge` options. + +##### dotfiles + +Set how "dotfiles" are treated when encountered. A dotfile is a file +or directory that begins with a dot ("."). Note this check is done on +the path itself without checking if the path actually exists on the +disk. If `root` is specified, only the dotfiles above the root are +checked (i.e. the root itself can be within a dotfile when set +to "deny"). + + - `'allow'` No special treatment for dotfiles. + - `'deny'` Send a 403 for any request for a dotfile. + - `'ignore'` Pretend like the dotfile does not exist and 404. + +The default value is _similar_ to `'ignore'`, with the exception that +this default will not ignore the files within a directory that begins +with a dot, for backward-compatibility. + +##### end + +Byte offset at which the stream ends, defaults to the length of the file +minus 1. The end is inclusive in the stream, meaning `end: 3` will include +the 4th byte in the stream. + +##### etag + +Enable or disable etag generation, defaults to true. + +##### extensions + +If a given file doesn't exist, try appending one of the given extensions, +in the given order. By default, this is disabled (set to `false`). An +example value that will serve extension-less HTML files: `['html', 'htm']`. +This is skipped if the requested file already has an extension. + +##### immutable + +Enable or disable the `immutable` directive in the `Cache-Control` response +header, defaults to `false`. If set to `true`, the `maxAge` option should +also be specified to enable caching. The `immutable` directive will prevent +supported clients from making conditional requests during the life of the +`maxAge` option to check if the file has changed. + +##### index + +By default send supports "index.html" files, to disable this +set `false` or to supply a new index pass a string or an array +in preferred order. + +##### lastModified + +Enable or disable `Last-Modified` header, defaults to true. Uses the file +system's last modified value. + +##### maxAge + +Provide a max-age in milliseconds for http caching, defaults to 0. +This can also be a string accepted by the +[ms](https://www.npmjs.org/package/ms#readme) module. + +##### root + +Serve files relative to `path`. + +##### start + +Byte offset at which the stream starts, defaults to 0. The start is inclusive, +meaning `start: 2` will include the 3rd byte in the stream.
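As a minimal sketch, assuming the same hypothetical `/www/public` root and the `parseurl` helper used in the examples further below, several of the documented options can be passed together in one call:

```js
var http = require('http')
var parseUrl = require('parseurl')
var send = require('send')

var server = http.createServer(function onRequest (req, res) {
  // All option values here are placeholders; any subset of the
  // options documented above can be supplied in the same way.
  send(req, parseUrl(req).pathname, {
    root: '/www/public',   // resolve the request path against this directory
    dotfiles: 'deny',      // respond with 403 to any dotfile request
    extensions: ['html'],  // try "<path>.html" when the exact file does not exist
    index: ['index.html'], // file(s) to look for when a directory is requested
    maxAge: '1d',          // Cache-Control max-age, as an ms-style string
    immutable: true        // adds the immutable directive; meaningful only with maxAge
  })
    .pipe(res)
})

server.listen(3000)
```

The object returned by `send()` is the `SendStream` whose events are listed next, so listeners such as `.on('error', fn)` can be attached before piping, as the error-handling example further below shows.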
+ +#### Events + +The `SendStream` is an event emitter and will emit the following events: + + - `error` an error occurred `(err)` + - `directory` a directory was requested `(res, path)` + - `file` a file was requested `(path, stat)` + - `headers` the headers are about to be set on a file `(res, path, stat)` + - `stream` file streaming has started `(stream)` + - `end` streaming has completed + +#### .pipe + +The `pipe` method is used to pipe the response into the Node.js HTTP response +object, typically `send(req, path, options).pipe(res)`. + +### .mime + +The `mime` export is the global instance of the +[`mime` npm module](https://www.npmjs.com/package/mime). + +This is used to configure the MIME types that are associated with file extensions +as well as other options for how to resolve the MIME type of a file (like the +default type to use for an unknown file extension). + +## Error-handling + +By default when no `error` listeners are present an automatic response will be +made, otherwise you have full control over the response, aka you may show a 5xx +page etc. + +## Caching + +It does _not_ perform internal caching, you should use a reverse proxy cache +such as Varnish for this, or those fancy things called CDNs. If your +application is small enough that it would benefit from single-node memory +caching, it's small enough that it does not need caching at all ;). + +## Debugging + +To enable `debug()` instrumentation output export __DEBUG__: + +``` +$ DEBUG=send node app +``` + +## Running tests + +``` +$ npm install +$ npm test +``` + +## Examples + +### Serve a specific file + +This simple example will send a specific file to all requests. + +```js +var http = require('http') +var send = require('send') + +var server = http.createServer(function onRequest (req, res) { + send(req, '/path/to/index.html') + .pipe(res) +}) + +server.listen(3000) +``` + +### Serve all files from a directory + +This simple example will just serve up all the files in a +given directory as the top-level. For example, a request +`GET /foo.txt` will send back `/www/public/foo.txt`. + +```js +var http = require('http') +var parseUrl = require('parseurl') +var send = require('send') + +var server = http.createServer(function onRequest (req, res) { + send(req, parseUrl(req).pathname, { root: '/www/public' }) + .pipe(res) +}) + +server.listen(3000) +``` + +### Custom file types + +```js +var http = require('http') +var parseUrl = require('parseurl') +var send = require('send') + +// Default unknown types to text/plain +send.mime.default_type = 'text/plain' + +// Add a custom type +send.mime.define({ + 'application/x-my-type': ['x-mt', 'x-mtt'] +}) + +var server = http.createServer(function onRequest (req, res) { + send(req, parseUrl(req).pathname, { root: '/www/public' }) + .pipe(res) +}) + +server.listen(3000) +``` + +### Custom directory index view + +This is an example of serving up a structure of directories with a +custom function to render a listing of a directory.
+ +```js +var http = require('http') +var fs = require('fs') +var parseUrl = require('parseurl') +var send = require('send') + +// Transfer arbitrary files from within /www/example.com/public/* +// with a custom handler for directory listing +var server = http.createServer(function onRequest (req, res) { + send(req, parseUrl(req).pathname, { index: false, root: '/www/public' }) + .once('directory', directory) + .pipe(res) +}) + +server.listen(3000) + +// Custom directory handler +function directory (res, path) { + var stream = this + + // redirect to trailing slash for consistent url + if (!stream.hasTrailingSlash()) { + return stream.redirect(path) + } + + // get directory list + fs.readdir(path, function onReaddir (err, list) { + if (err) return stream.error(err) + + // render an index for the directory + res.setHeader('Content-Type', 'text/plain; charset=UTF-8') + res.end(list.join('\n') + '\n') + }) +} +``` + +### Serving from a root directory with custom error-handling + +```js +var http = require('http') +var parseUrl = require('parseurl') +var send = require('send') + +var server = http.createServer(function onRequest (req, res) { + // your custom error-handling logic: + function error (err) { + res.statusCode = err.status || 500 + res.end(err.message) + } + + // your custom headers + function headers (res, path, stat) { + // serve all files for download + res.setHeader('Content-Disposition', 'attachment') + } + + // your custom directory handling logic: + function redirect () { + res.statusCode = 301 + res.setHeader('Location', req.url + '/') + res.end('Redirecting to ' + req.url + '/') + } + + // transfer arbitrary files from within + // /www/example.com/public/* + send(req, parseUrl(req).pathname, { root: '/www/public' }) + .on('error', error) + .on('directory', redirect) + .on('headers', headers) + .pipe(res) +}) + +server.listen(3000) +``` + +## License + +[MIT](LICENSE) + +[appveyor-image]: https://badgen.net/appveyor/ci/dougwilson/send/master?label=windows +[appveyor-url]: https://ci.appveyor.com/project/dougwilson/send +[coveralls-image]: https://badgen.net/coveralls/c/github/pillarjs/send/master +[coveralls-url]: https://coveralls.io/r/pillarjs/send?branch=master +[node-image]: https://badgen.net/npm/node/send +[node-url]: https://nodejs.org/en/download/ +[npm-downloads-image]: https://badgen.net/npm/dm/send +[npm-url]: https://npmjs.org/package/send +[npm-version-image]: https://badgen.net/npm/v/send +[travis-image]: https://badgen.net/travis/pillarjs/send/master?label=linux +[travis-url]: https://travis-ci.org/pillarjs/send diff --git a/node_modules/send/index.js b/node_modules/send/index.js new file mode 100644 index 000000000..fca21121b --- /dev/null +++ b/node_modules/send/index.js @@ -0,0 +1,1129 @@ +/*! + * send + * Copyright(c) 2012 TJ Holowaychuk + * Copyright(c) 2014-2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. 
+ * @private + */ + +var createError = require('http-errors') +var debug = require('debug')('send') +var deprecate = require('depd')('send') +var destroy = require('destroy') +var encodeUrl = require('encodeurl') +var escapeHtml = require('escape-html') +var etag = require('etag') +var fresh = require('fresh') +var fs = require('fs') +var mime = require('mime') +var ms = require('ms') +var onFinished = require('on-finished') +var parseRange = require('range-parser') +var path = require('path') +var statuses = require('statuses') +var Stream = require('stream') +var util = require('util') + +/** + * Path function references. + * @private + */ + +var extname = path.extname +var join = path.join +var normalize = path.normalize +var resolve = path.resolve +var sep = path.sep + +/** + * Regular expression for identifying a bytes Range header. + * @private + */ + +var BYTES_RANGE_REGEXP = /^ *bytes=/ + +/** + * Maximum value allowed for the max age. + * @private + */ + +var MAX_MAXAGE = 60 * 60 * 24 * 365 * 1000 // 1 year + +/** + * Regular expression to match a path with a directory up component. + * @private + */ + +var UP_PATH_REGEXP = /(?:^|[\\/])\.\.(?:[\\/]|$)/ + +/** + * Module exports. + * @public + */ + +module.exports = send +module.exports.mime = mime + +/** + * Return a `SendStream` for `req` and `path`. + * + * @param {object} req + * @param {string} path + * @param {object} [options] + * @return {SendStream} + * @public + */ + +function send (req, path, options) { + return new SendStream(req, path, options) +} + +/** + * Initialize a `SendStream` with the given `path`. + * + * @param {Request} req + * @param {String} path + * @param {object} [options] + * @private + */ + +function SendStream (req, path, options) { + Stream.call(this) + + var opts = options || {} + + this.options = opts + this.path = path + this.req = req + + this._acceptRanges = opts.acceptRanges !== undefined + ? Boolean(opts.acceptRanges) + : true + + this._cacheControl = opts.cacheControl !== undefined + ? Boolean(opts.cacheControl) + : true + + this._etag = opts.etag !== undefined + ? Boolean(opts.etag) + : true + + this._dotfiles = opts.dotfiles !== undefined + ? opts.dotfiles + : 'ignore' + + if (this._dotfiles !== 'ignore' && this._dotfiles !== 'allow' && this._dotfiles !== 'deny') { + throw new TypeError('dotfiles option must be "allow", "deny", or "ignore"') + } + + this._hidden = Boolean(opts.hidden) + + if (opts.hidden !== undefined) { + deprecate('hidden: use dotfiles: \'' + (this._hidden ? 'allow' : 'ignore') + '\' instead') + } + + // legacy support + if (opts.dotfiles === undefined) { + this._dotfiles = undefined + } + + this._extensions = opts.extensions !== undefined + ? normalizeList(opts.extensions, 'extensions option') + : [] + + this._immutable = opts.immutable !== undefined + ? Boolean(opts.immutable) + : false + + this._index = opts.index !== undefined + ? normalizeList(opts.index, 'index option') + : ['index.html'] + + this._lastModified = opts.lastModified !== undefined + ? Boolean(opts.lastModified) + : true + + this._maxage = opts.maxAge || opts.maxage + this._maxage = typeof this._maxage === 'string' + ? ms(this._maxage) + : Number(this._maxage) + this._maxage = !isNaN(this._maxage) + ? Math.min(Math.max(0, this._maxage), MAX_MAXAGE) + : 0 + + this._root = opts.root + ? resolve(opts.root) + : null + + if (!this._root && opts.from) { + this.from(opts.from) + } +} + +/** + * Inherits from `Stream`. + */ + +util.inherits(SendStream, Stream) + +/** + * Enable or disable etag generation. 
+ * + * @param {Boolean} val + * @return {SendStream} + * @api public + */ + +SendStream.prototype.etag = deprecate.function(function etag (val) { + this._etag = Boolean(val) + debug('etag %s', this._etag) + return this +}, 'send.etag: pass etag as option') + +/** + * Enable or disable "hidden" (dot) files. + * + * @param {Boolean} path + * @return {SendStream} + * @api public + */ + +SendStream.prototype.hidden = deprecate.function(function hidden (val) { + this._hidden = Boolean(val) + this._dotfiles = undefined + debug('hidden %s', this._hidden) + return this +}, 'send.hidden: use dotfiles option') + +/** + * Set index `paths`, set to a falsy + * value to disable index support. + * + * @param {String|Boolean|Array} paths + * @return {SendStream} + * @api public + */ + +SendStream.prototype.index = deprecate.function(function index (paths) { + var index = !paths ? [] : normalizeList(paths, 'paths argument') + debug('index %o', paths) + this._index = index + return this +}, 'send.index: pass index as option') + +/** + * Set root `path`. + * + * @param {String} path + * @return {SendStream} + * @api public + */ + +SendStream.prototype.root = function root (path) { + this._root = resolve(String(path)) + debug('root %s', this._root) + return this +} + +SendStream.prototype.from = deprecate.function(SendStream.prototype.root, + 'send.from: pass root as option') + +SendStream.prototype.root = deprecate.function(SendStream.prototype.root, + 'send.root: pass root as option') + +/** + * Set max-age to `maxAge`. + * + * @param {Number} maxAge + * @return {SendStream} + * @api public + */ + +SendStream.prototype.maxage = deprecate.function(function maxage (maxAge) { + this._maxage = typeof maxAge === 'string' + ? ms(maxAge) + : Number(maxAge) + this._maxage = !isNaN(this._maxage) + ? Math.min(Math.max(0, this._maxage), MAX_MAXAGE) + : 0 + debug('max-age %d', this._maxage) + return this +}, 'send.maxage: pass maxAge as option') + +/** + * Emit error with `status`. + * + * @param {number} status + * @param {Error} [err] + * @private + */ + +SendStream.prototype.error = function error (status, err) { + // emit if listeners instead of responding + if (hasListeners(this, 'error')) { + return this.emit('error', createError(status, err, { + expose: false + })) + } + + var res = this.res + var msg = statuses[status] || String(status) + var doc = createHtmlDocument('Error', escapeHtml(msg)) + + // clear existing headers + clearHeaders(res) + + // add error headers + if (err && err.headers) { + setHeaders(res, err.headers) + } + + // send basic response + res.statusCode = status + res.setHeader('Content-Type', 'text/html; charset=UTF-8') + res.setHeader('Content-Length', Buffer.byteLength(doc)) + res.setHeader('Content-Security-Policy', "default-src 'none'") + res.setHeader('X-Content-Type-Options', 'nosniff') + res.end(doc) +} + +/** + * Check if the pathname ends with "/". + * + * @return {boolean} + * @private + */ + +SendStream.prototype.hasTrailingSlash = function hasTrailingSlash () { + return this.path[this.path.length - 1] === '/' +} + +/** + * Check if this is a conditional GET request. + * + * @return {Boolean} + * @api private + */ + +SendStream.prototype.isConditionalGET = function isConditionalGET () { + return this.req.headers['if-match'] || + this.req.headers['if-unmodified-since'] || + this.req.headers['if-none-match'] || + this.req.headers['if-modified-since'] +} + +/** + * Check if the request preconditions failed. 
+ * + * @return {boolean} + * @private + */ + +SendStream.prototype.isPreconditionFailure = function isPreconditionFailure () { + var req = this.req + var res = this.res + + // if-match + var match = req.headers['if-match'] + if (match) { + var etag = res.getHeader('ETag') + return !etag || (match !== '*' && parseTokenList(match).every(function (match) { + return match !== etag && match !== 'W/' + etag && 'W/' + match !== etag + })) + } + + // if-unmodified-since + var unmodifiedSince = parseHttpDate(req.headers['if-unmodified-since']) + if (!isNaN(unmodifiedSince)) { + var lastModified = parseHttpDate(res.getHeader('Last-Modified')) + return isNaN(lastModified) || lastModified > unmodifiedSince + } + + return false +} + +/** + * Strip content-* header fields. + * + * @private + */ + +SendStream.prototype.removeContentHeaderFields = function removeContentHeaderFields () { + var res = this.res + var headers = getHeaderNames(res) + + for (var i = 0; i < headers.length; i++) { + var header = headers[i] + if (header.substr(0, 8) === 'content-' && header !== 'content-location') { + res.removeHeader(header) + } + } +} + +/** + * Respond with 304 not modified. + * + * @api private + */ + +SendStream.prototype.notModified = function notModified () { + var res = this.res + debug('not modified') + this.removeContentHeaderFields() + res.statusCode = 304 + res.end() +} + +/** + * Raise error that headers already sent. + * + * @api private + */ + +SendStream.prototype.headersAlreadySent = function headersAlreadySent () { + var err = new Error('Can\'t set headers after they are sent.') + debug('headers already sent') + this.error(500, err) +} + +/** + * Check if the request is cacheable, aka + * responded with 2xx or 304 (see RFC 2616 section 14.2{5,6}). + * + * @return {Boolean} + * @api private + */ + +SendStream.prototype.isCachable = function isCachable () { + var statusCode = this.res.statusCode + return (statusCode >= 200 && statusCode < 300) || + statusCode === 304 +} + +/** + * Handle stat() error. + * + * @param {Error} error + * @private + */ + +SendStream.prototype.onStatError = function onStatError (error) { + switch (error.code) { + case 'ENAMETOOLONG': + case 'ENOENT': + case 'ENOTDIR': + this.error(404, error) + break + default: + this.error(500, error) + break + } +} + +/** + * Check if the cache is fresh. + * + * @return {Boolean} + * @api private + */ + +SendStream.prototype.isFresh = function isFresh () { + return fresh(this.req.headers, { + 'etag': this.res.getHeader('ETag'), + 'last-modified': this.res.getHeader('Last-Modified') + }) +} + +/** + * Check if the range is fresh. + * + * @return {Boolean} + * @api private + */ + +SendStream.prototype.isRangeFresh = function isRangeFresh () { + var ifRange = this.req.headers['if-range'] + + if (!ifRange) { + return true + } + + // if-range as etag + if (ifRange.indexOf('"') !== -1) { + var etag = this.res.getHeader('ETag') + return Boolean(etag && ifRange.indexOf(etag) !== -1) + } + + // if-range as modified date + var lastModified = this.res.getHeader('Last-Modified') + return parseHttpDate(lastModified) <= parseHttpDate(ifRange) +} + +/** + * Redirect to path. 
+ * + * @param {string} path + * @private + */ + +SendStream.prototype.redirect = function redirect (path) { + var res = this.res + + if (hasListeners(this, 'directory')) { + this.emit('directory', res, path) + return + } + + if (this.hasTrailingSlash()) { + this.error(403) + return + } + + var loc = encodeUrl(collapseLeadingSlashes(this.path + '/')) + var doc = createHtmlDocument('Redirecting', 'Redirecting to ' + + escapeHtml(loc) + '') + + // redirect + res.statusCode = 301 + res.setHeader('Content-Type', 'text/html; charset=UTF-8') + res.setHeader('Content-Length', Buffer.byteLength(doc)) + res.setHeader('Content-Security-Policy', "default-src 'none'") + res.setHeader('X-Content-Type-Options', 'nosniff') + res.setHeader('Location', loc) + res.end(doc) +} + +/** + * Pipe to `res. + * + * @param {Stream} res + * @return {Stream} res + * @api public + */ + +SendStream.prototype.pipe = function pipe (res) { + // root path + var root = this._root + + // references + this.res = res + + // decode the path + var path = decode(this.path) + if (path === -1) { + this.error(400) + return res + } + + // null byte(s) + if (~path.indexOf('\0')) { + this.error(400) + return res + } + + var parts + if (root !== null) { + // normalize + if (path) { + path = normalize('.' + sep + path) + } + + // malicious path + if (UP_PATH_REGEXP.test(path)) { + debug('malicious path "%s"', path) + this.error(403) + return res + } + + // explode path parts + parts = path.split(sep) + + // join / normalize from optional root dir + path = normalize(join(root, path)) + } else { + // ".." is malicious without "root" + if (UP_PATH_REGEXP.test(path)) { + debug('malicious path "%s"', path) + this.error(403) + return res + } + + // explode path parts + parts = normalize(path).split(sep) + + // resolve the path + path = resolve(path) + } + + // dotfile handling + if (containsDotFile(parts)) { + var access = this._dotfiles + + // legacy support + if (access === undefined) { + access = parts[parts.length - 1][0] === '.' + ? (this._hidden ? 'allow' : 'ignore') + : 'allow' + } + + debug('%s dotfile "%s"', access, path) + switch (access) { + case 'allow': + break + case 'deny': + this.error(403) + return res + case 'ignore': + default: + this.error(404) + return res + } + } + + // index file support + if (this._index.length && this.hasTrailingSlash()) { + this.sendIndex(path) + return res + } + + this.sendFile(path) + return res +} + +/** + * Transfer `path`. 
+ * + * @param {String} path + * @api public + */ + +SendStream.prototype.send = function send (path, stat) { + var len = stat.size + var options = this.options + var opts = {} + var res = this.res + var req = this.req + var ranges = req.headers.range + var offset = options.start || 0 + + if (headersSent(res)) { + // impossible to send now + this.headersAlreadySent() + return + } + + debug('pipe "%s"', path) + + // set header fields + this.setHeader(path, stat) + + // set content-type + this.type(path) + + // conditional GET support + if (this.isConditionalGET()) { + if (this.isPreconditionFailure()) { + this.error(412) + return + } + + if (this.isCachable() && this.isFresh()) { + this.notModified() + return + } + } + + // adjust len to start/end options + len = Math.max(0, len - offset) + if (options.end !== undefined) { + var bytes = options.end - offset + 1 + if (len > bytes) len = bytes + } + + // Range support + if (this._acceptRanges && BYTES_RANGE_REGEXP.test(ranges)) { + // parse + ranges = parseRange(len, ranges, { + combine: true + }) + + // If-Range support + if (!this.isRangeFresh()) { + debug('range stale') + ranges = -2 + } + + // unsatisfiable + if (ranges === -1) { + debug('range unsatisfiable') + + // Content-Range + res.setHeader('Content-Range', contentRange('bytes', len)) + + // 416 Requested Range Not Satisfiable + return this.error(416, { + headers: { 'Content-Range': res.getHeader('Content-Range') } + }) + } + + // valid (syntactically invalid/multiple ranges are treated as a regular response) + if (ranges !== -2 && ranges.length === 1) { + debug('range %j', ranges) + + // Content-Range + res.statusCode = 206 + res.setHeader('Content-Range', contentRange('bytes', len, ranges[0])) + + // adjust for requested range + offset += ranges[0].start + len = ranges[0].end - ranges[0].start + 1 + } + } + + // clone options + for (var prop in options) { + opts[prop] = options[prop] + } + + // set read options + opts.start = offset + opts.end = Math.max(offset, offset + len - 1) + + // content-length + res.setHeader('Content-Length', len) + + // HEAD support + if (req.method === 'HEAD') { + res.end() + return + } + + this.stream(path, opts) +} + +/** + * Transfer file for `path`. + * + * @param {String} path + * @api private + */ +SendStream.prototype.sendFile = function sendFile (path) { + var i = 0 + var self = this + + debug('stat "%s"', path) + fs.stat(path, function onstat (err, stat) { + if (err && err.code === 'ENOENT' && !extname(path) && path[path.length - 1] !== sep) { + // not found, check extensions + return next(err) + } + if (err) return self.onStatError(err) + if (stat.isDirectory()) return self.redirect(path) + self.emit('file', path, stat) + self.send(path, stat) + }) + + function next (err) { + if (self._extensions.length <= i) { + return err + ? self.onStatError(err) + : self.error(404) + } + + var p = path + '.' + self._extensions[i++] + + debug('stat "%s"', p) + fs.stat(p, function (err, stat) { + if (err) return next(err) + if (stat.isDirectory()) return next() + self.emit('file', p, stat) + self.send(p, stat) + }) + } +} + +/** + * Transfer index for `path`. 
+ * + * @param {String} path + * @api private + */ +SendStream.prototype.sendIndex = function sendIndex (path) { + var i = -1 + var self = this + + function next (err) { + if (++i >= self._index.length) { + if (err) return self.onStatError(err) + return self.error(404) + } + + var p = join(path, self._index[i]) + + debug('stat "%s"', p) + fs.stat(p, function (err, stat) { + if (err) return next(err) + if (stat.isDirectory()) return next() + self.emit('file', p, stat) + self.send(p, stat) + }) + } + + next() +} + +/** + * Stream `path` to the response. + * + * @param {String} path + * @param {Object} options + * @api private + */ + +SendStream.prototype.stream = function stream (path, options) { + // TODO: this is all lame, refactor meeee + var finished = false + var self = this + var res = this.res + + // pipe + var stream = fs.createReadStream(path, options) + this.emit('stream', stream) + stream.pipe(res) + + // response finished, done with the fd + onFinished(res, function onfinished () { + finished = true + destroy(stream) + }) + + // error handling code-smell + stream.on('error', function onerror (err) { + // request already finished + if (finished) return + + // clean up stream + finished = true + destroy(stream) + + // error + self.onStatError(err) + }) + + // end + stream.on('end', function onend () { + self.emit('end') + }) +} + +/** + * Set content-type based on `path` + * if it hasn't been explicitly set. + * + * @param {String} path + * @api private + */ + +SendStream.prototype.type = function type (path) { + var res = this.res + + if (res.getHeader('Content-Type')) return + + var type = mime.lookup(path) + + if (!type) { + debug('no content-type') + return + } + + var charset = mime.charsets.lookup(type) + + debug('content-type %s', type) + res.setHeader('Content-Type', type + (charset ? '; charset=' + charset : '')) +} + +/** + * Set response header fields, most + * fields may be pre-defined. + * + * @param {String} path + * @param {Object} stat + * @api private + */ + +SendStream.prototype.setHeader = function setHeader (path, stat) { + var res = this.res + + this.emit('headers', res, path, stat) + + if (this._acceptRanges && !res.getHeader('Accept-Ranges')) { + debug('accept ranges') + res.setHeader('Accept-Ranges', 'bytes') + } + + if (this._cacheControl && !res.getHeader('Cache-Control')) { + var cacheControl = 'public, max-age=' + Math.floor(this._maxage / 1000) + + if (this._immutable) { + cacheControl += ', immutable' + } + + debug('cache-control %s', cacheControl) + res.setHeader('Cache-Control', cacheControl) + } + + if (this._lastModified && !res.getHeader('Last-Modified')) { + var modified = stat.mtime.toUTCString() + debug('modified %s', modified) + res.setHeader('Last-Modified', modified) + } + + if (this._etag && !res.getHeader('ETag')) { + var val = etag(stat) + debug('etag %s', val) + res.setHeader('ETag', val) + } +} + +/** + * Clear all headers from a response. + * + * @param {object} res + * @private + */ + +function clearHeaders (res) { + var headers = getHeaderNames(res) + + for (var i = 0; i < headers.length; i++) { + res.removeHeader(headers[i]) + } +} + +/** + * Collapse all leading slashes into a single slash + * + * @param {string} str + * @private + */ +function collapseLeadingSlashes (str) { + for (var i = 0; i < str.length; i++) { + if (str[i] !== '/') { + break + } + } + + return i > 1 + ? '/' + str.substr(i) + : str +} + +/** + * Determine if path parts contain a dotfile. 
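+ *
+ * A dotfile here is any path segment longer than one character that starts
+ * with ".", so a bare "." segment does not count.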
+ * + * @api private + */ + +function containsDotFile (parts) { + for (var i = 0; i < parts.length; i++) { + var part = parts[i] + if (part.length > 1 && part[0] === '.') { + return true + } + } + + return false +} + +/** + * Create a Content-Range header. + * + * @param {string} type + * @param {number} size + * @param {array} [range] + */ + +function contentRange (type, size, range) { + return type + ' ' + (range ? range.start + '-' + range.end : '*') + '/' + size +} + +/** + * Create a minimal HTML document. + * + * @param {string} title + * @param {string} body + * @private + */ + +function createHtmlDocument (title, body) { + return '\n' + + '\n' + + '\n' + + '\n' + + '' + title + '\n' + + '\n' + + '\n' + + '
<pre>' + body + '</pre>
\n' + + '\n' + + '\n' +} + +/** + * decodeURIComponent. + * + * Allows V8 to only deoptimize this fn instead of all + * of send(). + * + * @param {String} path + * @api private + */ + +function decode (path) { + try { + return decodeURIComponent(path) + } catch (err) { + return -1 + } +} + +/** + * Get the header names on a respnse. + * + * @param {object} res + * @returns {array[string]} + * @private + */ + +function getHeaderNames (res) { + return typeof res.getHeaderNames !== 'function' + ? Object.keys(res._headers || {}) + : res.getHeaderNames() +} + +/** + * Determine if emitter has listeners of a given type. + * + * The way to do this check is done three different ways in Node.js >= 0.8 + * so this consolidates them into a minimal set using instance methods. + * + * @param {EventEmitter} emitter + * @param {string} type + * @returns {boolean} + * @private + */ + +function hasListeners (emitter, type) { + var count = typeof emitter.listenerCount !== 'function' + ? emitter.listeners(type).length + : emitter.listenerCount(type) + + return count > 0 +} + +/** + * Determine if the response headers have been sent. + * + * @param {object} res + * @returns {boolean} + * @private + */ + +function headersSent (res) { + return typeof res.headersSent !== 'boolean' + ? Boolean(res._header) + : res.headersSent +} + +/** + * Normalize the index option into an array. + * + * @param {boolean|string|array} val + * @param {string} name + * @private + */ + +function normalizeList (val, name) { + var list = [].concat(val || []) + + for (var i = 0; i < list.length; i++) { + if (typeof list[i] !== 'string') { + throw new TypeError(name + ' must be array of strings or false') + } + } + + return list +} + +/** + * Parse an HTTP Date into a number. + * + * @param {string} date + * @private + */ + +function parseHttpDate (date) { + var timestamp = date && Date.parse(date) + + return typeof timestamp === 'number' + ? timestamp + : NaN +} + +/** + * Parse a HTTP token list. + * + * @param {string} str + * @private + */ + +function parseTokenList (str) { + var end = 0 + var list = [] + var start = 0 + + // gather tokens + for (var i = 0, len = str.length; i < len; i++) { + switch (str.charCodeAt(i)) { + case 0x20: /* */ + if (start === end) { + start = end = i + 1 + } + break + case 0x2c: /* , */ + list.push(str.substring(start, end)) + start = end = i + 1 + break + default: + end = i + 1 + break + } + } + + // final token + list.push(str.substring(start, end)) + + return list +} + +/** + * Set an object of headers on a response. + * + * @param {object} res + * @param {object} headers + * @private + */ + +function setHeaders (res, headers) { + var keys = Object.keys(headers) + + for (var i = 0; i < keys.length; i++) { + var key = keys[i] + res.setHeader(key, headers[key]) + } +} diff --git a/node_modules/send/node_modules/ms/index.js b/node_modules/send/node_modules/ms/index.js new file mode 100644 index 000000000..72297501f --- /dev/null +++ b/node_modules/send/node_modules/ms/index.js @@ -0,0 +1,162 @@ +/** + * Helpers. + */ + +var s = 1000; +var m = s * 60; +var h = m * 60; +var d = h * 24; +var w = d * 7; +var y = d * 365.25; + +/** + * Parse or format the given `val`. 
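+ *
+ * Strings such as "2 days" are parsed into a number of milliseconds, while
+ * finite numbers are formatted into a short or long human-readable string.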
+ * + * Options: + * + * - `long` verbose formatting [false] + * + * @param {String|Number} val + * @param {Object} [options] + * @throws {Error} throw an error if val is not a non-empty string or a number + * @return {String|Number} + * @api public + */ + +module.exports = function(val, options) { + options = options || {}; + var type = typeof val; + if (type === 'string' && val.length > 0) { + return parse(val); + } else if (type === 'number' && isNaN(val) === false) { + return options.long ? fmtLong(val) : fmtShort(val); + } + throw new Error( + 'val is not a non-empty string or a valid number. val=' + + JSON.stringify(val) + ); +}; + +/** + * Parse the given `str` and return milliseconds. + * + * @param {String} str + * @return {Number} + * @api private + */ + +function parse(str) { + str = String(str); + if (str.length > 100) { + return; + } + var match = /^((?:\d+)?\-?\d?\.?\d+) *(milliseconds?|msecs?|ms|seconds?|secs?|s|minutes?|mins?|m|hours?|hrs?|h|days?|d|weeks?|w|years?|yrs?|y)?$/i.exec( + str + ); + if (!match) { + return; + } + var n = parseFloat(match[1]); + var type = (match[2] || 'ms').toLowerCase(); + switch (type) { + case 'years': + case 'year': + case 'yrs': + case 'yr': + case 'y': + return n * y; + case 'weeks': + case 'week': + case 'w': + return n * w; + case 'days': + case 'day': + case 'd': + return n * d; + case 'hours': + case 'hour': + case 'hrs': + case 'hr': + case 'h': + return n * h; + case 'minutes': + case 'minute': + case 'mins': + case 'min': + case 'm': + return n * m; + case 'seconds': + case 'second': + case 'secs': + case 'sec': + case 's': + return n * s; + case 'milliseconds': + case 'millisecond': + case 'msecs': + case 'msec': + case 'ms': + return n; + default: + return undefined; + } +} + +/** + * Short format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtShort(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return Math.round(ms / d) + 'd'; + } + if (msAbs >= h) { + return Math.round(ms / h) + 'h'; + } + if (msAbs >= m) { + return Math.round(ms / m) + 'm'; + } + if (msAbs >= s) { + return Math.round(ms / s) + 's'; + } + return ms + 'ms'; +} + +/** + * Long format for `ms`. + * + * @param {Number} ms + * @return {String} + * @api private + */ + +function fmtLong(ms) { + var msAbs = Math.abs(ms); + if (msAbs >= d) { + return plural(ms, msAbs, d, 'day'); + } + if (msAbs >= h) { + return plural(ms, msAbs, h, 'hour'); + } + if (msAbs >= m) { + return plural(ms, msAbs, m, 'minute'); + } + if (msAbs >= s) { + return plural(ms, msAbs, s, 'second'); + } + return ms + ' ms'; +} + +/** + * Pluralization helper. + */ + +function plural(ms, msAbs, n, name) { + var isPlural = msAbs >= n * 1.5; + return Math.round(ms / n) + ' ' + name + (isPlural ? 's' : ''); +} diff --git a/node_modules/send/node_modules/ms/license.md b/node_modules/send/node_modules/ms/license.md new file mode 100644 index 000000000..69b61253a --- /dev/null +++ b/node_modules/send/node_modules/ms/license.md @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Zeit, Inc. 
+ +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/send/node_modules/ms/package.json b/node_modules/send/node_modules/ms/package.json new file mode 100644 index 000000000..99a2de296 --- /dev/null +++ b/node_modules/send/node_modules/ms/package.json @@ -0,0 +1,69 @@ +{ + "_from": "ms@2.1.1", + "_id": "ms@2.1.1", + "_inBundle": false, + "_integrity": "sha512-tgp+dl5cGk28utYktBsrFqA7HKgrhgPsg6Z/EfhWI4gl1Hwq8B/GmY/0oXZ6nF8hDVesS/FpnYaD/kOWhYQvyg==", + "_location": "/send/ms", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "ms@2.1.1", + "name": "ms", + "escapedName": "ms", + "rawSpec": "2.1.1", + "saveSpec": null, + "fetchSpec": "2.1.1" + }, + "_requiredBy": [ + "/send" + ], + "_resolved": "https://registry.npmjs.org/ms/-/ms-2.1.1.tgz", + "_shasum": "30a5864eb3ebb0a66f2ebe6d727af06a09d86e0a", + "_spec": "ms@2.1.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\send", + "bugs": { + "url": "https://github.com/zeit/ms/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Tiny millisecond conversion utility", + "devDependencies": { + "eslint": "4.12.1", + "expect.js": "0.3.1", + "husky": "0.14.3", + "lint-staged": "5.0.0", + "mocha": "4.0.1" + }, + "eslintConfig": { + "extends": "eslint:recommended", + "env": { + "node": true, + "es6": true + } + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/zeit/ms#readme", + "license": "MIT", + "lint-staged": { + "*.js": [ + "npm run lint", + "prettier --single-quote --write", + "git add" + ] + }, + "main": "./index", + "name": "ms", + "repository": { + "type": "git", + "url": "git+https://github.com/zeit/ms.git" + }, + "scripts": { + "lint": "eslint lib/* bin/*", + "precommit": "lint-staged", + "test": "mocha tests.js" + }, + "version": "2.1.1" +} diff --git a/node_modules/send/node_modules/ms/readme.md b/node_modules/send/node_modules/ms/readme.md new file mode 100644 index 000000000..bb767293a --- /dev/null +++ b/node_modules/send/node_modules/ms/readme.md @@ -0,0 +1,60 @@ +# ms + +[![Build Status](https://travis-ci.org/zeit/ms.svg?branch=master)](https://travis-ci.org/zeit/ms) +[![Slack Channel](http://zeit-slackin.now.sh/badge.svg)](https://zeit.chat/) + +Use this package to easily convert various time formats to milliseconds. 
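+
+A minimal loading sketch (the package's entry point is `./index`, so a plain
+`require` is all that is needed before running the examples below):
+
+```js
+var ms = require('ms')
+
+ms('2 hours') // 7200000
+```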
+ +## Examples + +```js +ms('2 days') // 172800000 +ms('1d') // 86400000 +ms('10h') // 36000000 +ms('2.5 hrs') // 9000000 +ms('2h') // 7200000 +ms('1m') // 60000 +ms('5s') // 5000 +ms('1y') // 31557600000 +ms('100') // 100 +ms('-3 days') // -259200000 +ms('-1h') // -3600000 +ms('-200') // -200 +``` + +### Convert from Milliseconds + +```js +ms(60000) // "1m" +ms(2 * 60000) // "2m" +ms(-3 * 60000) // "-3m" +ms(ms('10 hours')) // "10h" +``` + +### Time Format Written-Out + +```js +ms(60000, { long: true }) // "1 minute" +ms(2 * 60000, { long: true }) // "2 minutes" +ms(-3 * 60000, { long: true }) // "-3 minutes" +ms(ms('10 hours'), { long: true }) // "10 hours" +``` + +## Features + +- Works both in [Node.js](https://nodejs.org) and in the browser +- If a number is supplied to `ms`, a string with a unit is returned +- If a string that contains the number is supplied, it returns it as a number (e.g.: it returns `100` for `'100'`) +- If you pass a string with a number and a valid unit, the number of equivalent milliseconds is returned + +## Related Packages + +- [ms.macro](https://github.com/knpwrs/ms.macro) - Run `ms` as a macro at build-time. + +## Caught a Bug? + +1. [Fork](https://help.github.com/articles/fork-a-repo/) this repository to your own GitHub account and then [clone](https://help.github.com/articles/cloning-a-repository/) it to your local device +2. Link the package to the global module directory: `npm link` +3. Within the module you want to test your local development instance of ms, just link it to the dependencies: `npm link ms`. Instead of the default one from npm, Node.js will now use your clone of ms! + +As always, you can run the tests using: `npm test` diff --git a/node_modules/send/package.json b/node_modules/send/package.json new file mode 100644 index 000000000..eba678fd4 --- /dev/null +++ b/node_modules/send/package.json @@ -0,0 +1,106 @@ +{ + "_from": "send@0.17.1", + "_id": "send@0.17.1", + "_inBundle": false, + "_integrity": "sha512-BsVKsiGcQMFwT8UxypobUKyv7irCNRHk1T0G680vk88yf6LBByGcZJOTJCrTP2xVN6yI+XjPJcNuE3V4fT9sAg==", + "_location": "/send", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "send@0.17.1", + "name": "send", + "escapedName": "send", + "rawSpec": "0.17.1", + "saveSpec": null, + "fetchSpec": "0.17.1" + }, + "_requiredBy": [ + "/express", + "/serve-static" + ], + "_resolved": "https://registry.npmjs.org/send/-/send-0.17.1.tgz", + "_shasum": "c1d8b059f7900f7466dd4938bdc44e11ddb376c8", + "_spec": "send@0.17.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "TJ Holowaychuk", + "email": "tj@vision-media.ca" + }, + "bugs": { + "url": "https://github.com/pillarjs/send/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "James Wyatt Cready", + "email": "jcready@gmail.com" + }, + { + "name": "Jesús Leganés Combarro", + "email": "piranna@gmail.com" + } + ], + "dependencies": { + "debug": "2.6.9", + "depd": "~1.1.2", + "destroy": "~1.0.4", + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "etag": "~1.8.1", + "fresh": "0.5.2", + "http-errors": "~1.7.2", + "mime": "1.6.0", + "ms": "2.1.1", + "on-finished": "~2.3.0", + "range-parser": "~1.2.1", + "statuses": "~1.5.0" + }, + "deprecated": false, + "description": "Better streaming static file server with Range and conditional-GET support", + "devDependencies": { + 
"after": "0.8.2", + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.2", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "mocha": "6.1.4", + "supertest": "4.0.2" + }, + "engines": { + "node": ">= 0.8.0" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "README.md", + "index.js" + ], + "homepage": "https://github.com/pillarjs/send#readme", + "keywords": [ + "static", + "file", + "server" + ], + "license": "MIT", + "name": "send", + "repository": { + "type": "git", + "url": "git+https://github.com/pillarjs/send.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --check-leaks --reporter spec --bail", + "test-ci": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --check-leaks --reporter spec", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --check-leaks --reporter dot" + }, + "version": "0.17.1" +} diff --git a/node_modules/serve-static/HISTORY.md b/node_modules/serve-static/HISTORY.md new file mode 100644 index 000000000..7203e4fb5 --- /dev/null +++ b/node_modules/serve-static/HISTORY.md @@ -0,0 +1,451 @@ +1.14.1 / 2019-05-10 +=================== + + * Set stricter CSP header in redirect response + * deps: send@0.17.1 + - deps: range-parser@~1.2.1 + +1.14.0 / 2019-05-07 +=================== + + * deps: parseurl@~1.3.3 + * deps: send@0.17.0 + - deps: http-errors@~1.7.2 + - deps: mime@1.6.0 + - deps: ms@2.1.1 + - deps: statuses@~1.5.0 + - perf: remove redundant `path.normalize` call + +1.13.2 / 2018-02-07 +=================== + + * Fix incorrect end tag in redirects + * deps: encodeurl@~1.0.2 + - Fix encoding `%` as last character + * deps: send@0.16.2 + - deps: depd@~1.1.2 + - deps: encodeurl@~1.0.2 + - deps: statuses@~1.4.0 + +1.13.1 / 2017-09-29 +=================== + + * Fix regression when `root` is incorrectly set to a file + * deps: send@0.16.1 + +1.13.0 / 2017-09-27 +=================== + + * deps: send@0.16.0 + - Add 70 new types for file extensions + - Add `immutable` option + - Fix missing `` in default error & redirects + - Set charset as "UTF-8" for .js and .json + - Use instance methods on steam to check for listeners + - deps: mime@1.4.1 + - perf: improve path validation speed + +1.12.6 / 2017-09-22 +=================== + + * deps: send@0.15.6 + - deps: debug@2.6.9 + - perf: improve `If-Match` token parsing + * perf: improve slash collapsing + +1.12.5 / 2017-09-21 +=================== + + * deps: parseurl@~1.3.2 + - perf: reduce overhead for full URLs + - perf: unroll the "fast-path" `RegExp` + * deps: send@0.15.5 + - Fix handling of modified headers with invalid dates + - deps: etag@~1.8.1 + - deps: fresh@0.5.2 + +1.12.4 / 2017-08-05 +=================== + + * deps: send@0.15.4 + - deps: debug@2.6.8 + - deps: depd@~1.1.1 + - deps: http-errors@~1.6.2 + +1.12.3 / 2017-05-16 +=================== + + * deps: send@0.15.3 + - deps: debug@2.6.7 + +1.12.2 / 2017-04-26 +=================== + + * deps: send@0.15.2 + - deps: debug@2.6.4 + +1.12.1 / 2017-03-04 +=================== + + * deps: send@0.15.1 + - Fix issue when `Date.parse` does not return `NaN` on invalid date + - Fix strict violation in broken environments + +1.12.0 / 2017-02-25 +=================== + + * Send complete HTML document in redirect response + * Set default CSP header in redirect response + * deps: send@0.15.0 + - Fix false detection of `no-cache` request directive + - Fix incorrect 
result when `If-None-Match` has both `*` and ETags + - Fix weak `ETag` matching to match spec + - Remove usage of `res._headers` private field + - Support `If-Match` and `If-Unmodified-Since` headers + - Use `res.getHeaderNames()` when available + - Use `res.headersSent` when available + - deps: debug@2.6.1 + - deps: etag@~1.8.0 + - deps: fresh@0.5.0 + - deps: http-errors@~1.6.1 + +1.11.2 / 2017-01-23 +=================== + + * deps: send@0.14.2 + - deps: http-errors@~1.5.1 + - deps: ms@0.7.2 + - deps: statuses@~1.3.1 + +1.11.1 / 2016-06-10 +=================== + + * Fix redirect error when `req.url` contains raw non-URL characters + * deps: send@0.14.1 + +1.11.0 / 2016-06-07 +=================== + + * Use status code 301 for redirects + * deps: send@0.14.0 + - Add `acceptRanges` option + - Add `cacheControl` option + - Attempt to combine multiple ranges into single range + - Correctly inherit from `Stream` class + - Fix `Content-Range` header in 416 responses when using `start`/`end` options + - Fix `Content-Range` header missing from default 416 responses + - Ignore non-byte `Range` headers + - deps: http-errors@~1.5.0 + - deps: range-parser@~1.2.0 + - deps: statuses@~1.3.0 + - perf: remove argument reassignment + +1.10.3 / 2016-05-30 +=================== + + * deps: send@0.13.2 + - Fix invalid `Content-Type` header when `send.mime.default_type` unset + +1.10.2 / 2016-01-19 +=================== + + * deps: parseurl@~1.3.1 + - perf: enable strict mode + +1.10.1 / 2016-01-16 +=================== + + * deps: escape-html@~1.0.3 + - perf: enable strict mode + - perf: optimize string replacement + - perf: use faster string coercion + * deps: send@0.13.1 + - deps: depd@~1.1.0 + - deps: destroy@~1.0.4 + - deps: escape-html@~1.0.3 + - deps: range-parser@~1.0.3 + +1.10.0 / 2015-06-17 +=================== + + * Add `fallthrough` option + - Allows declaring this middleware is the final destination + - Provides better integration with Express patterns + * Fix reading options from options prototype + * Improve the default redirect response headers + * deps: escape-html@1.0.2 + * deps: send@0.13.0 + - Allow Node.js HTTP server to set `Date` response header + - Fix incorrectly removing `Content-Location` on 304 response + - Improve the default redirect response headers + - Send appropriate headers on default error response + - Use `http-errors` for standard emitted errors + - Use `statuses` instead of `http` module for status messages + - deps: escape-html@1.0.2 + - deps: etag@~1.7.0 + - deps: fresh@0.3.0 + - deps: on-finished@~2.3.0 + - perf: enable strict mode + - perf: remove unnecessary array allocations + * perf: enable strict mode + * perf: remove argument reassignment + +1.9.3 / 2015-05-14 +================== + + * deps: send@0.12.3 + - deps: debug@~2.2.0 + - deps: depd@~1.0.1 + - deps: etag@~1.6.0 + - deps: ms@0.7.1 + - deps: on-finished@~2.2.1 + +1.9.2 / 2015-03-14 +================== + + * deps: send@0.12.2 + - Throw errors early for invalid `extensions` or `index` options + - deps: debug@~2.1.3 + +1.9.1 / 2015-02-17 +================== + + * deps: send@0.12.1 + - Fix regression sending zero-length files + +1.9.0 / 2015-02-16 +================== + + * deps: send@0.12.0 + - Always read the stat size from the file + - Fix mutating passed-in `options` + - deps: mime@1.3.4 + +1.8.1 / 2015-01-20 +================== + + * Fix redirect loop in Node.js 0.11.14 + * deps: send@0.11.1 + - Fix root path disclosure + +1.8.0 / 2015-01-05 +================== + + * deps: send@0.11.0 + - deps: debug@~2.1.1 + - 
deps: etag@~1.5.1 + - deps: ms@0.7.0 + - deps: on-finished@~2.2.0 + +1.7.2 / 2015-01-02 +================== + + * Fix potential open redirect when mounted at root + +1.7.1 / 2014-10-22 +================== + + * deps: send@0.10.1 + - deps: on-finished@~2.1.1 + +1.7.0 / 2014-10-15 +================== + + * deps: send@0.10.0 + - deps: debug@~2.1.0 + - deps: depd@~1.0.0 + - deps: etag@~1.5.0 + +1.6.5 / 2015-02-04 +================== + + * Fix potential open redirect when mounted at root + - Back-ported from v1.7.2 + +1.6.4 / 2014-10-08 +================== + + * Fix redirect loop when index file serving disabled + +1.6.3 / 2014-09-24 +================== + + * deps: send@0.9.3 + - deps: etag@~1.4.0 + +1.6.2 / 2014-09-15 +================== + + * deps: send@0.9.2 + - deps: depd@0.4.5 + - deps: etag@~1.3.1 + - deps: range-parser@~1.0.2 + +1.6.1 / 2014-09-07 +================== + + * deps: send@0.9.1 + - deps: fresh@0.2.4 + +1.6.0 / 2014-09-07 +================== + + * deps: send@0.9.0 + - Add `lastModified` option + - Use `etag` to generate `ETag` header + - deps: debug@~2.0.0 + +1.5.4 / 2014-09-04 +================== + + * deps: send@0.8.5 + - Fix a path traversal issue when using `root` + - Fix malicious path detection for empty string path + +1.5.3 / 2014-08-17 +================== + + * deps: send@0.8.3 + +1.5.2 / 2014-08-14 +================== + + * deps: send@0.8.2 + - Work around `fd` leak in Node.js 0.10 for `fs.ReadStream` + +1.5.1 / 2014-08-09 +================== + + * Fix parsing of weird `req.originalUrl` values + * deps: parseurl@~1.3.0 + * deps: utils-merge@1.0.0 + +1.5.0 / 2014-08-05 +================== + + * deps: send@0.8.1 + - Add `extensions` option + +1.4.4 / 2014-08-04 +================== + + * deps: send@0.7.4 + - Fix serving index files without root dir + +1.4.3 / 2014-07-29 +================== + + * deps: send@0.7.3 + - Fix incorrect 403 on Windows and Node.js 0.11 + +1.4.2 / 2014-07-27 +================== + + * deps: send@0.7.2 + - deps: depd@0.4.4 + +1.4.1 / 2014-07-26 +================== + + * deps: send@0.7.1 + - deps: depd@0.4.3 + +1.4.0 / 2014-07-21 +================== + + * deps: parseurl@~1.2.0 + - Cache URLs based on original value + - Remove no-longer-needed URL mis-parse work-around + - Simplify the "fast-path" `RegExp` + * deps: send@0.7.0 + - Add `dotfiles` option + - deps: debug@1.0.4 + - deps: depd@0.4.2 + +1.3.2 / 2014-07-11 +================== + + * deps: send@0.6.0 + - Cap `maxAge` value to 1 year + - deps: debug@1.0.3 + +1.3.1 / 2014-07-09 +================== + + * deps: parseurl@~1.1.3 + - faster parsing of href-only URLs + +1.3.0 / 2014-06-28 +================== + + * Add `setHeaders` option + * Include HTML link in redirect response + * deps: send@0.5.0 + - Accept string for `maxAge` (converted by `ms`) + +1.2.3 / 2014-06-11 +================== + + * deps: send@0.4.3 + - Do not throw un-catchable error on file open race condition + - Use `escape-html` for HTML escaping + - deps: debug@1.0.2 + - deps: finished@1.2.2 + - deps: fresh@0.2.2 + +1.2.2 / 2014-06-09 +================== + + * deps: send@0.4.2 + - fix "event emitter leak" warnings + - deps: debug@1.0.1 + - deps: finished@1.2.1 + +1.2.1 / 2014-06-02 +================== + + * use `escape-html` for escaping + * deps: send@0.4.1 + - Send `max-age` in `Cache-Control` in correct format + +1.2.0 / 2014-05-29 +================== + + * deps: send@0.4.0 + - Calculate ETag with md5 for reduced collisions + - Fix wrong behavior when index file matches directory + - Ignore stream errors after request ends + - 
Skip directories in index file search + - deps: debug@0.8.1 + +1.1.0 / 2014-04-24 +================== + + * Accept options directly to `send` module + * deps: send@0.3.0 + +1.0.4 / 2014-04-07 +================== + + * Resolve relative paths at middleware setup + * Use parseurl to parse the URL from request + +1.0.3 / 2014-03-20 +================== + + * Do not rely on connect-like environments + +1.0.2 / 2014-03-06 +================== + + * deps: send@0.2.0 + +1.0.1 / 2014-03-05 +================== + + * Add mime export for back-compat + +1.0.0 / 2014-03-05 +================== + + * Genesis from `connect` diff --git a/node_modules/serve-static/LICENSE b/node_modules/serve-static/LICENSE new file mode 100644 index 000000000..cbe62e8e7 --- /dev/null +++ b/node_modules/serve-static/LICENSE @@ -0,0 +1,25 @@ +(The MIT License) + +Copyright (c) 2010 Sencha Inc. +Copyright (c) 2011 LearnBoost +Copyright (c) 2011 TJ Holowaychuk +Copyright (c) 2014-2016 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/serve-static/README.md b/node_modules/serve-static/README.md new file mode 100644 index 000000000..7cce428cb --- /dev/null +++ b/node_modules/serve-static/README.md @@ -0,0 +1,259 @@ +# serve-static + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Linux Build][travis-image]][travis-url] +[![Windows Build][appveyor-image]][appveyor-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install serve-static +``` + +## API + + + +```js +var serveStatic = require('serve-static') +``` + +### serveStatic(root, options) + +Create a new middleware function to serve files from within a given root +directory. The file to serve will be determined by combining `req.url` +with the provided root directory. When a file is not found, instead of +sending a 404 response, this module will instead call `next()` to move on +to the next middleware, allowing for stacking and fall-backs. + +#### Options + +##### acceptRanges + +Enable or disable accepting ranged requests, defaults to true. +Disabling this will not send `Accept-Ranges` and ignore the contents +of the `Range` request header. 
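+
+For example, a minimal sketch of switching the option off (the `public/ftp`
+folder name is only a placeholder):
+
+```js
+var serveStatic = require('serve-static')
+
+// no Accept-Ranges header is sent and Range request headers are ignored
+var serve = serveStatic('public/ftp', { acceptRanges: false })
+```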
+ +##### cacheControl + +Enable or disable setting `Cache-Control` response header, defaults to +true. Disabling this will ignore the `immutable` and `maxAge` options. + +##### dotfiles + + Set how "dotfiles" are treated when encountered. A dotfile is a file +or directory that begins with a dot ("."). Note this check is done on +the path itself without checking if the path actually exists on the +disk. If `root` is specified, only the dotfiles above the root are +checked (i.e. the root itself can be within a dotfile when set +to "deny"). + + - `'allow'` No special treatment for dotfiles. + - `'deny'` Deny a request for a dotfile and 403/`next()`. + - `'ignore'` Pretend like the dotfile does not exist and 404/`next()`. + +The default value is similar to `'ignore'`, with the exception that this +default will not ignore the files within a directory that begins with a dot. + +##### etag + +Enable or disable etag generation, defaults to true. + +##### extensions + +Set file extension fallbacks. When set, if a file is not found, the given +extensions will be added to the file name and search for. The first that +exists will be served. Example: `['html', 'htm']`. + +The default value is `false`. + +##### fallthrough + +Set the middleware to have client errors fall-through as just unhandled +requests, otherwise forward a client error. The difference is that client +errors like a bad request or a request to a non-existent file will cause +this middleware to simply `next()` to your next middleware when this value +is `true`. When this value is `false`, these errors (even 404s), will invoke +`next(err)`. + +Typically `true` is desired such that multiple physical directories can be +mapped to the same web address or for routes to fill in non-existent files. + +The value `false` can be used if this middleware is mounted at a path that +is designed to be strictly a single file system directory, which allows for +short-circuiting 404s for less overhead. This middleware will also reply to +all methods. + +The default value is `true`. + +##### immutable + +Enable or disable the `immutable` directive in the `Cache-Control` response +header, defaults to `false`. If set to `true`, the `maxAge` option should +also be specified to enable caching. The `immutable` directive will prevent +supported clients from making conditional requests during the life of the +`maxAge` option to check if the file has changed. + +##### index + +By default this module will send "index.html" files in response to a request +on a directory. To disable this set `false` or to supply a new index pass a +string or an array in preferred order. + +##### lastModified + +Enable or disable `Last-Modified` header, defaults to true. Uses the file +system's last modified value. + +##### maxAge + +Provide a max-age in milliseconds for http caching, defaults to 0. This +can also be a string accepted by the [ms](https://www.npmjs.org/package/ms#readme) +module. + +##### redirect + +Redirect to trailing "/" when the pathname is a dir. Defaults to `true`. + +##### setHeaders + +Function to set custom headers on response. Alterations to the headers need to +occur synchronously. 
The function is called as `fn(res, path, stat)`, where +the arguments are: + + - `res` the response object + - `path` the file path that is being sent + - `stat` the stat object of the file that is being sent + +## Examples + +### Serve files with vanilla node.js http server + +```js +var finalhandler = require('finalhandler') +var http = require('http') +var serveStatic = require('serve-static') + +// Serve up public/ftp folder +var serve = serveStatic('public/ftp', { 'index': ['index.html', 'index.htm'] }) + +// Create server +var server = http.createServer(function onRequest (req, res) { + serve(req, res, finalhandler(req, res)) +}) + +// Listen +server.listen(3000) +``` + +### Serve all files as downloads + +```js +var contentDisposition = require('content-disposition') +var finalhandler = require('finalhandler') +var http = require('http') +var serveStatic = require('serve-static') + +// Serve up public/ftp folder +var serve = serveStatic('public/ftp', { + 'index': false, + 'setHeaders': setHeaders +}) + +// Set header to force download +function setHeaders (res, path) { + res.setHeader('Content-Disposition', contentDisposition(path)) +} + +// Create server +var server = http.createServer(function onRequest (req, res) { + serve(req, res, finalhandler(req, res)) +}) + +// Listen +server.listen(3000) +``` + +### Serving using express + +#### Simple + +This is a simple example of using Express. + +```js +var express = require('express') +var serveStatic = require('serve-static') + +var app = express() + +app.use(serveStatic('public/ftp', { 'index': ['default.html', 'default.htm'] })) +app.listen(3000) +``` + +#### Multiple roots + +This example shows a simple way to search through multiple directories. +Files are look for in `public-optimized/` first, then `public/` second as +a fallback. + +```js +var express = require('express') +var path = require('path') +var serveStatic = require('serve-static') + +var app = express() + +app.use(serveStatic(path.join(__dirname, 'public-optimized'))) +app.use(serveStatic(path.join(__dirname, 'public'))) +app.listen(3000) +``` + +#### Different settings for paths + +This example shows how to set a different max age depending on the served +file type. In this example, HTML files are not cached, while everything else +is for 1 day. 
+ +```js +var express = require('express') +var path = require('path') +var serveStatic = require('serve-static') + +var app = express() + +app.use(serveStatic(path.join(__dirname, 'public'), { + maxAge: '1d', + setHeaders: setCustomCacheControl +})) + +app.listen(3000) + +function setCustomCacheControl (res, path) { + if (serveStatic.mime.lookup(path) === 'text/html') { + // Custom Cache-Control for HTML files + res.setHeader('Cache-Control', 'public, max-age=0') + } +} +``` + +## License + +[MIT](LICENSE) + +[appveyor-image]: https://badgen.net/appveyor/ci/dougwilson/serve-static/master?label=windows +[appveyor-url]: https://ci.appveyor.com/project/dougwilson/serve-static +[coveralls-image]: https://badgen.net/coveralls/c/github/expressjs/serve-static/master +[coveralls-url]: https://coveralls.io/r/expressjs/serve-static?branch=master +[node-image]: https://badgen.net/npm/node/serve-static +[node-url]: https://nodejs.org/en/download/ +[npm-downloads-image]: https://badgen.net/npm/dm/serve-static +[npm-url]: https://npmjs.org/package/serve-static +[npm-version-image]: https://badgen.net/npm/v/serve-static +[travis-image]: https://badgen.net/travis/expressjs/serve-static/master?label=linux +[travis-url]: https://travis-ci.org/expressjs/serve-static diff --git a/node_modules/serve-static/index.js b/node_modules/serve-static/index.js new file mode 100644 index 000000000..b7d3984c4 --- /dev/null +++ b/node_modules/serve-static/index.js @@ -0,0 +1,210 @@ +/*! + * serve-static + * Copyright(c) 2010 Sencha Inc. + * Copyright(c) 2011 TJ Holowaychuk + * Copyright(c) 2014-2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var encodeUrl = require('encodeurl') +var escapeHtml = require('escape-html') +var parseUrl = require('parseurl') +var resolve = require('path').resolve +var send = require('send') +var url = require('url') + +/** + * Module exports. + * @public + */ + +module.exports = serveStatic +module.exports.mime = send.mime + +/** + * @param {string} root + * @param {object} [options] + * @return {function} + * @public + */ + +function serveStatic (root, options) { + if (!root) { + throw new TypeError('root path required') + } + + if (typeof root !== 'string') { + throw new TypeError('root path must be a string') + } + + // copy options object + var opts = Object.create(options || null) + + // fall-though + var fallthrough = opts.fallthrough !== false + + // default redirect + var redirect = opts.redirect !== false + + // headers listener + var setHeaders = opts.setHeaders + + if (setHeaders && typeof setHeaders !== 'function') { + throw new TypeError('option setHeaders must be function') + } + + // setup options for send + opts.maxage = opts.maxage || opts.maxAge || 0 + opts.root = resolve(root) + + // construct directory listener + var onDirectory = redirect + ? 
createRedirectDirectoryListener() + : createNotFoundDirectoryListener() + + return function serveStatic (req, res, next) { + if (req.method !== 'GET' && req.method !== 'HEAD') { + if (fallthrough) { + return next() + } + + // method not allowed + res.statusCode = 405 + res.setHeader('Allow', 'GET, HEAD') + res.setHeader('Content-Length', '0') + res.end() + return + } + + var forwardError = !fallthrough + var originalUrl = parseUrl.original(req) + var path = parseUrl(req).pathname + + // make sure redirect occurs at mount + if (path === '/' && originalUrl.pathname.substr(-1) !== '/') { + path = '' + } + + // create send stream + var stream = send(req, path, opts) + + // add directory handler + stream.on('directory', onDirectory) + + // add headers listener + if (setHeaders) { + stream.on('headers', setHeaders) + } + + // add file listener for fallthrough + if (fallthrough) { + stream.on('file', function onFile () { + // once file is determined, always forward error + forwardError = true + }) + } + + // forward errors + stream.on('error', function error (err) { + if (forwardError || !(err.statusCode < 500)) { + next(err) + return + } + + next() + }) + + // pipe + stream.pipe(res) + } +} + +/** + * Collapse all leading slashes into a single slash + * @private + */ +function collapseLeadingSlashes (str) { + for (var i = 0; i < str.length; i++) { + if (str.charCodeAt(i) !== 0x2f /* / */) { + break + } + } + + return i > 1 + ? '/' + str.substr(i) + : str +} + +/** + * Create a minimal HTML document. + * + * @param {string} title + * @param {string} body + * @private + */ + +function createHtmlDocument (title, body) { + return '\n' + + '\n' + + '\n' + + '\n' + + '' + title + '\n' + + '\n' + + '\n' + + '
<pre>' + body + '</pre>
\n' + + '\n' + + '\n' +} + +/** + * Create a directory listener that just 404s. + * @private + */ + +function createNotFoundDirectoryListener () { + return function notFound () { + this.error(404) + } +} + +/** + * Create a directory listener that performs a redirect. + * @private + */ + +function createRedirectDirectoryListener () { + return function redirect (res) { + if (this.hasTrailingSlash()) { + this.error(404) + return + } + + // get original URL + var originalUrl = parseUrl.original(this.req) + + // append trailing slash + originalUrl.path = null + originalUrl.pathname = collapseLeadingSlashes(originalUrl.pathname + '/') + + // reformat the URL + var loc = encodeUrl(url.format(originalUrl)) + var doc = createHtmlDocument('Redirecting', 'Redirecting to ' + + escapeHtml(loc) + '') + + // send redirect response + res.statusCode = 301 + res.setHeader('Content-Type', 'text/html; charset=UTF-8') + res.setHeader('Content-Length', Buffer.byteLength(doc)) + res.setHeader('Content-Security-Policy', "default-src 'none'") + res.setHeader('X-Content-Type-Options', 'nosniff') + res.setHeader('Location', loc) + res.end(doc) + } +} diff --git a/node_modules/serve-static/package.json b/node_modules/serve-static/package.json new file mode 100644 index 000000000..cb3958f40 --- /dev/null +++ b/node_modules/serve-static/package.json @@ -0,0 +1,77 @@ +{ + "_from": "serve-static@1.14.1", + "_id": "serve-static@1.14.1", + "_inBundle": false, + "_integrity": "sha512-JMrvUwE54emCYWlTI+hGrGv5I8dEwmco/00EvkzIIsR7MqrHonbD9pO2MOfFnpFntl7ecpZs+3mW+XbQZu9QCg==", + "_location": "/serve-static", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "serve-static@1.14.1", + "name": "serve-static", + "escapedName": "serve-static", + "rawSpec": "1.14.1", + "saveSpec": null, + "fetchSpec": "1.14.1" + }, + "_requiredBy": [ + "/express" + ], + "_resolved": "https://registry.npmjs.org/serve-static/-/serve-static-1.14.1.tgz", + "_shasum": "666e636dc4f010f7ef29970a88a674320898b2f9", + "_spec": "serve-static@1.14.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/expressjs/serve-static/issues" + }, + "bundleDependencies": false, + "dependencies": { + "encodeurl": "~1.0.2", + "escape-html": "~1.0.3", + "parseurl": "~1.3.3", + "send": "0.17.1" + }, + "deprecated": false, + "description": "Serve static files", + "devDependencies": { + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.2", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "istanbul": "0.4.5", + "mocha": "6.1.4", + "safe-buffer": "5.1.2", + "supertest": "4.0.2" + }, + "engines": { + "node": ">= 0.8.0" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "index.js" + ], + "homepage": "https://github.com/expressjs/serve-static#readme", + "license": "MIT", + "name": "serve-static", + "repository": { + "type": "git", + "url": "git+https://github.com/expressjs/serve-static.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-ci": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot 
--check-leaks test/", + "version": "node scripts/version-history.js && git add HISTORY.md" + }, + "version": "1.14.1" +} diff --git a/node_modules/setprototypeof/LICENSE b/node_modules/setprototypeof/LICENSE new file mode 100644 index 000000000..61afa2f18 --- /dev/null +++ b/node_modules/setprototypeof/LICENSE @@ -0,0 +1,13 @@ +Copyright (c) 2015, Wes Todd + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY +SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION +OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR IN +CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/setprototypeof/README.md b/node_modules/setprototypeof/README.md new file mode 100644 index 000000000..f120044b6 --- /dev/null +++ b/node_modules/setprototypeof/README.md @@ -0,0 +1,31 @@ +# Polyfill for `Object.setPrototypeOf` + +[![NPM Version](https://img.shields.io/npm/v/setprototypeof.svg)](https://npmjs.org/package/setprototypeof) +[![NPM Downloads](https://img.shields.io/npm/dm/setprototypeof.svg)](https://npmjs.org/package/setprototypeof) +[![js-standard-style](https://img.shields.io/badge/code%20style-standard-brightgreen.svg)](https://github.com/standard/standard) + +A simple cross platform implementation to set the prototype of an instianted object. Supports all modern browsers and at least back to IE8. + +## Usage: + +``` +$ npm install --save setprototypeof +``` + +```javascript +var setPrototypeOf = require('setprototypeof') + +var obj = {} +setPrototypeOf(obj, { + foo: function () { + return 'bar' + } +}) +obj.foo() // bar +``` + +TypeScript is also supported: + +```typescript +import setPrototypeOf = require('setprototypeof') +``` diff --git a/node_modules/setprototypeof/index.d.ts b/node_modules/setprototypeof/index.d.ts new file mode 100644 index 000000000..f108ecd0a --- /dev/null +++ b/node_modules/setprototypeof/index.d.ts @@ -0,0 +1,2 @@ +declare function setPrototypeOf(o: any, proto: object | null): any; +export = setPrototypeOf; diff --git a/node_modules/setprototypeof/index.js b/node_modules/setprototypeof/index.js new file mode 100644 index 000000000..81fd5d7af --- /dev/null +++ b/node_modules/setprototypeof/index.js @@ -0,0 +1,17 @@ +'use strict' +/* eslint no-proto: 0 */ +module.exports = Object.setPrototypeOf || ({ __proto__: [] } instanceof Array ? 
setProtoOf : mixinProperties) + +function setProtoOf (obj, proto) { + obj.__proto__ = proto + return obj +} + +function mixinProperties (obj, proto) { + for (var prop in proto) { + if (!obj.hasOwnProperty(prop)) { + obj[prop] = proto[prop] + } + } + return obj +} diff --git a/node_modules/setprototypeof/package.json b/node_modules/setprototypeof/package.json new file mode 100644 index 000000000..47f8bf2d5 --- /dev/null +++ b/node_modules/setprototypeof/package.json @@ -0,0 +1,64 @@ +{ + "_from": "setprototypeof@1.1.1", + "_id": "setprototypeof@1.1.1", + "_inBundle": false, + "_integrity": "sha512-JvdAWfbXeIGaZ9cILp38HntZSFSo3mWg6xGcJJsd+d4aRMOqauag1C63dJfDw7OaMYwEbHMOxEZ1lqVRYP2OAw==", + "_location": "/setprototypeof", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "setprototypeof@1.1.1", + "name": "setprototypeof", + "escapedName": "setprototypeof", + "rawSpec": "1.1.1", + "saveSpec": null, + "fetchSpec": "1.1.1" + }, + "_requiredBy": [ + "/express", + "/http-errors" + ], + "_resolved": "https://registry.npmjs.org/setprototypeof/-/setprototypeof-1.1.1.tgz", + "_shasum": "7e95acb24aa92f5885e0abef5ba131330d4ae683", + "_spec": "setprototypeof@1.1.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "author": { + "name": "Wes Todd" + }, + "bugs": { + "url": "https://github.com/wesleytodd/setprototypeof/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "A small polyfill for Object.setprototypeof", + "devDependencies": { + "mocha": "^5.2.0", + "standard": "^12.0.1" + }, + "homepage": "https://github.com/wesleytodd/setprototypeof", + "keywords": [ + "polyfill", + "object", + "setprototypeof" + ], + "license": "ISC", + "main": "index.js", + "name": "setprototypeof", + "repository": { + "type": "git", + "url": "git+https://github.com/wesleytodd/setprototypeof.git" + }, + "scripts": { + "node010": "NODE_VER=0.10 MOCHA_VER=3 npm run testversion", + "node11": "NODE_VER=11 npm run testversion", + "node4": "NODE_VER=4 npm run testversion", + "node6": "NODE_VER=6 npm run testversion", + "node9": "NODE_VER=9 npm run testversion", + "test": "standard && mocha", + "testallversions": "npm run node010 && npm run node4 && npm run node6 && npm run node9 && npm run node11", + "testversion": "docker run -it --rm -v $(PWD):/usr/src/app -w /usr/src/app node:${NODE_VER} npm install mocha@${MOCHA_VER:-latest} && npm t" + }, + "typings": "index.d.ts", + "version": "1.1.1" +} diff --git a/node_modules/setprototypeof/test/index.js b/node_modules/setprototypeof/test/index.js new file mode 100644 index 000000000..afeb4ddb2 --- /dev/null +++ b/node_modules/setprototypeof/test/index.js @@ -0,0 +1,24 @@ +'use strict' +/* eslint-env mocha */ +/* eslint no-proto: 0 */ +var assert = require('assert') +var setPrototypeOf = require('..') + +describe('setProtoOf(obj, proto)', function () { + it('should merge objects', function () { + var obj = { a: 1, b: 2 } + var proto = { b: 3, c: 4 } + var mergeObj = setPrototypeOf(obj, proto) + + if (Object.getPrototypeOf) { + assert.strictEqual(Object.getPrototypeOf(obj), proto) + } else if ({ __proto__: [] } instanceof Array) { + assert.strictEqual(obj.__proto__, proto) + } else { + assert.strictEqual(obj.a, 1) + assert.strictEqual(obj.b, 2) + assert.strictEqual(obj.c, 4) + } + assert.strictEqual(mergeObj, obj) + }) +}) diff --git a/node_modules/sift/MIT-LICENSE.txt b/node_modules/sift/MIT-LICENSE.txt new file mode 100644 index 000000000..c080d2eb0 
--- /dev/null +++ b/node_modules/sift/MIT-LICENSE.txt @@ -0,0 +1,20 @@ +Copyright (c) 2015 Craig Condon + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/sift/README.md b/node_modules/sift/README.md new file mode 100644 index 000000000..4eb62ef4a --- /dev/null +++ b/node_modules/sift/README.md @@ -0,0 +1,465 @@ +**Installation**: `npm install sift`, or `yarn add sift` + +## Sift is a tiny library for using MongoDB queries in Javascript + +[![Build Status](https://secure.travis-ci.org/crcn/sift.js.png)](https://secure.travis-ci.org/crcn/sift.js) + + + + +**For extended documentation, checkout http://docs.mongodb.org/manual/reference/operator/query/** + +## Features: + +- Supported operators: [\$in](#in), [\$nin](#nin), [\$exists](#exists), [\$gte](#gte), [\$gt](#gt), [\$lte](#lte), [\$lt](#lt), [\$eq](#eq), [\$ne](#ne), [\$mod](#mod), [\$all](#all), [\$and](#and), [\$or](#or), [\$nor](#nor), [\$not](#not), [\$size](#size), [\$type](#type), [\$regex](#regex), [\$where](#where), [\$elemMatch](#elemmatch) +- Regexp searches +- Supports node.js, and web +- Custom Operations +- Tree-shaking (omitting functionality from web app bundles) + +## Examples + +```javascript +import sift from "sift"; + +//intersecting arrays +const result1 = ["hello", "sifted", "array!"].filter( + sift({ $in: ["hello", "world"] }) +); //['hello'] + +//regexp filter +const result2 = ["craig", "john", "jake"].filter(sift(/^j/)); //['john','jake'] + +// function filter +const testFilter = sift({ + //you can also filter against functions + name: function(value) { + return value.length == 5; + } +}); + +const result3 = [ + { + name: "craig" + }, + { + name: "john" + }, + { + name: "jake" + } +].filter(testFilter); // filtered: [{ name: 'craig' }] + +//you can test *single values* against your custom sifter +testFilter({ name: "sarah" }); //true +testFilter({ name: "tim" }); //false +``` + +## API + +### sift(query: MongoQuery, options?: Options): Function + +Creates a filter with all of the built-in MongoDB query operations. 
+ +- `query` - the filter to use against the target array +- `options` + - `operations` - [custom operations](#custom-operations) + - `compare` - compares difference between two values + +Example: + +```javascript +import sift from "sift"; + +const test = sift({ $gt: 5 })); + +console.log(test(6)); // true +console.log(test(4)); // false + +[3, 4, 5, 6, 7].filter(sift({ $exists: true })); // [6, 7] +``` + +### createQueryTester(query: Query, options?: Options): Function + +Creates a filter function **without** built-in MongoDB query operations. This is useful +if you're looking to omit certain operations from application bundles. See [Omitting built-in operations](#omitting-built-in-operations) for more info. + +```javascript +import { createQueryTester, $eq, $in } from "sift"; +const filter = createQueryTester({ $eq: 5 }, { operations: { $eq, $in } }); +``` + +### createEqualsOperation(params: any, ownerQuery: Query, options: Options): Operation + +Used for [custom operations](#custom-operations). + +```javascript +import { createQueryTester, createEqualsOperation, $eq, $in } from "sift"; +const filter = createQueryTester( + { $mod: 5 }, + { + operations: { + $something(mod, ownerQuery, options) { + return createEqualsOperation( + value => value % mod === 0, + ownerQuery, + options + ); + } + } + } +); +filter(10); // true +filter(11); // false +``` + +## Supported Operators + +See MongoDB's [advanced queries](http://www.mongodb.org/display/DOCS/Advanced+Queries) for more info. + +### \$in + +array value must be _\$in_ the given query: + +Intersecting two arrays: + +```javascript +//filtered: ['Brazil'] +["Brazil", "Haiti", "Peru", "Chile"].filter( + sift({ $in: ["Costa Rica", "Brazil"] }) +); +``` + +Here's another example. This acts more like the \$or operator: + +```javascript +[{ name: "Craig", location: "Brazil" }].filter( + sift({ location: { $in: ["Costa Rica", "Brazil"] } }) +); +``` + +### \$nin + +Opposite of \$in: + +```javascript +//filtered: ['Haiti','Peru','Chile'] +["Brazil", "Haiti", "Peru", "Chile"].filter( + sift({ $nin: ["Costa Rica", "Brazil"] }) +); +``` + +### \$exists + +Checks if whether a value exists: + +```javascript +//filtered: ['Craig','Tim'] +sift({ $exists: true })(["Craig", null, "Tim"]); +``` + +You can also filter out values that don't exist + +```javascript +//filtered: [{ name: "Tim" }] +[{ name: "Craig", city: "Minneapolis" }, { name: "Tim" }].filter( + sift({ city: { $exists: false } }) +); +``` + +### \$gte + +Checks if a number is >= value: + +```javascript +//filtered: [2, 3] +[0, 1, 2, 3].filter(sift({ $gte: 2 })); +``` + +### \$gt + +Checks if a number is > value: + +```javascript +//filtered: [3] +[0, 1, 2, 3].filter(sift({ $gt: 2 })); +``` + +### \$lte + +Checks if a number is <= value. + +```javascript +//filtered: [0, 1, 2] +[0, 1, 2, 3].filter(sift({ $lte: 2 })); +``` + +### \$lt + +Checks if number is < value. + +```javascript +//filtered: [0, 1] +[0, 1, 2, 3].filter(sift({ $lt: 2 })); +``` + +### \$eq + +Checks if `query === value`. Note that **\$eq can be omitted**. For **\$eq**, and **\$ne** + +```javascript +//filtered: [{ state: 'MN' }] +[{ state: "MN" }, { state: "CA" }, { state: "WI" }].filter( + sift({ state: { $eq: "MN" } }) +); +``` + +Or: + +```javascript +//filtered: [{ state: 'MN' }] +[{ state: "MN" }, { state: "CA" }, { state: "WI" }].filter( + sift({ state: "MN" }) +); +``` + +### \$ne + +Checks if `query !== value`. 
+
+```javascript
+//filtered: [{ state: 'CA' }, { state: 'WI'}]
+[{ state: "MN" }, { state: "CA" }, { state: "WI" }].filter(
+  sift({ state: { $ne: "MN" } })
+);
+```
+
+### \$mod
+
+Modulus:
+
+```javascript
+//filtered: [300, 600]
+[100, 200, 300, 400, 500, 600].filter(sift({ $mod: [3, 0] }));
+```
+
+### \$all
+
+Values must match **everything** in the array:
+
+```javascript
+//filtered: [ { tags: ['books','programming','travel' ]} ]
+[
+  { tags: ["books", "programming", "travel"] },
+  { tags: ["travel", "cooking"] }
+].filter(sift({ tags: { $all: ["books", "programming"] } }));
+```
+
+### \$and
+
+Ability to use an array of expressions. All expressions must test true:
+
+```javascript
+//filtered: [ { name: 'Craig', state: 'MN' }]
+
+[
+  { name: "Craig", state: "MN" },
+  { name: "Tim", state: "MN" },
+  { name: "Joe", state: "CA" }
+].filter(sift({ $and: [{ name: "Craig" }, { state: "MN" }] }));
+```
+
+### \$or
+
+OR of an array of expressions:
+
+```javascript
+//filtered: [ { name: 'Craig', state: 'MN' }, { name: 'Tim', state: 'MN' }]
+[
+  { name: "Craig", state: "MN" },
+  { name: "Tim", state: "MN" },
+  { name: "Joe", state: "CA" }
+].filter(sift({ $or: [{ name: "Craig" }, { state: "MN" }] }));
+```
+
+### \$nor
+
+Opposite of \$or:
+
+```javascript
+//filtered: [ { name: 'Joe', state: 'CA' }]
+[
+  { name: "Craig", state: "MN" },
+  { name: "Tim", state: "MN" },
+  { name: "Joe", state: "CA" }
+].filter(sift({ $nor: [{ name: "Craig" }, { state: "MN" }] }));
+```
+
+### \$size
+
+Matches an array of the given size:
+
+```javascript
+//filtered: [{ tags: ['food','cooking'] }]
+[{ tags: ["food", "cooking"] }, { tags: ["traveling"] }].filter(
+  sift({ tags: { $size: 2 } })
+);
+```
+
+### \$type
+
+Matches values based on their type:
+
+```javascript
+[new Date(), 4342, "hello world"].filter(sift({ $type: Date })); //returns single date
+[new Date(), 4342, "hello world"].filter(sift({ $type: String })); //returns ['hello world']
+```
+
+### \$regex
+
+Matches values based on the given regular expression:
+
+```javascript
+["frank", "fred", "sam", "frost"].filter(
+  sift({ $regex: /^f/i, $nin: ["frank"] })
+); // ["fred", "frost"]
+["frank", "fred", "sam", "frost"].filter(
+  sift({ $regex: "^f", $options: "i", $nin: ["frank"] })
+); // ["fred", "frost"]
+```
+
+### \$where
+
+Matches based on a JavaScript comparison:
+
+```javascript
+[{ name: "frank" }, { name: "joe" }].filter(
+  sift({ $where: "this.name === 'frank'" })
+); // [{ name: "frank" }]
+[{ name: "frank" }, { name: "joe" }].filter(
+  sift({
+    $where: function() {
+      return this.name === "frank";
+    }
+  })
+); // [{ name: "frank" }]
+```
+
+### \$elemMatch
+
+Matches elements of an array:
+
+```javascript
+var bills = [
+  {
+    month: "july",
+    casts: [
+      {
+        id: 1,
+        value: 200
+      },
+      {
+        id: 2,
+        value: 1000
+      }
+    ]
+  },
+  {
+    month: "august",
+    casts: [
+      {
+        id: 3,
+        value: 1000
+      },
+      {
+        id: 4,
+        value: 4000
+      }
+    ]
+  }
+];
+
+var result = bills.filter(
+  sift({
+    casts: {
+      $elemMatch: {
+        value: { $gt: 1000 }
+      }
+    }
+  })
+); // [{ month: 'august', casts: [{ id: 3, value: 1000 }, { id: 4, value: 4000 }] }]
+```
+
+### \$not
+
+Not expression:
+
+```javascript
+["craig", "tim", "jake"].filter(sift({ $not: { $in: ["craig", "tim"] } })); //['jake']
+["craig", "tim", "jake"].filter(sift({ $not: { $size: 5 } })); //['tim','jake']
+```
+
+### Date comparison
+
+MongoDB allows you to do date comparisons like so:
+
+```javascript
+db.collection.find({ createdAt: { $gte: "2018-03-22T06:00:00Z" } });
+```
+
+In Sift, you'll need to specify a Date object:
+
+```javascript
+collection.find(
+  sift({ createdAt: { $gte: new Date("2018-03-22T06:00:00Z") } })
+);
+```
+
+## Custom behavior
+
+Sift works like MongoDB out of the box, but you're also able to modify the behavior to suit your needs.
+
+#### Custom operations
+
+You can register your own custom operations. Here's an example:
+
+```javascript
+import sift, { createEqualsOperation } from "sift";
+
+var filter = sift(
+  {
+    $customMod: 2
+  },
+  {
+    operations: {
+      $customMod(params, ownerQuery, options) {
+        return createEqualsOperation(
+          value => value % params !== 0,
+          ownerQuery,
+          options
+        );
+      }
+    }
+  }
+);
+
+[1, 2, 3, 4, 5].filter(filter); // 1, 3, 5
+```
+
+#### Omitting built-in operations
+
+You can create a filter function that omits the built-in operations like so:
+
+```javascript
+import { createQueryTester, $in, $all, $nin, $lt } from "sift";
+const test = createQueryTester(
+  {
+    $eq: 10
+  },
+  { $in, $all, $nin, $lt }
+);
+
+[1, 2, 3, 4, 10].filter(test);
+```
+
+For bundlers like `Webpack` and `Rollup`, operations that aren't used are omitted from application bundles via tree-shaking.
diff --git a/node_modules/sift/changelog.md b/node_modules/sift/changelog.md
new file mode 100644
index 000000000..8eccb2839
--- /dev/null
+++ b/node_modules/sift/changelog.md
@@ -0,0 +1,77 @@
+## 13.1.0
+
+- Added stronger types for queries: https://github.com/crcn/sift.js/issues/197
+
+## 13.0.0
+
+- Fix behavior discrepancy with Mongo: https://github.com/crcn/sift.js/issues/196
+
+## 12.0.0
+
+- Fix bug where \$elemMatch tested objects: e.g: `sift({a: {$elemMatch: 1}})({ a: { b: 1}})`. \$elemMatch now expects arrays based on MongoDB syntax. E.g: `sift({a: {$elemMatch: 1}})({ a: { b: 1}})`
+
+## 11.0.0
+
+- new custom operations syntax (see API readme)
+- null & undefined are not treated equally (change added to keep functionality as close to the MongoDB spec as possible)
+- `select` option has been removed
+- `compare` option now expects `boolean` return value instead of an integer
+- nested queries are no longer supported
+- `expressions` option is now `operations`
+- `operations` parameter now expects new operations API
+- ImmutableJS support removed for now
+- Remove bower support
+
+## 9.0.0
+
+- (behavior change) toJSON works for vanilla objects.
+
+## 8.5.1
+
+- Fix dependency vulnerability
+- Fix #158
+
+## 8.5.0
+
+- Added `comparable` option (fix https://github.com/crcn/sift.js/issues/156)
+
+## 8.4.0
+
+- Added `compare` option (fix https://github.com/crcn/sift.js/issues/155)
+
+## 8.3.2
+
+- Query _properties_ now expect exact object shape (based on https://github.com/crcn/sift.js/issues/152). E.g: `[{a: { b: 1}}, {a: { b: 1, c: 2}}]].filter(sift({ a: { b: 1} })) === [{a: {b: 1}]`, and `[{a: 1, b: 1}, {a: 1}]].filter(sift({ a: 1 })) === [{a: 1, b: 1}, {a: 1}]`.
+
+## 8.0.0
+
+- DEPRECATED `indexOf` in favor of `array.findIndex(sift(query))`
+- second param is now `options` instead of select function. E.g: `sift(query, { expressions: customExpressions, select: selectValue })`
+- DEPRECATED `sift(query, array)`. You must now use `array.filter(sift(query))`
+- Queries now expect exact object shape (based on https://github.com/crcn/sift.js/issues/117). E.g: `[{a: 1, b: 1}, {a: 1}]].filter(sift({ a: 1 })) === [{a: 1}]`
+
+### 7.0.0
+
+- Remove global `*.use()` function.
+- converted to ES6
+
+### 3.3.x
+
+- `$in` now uses `toString()` when evaluating objects. Fixes #116.
+ +#### 2.x + +- `use()` now uses a different format: + +```javascript +sift.use({ + $operator: function(a) { + return function(b) { + // compare here + }; + } +}); +``` + +- all operators are traversable now +- fix #58. diff --git a/node_modules/sift/es/index.js b/node_modules/sift/es/index.js new file mode 100644 index 000000000..6f28748de --- /dev/null +++ b/node_modules/sift/es/index.js @@ -0,0 +1,528 @@ +const typeChecker = (type) => { + const typeString = "[object " + type + "]"; + return function (value) { + return getClassName(value) === typeString; + }; +}; +const getClassName = value => Object.prototype.toString.call(value); +const comparable = (value) => { + if (value instanceof Date) { + return value.getTime(); + } + else if (isArray(value)) { + return value.map(comparable); + } + else if (value && typeof value.toJSON === "function") { + return value.toJSON(); + } + return value; +}; +const isArray = typeChecker("Array"); +const isObject = typeChecker("Object"); +const isFunction = typeChecker("Function"); +const isVanillaObject = value => { + return (value && + (value.constructor === Object || + value.constructor === Array || + value.constructor.toString() === "function Object() { [native code] }" || + value.constructor.toString() === "function Array() { [native code] }") && + !value.toJSON); +}; +const equals = (a, b) => { + if (a == null && a == b) { + return true; + } + if (a === b) { + return true; + } + if (Object.prototype.toString.call(a) !== Object.prototype.toString.call(b)) { + return false; + } + if (isArray(a)) { + if (a.length !== b.length) { + return false; + } + for (let i = 0, { length } = a; i < length; i++) { + if (!equals(a[i], b[i])) + return false; + } + return true; + } + else if (isObject(a)) { + if (Object.keys(a).length !== Object.keys(b).length) { + return false; + } + for (const key in a) { + if (!equals(a[key], b[key])) + return false; + } + return true; + } + return false; +}; + +/** + * Walks through each value given the context - used for nested operations. E.g: + * { "person.address": { $eq: "blarg" }} + */ +const walkKeyPathValues = (item, keyPath, next, depth, key, owner) => { + const currentKey = keyPath[depth]; + // if array, then try matching. Might fall through for cases like: + // { $eq: [1, 2, 3] }, [ 1, 2, 3 ]. + if (isArray(item) && isNaN(Number(currentKey))) { + for (let i = 0, { length } = item; i < length; i++) { + // if FALSE is returned, then terminate walker. For operations, this simply + // means that the search critera was met. 
+ if (!walkKeyPathValues(item[i], keyPath, next, depth, i, item)) { + return false; + } + } + } + if (depth === keyPath.length || item == null) { + return next(item, key, owner); + } + return walkKeyPathValues(item[currentKey], keyPath, next, depth + 1, currentKey, item); +}; +class BaseOperation { + constructor(params, owneryQuery, options) { + this.params = params; + this.owneryQuery = owneryQuery; + this.options = options; + this.init(); + } + init() { } + reset() { + this.done = false; + this.keep = false; + } +} +class NamedBaseOperation extends BaseOperation { + constructor(params, owneryQuery, options, name) { + super(params, owneryQuery, options); + this.name = name; + } +} +class GroupOperation extends BaseOperation { + constructor(params, owneryQuery, options, children) { + super(params, owneryQuery, options); + this.children = children; + } + /** + */ + reset() { + this.keep = false; + this.done = false; + for (let i = 0, { length } = this.children; i < length; i++) { + this.children[i].reset(); + } + } + /** + */ + childrenNext(item, key, owner) { + let done = true; + let keep = true; + for (let i = 0, { length } = this.children; i < length; i++) { + const childOperation = this.children[i]; + childOperation.next(item, key, owner); + if (!childOperation.keep) { + keep = false; + } + if (childOperation.done) { + if (!childOperation.keep) { + break; + } + } + else { + done = false; + } + } + this.done = done; + this.keep = keep; + } +} +class NamedGroupOperation extends GroupOperation { + constructor(params, owneryQuery, options, children, name) { + super(params, owneryQuery, options, children); + this.name = name; + } +} +class QueryOperation extends GroupOperation { + /** + */ + next(item, key, parent) { + this.childrenNext(item, key, parent); + } +} +class NestedOperation extends GroupOperation { + constructor(keyPath, params, owneryQuery, options, children) { + super(params, owneryQuery, options, children); + this.keyPath = keyPath; + /** + */ + this._nextNestedValue = (value, key, owner) => { + this.childrenNext(value, key, owner); + return !this.done; + }; + } + /** + */ + next(item, key, parent) { + walkKeyPathValues(item, this.keyPath, this._nextNestedValue, 0, key, parent); + } +} +const createTester = (a, compare) => { + if (a instanceof Function) { + return a; + } + if (a instanceof RegExp) { + return b => { + const result = typeof b === "string" && a.test(b); + a.lastIndex = 0; + return result; + }; + } + const comparableA = comparable(a); + return b => compare(comparableA, comparable(b)); +}; +class EqualsOperation extends BaseOperation { + init() { + this._test = createTester(this.params, this.options.compare); + } + next(item, key, parent) { + if (!Array.isArray(parent) || parent.hasOwnProperty(key)) { + if (this._test(item, key, parent)) { + this.done = true; + this.keep = true; + } + } + } +} +const createEqualsOperation = (params, owneryQuery, options) => new EqualsOperation(params, owneryQuery, options); +class NopeOperation extends BaseOperation { + next() { + this.done = true; + this.keep = false; + } +} +const numericalOperationCreator = (createNumericalOperation) => (params, owneryQuery, options, name) => { + if (params == null) { + return new NopeOperation(params, owneryQuery, options); + } + return createNumericalOperation(params, owneryQuery, options, name); +}; +const numericalOperation = (createTester) => numericalOperationCreator((params, owneryQuery, options) => { + const typeofParams = typeof comparable(params); + const test = createTester(params); + 
return new EqualsOperation(b => { + return typeof comparable(b) === typeofParams && test(b); + }, owneryQuery, options); +}); +const createNamedOperation = (name, params, parentQuery, options) => { + const operationCreator = options.operations[name]; + if (!operationCreator) { + throw new Error(`Unsupported operation: ${name}`); + } + return operationCreator(params, parentQuery, options, name); +}; +const containsOperation = (query) => { + for (const key in query) { + if (key.charAt(0) === "$") + return true; + } + return false; +}; +const createNestedOperation = (keyPath, nestedQuery, owneryQuery, options) => { + if (containsOperation(nestedQuery)) { + const [selfOperations, nestedOperations] = createQueryOperations(nestedQuery, options); + if (nestedOperations.length) { + throw new Error(`Property queries must contain only operations, or exact objects.`); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, selfOperations); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, [ + new EqualsOperation(nestedQuery, owneryQuery, options) + ]); +}; +const createQueryOperation = (query, owneryQuery = null, { compare, operations } = {}) => { + const options = { + compare: compare || equals, + operations: Object.assign({}, operations || {}) + }; + const [selfOperations, nestedOperations] = createQueryOperations(query, options); + const ops = []; + if (selfOperations.length) { + ops.push(new NestedOperation([], query, owneryQuery, options, selfOperations)); + } + ops.push(...nestedOperations); + if (ops.length === 1) { + return ops[0]; + } + return new QueryOperation(query, owneryQuery, options, ops); +}; +const createQueryOperations = (query, options) => { + const selfOperations = []; + const nestedOperations = []; + if (!isVanillaObject(query)) { + selfOperations.push(new EqualsOperation(query, query, options)); + return [selfOperations, nestedOperations]; + } + for (const key in query) { + if (key.charAt(0) === "$") { + const op = createNamedOperation(key, query[key], query, options); + // probably just a flag for another operation (like $options) + if (op != null) { + selfOperations.push(op); + } + } + else { + nestedOperations.push(createNestedOperation(key.split("."), query[key], query, options)); + } + } + return [selfOperations, nestedOperations]; +}; +const createOperationTester = (operation) => (item, key, owner) => { + operation.reset(); + operation.next(item, key, owner); + return operation.keep; +}; +const createQueryTester = (query, options = {}) => { + return createOperationTester(createQueryOperation(query, null, options)); +}; + +class $Ne extends NamedBaseOperation { + init() { + this._test = createTester(this.params, this.options.compare); + } + reset() { + super.reset(); + this.keep = true; + } + next(item) { + if (this._test(item)) { + this.done = true; + this.keep = false; + } + } +} +// https://docs.mongodb.com/manual/reference/operator/query/elemMatch/ +class $ElemMatch extends NamedBaseOperation { + init() { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + } + reset() { + super.reset(); + this._queryOperation.reset(); + } + next(item) { + if (isArray(item)) { + for (let i = 0, { length } = item; i < length; i++) { + // reset query operation since item being tested needs to pass _all_ query + // operations for it to be a success + this._queryOperation.reset(); + // check item + this._queryOperation.next(item[i], i, item); + this.keep = this.keep || this._queryOperation.keep; + } + 
this.done = true; + } + else { + this.done = false; + this.keep = false; + } + } +} +class $Not extends NamedBaseOperation { + init() { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + } + reset() { + this._queryOperation.reset(); + } + next(item, key, owner) { + this._queryOperation.next(item, key, owner); + this.done = this._queryOperation.done; + this.keep = !this._queryOperation.keep; + } +} +class $Size extends NamedBaseOperation { + init() { } + next(item) { + if (isArray(item) && item.length === this.params) { + this.done = true; + this.keep = true; + } + // if (parent && parent.length === this.params) { + // this.done = true; + // this.keep = true; + // } + } +} +class $Or extends NamedBaseOperation { + init() { + this._ops = this.params.map(op => createQueryOperation(op, null, this.options)); + } + reset() { + this.done = false; + this.keep = false; + for (let i = 0, { length } = this._ops; i < length; i++) { + this._ops[i].reset(); + } + } + next(item, key, owner) { + let done = false; + let success = false; + for (let i = 0, { length } = this._ops; i < length; i++) { + const op = this._ops[i]; + op.next(item, key, owner); + if (op.keep) { + done = true; + success = op.keep; + break; + } + } + this.keep = success; + this.done = done; + } +} +class $Nor extends $Or { + next(item, key, owner) { + super.next(item, key, owner); + this.keep = !this.keep; + } +} +class $In extends NamedBaseOperation { + init() { + this._testers = this.params.map(value => { + if (containsOperation(value)) { + throw new Error(`cannot nest $ under ${this.constructor.name.toLowerCase()}`); + } + return createTester(value, this.options.compare); + }); + } + next(item, key, owner) { + let done = false; + let success = false; + for (let i = 0, { length } = this._testers; i < length; i++) { + const test = this._testers[i]; + if (test(item)) { + done = true; + success = true; + break; + } + } + this.keep = success; + this.done = done; + } +} +class $Nin extends $In { + next(item, key, owner) { + super.next(item, key, owner); + this.keep = !this.keep; + } +} +class $Exists extends NamedBaseOperation { + next(item, key, owner) { + if (owner.hasOwnProperty(key) === this.params) { + this.done = true; + this.keep = true; + } + } +} +class $And extends NamedGroupOperation { + constructor(params, owneryQuery, options, name) { + super(params, owneryQuery, options, params.map(query => createQueryOperation(query, owneryQuery, options)), name); + } + next(item, key, owner) { + this.childrenNext(item, key, owner); + } +} +const $eq = (params, owneryQuery, options) => new EqualsOperation(params, owneryQuery, options); +const $ne = (params, owneryQuery, options, name) => new $Ne(params, owneryQuery, options, name); +const $or = (params, owneryQuery, options, name) => new $Or(params, owneryQuery, options, name); +const $nor = (params, owneryQuery, options, name) => new $Nor(params, owneryQuery, options, name); +const $elemMatch = (params, owneryQuery, options, name) => new $ElemMatch(params, owneryQuery, options, name); +const $nin = (params, owneryQuery, options, name) => new $Nin(params, owneryQuery, options, name); +const $in = (params, owneryQuery, options, name) => new $In(params, owneryQuery, options, name); +const $lt = numericalOperation(params => b => b < params); +const $lte = numericalOperation(params => b => b <= params); +const $gt = numericalOperation(params => b => b > params); +const $gte = numericalOperation(params => b => b >= params); +const $mod = ([mod, 
equalsValue], owneryQuery, options) => new EqualsOperation(b => comparable(b) % mod === equalsValue, owneryQuery, options); +const $exists = (params, owneryQuery, options, name) => new $Exists(params, owneryQuery, options, name); +const $regex = (pattern, owneryQuery, options) => new EqualsOperation(new RegExp(pattern, owneryQuery.$options), owneryQuery, options); +const $not = (params, owneryQuery, options, name) => new $Not(params, owneryQuery, options, name); +const typeAliases = { + number: v => typeof v === "number", + string: v => typeof v === "string", + bool: v => typeof v === "boolean", + array: v => Array.isArray(v), + null: v => v === null, + timestamp: v => v instanceof Date +}; +const $type = (clazz, owneryQuery, options) => new EqualsOperation(b => { + if (typeof clazz === "string") { + if (!typeAliases[clazz]) { + throw new Error(`Type alias does not exist`); + } + return typeAliases[clazz](b); + } + return b != null ? b instanceof clazz || b.constructor === clazz : false; +}, owneryQuery, options); +const $and = (params, ownerQuery, options, name) => new $And(params, ownerQuery, options, name); +const $all = $and; +const $size = (params, ownerQuery, options) => new $Size(params, ownerQuery, options, "$size"); +const $options = () => null; +const $where = (params, ownerQuery, options) => { + let test; + if (isFunction(params)) { + test = params; + } + else if (!process.env.CSP_ENABLED) { + test = new Function("obj", "return " + params); + } + else { + throw new Error(`In CSP mode, sift does not support strings in "$where" condition`); + } + return new EqualsOperation(b => test.bind(b)(b), ownerQuery, options); +}; + +var defaultOperations = /*#__PURE__*/Object.freeze({ + __proto__: null, + $Size: $Size, + $eq: $eq, + $ne: $ne, + $or: $or, + $nor: $nor, + $elemMatch: $elemMatch, + $nin: $nin, + $in: $in, + $lt: $lt, + $lte: $lte, + $gt: $gt, + $gte: $gte, + $mod: $mod, + $exists: $exists, + $regex: $regex, + $not: $not, + $type: $type, + $and: $and, + $all: $all, + $size: $size, + $options: $options, + $where: $where +}); + +const createDefaultQueryOperation = (query, ownerQuery, { compare, operations } = {}) => { + return createQueryOperation(query, ownerQuery, { + compare: compare, + operations: Object.assign({}, defaultOperations, operations || {}) + }); +}; +const createDefaultQueryTester = (query, options = {}) => { + const op = createDefaultQueryOperation(query, null, options); + return createOperationTester(op); +}; + +export default createDefaultQueryTester; +export { $Size, $all, $and, $elemMatch, $eq, $exists, $gt, $gte, $in, $lt, $lte, $mod, $ne, $nin, $nor, $not, $options, $or, $regex, $size, $type, $where, EqualsOperation, createDefaultQueryOperation, createEqualsOperation, createOperationTester, createQueryOperation, createQueryTester }; +//# sourceMappingURL=index.js.map diff --git a/node_modules/sift/es/index.js.map b/node_modules/sift/es/index.js.map new file mode 100644 index 000000000..7e4c6f506 --- /dev/null +++ b/node_modules/sift/es/index.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"index.js","sources":["../src/utils.ts","../src/core.ts","../src/operations.ts","../src/index.ts"],"sourcesContent":[null,null,null,null],"names":[],"mappings":"AAEO,MAAM,WAAW,GAAG,CAAQ,IAAI;IACrC,MAAM,UAAU,GAAG,UAAU,GAAG,IAAI,GAAG,GAAG,CAAC;IAC3C,OAAO,UAAS,KAAK;QACnB,OAAO,YAAY,CAAC,KAAK,CAAC,KAAK,UAAU,CAAC;KAC3C,CAAC;AACJ,CAAC,CAAC;AAEF,MAAM,YAAY,GAAG,KAAK,IAAI,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,CAAC;AAE7D,MAAM,UAAU,GAAG,CAAC,KAAU;IACnC,IAAI,KAAK,YAAY,IAAI,EAAE;QACzB,OAAO,KAAK,CAAC,OAAO,EAAE,CAAC;KACxB;SAAM,IAAI,OAAO,CAAC,KAAK,CAAC,EAAE;QACzB,OAAO,KAAK,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;KAC9B;SAAM,IAAI,KAAK,IAAI,OAAO,KAAK,CAAC,MAAM,KAAK,UAAU,EAAE;QACtD,OAAO,KAAK,CAAC,MAAM,EAAE,CAAC;KACvB;IAED,OAAO,KAAK,CAAC;AACf,CAAC,CAAC;AAEK,MAAM,OAAO,GAAG,WAAW,CAAa,OAAO,CAAC,CAAC;AACjD,MAAM,QAAQ,GAAG,WAAW,CAAS,QAAQ,CAAC,CAAC;AAC/C,MAAM,UAAU,GAAG,WAAW,CAAW,UAAU,CAAC,CAAC;AACrD,MAAM,eAAe,GAAG,KAAK;IAClC,QACE,KAAK;SACJ,KAAK,CAAC,WAAW,KAAK,MAAM;YAC3B,KAAK,CAAC,WAAW,KAAK,KAAK;YAC3B,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,qCAAqC;YACtE,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,oCAAoC,CAAC;QACxE,CAAC,KAAK,CAAC,MAAM,EACb;AACJ,CAAC,CAAC;AAEK,MAAM,MAAM,GAAG,CAAC,CAAC,EAAE,CAAC;IACzB,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;QACvB,OAAO,IAAI,CAAC;KACb;IACD,IAAI,CAAC,KAAK,CAAC,EAAE;QACX,OAAO,IAAI,CAAC;KACb;IAED,IAAI,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE;QAC3E,OAAO,KAAK,CAAC;KACd;IAED,IAAI,OAAO,CAAC,CAAC,CAAC,EAAE;QACd,IAAI,CAAC,CAAC,MAAM,KAAK,CAAC,CAAC,MAAM,EAAE;YACzB,OAAO,KAAK,CAAC;SACd;QACD,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,CAAC,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;YAC/C,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;gBAAE,OAAO,KAAK,CAAC;SACvC;QACD,OAAO,IAAI,CAAC;KACb;SAAM,IAAI,QAAQ,CAAC,CAAC,CAAC,EAAE;QACtB,IAAI,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,KAAK,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE;YACnD,OAAO,KAAK,CAAC;SACd;QACD,KAAK,MAAM,GAAG,IAAI,CAAC,EAAE;YACnB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC;gBAAE,OAAO,KAAK,CAAC;SAC3C;QACD,OAAO,IAAI,CAAC;KACb;IACD,OAAO,KAAK,CAAC;AACf,CAAC;;ACMD;;;;AAKA,MAAM,iBAAiB,GAAG,CACxB,IAAS,EACT,OAAc,EACd,IAAY,EACZ,KAAa,EACb,GAAQ,EACR,KAAU;IAEV,MAAM,UAAU,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC;;;IAIlC,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,KAAK,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,EAAE;QAC9C,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;;;YAGlD,IAAI,CAAC,iBAAiB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,KAAK,EAAE,CAAC,EAAE,IAAI,CAAC,EAAE;gBAC9D,OAAO,KAAK,CAAC;aACd;SACF;KACF;IAED,IAAI,KAAK,KAAK,OAAO,CAAC,MAAM,IAAI,IAAI,IAAI,IAAI,EAAE;QAC5C,OAAO,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;KAC/B;IAED,OAAO,iBAAiB,CACtB,IAAI,CAAC,UAAU,CAAC,EAChB,OAAO,EACP,IAAI,EACJ,KAAK,GAAG,CAAC,EACT,UAAU,EACV,IAAI,CACL,CAAC;AACJ,CAAC,CAAC;AAEF,MAAe,aAAa;IAG1B,YACW,MAAe,EACf,WAAgB,EAChB,OAAgB;QAFhB,WAAM,GAAN,MAAM,CAAS;QACf,gBAAW,GAAX,WAAW,CAAK;QAChB,YAAO,GAAP,OAAO,CAAS;QAEzB,IAAI,CAAC,IAAI,EAAE,CAAC;KACb;IACS,IAAI,MAAK;IACnB,KAAK;QACH,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;KACnB;CAEF;MAEqB,kBACpB,SAAQ,aAA6B;IAErC,YACE,MAAe,EACf,WAAgB,EAChB,OAAgB,EACP,IAAY;QAErB,KAAK,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;QAF3B,SAAI,GAAJ,IAAI,CAAQ;KAGtB;CACF;AAED,MAAe,cAAe,SAAQ,aAAkB;IAItD,YACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EACA,QAA0B;QAE1C,KAAK,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;QAFpB,aAAQ,GAAR,QAAQ,CAAkB;KAG3C;;;IAKD,KAAK;QACH,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAA
C;QAClB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,CAAC,QAAQ,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;SAC1B;KACF;;;IAOS,YAAY,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QACpD,IAAI,IAAI,GAAG,IAAI,CAAC;QAChB,IAAI,IAAI,GAAG,IAAI,CAAC;QAChB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,CAAC,QAAQ,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,MAAM,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;YACxC,cAAc,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YACtC,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;gBACxB,IAAI,GAAG,KAAK,CAAC;aACd;YACD,IAAI,cAAc,CAAC,IAAI,EAAE;gBACvB,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;oBACxB,MAAM;iBACP;aACF;iBAAM;gBACL,IAAI,GAAG,KAAK,CAAC;aACd;SACF;QACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;CACF;MAEqB,mBAAoB,SAAQ,cAAc;IAE9D,YACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B,EACjB,IAAY;QAErB,KAAK,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;QAFrC,SAAI,GAAJ,IAAI,CAAQ;KAGtB;CACF;MAEY,cAAsB,SAAQ,cAAc;;;IAIvD,IAAI,CAAC,IAAW,EAAE,GAAQ,EAAE,MAAW;QACrC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC;KACtC;CACF;MAEY,eAAgB,SAAQ,cAAc;IACjD,YACW,OAAc,EACvB,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B;QAE1B,KAAK,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,CAAC;QANrC,YAAO,GAAP,OAAO,CAAO;;;QAyBjB,qBAAgB,GAAG,CAAC,KAAU,EAAE,GAAQ,EAAE,KAAU;YAC1D,IAAI,CAAC,YAAY,CAAC,KAAK,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YACrC,OAAO,CAAC,IAAI,CAAC,IAAI,CAAC;SACnB,CAAC;KArBD;;;IAID,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,MAAW;QACnC,iBAAiB,CACf,IAAI,EACJ,IAAI,CAAC,OAAO,EACZ,IAAI,CAAC,gBAAgB,EACrB,CAAC,EACD,GAAG,EACH,MAAM,CACP,CAAC;KACH;CASF;AAEM,MAAM,YAAY,GAAG,CAAC,CAAC,EAAE,OAAmB;IACjD,IAAI,CAAC,YAAY,QAAQ,EAAE;QACzB,OAAO,CAAC,CAAC;KACV;IACD,IAAI,CAAC,YAAY,MAAM,EAAE;QACvB,OAAO,CAAC;YACN,MAAM,MAAM,GAAG,OAAO,CAAC,KAAK,QAAQ,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;YAClD,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC;YAChB,OAAO,MAAM,CAAC;SACf,CAAC;KACH;IACD,MAAM,WAAW,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;IAClC,OAAO,CAAC,IAAI,OAAO,CAAC,WAAW,EAAE,UAAU,CAAC,CAAC,CAAC,CAAC,CAAC;AAClD,CAAC,CAAC;MAEW,eAAwB,SAAQ,aAAqB;IAEhE,IAAI;QACF,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;KAC9D;IACD,IAAI,CAAC,IAAI,EAAE,GAAQ,EAAE,MAAW;QAC9B,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,IAAI,MAAM,CAAC,cAAc,CAAC,GAAG,CAAC,EAAE;YACxD,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,EAAE;gBACjC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;SACF;KACF;CACF;MAEY,qBAAqB,GAAG,CACnC,MAAW,EACX,WAAgB,EAChB,OAAgB,KACb,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE;MAE1C,aAAsB,SAAQ,aAAqB;IAC9D,IAAI;QACF,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;KACnB;CACF;AAEM,MAAM,yBAAyB,GAAG,CACvC,wBAA+C,KAC5C,CAAC,MAAW,EAAE,WAAgB,EAAE,OAAgB,EAAE,IAAY;IACjE,IAAI,MAAM,IAAI,IAAI,EAAE;QAClB,OAAO,IAAI,aAAa,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;KACxD;IAED,OAAO,wBAAwB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;AACtE,CAAC,CAAC;AAEK,MAAM,kBAAkB,GAAG,CAAC,YAA6B,KAC9D,yBAAyB,CACvB,CAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;IACrD,MAAM,YAAY,GAAG,OAAO,UAAU,CAAC,MAAM,CAAC,CAAC;IAC/C,MAAM,IAAI,GAAG,YAAY,CAAC,MAAM,CAAC,CAAC;IAClC,OAAO,IAAI,eAAe,CACxB,CAAC;QACC,OAAO,OAAO,UAAU,CAAC,CAAC,CAAC,KAAK,YAAY,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC;KACzD,EACD,WAAW,EACX,OAAO,CACR,CAAC;AACJ,CAAC,CACF,CAAC;AASJ,MAAM,oBAAoB,GAAG,CAC3B,IAAY,EACZ,MAAW,EACX,WAAgB,EAChB,OAAgB;IAEhB,MAAM,gBAAgB,GAAG,OAAO,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;IAClD,IAAI,CAAC,gBAAgB,EAAE;QACrB,MAAM,IAAI,KAAK,CAAC,0BAA0B,IAAI,EAAE,CAAC,CAAC;KACnD;IACD,OAAO,gBAAgB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;AA
C9D,CAAC,CAAC;AAEK,MAAM,iBAAiB,GAAG,CAAC,KAAU;IAC1C,KAAK,MAAM,GAAG,IAAI,KAAK,EAAE;QACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG;YAAE,OAAO,IAAI,CAAC;KACxC;IACD,OAAO,KAAK,CAAC;AACf,CAAC,CAAC;AACF,MAAM,qBAAqB,GAAG,CAC5B,OAAc,EACd,WAAgB,EAChB,WAAgB,EAChB,OAAgB;IAEhB,IAAI,iBAAiB,CAAC,WAAW,CAAC,EAAE;QAClC,MAAM,CAAC,cAAc,EAAE,gBAAgB,CAAC,GAAG,qBAAqB,CAC9D,WAAW,EACX,OAAO,CACR,CAAC;QACF,IAAI,gBAAgB,CAAC,MAAM,EAAE;YAC3B,MAAM,IAAI,KAAK,CACb,kEAAkE,CACnE,CAAC;SACH;QACD,OAAO,IAAI,eAAe,CACxB,OAAO,EACP,WAAW,EACX,WAAW,EACX,OAAO,EACP,cAAc,CACf,CAAC;KACH;IACD,OAAO,IAAI,eAAe,CAAC,OAAO,EAAE,WAAW,EAAE,WAAW,EAAE,OAAO,EAAE;QACrE,IAAI,eAAe,CAAC,WAAW,EAAE,WAAW,EAAE,OAAO,CAAC;KACvD,CAAC,CAAC;AACL,CAAC,CAAC;MAEW,oBAAoB,GAAG,CAClC,KAAqB,EACrB,cAAmB,IAAI,EACvB,EAAE,OAAO,EAAE,UAAU,KAAuB,EAAE;IAE9C,MAAM,OAAO,GAAG;QACd,OAAO,EAAE,OAAO,IAAI,MAAM;QAC1B,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,UAAU,IAAI,EAAE,CAAC;KAChD,CAAC;IAEF,MAAM,CAAC,cAAc,EAAE,gBAAgB,CAAC,GAAG,qBAAqB,CAC9D,KAAK,EACL,OAAO,CACR,CAAC;IAEF,MAAM,GAAG,GAAG,EAAE,CAAC;IAEf,IAAI,cAAc,CAAC,MAAM,EAAE;QACzB,GAAG,CAAC,IAAI,CACN,IAAI,eAAe,CAAC,EAAE,EAAE,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,cAAc,CAAC,CACrE,CAAC;KACH;IAED,GAAG,CAAC,IAAI,CAAC,GAAG,gBAAgB,CAAC,CAAC;IAE9B,IAAI,GAAG,CAAC,MAAM,KAAK,CAAC,EAAE;QACpB,OAAO,GAAG,CAAC,CAAC,CAAC,CAAC;KACf;IACD,OAAO,IAAI,cAAc,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,GAAG,CAAC,CAAC;AAC9D,EAAE;AAEF,MAAM,qBAAqB,GAAG,CAAC,KAAU,EAAE,OAAgB;IACzD,MAAM,cAAc,GAAG,EAAE,CAAC;IAC1B,MAAM,gBAAgB,GAAG,EAAE,CAAC;IAC5B,IAAI,CAAC,eAAe,CAAC,KAAK,CAAC,EAAE;QAC3B,cAAc,CAAC,IAAI,CAAC,IAAI,eAAe,CAAC,KAAK,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC,CAAC;QAChE,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;KAC3C;IACD,KAAK,MAAM,GAAG,IAAI,KAAK,EAAE;QACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;YACzB,MAAM,EAAE,GAAG,oBAAoB,CAAC,GAAG,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC;;YAGjE,IAAI,EAAE,IAAI,IAAI,EAAE;gBACd,cAAc,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;aACzB;SACF;aAAM;YACL,gBAAgB,CAAC,IAAI,CACnB,qBAAqB,CAAC,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAClE,CAAC;SACH;KACF;IAED,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;AAC5C,CAAC,CAAC;MAEW,qBAAqB,GAAG,CAAQ,SAA2B,KAAK,CAC3E,IAAW,EACX,GAAS,EACT,KAAW;IAEX,SAAS,CAAC,KAAK,EAAE,CAAC;IAClB,SAAS,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;IACjC,OAAO,SAAS,CAAC,IAAI,CAAC;AACxB,EAAE;MAEW,iBAAiB,GAAG,CAC/B,KAAqB,EACrB,UAA4B,EAAE;IAE9B,OAAO,qBAAqB,CAC1B,oBAAoB,CAAiB,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAC3D,CAAC;AACJ;;AC9aA,MAAM,GAAI,SAAQ,kBAAuB;IAEvC,IAAI;QACF,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;KAC9D;IACD,KAAK;QACH,KAAK,CAAC,KAAK,EAAE,CAAC;QACd,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;IACD,IAAI,CAAC,IAAS;QACZ,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE;YACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;KACF;CACF;AACD;AACA,MAAM,UAAW,SAAQ,kBAA8B;IAErD,IAAI;QACF,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;KACH;IACD,KAAK;QACH,KAAK,CAAC,KAAK,EAAE,CAAC;QACd,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;KAC9B;IACD,IAAI,CAAC,IAAS;QACZ,IAAI,OAAO,CAAC,IAAI,CAAC,EAAE;YACjB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;;;gBAGlD,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;;gBAG7B,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,CAAC,CAAC;gBAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;aACpD;YACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;aAAM;YACL,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;KACF;CACF;AAED
,MAAM,IAAK,SAAQ,kBAA8B;IAE/C,IAAI;QACF,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;KACH;IACD,KAAK;QACH,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;KAC9B;IACD,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;QACtC,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;KACxC;CACF;MAEY,KAAM,SAAQ,kBAAuB;IAChD,IAAI,MAAK;IACT,IAAI,CAAC,IAAI;QACP,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,IAAI,CAAC,MAAM,KAAK,IAAI,CAAC,MAAM,EAAE;YAChD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;;;;;KAKF;CACF;AAED,MAAM,GAAI,SAAQ,kBAAuB;IAEvC,IAAI;QACF,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,EAAE,IAC5B,oBAAoB,CAAC,EAAE,EAAE,IAAI,EAAE,IAAI,CAAC,OAAO,CAAC,CAC7C,CAAC;KACH;IACD,KAAK;QACH,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,CAAC,IAAI,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;YACvD,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;SACtB;KACF;IACD,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,IAAI,GAAG,KAAK,CAAC;QACjB,IAAI,OAAO,GAAG,KAAK,CAAC;QACpB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,CAAC,IAAI,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;YACvD,MAAM,EAAE,GAAG,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;YACxB,EAAE,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC1B,IAAI,EAAE,CAAC,IAAI,EAAE;gBACX,IAAI,GAAG,IAAI,CAAC;gBACZ,OAAO,GAAG,EAAE,CAAC,IAAI,CAAC;gBAClB,MAAM;aACP;SACF;QAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;QACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;CACF;AAED,MAAM,IAAK,SAAQ,GAAG;IACpB,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,KAAK,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;KACxB;CACF;AAED,MAAM,GAAI,SAAQ,kBAAuB;IAEvC,IAAI;QACF,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,KAAK;YACnC,IAAI,iBAAiB,CAAC,KAAK,CAAC,EAAE;gBAC5B,MAAM,IAAI,KAAK,CACb,uBAAuB,IAAI,CAAC,WAAW,CAAC,IAAI,CAAC,WAAW,EAAE,EAAE,CAC7D,CAAC;aACH;YACD,OAAO,YAAY,CAAC,KAAK,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAClD,CAAC,CAAC;KACJ;IACD,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,IAAI,GAAG,KAAK,CAAC;QACjB,IAAI,OAAO,GAAG,KAAK,CAAC;QACpB,KAAK,IAAI,CAAC,GAAG,CAAC,EAAE,EAAE,MAAM,EAAE,GAAG,IAAI,CAAC,QAAQ,EAAE,CAAC,GAAG,MAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,MAAM,IAAI,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;YAC9B,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;gBACd,IAAI,GAAG,IAAI,CAAC;gBACZ,OAAO,GAAG,IAAI,CAAC;gBACf,MAAM;aACP;SACF;QAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;QACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;CACF;AAED,MAAM,IAAK,SAAQ,GAAG;IACpB,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,KAAK,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;KACxB;CACF;AAED,MAAM,OAAQ,SAAQ,kBAA2B;IAC/C,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,KAAK,CAAC,cAAc,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,MAAM,EAAE;YAC7C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;KACF;CACF;AAED,MAAM,IAAK,SAAQ,mBAAmB;IACpC,YACE,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY;QAEZ,KAAK,CACH,MAAM,EACN,WAAW,EACX,OAAO,EACP,MAAM,CAAC,GAAG,CAAC,KAAK,IAAI,oBAAoB,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC,EACtE,IAAI,CACL,CAAC;KACH;IACD,IAAI,CAAC,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;KACrC;CACF;MAEY,GAAG,GAAG,CAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB,KACxE,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE;MACvC,GAAG,GAAG,CACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE
;MACpC,GAAG,GAAG,CACjB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;MACpC,IAAI,GAAG,CAClB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;MACrC,UAAU,GAAG,CACxB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,UAAU,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;MAC3C,IAAI,GAAG,CAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;MACrC,GAAG,GAAG,CACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;MAEpC,GAAG,GAAG,kBAAkB,CAAC,MAAM,IAAI,CAAC,IAAI,CAAC,GAAG,MAAM,EAAE;MACpD,IAAI,GAAG,kBAAkB,CAAC,MAAM,IAAI,CAAC,IAAI,CAAC,IAAI,MAAM,EAAE;MACtD,GAAG,GAAG,kBAAkB,CAAC,MAAM,IAAI,CAAC,IAAI,CAAC,GAAG,MAAM,EAAE;MACpD,IAAI,GAAG,kBAAkB,CAAC,MAAM,IAAI,CAAC,IAAI,CAAC,IAAI,MAAM,EAAE;MACtD,IAAI,GAAG,CAClB,CAAC,GAAG,EAAE,WAAW,CAAW,EAC5B,WAAuB,EACvB,OAAgB,KAEhB,IAAI,eAAe,CACjB,CAAC,IAAI,UAAU,CAAC,CAAC,CAAC,GAAG,GAAG,KAAK,WAAW,EACxC,WAAW,EACX,OAAO,EACP;MACS,OAAO,GAAG,CACrB,MAAe,EACf,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,OAAO,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;MACxC,MAAM,GAAG,CACpB,OAAe,EACf,WAAuB,EACvB,OAAgB,KAEhB,IAAI,eAAe,CACjB,IAAI,MAAM,CAAC,OAAO,EAAE,WAAW,CAAC,QAAQ,CAAC,EACzC,WAAW,EACX,OAAO,EACP;MACS,IAAI,GAAG,CAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,KACT,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,EAAE;AAElD,MAAM,WAAW,GAAG;IAClB,MAAM,EAAE,CAAC,IAAI,OAAO,CAAC,KAAK,QAAQ;IAClC,MAAM,EAAE,CAAC,IAAI,OAAO,CAAC,KAAK,QAAQ;IAClC,IAAI,EAAE,CAAC,IAAI,OAAO,CAAC,KAAK,SAAS;IACjC,KAAK,EAAE,CAAC,IAAI,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC;IAC5B,IAAI,EAAE,CAAC,IAAI,CAAC,KAAK,IAAI;IACrB,SAAS,EAAE,CAAC,IAAI,CAAC,YAAY,IAAI;CAClC,CAAC;MAEW,KAAK,GAAG,CACnB,KAAwB,EACxB,WAAuB,EACvB,OAAgB,KAEhB,IAAI,eAAe,CACjB,CAAC;IACC,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;QAC7B,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,EAAE;YACvB,MAAM,IAAI,KAAK,CAAC,2BAA2B,CAAC,CAAC;SAC9C;QAED,OAAO,WAAW,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;KAC9B;IAED,OAAO,CAAC,IAAI,IAAI,GAAG,CAAC,YAAY,KAAK,IAAI,CAAC,CAAC,WAAW,KAAK,KAAK,GAAG,KAAK,CAAC;AAC3E,CAAC,EACD,WAAW,EACX,OAAO,EACP;MACS,IAAI,GAAG,CAClB,MAAoB,EACpB,UAAsB,EACtB,OAAgB,EAChB,IAAY,KACT,IAAI,IAAI,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,IAAI,EAAE;MACpC,IAAI,GAAG,KAAK;MACZ,KAAK,GAAG,CACnB,MAAc,EACd,UAAsB,EACtB,OAAgB,KACb,IAAI,KAAK,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,OAAO,EAAE;MACxC,QAAQ,GAAG,MAAM,KAAK;MACtB,MAAM,GAAG,CACpB,MAAyB,EACzB,UAAsB,EACtB,OAAgB;IAEhB,IAAI,IAAI,CAAC;IAET,IAAI,UAAU,CAAC,MAAM,CAAC,EAAE;QACtB,IAAI,GAAG,MAAM,CAAC;KACf;SAAM,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,WAAW,EAAE;QACnC,IAAI,GAAG,IAAI,QAAQ,CAAC,KAAK,EAAE,SAAS,GAAG,MAAM,CAAC,CAAC;KAChD;SAAM;QACL,MAAM,IAAI,KAAK,CACb,kEAAkE,CACnE,CAAC;KACH;IAED,OAAO,IAAI,eAAe,CAAC,CAAC,IAAI,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,UAAU,EAAE,OAAO,CAAC,CAAC;AACxE;;;;;;;;;;;;;;;;;;;;;;;;;;;;MCxUM,2BAA2B,GAAG,CAClC,KAAqB,EACrB,UAAe,EACf,EAAE,OAAO,EAAE,UAAU,KAAuB,EAAE;IAE9C,OAAO,oBAAoB,CAAC,KAAK,EAAE,UAAU,EAAE;QAC7C,OAAO,EAAE,OAAO;QAChB,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,iBAAiB,EAAE,UAAU,IAAI,EAAE,CAAC;KACnE,CAAC,CAAC;AACL,EAAE;AAEF,MAAM,wBAAwB,GAAG,CAC/B,KAAqB,EACrB,UAA4B,EAAE;IAE9B,MAAM,EAAE,GAAG,2BAA2B,CAAC,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;IAC7D,OAAO,qBAAqB,CAAC,EAAE,CAAC,CAAC;AACnC,CAAC;;;;;"} \ No newline at end of file diff --git a/node_modules/sift/es5m/index.js b/node_modules/sift/es5m/index.js new file mode 100644 index 000000000..62af37b5d --- /dev/null +++ b/node_modules/sift/es5m/index.js @@ -0,0 +1,652 @@ +/*! 
***************************************************************************** +Copyright (c) Microsoft Corporation. + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH +REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY +AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, +INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM +LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR +OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR +PERFORMANCE OF THIS SOFTWARE. +***************************************************************************** */ +/* global Reflect, Promise */ + +var extendStatics = function(d, b) { + extendStatics = Object.setPrototypeOf || + ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) || + function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; }; + return extendStatics(d, b); +}; + +function __extends(d, b) { + extendStatics(d, b); + function __() { this.constructor = d; } + d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __()); +} + +var typeChecker = function (type) { + var typeString = "[object " + type + "]"; + return function (value) { + return getClassName(value) === typeString; + }; +}; +var getClassName = function (value) { return Object.prototype.toString.call(value); }; +var comparable = function (value) { + if (value instanceof Date) { + return value.getTime(); + } + else if (isArray(value)) { + return value.map(comparable); + } + else if (value && typeof value.toJSON === "function") { + return value.toJSON(); + } + return value; +}; +var isArray = typeChecker("Array"); +var isObject = typeChecker("Object"); +var isFunction = typeChecker("Function"); +var isVanillaObject = function (value) { + return (value && + (value.constructor === Object || + value.constructor === Array || + value.constructor.toString() === "function Object() { [native code] }" || + value.constructor.toString() === "function Array() { [native code] }") && + !value.toJSON); +}; +var equals = function (a, b) { + if (a == null && a == b) { + return true; + } + if (a === b) { + return true; + } + if (Object.prototype.toString.call(a) !== Object.prototype.toString.call(b)) { + return false; + } + if (isArray(a)) { + if (a.length !== b.length) { + return false; + } + for (var i = 0, length_1 = a.length; i < length_1; i++) { + if (!equals(a[i], b[i])) + return false; + } + return true; + } + else if (isObject(a)) { + if (Object.keys(a).length !== Object.keys(b).length) { + return false; + } + for (var key in a) { + if (!equals(a[key], b[key])) + return false; + } + return true; + } + return false; +}; + +/** + * Walks through each value given the context - used for nested operations. E.g: + * { "person.address": { $eq: "blarg" }} + */ +var walkKeyPathValues = function (item, keyPath, next, depth, key, owner) { + var currentKey = keyPath[depth]; + // if array, then try matching. Might fall through for cases like: + // { $eq: [1, 2, 3] }, [ 1, 2, 3 ]. + if (isArray(item) && isNaN(Number(currentKey))) { + for (var i = 0, length_1 = item.length; i < length_1; i++) { + // if FALSE is returned, then terminate walker. For operations, this simply + // means that the search critera was met. 
+ if (!walkKeyPathValues(item[i], keyPath, next, depth, i, item)) { + return false; + } + } + } + if (depth === keyPath.length || item == null) { + return next(item, key, owner); + } + return walkKeyPathValues(item[currentKey], keyPath, next, depth + 1, currentKey, item); +}; +var BaseOperation = /** @class */ (function () { + function BaseOperation(params, owneryQuery, options) { + this.params = params; + this.owneryQuery = owneryQuery; + this.options = options; + this.init(); + } + BaseOperation.prototype.init = function () { }; + BaseOperation.prototype.reset = function () { + this.done = false; + this.keep = false; + }; + return BaseOperation; +}()); +var NamedBaseOperation = /** @class */ (function (_super) { + __extends(NamedBaseOperation, _super); + function NamedBaseOperation(params, owneryQuery, options, name) { + var _this = _super.call(this, params, owneryQuery, options) || this; + _this.name = name; + return _this; + } + return NamedBaseOperation; +}(BaseOperation)); +var GroupOperation = /** @class */ (function (_super) { + __extends(GroupOperation, _super); + function GroupOperation(params, owneryQuery, options, children) { + var _this = _super.call(this, params, owneryQuery, options) || this; + _this.children = children; + return _this; + } + /** + */ + GroupOperation.prototype.reset = function () { + this.keep = false; + this.done = false; + for (var i = 0, length_2 = this.children.length; i < length_2; i++) { + this.children[i].reset(); + } + }; + /** + */ + GroupOperation.prototype.childrenNext = function (item, key, owner) { + var done = true; + var keep = true; + for (var i = 0, length_3 = this.children.length; i < length_3; i++) { + var childOperation = this.children[i]; + childOperation.next(item, key, owner); + if (!childOperation.keep) { + keep = false; + } + if (childOperation.done) { + if (!childOperation.keep) { + break; + } + } + else { + done = false; + } + } + this.done = done; + this.keep = keep; + }; + return GroupOperation; +}(BaseOperation)); +var NamedGroupOperation = /** @class */ (function (_super) { + __extends(NamedGroupOperation, _super); + function NamedGroupOperation(params, owneryQuery, options, children, name) { + var _this = _super.call(this, params, owneryQuery, options, children) || this; + _this.name = name; + return _this; + } + return NamedGroupOperation; +}(GroupOperation)); +var QueryOperation = /** @class */ (function (_super) { + __extends(QueryOperation, _super); + function QueryOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + /** + */ + QueryOperation.prototype.next = function (item, key, parent) { + this.childrenNext(item, key, parent); + }; + return QueryOperation; +}(GroupOperation)); +var NestedOperation = /** @class */ (function (_super) { + __extends(NestedOperation, _super); + function NestedOperation(keyPath, params, owneryQuery, options, children) { + var _this = _super.call(this, params, owneryQuery, options, children) || this; + _this.keyPath = keyPath; + /** + */ + _this._nextNestedValue = function (value, key, owner) { + _this.childrenNext(value, key, owner); + return !_this.done; + }; + return _this; + } + /** + */ + NestedOperation.prototype.next = function (item, key, parent) { + walkKeyPathValues(item, this.keyPath, this._nextNestedValue, 0, key, parent); + }; + return NestedOperation; +}(GroupOperation)); +var createTester = function (a, compare) { + if (a instanceof Function) { + return a; + } + if (a instanceof RegExp) { + return function (b) { + var result = typeof b === 
"string" && a.test(b); + a.lastIndex = 0; + return result; + }; + } + var comparableA = comparable(a); + return function (b) { return compare(comparableA, comparable(b)); }; +}; +var EqualsOperation = /** @class */ (function (_super) { + __extends(EqualsOperation, _super); + function EqualsOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + EqualsOperation.prototype.init = function () { + this._test = createTester(this.params, this.options.compare); + }; + EqualsOperation.prototype.next = function (item, key, parent) { + if (!Array.isArray(parent) || parent.hasOwnProperty(key)) { + if (this._test(item, key, parent)) { + this.done = true; + this.keep = true; + } + } + }; + return EqualsOperation; +}(BaseOperation)); +var createEqualsOperation = function (params, owneryQuery, options) { return new EqualsOperation(params, owneryQuery, options); }; +var NopeOperation = /** @class */ (function (_super) { + __extends(NopeOperation, _super); + function NopeOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + NopeOperation.prototype.next = function () { + this.done = true; + this.keep = false; + }; + return NopeOperation; +}(BaseOperation)); +var numericalOperationCreator = function (createNumericalOperation) { return function (params, owneryQuery, options, name) { + if (params == null) { + return new NopeOperation(params, owneryQuery, options); + } + return createNumericalOperation(params, owneryQuery, options, name); +}; }; +var numericalOperation = function (createTester) { + return numericalOperationCreator(function (params, owneryQuery, options) { + var typeofParams = typeof comparable(params); + var test = createTester(params); + return new EqualsOperation(function (b) { + return typeof comparable(b) === typeofParams && test(b); + }, owneryQuery, options); + }); +}; +var createNamedOperation = function (name, params, parentQuery, options) { + var operationCreator = options.operations[name]; + if (!operationCreator) { + throw new Error("Unsupported operation: " + name); + } + return operationCreator(params, parentQuery, options, name); +}; +var containsOperation = function (query) { + for (var key in query) { + if (key.charAt(0) === "$") + return true; + } + return false; +}; +var createNestedOperation = function (keyPath, nestedQuery, owneryQuery, options) { + if (containsOperation(nestedQuery)) { + var _a = createQueryOperations(nestedQuery, options), selfOperations = _a[0], nestedOperations = _a[1]; + if (nestedOperations.length) { + throw new Error("Property queries must contain only operations, or exact objects."); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, selfOperations); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, [ + new EqualsOperation(nestedQuery, owneryQuery, options) + ]); +}; +var createQueryOperation = function (query, owneryQuery, _a) { + if (owneryQuery === void 0) { owneryQuery = null; } + var _b = _a === void 0 ? 
{} : _a, compare = _b.compare, operations = _b.operations; + var options = { + compare: compare || equals, + operations: Object.assign({}, operations || {}) + }; + var _c = createQueryOperations(query, options), selfOperations = _c[0], nestedOperations = _c[1]; + var ops = []; + if (selfOperations.length) { + ops.push(new NestedOperation([], query, owneryQuery, options, selfOperations)); + } + ops.push.apply(ops, nestedOperations); + if (ops.length === 1) { + return ops[0]; + } + return new QueryOperation(query, owneryQuery, options, ops); +}; +var createQueryOperations = function (query, options) { + var selfOperations = []; + var nestedOperations = []; + if (!isVanillaObject(query)) { + selfOperations.push(new EqualsOperation(query, query, options)); + return [selfOperations, nestedOperations]; + } + for (var key in query) { + if (key.charAt(0) === "$") { + var op = createNamedOperation(key, query[key], query, options); + // probably just a flag for another operation (like $options) + if (op != null) { + selfOperations.push(op); + } + } + else { + nestedOperations.push(createNestedOperation(key.split("."), query[key], query, options)); + } + } + return [selfOperations, nestedOperations]; +}; +var createOperationTester = function (operation) { return function (item, key, owner) { + operation.reset(); + operation.next(item, key, owner); + return operation.keep; +}; }; +var createQueryTester = function (query, options) { + if (options === void 0) { options = {}; } + return createOperationTester(createQueryOperation(query, null, options)); +}; + +var $Ne = /** @class */ (function (_super) { + __extends($Ne, _super); + function $Ne() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Ne.prototype.init = function () { + this._test = createTester(this.params, this.options.compare); + }; + $Ne.prototype.reset = function () { + _super.prototype.reset.call(this); + this.keep = true; + }; + $Ne.prototype.next = function (item) { + if (this._test(item)) { + this.done = true; + this.keep = false; + } + }; + return $Ne; +}(NamedBaseOperation)); +// https://docs.mongodb.com/manual/reference/operator/query/elemMatch/ +var $ElemMatch = /** @class */ (function (_super) { + __extends($ElemMatch, _super); + function $ElemMatch() { + return _super !== null && _super.apply(this, arguments) || this; + } + $ElemMatch.prototype.init = function () { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + }; + $ElemMatch.prototype.reset = function () { + _super.prototype.reset.call(this); + this._queryOperation.reset(); + }; + $ElemMatch.prototype.next = function (item) { + if (isArray(item)) { + for (var i = 0, length_1 = item.length; i < length_1; i++) { + // reset query operation since item being tested needs to pass _all_ query + // operations for it to be a success + this._queryOperation.reset(); + // check item + this._queryOperation.next(item[i], i, item); + this.keep = this.keep || this._queryOperation.keep; + } + this.done = true; + } + else { + this.done = false; + this.keep = false; + } + }; + return $ElemMatch; +}(NamedBaseOperation)); +var $Not = /** @class */ (function (_super) { + __extends($Not, _super); + function $Not() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Not.prototype.init = function () { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + }; + $Not.prototype.reset = function () { + this._queryOperation.reset(); + }; + $Not.prototype.next = function 
(item, key, owner) { + this._queryOperation.next(item, key, owner); + this.done = this._queryOperation.done; + this.keep = !this._queryOperation.keep; + }; + return $Not; +}(NamedBaseOperation)); +var $Size = /** @class */ (function (_super) { + __extends($Size, _super); + function $Size() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Size.prototype.init = function () { }; + $Size.prototype.next = function (item) { + if (isArray(item) && item.length === this.params) { + this.done = true; + this.keep = true; + } + // if (parent && parent.length === this.params) { + // this.done = true; + // this.keep = true; + // } + }; + return $Size; +}(NamedBaseOperation)); +var $Or = /** @class */ (function (_super) { + __extends($Or, _super); + function $Or() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Or.prototype.init = function () { + var _this = this; + this._ops = this.params.map(function (op) { + return createQueryOperation(op, null, _this.options); + }); + }; + $Or.prototype.reset = function () { + this.done = false; + this.keep = false; + for (var i = 0, length_2 = this._ops.length; i < length_2; i++) { + this._ops[i].reset(); + } + }; + $Or.prototype.next = function (item, key, owner) { + var done = false; + var success = false; + for (var i = 0, length_3 = this._ops.length; i < length_3; i++) { + var op = this._ops[i]; + op.next(item, key, owner); + if (op.keep) { + done = true; + success = op.keep; + break; + } + } + this.keep = success; + this.done = done; + }; + return $Or; +}(NamedBaseOperation)); +var $Nor = /** @class */ (function (_super) { + __extends($Nor, _super); + function $Nor() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Nor.prototype.next = function (item, key, owner) { + _super.prototype.next.call(this, item, key, owner); + this.keep = !this.keep; + }; + return $Nor; +}($Or)); +var $In = /** @class */ (function (_super) { + __extends($In, _super); + function $In() { + return _super !== null && _super.apply(this, arguments) || this; + } + $In.prototype.init = function () { + var _this = this; + this._testers = this.params.map(function (value) { + if (containsOperation(value)) { + throw new Error("cannot nest $ under " + _this.constructor.name.toLowerCase()); + } + return createTester(value, _this.options.compare); + }); + }; + $In.prototype.next = function (item, key, owner) { + var done = false; + var success = false; + for (var i = 0, length_4 = this._testers.length; i < length_4; i++) { + var test = this._testers[i]; + if (test(item)) { + done = true; + success = true; + break; + } + } + this.keep = success; + this.done = done; + }; + return $In; +}(NamedBaseOperation)); +var $Nin = /** @class */ (function (_super) { + __extends($Nin, _super); + function $Nin() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Nin.prototype.next = function (item, key, owner) { + _super.prototype.next.call(this, item, key, owner); + this.keep = !this.keep; + }; + return $Nin; +}($In)); +var $Exists = /** @class */ (function (_super) { + __extends($Exists, _super); + function $Exists() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Exists.prototype.next = function (item, key, owner) { + if (owner.hasOwnProperty(key) === this.params) { + this.done = true; + this.keep = true; + } + }; + return $Exists; +}(NamedBaseOperation)); +var $And = /** @class */ (function (_super) { + __extends($And, _super); + function $And(params, owneryQuery, options, 
name) { + return _super.call(this, params, owneryQuery, options, params.map(function (query) { return createQueryOperation(query, owneryQuery, options); }), name) || this; + } + $And.prototype.next = function (item, key, owner) { + this.childrenNext(item, key, owner); + }; + return $And; +}(NamedGroupOperation)); +var $eq = function (params, owneryQuery, options) { + return new EqualsOperation(params, owneryQuery, options); +}; +var $ne = function (params, owneryQuery, options, name) { return new $Ne(params, owneryQuery, options, name); }; +var $or = function (params, owneryQuery, options, name) { return new $Or(params, owneryQuery, options, name); }; +var $nor = function (params, owneryQuery, options, name) { return new $Nor(params, owneryQuery, options, name); }; +var $elemMatch = function (params, owneryQuery, options, name) { return new $ElemMatch(params, owneryQuery, options, name); }; +var $nin = function (params, owneryQuery, options, name) { return new $Nin(params, owneryQuery, options, name); }; +var $in = function (params, owneryQuery, options, name) { return new $In(params, owneryQuery, options, name); }; +var $lt = numericalOperation(function (params) { return function (b) { return b < params; }; }); +var $lte = numericalOperation(function (params) { return function (b) { return b <= params; }; }); +var $gt = numericalOperation(function (params) { return function (b) { return b > params; }; }); +var $gte = numericalOperation(function (params) { return function (b) { return b >= params; }; }); +var $mod = function (_a, owneryQuery, options) { + var mod = _a[0], equalsValue = _a[1]; + return new EqualsOperation(function (b) { return comparable(b) % mod === equalsValue; }, owneryQuery, options); +}; +var $exists = function (params, owneryQuery, options, name) { return new $Exists(params, owneryQuery, options, name); }; +var $regex = function (pattern, owneryQuery, options) { + return new EqualsOperation(new RegExp(pattern, owneryQuery.$options), owneryQuery, options); +}; +var $not = function (params, owneryQuery, options, name) { return new $Not(params, owneryQuery, options, name); }; +var typeAliases = { + number: function (v) { return typeof v === "number"; }, + string: function (v) { return typeof v === "string"; }, + bool: function (v) { return typeof v === "boolean"; }, + array: function (v) { return Array.isArray(v); }, + null: function (v) { return v === null; }, + timestamp: function (v) { return v instanceof Date; } +}; +var $type = function (clazz, owneryQuery, options) { + return new EqualsOperation(function (b) { + if (typeof clazz === "string") { + if (!typeAliases[clazz]) { + throw new Error("Type alias does not exist"); + } + return typeAliases[clazz](b); + } + return b != null ? 
b instanceof clazz || b.constructor === clazz : false; + }, owneryQuery, options); +}; +var $and = function (params, ownerQuery, options, name) { return new $And(params, ownerQuery, options, name); }; +var $all = $and; +var $size = function (params, ownerQuery, options) { return new $Size(params, ownerQuery, options, "$size"); }; +var $options = function () { return null; }; +var $where = function (params, ownerQuery, options) { + var test; + if (isFunction(params)) { + test = params; + } + else if (!process.env.CSP_ENABLED) { + test = new Function("obj", "return " + params); + } + else { + throw new Error("In CSP mode, sift does not support strings in \"$where\" condition"); + } + return new EqualsOperation(function (b) { return test.bind(b)(b); }, ownerQuery, options); +}; + +var defaultOperations = /*#__PURE__*/Object.freeze({ + __proto__: null, + $Size: $Size, + $eq: $eq, + $ne: $ne, + $or: $or, + $nor: $nor, + $elemMatch: $elemMatch, + $nin: $nin, + $in: $in, + $lt: $lt, + $lte: $lte, + $gt: $gt, + $gte: $gte, + $mod: $mod, + $exists: $exists, + $regex: $regex, + $not: $not, + $type: $type, + $and: $and, + $all: $all, + $size: $size, + $options: $options, + $where: $where +}); + +var createDefaultQueryOperation = function (query, ownerQuery, _a) { + var _b = _a === void 0 ? {} : _a, compare = _b.compare, operations = _b.operations; + return createQueryOperation(query, ownerQuery, { + compare: compare, + operations: Object.assign({}, defaultOperations, operations || {}) + }); +}; +var createDefaultQueryTester = function (query, options) { + if (options === void 0) { options = {}; } + var op = createDefaultQueryOperation(query, null, options); + return createOperationTester(op); +}; + +export default createDefaultQueryTester; +export { $Size, $all, $and, $elemMatch, $eq, $exists, $gt, $gte, $in, $lt, $lte, $mod, $ne, $nin, $nor, $not, $options, $or, $regex, $size, $type, $where, EqualsOperation, createDefaultQueryOperation, createEqualsOperation, createOperationTester, createQueryOperation, createQueryTester }; +//# sourceMappingURL=index.js.map diff --git a/node_modules/sift/es5m/index.js.map b/node_modules/sift/es5m/index.js.map new file mode 100644 index 000000000..9e9228d34 --- /dev/null +++ b/node_modules/sift/es5m/index.js.map @@ -0,0 +1 @@ +{"version":3,"file":"index.js","sources":["../node_modules/tslib/tslib.es6.js","../src/utils.ts","../src/core.ts","../src/operations.ts","../src/index.ts"],"sourcesContent":["/*! *****************************************************************************\r\nCopyright (c) Microsoft Corporation.\r\n\r\nPermission to use, copy, modify, and/or distribute this software for any\r\npurpose with or without fee is hereby granted.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\r\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY\r\nAND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\r\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\r\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\r\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\r\nPERFORMANCE OF THIS SOFTWARE.\r\n***************************************************************************** */\r\n/* global Reflect, Promise */\r\n\r\nvar extendStatics = function(d, b) {\r\n extendStatics = Object.setPrototypeOf ||\r\n ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||\r\n function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };\r\n return extendStatics(d, b);\r\n};\r\n\r\nexport function __extends(d, b) {\r\n extendStatics(d, b);\r\n function __() { this.constructor = d; }\r\n d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());\r\n}\r\n\r\nexport var __assign = function() {\r\n __assign = Object.assign || function __assign(t) {\r\n for (var s, i = 1, n = arguments.length; i < n; i++) {\r\n s = arguments[i];\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p)) t[p] = s[p];\r\n }\r\n return t;\r\n }\r\n return __assign.apply(this, arguments);\r\n}\r\n\r\nexport function __rest(s, e) {\r\n var t = {};\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)\r\n t[p] = s[p];\r\n if (s != null && typeof Object.getOwnPropertySymbols === \"function\")\r\n for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {\r\n if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))\r\n t[p[i]] = s[p[i]];\r\n }\r\n return t;\r\n}\r\n\r\nexport function __decorate(decorators, target, key, desc) {\r\n var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;\r\n if (typeof Reflect === \"object\" && typeof Reflect.decorate === \"function\") r = Reflect.decorate(decorators, target, key, desc);\r\n else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? d(target, key, r) : d(target, key)) || r;\r\n return c > 3 && r && Object.defineProperty(target, key, r), r;\r\n}\r\n\r\nexport function __param(paramIndex, decorator) {\r\n return function (target, key) { decorator(target, key, paramIndex); }\r\n}\r\n\r\nexport function __metadata(metadataKey, metadataValue) {\r\n if (typeof Reflect === \"object\" && typeof Reflect.metadata === \"function\") return Reflect.metadata(metadataKey, metadataValue);\r\n}\r\n\r\nexport function __awaiter(thisArg, _arguments, P, generator) {\r\n function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }\r\n return new (P || (P = Promise))(function (resolve, reject) {\r\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\r\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\r\n function step(result) { result.done ? 
resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\r\n step((generator = generator.apply(thisArg, _arguments || [])).next());\r\n });\r\n}\r\n\r\nexport function __generator(thisArg, body) {\r\n var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;\r\n return g = { next: verb(0), \"throw\": verb(1), \"return\": verb(2) }, typeof Symbol === \"function\" && (g[Symbol.iterator] = function() { return this; }), g;\r\n function verb(n) { return function (v) { return step([n, v]); }; }\r\n function step(op) {\r\n if (f) throw new TypeError(\"Generator is already executing.\");\r\n while (_) try {\r\n if (f = 1, y && (t = op[0] & 2 ? y[\"return\"] : op[0] ? y[\"throw\"] || ((t = y[\"return\"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;\r\n if (y = 0, t) op = [op[0] & 2, t.value];\r\n switch (op[0]) {\r\n case 0: case 1: t = op; break;\r\n case 4: _.label++; return { value: op[1], done: false };\r\n case 5: _.label++; y = op[1]; op = [0]; continue;\r\n case 7: op = _.ops.pop(); _.trys.pop(); continue;\r\n default:\r\n if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }\r\n if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }\r\n if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }\r\n if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }\r\n if (t[2]) _.ops.pop();\r\n _.trys.pop(); continue;\r\n }\r\n op = body.call(thisArg, _);\r\n } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }\r\n if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };\r\n }\r\n}\r\n\r\nexport var __createBinding = Object.create ? (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\r\n}) : (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n o[k2] = m[k];\r\n});\r\n\r\nexport function __exportStar(m, exports) {\r\n for (var p in m) if (p !== \"default\" && !exports.hasOwnProperty(p)) __createBinding(exports, m, p);\r\n}\r\n\r\nexport function __values(o) {\r\n var s = typeof Symbol === \"function\" && Symbol.iterator, m = s && o[s], i = 0;\r\n if (m) return m.call(o);\r\n if (o && typeof o.length === \"number\") return {\r\n next: function () {\r\n if (o && i >= o.length) o = void 0;\r\n return { value: o && o[i++], done: !o };\r\n }\r\n };\r\n throw new TypeError(s ? 
\"Object is not iterable.\" : \"Symbol.iterator is not defined.\");\r\n}\r\n\r\nexport function __read(o, n) {\r\n var m = typeof Symbol === \"function\" && o[Symbol.iterator];\r\n if (!m) return o;\r\n var i = m.call(o), r, ar = [], e;\r\n try {\r\n while ((n === void 0 || n-- > 0) && !(r = i.next()).done) ar.push(r.value);\r\n }\r\n catch (error) { e = { error: error }; }\r\n finally {\r\n try {\r\n if (r && !r.done && (m = i[\"return\"])) m.call(i);\r\n }\r\n finally { if (e) throw e.error; }\r\n }\r\n return ar;\r\n}\r\n\r\nexport function __spread() {\r\n for (var ar = [], i = 0; i < arguments.length; i++)\r\n ar = ar.concat(__read(arguments[i]));\r\n return ar;\r\n}\r\n\r\nexport function __spreadArrays() {\r\n for (var s = 0, i = 0, il = arguments.length; i < il; i++) s += arguments[i].length;\r\n for (var r = Array(s), k = 0, i = 0; i < il; i++)\r\n for (var a = arguments[i], j = 0, jl = a.length; j < jl; j++, k++)\r\n r[k] = a[j];\r\n return r;\r\n};\r\n\r\nexport function __await(v) {\r\n return this instanceof __await ? (this.v = v, this) : new __await(v);\r\n}\r\n\r\nexport function __asyncGenerator(thisArg, _arguments, generator) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var g = generator.apply(thisArg, _arguments || []), i, q = [];\r\n return i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i;\r\n function verb(n) { if (g[n]) i[n] = function (v) { return new Promise(function (a, b) { q.push([n, v, a, b]) > 1 || resume(n, v); }); }; }\r\n function resume(n, v) { try { step(g[n](v)); } catch (e) { settle(q[0][3], e); } }\r\n function step(r) { r.value instanceof __await ? Promise.resolve(r.value.v).then(fulfill, reject) : settle(q[0][2], r); }\r\n function fulfill(value) { resume(\"next\", value); }\r\n function reject(value) { resume(\"throw\", value); }\r\n function settle(f, v) { if (f(v), q.shift(), q.length) resume(q[0][0], q[0][1]); }\r\n}\r\n\r\nexport function __asyncDelegator(o) {\r\n var i, p;\r\n return i = {}, verb(\"next\"), verb(\"throw\", function (e) { throw e; }), verb(\"return\"), i[Symbol.iterator] = function () { return this; }, i;\r\n function verb(n, f) { i[n] = o[n] ? function (v) { return (p = !p) ? { value: __await(o[n](v)), done: n === \"return\" } : f ? f(v) : v; } : f; }\r\n}\r\n\r\nexport function __asyncValues(o) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var m = o[Symbol.asyncIterator], i;\r\n return m ? m.call(o) : (o = typeof __values === \"function\" ? __values(o) : o[Symbol.iterator](), i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i);\r\n function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }\r\n function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }\r\n}\r\n\r\nexport function __makeTemplateObject(cooked, raw) {\r\n if (Object.defineProperty) { Object.defineProperty(cooked, \"raw\", { value: raw }); } else { cooked.raw = raw; }\r\n return cooked;\r\n};\r\n\r\nvar __setModuleDefault = Object.create ? 
(function(o, v) {\r\n Object.defineProperty(o, \"default\", { enumerable: true, value: v });\r\n}) : function(o, v) {\r\n o[\"default\"] = v;\r\n};\r\n\r\nexport function __importStar(mod) {\r\n if (mod && mod.__esModule) return mod;\r\n var result = {};\r\n if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\r\n __setModuleDefault(result, mod);\r\n return result;\r\n}\r\n\r\nexport function __importDefault(mod) {\r\n return (mod && mod.__esModule) ? mod : { default: mod };\r\n}\r\n\r\nexport function __classPrivateFieldGet(receiver, privateMap) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to get private field on non-instance\");\r\n }\r\n return privateMap.get(receiver);\r\n}\r\n\r\nexport function __classPrivateFieldSet(receiver, privateMap, value) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to set private field on non-instance\");\r\n }\r\n privateMap.set(receiver, value);\r\n return value;\r\n}\r\n",null,null,null,null],"names":[],"mappings":"AAAA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA;AACA,IAAI,aAAa,GAAG,SAAS,CAAC,EAAE,CAAC,EAAE;AACnC,IAAI,aAAa,GAAG,MAAM,CAAC,cAAc;AACzC,SAAS,EAAE,SAAS,EAAE,EAAE,EAAE,YAAY,KAAK,IAAI,UAAU,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC,EAAE,CAAC;AACpF,QAAQ,UAAU,CAAC,EAAE,CAAC,EAAE,EAAE,KAAK,IAAI,CAAC,IAAI,CAAC,EAAE,IAAI,CAAC,CAAC,cAAc,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;AACnF,IAAI,OAAO,aAAa,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;AAC/B,CAAC,CAAC;AACF;AACO,SAAS,SAAS,CAAC,CAAC,EAAE,CAAC,EAAE;AAChC,IAAI,aAAa,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;AACxB,IAAI,SAAS,EAAE,GAAG,EAAE,IAAI,CAAC,WAAW,GAAG,CAAC,CAAC,EAAE;AAC3C,IAAI,CAAC,CAAC,SAAS,GAAG,CAAC,KAAK,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC,SAAS,GAAG,CAAC,CAAC,SAAS,EAAE,IAAI,EAAE,EAAE,CAAC,CAAC;AACzF;;ACzBO,IAAM,WAAW,GAAG,UAAQ,IAAI;IACrC,IAAM,UAAU,GAAG,UAAU,GAAG,IAAI,GAAG,GAAG,CAAC;IAC3C,OAAO,UAAS,KAAK;QACnB,OAAO,YAAY,CAAC,KAAK,CAAC,KAAK,UAAU,CAAC;KAC3C,CAAC;AACJ,CAAC,CAAC;AAEF,IAAM,YAAY,GAAG,UAAA,KAAK,IAAI,OAAA,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,GAAA,CAAC;AAE7D,IAAM,UAAU,GAAG,UAAC,KAAU;IACnC,IAAI,KAAK,YAAY,IAAI,EAAE;QACzB,OAAO,KAAK,CAAC,OAAO,EAAE,CAAC;KACxB;SAAM,IAAI,OAAO,CAAC,KAAK,CAAC,EAAE;QACzB,OAAO,KAAK,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;KAC9B;SAAM,IAAI,KAAK,IAAI,OAAO,KAAK,CAAC,MAAM,KAAK,UAAU,EAAE;QACtD,OAAO,KAAK,CAAC,MAAM,EAAE,CAAC;KACvB;IAED,OAAO,KAAK,CAAC;AACf,CAAC,CAAC;AAEK,IAAM,OAAO,GAAG,WAAW,CAAa,OAAO,CAAC,CAAC;AACjD,IAAM,QAAQ,GAAG,WAAW,CAAS,QAAQ,CAAC,CAAC;AAC/C,IAAM,UAAU,GAAG,WAAW,CAAW,UAAU,CAAC,CAAC;AACrD,IAAM,eAAe,GAAG,UAAA,KAAK;IAClC,QACE,KAAK;SACJ,KAAK,CAAC,WAAW,KAAK,MAAM;YAC3B,KAAK,CAAC,WAAW,KAAK,KAAK;YAC3B,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,qCAAqC;YACtE,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,oCAAoC,CAAC;QACxE,CAAC,KAAK,CAAC,MAAM,EACb;AACJ,CAAC,CAAC;AAEK,IAAM,MAAM,GAAG,UAAC,CAAC,EAAE,CAAC;IACzB,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;QACvB,OAAO,IAAI,CAAC;KACb;IACD,IAAI,CAAC,KAAK,CAAC,EAAE;QACX,OAAO,IAAI,CAAC;KACb;IAED,IAAI,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE;QAC3E,OAAO,KAAK,CAAC;KACd;IAED,IAAI,OAAO,CAAC,CAAC,CAAC,EAAE;QACd,IAAI,CAAC,CAAC,MAAM,KAAK,CAAC,CAAC,MAAM,EAAE;YACzB,OAAO,KAAK,CAAC;SACd;QACD,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,mBAAM,EAAQ,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;YAC/C,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;gBAAE,OAAO,KAAK,CAAC;SACvC;QACD,OAAO,IAAI,CAAC;KACb;SAAM,IAAI,QAA
Q,CAAC,CAAC,CAAC,EAAE;QACtB,IAAI,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,KAAK,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE;YACnD,OAAO,KAAK,CAAC;SACd;QACD,KAAK,IAAM,GAAG,IAAI,CAAC,EAAE;YACnB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC;gBAAE,OAAO,KAAK,CAAC;SAC3C;QACD,OAAO,IAAI,CAAC;KACb;IACD,OAAO,KAAK,CAAC;AACf,CAAC;;ACMD;;;;AAKA,IAAM,iBAAiB,GAAG,UACxB,IAAS,EACT,OAAc,EACd,IAAY,EACZ,KAAa,EACb,GAAQ,EACR,KAAU;IAEV,IAAM,UAAU,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC;;;IAIlC,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,KAAK,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,EAAE;QAC9C,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,sBAAM,EAAW,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;;;YAGlD,IAAI,CAAC,iBAAiB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,KAAK,EAAE,CAAC,EAAE,IAAI,CAAC,EAAE;gBAC9D,OAAO,KAAK,CAAC;aACd;SACF;KACF;IAED,IAAI,KAAK,KAAK,OAAO,CAAC,MAAM,IAAI,IAAI,IAAI,IAAI,EAAE;QAC5C,OAAO,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;KAC/B;IAED,OAAO,iBAAiB,CACtB,IAAI,CAAC,UAAU,CAAC,EAChB,OAAO,EACP,IAAI,EACJ,KAAK,GAAG,CAAC,EACT,UAAU,EACV,IAAI,CACL,CAAC;AACJ,CAAC,CAAC;AAEF;IAGE,uBACW,MAAe,EACf,WAAgB,EAChB,OAAgB;QAFhB,WAAM,GAAN,MAAM,CAAS;QACf,gBAAW,GAAX,WAAW,CAAK;QAChB,YAAO,GAAP,OAAO,CAAS;QAEzB,IAAI,CAAC,IAAI,EAAE,CAAC;KACb;IACS,4BAAI,GAAd,eAAmB;IACnB,6BAAK,GAAL;QACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;KACnB;IAEH,oBAAC;AAAD,CAAC,IAAA;AAED;IACU,sCAA6B;IAErC,4BACE,MAAe,EACf,WAAgB,EAChB,OAAgB,EACP,IAAY;QAJvB,YAME,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,SACpC;QAHU,UAAI,GAAJ,IAAI,CAAQ;;KAGtB;IACH,yBAAC;AAAD,CAXA,CACU,aAAa,GAUtB;AAED;IAAsC,kCAAkB;IAItD,wBACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EACA,QAA0B;QAJ5C,YAME,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,SACpC;QAHiB,cAAQ,GAAR,QAAQ,CAAkB;;KAG3C;;;IAKD,8BAAK,GAAL;QACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;SAC1B;KACF;;;IAOS,qCAAY,GAAtB,UAAuB,IAAS,EAAE,GAAQ,EAAE,KAAU;QACpD,IAAI,IAAI,GAAG,IAAI,CAAC;QAChB,IAAI,IAAI,GAAG,IAAI,CAAC;QAChB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,IAAM,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;YACxC,cAAc,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YACtC,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;gBACxB,IAAI,GAAG,KAAK,CAAC;aACd;YACD,IAAI,cAAc,CAAC,IAAI,EAAE;gBACvB,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;oBACxB,MAAM;iBACP;aACF;iBAAM;gBACL,IAAI,GAAG,KAAK,CAAC;aACd;SACF;QACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;IACH,qBAAC;AAAD,CAjDA,CAAsC,aAAa,GAiDlD;AAED;IAAkD,uCAAc;IAE9D,6BACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B,EACjB,IAAY;QALvB,YAOE,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,SAC9C;QAHU,UAAI,GAAJ,IAAI,CAAQ;;KAGtB;IACH,0BAAC;AAAD,CAXA,CAAkD,cAAc,GAW/D;AAED;IAA2C,kCAAc;IAAzD;;KAOC;;;IAHC,6BAAI,GAAJ,UAAK,IAAW,EAAE,GAAQ,EAAE,MAAW;QACrC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC;KACtC;IACH,qBAAC;AAAD,CAPA,CAA2C,cAAc,GAOxD;AAED;IAAqC,mCAAc;IACjD,yBACW,OAAc,EACvB,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B;QAL5B,YAOE,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,SAC9C;QAPU,aAAO,GAAP,OAAO,CAAO;;;QAyBjB,sBAAgB,GAAG,UAAC,KAAU,EAAE,GAAQ,EAAE,KAAU;YAC1D,KAAI,CAAC,YAAY,CAAC,KAAK,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YACrC,OAAO,CAAC,KAAI,CAAC,IAAI,CAAC;SACnB,CAAC;;KArBD;;;IAID,8BAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,MAAW;QACnC,iBAAiB,CACf,IAAI,EACJ,IAAI,CAAC,OAAO,EACZ,IAAI,CAAC,gBAAgB,EACrB,CAAC,EACD,GAAG,EACH,MAAM,CACP,CAAC;KACH;IASH,sBAAC;AAAD,CA/BA,CAAqC,cAAc,GA+BlD;AAEM,IAAM,YAAY,GAAG,UAAC,CAAC,EAAE,OAAmB;IACjD,IAAI,CAAC,YAAY,QAAQ,EAAE;QACzB
,OAAO,CAAC,CAAC;KACV;IACD,IAAI,CAAC,YAAY,MAAM,EAAE;QACvB,OAAO,UAAA,CAAC;YACN,IAAM,MAAM,GAAG,OAAO,CAAC,KAAK,QAAQ,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;YAClD,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC;YAChB,OAAO,MAAM,CAAC;SACf,CAAC;KACH;IACD,IAAM,WAAW,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;IAClC,OAAO,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,WAAW,EAAE,UAAU,CAAC,CAAC,CAAC,CAAC,GAAA,CAAC;AAClD,CAAC,CAAC;;IAE2C,mCAAqB;IAAlE;;KAaC;IAXC,8BAAI,GAAJ;QACE,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;KAC9D;IACD,8BAAI,GAAJ,UAAK,IAAI,EAAE,GAAQ,EAAE,MAAW;QAC9B,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,IAAI,MAAM,CAAC,cAAc,CAAC,GAAG,CAAC,EAAE;YACxD,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,EAAE;gBACjC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;SACF;KACF;IACH,sBAAC;AAAD,CAbA,CAA6C,aAAa,GAazD;IAEY,qBAAqB,GAAG,UACnC,MAAW,EACX,WAAgB,EAChB,OAAgB,IACb,OAAA,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,IAAC;AAEvD;IAA2C,iCAAqB;IAAhE;;KAKC;IAJC,4BAAI,GAAJ;QACE,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;QACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;KACnB;IACH,oBAAC;AAAD,CALA,CAA2C,aAAa,GAKvD;AAEM,IAAM,yBAAyB,GAAG,UACvC,wBAA+C,IAC5C,OAAA,UAAC,MAAW,EAAE,WAAgB,EAAE,OAAgB,EAAE,IAAY;IACjE,IAAI,MAAM,IAAI,IAAI,EAAE;QAClB,OAAO,IAAI,aAAa,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;KACxD;IAED,OAAO,wBAAwB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;AACtE,CAAC,GAAA,CAAC;AAEK,IAAM,kBAAkB,GAAG,UAAC,YAA6B;IAC9D,OAAA,yBAAyB,CACvB,UAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;QACrD,IAAM,YAAY,GAAG,OAAO,UAAU,CAAC,MAAM,CAAC,CAAC;QAC/C,IAAM,IAAI,GAAG,YAAY,CAAC,MAAM,CAAC,CAAC;QAClC,OAAO,IAAI,eAAe,CACxB,UAAA,CAAC;YACC,OAAO,OAAO,UAAU,CAAC,CAAC,CAAC,KAAK,YAAY,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC;SACzD,EACD,WAAW,EACX,OAAO,CACR,CAAC;KACH,CACF;AAZD,CAYC,CAAC;AASJ,IAAM,oBAAoB,GAAG,UAC3B,IAAY,EACZ,MAAW,EACX,WAAgB,EAChB,OAAgB;IAEhB,IAAM,gBAAgB,GAAG,OAAO,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;IAClD,IAAI,CAAC,gBAAgB,EAAE;QACrB,MAAM,IAAI,KAAK,CAAC,4BAA0B,IAAM,CAAC,CAAC;KACnD;IACD,OAAO,gBAAgB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;AAC9D,CAAC,CAAC;AAEK,IAAM,iBAAiB,GAAG,UAAC,KAAU;IAC1C,KAAK,IAAM,GAAG,IAAI,KAAK,EAAE;QACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG;YAAE,OAAO,IAAI,CAAC;KACxC;IACD,OAAO,KAAK,CAAC;AACf,CAAC,CAAC;AACF,IAAM,qBAAqB,GAAG,UAC5B,OAAc,EACd,WAAgB,EAChB,WAAgB,EAChB,OAAgB;IAEhB,IAAI,iBAAiB,CAAC,WAAW,CAAC,EAAE;QAC5B,IAAA,gDAGL,EAHM,sBAAc,EAAE,wBAGtB,CAAC;QACF,IAAI,gBAAgB,CAAC,MAAM,EAAE;YAC3B,MAAM,IAAI,KAAK,CACb,kEAAkE,CACnE,CAAC;SACH;QACD,OAAO,IAAI,eAAe,CACxB,OAAO,EACP,WAAW,EACX,WAAW,EACX,OAAO,EACP,cAAc,CACf,CAAC;KACH;IACD,OAAO,IAAI,eAAe,CAAC,OAAO,EAAE,WAAW,EAAE,WAAW,EAAE,OAAO,EAAE;QACrE,IAAI,eAAe,CAAC,WAAW,EAAE,WAAW,EAAE,OAAO,CAAC;KACvD,CAAC,CAAC;AACL,CAAC,CAAC;IAEW,oBAAoB,GAAG,UAClC,KAAqB,EACrB,WAAuB,EACvB,EAA8C;IAD9C,4BAAA,EAAA,kBAAuB;QACvB,4BAA8C,EAA5C,oBAAO,EAAE,0BAAU;IAErB,IAAM,OAAO,GAAG;QACd,OAAO,EAAE,OAAO,IAAI,MAAM;QAC1B,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,UAAU,IAAI,EAAE,CAAC;KAChD,CAAC;IAEI,IAAA,0CAGL,EAHM,sBAAc,EAAE,wBAGtB,CAAC;IAEF,IAAM,GAAG,GAAG,EAAE,CAAC;IAEf,IAAI,cAAc,CAAC,MAAM,EAAE;QACzB,GAAG,CAAC,IAAI,CACN,IAAI,eAAe,CAAC,EAAE,EAAE,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,cAAc,CAAC,CACrE,CAAC;KACH;IAED,GAAG,CAAC,IAAI,OAAR,GAAG,EAAS,gBAAgB,EAAE;IAE9B,IAAI,GAAG,CAAC,MAAM,KAAK,CAAC,EAAE;QACpB,OAAO,GAAG,CAAC,CAAC,CAAC,CAAC;KACf;IACD,OAAO,IAAI,cAAc,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,GAAG,CAAC,CAAC;AAC9D,EAAE;AAEF,IAAM,qBAAqB,GAAG,UAAC,KAAU,EAAE,OAAgB;IACzD,IAAM,cAAc,GAAG,EAAE,CAAC;IAC1B,IAAM,gBAAgB,GAAG,EAAE,CAAC;IAC5B,IAAI,CAAC,eAAe,CAAC,KAAK,CAAC,EAAE;QAC3B,cAAc,CAAC,IAAI,CAAC,IAAI,eAAe,CAAC,KAAK,EAAE,KAAK,EAAE,OAAO,CAAC
,CAAC,CAAC;QAChE,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;KAC3C;IACD,KAAK,IAAM,GAAG,IAAI,KAAK,EAAE;QACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;YACzB,IAAM,EAAE,GAAG,oBAAoB,CAAC,GAAG,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC;;YAGjE,IAAI,EAAE,IAAI,IAAI,EAAE;gBACd,cAAc,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;aACzB;SACF;aAAM;YACL,gBAAgB,CAAC,IAAI,CACnB,qBAAqB,CAAC,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAClE,CAAC;SACH;KACF;IAED,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;AAC5C,CAAC,CAAC;IAEW,qBAAqB,GAAG,UAAQ,SAA2B,IAAK,OAAA,UAC3E,IAAW,EACX,GAAS,EACT,KAAW;IAEX,SAAS,CAAC,KAAK,EAAE,CAAC;IAClB,SAAS,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;IACjC,OAAO,SAAS,CAAC,IAAI,CAAC;AACxB,CAAC,IAAC;IAEW,iBAAiB,GAAG,UAC/B,KAAqB,EACrB,OAA8B;IAA9B,wBAAA,EAAA,YAA8B;IAE9B,OAAO,qBAAqB,CAC1B,oBAAoB,CAAiB,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAC3D,CAAC;AACJ;;AC9aA;IAAkB,uBAAuB;IAAzC;;KAeC;IAbC,kBAAI,GAAJ;QACE,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;KAC9D;IACD,mBAAK,GAAL;QACE,iBAAM,KAAK,WAAE,CAAC;QACd,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;IACD,kBAAI,GAAJ,UAAK,IAAS;QACZ,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE;YACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;KACF;IACH,UAAC;AAAD,CAfA,CAAkB,kBAAkB,GAenC;AACD;AACA;IAAyB,8BAA8B;IAAvD;;KA8BC;IA5BC,yBAAI,GAAJ;QACE,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;KACH;IACD,0BAAK,GAAL;QACE,iBAAM,KAAK,WAAE,CAAC;QACd,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;KAC9B;IACD,yBAAI,GAAJ,UAAK,IAAS;QACZ,IAAI,OAAO,CAAC,IAAI,CAAC,EAAE;YACjB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,sBAAM,EAAW,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;;;gBAGlD,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;;gBAG7B,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,CAAC,CAAC;gBAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;aACpD;YACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;aAAM;YACL,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;KACF;IACH,iBAAC;AAAD,CA9BA,CAAyB,kBAAkB,GA8B1C;AAED;IAAmB,wBAA8B;IAAjD;;KAiBC;IAfC,mBAAI,GAAJ;QACE,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;KACH;IACD,oBAAK,GAAL;QACE,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;KAC9B;IACD,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;QACtC,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;KACxC;IACH,WAAC;AAAD,CAjBA,CAAmB,kBAAkB,GAiBpC;;IAE0B,yBAAuB;IAAlD;;KAYC;IAXC,oBAAI,GAAJ,eAAS;IACT,oBAAI,GAAJ,UAAK,IAAI;QACP,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,IAAI,CAAC,MAAM,KAAK,IAAI,CAAC,MAAM,EAAE;YAChD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;;;;;KAKF;IACH,YAAC;AAAD,CAZA,CAA2B,kBAAkB,GAY5C;AAED;IAAkB,uBAAuB;IAAzC;;KA8BC;IA5BC,kBAAI,GAAJ;QAAA,iBAIC;QAHC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,UAAA,EAAE;YAC5B,OAAA,oBAAoB,CAAC,EAAE,EAAE,IAAI,EAAE,KAAI,CAAC,OAAO,CAAC;SAAA,CAC7C,CAAC;KACH;IACD,mBAAK,GAAL;QACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;QAClB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,2BAAM,EAAgB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;YACvD,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;SACtB;KACF;IACD,kBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,IAAI,GAAG,KAAK,CAAC;QACjB,IAAI,OAAO,GAAG,KAAK,CAAC;QACpB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,2BAAM,EAAgB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;YACvD,IAAM,EAAE,GAAG,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;YACxB,
EAAE,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC1B,IAAI,EAAE,CAAC,IAAI,EAAE;gBACX,IAAI,GAAG,IAAI,CAAC;gBACZ,OAAO,GAAG,EAAE,CAAC,IAAI,CAAC;gBAClB,MAAM;aACP;SACF;QAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;QACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;IACH,UAAC;AAAD,CA9BA,CAAkB,kBAAkB,GA8BnC;AAED;IAAmB,wBAAG;IAAtB;;KAKC;IAJC,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,iBAAM,IAAI,YAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;KACxB;IACH,WAAC;AAAD,CALA,CAAmB,GAAG,GAKrB;AAED;IAAkB,uBAAuB;IAAzC;;KA2BC;IAzBC,kBAAI,GAAJ;QAAA,iBASC;QARC,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,UAAA,KAAK;YACnC,IAAI,iBAAiB,CAAC,KAAK,CAAC,EAAE;gBAC5B,MAAM,IAAI,KAAK,CACb,yBAAuB,KAAI,CAAC,WAAW,CAAC,IAAI,CAAC,WAAW,EAAI,CAC7D,CAAC;aACH;YACD,OAAO,YAAY,CAAC,KAAK,EAAE,KAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAClD,CAAC,CAAC;KACJ;IACD,kBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,IAAI,GAAG,KAAK,CAAC;QACjB,IAAI,OAAO,GAAG,KAAK,CAAC;QACpB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;YAC3D,IAAM,IAAI,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;YAC9B,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;gBACd,IAAI,GAAG,IAAI,CAAC;gBACZ,OAAO,GAAG,IAAI,CAAC;gBACf,MAAM;aACP;SACF;QAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;QACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;KAClB;IACH,UAAC;AAAD,CA3BA,CAAkB,kBAAkB,GA2BnC;AAED;IAAmB,wBAAG;IAAtB;;KAKC;IAJC,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,iBAAM,IAAI,YAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;KACxB;IACH,WAAC;AAAD,CALA,CAAmB,GAAG,GAKrB;AAED;IAAsB,2BAA2B;IAAjD;;KAOC;IANC,sBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,KAAK,CAAC,cAAc,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,MAAM,EAAE;YAC7C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;KACF;IACH,cAAC;AAAD,CAPA,CAAsB,kBAAkB,GAOvC;AAED;IAAmB,wBAAmB;IACpC,cACE,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY;eAEZ,kBACE,MAAM,EACN,WAAW,EACX,OAAO,EACP,MAAM,CAAC,GAAG,CAAC,UAAA,KAAK,IAAI,OAAA,oBAAoB,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,CAAC,GAAA,CAAC,EACtE,IAAI,CACL;KACF;IACD,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;QAClC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;KACrC;IACH,WAAC;AAAD,CAlBA,CAAmB,mBAAmB,GAkBrC;IAEY,GAAG,GAAG,UAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;IACxE,OAAA,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC;AAAjD,EAAkD;IACvC,GAAG,GAAG,UACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IACpC,GAAG,GAAG,UACjB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IACpC,IAAI,GAAG,UAClB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IACrC,UAAU,GAAG,UACxB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,UAAU,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IAC3C,IAAI,GAAG,UAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IACrC,GAAG,GAAG,UACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IAEpC,GAAG,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,GAAG,MAAM,GAAA,GAAA,EAAE;IACpD,IAAI,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,IAAI,MAAM,GAAA,GAAA,EAAE;IACtD,GAAG,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,GAAG,MAAM,GAAA,GAAA,EAAE;IACpD,IAAI,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,IAAI,MAAM,GAAA,GAAA,EAAE;IACtD,IAAI,GAAG,UAClB,EAA4B,EAC5B,WAAuB,EACvB,OAAgB;QAFf,WAAG,EA
AE,mBAAW;IAIjB,OAAA,IAAI,eAAe,CACjB,UAAA,CAAC,IAAI,OAAA,UAAU,CAAC,CAAC,CAAC,GAAG,GAAG,KAAK,WAAW,GAAA,EACxC,WAAW,EACX,OAAO,CACR;AAJD,EAIE;IACS,OAAO,GAAG,UACrB,MAAe,EACf,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,OAAO,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IACxC,MAAM,GAAG,UACpB,OAAe,EACf,WAAuB,EACvB,OAAgB;IAEhB,OAAA,IAAI,eAAe,CACjB,IAAI,MAAM,CAAC,OAAO,EAAE,WAAW,CAAC,QAAQ,CAAC,EACzC,WAAW,EACX,OAAO,CACR;AAJD,EAIE;IACS,IAAI,GAAG,UAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;AAElD,IAAM,WAAW,GAAG;IAClB,MAAM,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,QAAQ,GAAA;IAClC,MAAM,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,QAAQ,GAAA;IAClC,IAAI,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,SAAS,GAAA;IACjC,KAAK,EAAE,UAAA,CAAC,IAAI,OAAA,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,GAAA;IAC5B,IAAI,EAAE,UAAA,CAAC,IAAI,OAAA,CAAC,KAAK,IAAI,GAAA;IACrB,SAAS,EAAE,UAAA,CAAC,IAAI,OAAA,CAAC,YAAY,IAAI,GAAA;CAClC,CAAC;IAEW,KAAK,GAAG,UACnB,KAAwB,EACxB,WAAuB,EACvB,OAAgB;IAEhB,OAAA,IAAI,eAAe,CACjB,UAAA,CAAC;QACC,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;YAC7B,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,EAAE;gBACvB,MAAM,IAAI,KAAK,CAAC,2BAA2B,CAAC,CAAC;aAC9C;YAED,OAAO,WAAW,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;SAC9B;QAED,OAAO,CAAC,IAAI,IAAI,GAAG,CAAC,YAAY,KAAK,IAAI,CAAC,CAAC,WAAW,KAAK,KAAK,GAAG,KAAK,CAAC;KAC1E,EACD,WAAW,EACX,OAAO,CACR;AAdD,EAcE;IACS,IAAI,GAAG,UAClB,MAAoB,EACpB,UAAsB,EACtB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IACpC,IAAI,GAAG,KAAK;IACZ,KAAK,GAAG,UACnB,MAAc,EACd,UAAsB,EACtB,OAAgB,IACb,OAAA,IAAI,KAAK,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,OAAO,CAAC,IAAC;IACxC,QAAQ,GAAG,cAAM,OAAA,IAAI,IAAC;IACtB,MAAM,GAAG,UACpB,MAAyB,EACzB,UAAsB,EACtB,OAAgB;IAEhB,IAAI,IAAI,CAAC;IAET,IAAI,UAAU,CAAC,MAAM,CAAC,EAAE;QACtB,IAAI,GAAG,MAAM,CAAC;KACf;SAAM,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,WAAW,EAAE;QACnC,IAAI,GAAG,IAAI,QAAQ,CAAC,KAAK,EAAE,SAAS,GAAG,MAAM,CAAC,CAAC;KAChD;SAAM;QACL,MAAM,IAAI,KAAK,CACb,oEAAkE,CACnE,CAAC;KACH;IAED,OAAO,IAAI,eAAe,CAAC,UAAA,CAAC,IAAI,OAAA,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAA,EAAE,UAAU,EAAE,OAAO,CAAC,CAAC;AACxE;;;;;;;;;;;;;;;;;;;;;;;;;;;;ICxUM,2BAA2B,GAAG,UAClC,KAAqB,EACrB,UAAe,EACf,EAA8C;QAA9C,4BAA8C,EAA5C,oBAAO,EAAE,0BAAU;IAErB,OAAO,oBAAoB,CAAC,KAAK,EAAE,UAAU,EAAE;QAC7C,OAAO,EAAE,OAAO;QAChB,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,iBAAiB,EAAE,UAAU,IAAI,EAAE,CAAC;KACnE,CAAC,CAAC;AACL,EAAE;AAEF,IAAM,wBAAwB,GAAG,UAC/B,KAAqB,EACrB,OAA8B;IAA9B,wBAAA,EAAA,YAA8B;IAE9B,IAAM,EAAE,GAAG,2BAA2B,CAAC,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;IAC7D,OAAO,qBAAqB,CAAC,EAAE,CAAC,CAAC;AACnC,CAAC;;;;;"} \ No newline at end of file diff --git a/node_modules/sift/index.d.ts b/node_modules/sift/index.d.ts new file mode 100644 index 000000000..2b8257109 --- /dev/null +++ b/node_modules/sift/index.d.ts @@ -0,0 +1,4 @@ +import sift from "./lib"; + +export default sift; +export * from "./lib"; diff --git a/node_modules/sift/index.js b/node_modules/sift/index.js new file mode 100644 index 000000000..449294a4c --- /dev/null +++ b/node_modules/sift/index.js @@ -0,0 +1,4 @@ +const lib = require("./lib"); + +module.exports = lib.default; +Object.assign(module.exports, lib); diff --git a/node_modules/sift/lib/core.d.ts b/node_modules/sift/lib/core.d.ts new file mode 100644 index 000000000..35f1a39e3 --- /dev/null +++ b/node_modules/sift/lib/core.d.ts @@ -0,0 +1,115 @@ +import { Key, Comparator } from "./utils"; +export interface Operation { + readonly keep: boolean; + readonly done: boolean; + reset(): any; + next(item: TItem, key?: Key, owner?: any): any; +} +export 
declare type Tester = (item: any, key?: Key, owner?: any) => boolean; +export interface NamedOperation { + name: string; +} +export declare type OperationCreator<TItem> = (params: any, parentQuery: any, options: Options, name: string) => Operation<TItem>; +declare type BasicValueQuery<TValue> = { + $eq?: TValue; + $ne?: TValue; + $lt?: TValue; + $gt?: TValue; + $lte?: TValue; + $gte?: TValue; + $in?: TValue[]; + $nin?: TValue[]; + $all?: TValue[]; + $mod?: [number, number]; + $exists?: boolean; + $regex?: string | RegExp; + $size?: number; + $where?: ((this: TValue) => boolean) | string; + $options?: "i" | "g" | "m" | "u"; + $type?: Function; + $not?: NestedQuery<TValue>; + $or?: NestedQuery<TValue>[]; + $nor?: NestedQuery<TValue>[]; + $and?: NestedQuery<TValue>[]; +}; +declare type ArrayValueQuery<TValue> = { + $elemMatch?: Query<TValue>; +} & BasicValueQuery<TValue[]>; +declare type Unpacked<T> = T extends (infer U)[] ? U : T; +declare type ValueQuery<TValue> = TValue extends Array<any> ? ArrayValueQuery<Unpacked<TValue>> : BasicValueQuery<TValue>; +declare type NotObject = string | number | Date | boolean | Array<any>; +declare type ShapeQuery<TItemSchema> = TItemSchema extends NotObject ? {} : { + [k in keyof TItemSchema]?: TItemSchema[k] | ValueQuery<TItemSchema[k]>; +}; +declare type NestedQuery<TItemSchema> = ValueQuery<TItemSchema> & ShapeQuery<TItemSchema>; +export declare type Query<TItemSchema> = TItemSchema | RegExp | NestedQuery<TItemSchema>; +declare abstract class BaseOperation<TParams> implements Operation<TParams> { + readonly params: TParams; + readonly owneryQuery: any; + readonly options: Options; + keep: boolean; + done: boolean; + constructor(params: TParams, owneryQuery: any, options: Options); + protected init(): void; + reset(): void; + abstract next(item: any, key: Key, parent: any): any; +} +export declare abstract class NamedBaseOperation<TParams> extends BaseOperation<TParams> implements NamedOperation { + readonly name: string; + constructor(params: TParams, owneryQuery: any, options: Options, name: string); +} +declare abstract class GroupOperation extends BaseOperation<any> { + readonly children: Operation<any>[]; + keep: boolean; + done: boolean; + constructor(params: any, owneryQuery: any, options: Options, children: Operation<any>[]); + /** + */ + reset(): void; + abstract next(item: any, key: Key, owner: any): any; + /** + */ + protected childrenNext(item: any, key: Key, owner: any): void; +} +export declare abstract class NamedGroupOperation extends GroupOperation implements NamedOperation { + readonly name: string; + constructor(params: any, owneryQuery: any, options: Options, children: Operation<any>[], name: string); +} +export declare class QueryOperation<TItem> extends GroupOperation { + /** + */ + next(item: TItem, key: Key, parent: any): void; +} +export declare class NestedOperation extends GroupOperation { + readonly keyPath: Key[]; + constructor(keyPath: Key[], params: any, owneryQuery: any, options: Options, children: Operation<any>[]); + /** + */ + next(item: any, key: Key, parent: any): void; + /** + */ + private _nextNestedValue; +} +export declare const createTester: (a: any, compare: Comparator) => any; +export declare class EqualsOperation<TParam> extends BaseOperation<TParam> { + private _test; + init(): void; + next(item: any, key: Key, parent: any): void; +} +export declare const createEqualsOperation: (params: any, owneryQuery: any, options: Options) => EqualsOperation<any>; +export declare class NopeOperation<TParam> extends BaseOperation<TParam> { + next(): void; +} +export declare const numericalOperationCreator: (createNumericalOperation: OperationCreator<any>) => (params: any, owneryQuery: any, options: Options, name: string) => Operation<any> | NopeOperation<any>; +export declare const numericalOperation: (createTester: (any: any) => Tester) => (params: any, 
owneryQuery: any, options: Options, name: string) => Operation<any> | NopeOperation<any>; +export declare type Options = { + operations: { + [identifier: string]: OperationCreator<any>; + }; + compare: (a: any, b: any) => boolean; +}; +export declare const containsOperation: (query: any) => boolean; +export declare const createQueryOperation: <TItem, TSchema = TItem>(query: Query<TSchema>, owneryQuery?: any, { compare, operations }?: Partial<Options>) => QueryOperation<TItem>; +export declare const createOperationTester: <TItem>(operation: Operation<TItem>) => (item: TItem, key?: string | number, owner?: any) => boolean; +export declare const createQueryTester: <TItem, TSchema = TItem>(query: Query<TSchema>, options?: Partial<Options>) => (item: TItem, key?: string | number, owner?: any) => boolean; +export {}; diff --git a/node_modules/sift/lib/index.d.ts new file mode 100644 index 000000000..632fa0aae --- /dev/null +++ b/node_modules/sift/lib/index.d.ts @@ -0,0 +1,6 @@ +import { Query, Options, createQueryTester, EqualsOperation, createQueryOperation, createEqualsOperation, createOperationTester } from "./core"; +declare const createDefaultQueryOperation: <TItem, TSchema = TItem>(query: Query<TSchema>, ownerQuery: any, { compare, operations }?: Partial<Options>) => import("./core").QueryOperation<TItem>; +declare const createDefaultQueryTester: <TItem, TSchema = TItem>(query: Query<TSchema>, options?: Partial<Options>) => (item: unknown, key?: string | number, owner?: any) => boolean; +export { Query, EqualsOperation, createQueryTester, createOperationTester, createDefaultQueryOperation, createEqualsOperation, createQueryOperation }; +export * from "./operations"; +export default createDefaultQueryTester; diff --git a/node_modules/sift/lib/index.js new file mode 100644 index 000000000..9f8a650e4 --- /dev/null +++ b/node_modules/sift/lib/index.js @@ -0,0 +1,689 @@ +(function (global, factory) { + typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports) : + typeof define === 'function' && define.amd ? define(['exports'], factory) : + (global = global || self, factory(global.sift = {})); +}(this, (function (exports) { 'use strict'; + + /*! ***************************************************************************** + Copyright (c) Microsoft Corporation. + + Permission to use, copy, modify, and/or distribute this software for any + purpose with or without fee is hereby granted. + + THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH + REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY + AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, + INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM + LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR + OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR + PERFORMANCE OF THIS SOFTWARE. + ***************************************************************************** */ + /* global Reflect, Promise */ + + var extendStatics = function(d, b) { + extendStatics = Object.setPrototypeOf || + ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) || + function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; }; + return extendStatics(d, b); + }; + + function __extends(d, b) { + extendStatics(d, b); + function __() { this.constructor = d; } + d.prototype = b === null ? 
Object.create(b) : (__.prototype = b.prototype, new __()); + } + + var typeChecker = function (type) { + var typeString = "[object " + type + "]"; + return function (value) { + return getClassName(value) === typeString; + }; + }; + var getClassName = function (value) { return Object.prototype.toString.call(value); }; + var comparable = function (value) { + if (value instanceof Date) { + return value.getTime(); + } + else if (isArray(value)) { + return value.map(comparable); + } + else if (value && typeof value.toJSON === "function") { + return value.toJSON(); + } + return value; + }; + var isArray = typeChecker("Array"); + var isObject = typeChecker("Object"); + var isFunction = typeChecker("Function"); + var isVanillaObject = function (value) { + return (value && + (value.constructor === Object || + value.constructor === Array || + value.constructor.toString() === "function Object() { [native code] }" || + value.constructor.toString() === "function Array() { [native code] }") && + !value.toJSON); + }; + var equals = function (a, b) { + if (a == null && a == b) { + return true; + } + if (a === b) { + return true; + } + if (Object.prototype.toString.call(a) !== Object.prototype.toString.call(b)) { + return false; + } + if (isArray(a)) { + if (a.length !== b.length) { + return false; + } + for (var i = 0, length_1 = a.length; i < length_1; i++) { + if (!equals(a[i], b[i])) + return false; + } + return true; + } + else if (isObject(a)) { + if (Object.keys(a).length !== Object.keys(b).length) { + return false; + } + for (var key in a) { + if (!equals(a[key], b[key])) + return false; + } + return true; + } + return false; + }; + + /** + * Walks through each value given the context - used for nested operations. E.g: + * { "person.address": { $eq: "blarg" }} + */ + var walkKeyPathValues = function (item, keyPath, next, depth, key, owner) { + var currentKey = keyPath[depth]; + // if array, then try matching. Might fall through for cases like: + // { $eq: [1, 2, 3] }, [ 1, 2, 3 ]. + if (isArray(item) && isNaN(Number(currentKey))) { + for (var i = 0, length_1 = item.length; i < length_1; i++) { + // if FALSE is returned, then terminate walker. For operations, this simply + // means that the search critera was met. 
+ if (!walkKeyPathValues(item[i], keyPath, next, depth, i, item)) { + return false; + } + } + } + if (depth === keyPath.length || item == null) { + return next(item, key, owner); + } + return walkKeyPathValues(item[currentKey], keyPath, next, depth + 1, currentKey, item); + }; + var BaseOperation = /** @class */ (function () { + function BaseOperation(params, owneryQuery, options) { + this.params = params; + this.owneryQuery = owneryQuery; + this.options = options; + this.init(); + } + BaseOperation.prototype.init = function () { }; + BaseOperation.prototype.reset = function () { + this.done = false; + this.keep = false; + }; + return BaseOperation; + }()); + var NamedBaseOperation = /** @class */ (function (_super) { + __extends(NamedBaseOperation, _super); + function NamedBaseOperation(params, owneryQuery, options, name) { + var _this = _super.call(this, params, owneryQuery, options) || this; + _this.name = name; + return _this; + } + return NamedBaseOperation; + }(BaseOperation)); + var GroupOperation = /** @class */ (function (_super) { + __extends(GroupOperation, _super); + function GroupOperation(params, owneryQuery, options, children) { + var _this = _super.call(this, params, owneryQuery, options) || this; + _this.children = children; + return _this; + } + /** + */ + GroupOperation.prototype.reset = function () { + this.keep = false; + this.done = false; + for (var i = 0, length_2 = this.children.length; i < length_2; i++) { + this.children[i].reset(); + } + }; + /** + */ + GroupOperation.prototype.childrenNext = function (item, key, owner) { + var done = true; + var keep = true; + for (var i = 0, length_3 = this.children.length; i < length_3; i++) { + var childOperation = this.children[i]; + childOperation.next(item, key, owner); + if (!childOperation.keep) { + keep = false; + } + if (childOperation.done) { + if (!childOperation.keep) { + break; + } + } + else { + done = false; + } + } + this.done = done; + this.keep = keep; + }; + return GroupOperation; + }(BaseOperation)); + var NamedGroupOperation = /** @class */ (function (_super) { + __extends(NamedGroupOperation, _super); + function NamedGroupOperation(params, owneryQuery, options, children, name) { + var _this = _super.call(this, params, owneryQuery, options, children) || this; + _this.name = name; + return _this; + } + return NamedGroupOperation; + }(GroupOperation)); + var QueryOperation = /** @class */ (function (_super) { + __extends(QueryOperation, _super); + function QueryOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + /** + */ + QueryOperation.prototype.next = function (item, key, parent) { + this.childrenNext(item, key, parent); + }; + return QueryOperation; + }(GroupOperation)); + var NestedOperation = /** @class */ (function (_super) { + __extends(NestedOperation, _super); + function NestedOperation(keyPath, params, owneryQuery, options, children) { + var _this = _super.call(this, params, owneryQuery, options, children) || this; + _this.keyPath = keyPath; + /** + */ + _this._nextNestedValue = function (value, key, owner) { + _this.childrenNext(value, key, owner); + return !_this.done; + }; + return _this; + } + /** + */ + NestedOperation.prototype.next = function (item, key, parent) { + walkKeyPathValues(item, this.keyPath, this._nextNestedValue, 0, key, parent); + }; + return NestedOperation; + }(GroupOperation)); + var createTester = function (a, compare) { + if (a instanceof Function) { + return a; + } + if (a instanceof RegExp) { + return function (b) { + var result = 
typeof b === "string" && a.test(b); + a.lastIndex = 0; + return result; + }; + } + var comparableA = comparable(a); + return function (b) { return compare(comparableA, comparable(b)); }; + }; + var EqualsOperation = /** @class */ (function (_super) { + __extends(EqualsOperation, _super); + function EqualsOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + EqualsOperation.prototype.init = function () { + this._test = createTester(this.params, this.options.compare); + }; + EqualsOperation.prototype.next = function (item, key, parent) { + if (!Array.isArray(parent) || parent.hasOwnProperty(key)) { + if (this._test(item, key, parent)) { + this.done = true; + this.keep = true; + } + } + }; + return EqualsOperation; + }(BaseOperation)); + var createEqualsOperation = function (params, owneryQuery, options) { return new EqualsOperation(params, owneryQuery, options); }; + var NopeOperation = /** @class */ (function (_super) { + __extends(NopeOperation, _super); + function NopeOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + NopeOperation.prototype.next = function () { + this.done = true; + this.keep = false; + }; + return NopeOperation; + }(BaseOperation)); + var numericalOperationCreator = function (createNumericalOperation) { return function (params, owneryQuery, options, name) { + if (params == null) { + return new NopeOperation(params, owneryQuery, options); + } + return createNumericalOperation(params, owneryQuery, options, name); + }; }; + var numericalOperation = function (createTester) { + return numericalOperationCreator(function (params, owneryQuery, options) { + var typeofParams = typeof comparable(params); + var test = createTester(params); + return new EqualsOperation(function (b) { + return typeof comparable(b) === typeofParams && test(b); + }, owneryQuery, options); + }); + }; + var createNamedOperation = function (name, params, parentQuery, options) { + var operationCreator = options.operations[name]; + if (!operationCreator) { + throw new Error("Unsupported operation: " + name); + } + return operationCreator(params, parentQuery, options, name); + }; + var containsOperation = function (query) { + for (var key in query) { + if (key.charAt(0) === "$") + return true; + } + return false; + }; + var createNestedOperation = function (keyPath, nestedQuery, owneryQuery, options) { + if (containsOperation(nestedQuery)) { + var _a = createQueryOperations(nestedQuery, options), selfOperations = _a[0], nestedOperations = _a[1]; + if (nestedOperations.length) { + throw new Error("Property queries must contain only operations, or exact objects."); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, selfOperations); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, [ + new EqualsOperation(nestedQuery, owneryQuery, options) + ]); + }; + var createQueryOperation = function (query, owneryQuery, _a) { + if (owneryQuery === void 0) { owneryQuery = null; } + var _b = _a === void 0 ? 
{} : _a, compare = _b.compare, operations = _b.operations; + var options = { + compare: compare || equals, + operations: Object.assign({}, operations || {}) + }; + var _c = createQueryOperations(query, options), selfOperations = _c[0], nestedOperations = _c[1]; + var ops = []; + if (selfOperations.length) { + ops.push(new NestedOperation([], query, owneryQuery, options, selfOperations)); + } + ops.push.apply(ops, nestedOperations); + if (ops.length === 1) { + return ops[0]; + } + return new QueryOperation(query, owneryQuery, options, ops); + }; + var createQueryOperations = function (query, options) { + var selfOperations = []; + var nestedOperations = []; + if (!isVanillaObject(query)) { + selfOperations.push(new EqualsOperation(query, query, options)); + return [selfOperations, nestedOperations]; + } + for (var key in query) { + if (key.charAt(0) === "$") { + var op = createNamedOperation(key, query[key], query, options); + // probably just a flag for another operation (like $options) + if (op != null) { + selfOperations.push(op); + } + } + else { + nestedOperations.push(createNestedOperation(key.split("."), query[key], query, options)); + } + } + return [selfOperations, nestedOperations]; + }; + var createOperationTester = function (operation) { return function (item, key, owner) { + operation.reset(); + operation.next(item, key, owner); + return operation.keep; + }; }; + var createQueryTester = function (query, options) { + if (options === void 0) { options = {}; } + return createOperationTester(createQueryOperation(query, null, options)); + }; + + var $Ne = /** @class */ (function (_super) { + __extends($Ne, _super); + function $Ne() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Ne.prototype.init = function () { + this._test = createTester(this.params, this.options.compare); + }; + $Ne.prototype.reset = function () { + _super.prototype.reset.call(this); + this.keep = true; + }; + $Ne.prototype.next = function (item) { + if (this._test(item)) { + this.done = true; + this.keep = false; + } + }; + return $Ne; + }(NamedBaseOperation)); + // https://docs.mongodb.com/manual/reference/operator/query/elemMatch/ + var $ElemMatch = /** @class */ (function (_super) { + __extends($ElemMatch, _super); + function $ElemMatch() { + return _super !== null && _super.apply(this, arguments) || this; + } + $ElemMatch.prototype.init = function () { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + }; + $ElemMatch.prototype.reset = function () { + _super.prototype.reset.call(this); + this._queryOperation.reset(); + }; + $ElemMatch.prototype.next = function (item) { + if (isArray(item)) { + for (var i = 0, length_1 = item.length; i < length_1; i++) { + // reset query operation since item being tested needs to pass _all_ query + // operations for it to be a success + this._queryOperation.reset(); + // check item + this._queryOperation.next(item[i], i, item); + this.keep = this.keep || this._queryOperation.keep; + } + this.done = true; + } + else { + this.done = false; + this.keep = false; + } + }; + return $ElemMatch; + }(NamedBaseOperation)); + var $Not = /** @class */ (function (_super) { + __extends($Not, _super); + function $Not() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Not.prototype.init = function () { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + }; + $Not.prototype.reset = function () { + this._queryOperation.reset(); + }; + $Not.prototype.next = 
function (item, key, owner) { + this._queryOperation.next(item, key, owner); + this.done = this._queryOperation.done; + this.keep = !this._queryOperation.keep; + }; + return $Not; + }(NamedBaseOperation)); + var $Size = /** @class */ (function (_super) { + __extends($Size, _super); + function $Size() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Size.prototype.init = function () { }; + $Size.prototype.next = function (item) { + if (isArray(item) && item.length === this.params) { + this.done = true; + this.keep = true; + } + // if (parent && parent.length === this.params) { + // this.done = true; + // this.keep = true; + // } + }; + return $Size; + }(NamedBaseOperation)); + var $Or = /** @class */ (function (_super) { + __extends($Or, _super); + function $Or() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Or.prototype.init = function () { + var _this = this; + this._ops = this.params.map(function (op) { + return createQueryOperation(op, null, _this.options); + }); + }; + $Or.prototype.reset = function () { + this.done = false; + this.keep = false; + for (var i = 0, length_2 = this._ops.length; i < length_2; i++) { + this._ops[i].reset(); + } + }; + $Or.prototype.next = function (item, key, owner) { + var done = false; + var success = false; + for (var i = 0, length_3 = this._ops.length; i < length_3; i++) { + var op = this._ops[i]; + op.next(item, key, owner); + if (op.keep) { + done = true; + success = op.keep; + break; + } + } + this.keep = success; + this.done = done; + }; + return $Or; + }(NamedBaseOperation)); + var $Nor = /** @class */ (function (_super) { + __extends($Nor, _super); + function $Nor() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Nor.prototype.next = function (item, key, owner) { + _super.prototype.next.call(this, item, key, owner); + this.keep = !this.keep; + }; + return $Nor; + }($Or)); + var $In = /** @class */ (function (_super) { + __extends($In, _super); + function $In() { + return _super !== null && _super.apply(this, arguments) || this; + } + $In.prototype.init = function () { + var _this = this; + this._testers = this.params.map(function (value) { + if (containsOperation(value)) { + throw new Error("cannot nest $ under " + _this.constructor.name.toLowerCase()); + } + return createTester(value, _this.options.compare); + }); + }; + $In.prototype.next = function (item, key, owner) { + var done = false; + var success = false; + for (var i = 0, length_4 = this._testers.length; i < length_4; i++) { + var test = this._testers[i]; + if (test(item)) { + done = true; + success = true; + break; + } + } + this.keep = success; + this.done = done; + }; + return $In; + }(NamedBaseOperation)); + var $Nin = /** @class */ (function (_super) { + __extends($Nin, _super); + function $Nin() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Nin.prototype.next = function (item, key, owner) { + _super.prototype.next.call(this, item, key, owner); + this.keep = !this.keep; + }; + return $Nin; + }($In)); + var $Exists = /** @class */ (function (_super) { + __extends($Exists, _super); + function $Exists() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Exists.prototype.next = function (item, key, owner) { + if (owner.hasOwnProperty(key) === this.params) { + this.done = true; + this.keep = true; + } + }; + return $Exists; + }(NamedBaseOperation)); + var $And = /** @class */ (function (_super) { + __extends($And, _super); + function $And(params, 
owneryQuery, options, name) { + return _super.call(this, params, owneryQuery, options, params.map(function (query) { return createQueryOperation(query, owneryQuery, options); }), name) || this; + } + $And.prototype.next = function (item, key, owner) { + this.childrenNext(item, key, owner); + }; + return $And; + }(NamedGroupOperation)); + var $eq = function (params, owneryQuery, options) { + return new EqualsOperation(params, owneryQuery, options); + }; + var $ne = function (params, owneryQuery, options, name) { return new $Ne(params, owneryQuery, options, name); }; + var $or = function (params, owneryQuery, options, name) { return new $Or(params, owneryQuery, options, name); }; + var $nor = function (params, owneryQuery, options, name) { return new $Nor(params, owneryQuery, options, name); }; + var $elemMatch = function (params, owneryQuery, options, name) { return new $ElemMatch(params, owneryQuery, options, name); }; + var $nin = function (params, owneryQuery, options, name) { return new $Nin(params, owneryQuery, options, name); }; + var $in = function (params, owneryQuery, options, name) { return new $In(params, owneryQuery, options, name); }; + var $lt = numericalOperation(function (params) { return function (b) { return b < params; }; }); + var $lte = numericalOperation(function (params) { return function (b) { return b <= params; }; }); + var $gt = numericalOperation(function (params) { return function (b) { return b > params; }; }); + var $gte = numericalOperation(function (params) { return function (b) { return b >= params; }; }); + var $mod = function (_a, owneryQuery, options) { + var mod = _a[0], equalsValue = _a[1]; + return new EqualsOperation(function (b) { return comparable(b) % mod === equalsValue; }, owneryQuery, options); + }; + var $exists = function (params, owneryQuery, options, name) { return new $Exists(params, owneryQuery, options, name); }; + var $regex = function (pattern, owneryQuery, options) { + return new EqualsOperation(new RegExp(pattern, owneryQuery.$options), owneryQuery, options); + }; + var $not = function (params, owneryQuery, options, name) { return new $Not(params, owneryQuery, options, name); }; + var typeAliases = { + number: function (v) { return typeof v === "number"; }, + string: function (v) { return typeof v === "string"; }, + bool: function (v) { return typeof v === "boolean"; }, + array: function (v) { return Array.isArray(v); }, + null: function (v) { return v === null; }, + timestamp: function (v) { return v instanceof Date; } + }; + var $type = function (clazz, owneryQuery, options) { + return new EqualsOperation(function (b) { + if (typeof clazz === "string") { + if (!typeAliases[clazz]) { + throw new Error("Type alias does not exist"); + } + return typeAliases[clazz](b); + } + return b != null ? 
b instanceof clazz || b.constructor === clazz : false; + }, owneryQuery, options); + }; + var $and = function (params, ownerQuery, options, name) { return new $And(params, ownerQuery, options, name); }; + var $all = $and; + var $size = function (params, ownerQuery, options) { return new $Size(params, ownerQuery, options, "$size"); }; + var $options = function () { return null; }; + var $where = function (params, ownerQuery, options) { + var test; + if (isFunction(params)) { + test = params; + } + else if (!process.env.CSP_ENABLED) { + test = new Function("obj", "return " + params); + } + else { + throw new Error("In CSP mode, sift does not support strings in \"$where\" condition"); + } + return new EqualsOperation(function (b) { return test.bind(b)(b); }, ownerQuery, options); + }; + + var defaultOperations = /*#__PURE__*/Object.freeze({ + __proto__: null, + $Size: $Size, + $eq: $eq, + $ne: $ne, + $or: $or, + $nor: $nor, + $elemMatch: $elemMatch, + $nin: $nin, + $in: $in, + $lt: $lt, + $lte: $lte, + $gt: $gt, + $gte: $gte, + $mod: $mod, + $exists: $exists, + $regex: $regex, + $not: $not, + $type: $type, + $and: $and, + $all: $all, + $size: $size, + $options: $options, + $where: $where + }); + + var createDefaultQueryOperation = function (query, ownerQuery, _a) { + var _b = _a === void 0 ? {} : _a, compare = _b.compare, operations = _b.operations; + return createQueryOperation(query, ownerQuery, { + compare: compare, + operations: Object.assign({}, defaultOperations, operations || {}) + }); + }; + var createDefaultQueryTester = function (query, options) { + if (options === void 0) { options = {}; } + var op = createDefaultQueryOperation(query, null, options); + return createOperationTester(op); + }; + + exports.$Size = $Size; + exports.$all = $all; + exports.$and = $and; + exports.$elemMatch = $elemMatch; + exports.$eq = $eq; + exports.$exists = $exists; + exports.$gt = $gt; + exports.$gte = $gte; + exports.$in = $in; + exports.$lt = $lt; + exports.$lte = $lte; + exports.$mod = $mod; + exports.$ne = $ne; + exports.$nin = $nin; + exports.$nor = $nor; + exports.$not = $not; + exports.$options = $options; + exports.$or = $or; + exports.$regex = $regex; + exports.$size = $size; + exports.$type = $type; + exports.$where = $where; + exports.EqualsOperation = EqualsOperation; + exports.createDefaultQueryOperation = createDefaultQueryOperation; + exports.createEqualsOperation = createEqualsOperation; + exports.createOperationTester = createOperationTester; + exports.createQueryOperation = createQueryOperation; + exports.createQueryTester = createQueryTester; + exports.default = createDefaultQueryTester; + + Object.defineProperty(exports, '__esModule', { value: true }); + +}))); +//# sourceMappingURL=index.js.map diff --git a/node_modules/sift/lib/index.js.map b/node_modules/sift/lib/index.js.map new file mode 100644 index 000000000..a9afdbbb5 --- /dev/null +++ b/node_modules/sift/lib/index.js.map @@ -0,0 +1 @@ +{"version":3,"file":"index.js","sources":["../node_modules/tslib/tslib.es6.js","../src/utils.ts","../src/core.ts","../src/operations.ts","../src/index.ts"],"sourcesContent":["/*! 
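// Usage sketch (not part of the vendored bundle above, which is sift 13.5.2, the
// in-memory MongoDB query matcher that mongoose depends on): the bundle's default
// export is createDefaultQueryTester, which compiles a Mongo-style query object
// into a plain predicate. The require path and sample data below are assumptions
// made for illustration only, not something the project itself does.
const sift = require("sift/lib/index.js"); // the CommonJS/UMD build shown above
const keep = sift.default({ age: { $gte: 18 }, name: { $regex: "^A" } });

const people = [
  { name: "Ann", age: 21 },
  { name: "Al", age: 15 },
  { name: "Bea", age: 30 }
];
people.filter(keep); // -> [{ name: "Ann", age: 21 }]  (every clause must match)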
*****************************************************************************\r\nCopyright (c) Microsoft Corporation.\r\n\r\nPermission to use, copy, modify, and/or distribute this software for any\r\npurpose with or without fee is hereby granted.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\r\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY\r\nAND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\r\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\r\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\r\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\r\nPERFORMANCE OF THIS SOFTWARE.\r\n***************************************************************************** */\r\n/* global Reflect, Promise */\r\n\r\nvar extendStatics = function(d, b) {\r\n extendStatics = Object.setPrototypeOf ||\r\n ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||\r\n function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };\r\n return extendStatics(d, b);\r\n};\r\n\r\nexport function __extends(d, b) {\r\n extendStatics(d, b);\r\n function __() { this.constructor = d; }\r\n d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());\r\n}\r\n\r\nexport var __assign = function() {\r\n __assign = Object.assign || function __assign(t) {\r\n for (var s, i = 1, n = arguments.length; i < n; i++) {\r\n s = arguments[i];\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p)) t[p] = s[p];\r\n }\r\n return t;\r\n }\r\n return __assign.apply(this, arguments);\r\n}\r\n\r\nexport function __rest(s, e) {\r\n var t = {};\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)\r\n t[p] = s[p];\r\n if (s != null && typeof Object.getOwnPropertySymbols === \"function\")\r\n for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {\r\n if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))\r\n t[p[i]] = s[p[i]];\r\n }\r\n return t;\r\n}\r\n\r\nexport function __decorate(decorators, target, key, desc) {\r\n var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;\r\n if (typeof Reflect === \"object\" && typeof Reflect.decorate === \"function\") r = Reflect.decorate(decorators, target, key, desc);\r\n else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? d(target, key, r) : d(target, key)) || r;\r\n return c > 3 && r && Object.defineProperty(target, key, r), r;\r\n}\r\n\r\nexport function __param(paramIndex, decorator) {\r\n return function (target, key) { decorator(target, key, paramIndex); }\r\n}\r\n\r\nexport function __metadata(metadataKey, metadataValue) {\r\n if (typeof Reflect === \"object\" && typeof Reflect.metadata === \"function\") return Reflect.metadata(metadataKey, metadataValue);\r\n}\r\n\r\nexport function __awaiter(thisArg, _arguments, P, generator) {\r\n function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }\r\n return new (P || (P = Promise))(function (resolve, reject) {\r\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\r\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\r\n function step(result) { result.done ? 
resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\r\n step((generator = generator.apply(thisArg, _arguments || [])).next());\r\n });\r\n}\r\n\r\nexport function __generator(thisArg, body) {\r\n var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;\r\n return g = { next: verb(0), \"throw\": verb(1), \"return\": verb(2) }, typeof Symbol === \"function\" && (g[Symbol.iterator] = function() { return this; }), g;\r\n function verb(n) { return function (v) { return step([n, v]); }; }\r\n function step(op) {\r\n if (f) throw new TypeError(\"Generator is already executing.\");\r\n while (_) try {\r\n if (f = 1, y && (t = op[0] & 2 ? y[\"return\"] : op[0] ? y[\"throw\"] || ((t = y[\"return\"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;\r\n if (y = 0, t) op = [op[0] & 2, t.value];\r\n switch (op[0]) {\r\n case 0: case 1: t = op; break;\r\n case 4: _.label++; return { value: op[1], done: false };\r\n case 5: _.label++; y = op[1]; op = [0]; continue;\r\n case 7: op = _.ops.pop(); _.trys.pop(); continue;\r\n default:\r\n if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }\r\n if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }\r\n if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }\r\n if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }\r\n if (t[2]) _.ops.pop();\r\n _.trys.pop(); continue;\r\n }\r\n op = body.call(thisArg, _);\r\n } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }\r\n if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };\r\n }\r\n}\r\n\r\nexport var __createBinding = Object.create ? (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\r\n}) : (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n o[k2] = m[k];\r\n});\r\n\r\nexport function __exportStar(m, exports) {\r\n for (var p in m) if (p !== \"default\" && !exports.hasOwnProperty(p)) __createBinding(exports, m, p);\r\n}\r\n\r\nexport function __values(o) {\r\n var s = typeof Symbol === \"function\" && Symbol.iterator, m = s && o[s], i = 0;\r\n if (m) return m.call(o);\r\n if (o && typeof o.length === \"number\") return {\r\n next: function () {\r\n if (o && i >= o.length) o = void 0;\r\n return { value: o && o[i++], done: !o };\r\n }\r\n };\r\n throw new TypeError(s ? 
\"Object is not iterable.\" : \"Symbol.iterator is not defined.\");\r\n}\r\n\r\nexport function __read(o, n) {\r\n var m = typeof Symbol === \"function\" && o[Symbol.iterator];\r\n if (!m) return o;\r\n var i = m.call(o), r, ar = [], e;\r\n try {\r\n while ((n === void 0 || n-- > 0) && !(r = i.next()).done) ar.push(r.value);\r\n }\r\n catch (error) { e = { error: error }; }\r\n finally {\r\n try {\r\n if (r && !r.done && (m = i[\"return\"])) m.call(i);\r\n }\r\n finally { if (e) throw e.error; }\r\n }\r\n return ar;\r\n}\r\n\r\nexport function __spread() {\r\n for (var ar = [], i = 0; i < arguments.length; i++)\r\n ar = ar.concat(__read(arguments[i]));\r\n return ar;\r\n}\r\n\r\nexport function __spreadArrays() {\r\n for (var s = 0, i = 0, il = arguments.length; i < il; i++) s += arguments[i].length;\r\n for (var r = Array(s), k = 0, i = 0; i < il; i++)\r\n for (var a = arguments[i], j = 0, jl = a.length; j < jl; j++, k++)\r\n r[k] = a[j];\r\n return r;\r\n};\r\n\r\nexport function __await(v) {\r\n return this instanceof __await ? (this.v = v, this) : new __await(v);\r\n}\r\n\r\nexport function __asyncGenerator(thisArg, _arguments, generator) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var g = generator.apply(thisArg, _arguments || []), i, q = [];\r\n return i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i;\r\n function verb(n) { if (g[n]) i[n] = function (v) { return new Promise(function (a, b) { q.push([n, v, a, b]) > 1 || resume(n, v); }); }; }\r\n function resume(n, v) { try { step(g[n](v)); } catch (e) { settle(q[0][3], e); } }\r\n function step(r) { r.value instanceof __await ? Promise.resolve(r.value.v).then(fulfill, reject) : settle(q[0][2], r); }\r\n function fulfill(value) { resume(\"next\", value); }\r\n function reject(value) { resume(\"throw\", value); }\r\n function settle(f, v) { if (f(v), q.shift(), q.length) resume(q[0][0], q[0][1]); }\r\n}\r\n\r\nexport function __asyncDelegator(o) {\r\n var i, p;\r\n return i = {}, verb(\"next\"), verb(\"throw\", function (e) { throw e; }), verb(\"return\"), i[Symbol.iterator] = function () { return this; }, i;\r\n function verb(n, f) { i[n] = o[n] ? function (v) { return (p = !p) ? { value: __await(o[n](v)), done: n === \"return\" } : f ? f(v) : v; } : f; }\r\n}\r\n\r\nexport function __asyncValues(o) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var m = o[Symbol.asyncIterator], i;\r\n return m ? m.call(o) : (o = typeof __values === \"function\" ? __values(o) : o[Symbol.iterator](), i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i);\r\n function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }\r\n function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }\r\n}\r\n\r\nexport function __makeTemplateObject(cooked, raw) {\r\n if (Object.defineProperty) { Object.defineProperty(cooked, \"raw\", { value: raw }); } else { cooked.raw = raw; }\r\n return cooked;\r\n};\r\n\r\nvar __setModuleDefault = Object.create ? 
(function(o, v) {\r\n Object.defineProperty(o, \"default\", { enumerable: true, value: v });\r\n}) : function(o, v) {\r\n o[\"default\"] = v;\r\n};\r\n\r\nexport function __importStar(mod) {\r\n if (mod && mod.__esModule) return mod;\r\n var result = {};\r\n if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\r\n __setModuleDefault(result, mod);\r\n return result;\r\n}\r\n\r\nexport function __importDefault(mod) {\r\n return (mod && mod.__esModule) ? mod : { default: mod };\r\n}\r\n\r\nexport function __classPrivateFieldGet(receiver, privateMap) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to get private field on non-instance\");\r\n }\r\n return privateMap.get(receiver);\r\n}\r\n\r\nexport function __classPrivateFieldSet(receiver, privateMap, value) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to set private field on non-instance\");\r\n }\r\n privateMap.set(receiver, value);\r\n return value;\r\n}\r\n",null,null,null,null],"names":[],"mappings":";;;;;;IAAA;IACA;AACA;IACA;IACA;AACA;IACA;IACA;IACA;IACA;IACA;IACA;IACA;IACA;IACA;AACA;IACA,IAAI,aAAa,GAAG,SAAS,CAAC,EAAE,CAAC,EAAE;IACnC,IAAI,aAAa,GAAG,MAAM,CAAC,cAAc;IACzC,SAAS,EAAE,SAAS,EAAE,EAAE,EAAE,YAAY,KAAK,IAAI,UAAU,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC,EAAE,CAAC;IACpF,QAAQ,UAAU,CAAC,EAAE,CAAC,EAAE,EAAE,KAAK,IAAI,CAAC,IAAI,CAAC,EAAE,IAAI,CAAC,CAAC,cAAc,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;IACnF,IAAI,OAAO,aAAa,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;IAC/B,CAAC,CAAC;AACF;IACO,SAAS,SAAS,CAAC,CAAC,EAAE,CAAC,EAAE;IAChC,IAAI,aAAa,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;IACxB,IAAI,SAAS,EAAE,GAAG,EAAE,IAAI,CAAC,WAAW,GAAG,CAAC,CAAC,EAAE;IAC3C,IAAI,CAAC,CAAC,SAAS,GAAG,CAAC,KAAK,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC,SAAS,GAAG,CAAC,CAAC,SAAS,EAAE,IAAI,EAAE,EAAE,CAAC,CAAC;IACzF;;ICzBO,IAAM,WAAW,GAAG,UAAQ,IAAI;QACrC,IAAM,UAAU,GAAG,UAAU,GAAG,IAAI,GAAG,GAAG,CAAC;QAC3C,OAAO,UAAS,KAAK;YACnB,OAAO,YAAY,CAAC,KAAK,CAAC,KAAK,UAAU,CAAC;SAC3C,CAAC;IACJ,CAAC,CAAC;IAEF,IAAM,YAAY,GAAG,UAAA,KAAK,IAAI,OAAA,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,GAAA,CAAC;IAE7D,IAAM,UAAU,GAAG,UAAC,KAAU;QACnC,IAAI,KAAK,YAAY,IAAI,EAAE;YACzB,OAAO,KAAK,CAAC,OAAO,EAAE,CAAC;SACxB;aAAM,IAAI,OAAO,CAAC,KAAK,CAAC,EAAE;YACzB,OAAO,KAAK,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;SAC9B;aAAM,IAAI,KAAK,IAAI,OAAO,KAAK,CAAC,MAAM,KAAK,UAAU,EAAE;YACtD,OAAO,KAAK,CAAC,MAAM,EAAE,CAAC;SACvB;QAED,OAAO,KAAK,CAAC;IACf,CAAC,CAAC;IAEK,IAAM,OAAO,GAAG,WAAW,CAAa,OAAO,CAAC,CAAC;IACjD,IAAM,QAAQ,GAAG,WAAW,CAAS,QAAQ,CAAC,CAAC;IAC/C,IAAM,UAAU,GAAG,WAAW,CAAW,UAAU,CAAC,CAAC;IACrD,IAAM,eAAe,GAAG,UAAA,KAAK;QAClC,QACE,KAAK;aACJ,KAAK,CAAC,WAAW,KAAK,MAAM;gBAC3B,KAAK,CAAC,WAAW,KAAK,KAAK;gBAC3B,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,qCAAqC;gBACtE,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,oCAAoC,CAAC;YACxE,CAAC,KAAK,CAAC,MAAM,EACb;IACJ,CAAC,CAAC;IAEK,IAAM,MAAM,GAAG,UAAC,CAAC,EAAE,CAAC;QACzB,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;YACvB,OAAO,IAAI,CAAC;SACb;QACD,IAAI,CAAC,KAAK,CAAC,EAAE;YACX,OAAO,IAAI,CAAC;SACb;QAED,IAAI,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE;YAC3E,OAAO,KAAK,CAAC;SACd;QAED,IAAI,OAAO,CAAC,CAAC,CAAC,EAAE;YACd,IAAI,CAAC,CAAC,MAAM,KAAK,CAAC,CAAC,MAAM,EAAE;gBACzB,OAAO,KAAK,CAAC;aACd;YACD,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,mBAAM,EAAQ,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC/C,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;oBAAE,OAAO,KAAK,CAAC;aACvC;YACD,OAAO,IAAI,CAAC;SACb;aA
AM,IAAI,QAAQ,CAAC,CAAC,CAAC,EAAE;YACtB,IAAI,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,KAAK,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE;gBACnD,OAAO,KAAK,CAAC;aACd;YACD,KAAK,IAAM,GAAG,IAAI,CAAC,EAAE;gBACnB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC;oBAAE,OAAO,KAAK,CAAC;aAC3C;YACD,OAAO,IAAI,CAAC;SACb;QACD,OAAO,KAAK,CAAC;IACf,CAAC;;ICMD;;;;IAKA,IAAM,iBAAiB,GAAG,UACxB,IAAS,EACT,OAAc,EACd,IAAY,EACZ,KAAa,EACb,GAAQ,EACR,KAAU;QAEV,IAAM,UAAU,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC;;;QAIlC,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,KAAK,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,EAAE;YAC9C,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,sBAAM,EAAW,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;;;gBAGlD,IAAI,CAAC,iBAAiB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,KAAK,EAAE,CAAC,EAAE,IAAI,CAAC,EAAE;oBAC9D,OAAO,KAAK,CAAC;iBACd;aACF;SACF;QAED,IAAI,KAAK,KAAK,OAAO,CAAC,MAAM,IAAI,IAAI,IAAI,IAAI,EAAE;YAC5C,OAAO,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;SAC/B;QAED,OAAO,iBAAiB,CACtB,IAAI,CAAC,UAAU,CAAC,EAChB,OAAO,EACP,IAAI,EACJ,KAAK,GAAG,CAAC,EACT,UAAU,EACV,IAAI,CACL,CAAC;IACJ,CAAC,CAAC;IAEF;QAGE,uBACW,MAAe,EACf,WAAgB,EAChB,OAAgB;YAFhB,WAAM,GAAN,MAAM,CAAS;YACf,gBAAW,GAAX,WAAW,CAAK;YAChB,YAAO,GAAP,OAAO,CAAS;YAEzB,IAAI,CAAC,IAAI,EAAE,CAAC;SACb;QACS,4BAAI,GAAd,eAAmB;QACnB,6BAAK,GAAL;YACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;QAEH,oBAAC;IAAD,CAAC,IAAA;IAED;QACU,sCAA6B;QAErC,4BACE,MAAe,EACf,WAAgB,EAChB,OAAgB,EACP,IAAY;YAJvB,YAME,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,SACpC;YAHU,UAAI,GAAJ,IAAI,CAAQ;;SAGtB;QACH,yBAAC;IAAD,CAXA,CACU,aAAa,GAUtB;IAED;QAAsC,kCAAkB;QAItD,wBACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EACA,QAA0B;YAJ5C,YAME,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,SACpC;YAHiB,cAAQ,GAAR,QAAQ,CAAkB;;SAG3C;;;QAKD,8BAAK,GAAL;YACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC3D,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;aAC1B;SACF;;;QAOS,qCAAY,GAAtB,UAAuB,IAAS,EAAE,GAAQ,EAAE,KAAU;YACpD,IAAI,IAAI,GAAG,IAAI,CAAC;YAChB,IAAI,IAAI,GAAG,IAAI,CAAC;YAChB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC3D,IAAM,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;gBACxC,cAAc,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;gBACtC,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;oBACxB,IAAI,GAAG,KAAK,CAAC;iBACd;gBACD,IAAI,cAAc,CAAC,IAAI,EAAE;oBACvB,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;wBACxB,MAAM;qBACP;iBACF;qBAAM;oBACL,IAAI,GAAG,KAAK,CAAC;iBACd;aACF;YACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACH,qBAAC;IAAD,CAjDA,CAAsC,aAAa,GAiDlD;IAED;QAAkD,uCAAc;QAE9D,6BACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B,EACjB,IAAY;YALvB,YAOE,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,SAC9C;YAHU,UAAI,GAAJ,IAAI,CAAQ;;SAGtB;QACH,0BAAC;IAAD,CAXA,CAAkD,cAAc,GAW/D;IAED;QAA2C,kCAAc;QAAzD;;SAOC;;;QAHC,6BAAI,GAAJ,UAAK,IAAW,EAAE,GAAQ,EAAE,MAAW;YACrC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC;SACtC;QACH,qBAAC;IAAD,CAPA,CAA2C,cAAc,GAOxD;IAED;QAAqC,mCAAc;QACjD,yBACW,OAAc,EACvB,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B;YAL5B,YAOE,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,SAC9C;YAPU,aAAO,GAAP,OAAO,CAAO;;;YAyBjB,sBAAgB,GAAG,UAAC,KAAU,EAAE,GAAQ,EAAE,KAAU;gBAC1D,KAAI,CAAC,YAAY,CAAC,KAAK,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;gBACrC,OAAO,CAAC,KAAI,CAAC,IAAI,CAAC;aACnB,CAAC;;SArBD;;;QAID,8BAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,MAAW;YACnC,iBAAiB,CACf,IAAI,EACJ,IAAI,CAAC,OAAO,EACZ,IAAI,CAAC,gBAAgB,EACrB,CAAC,EACD,GAAG,EACH,MAAM,CACP,CAAC;SACH;QASH,sBAAC;IAAD,CA/BA,CAAqC,cAAc,GA+BlD;IAEM,IAAM,YAAY,GAAG,UAAC,CAAC,EAAE,OAAmB;QACjD,IAAI,
CAAC,YAAY,QAAQ,EAAE;YACzB,OAAO,CAAC,CAAC;SACV;QACD,IAAI,CAAC,YAAY,MAAM,EAAE;YACvB,OAAO,UAAA,CAAC;gBACN,IAAM,MAAM,GAAG,OAAO,CAAC,KAAK,QAAQ,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;gBAClD,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC;gBAChB,OAAO,MAAM,CAAC;aACf,CAAC;SACH;QACD,IAAM,WAAW,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;QAClC,OAAO,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,WAAW,EAAE,UAAU,CAAC,CAAC,CAAC,CAAC,GAAA,CAAC;IAClD,CAAC,CAAC;;QAE2C,mCAAqB;QAAlE;;SAaC;QAXC,8BAAI,GAAJ;YACE,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAC9D;QACD,8BAAI,GAAJ,UAAK,IAAI,EAAE,GAAQ,EAAE,MAAW;YAC9B,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,IAAI,MAAM,CAAC,cAAc,CAAC,GAAG,CAAC,EAAE;gBACxD,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,EAAE;oBACjC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;oBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;iBAClB;aACF;SACF;QACH,sBAAC;IAAD,CAbA,CAA6C,aAAa,GAazD;QAEY,qBAAqB,GAAG,UACnC,MAAW,EACX,WAAgB,EAChB,OAAgB,IACb,OAAA,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,IAAC;IAEvD;QAA2C,iCAAqB;QAAhE;;SAKC;QAJC,4BAAI,GAAJ;YACE,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;QACH,oBAAC;IAAD,CALA,CAA2C,aAAa,GAKvD;IAEM,IAAM,yBAAyB,GAAG,UACvC,wBAA+C,IAC5C,OAAA,UAAC,MAAW,EAAE,WAAgB,EAAE,OAAgB,EAAE,IAAY;QACjE,IAAI,MAAM,IAAI,IAAI,EAAE;YAClB,OAAO,IAAI,aAAa,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;SACxD;QAED,OAAO,wBAAwB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;IACtE,CAAC,GAAA,CAAC;IAEK,IAAM,kBAAkB,GAAG,UAAC,YAA6B;QAC9D,OAAA,yBAAyB,CACvB,UAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;YACrD,IAAM,YAAY,GAAG,OAAO,UAAU,CAAC,MAAM,CAAC,CAAC;YAC/C,IAAM,IAAI,GAAG,YAAY,CAAC,MAAM,CAAC,CAAC;YAClC,OAAO,IAAI,eAAe,CACxB,UAAA,CAAC;gBACC,OAAO,OAAO,UAAU,CAAC,CAAC,CAAC,KAAK,YAAY,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC;aACzD,EACD,WAAW,EACX,OAAO,CACR,CAAC;SACH,CACF;IAZD,CAYC,CAAC;IASJ,IAAM,oBAAoB,GAAG,UAC3B,IAAY,EACZ,MAAW,EACX,WAAgB,EAChB,OAAgB;QAEhB,IAAM,gBAAgB,GAAG,OAAO,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;QAClD,IAAI,CAAC,gBAAgB,EAAE;YACrB,MAAM,IAAI,KAAK,CAAC,4BAA0B,IAAM,CAAC,CAAC;SACnD;QACD,OAAO,gBAAgB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;IAC9D,CAAC,CAAC;IAEK,IAAM,iBAAiB,GAAG,UAAC,KAAU;QAC1C,KAAK,IAAM,GAAG,IAAI,KAAK,EAAE;YACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG;gBAAE,OAAO,IAAI,CAAC;SACxC;QACD,OAAO,KAAK,CAAC;IACf,CAAC,CAAC;IACF,IAAM,qBAAqB,GAAG,UAC5B,OAAc,EACd,WAAgB,EAChB,WAAgB,EAChB,OAAgB;QAEhB,IAAI,iBAAiB,CAAC,WAAW,CAAC,EAAE;YAC5B,IAAA,gDAGL,EAHM,sBAAc,EAAE,wBAGtB,CAAC;YACF,IAAI,gBAAgB,CAAC,MAAM,EAAE;gBAC3B,MAAM,IAAI,KAAK,CACb,kEAAkE,CACnE,CAAC;aACH;YACD,OAAO,IAAI,eAAe,CACxB,OAAO,EACP,WAAW,EACX,WAAW,EACX,OAAO,EACP,cAAc,CACf,CAAC;SACH;QACD,OAAO,IAAI,eAAe,CAAC,OAAO,EAAE,WAAW,EAAE,WAAW,EAAE,OAAO,EAAE;YACrE,IAAI,eAAe,CAAC,WAAW,EAAE,WAAW,EAAE,OAAO,CAAC;SACvD,CAAC,CAAC;IACL,CAAC,CAAC;QAEW,oBAAoB,GAAG,UAClC,KAAqB,EACrB,WAAuB,EACvB,EAA8C;QAD9C,4BAAA,EAAA,kBAAuB;YACvB,4BAA8C,EAA5C,oBAAO,EAAE,0BAAU;QAErB,IAAM,OAAO,GAAG;YACd,OAAO,EAAE,OAAO,IAAI,MAAM;YAC1B,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,UAAU,IAAI,EAAE,CAAC;SAChD,CAAC;QAEI,IAAA,0CAGL,EAHM,sBAAc,EAAE,wBAGtB,CAAC;QAEF,IAAM,GAAG,GAAG,EAAE,CAAC;QAEf,IAAI,cAAc,CAAC,MAAM,EAAE;YACzB,GAAG,CAAC,IAAI,CACN,IAAI,eAAe,CAAC,EAAE,EAAE,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,cAAc,CAAC,CACrE,CAAC;SACH;QAED,GAAG,CAAC,IAAI,OAAR,GAAG,EAAS,gBAAgB,EAAE;QAE9B,IAAI,GAAG,CAAC,MAAM,KAAK,CAAC,EAAE;YACpB,OAAO,GAAG,CAAC,CAAC,CAAC,CAAC;SACf;QACD,OAAO,IAAI,cAAc,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,GAAG,CAAC,CAAC;IAC9D,EAAE;IAEF,IAAM,qBAAqB,GAAG,UAAC,KAAU,EAAE,OAAgB;QACzD,IAAM,cAAc,GAAG,EAAE,CAAC;QAC1B,IAAM,gBAAgB,GAAG,EAAE,CAAC;QAC5B,IAAI,CAAC,eAAe,CAAC,KAAK,CAAC,EAAE;YAC3B,cAAc,CAAC,IAAI,CAAC,IAAI,eAAe,C
AAC,KAAK,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC,CAAC;YAChE,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;SAC3C;QACD,KAAK,IAAM,GAAG,IAAI,KAAK,EAAE;YACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;gBACzB,IAAM,EAAE,GAAG,oBAAoB,CAAC,GAAG,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC;;gBAGjE,IAAI,EAAE,IAAI,IAAI,EAAE;oBACd,cAAc,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;iBACzB;aACF;iBAAM;gBACL,gBAAgB,CAAC,IAAI,CACnB,qBAAqB,CAAC,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAClE,CAAC;aACH;SACF;QAED,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;IAC5C,CAAC,CAAC;QAEW,qBAAqB,GAAG,UAAQ,SAA2B,IAAK,OAAA,UAC3E,IAAW,EACX,GAAS,EACT,KAAW;QAEX,SAAS,CAAC,KAAK,EAAE,CAAC;QAClB,SAAS,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QACjC,OAAO,SAAS,CAAC,IAAI,CAAC;IACxB,CAAC,IAAC;QAEW,iBAAiB,GAAG,UAC/B,KAAqB,EACrB,OAA8B;QAA9B,wBAAA,EAAA,YAA8B;QAE9B,OAAO,qBAAqB,CAC1B,oBAAoB,CAAiB,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAC3D,CAAC;IACJ;;IC9aA;QAAkB,uBAAuB;QAAzC;;SAeC;QAbC,kBAAI,GAAJ;YACE,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAC9D;QACD,mBAAK,GAAL;YACE,iBAAM,KAAK,WAAE,CAAC;YACd,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACD,kBAAI,GAAJ,UAAK,IAAS;YACZ,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE;gBACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;aACnB;SACF;QACH,UAAC;IAAD,CAfA,CAAkB,kBAAkB,GAenC;IACD;IACA;QAAyB,8BAA8B;QAAvD;;SA8BC;QA5BC,yBAAI,GAAJ;YACE,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;SACH;QACD,0BAAK,GAAL;YACE,iBAAM,KAAK,WAAE,CAAC;YACd,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;SAC9B;QACD,yBAAI,GAAJ,UAAK,IAAS;YACZ,IAAI,OAAO,CAAC,IAAI,CAAC,EAAE;gBACjB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,sBAAM,EAAW,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;;;oBAGlD,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;;oBAG7B,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,CAAC,CAAC;oBAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;iBACpD;gBACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;iBAAM;gBACL,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;gBAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;aACnB;SACF;QACH,iBAAC;IAAD,CA9BA,CAAyB,kBAAkB,GA8B1C;IAED;QAAmB,wBAA8B;QAAjD;;SAiBC;QAfC,mBAAI,GAAJ;YACE,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;SACH;QACD,oBAAK,GAAL;YACE,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;SAC9B;QACD,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;YACtC,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;SACxC;QACH,WAAC;IAAD,CAjBA,CAAmB,kBAAkB,GAiBpC;;QAE0B,yBAAuB;QAAlD;;SAYC;QAXC,oBAAI,GAAJ,eAAS;QACT,oBAAI,GAAJ,UAAK,IAAI;YACP,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,IAAI,CAAC,MAAM,KAAK,IAAI,CAAC,MAAM,EAAE;gBAChD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;;;;;SAKF;QACH,YAAC;IAAD,CAZA,CAA2B,kBAAkB,GAY5C;IAED;QAAkB,uBAAuB;QAAzC;;SA8BC;QA5BC,kBAAI,GAAJ;YAAA,iBAIC;YAHC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,UAAA,EAAE;gBAC5B,OAAA,oBAAoB,CAAC,EAAE,EAAE,IAAI,EAAE,KAAI,CAAC,OAAO,CAAC;aAAA,CAC7C,CAAC;SACH;QACD,mBAAK,GAAL;YACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,2BAAM,EAAgB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBACvD,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;aACtB;SACF;QACD,kBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,IAAI,GAAG,KAAK,CAAC;YACjB,IAAI,OAAO,GAAG,KAAK,CAAC;YACpB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,2BAAM,EAAgB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBACvD,IAAM,
EAAE,GAAG,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;gBACxB,EAAE,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;gBAC1B,IAAI,EAAE,CAAC,IAAI,EAAE;oBACX,IAAI,GAAG,IAAI,CAAC;oBACZ,OAAO,GAAG,EAAE,CAAC,IAAI,CAAC;oBAClB,MAAM;iBACP;aACF;YAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;YACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACH,UAAC;IAAD,CA9BA,CAAkB,kBAAkB,GA8BnC;IAED;QAAmB,wBAAG;QAAtB;;SAKC;QAJC,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,iBAAM,IAAI,YAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;SACxB;QACH,WAAC;IAAD,CALA,CAAmB,GAAG,GAKrB;IAED;QAAkB,uBAAuB;QAAzC;;SA2BC;QAzBC,kBAAI,GAAJ;YAAA,iBASC;YARC,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,UAAA,KAAK;gBACnC,IAAI,iBAAiB,CAAC,KAAK,CAAC,EAAE;oBAC5B,MAAM,IAAI,KAAK,CACb,yBAAuB,KAAI,CAAC,WAAW,CAAC,IAAI,CAAC,WAAW,EAAI,CAC7D,CAAC;iBACH;gBACD,OAAO,YAAY,CAAC,KAAK,EAAE,KAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;aAClD,CAAC,CAAC;SACJ;QACD,kBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,IAAI,GAAG,KAAK,CAAC;YACjB,IAAI,OAAO,GAAG,KAAK,CAAC;YACpB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC3D,IAAM,IAAI,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;gBAC9B,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;oBACd,IAAI,GAAG,IAAI,CAAC;oBACZ,OAAO,GAAG,IAAI,CAAC;oBACf,MAAM;iBACP;aACF;YAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;YACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACH,UAAC;IAAD,CA3BA,CAAkB,kBAAkB,GA2BnC;IAED;QAAmB,wBAAG;QAAtB;;SAKC;QAJC,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,iBAAM,IAAI,YAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;SACxB;QACH,WAAC;IAAD,CALA,CAAmB,GAAG,GAKrB;IAED;QAAsB,2BAA2B;QAAjD;;SAOC;QANC,sBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,KAAK,CAAC,cAAc,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,MAAM,EAAE;gBAC7C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;SACF;QACH,cAAC;IAAD,CAPA,CAAsB,kBAAkB,GAOvC;IAED;QAAmB,wBAAmB;QACpC,cACE,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY;mBAEZ,kBACE,MAAM,EACN,WAAW,EACX,OAAO,EACP,MAAM,CAAC,GAAG,CAAC,UAAA,KAAK,IAAI,OAAA,oBAAoB,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,CAAC,GAAA,CAAC,EACtE,IAAI,CACL;SACF;QACD,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;SACrC;QACH,WAAC;IAAD,CAlBA,CAAmB,mBAAmB,GAkBrC;QAEY,GAAG,GAAG,UAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;QACxE,OAAA,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC;IAAjD,EAAkD;QACvC,GAAG,GAAG,UACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACpC,GAAG,GAAG,UACjB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACpC,IAAI,GAAG,UAClB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACrC,UAAU,GAAG,UACxB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,UAAU,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QAC3C,IAAI,GAAG,UAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACrC,GAAG,GAAG,UACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QAEpC,GAAG,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,GAAG,MAAM,GAAA,GAAA,EAAE;QACpD,IAAI,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,IAAI,MAAM,GAAA,GAAA,EAAE;QACtD,GAAG,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,GAAG,MAAM,GAAA,GAAA,EAAE;QACpD,IAAI,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,IAAI,MAAM,GAAA,GAAA,EAAE;Q
ACtD,IAAI,GAAG,UAClB,EAA4B,EAC5B,WAAuB,EACvB,OAAgB;YAFf,WAAG,EAAE,mBAAW;QAIjB,OAAA,IAAI,eAAe,CACjB,UAAA,CAAC,IAAI,OAAA,UAAU,CAAC,CAAC,CAAC,GAAG,GAAG,KAAK,WAAW,GAAA,EACxC,WAAW,EACX,OAAO,CACR;IAJD,EAIE;QACS,OAAO,GAAG,UACrB,MAAe,EACf,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,OAAO,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACxC,MAAM,GAAG,UACpB,OAAe,EACf,WAAuB,EACvB,OAAgB;QAEhB,OAAA,IAAI,eAAe,CACjB,IAAI,MAAM,CAAC,OAAO,EAAE,WAAW,CAAC,QAAQ,CAAC,EACzC,WAAW,EACX,OAAO,CACR;IAJD,EAIE;QACS,IAAI,GAAG,UAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IAElD,IAAM,WAAW,GAAG;QAClB,MAAM,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,QAAQ,GAAA;QAClC,MAAM,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,QAAQ,GAAA;QAClC,IAAI,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,SAAS,GAAA;QACjC,KAAK,EAAE,UAAA,CAAC,IAAI,OAAA,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,GAAA;QAC5B,IAAI,EAAE,UAAA,CAAC,IAAI,OAAA,CAAC,KAAK,IAAI,GAAA;QACrB,SAAS,EAAE,UAAA,CAAC,IAAI,OAAA,CAAC,YAAY,IAAI,GAAA;KAClC,CAAC;QAEW,KAAK,GAAG,UACnB,KAAwB,EACxB,WAAuB,EACvB,OAAgB;QAEhB,OAAA,IAAI,eAAe,CACjB,UAAA,CAAC;YACC,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;gBAC7B,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,EAAE;oBACvB,MAAM,IAAI,KAAK,CAAC,2BAA2B,CAAC,CAAC;iBAC9C;gBAED,OAAO,WAAW,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;aAC9B;YAED,OAAO,CAAC,IAAI,IAAI,GAAG,CAAC,YAAY,KAAK,IAAI,CAAC,CAAC,WAAW,KAAK,KAAK,GAAG,KAAK,CAAC;SAC1E,EACD,WAAW,EACX,OAAO,CACR;IAdD,EAcE;QACS,IAAI,GAAG,UAClB,MAAoB,EACpB,UAAsB,EACtB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACpC,IAAI,GAAG,KAAK;QACZ,KAAK,GAAG,UACnB,MAAc,EACd,UAAsB,EACtB,OAAgB,IACb,OAAA,IAAI,KAAK,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,OAAO,CAAC,IAAC;QACxC,QAAQ,GAAG,cAAM,OAAA,IAAI,IAAC;QACtB,MAAM,GAAG,UACpB,MAAyB,EACzB,UAAsB,EACtB,OAAgB;QAEhB,IAAI,IAAI,CAAC;QAET,IAAI,UAAU,CAAC,MAAM,CAAC,EAAE;YACtB,IAAI,GAAG,MAAM,CAAC;SACf;aAAM,IAAI,CAAC,OAAO,CAAC,GAAG,CAAC,WAAW,EAAE;YACnC,IAAI,GAAG,IAAI,QAAQ,CAAC,KAAK,EAAE,SAAS,GAAG,MAAM,CAAC,CAAC;SAChD;aAAM;YACL,MAAM,IAAI,KAAK,CACb,oEAAkE,CACnE,CAAC;SACH;QAED,OAAO,IAAI,eAAe,CAAC,UAAA,CAAC,IAAI,OAAA,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAA,EAAE,UAAU,EAAE,OAAO,CAAC,CAAC;IACxE;;;;;;;;;;;;;;;;;;;;;;;;;;;;QCxUM,2BAA2B,GAAG,UAClC,KAAqB,EACrB,UAAe,EACf,EAA8C;YAA9C,4BAA8C,EAA5C,oBAAO,EAAE,0BAAU;QAErB,OAAO,oBAAoB,CAAC,KAAK,EAAE,UAAU,EAAE;YAC7C,OAAO,EAAE,OAAO;YAChB,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,iBAAiB,EAAE,UAAU,IAAI,EAAE,CAAC;SACnE,CAAC,CAAC;IACL,EAAE;IAEF,IAAM,wBAAwB,GAAG,UAC/B,KAAqB,EACrB,OAA8B;QAA9B,wBAAA,EAAA,YAA8B;QAE9B,IAAM,EAAE,GAAG,2BAA2B,CAAC,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;QAC7D,OAAO,qBAAqB,CAAC,EAAE,CAAC,CAAC;IACnC,CAAC;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;"} \ No newline at end of file diff --git a/node_modules/sift/lib/operations.d.ts b/node_modules/sift/lib/operations.d.ts new file mode 100644 index 000000000..296a96f46 --- /dev/null +++ b/node_modules/sift/lib/operations.d.ts @@ -0,0 +1,70 @@ +import { NamedBaseOperation, EqualsOperation, Options, Operation, Query, NamedGroupOperation } from "./core"; +import { Key } from "./utils"; +declare class $Ne extends NamedBaseOperation { + private _test; + init(): void; + reset(): void; + next(item: any): void; +} +declare class $ElemMatch extends NamedBaseOperation> { + private _queryOperation; + init(): void; + reset(): void; + next(item: any): void; +} +declare class $Not extends NamedBaseOperation> { + private _queryOperation; + init(): void; + reset(): void; + next(item: any, key: Key, owner: any): void; +} +export declare class $Size extends NamedBaseOperation { + 
init(): void; + next(item: any): void; +} +declare class $Or extends NamedBaseOperation { + private _ops; + init(): void; + reset(): void; + next(item: any, key: Key, owner: any): void; +} +declare class $Nor extends $Or { + next(item: any, key: Key, owner: any): void; +} +declare class $In extends NamedBaseOperation { + private _testers; + init(): void; + next(item: any, key: Key, owner: any): void; +} +declare class $Nin extends $In { + next(item: any, key: Key, owner: any): void; +} +declare class $Exists extends NamedBaseOperation { + next(item: any, key: Key, owner: any): void; +} +declare class $And extends NamedGroupOperation { + constructor(params: Query[], owneryQuery: Query, options: Options, name: string); + next(item: any, key: Key, owner: any): void; +} +export declare const $eq: (params: any, owneryQuery: any, options: Options) => EqualsOperation; +export declare const $ne: (params: any, owneryQuery: any, options: Options, name: string) => $Ne; +export declare const $or: (params: any[], owneryQuery: any, options: Options, name: string) => $Or; +export declare const $nor: (params: any[], owneryQuery: any, options: Options, name: string) => $Nor; +export declare const $elemMatch: (params: any, owneryQuery: any, options: Options, name: string) => $ElemMatch; +export declare const $nin: (params: any, owneryQuery: any, options: Options, name: string) => $Nin; +export declare const $in: (params: any, owneryQuery: any, options: Options, name: string) => $In; +export declare const $lt: (params: any, owneryQuery: any, options: Options, name: string) => Operation | import("./core").NopeOperation; +export declare const $lte: (params: any, owneryQuery: any, options: Options, name: string) => Operation | import("./core").NopeOperation; +export declare const $gt: (params: any, owneryQuery: any, options: Options, name: string) => Operation | import("./core").NopeOperation; +export declare const $gte: (params: any, owneryQuery: any, options: Options, name: string) => Operation | import("./core").NopeOperation; +export declare const $mod: ([mod, equalsValue]: number[], owneryQuery: any, options: Options) => EqualsOperation<(b: any) => boolean>; +export declare const $exists: (params: boolean, owneryQuery: any, options: Options, name: string) => $Exists; +export declare const $regex: (pattern: string, owneryQuery: any, options: Options) => EqualsOperation; +export declare const $not: (params: any, owneryQuery: any, options: Options, name: string) => $Not; +export declare const $type: (clazz: TimerHandler, owneryQuery: any, options: Options) => EqualsOperation<(b: any) => any>; +export declare const $and: (params: any[], ownerQuery: any, options: Options, name: string) => $And; +export declare const $all: (params: any[], ownerQuery: any, options: Options, name: string) => $And; +export declare const $size: (params: number, ownerQuery: any, options: Options) => $Size; +export declare const $options: () => any; +export declare const $where: (params: TimerHandler, ownerQuery: any, options: Options) => EqualsOperation<(b: any) => any>; +export {}; diff --git a/node_modules/sift/lib/utils.d.ts b/node_modules/sift/lib/utils.d.ts new file mode 100644 index 000000000..422a2924a --- /dev/null +++ b/node_modules/sift/lib/utils.d.ts @@ -0,0 +1,9 @@ +export declare type Key = string | number; +export declare type Comparator = (a: any, b: any) => boolean; +export declare const typeChecker: (type: any) => (value: any) => value is TType; +export declare const comparable: (value: any) => any; +export declare 
const isArray: (value: any) => value is any[]; +export declare const isObject: (value: any) => value is Object; +export declare const isFunction: (value: any) => value is Function; +export declare const isVanillaObject: (value: any) => boolean; +export declare const equals: (a: any, b: any) => boolean; diff --git a/node_modules/sift/package.json b/node_modules/sift/package.json new file mode 100644 index 000000000..2282fe8e8 --- /dev/null +++ b/node_modules/sift/package.json @@ -0,0 +1,93 @@ +{ + "_from": "sift@13.5.2", + "_id": "sift@13.5.2", + "_inBundle": false, + "_integrity": "sha512-+gxdEOMA2J+AI+fVsCqeNn7Tgx3M9ZN9jdi95939l1IJ8cZsqS8sqpJyOkic2SJk+1+98Uwryt/gL6XDaV+UZA==", + "_location": "/sift", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "sift@13.5.2", + "name": "sift", + "escapedName": "sift", + "rawSpec": "13.5.2", + "saveSpec": null, + "fetchSpec": "13.5.2" + }, + "_requiredBy": [ + "/mongoose" + ], + "_resolved": "https://registry.npmjs.org/sift/-/sift-13.5.2.tgz", + "_shasum": "24a715e13c617b086166cd04917d204a591c9da6", + "_spec": "sift@13.5.2", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "Craig Condon", + "email": "craig.j.condon@gmail.com" + }, + "bugs": { + "url": "https://github.com/crcn/sift.js/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "MongoDB query filtering in JavaScript", + "devDependencies": { + "@rollup/plugin-replace": "^2.3.2", + "@rollup/plugin-typescript": "^4.1.1", + "@types/node": "^13.7.0", + "bson": "^4.0.3", + "eval": "^0.1.4", + "husky": "^1.2.1", + "immutable": "^3.7.6", + "mocha": "8.3.2", + "mongodb": "^3.6.6", + "prettier": "1.15.3", + "pretty-quick": "^1.11.1", + "rimraf": "^3.0.2", + "rollup": "^2.7.2", + "rollup-plugin-terser": "^7.0.2", + "tslib": "^2.0.0", + "typescript": "^3.8.3" + }, + "engines": {}, + "es2015": "./es/index.js", + "files": [ + "es", + "es5m", + "lib", + "*.d.ts", + "*.js.map", + "index.js", + "sift.csp.min.js", + "sift.min.js", + "MIT-LICENSE.txt" + ], + "homepage": "https://github.com/crcn/sift.js#readme", + "husky": { + "hooks": { + "pre-commit": "pretty-quick --staged" + } + }, + "license": "MIT", + "main": "./index.js", + "module": "./es5m/index.js", + "name": "sift", + "repository": { + "type": "git", + "url": "git+https://github.com/crcn/sift.js.git" + }, + "scripts": { + "build": "rollup -c", + "build:types": "tsc -p tsconfig.json --emitDeclarationOnly --outDir lib", + "clean": "rimraf lib es5m es", + "prebuild": "npm run clean && npm run build:types", + "prepublishOnly": "npm run build && npm run test", + "test": "npm run test:spec && npm run test:types", + "test:spec": "mocha ./test -R spec", + "test:types": "cd test && tsc types.ts --noEmit" + }, + "sideEffects": false, + "typings": "./index.d.ts", + "version": "13.5.2" +} diff --git a/node_modules/sift/sift.csp.min.js b/node_modules/sift/sift.csp.min.js new file mode 100644 index 000000000..9b89893e4 --- /dev/null +++ b/node_modules/sift/sift.csp.min.js @@ -0,0 +1,686 @@ +(function (global, factory) { + typeof exports === 'object' && typeof module !== 'undefined' ? factory(exports) : + typeof define === 'function' && define.amd ? define(['exports'], factory) : + (global = global || self, factory(global.sift = {})); +}(this, (function (exports) { 'use strict'; + + /*! ***************************************************************************** + Copyright (c) Microsoft Corporation. 
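// Sketch of the extension point suggested by operations.d.ts and the
// options.operations lookup in the bundle above (not part of the vendored files;
// $isString is a hypothetical operator invented for illustration): queries built
// with createQueryTester resolve $-prefixed keys through options.operations, so a
// custom operator can be registered next to the exported helpers.
const { createQueryTester, createEqualsOperation } = require("sift/lib/index.js");

const test = createQueryTester(
  { title: { $isString: true } },
  {
    operations: {
      // hypothetical operator: keep the document when typeof matches the flag
      $isString: (params, ownerQuery, options) =>
        createEqualsOperation(
          value => (typeof value === "string") === params,
          ownerQuery,
          options
        )
    }
  }
);

test({ title: "buy milk" }); // true
test({ title: 42 });         // false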
+ + Permission to use, copy, modify, and/or distribute this software for any + purpose with or without fee is hereby granted. + + THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH + REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY + AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, + INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM + LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR + OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR + PERFORMANCE OF THIS SOFTWARE. + ***************************************************************************** */ + /* global Reflect, Promise */ + + var extendStatics = function(d, b) { + extendStatics = Object.setPrototypeOf || + ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) || + function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; }; + return extendStatics(d, b); + }; + + function __extends(d, b) { + extendStatics(d, b); + function __() { this.constructor = d; } + d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __()); + } + + var typeChecker = function (type) { + var typeString = "[object " + type + "]"; + return function (value) { + return getClassName(value) === typeString; + }; + }; + var getClassName = function (value) { return Object.prototype.toString.call(value); }; + var comparable = function (value) { + if (value instanceof Date) { + return value.getTime(); + } + else if (isArray(value)) { + return value.map(comparable); + } + else if (value && typeof value.toJSON === "function") { + return value.toJSON(); + } + return value; + }; + var isArray = typeChecker("Array"); + var isObject = typeChecker("Object"); + var isFunction = typeChecker("Function"); + var isVanillaObject = function (value) { + return (value && + (value.constructor === Object || + value.constructor === Array || + value.constructor.toString() === "function Object() { [native code] }" || + value.constructor.toString() === "function Array() { [native code] }") && + !value.toJSON); + }; + var equals = function (a, b) { + if (a == null && a == b) { + return true; + } + if (a === b) { + return true; + } + if (Object.prototype.toString.call(a) !== Object.prototype.toString.call(b)) { + return false; + } + if (isArray(a)) { + if (a.length !== b.length) { + return false; + } + for (var i = 0, length_1 = a.length; i < length_1; i++) { + if (!equals(a[i], b[i])) + return false; + } + return true; + } + else if (isObject(a)) { + if (Object.keys(a).length !== Object.keys(b).length) { + return false; + } + for (var key in a) { + if (!equals(a[key], b[key])) + return false; + } + return true; + } + return false; + }; + + /** + * Walks through each value given the context - used for nested operations. E.g: + * { "person.address": { $eq: "blarg" }} + */ + var walkKeyPathValues = function (item, keyPath, next, depth, key, owner) { + var currentKey = keyPath[depth]; + // if array, then try matching. Might fall through for cases like: + // { $eq: [1, 2, 3] }, [ 1, 2, 3 ]. + if (isArray(item) && isNaN(Number(currentKey))) { + for (var i = 0, length_1 = item.length; i < length_1; i++) { + // if FALSE is returned, then terminate walker. For operations, this simply + // means that the search critera was met. 
+ if (!walkKeyPathValues(item[i], keyPath, next, depth, i, item)) { + return false; + } + } + } + if (depth === keyPath.length || item == null) { + return next(item, key, owner); + } + return walkKeyPathValues(item[currentKey], keyPath, next, depth + 1, currentKey, item); + }; + var BaseOperation = /** @class */ (function () { + function BaseOperation(params, owneryQuery, options) { + this.params = params; + this.owneryQuery = owneryQuery; + this.options = options; + this.init(); + } + BaseOperation.prototype.init = function () { }; + BaseOperation.prototype.reset = function () { + this.done = false; + this.keep = false; + }; + return BaseOperation; + }()); + var NamedBaseOperation = /** @class */ (function (_super) { + __extends(NamedBaseOperation, _super); + function NamedBaseOperation(params, owneryQuery, options, name) { + var _this = _super.call(this, params, owneryQuery, options) || this; + _this.name = name; + return _this; + } + return NamedBaseOperation; + }(BaseOperation)); + var GroupOperation = /** @class */ (function (_super) { + __extends(GroupOperation, _super); + function GroupOperation(params, owneryQuery, options, children) { + var _this = _super.call(this, params, owneryQuery, options) || this; + _this.children = children; + return _this; + } + /** + */ + GroupOperation.prototype.reset = function () { + this.keep = false; + this.done = false; + for (var i = 0, length_2 = this.children.length; i < length_2; i++) { + this.children[i].reset(); + } + }; + /** + */ + GroupOperation.prototype.childrenNext = function (item, key, owner) { + var done = true; + var keep = true; + for (var i = 0, length_3 = this.children.length; i < length_3; i++) { + var childOperation = this.children[i]; + childOperation.next(item, key, owner); + if (!childOperation.keep) { + keep = false; + } + if (childOperation.done) { + if (!childOperation.keep) { + break; + } + } + else { + done = false; + } + } + this.done = done; + this.keep = keep; + }; + return GroupOperation; + }(BaseOperation)); + var NamedGroupOperation = /** @class */ (function (_super) { + __extends(NamedGroupOperation, _super); + function NamedGroupOperation(params, owneryQuery, options, children, name) { + var _this = _super.call(this, params, owneryQuery, options, children) || this; + _this.name = name; + return _this; + } + return NamedGroupOperation; + }(GroupOperation)); + var QueryOperation = /** @class */ (function (_super) { + __extends(QueryOperation, _super); + function QueryOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + /** + */ + QueryOperation.prototype.next = function (item, key, parent) { + this.childrenNext(item, key, parent); + }; + return QueryOperation; + }(GroupOperation)); + var NestedOperation = /** @class */ (function (_super) { + __extends(NestedOperation, _super); + function NestedOperation(keyPath, params, owneryQuery, options, children) { + var _this = _super.call(this, params, owneryQuery, options, children) || this; + _this.keyPath = keyPath; + /** + */ + _this._nextNestedValue = function (value, key, owner) { + _this.childrenNext(value, key, owner); + return !_this.done; + }; + return _this; + } + /** + */ + NestedOperation.prototype.next = function (item, key, parent) { + walkKeyPathValues(item, this.keyPath, this._nextNestedValue, 0, key, parent); + }; + return NestedOperation; + }(GroupOperation)); + var createTester = function (a, compare) { + if (a instanceof Function) { + return a; + } + if (a instanceof RegExp) { + return function (b) { + var result = 
typeof b === "string" && a.test(b); + a.lastIndex = 0; + return result; + }; + } + var comparableA = comparable(a); + return function (b) { return compare(comparableA, comparable(b)); }; + }; + var EqualsOperation = /** @class */ (function (_super) { + __extends(EqualsOperation, _super); + function EqualsOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + EqualsOperation.prototype.init = function () { + this._test = createTester(this.params, this.options.compare); + }; + EqualsOperation.prototype.next = function (item, key, parent) { + if (!Array.isArray(parent) || parent.hasOwnProperty(key)) { + if (this._test(item, key, parent)) { + this.done = true; + this.keep = true; + } + } + }; + return EqualsOperation; + }(BaseOperation)); + var createEqualsOperation = function (params, owneryQuery, options) { return new EqualsOperation(params, owneryQuery, options); }; + var NopeOperation = /** @class */ (function (_super) { + __extends(NopeOperation, _super); + function NopeOperation() { + return _super !== null && _super.apply(this, arguments) || this; + } + NopeOperation.prototype.next = function () { + this.done = true; + this.keep = false; + }; + return NopeOperation; + }(BaseOperation)); + var numericalOperationCreator = function (createNumericalOperation) { return function (params, owneryQuery, options, name) { + if (params == null) { + return new NopeOperation(params, owneryQuery, options); + } + return createNumericalOperation(params, owneryQuery, options, name); + }; }; + var numericalOperation = function (createTester) { + return numericalOperationCreator(function (params, owneryQuery, options) { + var typeofParams = typeof comparable(params); + var test = createTester(params); + return new EqualsOperation(function (b) { + return typeof comparable(b) === typeofParams && test(b); + }, owneryQuery, options); + }); + }; + var createNamedOperation = function (name, params, parentQuery, options) { + var operationCreator = options.operations[name]; + if (!operationCreator) { + throw new Error("Unsupported operation: " + name); + } + return operationCreator(params, parentQuery, options, name); + }; + var containsOperation = function (query) { + for (var key in query) { + if (key.charAt(0) === "$") + return true; + } + return false; + }; + var createNestedOperation = function (keyPath, nestedQuery, owneryQuery, options) { + if (containsOperation(nestedQuery)) { + var _a = createQueryOperations(nestedQuery, options), selfOperations = _a[0], nestedOperations = _a[1]; + if (nestedOperations.length) { + throw new Error("Property queries must contain only operations, or exact objects."); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, selfOperations); + } + return new NestedOperation(keyPath, nestedQuery, owneryQuery, options, [ + new EqualsOperation(nestedQuery, owneryQuery, options) + ]); + }; + var createQueryOperation = function (query, owneryQuery, _a) { + if (owneryQuery === void 0) { owneryQuery = null; } + var _b = _a === void 0 ? 
{} : _a, compare = _b.compare, operations = _b.operations; + var options = { + compare: compare || equals, + operations: Object.assign({}, operations || {}) + }; + var _c = createQueryOperations(query, options), selfOperations = _c[0], nestedOperations = _c[1]; + var ops = []; + if (selfOperations.length) { + ops.push(new NestedOperation([], query, owneryQuery, options, selfOperations)); + } + ops.push.apply(ops, nestedOperations); + if (ops.length === 1) { + return ops[0]; + } + return new QueryOperation(query, owneryQuery, options, ops); + }; + var createQueryOperations = function (query, options) { + var selfOperations = []; + var nestedOperations = []; + if (!isVanillaObject(query)) { + selfOperations.push(new EqualsOperation(query, query, options)); + return [selfOperations, nestedOperations]; + } + for (var key in query) { + if (key.charAt(0) === "$") { + var op = createNamedOperation(key, query[key], query, options); + // probably just a flag for another operation (like $options) + if (op != null) { + selfOperations.push(op); + } + } + else { + nestedOperations.push(createNestedOperation(key.split("."), query[key], query, options)); + } + } + return [selfOperations, nestedOperations]; + }; + var createOperationTester = function (operation) { return function (item, key, owner) { + operation.reset(); + operation.next(item, key, owner); + return operation.keep; + }; }; + var createQueryTester = function (query, options) { + if (options === void 0) { options = {}; } + return createOperationTester(createQueryOperation(query, null, options)); + }; + + var $Ne = /** @class */ (function (_super) { + __extends($Ne, _super); + function $Ne() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Ne.prototype.init = function () { + this._test = createTester(this.params, this.options.compare); + }; + $Ne.prototype.reset = function () { + _super.prototype.reset.call(this); + this.keep = true; + }; + $Ne.prototype.next = function (item) { + if (this._test(item)) { + this.done = true; + this.keep = false; + } + }; + return $Ne; + }(NamedBaseOperation)); + // https://docs.mongodb.com/manual/reference/operator/query/elemMatch/ + var $ElemMatch = /** @class */ (function (_super) { + __extends($ElemMatch, _super); + function $ElemMatch() { + return _super !== null && _super.apply(this, arguments) || this; + } + $ElemMatch.prototype.init = function () { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + }; + $ElemMatch.prototype.reset = function () { + _super.prototype.reset.call(this); + this._queryOperation.reset(); + }; + $ElemMatch.prototype.next = function (item) { + if (isArray(item)) { + for (var i = 0, length_1 = item.length; i < length_1; i++) { + // reset query operation since item being tested needs to pass _all_ query + // operations for it to be a success + this._queryOperation.reset(); + // check item + this._queryOperation.next(item[i], i, item); + this.keep = this.keep || this._queryOperation.keep; + } + this.done = true; + } + else { + this.done = false; + this.keep = false; + } + }; + return $ElemMatch; + }(NamedBaseOperation)); + var $Not = /** @class */ (function (_super) { + __extends($Not, _super); + function $Not() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Not.prototype.init = function () { + this._queryOperation = createQueryOperation(this.params, this.owneryQuery, this.options); + }; + $Not.prototype.reset = function () { + this._queryOperation.reset(); + }; + $Not.prototype.next = 
function (item, key, owner) { + this._queryOperation.next(item, key, owner); + this.done = this._queryOperation.done; + this.keep = !this._queryOperation.keep; + }; + return $Not; + }(NamedBaseOperation)); + var $Size = /** @class */ (function (_super) { + __extends($Size, _super); + function $Size() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Size.prototype.init = function () { }; + $Size.prototype.next = function (item) { + if (isArray(item) && item.length === this.params) { + this.done = true; + this.keep = true; + } + // if (parent && parent.length === this.params) { + // this.done = true; + // this.keep = true; + // } + }; + return $Size; + }(NamedBaseOperation)); + var $Or = /** @class */ (function (_super) { + __extends($Or, _super); + function $Or() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Or.prototype.init = function () { + var _this = this; + this._ops = this.params.map(function (op) { + return createQueryOperation(op, null, _this.options); + }); + }; + $Or.prototype.reset = function () { + this.done = false; + this.keep = false; + for (var i = 0, length_2 = this._ops.length; i < length_2; i++) { + this._ops[i].reset(); + } + }; + $Or.prototype.next = function (item, key, owner) { + var done = false; + var success = false; + for (var i = 0, length_3 = this._ops.length; i < length_3; i++) { + var op = this._ops[i]; + op.next(item, key, owner); + if (op.keep) { + done = true; + success = op.keep; + break; + } + } + this.keep = success; + this.done = done; + }; + return $Or; + }(NamedBaseOperation)); + var $Nor = /** @class */ (function (_super) { + __extends($Nor, _super); + function $Nor() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Nor.prototype.next = function (item, key, owner) { + _super.prototype.next.call(this, item, key, owner); + this.keep = !this.keep; + }; + return $Nor; + }($Or)); + var $In = /** @class */ (function (_super) { + __extends($In, _super); + function $In() { + return _super !== null && _super.apply(this, arguments) || this; + } + $In.prototype.init = function () { + var _this = this; + this._testers = this.params.map(function (value) { + if (containsOperation(value)) { + throw new Error("cannot nest $ under " + _this.constructor.name.toLowerCase()); + } + return createTester(value, _this.options.compare); + }); + }; + $In.prototype.next = function (item, key, owner) { + var done = false; + var success = false; + for (var i = 0, length_4 = this._testers.length; i < length_4; i++) { + var test = this._testers[i]; + if (test(item)) { + done = true; + success = true; + break; + } + } + this.keep = success; + this.done = done; + }; + return $In; + }(NamedBaseOperation)); + var $Nin = /** @class */ (function (_super) { + __extends($Nin, _super); + function $Nin() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Nin.prototype.next = function (item, key, owner) { + _super.prototype.next.call(this, item, key, owner); + this.keep = !this.keep; + }; + return $Nin; + }($In)); + var $Exists = /** @class */ (function (_super) { + __extends($Exists, _super); + function $Exists() { + return _super !== null && _super.apply(this, arguments) || this; + } + $Exists.prototype.next = function (item, key, owner) { + if (owner.hasOwnProperty(key) === this.params) { + this.done = true; + this.keep = true; + } + }; + return $Exists; + }(NamedBaseOperation)); + var $And = /** @class */ (function (_super) { + __extends($And, _super); + function $And(params, 
owneryQuery, options, name) { + return _super.call(this, params, owneryQuery, options, params.map(function (query) { return createQueryOperation(query, owneryQuery, options); }), name) || this; + } + $And.prototype.next = function (item, key, owner) { + this.childrenNext(item, key, owner); + }; + return $And; + }(NamedGroupOperation)); + var $eq = function (params, owneryQuery, options) { + return new EqualsOperation(params, owneryQuery, options); + }; + var $ne = function (params, owneryQuery, options, name) { return new $Ne(params, owneryQuery, options, name); }; + var $or = function (params, owneryQuery, options, name) { return new $Or(params, owneryQuery, options, name); }; + var $nor = function (params, owneryQuery, options, name) { return new $Nor(params, owneryQuery, options, name); }; + var $elemMatch = function (params, owneryQuery, options, name) { return new $ElemMatch(params, owneryQuery, options, name); }; + var $nin = function (params, owneryQuery, options, name) { return new $Nin(params, owneryQuery, options, name); }; + var $in = function (params, owneryQuery, options, name) { return new $In(params, owneryQuery, options, name); }; + var $lt = numericalOperation(function (params) { return function (b) { return b < params; }; }); + var $lte = numericalOperation(function (params) { return function (b) { return b <= params; }; }); + var $gt = numericalOperation(function (params) { return function (b) { return b > params; }; }); + var $gte = numericalOperation(function (params) { return function (b) { return b >= params; }; }); + var $mod = function (_a, owneryQuery, options) { + var mod = _a[0], equalsValue = _a[1]; + return new EqualsOperation(function (b) { return comparable(b) % mod === equalsValue; }, owneryQuery, options); + }; + var $exists = function (params, owneryQuery, options, name) { return new $Exists(params, owneryQuery, options, name); }; + var $regex = function (pattern, owneryQuery, options) { + return new EqualsOperation(new RegExp(pattern, owneryQuery.$options), owneryQuery, options); + }; + var $not = function (params, owneryQuery, options, name) { return new $Not(params, owneryQuery, options, name); }; + var typeAliases = { + number: function (v) { return typeof v === "number"; }, + string: function (v) { return typeof v === "string"; }, + bool: function (v) { return typeof v === "boolean"; }, + array: function (v) { return Array.isArray(v); }, + null: function (v) { return v === null; }, + timestamp: function (v) { return v instanceof Date; } + }; + var $type = function (clazz, owneryQuery, options) { + return new EqualsOperation(function (b) { + if (typeof clazz === "string") { + if (!typeAliases[clazz]) { + throw new Error("Type alias does not exist"); + } + return typeAliases[clazz](b); + } + return b != null ? 
b instanceof clazz || b.constructor === clazz : false; + }, owneryQuery, options); + }; + var $and = function (params, ownerQuery, options, name) { return new $And(params, ownerQuery, options, name); }; + var $all = $and; + var $size = function (params, ownerQuery, options) { return new $Size(params, ownerQuery, options, "$size"); }; + var $options = function () { return null; }; + var $where = function (params, ownerQuery, options) { + var test; + if (isFunction(params)) { + test = params; + } + else { + throw new Error("In CSP mode, sift does not support strings in \"$where\" condition"); + } + return new EqualsOperation(function (b) { return test.bind(b)(b); }, ownerQuery, options); + }; + + var defaultOperations = /*#__PURE__*/Object.freeze({ + __proto__: null, + $Size: $Size, + $eq: $eq, + $ne: $ne, + $or: $or, + $nor: $nor, + $elemMatch: $elemMatch, + $nin: $nin, + $in: $in, + $lt: $lt, + $lte: $lte, + $gt: $gt, + $gte: $gte, + $mod: $mod, + $exists: $exists, + $regex: $regex, + $not: $not, + $type: $type, + $and: $and, + $all: $all, + $size: $size, + $options: $options, + $where: $where + }); + + var createDefaultQueryOperation = function (query, ownerQuery, _a) { + var _b = _a === void 0 ? {} : _a, compare = _b.compare, operations = _b.operations; + return createQueryOperation(query, ownerQuery, { + compare: compare, + operations: Object.assign({}, defaultOperations, operations || {}) + }); + }; + var createDefaultQueryTester = function (query, options) { + if (options === void 0) { options = {}; } + var op = createDefaultQueryOperation(query, null, options); + return createOperationTester(op); + }; + + exports.$Size = $Size; + exports.$all = $all; + exports.$and = $and; + exports.$elemMatch = $elemMatch; + exports.$eq = $eq; + exports.$exists = $exists; + exports.$gt = $gt; + exports.$gte = $gte; + exports.$in = $in; + exports.$lt = $lt; + exports.$lte = $lte; + exports.$mod = $mod; + exports.$ne = $ne; + exports.$nin = $nin; + exports.$nor = $nor; + exports.$not = $not; + exports.$options = $options; + exports.$or = $or; + exports.$regex = $regex; + exports.$size = $size; + exports.$type = $type; + exports.$where = $where; + exports.EqualsOperation = EqualsOperation; + exports.createDefaultQueryOperation = createDefaultQueryOperation; + exports.createEqualsOperation = createEqualsOperation; + exports.createOperationTester = createOperationTester; + exports.createQueryOperation = createQueryOperation; + exports.createQueryTester = createQueryTester; + exports.default = createDefaultQueryTester; + + Object.defineProperty(exports, '__esModule', { value: true }); + +}))); +//# sourceMappingURL=sift.csp.min.js.map diff --git a/node_modules/sift/sift.csp.min.js.map b/node_modules/sift/sift.csp.min.js.map new file mode 100644 index 000000000..5fbb0b355 --- /dev/null +++ b/node_modules/sift/sift.csp.min.js.map @@ -0,0 +1 @@ +{"version":3,"file":"sift.csp.min.js","sources":["node_modules/tslib/tslib.es6.js","src/utils.ts","src/core.ts","src/operations.ts","src/index.ts"],"sourcesContent":["/*! *****************************************************************************\r\nCopyright (c) Microsoft Corporation.\r\n\r\nPermission to use, copy, modify, and/or distribute this software for any\r\npurpose with or without fee is hereby granted.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\r\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY\r\nAND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\r\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\r\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\r\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\r\nPERFORMANCE OF THIS SOFTWARE.\r\n***************************************************************************** */\r\n/* global Reflect, Promise */\r\n\r\nvar extendStatics = function(d, b) {\r\n extendStatics = Object.setPrototypeOf ||\r\n ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||\r\n function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };\r\n return extendStatics(d, b);\r\n};\r\n\r\nexport function __extends(d, b) {\r\n extendStatics(d, b);\r\n function __() { this.constructor = d; }\r\n d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());\r\n}\r\n\r\nexport var __assign = function() {\r\n __assign = Object.assign || function __assign(t) {\r\n for (var s, i = 1, n = arguments.length; i < n; i++) {\r\n s = arguments[i];\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p)) t[p] = s[p];\r\n }\r\n return t;\r\n }\r\n return __assign.apply(this, arguments);\r\n}\r\n\r\nexport function __rest(s, e) {\r\n var t = {};\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)\r\n t[p] = s[p];\r\n if (s != null && typeof Object.getOwnPropertySymbols === \"function\")\r\n for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {\r\n if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))\r\n t[p[i]] = s[p[i]];\r\n }\r\n return t;\r\n}\r\n\r\nexport function __decorate(decorators, target, key, desc) {\r\n var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;\r\n if (typeof Reflect === \"object\" && typeof Reflect.decorate === \"function\") r = Reflect.decorate(decorators, target, key, desc);\r\n else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? d(target, key, r) : d(target, key)) || r;\r\n return c > 3 && r && Object.defineProperty(target, key, r), r;\r\n}\r\n\r\nexport function __param(paramIndex, decorator) {\r\n return function (target, key) { decorator(target, key, paramIndex); }\r\n}\r\n\r\nexport function __metadata(metadataKey, metadataValue) {\r\n if (typeof Reflect === \"object\" && typeof Reflect.metadata === \"function\") return Reflect.metadata(metadataKey, metadataValue);\r\n}\r\n\r\nexport function __awaiter(thisArg, _arguments, P, generator) {\r\n function adopt(value) { return value instanceof P ? value : new P(function (resolve) { resolve(value); }); }\r\n return new (P || (P = Promise))(function (resolve, reject) {\r\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\r\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\r\n function step(result) { result.done ? 
resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\r\n step((generator = generator.apply(thisArg, _arguments || [])).next());\r\n });\r\n}\r\n\r\nexport function __generator(thisArg, body) {\r\n var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;\r\n return g = { next: verb(0), \"throw\": verb(1), \"return\": verb(2) }, typeof Symbol === \"function\" && (g[Symbol.iterator] = function() { return this; }), g;\r\n function verb(n) { return function (v) { return step([n, v]); }; }\r\n function step(op) {\r\n if (f) throw new TypeError(\"Generator is already executing.\");\r\n while (_) try {\r\n if (f = 1, y && (t = op[0] & 2 ? y[\"return\"] : op[0] ? y[\"throw\"] || ((t = y[\"return\"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;\r\n if (y = 0, t) op = [op[0] & 2, t.value];\r\n switch (op[0]) {\r\n case 0: case 1: t = op; break;\r\n case 4: _.label++; return { value: op[1], done: false };\r\n case 5: _.label++; y = op[1]; op = [0]; continue;\r\n case 7: op = _.ops.pop(); _.trys.pop(); continue;\r\n default:\r\n if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }\r\n if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }\r\n if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }\r\n if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }\r\n if (t[2]) _.ops.pop();\r\n _.trys.pop(); continue;\r\n }\r\n op = body.call(thisArg, _);\r\n } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }\r\n if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };\r\n }\r\n}\r\n\r\nexport var __createBinding = Object.create ? (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\r\n}) : (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n o[k2] = m[k];\r\n});\r\n\r\nexport function __exportStar(m, exports) {\r\n for (var p in m) if (p !== \"default\" && !exports.hasOwnProperty(p)) __createBinding(exports, m, p);\r\n}\r\n\r\nexport function __values(o) {\r\n var s = typeof Symbol === \"function\" && Symbol.iterator, m = s && o[s], i = 0;\r\n if (m) return m.call(o);\r\n if (o && typeof o.length === \"number\") return {\r\n next: function () {\r\n if (o && i >= o.length) o = void 0;\r\n return { value: o && o[i++], done: !o };\r\n }\r\n };\r\n throw new TypeError(s ? 
\"Object is not iterable.\" : \"Symbol.iterator is not defined.\");\r\n}\r\n\r\nexport function __read(o, n) {\r\n var m = typeof Symbol === \"function\" && o[Symbol.iterator];\r\n if (!m) return o;\r\n var i = m.call(o), r, ar = [], e;\r\n try {\r\n while ((n === void 0 || n-- > 0) && !(r = i.next()).done) ar.push(r.value);\r\n }\r\n catch (error) { e = { error: error }; }\r\n finally {\r\n try {\r\n if (r && !r.done && (m = i[\"return\"])) m.call(i);\r\n }\r\n finally { if (e) throw e.error; }\r\n }\r\n return ar;\r\n}\r\n\r\nexport function __spread() {\r\n for (var ar = [], i = 0; i < arguments.length; i++)\r\n ar = ar.concat(__read(arguments[i]));\r\n return ar;\r\n}\r\n\r\nexport function __spreadArrays() {\r\n for (var s = 0, i = 0, il = arguments.length; i < il; i++) s += arguments[i].length;\r\n for (var r = Array(s), k = 0, i = 0; i < il; i++)\r\n for (var a = arguments[i], j = 0, jl = a.length; j < jl; j++, k++)\r\n r[k] = a[j];\r\n return r;\r\n};\r\n\r\nexport function __await(v) {\r\n return this instanceof __await ? (this.v = v, this) : new __await(v);\r\n}\r\n\r\nexport function __asyncGenerator(thisArg, _arguments, generator) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var g = generator.apply(thisArg, _arguments || []), i, q = [];\r\n return i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i;\r\n function verb(n) { if (g[n]) i[n] = function (v) { return new Promise(function (a, b) { q.push([n, v, a, b]) > 1 || resume(n, v); }); }; }\r\n function resume(n, v) { try { step(g[n](v)); } catch (e) { settle(q[0][3], e); } }\r\n function step(r) { r.value instanceof __await ? Promise.resolve(r.value.v).then(fulfill, reject) : settle(q[0][2], r); }\r\n function fulfill(value) { resume(\"next\", value); }\r\n function reject(value) { resume(\"throw\", value); }\r\n function settle(f, v) { if (f(v), q.shift(), q.length) resume(q[0][0], q[0][1]); }\r\n}\r\n\r\nexport function __asyncDelegator(o) {\r\n var i, p;\r\n return i = {}, verb(\"next\"), verb(\"throw\", function (e) { throw e; }), verb(\"return\"), i[Symbol.iterator] = function () { return this; }, i;\r\n function verb(n, f) { i[n] = o[n] ? function (v) { return (p = !p) ? { value: __await(o[n](v)), done: n === \"return\" } : f ? f(v) : v; } : f; }\r\n}\r\n\r\nexport function __asyncValues(o) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var m = o[Symbol.asyncIterator], i;\r\n return m ? m.call(o) : (o = typeof __values === \"function\" ? __values(o) : o[Symbol.iterator](), i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i);\r\n function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }\r\n function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }\r\n}\r\n\r\nexport function __makeTemplateObject(cooked, raw) {\r\n if (Object.defineProperty) { Object.defineProperty(cooked, \"raw\", { value: raw }); } else { cooked.raw = raw; }\r\n return cooked;\r\n};\r\n\r\nvar __setModuleDefault = Object.create ? 
(function(o, v) {\r\n Object.defineProperty(o, \"default\", { enumerable: true, value: v });\r\n}) : function(o, v) {\r\n o[\"default\"] = v;\r\n};\r\n\r\nexport function __importStar(mod) {\r\n if (mod && mod.__esModule) return mod;\r\n var result = {};\r\n if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\r\n __setModuleDefault(result, mod);\r\n return result;\r\n}\r\n\r\nexport function __importDefault(mod) {\r\n return (mod && mod.__esModule) ? mod : { default: mod };\r\n}\r\n\r\nexport function __classPrivateFieldGet(receiver, privateMap) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to get private field on non-instance\");\r\n }\r\n return privateMap.get(receiver);\r\n}\r\n\r\nexport function __classPrivateFieldSet(receiver, privateMap, value) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to set private field on non-instance\");\r\n }\r\n privateMap.set(receiver, value);\r\n return value;\r\n}\r\n",null,null,null,null],"names":[],"mappings":";;;;;;IAAA;IACA;AACA;IACA;IACA;AACA;IACA;IACA;IACA;IACA;IACA;IACA;IACA;IACA;IACA;AACA;IACA,IAAI,aAAa,GAAG,SAAS,CAAC,EAAE,CAAC,EAAE;IACnC,IAAI,aAAa,GAAG,MAAM,CAAC,cAAc;IACzC,SAAS,EAAE,SAAS,EAAE,EAAE,EAAE,YAAY,KAAK,IAAI,UAAU,CAAC,EAAE,CAAC,EAAE,EAAE,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC,EAAE,CAAC;IACpF,QAAQ,UAAU,CAAC,EAAE,CAAC,EAAE,EAAE,KAAK,IAAI,CAAC,IAAI,CAAC,EAAE,IAAI,CAAC,CAAC,cAAc,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,GAAG,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC;IACnF,IAAI,OAAO,aAAa,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;IAC/B,CAAC,CAAC;AACF;IACO,SAAS,SAAS,CAAC,CAAC,EAAE,CAAC,EAAE;IAChC,IAAI,aAAa,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC;IACxB,IAAI,SAAS,EAAE,GAAG,EAAE,IAAI,CAAC,WAAW,GAAG,CAAC,CAAC,EAAE;IAC3C,IAAI,CAAC,CAAC,SAAS,GAAG,CAAC,KAAK,IAAI,GAAG,MAAM,CAAC,MAAM,CAAC,CAAC,CAAC,IAAI,EAAE,CAAC,SAAS,GAAG,CAAC,CAAC,SAAS,EAAE,IAAI,EAAE,EAAE,CAAC,CAAC;IACzF;;ICzBO,IAAM,WAAW,GAAG,UAAQ,IAAI;QACrC,IAAM,UAAU,GAAG,UAAU,GAAG,IAAI,GAAG,GAAG,CAAC;QAC3C,OAAO,UAAS,KAAK;YACnB,OAAO,YAAY,CAAC,KAAK,CAAC,KAAK,UAAU,CAAC;SAC3C,CAAC;IACJ,CAAC,CAAC;IAEF,IAAM,YAAY,GAAG,UAAA,KAAK,IAAI,OAAA,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,KAAK,CAAC,GAAA,CAAC;IAE7D,IAAM,UAAU,GAAG,UAAC,KAAU;QACnC,IAAI,KAAK,YAAY,IAAI,EAAE;YACzB,OAAO,KAAK,CAAC,OAAO,EAAE,CAAC;SACxB;aAAM,IAAI,OAAO,CAAC,KAAK,CAAC,EAAE;YACzB,OAAO,KAAK,CAAC,GAAG,CAAC,UAAU,CAAC,CAAC;SAC9B;aAAM,IAAI,KAAK,IAAI,OAAO,KAAK,CAAC,MAAM,KAAK,UAAU,EAAE;YACtD,OAAO,KAAK,CAAC,MAAM,EAAE,CAAC;SACvB;QAED,OAAO,KAAK,CAAC;IACf,CAAC,CAAC;IAEK,IAAM,OAAO,GAAG,WAAW,CAAa,OAAO,CAAC,CAAC;IACjD,IAAM,QAAQ,GAAG,WAAW,CAAS,QAAQ,CAAC,CAAC;IAC/C,IAAM,UAAU,GAAG,WAAW,CAAW,UAAU,CAAC,CAAC;IACrD,IAAM,eAAe,GAAG,UAAA,KAAK;QAClC,QACE,KAAK;aACJ,KAAK,CAAC,WAAW,KAAK,MAAM;gBAC3B,KAAK,CAAC,WAAW,KAAK,KAAK;gBAC3B,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,qCAAqC;gBACtE,KAAK,CAAC,WAAW,CAAC,QAAQ,EAAE,KAAK,oCAAoC,CAAC;YACxE,CAAC,KAAK,CAAC,MAAM,EACb;IACJ,CAAC,CAAC;IAEK,IAAM,MAAM,GAAG,UAAC,CAAC,EAAE,CAAC;QACzB,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;YACvB,OAAO,IAAI,CAAC;SACb;QACD,IAAI,CAAC,KAAK,CAAC,EAAE;YACX,OAAO,IAAI,CAAC;SACb;QAED,IAAI,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,KAAK,MAAM,CAAC,SAAS,CAAC,QAAQ,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE;YAC3E,OAAO,KAAK,CAAC;SACd;QAED,IAAI,OAAO,CAAC,CAAC,CAAC,EAAE;YACd,IAAI,CAAC,CAAC,MAAM,KAAK,CAAC,CAAC,MAAM,EAAE;gBACzB,OAAO,KAAK,CAAC;aACd;YACD,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,mBAAM,EAAQ,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC/C,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,CAAC,CAAC,EAAE,CAAC,CAAC,CAAC,CAAC,CAAC;oBAAE,OAAO,KAAK,CAAC;aACvC;YACD,OAAO,IAAI,CAAC;SACb;aA
AM,IAAI,QAAQ,CAAC,CAAC,CAAC,EAAE;YACtB,IAAI,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,KAAK,MAAM,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,MAAM,EAAE;gBACnD,OAAO,KAAK,CAAC;aACd;YACD,KAAK,IAAM,GAAG,IAAI,CAAC,EAAE;gBACnB,IAAI,CAAC,MAAM,CAAC,CAAC,CAAC,GAAG,CAAC,EAAE,CAAC,CAAC,GAAG,CAAC,CAAC;oBAAE,OAAO,KAAK,CAAC;aAC3C;YACD,OAAO,IAAI,CAAC;SACb;QACD,OAAO,KAAK,CAAC;IACf,CAAC;;ICMD;;;;IAKA,IAAM,iBAAiB,GAAG,UACxB,IAAS,EACT,OAAc,EACd,IAAY,EACZ,KAAa,EACb,GAAQ,EACR,KAAU;QAEV,IAAM,UAAU,GAAG,OAAO,CAAC,KAAK,CAAC,CAAC;;;QAIlC,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,KAAK,CAAC,MAAM,CAAC,UAAU,CAAC,CAAC,EAAE;YAC9C,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,sBAAM,EAAW,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;;;gBAGlD,IAAI,CAAC,iBAAiB,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,OAAO,EAAE,IAAI,EAAE,KAAK,EAAE,CAAC,EAAE,IAAI,CAAC,EAAE;oBAC9D,OAAO,KAAK,CAAC;iBACd;aACF;SACF;QAED,IAAI,KAAK,KAAK,OAAO,CAAC,MAAM,IAAI,IAAI,IAAI,IAAI,EAAE;YAC5C,OAAO,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;SAC/B;QAED,OAAO,iBAAiB,CACtB,IAAI,CAAC,UAAU,CAAC,EAChB,OAAO,EACP,IAAI,EACJ,KAAK,GAAG,CAAC,EACT,UAAU,EACV,IAAI,CACL,CAAC;IACJ,CAAC,CAAC;IAEF;QAGE,uBACW,MAAe,EACf,WAAgB,EAChB,OAAgB;YAFhB,WAAM,GAAN,MAAM,CAAS;YACf,gBAAW,GAAX,WAAW,CAAK;YAChB,YAAO,GAAP,OAAO,CAAS;YAEzB,IAAI,CAAC,IAAI,EAAE,CAAC;SACb;QACS,4BAAI,GAAd,eAAmB;QACnB,6BAAK,GAAL;YACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;QAEH,oBAAC;IAAD,CAAC,IAAA;IAED;QACU,sCAA6B;QAErC,4BACE,MAAe,EACf,WAAgB,EAChB,OAAgB,EACP,IAAY;YAJvB,YAME,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,SACpC;YAHU,UAAI,GAAJ,IAAI,CAAQ;;SAGtB;QACH,yBAAC;IAAD,CAXA,CACU,aAAa,GAUtB;IAED;QAAsC,kCAAkB;QAItD,wBACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EACA,QAA0B;YAJ5C,YAME,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,SACpC;YAHiB,cAAQ,GAAR,QAAQ,CAAkB;;SAG3C;;;QAKD,8BAAK,GAAL;YACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC3D,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;aAC1B;SACF;;;QAOS,qCAAY,GAAtB,UAAuB,IAAS,EAAE,GAAQ,EAAE,KAAU;YACpD,IAAI,IAAI,GAAG,IAAI,CAAC;YAChB,IAAI,IAAI,GAAG,IAAI,CAAC;YAChB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC3D,IAAM,cAAc,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;gBACxC,cAAc,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;gBACtC,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;oBACxB,IAAI,GAAG,KAAK,CAAC;iBACd;gBACD,IAAI,cAAc,CAAC,IAAI,EAAE;oBACvB,IAAI,CAAC,cAAc,CAAC,IAAI,EAAE;wBACxB,MAAM;qBACP;iBACF;qBAAM;oBACL,IAAI,GAAG,KAAK,CAAC;iBACd;aACF;YACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACH,qBAAC;IAAD,CAjDA,CAAsC,aAAa,GAiDlD;IAED;QAAkD,uCAAc;QAE9D,6BACE,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B,EACjB,IAAY;YALvB,YAOE,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,SAC9C;YAHU,UAAI,GAAJ,IAAI,CAAQ;;SAGtB;QACH,0BAAC;IAAD,CAXA,CAAkD,cAAc,GAW/D;IAED;QAA2C,kCAAc;QAAzD;;SAOC;;;QAHC,6BAAI,GAAJ,UAAK,IAAW,EAAE,GAAQ,EAAE,MAAW;YACrC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,CAAC;SACtC;QACH,qBAAC;IAAD,CAPA,CAA2C,cAAc,GAOxD;IAED;QAAqC,mCAAc;QACjD,yBACW,OAAc,EACvB,MAAW,EACX,WAAgB,EAChB,OAAgB,EAChB,QAA0B;YAL5B,YAOE,kBAAM,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,QAAQ,CAAC,SAC9C;YAPU,aAAO,GAAP,OAAO,CAAO;;;YAyBjB,sBAAgB,GAAG,UAAC,KAAU,EAAE,GAAQ,EAAE,KAAU;gBAC1D,KAAI,CAAC,YAAY,CAAC,KAAK,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;gBACrC,OAAO,CAAC,KAAI,CAAC,IAAI,CAAC;aACnB,CAAC;;SArBD;;;QAID,8BAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,MAAW;YACnC,iBAAiB,CACf,IAAI,EACJ,IAAI,CAAC,OAAO,EACZ,IAAI,CAAC,gBAAgB,EACrB,CAAC,EACD,GAAG,EACH,MAAM,CACP,CAAC;SACH;QASH,sBAAC;IAAD,CA/BA,CAAqC,cAAc,GA+BlD;IAEM,IAAM,YAAY,GAAG,UAAC,CAAC,EAAE,OAAmB;QACjD,IAAI,
CAAC,YAAY,QAAQ,EAAE;YACzB,OAAO,CAAC,CAAC;SACV;QACD,IAAI,CAAC,YAAY,MAAM,EAAE;YACvB,OAAO,UAAA,CAAC;gBACN,IAAM,MAAM,GAAG,OAAO,CAAC,KAAK,QAAQ,IAAI,CAAC,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;gBAClD,CAAC,CAAC,SAAS,GAAG,CAAC,CAAC;gBAChB,OAAO,MAAM,CAAC;aACf,CAAC;SACH;QACD,IAAM,WAAW,GAAG,UAAU,CAAC,CAAC,CAAC,CAAC;QAClC,OAAO,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,WAAW,EAAE,UAAU,CAAC,CAAC,CAAC,CAAC,GAAA,CAAC;IAClD,CAAC,CAAC;;QAE2C,mCAAqB;QAAlE;;SAaC;QAXC,8BAAI,GAAJ;YACE,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAC9D;QACD,8BAAI,GAAJ,UAAK,IAAI,EAAE,GAAQ,EAAE,MAAW;YAC9B,IAAI,CAAC,KAAK,CAAC,OAAO,CAAC,MAAM,CAAC,IAAI,MAAM,CAAC,cAAc,CAAC,GAAG,CAAC,EAAE;gBACxD,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,EAAE,GAAG,EAAE,MAAM,CAAC,EAAE;oBACjC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;oBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;iBAClB;aACF;SACF;QACH,sBAAC;IAAD,CAbA,CAA6C,aAAa,GAazD;QAEY,qBAAqB,GAAG,UACnC,MAAW,EACX,WAAgB,EAChB,OAAgB,IACb,OAAA,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,IAAC;IAEvD;QAA2C,iCAAqB;QAAhE;;SAKC;QAJC,4BAAI,GAAJ;YACE,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;YACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;SACnB;QACH,oBAAC;IAAD,CALA,CAA2C,aAAa,GAKvD;IAEM,IAAM,yBAAyB,GAAG,UACvC,wBAA+C,IAC5C,OAAA,UAAC,MAAW,EAAE,WAAgB,EAAE,OAAgB,EAAE,IAAY;QACjE,IAAI,MAAM,IAAI,IAAI,EAAE;YAClB,OAAO,IAAI,aAAa,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC,CAAC;SACxD;QAED,OAAO,wBAAwB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;IACtE,CAAC,GAAA,CAAC;IAEK,IAAM,kBAAkB,GAAG,UAAC,YAA6B;QAC9D,OAAA,yBAAyB,CACvB,UAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;YACrD,IAAM,YAAY,GAAG,OAAO,UAAU,CAAC,MAAM,CAAC,CAAC;YAC/C,IAAM,IAAI,GAAG,YAAY,CAAC,MAAM,CAAC,CAAC;YAClC,OAAO,IAAI,eAAe,CACxB,UAAA,CAAC;gBACC,OAAO,OAAO,UAAU,CAAC,CAAC,CAAC,KAAK,YAAY,IAAI,IAAI,CAAC,CAAC,CAAC,CAAC;aACzD,EACD,WAAW,EACX,OAAO,CACR,CAAC;SACH,CACF;IAZD,CAYC,CAAC;IASJ,IAAM,oBAAoB,GAAG,UAC3B,IAAY,EACZ,MAAW,EACX,WAAgB,EAChB,OAAgB;QAEhB,IAAM,gBAAgB,GAAG,OAAO,CAAC,UAAU,CAAC,IAAI,CAAC,CAAC;QAClD,IAAI,CAAC,gBAAgB,EAAE;YACrB,MAAM,IAAI,KAAK,CAAC,4BAA0B,IAAM,CAAC,CAAC;SACnD;QACD,OAAO,gBAAgB,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,CAAC;IAC9D,CAAC,CAAC;IAEK,IAAM,iBAAiB,GAAG,UAAC,KAAU;QAC1C,KAAK,IAAM,GAAG,IAAI,KAAK,EAAE;YACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG;gBAAE,OAAO,IAAI,CAAC;SACxC;QACD,OAAO,KAAK,CAAC;IACf,CAAC,CAAC;IACF,IAAM,qBAAqB,GAAG,UAC5B,OAAc,EACd,WAAgB,EAChB,WAAgB,EAChB,OAAgB;QAEhB,IAAI,iBAAiB,CAAC,WAAW,CAAC,EAAE;YAC5B,IAAA,gDAGL,EAHM,sBAAc,EAAE,wBAGtB,CAAC;YACF,IAAI,gBAAgB,CAAC,MAAM,EAAE;gBAC3B,MAAM,IAAI,KAAK,CACb,kEAAkE,CACnE,CAAC;aACH;YACD,OAAO,IAAI,eAAe,CACxB,OAAO,EACP,WAAW,EACX,WAAW,EACX,OAAO,EACP,cAAc,CACf,CAAC;SACH;QACD,OAAO,IAAI,eAAe,CAAC,OAAO,EAAE,WAAW,EAAE,WAAW,EAAE,OAAO,EAAE;YACrE,IAAI,eAAe,CAAC,WAAW,EAAE,WAAW,EAAE,OAAO,CAAC;SACvD,CAAC,CAAC;IACL,CAAC,CAAC;QAEW,oBAAoB,GAAG,UAClC,KAAqB,EACrB,WAAuB,EACvB,EAA8C;QAD9C,4BAAA,EAAA,kBAAuB;YACvB,4BAA8C,EAA5C,oBAAO,EAAE,0BAAU;QAErB,IAAM,OAAO,GAAG;YACd,OAAO,EAAE,OAAO,IAAI,MAAM;YAC1B,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,UAAU,IAAI,EAAE,CAAC;SAChD,CAAC;QAEI,IAAA,0CAGL,EAHM,sBAAc,EAAE,wBAGtB,CAAC;QAEF,IAAM,GAAG,GAAG,EAAE,CAAC;QAEf,IAAI,cAAc,CAAC,MAAM,EAAE;YACzB,GAAG,CAAC,IAAI,CACN,IAAI,eAAe,CAAC,EAAE,EAAE,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,cAAc,CAAC,CACrE,CAAC;SACH;QAED,GAAG,CAAC,IAAI,OAAR,GAAG,EAAS,gBAAgB,EAAE;QAE9B,IAAI,GAAG,CAAC,MAAM,KAAK,CAAC,EAAE;YACpB,OAAO,GAAG,CAAC,CAAC,CAAC,CAAC;SACf;QACD,OAAO,IAAI,cAAc,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,EAAE,GAAG,CAAC,CAAC;IAC9D,EAAE;IAEF,IAAM,qBAAqB,GAAG,UAAC,KAAU,EAAE,OAAgB;QACzD,IAAM,cAAc,GAAG,EAAE,CAAC;QAC1B,IAAM,gBAAgB,GAAG,EAAE,CAAC;QAC5B,IAAI,CAAC,eAAe,CAAC,KAAK,CAAC,EAAE;YAC3B,cAAc,CAAC,IAAI,CAAC,IAAI,eAAe,C
AAC,KAAK,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC,CAAC;YAChE,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;SAC3C;QACD,KAAK,IAAM,GAAG,IAAI,KAAK,EAAE;YACvB,IAAI,GAAG,CAAC,MAAM,CAAC,CAAC,CAAC,KAAK,GAAG,EAAE;gBACzB,IAAM,EAAE,GAAG,oBAAoB,CAAC,GAAG,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAAC;;gBAGjE,IAAI,EAAE,IAAI,IAAI,EAAE;oBACd,cAAc,CAAC,IAAI,CAAC,EAAE,CAAC,CAAC;iBACzB;aACF;iBAAM;gBACL,gBAAgB,CAAC,IAAI,CACnB,qBAAqB,CAAC,GAAG,CAAC,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,CAAC,GAAG,CAAC,EAAE,KAAK,EAAE,OAAO,CAAC,CAClE,CAAC;aACH;SACF;QAED,OAAO,CAAC,cAAc,EAAE,gBAAgB,CAAC,CAAC;IAC5C,CAAC,CAAC;QAEW,qBAAqB,GAAG,UAAQ,SAA2B,IAAK,OAAA,UAC3E,IAAW,EACX,GAAS,EACT,KAAW;QAEX,SAAS,CAAC,KAAK,EAAE,CAAC;QAClB,SAAS,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;QACjC,OAAO,SAAS,CAAC,IAAI,CAAC;IACxB,CAAC,IAAC;QAEW,iBAAiB,GAAG,UAC/B,KAAqB,EACrB,OAA8B;QAA9B,wBAAA,EAAA,YAA8B;QAE9B,OAAO,qBAAqB,CAC1B,oBAAoB,CAAiB,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAC3D,CAAC;IACJ;;IC9aA;QAAkB,uBAAuB;QAAzC;;SAeC;QAbC,kBAAI,GAAJ;YACE,IAAI,CAAC,KAAK,GAAG,YAAY,CAAC,IAAI,CAAC,MAAM,EAAE,IAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;SAC9D;QACD,mBAAK,GAAL;YACE,iBAAM,KAAK,WAAE,CAAC;YACd,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACD,kBAAI,GAAJ,UAAK,IAAS;YACZ,IAAI,IAAI,CAAC,KAAK,CAAC,IAAI,CAAC,EAAE;gBACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;aACnB;SACF;QACH,UAAC;IAAD,CAfA,CAAkB,kBAAkB,GAenC;IACD;IACA;QAAyB,8BAA8B;QAAvD;;SA8BC;QA5BC,yBAAI,GAAJ;YACE,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;SACH;QACD,0BAAK,GAAL;YACE,iBAAM,KAAK,WAAE,CAAC;YACd,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;SAC9B;QACD,yBAAI,GAAJ,UAAK,IAAS;YACZ,IAAI,OAAO,CAAC,IAAI,CAAC,EAAE;gBACjB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,sBAAM,EAAW,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;;;oBAGlD,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;;oBAG7B,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,EAAE,CAAC,EAAE,IAAI,CAAC,CAAC;oBAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,IAAI,IAAI,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;iBACpD;gBACD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;iBAAM;gBACL,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;gBAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;aACnB;SACF;QACH,iBAAC;IAAD,CA9BA,CAAyB,kBAAkB,GA8B1C;IAED;QAAmB,wBAA8B;QAAjD;;SAiBC;QAfC,mBAAI,GAAJ;YACE,IAAI,CAAC,eAAe,GAAG,oBAAoB,CACzC,IAAI,CAAC,MAAM,EACX,IAAI,CAAC,WAAW,EAChB,IAAI,CAAC,OAAO,CACb,CAAC;SACH;QACD,oBAAK,GAAL;YACE,IAAI,CAAC,eAAe,CAAC,KAAK,EAAE,CAAC;SAC9B;QACD,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC5C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;YACtC,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,eAAe,CAAC,IAAI,CAAC;SACxC;QACH,WAAC;IAAD,CAjBA,CAAmB,kBAAkB,GAiBpC;;QAE0B,yBAAuB;QAAlD;;SAYC;QAXC,oBAAI,GAAJ,eAAS;QACT,oBAAI,GAAJ,UAAK,IAAI;YACP,IAAI,OAAO,CAAC,IAAI,CAAC,IAAI,IAAI,CAAC,MAAM,KAAK,IAAI,CAAC,MAAM,EAAE;gBAChD,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;;;;;SAKF;QACH,YAAC;IAAD,CAZA,CAA2B,kBAAkB,GAY5C;IAED;QAAkB,uBAAuB;QAAzC;;SA8BC;QA5BC,kBAAI,GAAJ;YAAA,iBAIC;YAHC,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,UAAA,EAAE;gBAC5B,OAAA,oBAAoB,CAAC,EAAE,EAAE,IAAI,EAAE,KAAI,CAAC,OAAO,CAAC;aAAA,CAC7C,CAAC;SACH;QACD,mBAAK,GAAL;YACE,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,IAAI,CAAC,IAAI,GAAG,KAAK,CAAC;YAClB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,2BAAM,EAAgB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBACvD,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,KAAK,EAAE,CAAC;aACtB;SACF;QACD,kBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,IAAI,GAAG,KAAK,CAAC;YACjB,IAAI,OAAO,GAAG,KAAK,CAAC;YACpB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,2BAAM,EAAgB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBACvD,IAAM,
EAAE,GAAG,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC;gBACxB,EAAE,CAAC,IAAI,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;gBAC1B,IAAI,EAAE,CAAC,IAAI,EAAE;oBACX,IAAI,GAAG,IAAI,CAAC;oBACZ,OAAO,GAAG,EAAE,CAAC,IAAI,CAAC;oBAClB,MAAM;iBACP;aACF;YAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;YACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACH,UAAC;IAAD,CA9BA,CAAkB,kBAAkB,GA8BnC;IAED;QAAmB,wBAAG;QAAtB;;SAKC;QAJC,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,iBAAM,IAAI,YAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;SACxB;QACH,WAAC;IAAD,CALA,CAAmB,GAAG,GAKrB;IAED;QAAkB,uBAAuB;QAAzC;;SA2BC;QAzBC,kBAAI,GAAJ;YAAA,iBASC;YARC,IAAI,CAAC,QAAQ,GAAG,IAAI,CAAC,MAAM,CAAC,GAAG,CAAC,UAAA,KAAK;gBACnC,IAAI,iBAAiB,CAAC,KAAK,CAAC,EAAE;oBAC5B,MAAM,IAAI,KAAK,CACb,yBAAuB,KAAI,CAAC,WAAW,CAAC,IAAI,CAAC,WAAW,EAAI,CAC7D,CAAC;iBACH;gBACD,OAAO,YAAY,CAAC,KAAK,EAAE,KAAI,CAAC,OAAO,CAAC,OAAO,CAAC,CAAC;aAClD,CAAC,CAAC;SACJ;QACD,kBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,IAAI,GAAG,KAAK,CAAC;YACjB,IAAI,OAAO,GAAG,KAAK,CAAC;YACpB,KAAS,IAAA,CAAC,GAAG,CAAC,EAAI,+BAAM,EAAoB,CAAC,GAAG,QAAM,EAAE,CAAC,EAAE,EAAE;gBAC3D,IAAM,IAAI,GAAG,IAAI,CAAC,QAAQ,CAAC,CAAC,CAAC,CAAC;gBAC9B,IAAI,IAAI,CAAC,IAAI,CAAC,EAAE;oBACd,IAAI,GAAG,IAAI,CAAC;oBACZ,OAAO,GAAG,IAAI,CAAC;oBACf,MAAM;iBACP;aACF;YAED,IAAI,CAAC,IAAI,GAAG,OAAO,CAAC;YACpB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;SAClB;QACH,UAAC;IAAD,CA3BA,CAAkB,kBAAkB,GA2BnC;IAED;QAAmB,wBAAG;QAAtB;;SAKC;QAJC,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,iBAAM,IAAI,YAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;YAC7B,IAAI,CAAC,IAAI,GAAG,CAAC,IAAI,CAAC,IAAI,CAAC;SACxB;QACH,WAAC;IAAD,CALA,CAAmB,GAAG,GAKrB;IAED;QAAsB,2BAA2B;QAAjD;;SAOC;QANC,sBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,KAAK,CAAC,cAAc,CAAC,GAAG,CAAC,KAAK,IAAI,CAAC,MAAM,EAAE;gBAC7C,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;gBACjB,IAAI,CAAC,IAAI,GAAG,IAAI,CAAC;aAClB;SACF;QACH,cAAC;IAAD,CAPA,CAAsB,kBAAkB,GAOvC;IAED;QAAmB,wBAAmB;QACpC,cACE,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY;mBAEZ,kBACE,MAAM,EACN,WAAW,EACX,OAAO,EACP,MAAM,CAAC,GAAG,CAAC,UAAA,KAAK,IAAI,OAAA,oBAAoB,CAAC,KAAK,EAAE,WAAW,EAAE,OAAO,CAAC,GAAA,CAAC,EACtE,IAAI,CACL;SACF;QACD,mBAAI,GAAJ,UAAK,IAAS,EAAE,GAAQ,EAAE,KAAU;YAClC,IAAI,CAAC,YAAY,CAAC,IAAI,EAAE,GAAG,EAAE,KAAK,CAAC,CAAC;SACrC;QACH,WAAC;IAAD,CAlBA,CAAmB,mBAAmB,GAkBrC;QAEY,GAAG,GAAG,UAAC,MAAW,EAAE,WAAuB,EAAE,OAAgB;QACxE,OAAA,IAAI,eAAe,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,CAAC;IAAjD,EAAkD;QACvC,GAAG,GAAG,UACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACpC,GAAG,GAAG,UACjB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACpC,IAAI,GAAG,UAClB,MAAoB,EACpB,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACrC,UAAU,GAAG,UACxB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,UAAU,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QAC3C,IAAI,GAAG,UAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACrC,GAAG,GAAG,UACjB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,GAAG,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QAEpC,GAAG,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,GAAG,MAAM,GAAA,GAAA,EAAE;QACpD,IAAI,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,IAAI,MAAM,GAAA,GAAA,EAAE;QACtD,GAAG,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,GAAG,MAAM,GAAA,GAAA,EAAE;QACpD,IAAI,GAAG,kBAAkB,CAAC,UAAA,MAAM,IAAI,OAAA,UAAA,CAAC,IAAI,OAAA,CAAC,IAAI,MAAM,GAAA,GAAA,EAAE;Q
ACtD,IAAI,GAAG,UAClB,EAA4B,EAC5B,WAAuB,EACvB,OAAgB;YAFf,WAAG,EAAE,mBAAW;QAIjB,OAAA,IAAI,eAAe,CACjB,UAAA,CAAC,IAAI,OAAA,UAAU,CAAC,CAAC,CAAC,GAAG,GAAG,KAAK,WAAW,GAAA,EACxC,WAAW,EACX,OAAO,CACR;IAJD,EAIE;QACS,OAAO,GAAG,UACrB,MAAe,EACf,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,OAAO,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACxC,MAAM,GAAG,UACpB,OAAe,EACf,WAAuB,EACvB,OAAgB;QAEhB,OAAA,IAAI,eAAe,CACjB,IAAI,MAAM,CAAC,OAAO,EAAE,WAAW,CAAC,QAAQ,CAAC,EACzC,WAAW,EACX,OAAO,CACR;IAJD,EAIE;QACS,IAAI,GAAG,UAClB,MAAW,EACX,WAAuB,EACvB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,WAAW,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;IAElD,IAAM,WAAW,GAAG;QAClB,MAAM,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,QAAQ,GAAA;QAClC,MAAM,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,QAAQ,GAAA;QAClC,IAAI,EAAE,UAAA,CAAC,IAAI,OAAA,OAAO,CAAC,KAAK,SAAS,GAAA;QACjC,KAAK,EAAE,UAAA,CAAC,IAAI,OAAA,KAAK,CAAC,OAAO,CAAC,CAAC,CAAC,GAAA;QAC5B,IAAI,EAAE,UAAA,CAAC,IAAI,OAAA,CAAC,KAAK,IAAI,GAAA;QACrB,SAAS,EAAE,UAAA,CAAC,IAAI,OAAA,CAAC,YAAY,IAAI,GAAA;KAClC,CAAC;QAEW,KAAK,GAAG,UACnB,KAAwB,EACxB,WAAuB,EACvB,OAAgB;QAEhB,OAAA,IAAI,eAAe,CACjB,UAAA,CAAC;YACC,IAAI,OAAO,KAAK,KAAK,QAAQ,EAAE;gBAC7B,IAAI,CAAC,WAAW,CAAC,KAAK,CAAC,EAAE;oBACvB,MAAM,IAAI,KAAK,CAAC,2BAA2B,CAAC,CAAC;iBAC9C;gBAED,OAAO,WAAW,CAAC,KAAK,CAAC,CAAC,CAAC,CAAC,CAAC;aAC9B;YAED,OAAO,CAAC,IAAI,IAAI,GAAG,CAAC,YAAY,KAAK,IAAI,CAAC,CAAC,WAAW,KAAK,KAAK,GAAG,KAAK,CAAC;SAC1E,EACD,WAAW,EACX,OAAO,CACR;IAdD,EAcE;QACS,IAAI,GAAG,UAClB,MAAoB,EACpB,UAAsB,EACtB,OAAgB,EAChB,IAAY,IACT,OAAA,IAAI,IAAI,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,IAAI,CAAC,IAAC;QACpC,IAAI,GAAG,KAAK;QACZ,KAAK,GAAG,UACnB,MAAc,EACd,UAAsB,EACtB,OAAgB,IACb,OAAA,IAAI,KAAK,CAAC,MAAM,EAAE,UAAU,EAAE,OAAO,EAAE,OAAO,CAAC,IAAC;QACxC,QAAQ,GAAG,cAAM,OAAA,IAAI,IAAC;QACtB,MAAM,GAAG,UACpB,MAAyB,EACzB,UAAsB,EACtB,OAAgB;QAEhB,IAAI,IAAI,CAAC;QAET,IAAI,UAAU,CAAC,MAAM,CAAC,EAAE;YACtB,IAAI,GAAG,MAAM,CAAC;SACf;aAEM;YACL,MAAM,IAAI,KAAK,CACb,oEAAkE,CACnE,CAAC;SACH;QAED,OAAO,IAAI,eAAe,CAAC,UAAA,CAAC,IAAI,OAAA,IAAI,CAAC,IAAI,CAAC,CAAC,CAAC,CAAC,CAAC,CAAC,GAAA,EAAE,UAAU,EAAE,OAAO,CAAC,CAAC;IACxE;;;;;;;;;;;;;;;;;;;;;;;;;;;;QCxUM,2BAA2B,GAAG,UAClC,KAAqB,EACrB,UAAe,EACf,EAA8C;YAA9C,4BAA8C,EAA5C,oBAAO,EAAE,0BAAU;QAErB,OAAO,oBAAoB,CAAC,KAAK,EAAE,UAAU,EAAE;YAC7C,OAAO,EAAE,OAAO;YAChB,UAAU,EAAE,MAAM,CAAC,MAAM,CAAC,EAAE,EAAE,iBAAiB,EAAE,UAAU,IAAI,EAAE,CAAC;SACnE,CAAC,CAAC;IACL,EAAE;IAEF,IAAM,wBAAwB,GAAG,UAC/B,KAAqB,EACrB,OAA8B;QAA9B,wBAAA,EAAA,YAA8B;QAE9B,IAAM,EAAE,GAAG,2BAA2B,CAAC,KAAK,EAAE,IAAI,EAAE,OAAO,CAAC,CAAC;QAC7D,OAAO,qBAAqB,CAAC,EAAE,CAAC,CAAC;IACnC,CAAC;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;"} \ No newline at end of file diff --git a/node_modules/sift/sift.min.js b/node_modules/sift/sift.min.js new file mode 100644 index 000000000..a02ffcba4 --- /dev/null +++ b/node_modules/sift/sift.min.js @@ -0,0 +1,16 @@ +!function(n,t){"object"==typeof exports&&"undefined"!=typeof module?t(exports):"function"==typeof define&&define.amd?define(["exports"],t):t((n=n||self).sift={})}(this,(function(n){"use strict"; +/*! ***************************************************************************** + Copyright (c) Microsoft Corporation. + + Permission to use, copy, modify, and/or distribute this software for any + purpose with or without fee is hereby granted. + + THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH + REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY + AND FITNESS. 
IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT, + INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM + LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR + OTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR + PERFORMANCE OF THIS SOFTWARE. + ***************************************************************************** */var t=function(n,r){return(t=Object.setPrototypeOf||{__proto__:[]}instanceof Array&&function(n,t){n.__proto__=t}||function(n,t){for(var r in t)t.hasOwnProperty(r)&&(n[r]=t[r])})(n,r)};function r(n,r){function i(){this.constructor=n}t(n,r),n.prototype=null===r?Object.create(r):(i.prototype=r.prototype,new i)}var i=function(n){var t="[object "+n+"]";return function(n){return u(n)===t}},u=function(n){return Object.prototype.toString.call(n)},e=function(n){return n instanceof Date?n.getTime():o(n)?n.map(e):n&&"function"==typeof n.toJSON?n.toJSON():n},o=i("Array"),f=i("Object"),c=i("Function"),s=function(n,t){if(null==n&&n==t)return!0;if(n===t)return!0;if(Object.prototype.toString.call(n)!==Object.prototype.toString.call(t))return!1;if(o(n)){if(n.length!==t.length)return!1;for(var r=0,i=n.length;rn}})),L=j((function(n){return function(t){return t>=n}})),Q=function(n,t,r){var i=n[0],u=n[1];return new d((function(n){return e(n)%i===u}),t,r)},V=function(n,t,r,i){return new P(n,t,r,i)},W=function(n,t,r){return new d(new RegExp(n,t.$options),t,r)},X=function(n,t,r,i){return new z(n,t,r,i)},Y={number:function(n){return"number"==typeof n},string:function(n){return"string"==typeof n},bool:function(n){return"boolean"==typeof n},array:function(n){return Array.isArray(n)},null:function(n){return null===n},timestamp:function(n){return n instanceof Date}},Z=function(n,t,r){return new d((function(t){if("string"==typeof n){if(!Y[n])throw new Error("Type alias does not exist");return Y[n](t)}return null!=t&&(t instanceof n||t.constructor===n)}),t,r)},nn=function(n,t,r,i){return new R(n,t,r,i)},tn=nn,rn=function(n,t,r){return new F(n,t,r,"$size")},un=function(){return null},en=function(n,t,r){var i;if(c(n))i=n;else{if(process.env.CSP_ENABLED)throw new Error('In CSP mode, sift does not support strings in "$where" condition');i=new Function("obj","return "+n)}return new d((function(n){return i.bind(n)(n)}),t,r)},on=Object.freeze({__proto__:null,$Size:F,$eq:S,$ne:C,$or:I,$nor:T,$elemMatch:U,$nin:B,$in:G,$lt:H,$lte:J,$gt:K,$gte:L,$mod:Q,$exists:V,$regex:W,$not:X,$type:Z,$and:nn,$all:tn,$size:rn,$options:un,$where:en}),fn=function(n,t,r){var i=void 0===r?{}:r,u=i.compare,e=i.operations;return _(n,t,{compare:u,operations:Object.assign({},on,e||{})})};n.$Size=F,n.$all=tn,n.$and=nn,n.$elemMatch=U,n.$eq=S,n.$exists=V,n.$gt=K,n.$gte=L,n.$in=G,n.$lt=H,n.$lte=J,n.$mod=Q,n.$ne=C,n.$nin=B,n.$nor=T,n.$not=X,n.$options=un,n.$or=I,n.$regex=W,n.$size=rn,n.$type=Z,n.$where=en,n.EqualsOperation=d,n.createDefaultQueryOperation=fn,n.createEqualsOperation=function(n,t,r){return new d(n,t,r)},n.createOperationTester=E,n.createQueryOperation=_,n.createQueryTester=function(n,t){return void 0===t&&(t={}),E(_(n,null,t))},n.default=function(n,t){void 0===t&&(t={});var r=fn(n,null,t);return E(r)},Object.defineProperty(n,"l",{value:!0})})); +//# sourceMappingURL=sift.min.js.map diff --git a/node_modules/sift/sift.min.js.map b/node_modules/sift/sift.min.js.map new file mode 100644 index 000000000..f979cf1ad --- /dev/null +++ b/node_modules/sift/sift.min.js.map @@ -0,0 +1 @@ 
+{"version":3,"file":"sift.min.js","sources":["node_modules/tslib/tslib.es6.js","src/utils.ts","src/core.ts","src/operations.ts","src/index.ts"],"sourcesContent":["/*! *****************************************************************************\r\nCopyright (c) Microsoft Corporation.\r\n\r\nPermission to use, copy, modify, and/or distribute this software for any\r\npurpose with or without fee is hereby granted.\r\n\r\nTHE SOFTWARE IS PROVIDED \"AS IS\" AND THE AUTHOR DISCLAIMS ALL WARRANTIES WITH\r\nREGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF MERCHANTABILITY\r\nAND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY SPECIAL, DIRECT,\r\nINDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES WHATSOEVER RESULTING FROM\r\nLOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR\r\nOTHER TORTIOUS ACTION, ARISING OUT OF OR IN CONNECTION WITH THE USE OR\r\nPERFORMANCE OF THIS SOFTWARE.\r\n***************************************************************************** */\r\n/* global Reflect, Promise */\r\n\r\nvar extendStatics = function(d, b) {\r\n extendStatics = Object.setPrototypeOf ||\r\n ({ __proto__: [] } instanceof Array && function (d, b) { d.__proto__ = b; }) ||\r\n function (d, b) { for (var p in b) if (b.hasOwnProperty(p)) d[p] = b[p]; };\r\n return extendStatics(d, b);\r\n};\r\n\r\nexport function __extends(d, b) {\r\n extendStatics(d, b);\r\n function __() { this.constructor = d; }\r\n d.prototype = b === null ? Object.create(b) : (__.prototype = b.prototype, new __());\r\n}\r\n\r\nexport var __assign = function() {\r\n __assign = Object.assign || function __assign(t) {\r\n for (var s, i = 1, n = arguments.length; i < n; i++) {\r\n s = arguments[i];\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p)) t[p] = s[p];\r\n }\r\n return t;\r\n }\r\n return __assign.apply(this, arguments);\r\n}\r\n\r\nexport function __rest(s, e) {\r\n var t = {};\r\n for (var p in s) if (Object.prototype.hasOwnProperty.call(s, p) && e.indexOf(p) < 0)\r\n t[p] = s[p];\r\n if (s != null && typeof Object.getOwnPropertySymbols === \"function\")\r\n for (var i = 0, p = Object.getOwnPropertySymbols(s); i < p.length; i++) {\r\n if (e.indexOf(p[i]) < 0 && Object.prototype.propertyIsEnumerable.call(s, p[i]))\r\n t[p[i]] = s[p[i]];\r\n }\r\n return t;\r\n}\r\n\r\nexport function __decorate(decorators, target, key, desc) {\r\n var c = arguments.length, r = c < 3 ? target : desc === null ? desc = Object.getOwnPropertyDescriptor(target, key) : desc, d;\r\n if (typeof Reflect === \"object\" && typeof Reflect.decorate === \"function\") r = Reflect.decorate(decorators, target, key, desc);\r\n else for (var i = decorators.length - 1; i >= 0; i--) if (d = decorators[i]) r = (c < 3 ? d(r) : c > 3 ? d(target, key, r) : d(target, key)) || r;\r\n return c > 3 && r && Object.defineProperty(target, key, r), r;\r\n}\r\n\r\nexport function __param(paramIndex, decorator) {\r\n return function (target, key) { decorator(target, key, paramIndex); }\r\n}\r\n\r\nexport function __metadata(metadataKey, metadataValue) {\r\n if (typeof Reflect === \"object\" && typeof Reflect.metadata === \"function\") return Reflect.metadata(metadataKey, metadataValue);\r\n}\r\n\r\nexport function __awaiter(thisArg, _arguments, P, generator) {\r\n function adopt(value) { return value instanceof P ? 
value : new P(function (resolve) { resolve(value); }); }\r\n return new (P || (P = Promise))(function (resolve, reject) {\r\n function fulfilled(value) { try { step(generator.next(value)); } catch (e) { reject(e); } }\r\n function rejected(value) { try { step(generator[\"throw\"](value)); } catch (e) { reject(e); } }\r\n function step(result) { result.done ? resolve(result.value) : adopt(result.value).then(fulfilled, rejected); }\r\n step((generator = generator.apply(thisArg, _arguments || [])).next());\r\n });\r\n}\r\n\r\nexport function __generator(thisArg, body) {\r\n var _ = { label: 0, sent: function() { if (t[0] & 1) throw t[1]; return t[1]; }, trys: [], ops: [] }, f, y, t, g;\r\n return g = { next: verb(0), \"throw\": verb(1), \"return\": verb(2) }, typeof Symbol === \"function\" && (g[Symbol.iterator] = function() { return this; }), g;\r\n function verb(n) { return function (v) { return step([n, v]); }; }\r\n function step(op) {\r\n if (f) throw new TypeError(\"Generator is already executing.\");\r\n while (_) try {\r\n if (f = 1, y && (t = op[0] & 2 ? y[\"return\"] : op[0] ? y[\"throw\"] || ((t = y[\"return\"]) && t.call(y), 0) : y.next) && !(t = t.call(y, op[1])).done) return t;\r\n if (y = 0, t) op = [op[0] & 2, t.value];\r\n switch (op[0]) {\r\n case 0: case 1: t = op; break;\r\n case 4: _.label++; return { value: op[1], done: false };\r\n case 5: _.label++; y = op[1]; op = [0]; continue;\r\n case 7: op = _.ops.pop(); _.trys.pop(); continue;\r\n default:\r\n if (!(t = _.trys, t = t.length > 0 && t[t.length - 1]) && (op[0] === 6 || op[0] === 2)) { _ = 0; continue; }\r\n if (op[0] === 3 && (!t || (op[1] > t[0] && op[1] < t[3]))) { _.label = op[1]; break; }\r\n if (op[0] === 6 && _.label < t[1]) { _.label = t[1]; t = op; break; }\r\n if (t && _.label < t[2]) { _.label = t[2]; _.ops.push(op); break; }\r\n if (t[2]) _.ops.pop();\r\n _.trys.pop(); continue;\r\n }\r\n op = body.call(thisArg, _);\r\n } catch (e) { op = [6, e]; y = 0; } finally { f = t = 0; }\r\n if (op[0] & 5) throw op[1]; return { value: op[0] ? op[1] : void 0, done: true };\r\n }\r\n}\r\n\r\nexport var __createBinding = Object.create ? (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n Object.defineProperty(o, k2, { enumerable: true, get: function() { return m[k]; } });\r\n}) : (function(o, m, k, k2) {\r\n if (k2 === undefined) k2 = k;\r\n o[k2] = m[k];\r\n});\r\n\r\nexport function __exportStar(m, exports) {\r\n for (var p in m) if (p !== \"default\" && !exports.hasOwnProperty(p)) __createBinding(exports, m, p);\r\n}\r\n\r\nexport function __values(o) {\r\n var s = typeof Symbol === \"function\" && Symbol.iterator, m = s && o[s], i = 0;\r\n if (m) return m.call(o);\r\n if (o && typeof o.length === \"number\") return {\r\n next: function () {\r\n if (o && i >= o.length) o = void 0;\r\n return { value: o && o[i++], done: !o };\r\n }\r\n };\r\n throw new TypeError(s ? 
\"Object is not iterable.\" : \"Symbol.iterator is not defined.\");\r\n}\r\n\r\nexport function __read(o, n) {\r\n var m = typeof Symbol === \"function\" && o[Symbol.iterator];\r\n if (!m) return o;\r\n var i = m.call(o), r, ar = [], e;\r\n try {\r\n while ((n === void 0 || n-- > 0) && !(r = i.next()).done) ar.push(r.value);\r\n }\r\n catch (error) { e = { error: error }; }\r\n finally {\r\n try {\r\n if (r && !r.done && (m = i[\"return\"])) m.call(i);\r\n }\r\n finally { if (e) throw e.error; }\r\n }\r\n return ar;\r\n}\r\n\r\nexport function __spread() {\r\n for (var ar = [], i = 0; i < arguments.length; i++)\r\n ar = ar.concat(__read(arguments[i]));\r\n return ar;\r\n}\r\n\r\nexport function __spreadArrays() {\r\n for (var s = 0, i = 0, il = arguments.length; i < il; i++) s += arguments[i].length;\r\n for (var r = Array(s), k = 0, i = 0; i < il; i++)\r\n for (var a = arguments[i], j = 0, jl = a.length; j < jl; j++, k++)\r\n r[k] = a[j];\r\n return r;\r\n};\r\n\r\nexport function __await(v) {\r\n return this instanceof __await ? (this.v = v, this) : new __await(v);\r\n}\r\n\r\nexport function __asyncGenerator(thisArg, _arguments, generator) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var g = generator.apply(thisArg, _arguments || []), i, q = [];\r\n return i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i;\r\n function verb(n) { if (g[n]) i[n] = function (v) { return new Promise(function (a, b) { q.push([n, v, a, b]) > 1 || resume(n, v); }); }; }\r\n function resume(n, v) { try { step(g[n](v)); } catch (e) { settle(q[0][3], e); } }\r\n function step(r) { r.value instanceof __await ? Promise.resolve(r.value.v).then(fulfill, reject) : settle(q[0][2], r); }\r\n function fulfill(value) { resume(\"next\", value); }\r\n function reject(value) { resume(\"throw\", value); }\r\n function settle(f, v) { if (f(v), q.shift(), q.length) resume(q[0][0], q[0][1]); }\r\n}\r\n\r\nexport function __asyncDelegator(o) {\r\n var i, p;\r\n return i = {}, verb(\"next\"), verb(\"throw\", function (e) { throw e; }), verb(\"return\"), i[Symbol.iterator] = function () { return this; }, i;\r\n function verb(n, f) { i[n] = o[n] ? function (v) { return (p = !p) ? { value: __await(o[n](v)), done: n === \"return\" } : f ? f(v) : v; } : f; }\r\n}\r\n\r\nexport function __asyncValues(o) {\r\n if (!Symbol.asyncIterator) throw new TypeError(\"Symbol.asyncIterator is not defined.\");\r\n var m = o[Symbol.asyncIterator], i;\r\n return m ? m.call(o) : (o = typeof __values === \"function\" ? __values(o) : o[Symbol.iterator](), i = {}, verb(\"next\"), verb(\"throw\"), verb(\"return\"), i[Symbol.asyncIterator] = function () { return this; }, i);\r\n function verb(n) { i[n] = o[n] && function (v) { return new Promise(function (resolve, reject) { v = o[n](v), settle(resolve, reject, v.done, v.value); }); }; }\r\n function settle(resolve, reject, d, v) { Promise.resolve(v).then(function(v) { resolve({ value: v, done: d }); }, reject); }\r\n}\r\n\r\nexport function __makeTemplateObject(cooked, raw) {\r\n if (Object.defineProperty) { Object.defineProperty(cooked, \"raw\", { value: raw }); } else { cooked.raw = raw; }\r\n return cooked;\r\n};\r\n\r\nvar __setModuleDefault = Object.create ? 
(function(o, v) {\r\n Object.defineProperty(o, \"default\", { enumerable: true, value: v });\r\n}) : function(o, v) {\r\n o[\"default\"] = v;\r\n};\r\n\r\nexport function __importStar(mod) {\r\n if (mod && mod.__esModule) return mod;\r\n var result = {};\r\n if (mod != null) for (var k in mod) if (Object.hasOwnProperty.call(mod, k)) __createBinding(result, mod, k);\r\n __setModuleDefault(result, mod);\r\n return result;\r\n}\r\n\r\nexport function __importDefault(mod) {\r\n return (mod && mod.__esModule) ? mod : { default: mod };\r\n}\r\n\r\nexport function __classPrivateFieldGet(receiver, privateMap) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to get private field on non-instance\");\r\n }\r\n return privateMap.get(receiver);\r\n}\r\n\r\nexport function __classPrivateFieldSet(receiver, privateMap, value) {\r\n if (!privateMap.has(receiver)) {\r\n throw new TypeError(\"attempted to set private field on non-instance\");\r\n }\r\n privateMap.set(receiver, value);\r\n return value;\r\n}\r\n",null,null,null,null],"names":["extendStatics","d","b","Object","setPrototypeOf","__proto__","Array","p","hasOwnProperty","__extends","__","this","constructor","prototype","create","typeChecker","type","typeString","value","getClassName","toString","call","comparable","Date","getTime","isArray","map","toJSON","isObject","isFunction","equals","a","length","i","length_1","keys","key","walkKeyPathValues","item","keyPath","next","depth","owner","currentKey","isNaN","Number","params","owneryQuery","options","init","BaseOperation","done","keep","name","_super","_this","children","GroupOperation","length_2","reset","length_3","childOperation","QueryOperation","parent","childrenNext","NestedOperation","_nextNestedValue","createTester","compare","Function","RegExp","result","test","lastIndex","comparableA","EqualsOperation","_test","NopeOperation","numericalOperation","createNumericalOperation","typeofParams","createNamedOperation","parentQuery","operationCreator","operations","Error","containsOperation","query","charAt","createNestedOperation","nestedQuery","_a","selfOperations","createQueryOperation","_b","assign","_c","nestedOperations","ops","push","createQueryOperations","op","split","createOperationTester","operation","$Ne","NamedBaseOperation","$ElemMatch","_queryOperation","$Not","$Size","$Or","_ops","success","$Nor","$In","_testers","toLowerCase","length_4","$Nin","$Exists","$And","NamedGroupOperation","$eq","$ne","$or","$nor","$elemMatch","$nin","$in","$lt","$lte","$gt","$gte","$mod","mod","equalsValue","$exists","$regex","pattern","$options","$not","typeAliases","number","v","string","bool","array","null","timestamp","$type","clazz","$and","ownerQuery","$all","$size","$where","process","env","CSP_ENABLED","bind","createDefaultQueryOperation","defaultOperations"],"mappings":";;;;;;;;;;;;;;oFAgBA,IAAIA,EAAgB,SAASC,EAAGC,GAI5B,OAHAF,EAAgBG,OAAOC,gBAClB,CAAEC,UAAW,cAAgBC,OAAS,SAAUL,EAAGC,GAAKD,EAAEI,UAAYH,IACvE,SAAUD,EAAGC,GAAK,IAAK,IAAIK,KAAKL,EAAOA,EAAEM,eAAeD,KAAIN,EAAEM,GAAKL,EAAEK,MACpDN,EAAGC,IAGrB,SAASO,EAAUR,EAAGC,GAEzB,SAASQ,IAAOC,KAAKC,YAAcX,EADnCD,EAAcC,EAAGC,GAEjBD,EAAEY,UAAkB,OAANX,EAAaC,OAAOW,OAAOZ,IAAMQ,EAAGG,UAAYX,EAAEW,UAAW,IAAIH,GCxB5E,IAAMK,EAAc,SAAQC,GACjC,IAAMC,EAAa,WAAaD,EAAO,IACvC,OAAO,SAASE,GACd,OAAOC,EAAaD,KAAWD,IAI7BE,EAAe,SAAAD,GAAS,OAAAf,OAAOU,UAAUO,SAASC,KAAKH,IAEhDI,EAAa,SAACJ,GACzB,OAAIA,aAAiBK,KACZL,EAAMM,UACJC,EAAQP,GACVA,EAAMQ,IAAIJ,GACRJ,GAAiC,mBAAjBA,EAAMS,OACxBT,EAAMS,SAGRT,GAGIO,EAAUV,EAAwB,SAClCa,EAAWb,EAAoB,UAC/Bc,EAAad,EAAsB,YAYnCe,EAAS,S
AACC,EAAG7B,GACxB,GAAS,MAAL6B,GAAaA,GAAK7B,EACpB,OAAO,EAET,GAAI6B,IAAM7B,EACR,OAAO,EAGT,GAAIC,OAAOU,UAAUO,SAASC,KAAKU,KAAO5B,OAAOU,UAAUO,SAASC,KAAKnB,GACvE,OAAO,EAGT,GAAIuB,EAAQM,GAAI,CACd,GAAIA,EAAEC,SAAW9B,EAAE8B,OACjB,OAAO,EAET,IAAS,IAAAC,EAAI,EAAKC,WAAcD,EAAIC,EAAQD,IAC1C,IAAKH,EAAOC,EAAEE,GAAI/B,EAAE+B,IAAK,OAAO,EAElC,OAAO,EACF,GAAIL,EAASG,GAAI,CACtB,GAAI5B,OAAOgC,KAAKJ,GAAGC,SAAW7B,OAAOgC,KAAKjC,GAAG8B,OAC3C,OAAO,EAET,IAAK,IAAMI,KAAOL,EAChB,IAAKD,EAAOC,EAAEK,GAAMlC,EAAEkC,IAAO,OAAO,EAEtC,OAAO,EAET,OAAO,GCYHC,EAAoB,SACxBC,EACAC,EACAC,EACAC,EACAL,EACAM,GAEA,IAAMC,EAAaJ,EAAQE,GAI3B,GAAIhB,EAAQa,IAASM,MAAMC,OAAOF,IAChC,IAAS,IAAAV,EAAI,EAAKC,WAAiBD,EAAIC,EAAQD,IAG7C,IAAKI,EAAkBC,EAAKL,GAAIM,EAASC,EAAMC,EAAOR,EAAGK,GACvD,OAAO,EAKb,OAAIG,IAAUF,EAAQP,QAAkB,MAARM,EACvBE,EAAKF,EAAMF,EAAKM,GAGlBL,EACLC,EAAKK,GACLJ,EACAC,EACAC,EAAQ,EACRE,EACAL,iBAOF,WACWQ,EACAC,EACAC,GAFArC,YAAAmC,EACAnC,iBAAAoC,EACApC,aAAAqC,EAETrC,KAAKsC,OAQT,OANYC,iBAAV,aACAA,kBAAA,WACEvC,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,sBAQd,WACEN,EACAC,EACAC,EACSK,GAJX,MAMEC,YAAMR,EAAQC,EAAaC,gBAFlBO,OAAAF,IAIb,OAVU5C,UAAAyC,iBAgBR,WACEJ,EACAC,EACAC,EACgBQ,GAJlB,MAMEF,YAAMR,EAAQC,EAAaC,gBAFXO,WAAAC,IAyCpB,OAjDsC/C,OAgBpCgD,kBAAA,WACE9C,KAAKyC,MAAO,EACZzC,KAAKwC,MAAO,EACZ,IAAS,IAAAlB,EAAI,EAAKyB,uBAA0BzB,EAAIyB,EAAQzB,IACtDtB,KAAK6C,SAASvB,GAAG0B,SASXF,yBAAV,SAAuBnB,EAAWF,EAAUM,GAG1C,IAFA,IAAIS,GAAO,EACPC,GAAO,EACFnB,EAAI,EAAK2B,uBAA0B3B,EAAI2B,EAAQ3B,IAAK,CAC3D,IAAM4B,EAAiBlD,KAAK6C,SAASvB,GAKrC,GAJA4B,EAAerB,KAAKF,EAAMF,EAAKM,GAC1BmB,EAAeT,OAClBA,GAAO,GAELS,EAAeV,MACjB,IAAKU,EAAeT,KAClB,WAGFD,GAAO,EAGXxC,KAAKwC,KAAOA,EACZxC,KAAKyC,KAAOA,MA/CsBF,iBAqDpC,WACEJ,EACAC,EACAC,EACAQ,EACSH,GALX,MAOEC,YAAMR,EAAQC,EAAaC,EAASQ,gBAF3BD,OAAAF,IAIb,OAXkD5C,UAAAgD,iBAalD,4DAOA,OAP2ChD,OAIzCqD,iBAAA,SAAKxB,EAAaF,EAAU2B,GAC1BpD,KAAKqD,aAAa1B,EAAMF,EAAK2B,OALUN,iBAUzC,WACWlB,EACTO,EACAC,EACAC,EACAQ,GALF,MAOEF,YAAMR,EAAQC,EAAaC,EAASQ,gBAN3BD,UAAAhB,EAyBHgB,IAAmB,SAACrC,EAAYkB,EAAUM,GAEhD,OADAa,EAAKS,aAAa9C,EAAOkB,EAAKM,IACtBa,EAAKJ,QAEjB,OA/BqC1C,OAanCwD,iBAAA,SAAK3B,EAAWF,EAAU2B,GACxB1B,EACEC,EACA3B,KAAK4B,QACL5B,KAAKuD,EACL,EACA9B,EACA2B,OApB+BN,GAiCxBU,EAAe,SAACpC,EAAGqC,GAC9B,GAAIrC,aAAasC,SACf,OAAOtC,EAET,GAAIA,aAAauC,OACf,OAAO,SAAApE,GACL,IAAMqE,EAAsB,iBAANrE,GAAkB6B,EAAEyC,KAAKtE,GAE/C,OADA6B,EAAE0C,UAAY,EACPF,GAGX,IAAMG,EAAcpD,EAAWS,GAC/B,OAAO,SAAA7B,GAAK,OAAAkE,EAAQM,EAAapD,EAAWpB,oBAG9C,4DAaA,OAb6CO,OAE3CkE,iBAAA,WACEhE,KAAKiE,EAAQT,EAAaxD,KAAKmC,OAAQnC,KAAKqC,QAAQoB,UAEtDO,iBAAA,SAAKrC,EAAMF,EAAU2B,GACdzD,MAAMmB,QAAQsC,KAAWA,EAAOvD,eAAe4B,IAC9CzB,KAAKiE,EAAMtC,EAAMF,EAAK2B,KACxBpD,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,OATyBF,iBAqB7C,4DAKA,OAL2CzC,OACzCoE,iBAAA,WACElE,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,MAH2BF,GAiB9B4B,EAAqB,SAACX,GACjC,OAVAY,EAWE,SAACjC,EAAaC,EAAyBC,GACrC,IAAMgC,SAAsB1D,EAAWwB,GACjC0B,EAAOL,EAAarB,GAC1B,OAAO,IAAI6B,GACT,SAAAzE,GACE,cAAcoB,EAAWpB,KAAO8E,GAAgBR,EAAKtE,KAEvD6C,EACAC,IAlBH,SAACF,EAAaC,EAAkBC,EAAkBK,GACrD,OAAc,MAAVP,EACK,IAAI+B,EAAc/B,EAAQC,EAAaC,GAGzC+B,EAAyBjC,EAAQC,EAAaC,EAASK,IAPvB,IACvC0B,GA+BIE,EAAuB,SAC3B5B,EACAP,EACAoC,EACAlC,GAEA,IAAMmC,EAAmBnC,EAAQoC,WAAW/B,GAC5C,IAAK8B,EACH,MAAM,IAAIE,MAAM,0BAA0BhC,GAE5C,OAAO8B,EAAiBrC,EAAQoC,EAAalC,EAASK,IAG3CiC,EAAoB,SAACC,GAChC,IAAK,IAAMnD,KAAOmD,EAChB,GAAsB,MAAlBnD,EAAIoD,OAAO,GAAY,OAAO,EAEpC,OAAO,GAEHC,EAAwB,SAC5BlD,EACAmD,EACA3C,EACAC,GAEA,GAAIsC,EAAkBI,GAAc,CAC5B,IAAAC,SAACC,OAIP,QAAqB5D,OACnB,MAAM,IAAIqD,MACR,oEAGJ,OAAO,IAAIpB,EACT1B,EACAmD,EACA3C,EACAC,EACA4C,GAGJ,OAAO,IAAI3B,EAAgB1B,EAASmD,EAAa3C,EAAaC,EAAS,CACrE,IAAI2B,EAAgBe,EAAa3C,EAAaC,MAIrC6C,EAAuB,SAClCN,EACAxC,EACA4C,gBAD
A5C,YACA+C,kBAAE1B,YAASgB,eAELpC,EAAU,CACdoB,QAASA,GAAWtC,EACpBsD,WAAYjF,OAAO4F,OAAO,GAAIX,GAAc,KAGxCY,SAACJ,OAAgBK,OAKjBC,EAAM,GAUZ,OARIN,EAAe5D,QACjBkE,EAAIC,KACF,IAAIlC,EAAgB,GAAIsB,EAAOxC,EAAaC,EAAS4C,IAIzDM,EAAIC,WAAJD,EAAYD,GAEO,IAAfC,EAAIlE,OACCkE,EAAI,GAEN,IAAIpC,EAAeyB,EAAOxC,EAAaC,EAASkD,IAGnDE,EAAwB,SAACb,EAAYvC,GACzC,ID5X6B9B,EC4XvB0E,EAAiB,GACjBK,EAAmB,GACzB,KD9X6B/E,EC8XRqE,ID3XlBrE,EAAMN,cAAgBT,QACrBe,EAAMN,cAAgBN,OACW,wCAAjCY,EAAMN,YAAYQ,YACe,uCAAjCF,EAAMN,YAAYQ,YACnBF,EAAMS,OCyXP,OADAiE,EAAeO,KAAK,IAAIxB,EAAgBY,EAAOA,EAAOvC,IAC/C,CAAC4C,EAAgBK,GAE1B,IAAK,IAAM7D,KAAOmD,EAChB,GAAsB,MAAlBnD,EAAIoD,OAAO,GAAY,CACzB,IAAMa,EAAKpB,EAAqB7C,EAAKmD,EAAMnD,GAAMmD,EAAOvC,GAG9C,MAANqD,GACFT,EAAeO,KAAKE,QAGtBJ,EAAiBE,KACfV,EAAsBrD,EAAIkE,MAAM,KAAMf,EAAMnD,GAAMmD,EAAOvC,IAK/D,MAAO,CAAC4C,EAAgBK,IAGbM,EAAwB,SAAQC,GAAgC,OAAA,SAC3ElE,EACAF,EACAM,GAIA,OAFA8D,EAAU7C,QACV6C,EAAUhE,KAAKF,EAAMF,EAAKM,GACnB8D,EAAUpD,qBCpanB,4DAeA,OAfkB3C,OAEhBgG,iBAAA,WACE9F,KAAKiE,EAAQT,EAAaxD,KAAKmC,OAAQnC,KAAKqC,QAAQoB,UAEtDqC,kBAAA,WACEnD,YAAMK,iBACNhD,KAAKyC,MAAO,GAEdqD,iBAAA,SAAKnE,GACC3B,KAAKiE,EAAMtC,KACb3B,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,OAZAsD,iBAiBlB,4DA8BA,OA9ByBjG,OAEvBkG,iBAAA,WACEhG,KAAKiG,EAAkBf,EACrBlF,KAAKmC,OACLnC,KAAKoC,YACLpC,KAAKqC,UAGT2D,kBAAA,WACErD,YAAMK,iBACNhD,KAAKiG,EAAgBjD,SAEvBgD,iBAAA,SAAKrE,GACH,GAAIb,EAAQa,GAAO,CACjB,IAAS,IAAAL,EAAI,EAAKC,WAAiBD,EAAIC,EAAQD,IAG7CtB,KAAKiG,EAAgBjD,QAGrBhD,KAAKiG,EAAgBpE,KAAKF,EAAKL,GAAIA,EAAGK,GACtC3B,KAAKyC,KAAOzC,KAAKyC,MAAQzC,KAAKiG,EAAgBxD,KAEhDzC,KAAKwC,MAAO,OAEZxC,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,MA3BOsD,iBAgCzB,4DAiBA,OAjBmBjG,OAEjBoG,iBAAA,WACElG,KAAKiG,EAAkBf,EACrBlF,KAAKmC,OACLnC,KAAKoC,YACLpC,KAAKqC,UAGT6D,kBAAA,WACElG,KAAKiG,EAAgBjD,SAEvBkD,iBAAA,SAAKvE,EAAWF,EAAUM,GACxB/B,KAAKiG,EAAgBpE,KAAKF,EAAMF,EAAKM,GACrC/B,KAAKwC,KAAOxC,KAAKiG,EAAgBzD,KACjCxC,KAAKyC,MAAQzC,KAAKiG,EAAgBxD,SAfnBsD,iBAmBnB,4DAYA,OAZ2BjG,OACzBqG,iBAAA,aACAA,iBAAA,SAAKxE,GACCb,EAAQa,IAASA,EAAKN,SAAWrB,KAAKmC,SACxCnC,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,OALSsD,iBAc3B,4DA8BA,OA9BkBjG,OAEhBsG,iBAAA,WAAA,WACEpG,KAAKqG,EAAOrG,KAAKmC,OAAOpB,KAAI,SAAA2E,GAC1B,OAAAR,EAAqBQ,EAAI,KAAM9C,EAAKP,aAGxC+D,kBAAA,WACEpG,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,EACZ,IAAS,IAAAnB,EAAI,EAAKyB,gBAAsBzB,EAAIyB,EAAQzB,IAClDtB,KAAKqG,EAAK/E,GAAG0B,SAGjBoD,iBAAA,SAAKzE,EAAWF,EAAUM,GAGxB,IAFA,IAAIS,GAAO,EACP8D,GAAU,EACLhF,EAAI,EAAK2B,gBAAsB3B,EAAI2B,EAAQ3B,IAAK,CACvD,IAAMoE,EAAK1F,KAAKqG,EAAK/E,GAErB,GADAoE,EAAG7D,KAAKF,EAAMF,EAAKM,GACf2D,EAAGjD,KAAM,CACXD,GAAO,EACP8D,EAAUZ,EAAGjD,KACb,OAIJzC,KAAKyC,KAAO6D,EACZtG,KAAKwC,KAAOA,MA5BEuD,iBAgClB,4DAKA,OALmBjG,OACjByG,iBAAA,SAAK5E,EAAWF,EAAUM,GACxBY,YAAMd,eAAKF,EAAMF,EAAKM,GACtB/B,KAAKyC,MAAQzC,KAAKyC,SAHH2D,iBAOnB,4DA2BA,OA3BkBtG,OAEhB0G,iBAAA,WAAA,WACExG,KAAKyG,EAAWzG,KAAKmC,OAAOpB,KAAI,SAAAR,GAC9B,GAAIoE,EAAkBpE,GACpB,MAAM,IAAImE,MACR,uBAAuB9B,EAAK3C,YAAYyC,KAAKgE,eAGjD,OAAOlD,EAAajD,EAAOqC,EAAKP,QAAQoB,aAG5C+C,iBAAA,SAAK7E,EAAWF,EAAUM,GAGxB,IAFA,IAAIS,GAAO,EACP8D,GAAU,EACLhF,EAAI,EAAKqF,gBAA0BrF,EAAIqF,EAAQrF,IAAK,CAE3D,IAAIuC,EADS7D,KAAKyG,EAASnF,IAClBK,GAAO,CACda,GAAO,EACP8D,GAAU,EACV,OAIJtG,KAAKyC,KAAO6D,EACZtG,KAAKwC,KAAOA,MAzBEuD,iBA6BlB,4DAKA,OALmBjG,OACjB8G,iBAAA,SAAKjF,EAAWF,EAAUM,GACxBY,YAAMd,eAAKF,EAAMF,EAAKM,GACtB/B,KAAKyC,MAAQzC,KAAKyC,SAHH+D,iBAOnB,4DAOA,OAPsB1G,OACpB+G,iBAAA,SAAKlF,EAAWF,EAAUM,GACpBA,EAAMlC,eAAe4B,KAASzB,KAAKmC,SACrCnC,KAAKwC,MAAO,EACZxC,KAAKyC,MAAO,OAJIsD,iBAUpB,WACE5D,EACAC,EACAC,EACAK,UAEAC,YACER,EACAC,EACAC,EACAF,EAAOpB,KAAI,SAAA6D,GAAS,OAAAM,EAAqBN,EAAOxC,EAAaC,MAC7DK,SAMN,OAlBmB5C,OAejBgH,iBAAA,SAAKnF,EAAWF,EAAUM,GACxB/B,KAAKqD,a
AAa1B,EAAMF,EAAKM,OAhBdgF,GAoBNC,EAAM,SAAC7E,EAAaC,EAAyBC,GACxD,OAAA,IAAI2B,EAAgB7B,EAAQC,EAAaC,IAC9B4E,EAAM,SACjB9E,EACAC,EACAC,EACAK,GACG,OAAA,IAAIoD,EAAI3D,EAAQC,EAAaC,EAASK,IAC9BwE,EAAM,SACjB/E,EACAC,EACAC,EACAK,GACG,OAAA,IAAI0D,EAAIjE,EAAQC,EAAaC,EAASK,IAC9ByE,EAAO,SAClBhF,EACAC,EACAC,EACAK,GACG,OAAA,IAAI6D,EAAKpE,EAAQC,EAAaC,EAASK,IAC/B0E,EAAa,SACxBjF,EACAC,EACAC,EACAK,GACG,OAAA,IAAIsD,EAAW7D,EAAQC,EAAaC,EAASK,IACrC2E,EAAO,SAClBlF,EACAC,EACAC,EACAK,GACG,OAAA,IAAIkE,EAAKzE,EAAQC,EAAaC,EAASK,IAC/B4E,EAAM,SACjBnF,EACAC,EACAC,EACAK,GACG,OAAA,IAAI8D,EAAIrE,EAAQC,EAAaC,EAASK,IAE9B6E,EAAMpD,GAAmB,SAAAhC,GAAU,OAAA,SAAA5C,GAAK,OAAAA,EAAI4C,MAC5CqF,EAAOrD,GAAmB,SAAAhC,GAAU,OAAA,SAAA5C,GAAK,OAAAA,GAAK4C,MAC9CsF,EAAMtD,GAAmB,SAAAhC,GAAU,OAAA,SAAA5C,GAAK,OAAAA,EAAI4C,MAC5CuF,EAAOvD,GAAmB,SAAAhC,GAAU,OAAA,SAAA5C,GAAK,OAAAA,GAAK4C,MAC9CwF,EAAO,SAClB3C,EACA5C,EACAC,OAFCuF,OAAKC,OAIN,OAAA,IAAI7D,GACF,SAAAzE,GAAK,OAAAoB,EAAWpB,GAAKqI,IAAQC,IAC7BzF,EACAC,IAESyF,EAAU,SACrB3F,EACAC,EACAC,EACAK,GACG,OAAA,IAAImE,EAAQ1E,EAAQC,EAAaC,EAASK,IAClCqF,EAAS,SACpBC,EACA5F,EACAC,GAEA,OAAA,IAAI2B,EACF,IAAIL,OAAOqE,EAAS5F,EAAY6F,UAChC7F,EACAC,IAES6F,EAAO,SAClB/F,EACAC,EACAC,EACAK,GACG,OAAA,IAAIwD,EAAK/D,EAAQC,EAAaC,EAASK,IAEtCyF,EAAc,CAClBC,OAAQ,SAAAC,GAAK,MAAa,iBAANA,GACpBC,OAAQ,SAAAD,GAAK,MAAa,iBAANA,GACpBE,KAAM,SAAAF,GAAK,MAAa,kBAANA,GAClBG,MAAO,SAAAH,GAAK,OAAA1I,MAAMmB,QAAQuH,IAC1BI,KAAM,SAAAJ,GAAK,OAAM,OAANA,GACXK,UAAW,SAAAL,GAAK,OAAAA,aAAazH,OAGlB+H,EAAQ,SACnBC,EACAxG,EACAC,GAEA,OAAA,IAAI2B,GACF,SAAAzE,GACE,GAAqB,iBAAVqJ,EAAoB,CAC7B,IAAKT,EAAYS,GACf,MAAM,IAAIlE,MAAM,6BAGlB,OAAOyD,EAAYS,GAAOrJ,GAG5B,OAAY,MAALA,IAAYA,aAAaqJ,GAASrJ,EAAEU,cAAgB2I,KAE7DxG,EACAC,IAESwG,GAAO,SAClB1G,EACA2G,EACAzG,EACAK,GACG,OAAA,IAAIoE,EAAK3E,EAAQ2G,EAAYzG,EAASK,IAC9BqG,GAAOF,GACPG,GAAQ,SACnB7G,EACA2G,EACAzG,GACG,OAAA,IAAI8D,EAAMhE,EAAQ2G,EAAYzG,EAAS,UAC/B4F,GAAW,WAAM,OAAA,MACjBgB,GAAS,SACpB9G,EACA2G,EACAzG,GAEA,IAAIwB,EAEJ,GAAI3C,EAAWiB,GACb0B,EAAO1B,MACF,CAAA,GAAK+G,QAAQC,IAAIC,YAGtB,MAAM,IAAI1E,MACR,oEAHFb,EAAO,IAAIH,SAAS,MAAO,UAAYvB,GAOzC,OAAO,IAAI6B,GAAgB,SAAAzE,GAAK,OAAAsE,EAAKwF,KAAK9J,EAAVsE,CAAatE,KAAIuJ,EAAYzG,oNCvUzDiH,GAA8B,SAClC1E,EACAkE,EACA9D,OAAAG,kBAAE1B,YAASgB,eAEX,OAAOS,EAAqBN,EAAOkE,EAAY,CAC7CrF,QAASA,EACTgB,WAAYjF,OAAO4F,OAAO,GAAImE,GAAmB9E,GAAc,6SFuQ9B,SACnCtC,EACAC,EACAC,GACG,OAAA,IAAI2B,EAAgB7B,EAAQC,EAAaC,2EA2Jb,SAC/BuC,EACAvC,GAEA,oBAFAA,MAEOuD,EACLV,EAAqCN,EAAO,KAAMvC,eEvarB,SAC/BuC,EACAvC,gBAAAA,MAEA,IAAMqD,EAAK4D,GAA4B1E,EAAO,KAAMvC,GACpD,OAAOuD,EAAsBF"} \ No newline at end of file diff --git a/node_modules/signal-exit/LICENSE.txt b/node_modules/signal-exit/LICENSE.txt new file mode 100644 index 000000000..eead04a12 --- /dev/null +++ b/node_modules/signal-exit/LICENSE.txt @@ -0,0 +1,16 @@ +The ISC License + +Copyright (c) 2015, Contributors + +Permission to use, copy, modify, and/or distribute this software +for any purpose with or without fee is hereby granted, provided +that the above copyright notice and this permission notice +appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES +OF MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE +LIABLE FOR ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES +OR ANY DAMAGES WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, +WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, +ARISING OUT OF OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. 
diff --git a/node_modules/signal-exit/README.md b/node_modules/signal-exit/README.md new file mode 100644 index 000000000..f9c7c007d --- /dev/null +++ b/node_modules/signal-exit/README.md @@ -0,0 +1,39 @@ +# signal-exit + +[![Build Status](https://travis-ci.org/tapjs/signal-exit.png)](https://travis-ci.org/tapjs/signal-exit) +[![Coverage](https://coveralls.io/repos/tapjs/signal-exit/badge.svg?branch=master)](https://coveralls.io/r/tapjs/signal-exit?branch=master) +[![NPM version](https://img.shields.io/npm/v/signal-exit.svg)](https://www.npmjs.com/package/signal-exit) +[![Standard Version](https://img.shields.io/badge/release-standard%20version-brightgreen.svg)](https://github.com/conventional-changelog/standard-version) + +When you want to fire an event no matter how a process exits: + +* reaching the end of execution. +* explicitly having `process.exit(code)` called. +* having `process.kill(pid, sig)` called. +* receiving a fatal signal from outside the process + +Use `signal-exit`. + +```js +var onExit = require('signal-exit') + +onExit(function (code, signal) { + console.log('process exited!') +}) +``` + +## API + +`var remove = onExit(function (code, signal) {}, options)` + +The return value of the function is a function that will remove the +handler. + +Note that the function *only* fires for signals if the signal would +cause the process to exit. That is, there are no other listeners, and +it is a fatal signal. + +## Options + +* `alwaysLast`: Run this handler after any other signal or exit + handlers. This causes `process.emit` to be monkeypatched. diff --git a/node_modules/signal-exit/index.js b/node_modules/signal-exit/index.js new file mode 100644 index 000000000..16a056d3c --- /dev/null +++ b/node_modules/signal-exit/index.js @@ -0,0 +1,178 @@ +// Note: since nyc uses this module to output coverage, any lines +// that are in the direct sync flow of nyc's outputCoverage are +// ignored, since we can never get coverage for them. +// grab a reference to node's real process object right away +var process = global.process +// some kind of non-node environment, just no-op +if (typeof process !== 'object' || !process) { + module.exports = function () {} +} else { + var assert = require('assert') + var signals = require('./signals.js') + var isWin = /^win/i.test(process.platform) + + var EE = require('events') + /* istanbul ignore if */ + if (typeof EE !== 'function') { + EE = EE.EventEmitter + } + + var emitter + if (process.__signal_exit_emitter__) { + emitter = process.__signal_exit_emitter__ + } else { + emitter = process.__signal_exit_emitter__ = new EE() + emitter.count = 0 + emitter.emitted = {} + } + + // Because this emitter is a global, we have to check to see if a + // previous version of this library failed to enable infinite listeners. + // I know what you're about to say. But literally everything about + // signal-exit is a compromise with evil. Get used to it. 
+ if (!emitter.infinite) { + emitter.setMaxListeners(Infinity) + emitter.infinite = true + } + + module.exports = function (cb, opts) { + if (global.process !== process) { + return + } + assert.equal(typeof cb, 'function', 'a callback must be provided for exit handler') + + if (loaded === false) { + load() + } + + var ev = 'exit' + if (opts && opts.alwaysLast) { + ev = 'afterexit' + } + + var remove = function () { + emitter.removeListener(ev, cb) + if (emitter.listeners('exit').length === 0 && + emitter.listeners('afterexit').length === 0) { + unload() + } + } + emitter.on(ev, cb) + + return remove + } + + var unload = function unload () { + if (!loaded || global.process !== process) { + return + } + loaded = false + + signals.forEach(function (sig) { + try { + process.removeListener(sig, sigListeners[sig]) + } catch (er) {} + }) + process.emit = originalProcessEmit + process.reallyExit = originalProcessReallyExit + emitter.count -= 1 + } + module.exports.unload = unload + + var emit = function emit (event, code, signal) { + if (emitter.emitted[event]) { + return + } + emitter.emitted[event] = true + emitter.emit(event, code, signal) + } + + // { : , ... } + var sigListeners = {} + signals.forEach(function (sig) { + sigListeners[sig] = function listener () { + if (process !== global.process) { + return + } + // If there are no other listeners, an exit is coming! + // Simplest way: remove us and then re-send the signal. + // We know that this will kill the process, so we can + // safely emit now. + var listeners = process.listeners(sig) + if (listeners.length === emitter.count) { + unload() + emit('exit', null, sig) + /* istanbul ignore next */ + emit('afterexit', null, sig) + /* istanbul ignore next */ + if (isWin && sig === 'SIGHUP') { + // "SIGHUP" throws an `ENOSYS` error on Windows, + // so use a supported signal instead + sig = 'SIGINT' + } + process.kill(process.pid, sig) + } + } + }) + + module.exports.signals = function () { + return signals + } + + var loaded = false + + var load = function load () { + if (loaded || process !== global.process) { + return + } + loaded = true + + // This is the number of onSignalExit's that are in play. + // It's important so that we can count the correct number of + // listeners on signals, and don't wait for the other one to + // handle it instead of us. 
+ emitter.count += 1 + + signals = signals.filter(function (sig) { + try { + process.on(sig, sigListeners[sig]) + return true + } catch (er) { + return false + } + }) + + process.emit = processEmit + process.reallyExit = processReallyExit + } + module.exports.load = load + + var originalProcessReallyExit = process.reallyExit + var processReallyExit = function processReallyExit (code) { + if (process !== global.process) { + return + } + process.exitCode = code || 0 + emit('exit', process.exitCode, null) + /* istanbul ignore next */ + emit('afterexit', process.exitCode, null) + /* istanbul ignore next */ + originalProcessReallyExit.call(process, process.exitCode) + } + + var originalProcessEmit = process.emit + var processEmit = function processEmit (ev, arg) { + if (ev === 'exit' && process === global.process) { + if (arg !== undefined) { + process.exitCode = arg + } + var ret = originalProcessEmit.apply(this, arguments) + emit('exit', process.exitCode, null) + /* istanbul ignore next */ + emit('afterexit', process.exitCode, null) + return ret + } else { + return originalProcessEmit.apply(this, arguments) + } + } +} diff --git a/node_modules/signal-exit/package.json b/node_modules/signal-exit/package.json new file mode 100644 index 000000000..57c85bb1f --- /dev/null +++ b/node_modules/signal-exit/package.json @@ -0,0 +1,64 @@ +{ + "_from": "signal-exit@^3.0.2", + "_id": "signal-exit@3.0.5", + "_inBundle": false, + "_integrity": "sha512-KWcOiKeQj6ZyXx7zq4YxSMgHRlod4czeBQZrPb8OKcohcqAXShm7E20kEMle9WBt26hFcAf0qLOcp5zmY7kOqQ==", + "_location": "/signal-exit", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "signal-exit@^3.0.2", + "name": "signal-exit", + "escapedName": "signal-exit", + "rawSpec": "^3.0.2", + "saveSpec": null, + "fetchSpec": "^3.0.2" + }, + "_requiredBy": [ + "/write-file-atomic" + ], + "_resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-3.0.5.tgz", + "_shasum": "9e3e8cc0c75a99472b44321033a7702e7738252f", + "_spec": "signal-exit@^3.0.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\write-file-atomic", + "author": { + "name": "Ben Coe", + "email": "ben@npmjs.com" + }, + "bugs": { + "url": "https://github.com/tapjs/signal-exit/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "when you want to fire an event no matter how a process exits.", + "devDependencies": { + "chai": "^3.5.0", + "coveralls": "^3.1.1", + "nyc": "^15.1.0", + "standard-version": "^9.3.1", + "tap": "^15.0.10" + }, + "files": [ + "index.js", + "signals.js" + ], + "homepage": "https://github.com/tapjs/signal-exit", + "keywords": [ + "signal", + "exit" + ], + "license": "ISC", + "main": "index.js", + "name": "signal-exit", + "repository": { + "type": "git", + "url": "git+https://github.com/tapjs/signal-exit.git" + }, + "scripts": { + "coverage": "nyc report --reporter=text-lcov | coveralls", + "release": "standard-version", + "test": "tap --timeout=240 ./test/*.js --cov" + }, + "version": "3.0.5" +} diff --git a/node_modules/signal-exit/signals.js b/node_modules/signal-exit/signals.js new file mode 100644 index 000000000..3bd67a8a5 --- /dev/null +++ b/node_modules/signal-exit/signals.js @@ -0,0 +1,53 @@ +// This is not the set of all possible signals. +// +// It IS, however, the set of all signals that trigger +// an exit on either Linux or BSD systems. 
Linux is a +// superset of the signal names supported on BSD, and +// the unknown signals just fail to register, so we can +// catch that easily enough. +// +// Don't bother with SIGKILL. It's uncatchable, which +// means that we can't fire any callbacks anyway. +// +// If a user does happen to register a handler on a non- +// fatal signal like SIGWINCH or something, and then +// exit, it'll end up firing `process.emit('exit')`, so +// the handler will be fired anyway. +// +// SIGBUS, SIGFPE, SIGSEGV and SIGILL, when not raised +// artificially, inherently leave the process in a +// state from which it is not safe to try and enter JS +// listeners. +module.exports = [ + 'SIGABRT', + 'SIGALRM', + 'SIGHUP', + 'SIGINT', + 'SIGTERM' +] + +if (process.platform !== 'win32') { + module.exports.push( + 'SIGVTALRM', + 'SIGXCPU', + 'SIGXFSZ', + 'SIGUSR2', + 'SIGTRAP', + 'SIGSYS', + 'SIGQUIT', + 'SIGIOT' + // should detect profiler and enable/disable accordingly. + // see #21 + // 'SIGPROF' + ) +} + +if (process.platform === 'linux') { + module.exports.push( + 'SIGIO', + 'SIGPOLL', + 'SIGPWR', + 'SIGSTKFLT', + 'SIGUNUSED' + ) +} diff --git a/node_modules/sliced/History.md b/node_modules/sliced/History.md new file mode 100644 index 000000000..e45cefe8f --- /dev/null +++ b/node_modules/sliced/History.md @@ -0,0 +1,41 @@ + +1.0.1 / 2015-07-14 +================== + + * fixed; missing file introduced in 4f5cea1 + +1.0.0 / 2015-07-12 +================== + + * Remove unnecessary files from npm package - #6 via joaquimserafim + * updated readme stats + +0.0.5 / 2013-02-05 +================== + + * optimization: remove use of arguments [jkroso](https://github.com/jkroso) + * add scripts to component.json [jkroso](https://github.com/jkroso) + * tests; remove time for travis + +0.0.4 / 2013-01-07 +================== + + * added component.json #1 [jkroso](https://github.com/jkroso) + * reversed array loop #1 [jkroso](https://github.com/jkroso) + * remove fn params + +0.0.3 / 2012-09-29 +================== + + * faster with negative start args + +0.0.2 / 2012-09-29 +================== + + * support full [].slice semantics + +0.0.1 / 2012-09-29 +=================== + + * initial release + diff --git a/node_modules/sliced/LICENSE b/node_modules/sliced/LICENSE new file mode 100644 index 000000000..38c529daa --- /dev/null +++ b/node_modules/sliced/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2012 [Aaron Heckmann](aaron.heckmann+github@gmail.com) + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
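As an aside on the signal-exit package vendored above: its README documents that `onExit` returns a remover function and accepts an `alwaysLast` option. Below is a minimal usage sketch, assuming the module is loaded from node_modules exactly as in that README; the handler bodies and the exit code are illustrative only.

```js
var onExit = require('signal-exit')

// Register a handler that fires however the process exits.
var remove = onExit(function (code, signal) {
  console.log('exited with', code, signal)
})

// Handlers registered with alwaysLast run after the other exit handlers
// (per the README this monkeypatches process.emit).
onExit(function (code, signal) {
  console.log('final cleanup')
}, { alwaysLast: true })

// The return value unregisters the first handler if it is no longer needed.
remove()

process.exit(0)
```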
diff --git a/node_modules/sliced/README.md b/node_modules/sliced/README.md new file mode 100644 index 000000000..b6ff85eb2 --- /dev/null +++ b/node_modules/sliced/README.md @@ -0,0 +1,62 @@ +#sliced +========== + +A faster alternative to `[].slice.call(arguments)`. + +[![Build Status](https://secure.travis-ci.org/aheckmann/sliced.png)](http://travis-ci.org/aheckmann/sliced) + +Example output from [benchmark.js](https://github.com/bestiejs/benchmark.js) + + Array.prototype.slice.call x 1,401,820 ops/sec ±2.16% (90 runs sampled) + [].slice.call x 1,313,116 ops/sec ±2.04% (96 runs sampled) + cached slice.call x 10,297,910 ops/sec ±1.81% (96 runs sampled) + sliced x 19,906,019 ops/sec ±1.23% (89 runs sampled) + fastest is sliced + + Array.prototype.slice.call(arguments, 1) x 1,373,238 ops/sec ±1.84% (95 runs sampled) + [].slice.call(arguments, 1) x 1,395,336 ops/sec ±1.36% (93 runs sampled) + cached slice.call(arguments, 1) x 9,926,018 ops/sec ±1.67% (92 runs sampled) + sliced(arguments, 1) x 20,747,990 ops/sec ±1.16% (93 runs sampled) + fastest is sliced(arguments, 1) + + Array.prototype.slice.call(arguments, -1) x 1,319,908 ops/sec ±2.12% (91 runs sampled) + [].slice.call(arguments, -1) x 1,336,170 ops/sec ±1.33% (97 runs sampled) + cached slice.call(arguments, -1) x 10,078,718 ops/sec ±1.21% (98 runs sampled) + sliced(arguments, -1) x 20,471,474 ops/sec ±1.81% (92 runs sampled) + fastest is sliced(arguments, -1) + + Array.prototype.slice.call(arguments, -2, -10) x 1,369,246 ops/sec ±1.68% (97 runs sampled) + [].slice.call(arguments, -2, -10) x 1,387,935 ops/sec ±1.70% (95 runs sampled) + cached slice.call(arguments, -2, -10) x 9,593,428 ops/sec ±1.23% (97 runs sampled) + sliced(arguments, -2, -10) x 23,178,931 ops/sec ±1.70% (92 runs sampled) + fastest is sliced(arguments, -2, -10) + + Array.prototype.slice.call(arguments, -2, -1) x 1,441,300 ops/sec ±1.26% (98 runs sampled) + [].slice.call(arguments, -2, -1) x 1,410,326 ops/sec ±1.96% (93 runs sampled) + cached slice.call(arguments, -2, -1) x 9,854,419 ops/sec ±1.02% (97 runs sampled) + sliced(arguments, -2, -1) x 22,550,801 ops/sec ±1.86% (91 runs sampled) + fastest is sliced(arguments, -2, -1) + +_Benchmark [source](https://github.com/aheckmann/sliced/blob/master/bench.js)._ + +##Usage + +`sliced` accepts the same arguments as `Array#slice` so you can easily swap it out. + +```js +function zing () { + var slow = [].slice.call(arguments, 1, 8); + var args = slice(arguments, 1, 8); + + var slow = Array.prototype.slice.call(arguments); + var args = slice(arguments); + // etc +} +``` + +## install + + npm install sliced + + +[LICENSE](https://github.com/aheckmann/sliced/blob/master/LICENSE) diff --git a/node_modules/sliced/index.js b/node_modules/sliced/index.js new file mode 100644 index 000000000..d88c85b10 --- /dev/null +++ b/node_modules/sliced/index.js @@ -0,0 +1,33 @@ + +/** + * An Array.prototype.slice.call(arguments) alternative + * + * @param {Object} args something with a length + * @param {Number} slice + * @param {Number} sliceEnd + * @api public + */ + +module.exports = function (args, slice, sliceEnd) { + var ret = []; + var len = args.length; + + if (0 === len) return ret; + + var start = slice < 0 + ? Math.max(0, slice + len) + : slice || 0; + + if (sliceEnd !== undefined) { + len = sliceEnd < 0 + ? 
sliceEnd + len + : sliceEnd + } + + while (len-- > start) { + ret[len - start] = args[len]; + } + + return ret; +} + diff --git a/node_modules/sliced/package.json b/node_modules/sliced/package.json new file mode 100644 index 000000000..be76e5983 --- /dev/null +++ b/node_modules/sliced/package.json @@ -0,0 +1,63 @@ +{ + "_from": "sliced@1.0.1", + "_id": "sliced@1.0.1", + "_inBundle": false, + "_integrity": "sha1-CzpmK10Ewxd7GSa+qCsD+Dei70E=", + "_location": "/sliced", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "sliced@1.0.1", + "name": "sliced", + "escapedName": "sliced", + "rawSpec": "1.0.1", + "saveSpec": null, + "fetchSpec": "1.0.1" + }, + "_requiredBy": [ + "/mongoose", + "/mquery" + ], + "_resolved": "https://registry.npmjs.org/sliced/-/sliced-1.0.1.tgz", + "_shasum": "0b3a662b5d04c3177b1926bea82b03f837a2ef41", + "_spec": "sliced@1.0.1", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\mongoose", + "author": { + "name": "Aaron Heckmann", + "email": "aaron.heckmann+github@gmail.com" + }, + "bugs": { + "url": "https://github.com/aheckmann/sliced/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "A faster Node.js alternative to Array.prototype.slice.call(arguments)", + "devDependencies": { + "benchmark": "~1.0.0", + "mocha": "1.5.0" + }, + "files": [ + "LICENSE", + "README.md", + "index.js" + ], + "homepage": "https://github.com/aheckmann/sliced#readme", + "keywords": [ + "arguments", + "slice", + "array" + ], + "license": "MIT", + "main": "index.js", + "name": "sliced", + "repository": { + "type": "git", + "url": "git://github.com/aheckmann/sliced.git" + }, + "scripts": { + "test": "make test" + }, + "version": "1.0.1" +} diff --git a/node_modules/sparse-bitfield/.npmignore b/node_modules/sparse-bitfield/.npmignore new file mode 100644 index 000000000..3c3629e64 --- /dev/null +++ b/node_modules/sparse-bitfield/.npmignore @@ -0,0 +1 @@ +node_modules diff --git a/node_modules/sparse-bitfield/.travis.yml b/node_modules/sparse-bitfield/.travis.yml new file mode 100644 index 000000000..c04282170 --- /dev/null +++ b/node_modules/sparse-bitfield/.travis.yml @@ -0,0 +1,6 @@ +language: node_js +node_js: + - '0.10' + - '0.12' + - '4.0' + - '5.0' diff --git a/node_modules/sparse-bitfield/LICENSE b/node_modules/sparse-bitfield/LICENSE new file mode 100644 index 000000000..bae9da7bf --- /dev/null +++ b/node_modules/sparse-bitfield/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Mathias Buus + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/sparse-bitfield/README.md b/node_modules/sparse-bitfield/README.md new file mode 100644 index 000000000..7b6b8f9ed --- /dev/null +++ b/node_modules/sparse-bitfield/README.md @@ -0,0 +1,62 @@ +# sparse-bitfield + +Bitfield implementation that allocates a series of 1kb buffers to support sparse bitfields +without allocating a massive buffer. If you want to simple implementation of a flat bitfield +see the [bitfield](https://github.com/fb55/bitfield) module. + +This module is mostly useful if you need a big bitfield where you won't nessecarily set every bit. + +``` +npm install sparse-bitfield +``` + +[![build status](http://img.shields.io/travis/mafintosh/sparse-bitfield.svg?style=flat)](http://travis-ci.org/mafintosh/sparse-bitfield) + +## Usage + +``` js +var bitfield = require('sparse-bitfield') +var bits = bitfield() + +bits.set(0, true) // set first bit +bits.set(1, true) // set second bit +bits.set(1000000000000, true) // set the 1.000.000.000.000th bit +``` + +Running the above example will allocate two 1kb buffers internally. +Each 1kb buffer can hold information about 8192 bits so the first one will be used to store information about the first two bits and the second will be used to store the 1.000.000.000.000th bit. + +## API + +#### `var bits = bitfield([options])` + +Create a new bitfield. Options include + +``` js +{ + pageSize: 1024, // how big should the partial buffers be + buffer: anExistingBitfield, + trackUpdates: false // track when pages are being updated in the pager +} +``` + +#### `bits.set(index, value)` + +Set a bit to true or false. + +#### `bits.get(index)` + +Get the value of a bit. + +#### `bits.pages` + +A [memory-pager](https://github.com/mafintosh/memory-pager) instance that is managing the underlying memory. +If you set `trackUpdates` to true in the constructor you can use `.lastUpdate()` on this instance to get the last updated memory page. + +#### `var buffer = bits.toBuffer()` + +Get a single buffer representing the entire bitfield. 
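Tying this API together, here is a hedged round-trip sketch modeled on the vendored test.js further below: serialize one bitfield with `toBuffer()` and feed the resulting buffer back into the constructor. The specific bit indices are arbitrary.

```js
var bitfield = require('sparse-bitfield')

var bits = bitfield()
bits.set(0, true)
bits.set(9000, true) // lands on a second internal 1kb page

// toBuffer() concatenates the allocated pages into one flat buffer...
var snapshot = bits.toBuffer()

// ...which can be passed straight back in to reconstruct the bitfield.
var clone = bitfield(snapshot)
console.log(clone.get(0))    // true
console.log(clone.get(9000)) // true
console.log(clone.get(1))    // false
```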
+ +## License + +MIT diff --git a/node_modules/sparse-bitfield/index.js b/node_modules/sparse-bitfield/index.js new file mode 100644 index 000000000..ff458c974 --- /dev/null +++ b/node_modules/sparse-bitfield/index.js @@ -0,0 +1,95 @@ +var pager = require('memory-pager') + +module.exports = Bitfield + +function Bitfield (opts) { + if (!(this instanceof Bitfield)) return new Bitfield(opts) + if (!opts) opts = {} + if (Buffer.isBuffer(opts)) opts = {buffer: opts} + + this.pageOffset = opts.pageOffset || 0 + this.pageSize = opts.pageSize || 1024 + this.pages = opts.pages || pager(this.pageSize) + + this.byteLength = this.pages.length * this.pageSize + this.length = 8 * this.byteLength + + if (!powerOfTwo(this.pageSize)) throw new Error('The page size should be a power of two') + + this._trackUpdates = !!opts.trackUpdates + this._pageMask = this.pageSize - 1 + + if (opts.buffer) { + for (var i = 0; i < opts.buffer.length; i += this.pageSize) { + this.pages.set(i / this.pageSize, opts.buffer.slice(i, i + this.pageSize)) + } + this.byteLength = opts.buffer.length + this.length = 8 * this.byteLength + } +} + +Bitfield.prototype.get = function (i) { + var o = i & 7 + var j = (i - o) / 8 + + return !!(this.getByte(j) & (128 >> o)) +} + +Bitfield.prototype.getByte = function (i) { + var o = i & this._pageMask + var j = (i - o) / this.pageSize + var page = this.pages.get(j, true) + + return page ? page.buffer[o + this.pageOffset] : 0 +} + +Bitfield.prototype.set = function (i, v) { + var o = i & 7 + var j = (i - o) / 8 + var b = this.getByte(j) + + return this.setByte(j, v ? b | (128 >> o) : b & (255 ^ (128 >> o))) +} + +Bitfield.prototype.toBuffer = function () { + var all = alloc(this.pages.length * this.pageSize) + + for (var i = 0; i < this.pages.length; i++) { + var next = this.pages.get(i, true) + var allOffset = i * this.pageSize + if (next) next.buffer.copy(all, allOffset, this.pageOffset, this.pageOffset + this.pageSize) + } + + return all +} + +Bitfield.prototype.setByte = function (i, b) { + var o = i & this._pageMask + var j = (i - o) / this.pageSize + var page = this.pages.get(j, false) + + o += this.pageOffset + + if (page.buffer[o] === b) return false + page.buffer[o] = b + + if (i >= this.byteLength) { + this.byteLength = i + 1 + this.length = this.byteLength * 8 + } + + if (this._trackUpdates) this.pages.updated(page) + + return true +} + +function alloc (n) { + if (Buffer.alloc) return Buffer.alloc(n) + var b = new Buffer(n) + b.fill(0) + return b +} + +function powerOfTwo (x) { + return !(x & (x - 1)) +} diff --git a/node_modules/sparse-bitfield/package.json b/node_modules/sparse-bitfield/package.json new file mode 100644 index 000000000..48a0ce6cd --- /dev/null +++ b/node_modules/sparse-bitfield/package.json @@ -0,0 +1,55 @@ +{ + "_from": "sparse-bitfield@^3.0.3", + "_id": "sparse-bitfield@3.0.3", + "_inBundle": false, + "_integrity": "sha1-/0rm5oZWBWuks+eSqzM004JzyhE=", + "_location": "/sparse-bitfield", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "sparse-bitfield@^3.0.3", + "name": "sparse-bitfield", + "escapedName": "sparse-bitfield", + "rawSpec": "^3.0.3", + "saveSpec": null, + "fetchSpec": "^3.0.3" + }, + "_requiredBy": [ + "/saslprep" + ], + "_resolved": "https://registry.npmjs.org/sparse-bitfield/-/sparse-bitfield-3.0.3.tgz", + "_shasum": "ff4ae6e68656056ba4b3e792ab3334d38273ca11", + "_spec": "sparse-bitfield@^3.0.3", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\saslprep", + 
"author": { + "name": "Mathias Buus", + "url": "@mafintosh" + }, + "bugs": { + "url": "https://github.com/mafintosh/sparse-bitfield/issues" + }, + "bundleDependencies": false, + "dependencies": { + "memory-pager": "^1.0.2" + }, + "deprecated": false, + "description": "Bitfield that allocates a series of small buffers to support sparse bits without allocating a massive buffer", + "devDependencies": { + "buffer-alloc": "^1.1.0", + "standard": "^9.0.0", + "tape": "^4.6.3" + }, + "homepage": "https://github.com/mafintosh/sparse-bitfield", + "license": "MIT", + "main": "index.js", + "name": "sparse-bitfield", + "repository": { + "type": "git", + "url": "git+https://github.com/mafintosh/sparse-bitfield.git" + }, + "scripts": { + "test": "standard && tape test.js" + }, + "version": "3.0.3" +} diff --git a/node_modules/sparse-bitfield/test.js b/node_modules/sparse-bitfield/test.js new file mode 100644 index 000000000..ae42ef467 --- /dev/null +++ b/node_modules/sparse-bitfield/test.js @@ -0,0 +1,79 @@ +var alloc = require('buffer-alloc') +var tape = require('tape') +var bitfield = require('./') + +tape('set and get', function (t) { + var bits = bitfield() + + t.same(bits.get(0), false, 'first bit is false') + bits.set(0, true) + t.same(bits.get(0), true, 'first bit is true') + t.same(bits.get(1), false, 'second bit is false') + bits.set(0, false) + t.same(bits.get(0), false, 'first bit is reset') + t.end() +}) + +tape('set large and get', function (t) { + var bits = bitfield() + + t.same(bits.get(9999999999999), false, 'large bit is false') + bits.set(9999999999999, true) + t.same(bits.get(9999999999999), true, 'large bit is true') + t.same(bits.get(9999999999999 + 1), false, 'large bit + 1 is false') + bits.set(9999999999999, false) + t.same(bits.get(9999999999999), false, 'large bit is reset') + t.end() +}) + +tape('get and set buffer', function (t) { + var bits = bitfield({trackUpdates: true}) + + t.same(bits.pages.get(0, true), undefined) + t.same(bits.pages.get(Math.floor(9999999999999 / 8 / 1024), true), undefined) + bits.set(9999999999999, true) + + var bits2 = bitfield() + var upd = bits.pages.lastUpdate() + bits2.pages.set(Math.floor(upd.offset / 1024), upd.buffer) + t.same(bits2.get(9999999999999), true, 'bit is set') + t.end() +}) + +tape('toBuffer', function (t) { + var bits = bitfield() + + t.same(bits.toBuffer(), alloc(0)) + + bits.set(0, true) + + t.same(bits.toBuffer(), bits.pages.get(0).buffer) + + bits.set(9000, true) + + t.same(bits.toBuffer(), Buffer.concat([bits.pages.get(0).buffer, bits.pages.get(1).buffer])) + t.end() +}) + +tape('pass in buffer', function (t) { + var bits = bitfield() + + bits.set(0, true) + bits.set(9000, true) + + var clone = bitfield(bits.toBuffer()) + + t.same(clone.get(0), true) + t.same(clone.get(9000), true) + t.end() +}) + +tape('set small buffer', function (t) { + var buf = alloc(1) + buf[0] = 255 + var bits = bitfield(buf) + + t.same(bits.get(0), true) + t.same(bits.pages.get(0).buffer.length, bits.pageSize) + t.end() +}) diff --git a/node_modules/statuses/HISTORY.md b/node_modules/statuses/HISTORY.md new file mode 100644 index 000000000..a1977b297 --- /dev/null +++ b/node_modules/statuses/HISTORY.md @@ -0,0 +1,65 @@ +1.5.0 / 2018-03-27 +================== + + * Add `103 Early Hints` + +1.4.0 / 2017-10-20 +================== + + * Add `STATUS_CODES` export + +1.3.1 / 2016-11-11 +================== + + * Fix return type in JSDoc + +1.3.0 / 2016-05-17 +================== + + * Add `421 Misdirected Request` + * perf: enable strict mode + +1.2.1 / 
2015-02-01 +================== + + * Fix message for status 451 + - `451 Unavailable For Legal Reasons` + +1.2.0 / 2014-09-28 +================== + + * Add `208 Already Repored` + * Add `226 IM Used` + * Add `306 (Unused)` + * Add `415 Unable For Legal Reasons` + * Add `508 Loop Detected` + +1.1.1 / 2014-09-24 +================== + + * Add missing 308 to `codes.json` + +1.1.0 / 2014-09-21 +================== + + * Add `codes.json` for universal support + +1.0.4 / 2014-08-20 +================== + + * Package cleanup + +1.0.3 / 2014-06-08 +================== + + * Add 308 to `.redirect` category + +1.0.2 / 2014-03-13 +================== + + * Add `.retry` category + +1.0.1 / 2014-03-12 +================== + + * Initial release diff --git a/node_modules/statuses/LICENSE b/node_modules/statuses/LICENSE new file mode 100644 index 000000000..28a316182 --- /dev/null +++ b/node_modules/statuses/LICENSE @@ -0,0 +1,23 @@ + +The MIT License (MIT) + +Copyright (c) 2014 Jonathan Ong +Copyright (c) 2016 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/statuses/README.md b/node_modules/statuses/README.md new file mode 100644 index 000000000..0fe5720db --- /dev/null +++ b/node_modules/statuses/README.md @@ -0,0 +1,127 @@ +# Statuses + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +HTTP status utility for node. + +This module provides a list of status codes and messages sourced from +a few different projects: + + * The [IANA Status Code Registry](https://www.iana.org/assignments/http-status-codes/http-status-codes.xhtml) + * The [Node.js project](https://nodejs.org/) + * The [NGINX project](https://www.nginx.com/) + * The [Apache HTTP Server project](https://httpd.apache.org/) + +## Installation + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install statuses +``` + +## API + + + +```js +var status = require('statuses') +``` + +### var code = status(Integer || String) + +If `Integer` or `String` is a valid HTTP code or status message, then the +appropriate `code` will be returned. Otherwise, an error will be thrown. 
+ + + +```js +status(403) // => 403 +status('403') // => 403 +status('forbidden') // => 403 +status('Forbidden') // => 403 +status(306) // throws, as it's not supported by node.js +``` + +### status.STATUS_CODES + +Returns an object which maps status codes to status messages, in +the same format as the +[Node.js http module](https://nodejs.org/dist/latest/docs/api/http.html#http_http_status_codes). + +### status.codes + +Returns an array of all the status codes as `Integer`s. + +### var msg = status[code] + +Map of `code` to `status message`. `undefined` for invalid `code`s. + + + +```js +status[404] // => 'Not Found' +``` + +### var code = status[msg] + +Map of `status message` to `code`. `msg` can either be title-cased or +lower-cased. `undefined` for invalid `status message`s. + + + +```js +status['not found'] // => 404 +status['Not Found'] // => 404 +``` + +### status.redirect[code] + +Returns `true` if a status code is a valid redirect status. + + + +```js +status.redirect[200] // => undefined +status.redirect[301] // => true +``` + +### status.empty[code] + +Returns `true` if a status code expects an empty body. + + + +```js +status.empty[200] // => undefined +status.empty[204] // => true +status.empty[304] // => true +``` + +### status.retry[code] + +Returns `true` if you should retry the rest. + + + +```js +status.retry[501] // => undefined +status.retry[503] // => true +``` + +[npm-image]: https://img.shields.io/npm/v/statuses.svg +[npm-url]: https://npmjs.org/package/statuses +[node-version-image]: https://img.shields.io/node/v/statuses.svg +[node-version-url]: https://nodejs.org/en/download +[travis-image]: https://img.shields.io/travis/jshttp/statuses.svg +[travis-url]: https://travis-ci.org/jshttp/statuses +[coveralls-image]: https://img.shields.io/coveralls/jshttp/statuses.svg +[coveralls-url]: https://coveralls.io/r/jshttp/statuses?branch=master +[downloads-image]: https://img.shields.io/npm/dm/statuses.svg +[downloads-url]: https://npmjs.org/package/statuses diff --git a/node_modules/statuses/codes.json b/node_modules/statuses/codes.json new file mode 100644 index 000000000..a09283a27 --- /dev/null +++ b/node_modules/statuses/codes.json @@ -0,0 +1,66 @@ +{ + "100": "Continue", + "101": "Switching Protocols", + "102": "Processing", + "103": "Early Hints", + "200": "OK", + "201": "Created", + "202": "Accepted", + "203": "Non-Authoritative Information", + "204": "No Content", + "205": "Reset Content", + "206": "Partial Content", + "207": "Multi-Status", + "208": "Already Reported", + "226": "IM Used", + "300": "Multiple Choices", + "301": "Moved Permanently", + "302": "Found", + "303": "See Other", + "304": "Not Modified", + "305": "Use Proxy", + "306": "(Unused)", + "307": "Temporary Redirect", + "308": "Permanent Redirect", + "400": "Bad Request", + "401": "Unauthorized", + "402": "Payment Required", + "403": "Forbidden", + "404": "Not Found", + "405": "Method Not Allowed", + "406": "Not Acceptable", + "407": "Proxy Authentication Required", + "408": "Request Timeout", + "409": "Conflict", + "410": "Gone", + "411": "Length Required", + "412": "Precondition Failed", + "413": "Payload Too Large", + "414": "URI Too Long", + "415": "Unsupported Media Type", + "416": "Range Not Satisfiable", + "417": "Expectation Failed", + "418": "I'm a teapot", + "421": "Misdirected Request", + "422": "Unprocessable Entity", + "423": "Locked", + "424": "Failed Dependency", + "425": "Unordered Collection", + "426": "Upgrade Required", + "428": "Precondition Required", + "429": "Too Many 
Requests", + "431": "Request Header Fields Too Large", + "451": "Unavailable For Legal Reasons", + "500": "Internal Server Error", + "501": "Not Implemented", + "502": "Bad Gateway", + "503": "Service Unavailable", + "504": "Gateway Timeout", + "505": "HTTP Version Not Supported", + "506": "Variant Also Negotiates", + "507": "Insufficient Storage", + "508": "Loop Detected", + "509": "Bandwidth Limit Exceeded", + "510": "Not Extended", + "511": "Network Authentication Required" +} diff --git a/node_modules/statuses/index.js b/node_modules/statuses/index.js new file mode 100644 index 000000000..4df469a05 --- /dev/null +++ b/node_modules/statuses/index.js @@ -0,0 +1,113 @@ +/*! + * statuses + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2016 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var codes = require('./codes.json') + +/** + * Module exports. + * @public + */ + +module.exports = status + +// status code to message map +status.STATUS_CODES = codes + +// array of status codes +status.codes = populateStatusesMap(status, codes) + +// status codes for redirects +status.redirect = { + 300: true, + 301: true, + 302: true, + 303: true, + 305: true, + 307: true, + 308: true +} + +// status codes for empty bodies +status.empty = { + 204: true, + 205: true, + 304: true +} + +// status codes for when you should retry the request +status.retry = { + 502: true, + 503: true, + 504: true +} + +/** + * Populate the statuses map for given codes. + * @private + */ + +function populateStatusesMap (statuses, codes) { + var arr = [] + + Object.keys(codes).forEach(function forEachCode (code) { + var message = codes[code] + var status = Number(code) + + // Populate properties + statuses[status] = message + statuses[message] = status + statuses[message.toLowerCase()] = status + + // Add to array + arr.push(status) + }) + + return arr +} + +/** + * Get the status code. + * + * Given a number, this will throw if it is not a known status + * code, otherwise the code will be returned. Given a string, + * the string will be parsed for a number and return the code + * if valid, otherwise will lookup the code assuming this is + * the status message. 
+ * + * @param {string|number} code + * @returns {number} + * @public + */ + +function status (code) { + if (typeof code === 'number') { + if (!status[code]) throw new Error('invalid status code: ' + code) + return code + } + + if (typeof code !== 'string') { + throw new TypeError('code must be a number or string') + } + + // '403' + var n = parseInt(code, 10) + if (!isNaN(n)) { + if (!status[n]) throw new Error('invalid status code: ' + n) + return n + } + + n = status[code.toLowerCase()] + if (!n) throw new Error('invalid status message: "' + code + '"') + return n +} diff --git a/node_modules/statuses/package.json b/node_modules/statuses/package.json new file mode 100644 index 000000000..6a605c9cb --- /dev/null +++ b/node_modules/statuses/package.json @@ -0,0 +1,90 @@ +{ + "_from": "statuses@~1.5.0", + "_id": "statuses@1.5.0", + "_inBundle": false, + "_integrity": "sha1-Fhx9rBd2Wf2YEfQ3cfqZOBR4Yow=", + "_location": "/statuses", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "statuses@~1.5.0", + "name": "statuses", + "escapedName": "statuses", + "rawSpec": "~1.5.0", + "saveSpec": null, + "fetchSpec": "~1.5.0" + }, + "_requiredBy": [ + "/express", + "/finalhandler", + "/http-errors", + "/send" + ], + "_resolved": "https://registry.npmjs.org/statuses/-/statuses-1.5.0.tgz", + "_shasum": "161c7dac177659fd9811f43771fa99381478628c", + "_spec": "statuses@~1.5.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/jshttp/statuses/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "deprecated": false, + "description": "HTTP status utility", + "devDependencies": { + "csv-parse": "1.2.4", + "eslint": "4.19.1", + "eslint-config-standard": "11.0.0", + "eslint-plugin-import": "2.9.0", + "eslint-plugin-markdown": "1.0.0-beta.6", + "eslint-plugin-node": "6.0.1", + "eslint-plugin-promise": "3.7.0", + "eslint-plugin-standard": "3.0.1", + "istanbul": "0.4.5", + "mocha": "1.21.5", + "raw-body": "2.3.2", + "stream-to-array": "2.3.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "HISTORY.md", + "index.js", + "codes.json", + "LICENSE" + ], + "homepage": "https://github.com/jshttp/statuses#readme", + "keywords": [ + "http", + "status", + "code" + ], + "license": "MIT", + "name": "statuses", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/statuses.git" + }, + "scripts": { + "build": "node scripts/build.js", + "fetch": "node scripts/fetch-apache.js && node scripts/fetch-iana.js && node scripts/fetch-nginx.js && node scripts/fetch-node.js", + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --check-leaks --bail test/", + "test-ci": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "update": "npm run fetch && npm run build" + }, + "version": "1.5.0" +} diff --git a/node_modules/string-width/index.d.ts b/node_modules/string-width/index.d.ts new file mode 100644 index 000000000..12b530975 --- /dev/null +++ b/node_modules/string-width/index.d.ts @@ -0,0 +1,29 @@ +declare const stringWidth: { + /** + Get the visual width of a string - the number of 
columns required to display it. + + Some Unicode characters are [fullwidth](https://en.wikipedia.org/wiki/Halfwidth_and_fullwidth_forms) and use double the normal width. [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code) are stripped and doesn't affect the width. + + @example + ``` + import stringWidth = require('string-width'); + + stringWidth('a'); + //=> 1 + + stringWidth('古'); + //=> 2 + + stringWidth('\u001B[1m古\u001B[22m'); + //=> 2 + ``` + */ + (string: string): number; + + // TODO: remove this in the next major version, refactor the whole definition to: + // declare function stringWidth(string: string): number; + // export = stringWidth; + default: typeof stringWidth; +} + +export = stringWidth; diff --git a/node_modules/string-width/index.js b/node_modules/string-width/index.js new file mode 100644 index 000000000..f4d261a96 --- /dev/null +++ b/node_modules/string-width/index.js @@ -0,0 +1,47 @@ +'use strict'; +const stripAnsi = require('strip-ansi'); +const isFullwidthCodePoint = require('is-fullwidth-code-point'); +const emojiRegex = require('emoji-regex'); + +const stringWidth = string => { + if (typeof string !== 'string' || string.length === 0) { + return 0; + } + + string = stripAnsi(string); + + if (string.length === 0) { + return 0; + } + + string = string.replace(emojiRegex(), ' '); + + let width = 0; + + for (let i = 0; i < string.length; i++) { + const code = string.codePointAt(i); + + // Ignore control characters + if (code <= 0x1F || (code >= 0x7F && code <= 0x9F)) { + continue; + } + + // Ignore combining characters + if (code >= 0x300 && code <= 0x36F) { + continue; + } + + // Surrogates + if (code > 0xFFFF) { + i++; + } + + width += isFullwidthCodePoint(code) ? 2 : 1; + } + + return width; +}; + +module.exports = stringWidth; +// TODO: remove this in the next major version +module.exports.default = stringWidth; diff --git a/node_modules/string-width/license b/node_modules/string-width/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/string-width/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
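Based on the string-width index.js just shown, a small sketch of the edge cases that implementation handles beyond the basic readme examples. The expected widths follow directly from the code above (ANSI sequences are stripped, combining marks U+0300–U+036F are skipped, control characters are ignored); the inputs themselves are illustrative.

```js
const stringWidth = require('string-width');

// ANSI sequences are stripped before measuring, so styling adds no width.
stringWidth('\u001B[31mred\u001B[39m'); //=> 3

// Combining marks are skipped, so 'e' plus a combining acute accent is one column.
stringWidth('e\u0301'); //=> 1

// Control characters such as tab are ignored entirely.
stringWidth('a\tb'); //=> 2
```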
diff --git a/node_modules/string-width/package.json b/node_modules/string-width/package.json new file mode 100644 index 000000000..80ceec525 --- /dev/null +++ b/node_modules/string-width/package.json @@ -0,0 +1,91 @@ +{ + "_from": "string-width@^4.2.2", + "_id": "string-width@4.2.3", + "_inBundle": false, + "_integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==", + "_location": "/string-width", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "string-width@^4.2.2", + "name": "string-width", + "escapedName": "string-width", + "rawSpec": "^4.2.2", + "saveSpec": null, + "fetchSpec": "^4.2.2" + }, + "_requiredBy": [ + "/ansi-align", + "/boxen", + "/widest-line", + "/wrap-ansi" + ], + "_resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz", + "_shasum": "269c7117d27b05ad2e536830a8ec895ef9c6d010", + "_spec": "string-width@^4.2.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\boxen", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/string-width/issues" + }, + "bundleDependencies": false, + "dependencies": { + "emoji-regex": "^8.0.0", + "is-fullwidth-code-point": "^3.0.0", + "strip-ansi": "^6.0.1" + }, + "deprecated": false, + "description": "Get the visual width of a string - the number of columns required to display it", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.1", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/string-width#readme", + "keywords": [ + "string", + "character", + "unicode", + "width", + "visual", + "column", + "columns", + "fullwidth", + "full-width", + "full", + "ansi", + "escape", + "codes", + "cli", + "command-line", + "terminal", + "console", + "cjk", + "chinese", + "japanese", + "korean", + "fixed-width" + ], + "license": "MIT", + "name": "string-width", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/string-width.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "4.2.3" +} diff --git a/node_modules/string-width/readme.md b/node_modules/string-width/readme.md new file mode 100644 index 000000000..bdd314129 --- /dev/null +++ b/node_modules/string-width/readme.md @@ -0,0 +1,50 @@ +# string-width + +> Get the visual width of a string - the number of columns required to display it + +Some Unicode characters are [fullwidth](https://en.wikipedia.org/wiki/Halfwidth_and_fullwidth_forms) and use double the normal width. [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code) are stripped and doesn't affect the width. + +Useful to be able to measure the actual width of command-line output. + + +## Install + +``` +$ npm install string-width +``` + + +## Usage + +```js +const stringWidth = require('string-width'); + +stringWidth('a'); +//=> 1 + +stringWidth('古'); +//=> 2 + +stringWidth('\u001B[1m古\u001B[22m'); +//=> 2 +``` + + +## Related + +- [string-width-cli](https://github.com/sindresorhus/string-width-cli) - CLI for this module +- [string-length](https://github.com/sindresorhus/string-length) - Get the real length of a string +- [widest-line](https://github.com/sindresorhus/widest-line) - Get the visual width of the widest line in a string + + +--- + +
+Get professional support for this package with a Tidelift subscription.
+
+Tidelift helps make open source sustainable for maintainers while giving companies
+assurances about security, maintenance, and licensing for their dependencies.
+
diff --git a/node_modules/strip-ansi/index.d.ts b/node_modules/strip-ansi/index.d.ts new file mode 100644 index 000000000..907fccc29 --- /dev/null +++ b/node_modules/strip-ansi/index.d.ts @@ -0,0 +1,17 @@ +/** +Strip [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code) from a string. + +@example +``` +import stripAnsi = require('strip-ansi'); + +stripAnsi('\u001B[4mUnicorn\u001B[0m'); +//=> 'Unicorn' + +stripAnsi('\u001B]8;;https://github.com\u0007Click\u001B]8;;\u0007'); +//=> 'Click' +``` +*/ +declare function stripAnsi(string: string): string; + +export = stripAnsi; diff --git a/node_modules/strip-ansi/index.js b/node_modules/strip-ansi/index.js new file mode 100644 index 000000000..9a593dfcd --- /dev/null +++ b/node_modules/strip-ansi/index.js @@ -0,0 +1,4 @@ +'use strict'; +const ansiRegex = require('ansi-regex'); + +module.exports = string => typeof string === 'string' ? string.replace(ansiRegex(), '') : string; diff --git a/node_modules/strip-ansi/license b/node_modules/strip-ansi/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/strip-ansi/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
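One detail of the one-line strip-ansi implementation above that the readme examples below don't show: non-string input is returned unchanged rather than throwing. A small sketch with illustrative values:

```js
const stripAnsi = require('strip-ansi');

// Escape sequences are removed via ansi-regex...
stripAnsi('\u001B[36mcyan\u001B[39m'); //=> 'cyan'

// ...while non-string values simply pass through untouched.
stripAnsi(42);   //=> 42
stripAnsi(null); //=> null
```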
diff --git a/node_modules/strip-ansi/package.json b/node_modules/strip-ansi/package.json new file mode 100644 index 000000000..0a46ad68c --- /dev/null +++ b/node_modules/strip-ansi/package.json @@ -0,0 +1,87 @@ +{ + "_from": "strip-ansi@^6.0.1", + "_id": "strip-ansi@6.0.1", + "_inBundle": false, + "_integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==", + "_location": "/strip-ansi", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "strip-ansi@^6.0.1", + "name": "strip-ansi", + "escapedName": "strip-ansi", + "rawSpec": "^6.0.1", + "saveSpec": null, + "fetchSpec": "^6.0.1" + }, + "_requiredBy": [ + "/string-width", + "/wrap-ansi" + ], + "_resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz", + "_shasum": "9e26c63d30f53443e9489495b2105d37b67a85d9", + "_spec": "strip-ansi@^6.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\string-width", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/chalk/strip-ansi/issues" + }, + "bundleDependencies": false, + "dependencies": { + "ansi-regex": "^5.0.1" + }, + "deprecated": false, + "description": "Strip ANSI escape codes from a string", + "devDependencies": { + "ava": "^2.4.0", + "tsd": "^0.10.0", + "xo": "^0.25.3" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/chalk/strip-ansi#readme", + "keywords": [ + "strip", + "trim", + "remove", + "ansi", + "styles", + "color", + "colour", + "colors", + "terminal", + "console", + "string", + "tty", + "escape", + "formatting", + "rgb", + "256", + "shell", + "xterm", + "log", + "logging", + "command-line", + "text" + ], + "license": "MIT", + "name": "strip-ansi", + "repository": { + "type": "git", + "url": "git+https://github.com/chalk/strip-ansi.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "6.0.1" +} diff --git a/node_modules/strip-ansi/readme.md b/node_modules/strip-ansi/readme.md new file mode 100644 index 000000000..7c4b56d46 --- /dev/null +++ b/node_modules/strip-ansi/readme.md @@ -0,0 +1,46 @@ +# strip-ansi [![Build Status](https://travis-ci.org/chalk/strip-ansi.svg?branch=master)](https://travis-ci.org/chalk/strip-ansi) + +> Strip [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code) from a string + + +## Install + +``` +$ npm install strip-ansi +``` + + +## Usage + +```js +const stripAnsi = require('strip-ansi'); + +stripAnsi('\u001B[4mUnicorn\u001B[0m'); +//=> 'Unicorn' + +stripAnsi('\u001B]8;;https://github.com\u0007Click\u001B]8;;\u0007'); +//=> 'Click' +``` + + +## strip-ansi for enterprise + +Available as part of the Tidelift Subscription. + +The maintainers of strip-ansi and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. 
[Learn more.](https://tidelift.com/subscription/pkg/npm-strip-ansi?utm_source=npm-strip-ansi&utm_medium=referral&utm_campaign=enterprise&utm_term=repo) + + +## Related + +- [strip-ansi-cli](https://github.com/chalk/strip-ansi-cli) - CLI for this module +- [strip-ansi-stream](https://github.com/chalk/strip-ansi-stream) - Streaming version of this module +- [has-ansi](https://github.com/chalk/has-ansi) - Check if a string has ANSI escape codes +- [ansi-regex](https://github.com/chalk/ansi-regex) - Regular expression for matching ANSI escape codes +- [chalk](https://github.com/chalk/chalk) - Terminal string styling done right + + +## Maintainers + +- [Sindre Sorhus](https://github.com/sindresorhus) +- [Josh Junon](https://github.com/qix-) + diff --git a/node_modules/strip-json-comments/index.js b/node_modules/strip-json-comments/index.js new file mode 100644 index 000000000..4e6576e6d --- /dev/null +++ b/node_modules/strip-json-comments/index.js @@ -0,0 +1,70 @@ +'use strict'; +var singleComment = 1; +var multiComment = 2; + +function stripWithoutWhitespace() { + return ''; +} + +function stripWithWhitespace(str, start, end) { + return str.slice(start, end).replace(/\S/g, ' '); +} + +module.exports = function (str, opts) { + opts = opts || {}; + + var currentChar; + var nextChar; + var insideString = false; + var insideComment = false; + var offset = 0; + var ret = ''; + var strip = opts.whitespace === false ? stripWithoutWhitespace : stripWithWhitespace; + + for (var i = 0; i < str.length; i++) { + currentChar = str[i]; + nextChar = str[i + 1]; + + if (!insideComment && currentChar === '"') { + var escaped = str[i - 1] === '\\' && str[i - 2] !== '\\'; + if (!escaped) { + insideString = !insideString; + } + } + + if (insideString) { + continue; + } + + if (!insideComment && currentChar + nextChar === '//') { + ret += str.slice(offset, i); + offset = i; + insideComment = singleComment; + i++; + } else if (insideComment === singleComment && currentChar + nextChar === '\r\n') { + i++; + insideComment = false; + ret += strip(str, offset, i); + offset = i; + continue; + } else if (insideComment === singleComment && currentChar === '\n') { + insideComment = false; + ret += strip(str, offset, i); + offset = i; + } else if (!insideComment && currentChar + nextChar === '/*') { + ret += str.slice(offset, i); + offset = i; + insideComment = multiComment; + i++; + continue; + } else if (insideComment === multiComment && currentChar + nextChar === '*/') { + i++; + insideComment = false; + ret += strip(str, offset, i + 1); + offset = i + 1; + continue; + } + } + + return ret + (insideComment ? strip(str.substr(offset)) : str.substr(offset)); +}; diff --git a/node_modules/strip-json-comments/license b/node_modules/strip-json-comments/license new file mode 100644 index 000000000..654d0bfe9 --- /dev/null +++ b/node_modules/strip-json-comments/license @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/strip-json-comments/package.json b/node_modules/strip-json-comments/package.json new file mode 100644 index 000000000..041f80cf6 --- /dev/null +++ b/node_modules/strip-json-comments/package.json @@ -0,0 +1,74 @@ +{ + "_from": "strip-json-comments@~2.0.1", + "_id": "strip-json-comments@2.0.1", + "_inBundle": false, + "_integrity": "sha1-PFMZQukIwml8DsNEhYwobHygpgo=", + "_location": "/strip-json-comments", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "strip-json-comments@~2.0.1", + "name": "strip-json-comments", + "escapedName": "strip-json-comments", + "rawSpec": "~2.0.1", + "saveSpec": null, + "fetchSpec": "~2.0.1" + }, + "_requiredBy": [ + "/rc" + ], + "_resolved": "https://registry.npmjs.org/strip-json-comments/-/strip-json-comments-2.0.1.tgz", + "_shasum": "3c531942e908c2697c0ec344858c286c7ca0a60a", + "_spec": "strip-json-comments@~2.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\rc", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/strip-json-comments/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Strip comments from JSON. Lets you use comments in your JSON files!", + "devDependencies": { + "ava": "*", + "xo": "*" + }, + "engines": { + "node": ">=0.10.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/strip-json-comments#readme", + "keywords": [ + "json", + "strip", + "remove", + "delete", + "trim", + "comments", + "multiline", + "parse", + "config", + "configuration", + "conf", + "settings", + "util", + "env", + "environment" + ], + "license": "MIT", + "name": "strip-json-comments", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/strip-json-comments.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "2.0.1" +} diff --git a/node_modules/strip-json-comments/readme.md b/node_modules/strip-json-comments/readme.md new file mode 100644 index 000000000..0ee58dfe3 --- /dev/null +++ b/node_modules/strip-json-comments/readme.md @@ -0,0 +1,64 @@ +# strip-json-comments [![Build Status](https://travis-ci.org/sindresorhus/strip-json-comments.svg?branch=master)](https://travis-ci.org/sindresorhus/strip-json-comments) + +> Strip comments from JSON. Lets you use comments in your JSON files! + +This is now possible: + +```js +{ + // rainbows + "unicorn": /* ❤ */ "cake" +} +``` + +It will replace single-line comments `//` and multi-line comments `/**/` with whitespace. This allows JSON error positions to remain as close as possible to the original source. + +Also available as a [gulp](https://github.com/sindresorhus/gulp-strip-json-comments)/[grunt](https://github.com/sindresorhus/grunt-strip-json-comments)/[broccoli](https://github.com/sindresorhus/broccoli-strip-json-comments) plugin. 
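An editor-added usage sketch based on the parser shown above and the `stripJsonComments(input, [options])` API documented in this readme; note that the readme's own Usage block below omits the `require` call. The commented JSON payload is illustrative.

```js
// Sketch: strip comments so the text becomes valid JSON before parsing.
const stripJsonComments = require('strip-json-comments');

const raw = [
  '{',
  '  // single-line comments become whitespace, so JSON.parse error positions are preserved',
  '  "unicorn": /* ❤ */ "cake"',
  '}'
].join('\n');

console.log(JSON.parse(stripJsonComments(raw)));
//=> { unicorn: 'cake' }
```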
+ + +## Install + +``` +$ npm install --save strip-json-comments +``` + + +## Usage + +```js +const json = '{/*rainbows*/"unicorn":"cake"}'; + +JSON.parse(stripJsonComments(json)); +//=> {unicorn: 'cake'} +``` + + +## API + +### stripJsonComments(input, [options]) + +#### input + +Type: `string` + +Accepts a string with JSON and returns a string without comments. + +#### options + +##### whitespace + +Type: `boolean` +Default: `true` + +Replace comments with whitespace instead of stripping them entirely. + + +## Related + +- [strip-json-comments-cli](https://github.com/sindresorhus/strip-json-comments-cli) - CLI for this module +- [strip-css-comments](https://github.com/sindresorhus/strip-css-comments) - Strip comments from CSS + + +## License + +MIT © [Sindre Sorhus](http://sindresorhus.com) diff --git a/node_modules/supports-color/browser.js b/node_modules/supports-color/browser.js new file mode 100644 index 000000000..62afa3a74 --- /dev/null +++ b/node_modules/supports-color/browser.js @@ -0,0 +1,5 @@ +'use strict'; +module.exports = { + stdout: false, + stderr: false +}; diff --git a/node_modules/supports-color/index.js b/node_modules/supports-color/index.js new file mode 100644 index 000000000..1704131bd --- /dev/null +++ b/node_modules/supports-color/index.js @@ -0,0 +1,131 @@ +'use strict'; +const os = require('os'); +const hasFlag = require('has-flag'); + +const env = process.env; + +let forceColor; +if (hasFlag('no-color') || + hasFlag('no-colors') || + hasFlag('color=false')) { + forceColor = false; +} else if (hasFlag('color') || + hasFlag('colors') || + hasFlag('color=true') || + hasFlag('color=always')) { + forceColor = true; +} +if ('FORCE_COLOR' in env) { + forceColor = env.FORCE_COLOR.length === 0 || parseInt(env.FORCE_COLOR, 10) !== 0; +} + +function translateLevel(level) { + if (level === 0) { + return false; + } + + return { + level, + hasBasic: true, + has256: level >= 2, + has16m: level >= 3 + }; +} + +function supportsColor(stream) { + if (forceColor === false) { + return 0; + } + + if (hasFlag('color=16m') || + hasFlag('color=full') || + hasFlag('color=truecolor')) { + return 3; + } + + if (hasFlag('color=256')) { + return 2; + } + + if (stream && !stream.isTTY && forceColor !== true) { + return 0; + } + + const min = forceColor ? 1 : 0; + + if (process.platform === 'win32') { + // Node.js 7.5.0 is the first version of Node.js to include a patch to + // libuv that enables 256 color output on Windows. Anything earlier and it + // won't work. However, here we target Node.js 8 at minimum as it is an LTS + // release, and Node.js 7 is not. Windows 10 build 10586 is the first Windows + // release that supports 256 colors. Windows 10 build 14931 is the first release + // that supports 16m/TrueColor. + const osRelease = os.release().split('.'); + if ( + Number(process.versions.node.split('.')[0]) >= 8 && + Number(osRelease[0]) >= 10 && + Number(osRelease[2]) >= 10586 + ) { + return Number(osRelease[2]) >= 14931 ? 3 : 2; + } + + return 1; + } + + if ('CI' in env) { + if (['TRAVIS', 'CIRCLECI', 'APPVEYOR', 'GITLAB_CI'].some(sign => sign in env) || env.CI_NAME === 'codeship') { + return 1; + } + + return min; + } + + if ('TEAMCITY_VERSION' in env) { + return /^(9\.(0*[1-9]\d*)\.|\d{2,}\.)/.test(env.TEAMCITY_VERSION) ? 1 : 0; + } + + if (env.COLORTERM === 'truecolor') { + return 3; + } + + if ('TERM_PROGRAM' in env) { + const version = parseInt((env.TERM_PROGRAM_VERSION || '').split('.')[0], 10); + + switch (env.TERM_PROGRAM) { + case 'iTerm.app': + return version >= 3 ? 
3 : 2; + case 'Apple_Terminal': + return 2; + // No default + } + } + + if (/-256(color)?$/i.test(env.TERM)) { + return 2; + } + + if (/^screen|^xterm|^vt100|^vt220|^rxvt|color|ansi|cygwin|linux/i.test(env.TERM)) { + return 1; + } + + if ('COLORTERM' in env) { + return 1; + } + + if (env.TERM === 'dumb') { + return min; + } + + return min; +} + +function getSupportLevel(stream) { + const level = supportsColor(stream); + return translateLevel(level); +} + +module.exports = { + supportsColor: getSupportLevel, + stdout: getSupportLevel(process.stdout), + stderr: getSupportLevel(process.stderr) +}; diff --git a/node_modules/supports-color/license b/node_modules/supports-color/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/supports-color/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
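A brief, editor-added sketch of how the detection module shown above is typically consumed; the exported `stdout`/`stderr` objects and their `.level` values come from the `module.exports` block in the index.js just shown, while the escape codes and message text are illustrative.

```js
// Sketch: gate ANSI colour output on the detected support level.
const supportsColor = require('supports-color');

function green(text) {
  // supportsColor.stdout is false when colour is unsupported, otherwise an
  // object whose .level is 1 (16 colours), 2 (256) or 3 (truecolor).
  if (supportsColor.stdout && supportsColor.stdout.level >= 1) {
    return '\u001B[32m' + text + '\u001B[39m';
  }
  return text;
}

console.log(green('build passed'));
```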
diff --git a/node_modules/supports-color/package.json b/node_modules/supports-color/package.json new file mode 100644 index 000000000..fc3bb1801 --- /dev/null +++ b/node_modules/supports-color/package.json @@ -0,0 +1,85 @@ +{ + "_from": "supports-color@^5.3.0", + "_id": "supports-color@5.5.0", + "_inBundle": false, + "_integrity": "sha512-QjVjwdXIt408MIiAqCX4oUKsgU2EqAGzs2Ppkm4aQYbjm+ZEWEcW4SfFNTr4uMNZma0ey4f5lgLrkB0aX0QMow==", + "_location": "/supports-color", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "supports-color@^5.3.0", + "name": "supports-color", + "escapedName": "supports-color", + "rawSpec": "^5.3.0", + "saveSpec": null, + "fetchSpec": "^5.3.0" + }, + "_requiredBy": [ + "/chalk" + ], + "_resolved": "https://registry.npmjs.org/supports-color/-/supports-color-5.5.0.tgz", + "_shasum": "e2e69a44ac8772f78a1ec0b35b689df6530efc8f", + "_spec": "supports-color@^5.3.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\chalk", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "browser": "browser.js", + "bugs": { + "url": "https://github.com/chalk/supports-color/issues" + }, + "bundleDependencies": false, + "dependencies": { + "has-flag": "^3.0.0" + }, + "deprecated": false, + "description": "Detect whether a terminal supports color", + "devDependencies": { + "ava": "^0.25.0", + "import-fresh": "^2.0.0", + "xo": "^0.20.0" + }, + "engines": { + "node": ">=4" + }, + "files": [ + "index.js", + "browser.js" + ], + "homepage": "https://github.com/chalk/supports-color#readme", + "keywords": [ + "color", + "colour", + "colors", + "terminal", + "console", + "cli", + "ansi", + "styles", + "tty", + "rgb", + "256", + "shell", + "xterm", + "command-line", + "support", + "supports", + "capability", + "detect", + "truecolor", + "16m" + ], + "license": "MIT", + "name": "supports-color", + "repository": { + "type": "git", + "url": "git+https://github.com/chalk/supports-color.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "5.5.0" +} diff --git a/node_modules/supports-color/readme.md b/node_modules/supports-color/readme.md new file mode 100644 index 000000000..f6e401957 --- /dev/null +++ b/node_modules/supports-color/readme.md @@ -0,0 +1,66 @@ +# supports-color [![Build Status](https://travis-ci.org/chalk/supports-color.svg?branch=master)](https://travis-ci.org/chalk/supports-color) + +> Detect whether a terminal supports color + + +## Install + +``` +$ npm install supports-color +``` + + +## Usage + +```js +const supportsColor = require('supports-color'); + +if (supportsColor.stdout) { + console.log('Terminal stdout supports color'); +} + +if (supportsColor.stdout.has256) { + console.log('Terminal stdout supports 256 colors'); +} + +if (supportsColor.stderr.has16m) { + console.log('Terminal stderr supports 16 million colors (truecolor)'); +} +``` + + +## API + +Returns an `Object` with a `stdout` and `stderr` property for testing either streams. Each property is an `Object`, or `false` if color is not supported. + +The `stdout`/`stderr` objects specifies a level of support for color through a `.level` property and a corresponding flag: + +- `.level = 1` and `.hasBasic = true`: Basic color support (16 colors) +- `.level = 2` and `.has256 = true`: 256 color support +- `.level = 3` and `.has16m = true`: Truecolor support (16 million colors) + + +## Info + +It obeys the `--color` and `--no-color` CLI flags. 
+ +Can be overridden by the user with the flags `--color` and `--no-color`. For situations where using `--color` is not possible, add the environment variable `FORCE_COLOR=1` to forcefully enable color or `FORCE_COLOR=0` to forcefully disable. The use of `FORCE_COLOR` overrides all other color support checks. + +Explicit 256/Truecolor mode can be enabled using the `--color=256` and `--color=16m` flags, respectively. + + +## Related + +- [supports-color-cli](https://github.com/chalk/supports-color-cli) - CLI for this module +- [chalk](https://github.com/chalk/chalk) - Terminal string styling done right + + +## Maintainers + +- [Sindre Sorhus](https://github.com/sindresorhus) +- [Josh Junon](https://github.com/qix-) + + +## License + +MIT diff --git a/node_modules/to-readable-stream/index.js b/node_modules/to-readable-stream/index.js new file mode 100644 index 000000000..554bfa5f9 --- /dev/null +++ b/node_modules/to-readable-stream/index.js @@ -0,0 +1,11 @@ +'use strict'; +const {Readable} = require('stream'); + +module.exports = input => ( + new Readable({ + read() { + this.push(input); + this.push(null); + } + }) +); diff --git a/node_modules/to-readable-stream/license b/node_modules/to-readable-stream/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/to-readable-stream/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
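The factory in the to-readable-stream index.js above simply pushes its input once and then signals end-of-stream. A short, editor-added sketch of consuming the resulting Readable (the JSON payload is illustrative):

```js
// Sketch: turn a string into a stream and collect it back out.
const toReadableStream = require('to-readable-stream');

const stream = toReadableStream('{"unicorn":"cake"}');
stream.setEncoding('utf8');

let body = '';
stream.on('data', chunk => { body += chunk; });
stream.on('end', () => console.log(JSON.parse(body).unicorn)); //=> cake
```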
diff --git a/node_modules/to-readable-stream/package.json b/node_modules/to-readable-stream/package.json new file mode 100644 index 000000000..cc9d84da6 --- /dev/null +++ b/node_modules/to-readable-stream/package.json @@ -0,0 +1,72 @@ +{ + "_from": "to-readable-stream@^1.0.0", + "_id": "to-readable-stream@1.0.0", + "_inBundle": false, + "_integrity": "sha512-Iq25XBt6zD5npPhlLVXGFN3/gyR2/qODcKNNyTMd4vbm39HUaOiAM4PMq0eMVC/Tkxz+Zjdsc55g9yyz+Yq00Q==", + "_location": "/to-readable-stream", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "to-readable-stream@^1.0.0", + "name": "to-readable-stream", + "escapedName": "to-readable-stream", + "rawSpec": "^1.0.0", + "saveSpec": null, + "fetchSpec": "^1.0.0" + }, + "_requiredBy": [ + "/got" + ], + "_resolved": "https://registry.npmjs.org/to-readable-stream/-/to-readable-stream-1.0.0.tgz", + "_shasum": "ce0aa0c2f3df6adf852efb404a783e77c0475771", + "_spec": "to-readable-stream@^1.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\got", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/to-readable-stream/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Convert a string/Buffer/Uint8Array to a readable stream", + "devDependencies": { + "ava": "*", + "get-stream": "^3.0.0", + "xo": "*" + }, + "engines": { + "node": ">=6" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/sindresorhus/to-readable-stream#readme", + "keywords": [ + "stream", + "readablestream", + "string", + "buffer", + "uint8array", + "from", + "into", + "to", + "transform", + "convert", + "readable", + "pull" + ], + "license": "MIT", + "name": "to-readable-stream", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/to-readable-stream.git" + }, + "scripts": { + "test": "xo && ava" + }, + "version": "1.0.0" +} diff --git a/node_modules/to-readable-stream/readme.md b/node_modules/to-readable-stream/readme.md new file mode 100644 index 000000000..fc207c5c9 --- /dev/null +++ b/node_modules/to-readable-stream/readme.md @@ -0,0 +1,42 @@ +# to-readable-stream [![Build Status](https://travis-ci.org/sindresorhus/to-readable-stream.svg?branch=master)](https://travis-ci.org/sindresorhus/to-readable-stream) + +> Convert a string/Buffer/Uint8Array to a [readable stream](https://nodejs.org/api/stream.html#stream_readable_streams) + + +## Install + +``` +$ npm install to-readable-stream +``` + + +## Usage + +```js +const toReadableStream = require('to-readable-stream'); + +toReadableStream('🦄🌈').pipe(process.stdout); +``` + + +## API + +### toReadableStream(input) + +Returns a [`stream.Readable`](https://nodejs.org/api/stream.html#stream_readable_streams). + +#### input + +Type: `string` `Buffer` `Uint8Array` + +Value to convert to a stream. + + +## Related + +- [into-stream](https://github.com/sindresorhus/into-stream) - More advanced version of this module + + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/to-regex-range/LICENSE b/node_modules/to-regex-range/LICENSE new file mode 100644 index 000000000..7cccaf9e3 --- /dev/null +++ b/node_modules/to-regex-range/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2015-present, Jon Schlinkert. 
+ +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/to-regex-range/README.md b/node_modules/to-regex-range/README.md new file mode 100644 index 000000000..38887dafa --- /dev/null +++ b/node_modules/to-regex-range/README.md @@ -0,0 +1,305 @@ +# to-regex-range [![Donate](https://img.shields.io/badge/Donate-PayPal-green.svg)](https://www.paypal.com/cgi-bin/webscr?cmd=_s-xclick&hosted_button_id=W8YFZ425KND68) [![NPM version](https://img.shields.io/npm/v/to-regex-range.svg?style=flat)](https://www.npmjs.com/package/to-regex-range) [![NPM monthly downloads](https://img.shields.io/npm/dm/to-regex-range.svg?style=flat)](https://npmjs.org/package/to-regex-range) [![NPM total downloads](https://img.shields.io/npm/dt/to-regex-range.svg?style=flat)](https://npmjs.org/package/to-regex-range) [![Linux Build Status](https://img.shields.io/travis/micromatch/to-regex-range.svg?style=flat&label=Travis)](https://travis-ci.org/micromatch/to-regex-range) + +> Pass two numbers, get a regex-compatible source string for matching ranges. Validated against more than 2.78 million test assertions. + +Please consider following this project's author, [Jon Schlinkert](https://github.com/jonschlinkert), and consider starring the project to show your :heart: and support. + +## Install + +Install with [npm](https://www.npmjs.com/): + +```sh +$ npm install --save to-regex-range +``` + +
+## What does this do? +
+ +This library generates the `source` string to be passed to `new RegExp()` for matching a range of numbers. + +**Example** + +```js +const toRegexRange = require('to-regex-range'); +const regex = new RegExp(toRegexRange('15', '95')); +``` + +A string is returned so that you can do whatever you need with it before passing it to `new RegExp()` (like adding `^` or `$` boundaries, defining flags, or combining it with another string). +
+ +## Why use this library? +
+ +### Convenience + +Creating regular expressions for matching numbers gets deceptively complicated pretty fast. + +For example, let's say you need a validation regex for matching part of a user-id, postal code, social security number, tax id, etc: + +* regex for matching `1` => `/1/` (easy enough) +* regex for matching `1` through `5` => `/[1-5]/` (not bad...) +* regex for matching `1` or `5` => `/(1|5)/` (still easy...) +* regex for matching `1` through `50` => `/([1-9]|[1-4][0-9]|50)/` (uh-oh...) +* regex for matching `1` through `55` => `/([1-9]|[1-4][0-9]|5[0-5])/` (no prob, I can do this...) +* regex for matching `1` through `555` => `/([1-9]|[1-9][0-9]|[1-4][0-9]{2}|5[0-4][0-9]|55[0-5])/` (maybe not...) +* regex for matching `0001` through `5555` => `/(0{3}[1-9]|0{2}[1-9][0-9]|0[1-9][0-9]{2}|[1-4][0-9]{3}|5[0-4][0-9]{2}|55[0-4][0-9]|555[0-5])/` (okay, I get the point!) + +The numbers are contrived, but they're also really basic. In the real world you might need to generate a regex on-the-fly for validation. + +**Learn more** + +If you're interested in learning more about [character classes](http://www.regular-expressions.info/charclass.html) and other regex features, I personally have always found [regular-expressions.info](http://www.regular-expressions.info/charclass.html) to be pretty useful. + +### Heavily tested + +As of April 07, 2019, this library runs [>1m test assertions](./test/test.js) against generated regex-ranges to provide brute-force verification that results are correct. + +Tests run in ~280ms on my MacBook Pro, 2.5 GHz Intel Core i7. + +### Optimized + +Generated regular expressions are optimized: + +* duplicate sequences and character classes are reduced using quantifiers +* smart enough to use `?` conditionals when number(s) or range(s) can be positive or negative +* uses fragment caching to avoid processing the same exact string more than once + +
+
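To make the hand-written patterns in the list above concrete, here is a short editor-added sketch that lets the library generate the `1` to `555` case from that list instead; the pattern it returns is the alternation shown in the Examples table further down (possibly wrapped in a non-capturing group).

```js
// Sketch: generate and test the 1..555 range instead of hand-writing the regex.
const toRegexRange = require('to-regex-range');

const source = toRegexRange('1', '555'); // same alternation as the hand-written version above
const range = new RegExp(`^(?:${source})$`);

console.log(range.test('1'));   //=> true
console.log(range.test('555')); //=> true
console.log(range.test('0'));   //=> false
console.log(range.test('556')); //=> false
```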
+ +## Usage + +Add this library to your javascript application with the following line of code + +```js +const toRegexRange = require('to-regex-range'); +``` + +The main export is a function that takes two integers: the `min` value and `max` value (formatted as strings or numbers). + +```js +const source = toRegexRange('15', '95'); +//=> 1[5-9]|[2-8][0-9]|9[0-5] + +const regex = new RegExp(`^${source}$`); +console.log(regex.test('14')); //=> false +console.log(regex.test('50')); //=> true +console.log(regex.test('94')); //=> true +console.log(regex.test('96')); //=> false +``` + +## Options + +### options.capture + +**Type**: `boolean` + +**Deafault**: `undefined` + +Wrap the returned value in parentheses when there is more than one regex condition. Useful when you're dynamically generating ranges. + +```js +console.log(toRegexRange('-10', '10')); +//=> -[1-9]|-?10|[0-9] + +console.log(toRegexRange('-10', '10', { capture: true })); +//=> (-[1-9]|-?10|[0-9]) +``` + +### options.shorthand + +**Type**: `boolean` + +**Deafault**: `undefined` + +Use the regex shorthand for `[0-9]`: + +```js +console.log(toRegexRange('0', '999999')); +//=> [0-9]|[1-9][0-9]{1,5} + +console.log(toRegexRange('0', '999999', { shorthand: true })); +//=> \d|[1-9]\d{1,5} +``` + +### options.relaxZeros + +**Type**: `boolean` + +**Default**: `true` + +This option relaxes matching for leading zeros when when ranges are zero-padded. + +```js +const source = toRegexRange('-0010', '0010'); +const regex = new RegExp(`^${source}$`); +console.log(regex.test('-10')); //=> true +console.log(regex.test('-010')); //=> true +console.log(regex.test('-0010')); //=> true +console.log(regex.test('10')); //=> true +console.log(regex.test('010')); //=> true +console.log(regex.test('0010')); //=> true +``` + +When `relaxZeros` is false, matching is strict: + +```js +const source = toRegexRange('-0010', '0010', { relaxZeros: false }); +const regex = new RegExp(`^${source}$`); +console.log(regex.test('-10')); //=> false +console.log(regex.test('-010')); //=> false +console.log(regex.test('-0010')); //=> true +console.log(regex.test('10')); //=> false +console.log(regex.test('010')); //=> false +console.log(regex.test('0010')); //=> true +``` + +## Examples + +| **Range** | **Result** | **Compile time** | +| --- | --- | --- | +| `toRegexRange(-10, 10)` | `-[1-9]\|-?10\|[0-9]` | _132μs_ | +| `toRegexRange(-100, -10)` | `-1[0-9]\|-[2-9][0-9]\|-100` | _50μs_ | +| `toRegexRange(-100, 100)` | `-[1-9]\|-?[1-9][0-9]\|-?100\|[0-9]` | _42μs_ | +| `toRegexRange(001, 100)` | `0{0,2}[1-9]\|0?[1-9][0-9]\|100` | _109μs_ | +| `toRegexRange(001, 555)` | `0{0,2}[1-9]\|0?[1-9][0-9]\|[1-4][0-9]{2}\|5[0-4][0-9]\|55[0-5]` | _51μs_ | +| `toRegexRange(0010, 1000)` | `0{0,2}1[0-9]\|0{0,2}[2-9][0-9]\|0?[1-9][0-9]{2}\|1000` | _31μs_ | +| `toRegexRange(1, 50)` | `[1-9]\|[1-4][0-9]\|50` | _24μs_ | +| `toRegexRange(1, 55)` | `[1-9]\|[1-4][0-9]\|5[0-5]` | _23μs_ | +| `toRegexRange(1, 555)` | `[1-9]\|[1-9][0-9]\|[1-4][0-9]{2}\|5[0-4][0-9]\|55[0-5]` | _30μs_ | +| `toRegexRange(1, 5555)` | `[1-9]\|[1-9][0-9]{1,2}\|[1-4][0-9]{3}\|5[0-4][0-9]{2}\|55[0-4][0-9]\|555[0-5]` | _43μs_ | +| `toRegexRange(111, 555)` | `11[1-9]\|1[2-9][0-9]\|[2-4][0-9]{2}\|5[0-4][0-9]\|55[0-5]` | _38μs_ | +| `toRegexRange(29, 51)` | `29\|[34][0-9]\|5[01]` | _24μs_ | +| `toRegexRange(31, 877)` | `3[1-9]\|[4-9][0-9]\|[1-7][0-9]{2}\|8[0-6][0-9]\|87[0-7]` | _32μs_ | +| `toRegexRange(5, 5)` | `5` | _8μs_ | +| `toRegexRange(5, 6)` | `5\|6` | _11μs_ | +| `toRegexRange(1, 2)` | `1\|2` | _6μs_ | +| 
`toRegexRange(1, 5)` | `[1-5]` | _15μs_ | +| `toRegexRange(1, 10)` | `[1-9]\|10` | _22μs_ | +| `toRegexRange(1, 100)` | `[1-9]\|[1-9][0-9]\|100` | _25μs_ | +| `toRegexRange(1, 1000)` | `[1-9]\|[1-9][0-9]{1,2}\|1000` | _31μs_ | +| `toRegexRange(1, 10000)` | `[1-9]\|[1-9][0-9]{1,3}\|10000` | _34μs_ | +| `toRegexRange(1, 100000)` | `[1-9]\|[1-9][0-9]{1,4}\|100000` | _36μs_ | +| `toRegexRange(1, 1000000)` | `[1-9]\|[1-9][0-9]{1,5}\|1000000` | _42μs_ | +| `toRegexRange(1, 10000000)` | `[1-9]\|[1-9][0-9]{1,6}\|10000000` | _42μs_ | + +## Heads up! + +**Order of arguments** + +When the `min` is larger than the `max`, values will be flipped to create a valid range: + +```js +toRegexRange('51', '29'); +``` + +Is effectively flipped to: + +```js +toRegexRange('29', '51'); +//=> 29|[3-4][0-9]|5[0-1] +``` + +**Steps / increments** + +This library does not support steps (increments). A pr to add support would be welcome. + +## History + +### v2.0.0 - 2017-04-21 + +**New features** + +Adds support for zero-padding! + +### v1.0.0 + +**Optimizations** + +Repeating ranges are now grouped using quantifiers. rocessing time is roughly the same, but the generated regex is much smaller, which should result in faster matching. + +## Attribution + +Inspired by the python library [range-regex](https://github.com/dimka665/range-regex). + +## About + +
+### Contributing + +Pull requests and stars are always welcome. For bugs and feature requests, [please create an issue](../../issues/new). + +### Running Tests + +Running and reviewing unit tests is a great way to get familiarized with a library and its API. You can install dependencies and run tests with the following command: + +```sh +$ npm install && npm test +``` + +### Building docs + +_(This project's readme.md is generated by [verb](https://github.com/verbose/verb-generate-readme), please don't edit the readme directly. Any changes to the readme must be made in the [.verb.md](.verb.md) readme template.)_ + +To generate the readme, run the following command: + +```sh +$ npm install -g verbose/verb#dev verb-generate-readme && verb +``` +
+ +### Related projects + +You might also be interested in these projects: + +* [expand-range](https://www.npmjs.com/package/expand-range): Fast, bash-like range expansion. Expand a range of numbers or letters, uppercase or lowercase. Used… [more](https://github.com/jonschlinkert/expand-range) | [homepage](https://github.com/jonschlinkert/expand-range "Fast, bash-like range expansion. Expand a range of numbers or letters, uppercase or lowercase. Used by micromatch.") +* [fill-range](https://www.npmjs.com/package/fill-range): Fill in a range of numbers or letters, optionally passing an increment or `step` to… [more](https://github.com/jonschlinkert/fill-range) | [homepage](https://github.com/jonschlinkert/fill-range "Fill in a range of numbers or letters, optionally passing an increment or `step` to use, or create a regex-compatible range with `options.toRegex`") +* [micromatch](https://www.npmjs.com/package/micromatch): Glob matching for javascript/node.js. A drop-in replacement and faster alternative to minimatch and multimatch. | [homepage](https://github.com/micromatch/micromatch "Glob matching for javascript/node.js. A drop-in replacement and faster alternative to minimatch and multimatch.") +* [repeat-element](https://www.npmjs.com/package/repeat-element): Create an array by repeating the given value n times. | [homepage](https://github.com/jonschlinkert/repeat-element "Create an array by repeating the given value n times.") +* [repeat-string](https://www.npmjs.com/package/repeat-string): Repeat the given string n times. Fastest implementation for repeating a string. | [homepage](https://github.com/jonschlinkert/repeat-string "Repeat the given string n times. Fastest implementation for repeating a string.") + +### Contributors + +| **Commits** | **Contributor** | +| --- | --- | +| 63 | [jonschlinkert](https://github.com/jonschlinkert) | +| 3 | [doowb](https://github.com/doowb) | +| 2 | [realityking](https://github.com/realityking) | + +### Author + +**Jon Schlinkert** + +* [GitHub Profile](https://github.com/jonschlinkert) +* [Twitter Profile](https://twitter.com/jonschlinkert) +* [LinkedIn Profile](https://linkedin.com/in/jonschlinkert) + +Please consider supporting me on Patreon, or [start your own Patreon page](https://patreon.com/invite/bxpbvm)! + + + + + +### License + +Copyright © 2019, [Jon Schlinkert](https://github.com/jonschlinkert). +Released under the [MIT License](LICENSE). + +*** + +_This file was generated by [verb-generate-readme](https://github.com/verbose/verb-generate-readme), v0.8.0, on April 07, 2019._ \ No newline at end of file diff --git a/node_modules/to-regex-range/index.js b/node_modules/to-regex-range/index.js new file mode 100644 index 000000000..77fbaced1 --- /dev/null +++ b/node_modules/to-regex-range/index.js @@ -0,0 +1,288 @@ +/*! + * to-regex-range + * + * Copyright (c) 2015-present, Jon Schlinkert. + * Released under the MIT License. 
+ */ + +'use strict'; + +const isNumber = require('is-number'); + +const toRegexRange = (min, max, options) => { + if (isNumber(min) === false) { + throw new TypeError('toRegexRange: expected the first argument to be a number'); + } + + if (max === void 0 || min === max) { + return String(min); + } + + if (isNumber(max) === false) { + throw new TypeError('toRegexRange: expected the second argument to be a number.'); + } + + let opts = { relaxZeros: true, ...options }; + if (typeof opts.strictZeros === 'boolean') { + opts.relaxZeros = opts.strictZeros === false; + } + + let relax = String(opts.relaxZeros); + let shorthand = String(opts.shorthand); + let capture = String(opts.capture); + let wrap = String(opts.wrap); + let cacheKey = min + ':' + max + '=' + relax + shorthand + capture + wrap; + + if (toRegexRange.cache.hasOwnProperty(cacheKey)) { + return toRegexRange.cache[cacheKey].result; + } + + let a = Math.min(min, max); + let b = Math.max(min, max); + + if (Math.abs(a - b) === 1) { + let result = min + '|' + max; + if (opts.capture) { + return `(${result})`; + } + if (opts.wrap === false) { + return result; + } + return `(?:${result})`; + } + + let isPadded = hasPadding(min) || hasPadding(max); + let state = { min, max, a, b }; + let positives = []; + let negatives = []; + + if (isPadded) { + state.isPadded = isPadded; + state.maxLen = String(state.max).length; + } + + if (a < 0) { + let newMin = b < 0 ? Math.abs(b) : 1; + negatives = splitToPatterns(newMin, Math.abs(a), state, opts); + a = state.a = 0; + } + + if (b >= 0) { + positives = splitToPatterns(a, b, state, opts); + } + + state.negatives = negatives; + state.positives = positives; + state.result = collatePatterns(negatives, positives, opts); + + if (opts.capture === true) { + state.result = `(${state.result})`; + } else if (opts.wrap !== false && (positives.length + negatives.length) > 1) { + state.result = `(?:${state.result})`; + } + + toRegexRange.cache[cacheKey] = state; + return state.result; +}; + +function collatePatterns(neg, pos, options) { + let onlyNegative = filterPatterns(neg, pos, '-', false, options) || []; + let onlyPositive = filterPatterns(pos, neg, '', false, options) || []; + let intersected = filterPatterns(neg, pos, '-?', true, options) || []; + let subpatterns = onlyNegative.concat(intersected).concat(onlyPositive); + return subpatterns.join('|'); +} + +function splitToRanges(min, max) { + let nines = 1; + let zeros = 1; + + let stop = countNines(min, nines); + let stops = new Set([max]); + + while (min <= stop && stop <= max) { + stops.add(stop); + nines += 1; + stop = countNines(min, nines); + } + + stop = countZeros(max + 1, zeros) - 1; + + while (min < stop && stop <= max) { + stops.add(stop); + zeros += 1; + stop = countZeros(max + 1, zeros) - 1; + } + + stops = [...stops]; + stops.sort(compare); + return stops; +} + +/** + * Convert a range to a regex pattern + * @param {Number} `start` + * @param {Number} `stop` + * @return {String} + */ + +function rangeToPattern(start, stop, options) { + if (start === stop) { + return { pattern: start, count: [], digits: 0 }; + } + + let zipped = zip(start, stop); + let digits = zipped.length; + let pattern = ''; + let count = 0; + + for (let i = 0; i < digits; i++) { + let [startDigit, stopDigit] = zipped[i]; + + if (startDigit === stopDigit) { + pattern += startDigit; + + } else if (startDigit !== '0' || stopDigit !== '9') { + pattern += toCharacterClass(startDigit, stopDigit, options); + + } else { + count++; + } + } + + if (count) { + pattern += 
options.shorthand === true ? '\\d' : '[0-9]'; + } + + return { pattern, count: [count], digits }; +} + +function splitToPatterns(min, max, tok, options) { + let ranges = splitToRanges(min, max); + let tokens = []; + let start = min; + let prev; + + for (let i = 0; i < ranges.length; i++) { + let max = ranges[i]; + let obj = rangeToPattern(String(start), String(max), options); + let zeros = ''; + + if (!tok.isPadded && prev && prev.pattern === obj.pattern) { + if (prev.count.length > 1) { + prev.count.pop(); + } + + prev.count.push(obj.count[0]); + prev.string = prev.pattern + toQuantifier(prev.count); + start = max + 1; + continue; + } + + if (tok.isPadded) { + zeros = padZeros(max, tok, options); + } + + obj.string = zeros + obj.pattern + toQuantifier(obj.count); + tokens.push(obj); + start = max + 1; + prev = obj; + } + + return tokens; +} + +function filterPatterns(arr, comparison, prefix, intersection, options) { + let result = []; + + for (let ele of arr) { + let { string } = ele; + + // only push if _both_ are negative... + if (!intersection && !contains(comparison, 'string', string)) { + result.push(prefix + string); + } + + // or _both_ are positive + if (intersection && contains(comparison, 'string', string)) { + result.push(prefix + string); + } + } + return result; +} + +/** + * Zip strings + */ + +function zip(a, b) { + let arr = []; + for (let i = 0; i < a.length; i++) arr.push([a[i], b[i]]); + return arr; +} + +function compare(a, b) { + return a > b ? 1 : b > a ? -1 : 0; +} + +function contains(arr, key, val) { + return arr.some(ele => ele[key] === val); +} + +function countNines(min, len) { + return Number(String(min).slice(0, -len) + '9'.repeat(len)); +} + +function countZeros(integer, zeros) { + return integer - (integer % Math.pow(10, zeros)); +} + +function toQuantifier(digits) { + let [start = 0, stop = ''] = digits; + if (stop || start > 1) { + return `{${start + (stop ? ',' + stop : '')}}`; + } + return ''; +} + +function toCharacterClass(a, b, options) { + return `[${a}${(b - a === 1) ? '' : '-'}${b}]`; +} + +function hasPadding(str) { + return /^-?(0+)\d/.test(str); +} + +function padZeros(value, tok, options) { + if (!tok.isPadded) { + return value; + } + + let diff = Math.abs(tok.maxLen - String(value).length); + let relax = options.relaxZeros !== false; + + switch (diff) { + case 0: + return ''; + case 1: + return relax ? '0?' : '0'; + case 2: + return relax ? '0{0,2}' : '00'; + default: { + return relax ? 
`0{0,${diff}}` : `0{${diff}}`; + } + } +} + +/** + * Cache + */ + +toRegexRange.cache = {}; +toRegexRange.clearCache = () => (toRegexRange.cache = {}); + +/** + * Expose `toRegexRange` + */ + +module.exports = toRegexRange; diff --git a/node_modules/to-regex-range/package.json b/node_modules/to-regex-range/package.json new file mode 100644 index 000000000..5d0ef5dce --- /dev/null +++ b/node_modules/to-regex-range/package.json @@ -0,0 +1,125 @@ +{ + "_from": "to-regex-range@^5.0.1", + "_id": "to-regex-range@5.0.1", + "_inBundle": false, + "_integrity": "sha512-65P7iz6X5yEr1cwcgvQxbbIw7Uk3gOy5dIdtZ4rDveLqhrdJP+Li/Hx6tyK0NEb+2GCyneCMJiGqrADCSNk8sQ==", + "_location": "/to-regex-range", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "to-regex-range@^5.0.1", + "name": "to-regex-range", + "escapedName": "to-regex-range", + "rawSpec": "^5.0.1", + "saveSpec": null, + "fetchSpec": "^5.0.1" + }, + "_requiredBy": [ + "/fill-range" + ], + "_resolved": "https://registry.npmjs.org/to-regex-range/-/to-regex-range-5.0.1.tgz", + "_shasum": "1648c44aae7c8d988a326018ed72f5b4dd0392e4", + "_spec": "to-regex-range@^5.0.1", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\fill-range", + "author": { + "name": "Jon Schlinkert", + "url": "https://github.com/jonschlinkert" + }, + "bugs": { + "url": "https://github.com/micromatch/to-regex-range/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Jon Schlinkert", + "url": "http://twitter.com/jonschlinkert" + }, + { + "name": "Rouven Weßling", + "url": "www.rouvenwessling.de" + } + ], + "dependencies": { + "is-number": "^7.0.0" + }, + "deprecated": false, + "description": "Pass two numbers, get a regex-compatible source string for matching ranges. 
Validated against more than 2.78 million test assertions.", + "devDependencies": { + "fill-range": "^6.0.0", + "gulp-format-md": "^2.0.0", + "mocha": "^6.0.2", + "text-table": "^0.2.0", + "time-diff": "^0.3.1" + }, + "engines": { + "node": ">=8.0" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/micromatch/to-regex-range", + "keywords": [ + "bash", + "date", + "expand", + "expansion", + "expression", + "glob", + "match", + "match date", + "match number", + "match numbers", + "match year", + "matches", + "matching", + "number", + "numbers", + "numerical", + "range", + "ranges", + "regex", + "regexp", + "regular", + "regular expression", + "sequence" + ], + "license": "MIT", + "main": "index.js", + "name": "to-regex-range", + "repository": { + "type": "git", + "url": "git+https://github.com/micromatch/to-regex-range.git" + }, + "scripts": { + "test": "mocha" + }, + "verb": { + "layout": "default", + "toc": false, + "tasks": [ + "readme" + ], + "plugins": [ + "gulp-format-md" + ], + "lint": { + "reflinks": true + }, + "helpers": { + "examples": { + "displayName": "examples" + } + }, + "related": { + "list": [ + "expand-range", + "fill-range", + "micromatch", + "repeat-element", + "repeat-string" + ] + } + }, + "version": "5.0.1" +} diff --git a/node_modules/toidentifier/LICENSE b/node_modules/toidentifier/LICENSE new file mode 100644 index 000000000..de22d1597 --- /dev/null +++ b/node_modules/toidentifier/LICENSE @@ -0,0 +1,21 @@ +MIT License + +Copyright (c) 2016 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/toidentifier/README.md b/node_modules/toidentifier/README.md new file mode 100644 index 000000000..7c8794e2a --- /dev/null +++ b/node_modules/toidentifier/README.md @@ -0,0 +1,61 @@ +# toidentifier + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][codecov-image]][codecov-url] + +> Convert a string of words to a JavaScript identifier + +## Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). 
Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```bash +$ npm install toidentifier +``` + +## Example + +```js +var toIdentifier = require('toidentifier') + +console.log(toIdentifier('Bad Request')) +// => "BadRequest" +``` + +## API + +This CommonJS module exports a single default function: `toIdentifier`. + +### toIdentifier(string) + +Given a string as the argument, it will be transformed according to +the following rules and the new string will be returned: + +1. Split into words separated by space characters (`0x20`). +2. Upper case the first character of each word. +3. Join the words together with no separator. +4. Remove all non-word (`[0-9a-z_]`) characters. + +## License + +[MIT](LICENSE) + +[codecov-image]: https://img.shields.io/codecov/c/github/component/toidentifier.svg +[codecov-url]: https://codecov.io/gh/component/toidentifier +[downloads-image]: https://img.shields.io/npm/dm/toidentifier.svg +[downloads-url]: https://npmjs.org/package/toidentifier +[npm-image]: https://img.shields.io/npm/v/toidentifier.svg +[npm-url]: https://npmjs.org/package/toidentifier +[travis-image]: https://img.shields.io/travis/component/toidentifier/master.svg +[travis-url]: https://travis-ci.org/component/toidentifier + + +## + +[npm]: https://www.npmjs.com/ + +[yarn]: https://yarnpkg.com/ diff --git a/node_modules/toidentifier/index.js b/node_modules/toidentifier/index.js new file mode 100644 index 000000000..bba541147 --- /dev/null +++ b/node_modules/toidentifier/index.js @@ -0,0 +1,30 @@ +/*! + * toidentifier + * Copyright(c) 2016 Douglas Christopher Wilson + * MIT Licensed + */ + +/** + * Module exports. + * @public + */ + +module.exports = toIdentifier + +/** + * Trasform the given string into a JavaScript identifier + * + * @param {string} str + * @returns {string} + * @public + */ + +function toIdentifier (str) { + return str + .split(' ') + .map(function (token) { + return token.slice(0, 1).toUpperCase() + token.slice(1) + }) + .join('') + .replace(/[^ _0-9a-z]/gi, '') +} diff --git a/node_modules/toidentifier/package.json b/node_modules/toidentifier/package.json new file mode 100644 index 000000000..28d0881d1 --- /dev/null +++ b/node_modules/toidentifier/package.json @@ -0,0 +1,76 @@ +{ + "_from": "toidentifier@1.0.0", + "_id": "toidentifier@1.0.0", + "_inBundle": false, + "_integrity": "sha512-yaOH/Pk/VEhBWWTlhI+qXxDFXlejDGcQipMlyxda9nthulaxLZUNcUqFxokp0vcYnvteJln5FNQDRrxj3YcbVw==", + "_location": "/toidentifier", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "toidentifier@1.0.0", + "name": "toidentifier", + "escapedName": "toidentifier", + "rawSpec": "1.0.0", + "saveSpec": null, + "fetchSpec": "1.0.0" + }, + "_requiredBy": [ + "/http-errors" + ], + "_resolved": "https://registry.npmjs.org/toidentifier/-/toidentifier-1.0.0.tgz", + "_shasum": "7e1be3470f1e77948bc43d94a3c8f4d7752ba553", + "_spec": "toidentifier@1.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\http-errors", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/component/toidentifier/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Nick Baugh", + "email": "niftylettuce@gmail.com", + "url": "http://niftylettuce.com/" + } + ], + "deprecated": 
false, + "description": "Convert a string of words to a JavaScript identifier", + "devDependencies": { + "eslint": "4.19.1", + "eslint-config-standard": "11.0.0", + "eslint-plugin-import": "2.11.0", + "eslint-plugin-markdown": "1.0.0-beta.6", + "eslint-plugin-node": "6.0.1", + "eslint-plugin-promise": "3.7.0", + "eslint-plugin-standard": "3.1.0", + "mocha": "1.21.5", + "nyc": "11.8.0" + }, + "engines": { + "node": ">=0.6" + }, + "files": [ + "index.js" + ], + "homepage": "https://github.com/component/toidentifier#readme", + "license": "MIT", + "name": "toidentifier", + "repository": { + "type": "git", + "url": "git+https://github.com/component/toidentifier.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-cov": "nyc --reporter=html --reporter=text npm test" + }, + "version": "1.0.0" +} diff --git a/node_modules/touch/LICENSE b/node_modules/touch/LICENSE new file mode 100644 index 000000000..05eeeb88c --- /dev/null +++ b/node_modules/touch/LICENSE @@ -0,0 +1,15 @@ +The ISC License + +Copyright (c) Isaac Z. Schlueter + +Permission to use, copy, modify, and/or distribute this software for any +purpose with or without fee is hereby granted, provided that the above +copyright notice and this permission notice appear in all copies. + +THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES +WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF +MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR +ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES +WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN +ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF OR +IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE. diff --git a/node_modules/touch/README.md b/node_modules/touch/README.md new file mode 100644 index 000000000..b5a361e6b --- /dev/null +++ b/node_modules/touch/README.md @@ -0,0 +1,52 @@ +# node-touch + +For all your node touching needs. + +## Installing + +```bash +npm install touch +``` + +## CLI Usage: + +See `man touch` + +This package exports a binary called `nodetouch` that works mostly +like the unix builtin `touch(1)`. + +## API Usage: + +```javascript +var touch = require("touch") +``` + +Gives you the following functions: + +* `touch(filename, options, cb)` +* `touch.sync(filename, options)` +* `touch.ftouch(fd, options, cb)` +* `touch.ftouchSync(fd, options)` + +All the `options` objects are optional. + +All the async functions return a Promise. If a callback function is +provided, then it's attached to the Promise. + +## Options + +* `force` like `touch -f` Boolean +* `time` like `touch -t ` Can be a Date object, or any parseable + Date string, or epoch ms number. +* `atime` like `touch -a` Can be either a Boolean, or a Date. +* `mtime` like `touch -m` Can be either a Boolean, or a Date. +* `ref` like `touch -r ` Must be path to a file. +* `nocreate` like `touch -c` Boolean + +If neither `atime` nor `mtime` are set, then both values are set. If +one of them is set, then the other is not. 
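A small, editor-added sketch of the touch API and options documented above; the async form returns a Promise and can therefore be awaited, and the file names and timestamp below are illustrative.

```js
// Sketch: create/update timestamps using the documented options.
const touch = require('touch');

(async () => {
  await touch('build.log');                          // create, or update both atime and mtime
  await touch('build.log', { mtime: true });         // only mtime, like `touch -m`
  await touch('cache.json', { nocreate: true });     // like `touch -c`: no-op if the file is missing
  touch.sync('pidfile', { time: '2021-10-11T00:00:00Z' }); // explicit timestamp, like `touch -t`
})();
```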
+ +## cli + +This package creates a `nodetouch` command line executable that works +very much like the unix builtin `touch(1)` diff --git a/node_modules/touch/bin/nodetouch.js b/node_modules/touch/bin/nodetouch.js new file mode 100644 index 000000000..f78f0829d --- /dev/null +++ b/node_modules/touch/bin/nodetouch.js @@ -0,0 +1,112 @@ +#!/usr/bin/env node +const touch = require("../index.js") + +const usage = code => { + console[code ? 'error' : 'log']( + 'usage:\n' + + 'touch [-acfm] [-r file] [-t [[CC]YY]MMDDhhmm[.SS]] file ...' + ) + process.exit(code) +} + +const singleFlags = { + a: 'atime', + m: 'mtime', + c: 'nocreate', + f: 'force' +} + +const singleOpts = { + r: 'ref', + t: 'time' +} + +const files = [] +const args = process.argv.slice(2) +const options = {} +for (let i = 0; i < args.length; i++) { + const arg = args[i] + if (!arg.match(/^-/)) { + files.push(arg) + continue + } + + // expand shorthands + if (arg.charAt(1) !== '-') { + const expand = [] + for (let f = 1; f < arg.length; f++) { + const fc = arg.charAt(f) + const sf = singleFlags[fc] + const so = singleOpts[fc] + if (sf) + expand.push('--' + sf) + else if (so) { + const soslice = arg.slice(f + 1) + const soval = soslice.charAt(0) === '=' ? soslice : '=' + soslice + expand.push('--' + so + soval) + f = arg.length + } else if (arg !== '-' + fc) + expand.push('-' + fc) + } + if (expand.length) { + args.splice.apply(args, [i, 1].concat(expand)) + i-- + continue + } + } + + const argsplit = arg.split('=') + const key = argsplit.shift().replace(/^\-\-/, '') + const val = argsplit.length ? argsplit.join('=') : null + + switch (key) { + case 'time': + const timestr = val || args[++i] + // [-t [[CC]YY]MMDDhhmm[.SS]] + const parsedtime = timestr.match( + /^(([0-9]{2})?([0-9]{2}))?([0-9]{2})([0-9]{2})([0-9]{2})([0-9]{2})(\.([0-9]{2}))?$/ + ) + if (!parsedtime) { + console.error('touch: out of range or illegal ' + + 'time specification: ' + + '[[CC]YY]MMDDhhmm[.SS]') + process.exit(1) + } else { + const y = +parsedtime[1] + const year = parsedtime[2] ? y + : y <= 68 ? 2000 + y + : 1900 + y + + const MM = +parsedtime[4] - 1 + const dd = +parsedtime[5] + const hh = +parsedtime[6] + const mm = +parsedtime[7] + const ss = +parsedtime[8] + + options.time = new Date(Date.UTC(year, MM, dd, hh, mm, ss)) + } + continue + + case 'ref': + options.ref = val || args[++i] + continue + + case 'mtime': + case 'nocreate': + case 'atime': + case 'force': + options[key] = true + continue + + default: + console.error('touch: illegal option -- ' + arg) + usage(1) + } +} + +if (!files.length) + usage() + +process.exitCode = 0 +Promise.all(files.map(f => touch(f, options))) + .catch(er => process.exitCode = 1) diff --git a/node_modules/touch/index.js b/node_modules/touch/index.js new file mode 100644 index 000000000..f942e42ad --- /dev/null +++ b/node_modules/touch/index.js @@ -0,0 +1,224 @@ +'use strict' + +const EE = require('events').EventEmitter +const cons = require('constants') +const fs = require('fs') + +module.exports = (f, options, cb) => { + if (typeof options === 'function') + cb = options, options = {} + + const p = new Promise((res, rej) => { + new Touch(validOpts(options, f, null)) + .on('done', res).on('error', rej) + }) + + return cb ? 
p.then(res => cb(null, res), cb) : p +} + +module.exports.sync = module.exports.touchSync = (f, options) => + (new TouchSync(validOpts(options, f, null)), undefined) + +module.exports.ftouch = (fd, options, cb) => { + if (typeof options === 'function') + cb = options, options = {} + + const p = new Promise((res, rej) => { + new Touch(validOpts(options, null, fd)) + .on('done', res).on('error', rej) + }) + + return cb ? p.then(res => cb(null, res), cb) : p +} + +module.exports.ftouchSync = (fd, opt) => + (new TouchSync(validOpts(opt, null, fd)), undefined) + +const validOpts = (options, path, fd) => { + options = Object.create(options || {}) + options.fd = fd + options.path = path + + // {mtime: true}, {ctime: true} + // If set to something else, then treat as epoch ms value + const now = parseInt(new Date(options.time || Date.now()).getTime() / 1000) + if (!options.atime && !options.mtime) + options.atime = options.mtime = now + else { + if (true === options.atime) + options.atime = now + + if (true === options.mtime) + options.mtime = now + } + + let oflags = 0 + if (!options.force) + oflags = oflags | cons.O_RDWR + + if (!options.nocreate) + oflags = oflags | cons.O_CREAT + + options.oflags = oflags + return options +} + +class Touch extends EE { + constructor (options) { + super(options) + this.fd = options.fd + this.path = options.path + this.atime = options.atime + this.mtime = options.mtime + this.ref = options.ref + this.nocreate = !!options.nocreate + this.force = !!options.force + this.closeAfter = options.closeAfter + this.oflags = options.oflags + this.options = options + + if (typeof this.fd !== 'number') { + this.closeAfter = true + this.open() + } else + this.onopen(null, this.fd) + } + + emit (ev, data) { + // we only emit when either done or erroring + // in both cases, need to close + this.close() + return super.emit(ev, data) + } + + close () { + if (typeof this.fd === 'number' && this.closeAfter) + fs.close(this.fd, () => {}) + } + + open () { + fs.open(this.path, this.oflags, (er, fd) => this.onopen(er, fd)) + } + + onopen (er, fd) { + if (er) { + if (er.code === 'EISDIR') + this.onopen(null, null) + else if (er.code === 'ENOENT' && this.nocreate) + this.emit('done') + else + this.emit('error', er) + } else { + this.fd = fd + if (this.ref) + this.statref() + else if (!this.atime || !this.mtime) + this.fstat() + else + this.futimes() + } + } + + statref () { + fs.stat(this.ref, (er, st) => { + if (er) + this.emit('error', er) + else + this.onstatref(st) + }) + } + + onstatref (st) { + this.atime = this.atime && parseInt(st.atime.getTime()/1000, 10) + this.mtime = this.mtime && parseInt(st.mtime.getTime()/1000, 10) + if (!this.atime || !this.mtime) + this.fstat() + else + this.futimes() + } + + fstat () { + const stat = this.fd ? 'fstat' : 'stat' + const target = this.fd || this.path + fs[stat](target, (er, st) => { + if (er) + this.emit('error', er) + else + this.onfstat(st) + }) + } + + onfstat (st) { + if (typeof this.atime !== 'number') + this.atime = parseInt(st.atime.getTime()/1000, 10) + + if (typeof this.mtime !== 'number') + this.mtime = parseInt(st.mtime.getTime()/1000, 10) + + this.futimes() + } + + futimes () { + const utimes = this.fd ? 
'futimes' : 'utimes' + const target = this.fd || this.path + fs[utimes](target, ''+this.atime, ''+this.mtime, er => { + if (er) + this.emit('error', er) + else + this.emit('done') + }) + } +} + +class TouchSync extends Touch { + open () { + try { + this.onopen(null, fs.openSync(this.path, this.oflags)) + } catch (er) { + this.onopen(er) + } + } + + statref () { + let threw = true + try { + this.onstatref(fs.statSync(this.ref)) + threw = false + } finally { + if (threw) + this.close() + } + } + + fstat () { + let threw = true + const stat = this.fd ? 'fstatSync' : 'statSync' + const target = this.fd || this.path + try { + this.onfstat(fs[stat](target)) + threw = false + } finally { + if (threw) + this.close() + } + } + + futimes () { + let threw = true + const utimes = this.fd ? 'futimesSync' : 'utimesSync' + const target = this.fd || this.path + try { + fs[utimes](target, this.atime, this.mtime) + threw = false + } finally { + if (threw) + this.close() + } + this.emit('done') + } + + close () { + if (typeof this.fd === 'number' && this.closeAfter) + try { fs.closeSync(this.fd) } catch (er) {} + } +} diff --git a/node_modules/touch/package.json b/node_modules/touch/package.json new file mode 100644 index 000000000..d73b38812 --- /dev/null +++ b/node_modules/touch/package.json @@ -0,0 +1,64 @@ +{ + "_from": "touch@^3.1.0", + "_id": "touch@3.1.0", + "_inBundle": false, + "_integrity": "sha512-WBx8Uy5TLtOSRtIq+M03/sKDrXCLHxwDcquSP2c43Le03/9serjQBIztjRz6FkJez9D/hleyAXTBGLwwZUw9lA==", + "_location": "/touch", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "touch@^3.1.0", + "name": "touch", + "escapedName": "touch", + "rawSpec": "^3.1.0", + "saveSpec": null, + "fetchSpec": "^3.1.0" + }, + "_requiredBy": [ + "/nodemon" + ], + "_resolved": "https://registry.npmjs.org/touch/-/touch-3.1.0.tgz", + "_shasum": "fe365f5f75ec9ed4e56825e0bb76d24ab74af83b", + "_spec": "touch@^3.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon", + "author": { + "name": "Isaac Z. 
Schlueter", + "email": "i@izs.me", + "url": "http://blog.izs.me/" + }, + "bin": { + "nodetouch": "bin/nodetouch.js" + }, + "bugs": { + "url": "https://github.com/isaacs/node-touch/issues" + }, + "bundleDependencies": false, + "dependencies": { + "nopt": "~1.0.10" + }, + "deprecated": false, + "description": "like touch(1) in node", + "devDependencies": { + "mutate-fs": "^1.1.0", + "tap": "^10.7.0" + }, + "files": [ + "index.js", + "bin/nodetouch.js" + ], + "homepage": "https://github.com/isaacs/node-touch#readme", + "license": "ISC", + "name": "touch", + "repository": { + "type": "git", + "url": "git://github.com/isaacs/node-touch.git" + }, + "scripts": { + "postpublish": "git push origin --all; git push origin --tags", + "postversion": "npm publish", + "preversion": "npm test", + "test": "tap test/*.js --100 -J" + }, + "version": "3.1.0" +} diff --git a/node_modules/tr46/LICENSE.md b/node_modules/tr46/LICENSE.md new file mode 100644 index 000000000..e90f20a5c --- /dev/null +++ b/node_modules/tr46/LICENSE.md @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 Sebastian Mayr + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/tr46/README.md b/node_modules/tr46/README.md new file mode 100644 index 000000000..4cb702c64 --- /dev/null +++ b/node_modules/tr46/README.md @@ -0,0 +1,72 @@ +# tr46 + +An JavaScript implementation of [Unicode Technical Standard #46: Unicode IDNA Compatibility Processing](https://unicode.org/reports/tr46/). + + +## Installation + +[Node.js](http://nodejs.org) ≥ 8 is required. To install, type this at the command line: +```shell +npm install tr46 +# or +yarn add tr46 +``` + + +## API + +### `toASCII(domainName[, options])` + +Converts a string of Unicode symbols to a case-folded Punycode string of ASCII symbols. + +Available options: +* [`checkBidi`](#checkBidi) +* [`checkHyphens`](#checkHyphens) +* [`checkJoiners`](#checkJoiners) +* [`processingOption`](#processingOption) +* [`useSTD3ASCIIRules`](#useSTD3ASCIIRules) +* [`verifyDNSLength`](#verifyDNSLength) + +### `toUnicode(domainName[, options])` + +Converts a case-folded Punycode string of ASCII symbols to a string of Unicode symbols. + +Available options: +* [`checkBidi`](#checkBidi) +* [`checkHyphens`](#checkHyphens) +* [`checkJoiners`](#checkJoiners) +* [`processingOption`](#processingOption) +* [`useSTD3ASCIIRules`](#useSTD3ASCIIRules) + + +## Options + +### `checkBidi` +Type: `Boolean` +Default value: `false` +When set to `true`, any bi-directional text within the input will be checked for validation. 
+ +### `checkHyphens` +Type: `Boolean` +Default value: `false` +When set to `true`, the positions of any hyphen characters within the input will be checked for validation. + +### `checkJoiners` +Type: `Boolean` +Default value: `false` +When set to `true`, any word joiner characters within the input will be checked for validation. + +### `processingOption` +Type: `String` +Default value: `"nontransitional"` +When set to `"transitional"`, symbols within the input will be validated according to the older IDNA2003 protocol. When set to `"nontransitional"`, the current IDNA2008 protocol will be used. + +### `useSTD3ASCIIRules` +Type: `Boolean` +Default value: `false` +When set to `true`, input will be validated according to [STD3 Rules](http://unicode.org/reports/tr46/#STD3_Rules). + +### `verifyDNSLength` +Type: `Boolean` +Default value: `false` +When set to `true`, the length of each DNS label within the input will be checked for validation. diff --git a/node_modules/tr46/index.js b/node_modules/tr46/index.js new file mode 100644 index 000000000..9ba36d82e --- /dev/null +++ b/node_modules/tr46/index.js @@ -0,0 +1,297 @@ +"use strict"; + +const punycode = require("punycode"); +const regexes = require("./lib/regexes.js"); +const mappingTable = require("./lib/mappingTable.json"); +const { STATUS_MAPPING } = require("./lib/statusMapping.js"); + +function containsNonASCII(str) { + return /[^\x00-\x7F]/.test(str); +} + +function findStatus(val, { useSTD3ASCIIRules }) { + let start = 0; + let end = mappingTable.length - 1; + + while (start <= end) { + const mid = Math.floor((start + end) / 2); + + const target = mappingTable[mid]; + const min = Array.isArray(target[0]) ? target[0][0] : target[0]; + const max = Array.isArray(target[0]) ? target[0][1] : target[0]; + + if (min <= val && max >= val) { + if (useSTD3ASCIIRules && + (target[1] === STATUS_MAPPING.disallowed_STD3_valid || target[1] === STATUS_MAPPING.disallowed_STD3_mapped)) { + return [STATUS_MAPPING.disallowed, ...target.slice(2)]; + } else if (target[1] === STATUS_MAPPING.disallowed_STD3_valid) { + return [STATUS_MAPPING.valid, ...target.slice(2)]; + } else if (target[1] === STATUS_MAPPING.disallowed_STD3_mapped) { + return [STATUS_MAPPING.mapped, ...target.slice(2)]; + } + + return target.slice(1); + } else if (min > val) { + end = mid - 1; + } else { + start = mid + 1; + } + } + + return null; +} + +function mapChars(domainName, { useSTD3ASCIIRules, processingOption }) { + let hasError = false; + let processed = ""; + + for (const ch of domainName) { + const [status, mapping] = findStatus(ch.codePointAt(0), { useSTD3ASCIIRules }); + + switch (status) { + case STATUS_MAPPING.disallowed: + hasError = true; + processed += ch; + break; + case STATUS_MAPPING.ignored: + break; + case STATUS_MAPPING.mapped: + processed += mapping; + break; + case STATUS_MAPPING.deviation: + if (processingOption === "transitional") { + processed += mapping; + } else { + processed += ch; + } + break; + case STATUS_MAPPING.valid: + processed += ch; + break; + } + } + + return { + string: processed, + error: hasError + }; +} + +function validateLabel(label, { checkHyphens, checkBidi, checkJoiners, processingOption, useSTD3ASCIIRules }) { + if (label.normalize("NFC") !== label) { + return false; + } + + const codePoints = Array.from(label); + + if (checkHyphens) { + if ((codePoints[2] === "-" && codePoints[3] === "-") || + (label.startsWith("-") || label.endsWith("-"))) { + return false; + } + } + + if (label.includes(".") || + (codePoints.length > 0 && 
regexes.combiningMarks.test(codePoints[0]))) { + return false; + } + + for (const ch of codePoints) { + const [status] = findStatus(ch.codePointAt(0), { useSTD3ASCIIRules }); + if ((processingOption === "transitional" && status !== STATUS_MAPPING.valid) || + (processingOption === "nontransitional" && + status !== STATUS_MAPPING.valid && status !== STATUS_MAPPING.deviation)) { + return false; + } + } + + // https://tools.ietf.org/html/rfc5892#appendix-A + if (checkJoiners) { + let last = 0; + for (const [i, ch] of codePoints.entries()) { + if (ch === "\u200C" || ch === "\u200D") { + if (i > 0) { + if (regexes.combiningClassVirama.test(codePoints[i - 1])) { + continue; + } + if (ch === "\u200C") { + // TODO: make this more efficient + const next = codePoints.indexOf("\u200C", i + 1); + const test = next < 0 ? codePoints.slice(last) : codePoints.slice(last, next); + if (regexes.validZWNJ.test(test.join(""))) { + last = i + 1; + continue; + } + } + } + return false; + } + } + } + + // https://tools.ietf.org/html/rfc5893#section-2 + if (checkBidi) { + let rtl; + + // 1 + if (regexes.bidiS1LTR.test(codePoints[0])) { + rtl = false; + } else if (regexes.bidiS1RTL.test(codePoints[0])) { + rtl = true; + } else { + return false; + } + + if (rtl) { + // 2-4 + if (!regexes.bidiS2.test(label) || + !regexes.bidiS3.test(label) || + (regexes.bidiS4EN.test(label) && regexes.bidiS4AN.test(label))) { + return false; + } + } else if (!regexes.bidiS5.test(label) || + !regexes.bidiS6.test(label)) { // 5-6 + return false; + } + } + + return true; +} + +function isBidiDomain(labels) { + const domain = labels.map(label => { + if (label.startsWith("xn--")) { + try { + return punycode.decode(label.substring(4)); + } catch (err) { + return ""; + } + } + return label; + }).join("."); + return regexes.bidiDomain.test(domain); +} + +function processing(domainName, options) { + const { processingOption } = options; + + // 1. Map. + let { string, error } = mapChars(domainName, options); + + // 2. Normalize. + string = string.normalize("NFC"); + + // 3. Break. + const labels = string.split("."); + const isBidi = isBidiDomain(labels); + + // 4. Convert/Validate. + for (const [i, origLabel] of labels.entries()) { + let label = origLabel; + let curProcessing = processingOption; + if (label.startsWith("xn--")) { + try { + label = punycode.decode(label.substring(4)); + labels[i] = label; + } catch (err) { + error = true; + continue; + } + curProcessing = "nontransitional"; + } + + // No need to validate if we already know there is an error. 
+ if (error) { + continue; + } + const validation = validateLabel(label, Object.assign({}, options, { + processingOption: curProcessing, + checkBidi: options.checkBidi && isBidi + })); + if (!validation) { + error = true; + } + } + + return { + string: labels.join("."), + error + }; +} + +function toASCII(domainName, { + checkHyphens = false, + checkBidi = false, + checkJoiners = false, + useSTD3ASCIIRules = false, + processingOption = "nontransitional", + verifyDNSLength = false +} = {}) { + if (processingOption !== "transitional" && processingOption !== "nontransitional") { + throw new RangeError("processingOption must be either transitional or nontransitional"); + } + + const result = processing(domainName, { + processingOption, + checkHyphens, + checkBidi, + checkJoiners, + useSTD3ASCIIRules + }); + let labels = result.string.split("."); + labels = labels.map(l => { + if (containsNonASCII(l)) { + try { + return "xn--" + punycode.encode(l); + } catch (e) { + result.error = true; + } + } + return l; + }); + + if (verifyDNSLength) { + const total = labels.join(".").length; + if (total > 253 || total === 0) { + result.error = true; + } + + for (let i = 0; i < labels.length; ++i) { + if (labels[i].length > 63 || labels[i].length === 0) { + result.error = true; + break; + } + } + } + + if (result.error) { + return null; + } + return labels.join("."); +} + +function toUnicode(domainName, { + checkHyphens = false, + checkBidi = false, + checkJoiners = false, + useSTD3ASCIIRules = false, + processingOption = "nontransitional" +} = {}) { + const result = processing(domainName, { + processingOption, + checkHyphens, + checkBidi, + checkJoiners, + useSTD3ASCIIRules + }); + + return { + domain: result.string, + error: result.error + }; +} + +module.exports = { + toASCII, + toUnicode +}; diff --git a/node_modules/tr46/lib/mappingTable.json b/node_modules/tr46/lib/mappingTable.json new file mode 100644 index 000000000..8ca808156 --- /dev/null +++ b/node_modules/tr46/lib/mappingTable.json @@ -0,0 +1 @@ +[[[0,44],4],[[45,46],2],[47,4],[[48,57],2],[[58,64],4],[65,1,"a"],[66,1,"b"],[67,1,"c"],[68,1,"d"],[69,1,"e"],[70,1,"f"],[71,1,"g"],[72,1,"h"],[73,1,"i"],[74,1,"j"],[75,1,"k"],[76,1,"l"],[77,1,"m"],[78,1,"n"],[79,1,"o"],[80,1,"p"],[81,1,"q"],[82,1,"r"],[83,1,"s"],[84,1,"t"],[85,1,"u"],[86,1,"v"],[87,1,"w"],[88,1,"x"],[89,1,"y"],[90,1,"z"],[[91,96],4],[[97,122],2],[[123,127],4],[[128,159],3],[160,5," "],[[161,167],2],[168,5," ̈"],[169,2],[170,1,"a"],[[171,172],2],[173,7],[174,2],[175,5," ̄"],[[176,177],2],[178,1,"2"],[179,1,"3"],[180,5," ́"],[181,1,"μ"],[182,2],[183,2],[184,5," 
̧"],[185,1,"1"],[186,1,"o"],[187,2],[188,1,"1⁄4"],[189,1,"1⁄2"],[190,1,"3⁄4"],[191,2],[192,1,"à"],[193,1,"á"],[194,1,"â"],[195,1,"ã"],[196,1,"ä"],[197,1,"å"],[198,1,"æ"],[199,1,"ç"],[200,1,"è"],[201,1,"é"],[202,1,"ê"],[203,1,"ë"],[204,1,"ì"],[205,1,"í"],[206,1,"î"],[207,1,"ï"],[208,1,"ð"],[209,1,"ñ"],[210,1,"ò"],[211,1,"ó"],[212,1,"ô"],[213,1,"õ"],[214,1,"ö"],[215,2],[216,1,"ø"],[217,1,"ù"],[218,1,"ú"],[219,1,"û"],[220,1,"ü"],[221,1,"ý"],[222,1,"þ"],[223,6,"ss"],[[224,246],2],[247,2],[[248,255],2],[256,1,"ā"],[257,2],[258,1,"ă"],[259,2],[260,1,"ą"],[261,2],[262,1,"ć"],[263,2],[264,1,"ĉ"],[265,2],[266,1,"ċ"],[267,2],[268,1,"č"],[269,2],[270,1,"ď"],[271,2],[272,1,"đ"],[273,2],[274,1,"ē"],[275,2],[276,1,"ĕ"],[277,2],[278,1,"ė"],[279,2],[280,1,"ę"],[281,2],[282,1,"ě"],[283,2],[284,1,"ĝ"],[285,2],[286,1,"ğ"],[287,2],[288,1,"ġ"],[289,2],[290,1,"ģ"],[291,2],[292,1,"ĥ"],[293,2],[294,1,"ħ"],[295,2],[296,1,"ĩ"],[297,2],[298,1,"ī"],[299,2],[300,1,"ĭ"],[301,2],[302,1,"į"],[303,2],[304,1,"i̇"],[305,2],[[306,307],1,"ij"],[308,1,"ĵ"],[309,2],[310,1,"ķ"],[[311,312],2],[313,1,"ĺ"],[314,2],[315,1,"ļ"],[316,2],[317,1,"ľ"],[318,2],[[319,320],1,"l·"],[321,1,"ł"],[322,2],[323,1,"ń"],[324,2],[325,1,"ņ"],[326,2],[327,1,"ň"],[328,2],[329,1,"ʼn"],[330,1,"ŋ"],[331,2],[332,1,"ō"],[333,2],[334,1,"ŏ"],[335,2],[336,1,"ő"],[337,2],[338,1,"œ"],[339,2],[340,1,"ŕ"],[341,2],[342,1,"ŗ"],[343,2],[344,1,"ř"],[345,2],[346,1,"ś"],[347,2],[348,1,"ŝ"],[349,2],[350,1,"ş"],[351,2],[352,1,"š"],[353,2],[354,1,"ţ"],[355,2],[356,1,"ť"],[357,2],[358,1,"ŧ"],[359,2],[360,1,"ũ"],[361,2],[362,1,"ū"],[363,2],[364,1,"ŭ"],[365,2],[366,1,"ů"],[367,2],[368,1,"ű"],[369,2],[370,1,"ų"],[371,2],[372,1,"ŵ"],[373,2],[374,1,"ŷ"],[375,2],[376,1,"ÿ"],[377,1,"ź"],[378,2],[379,1,"ż"],[380,2],[381,1,"ž"],[382,2],[383,1,"s"],[384,2],[385,1,"ɓ"],[386,1,"ƃ"],[387,2],[388,1,"ƅ"],[389,2],[390,1,"ɔ"],[391,1,"ƈ"],[392,2],[393,1,"ɖ"],[394,1,"ɗ"],[395,1,"ƌ"],[[396,397],2],[398,1,"ǝ"],[399,1,"ə"],[400,1,"ɛ"],[401,1,"ƒ"],[402,2],[403,1,"ɠ"],[404,1,"ɣ"],[405,2],[406,1,"ɩ"],[407,1,"ɨ"],[408,1,"ƙ"],[[409,411],2],[412,1,"ɯ"],[413,1,"ɲ"],[414,2],[415,1,"ɵ"],[416,1,"ơ"],[417,2],[418,1,"ƣ"],[419,2],[420,1,"ƥ"],[421,2],[422,1,"ʀ"],[423,1,"ƨ"],[424,2],[425,1,"ʃ"],[[426,427],2],[428,1,"ƭ"],[429,2],[430,1,"ʈ"],[431,1,"ư"],[432,2],[433,1,"ʊ"],[434,1,"ʋ"],[435,1,"ƴ"],[436,2],[437,1,"ƶ"],[438,2],[439,1,"ʒ"],[440,1,"ƹ"],[[441,443],2],[444,1,"ƽ"],[[445,451],2],[[452,454],1,"dž"],[[455,457],1,"lj"],[[458,460],1,"nj"],[461,1,"ǎ"],[462,2],[463,1,"ǐ"],[464,2],[465,1,"ǒ"],[466,2],[467,1,"ǔ"],[468,2],[469,1,"ǖ"],[470,2],[471,1,"ǘ"],[472,2],[473,1,"ǚ"],[474,2],[475,1,"ǜ"],[[476,477],2],[478,1,"ǟ"],[479,2],[480,1,"ǡ"],[481,2],[482,1,"ǣ"],[483,2],[484,1,"ǥ"],[485,2],[486,1,"ǧ"],[487,2],[488,1,"ǩ"],[489,2],[490,1,"ǫ"],[491,2],[492,1,"ǭ"],[493,2],[494,1,"ǯ"],[[495,496],2],[[497,499],1,"dz"],[500,1,"ǵ"],[501,2],[502,1,"ƕ"],[503,1,"ƿ"],[504,1,"ǹ"],[505,2],[506,1,"ǻ"],[507,2],[508,1,"ǽ"],[509,2],[510,1,"ǿ"],[511,2],[512,1,"ȁ"],[513,2],[514,1,"ȃ"],[515,2],[516,1,"ȅ"],[517,2],[518,1,"ȇ"],[519,2],[520,1,"ȉ"],[521,2],[522,1,"ȋ"],[523,2],[524,1,"ȍ"],[525,2],[526,1,"ȏ"],[527,2],[528,1,"ȑ"],[529,2],[530,1,"ȓ"],[531,2],[532,1,"ȕ"],[533,2],[534,1,"ȗ"],[535,2],[536,1,"ș"],[537,2],[538,1,"ț"],[539,2],[540,1,"ȝ"],[541,2],[542,1,"ȟ"],[543,2],[544,1,"ƞ"],[545,2],[546,1,"ȣ"],[547,2],[548,1,"ȥ"],[549,2],[550,1,"ȧ"],[551,2],[552,1,"ȩ"],[553,2],[554,1,"ȫ"],[555,2],[556,1,"ȭ"],[557,2],[558,1,"ȯ"],[559,2],[560,1,"ȱ"],[561,2],[562,1,"ȳ"],[563,2],[[564,566],2],[[567,569],2],[570,1,"ⱥ"],[571,1,"ȼ"],[572,2],[573,1,"ƚ"],[574
,1,"ⱦ"],[[575,576],2],[577,1,"ɂ"],[578,2],[579,1,"ƀ"],[580,1,"ʉ"],[581,1,"ʌ"],[582,1,"ɇ"],[583,2],[584,1,"ɉ"],[585,2],[586,1,"ɋ"],[587,2],[588,1,"ɍ"],[589,2],[590,1,"ɏ"],[591,2],[[592,680],2],[[681,685],2],[[686,687],2],[688,1,"h"],[689,1,"ɦ"],[690,1,"j"],[691,1,"r"],[692,1,"ɹ"],[693,1,"ɻ"],[694,1,"ʁ"],[695,1,"w"],[696,1,"y"],[[697,705],2],[[706,709],2],[[710,721],2],[[722,727],2],[728,5," ̆"],[729,5," ̇"],[730,5," ̊"],[731,5," ̨"],[732,5," ̃"],[733,5," ̋"],[734,2],[735,2],[736,1,"ɣ"],[737,1,"l"],[738,1,"s"],[739,1,"x"],[740,1,"ʕ"],[[741,745],2],[[746,747],2],[748,2],[749,2],[750,2],[[751,767],2],[[768,831],2],[832,1,"̀"],[833,1,"́"],[834,2],[835,1,"̓"],[836,1,"̈́"],[837,1,"ι"],[[838,846],2],[847,7],[[848,855],2],[[856,860],2],[[861,863],2],[[864,865],2],[866,2],[[867,879],2],[880,1,"ͱ"],[881,2],[882,1,"ͳ"],[883,2],[884,1,"ʹ"],[885,2],[886,1,"ͷ"],[887,2],[[888,889],3],[890,5," ι"],[[891,893],2],[894,5,";"],[895,1,"ϳ"],[[896,899],3],[900,5," ́"],[901,5," ̈́"],[902,1,"ά"],[903,1,"·"],[904,1,"έ"],[905,1,"ή"],[906,1,"ί"],[907,3],[908,1,"ό"],[909,3],[910,1,"ύ"],[911,1,"ώ"],[912,2],[913,1,"α"],[914,1,"β"],[915,1,"γ"],[916,1,"δ"],[917,1,"ε"],[918,1,"ζ"],[919,1,"η"],[920,1,"θ"],[921,1,"ι"],[922,1,"κ"],[923,1,"λ"],[924,1,"μ"],[925,1,"ν"],[926,1,"ξ"],[927,1,"ο"],[928,1,"π"],[929,1,"ρ"],[930,3],[931,1,"σ"],[932,1,"τ"],[933,1,"υ"],[934,1,"φ"],[935,1,"χ"],[936,1,"ψ"],[937,1,"ω"],[938,1,"ϊ"],[939,1,"ϋ"],[[940,961],2],[962,6,"σ"],[[963,974],2],[975,1,"ϗ"],[976,1,"β"],[977,1,"θ"],[978,1,"υ"],[979,1,"ύ"],[980,1,"ϋ"],[981,1,"φ"],[982,1,"π"],[983,2],[984,1,"ϙ"],[985,2],[986,1,"ϛ"],[987,2],[988,1,"ϝ"],[989,2],[990,1,"ϟ"],[991,2],[992,1,"ϡ"],[993,2],[994,1,"ϣ"],[995,2],[996,1,"ϥ"],[997,2],[998,1,"ϧ"],[999,2],[1000,1,"ϩ"],[1001,2],[1002,1,"ϫ"],[1003,2],[1004,1,"ϭ"],[1005,2],[1006,1,"ϯ"],[1007,2],[1008,1,"κ"],[1009,1,"ρ"],[1010,1,"σ"],[1011,2],[1012,1,"θ"],[1013,1,"ε"],[1014,2],[1015,1,"ϸ"],[1016,2],[1017,1,"σ"],[1018,1,"ϻ"],[1019,2],[1020,2],[1021,1,"ͻ"],[1022,1,"ͼ"],[1023,1,"ͽ"],[1024,1,"ѐ"],[1025,1,"ё"],[1026,1,"ђ"],[1027,1,"ѓ"],[1028,1,"є"],[1029,1,"ѕ"],[1030,1,"і"],[1031,1,"ї"],[1032,1,"ј"],[1033,1,"љ"],[1034,1,"њ"],[1035,1,"ћ"],[1036,1,"ќ"],[1037,1,"ѝ"],[1038,1,"ў"],[1039,1,"џ"],[1040,1,"а"],[1041,1,"б"],[1042,1,"в"],[1043,1,"г"],[1044,1,"д"],[1045,1,"е"],[1046,1,"ж"],[1047,1,"з"],[1048,1,"и"],[1049,1,"й"],[1050,1,"к"],[1051,1,"л"],[1052,1,"м"],[1053,1,"н"],[1054,1,"о"],[1055,1,"п"],[1056,1,"р"],[1057,1,"с"],[1058,1,"т"],[1059,1,"у"],[1060,1,"ф"],[1061,1,"х"],[1062,1,"ц"],[1063,1,"ч"],[1064,1,"ш"],[1065,1,"щ"],[1066,1,"ъ"],[1067,1,"ы"],[1068,1,"ь"],[1069,1,"э"],[1070,1,"ю"],[1071,1,"я"],[[1072,1103],2],[1104,2],[[1105,1116],2],[1117,2],[[1118,1119],2],[1120,1,"ѡ"],[1121,2],[1122,1,"ѣ"],[1123,2],[1124,1,"ѥ"],[1125,2],[1126,1,"ѧ"],[1127,2],[1128,1,"ѩ"],[1129,2],[1130,1,"ѫ"],[1131,2],[1132,1,"ѭ"],[1133,2],[1134,1,"ѯ"],[1135,2],[1136,1,"ѱ"],[1137,2],[1138,1,"ѳ"],[1139,2],[1140,1,"ѵ"],[1141,2],[1142,1,"ѷ"],[1143,2],[1144,1,"ѹ"],[1145,2],[1146,1,"ѻ"],[1147,2],[1148,1,"ѽ"],[1149,2],[1150,1,"ѿ"],[1151,2],[1152,1,"ҁ"],[1153,2],[1154,2],[[1155,1158],2],[1159,2],[[1160,1161],2],[1162,1,"ҋ"],[1163,2],[1164,1,"ҍ"],[1165,2],[1166,1,"ҏ"],[1167,2],[1168,1,"ґ"],[1169,2],[1170,1,"ғ"],[1171,2],[1172,1,"ҕ"],[1173,2],[1174,1,"җ"],[1175,2],[1176,1,"ҙ"],[1177,2],[1178,1,"қ"],[1179,2],[1180,1,"ҝ"],[1181,2],[1182,1,"ҟ"],[1183,2],[1184,1,"ҡ"],[1185,2],[1186,1,"ң"],[1187,2],[1188,1,"ҥ"],[1189,2],[1190,1,"ҧ"],[1191,2],[1192,1,"ҩ"],[1193,2],[1194,1,"ҫ"],[1195,2],[1196,1,"ҭ"],[1197,2],[1198,1,"ү"],[1199,2],[1200,1,"ұ"],[1201,2],[1202,1,
"ҳ"],[1203,2],[1204,1,"ҵ"],[1205,2],[1206,1,"ҷ"],[1207,2],[1208,1,"ҹ"],[1209,2],[1210,1,"һ"],[1211,2],[1212,1,"ҽ"],[1213,2],[1214,1,"ҿ"],[1215,2],[1216,3],[1217,1,"ӂ"],[1218,2],[1219,1,"ӄ"],[1220,2],[1221,1,"ӆ"],[1222,2],[1223,1,"ӈ"],[1224,2],[1225,1,"ӊ"],[1226,2],[1227,1,"ӌ"],[1228,2],[1229,1,"ӎ"],[1230,2],[1231,2],[1232,1,"ӑ"],[1233,2],[1234,1,"ӓ"],[1235,2],[1236,1,"ӕ"],[1237,2],[1238,1,"ӗ"],[1239,2],[1240,1,"ә"],[1241,2],[1242,1,"ӛ"],[1243,2],[1244,1,"ӝ"],[1245,2],[1246,1,"ӟ"],[1247,2],[1248,1,"ӡ"],[1249,2],[1250,1,"ӣ"],[1251,2],[1252,1,"ӥ"],[1253,2],[1254,1,"ӧ"],[1255,2],[1256,1,"ө"],[1257,2],[1258,1,"ӫ"],[1259,2],[1260,1,"ӭ"],[1261,2],[1262,1,"ӯ"],[1263,2],[1264,1,"ӱ"],[1265,2],[1266,1,"ӳ"],[1267,2],[1268,1,"ӵ"],[1269,2],[1270,1,"ӷ"],[1271,2],[1272,1,"ӹ"],[1273,2],[1274,1,"ӻ"],[1275,2],[1276,1,"ӽ"],[1277,2],[1278,1,"ӿ"],[1279,2],[1280,1,"ԁ"],[1281,2],[1282,1,"ԃ"],[1283,2],[1284,1,"ԅ"],[1285,2],[1286,1,"ԇ"],[1287,2],[1288,1,"ԉ"],[1289,2],[1290,1,"ԋ"],[1291,2],[1292,1,"ԍ"],[1293,2],[1294,1,"ԏ"],[1295,2],[1296,1,"ԑ"],[1297,2],[1298,1,"ԓ"],[1299,2],[1300,1,"ԕ"],[1301,2],[1302,1,"ԗ"],[1303,2],[1304,1,"ԙ"],[1305,2],[1306,1,"ԛ"],[1307,2],[1308,1,"ԝ"],[1309,2],[1310,1,"ԟ"],[1311,2],[1312,1,"ԡ"],[1313,2],[1314,1,"ԣ"],[1315,2],[1316,1,"ԥ"],[1317,2],[1318,1,"ԧ"],[1319,2],[1320,1,"ԩ"],[1321,2],[1322,1,"ԫ"],[1323,2],[1324,1,"ԭ"],[1325,2],[1326,1,"ԯ"],[1327,2],[1328,3],[1329,1,"ա"],[1330,1,"բ"],[1331,1,"գ"],[1332,1,"դ"],[1333,1,"ե"],[1334,1,"զ"],[1335,1,"է"],[1336,1,"ը"],[1337,1,"թ"],[1338,1,"ժ"],[1339,1,"ի"],[1340,1,"լ"],[1341,1,"խ"],[1342,1,"ծ"],[1343,1,"կ"],[1344,1,"հ"],[1345,1,"ձ"],[1346,1,"ղ"],[1347,1,"ճ"],[1348,1,"մ"],[1349,1,"յ"],[1350,1,"ն"],[1351,1,"շ"],[1352,1,"ո"],[1353,1,"չ"],[1354,1,"պ"],[1355,1,"ջ"],[1356,1,"ռ"],[1357,1,"ս"],[1358,1,"վ"],[1359,1,"տ"],[1360,1,"ր"],[1361,1,"ց"],[1362,1,"ւ"],[1363,1,"փ"],[1364,1,"ք"],[1365,1,"օ"],[1366,1,"ֆ"],[[1367,1368],3],[1369,2],[[1370,1375],2],[1376,2],[[1377,1414],2],[1415,1,"եւ"],[1416,2],[1417,2],[1418,2],[[1419,1420],3],[[1421,1422],2],[1423,2],[1424,3],[[1425,1441],2],[1442,2],[[1443,1455],2],[[1456,1465],2],[1466,2],[[1467,1469],2],[1470,2],[1471,2],[1472,2],[[1473,1474],2],[1475,2],[1476,2],[1477,2],[1478,2],[1479,2],[[1480,1487],3],[[1488,1514],2],[[1515,1518],3],[1519,2],[[1520,1524],2],[[1525,1535],3],[[1536,1539],3],[1540,3],[1541,3],[[1542,1546],2],[1547,2],[1548,2],[[1549,1551],2],[[1552,1557],2],[[1558,1562],2],[1563,2],[1564,3],[1565,3],[1566,2],[1567,2],[1568,2],[[1569,1594],2],[[1595,1599],2],[1600,2],[[1601,1618],2],[[1619,1621],2],[[1622,1624],2],[[1625,1630],2],[1631,2],[[1632,1641],2],[[1642,1645],2],[[1646,1647],2],[[1648,1652],2],[1653,1,"اٴ"],[1654,1,"وٴ"],[1655,1,"ۇٴ"],[1656,1,"يٴ"],[[1657,1719],2],[[1720,1721],2],[[1722,1726],2],[1727,2],[[1728,1742],2],[1743,2],[[1744,1747],2],[1748,2],[[1749,1756],2],[1757,3],[1758,2],[[1759,1768],2],[1769,2],[[1770,1773],2],[[1774,1775],2],[[1776,1785],2],[[1786,1790],2],[1791,2],[[1792,1805],2],[1806,3],[1807,3],[[1808,1836],2],[[1837,1839],2],[[1840,1866],2],[[1867,1868],3],[[1869,1871],2],[[1872,1901],2],[[1902,1919],2],[[1920,1968],2],[1969,2],[[1970,1983],3],[[1984,2037],2],[[2038,2042],2],[[2043,2044],3],[2045,2],[[2046,2047],2],[[2048,2093],2],[[2094,2095],3],[[2096,2110],2],[2111,3],[[2112,2139],2],[[2140,2141],3],[2142,2],[2143,3],[[2144,2154],2],[[2155,2207],3],[2208,2],[2209,2],[[2210,2220],2],[[2221,2226],2],[[2227,2228],2],[2229,3],[[2230,2237],2],[[2238,2247],2],[[2248,2258],3],[2259,2],[[2260,2273],2],[2274,3],[2275,2],[[2276,2302],2],[2303,2],[2304,2],[[2305,2307],2],[230
8,2],[[2309,2361],2],[[2362,2363],2],[[2364,2381],2],[2382,2],[2383,2],[[2384,2388],2],[2389,2],[[2390,2391],2],[2392,1,"क़"],[2393,1,"ख़"],[2394,1,"ग़"],[2395,1,"ज़"],[2396,1,"ड़"],[2397,1,"ढ़"],[2398,1,"फ़"],[2399,1,"य़"],[[2400,2403],2],[[2404,2405],2],[[2406,2415],2],[2416,2],[[2417,2418],2],[[2419,2423],2],[2424,2],[[2425,2426],2],[[2427,2428],2],[2429,2],[[2430,2431],2],[2432,2],[[2433,2435],2],[2436,3],[[2437,2444],2],[[2445,2446],3],[[2447,2448],2],[[2449,2450],3],[[2451,2472],2],[2473,3],[[2474,2480],2],[2481,3],[2482,2],[[2483,2485],3],[[2486,2489],2],[[2490,2491],3],[2492,2],[2493,2],[[2494,2500],2],[[2501,2502],3],[[2503,2504],2],[[2505,2506],3],[[2507,2509],2],[2510,2],[[2511,2518],3],[2519,2],[[2520,2523],3],[2524,1,"ড়"],[2525,1,"ঢ়"],[2526,3],[2527,1,"য়"],[[2528,2531],2],[[2532,2533],3],[[2534,2545],2],[[2546,2554],2],[2555,2],[2556,2],[2557,2],[2558,2],[[2559,2560],3],[2561,2],[2562,2],[2563,2],[2564,3],[[2565,2570],2],[[2571,2574],3],[[2575,2576],2],[[2577,2578],3],[[2579,2600],2],[2601,3],[[2602,2608],2],[2609,3],[2610,2],[2611,1,"ਲ਼"],[2612,3],[2613,2],[2614,1,"ਸ਼"],[2615,3],[[2616,2617],2],[[2618,2619],3],[2620,2],[2621,3],[[2622,2626],2],[[2627,2630],3],[[2631,2632],2],[[2633,2634],3],[[2635,2637],2],[[2638,2640],3],[2641,2],[[2642,2648],3],[2649,1,"ਖ਼"],[2650,1,"ਗ਼"],[2651,1,"ਜ਼"],[2652,2],[2653,3],[2654,1,"ਫ਼"],[[2655,2661],3],[[2662,2676],2],[2677,2],[2678,2],[[2679,2688],3],[[2689,2691],2],[2692,3],[[2693,2699],2],[2700,2],[2701,2],[2702,3],[[2703,2705],2],[2706,3],[[2707,2728],2],[2729,3],[[2730,2736],2],[2737,3],[[2738,2739],2],[2740,3],[[2741,2745],2],[[2746,2747],3],[[2748,2757],2],[2758,3],[[2759,2761],2],[2762,3],[[2763,2765],2],[[2766,2767],3],[2768,2],[[2769,2783],3],[2784,2],[[2785,2787],2],[[2788,2789],3],[[2790,2799],2],[2800,2],[2801,2],[[2802,2808],3],[2809,2],[[2810,2815],2],[2816,3],[[2817,2819],2],[2820,3],[[2821,2828],2],[[2829,2830],3],[[2831,2832],2],[[2833,2834],3],[[2835,2856],2],[2857,3],[[2858,2864],2],[2865,3],[[2866,2867],2],[2868,3],[2869,2],[[2870,2873],2],[[2874,2875],3],[[2876,2883],2],[2884,2],[[2885,2886],3],[[2887,2888],2],[[2889,2890],3],[[2891,2893],2],[[2894,2900],3],[2901,2],[[2902,2903],2],[[2904,2907],3],[2908,1,"ଡ଼"],[2909,1,"ଢ଼"],[2910,3],[[2911,2913],2],[[2914,2915],2],[[2916,2917],3],[[2918,2927],2],[2928,2],[2929,2],[[2930,2935],2],[[2936,2945],3],[[2946,2947],2],[2948,3],[[2949,2954],2],[[2955,2957],3],[[2958,2960],2],[2961,3],[[2962,2965],2],[[2966,2968],3],[[2969,2970],2],[2971,3],[2972,2],[2973,3],[[2974,2975],2],[[2976,2978],3],[[2979,2980],2],[[2981,2983],3],[[2984,2986],2],[[2987,2989],3],[[2990,2997],2],[2998,2],[[2999,3001],2],[[3002,3005],3],[[3006,3010],2],[[3011,3013],3],[[3014,3016],2],[3017,3],[[3018,3021],2],[[3022,3023],3],[3024,2],[[3025,3030],3],[3031,2],[[3032,3045],3],[3046,2],[[3047,3055],2],[[3056,3058],2],[[3059,3066],2],[[3067,3071],3],[3072,2],[[3073,3075],2],[3076,2],[[3077,3084],2],[3085,3],[[3086,3088],2],[3089,3],[[3090,3112],2],[3113,3],[[3114,3123],2],[3124,2],[[3125,3129],2],[[3130,3132],3],[3133,2],[[3134,3140],2],[3141,3],[[3142,3144],2],[3145,3],[[3146,3149],2],[[3150,3156],3],[[3157,3158],2],[3159,3],[[3160,3161],2],[3162,2],[[3163,3167],3],[[3168,3169],2],[[3170,3171],2],[[3172,3173],3],[[3174,3183],2],[[3184,3190],3],[3191,2],[[3192,3199],2],[3200,2],[3201,2],[[3202,3203],2],[3204,2],[[3205,3212],2],[3213,3],[[3214,3216],2],[3217,3],[[3218,3240],2],[3241,3],[[3242,3251],2],[3252,3],[[3253,3257],2],[[3258,3259],3],[[3260,3261],2],[[3262,3268],2],[3269,3],[[3270,3272],2],[3273,3],[[3274
,3277],2],[[3278,3284],3],[[3285,3286],2],[[3287,3293],3],[3294,2],[3295,3],[[3296,3297],2],[[3298,3299],2],[[3300,3301],3],[[3302,3311],2],[3312,3],[[3313,3314],2],[[3315,3327],3],[3328,2],[3329,2],[[3330,3331],2],[3332,2],[[3333,3340],2],[3341,3],[[3342,3344],2],[3345,3],[[3346,3368],2],[3369,2],[[3370,3385],2],[3386,2],[[3387,3388],2],[3389,2],[[3390,3395],2],[3396,2],[3397,3],[[3398,3400],2],[3401,3],[[3402,3405],2],[3406,2],[3407,2],[[3408,3411],3],[[3412,3414],2],[3415,2],[[3416,3422],2],[3423,2],[[3424,3425],2],[[3426,3427],2],[[3428,3429],3],[[3430,3439],2],[[3440,3445],2],[[3446,3448],2],[3449,2],[[3450,3455],2],[3456,3],[3457,2],[[3458,3459],2],[3460,3],[[3461,3478],2],[[3479,3481],3],[[3482,3505],2],[3506,3],[[3507,3515],2],[3516,3],[3517,2],[[3518,3519],3],[[3520,3526],2],[[3527,3529],3],[3530,2],[[3531,3534],3],[[3535,3540],2],[3541,3],[3542,2],[3543,3],[[3544,3551],2],[[3552,3557],3],[[3558,3567],2],[[3568,3569],3],[[3570,3571],2],[3572,2],[[3573,3584],3],[[3585,3634],2],[3635,1,"ํา"],[[3636,3642],2],[[3643,3646],3],[3647,2],[[3648,3662],2],[3663,2],[[3664,3673],2],[[3674,3675],2],[[3676,3712],3],[[3713,3714],2],[3715,3],[3716,2],[3717,3],[3718,2],[[3719,3720],2],[3721,2],[3722,2],[3723,3],[3724,2],[3725,2],[[3726,3731],2],[[3732,3735],2],[3736,2],[[3737,3743],2],[3744,2],[[3745,3747],2],[3748,3],[3749,2],[3750,3],[3751,2],[[3752,3753],2],[[3754,3755],2],[3756,2],[[3757,3762],2],[3763,1,"ໍາ"],[[3764,3769],2],[3770,2],[[3771,3773],2],[[3774,3775],3],[[3776,3780],2],[3781,3],[3782,2],[3783,3],[[3784,3789],2],[[3790,3791],3],[[3792,3801],2],[[3802,3803],3],[3804,1,"ຫນ"],[3805,1,"ຫມ"],[[3806,3807],2],[[3808,3839],3],[3840,2],[[3841,3850],2],[3851,2],[3852,1,"་"],[[3853,3863],2],[[3864,3865],2],[[3866,3871],2],[[3872,3881],2],[[3882,3892],2],[3893,2],[3894,2],[3895,2],[3896,2],[3897,2],[[3898,3901],2],[[3902,3906],2],[3907,1,"གྷ"],[[3908,3911],2],[3912,3],[[3913,3916],2],[3917,1,"ཌྷ"],[[3918,3921],2],[3922,1,"དྷ"],[[3923,3926],2],[3927,1,"བྷ"],[[3928,3931],2],[3932,1,"ཛྷ"],[[3933,3944],2],[3945,1,"ཀྵ"],[3946,2],[[3947,3948],2],[[3949,3952],3],[[3953,3954],2],[3955,1,"ཱི"],[3956,2],[3957,1,"ཱུ"],[3958,1,"ྲྀ"],[3959,1,"ྲཱྀ"],[3960,1,"ླྀ"],[3961,1,"ླཱྀ"],[[3962,3968],2],[3969,1,"ཱྀ"],[[3970,3972],2],[3973,2],[[3974,3979],2],[[3980,3983],2],[[3984,3986],2],[3987,1,"ྒྷ"],[[3988,3989],2],[3990,2],[3991,2],[3992,3],[[3993,3996],2],[3997,1,"ྜྷ"],[[3998,4001],2],[4002,1,"ྡྷ"],[[4003,4006],2],[4007,1,"ྦྷ"],[[4008,4011],2],[4012,1,"ྫྷ"],[4013,2],[[4014,4016],2],[[4017,4023],2],[4024,2],[4025,1,"ྐྵ"],[[4026,4028],2],[4029,3],[[4030,4037],2],[4038,2],[[4039,4044],2],[4045,3],[4046,2],[4047,2],[[4048,4049],2],[[4050,4052],2],[[4053,4056],2],[[4057,4058],2],[[4059,4095],3],[[4096,4129],2],[4130,2],[[4131,4135],2],[4136,2],[[4137,4138],2],[4139,2],[[4140,4146],2],[[4147,4149],2],[[4150,4153],2],[[4154,4159],2],[[4160,4169],2],[[4170,4175],2],[[4176,4185],2],[[4186,4249],2],[[4250,4253],2],[[4254,4255],2],[[4256,4293],3],[4294,3],[4295,1,"ⴧ"],[[4296,4300],3],[4301,1,"ⴭ"],[[4302,4303],3],[[4304,4342],2],[[4343,4344],2],[[4345,4346],2],[4347,2],[4348,1,"ნ"],[[4349,4351],2],[[4352,4441],2],[[4442,4446],2],[[4447,4448],3],[[4449,4514],2],[[4515,4519],2],[[4520,4601],2],[[4602,4607],2],[[4608,4614],2],[4615,2],[[4616,4678],2],[4679,2],[4680,2],[4681,3],[[4682,4685],2],[[4686,4687],3],[[4688,4694],2],[4695,3],[4696,2],[4697,3],[[4698,4701],2],[[4702,4703],3],[[4704,4742],2],[4743,2],[4744,2],[4745,3],[[4746,4749],2],[[4750,4751],3],[[4752,4782],2],[4783,2],[4784,2],[4785,3],[[4786,4789],2],[[4790,4791],3
],[[4792,4798],2],[4799,3],[4800,2],[4801,3],[[4802,4805],2],[[4806,4807],3],[[4808,4814],2],[4815,2],[[4816,4822],2],[4823,3],[[4824,4846],2],[4847,2],[[4848,4878],2],[4879,2],[4880,2],[4881,3],[[4882,4885],2],[[4886,4887],3],[[4888,4894],2],[4895,2],[[4896,4934],2],[4935,2],[[4936,4954],2],[[4955,4956],3],[[4957,4958],2],[4959,2],[4960,2],[[4961,4988],2],[[4989,4991],3],[[4992,5007],2],[[5008,5017],2],[[5018,5023],3],[[5024,5108],2],[5109,2],[[5110,5111],3],[5112,1,"Ᏸ"],[5113,1,"Ᏹ"],[5114,1,"Ᏺ"],[5115,1,"Ᏻ"],[5116,1,"Ᏼ"],[5117,1,"Ᏽ"],[[5118,5119],3],[5120,2],[[5121,5740],2],[[5741,5742],2],[[5743,5750],2],[[5751,5759],2],[5760,3],[[5761,5786],2],[[5787,5788],2],[[5789,5791],3],[[5792,5866],2],[[5867,5872],2],[[5873,5880],2],[[5881,5887],3],[[5888,5900],2],[5901,3],[[5902,5908],2],[[5909,5919],3],[[5920,5940],2],[[5941,5942],2],[[5943,5951],3],[[5952,5971],2],[[5972,5983],3],[[5984,5996],2],[5997,3],[[5998,6000],2],[6001,3],[[6002,6003],2],[[6004,6015],3],[[6016,6067],2],[[6068,6069],3],[[6070,6099],2],[[6100,6102],2],[6103,2],[[6104,6107],2],[6108,2],[6109,2],[[6110,6111],3],[[6112,6121],2],[[6122,6127],3],[[6128,6137],2],[[6138,6143],3],[[6144,6149],2],[6150,3],[[6151,6154],2],[[6155,6157],7],[6158,3],[6159,3],[[6160,6169],2],[[6170,6175],3],[[6176,6263],2],[6264,2],[[6265,6271],3],[[6272,6313],2],[6314,2],[[6315,6319],3],[[6320,6389],2],[[6390,6399],3],[[6400,6428],2],[[6429,6430],2],[6431,3],[[6432,6443],2],[[6444,6447],3],[[6448,6459],2],[[6460,6463],3],[6464,2],[[6465,6467],3],[[6468,6469],2],[[6470,6509],2],[[6510,6511],3],[[6512,6516],2],[[6517,6527],3],[[6528,6569],2],[[6570,6571],2],[[6572,6575],3],[[6576,6601],2],[[6602,6607],3],[[6608,6617],2],[6618,2],[[6619,6621],3],[[6622,6623],2],[[6624,6655],2],[[6656,6683],2],[[6684,6685],3],[[6686,6687],2],[[6688,6750],2],[6751,3],[[6752,6780],2],[[6781,6782],3],[[6783,6793],2],[[6794,6799],3],[[6800,6809],2],[[6810,6815],3],[[6816,6822],2],[6823,2],[[6824,6829],2],[[6830,6831],3],[[6832,6845],2],[6846,2],[[6847,6848],2],[[6849,6911],3],[[6912,6987],2],[[6988,6991],3],[[6992,7001],2],[[7002,7018],2],[[7019,7027],2],[[7028,7036],2],[[7037,7039],3],[[7040,7082],2],[[7083,7085],2],[[7086,7097],2],[[7098,7103],2],[[7104,7155],2],[[7156,7163],3],[[7164,7167],2],[[7168,7223],2],[[7224,7226],3],[[7227,7231],2],[[7232,7241],2],[[7242,7244],3],[[7245,7293],2],[[7294,7295],2],[7296,1,"в"],[7297,1,"д"],[7298,1,"о"],[7299,1,"с"],[[7300,7301],1,"т"],[7302,1,"ъ"],[7303,1,"ѣ"],[7304,1,"ꙋ"],[[7305,7311],3],[7312,1,"ა"],[7313,1,"ბ"],[7314,1,"გ"],[7315,1,"დ"],[7316,1,"ე"],[7317,1,"ვ"],[7318,1,"ზ"],[7319,1,"თ"],[7320,1,"ი"],[7321,1,"კ"],[7322,1,"ლ"],[7323,1,"მ"],[7324,1,"ნ"],[7325,1,"ო"],[7326,1,"პ"],[7327,1,"ჟ"],[7328,1,"რ"],[7329,1,"ს"],[7330,1,"ტ"],[7331,1,"უ"],[7332,1,"ფ"],[7333,1,"ქ"],[7334,1,"ღ"],[7335,1,"ყ"],[7336,1,"შ"],[7337,1,"ჩ"],[7338,1,"ც"],[7339,1,"ძ"],[7340,1,"წ"],[7341,1,"ჭ"],[7342,1,"ხ"],[7343,1,"ჯ"],[7344,1,"ჰ"],[7345,1,"ჱ"],[7346,1,"ჲ"],[7347,1,"ჳ"],[7348,1,"ჴ"],[7349,1,"ჵ"],[7350,1,"ჶ"],[7351,1,"ჷ"],[7352,1,"ჸ"],[7353,1,"ჹ"],[7354,1,"ჺ"],[[7355,7356],3],[7357,1,"ჽ"],[7358,1,"ჾ"],[7359,1,"ჿ"],[[7360,7367],2],[[7368,7375],3],[[7376,7378],2],[7379,2],[[7380,7410],2],[[7411,7414],2],[7415,2],[[7416,7417],2],[7418,2],[[7419,7423],3],[[7424,7467],2],[7468,1,"a"],[7469,1,"æ"],[7470,1,"b"],[7471,2],[7472,1,"d"],[7473,1,"e"],[7474,1,"ǝ"],[7475,1,"g"],[7476,1,"h"],[7477,1,"i"],[7478,1,"j"],[7479,1,"k"],[7480,1,"l"],[7481,1,"m"],[7482,1,"n"],[7483,2],[7484,1,"o"],[7485,1,"ȣ"],[7486,1,"p"],[7487,1,"r"],[7488,1,"t"],[7489,1,"u"],[7490,1,"w"],[7491,1
,"a"],[7492,1,"ɐ"],[7493,1,"ɑ"],[7494,1,"ᴂ"],[7495,1,"b"],[7496,1,"d"],[7497,1,"e"],[7498,1,"ə"],[7499,1,"ɛ"],[7500,1,"ɜ"],[7501,1,"g"],[7502,2],[7503,1,"k"],[7504,1,"m"],[7505,1,"ŋ"],[7506,1,"o"],[7507,1,"ɔ"],[7508,1,"ᴖ"],[7509,1,"ᴗ"],[7510,1,"p"],[7511,1,"t"],[7512,1,"u"],[7513,1,"ᴝ"],[7514,1,"ɯ"],[7515,1,"v"],[7516,1,"ᴥ"],[7517,1,"β"],[7518,1,"γ"],[7519,1,"δ"],[7520,1,"φ"],[7521,1,"χ"],[7522,1,"i"],[7523,1,"r"],[7524,1,"u"],[7525,1,"v"],[7526,1,"β"],[7527,1,"γ"],[7528,1,"ρ"],[7529,1,"φ"],[7530,1,"χ"],[7531,2],[[7532,7543],2],[7544,1,"н"],[[7545,7578],2],[7579,1,"ɒ"],[7580,1,"c"],[7581,1,"ɕ"],[7582,1,"ð"],[7583,1,"ɜ"],[7584,1,"f"],[7585,1,"ɟ"],[7586,1,"ɡ"],[7587,1,"ɥ"],[7588,1,"ɨ"],[7589,1,"ɩ"],[7590,1,"ɪ"],[7591,1,"ᵻ"],[7592,1,"ʝ"],[7593,1,"ɭ"],[7594,1,"ᶅ"],[7595,1,"ʟ"],[7596,1,"ɱ"],[7597,1,"ɰ"],[7598,1,"ɲ"],[7599,1,"ɳ"],[7600,1,"ɴ"],[7601,1,"ɵ"],[7602,1,"ɸ"],[7603,1,"ʂ"],[7604,1,"ʃ"],[7605,1,"ƫ"],[7606,1,"ʉ"],[7607,1,"ʊ"],[7608,1,"ᴜ"],[7609,1,"ʋ"],[7610,1,"ʌ"],[7611,1,"z"],[7612,1,"ʐ"],[7613,1,"ʑ"],[7614,1,"ʒ"],[7615,1,"θ"],[[7616,7619],2],[[7620,7626],2],[[7627,7654],2],[[7655,7669],2],[[7670,7673],2],[7674,3],[7675,2],[7676,2],[7677,2],[[7678,7679],2],[7680,1,"ḁ"],[7681,2],[7682,1,"ḃ"],[7683,2],[7684,1,"ḅ"],[7685,2],[7686,1,"ḇ"],[7687,2],[7688,1,"ḉ"],[7689,2],[7690,1,"ḋ"],[7691,2],[7692,1,"ḍ"],[7693,2],[7694,1,"ḏ"],[7695,2],[7696,1,"ḑ"],[7697,2],[7698,1,"ḓ"],[7699,2],[7700,1,"ḕ"],[7701,2],[7702,1,"ḗ"],[7703,2],[7704,1,"ḙ"],[7705,2],[7706,1,"ḛ"],[7707,2],[7708,1,"ḝ"],[7709,2],[7710,1,"ḟ"],[7711,2],[7712,1,"ḡ"],[7713,2],[7714,1,"ḣ"],[7715,2],[7716,1,"ḥ"],[7717,2],[7718,1,"ḧ"],[7719,2],[7720,1,"ḩ"],[7721,2],[7722,1,"ḫ"],[7723,2],[7724,1,"ḭ"],[7725,2],[7726,1,"ḯ"],[7727,2],[7728,1,"ḱ"],[7729,2],[7730,1,"ḳ"],[7731,2],[7732,1,"ḵ"],[7733,2],[7734,1,"ḷ"],[7735,2],[7736,1,"ḹ"],[7737,2],[7738,1,"ḻ"],[7739,2],[7740,1,"ḽ"],[7741,2],[7742,1,"ḿ"],[7743,2],[7744,1,"ṁ"],[7745,2],[7746,1,"ṃ"],[7747,2],[7748,1,"ṅ"],[7749,2],[7750,1,"ṇ"],[7751,2],[7752,1,"ṉ"],[7753,2],[7754,1,"ṋ"],[7755,2],[7756,1,"ṍ"],[7757,2],[7758,1,"ṏ"],[7759,2],[7760,1,"ṑ"],[7761,2],[7762,1,"ṓ"],[7763,2],[7764,1,"ṕ"],[7765,2],[7766,1,"ṗ"],[7767,2],[7768,1,"ṙ"],[7769,2],[7770,1,"ṛ"],[7771,2],[7772,1,"ṝ"],[7773,2],[7774,1,"ṟ"],[7775,2],[7776,1,"ṡ"],[7777,2],[7778,1,"ṣ"],[7779,2],[7780,1,"ṥ"],[7781,2],[7782,1,"ṧ"],[7783,2],[7784,1,"ṩ"],[7785,2],[7786,1,"ṫ"],[7787,2],[7788,1,"ṭ"],[7789,2],[7790,1,"ṯ"],[7791,2],[7792,1,"ṱ"],[7793,2],[7794,1,"ṳ"],[7795,2],[7796,1,"ṵ"],[7797,2],[7798,1,"ṷ"],[7799,2],[7800,1,"ṹ"],[7801,2],[7802,1,"ṻ"],[7803,2],[7804,1,"ṽ"],[7805,2],[7806,1,"ṿ"],[7807,2],[7808,1,"ẁ"],[7809,2],[7810,1,"ẃ"],[7811,2],[7812,1,"ẅ"],[7813,2],[7814,1,"ẇ"],[7815,2],[7816,1,"ẉ"],[7817,2],[7818,1,"ẋ"],[7819,2],[7820,1,"ẍ"],[7821,2],[7822,1,"ẏ"],[7823,2],[7824,1,"ẑ"],[7825,2],[7826,1,"ẓ"],[7827,2],[7828,1,"ẕ"],[[7829,7833],2],[7834,1,"aʾ"],[7835,1,"ṡ"],[[7836,7837],2],[7838,1,"ss"],[7839,2],[7840,1,"ạ"],[7841,2],[7842,1,"ả"],[7843,2],[7844,1,"ấ"],[7845,2],[7846,1,"ầ"],[7847,2],[7848,1,"ẩ"],[7849,2],[7850,1,"ẫ"],[7851,2],[7852,1,"ậ"],[7853,2],[7854,1,"ắ"],[7855,2],[7856,1,"ằ"],[7857,2],[7858,1,"ẳ"],[7859,2],[7860,1,"ẵ"],[7861,2],[7862,1,"ặ"],[7863,2],[7864,1,"ẹ"],[7865,2],[7866,1,"ẻ"],[7867,2],[7868,1,"ẽ"],[7869,2],[7870,1,"ế"],[7871,2],[7872,1,"ề"],[7873,2],[7874,1,"ể"],[7875,2],[7876,1,"ễ"],[7877,2],[7878,1,"ệ"],[7879,2],[7880,1,"ỉ"],[7881,2],[7882,1,"ị"],[7883,2],[7884,1,"ọ"],[7885,2],[7886,1,"ỏ"],[7887,2],[7888,1,"ố"],[7889,2],[7890,1,"ồ"],[7891,2],[7892,1,"ổ"],[7893,2],[7894,1,"ỗ"],[7895,2],[7896,1,"ộ"],[7897,2],[7898,1,"ớ"],[7899
,2],[7900,1,"ờ"],[7901,2],[7902,1,"ở"],[7903,2],[7904,1,"ỡ"],[7905,2],[7906,1,"ợ"],[7907,2],[7908,1,"ụ"],[7909,2],[7910,1,"ủ"],[7911,2],[7912,1,"ứ"],[7913,2],[7914,1,"ừ"],[7915,2],[7916,1,"ử"],[7917,2],[7918,1,"ữ"],[7919,2],[7920,1,"ự"],[7921,2],[7922,1,"ỳ"],[7923,2],[7924,1,"ỵ"],[7925,2],[7926,1,"ỷ"],[7927,2],[7928,1,"ỹ"],[7929,2],[7930,1,"ỻ"],[7931,2],[7932,1,"ỽ"],[7933,2],[7934,1,"ỿ"],[7935,2],[[7936,7943],2],[7944,1,"ἀ"],[7945,1,"ἁ"],[7946,1,"ἂ"],[7947,1,"ἃ"],[7948,1,"ἄ"],[7949,1,"ἅ"],[7950,1,"ἆ"],[7951,1,"ἇ"],[[7952,7957],2],[[7958,7959],3],[7960,1,"ἐ"],[7961,1,"ἑ"],[7962,1,"ἒ"],[7963,1,"ἓ"],[7964,1,"ἔ"],[7965,1,"ἕ"],[[7966,7967],3],[[7968,7975],2],[7976,1,"ἠ"],[7977,1,"ἡ"],[7978,1,"ἢ"],[7979,1,"ἣ"],[7980,1,"ἤ"],[7981,1,"ἥ"],[7982,1,"ἦ"],[7983,1,"ἧ"],[[7984,7991],2],[7992,1,"ἰ"],[7993,1,"ἱ"],[7994,1,"ἲ"],[7995,1,"ἳ"],[7996,1,"ἴ"],[7997,1,"ἵ"],[7998,1,"ἶ"],[7999,1,"ἷ"],[[8000,8005],2],[[8006,8007],3],[8008,1,"ὀ"],[8009,1,"ὁ"],[8010,1,"ὂ"],[8011,1,"ὃ"],[8012,1,"ὄ"],[8013,1,"ὅ"],[[8014,8015],3],[[8016,8023],2],[8024,3],[8025,1,"ὑ"],[8026,3],[8027,1,"ὓ"],[8028,3],[8029,1,"ὕ"],[8030,3],[8031,1,"ὗ"],[[8032,8039],2],[8040,1,"ὠ"],[8041,1,"ὡ"],[8042,1,"ὢ"],[8043,1,"ὣ"],[8044,1,"ὤ"],[8045,1,"ὥ"],[8046,1,"ὦ"],[8047,1,"ὧ"],[8048,2],[8049,1,"ά"],[8050,2],[8051,1,"έ"],[8052,2],[8053,1,"ή"],[8054,2],[8055,1,"ί"],[8056,2],[8057,1,"ό"],[8058,2],[8059,1,"ύ"],[8060,2],[8061,1,"ώ"],[[8062,8063],3],[8064,1,"ἀι"],[8065,1,"ἁι"],[8066,1,"ἂι"],[8067,1,"ἃι"],[8068,1,"ἄι"],[8069,1,"ἅι"],[8070,1,"ἆι"],[8071,1,"ἇι"],[8072,1,"ἀι"],[8073,1,"ἁι"],[8074,1,"ἂι"],[8075,1,"ἃι"],[8076,1,"ἄι"],[8077,1,"ἅι"],[8078,1,"ἆι"],[8079,1,"ἇι"],[8080,1,"ἠι"],[8081,1,"ἡι"],[8082,1,"ἢι"],[8083,1,"ἣι"],[8084,1,"ἤι"],[8085,1,"ἥι"],[8086,1,"ἦι"],[8087,1,"ἧι"],[8088,1,"ἠι"],[8089,1,"ἡι"],[8090,1,"ἢι"],[8091,1,"ἣι"],[8092,1,"ἤι"],[8093,1,"ἥι"],[8094,1,"ἦι"],[8095,1,"ἧι"],[8096,1,"ὠι"],[8097,1,"ὡι"],[8098,1,"ὢι"],[8099,1,"ὣι"],[8100,1,"ὤι"],[8101,1,"ὥι"],[8102,1,"ὦι"],[8103,1,"ὧι"],[8104,1,"ὠι"],[8105,1,"ὡι"],[8106,1,"ὢι"],[8107,1,"ὣι"],[8108,1,"ὤι"],[8109,1,"ὥι"],[8110,1,"ὦι"],[8111,1,"ὧι"],[[8112,8113],2],[8114,1,"ὰι"],[8115,1,"αι"],[8116,1,"άι"],[8117,3],[8118,2],[8119,1,"ᾶι"],[8120,1,"ᾰ"],[8121,1,"ᾱ"],[8122,1,"ὰ"],[8123,1,"ά"],[8124,1,"αι"],[8125,5," ̓"],[8126,1,"ι"],[8127,5," ̓"],[8128,5," ͂"],[8129,5," ̈͂"],[8130,1,"ὴι"],[8131,1,"ηι"],[8132,1,"ήι"],[8133,3],[8134,2],[8135,1,"ῆι"],[8136,1,"ὲ"],[8137,1,"έ"],[8138,1,"ὴ"],[8139,1,"ή"],[8140,1,"ηι"],[8141,5," ̓̀"],[8142,5," ̓́"],[8143,5," ̓͂"],[[8144,8146],2],[8147,1,"ΐ"],[[8148,8149],3],[[8150,8151],2],[8152,1,"ῐ"],[8153,1,"ῑ"],[8154,1,"ὶ"],[8155,1,"ί"],[8156,3],[8157,5," ̔̀"],[8158,5," ̔́"],[8159,5," ̔͂"],[[8160,8162],2],[8163,1,"ΰ"],[[8164,8167],2],[8168,1,"ῠ"],[8169,1,"ῡ"],[8170,1,"ὺ"],[8171,1,"ύ"],[8172,1,"ῥ"],[8173,5," ̈̀"],[8174,5," ̈́"],[8175,5,"`"],[[8176,8177],3],[8178,1,"ὼι"],[8179,1,"ωι"],[8180,1,"ώι"],[8181,3],[8182,2],[8183,1,"ῶι"],[8184,1,"ὸ"],[8185,1,"ό"],[8186,1,"ὼ"],[8187,1,"ώ"],[8188,1,"ωι"],[8189,5," ́"],[8190,5," ̔"],[8191,3],[[8192,8202],5," "],[8203,7],[[8204,8205],6,""],[[8206,8207],3],[8208,2],[8209,1,"‐"],[[8210,8214],2],[8215,5," ̳"],[[8216,8227],2],[[8228,8230],3],[8231,2],[[8232,8238],3],[8239,5," "],[[8240,8242],2],[8243,1,"′′"],[8244,1,"′′′"],[8245,2],[8246,1,"‵‵"],[8247,1,"‵‵‵"],[[8248,8251],2],[8252,5,"!!"],[8253,2],[8254,5," ̅"],[[8255,8262],2],[8263,5,"??"],[8264,5,"?!"],[8265,5,"!?"],[[8266,8269],2],[[8270,8274],2],[[8275,8276],2],[[8277,8278],2],[8279,1,"′′′′"],[[8280,8286],2],[8287,5," 
"],[8288,7],[[8289,8291],3],[8292,7],[8293,3],[[8294,8297],3],[[8298,8303],3],[8304,1,"0"],[8305,1,"i"],[[8306,8307],3],[8308,1,"4"],[8309,1,"5"],[8310,1,"6"],[8311,1,"7"],[8312,1,"8"],[8313,1,"9"],[8314,5,"+"],[8315,1,"−"],[8316,5,"="],[8317,5,"("],[8318,5,")"],[8319,1,"n"],[8320,1,"0"],[8321,1,"1"],[8322,1,"2"],[8323,1,"3"],[8324,1,"4"],[8325,1,"5"],[8326,1,"6"],[8327,1,"7"],[8328,1,"8"],[8329,1,"9"],[8330,5,"+"],[8331,1,"−"],[8332,5,"="],[8333,5,"("],[8334,5,")"],[8335,3],[8336,1,"a"],[8337,1,"e"],[8338,1,"o"],[8339,1,"x"],[8340,1,"ə"],[8341,1,"h"],[8342,1,"k"],[8343,1,"l"],[8344,1,"m"],[8345,1,"n"],[8346,1,"p"],[8347,1,"s"],[8348,1,"t"],[[8349,8351],3],[[8352,8359],2],[8360,1,"rs"],[[8361,8362],2],[8363,2],[8364,2],[[8365,8367],2],[[8368,8369],2],[[8370,8373],2],[[8374,8376],2],[8377,2],[8378,2],[[8379,8381],2],[8382,2],[8383,2],[[8384,8399],3],[[8400,8417],2],[[8418,8419],2],[[8420,8426],2],[8427,2],[[8428,8431],2],[8432,2],[[8433,8447],3],[8448,5,"a/c"],[8449,5,"a/s"],[8450,1,"c"],[8451,1,"°c"],[8452,2],[8453,5,"c/o"],[8454,5,"c/u"],[8455,1,"ɛ"],[8456,2],[8457,1,"°f"],[8458,1,"g"],[[8459,8462],1,"h"],[8463,1,"ħ"],[[8464,8465],1,"i"],[[8466,8467],1,"l"],[8468,2],[8469,1,"n"],[8470,1,"no"],[[8471,8472],2],[8473,1,"p"],[8474,1,"q"],[[8475,8477],1,"r"],[[8478,8479],2],[8480,1,"sm"],[8481,1,"tel"],[8482,1,"tm"],[8483,2],[8484,1,"z"],[8485,2],[8486,1,"ω"],[8487,2],[8488,1,"z"],[8489,2],[8490,1,"k"],[8491,1,"å"],[8492,1,"b"],[8493,1,"c"],[8494,2],[[8495,8496],1,"e"],[8497,1,"f"],[8498,3],[8499,1,"m"],[8500,1,"o"],[8501,1,"א"],[8502,1,"ב"],[8503,1,"ג"],[8504,1,"ד"],[8505,1,"i"],[8506,2],[8507,1,"fax"],[8508,1,"π"],[[8509,8510],1,"γ"],[8511,1,"π"],[8512,1,"∑"],[[8513,8516],2],[[8517,8518],1,"d"],[8519,1,"e"],[8520,1,"i"],[8521,1,"j"],[[8522,8523],2],[8524,2],[8525,2],[8526,2],[8527,2],[8528,1,"1⁄7"],[8529,1,"1⁄9"],[8530,1,"1⁄10"],[8531,1,"1⁄3"],[8532,1,"2⁄3"],[8533,1,"1⁄5"],[8534,1,"2⁄5"],[8535,1,"3⁄5"],[8536,1,"4⁄5"],[8537,1,"1⁄6"],[8538,1,"5⁄6"],[8539,1,"1⁄8"],[8540,1,"3⁄8"],[8541,1,"5⁄8"],[8542,1,"7⁄8"],[8543,1,"1⁄"],[8544,1,"i"],[8545,1,"ii"],[8546,1,"iii"],[8547,1,"iv"],[8548,1,"v"],[8549,1,"vi"],[8550,1,"vii"],[8551,1,"viii"],[8552,1,"ix"],[8553,1,"x"],[8554,1,"xi"],[8555,1,"xii"],[8556,1,"l"],[8557,1,"c"],[8558,1,"d"],[8559,1,"m"],[8560,1,"i"],[8561,1,"ii"],[8562,1,"iii"],[8563,1,"iv"],[8564,1,"v"],[8565,1,"vi"],[8566,1,"vii"],[8567,1,"viii"],[8568,1,"ix"],[8569,1,"x"],[8570,1,"xi"],[8571,1,"xii"],[8572,1,"l"],[8573,1,"c"],[8574,1,"d"],[8575,1,"m"],[[8576,8578],2],[8579,3],[8580,2],[[8581,8584],2],[8585,1,"0⁄3"],[[8586,8587],2],[[8588,8591],3],[[8592,8682],2],[[8683,8691],2],[[8692,8703],2],[[8704,8747],2],[8748,1,"∫∫"],[8749,1,"∫∫∫"],[8750,2],[8751,1,"∮∮"],[8752,1,"∮∮∮"],[[8753,8799],2],[8800,4],[[8801,8813],2],[[8814,8815],4],[[8816,8945],2],[[8946,8959],2],[8960,2],[8961,2],[[8962,9000],2],[9001,1,"〈"],[9002,1,"〉"],[[9003,9082],2],[9083,2],[9084,2],[[9085,9114],2],[[9115,9166],2],[[9167,9168],2],[[9169,9179],2],[[9180,9191],2],[9192,2],[[9193,9203],2],[[9204,9210],2],[[9211,9214],2],[9215,2],[[9216,9252],2],[[9253,9254],2],[[9255,9279],3],[[9280,9290],2],[[9291,9311],3],[9312,1,"1"],[9313,1,"2"],[9314,1,"3"],[9315,1,"4"],[9316,1,"5"],[9317,1,"6"],[9318,1,"7"],[9319,1,"8"],[9320,1,"9"],[9321,1,"10"],[9322,1,"11"],[9323,1,"12"],[9324,1,"13"],[9325,1,"14"],[9326,1,"15"],[9327,1,"16"],[9328,1,"17"],[9329,1,"18"],[9330,1,"19"],[9331,1,"20"],[9332,5,"(1)"],[9333,5,"(2)"],[9334,5,"(3)"],[9335,5,"(4)"],[9336,5,"(5)"],[9337,5,"(6)"],[9338,5,"(7)"],[9339,5,"(8)"],[9340,5,"(9)"],[9341,5,"(10)"]
,[9342,5,"(11)"],[9343,5,"(12)"],[9344,5,"(13)"],[9345,5,"(14)"],[9346,5,"(15)"],[9347,5,"(16)"],[9348,5,"(17)"],[9349,5,"(18)"],[9350,5,"(19)"],[9351,5,"(20)"],[[9352,9371],3],[9372,5,"(a)"],[9373,5,"(b)"],[9374,5,"(c)"],[9375,5,"(d)"],[9376,5,"(e)"],[9377,5,"(f)"],[9378,5,"(g)"],[9379,5,"(h)"],[9380,5,"(i)"],[9381,5,"(j)"],[9382,5,"(k)"],[9383,5,"(l)"],[9384,5,"(m)"],[9385,5,"(n)"],[9386,5,"(o)"],[9387,5,"(p)"],[9388,5,"(q)"],[9389,5,"(r)"],[9390,5,"(s)"],[9391,5,"(t)"],[9392,5,"(u)"],[9393,5,"(v)"],[9394,5,"(w)"],[9395,5,"(x)"],[9396,5,"(y)"],[9397,5,"(z)"],[9398,1,"a"],[9399,1,"b"],[9400,1,"c"],[9401,1,"d"],[9402,1,"e"],[9403,1,"f"],[9404,1,"g"],[9405,1,"h"],[9406,1,"i"],[9407,1,"j"],[9408,1,"k"],[9409,1,"l"],[9410,1,"m"],[9411,1,"n"],[9412,1,"o"],[9413,1,"p"],[9414,1,"q"],[9415,1,"r"],[9416,1,"s"],[9417,1,"t"],[9418,1,"u"],[9419,1,"v"],[9420,1,"w"],[9421,1,"x"],[9422,1,"y"],[9423,1,"z"],[9424,1,"a"],[9425,1,"b"],[9426,1,"c"],[9427,1,"d"],[9428,1,"e"],[9429,1,"f"],[9430,1,"g"],[9431,1,"h"],[9432,1,"i"],[9433,1,"j"],[9434,1,"k"],[9435,1,"l"],[9436,1,"m"],[9437,1,"n"],[9438,1,"o"],[9439,1,"p"],[9440,1,"q"],[9441,1,"r"],[9442,1,"s"],[9443,1,"t"],[9444,1,"u"],[9445,1,"v"],[9446,1,"w"],[9447,1,"x"],[9448,1,"y"],[9449,1,"z"],[9450,1,"0"],[[9451,9470],2],[9471,2],[[9472,9621],2],[[9622,9631],2],[[9632,9711],2],[[9712,9719],2],[[9720,9727],2],[[9728,9747],2],[[9748,9749],2],[[9750,9751],2],[9752,2],[9753,2],[[9754,9839],2],[[9840,9841],2],[[9842,9853],2],[[9854,9855],2],[[9856,9865],2],[[9866,9873],2],[[9874,9884],2],[9885,2],[[9886,9887],2],[[9888,9889],2],[[9890,9905],2],[9906,2],[[9907,9916],2],[[9917,9919],2],[[9920,9923],2],[[9924,9933],2],[9934,2],[[9935,9953],2],[9954,2],[9955,2],[[9956,9959],2],[[9960,9983],2],[9984,2],[[9985,9988],2],[9989,2],[[9990,9993],2],[[9994,9995],2],[[9996,10023],2],[10024,2],[[10025,10059],2],[10060,2],[10061,2],[10062,2],[[10063,10066],2],[[10067,10069],2],[10070,2],[10071,2],[[10072,10078],2],[[10079,10080],2],[[10081,10087],2],[[10088,10101],2],[[10102,10132],2],[[10133,10135],2],[[10136,10159],2],[10160,2],[[10161,10174],2],[10175,2],[[10176,10182],2],[[10183,10186],2],[10187,2],[10188,2],[10189,2],[[10190,10191],2],[[10192,10219],2],[[10220,10223],2],[[10224,10239],2],[[10240,10495],2],[[10496,10763],2],[10764,1,"∫∫∫∫"],[[10765,10867],2],[10868,5,"::="],[10869,5,"=="],[10870,5,"==="],[[10871,10971],2],[10972,1,"⫝̸"],[[10973,11007],2],[[11008,11021],2],[[11022,11027],2],[[11028,11034],2],[[11035,11039],2],[[11040,11043],2],[[11044,11084],2],[[11085,11087],2],[[11088,11092],2],[[11093,11097],2],[[11098,11123],2],[[11124,11125],3],[[11126,11157],2],[11158,3],[11159,2],[[11160,11193],2],[[11194,11196],2],[[11197,11208],2],[11209,2],[[11210,11217],2],[11218,2],[[11219,11243],2],[[11244,11247],2],[[11248,11262],2],[11263,2],[11264,1,"ⰰ"],[11265,1,"ⰱ"],[11266,1,"ⰲ"],[11267,1,"ⰳ"],[11268,1,"ⰴ"],[11269,1,"ⰵ"],[11270,1,"ⰶ"],[11271,1,"ⰷ"],[11272,1,"ⰸ"],[11273,1,"ⰹ"],[11274,1,"ⰺ"],[11275,1,"ⰻ"],[11276,1,"ⰼ"],[11277,1,"ⰽ"],[11278,1,"ⰾ"],[11279,1,"ⰿ"],[11280,1,"ⱀ"],[11281,1,"ⱁ"],[11282,1,"ⱂ"],[11283,1,"ⱃ"],[11284,1,"ⱄ"],[11285,1,"ⱅ"],[11286,1,"ⱆ"],[11287,1,"ⱇ"],[11288,1,"ⱈ"],[11289,1,"ⱉ"],[11290,1,"ⱊ"],[11291,1,"ⱋ"],[11292,1,"ⱌ"],[11293,1,"ⱍ"],[11294,1,"ⱎ"],[11295,1,"ⱏ"],[11296,1,"ⱐ"],[11297,1,"ⱑ"],[11298,1,"ⱒ"],[11299,1,"ⱓ"],[11300,1,"ⱔ"],[11301,1,"ⱕ"],[11302,1,"ⱖ"],[11303,1,"ⱗ"],[11304,1,"ⱘ"],[11305,1,"ⱙ"],[11306,1,"ⱚ"],[11307,1,"ⱛ"],[11308,1,"ⱜ"],[11309,1,"ⱝ"],[11310,1,"ⱞ"],[11311,3],[[11312,11358],2],[11359,3],[11360,1,"ⱡ"],[11361,2],[11362,1,"ɫ"],[11363,1,"ᵽ"],[11
364,1,"ɽ"],[[11365,11366],2],[11367,1,"ⱨ"],[11368,2],[11369,1,"ⱪ"],[11370,2],[11371,1,"ⱬ"],[11372,2],[11373,1,"ɑ"],[11374,1,"ɱ"],[11375,1,"ɐ"],[11376,1,"ɒ"],[11377,2],[11378,1,"ⱳ"],[11379,2],[11380,2],[11381,1,"ⱶ"],[[11382,11383],2],[[11384,11387],2],[11388,1,"j"],[11389,1,"v"],[11390,1,"ȿ"],[11391,1,"ɀ"],[11392,1,"ⲁ"],[11393,2],[11394,1,"ⲃ"],[11395,2],[11396,1,"ⲅ"],[11397,2],[11398,1,"ⲇ"],[11399,2],[11400,1,"ⲉ"],[11401,2],[11402,1,"ⲋ"],[11403,2],[11404,1,"ⲍ"],[11405,2],[11406,1,"ⲏ"],[11407,2],[11408,1,"ⲑ"],[11409,2],[11410,1,"ⲓ"],[11411,2],[11412,1,"ⲕ"],[11413,2],[11414,1,"ⲗ"],[11415,2],[11416,1,"ⲙ"],[11417,2],[11418,1,"ⲛ"],[11419,2],[11420,1,"ⲝ"],[11421,2],[11422,1,"ⲟ"],[11423,2],[11424,1,"ⲡ"],[11425,2],[11426,1,"ⲣ"],[11427,2],[11428,1,"ⲥ"],[11429,2],[11430,1,"ⲧ"],[11431,2],[11432,1,"ⲩ"],[11433,2],[11434,1,"ⲫ"],[11435,2],[11436,1,"ⲭ"],[11437,2],[11438,1,"ⲯ"],[11439,2],[11440,1,"ⲱ"],[11441,2],[11442,1,"ⲳ"],[11443,2],[11444,1,"ⲵ"],[11445,2],[11446,1,"ⲷ"],[11447,2],[11448,1,"ⲹ"],[11449,2],[11450,1,"ⲻ"],[11451,2],[11452,1,"ⲽ"],[11453,2],[11454,1,"ⲿ"],[11455,2],[11456,1,"ⳁ"],[11457,2],[11458,1,"ⳃ"],[11459,2],[11460,1,"ⳅ"],[11461,2],[11462,1,"ⳇ"],[11463,2],[11464,1,"ⳉ"],[11465,2],[11466,1,"ⳋ"],[11467,2],[11468,1,"ⳍ"],[11469,2],[11470,1,"ⳏ"],[11471,2],[11472,1,"ⳑ"],[11473,2],[11474,1,"ⳓ"],[11475,2],[11476,1,"ⳕ"],[11477,2],[11478,1,"ⳗ"],[11479,2],[11480,1,"ⳙ"],[11481,2],[11482,1,"ⳛ"],[11483,2],[11484,1,"ⳝ"],[11485,2],[11486,1,"ⳟ"],[11487,2],[11488,1,"ⳡ"],[11489,2],[11490,1,"ⳣ"],[[11491,11492],2],[[11493,11498],2],[11499,1,"ⳬ"],[11500,2],[11501,1,"ⳮ"],[[11502,11505],2],[11506,1,"ⳳ"],[11507,2],[[11508,11512],3],[[11513,11519],2],[[11520,11557],2],[11558,3],[11559,2],[[11560,11564],3],[11565,2],[[11566,11567],3],[[11568,11621],2],[[11622,11623],2],[[11624,11630],3],[11631,1,"ⵡ"],[11632,2],[[11633,11646],3],[11647,2],[[11648,11670],2],[[11671,11679],3],[[11680,11686],2],[11687,3],[[11688,11694],2],[11695,3],[[11696,11702],2],[11703,3],[[11704,11710],2],[11711,3],[[11712,11718],2],[11719,3],[[11720,11726],2],[11727,3],[[11728,11734],2],[11735,3],[[11736,11742],2],[11743,3],[[11744,11775],2],[[11776,11799],2],[[11800,11803],2],[[11804,11805],2],[[11806,11822],2],[11823,2],[11824,2],[11825,2],[[11826,11835],2],[[11836,11842],2],[[11843,11844],2],[[11845,11849],2],[[11850,11854],2],[11855,2],[[11856,11858],2],[[11859,11903],3],[[11904,11929],2],[11930,3],[[11931,11934],2],[11935,1,"母"],[[11936,12018],2],[12019,1,"龟"],[[12020,12031],3],[12032,1,"一"],[12033,1,"丨"],[12034,1,"丶"],[12035,1,"丿"],[12036,1,"乙"],[12037,1,"亅"],[12038,1,"二"],[12039,1,"亠"],[12040,1,"人"],[12041,1,"儿"],[12042,1,"入"],[12043,1,"八"],[12044,1,"冂"],[12045,1,"冖"],[12046,1,"冫"],[12047,1,"几"],[12048,1,"凵"],[12049,1,"刀"],[12050,1,"力"],[12051,1,"勹"],[12052,1,"匕"],[12053,1,"匚"],[12054,1,"匸"],[12055,1,"十"],[12056,1,"卜"],[12057,1,"卩"],[12058,1,"厂"],[12059,1,"厶"],[12060,1,"又"],[12061,1,"口"],[12062,1,"囗"],[12063,1,"土"],[12064,1,"士"],[12065,1,"夂"],[12066,1,"夊"],[12067,1,"夕"],[12068,1,"大"],[12069,1,"女"],[12070,1,"子"],[12071,1,"宀"],[12072,1,"寸"],[12073,1,"小"],[12074,1,"尢"],[12075,1,"尸"],[12076,1,"屮"],[12077,1,"山"],[12078,1,"巛"],[12079,1,"工"],[12080,1,"己"],[12081,1,"巾"],[12082,1,"干"],[12083,1,"幺"],[12084,1,"广"],[12085,1,"廴"],[12086,1,"廾"],[12087,1,"弋"],[12088,1,"弓"],[12089,1,"彐"],[12090,1,"彡"],[12091,1,"彳"],[12092,1,"心"],[12093,1,"戈"],[12094,1,"戶"],[12095,1,"手"],[12096,1,"支"],[12097,1,"攴"],[12098,1,"文"],[12099,1,"斗"],[12100,1,"斤"],[12101,1,"方"],[12102,1,"无"],[12103,1,"日"],[12104,1,"曰"],[12105,1,"月"],[12106,1,"木"],[12107,1,"欠"],[12108,1,"止"],[12109,1,"歹"
],[12110,1,"殳"],[12111,1,"毋"],[12112,1,"比"],[12113,1,"毛"],[12114,1,"氏"],[12115,1,"气"],[12116,1,"水"],[12117,1,"火"],[12118,1,"爪"],[12119,1,"父"],[12120,1,"爻"],[12121,1,"爿"],[12122,1,"片"],[12123,1,"牙"],[12124,1,"牛"],[12125,1,"犬"],[12126,1,"玄"],[12127,1,"玉"],[12128,1,"瓜"],[12129,1,"瓦"],[12130,1,"甘"],[12131,1,"生"],[12132,1,"用"],[12133,1,"田"],[12134,1,"疋"],[12135,1,"疒"],[12136,1,"癶"],[12137,1,"白"],[12138,1,"皮"],[12139,1,"皿"],[12140,1,"目"],[12141,1,"矛"],[12142,1,"矢"],[12143,1,"石"],[12144,1,"示"],[12145,1,"禸"],[12146,1,"禾"],[12147,1,"穴"],[12148,1,"立"],[12149,1,"竹"],[12150,1,"米"],[12151,1,"糸"],[12152,1,"缶"],[12153,1,"网"],[12154,1,"羊"],[12155,1,"羽"],[12156,1,"老"],[12157,1,"而"],[12158,1,"耒"],[12159,1,"耳"],[12160,1,"聿"],[12161,1,"肉"],[12162,1,"臣"],[12163,1,"自"],[12164,1,"至"],[12165,1,"臼"],[12166,1,"舌"],[12167,1,"舛"],[12168,1,"舟"],[12169,1,"艮"],[12170,1,"色"],[12171,1,"艸"],[12172,1,"虍"],[12173,1,"虫"],[12174,1,"血"],[12175,1,"行"],[12176,1,"衣"],[12177,1,"襾"],[12178,1,"見"],[12179,1,"角"],[12180,1,"言"],[12181,1,"谷"],[12182,1,"豆"],[12183,1,"豕"],[12184,1,"豸"],[12185,1,"貝"],[12186,1,"赤"],[12187,1,"走"],[12188,1,"足"],[12189,1,"身"],[12190,1,"車"],[12191,1,"辛"],[12192,1,"辰"],[12193,1,"辵"],[12194,1,"邑"],[12195,1,"酉"],[12196,1,"釆"],[12197,1,"里"],[12198,1,"金"],[12199,1,"長"],[12200,1,"門"],[12201,1,"阜"],[12202,1,"隶"],[12203,1,"隹"],[12204,1,"雨"],[12205,1,"靑"],[12206,1,"非"],[12207,1,"面"],[12208,1,"革"],[12209,1,"韋"],[12210,1,"韭"],[12211,1,"音"],[12212,1,"頁"],[12213,1,"風"],[12214,1,"飛"],[12215,1,"食"],[12216,1,"首"],[12217,1,"香"],[12218,1,"馬"],[12219,1,"骨"],[12220,1,"高"],[12221,1,"髟"],[12222,1,"鬥"],[12223,1,"鬯"],[12224,1,"鬲"],[12225,1,"鬼"],[12226,1,"魚"],[12227,1,"鳥"],[12228,1,"鹵"],[12229,1,"鹿"],[12230,1,"麥"],[12231,1,"麻"],[12232,1,"黃"],[12233,1,"黍"],[12234,1,"黑"],[12235,1,"黹"],[12236,1,"黽"],[12237,1,"鼎"],[12238,1,"鼓"],[12239,1,"鼠"],[12240,1,"鼻"],[12241,1,"齊"],[12242,1,"齒"],[12243,1,"龍"],[12244,1,"龜"],[12245,1,"龠"],[[12246,12271],3],[[12272,12283],3],[[12284,12287],3],[12288,5," "],[12289,2],[12290,1,"."],[[12291,12292],2],[[12293,12295],2],[[12296,12329],2],[[12330,12333],2],[[12334,12341],2],[12342,1,"〒"],[12343,2],[12344,1,"十"],[12345,1,"卄"],[12346,1,"卅"],[12347,2],[12348,2],[12349,2],[12350,2],[12351,2],[12352,3],[[12353,12436],2],[[12437,12438],2],[[12439,12440],3],[[12441,12442],2],[12443,5," ゙"],[12444,5," 
゚"],[[12445,12446],2],[12447,1,"より"],[12448,2],[[12449,12542],2],[12543,1,"コト"],[[12544,12548],3],[[12549,12588],2],[12589,2],[12590,2],[12591,2],[12592,3],[12593,1,"ᄀ"],[12594,1,"ᄁ"],[12595,1,"ᆪ"],[12596,1,"ᄂ"],[12597,1,"ᆬ"],[12598,1,"ᆭ"],[12599,1,"ᄃ"],[12600,1,"ᄄ"],[12601,1,"ᄅ"],[12602,1,"ᆰ"],[12603,1,"ᆱ"],[12604,1,"ᆲ"],[12605,1,"ᆳ"],[12606,1,"ᆴ"],[12607,1,"ᆵ"],[12608,1,"ᄚ"],[12609,1,"ᄆ"],[12610,1,"ᄇ"],[12611,1,"ᄈ"],[12612,1,"ᄡ"],[12613,1,"ᄉ"],[12614,1,"ᄊ"],[12615,1,"ᄋ"],[12616,1,"ᄌ"],[12617,1,"ᄍ"],[12618,1,"ᄎ"],[12619,1,"ᄏ"],[12620,1,"ᄐ"],[12621,1,"ᄑ"],[12622,1,"ᄒ"],[12623,1,"ᅡ"],[12624,1,"ᅢ"],[12625,1,"ᅣ"],[12626,1,"ᅤ"],[12627,1,"ᅥ"],[12628,1,"ᅦ"],[12629,1,"ᅧ"],[12630,1,"ᅨ"],[12631,1,"ᅩ"],[12632,1,"ᅪ"],[12633,1,"ᅫ"],[12634,1,"ᅬ"],[12635,1,"ᅭ"],[12636,1,"ᅮ"],[12637,1,"ᅯ"],[12638,1,"ᅰ"],[12639,1,"ᅱ"],[12640,1,"ᅲ"],[12641,1,"ᅳ"],[12642,1,"ᅴ"],[12643,1,"ᅵ"],[12644,3],[12645,1,"ᄔ"],[12646,1,"ᄕ"],[12647,1,"ᇇ"],[12648,1,"ᇈ"],[12649,1,"ᇌ"],[12650,1,"ᇎ"],[12651,1,"ᇓ"],[12652,1,"ᇗ"],[12653,1,"ᇙ"],[12654,1,"ᄜ"],[12655,1,"ᇝ"],[12656,1,"ᇟ"],[12657,1,"ᄝ"],[12658,1,"ᄞ"],[12659,1,"ᄠ"],[12660,1,"ᄢ"],[12661,1,"ᄣ"],[12662,1,"ᄧ"],[12663,1,"ᄩ"],[12664,1,"ᄫ"],[12665,1,"ᄬ"],[12666,1,"ᄭ"],[12667,1,"ᄮ"],[12668,1,"ᄯ"],[12669,1,"ᄲ"],[12670,1,"ᄶ"],[12671,1,"ᅀ"],[12672,1,"ᅇ"],[12673,1,"ᅌ"],[12674,1,"ᇱ"],[12675,1,"ᇲ"],[12676,1,"ᅗ"],[12677,1,"ᅘ"],[12678,1,"ᅙ"],[12679,1,"ᆄ"],[12680,1,"ᆅ"],[12681,1,"ᆈ"],[12682,1,"ᆑ"],[12683,1,"ᆒ"],[12684,1,"ᆔ"],[12685,1,"ᆞ"],[12686,1,"ᆡ"],[12687,3],[[12688,12689],2],[12690,1,"一"],[12691,1,"二"],[12692,1,"三"],[12693,1,"四"],[12694,1,"上"],[12695,1,"中"],[12696,1,"下"],[12697,1,"甲"],[12698,1,"乙"],[12699,1,"丙"],[12700,1,"丁"],[12701,1,"天"],[12702,1,"地"],[12703,1,"人"],[[12704,12727],2],[[12728,12730],2],[[12731,12735],2],[[12736,12751],2],[[12752,12771],2],[[12772,12783],3],[[12784,12799],2],[12800,5,"(ᄀ)"],[12801,5,"(ᄂ)"],[12802,5,"(ᄃ)"],[12803,5,"(ᄅ)"],[12804,5,"(ᄆ)"],[12805,5,"(ᄇ)"],[12806,5,"(ᄉ)"],[12807,5,"(ᄋ)"],[12808,5,"(ᄌ)"],[12809,5,"(ᄎ)"],[12810,5,"(ᄏ)"],[12811,5,"(ᄐ)"],[12812,5,"(ᄑ)"],[12813,5,"(ᄒ)"],[12814,5,"(가)"],[12815,5,"(나)"],[12816,5,"(다)"],[12817,5,"(라)"],[12818,5,"(마)"],[12819,5,"(바)"],[12820,5,"(사)"],[12821,5,"(아)"],[12822,5,"(자)"],[12823,5,"(차)"],[12824,5,"(카)"],[12825,5,"(타)"],[12826,5,"(파)"],[12827,5,"(하)"],[12828,5,"(주)"],[12829,5,"(오전)"],[12830,5,"(오후)"],[12831,3],[12832,5,"(一)"],[12833,5,"(二)"],[12834,5,"(三)"],[12835,5,"(四)"],[12836,5,"(五)"],[12837,5,"(六)"],[12838,5,"(七)"],[12839,5,"(八)"],[12840,5,"(九)"],[12841,5,"(十)"],[12842,5,"(月)"],[12843,5,"(火)"],[12844,5,"(水)"],[12845,5,"(木)"],[12846,5,"(金)"],[12847,5,"(土)"],[12848,5,"(日)"],[12849,5,"(株)"],[12850,5,"(有)"],[12851,5,"(社)"],[12852,5,"(名)"],[12853,5,"(特)"],[12854,5,"(財)"],[12855,5,"(祝)"],[12856,5,"(労)"],[12857,5,"(代)"],[12858,5,"(呼)"],[12859,5,"(学)"],[12860,5,"(監)"],[12861,5,"(企)"],[12862,5,"(資)"],[12863,5,"(協)"],[12864,5,"(祭)"],[12865,5,"(休)"],[12866,5,"(自)"],[12867,5,"(至)"],[12868,1,"問"],[12869,1,"幼"],[12870,1,"文"],[12871,1,"箏"],[[12872,12879],2],[12880,1,"pte"],[12881,1,"21"],[12882,1,"22"],[12883,1,"23"],[12884,1,"24"],[12885,1,"25"],[12886,1,"26"],[12887,1,"27"],[12888,1,"28"],[12889,1,"29"],[12890,1,"30"],[12891,1,"31"],[12892,1,"32"],[12893,1,"33"],[12894,1,"34"],[12895,1,"35"],[12896,1,"ᄀ"],[12897,1,"ᄂ"],[12898,1,"ᄃ"],[12899,1,"ᄅ"],[12900,1,"ᄆ"],[12901,1,"ᄇ"],[12902,1,"ᄉ"],[12903,1,"ᄋ"],[12904,1,"ᄌ"],[12905,1,"ᄎ"],[12906,1,"ᄏ"],[12907,1,"ᄐ"],[12908,1,"ᄑ"],[12909,1,"ᄒ"],[12910,1,"가"],[12911,1,"나"],[12912,1,"다"],[12913,1,"라"],[12914,1,"마"],[12915,1,"바"],[12916,1,"사"],[12917,1,"아"],[12918,1,"자"],[12919,1,"차"],[1
2920,1,"카"],[12921,1,"타"],[12922,1,"파"],[12923,1,"하"],[12924,1,"참고"],[12925,1,"주의"],[12926,1,"우"],[12927,2],[12928,1,"一"],[12929,1,"二"],[12930,1,"三"],[12931,1,"四"],[12932,1,"五"],[12933,1,"六"],[12934,1,"七"],[12935,1,"八"],[12936,1,"九"],[12937,1,"十"],[12938,1,"月"],[12939,1,"火"],[12940,1,"水"],[12941,1,"木"],[12942,1,"金"],[12943,1,"土"],[12944,1,"日"],[12945,1,"株"],[12946,1,"有"],[12947,1,"社"],[12948,1,"名"],[12949,1,"特"],[12950,1,"財"],[12951,1,"祝"],[12952,1,"労"],[12953,1,"秘"],[12954,1,"男"],[12955,1,"女"],[12956,1,"適"],[12957,1,"優"],[12958,1,"印"],[12959,1,"注"],[12960,1,"項"],[12961,1,"休"],[12962,1,"写"],[12963,1,"正"],[12964,1,"上"],[12965,1,"中"],[12966,1,"下"],[12967,1,"左"],[12968,1,"右"],[12969,1,"医"],[12970,1,"宗"],[12971,1,"学"],[12972,1,"監"],[12973,1,"企"],[12974,1,"資"],[12975,1,"協"],[12976,1,"夜"],[12977,1,"36"],[12978,1,"37"],[12979,1,"38"],[12980,1,"39"],[12981,1,"40"],[12982,1,"41"],[12983,1,"42"],[12984,1,"43"],[12985,1,"44"],[12986,1,"45"],[12987,1,"46"],[12988,1,"47"],[12989,1,"48"],[12990,1,"49"],[12991,1,"50"],[12992,1,"1月"],[12993,1,"2月"],[12994,1,"3月"],[12995,1,"4月"],[12996,1,"5月"],[12997,1,"6月"],[12998,1,"7月"],[12999,1,"8月"],[13000,1,"9月"],[13001,1,"10月"],[13002,1,"11月"],[13003,1,"12月"],[13004,1,"hg"],[13005,1,"erg"],[13006,1,"ev"],[13007,1,"ltd"],[13008,1,"ア"],[13009,1,"イ"],[13010,1,"ウ"],[13011,1,"エ"],[13012,1,"オ"],[13013,1,"カ"],[13014,1,"キ"],[13015,1,"ク"],[13016,1,"ケ"],[13017,1,"コ"],[13018,1,"サ"],[13019,1,"シ"],[13020,1,"ス"],[13021,1,"セ"],[13022,1,"ソ"],[13023,1,"タ"],[13024,1,"チ"],[13025,1,"ツ"],[13026,1,"テ"],[13027,1,"ト"],[13028,1,"ナ"],[13029,1,"ニ"],[13030,1,"ヌ"],[13031,1,"ネ"],[13032,1,"ノ"],[13033,1,"ハ"],[13034,1,"ヒ"],[13035,1,"フ"],[13036,1,"ヘ"],[13037,1,"ホ"],[13038,1,"マ"],[13039,1,"ミ"],[13040,1,"ム"],[13041,1,"メ"],[13042,1,"モ"],[13043,1,"ヤ"],[13044,1,"ユ"],[13045,1,"ヨ"],[13046,1,"ラ"],[13047,1,"リ"],[13048,1,"ル"],[13049,1,"レ"],[13050,1,"ロ"],[13051,1,"ワ"],[13052,1,"ヰ"],[13053,1,"ヱ"],[13054,1,"ヲ"],[13055,1,"令和"],[13056,1,"アパート"],[13057,1,"アルファ"],[13058,1,"アンペア"],[13059,1,"アール"],[13060,1,"イニング"],[13061,1,"インチ"],[13062,1,"ウォン"],[13063,1,"エスクード"],[13064,1,"エーカー"],[13065,1,"オンス"],[13066,1,"オーム"],[13067,1,"カイリ"],[13068,1,"カラット"],[13069,1,"カロリー"],[13070,1,"ガロン"],[13071,1,"ガンマ"],[13072,1,"ギガ"],[13073,1,"ギニー"],[13074,1,"キュリー"],[13075,1,"ギルダー"],[13076,1,"キロ"],[13077,1,"キログラム"],[13078,1,"キロメートル"],[13079,1,"キロワット"],[13080,1,"グラム"],[13081,1,"グラムトン"],[13082,1,"クルゼイロ"],[13083,1,"クローネ"],[13084,1,"ケース"],[13085,1,"コルナ"],[13086,1,"コーポ"],[13087,1,"サイクル"],[13088,1,"サンチーム"],[13089,1,"シリング"],[13090,1,"センチ"],[13091,1,"セント"],[13092,1,"ダース"],[13093,1,"デシ"],[13094,1,"ドル"],[13095,1,"トン"],[13096,1,"ナノ"],[13097,1,"ノット"],[13098,1,"ハイツ"],[13099,1,"パーセント"],[13100,1,"パーツ"],[13101,1,"バーレル"],[13102,1,"ピアストル"],[13103,1,"ピクル"],[13104,1,"ピコ"],[13105,1,"ビル"],[13106,1,"ファラッド"],[13107,1,"フィート"],[13108,1,"ブッシェル"],[13109,1,"フラン"],[13110,1,"ヘクタール"],[13111,1,"ペソ"],[13112,1,"ペニヒ"],[13113,1,"ヘルツ"],[13114,1,"ペンス"],[13115,1,"ページ"],[13116,1,"ベータ"],[13117,1,"ポイント"],[13118,1,"ボルト"],[13119,1,"ホン"],[13120,1,"ポンド"],[13121,1,"ホール"],[13122,1,"ホーン"],[13123,1,"マイクロ"],[13124,1,"マイル"],[13125,1,"マッハ"],[13126,1,"マルク"],[13127,1,"マンション"],[13128,1,"ミクロン"],[13129,1,"ミリ"],[13130,1,"ミリバール"],[13131,1,"メガ"],[13132,1,"メガトン"],[13133,1,"メートル"],[13134,1,"ヤード"],[13135,1,"ヤール"],[13136,1,"ユアン"],[13137,1,"リットル"],[13138,1,"リラ"],[13139,1,"ルピー"],[13140,1,"ルーブル"],[13141,1,"レム"],[13142,1,"レントゲン"],[13143,1,"ワット"],[13144,1,"0点"],[13145,1,"1点"],[13146,1,"2点"],[13147,1,"3点"],[13148,1,"4点"],[13149,1,"5点"],[13150,1,"6点"],[13151,1,"7点"],[13152,1,"8点"],[13153,1,"9点"],[13154,1,"10点"],[13155
,1,"11点"],[13156,1,"12点"],[13157,1,"13点"],[13158,1,"14点"],[13159,1,"15点"],[13160,1,"16点"],[13161,1,"17点"],[13162,1,"18点"],[13163,1,"19点"],[13164,1,"20点"],[13165,1,"21点"],[13166,1,"22点"],[13167,1,"23点"],[13168,1,"24点"],[13169,1,"hpa"],[13170,1,"da"],[13171,1,"au"],[13172,1,"bar"],[13173,1,"ov"],[13174,1,"pc"],[13175,1,"dm"],[13176,1,"dm2"],[13177,1,"dm3"],[13178,1,"iu"],[13179,1,"平成"],[13180,1,"昭和"],[13181,1,"大正"],[13182,1,"明治"],[13183,1,"株式会社"],[13184,1,"pa"],[13185,1,"na"],[13186,1,"μa"],[13187,1,"ma"],[13188,1,"ka"],[13189,1,"kb"],[13190,1,"mb"],[13191,1,"gb"],[13192,1,"cal"],[13193,1,"kcal"],[13194,1,"pf"],[13195,1,"nf"],[13196,1,"μf"],[13197,1,"μg"],[13198,1,"mg"],[13199,1,"kg"],[13200,1,"hz"],[13201,1,"khz"],[13202,1,"mhz"],[13203,1,"ghz"],[13204,1,"thz"],[13205,1,"μl"],[13206,1,"ml"],[13207,1,"dl"],[13208,1,"kl"],[13209,1,"fm"],[13210,1,"nm"],[13211,1,"μm"],[13212,1,"mm"],[13213,1,"cm"],[13214,1,"km"],[13215,1,"mm2"],[13216,1,"cm2"],[13217,1,"m2"],[13218,1,"km2"],[13219,1,"mm3"],[13220,1,"cm3"],[13221,1,"m3"],[13222,1,"km3"],[13223,1,"m∕s"],[13224,1,"m∕s2"],[13225,1,"pa"],[13226,1,"kpa"],[13227,1,"mpa"],[13228,1,"gpa"],[13229,1,"rad"],[13230,1,"rad∕s"],[13231,1,"rad∕s2"],[13232,1,"ps"],[13233,1,"ns"],[13234,1,"μs"],[13235,1,"ms"],[13236,1,"pv"],[13237,1,"nv"],[13238,1,"μv"],[13239,1,"mv"],[13240,1,"kv"],[13241,1,"mv"],[13242,1,"pw"],[13243,1,"nw"],[13244,1,"μw"],[13245,1,"mw"],[13246,1,"kw"],[13247,1,"mw"],[13248,1,"kω"],[13249,1,"mω"],[13250,3],[13251,1,"bq"],[13252,1,"cc"],[13253,1,"cd"],[13254,1,"c∕kg"],[13255,3],[13256,1,"db"],[13257,1,"gy"],[13258,1,"ha"],[13259,1,"hp"],[13260,1,"in"],[13261,1,"kk"],[13262,1,"km"],[13263,1,"kt"],[13264,1,"lm"],[13265,1,"ln"],[13266,1,"log"],[13267,1,"lx"],[13268,1,"mb"],[13269,1,"mil"],[13270,1,"mol"],[13271,1,"ph"],[13272,3],[13273,1,"ppm"],[13274,1,"pr"],[13275,1,"sr"],[13276,1,"sv"],[13277,1,"wb"],[13278,1,"v∕m"],[13279,1,"a∕m"],[13280,1,"1日"],[13281,1,"2日"],[13282,1,"3日"],[13283,1,"4日"],[13284,1,"5日"],[13285,1,"6日"],[13286,1,"7日"],[13287,1,"8日"],[13288,1,"9日"],[13289,1,"10日"],[13290,1,"11日"],[13291,1,"12日"],[13292,1,"13日"],[13293,1,"14日"],[13294,1,"15日"],[13295,1,"16日"],[13296,1,"17日"],[13297,1,"18日"],[13298,1,"19日"],[13299,1,"20日"],[13300,1,"21日"],[13301,1,"22日"],[13302,1,"23日"],[13303,1,"24日"],[13304,1,"25日"],[13305,1,"26日"],[13306,1,"27日"],[13307,1,"28日"],[13308,1,"29日"],[13309,1,"30日"],[13310,1,"31日"],[13311,1,"gal"],[[13312,19893],2],[[19894,19903],2],[[19904,19967],2],[[19968,40869],2],[[40870,40891],2],[[40892,40899],2],[[40900,40907],2],[40908,2],[[40909,40917],2],[[40918,40938],2],[[40939,40943],2],[[40944,40956],2],[[40957,40959],3],[[40960,42124],2],[[42125,42127],3],[[42128,42145],2],[[42146,42147],2],[[42148,42163],2],[42164,2],[[42165,42176],2],[42177,2],[[42178,42180],2],[42181,2],[42182,2],[[42183,42191],3],[[42192,42237],2],[[42238,42239],2],[[42240,42508],2],[[42509,42511],2],[[42512,42539],2],[[42540,42559],3],[42560,1,"ꙁ"],[42561,2],[42562,1,"ꙃ"],[42563,2],[42564,1,"ꙅ"],[42565,2],[42566,1,"ꙇ"],[42567,2],[42568,1,"ꙉ"],[42569,2],[42570,1,"ꙋ"],[42571,2],[42572,1,"ꙍ"],[42573,2],[42574,1,"ꙏ"],[42575,2],[42576,1,"ꙑ"],[42577,2],[42578,1,"ꙓ"],[42579,2],[42580,1,"ꙕ"],[42581,2],[42582,1,"ꙗ"],[42583,2],[42584,1,"ꙙ"],[42585,2],[42586,1,"ꙛ"],[42587,2],[42588,1,"ꙝ"],[42589,2],[42590,1,"ꙟ"],[42591,2],[42592,1,"ꙡ"],[42593,2],[42594,1,"ꙣ"],[42595,2],[42596,1,"ꙥ"],[42597,2],[42598,1,"ꙧ"],[42599,2],[42600,1,"ꙩ"],[42601,2],[42602,1,"ꙫ"],[42603,2],[42604,1,"ꙭ"],[[42605,42607],2],[[42608,42611],2],[[42612,42619],2],[[42620,42621],2],[42622,2],[
42623,2],[42624,1,"ꚁ"],[42625,2],[42626,1,"ꚃ"],[42627,2],[42628,1,"ꚅ"],[42629,2],[42630,1,"ꚇ"],[42631,2],[42632,1,"ꚉ"],[42633,2],[42634,1,"ꚋ"],[42635,2],[42636,1,"ꚍ"],[42637,2],[42638,1,"ꚏ"],[42639,2],[42640,1,"ꚑ"],[42641,2],[42642,1,"ꚓ"],[42643,2],[42644,1,"ꚕ"],[42645,2],[42646,1,"ꚗ"],[42647,2],[42648,1,"ꚙ"],[42649,2],[42650,1,"ꚛ"],[42651,2],[42652,1,"ъ"],[42653,1,"ь"],[42654,2],[42655,2],[[42656,42725],2],[[42726,42735],2],[[42736,42737],2],[[42738,42743],2],[[42744,42751],3],[[42752,42774],2],[[42775,42778],2],[[42779,42783],2],[[42784,42785],2],[42786,1,"ꜣ"],[42787,2],[42788,1,"ꜥ"],[42789,2],[42790,1,"ꜧ"],[42791,2],[42792,1,"ꜩ"],[42793,2],[42794,1,"ꜫ"],[42795,2],[42796,1,"ꜭ"],[42797,2],[42798,1,"ꜯ"],[[42799,42801],2],[42802,1,"ꜳ"],[42803,2],[42804,1,"ꜵ"],[42805,2],[42806,1,"ꜷ"],[42807,2],[42808,1,"ꜹ"],[42809,2],[42810,1,"ꜻ"],[42811,2],[42812,1,"ꜽ"],[42813,2],[42814,1,"ꜿ"],[42815,2],[42816,1,"ꝁ"],[42817,2],[42818,1,"ꝃ"],[42819,2],[42820,1,"ꝅ"],[42821,2],[42822,1,"ꝇ"],[42823,2],[42824,1,"ꝉ"],[42825,2],[42826,1,"ꝋ"],[42827,2],[42828,1,"ꝍ"],[42829,2],[42830,1,"ꝏ"],[42831,2],[42832,1,"ꝑ"],[42833,2],[42834,1,"ꝓ"],[42835,2],[42836,1,"ꝕ"],[42837,2],[42838,1,"ꝗ"],[42839,2],[42840,1,"ꝙ"],[42841,2],[42842,1,"ꝛ"],[42843,2],[42844,1,"ꝝ"],[42845,2],[42846,1,"ꝟ"],[42847,2],[42848,1,"ꝡ"],[42849,2],[42850,1,"ꝣ"],[42851,2],[42852,1,"ꝥ"],[42853,2],[42854,1,"ꝧ"],[42855,2],[42856,1,"ꝩ"],[42857,2],[42858,1,"ꝫ"],[42859,2],[42860,1,"ꝭ"],[42861,2],[42862,1,"ꝯ"],[42863,2],[42864,1,"ꝯ"],[[42865,42872],2],[42873,1,"ꝺ"],[42874,2],[42875,1,"ꝼ"],[42876,2],[42877,1,"ᵹ"],[42878,1,"ꝿ"],[42879,2],[42880,1,"ꞁ"],[42881,2],[42882,1,"ꞃ"],[42883,2],[42884,1,"ꞅ"],[42885,2],[42886,1,"ꞇ"],[[42887,42888],2],[[42889,42890],2],[42891,1,"ꞌ"],[42892,2],[42893,1,"ɥ"],[42894,2],[42895,2],[42896,1,"ꞑ"],[42897,2],[42898,1,"ꞓ"],[42899,2],[[42900,42901],2],[42902,1,"ꞗ"],[42903,2],[42904,1,"ꞙ"],[42905,2],[42906,1,"ꞛ"],[42907,2],[42908,1,"ꞝ"],[42909,2],[42910,1,"ꞟ"],[42911,2],[42912,1,"ꞡ"],[42913,2],[42914,1,"ꞣ"],[42915,2],[42916,1,"ꞥ"],[42917,2],[42918,1,"ꞧ"],[42919,2],[42920,1,"ꞩ"],[42921,2],[42922,1,"ɦ"],[42923,1,"ɜ"],[42924,1,"ɡ"],[42925,1,"ɬ"],[42926,1,"ɪ"],[42927,2],[42928,1,"ʞ"],[42929,1,"ʇ"],[42930,1,"ʝ"],[42931,1,"ꭓ"],[42932,1,"ꞵ"],[42933,2],[42934,1,"ꞷ"],[42935,2],[42936,1,"ꞹ"],[42937,2],[42938,1,"ꞻ"],[42939,2],[42940,1,"ꞽ"],[42941,2],[42942,1,"ꞿ"],[42943,2],[[42944,42945],3],[42946,1,"ꟃ"],[42947,2],[42948,1,"ꞔ"],[42949,1,"ʂ"],[42950,1,"ᶎ"],[42951,1,"ꟈ"],[42952,2],[42953,1,"ꟊ"],[42954,2],[[42955,42996],3],[42997,1,"ꟶ"],[42998,2],[42999,2],[43000,1,"ħ"],[43001,1,"œ"],[43002,2],[[43003,43007],2],[[43008,43047],2],[[43048,43051],2],[43052,2],[[43053,43055],3],[[43056,43065],2],[[43066,43071],3],[[43072,43123],2],[[43124,43127],2],[[43128,43135],3],[[43136,43204],2],[43205,2],[[43206,43213],3],[[43214,43215],2],[[43216,43225],2],[[43226,43231],3],[[43232,43255],2],[[43256,43258],2],[43259,2],[43260,2],[43261,2],[[43262,43263],2],[[43264,43309],2],[[43310,43311],2],[[43312,43347],2],[[43348,43358],3],[43359,2],[[43360,43388],2],[[43389,43391],3],[[43392,43456],2],[[43457,43469],2],[43470,3],[[43471,43481],2],[[43482,43485],3],[[43486,43487],2],[[43488,43518],2],[43519,3],[[43520,43574],2],[[43575,43583],3],[[43584,43597],2],[[43598,43599],3],[[43600,43609],2],[[43610,43611],3],[[43612,43615],2],[[43616,43638],2],[[43639,43641],2],[[43642,43643],2],[[43644,43647],2],[[43648,43714],2],[[43715,43738],3],[[43739,43741],2],[[43742,43743],2],[[43744,43759],2],[[43760,43761],2],[[43762,43766],2],[[43767,43776],3],[[43777,43782],2],[[43783,4378
4],3],[[43785,43790],2],[[43791,43792],3],[[43793,43798],2],[[43799,43807],3],[[43808,43814],2],[43815,3],[[43816,43822],2],[43823,3],[[43824,43866],2],[43867,2],[43868,1,"ꜧ"],[43869,1,"ꬷ"],[43870,1,"ɫ"],[43871,1,"ꭒ"],[[43872,43875],2],[[43876,43877],2],[[43878,43879],2],[43880,2],[43881,1,"ʍ"],[[43882,43883],2],[[43884,43887],3],[43888,1,"Ꭰ"],[43889,1,"Ꭱ"],[43890,1,"Ꭲ"],[43891,1,"Ꭳ"],[43892,1,"Ꭴ"],[43893,1,"Ꭵ"],[43894,1,"Ꭶ"],[43895,1,"Ꭷ"],[43896,1,"Ꭸ"],[43897,1,"Ꭹ"],[43898,1,"Ꭺ"],[43899,1,"Ꭻ"],[43900,1,"Ꭼ"],[43901,1,"Ꭽ"],[43902,1,"Ꭾ"],[43903,1,"Ꭿ"],[43904,1,"Ꮀ"],[43905,1,"Ꮁ"],[43906,1,"Ꮂ"],[43907,1,"Ꮃ"],[43908,1,"Ꮄ"],[43909,1,"Ꮅ"],[43910,1,"Ꮆ"],[43911,1,"Ꮇ"],[43912,1,"Ꮈ"],[43913,1,"Ꮉ"],[43914,1,"Ꮊ"],[43915,1,"Ꮋ"],[43916,1,"Ꮌ"],[43917,1,"Ꮍ"],[43918,1,"Ꮎ"],[43919,1,"Ꮏ"],[43920,1,"Ꮐ"],[43921,1,"Ꮑ"],[43922,1,"Ꮒ"],[43923,1,"Ꮓ"],[43924,1,"Ꮔ"],[43925,1,"Ꮕ"],[43926,1,"Ꮖ"],[43927,1,"Ꮗ"],[43928,1,"Ꮘ"],[43929,1,"Ꮙ"],[43930,1,"Ꮚ"],[43931,1,"Ꮛ"],[43932,1,"Ꮜ"],[43933,1,"Ꮝ"],[43934,1,"Ꮞ"],[43935,1,"Ꮟ"],[43936,1,"Ꮠ"],[43937,1,"Ꮡ"],[43938,1,"Ꮢ"],[43939,1,"Ꮣ"],[43940,1,"Ꮤ"],[43941,1,"Ꮥ"],[43942,1,"Ꮦ"],[43943,1,"Ꮧ"],[43944,1,"Ꮨ"],[43945,1,"Ꮩ"],[43946,1,"Ꮪ"],[43947,1,"Ꮫ"],[43948,1,"Ꮬ"],[43949,1,"Ꮭ"],[43950,1,"Ꮮ"],[43951,1,"Ꮯ"],[43952,1,"Ꮰ"],[43953,1,"Ꮱ"],[43954,1,"Ꮲ"],[43955,1,"Ꮳ"],[43956,1,"Ꮴ"],[43957,1,"Ꮵ"],[43958,1,"Ꮶ"],[43959,1,"Ꮷ"],[43960,1,"Ꮸ"],[43961,1,"Ꮹ"],[43962,1,"Ꮺ"],[43963,1,"Ꮻ"],[43964,1,"Ꮼ"],[43965,1,"Ꮽ"],[43966,1,"Ꮾ"],[43967,1,"Ꮿ"],[[43968,44010],2],[44011,2],[[44012,44013],2],[[44014,44015],3],[[44016,44025],2],[[44026,44031],3],[[44032,55203],2],[[55204,55215],3],[[55216,55238],2],[[55239,55242],3],[[55243,55291],2],[[55292,55295],3],[[55296,57343],3],[[57344,63743],3],[63744,1,"豈"],[63745,1,"更"],[63746,1,"車"],[63747,1,"賈"],[63748,1,"滑"],[63749,1,"串"],[63750,1,"句"],[[63751,63752],1,"龜"],[63753,1,"契"],[63754,1,"金"],[63755,1,"喇"],[63756,1,"奈"],[63757,1,"懶"],[63758,1,"癩"],[63759,1,"羅"],[63760,1,"蘿"],[63761,1,"螺"],[63762,1,"裸"],[63763,1,"邏"],[63764,1,"樂"],[63765,1,"洛"],[63766,1,"烙"],[63767,1,"珞"],[63768,1,"落"],[63769,1,"酪"],[63770,1,"駱"],[63771,1,"亂"],[63772,1,"卵"],[63773,1,"欄"],[63774,1,"爛"],[63775,1,"蘭"],[63776,1,"鸞"],[63777,1,"嵐"],[63778,1,"濫"],[63779,1,"藍"],[63780,1,"襤"],[63781,1,"拉"],[63782,1,"臘"],[63783,1,"蠟"],[63784,1,"廊"],[63785,1,"朗"],[63786,1,"浪"],[63787,1,"狼"],[63788,1,"郎"],[63789,1,"來"],[63790,1,"冷"],[63791,1,"勞"],[63792,1,"擄"],[63793,1,"櫓"],[63794,1,"爐"],[63795,1,"盧"],[63796,1,"老"],[63797,1,"蘆"],[63798,1,"虜"],[63799,1,"路"],[63800,1,"露"],[63801,1,"魯"],[63802,1,"鷺"],[63803,1,"碌"],[63804,1,"祿"],[63805,1,"綠"],[63806,1,"菉"],[63807,1,"錄"],[63808,1,"鹿"],[63809,1,"論"],[63810,1,"壟"],[63811,1,"弄"],[63812,1,"籠"],[63813,1,"聾"],[63814,1,"牢"],[63815,1,"磊"],[63816,1,"賂"],[63817,1,"雷"],[63818,1,"壘"],[63819,1,"屢"],[63820,1,"樓"],[63821,1,"淚"],[63822,1,"漏"],[63823,1,"累"],[63824,1,"縷"],[63825,1,"陋"],[63826,1,"勒"],[63827,1,"肋"],[63828,1,"凜"],[63829,1,"凌"],[63830,1,"稜"],[63831,1,"綾"],[63832,1,"菱"],[63833,1,"陵"],[63834,1,"讀"],[63835,1,"拏"],[63836,1,"樂"],[63837,1,"諾"],[63838,1,"丹"],[63839,1,"寧"],[63840,1,"怒"],[63841,1,"率"],[63842,1,"異"],[63843,1,"北"],[63844,1,"磻"],[63845,1,"便"],[63846,1,"復"],[63847,1,"不"],[63848,1,"泌"],[63849,1,"數"],[63850,1,"索"],[63851,1,"參"],[63852,1,"塞"],[63853,1,"省"],[63854,1,"葉"],[63855,1,"說"],[63856,1,"殺"],[63857,1,"辰"],[63858,1,"沈"],[63859,1,"拾"],[63860,1,"若"],[63861,1,"掠"],[63862,1,"略"],[63863,1,"亮"],[63864,1,"兩"],[63865,1,"凉"],[63866,1,"梁"],[63867,1,"糧"],[63868,1,"良"],[63869,1,"諒"],[63870,1,"量"],[63871,1,"勵"],[63872,1,"呂"],[63873,1,"女"],[63874,1,"廬"],[63875,1,"旅"],[63876,1,"濾"],[63
877,1,"礪"],[63878,1,"閭"],[63879,1,"驪"],[63880,1,"麗"],[63881,1,"黎"],[63882,1,"力"],[63883,1,"曆"],[63884,1,"歷"],[63885,1,"轢"],[63886,1,"年"],[63887,1,"憐"],[63888,1,"戀"],[63889,1,"撚"],[63890,1,"漣"],[63891,1,"煉"],[63892,1,"璉"],[63893,1,"秊"],[63894,1,"練"],[63895,1,"聯"],[63896,1,"輦"],[63897,1,"蓮"],[63898,1,"連"],[63899,1,"鍊"],[63900,1,"列"],[63901,1,"劣"],[63902,1,"咽"],[63903,1,"烈"],[63904,1,"裂"],[63905,1,"說"],[63906,1,"廉"],[63907,1,"念"],[63908,1,"捻"],[63909,1,"殮"],[63910,1,"簾"],[63911,1,"獵"],[63912,1,"令"],[63913,1,"囹"],[63914,1,"寧"],[63915,1,"嶺"],[63916,1,"怜"],[63917,1,"玲"],[63918,1,"瑩"],[63919,1,"羚"],[63920,1,"聆"],[63921,1,"鈴"],[63922,1,"零"],[63923,1,"靈"],[63924,1,"領"],[63925,1,"例"],[63926,1,"禮"],[63927,1,"醴"],[63928,1,"隸"],[63929,1,"惡"],[63930,1,"了"],[63931,1,"僚"],[63932,1,"寮"],[63933,1,"尿"],[63934,1,"料"],[63935,1,"樂"],[63936,1,"燎"],[63937,1,"療"],[63938,1,"蓼"],[63939,1,"遼"],[63940,1,"龍"],[63941,1,"暈"],[63942,1,"阮"],[63943,1,"劉"],[63944,1,"杻"],[63945,1,"柳"],[63946,1,"流"],[63947,1,"溜"],[63948,1,"琉"],[63949,1,"留"],[63950,1,"硫"],[63951,1,"紐"],[63952,1,"類"],[63953,1,"六"],[63954,1,"戮"],[63955,1,"陸"],[63956,1,"倫"],[63957,1,"崙"],[63958,1,"淪"],[63959,1,"輪"],[63960,1,"律"],[63961,1,"慄"],[63962,1,"栗"],[63963,1,"率"],[63964,1,"隆"],[63965,1,"利"],[63966,1,"吏"],[63967,1,"履"],[63968,1,"易"],[63969,1,"李"],[63970,1,"梨"],[63971,1,"泥"],[63972,1,"理"],[63973,1,"痢"],[63974,1,"罹"],[63975,1,"裏"],[63976,1,"裡"],[63977,1,"里"],[63978,1,"離"],[63979,1,"匿"],[63980,1,"溺"],[63981,1,"吝"],[63982,1,"燐"],[63983,1,"璘"],[63984,1,"藺"],[63985,1,"隣"],[63986,1,"鱗"],[63987,1,"麟"],[63988,1,"林"],[63989,1,"淋"],[63990,1,"臨"],[63991,1,"立"],[63992,1,"笠"],[63993,1,"粒"],[63994,1,"狀"],[63995,1,"炙"],[63996,1,"識"],[63997,1,"什"],[63998,1,"茶"],[63999,1,"刺"],[64000,1,"切"],[64001,1,"度"],[64002,1,"拓"],[64003,1,"糖"],[64004,1,"宅"],[64005,1,"洞"],[64006,1,"暴"],[64007,1,"輻"],[64008,1,"行"],[64009,1,"降"],[64010,1,"見"],[64011,1,"廓"],[64012,1,"兀"],[64013,1,"嗀"],[[64014,64015],2],[64016,1,"塚"],[64017,2],[64018,1,"晴"],[[64019,64020],2],[64021,1,"凞"],[64022,1,"猪"],[64023,1,"益"],[64024,1,"礼"],[64025,1,"神"],[64026,1,"祥"],[64027,1,"福"],[64028,1,"靖"],[64029,1,"精"],[64030,1,"羽"],[64031,2],[64032,1,"蘒"],[64033,2],[64034,1,"諸"],[[64035,64036],2],[64037,1,"逸"],[64038,1,"都"],[[64039,64041],2],[64042,1,"飯"],[64043,1,"飼"],[64044,1,"館"],[64045,1,"鶴"],[64046,1,"郞"],[64047,1,"隷"],[64048,1,"侮"],[64049,1,"僧"],[64050,1,"免"],[64051,1,"勉"],[64052,1,"勤"],[64053,1,"卑"],[64054,1,"喝"],[64055,1,"嘆"],[64056,1,"器"],[64057,1,"塀"],[64058,1,"墨"],[64059,1,"層"],[64060,1,"屮"],[64061,1,"悔"],[64062,1,"慨"],[64063,1,"憎"],[64064,1,"懲"],[64065,1,"敏"],[64066,1,"既"],[64067,1,"暑"],[64068,1,"梅"],[64069,1,"海"],[64070,1,"渚"],[64071,1,"漢"],[64072,1,"煮"],[64073,1,"爫"],[64074,1,"琢"],[64075,1,"碑"],[64076,1,"社"],[64077,1,"祉"],[64078,1,"祈"],[64079,1,"祐"],[64080,1,"祖"],[64081,1,"祝"],[64082,1,"禍"],[64083,1,"禎"],[64084,1,"穀"],[64085,1,"突"],[64086,1,"節"],[64087,1,"練"],[64088,1,"縉"],[64089,1,"繁"],[64090,1,"署"],[64091,1,"者"],[64092,1,"臭"],[[64093,64094],1,"艹"],[64095,1,"著"],[64096,1,"褐"],[64097,1,"視"],[64098,1,"謁"],[64099,1,"謹"],[64100,1,"賓"],[64101,1,"贈"],[64102,1,"辶"],[64103,1,"逸"],[64104,1,"難"],[64105,1,"響"],[64106,1,"頻"],[64107,1,"恵"],[64108,1,"𤋮"],[64109,1,"舘"],[[64110,64111],3],[64112,1,"並"],[64113,1,"况"],[64114,1,"全"],[64115,1,"侀"],[64116,1,"充"],[64117,1,"冀"],[64118,1,"勇"],[64119,1,"勺"],[64120,1,"喝"],[64121,1,"啕"],[64122,1,"喙"],[64123,1,"嗢"],[64124,1,"塚"],[64125,1,"墳"],[64126,1,"奄"],[64127,1,"奔"],[64128,1,"婢"],[64129,1,"嬨"],[64130,1,"廒"],[64131,1,"廙"],[64132,1,"彩"],[64133,1,"徭"],[64134,1,"惘"],[64135,1,"慎"],[64136,1,"愈"],
[64137,1,"憎"],[64138,1,"慠"],[64139,1,"懲"],[64140,1,"戴"],[64141,1,"揄"],[64142,1,"搜"],[64143,1,"摒"],[64144,1,"敖"],[64145,1,"晴"],[64146,1,"朗"],[64147,1,"望"],[64148,1,"杖"],[64149,1,"歹"],[64150,1,"殺"],[64151,1,"流"],[64152,1,"滛"],[64153,1,"滋"],[64154,1,"漢"],[64155,1,"瀞"],[64156,1,"煮"],[64157,1,"瞧"],[64158,1,"爵"],[64159,1,"犯"],[64160,1,"猪"],[64161,1,"瑱"],[64162,1,"甆"],[64163,1,"画"],[64164,1,"瘝"],[64165,1,"瘟"],[64166,1,"益"],[64167,1,"盛"],[64168,1,"直"],[64169,1,"睊"],[64170,1,"着"],[64171,1,"磌"],[64172,1,"窱"],[64173,1,"節"],[64174,1,"类"],[64175,1,"絛"],[64176,1,"練"],[64177,1,"缾"],[64178,1,"者"],[64179,1,"荒"],[64180,1,"華"],[64181,1,"蝹"],[64182,1,"襁"],[64183,1,"覆"],[64184,1,"視"],[64185,1,"調"],[64186,1,"諸"],[64187,1,"請"],[64188,1,"謁"],[64189,1,"諾"],[64190,1,"諭"],[64191,1,"謹"],[64192,1,"變"],[64193,1,"贈"],[64194,1,"輸"],[64195,1,"遲"],[64196,1,"醙"],[64197,1,"鉶"],[64198,1,"陼"],[64199,1,"難"],[64200,1,"靖"],[64201,1,"韛"],[64202,1,"響"],[64203,1,"頋"],[64204,1,"頻"],[64205,1,"鬒"],[64206,1,"龜"],[64207,1,"𢡊"],[64208,1,"𢡄"],[64209,1,"𣏕"],[64210,1,"㮝"],[64211,1,"䀘"],[64212,1,"䀹"],[64213,1,"𥉉"],[64214,1,"𥳐"],[64215,1,"𧻓"],[64216,1,"齃"],[64217,1,"龎"],[[64218,64255],3],[64256,1,"ff"],[64257,1,"fi"],[64258,1,"fl"],[64259,1,"ffi"],[64260,1,"ffl"],[[64261,64262],1,"st"],[[64263,64274],3],[64275,1,"մն"],[64276,1,"մե"],[64277,1,"մի"],[64278,1,"վն"],[64279,1,"մխ"],[[64280,64284],3],[64285,1,"יִ"],[64286,2],[64287,1,"ײַ"],[64288,1,"ע"],[64289,1,"א"],[64290,1,"ד"],[64291,1,"ה"],[64292,1,"כ"],[64293,1,"ל"],[64294,1,"ם"],[64295,1,"ר"],[64296,1,"ת"],[64297,5,"+"],[64298,1,"שׁ"],[64299,1,"שׂ"],[64300,1,"שּׁ"],[64301,1,"שּׂ"],[64302,1,"אַ"],[64303,1,"אָ"],[64304,1,"אּ"],[64305,1,"בּ"],[64306,1,"גּ"],[64307,1,"דּ"],[64308,1,"הּ"],[64309,1,"וּ"],[64310,1,"זּ"],[64311,3],[64312,1,"טּ"],[64313,1,"יּ"],[64314,1,"ךּ"],[64315,1,"כּ"],[64316,1,"לּ"],[64317,3],[64318,1,"מּ"],[64319,3],[64320,1,"נּ"],[64321,1,"סּ"],[64322,3],[64323,1,"ףּ"],[64324,1,"פּ"],[64325,3],[64326,1,"צּ"],[64327,1,"קּ"],[64328,1,"רּ"],[64329,1,"שּ"],[64330,1,"תּ"],[64331,1,"וֹ"],[64332,1,"בֿ"],[64333,1,"כֿ"],[64334,1,"פֿ"],[64335,1,"אל"],[[64336,64337],1,"ٱ"],[[64338,64341],1,"ٻ"],[[64342,64345],1,"پ"],[[64346,64349],1,"ڀ"],[[64350,64353],1,"ٺ"],[[64354,64357],1,"ٿ"],[[64358,64361],1,"ٹ"],[[64362,64365],1,"ڤ"],[[64366,64369],1,"ڦ"],[[64370,64373],1,"ڄ"],[[64374,64377],1,"ڃ"],[[64378,64381],1,"چ"],[[64382,64385],1,"ڇ"],[[64386,64387],1,"ڍ"],[[64388,64389],1,"ڌ"],[[64390,64391],1,"ڎ"],[[64392,64393],1,"ڈ"],[[64394,64395],1,"ژ"],[[64396,64397],1,"ڑ"],[[64398,64401],1,"ک"],[[64402,64405],1,"گ"],[[64406,64409],1,"ڳ"],[[64410,64413],1,"ڱ"],[[64414,64415],1,"ں"],[[64416,64419],1,"ڻ"],[[64420,64421],1,"ۀ"],[[64422,64425],1,"ہ"],[[64426,64429],1,"ھ"],[[64430,64431],1,"ے"],[[64432,64433],1,"ۓ"],[[64434,64449],2],[[64450,64466],3],[[64467,64470],1,"ڭ"],[[64471,64472],1,"ۇ"],[[64473,64474],1,"ۆ"],[[64475,64476],1,"ۈ"],[64477,1,"ۇٴ"],[[64478,64479],1,"ۋ"],[[64480,64481],1,"ۅ"],[[64482,64483],1,"ۉ"],[[64484,64487],1,"ې"],[[64488,64489],1,"ى"],[[64490,64491],1,"ئا"],[[64492,64493],1,"ئە"],[[64494,64495],1,"ئو"],[[64496,64497],1,"ئۇ"],[[64498,64499],1,"ئۆ"],[[64500,64501],1,"ئۈ"],[[64502,64504],1,"ئې"],[[64505,64507],1,"ئى"],[[64508,64511],1,"ی"],[64512,1,"ئج"],[64513,1,"ئح"],[64514,1,"ئم"],[64515,1,"ئى"],[64516,1,"ئي"],[64517,1,"بج"],[64518,1,"بح"],[64519,1,"بخ"],[64520,1,"بم"],[64521,1,"بى"],[64522,1,"بي"],[64523,1,"تج"],[64524,1,"تح"],[64525,1,"تخ"],[64526,1,"تم"],[64527,1,"تى"],[64528,1,"تي"],[64529,1,"ثج"],[64530,1,"ثم"],[64531,1,"ثى"],[64532,1,"ثي"],[64533,1,"جح"],[64534,1,"جم"],[6453
5,1,"حج"],[64536,1,"حم"],[64537,1,"خج"],[64538,1,"خح"],[64539,1,"خم"],[64540,1,"سج"],[64541,1,"سح"],[64542,1,"سخ"],[64543,1,"سم"],[64544,1,"صح"],[64545,1,"صم"],[64546,1,"ضج"],[64547,1,"ضح"],[64548,1,"ضخ"],[64549,1,"ضم"],[64550,1,"طح"],[64551,1,"طم"],[64552,1,"ظم"],[64553,1,"عج"],[64554,1,"عم"],[64555,1,"غج"],[64556,1,"غم"],[64557,1,"فج"],[64558,1,"فح"],[64559,1,"فخ"],[64560,1,"فم"],[64561,1,"فى"],[64562,1,"في"],[64563,1,"قح"],[64564,1,"قم"],[64565,1,"قى"],[64566,1,"قي"],[64567,1,"كا"],[64568,1,"كج"],[64569,1,"كح"],[64570,1,"كخ"],[64571,1,"كل"],[64572,1,"كم"],[64573,1,"كى"],[64574,1,"كي"],[64575,1,"لج"],[64576,1,"لح"],[64577,1,"لخ"],[64578,1,"لم"],[64579,1,"لى"],[64580,1,"لي"],[64581,1,"مج"],[64582,1,"مح"],[64583,1,"مخ"],[64584,1,"مم"],[64585,1,"مى"],[64586,1,"مي"],[64587,1,"نج"],[64588,1,"نح"],[64589,1,"نخ"],[64590,1,"نم"],[64591,1,"نى"],[64592,1,"ني"],[64593,1,"هج"],[64594,1,"هم"],[64595,1,"هى"],[64596,1,"هي"],[64597,1,"يج"],[64598,1,"يح"],[64599,1,"يخ"],[64600,1,"يم"],[64601,1,"يى"],[64602,1,"يي"],[64603,1,"ذٰ"],[64604,1,"رٰ"],[64605,1,"ىٰ"],[64606,5," ٌّ"],[64607,5," ٍّ"],[64608,5," َّ"],[64609,5," ُّ"],[64610,5," ِّ"],[64611,5," ّٰ"],[64612,1,"ئر"],[64613,1,"ئز"],[64614,1,"ئم"],[64615,1,"ئن"],[64616,1,"ئى"],[64617,1,"ئي"],[64618,1,"بر"],[64619,1,"بز"],[64620,1,"بم"],[64621,1,"بن"],[64622,1,"بى"],[64623,1,"بي"],[64624,1,"تر"],[64625,1,"تز"],[64626,1,"تم"],[64627,1,"تن"],[64628,1,"تى"],[64629,1,"تي"],[64630,1,"ثر"],[64631,1,"ثز"],[64632,1,"ثم"],[64633,1,"ثن"],[64634,1,"ثى"],[64635,1,"ثي"],[64636,1,"فى"],[64637,1,"في"],[64638,1,"قى"],[64639,1,"قي"],[64640,1,"كا"],[64641,1,"كل"],[64642,1,"كم"],[64643,1,"كى"],[64644,1,"كي"],[64645,1,"لم"],[64646,1,"لى"],[64647,1,"لي"],[64648,1,"ما"],[64649,1,"مم"],[64650,1,"نر"],[64651,1,"نز"],[64652,1,"نم"],[64653,1,"نن"],[64654,1,"نى"],[64655,1,"ني"],[64656,1,"ىٰ"],[64657,1,"ير"],[64658,1,"يز"],[64659,1,"يم"],[64660,1,"ين"],[64661,1,"يى"],[64662,1,"يي"],[64663,1,"ئج"],[64664,1,"ئح"],[64665,1,"ئخ"],[64666,1,"ئم"],[64667,1,"ئه"],[64668,1,"بج"],[64669,1,"بح"],[64670,1,"بخ"],[64671,1,"بم"],[64672,1,"به"],[64673,1,"تج"],[64674,1,"تح"],[64675,1,"تخ"],[64676,1,"تم"],[64677,1,"ته"],[64678,1,"ثم"],[64679,1,"جح"],[64680,1,"جم"],[64681,1,"حج"],[64682,1,"حم"],[64683,1,"خج"],[64684,1,"خم"],[64685,1,"سج"],[64686,1,"سح"],[64687,1,"سخ"],[64688,1,"سم"],[64689,1,"صح"],[64690,1,"صخ"],[64691,1,"صم"],[64692,1,"ضج"],[64693,1,"ضح"],[64694,1,"ضخ"],[64695,1,"ضم"],[64696,1,"طح"],[64697,1,"ظم"],[64698,1,"عج"],[64699,1,"عم"],[64700,1,"غج"],[64701,1,"غم"],[64702,1,"فج"],[64703,1,"فح"],[64704,1,"فخ"],[64705,1,"فم"],[64706,1,"قح"],[64707,1,"قم"],[64708,1,"كج"],[64709,1,"كح"],[64710,1,"كخ"],[64711,1,"كل"],[64712,1,"كم"],[64713,1,"لج"],[64714,1,"لح"],[64715,1,"لخ"],[64716,1,"لم"],[64717,1,"له"],[64718,1,"مج"],[64719,1,"مح"],[64720,1,"مخ"],[64721,1,"مم"],[64722,1,"نج"],[64723,1,"نح"],[64724,1,"نخ"],[64725,1,"نم"],[64726,1,"نه"],[64727,1,"هج"],[64728,1,"هم"],[64729,1,"هٰ"],[64730,1,"يج"],[64731,1,"يح"],[64732,1,"يخ"],[64733,1,"يم"],[64734,1,"يه"],[64735,1,"ئم"],[64736,1,"ئه"],[64737,1,"بم"],[64738,1,"به"],[64739,1,"تم"],[64740,1,"ته"],[64741,1,"ثم"],[64742,1,"ثه"],[64743,1,"سم"],[64744,1,"سه"],[64745,1,"شم"],[64746,1,"شه"],[64747,1,"كل"],[64748,1,"كم"],[64749,1,"لم"],[64750,1,"نم"],[64751,1,"نه"],[64752,1,"يم"],[64753,1,"يه"],[64754,1,"ـَّ"],[64755,1,"ـُّ"],[64756,1,"ـِّ"],[64757,1,"طى"],[64758,1,"طي"],[64759,1,"عى"],[64760,1,"عي"],[64761,1,"غى"],[64762,1,"غي"],[64763,1,"سى"],[64764,1,"سي"],[64765,1,"شى"],[64766,1,"شي"],[64767,1,"حى"],[64768,1,"حي"],[64769,1,"جى"],[64770,1,"جي"],[64771,1,"خ
ى"],[64772,1,"خي"],[64773,1,"صى"],[64774,1,"صي"],[64775,1,"ضى"],[64776,1,"ضي"],[64777,1,"شج"],[64778,1,"شح"],[64779,1,"شخ"],[64780,1,"شم"],[64781,1,"شر"],[64782,1,"سر"],[64783,1,"صر"],[64784,1,"ضر"],[64785,1,"طى"],[64786,1,"طي"],[64787,1,"عى"],[64788,1,"عي"],[64789,1,"غى"],[64790,1,"غي"],[64791,1,"سى"],[64792,1,"سي"],[64793,1,"شى"],[64794,1,"شي"],[64795,1,"حى"],[64796,1,"حي"],[64797,1,"جى"],[64798,1,"جي"],[64799,1,"خى"],[64800,1,"خي"],[64801,1,"صى"],[64802,1,"صي"],[64803,1,"ضى"],[64804,1,"ضي"],[64805,1,"شج"],[64806,1,"شح"],[64807,1,"شخ"],[64808,1,"شم"],[64809,1,"شر"],[64810,1,"سر"],[64811,1,"صر"],[64812,1,"ضر"],[64813,1,"شج"],[64814,1,"شح"],[64815,1,"شخ"],[64816,1,"شم"],[64817,1,"سه"],[64818,1,"شه"],[64819,1,"طم"],[64820,1,"سج"],[64821,1,"سح"],[64822,1,"سخ"],[64823,1,"شج"],[64824,1,"شح"],[64825,1,"شخ"],[64826,1,"طم"],[64827,1,"ظم"],[[64828,64829],1,"اً"],[[64830,64831],2],[[64832,64847],3],[64848,1,"تجم"],[[64849,64850],1,"تحج"],[64851,1,"تحم"],[64852,1,"تخم"],[64853,1,"تمج"],[64854,1,"تمح"],[64855,1,"تمخ"],[[64856,64857],1,"جمح"],[64858,1,"حمي"],[64859,1,"حمى"],[64860,1,"سحج"],[64861,1,"سجح"],[64862,1,"سجى"],[[64863,64864],1,"سمح"],[64865,1,"سمج"],[[64866,64867],1,"سمم"],[[64868,64869],1,"صحح"],[64870,1,"صمم"],[[64871,64872],1,"شحم"],[64873,1,"شجي"],[[64874,64875],1,"شمخ"],[[64876,64877],1,"شمم"],[64878,1,"ضحى"],[[64879,64880],1,"ضخم"],[[64881,64882],1,"طمح"],[64883,1,"طمم"],[64884,1,"طمي"],[64885,1,"عجم"],[[64886,64887],1,"عمم"],[64888,1,"عمى"],[64889,1,"غمم"],[64890,1,"غمي"],[64891,1,"غمى"],[[64892,64893],1,"فخم"],[64894,1,"قمح"],[64895,1,"قمم"],[64896,1,"لحم"],[64897,1,"لحي"],[64898,1,"لحى"],[[64899,64900],1,"لجج"],[[64901,64902],1,"لخم"],[[64903,64904],1,"لمح"],[64905,1,"محج"],[64906,1,"محم"],[64907,1,"محي"],[64908,1,"مجح"],[64909,1,"مجم"],[64910,1,"مخج"],[64911,1,"مخم"],[[64912,64913],3],[64914,1,"مجخ"],[64915,1,"همج"],[64916,1,"همم"],[64917,1,"نحم"],[64918,1,"نحى"],[[64919,64920],1,"نجم"],[64921,1,"نجى"],[64922,1,"نمي"],[64923,1,"نمى"],[[64924,64925],1,"يمم"],[64926,1,"بخي"],[64927,1,"تجي"],[64928,1,"تجى"],[64929,1,"تخي"],[64930,1,"تخى"],[64931,1,"تمي"],[64932,1,"تمى"],[64933,1,"جمي"],[64934,1,"جحى"],[64935,1,"جمى"],[64936,1,"سخى"],[64937,1,"صحي"],[64938,1,"شحي"],[64939,1,"ضحي"],[64940,1,"لجي"],[64941,1,"لمي"],[64942,1,"يحي"],[64943,1,"يجي"],[64944,1,"يمي"],[64945,1,"ممي"],[64946,1,"قمي"],[64947,1,"نحي"],[64948,1,"قمح"],[64949,1,"لحم"],[64950,1,"عمي"],[64951,1,"كمي"],[64952,1,"نجح"],[64953,1,"مخي"],[64954,1,"لجم"],[64955,1,"كمم"],[64956,1,"لجم"],[64957,1,"نجح"],[64958,1,"جحي"],[64959,1,"حجي"],[64960,1,"مجي"],[64961,1,"فمي"],[64962,1,"بحي"],[64963,1,"كمم"],[64964,1,"عجم"],[64965,1,"صمم"],[64966,1,"سخي"],[64967,1,"نجي"],[[64968,64975],3],[[64976,65007],3],[65008,1,"صلے"],[65009,1,"قلے"],[65010,1,"الله"],[65011,1,"اكبر"],[65012,1,"محمد"],[65013,1,"صلعم"],[65014,1,"رسول"],[65015,1,"عليه"],[65016,1,"وسلم"],[65017,1,"صلى"],[65018,5,"صلى الله عليه وسلم"],[65019,5,"جل 
جلاله"],[65020,1,"ریال"],[65021,2],[[65022,65023],3],[[65024,65039],7],[65040,5,","],[65041,1,"、"],[65042,3],[65043,5,":"],[65044,5,";"],[65045,5,"!"],[65046,5,"?"],[65047,1,"〖"],[65048,1,"〗"],[65049,3],[[65050,65055],3],[[65056,65059],2],[[65060,65062],2],[[65063,65069],2],[[65070,65071],2],[65072,3],[65073,1,"—"],[65074,1,"–"],[[65075,65076],5,"_"],[65077,5,"("],[65078,5,")"],[65079,5,"{"],[65080,5,"}"],[65081,1,"〔"],[65082,1,"〕"],[65083,1,"【"],[65084,1,"】"],[65085,1,"《"],[65086,1,"》"],[65087,1,"〈"],[65088,1,"〉"],[65089,1,"「"],[65090,1,"」"],[65091,1,"『"],[65092,1,"』"],[[65093,65094],2],[65095,5,"["],[65096,5,"]"],[[65097,65100],5," ̅"],[[65101,65103],5,"_"],[65104,5,","],[65105,1,"、"],[65106,3],[65107,3],[65108,5,";"],[65109,5,":"],[65110,5,"?"],[65111,5,"!"],[65112,1,"—"],[65113,5,"("],[65114,5,")"],[65115,5,"{"],[65116,5,"}"],[65117,1,"〔"],[65118,1,"〕"],[65119,5,"#"],[65120,5,"&"],[65121,5,"*"],[65122,5,"+"],[65123,1,"-"],[65124,5,"<"],[65125,5,">"],[65126,5,"="],[65127,3],[65128,5,"\\"],[65129,5,"$"],[65130,5,"%"],[65131,5,"@"],[[65132,65135],3],[65136,5," ً"],[65137,1,"ـً"],[65138,5," ٌ"],[65139,2],[65140,5," ٍ"],[65141,3],[65142,5," َ"],[65143,1,"ـَ"],[65144,5," ُ"],[65145,1,"ـُ"],[65146,5," ِ"],[65147,1,"ـِ"],[65148,5," ّ"],[65149,1,"ـّ"],[65150,5," ْ"],[65151,1,"ـْ"],[65152,1,"ء"],[[65153,65154],1,"آ"],[[65155,65156],1,"أ"],[[65157,65158],1,"ؤ"],[[65159,65160],1,"إ"],[[65161,65164],1,"ئ"],[[65165,65166],1,"ا"],[[65167,65170],1,"ب"],[[65171,65172],1,"ة"],[[65173,65176],1,"ت"],[[65177,65180],1,"ث"],[[65181,65184],1,"ج"],[[65185,65188],1,"ح"],[[65189,65192],1,"خ"],[[65193,65194],1,"د"],[[65195,65196],1,"ذ"],[[65197,65198],1,"ر"],[[65199,65200],1,"ز"],[[65201,65204],1,"س"],[[65205,65208],1,"ش"],[[65209,65212],1,"ص"],[[65213,65216],1,"ض"],[[65217,65220],1,"ط"],[[65221,65224],1,"ظ"],[[65225,65228],1,"ع"],[[65229,65232],1,"غ"],[[65233,65236],1,"ف"],[[65237,65240],1,"ق"],[[65241,65244],1,"ك"],[[65245,65248],1,"ل"],[[65249,65252],1,"م"],[[65253,65256],1,"ن"],[[65257,65260],1,"ه"],[[65261,65262],1,"و"],[[65263,65264],1,"ى"],[[65265,65268],1,"ي"],[[65269,65270],1,"لآ"],[[65271,65272],1,"لأ"],[[65273,65274],1,"لإ"],[[65275,65276],1,"لا"],[[65277,65278],3],[65279,7],[65280,3],[65281,5,"!"],[65282,5,"\""],[65283,5,"#"],[65284,5,"$"],[65285,5,"%"],[65286,5,"&"],[65287,5,"'"],[65288,5,"("],[65289,5,")"],[65290,5,"*"],[65291,5,"+"],[65292,5,","],[65293,1,"-"],[65294,1,"."],[65295,5,"/"],[65296,1,"0"],[65297,1,"1"],[65298,1,"2"],[65299,1,"3"],[65300,1,"4"],[65301,1,"5"],[65302,1,"6"],[65303,1,"7"],[65304,1,"8"],[65305,1,"9"],[65306,5,":"],[65307,5,";"],[65308,5,"<"],[65309,5,"="],[65310,5,">"],[65311,5,"?"],[65312,5,"@"],[65313,1,"a"],[65314,1,"b"],[65315,1,"c"],[65316,1,"d"],[65317,1,"e"],[65318,1,"f"],[65319,1,"g"],[65320,1,"h"],[65321,1,"i"],[65322,1,"j"],[65323,1,"k"],[65324,1,"l"],[65325,1,"m"],[65326,1,"n"],[65327,1,"o"],[65328,1,"p"],[65329,1,"q"],[65330,1,"r"],[65331,1,"s"],[65332,1,"t"],[65333,1,"u"],[65334,1,"v"],[65335,1,"w"],[65336,1,"x"],[65337,1,"y"],[65338,1,"z"],[65339,5,"["],[65340,5,"\\"],[65341,5,"]"],[65342,5,"^"],[65343,5,"_"],[65344,5,"`"],[65345,1,"a"],[65346,1,"b"],[65347,1,"c"],[65348,1,"d"],[65349,1,"e"],[65350,1,"f"],[65351,1,"g"],[65352,1,"h"],[65353,1,"i"],[65354,1,"j"],[65355,1,"k"],[65356,1,"l"],[65357,1,"m"],[65358,1,"n"],[65359,1,"o"],[65360,1,"p"],[65361,1,"q"],[65362,1,"r"],[65363,1,"s"],[65364,1,"t"],[65365,1,"u"],[65366,1,"v"],[65367,1,"w"],[65368,1,"x"],[65369,1,"y"],[65370,1,"z"],[65371,5,"{"],[65372,5,"|"],[65373,5,"}"],[65374,5,"~"],[65375,1,"⦅"],[65376,1,"⦆"
],[65377,1,"."],[65378,1,"「"],[65379,1,"」"],[65380,1,"、"],[65381,1,"・"],[65382,1,"ヲ"],[65383,1,"ァ"],[65384,1,"ィ"],[65385,1,"ゥ"],[65386,1,"ェ"],[65387,1,"ォ"],[65388,1,"ャ"],[65389,1,"ュ"],[65390,1,"ョ"],[65391,1,"ッ"],[65392,1,"ー"],[65393,1,"ア"],[65394,1,"イ"],[65395,1,"ウ"],[65396,1,"エ"],[65397,1,"オ"],[65398,1,"カ"],[65399,1,"キ"],[65400,1,"ク"],[65401,1,"ケ"],[65402,1,"コ"],[65403,1,"サ"],[65404,1,"シ"],[65405,1,"ス"],[65406,1,"セ"],[65407,1,"ソ"],[65408,1,"タ"],[65409,1,"チ"],[65410,1,"ツ"],[65411,1,"テ"],[65412,1,"ト"],[65413,1,"ナ"],[65414,1,"ニ"],[65415,1,"ヌ"],[65416,1,"ネ"],[65417,1,"ノ"],[65418,1,"ハ"],[65419,1,"ヒ"],[65420,1,"フ"],[65421,1,"ヘ"],[65422,1,"ホ"],[65423,1,"マ"],[65424,1,"ミ"],[65425,1,"ム"],[65426,1,"メ"],[65427,1,"モ"],[65428,1,"ヤ"],[65429,1,"ユ"],[65430,1,"ヨ"],[65431,1,"ラ"],[65432,1,"リ"],[65433,1,"ル"],[65434,1,"レ"],[65435,1,"ロ"],[65436,1,"ワ"],[65437,1,"ン"],[65438,1,"゙"],[65439,1,"゚"],[65440,3],[65441,1,"ᄀ"],[65442,1,"ᄁ"],[65443,1,"ᆪ"],[65444,1,"ᄂ"],[65445,1,"ᆬ"],[65446,1,"ᆭ"],[65447,1,"ᄃ"],[65448,1,"ᄄ"],[65449,1,"ᄅ"],[65450,1,"ᆰ"],[65451,1,"ᆱ"],[65452,1,"ᆲ"],[65453,1,"ᆳ"],[65454,1,"ᆴ"],[65455,1,"ᆵ"],[65456,1,"ᄚ"],[65457,1,"ᄆ"],[65458,1,"ᄇ"],[65459,1,"ᄈ"],[65460,1,"ᄡ"],[65461,1,"ᄉ"],[65462,1,"ᄊ"],[65463,1,"ᄋ"],[65464,1,"ᄌ"],[65465,1,"ᄍ"],[65466,1,"ᄎ"],[65467,1,"ᄏ"],[65468,1,"ᄐ"],[65469,1,"ᄑ"],[65470,1,"ᄒ"],[[65471,65473],3],[65474,1,"ᅡ"],[65475,1,"ᅢ"],[65476,1,"ᅣ"],[65477,1,"ᅤ"],[65478,1,"ᅥ"],[65479,1,"ᅦ"],[[65480,65481],3],[65482,1,"ᅧ"],[65483,1,"ᅨ"],[65484,1,"ᅩ"],[65485,1,"ᅪ"],[65486,1,"ᅫ"],[65487,1,"ᅬ"],[[65488,65489],3],[65490,1,"ᅭ"],[65491,1,"ᅮ"],[65492,1,"ᅯ"],[65493,1,"ᅰ"],[65494,1,"ᅱ"],[65495,1,"ᅲ"],[[65496,65497],3],[65498,1,"ᅳ"],[65499,1,"ᅴ"],[65500,1,"ᅵ"],[[65501,65503],3],[65504,1,"¢"],[65505,1,"£"],[65506,1,"¬"],[65507,5," ̄"],[65508,1,"¦"],[65509,1,"¥"],[65510,1,"₩"],[65511,3],[65512,1,"│"],[65513,1,"←"],[65514,1,"↑"],[65515,1,"→"],[65516,1,"↓"],[65517,1,"■"],[65518,1,"○"],[[65519,65528],3],[[65529,65531],3],[65532,3],[65533,3],[[65534,65535],3],[[65536,65547],2],[65548,3],[[65549,65574],2],[65575,3],[[65576,65594],2],[65595,3],[[65596,65597],2],[65598,3],[[65599,65613],2],[[65614,65615],3],[[65616,65629],2],[[65630,65663],3],[[65664,65786],2],[[65787,65791],3],[[65792,65794],2],[[65795,65798],3],[[65799,65843],2],[[65844,65846],3],[[65847,65855],2],[[65856,65930],2],[[65931,65932],2],[[65933,65934],2],[65935,3],[[65936,65947],2],[65948,2],[[65949,65951],3],[65952,2],[[65953,65999],3],[[66000,66044],2],[66045,2],[[66046,66175],3],[[66176,66204],2],[[66205,66207],3],[[66208,66256],2],[[66257,66271],3],[66272,2],[[66273,66299],2],[[66300,66303],3],[[66304,66334],2],[66335,2],[[66336,66339],2],[[66340,66348],3],[[66349,66351],2],[[66352,66368],2],[66369,2],[[66370,66377],2],[66378,2],[[66379,66383],3],[[66384,66426],2],[[66427,66431],3],[[66432,66461],2],[66462,3],[66463,2],[[66464,66499],2],[[66500,66503],3],[[66504,66511],2],[[66512,66517],2],[[66518,66559],3],[66560,1,"𐐨"],[66561,1,"𐐩"],[66562,1,"𐐪"],[66563,1,"𐐫"],[66564,1,"𐐬"],[66565,1,"𐐭"],[66566,1,"𐐮"],[66567,1,"𐐯"],[66568,1,"𐐰"],[66569,1,"𐐱"],[66570,1,"𐐲"],[66571,1,"𐐳"],[66572,1,"𐐴"],[66573,1,"𐐵"],[66574,1,"𐐶"],[66575,1,"𐐷"],[66576,1,"𐐸"],[66577,1,"𐐹"],[66578,1,"𐐺"],[66579,1,"𐐻"],[66580,1,"𐐼"],[66581,1,"𐐽"],[66582,1,"𐐾"],[66583,1,"𐐿"],[66584,1,"𐑀"],[66585,1,"𐑁"],[66586,1,"𐑂"],[66587,1,"𐑃"],[66588,1,"𐑄"],[66589,1,"𐑅"],[66590,1,"𐑆"],[66591,1,"𐑇"],[66592,1,"𐑈"],[66593,1,"𐑉"],[66594,1,"𐑊"],[66595,1,"𐑋"],[66596,1,"𐑌"],[66597,1,"𐑍"],[66598,1,"𐑎"],[66599,1,"𐑏"],[[66600,66637],2],[[66638,66717],2],[[66718,66719],3],[[66720,66729],2],[[66730,6673
5],3],[66736,1,"𐓘"],[66737,1,"𐓙"],[66738,1,"𐓚"],[66739,1,"𐓛"],[66740,1,"𐓜"],[66741,1,"𐓝"],[66742,1,"𐓞"],[66743,1,"𐓟"],[66744,1,"𐓠"],[66745,1,"𐓡"],[66746,1,"𐓢"],[66747,1,"𐓣"],[66748,1,"𐓤"],[66749,1,"𐓥"],[66750,1,"𐓦"],[66751,1,"𐓧"],[66752,1,"𐓨"],[66753,1,"𐓩"],[66754,1,"𐓪"],[66755,1,"𐓫"],[66756,1,"𐓬"],[66757,1,"𐓭"],[66758,1,"𐓮"],[66759,1,"𐓯"],[66760,1,"𐓰"],[66761,1,"𐓱"],[66762,1,"𐓲"],[66763,1,"𐓳"],[66764,1,"𐓴"],[66765,1,"𐓵"],[66766,1,"𐓶"],[66767,1,"𐓷"],[66768,1,"𐓸"],[66769,1,"𐓹"],[66770,1,"𐓺"],[66771,1,"𐓻"],[[66772,66775],3],[[66776,66811],2],[[66812,66815],3],[[66816,66855],2],[[66856,66863],3],[[66864,66915],2],[[66916,66926],3],[66927,2],[[66928,67071],3],[[67072,67382],2],[[67383,67391],3],[[67392,67413],2],[[67414,67423],3],[[67424,67431],2],[[67432,67583],3],[[67584,67589],2],[[67590,67591],3],[67592,2],[67593,3],[[67594,67637],2],[67638,3],[[67639,67640],2],[[67641,67643],3],[67644,2],[[67645,67646],3],[67647,2],[[67648,67669],2],[67670,3],[[67671,67679],2],[[67680,67702],2],[[67703,67711],2],[[67712,67742],2],[[67743,67750],3],[[67751,67759],2],[[67760,67807],3],[[67808,67826],2],[67827,3],[[67828,67829],2],[[67830,67834],3],[[67835,67839],2],[[67840,67861],2],[[67862,67865],2],[[67866,67867],2],[[67868,67870],3],[67871,2],[[67872,67897],2],[[67898,67902],3],[67903,2],[[67904,67967],3],[[67968,68023],2],[[68024,68027],3],[[68028,68029],2],[[68030,68031],2],[[68032,68047],2],[[68048,68049],3],[[68050,68095],2],[[68096,68099],2],[68100,3],[[68101,68102],2],[[68103,68107],3],[[68108,68115],2],[68116,3],[[68117,68119],2],[68120,3],[[68121,68147],2],[[68148,68149],2],[[68150,68151],3],[[68152,68154],2],[[68155,68158],3],[68159,2],[[68160,68167],2],[68168,2],[[68169,68175],3],[[68176,68184],2],[[68185,68191],3],[[68192,68220],2],[[68221,68223],2],[[68224,68252],2],[[68253,68255],2],[[68256,68287],3],[[68288,68295],2],[68296,2],[[68297,68326],2],[[68327,68330],3],[[68331,68342],2],[[68343,68351],3],[[68352,68405],2],[[68406,68408],3],[[68409,68415],2],[[68416,68437],2],[[68438,68439],3],[[68440,68447],2],[[68448,68466],2],[[68467,68471],3],[[68472,68479],2],[[68480,68497],2],[[68498,68504],3],[[68505,68508],2],[[68509,68520],3],[[68521,68527],2],[[68528,68607],3],[[68608,68680],2],[[68681,68735],3],[68736,1,"𐳀"],[68737,1,"𐳁"],[68738,1,"𐳂"],[68739,1,"𐳃"],[68740,1,"𐳄"],[68741,1,"𐳅"],[68742,1,"𐳆"],[68743,1,"𐳇"],[68744,1,"𐳈"],[68745,1,"𐳉"],[68746,1,"𐳊"],[68747,1,"𐳋"],[68748,1,"𐳌"],[68749,1,"𐳍"],[68750,1,"𐳎"],[68751,1,"𐳏"],[68752,1,"𐳐"],[68753,1,"𐳑"],[68754,1,"𐳒"],[68755,1,"𐳓"],[68756,1,"𐳔"],[68757,1,"𐳕"],[68758,1,"𐳖"],[68759,1,"𐳗"],[68760,1,"𐳘"],[68761,1,"𐳙"],[68762,1,"𐳚"],[68763,1,"𐳛"],[68764,1,"𐳜"],[68765,1,"𐳝"],[68766,1,"𐳞"],[68767,1,"𐳟"],[68768,1,"𐳠"],[68769,1,"𐳡"],[68770,1,"𐳢"],[68771,1,"𐳣"],[68772,1,"𐳤"],[68773,1,"𐳥"],[68774,1,"𐳦"],[68775,1,"𐳧"],[68776,1,"𐳨"],[68777,1,"𐳩"],[68778,1,"𐳪"],[68779,1,"𐳫"],[68780,1,"𐳬"],[68781,1,"𐳭"],[68782,1,"𐳮"],[68783,1,"𐳯"],[68784,1,"𐳰"],[68785,1,"𐳱"],[68786,1,"𐳲"],[[68787,68799],3],[[68800,68850],2],[[68851,68857],3],[[68858,68863],2],[[68864,68903],2],[[68904,68911],3],[[68912,68921],2],[[68922,69215],3],[[69216,69246],2],[69247,3],[[69248,69289],2],[69290,3],[[69291,69292],2],[69293,2],[[69294,69295],3],[[69296,69297],2],[[69298,69375],3],[[69376,69404],2],[[69405,69414],2],[69415,2],[[69416,69423],3],[[69424,69456],2],[[69457,69465],2],[[69466,69551],3],[[69552,69572],2],[[69573,69579],2],[[69580,69599],3],[[69600,69622],2],[[69623,69631],3],[[69632,69702],2],[[69703,69709],2],[[69710,69713],3],[[69714,69733],2],[[69734,69743],2],[[69744,69758],3],[69759,
2],[[69760,69818],2],[[69819,69820],2],[69821,3],[[69822,69825],2],[[69826,69836],3],[69837,3],[[69838,69839],3],[[69840,69864],2],[[69865,69871],3],[[69872,69881],2],[[69882,69887],3],[[69888,69940],2],[69941,3],[[69942,69951],2],[[69952,69955],2],[[69956,69958],2],[69959,2],[[69960,69967],3],[[69968,70003],2],[[70004,70005],2],[70006,2],[[70007,70015],3],[[70016,70084],2],[[70085,70088],2],[[70089,70092],2],[70093,2],[[70094,70095],2],[[70096,70105],2],[70106,2],[70107,2],[70108,2],[[70109,70111],2],[70112,3],[[70113,70132],2],[[70133,70143],3],[[70144,70161],2],[70162,3],[[70163,70199],2],[[70200,70205],2],[70206,2],[[70207,70271],3],[[70272,70278],2],[70279,3],[70280,2],[70281,3],[[70282,70285],2],[70286,3],[[70287,70301],2],[70302,3],[[70303,70312],2],[70313,2],[[70314,70319],3],[[70320,70378],2],[[70379,70383],3],[[70384,70393],2],[[70394,70399],3],[70400,2],[[70401,70403],2],[70404,3],[[70405,70412],2],[[70413,70414],3],[[70415,70416],2],[[70417,70418],3],[[70419,70440],2],[70441,3],[[70442,70448],2],[70449,3],[[70450,70451],2],[70452,3],[[70453,70457],2],[70458,3],[70459,2],[[70460,70468],2],[[70469,70470],3],[[70471,70472],2],[[70473,70474],3],[[70475,70477],2],[[70478,70479],3],[70480,2],[[70481,70486],3],[70487,2],[[70488,70492],3],[[70493,70499],2],[[70500,70501],3],[[70502,70508],2],[[70509,70511],3],[[70512,70516],2],[[70517,70655],3],[[70656,70730],2],[[70731,70735],2],[[70736,70745],2],[70746,2],[70747,2],[70748,3],[70749,2],[70750,2],[70751,2],[[70752,70753],2],[[70754,70783],3],[[70784,70853],2],[70854,2],[70855,2],[[70856,70863],3],[[70864,70873],2],[[70874,71039],3],[[71040,71093],2],[[71094,71095],3],[[71096,71104],2],[[71105,71113],2],[[71114,71127],2],[[71128,71133],2],[[71134,71167],3],[[71168,71232],2],[[71233,71235],2],[71236,2],[[71237,71247],3],[[71248,71257],2],[[71258,71263],3],[[71264,71276],2],[[71277,71295],3],[[71296,71351],2],[71352,2],[[71353,71359],3],[[71360,71369],2],[[71370,71423],3],[[71424,71449],2],[71450,2],[[71451,71452],3],[[71453,71467],2],[[71468,71471],3],[[71472,71481],2],[[71482,71487],2],[[71488,71679],3],[[71680,71738],2],[71739,2],[[71740,71839],3],[71840,1,"𑣀"],[71841,1,"𑣁"],[71842,1,"𑣂"],[71843,1,"𑣃"],[71844,1,"𑣄"],[71845,1,"𑣅"],[71846,1,"𑣆"],[71847,1,"𑣇"],[71848,1,"𑣈"],[71849,1,"𑣉"],[71850,1,"𑣊"],[71851,1,"𑣋"],[71852,1,"𑣌"],[71853,1,"𑣍"],[71854,1,"𑣎"],[71855,1,"𑣏"],[71856,1,"𑣐"],[71857,1,"𑣑"],[71858,1,"𑣒"],[71859,1,"𑣓"],[71860,1,"𑣔"],[71861,1,"𑣕"],[71862,1,"𑣖"],[71863,1,"𑣗"],[71864,1,"𑣘"],[71865,1,"𑣙"],[71866,1,"𑣚"],[71867,1,"𑣛"],[71868,1,"𑣜"],[71869,1,"𑣝"],[71870,1,"𑣞"],[71871,1,"𑣟"],[[71872,71913],2],[[71914,71922],2],[[71923,71934],3],[71935,2],[[71936,71942],2],[[71943,71944],3],[71945,2],[[71946,71947],3],[[71948,71955],2],[71956,3],[[71957,71958],2],[71959,3],[[71960,71989],2],[71990,3],[[71991,71992],2],[[71993,71994],3],[[71995,72003],2],[[72004,72006],2],[[72007,72015],3],[[72016,72025],2],[[72026,72095],3],[[72096,72103],2],[[72104,72105],3],[[72106,72151],2],[[72152,72153],3],[[72154,72161],2],[72162,2],[[72163,72164],2],[[72165,72191],3],[[72192,72254],2],[[72255,72262],2],[72263,2],[[72264,72271],3],[[72272,72323],2],[[72324,72325],2],[[72326,72345],2],[[72346,72348],2],[72349,2],[[72350,72354],2],[[72355,72383],3],[[72384,72440],2],[[72441,72703],3],[[72704,72712],2],[72713,3],[[72714,72758],2],[72759,3],[[72760,72768],2],[[72769,72773],2],[[72774,72783],3],[[72784,72793],2],[[72794,72812],2],[[72813,72815],3],[[72816,72817],2],[[72818,72847],2],[[72848,72849],3],[[72850,72871],2],[72872,3],[[72873,72886],2],[[72887,7295
9],3],[[72960,72966],2],[72967,3],[[72968,72969],2],[72970,3],[[72971,73014],2],[[73015,73017],3],[73018,2],[73019,3],[[73020,73021],2],[73022,3],[[73023,73031],2],[[73032,73039],3],[[73040,73049],2],[[73050,73055],3],[[73056,73061],2],[73062,3],[[73063,73064],2],[73065,3],[[73066,73102],2],[73103,3],[[73104,73105],2],[73106,3],[[73107,73112],2],[[73113,73119],3],[[73120,73129],2],[[73130,73439],3],[[73440,73462],2],[[73463,73464],2],[[73465,73647],3],[73648,2],[[73649,73663],3],[[73664,73713],2],[[73714,73726],3],[73727,2],[[73728,74606],2],[[74607,74648],2],[74649,2],[[74650,74751],3],[[74752,74850],2],[[74851,74862],2],[74863,3],[[74864,74867],2],[74868,2],[[74869,74879],3],[[74880,75075],2],[[75076,77823],3],[[77824,78894],2],[78895,3],[[78896,78904],3],[[78905,82943],3],[[82944,83526],2],[[83527,92159],3],[[92160,92728],2],[[92729,92735],3],[[92736,92766],2],[92767,3],[[92768,92777],2],[[92778,92781],3],[[92782,92783],2],[[92784,92879],3],[[92880,92909],2],[[92910,92911],3],[[92912,92916],2],[92917,2],[[92918,92927],3],[[92928,92982],2],[[92983,92991],2],[[92992,92995],2],[[92996,92997],2],[[92998,93007],3],[[93008,93017],2],[93018,3],[[93019,93025],2],[93026,3],[[93027,93047],2],[[93048,93052],3],[[93053,93071],2],[[93072,93759],3],[93760,1,"𖹠"],[93761,1,"𖹡"],[93762,1,"𖹢"],[93763,1,"𖹣"],[93764,1,"𖹤"],[93765,1,"𖹥"],[93766,1,"𖹦"],[93767,1,"𖹧"],[93768,1,"𖹨"],[93769,1,"𖹩"],[93770,1,"𖹪"],[93771,1,"𖹫"],[93772,1,"𖹬"],[93773,1,"𖹭"],[93774,1,"𖹮"],[93775,1,"𖹯"],[93776,1,"𖹰"],[93777,1,"𖹱"],[93778,1,"𖹲"],[93779,1,"𖹳"],[93780,1,"𖹴"],[93781,1,"𖹵"],[93782,1,"𖹶"],[93783,1,"𖹷"],[93784,1,"𖹸"],[93785,1,"𖹹"],[93786,1,"𖹺"],[93787,1,"𖹻"],[93788,1,"𖹼"],[93789,1,"𖹽"],[93790,1,"𖹾"],[93791,1,"𖹿"],[[93792,93823],2],[[93824,93850],2],[[93851,93951],3],[[93952,94020],2],[[94021,94026],2],[[94027,94030],3],[94031,2],[[94032,94078],2],[[94079,94087],2],[[94088,94094],3],[[94095,94111],2],[[94112,94175],3],[94176,2],[94177,2],[94178,2],[94179,2],[94180,2],[[94181,94191],3],[[94192,94193],2],[[94194,94207],3],[[94208,100332],2],[[100333,100337],2],[[100338,100343],2],[[100344,100351],3],[[100352,101106],2],[[101107,101589],2],[[101590,101631],3],[[101632,101640],2],[[101641,110591],3],[[110592,110593],2],[[110594,110878],2],[[110879,110927],3],[[110928,110930],2],[[110931,110947],3],[[110948,110951],2],[[110952,110959],3],[[110960,111355],2],[[111356,113663],3],[[113664,113770],2],[[113771,113775],3],[[113776,113788],2],[[113789,113791],3],[[113792,113800],2],[[113801,113807],3],[[113808,113817],2],[[113818,113819],3],[113820,2],[[113821,113822],2],[113823,2],[[113824,113827],7],[[113828,118783],3],[[118784,119029],2],[[119030,119039],3],[[119040,119078],2],[[119079,119080],3],[119081,2],[[119082,119133],2],[119134,1,"𝅗𝅥"],[119135,1,"𝅘𝅥"],[119136,1,"𝅘𝅥𝅮"],[119137,1,"𝅘𝅥𝅯"],[119138,1,"𝅘𝅥𝅰"],[119139,1,"𝅘𝅥𝅱"],[119140,1,"𝅘𝅥𝅲"],[[119141,119154],2],[[119155,119162],3],[[119163,119226],2],[119227,1,"𝆹𝅥"],[119228,1,"𝆺𝅥"],[119229,1,"𝆹𝅥𝅮"],[119230,1,"𝆺𝅥𝅮"],[119231,1,"𝆹𝅥𝅯"],[119232,1,"𝆺𝅥𝅯"],[[119233,119261],2],[[119262,119272],2],[[119273,119295],3],[[119296,119365],2],[[119366,119519],3],[[119520,119539],2],[[119540,119551],3],[[119552,119638],2],[[119639,119647],3],[[119648,119665],2],[[119666,119672],2],[[119673,119807],3],[119808,1,"a"],[119809,1,"b"],[119810,1,"c"],[119811,1,"d"],[119812,1,"e"],[119813,1,"f"],[119814,1,"g"],[119815,1,"h"],[119816,1,"i"],[119817,1,"j"],[119818,1,"k"],[119819,1,"l"],[119820,1,"m"],[119821,1,"n"],[119822,1,"o"],[119823,1,"p"],[119824,1,"q"],[119825,1,"r"],[119826,1,"s"],[119827,1,"t"],[119828,
1,"u"],[119829,1,"v"],[119830,1,"w"],[119831,1,"x"],[119832,1,"y"],[119833,1,"z"],[119834,1,"a"],[119835,1,"b"],[119836,1,"c"],[119837,1,"d"],[119838,1,"e"],[119839,1,"f"],[119840,1,"g"],[119841,1,"h"],[119842,1,"i"],[119843,1,"j"],[119844,1,"k"],[119845,1,"l"],[119846,1,"m"],[119847,1,"n"],[119848,1,"o"],[119849,1,"p"],[119850,1,"q"],[119851,1,"r"],[119852,1,"s"],[119853,1,"t"],[119854,1,"u"],[119855,1,"v"],[119856,1,"w"],[119857,1,"x"],[119858,1,"y"],[119859,1,"z"],[119860,1,"a"],[119861,1,"b"],[119862,1,"c"],[119863,1,"d"],[119864,1,"e"],[119865,1,"f"],[119866,1,"g"],[119867,1,"h"],[119868,1,"i"],[119869,1,"j"],[119870,1,"k"],[119871,1,"l"],[119872,1,"m"],[119873,1,"n"],[119874,1,"o"],[119875,1,"p"],[119876,1,"q"],[119877,1,"r"],[119878,1,"s"],[119879,1,"t"],[119880,1,"u"],[119881,1,"v"],[119882,1,"w"],[119883,1,"x"],[119884,1,"y"],[119885,1,"z"],[119886,1,"a"],[119887,1,"b"],[119888,1,"c"],[119889,1,"d"],[119890,1,"e"],[119891,1,"f"],[119892,1,"g"],[119893,3],[119894,1,"i"],[119895,1,"j"],[119896,1,"k"],[119897,1,"l"],[119898,1,"m"],[119899,1,"n"],[119900,1,"o"],[119901,1,"p"],[119902,1,"q"],[119903,1,"r"],[119904,1,"s"],[119905,1,"t"],[119906,1,"u"],[119907,1,"v"],[119908,1,"w"],[119909,1,"x"],[119910,1,"y"],[119911,1,"z"],[119912,1,"a"],[119913,1,"b"],[119914,1,"c"],[119915,1,"d"],[119916,1,"e"],[119917,1,"f"],[119918,1,"g"],[119919,1,"h"],[119920,1,"i"],[119921,1,"j"],[119922,1,"k"],[119923,1,"l"],[119924,1,"m"],[119925,1,"n"],[119926,1,"o"],[119927,1,"p"],[119928,1,"q"],[119929,1,"r"],[119930,1,"s"],[119931,1,"t"],[119932,1,"u"],[119933,1,"v"],[119934,1,"w"],[119935,1,"x"],[119936,1,"y"],[119937,1,"z"],[119938,1,"a"],[119939,1,"b"],[119940,1,"c"],[119941,1,"d"],[119942,1,"e"],[119943,1,"f"],[119944,1,"g"],[119945,1,"h"],[119946,1,"i"],[119947,1,"j"],[119948,1,"k"],[119949,1,"l"],[119950,1,"m"],[119951,1,"n"],[119952,1,"o"],[119953,1,"p"],[119954,1,"q"],[119955,1,"r"],[119956,1,"s"],[119957,1,"t"],[119958,1,"u"],[119959,1,"v"],[119960,1,"w"],[119961,1,"x"],[119962,1,"y"],[119963,1,"z"],[119964,1,"a"],[119965,3],[119966,1,"c"],[119967,1,"d"],[[119968,119969],3],[119970,1,"g"],[[119971,119972],3],[119973,1,"j"],[119974,1,"k"],[[119975,119976],3],[119977,1,"n"],[119978,1,"o"],[119979,1,"p"],[119980,1,"q"],[119981,3],[119982,1,"s"],[119983,1,"t"],[119984,1,"u"],[119985,1,"v"],[119986,1,"w"],[119987,1,"x"],[119988,1,"y"],[119989,1,"z"],[119990,1,"a"],[119991,1,"b"],[119992,1,"c"],[119993,1,"d"],[119994,3],[119995,1,"f"],[119996,3],[119997,1,"h"],[119998,1,"i"],[119999,1,"j"],[120000,1,"k"],[120001,1,"l"],[120002,1,"m"],[120003,1,"n"],[120004,3],[120005,1,"p"],[120006,1,"q"],[120007,1,"r"],[120008,1,"s"],[120009,1,"t"],[120010,1,"u"],[120011,1,"v"],[120012,1,"w"],[120013,1,"x"],[120014,1,"y"],[120015,1,"z"],[120016,1,"a"],[120017,1,"b"],[120018,1,"c"],[120019,1,"d"],[120020,1,"e"],[120021,1,"f"],[120022,1,"g"],[120023,1,"h"],[120024,1,"i"],[120025,1,"j"],[120026,1,"k"],[120027,1,"l"],[120028,1,"m"],[120029,1,"n"],[120030,1,"o"],[120031,1,"p"],[120032,1,"q"],[120033,1,"r"],[120034,1,"s"],[120035,1,"t"],[120036,1,"u"],[120037,1,"v"],[120038,1,"w"],[120039,1,"x"],[120040,1,"y"],[120041,1,"z"],[120042,1,"a"],[120043,1,"b"],[120044,1,"c"],[120045,1,"d"],[120046,1,"e"],[120047,1,"f"],[120048,1,"g"],[120049,1,"h"],[120050,1,"i"],[120051,1,"j"],[120052,1,"k"],[120053,1,"l"],[120054,1,"m"],[120055,1,"n"],[120056,1,"o"],[120057,1,"p"],[120058,1,"q"],[120059,1,"r"],[120060,1,"s"],[120061,1,"t"],[120062,1,"u"],[120063,1,"v"],[120064,1,"w"],[120065,1,"x"],[120066,1,"y"],[120067,1,"z"],[120068,1,"a"],[1
20069,1,"b"],[120070,3],[120071,1,"d"],[120072,1,"e"],[120073,1,"f"],[120074,1,"g"],[[120075,120076],3],[120077,1,"j"],[120078,1,"k"],[120079,1,"l"],[120080,1,"m"],[120081,1,"n"],[120082,1,"o"],[120083,1,"p"],[120084,1,"q"],[120085,3],[120086,1,"s"],[120087,1,"t"],[120088,1,"u"],[120089,1,"v"],[120090,1,"w"],[120091,1,"x"],[120092,1,"y"],[120093,3],[120094,1,"a"],[120095,1,"b"],[120096,1,"c"],[120097,1,"d"],[120098,1,"e"],[120099,1,"f"],[120100,1,"g"],[120101,1,"h"],[120102,1,"i"],[120103,1,"j"],[120104,1,"k"],[120105,1,"l"],[120106,1,"m"],[120107,1,"n"],[120108,1,"o"],[120109,1,"p"],[120110,1,"q"],[120111,1,"r"],[120112,1,"s"],[120113,1,"t"],[120114,1,"u"],[120115,1,"v"],[120116,1,"w"],[120117,1,"x"],[120118,1,"y"],[120119,1,"z"],[120120,1,"a"],[120121,1,"b"],[120122,3],[120123,1,"d"],[120124,1,"e"],[120125,1,"f"],[120126,1,"g"],[120127,3],[120128,1,"i"],[120129,1,"j"],[120130,1,"k"],[120131,1,"l"],[120132,1,"m"],[120133,3],[120134,1,"o"],[[120135,120137],3],[120138,1,"s"],[120139,1,"t"],[120140,1,"u"],[120141,1,"v"],[120142,1,"w"],[120143,1,"x"],[120144,1,"y"],[120145,3],[120146,1,"a"],[120147,1,"b"],[120148,1,"c"],[120149,1,"d"],[120150,1,"e"],[120151,1,"f"],[120152,1,"g"],[120153,1,"h"],[120154,1,"i"],[120155,1,"j"],[120156,1,"k"],[120157,1,"l"],[120158,1,"m"],[120159,1,"n"],[120160,1,"o"],[120161,1,"p"],[120162,1,"q"],[120163,1,"r"],[120164,1,"s"],[120165,1,"t"],[120166,1,"u"],[120167,1,"v"],[120168,1,"w"],[120169,1,"x"],[120170,1,"y"],[120171,1,"z"],[120172,1,"a"],[120173,1,"b"],[120174,1,"c"],[120175,1,"d"],[120176,1,"e"],[120177,1,"f"],[120178,1,"g"],[120179,1,"h"],[120180,1,"i"],[120181,1,"j"],[120182,1,"k"],[120183,1,"l"],[120184,1,"m"],[120185,1,"n"],[120186,1,"o"],[120187,1,"p"],[120188,1,"q"],[120189,1,"r"],[120190,1,"s"],[120191,1,"t"],[120192,1,"u"],[120193,1,"v"],[120194,1,"w"],[120195,1,"x"],[120196,1,"y"],[120197,1,"z"],[120198,1,"a"],[120199,1,"b"],[120200,1,"c"],[120201,1,"d"],[120202,1,"e"],[120203,1,"f"],[120204,1,"g"],[120205,1,"h"],[120206,1,"i"],[120207,1,"j"],[120208,1,"k"],[120209,1,"l"],[120210,1,"m"],[120211,1,"n"],[120212,1,"o"],[120213,1,"p"],[120214,1,"q"],[120215,1,"r"],[120216,1,"s"],[120217,1,"t"],[120218,1,"u"],[120219,1,"v"],[120220,1,"w"],[120221,1,"x"],[120222,1,"y"],[120223,1,"z"],[120224,1,"a"],[120225,1,"b"],[120226,1,"c"],[120227,1,"d"],[120228,1,"e"],[120229,1,"f"],[120230,1,"g"],[120231,1,"h"],[120232,1,"i"],[120233,1,"j"],[120234,1,"k"],[120235,1,"l"],[120236,1,"m"],[120237,1,"n"],[120238,1,"o"],[120239,1,"p"],[120240,1,"q"],[120241,1,"r"],[120242,1,"s"],[120243,1,"t"],[120244,1,"u"],[120245,1,"v"],[120246,1,"w"],[120247,1,"x"],[120248,1,"y"],[120249,1,"z"],[120250,1,"a"],[120251,1,"b"],[120252,1,"c"],[120253,1,"d"],[120254,1,"e"],[120255,1,"f"],[120256,1,"g"],[120257,1,"h"],[120258,1,"i"],[120259,1,"j"],[120260,1,"k"],[120261,1,"l"],[120262,1,"m"],[120263,1,"n"],[120264,1,"o"],[120265,1,"p"],[120266,1,"q"],[120267,1,"r"],[120268,1,"s"],[120269,1,"t"],[120270,1,"u"],[120271,1,"v"],[120272,1,"w"],[120273,1,"x"],[120274,1,"y"],[120275,1,"z"],[120276,1,"a"],[120277,1,"b"],[120278,1,"c"],[120279,1,"d"],[120280,1,"e"],[120281,1,"f"],[120282,1,"g"],[120283,1,"h"],[120284,1,"i"],[120285,1,"j"],[120286,1,"k"],[120287,1,"l"],[120288,1,"m"],[120289,1,"n"],[120290,1,"o"],[120291,1,"p"],[120292,1,"q"],[120293,1,"r"],[120294,1,"s"],[120295,1,"t"],[120296,1,"u"],[120297,1,"v"],[120298,1,"w"],[120299,1,"x"],[120300,1,"y"],[120301,1,"z"],[120302,1,"a"],[120303,1,"b"],[120304,1,"c"],[120305,1,"d"],[120306,1,"e"],[120307,1,"f"],[120308,1,"g"],[120309,1,"h"],[1203
10,1,"i"],[120311,1,"j"],[120312,1,"k"],[120313,1,"l"],[120314,1,"m"],[120315,1,"n"],[120316,1,"o"],[120317,1,"p"],[120318,1,"q"],[120319,1,"r"],[120320,1,"s"],[120321,1,"t"],[120322,1,"u"],[120323,1,"v"],[120324,1,"w"],[120325,1,"x"],[120326,1,"y"],[120327,1,"z"],[120328,1,"a"],[120329,1,"b"],[120330,1,"c"],[120331,1,"d"],[120332,1,"e"],[120333,1,"f"],[120334,1,"g"],[120335,1,"h"],[120336,1,"i"],[120337,1,"j"],[120338,1,"k"],[120339,1,"l"],[120340,1,"m"],[120341,1,"n"],[120342,1,"o"],[120343,1,"p"],[120344,1,"q"],[120345,1,"r"],[120346,1,"s"],[120347,1,"t"],[120348,1,"u"],[120349,1,"v"],[120350,1,"w"],[120351,1,"x"],[120352,1,"y"],[120353,1,"z"],[120354,1,"a"],[120355,1,"b"],[120356,1,"c"],[120357,1,"d"],[120358,1,"e"],[120359,1,"f"],[120360,1,"g"],[120361,1,"h"],[120362,1,"i"],[120363,1,"j"],[120364,1,"k"],[120365,1,"l"],[120366,1,"m"],[120367,1,"n"],[120368,1,"o"],[120369,1,"p"],[120370,1,"q"],[120371,1,"r"],[120372,1,"s"],[120373,1,"t"],[120374,1,"u"],[120375,1,"v"],[120376,1,"w"],[120377,1,"x"],[120378,1,"y"],[120379,1,"z"],[120380,1,"a"],[120381,1,"b"],[120382,1,"c"],[120383,1,"d"],[120384,1,"e"],[120385,1,"f"],[120386,1,"g"],[120387,1,"h"],[120388,1,"i"],[120389,1,"j"],[120390,1,"k"],[120391,1,"l"],[120392,1,"m"],[120393,1,"n"],[120394,1,"o"],[120395,1,"p"],[120396,1,"q"],[120397,1,"r"],[120398,1,"s"],[120399,1,"t"],[120400,1,"u"],[120401,1,"v"],[120402,1,"w"],[120403,1,"x"],[120404,1,"y"],[120405,1,"z"],[120406,1,"a"],[120407,1,"b"],[120408,1,"c"],[120409,1,"d"],[120410,1,"e"],[120411,1,"f"],[120412,1,"g"],[120413,1,"h"],[120414,1,"i"],[120415,1,"j"],[120416,1,"k"],[120417,1,"l"],[120418,1,"m"],[120419,1,"n"],[120420,1,"o"],[120421,1,"p"],[120422,1,"q"],[120423,1,"r"],[120424,1,"s"],[120425,1,"t"],[120426,1,"u"],[120427,1,"v"],[120428,1,"w"],[120429,1,"x"],[120430,1,"y"],[120431,1,"z"],[120432,1,"a"],[120433,1,"b"],[120434,1,"c"],[120435,1,"d"],[120436,1,"e"],[120437,1,"f"],[120438,1,"g"],[120439,1,"h"],[120440,1,"i"],[120441,1,"j"],[120442,1,"k"],[120443,1,"l"],[120444,1,"m"],[120445,1,"n"],[120446,1,"o"],[120447,1,"p"],[120448,1,"q"],[120449,1,"r"],[120450,1,"s"],[120451,1,"t"],[120452,1,"u"],[120453,1,"v"],[120454,1,"w"],[120455,1,"x"],[120456,1,"y"],[120457,1,"z"],[120458,1,"a"],[120459,1,"b"],[120460,1,"c"],[120461,1,"d"],[120462,1,"e"],[120463,1,"f"],[120464,1,"g"],[120465,1,"h"],[120466,1,"i"],[120467,1,"j"],[120468,1,"k"],[120469,1,"l"],[120470,1,"m"],[120471,1,"n"],[120472,1,"o"],[120473,1,"p"],[120474,1,"q"],[120475,1,"r"],[120476,1,"s"],[120477,1,"t"],[120478,1,"u"],[120479,1,"v"],[120480,1,"w"],[120481,1,"x"],[120482,1,"y"],[120483,1,"z"],[120484,1,"ı"],[120485,1,"ȷ"],[[120486,120487],3],[120488,1,"α"],[120489,1,"β"],[120490,1,"γ"],[120491,1,"δ"],[120492,1,"ε"],[120493,1,"ζ"],[120494,1,"η"],[120495,1,"θ"],[120496,1,"ι"],[120497,1,"κ"],[120498,1,"λ"],[120499,1,"μ"],[120500,1,"ν"],[120501,1,"ξ"],[120502,1,"ο"],[120503,1,"π"],[120504,1,"ρ"],[120505,1,"θ"],[120506,1,"σ"],[120507,1,"τ"],[120508,1,"υ"],[120509,1,"φ"],[120510,1,"χ"],[120511,1,"ψ"],[120512,1,"ω"],[120513,1,"∇"],[120514,1,"α"],[120515,1,"β"],[120516,1,"γ"],[120517,1,"δ"],[120518,1,"ε"],[120519,1,"ζ"],[120520,1,"η"],[120521,1,"θ"],[120522,1,"ι"],[120523,1,"κ"],[120524,1,"λ"],[120525,1,"μ"],[120526,1,"ν"],[120527,1,"ξ"],[120528,1,"ο"],[120529,1,"π"],[120530,1,"ρ"],[[120531,120532],1,"σ"],[120533,1,"τ"],[120534,1,"υ"],[120535,1,"φ"],[120536,1,"χ"],[120537,1,"ψ"],[120538,1,"ω"],[120539,1,"∂"],[120540,1,"ε"],[120541,1,"θ"],[120542,1,"κ"],[120543,1,"φ"],[120544,1,"ρ"],[120545,1,"π"],[120546,1,"α"],[120547,1,"β"],[12054
8,1,"γ"],[120549,1,"δ"],[120550,1,"ε"],[120551,1,"ζ"],[120552,1,"η"],[120553,1,"θ"],[120554,1,"ι"],[120555,1,"κ"],[120556,1,"λ"],[120557,1,"μ"],[120558,1,"ν"],[120559,1,"ξ"],[120560,1,"ο"],[120561,1,"π"],[120562,1,"ρ"],[120563,1,"θ"],[120564,1,"σ"],[120565,1,"τ"],[120566,1,"υ"],[120567,1,"φ"],[120568,1,"χ"],[120569,1,"ψ"],[120570,1,"ω"],[120571,1,"∇"],[120572,1,"α"],[120573,1,"β"],[120574,1,"γ"],[120575,1,"δ"],[120576,1,"ε"],[120577,1,"ζ"],[120578,1,"η"],[120579,1,"θ"],[120580,1,"ι"],[120581,1,"κ"],[120582,1,"λ"],[120583,1,"μ"],[120584,1,"ν"],[120585,1,"ξ"],[120586,1,"ο"],[120587,1,"π"],[120588,1,"ρ"],[[120589,120590],1,"σ"],[120591,1,"τ"],[120592,1,"υ"],[120593,1,"φ"],[120594,1,"χ"],[120595,1,"ψ"],[120596,1,"ω"],[120597,1,"∂"],[120598,1,"ε"],[120599,1,"θ"],[120600,1,"κ"],[120601,1,"φ"],[120602,1,"ρ"],[120603,1,"π"],[120604,1,"α"],[120605,1,"β"],[120606,1,"γ"],[120607,1,"δ"],[120608,1,"ε"],[120609,1,"ζ"],[120610,1,"η"],[120611,1,"θ"],[120612,1,"ι"],[120613,1,"κ"],[120614,1,"λ"],[120615,1,"μ"],[120616,1,"ν"],[120617,1,"ξ"],[120618,1,"ο"],[120619,1,"π"],[120620,1,"ρ"],[120621,1,"θ"],[120622,1,"σ"],[120623,1,"τ"],[120624,1,"υ"],[120625,1,"φ"],[120626,1,"χ"],[120627,1,"ψ"],[120628,1,"ω"],[120629,1,"∇"],[120630,1,"α"],[120631,1,"β"],[120632,1,"γ"],[120633,1,"δ"],[120634,1,"ε"],[120635,1,"ζ"],[120636,1,"η"],[120637,1,"θ"],[120638,1,"ι"],[120639,1,"κ"],[120640,1,"λ"],[120641,1,"μ"],[120642,1,"ν"],[120643,1,"ξ"],[120644,1,"ο"],[120645,1,"π"],[120646,1,"ρ"],[[120647,120648],1,"σ"],[120649,1,"τ"],[120650,1,"υ"],[120651,1,"φ"],[120652,1,"χ"],[120653,1,"ψ"],[120654,1,"ω"],[120655,1,"∂"],[120656,1,"ε"],[120657,1,"θ"],[120658,1,"κ"],[120659,1,"φ"],[120660,1,"ρ"],[120661,1,"π"],[120662,1,"α"],[120663,1,"β"],[120664,1,"γ"],[120665,1,"δ"],[120666,1,"ε"],[120667,1,"ζ"],[120668,1,"η"],[120669,1,"θ"],[120670,1,"ι"],[120671,1,"κ"],[120672,1,"λ"],[120673,1,"μ"],[120674,1,"ν"],[120675,1,"ξ"],[120676,1,"ο"],[120677,1,"π"],[120678,1,"ρ"],[120679,1,"θ"],[120680,1,"σ"],[120681,1,"τ"],[120682,1,"υ"],[120683,1,"φ"],[120684,1,"χ"],[120685,1,"ψ"],[120686,1,"ω"],[120687,1,"∇"],[120688,1,"α"],[120689,1,"β"],[120690,1,"γ"],[120691,1,"δ"],[120692,1,"ε"],[120693,1,"ζ"],[120694,1,"η"],[120695,1,"θ"],[120696,1,"ι"],[120697,1,"κ"],[120698,1,"λ"],[120699,1,"μ"],[120700,1,"ν"],[120701,1,"ξ"],[120702,1,"ο"],[120703,1,"π"],[120704,1,"ρ"],[[120705,120706],1,"σ"],[120707,1,"τ"],[120708,1,"υ"],[120709,1,"φ"],[120710,1,"χ"],[120711,1,"ψ"],[120712,1,"ω"],[120713,1,"∂"],[120714,1,"ε"],[120715,1,"θ"],[120716,1,"κ"],[120717,1,"φ"],[120718,1,"ρ"],[120719,1,"π"],[120720,1,"α"],[120721,1,"β"],[120722,1,"γ"],[120723,1,"δ"],[120724,1,"ε"],[120725,1,"ζ"],[120726,1,"η"],[120727,1,"θ"],[120728,1,"ι"],[120729,1,"κ"],[120730,1,"λ"],[120731,1,"μ"],[120732,1,"ν"],[120733,1,"ξ"],[120734,1,"ο"],[120735,1,"π"],[120736,1,"ρ"],[120737,1,"θ"],[120738,1,"σ"],[120739,1,"τ"],[120740,1,"υ"],[120741,1,"φ"],[120742,1,"χ"],[120743,1,"ψ"],[120744,1,"ω"],[120745,1,"∇"],[120746,1,"α"],[120747,1,"β"],[120748,1,"γ"],[120749,1,"δ"],[120750,1,"ε"],[120751,1,"ζ"],[120752,1,"η"],[120753,1,"θ"],[120754,1,"ι"],[120755,1,"κ"],[120756,1,"λ"],[120757,1,"μ"],[120758,1,"ν"],[120759,1,"ξ"],[120760,1,"ο"],[120761,1,"π"],[120762,1,"ρ"],[[120763,120764],1,"σ"],[120765,1,"τ"],[120766,1,"υ"],[120767,1,"φ"],[120768,1,"χ"],[120769,1,"ψ"],[120770,1,"ω"],[120771,1,"∂"],[120772,1,"ε"],[120773,1,"θ"],[120774,1,"κ"],[120775,1,"φ"],[120776,1,"ρ"],[120777,1,"π"],[[120778,120779],1,"ϝ"],[[120780,120781],3],[120782,1,"0"],[120783,1,"1"],[120784,1,"2"],[120785,1,"3"],[120786,1,"4"],[120787,1,"5"],[
120788,1,"6"],[120789,1,"7"],[120790,1,"8"],[120791,1,"9"],[120792,1,"0"],[120793,1,"1"],[120794,1,"2"],[120795,1,"3"],[120796,1,"4"],[120797,1,"5"],[120798,1,"6"],[120799,1,"7"],[120800,1,"8"],[120801,1,"9"],[120802,1,"0"],[120803,1,"1"],[120804,1,"2"],[120805,1,"3"],[120806,1,"4"],[120807,1,"5"],[120808,1,"6"],[120809,1,"7"],[120810,1,"8"],[120811,1,"9"],[120812,1,"0"],[120813,1,"1"],[120814,1,"2"],[120815,1,"3"],[120816,1,"4"],[120817,1,"5"],[120818,1,"6"],[120819,1,"7"],[120820,1,"8"],[120821,1,"9"],[120822,1,"0"],[120823,1,"1"],[120824,1,"2"],[120825,1,"3"],[120826,1,"4"],[120827,1,"5"],[120828,1,"6"],[120829,1,"7"],[120830,1,"8"],[120831,1,"9"],[[120832,121343],2],[[121344,121398],2],[[121399,121402],2],[[121403,121452],2],[[121453,121460],2],[121461,2],[[121462,121475],2],[121476,2],[[121477,121483],2],[[121484,121498],3],[[121499,121503],2],[121504,3],[[121505,121519],2],[[121520,122879],3],[[122880,122886],2],[122887,3],[[122888,122904],2],[[122905,122906],3],[[122907,122913],2],[122914,3],[[122915,122916],2],[122917,3],[[122918,122922],2],[[122923,123135],3],[[123136,123180],2],[[123181,123183],3],[[123184,123197],2],[[123198,123199],3],[[123200,123209],2],[[123210,123213],3],[123214,2],[123215,2],[[123216,123583],3],[[123584,123641],2],[[123642,123646],3],[123647,2],[[123648,124927],3],[[124928,125124],2],[[125125,125126],3],[[125127,125135],2],[[125136,125142],2],[[125143,125183],3],[125184,1,"𞤢"],[125185,1,"𞤣"],[125186,1,"𞤤"],[125187,1,"𞤥"],[125188,1,"𞤦"],[125189,1,"𞤧"],[125190,1,"𞤨"],[125191,1,"𞤩"],[125192,1,"𞤪"],[125193,1,"𞤫"],[125194,1,"𞤬"],[125195,1,"𞤭"],[125196,1,"𞤮"],[125197,1,"𞤯"],[125198,1,"𞤰"],[125199,1,"𞤱"],[125200,1,"𞤲"],[125201,1,"𞤳"],[125202,1,"𞤴"],[125203,1,"𞤵"],[125204,1,"𞤶"],[125205,1,"𞤷"],[125206,1,"𞤸"],[125207,1,"𞤹"],[125208,1,"𞤺"],[125209,1,"𞤻"],[125210,1,"𞤼"],[125211,1,"𞤽"],[125212,1,"𞤾"],[125213,1,"𞤿"],[125214,1,"𞥀"],[125215,1,"𞥁"],[125216,1,"𞥂"],[125217,1,"𞥃"],[[125218,125258],2],[125259,2],[[125260,125263],3],[[125264,125273],2],[[125274,125277],3],[[125278,125279],2],[[125280,126064],3],[[126065,126132],2],[[126133,126208],3],[[126209,126269],2],[[126270,126463],3],[126464,1,"ا"],[126465,1,"ب"],[126466,1,"ج"],[126467,1,"د"],[126468,3],[126469,1,"و"],[126470,1,"ز"],[126471,1,"ح"],[126472,1,"ط"],[126473,1,"ي"],[126474,1,"ك"],[126475,1,"ل"],[126476,1,"م"],[126477,1,"ن"],[126478,1,"س"],[126479,1,"ع"],[126480,1,"ف"],[126481,1,"ص"],[126482,1,"ق"],[126483,1,"ر"],[126484,1,"ش"],[126485,1,"ت"],[126486,1,"ث"],[126487,1,"خ"],[126488,1,"ذ"],[126489,1,"ض"],[126490,1,"ظ"],[126491,1,"غ"],[126492,1,"ٮ"],[126493,1,"ں"],[126494,1,"ڡ"],[126495,1,"ٯ"],[126496,3],[126497,1,"ب"],[126498,1,"ج"],[126499,3],[126500,1,"ه"],[[126501,126502],3],[126503,1,"ح"],[126504,3],[126505,1,"ي"],[126506,1,"ك"],[126507,1,"ل"],[126508,1,"م"],[126509,1,"ن"],[126510,1,"س"],[126511,1,"ع"],[126512,1,"ف"],[126513,1,"ص"],[126514,1,"ق"],[126515,3],[126516,1,"ش"],[126517,1,"ت"],[126518,1,"ث"],[126519,1,"خ"],[126520,3],[126521,1,"ض"],[126522,3],[126523,1,"غ"],[[126524,126529],3],[126530,1,"ج"],[[126531,126534],3],[126535,1,"ح"],[126536,3],[126537,1,"ي"],[126538,3],[126539,1,"ل"],[126540,3],[126541,1,"ن"],[126542,1,"س"],[126543,1,"ع"],[126544,3],[126545,1,"ص"],[126546,1,"ق"],[126547,3],[126548,1,"ش"],[[126549,126550],3],[126551,1,"خ"],[126552,3],[126553,1,"ض"],[126554,3],[126555,1,"غ"],[126556,3],[126557,1,"ں"],[126558,3],[126559,1,"ٯ"],[126560,3],[126561,1,"ب"],[126562,1,"ج"],[126563,3],[126564,1,"ه"],[[126565,126566],3],[126567,1,"ح"],[126568,1,"ط"],[126569,1,"ي"],[126570,1,"ك"],[126571,3],[126572,1,"م
"],[126573,1,"ن"],[126574,1,"س"],[126575,1,"ع"],[126576,1,"ف"],[126577,1,"ص"],[126578,1,"ق"],[126579,3],[126580,1,"ش"],[126581,1,"ت"],[126582,1,"ث"],[126583,1,"خ"],[126584,3],[126585,1,"ض"],[126586,1,"ظ"],[126587,1,"غ"],[126588,1,"ٮ"],[126589,3],[126590,1,"ڡ"],[126591,3],[126592,1,"ا"],[126593,1,"ب"],[126594,1,"ج"],[126595,1,"د"],[126596,1,"ه"],[126597,1,"و"],[126598,1,"ز"],[126599,1,"ح"],[126600,1,"ط"],[126601,1,"ي"],[126602,3],[126603,1,"ل"],[126604,1,"م"],[126605,1,"ن"],[126606,1,"س"],[126607,1,"ع"],[126608,1,"ف"],[126609,1,"ص"],[126610,1,"ق"],[126611,1,"ر"],[126612,1,"ش"],[126613,1,"ت"],[126614,1,"ث"],[126615,1,"خ"],[126616,1,"ذ"],[126617,1,"ض"],[126618,1,"ظ"],[126619,1,"غ"],[[126620,126624],3],[126625,1,"ب"],[126626,1,"ج"],[126627,1,"د"],[126628,3],[126629,1,"و"],[126630,1,"ز"],[126631,1,"ح"],[126632,1,"ط"],[126633,1,"ي"],[126634,3],[126635,1,"ل"],[126636,1,"م"],[126637,1,"ن"],[126638,1,"س"],[126639,1,"ع"],[126640,1,"ف"],[126641,1,"ص"],[126642,1,"ق"],[126643,1,"ر"],[126644,1,"ش"],[126645,1,"ت"],[126646,1,"ث"],[126647,1,"خ"],[126648,1,"ذ"],[126649,1,"ض"],[126650,1,"ظ"],[126651,1,"غ"],[[126652,126703],3],[[126704,126705],2],[[126706,126975],3],[[126976,127019],2],[[127020,127023],3],[[127024,127123],2],[[127124,127135],3],[[127136,127150],2],[[127151,127152],3],[[127153,127166],2],[127167,2],[127168,3],[[127169,127183],2],[127184,3],[[127185,127199],2],[[127200,127221],2],[[127222,127231],3],[127232,3],[127233,5,"0,"],[127234,5,"1,"],[127235,5,"2,"],[127236,5,"3,"],[127237,5,"4,"],[127238,5,"5,"],[127239,5,"6,"],[127240,5,"7,"],[127241,5,"8,"],[127242,5,"9,"],[[127243,127244],2],[[127245,127247],2],[127248,5,"(a)"],[127249,5,"(b)"],[127250,5,"(c)"],[127251,5,"(d)"],[127252,5,"(e)"],[127253,5,"(f)"],[127254,5,"(g)"],[127255,5,"(h)"],[127256,5,"(i)"],[127257,5,"(j)"],[127258,5,"(k)"],[127259,5,"(l)"],[127260,5,"(m)"],[127261,5,"(n)"],[127262,5,"(o)"],[127263,5,"(p)"],[127264,5,"(q)"],[127265,5,"(r)"],[127266,5,"(s)"],[127267,5,"(t)"],[127268,5,"(u)"],[127269,5,"(v)"],[127270,5,"(w)"],[127271,5,"(x)"],[127272,5,"(y)"],[127273,5,"(z)"],[127274,1,"〔s〕"],[127275,1,"c"],[127276,1,"r"],[127277,1,"cd"],[127278,1,"wz"],[127279,2],[127280,1,"a"],[127281,1,"b"],[127282,1,"c"],[127283,1,"d"],[127284,1,"e"],[127285,1,"f"],[127286,1,"g"],[127287,1,"h"],[127288,1,"i"],[127289,1,"j"],[127290,1,"k"],[127291,1,"l"],[127292,1,"m"],[127293,1,"n"],[127294,1,"o"],[127295,1,"p"],[127296,1,"q"],[127297,1,"r"],[127298,1,"s"],[127299,1,"t"],[127300,1,"u"],[127301,1,"v"],[127302,1,"w"],[127303,1,"x"],[127304,1,"y"],[127305,1,"z"],[127306,1,"hv"],[127307,1,"mv"],[127308,1,"sd"],[127309,1,"ss"],[127310,1,"ppv"],[127311,1,"wc"],[[127312,127318],2],[127319,2],[[127320,127326],2],[127327,2],[[127328,127337],2],[127338,1,"mc"],[127339,1,"md"],[127340,1,"mr"],[[127341,127343],2],[[127344,127352],2],[127353,2],[127354,2],[[127355,127356],2],[[127357,127358],2],[127359,2],[[127360,127369],2],[[127370,127373],2],[[127374,127375],2],[127376,1,"dj"],[[127377,127386],2],[[127387,127404],2],[127405,2],[[127406,127461],3],[[127462,127487],2],[127488,1,"ほか"],[127489,1,"ココ"],[127490,1,"サ"],[[127491,127503],3],[127504,1,"手"],[127505,1,"字"],[127506,1,"双"],[127507,1,"デ"],[127508,1,"二"],[127509,1,"多"],[127510,1,"解"],[127511,1,"天"],[127512,1,"交"],[127513,1,"映"],[127514,1,"無"],[127515,1,"料"],[127516,1,"前"],[127517,1,"後"],[127518,1,"再"],[127519,1,"新"],[127520,1,"初"],[127521,1,"終"],[127522,1,"生"],[127523,1,"販"],[127524,1,"声"],[127525,1,"吹"],[127526,1,"演"],[127527,1,"投"],[127528,1,"捕"],[127529,1,"一"],[127530,1,"三"],[127531,1,"遊"],[127532,1,
"左"],[127533,1,"中"],[127534,1,"右"],[127535,1,"指"],[127536,1,"走"],[127537,1,"打"],[127538,1,"禁"],[127539,1,"空"],[127540,1,"合"],[127541,1,"満"],[127542,1,"有"],[127543,1,"月"],[127544,1,"申"],[127545,1,"割"],[127546,1,"営"],[127547,1,"配"],[[127548,127551],3],[127552,1,"〔本〕"],[127553,1,"〔三〕"],[127554,1,"〔二〕"],[127555,1,"〔安〕"],[127556,1,"〔点〕"],[127557,1,"〔打〕"],[127558,1,"〔盗〕"],[127559,1,"〔勝〕"],[127560,1,"〔敗〕"],[[127561,127567],3],[127568,1,"得"],[127569,1,"可"],[[127570,127583],3],[[127584,127589],2],[[127590,127743],3],[[127744,127776],2],[[127777,127788],2],[[127789,127791],2],[[127792,127797],2],[127798,2],[[127799,127868],2],[127869,2],[[127870,127871],2],[[127872,127891],2],[[127892,127903],2],[[127904,127940],2],[127941,2],[[127942,127946],2],[[127947,127950],2],[[127951,127955],2],[[127956,127967],2],[[127968,127984],2],[[127985,127991],2],[[127992,127999],2],[[128000,128062],2],[128063,2],[128064,2],[128065,2],[[128066,128247],2],[128248,2],[[128249,128252],2],[[128253,128254],2],[128255,2],[[128256,128317],2],[[128318,128319],2],[[128320,128323],2],[[128324,128330],2],[[128331,128335],2],[[128336,128359],2],[[128360,128377],2],[128378,2],[[128379,128419],2],[128420,2],[[128421,128506],2],[[128507,128511],2],[128512,2],[[128513,128528],2],[128529,2],[[128530,128532],2],[128533,2],[128534,2],[128535,2],[128536,2],[128537,2],[128538,2],[128539,2],[[128540,128542],2],[128543,2],[[128544,128549],2],[[128550,128551],2],[[128552,128555],2],[128556,2],[128557,2],[[128558,128559],2],[[128560,128563],2],[128564,2],[[128565,128576],2],[[128577,128578],2],[[128579,128580],2],[[128581,128591],2],[[128592,128639],2],[[128640,128709],2],[[128710,128719],2],[128720,2],[[128721,128722],2],[[128723,128724],2],[128725,2],[[128726,128727],2],[[128728,128735],3],[[128736,128748],2],[[128749,128751],3],[[128752,128755],2],[[128756,128758],2],[[128759,128760],2],[128761,2],[128762,2],[[128763,128764],2],[[128765,128767],3],[[128768,128883],2],[[128884,128895],3],[[128896,128980],2],[[128981,128984],2],[[128985,128991],3],[[128992,129003],2],[[129004,129023],3],[[129024,129035],2],[[129036,129039],3],[[129040,129095],2],[[129096,129103],3],[[129104,129113],2],[[129114,129119],3],[[129120,129159],2],[[129160,129167],3],[[129168,129197],2],[[129198,129199],3],[[129200,129201],2],[[129202,129279],3],[[129280,129291],2],[129292,2],[[129293,129295],2],[[129296,129304],2],[[129305,129310],2],[129311,2],[[129312,129319],2],[[129320,129327],2],[129328,2],[[129329,129330],2],[[129331,129342],2],[129343,2],[[129344,129355],2],[129356,2],[[129357,129359],2],[[129360,129374],2],[[129375,129387],2],[[129388,129392],2],[129393,2],[129394,2],[[129395,129398],2],[[129399,129400],2],[129401,3],[129402,2],[129403,2],[[129404,129407],2],[[129408,129412],2],[[129413,129425],2],[[129426,129431],2],[[129432,129442],2],[[129443,129444],2],[[129445,129450],2],[[129451,129453],2],[[129454,129455],2],[[129456,129465],2],[[129466,129471],2],[129472,2],[[129473,129474],2],[[129475,129482],2],[129483,2],[129484,3],[[129485,129487],2],[[129488,129510],2],[[129511,129535],2],[[129536,129619],2],[[129620,129631],3],[[129632,129645],2],[[129646,129647],3],[[129648,129651],2],[129652,2],[[129653,129655],3],[[129656,129658],2],[[129659,129663],3],[[129664,129666],2],[[129667,129670],2],[[129671,129679],3],[[129680,129685],2],[[129686,129704],2],[[129705,129711],3],[[129712,129718],2],[[129719,129727],3],[[129728,129730],2],[[129731,129743],3],[[129744,129750],2],[[129751,129791],3],[[129792,129938],2],[129939,3],[[129940,129994],2],[[129995,130031],3],
[130032,1,"0"],[130033,1,"1"],[130034,1,"2"],[130035,1,"3"],[130036,1,"4"],[130037,1,"5"],[130038,1,"6"],[130039,1,"7"],[130040,1,"8"],[130041,1,"9"],[[130042,131069],3],[[131070,131071],3],[[131072,173782],2],[[173783,173789],2],[[173790,173823],3],[[173824,177972],2],[[177973,177983],3],[[177984,178205],2],[[178206,178207],3],[[178208,183969],2],[[183970,183983],3],[[183984,191456],2],[[191457,194559],3],[194560,1,"丽"],[194561,1,"丸"],[194562,1,"乁"],[194563,1,"𠄢"],[194564,1,"你"],[194565,1,"侮"],[194566,1,"侻"],[194567,1,"倂"],[194568,1,"偺"],[194569,1,"備"],[194570,1,"僧"],[194571,1,"像"],[194572,1,"㒞"],[194573,1,"𠘺"],[194574,1,"免"],[194575,1,"兔"],[194576,1,"兤"],[194577,1,"具"],[194578,1,"𠔜"],[194579,1,"㒹"],[194580,1,"內"],[194581,1,"再"],[194582,1,"𠕋"],[194583,1,"冗"],[194584,1,"冤"],[194585,1,"仌"],[194586,1,"冬"],[194587,1,"况"],[194588,1,"𩇟"],[194589,1,"凵"],[194590,1,"刃"],[194591,1,"㓟"],[194592,1,"刻"],[194593,1,"剆"],[194594,1,"割"],[194595,1,"剷"],[194596,1,"㔕"],[194597,1,"勇"],[194598,1,"勉"],[194599,1,"勤"],[194600,1,"勺"],[194601,1,"包"],[194602,1,"匆"],[194603,1,"北"],[194604,1,"卉"],[194605,1,"卑"],[194606,1,"博"],[194607,1,"即"],[194608,1,"卽"],[[194609,194611],1,"卿"],[194612,1,"𠨬"],[194613,1,"灰"],[194614,1,"及"],[194615,1,"叟"],[194616,1,"𠭣"],[194617,1,"叫"],[194618,1,"叱"],[194619,1,"吆"],[194620,1,"咞"],[194621,1,"吸"],[194622,1,"呈"],[194623,1,"周"],[194624,1,"咢"],[194625,1,"哶"],[194626,1,"唐"],[194627,1,"啓"],[194628,1,"啣"],[[194629,194630],1,"善"],[194631,1,"喙"],[194632,1,"喫"],[194633,1,"喳"],[194634,1,"嗂"],[194635,1,"圖"],[194636,1,"嘆"],[194637,1,"圗"],[194638,1,"噑"],[194639,1,"噴"],[194640,1,"切"],[194641,1,"壮"],[194642,1,"城"],[194643,1,"埴"],[194644,1,"堍"],[194645,1,"型"],[194646,1,"堲"],[194647,1,"報"],[194648,1,"墬"],[194649,1,"𡓤"],[194650,1,"売"],[194651,1,"壷"],[194652,1,"夆"],[194653,1,"多"],[194654,1,"夢"],[194655,1,"奢"],[194656,1,"𡚨"],[194657,1,"𡛪"],[194658,1,"姬"],[194659,1,"娛"],[194660,1,"娧"],[194661,1,"姘"],[194662,1,"婦"],[194663,1,"㛮"],[194664,3],[194665,1,"嬈"],[[194666,194667],1,"嬾"],[194668,1,"𡧈"],[194669,1,"寃"],[194670,1,"寘"],[194671,1,"寧"],[194672,1,"寳"],[194673,1,"𡬘"],[194674,1,"寿"],[194675,1,"将"],[194676,3],[194677,1,"尢"],[194678,1,"㞁"],[194679,1,"屠"],[194680,1,"屮"],[194681,1,"峀"],[194682,1,"岍"],[194683,1,"𡷤"],[194684,1,"嵃"],[194685,1,"𡷦"],[194686,1,"嵮"],[194687,1,"嵫"],[194688,1,"嵼"],[194689,1,"巡"],[194690,1,"巢"],[194691,1,"㠯"],[194692,1,"巽"],[194693,1,"帨"],[194694,1,"帽"],[194695,1,"幩"],[194696,1,"㡢"],[194697,1,"𢆃"],[194698,1,"㡼"],[194699,1,"庰"],[194700,1,"庳"],[194701,1,"庶"],[194702,1,"廊"],[194703,1,"𪎒"],[194704,1,"廾"],[[194705,194706],1,"𢌱"],[194707,1,"舁"],[[194708,194709],1,"弢"],[194710,1,"㣇"],[194711,1,"𣊸"],[194712,1,"𦇚"],[194713,1,"形"],[194714,1,"彫"],[194715,1,"㣣"],[194716,1,"徚"],[194717,1,"忍"],[194718,1,"志"],[194719,1,"忹"],[194720,1,"悁"],[194721,1,"㤺"],[194722,1,"㤜"],[194723,1,"悔"],[194724,1,"𢛔"],[194725,1,"惇"],[194726,1,"慈"],[194727,1,"慌"],[194728,1,"慎"],[194729,1,"慌"],[194730,1,"慺"],[194731,1,"憎"],[194732,1,"憲"],[194733,1,"憤"],[194734,1,"憯"],[194735,1,"懞"],[194736,1,"懲"],[194737,1,"懶"],[194738,1,"成"],[194739,1,"戛"],[194740,1,"扝"],[194741,1,"抱"],[194742,1,"拔"],[194743,1,"捐"],[194744,1,"𢬌"],[194745,1,"挽"],[194746,1,"拼"],[194747,1,"捨"],[194748,1,"掃"],[194749,1,"揤"],[194750,1,"𢯱"],[194751,1,"搢"],[194752,1,"揅"],[194753,1,"掩"],[194754,1,"㨮"],[194755,1,"摩"],[194756,1,"摾"],[194757,1,"撝"],[194758,1,"摷"],[194759,1,"㩬"],[194760,1,"敏"],[194761,1,"敬"],[194762,1,"𣀊"],[194763,1,"旣"],[194764,1,"書"],[194765,1,"晉"],[194766,1,"㬙"],[194767,1,"暑"],[194768,1,"㬈"],[194769,1,"㫤"],[194770,1,"冒"],[194771,1,"冕"],[194772,1,"最"],[19
4773,1,"暜"],[194774,1,"肭"],[194775,1,"䏙"],[194776,1,"朗"],[194777,1,"望"],[194778,1,"朡"],[194779,1,"杞"],[194780,1,"杓"],[194781,1,"𣏃"],[194782,1,"㭉"],[194783,1,"柺"],[194784,1,"枅"],[194785,1,"桒"],[194786,1,"梅"],[194787,1,"𣑭"],[194788,1,"梎"],[194789,1,"栟"],[194790,1,"椔"],[194791,1,"㮝"],[194792,1,"楂"],[194793,1,"榣"],[194794,1,"槪"],[194795,1,"檨"],[194796,1,"𣚣"],[194797,1,"櫛"],[194798,1,"㰘"],[194799,1,"次"],[194800,1,"𣢧"],[194801,1,"歔"],[194802,1,"㱎"],[194803,1,"歲"],[194804,1,"殟"],[194805,1,"殺"],[194806,1,"殻"],[194807,1,"𣪍"],[194808,1,"𡴋"],[194809,1,"𣫺"],[194810,1,"汎"],[194811,1,"𣲼"],[194812,1,"沿"],[194813,1,"泍"],[194814,1,"汧"],[194815,1,"洖"],[194816,1,"派"],[194817,1,"海"],[194818,1,"流"],[194819,1,"浩"],[194820,1,"浸"],[194821,1,"涅"],[194822,1,"𣴞"],[194823,1,"洴"],[194824,1,"港"],[194825,1,"湮"],[194826,1,"㴳"],[194827,1,"滋"],[194828,1,"滇"],[194829,1,"𣻑"],[194830,1,"淹"],[194831,1,"潮"],[194832,1,"𣽞"],[194833,1,"𣾎"],[194834,1,"濆"],[194835,1,"瀹"],[194836,1,"瀞"],[194837,1,"瀛"],[194838,1,"㶖"],[194839,1,"灊"],[194840,1,"災"],[194841,1,"灷"],[194842,1,"炭"],[194843,1,"𠔥"],[194844,1,"煅"],[194845,1,"𤉣"],[194846,1,"熜"],[194847,3],[194848,1,"爨"],[194849,1,"爵"],[194850,1,"牐"],[194851,1,"𤘈"],[194852,1,"犀"],[194853,1,"犕"],[194854,1,"𤜵"],[194855,1,"𤠔"],[194856,1,"獺"],[194857,1,"王"],[194858,1,"㺬"],[194859,1,"玥"],[[194860,194861],1,"㺸"],[194862,1,"瑇"],[194863,1,"瑜"],[194864,1,"瑱"],[194865,1,"璅"],[194866,1,"瓊"],[194867,1,"㼛"],[194868,1,"甤"],[194869,1,"𤰶"],[194870,1,"甾"],[194871,1,"𤲒"],[194872,1,"異"],[194873,1,"𢆟"],[194874,1,"瘐"],[194875,1,"𤾡"],[194876,1,"𤾸"],[194877,1,"𥁄"],[194878,1,"㿼"],[194879,1,"䀈"],[194880,1,"直"],[194881,1,"𥃳"],[194882,1,"𥃲"],[194883,1,"𥄙"],[194884,1,"𥄳"],[194885,1,"眞"],[[194886,194887],1,"真"],[194888,1,"睊"],[194889,1,"䀹"],[194890,1,"瞋"],[194891,1,"䁆"],[194892,1,"䂖"],[194893,1,"𥐝"],[194894,1,"硎"],[194895,1,"碌"],[194896,1,"磌"],[194897,1,"䃣"],[194898,1,"𥘦"],[194899,1,"祖"],[194900,1,"𥚚"],[194901,1,"𥛅"],[194902,1,"福"],[194903,1,"秫"],[194904,1,"䄯"],[194905,1,"穀"],[194906,1,"穊"],[194907,1,"穏"],[194908,1,"𥥼"],[[194909,194910],1,"𥪧"],[194911,3],[194912,1,"䈂"],[194913,1,"𥮫"],[194914,1,"篆"],[194915,1,"築"],[194916,1,"䈧"],[194917,1,"𥲀"],[194918,1,"糒"],[194919,1,"䊠"],[194920,1,"糨"],[194921,1,"糣"],[194922,1,"紀"],[194923,1,"𥾆"],[194924,1,"絣"],[194925,1,"䌁"],[194926,1,"緇"],[194927,1,"縂"],[194928,1,"繅"],[194929,1,"䌴"],[194930,1,"𦈨"],[194931,1,"𦉇"],[194932,1,"䍙"],[194933,1,"𦋙"],[194934,1,"罺"],[194935,1,"𦌾"],[194936,1,"羕"],[194937,1,"翺"],[194938,1,"者"],[194939,1,"𦓚"],[194940,1,"𦔣"],[194941,1,"聠"],[194942,1,"𦖨"],[194943,1,"聰"],[194944,1,"𣍟"],[194945,1,"䏕"],[194946,1,"育"],[194947,1,"脃"],[194948,1,"䐋"],[194949,1,"脾"],[194950,1,"媵"],[194951,1,"𦞧"],[194952,1,"𦞵"],[194953,1,"𣎓"],[194954,1,"𣎜"],[194955,1,"舁"],[194956,1,"舄"],[194957,1,"辞"],[194958,1,"䑫"],[194959,1,"芑"],[194960,1,"芋"],[194961,1,"芝"],[194962,1,"劳"],[194963,1,"花"],[194964,1,"芳"],[194965,1,"芽"],[194966,1,"苦"],[194967,1,"𦬼"],[194968,1,"若"],[194969,1,"茝"],[194970,1,"荣"],[194971,1,"莭"],[194972,1,"茣"],[194973,1,"莽"],[194974,1,"菧"],[194975,1,"著"],[194976,1,"荓"],[194977,1,"菊"],[194978,1,"菌"],[194979,1,"菜"],[194980,1,"𦰶"],[194981,1,"𦵫"],[194982,1,"𦳕"],[194983,1,"䔫"],[194984,1,"蓱"],[194985,1,"蓳"],[194986,1,"蔖"],[194987,1,"𧏊"],[194988,1,"蕤"],[194989,1,"𦼬"],[194990,1,"䕝"],[194991,1,"䕡"],[194992,1,"𦾱"],[194993,1,"𧃒"],[194994,1,"䕫"],[194995,1,"虐"],[194996,1,"虜"],[194997,1,"虧"],[194998,1,"虩"],[194999,1,"蚩"],[195000,1,"蚈"],[195001,1,"蜎"],[195002,1,"蛢"],[195003,1,"蝹"],[195004,1,"蜨"],[195005,1,"蝫"],[195006,1,"螆"],[195007,3],[195008,1,"蟡"],[195009,1,"蠁"],[195010,1,"䗹"],[195011,1,"衠"],[19
5012,1,"衣"],[195013,1,"𧙧"],[195014,1,"裗"],[195015,1,"裞"],[195016,1,"䘵"],[195017,1,"裺"],[195018,1,"㒻"],[195019,1,"𧢮"],[195020,1,"𧥦"],[195021,1,"䚾"],[195022,1,"䛇"],[195023,1,"誠"],[195024,1,"諭"],[195025,1,"變"],[195026,1,"豕"],[195027,1,"𧲨"],[195028,1,"貫"],[195029,1,"賁"],[195030,1,"贛"],[195031,1,"起"],[195032,1,"𧼯"],[195033,1,"𠠄"],[195034,1,"跋"],[195035,1,"趼"],[195036,1,"跰"],[195037,1,"𠣞"],[195038,1,"軔"],[195039,1,"輸"],[195040,1,"𨗒"],[195041,1,"𨗭"],[195042,1,"邔"],[195043,1,"郱"],[195044,1,"鄑"],[195045,1,"𨜮"],[195046,1,"鄛"],[195047,1,"鈸"],[195048,1,"鋗"],[195049,1,"鋘"],[195050,1,"鉼"],[195051,1,"鏹"],[195052,1,"鐕"],[195053,1,"𨯺"],[195054,1,"開"],[195055,1,"䦕"],[195056,1,"閷"],[195057,1,"𨵷"],[195058,1,"䧦"],[195059,1,"雃"],[195060,1,"嶲"],[195061,1,"霣"],[195062,1,"𩅅"],[195063,1,"𩈚"],[195064,1,"䩮"],[195065,1,"䩶"],[195066,1,"韠"],[195067,1,"𩐊"],[195068,1,"䪲"],[195069,1,"𩒖"],[[195070,195071],1,"頋"],[195072,1,"頩"],[195073,1,"𩖶"],[195074,1,"飢"],[195075,1,"䬳"],[195076,1,"餩"],[195077,1,"馧"],[195078,1,"駂"],[195079,1,"駾"],[195080,1,"䯎"],[195081,1,"𩬰"],[195082,1,"鬒"],[195083,1,"鱀"],[195084,1,"鳽"],[195085,1,"䳎"],[195086,1,"䳭"],[195087,1,"鵧"],[195088,1,"𪃎"],[195089,1,"䳸"],[195090,1,"𪄅"],[195091,1,"𪈎"],[195092,1,"𪊑"],[195093,1,"麻"],[195094,1,"䵖"],[195095,1,"黹"],[195096,1,"黾"],[195097,1,"鼅"],[195098,1,"鼏"],[195099,1,"鼖"],[195100,1,"鼻"],[195101,1,"𪘀"],[[195102,196605],3],[[196606,196607],3],[[196608,201546],2],[[201547,262141],3],[[262142,262143],3],[[262144,327677],3],[[327678,327679],3],[[327680,393213],3],[[393214,393215],3],[[393216,458749],3],[[458750,458751],3],[[458752,524285],3],[[524286,524287],3],[[524288,589821],3],[[589822,589823],3],[[589824,655357],3],[[655358,655359],3],[[655360,720893],3],[[720894,720895],3],[[720896,786429],3],[[786430,786431],3],[[786432,851965],3],[[851966,851967],3],[[851968,917501],3],[[917502,917503],3],[917504,3],[917505,3],[[917506,917535],3],[[917536,917631],3],[[917632,917759],3],[[917760,917999],7],[[918000,983037],3],[[983038,983039],3],[[983040,1048573],3],[[1048574,1048575],3],[[1048576,1114109],3],[[1114110,1114111],3]] \ No newline at end of file diff --git a/node_modules/tr46/lib/regexes.js b/node_modules/tr46/lib/regexes.js new file mode 100644 index 000000000..9728941d3 --- /dev/null +++ b/node_modules/tr46/lib/regexes.js @@ -0,0 +1,29 @@ +"use strict"; + +const combiningMarks = 
/[\u0300-\u036F\u0483-\u0489\u0591-\u05BD\u05BF\u05C1\u05C2\u05C4\u05C5\u05C7\u0610-\u061A\u064B-\u065F\u0670\u06D6-\u06DC\u06DF-\u06E4\u06E7\u06E8\u06EA-\u06ED\u0711\u0730-\u074A\u07A6-\u07B0\u07EB-\u07F3\u07FD\u0816-\u0819\u081B-\u0823\u0825-\u0827\u0829-\u082D\u0859-\u085B\u08D3-\u08E1\u08E3-\u0903\u093A-\u093C\u093E-\u094F\u0951-\u0957\u0962\u0963\u0981-\u0983\u09BC\u09BE-\u09C4\u09C7\u09C8\u09CB-\u09CD\u09D7\u09E2\u09E3\u09FE\u0A01-\u0A03\u0A3C\u0A3E-\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A70\u0A71\u0A75\u0A81-\u0A83\u0ABC\u0ABE-\u0AC5\u0AC7-\u0AC9\u0ACB-\u0ACD\u0AE2\u0AE3\u0AFA-\u0AFF\u0B01-\u0B03\u0B3C\u0B3E-\u0B44\u0B47\u0B48\u0B4B-\u0B4D\u0B55-\u0B57\u0B62\u0B63\u0B82\u0BBE-\u0BC2\u0BC6-\u0BC8\u0BCA-\u0BCD\u0BD7\u0C00-\u0C04\u0C3E-\u0C44\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C62\u0C63\u0C81-\u0C83\u0CBC\u0CBE-\u0CC4\u0CC6-\u0CC8\u0CCA-\u0CCD\u0CD5\u0CD6\u0CE2\u0CE3\u0D00-\u0D03\u0D3B\u0D3C\u0D3E-\u0D44\u0D46-\u0D48\u0D4A-\u0D4D\u0D57\u0D62\u0D63\u0D81-\u0D83\u0DCA\u0DCF-\u0DD4\u0DD6\u0DD8-\u0DDF\u0DF2\u0DF3\u0E31\u0E34-\u0E3A\u0E47-\u0E4E\u0EB1\u0EB4-\u0EBC\u0EC8-\u0ECD\u0F18\u0F19\u0F35\u0F37\u0F39\u0F3E\u0F3F\u0F71-\u0F84\u0F86\u0F87\u0F8D-\u0F97\u0F99-\u0FBC\u0FC6\u102B-\u103E\u1056-\u1059\u105E-\u1060\u1062-\u1064\u1067-\u106D\u1071-\u1074\u1082-\u108D\u108F\u109A-\u109D\u135D-\u135F\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17B4-\u17D3\u17DD\u180B-\u180D\u1885\u1886\u18A9\u1920-\u192B\u1930-\u193B\u1A17-\u1A1B\u1A55-\u1A5E\u1A60-\u1A7C\u1A7F\u1AB0-\u1AC0\u1B00-\u1B04\u1B34-\u1B44\u1B6B-\u1B73\u1B80-\u1B82\u1BA1-\u1BAD\u1BE6-\u1BF3\u1C24-\u1C37\u1CD0-\u1CD2\u1CD4-\u1CE8\u1CED\u1CF4\u1CF7-\u1CF9\u1DC0-\u1DF9\u1DFB-\u1DFF\u20D0-\u20F0\u2CEF-\u2CF1\u2D7F\u2DE0-\u2DFF\u302A-\u302F\u3099\u309A\uA66F-\uA672\uA674-\uA67D\uA69E\uA69F\uA6F0\uA6F1\uA802\uA806\uA80B\uA823-\uA827\uA82C\uA880\uA881\uA8B4-\uA8C5\uA8E0-\uA8F1\uA8FF\uA926-\uA92D\uA947-\uA953\uA980-\uA983\uA9B3-\uA9C0\uA9E5\uAA29-\uAA36\uAA43\uAA4C\uAA4D\uAA7B-\uAA7D\uAAB0\uAAB2-\uAAB4\uAAB7\uAAB8\uAABE\uAABF\uAAC1\uAAEB-\uAAEF\uAAF5\uAAF6\uABE3-\uABEA\uABEC\uABED\uFB1E\uFE00-\uFE0F\uFE20-\uFE2F\u{101FD}\u{102E0}\u{10376}-\u{1037A}\u{10A01}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A0F}\u{10A38}-\u{10A3A}\u{10A3F}\u{10AE5}\u{10AE6}\u{10D24}-\u{10D27}\u{10EAB}\u{10EAC}\u{10F46}-\u{10F50}\u{11000}-\u{11002}\u{11038}-\u{11046}\u{1107F}-\u{11082}\u{110B0}-\u{110BA}\u{11100}-\u{11102}\u{11127}-\u{11134}\u{11145}\u{11146}\u{11173}\u{11180}-\u{11182}\u{111B3}-\u{111C0}\u{111C9}-\u{111CC}\u{111CE}\u{111CF}\u{1122C}-\u{11237}\u{1123E}\u{112DF}-\u{112EA}\u{11300}-\u{11303}\u{1133B}\u{1133C}\u{1133E}-\u{11344}\u{11347}\u{11348}\u{1134B}-\u{1134D}\u{11357}\u{11362}\u{11363}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11435}-\u{11446}\u{1145E}\u{114B0}-\u{114C3}\u{115AF}-\u{115B5}\u{115B8}-\u{115C0}\u{115DC}\u{115DD}\u{11630}-\u{11640}\u{116AB}-\u{116B7}\u{1171D}-\u{1172B}\u{1182C}-\u{1183A}\u{11930}-\u{11935}\u{11937}\u{11938}\u{1193B}-\u{1193E}\u{11940}\u{11942}\u{11943}\u{119D1}-\u{119D7}\u{119DA}-\u{119E0}\u{119E4}\u{11A01}-\u{11A0A}\u{11A33}-\u{11A39}\u{11A3B}-\u{11A3E}\u{11A47}\u{11A51}-\u{11A5B}\u{11A8A}-\u{11A99}\u{11C2F}-\u{11C36}\u{11C38}-\u{11C3F}\u{11C92}-\u{11CA7}\u{11CA9}-\u{11CB6}\u{11D31}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D45}\u{11D47}\u{11D8A}-\u{11D8E}\u{11D90}\u{11D91}\u{11D93}-\u{11D97}\u{11EF3}-\u{11EF6}\u{16AF0}-\u{16AF4}\u{16B30}-\u{16B36}\u{16F4F}\u{16F51}-\u{16F87}\u{16F8F}-\u{16F92}\u{16FE4}\u{16FF0}\u{16FF1}\u{1BC9D}\u{1BC9E}\u{1D165}-\u{1D169}\u{1D16D}-\u{1D172}\u{1D17B}-\u{1D182}\u{1D185
}-\u{1D18B}\u{1D1AA}-\u{1D1AD}\u{1D242}-\u{1D244}\u{1DA00}-\u{1DA36}\u{1DA3B}-\u{1DA6C}\u{1DA75}\u{1DA84}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E130}-\u{1E136}\u{1E2EC}-\u{1E2EF}\u{1E8D0}-\u{1E8D6}\u{1E944}-\u{1E94A}\u{E0100}-\u{E01EF}]/u; +const combiningClassVirama = /[\u094D\u09CD\u0A4D\u0ACD\u0B4D\u0BCD\u0C4D\u0CCD\u0D3B\u0D3C\u0D4D\u0DCA\u0E3A\u0EBA\u0F84\u1039\u103A\u1714\u1734\u17D2\u1A60\u1B44\u1BAA\u1BAB\u1BF2\u1BF3\u2D7F\uA806\uA8C4\uA953\uA9C0\uAAF6\uABED\u{10A3F}\u{11046}\u{1107F}\u{110B9}\u{11133}\u{11134}\u{111C0}\u{11235}\u{112EA}\u{1134D}\u{11442}\u{114C2}\u{115BF}\u{1163F}\u{116B6}\u{1172B}\u{11839}\u{119E0}\u{11A34}\u{11A47}\u{11A99}\u{11C3F}\u{11D44}\u{11D45}\u{11D97}]/u; +const validZWNJ = /[\u0620\u0626\u0628\u062A-\u062E\u0633-\u063F\u0641-\u0647\u0649\u064A\u066E\u066F\u0678-\u0687\u069A-\u06BF\u06C1\u06C2\u06CC\u06CE\u06D0\u06D1\u06FA-\u06FC\u06FF\u0712-\u0714\u071A-\u071D\u071F-\u0727\u0729\u072B\u072D\u072E\u074E-\u0758\u075C-\u076A\u076D-\u0770\u0772\u0775-\u0777\u077A-\u077F\u07CA-\u07EA\u0841-\u0845\u0848\u084A-\u0853\u0855\u0860\u0862-\u0865\u0868\u08A0-\u08A9\u08AF\u08B0\u08B3\u08B4\u08B6-\u08B8\u08BA-\u08BD\u1807\u1820-\u1878\u1887-\u18A8\u18AA\uA840-\uA872\u{10AC0}-\u{10AC4}\u{10ACD}\u{10AD3}-\u{10ADC}\u{10ADE}-\u{10AE0}\u{10AEB}-\u{10AEE}\u{10B80}\u{10B82}\u{10B86}-\u{10B88}\u{10B8A}\u{10B8B}\u{10B8D}\u{10B90}\u{10BAD}\u{10BAE}\u{10D00}-\u{10D21}\u{10D23}\u{10F30}-\u{10F32}\u{10F34}-\u{10F44}\u{10F51}-\u{10F53}\u{1E900}-\u{1E943}][\xAD\u0300-\u036F\u0483-\u0489\u0591-\u05BD\u05BF\u05C1\u05C2\u05C4\u05C5\u05C7\u0610-\u061A\u061C\u064B-\u065F\u0670\u06D6-\u06DC\u06DF-\u06E4\u06E7\u06E8\u06EA-\u06ED\u070F\u0711\u0730-\u074A\u07A6-\u07B0\u07EB-\u07F3\u07FD\u0816-\u0819\u081B-\u0823\u0825-\u0827\u0829-\u082D\u0859-\u085B\u08D3-\u08E1\u08E3-\u0902\u093A\u093C\u0941-\u0948\u094D\u0951-\u0957\u0962\u0963\u0981\u09BC\u09C1-\u09C4\u09CD\u09E2\u09E3\u09FE\u0A01\u0A02\u0A3C\u0A41\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A70\u0A71\u0A75\u0A81\u0A82\u0ABC\u0AC1-\u0AC5\u0AC7\u0AC8\u0ACD\u0AE2\u0AE3\u0AFA-\u0AFF\u0B01\u0B3C\u0B3F\u0B41-\u0B44\u0B4D\u0B56\u0B62\u0B63\u0B82\u0BC0\u0BCD\u0C00\u0C04\u0C3E-\u0C40\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C62\u0C63\u0C81\u0CBC\u0CBF\u0CC6\u0CCC\u0CCD\u0CE2\u0CE3\u0D00\u0D01\u0D3B\u0D3C\u0D41-\u0D44\u0D4D\u0D62\u0D63\u0DCA\u0DD2-\u0DD4\u0DD6\u0E31\u0E34-\u0E3A\u0E47-\u0E4E\u0EB1\u0EB4-\u0EBC\u0EC8-\u0ECD\u0F18\u0F19\u0F35\u0F37\u0F39\u0F71-\u0F7E\u0F80-\u0F84\u0F86\u0F87\u0F8D-\u0F97\u0F99-\u0FBC\u0FC6\u102D-\u1030\u1032-\u1037\u1039\u103A\u103D\u103E\u1058\u1059\u105E-\u1060\u1071-\u1074\u1082\u1085\u1086\u108D\u109D\u135D-\u135F\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17B4\u17B5\u17B7-\u17BD\u17C6\u17C9-\u17D3\u17DD\u180B-\u180D\u1885\u1886\u18A9\u1920-\u1922\u1927\u1928\u1932\u1939-\u193B\u1A17\u1A18\u1A1B\u1A56\u1A58-\u1A5E\u1A60\u1A62\u1A65-\u1A6C\u1A73-\u1A7C\u1A7F\u1AB0-\u1ABE\u1B00-\u1B03\u1B34\u1B36-\u1B3A\u1B3C\u1B42\u1B6B-\u1B73\u1B80\u1B81\u1BA2-\u1BA5\u1BA8\u1BA9\u1BAB-\u1BAD\u1BE6\u1BE8\u1BE9\u1BED\u1BEF-\u1BF1\u1C2C-\u1C33\u1C36\u1C37\u1CD0-\u1CD2\u1CD4-\u1CE0\u1CE2-\u1CE8\u1CED\u1CF4\u1CF8\u1CF9\u1DC0-\u1DF9\u1DFB-\u1DFF\u200B\u200E\u200F\u202A-\u202E\u2060-\u2064\u206A-\u206F\u20D0-\u20F0\u2CEF-\u2CF1\u2D7F\u2DE0-\u2DFF\u302A-\u302D\u3099\u309A\uA66F-\uA672\uA674-\uA67D\uA69E\uA69F\uA6F0\uA6F1\uA802\uA806\uA80B\uA825\uA826\uA8C4\uA8C5\uA8E0-\uA8F1\uA8FF\uA926-\uA92D\uA947-\uA951\uA980-\uA982\uA9B3\uA9B6-\uA9B9\
uA9BC\uA9BD\uA9E5\uAA29-\uAA2E\uAA31\uAA32\uAA35\uAA36\uAA43\uAA4C\uAA7C\uAAB0\uAAB2-\uAAB4\uAAB7\uAAB8\uAABE\uAABF\uAAC1\uAAEC\uAAED\uAAF6\uABE5\uABE8\uABED\uFB1E\uFE00-\uFE0F\uFE20-\uFE2F\uFEFF\uFFF9-\uFFFB\u{101FD}\u{102E0}\u{10376}-\u{1037A}\u{10A01}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A0F}\u{10A38}-\u{10A3A}\u{10A3F}\u{10AE5}\u{10AE6}\u{10D24}-\u{10D27}\u{10F46}-\u{10F50}\u{11001}\u{11038}-\u{11046}\u{1107F}-\u{11081}\u{110B3}-\u{110B6}\u{110B9}\u{110BA}\u{11100}-\u{11102}\u{11127}-\u{1112B}\u{1112D}-\u{11134}\u{11173}\u{11180}\u{11181}\u{111B6}-\u{111BE}\u{111C9}-\u{111CC}\u{1122F}-\u{11231}\u{11234}\u{11236}\u{11237}\u{1123E}\u{112DF}\u{112E3}-\u{112EA}\u{11300}\u{11301}\u{1133B}\u{1133C}\u{11340}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11438}-\u{1143F}\u{11442}-\u{11444}\u{11446}\u{1145E}\u{114B3}-\u{114B8}\u{114BA}\u{114BF}\u{114C0}\u{114C2}\u{114C3}\u{115B2}-\u{115B5}\u{115BC}\u{115BD}\u{115BF}\u{115C0}\u{115DC}\u{115DD}\u{11633}-\u{1163A}\u{1163D}\u{1163F}\u{11640}\u{116AB}\u{116AD}\u{116B0}-\u{116B5}\u{116B7}\u{1171D}-\u{1171F}\u{11722}-\u{11725}\u{11727}-\u{1172B}\u{1182F}-\u{11837}\u{11839}\u{1183A}\u{119D4}-\u{119D7}\u{119DA}\u{119DB}\u{119E0}\u{11A01}-\u{11A0A}\u{11A33}-\u{11A38}\u{11A3B}-\u{11A3E}\u{11A47}\u{11A51}-\u{11A56}\u{11A59}-\u{11A5B}\u{11A8A}-\u{11A96}\u{11A98}\u{11A99}\u{11C30}-\u{11C36}\u{11C38}-\u{11C3D}\u{11C3F}\u{11C92}-\u{11CA7}\u{11CAA}-\u{11CB0}\u{11CB2}\u{11CB3}\u{11CB5}\u{11CB6}\u{11D31}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D45}\u{11D47}\u{11D90}\u{11D91}\u{11D95}\u{11D97}\u{11EF3}\u{11EF4}\u{13430}-\u{13438}\u{16AF0}-\u{16AF4}\u{16B30}-\u{16B36}\u{16F4F}\u{16F8F}-\u{16F92}\u{1BC9D}\u{1BC9E}\u{1BCA0}-\u{1BCA3}\u{1D167}-\u{1D169}\u{1D173}-\u{1D182}\u{1D185}-\u{1D18B}\u{1D1AA}-\u{1D1AD}\u{1D242}-\u{1D244}\u{1DA00}-\u{1DA36}\u{1DA3B}-\u{1DA6C}\u{1DA75}\u{1DA84}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E130}-\u{1E136}\u{1E2EC}-\u{1E2EF}\u{1E8D0}-\u{1E8D6}\u{1E944}-\u{1E94B}\u{E0001}\u{E0020}-\u{E007F}\u{E0100}-\u{E01EF}]*\u200C[\xAD\u0300-\u036F\u0483-\u0489\u0591-\u05BD\u05BF\u05C1\u05C2\u05C4\u05C5\u05C7\u0610-\u061A\u061C\u064B-\u065F\u0670\u06D6-\u06DC\u06DF-\u06E4\u06E7\u06E8\u06EA-\u06ED\u070F\u0711\u0730-\u074A\u07A6-\u07B0\u07EB-\u07F3\u07FD\u0816-\u0819\u081B-\u0823\u0825-\u0827\u0829-\u082D\u0859-\u085B\u08D3-\u08E1\u08E3-\u0902\u093A\u093C\u0941-\u0948\u094D\u0951-\u0957\u0962\u0963\u0981\u09BC\u09C1-\u09C4\u09CD\u09E2\u09E3\u09FE\u0A01\u0A02\u0A3C\u0A41\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A70\u0A71\u0A75\u0A81\u0A82\u0ABC\u0AC1-\u0AC5\u0AC7\u0AC8\u0ACD\u0AE2\u0AE3\u0AFA-\u0AFF\u0B01\u0B3C\u0B3F\u0B41-\u0B44\u0B4D\u0B56\u0B62\u0B63\u0B82\u0BC0\u0BCD\u0C00\u0C04\u0C3E-\u0C40\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C62\u0C63\u0C81\u0CBC\u0CBF\u0CC6\u0CCC\u0CCD\u0CE2\u0CE3\u0D00\u0D01\u0D3B\u0D3C\u0D41-\u0D44\u0D4D\u0D62\u0D63\u0DCA\u0DD2-\u0DD4\u0DD6\u0E31\u0E34-\u0E3A\u0E47-\u0E4E\u0EB1\u0EB4-\u0EBC\u0EC8-\u0ECD\u0F18\u0F19\u0F35\u0F37\u0F39\u0F71-\u0F7E\u0F80-\u0F84\u0F86\u0F87\u0F8D-\u0F97\u0F99-\u0FBC\u0FC6\u102D-\u1030\u1032-\u1037\u1039\u103A\u103D\u103E\u1058\u1059\u105E-\u1060\u1071-\u1074\u1082\u1085\u1086\u108D\u109D\u135D-\u135F\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17B4\u17B5\u17B7-\u17BD\u17C6\u17C9-\u17D3\u17DD\u180B-\u180D\u1885\u1886\u18A9\u1920-\u1922\u1927\u1928\u1932\u1939-\u193B\u1A17\u1A18\u1A1B\u1A56\u1A58-\u1A5E\u1A60\u1A62\u1A65-\u1A6C\u1A73-\u1A7C\u1A7F\u1AB0-\u1ABE\u1B00-\u1B03\u1B34\u1B36-\u1B3A
\u1B3C\u1B42\u1B6B-\u1B73\u1B80\u1B81\u1BA2-\u1BA5\u1BA8\u1BA9\u1BAB-\u1BAD\u1BE6\u1BE8\u1BE9\u1BED\u1BEF-\u1BF1\u1C2C-\u1C33\u1C36\u1C37\u1CD0-\u1CD2\u1CD4-\u1CE0\u1CE2-\u1CE8\u1CED\u1CF4\u1CF8\u1CF9\u1DC0-\u1DF9\u1DFB-\u1DFF\u200B\u200E\u200F\u202A-\u202E\u2060-\u2064\u206A-\u206F\u20D0-\u20F0\u2CEF-\u2CF1\u2D7F\u2DE0-\u2DFF\u302A-\u302D\u3099\u309A\uA66F-\uA672\uA674-\uA67D\uA69E\uA69F\uA6F0\uA6F1\uA802\uA806\uA80B\uA825\uA826\uA8C4\uA8C5\uA8E0-\uA8F1\uA8FF\uA926-\uA92D\uA947-\uA951\uA980-\uA982\uA9B3\uA9B6-\uA9B9\uA9BC\uA9BD\uA9E5\uAA29-\uAA2E\uAA31\uAA32\uAA35\uAA36\uAA43\uAA4C\uAA7C\uAAB0\uAAB2-\uAAB4\uAAB7\uAAB8\uAABE\uAABF\uAAC1\uAAEC\uAAED\uAAF6\uABE5\uABE8\uABED\uFB1E\uFE00-\uFE0F\uFE20-\uFE2F\uFEFF\uFFF9-\uFFFB\u{101FD}\u{102E0}\u{10376}-\u{1037A}\u{10A01}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A0F}\u{10A38}-\u{10A3A}\u{10A3F}\u{10AE5}\u{10AE6}\u{10D24}-\u{10D27}\u{10F46}-\u{10F50}\u{11001}\u{11038}-\u{11046}\u{1107F}-\u{11081}\u{110B3}-\u{110B6}\u{110B9}\u{110BA}\u{11100}-\u{11102}\u{11127}-\u{1112B}\u{1112D}-\u{11134}\u{11173}\u{11180}\u{11181}\u{111B6}-\u{111BE}\u{111C9}-\u{111CC}\u{1122F}-\u{11231}\u{11234}\u{11236}\u{11237}\u{1123E}\u{112DF}\u{112E3}-\u{112EA}\u{11300}\u{11301}\u{1133B}\u{1133C}\u{11340}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11438}-\u{1143F}\u{11442}-\u{11444}\u{11446}\u{1145E}\u{114B3}-\u{114B8}\u{114BA}\u{114BF}\u{114C0}\u{114C2}\u{114C3}\u{115B2}-\u{115B5}\u{115BC}\u{115BD}\u{115BF}\u{115C0}\u{115DC}\u{115DD}\u{11633}-\u{1163A}\u{1163D}\u{1163F}\u{11640}\u{116AB}\u{116AD}\u{116B0}-\u{116B5}\u{116B7}\u{1171D}-\u{1171F}\u{11722}-\u{11725}\u{11727}-\u{1172B}\u{1182F}-\u{11837}\u{11839}\u{1183A}\u{119D4}-\u{119D7}\u{119DA}\u{119DB}\u{119E0}\u{11A01}-\u{11A0A}\u{11A33}-\u{11A38}\u{11A3B}-\u{11A3E}\u{11A47}\u{11A51}-\u{11A56}\u{11A59}-\u{11A5B}\u{11A8A}-\u{11A96}\u{11A98}\u{11A99}\u{11C30}-\u{11C36}\u{11C38}-\u{11C3D}\u{11C3F}\u{11C92}-\u{11CA7}\u{11CAA}-\u{11CB0}\u{11CB2}\u{11CB3}\u{11CB5}\u{11CB6}\u{11D31}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D45}\u{11D47}\u{11D90}\u{11D91}\u{11D95}\u{11D97}\u{11EF3}\u{11EF4}\u{13430}-\u{13438}\u{16AF0}-\u{16AF4}\u{16B30}-\u{16B36}\u{16F4F}\u{16F8F}-\u{16F92}\u{1BC9D}\u{1BC9E}\u{1BCA0}-\u{1BCA3}\u{1D167}-\u{1D169}\u{1D173}-\u{1D182}\u{1D185}-\u{1D18B}\u{1D1AA}-\u{1D1AD}\u{1D242}-\u{1D244}\u{1DA00}-\u{1DA36}\u{1DA3B}-\u{1DA6C}\u{1DA75}\u{1DA84}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E130}-\u{1E136}\u{1E2EC}-\u{1E2EF}\u{1E8D0}-\u{1E8D6}\u{1E944}-\u{1E94B}\u{E0001}\u{E0020}-\u{E007F}\u{E0100}-\u{E01EF}]*[\u0620\u0622-\u063F\u0641-\u064A\u066E\u066F\u0671-\u0673\u0675-\u06D3\u06D5\u06EE\u06EF\u06FA-\u06FC\u06FF\u0710\u0712-\u072F\u074D-\u077F\u07CA-\u07EA\u0840-\u0855\u0860\u0862-\u0865\u0867-\u086A\u08A0-\u08AC\u08AE-\u08B4\u08B6-\u08BD\u1807\u1820-\u1878\u1887-\u18A8\u18AA\uA840-\uA871\u{10AC0}-\u{10AC5}\u{10AC7}\u{10AC9}\u{10ACA}\u{10ACE}-\u{10AD6}\u{10AD8}-\u{10AE1}\u{10AE4}\u{10AEB}-\u{10AEF}\u{10B80}-\u{10B91}\u{10BA9}-\u{10BAE}\u{10D01}-\u{10D23}\u{10F30}-\u{10F44}\u{10F51}-\u{10F54}\u{1E900}-\u{1E943}]/u; +const bidiDomain = 
/[\u05BE\u05C0\u05C3\u05C6\u05D0-\u05EA\u05EF-\u05F4\u0600-\u0605\u0608\u060B\u060D\u061B\u061C\u061E-\u064A\u0660-\u0669\u066B-\u066F\u0671-\u06D5\u06DD\u06E5\u06E6\u06EE\u06EF\u06FA-\u070D\u070F\u0710\u0712-\u072F\u074D-\u07A5\u07B1\u07C0-\u07EA\u07F4\u07F5\u07FA\u07FE-\u0815\u081A\u0824\u0828\u0830-\u083E\u0840-\u0858\u085E\u0860-\u086A\u08A0-\u08B4\u08B6-\u08C7\u08E2\u200F\uFB1D\uFB1F-\uFB28\uFB2A-\uFB36\uFB38-\uFB3C\uFB3E\uFB40\uFB41\uFB43\uFB44\uFB46-\uFBC1\uFBD3-\uFD3D\uFD50-\uFD8F\uFD92-\uFDC7\uFDF0-\uFDFC\uFE70-\uFE74\uFE76-\uFEFC\u{10800}-\u{10805}\u{10808}\u{1080A}-\u{10835}\u{10837}\u{10838}\u{1083C}\u{1083F}-\u{10855}\u{10857}-\u{1089E}\u{108A7}-\u{108AF}\u{108E0}-\u{108F2}\u{108F4}\u{108F5}\u{108FB}-\u{1091B}\u{10920}-\u{10939}\u{1093F}\u{10980}-\u{109B7}\u{109BC}-\u{109CF}\u{109D2}-\u{10A00}\u{10A10}-\u{10A13}\u{10A15}-\u{10A17}\u{10A19}-\u{10A35}\u{10A40}-\u{10A48}\u{10A50}-\u{10A58}\u{10A60}-\u{10A9F}\u{10AC0}-\u{10AE4}\u{10AEB}-\u{10AF6}\u{10B00}-\u{10B35}\u{10B40}-\u{10B55}\u{10B58}-\u{10B72}\u{10B78}-\u{10B91}\u{10B99}-\u{10B9C}\u{10BA9}-\u{10BAF}\u{10C00}-\u{10C48}\u{10C80}-\u{10CB2}\u{10CC0}-\u{10CF2}\u{10CFA}-\u{10D23}\u{10D30}-\u{10D39}\u{10E60}-\u{10E7E}\u{10E80}-\u{10EA9}\u{10EAD}\u{10EB0}\u{10EB1}\u{10F00}-\u{10F27}\u{10F30}-\u{10F45}\u{10F51}-\u{10F59}\u{10FB0}-\u{10FCB}\u{10FE0}-\u{10FF6}\u{1E800}-\u{1E8C4}\u{1E8C7}-\u{1E8CF}\u{1E900}-\u{1E943}\u{1E94B}\u{1E950}-\u{1E959}\u{1E95E}\u{1E95F}\u{1EC71}-\u{1ECB4}\u{1ED01}-\u{1ED3D}\u{1EE00}-\u{1EE03}\u{1EE05}-\u{1EE1F}\u{1EE21}\u{1EE22}\u{1EE24}\u{1EE27}\u{1EE29}-\u{1EE32}\u{1EE34}-\u{1EE37}\u{1EE39}\u{1EE3B}\u{1EE42}\u{1EE47}\u{1EE49}\u{1EE4B}\u{1EE4D}-\u{1EE4F}\u{1EE51}\u{1EE52}\u{1EE54}\u{1EE57}\u{1EE59}\u{1EE5B}\u{1EE5D}\u{1EE5F}\u{1EE61}\u{1EE62}\u{1EE64}\u{1EE67}-\u{1EE6A}\u{1EE6C}-\u{1EE72}\u{1EE74}-\u{1EE77}\u{1EE79}-\u{1EE7C}\u{1EE7E}\u{1EE80}-\u{1EE89}\u{1EE8B}-\u{1EE9B}\u{1EEA1}-\u{1EEA3}\u{1EEA5}-\u{1EEA9}\u{1EEAB}-\u{1EEBB}]/u; +const bidiS1LTR = 
/[A-Za-z\xAA\xB5\xBA\xC0-\xD6\xD8-\xF6\xF8-\u02B8\u02BB-\u02C1\u02D0\u02D1\u02E0-\u02E4\u02EE\u0370-\u0373\u0376\u0377\u037A-\u037D\u037F\u0386\u0388-\u038A\u038C\u038E-\u03A1\u03A3-\u03F5\u03F7-\u0482\u048A-\u052F\u0531-\u0556\u0559-\u0589\u0903-\u0939\u093B\u093D-\u0940\u0949-\u094C\u094E-\u0950\u0958-\u0961\u0964-\u0980\u0982\u0983\u0985-\u098C\u098F\u0990\u0993-\u09A8\u09AA-\u09B0\u09B2\u09B6-\u09B9\u09BD-\u09C0\u09C7\u09C8\u09CB\u09CC\u09CE\u09D7\u09DC\u09DD\u09DF-\u09E1\u09E6-\u09F1\u09F4-\u09FA\u09FC\u09FD\u0A03\u0A05-\u0A0A\u0A0F\u0A10\u0A13-\u0A28\u0A2A-\u0A30\u0A32\u0A33\u0A35\u0A36\u0A38\u0A39\u0A3E-\u0A40\u0A59-\u0A5C\u0A5E\u0A66-\u0A6F\u0A72-\u0A74\u0A76\u0A83\u0A85-\u0A8D\u0A8F-\u0A91\u0A93-\u0AA8\u0AAA-\u0AB0\u0AB2\u0AB3\u0AB5-\u0AB9\u0ABD-\u0AC0\u0AC9\u0ACB\u0ACC\u0AD0\u0AE0\u0AE1\u0AE6-\u0AF0\u0AF9\u0B02\u0B03\u0B05-\u0B0C\u0B0F\u0B10\u0B13-\u0B28\u0B2A-\u0B30\u0B32\u0B33\u0B35-\u0B39\u0B3D\u0B3E\u0B40\u0B47\u0B48\u0B4B\u0B4C\u0B57\u0B5C\u0B5D\u0B5F-\u0B61\u0B66-\u0B77\u0B83\u0B85-\u0B8A\u0B8E-\u0B90\u0B92-\u0B95\u0B99\u0B9A\u0B9C\u0B9E\u0B9F\u0BA3\u0BA4\u0BA8-\u0BAA\u0BAE-\u0BB9\u0BBE\u0BBF\u0BC1\u0BC2\u0BC6-\u0BC8\u0BCA-\u0BCC\u0BD0\u0BD7\u0BE6-\u0BF2\u0C01-\u0C03\u0C05-\u0C0C\u0C0E-\u0C10\u0C12-\u0C28\u0C2A-\u0C39\u0C3D\u0C41-\u0C44\u0C58-\u0C5A\u0C60\u0C61\u0C66-\u0C6F\u0C77\u0C7F\u0C80\u0C82-\u0C8C\u0C8E-\u0C90\u0C92-\u0CA8\u0CAA-\u0CB3\u0CB5-\u0CB9\u0CBD-\u0CC4\u0CC6-\u0CC8\u0CCA\u0CCB\u0CD5\u0CD6\u0CDE\u0CE0\u0CE1\u0CE6-\u0CEF\u0CF1\u0CF2\u0D02-\u0D0C\u0D0E-\u0D10\u0D12-\u0D3A\u0D3D-\u0D40\u0D46-\u0D48\u0D4A-\u0D4C\u0D4E\u0D4F\u0D54-\u0D61\u0D66-\u0D7F\u0D82\u0D83\u0D85-\u0D96\u0D9A-\u0DB1\u0DB3-\u0DBB\u0DBD\u0DC0-\u0DC6\u0DCF-\u0DD1\u0DD8-\u0DDF\u0DE6-\u0DEF\u0DF2-\u0DF4\u0E01-\u0E30\u0E32\u0E33\u0E40-\u0E46\u0E4F-\u0E5B\u0E81\u0E82\u0E84\u0E86-\u0E8A\u0E8C-\u0EA3\u0EA5\u0EA7-\u0EB0\u0EB2\u0EB3\u0EBD\u0EC0-\u0EC4\u0EC6\u0ED0-\u0ED9\u0EDC-\u0EDF\u0F00-\u0F17\u0F1A-\u0F34\u0F36\u0F38\u0F3E-\u0F47\u0F49-\u0F6C\u0F7F\u0F85\u0F88-\u0F8C\u0FBE-\u0FC5\u0FC7-\u0FCC\u0FCE-\u0FDA\u1000-\u102C\u1031\u1038\u103B\u103C\u103F-\u1057\u105A-\u105D\u1061-\u1070\u1075-\u1081\u1083\u1084\u1087-\u108C\u108E-\u109C\u109E-\u10C5\u10C7\u10CD\u10D0-\u1248\u124A-\u124D\u1250-\u1256\u1258\u125A-\u125D\u1260-\u1288\u128A-\u128D\u1290-\u12B0\u12B2-\u12B5\u12B8-\u12BE\u12C0\u12C2-\u12C5\u12C8-\u12D6\u12D8-\u1310\u1312-\u1315\u1318-\u135A\u1360-\u137C\u1380-\u138F\u13A0-\u13F5\u13F8-\u13FD\u1401-\u167F\u1681-\u169A\u16A0-\u16F8\u1700-\u170C\u170E-\u1711\u1720-\u1731\u1735\u1736\u1740-\u1751\u1760-\u176C\u176E-\u1770\u1780-\u17B3\u17B6\u17BE-\u17C5\u17C7\u17C8\u17D4-\u17DA\u17DC\u17E0-\u17E9\u1810-\u1819\u1820-\u1878\u1880-\u1884\u1887-\u18A8\u18AA\u18B0-\u18F5\u1900-\u191E\u1923-\u1926\u1929-\u192B\u1930\u1931\u1933-\u1938\u1946-\u196D\u1970-\u1974\u1980-\u19AB\u19B0-\u19C9\u19D0-\u19DA\u1A00-\u1A16\u1A19\u1A1A\u1A1E-\u1A55\u1A57\u1A61\u1A63\u1A64\u1A6D-\u1A72\u1A80-\u1A89\u1A90-\u1A99\u1AA0-\u1AAD\u1B04-\u1B33\u1B35\u1B3B\u1B3D-\u1B41\u1B43-\u1B4B\u1B50-\u1B6A\u1B74-\u1B7C\u1B82-\u1BA1\u1BA6\u1BA7\u1BAA\u1BAE-\u1BE5\u1BE7\u1BEA-\u1BEC\u1BEE\u1BF2\u1BF3\u1BFC-\u1C2B\u1C34\u1C35\u1C3B-\u1C49\u1C4D-\u1C88\u1C90-\u1CBA\u1CBD-\u1CC7\u1CD3\u1CE1\u1CE9-\u1CEC\u1CEE-\u1CF3\u1CF5-\u1CF7\u1CFA\u1D00-\u1DBF\u1E00-\u1F15\u1F18-\u1F1D\u1F20-\u1F45\u1F48-\u1F4D\u1F50-\u1F57\u1F59\u1F5B\u1F5D\u1F5F-\u1F7D\u1F80-\u1FB4\u1FB6-\u1FBC\u1FBE\u1FC2-\u1FC4\u1FC6-\u1FCC\u1FD0-\u1FD3\u1FD6-\u1FDB\u1FE0-\u1FEC\u1FF2-\u1FF4\u1FF6-\u1FFC\u200E\u2071\u207F\u2090-\u209C\u2102\u2107\u210A-\u2113\u2115\u2119-\u211D\u2124\u21
26\u2128\u212A-\u212D\u212F-\u2139\u213C-\u213F\u2145-\u2149\u214E\u214F\u2160-\u2188\u2336-\u237A\u2395\u249C-\u24E9\u26AC\u2800-\u28FF\u2C00-\u2C2E\u2C30-\u2C5E\u2C60-\u2CE4\u2CEB-\u2CEE\u2CF2\u2CF3\u2D00-\u2D25\u2D27\u2D2D\u2D30-\u2D67\u2D6F\u2D70\u2D80-\u2D96\u2DA0-\u2DA6\u2DA8-\u2DAE\u2DB0-\u2DB6\u2DB8-\u2DBE\u2DC0-\u2DC6\u2DC8-\u2DCE\u2DD0-\u2DD6\u2DD8-\u2DDE\u3005-\u3007\u3021-\u3029\u302E\u302F\u3031-\u3035\u3038-\u303C\u3041-\u3096\u309D-\u309F\u30A1-\u30FA\u30FC-\u30FF\u3105-\u312F\u3131-\u318E\u3190-\u31BF\u31F0-\u321C\u3220-\u324F\u3260-\u327B\u327F-\u32B0\u32C0-\u32CB\u32D0-\u3376\u337B-\u33DD\u33E0-\u33FE\u3400-\u4DBF\u4E00-\u9FFC\uA000-\uA48C\uA4D0-\uA60C\uA610-\uA62B\uA640-\uA66E\uA680-\uA69D\uA6A0-\uA6EF\uA6F2-\uA6F7\uA722-\uA787\uA789-\uA7BF\uA7C2-\uA7CA\uA7F5-\uA801\uA803-\uA805\uA807-\uA80A\uA80C-\uA824\uA827\uA830-\uA837\uA840-\uA873\uA880-\uA8C3\uA8CE-\uA8D9\uA8F2-\uA8FE\uA900-\uA925\uA92E-\uA946\uA952\uA953\uA95F-\uA97C\uA983-\uA9B2\uA9B4\uA9B5\uA9BA\uA9BB\uA9BE-\uA9CD\uA9CF-\uA9D9\uA9DE-\uA9E4\uA9E6-\uA9FE\uAA00-\uAA28\uAA2F\uAA30\uAA33\uAA34\uAA40-\uAA42\uAA44-\uAA4B\uAA4D\uAA50-\uAA59\uAA5C-\uAA7B\uAA7D-\uAAAF\uAAB1\uAAB5\uAAB6\uAAB9-\uAABD\uAAC0\uAAC2\uAADB-\uAAEB\uAAEE-\uAAF5\uAB01-\uAB06\uAB09-\uAB0E\uAB11-\uAB16\uAB20-\uAB26\uAB28-\uAB2E\uAB30-\uAB69\uAB70-\uABE4\uABE6\uABE7\uABE9-\uABEC\uABF0-\uABF9\uAC00-\uD7A3\uD7B0-\uD7C6\uD7CB-\uD7FB\uD800-\uFA6D\uFA70-\uFAD9\uFB00-\uFB06\uFB13-\uFB17\uFF21-\uFF3A\uFF41-\uFF5A\uFF66-\uFFBE\uFFC2-\uFFC7\uFFCA-\uFFCF\uFFD2-\uFFD7\uFFDA-\uFFDC\u{10000}-\u{1000B}\u{1000D}-\u{10026}\u{10028}-\u{1003A}\u{1003C}\u{1003D}\u{1003F}-\u{1004D}\u{10050}-\u{1005D}\u{10080}-\u{100FA}\u{10100}\u{10102}\u{10107}-\u{10133}\u{10137}-\u{1013F}\u{1018D}\u{1018E}\u{101D0}-\u{101FC}\u{10280}-\u{1029C}\u{102A0}-\u{102D0}\u{10300}-\u{10323}\u{1032D}-\u{1034A}\u{10350}-\u{10375}\u{10380}-\u{1039D}\u{1039F}-\u{103C3}\u{103C8}-\u{103D5}\u{10400}-\u{1049D}\u{104A0}-\u{104A9}\u{104B0}-\u{104D3}\u{104D8}-\u{104FB}\u{10500}-\u{10527}\u{10530}-\u{10563}\u{1056F}\u{10600}-\u{10736}\u{10740}-\u{10755}\u{10760}-\u{10767}\u{11000}\u{11002}-\u{11037}\u{11047}-\u{1104D}\u{11066}-\u{1106F}\u{11082}-\u{110B2}\u{110B7}\u{110B8}\u{110BB}-\u{110C1}\u{110CD}\u{110D0}-\u{110E8}\u{110F0}-\u{110F9}\u{11103}-\u{11126}\u{1112C}\u{11136}-\u{11147}\u{11150}-\u{11172}\u{11174}-\u{11176}\u{11182}-\u{111B5}\u{111BF}-\u{111C8}\u{111CD}\u{111CE}\u{111D0}-\u{111DF}\u{111E1}-\u{111F4}\u{11200}-\u{11211}\u{11213}-\u{1122E}\u{11232}\u{11233}\u{11235}\u{11238}-\u{1123D}\u{11280}-\u{11286}\u{11288}\u{1128A}-\u{1128D}\u{1128F}-\u{1129D}\u{1129F}-\u{112A9}\u{112B0}-\u{112DE}\u{112E0}-\u{112E2}\u{112F0}-\u{112F9}\u{11302}\u{11303}\u{11305}-\u{1130C}\u{1130F}\u{11310}\u{11313}-\u{11328}\u{1132A}-\u{11330}\u{11332}\u{11333}\u{11335}-\u{11339}\u{1133D}-\u{1133F}\u{11341}-\u{11344}\u{11347}\u{11348}\u{1134B}-\u{1134D}\u{11350}\u{11357}\u{1135D}-\u{11363}\u{11400}-\u{11437}\u{11440}\u{11441}\u{11445}\u{11447}-\u{1145B}\u{1145D}\u{1145F}-\u{11461}\u{11480}-\u{114B2}\u{114B9}\u{114BB}-\u{114BE}\u{114C1}\u{114C4}-\u{114C7}\u{114D0}-\u{114D9}\u{11580}-\u{115B1}\u{115B8}-\u{115BB}\u{115BE}\u{115C1}-\u{115DB}\u{11600}-\u{11632}\u{1163B}\u{1163C}\u{1163E}\u{11641}-\u{11644}\u{11650}-\u{11659}\u{11680}-\u{116AA}\u{116AC}\u{116AE}\u{116AF}\u{116B6}\u{116B8}\u{116C0}-\u{116C9}\u{11700}-\u{1171A}\u{11720}\u{11721}\u{11726}\u{11730}-\u{1173F}\u{11800}-\u{1182E}\u{11838}\u{1183B}\u{118A0}-\u{118F2}\u{118FF}-\u{11906}\u{11909}\u{1190C}-\u{11913}\u{11915}\u{11916}\u{11918}-\u{11935}\u{11937}\u{11938}\u{119
3D}\u{1193F}-\u{11942}\u{11944}-\u{11946}\u{11950}-\u{11959}\u{119A0}-\u{119A7}\u{119AA}-\u{119D3}\u{119DC}-\u{119DF}\u{119E1}-\u{119E4}\u{11A00}\u{11A07}\u{11A08}\u{11A0B}-\u{11A32}\u{11A39}\u{11A3A}\u{11A3F}-\u{11A46}\u{11A50}\u{11A57}\u{11A58}\u{11A5C}-\u{11A89}\u{11A97}\u{11A9A}-\u{11AA2}\u{11AC0}-\u{11AF8}\u{11C00}-\u{11C08}\u{11C0A}-\u{11C2F}\u{11C3E}-\u{11C45}\u{11C50}-\u{11C6C}\u{11C70}-\u{11C8F}\u{11CA9}\u{11CB1}\u{11CB4}\u{11D00}-\u{11D06}\u{11D08}\u{11D09}\u{11D0B}-\u{11D30}\u{11D46}\u{11D50}-\u{11D59}\u{11D60}-\u{11D65}\u{11D67}\u{11D68}\u{11D6A}-\u{11D8E}\u{11D93}\u{11D94}\u{11D96}\u{11D98}\u{11DA0}-\u{11DA9}\u{11EE0}-\u{11EF2}\u{11EF5}-\u{11EF8}\u{11FB0}\u{11FC0}-\u{11FD4}\u{11FFF}-\u{12399}\u{12400}-\u{1246E}\u{12470}-\u{12474}\u{12480}-\u{12543}\u{13000}-\u{1342E}\u{13430}-\u{13438}\u{14400}-\u{14646}\u{16800}-\u{16A38}\u{16A40}-\u{16A5E}\u{16A60}-\u{16A69}\u{16A6E}\u{16A6F}\u{16AD0}-\u{16AED}\u{16AF5}\u{16B00}-\u{16B2F}\u{16B37}-\u{16B45}\u{16B50}-\u{16B59}\u{16B5B}-\u{16B61}\u{16B63}-\u{16B77}\u{16B7D}-\u{16B8F}\u{16E40}-\u{16E9A}\u{16F00}-\u{16F4A}\u{16F50}-\u{16F87}\u{16F93}-\u{16F9F}\u{16FE0}\u{16FE1}\u{16FE3}\u{16FF0}\u{16FF1}\u{17000}-\u{187F7}\u{18800}-\u{18CD5}\u{18D00}-\u{18D08}\u{1B000}-\u{1B11E}\u{1B150}-\u{1B152}\u{1B164}-\u{1B167}\u{1B170}-\u{1B2FB}\u{1BC00}-\u{1BC6A}\u{1BC70}-\u{1BC7C}\u{1BC80}-\u{1BC88}\u{1BC90}-\u{1BC99}\u{1BC9C}\u{1BC9F}\u{1D000}-\u{1D0F5}\u{1D100}-\u{1D126}\u{1D129}-\u{1D166}\u{1D16A}-\u{1D172}\u{1D183}\u{1D184}\u{1D18C}-\u{1D1A9}\u{1D1AE}-\u{1D1E8}\u{1D2E0}-\u{1D2F3}\u{1D360}-\u{1D378}\u{1D400}-\u{1D454}\u{1D456}-\u{1D49C}\u{1D49E}\u{1D49F}\u{1D4A2}\u{1D4A5}\u{1D4A6}\u{1D4A9}-\u{1D4AC}\u{1D4AE}-\u{1D4B9}\u{1D4BB}\u{1D4BD}-\u{1D4C3}\u{1D4C5}-\u{1D505}\u{1D507}-\u{1D50A}\u{1D50D}-\u{1D514}\u{1D516}-\u{1D51C}\u{1D51E}-\u{1D539}\u{1D53B}-\u{1D53E}\u{1D540}-\u{1D544}\u{1D546}\u{1D54A}-\u{1D550}\u{1D552}-\u{1D6A5}\u{1D6A8}-\u{1D6DA}\u{1D6DC}-\u{1D714}\u{1D716}-\u{1D74E}\u{1D750}-\u{1D788}\u{1D78A}-\u{1D7C2}\u{1D7C4}-\u{1D7CB}\u{1D800}-\u{1D9FF}\u{1DA37}-\u{1DA3A}\u{1DA6D}-\u{1DA74}\u{1DA76}-\u{1DA83}\u{1DA85}-\u{1DA8B}\u{1E100}-\u{1E12C}\u{1E137}-\u{1E13D}\u{1E140}-\u{1E149}\u{1E14E}\u{1E14F}\u{1E2C0}-\u{1E2EB}\u{1E2F0}-\u{1E2F9}\u{1F110}-\u{1F12E}\u{1F130}-\u{1F169}\u{1F170}-\u{1F1AC}\u{1F1E6}-\u{1F202}\u{1F210}-\u{1F23B}\u{1F240}-\u{1F248}\u{1F250}\u{1F251}\u{20000}-\u{2A6DD}\u{2A700}-\u{2B734}\u{2B740}-\u{2B81D}\u{2B820}-\u{2CEA1}\u{2CEB0}-\u{2EBE0}\u{2F800}-\u{2FA1D}\u{30000}-\u{3134A}\u{F0000}-\u{FFFFD}\u{100000}-\u{10FFFD}]/u; +const bidiS1RTL = 
/[\u05BE\u05C0\u05C3\u05C6\u05D0-\u05EA\u05EF-\u05F4\u0608\u060B\u060D\u061B\u061C\u061E-\u064A\u066D-\u066F\u0671-\u06D5\u06E5\u06E6\u06EE\u06EF\u06FA-\u070D\u070F\u0710\u0712-\u072F\u074D-\u07A5\u07B1\u07C0-\u07EA\u07F4\u07F5\u07FA\u07FE-\u0815\u081A\u0824\u0828\u0830-\u083E\u0840-\u0858\u085E\u0860-\u086A\u08A0-\u08B4\u08B6-\u08C7\u200F\uFB1D\uFB1F-\uFB28\uFB2A-\uFB36\uFB38-\uFB3C\uFB3E\uFB40\uFB41\uFB43\uFB44\uFB46-\uFBC1\uFBD3-\uFD3D\uFD50-\uFD8F\uFD92-\uFDC7\uFDF0-\uFDFC\uFE70-\uFE74\uFE76-\uFEFC\u{10800}-\u{10805}\u{10808}\u{1080A}-\u{10835}\u{10837}\u{10838}\u{1083C}\u{1083F}-\u{10855}\u{10857}-\u{1089E}\u{108A7}-\u{108AF}\u{108E0}-\u{108F2}\u{108F4}\u{108F5}\u{108FB}-\u{1091B}\u{10920}-\u{10939}\u{1093F}\u{10980}-\u{109B7}\u{109BC}-\u{109CF}\u{109D2}-\u{10A00}\u{10A10}-\u{10A13}\u{10A15}-\u{10A17}\u{10A19}-\u{10A35}\u{10A40}-\u{10A48}\u{10A50}-\u{10A58}\u{10A60}-\u{10A9F}\u{10AC0}-\u{10AE4}\u{10AEB}-\u{10AF6}\u{10B00}-\u{10B35}\u{10B40}-\u{10B55}\u{10B58}-\u{10B72}\u{10B78}-\u{10B91}\u{10B99}-\u{10B9C}\u{10BA9}-\u{10BAF}\u{10C00}-\u{10C48}\u{10C80}-\u{10CB2}\u{10CC0}-\u{10CF2}\u{10CFA}-\u{10D23}\u{10E80}-\u{10EA9}\u{10EAD}\u{10EB0}\u{10EB1}\u{10F00}-\u{10F27}\u{10F30}-\u{10F45}\u{10F51}-\u{10F59}\u{10FB0}-\u{10FCB}\u{10FE0}-\u{10FF6}\u{1E800}-\u{1E8C4}\u{1E8C7}-\u{1E8CF}\u{1E900}-\u{1E943}\u{1E94B}\u{1E950}-\u{1E959}\u{1E95E}\u{1E95F}\u{1EC71}-\u{1ECB4}\u{1ED01}-\u{1ED3D}\u{1EE00}-\u{1EE03}\u{1EE05}-\u{1EE1F}\u{1EE21}\u{1EE22}\u{1EE24}\u{1EE27}\u{1EE29}-\u{1EE32}\u{1EE34}-\u{1EE37}\u{1EE39}\u{1EE3B}\u{1EE42}\u{1EE47}\u{1EE49}\u{1EE4B}\u{1EE4D}-\u{1EE4F}\u{1EE51}\u{1EE52}\u{1EE54}\u{1EE57}\u{1EE59}\u{1EE5B}\u{1EE5D}\u{1EE5F}\u{1EE61}\u{1EE62}\u{1EE64}\u{1EE67}-\u{1EE6A}\u{1EE6C}-\u{1EE72}\u{1EE74}-\u{1EE77}\u{1EE79}-\u{1EE7C}\u{1EE7E}\u{1EE80}-\u{1EE89}\u{1EE8B}-\u{1EE9B}\u{1EEA1}-\u{1EEA3}\u{1EEA5}-\u{1EEA9}\u{1EEAB}-\u{1EEBB}]/u; +const bidiS2 = 
/^[\0-\x08\x0E-\x1B!-@\[-`\{-\x84\x86-\xA9\xAB-\xB4\xB6-\xB9\xBB-\xBF\xD7\xF7\u02B9\u02BA\u02C2-\u02CF\u02D2-\u02DF\u02E5-\u02ED\u02EF-\u036F\u0374\u0375\u037E\u0384\u0385\u0387\u03F6\u0483-\u0489\u058A\u058D-\u058F\u0591-\u05C7\u05D0-\u05EA\u05EF-\u05F4\u0600-\u061C\u061E-\u070D\u070F-\u074A\u074D-\u07B1\u07C0-\u07FA\u07FD-\u082D\u0830-\u083E\u0840-\u085B\u085E\u0860-\u086A\u08A0-\u08B4\u08B6-\u08C7\u08D3-\u0902\u093A\u093C\u0941-\u0948\u094D\u0951-\u0957\u0962\u0963\u0981\u09BC\u09C1-\u09C4\u09CD\u09E2\u09E3\u09F2\u09F3\u09FB\u09FE\u0A01\u0A02\u0A3C\u0A41\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A70\u0A71\u0A75\u0A81\u0A82\u0ABC\u0AC1-\u0AC5\u0AC7\u0AC8\u0ACD\u0AE2\u0AE3\u0AF1\u0AFA-\u0AFF\u0B01\u0B3C\u0B3F\u0B41-\u0B44\u0B4D\u0B55\u0B56\u0B62\u0B63\u0B82\u0BC0\u0BCD\u0BF3-\u0BFA\u0C00\u0C04\u0C3E-\u0C40\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C62\u0C63\u0C78-\u0C7E\u0C81\u0CBC\u0CCC\u0CCD\u0CE2\u0CE3\u0D00\u0D01\u0D3B\u0D3C\u0D41-\u0D44\u0D4D\u0D62\u0D63\u0D81\u0DCA\u0DD2-\u0DD4\u0DD6\u0E31\u0E34-\u0E3A\u0E3F\u0E47-\u0E4E\u0EB1\u0EB4-\u0EBC\u0EC8-\u0ECD\u0F18\u0F19\u0F35\u0F37\u0F39-\u0F3D\u0F71-\u0F7E\u0F80-\u0F84\u0F86\u0F87\u0F8D-\u0F97\u0F99-\u0FBC\u0FC6\u102D-\u1030\u1032-\u1037\u1039\u103A\u103D\u103E\u1058\u1059\u105E-\u1060\u1071-\u1074\u1082\u1085\u1086\u108D\u109D\u135D-\u135F\u1390-\u1399\u1400\u169B\u169C\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17B4\u17B5\u17B7-\u17BD\u17C6\u17C9-\u17D3\u17DB\u17DD\u17F0-\u17F9\u1800-\u180E\u1885\u1886\u18A9\u1920-\u1922\u1927\u1928\u1932\u1939-\u193B\u1940\u1944\u1945\u19DE-\u19FF\u1A17\u1A18\u1A1B\u1A56\u1A58-\u1A5E\u1A60\u1A62\u1A65-\u1A6C\u1A73-\u1A7C\u1A7F\u1AB0-\u1AC0\u1B00-\u1B03\u1B34\u1B36-\u1B3A\u1B3C\u1B42\u1B6B-\u1B73\u1B80\u1B81\u1BA2-\u1BA5\u1BA8\u1BA9\u1BAB-\u1BAD\u1BE6\u1BE8\u1BE9\u1BED\u1BEF-\u1BF1\u1C2C-\u1C33\u1C36\u1C37\u1CD0-\u1CD2\u1CD4-\u1CE0\u1CE2-\u1CE8\u1CED\u1CF4\u1CF8\u1CF9\u1DC0-\u1DF9\u1DFB-\u1DFF\u1FBD\u1FBF-\u1FC1\u1FCD-\u1FCF\u1FDD-\u1FDF\u1FED-\u1FEF\u1FFD\u1FFE\u200B-\u200D\u200F-\u2027\u202F-\u205E\u2060-\u2064\u206A-\u2070\u2074-\u207E\u2080-\u208E\u20A0-\u20BF\u20D0-\u20F0\u2100\u2101\u2103-\u2106\u2108\u2109\u2114\u2116-\u2118\u211E-\u2123\u2125\u2127\u2129\u212E\u213A\u213B\u2140-\u2144\u214A-\u214D\u2150-\u215F\u2189-\u218B\u2190-\u2335\u237B-\u2394\u2396-\u2426\u2440-\u244A\u2460-\u249B\u24EA-\u26AB\u26AD-\u27FF\u2900-\u2B73\u2B76-\u2B95\u2B97-\u2BFF\u2CE5-\u2CEA\u2CEF-\u2CF1\u2CF9-\u2CFF\u2D7F\u2DE0-\u2E52\u2E80-\u2E99\u2E9B-\u2EF3\u2F00-\u2FD5\u2FF0-\u2FFB\u3001-\u3004\u3008-\u3020\u302A-\u302D\u3030\u3036\u3037\u303D-\u303F\u3099-\u309C\u30A0\u30FB\u31C0-\u31E3\u321D\u321E\u3250-\u325F\u327C-\u327E\u32B1-\u32BF\u32CC-\u32CF\u3377-\u337A\u33DE\u33DF\u33FF\u4DC0-\u4DFF\uA490-\uA4C6\uA60D-\uA60F\uA66F-\uA67F\uA69E\uA69F\uA6F0\uA6F1\uA700-\uA721\uA788\uA802\uA806\uA80B\uA825\uA826\uA828-\uA82C\uA838\uA839\uA874-\uA877\uA8C4\uA8C5\uA8E0-\uA8F1\uA8FF\uA926-\uA92D\uA947-\uA951\uA980-\uA982\uA9B3\uA9B6-\uA9B9\uA9BC\uA9BD\uA9E5\uAA29-\uAA2E\uAA31\uAA32\uAA35\uAA36\uAA43\uAA4C\uAA7C\uAAB0\uAAB2-\uAAB4\uAAB7\uAAB8\uAABE\uAABF\uAAC1\uAAEC\uAAED\uAAF6\uAB6A\uAB6B\uABE5\uABE8\uABED\uFB1D-\uFB36\uFB38-\uFB3C\uFB3E\uFB40\uFB41\uFB43\uFB44\uFB46-\uFBC1\uFBD3-\uFD3F\uFD50-\uFD8F\uFD92-\uFDC7\uFDF0-\uFDFD\uFE00-\uFE19\uFE20-\uFE52\uFE54-\uFE66\uFE68-\uFE6B\uFE70-\uFE74\uFE76-\uFEFC\uFEFF\uFF01-\uFF20\uFF3B-\uFF40\uFF5B-\uFF65\uFFE0-\uFFE6\uFFE8-\uFFEE\uFFF9-\uFFFD\u{10101}\u{10140}-\u{1018C}\u{10190}-\u{1019C}\u{101A0}\u{101FD}\u{102E0}-\u{102FB}\u{10376}-\u{1037A}\u{10800}-\u{10805}\u{10808}\u{1080A}-\u{108
35}\u{10837}\u{10838}\u{1083C}\u{1083F}-\u{10855}\u{10857}-\u{1089E}\u{108A7}-\u{108AF}\u{108E0}-\u{108F2}\u{108F4}\u{108F5}\u{108FB}-\u{1091B}\u{1091F}-\u{10939}\u{1093F}\u{10980}-\u{109B7}\u{109BC}-\u{109CF}\u{109D2}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A13}\u{10A15}-\u{10A17}\u{10A19}-\u{10A35}\u{10A38}-\u{10A3A}\u{10A3F}-\u{10A48}\u{10A50}-\u{10A58}\u{10A60}-\u{10A9F}\u{10AC0}-\u{10AE6}\u{10AEB}-\u{10AF6}\u{10B00}-\u{10B35}\u{10B39}-\u{10B55}\u{10B58}-\u{10B72}\u{10B78}-\u{10B91}\u{10B99}-\u{10B9C}\u{10BA9}-\u{10BAF}\u{10C00}-\u{10C48}\u{10C80}-\u{10CB2}\u{10CC0}-\u{10CF2}\u{10CFA}-\u{10D27}\u{10D30}-\u{10D39}\u{10E60}-\u{10E7E}\u{10E80}-\u{10EA9}\u{10EAB}-\u{10EAD}\u{10EB0}\u{10EB1}\u{10F00}-\u{10F27}\u{10F30}-\u{10F59}\u{10FB0}-\u{10FCB}\u{10FE0}-\u{10FF6}\u{11001}\u{11038}-\u{11046}\u{11052}-\u{11065}\u{1107F}-\u{11081}\u{110B3}-\u{110B6}\u{110B9}\u{110BA}\u{11100}-\u{11102}\u{11127}-\u{1112B}\u{1112D}-\u{11134}\u{11173}\u{11180}\u{11181}\u{111B6}-\u{111BE}\u{111C9}-\u{111CC}\u{111CF}\u{1122F}-\u{11231}\u{11234}\u{11236}\u{11237}\u{1123E}\u{112DF}\u{112E3}-\u{112EA}\u{11300}\u{11301}\u{1133B}\u{1133C}\u{11340}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11438}-\u{1143F}\u{11442}-\u{11444}\u{11446}\u{1145E}\u{114B3}-\u{114B8}\u{114BA}\u{114BF}\u{114C0}\u{114C2}\u{114C3}\u{115B2}-\u{115B5}\u{115BC}\u{115BD}\u{115BF}\u{115C0}\u{115DC}\u{115DD}\u{11633}-\u{1163A}\u{1163D}\u{1163F}\u{11640}\u{11660}-\u{1166C}\u{116AB}\u{116AD}\u{116B0}-\u{116B5}\u{116B7}\u{1171D}-\u{1171F}\u{11722}-\u{11725}\u{11727}-\u{1172B}\u{1182F}-\u{11837}\u{11839}\u{1183A}\u{1193B}\u{1193C}\u{1193E}\u{11943}\u{119D4}-\u{119D7}\u{119DA}\u{119DB}\u{119E0}\u{11A01}-\u{11A06}\u{11A09}\u{11A0A}\u{11A33}-\u{11A38}\u{11A3B}-\u{11A3E}\u{11A47}\u{11A51}-\u{11A56}\u{11A59}-\u{11A5B}\u{11A8A}-\u{11A96}\u{11A98}\u{11A99}\u{11C30}-\u{11C36}\u{11C38}-\u{11C3D}\u{11C92}-\u{11CA7}\u{11CAA}-\u{11CB0}\u{11CB2}\u{11CB3}\u{11CB5}\u{11CB6}\u{11D31}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D45}\u{11D47}\u{11D90}\u{11D91}\u{11D95}\u{11D97}\u{11EF3}\u{11EF4}\u{11FD5}-\u{11FF1}\u{16AF0}-\u{16AF4}\u{16B30}-\u{16B36}\u{16F4F}\u{16F8F}-\u{16F92}\u{16FE2}\u{16FE4}\u{1BC9D}\u{1BC9E}\u{1BCA0}-\u{1BCA3}\u{1D167}-\u{1D169}\u{1D173}-\u{1D182}\u{1D185}-\u{1D18B}\u{1D1AA}-\u{1D1AD}\u{1D200}-\u{1D245}\u{1D300}-\u{1D356}\u{1D6DB}\u{1D715}\u{1D74F}\u{1D789}\u{1D7C3}\u{1D7CE}-\u{1D7FF}\u{1DA00}-\u{1DA36}\u{1DA3B}-\u{1DA6C}\u{1DA75}\u{1DA84}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E130}-\u{1E136}\u{1E2EC}-\u{1E2EF}\u{1E2FF}\u{1E800}-\u{1E8C4}\u{1E8C7}-\u{1E8D6}\u{1E900}-\u{1E94B}\u{1E950}-\u{1E959}\u{1E95E}\u{1E95F}\u{1EC71}-\u{1ECB4}\u{1ED01}-\u{1ED3D}\u{1EE00}-\u{1EE03}\u{1EE05}-\u{1EE1F}\u{1EE21}\u{1EE22}\u{1EE24}\u{1EE27}\u{1EE29}-\u{1EE32}\u{1EE34}-\u{1EE37}\u{1EE39}\u{1EE3B}\u{1EE42}\u{1EE47}\u{1EE49}\u{1EE4B}\u{1EE4D}-\u{1EE4F}\u{1EE51}\u{1EE52}\u{1EE54}\u{1EE57}\u{1EE59}\u{1EE5B}\u{1EE5D}\u{1EE5F}\u{1EE61}\u{1EE62}\u{1EE64}\u{1EE67}-\u{1EE6A}\u{1EE6C}-\u{1EE72}\u{1EE74}-\u{1EE77}\u{1EE79}-\u{1EE7C}\u{1EE7E}\u{1EE80}-\u{1EE89}\u{1EE8B}-\u{1EE9B}\u{1EEA1}-\u{1EEA3}\u{1EEA5}-\u{1EEA9}\u{1EEAB}-\u{1EEBB}\u{1EEF0}\u{1EEF1}\u{1F000}-\u{1F02B}\u{1F030}-\u{1F093}\u{1F0A0}-\u{1F0AE}\u{1F0B1}-\u{1F0BF}\u{1F0C1}-\u{1F0CF}\u{1F0D1}-\u{1F0F5}\u{1F100}-\u{1F10F}\u{1F12F}\u{1F16A}-\u{1F16F}\u{1F1AD}\u{1F260}-\u{1F265}\u{1F300}-\u{1F6D7}\u{1F6E0}-\u{1F6EC}\u{1F6F0}-\u{1F6FC}\u{1F700}-\u{1F773}\u{1F780}-\u{1F7D8}\u{1F7E0}-\u{1F7EB}\u{1F800}-\u{1F80B}\u{1F810}-
\u{1F847}\u{1F850}-\u{1F859}\u{1F860}-\u{1F887}\u{1F890}-\u{1F8AD}\u{1F8B0}\u{1F8B1}\u{1F900}-\u{1F978}\u{1F97A}-\u{1F9CB}\u{1F9CD}-\u{1FA53}\u{1FA60}-\u{1FA6D}\u{1FA70}-\u{1FA74}\u{1FA78}-\u{1FA7A}\u{1FA80}-\u{1FA86}\u{1FA90}-\u{1FAA8}\u{1FAB0}-\u{1FAB6}\u{1FAC0}-\u{1FAC2}\u{1FAD0}-\u{1FAD6}\u{1FB00}-\u{1FB92}\u{1FB94}-\u{1FBCA}\u{1FBF0}-\u{1FBF9}\u{E0001}\u{E0020}-\u{E007F}\u{E0100}-\u{E01EF}]*$/u; +const bidiS3 = /[0-9\xB2\xB3\xB9\u05BE\u05C0\u05C3\u05C6\u05D0-\u05EA\u05EF-\u05F4\u0600-\u0605\u0608\u060B\u060D\u061B\u061C\u061E-\u064A\u0660-\u0669\u066B-\u066F\u0671-\u06D5\u06DD\u06E5\u06E6\u06EE-\u070D\u070F\u0710\u0712-\u072F\u074D-\u07A5\u07B1\u07C0-\u07EA\u07F4\u07F5\u07FA\u07FE-\u0815\u081A\u0824\u0828\u0830-\u083E\u0840-\u0858\u085E\u0860-\u086A\u08A0-\u08B4\u08B6-\u08C7\u08E2\u200F\u2070\u2074-\u2079\u2080-\u2089\u2488-\u249B\uFB1D\uFB1F-\uFB28\uFB2A-\uFB36\uFB38-\uFB3C\uFB3E\uFB40\uFB41\uFB43\uFB44\uFB46-\uFBC1\uFBD3-\uFD3D\uFD50-\uFD8F\uFD92-\uFDC7\uFDF0-\uFDFC\uFE70-\uFE74\uFE76-\uFEFC\uFF10-\uFF19\u{102E1}-\u{102FB}\u{10800}-\u{10805}\u{10808}\u{1080A}-\u{10835}\u{10837}\u{10838}\u{1083C}\u{1083F}-\u{10855}\u{10857}-\u{1089E}\u{108A7}-\u{108AF}\u{108E0}-\u{108F2}\u{108F4}\u{108F5}\u{108FB}-\u{1091B}\u{10920}-\u{10939}\u{1093F}\u{10980}-\u{109B7}\u{109BC}-\u{109CF}\u{109D2}-\u{10A00}\u{10A10}-\u{10A13}\u{10A15}-\u{10A17}\u{10A19}-\u{10A35}\u{10A40}-\u{10A48}\u{10A50}-\u{10A58}\u{10A60}-\u{10A9F}\u{10AC0}-\u{10AE4}\u{10AEB}-\u{10AF6}\u{10B00}-\u{10B35}\u{10B40}-\u{10B55}\u{10B58}-\u{10B72}\u{10B78}-\u{10B91}\u{10B99}-\u{10B9C}\u{10BA9}-\u{10BAF}\u{10C00}-\u{10C48}\u{10C80}-\u{10CB2}\u{10CC0}-\u{10CF2}\u{10CFA}-\u{10D23}\u{10D30}-\u{10D39}\u{10E60}-\u{10E7E}\u{10E80}-\u{10EA9}\u{10EAD}\u{10EB0}\u{10EB1}\u{10F00}-\u{10F27}\u{10F30}-\u{10F45}\u{10F51}-\u{10F59}\u{10FB0}-\u{10FCB}\u{10FE0}-\u{10FF6}\u{1D7CE}-\u{1D7FF}\u{1E800}-\u{1E8C4}\u{1E8C7}-\u{1E8CF}\u{1E900}-\u{1E943}\u{1E94B}\u{1E950}-\u{1E959}\u{1E95E}\u{1E95F}\u{1EC71}-\u{1ECB4}\u{1ED01}-\u{1ED3D}\u{1EE00}-\u{1EE03}\u{1EE05}-\u{1EE1F}\u{1EE21}\u{1EE22}\u{1EE24}\u{1EE27}\u{1EE29}-\u{1EE32}\u{1EE34}-\u{1EE37}\u{1EE39}\u{1EE3B}\u{1EE42}\u{1EE47}\u{1EE49}\u{1EE4B}\u{1EE4D}-\u{1EE4F}\u{1EE51}\u{1EE52}\u{1EE54}\u{1EE57}\u{1EE59}\u{1EE5B}\u{1EE5D}\u{1EE5F}\u{1EE61}\u{1EE62}\u{1EE64}\u{1EE67}-\u{1EE6A}\u{1EE6C}-\u{1EE72}\u{1EE74}-\u{1EE77}\u{1EE79}-\u{1EE7C}\u{1EE7E}\u{1EE80}-\u{1EE89}\u{1EE8B}-\u{1EE9B}\u{1EEA1}-\u{1EEA3}\u{1EEA5}-\u{1EEA9}\u{1EEAB}-\u{1EEBB}\u{1F100}-\u{1F10A}\u{1FBF0}-\u{1FBF9}][\u0300-\u036F\u0483-\u0489\u0591-\u05BD\u05BF\u05C1\u05C2\u05C4\u05C5\u05C7\u0610-\u061A\u064B-\u065F\u0670\u06D6-\u06DC\u06DF-\u06E4\u06E7\u06E8\u06EA-\u06ED\u0711\u0730-\u074A\u07A6-\u07B0\u07EB-\u07F3\u07FD\u0816-\u0819\u081B-\u0823\u0825-\u0827\u0829-\u082D\u0859-\u085B\u08D3-\u08E1\u08E3-\u0902\u093A\u093C\u0941-\u0948\u094D\u0951-\u0957\u0962\u0963\u0981\u09BC\u09C1-\u09C4\u09CD\u09E2\u09E3\u09FE\u0A01\u0A02\u0A3C\u0A41\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A70\u0A71\u0A75\u0A81\u0A82\u0ABC\u0AC1-\u0AC5\u0AC7\u0AC8\u0ACD\u0AE2\u0AE3\u0AFA-\u0AFF\u0B01\u0B3C\u0B3F\u0B41-\u0B44\u0B4D\u0B55\u0B56\u0B62\u0B63\u0B82\u0BC0\u0BCD\u0C00\u0C04\u0C3E-\u0C40\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C62\u0C63\u0C81\u0CBC\u0CCC\u0CCD\u0CE2\u0CE3\u0D00\u0D01\u0D3B\u0D3C\u0D41-\u0D44\u0D4D\u0D62\u0D63\u0D81\u0DCA\u0DD2-\u0DD4\u0DD6\u0E31\u0E34-\u0E3A\u0E47-\u0E4E\u0EB1\u0EB4-\u0EBC\u0EC8-\u0ECD\u0F18\u0F19\u0F35\u0F37\u0F39\u0F71-\u0F7E\u0F80-\u0F84\u0F86\u0F87\u0F8D-\u0F97\u0F99-\u0FBC\u0FC6\u102D-\u1030\u1032-\u1037\u1039\u103A\u103D\u103E\u105
8\u1059\u105E-\u1060\u1071-\u1074\u1082\u1085\u1086\u108D\u109D\u135D-\u135F\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17B4\u17B5\u17B7-\u17BD\u17C6\u17C9-\u17D3\u17DD\u180B-\u180D\u1885\u1886\u18A9\u1920-\u1922\u1927\u1928\u1932\u1939-\u193B\u1A17\u1A18\u1A1B\u1A56\u1A58-\u1A5E\u1A60\u1A62\u1A65-\u1A6C\u1A73-\u1A7C\u1A7F\u1AB0-\u1AC0\u1B00-\u1B03\u1B34\u1B36-\u1B3A\u1B3C\u1B42\u1B6B-\u1B73\u1B80\u1B81\u1BA2-\u1BA5\u1BA8\u1BA9\u1BAB-\u1BAD\u1BE6\u1BE8\u1BE9\u1BED\u1BEF-\u1BF1\u1C2C-\u1C33\u1C36\u1C37\u1CD0-\u1CD2\u1CD4-\u1CE0\u1CE2-\u1CE8\u1CED\u1CF4\u1CF8\u1CF9\u1DC0-\u1DF9\u1DFB-\u1DFF\u20D0-\u20F0\u2CEF-\u2CF1\u2D7F\u2DE0-\u2DFF\u302A-\u302D\u3099\u309A\uA66F-\uA672\uA674-\uA67D\uA69E\uA69F\uA6F0\uA6F1\uA802\uA806\uA80B\uA825\uA826\uA82C\uA8C4\uA8C5\uA8E0-\uA8F1\uA8FF\uA926-\uA92D\uA947-\uA951\uA980-\uA982\uA9B3\uA9B6-\uA9B9\uA9BC\uA9BD\uA9E5\uAA29-\uAA2E\uAA31\uAA32\uAA35\uAA36\uAA43\uAA4C\uAA7C\uAAB0\uAAB2-\uAAB4\uAAB7\uAAB8\uAABE\uAABF\uAAC1\uAAEC\uAAED\uAAF6\uABE5\uABE8\uABED\uFB1E\uFE00-\uFE0F\uFE20-\uFE2F\u{101FD}\u{102E0}\u{10376}-\u{1037A}\u{10A01}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A0F}\u{10A38}-\u{10A3A}\u{10A3F}\u{10AE5}\u{10AE6}\u{10D24}-\u{10D27}\u{10EAB}\u{10EAC}\u{10F46}-\u{10F50}\u{11001}\u{11038}-\u{11046}\u{1107F}-\u{11081}\u{110B3}-\u{110B6}\u{110B9}\u{110BA}\u{11100}-\u{11102}\u{11127}-\u{1112B}\u{1112D}-\u{11134}\u{11173}\u{11180}\u{11181}\u{111B6}-\u{111BE}\u{111C9}-\u{111CC}\u{111CF}\u{1122F}-\u{11231}\u{11234}\u{11236}\u{11237}\u{1123E}\u{112DF}\u{112E3}-\u{112EA}\u{11300}\u{11301}\u{1133B}\u{1133C}\u{11340}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11438}-\u{1143F}\u{11442}-\u{11444}\u{11446}\u{1145E}\u{114B3}-\u{114B8}\u{114BA}\u{114BF}\u{114C0}\u{114C2}\u{114C3}\u{115B2}-\u{115B5}\u{115BC}\u{115BD}\u{115BF}\u{115C0}\u{115DC}\u{115DD}\u{11633}-\u{1163A}\u{1163D}\u{1163F}\u{11640}\u{116AB}\u{116AD}\u{116B0}-\u{116B5}\u{116B7}\u{1171D}-\u{1171F}\u{11722}-\u{11725}\u{11727}-\u{1172B}\u{1182F}-\u{11837}\u{11839}\u{1183A}\u{1193B}\u{1193C}\u{1193E}\u{11943}\u{119D4}-\u{119D7}\u{119DA}\u{119DB}\u{119E0}\u{11A01}-\u{11A06}\u{11A09}\u{11A0A}\u{11A33}-\u{11A38}\u{11A3B}-\u{11A3E}\u{11A47}\u{11A51}-\u{11A56}\u{11A59}-\u{11A5B}\u{11A8A}-\u{11A96}\u{11A98}\u{11A99}\u{11C30}-\u{11C36}\u{11C38}-\u{11C3D}\u{11C92}-\u{11CA7}\u{11CAA}-\u{11CB0}\u{11CB2}\u{11CB3}\u{11CB5}\u{11CB6}\u{11D31}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D45}\u{11D47}\u{11D90}\u{11D91}\u{11D95}\u{11D97}\u{11EF3}\u{11EF4}\u{16AF0}-\u{16AF4}\u{16B30}-\u{16B36}\u{16F4F}\u{16F8F}-\u{16F92}\u{16FE4}\u{1BC9D}\u{1BC9E}\u{1D167}-\u{1D169}\u{1D17B}-\u{1D182}\u{1D185}-\u{1D18B}\u{1D1AA}-\u{1D1AD}\u{1D242}-\u{1D244}\u{1DA00}-\u{1DA36}\u{1DA3B}-\u{1DA6C}\u{1DA75}\u{1DA84}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E130}-\u{1E136}\u{1E2EC}-\u{1E2EF}\u{1E8D0}-\u{1E8D6}\u{1E944}-\u{1E94A}\u{E0100}-\u{E01EF}]*$/u; +const bidiS4EN = /[0-9\xB2\xB3\xB9\u06F0-\u06F9\u2070\u2074-\u2079\u2080-\u2089\u2488-\u249B\uFF10-\uFF19\u{102E1}-\u{102FB}\u{1D7CE}-\u{1D7FF}\u{1F100}-\u{1F10A}\u{1FBF0}-\u{1FBF9}]/u; +const bidiS4AN = /[\u0600-\u0605\u0660-\u0669\u066B\u066C\u06DD\u08E2\u{10D30}-\u{10D39}\u{10E60}-\u{10E7E}]/u; +const bidiS5 = 
/^[\0-\x08\x0E-\x1B!-\x84\x86-\u0377\u037A-\u037F\u0384-\u038A\u038C\u038E-\u03A1\u03A3-\u052F\u0531-\u0556\u0559-\u058A\u058D-\u058F\u0591-\u05BD\u05BF\u05C1\u05C2\u05C4\u05C5\u05C7\u0606\u0607\u0609\u060A\u060C\u060E-\u061A\u064B-\u065F\u066A\u0670\u06D6-\u06DC\u06DE-\u06E4\u06E7-\u06ED\u06F0-\u06F9\u0711\u0730-\u074A\u07A6-\u07B0\u07EB-\u07F3\u07F6-\u07F9\u07FD\u0816-\u0819\u081B-\u0823\u0825-\u0827\u0829-\u082D\u0859-\u085B\u08D3-\u08E1\u08E3-\u0983\u0985-\u098C\u098F\u0990\u0993-\u09A8\u09AA-\u09B0\u09B2\u09B6-\u09B9\u09BC-\u09C4\u09C7\u09C8\u09CB-\u09CE\u09D7\u09DC\u09DD\u09DF-\u09E3\u09E6-\u09FE\u0A01-\u0A03\u0A05-\u0A0A\u0A0F\u0A10\u0A13-\u0A28\u0A2A-\u0A30\u0A32\u0A33\u0A35\u0A36\u0A38\u0A39\u0A3C\u0A3E-\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A59-\u0A5C\u0A5E\u0A66-\u0A76\u0A81-\u0A83\u0A85-\u0A8D\u0A8F-\u0A91\u0A93-\u0AA8\u0AAA-\u0AB0\u0AB2\u0AB3\u0AB5-\u0AB9\u0ABC-\u0AC5\u0AC7-\u0AC9\u0ACB-\u0ACD\u0AD0\u0AE0-\u0AE3\u0AE6-\u0AF1\u0AF9-\u0AFF\u0B01-\u0B03\u0B05-\u0B0C\u0B0F\u0B10\u0B13-\u0B28\u0B2A-\u0B30\u0B32\u0B33\u0B35-\u0B39\u0B3C-\u0B44\u0B47\u0B48\u0B4B-\u0B4D\u0B55-\u0B57\u0B5C\u0B5D\u0B5F-\u0B63\u0B66-\u0B77\u0B82\u0B83\u0B85-\u0B8A\u0B8E-\u0B90\u0B92-\u0B95\u0B99\u0B9A\u0B9C\u0B9E\u0B9F\u0BA3\u0BA4\u0BA8-\u0BAA\u0BAE-\u0BB9\u0BBE-\u0BC2\u0BC6-\u0BC8\u0BCA-\u0BCD\u0BD0\u0BD7\u0BE6-\u0BFA\u0C00-\u0C0C\u0C0E-\u0C10\u0C12-\u0C28\u0C2A-\u0C39\u0C3D-\u0C44\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C58-\u0C5A\u0C60-\u0C63\u0C66-\u0C6F\u0C77-\u0C8C\u0C8E-\u0C90\u0C92-\u0CA8\u0CAA-\u0CB3\u0CB5-\u0CB9\u0CBC-\u0CC4\u0CC6-\u0CC8\u0CCA-\u0CCD\u0CD5\u0CD6\u0CDE\u0CE0-\u0CE3\u0CE6-\u0CEF\u0CF1\u0CF2\u0D00-\u0D0C\u0D0E-\u0D10\u0D12-\u0D44\u0D46-\u0D48\u0D4A-\u0D4F\u0D54-\u0D63\u0D66-\u0D7F\u0D81-\u0D83\u0D85-\u0D96\u0D9A-\u0DB1\u0DB3-\u0DBB\u0DBD\u0DC0-\u0DC6\u0DCA\u0DCF-\u0DD4\u0DD6\u0DD8-\u0DDF\u0DE6-\u0DEF\u0DF2-\u0DF4\u0E01-\u0E3A\u0E3F-\u0E5B\u0E81\u0E82\u0E84\u0E86-\u0E8A\u0E8C-\u0EA3\u0EA5\u0EA7-\u0EBD\u0EC0-\u0EC4\u0EC6\u0EC8-\u0ECD\u0ED0-\u0ED9\u0EDC-\u0EDF\u0F00-\u0F47\u0F49-\u0F6C\u0F71-\u0F97\u0F99-\u0FBC\u0FBE-\u0FCC\u0FCE-\u0FDA\u1000-\u10C5\u10C7\u10CD\u10D0-\u1248\u124A-\u124D\u1250-\u1256\u1258\u125A-\u125D\u1260-\u1288\u128A-\u128D\u1290-\u12B0\u12B2-\u12B5\u12B8-\u12BE\u12C0\u12C2-\u12C5\u12C8-\u12D6\u12D8-\u1310\u1312-\u1315\u1318-\u135A\u135D-\u137C\u1380-\u1399\u13A0-\u13F5\u13F8-\u13FD\u1400-\u167F\u1681-\u169C\u16A0-\u16F8\u1700-\u170C\u170E-\u1714\u1720-\u1736\u1740-\u1753\u1760-\u176C\u176E-\u1770\u1772\u1773\u1780-\u17DD\u17E0-\u17E9\u17F0-\u17F9\u1800-\u180E\u1810-\u1819\u1820-\u1878\u1880-\u18AA\u18B0-\u18F5\u1900-\u191E\u1920-\u192B\u1930-\u193B\u1940\u1944-\u196D\u1970-\u1974\u1980-\u19AB\u19B0-\u19C9\u19D0-\u19DA\u19DE-\u1A1B\u1A1E-\u1A5E\u1A60-\u1A7C\u1A7F-\u1A89\u1A90-\u1A99\u1AA0-\u1AAD\u1AB0-\u1AC0\u1B00-\u1B4B\u1B50-\u1B7C\u1B80-\u1BF3\u1BFC-\u1C37\u1C3B-\u1C49\u1C4D-\u1C88\u1C90-\u1CBA\u1CBD-\u1CC7\u1CD0-\u1CFA\u1D00-\u1DF9\u1DFB-\u1F15\u1F18-\u1F1D\u1F20-\u1F45\u1F48-\u1F4D\u1F50-\u1F57\u1F59\u1F5B\u1F5D\u1F5F-\u1F7D\u1F80-\u1FB4\u1FB6-\u1FC4\u1FC6-\u1FD3\u1FD6-\u1FDB\u1FDD-\u1FEF\u1FF2-\u1FF4\u1FF6-\u1FFE\u200B-\u200E\u2010-\u2027\u202F-\u205E\u2060-\u2064\u206A-\u2071\u2074-\u208E\u2090-\u209C\u20A0-\u20BF\u20D0-\u20F0\u2100-\u218B\u2190-\u2426\u2440-\u244A\u2460-\u2B73\u2B76-\u2B95\u2B97-\u2C2E\u2C30-\u2C5E\u2C60-\u2CF3\u2CF9-\u2D25\u2D27\u2D2D\u2D30-\u2D67\u2D6F\u2D70\u2D7F-\u2D96\u2DA0-\u2DA6\u2DA8-\u2DAE\u2DB0-\u2DB6\u2DB8-\u2DBE\u2DC0-\u2DC6\u2DC8-\u2DCE\u2DD0-\u2DD6\u2DD8-\u2DDE\u2DE0-\u2E52\u2E80-\u2E99\u2E9B-\u2EF3\u2F00-\u2FD5\u2FF0-\u2FFB\
u3001-\u303F\u3041-\u3096\u3099-\u30FF\u3105-\u312F\u3131-\u318E\u3190-\u31E3\u31F0-\u321E\u3220-\u9FFC\uA000-\uA48C\uA490-\uA4C6\uA4D0-\uA62B\uA640-\uA6F7\uA700-\uA7BF\uA7C2-\uA7CA\uA7F5-\uA82C\uA830-\uA839\uA840-\uA877\uA880-\uA8C5\uA8CE-\uA8D9\uA8E0-\uA953\uA95F-\uA97C\uA980-\uA9CD\uA9CF-\uA9D9\uA9DE-\uA9FE\uAA00-\uAA36\uAA40-\uAA4D\uAA50-\uAA59\uAA5C-\uAAC2\uAADB-\uAAF6\uAB01-\uAB06\uAB09-\uAB0E\uAB11-\uAB16\uAB20-\uAB26\uAB28-\uAB2E\uAB30-\uAB6B\uAB70-\uABED\uABF0-\uABF9\uAC00-\uD7A3\uD7B0-\uD7C6\uD7CB-\uD7FB\uD800-\uFA6D\uFA70-\uFAD9\uFB00-\uFB06\uFB13-\uFB17\uFB1E\uFB29\uFD3E\uFD3F\uFDFD\uFE00-\uFE19\uFE20-\uFE52\uFE54-\uFE66\uFE68-\uFE6B\uFEFF\uFF01-\uFFBE\uFFC2-\uFFC7\uFFCA-\uFFCF\uFFD2-\uFFD7\uFFDA-\uFFDC\uFFE0-\uFFE6\uFFE8-\uFFEE\uFFF9-\uFFFD\u{10000}-\u{1000B}\u{1000D}-\u{10026}\u{10028}-\u{1003A}\u{1003C}\u{1003D}\u{1003F}-\u{1004D}\u{10050}-\u{1005D}\u{10080}-\u{100FA}\u{10100}-\u{10102}\u{10107}-\u{10133}\u{10137}-\u{1018E}\u{10190}-\u{1019C}\u{101A0}\u{101D0}-\u{101FD}\u{10280}-\u{1029C}\u{102A0}-\u{102D0}\u{102E0}-\u{102FB}\u{10300}-\u{10323}\u{1032D}-\u{1034A}\u{10350}-\u{1037A}\u{10380}-\u{1039D}\u{1039F}-\u{103C3}\u{103C8}-\u{103D5}\u{10400}-\u{1049D}\u{104A0}-\u{104A9}\u{104B0}-\u{104D3}\u{104D8}-\u{104FB}\u{10500}-\u{10527}\u{10530}-\u{10563}\u{1056F}\u{10600}-\u{10736}\u{10740}-\u{10755}\u{10760}-\u{10767}\u{1091F}\u{10A01}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A0F}\u{10A38}-\u{10A3A}\u{10A3F}\u{10AE5}\u{10AE6}\u{10B39}-\u{10B3F}\u{10D24}-\u{10D27}\u{10EAB}\u{10EAC}\u{10F46}-\u{10F50}\u{11000}-\u{1104D}\u{11052}-\u{1106F}\u{1107F}-\u{110C1}\u{110CD}\u{110D0}-\u{110E8}\u{110F0}-\u{110F9}\u{11100}-\u{11134}\u{11136}-\u{11147}\u{11150}-\u{11176}\u{11180}-\u{111DF}\u{111E1}-\u{111F4}\u{11200}-\u{11211}\u{11213}-\u{1123E}\u{11280}-\u{11286}\u{11288}\u{1128A}-\u{1128D}\u{1128F}-\u{1129D}\u{1129F}-\u{112A9}\u{112B0}-\u{112EA}\u{112F0}-\u{112F9}\u{11300}-\u{11303}\u{11305}-\u{1130C}\u{1130F}\u{11310}\u{11313}-\u{11328}\u{1132A}-\u{11330}\u{11332}\u{11333}\u{11335}-\u{11339}\u{1133B}-\u{11344}\u{11347}\u{11348}\u{1134B}-\u{1134D}\u{11350}\u{11357}\u{1135D}-\u{11363}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11400}-\u{1145B}\u{1145D}-\u{11461}\u{11480}-\u{114C7}\u{114D0}-\u{114D9}\u{11580}-\u{115B5}\u{115B8}-\u{115DD}\u{11600}-\u{11644}\u{11650}-\u{11659}\u{11660}-\u{1166C}\u{11680}-\u{116B8}\u{116C0}-\u{116C9}\u{11700}-\u{1171A}\u{1171D}-\u{1172B}\u{11730}-\u{1173F}\u{11800}-\u{1183B}\u{118A0}-\u{118F2}\u{118FF}-\u{11906}\u{11909}\u{1190C}-\u{11913}\u{11915}\u{11916}\u{11918}-\u{11935}\u{11937}\u{11938}\u{1193B}-\u{11946}\u{11950}-\u{11959}\u{119A0}-\u{119A7}\u{119AA}-\u{119D7}\u{119DA}-\u{119E4}\u{11A00}-\u{11A47}\u{11A50}-\u{11AA2}\u{11AC0}-\u{11AF8}\u{11C00}-\u{11C08}\u{11C0A}-\u{11C36}\u{11C38}-\u{11C45}\u{11C50}-\u{11C6C}\u{11C70}-\u{11C8F}\u{11C92}-\u{11CA7}\u{11CA9}-\u{11CB6}\u{11D00}-\u{11D06}\u{11D08}\u{11D09}\u{11D0B}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D47}\u{11D50}-\u{11D59}\u{11D60}-\u{11D65}\u{11D67}\u{11D68}\u{11D6A}-\u{11D8E}\u{11D90}\u{11D91}\u{11D93}-\u{11D98}\u{11DA0}-\u{11DA9}\u{11EE0}-\u{11EF8}\u{11FB0}\u{11FC0}-\u{11FF1}\u{11FFF}-\u{12399}\u{12400}-\u{1246E}\u{12470}-\u{12474}\u{12480}-\u{12543}\u{13000}-\u{1342E}\u{13430}-\u{13438}\u{14400}-\u{14646}\u{16800}-\u{16A38}\u{16A40}-\u{16A5E}\u{16A60}-\u{16A69}\u{16A6E}\u{16A6F}\u{16AD0}-\u{16AED}\u{16AF0}-\u{16AF5}\u{16B00}-\u{16B45}\u{16B50}-\u{16B59}\u{16B5B}-\u{16B61}\u{16B63}-\u{16B77}\u{16B7D}-\u{16B8F}\u{16E40}-\u{16E9A}\u{16F00}-\u{16F4A}\u{16F4F}-\u{16F87}\u{16F8F}-\u{16F9F}\u{16FE0}
-\u{16FE4}\u{16FF0}\u{16FF1}\u{17000}-\u{187F7}\u{18800}-\u{18CD5}\u{18D00}-\u{18D08}\u{1B000}-\u{1B11E}\u{1B150}-\u{1B152}\u{1B164}-\u{1B167}\u{1B170}-\u{1B2FB}\u{1BC00}-\u{1BC6A}\u{1BC70}-\u{1BC7C}\u{1BC80}-\u{1BC88}\u{1BC90}-\u{1BC99}\u{1BC9C}-\u{1BCA3}\u{1D000}-\u{1D0F5}\u{1D100}-\u{1D126}\u{1D129}-\u{1D1E8}\u{1D200}-\u{1D245}\u{1D2E0}-\u{1D2F3}\u{1D300}-\u{1D356}\u{1D360}-\u{1D378}\u{1D400}-\u{1D454}\u{1D456}-\u{1D49C}\u{1D49E}\u{1D49F}\u{1D4A2}\u{1D4A5}\u{1D4A6}\u{1D4A9}-\u{1D4AC}\u{1D4AE}-\u{1D4B9}\u{1D4BB}\u{1D4BD}-\u{1D4C3}\u{1D4C5}-\u{1D505}\u{1D507}-\u{1D50A}\u{1D50D}-\u{1D514}\u{1D516}-\u{1D51C}\u{1D51E}-\u{1D539}\u{1D53B}-\u{1D53E}\u{1D540}-\u{1D544}\u{1D546}\u{1D54A}-\u{1D550}\u{1D552}-\u{1D6A5}\u{1D6A8}-\u{1D7CB}\u{1D7CE}-\u{1DA8B}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E100}-\u{1E12C}\u{1E130}-\u{1E13D}\u{1E140}-\u{1E149}\u{1E14E}\u{1E14F}\u{1E2C0}-\u{1E2F9}\u{1E2FF}\u{1E8D0}-\u{1E8D6}\u{1E944}-\u{1E94A}\u{1EEF0}\u{1EEF1}\u{1F000}-\u{1F02B}\u{1F030}-\u{1F093}\u{1F0A0}-\u{1F0AE}\u{1F0B1}-\u{1F0BF}\u{1F0C1}-\u{1F0CF}\u{1F0D1}-\u{1F0F5}\u{1F100}-\u{1F1AD}\u{1F1E6}-\u{1F202}\u{1F210}-\u{1F23B}\u{1F240}-\u{1F248}\u{1F250}\u{1F251}\u{1F260}-\u{1F265}\u{1F300}-\u{1F6D7}\u{1F6E0}-\u{1F6EC}\u{1F6F0}-\u{1F6FC}\u{1F700}-\u{1F773}\u{1F780}-\u{1F7D8}\u{1F7E0}-\u{1F7EB}\u{1F800}-\u{1F80B}\u{1F810}-\u{1F847}\u{1F850}-\u{1F859}\u{1F860}-\u{1F887}\u{1F890}-\u{1F8AD}\u{1F8B0}\u{1F8B1}\u{1F900}-\u{1F978}\u{1F97A}-\u{1F9CB}\u{1F9CD}-\u{1FA53}\u{1FA60}-\u{1FA6D}\u{1FA70}-\u{1FA74}\u{1FA78}-\u{1FA7A}\u{1FA80}-\u{1FA86}\u{1FA90}-\u{1FAA8}\u{1FAB0}-\u{1FAB6}\u{1FAC0}-\u{1FAC2}\u{1FAD0}-\u{1FAD6}\u{1FB00}-\u{1FB92}\u{1FB94}-\u{1FBCA}\u{1FBF0}-\u{1FBF9}\u{20000}-\u{2A6DD}\u{2A700}-\u{2B734}\u{2B740}-\u{2B81D}\u{2B820}-\u{2CEA1}\u{2CEB0}-\u{2EBE0}\u{2F800}-\u{2FA1D}\u{30000}-\u{3134A}\u{E0001}\u{E0020}-\u{E007F}\u{E0100}-\u{E01EF}\u{F0000}-\u{FFFFD}\u{100000}-\u{10FFFD}]*$/u; +const bidiS6 = 
/[0-9A-Za-z\xAA\xB2\xB3\xB5\xB9\xBA\xC0-\xD6\xD8-\xF6\xF8-\u02B8\u02BB-\u02C1\u02D0\u02D1\u02E0-\u02E4\u02EE\u0370-\u0373\u0376\u0377\u037A-\u037D\u037F\u0386\u0388-\u038A\u038C\u038E-\u03A1\u03A3-\u03F5\u03F7-\u0482\u048A-\u052F\u0531-\u0556\u0559-\u0589\u06F0-\u06F9\u0903-\u0939\u093B\u093D-\u0940\u0949-\u094C\u094E-\u0950\u0958-\u0961\u0964-\u0980\u0982\u0983\u0985-\u098C\u098F\u0990\u0993-\u09A8\u09AA-\u09B0\u09B2\u09B6-\u09B9\u09BD-\u09C0\u09C7\u09C8\u09CB\u09CC\u09CE\u09D7\u09DC\u09DD\u09DF-\u09E1\u09E6-\u09F1\u09F4-\u09FA\u09FC\u09FD\u0A03\u0A05-\u0A0A\u0A0F\u0A10\u0A13-\u0A28\u0A2A-\u0A30\u0A32\u0A33\u0A35\u0A36\u0A38\u0A39\u0A3E-\u0A40\u0A59-\u0A5C\u0A5E\u0A66-\u0A6F\u0A72-\u0A74\u0A76\u0A83\u0A85-\u0A8D\u0A8F-\u0A91\u0A93-\u0AA8\u0AAA-\u0AB0\u0AB2\u0AB3\u0AB5-\u0AB9\u0ABD-\u0AC0\u0AC9\u0ACB\u0ACC\u0AD0\u0AE0\u0AE1\u0AE6-\u0AF0\u0AF9\u0B02\u0B03\u0B05-\u0B0C\u0B0F\u0B10\u0B13-\u0B28\u0B2A-\u0B30\u0B32\u0B33\u0B35-\u0B39\u0B3D\u0B3E\u0B40\u0B47\u0B48\u0B4B\u0B4C\u0B57\u0B5C\u0B5D\u0B5F-\u0B61\u0B66-\u0B77\u0B83\u0B85-\u0B8A\u0B8E-\u0B90\u0B92-\u0B95\u0B99\u0B9A\u0B9C\u0B9E\u0B9F\u0BA3\u0BA4\u0BA8-\u0BAA\u0BAE-\u0BB9\u0BBE\u0BBF\u0BC1\u0BC2\u0BC6-\u0BC8\u0BCA-\u0BCC\u0BD0\u0BD7\u0BE6-\u0BF2\u0C01-\u0C03\u0C05-\u0C0C\u0C0E-\u0C10\u0C12-\u0C28\u0C2A-\u0C39\u0C3D\u0C41-\u0C44\u0C58-\u0C5A\u0C60\u0C61\u0C66-\u0C6F\u0C77\u0C7F\u0C80\u0C82-\u0C8C\u0C8E-\u0C90\u0C92-\u0CA8\u0CAA-\u0CB3\u0CB5-\u0CB9\u0CBD-\u0CC4\u0CC6-\u0CC8\u0CCA\u0CCB\u0CD5\u0CD6\u0CDE\u0CE0\u0CE1\u0CE6-\u0CEF\u0CF1\u0CF2\u0D02-\u0D0C\u0D0E-\u0D10\u0D12-\u0D3A\u0D3D-\u0D40\u0D46-\u0D48\u0D4A-\u0D4C\u0D4E\u0D4F\u0D54-\u0D61\u0D66-\u0D7F\u0D82\u0D83\u0D85-\u0D96\u0D9A-\u0DB1\u0DB3-\u0DBB\u0DBD\u0DC0-\u0DC6\u0DCF-\u0DD1\u0DD8-\u0DDF\u0DE6-\u0DEF\u0DF2-\u0DF4\u0E01-\u0E30\u0E32\u0E33\u0E40-\u0E46\u0E4F-\u0E5B\u0E81\u0E82\u0E84\u0E86-\u0E8A\u0E8C-\u0EA3\u0EA5\u0EA7-\u0EB0\u0EB2\u0EB3\u0EBD\u0EC0-\u0EC4\u0EC6\u0ED0-\u0ED9\u0EDC-\u0EDF\u0F00-\u0F17\u0F1A-\u0F34\u0F36\u0F38\u0F3E-\u0F47\u0F49-\u0F6C\u0F7F\u0F85\u0F88-\u0F8C\u0FBE-\u0FC5\u0FC7-\u0FCC\u0FCE-\u0FDA\u1000-\u102C\u1031\u1038\u103B\u103C\u103F-\u1057\u105A-\u105D\u1061-\u1070\u1075-\u1081\u1083\u1084\u1087-\u108C\u108E-\u109C\u109E-\u10C5\u10C7\u10CD\u10D0-\u1248\u124A-\u124D\u1250-\u1256\u1258\u125A-\u125D\u1260-\u1288\u128A-\u128D\u1290-\u12B0\u12B2-\u12B5\u12B8-\u12BE\u12C0\u12C2-\u12C5\u12C8-\u12D6\u12D8-\u1310\u1312-\u1315\u1318-\u135A\u1360-\u137C\u1380-\u138F\u13A0-\u13F5\u13F8-\u13FD\u1401-\u167F\u1681-\u169A\u16A0-\u16F8\u1700-\u170C\u170E-\u1711\u1720-\u1731\u1735\u1736\u1740-\u1751\u1760-\u176C\u176E-\u1770\u1780-\u17B3\u17B6\u17BE-\u17C5\u17C7\u17C8\u17D4-\u17DA\u17DC\u17E0-\u17E9\u1810-\u1819\u1820-\u1878\u1880-\u1884\u1887-\u18A8\u18AA\u18B0-\u18F5\u1900-\u191E\u1923-\u1926\u1929-\u192B\u1930\u1931\u1933-\u1938\u1946-\u196D\u1970-\u1974\u1980-\u19AB\u19B0-\u19C9\u19D0-\u19DA\u1A00-\u1A16\u1A19\u1A1A\u1A1E-\u1A55\u1A57\u1A61\u1A63\u1A64\u1A6D-\u1A72\u1A80-\u1A89\u1A90-\u1A99\u1AA0-\u1AAD\u1B04-\u1B33\u1B35\u1B3B\u1B3D-\u1B41\u1B43-\u1B4B\u1B50-\u1B6A\u1B74-\u1B7C\u1B82-\u1BA1\u1BA6\u1BA7\u1BAA\u1BAE-\u1BE5\u1BE7\u1BEA-\u1BEC\u1BEE\u1BF2\u1BF3\u1BFC-\u1C2B\u1C34\u1C35\u1C3B-\u1C49\u1C4D-\u1C88\u1C90-\u1CBA\u1CBD-\u1CC7\u1CD3\u1CE1\u1CE9-\u1CEC\u1CEE-\u1CF3\u1CF5-\u1CF7\u1CFA\u1D00-\u1DBF\u1E00-\u1F15\u1F18-\u1F1D\u1F20-\u1F45\u1F48-\u1F4D\u1F50-\u1F57\u1F59\u1F5B\u1F5D\u1F5F-\u1F7D\u1F80-\u1FB4\u1FB6-\u1FBC\u1FBE\u1FC2-\u1FC4\u1FC6-\u1FCC\u1FD0-\u1FD3\u1FD6-\u1FDB\u1FE0-\u1FEC\u1FF2-\u1FF4\u1FF6-\u1FFC\u200E\u2070\u2071\u2074-\u2079\u207F-\u2089\u2090-\u209C
\u2102\u2107\u210A-\u2113\u2115\u2119-\u211D\u2124\u2126\u2128\u212A-\u212D\u212F-\u2139\u213C-\u213F\u2145-\u2149\u214E\u214F\u2160-\u2188\u2336-\u237A\u2395\u2488-\u24E9\u26AC\u2800-\u28FF\u2C00-\u2C2E\u2C30-\u2C5E\u2C60-\u2CE4\u2CEB-\u2CEE\u2CF2\u2CF3\u2D00-\u2D25\u2D27\u2D2D\u2D30-\u2D67\u2D6F\u2D70\u2D80-\u2D96\u2DA0-\u2DA6\u2DA8-\u2DAE\u2DB0-\u2DB6\u2DB8-\u2DBE\u2DC0-\u2DC6\u2DC8-\u2DCE\u2DD0-\u2DD6\u2DD8-\u2DDE\u3005-\u3007\u3021-\u3029\u302E\u302F\u3031-\u3035\u3038-\u303C\u3041-\u3096\u309D-\u309F\u30A1-\u30FA\u30FC-\u30FF\u3105-\u312F\u3131-\u318E\u3190-\u31BF\u31F0-\u321C\u3220-\u324F\u3260-\u327B\u327F-\u32B0\u32C0-\u32CB\u32D0-\u3376\u337B-\u33DD\u33E0-\u33FE\u3400-\u4DBF\u4E00-\u9FFC\uA000-\uA48C\uA4D0-\uA60C\uA610-\uA62B\uA640-\uA66E\uA680-\uA69D\uA6A0-\uA6EF\uA6F2-\uA6F7\uA722-\uA787\uA789-\uA7BF\uA7C2-\uA7CA\uA7F5-\uA801\uA803-\uA805\uA807-\uA80A\uA80C-\uA824\uA827\uA830-\uA837\uA840-\uA873\uA880-\uA8C3\uA8CE-\uA8D9\uA8F2-\uA8FE\uA900-\uA925\uA92E-\uA946\uA952\uA953\uA95F-\uA97C\uA983-\uA9B2\uA9B4\uA9B5\uA9BA\uA9BB\uA9BE-\uA9CD\uA9CF-\uA9D9\uA9DE-\uA9E4\uA9E6-\uA9FE\uAA00-\uAA28\uAA2F\uAA30\uAA33\uAA34\uAA40-\uAA42\uAA44-\uAA4B\uAA4D\uAA50-\uAA59\uAA5C-\uAA7B\uAA7D-\uAAAF\uAAB1\uAAB5\uAAB6\uAAB9-\uAABD\uAAC0\uAAC2\uAADB-\uAAEB\uAAEE-\uAAF5\uAB01-\uAB06\uAB09-\uAB0E\uAB11-\uAB16\uAB20-\uAB26\uAB28-\uAB2E\uAB30-\uAB69\uAB70-\uABE4\uABE6\uABE7\uABE9-\uABEC\uABF0-\uABF9\uAC00-\uD7A3\uD7B0-\uD7C6\uD7CB-\uD7FB\uD800-\uFA6D\uFA70-\uFAD9\uFB00-\uFB06\uFB13-\uFB17\uFF10-\uFF19\uFF21-\uFF3A\uFF41-\uFF5A\uFF66-\uFFBE\uFFC2-\uFFC7\uFFCA-\uFFCF\uFFD2-\uFFD7\uFFDA-\uFFDC\u{10000}-\u{1000B}\u{1000D}-\u{10026}\u{10028}-\u{1003A}\u{1003C}\u{1003D}\u{1003F}-\u{1004D}\u{10050}-\u{1005D}\u{10080}-\u{100FA}\u{10100}\u{10102}\u{10107}-\u{10133}\u{10137}-\u{1013F}\u{1018D}\u{1018E}\u{101D0}-\u{101FC}\u{10280}-\u{1029C}\u{102A0}-\u{102D0}\u{102E1}-\u{102FB}\u{10300}-\u{10323}\u{1032D}-\u{1034A}\u{10350}-\u{10375}\u{10380}-\u{1039D}\u{1039F}-\u{103C3}\u{103C8}-\u{103D5}\u{10400}-\u{1049D}\u{104A0}-\u{104A9}\u{104B0}-\u{104D3}\u{104D8}-\u{104FB}\u{10500}-\u{10527}\u{10530}-\u{10563}\u{1056F}\u{10600}-\u{10736}\u{10740}-\u{10755}\u{10760}-\u{10767}\u{11000}\u{11002}-\u{11037}\u{11047}-\u{1104D}\u{11066}-\u{1106F}\u{11082}-\u{110B2}\u{110B7}\u{110B8}\u{110BB}-\u{110C1}\u{110CD}\u{110D0}-\u{110E8}\u{110F0}-\u{110F9}\u{11103}-\u{11126}\u{1112C}\u{11136}-\u{11147}\u{11150}-\u{11172}\u{11174}-\u{11176}\u{11182}-\u{111B5}\u{111BF}-\u{111C8}\u{111CD}\u{111CE}\u{111D0}-\u{111DF}\u{111E1}-\u{111F4}\u{11200}-\u{11211}\u{11213}-\u{1122E}\u{11232}\u{11233}\u{11235}\u{11238}-\u{1123D}\u{11280}-\u{11286}\u{11288}\u{1128A}-\u{1128D}\u{1128F}-\u{1129D}\u{1129F}-\u{112A9}\u{112B0}-\u{112DE}\u{112E0}-\u{112E2}\u{112F0}-\u{112F9}\u{11302}\u{11303}\u{11305}-\u{1130C}\u{1130F}\u{11310}\u{11313}-\u{11328}\u{1132A}-\u{11330}\u{11332}\u{11333}\u{11335}-\u{11339}\u{1133D}-\u{1133F}\u{11341}-\u{11344}\u{11347}\u{11348}\u{1134B}-\u{1134D}\u{11350}\u{11357}\u{1135D}-\u{11363}\u{11400}-\u{11437}\u{11440}\u{11441}\u{11445}\u{11447}-\u{1145B}\u{1145D}\u{1145F}-\u{11461}\u{11480}-\u{114B2}\u{114B9}\u{114BB}-\u{114BE}\u{114C1}\u{114C4}-\u{114C7}\u{114D0}-\u{114D9}\u{11580}-\u{115B1}\u{115B8}-\u{115BB}\u{115BE}\u{115C1}-\u{115DB}\u{11600}-\u{11632}\u{1163B}\u{1163C}\u{1163E}\u{11641}-\u{11644}\u{11650}-\u{11659}\u{11680}-\u{116AA}\u{116AC}\u{116AE}\u{116AF}\u{116B6}\u{116B8}\u{116C0}-\u{116C9}\u{11700}-\u{1171A}\u{11720}\u{11721}\u{11726}\u{11730}-\u{1173F}\u{11800}-\u{1182E}\u{11838}\u{1183B}\u{118A0}-\u{118F2}\u{118FF}-\u{11906}\u{
11909}\u{1190C}-\u{11913}\u{11915}\u{11916}\u{11918}-\u{11935}\u{11937}\u{11938}\u{1193D}\u{1193F}-\u{11942}\u{11944}-\u{11946}\u{11950}-\u{11959}\u{119A0}-\u{119A7}\u{119AA}-\u{119D3}\u{119DC}-\u{119DF}\u{119E1}-\u{119E4}\u{11A00}\u{11A07}\u{11A08}\u{11A0B}-\u{11A32}\u{11A39}\u{11A3A}\u{11A3F}-\u{11A46}\u{11A50}\u{11A57}\u{11A58}\u{11A5C}-\u{11A89}\u{11A97}\u{11A9A}-\u{11AA2}\u{11AC0}-\u{11AF8}\u{11C00}-\u{11C08}\u{11C0A}-\u{11C2F}\u{11C3E}-\u{11C45}\u{11C50}-\u{11C6C}\u{11C70}-\u{11C8F}\u{11CA9}\u{11CB1}\u{11CB4}\u{11D00}-\u{11D06}\u{11D08}\u{11D09}\u{11D0B}-\u{11D30}\u{11D46}\u{11D50}-\u{11D59}\u{11D60}-\u{11D65}\u{11D67}\u{11D68}\u{11D6A}-\u{11D8E}\u{11D93}\u{11D94}\u{11D96}\u{11D98}\u{11DA0}-\u{11DA9}\u{11EE0}-\u{11EF2}\u{11EF5}-\u{11EF8}\u{11FB0}\u{11FC0}-\u{11FD4}\u{11FFF}-\u{12399}\u{12400}-\u{1246E}\u{12470}-\u{12474}\u{12480}-\u{12543}\u{13000}-\u{1342E}\u{13430}-\u{13438}\u{14400}-\u{14646}\u{16800}-\u{16A38}\u{16A40}-\u{16A5E}\u{16A60}-\u{16A69}\u{16A6E}\u{16A6F}\u{16AD0}-\u{16AED}\u{16AF5}\u{16B00}-\u{16B2F}\u{16B37}-\u{16B45}\u{16B50}-\u{16B59}\u{16B5B}-\u{16B61}\u{16B63}-\u{16B77}\u{16B7D}-\u{16B8F}\u{16E40}-\u{16E9A}\u{16F00}-\u{16F4A}\u{16F50}-\u{16F87}\u{16F93}-\u{16F9F}\u{16FE0}\u{16FE1}\u{16FE3}\u{16FF0}\u{16FF1}\u{17000}-\u{187F7}\u{18800}-\u{18CD5}\u{18D00}-\u{18D08}\u{1B000}-\u{1B11E}\u{1B150}-\u{1B152}\u{1B164}-\u{1B167}\u{1B170}-\u{1B2FB}\u{1BC00}-\u{1BC6A}\u{1BC70}-\u{1BC7C}\u{1BC80}-\u{1BC88}\u{1BC90}-\u{1BC99}\u{1BC9C}\u{1BC9F}\u{1D000}-\u{1D0F5}\u{1D100}-\u{1D126}\u{1D129}-\u{1D166}\u{1D16A}-\u{1D172}\u{1D183}\u{1D184}\u{1D18C}-\u{1D1A9}\u{1D1AE}-\u{1D1E8}\u{1D2E0}-\u{1D2F3}\u{1D360}-\u{1D378}\u{1D400}-\u{1D454}\u{1D456}-\u{1D49C}\u{1D49E}\u{1D49F}\u{1D4A2}\u{1D4A5}\u{1D4A6}\u{1D4A9}-\u{1D4AC}\u{1D4AE}-\u{1D4B9}\u{1D4BB}\u{1D4BD}-\u{1D4C3}\u{1D4C5}-\u{1D505}\u{1D507}-\u{1D50A}\u{1D50D}-\u{1D514}\u{1D516}-\u{1D51C}\u{1D51E}-\u{1D539}\u{1D53B}-\u{1D53E}\u{1D540}-\u{1D544}\u{1D546}\u{1D54A}-\u{1D550}\u{1D552}-\u{1D6A5}\u{1D6A8}-\u{1D6DA}\u{1D6DC}-\u{1D714}\u{1D716}-\u{1D74E}\u{1D750}-\u{1D788}\u{1D78A}-\u{1D7C2}\u{1D7C4}-\u{1D7CB}\u{1D7CE}-\u{1D9FF}\u{1DA37}-\u{1DA3A}\u{1DA6D}-\u{1DA74}\u{1DA76}-\u{1DA83}\u{1DA85}-\u{1DA8B}\u{1E100}-\u{1E12C}\u{1E137}-\u{1E13D}\u{1E140}-\u{1E149}\u{1E14E}\u{1E14F}\u{1E2C0}-\u{1E2EB}\u{1E2F0}-\u{1E2F9}\u{1F100}-\u{1F10A}\u{1F110}-\u{1F12E}\u{1F130}-\u{1F169}\u{1F170}-\u{1F1AC}\u{1F1E6}-\u{1F202}\u{1F210}-\u{1F23B}\u{1F240}-\u{1F248}\u{1F250}\u{1F251}\u{1FBF0}-\u{1FBF9}\u{20000}-\u{2A6DD}\u{2A700}-\u{2B734}\u{2B740}-\u{2B81D}\u{2B820}-\u{2CEA1}\u{2CEB0}-\u{2EBE0}\u{2F800}-\u{2FA1D}\u{30000}-\u{3134A}\u{F0000}-\u{FFFFD}\u{100000}-\u{10FFFD}][\u0300-\u036F\u0483-\u0489\u0591-\u05BD\u05BF\u05C1\u05C2\u05C4\u05C5\u05C7\u0610-\u061A\u064B-\u065F\u0670\u06D6-\u06DC\u06DF-\u06E4\u06E7\u06E8\u06EA-\u06ED\u0711\u0730-\u074A\u07A6-\u07B0\u07EB-\u07F3\u07FD\u0816-\u0819\u081B-\u0823\u0825-\u0827\u0829-\u082D\u0859-\u085B\u08D3-\u08E1\u08E3-\u0902\u093A\u093C\u0941-\u0948\u094D\u0951-\u0957\u0962\u0963\u0981\u09BC\u09C1-\u09C4\u09CD\u09E2\u09E3\u09FE\u0A01\u0A02\u0A3C\u0A41\u0A42\u0A47\u0A48\u0A4B-\u0A4D\u0A51\u0A70\u0A71\u0A75\u0A81\u0A82\u0ABC\u0AC1-\u0AC5\u0AC7\u0AC8\u0ACD\u0AE2\u0AE3\u0AFA-\u0AFF\u0B01\u0B3C\u0B3F\u0B41-\u0B44\u0B4D\u0B55\u0B56\u0B62\u0B63\u0B82\u0BC0\u0BCD\u0C00\u0C04\u0C3E-\u0C40\u0C46-\u0C48\u0C4A-\u0C4D\u0C55\u0C56\u0C62\u0C63\u0C81\u0CBC\u0CCC\u0CCD\u0CE2\u0CE3\u0D00\u0D01\u0D3B\u0D3C\u0D41-\u0D44\u0D4D\u0D62\u0D63\u0D81\u0DCA\u0DD2-\u0DD4\u0DD6\u0E31\u0E34-\u0E3A\u0E47-\u0E4E\u0EB1\u0EB4-\u0EBC\u0EC8-\u0ECD\u0F18\u0F19
\u0F35\u0F37\u0F39\u0F71-\u0F7E\u0F80-\u0F84\u0F86\u0F87\u0F8D-\u0F97\u0F99-\u0FBC\u0FC6\u102D-\u1030\u1032-\u1037\u1039\u103A\u103D\u103E\u1058\u1059\u105E-\u1060\u1071-\u1074\u1082\u1085\u1086\u108D\u109D\u135D-\u135F\u1712-\u1714\u1732-\u1734\u1752\u1753\u1772\u1773\u17B4\u17B5\u17B7-\u17BD\u17C6\u17C9-\u17D3\u17DD\u180B-\u180D\u1885\u1886\u18A9\u1920-\u1922\u1927\u1928\u1932\u1939-\u193B\u1A17\u1A18\u1A1B\u1A56\u1A58-\u1A5E\u1A60\u1A62\u1A65-\u1A6C\u1A73-\u1A7C\u1A7F\u1AB0-\u1AC0\u1B00-\u1B03\u1B34\u1B36-\u1B3A\u1B3C\u1B42\u1B6B-\u1B73\u1B80\u1B81\u1BA2-\u1BA5\u1BA8\u1BA9\u1BAB-\u1BAD\u1BE6\u1BE8\u1BE9\u1BED\u1BEF-\u1BF1\u1C2C-\u1C33\u1C36\u1C37\u1CD0-\u1CD2\u1CD4-\u1CE0\u1CE2-\u1CE8\u1CED\u1CF4\u1CF8\u1CF9\u1DC0-\u1DF9\u1DFB-\u1DFF\u20D0-\u20F0\u2CEF-\u2CF1\u2D7F\u2DE0-\u2DFF\u302A-\u302D\u3099\u309A\uA66F-\uA672\uA674-\uA67D\uA69E\uA69F\uA6F0\uA6F1\uA802\uA806\uA80B\uA825\uA826\uA82C\uA8C4\uA8C5\uA8E0-\uA8F1\uA8FF\uA926-\uA92D\uA947-\uA951\uA980-\uA982\uA9B3\uA9B6-\uA9B9\uA9BC\uA9BD\uA9E5\uAA29-\uAA2E\uAA31\uAA32\uAA35\uAA36\uAA43\uAA4C\uAA7C\uAAB0\uAAB2-\uAAB4\uAAB7\uAAB8\uAABE\uAABF\uAAC1\uAAEC\uAAED\uAAF6\uABE5\uABE8\uABED\uFB1E\uFE00-\uFE0F\uFE20-\uFE2F\u{101FD}\u{102E0}\u{10376}-\u{1037A}\u{10A01}-\u{10A03}\u{10A05}\u{10A06}\u{10A0C}-\u{10A0F}\u{10A38}-\u{10A3A}\u{10A3F}\u{10AE5}\u{10AE6}\u{10D24}-\u{10D27}\u{10EAB}\u{10EAC}\u{10F46}-\u{10F50}\u{11001}\u{11038}-\u{11046}\u{1107F}-\u{11081}\u{110B3}-\u{110B6}\u{110B9}\u{110BA}\u{11100}-\u{11102}\u{11127}-\u{1112B}\u{1112D}-\u{11134}\u{11173}\u{11180}\u{11181}\u{111B6}-\u{111BE}\u{111C9}-\u{111CC}\u{111CF}\u{1122F}-\u{11231}\u{11234}\u{11236}\u{11237}\u{1123E}\u{112DF}\u{112E3}-\u{112EA}\u{11300}\u{11301}\u{1133B}\u{1133C}\u{11340}\u{11366}-\u{1136C}\u{11370}-\u{11374}\u{11438}-\u{1143F}\u{11442}-\u{11444}\u{11446}\u{1145E}\u{114B3}-\u{114B8}\u{114BA}\u{114BF}\u{114C0}\u{114C2}\u{114C3}\u{115B2}-\u{115B5}\u{115BC}\u{115BD}\u{115BF}\u{115C0}\u{115DC}\u{115DD}\u{11633}-\u{1163A}\u{1163D}\u{1163F}\u{11640}\u{116AB}\u{116AD}\u{116B0}-\u{116B5}\u{116B7}\u{1171D}-\u{1171F}\u{11722}-\u{11725}\u{11727}-\u{1172B}\u{1182F}-\u{11837}\u{11839}\u{1183A}\u{1193B}\u{1193C}\u{1193E}\u{11943}\u{119D4}-\u{119D7}\u{119DA}\u{119DB}\u{119E0}\u{11A01}-\u{11A06}\u{11A09}\u{11A0A}\u{11A33}-\u{11A38}\u{11A3B}-\u{11A3E}\u{11A47}\u{11A51}-\u{11A56}\u{11A59}-\u{11A5B}\u{11A8A}-\u{11A96}\u{11A98}\u{11A99}\u{11C30}-\u{11C36}\u{11C38}-\u{11C3D}\u{11C92}-\u{11CA7}\u{11CAA}-\u{11CB0}\u{11CB2}\u{11CB3}\u{11CB5}\u{11CB6}\u{11D31}-\u{11D36}\u{11D3A}\u{11D3C}\u{11D3D}\u{11D3F}-\u{11D45}\u{11D47}\u{11D90}\u{11D91}\u{11D95}\u{11D97}\u{11EF3}\u{11EF4}\u{16AF0}-\u{16AF4}\u{16B30}-\u{16B36}\u{16F4F}\u{16F8F}-\u{16F92}\u{16FE4}\u{1BC9D}\u{1BC9E}\u{1D167}-\u{1D169}\u{1D17B}-\u{1D182}\u{1D185}-\u{1D18B}\u{1D1AA}-\u{1D1AD}\u{1D242}-\u{1D244}\u{1DA00}-\u{1DA36}\u{1DA3B}-\u{1DA6C}\u{1DA75}\u{1DA84}\u{1DA9B}-\u{1DA9F}\u{1DAA1}-\u{1DAAF}\u{1E000}-\u{1E006}\u{1E008}-\u{1E018}\u{1E01B}-\u{1E021}\u{1E023}\u{1E024}\u{1E026}-\u{1E02A}\u{1E130}-\u{1E136}\u{1E2EC}-\u{1E2EF}\u{1E8D0}-\u{1E8D6}\u{1E944}-\u{1E94A}\u{E0100}-\u{E01EF}]*$/u; + +module.exports = { + combiningMarks, + combiningClassVirama, + validZWNJ, + bidiDomain, + bidiS1LTR, + bidiS1RTL, + bidiS2, + bidiS3, + bidiS4EN, + bidiS4AN, + bidiS5, + bidiS6 +}; diff --git a/node_modules/tr46/lib/statusMapping.js b/node_modules/tr46/lib/statusMapping.js new file mode 100644 index 000000000..1c962c24d --- /dev/null +++ b/node_modules/tr46/lib/statusMapping.js @@ -0,0 +1,11 @@ +"use strict"; + +module.exports.STATUS_MAPPING = { + 
mapped: 1, + valid: 2, + disallowed: 3, + disallowed_STD3_valid: 4, // eslint-disable-line camelcase + disallowed_STD3_mapped: 5, // eslint-disable-line camelcase + deviation: 6, + ignored: 7 +}; diff --git a/node_modules/tr46/package.json b/node_modules/tr46/package.json new file mode 100644 index 000000000..1f3ad9f3c --- /dev/null +++ b/node_modules/tr46/package.json @@ -0,0 +1,84 @@ +{ + "_from": "tr46@^2.1.0", + "_id": "tr46@2.1.0", + "_inBundle": false, + "_integrity": "sha512-15Ih7phfcdP5YxqiB+iDtLoaTz4Nd35+IiAv0kQ5FNKHzXgdWqPoTIqEDDJmXceQt4JZk6lVPT8lnDlPpGDppw==", + "_location": "/tr46", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "tr46@^2.1.0", + "name": "tr46", + "escapedName": "tr46", + "rawSpec": "^2.1.0", + "saveSpec": null, + "fetchSpec": "^2.1.0" + }, + "_requiredBy": [ + "/whatwg-url" + ], + "_resolved": "https://registry.npmjs.org/tr46/-/tr46-2.1.0.tgz", + "_shasum": "fa87aa81ca5d5941da8cbf1f9b749dc969a4e240", + "_spec": "tr46@^2.1.0", + "_where": "C:\\Users\\Dan\\OneDrive\\Documents\\GitHub\\final_project-1\\node_modules\\whatwg-url", + "author": { + "name": "Sebastian Mayr", + "email": "npm@smayr.name" + }, + "bugs": { + "url": "https://github.com/jsdom/tr46/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Timothy Gu", + "email": "timothygu99@gmail.com" + } + ], + "dependencies": { + "punycode": "^2.1.1" + }, + "deprecated": false, + "description": "An implementation of the Unicode UTS #46: Unicode IDNA Compatibility Processing", + "devDependencies": { + "eslint": "^7.27.0", + "mocha": "^8.4.0", + "node-fetch": "^2.6.0", + "regenerate": "^1.4.2", + "unicode-13.0.0": "^0.8.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "lib/mappingTable.json", + "lib/regexes.js", + "lib/statusMapping.js" + ], + "homepage": "https://github.com/jsdom/tr46#readme", + "keywords": [ + "unicode", + "tr46", + "uts46", + "punycode", + "url", + "whatwg" + ], + "license": "MIT", + "main": "index.js", + "name": "tr46", + "repository": { + "type": "git", + "url": "git+https://github.com/jsdom/tr46.git" + }, + "scripts": { + "lint": "eslint .", + "prepublish": "node scripts/generateMappingTable.js && node scripts/generateRegexes.js", + "pretest": "node scripts/getLatestTests.js", + "test": "mocha" + }, + "unicodeVersion": "13.0.0", + "version": "2.1.0" +} diff --git a/node_modules/tsscmp/.travis.yml b/node_modules/tsscmp/.travis.yml new file mode 100644 index 000000000..9b0586086 --- /dev/null +++ b/node_modules/tsscmp/.travis.yml @@ -0,0 +1,18 @@ +language: node_js +node_js: + - "10.5" + - "9.11" + - "8.11" + - "7.10" + - "6" + - "5" + - "5.1" + - "4" + - "4.2" + - "4.1" + - "4.0" + - "0.12" + - "0.11" + - "0.10" + - "0.8" + \ No newline at end of file diff --git a/node_modules/tsscmp/LICENSE b/node_modules/tsscmp/LICENSE new file mode 100644 index 000000000..c62e51bf1 --- /dev/null +++ b/node_modules/tsscmp/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) 2016 + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies 
or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE +SOFTWARE. diff --git a/node_modules/tsscmp/README.md b/node_modules/tsscmp/README.md new file mode 100644 index 000000000..cba99d037 --- /dev/null +++ b/node_modules/tsscmp/README.md @@ -0,0 +1,48 @@ +# Timing safe string compare using double HMAC + +[![Node.js Version](https://img.shields.io/node/v/tsscmp.svg?style=flat-square)](https://nodejs.org/en/download) +[![npm](https://img.shields.io/npm/v/tsscmp.svg?style=flat-square)](https://npmjs.org/package/tsscmp) +[![NPM Downloads](https://img.shields.io/npm/dm/tsscmp.svg?style=flat-square)](https://npmjs.org/package/tsscmp) +[![Build Status](https://img.shields.io/travis/suryagh/tsscmp/master.svg?style=flat-square)](https://travis-ci.org/suryagh/tsscmp) +[![Build Status](https://img.shields.io/appveyor/ci/suryagh/tsscmp/master.svg?style=flat-square&label=windows)](https://ci.appveyor.com/project/suryagh/tsscmp) +[![Dependency Status](http://img.shields.io/david/suryagh/tsscmp.svg?style=flat-square)](https://david-dm.org/suryagh/tsscmp) +[![npm-license](http://img.shields.io/npm/l/tsscmp.svg?style=flat-square)](LICENSE) + + +Prevents [timing attacks](http://codahale.com/a-lesson-in-timing-attacks/) using Brad Hill's +[Double HMAC pattern](https://www.nccgroup.trust/us/about-us/newsroom-and-events/blog/2011/february/double-hmac-verification/) +to perform secure string comparison. Double HMAC avoids the timing atacks by blinding the +timing channel using random time per attempt comparison against iterative brute force attacks. + + +## Install + +``` +npm install tsscmp +``` +## Why +To compare secret values like **authentication tokens**, **passwords** or +**capability urls** so that timing information is not +leaked to the attacker. + +## Example + +```js +var timingSafeCompare = require('tsscmp'); + +var sessionToken = '127e6fbfe24a750e72930c'; +var givenToken = '127e6fbfe24a750e72930c'; + +if (timingSafeCompare(sessionToken, givenToken)) { + console.log('good token'); +} else { + console.log('bad token'); +} +``` +##License: +[MIT](LICENSE) + +**Credits to:** [@jsha](https://github.com/jsha) | +[@bnoordhuis](https://github.com/bnoordhuis) | +[@suryagh](https://github.com/suryagh) | + \ No newline at end of file diff --git a/node_modules/tsscmp/appveyor.yml b/node_modules/tsscmp/appveyor.yml new file mode 100644 index 000000000..29cd460bd --- /dev/null +++ b/node_modules/tsscmp/appveyor.yml @@ -0,0 +1,29 @@ +# Test against this version of Node.js +environment: + matrix: + # nodejs_version: "0.6" not supported in Windows x86 + - nodejs_version: "0.8" + - nodejs_version: "0.10" + - nodejs_version: "0.11" + - nodejs_version: "0.12" + - nodejs_version: "4.0" + - nodejs_version: "5.0" + - nodejs_version: "6.0" + +# Install scripts. (runs after repo cloning) +install: + # Get the latest stable version of Node.js or io.js + - ps: Install-Product node $env:nodejs_version + # install modules + - npm install + +# Post-install test scripts. +test_script: + # Output useful info for debugging. 
+ - node --version + - npm --version + # run tests + - npm test + +# Don't actually build. +build: off diff --git a/node_modules/tsscmp/lib/index.js b/node_modules/tsscmp/lib/index.js new file mode 100644 index 000000000..d52e5b542 --- /dev/null +++ b/node_modules/tsscmp/lib/index.js @@ -0,0 +1,38 @@ +'use strict'; + +// Implements Brad Hill's Double HMAC pattern from +// https://www.nccgroup.trust/us/about-us/newsroom-and-events/blog/2011/february/double-hmac-verification/. +// The approach is similar to the node's native implementation of timing safe buffer comparison that will be available on v6+. +// https://github.com/nodejs/node/issues/3043 +// https://github.com/nodejs/node/pull/3073 + +var crypto = require('crypto'); + +function bufferEqual(a, b) { + if (a.length !== b.length) { + return false; + } + // `crypto.timingSafeEqual` was introduced in Node v6.6.0 + // + if (crypto.timingSafeEqual) { + return crypto.timingSafeEqual(a, b); + } + for (var i = 0; i < a.length; i++) { + if (a[i] !== b[i]) { + return false; + } + } + return true; +} + +function timeSafeCompare(a, b) { + var sa = String(a); + var sb = String(b); + var key = crypto.pseudoRandomBytes(32); + var ah = crypto.createHmac('sha256', key).update(sa).digest(); + var bh = crypto.createHmac('sha256', key).update(sb).digest(); + + return bufferEqual(ah, bh) && a === b; +} + +module.exports = timeSafeCompare; diff --git a/node_modules/tsscmp/package.json b/node_modules/tsscmp/package.json new file mode 100644 index 000000000..de5fa9c24 --- /dev/null +++ b/node_modules/tsscmp/package.json @@ -0,0 +1,60 @@ +{ + "_from": "tsscmp@1.0.6", + "_id": "tsscmp@1.0.6", + "_inBundle": false, + "_integrity": "sha512-LxhtAkPDTkVCMQjt2h6eBVY28KCjikZqZfMcC15YBeNjkgUpdCfBu5HoiOTDu86v6smE8yOjyEktJ8hlbANHQA==", + "_location": "/tsscmp", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "tsscmp@1.0.6", + "name": "tsscmp", + "escapedName": "tsscmp", + "rawSpec": "1.0.6", + "saveSpec": null, + "fetchSpec": "1.0.6" + }, + "_requiredBy": [ + "/keygrip" + ], + "_resolved": "https://registry.npmjs.org/tsscmp/-/tsscmp-1.0.6.tgz", + "_shasum": "85b99583ac3589ec4bfef825b5000aa911d605eb", + "_spec": "tsscmp@1.0.6", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\keygrip", + "author": { + "name": "suryagh" + }, + "bugs": { + "url": "https://github.com/suryagh/tsscmp/issues" + }, + "bundleDependencies": false, + "dependencies": {}, + "deprecated": false, + "description": "Timing safe string compare using double HMAC", + "devDependencies": {}, + "engines": { + "node": ">=0.6.x" + }, + "homepage": "https://github.com/suryagh/tsscmp#readme", + "keywords": [ + "timing safe string compare", + "double hmac string compare", + "safe string compare", + "hmac" + ], + "license": "MIT", + "main": "lib/index.js", + "name": "tsscmp", + "publishConfig": { + "registry": "https://registry.npmjs.org" + }, + "repository": { + "type": "git", + "url": "git+https://github.com/suryagh/tsscmp.git" + }, + "scripts": { + "test": "node test/unit && node test/benchmark" + }, + "version": "1.0.6" +} diff --git a/node_modules/tsscmp/test/benchmark/index.js b/node_modules/tsscmp/test/benchmark/index.js new file mode 100644 index 000000000..fd93e4233 --- /dev/null +++ b/node_modules/tsscmp/test/benchmark/index.js @@ -0,0 +1,30 @@ +'use strict'; + +var timeSafeCompare = require('../../lib/index'); + +function random(length) { + + length = length || 32; + var result = ""; + var possible = 
"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789!@#$%^&*()-+/*[]{}-=\|;\':\"<>?,./"; + + for( var i=0; i < length; i++ ){ + result += possible.charAt(Math.floor(Math.random() * possible.length)); + } + return result; +} + +function run(count) { + count = count || 100*1000; + console.log('benchmark count: ' + count/1000 + 'k'); + console.time('benchmark'); + + while(count--){ + timeSafeCompare(random(), random()); + } + console.timeEnd('benchmark'); +} + +run(100000); + +module.exports = run; diff --git a/node_modules/tsscmp/test/unit/index.js b/node_modules/tsscmp/test/unit/index.js new file mode 100644 index 000000000..03354234a --- /dev/null +++ b/node_modules/tsscmp/test/unit/index.js @@ -0,0 +1,69 @@ +'use strict'; + +var assert = require('assert'); +var timeSafeCompare = require('../../lib/index'); + +process.on('error', function (e) { + console.log('caught: ' + e); +}); + +function testEqual(a, b) { + assert(timeSafeCompare(a, b)); + + // lets also do a parity check with the strict equal to operator + assert(a === b); +} + +function testNotEqual(a, b) { + assert(!timeSafeCompare(a, b)); + + // lets also do a parity check with the strict not equal to operator + assert(a !== b); +} + +// note: lets also make sure tsscmp can be inline replaced for any types - +// just incase if anyone is interested + +// positive tests +testEqual('127e6fbfe24a750e72930c220a8e138275656b8e5d8f48a98c3c92df2caba935', + '127e6fbfe24a750e72930c220a8e138275656b8e5d8f48a98c3c92df2caba935', + 'test '); +testEqual('a', 'a'); +testEqual('', ''); +testEqual(undefined, undefined); +testEqual(true, true); +testEqual(false, false); +(function () { + var a = { a: 1 }; + testEqual(a, a); +})(); +(function () { + function f1() { return 1; }; + testEqual(f1, f1); +})(); + +// negative tests +testNotEqual(''); +testNotEqual('a', 'b'); +testNotEqual('a', 'aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa'); +testNotEqual('aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa', 'a'); +testNotEqual('alpha', 'beta'); +testNotEqual(false, true); +testNotEqual(false, undefined); +testNotEqual(function () { }, function () { }); +testNotEqual({}, {}); +testNotEqual({ a: 1 }, { a: 1 }); +testNotEqual({ a: 1 }, { a: 2 }); +testNotEqual([1, 2], [1, 2]); +testNotEqual([1, 2], [1, 2, 3]); +(function () { + var a = { p: 1 }; + var b = { p: 1 }; + testNotEqual(a, b); +})(); +(function () { + function f1() { return 1; }; + function f2() { return 1; }; + testNotEqual(f1, f2); +})(); +console.log('Success: all tests complete.'); diff --git a/node_modules/type-fest/base.d.ts b/node_modules/type-fest/base.d.ts new file mode 100644 index 000000000..9005ef9fc --- /dev/null +++ b/node_modules/type-fest/base.d.ts @@ -0,0 +1,38 @@ +// Types that are compatible with all supported TypeScript versions. +// It's shared between all TypeScript version-specific definitions. 
+ +// Basic +export * from './source/basic'; + +// Utilities +export {Except} from './source/except'; +export {Mutable} from './source/mutable'; +export {Merge} from './source/merge'; +export {MergeExclusive} from './source/merge-exclusive'; +export {RequireAtLeastOne} from './source/require-at-least-one'; +export {RequireExactlyOne} from './source/require-exactly-one'; +export {PartialDeep} from './source/partial-deep'; +export {ReadonlyDeep} from './source/readonly-deep'; +export {LiteralUnion} from './source/literal-union'; +export {Promisable} from './source/promisable'; +export {Opaque} from './source/opaque'; +export {SetOptional} from './source/set-optional'; +export {SetRequired} from './source/set-required'; +export {ValueOf} from './source/value-of'; +export {PromiseValue} from './source/promise-value'; +export {AsyncReturnType} from './source/async-return-type'; +export {ConditionalExcept} from './source/conditional-except'; +export {ConditionalKeys} from './source/conditional-keys'; +export {ConditionalPick} from './source/conditional-pick'; +export {UnionToIntersection} from './source/union-to-intersection'; +export {Stringified} from './source/stringified'; +export {FixedLengthArray} from './source/fixed-length-array'; +export {IterableElement} from './source/iterable-element'; +export {Entry} from './source/entry'; +export {Entries} from './source/entries'; +export {SetReturnType} from './source/set-return-type'; +export {Asyncify} from './source/asyncify'; + +// Miscellaneous +export {PackageJson} from './source/package-json'; +export {TsConfigJson} from './source/tsconfig-json'; diff --git a/node_modules/type-fest/index.d.ts b/node_modules/type-fest/index.d.ts new file mode 100644 index 000000000..206261c21 --- /dev/null +++ b/node_modules/type-fest/index.d.ts @@ -0,0 +1,2 @@ +// These are all the basic types that's compatible with all supported TypeScript versions. +export * from './base'; diff --git a/node_modules/type-fest/license b/node_modules/type-fest/license new file mode 100644 index 000000000..3e4c85ab7 --- /dev/null +++ b/node_modules/type-fest/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (https:/sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
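The `base.d.ts` above only re-exports type aliases, so there is no runtime code in the diff to illustrate. As a minimal sketch, assuming type-fest's documented API, a consumer could combine a few of the listed utilities like this (the `Task` shape and the derived type names are made up for illustration and are not part of this diff):

```ts
import {Except, Merge, RequireAtLeastOne} from 'type-fest';

// A hypothetical object shape used only to demonstrate the utilities.
interface Task {
	id: string;
	title: string;
	completed: boolean;
}

// Except: a stricter Omit; drop the 'completed' key.
type TaskSummary = Except<Task, 'completed'>;
//=> {id: string; title: string}

// Merge: combine two shapes; keys of the second type win on conflict.
type TimestampedTask = Merge<Task, {createdAt: Date}>;

// RequireAtLeastOne: at least one of the listed keys must be present.
type TaskPatch = RequireAtLeastOne<Partial<Task>, 'title' | 'completed'>;

const patch: TaskPatch = {title: 'Buy milk'}; // OK: 'title' is supplied
// const empty: TaskPatch = {};               // Type error: no required key given
```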
diff --git a/node_modules/type-fest/package.json b/node_modules/type-fest/package.json new file mode 100644 index 000000000..b1e1c1986 --- /dev/null +++ b/node_modules/type-fest/package.json @@ -0,0 +1,90 @@ +{ + "_from": "type-fest@^0.20.2", + "_id": "type-fest@0.20.2", + "_inBundle": false, + "_integrity": "sha512-Ne+eE4r0/iWnpAxD852z3A+N0Bt5RN//NjJwRd2VFHEmrywxf5vsZlh4R6lixl6B+wz/8d+maTSAkN1FIkI3LQ==", + "_location": "/type-fest", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "type-fest@^0.20.2", + "name": "type-fest", + "escapedName": "type-fest", + "rawSpec": "^0.20.2", + "saveSpec": null, + "fetchSpec": "^0.20.2" + }, + "_requiredBy": [ + "/boxen" + ], + "_resolved": "https://registry.npmjs.org/type-fest/-/type-fest-0.20.2.tgz", + "_shasum": "1bf207f4b28f91583666cb5fbd327887301cd5f4", + "_spec": "type-fest@^0.20.2", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\boxen", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "https://sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/type-fest/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "A collection of essential TypeScript types", + "devDependencies": { + "@sindresorhus/tsconfig": "~0.7.0", + "tsd": "^0.13.1", + "typescript": "^4.1.2", + "xo": "^0.35.0" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "index.d.ts", + "base.d.ts", + "source", + "ts41" + ], + "funding": "https://github.com/sponsors/sindresorhus", + "homepage": "https://github.com/sindresorhus/type-fest#readme", + "keywords": [ + "typescript", + "ts", + "types", + "utility", + "util", + "utilities", + "omit", + "merge", + "json" + ], + "license": "(MIT OR CC0-1.0)", + "name": "type-fest", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/type-fest.git" + }, + "scripts": { + "//test": "xo && tsd && tsc", + "test": "xo && tsc" + }, + "types": "./index.d.ts", + "typesVersions": { + ">=4.1": { + "*": [ + "ts41/*" + ] + } + }, + "version": "0.20.2", + "xo": { + "rules": { + "@typescript-eslint/ban-types": "off", + "@typescript-eslint/indent": "off", + "node/no-unsupported-features/es-builtins": "off" + } + } +} diff --git a/node_modules/type-fest/readme.md b/node_modules/type-fest/readme.md new file mode 100644 index 000000000..714df7868 --- /dev/null +++ b/node_modules/type-fest/readme.md @@ -0,0 +1,658 @@ +
+# type-fest
+
+A collection of essential TypeScript types
+ +[![](https://img.shields.io/badge/unicorn-approved-ff69b4.svg)](https://giphy.com/gifs/illustration-rainbow-unicorn-26AHG5KGFxSkUWw1i) + + +Many of the types here should have been built-in. You can help by suggesting some of them to the [TypeScript project](https://github.com/Microsoft/TypeScript/blob/master/CONTRIBUTING.md). + +Either add this package as a dependency or copy-paste the needed types. No credit required. 👌 + +PR welcome for additional commonly needed types and docs improvements. Read the [contributing guidelines](.github/contributing.md) first. + +## Install + +``` +$ npm install type-fest +``` + +*Requires TypeScript >=3.4* + +## Usage + +```ts +import {Except} from 'type-fest'; + +type Foo = { + unicorn: string; + rainbow: boolean; +}; + +type FooWithoutRainbow = Except; +//=> {unicorn: string} +``` + +## API + +Click the type names for complete docs. + +### Basic + +- [`Primitive`](source/basic.d.ts) - Matches any [primitive value](https://developer.mozilla.org/en-US/docs/Glossary/Primitive). +- [`Class`](source/basic.d.ts) - Matches a [`class` constructor](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes). +- [`TypedArray`](source/basic.d.ts) - Matches any [typed array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray), like `Uint8Array` or `Float64Array`. +- [`JsonObject`](source/basic.d.ts) - Matches a JSON object. +- [`JsonArray`](source/basic.d.ts) - Matches a JSON array. +- [`JsonValue`](source/basic.d.ts) - Matches any valid JSON value. +- [`ObservableLike`](source/basic.d.ts) - Matches a value that is like an [Observable](https://github.com/tc39/proposal-observable). + +### Utilities + +- [`Except`](source/except.d.ts) - Create a type from an object type without certain keys. This is a stricter version of [`Omit`](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-5.html#the-omit-helper-type). +- [`Mutable`](source/mutable.d.ts) - Convert an object with `readonly` keys into a mutable object. The inverse of `Readonly`. +- [`Merge`](source/merge.d.ts) - Merge two types into a new type. Keys of the second type overrides keys of the first type. +- [`MergeExclusive`](source/merge-exclusive.d.ts) - Create a type that has mutually exclusive keys. +- [`RequireAtLeastOne`](source/require-at-least-one.d.ts) - Create a type that requires at least one of the given keys. +- [`RequireExactlyOne`](source/require-exactly-one.d.ts) - Create a type that requires exactly a single key of the given keys and disallows more. +- [`PartialDeep`](source/partial-deep.d.ts) - Create a deeply optional version of another type. Use [`Partial`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1401-L1406) if you only need one level deep. +- [`ReadonlyDeep`](source/readonly-deep.d.ts) - Create a deeply immutable version of an `object`/`Map`/`Set`/`Array` type. Use [`Readonly`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1415-L1420) if you only need one level deep. +- [`LiteralUnion`](source/literal-union.d.ts) - Create a union type by combining primitive types and literal types without sacrificing auto-completion in IDEs for the literal type part of the union. Workaround for [Microsoft/TypeScript#29729](https://github.com/Microsoft/TypeScript/issues/29729). +- [`Promisable`](source/promisable.d.ts) - Create a type that represents either the value or the value wrapped in `PromiseLike`. 
+- [`Opaque`](source/opaque.d.ts) - Create an [opaque type](https://codemix.com/opaque-types-in-javascript/). +- [`SetOptional`](source/set-optional.d.ts) - Create a type that makes the given keys optional. +- [`SetRequired`](source/set-required.d.ts) - Create a type that makes the given keys required. +- [`ValueOf`](source/value-of.d.ts) - Create a union of the given object's values, and optionally specify which keys to get the values from. +- [`PromiseValue`](source/promise-value.d.ts) - Returns the type that is wrapped inside a `Promise`. +- [`AsyncReturnType`](source/async-return-type.d.ts) - Unwrap the return type of a function that returns a `Promise`. +- [`ConditionalKeys`](source/conditional-keys.d.ts) - Extract keys from a shape where values extend the given `Condition` type. +- [`ConditionalPick`](source/conditional-pick.d.ts) - Like `Pick` except it selects properties from a shape where the values extend the given `Condition` type. +- [`ConditionalExcept`](source/conditional-except.d.ts) - Like `Omit` except it removes properties from a shape where the values extend the given `Condition` type. +- [`UnionToIntersection`](source/union-to-intersection.d.ts) - Convert a union type to an intersection type. +- [`Stringified`](source/stringified.d.ts) - Create a type with the keys of the given type changed to `string` type. +- [`FixedLengthArray`](source/fixed-length-array.d.ts) - Create a type that represents an array of the given type and length. +- [`IterableElement`](source/iterable-element.d.ts) - Get the element type of an `Iterable`/`AsyncIterable`. For example, an array or a generator. +- [`Entry`](source/entry.d.ts) - Create a type that represents the type of an entry of a collection. +- [`Entries`](source/entries.d.ts) - Create a type that represents the type of the entries of a collection. +- [`SetReturnType`](source/set-return-type.d.ts) - Create a function type with a return type of your choice and the same parameters as the given function type. +- [`Asyncify`](source/asyncify.d.ts) - Create an async version of the given function type. + +### Template literal types + +*Note:* These require [TypeScript 4.1 or newer](https://devblogs.microsoft.com/typescript/announcing-typescript-4-1/#template-literal-types). + +- [`CamelCase`](ts41/camel-case.d.ts) – Convert a string literal to camel-case (`fooBar`). +- [`KebabCase`](ts41/kebab-case.d.ts) – Convert a string literal to kebab-case (`foo-bar`). +- [`PascalCase`](ts41/pascal-case.d.ts) – Converts a string literal to pascal-case (`FooBar`) +- [`SnakeCase`](ts41/snake-case.d.ts) – Convert a string literal to snake-case (`foo_bar`). +- [`DelimiterCase`](ts41/delimiter-case.d.ts) – Convert a string literal to a custom string delimiter casing. + +### Miscellaneous + +- [`PackageJson`](source/package-json.d.ts) - Type for [npm's `package.json` file](https://docs.npmjs.com/creating-a-package-json-file). +- [`TsConfigJson`](source/tsconfig-json.d.ts) - Type for [TypeScript's `tsconfig.json` file](https://www.typescriptlang.org/docs/handbook/tsconfig-json.html) (TypeScript 3.7). + +## Declined types + +*If we decline a type addition, we will make sure to document the better solution here.* + +- [`Diff` and `Spread`](https://github.com/sindresorhus/type-fest/pull/7) - The PR author didn't provide any real-world use-cases and the PR went stale. If you think this type is useful, provide some real-world use-cases and we might reconsider. 
+- [`Dictionary`](https://github.com/sindresorhus/type-fest/issues/33) - You only save a few characters (`Dictionary` vs `Record`) from [`Record`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1429-L1434), which is more flexible and well-known. Also, you shouldn't use an object as a dictionary. We have `Map` in JavaScript now. +- [`SubType`](https://github.com/sindresorhus/type-fest/issues/22) - The type is powerful, but lacks good use-cases and is prone to misuse. +- [`ExtractProperties` and `ExtractMethods`](https://github.com/sindresorhus/type-fest/pull/4) - The types violate the single responsibility principle. Instead, refine your types into more granular type hierarchies. + +## Tips + +### Built-in types + +There are many advanced types most users don't know about. + +- [`Partial`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1401-L1406) - Make all properties in `T` optional. +
+
+	Example
+
+	[Playground](https://www.typescriptlang.org/play/#code/JYOwLgpgTgZghgYwgAgHIHsAmEDC6QzADmyA3gLABQyycADnanALYQBcyAzmFKEQNxUaddFDAcQAV2YAjaIMoBfKlQQAbOJ05osEAIIMAQpOBrsUMkOR1eANziRkCfISKSoD4Pg4ZseAsTIALyW1DS0DEysHADkvvoMMQA0VsKi4sgAzAAMuVaKClY2wPaOknSYDrguADwA0sgQAB6QIJjaANYQAJ7oMDp+LsQAfAAUXd0cdUnI9mo+uv6uANp1ALoAlKHhyGAAFsCcAHTOAW4eYF4gyxNrwbNwago0ypRWp66jH8QcAApwYmAjxq8SWIy2FDCNDA3ToKFBQyIdR69wmfQG1TOhShyBgomQX3w3GQE2Q6IA8jIAFYQBBgI4TTiEs5bTQYsFInrLTbbHZOIlgZDlSqQABqj0kKBC3yINx6a2xfOQwH6o2FVXFaklwSCIUkbQghBAEEwENSfNOlykEGefNe5uhB2O6sgS3GPRmLogmslG1tLxUOKgEDA7hAuydtteryAA)
+
+	```ts
+	interface NodeConfig {
+		appName: string;
+		port: number;
+	}
+
+	class NodeAppBuilder {
+		private configuration: NodeConfig = {
+			appName: 'NodeApp',
+			port: 3000
+		};
+
+		private updateConfig<Key extends keyof NodeConfig>(key: Key, value: NodeConfig[Key]) {
+			this.configuration[key] = value;
+		}
+
+		config(config: Partial<NodeConfig>) {
+			type NodeConfigKey = keyof NodeConfig;
+
+			for (const key of Object.keys(config) as NodeConfigKey[]) {
+				const updateValue = config[key];
+
+				if (updateValue === undefined) {
+					continue;
+				}
+
+				this.updateConfig(key, updateValue);
+			}
+
+			return this;
+		}
+	}
+
+	// `Partial<NodeConfig>` allows us to provide only a part of the
+	// NodeConfig interface.
+	new NodeAppBuilder().config({appName: 'ToDoApp'});
+	```
+
+ +- [`Required`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1408-L1413) - Make all properties in `T` required. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/AQ4SwOwFwUwJwGYEMDGNgGED21VQGJZwC2wA3gFCjXAzFJgA2A-AFzADOUckA5gNxUaIYjA4ckvGG07c+g6gF8KQkAgCuEFFDA5O6gEbEwUbLm2ESwABQIixACJIoSdgCUYAR3Vg4MACYAPGYuFvYAfACU5Ko0APRxwADKMBD+wFAAFuh2Vv7OSBlYGdmc8ABu8LHKsRyGxqY4oQT21pTCIHQMjOwA5DAAHgACxAAOjDAAdChYxL0ANLHUouKSMH0AEmAAhJhY6ozpAJ77GTCMjMCiV0ToSAb7UJPPC9WRgrEJwAAqR6MwSRQPFGUFocDgRHYxnEfGAowh-zgUCOwF6KwkUl6tXqJhCeEsxDaS1AXSYfUGI3GUxmc0WSneQA)
+
+	```ts
+	interface ContactForm {
+		email?: string;
+		message?: string;
+	}
+
+	function submitContactForm(formData: Required<ContactForm>) {
+		// Send the form data to the server.
+	}
+
+	submitContactForm({
+		email: 'ex@mple.com',
+		message: 'Hi! Could you tell me more about…',
+	});
+
+	// TypeScript error: missing property 'message'
+	submitContactForm({
+		email: 'ex@mple.com',
+	});
+	```
+
+ +- [`Readonly`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1415-L1420) - Make all properties in `T` readonly. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/AQ4UwOwVwW2AZA9gc3mAbmANsA3gKFCOAHkAzMgGkOJABEwAjKZa2kAUQCcvEu32AMQCGAF2FYBIAL4BufDRABLCKLBcywgMZgEKZOoDCiCGSXI8i4hGEwwALmABnUVxXJ57YFgzZHSVF8sT1BpBSItLGEnJz1kAy5LLy0TM2RHACUwYQATEywATwAeAITjU3MAPnkrCJMXLigtUT4AClxgGztKbyDgaX99I1TzAEokr1BRAAslJwA6FIqLAF48TtswHp9MHDla9hJGACswZvmyLjAwAC8wVpm5xZHkUZDaMKIwqyWXYCW0oN4sNlsA1h0ug5gAByACyBQAggAHJHQ7ZBIFoXbzBjMCz7OoQP5YIaJNYQMAAdziCVaALGNSIAHomcAACoFJFgADKWjcSNEwG4vC4ji0wggEEQguiTnMEGALWAV1yAFp8gVgEjeFyuKICvMrCTgVxnst5jtsGC4ljsPNhXxGaAWcAAOq6YRXYDCRg+RWIcA5JSC+kWdCepQ+v3RYCU3RInzRMCGwlpC19NYBW1Ye08R1AA)
+
+	```ts
+	enum LogLevel {
+		Off,
+		Debug,
+		Error,
+		Fatal
+	};
+
+	interface LoggerConfig {
+		name: string;
+		level: LogLevel;
+	}
+
+	class Logger {
+		config: Readonly<LoggerConfig>;
+
+		constructor({name, level}: LoggerConfig) {
+			this.config = {name, level};
+			Object.freeze(this.config);
+		}
+	}
+
+	const config: LoggerConfig = {
+		name: 'MyApp',
+		level: LogLevel.Debug
+	};
+
+	const logger = new Logger(config);
+
+	// TypeScript Error: cannot assign to read-only property.
+	logger.config.level = LogLevel.Error;
+
+	// We are still able to edit the `config` variable as we please.
+	config.level = LogLevel.Error;
+	```
+
+ +- [`Pick`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1422-L1427) - From `T`, pick a set of properties whose keys are in the union `K`. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/AQ4SwOwFwUwJwGYEMDGNgEE5TCgNugN4BQoZwOUBAXMAM5RyQDmA3KeSFABYCuAtgCMISMHloMmENh04oA9tBjQJjFuzIBfYrOAB6PcADCcGElh1gEGAHcKATwAO6ebyjB5CTNlwFwSxFR0BX5HeToYABNgBDh5fm8cfBg6AHIKG3ldA2BHOOcfFNpUygJ0pAhokr4hETFUgDpswywkggAFUwA3MFtgAF5gQgowKhhVKTYKGuFRcXo1aVZgbTIoJ3RW3xhOmB6+wfbcAGsAHi3kgBpgEtGy4AAfG54BWfqAPnZm4AAlZUj4MAkMA8GAGB4vEgfMlLLw6CwPBA8PYRmMgZVgAC6CgmI4cIommQELwICh8RBgKZKvALh1ur0bHQABR5PYMui0Wk7em2ADaAF0AJS0AASABUALIAGQAogR+Mp3CROCAFBBwVC2ikBpj5CgBIqGjizLA5TAFdAmalImAuqlBRoVQh5HBgEy1eDWfs7J5cjzGYKhroVfpDEhHM4MV6GRR5NN0JrtnRg6BVirTFBeHAKYmYY6QNpdB73LmCJZBlSAXAubtvczeSmQMNSuMbmKNgBlHFgPEUNwusBIPAAQlS1xetTmxT0SDoESgdD0C4aACtHMwxytLrohawgA)
+
+	```ts
+	interface Article {
+		title: string;
+		thumbnail: string;
+		content: string;
+	}
+
+	// Creates a new type out of the `Article` interface composed
+	// from the `Article`'s two properties: `title` and `thumbnail`.
+	// `ArticlePreview = {title: string; thumbnail: string}`
+	type ArticlePreview = Pick<Article, 'title' | 'thumbnail'>;
+
+	// Render a list of article previews using only their title and thumbnail.
+	function renderArticlePreviews(previews: ArticlePreview[]): HTMLElement {
+		const articles = document.createElement('div');
+
+		for (const preview of previews) {
+			// Append preview to the articles.
+		}
+
+		return articles;
+	}
+
+	const articles = renderArticlePreviews([
+		{
+			title: 'TypeScript tutorial!',
+			thumbnail: '/assets/ts.jpg'
+		}
+	]);
+	```
+
+ +- [`Record`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1429-L1434) - Construct a type with a set of properties `K` of type `T`. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/AQ4ejYAUHsGcCWAXBMB2dgwGbAKYC2ADgDYwCeeemCaWArgE7ADGMxAhmuQHQBQoYEnJE8wALKEARnkaxEKdMAC8wAOS0kstGuAAfdQBM8ANzxlRjXQbVaWACwC0JPB0NqA3HwGgIwAJJoWozYHCxixnAsjAhStADmwESMMJYo1Fi4HMCIaPEu+MRklHj8gpqyoeHAAKJFFFTAAN4+giDYCIxwSAByHAR4AFw5SDF5Xm2gJBzdfQPD3WPxE5PAlBxdAPLYNQAelgh4aOHDaPQEMowrIAC+3oJ+AMKMrlrAXFhSAFZ4LEhC9g4-0BmA4JBISXgiCkBQABpILrJ5MhUGhYcATGD6Bk4Hh-jNgABrPDkOBlXyQAAq9ngYmJpOAAHcEOCRjAXqwYODfoo6DhakUSph+Uh7GI4P0xER4Cj0OSQGwMP8tP1hgAlX7swwAHgRl2RvIANALSA08ABtAC6AD4VM1Wm0Kow0MMrYaHYJjGYLLJXZb3at1HYnC43Go-QHQDcvA6-JsmEJXARgCDgMYWAhjIYhDAU+YiMAAFIwex0ZmilMITCGF79TLAGRsAgJYAAZRwSEZGzEABFTOZUrJ5Yn+jwnWgeER6HB7AAKJrADpdXqS4ZqYultTG6azVfqHswPBbtauLY7fayQ7HIbAAAMwBuAEoYw9IBq2Ixs9h2eFMOQYPQObALQKJgggABeYhghCIpikkKRpOQRIknAsZUiIeCttECBEP8NSMCkjDDAARMGziuIYxHwYOjDCMBmDNnAuTxA6irdCOBB1Lh5Dqpqn66tISIykawBnOCtqqC0gbjqc9DgpGkxegOliyfJDrRkAA)
+
+	```ts
+	// Positions of employees in our company.
+	type MemberPosition = 'intern' | 'developer' | 'tech-lead';
+
+	// Interface describing properties of a single employee.
+	interface Employee {
+		firstName: string;
+		lastName: string;
+		yearsOfExperience: number;
+	}
+
+	// Create an object that has all possible `MemberPosition` values set as keys.
+	// Those keys will store a collection of Employees of the same position.
+	const team: Record<MemberPosition, Employee[]> = {
+		intern: [],
+		developer: [],
+		'tech-lead': [],
+	};
+
+	// Our team has decided to help John with his dream of becoming a Software Developer.
+	team.intern.push({
+		firstName: 'John',
+		lastName: 'Doe',
+		yearsOfExperience: 0
+	});
+
+	// `Record` forces you to initialize all of the property keys.
+	// TypeScript Error: "tech-lead" property is missing
+	const teamEmpty: Record<MemberPosition, Employee[]> = {
+		intern: null,
+		developer: null,
+	};
+	```
+
+ +- [`Exclude`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1436-L1439) - Exclude from `T` those types that are assignable to `U`. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/JYOwLgpgTgZghgYwgAgMrQG7QMIHsQzADmyA3gFDLIAOuUYAXMiAK4A2byAPsgM5hRQJHqwC2AI2gBucgF9y5MAE9qKAEoQAjiwj8AEnBAATNtGQBeZAAooWphu26wAGmS3e93bRC8IASgsAPmRDJRlyAHoI5ABRAA8ENhYjFFYOZGVVZBgoXFFkAAM0zh5+QRBhZhYJaAKAOkjogEkQZAQ4X2QAdwALCFbaemRgXmQtFjhOMFwq9K6ULuB0lk6U+HYwZAxJnQaYFhAEMGB8ZCIIMAAFOjAANR2IK0HGWISklIAedCgsKDwCYgAbQA5M9gQBdVzFQJ+JhiSRQMiUYYwayZCC4VHPCzmSzAspCYEBWxgFhQAZwKC+FpgJ43VwARgADH4ZFQSWSBjcZPJyPtDsdTvxKWBvr8rD1DCZoJ5HPopaYoK4EPhCEQmGKcKriLCtrhgEYkVQVT5Nr4fmZLLZtMBbFZgT0wGBqES6ghbHBIJqoBKFdBWQpjfh+DQbhY2tqiHVsbjLMVkAB+ZAAZiZaeQTHOVxu9ySjxNaujNwDVHNvzqbBGkBAdPoAfkQA)
+
+	```ts
+	interface ServerConfig {
+		port: null | string | number;
+	}
+
+	type RequestHandler = (request: Request, response: Response) => void;
+
+	// Exclude the `null` type from `null | string | number`.
+	// If the port is `null`, we will use the default value.
+	function getPortValue(port: Exclude<ServerConfig['port'], null>): number {
+		if (typeof port === 'string') {
+			return parseInt(port, 10);
+		}
+
+		return port;
+	}
+
+	function startServer(handler: RequestHandler, config: ServerConfig): void {
+		const server = require('http').createServer(handler);
+
+		const port = config.port === null ? 3000 : getPortValue(config.port);
+		server.listen(port);
+	}
+	```
+
+ +- [`Extract`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1441-L1444) - Extract from `T` those types that are assignable to `U`. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/CYUwxgNghgTiAEAzArgOzAFwJYHtXzSwEdkQBJYACgEoAueVZAWwCMQYBuAKDDwGcM8MgBF4AXngBlAJ6scESgHIRi6ty5ZUGdoihgEABXZ888AN5d48ANoiAuvUat23K6ihMQ9ATE0BzV3goPy8GZjZOLgBfLi4Aejj4AEEICBwAdz54MAALKFQQ+BxEeAAHY1NgKAwoIKy0grr4DByEUpgccpgMaXgAaxBerCzi+B9-ZulygDouFHRsU1z8kKMYE1RhaqgAHkt4AHkWACt4EAAPbVRgLLWNgBp9gGlBs8uQa6yAUUuYPQwdgNpKM7nh7mMML4CgA+R5WABqUAgpDeVxuhxO1he0jsXGh8EoOBO9COx3BQPo2PBADckaR6IjkSA6PBqTgsMBzPsicdrEC7OJWXSQNwYvFEgAVTS9JLXODpeDpKBZFg4GCoWa8VACIJykAKiQWKy2YQOAioYikCg0OEMDyhRSy4DyxS24KhAAMjyi6gS8AAwjh5OD0iBFHAkJoEOksC1mnkMJq8gUQKDNttKPlnfrwYp3J5XfBHXqoKpfYkAOI4ansTxaeDADmoRSCCBYAbxhC6TDx6rwYHIRX5bScjA4bLJwoDmDwDkfbA9JMrVMVdM1TN69LgkTgwgkchUahqIA)
+
+	```ts
+	declare function uniqueId(): number;
+
+	const ID = Symbol('ID');
+
+	interface Person {
+		[ID]: number;
+		name: string;
+		age: number;
+	}
+
+	// Allows changing the person data as long as the property key is of string type.
+	function changePersonData<
+		Obj extends Person,
+		Key extends Extract<keyof Obj, string>,
+		Value extends Obj[Key]
+	> (obj: Obj, key: Key, value: Value): void {
+		obj[key] = value;
+	}
+
+	// Tiny Andrew was born.
+	const andrew = {
+		[ID]: uniqueId(),
+		name: 'Andrew',
+		age: 0,
+	};
+
+	// Cool, we're fine with that.
+	changePersonData(andrew, 'name', 'Pony');
+
+	// The government didn't like the fact that you wanted to change your identity.
+	changePersonData(andrew, ID, uniqueId());
+	```
+
+ +- [`NonNullable`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1446-L1449) - Exclude `null` and `undefined` from `T`. +
+
+	Example
+
+	Works with `strictNullChecks` set to `true`.
+
+	[Playground](https://typescript-play.js.org/?target=6#code/C4TwDgpgBACg9gJ2AOQK4FsBGEFQLxQDOwCAlgHYDmUAPlORtrnQwDasDcAUFwPQBU-WAEMkUOADMowqAGNWwwoSgATCBIqlgpOOSjAAFsOBRSy1IQgr9cKJlSlW1mZYQA3HFH68u8xcoBlHA8EACEHJ08Aby4oKDBUTFZSWXjEFEYcAEIALihkXTR2YSSIAB54JDQsHAA+blj4xOTUsHSACkMzPKD3HHDHNQQAGjSkPMqMmoQASh7g-oihqBi4uNIpdraxPAI2VhmVxrX9AzMAOm2ppnwoAA4ABifuE4BfKAhWSyOTuK7CS7pao3AhXF5rV48E4ICDAVAIPT-cGQyG+XTEIgLMJLTx7CAAdygvRCA0iCHaMwarhJOIQjUBSHaACJHk8mYdeLwxtdcVAAOSsh58+lXdr7Dlcq7A3n3J4PEUdADMcspUE53OluAIUGVTx46oAKuAIAFZGQwCYAKIIBCILjUxaDHAMnla+iodjcIA)
+
+	```ts
+	type PortNumber = string | number | null;
+
+	/** Part of a class definition that is used to build a server. */
+	class ServerBuilder {
+		portNumber!: NonNullable<PortNumber>;
+
+		port(this: ServerBuilder, port: PortNumber): ServerBuilder {
+			if (port == null) {
+				this.portNumber = 8000;
+			} else {
+				this.portNumber = port;
+			}
+
+			return this;
+		}
+	}
+
+	const serverBuilder = new ServerBuilder();
+
+	serverBuilder
+		.port('8000') // portNumber = '8000'
+		.port(null)   // portNumber = 8000
+		.port(3000);  // portNumber = 3000
+
+	// TypeScript error
+	serverBuilder.portNumber = null;
+	```
+
+ +- [`Parameters`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1451-L1454) - Obtain the parameters of a function type in a tuple. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/GYVwdgxgLglg9mABAZwBYmMANgUwBQxgAOIUAXIgIZgCeA2gLoCUFAbnDACaIDeAUIkQB6IYgCypSlBxUATrMo1ECsJzgBbLEoipqAc0J7EMKMgDkiHLnU4wp46pwAPHMgB0fAL58+oSLARECEosLAA5ABUYG2QAHgAxJGdpVWREPDdMylk9ZApqemZEAF4APipacrw-CApEgBogkKwAYThwckQwEHUAIxxZJl4BYVEImiIZKF0oZRwiWVdbeygJmThgOYgcGFYcbhqApCJsyhtpWXcR1cnEePBoeDAABVPzgbTixFeFd8uEsClADcIxGiygIFkSEOT3SmTc2VydQeRx+ZxwF2QQ34gkEwDgsnSuFmMBKiAADEDjIhYk1Qm0OlSYABqZnYka4xA1DJZHJYkGc7yCbyeRA+CAIZCzNAYbA4CIAdxg2zJwVCkWirjwMswuEaACYmCCgA)
+
+	```ts
+	function shuffle(input: any[]): void {
+		// Mutate the array, randomly changing the indexes of its elements.
+	}
+
+	function callNTimes<Fn extends (...args: any[]) => any>(func: Fn, callCount: number) {
+		// Type that represents the type of the received function parameters.
+		type FunctionParameters = Parameters<Fn>;
+
+		return function (...args: FunctionParameters) {
+			for (let i = 0; i < callCount; i++) {
+				func(...args);
+			}
+		};
+	}
+
+	const shuffleTwice = callNTimes(shuffle, 2);
+	```
+
+ +- [`ConstructorParameters`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1456-L1459) - Obtain the parameters of a constructor function type in a tuple. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/MYGwhgzhAECCBOAXAlqApgWQPYBM0mgG8AoaaFRENALmgkXmQDsBzAblOmCycTV4D8teo1YdO3JiICuwRFngAKClWENmLAJRFOZRAAtkEAHQq00ALzlklNBzIBfYk+KhIMAJJTEYJsDQAwmDA+mgAPAAq0GgAHnxMODCKTGgA7tCKxllg8CwQtL4AngDaALraFgB80EWa1SRkAA6MAG5gfNAB4FABPDJyCrQR9tDNyG0dwMGhtBhgjWEiGgA00F70vv4RhY3hEZXVVinpc42KmuJkkv3y8Bly8EPaDWTkhiZd7r3e8LK3llwGCMXGQWGhEOsfH5zJlsrl8p0+gw-goAAo5MAAW3BaHgEEilU0tEhmzQ212BJ0ry4SOg+kg+gBBiMximIGA0nAfAQLGk2N4EAAEgzYcYcnkLsRdDTvNEYkYUKwSdCme9WdM0MYwYhFPSIPpJdTkAAzDKxBUaZX+aAAQgsVmkCTQxuYaBw2ng4Ok8CYcotSu8pMur09iG9vuObxZnx6SN+AyUWTF8MN0CcZE4Ywm5jZHK5aB5fP4iCFIqT4oRRTKRLo6lYVNeAHpG50wOzOe1zHr9NLQ+HoABybsD4HOKXXRA1JCoKhBELmI5pNaB6Fz0KKBAodDYPAgSUTmqYsAALx4m5nC6nW9nGq14KtaEUA9gR9PvuNCjQ9BgACNvcwNBtAcLiAA)
+
+	```ts
+	class ArticleModel {
+		title: string;
+		content?: string;
+
+		constructor(title: string) {
+			this.title = title;
+		}
+	}
+
+	class InstanceCache<T extends (new (...args: any[]) => any)> {
+		private ClassConstructor: T;
+		private cache: Map<string, InstanceType<T>> = new Map();
+
+		constructor (ctr: T) {
+			this.ClassConstructor = ctr;
+		}
+
+		getInstance (...args: ConstructorParameters<T>): InstanceType<T> {
+			const hash = this.calculateArgumentsHash(...args);
+
+			const existingInstance = this.cache.get(hash);
+			if (existingInstance !== undefined) {
+				return existingInstance;
+			}
+
+			return new this.ClassConstructor(...args);
+		}
+
+		private calculateArgumentsHash(...args: any[]): string {
+			// Calculate hash.
+			return 'hash';
+		}
+	}
+
+	const articleCache = new InstanceCache(ArticleModel);
+	const amazonArticle = articleCache.getInstance('Amazon forests burning!');
+	```
+
+ +- [`ReturnType`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1461-L1464) – Obtain the return type of a function type. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/MYGwhgzhAECSAmICmBlJAnAbgS2E6A3gFDTTwD2AcuQC4AW2AdgOYAUAlAFzSbnbyEAvkWFFQkGJSQB3GMVI1sNZNwg10TZgG4S0YOUY0kh1es07d+xmvQBXYDXLpWi5UlMaWAGj0GjJ6BtNdkJdBQYIADpXZGgAXmgYpB1ScOwoq38aeN9DYxoU6GFRKzVoJjUwRjwAYXJbPPRuAFkwAAcAHgAxBodsAx9GWwBbACMMAD4cxhloVraOCyYjdAAzMDxoOut1e0d0UNIZ6WhWSPOwdGYIbiqATwBtAF0uaHudUQB6ACpv6ABpJBINqJdAbADW0Do5BOw3u5R2VTwMHIq2gAANtjZ0bkbHsnFCwJh8ONjHp0EgwEZ4JFoN9PkRVr1FAZoMwkDRYIjqkgOrosepoEgAB7+eAwAV2BxOLy6ACCVxgIrFEoMeOl6AACpcwMMORgIB1JRMiBNWKVdhruJKfOdIpdrtwFddXlzKjyACp3Nq842HaDIbL6BrZBIVGhIpB1EMYSLsmjmtWW-YhAA+qegAAYLKQLQj3ZsEsdccmnGcLor2Dn8xGedHGpEIBzEzspfsfMHDNAANTQACMVaIljV5GQkRA5DYmIpVKQAgAJARO9le33BDXIyi0YuLW2nJFGLqkOvxFB0YPdBSaLZ0IwNzyPkO8-xkGgsLh8Al427a3hWAhXwwHA8EHT5PmgAB1bAQBAANJ24adKWpft72RaBUTgRBUCAj89HAM8xCTaBjggABRQx0DuHJv25P9dCkWRZVIAAiBjoFImpmjlFBgA0NpsjadByDacgIDAEAIAAQmYpjoGYgAZSBsmGPw6DtZiiFA8CoJguDmAQmoZ2QvtUKQLdoAYmBTwgdEiCAA)
+
+	```ts
+	/** Provides every element of the iterable `iter` into the `callback` function and stores the results in an array. */
+	function mapIter<
+		Elem,
+		Func extends (elem: Elem) => any,
+		Ret extends ReturnType<Func>
+	>(iter: Iterable<Elem>, callback: Func): Ret[] {
+		const mapped: Ret[] = [];
+
+		for (const elem of iter) {
+			mapped.push(callback(elem));
+		}
+
+		return mapped;
+	}
+
+	const setObject: Set<string> = new Set();
+	const mapObject: Map<number, string> = new Map();
+
+	mapIter(setObject, (value: string) => value.indexOf('Foo')); // number[]
+
+	mapIter(mapObject, ([key, value]: [number, string]) => {
+		return key % 2 === 0 ? value : 'Odd';
+	}); // string[]
+	```
+
+ +- [`InstanceType`](https://github.com/Microsoft/TypeScript/blob/2961bc3fc0ea1117d4e53bc8e97fa76119bc33e3/src/lib/es5.d.ts#L1466-L1469) – Obtain the instance type of a constructor function type. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/MYGwhgzhAECSAmICmBlJAnAbgS2E6A3gFDTTwD2AcuQC4AW2AdgOYAUAlAFzSbnbyEAvkWFFQkGJSQB3GMVI1sNZNwg10TZgG4S0YOUY0kh1es07d+xmvQBXYDXLpWi5UlMaWAGj0GjJ6BtNdkJdBQYIADpXZGgAXmgYpB1ScOwoq38aeN9DYxoU6GFRKzVoJjUwRjwAYXJbPPRuAFkwAAcAHgAxBodsAx9GWwBbACMMAD4cxhloVraOCyYjdAAzMDxoOut1e0d0UNIZ6WhWSPOwdGYIbiqATwBtAF0uaHudUQB6ACpv6ABpJBINqJdAbADW0Do5BOw3u5R2VTwMHIq2gAANtjZ0bkbHsnFCwJh8ONjHp0EgwEZ4JFoN9PkRVr1FAZoMwkDRYIjqkgOrosepoEgAB7+eAwAV2BxOLy6ACCVxgIrFEoMeOl6AACpcwMMORgIB1JRMiBNWKVdhruJKfOdIpdrtwFddXlzKjyACp3Nq842HaDIbL6BrZBIVGhIpB1EMYSLsmjmtWW-YhAA+qegAAYLKQLQj3ZsEsdccmnGcLor2Dn8xGedHGpEIBzEzspfsfMHDNAANTQACMVaIljV5GQkRA5DYmIpVKQAgAJARO9le33BDXIyi0YuLW2nJFGLqkOvxFB0YPdBSaLZ0IwNzyPkO8-xkGgsLh8Al427a3hWAhXwwHA8EHT5PmgAB1bAQBAANJ24adKWpft72RaBUTgRBUCAj89HAM8xCTaBjggABRQx0DuHJv25P9dCkWRZVIAAiBjoFImpmjlFBgA0NpsjadByDacgIDAEAIAAQmYpjoGYgAZSBsmGPw6DtZiiFA8CoJguDmAQmoZ2QvtUKQLdoAYmBTwgdEiCAA)
+
+	```ts
+	class IdleService {
+		doNothing (): void {}
+	}
+
+	class News {
+		title: string;
+		content: string;
+
+		constructor(title: string, content: string) {
+			this.title = title;
+			this.content = content;
+		}
+	}
+
+	const instanceCounter: Map<Constructor, number> = new Map();
+
+	interface Constructor {
+		new(...args: any[]): any;
+	}
+
+	// Keep track of how many instances of the `Constr` constructor have been created.
+	function getInstance<
+		Constr extends Constructor,
+		Args extends ConstructorParameters<Constr>
+	>(constructor: Constr, ...args: Args): InstanceType<Constr> {
+		let count = instanceCounter.get(constructor) || 0;
+
+		const instance = new constructor(...args);
+
+		instanceCounter.set(constructor, count + 1);
+
+		console.log(`Created ${count + 1} instances of ${constructor.name} class`);
+
+		return instance;
+	}
+
+	const idleService = getInstance(IdleService);
+	// Will log: `Created 1 instances of IdleService class`
+	const newsEntry = getInstance(News, 'New ECMAScript proposals!', 'Last month...');
+	// Will log: `Created 1 instances of News class`
+	```
+
+ +- [`Omit`](https://github.com/microsoft/TypeScript/blob/71af02f7459dc812e85ac31365bfe23daf14b4e4/src/lib/es5.d.ts#L1446) – Constructs a type by picking all properties from T and then removing K. +
+
+	Example
+
+	[Playground](https://typescript-play.js.org/?target=6#code/JYOwLgpgTgZghgYwgAgIImAWzgG2QbwChlks4BzCAVShwC5kBnMKUcgbmKYAcIFgIjBs1YgOXMpSFMWbANoBdTiW5woFddwAW0kfKWEAvoUIB6U8gDCUCHEiNkICAHdkYAJ69kz4GC3JcPG4oAHteKDABBxCYNAxsPFBIWEQUCAAPJG4wZABySUFcgJAAEzMLXNV1ck0dIuCw6EjBADpy5AB1FAQ4EGQAV0YUP2AHDy8wEOQbUugmBLwtEIA3OcmQnEjuZBgQqE7gAGtgZAhwKHdkHFGwNvGUdDIcAGUliIBJEF3kAF5kAHlML4ADyPBIAGjyBUYRQAPnkqho4NoYQA+TiEGD9EAISIhPozErQMG4AASK2gn2+AApek9pCSXm8wFSQooAJQMUkAFQAsgAZACiOAgmDOOSIJAQ+OYyGl4DgoDmf2QJRCCH6YvALQQNjsEGFovF1NyJWAy1y7OUyHMyE+yRAuFImG4Iq1YDswHxbRINjA-SgfXlHqVUE4xiAA)
+
+	```ts
+	interface Animal {
+		imageUrl: string;
+		species: string;
+		images: string[];
+		paragraphs: string[];
+	}
+
+	// Creates a new type with all properties of the `Animal` interface
+	// except the 'images' and 'paragraphs' properties. We can use this
+	// type to render a small hover tooltip for a wiki entry list.
+	type AnimalShortInfo = Omit<Animal, 'images' | 'paragraphs'>;
+
+	function renderAnimalHoverInfo (animals: AnimalShortInfo[]): HTMLElement {
+		const container = document.createElement('div');
+		// Internal implementation.
+		return container;
+	}
+	```
+
+ +You can find some examples in the [TypeScript docs](https://www.typescriptlang.org/docs/handbook/advanced-types.html#predefined-conditional-types). + +## Maintainers + +- [Sindre Sorhus](https://github.com/sindresorhus) +- [Jarek Radosz](https://github.com/CvX) +- [Dimitri Benin](https://github.com/BendingBender) +- [Pelle Wessman](https://github.com/voxpelli) + +## License + +(MIT OR CC0-1.0) + +--- + +
+ + Get professional support for this package with a Tidelift subscription + +
+ + Tidelift helps make open source sustainable for maintainers while giving companies
assurances about security, maintenance, and licensing for their dependencies. +
+
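+### Template literal types: usage sketch
+
+The "Template literal types" section above lists `CamelCase`, `KebabCase`, `PascalCase`, `SnakeCase`, and `DelimiterCase` without inline examples. The following is a minimal sketch of how they behave, assuming TypeScript 4.1+; the names `CliFlag`, `OptionProperty`, and friends are purely illustrative and not part of type-fest.
+
+```ts
+import {CamelCase, KebabCase, PascalCase, SnakeCase} from 'type-fest';
+
+// Start from a kebab-cased CLI flag name.
+type CliFlag = 'dry-run';
+
+type OptionProperty = CamelCase<CliFlag>;    //=> 'dryRun'
+type ComponentName = PascalCase<CliFlag>;    //=> 'DryRun'
+type EnvironmentSuffix = SnakeCase<CliFlag>; //=> 'dry_run'
+type CssClass = KebabCase<OptionProperty>;   //=> 'dry-run'
+
+// The conversions happen entirely at the type level; no runtime code is emitted.
+const flag: OptionProperty = 'dryRun';
+```
+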
diff --git a/node_modules/type-fest/source/async-return-type.d.ts b/node_modules/type-fest/source/async-return-type.d.ts new file mode 100644 index 000000000..79ec1e961 --- /dev/null +++ b/node_modules/type-fest/source/async-return-type.d.ts @@ -0,0 +1,23 @@ +import {PromiseValue} from './promise-value'; + +type AsyncFunction = (...args: any[]) => Promise; + +/** +Unwrap the return type of a function that returns a `Promise`. + +There has been [discussion](https://github.com/microsoft/TypeScript/pull/35998) about implementing this type in TypeScript. + +@example +```ts +import {AsyncReturnType} from 'type-fest'; +import {asyncFunction} from 'api'; + +// This type resolves to the unwrapped return type of `asyncFunction`. +type Value = AsyncReturnType; + +async function doSomething(value: Value) {} + +asyncFunction().then(value => doSomething(value)); +``` +*/ +export type AsyncReturnType = PromiseValue>; diff --git a/node_modules/type-fest/source/asyncify.d.ts b/node_modules/type-fest/source/asyncify.d.ts new file mode 100644 index 000000000..455f2ebda --- /dev/null +++ b/node_modules/type-fest/source/asyncify.d.ts @@ -0,0 +1,31 @@ +import {PromiseValue} from './promise-value'; +import {SetReturnType} from './set-return-type'; + +/** +Create an async version of the given function type, by boxing the return type in `Promise` while keeping the same parameter types. + +Use-case: You have two functions, one synchronous and one asynchronous that do the same thing. Instead of having to duplicate the type definition, you can use `Asyncify` to reuse the synchronous type. + +@example +``` +import {Asyncify} from 'type-fest'; + +// Synchronous function. +function getFooSync(someArg: SomeType): Foo { + // … +} + +type AsyncifiedFooGetter = Asyncify; +//=> type AsyncifiedFooGetter = (someArg: SomeType) => Promise; + +// Same as `getFooSync` but asynchronous. +const getFooAsync: AsyncifiedFooGetter = (someArg) => { + // TypeScript now knows that `someArg` is `SomeType` automatically. + // It also knows that this function must return `Promise`. + // If you have `@typescript-eslint/promise-function-async` linter rule enabled, it will even report that "Functions that return promises must be async.". + + // … +} +``` +*/ +export type Asyncify any> = SetReturnType>>>; diff --git a/node_modules/type-fest/source/basic.d.ts b/node_modules/type-fest/source/basic.d.ts new file mode 100644 index 000000000..d380c8b91 --- /dev/null +++ b/node_modules/type-fest/source/basic.d.ts @@ -0,0 +1,67 @@ +/// + +// TODO: This can just be `export type Primitive = not object` when the `not` keyword is out. +/** +Matches any [primitive value](https://developer.mozilla.org/en-US/docs/Glossary/Primitive). +*/ +export type Primitive = + | null + | undefined + | string + | number + | boolean + | symbol + | bigint; + +// TODO: Remove the `= unknown` sometime in the future when most users are on TS 3.5 as it's now the default +/** +Matches a [`class` constructor](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Classes). +*/ +export type Class = new(...arguments_: Arguments) => T; + +/** +Matches any [typed array](https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/TypedArray), like `Uint8Array` or `Float64Array`. +*/ +export type TypedArray = + | Int8Array + | Uint8Array + | Uint8ClampedArray + | Int16Array + | Uint16Array + | Int32Array + | Uint32Array + | Float32Array + | Float64Array + | BigInt64Array + | BigUint64Array; + +/** +Matches a JSON object. 
+ +This type can be useful to enforce some input to be JSON-compatible or as a super-type to be extended from. Don't use this as a direct return type as the user would have to double-cast it: `jsonObject as unknown as CustomResponse`. Instead, you could extend your CustomResponse type from it to ensure your type only uses JSON-compatible types: `interface CustomResponse extends JsonObject { … }`. +*/ +export type JsonObject = {[Key in string]?: JsonValue}; + +/** +Matches a JSON array. +*/ +export interface JsonArray extends Array {} + +/** +Matches any valid JSON value. +*/ +export type JsonValue = string | number | boolean | null | JsonObject | JsonArray; + +declare global { + interface SymbolConstructor { + readonly observable: symbol; + } +} + +/** +Matches a value that is like an [Observable](https://github.com/tc39/proposal-observable). +*/ +export interface ObservableLike { + subscribe(observer: (value: unknown) => void): void; + [Symbol.observable](): ObservableLike; +} diff --git a/node_modules/type-fest/source/conditional-except.d.ts b/node_modules/type-fest/source/conditional-except.d.ts new file mode 100644 index 000000000..ac506ccf1 --- /dev/null +++ b/node_modules/type-fest/source/conditional-except.d.ts @@ -0,0 +1,43 @@ +import {Except} from './except'; +import {ConditionalKeys} from './conditional-keys'; + +/** +Exclude keys from a shape that matches the given `Condition`. + +This is useful when you want to create a new type with a specific set of keys from a shape. For example, you might want to exclude all the primitive properties from a class and form a new shape containing everything but the primitive properties. + +@example +``` +import {Primitive, ConditionalExcept} from 'type-fest'; + +class Awesome { + name: string; + successes: number; + failures: bigint; + + run() {} +} + +type ExceptPrimitivesFromAwesome = ConditionalExcept; +//=> {run: () => void} +``` + +@example +``` +import {ConditionalExcept} from 'type-fest'; + +interface Example { + a: string; + b: string | number; + c: () => void; + d: {}; +} + +type NonStringKeysOnly = ConditionalExcept; +//=> {b: string | number; c: () => void; d: {}} +``` +*/ +export type ConditionalExcept = Except< + Base, + ConditionalKeys +>; diff --git a/node_modules/type-fest/source/conditional-keys.d.ts b/node_modules/type-fest/source/conditional-keys.d.ts new file mode 100644 index 000000000..eb074dc5d --- /dev/null +++ b/node_modules/type-fest/source/conditional-keys.d.ts @@ -0,0 +1,43 @@ +/** +Extract the keys from a type where the value type of the key extends the given `Condition`. + +Internally this is used for the `ConditionalPick` and `ConditionalExcept` types. + +@example +``` +import {ConditionalKeys} from 'type-fest'; + +interface Example { + a: string; + b: string | number; + c?: string; + d: {}; +} + +type StringKeysOnly = ConditionalKeys; +//=> 'a' +``` + +To support partial types, make sure your `Condition` is a union of undefined (for example, `string | undefined`) as demonstrated below. + +@example +``` +type StringKeysAndUndefined = ConditionalKeys; +//=> 'a' | 'c' +``` +*/ +export type ConditionalKeys = NonNullable< + // Wrap in `NonNullable` to strip away the `undefined` type from the produced union. + { + // Map through all the keys of the given base type. + [Key in keyof Base]: + // Pick only keys with types extending the given `Condition` type. + Base[Key] extends Condition + // Retain this key since the condition passes. + ? Key + // Discard this key since the condition fails. 
+ : never; + + // Convert the produced object into a union type of the keys which passed the conditional test. + }[keyof Base] +>; diff --git a/node_modules/type-fest/source/conditional-pick.d.ts b/node_modules/type-fest/source/conditional-pick.d.ts new file mode 100644 index 000000000..cecc3df14 --- /dev/null +++ b/node_modules/type-fest/source/conditional-pick.d.ts @@ -0,0 +1,42 @@ +import {ConditionalKeys} from './conditional-keys'; + +/** +Pick keys from the shape that matches the given `Condition`. + +This is useful when you want to create a new type from a specific subset of an existing type. For example, you might want to pick all the primitive properties from a class and form a new automatically derived type. + +@example +``` +import {Primitive, ConditionalPick} from 'type-fest'; + +class Awesome { + name: string; + successes: number; + failures: bigint; + + run() {} +} + +type PickPrimitivesFromAwesome = ConditionalPick; +//=> {name: string; successes: number; failures: bigint} +``` + +@example +``` +import {ConditionalPick} from 'type-fest'; + +interface Example { + a: string; + b: string | number; + c: () => void; + d: {}; +} + +type StringKeysOnly = ConditionalPick; +//=> {a: string} +``` +*/ +export type ConditionalPick = Pick< + Base, + ConditionalKeys +>; diff --git a/node_modules/type-fest/source/entries.d.ts b/node_modules/type-fest/source/entries.d.ts new file mode 100644 index 000000000..e02237a92 --- /dev/null +++ b/node_modules/type-fest/source/entries.d.ts @@ -0,0 +1,57 @@ +import {ArrayEntry, MapEntry, ObjectEntry, SetEntry} from './entry'; + +type ArrayEntries = Array>; +type MapEntries = Array>; +type ObjectEntries = Array>; +type SetEntries> = Array>; + +/** +Many collections have an `entries` method which returns an array of a given object's own enumerable string-keyed property [key, value] pairs. The `Entries` type will return the type of that collection's entries. + +For example the {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/entries|`Object`}, {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map/entries|`Map`}, {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/entries|`Array`}, and {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set/entries|`Set`} collections all have this method. Note that `WeakMap` and `WeakSet` do not have this method since their entries are not enumerable. + +@see `Entry` if you want to just access the type of a single entry. 
+ +@example +``` +import {Entries} from 'type-fest'; + +interface Example { + someKey: number; +} + +const manipulatesEntries = (examples: Entries) => examples.map(example => [ + // Does some arbitrary processing on the key (with type information available) + example[0].toUpperCase(), + + // Does some arbitrary processing on the value (with type information available) + example[1].toFixed() +]); + +const example: Example = {someKey: 1}; +const entries = Object.entries(example) as Entries; +const output = manipulatesEntries(entries); + +// Objects +const objectExample = {a: 1}; +const objectEntries: Entries = [['a', 1]]; + +// Arrays +const arrayExample = ['a', 1]; +const arrayEntries: Entries = [[0, 'a'], [1, 1]]; + +// Maps +const mapExample = new Map([['a', 1]]); +const mapEntries: Entries = [['a', 1]]; + +// Sets +const setExample = new Set(['a', 1]); +const setEntries: Entries = [['a', 'a'], [1, 1]]; +``` +*/ +export type Entries = + BaseType extends Map ? MapEntries + : BaseType extends Set ? SetEntries + : BaseType extends unknown[] ? ArrayEntries + : BaseType extends object ? ObjectEntries + : never; diff --git a/node_modules/type-fest/source/entry.d.ts b/node_modules/type-fest/source/entry.d.ts new file mode 100644 index 000000000..41a13a979 --- /dev/null +++ b/node_modules/type-fest/source/entry.d.ts @@ -0,0 +1,60 @@ +type MapKey = BaseType extends Map ? KeyType : never; +type MapValue = BaseType extends Map ? ValueType : never; + +export type ArrayEntry = [number, BaseType[number]]; +export type MapEntry = [MapKey, MapValue]; +export type ObjectEntry = [keyof BaseType, BaseType[keyof BaseType]]; +export type SetEntry = BaseType extends Set ? [ItemType, ItemType] : never; + +/** +Many collections have an `entries` method which returns an array of a given object's own enumerable string-keyed property [key, value] pairs. The `Entry` type will return the type of that collection's entry. + +For example the {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Object/entries|`Object`}, {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Map/entries|`Map`}, {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/entries|`Array`}, and {@link https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Set/entries|`Set`} collections all have this method. Note that `WeakMap` and `WeakSet` do not have this method since their entries are not enumerable. + +@see `Entries` if you want to just access the type of the array of entries (which is the return of the `.entries()` method). 
+ +@example +``` +import {Entry} from 'type-fest'; + +interface Example { + someKey: number; +} + +const manipulatesEntry = (example: Entry) => [ + // Does some arbitrary processing on the key (with type information available) + example[0].toUpperCase(), + + // Does some arbitrary processing on the value (with type information available) + example[1].toFixed(), +]; + +const example: Example = {someKey: 1}; +const entry = Object.entries(example)[0] as Entry; +const output = manipulatesEntry(entry); + +// Objects +const objectExample = {a: 1}; +const objectEntry: Entry = ['a', 1]; + +// Arrays +const arrayExample = ['a', 1]; +const arrayEntryString: Entry = [0, 'a']; +const arrayEntryNumber: Entry = [1, 1]; + +// Maps +const mapExample = new Map([['a', 1]]); +const mapEntry: Entry = ['a', 1]; + +// Sets +const setExample = new Set(['a', 1]); +const setEntryString: Entry = ['a', 'a']; +const setEntryNumber: Entry = [1, 1]; +``` +*/ +export type Entry = + BaseType extends Map ? MapEntry + : BaseType extends Set ? SetEntry + : BaseType extends unknown[] ? ArrayEntry + : BaseType extends object ? ObjectEntry + : never; diff --git a/node_modules/type-fest/source/except.d.ts b/node_modules/type-fest/source/except.d.ts new file mode 100644 index 000000000..7dedbaa4a --- /dev/null +++ b/node_modules/type-fest/source/except.d.ts @@ -0,0 +1,22 @@ +/** +Create a type from an object type without certain keys. + +This type is a stricter version of [`Omit`](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-3-5.html#the-omit-helper-type). The `Omit` type does not restrict the omitted keys to be keys present on the given type, while `Except` does. The benefits of a stricter type are avoiding typos and allowing the compiler to pick up on rename refactors automatically. + +Please upvote [this issue](https://github.com/microsoft/TypeScript/issues/30825) if you want to have the stricter version as a built-in in TypeScript. + +@example +``` +import {Except} from 'type-fest'; + +type Foo = { + a: number; + b: string; + c: boolean; +}; + +type FooWithoutA = Except; +//=> {b: string}; +``` +*/ +export type Except = Pick>; diff --git a/node_modules/type-fest/source/fixed-length-array.d.ts b/node_modules/type-fest/source/fixed-length-array.d.ts new file mode 100644 index 000000000..e3bc0f473 --- /dev/null +++ b/node_modules/type-fest/source/fixed-length-array.d.ts @@ -0,0 +1,38 @@ +/** +Methods to exclude. +*/ +type ArrayLengthMutationKeys = 'splice' | 'push' | 'pop' | 'shift' | 'unshift'; + +/** +Create a type that represents an array of the given type and length. The array's length and the `Array` prototype methods that manipulate its length are excluded in the resulting type. + +Please participate in [this issue](https://github.com/microsoft/TypeScript/issues/26223) if you want to have a similiar type built into TypeScript. + +Use-cases: +- Declaring fixed-length tuples or arrays with a large number of items. +- Creating a range union (for example, `0 | 1 | 2 | 3 | 4` from the keys of such a type) without having to resort to recursive types. +- Creating an array of coordinates with a static length, for example, length of 3 for a 3D vector. 
+ +@example +``` +import {FixedLengthArray} from 'type-fest'; + +type FencingTeam = FixedLengthArray; + +const guestFencingTeam: FencingTeam = ['Josh', 'Michael', 'Robert']; + +const homeFencingTeam: FencingTeam = ['George', 'John']; +//=> error TS2322: Type string[] is not assignable to type 'FencingTeam' + +guestFencingTeam.push('Sam'); +//=> error TS2339: Property 'push' does not exist on type 'FencingTeam' +``` +*/ +export type FixedLengthArray = Pick< + ArrayPrototype, + Exclude +> & { + [index: number]: Element; + [Symbol.iterator]: () => IterableIterator; + readonly length: Length; +}; diff --git a/node_modules/type-fest/source/iterable-element.d.ts b/node_modules/type-fest/source/iterable-element.d.ts new file mode 100644 index 000000000..174cfbf4b --- /dev/null +++ b/node_modules/type-fest/source/iterable-element.d.ts @@ -0,0 +1,46 @@ +/** +Get the element type of an `Iterable`/`AsyncIterable`. For example, an array or a generator. + +This can be useful, for example, if you want to get the type that is yielded in a generator function. Often the return type of those functions are not specified. + +This type works with both `Iterable`s and `AsyncIterable`s, so it can be use with synchronous and asynchronous generators. + +Here is an example of `IterableElement` in action with a generator function: + +@example +``` +function * iAmGenerator() { + yield 1; + yield 2; +} + +type MeNumber = IterableElement> +``` + +And here is an example with an async generator: + +@example +``` +async function * iAmGeneratorAsync() { + yield 'hi'; + yield true; +} + +type MeStringOrBoolean = IterableElement> +``` + +Many types in JavaScript/TypeScript are iterables. This type works on all types that implement those interfaces. For example, `Array`, `Set`, `Map`, `stream.Readable`, etc. + +An example with an array of strings: + +@example +``` +type MeString = IterableElement +``` +*/ +export type IterableElement = + TargetIterable extends Iterable ? + ElementType : + TargetIterable extends AsyncIterable ? + ElementType : + never; diff --git a/node_modules/type-fest/source/literal-union.d.ts b/node_modules/type-fest/source/literal-union.d.ts new file mode 100644 index 000000000..8debd93d0 --- /dev/null +++ b/node_modules/type-fest/source/literal-union.d.ts @@ -0,0 +1,33 @@ +import {Primitive} from './basic'; + +/** +Allows creating a union type by combining primitive types and literal types without sacrificing auto-completion in IDEs for the literal type part of the union. + +Currently, when a union type of a primitive type is combined with literal types, TypeScript loses all information about the combined literals. Thus, when such type is used in an IDE with autocompletion, no suggestions are made for the declared literals. + +This type is a workaround for [Microsoft/TypeScript#29729](https://github.com/Microsoft/TypeScript/issues/29729). It will be removed as soon as it's not needed anymore. + +@example +``` +import {LiteralUnion} from 'type-fest'; + +// Before + +type Pet = 'dog' | 'cat' | string; + +const pet: Pet = ''; +// Start typing in your TypeScript-enabled IDE. +// You **will not** get auto-completion for `dog` and `cat` literals. + +// After + +type Pet2 = LiteralUnion<'dog' | 'cat', string>; + +const pet: Pet2 = ''; +// You **will** get auto-completion for `dog` and `cat` literals. 
+``` + */ +export type LiteralUnion< + LiteralType, + BaseType extends Primitive +> = LiteralType | (BaseType & {_?: never}); diff --git a/node_modules/type-fest/source/merge-exclusive.d.ts b/node_modules/type-fest/source/merge-exclusive.d.ts new file mode 100644 index 000000000..059bd2cbe --- /dev/null +++ b/node_modules/type-fest/source/merge-exclusive.d.ts @@ -0,0 +1,39 @@ +// Helper type. Not useful on its own. +type Without = {[KeyType in Exclude]?: never}; + +/** +Create a type that has mutually exclusive keys. + +This type was inspired by [this comment](https://github.com/Microsoft/TypeScript/issues/14094#issuecomment-373782604). + +This type works with a helper type, called `Without`. `Without` produces a type that has only keys from `FirstType` which are not present on `SecondType` and sets the value type for these keys to `never`. This helper type is then used in `MergeExclusive` to remove keys from either `FirstType` or `SecondType`. + +@example +``` +import {MergeExclusive} from 'type-fest'; + +interface ExclusiveVariation1 { + exclusive1: boolean; +} + +interface ExclusiveVariation2 { + exclusive2: string; +} + +type ExclusiveOptions = MergeExclusive; + +let exclusiveOptions: ExclusiveOptions; + +exclusiveOptions = {exclusive1: true}; +//=> Works +exclusiveOptions = {exclusive2: 'hi'}; +//=> Works +exclusiveOptions = {exclusive1: true, exclusive2: 'hi'}; +//=> Error +``` +*/ +export type MergeExclusive = + (FirstType | SecondType) extends object ? + (Without & SecondType) | (Without & FirstType) : + FirstType | SecondType; + diff --git a/node_modules/type-fest/source/merge.d.ts b/node_modules/type-fest/source/merge.d.ts new file mode 100644 index 000000000..4b3920b7a --- /dev/null +++ b/node_modules/type-fest/source/merge.d.ts @@ -0,0 +1,22 @@ +import {Except} from './except'; + +/** +Merge two types into a new type. Keys of the second type overrides keys of the first type. + +@example +``` +import {Merge} from 'type-fest'; + +type Foo = { + a: number; + b: string; +}; + +type Bar = { + b: number; +}; + +const ab: Merge = {a: 1, b: 2}; +``` +*/ +export type Merge = Except> & SecondType; diff --git a/node_modules/type-fest/source/mutable.d.ts b/node_modules/type-fest/source/mutable.d.ts new file mode 100644 index 000000000..03d0dda7f --- /dev/null +++ b/node_modules/type-fest/source/mutable.d.ts @@ -0,0 +1,22 @@ +/** +Convert an object with `readonly` keys into a mutable object. Inverse of `Readonly`. + +This can be used to [store and mutate options within a class](https://github.com/sindresorhus/pageres/blob/4a5d05fca19a5fbd2f53842cbf3eb7b1b63bddd2/source/index.ts#L72), [edit `readonly` objects within tests](https://stackoverflow.com/questions/50703834), and [construct a `readonly` object within a function](https://github.com/Microsoft/TypeScript/issues/24509). + +@example +``` +import {Mutable} from 'type-fest'; + +type Foo = { + readonly a: number; + readonly b: string; +}; + +const mutableFoo: Mutable = {a: 1, b: '2'}; +mutableFoo.a = 3; +``` +*/ +export type Mutable = { + // For each `Key` in the keys of `ObjectType`, make a mapped type by removing the `readonly` modifier from the key. 
+ -readonly [KeyType in keyof ObjectType]: ObjectType[KeyType]; +}; diff --git a/node_modules/type-fest/source/opaque.d.ts b/node_modules/type-fest/source/opaque.d.ts new file mode 100644 index 000000000..20ab964e2 --- /dev/null +++ b/node_modules/type-fest/source/opaque.d.ts @@ -0,0 +1,65 @@ +/** +Create an opaque type, which hides its internal details from the public, and can only be created by being used explicitly. + +The generic type parameter can be anything. It doesn't have to be an object. + +[Read more about opaque types.](https://codemix.com/opaque-types-in-javascript/) + +There have been several discussions about adding this feature to TypeScript via the `opaque type` operator, similar to how Flow does it. Unfortunately, nothing has (yet) moved forward: + - [Microsoft/TypeScript#15408](https://github.com/Microsoft/TypeScript/issues/15408) + - [Microsoft/TypeScript#15807](https://github.com/Microsoft/TypeScript/issues/15807) + +@example +``` +import {Opaque} from 'type-fest'; + +type AccountNumber = Opaque; +type AccountBalance = Opaque; + +// The Token parameter allows the compiler to differentiate between types, whereas "unknown" will not. For example, consider the following structures: +type ThingOne = Opaque; +type ThingTwo = Opaque; + +// To the compiler, these types are allowed to be cast to each other as they have the same underlying type. They are both `string & { __opaque__: unknown }`. +// To avoid this behaviour, you would instead pass the "Token" parameter, like so. +type NewThingOne = Opaque; +type NewThingTwo = Opaque; + +// Now they're completely separate types, so the following will fail to compile. +function createNewThingOne (): NewThingOne { + // As you can see, casting from a string is still allowed. However, you may not cast NewThingOne to NewThingTwo, and vice versa. + return 'new thing one' as NewThingOne; +} + +// This will fail to compile, as they are fundamentally different types. +const thingTwo = createNewThingOne() as NewThingTwo; + +// Here's another example of opaque typing. +function createAccountNumber(): AccountNumber { + return 2 as AccountNumber; +} + +function getMoneyForAccount(accountNumber: AccountNumber): AccountBalance { + return 4 as AccountBalance; +} + +// This will compile successfully. +getMoneyForAccount(createAccountNumber()); + +// But this won't, because it has to be explicitly passed as an `AccountNumber` type. +getMoneyForAccount(2); + +// You can use opaque values like they aren't opaque too. +const accountNumber = createAccountNumber(); + +// This will not compile successfully. +const newAccountNumber = accountNumber + 2; + +// As a side note, you can (and should) use recursive types for your opaque types to make them stronger and hopefully easier to type. +type Person = { + id: Opaque; + name: string; +}; +``` +*/ +export type Opaque = Type & {readonly __opaque__: Token}; diff --git a/node_modules/type-fest/source/package-json.d.ts b/node_modules/type-fest/source/package-json.d.ts new file mode 100644 index 000000000..cf355d03a --- /dev/null +++ b/node_modules/type-fest/source/package-json.d.ts @@ -0,0 +1,611 @@ +import {LiteralUnion} from './literal-union'; + +declare namespace PackageJson { + /** + A person who has been involved in creating or maintaining the package. + */ + export type Person = + | string + | { + name: string; + url?: string; + email?: string; + }; + + export type BugsLocation = + | string + | { + /** + The URL to the package's issue tracker. 
+ */ + url?: string; + + /** + The email address to which issues should be reported. + */ + email?: string; + }; + + export interface DirectoryLocations { + [directoryType: string]: unknown; + + /** + Location for executable scripts. Sugar to generate entries in the `bin` property by walking the folder. + */ + bin?: string; + + /** + Location for Markdown files. + */ + doc?: string; + + /** + Location for example scripts. + */ + example?: string; + + /** + Location for the bulk of the library. + */ + lib?: string; + + /** + Location for man pages. Sugar to generate a `man` array by walking the folder. + */ + man?: string; + + /** + Location for test files. + */ + test?: string; + } + + export type Scripts = { + /** + Run **before** the package is published (Also run on local `npm install` without any arguments). + */ + prepublish?: string; + + /** + Run both **before** the package is packed and published, and on local `npm install` without any arguments. This is run **after** `prepublish`, but **before** `prepublishOnly`. + */ + prepare?: string; + + /** + Run **before** the package is prepared and packed, **only** on `npm publish`. + */ + prepublishOnly?: string; + + /** + Run **before** a tarball is packed (on `npm pack`, `npm publish`, and when installing git dependencies). + */ + prepack?: string; + + /** + Run **after** the tarball has been generated and moved to its final destination. + */ + postpack?: string; + + /** + Run **after** the package is published. + */ + publish?: string; + + /** + Run **after** the package is published. + */ + postpublish?: string; + + /** + Run **before** the package is installed. + */ + preinstall?: string; + + /** + Run **after** the package is installed. + */ + install?: string; + + /** + Run **after** the package is installed and after `install`. + */ + postinstall?: string; + + /** + Run **before** the package is uninstalled and before `uninstall`. + */ + preuninstall?: string; + + /** + Run **before** the package is uninstalled. + */ + uninstall?: string; + + /** + Run **after** the package is uninstalled. + */ + postuninstall?: string; + + /** + Run **before** bump the package version and before `version`. + */ + preversion?: string; + + /** + Run **before** bump the package version. + */ + version?: string; + + /** + Run **after** bump the package version. + */ + postversion?: string; + + /** + Run with the `npm test` command, before `test`. + */ + pretest?: string; + + /** + Run with the `npm test` command. + */ + test?: string; + + /** + Run with the `npm test` command, after `test`. + */ + posttest?: string; + + /** + Run with the `npm stop` command, before `stop`. + */ + prestop?: string; + + /** + Run with the `npm stop` command. + */ + stop?: string; + + /** + Run with the `npm stop` command, after `stop`. + */ + poststop?: string; + + /** + Run with the `npm start` command, before `start`. + */ + prestart?: string; + + /** + Run with the `npm start` command. + */ + start?: string; + + /** + Run with the `npm start` command, after `start`. + */ + poststart?: string; + + /** + Run with the `npm restart` command, before `restart`. Note: `npm restart` will run the `stop` and `start` scripts if no `restart` script is provided. + */ + prerestart?: string; + + /** + Run with the `npm restart` command. Note: `npm restart` will run the `stop` and `start` scripts if no `restart` script is provided. + */ + restart?: string; + + /** + Run with the `npm restart` command, after `restart`. 
Note: `npm restart` will run the `stop` and `start` scripts if no `restart` script is provided. + */ + postrestart?: string; + } & Record; + + /** + Dependencies of the package. The version range is a string which has one or more space-separated descriptors. Dependencies can also be identified with a tarball or Git URL. + */ + export type Dependency = Record; + + /** + Conditions which provide a way to resolve a package entry point based on the environment. + */ + export type ExportCondition = LiteralUnion< + | 'import' + | 'require' + | 'node' + | 'deno' + | 'browser' + | 'electron' + | 'react-native' + | 'default', + string + >; + + /** + Entry points of a module, optionally with conditions and subpath exports. + */ + export type Exports = + | string + | {[key in ExportCondition]: Exports} + | {[key: string]: Exports}; // eslint-disable-line @typescript-eslint/consistent-indexed-object-style + + export interface NonStandardEntryPoints { + /** + An ECMAScript module ID that is the primary entry point to the program. + */ + module?: string; + + /** + A module ID with untranspiled code that is the primary entry point to the program. + */ + esnext?: + | string + | { + [moduleName: string]: string | undefined; + main?: string; + browser?: string; + }; + + /** + A hint to JavaScript bundlers or component tools when packaging modules for client side use. + */ + browser?: + | string + | Record; + + /** + Denote which files in your project are "pure" and therefore safe for Webpack to prune if unused. + + [Read more.](https://webpack.js.org/guides/tree-shaking/) + */ + sideEffects?: boolean | string[]; + } + + export interface TypeScriptConfiguration { + /** + Location of the bundled TypeScript declaration file. + */ + types?: string; + + /** + Location of the bundled TypeScript declaration file. Alias of `types`. + */ + typings?: string; + } + + /** + An alternative configuration for Yarn workspaces. + */ + export interface WorkspaceConfig { + /** + An array of workspace pattern strings which contain the workspace packages. + */ + packages?: WorkspacePattern[]; + + /** + Designed to solve the problem of packages which break when their `node_modules` are moved to the root workspace directory - a process known as hoisting. For these packages, both within your workspace, and also some that have been installed via `node_modules`, it is important to have a mechanism for preventing the default Yarn workspace behavior. By adding workspace pattern strings here, Yarn will resume non-workspace behavior for any package which matches the defined patterns. + + [Read more](https://classic.yarnpkg.com/blog/2018/02/15/nohoist/) + */ + nohoist?: WorkspacePattern[]; + } + + /** + A workspace pattern points to a directory or group of directories which contain packages that should be included in the workspace installation process. + + The patterns are handled with [minimatch](https://github.com/isaacs/minimatch). + + @example + `docs` → Include the docs directory and install its dependencies. + `packages/*` → Include all nested directories within the packages directory, like `packages/cli` and `packages/core`. + */ + type WorkspacePattern = string; + + export interface YarnConfiguration { + /** + Used to configure [Yarn workspaces](https://classic.yarnpkg.com/docs/workspaces/). + + Workspaces allow you to manage multiple packages within the same repository in such a way that you only need to run `yarn install` once to install all of them in a single pass. 
+ + Please note that the top-level `private` property of `package.json` **must** be set to `true` in order to use workspaces. + */ + workspaces?: WorkspacePattern[] | WorkspaceConfig; + + /** + If your package only allows one version of a given dependency, and you’d like to enforce the same behavior as `yarn install --flat` on the command-line, set this to `true`. + + Note that if your `package.json` contains `"flat": true` and other packages depend on yours (e.g. you are building a library rather than an app), those other packages will also need `"flat": true` in their `package.json` or be installed with `yarn install --flat` on the command-line. + */ + flat?: boolean; + + /** + Selective version resolutions. Allows the definition of custom package versions inside dependencies without manual edits in the `yarn.lock` file. + */ + resolutions?: Dependency; + } + + export interface JSPMConfiguration { + /** + JSPM configuration. + */ + jspm?: PackageJson; + } + + /** + Type for [npm's `package.json` file](https://docs.npmjs.com/creating-a-package-json-file). Containing standard npm properties. + */ + export interface PackageJsonStandard { + /** + The name of the package. + */ + name?: string; + + /** + Package version, parseable by [`node-semver`](https://github.com/npm/node-semver). + */ + version?: string; + + /** + Package description, listed in `npm search`. + */ + description?: string; + + /** + Keywords associated with package, listed in `npm search`. + */ + keywords?: string[]; + + /** + The URL to the package's homepage. + */ + homepage?: LiteralUnion<'.', string>; + + /** + The URL to the package's issue tracker and/or the email address to which issues should be reported. + */ + bugs?: BugsLocation; + + /** + The license for the package. + */ + license?: string; + + /** + The licenses for the package. + */ + licenses?: Array<{ + type?: string; + url?: string; + }>; + + author?: Person; + + /** + A list of people who contributed to the package. + */ + contributors?: Person[]; + + /** + A list of people who maintain the package. + */ + maintainers?: Person[]; + + /** + The files included in the package. + */ + files?: string[]; + + /** + Resolution algorithm for importing ".js" files from the package's scope. + + [Read more.](https://nodejs.org/api/esm.html#esm_package_json_type_field) + */ + type?: 'module' | 'commonjs'; + + /** + The module ID that is the primary entry point to the program. + */ + main?: string; + + /** + Standard entry points of the package, with enhanced support for ECMAScript Modules. + + [Read more.](https://nodejs.org/api/esm.html#esm_package_entry_points) + */ + exports?: Exports; + + /** + The executable files that should be installed into the `PATH`. + */ + bin?: + | string + | Record; + + /** + Filenames to put in place for the `man` program to find. + */ + man?: string | string[]; + + /** + Indicates the structure of the package. + */ + directories?: DirectoryLocations; + + /** + Location for the code repository. + */ + repository?: + | string + | { + type: string; + url: string; + + /** + Relative path to package.json if it is placed in non-root directory (for example if it is part of a monorepo). + + [Read more.](https://github.com/npm/rfcs/blob/latest/implemented/0010-monorepo-subdirectory-declaration.md) + */ + directory?: string; + }; + + /** + Script commands that are run at various times in the lifecycle of the package. The key is the lifecycle event, and the value is the command to run at that point. 
+ */ + scripts?: Scripts; + + /** + Is used to set configuration parameters used in package scripts that persist across upgrades. + */ + config?: Record; + + /** + The dependencies of the package. + */ + dependencies?: Dependency; + + /** + Additional tooling dependencies that are not required for the package to work. Usually test, build, or documentation tooling. + */ + devDependencies?: Dependency; + + /** + Dependencies that are skipped if they fail to install. + */ + optionalDependencies?: Dependency; + + /** + Dependencies that will usually be required by the package user directly or via another dependency. + */ + peerDependencies?: Dependency; + + /** + Indicate peer dependencies that are optional. + */ + peerDependenciesMeta?: Record; + + /** + Package names that are bundled when the package is published. + */ + bundledDependencies?: string[]; + + /** + Alias of `bundledDependencies`. + */ + bundleDependencies?: string[]; + + /** + Engines that this package runs on. + */ + engines?: { + [EngineName in 'npm' | 'node' | string]: string; + }; + + /** + @deprecated + */ + engineStrict?: boolean; + + /** + Operating systems the module runs on. + */ + os?: Array>; + + /** + CPU architectures the module runs on. + */ + cpu?: Array>; + + /** + If set to `true`, a warning will be shown if package is installed locally. Useful if the package is primarily a command-line application that should be installed globally. + + @deprecated + */ + preferGlobal?: boolean; + + /** + If set to `true`, then npm will refuse to publish it. + */ + private?: boolean; + + /** + A set of config values that will be used at publish-time. It's especially handy to set the tag, registry or access, to ensure that a given package is not tagged with 'latest', published to the global public registry or that a scoped module is private by default. + */ + publishConfig?: Record; + + /** + Describes and notifies consumers of a package's monetary support information. + + [Read more.](https://github.com/npm/rfcs/blob/latest/accepted/0017-add-funding-support.md) + */ + funding?: string | { + /** + The type of funding. + */ + type?: LiteralUnion< + | 'github' + | 'opencollective' + | 'patreon' + | 'individual' + | 'foundation' + | 'corporation', + string + >; + + /** + The URL to the funding page. + */ + url: string; + }; + } +} + +/** +Type for [npm's `package.json` file](https://docs.npmjs.com/creating-a-package-json-file). Also includes types for fields used by other popular projects, like TypeScript and Yarn. +*/ +export type PackageJson = +PackageJson.PackageJsonStandard & +PackageJson.NonStandardEntryPoints & +PackageJson.TypeScriptConfiguration & +PackageJson.YarnConfiguration & +PackageJson.JSPMConfiguration; diff --git a/node_modules/type-fest/source/partial-deep.d.ts b/node_modules/type-fest/source/partial-deep.d.ts new file mode 100644 index 000000000..b962b84eb --- /dev/null +++ b/node_modules/type-fest/source/partial-deep.d.ts @@ -0,0 +1,72 @@ +import {Primitive} from './basic'; + +/** +Create a type from another type with all keys and nested keys set to optional. + +Use-cases: +- Merging a default settings/config object with another object, the second object would be a deep partial of the default object. +- Mocking and testing complex entities, where populating an entire object with its keys would be redundant in terms of the mock or test. 
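A small, compilable sketch of the settings-merge use-case; the `Settings` shape and default values below are assumptions made purely for illustration:

```ts
import {PartialDeep} from 'type-fest';

// Hypothetical settings shape used only in this sketch.
interface Settings {
	textEditor: {
		fontSize: number;
		fontColor: string;
	};
	autosave: boolean;
}

const defaults: Settings = {
	textEditor: {fontSize: 14, fontColor: '#000000'},
	autosave: true
};

// `saved` may supply any subset of keys, at any depth.
// A real merge would recurse; a shallow spread keeps the sketch short.
const applySavedSettings = (saved: PartialDeep<Settings>) => ({
	...defaults,
	...saved
});

const merged = applySavedSettings({textEditor: {fontSize: 16}});
```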
+ +@example +``` +import {PartialDeep} from 'type-fest'; + +const settings: Settings = { + textEditor: { + fontSize: 14; + fontColor: '#000000'; + fontWeight: 400; + } + autocomplete: false; + autosave: true; +}; + +const applySavedSettings = (savedSettings: PartialDeep) => { + return {...settings, ...savedSettings}; +} + +settings = applySavedSettings({textEditor: {fontWeight: 500}}); +``` +*/ +export type PartialDeep = T extends Primitive + ? Partial + : T extends Map + ? PartialMapDeep + : T extends Set + ? PartialSetDeep + : T extends ReadonlyMap + ? PartialReadonlyMapDeep + : T extends ReadonlySet + ? PartialReadonlySetDeep + : T extends ((...arguments: any[]) => unknown) + ? T | undefined + : T extends object + ? PartialObjectDeep + : unknown; + +/** +Same as `PartialDeep`, but accepts only `Map`s and as inputs. Internal helper for `PartialDeep`. +*/ +interface PartialMapDeep extends Map, PartialDeep> {} + +/** +Same as `PartialDeep`, but accepts only `Set`s as inputs. Internal helper for `PartialDeep`. +*/ +interface PartialSetDeep extends Set> {} + +/** +Same as `PartialDeep`, but accepts only `ReadonlyMap`s as inputs. Internal helper for `PartialDeep`. +*/ +interface PartialReadonlyMapDeep extends ReadonlyMap, PartialDeep> {} + +/** +Same as `PartialDeep`, but accepts only `ReadonlySet`s as inputs. Internal helper for `PartialDeep`. +*/ +interface PartialReadonlySetDeep extends ReadonlySet> {} + +/** +Same as `PartialDeep`, but accepts only `object`s as inputs. Internal helper for `PartialDeep`. +*/ +type PartialObjectDeep = { + [KeyType in keyof ObjectType]?: PartialDeep +}; diff --git a/node_modules/type-fest/source/promisable.d.ts b/node_modules/type-fest/source/promisable.d.ts new file mode 100644 index 000000000..71242a5db --- /dev/null +++ b/node_modules/type-fest/source/promisable.d.ts @@ -0,0 +1,23 @@ +/** +Create a type that represents either the value or the value wrapped in `PromiseLike`. + +Use-cases: +- A function accepts a callback that may either return a value synchronously or may return a promised value. +- This type could be the return type of `Promise#then()`, `Promise#catch()`, and `Promise#finally()` callbacks. + +Please upvote [this issue](https://github.com/microsoft/TypeScript/issues/31394) if you want to have this type as a built-in in TypeScript. + +@example +``` +import {Promisable} from 'type-fest'; + +async function logger(getLogEntry: () => Promisable): Promise { + const entry = await getLogEntry(); + console.log(entry); +} + +logger(() => 'foo'); +logger(() => Promise.resolve('bar')); +``` +*/ +export type Promisable = T | PromiseLike; diff --git a/node_modules/type-fest/source/promise-value.d.ts b/node_modules/type-fest/source/promise-value.d.ts new file mode 100644 index 000000000..642ddebcc --- /dev/null +++ b/node_modules/type-fest/source/promise-value.d.ts @@ -0,0 +1,27 @@ +/** +Returns the type that is wrapped inside a `Promise` type. +If the type is a nested Promise, it is unwrapped recursively until a non-Promise type is obtained. +If the type is not a `Promise`, the type itself is returned. + +@example +``` +import {PromiseValue} from 'type-fest'; + +type AsyncData = Promise; +let asyncData: PromiseValue = Promise.resolve('ABC'); + +type Data = PromiseValue; +let data: Data = await asyncData; + +// Here's an example that shows how this type reacts to non-Promise types. +type SyncData = PromiseValue; +let syncData: SyncData = getSyncData(); + +// Here's an example that shows how this type reacts to recursive Promise types. 
+type RecursiveAsyncData = Promise >; +let recursiveAsyncData: PromiseValue = Promise.resolve(Promise.resolve('ABC')); +``` +*/ +export type PromiseValue = PromiseType extends Promise + ? { 0: PromiseValue; 1: Value }[PromiseType extends Promise ? 0 : 1] + : Otherwise; diff --git a/node_modules/type-fest/source/readonly-deep.d.ts b/node_modules/type-fest/source/readonly-deep.d.ts new file mode 100644 index 000000000..b8c04de25 --- /dev/null +++ b/node_modules/type-fest/source/readonly-deep.d.ts @@ -0,0 +1,59 @@ +import {Primitive} from './basic'; + +/** +Convert `object`s, `Map`s, `Set`s, and `Array`s and all of their keys/elements into immutable structures recursively. + +This is useful when a deeply nested structure needs to be exposed as completely immutable, for example, an imported JSON module or when receiving an API response that is passed around. + +Please upvote [this issue](https://github.com/microsoft/TypeScript/issues/13923) if you want to have this type as a built-in in TypeScript. + +@example +``` +// data.json +{ + "foo": ["bar"] +} + +// main.ts +import {ReadonlyDeep} from 'type-fest'; +import dataJson = require('./data.json'); + +const data: ReadonlyDeep = dataJson; + +export default data; + +// test.ts +import data from './main'; + +data.foo.push('bar'); +//=> error TS2339: Property 'push' does not exist on type 'readonly string[]' +``` +*/ +export type ReadonlyDeep = T extends Primitive | ((...arguments: any[]) => unknown) + ? T + : T extends ReadonlyMap + ? ReadonlyMapDeep + : T extends ReadonlySet + ? ReadonlySetDeep + : T extends object + ? ReadonlyObjectDeep + : unknown; + +/** +Same as `ReadonlyDeep`, but accepts only `ReadonlyMap`s as inputs. Internal helper for `ReadonlyDeep`. +*/ +interface ReadonlyMapDeep + extends ReadonlyMap, ReadonlyDeep> {} + +/** +Same as `ReadonlyDeep`, but accepts only `ReadonlySet`s as inputs. Internal helper for `ReadonlyDeep`. +*/ +interface ReadonlySetDeep + extends ReadonlySet> {} + +/** +Same as `ReadonlyDeep`, but accepts only `object`s as inputs. Internal helper for `ReadonlyDeep`. +*/ +type ReadonlyObjectDeep = { + readonly [KeyType in keyof ObjectType]: ReadonlyDeep +}; diff --git a/node_modules/type-fest/source/require-at-least-one.d.ts b/node_modules/type-fest/source/require-at-least-one.d.ts new file mode 100644 index 000000000..b3b871917 --- /dev/null +++ b/node_modules/type-fest/source/require-at-least-one.d.ts @@ -0,0 +1,33 @@ +import {Except} from './except'; + +/** +Create a type that requires at least one of the given keys. The remaining keys are kept as is. + +@example +``` +import {RequireAtLeastOne} from 'type-fest'; + +type Responder = { + text?: () => string; + json?: () => string; + + secure?: boolean; +}; + +const responder: RequireAtLeastOne = { + json: () => '{"message": "ok"}', + secure: true +}; +``` +*/ +export type RequireAtLeastOne< + ObjectType, + KeysType extends keyof ObjectType = keyof ObjectType +> = { + // For each `Key` in `KeysType` make a mapped type: + [Key in KeysType]-?: Required> & // 1. Make `Key`'s type required + // 2. Make all other keys in `KeysType` optional + Partial>>; +}[KeysType] & + // 3. Add the remaining keys not in `KeysType` + Except; diff --git a/node_modules/type-fest/source/require-exactly-one.d.ts b/node_modules/type-fest/source/require-exactly-one.d.ts new file mode 100644 index 000000000..c3e7e7ead --- /dev/null +++ b/node_modules/type-fest/source/require-exactly-one.d.ts @@ -0,0 +1,35 @@ +// TODO: Remove this when we target TypeScript >=3.5. 
+type _Omit = Pick>; + +/** +Create a type that requires exactly one of the given keys and disallows more. The remaining keys are kept as is. + +Use-cases: +- Creating interfaces for components that only need one of the keys to display properly. +- Declaring generic keys in a single place for a single use-case that gets narrowed down via `RequireExactlyOne`. + +The caveat with `RequireExactlyOne` is that TypeScript doesn't always know at compile time every key that will exist at runtime. Therefore `RequireExactlyOne` can't do anything to prevent extra keys it doesn't know about. + +@example +``` +import {RequireExactlyOne} from 'type-fest'; + +type Responder = { + text: () => string; + json: () => string; + secure: boolean; +}; + +const responder: RequireExactlyOne = { + // Adding a `text` key here would cause a compile error. + + json: () => '{"message": "ok"}', + secure: true +}; +``` +*/ +export type RequireExactlyOne = + {[Key in KeysType]: ( + Required> & + Partial, never>> + )}[KeysType] & _Omit; diff --git a/node_modules/type-fest/source/set-optional.d.ts b/node_modules/type-fest/source/set-optional.d.ts new file mode 100644 index 000000000..35398992b --- /dev/null +++ b/node_modules/type-fest/source/set-optional.d.ts @@ -0,0 +1,34 @@ +import {Except} from './except'; + +/** +Create a type that makes the given keys optional. The remaining keys are kept as is. The sister of the `SetRequired` type. + +Use-case: You want to define a single model where the only thing that changes is whether or not some of the keys are optional. + +@example +``` +import {SetOptional} from 'type-fest'; + +type Foo = { + a: number; + b?: string; + c: boolean; +} + +type SomeOptional = SetOptional; +// type SomeOptional = { +// a: number; +// b?: string; // Was already optional and still is. +// c?: boolean; // Is now optional. +// } +``` +*/ +export type SetOptional = + // Pick just the keys that are not optional from the base type. + Except & + // Pick the keys that should be optional from the base type and make them optional. + Partial> extends + // If `InferredType` extends the previous, then for each key, use the inferred type key. + infer InferredType + ? {[KeyType in keyof InferredType]: InferredType[KeyType]} + : never; diff --git a/node_modules/type-fest/source/set-required.d.ts b/node_modules/type-fest/source/set-required.d.ts new file mode 100644 index 000000000..0a7233077 --- /dev/null +++ b/node_modules/type-fest/source/set-required.d.ts @@ -0,0 +1,34 @@ +import {Except} from './except'; + +/** +Create a type that makes the given keys required. The remaining keys are kept as is. The sister of the `SetOptional` type. + +Use-case: You want to define a single model where the only thing that changes is whether or not some of the keys are required. + +@example +``` +import {SetRequired} from 'type-fest'; + +type Foo = { + a?: number; + b: string; + c?: boolean; +} + +type SomeRequired = SetRequired; +// type SomeRequired = { +// a?: number; +// b: string; // Was already required and still is. +// c: boolean; // Is now required. +// } +``` +*/ +export type SetRequired = + // Pick just the keys that are not required from the base type. + Except & + // Pick the keys that should be required from the base type and make them required. + Required> extends + // If `InferredType` extends the previous, then for each key, use the inferred type key. + infer InferredType + ? 
{[KeyType in keyof InferredType]: InferredType[KeyType]} + : never; diff --git a/node_modules/type-fest/source/set-return-type.d.ts b/node_modules/type-fest/source/set-return-type.d.ts new file mode 100644 index 000000000..98766b103 --- /dev/null +++ b/node_modules/type-fest/source/set-return-type.d.ts @@ -0,0 +1,29 @@ +type IsAny = 0 extends (1 & T) ? true : false; // https://stackoverflow.com/a/49928360/3406963 +type IsNever = [T] extends [never] ? true : false; +type IsUnknown = IsNever extends false ? T extends unknown ? unknown extends T ? IsAny extends false ? true : false : false : false : false; + +/** +Create a function type with a return type of your choice and the same parameters as the given function type. + +Use-case: You want to define a wrapped function that returns something different while receiving the same parameters. For example, you might want to wrap a function that can throw an error into one that will return `undefined` instead. + +@example +``` +import {SetReturnType} from 'type-fest'; + +type MyFunctionThatCanThrow = (foo: SomeType, bar: unknown) => SomeOtherType; + +type MyWrappedFunction = SetReturnType; +//=> type MyWrappedFunction = (foo: SomeType, bar: unknown) => SomeOtherType | undefined; +``` +*/ +export type SetReturnType any, TypeToReturn> = + // Just using `Parameters` isn't ideal because it doesn't handle the `this` fake parameter. + Fn extends (this: infer ThisArg, ...args: infer Arguments) => any ? ( + // If a function did not specify the `this` fake parameter, it will be inferred to `unknown`. + // We want to detect this situation just to display a friendlier type upon hovering on an IntelliSense-powered IDE. + IsUnknown extends true ? (...args: Arguments) => TypeToReturn : (this: ThisArg, ...args: Arguments) => TypeToReturn + ) : ( + // This part should be unreachable, but we make it meaningful just in case… + (...args: Parameters) => TypeToReturn + ); diff --git a/node_modules/type-fest/source/stringified.d.ts b/node_modules/type-fest/source/stringified.d.ts new file mode 100644 index 000000000..9688b6740 --- /dev/null +++ b/node_modules/type-fest/source/stringified.d.ts @@ -0,0 +1,21 @@ +/** +Create a type with the keys of the given type changed to `string` type. + +Use-case: Changing interface values to strings in order to use them in a form model. 
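A compilable sketch of that form-model use-case; the `Car` shape is an assumption made for illustration:

```ts
import {Stringified} from 'type-fest';

// Hypothetical domain type used only in this sketch.
interface Car {
	model: string;
	speed: number;
}

// Every property value becomes a string, e.g. for binding to text inputs.
const carFormModel: Stringified<Car> = {
	model: 'Foo',
	speed: '101'
};
```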
+ +@example +``` +import {Stringified} from 'type-fest'; + +type Car { + model: string; + speed: number; +} + +const carForm: Stringified = { + model: 'Foo', + speed: '101' +}; +``` +*/ +export type Stringified = {[KeyType in keyof ObjectType]: string}; diff --git a/node_modules/type-fest/source/tsconfig-json.d.ts b/node_modules/type-fest/source/tsconfig-json.d.ts new file mode 100644 index 000000000..89f6e9dd4 --- /dev/null +++ b/node_modules/type-fest/source/tsconfig-json.d.ts @@ -0,0 +1,870 @@ +declare namespace TsConfigJson { + namespace CompilerOptions { + export type JSX = + | 'preserve' + | 'react' + | 'react-native'; + + export type Module = + | 'CommonJS' + | 'AMD' + | 'System' + | 'UMD' + | 'ES6' + | 'ES2015' + | 'ESNext' + | 'None' + // Lowercase alternatives + | 'commonjs' + | 'amd' + | 'system' + | 'umd' + | 'es6' + | 'es2015' + | 'esnext' + | 'none'; + + export type NewLine = + | 'CRLF' + | 'LF' + // Lowercase alternatives + | 'crlf' + | 'lf'; + + export type Target = + | 'ES3' + | 'ES5' + | 'ES6' + | 'ES2015' + | 'ES2016' + | 'ES2017' + | 'ES2018' + | 'ES2019' + | 'ES2020' + | 'ESNext' + // Lowercase alternatives + | 'es3' + | 'es5' + | 'es6' + | 'es2015' + | 'es2016' + | 'es2017' + | 'es2018' + | 'es2019' + | 'es2020' + | 'esnext'; + + export type Lib = + | 'ES5' + | 'ES6' + | 'ES7' + | 'ES2015' + | 'ES2015.Collection' + | 'ES2015.Core' + | 'ES2015.Generator' + | 'ES2015.Iterable' + | 'ES2015.Promise' + | 'ES2015.Proxy' + | 'ES2015.Reflect' + | 'ES2015.Symbol.WellKnown' + | 'ES2015.Symbol' + | 'ES2016' + | 'ES2016.Array.Include' + | 'ES2017' + | 'ES2017.Intl' + | 'ES2017.Object' + | 'ES2017.SharedMemory' + | 'ES2017.String' + | 'ES2017.TypedArrays' + | 'ES2018' + | 'ES2018.AsyncIterable' + | 'ES2018.Intl' + | 'ES2018.Promise' + | 'ES2018.Regexp' + | 'ES2019' + | 'ES2019.Array' + | 'ES2019.Object' + | 'ES2019.String' + | 'ES2019.Symbol' + | 'ES2020' + | 'ES2020.String' + | 'ES2020.Symbol.WellKnown' + | 'ESNext' + | 'ESNext.Array' + | 'ESNext.AsyncIterable' + | 'ESNext.BigInt' + | 'ESNext.Intl' + | 'ESNext.Symbol' + | 'DOM' + | 'DOM.Iterable' + | 'ScriptHost' + | 'WebWorker' + | 'WebWorker.ImportScripts' + // Lowercase alternatives + | 'es5' + | 'es6' + | 'es7' + | 'es2015' + | 'es2015.collection' + | 'es2015.core' + | 'es2015.generator' + | 'es2015.iterable' + | 'es2015.promise' + | 'es2015.proxy' + | 'es2015.reflect' + | 'es2015.symbol.wellknown' + | 'es2015.symbol' + | 'es2016' + | 'es2016.array.include' + | 'es2017' + | 'es2017.intl' + | 'es2017.object' + | 'es2017.sharedmemory' + | 'es2017.string' + | 'es2017.typedarrays' + | 'es2018' + | 'es2018.asynciterable' + | 'es2018.intl' + | 'es2018.promise' + | 'es2018.regexp' + | 'es2019' + | 'es2019.array' + | 'es2019.object' + | 'es2019.string' + | 'es2019.symbol' + | 'es2020' + | 'es2020.string' + | 'es2020.symbol.wellknown' + | 'esnext' + | 'esnext.array' + | 'esnext.asynciterable' + | 'esnext.bigint' + | 'esnext.intl' + | 'esnext.symbol' + | 'dom' + | 'dom.iterable' + | 'scripthost' + | 'webworker' + | 'webworker.importscripts'; + + export interface Plugin { + [key: string]: unknown; + /** + Plugin name. + */ + name?: string; + } + } + + export interface CompilerOptions { + /** + The character set of the input files. + + @default 'utf8' + */ + charset?: string; + + /** + Enables building for project references. + + @default true + */ + composite?: boolean; + + /** + Generates corresponding d.ts files. + + @default false + */ + declaration?: boolean; + + /** + Specify output directory for generated declaration files. 
+ + Requires TypeScript version 2.0 or later. + */ + declarationDir?: string; + + /** + Show diagnostic information. + + @default false + */ + diagnostics?: boolean; + + /** + Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. + + @default false + */ + emitBOM?: boolean; + + /** + Only emit `.d.ts` declaration files. + + @default false + */ + emitDeclarationOnly?: boolean; + + /** + Enable incremental compilation. + + @default `composite` + */ + incremental?: boolean; + + /** + Specify file to store incremental compilation information. + + @default '.tsbuildinfo' + */ + tsBuildInfoFile?: string; + + /** + Emit a single file with source maps instead of having a separate file. + + @default false + */ + inlineSourceMap?: boolean; + + /** + Emit the source alongside the sourcemaps within a single file. + + Requires `--inlineSourceMap` to be set. + + @default false + */ + inlineSources?: boolean; + + /** + Specify JSX code generation: `'preserve'`, `'react'`, or `'react-native'`. + + @default 'preserve' + */ + jsx?: CompilerOptions.JSX; + + /** + Specifies the object invoked for `createElement` and `__spread` when targeting `'react'` JSX emit. + + @default 'React' + */ + reactNamespace?: string; + + /** + Print names of files part of the compilation. + + @default false + */ + listFiles?: boolean; + + /** + Specifies the location where debugger should locate map files instead of generated locations. + */ + mapRoot?: string; + + /** + Specify module code generation: 'None', 'CommonJS', 'AMD', 'System', 'UMD', 'ES6', 'ES2015' or 'ESNext'. Only 'AMD' and 'System' can be used in conjunction with `--outFile`. 'ES6' and 'ES2015' values may be used when targeting 'ES5' or lower. + + @default ['ES3', 'ES5'].includes(target) ? 'CommonJS' : 'ES6' + */ + module?: CompilerOptions.Module; + + /** + Specifies the end of line sequence to be used when emitting files: 'crlf' (Windows) or 'lf' (Unix). + + Default: Platform specific + */ + newLine?: CompilerOptions.NewLine; + + /** + Do not emit output. + + @default false + */ + noEmit?: boolean; + + /** + Do not generate custom helper functions like `__extends` in compiled output. + + @default false + */ + noEmitHelpers?: boolean; + + /** + Do not emit outputs if any type checking errors were reported. + + @default false + */ + noEmitOnError?: boolean; + + /** + Warn on expressions and declarations with an implied 'any' type. + + @default false + */ + noImplicitAny?: boolean; + + /** + Raise error on 'this' expressions with an implied any type. + + @default false + */ + noImplicitThis?: boolean; + + /** + Report errors on unused locals. + + Requires TypeScript version 2.0 or later. + + @default false + */ + noUnusedLocals?: boolean; + + /** + Report errors on unused parameters. + + Requires TypeScript version 2.0 or later. + + @default false + */ + noUnusedParameters?: boolean; + + /** + Do not include the default library file (lib.d.ts). + + @default false + */ + noLib?: boolean; + + /** + Do not add triple-slash references or module import targets to the list of compiled files. + + @default false + */ + noResolve?: boolean; + + /** + Disable strict checking of generic signatures in function types. + + @default false + */ + noStrictGenericChecks?: boolean; + + /** + @deprecated use `skipLibCheck` instead. + */ + skipDefaultLibCheck?: boolean; + + /** + Skip type checking of declaration files. + + Requires TypeScript version 2.0 or later. + + @default false + */ + skipLibCheck?: boolean; + + /** + Concatenate and emit output to single file. 
+ */ + outFile?: string; + + /** + Redirect output structure to the directory. + */ + outDir?: string; + + /** + Do not erase const enum declarations in generated code. + + @default false + */ + preserveConstEnums?: boolean; + + /** + Do not resolve symlinks to their real path; treat a symlinked file like a real one. + + @default false + */ + preserveSymlinks?: boolean; + + /** + Keep outdated console output in watch mode instead of clearing the screen. + + @default false + */ + preserveWatchOutput?: boolean; + + /** + Stylize errors and messages using color and context (experimental). + + @default true // Unless piping to another program or redirecting output to a file. + */ + pretty?: boolean; + + /** + Do not emit comments to output. + + @default false + */ + removeComments?: boolean; + + /** + Specifies the root directory of input files. + + Use to control the output directory structure with `--outDir`. + */ + rootDir?: string; + + /** + Unconditionally emit imports for unresolved files. + + @default false + */ + isolatedModules?: boolean; + + /** + Generates corresponding '.map' file. + + @default false + */ + sourceMap?: boolean; + + /** + Specifies the location where debugger should locate TypeScript files instead of source locations. + */ + sourceRoot?: string; + + /** + Suppress excess property checks for object literals. + + @default false + */ + suppressExcessPropertyErrors?: boolean; + + /** + Suppress noImplicitAny errors for indexing objects lacking index signatures. + + @default false + */ + suppressImplicitAnyIndexErrors?: boolean; + + /** + Do not emit declarations for code that has an `@internal` annotation. + */ + stripInternal?: boolean; + + /** + Specify ECMAScript target version. + + @default 'es3' + */ + target?: CompilerOptions.Target; + + /** + Watch input files. + + @default false + */ + watch?: boolean; + + /** + Enables experimental support for ES7 decorators. + + @default false + */ + experimentalDecorators?: boolean; + + /** + Emit design-type metadata for decorated declarations in source. + + @default false + */ + emitDecoratorMetadata?: boolean; + + /** + Specifies module resolution strategy: 'node' (Node) or 'classic' (TypeScript pre 1.6). + + @default ['AMD', 'System', 'ES6'].includes(module) ? 'classic' : 'node' + */ + moduleResolution?: 'classic' | 'node'; + + /** + Do not report errors on unused labels. + + @default false + */ + allowUnusedLabels?: boolean; + + /** + Report error when not all code paths in function return a value. + + @default false + */ + noImplicitReturns?: boolean; + + /** + Report errors for fallthrough cases in switch statement. + + @default false + */ + noFallthroughCasesInSwitch?: boolean; + + /** + Do not report errors on unreachable code. + + @default false + */ + allowUnreachableCode?: boolean; + + /** + Disallow inconsistently-cased references to the same file. + + @default false + */ + forceConsistentCasingInFileNames?: boolean; + + /** + Base directory to resolve non-relative module names. + */ + baseUrl?: string; + + /** + Specify path mapping to be computed relative to baseUrl option. + */ + paths?: Record; + + /** + List of TypeScript language server plugins to load. + + Requires TypeScript version 2.3 or later. + */ + plugins?: CompilerOptions.Plugin[]; + + /** + Specify list of root directories to be used when resolving modules. + */ + rootDirs?: string[]; + + /** + Specify list of directories for type definition files to be included. + + Requires TypeScript version 2.0 or later. 
+ */ + typeRoots?: string[]; + + /** + Type declaration files to be included in compilation. + + Requires TypeScript version 2.0 or later. + */ + types?: string[]; + + /** + Enable tracing of the name resolution process. + + @default false + */ + traceResolution?: boolean; + + /** + Allow javascript files to be compiled. + + @default false + */ + allowJs?: boolean; + + /** + Do not truncate error messages. + + @default false + */ + noErrorTruncation?: boolean; + + /** + Allow default imports from modules with no default export. This does not affect code emit, just typechecking. + + @default module === 'system' || esModuleInterop + */ + allowSyntheticDefaultImports?: boolean; + + /** + Do not emit `'use strict'` directives in module output. + + @default false + */ + noImplicitUseStrict?: boolean; + + /** + Enable to list all emitted files. + + Requires TypeScript version 2.0 or later. + + @default false + */ + listEmittedFiles?: boolean; + + /** + Disable size limit for JavaScript project. + + Requires TypeScript version 2.0 or later. + + @default false + */ + disableSizeLimit?: boolean; + + /** + List of library files to be included in the compilation. + + Requires TypeScript version 2.0 or later. + */ + lib?: CompilerOptions.Lib[]; + + /** + Enable strict null checks. + + Requires TypeScript version 2.0 or later. + + @default false + */ + strictNullChecks?: boolean; + + /** + The maximum dependency depth to search under `node_modules` and load JavaScript files. Only applicable with `--allowJs`. + + @default 0 + */ + maxNodeModuleJsDepth?: number; + + /** + Import emit helpers (e.g. `__extends`, `__rest`, etc..) from tslib. + + Requires TypeScript version 2.1 or later. + + @default false + */ + importHelpers?: boolean; + + /** + Specify the JSX factory function to use when targeting React JSX emit, e.g. `React.createElement` or `h`. + + Requires TypeScript version 2.1 or later. + + @default 'React.createElement' + */ + jsxFactory?: string; + + /** + Parse in strict mode and emit `'use strict'` for each source file. + + Requires TypeScript version 2.1 or later. + + @default false + */ + alwaysStrict?: boolean; + + /** + Enable all strict type checking options. + + Requires TypeScript version 2.3 or later. + + @default false + */ + strict?: boolean; + + /** + Enable stricter checking of of the `bind`, `call`, and `apply` methods on functions. + + @default false + */ + strictBindCallApply?: boolean; + + /** + Provide full support for iterables in `for-of`, spread, and destructuring when targeting `ES5` or `ES3`. + + Requires TypeScript version 2.3 or later. + + @default false + */ + downlevelIteration?: boolean; + + /** + Report errors in `.js` files. + + Requires TypeScript version 2.3 or later. + + @default false + */ + checkJs?: boolean; + + /** + Disable bivariant parameter checking for function types. + + Requires TypeScript version 2.6 or later. + + @default false + */ + strictFunctionTypes?: boolean; + + /** + Ensure non-undefined class properties are initialized in the constructor. + + Requires TypeScript version 2.7 or later. + + @default false + */ + strictPropertyInitialization?: boolean; + + /** + Emit `__importStar` and `__importDefault` helpers for runtime Babel ecosystem compatibility and enable `--allowSyntheticDefaultImports` for typesystem compatibility. + + Requires TypeScript version 2.7 or later. + + @default false + */ + esModuleInterop?: boolean; + + /** + Allow accessing UMD globals from modules. 
+ + @default false + */ + allowUmdGlobalAccess?: boolean; + + /** + Resolve `keyof` to string valued property names only (no numbers or symbols). + + Requires TypeScript version 2.9 or later. + + @default false + */ + keyofStringsOnly?: boolean; + + /** + Emit ECMAScript standard class fields. + + Requires TypeScript version 3.7 or later. + + @default false + */ + useDefineForClassFields?: boolean; + + /** + Generates a sourcemap for each corresponding `.d.ts` file. + + Requires TypeScript version 2.9 or later. + + @default false + */ + declarationMap?: boolean; + + /** + Include modules imported with `.json` extension. + + Requires TypeScript version 2.9 or later. + + @default false + */ + resolveJsonModule?: boolean; + } + + /** + Auto type (.d.ts) acquisition options for this project. + + Requires TypeScript version 2.1 or later. + */ + export interface TypeAcquisition { + /** + Enable auto type acquisition. + */ + enable?: boolean; + + /** + Specifies a list of type declarations to be included in auto type acquisition. For example, `['jquery', 'lodash']`. + */ + include?: string[]; + + /** + Specifies a list of type declarations to be excluded from auto type acquisition. For example, `['jquery', 'lodash']`. + */ + exclude?: string[]; + } + + export interface References { + /** + A normalized path on disk. + */ + path: string; + + /** + The path as the user originally wrote it. + */ + originalPath?: string; + + /** + True if the output of this reference should be prepended to the output of this project. + + Only valid for `--outFile` compilations. + */ + prepend?: boolean; + + /** + True if it is intended that this reference form a circularity. + */ + circular?: boolean; + } +} + +export interface TsConfigJson { + /** + Instructs the TypeScript compiler how to compile `.ts` files. + */ + compilerOptions?: TsConfigJson.CompilerOptions; + + /** + Auto type (.d.ts) acquisition options for this project. + + Requires TypeScript version 2.1 or later. + */ + typeAcquisition?: TsConfigJson.TypeAcquisition; + + /** + Enable Compile-on-Save for this project. + */ + compileOnSave?: boolean; + + /** + Path to base configuration file to inherit from. + + Requires TypeScript version 2.1 or later. + */ + extends?: string; + + /** + If no `files` or `include` property is present in a `tsconfig.json`, the compiler defaults to including all files in the containing directory and subdirectories except those specified by `exclude`. When a `files` property is specified, only those files and those specified by `include` are included. + */ + files?: string[]; + + /** + Specifies a list of files to be excluded from compilation. The `exclude` property only affects the files included via the `include` property and not the `files` property. + + Glob patterns require TypeScript version 2.0 or later. + */ + exclude?: string[]; + + /** + Specifies a list of glob patterns that match files to be included in compilation. + + If no `files` or `include` property is present in a `tsconfig.json`, the compiler defaults to including all files in the containing directory and subdirectories except those specified by `exclude`. + + Requires TypeScript version 2.0 or later. + */ + include?: string[]; + + /** + Referenced projects. + + Requires TypeScript version 3.0 or later. 
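Purely for illustration (the paths and options below are placeholders), a composite project that references a sibling library could be expressed with this type as:

```ts
import {TsConfigJson} from 'type-fest';

// Hypothetical tsconfig for an app that builds against a local library project.
const tsconfig: TsConfigJson = {
	compilerOptions: {
		composite: true,
		outDir: 'dist'
	},
	references: [
		{path: '../shared-lib'}
	],
	include: ['src']
};
```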
+ */ + references?: TsConfigJson.References[]; +} diff --git a/node_modules/type-fest/source/union-to-intersection.d.ts b/node_modules/type-fest/source/union-to-intersection.d.ts new file mode 100644 index 000000000..5f9837f3b --- /dev/null +++ b/node_modules/type-fest/source/union-to-intersection.d.ts @@ -0,0 +1,58 @@ +/** +Convert a union type to an intersection type using [distributive conditional types](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-8.html#distributive-conditional-types). + +Inspired by [this Stack Overflow answer](https://stackoverflow.com/a/50375286/2172153). + +@example +``` +import {UnionToIntersection} from 'type-fest'; + +type Union = {the(): void} | {great(arg: string): void} | {escape: boolean}; + +type Intersection = UnionToIntersection; +//=> {the(): void; great(arg: string): void; escape: boolean}; +``` + +A more applicable example which could make its way into your library code follows. + +@example +``` +import {UnionToIntersection} from 'type-fest'; + +class CommandOne { + commands: { + a1: () => undefined, + b1: () => undefined, + } +} + +class CommandTwo { + commands: { + a2: (argA: string) => undefined, + b2: (argB: string) => undefined, + } +} + +const union = [new CommandOne(), new CommandTwo()].map(instance => instance.commands); +type Union = typeof union; +//=> {a1(): void; b1(): void} | {a2(argA: string): void; b2(argB: string): void} + +type Intersection = UnionToIntersection; +//=> {a1(): void; b1(): void; a2(argA: string): void; b2(argB: string): void} +``` +*/ +export type UnionToIntersection = ( + // `extends unknown` is always going to be the case and is used to convert the + // `Union` into a [distributive conditional + // type](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-8.html#distributive-conditional-types). + Union extends unknown + // The union type is used as the only argument to a function since the union + // of function arguments is an intersection. + ? (distributedUnion: Union) => void + // This won't happen. + : never + // Infer the `Intersection` type since TypeScript represents the positional + // arguments of unions of functions as an intersection of the union. + ) extends ((mergedIntersection: infer Intersection) => void) + ? Intersection + : never; diff --git a/node_modules/type-fest/source/utilities.d.ts b/node_modules/type-fest/source/utilities.d.ts new file mode 100644 index 000000000..0bd75e667 --- /dev/null +++ b/node_modules/type-fest/source/utilities.d.ts @@ -0,0 +1,3 @@ +export type UpperCaseCharacters = 'A' | 'B' | 'C' | 'D' | 'E' | 'F' | 'G' | 'H' | 'I' | 'J' | 'K' | 'L' | 'M' | 'N' | 'O' | 'P' | 'Q' | 'R' | 'S' | 'T' | 'U' | 'V' | 'W' | 'X' | 'Y' | 'Z'; + +export type WordSeparators = '-' | '_' | ' '; diff --git a/node_modules/type-fest/source/value-of.d.ts b/node_modules/type-fest/source/value-of.d.ts new file mode 100644 index 000000000..12793733b --- /dev/null +++ b/node_modules/type-fest/source/value-of.d.ts @@ -0,0 +1,40 @@ +/** +Create a union of the given object's values, and optionally specify which keys to get the values from. + +Please upvote [this issue](https://github.com/microsoft/TypeScript/issues/31438) if you want to have this type as a built-in in TypeScript. 
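A minimal sketch, assuming the two-parameter form `ValueOf<ObjectType, Keys>` with `Keys` defaulting to `keyof ObjectType`; the `data` table is invented for illustration:

```ts
import {ValueOf} from 'type-fest';

// Hypothetical lookup table used only in this sketch.
const data = {foo: 1, bar: 2, biz: 3} as const;

// Union of every value: 1 | 2 | 3
type AnyValue = ValueOf<typeof data>;

// Values of the selected keys only: 2 | 3
type BarOrBiz = ValueOf<typeof data, 'bar' | 'biz'>;

const a: AnyValue = 1;
const b: BarOrBiz = 3;
```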
+ +@example +``` +// data.json +{ + 'foo': 1, + 'bar': 2, + 'biz': 3 +} + +// main.ts +import {ValueOf} from 'type-fest'; +import data = require('./data.json'); + +export function getData(name: string): ValueOf { + return data[name]; +} + +export function onlyBar(name: string): ValueOf { + return data[name]; +} + +// file.ts +import {getData, onlyBar} from './main'; + +getData('foo'); +//=> 1 + +onlyBar('foo'); +//=> TypeError ... + +onlyBar('bar'); +//=> 2 +``` +*/ +export type ValueOf = ObjectType[ValueType]; diff --git a/node_modules/type-fest/ts41/camel-case.d.ts b/node_modules/type-fest/ts41/camel-case.d.ts new file mode 100644 index 000000000..4476fd302 --- /dev/null +++ b/node_modules/type-fest/ts41/camel-case.d.ts @@ -0,0 +1,72 @@ +import {WordSeparators} from '../source/utilities'; + +/** +Recursively split a string literal into two parts on the first occurence of the given string, returning an array literal of all the separate parts. +*/ +export type Split = + string extends S ? string[] : + S extends '' ? [] : + S extends `${infer T}${D}${infer U}` ? [T, ...Split] : + [S]; + +/** +Step by step takes the first item in an array literal, formats it and adds it to a string literal, and then recursively appends the remainder. + +Only to be used by `CamelCaseStringArray<>`. + +@see CamelCaseStringArray +*/ +type InnerCamelCaseStringArray = + Parts extends [`${infer FirstPart}`, ...infer RemainingParts] + ? FirstPart extends undefined + ? '' + : FirstPart extends '' + ? InnerCamelCaseStringArray + : `${PreviousPart extends '' ? FirstPart : Capitalize}${InnerCamelCaseStringArray}` + : ''; + +/** +Starts fusing the output of `Split<>`, an array literal of strings, into a camel-cased string literal. + +It's separate from `InnerCamelCaseStringArray<>` to keep a clean API outwards to the rest of the code. + +@see Split +*/ +type CamelCaseStringArray = + Parts extends [`${infer FirstPart}`, ...infer RemainingParts] + ? Uncapitalize<`${FirstPart}${InnerCamelCaseStringArray}`> + : never; + +/** +Convert a string literal to camel-case. + +This can be useful when, for example, converting some kebab-cased command-line flags or a snake-cased database result. + +@example +``` +import {CamelCase} from 'type-fest'; + +// Simple + +const someVariable: CamelCase<'foo-bar'> = 'fooBar'; + +// Advanced + +type CamelCasedProps = { + [K in keyof T as CamelCase]: T[K] +}; + +interface RawOptions { + 'dry-run': boolean; + 'full_family_name': string; + foo: number; +} + +const dbResult: CamelCasedProps = { + dryRun: true, + fullFamilyName: 'bar.js', + foo: 123 +}; +``` +*/ +export type CamelCase = K extends string ? CamelCaseStringArray> : K; diff --git a/node_modules/type-fest/ts41/delimiter-case.d.ts b/node_modules/type-fest/ts41/delimiter-case.d.ts new file mode 100644 index 000000000..52f4eb992 --- /dev/null +++ b/node_modules/type-fest/ts41/delimiter-case.d.ts @@ -0,0 +1,85 @@ +import {UpperCaseCharacters, WordSeparators} from '../source/utilities'; + +/** +Unlike a simpler split, this one includes the delimiter splitted on in the resulting array literal. This is to enable splitting on, for example, upper-case characters. +*/ +export type SplitIncludingDelimiters = + Source extends '' ? [] : + Source extends `${infer FirstPart}${Delimiter}${infer SecondPart}` ? + ( + Source extends `${FirstPart}${infer UsedDelimiter}${SecondPart}` + ? UsedDelimiter extends Delimiter + ? Source extends `${infer FirstPart}${UsedDelimiter}${infer SecondPart}` + ? 
[...SplitIncludingDelimiters, UsedDelimiter, ...SplitIncludingDelimiters] + : never + : never + : never + ) : + [Source]; + +/** +Format a specific part of the splitted string literal that `StringArrayToDelimiterCase<>` fuses together, ensuring desired casing. + +@see StringArrayToDelimiterCase +*/ +type StringPartToDelimiterCase = + StringPart extends UsedWordSeparators ? Delimiter : + StringPart extends UsedUpperCaseCharacters ? `${Delimiter}${Lowercase}` : + StringPart; + +/** +Takes the result of a splitted string literal and recursively concatenates it together into the desired casing. + +It receives `UsedWordSeparators` and `UsedUpperCaseCharacters` as input to ensure it's fully encapsulated. + +@see SplitIncludingDelimiters +*/ +type StringArrayToDelimiterCase = + Parts extends [`${infer FirstPart}`, ...infer RemainingParts] + ? `${StringPartToDelimiterCase}${StringArrayToDelimiterCase}` + : ''; + +/** +Convert a string literal to a custom string delimiter casing. + +This can be useful when, for example, converting a camel-cased object property to an oddly cased one. + +@see KebabCase +@see SnakeCase + +@example +``` +import {DelimiterCase} from 'type-fest'; + +// Simple + +const someVariable: DelimiterCase<'fooBar', '#'> = 'foo#bar'; + +// Advanced + +type OddlyCasedProps = { + [K in keyof T as DelimiterCase]: T[K] +}; + +interface SomeOptions { + dryRun: boolean; + includeFile: string; + foo: number; +} + +const rawCliOptions: OddlyCasedProps = { + 'dry#run': true, + 'include#file': 'bar.js', + foo: 123 +}; +``` +*/ + +export type DelimiterCase = Value extends string + ? StringArrayToDelimiterCase< + SplitIncludingDelimiters, + WordSeparators, + UpperCaseCharacters, + Delimiter + > + : Value; diff --git a/node_modules/type-fest/ts41/index.d.ts b/node_modules/type-fest/ts41/index.d.ts new file mode 100644 index 000000000..fbaec82e2 --- /dev/null +++ b/node_modules/type-fest/ts41/index.d.ts @@ -0,0 +1,9 @@ +// These are all the basic types that's compatible with all supported TypeScript versions. +export * from '../base'; + +// These are special types that require at least TypeScript 4.1. +export {CamelCase} from './camel-case'; +export {KebabCase} from './kebab-case'; +export {PascalCase} from './pascal-case'; +export {SnakeCase} from './snake-case'; +export {DelimiterCase} from './delimiter-case'; diff --git a/node_modules/type-fest/ts41/kebab-case.d.ts b/node_modules/type-fest/ts41/kebab-case.d.ts new file mode 100644 index 000000000..ba6a99d1f --- /dev/null +++ b/node_modules/type-fest/ts41/kebab-case.d.ts @@ -0,0 +1,36 @@ +import {DelimiterCase} from './delimiter-case'; + +/** +Convert a string literal to kebab-case. + +This can be useful when, for example, converting a camel-cased object property to a kebab-cased CSS class name or a command-line flag. 
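A compilable sketch of mapping camel-cased option names onto kebab-cased CLI flags; the `CliOptions` shape and the `KebabCasedProps` helper are assumptions for illustration (key remapping requires TypeScript 4.1 or later):

```ts
import {KebabCase} from 'type-fest';

// Hypothetical options interface used only in this sketch.
interface CliOptions {
	dryRun: boolean;
	includeFile: string;
}

// Remap each property name to its kebab-cased form.
type KebabCasedProps<T> = {
	[K in keyof T as KebabCase<K>]: T[K];
};

const rawCliOptions: KebabCasedProps<CliOptions> = {
	'dry-run': true,
	'include-file': 'bar.js'
};
```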
+ +@example +``` +import {KebabCase} from 'type-fest'; + +// Simple + +const someVariable: KebabCase<'fooBar'> = 'foo-bar'; + +// Advanced + +type KebabCasedProps = { + [K in keyof T as KebabCase]: T[K] +}; + +interface CliOptions { + dryRun: boolean; + includeFile: string; + foo: number; +} + +const rawCliOptions: KebabCasedProps = { + 'dry-run': true, + 'include-file': 'bar.js', + foo: 123 +}; +``` +*/ + +export type KebabCase = DelimiterCase; diff --git a/node_modules/type-fest/ts41/pascal-case.d.ts b/node_modules/type-fest/ts41/pascal-case.d.ts new file mode 100644 index 000000000..bfb2a3627 --- /dev/null +++ b/node_modules/type-fest/ts41/pascal-case.d.ts @@ -0,0 +1,36 @@ +import {CamelCase} from './camel-case'; + +/** +Converts a string literal to pascal-case. + +@example +``` +import {PascalCase} from 'type-fest'; + +// Simple + +const someVariable: PascalCase<'foo-bar'> = 'FooBar'; + +// Advanced + +type PascalCaseProps = { + [K in keyof T as PascalCase]: T[K] +}; + +interface RawOptions { + 'dry-run': boolean; + 'full_family_name': string; + foo: number; +} + +const dbResult: CamelCasedProps = { + DryRun: true, + FullFamilyName: 'bar.js', + Foo: 123 +}; +``` +*/ + +export type PascalCase = CamelCase extends string + ? Capitalize> + : CamelCase; diff --git a/node_modules/type-fest/ts41/snake-case.d.ts b/node_modules/type-fest/ts41/snake-case.d.ts new file mode 100644 index 000000000..272b3d359 --- /dev/null +++ b/node_modules/type-fest/ts41/snake-case.d.ts @@ -0,0 +1,35 @@ +import {DelimiterCase} from './delimiter-case'; + +/** +Convert a string literal to snake-case. + +This can be useful when, for example, converting a camel-cased object property to a snake-cased SQL column name. + +@example +``` +import {SnakeCase} from 'type-fest'; + +// Simple + +const someVariable: SnakeCase<'fooBar'> = 'foo_bar'; + +// Advanced + +type SnakeCasedProps = { + [K in keyof T as SnakeCase]: T[K] +}; + +interface ModelProps { + isHappy: boolean; + fullFamilyName: string; + foo: number; +} + +const dbResult: SnakeCasedProps = { + 'is_happy': true, + 'full_family_name': 'Carla Smith', + foo: 123 +}; +``` +*/ +export type SnakeCase = DelimiterCase; diff --git a/node_modules/type-is/HISTORY.md b/node_modules/type-is/HISTORY.md new file mode 100644 index 000000000..8de21f7ae --- /dev/null +++ b/node_modules/type-is/HISTORY.md @@ -0,0 +1,259 @@ +1.6.18 / 2019-04-26 +=================== + + * Fix regression passing request object to `typeis.is` + +1.6.17 / 2019-04-25 +=================== + + * deps: mime-types@~2.1.24 + - Add Apple file extensions from IANA + - Add extension `.csl` to `application/vnd.citationstyles.style+xml` + - Add extension `.es` to `application/ecmascript` + - Add extension `.nq` to `application/n-quads` + - Add extension `.nt` to `application/n-triples` + - Add extension `.owl` to `application/rdf+xml` + - Add extensions `.siv` and `.sieve` to `application/sieve` + - Add extensions from IANA for `image/*` types + - Add extensions from IANA for `model/*` types + - Add extensions to HEIC image types + - Add new mime types + - Add `text/mdx` with extension `.mdx` + * perf: prevent internal `throw` on invalid type + +1.6.16 / 2018-02-16 +=================== + + * deps: mime-types@~2.1.18 + - Add `application/raml+yaml` with extension `.raml` + - Add `application/wasm` with extension `.wasm` + - Add `text/shex` with extension `.shex` + - Add extensions for JPEG-2000 images + - Add extensions from IANA for `message/*` types + - Add extension `.mjs` to `application/javascript` + - Add 
extension `.wadl` to `application/vnd.sun.wadl+xml` + - Add extension `.gz` to `application/gzip` + - Add glTF types and extensions + - Add new mime types + - Update extensions `.md` and `.markdown` to be `text/markdown` + - Update font MIME types + - Update `text/hjson` to registered `application/hjson` + +1.6.15 / 2017-03-31 +=================== + + * deps: mime-types@~2.1.15 + - Add new mime types + +1.6.14 / 2016-11-18 +=================== + + * deps: mime-types@~2.1.13 + - Add new mime types + +1.6.13 / 2016-05-18 +=================== + + * deps: mime-types@~2.1.11 + - Add new mime types + +1.6.12 / 2016-02-28 +=================== + + * deps: mime-types@~2.1.10 + - Add new mime types + - Fix extension of `application/dash+xml` + - Update primary extension for `audio/mp4` + +1.6.11 / 2016-01-29 +=================== + + * deps: mime-types@~2.1.9 + - Add new mime types + +1.6.10 / 2015-12-01 +=================== + + * deps: mime-types@~2.1.8 + - Add new mime types + +1.6.9 / 2015-09-27 +================== + + * deps: mime-types@~2.1.7 + - Add new mime types + +1.6.8 / 2015-09-04 +================== + + * deps: mime-types@~2.1.6 + - Add new mime types + +1.6.7 / 2015-08-20 +================== + + * Fix type error when given invalid type to match against + * deps: mime-types@~2.1.5 + - Add new mime types + +1.6.6 / 2015-07-31 +================== + + * deps: mime-types@~2.1.4 + - Add new mime types + +1.6.5 / 2015-07-16 +================== + + * deps: mime-types@~2.1.3 + - Add new mime types + +1.6.4 / 2015-07-01 +================== + + * deps: mime-types@~2.1.2 + - Add new mime types + * perf: enable strict mode + * perf: remove argument reassignment + +1.6.3 / 2015-06-08 +================== + + * deps: mime-types@~2.1.1 + - Add new mime types + * perf: reduce try block size + * perf: remove bitwise operations + +1.6.2 / 2015-05-10 +================== + + * deps: mime-types@~2.0.11 + - Add new mime types + +1.6.1 / 2015-03-13 +================== + + * deps: mime-types@~2.0.10 + - Add new mime types + +1.6.0 / 2015-02-12 +================== + + * fix false-positives in `hasBody` `Transfer-Encoding` check + * support wildcard for both type and subtype (`*/*`) + +1.5.7 / 2015-02-09 +================== + + * fix argument reassignment + * deps: mime-types@~2.0.9 + - Add new mime types + +1.5.6 / 2015-01-29 +================== + + * deps: mime-types@~2.0.8 + - Add new mime types + +1.5.5 / 2014-12-30 +================== + + * deps: mime-types@~2.0.7 + - Add new mime types + - Fix missing extensions + - Fix various invalid MIME type entries + - Remove example template MIME types + - deps: mime-db@~1.5.0 + +1.5.4 / 2014-12-10 +================== + + * deps: mime-types@~2.0.4 + - Add new mime types + - deps: mime-db@~1.3.0 + +1.5.3 / 2014-11-09 +================== + + * deps: mime-types@~2.0.3 + - Add new mime types + - deps: mime-db@~1.2.0 + +1.5.2 / 2014-09-28 +================== + + * deps: mime-types@~2.0.2 + - Add new mime types + - deps: mime-db@~1.1.0 + +1.5.1 / 2014-09-07 +================== + + * Support Node.js 0.6 + * deps: media-typer@0.3.0 + * deps: mime-types@~2.0.1 + - Support Node.js 0.6 + +1.5.0 / 2014-09-05 +================== + + * fix `hasbody` to be true for `content-length: 0` + +1.4.0 / 2014-09-02 +================== + + * update mime-types + +1.3.2 / 2014-06-24 +================== + + * use `~` range on mime-types + +1.3.1 / 2014-06-19 +================== + + * fix global variable leak + +1.3.0 / 2014-06-19 +================== + + * improve type parsing + + - invalid media 
type never matches + - media type not case-sensitive + - extra LWS does not affect results + +1.2.2 / 2014-06-19 +================== + + * fix behavior on unknown type argument + +1.2.1 / 2014-06-03 +================== + + * switch dependency from `mime` to `mime-types@1.0.0` + +1.2.0 / 2014-05-11 +================== + + * support suffix matching: + + - `+json` matches `application/vnd+json` + - `*/vnd+json` matches `application/vnd+json` + - `application/*+json` matches `application/vnd+json` + +1.1.0 / 2014-04-12 +================== + + * add non-array values support + * expose internal utilities: + + - `.is()` + - `.hasBody()` + - `.normalize()` + - `.match()` + +1.0.1 / 2014-03-30 +================== + + * add `multipart` as a shorthand diff --git a/node_modules/type-is/LICENSE b/node_modules/type-is/LICENSE new file mode 100644 index 000000000..386b7b694 --- /dev/null +++ b/node_modules/type-is/LICENSE @@ -0,0 +1,23 @@ +(The MIT License) + +Copyright (c) 2014 Jonathan Ong +Copyright (c) 2014-2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/type-is/README.md b/node_modules/type-is/README.md new file mode 100644 index 000000000..b85ef8f78 --- /dev/null +++ b/node_modules/type-is/README.md @@ -0,0 +1,170 @@ +# type-is + +[![NPM Version][npm-version-image]][npm-url] +[![NPM Downloads][npm-downloads-image]][npm-url] +[![Node.js Version][node-version-image]][node-version-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Infer the content-type of a request. + +### Install + +This is a [Node.js](https://nodejs.org/en/) module available through the +[npm registry](https://www.npmjs.com/). Installation is done using the +[`npm install` command](https://docs.npmjs.com/getting-started/installing-npm-packages-locally): + +```sh +$ npm install type-is +``` + +## API + +```js +var http = require('http') +var typeis = require('type-is') + +http.createServer(function (req, res) { + var istext = typeis(req, ['text/*']) + res.end('you ' + (istext ? 'sent' : 'did not send') + ' me text') +}) +``` + +### typeis(request, types) + +Checks if the `request` is one of the `types`. If the request has no body, +even if there is a `Content-Type` header, then `null` is returned. If the +`Content-Type` header is invalid or does not matches any of the `types`, then +`false` is returned. Otherwise, a string of the type that matched is returned. + +The `request` argument is expected to be a Node.js HTTP request. 
The `types` +argument is an array of type strings. + +Each type in the `types` array can be one of the following: + +- A file extension name such as `json`. This name will be returned if matched. +- A mime type such as `application/json`. +- A mime type with a wildcard such as `*/*` or `*/json` or `application/*`. + The full mime type will be returned if matched. +- A suffix such as `+json`. This can be combined with a wildcard such as + `*/vnd+json` or `application/*+json`. The full mime type will be returned + if matched. + +Some examples to illustrate the inputs and returned value: + + + +```js +// req.headers.content-type = 'application/json' + +typeis(req, ['json']) // => 'json' +typeis(req, ['html', 'json']) // => 'json' +typeis(req, ['application/*']) // => 'application/json' +typeis(req, ['application/json']) // => 'application/json' + +typeis(req, ['html']) // => false +``` + +### typeis.hasBody(request) + +Returns a Boolean if the given `request` has a body, regardless of the +`Content-Type` header. + +Having a body has no relation to how large the body is (it may be 0 bytes). +This is similar to how file existence works. If a body does exist, then this +indicates that there is data to read from the Node.js request stream. + + + +```js +if (typeis.hasBody(req)) { + // read the body, since there is one + + req.on('data', function (chunk) { + // ... + }) +} +``` + +### typeis.is(mediaType, types) + +Checks if the `mediaType` is one of the `types`. If the `mediaType` is invalid +or does not matches any of the `types`, then `false` is returned. Otherwise, a +string of the type that matched is returned. + +The `mediaType` argument is expected to be a +[media type](https://tools.ietf.org/html/rfc6838) string. The `types` argument +is an array of type strings. + +Each type in the `types` array can be one of the following: + +- A file extension name such as `json`. This name will be returned if matched. +- A mime type such as `application/json`. +- A mime type with a wildcard such as `*/*` or `*/json` or `application/*`. + The full mime type will be returned if matched. +- A suffix such as `+json`. This can be combined with a wildcard such as + `*/vnd+json` or `application/*+json`. The full mime type will be returned + if matched. 
+ +Some examples to illustrate the inputs and returned value: + + + +```js +var mediaType = 'application/json' + +typeis.is(mediaType, ['json']) // => 'json' +typeis.is(mediaType, ['html', 'json']) // => 'json' +typeis.is(mediaType, ['application/*']) // => 'application/json' +typeis.is(mediaType, ['application/json']) // => 'application/json' + +typeis.is(mediaType, ['html']) // => false +``` + +## Examples + +### Example body parser + +```js +var express = require('express') +var typeis = require('type-is') + +var app = express() + +app.use(function bodyParser (req, res, next) { + if (!typeis.hasBody(req)) { + return next() + } + + switch (typeis(req, ['urlencoded', 'json', 'multipart'])) { + case 'urlencoded': + // parse urlencoded body + throw new Error('implement urlencoded body parsing') + case 'json': + // parse json body + throw new Error('implement json body parsing') + case 'multipart': + // parse multipart body + throw new Error('implement multipart body parsing') + default: + // 415 error code + res.statusCode = 415 + res.end() + break + } +}) +``` + +## License + +[MIT](LICENSE) + +[coveralls-image]: https://badgen.net/coveralls/c/github/jshttp/type-is/master +[coveralls-url]: https://coveralls.io/r/jshttp/type-is?branch=master +[node-version-image]: https://badgen.net/npm/node/type-is +[node-version-url]: https://nodejs.org/en/download +[npm-downloads-image]: https://badgen.net/npm/dm/type-is +[npm-url]: https://npmjs.org/package/type-is +[npm-version-image]: https://badgen.net/npm/v/type-is +[travis-image]: https://badgen.net/travis/jshttp/type-is/master +[travis-url]: https://travis-ci.org/jshttp/type-is diff --git a/node_modules/type-is/index.js b/node_modules/type-is/index.js new file mode 100644 index 000000000..890ad76c7 --- /dev/null +++ b/node_modules/type-is/index.js @@ -0,0 +1,266 @@ +/*! + * type-is + * Copyright(c) 2014 Jonathan Ong + * Copyright(c) 2014-2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module dependencies. + * @private + */ + +var typer = require('media-typer') +var mime = require('mime-types') + +/** + * Module exports. + * @public + */ + +module.exports = typeofrequest +module.exports.is = typeis +module.exports.hasBody = hasbody +module.exports.normalize = normalize +module.exports.match = mimeMatch + +/** + * Compare a `value` content-type with `types`. + * Each `type` can be an extension like `html`, + * a special shortcut like `multipart` or `urlencoded`, + * or a mime type. + * + * If no types match, `false` is returned. + * Otherwise, the first `type` that matches is returned. + * + * @param {String} value + * @param {Array} types + * @public + */ + +function typeis (value, types_) { + var i + var types = types_ + + // remove parameters and normalize + var val = tryNormalizeType(value) + + // no type or invalid + if (!val) { + return false + } + + // support flattened arguments + if (types && !Array.isArray(types)) { + types = new Array(arguments.length - 1) + for (i = 0; i < types.length; i++) { + types[i] = arguments[i + 1] + } + } + + // no types, return the content type + if (!types || !types.length) { + return val + } + + var type + for (i = 0; i < types.length; i++) { + if (mimeMatch(normalize(type = types[i]), val)) { + return type[0] === '+' || type.indexOf('*') !== -1 + ? val + : type + } + } + + // no matches + return false +} + +/** + * Check if a request has a request body. + * A request with a body __must__ either have `transfer-encoding` + * or `content-length` headers set. 
+ * http://www.w3.org/Protocols/rfc2616/rfc2616-sec4.html#sec4.3 + * + * @param {Object} request + * @return {Boolean} + * @public + */ + +function hasbody (req) { + return req.headers['transfer-encoding'] !== undefined || + !isNaN(req.headers['content-length']) +} + +/** + * Check if the incoming request contains the "Content-Type" + * header field, and it contains any of the give mime `type`s. + * If there is no request body, `null` is returned. + * If there is no content type, `false` is returned. + * Otherwise, it returns the first `type` that matches. + * + * Examples: + * + * // With Content-Type: text/html; charset=utf-8 + * this.is('html'); // => 'html' + * this.is('text/html'); // => 'text/html' + * this.is('text/*', 'application/json'); // => 'text/html' + * + * // When Content-Type is application/json + * this.is('json', 'urlencoded'); // => 'json' + * this.is('application/json'); // => 'application/json' + * this.is('html', 'application/*'); // => 'application/json' + * + * this.is('html'); // => false + * + * @param {String|Array} types... + * @return {String|false|null} + * @public + */ + +function typeofrequest (req, types_) { + var types = types_ + + // no body + if (!hasbody(req)) { + return null + } + + // support flattened arguments + if (arguments.length > 2) { + types = new Array(arguments.length - 1) + for (var i = 0; i < types.length; i++) { + types[i] = arguments[i + 1] + } + } + + // request content type + var value = req.headers['content-type'] + + return typeis(value, types) +} + +/** + * Normalize a mime type. + * If it's a shorthand, expand it to a valid mime type. + * + * In general, you probably want: + * + * var type = is(req, ['urlencoded', 'json', 'multipart']); + * + * Then use the appropriate body parsers. + * These three are the most common request body types + * and are thus ensured to work. + * + * @param {String} type + * @private + */ + +function normalize (type) { + if (typeof type !== 'string') { + // invalid type + return false + } + + switch (type) { + case 'urlencoded': + return 'application/x-www-form-urlencoded' + case 'multipart': + return 'multipart/*' + } + + if (type[0] === '+') { + // "+json" -> "*/*+json" expando + return '*/*' + type + } + + return type.indexOf('/') === -1 + ? mime.lookup(type) + : type +} + +/** + * Check if `expected` mime type + * matches `actual` mime type with + * wildcard and +suffix support. + * + * @param {String} expected + * @param {String} actual + * @return {Boolean} + * @private + */ + +function mimeMatch (expected, actual) { + // invalid type + if (expected === false) { + return false + } + + // split types + var actualParts = actual.split('/') + var expectedParts = expected.split('/') + + // invalid format + if (actualParts.length !== 2 || expectedParts.length !== 2) { + return false + } + + // validate type + if (expectedParts[0] !== '*' && expectedParts[0] !== actualParts[0]) { + return false + } + + // validate suffix wildcard + if (expectedParts[1].substr(0, 2) === '*+') { + return expectedParts[1].length <= actualParts[1].length + 1 && + expectedParts[1].substr(1) === actualParts[1].substr(1 - expectedParts[1].length) + } + + // validate subtype + if (expectedParts[1] !== '*' && expectedParts[1] !== actualParts[1]) { + return false + } + + return true +} + +/** + * Normalize a type and remove parameters. 
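+ * (e.g. "text/html; charset=utf-8" becomes "text/html")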
+ * + * @param {string} value + * @return {string} + * @private + */ + +function normalizeType (value) { + // parse the type + var type = typer.parse(value) + + // remove the parameters + type.parameters = undefined + + // reformat it + return typer.format(type) +} + +/** + * Try to normalize a type and remove parameters. + * + * @param {string} value + * @return {string} + * @private + */ + +function tryNormalizeType (value) { + if (!value) { + return null + } + + try { + return normalizeType(value) + } catch (err) { + return null + } +} diff --git a/node_modules/type-is/package.json b/node_modules/type-is/package.json new file mode 100644 index 000000000..cb01789c7 --- /dev/null +++ b/node_modules/type-is/package.json @@ -0,0 +1,85 @@ +{ + "_from": "type-is@~1.6.18", + "_id": "type-is@1.6.18", + "_inBundle": false, + "_integrity": "sha512-TkRKr9sUTxEH8MdfuCSP7VizJyzRNMjj2J2do2Jr3Kym598JVdEksuzPQCnlFPW4ky9Q+iA+ma9BGm06XQBy8g==", + "_location": "/type-is", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "type-is@~1.6.18", + "name": "type-is", + "escapedName": "type-is", + "rawSpec": "~1.6.18", + "saveSpec": null, + "fetchSpec": "~1.6.18" + }, + "_requiredBy": [ + "/body-parser", + "/express" + ], + "_resolved": "https://registry.npmjs.org/type-is/-/type-is-1.6.18.tgz", + "_shasum": "4e552cd05df09467dcbc4ef739de89f2cf37c131", + "_spec": "type-is@~1.6.18", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\express", + "bugs": { + "url": "https://github.com/jshttp/type-is/issues" + }, + "bundleDependencies": false, + "contributors": [ + { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + { + "name": "Jonathan Ong", + "email": "me@jongleberry.com", + "url": "http://jongleberry.com" + } + ], + "dependencies": { + "media-typer": "0.3.0", + "mime-types": "~2.1.24" + }, + "deprecated": false, + "description": "Infer the content-type of a request.", + "devDependencies": { + "eslint": "5.16.0", + "eslint-config-standard": "12.0.0", + "eslint-plugin-import": "2.17.2", + "eslint-plugin-markdown": "1.0.0", + "eslint-plugin-node": "8.0.1", + "eslint-plugin-promise": "4.1.1", + "eslint-plugin-standard": "4.0.0", + "mocha": "6.1.4", + "nyc": "14.0.0" + }, + "engines": { + "node": ">= 0.6" + }, + "files": [ + "LICENSE", + "HISTORY.md", + "index.js" + ], + "homepage": "https://github.com/jshttp/type-is#readme", + "keywords": [ + "content", + "type", + "checking" + ], + "license": "MIT", + "name": "type-is", + "repository": { + "type": "git", + "url": "git+https://github.com/jshttp/type-is.git" + }, + "scripts": { + "lint": "eslint --plugin markdown --ext js,md .", + "test": "mocha --reporter spec --check-leaks --bail test/", + "test-cov": "nyc --reporter=html --reporter=text npm test", + "test-travis": "nyc --reporter=text npm test" + }, + "version": "1.6.18" +} diff --git a/node_modules/typedarray-to-buffer/.airtap.yml b/node_modules/typedarray-to-buffer/.airtap.yml new file mode 100644 index 000000000..341778025 --- /dev/null +++ b/node_modules/typedarray-to-buffer/.airtap.yml @@ -0,0 +1,15 @@ +sauce_connect: true +loopback: airtap.local +browsers: + - name: chrome + version: latest + - name: firefox + version: latest + - name: safari + version: latest + - name: microsoftedge + version: latest + - name: ie + version: latest + - name: iphone + version: latest diff --git a/node_modules/typedarray-to-buffer/.travis.yml b/node_modules/typedarray-to-buffer/.travis.yml new file mode 
100644 index 000000000..f25afbd2f --- /dev/null +++ b/node_modules/typedarray-to-buffer/.travis.yml @@ -0,0 +1,11 @@ +language: node_js +node_js: + - lts/* +addons: + sauce_connect: true + hosts: + - airtap.local +env: + global: + - secure: i51rE9rZGHbcZWlL58j3H1qtL23OIV2r0X4TcQKNI3pw2mubdHFJmfPNNO19ItfReu8wwQMxOehKamwaNvqMiKWyHfn/QcThFQysqzgGZ6AgnUbYx9od6XFNDeWd1sVBf7QBAL07y7KWlYGWCwFwWjabSVySzQhEBdisPcskfkI= + - secure: BKq6/5z9LK3KDkTjs7BGeBZ1KsWgz+MsAXZ4P64NSeVGFaBdXU45+ww1mwxXFt5l22/mhyOQZfebQl+kGVqRSZ+DEgQeCymkNZ6CD8c6w6cLuOJXiXwuu/cDM2DD0tfGeu2YZC7yEikP7BqEFwH3D324rRzSGLF2RSAAwkOI7bE= diff --git a/node_modules/typedarray-to-buffer/LICENSE b/node_modules/typedarray-to-buffer/LICENSE new file mode 100644 index 000000000..0c068ceec --- /dev/null +++ b/node_modules/typedarray-to-buffer/LICENSE @@ -0,0 +1,21 @@ +The MIT License (MIT) + +Copyright (c) Feross Aboukhadijeh + +Permission is hereby granted, free of charge, to any person obtaining a copy +of this software and associated documentation files (the "Software"), to deal +in the Software without restriction, including without limitation the rights +to use, copy, modify, merge, publish, distribute, sublicense, and/or sell +copies of the Software, and to permit persons to whom the Software is +furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in +all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, +FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE +AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER +LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, +OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN +THE SOFTWARE. diff --git a/node_modules/typedarray-to-buffer/README.md b/node_modules/typedarray-to-buffer/README.md new file mode 100644 index 000000000..35761fb5f --- /dev/null +++ b/node_modules/typedarray-to-buffer/README.md @@ -0,0 +1,85 @@ +# typedarray-to-buffer [![travis][travis-image]][travis-url] [![npm][npm-image]][npm-url] [![downloads][downloads-image]][downloads-url] [![javascript style guide][standard-image]][standard-url] + +[travis-image]: https://img.shields.io/travis/feross/typedarray-to-buffer/master.svg +[travis-url]: https://travis-ci.org/feross/typedarray-to-buffer +[npm-image]: https://img.shields.io/npm/v/typedarray-to-buffer.svg +[npm-url]: https://npmjs.org/package/typedarray-to-buffer +[downloads-image]: https://img.shields.io/npm/dm/typedarray-to-buffer.svg +[downloads-url]: https://npmjs.org/package/typedarray-to-buffer +[standard-image]: https://img.shields.io/badge/code_style-standard-brightgreen.svg +[standard-url]: https://standardjs.com + +#### Convert a typed array to a [Buffer](https://github.com/feross/buffer) without a copy. + +[![saucelabs][saucelabs-image]][saucelabs-url] + +[saucelabs-image]: https://saucelabs.com/browser-matrix/typedarray-to-buffer.svg +[saucelabs-url]: https://saucelabs.com/u/typedarray-to-buffer + +Say you're using the ['buffer'](https://github.com/feross/buffer) module on npm, or +[browserify](http://browserify.org/) and you're working with lots of binary data. + +Unfortunately, sometimes the browser or someone else's API gives you a typed array like +`Uint8Array` to work with and you need to convert it to a `Buffer`. What do you do? 
+ +Of course: `Buffer.from(uint8array)` + +But, alas, every time you do `Buffer.from(uint8array)` **the entire array gets copied**. +The `Buffer` constructor does a copy; this is +defined by the [node docs](http://nodejs.org/api/buffer.html) and the 'buffer' module +matches the node API exactly. + +So, how can we avoid this expensive copy in +[performance critical applications](https://github.com/feross/buffer/issues/22)? + +***Simply use this module, of course!*** + +If you have an `ArrayBuffer`, you don't need this module, because +`Buffer.from(arrayBuffer)` +[is already efficient](https://nodejs.org/api/buffer.html#buffer_class_method_buffer_from_arraybuffer_byteoffset_length). + +## install + +```bash +npm install typedarray-to-buffer +``` + +## usage + +To convert a typed array to a `Buffer` **without a copy**, do this: + +```js +var toBuffer = require('typedarray-to-buffer') + +var arr = new Uint8Array([1, 2, 3]) +arr = toBuffer(arr) + +// arr is a buffer now! + +arr.toString() // '\u0001\u0002\u0003' +arr.readUInt16BE(0) // 258 +``` + +## how it works + +If the browser supports typed arrays, then `toBuffer` will **augment the typed array** you +pass in with the `Buffer` methods and return it. See [how does Buffer +work?](https://github.com/feross/buffer#how-does-it-work) for more about how augmentation +works. + +This module uses the typed array's underlying `ArrayBuffer` to back the new `Buffer`. This +respects the "view" on the `ArrayBuffer`, i.e. `byteOffset` and `byteLength`. In other +words, if you do `toBuffer(new Uint32Array([1, 2, 3]))`, then the new `Buffer` will +contain `[1, 0, 0, 0, 2, 0, 0, 0, 3, 0, 0, 0]`, **not** `[1, 2, 3]`. And it still doesn't +require a copy. + +If the browser doesn't support typed arrays, then `toBuffer` will create a new `Buffer` +object, copy the data into it, and return it. There's no simple performance optimization +we can do for old browsers. Oh well. + +If this module is used in node, then it will just call `Buffer.from`. This is just for +the convenience of modules that work in both node and the browser. + +## license + +MIT. Copyright (C) [Feross Aboukhadijeh](http://feross.org). diff --git a/node_modules/typedarray-to-buffer/index.js b/node_modules/typedarray-to-buffer/index.js new file mode 100644 index 000000000..5fa394dd2 --- /dev/null +++ b/node_modules/typedarray-to-buffer/index.js @@ -0,0 +1,25 @@ +/** + * Convert a typed array to a Buffer without a copy + * + * Author: Feross Aboukhadijeh + * License: MIT + * + * `npm install typedarray-to-buffer` + */ + +var isTypedArray = require('is-typedarray').strict + +module.exports = function typedarrayToBuffer (arr) { + if (isTypedArray(arr)) { + // To avoid a copy, use the typed array's underlying ArrayBuffer to back new Buffer + var buf = Buffer.from(arr.buffer) + if (arr.byteLength !== arr.buffer.byteLength) { + // Respect the "view", i.e. 
byteOffset and byteLength, without doing a copy + buf = buf.slice(arr.byteOffset, arr.byteOffset + arr.byteLength) + } + return buf + } else { + // Pass through all other types to `Buffer.from` + return Buffer.from(arr) + } +} diff --git a/node_modules/typedarray-to-buffer/package.json b/node_modules/typedarray-to-buffer/package.json new file mode 100644 index 000000000..7ae776370 --- /dev/null +++ b/node_modules/typedarray-to-buffer/package.json @@ -0,0 +1,75 @@ +{ + "_from": "typedarray-to-buffer@^3.1.5", + "_id": "typedarray-to-buffer@3.1.5", + "_inBundle": false, + "_integrity": "sha512-zdu8XMNEDepKKR+XYOXAVPtWui0ly0NtohUscw+UmaHiAWT8hrV1rr//H6V+0DvJ3OQ19S979M0laLfX8rm82Q==", + "_location": "/typedarray-to-buffer", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "typedarray-to-buffer@^3.1.5", + "name": "typedarray-to-buffer", + "escapedName": "typedarray-to-buffer", + "rawSpec": "^3.1.5", + "saveSpec": null, + "fetchSpec": "^3.1.5" + }, + "_requiredBy": [ + "/write-file-atomic" + ], + "_resolved": "https://registry.npmjs.org/typedarray-to-buffer/-/typedarray-to-buffer-3.1.5.tgz", + "_shasum": "a97ee7a9ff42691b9f783ff1bc5112fe3fca9080", + "_spec": "typedarray-to-buffer@^3.1.5", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\write-file-atomic", + "author": { + "name": "Feross Aboukhadijeh", + "email": "feross@feross.org", + "url": "http://feross.org/" + }, + "bugs": { + "url": "https://github.com/feross/typedarray-to-buffer/issues" + }, + "bundleDependencies": false, + "dependencies": { + "is-typedarray": "^1.0.0" + }, + "deprecated": false, + "description": "Convert a typed array to a Buffer without a copy", + "devDependencies": { + "airtap": "0.0.4", + "standard": "*", + "tape": "^4.0.0" + }, + "homepage": "http://feross.org", + "keywords": [ + "buffer", + "typed array", + "convert", + "no copy", + "uint8array", + "uint16array", + "uint32array", + "int16array", + "int32array", + "float32array", + "float64array", + "browser", + "arraybuffer", + "dataview" + ], + "license": "MIT", + "main": "index.js", + "name": "typedarray-to-buffer", + "repository": { + "type": "git", + "url": "git://github.com/feross/typedarray-to-buffer.git" + }, + "scripts": { + "test": "standard && npm run test-node && npm run test-browser", + "test-browser": "airtap -- test/*.js", + "test-browser-local": "airtap --local -- test/*.js", + "test-node": "tape test/*.js" + }, + "version": "3.1.5" +} diff --git a/node_modules/typedarray-to-buffer/test/basic.js b/node_modules/typedarray-to-buffer/test/basic.js new file mode 100644 index 000000000..352109682 --- /dev/null +++ b/node_modules/typedarray-to-buffer/test/basic.js @@ -0,0 +1,50 @@ +var test = require('tape') +var toBuffer = require('../') + +test('convert to buffer from Uint8Array', function (t) { + if (typeof Uint8Array !== 'undefined') { + var arr = new Uint8Array([1, 2, 3]) + arr = toBuffer(arr) + + t.deepEqual(arr, Buffer.from([1, 2, 3]), 'contents equal') + t.ok(Buffer.isBuffer(arr), 'is buffer') + t.equal(arr.readUInt8(0), 1) + t.equal(arr.readUInt8(1), 2) + t.equal(arr.readUInt8(2), 3) + } else { + t.pass('browser lacks Uint8Array support, skip test') + } + t.end() +}) + +test('convert to buffer from another arrayview type (Uint32Array)', function (t) { + if (typeof Uint32Array !== 'undefined' && Buffer.TYPED_ARRAY_SUPPORT !== false) { + var arr = new Uint32Array([1, 2, 3]) + arr = toBuffer(arr) + + t.deepEqual(arr, Buffer.from([1, 0, 0, 0, 2, 0, 0, 0, 3, 0, 
0, 0]), 'contents equal') + t.ok(Buffer.isBuffer(arr), 'is buffer') + t.equal(arr.readUInt32LE(0), 1) + t.equal(arr.readUInt32LE(4), 2) + t.equal(arr.readUInt32LE(8), 3) + t.equal(arr instanceof Uint8Array, true) + } else { + t.pass('browser lacks Uint32Array support, skip test') + } + t.end() +}) + +test('convert to buffer from ArrayBuffer', function (t) { + if (typeof Uint32Array !== 'undefined' && Buffer.TYPED_ARRAY_SUPPORT !== false) { + var arr = new Uint32Array([1, 2, 3]).subarray(1, 2) + arr = toBuffer(arr) + + t.deepEqual(arr, Buffer.from([2, 0, 0, 0]), 'contents equal') + t.ok(Buffer.isBuffer(arr), 'is buffer') + t.equal(arr.readUInt32LE(0), 2) + t.equal(arr instanceof Uint8Array, true) + } else { + t.pass('browser lacks ArrayBuffer support, skip test') + } + t.end() +}) diff --git a/node_modules/undefsafe/.jscsrc b/node_modules/undefsafe/.jscsrc new file mode 100644 index 000000000..9e01c9beb --- /dev/null +++ b/node_modules/undefsafe/.jscsrc @@ -0,0 +1,13 @@ +{ + "preset": "node-style-guide", + "requireCapitalizedComments": null, + "requireSpacesInAnonymousFunctionExpression": { + "beforeOpeningCurlyBrace": true, + "beforeOpeningRoundBrace": true + }, + "disallowSpacesInNamedFunctionExpression": { + "beforeOpeningRoundBrace": true + }, + "excludeFiles": ["node_modules/**"], + "disallowSpacesInFunction": null +} diff --git a/node_modules/undefsafe/.jshintrc b/node_modules/undefsafe/.jshintrc new file mode 100644 index 000000000..b47f672fd --- /dev/null +++ b/node_modules/undefsafe/.jshintrc @@ -0,0 +1,16 @@ +{ + "browser": false, + "camelcase": true, + "curly": true, + "devel": true, + "eqeqeq": true, + "forin": true, + "indent": 2, + "noarg": true, + "node": true, + "quotmark": "single", + "undef": true, + "strict": false, + "unused": true +} + diff --git a/node_modules/undefsafe/.npmignore b/node_modules/undefsafe/.npmignore new file mode 100644 index 000000000..5f6a848de --- /dev/null +++ b/node_modules/undefsafe/.npmignore @@ -0,0 +1,2 @@ +# .npmignore file +test/ diff --git a/node_modules/undefsafe/.travis.yml b/node_modules/undefsafe/.travis.yml new file mode 100644 index 000000000..a1ace24a7 --- /dev/null +++ b/node_modules/undefsafe/.travis.yml @@ -0,0 +1,18 @@ +sudo: false +language: node_js +cache: + directories: + - node_modules +notifications: + email: false +node_js: + - '4' +before_install: + - npm i -g npm@^2.0.0 +before_script: + - npm prune +after_success: + - npm run semantic-release +branches: + except: + - "/^v\\d+\\.\\d+\\.\\d+$/" diff --git a/node_modules/undefsafe/LICENSE b/node_modules/undefsafe/LICENSE new file mode 100644 index 000000000..caaf03ae2 --- /dev/null +++ b/node_modules/undefsafe/LICENSE @@ -0,0 +1,22 @@ +The MIT License (MIT) + +Copyright © 2016 Remy Sharp, http://remysharp.com + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +"Software"), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. 
+ +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND +NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE +LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION +OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION +WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/undefsafe/README.md b/node_modules/undefsafe/README.md new file mode 100644 index 000000000..46a706bc0 --- /dev/null +++ b/node_modules/undefsafe/README.md @@ -0,0 +1,63 @@ +# undefsafe + +Simple *function* for retrieving deep object properties without getting "Cannot read property 'X' of undefined" + +Can also be used to safely set deep values. + +## Usage + +```js +var object = { + a: { + b: { + c: 1, + d: [1,2,3], + e: 'remy' + } + } +}; + +console.log(undefsafe(object, 'a.b.e')); // "remy" +console.log(undefsafe(object, 'a.b.not.found')); // undefined +``` + +Demo: [https://jsbin.com/eroqame/3/edit?js,console](https://jsbin.com/eroqame/3/edit?js,console) + +## Setting + +```js +var object = { + a: { + b: [1,2,3] + } +}; + +// modified object +var res = undefsafe(object, 'a.b.0', 10); + +console.log(object); // { a: { b: [10, 2, 3] } } +console.log(res); // 1 - previous value +``` + +## Star rules in paths + +As of 1.2.0, `undefsafe` supports a `*` in the path if you want to search all of the properties (or array elements) for a particular element. + +The function will only return a single result, either the 3rd argument validation value, or the first positive match. For example, the following github data: + +```js +const githubData = { + commits: [{ + modified: [ + "one", + "two" + ] + }, /* ... */ ] + }; + +// first modified file found in the first commit +console.log(undefsafe(githubData, 'commits.*.modified.0')); + +// returns `two` or undefined if not found +console.log(undefsafe(githubData, 'commits.*.modified.*', 'two')); +``` diff --git a/node_modules/undefsafe/example.js b/node_modules/undefsafe/example.js new file mode 100644 index 000000000..ed93c23bb --- /dev/null +++ b/node_modules/undefsafe/example.js @@ -0,0 +1,14 @@ +var undefsafe = require('undefsafe'); + +var object = { + a: { + b: { + c: 1, + d: [1, 2, 3], + e: 'remy' + } + } +}; + +console.log(undefsafe(object, 'a.b.e')); // "remy" +console.log(undefsafe(object, 'a.b.not.found')); // undefined diff --git a/node_modules/undefsafe/lib/undefsafe.js b/node_modules/undefsafe/lib/undefsafe.js new file mode 100644 index 000000000..744687805 --- /dev/null +++ b/node_modules/undefsafe/lib/undefsafe.js @@ -0,0 +1,125 @@ +'use strict'; + +function undefsafe(obj, path, value, __res) { + // I'm not super keen on this private function, but it's because + // it'll also be use in the browser and I wont *one* function exposed + function split(path) { + var res = []; + var level = 0; + var key = ''; + + for (var i = 0; i < path.length; i++) { + var c = path.substr(i, 1); + + if (level === 0 && (c === '.' 
|| c === '[')) { + if (c === '[') { + level++; + i++; + c = path.substr(i, 1); + } + + if (key) { + // the first value could be a string + res.push(key); + } + key = ''; + continue; + } + + if (c === ']') { + level--; + key = key.slice(0, -1); + continue; + } + + key += c; + } + + res.push(key); + + return res; + } + + // bail if there's nothing + if (obj === undefined || obj === null) { + return undefined; + } + + var parts = split(path); + var key = null; + var type = typeof obj; + var root = obj; + var parent = obj; + + var star = + parts.filter(function(_) { + return _ === '*'; + }).length > 0; + + // we're dealing with a primitive + if (type !== 'object' && type !== 'function') { + return obj; + } else if (path.trim() === '') { + return obj; + } + + key = parts[0]; + var i = 0; + for (; i < parts.length; i++) { + key = parts[i]; + parent = obj; + + if (key === '*') { + // loop through each property + var prop = ''; + var res = __res || []; + + for (prop in parent) { + var shallowObj = undefsafe( + obj[prop], + parts.slice(i + 1).join('.'), + value, + res + ); + if (shallowObj && shallowObj !== res) { + if ((value && shallowObj === value) || value === undefined) { + if (value !== undefined) { + return shallowObj; + } + + res.push(shallowObj); + } + } + } + + if (res.length === 0) { + return undefined; + } + + return res; + } + + if (Object.getOwnPropertyNames(obj).indexOf(key) == -1) { + return undefined; + } + + obj = obj[key]; + if (obj === undefined || obj === null) { + break; + } + } + + // if we have a null object, make sure it's the one the user was after, + // if it's not (i.e. parts has a length) then give undefined back. + if (obj === null && i !== parts.length - 1) { + obj = undefined; + } else if (!star && value) { + key = path.split('.').pop(); + parent[key] = value; + } + return obj; +} + +if (typeof module !== 'undefined') { + module.exports = undefsafe; +} diff --git a/node_modules/undefsafe/package.json b/node_modules/undefsafe/package.json new file mode 100644 index 000000000..87a150bff --- /dev/null +++ b/node_modules/undefsafe/package.json @@ -0,0 +1,67 @@ +{ + "_from": "undefsafe@^2.0.3", + "_id": "undefsafe@2.0.3", + "_inBundle": false, + "_integrity": "sha512-nrXZwwXrD/T/JXeygJqdCO6NZZ1L66HrxM/Z7mIq2oPanoN0F1nLx3lwJMu6AwJY69hdixaFQOuoYsMjE5/C2A==", + "_location": "/undefsafe", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "undefsafe@^2.0.3", + "name": "undefsafe", + "escapedName": "undefsafe", + "rawSpec": "^2.0.3", + "saveSpec": null, + "fetchSpec": "^2.0.3" + }, + "_requiredBy": [ + "/nodemon" + ], + "_resolved": "https://registry.npmjs.org/undefsafe/-/undefsafe-2.0.3.tgz", + "_shasum": "6b166e7094ad46313b2202da7ecc2cd7cc6e7aae", + "_spec": "undefsafe@^2.0.3", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\nodemon", + "author": { + "name": "Remy Sharp" + }, + "bugs": { + "url": "https://github.com/remy/undefsafe/issues" + }, + "bundleDependencies": false, + "dependencies": { + "debug": "^2.2.0" + }, + "deprecated": false, + "description": "Undefined safe way of extracting object properties", + "devDependencies": { + "semantic-release": "^4.3.5", + "tap": "^5.7.1", + "tap-only": "0.0.5" + }, + "directories": { + "test": "test" + }, + "homepage": "https://github.com/remy/undefsafe#readme", + "keywords": [ + "undefined" + ], + "license": "MIT", + "main": "lib/undefsafe.js", + "name": "undefsafe", + "prettier": { + "trailingComma": "none", + "singleQuote": true + }, + 
"repository": { + "type": "git", + "url": "git+https://github.com/remy/undefsafe.git" + }, + "scripts": { + "cover": "tap test/*.test.js --cov --coverage-report=lcov", + "semantic-release": "semantic-release pre && npm publish && semantic-release post", + "test": "tap test/**/*.test.js -R spec" + }, + "tonicExampleFilename": "example.js", + "version": "2.0.3" +} diff --git a/node_modules/unique-string/index.d.ts b/node_modules/unique-string/index.d.ts new file mode 100644 index 000000000..08959a680 --- /dev/null +++ b/node_modules/unique-string/index.d.ts @@ -0,0 +1,16 @@ +/** +Generate a unique random string. + +@returns A 32 character unique string. Matches the length of MD5, which is [unique enough](https://stackoverflow.com/a/2444336/64949) for non-crypto purposes. + +@example +``` +import uniqueString = require('unique-string'); + +uniqueString(); +//=> 'b4de2a49c8ffa3fbee04446f045483b2' +``` +*/ +declare function uniqueString(): string; + +export = uniqueString; diff --git a/node_modules/unique-string/index.js b/node_modules/unique-string/index.js new file mode 100644 index 000000000..5bc7787f4 --- /dev/null +++ b/node_modules/unique-string/index.js @@ -0,0 +1,4 @@ +'use strict'; +const cryptoRandomString = require('crypto-random-string'); + +module.exports = () => cryptoRandomString(32); diff --git a/node_modules/unique-string/license b/node_modules/unique-string/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/unique-string/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
diff --git a/node_modules/unique-string/package.json b/node_modules/unique-string/package.json new file mode 100644 index 000000000..d6f297250 --- /dev/null +++ b/node_modules/unique-string/package.json @@ -0,0 +1,72 @@ +{ + "_from": "unique-string@^2.0.0", + "_id": "unique-string@2.0.0", + "_inBundle": false, + "_integrity": "sha512-uNaeirEPvpZWSgzwsPGtU2zVSTrn/8L5q/IexZmH0eH6SA73CmAA5U4GwORTxQAZs95TAXLNqeLoPPNO5gZfWg==", + "_location": "/unique-string", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "unique-string@^2.0.0", + "name": "unique-string", + "escapedName": "unique-string", + "rawSpec": "^2.0.0", + "saveSpec": null, + "fetchSpec": "^2.0.0" + }, + "_requiredBy": [ + "/configstore" + ], + "_resolved": "https://registry.npmjs.org/unique-string/-/unique-string-2.0.0.tgz", + "_shasum": "39c6451f81afb2749de2b233e3f7c5e8843bd89d", + "_spec": "unique-string@^2.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\configstore", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/sindresorhus/unique-string/issues" + }, + "bundleDependencies": false, + "dependencies": { + "crypto-random-string": "^2.0.0" + }, + "deprecated": false, + "description": "Generate a unique random string", + "devDependencies": { + "ava": "^1.4.1", + "tsd": "^0.7.2", + "xo": "^0.24.0" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "homepage": "https://github.com/sindresorhus/unique-string#readme", + "keywords": [ + "unique", + "string", + "random", + "text", + "id", + "identifier", + "slug", + "hex" + ], + "license": "MIT", + "name": "unique-string", + "repository": { + "type": "git", + "url": "git+https://github.com/sindresorhus/unique-string.git" + }, + "scripts": { + "test": "xo && ava && tsd" + }, + "version": "2.0.0" +} diff --git a/node_modules/unique-string/readme.md b/node_modules/unique-string/readme.md new file mode 100644 index 000000000..9213f11ea --- /dev/null +++ b/node_modules/unique-string/readme.md @@ -0,0 +1,32 @@ +# unique-string [![Build Status](https://travis-ci.org/sindresorhus/unique-string.svg?branch=master)](https://travis-ci.org/sindresorhus/unique-string) + +> Generate a unique random string + + +## Install + +``` +$ npm install unique-string +``` + + +## Usage + +```js +const uniqueString = require('unique-string'); + +uniqueString(); +//=> 'b4de2a49c8ffa3fbee04446f045483b2' +``` + + +## API + +### uniqueString() + +Returns a 32 character unique string. Matches the length of MD5, which is [unique enough](https://stackoverflow.com/a/2444336/64949) for non-crypto purposes. 
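+
+A quick illustration of that guarantee (a minimal sketch; the values are random, so the printed string will differ on every run):
+
+```js
+const uniqueString = require('unique-string');
+
+const id = uniqueString();
+
+console.log(id.length);
+//=> 32 (lowercase hex characters)
+
+console.log(uniqueString() === id);
+//=> false (each call returns a fresh random string)
+```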
+ + +## License + +MIT © [Sindre Sorhus](https://sindresorhus.com) diff --git a/node_modules/unpipe/HISTORY.md b/node_modules/unpipe/HISTORY.md new file mode 100644 index 000000000..85e0f8d74 --- /dev/null +++ b/node_modules/unpipe/HISTORY.md @@ -0,0 +1,4 @@ +1.0.0 / 2015-06-14 +================== + + * Initial release diff --git a/node_modules/unpipe/LICENSE b/node_modules/unpipe/LICENSE new file mode 100644 index 000000000..aed013827 --- /dev/null +++ b/node_modules/unpipe/LICENSE @@ -0,0 +1,22 @@ +(The MIT License) + +Copyright (c) 2015 Douglas Christopher Wilson + +Permission is hereby granted, free of charge, to any person obtaining +a copy of this software and associated documentation files (the +'Software'), to deal in the Software without restriction, including +without limitation the rights to use, copy, modify, merge, publish, +distribute, sublicense, and/or sell copies of the Software, and to +permit persons to whom the Software is furnished to do so, subject to +the following conditions: + +The above copyright notice and this permission notice shall be +included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED 'AS IS', WITHOUT WARRANTY OF ANY KIND, +EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF +MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. +IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY +CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, +TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE +SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/unpipe/README.md b/node_modules/unpipe/README.md new file mode 100644 index 000000000..e536ad2c0 --- /dev/null +++ b/node_modules/unpipe/README.md @@ -0,0 +1,43 @@ +# unpipe + +[![NPM Version][npm-image]][npm-url] +[![NPM Downloads][downloads-image]][downloads-url] +[![Node.js Version][node-image]][node-url] +[![Build Status][travis-image]][travis-url] +[![Test Coverage][coveralls-image]][coveralls-url] + +Unpipe a stream from all destinations. + +## Installation + +```sh +$ npm install unpipe +``` + +## API + +```js +var unpipe = require('unpipe') +``` + +### unpipe(stream) + +Unpipes all destinations from a given stream. With stream 2+, this is +equivalent to `stream.unpipe()`. When used with streams 1 style streams +(typically Node.js 0.8 and below), this module attempts to undo the +actions done in `stream.pipe(dest)`. + +## License + +[MIT](LICENSE) + +[npm-image]: https://img.shields.io/npm/v/unpipe.svg +[npm-url]: https://npmjs.org/package/unpipe +[node-image]: https://img.shields.io/node/v/unpipe.svg +[node-url]: http://nodejs.org/download/ +[travis-image]: https://img.shields.io/travis/stream-utils/unpipe.svg +[travis-url]: https://travis-ci.org/stream-utils/unpipe +[coveralls-image]: https://img.shields.io/coveralls/stream-utils/unpipe.svg +[coveralls-url]: https://coveralls.io/r/stream-utils/unpipe?branch=master +[downloads-image]: https://img.shields.io/npm/dm/unpipe.svg +[downloads-url]: https://npmjs.org/package/unpipe diff --git a/node_modules/unpipe/index.js b/node_modules/unpipe/index.js new file mode 100644 index 000000000..15c3d97a1 --- /dev/null +++ b/node_modules/unpipe/index.js @@ -0,0 +1,69 @@ +/*! + * unpipe + * Copyright(c) 2015 Douglas Christopher Wilson + * MIT Licensed + */ + +'use strict' + +/** + * Module exports. + * @public + */ + +module.exports = unpipe + +/** + * Determine if there are Node.js pipe-like data listeners. 
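+ * (a streams1 pipe attaches a 'data' listener named "ondata"; finding one means the stream has been piped)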
+ * @private + */ + +function hasPipeDataListeners(stream) { + var listeners = stream.listeners('data') + + for (var i = 0; i < listeners.length; i++) { + if (listeners[i].name === 'ondata') { + return true + } + } + + return false +} + +/** + * Unpipe a stream from all destinations. + * + * @param {object} stream + * @public + */ + +function unpipe(stream) { + if (!stream) { + throw new TypeError('argument stream is required') + } + + if (typeof stream.unpipe === 'function') { + // new-style + stream.unpipe() + return + } + + // Node.js 0.8 hack + if (!hasPipeDataListeners(stream)) { + return + } + + var listener + var listeners = stream.listeners('close') + + for (var i = 0; i < listeners.length; i++) { + listener = listeners[i] + + if (listener.name !== 'cleanup' && listener.name !== 'onclose') { + continue + } + + // invoke the listener + listener.call(stream) + } +} diff --git a/node_modules/unpipe/package.json b/node_modules/unpipe/package.json new file mode 100644 index 000000000..0eba4b59c --- /dev/null +++ b/node_modules/unpipe/package.json @@ -0,0 +1,63 @@ +{ + "_from": "unpipe@1.0.0", + "_id": "unpipe@1.0.0", + "_inBundle": false, + "_integrity": "sha1-sr9O6FFKrmFltIF4KdIbLvSZBOw=", + "_location": "/unpipe", + "_phantomChildren": {}, + "_requested": { + "type": "version", + "registry": true, + "raw": "unpipe@1.0.0", + "name": "unpipe", + "escapedName": "unpipe", + "rawSpec": "1.0.0", + "saveSpec": null, + "fetchSpec": "1.0.0" + }, + "_requiredBy": [ + "/finalhandler", + "/raw-body" + ], + "_resolved": "https://registry.npmjs.org/unpipe/-/unpipe-1.0.0.tgz", + "_shasum": "b2bf4ee8514aae6165b4817829d21b2ef49904ec", + "_spec": "unpipe@1.0.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\raw-body", + "author": { + "name": "Douglas Christopher Wilson", + "email": "doug@somethingdoug.com" + }, + "bugs": { + "url": "https://github.com/stream-utils/unpipe/issues" + }, + "bundleDependencies": false, + "deprecated": false, + "description": "Unpipe a stream from all destinations", + "devDependencies": { + "istanbul": "0.3.15", + "mocha": "2.2.5", + "readable-stream": "1.1.13" + }, + "engines": { + "node": ">= 0.8" + }, + "files": [ + "HISTORY.md", + "LICENSE", + "README.md", + "index.js" + ], + "homepage": "https://github.com/stream-utils/unpipe#readme", + "license": "MIT", + "name": "unpipe", + "repository": { + "type": "git", + "url": "git+https://github.com/stream-utils/unpipe.git" + }, + "scripts": { + "test": "mocha --reporter spec --bail --check-leaks test/", + "test-cov": "istanbul cover node_modules/mocha/bin/_mocha -- --reporter dot --check-leaks test/", + "test-travis": "istanbul cover node_modules/mocha/bin/_mocha --report lcovonly -- --reporter spec --check-leaks test/" + }, + "version": "1.0.0" +} diff --git a/node_modules/update-notifier/check.js b/node_modules/update-notifier/check.js new file mode 100644 index 000000000..fc0ee9cb8 --- /dev/null +++ b/node_modules/update-notifier/check.js @@ -0,0 +1,28 @@ +/* eslint-disable unicorn/no-process-exit */ +'use strict'; +let updateNotifier = require('.'); + +const options = JSON.parse(process.argv[2]); + +updateNotifier = new updateNotifier.UpdateNotifier(options); + +(async () => { + // Exit process when offline + setTimeout(process.exit, 1000 * 30); + + const update = await updateNotifier.fetchInfo(); + + // Only update the last update check time on success + updateNotifier.config.set('lastUpdateCheck', Date.now()); + + if (update.type && update.type !== 'latest') { + 
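+    // Cache the release info; the next run's check() reads this value and then clears it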
updateNotifier.config.set('update', update); + } + + // Call process exit explicitly to terminate the child process, + // otherwise the child process will run forever, according to the Node.js docs + process.exit(); +})().catch(error => { + console.error(error); + process.exit(1); +}); diff --git a/node_modules/update-notifier/index.js b/node_modules/update-notifier/index.js new file mode 100644 index 000000000..5e8921646 --- /dev/null +++ b/node_modules/update-notifier/index.js @@ -0,0 +1,188 @@ +'use strict'; +const {spawn} = require('child_process'); +const path = require('path'); +const {format} = require('util'); +const importLazy = require('import-lazy')(require); + +const configstore = importLazy('configstore'); +const chalk = importLazy('chalk'); +const semver = importLazy('semver'); +const semverDiff = importLazy('semver-diff'); +const latestVersion = importLazy('latest-version'); +const isNpm = importLazy('is-npm'); +const isInstalledGlobally = importLazy('is-installed-globally'); +const isYarnGlobal = importLazy('is-yarn-global'); +const hasYarn = importLazy('has-yarn'); +const boxen = importLazy('boxen'); +const xdgBasedir = importLazy('xdg-basedir'); +const isCi = importLazy('is-ci'); +const pupa = importLazy('pupa'); + +const ONE_DAY = 1000 * 60 * 60 * 24; + +class UpdateNotifier { + constructor(options = {}) { + this.options = options; + options.pkg = options.pkg || {}; + options.distTag = options.distTag || 'latest'; + + // Reduce pkg to the essential keys. with fallback to deprecated options + // TODO: Remove deprecated options at some point far into the future + options.pkg = { + name: options.pkg.name || options.packageName, + version: options.pkg.version || options.packageVersion + }; + + if (!options.pkg.name || !options.pkg.version) { + throw new Error('pkg.name and pkg.version required'); + } + + this.packageName = options.pkg.name; + this.packageVersion = options.pkg.version; + this.updateCheckInterval = typeof options.updateCheckInterval === 'number' ? 
options.updateCheckInterval : ONE_DAY; + this.disabled = 'NO_UPDATE_NOTIFIER' in process.env || + process.env.NODE_ENV === 'test' || + process.argv.includes('--no-update-notifier') || + isCi(); + this.shouldNotifyInNpmScript = options.shouldNotifyInNpmScript; + + if (!this.disabled) { + try { + const ConfigStore = configstore(); + this.config = new ConfigStore(`update-notifier-${this.packageName}`, { + optOut: false, + // Init with the current time so the first check is only + // after the set interval, so not to bother users right away + lastUpdateCheck: Date.now() + }); + } catch { + // Expecting error code EACCES or EPERM + const message = + chalk().yellow(format(' %s update check failed ', options.pkg.name)) + + format('\n Try running with %s or get access ', chalk().cyan('sudo')) + + '\n to the local update config store via \n' + + chalk().cyan(format(' sudo chown -R $USER:$(id -gn $USER) %s ', xdgBasedir().config)); + + process.on('exit', () => { + console.error(boxen()(message, {align: 'center'})); + }); + } + } + } + + check() { + if ( + !this.config || + this.config.get('optOut') || + this.disabled + ) { + return; + } + + this.update = this.config.get('update'); + + if (this.update) { + // Use the real latest version instead of the cached one + this.update.current = this.packageVersion; + + // Clear cached information + this.config.delete('update'); + } + + // Only check for updates on a set interval + if (Date.now() - this.config.get('lastUpdateCheck') < this.updateCheckInterval) { + return; + } + + // Spawn a detached process, passing the options as an environment property + spawn(process.execPath, [path.join(__dirname, 'check.js'), JSON.stringify(this.options)], { + detached: true, + stdio: 'ignore' + }).unref(); + } + + async fetchInfo() { + const {distTag} = this.options; + const latest = await latestVersion()(this.packageName, {version: distTag}); + + return { + latest, + current: this.packageVersion, + type: semverDiff()(this.packageVersion, latest) || distTag, + name: this.packageName + }; + } + + notify(options) { + const suppressForNpm = !this.shouldNotifyInNpmScript && isNpm().isNpmOrYarn; + if (!process.stdout.isTTY || suppressForNpm || !this.update || !semver().gt(this.update.latest, this.update.current)) { + return this; + } + + options = { + isGlobal: isInstalledGlobally(), + isYarnGlobal: isYarnGlobal()(), + ...options + }; + + let installCommand; + if (options.isYarnGlobal) { + installCommand = `yarn global add ${this.packageName}`; + } else if (options.isGlobal) { + installCommand = `npm i -g ${this.packageName}`; + } else if (hasYarn()()) { + installCommand = `yarn add ${this.packageName}`; + } else { + installCommand = `npm i ${this.packageName}`; + } + + const defaultTemplate = 'Update available ' + + chalk().dim('{currentVersion}') + + chalk().reset(' → ') + + chalk().green('{latestVersion}') + + ' \nRun ' + chalk().cyan('{updateCommand}') + ' to update'; + + const template = options.message || defaultTemplate; + + options.boxenOptions = options.boxenOptions || { + padding: 1, + margin: 1, + align: 'center', + borderColor: 'yellow', + borderStyle: 'round' + }; + + const message = boxen()( + pupa()(template, { + packageName: this.packageName, + currentVersion: this.update.current, + latestVersion: this.update.latest, + updateCommand: installCommand + }), + options.boxenOptions + ); + + if (options.defer === false) { + console.error(message); + } else { + process.on('exit', () => { + console.error(message); + }); + + process.on('SIGINT', () => { + 
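+        // Print a trailing blank line, then exit explicitly so the 'exit' handler above still runs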
console.error(''); + process.exit(); + }); + } + + return this; + } +} + +module.exports = options => { + const updateNotifier = new UpdateNotifier(options); + updateNotifier.check(); + return updateNotifier; +}; + +module.exports.UpdateNotifier = UpdateNotifier; diff --git a/node_modules/update-notifier/license b/node_modules/update-notifier/license new file mode 100644 index 000000000..cea5a3552 --- /dev/null +++ b/node_modules/update-notifier/license @@ -0,0 +1,9 @@ +Copyright Google + +Redistribution and use in source and binary forms, with or without modification, are permitted provided that the following conditions are met: + +1. Redistributions of source code must retain the above copyright notice, this list of conditions and the following disclaimer. + +2. Redistributions in binary form must reproduce the above copyright notice, this list of conditions and the following disclaimer in the documentation and/or other materials provided with the distribution. + +THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. diff --git a/node_modules/update-notifier/node_modules/.bin/semver b/node_modules/update-notifier/node_modules/.bin/semver new file mode 100644 index 000000000..7e365277d --- /dev/null +++ b/node_modules/update-notifier/node_modules/.bin/semver @@ -0,0 +1,15 @@ +#!/bin/sh +basedir=$(dirname "$(echo "$0" | sed -e 's,\\,/,g')") + +case `uname` in + *CYGWIN*|*MINGW*|*MSYS*) basedir=`cygpath -w "$basedir"`;; +esac + +if [ -x "$basedir/node" ]; then + "$basedir/node" "$basedir/../semver/bin/semver.js" "$@" + ret=$? +else + node "$basedir/../semver/bin/semver.js" "$@" + ret=$? 
+fi +exit $ret diff --git a/node_modules/update-notifier/node_modules/.bin/semver.cmd b/node_modules/update-notifier/node_modules/.bin/semver.cmd new file mode 100644 index 000000000..164cdeac5 --- /dev/null +++ b/node_modules/update-notifier/node_modules/.bin/semver.cmd @@ -0,0 +1,17 @@ +@ECHO off +SETLOCAL +CALL :find_dp0 + +IF EXIST "%dp0%\node.exe" ( + SET "_prog=%dp0%\node.exe" +) ELSE ( + SET "_prog=node" + SET PATHEXT=%PATHEXT:;.JS;=;% +) + +"%_prog%" "%dp0%\..\semver\bin\semver.js" %* +ENDLOCAL +EXIT /b %errorlevel% +:find_dp0 +SET dp0=%~dp0 +EXIT /b diff --git a/node_modules/update-notifier/node_modules/.bin/semver.ps1 b/node_modules/update-notifier/node_modules/.bin/semver.ps1 new file mode 100644 index 000000000..6a85e3405 --- /dev/null +++ b/node_modules/update-notifier/node_modules/.bin/semver.ps1 @@ -0,0 +1,18 @@ +#!/usr/bin/env pwsh +$basedir=Split-Path $MyInvocation.MyCommand.Definition -Parent + +$exe="" +if ($PSVersionTable.PSVersion -lt "6.0" -or $IsWindows) { + # Fix case when both the Windows and Linux builds of Node + # are installed in the same directory + $exe=".exe" +} +$ret=0 +if (Test-Path "$basedir/node$exe") { + & "$basedir/node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} else { + & "node$exe" "$basedir/../semver/bin/semver.js" $args + $ret=$LASTEXITCODE +} +exit $ret diff --git a/node_modules/update-notifier/node_modules/ansi-styles/index.d.ts b/node_modules/update-notifier/node_modules/ansi-styles/index.d.ts new file mode 100644 index 000000000..44a907e58 --- /dev/null +++ b/node_modules/update-notifier/node_modules/ansi-styles/index.d.ts @@ -0,0 +1,345 @@ +declare type CSSColor = + | 'aliceblue' + | 'antiquewhite' + | 'aqua' + | 'aquamarine' + | 'azure' + | 'beige' + | 'bisque' + | 'black' + | 'blanchedalmond' + | 'blue' + | 'blueviolet' + | 'brown' + | 'burlywood' + | 'cadetblue' + | 'chartreuse' + | 'chocolate' + | 'coral' + | 'cornflowerblue' + | 'cornsilk' + | 'crimson' + | 'cyan' + | 'darkblue' + | 'darkcyan' + | 'darkgoldenrod' + | 'darkgray' + | 'darkgreen' + | 'darkgrey' + | 'darkkhaki' + | 'darkmagenta' + | 'darkolivegreen' + | 'darkorange' + | 'darkorchid' + | 'darkred' + | 'darksalmon' + | 'darkseagreen' + | 'darkslateblue' + | 'darkslategray' + | 'darkslategrey' + | 'darkturquoise' + | 'darkviolet' + | 'deeppink' + | 'deepskyblue' + | 'dimgray' + | 'dimgrey' + | 'dodgerblue' + | 'firebrick' + | 'floralwhite' + | 'forestgreen' + | 'fuchsia' + | 'gainsboro' + | 'ghostwhite' + | 'gold' + | 'goldenrod' + | 'gray' + | 'green' + | 'greenyellow' + | 'grey' + | 'honeydew' + | 'hotpink' + | 'indianred' + | 'indigo' + | 'ivory' + | 'khaki' + | 'lavender' + | 'lavenderblush' + | 'lawngreen' + | 'lemonchiffon' + | 'lightblue' + | 'lightcoral' + | 'lightcyan' + | 'lightgoldenrodyellow' + | 'lightgray' + | 'lightgreen' + | 'lightgrey' + | 'lightpink' + | 'lightsalmon' + | 'lightseagreen' + | 'lightskyblue' + | 'lightslategray' + | 'lightslategrey' + | 'lightsteelblue' + | 'lightyellow' + | 'lime' + | 'limegreen' + | 'linen' + | 'magenta' + | 'maroon' + | 'mediumaquamarine' + | 'mediumblue' + | 'mediumorchid' + | 'mediumpurple' + | 'mediumseagreen' + | 'mediumslateblue' + | 'mediumspringgreen' + | 'mediumturquoise' + | 'mediumvioletred' + | 'midnightblue' + | 'mintcream' + | 'mistyrose' + | 'moccasin' + | 'navajowhite' + | 'navy' + | 'oldlace' + | 'olive' + | 'olivedrab' + | 'orange' + | 'orangered' + | 'orchid' + | 'palegoldenrod' + | 'palegreen' + | 'paleturquoise' + | 'palevioletred' + | 'papayawhip' + | 'peachpuff' + | 'peru' + | 
'pink' + | 'plum' + | 'powderblue' + | 'purple' + | 'rebeccapurple' + | 'red' + | 'rosybrown' + | 'royalblue' + | 'saddlebrown' + | 'salmon' + | 'sandybrown' + | 'seagreen' + | 'seashell' + | 'sienna' + | 'silver' + | 'skyblue' + | 'slateblue' + | 'slategray' + | 'slategrey' + | 'snow' + | 'springgreen' + | 'steelblue' + | 'tan' + | 'teal' + | 'thistle' + | 'tomato' + | 'turquoise' + | 'violet' + | 'wheat' + | 'white' + | 'whitesmoke' + | 'yellow' + | 'yellowgreen'; + +declare namespace ansiStyles { + interface ColorConvert { + /** + The RGB color space. + + @param red - (`0`-`255`) + @param green - (`0`-`255`) + @param blue - (`0`-`255`) + */ + rgb(red: number, green: number, blue: number): string; + + /** + The RGB HEX color space. + + @param hex - A hexadecimal string containing RGB data. + */ + hex(hex: string): string; + + /** + @param keyword - A CSS color name. + */ + keyword(keyword: CSSColor): string; + + /** + The HSL color space. + + @param hue - (`0`-`360`) + @param saturation - (`0`-`100`) + @param lightness - (`0`-`100`) + */ + hsl(hue: number, saturation: number, lightness: number): string; + + /** + The HSV color space. + + @param hue - (`0`-`360`) + @param saturation - (`0`-`100`) + @param value - (`0`-`100`) + */ + hsv(hue: number, saturation: number, value: number): string; + + /** + The HSV color space. + + @param hue - (`0`-`360`) + @param whiteness - (`0`-`100`) + @param blackness - (`0`-`100`) + */ + hwb(hue: number, whiteness: number, blackness: number): string; + + /** + Use a [4-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#3/4-bit) to set text color. + */ + ansi(ansi: number): string; + + /** + Use an [8-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit) to set text color. + */ + ansi256(ansi: number): string; + } + + interface CSPair { + /** + The ANSI terminal control sequence for starting this style. + */ + readonly open: string; + + /** + The ANSI terminal control sequence for ending this style. + */ + readonly close: string; + } + + interface ColorBase { + readonly ansi: ColorConvert; + readonly ansi256: ColorConvert; + readonly ansi16m: ColorConvert; + + /** + The ANSI terminal control sequence for ending this color. + */ + readonly close: string; + } + + interface Modifier { + /** + Resets the current color chain. + */ + readonly reset: CSPair; + + /** + Make text bold. + */ + readonly bold: CSPair; + + /** + Emitting only a small amount of light. + */ + readonly dim: CSPair; + + /** + Make text italic. (Not widely supported) + */ + readonly italic: CSPair; + + /** + Make text underline. (Not widely supported) + */ + readonly underline: CSPair; + + /** + Inverse background and foreground colors. + */ + readonly inverse: CSPair; + + /** + Prints the text, but makes it invisible. + */ + readonly hidden: CSPair; + + /** + Puts a horizontal line through the center of the text. (Not widely supported) + */ + readonly strikethrough: CSPair; + } + + interface ForegroundColor { + readonly black: CSPair; + readonly red: CSPair; + readonly green: CSPair; + readonly yellow: CSPair; + readonly blue: CSPair; + readonly cyan: CSPair; + readonly magenta: CSPair; + readonly white: CSPair; + + /** + Alias for `blackBright`. + */ + readonly gray: CSPair; + + /** + Alias for `blackBright`. 
+ */ + readonly grey: CSPair; + + readonly blackBright: CSPair; + readonly redBright: CSPair; + readonly greenBright: CSPair; + readonly yellowBright: CSPair; + readonly blueBright: CSPair; + readonly cyanBright: CSPair; + readonly magentaBright: CSPair; + readonly whiteBright: CSPair; + } + + interface BackgroundColor { + readonly bgBlack: CSPair; + readonly bgRed: CSPair; + readonly bgGreen: CSPair; + readonly bgYellow: CSPair; + readonly bgBlue: CSPair; + readonly bgCyan: CSPair; + readonly bgMagenta: CSPair; + readonly bgWhite: CSPair; + + /** + Alias for `bgBlackBright`. + */ + readonly bgGray: CSPair; + + /** + Alias for `bgBlackBright`. + */ + readonly bgGrey: CSPair; + + readonly bgBlackBright: CSPair; + readonly bgRedBright: CSPair; + readonly bgGreenBright: CSPair; + readonly bgYellowBright: CSPair; + readonly bgBlueBright: CSPair; + readonly bgCyanBright: CSPair; + readonly bgMagentaBright: CSPair; + readonly bgWhiteBright: CSPair; + } +} + +declare const ansiStyles: { + readonly modifier: ansiStyles.Modifier; + readonly color: ansiStyles.ForegroundColor & ansiStyles.ColorBase; + readonly bgColor: ansiStyles.BackgroundColor & ansiStyles.ColorBase; + readonly codes: ReadonlyMap; +} & ansiStyles.BackgroundColor & ansiStyles.ForegroundColor & ansiStyles.Modifier; + +export = ansiStyles; diff --git a/node_modules/update-notifier/node_modules/ansi-styles/index.js b/node_modules/update-notifier/node_modules/ansi-styles/index.js new file mode 100644 index 000000000..5d82581a1 --- /dev/null +++ b/node_modules/update-notifier/node_modules/ansi-styles/index.js @@ -0,0 +1,163 @@ +'use strict'; + +const wrapAnsi16 = (fn, offset) => (...args) => { + const code = fn(...args); + return `\u001B[${code + offset}m`; +}; + +const wrapAnsi256 = (fn, offset) => (...args) => { + const code = fn(...args); + return `\u001B[${38 + offset};5;${code}m`; +}; + +const wrapAnsi16m = (fn, offset) => (...args) => { + const rgb = fn(...args); + return `\u001B[${38 + offset};2;${rgb[0]};${rgb[1]};${rgb[2]}m`; +}; + +const ansi2ansi = n => n; +const rgb2rgb = (r, g, b) => [r, g, b]; + +const setLazyProperty = (object, property, get) => { + Object.defineProperty(object, property, { + get: () => { + const value = get(); + + Object.defineProperty(object, property, { + value, + enumerable: true, + configurable: true + }); + + return value; + }, + enumerable: true, + configurable: true + }); +}; + +/** @type {typeof import('color-convert')} */ +let colorConvert; +const makeDynamicStyles = (wrap, targetSpace, identity, isBackground) => { + if (colorConvert === undefined) { + colorConvert = require('color-convert'); + } + + const offset = isBackground ? 10 : 0; + const styles = {}; + + for (const [sourceSpace, suite] of Object.entries(colorConvert)) { + const name = sourceSpace === 'ansi16' ? 
'ansi' : sourceSpace; + if (sourceSpace === targetSpace) { + styles[name] = wrap(identity, offset); + } else if (typeof suite === 'object') { + styles[name] = wrap(suite[targetSpace], offset); + } + } + + return styles; +}; + +function assembleStyles() { + const codes = new Map(); + const styles = { + modifier: { + reset: [0, 0], + // 21 isn't widely supported and 22 does the same thing + bold: [1, 22], + dim: [2, 22], + italic: [3, 23], + underline: [4, 24], + inverse: [7, 27], + hidden: [8, 28], + strikethrough: [9, 29] + }, + color: { + black: [30, 39], + red: [31, 39], + green: [32, 39], + yellow: [33, 39], + blue: [34, 39], + magenta: [35, 39], + cyan: [36, 39], + white: [37, 39], + + // Bright color + blackBright: [90, 39], + redBright: [91, 39], + greenBright: [92, 39], + yellowBright: [93, 39], + blueBright: [94, 39], + magentaBright: [95, 39], + cyanBright: [96, 39], + whiteBright: [97, 39] + }, + bgColor: { + bgBlack: [40, 49], + bgRed: [41, 49], + bgGreen: [42, 49], + bgYellow: [43, 49], + bgBlue: [44, 49], + bgMagenta: [45, 49], + bgCyan: [46, 49], + bgWhite: [47, 49], + + // Bright color + bgBlackBright: [100, 49], + bgRedBright: [101, 49], + bgGreenBright: [102, 49], + bgYellowBright: [103, 49], + bgBlueBright: [104, 49], + bgMagentaBright: [105, 49], + bgCyanBright: [106, 49], + bgWhiteBright: [107, 49] + } + }; + + // Alias bright black as gray (and grey) + styles.color.gray = styles.color.blackBright; + styles.bgColor.bgGray = styles.bgColor.bgBlackBright; + styles.color.grey = styles.color.blackBright; + styles.bgColor.bgGrey = styles.bgColor.bgBlackBright; + + for (const [groupName, group] of Object.entries(styles)) { + for (const [styleName, style] of Object.entries(group)) { + styles[styleName] = { + open: `\u001B[${style[0]}m`, + close: `\u001B[${style[1]}m` + }; + + group[styleName] = styles[styleName]; + + codes.set(style[0], style[1]); + } + + Object.defineProperty(styles, groupName, { + value: group, + enumerable: false + }); + } + + Object.defineProperty(styles, 'codes', { + value: codes, + enumerable: false + }); + + styles.color.close = '\u001B[39m'; + styles.bgColor.close = '\u001B[49m'; + + setLazyProperty(styles.color, 'ansi', () => makeDynamicStyles(wrapAnsi16, 'ansi16', ansi2ansi, false)); + setLazyProperty(styles.color, 'ansi256', () => makeDynamicStyles(wrapAnsi256, 'ansi256', ansi2ansi, false)); + setLazyProperty(styles.color, 'ansi16m', () => makeDynamicStyles(wrapAnsi16m, 'rgb', rgb2rgb, false)); + setLazyProperty(styles.bgColor, 'ansi', () => makeDynamicStyles(wrapAnsi16, 'ansi16', ansi2ansi, true)); + setLazyProperty(styles.bgColor, 'ansi256', () => makeDynamicStyles(wrapAnsi256, 'ansi256', ansi2ansi, true)); + setLazyProperty(styles.bgColor, 'ansi16m', () => makeDynamicStyles(wrapAnsi16m, 'rgb', rgb2rgb, true)); + + return styles; +} + +// Make the export immutable +Object.defineProperty(module, 'exports', { + enumerable: true, + get: assembleStyles +}); diff --git a/node_modules/update-notifier/node_modules/ansi-styles/license b/node_modules/update-notifier/node_modules/ansi-styles/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/update-notifier/node_modules/ansi-styles/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, 
merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/update-notifier/node_modules/ansi-styles/package.json b/node_modules/update-notifier/node_modules/ansi-styles/package.json new file mode 100644 index 000000000..093e29dec --- /dev/null +++ b/node_modules/update-notifier/node_modules/ansi-styles/package.json @@ -0,0 +1,88 @@ +{ + "_from": "ansi-styles@^4.1.0", + "_id": "ansi-styles@4.3.0", + "_inBundle": false, + "_integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==", + "_location": "/update-notifier/ansi-styles", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "ansi-styles@^4.1.0", + "name": "ansi-styles", + "escapedName": "ansi-styles", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/update-notifier/chalk" + ], + "_resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz", + "_shasum": "edd803628ae71c04c85ae7a0906edad34b648937", + "_spec": "ansi-styles@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier\\node_modules\\chalk", + "author": { + "name": "Sindre Sorhus", + "email": "sindresorhus@gmail.com", + "url": "sindresorhus.com" + }, + "bugs": { + "url": "https://github.com/chalk/ansi-styles/issues" + }, + "bundleDependencies": false, + "dependencies": { + "color-convert": "^2.0.1" + }, + "deprecated": false, + "description": "ANSI escape codes for styling strings in the terminal", + "devDependencies": { + "@types/color-convert": "^1.9.0", + "ava": "^2.3.0", + "svg-term-cli": "^2.1.1", + "tsd": "^0.11.0", + "xo": "^0.25.3" + }, + "engines": { + "node": ">=8" + }, + "files": [ + "index.js", + "index.d.ts" + ], + "funding": "https://github.com/chalk/ansi-styles?sponsor=1", + "homepage": "https://github.com/chalk/ansi-styles#readme", + "keywords": [ + "ansi", + "styles", + "color", + "colour", + "colors", + "terminal", + "console", + "cli", + "string", + "tty", + "escape", + "formatting", + "rgb", + "256", + "shell", + "xterm", + "log", + "logging", + "command-line", + "text" + ], + "license": "MIT", + "name": "ansi-styles", + "repository": { + "type": "git", + "url": "git+https://github.com/chalk/ansi-styles.git" + }, + "scripts": { + "screenshot": "svg-term --command='node screenshot' --out=screenshot.svg --padding=3 --width=55 --height=3 --at=1000 --no-cursor", + "test": "xo && ava && tsd" + }, + "version": "4.3.0" +} diff --git a/node_modules/update-notifier/node_modules/ansi-styles/readme.md b/node_modules/update-notifier/node_modules/ansi-styles/readme.md new file mode 100644 index 000000000..24883de80 --- /dev/null +++ b/node_modules/update-notifier/node_modules/ansi-styles/readme.md @@ -0,0 +1,152 @@ +# ansi-styles 
[![Build Status](https://travis-ci.org/chalk/ansi-styles.svg?branch=master)](https://travis-ci.org/chalk/ansi-styles) + +> [ANSI escape codes](https://en.wikipedia.org/wiki/ANSI_escape_code#Colors_and_Styles) for styling strings in the terminal + +You probably want the higher-level [chalk](https://github.com/chalk/chalk) module for styling your strings. + + + +## Install + +``` +$ npm install ansi-styles +``` + +## Usage + +```js +const style = require('ansi-styles'); + +console.log(`${style.green.open}Hello world!${style.green.close}`); + + +// Color conversion between 16/256/truecolor +// NOTE: If conversion goes to 16 colors or 256 colors, the original color +// may be degraded to fit that color palette. This means terminals +// that do not support 16 million colors will best-match the +// original color. +console.log(style.bgColor.ansi.hsl(120, 80, 72) + 'Hello world!' + style.bgColor.close); +console.log(style.color.ansi256.rgb(199, 20, 250) + 'Hello world!' + style.color.close); +console.log(style.color.ansi16m.hex('#abcdef') + 'Hello world!' + style.color.close); +``` + +## API + +Each style has an `open` and `close` property. + +## Styles + +### Modifiers + +- `reset` +- `bold` +- `dim` +- `italic` *(Not widely supported)* +- `underline` +- `inverse` +- `hidden` +- `strikethrough` *(Not widely supported)* + +### Colors + +- `black` +- `red` +- `green` +- `yellow` +- `blue` +- `magenta` +- `cyan` +- `white` +- `blackBright` (alias: `gray`, `grey`) +- `redBright` +- `greenBright` +- `yellowBright` +- `blueBright` +- `magentaBright` +- `cyanBright` +- `whiteBright` + +### Background colors + +- `bgBlack` +- `bgRed` +- `bgGreen` +- `bgYellow` +- `bgBlue` +- `bgMagenta` +- `bgCyan` +- `bgWhite` +- `bgBlackBright` (alias: `bgGray`, `bgGrey`) +- `bgRedBright` +- `bgGreenBright` +- `bgYellowBright` +- `bgBlueBright` +- `bgMagentaBright` +- `bgCyanBright` +- `bgWhiteBright` + +## Advanced usage + +By default, you get a map of styles, but the styles are also available as groups. They are non-enumerable so they don't show up unless you access them explicitly. This makes it easier to expose only a subset in a higher-level module. + +- `style.modifier` +- `style.color` +- `style.bgColor` + +###### Example + +```js +console.log(style.color.green.open); +``` + +Raw escape codes (i.e. without the CSI escape prefix `\u001B[` and render mode postfix `m`) are available under `style.codes`, which returns a `Map` with the open codes as keys and close codes as values. + +###### Example + +```js +console.log(style.codes.get(36)); +//=> 39 +``` + +## [256 / 16 million (TrueColor) support](https://gist.github.com/XVilka/8346728) + +`ansi-styles` uses the [`color-convert`](https://github.com/Qix-/color-convert) package to allow for converting between various colors and ANSI escapes, with support for 256 and 16 million colors. 
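For example, combining the conversion helpers documented above, a minimal sketch (assuming `ansi-styles` is installed and the terminal supports the requested color depth; the color values are arbitrary) might look like:

```js
// Minimal sketch: print the same color at three depths using the conversion
// helpers shown in this readme. Assumes `ansi-styles` is installed locally.
const style = require('ansi-styles');

const text = 'Hello world!';

// Truecolor (16 million colors) foreground from a hex string.
console.log(style.color.ansi16m.hex('#ff8800') + text + style.color.close);

// The same color approximated in the 256-color palette.
console.log(style.color.ansi256.rgb(255, 136, 0) + text + style.color.close);

// And again best-matched to one of the basic 16 ANSI colors.
console.log(style.color.ansi.rgb(255, 136, 0) + text + style.color.close);

// Background variants work the same way via `style.bgColor`.
console.log(style.bgColor.ansi256.rgb(255, 136, 0) + text + style.bgColor.close);
```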
+ +The following color spaces from `color-convert` are supported: + +- `rgb` +- `hex` +- `keyword` +- `hsl` +- `hsv` +- `hwb` +- `ansi` +- `ansi256` + +To use these, call the associated conversion function with the intended output, for example: + +```js +style.color.ansi.rgb(100, 200, 15); // RGB to 16 color ansi foreground code +style.bgColor.ansi.rgb(100, 200, 15); // RGB to 16 color ansi background code + +style.color.ansi256.hsl(120, 100, 60); // HSL to 256 color ansi foreground code +style.bgColor.ansi256.hsl(120, 100, 60); // HSL to 256 color ansi foreground code + +style.color.ansi16m.hex('#C0FFEE'); // Hex (RGB) to 16 million color foreground code +style.bgColor.ansi16m.hex('#C0FFEE'); // Hex (RGB) to 16 million color background code +``` + +## Related + +- [ansi-escapes](https://github.com/sindresorhus/ansi-escapes) - ANSI escape codes for manipulating the terminal + +## Maintainers + +- [Sindre Sorhus](https://github.com/sindresorhus) +- [Josh Junon](https://github.com/qix-) + +## For enterprise + +Available as part of the Tidelift Subscription. + +The maintainers of `ansi-styles` and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. [Learn more.](https://tidelift.com/subscription/pkg/npm-ansi-styles?utm_source=npm-ansi-styles&utm_medium=referral&utm_campaign=enterprise&utm_term=repo) diff --git a/node_modules/update-notifier/node_modules/chalk/index.d.ts b/node_modules/update-notifier/node_modules/chalk/index.d.ts new file mode 100644 index 000000000..9cd88f38b --- /dev/null +++ b/node_modules/update-notifier/node_modules/chalk/index.d.ts @@ -0,0 +1,415 @@ +/** +Basic foreground colors. + +[More colors here.](https://github.com/chalk/chalk/blob/master/readme.md#256-and-truecolor-color-support) +*/ +declare type ForegroundColor = + | 'black' + | 'red' + | 'green' + | 'yellow' + | 'blue' + | 'magenta' + | 'cyan' + | 'white' + | 'gray' + | 'grey' + | 'blackBright' + | 'redBright' + | 'greenBright' + | 'yellowBright' + | 'blueBright' + | 'magentaBright' + | 'cyanBright' + | 'whiteBright'; + +/** +Basic background colors. + +[More colors here.](https://github.com/chalk/chalk/blob/master/readme.md#256-and-truecolor-color-support) +*/ +declare type BackgroundColor = + | 'bgBlack' + | 'bgRed' + | 'bgGreen' + | 'bgYellow' + | 'bgBlue' + | 'bgMagenta' + | 'bgCyan' + | 'bgWhite' + | 'bgGray' + | 'bgGrey' + | 'bgBlackBright' + | 'bgRedBright' + | 'bgGreenBright' + | 'bgYellowBright' + | 'bgBlueBright' + | 'bgMagentaBright' + | 'bgCyanBright' + | 'bgWhiteBright'; + +/** +Basic colors. + +[More colors here.](https://github.com/chalk/chalk/blob/master/readme.md#256-and-truecolor-color-support) +*/ +declare type Color = ForegroundColor | BackgroundColor; + +declare type Modifiers = + | 'reset' + | 'bold' + | 'dim' + | 'italic' + | 'underline' + | 'inverse' + | 'hidden' + | 'strikethrough' + | 'visible'; + +declare namespace chalk { + /** + Levels: + - `0` - All colors disabled. + - `1` - Basic 16 colors support. + - `2` - ANSI 256 colors support. + - `3` - Truecolor 16 million colors support. + */ + type Level = 0 | 1 | 2 | 3; + + interface Options { + /** + Specify the color support for Chalk. + + By default, color support is automatically detected based on the environment. + + Levels: + - `0` - All colors disabled. + - `1` - Basic 16 colors support. 
+ - `2` - ANSI 256 colors support. + - `3` - Truecolor 16 million colors support. + */ + level?: Level; + } + + /** + Return a new Chalk instance. + */ + type Instance = new (options?: Options) => Chalk; + + /** + Detect whether the terminal supports color. + */ + interface ColorSupport { + /** + The color level used by Chalk. + */ + level: Level; + + /** + Return whether Chalk supports basic 16 colors. + */ + hasBasic: boolean; + + /** + Return whether Chalk supports ANSI 256 colors. + */ + has256: boolean; + + /** + Return whether Chalk supports Truecolor 16 million colors. + */ + has16m: boolean; + } + + interface ChalkFunction { + /** + Use a template string. + + @remarks Template literals are unsupported for nested calls (see [issue #341](https://github.com/chalk/chalk/issues/341)) + + @example + ``` + import chalk = require('chalk'); + + log(chalk` + CPU: {red ${cpu.totalPercent}%} + RAM: {green ${ram.used / ram.total * 100}%} + DISK: {rgb(255,131,0) ${disk.used / disk.total * 100}%} + `); + ``` + + @example + ``` + import chalk = require('chalk'); + + log(chalk.red.bgBlack`2 + 3 = {bold ${2 + 3}}`) + ``` + */ + (text: TemplateStringsArray, ...placeholders: unknown[]): string; + + (...text: unknown[]): string; + } + + interface Chalk extends ChalkFunction { + /** + Return a new Chalk instance. + */ + Instance: Instance; + + /** + The color support for Chalk. + + By default, color support is automatically detected based on the environment. + + Levels: + - `0` - All colors disabled. + - `1` - Basic 16 colors support. + - `2` - ANSI 256 colors support. + - `3` - Truecolor 16 million colors support. + */ + level: Level; + + /** + Use HEX value to set text color. + + @param color - Hexadecimal value representing the desired color. + + @example + ``` + import chalk = require('chalk'); + + chalk.hex('#DEADED'); + ``` + */ + hex(color: string): Chalk; + + /** + Use keyword color value to set text color. + + @param color - Keyword value representing the desired color. + + @example + ``` + import chalk = require('chalk'); + + chalk.keyword('orange'); + ``` + */ + keyword(color: string): Chalk; + + /** + Use RGB values to set text color. + */ + rgb(red: number, green: number, blue: number): Chalk; + + /** + Use HSL values to set text color. + */ + hsl(hue: number, saturation: number, lightness: number): Chalk; + + /** + Use HSV values to set text color. + */ + hsv(hue: number, saturation: number, value: number): Chalk; + + /** + Use HWB values to set text color. + */ + hwb(hue: number, whiteness: number, blackness: number): Chalk; + + /** + Use a [Select/Set Graphic Rendition](https://en.wikipedia.org/wiki/ANSI_escape_code#SGR_parameters) (SGR) [color code number](https://en.wikipedia.org/wiki/ANSI_escape_code#3/4_bit) to set text color. + + 30 <= code && code < 38 || 90 <= code && code < 98 + For example, 31 for red, 91 for redBright. + */ + ansi(code: number): Chalk; + + /** + Use a [8-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit) to set text color. + */ + ansi256(index: number): Chalk; + + /** + Use HEX value to set background color. + + @param color - Hexadecimal value representing the desired color. + + @example + ``` + import chalk = require('chalk'); + + chalk.bgHex('#DEADED'); + ``` + */ + bgHex(color: string): Chalk; + + /** + Use keyword color value to set background color. + + @param color - Keyword value representing the desired color. 
+ + @example + ``` + import chalk = require('chalk'); + + chalk.bgKeyword('orange'); + ``` + */ + bgKeyword(color: string): Chalk; + + /** + Use RGB values to set background color. + */ + bgRgb(red: number, green: number, blue: number): Chalk; + + /** + Use HSL values to set background color. + */ + bgHsl(hue: number, saturation: number, lightness: number): Chalk; + + /** + Use HSV values to set background color. + */ + bgHsv(hue: number, saturation: number, value: number): Chalk; + + /** + Use HWB values to set background color. + */ + bgHwb(hue: number, whiteness: number, blackness: number): Chalk; + + /** + Use a [Select/Set Graphic Rendition](https://en.wikipedia.org/wiki/ANSI_escape_code#SGR_parameters) (SGR) [color code number](https://en.wikipedia.org/wiki/ANSI_escape_code#3/4_bit) to set background color. + + 30 <= code && code < 38 || 90 <= code && code < 98 + For example, 31 for red, 91 for redBright. + Use the foreground code, not the background code (for example, not 41, nor 101). + */ + bgAnsi(code: number): Chalk; + + /** + Use a [8-bit unsigned number](https://en.wikipedia.org/wiki/ANSI_escape_code#8-bit) to set background color. + */ + bgAnsi256(index: number): Chalk; + + /** + Modifier: Resets the current color chain. + */ + readonly reset: Chalk; + + /** + Modifier: Make text bold. + */ + readonly bold: Chalk; + + /** + Modifier: Emitting only a small amount of light. + */ + readonly dim: Chalk; + + /** + Modifier: Make text italic. (Not widely supported) + */ + readonly italic: Chalk; + + /** + Modifier: Make text underline. (Not widely supported) + */ + readonly underline: Chalk; + + /** + Modifier: Inverse background and foreground colors. + */ + readonly inverse: Chalk; + + /** + Modifier: Prints the text, but makes it invisible. + */ + readonly hidden: Chalk; + + /** + Modifier: Puts a horizontal line through the center of the text. (Not widely supported) + */ + readonly strikethrough: Chalk; + + /** + Modifier: Prints the text only when Chalk has a color support level > 0. + Can be useful for things that are purely cosmetic. + */ + readonly visible: Chalk; + + readonly black: Chalk; + readonly red: Chalk; + readonly green: Chalk; + readonly yellow: Chalk; + readonly blue: Chalk; + readonly magenta: Chalk; + readonly cyan: Chalk; + readonly white: Chalk; + + /* + Alias for `blackBright`. + */ + readonly gray: Chalk; + + /* + Alias for `blackBright`. + */ + readonly grey: Chalk; + + readonly blackBright: Chalk; + readonly redBright: Chalk; + readonly greenBright: Chalk; + readonly yellowBright: Chalk; + readonly blueBright: Chalk; + readonly magentaBright: Chalk; + readonly cyanBright: Chalk; + readonly whiteBright: Chalk; + + readonly bgBlack: Chalk; + readonly bgRed: Chalk; + readonly bgGreen: Chalk; + readonly bgYellow: Chalk; + readonly bgBlue: Chalk; + readonly bgMagenta: Chalk; + readonly bgCyan: Chalk; + readonly bgWhite: Chalk; + + /* + Alias for `bgBlackBright`. + */ + readonly bgGray: Chalk; + + /* + Alias for `bgBlackBright`. + */ + readonly bgGrey: Chalk; + + readonly bgBlackBright: Chalk; + readonly bgRedBright: Chalk; + readonly bgGreenBright: Chalk; + readonly bgYellowBright: Chalk; + readonly bgBlueBright: Chalk; + readonly bgMagentaBright: Chalk; + readonly bgCyanBright: Chalk; + readonly bgWhiteBright: Chalk; + } +} + +/** +Main Chalk object that allows to chain styles together. +Call the last one as a method with a string argument. +Order doesn't matter, and later styles take precedent in case of a conflict. 
+This simply means that `chalk.red.yellow.green` is equivalent to `chalk.green`. +*/ +declare const chalk: chalk.Chalk & chalk.ChalkFunction & { + supportsColor: chalk.ColorSupport | false; + Level: chalk.Level; + Color: Color; + ForegroundColor: ForegroundColor; + BackgroundColor: BackgroundColor; + Modifiers: Modifiers; + stderr: chalk.Chalk & {supportsColor: chalk.ColorSupport | false}; +}; + +export = chalk; diff --git a/node_modules/update-notifier/node_modules/chalk/license b/node_modules/update-notifier/node_modules/chalk/license new file mode 100644 index 000000000..e7af2f771 --- /dev/null +++ b/node_modules/update-notifier/node_modules/chalk/license @@ -0,0 +1,9 @@ +MIT License + +Copyright (c) Sindre Sorhus (sindresorhus.com) + +Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. diff --git a/node_modules/update-notifier/node_modules/chalk/package.json b/node_modules/update-notifier/node_modules/chalk/package.json new file mode 100644 index 000000000..4a12575e9 --- /dev/null +++ b/node_modules/update-notifier/node_modules/chalk/package.json @@ -0,0 +1,100 @@ +{ + "_from": "chalk@^4.1.0", + "_id": "chalk@4.1.2", + "_inBundle": false, + "_integrity": "sha512-oKnbhFyRIXpUuez8iBMmyEa4nbj4IOQyuhc/wy9kY7/WVPcwIO9VA668Pu8RkO7+0G76SLROeyw9CpQ061i4mA==", + "_location": "/update-notifier/chalk", + "_phantomChildren": {}, + "_requested": { + "type": "range", + "registry": true, + "raw": "chalk@^4.1.0", + "name": "chalk", + "escapedName": "chalk", + "rawSpec": "^4.1.0", + "saveSpec": null, + "fetchSpec": "^4.1.0" + }, + "_requiredBy": [ + "/update-notifier" + ], + "_resolved": "https://registry.npmjs.org/chalk/-/chalk-4.1.2.tgz", + "_shasum": "aac4e2b7734a740867aeb16bf02aad556a1e7a01", + "_spec": "chalk@^4.1.0", + "_where": "C:\\Users\\Dritan Ropi\\Documents\\FinalProject-Webware\\final_project\\node_modules\\update-notifier", + "bugs": { + "url": "https://github.com/chalk/chalk/issues" + }, + "bundleDependencies": false, + "dependencies": { + "ansi-styles": "^4.1.0", + "supports-color": "^7.1.0" + }, + "deprecated": false, + "description": "Terminal string styling done right", + "devDependencies": { + "ava": "^2.4.0", + "coveralls": "^3.0.7", + "execa": "^4.0.0", + "import-fresh": "^3.1.0", + "matcha": "^0.7.0", + "nyc": "^15.0.0", + "resolve-from": "^5.0.0", + "tsd": "^0.7.4", + "xo": "^0.28.2" + }, + "engines": { + "node": ">=10" + }, + "files": [ + "source", + "index.d.ts" + ], + "funding": "https://github.com/chalk/chalk?sponsor=1", + "homepage": "https://github.com/chalk/chalk#readme", + "keywords": [ + "color", + "colour", + 
"colors", + "terminal", + "console", + "cli", + "string", + "str", + "ansi", + "style", + "styles", + "tty", + "formatting", + "rgb", + "256", + "shell", + "xterm", + "log", + "logging", + "command-line", + "text" + ], + "license": "MIT", + "main": "source", + "name": "chalk", + "repository": { + "type": "git", + "url": "git+https://github.com/chalk/chalk.git" + }, + "scripts": { + "bench": "matcha benchmark.js", + "test": "xo && nyc ava && tsd" + }, + "version": "4.1.2", + "xo": { + "rules": { + "unicorn/prefer-string-slice": "off", + "unicorn/prefer-includes": "off", + "@typescript-eslint/member-ordering": "off", + "no-redeclare": "off", + "unicorn/string-content": "off", + "unicorn/better-regex": "off" + } + } +} diff --git a/node_modules/update-notifier/node_modules/chalk/readme.md b/node_modules/update-notifier/node_modules/chalk/readme.md new file mode 100644 index 000000000..a055d21c9 --- /dev/null +++ b/node_modules/update-notifier/node_modules/chalk/readme.md @@ -0,0 +1,341 @@ +

+<h1 align="center">Chalk</h1>
+ +> Terminal string styling done right + +[![Build Status](https://travis-ci.org/chalk/chalk.svg?branch=master)](https://travis-ci.org/chalk/chalk) [![Coverage Status](https://coveralls.io/repos/github/chalk/chalk/badge.svg?branch=master)](https://coveralls.io/github/chalk/chalk?branch=master) [![npm dependents](https://badgen.net/npm/dependents/chalk)](https://www.npmjs.com/package/chalk?activeTab=dependents) [![Downloads](https://badgen.net/npm/dt/chalk)](https://www.npmjs.com/package/chalk) [![](https://img.shields.io/badge/unicorn-approved-ff69b4.svg)](https://www.youtube.com/watch?v=9auOCbH5Ns4) [![XO code style](https://img.shields.io/badge/code_style-XO-5ed9c7.svg)](https://github.com/xojs/xo) ![TypeScript-ready](https://img.shields.io/npm/types/chalk.svg) [![run on repl.it](https://repl.it/badge/github/chalk/chalk)](https://repl.it/github/chalk/chalk) + + + +
+ +--- + + + +--- + +
+ +## Highlights + +- Expressive API +- Highly performant +- Ability to nest styles +- [256/Truecolor color support](#256-and-truecolor-color-support) +- Auto-detects color support +- Doesn't extend `String.prototype` +- Clean and focused +- Actively maintained +- [Used by ~50,000 packages](https://www.npmjs.com/browse/depended/chalk) as of January 1, 2020 + +## Install + +```console +$ npm install chalk +``` + +## Usage + +```js +const chalk = require('chalk'); + +console.log(chalk.blue('Hello world!')); +``` + +Chalk comes with an easy to use composable API where you just chain and nest the styles you want. + +```js +const chalk = require('chalk'); +const log = console.log; + +// Combine styled and normal strings +log(chalk.blue('Hello') + ' World' + chalk.red('!')); + +// Compose multiple styles using the chainable API +log(chalk.blue.bgRed.bold('Hello world!')); + +// Pass in multiple arguments +log(chalk.blue('Hello', 'World!', 'Foo', 'bar', 'biz', 'baz')); + +// Nest styles +log(chalk.red('Hello', chalk.underline.bgBlue('world') + '!')); + +// Nest styles of the same type even (color, underline, background) +log(chalk.green( + 'I am a green line ' + + chalk.blue.underline.bold('with a blue substring') + + ' that becomes green again!' +)); + +// ES2015 template literal +log(` +CPU: ${chalk.red('90%')} +RAM: ${chalk.green('40%')} +DISK: ${chalk.yellow('70%')} +`); + +// ES2015 tagged template literal +log(chalk` +CPU: {red ${cpu.totalPercent}%} +RAM: {green ${ram.used / ram.total * 100}%} +DISK: {rgb(255,131,0) ${disk.used / disk.total * 100}%} +`); + +// Use RGB colors in terminal emulators that support it. +log(chalk.keyword('orange')('Yay for orange colored text!')); +log(chalk.rgb(123, 45, 67).underline('Underlined reddish color')); +log(chalk.hex('#DEADED').bold('Bold gray!')); +``` + +Easily define your own themes: + +```js +const chalk = require('chalk'); + +const error = chalk.bold.red; +const warning = chalk.keyword('orange'); + +console.log(error('Error!')); +console.log(warning('Warning!')); +``` + +Take advantage of console.log [string substitution](https://nodejs.org/docs/latest/api/console.html#console_console_log_data_args): + +```js +const name = 'Sindre'; +console.log(chalk.green('Hello %s'), name); +//=> 'Hello Sindre' +``` + +## API + +### chalk.`